WorldWideScience

Sample records for large intelligent process

  1. Simulation research on the process of large scale ship plane segmentation intelligent workshop

    Science.gov (United States)

    Xu, Peng; Liao, Liangchuang; Zhou, Chao; Xue, Rui; Fu, Wei

    2017-04-01

Large-scale ship plane-segmentation intelligent workshops are a new development, and there is no published research in related fields at home or abroad. The mode of production must be transformed from the existing Industry 2.0 (or, in part, Industry 3.0) pattern of "human analysis and judgment + machine manufacturing" to "machine analysis and judgment + machine manufacturing". This transformation involves a great many decisions on both the management and the technology side, such as the evolution of the workshop structure, the development of intelligent equipment and changes in the business model, and with them comes the reformation of the whole workshop. The process simulation in this project verifies the general layout and process flow of the large-scale ship plane-section intelligent workshop and analyses the workshop's working efficiency, which is significant for the next step of the transformation of the plane-segmentation intelligent workshop.

  2. Intelligent multivariate process supervision

    International Nuclear Information System (INIS)

    Visuri, Pertti.

    1986-01-01

    This thesis addresses the difficulties encountered in managing large amounts of data in supervisory control of complex systems. Some previous alarm and disturbance analysis concepts are reviewed and a method for improving the supervision of complex systems is presented. The method, called multivariate supervision, is based on adding low level intelligence to the process control system. By using several measured variables linked together by means of deductive logic, the system can take into account the overall state of the supervised system. Thus, it can present to the operators fewer messages with higher information content than the conventional control systems which are based on independent processing of each variable. In addition, the multivariate method contains a special information presentation concept for improving the man-machine interface. (author)

  3. Clinical Process Intelligence

    DEFF Research Database (Denmark)

    Vilstrup Pedersen, Klaus

    2006-01-01

    .e. local guidelines. From a knowledge management point of view, this externalization of generalized processes, gives the opportunity to learn from, evaluate and optimize the processes. "Clinical Process Intelligence" (CPI), will denote the goal of getting generalized insight into patient centered health...

  4. Business process intelligence

    NARCIS (Netherlands)

    Castellanos, M.; Alves De Medeiros, A.K.; Mendling, J.; Weber, B.; Weijters, A.J.M.M.; Cardoso, J.; Aalst, van der W.M.P.

    2009-01-01

Business Process Intelligence (BPI) is an emerging area that is becoming increasingly popular for enterprises. The need to improve business process efficiency, to react quickly to changes and to meet regulatory compliance is among the main drivers for BPI. BPI refers to the application of Business

  5. BUSINESS INTELLIGENCE ADOPTION IN LARGE ROMANIAN COMPANIES

    Directory of Open Access Journals (Sweden)

    Flavia CAIA

    2014-11-01

Full Text Available The economic conditions and market competition create pressure on companies to adopt new technologies that can provide information more efficiently and better support decision-making. The purpose of the research is to investigate decision support information systems in order to appraise and enhance the capacity of entities to apply the new knowledge that BI produces for organizational success and competitiveness. The importance of the conducted research consists in identifying solutions to improve reporting and in stimulating entities to start using business intelligence (BI) technologies, which facilitate obtaining new information, in order to ensure flexibility and resilience and to provide answers to questions that go beyond what pre-defined reports can do to support decision-making. The estimated result is a technical and operational overview of the large companies in Romania, drawing future directions for improved competitive behaviour and strategic awareness, and identifying the significant factors for optimizing the decision-making process.

  6. Computational Intelligence in Image Processing

    CERN Document Server

    Siarry, Patrick

    2013-01-01

Computational intelligence based techniques have firmly established themselves as viable, alternate, mathematical tools for more than a decade. They have been extensively employed in many systems and application domains, among these signal processing, automatic control, industrial and consumer electronics, robotics, finance, manufacturing systems, electric power systems, and power electronics. Image processing is also an extremely potent area which has attracted the attention of many researchers who are interested in the development of new computational intelligence-based techniques and their suitable applications, in both research problems and in real-world problems. Part I of the book discusses several image preprocessing algorithms; Part II broadly covers image compression algorithms; Part III demonstrates how computational intelligence-based techniques can be effectively utilized for image analysis purposes; and Part IV shows how pattern recognition, classification and clustering-based techniques can ...

  7. Aktuelles Schlagwort: Business Process Intelligence

    NARCIS (Netherlands)

    Mutschler, B.B.; Reichert, M.U.

In the recent past, the capture and analysis of real process data (e.g. on the start and end of process activities) has increasingly come into focus. Such data are delivered by most process-oriented information systems. The buzzword Business Process Intelligence (BPI)

  8. Large Efficient Intelligent Heating Relay Station System

    Science.gov (United States)

    Wu, C. Z.; Wei, X. G.; Wu, M. Q.

    2017-12-01

The design of the large efficient intelligent heating relay station system aims to improve the existing heating systems in our country, which suffer from low heating efficiency, wasted energy and serious pollution, and whose control still depends on manual operation. In this design, we first improve the existing plate heat exchanger. Secondly, an AT89C51 microcontroller is used to control the whole system and realize intelligent control. The detection part uses a PT100 temperature sensor, a pressure sensor and a turbine flowmeter to measure the heating temperature and the liquid flow and hydraulic pressure at the user end, and feeds the signals back to the microcontroller in real time so that the heating supplied to users can be adjusted, making the whole system more efficient, intelligent and energy-saving.

  9. Large-scale Intelligent Transportation Systems simulation

    Energy Technology Data Exchange (ETDEWEB)

    Ewing, T.; Canfield, T.; Hannebutte, U.; Levine, D.; Tentner, A.

    1995-06-01

A prototype computer system has been developed which defines a high-level architecture for a large-scale, comprehensive, scalable simulation of an Intelligent Transportation System (ITS) capable of running on massively parallel computers and distributed (networked) computer systems. The prototype includes the modelling of instrumented "smart" vehicles with in-vehicle navigation units capable of optimal route planning, and Traffic Management Centers (TMC). The TMC has probe vehicle tracking capabilities (display position and attributes of instrumented vehicles) and can provide 2-way interaction with traffic to provide advisories and link times. Both the in-vehicle navigation module and the TMC feature detailed graphical user interfaces to support human-factors studies. The prototype has been developed on a distributed system of networked UNIX computers but is designed to run on ANL's IBM SP-X parallel computer system for large-scale problems. A novel feature of our design is that vehicles are represented by autonomous computer processes, each with a behavior model which performs independent route selection and reacts to external traffic events much like real vehicles. With this approach, one will be able to take advantage of emerging massively parallel processor (MPP) systems.
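The record above describes vehicles as autonomous processes that plan routes and re-plan on TMC advisories. A minimal sketch of that idea follows; the four-node network, link times and class names are invented for illustration and are not from the ANL prototype:

```python
import heapq

# Hypothetical road network: link -> travel time in seconds.
LINKS = {
    ("A", "B"): 60, ("B", "D"): 60,
    ("A", "C"): 90, ("C", "D"): 50,
}

def shortest_route(links, start, goal):
    """Dijkstra over the link-time table; returns the node sequence."""
    queue = [(0, start, [start])]
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        for (u, v), t in links.items():
            if u == node and v not in seen:
                heapq.heappush(queue, (cost + t, v, path + [v]))
    return None

class Vehicle:
    """Each vehicle plans independently and re-plans on TMC advisories."""
    def __init__(self, origin, destination):
        self.origin, self.destination = origin, destination
        self.route = shortest_route(LINKS, origin, destination)

    def on_link_time_update(self, link, new_time):
        LINKS[link] = new_time           # advisory from the TMC
        self.route = shortest_route(LINKS, self.origin, self.destination)

car = Vehicle("A", "D")
print(car.route)                          # fastest route under free flow
car.on_link_time_update(("B", "D"), 600)  # incident slows link B -> D
print(car.route)                          # vehicle re-plans around it
```

In the full simulation each such agent would run as its own process; here the re-planning reaction to a link-time event is the point being illustrated.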

  10. Artificial intelligence applied to process signal analysis

    Science.gov (United States)

    Corsberg, Dan

    1988-01-01

    Many space station processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect of the human/machine interface is the analysis and display of process information. Human operators can be overwhelmed by large clusters of alarms that inhibit their ability to diagnose and respond to a disturbance. Using artificial intelligence techniques and a knowledge base approach to this problem, the power of the computer can be used to filter and analyze plant sensor data. This will provide operators with a better description of the process state. Once a process state is recognized, automatic action could be initiated and proper system response monitored.
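The alarm-filtering idea in the record above can be sketched as a small knowledge-based filter that suppresses alarms explained by another active alarm, leaving only likely root causes. The alarm names and causal rules here are invented for illustration:

```python
# Each rule states that a cause alarm makes certain downstream alarms
# expected; expected alarms are suppressed so operators see root causes.
CAUSAL_RULES = {
    "PUMP_A_TRIP": {"FLOW_LOW", "PRESSURE_LOW"},
    "FLOW_LOW": {"TEMP_HIGH"},
}

def root_cause_alarms(active):
    """Return active alarms not explained by another active alarm."""
    explained = set()
    frontier = set(active)
    while frontier:
        alarm = frontier.pop()
        for effect in CAUSAL_RULES.get(alarm, ()):
            if effect in active and effect not in explained:
                explained.add(effect)
                frontier.add(effect)   # effects can explain further alarms
    return set(active) - explained

active = {"PUMP_A_TRIP", "FLOW_LOW", "PRESSURE_LOW", "TEMP_HIGH", "VALVE_STUCK"}
print(root_cause_alarms(active))
```

A real system would hold these rules in a knowledge base and evaluate them against live sensor data; the transitive closure over causal rules is what collapses an alarm flood to a short display.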

  11. Business Intelligence in Process Control

    Science.gov (United States)

    Kopčeková, Alena; Kopček, Michal; Tanuška, Pavol

    2013-12-01

The Business Intelligence technology, which represents a strong tool not only for decision-making support but also has great potential in other fields of application, is discussed in this paper. The necessary fundamental definitions are offered and explained to give a better understanding of the basic principles and the role of this technology in company management. The article is logically divided into five main parts. In the first part, the technology is defined and its main advantages are listed. In the second part, an overview of the system architecture with a brief description of the separate building blocks is presented, and the hierarchical nature of the system architecture is shown. The technology life cycle, consisting of four steps which are mutually interconnected into a ring, is described in the third part. In the fourth part, the analytical methods incorporated in online analytical processing and data mining used within business intelligence, as well as the related data mining methodologies, are summarised, and some typical applications of the above-mentioned methods are introduced. In the final part, a proposal of a knowledge discovery system for hierarchical process control is outlined. The focus of this paper is to provide a comprehensive view and to familiarize the reader with the Business Intelligence technology and its utilisation.

  12. Machine intelligence and signal processing

    CERN Document Server

    Vatsa, Mayank; Majumdar, Angshul; Kumar, Ajay

    2016-01-01

This book comprises chapters on key problems in the machine learning and signal processing arenas. The contents of the book are a result of the 2014 Workshop on Machine Intelligence and Signal Processing held at the Indraprastha Institute of Information Technology. Traditionally, signal processing and machine learning were considered to be separate areas of research. However, in recent times the two communities are getting closer. In a very abstract fashion, signal processing is the study of operator design. The contribution of signal processing has been to devise operators for restoration, compression, etc. Applied mathematicians were more interested in operator analysis. Nowadays signal processing research is gravitating towards operator learning: instead of designing operators based on heuristics (for example wavelets), the trend is to learn these operators (for example dictionary learning). And thus the gap between signal processing and machine learning is fast closing. The 2014 Workshop on Machine Intel...

  13. Artificial intelligence and process management

    International Nuclear Information System (INIS)

    Epton, J.B.A.

    1989-01-01

Techniques derived from work in artificial intelligence over the past few decades are beginning to change the approach to applying computers to process management. To explore this new approach and gain real practical experience of its potential, a programme of experimental applications was initiated by Sira in collaboration with the process industry. This programme encompassed a family of experimental applications ranging from process monitoring, through supervisory control and troubleshooting, to planning and scheduling. The experience gained has led to a number of conclusions regarding the present level of maturity of the technology, the potential for further developments, and the measures required to secure the levels of system integrity necessary in on-line applications to critical processes. (author)

  14. Parallel processing for artificial intelligence 2

    CERN Document Server

    Kumar, V; Suttner, CB

    1994-01-01

With the increasing availability of parallel machines and the rising interest in large-scale and real-world applications, research on parallel processing for Artificial Intelligence (AI) is gaining greater importance in the computer science environment. Many applications have been implemented and delivered, but the field is still considered to be in its infancy. This book assembles diverse aspects of research in the area, providing an overview of the current state of technology. It also aims to promote further growth across the discipline. Contributions have been grouped according to their

  15. Report : business process intelligence challenge 2013

    NARCIS (Netherlands)

    Dongen, van B.F.; Weber, B.; Ferreira, D.R.; De Weerdt, J.; Lohmann, N.; Song, M.; Wohed, P.

    2014-01-01

    For the third time, the Business Process Intelligence workshop hosted the Business Process Intelligence Challenge. The goal of this challenge is twofold. On the one hand, the challenge allows researchers and practitioners in the field to show their analytical capabilities to a broader audience. On

  16. Intelligent systems for KSC ground processing

    Science.gov (United States)

    Heard, Astrid E.

    1992-01-01

The ground processing and launch of Shuttle vehicles and their payloads is the primary task of Kennedy Space Center. It is a process which is largely manual and contains little inherent automation. Business is conducted today much as it was during previous NASA programs such as Apollo. In light of new programs and decreasing budgets, NASA must find more cost-effective ways in which to do business while retaining the quality and safety of its activities. Advanced technologies, including artificial intelligence, could cut manpower and processing time. This paper is an overview of the research and development in AI technology at KSC, with descriptions of the systems which have been implemented, as well as a few under development which are promising additions to ground processing software. Projects discussed cover many facets of ground processing activities, including computer sustaining engineering, subsystem monitor and diagnosis tools, and launch team assistants. The deployed AI applications have proven an effectiveness which has helped to demonstrate the benefits of utilizing intelligent software in the ground processing task.

  17. Intelligence amplification framework for enhancing scheduling processes

    NARCIS (Netherlands)

    Dobrkovic, Andrej; Liu, Luyao; Iacob, Maria Eugenia; van Hillegersberg, Jos

    2016-01-01

The scheduling process in a typical business environment consists of predominantly repetitive tasks that have to be completed in limited time and often contain some form of uncertainty. Intelligence amplification is a symbiotic relationship between a human and an intelligent agent. This

  18. The Predictive Aspect of Business Process Intelligence

    DEFF Research Database (Denmark)

    Pérez, Moisés Lima; Møller, Charles

    2007-01-01

This paper presents the arguments for a research proposal on predicting business events in a Business Process Intelligence (BPI) context. The paper argues that BPI holds a potential for leveraging enterprise benefits by supporting real-time processes. However, based on the experiences from past...... business intelligence projects, the paper argues that it is necessary to establish a new methodology to mine and extract intelligence on the business level, which is different from that which will improve a business process in an enterprise. In conclusion the paper proposes a new research project aimed...

  19. The big data processing platform for intelligent agriculture

    Science.gov (United States)

    Huang, Jintao; Zhang, Lichen

    2017-08-01

Big data technology is another popular technology after the Internet of Things and cloud computing. Big data is widely used in many fields, such as social platforms, e-commerce and financial analysis. Intelligent agriculture will, in the course of its operation, produce large amounts of data of complex structure, and fully mining the value of these data will be very meaningful for the development of agriculture. This paper proposes an intelligent data processing platform based on Storm and Cassandra to realize the storage and management of the big data of intelligent agriculture.

  20. Towards a universal competitive intelligence process model

    Directory of Open Access Journals (Sweden)

    Rene Pellissier

    2013-08-01

Full Text Available Background: Competitive intelligence (CI) provides actionable intelligence, which provides a competitive edge in enterprises. However, without a proper process, it is difficult to develop actionable intelligence. There are disagreements about how the CI process should be structured. For CI professionals to focus on producing actionable intelligence, and to do so with simplicity, they need a common CI process model. Objectives: The purpose of this research is to review the current literature on CI, to look at the aims of identifying and analysing CI process models, and finally to propose a universal CI process model. Method: The study was qualitative in nature and content analysis was conducted on all identified sources establishing and analysing CI process models. To identify relevant literature, academic databases and search engines were used. Moreover, a review of references in related studies led to more relevant sources, the references of which were further reviewed and analysed. To ensure reliability, only peer-reviewed articles were used. Results: The findings reveal that the majority of scholars view the CI process as a cycle of interrelated phases, in which the output of one phase is the input of the next. Conclusion: The CI process is a cycle of interrelated phases; the output of one phase is the input of the next. These phases are influenced by the following factors: decision makers, process and structure, organisational awareness and culture, and feedback.

  1. Parallel processing for artificial intelligence 1

    CERN Document Server

    Kanal, LN; Kumar, V; Suttner, CB

    1994-01-01

    Parallel processing for AI problems is of great current interest because of its potential for alleviating the computational demands of AI procedures. The articles in this book consider parallel processing for problems in several areas of artificial intelligence: image processing, knowledge representation in semantic networks, production rules, mechanization of logic, constraint satisfaction, parsing of natural language, data filtering and data mining. The publication is divided into six sections. The first addresses parallel computing for processing and understanding images. The second discus

  2. Large-uncertainty intelligent states for angular momentum and angle

    International Nuclear Information System (INIS)

    Goette, Joerg B; Zambrini, Roberta; Franke-Arnold, Sonja; Barnett, Stephen M

    2005-01-01

The equality in the uncertainty principle for linear momentum and position is obtained for states which also minimize the uncertainty product. However, in the uncertainty relation for angular momentum and angular position, both sides of the inequality are state dependent, and therefore the intelligent states, which satisfy the equality, do not necessarily give a minimum for the uncertainty product. In this paper, we highlight the difference between intelligent states and minimum-uncertainty states by investigating a class of intelligent states which obey the equality in the angular uncertainty relation while having an arbitrarily large uncertainty product. To develop an understanding of the uncertainties of angle and angular momentum for the large-uncertainty intelligent states, we compare exact solutions with analytical approximations in two limiting cases.
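For reference, the state-dependent angular uncertainty relation the abstract alludes to is usually written (following the angular-uncertainty literature; the notation here is an assumption) as:

```latex
\Delta\phi \,\Delta L_z \;\ge\; \frac{\hbar}{2}\,\bigl|\,1 - 2\pi P(\theta_0)\bigr|
```

where \(P(\theta_0)\) is the angular probability density at the boundary of the chosen \(2\pi\) window. Because the right-hand side depends on the state, a state can satisfy the equality (be "intelligent") while its uncertainty product remains arbitrarily large.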

  3. Intelligent medical image processing by simulated annealing

    International Nuclear Information System (INIS)

    Ohyama, Nagaaki

    1992-01-01

Image processing is widely used in the medical field and has already become very important, especially when used for image reconstruction purposes. In this paper, it is shown that image processing can be classified into four categories: passive, active, intelligent and visual image processing. These four classes are first explained through the use of several examples. The results show that passive image processing does not give better results than the others. Intelligent image processing is then addressed and the simulated annealing method is introduced. Due to the flexibility of simulated annealing, formulated intelligence is shown to be easily introduced into an image reconstruction problem. As a practical example, 3D blood vessel reconstruction from a small number of projections, which is insufficient for conventional methods to give a good reconstruction, is proposed, and computer simulation clearly shows the effectiveness of the simulated annealing method. Prior to the conclusion, medical file systems such as IS and C (Image Save and Carry) are pointed out to have potential for formulating knowledge, which is indispensable for intelligent image processing. This paper concludes by summarizing the advantages of simulated annealing. (author)
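As a reminder of how simulated annealing works, here is a toy sketch (not the paper's vessel-reconstruction code; the cost function, cooling schedule and constants are invented): candidate states are perturbed locally, improvements are always accepted, and worse states are accepted with Boltzmann probability that shrinks as the temperature cools.

```python
import math
import random

def anneal(cost, x0, temp=5.0, cooling=0.95, steps=2000, seed=1):
    """Minimize cost(x) by simulated annealing over a 1-D state."""
    rng = random.Random(seed)
    x, best = x0, x0
    for _ in range(steps):
        candidate = x + rng.uniform(-1.0, 1.0)      # local perturbation
        delta = cost(candidate) - cost(x)
        # Always accept improvements; sometimes accept worse states.
        if delta < 0 or rng.random() < math.exp(-delta / temp):
            x = candidate
            if cost(x) < cost(best):
                best = x
        temp *= cooling                              # cooling schedule
    return best

# Quadratic stand-in for a reconstruction mismatch term.
best = anneal(lambda x: (x - 3.0) ** 2, x0=-10.0)
print(best)
```

In a reconstruction setting the state would be the 3D vessel estimate and the cost would mix projection mismatch with prior ("formulated intelligence") terms; the acceptance rule is unchanged.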

  4. Intelligent processing for thick composites

    Science.gov (United States)

    Shin, Daniel Dong-Ok

    2000-10-01

Manufacturing thick composite parts is associated with adverse curing conditions such as large in-plane temperature gradients and exotherms. The condition is further aggravated because the manufacturer's cure cycle and the existing cure control systems do not adequately counter such effects. In response, a forecast-based thermal control system was developed to provide better cure control for thick composites. An accurate cure kinetics model is crucial for correctly identifying the amount of heat generated in composite process simulation. A new technique for identifying the cure parameters of Hercules AS4/3502 prepreg is presented, based on normalizing the DSC data. The cure kinetics is based on an autocatalytic model for the proposed method, which uses dynamic and isothermal DSC data to determine its parameters. Existing models were also used to determine kinetic parameters but were rendered inadequate because of the material's temperature-dependent final degree of cure. The model predictions determined from the new technique showed good agreement with both isothermal and dynamic DSC data, and the final degree of cure was also in good agreement with experimental data. A realistic cure simulation model including bleeder-ply analysis and compaction was validated with Hercules AS4/3501-6 based laminates. The nonsymmetrical temperature distribution resulting from the presence of bleeder plies agreed well with the model prediction. Some of the discrepancies in the predicted compaction behavior were attributed to inaccurate viscosity and permeability models. The temperature prediction was quite good for the 3 cm laminate. The validated process simulation model, along with the cure kinetics model for AS4/3502 prepreg, was integrated into the thermal control system. The 3 cm Hercules AS4/3501-6 and AS4/3502 laminates were fabricated. The resulting cure cycles satisfied all imposed requirements by minimizing exotherms and temperature gradients. Although the duration of the cure cycles increased, such phenomena was
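The autocatalytic cure model mentioned above has the standard form da/dt = A exp(-E/RT) a^m (1-a)^n. A sketch of integrating it isothermally follows; the numeric constants are illustrative, not the fitted AS4/3502 parameters from the thesis:

```python
import math

A = 1.5e5        # pre-exponential factor, 1/s (assumed)
E = 60e3         # activation energy, J/mol (assumed)
R = 8.314        # gas constant, J/(mol K)
m, n = 0.5, 1.5  # reaction orders (assumed)

def cure_profile(T_kelvin, dt=1.0, t_end=3600.0, a0=1e-3):
    """Forward-Euler integration of the degree of cure at temperature T.

    a0 is seeded slightly above zero because the autocatalytic term
    a**m vanishes at a = 0 and the reaction would never start.
    """
    k = A * math.exp(-E / (R * T_kelvin))
    a, profile = a0, [a0]
    for _ in range(int(t_end / dt)):
        a = min(1.0, a + dt * k * a**m * (1.0 - a)**n)
        profile.append(a)
    return profile

profile = cure_profile(T_kelvin=450.0)   # isothermal hold at 450 K
print(profile[-1])
```

The exotherm problem the thesis addresses comes from the heat release being proportional to da/dt; a forecast-based controller would run this kind of model ahead of the measured temperatures to anticipate it.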

  5. Intelligent process control operator aid -- An artificial intelligence approach

    International Nuclear Information System (INIS)

    Sharma, D.D.; Miller, D.D.; Hajek, B.; Chandrasekaran, B.

    1986-01-01

This paper describes an approach for designing intelligent process and power plant control operator aids. It is argued that one of the key aspects of an intelligent operator aid is the capability for dynamic procedure synthesis with an incomplete definition of the initial state, unknown goal states, and a dynamic world situation. The dynamic world state is used to determine the goal, select appropriate plan steps from prespecified procedures to achieve the goal, control the execution of the synthesized plan, and provide for dynamic recovery from failure, often using a goal hierarchy. The dynamic synthesis of a plan requires the integration of various problem-solving capabilities such as plan generation, plan synthesis, plan modification, and failure recovery from a plan. The programming language for implementing the DPS framework provides a convenient tool for developing applications. An application of the DPS approach to the synthesis of emergency procedures for a nuclear power plant is also described. Initial test results indicate that the approach is successful in dynamically synthesizing the procedures. The authors realize that the DPS framework is not a solution for all control tasks. However, many existing process and plant control problems satisfy the requirements discussed in the paper and should be able to benefit from the framework described.

  6. New Perspectives on Intelligence Collection and Processing

    Science.gov (United States)

    2016-06-01

MASINT: Measurement and Signature Intelligence; NPS: Naval Postgraduate School; OSINT: Open Source Intelligence; pdf: probability density function; SIGINT: Signals Intelligence. ... Measurement and Signature Intelligence (MASINT): different types of sensors; Open Source Intelligence (OSINT): from all open sources; Signals Intelligence (SIGINT): intercepting the

  7. Using artificial intelligence to automate remittance processing.

    Science.gov (United States)

    Adams, W T; Snow, G M; Helmick, P M

    1998-06-01

    The consolidated business office of the Allegheny Health Education Research Foundation (AHERF), a large integrated healthcare system based in Pittsburgh, Pennsylvania, sought to improve its cash-related business office activities by implementing an automated remittance processing system that uses artificial intelligence. The goal was to create a completely automated system whereby all monies it processed would be tracked, automatically posted, analyzed, monitored, controlled, and reconciled through a central database. Using a phased approach, the automated payment system has become the central repository for all of the remittances for seven of the hospitals in the AHERF system and has allowed for the complete integration of these hospitals' existing billing systems, document imaging system, and intranet, as well as the new automated payment posting, and electronic cash tracking and reconciling systems. For such new technology, which is designed to bring about major change, factors contributing to the project's success were adequate planning, clearly articulated objectives, marketing, end-user acceptance, and post-implementation plan revision.

  8. Application of artificial intelligence in process control

    CERN Document Server

    Krijgsman, A

    1993-01-01

This book is the result of a united effort of six European universities to create an overall course on the application of artificial intelligence (AI) in process control. The book includes an introduction to key areas including knowledge representation and expert-, logic-, fuzzy logic-, neural network- and object oriented-based approaches in AI. Part two covers the application to control engineering; part three, real-time issues; part four, CAD systems and expert systems; part five, intelligent control; and part six, supervisory control, monitoring and optimization.

  9. Using Intelligent Agents to Manage Business Processes

    OpenAIRE

    Jennings, N. R.; Faratin, P.; Johnson, M. J.; O'Brien, P.; Wiegand, M. E.

    1996-01-01

    Management of the business process requires pertinent, consistent and up-to-date information gathering and information dissemination. These complex and time consuming tasks prompt organizations to develop an Information Technology system to assist with the management of various aspects of their business processes. Intelligent agents are the strongest solution candidates because of their many advantages, namely: autonomy, social ability, responsiveness and proactiveness. Given these characteri...

  10. Editorial: "Business process intelligence : connecting data and processes"

    NARCIS (Netherlands)

    Aalst, van der W.M.P.; Zhao, J.L.; Wang, H.; Wang, Harry Jiannan

    2015-01-01

    This introduction to the special issue on Business Process Intelligence (BPI) discusses the relation between data and processes. The recent attention for Big Data illustrates that organizations are aware of the potential of the torrents of data generated by today's information systems. However, at

  11. Large forging manufacturing process

    Science.gov (United States)

    Thamboo, Samuel V.; Yang, Ling

    2002-01-01

A process for forging large components of Alloy 718 material so that the components do not exhibit abnormal grain growth includes the steps of: (a) providing a billet with an average grain size between ASTM 0 and ASTM 3; (b) heating the billet to a temperature between 1750 °F and 1800 °F; (c) upsetting the billet to obtain a component part with a minimum strain of 0.125 in at least selected areas of the part; (d) reheating the component part to a temperature between 1750 °F and 1800 °F; (e) upsetting the component part to a final configuration such that said selected areas receive no strains between 0.01 and 0.125; (f) solution treating the component part at a temperature between 1725 °F and 1750 °F; and (g) aging the component part over predetermined times at different temperatures. A modified process achieves abnormal grain growth in selected areas of a component where desirable.

  12. Markov decision processes in artificial intelligence

    CERN Document Server

    Sigaud, Olivier

    2013-01-01

    Markov Decision Processes (MDPs) are a mathematical framework for modeling sequential decision problems under uncertainty as well as Reinforcement Learning problems. Written by experts in the field, this book provides a global view of current research using MDPs in Artificial Intelligence. It starts with an introductory presentation of the fundamental aspects of MDPs (planning in MDPs, Reinforcement Learning, Partially Observable MDPs, Markov games and the use of non-classical criteria). Then it presents more advanced research trends in the domain and gives some concrete examples using illustr
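As a minimal illustration of planning in MDPs, here is a value-iteration sketch on an invented two-state problem (not an example from the book): the Bellman backup is applied until the values stop changing.

```python
# Two states s0, s1; actions "stay"/"go"; reward 1 only when "go"
# is taken in s1. All numbers are illustrative.
GAMMA = 0.9
STATES = ("s0", "s1")
ACTIONS = ("stay", "go")

def step(state, action):
    """Deterministic transition model: returns (next_state, reward)."""
    if action == "stay":
        return state, 0.0
    return ("s1", 0.0) if state == "s0" else ("s0", 1.0)

def value_iteration(tol=1e-8):
    V = {s: 0.0 for s in STATES}
    while True:
        delta = 0.0
        for s in STATES:
            # Bellman backup: best one-step reward plus discounted value.
            new_v = max(r + GAMMA * V[s2]
                        for s2, r in (step(s, a) for a in ACTIONS))
            delta = max(delta, abs(new_v - V[s]))
            V[s] = new_v
        if delta < tol:
            return V

V = value_iteration()
print(V)
```

The fixed point satisfies V(s1) = 1 + 0.9·V(s0) and V(s0) = 0.9·V(s1), i.e. V(s1) = 1/0.19; stochastic transitions would replace `step` with an expectation over successor states.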

  13. The process of implementing Competitive Intelligence in a company

    Directory of Open Access Journals (Sweden)

    František Bartes

    2013-01-01

Full Text Available It is a common occurrence in business practice that the management of a company, in an effort to jump-start the function of the Competitive Intelligence unit, makes a number of mistakes and errors. Yet it is not difficult to avoid these missteps and achieve the desired level of Competitive Intelligence activities in a purposeful and effective manner. The author believes that a resolution of this problem lies in his concept of Competitive Intelligence viewed as a system application discipline (like value analysis or value engineering), which is why he approaches the problem of the actual implementation of Competitive Intelligence in a company by referring to standards ČSN EN 12 973 and ČSN EN 1325-2. The author then proposes his own procedure for implementing Competitive Intelligence in a company. He first describes the various ways of securing the Competitive Intelligence services. Depending on the manner of securing these services, it is necessary to choose the actual method of bringing Competitive Intelligence into the company. The author goes on to list the essentials that every program of Competitive Intelligence implementation should have. The process of Competitive Intelligence implementation unfolds in three stages: 1. managerial preparation for the introduction of Competitive Intelligence; 2. personnel-oriented and professional preparation for applying Competitive Intelligence; 3. organizational preparation for the implementation and practice of Competitive Intelligence. In the Discussion, the author points out the most common mistakes he encountered in practice when implementing the Competitive Intelligence function.

  14. Process mining: business intelligence software finally becomes intelligent

    NARCIS (Netherlands)

    Aalst, van der W.M.P.

    2007-01-01

    Business Intelligence is a term that refers to software that can be used to collect data about operational business processes and then analyse them. The goal of BI software is to gain more knowledge and insight, which can be used to improve processes

  15. Tool path strategy and cutting process monitoring in intelligent machining

    Science.gov (United States)

    Chen, Ming; Wang, Chengdong; An, Qinglong; Ming, Weiwei

    2018-06-01

    Intelligent machining is a current focus in advanced manufacturing technology, and is characterized by high accuracy and efficiency. A central technology of intelligent machining—the cutting process online monitoring and optimization—is urgently needed for mass production. In this research, the cutting process online monitoring and optimization in jet engine impeller machining, cranio-maxillofacial surgery, and hydraulic servo valve deburring are introduced as examples of intelligent machining. Results show that intelligent tool path optimization and cutting process online monitoring are efficient techniques for improving the efficiency, quality, and reliability of machining.

  16. Artificial intelligence in the materials processing laboratory

    Science.gov (United States)

    Workman, Gary L.; Kaukler, William F.

    1990-01-01

    Materials science and engineering provides a vast arena for applications of artificial intelligence. Advanced materials research is an area in which challenging requirements confront the researcher, from the drawing board through production and into service. Advanced techniques result in the development of new materials for specialized applications. Hand-in-hand with these new materials come requirements for state-of-the-art inspection methods to determine the integrity, or fitness for service, of structures fabricated from them. Two problems of current interest to the Materials Processing Laboratory at UAH are an expert system to assist in eddy current inspection of graphite epoxy components for aerospace, and an expert system to assist in the design of superalloys for high temperature applications. Each project requires a different approach to reach the defined goals. Results to date are described for the eddy current analysis, but only the original concepts and approaches considered are given for the expert system to design superalloys.

  17. INTELLIGENT SUPPORT OF EDUCATIONAL PROCESSES AT LEVEL OF SPECIALITY

    Directory of Open Access Journals (Sweden)

    Irina I. Kazmina

    2013-01-01

    Full Text Available The article is devoted to intelligent support of educational processes at the speciality level with the help of an information system. The intelligent information system of the Modern Humanitarian Academy is considered, and three directions for developing intelligent support within the scope of this information system are offered: development of a student model, data mining of teaching quality, and prediction of future teaching quality.

  18. Internet-based intelligent information processing systems

    CERN Document Server

    Tonfoni, G; Ichalkaranje, N S

    2003-01-01

    The Internet/WWW has made it possible to easily access quantities of information never available before. However, both the amount of information and the variation in quality pose obstacles to the efficient use of the medium. Artificial intelligence techniques can be useful tools in this context. Intelligent systems can be applied to searching the Internet and data-mining, interpreting Internet-derived material, the human-Web interface, remote condition monitoring and many other areas. This volume presents the latest research on the interaction between intelligent systems (neural networks, adap

  19. On Intelligent Design and Planning Method of Process Route Based on Gun Breech Machining Process

    Science.gov (United States)

    Hongzhi, Zhao; Jian, Zhang

    2018-03-01

    The paper presents an approach to the intelligent design and planning of process routes based on the gun breech machining process, addressing several problems of the traditional, hard-to-manage approach: the complexity of gun breech machining, tedious route design, and long design cycles. Based on the gun breech machining process, an intelligent process route design and planning system is developed using DEST and VC++. The system includes two functional modules: intelligent process route design and process route planning. The intelligent design module analyses the gun breech machining process and summarises breech process knowledge in order to build the knowledge base and inference engine, from which the gun breech process route is output. On the basis of the intelligent design module, the final process route is made, edited and managed in the process route planning module.

  20. Complex Formation Control of Large-Scale Intelligent Autonomous Vehicles

    Directory of Open Access Journals (Sweden)

    Ming Lei

    2012-01-01

    Full Text Available A new formation framework for large-scale intelligent autonomous vehicles is developed, which can realize complex formations while reducing data exchange. Using the proposed hierarchical formation method and the automatic dividing algorithm, vehicles are automatically divided into leaders and followers by exchanging information via a wireless network at the initial time. Leaders then form the formation's geometric shape using global formation information, and followers track their own virtual leaders to form a line formation using local information. The formation control laws of leaders and followers are designed based on consensus algorithms. Moreover, collision-avoidance problems are considered and solved using artificial potential functions. Finally, a simulation example consisting of 25 vehicles shows the effectiveness of the theory.
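The leader-follower scheme described here can be illustrated with a heavily simplified sketch. The first-order tracking law, offsets and gain below are invented for illustration; the paper's actual control laws are consensus-based and include artificial-potential collision avoidance, which this sketch omits:

```python
import numpy as np

leader = np.array([0.0, 0.0])
offsets = np.array([[1.0, 0.0], [2.0, 0.0], [3.0, 0.0]])  # desired line behind the leader
followers = np.random.default_rng(0).uniform(-5, 5, size=(3, 2))  # random start positions

k, dt = 1.0, 0.1
for _ in range(200):
    virtual = leader + offsets                    # each follower's virtual leader
    followers += dt * k * (virtual - followers)   # first-order tracking law

errors = np.linalg.norm(followers - (leader + offsets), axis=1)  # formation errors
```

Each follower's error contracts by a factor (1 - dt*k) per step, so the line formation is reached exponentially fast.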

  1. Predicting speech intelligibility in conditions with nonlinearly processed noisy speech

    DEFF Research Database (Denmark)

    Jørgensen, Søren; Dau, Torsten

    2013-01-01

    The speech-based envelope power spectrum model (sEPSM; [1]) was proposed in order to overcome the limitations of the classical speech transmission index (STI) and speech intelligibility index (SII). The sEPSM applies the signal-tonoise ratio in the envelope domain (SNRenv), which was demonstrated...... to successfully predict speech intelligibility in conditions with nonlinearly processed noisy speech, such as processing with spectral subtraction. Moreover, a multiresolution version (mr-sEPSM) was demonstrated to account for speech intelligibility in various conditions with stationary and fluctuating...
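The SNRenv idea behind the model can be sketched roughly as follows. The envelope extraction here (rectify and smooth) is a crude stand-in for the sEPSM's modulation filterbank, and the signals are synthetic, not speech:

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 8000
t = np.arange(fs) / fs
speech = np.sin(2 * np.pi * 4 * t) * np.sin(2 * np.pi * 500 * t)  # 4 Hz-modulated tone as "speech"
noise = 0.3 * rng.normal(size=fs)

def env_power(x, win=400):
    # crude envelope: rectify, then smooth with a 50 ms moving average
    env = np.convolve(np.abs(x), np.ones(win) / win, mode="same")
    env = env / env.mean() - 1.0  # keep only the normalised fluctuations
    return np.mean(env ** 2)

p_mix = env_power(speech + noise)    # envelope power of the noisy "speech"
p_noise = env_power(noise)           # envelope power of the noise alone
snr_env = 10 * np.log10(max(p_mix - p_noise, 1e-12) / p_noise)  # SNRenv in dB
```

The strongly modulated signal carries far more envelope fluctuation power than the noise, so the estimated SNRenv comes out clearly positive.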

  2. Intelligent Transportation Control based on Proactive Complex Event Processing

    Directory of Open Access Journals (Sweden)

    Wang Yongheng

    2016-01-01

    Full Text Available Complex Event Processing (CEP) has become a key part of the Internet of Things (IoT). Proactive CEP can predict future system states and execute actions to avoid unwanted states, which brings new hope to intelligent transportation control. In this paper, we propose a proactive CEP architecture and method for intelligent transportation control. Based on basic CEP technology and predictive analytic technology, a networked distributed Markov decision processes model with predicted states is proposed as the sequential decision model. A Q-learning method is proposed for this model. The experimental evaluations show that this method works well when used to control congestion in intelligent transportation systems.
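The Q-learning component can be illustrated on a toy congestion MDP. The states, dynamics and rewards below are invented purely to exercise the update rule; the paper's model is networked, distributed and prediction-driven, which this sketch is not:

```python
import random

random.seed(1)
n_states, n_actions = 5, 2  # states = congestion levels 0..4
Q = [[0.0] * n_actions for _ in range(n_states)]
alpha, gamma, eps = 0.1, 0.9, 0.2

def step(s, a):
    # invented dynamics: action 0 relieves congestion, action 1 worsens it
    s2 = max(0, min(n_states - 1, s + (-1 if a == 0 else 1)))
    return s2, -s2  # reward = negative congestion level

s = 4
for _ in range(5000):
    if random.random() < eps:
        a = random.randrange(n_actions)                   # explore
    else:
        a = max(range(n_actions), key=lambda i: Q[s][i])  # exploit
    s2, r = step(s, a)
    Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])  # Q-learning update
    s = s2

policy = [max(range(n_actions), key=lambda i: Q[s][i]) for s in range(n_states)]
```

After training, the greedy policy in the well-visited low-congestion states picks the congestion-relieving action.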

  3. Multiple multichannel spectra acquisition and processing system with intelligent interface

    International Nuclear Information System (INIS)

    Chen Ying; Wei Yixiang; Qu Jianshi; Zheng Futang; Xu Shengkui; Xie Yuanming; Qu Xing; Ji Weitong; Qiu Xuehua

    1986-01-01

    A multiple multichannel spectra acquisition and processing system with an intelligent interface is described. Sixteen spectra measured with various lengths, channel widths, back biases and acquisition times can be identified and collected by the intelligent interface simultaneously, while the connected computer performs data processing. The execution time of the Ge(Li) gamma-ray spectrum analysis software on an IBM PC-XT is about 55 seconds

  4. Intelligent Transportation Control based on Proactive Complex Event Processing

    OpenAIRE

    Wang Yongheng; Geng Shaofeng; Li Qian

    2016-01-01

    Complex Event Processing (CEP) has become the key part of Internet of Things (IoT). Proactive CEP can predict future system states and execute some actions to avoid unwanted states which brings new hope to intelligent transportation control. In this paper, we propose a proactive CEP architecture and method for intelligent transportation control. Based on basic CEP technology and predictive analytic technology, a networked distributed Markov decision processes model with predicting states is p...

  5. Sustainability Reporting Process Model using Business Intelligence

    OpenAIRE

    Alxneit, Thorsten Julius

    2015-01-01

    Sustainability, including its reporting requirements, is one of the most relevant topics for companies. In recent years, many software providers have launched new software tools targeting companies committed to implementing sustainability reporting. But it is not only a matter of companies willing to use their Business Intelligence (BI) solution; there are also basic principles such as the single source of truth, and tendencies to combine sustainability reporting with financial reporting (...

  6. An Approach to quantify the Costs of Business Process Intelligence.

    NARCIS (Netherlands)

    Mutschler, B.B.; Bumiller, J.; Reichert, M.U.; Desel, J.; Frank, U.

    2005-01-01

    Today, enterprises are forced to continuously optimize their business as well as service processes. In this context the process-centered alignment of information systems is crucial. The use of business process intelligence (BPI) tools offers promising perspectives in this respect. However, when

  7. Large transverse momentum hadronic processes

    International Nuclear Information System (INIS)

    Darriulat, P.

    1977-01-01

    The possible relations between deep inelastic leptoproduction and large transverse momentum (psub(t)) processes in hadronic collisions are usually considered in the framework of the quark-parton picture. Experiments observing the structure of the final state in proton-proton collisions producing at least one large transverse momentum particle have led to the following conclusions: a large fraction of the produced particles are unaffected by the large psub(t) process; the other products are correlated to the large psub(t) particle and, depending upon the sign of the scalar product, can be separated into two groups of ''towards-movers'' and ''away-movers''. The experimental evidence favouring such a picture is reviewed, and the properties of each of the three groups (underlying normal event, towards-movers and away-movers) are discussed. Some phenomenological interpretations are presented. The exact nature of away- and towards-movers must be further investigated, and their apparent jet structure has to be confirmed. Angular correlations between leading away- and towards-movers are very informative. Quantum number flow, both within the set of away- and towards-movers, and between it and the underlying normal event, is predicted to behave very differently in different models

  8. Artificial intelligence in process design and operation

    International Nuclear Information System (INIS)

    Sudduth, A.L.

    1988-01-01

    Artificial Intelligence (AI) has recently become prominent in the discussion of computer applications in the utility business. In order to assess this technology, a research project was performed to determine whether software development techniques based on AI could be used to facilitate management of information associated with the design of a generating station. The approach taken was the development of an expert system, using a relatively simple set of rules acting on a more complex knowledge base. A successful prototype for the application was developed and its potential extension to a production environment demonstrated. During the course of prototype development, other possible applications of AI in design engineering were discovered, and areas of particular interest selected for further investigation. A plan for AI R and D was formulated. That plan and other possible future work in AI are discussed

  9. Natural language processing in an intelligent writing strategy tutoring system.

    Science.gov (United States)

    McNamara, Danielle S; Crossley, Scott A; Roscoe, Rod

    2013-06-01

    The Writing Pal is an intelligent tutoring system that provides writing strategy training. A large part of its artificial intelligence resides in the natural language processing algorithms to assess essay quality and guide feedback to students. Because writing is often highly nuanced and subjective, the development of these algorithms must consider a broad array of linguistic, rhetorical, and contextual features. This study assesses the potential for computational indices to predict human ratings of essay quality. Past studies have demonstrated that linguistic indices related to lexical diversity, word frequency, and syntactic complexity are significant predictors of human judgments of essay quality but that indices of cohesion are not. The present study extends prior work by including a larger data sample and an expanded set of indices to assess new lexical, syntactic, cohesion, rhetorical, and reading ease indices. Three models were assessed. The model reported by McNamara, Crossley, and McCarthy (Written Communication 27:57-86, 2010) including three indices of lexical diversity, word frequency, and syntactic complexity accounted for only 6% of the variance in the larger data set. A regression model including the full set of indices examined in prior studies of writing predicted 38% of the variance in human scores of essay quality with 91% adjacent accuracy (i.e., within 1 point). A regression model that also included new indices related to rhetoric and cohesion predicted 44% of the variance with 94% adjacent accuracy. The new indices increased accuracy but, more importantly, afford the means to provide more meaningful feedback in the context of a writing tutoring system.
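The modelling idea here (regress human essay ratings on linguistic indices, then report variance explained and adjacent accuracy) can be sketched on synthetic data. The indices and scores below are simulated stand-ins, not the study's corpus, and the weights are invented:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200
indices = rng.normal(size=(n, 3))  # stand-ins for lexical diversity, word frequency, syntax
true_w = np.array([1.0, -0.5, 0.8])
scores = indices @ true_w + rng.normal(scale=0.5, size=n) + 3.5  # simulated human ratings

X = np.column_stack([np.ones(n), indices])  # design matrix with intercept
w, *_ = np.linalg.lstsq(X, scores, rcond=None)
pred = X @ w

r2 = 1 - np.sum((scores - pred) ** 2) / np.sum((scores - scores.mean()) ** 2)
adjacent = np.mean(np.abs(pred - scores) <= 1.0)  # "adjacent accuracy": within 1 point
```

The two summary numbers mirror the paper's reporting style: proportion of variance explained and the fraction of predictions within one point of the human score.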

  10. Data Processing Languages for Business Intelligence. SQL vs. R

    Directory of Open Access Journals (Sweden)

    Marin FOTACHE

    2016-01-01

    Full Text Available As a data-centric approach, Business Intelligence (BI) deals with the storage, integration, processing, exploration and analysis of information gathered from multiple sources in various formats and volumes. BI systems are generally synonymous with costly, complex platforms that require vast organizational resources. But there is also another face of BI, that of a pool of data sources, applications and services developed at different times using different technologies. This is “democratic” BI or, in some cases, “fragmented”, “patched” (or “chaotic”) BI. Fragmentation not only creates integration problems, but also supports BI agility, as new modules can be quickly developed. Among the various languages and tools that cover large extents of BI activities, SQL and R are instrumental for both BI platform developers and BI users. SQL and R address both monolithic and democratic BI. This paper compares essential data processing features of the two languages, identifying similarities and differences as well as their strengths and limits.
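The declarative-versus-procedural contrast the paper draws can be shown on a tiny aggregation. Plain Python stands in for R here, and the table and data are invented; SQLite's in-memory engine runs the SQL side:

```python
import sqlite3

rows = [("north", 10), ("south", 5), ("north", 7), ("south", 3)]

# declarative: the aggregation as a SQL GROUP BY
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (region TEXT, amount INTEGER)")
con.executemany("INSERT INTO sales VALUES (?, ?)", rows)
sql_result = dict(con.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region").fetchall())

# procedural: the same aggregation as explicit iteration
py_result = {}
for region, amount in rows:
    py_result[region] = py_result.get(region, 0) + amount
```

Both formulations produce the same grouped sums; the difference is purely in where the data-processing logic lives.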

  11. Intelligent query processing for semantic mediation of information systems

    Directory of Open Access Journals (Sweden)

    Saber Benharzallah

    2011-11-01

    Full Text Available We propose an intelligent and efficient query processing approach for the semantic mediation of information systems, together with a generic multi-agent architecture that supports it. Our approach focuses on the exploitation of intelligent agents for query reformulation and the use of a new technology for semantic representation. The algorithm is self-adapting to changes in the environment, offers wide applicability and resolves the various data conflicts in a dynamic way; it also reformulates the query using the schema mediation method for the discovered systems and context mediation for the other systems.

  12. Cognitive Process as a Basis for Intelligent Retrieval Systems Design.

    Science.gov (United States)

    Chen, Hsinchun; Dhar, Vasant

    1991-01-01

    Two studies of the cognitive processes involved in online document-based information retrieval were conducted. These studies led to the development of five computational models of online document retrieval which were incorporated into the design of an "intelligent" document-based retrieval system. Both the system and the broader implications of…

  13. Meaning of cognitive processes for creating artificial intelligence

    OpenAIRE

    Pangrác, Vojtěch

    2011-01-01

    This diploma thesis brings an integral view of the cognitive processes connected with artificial intelligence systems, and makes a comparison with the processes observed in nature, including in human beings. A historical background helps us to look at the whole issue from a certain point of view. The main axis of interest comes after the historical overview and includes the following: environment -- stimulations -- processing -- reflection in the cognitive system -- reaction to stimulation; I balance...

  14. Intelligent control for scalable video processing

    NARCIS (Netherlands)

    Wüst, C.C.

    2006-01-01

    In this thesis we study a problem related to cost-effective video processing in software by consumer electronics devices, such as digital TVs. Video processing is the task of transforming an input video signal into an output video signal, for example to improve the quality of the signal. This

  15. Advances in Reasoning-Based Image Processing Intelligent Systems Conventional and Intelligent Paradigms

    CERN Document Server

    Nakamatsu, Kazumi

    2012-01-01

    The book puts special stress on contemporary techniques for reasoning-based image processing and analysis: learning-based image representation and advanced video coding; intelligent image processing and analysis in medical vision systems; similarity learning models for image reconstruction; visual perception for mobile robot motion control; simulation of human brain activity in the analysis of video sequences; shape-based invariant feature extraction; essentials of paraconsistent neural networks; and creativity and intelligent representation in computational systems. The book comprises 14 chapters. Each chapter is a small monograph, representing recent investigations of its authors in the area. The topics of the chapters cover wide scientific and application areas and complement each other very well. Each chapter's content is based on fundamental theoretical presentations, followed by experimental results and comparison with similar techniques. The size of the chapters is well-balanced, which permits a thorough ...

  16. Intelligent Controller Design for a Chemical Process

    OpenAIRE

    Mr. Glan Devadhas G; Dr. Pushpakumar S.

    2010-01-01

    Chemical process control is a challenging problem due to the strong on-line non-linearity and extreme sensitivity to disturbances of the process. Ziegler–Nichols tuned PI and PID controllers are found to provide poor performance for higher-order and non-linear systems. This paper presents an application of a one-step-ahead fuzzy as well as ANFIS (adaptive-network-based fuzzy inference system) tuning scheme for a Continuous Stirred Tank Reactor (CSTR) process. The controller is designed based ...

  17. Towards a universal competitive intelligence process model

    Directory of Open Access Journals (Sweden)

    Rene Pellissier

    2013-08-01

    Objectives: The purpose of this research is to review the current literature on CI, to look at the aims of identifying and analysing CI process models, and finally to propose a universal CI process model. Method: The study was qualitative in nature and content analysis was conducted on all identified sources establishing and analysing CI process models. To identify relevant literature, academic databases and search engines were used. Moreover, a review of references in related studies led to more relevant sources, the references of which were further reviewed and analysed. To ensure reliability, only peer-reviewed articles were used. Results: The findings reveal that the majority of scholars view the CI process as a cycle of interrelated phases. The output of one phase is the input of the next phase. Conclusion: The CI process is a cycle of interrelated phases. The output of one phase is the input of the next phase. These phases are influenced by the following factors: decision makers, process and structure, organisational awareness and culture, and feedback.

  18. The Role of Intelligence Quotient and Emotional Intelligence in Cognitive Control Processes

    Science.gov (United States)

    Checa, Purificación; Fernández-Berrocal, Pablo

    2015-01-01

    The relationship between intelligence quotient (IQ) and cognitive control processes has been extensively established. Several studies have shown that IQ correlates with cognitive control abilities, such as interference suppression, as measured with experimental tasks like the Stroop and Flanker tasks. By contrast, there is a debate about the role of Emotional Intelligence (EI) in individuals' cognitive control abilities. The aim of this study is to examine the relation between IQ and EI, and cognitive control abilities evaluated by a typical laboratory control cognitive task, the Stroop task. Results show a negative correlation between IQ and the interference suppression index, the ability to inhibit processing of irrelevant information. However, the Managing Emotions dimension of EI measured by the Mayer-Salovey-Caruso Emotional Intelligence Test (MSCEIT), but not self-reported EI, negatively correlates with the impulsivity index, the premature execution of the response. These results suggest that not only is IQ crucial, but also competences related to EI are essential to human cognitive control processes. Limitations and implications of these results are also discussed. PMID:26648901

  19. The role of Intelligence Quotient and Emotional Intelligence in cognitive control processes

    Directory of Open Access Journals (Sweden)

    Purificación eCheca

    2015-12-01

    Full Text Available The relationship between intelligence quotient (IQ) and cognitive control processes has been extensively established. Several studies have shown that IQ correlates with cognitive control abilities, such as interference suppression, as measured with experimental tasks like the Stroop and Flanker tasks. By contrast, there is a debate about the role of Emotional Intelligence (EI) in individuals’ cognitive control abilities. The aim of this study is to examine the relation between IQ and EI, and cognitive control abilities evaluated by a typical laboratory control cognitive task, the Stroop task. Results show a negative correlation between IQ and the interference suppression index, the ability to inhibit processing of irrelevant information. However, the Managing Emotions dimension of EI measured by the Mayer-Salovey-Caruso Emotional Intelligence Test, but not self-reported EI, negatively correlates with the impulsivity index, the premature execution of the response. These results suggest that not only is IQ crucial, but also competences related to EI are essential to human cognitive control processes. Limitations and implications of these results are also discussed.

  20. Matching intelligent systems with business process reengineering

    NARCIS (Netherlands)

    Hart, 't M.W.

    1996-01-01

    According to Venkatraman (1991) five degrees of IT-induced business reconfiguration can be distinguished: (1) localized exploitation of IT, (2) internal integration, (3) business process redesign, (4) business network redesign, and (5) business scope redefinition. On each of these levels, different

  1. Intelligent Predictive Control of Nonlinear Processes Using

    DEFF Research Database (Denmark)

    Nørgård, Peter Magnus; Sørensen, Paul Haase; Poulsen, Niels Kjølstad

    1996-01-01

    This paper presents a novel approach to design of generalized predictive controllers (GPC) for nonlinear processes. A neural network is used for modelling the process and a gain-scheduling type of GPC is subsequently designed. The combination of neural network models and predictive control has...... frequently been discussed in the neural network community. This paper proposes an approximate scheme, the approximate predictive control (APC), which facilitates the implementation and gives a substantial reduction in the required amount of computations. The method is based on a technique for extracting...... linear models from a nonlinear neural network and using them in designing the control system. The performance of the controller is demonstrated in a simulation study of a pneumatic servo system...
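The key step described here (extracting a local linear model from a neural network model of the process) can be sketched as follows. The tiny, randomly initialised network stands in for a trained process model, and the operating point is invented; the paper's own extraction technique may differ in detail:

```python
import numpy as np

rng = np.random.default_rng(3)
W1, b1 = rng.normal(size=(4, 2)), rng.normal(size=4)
W2, b2 = rng.normal(size=(1, 4)), rng.normal(size=1)

def nn(x):
    # one-hidden-layer process model: predicts y(k+1) from [y(k), u(k)]
    return (W2 @ np.tanh(W1 @ x + b1) + b2)[0]

x0 = np.array([0.5, 0.1])  # operating point [y(k), u(k)]
eps = 1e-6
grad = np.array([(nn(x0 + eps * e) - nn(x0 - eps * e)) / (2 * eps)
                 for e in np.eye(2)])  # local gains [a, b]: y(k+1) ≈ a*y(k) + b*u(k) + c
c = nn(x0) - grad @ x0

# near the operating point, the extracted linear model matches the network
dx = np.array([0.005, -0.005])
lin_pred = grad @ (x0 + dx) + c
nn_pred = nn(x0 + dx)
```

A gain-scheduled GPC would then design its controller on this local linear model and re-extract the gains as the operating point moves.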

  2. Intelligence and neuroticism in relation to depression and psychological distress: Evidence from two large population cohorts.

    Science.gov (United States)

    Navrady, L B; Ritchie, S J; Chan, S W Y; Kerr, D M; Adams, M J; Hawkins, E H; Porteous, D; Deary, I J; Gale, C R; Batty, G D; McIntosh, A M

    2017-06-01

    Neuroticism is a risk factor for selected mental and physical illnesses and is inversely associated with intelligence. Intelligence appears to interact with neuroticism and mitigate its detrimental effects on physical health and mortality. However, the inter-relationships of neuroticism and intelligence for major depressive disorder (MDD) and psychological distress have not been well examined. Associations and interactions between neuroticism and general intelligence (g) on MDD, self-reported depression, and psychological distress were examined in two population-based cohorts: Generation Scotland: Scottish Family Health Study (GS:SFHS, n=19,200) and UK Biobank (n=90,529). The Eysenck Personality Scale Short Form-Revised measured neuroticism, and g was extracted from multiple cognitive ability tests in each cohort. Family structure was adjusted for in GS:SFHS. Neuroticism was strongly associated with increased risk for depression and higher psychological distress in both samples. Although intelligence conferred no consistent independent effects on depression, it did increase the risk for depression across samples once neuroticism was adjusted for. Results suggest that higher intelligence may ameliorate the association between neuroticism and self-reported depression, although no significant interaction was found for clinical MDD. Intelligence was inversely associated with psychological distress across cohorts. A small interaction was found across samples such that lower psychological distress associates with higher intelligence and lower neuroticism, although effect sizes were small. From two large cohort studies, our findings suggest intelligence acts as a protective factor in mitigating the effects of neuroticism on psychological distress. Intelligence does not confer protection against diagnosis of depression in those high in neuroticism. Copyright © 2017 The Authors. Published by Elsevier Masson SAS. All rights reserved.

  3. Building the competitive intelligence knowledge: processes and activities in a corporate organisation

    OpenAIRE

    Sreenivasulu, V.

    1999-01-01

    This paper discusses the process of building and developing comprehensive tools, techniques, support systems, and better methods of harnessing the competitive intelligence knowledge processes. The author stresses the need for building sophisticated methodological competitive intelligence knowledge acquisition, systematic collection of competitive intelligence knowledge from various sources for critical analysis, process, organization, synthesis, assessment, screening, filtering and interpreta...

  4. A conceptual framework for intelligent real-time information processing

    Science.gov (United States)

    Schudy, Robert

    1987-01-01

    By combining artificial intelligence concepts with the human information processing model of Rasmussen, a conceptual framework was developed for real-time artificial intelligence systems which provides a foundation for system organization, control and validation. The approach is based on the description of system processing in terms of an abstraction hierarchy of states of knowledge. The states of knowledge are organized along one dimension which corresponds to the extent to which the concepts are expressed in terms of the system inputs or in terms of the system response. Thus organized, the useful states form a generally triangular shape, with the sensors and effectors forming the lower two vertices and the fully evaluated set of courses of action the apex. Within the triangle boundaries are numerous processing paths which shortcut the detailed processing by connecting incomplete levels of analysis to partially defined responses. Shortcuts at different levels of abstraction include reflexes, sensory-motor control, rule-based behavior, and satisficing. This approach was used in the design of a real-time tactical decision aiding system, and in defining an intelligent aiding system for transport pilots.

  5. An intelligent allocation algorithm for parallel processing

    Science.gov (United States)

    Carroll, Chester C.; Homaifar, Abdollah; Ananthram, Kishan G.

    1988-01-01

    The problem of allocating nodes of a program graph to processors in a parallel processing architecture is considered. The algorithm is based on critical path analysis, some allocation heuristics, and the execution granularity of nodes in a program graph. These factors, and the structure of the interprocessor communication network, influence the allocation. To achieve realistic estimations of the execution durations of allocations, the algorithm considers the fact that nodes in a program graph have to communicate through varying numbers of tokens. Coarse and fine granularities have been implemented, with interprocessor token-communication duration varying from zero up to values comparable to the execution durations of individual nodes. The effect of communication network structures on allocation is demonstrated by performing allocations for crossbar (non-blocking) and star (blocking) networks. The algorithm assumes the availability of as many processors as it needs for the optimal allocation of any program graph. Hence, the focus of allocation has been on varying token-communication durations rather than varying the number of processors. The algorithm always utilizes as many processors as necessary for the optimal allocation of any program graph, depending upon granularity and characteristics of the interprocessor communication network.
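The critical-path idea at the heart of such allocators can be sketched on a toy program graph. The five-node graph, durations and two-processor setup below are invented, and token-communication costs are ignored for brevity:

```python
# durations and precedence of a hypothetical five-node program graph
duration = {"a": 3, "b": 2, "c": 4, "d": 2, "e": 1}
succs = {"a": ["b", "c"], "b": ["d"], "c": ["d"], "d": ["e"], "e": []}
preds = {n: [] for n in duration}
for n, ss in succs.items():
    for s in ss:
        preds[s].append(n)

def cp_length(n, memo={}):
    # longest path from n to a sink, counting n itself (the node's criticality)
    if n not in memo:
        memo[n] = duration[n] + max((cp_length(s) for s in succs[n]), default=0)
    return memo[n]

n_proc = 2
free_at = [0.0] * n_proc  # time at which each processor becomes idle
finish, alloc, done = {}, {}, set()
ready = [n for n in duration if not preds[n]]
while ready:
    ready.sort(key=cp_length, reverse=True)
    n = ready.pop(0)  # schedule the most critical ready node first
    p = min(range(n_proc), key=lambda i: free_at[i])
    start = max(free_at[p], max((finish[q] for q in preds[n]), default=0.0))
    finish[n] = start + duration[n]
    free_at[p] = finish[n]
    alloc[n] = p
    done.add(n)
    for s in succs[n]:
        if all(q in done for q in preds[s]):
            ready.append(s)

makespan = max(finish.values())
```

For this graph the critical path a→c→d→e has length 10, and list scheduling by criticality achieves exactly that makespan on two processors.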

  6. Intelligent Adaptation Process for Case Based Systems

    International Nuclear Information System (INIS)

    Nassar, A.M.; Mohamed, A.H.; Mohamed, A.H.

    2014-01-01

    Case Based Reasoning (CBR) systems are among the important decision-making systems applied in many fields all over the world. The effectiveness of any CBR system is based on the quality of the cases stored in the case library. Similar cases can be retrieved and adapted to produce the solution for the new problem. One of the main issues facing CBR systems is the difficulty of acquiring useful cases. The proposed system introduces a new approach that uses the genetic algorithm (GA) technique to automate constructing the cases in the case library and to select the best ones to store for future use. The proposed system can thereby avoid the problems of uncertain and noisy cases, simplify the retrieval and adaptation processes, and so improve the performance of the CBR system. The suggested system can be applied to many real-time problems. It has been applied to diagnosing faults in wireless networks, diagnosing cancer diseases, and debugging software as case studies. The proposed system has proved its performance in this field
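The GA-based case-base maintenance idea can be sketched as evolving a bit-mask over candidate cases. The cases and the coverage-versus-size fitness function below are invented for illustration; the paper's actual encoding and objectives may differ:

```python
import random

random.seed(0)
cases = [(x, x * 2) for x in range(20)]  # (problem feature, solution) pairs, invented

def fitness(mask):
    # reward coverage of the problem space, penalise case-base size
    kept = [c for c, m in zip(cases, mask) if m]
    if not kept:
        return -1e9
    cover = -sum(min(abs(p - k[0]) for k in kept) for p, _ in cases)
    return cover - 0.5 * len(kept)

pop = [[random.randint(0, 1) for _ in cases] for _ in range(30)]
f0 = max(map(fitness, pop))  # best fitness before evolution
for _ in range(100):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]  # elitism: the fittest masks survive unchanged
    children = []
    for _ in range(20):
        a, b = random.sample(parents, 2)
        cut = random.randrange(len(cases))
        child = a[:cut] + b[cut:]  # one-point crossover
        i = random.randrange(len(cases))
        child[i] ^= random.random() < 0.1  # occasional bit-flip mutation
        children.append(child)
    pop = parents + children

best = max(pop, key=fitness)
```

Because the top masks are carried over each generation, the best selection can only improve or stay the same as evolution proceeds.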

  7. Employing the intelligence cycle process model within the Homeland Security Enterprise

    OpenAIRE

    Stokes, Roger L.

    2013-01-01

    CHDS State/Local The purpose of this thesis was to examine the employment and adherence of the intelligence cycle process model within the National Network of Fusion Centers and the greater Homeland Security Enterprise by exploring the customary intelligence cycle process model established by the United States Intelligence Community (USIC). This thesis revealed there are various intelligence cycle process models used by the USIC and taught to the National Network. Given the numerous differ...

  8. Artificial intelligence, expert systems, computer vision, and natural language processing

    Science.gov (United States)

    Gevarter, W. B.

    1984-01-01

    An overview of artificial intelligence (AI), its core ingredients, and its applications is presented. The knowledge representation, logic, problem solving approaches, languages, and computers pertaining to AI are examined, and the state of the art in AI is reviewed. The use of AI in expert systems, computer vision, natural language processing, speech recognition and understanding, speech synthesis, problem solving, and planning is examined. Basic AI topics, including automation, search-oriented problem solving, knowledge representation, and computational logic, are discussed.

  9. Working memory and intelligibility of hearing-aid processed speech

    Science.gov (United States)

    Souza, Pamela E.; Arehart, Kathryn H.; Shen, Jing; Anderson, Melinda; Kates, James M.

    2015-01-01

    Previous work suggested that individuals with low working memory capacity may be at a disadvantage in adverse listening environments, including situations with background noise or substantial modification of the acoustic signal. This study explored the relationship between patient factors (including working memory capacity) and intelligibility and quality of modified speech for older individuals with sensorineural hearing loss. The modification was created using a combination of hearing aid processing [wide-dynamic range compression (WDRC) and frequency compression (FC)] applied to sentences in multitalker babble. The extent of signal modification was quantified via an envelope fidelity index. We also explored the contribution of components of working memory by including measures of processing speed and executive function. We hypothesized that listeners with low working memory capacity would perform more poorly than those with high working memory capacity across all situations, and would also be differentially affected by high amounts of signal modification. Results showed a significant effect of working memory capacity for speech intelligibility, and an interaction between working memory, amount of hearing loss and signal modification. Signal modification was the major predictor of quality ratings. These data add to the literature on hearing-aid processing and working memory by suggesting that the working memory-intelligibility effects may be related to aggregate signal fidelity, rather than to the specific signal manipulation. They also suggest that for individuals with low working memory capacity, sensorineural loss may be most appropriately addressed with WDRC and/or FC parameters that maintain the fidelity of the signal envelope. PMID:25999874
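    An envelope fidelity index of the kind used in this study can be approximated by correlating frame-wise RMS envelopes of the clean and processed signals. This is a hedged sketch, not the exact index from the paper: the frame length and the RMS envelope definition are assumptions.

```python
import numpy as np

def envelope_fidelity(reference, processed, frame=64):
    """Correlate frame-wise RMS envelopes of clean and processed signals.
    A value of 1.0 means the amplitude envelope is perfectly preserved;
    pure gain changes do not count as modification, but nonlinear
    compression of the envelope lowers the index."""
    def env(x):
        x = np.asarray(x, dtype=float)
        n = len(x) // frame * frame
        return np.sqrt(np.mean(x[:n].reshape(-1, frame) ** 2, axis=1))
    return float(np.corrcoef(env(reference), env(processed))[0, 1])
```

    Under this sketch, wide-dynamic range compression settings that squash the envelope would score lower than settings that preserve its shape, in line with the study's conclusion about aggregate signal fidelity.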

  10. Advanced multiresponse process optimisation an intelligent and integrated approach

    CERN Document Server

    Šibalija, Tatjana V

    2016-01-01

    This book presents an intelligent, integrated, problem-independent method for multiresponse process optimization. In contrast to traditional approaches, the idea of this method is to provide a unique model for the optimization of various processes, without imposition of assumptions relating to the type of process, the type and number of process parameters and responses, or interdependences among them. The presented method for experimental design of processes with multiple correlated responses is composed of three modules: an expert system that selects the experimental plan based on the orthogonal arrays; the factor effects approach, which performs processing of experimental data based on Taguchi’s quality loss function and multivariate statistical methods; and process modeling and optimization based on artificial neural networks and metaheuristic optimization algorithms. The implementation is demonstrated using four case studies relating to high-tech industries and advanced, non-conventional processes.
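    The factor-effects module rests on Taguchi's quadratic quality loss. A minimal nominal-the-best version, with the standard decomposition of expected loss into variance plus squared bias, might look like the following (the cost constant k and the single-response form are simplifying assumptions):

```python
def taguchi_loss(y, target, k=1.0):
    """Taguchi nominal-the-best loss: quadratic penalty for deviation from
    the target value; k converts squared deviation into cost units."""
    return k * (y - target) ** 2

def average_loss(values, target, k=1.0):
    """Expected loss over a sample decomposes as k * (variance + bias^2),
    so reducing scatter and centering on target both lower the loss."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / n
    return k * (var + (mean - target) ** 2)
```

    The decomposition is why Taguchi-style optimization targets both the mean response and its variability rather than the mean alone.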

  11. [INVITED] Computational intelligence for smart laser materials processing

    Science.gov (United States)

    Casalino, Giuseppe

    2018-03-01

    Computational intelligence (CI) involves using computer algorithms to capture hidden knowledge from data and to use it to train an "intelligent machine" to make complex decisions without human intervention. As simulation becomes more prevalent from design and planning to manufacturing and operations, laser materials processing can also benefit from computer-generated knowledge through soft computing. This work is a review of the state of the art in the methodology and applications of CI in laser materials processing (LMP), which is nowadays receiving increasing interest from world-class manufacturers and Industry 4.0. The focus is on methods that have proven effective and robust in solving several problems in welding, cutting, drilling, surface treating and additive manufacturing using the laser beam. After a basic description of the most common computational intelligence techniques employed in manufacturing, four sections, namely laser joining, machining, surface treatment, and additive manufacturing, cover the most recent applications in the already extensive literature on CI in LMP. Finally, emerging trends and future challenges are identified and discussed.

  12. Artificial intelligence implementation in the APS process diagnostic

    Energy Technology Data Exchange (ETDEWEB)

    Guessasma, Sofiane; Salhi, Zahir; Montavon, Ghislain; Gougeon, Patrick; Coddet, Christian

    2004-07-25

    Thermal spraying is a coating manufacturing technique implementing a wide variety of materials and processes, characterized by up to 150 processing parameters that influence the coating properties. Controlling coating quality requires a robust methodology that takes into account parameter interdependencies and process variability, and offers the ability to quantify processing parameter-process response relationships. The aim of this work is to introduce a new approach based on artificial intelligence that responds to these requirements. A detailed procedure is presented considering an artificial neural network (ANN) structure which implicitly encodes the physical phenomena governing the process. The implementation of such a structure was coupled with experimental results from an optical sensor monitoring the powder particle fusion state before coating formation. The optimization steps are discussed and the predicted results are compared to the experimental ones, allowing identification of the control factors.

  13. Artificial intelligence implementation in the APS process diagnostic

    International Nuclear Information System (INIS)

    Guessasma, Sofiane; Salhi, Zahir; Montavon, Ghislain; Gougeon, Patrick; Coddet, Christian

    2004-01-01

    Thermal spraying is a coating manufacturing technique implementing a wide variety of materials and processes, characterized by up to 150 processing parameters that influence the coating properties. Controlling coating quality requires a robust methodology that takes into account parameter interdependencies and process variability, and offers the ability to quantify processing parameter-process response relationships. The aim of this work is to introduce a new approach based on artificial intelligence that responds to these requirements. A detailed procedure is presented considering an artificial neural network (ANN) structure which implicitly encodes the physical phenomena governing the process. The implementation of such a structure was coupled with experimental results from an optical sensor monitoring the powder particle fusion state before coating formation. The optimization steps are discussed and the predicted results are compared to the experimental ones, allowing identification of the control factors.
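    A parameter-to-response mapping of the kind these two records describe can be sketched with a tiny one-hidden-layer network trained by gradient descent. The architecture, inputs and training details below are illustrative assumptions, not the authors' actual ANN:

```python
import numpy as np

def train_mlp(X, y, hidden=6, lr=0.05, epochs=3000, seed=0):
    """One-hidden-layer tanh network fitted by full-batch gradient descent,
    standing in for an ANN that maps processing parameters to a process
    response.  Returns a prediction function."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0.0, 0.5, (X.shape[1], hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0.0, 0.5, (hidden, 1)); b2 = np.zeros(1)
    for _ in range(epochs):
        h = np.tanh(X @ W1 + b1)              # hidden activations
        err = (h @ W2 + b2) - y               # residuals on the batch
        gW2 = h.T @ err / len(X)
        gh = (err @ W2.T) * (1.0 - h ** 2)    # back-propagated gradient
        gW1 = X.T @ gh / len(X)
        W1 -= lr * gW1; b1 -= lr * gh.mean(0)
        W2 -= lr * gW2; b2 -= lr * err.mean(0)
    return lambda Z: np.tanh(Z @ W1 + b1) @ W2 + b2
```

    Trained on sensor measurements from prior spray runs, such a model lets predicted responses be compared against experimental ones, which is how the papers identify the controlling factors.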

  14. Natural language processing in psychiatry. Artificial intelligence technology and psychopathology.

    Science.gov (United States)

    Garfield, D A; Rapp, C; Evens, M

    1992-04-01

    The potential benefit of artificial intelligence (AI) technology as a tool of psychiatry has not been well defined. In this essay, the technology of natural language processing and its position with regard to the two main schools of AI is clearly outlined. Past experiments utilizing AI techniques in understanding psychopathology are reviewed. Natural language processing can automate the analysis of transcripts and can be used in modeling theories of language comprehension. In these ways, it can serve as a tool in testing psychological theories of psychopathology and can be used as an effective tool in empirical research on verbal behavior in psychopathology.

  15. Ramp Technology and Intelligent Processing in Small Manufacturing

    Science.gov (United States)

    Rentz, Richard E.

    1992-01-01

    To address the issues of excessive inventories and increasing procurement lead times, the Navy is actively pursuing flexible computer integrated manufacturing (FCIM) technologies, integrated by communication networks to respond rapidly to its requirements for parts. The Rapid Acquisition of Manufactured Parts (RAMP) program, initiated in 1986, is an integral part of this effort. The RAMP program's goal is to reduce the current average production lead times experienced by the Navy's inventory control points by a factor of 90 percent. The manufacturing engineering component of the RAMP architecture utilizes an intelligent processing technology built around a knowledge-based shell provided by ICAD, Inc. Rules and data bases in the software simulate an expert manufacturing planner's knowledge of shop processes and equipment. This expert system can use Product Data Exchange using STEP (PDES) data to determine what features the required part has, what material is required to manufacture it, what machines and tools are needed, and how the part should be held (fixtured) for machining, among other factors. The program's rule base then indicates, for example, how to make each feature, in what order to make it, and to which machines on the shop floor the part should be routed for processing. This information becomes part of the shop work order. The process planning function under RAMP greatly reduces the time and effort required to complete a process plan. Since the PDES file that drives the intelligent processing is 100 percent complete and accurate to start with, the potential for costly errors is greatly diminished.

  16. Open-source intelligence in the Czech military knowledge syst em and process design

    OpenAIRE

    Krejci, Roman

    2002-01-01

    Owing to the recent transitions in the Czech Republic, the Czech military must satisfy a large set of new requirements. One way the military intelligence can become more effective and can conserve resources is by increasing the efficiency of open-source intelligence (OSINT), which plays an important part in intelligence gathering in the age of information. When using OSINT effectively, the military intelligence can elevate its responsiveness to different types of crises and can also properly ...

  17. The Relationship between Multiple Intelligences with Preferred Science Teaching and Science Process Skills

    Directory of Open Access Journals (Sweden)

    Mohd Ali Samsudin

    2015-02-01

    Full Text Available This study was undertaken to identify the relationship between multiple intelligences, preferred science teaching and science process skills. The design of the study is a survey using three questionnaires reported in the literature: the Multiple Intelligences Questionnaire, the Preferred Science Teaching Questionnaire and the Science Process Skills Questionnaire. The study selected 300 primary school students from five (5) primary schools in Penang, Malaysia. The findings showed a relationship between kinesthetic, logical-mathematical, visual-spatial and naturalistic intelligences and preferred science teaching. In addition, there was a correlation between kinesthetic and visual-spatial intelligences and science process skills, implying that multiple intelligences are related to science learning.

  18. The ethical intelligence: a tool guidance in the process of the negotiation

    Directory of Open Access Journals (Sweden)

    Cristina Seijo

    2014-08-01

    Full Text Available This article is the result of a research study that presents a theoretical contrast inviting reflection on ethical intelligence as a guiding tool in negotiation. It addresses different types of intelligence (spatial, rational and emotional, among others) and relates ethical intelligence to negotiation processes and negotiation tactics. In this respect, ethical intelligence can be treated as the ability to examine the moral standards of the individual and of society, to decide between what is correct and what is incorrect, and thus to resolve the various problems that an individual or a society faces. For this reason, the article calls for mechanisms of transparency and participation by virtue of which ethical intelligence is borne in mind as the threshold that orients the negotiation process.

  19. Working memory and intelligibility of hearing-aid processed speech

    Directory of Open Access Journals (Sweden)

    Pamela eSouza

    2015-05-01

    Full Text Available Previous work suggested that individuals with low working memory capacity may be at a disadvantage in adverse listening environments, including situations with background noise or substantial modification of the acoustic signal. This study explored the relationship between patient factors (including working memory capacity) and intelligibility and quality of modified speech for older individuals with sensorineural hearing loss. The modification was created using a combination of hearing aid processing (wide-dynamic range compression and frequency compression) applied to sentences in multitalker babble. The extent of signal modification was quantified via an envelope fidelity index. We also explored the contribution of components of working memory by including measures of processing speed and executive function. We hypothesized that listeners with low working memory capacity would perform more poorly than those with high working memory capacity across all situations, and would also be differentially affected by high amounts of signal modification. Results showed a significant effect of working memory capacity for speech intelligibility, and an interaction between working memory, amount of hearing loss and signal modification. Signal modification was the major predictor of quality ratings. These data add to the literature on hearing-aid processing and working memory by suggesting that the working memory-intelligibility effects may be related to aggregate signal fidelity, rather than to the specific signal manipulation. They also suggest that for individuals with low working memory capacity, sensorineural loss may be most appropriately addressed with wide-dynamic range compression and/or frequency compression parameters that maintain the fidelity of the signal envelope.

  20. Adaptive Moving Object Tracking Integrating Neural Networks And Intelligent Processing

    Science.gov (United States)

    Lee, James S. J.; Nguyen, Dziem D.; Lin, C.

    1989-03-01

    A real-time adaptive scheme is introduced to detect and track moving objects under noisy, dynamic conditions including moving sensors. This approach integrates the adaptiveness and incremental learning characteristics of neural networks with intelligent reasoning and process control. Spatiotemporal filtering is used to detect and analyze motion, exploiting the speed and accuracy of multiresolution processing. A neural network algorithm constitutes the basic computational structure for classification. A recognition and learning controller guides the on-line training of the network, and invokes pattern recognition to determine processing parameters dynamically and to verify detection results. A tracking controller acts as the central control unit, so that tracking goals direct the over-all system. Performance is benchmarked against the Widrow-Hoff algorithm, for target detection scenarios presented in diverse FLIR image sequences. Efficient algorithm design ensures that this recognition and control scheme, implemented in software and commercially available image processing hardware, meets the real-time requirements of tracking applications.
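    The Widrow-Hoff benchmark mentioned above is the least-mean-squares (LMS) adaptive filter, whose weight update follows the instantaneous error gradient. A minimal sketch (the tap count and step size are arbitrary illustrative choices):

```python
def lms_filter(x, d, taps=4, mu=0.05):
    """Widrow-Hoff / least-mean-squares adaptive FIR filter: for each input
    sample the weights move along the instantaneous error gradient,
    w <- w + mu * e * x, where e = desired - output."""
    w = [0.0] * taps
    buf = [0.0] * taps          # most recent input first
    errs = []
    for xi, di in zip(x, d):
        buf = [xi] + buf[:-1]
        y = sum(wi * bi for wi, bi in zip(w, buf))
        e = di - y
        w = [wi + mu * e * bi for wi, bi in zip(w, buf)]
        errs.append(e)
    return w, errs
```

    Fed a broadband input and a desired signal produced by an unknown FIR system, the weights converge to that system's coefficients, which is the classical system-identification use of Widrow-Hoff against which the paper benchmarks its detector.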

  1. Intelligence

    Science.gov (United States)

    Sternberg, Robert J.

    2012-01-01

    Intelligence is the ability to learn from experience and to adapt to, shape, and select environments. Intelligence as measured by (raw scores on) conventional standardized tests varies across the lifespan, and also across generations. Intelligence can be understood in part in terms of the biology of the brain—especially with regard to the functioning in the prefrontal cortex—and also correlates with brain size, at least within humans. Studies of the effects of genes and environment suggest that the heritability coefficient (ratio of genetic to phenotypic variation) is between .4 and .8, although heritability varies as a function of socioeconomic status and other factors. Racial differences in measured intelligence have been observed, but race is a socially constructed rather than biological variable, so such differences are difficult to interpret. PMID:22577301

  2. Intelligence.

    Science.gov (United States)

    Sternberg, Robert J

    2012-03-01

    Intelligence is the ability to learn from experience and to adapt to, shape, and select environments. Intelligence as measured by (raw scores on) conventional standardized tests varies across the lifespan, and also across generations. Intelligence can be understood in part in terms of the biology of the brain-especially with regard to the functioning in the prefrontal cortex-and also correlates with brain size, at least within humans. Studies of the effects of genes and environment suggest that the heritability coefficient (ratio of genetic to phenotypic variation) is between .4 and .8, although heritability varies as a function of socioeconomic status and other factors. Racial differences in measured intelligence have been observed, but race is a socially constructed rather than biological variable, so such differences are difficult to interpret.

  3. Acoustic richness modulates the neural networks supporting intelligible speech processing.

    Science.gov (United States)

    Lee, Yune-Sang; Min, Nam Eun; Wingfield, Arthur; Grossman, Murray; Peelle, Jonathan E

    2016-03-01

    The information contained in a sensory signal plays a critical role in determining what neural processes are engaged. Here we used interleaved silent steady-state (ISSS) functional magnetic resonance imaging (fMRI) to explore how human listeners cope with different degrees of acoustic richness during auditory sentence comprehension. Twenty-six healthy young adults underwent scanning while hearing sentences that varied in acoustic richness (high vs. low spectral detail) and syntactic complexity (subject-relative vs. object-relative center-embedded clause structures). We manipulated acoustic richness by presenting the stimuli as unprocessed full-spectrum speech, or noise-vocoded with 24 channels. Importantly, although the vocoded sentences were spectrally impoverished, all sentences were highly intelligible. These manipulations allowed us to test how intelligible speech processing was affected by orthogonal linguistic and acoustic demands. Acoustically rich speech showed stronger activation than acoustically less-detailed speech in a bilateral temporoparietal network with more pronounced activity in the right hemisphere. By contrast, listening to sentences with greater syntactic complexity resulted in increased activation of a left-lateralized network including left posterior lateral temporal cortex, left inferior frontal gyrus, and left dorsolateral prefrontal cortex. Significant interactions between acoustic richness and syntactic complexity occurred in left supramarginal gyrus, right superior temporal gyrus, and right inferior frontal gyrus, indicating that the regions recruited for syntactic challenge differed as a function of acoustic properties of the speech. Our findings suggest that the neural systems involved in speech perception are finely tuned to the type of information available, and that reducing the richness of the acoustic signal dramatically alters the brain's response to spoken language, even when intelligibility is high. 

  4. USE OF ARTIFICIAL INTELLIGENCE TECHNIQUES IN QUALITY IMPROVING PROCESS

    OpenAIRE

    KALİTE İYİLEŞTİRME SÜRECİNDE YAPAY ZEKÂ KAYA; Orhan ENGİN

    2005-01-01

    Today, changing competition conditions and customer preferences have caused many differences in firms' viewpoint of quality studies. At the same time, improvements in computer technologies have accelerated the use of artificial intelligence. Artificial intelligence technologies are being used to solve many industry problems. In this paper, we investigate the use of artificial intelligence techniques to solve quality problems. The artificial intelligence techniques, which are used in quali...

  5. Fluid Intelligence and Automatic Neural Processes in Facial Expression Perception

    DEFF Research Database (Denmark)

    Liu, Tongran; Xiao, Tong; Li, Xiaoyan

    2015-01-01

    The relationship between human fluid intelligence and social-emotional abilities has been a topic of considerable interest. The current study investigated whether adolescents with different intellectual levels had different automatic neural processing of facial expressions. Two groups of adolescent males were enrolled: a high IQ group and an average IQ group. Age and parental socioeconomic status were matched between the two groups. Participants counted the numbers of the central cross changes while paired facial expressions were presented bilaterally in an oddball paradigm. There were two... Participants were required to concentrate on the primary task of counting the central cross changes and to ignore the expressions to ensure that facial expression processing was automatic. Event-related potentials (ERPs) were obtained during the tasks. The visual mismatch negativity (vMMN) components were...

  6. A new type of intelligent wireless sensing network for health monitoring of large-size structures

    Science.gov (United States)

    Lei, Ying; Liu, Ch.; Wu, D. T.; Tang, Y. L.; Wang, J. X.; Wu, L. J.; Jiang, X. D.

    2009-07-01

    In recent years, some innovative wireless sensing systems have been proposed. However, more exploration and research on wireless sensing systems are required before wireless systems can substitute for traditional wire-based systems. In this paper, a new type of intelligent wireless sensing network is proposed for the health monitoring of large-size structures. Hardware design of the new wireless sensing units is first studied. The wireless sensing unit mainly consists of functional modules for sensing interface, signal conditioning, signal digitization, computational core, wireless communication and battery management. Then, the software architecture of the unit is introduced. The sensing network has a two-level cluster-tree architecture with the Zigbee communication protocol. Important issues such as power saving and fault tolerance are considered in the designs of the new wireless sensing units and sensing network. Each cluster head in the network is characterized by computational capabilities that can be used to implement the computational methodologies of structural health monitoring, giving the wireless sensing units and sensing network "intelligent" characteristics. Primary tests on the measurement data collected by the wireless system are performed. The distributed computational capacity of the intelligent sensing network is also demonstrated. It is shown that the new type of intelligent wireless sensing network provides an efficient tool for structural health monitoring of large-size structures.

  7. Integrating artificial and human intelligence into tablet production process.

    Science.gov (United States)

    Gams, Matjaž; Horvat, Matej; Ožek, Matej; Luštrek, Mitja; Gradišek, Anton

    2014-12-01

    We developed a new machine learning-based method in order to facilitate the manufacturing processes of pharmaceutical products, such as tablets, in accordance with the Process Analytical Technology (PAT) and Quality by Design (QbD) initiatives. Our approach combines the data available from prior production runs with machine learning algorithms that are assisted by a human operator with expert knowledge of the production process. The process parameters encompass those that relate to the attributes of the precursor raw materials and those that relate to the manufacturing process itself. During manufacturing, our method allows the production operator to inspect the impacts of various settings of process parameters within their proven acceptable range, with the purpose of choosing the most promising values in advance of the actual batch manufacture. The interaction between the human operator and the artificial intelligence system provides improved performance and quality. We successfully implemented the method on data provided by a pharmaceutical company for a particular product, a tablet, under development. We tested the accuracy of the method in comparison with some other machine learning approaches. The method is especially suitable for analyzing manufacturing processes characterized by a limited amount of data.
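    The operator-in-the-loop step can be illustrated as follows: fit a quality model on prior production runs, then rank candidate parameter settings inside the proven acceptable range (PAR) for the operator to review. The linear model and grid search below are stand-ins; the paper's actual learner and parameters are not specified here.

```python
import numpy as np

def recommend_settings(X_hist, y_hist, ranges, n_grid=21, top_k=5):
    """Fit a quality model on prior runs (here: ordinary least squares with
    an intercept), score a grid of candidate settings restricted to the
    proven acceptable range, and return the top candidates for the
    operator to inspect."""
    A = np.column_stack([X_hist, np.ones(len(X_hist))])
    coef, *_ = np.linalg.lstsq(A, y_hist, rcond=None)
    grids = [np.linspace(lo, hi, n_grid) for lo, hi in ranges]
    cand = np.stack(np.meshgrid(*grids), axis=-1).reshape(-1, len(ranges))
    scores = np.column_stack([cand, np.ones(len(cand))]) @ coef
    order = np.argsort(-scores)[:top_k]
    return cand[order], scores[order]
```

    Restricting the candidate grid to the PAR is what keeps the recommendation step compliant with the QbD design space while the human expert makes the final choice.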

  8. Synthetic-Creative Intelligence and Psychometric Intelligence: Analysis of the Threshold Theory and Creative Process

    Science.gov (United States)

    Ferrando, Mercedes; Soto, Gloria; Prieto, Lola; Sáinz, Marta; Ferrándiz, Carmen

    2016-01-01

    There has been an increasing body of research to uncover the relationship between creativity and intelligence. This relationship usually has been examined using traditional measures of intelligence and seldom using new approaches (i.e. Ferrando et al. 2005). In this work, creativity is measured by tools developed based on Sternberg's successful…

  9. The brain as a distributed intelligent processing system: an EEG study.

    Science.gov (United States)

    da Rocha, Armando Freitas; Rocha, Fábio Theoto; Massad, Eduardo

    2011-03-15

    Various neuroimaging studies, both structural and functional, have provided support for the proposal that a distributed brain network is likely to be the neural basis of intelligence. The theory of Distributed Intelligent Processing Systems (DIPS), first developed in the field of Artificial Intelligence, was proposed to adequately model distributed neural intelligent processing. In addition, the neural efficiency hypothesis suggests that individuals with higher intelligence display more focused cortical activation during cognitive performance, resulting in lower total brain activation when compared with individuals who have lower intelligence. This may be understood as a property of the DIPS. In our study, a new EEG brain mapping technique, based on the neural efficiency hypothesis and the notion of the brain as a Distributed Intelligence Processing System, was used to investigate the correlations between IQ evaluated with WAIS (Wechsler Adult Intelligence Scale) and WISC (Wechsler Intelligence Scale for Children), and the brain activity associated with visual and verbal processing, in order to test the validity of a distributed neural basis for intelligence. The present results support these claims and the neural efficiency hypothesis.

  10. The brain as a distributed intelligent processing system: an EEG study.

    Directory of Open Access Journals (Sweden)

    Armando Freitas da Rocha

    Full Text Available BACKGROUND: Various neuroimaging studies, both structural and functional, have provided support for the proposal that a distributed brain network is likely to be the neural basis of intelligence. The theory of Distributed Intelligent Processing Systems (DIPS), first developed in the field of Artificial Intelligence, was proposed to adequately model distributed neural intelligent processing. In addition, the neural efficiency hypothesis suggests that individuals with higher intelligence display more focused cortical activation during cognitive performance, resulting in lower total brain activation when compared with individuals who have lower intelligence. This may be understood as a property of the DIPS. METHODOLOGY AND PRINCIPAL FINDINGS: In our study, a new EEG brain mapping technique, based on the neural efficiency hypothesis and the notion of the brain as a Distributed Intelligence Processing System, was used to investigate the correlations between IQ evaluated with WAIS (Wechsler Adult Intelligence Scale) and WISC (Wechsler Intelligence Scale for Children), and the brain activity associated with visual and verbal processing, in order to test the validity of a distributed neural basis for intelligence. CONCLUSION: The present results support these claims and the neural efficiency hypothesis.

  11. The Brain as a Distributed Intelligent Processing System: An EEG Study

    Science.gov (United States)

    da Rocha, Armando Freitas; Rocha, Fábio Theoto; Massad, Eduardo

    2011-01-01

    Background Various neuroimaging studies, both structural and functional, have provided support for the proposal that a distributed brain network is likely to be the neural basis of intelligence. The theory of Distributed Intelligent Processing Systems (DIPS), first developed in the field of Artificial Intelligence, was proposed to adequately model distributed neural intelligent processing. In addition, the neural efficiency hypothesis suggests that individuals with higher intelligence display more focused cortical activation during cognitive performance, resulting in lower total brain activation when compared with individuals who have lower intelligence. This may be understood as a property of the DIPS. Methodology and Principal Findings In our study, a new EEG brain mapping technique, based on the neural efficiency hypothesis and the notion of the brain as a Distributed Intelligence Processing System, was used to investigate the correlations between IQ evaluated with WAIS (Wechsler Adult Intelligence Scale) and WISC (Wechsler Intelligence Scale for Children), and the brain activity associated with visual and verbal processing, in order to test the validity of a distributed neural basis for intelligence. Conclusion The present results support these claims and the neural efficiency hypothesis. PMID:21423657

  12. Intelligent sensor networks the integration of sensor networks, signal processing and machine learning

    CERN Document Server

    Hu, Fei

    2012-01-01

    Although governments worldwide have invested significantly in intelligent sensor network research and applications, few books cover intelligent sensor networks from a machine learning and signal processing perspective. Filling this void, Intelligent Sensor Networks: The Integration of Sensor Networks, Signal Processing and Machine Learning focuses on the close integration of sensing, networking, and smart signal processing via machine learning. Based on the world-class research of award-winning authors, the book provides a firm grounding in the fundamentals of intelligent sensor networks, incl

  13. Efficient querying of large process model repositories

    NARCIS (Netherlands)

    Jin, Tao; Wang, Jianmin; La Rosa, M.; Hofstede, ter A.H.M.; Wen, Lijie

    2013-01-01

    Recent years have seen an increased uptake of business process management technology in industries. This has resulted in organizations trying to manage large collections of business process models. One of the challenges facing these organizations concerns the retrieval of models from large business

  14. Intelligent control system for continuous technological process of alkylation

    Science.gov (United States)

    Gebel, E. S.; Hakimov, R. A.

    2018-01-01

    The relevance of intelligent control for complex dynamic objects and processes is shown in this paper. A virtual analyzer model based on a neural network is proposed. Comparative analysis of mathematical models implemented in MATLAB showed that the most effective model, in terms of reproducibility of the result, is the one with seven neurons in the hidden layer, trained using the scaled conjugate gradient method. Comparison of laboratory analysis data with the theoretical model showed that the root-mean-square error does not exceed 3.5, and the calculated correlation coefficient corresponds to a "strong" connection between the values.
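The virtual-analyzer idea described in this abstract (a small regression network validated by root-mean-square error and correlation against laboratory values) can be sketched roughly as follows. This is a hypothetical reconstruction on synthetic data, not the paper's model: scikit-learn provides no scaled-conjugate-gradient solver, so `lbfgs` stands in for it, and the input/output relationship is invented.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 3.0, size=(300, 1))                  # stand-in process input
y = np.sin(2.0 * X[:, 0]) + 0.05 * rng.normal(size=300)   # stand-in lab value

# Seven hidden neurons, as in the abstract; 'lbfgs' replaces the scaled
# conjugate gradient solver, which scikit-learn does not provide.
model = MLPRegressor(hidden_layer_sizes=(7,), activation="tanh",
                     solver="lbfgs", max_iter=5000, random_state=0)
model.fit(X, y)

pred = model.predict(X)
rmse = float(np.sqrt(np.mean((pred - y) ** 2)))
corr = float(np.corrcoef(pred, y)[0, 1])
print(f"RMSE={rmse:.3f}  r={corr:.3f}")
```

The validation mirrors the abstract's criteria: a low RMSE together with a correlation coefficient close to 1 indicates a "strong" connection between predicted and measured values.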

  15. The process of implementing Competitive Intelligence in a company

    OpenAIRE

    František Bartes

    2013-01-01

    It is a common occurrence in business practice that the management of a company, in an effort to jump-start the function of the Competitive Intelligence unit, makes a number of mistakes and errors. Yet it is not difficult to avoid these missteps and achieve the desired level of Competitive Intelligence activities in a purposeful and effective manner. The author believes that a resolution of this problem lies in his concept of Competitive Intelligence viewed as a system application discipline ...

  16. Intelligent Integration between Human Simulated Intelligence and Expert Control Technology for the Combustion Process of Gas Heating Furnace

    Directory of Open Access Journals (Sweden)

    Yucheng Liu

    2014-01-01

    Full Text Available Because the control quality of the combustion process of a gas heating furnace is poor, this paper explored a strongly robust control algorithm to improve it. The paper analyzed the control difficulties of the complex combustion process of the gas heating furnace, summarized the cybernetic characteristics of the complex combustion process, researched control strategies for uncertain complex control processes, discussed the control model of the complex process, presented an intelligent integration of human-simulated intelligence and expert control technology, and constructed the control algorithm for combustion process control of the gas heating furnace. The simulation results showed that the proposed control algorithm not only improves the dynamic and steady-state quality of the combustion process but also yields an obvious energy-saving effect, making it a feasible and effective control strategy.

  17. Advances in intelligent process-aware information systems concepts, methods, and technologies

    CERN Document Server

    Oberhauser, Roy; Reichert, Manfred

    2017-01-01

    This book provides a state-of-the-art perspective on intelligent process-aware information systems and presents chapters on specific facets and approaches applicable to such systems. Further, it highlights novel advances and developments in various aspects of intelligent process-aware information systems and business process management systems. Intelligence capabilities are increasingly being integrated into or created in many of today’s software products and services. Process-aware information systems provide critical computing infrastructure to support the various processes involved in the creation and delivery of business products and services. Yet the integration of intelligence capabilities into process-aware information systems is a non-trivial yet necessary evolution of these complex systems. The book’s individual chapters address adaptive process management, case management processes, autonomically-capable processes, process-oriented information logistics, process recommendations, reasoning over ...

  18. Intelligent control for large-scale variable speed variable pitch wind turbines

    Institute of Scientific and Technical Information of China (English)

    Xinfang ZHANG; Daping XU; Yibing LIU

    2004-01-01

    Large-scale wind turbine generator systems have strong nonlinear multivariable characteristics with many uncertain factors and disturbances. Automatic control is crucial for the efficiency and reliability of wind turbines. On the basis of a simplified but adequate model of variable speed variable pitch wind turbines, the effective wind speed is estimated using an extended Kalman filter. The intelligent control scheme proposed in the paper comprises two loops that operate in synchronism with each other. At below-rated wind speed, the inner loop adopts adaptive fuzzy control based on a variable universe for generator torque regulation to realize maximum wind energy capture. At above-rated wind speed, a controller based on least squares support vector machines is proposed to adjust the pitch angle and keep rated output power. Simulation shows the effectiveness of the intelligent control.
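The effective-wind-speed estimation step can be illustrated with a deliberately simplified scalar Kalman filter. The paper uses a full extended Kalman filter on a nonlinear turbine model; the random-walk state model, noise levels, and wind profile below are illustrative assumptions only.

```python
import numpy as np

def kalman_scalar(z, q=1e-3, r=0.25):
    """Scalar random-walk Kalman filter; state = effective wind speed (m/s)."""
    x, p = z[0], 1.0                 # initialize from the first measurement
    out = []
    for zk in z:
        p = p + q                    # predict step (random-walk process noise)
        k = p / (p + r)              # Kalman gain
        x = x + k * (zk - x)         # update with the noisy measurement
        p = (1.0 - k) * p
        out.append(x)
    return np.array(out)

rng = np.random.default_rng(1)
true_wind = 8.0 + 0.5 * np.sin(np.linspace(0.0, 4.0 * np.pi, 400))
meas = true_wind + rng.normal(0.0, 0.5, size=400)   # anemometer-like noise
est = kalman_scalar(meas)
print(np.mean(np.abs(est - true_wind)), np.mean(np.abs(meas - true_wind)))
```

The filtered estimate tracks the slowly varying wind with a substantially smaller mean error than the raw measurements, which is what makes the estimate usable for the torque and pitch control loops.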

  19. Gaussian process based intelligent sampling for measuring nano-structure surfaces

    Science.gov (United States)

    Sun, L. J.; Ren, M. J.; Yin, Y. H.

    2016-09-01

    Nanotechnology is the science and engineering of manipulating matter at the nano scale, which can be used to create many new materials and devices with a vast range of applications. As nanotechnology products increasingly enter the commercial marketplace, nanometrology becomes a stringent and enabling technology for the manipulation and quality control of nanotechnology. However, many measuring instruments, for instance scanning probe microscopes, are limited to relatively small areas of hundreds of micrometers, with very low efficiency. Therefore intelligent sampling strategies are required to improve the scanning efficiency when measuring large areas. This paper presents a Gaussian process based intelligent sampling method to address this problem. The method uses Gaussian process based Bayesian regression as a mathematical foundation to represent the surface geometry, and the posterior estimation of the Gaussian process is computed by combining the prior probability distribution with the maximum likelihood function. Each sampling point is then adaptively selected by determining the position that is most likely to lie outside the required tolerance zone among the candidates, and inserted to update the model iteratively. Simulations on both nominal and manufactured nano-structure surfaces have been conducted to verify the validity of the proposed method. The results imply that the proposed method significantly improves the measurement efficiency in measuring large-area structured surfaces.
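A minimal 1-D sketch of the adaptive sampling loop described above, assuming an RBF covariance and a unit prior variance. The paper's criterion selects the point most likely to violate the tolerance zone; this sketch uses the closely related maximum-posterior-variance criterion, and all numerical values are illustrative.

```python
import numpy as np

def rbf(a, b, ell=0.2):
    """Squared-exponential covariance between 1-D point sets a and b."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

def posterior_var(xs, cand, noise=1e-6):
    """GP posterior variance at candidate points, given sampled positions xs."""
    K = rbf(xs, xs) + noise * np.eye(len(xs))
    Ks = rbf(cand, xs)                       # cross-covariance (n_cand, n_s)
    alpha = np.linalg.solve(K, Ks.T)         # K^{-1} Ks^T
    return 1.0 - np.einsum("ij,ji->i", Ks, alpha)

cand = np.linspace(0.0, 1.0, 200)            # candidate probe positions
xs = np.array([0.0, 1.0])                    # initial samples at the edges
for _ in range(10):                          # adaptively insert 10 more points
    v = posterior_var(xs, cand)
    xs = np.append(xs, cand[np.argmax(v)])   # sample where the GP is least certain
print(len(xs))  # 12 sampled positions
```

Each iteration refits the covariance structure and probes where the model is most uncertain, so the sampled positions spread out over the surface instead of being placed on a dense uniform grid, which is where the efficiency gain comes from.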

  20. Feature-based tolerancing for intelligent inspection process definition

    International Nuclear Information System (INIS)

    Brown, C.W.

    1993-07-01

    This paper describes a feature-based tolerancing capability that complements a geometric solid model with an explicit representation of conventional and geometric tolerances. This capability is focused on supporting an intelligent inspection process definition system. The feature-based tolerance model's benefits include advancing complete product definition initiatives (e.g., STEP -- Standard for the Exchange of Product model data), supplying computer-integrated manufacturing applications (e.g., generative process planning and automated part programming) with product definition information, and assisting in the solution of measurement performance issues. A feature-based tolerance information model was developed based upon the notion of a feature's toleranceable aspects, and describes an object-oriented scheme for representing and relating tolerance features, tolerances, and datum reference frames. For easy incorporation, the tolerance feature entities are interconnected with STEP solid model entities. This schema will explicitly represent the tolerance specification for mechanical products, support advanced dimensional measurement applications, and assist in tolerance-related method divergence issues

  1. Large scale processing of dielectric electroactive polymers

    DEFF Research Database (Denmark)

    Vudayagiri, Sindhu

    Efficient processing techniques are vital to the success of any manufacturing industry. The processing techniques determine the quality of the products and thus to a large extent the performance and reliability of the products that are manufactured. The dielectric electroactive polymer (DEAP...

  2. Front-End Intelligence for Large-Scale Application-Oriented Internet-of-Things

    KAUST Repository

    Bader, Ahmed; Ghazzai, Hakim; Kadri, Abdullah; Alouini, Mohamed-Slim

    2016-01-01

    The Internet-of-things (IoT) refers to the massive integration of electronic devices, vehicles, buildings, and other objects to collect and exchange data. It is the enabling technology for a plethora of applications touching various aspects of our lives such as healthcare, wearables, surveillance, home automation, smart manufacturing, and intelligent automotive systems. Existing IoT architectures are highly centralized and heavily rely on a back-end core network for all decision-making processes. This may lead to inefficiencies in terms of latency, network traffic management, computational processing, and power consumption. In this paper, we advocate the empowerment of front-end IoT devices to support the back-end network in fulfilling end-user applications requirements mainly by means of improved connectivity and efficient network management. A novel conceptual framework is presented for a new generation of IoT devices that will enable multiple new features for both the IoT administrators as well as end users. Exploiting the recent emergence of software-defined architecture, these smart IoT devices will allow fast, reliable, and intelligent management of diverse IoT-based applications. After highlighting relevant shortcomings of the existing IoT architectures, we outline some key design perspectives to enable front-end intelligence while shedding light on promising future research directions.

  4. Information Processing and Coaching Treatments in an Intelligent Tutoring System

    National Research Council Canada - National Science Library

    Dillon, Ronna

    1997-01-01

    The purpose of this effort was to develop an intelligent tutoring system (ITS) to train test administrators how to operate computerized adaptive testing Armed Services Vocational Aptitude Battery (CAT-ASVAB...

  5. Artificial intelligence in NMR imaging and image processing

    International Nuclear Information System (INIS)

    Kuhn, M.H.

    1988-01-01

    NMR tomography offers a wealth of information and data acquisition variants. Artificial intelligence is able to efficiently support the selection of measuring parameters and the evaluation of results. (orig.) [de]

  6. Intelligent technologies in process of highly-precise products manufacturing

    Science.gov (United States)

    Vakhidova, K. L.; Khakimov, Z. L.; Isaeva, M. R.; Shukhin, V. V.; Labazanov, M. A.; Ignatiev, S. A.

    2017-10-01

    One of the main control methods for the surface layer of bearing parts is eddy current testing. Surface layer defects of bearing parts, such as burns and cracks, are reflected in the results of the rolling surface scan. The previously developed method for detecting defects from the image of the raceway was quite effective, but its processing algorithm is complicated and takes about 12-16 s. The real non-stationary signals from an eddy current transducer (ECT) consist of short-time high-frequency and long-time low-frequency components; therefore, a transformation that provides different windows for different frequencies is used for their analysis. The wavelet transform meets these conditions. On this basis, a methodology for automatically detecting and recognizing local defects in the surface layer of bearing parts has been developed using wavelet analysis with integral estimates. Some of the defects are recognized by the amplitude component; otherwise, an automatic transition to recognition by the phase component of the information signals (IS) is carried out. The use of intelligent technologies in the manufacture of bearing parts will, firstly, significantly improve the quality of bearings and, secondly, significantly improve production efficiency by reducing (or eliminating) rejections in the manufacture of products, increasing the period of normal operation of the technological equipment (the inter-adjustment period), implementing a system of flexible facilities maintenance, and reducing production costs.
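The wavelet-based defect-detection idea can be sketched with a first-level Haar detail transform and a robust amplitude threshold. The signal shape, the simulated defect, and the 5-sigma threshold below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def haar_detail(x):
    """First-level Haar wavelet detail coefficients of a 1-D signal."""
    x = x[: len(x) // 2 * 2]                 # truncate to even length
    return (x[0::2] - x[1::2]) / np.sqrt(2.0)

rng = np.random.default_rng(2)
scan = np.sin(np.linspace(0.0, 2 * np.pi, 512)) + 0.01 * rng.normal(size=512)
scan[301:305] += 1.0                         # simulated local "burn" defect

d = haar_detail(scan)
sigma = np.median(np.abs(d)) / 0.6745        # robust noise-level estimate (MAD)
flags = np.where(np.abs(d) > 5.0 * sigma)[0] * 2   # map back to sample indices
print(flags)
```

The slow low-frequency component of the scan barely excites the detail coefficients, while the short defect produces large isolated coefficients at its edges, so thresholding the detail band localizes the defect; this corresponds to amplitude-component recognition in the abstract.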

  7. Detection, information fusion, and temporal processing for intelligence in recognition

    Energy Technology Data Exchange (ETDEWEB)

    Casasent, D. [Carnegie Mellon Univ., Pittsburgh, PA (United States)

    1996-12-31

    The use of intelligence in vision recognition draws on many different techniques or tools. This presentation discusses several of these techniques for recognition. The recognition process is generally separated into several steps or stages when implemented in hardware, e.g. detection, segmentation and enhancement, and recognition. Several new distortion-invariant filters, biologically-inspired Gabor wavelet filter techniques, and morphological operations that have been found very useful for detection and clutter rejection are discussed. These are all shift-invariant operations that allow multiple object regions of interest in a scene to be located in parallel. We also discuss new algorithm fusion concepts by which the results from different detection algorithms are combined to reduce detection false alarms; these fusion methods utilize hierarchical processing and fuzzy logic concepts. We have found this to be most necessary, since no single detection algorithm is best for all cases. For the final recognition stage, we describe a new method of representing all distorted versions of different classes of objects and determining the object class and pose that most closely match those of a given input. Besides being efficient in terms of storage and on-line computations required, it overcomes many of the problems that other classifiers have in terms of the required training set size, poor generalization with many hidden-layer neurons, etc. It is also attractive in its ability to reject input regions as clutter (non-objects) and to learn new object descriptions. We also discuss its use in processing a temporal sequence of input images of the contents of each local region of interest. We note how this leads to robust results in which estimation errors in individual frames can be overcome. This seems very practical, since in many scenarios a decision need not be made after only one frame of data, because subsequent frames of data arrive immediately in sequence.

  8. A MURI Center for Intelligent Biomimetic Image Processing and Classification

    Science.gov (United States)

    2007-11-01

    ... labeled "plage" or "open space" or "natural", the system learns to associate multiple classes with a given input. Testbed image examples have shown ... brain color perception and category learning. Commentary on "Coordinating perceptually grounded categories through language" by Luc Steels and Tony ... Symposium on Computational Intelligence (ISCI), Kosice, Slovakia, June 2002. 9. Carpenter, G.A., Award from the Slovak Artificial Intelligence Society, 2002

  9. Racial Equality in Intelligence: Predictions from a Theory of Intelligence as Processing

    Science.gov (United States)

    Fagan, Joseph F.; Holland, Cynthia R.

    2007-01-01

    African-Americans and Whites were asked to solve problems typical of those administered on standard tests of intelligence. Half of the problems were solvable on the basis of information generally available to either race and/or on the basis of information newly learned. Such knowledge did not vary with race. Other problems were only solvable on…

  10. Effect of promoting self-esteem by participatory learning process on emotional intelligence among early adolescents.

    Science.gov (United States)

    Munsawaengsub, Chokchai; Yimklib, Somkid; Nanthamongkolchai, Sutham; Apinanthavech, Suporn

    2009-12-01

    To study the effect of a self-esteem-promoting participatory learning program on emotional intelligence among early adolescents, a quasi-experimental study was conducted in grade 9 students from two schools in Bangbuathong district, Nonthaburi province. The experimental and comparative groups each consisted of 34 students with the lowest scores of emotional intelligence. The instruments were questionnaires, the Program to Develop Emotional Intelligence, and the Handbook of Emotional Intelligence Development. The experimental group attended 8 participatory learning activities over 4 weeks to develop emotional intelligence, while the comparative group received the handbook for self-study. The effectiveness of the program was assessed by pre-test and post-test, immediately and 4 weeks later, concerning emotional intelligence. Implementation and evaluation were done during May 24-August 12, 2005. Data were analyzed by frequency, percentage, mean, standard deviation, Chi-square, independent-sample t-test and paired-sample t-test. Before program implementation, the two groups showed no statistical difference in mean score of emotional intelligence. After the intervention, the experimental group had a higher mean score of emotional intelligence, both immediately and 4 weeks later, with statistical significance (p = 0.001). Promoting self-esteem by a participatory learning process could thus enhance emotional intelligence in early adolescents. This program could be modified and implemented for early adolescents in the community.

  11. Intelligent workflow driven processing for electronic mail management

    African Journals Online (AJOL)

    Email has been one of the most efficient means of electronic communication for many years, and email management has become a critical issue due to congestion. Clients and individuals encounter problems while processing their emails due to the large volume of email received and the many requests that must be replied to.

  12. Storage process of large solid radioactive wastes

    International Nuclear Information System (INIS)

    Morin, Bruno; Thiery, Daniel.

    1976-01-01

    A process for the storage of large-size solid radioactive waste, consisting of contaminated objects such as cartridge filters, metal swarf, tools, etc., whereby such waste is incorporated in a thermosetting resin at room temperature, after prior addition of at least one inert filler to the resin. Cross-linking of the resin is then brought about [fr

  13. Process monitoring for intelligent manufacturing processes - Methodology and application to Robot Assisted Polishing

    DEFF Research Database (Denmark)

    Pilny, Lukas

    Process monitoring provides important information on the product, process and manufacturing system during part manufacturing. Such information can be used for process optimization and detection of undesired processing conditions to initiate timely actions for avoidance of defects, thereby improving...... quality assurance. This thesis is aimed at a systematic development of process monitoring solutions, constituting a key element of intelligent manufacturing systems towards zero defect manufacturing. A methodological approach of general applicability is presented in this concern.The approach consists...... of six consecutive steps for identification of product Vital Quality Characteristics (VQCs) and Key Process Variables (KPVs), selection and characterization of sensors, optimization of sensors placement, validation of the monitoring solutions, definition of the reference manufacturing performance...

  14. Intelligent systems/software engineering methodology - A process to manage cost and risk

    Science.gov (United States)

    Friedlander, Carl; Lehrer, Nancy

    1991-01-01

    A systems development methodology is discussed that has been successfully applied to the construction of a number of intelligent systems. This methodology is a refinement of both evolutionary and spiral development methodologies. It is appropriate for development of intelligent systems. The application of advanced engineering methodology to the development of software products and intelligent systems is an important step toward supporting the transition of AI technology into aerospace applications. A description of the methodology and the process model from which it derives is given. Associated documents and tools are described which are used to manage the development process and record and report the emerging design.

  15. Decision Support for Software Process Management Teams: An Intelligent Software Agent Approach

    National Research Council Canada - National Science Library

    Church, Lori

    2000-01-01

    ... to market, eliminate redundancy, and ease job stress. This thesis proposes a conceptual model for software process management decision support in the form of an intelligent software agent network...

  16. Integrated Intelligent Industrial Process Sensing and Control: Applied to and Demonstrated on Cupola Furnaces

    Energy Technology Data Exchange (ETDEWEB)

    Mohamed Abdelrahman; roger Haggard; Wagdy Mahmoud; Kevin Moore; Denis Clark; Eric Larsen; Paul King

    2003-02-12

    The final goal of this project was the development of a system capable of controlling an industrial process effectively through the integration of information obtained through intelligent sensor fusion and intelligent control technologies. The industry of interest in this project was the metal casting industry as represented by cupola iron-melting furnaces; however, the developed technology is generic and hence applicable to several other industries. The system was divided into the following four major interacting components: 1. An object-oriented generic architecture to integrate the developed software and hardware components; 2. Generic algorithms for intelligent signal analysis and sensor and model fusion; 3. A supervisory structure for integration of intelligent sensor fusion data into the controller; 4. Hardware implementation of intelligent signal analysis and fusion algorithms.
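The sensor-and-model fusion component can be illustrated by the classic inverse-variance weighted fusion of independent estimates, a minimal stand-in for the project's own algorithms. The sensor readings and variances below are invented purely for illustration.

```python
import numpy as np

def fuse(estimates, variances):
    """Inverse-variance weighted fusion of independent sensor estimates."""
    w = 1.0 / np.asarray(variances, dtype=float)   # weights = 1/variance
    fused = np.sum(w * np.asarray(estimates)) / np.sum(w)
    fused_var = 1.0 / np.sum(w)                    # always below the best sensor
    return fused, fused_var

# Two hypothetical cupola sensors reading melt temperature (deg C):
temp, var = fuse([1510.0, 1522.0], [16.0, 64.0])
print(round(temp, 1), round(var, 1))  # -> 1512.4 12.8
```

The fused estimate is pulled toward the more trustworthy (lower-variance) sensor, and the fused variance is smaller than either input variance, which is the basic payoff of combining sensors and models instead of trusting one source.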

  17. SmartWeld/SmartProcess - intelligent model based system for the design and validation of welding processes

    Energy Technology Data Exchange (ETDEWEB)

    Mitchner, J.

    1996-04-01

    Diagrams are presented on an intelligent model-based system for the design and validation of welding processes. Key capabilities identified include "right the first time" manufacturing, continuous improvement, and on-line quality assurance.

  18. The Relationship between Multiple Intelligences with Preferred Science Teaching and Science Process Skills

    OpenAIRE

    Mohd Ali Samsudin; Noor Hasyimah Haniza; Corrienna Abdul-Talib; Hayani Marlia Mhd Ibrahim

    2015-01-01

    This study was undertaken to identify the relationship between multiple intelligences and preferred science teaching and science process skills. The design of the study is a survey using three questionnaires reported in the literature: the Multiple Intelligences Questionnaire, the Preferred Science Teaching Questionnaire and the Science Process Skills Questionnaire. The study selected 300 primary school students from five (5) primary schools in Penang, Malaysia. The findings showed a relationship between...

  19. The application of neural networks with artificial intelligence technique in the modeling of industrial processes

    International Nuclear Information System (INIS)

    Saini, K. K.; Saini, Sanju

    2008-01-01

    Neural networks are a relatively new artificial intelligence technique that emulates the behavior of biological neural systems in digital software or hardware. These networks can automatically 'learn' complex relationships among data. This feature makes the technique very useful in modeling processes for which mathematical modeling is difficult or impossible. The work described here outlines some examples of the application of neural networks as an artificial intelligence technique in the modeling of industrial processes.

  20. Financial intelligence of business process outsourcing professional in Davao City Philippines

    Directory of Open Access Journals (Sweden)

    Samantha Ferraren

    2016-12-01

    Full Text Available This research determined financial intelligence using the Kiyosaki Cashflow Quadrant, which classifies employees as likely to be investors, big business owners, self-employed, or employees. Ordinal regression was employed to determine the parameters of the chosen variables through Maximum Likelihood Estimation (MLE). The results showed that income is a significant factor in the financial intelligence of Business Process Outsourcing (BPO) employees: the higher the income, the better the financial intelligence. The type of BPO employer was also significantly related to financial intelligence: BPO employees in financial services had higher financial intelligence than those in non-financial services. There is a significant relationship between financial literacy and financial intelligence, although some rank-and-file employees who earned less may still be classified as investors because of their behavior towards money. A financial wellness program for BPO employees in financial and non-financial services alike was recommended to improve financial intelligence and help them achieve financial freedom.

  1. Artificial Intelligence in ADA: Pattern-Directed Processing. Final Report.

    Science.gov (United States)

    Reeker, Larry H.; And Others

    To demonstrate to computer programmers that the programming language Ada provides superior facilities for use in artificial intelligence applications, the three papers included in this report investigate the capabilities that exist within Ada for "pattern-directed" programming. The first paper (Larry H. Reeker, Tulane University) is…

  2. Etoile Project : Social Intelligent ICT-System for very large scale education in complex systems

    Science.gov (United States)

    Bourgine, P.; Johnson, J.

    2009-04-01

    The project will devise new theory and implement new ICT-based methods of delivering high-quality low-cost postgraduate education to many thousands of people in a scalable way, with the cost of each extra student being negligible. Its components include a Socially Intelligent Resource Mining system to gather large volumes of high quality educational resources from the internet; new methods to deconstruct these to produce a semantically tagged Learning Object Database; a Living Course Ecology to support the creation and maintenance of evolving course materials; systems to deliver courses; and a 'socially intelligent assessment system'. The system will be tested on one to ten thousand postgraduate students in Europe working towards the Complex System Society's title of European PhD in Complex Systems. Étoile will have a very high impact both scientifically and socially by (i) the provision of new scalable ICT-based methods for providing very low cost scientific education, (ii) the creation of new mathematical and statistical theory for the multiscale dynamics of complex systems, (iii) the provision of a working example of adaptation and emergence in complex socio-technical systems, and (iv) making a major educational contribution to European complex systems science and its applications.

  3. Large Scale Data Mining to Improve Usability of Data: An Intelligent Archive Testbed

    Science.gov (United States)

    Ramapriyan, Hampapuram; Isaac, David; Yang, Wenli; Morse, Steve

    2005-01-01

    Research in certain scientific disciplines - including Earth science, particle physics, and astrophysics - continually faces the challenge that the volume of data needed to perform valid scientific research can at times overwhelm even a sizable research community. The desire to improve utilization of this data gave rise to the Intelligent Archives project, which seeks to make data archives active participants in a knowledge building system capable of discovering events or patterns that represent new information or knowledge. Data mining can automatically discover patterns and events, but it is generally viewed as unsuited for large-scale use in disciplines like Earth science that routinely involve very high data volumes. Dozens of research projects have shown promising uses of data mining in Earth science, but all of these are based on experiments with data subsets of a few gigabytes or less, rather than the terabytes or petabytes typically encountered in operational systems. To bridge this gap, the Intelligent Archives project is establishing a testbed with the goal of demonstrating the use of data mining techniques in an operationally-relevant environment. This paper discusses the goals of the testbed and the design choices surrounding critical issues that arose during testbed implementation.

  4. Is general intelligence little more than the speed of higher-order processing?

    Science.gov (United States)

    Schubert, Anna-Lena; Hagemann, Dirk; Frischkorn, Gidon T

    2017-10-01

    Individual differences in the speed of information processing have been hypothesized to give rise to individual differences in general intelligence. Consistent with this hypothesis, reaction times (RTs) and latencies of event-related potential have been shown to be moderately associated with intelligence. These associations have been explained either in terms of individual differences in some brain-wide property such as myelination, the speed of neural oscillations, or white-matter tract integrity, or in terms of individual differences in specific processes such as the signal-to-noise ratio in evidence accumulation, executive control, or the cholinergic system. Here we show in a sample of 122 participants, who completed a battery of RT tasks at 2 laboratory sessions while an EEG was recorded, that more intelligent individuals have a higher speed of higher-order information processing that explains about 80% of the variance in general intelligence. Our results do not support the notion that individuals with higher levels of general intelligence show advantages in some brain-wide property. Instead, they suggest that more intelligent individuals benefit from a more efficient transmission of information from frontal attention and working memory processes to temporal-parietal processes of memory storage. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  5. Original article Functioning of memory and attention processes in children with intelligence below average

    Directory of Open Access Journals (Sweden)

    Aneta Rita Borkowska

    2014-05-01

    BACKGROUND The aim of the research was to assess the memorization and recall of logically connected and unconnected material, coded graphically and linguistically, and the ability to focus attention, in a group of children with intelligence below average compared to children with average intelligence. PARTICIPANTS AND PROCEDURE The study group included 27 children with intelligence below average. The control group consisted of 29 individuals. All of them were examined using the authors' experimental trials and the TUS test (Attention and Perceptiveness Test). RESULTS Compared to children with average intelligence, children with intelligence below average memorized significantly less information contained in the logical material, demonstrated a lower ability to memorize the visual material, memorized significantly fewer words in the verbal material learning task, achieved lower results in such indicators of the visual attention process as the number of omissions and mistakes, and had a lower pace of perceptual work. CONCLUSIONS The results confirm that children with intelligence below average have difficulties with memorizing new material, both logically connected and unconnected. The significantly lower capacity of direct memory is independent of modality. The results of the study on the memory process confirm the hypothesis of lower abilities of children with intelligence below average in terms of concentration, work pace, efficiency and perception.

  6. The role of across-frequency envelope processing for speech intelligibility

    DEFF Research Database (Denmark)

    Chabot-Leclerc, Alexandre; Jørgensen, Søren; Dau, Torsten

    2013-01-01

    Speech intelligibility models consist of a preprocessing part that transforms the stimuli into some internal (auditory) representation, and a decision metric that quantifies the effects of the transmission channel, speech interferers, and auditory processing on speech intelligibility. Here, two recent speech intelligibility models, the spectro-temporal modulation index [STMI; Elhilali et al. (2003)] and the speech-based envelope power spectrum model [sEPSM; Jørgensen and Dau (2011)], were evaluated in conditions of noisy speech subjected to reverberation and to nonlinear distortions through either...

  8. Suppressive mechanisms in visual motion processing: From perception to intelligence.

    Science.gov (United States)

    Tadin, Duje

    2015-10-01

    Perception operates on an immense amount of incoming information that greatly exceeds the brain's processing capacity. Because of this fundamental limitation, the ability to suppress irrelevant information is a key determinant of perceptual efficiency. Here, I review a series of studies investigating suppressive mechanisms in visual motion processing, namely perceptual suppression of large, background-like motions. These spatial suppression mechanisms are adaptive, operating only when sensory inputs are sufficiently robust to guarantee visibility. Converging correlational and causal evidence links these behavioral results with inhibitory center-surround mechanisms, namely those in cortical area MT. Spatial suppression is abnormally weak in several special populations, including the elderly and individuals with schizophrenia, a deficit that is evidenced by better-than-normal direction discriminations of large moving stimuli. Theoretical work shows that this abnormal weakening of spatial suppression should result in motion segregation deficits, but direct behavioral support of this hypothesis is lacking. Finally, I argue that the ability to suppress information is a fundamental neural process that applies not only to perception but also to cognition in general. Supporting this argument, I discuss recent research showing that individual differences in spatial suppression of motion signals strongly predict individual variations in IQ scores.

  9. Mothers' daily person and process praise: implications for children's theory of intelligence and motivation.

    Science.gov (United States)

    Pomerantz, Eva M; Kempner, Sara G

    2013-11-01

    This research examined if mothers' day-to-day praise of children's success in school plays a role in children's theory of intelligence and motivation. Participants were 120 children (mean age = 10.23 years) and their mothers who took part in a 2-wave study spanning 6 months. During the first wave, mothers completed a 10-day daily interview in which they reported on their use of person (e.g., "You are smart") and process (e.g., "You tried hard") praise. Children's entity theory of intelligence and preference for challenge in school were assessed with surveys at both waves. Mothers' person, but not process, praise was predictive of children's theory of intelligence and motivation: The more person praise mothers used, the more children subsequently held an entity theory of intelligence and avoided challenge over and above their earlier functioning on these dimensions.

  10. Beyond Massive MIMO: The Potential of Positioning With Large Intelligent Surfaces

    Science.gov (United States)

    Hu, Sha; Rusek, Fredrik; Edfors, Ove

    2018-04-01

    We consider the potential for positioning with a system where antenna arrays are deployed as a large intelligent surface (LIS), a newly proposed concept beyond massive MIMO in which future man-made structures are electronically active, with integrated electronics and wireless communication making the entire environment "intelligent". In a first step, we derive the Fisher information and Cramér-Rao lower bounds (CRLBs) in closed form for positioning a terminal located perpendicular to the center of the LIS, whose location we refer to as being on the central perpendicular line (CPL) of the LIS. For a terminal that is not on the CPL, closed-form expressions of the Fisher information and CRLB seem out of reach, and we instead find approximations of them which are shown to be accurate. Under mild conditions, we show that the CRLB for all three Cartesian dimensions (x, y and z) decreases quadratically in the surface area of the LIS, except for a terminal exactly on the CPL, where the CRLB for the z-dimension (distance from the LIS) decreases linearly in the same. In a second step, we analyze the CRLB for positioning when there is an unknown phase φ present in the analog circuits of the LIS. We then show that the CRLBs are dramatically increased for all three dimensions but decrease in the third order of the surface area. Moreover, with an infinitely large LIS the CRLB for the z-dimension with an unknown φ is 6 dB higher than in the case without phase uncertainty, and the CRLB for estimating φ converges to a constant that is independent of the wavelength λ. Finally, we extensively discuss the impact of centralized and distributed deployments of the LIS, and show that a distributed deployment can enlarge the coverage for terminal positioning and improve the overall positioning performance.
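The surface-area scaling laws stated in this abstract can be illustrated with a toy numerical sketch. This is not the paper's derivation; the proportionality constants are arbitrary placeholders, and only the stated scaling exponents are taken from the abstract:

```python
# Toy illustration of the CRLB scaling laws stated in the abstract:
# the CRLB decreases quadratically in the LIS surface area A for all
# three Cartesian dimensions, except the z-dimension for a terminal
# exactly on the CPL, where it decreases only linearly in A.
# The constant c is an arbitrary placeholder.

def crlb_off_cpl(area, c=1.0):
    """CRLB for any Cartesian dimension, terminal off the CPL: ~ c / A^2."""
    return c / area**2

def crlb_z_on_cpl(area, c=1.0):
    """CRLB for the z-dimension, terminal exactly on the CPL: ~ c / A."""
    return c / area

# Doubling the surface area cuts the off-CPL bound by a factor of 4
# (6 dB), but the on-CPL z bound by only a factor of 2 (3 dB).
ratio_off = crlb_off_cpl(1.0) / crlb_off_cpl(2.0)
ratio_on = crlb_z_on_cpl(1.0) / crlb_z_on_cpl(2.0)
print(ratio_off, ratio_on)  # → 4.0 2.0
```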

  11. Beyond Massive MIMO: The Potential of Data Transmission With Large Intelligent Surfaces

    Science.gov (United States)

    Hu, Sha; Rusek, Fredrik; Edfors, Ove

    2018-05-01

    In this paper, we consider the potential of data transmission in a system with a massive number of radiating and sensing elements, thought of as a contiguous surface of electromagnetically active material. We refer to this as a large intelligent surface (LIS). The LIS is a newly proposed concept that goes beyond contemporary massive MIMO technology and arises from our vision of a future where man-made structures are electronically active, with integrated electronics and wireless communication making the entire environment "intelligent". We consider the capacities of single-antenna autonomous terminals communicating with the LIS, where the entire surface is used as a receiving antenna array. Under the condition that the surface area is sufficiently large, the received signal after a matched-filtering (MF) operation can be closely approximated by a sinc-function-like intersymbol interference (ISI) channel. We analyze the capacity Ĉ per square meter (m²) of deployed surface that is achievable for a fixed transmit power P̂ per volume unit. Moreover, we show that the number of independent signal dimensions is 2/λ per meter of deployed surface for one-dimensional terminal deployments, and π/λ² per m² for two- and three-dimensional terminal deployments. Lastly, we consider implementations of the LIS in the form of a grid of conventional antenna elements and show that the sampling lattice that minimizes the surface area of the LIS while simultaneously obtaining one signal-space dimension for every spent antenna is the hexagonal lattice. We extensively discuss the design of a state-of-the-art low-complexity channel shortening (CS) demodulator for data transmission with the LIS.
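The signal-dimension counts quoted in this abstract (2/λ per meter for one-dimensional deployments, π/λ² per m² for two- and three-dimensional deployments) can be evaluated numerically. A sketch, where the 3 GHz carrier frequency is an illustrative choice and not from the abstract:

```python
import math

# Independent signal dimensions for a LIS, per the abstract:
#   1-D terminal deployment: 2/lambda per meter of surface
#   2-D/3-D deployment:      pi/lambda^2 per square meter of surface
c = 3e8        # speed of light, m/s
f = 3e9        # 3 GHz carrier (assumed for illustration)
lam = c / f    # wavelength = 0.1 m

dof_per_m = 2 / lam               # 20 dimensions per meter
dof_per_m2 = math.pi / lam**2     # ~314 dimensions per square meter
print(round(dof_per_m), round(dof_per_m2))  # → 20 314
```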

  12. The Very Large Array Data Processing Pipeline

    Science.gov (United States)

    Kent, Brian R.; Masters, Joseph S.; Chandler, Claire J.; Davis, Lindsey E.; Kern, Jeffrey S.; Ott, Juergen; Schinzel, Frank K.; Medlin, Drew; Muders, Dirk; Williams, Stewart; Geers, Vincent C.; Momjian, Emmanuel; Butler, Bryan J.; Nakazato, Takeshi; Sugimoto, Kanako

    2018-01-01

    We present the VLA Pipeline, software that is part of the larger pipeline processing framework used for the Karl G. Jansky Very Large Array (VLA) and the Atacama Large Millimeter/sub-millimeter Array (ALMA) for both interferometric and single-dish observations. Through a collection of base code jointly used by the VLA and ALMA, the pipeline builds a hierarchy of classes to execute individual atomic pipeline tasks within the Common Astronomy Software Applications (CASA) package. Each pipeline task contains heuristics designed by the team to actively decide the best processing path and execution parameters for calibration and imaging. The pipeline code is developed and written in Python and uses a "context" structure for tracking the heuristic decisions and processing results. The pipeline "weblog" acts as the user interface for verifying the quality assurance of each calibration and imaging stage. The majority of VLA scheduling blocks above 1 GHz are now processed with the standard continuum recipe of the pipeline and offer a calibrated measurement set as a basic data product to observatory users. In addition, the pipeline is used for processing data from the VLA Sky Survey (VLASS), a seven-year community-driven endeavor started in September 2017 to survey the entire sky down to a declination of -40 degrees at S-band (2-4 GHz). This 5500-hour next-generation large radio survey will explore the time and spectral domains, relying on pipeline processing to generate calibrated measurement sets, polarimetry, and imaging data products that are available to the astronomical community with no proprietary period. Here we present an overview of the pipeline design philosophy, heuristics, and the calibration and imaging results produced by the pipeline. Future development will include the testing of spectral-line recipes and low signal-to-noise heuristics, and serving as a testing platform for science-ready data products. The pipeline is developed as part of the CASA software package.
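The pattern the abstract describes, atomic pipeline tasks sharing a "context" structure that accumulates heuristic decisions and results, can be sketched as follows. This is an illustrative sketch only, not the actual CASA/VLA pipeline code; all class and stage names are invented:

```python
# Sketch of a task pipeline sharing a "context" object: each atomic
# task reads earlier results from the context and records its own.
class Context:
    def __init__(self):
        self.results = {}

class Task:
    name = "task"
    def run(self, context):
        raise NotImplementedError

class Calibrate(Task):
    name = "calibrate"
    def run(self, context):
        context.results[self.name] = "calibration tables"

class Image(Task):
    name = "image"
    def run(self, context):
        # a heuristic could inspect earlier stage results here
        assert "calibrate" in context.results
        context.results[self.name] = "cleaned image"

def run_pipeline(tasks):
    ctx = Context()
    for task in tasks:
        task.run(ctx)
    return ctx

ctx = run_pipeline([Calibrate(), Image()])
print(sorted(ctx.results))  # → ['calibrate', 'image']
```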

  13. Intelligent methods for the process parameter determination of plastic injection molding

    Science.gov (United States)

    Gao, Huang; Zhang, Yun; Zhou, Xundao; Li, Dequn

    2018-03-01

    Injection molding is one of the most widely used material processing methods for producing plastic products with complex geometries and high precision. The determination of process parameters is important in obtaining qualified products and maintaining product quality. This article reviews recent studies and developments of the intelligent methods applied to the process parameter determination of injection molding. These intelligent methods are classified into three categories: case-based reasoning methods, expert system-based methods, and data fitting and optimization methods. A framework of process parameter determination is proposed after comprehensive discussion. Finally, the conclusions and future research topics are discussed.

  14. Cognitive and emotional demands of black humour processing: the role of intelligence, aggressiveness and mood.

    Science.gov (United States)

    Willinger, Ulrike; Hergovich, Andreas; Schmoeger, Michaela; Deckert, Matthias; Stoettner, Susanne; Bunda, Iris; Witting, Andrea; Seidler, Melanie; Moser, Reinhilde; Kacena, Stefanie; Jaeckle, David; Loader, Benjamin; Mueller, Christian; Auff, Eduard

    2017-05-01

    Humour processing is a complex information-processing task that is dependent on cognitive and emotional aspects which presumably influence frame-shifting and conceptual blending, mental operations that underlie humour processing. The aim of the current study was to find distinctive groups of subjects with respect to black humour processing, intellectual capacities, mood disturbance and aggressiveness. A total of 156 adults rated black humour cartoons and completed measures of verbal and nonverbal intelligence, mood disturbance and aggressiveness. Cluster analysis yields three groups with the following properties: (1) moderate black humour preference and moderate comprehension; average nonverbal and verbal intelligence; low mood disturbance and moderate aggressiveness; (2) low black humour preference and moderate comprehension; average nonverbal and verbal intelligence, high mood disturbance and high aggressiveness; and (3) high black humour preference and high comprehension; high nonverbal and verbal intelligence; no mood disturbance and low aggressiveness. Age and gender do not differ significantly between the groups, but differences in education level can be found. Black humour preference and comprehension are positively associated with higher verbal and nonverbal intelligence as well as higher levels of education. Emotional instability and higher aggressiveness apparently lead to decreased levels of pleasure when dealing with black humour. These results support the hypothesis that humour processing involves cognitive as well as affective components and suggest that these variables influence the execution of frame-shifting and conceptual blending in the course of humour processing.

  15. Relationships among processing speed, working memory, and fluid intelligence in children.

    Science.gov (United States)

    Fry, A F; Hale, S

    2000-10-01

    The present review focuses on three issues: (a) the time course of developmental increases in cognitive abilities; (b) the impact of age on individual differences in these abilities; and (c) the mechanisms by which developmental increases in different aspects of cognition affect each other. We conclude from our review of the literature that the development of processing speed, working memory, and fluid intelligence all follow a similar time course, suggesting that all three abilities develop in concert. Furthermore, the strength of the correlation between speed and intelligence does not appear to change with age, and most of the effect of the age-related increase in speed on intelligence appears to be mediated through the effect of speed on working memory. Finally, most of the effect of the age-related improvement in working memory on intelligence is itself attributable to the effect of the increase in speed on working memory, providing evidence of a cognitive developmental cascade.

  16. A New Tool for Intelligent Parallel Processing of Radar/SAR Remotely Sensed Imagery

    Directory of Open Access Journals (Sweden)

    A. Castillo Atoche

    2013-01-01

    A novel parallel tool for large-scale image enhancement/reconstruction and postprocessing for radar/SAR sensor systems is addressed. The proposed parallel tool performs the following intelligent processing steps: image formation, applying the system-level image degradation effects of a particular remote sensing (RS) system and simulating random noise effects; enhancement/reconstruction, employing nonparametric robust high-resolution techniques; and image postprocessing, using the fuzzy anisotropic diffusion technique, which incorporates a better edge-preserving noise removal effect and a faster diffusion process. This innovative tool allows the processing of high-resolution images provided by different radar/SAR sensor systems, as required by RS end-users for environmental monitoring, risk prevention, and resource management. To verify the performance of the proposed parallel framework, the processing steps are developed and tested on graphics processing units (GPUs), achieving considerable speedups compared to the serial version of the same techniques implemented in the C language.

  17. Intelligent techniques in signal processing for multimedia security

    CERN Document Server

    Santhi, V

    2017-01-01

    This book proposes new algorithms to ensure secured communications and prevent unauthorized data exchange in secured multimedia systems. Focusing on numerous applications’ algorithms and scenarios, it offers an in-depth analysis of data hiding technologies including watermarking, cryptography, encryption, copy control, and authentication. The authors present a framework for visual data hiding technologies that resolves emerging problems of modern multimedia applications in several contexts including the medical, healthcare, education, and wireless communication networking domains. Further, it introduces several intelligent security techniques with real-time implementation. As part of its comprehensive coverage, the book discusses contemporary multimedia authentication and fingerprinting techniques, while also proposing personal authentication/recognition systems based on hand images, surveillance system security using gait recognition, face recognition under restricted constraints such as dry/wet face condi...

  18. The remarkable cell: Intelligently designed or by evolutionary process?

    Directory of Open Access Journals (Sweden)

    Mark Pretorius

    2013-02-01

    The objective of this article was to deal with the challenging theme of the origin of life. Science has been arguing the when and how of the beginning of life for centuries. It is a subject which remains perplexing despite all the technological advances made in science. The first part of the article dealt with the idea of a universe and earth divinely created to sustain life. The second part dealt with the premise that the first life forms were the miraculous work of an intelligent designer, as revealed by the sophisticated and intricate design of these first life forms. The article concluded with an explanation that these life forms stand in stark contrast to the idea of a random Darwinian-type evolution for life's origin, frequently referred to as abiogenesis or spontaneous generation.

  19. Common genetic influences on intelligence and auditory simple reaction time in a large Swedish sample

    NARCIS (Netherlands)

    Madison, G.; Mosing, M.A.; Verweij, K.J.H.; Pedersen, N.L.; Ullén, F.

    2016-01-01

    Intelligence and cognitive ability have long been associated with chronometric performance measures, such as reaction time (RT), but few studies have investigated auditory RT in this context. The nature of this relationship is important for understanding the etiology and structure of intelligence.

  20. Towards a New Approach of the Economic Intelligence Process: Basic Concepts, Analysis Methods and Informational Tools

    Directory of Open Access Journals (Sweden)

    Sorin Briciu

    2009-04-01

    One of the obvious trends in the current business environment is increased competition. In this context, organizations are becoming more and more aware of the importance of knowledge as a key factor in obtaining competitive advantage. A possible solution in knowledge management is Economic Intelligence (EI), which involves the collection, evaluation, processing, analysis, and dissemination of economic data (about products, clients, competitors, etc.) inside organizations. The availability of massive quantities of data, correlated with advances in information and communication technology allowing for the filtering and processing of these data, provides new tools for the production of economic intelligence. The research is focused on innovative aspects of the economic intelligence process (models of analysis, activities, methods and informational tools) and provides practical guidelines for initiating this process. In this paper, we try: (a) to contribute to a coherent view on the economic intelligence process (approaches, stages, fields of application); (b) to describe the most important models of analysis related to this process; and (c) to analyze the activities, methods and tools associated with each stage of an EI process.

  1. Using Software Zelio Soft in Educational Process to Simulation Control Programs for Intelligent Relays

    Science.gov (United States)

    Michalik, Peter; Mital, Dusan; Zajac, Jozef; Brezikova, Katarina; Duplak, Jan; Hatala, Michal; Radchenko, Svetlana

    2016-10-01

    The article addresses the use of intelligent relays and PLC systems in practice, their architecture, and the principles of programming and simulation for the educational process at all types of schools, from secondary schools to universities. The aim of the article is to propose simple example applications that demonstrate a programming methodology on real, simple practical examples and show the use of chosen instructions. The practical part describes the process of creating schematics and function blocks, including methodologies for creating programs and simulating output reactions to changing inputs for intelligent relays.

  2. Dehydrogenation in large ingot casting process

    International Nuclear Information System (INIS)

    Ubukata, Takashi; Suzuki, Tadashi; Ueda, Sou; Shibata, Takashi

    2009-01-01

    Forging components for nuclear power plants have become larger and larger in order to reduce the number of weld lines, from a safety point of view. Consequently, they are manufactured from ingots weighing 200 tons or more. Dehydrogenation is one of the key issues in the large ingot manufacturing process. For ingots of 200 tons or heavier, mold stream degassing (MSD) has been applied for dehydrogenation. Although JSW had developed mold stream degassing by argon (MSD-Ar) as a more effective dehydrogenating practice, MSD-Ar was not applied to these ingots, because conventional refractory materials of the stopper rod for the Ar blowing hole had low durability. In this study, we have developed a new type of stopper rod through modification of both the refractory materials and the stopper rod construction, and have successfully expanded the application range of MSD-Ar up to ingots weighing 330 tons. Compared with conventional MSD, the hydrogen content in ingots after MSD-Ar has decreased by 24 percent, as the dehydrogenation rate of MSD-Ar increased by 34 percent. (author)

  3. Data transfer based on intelligent ethernet card

    International Nuclear Information System (INIS)

    Zhu Haitao; Chinese Academy of Sciences, Beijing; Chu Yuanping; Zhao Jingwei

    2007-01-01

    Intelligent Ethernet cards are widely used in systems where the network throughput is very large, such as the DAQ systems of modern high-energy physics experiments and web services. Using a commercial intelligent Ethernet card as an example, this paper introduces the architecture, working principle and data-transfer process of intelligent Ethernet cards. In addition, the results of several experiments showing the differences between intelligent Ethernet cards and general ones are also presented. (authors)

  4. "Intelligent" tools for workflow process redesign : a research agenda

    NARCIS (Netherlands)

    Netjes, M.; Vanderfeesten, I.T.P.; Reijers, H.A.; Bussler, C.; Haller, A.

    2006-01-01

    Although much attention has been paid to business processes during the past decades, the design of business processes, and particularly workflow processes, is still more art than science. In this workshop paper, we present our view on modeling methods for workflow processes and introduce our research agenda.

  5. Competitive Intelligence.

    Science.gov (United States)

    Bergeron, Pierrette; Hiller, Christine A.

    2002-01-01

    Reviews the evolution of competitive intelligence since 1994, including terminology and definitions and analytical techniques. Addresses the issue of ethics; explores how information technology supports the competitive intelligence process; and discusses education and training opportunities for competitive intelligence, including core competencies…

  6. 8th International Symposium on Intelligent Distributed Computing & Workshop on Cyber Security and Resilience of Large-Scale Systems & 6th International Workshop on Multi-Agent Systems Technology and Semantics

    CERN Document Server

    Braubach, Lars; Venticinque, Salvatore; Badica, Costin

    2015-01-01

    This book represents the combined peer-reviewed proceedings of the Eighth International Symposium on Intelligent Distributed Computing - IDC'2014, the Workshop on Cyber Security and Resilience of Large-Scale Systems - WSRL-2014, and the Sixth International Workshop on Multi-Agent Systems Technology and Semantics - MASTS-2014. All the events were held in Madrid, Spain, during September 3-5, 2014. The 47 contributions published in this book address several topics related to the theory and applications of intelligent distributed computing and multi-agent systems, including: agent-based data processing, ambient intelligence, collaborative systems, cryptography and security, distributed algorithms, grid and cloud computing, information extraction, knowledge management, big data and ontologies, social networks, swarm intelligence and videogames, amongst others.

  7. An intelligent system for monitoring and diagnosis of the CO{sub 2} capture process

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Q.; Chan, C.W.; Tontiwachwuthikul, P. [University of Regina, Regina, SK (Canada). Faculty of Engineering

    2011-07-15

    Amine-based carbon dioxide capture has been widely considered a feasible technology for reducing large-scale CO{sub 2} emissions and mitigating global warming. The operation of amine-based CO{sub 2} capture is a complicated task, which involves monitoring over 100 process parameters and careful manipulation of numerous valves and pumps. Current research in the field of CO{sub 2} capture has emphasized the need to improve CO{sub 2} capture efficiency and enhance plant performance. In the present study, artificial intelligence techniques were applied to develop a knowledge-based expert system that aims at effectively monitoring and controlling the CO{sub 2} capture process and thereby enhancing CO{sub 2} capture efficiency. In developing the system, the inferential modeling technique (IMT) was applied to analyze the domain knowledge and problem-solving techniques, and a knowledge base was developed on DeltaV Simulate. The expert system helps to enhance CO{sub 2} capture system performance and efficiency by reducing the time required for diagnosis and problem solving if abnormal conditions occur. The expert system can be used as a decision-support tool that helps inexperienced operators control the plant; it can also be used for training novice operators.
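The knowledge-based monitoring idea described in this abstract can be sketched as a minimal rule-based diagnosis loop. This is an illustrative sketch only; the parameter names, thresholds, and messages below are invented and do not come from the paper or from DeltaV:

```python
# Minimal rule-based diagnosis sketch: each rule maps a process
# condition to a diagnosis message, as a stand-in for the knowledge
# base of an expert system monitoring a CO2 capture plant.
# All names and thresholds are hypothetical.
RULES = [
    (lambda p: p["absorber_temp"] > 60.0,
     "absorber over-temperature: check cooling"),
    (lambda p: p["amine_flow"] < 10.0,
     "low amine circulation: check pump"),
    (lambda p: p["co2_removal"] < 0.85,
     "low CO2 capture efficiency: check solvent"),
]

def diagnose(params):
    """Return the messages of all rules whose condition fires."""
    return [msg for cond, msg in RULES if cond(params)]

readings = {"absorber_temp": 65.0, "amine_flow": 12.0, "co2_removal": 0.80}
print(diagnose(readings))
```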

  8. Processing Speed and Intelligence as Predictors of School Achievement: Mediation or Unique Contribution?

    Science.gov (United States)

    Dodonova, Yulia A.; Dodonov, Yury S.

    2012-01-01

    The relationships between processing speed, intelligence, and school achievement were analyzed on a sample of 184 Russian 16-year-old students. Two speeded tasks required the discrimination of simple geometrical shapes and the recognition of the presented meaningless figures. Raven's Advanced Progressive Matrices and the verbal subtests of…

  9. Optimization of chemical composition in the manufacturing process of flotation balls based on intelligent soft sensing

    Directory of Open Access Journals (Sweden)

    Dučić Nedeljko

    2016-01-01

    This paper presents an application of computational intelligence to the modeling and optimization of parameters of two related production processes: ore flotation and the production of balls for ore flotation. It is proposed that the desired chemical composition of flotation balls (Mn = 0.69%; Cr = 2.247%; C = 3.79%; Si = 0.5%), which ensures a minimum wear rate (0.47 g/kg) during copper milling, is determined by combining an artificial neural network (ANN) and a genetic algorithm (GA). Based on the results provided by the neuro-genetic combination, a second neural network was derived as an 'intelligent soft sensor' in the process of white cast iron production. The proposed ANN 12-16-12-4 model demonstrated favourable prediction capacity and can be recommended as an 'intelligent soft sensor' in the alloying process intended for obtaining a favourable chemical composition of white cast iron for the production of flotation balls. Data from the two real production processes was used in the development of the intelligent soft sensor. [Project of the Ministry of Science of the Republic of Serbia, nos. TR35037 and TR35015]
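The neuro-genetic idea in this abstract, a GA searching composition parameters against a learned wear-rate model, can be sketched with the standard library. The surrogate below is a made-up quadratic bowl centred on the composition reported in the abstract; in the paper the surrogate would be a trained ANN, and the bounds are illustrative assumptions:

```python
import random

# GA searching chemical composition (Mn, Cr, C, Si) against a surrogate
# wear-rate model. TARGET is the composition from the abstract; the
# quadratic "wear_rate" and the BOUNDS are invented for illustration.
TARGET = (0.69, 2.247, 3.79, 0.5)
BOUNDS = [(0.3, 1.2), (1.0, 3.5), (3.0, 4.5), (0.2, 1.0)]

def wear_rate(x):
    """Stand-in for an ANN prediction: minimum 0.47 g/kg at TARGET."""
    return 0.47 + sum((a - b) ** 2 for a, b in zip(x, TARGET))

def evolve(pop_size=60, gens=80, seed=1):
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in BOUNDS] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=wear_rate)               # elitism: keep the best half
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]   # averaging crossover
            i = rng.randrange(len(child))                 # mutate one gene
            lo, hi = BOUNDS[i]
            child[i] = min(hi, max(lo, child[i] + rng.gauss(0, 0.05)))
            children.append(child)
        pop = parents + children
    return min(pop, key=wear_rate)

best = evolve()
print([round(v, 2) for v in best], round(wear_rate(best), 3))
```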

  10. A Multidirectional Model for Assessing Learning Disabled Students' Intelligence: An Information-Processing Framework.

    Science.gov (United States)

    Swanson, H. Lee

    1982-01-01

    An information-processing approach to the assessment of learning disabled students' intellectual performance is presented. The model is based on the assumption that intelligent behavior comprises a variety of problem-solving strategies. An account of child problem solving is explained and illustrated with a "thinking aloud" protocol.…

  11. The influence of masker type on early reflection processing and speech intelligibility (L)

    DEFF Research Database (Denmark)

    Arweiler, Iris; Buchholz, Jörg M.; Dau, Torsten

    2013-01-01

    Arweiler and Buchholz [J. Acoust. Soc. Am. 130, 996-1005 (2011)] showed that, while the energy of early reflections (ERs) in a room improves speech intelligibility, the benefit is smaller than that provided by the energy of the direct sound (DS). In terms of the integration of ERs and DS, binaural listening did not provide a benefit from ERs apart from a binaural energy summation, such that monaural auditory processing could account for the data. However, a diffuse speech-shaped noise (SSN) was used in the speech intelligibility experiments, which does not provide distinct binaural cues to the auditory system. In the present study, the monaural and binaural benefit from ERs for speech intelligibility was investigated using three directional maskers presented from 90° azimuth: an SSN, a multi-talker babble, and a reversed two-talker masker. For normal-hearing as well as hearing-impaired listeners...

  12. Ground Processing Optimization Using Artificial Intelligence Techniques, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The ultimate goal is the automation of a large amount of KSC's planning, scheduling, and execution decision making. Phase II will result in a complete full-scale...

  13. Process mining in the large : a tutorial

    NARCIS (Netherlands)

    Aalst, van der W.M.P.; Zimányi, E.

    2014-01-01

    Recently, process mining emerged as a new scientific discipline on the interface between process models and event data. On the one hand, conventional Business Process Management (BPM) and Workflow Management (WfM) approaches and tools are mostly model-driven with little consideration for event data.

  14. Integrated Intelligent Modeling, Design and Control of Crystal Growth Processes

    National Research Council Canada - National Science Library

    Prasad, V

    2000-01-01

    … This MURI program took an integrated approach towards modeling, design and control of crystal growth processes and, in conjunction with growth and characterization experiments, developed much better …

  15. Modeling and Control of Multivariable Process Using Intelligent Techniques

    Directory of Open Access Journals (Sweden)

    Subathra Balasubramanian

    2010-10-01

    Full Text Available For nonlinear dynamic systems, first-principles modeling and control is difficult to implement. In this study, a fuzzy controller and a recurrent fuzzy controller are developed for a MIMO process. A fuzzy logic controller is a model-free controller designed based on knowledge about the process. Two types of rule-based fuzzy models are available: the linguistic (Mamdani) model and the Takagi–Sugeno (TS) model. Of these two, the Takagi–Sugeno model has attracted the most attention. The application of fuzzy controllers is limited to static processes due to their feedforward structure. However, most real-time processes are dynamic and require the history of input/output data. In order to store past values, a memory unit is needed, which is introduced by the recurrent structure. The proposed recurrent fuzzy structure is used to develop a controller for a two-tank heating process. Both controllers are designed and implemented in a real-time environment and their performance is compared.
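A zero-order Takagi–Sugeno rule base of the kind contrasted with the Mamdani model above can be written in a few lines. The triangular membership functions, the three-rule base, and the heater-power consequents are invented for illustration; this is not the paper's two-tank controller:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def ts_controller(error):
    """Zero-order Takagi-Sugeno inference: crisp consequents per rule,
    combined by a firing-strength-weighted average (assumed rule base)."""
    rules = [
        (lambda e: tri(e, -2.0, -1.0, 0.0), 0.0),   # error negative -> low power
        (lambda e: tri(e, -1.0,  0.0, 1.0), 0.5),   # error near zero -> medium
        (lambda e: tri(e,  0.0,  1.0, 2.0), 1.0),   # error positive -> high power
    ]
    w = [mu(error) for mu, _ in rules]              # rule firing strengths
    if sum(w) == 0:
        return 0.5                                  # fallback: no rule fires
    return sum(wi * ci for wi, (_, ci) in zip(w, rules)) / sum(w)
```

A recurrent variant, as proposed in the study, would additionally feed delayed inputs/outputs into the rule antecedents to capture the process dynamics.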

  16. Business Intelligence Applied to the ALMA Software Integration Process

    Science.gov (United States)

    Zambrano, M.; Recabarren, C.; González, V.; Hoffstadt, A.; Soto, R.; Shen, T.-C.

    2012-09-01

    Software quality assurance and planning of an astronomy project is a complex task, especially if it is a distributed, collaborative project such as ALMA, where the development centers are spread across the globe. When you execute a software project, there is much valuable information about the process itself that you can collect. One way to receive this input is via an issue tracking system that gathers the problem reports relative to software bugs captured during testing of the software, during integration of the different components or, even worse, problems that occurred during production time. Usually little time is spent on analyzing them, but with some multidimensional processing you can extract valuable information that may help with long-term planning and resource allocation. We present an analysis of the information collected at ALMA from a collection of key unbiased indicators. We describe here the extraction, transformation and load process and how the data were processed. The main goal is to assess the software process and get insights from this information.

  17. Intelligent process mapping through systematic improvement of heuristics

    Science.gov (United States)

    Ieumwananonthachai, Arthur; Aizawa, Akiko N.; Schwartz, Steven R.; Wah, Benjamin W.; Yan, Jerry C.

    1992-01-01

    The present system for the automatic learning/evaluation of novel heuristic methods, applicable to the mapping of communicating-process sets on a computer network, is based on testing a population of competing heuristic methods within a fixed time constraint. The TEACHER 4.1 prototype learning system, implemented for learning new post-game analysis heuristic methods, iteratively generates and refines the mappings of a set of communicating processes on a computer network. A systematic exploration of the space of possible heuristic methods is shown to promise significant improvement.

  18. QuikForm: Intelligent deformation processing of structural alloys

    Energy Technology Data Exchange (ETDEWEB)

    Bourcier, R.J.; Wellman, G.W.

    1994-09-01

    There currently exists a critical need for tools to enhance the industrial competitiveness and agility of US industries involved in deformation processing of structural alloys. In response to this need, Sandia National Laboratories has embarked upon the QuikForm Initiative. The goal of this program is the development of computer-based tools to facilitate the design of deformation processing operations. The authors are currently focusing their efforts on the definition/development of a comprehensive system for the design of sheet metal stamping operations. The overall structure of the proposed QuikForm system is presented, and the focus of their thrust in each technical area is discussed.

  19. Enhancing the Scientific Process with Artificial Intelligence: Forest Science Applications

    Science.gov (United States)

    Ronald E. McRoberts; Daniel L. Schmoldt; H. Michael Rauscher

    1991-01-01

    Forestry, as a science, is a process for investigating nature. It consists of repeatedly cycling through a number of steps, including identifying knowledge gaps, creating knowledge to fill them, and organizing, evaluating, and delivering this knowledge. Much of this effort is directed toward creating abstract models of natural phenomena. The cognitive techniques of AI...

  20. Personality and Information Processing Speed: Independent Influences on Intelligent Performance

    Science.gov (United States)

    Bates, Timothy C.; Rock, Andrew

    2004-01-01

    Raven's matrices and inspection time (IT) were recorded from 56 subjects under five arousal levels. Raven's and IT correlated strongly (r = -0.7) as predicted by processing-speed theories of "g." In line with Eysenck's [Eysenck, H. J. (1967). "The biological basis of personality". Springfield, IL: Thomas] arousal theory of extraversion, there was…

  1. The Use Of Computer Intelligent Processing Technologies Among ...

    African Journals Online (AJOL)

    This paper assesses the awareness and usage of a novel approach to data and information processing among scientists, researchers and students in the field of environmental sciences. In-depth, structured interviews were conducted, targeting a population working on a variety of environmental issues. The data ...

  2. Intelligent process control of fiber chemical vapor deposition

    Science.gov (United States)

    Jones, John Gregory

    Chemical Vapor Deposition (CVD) is a widely used process for the application of thin films. In this case, CVD is being used to apply a thin-film interface coating to single-crystal monofilament sapphire (Al2O3) fibers for use in Ceramic Matrix Composites (CMCs). The hot-wall reactor operates at near atmospheric pressure, which is maintained using a venturi pump system. Inert gas seals obviate the need for a sealed system. A liquid precursor delivery system has been implemented to provide precise stoichiometry control. Neural networks have been implemented to create real-time process description models, trained using data generated from a Navier-Stokes finite difference model of the process. Automation of the process to include full computer control and data logging capability is also presented. In situ sensors including a quadrupole mass spectrometer, thermocouples, a laser scanner, and a Raman spectrometer have been implemented to determine the gas phase reactants and coating quality. A fuzzy logic controller has been developed to regulate either the gas phase or the in situ temperature of the reactor using oxygen flow rate as an actuator. Scanning electron microscope (SEM) images of various samples are shown. The hierarchical structure upon which the control system is based is also presented.

  3. Software architecture for intelligent image processing using Prolog

    Science.gov (United States)

    Jones, Andrew C.; Batchelor, Bruce G.

    1994-10-01

    We describe a prototype system for interactive image processing using Prolog, implemented by the first author on an Apple Macintosh computer. This system is inspired by Prolog+, but differs from it in two particularly important respects. The first is that whereas Prolog+ assumes the availability of dedicated image processing hardware, with which the Prolog system communicates, our present system implements image processing functions in software using the C programming language. The second difference is that although our present system supports Prolog+ commands, these are implemented in terms of lower-level Prolog predicates which provide a more flexible approach to image manipulation. We discuss the impact of the Apple Macintosh operating system upon the implementation of the image-processing functions, and the interface between these functions and the Prolog system. We also explain how the Prolog+ commands have been implemented. The system described in this paper is a fairly early prototype, and we outline how we intend to develop the system, a task which is expedited by the extensible architecture we have implemented.

  4. 12th International Conference on Intelligent Information Hiding and Multimedia Signal Processing

    CERN Document Server

    Tsai, Pei-Wei; Huang, Hsiang-Cheh

    2017-01-01

    This volume of Smart Innovation, Systems and Technologies contains accepted papers presented at IIH-MSP-2016, the 12th International Conference on Intelligent Information Hiding and Multimedia Signal Processing. The conference this year was technically co-sponsored by the Tainan Chapter of the IEEE Signal Processing Society, Fujian University of Technology, Chaoyang University of Technology, Taiwan Association for Web Intelligence Consortium, Fujian Provincial Key Laboratory of Big Data Mining and Applications (Fujian University of Technology), and Harbin Institute of Technology Shenzhen Graduate School. IIH-MSP 2016 was held on 21-23 November 2016 in Kaohsiung, Taiwan. The conference is an international forum for researchers and professionals in all areas of information hiding and multimedia signal processing.

  5. Predicting speech intelligibility based on the signal-to-noise envelope power ratio after modulation-frequency selective processing

    DEFF Research Database (Denmark)

    Jørgensen, Søren; Dau, Torsten

    2011-01-01

    A model for predicting the intelligibility of processed noisy speech is proposed. The speech-based envelope power spectrum model has a similar structure to the model of Ewert and Dau [(2000). J. Acoust. Soc. Am. 108, 1181-1196], developed to account for modulation detection and masking data. The model estimates the speech-to-noise envelope power ratio, SNRenv, at the output of a modulation filterbank and relates this metric to speech intelligibility using the concept of an ideal observer. Predictions were compared to data on the intelligibility of speech presented in stationary speech … process provides a key measure of speech intelligibility. © 2011 Acoustical Society of America.
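The central metric, the envelope signal-to-noise power ratio, can be illustrated with a heavily simplified sketch. The rectify-and-smooth envelope and the single broadband "modulation channel" below are crude stand-ins for the model's envelope extraction and modulation filterbank:

```python
import math

def envelope(signal, win=16):
    """Crude temporal envelope: rectification followed by a moving average
    (a stand-in for the model's envelope extraction stage)."""
    rect = [abs(s) for s in signal]
    return [sum(rect[max(0, i - win): i + 1]) / (i + 1 - max(0, i - win))
            for i in range(len(rect))]

def env_ac_power(env):
    """AC power (variance) of the envelope: the 'modulation power'."""
    m = sum(env) / len(env)
    return sum((e - m) ** 2 for e in env) / len(env)

def snr_env_db(speech_env_power, noise_env_power):
    """SNRenv in dB: speech modulation power relative to noise modulation power."""
    return 10 * math.log10(speech_env_power / noise_env_power)

# Demo: a 4 Hz amplitude-modulated carrier ("speech-like") vs. an
# unmodulated carrier ("noise-like"), 1 s at an assumed 8 kHz rate.
fs, n = 8000, 8000
speech = [(1 + 0.8 * math.sin(2 * math.pi * 4 * t / fs))
          * math.sin(2 * math.pi * 500 * t / fs) for t in range(n)]
noise = [0.5 * math.sin(2 * math.pi * 500 * t / fs) for t in range(n)]
snr = snr_env_db(env_ac_power(envelope(speech)), env_ac_power(envelope(noise)))
```

The modulated signal carries far more envelope power than the flat one, so the sketch yields a clearly positive SNRenv, the quantity the model maps to intelligibility via the ideal observer.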

  6. A Latent Variable Analysis of Working Memory Capacity, Short-Term Memory Capacity, Processing Speed, and General Fluid Intelligence.

    Science.gov (United States)

    Conway, Andrew R. A.; Cowan, Nelson; Bunting, Michael F.; Therriault, David J.; Minkoff, Scott R. B.

    2002-01-01

    Studied the interrelationships among general fluid intelligence, short-term memory capacity, working memory capacity, and processing speed in 120 young adults and used structural equation modeling to determine the best predictor of general fluid intelligence. Results suggest that working memory capacity, but not short-term memory capacity or…

  7. Influence of Family Processes, Motivation, and Beliefs about Intelligence on Creative Problem Solving of Scientifically Talented Individuals

    Science.gov (United States)

    Cho, Seokhee; Lin, Chia-Yi

    2011-01-01

    Predictive relationships among perceived family processes, intrinsic and extrinsic motivation, incremental beliefs about intelligence, confidence in intelligence, and creative problem-solving practices in mathematics and science were examined. Participants were 733 scientifically talented Korean students in fourth through twelfth grades as well as…

  8. Information processing speed mediates the relationship between white matter and general intelligence in schizophrenia.

    Science.gov (United States)

    Alloza, Clara; Cox, Simon R; Duff, Barbara; Semple, Scott I; Bastin, Mark E; Whalley, Heather C; Lawrie, Stephen M

    2016-08-30

    Several authors have proposed that schizophrenia is the result of impaired connectivity between specific brain regions rather than differences in local brain activity. White matter abnormalities have been suggested as the anatomical substrate for this dysconnectivity hypothesis. Information processing speed may act as a key cognitive resource facilitating higher order cognition by allowing multiple cognitive processes to be simultaneously available. However, there is a lack of established associations between these variables in schizophrenia. We hypothesised that the relationship between white matter and general intelligence would be mediated by processing speed. White matter water diffusion parameters were studied using Tract-based Spatial Statistics and computed within 46 regions of interest (ROIs). Principal component analysis was conducted on these white matter ROIs for fractional anisotropy (FA) and mean diffusivity, and on neurocognitive subtests, to extract general factors of white matter structure (gFA, gMD), general intelligence (g) and processing speed (gspeed). There was a positive correlation between g and gFA (r = 0.67, p = 0.001) that was partially and significantly mediated by gspeed (56.22%, CI: 0.10-0.62). These findings suggest a plausible model of structure-function relations in schizophrenia, whereby white matter structure may provide a neuroanatomical substrate for general intelligence, which is partly supported by speed of information processing. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
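The mediation logic (processing speed partially transmitting the white-matter effect on g) can be illustrated with standardized regression on simulated data. The path coefficients in the simulation are invented; only the structure, a Baron-Kenny-style decomposition computed from correlations, mirrors the paper's analysis:

```python
import random

random.seed(7)

def corr(a, b):
    """Pearson correlation of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b)) / n
    va = sum((x - ma) ** 2 for x in a) / n
    vb = sum((y - mb) ** 2 for y in b) / n
    return cov / (va * vb) ** 0.5

# Simulated standardized scores: white matter (gFA) -> speed (gspeed) -> g.
# Path coefficients are illustrative, chosen so the gFA-g correlation lands
# near the r = 0.67 reported above.
n = 2000
gFA = [random.gauss(0, 1) for _ in range(n)]
gspeed = [0.7 * f + random.gauss(0, 0.6) for f in gFA]
g = [0.6 * s + 0.2 * f + random.gauss(0, 0.6) for f, s in zip(gFA, gspeed)]

r_gf, r_gs, r_fs = corr(g, gFA), corr(g, gspeed), corr(gFA, gspeed)
total_effect = r_gf                                      # standardized total effect
direct_effect = (r_gf - r_gs * r_fs) / (1 - r_fs ** 2)   # gspeed partialled out
mediated_share = 1 - direct_effect / total_effect        # proportion via gspeed
```

With these made-up paths, roughly two thirds of the gFA-g association flows through gspeed, qualitatively matching the partial mediation (about 56%) reported in the abstract.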

  9. Automated business process management – in times of digital transformation using machine learning or artificial intelligence

    Directory of Open Access Journals (Sweden)

    Paschek Daniel

    2017-01-01

    Full Text Available The continuous optimization of business processes is still a challenge for companies. In times of digital transformation, with faster-changing internal and external framework conditions, new customer expectations of fastest delivery and best quality of goods, and much more, companies should set up their internal processes in the best way. But what to do if framework conditions change unexpectedly? The purpose of this paper is to analyse how the digital transformation will impact Business Process Management (BPM) when using methods like machine learning or artificial intelligence. Therefore, the core components are explained, compared and set in relation. To identify application areas, interviews and analyses were conducted with digital companies. The findings of the paper are recommendations for action in the field of BPM and process optimization through machine learning and artificial intelligence. The approach of optimizing and managing processes via machine learning and artificial intelligence will support companies in deciding which tool will be best for automated BPM.

  10. Intelligent Optimization of a Mixed Culture Cultivation Process

    Directory of Open Access Journals (Sweden)

    Petia Koprinkova-Hristova

    2015-04-01

    Full Text Available In the present paper a neural network approach called Adaptive Critic Design (ACD) was applied to the optimal tuning of set-point controllers for the three main substrates (sugar, nitrogen source and dissolved oxygen) of a PHB production process. For approximation of the critic and the controllers, a special kind of recurrent neural network called an Echo State Network (ESN) was used. Its structure allows fast training, which is of crucial importance in on-line applications. The critic network is trained to minimize the temporal difference error using the Recursive Least Squares method. Two approaches, gradient and heuristic, were exploited for training the controllers. The comparison is made with respect to the achieved improvement of the utility function subject to optimization, as well as with a known expert strategy for controlling the PHB production process.

  11. Improving content marketing processes with the approaches by artificial intelligence

    OpenAIRE

    Kose, Utku; Sert, Selcuk

    2017-01-01

    Content marketing is today one of the most remarkable approaches in the context of the marketing processes of companies. The value of this kind of marketing has grown over time thanks to the latest developments in computer and communication technologies. Nowadays, social media based platforms in particular are greatly important in enabling companies to design multimedia-oriented, interactive content. But on the other hand, there is still more to do for improved content marketing…

  12. A Large-Scale Evaluation of an Intelligent Discovery World: Smithtown.

    Science.gov (United States)

    Shute, Valerie J.; Glaser, Robert

    1990-01-01

    Presents an evaluation of "Smithtown," an intelligent tutoring system designed to teach inductive inquiry skills and principles of basic microeconomics. Two studies of individual differences in learning are described, including a comparison of knowledge acquisition with traditional instruction; hypotheses tested are discussed; and the…

  13. Artificial Intelligence for Inferential Control of Crude Oil Stripping Process

    Directory of Open Access Journals (Sweden)

    Mehdi Ebnali

    2018-01-01

    Full Text Available Stripper columns are used for sweetening crude oil, and they must hold the product hydrogen sulfide content as near the set point as possible in the face of upsets. Since product quality cannot be measured easily and economically online, control of product quality is often achieved by maintaining a suitable tray temperature near its set point. The tray temperature control method, however, is not a proper option for a multi-component stripping column, because the tray temperature does not correspond exactly to the product composition. To overcome this problem, secondary measurements can be used to infer the product quality and adjust the values of the manipulated variables. In this paper, we have used a novel inferential control approach based on an adaptive network fuzzy inference system (ANFIS) for the stripping process. ANFIS with different learning algorithms is used for modeling the process and building a composition estimator to estimate the composition of the bottom product. The developed estimator is tested, and the results show that the predictions made by the ANFIS structure are in good agreement with the results of simulation by the ASPEN HYSYS process simulation package. In addition, inferential control by implementation of the ANFIS-based online composition estimator in a cascade control scheme is superior to the traditional tray temperature control method, on the basis of lower integral time absolute error and lower reboiler duty consumption.

  14. The process of deforestation in weak democracies and the role of Intelligence.

    Science.gov (United States)

    Obydenkova, Anastassia; Nazarov, Zafar; Salahodjaev, Raufhon

    2016-07-01

    This article examines the interconnection between national intelligence, political institutions, and the mismanagement of public resources (deforestation). The paper examines the reasons for deforestation and investigates the factors accountable for it. The analysis builds on an author-compiled cross-national dataset on 185 countries over a time period of twenty years, from 1990 to 2010. We find, first, that a nation's intelligence significantly reduces the level of deforestation in a state. Moreover, national IQ seems to play an offsetting role in natural resource conservation (forest management) in countries with weak democratic institutions. The analysis also discovered the presence of a U-shaped relationship between democracy and deforestation. Intelligence sheds more light on this interconnection and explains the results. Our results are robust to various sample selection strategies and model specifications. The main implication of our study is that intelligence not only shapes formal rules and informal regulations such as social trust, norms and traditions, but also has the ability to reverse the paradoxical process known as the "resource curse." The study contributes to a better understanding of the reasons for deforestation and sheds light on the debated impact of political regime on forest management. Copyright © 2016 Elsevier Inc. All rights reserved.

  15. Recent Technological Advances in Natural Language Processing and Artificial Intelligence

    OpenAIRE

    Shah, Nishal Pradeepkumar

    2012-01-01

    Recent advances in computer technology have permitted scientists to implement and test algorithms that have been known for quite some time (or not) but were computationally expensive. Two such projects are IBM's Jeopardy effort, part of its DeepQA project [1], and Wolfram's WolframAlpha [2]. Both implement natural language processing (another goal of AI scientists) and try to answer questions as asked by the user. Though the goals of the two projects are similar, both of them have a …

  16. Statistical processing of large image sequences.

    Science.gov (United States)

    Khellah, F; Fieguth, P; Murray, M J; Allen, M

    2005-01-01

    The dynamic estimation of large-scale stochastic image sequences, as frequently encountered in remote sensing, is important in a variety of scientific applications. However, the size of such images makes conventional dynamic estimation methods, for example, the Kalman and related filters, impractical. In this paper, we present an approach that emulates the Kalman filter, but with considerably reduced computational and storage requirements. Our approach is illustrated in the context of a 512 x 512 image sequence of ocean surface temperature. The static estimation step, the primary contribution here, uses a mixture of stationary models to accurately mimic the effect of a nonstationary prior, simplifying both computational complexity and modeling. Our approach provides an efficient, stable, positive-definite model which is consistent with the given correlation structure. Thus, the methods of this paper may find application in modeling and single-frame estimation.
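The scale problem motivating the paper is easy to state: a full Kalman filter on a 512 x 512 field carries a 262,144 x 262,144 covariance matrix. The crudest reduction, filtering each pixel independently with a scalar Kalman filter, is sketched below; the random-walk state model and noise variances are illustrative, and this is not the paper's mixture-of-stationary-models estimator:

```python
def kalman_1d(measurements, q=0.01, r=0.25, x0=0.0, p0=1.0):
    """Scalar Kalman filter for a random-walk state observed in noise.

    q: process noise variance, r: measurement noise variance (assumed values).
    Running one of these per pixel avoids the impractical full covariance,
    at the price of ignoring all spatial correlation between pixels.
    """
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p = p + q                 # predict: state uncertainty grows
        k = p / (p + r)           # Kalman gain
        x = x + k * (z - x)       # update with the new measurement
        p = (1 - k) * p           # posterior uncertainty shrinks
        estimates.append(x)
    return estimates
```

Methods like the one in the paper sit between these extremes: they emulate the full filter's behavior while keeping per-frame cost and storage closer to the per-pixel sketch.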

  17. 2nd International Symposium on Signal Processing and Intelligent Recognition Systems

    CERN Document Server

    Bandyopadhyay, Sanghamitra; Krishnan, Sri; Li, Kuan-Ching; Mosin, Sergey; Ma, Maode

    2016-01-01

    This Edited Volume contains a selection of refereed and revised papers originally presented at the second International Symposium on Signal Processing and Intelligent Recognition Systems (SIRS-2015), December 16-19, 2015, Trivandrum, India. The program committee received 175 submissions. Each paper was peer reviewed by at least three independent referees of the program committee, and 59 papers were finally selected. The papers offer stimulating insights into biometrics, digital watermarking, recognition systems, image and video processing, signal and speech processing, pattern recognition, machine learning and knowledge-based systems. The book is directed at researchers and scientists engaged in various fields of signal processing and related areas.

  18. An Intelligent System for Modelling, Design and Analysis of Chemical Processes

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    ICAS, Integrated Computer Aided System, is a software package consisting of a number of intelligent tools which are very suitable, among others, for computer-aided modelling, sustainable design of chemical and biochemical processes, and design/analysis of product-process monitoring systems. Each … The computer-aided modelling tool will illustrate how to generate a desired process model, how to analyze the model equations, how to extract data, and how to identify the model and make it ready for various types of application. In sustainable process design, the example will highlight the issue of integration …

  19. Network-Capable Application Process and Wireless Intelligent Sensors for ISHM

    Science.gov (United States)

    Figueroa, Fernando; Morris, Jon; Turowski, Mark; Wang, Ray

    2011-01-01

    Intelligent sensor technology and systems are increasingly attractive as frameworks for intelligent rocket test facilities with embedded intelligent sensor elements, distributed data acquisition elements, and onboard data acquisition elements. Networked intelligent processors enable users and systems integrators to automatically configure their measurement automation systems for analog sensors. NASA and leading sensor vendors are working together to apply the IEEE 1451 standard to add plug-and-play capabilities for wireless analog transducers through the use of a Transducer Electronic Data Sheet (TEDS), in order to simplify sensor setup, use, and maintenance, to automatically obtain calibration data, and to eliminate manual data entry and error. A TEDS contains the critical information needed by an instrument or measurement system to identify, characterize, interface, and properly use the signal from an analog sensor. A TEDS is deployed for a sensor in one of two ways. First, the TEDS can reside in embedded, nonvolatile memory (typically flash memory) within the intelligent processor. Second, a virtual TEDS can exist as a separate file, downloadable from the Internet. This concept of a virtual TEDS extends the benefits of the standardized TEDS to legacy sensors and applications where embedded memory is not available. An HTML-based user interface provides a visual tool to interface with the distributed sensors with which a TEDS is associated, automating the sensor management process. Implementing and deploying the IEEE 1451.1-based Network-Capable Application Process (NCAP) can support intelligent processing in Integrated Systems Health Management (ISHM) for the purposes of monitoring, detection of anomalies, diagnosis of causes of anomalies, prediction of future anomalies, mitigation to maintain operability, and integrated awareness of system health by the operator. It can also support local data collection and storage.
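The idea of a TEDS, a compact machine-readable block holding identification and calibration data, can be sketched as follows. The field set and binary layout here are simplified illustrations and do not follow the actual IEEE 1451 template encodings:

```python
import struct
from dataclasses import dataclass

@dataclass
class BasicTeds:
    """Simplified TEDS-like record (NOT the real IEEE 1451 binary template)."""
    manufacturer_id: int   # identifies the sensor vendor
    model_number: int
    serial_number: int
    cal_gain: float        # engineering units per raw count
    cal_offset: float

# Big-endian layout: two uint16, one uint32, two float32 = 16 bytes total.
FMT = ">HHIff"

def encode_teds(t: BasicTeds) -> bytes:
    return struct.pack(FMT, t.manufacturer_id, t.model_number,
                       t.serial_number, t.cal_gain, t.cal_offset)

def decode_teds(blob: bytes) -> BasicTeds:
    return BasicTeds(*struct.unpack(FMT, blob))

def to_engineering_units(raw_count: int, teds: BasicTeds) -> float:
    """Apply the TEDS calibration so the reading is self-describing."""
    return raw_count * teds.cal_gain + teds.cal_offset

# Hypothetical sensor record.
teds = BasicTeds(manufacturer_id=17, model_number=1451,
                 serial_number=90210, cal_gain=0.5, cal_offset=-1.25)
```

The plug-and-play benefit described above comes from exactly this pattern: the acquisition system reads the block from sensor memory (or a virtual TEDS file) and configures itself without manual data entry.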

  20. The use of artificial intelligence techniques to improve the multiple payload integration process

    Science.gov (United States)

    Cutts, Dannie E.; Widgren, Brian K.

    1992-01-01

    A maximum return of science and products with a minimum expenditure of time and resources is a major goal of mission payload integration. A critical component then, in successful mission payload integration is the acquisition and analysis of experiment requirements from the principal investigator and payload element developer teams. One effort to use artificial intelligence techniques to improve the acquisition and analysis of experiment requirements within the payload integration process is described.

  1. Developing an Intelligent Automatic Appendix Extraction Method from Ultrasonography Based on Fuzzy ART and Image Processing

    Directory of Open Access Journals (Sweden)

    Kwang Baek Kim

    2015-01-01

    Full Text Available Ultrasound examination (US) plays a key role in the diagnosis and management of patients with clinically suspected appendicitis, which is the most common abdominal surgical emergency. Among the various sonographic findings of appendicitis, the outer diameter of the appendix is the most important. Therefore, clear delineation of the appendix on US images is essential. In this paper, we propose a new intelligent method to extract the appendix automatically from abdominal sonographic images, as a basic building block for developing such an intelligent tool for medical practitioners. Knowing that the appendix is located in the lower organ area below the bottom fascia line, we conduct a series of image processing techniques to find the fascia line correctly, and then apply the fuzzy ART learning algorithm to the organ area in order to extract the appendix accurately. The experiment verifies that the proposed method is highly accurate (successful in 38 out of 40 cases) in extracting the appendix.
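The fuzzy ART learning step applied to the organ area can be illustrated on toy 2-D feature vectors. The vigilance and learning-rate values below are illustrative, and the input is a plain list of feature vectors rather than ultrasound image data:

```python
def fuzzy_and(a, b):
    """Component-wise fuzzy AND (minimum)."""
    return [min(x, y) for x, y in zip(a, b)]

def norm1(a):
    """L1 norm of a non-negative vector."""
    return sum(a)

def fuzzy_art_train(patterns, vigilance=0.75, alpha=0.001, beta=1.0):
    """Minimal fuzzy ART: complement coding, choice function, vigilance test,
    fast learning (beta=1). Parameter values are assumptions for illustration."""
    categories = []   # category weight vectors
    labels = []       # winning category per input
    for p in patterns:
        x = p + [1 - v for v in p]           # complement coding keeps |x| fixed
        # Rank categories by the choice function T_j = |x ^ w_j| / (alpha + |w_j|).
        order = sorted(range(len(categories)),
                       key=lambda j: -norm1(fuzzy_and(x, categories[j]))
                                     / (alpha + norm1(categories[j])))
        for j in order:
            match = norm1(fuzzy_and(x, categories[j])) / norm1(x)
            if match >= vigilance:           # vigilance passed: resonate and learn
                w = categories[j]
                categories[j] = [beta * m + (1 - beta) * wi
                                 for m, wi in zip(fuzzy_and(x, w), w)]
                labels.append(j)
                break
        else:                                # no category matched: create a new one
            categories.append(x)
            labels.append(len(categories) - 1)
    return categories, labels
```

Two well-separated clusters of inputs end up in two categories, which is the clustering behavior the method exploits to isolate the appendix region within the organ area.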

  2. Intelligent Processing Equipment Developments Within the Navy's Manufacturing Technology Centers of Excellence

    Science.gov (United States)

    Nanzetta, Philip

    1992-01-01

    The U.S. Navy has had an active Manufacturing Technology (MANTECH) Program aimed at developing advanced production processes and equipment since the late 1960s. During the past decade, however, the resources of the MANTECH program were concentrated in Centers of Excellence. Today, the Navy sponsors four manufacturing technology Centers of Excellence: the Automated Manufacturing Research Facility (AMRF); the Electronics Manufacturing Productivity Facility (EMPF); the National Center for Excellence in Metalworking Technology (NCEMT); and the Center of Excellence for Composites Manufacturing Technology (CECMT). This paper briefly describes each of the centers and summarizes typical intelligent processing equipment (IPE) projects that were undertaken.

  3. Development of a process model for intelligent control of gas metal arc welding

    International Nuclear Information System (INIS)

    Smartt, H.B.; Johnson, J.A.; Einerson, C.J.; Watkins, A.D.; Carlson, N.M.

    1991-01-01

    This paper discusses work in progress on the development of an intelligent control scheme for arc welding. A set of four sensors is used to detect weld bead cooling rate, droplet transfer mode, weld pool and joint location and configuration, and weld defects during welding. A neural network is being developed as the bridge between the multiple-sensor set and a conventional proportional-integral controller that provides independent control of process variables. This approach is being developed for the gas metal arc welding process. 20 refs., 8 figs

  4. Study on intelligent processing system of man-machine interactive garment frame model

    Science.gov (United States)

    Chen, Shuwang; Yin, Xiaowei; Chang, Ruijiang; Pan, Peiyun; Wang, Xuedi; Shi, Shuze; Wei, Zhongqian

    2018-05-01

    A man-machine interactive garment frame model intelligent processing system is studied in this paper. The system consists of several sensor devices, a voice processing module, mechanical parts and a data collection device. The sensor devices collect information on the environmental changes brought about by a body approaching the clothes frame model; the data collection device gathers the information on environmental changes sensed by the sensor devices; the voice processing module performs speaker-independent speech recognition to achieve human-machine interaction; and the mechanical moving parts make the corresponding mechanical responses to the information processed by the data collection device. There is a one-way connection between the sensor devices and the data collection device, a two-way connection between the data collection device and the voice processing module, and a one-way connection between the data collection device and the mechanical moving parts. The intelligent processing system can judge whether it needs to interact with the customer, realizing man-machine interaction instead of the current rigid frame model.

  5. Comparing Binaural Pre-processing Strategies II: Speech Intelligibility of Bilateral Cochlear Implant Users.

    Science.gov (United States)

    Baumgärtel, Regina M; Hu, Hongmei; Krawczyk-Becker, Martin; Marquardt, Daniel; Herzke, Tobias; Coleman, Graham; Adiloğlu, Kamil; Bomke, Katrin; Plotz, Karsten; Gerkmann, Timo; Doclo, Simon; Kollmeier, Birger; Hohmann, Volker; Dietz, Mathias

    2015-12-30

    Several binaural audio signal enhancement algorithms were evaluated with respect to their potential to improve speech intelligibility in noise for users of bilateral cochlear implants (CIs). 50% speech reception thresholds (SRT50) were assessed using an adaptive procedure in three distinct, realistic noise scenarios. All scenarios were highly nonstationary, complex, and included a significant amount of reverberation. Other aspects, such as the perfectly frontal target position, were idealized laboratory settings, allowing the algorithms to perform better than in corresponding real-world conditions. Eight bilaterally implanted CI users, wearing devices from three manufacturers, participated in the study. In all noise conditions, a substantial improvement in SRT50 compared to the unprocessed signal was observed for most of the algorithms tested, with the largest improvements generally provided by binaural minimum variance distortionless response (MVDR) beamforming algorithms. The largest overall improvement in speech intelligibility was achieved by an adaptive binaural MVDR in a spatially separated, single competing talker noise scenario. A no-pre-processing condition and adaptive differential microphones without a binaural link served as the two baseline conditions. SRT50 improvements provided by the binaural MVDR beamformers surpassed the performance of the adaptive differential microphones in most cases. Speech intelligibility improvements predicted by instrumental measures were shown to account for some but not all aspects of the perceptually obtained SRT50 improvements measured in bilaterally implanted CI users. © The Author(s) 2015.
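
For readers unfamiliar with the MVDR rule behind the best-performing algorithms above, a minimal sketch follows: the weights w = R^{-1} d / (d^H R^{-1} d) minimize output noise power while passing the target direction with unit gain. The two-channel covariance matrix and steering vector below are illustrative assumptions, not values from the study.

```python
# Minimal MVDR beamformer, shown for two real-valued channels so the
# algebra stays hand-checkable. R (noise covariance) and d (steering
# vector) are invented for the example.

def mvdr_weights_2ch(R, d):
    """Minimise output noise power subject to the distortionless
    constraint w . d = 1 (2x2 real-valued case)."""
    (a, b), (c, e) = R
    det = a * e - b * c
    rinv_d = [(e * d[0] - b * d[1]) / det,    # R^{-1} d, first entry
              (-c * d[0] + a * d[1]) / det]   # R^{-1} d, second entry
    denom = d[0] * rinv_d[0] + d[1] * rinv_d[1]
    return [rinv_d[0] / denom, rinv_d[1] / denom]

R = [[2.0, 0.5], [0.5, 1.0]]   # assumed noise covariance
d = [1.0, 1.0]                 # frontal target: equal gain at both ears
w = mvdr_weights_2ch(R, d)
gain = w[0] * d[0] + w[1] * d[1]   # unit gain toward the target
```

The adaptive variants tested in the study re-estimate R over time; the closed-form weight rule itself stays the same.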

  6. A New Dimension of Business Intelligence: Location-based Intelligence

    OpenAIRE

    Zeljko Panian

    2012-01-01

    Through the course of this paper we define Location-based Intelligence (LBI), which is emerging from the amalgamation of geolocation and Business Intelligence. Amalgamating geolocation with traditional Business Intelligence (BI) results in a new dimension of BI named Location-based Intelligence. LBI is defined as leveraging unified location information for business intelligence. Collectively, enterprises can transform location data into business intelligence applic...

  7. Development and evaluation of an intelligent traceability system for frozen tilapia fillet processing.

    Science.gov (United States)

    Xiao, Xinqing; Fu, Zetian; Qi, Lin; Mira, Trebar; Zhang, Xiaoshuan

    2015-10-01

    The main export varieties in China are brand-name, high-quality bred aquatic products. Among them, tilapia has become the most important and fastest-growing species, since extensive consumer markets in North America and Europe have evolved as a result of commodity prices, year-round availability and the quality of fresh and frozen products. As the largest tilapia farming country, China has over one-third of its tilapia production devoted to further processing and meeting foreign market demand. Taking tilapia fillet processing as a case, this paper describes the development and evaluation of ITS-TF: an intelligent traceability system integrated with statistical process control (SPC) and fault tree analysis (FTA). Observations, literature review and expert questionnaires were used for system requirement and knowledge acquisition; scenario simulation was applied to evaluate and validate ITS-TF performance. The results show that the traceability requirement has evolved from a firefighting model to a proactive model for enhancing process management capacity for food safety; ITS-TF transforms itself into an intelligent system providing early-warning and process-management functions through the integrated SPC and FTA. The valuable suggestion that automatic data acquisition and communication technology should be integrated into ITS-TF was obtained for further system optimization, perfection and performance improvement. © 2014 Society of Chemical Industry.
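
The SPC component of a system like ITS-TF can be illustrated with a minimal Shewhart individuals chart; the 3-sigma rule is standard SPC, while the freezer-temperature readings below are invented for the example.

```python
# Shewhart individuals chart: flag readings outside mean +/- 3 sigma
# of recent in-control history. Sample data are illustrative.

def control_limits(samples):
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / (n - 1)
    sigma = var ** 0.5
    return mean - 3 * sigma, mean + 3 * sigma

def out_of_control(samples, new_value):
    lcl, ucl = control_limits(samples)
    return not (lcl <= new_value <= ucl)

# Stable freezer temperatures (degrees C), then a warm excursion.
history = [-18.2, -18.0, -18.1, -17.9, -18.3, -18.1, -18.0, -18.2]
alarm = out_of_control(history, -15.0)   # the excursion trips the chart
```

In a traceability setting, a tripped chart would trigger the FTA side of the system to localize the likely cause.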

  8. An intelligent approach for cooling radiator fault diagnosis based on infrared thermal image processing technique

    International Nuclear Information System (INIS)

    Taheri-Garavand, Amin; Ahmadi, Hojjat; Omid, Mahmoud; Mohtasebi, Seyed Saeid; Mollazade, Kaveh; Russell Smith, Alan John; Carlomagno, Giovanni Maria

    2015-01-01

    This research presents a new intelligent fault diagnosis and condition monitoring system for classification of different conditions of a cooling radiator using infrared thermal images. The system was adopted to classify six types of cooling radiator conditions: radiator tubes blockage, radiator fins blockage, loose connection between fins and tubes, radiator door failure, coolant leakage, and normal conditions. The proposed system consists of several distinct procedures including thermal image acquisition, image pre-processing, image processing, two-dimensional discrete wavelet transform (2D-DWT), feature extraction, feature selection using a genetic algorithm (GA), and finally classification by artificial neural networks (ANNs). The 2D-DWT is implemented to decompose the thermal images. Subsequently, statistical texture features are extracted from the original and decomposed thermal images. The selected significant features are used to enhance the performance of the designed ANN classifier for the 6 types of cooling radiator conditions (output layer) in the next stage. For the tested system, the input layer consisted of 16 neurons based on the feature selection operation. The best performance of the ANN was obtained with a 16-6-6 topology. The classification results demonstrated that this system can be employed satisfactorily as an intelligent condition monitoring and fault diagnosis system for this class of cooling radiator. - Highlights: • Intelligent fault diagnosis of cooling radiator using thermal image processing. • Thermal image processing in a multiscale representation structure by 2D-DWT. • Selection of features based on a hybrid system that uses both GA and ANN. • Application of ANN as classifier. • Classification accuracy of fault detection up to 93.83%
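
The 2D-DWT decomposition step can be illustrated with a one-level Haar transform, its simplest instance; production systems would typically use a wavelet library (e.g. PyWavelets) and richer texture statistics than the per-subband energy computed here.

```python
# Illustrative one-level 2D Haar DWT followed by an energy feature per
# subband. The tiny test image is invented; real inputs would be
# full-resolution thermal images.

def haar2d(img):
    """One Haar decomposition level; image dimensions must be even."""
    rows, cols = len(img), len(img[0])
    # Row pass: average / difference of horizontal pixel pairs.
    low = [[(r[2*j] + r[2*j+1]) / 2 for j in range(cols // 2)] for r in img]
    high = [[(r[2*j] - r[2*j+1]) / 2 for j in range(cols // 2)] for r in img]

    # Column pass on each half yields the four subbands.
    def col_pass(mat):
        lo = [[(mat[2*i][j] + mat[2*i+1][j]) / 2 for j in range(len(mat[0]))]
              for i in range(rows // 2)]
        hi = [[(mat[2*i][j] - mat[2*i+1][j]) / 2 for j in range(len(mat[0]))]
              for i in range(rows // 2)]
        return lo, hi

    ll, lh = col_pass(low)
    hl, hh = col_pass(high)
    return ll, lh, hl, hh

def energy(band):
    return sum(v * v for row in band for v in row)

img = [[1, 1, 2, 2],
       [1, 1, 2, 2],
       [5, 5, 6, 6],
       [5, 5, 6, 6]]
features = [energy(b) for b in haar2d(img)]   # smooth image: detail bands empty
```

Statistics of this kind, computed per subband, are the sort of texture features the GA then prunes to the 16 inputs the ANN receives.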

  9. Large deviations for Gaussian processes in Hoelder norm

    International Nuclear Information System (INIS)

    Fatalov, V R

    2003-01-01

    Some results are proved on the exact asymptotic representation of large deviation probabilities for Gaussian processes in the Hoelder norm. The following classes of processes are considered: the Wiener process, the Brownian bridge, fractional Brownian motion, and stationary Gaussian processes with power-law covariance function. The investigation uses the method of double sums for Gaussian fields

  10. Working memory - not processing speed - mediates fluid intelligence deficits associated with attention deficit/hyperactivity disorder symptoms.

    Science.gov (United States)

    Brydges, Christopher R; Ozolnieks, Krista L; Roberts, Gareth

    2017-09-01

    Attention deficit/hyperactivity disorder (ADHD) is a psychological condition characterized by inattention and hyperactivity. Cognitive deficits are commonly observed in ADHD patients, including impaired working memory, processing speed, and fluid intelligence, the three of which are theorized to be closely associated with one another. In this study, we aimed to determine if decreased fluid intelligence was associated with ADHD, and was mediated by deficits in working memory and processing speed. This study tested 142 young adults from the general population on a range of working memory, processing speed, and fluid intelligence tasks, and an ADHD self-report symptoms questionnaire. Results showed that total and hyperactive ADHD symptoms correlated significantly and negatively with fluid intelligence, but this association was fully mediated by working memory. However, inattentive symptoms were not associated with fluid intelligence. Additionally, processing speed was not associated with ADHD symptoms at all, and was not uniquely predictive of fluid intelligence. The results provide implications for working memory training programs for ADHD patients, and highlight potential differences between the neuropsychological profiles of ADHD subtypes. © 2015 The British Psychological Society.
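
The mediation logic reported here, an X-Y association that disappears once the mediator M is controlled, can be reproduced numerically on synthetic data; the coefficients and sample below are illustrative, not the study's estimates.

```python
# Full-mediation demo on synthetic data: ADHD symptoms (X) predict
# fluid intelligence (Y) only through working memory (M). All
# coefficients are invented.

import random

random.seed(1)
n = 500
X = [random.gauss(0, 1) for _ in range(n)]
M = [-0.6 * x + random.gauss(0, 0.5) for x in X]   # X -> M
Y = [0.8 * m + random.gauss(0, 0.5) for m in M]    # M -> Y (full mediation)

def slope(y, x):
    """Simple no-intercept OLS slope (variables are ~zero-mean here)."""
    return sum(a * b for a, b in zip(x, y)) / sum(a * a for a in x)

def ols2(y, x1, x2):
    """Slopes of y ~ x1 + x2 via the 2x2 normal equations."""
    s11 = sum(a * a for a in x1); s22 = sum(a * a for a in x2)
    s12 = sum(a * b for a, b in zip(x1, x2))
    sy1 = sum(a * b for a, b in zip(x1, y))
    sy2 = sum(a * b for a, b in zip(x2, y))
    det = s11 * s22 - s12 * s12
    return (sy1 * s22 - sy2 * s12) / det, (sy2 * s11 - sy1 * s12) / det

total_effect = slope(Y, X)          # X predicts Y on its own...
direct_effect, _ = ols2(Y, X, M)    # ...but not once M is controlled
```

The total effect here is substantial (around -0.6 x 0.8) while the direct effect shrinks toward zero, which is the pattern the study reports for working memory but not for processing speed.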

  11. Optimization process planning using hybrid genetic algorithm and intelligent search for job shop machining.

    Science.gov (United States)

    Salehi, Mojtaba; Bahreininejad, Ardeshir

    2011-08-01

    Optimization of process planning is considered as the key technology for computer-aided process planning which is a rather complex and difficult procedure. A good process plan of a part is built up based on two elements: (1) the optimized sequence of the operations of the part; and (2) the optimized selection of the machine, cutting tool and Tool Access Direction (TAD) for each operation. In the present work, the process planning is divided into preliminary planning, and secondary/detailed planning. In the preliminary stage, based on the analysis of order and clustering constraints as a compulsive constraint aggregation in operation sequencing and using an intelligent searching strategy, the feasible sequences are generated. Then, in the detailed planning stage, using the genetic algorithm which prunes the initial feasible sequences, the optimized operation sequence and the optimized selection of the machine, cutting tool and TAD for each operation based on optimization constraints as an additive constraint aggregation are obtained. The main contribution of this work is the optimization of sequence of the operations of the part, and optimization of machine selection, cutting tool and TAD for each operation using the intelligent search and genetic algorithm simultaneously.
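
A deliberately minimal evolutionary sketch of the operation-sequencing idea is shown below, using only swap mutation and truncation selection to reduce machine change-overs; the paper's hybrid additionally handles precedence/clustering constraints, machine, tool and TAD selection, and crossover.

```python
# Toy evolutionary search over operation sequences: fewer machine
# change-overs between consecutive operations means a cheaper plan.
# The operation-to-machine assignment is invented.

import random

random.seed(7)

machine = {0: 'M1', 1: 'M2', 2: 'M1', 3: 'M2', 4: 'M1', 5: 'M2'}

def cost(seq):
    return sum(machine[a] != machine[b] for a, b in zip(seq, seq[1:]))

def mutate(seq):
    s = seq[:]
    i, j = random.sample(range(len(s)), 2)
    s[i], s[j] = s[j], s[i]
    return s

def evolve(pop_size=30, generations=200):
    pop = [random.sample(range(6), 6) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        survivors = pop[:pop_size // 2]          # truncation selection
        pop = survivors + [mutate(random.choice(survivors))
                           for _ in range(pop_size - len(survivors))]
    return min(pop, key=cost)

best = evolve()   # groups same-machine operations together
```

With two machines the best achievable plan has a single change-over, and the search reliably finds such a grouping on this tiny instance.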

  12. Artificial Intelligence.

    Science.gov (United States)

    Wash, Darrel Patrick

    1989-01-01

    Making a machine seem intelligent is not easy. As a consequence, demand has been rising for computer professionals skilled in artificial intelligence and is likely to continue to go up. These workers develop expert systems and solve the mysteries of machine vision, natural language processing, and neural networks. (Editor)

  13. Large scale and big data processing and management

    CERN Document Server

    Sakr, Sherif

    2014-01-01

    Large Scale and Big Data: Processing and Management provides readers with a central source of reference on the data management techniques currently available for large-scale data processing. Presenting chapters written by leading researchers, academics, and practitioners, it addresses the fundamental challenges associated with Big Data processing tools and techniques across a range of computing environments. The book begins by discussing the basic concepts and tools of large-scale Big Data processing and cloud computing. It also provides an overview of different programming models and cloud-bas

  14. Event Detection Intelligent Camera: Demonstration of flexible, real-time data taking and processing

    Energy Technology Data Exchange (ETDEWEB)

    Szabolics, Tamás, E-mail: szabolics.tamas@wigner.mta.hu; Cseh, Gábor; Kocsis, Gábor; Szepesi, Tamás; Zoletnik, Sándor

    2015-10-15

    Highlights: • We present EDICAM's operation principles. • Firmware test results. • Software test results. • Further developments. - Abstract: An innovative fast camera (EDICAM – Event Detection Intelligent CAMera) was developed by MTA Wigner RCP in the last few years. This new concept was designed for intelligent event-driven processing, able to detect predefined events and track objects in the plasma. The camera provides a moderate frame rate of 400 Hz at full frame resolution (1280 × 1024), and readout of smaller regions of interest can be done in the 1–140 kHz range even during exposure of the full image. One of the most important advantages of this hardware is a 10 Gbit/s optical link which ensures very fast communication and data transfer between the PC and the camera, enabling two levels of processing: primitive algorithms in the camera hardware and high-level processing in the PC. This camera hardware has successfully proven able to monitor the plasma in several fusion devices, for example at ASDEX Upgrade, KSTAR and COMPASS, with the first version of the firmware. A new firmware and software package is under development. It allows predefined events to be detected in real time, so the camera can change its own operation or give warnings, e.g. to the safety system of the experiment. The EDICAM system can handle a huge amount of data (up to TBs) at a high data rate (950 MB/s) and will be used as the central element of the 10-camera overview video diagnostic system of the Wendelstein 7-X (W7-X) stellarator. This paper presents key elements of the newly developed built-in intelligence, stressing the revolutionary new features and the results of the tests of the different software elements.
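
A back-of-the-envelope check of the figures quoted above, assuming 2 bytes per pixel (our assumption; the abstract does not state the pixel depth):

```python
# Rough consistency check of the quoted EDICAM numbers, under the
# assumed pixel depth of 2 bytes.

width, height, fps = 1280, 1024, 400
bytes_per_pixel = 2                 # assumption, not from the abstract

frame_bytes = width * height * bytes_per_pixel   # one full frame
data_rate_mb_s = frame_bytes * fps / 1e6         # full-frame stream, MB/s
link_mb_s = 10e9 / 8 / 1e6                       # 10 Gbit/s optical link
```

Under this assumption the full-frame 400 Hz stream is on the order of 1 GB/s, within the roughly 1.25 GB/s capacity of the optical link.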

  15. The power of event-driven analytics in Large Scale Data Processing

    CERN Multimedia

    CERN. Geneva; Marques, Paulo

    2011-01-01

    FeedZai is a software company specialized in creating high-throughput, low-latency data processing solutions. FeedZai develops a product called "FeedZai Pulse" for continuous event-driven analytics that makes application development easier for end users. It automatically calculates key performance indicators and baselines, showing how current performance differs from previous history, creating timely business intelligence updated to the second. The tool does predictive analytics and trend analysis, displaying data on real-time web-based graphics. In 2010 FeedZai won the European EBN Smart Entrepreneurship Competition, in the Digital Models category, being considered one of the "top-20 smart companies in Europe". The main objective of this seminar/workshop is to explore the topic of large-scale data processing using Complex Event Processing and, in particular, the possible uses of Pulse in...
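
The baseline-versus-current-performance idea can be sketched as a rolling-window check over an event stream; the window size, tolerance, and data below are illustrative and unrelated to FeedZai's actual implementation.

```python
# Toy event-driven KPI monitor: keep a rolling baseline and flag events
# that deviate from recent history. All parameters are invented.

from collections import deque

class KpiBaseline:
    def __init__(self, window=5, tolerance=0.5):
        self.hist = deque(maxlen=window)
        self.tolerance = tolerance

    def update(self, value):
        """Return True when value deviates from the rolling baseline."""
        if len(self.hist) == self.hist.maxlen:
            baseline = sum(self.hist) / len(self.hist)
            alert = abs(value - baseline) > self.tolerance * baseline
        else:
            alert = False        # still warming up the baseline
        self.hist.append(value)
        return alert

kpi = KpiBaseline()
stream = [100, 102, 98, 101, 99, 100, 180]   # last event is anomalous
alerts = [kpi.update(v) for v in stream]
```

A CEP engine generalizes this pattern to many keyed streams, richer window semantics, and declarative rules, but the per-event update shape is the same.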

  16. The Virtual UNICOS Process Expert: integration of Artificial Intelligence tools in Control Systems

    CERN Multimedia

    Vilches Calvo, I; Barillere, R

    2009-01-01

    UNICOS is a CERN framework to produce control applications. It provides operators with ways to interact with all process items, from the most simple (e.g. I/O channels) to the most abstract objects (e.g. a part of the plant). This possibility of fine-grained operation is particularly useful for recovering from abnormal situations if operators have the required knowledge. The Virtual UNICOS Process Expert project aims at providing operators with means to handle difficult operation cases for which the intervention of process experts is usually requested. The main idea of the project is to use the openness of the UNICOS-based applications to integrate tools (e.g. Artificial Intelligence tools) which will act as Process Experts to analyze complex situations, and to propose and execute smooth recovery procedures.

  17. Intelligent structural optimization: Concept, Model and Methods

    International Nuclear Information System (INIS)

    Lu, Dagang; Wang, Guangyuan; Peng, Zhang

    2002-01-01

    Structural optimization has many characteristics of Soft Design, and so it is necessary to apply the experience of human experts to solving the uncertain and multidisciplinary optimization problems in large-scale and complex engineering systems. With the development of artificial intelligence (AI) and computational intelligence (CI), the theory of structural optimization is now developing in the direction of intelligent optimization. In this paper, a concept of Intelligent Structural Optimization (ISO) is proposed. Then, a design process model of ISO is put forward in which each design sub-process model is discussed. Finally, the design methods of ISO are presented

  18. Cognitive Processing Speed, Working Memory, and the Intelligibility of Hearing Aid-Processed Speech in Persons with Hearing Impairment

    Directory of Open Access Journals (Sweden)

    Wycliffe Kabaywe Yumba

    2017-08-01

    Full Text Available Previous studies have demonstrated that successful listening with advanced signal processing in digital hearing aids is associated with individual cognitive capacity, particularly working memory capacity (WMC). This study aimed to examine the relationship between cognitive abilities (cognitive processing speed and WMC) and individual listeners' responses to digital signal processing settings in adverse listening conditions. A total of 194 native Swedish speakers (83 women and 111 men), aged 33–80 years (mean = 60.75 years, SD = 8.89), with bilateral, symmetrical mild to moderate sensorineural hearing loss, who had completed a lexical decision speed test (measuring cognitive processing speed) and a semantic word-pair span test (SWPST, capturing WMC), participated in this study. The Hagerman test (capturing speech recognition in noise) was conducted using an experimental hearing aid with three digital signal processing settings: (1) linear amplification without noise reduction (NoP), (2) linear amplification with noise reduction (NR), and (3) non-linear amplification without NR ("fast-acting compression"). The results showed that cognitive processing speed was a better predictor of speech intelligibility in noise, regardless of the type of signal processing algorithm used. That is, there was a stronger association between cognitive processing speed and NR outcomes and fast-acting compression outcomes (in steady-state noise). We observed a weaker relationship between working memory and NR, but WMC did not relate to fast-acting compression. WMC was a relatively weaker predictor of speech intelligibility in noise. These findings might have been different if the participants had been provided with training and/or allowed to acclimatize to binary masking noise reduction or fast-acting compression.

  19. Distributed Model Predictive Control over Multiple Groups of Vehicles in Highway Intelligent Space for Large Scale System

    Directory of Open Access Journals (Sweden)

    Tang Xiaofeng

    2014-01-01

    Full Text Available The paper presents three time-based warning distances for the safe driving of a large-scale system of multiple groups of vehicles in a highway tunnel environment, based on a distributed model predictive control approach. Generally speaking, the system includes two parts. First, the vehicles are divided into multiple groups, and the distributed model predictive control approach is used to calculate the information framework of each group. The optimization of each group considers both local optimization and the optimization characteristics of neighboring subgroups, which ensures global optimization performance. Second, the three time-based warning distances are studied based on the basic principles of highway intelligent space (HIS), and the information framework concept is proposed for the multiple groups of vehicles. A mathematical model is built to avoid chain collisions between vehicles. The results demonstrate that the proposed highway intelligent space method can effectively ensure the driving safety of multiple groups of vehicles in fog, rain, or snow.
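
A time-based warning distance of the kind discussed can be sketched as reaction distance plus braking distance; the three warning levels below simply use three assumed reaction times, and all numbers are illustrative rather than taken from the paper.

```python
# Warning distance = reaction distance + braking distance, for three
# assumed reaction times. Speed and deceleration are invented values.

def warning_distance(speed_ms, reaction_s, decel_ms2):
    return speed_ms * reaction_s + speed_ms ** 2 / (2 * decel_ms2)

speed = 25.0    # m/s, i.e. 90 km/h
decel = 3.5     # m/s^2: gentler braking assumed for fog/rain/snow
levels = {t: warning_distance(speed, t, decel) for t in (1.0, 1.5, 2.0)}
```

Longer assumed reaction times yield progressively earlier warnings, which is the intuition behind a graded three-level warning scheme.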

  20. Identification of low order models for large scale processes

    NARCIS (Netherlands)

    Wattamwar, S.K.

    2010-01-01

    Many industrial chemical processes are complex, multi-phase and large scale in nature. These processes are characterized by various nonlinear physiochemical effects and fluid flows. Such processes often show coexistence of fast and slow dynamics during their time evolutions. The increasing demand

  1. Research on application of intelligent computation based LUCC model in urbanization process

    Science.gov (United States)

    Chen, Zemin

    2007-06-01

    Global change study is an interdisciplinary and comprehensive research activity with international cooperation that arose in the 1980s and has the largest scope. The interaction between land use and cover change, as a research field crossing natural science and social science, has become one of the core subjects of global change study as well as its leading edge and hot spot. It is necessary to develop research on land use and cover change in the urbanization process and to build an analog model of urbanization in order to describe, simulate and analyze the dynamic behaviors of urban development change as well as to understand the basic characteristics and rules of the urbanization process. This has positive practical and theoretical significance for formulating urban and regional sustainable development strategy. The effect of urbanization on land use and cover change is mainly embodied in changes in the quantity structure and space structure of urban space, and the LUCC model of the urbanization process has been an important research subject of urban geography and urban planning. In this paper, based upon previous research achievements, the writer systematically analyzes the research on land use/cover change in the urbanization process with the theories of complexity science and intelligent computation; builds a model for simulating and forecasting the dynamic evolution of urban land use and cover change on the basis of the cellular automata model of complexity science and multi-agent theory; expands the Markov model, the traditional CA model and the Agent model, introducing complexity science theory and intelligent computation theory into the LUCC research model to build an intelligent computation-based LUCC model for analog research on land use and cover change in urbanization research; and performs case research. The concrete contents are as follows: 1. Complexity of LUCC research in urbanization process. 
Analyze urbanization process in combination with the contents
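
The cellular-automaton core of such a LUCC model can be sketched in a few lines: a cell converts to urban once enough of its eight neighbours are urban. The grid and threshold below are illustrative; real models add suitability layers, stochastic perturbation, and agent behaviour.

```python
# One synchronous step of a toy urban-growth cellular automaton on a
# small grid. 1 = urban, 0 = non-urban; threshold is an assumption.

def step(grid, threshold=2):
    n = len(grid)
    new = [row[:] for row in grid]
    for i in range(n):
        for j in range(n):
            if grid[i][j] == 1:
                continue                      # urban cells stay urban
            urban_neighbours = sum(
                grid[a][b]
                for a in range(max(0, i - 1), min(n, i + 2))
                for b in range(max(0, j - 1), min(n, j + 2))
                if (a, b) != (i, j))
            if urban_neighbours >= threshold:
                new[i][j] = 1
    return new

start = [[0, 0, 0, 0],
         [0, 1, 1, 0],
         [0, 1, 1, 0],
         [0, 0, 0, 0]]
grown = step(start)
urban_area = sum(map(sum, grown))   # the urban core spreads outward
```

Iterating the step simulates growth; Markov or agent components would then modulate the transition rule rather than replace it.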

  2. Work process and task-based design of intelligent assistance systems in German textile industry

    Science.gov (United States)

    Löhrer, M.; Ziesen, N.; Altepost, A.; Saggiomo, M.; Gloy, Y. S.

    2017-10-01

    The German textile industry, shaped by mid-sized companies, must face social challenges, e.g. demographic change and changing technical processes. Interaction with intelligent systems (on machines) and increasing automation change processes, working structures and employees' tasks on all levels. Work contents are getting more complex, resulting in the necessity for diversified and enhanced competencies. Mobile devices like tablets or smartphones are increasingly finding their way into the workplace. Employees who grew up with new forms of media have certain advantages regarding the usage of modern technologies compared to older employees. Therefore, it is necessary to design new systems which help to adapt the competencies of both younger and older employees to new automated production processes in the digital work environment. The key to successful integration of technical assistance systems is user-oriented design and development that includes concepts for competency development under consideration of, e.g., ethical and legal aspects.

  3. The research of new type stratified water injection process intelligent measurement technology

    Science.gov (United States)

    Zhao, Xin

    2017-10-01

    To meet the needs of injection development in Daqing Oilfield, water injection has progressed from general water injection in the early stage to subdivided (stratified) water injection, with the aim of improving the utilization degree and the qualified rate of water injection, and of improving the performance of the water injection string and its matching process, forming a set of effective water injection supporting technologies suitable for the high water cut stage. The new stratified water injection intelligent measurement technology combines multi-interval information testing and flow control into a unified whole and performs long-term automatic monitoring of the work of the various sections. The process has the characteristics of "multi-layer synchronous measurement, continuous monitoring of process parameters, centralized data acquisition", which can meet the requirements of subdivided water injection while realizing automatic synchronous measurement of each interval, greatly improving the measurement efficiency of stratified injection wells and providing a new means for tapping the remaining oil potential.

  4. Political and Budgetary Oversight of the Ukrainian Intelligence Community: Processes, Problems and Prospects for Reform

    National Research Council Canada - National Science Library

    Petrov, Oleksii

    2007-01-01

    .... Official government documents, news reports and other literature on the intelligence system in Ukraine, as well as studies of intelligence oversight within democracies are the primary sources of data...

  5. A Spatially Intelligent Public Participation System for the Environmental Impact Assessment Process

    Directory of Open Access Journals (Sweden)

    Lei Lei

    2013-05-01

    Full Text Available An environmental impact assessment (EIA) is a decision-making process that evaluates the possible significant effects that a proposed project may exert on the environment. The EIA scoping and reviewing stages often involve public participation. Although its importance has long been recognized, public participation in the EIA process is often regarded as ineffective, due to time, budget, resource, technical and procedural constraints, as well as the complexity of environmental information. Geographic Information Systems (GIS) and Volunteered Geographic Information (VGI) have the potential to contribute to data collection, sharing and presentation, utilize local user-generated content to benefit decision-making and increase public outreach. This research integrated GIS, VGI, social media tools, data mining and mobile technology to design a spatially intelligent framework that presents and shares EIA information effectively with the public. A spatially intelligent public participative system (SIPPS) was also developed as a proof-of-concept of the framework. The research selected the Tehachapi Renewable Transmission Project (TRTP) as the pilot study area. Survey questionnaires were designed to collect feedback and conduct evaluation. Results show that SIPPS was able to improve the effectiveness of public participation, promote environmental awareness and achieve good system usability.

  6. Study of an intelligent system for wells elevation and petroliferous processes control; Estudo de um sistema inteligente para elevacao de pocos e controle de processos petroliferos

    Energy Technology Data Exchange (ETDEWEB)

    Patricio, Antonio Rodrigues

    1996-11-01

    The petroleum production problems were studied by means of an integrated process evaluation of a rod pumping well, a gas lift well and a process unit for produced fluids. Using artificial intelligence concepts such as fuzzy logic and neural systems, SIEP, an Intelligent System for Production Lift and Process Control, is presented, aimed at the integrated management of the petroleum production process. (author)

  7. Co-evolution of intelligent socio-technical systems modelling and applications in large scale emergency and transport domains

    CERN Document Server

    2013-01-01

    As the interconnectivity between humans through technical devices is becoming ubiquitous, the next step is already in the making: ambient intelligence, i.e. smart (technical) environments, which will eventually play the same active role in communication as the human players, leading to a co-evolution in all domains where real-time communication is essential. This topical volume, based on the findings of the Socionical European research project, gives equal attention to two highly relevant domains of applications: transport, specifically traffic, dynamics from the viewpoint of a socio-technical interaction and evacuation scenarios for large-scale emergency situations. Care was taken to investigate as much as possible the limits of scalability and to combine the modeling using complex systems science approaches with relevant data analysis.

  8. Emotional Intelligence Tests: Potential Impacts on the Hiring Process for Accounting Students

    Science.gov (United States)

    Nicholls, Shane; Wegener, Matt; Bay, Darlene; Cook, Gail Lynn

    2012-01-01

    Emotional intelligence is increasingly recognized as being important for professional career success. Skills related to emotional intelligence (e.g. organizational commitment, public speaking, teamwork, and leadership) are considered essential. Human resource professionals have begun including tests of emotional intelligence (EI) in job applicant…

  9. Broadband Reflective Coating Process for Large FUVOIR Mirrors, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — ZeCoat Corporation will develop and demonstrate a set of revolutionary coating processes for making broadband reflective coatings suitable for very large mirrors (4+...

  10. Special Issue on Intelligent Robots

    Directory of Open Access Journals (Sweden)

    Genci Capi

    2013-08-01

    Full Text Available The research on intelligent robots will produce robots that are able to operate in everyday life environments, to adapt their program according to environment changes, and to cooperate with other team members and humans. Operating in human environments, robots need to process, in real time, a large amount of sensory data—such as vision, laser, microphone—in order to determine the best action. Intelligent algorithms have been successfully applied to link complex sensory data to robot action. This editorial briefly summarizes recent findings in the field of intelligent robots as described in the articles published in this special issue.

  11. Intelligent monitoring and fault diagnosis for ATLAS TDAQ: a complex event processing solution

    CERN Document Server

    Magnoni, Luca; Luppi, Eleonora

    Effective monitoring and analysis tools are fundamental in modern IT infrastructures to get insights on the overall system behavior and to deal promptly and effectively with failures. In recent years, Complex Event Processing (CEP) technologies have emerged as effective solutions for information processing from the most disparate fields: from wireless sensor networks to financial analysis. This thesis proposes an innovative approach to monitor and operate complex and distributed computing systems, in particular referring to the ATLAS Trigger and Data Acquisition (TDAQ) system currently in use at the European Organization for Nuclear Research (CERN). The result of this research, the AAL project, is currently used to provide ATLAS data acquisition operators with automated error detection and intelligent system analysis. The thesis begins by describing the TDAQ system and the controlling architecture, with a focus on the monitoring infrastructure and the expert system used for error detection and automated reco...

  12. The Viewpoint Paradigm: a semiotic based approach for the intelligibility of a cooperative designing process

    Directory of Open Access Journals (Sweden)

    Pierre-Jean Charrel

    2002-11-01

    Full Text Available The concept of viewpoint is studied in the field of modelling and knowledge management as applied to the upstream phases of a designing process. The concept is approached through semiotics, i.e. by dealing with the requirements under which an actor gives sense to an object. This makes it possible to transform the intuitive concepts of viewpoint and relation between viewpoints into the Viewpoint Paradigm: the sense of an object is the integration of the viewpoints that bear on it. The elements of this paradigm are integrated into a general model, which formally defines two concepts: Viewpoint and Correlation of viewpoints. The Viewpoint Paradigm is then applied to operational concerns related to the intelligibility of the designing process. Two models of viewpoint and correlation are proposed. They raise issues of viewpoint management, such as identifying viewpoints in the written documents of a project.

  13. Pipeline defect prediction using long range ultrasonic testing and intelligent processing

    International Nuclear Information System (INIS)

    Dino Isa; Rajprasad Rajkumar

    2009-01-01

    This paper deals with efforts to improve nondestructive testing (NDT) techniques by using artificial intelligence to detect and predict pipeline defects such as cracks and wall thinning. The main emphasis here is on the prediction of corrosion-type defects rather than mere detection after the fact. Long range ultrasonic testing is employed, where a ring of piezoelectric transducers is used to generate torsional guided waves. Various defects, such as cracks as well as corrosion under insulation (CUI), are simulated on a test pipe. The machine learning algorithm known as the Support Vector Machine (SVM) is used to predict and classify transducer signals using regression and large-margin classification. Regression results show that the SVM is able to accurately predict future defects based on trends in previous defects. The classification performance was also exceptional, showing a facility for detecting defects at different depths as well as for distinguishing closely spaced defects. (author)
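    The record does not include code; as a hedged illustration of the large-margin SVM classification it describes, the sketch below trains a linear SVM with a Pegasos-style sub-gradient solver on made-up two-feature "defect" data. All feature names and values are illustrative; a real application would use guided-wave signal features and likely a kernel SVM.

    ```python
    import random

    def train_linear_svm(data, labels, lam=0.01, epochs=1000, seed=0):
        """Pegasos-style sub-gradient training of a linear SVM.
        data: feature vectors (with a trailing 1.0 bias feature); labels: +1/-1."""
        rng = random.Random(seed)
        w = [0.0] * len(data[0])
        t = 0
        for _ in range(epochs):
            for i in rng.sample(range(len(data)), len(data)):
                t += 1
                eta = 1.0 / (lam * t)                     # decaying step size
                x, y = data[i], labels[i]
                margin = y * sum(wj * xj for wj, xj in zip(w, x))
                w = [(1.0 - eta * lam) * wj for wj in w]  # shrink (L2 term)
                if margin < 1.0:                          # hinge-loss sub-gradient
                    w = [wj + eta * y * xj for wj, xj in zip(w, x)]
        return w

    def predict(w, x):
        return 1 if sum(wj * xj for wj, xj in zip(w, x)) >= 0.0 else -1

    # Toy guided-wave features (amplitude, time-of-flight, bias); +1 = defect.
    X = [[0.9, 0.8, 1.0], [0.8, 0.9, 1.0], [0.1, 0.2, 1.0], [0.2, 0.1, 1.0]]
    y = [1, 1, -1, -1]
    w = train_linear_svm(X, y)
    print([predict(w, x) for x in X])
    ```

    On this separable toy set the solver recovers a separating hyperplane; the decaying step size 1/(lam*t) is what makes the Pegasos scheme converge.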

  14. Energy-efficient hierarchical processing in the network of wireless intelligent sensors (WISE)

    Science.gov (United States)

    Raskovic, Dejan

    Sensor network nodes have benefited from technological advances in the field of wireless communication, processing, and power sources. However, the processing power of microcontrollers is often not sufficient to perform sophisticated processing, while the power requirements of digital signal processing boards or handheld computers are usually too demanding for prolonged system use. We are matching the intrinsic hierarchical nature of many digital signal-processing applications with the natural hierarchy in distributed wireless networks, and building the hierarchical system of wireless intelligent sensors. Our goal is to build a system that will exploit the hierarchical organization to optimize the power consumption and extend battery life for the given time and memory constraints, while providing real-time processing of sensor signals. In addition, we are designing our system to be able to adapt to the current state of the environment, by dynamically changing the algorithm through procedure replacement. This dissertation presents the analysis of hierarchical environment and methods for energy profiling used to evaluate different system design strategies, and to optimize time-effective and energy-efficient processing.

  15. Optimal sensor placement for large structures using the nearest neighbour index and a hybrid swarm intelligence algorithm

    International Nuclear Information System (INIS)

    Lian, Jijian; He, Longjun; Ma, Bin; Peng, Wenxiang; Li, Huokun

    2013-01-01

    Research on optimal sensor placement (OSP) has become very important due to the need to obtain effective testing results with limited testing resources in health monitoring. In this study, a new methodology is proposed to select the best sensor locations for large structures. First, a novel fitness function derived from the nearest neighbour index is proposed to overcome the drawbacks of the effective independence method for OSP for large structures. This method maximizes the contribution of each sensor to modal observability and simultaneously avoids the redundancy of information between the selected degrees of freedom. A hybrid algorithm combining the improved discrete particle swarm optimization (DPSO) with the clonal selection algorithm is then implemented to optimize the proposed fitness function effectively. Finally, the proposed method is applied to an arch dam for performance verification. The results show that the proposed hybrid swarm intelligence algorithm outperforms a genetic algorithm with decimal two-dimension array encoding and DPSO in the capability of global optimization. The new fitness function is advantageous in terms of sensor distribution and ensuring a well-conditioned information matrix and orthogonality of modes, indicating that this method may be used to provide guidance for OSP in various large structures. (paper)
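    The paper's fitness function is derived from the nearest neighbour index, but its exact form is not reproduced in the abstract. The following is a hedged sketch of the classic Clark–Evans nearest neighbour index that such a fitness could build on, with made-up sensor coordinates: values below 1 flag clustered (redundant) sensor layouts, values above 1 flag dispersed ones.

    ```python
    import math

    def nearest_neighbour_index(points, area):
        """Clark-Evans nearest neighbour index for 2-D sensor positions.
        ~1: random layout; < 1: clustered; > 1: dispersed."""
        n = len(points)
        d_obs = sum(
            min(math.dist(p, q) for q in points if q is not p)
            for p in points
        ) / n
        d_exp = 0.5 * math.sqrt(area / n)   # expected NN distance under randomness
        return d_obs / d_exp

    # Illustrative candidate layouts inside a 10 x 10 monitored region.
    clustered = [(0, 0), (0.1, 0), (0, 0.1), (0.1, 0.1)]
    spread    = [(0, 0), (10, 0), (0, 10), (10, 10)]
    print(nearest_neighbour_index(clustered, area=100.0),
          nearest_neighbour_index(spread, area=100.0))
    ```

    A placement fitness along the paper's lines would combine a modal-observability term with a penalty on low index values, so the swarm optimizer avoids stacking sensors on neighbouring degrees of freedom.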

  16. An Integrated Open Approach to Capturing Systematic Knowledge for Manufacturing Process Innovation Based on Collective Intelligence

    Directory of Open Access Journals (Sweden)

    Gangfeng Wang

    2018-02-01

    Full Text Available Process innovation plays a vital role in realizing the manufacture of increasingly complex new products, especially in the context of sustainable development and cleaner production. Knowledge-based innovation design can inspire designers’ creative thinking; however, the existing scattered knowledge has not yet been properly captured and organized to support Computer-Aided Process Innovation (CAPI). Therefore, this paper proposes an integrated approach to tackle this non-trivial issue. By analyzing the design process of CAPI and the technical features of open innovation, a novel holistic paradigm of process innovation knowledge capture based on collective intelligence (PIKC-CI) is constructed from the perspective of the knowledge life cycle. Then, a multi-source innovation knowledge fusion algorithm based on semantic element reconfiguration is applied to form new public knowledge. To ensure the credibility and orderliness of innovation knowledge refinement, a collaborative editing strategy based on knowledge locks and a knowledge–social trust degree is explored. Finally, a knowledge management system, MPI-OKCS, integrating the proposed techniques is implemented on the pre-built CAPI general platform, and a welding process innovation example is provided to illustrate the feasibility of the proposed approach. It is expected that this work will lay the foundation for future knowledge-inspired CAPI and smart process planning.

  17. From Collective Knowledge to Intelligence : Pre-Requirements Analysis of Large and Complex Systems

    NARCIS (Netherlands)

    Liang, Peng; Avgeriou, Paris; He, Keqing; Xu, Lai

    2010-01-01

    Requirements engineering is essentially a social collaborative activity in which involved stakeholders have to closely work together to communicate, elicit, negotiate, define, confirm, and finally come up with the requirements for the system to be implemented or upgraded. In the development of large

  18. [Dual process in large number estimation under uncertainty].

    Science.gov (United States)

    Matsumuro, Miki; Miwa, Kazuhisa; Terai, Hitoshi; Yamada, Kento

    2016-08-01

    According to dual process theory, there are two systems in the mind: an intuitive and automatic System 1 and a logical and effortful System 2. While many previous studies of number estimation have focused on simple heuristics and automatic processes, the deliberative System 2 process has not been sufficiently studied. This study focused on the System 2 process in large number estimation. First, we described an estimation process based on participants’ verbal reports. The task, which corresponds to a problem-solving process, consisted of creating subgoals, retrieving values, and applying operations. Second, we investigated the influence of such a deliberative System 2 process on intuitive estimation by System 1, using anchoring effects. The results of the experiment showed that the System 2 process could mitigate anchoring effects.

  19. ECG Signal Processing, Classification and Interpretation A Comprehensive Framework of Computational Intelligence

    CERN Document Server

    Pedrycz, Witold

    2012-01-01

    Electrocardiogram (ECG) signals are among the most important sources of diagnostic information in healthcare so improvements in their analysis may also have telling consequences. Both the underlying signal technology and a burgeoning variety of algorithms and systems developments have proved successful targets for recent rapid advances in research. ECG Signal Processing, Classification and Interpretation shows how the various paradigms of Computational Intelligence, employed either singly or in combination, can produce an effective structure for obtaining often vital information from ECG signals. Neural networks do well at capturing the nonlinear nature of the signals, information granules realized as fuzzy sets help to confer interpretability on the data and evolutionary optimization may be critical in supporting the structural development of ECG classifiers and models of ECG signals. The contributors address concepts, methodology, algorithms, and case studies and applications exploiting the paradigm of Comp...

  20. Argumentative SOX Compliant and Quality Decision Support Intelligent Expert System over the Suppliers Selection Process

    Directory of Open Access Journals (Sweden)

    Jesus Angel Fernandez Canelas

    2013-01-01

    Full Text Available The objective of this paper is to define a decision support system for SOX (Sarbanes-Oxley Act) compliance and quality of the Suppliers Selection Process, based on Artificial Intelligence and Argumentation Theory knowledge and techniques. The present SOX law, in effect nowadays, was created to improve financial government control over US companies. This law has become a de facto standard outside the United States due to several factors, such as present globalization, the expansion of US companies, and the key influence of US stock exchange markets worldwide. This paper constitutes a novel approach to this kind of problem due to the following elements: (1) it has an optimized structure for searching for the solution, (2) it has a dynamic learning method to handle decisions by courts and government control bodies, (3) it uses fuzzy knowledge to improve its performance, and (4) it uses its past accumulated experience to let the system evolve far beyond its initial state.

  1. Study on robot motion control for intelligent welding processes based on the laser tracking sensor

    Science.gov (United States)

    Zhang, Bin; Wang, Qian; Tang, Chen; Wang, Ju

    2017-06-01

    A robot motion control method is presented for intelligent welding processes of complex spatial free-form curve seams based on the laser tracking sensor. First, calculate the tip position of the welding torch according to the velocity of the torch and the seam trajectory detected by the sensor. Then, search the optimal pose of the torch under constraints using genetic algorithms. As a result, the intersection point of the weld seam and the laser plane of the sensor is within the detectable range of the sensor. Meanwhile, the angle between the axis of the welding torch and the tangent of the weld seam meets the requirements. The feasibility of the control method is proved by simulation.
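    The record describes searching the optimal torch pose with a genetic algorithm under sensor-visibility constraints. The paper's actual kinematics are not given here, so the following is a minimal hedged sketch with an illustrative one-dimensional pose (a single torch angle), a made-up 45° target relative to the seam tangent, and an assumed visibility window standing in for the real constraints.

    ```python
    import random

    def fitness(angle_deg, tangent_deg, window=(30.0, 60.0)):
        """Penalise deviation from an illustrative 45-degree torch-to-tangent
        target, plus a large penalty outside an assumed visibility window."""
        err = abs(angle_deg - tangent_deg - 45.0)
        lo, hi = window
        penalty = 0.0 if lo <= angle_deg - tangent_deg <= hi else 100.0
        return err + penalty

    def ga_search(tangent_deg, pop_size=30, gens=60, seed=1):
        """Tiny elitist GA: keep the best half, breed children by averaging
        two elites (crossover) plus Gaussian mutation."""
        rng = random.Random(seed)
        pop = [rng.uniform(0.0, 180.0) for _ in range(pop_size)]
        for _ in range(gens):
            pop.sort(key=lambda a: fitness(a, tangent_deg))
            elite = pop[: pop_size // 2]
            children = []
            while len(elite) + len(children) < pop_size:
                a, b = rng.sample(elite, 2)
                children.append(0.5 * (a + b) + rng.gauss(0.0, 2.0))
            pop = elite + children
        return min(pop, key=lambda a: fitness(a, tangent_deg))

    best = ga_search(tangent_deg=10.0)
    print(round(best, 1))   # should approach the 10 + 45 = 55 degree optimum
    ```

    The real optimization runs over the full torch pose with the seam trajectory detected by the laser tracking sensor; the elitist loop above only illustrates the search mechanics.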

  2. Development of an intelligent CAI system for a distributed processing environment

    International Nuclear Information System (INIS)

    Fujii, M.; Sasaki, K.; Ohi, T.; Itoh, T.

    1993-01-01

    In order to operate a nuclear power plant optimally in both normal and abnormal situations, the operators are trained using an operator training simulator in addition to classroom instruction. Individual instruction using a CAI (Computer-Assisted Instruction) system has become popular as a method of learning plant information, such as plant dynamics, operational procedures, plant systems, plant facilities, etc. An outline is given of a proposed network-based intelligent CAI system (ICAI) incorporating multimedia PWR plant dynamics simulation, teaching aids and educational record management, using the following environment: existing standard workstations and graphic workstations with a live video processing function, the TCP/IP protocol of Unix over Ethernet, and the X window system. (Z.S.) 3 figs., 2 refs

  3. Intelligent Processing Equipment Research and Development Programs of the Department of Commerce

    Science.gov (United States)

    Simpson, J. A.

    1992-01-01

    The intelligent processing equipment (IPE) research and development (R&D) programs of the Department of Commerce are carried out within the National Institute of Standards and Technology (NIST). This institute has had work in support of industrial productivity as part of its mission since its founding in 1901. With the advent of factory automation, these efforts have increasingly turned to R&D in IPE. The Manufacturing Engineering Laboratory (MEL) of NIST devotes a major fraction of its efforts to this end, while other elements within the organization, notably the Materials Science and Engineering Laboratory, have smaller but significant programs. An inventory of all such programs at NIST is presented, together with a representative selection of projects that demonstrate the scope of the efforts.

  4. Service with a smile: do emotional intelligence, gender, and autonomy moderate the emotional labor process?

    Science.gov (United States)

    Johnson, Hazel-Anne M; Spector, Paul E

    2007-10-01

    This survey study of 176 participants from eight customer service organizations investigated how individual factors moderate the impact of emotional labor strategies on employee well-being. Hierarchical regression analyses indicated that gender and autonomy were significant moderators of the relationships between emotional labor strategies and the personal outcomes of emotional exhaustion, affective well-being, and job satisfaction. Females were more likely to experience negative consequences when engaging in surface acting. Autonomy served to alleviate negative outcomes for individuals who used emotional labor strategies often. Contrary to our hypotheses, emotional intelligence did not moderate the relationship between the emotional labor strategies and personal outcomes. Results demonstrated how the emotional labor process can influence employee well-being. (c) 2007 APA, all rights reserved.

  5. WAMS Based Intelligent Operation and Control of Modern Power System with large Scale Renewable Energy Penetration

    DEFF Research Database (Denmark)

    Rather, Zakir Hussain

    security limits. Under such a scenario, progressive displacement of conventional generation by wind generation is expected to eventually lead to a complex power system with the least presence of central power plants. Consequently, the support from conventional power plants is expected to reach its all-time low...... system voltage control responsibility from conventional power plants to wind turbines. With increased wind penetration and displaced conventional central power plants, dynamic voltage security has been identified as one of the challenging issues for large scale wind integration. To address the dynamic...... security issue, a WAMS based systematic voltage control scheme for a large scale wind integrated power system has been proposed. Along with the optimal reactive power compensation, the proposed scheme considers voltage support from wind farms (equipped with voltage support functionality) and refurbished...

  6. Efficient large-scale graph data optimization for intelligent video surveillance

    Science.gov (United States)

    Shang, Quanhong; Zhang, Shujun; Wang, Yanbo; Sun, Chen; Wang, Zepeng; Zhang, Luming

    2017-08-01

    Society is rapidly adopting cameras in a wide variety of locations and applications: site traffic monitoring, parking lot surveillance, vehicles, and smart spaces. These cameras provide data every day that must be analyzed in an effective way. Recent advances in sensor manufacturing, communications, and computing are stimulating the development of new applications that can transform the traditional vision system into a pervasive smart camera network. The analysis of visual cues in multi-camera networks enables a wide range of applications, from smart home and office automation to large-area surveillance and traffic monitoring. Dense camera networks, in which most cameras have large overlapping fields of view, are already well researched; we therefore focus on sparse camera networks. A sparse camera network undertakes large-area surveillance using as few cameras as possible, so most cameras do not overlap each other's field of view. This task is challenging due to the lack of knowledge of the network topology, the changes in appearance and motion of targets tracked across different views, and the difficulty of understanding complex events in the network. In this review paper, we present a comprehensive survey of recent research results addressing the problems of topology learning, object appearance modeling, and global activity understanding in sparse camera networks. In addition, some current open research issues are discussed.

  7. The Relationship between Emotional Intelligence and Cool and Hot Cognitive Processes: A Systematic Review

    Science.gov (United States)

    Gutiérrez-Cobo, María José; Cabello, Rosario; Fernández-Berrocal, Pablo

    2016-01-01

    Although emotion and cognition were considered to be separate aspects of the psyche in the past, researchers today have demonstrated the existence of an interplay between the two processes. Emotional intelligence (EI), or the ability to perceive, use, understand, and regulate emotions, is a relatively young concept that attempts to connect both emotion and cognition. While EI has been demonstrated to be positively related to well-being, mental and physical health, and non-aggressive behaviors, little is known about its underlying cognitive processes. The aim of the present study was to systematically review available evidence about the relationship between EI and cognitive processes as measured through “cool” (i.e., not emotionally laden) and “hot” (i.e., emotionally laden) laboratory tasks. We searched Scopus and Medline to find relevant articles in Spanish and English, and divided the studies following two variables: cognitive processes (hot vs. cool) and EI instruments used (performance-based ability test, self-report ability test, and self-report mixed test). We identified 26 eligible studies. The results provide a fair amount of evidence that performance-based ability EI (but not self-report EI tests) is positively related with efficiency in hot cognitive tasks. EI, however, does not appear to be related with cool cognitive tasks: neither through self-reporting nor through performance-based ability instruments. These findings suggest that performance-based ability EI could improve individuals’ emotional information processing abilities. PMID:27303277

  8. An Intelligent Complex Event Processing with D Numbers under Fuzzy Environment

    Directory of Open Access Journals (Sweden)

    Fuyuan Xiao

    2016-01-01

    Full Text Available Efficient matching of incoming mass events to persistent queries is fundamental to complex event processing systems. Event matching based on pattern rules is an important feature of a complex event processing engine. However, the intrinsic uncertainty in pattern rules, which are pre-decided by experts, increases the difficulty of effective complex event processing. It inevitably involves various types of intrinsic uncertainty, such as imprecision, fuzziness, and incompleteness, due to the limits of human subjective judgment. D numbers are a new mathematical tool for modeling such uncertainty, since they drop the condition that elements on the frame of discernment must be mutually exclusive. To address the above issues, an intelligent complex event processing method with D numbers under a fuzzy environment is proposed, based on the Technique for Order Preference by Similarity to an Ideal Solution (TOPSIS). The novel method can fully support decision making in complex event processing systems. Finally, a numerical example is provided to evaluate the efficiency of the proposed method.
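    TOPSIS itself is a standard multi-criteria ranking procedure; the sketch below shows the plain method on made-up pattern-rule scores, without the D-numbers extension that is the paper's contribution. Alternatives are ranked by relative closeness to the ideal solution and distance from the anti-ideal one.

    ```python
    import math

    def topsis(matrix, weights, benefit):
        """Rank alternatives with TOPSIS.
        matrix[i][j]: score of alternative i on criterion j;
        benefit[j]: True if larger is better on criterion j."""
        m, n = len(matrix), len(matrix[0])
        norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(m)))
                 for j in range(n)]
        v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)]
             for i in range(m)]
        ideal = [max(col) if benefit[j] else min(col)
                 for j, col in enumerate(zip(*v))]
        anti = [min(col) if benefit[j] else max(col)
                for j, col in enumerate(zip(*v))]
        scores = []
        for row in v:
            d_pos = math.sqrt(sum((x - a) ** 2 for x, a in zip(row, ideal)))
            d_neg = math.sqrt(sum((x - a) ** 2 for x, a in zip(row, anti)))
            scores.append(d_neg / (d_pos + d_neg))   # closeness coefficient
        return scores

    # Three hypothetical event-pattern rules scored on (confidence, latency ms).
    scores = topsis([[0.9, 120], [0.7, 80], [0.8, 100]],
                    weights=[0.6, 0.4], benefit=[True, False])
    print(max(range(3), key=scores.__getitem__))
    ```

    In the paper, the crisp matrix entries would instead be derived from D numbers, letting the same closeness ranking operate on non-exclusive, uncertain assessments.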

  9. Artificial Intelligence Tools for Scaling Up of High Shear Wet Granulation Process.

    Science.gov (United States)

    Landin, Mariana

    2017-01-01

    The results presented in this article demonstrate the potential of artificial intelligence tools for predicting the endpoint of the granulation process in high-speed mixer granulators of different scales, from 25 L to 600 L. The combination of neurofuzzy logic and gene expression programming technologies allowed the modeling of the impeller power as a function of operating conditions and wet granule properties, establishing the critical variables that affect the response and obtaining a unique experimental polynomial equation (transparent model) of high predictability (R² > 86.78%) for all equipment sizes. Gene expression programming allowed the modeling of the granulation process for granulators of similar and dissimilar geometries, and the models can be improved by incorporating additional characteristics of the process, such as composition variables or operation parameters (e.g., batch size, chopper speed). The principles and the methodology proposed here can be applied to understand and control the manufacturing process using any other granulation equipment, including continuous granulation processes. Copyright © 2016 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.

  10. Development of automatic radiographic inspection system using digital image processing and artificial intelligence

    International Nuclear Information System (INIS)

    Itoga, Kouyu; Sugimoto, Koji; Michiba, Koji; Kato, Yuhei; Sugita, Yuji; Onda, Katsuhiro.

    1991-01-01

    The application of computers to welding inspection is expanding rapidly. The applications can be classified into the collection, analysis and processing of data; the graphic display of results; the distinction of the kinds of defects; and the evaluation of the harmfulness of defects with the judgement of acceptance or rejection. The application of computer techniques to the automation of data collection was realized at a relatively early stage. Data processing and the graphic display of results are the techniques in progress now, and the application of artificial intelligence to the distinction of the kinds of defects and the evaluation of harmfulness is expected to expand rapidly. In order to computerize radiographic inspection, the abilities of image processing technology and knowledge engineering must be given to computers. The object of this system is butt joints made by arc welding of steel materials of up to 30 mm thickness. The system carries out the digitizing transformation of radiographs; the assessment of transmissivity and gradation by image processing; and, only for those radiographs whose picture quality satisfies the standard, the extraction of defect images, their display, the distinction of the kinds of defects, and the final judgement. The techniques of image processing, the knowledge for distinguishing the kinds of defects, and the concept of the practical system are reported. (K.I.)

  11. Real-time operation guide system for sintering process with artificial intelligence

    Institute of Scientific and Technical Information of China (English)

    FAN Xiao-hui; CHEN Xu-ling; JIANG Tao; LI Tao

    2005-01-01

    In order to optimize the sintering process, a real-time operation guide system with artificial intelligence was developed, mainly comprising an online data acquisition subsystem, a sinter chemical composition controller, a sintering process state controller, and an abnormal-conditions diagnosis subsystem. A knowledge base for sintering process control was constructed, and the inference engine of the system was established. Sinter chemical compositions were controlled by strategies of self-adaptive prediction and internal optimization centered on basicity, and the state of sintering was stabilized centering on permeability. To meet the needs of process changes and keep the system transparent, the system has learning ability and an explanation function. The software of the system was developed in the Visual C++ programming language. Application of the system shows that the hit rates of sinter composition and burn-through point prediction exceed 85%; the first-grade rate of sinter chemical composition, the stability rate of burn-through point, and the stability rate of the sintering process are increased by 3%, 9% and 4%, respectively.

  12. Artificial Intelligence Mechanisms on Interactive Modified Simplex Method with Desirability Function for Optimising Surface Lapping Process

    Directory of Open Access Journals (Sweden)

    Pongchanun Luangpaiboon

    2014-01-01

    Full Text Available A study has been made to optimise the influential parameters of the surface lapping process. Lapping time, lapping speed, downward pressure, and charging pressure were chosen from preliminary studies as the parameters determining process performance in terms of material removal, lap width, and clamp force. Nominal-the-best desirability functions were used to compromise the multiple responses into an overall desirability level, or D response. The conventional modified simplex (Nelder-Mead simplex) method and the interactive desirability function were performed online to optimise the parameter levels in order to maximise the D response. To determine the lapping process parameters effectively, this research then applies two powerful artificial intelligence optimisation mechanisms: the harmony search and firefly algorithms. The recommended condition of (lapping time, lapping speed, downward pressure, charging pressure) at (33, 35, 6.0, 5.0) has been verified by performing confirmation experiments. It showed that the D response level increased to 0.96. When compared with the current operating condition, there is a decrease in material removal and lap width, with improved process performance indices of 2.01 and 1.14, respectively. Similarly, there is an increase in clamp force, with an improved process performance index of 1.58.
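    The nominal-the-best desirability transform and the overall D response named in this record are standard Derringer–Suich constructions; a minimal hedged sketch follows, with illustrative limits, targets, and response values that are not taken from the paper.

    ```python
    import math

    def d_nominal_the_best(y, low, target, high, s=1.0, t=1.0):
        """Nominal-the-best desirability: 1 at the target value,
        falling to 0 at the lower/upper acceptance limits."""
        if y < low or y > high:
            return 0.0
        if y <= target:
            return ((y - low) / (target - low)) ** s
        return ((high - y) / (high - target)) ** t

    def overall_D(ds):
        """Overall desirability: geometric mean of the individual d values."""
        return math.prod(ds) ** (1.0 / len(ds))

    # Illustrative responses: material removal, lap width, clamp force.
    ds = [d_nominal_the_best(4.8, low=3.0, target=5.0, high=7.0),
          d_nominal_the_best(5.5, low=4.0, target=5.0, high=6.0),
          d_nominal_the_best(9.0, low=6.0, target=9.0, high=12.0)]
    print(round(overall_D(ds), 3))
    ```

    Because D is a geometric mean, any single response at zero desirability drives the whole D response to zero, which is what lets a simplex or swarm optimizer trade the three responses off against each other.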

  13. An architectural framework for developing intelligent applications for the carbon dioxide capture process

    Energy Technology Data Exchange (ETDEWEB)

    Luo, C.; Zhou, Q.; Chan, C.W. [Regina Univ., SK (Canada)

    2009-07-01

    This presentation reported on the development of automated application solutions for the carbon dioxide (CO{sub 2}) capture process. An architectural framework was presented for developing intelligent systems for the process. The chemical absorption process consists of dozens of components and therefore generates more than a hundred different types of data. Automated support is desirable because the monitoring, analysis, and diagnosis of these data are very complex. The proposed framework interacts with an implemented domain ontology for the CO{sub 2} capture process, which consists of information derived from senior operators of the CO{sub 2} pilot plant at the International Test Centre for Carbon Dioxide Capture at the University of Regina. The well-defined library within the framework reduces development time and cost. The framework also has built-in web-based software components for data monitoring, management, and analysis. These components provide support for generating automated solutions for the CO{sub 2} capture process. An automated monitoring system was also developed based on the architectural framework.

  14. Large-Scale Graph Processing Using Apache Giraph

    KAUST Repository

    Sakr, Sherif

    2017-01-07

    This book takes its reader on a journey through Apache Giraph, a popular distributed graph processing platform designed to bring the power of big data processing to graph data. Designed as a step-by-step self-study guide for everyone interested in large-scale graph processing, it describes the fundamental abstractions of the system, its programming models and various techniques for using the system to process graph data at scale, including the implementation of several popular and advanced graph analytics algorithms.
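    Giraph programs are written in Java against its vertex computation API; as a language-neutral illustration of the vertex-centric, superstep-based abstraction the book describes, here is a hedged single-machine sketch of PageRank in the Pregel style. The message-passing loop mimics the programming model only, not Giraph's actual API.

    ```python
    def pregel_pagerank(graph, supersteps=50, damping=0.85):
        """Vertex-centric PageRank: in each superstep, every vertex combines
        its incoming messages, updates its value, and sends value/out_degree
        to its out-neighbours, as a Giraph compute() method would."""
        n = len(graph)
        value = {v: 1.0 / n for v in graph}
        inbox = {v: [] for v in graph}
        for _ in range(supersteps):
            outbox = {v: [] for v in graph}
            for v, neighbours in graph.items():
                if inbox[v]:                  # superstep > 0: absorb messages
                    value[v] = (1 - damping) / n + damping * sum(inbox[v])
                share = value[v] / len(neighbours)
                for u in neighbours:
                    outbox[u].append(share)   # message passing to neighbours
            inbox = outbox                    # barrier between supersteps
        return value

    # Tiny illustrative link graph: a <-> b, b -> c, c -> a.
    ranks = pregel_pagerank({"a": ["b"], "b": ["a", "c"], "c": ["a"]})
    print(max(ranks, key=ranks.get))
    ```

    In Giraph the same per-vertex logic runs in parallel across workers with a bulk-synchronous barrier between supersteps; this sketch serializes the supersteps on one machine to make the abstraction visible.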

  15. Large-Scale Graph Processing Using Apache Giraph

    KAUST Repository

    Sakr, Sherif; Orakzai, Faisal Moeen; Abdelaziz, Ibrahim; Khayyat, Zuhair

    2017-01-01

    This book takes its reader on a journey through Apache Giraph, a popular distributed graph processing platform designed to bring the power of big data processing to graph data. Designed as a step-by-step self-study guide for everyone interested in large-scale graph processing, it describes the fundamental abstractions of the system, its programming models and various techniques for using the system to process graph data at scale, including the implementation of several popular and advanced graph analytics algorithms.

  16. Hierarchical hybrid control of manipulators: Artificial intelligence in large scale integrated circuits

    Science.gov (United States)

    Greene, P. H.

    1972-01-01

    Both in practical engineering and in control of muscular systems, low level subsystems automatically provide crude approximations to the proper response. Through low level tuning of these approximations, the proper response variant can emerge from standardized high level commands. Such systems are expressly suited to emerging large scale integrated circuit technology. A computer, using symbolic descriptions of subsystem responses, can select and shape responses of low level digital or analog microcircuits. A mathematical theory that reveals significant informational units in this style of control and software for realizing such information structures are formulated.

  17. Political and Budgetary Oversight of the Ukrainian Intelligence Community: Processes, Problems and Prospects for Reform

    National Research Council Canada - National Science Library

    Petrov, Oleksii

    2007-01-01

    This thesis addresses the problem of providing policy and budget oversight of Ukrainian intelligence organizations in accordance with norms and practices developed in contemporary Western democracies...

  18. Process querying : enabling business intelligence through query-based process analytics

    NARCIS (Netherlands)

    Polyvyanyy, A.; Ouyang, C.; Barros, A.; van der Aalst, W.M.P.

    2017-01-01

    The volume of process-related data is growing rapidly: more and more business operations are being supported and monitored by information systems. Industry 4.0 and the corresponding industrial Internet of Things are about to generate new waves of process-related data, next to the abundance of event

  19. Prodiag--a hybrid artificial intelligence based reactor diagnostic system for process faults

    International Nuclear Information System (INIS)

    Reifman, J.; Wei, T.Y.C.; Vitela, J.E.; Applequist, C. A.; Chasensky, T.M.

    1996-01-01

    Commonwealth Research Corporation (CRC) and Argonne National Laboratory (ANL) are collaborating on a DOE-sponsored Cooperative Research and Development Agreement (CRADA) project to perform feasibility studies on a novel approach to Artificial Intelligence (AI) based diagnostics for component faults in nuclear power plants. Investigations are being performed into the construction of a first-principles, physics-based, plant-level process diagnostic expert system (ES) and the identification of component-level fault patterns from operating component characteristics using artificial neural networks (ANNs). The purpose of the proof-of-concept project is to develop a computer-based system using this AI approach to assist process plant operators during off-normal plant conditions. The proposed computer-based system will use thermal-hydraulic (T-H) signals, complemented by other non-T-H signals available in the data stream, to provide the process operator with the component which most likely caused the observed process disturbance. To demonstrate the scale-up feasibility of the proposed diagnostic system, it is being developed for use with the Chemical and Volume Control System (CVCS) of a nuclear power plant. A full-scope operator training simulator representing the Commonwealth Edison Braidwood nuclear power plant is being used both as the source of development data and as the means to evaluate the advantages of the proposed diagnostic system. This is an ongoing multi-year project, and this paper presents the results to date of the CRADA phase

  20. Information Design for “Weak Signal” detection and processing in Economic Intelligence: A case study on Health resources

    Directory of Open Access Journals (Sweden)

    Sahbi Sidhom

    2011-12-01

    The topics of this research cover all phases of “Information Design” applied to detect and profit from weak signals in economic intelligence (EI) or business intelligence (BI). The field of information design (ID) applies to the process of translating complex, unorganized or unstructured data into valuable and meaningful information. ID practice requires an interdisciplinary approach, combining skills in graphic design (writing, analysis, processing and editing), human performance technology and human factors. Applied in the context of an information system, it allows end users to easily detect implicit topics known as “weak signals” (WS). In our approach to implementing ID, the processes cover the development of a knowledge management (KM) process in the context of EI. A case study concerning information monitoring of health resources is presented, using ID processes to outline weak signals. Both French and American bibliographic databases were used to make the connection to multilingual concepts in the health watch process.

  1. Design of intelligent power consumption optimization and visualization management platform for large buildings based on internet of things

    Directory of Open Access Journals (Sweden)

    Gong Shulan

    2017-01-01

    Buildings make a significant contribution to total energy consumption and CO2 emissions. It has been estimated that the development of an intelligent power consumption monitoring and control system can yield savings of about 30% in energy consumption. This design innovatively integrates advanced technologies such as the internet of things, the internet, intelligent buildings and intelligent electricity, offering an open, efficient and convenient energy-consumption detection platform on the demand side and a visual management demonstration and application platform on the power-enterprise side. The system was created to maximize the effective and efficient use of energy resources. It was developed around sensor networks, an intelligent gateway and the monitoring-center software. This realizes a high degree of integration and comprehensive application of energy and information to meet the needs of intelligent buildings.

  2. Black-White Differences in Cognitive Processing: A Study of the Planning, Attention, Simultaneous, and Successive Theory of Intelligence

    Science.gov (United States)

    Naglieri, Jack A.; Rojahn, Johannes; Matto, Holly C.; Aquilino, Sally A.

    2005-01-01

    Researchers have typically found a mean difference of about 15 points between Blacks and Whites on traditional measures of intelligence. Some have argued that the difference between Blacks and Whites would be smaller on measures of cognitive processing. This study examined Black (n = 298) and White (n = 1,691) children on Planning, Attention,…

  3. Drell–Yan process at Large Hadron Collider

    Indian Academy of Sciences (India)

    the Drell–Yan process [1] first studied with muon final states. In Standard … Two large-statistics sets of signal events, based on the value of the dimuon invariant mass, … quality control criteria are applied to this globally reconstructed muon.

  4. Dissolved Gas Analysis Principle-Based Intelligent Approaches to Fault Diagnosis and Decision Making for Large Oil-Immersed Power Transformers: A Survey

    Directory of Open Access Journals (Sweden)

    Lefeng Cheng

    2018-04-01

    Compared with conventional methods of fault diagnosis for power transformers, which have defects such as imperfect encoding and overly absolute encoding boundaries, this paper systematically discusses various intelligent approaches applied in fault diagnosis and decision making for large oil-immersed power transformers based on dissolved gas analysis (DGA), including expert systems (EPS), artificial neural networks (ANN), fuzzy theory, rough set theory (RST), grey system theory (GST), swarm intelligence (SI) algorithms, data mining technology, machine learning (ML), and other intelligent diagnosis tools, and summarizes existing problems and solutions. From this survey, it is found that a single intelligent approach to fault diagnosis can reflect the operating status of the transformer in only one particular aspect, causing various shortcomings that cannot be resolved effectively. Combined with the current research status in this field, the problems that must be addressed in DGA-based transformer fault diagnosis are identified, and the prospects for future development trends and research directions are outlined. This contribution presents a detailed and systematic survey of various intelligent approaches to fault diagnosis and decision making for the power transformer, in which their merits and demerits are thoroughly investigated, and improvement schemes and future development trends are proposed. Moreover, this paper concludes that a variety of intelligent algorithms should be combined for mutual complementation to form a hybrid fault-diagnosis network, so as to avoid these algorithms falling into local optima. In addition, it is necessary to improve the detection instruments so as to acquire reasonable characteristic gas data samples. The research summary, empirical generalization and analysis of predicaments in this paper provide some thoughts and suggestions for the research of complex power grids in the new environment, as

  5. Large Scale Processes and Extreme Floods in Brazil

    Science.gov (United States)

    Ribeiro Lima, C. H.; AghaKouchak, A.; Lall, U.

    2016-12-01

    Persistent large-scale anomalies in the atmospheric circulation and ocean state have been associated with heavy rainfall and extreme floods in water basins of different sizes across the world. Such studies have emerged in recent years as a new tool to improve the traditional, stationarity-based approach to flood frequency analysis and flood prediction. Here we seek to advance previous studies by evaluating the dominance of large-scale processes (e.g. atmospheric rivers/moisture transport) over local processes (e.g. local convection) in producing floods. We consider flood-prone regions in Brazil as case studies, and the role of large-scale climate processes in generating extreme floods in these regions is explored by means of observed streamflow, reanalysis data and machine learning methods. The dynamics of the large-scale atmospheric circulation in the days prior to the flood events are evaluated based on the vertically integrated moisture flux and its divergence field, which are interpreted in a low-dimensional space obtained by machine learning techniques, particularly supervised kernel principal component analysis. In this reduced-dimensional space, clusters are obtained in order to better understand the role of regional moisture recycling or teleconnected moisture in producing floods of a given magnitude. The convective available potential energy (CAPE) is also used as a measure of local convective activity. For individual sites we investigate the exceedance probability at which large-scale atmospheric fluxes dominate the flood process. Finally, we analyze regional patterns of floods and how the scaling law of floods with drainage area responds to changes in the climate forcing mechanisms (e.g. local vs large-scale).
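
    The low-dimensional embedding step described above can be sketched with plain (unsupervised) RBF-kernel PCA implemented in NumPy; the supervised variant used in the study adds response-guided weighting, which is omitted here, and all function names and parameters are illustrative.

```python
import numpy as np

def rbf_kernel_pca(X, gamma=1.0, n_components=2):
    """Project rows of X onto the leading kernel principal components
    (RBF kernel), the kind of low-dimensional space used to interpret
    moisture-flux fields."""
    # Pairwise squared Euclidean distances and RBF kernel matrix
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    K = np.exp(-gamma * d2)
    # Centre the kernel matrix in feature space
    n = K.shape[0]
    one = np.ones((n, n)) / n
    Kc = K - one @ K - K @ one + one @ K @ one
    # Eigendecomposition (eigh returns ascending eigenvalues)
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]
    vals, vecs = vals[idx], vecs[:, idx]
    # Scale eigenvectors by sqrt(eigenvalue) to get projections
    return vecs * np.sqrt(np.clip(vals, 0.0, None))
```

In this space, standard clustering can then separate, for example, locally recycled from teleconnected moisture patterns.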

  6. Nonterrestrial material processing and manufacturing of large space systems

    Science.gov (United States)

    Von Tiesenhausen, G.

    1979-01-01

    Nonterrestrial processing of materials and manufacturing of large space system components from preprocessed lunar materials at a manufacturing site in space is described. Lunar materials mined and preprocessed at the lunar resource complex will be flown to the space manufacturing facility (SMF), where, together with supplementary terrestrial materials, they will undergo final processing and fabrication into space communication systems, solar cell blankets, radio frequency generators, and electrical equipment. Satellite Power System (SPS) material requirements and lunar material availability and utilization are detailed, and the SMF processing, refining and fabricating facilities, material flow and manpower requirements are described.

  7. Artificial intelligence approach to planning the robotic assembly of large tetrahedral truss structures

    Science.gov (United States)

    Homemdemello, Luiz S.

    1992-01-01

    An assembly planner for tetrahedral truss structures is presented. To overcome the difficulties due to the large number of parts, the planner exploits the simplicity and uniformity of the shapes of the parts and the regularity of their interconnection. The planning automation is based on the computational formalism known as a production system. The global database consists of a hexagonal-grid representation of the truss structure. This representation captures the regularity of tetrahedral truss structures and their multiple hierarchies. It maps into quadratic grids and can be implemented in a computer using a two-dimensional array data structure. Because the multiple hierarchies are maintained explicitly in the model, the choice of a particular hierarchy is made only when needed, allowing a more informed decision. Furthermore, testing the preconditions of the production rules is simple because the patterned way in which the struts are interconnected is incorporated into the topology of the hexagonal grid. A directed-graph representation of assembly sequences allows the use of both graph search and backtracking control strategies.

  8. Intelligence in Artificial Intelligence

    OpenAIRE

    Datta, Shoumen Palit Austin

    2016-01-01

    The elusive quest for intelligence in artificial intelligence prompts us to consider that instituting human-level intelligence in systems may be (still) in the realm of utopia. In about a quarter century, we have witnessed the winter of AI (1990) being transformed and transported to the zenith of tabloid fodder about AI (2015). The discussion at hand is about the elements that constitute the canonical idea of intelligence. The delivery of intelligence as a pay-per-use-service, popping out of ...

  9. Process sensors characterization based on noise analysis technique and artificial intelligence

    International Nuclear Information System (INIS)

    Mesquita, Roberto N. de; Perillo, Sergio R.P.; Santos, Roberto C. dos

    2005-01-01

    The time response of pressure and temperature sensors in the Reactor Protection System (RPS) is a requirement that must be satisfied in nuclear power plants; furthermore, it is an indicator of sensor degradation and remaining life. The nuclear power industry and others have been eager to implement smart sensor technologies and digital instrumentation concepts to reduce the manpower and effort currently spent on testing and calibration. Process parameter fluctuations during normal operation of a reactor are caused by random variations in neutron flux, heat transfer and other sources. The output sensor noise can be considered the response of the system to an input representing the statistical nature of the underlying process, which can be modeled using a time series model. Since the noise signal measurements are influenced by many factors, such as the location of the sensors, extraneous noise interference, and randomness in temperature and pressure fluctuations, the quantitative estimate of the time response using autoregressive noise modeling is subject to error. This technique has been used as a means of sensor monitoring. In this work, a set of pressure sensors installed in an experimental loop adapted from a flow calibration setup is used to test and analyze signals in a new approach using artificial intelligence techniques. A set of measurements of dynamic signals under different experimental conditions is used to distinguish and identify underlying process sources. A methodology that uses blind separation of sources with a neural network scheme is being developed to improve the reliability of time response estimates in noise analysis. (author)
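
    The autoregressive noise-modeling step that the abstract calls error-prone can be illustrated with a minimal AR(1) fit; mapping the AR coefficient to a sensor time constant assumes a first-order sensor sampled at a fixed interval dt, a simplifying assumption for illustration rather than the paper's full model.

```python
import numpy as np

def ar1_time_constant(x, dt=0.1):
    """Fit an AR(1) model x[t] = a*x[t-1] + e[t] by least squares to a
    noise record and convert the coefficient to an equivalent
    first-order sensor time constant."""
    x = np.asarray(x, float) - np.mean(x)
    a = np.dot(x[1:], x[:-1]) / np.dot(x[:-1], x[:-1])
    # For a first-order lag sampled at dt: a = exp(-dt / tau)
    return -dt / np.log(a)
```

A quick check is to simulate a record with a known time constant and confirm the estimate recovers it to within the expected sampling error.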

  10. Process sensors characterization based on noise analysis technique and artificial intelligence

    Energy Technology Data Exchange (ETDEWEB)

    Mesquita, Roberto N. de; Perillo, Sergio R.P.; Santos, Roberto C. dos [Instituto de Pesquisas Energeticas e Nucleares (IPEN), Sao Paulo, SP (Brazil)]. E-mail: rnavarro@ipen.br; sperillo@ipen.br; rcsantos@ipen.br

    2005-07-01

    The time response of pressure and temperature sensors in the Reactor Protection System (RPS) is a requirement that must be satisfied in nuclear power plants; furthermore, it is an indicator of sensor degradation and remaining life. The nuclear power industry and others have been eager to implement smart sensor technologies and digital instrumentation concepts to reduce the manpower and effort currently spent on testing and calibration. Process parameter fluctuations during normal operation of a reactor are caused by random variations in neutron flux, heat transfer and other sources. The output sensor noise can be considered the response of the system to an input representing the statistical nature of the underlying process, which can be modeled using a time series model. Since the noise signal measurements are influenced by many factors, such as the location of the sensors, extraneous noise interference, and randomness in temperature and pressure fluctuations, the quantitative estimate of the time response using autoregressive noise modeling is subject to error. This technique has been used as a means of sensor monitoring. In this work, a set of pressure sensors installed in an experimental loop adapted from a flow calibration setup is used to test and analyze signals in a new approach using artificial intelligence techniques. A set of measurements of dynamic signals under different experimental conditions is used to distinguish and identify underlying process sources. A methodology that uses blind separation of sources with a neural network scheme is being developed to improve the reliability of time response estimates in noise analysis. (author)

  11. Expert system in OPS5 for intelligent processing of the alarms in nuclear plants

    International Nuclear Information System (INIS)

    Cavalcante Junior, Jose Airton Chaves

    1997-11-01

    This work establishes a model of knowledge representation, based on an expert system, for supervising both safety and operation, intended for general application to monitoring and fault detection in industrial processes. The model structure proposed here lets the system represent the knowledge related to process faults using a combination of basic and associative rules. In addition, the proposed model has a mechanism for real-time propagation of events that acts on this structure, making intelligent alarm processing possible. The rules used by the system define faults from the data acquired by instrumentation (basic rules) or from the conjunction of faults already established (associative rules). The computational implementation of the model defined in this work was developed in OPS5. It was applied to an example consisting of the shutdown of the Angra-I power plant and was called FDAX (FDA Extended). For the simulated tests, FDAX was connected to SICA (Integrated System of Angra-I Computers). The results confirm the validity of the model and its suitability for real-time applications. (author)
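
    The interplay of basic rules (fired by instrument readings) and associative rules (fired by conjunctions of faults already asserted) can be sketched as a tiny forward-chaining loop. This is an illustration in the spirit of the model, not the FDAX/OPS5 implementation; the tag names and thresholds are invented.

```python
# Minimal forward-chaining alarm processor: "basic" rules derive faults
# from instrument readings, "associative" rules derive higher-level
# faults from conjunctions of faults already asserted.

def process_alarms(readings, basic_rules, associative_rules):
    # Fire every basic rule against the current readings
    faults = {name for name, pred in basic_rules if pred(readings)}
    changed = True
    while changed:                      # propagate events to a fixpoint
        changed = False
        for name, premises in associative_rules:
            if name not in faults and premises <= faults:
                faults.add(name)
                changed = True
    return faults

# Hypothetical rule base (names and limits are illustrative only)
basic_rules = [
    ("LOW_PRESSURE", lambda r: r["pressure"] < 120.0),
    ("HIGH_TEMP",    lambda r: r["temperature"] > 310.0),
]
associative_rules = [
    ("COOLANT_LEAK", {"LOW_PRESSURE", "HIGH_TEMP"}),
]
```

The fixpoint loop plays the role of the event-propagation mechanism: a derived fault can itself satisfy the premises of further associative rules.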

  12. Gender Differences in the Relationship between Emotional Intelligence and Right Hemisphere Lateralization for Facial Processing

    Science.gov (United States)

    Castro-Schilo, Laura; Kee, Daniel W.

    2010-01-01

    The present study examined relationships between emotional intelligence, measured by the Mayer-Salovey-Caruso Emotional Intelligence Test, and right hemisphere dominance for a free vision chimeric face test. A sample of 122 ethnically diverse college students participated and completed online versions of the forenamed tests. A hierarchical…

  13. Resin infusion of large composite structures modeling and manufacturing process

    Energy Technology Data Exchange (ETDEWEB)

    Loos, A.C. [Michigan State Univ., Dept. of Mechanical Engineering, East Lansing, MI (United States)

    2006-07-01

    The resin infusion processes resin transfer molding (RTM), resin film infusion (RFI) and vacuum assisted resin transfer molding (VARTM) are cost-effective techniques for the fabrication of complex-shaped composite structures. The dry fibrous preform is placed in the mold, consolidated, resin impregnated and cured in a single-step process. The fibrous preforms are often constructed near net shape using highly automated textile processes such as knitting, weaving and braiding. In this paper, the infusion processes RTM, RFI and VARTM are discussed along with the advantages of each technique compared with traditional composite fabrication methods such as prepreg tape lay-up and autoclave cure. The large number of processing variables and the complex material behavior during infiltration and cure make experimental optimization of the infusion processes costly and inefficient. Numerical models have been developed which can be used to simulate the resin infusion processes. The model formulation and solution procedures for the VARTM process are presented. A VARTM process simulation of a carbon fiber preform is presented to demonstrate the type of information that can be generated by the model and to compare the model predictions with experimental measurements. Overall, the predicted flow front positions, resin pressures and preform thicknesses agree well with the measured values. The results of the simulation show the potential cost and performance benefits that can be realized by using a simulation model as part of the development process. (au)

  14. The Impact of Business Intelligence (BI) Competence on Customer Relationship Management (CRM) Process: An Empirical Investigation of the Banking Industry

    Directory of Open Access Journals (Sweden)

    Ali Mortezaei

    2018-03-01

    Nowadays, establishing long-term and effective relationships with customers is a key factor in understanding customers’ needs and preferences and achieving competitive advantage. In addition, companies face a growing need for information and analytical knowledge about their customers, market, competitors, organizational environment, and other factors affecting their business. Business intelligence has been considered a response to this need. The purpose of this study is to investigate the role of business intelligence competence in improving the customer relationship management process. Based on the literature review and the competence–capability relationship paradigm, a conceptual model was developed comprising different dimensions of business intelligence competence and customer relationship management processes. The data were collected from the banking sector, and partial least squares structural equation modelling was employed for data analysis. Empirical results showed that organizational business intelligence competence, comprising managerial, technical, and cultural competence, has a significantly positive impact on enhancing the capabilities of the customer relationship management process, including initiation, maintenance, and termination of the relationship.

  15. QCD phenomenology of the large P/sub T/ processes

    International Nuclear Information System (INIS)

    Stroynowski, R.

    1979-11-01

    Quantum Chromodynamics (QCD) provides a framework for possible high-accuracy calculations of large-p/sub T/ processes. The description of large-transverse-momentum phenomena is introduced in terms of the parton model, and the modifications expected from QCD are described using single-particle distributions as an example. The present status of available data (π, K, p, p-bar, eta, particle ratios, beam ratios, direct photons, nuclear target dependence), the evidence for jets, and future prospects are reviewed. 80 references, 33 figures, 3 tables

  16. Intelligent Mission Controller Node

    National Research Council Canada - National Science Library

    Perme, David

    2002-01-01

    The goal of the Intelligent Mission Controller Node (IMCN) project was to improve the process of translating mission taskings between real-world Command, Control, Communications, Computers, and Intelligence (C41...

  17. Neuronal factors determining high intelligence.

    Science.gov (United States)

    Dicke, Ursula; Roth, Gerhard

    2016-01-05

    Many attempts have been made to correlate degrees of both animal and human intelligence with brain properties. With respect to mammals, a much-discussed trait concerns absolute and relative brain size, either uncorrected or corrected for body size. However, the correlation of both with degrees of intelligence yields large inconsistencies, because although they are regarded as the most intelligent mammals, monkeys and apes, including humans, have neither the absolutely nor the relatively largest brains. The best fit between brain traits and degrees of intelligence among mammals is reached by a combination of the number of cortical neurons, neuron packing density, interneuronal distance and axonal conduction velocity--factors that determine general information processing capacity (IPC), as reflected by general intelligence. The highest IPC is found in humans, followed by the great apes, Old World and New World monkeys. The IPC of cetaceans and elephants is much lower because of a thin cortex, low neuron packing density and low axonal conduction velocity. By contrast, corvid and psittacid birds have very small and densely packed pallial neurons and relatively many neurons, which, despite very small brain volumes, might explain their high intelligence. The evolution of a syntactical and grammatical language in humans most probably has served as an additional intelligence amplifier, which may have happened in songbirds and psittacids in a convergent manner. © 2015 The Author(s).

  18. Intelligent Technique for Signal Processing to Identify the Brain Disorder for Epilepsy Captures Using Fuzzy Systems

    Directory of Open Access Journals (Sweden)

    Gurumurthy Sasikumar

    2016-01-01

    Understanding the signals generated by the organization of the brain is one of the main tasks in brain signal processing. Among neurological disorders, epilepsy is one of the most prevalent, and an automated artificial-intelligence detection technique is essential owing to the erratic and unpredictable occurrence of epileptic seizures. We propose an improved fuzzy firefly algorithm, which enhances the classification of brain signals efficiently with a minimum number of iterations. An important clustering technique based on fuzzy logic is fuzzy C-means. Features obtained from multichannel EEG signals are combined in both the feature and spatial domains by means of fuzzy algorithms. For a more precise segmentation process, the firefly algorithm is applied to optimize the fuzzy C-means membership function, and convergence criteria are set for efficient clustering. Overall, the proposed technique yields more accurate results, giving it an edge over other techniques; its results are compared with those of the standard fuzzy C-means and PSO algorithms.
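
    The clustering core of the method, before the firefly optimization is layered on top, is standard fuzzy C-means. A minimal NumPy sketch of the alternating membership/centroid updates follows; the fuzzifier, iteration count and seed are illustrative.

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, n_iter=50, seed=0):
    """Plain fuzzy C-means: alternate membership and centroid updates
    for fuzzifier m. Returns the membership matrix U and the centers."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)            # memberships sum to 1
    for _ in range(n_iter):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        d = np.maximum(d, 1e-12)                 # avoid division by zero
        inv = d ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)
    return U, centers
```

In the paper's scheme, a firefly search would tune these memberships further instead of relying on the fixed-point iteration alone.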

  19. An intelligent signal processing and pattern recognition technique for defect identification using an active sensor network

    Science.gov (United States)

    Su, Zhongqing; Ye, Lin

    2004-08-01

    The practical utilization of elastic waves, e.g. Rayleigh-Lamb waves, in high-performance structural health monitoring techniques is somewhat impeded due to the complicated wave dispersion phenomena, the existence of multiple wave modes, the high susceptibility to diverse interferences, the bulky sampled data and the difficulty in signal interpretation. An intelligent signal processing and pattern recognition (ISPPR) approach using the wavelet transform and artificial neural network algorithms was developed; this was actualized in a signal processing package (SPP). The ISPPR technique comprehensively functions as signal filtration, data compression, characteristic extraction, information mapping and pattern recognition, capable of extracting essential yet concise features from acquired raw wave signals and further assisting in structural health evaluation. For validation, the SPP was applied to the prediction of crack growth in an alloy structural beam and construction of a damage parameter database for defect identification in CF/EP composite structures. It was clearly apparent that the elastic wave propagation-based damage assessment could be dramatically streamlined by introduction of the ISPPR technique.
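
    The filtration/compression/feature-extraction stages can be illustrated with a plain Haar wavelet decomposition that keeps only the largest-magnitude coefficients as a compact feature vector for a downstream network. This is a generic sketch, not the SPP package itself; the level and coefficient counts are illustrative.

```python
import numpy as np

def haar_features(signal, levels=3, keep=8):
    """One-dimensional Haar wavelet decomposition followed by crude
    compression: keep the `keep` largest-magnitude coefficients.
    Assumes len(signal) is divisible by 2**levels."""
    x = np.asarray(signal, float)
    coeffs = []
    for _ in range(levels):
        approx = (x[0::2] + x[1::2]) / np.sqrt(2)   # low-pass half
        detail = (x[0::2] - x[1::2]) / np.sqrt(2)   # high-pass half
        coeffs.append(detail)
        x = approx
    coeffs.append(x)                                # final approximation
    flat = np.concatenate(coeffs)
    top = np.argsort(np.abs(flat))[::-1][:keep]     # largest first
    return flat[top]
```

Feeding such concise feature vectors, rather than bulky raw waveforms, to a neural network is the essence of the data-compression step the abstract describes.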

  20. Middle manager role and contribution towards the competitive intelligence process: A case of Irish subsidiaries

    Directory of Open Access Journals (Sweden)

    Willie Chinyamurindi

    2016-07-01

    Full Text Available Background: Calls have been made especially during a period of global competition and economic austerity for research that focuses on how competitive intelligence (CI is actually generated within organisations. Objectives: The aim of this study was to understand the views and experiences of middle managers with regard to their role and contribution towards the CI process within Irish subsidiaries of the Multinational Corporation (MNC. Method: The study adopts a qualitative approach using the semi-structured interview technique to generate narratives and themes around how CI is generated using a sample of 15 middle managers drawn from five participating Irish subsidiaries. Results: Based on the analysis of the narratives of the middle managers, three main themes emerged as findings. Firstly, the process of gathering CI was facilitated by the reliance on internal and external tools. Secondly, information gathered from the use of such tools was then communicated by middle managers to top managers to inform the making of strategic decisions. Thus, (and thirdly, middle managers were found to occupy an important role not only through the execution of their management duties but by extending this influence towards the generation of information deemed to affect the competitive position of not just the subsidiary but also the parent company. Conclusion: The study concludes by focusing on the implications and recommendations based on the three themes drawn from the empirical data.

  1. An Intelligent Optimization Method for Vortex-Induced Vibration Reducing and Performance Improving in a Large Francis Turbine

    Directory of Open Access Journals (Sweden)

    Xuanlin Peng

    2017-11-01

    In this paper, a new methodology is proposed to reduce the vortex-induced vibration (VIV) and improve the performance of the stay vane in a 200-MW Francis turbine. The process can be divided into two parts. Firstly, a diagnosis method for stay vane vibration based on field experiments and a finite element method (FEM) is presented. It is found that the resonance between the Kármán vortex and the stay vane is the main cause for the undesired vibration. Then, we focus on establishing an intelligent optimization model of the stay vane’s trailing edge profile. To this end, an approach combining factorial experiments, extreme learning machine (ELM) and particle swarm optimization (PSO) is implemented. Three kinds of improved profiles of the stay vane are proposed and compared. Finally, the profile with a Donaldson trailing edge is adopted as the best solution for the stay vane, and verifications such as computational fluid dynamics (CFD) simulations, structural analysis and fatigue analysis are performed to validate the optimized geometry.
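
    The surrogate model at the heart of such an optimization loop, an extreme learning machine, is short enough to sketch: a fixed random hidden layer whose output weights are solved in closed form by least squares. The toy sizes below are illustrative, and the PSO outer loop is omitted.

```python
import numpy as np

def train_elm(X, y, hidden=50, seed=0):
    """Extreme learning machine: random tanh hidden layer, output
    weights solved by linear least squares."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], hidden))   # fixed input weights
    b = rng.normal(size=hidden)                 # fixed biases
    H = np.tanh(X @ W + b)
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta
```

Because training is a single linear solve, the surrogate is cheap enough to be re-evaluated thousands of times inside a PSO search over trailing-edge parameters.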

  2. Artificial Intelligence Based Selection of Optimal Cutting Tool and Process Parameters for Effective Turning and Milling Operations

    Science.gov (United States)

    Saranya, Kunaparaju; John Rozario Jegaraj, J.; Ramesh Kumar, Katta; Venkateshwara Rao, Ghanta

    2016-06-01

    With the increased trend towards automation in the modern manufacturing industry, human intervention in routine, repetitive and data-specific manufacturing activities is greatly reduced. In this paper, an attempt has been made to reduce human intervention in the selection of the optimal cutting tool and process parameters for metal cutting applications using artificial intelligence techniques. Generally, the selection of an appropriate cutting tool and parameters in metal cutting is carried out by an experienced technician or cutting-tool expert based on his knowledge base or an extensive search of a huge cutting-tool database. The proposed approach replaces the existing practice of physically searching for tools in databooks and tool catalogues with an intelligent knowledge-based selection system. This system employs artificial intelligence techniques such as artificial neural networks, fuzzy logic and genetic algorithms for decision making and optimization. This intelligence-based optimal tool selection strategy was developed and implemented using MathWorks Matlab Version 7.11.0. The cutting-tool database was obtained from the tool catalogues of different tool manufacturers. This paper discusses in detail the methodology and strategies employed for the selection of an appropriate cutting tool and the optimization of process parameters based on multi-objective optimization criteria considering material removal rate, tool life and tool cost.
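
    A toy version of the genetic-algorithm component can be sketched as follows. The cutting objective (material removal rate penalized by a crude tool-wear proxy) and the parameter bounds are invented for illustration; they are not taken from the paper's tool database.

```python
import random

def genetic_optimize(fitness, bounds, pop=40, gens=60, seed=0):
    """Tiny real-coded GA: tournament selection, blend crossover,
    Gaussian mutation. `bounds` is a list of (lo, hi) per parameter."""
    rng = random.Random(seed)
    def rand_ind():
        return [rng.uniform(lo, hi) for lo, hi in bounds]
    def clip(ind):
        return [min(max(v, lo), hi) for v, (lo, hi) in zip(ind, bounds)]
    population = [rand_ind() for _ in range(pop)]
    best = max(population, key=fitness)
    for _ in range(gens):
        nxt = []
        for _ in range(pop):
            a = max(rng.sample(population, 3), key=fitness)  # tournament
            b = max(rng.sample(population, 3), key=fitness)
            w = rng.random()                                 # blend crossover
            child = [w * x + (1 - w) * y for x, y in zip(a, b)]
            child = [v + rng.gauss(0, 0.05 * (hi - lo))      # mutation
                     for v, (lo, hi) in zip(child, bounds)]
            nxt.append(clip(child))
        population = nxt
        gen_best = max(population, key=fitness)
        if fitness(gen_best) > fitness(best):
            best = gen_best                                  # elitist memory
    return best

# Hypothetical objective: maximize material removal rate v*f while a
# tool-wear proxy v**1.5 * f stays below a limit (all numbers invented).
def mrr(params):
    v, f = params                     # cutting speed, feed
    wear = v ** 1.5 * f
    return v * f - 10.0 * max(0.0, wear - 20.0)
```

In practice the fitness would combine material removal rate, tool life and tool cost, as the multi-objective criteria in the abstract suggest.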

  3. The prediction of breast cancer biopsy outcomes using two CAD approaches that both emphasize an intelligible decision process

    International Nuclear Information System (INIS)

    Elter, M.; Schulz-Wendtland, R.; Wittenberg, T.

    2007-01-01

    Mammography is the most effective method for breast cancer screening available today. However, the low positive predictive value of breast biopsy resulting from mammogram interpretation leads to approximately 70% unnecessary biopsies with benign outcomes. To reduce the high number of unnecessary breast biopsies, several computer-aided diagnosis (CAD) systems have been proposed in recent years. These systems help physicians in their decision to perform a breast biopsy on a suspicious lesion seen in a mammogram or to perform a short-term follow-up examination instead. We present two novel CAD approaches that both emphasize an intelligible decision process to predict breast biopsy outcomes from BI-RADS findings. An intelligible reasoning process is an important requirement for the acceptance of CAD systems by physicians. The first approach induces a global model based on decision-tree learning. The second approach is based on case-based reasoning and applies an entropic similarity measure. We have evaluated the performance of both CAD approaches on two large publicly available mammography reference databases using receiver operating characteristic (ROC) analysis, bootstrap sampling, and the ANOVA statistical significance test. Both approaches outperform the diagnosis decisions of the physicians. Hence, both systems have the potential to reduce the number of unnecessary breast biopsies in clinical practice. A comparison of the performance of the proposed decision-tree and CBR approaches with a state-of-the-art approach based on artificial neural networks (ANN) shows that the CBR approach performs slightly better than the ANN approach, which in turn results in slightly better performance than the decision-tree approach. The differences are statistically significant (p value <0.001). On 2100 masses extracted from the DDSM database, the CBR approach for example resulted in an area under the ROC curve of A(z)=0.89±0.01, the decision-tree approach in A(z)=0.87±0
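
    The case-based reasoning idea can be illustrated with a minimal retrieve-and-vote classifier over categorical BI-RADS-style attributes. The simple matching similarity below stands in for the paper's entropic measure, and the feature names and cases are hypothetical.

```python
from collections import Counter

def cbr_predict(case, casebase, k=3):
    """Case-based reasoning sketch: retrieve the k most similar past
    cases (simple attribute matching) and return the majority biopsy
    outcome. `casebase` is a list of (features_dict, outcome) pairs."""
    def similarity(a, b):
        return sum(a[f] == b[f] for f in a)       # count matching attributes
    ranked = sorted(casebase,
                    key=lambda cb: similarity(case, cb[0]),
                    reverse=True)
    votes = Counter(outcome for _, outcome in ranked[:k])
    return votes.most_common(1)[0][0]

# Hypothetical mini case base (attribute values loosely BI-RADS-like)
casebase = [
    ({"shape": "round",     "margin": "circumscribed", "density": "low"},  "benign"),
    ({"shape": "round",     "margin": "circumscribed", "density": "iso"},  "benign"),
    ({"shape": "irregular", "margin": "spiculated",    "density": "high"}, "malignant"),
    ({"shape": "irregular", "margin": "ill-defined",   "density": "high"}, "malignant"),
    ({"shape": "lobular",   "margin": "circumscribed", "density": "low"},  "benign"),
]
```

The appeal for clinicians is that the retrieved cases themselves serve as the explanation of the prediction, which is the intelligibility property the abstract emphasizes.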

  4. Visual function and cognitive speed of processing mediate age-related decline in memory span and fluid intelligence.

    Science.gov (United States)

    Clay, Olivio J; Edwards, Jerri D; Ross, Lesley A; Okonkwo, Ozioma; Wadley, Virginia G; Roth, David L; Ball, Karlene K

    2009-06-01

    To evaluate the relationship between sensory and cognitive decline, particularly with respect to speed of processing, memory span, and fluid intelligence. In addition, the common cause, sensory degradation and speed of processing hypotheses were compared. Structural equation modeling was used to investigate the complex relationships among age-related decrements in these areas. Cross-sectional data analyses included 842 older adult participants (M = 73 years). After accounting for age-related declines in vision and processing speed, the direct associations between age and memory span and between age and fluid intelligence were nonsignificant. Older age was associated with visual decline, which was associated with slower speed of processing, which in turn was associated with greater cognitive deficits. The findings support both the sensory degradation and speed of processing accounts of age-related, cognitive decline. Furthermore, the findings highlight positive aspects of normal cognitive aging in that older age may not be associated with a loss of fluid intelligence if visual sensory functioning and processing speed can be maintained.
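
    The mediation logic behind the structural model can be illustrated with a toy regression check: if processing speed mediates the age effect, the direct age coefficient should shrink toward zero once speed is controlled. The data and coefficients below are synthetic; only the structure (age → speed → memory) mirrors the study.

```python
# Sketch of mediation: age affects processing speed, which affects memory
# span. Controlling for the mediator should shrink the direct age effect.
# Synthetic data; illustrative only, not the study's dataset.
import numpy as np

rng = np.random.default_rng(0)
n = 842  # sample size matching the study
age = rng.normal(73, 6, n)
speed = -0.8 * age + rng.normal(0, 1, n)    # older age -> slower processing
memory = 0.9 * speed + rng.normal(0, 1, n)  # slower processing -> lower span

def slope(y, predictors):
    """Coefficient on the first predictor in an OLS fit with intercept."""
    X = np.column_stack([np.ones(len(y))] + list(predictors))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1]

total = slope(memory, [age])           # total effect of age
direct = slope(memory, [age, speed])   # direct effect, mediator controlled
print(abs(direct) < abs(total))        # → True
```
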

  5. Large sample hydrology in NZ: Spatial organisation in process diagnostics

    Science.gov (United States)

    McMillan, H. K.; Woods, R. A.; Clark, M. P.

    2013-12-01

    A key question in hydrology is how to predict the dominant runoff generation processes in any given catchment. This knowledge is vital for a range of applications in forecasting hydrological response and related processes such as nutrient and sediment transport. A step towards this goal is to map dominant processes in locations where data is available. In this presentation, we use data from 900 flow gauging stations and 680 rain gauges in New Zealand, to assess hydrological processes. These catchments range in character from rolling pasture, to alluvial plains, to temperate rainforest, to volcanic areas. By taking advantage of so many flow regimes, we harness the benefits of large-sample and comparative hydrology to study patterns and spatial organisation in runoff processes, and their relationship to physical catchment characteristics. The approach we use to assess hydrological processes is based on the concept of diagnostic signatures. Diagnostic signatures in hydrology are targeted analyses of measured data which allow us to investigate specific aspects of catchment response. We apply signatures which target the water balance, the flood response and the recession behaviour. We explore the organisation, similarity and diversity in hydrological processes across the New Zealand landscape, and how these patterns change with scale. We discuss our findings in the context of the strong hydro-climatic gradients in New Zealand, and consider the implications for hydrological model building on a national scale.
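
    Two signatures of the kind applied above can be sketched for a short daily series (flow and precipitation in mm/day; all numbers invented, not the NZ gauge data):

```python
# Two diagnostic signatures: the runoff ratio (water balance) and a
# recession constant fitted to strictly declining-flow steps.
# Hypothetical daily series; not data from the New Zealand gauges.

def runoff_ratio(flow_mm, precip_mm):
    """Fraction of precipitation leaving the catchment as streamflow."""
    return sum(flow_mm) / sum(precip_mm)

def recession_constant(flow):
    """Mean ratio Q(t+1)/Q(t) over strictly receding steps."""
    ratios = [b / a for a, b in zip(flow, flow[1:]) if b < a and a > 0]
    return sum(ratios) / len(ratios)

flow = [10.0, 8.0, 6.4, 5.12, 6.0, 4.8]
precip = [30.0, 10.0, 5.0, 0.0, 15.0, 5.0]
print(round(runoff_ratio(flow, precip), 2),
      round(recession_constant(flow), 2))  # → 0.62 0.8
```

    A recession constant near 1 indicates slow drainage (e.g. groundwater-dominated response); values well below 1 indicate flashy catchments.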

  6. Planning pesticides usage for herbal and animal pests based on intelligent classification system with image processing and neural networks

    Directory of Open Access Journals (Sweden)

    Dimililer Kamil

    2018-01-01

    Pests in agriculture are divided into two groups, herbal and animal, and detecting them while using minimal pesticides is a challenging task. Over the last three decades, researchers have been improving their studies on these matters; therefore, effective, efficient, as well as intelligent systems are designed and modelled. In this paper, an intelligent classification system is designed for detecting pests as herbal or animal in order to apply the proper pesticides accordingly. The designed system comprises two main stages. First, images are processed using different image processing techniques, since the images have specific distinguishing geometric patterns. The second stage is a neural network phase for classification. A backpropagation neural network is used for training and testing with the processed images. The system is tested, and experimental results show an efficient and effective classification rate. Autonomy and time efficiency of pesticide usage are also discussed.
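
    The two-stage design above (geometric features extracted from images, then a trainable classifier) can be reduced to a minimal sketch: one invented feature ("elongation") and a single perceptron unit standing in for the backpropagation network. Feature values and labels are placeholders, not the paper's data.

```python
# Minimal stand-in for the classification stage: a single perceptron unit
# trained on one hypothetical geometric feature ("elongation").
# The full system uses a backpropagation network on processed images.

def train_perceptron(samples, labels, epochs=20, lr=0.1):
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, t in zip(samples, labels):
            y = 1 if w * x + b > 0 else 0
            w += lr * (t - y) * x   # standard perceptron update
            b += lr * (t - y)
    return w, b

# invented feature: animal pests (label 1) rounder -> lower elongation
features = [0.9, 0.8, 0.85, 0.2, 0.3, 0.25]
labels = [0, 0, 0, 1, 1, 1]
w, b = train_perceptron(features, labels)
preds = [1 if w * x + b > 0 else 0 for x in features]
print(preds == labels)  # → True
```
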

  7. 2015 Chinese Intelligent Systems Conference

    CERN Document Server

    Du, Junping; Li, Hongbo; Zhang, Weicun; CISC’15

    2016-01-01

    This book presents selected research papers from the 2015 Chinese Intelligent Systems Conference (CISC’15), held in Yangzhou, China. The topics covered include multi-agent systems, evolutionary computation, artificial intelligence, complex systems, computation intelligence and soft computing, intelligent control, advanced control technology, robotics and applications, intelligent information processing, iterative learning control, and machine learning. Engineers and researchers from academia, industry and the government can gain valuable insights into solutions combining ideas from multiple disciplines in the field of intelligent systems.

  8. Artificial intelligence framework for simulating clinical decision-making: a Markov decision process approach.

    Science.gov (United States)

    Bennett, Casey C; Hauser, Kris

    2013-01-01

    In the modern healthcare system, rapidly expanding costs/complexity, the growing myriad of treatment options, and exploding information streams that often do not effectively reach the front lines hinder the ability to choose optimal treatment decisions over time. The goal in this paper is to develop a general purpose (non-disease-specific) computational/artificial intelligence (AI) framework to address these challenges. This framework serves two potential functions: (1) a simulation environment for exploring various healthcare policies, payment methodologies, etc., and (2) the basis for clinical artificial intelligence - an AI that can "think like a doctor". This approach combines Markov decision processes and dynamic decision networks to learn from clinical data and develop complex plans via simulation of alternative sequential decision paths while capturing the sometimes conflicting, sometimes synergistic interactions of various components in the healthcare system. It can operate in partially observable environments (in the case of missing observations or data) by maintaining belief states about patient health status and functions as an online agent that plans and re-plans as actions are performed and new observations are obtained. This framework was evaluated using real patient data from an electronic health record. The results demonstrate the feasibility of this approach; such an AI framework easily outperforms the current treatment-as-usual (TAU) case-rate/fee-for-service models of healthcare. The cost per unit of outcome change (CPUC) was $189 vs. $497 for AI vs. TAU (where lower is considered optimal) - while at the same time the AI approach could obtain a 30-35% increase in patient outcomes. Tweaking certain AI model parameters could further enhance this advantage, obtaining approximately 50% more improvement (outcome change) for roughly half the costs. Given careful design and problem formulation, an AI simulation framework can approximate optimal
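
    The sequential-decision core of such a framework can be sketched as value iteration on a toy two-state MDP. The states, actions, rewards, and transition probabilities below are invented for illustration; the paper's framework learns its model from clinical data and additionally maintains belief states for partial observability.

```python
# Toy treatment-planning MDP: states "sick"/"well", actions "treat"/"wait".
# All probabilities and rewards are invented; the paper learns these from
# electronic health records and plans over belief states.
GAMMA = 0.95
# transitions[state][action] = [(prob, next_state, reward), ...]
transitions = {
    "sick": {
        "treat": [(0.6, "well", -1.0), (0.4, "sick", -1.0)],
        "wait":  [(0.1, "well",  0.0), (0.9, "sick", -0.5)],
    },
    "well": {
        "treat": [(0.9, "well", -1.0), (0.1, "sick", -1.0)],
        "wait":  [(0.8, "well",  1.0), (0.2, "sick",  1.0)],
    },
}

def value_iteration(eps=1e-6):
    V = {s: 0.0 for s in transitions}
    def q(s, a):  # expected discounted return of action a in state s
        return sum(p * (r + GAMMA * V[s2]) for p, s2, r in transitions[s][a])
    while True:
        delta = 0.0
        for s in transitions:
            best = max(q(s, a) for a in transitions[s])
            delta = max(delta, abs(best - V[s]))
            V[s] = best
        if delta < eps:
            break
    policy = {s: max(transitions[s], key=lambda a: q(s, a)) for s in transitions}
    return V, policy

V, policy = value_iteration()
print(policy["sick"], policy["well"])
```

    With these invented numbers the optimal plan treats when sick and waits when well; the interesting part is that the same machinery scales to learned, partially observed clinical models.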

  9. Modeling of steam distillation mechanism during steam injection process using artificial intelligence.

    Science.gov (United States)

    Daryasafar, Amin; Ahadi, Arash; Kharrat, Riyaz

    2014-01-01

    Steam distillation, as one of the important mechanisms, has a great role in oil recovery in thermal methods, so it is important to simulate this process experimentally and theoretically. In this work, the simulation of steam distillation is performed on sixteen sets of crude oil data found in the literature. Artificial intelligence (AI) tools such as artificial neural networks (ANN) and also the adaptive neurofuzzy inference system (ANFIS) are used in this study as effective methods to simulate the distillate recoveries of these sets of data. Thirteen sets of data were used to train the models and three sets were used to test the models. The developed models are highly compatible with respect to input oil properties and can predict the distillate yield with minimal input data. For showing the performance of the proposed models, simulation of steam distillation is also done using a modified Peng-Robinson equation of state. Comparison between the calculated distillates by the ANFIS and neural network models and also the equation of state-based method indicates that the errors of the ANFIS model for the training and test data sets are lower than those of the other methods.

  10. Digital image analysis in breast pathology-from image processing techniques to artificial intelligence.

    Science.gov (United States)

    Robertson, Stephanie; Azizpour, Hossein; Smith, Kevin; Hartman, Johan

    2018-04-01

    Breast cancer is the most common malignant disease in women worldwide. In recent decades, earlier diagnosis and better adjuvant therapy have substantially improved patient outcome. Diagnosis by histopathology has proven to be instrumental to guide breast cancer treatment, but new challenges have emerged as our increasing understanding of cancer over the years has revealed its complex nature. As patient demand for personalized breast cancer therapy grows, we face an urgent need for more precise biomarker assessment and more accurate histopathologic breast cancer diagnosis to make better therapy decisions. The digitization of pathology data has opened the door to faster, more reproducible, and more precise diagnoses through computerized image analysis. Software to assist diagnostic breast pathology through image processing techniques have been around for years. But recent breakthroughs in artificial intelligence (AI) promise to fundamentally change the way we detect and treat breast cancer in the near future. Machine learning, a subfield of AI that applies statistical methods to learn from data, has seen an explosion of interest in recent years because of its ability to recognize patterns in data with less need for human instruction. One technique in particular, known as deep learning, has produced groundbreaking results in many important problems including image classification and speech recognition. In this review, we will cover the use of AI and deep learning in diagnostic breast pathology, and other recent developments in digital image analysis. Copyright © 2017 Elsevier Inc. All rights reserved.

  11. Modeling of Steam Distillation Mechanism during Steam Injection Process Using Artificial Intelligence

    Science.gov (United States)

    Ahadi, Arash; Kharrat, Riyaz

    2014-01-01

    Steam distillation, as one of the important mechanisms, has a great role in oil recovery in thermal methods, so it is important to simulate this process experimentally and theoretically. In this work, the simulation of steam distillation is performed on sixteen sets of crude oil data found in the literature. Artificial intelligence (AI) tools such as artificial neural networks (ANN) and also the adaptive neurofuzzy inference system (ANFIS) are used in this study as effective methods to simulate the distillate recoveries of these sets of data. Thirteen sets of data were used to train the models and three sets were used to test the models. The developed models are highly compatible with respect to input oil properties and can predict the distillate yield with minimal input data. For showing the performance of the proposed models, simulation of steam distillation is also done using a modified Peng-Robinson equation of state. Comparison between the calculated distillates by the ANFIS and neural network models and also the equation of state-based method indicates that the errors of the ANFIS model for the training and test data sets are lower than those of the other methods. PMID:24883365

  12. Prediction of Surface Roughness in End Milling Process Using Intelligent Systems: A Comparative Study

    Directory of Open Access Journals (Sweden)

    Abdel Badie Sharkawy

    2011-01-01

    A study is presented to model surface roughness in the end milling process. Three types of intelligent networks have been considered: (i) radial basis function neural networks (RBFNs), (ii) adaptive neurofuzzy inference systems (ANFISs), and (iii) genetically evolved fuzzy inference systems (G-FISs). The machining parameters, namely, the spindle speed, feed rate, and depth of cut, have been used as inputs to model the workpiece surface roughness. The goal is to get the best prediction accuracy. The procedure is illustrated using experimental data from end milling 6061 aluminum alloy. The three networks have been trained using experimental training data. After training, they have been examined using another set of data, that is, validation data. Results are compared with previously published results. It is concluded that ANFIS networks may suffer from the local minima problem, and genetic tuning of fuzzy networks cannot ensure perfect optimality unless suitable parameter settings (population size, number of generations, etc.) and tuning ranges for the FIS parameters are used, which can hardly be satisfied. It is shown that the RBFN model has the best performance (prediction accuracy) in this particular case.
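
    A minimal RBFN of the kind that performed best above: Gaussian kernels centered on the training points, with the output weights solved by linear least squares. The machining inputs and roughness targets below are invented placeholders, not the paper's measurements.

```python
# Minimal radial basis function network (RBFN) sketch for roughness
# regression. Training data are invented placeholders, not the paper's.
import numpy as np

def rbf_design(X, centers, width):
    """Gaussian kernel activation of each sample at each center."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

# columns: spindle speed, feed rate, depth of cut (normalized); target: Ra
X = np.array([[0.1, 0.2, 0.1],
              [0.5, 0.4, 0.3],
              [0.9, 0.8, 0.6],
              [0.3, 0.9, 0.2]])
y = np.array([1.2, 0.9, 0.6, 1.0])

G = rbf_design(X, X, width=0.5)              # centers = training points
w, *_ = np.linalg.lstsq(G, y, rcond=None)    # output weights by least squares
y_hat = G @ w
print(np.allclose(y_hat, y, atol=1e-6))      # exact interpolation here
```

    With as many centers as training points the network interpolates exactly; in practice fewer centers and a held-out validation set (as in the paper) guard against overfitting.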

  13. Driver's various information process and multi-ruled decision-making mechanism: a fundamental of intelligent driving shaping model

    Directory of Open Access Journals (Sweden)

    Wuhong Wang

    2011-05-01

    The most difficult but important problem in advanced driver assistance system development is how to measure and model the behavioral response of drivers, focusing on the cognition process. This paper describes the driver's deceleration and acceleration behavior based on driving situation awareness in the car-following process, and then presents several driving models for the analysis of the driver's safe approaching behavior in traffic operation. The emphasis of our work is placed on the research of the driver's various information processing and multi-ruled decision-making mechanism, considering the complicated control process of driving; the results will be able to provide a theoretical basis for an intelligent driving shaping model.

  14. Artificial intelligence versus statistical modeling and optimization of continuous bead milling process for bacterial cell lysis

    Directory of Open Access Journals (Sweden)

    Shafiul Haque

    2016-11-01

    For a commercially viable recombinant intracellular protein production process, efficient cell lysis and protein release is a major bottleneck. The recovery of a recombinant protein, cholesterol oxidase (COD), was studied in a continuous bead milling process. A full factorial Response Surface Model (RSM) design was employed and compared to Artificial Neural Networks coupled with a Genetic Algorithm (ANN-GA). Significant process variables, cell slurry feed rate (A), bead load (B), cell load (C) and run time (D), were investigated and optimized for maximizing COD recovery. RSM predicted an optimum feed rate of 310.73 mL/h, bead loading of 79.9% (v/v), cell loading (OD600 nm) of 74, and run time of 29.9 min, with a recovery of ~3.2 g/L. ANN coupled with GA predicted a maximum COD recovery of ~3.5 g/L at an optimum feed rate (mL/h) of 258.08, bead loading (%, v/v) of 80, cell loading (OD600 nm) of 73.99, and run time of 32 min. An overall 3.7-fold increase in productivity is obtained when compared to a batch process. Optimization and comparison of statistical vs. artificial intelligence techniques in a continuous bead milling process has been attempted for the very first time in our study. We were able to successfully represent the complex non-linear multivariable dependence of enzyme recovery on bead milling parameters. The quadratic second-order response functions are not flexible enough to represent such complex non-linear dependence. ANN, being a summation function of multiple layers, is capable of representing the complex non-linear dependence of variables, in this case enzyme recovery as a function of bead milling parameters. Since GA can even optimize discontinuous functions, the present study cites a perfect example of using machine learning (ANN) in combination with evolutionary optimization (GA) for representing undefined biological functions, which is the case for common industrial processes involving biological moieties.
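
    The RSM side of the comparison boils down to fitting a low-order polynomial response surface to screening data and locating its stationary point. A one-variable sketch with synthetic recovery data (not the study's measurements):

```python
# One-variable response-surface sketch: fit y = a + b*x + c*x^2 to
# screening runs and locate the vertex. Data are synthetic placeholders.
import numpy as np

x = np.array([100.0, 200.0, 300.0, 400.0, 500.0])  # feed rate, mL/h
y = np.array([1.8, 2.7, 3.2, 3.1, 2.4])            # COD recovery, g/L

a, b, c = np.polyfit(x, y, 2)[::-1]  # coefficients in ascending order
x_opt = -b / (2 * c)                 # vertex of the fitted parabola
print(c < 0, 200 < x_opt < 400)      # concave surface, interior optimum
```

    The real study fits a full second-order model in four factors; the stationary point then comes from solving the gradient system rather than a single vertex formula.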

  15. Large Eddy Simulation of Cryogenic Injection Processes at Supercritical Pressure

    Science.gov (United States)

    Oefelein, Joseph C.

    2002-01-01

    This paper highlights results from the first of a series of hierarchical simulations aimed at assessing the modeling requirements for application of the large eddy simulation technique to cryogenic injection and combustion processes in liquid rocket engines. The focus is on liquid-oxygen-hydrogen coaxial injectors at a condition where the liquid-oxygen is injected at a subcritical temperature into a supercritical environment. For this situation a diffusion dominated mode of combustion occurs in the presence of exceedingly large thermophysical property gradients. Though continuous, these gradients approach the behavior of a contact discontinuity. Significant real gas effects and transport anomalies coexist locally in colder regions of the flow, with ideal gas and transport characteristics occurring within the flame zone. The current focal point is on the interfacial region between the liquid-oxygen core and the coaxial hydrogen jet where the flame anchors itself.

  16. 9th International Symposium on Intelligent Distributed Computing

    CERN Document Server

    Camacho, David; Analide, Cesar; Seghrouchni, Amal; Badica, Costin

    2016-01-01

    This book represents the combined peer-reviewed proceedings of the ninth International Symposium on Intelligent Distributed Computing – IDC’2015, of the Workshop on Cyber Security and Resilience of Large-Scale Systems – WSRL’2015, and of the International Workshop on Future Internet and Smart Networks – FI&SN’2015. All the events were held in Guimarães, Portugal during October 7th-9th, 2015. The 46 contributions published in this book address many topics related to theory and applications of intelligent distributed computing, including: Intelligent Distributed Agent-Based Systems, Ambient Intelligence and Social Networks, Computational Sustainability, Intelligent Distributed Knowledge Representation and Processing, Smart Networks, Networked Intelligence and Intelligent Distributed Applications, amongst others.

  17. Artificial intelligence for the modeling and control of combustion processes: a review

    Energy Technology Data Exchange (ETDEWEB)

    Kalogirou, S.A. [Higher Technical Inst., Nicosia, Cyprus (Greece). Dept. of Mechanical Engineering

    2003-07-01

    Artificial intelligence (AI) systems are widely accepted as a technology offering an alternative way to tackle complex and ill-defined problems. They can learn from examples, are fault tolerant in the sense that they are able to handle noisy and incomplete data, are able to deal with non-linear problems, and once trained can perform prediction and generalization at high speed. They have been used in diverse applications in control, robotics, pattern recognition, forecasting, medicine, power systems, manufacturing, optimization, signal processing, and social/psychological sciences. They are particularly useful in system modeling such as in implementing complex mappings and system identification. AI systems comprise areas like expert systems, artificial neural networks, genetic algorithms, fuzzy logic and various hybrid systems, which combine two or more techniques. The major objective of this paper is to illustrate how AI techniques might play an important role in the modeling and prediction of the performance and control of combustion processes. The paper outlines an understanding of how AI systems operate by way of presenting a number of problems in the different disciplines of combustion engineering. The various applications of AI are presented in a thematic rather than a chronological or any other order. Problems presented include two main areas: combustion systems and internal combustion (IC) engines. Combustion systems include boilers, furnaces and incinerators modeling and emissions prediction, whereas IC engines include diesel and spark ignition engines and gas engines modeling and control. Results presented in this paper are testimony to the potential of AI as a design tool in many areas of combustion engineering. (author)

  18. Artificial intelligence for the modeling and control of combustion processes: a review

    Energy Technology Data Exchange (ETDEWEB)

    Soteris A. Kalogirou, [Higher Technical Institute, Nicosia (Cyprus). Department of Mechanical Engineering

    2003-07-01

    Artificial intelligence (AI) systems are widely accepted as a technology offering an alternative way to tackle complex and ill-defined problems. They can learn from examples, are fault tolerant in the sense that they are able to handle noisy and incomplete data, are able to deal with non-linear problems, and once trained can perform prediction and generalization at high speed. They have been used in diverse applications in control, robotics, pattern recognition, forecasting, medicine, power systems, manufacturing, optimization, signal processing, and social/psychological sciences. They are particularly useful in system modeling such as in implementing complex mappings and system identification. AI systems comprise areas like, expert systems, artificial neural networks, genetic algorithms, fuzzy logic and various hybrid systems, which combine two or more techniques. The major objective of this paper is to illustrate how AI techniques might play an important role in modeling and prediction of the performance and control of combustion process. The paper outlines an understanding of how AI systems operate by way of presenting a number of problems in the different disciplines of combustion engineering. The various applications of AI are presented in a thematic rather than a chronological or any other order. Problems presented include two main areas: combustion systems and internal combustion (IC) engines. Combustion systems include boilers, furnaces and incinerators modeling and emissions prediction, whereas, IC engines include diesel and spark ignition engines and gas engines modeling and control. Results presented in this paper, are testimony to the potential of AI as a design tool in many areas of combustion engineering. 109 refs., 31 figs., 11 tabs.

  19. On Building and Processing of Large Digitalized Map Archive

    Directory of Open Access Journals (Sweden)

    Milan Simunek

    2011-07-01

    Full Text Available A tall list of problems needs to be solved during a long-time work on a virtual model of Prague aim of which is to show historical development of the city in virtual reality. This paper presents an integrated solution to digitalizing, cataloguing and processing of a large number of maps from different periods and from variety of sources. A specialized (GIS software application was developed to allow for a fast georeferencing (using an evolutionary algorithm, for cataloguing in an internal database, and subsequently for an easy lookup of relevant maps. So the maps could be processed further to serve as a main input for a proper modeling of a changing face of the city through times.

  20. Possible implications of large scale radiation processing of food

    International Nuclear Information System (INIS)

    Zagorski, Z.P.

    1990-01-01

    Large scale irradiation has been discussed in terms of the participation of processing cost in the final value of the improved product. Another factor has been taken into account and that is the saturation of the market with the new product. In the case of successful projects the participation of irradiation cost is low, and the demand for the better product is covered. A limited availability of sources makes the modest saturation of the market difficult with all food subjected to correct radiation treatment. The implementation of the preservation of food needs a decided selection of these kinds of food which comply to all conditions i.e. of acceptance by regulatory bodies, real improvement of quality and economy. The last condition prefers the possibility of use of electron beams of low energy. The best fulfilment of conditions for successful processing is observed in the group of dry food, in expensive spices in particular. (author)

  1. Possible implications of large scale radiation processing of food

    Science.gov (United States)

    Zagórski, Z. P.

    Large scale irradiation has been discussed in terms of the participation of processing cost in the final value of the improved product. Another factor has been taken into account and that is the saturation of the market with the new product. In the case of successful projects the participation of irradiation cost is low, and the demand for the better product is covered. A limited availability of sources makes the modest saturation of the market difficult with all food subjected to correct radiation treatment. The implementation of the preservation of food needs a decided selection of these kinds of food which comply to all conditions i.e. of acceptance by regulatory bodies, real improvement of quality and economy. The last condition prefers the possibility of use of electron beams of low energy. The best fulfilment of conditions for successful processing is observed in the group of dry food, in expensive spices in particular.

  2. Hierarchical optimal control of large-scale nonlinear chemical processes.

    Science.gov (United States)

    Ramezani, Mohammad Hossein; Sadati, Nasser

    2009-01-01

    In this paper, a new approach is presented for optimal control of large-scale chemical processes. In this approach, the chemical process is decomposed into smaller sub-systems at the first level, and a coordinator at the second level, for which a two-level hierarchical control strategy is designed. For this purpose, each sub-system in the first level can be solved separately, by using any conventional optimization algorithm. In the second level, the solutions obtained from the first level are coordinated using a new gradient-type strategy, which is updated by the error of the coordination vector. The proposed algorithm is used to solve the optimal control problem of a complex nonlinear chemical stirred tank reactor (CSTR), where its solution is also compared with the ones obtained using the centralized approach. The simulation results show the efficiency and the capability of the proposed hierarchical approach, in finding the optimal solution, over the centralized method.
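
    The two-level structure described above can be sketched with a gradient-type coordinator on a small separable problem: each sub-problem minimizes independently given a coordinator price, and the price is updated by the coupling error. The quadratic sub-problems and step size below are invented for illustration; the paper applies the scheme to a nonlinear CSTR.

```python
# Two-level decomposition sketch (dual/price coordination):
#   min (x1-3)^2 + (x2-1)^2  subject to  x1 + x2 = 2
# Sub-problems solve their own minimization for a given price lam;
# the coordinator moves lam by the coordination error (gradient-type step).
# Problem data are invented; the paper treats a nonlinear CSTR.

def sub1(lam):  # argmin of (x1-3)^2 + lam*x1  ->  x1 = 3 - lam/2
    return 3.0 - lam / 2.0

def sub2(lam):  # argmin of (x2-1)^2 + lam*x2  ->  x2 = 1 - lam/2
    return 1.0 - lam / 2.0

lam, step = 0.0, 0.5
for _ in range(200):
    x1, x2 = sub1(lam), sub2(lam)
    lam += step * (x1 + x2 - 2.0)  # coordinator: price follows the error

print(round(x1, 3), round(x2, 3))  # → 2.0 0.0 (the constrained optimum)
```

    The centralized solution of the same problem (Lagrange conditions give x1 = 2, x2 = 0) matches the coordinated one, mirroring the paper's comparison against the centralized approach.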

  3. Combined process automation for large-scale EEG analysis.

    Science.gov (United States)

    Sfondouris, John L; Quebedeaux, Tabitha M; Holdgraf, Chris; Musto, Alberto E

    2012-01-01

    Epileptogenesis is a dynamic process producing increased seizure susceptibility. Electroencephalography (EEG) data provides information critical in understanding the evolution of epileptiform changes throughout epileptic foci. We designed an algorithm to facilitate efficient large-scale EEG analysis via linked automation of multiple data processing steps. Using EEG recordings obtained from electrical stimulation studies, the following steps of EEG analysis were automated: (1) alignment and isolation of pre- and post-stimulation intervals, (2) generation of user-defined band frequency waveforms, (3) spike-sorting, (4) quantification of spike and burst data and (5) power spectral density analysis. This algorithm allows for quicker, more efficient EEG analysis. Copyright © 2011 Elsevier Ltd. All rights reserved.
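
    Steps (1), (3) and (4) of the chained pipeline can be sketched as linked functions over one trace. The synthetic signal, threshold, and window lengths are illustrative choices, not the study's parameters.

```python
# Linked-automation sketch: isolate pre-/post-stimulation intervals, then
# detect and count threshold-crossing spikes in each. Synthetic trace;
# threshold and window sizes are illustrative, not the study's settings.

def isolate(trace, stim_idx, pre, post):
    """Split the trace into pre- and post-stimulation segments."""
    return trace[stim_idx - pre:stim_idx], trace[stim_idx:stim_idx + post]

def detect_spikes(segment, threshold):
    """Indices where the signal crosses the threshold upward."""
    return [i for i in range(1, len(segment))
            if segment[i - 1] < threshold <= segment[i]]

trace = [0.0, 0.1, 0.0, 2.5, 0.2, 0.1, 3.0, 0.1, 2.8, 0.0, 0.1, 0.0]
pre_seg, post_seg = isolate(trace, stim_idx=4, pre=4, post=8)
spikes_pre = detect_spikes(pre_seg, threshold=1.0)
spikes_post = detect_spikes(post_seg, threshold=1.0)
print(len(spikes_pre), len(spikes_post))  # → 1 2
```

    Chaining such steps behind one entry point is what turns per-recording analysis into the large-scale batch processing the paper describes.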

  4. Applicability of vector processing to large-scale nuclear codes

    International Nuclear Information System (INIS)

    Ishiguro, Misako; Harada, Hiroo; Matsuura, Toshihiko; Okuda, Motoi; Ohta, Fumio; Umeya, Makoto.

    1982-03-01

    To meet the growing trend of computational requirements in JAERI, introduction of a high-speed computer with vector processing faculty (a vector processor) is desirable in the near future. To make effective use of a vector processor, appropriate optimization of nuclear codes to pipelined-vector architecture is vital, which will pose new problems concerning code development and maintenance. In this report, vector processing efficiency is assessed with respect to large-scale nuclear codes by examining the following items: 1) The present feature of computational load in JAERI is analyzed by compiling the computer utilization statistics. 2) Vector processing efficiency is estimated for the ten heavily-used nuclear codes by analyzing their dynamic behaviors run on a scalar machine. 3) Vector processing efficiency is measured for the other five nuclear codes by using the current vector processors, FACOM 230-75 APU and CRAY-1. 4) Effectiveness of applying a high-speed vector processor to nuclear codes is evaluated by taking account of the characteristics in JAERI jobs. Problems of vector processors are also discussed from the view points of code performance and ease of use. (author)
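
    The optimization the report assesses comes from replacing element-by-element loops with whole-array (pipelined-vector) operations. NumPy mimics that programming model, so the contrast can be sketched as follows; this is a loose analogy, not the FACOM or CRAY instruction sets.

```python
# Scalar loop vs whole-vector operation for the classic saxpy kernel
# (a*x + y). Both forms compute the same result; on a vector processor
# only the second maps onto the pipelined hardware.
import numpy as np

def saxpy_scalar(a, x, y):
    out = [0.0] * len(x)
    for i in range(len(x)):   # one element at a time
        out[i] = a * x[i] + y[i]
    return out

def saxpy_vector(a, x, y):
    return a * x + y          # whole-vector operation

x = np.arange(5, dtype=float)
y = np.ones(5)
print(np.allclose(saxpy_scalar(2.0, x, y), saxpy_vector(2.0, x, y)))  # → True
```

    The report's code-by-code assessment amounts to asking how much of each nuclear code's inner-loop work can be expressed in the second form.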

  5. Processing and properties of large-sized ceramic slabs

    Energy Technology Data Exchange (ETDEWEB)

    Raimondo, M.; Dondi, M.; Zanelli, C.; Guarini, G.; Gozzi, A.; Marani, F.; Fossa, L.

    2010-07-01

    Large-sized ceramic slabs with dimensions up to 360×120 cm² and thickness down to 2 mm are manufactured through an innovative ceramic process, starting from porcelain stoneware formulations and involving wet ball milling, spray drying, die-less slow-rate pressing, a single stage of fast drying-firing, and finishing (trimming, assembling of ceramic-fiberglass composites). Fired and unfired industrial slabs were selected and characterized from the technological, compositional (XRF, XRD) and microstructural (SEM) viewpoints. Semi-finished products exhibit a remarkable microstructural uniformity and stability in a rather wide window of firing schedules. The phase composition and compact microstructure of fired slabs are very similar to those of porcelain stoneware tiles. The values of water absorption, bulk density, closed porosity, functional performances as well as mechanical and tribological properties conform to the top quality range of porcelain stoneware tiles. However, the large size coupled with low thickness bestow on the slab a certain degree of flexibility, which is emphasized in ceramic-fiberglass composites. These outstanding performances make the large-sized slabs suitable to be used in novel applications: building and construction (new floorings without dismantling the previous paving, ventilated facades, tunnel coverings, insulating panelling), indoor furniture (table tops, doors), support for photovoltaic ceramic panels. (Author) 24 refs.

  6. Processing and properties of large-sized ceramic slabs

    International Nuclear Information System (INIS)

    Raimondo, M.; Dondi, M.; Zanelli, C.; Guarini, G.; Gozzi, A.; Marani, F.; Fossa, L.

    2010-01-01

    Large-sized ceramic slabs with dimensions up to 360×120 cm² and thickness down to 2 mm are manufactured through an innovative ceramic process, starting from porcelain stoneware formulations and involving wet ball milling, spray drying, die-less slow-rate pressing, a single stage of fast drying-firing, and finishing (trimming, assembling of ceramic-fiberglass composites). Fired and unfired industrial slabs were selected and characterized from the technological, compositional (XRF, XRD) and microstructural (SEM) viewpoints. Semi-finished products exhibit a remarkable microstructural uniformity and stability in a rather wide window of firing schedules. The phase composition and compact microstructure of fired slabs are very similar to those of porcelain stoneware tiles. The values of water absorption, bulk density, closed porosity, functional performances as well as mechanical and tribological properties conform to the top quality range of porcelain stoneware tiles. However, the large size coupled with low thickness bestow on the slab a certain degree of flexibility, which is emphasized in ceramic-fiberglass composites. These outstanding performances make the large-sized slabs suitable to be used in novel applications: building and construction (new floorings without dismantling the previous paving, ventilated facades, tunnel coverings, insulating panelling), indoor furniture (table tops, doors), support for photovoltaic ceramic panels. (Author) 24 refs.

  7. Processing and properties of large grain (RE)BCO

    International Nuclear Information System (INIS)

    Cardwell, D.A.

    1998-01-01

    The potential of high temperature superconductors to generate large magnetic fields and to carry current with low power dissipation at 77 K is particularly attractive for a variety of permanent magnet applications. As a result large grain bulk (RE)-Ba-Cu-O ((RE)BCO) materials have been developed by melt process techniques in an attempt to fabricate practical materials for use in high field devices. This review outlines the current state of the art in this field of processing, including seeding requirements for the controlled fabrication of these materials, the origin of striking growth features such as the formation of a facet plane around the seed, platelet boundaries and (RE)2BaCuO5 (RE-211) inclusions in the seeded melt grown microstructure. An observed variation in critical current density in large grain (RE)BCO samples is accounted for by Sm contamination of the material in the vicinity of the seed and with the development of a non-uniform growth morphology at ∼4 mm from the seed position. (RE)Ba2Cu3O7-δ (RE-123) dendrites are observed to form and broaden preferentially within the a/b plane of the lattice in this growth regime. Finally, trapped fields in excess of 3 T have been reported in irradiated U-doped YBCO and (RE)1+xBa2-xCu3Oy (RE=Sm, Nd) materials have been observed to carry transport current in fields of up to 10 T at 77 K. This underlines the potential of bulk (RE)BCO materials for practical permanent magnet type applications. (orig.)

  8. Computer aided diagnosis based on medical image processing and artificial intelligence methods

    Science.gov (United States)

    Stoitsis, John; Valavanis, Ioannis; Mougiakakou, Stavroula G.; Golemati, Spyretta; Nikita, Alexandra; Nikita, Konstantina S.

    2006-12-01

    Advances in imaging technology and computer science have greatly enhanced interpretation of medical images, and contributed to early diagnosis. The typical architecture of a Computer Aided Diagnosis (CAD) system includes image pre-processing, definition of region(s) of interest, feature extraction and selection, and classification. In this paper, the principles of CAD systems design and development are demonstrated by means of two examples. The first one focuses on the differentiation between symptomatic and asymptomatic carotid atheromatous plaques. For each plaque, a vector of texture and motion features was estimated, which was then reduced to the most robust ones by means of ANalysis of VAriance (ANOVA). Using fuzzy c-means, the features were then clustered into two classes. Clustering performances of 74%, 79%, and 84% were achieved for texture only, motion only, and combinations of texture and motion features, respectively. The second CAD system presented in this paper supports the diagnosis of focal liver lesions and is able to characterize liver tissue from Computed Tomography (CT) images as normal, hepatic cyst, hemangioma, and hepatocellular carcinoma. Five texture feature sets were extracted for each lesion, while a genetic algorithm based feature selection method was applied to identify the most robust features. The selected feature set was fed into an ensemble of neural network classifiers. The achieved classification performance was 100%, 93.75% and 90.63% in the training, validation and testing set, respectively. It is concluded that computerized analysis of medical images in combination with artificial intelligence can be used in clinical practice and may contribute to more efficient diagnosis.
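    Fuzzy c-means, the unsupervised clustering step used for the plaque example, assigns each sample a graded membership in every cluster rather than a hard label. The sketch below is a minimal generic implementation (not the authors' pipeline; the synthetic two-cluster data merely stands in for plaque feature vectors):

    ```python
    import numpy as np

    def fuzzy_c_means(X, c=2, m=2.0, n_iter=100, seed=0):
        """Minimal fuzzy c-means: returns (cluster centers, membership matrix U)."""
        rng = np.random.default_rng(seed)
        U = rng.random((X.shape[0], c))
        U /= U.sum(axis=1, keepdims=True)        # memberships sum to 1 per sample
        for _ in range(n_iter):
            W = U ** m                           # fuzzified memberships
            centers = (W.T @ X) / W.sum(axis=0)[:, None]
            d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
            inv = d ** (-2.0 / (m - 1.0))        # standard FCM membership update
            U = inv / inv.sum(axis=1, keepdims=True)
        return centers, U

    # Two well-separated synthetic clusters stand in for plaque feature vectors.
    rng = np.random.default_rng(42)
    X = np.vstack([rng.normal(0.0, 0.1, (20, 2)), rng.normal(3.0, 0.1, (20, 2))])
    centers, U = fuzzy_c_means(X, c=2)
    labels = U.argmax(axis=1)                    # hard labels via maximum membership
    ```

    The fuzzifier `m` controls how soft the memberships are; `m → 1` recovers ordinary k-means behavior.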

  9. Computer aided diagnosis based on medical image processing and artificial intelligence methods

    International Nuclear Information System (INIS)

    Stoitsis, John; Valavanis, Ioannis; Mougiakakou, Stavroula G.; Golemati, Spyretta; Nikita, Alexandra; Nikita, Konstantina S.

    2006-01-01

    Advances in imaging technology and computer science have greatly enhanced interpretation of medical images, and contributed to early diagnosis. The typical architecture of a Computer Aided Diagnosis (CAD) system includes image pre-processing, definition of region(s) of interest, feature extraction and selection, and classification. In this paper, the principles of CAD systems design and development are demonstrated by means of two examples. The first one focuses on the differentiation between symptomatic and asymptomatic carotid atheromatous plaques. For each plaque, a vector of texture and motion features was estimated, which was then reduced to the most robust ones by means of ANalysis of VAriance (ANOVA). Using fuzzy c-means, the features were then clustered into two classes. Clustering performances of 74%, 79%, and 84% were achieved for texture only, motion only, and combinations of texture and motion features, respectively. The second CAD system presented in this paper supports the diagnosis of focal liver lesions and is able to characterize liver tissue from Computed Tomography (CT) images as normal, hepatic cyst, hemangioma, and hepatocellular carcinoma. Five texture feature sets were extracted for each lesion, while a genetic algorithm based feature selection method was applied to identify the most robust features. The selected feature set was fed into an ensemble of neural network classifiers. The achieved classification performance was 100%, 93.75% and 90.63% in the training, validation and testing set, respectively. It is concluded that computerized analysis of medical images in combination with artificial intelligence can be used in clinical practice and may contribute to more efficient diagnosis.

  10. Computer aided diagnosis based on medical image processing and artificial intelligence methods

    Energy Technology Data Exchange (ETDEWEB)

    Stoitsis, John [National Technical University of Athens, School of Electrical and Computer Engineering, Athens 157 71 (Greece)]. E-mail: stoitsis@biosim.ntua.gr; Valavanis, Ioannis [National Technical University of Athens, School of Electrical and Computer Engineering, Athens 157 71 (Greece); Mougiakakou, Stavroula G. [National Technical University of Athens, School of Electrical and Computer Engineering, Athens 157 71 (Greece); Golemati, Spyretta [National Technical University of Athens, School of Electrical and Computer Engineering, Athens 157 71 (Greece); Nikita, Alexandra [University of Athens, Medical School 152 28 Athens (Greece); Nikita, Konstantina S. [National Technical University of Athens, School of Electrical and Computer Engineering, Athens 157 71 (Greece)

    2006-12-20

    Advances in imaging technology and computer science have greatly enhanced interpretation of medical images, and contributed to early diagnosis. The typical architecture of a Computer Aided Diagnosis (CAD) system includes image pre-processing, definition of region(s) of interest, feature extraction and selection, and classification. In this paper, the principles of CAD systems design and development are demonstrated by means of two examples. The first one focuses on the differentiation between symptomatic and asymptomatic carotid atheromatous plaques. For each plaque, a vector of texture and motion features was estimated, which was then reduced to the most robust ones by means of ANalysis of VAriance (ANOVA). Using fuzzy c-means, the features were then clustered into two classes. Clustering performances of 74%, 79%, and 84% were achieved for texture only, motion only, and combinations of texture and motion features, respectively. The second CAD system presented in this paper supports the diagnosis of focal liver lesions and is able to characterize liver tissue from Computed Tomography (CT) images as normal, hepatic cyst, hemangioma, and hepatocellular carcinoma. Five texture feature sets were extracted for each lesion, while a genetic algorithm based feature selection method was applied to identify the most robust features. The selected feature set was fed into an ensemble of neural network classifiers. The achieved classification performance was 100%, 93.75% and 90.63% in the training, validation and testing set, respectively. It is concluded that computerized analysis of medical images in combination with artificial intelligence can be used in clinical practice and may contribute to more efficient diagnosis.

  11. Processes with large Psub(T) in the quantum chromodynamics

    International Nuclear Information System (INIS)

    Slepchenko, L.A.

    1981-01-01

    The necessary data on deep inelastic processes and hard hadron collision processes, and their interpretation in QCD, are presented. Power-law reduction of exclusive and inclusive cross sections at large transverse momenta, and the electromagnetic and inelastic (structure function) form factors of hadrons, are discussed. Scaling violation is considered as a way of taking QCD effects into account. It is shown that at large transverse momenta deep inelastic l-h scattering can be represented as scattering off a compound system (the hadron) in the impulse approximation. Under the assumptions of the parton model, a hadron cross section expressed through renormalized parton structure functions was obtained. A proof of factorization in the leading logarithmic approximation of QCD is obtained by means of a quark-gluon diagram technique. The cross section of the hadronic reaction in factorized form, analogous to that of l-h scattering, is calculated. It is shown that (a) summing diagrams with gluon emission generates scaling violation in the renormalized structure functions (SF) of quarks and gluons, and a running coupling constant arises simultaneously; (b) the character of the Bjorken scaling violation of the SF is the same as in deep inelastic lepton scattering. QCD problems that cannot be solved within the framework of perturbation theory are discussed. The evolution of SF describing the bound state of a hadron, and the hadron light cone, are studied. Radiative corrections arising at two loops and higher are evaluated. QCD corrections to the point-like power asymptotics of high-energy, high-momentum-transfer processes are studied on the example of inclusive production of quark and gluon jets. Quark counting rules and the anomalous dimensions of QCD are obtained. It is concluded that the considered limit of the inclusive cross sections is close to
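    For reference, the quark (dimensional) counting rules mentioned in the abstract have a compact standard form: for an exclusive process AB → CD at fixed center-of-mass angle, with n the total number of elementary constituent fields in the initial and final hadrons,

    ```latex
    % Dimensional (quark) counting rules at large transverse momenta:
    \frac{d\sigma}{dt}(AB \to CD) \;\sim\; \frac{f(t/s)}{s^{\,n-2}},
    \qquad n = n_A + n_B + n_C + n_D ,
    % and the elastic form factor of a hadron built of n_h constituents
    F(Q^2) \;\sim\; (Q^2)^{\,1-n_h}.
    ```

    QCD corrects these power laws by logarithms through the running coupling and the anomalous dimensions discussed above.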

  12. Intelligent systems

    CERN Document Server

    Irwin, J David

    2011-01-01

    Technology has now progressed to the point that intelligent systems are replacing humans in decision-making processes as well as aiding in the solution of very complex problems. In many cases intelligent systems are already outperforming human activities. Artificial neural networks are not only capable of learning how to classify patterns, such as images or sequences of events, but they can also effectively model complex nonlinear systems. Their ability to classify sequences of events is probably more popular in industrial applications where there is an inherent need to model nonlinear system

  13. Advanced intelligent systems

    CERN Document Server

    Ryoo, Young; Jang, Moon-soo; Bae, Young-Chul

    2014-01-01

    Intelligent systems have been initiated with the attempt to imitate the human brain. People wish to let machines perform intelligent works. Many techniques of intelligent systems are based on artificial intelligence. According to changing and novel requirements, the advanced intelligent systems cover a wide spectrum: big data processing, intelligent control, advanced robotics, artificial intelligence and machine learning. This book focuses on coordinating intelligent systems with highly integrated and foundationally functional components. The book consists of 19 contributions that features social network-based recommender systems, application of fuzzy enforcement, energy visualization, ultrasonic muscular thickness measurement, regional analysis and predictive modeling, analysis of 3D polygon data, blood pressure estimation system, fuzzy human model, fuzzy ultrasonic imaging method, ultrasonic mobile smart technology, pseudo-normal image synthesis, subspace classifier, mobile object tracking, standing-up moti...

  14. Application of large radiation sources in chemical processing industry

    International Nuclear Information System (INIS)

    Krishnamurthy, K.

    1977-01-01

    Large radiation sources and their application in chemical processing industry are described. A reference has also been made to the present developments in this field in India. Radioactive sources, notably 60Co, are employed in production of wood-plastic and concrete-polymer composites, vulcanised rubbers, polymers, sulfochlorinated paraffin hydrocarbons and in a number of other applications which require deep penetration and high reliability of source. Machine sources of electrons are used in production of heat shrinkable plastics, insulation materials for cables, curing of paints etc. Radiation sources have also been used for sewage hygienisation. As for the scene in India, 60Co sources, gamma chambers and batch irradiators are manufactured. A list of the on-going R and D projects and organisations engaged in research in this field is given. (M.G.B.)

  15. Accelerated decomposition techniques for large discounted Markov decision processes

    Science.gov (United States)

    Larach, Abdelhadi; Chafik, S.; Daoui, C.

    2017-12-01

    Many hierarchical techniques to solve large Markov decision processes (MDPs) are based on the partition of the state space into strongly connected components (SCCs) that can be classified into levels. In each level, smaller problems named restricted MDPs are solved, and then these partial solutions are combined to obtain the global solution. In this paper, we first propose a novel algorithm, which is a variant of Tarjan's algorithm that simultaneously finds the SCCs and their belonging levels. Second, a new definition of the restricted MDPs is presented to improve some hierarchical solutions for discounted MDPs using the value iteration (VI) algorithm based on a list of state-action successors. Finally, a robotic motion-planning example and experimental results are presented to illustrate the benefit of the proposed decomposition algorithms.
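    The decomposition step described above can be sketched in a few lines. This is a plain Tarjan SCC pass followed by a separate level assignment on the condensation DAG, not the authors' combined single-pass variant; the function and graph names are illustrative:

    ```python
    from collections import defaultdict

    def scc_levels(graph):
        """Tarjan's algorithm for strongly connected components, followed by a
        level assignment on the condensation DAG: an SCC with no outgoing edges
        gets level 0; otherwise 1 + the max level of the SCCs it points to."""
        index, low, on_stack, stack = {}, {}, set(), []
        comp_of, comps, counter = {}, [], [0]

        def strongconnect(v):
            index[v] = low[v] = counter[0]
            counter[0] += 1
            stack.append(v)
            on_stack.add(v)
            for w in graph.get(v, ()):
                if w not in index:
                    strongconnect(w)
                    low[v] = min(low[v], low[w])
                elif w in on_stack:
                    low[v] = min(low[v], index[w])
            if low[v] == index[v]:                 # v is the root of an SCC
                comp = []
                while True:
                    w = stack.pop()
                    on_stack.discard(w)
                    comp_of[w] = len(comps)
                    comp.append(w)
                    if w == v:
                        break
                comps.append(comp)

        for v in list(graph):
            if v not in index:
                strongconnect(v)

        # Build the condensation DAG and assign levels by memoized recursion.
        dag = defaultdict(set)
        for v in graph:
            for w in graph[v]:
                if comp_of[v] != comp_of[w]:
                    dag[comp_of[v]].add(comp_of[w])
        level = {}
        def lvl(c):
            if c not in level:
                level[c] = 1 + max((lvl(d) for d in dag[c]), default=-1)
            return level[c]
        return comps, [lvl(c) for c in range(len(comps))]

    # Toy example: SCC {1,2,3} feeds into SCC {4,5}.
    comps, levels = scc_levels({1: [2], 2: [3], 3: [1, 4], 4: [5], 5: [4]})
    ```

    In the hierarchical scheme, the restricted MDP of a level-0 SCC can be solved first by value iteration, and its values then act as boundary conditions for higher levels.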

  16. Modeling of large-scale oxy-fuel combustion processes

    DEFF Research Database (Denmark)

    Yin, Chungen

    2012-01-01

    A number of studies have been conducted in order to implement oxy-fuel combustion with flue gas recycle in conventional utility boilers as an effective route to carbon capture and storage. However, combustion under oxy-fuel conditions is significantly different from conventional air-fuel firing......, among which radiative heat transfer under oxy-fuel conditions is one of the fundamental issues. This paper demonstrates the nongray-gas effects in modeling of large-scale oxy-fuel combustion processes. Oxy-fuel combustion of natural gas in a 609MW utility boiler is numerically studied, in which...... calculation of the oxy-fuel WSGGM remarkably over-predicts the radiative heat transfer to the furnace walls and under-predicts the gas temperature at the furnace exit plane, which also results in a higher incomplete combustion in the gray calculation. Moreover, the gray and non-gray calculations of the same
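    For context, a weighted-sum-of-gray-gases model (WSGGM) of the kind discussed here expresses total gas emissivity as a sum over a few gray gases (plus a transparent one). The sketch below uses hypothetical, NOT fitted, coefficients purely to illustrate the functional form; real oxy-fuel coefficients come from spectral databases and depend on the H2O/CO2 ratio:

    ```python
    import math

    # Hypothetical three-gray-gas WSGG parameters (illustrative only).
    K = [0.4, 6.0, 120.0]                            # absorption coefficients, 1/(atm*m)
    B = [(0.35, 2e-5), (0.25, 1e-5), (0.15, -1e-5)]  # weights a_i(T) = b0 + b1*T

    def wsgg_emissivity(T, pL):
        """Total emissivity eps = sum_i a_i(T) * (1 - exp(-k_i * pL)).
        The 'clear gas' (k = 0) contributes nothing and is omitted."""
        return sum((b0 + b1 * T) * (1.0 - math.exp(-k * pL))
                   for k, (b0, b1) in zip(K, B))

    eps = wsgg_emissivity(T=1500.0, pL=0.5)          # T in K, pL in atm*m
    ```

    A gray calculation collapses this sum to a single effective absorption coefficient, which is what the abstract reports as over-predicting wall heat transfer under oxy-fuel conditions.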

  17. Hydrothermal processes above the Yellowstone magma chamber: Large hydrothermal systems and large hydrothermal explosions

    Science.gov (United States)

    Morgan, L.A.; Shanks, W.C. Pat; Pierce, K.L.

    2009-01-01

    Hydrothermal explosions are violent and dramatic events resulting in the rapid ejection of boiling water, steam, mud, and rock fragments from source craters that range from a few meters up to more than 2 km in diameter; associated breccia can be emplaced as much as 3 to 4 km from the largest craters. Hydrothermal explosions occur where shallow interconnected reservoirs of steam- and liquid-saturated fluids with temperatures at or near the boiling curve underlie thermal fields. Sudden reduction in confining pressure causes fluids to flash to steam, resulting in significant expansion, rock fragmentation, and debris ejection. In Yellowstone, hydrothermal explosions are a potentially significant hazard for visitors and facilities and can damage or even destroy thermal features. The breccia deposits and associated craters formed from hydrothermal explosions are mapped as mostly Holocene (the Mary Bay deposit is older) units throughout Yellowstone National Park (YNP) and are spatially related to the 0.64-Ma Yellowstone caldera and the active Norris-Mammoth tectonic corridor. In Yellowstone, at least 20 large (>100 m in diameter) hydrothermal explosion craters have been identified; the scale of the individual associated events dwarfs similar features in geothermal areas elsewhere in the world. Large hydrothermal explosions in Yellowstone have occurred over the past 16 ka averaging ~1 every 700 yr; similar events are likely in the future.
Our studies of large hydrothermal explosion events indicate: (1) none are directly associated with eruptive volcanic or shallow intrusive events; (2) several historical explosions have been triggered by seismic events; (3) lithic clasts and comingled matrix material that form hydrothermal explosion deposits are extensively altered, indicating that explosions occur in areas subjected to intense hydrothermal processes; (4) many lithic clasts contained in explosion breccia deposits preserve evidence of repeated fracturing

  18. The Influence of Cochlear Mechanical Dysfunction, Temporal Processing Deficits, and Age on the Intelligibility of Audible Speech in Noise for Hearing-Impaired Listeners

    Directory of Open Access Journals (Sweden)

    Peter T. Johannesen

    2016-05-01

    The aim of this study was to assess the relative importance of cochlear mechanical dysfunction, temporal processing deficits, and age on the ability of hearing-impaired listeners to understand speech in noisy backgrounds. Sixty-eight listeners took part in the study. They were provided with linear, frequency-specific amplification to compensate for their audiometric losses, and intelligibility was assessed for speech-shaped noise (SSN) and a time-reversed two-talker masker (R2TM). Behavioral estimates of cochlear gain loss and residual compression were available from a previous study and were used as indicators of cochlear mechanical dysfunction. Temporal processing abilities were assessed using frequency modulation detection thresholds. Age, audiometric thresholds, and the difference between audiometric threshold and cochlear gain loss were also included in the analyses. Stepwise multiple linear regression models were used to assess the relative importance of the various factors for intelligibility. Results showed that (a) cochlear gain loss was unrelated to intelligibility, (b) residual cochlear compression was related to intelligibility in SSN but not in a R2TM, (c) temporal processing was strongly related to intelligibility in a R2TM and much less so in SSN, and (d) age per se impaired intelligibility. In summary, all factors affected intelligibility, but their relative importance varied across maskers.
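    Stepwise multiple linear regression of the kind used here can be sketched as greedy forward selection. This minimal version is illustrative only (the authors' procedure would also apply entry/exit significance tests); it ranks predictors by how much each reduces the residual sum of squares:

    ```python
    import numpy as np

    def forward_stepwise(X, y, max_terms=2):
        """Greedy forward selection for multiple linear regression: at each step,
        add the predictor whose inclusion most reduces the residual sum of squares."""
        n, p = X.shape
        chosen, remaining = [], list(range(p))
        for _ in range(max_terms):
            best_j, best_rss = None, np.inf
            for j in remaining:
                A = np.column_stack([np.ones(n), X[:, chosen + [j]]])  # with intercept
                beta, *_ = np.linalg.lstsq(A, y, rcond=None)
                resid = y - A @ beta
                rss = float(resid @ resid)
                if rss < best_rss:
                    best_j, best_rss = j, rss
            chosen.append(best_j)
            remaining.remove(best_j)
        return chosen

    # Synthetic example: the response depends on predictors 0 and 2 only.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))
    y = 2.0 * X[:, 0] - 3.0 * X[:, 2] + 0.1 * rng.normal(size=200)
    selected = forward_stepwise(X, y, max_terms=2)
    ```

    In the study's setting, the columns of `X` would be the candidate factors (cochlear gain loss, residual compression, FM detection threshold, age, and so on) and `y` the intelligibility score.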

  19. Artificial intelligence in process control: Knowledge base for the shuttle ECS model

    Science.gov (United States)

    Stiffler, A. Kent

    1989-01-01

    The general operation of KATE, an artificial intelligence controller, is outlined. A shuttle environmental control system (ECS) demonstration system for KATE is explained. The knowledge base model for this system is derived. An experimental test procedure is given to verify parameters in the model.

  20. Joint Intelligence Operations Center (JIOC) Baseline Business Process Model & Capabilities Evaluation Methodology

    Science.gov (United States)

    2012-03-01

    Acronym-list and process-diagram fragments recovered from the report: OPLAN, Operations Plan; OPORD, Operations Order; OPSIT, Operational Situation; OSINT, Open Source Intelligence. The remaining text (Targeting Review Board, FLTREPs, MISREPs, HUMINT/OSINT collection management, embassy and law-enforcement information flows) is residue from the baseline business process model figures.

  1. Overview of the research and development on knowledge information processing and intelligent robots

    Energy Technology Data Exchange (ETDEWEB)

    Tanaka, K

    1982-04-01

    To implement intelligent computers, the problem of formalization of human intellectual activity must be considered. Insight into formalized intellectual activity can be gained by examination of its four abilities: (1) problem-solving; (2) learning, recognition and understanding; (3) language analysis and understanding; and (4) intellectual interaction. These are the topics discussed in the paper. 68 references.

  2. Large earthquake rupture process variations on the Middle America megathrust

    Science.gov (United States)

    Ye, Lingling; Lay, Thorne; Kanamori, Hiroo

    2013-11-01

    The megathrust fault between the underthrusting Cocos plate and overriding Caribbean plate recently experienced three large ruptures: the August 27, 2012 (Mw 7.3) El Salvador; September 5, 2012 (Mw 7.6) Costa Rica; and November 7, 2012 (Mw 7.4) Guatemala earthquakes. All three events involve shallow-dipping thrust faulting on the plate boundary, but they had variable rupture processes. The El Salvador earthquake ruptured from about 4 to 20 km depth, with a relatively large centroid time of ˜19 s, low seismic moment-scaled energy release, and a depleted teleseismic short-period source spectrum similar to that of the September 2, 1992 (Mw 7.6) Nicaragua tsunami earthquake that ruptured the adjacent shallow portion of the plate boundary. The Costa Rica and Guatemala earthquakes had large slip in the depth range 15 to 30 km, and more typical teleseismic source spectra. Regional seismic recordings have higher short-period energy levels for the Costa Rica event relative to the El Salvador event, consistent with the teleseismic observations. A broadband regional waveform template correlation analysis is applied to categorize the focal mechanisms for larger aftershocks of the three events. Modeling of regional wave spectral ratios for clustered events with similar mechanisms indicates that interplate thrust events have corner frequencies, normalized by a reference model, that increase down-dip from anomalously low values near the Middle America trench. Relatively high corner frequencies are found for thrust events near Costa Rica; thus, variations along strike of the trench may also be important. Geodetic observations indicate trench-parallel motion of a forearc sliver extending from Costa Rica to Guatemala, and low seismic coupling on the megathrust has been inferred from a lack of boundary-perpendicular strain accumulation. The slip distributions and seismic radiation from the large regional thrust events indicate relatively strong seismic coupling near Nicoya, Costa

  3. Selected Ethical Issues in Artificial Intelligence, Autonomous System Development and Large Data Set Processing

    Directory of Open Access Journals (Sweden)

    Zgrzebnicki Paweł

    2017-07-01

    Due to information technology development and its industrial adoption, dilemmas so far specific to philosophical speculation and to science-fiction literature and films have become real problems of the contemporary world. On one hand, these issues are related to the unprecedented scale on which computational algorithms are currently used, as well as the level of complexity of their mutual connections; on the other hand, they are linked to the algorithms' autonomous behavior. States, industry, and users themselves demand formulation of understandable ethical categories and determination of transparency standards and legal norms for these algorithms' functioning in the near future.

  4. Machine listening intelligence

    Science.gov (United States)

    Cella, C. E.

    2017-05-01

    This manifesto paper will introduce machine listening intelligence, an integrated research framework for acoustic and musical signals modelling, based on signal processing, deep learning and computational musicology.

  5. Intelligent Optics Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — The Intelligent Optics Laboratory supports sophisticated investigations on adaptive and nonlinear optics; advanced imaging and image processing; ground-to-ground and...

  6. Patterns and Intelligent Systems

    International Nuclear Information System (INIS)

    Cordes, Gail A.

    2003-01-01

    The recognition and analysis of evolving patterns provides a unifying concept for studying and implementing intelligent information processing for open feedback control systems within the nuclear industry. Control is considered as influence of a large system to achieve the goals of the human (who might or might not be part of an open feedback loop) and is not limited to operation of a component within a nuclear power plant. The intelligent control system includes open logic and can automatically react to new data in an unprogrammed way. This application of evolving patterns integrates current research developments in human cognition and scientific semiotics with traditional feedback control. A preliminary implementation of such a system using existing computational techniques is postulated, and tools that are lacking at this time are identified. Proof-of-concept applications for the nuclear industry are referenced

  7. Process evaluation of treatment times in a large radiotherapy department

    International Nuclear Information System (INIS)

    Beech, R.; Burgess, K.; Stratford, J.

    2016-01-01

    Purpose/objective: The Department of Health (DH) recognises access to appropriate and timely radiotherapy (RT) services as crucial in improving cancer patient outcomes, especially when facing a predicted increase in cancer diagnosis. There is a lack of ‘real-time’ data regarding daily demand of a linear accelerator, the impact of increasingly complex techniques on treatment times, and whether current scheduling reflects time needed for RT delivery, which would be valuable in highlighting current RT provision. Material/methods: A systematic quantitative process evaluation was undertaken in a large regional cancer centre, including a satellite centre, between January and April 2014. Data collected included treatment room-occupancy time, RT site, RT and verification technique and patient mobility status. Data was analysed descriptively; average room-occupancy times were calculated for RT techniques and compared to historical standardised treatment times within the department. Results: Room-occupancy was recorded for over 1300 fractions, over 50% of which overran their allotted treatment time. In a focused sample of 16 common techniques, 10 overran their allocated timeslots. Verification increased room-occupancy by six minutes (50%) over non-imaging. Treatments for patients requiring mobility assistance took four minutes (29%) longer. Conclusion: The majority of treatments overran their standardised timeslots. Although technique advancement has reduced RT delivery time, room-occupancy has not necessarily decreased. Verification increases room-occupancy and needs to be considered when moving towards adaptive techniques. Mobility affects room-occupancy and will become increasingly significant in an ageing population. This evaluation assesses validity of current treatment times in this department, and can be modified and repeated as necessary. - Highlights: • A process evaluation examined room-occupancy for various radiotherapy techniques. • Appointment lengths

  8. Neutral processes forming large clones during colonization of new areas.

    Science.gov (United States)

    Rafajlović, M; Kleinhans, D; Gulliksson, C; Fries, J; Johansson, D; Ardehed, A; Sundqvist, L; Pereyra, R T; Mehlig, B; Jonsson, P R; Johannesson, K

    2017-08-01

    In species reproducing both sexually and asexually, clones are often more common in recently established populations. Earlier studies have suggested that this pattern arises due to natural selection favouring generally or locally successful genotypes in new environments. Alternatively, as we show here, this pattern may result from neutral processes during species' range expansions. We model a dioecious species expanding into a new area in which all individuals are capable of both sexual and asexual reproduction, and all individuals have equal survival rates and dispersal distances. Even under conditions that favour sexual recruitment in the long run, colonization starts with an asexual wave. After colonization is completed, a sexual wave erodes clonal dominance. If individuals reproduce more than one season, and with only local dispersal, a few large clones typically dominate for thousands of reproductive seasons. Adding occasional long-distance dispersal, more dominant clones emerge, but they persist for a shorter period of time. The general mechanism involved is simple: edge effects at the expansion front favour asexual (uniparental) recruitment where potential mates are rare. Specifically, our model shows that neutral processes (with respect to genotype fitness) during the population expansion, such as random dispersal and demographic stochasticity, produce genotype patterns that differ from the patterns arising in a selection model. The comparison with empirical data from a post-glacially established seaweed species (Fucus radicans) shows that in this case, a neutral mechanism is strongly supported. © 2017 The Authors. Journal of Evolutionary Biology published by John Wiley & Sons Ltd on behalf of European Society for Evolutionary Biology.
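    The edge-effect mechanism can be caricatured in a few lines. This is a deliberately minimal 1D stand-in for the paper's richer model (which includes dispersal kernels, survival, multiple seasons of reproduction, and long-distance dispersal): empty sites are colonized from occupied neighbours, sexually only when both sexes are among the neighbours. In a 1D chain the advancing front always offers a single occupied neighbour, so recruitment there is purely uniparental and the founder clones dominate — the edge effect in its starkest form:

    ```python
    import random

    def colonize(n_sites=200, seasons=60, seed=1):
        """1D range expansion with mixed reproduction. Recruitment into an empty
        site is sexual (a brand-new genotype) only when both sexes are present
        among its occupied neighbours; otherwise it is uniparental, copying a
        neighbour's clone. All individuals are demographically identical, so
        genotype differences are neutral."""
        rng = random.Random(seed)
        clones = [None] * n_sites
        sexes = [None] * n_sites
        clones[0], sexes[0] = 0, 'F'           # two founders of opposite sex
        clones[1], sexes[1] = 1, 'M'
        next_id = 2
        for _ in range(seasons):
            recruits = []
            for i in range(n_sites):
                if clones[i] is None:
                    nbrs = [j for j in (i - 1, i + 1)
                            if 0 <= j < n_sites and clones[j] is not None]
                    if nbrs:
                        recruits.append((i, nbrs))
            for i, nbrs in recruits:
                if {sexes[j] for j in nbrs} == {'F', 'M'}:   # mates available
                    clones[i] = next_id
                    next_id += 1
                else:                                        # front edge: asexual
                    clones[i] = clones[rng.choice(nbrs)]
                sexes[i] = rng.choice('FM')
        return clones

    clones = colonize()
    ```

    Counting distinct clone ids among occupied sites shows that only the founder genotypes spread, even though nothing in the model favours any genotype.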

  9. Artificial Intelligence Framework for Simulating Clinical Decision-Making: A Markov Decision Process Approach

    OpenAIRE

    Bennett, Casey C.; Hauser, Kris

    2013-01-01

    In the modern healthcare system, rapidly expanding costs/complexity, the growing myriad of treatment options, and exploding information streams that often do not effectively reach the front lines hinder the ability to choose optimal treatment decisions over time. The goal in this paper is to develop a general purpose (non-disease-specific) computational/artificial intelligence (AI) framework to address these challenges. This serves two potential functions: 1) a simulation environment for expl...

  10. Writing cases as a knowledge capture process in a competitive intelligence program

    OpenAIRE

    Mallowan , Monica; Marcon , Christian

    2009-01-01

    International audience; Students in Competitive Intelligence (CI) programs submit a report following their internship in an organisation. It is proposed that the results of their experiences be shared with their peers, in the form of cases written for in-class analysis. A knowledge base is thus created, which gradually becomes the program's memory and, by its constant renewal and connection with reality, the most useful teaching tool for the professor.

  11. Processing and properties of large-sized ceramic slabs

    Directory of Open Access Journals (Sweden)

    Fossa, L.

    2010-10-01

    Large-sized ceramic slabs – with dimensions up to 360x120 cm² and thickness down to 2 mm – are manufactured through an innovative ceramic process, starting from porcelain stoneware formulations and involving wet ball milling, spray drying, die-less slow-rate pressing, a single stage of fast drying-firing, and finishing (trimming, assembling of ceramic-fiberglass composites). Fired and unfired industrial slabs were selected and characterized from the technological, compositional (XRF, XRD) and microstructural (SEM) viewpoints. Semi-finished products exhibit a remarkable microstructural uniformity and stability in a rather wide window of firing schedules. The phase composition and compact microstructure of fired slabs are very similar to those of porcelain stoneware tiles. The values of water absorption, bulk density, closed porosity, functional performances as well as mechanical and tribological properties conform to the top quality range of porcelain stoneware tiles. However, the large size coupled with low thickness bestow on the slab a certain degree of flexibility, which is emphasized in ceramic-fiberglass composites. These outstanding performances make the large-sized slabs suitable to be used in novel applications: building and construction (new floorings without dismantling the previous paving, ventilated façades, tunnel coverings, insulating panelling), indoor furniture (table tops, doors), support for photovoltaic ceramic panels.

    Large-format slabs have been manufactured, with dimensions up to 360x120 cm and less than 2 mm thick, using innovative manufacturing methods, starting from porcelain stoneware compositions and employing wet ball milling, spray drying, slow-rate die-less pressing, fast single-stage drying and firing, and a finishing stage that includes bonding fiberglass to the ceramic support and trimming the final piece.

  12. Summer Decay Processes in a Large Tabular Iceberg

    Science.gov (United States)

    Wadhams, P.; Wagner, T. M.; Bates, R.

    2012-12-01

    Summer Decay Processes in a Large Tabular Iceberg Peter Wadhams (1), Till J W Wagner(1) and Richard Bates(2) (1) Department of Applied Mathematics and Theoretical Physics, University of Cambridge, Wilberforce Road, Cambridge CB3 0WA, UK (2) Scottish Oceans Institute, School of Geography and Geosciences, University of St Andrews, St. Andrews, Scotland KY16 9AL We present observational results from an experiment carried out during July-August 2012 on a giant grounded tabular iceberg off Baffin Island. The iceberg studied was part of the Petermann Ice Island B1 (PIIB1) which calved off the Petermann Glacier in NW Greenland in 2010. Since 2011 it has been aground in 100 m of water on the Baffin Island shelf at 69 deg 06'N, 66 deg 06'W. As part of the project a set of high resolution GPS sensors and tiltmeters was placed on the ice island to record rigid body motion as well as flexural responses to wind, waves, current and tidal forces, while a Waverider buoy monitored incident waves and swell. On July 31, 2012 a major breakup event was recorded, with a piece of 25,000 sq m surface area calving off the iceberg. At the time of breakup, GPS sensors were collecting data both on the main berg as well as on the newly calved piece, while two of us (PW and TJWW) were standing on the broken-out portion which rose by 0.6 m to achieve a new isostatic equilibrium. Crucially, there was no significant swell at the time of breakup, which suggests a melt-driven decay process rather than wave-driven flexural break-up. The GPS sensors recorded two disturbances during the hour preceding the breakup, indicative of crack growth and propagation. Qualitative observation during the two weeks in which our research ship was moored to, or was close to, the ice island edge indicates that an important mechanism for summer ablation is successive collapses of the overburden from above an unsupported wave cut, which creates a submerged ram fringing the berg. A model of buoyancy stresses induced by

  13. A KPI-based process monitoring and fault detection framework for large-scale processes.

    Science.gov (United States)

    Zhang, Kai; Shardt, Yuri A W; Chen, Zhiwen; Yang, Xu; Ding, Steven X; Peng, Kaixiang

    2017-05-01

    Large-scale processes, consisting of multiple interconnected subprocesses, are commonly encountered in industrial systems, and their performance needs to be determined. A common approach to this problem is to use a key performance indicator (KPI)-based approach. However, the different KPI-based approaches have not been developed within a coherent and consistent framework. Thus, this paper proposes a framework for KPI-based process monitoring and fault detection (PM-FD) for large-scale industrial processes that considers the static and dynamic relationships between process and KPI variables. For the static case, a least-squares-based approach is developed that provides an explicit link with least-squares regression, which gives better performance than partial least squares. For the dynamic case, using the kernel representation of each subprocess, an instrumental variable is used to reduce the dynamic case to the static case. This framework is applied to the TE benchmark process and the hot strip mill rolling process. The results show that the proposed method can detect faults better than previous methods.
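The static branch of such a KPI-based detector reduces to least-squares regression plus a residual test. The following sketch shows the idea; the simulated data, model, and simple 3-sigma threshold are illustrative assumptions of this example, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated fault-free training data: 500 samples, 4 process variables, 1 KPI.
X = rng.normal(size=(500, 4))
beta_true = np.array([1.0, -2.0, 0.5, 0.0])
y = X @ beta_true + 0.1 * rng.normal(size=500)

# Static case: fit the KPI model by ordinary least squares.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Fault detection: flag samples whose KPI residual exceeds a
# 3-sigma threshold estimated from the fault-free training residuals.
residuals = y - X @ beta
threshold = 3.0 * residuals.std()

def is_faulty(x_new, y_new):
    """Return True if the observed KPI deviates from its prediction."""
    return abs(y_new - x_new @ beta) > threshold

print(is_faulty(np.ones(4), 1000.0))  # an obviously faulty sample
```

In a real plant, the threshold would be set from the residual distribution of historical fault-free operation rather than the crude 3-sigma rule used here.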

  14. Intelligence of programs

    Energy Technology Data Exchange (ETDEWEB)

    Novak, D

    1982-01-01

    A general discussion of the level of artificial intelligence in computer programs is presented. The suitability of various languages for the development of complex, intelligent programs is discussed, considering fourth-generation languages as well as the well-established structured COBOL language. It is concluded that the success of automation in many administrative fields depends to a large extent on the development of intelligent programs.

  15. Data-driven process decomposition and robust online distributed modelling for large-scale processes

    Science.gov (United States)

    Shu, Zhang; Lijuan, Li; Lijuan, Yao; Shipin, Yang; Tao, Zou

    2018-02-01

    With increasing attention being paid to networked control, system decomposition and distributed models are of significant importance in the implementation of model-based control strategies. In this paper, a data-driven system decomposition and online distributed subsystem modelling algorithm is proposed for large-scale chemical processes. The key controlled variables are first partitioned into several clusters by an affinity propagation clustering algorithm. Each cluster can be regarded as a subsystem. The inputs of each subsystem are then selected by offline canonical correlation analysis between all process variables and the subsystem's controlled variables. Process decomposition is thus realised after the screening of input and output variables. Once the system decomposition is finished, online subsystem modelling can be carried out by recursively renewing the samples block-wise. The proposed algorithm was applied to the Tennessee Eastman process and its validity was verified.
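The two-stage decomposition described here (cluster the controlled variables, then screen inputs per cluster) can be sketched compactly. To keep the example self-contained, a greedy correlation grouping stands in for affinity propagation and plain correlation ranking stands in for canonical correlation analysis; all signals and thresholds below are invented.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 300
u1, u2 = rng.normal(size=n), rng.normal(size=n)   # candidate input variables
y1 = u1 + 0.05 * rng.normal(size=n)                # outputs driven by u1
y2 = 2 * u1 + 0.05 * rng.normal(size=n)
y3 = -u2 + 0.05 * rng.normal(size=n)               # output driven by u2

outputs = {"y1": y1, "y2": y2, "y3": y3}
inputs = {"u1": u1, "u2": u2}

def corr(a, b):
    """Absolute sample correlation between two signals."""
    return abs(np.corrcoef(a, b)[0, 1])

# Stage 1: greedy grouping -- an output joins a cluster if it is
# strongly correlated (> 0.8) with every member already in it.
clusters = []
for name, y in outputs.items():
    for cluster in clusters:
        if all(corr(y, outputs[m]) > 0.8 for m in cluster):
            cluster.append(name)
            break
    else:
        clusters.append([name])

# Stage 2: input screening -- keep inputs correlated (> 0.5)
# with at least one controlled variable of the cluster.
subsystems = {}
for i, cluster in enumerate(clusters):
    sel = [u for u, sig in inputs.items()
           if any(corr(sig, outputs[m]) > 0.5 for m in cluster)]
    subsystems[f"subsystem_{i}"] = {"outputs": cluster, "inputs": sel}

print(subsystems)
```

The paper's affinity propagation step plays the role of Stage 1 and its canonical correlation analysis the role of Stage 2; both would replace the naive thresholds used here.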

  16. Intelligence and treaty ratification

    International Nuclear Information System (INIS)

    Cahn, A.H.

    1990-01-01

    This paper reports that two sets of questions are applicable to the ratification phase: what is the role of intelligence in the ratification process, and what effect did intelligence have on that process? The author attempts to answer these and other questions.

  17. Intelligible Artificial Intelligence

    OpenAIRE

    Weld, Daniel S.; Bansal, Gagan

    2018-01-01

    Since Artificial Intelligence (AI) software uses techniques like deep lookahead search and stochastic optimization of huge neural networks to fit mammoth datasets, it often results in complex behavior that is difficult for people to understand. Yet organizations are deploying AI algorithms in many mission-critical settings. In order to trust their behavior, we must make it intelligible --- either by using inherently interpretable models or by developing methods for explaining otherwise overwh...

  18. Intelligence and Physical Attractiveness

    Science.gov (United States)

    Kanazawa, Satoshi

    2011-01-01

    This brief research note aims to estimate the magnitude of the association between general intelligence and physical attractiveness with large nationally representative samples from two nations. In the United Kingdom, attractive children are more intelligent by 12.4 IQ points (r=0.381), whereas in the United States, the correlation between…

  19. Design Methodology of Process Layout considering Various Equipment Types for Large scale Pyro processing Facility

    International Nuclear Information System (INIS)

    Yu, Seung Nam; Lee, Jong Kwang; Lee, Hyo Jik

    2016-01-01

    At present, each item of process equipment required for integrated processing is being examined, based on experience acquired during the Pyroprocess Integrated Inactive Demonstration Facility (PRIDE) project, and considering the requirements and desired performance enhancements of KAPF as a new facility beyond PRIDE. Essentially, KAPF will be required to handle hazardous materials such as spent nuclear fuel, which must be processed in an isolated and shielded area separate from the operator location. Moreover, an inert-gas atmosphere must be maintained, because of the radiation and deliquescence of the materials. KAPF must also achieve the goal of significantly increased yearly production beyond that of the previous facility; therefore, several parts of the production line must be automated. This article presents the method considered for the conceptual design of both the production line and the overall layout of the KAPF process equipment. This study proposes a design methodology that can be utilized as a preliminary step in the design of a hot-cell-type, large-scale facility in which the various types of processing equipment operated by the remote handling system are integrated. The proposed methodology applies to part of the overall design procedure and contains various weaknesses. However, if the designer is required to maximize the efficiency of the installed material-handling system while considering operation restrictions and maintenance conditions, this kind of design process can accommodate the essential components that must be employed simultaneously in a general hot-cell system.

  20. Intelligent Design and Intelligent Failure

    Science.gov (United States)

    Jerman, Gregory

    2015-01-01

    Good Evening, my name is Greg Jerman and for nearly a quarter century I have been performing failure analysis on NASA's aerospace hardware. During that time I had the distinct privilege of keeping the Space Shuttle flying for two thirds of its history. I have analyzed a wide variety of failed hardware from simple electrical cables to cryogenic fuel tanks to high temperature turbine blades. During this time I have found that for all the time we spend intelligently designing things, we need to be equally intelligent about understanding why things fail. The NASA Flight Director for Apollo 13, Gene Kranz, is best known for the expression "Failure is not an option." However, NASA history is filled with failures both large and small, so it might be more accurate to say failure is inevitable. It is how we react and learn from our failures that makes the difference.

  1. Principles of artificial intelligence

    CERN Document Server

    Nilsson, Nils J

    1980-01-01

    A classic introduction to artificial intelligence intended to bridge the gap between theory and practice, Principles of Artificial Intelligence describes fundamental AI ideas that underlie applications such as natural language processing, automatic programming, robotics, machine vision, automatic theorem proving, and intelligent data retrieval. Rather than focusing on the subject matter of the applications, the book is organized around general computational concepts involving the kinds of data structures used, the types of operations performed on the data structures, and the properties of th

  2. DB-XES : enabling process discovery in the large

    NARCIS (Netherlands)

    Syamsiyah, A.; van Dongen, B.F.; van der Aalst, W.M.P.; Ceravolo, P.; Guetl, C.; Rinderle-Ma, S.

    2018-01-01

    Dealing with the abundance of event data is one of the main process discovery challenges. Current process discovery techniques are able to efficiently handle imported event log files that fit in the computer’s memory. Once data files get bigger, scalability quickly drops since the speed required to

  3. Benchmarking processes for managing large international space programs

    Science.gov (United States)

    Mandell, Humboldt C., Jr.; Duke, Michael B.

    1993-01-01

    The relationship between management style and program costs is analyzed to determine the feasibility of financing large international space missions. The incorporation of management systems is considered to be essential to realizing low cost spacecraft and planetary surface systems. Several companies ranging from large Lockheed 'Skunk Works' to small companies including Space Industries, Inc., Rocket Research Corp., and Orbital Sciences Corp. were studied. It is concluded that to lower the prices, the ways in which spacecraft and hardware are developed must be changed. Benchmarking of successful low cost space programs has revealed a number of prescriptive rules for low cost managements, including major changes in the relationships between the public and private sectors.

  4. Really big data: Processing and analysis of large datasets

    Science.gov (United States)

    Modern animal breeding datasets are large and getting larger, due in part to the recent availability of DNA data for many animals. Computational methods for efficiently storing and analyzing those data are under development. The amount of storage space required for such datasets is increasing rapidl...

  5. Understanding the Globalization of Intelligence

    DEFF Research Database (Denmark)

    Svendsen, Adam David Morgan

    "This book provides an introduction to the complexities of contemporary Western Intelligence and its dynamics during an era of globalization. Towards an understanding of the globalization of intelligence process, Svendsen focuses on the secretive phenomenon of international or foreign intelligence cooperation ('liaison'), as it occurs in both theory and practice. Reflecting a complex coexisting plurality of several different and overlapping concepts in action, the challenging process of the globalization of intelligence emerges as essential for complex issue-management purposes during a globalized era."

  6. Understanding Genetic Breast Cancer Risk: Processing Loci of the BRCA Gist Intelligent Tutoring System.

    Science.gov (United States)

    Wolfe, Christopher R; Reyna, Valerie F; Widmer, Colin L; Cedillos-Whynott, Elizabeth M; Brust-Renck, Priscila G; Weil, Audrey M; Hu, Xiangen

    2016-07-01

    The BRCA Gist Intelligent Tutoring System helps women understand and make decisions about genetic testing for breast cancer risk. BRCA Gist is guided by Fuzzy-Trace Theory (FTT) and built using AutoTutor Lite. It responds differently to participants depending on what they say. Seven tutorial dialogues requiring explanation and argumentation are guided by three FTT concepts: forming gist explanations in one's own words, emphasizing decision-relevant information, and deliberating the consequences of decision alternatives. Participants were randomly assigned to BRCA Gist, a control condition, or impoverished BRCA Gist conditions that removed gist-explanation dialogues, argumentation dialogues, or FTT images. All BRCA Gist conditions performed significantly better than controls on knowledge, comprehension, and risk assessment. Significant differences in knowledge, comprehension, and fine-grained dialogue analyses demonstrate the efficacy of the gist-explanation dialogues. FTT images significantly increased knowledge. Providing more elements in arguments against testing correlated with increased knowledge and comprehension.

  7. Prototype interface facility for intelligent handling and processing of medical image and data

    Science.gov (United States)

    Lymberopoulos, Dimitris C.; Garantziotis, Giannis; Spiropoulos, Kostas V.; Kotsopoulos, Stavros A.; Goutis, Costas E.

    1993-06-01

    This paper introduces an interface facility (IF) developed within the overall framework of a RACE research project. Because the project focuses on remote medical expert consultation, the distances involved, the diversity of users, and their limited familiarity with newly introduced methods of medical diagnosis can give rise to considerable deficiencies. The aim was to intelligently assist the user/physician by providing an ergonomic environment that keeps operational and functional deficiencies to the lowest possible levels. The IF energizes and activates system- and application-level commands and procedures, along with the necessary exemplified and instructional help facilities, in order to allow the user to interact with the system safely and easily at all levels.

  8. Supporting the personnel reliability decision-making process with artificial intelligence

    International Nuclear Information System (INIS)

    Harte, D.C.

    1991-01-01

    Recent legislation concerning personnel security has vastly increased the responsibility and accountability of the security manager. Access authorization, fitness-for-duty, and personnel security access programs require decisions regarding an individual's trustworthiness and reliability based on the findings of a background investigation. While these guidelines provide significant data and are useful as a tool, limited resources are available to guide the adjudicator of derogatory information on what is and is not acceptable in terms of granting access to sensitive areas of nuclear plants. The reason why one individual is deemed unacceptable and the next acceptable may be questioned and cause accusations of discrimination. This paper is a continuation of the discussion on workforce reliability, focusing on the use of artificial intelligence to support the decisions of a security manager. With this support, the benefit of previous decisions helps ensure consistent adjudication of background investigations.

  9. Drell–Yan process at Large Hadron Collider

    Indian Academy of Sciences (India)

    The Drell–Yan process at the LHC, q q̄ → γ*/Z → ℓ⁺ ℓ⁻, is one of the benchmarks for confirmation of the Standard Model at the TeV energy scale. Since the theoretical prediction for the rate is precise, and the final state is clean as well as relatively easy to measure, the process can be studied at the LHC even at relatively low luminosity.
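For context, the leading-order parton-level rate underlying this benchmark is a textbook result (not quoted from the record itself); with \(\hat{s}\) the squared partonic centre-of-mass energy, \(Q_q\) the quark charge, and \(N_c = 3\) the colour-averaging factor:

```latex
\sigma(q\bar{q} \to \gamma^* \to \ell^+\ell^-)
  \;=\; \frac{4\pi\alpha^2}{3\hat{s}}\,\frac{Q_q^2}{N_c}
```

The hadronic cross section follows by convolving this with the quark and antiquark parton distribution functions.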

  10. Intelligent Monitoring System with High Temperature Distributed Fiberoptic Sensor for Power Plant Combustion Processes

    Energy Technology Data Exchange (ETDEWEB)

    Kwang Y. Lee; Stuart S. Yin; Andre Boehman

    2006-09-26

    The objective of the proposed work is to develop an intelligent distributed fiber-optic sensor system for real-time monitoring of high temperatures in boiler furnaces in power plants. Of particular interest is the estimation of spatial and temporal distributions of high temperatures within a boiler furnace, which will be essential in assessing and controlling the mechanisms that form and remove pollutants at the source, such as NOx. The basic approach in developing the proposed sensor system is threefold: (1) development of a high-temperature distributed fiber-optic sensor capable of measuring temperatures greater than 2000 °C with a spatial resolution of less than 1 cm; (2) development of distributed parameter system (DPS) models to map the three-dimensional (3D) temperature distribution of the furnace; and (3) development of an intelligent monitoring system for real-time monitoring of the 3D boiler temperature distribution. Under Task 1, we have set up a dedicated high-power, ultrafast laser system for fabricating in-fiber gratings in harsh-environment optical fibers, successfully fabricated gratings in single-crystal sapphire fibers with the high-power laser system, and developed highly sensitive long-period gratings (LPGs) by electric arc. Under Task 2, relevant mathematical modeling studies of NOx formation in practical combustors have been completed. Studies show that in boiler systems with no swirl, the distributed temperature sensor may provide information sufficient to predict trends of NOx at the boiler exit. Under Task 3, we have investigated a mathematical approach to extrapolation of the temperature distribution within a power-plant boiler facility, using a combination of a modified neural network architecture and semigroup theory. Given a set of empirical data with no analytic expression, we first developed an analytic description and then extended that model along a single axis.

  11. Manufacturing process to reduce large grain growth in zirconium alloys

    International Nuclear Information System (INIS)

    Rosecrans, P.M.

    1987-01-01

    A method is described of treating cold-worked zirconium alloys to reduce large grain growth during thermal treatment above the recrystallization temperature. The method comprises heating the zirconium alloy at a temperature of about 1300 °F to 1350 °F for about 1 to 3 hours, subsequent to cold working the alloy and prior to the thermal treatment at a temperature of between 1450 °F and 1550 °F, the thermal-treatment temperature being above the recrystallization temperature.

  12. Manufacturing Process Simulation of Large-Scale Cryotanks

    Science.gov (United States)

    Babai, Majid; Phillips, Steven; Griffin, Brian

    2003-01-01

    NASA's Space Launch Initiative (SLI) is an effort to research and develop the technologies needed to build a second-generation reusable launch vehicle. It is required that this new launch vehicle be 100 times safer and 10 times cheaper to operate than current launch vehicles. Part of the SLI includes the development of reusable composite and metallic cryotanks. The size of these reusable tanks is far greater than anything ever developed and exceeds the design limits of current manufacturing tools. Several design and manufacturing approaches have been formulated, but many factors must be weighed during the selection process. Among these factors are tooling reachability, cycle times, feasibility, and facility impacts. The manufacturing process simulation capabilities available at NASA's Marshall Space Flight Center have played a key role in down-selecting among the various manufacturing approaches. By creating 3-D manufacturing process simulations, the varying approaches can be analyzed in a virtual world before any hardware or infrastructure is built. This analysis can detect and eliminate costly flaws in the various manufacturing approaches. The simulations check for collisions between devices, verify that design limits on joints are not exceeded, and provide cycle times, which aid in the development of an optimized process flow. In addition, new ideas and concerns are often raised after seeing the visual representation of a manufacturing process flow. The output of the manufacturing process simulations allows for cost and safety comparisons to be performed between the various manufacturing approaches. This output helps determine which manufacturing process options reach the safety and cost goals of the SLI. As part of the SLI, The Boeing Company was awarded a basic period contract to research and propose options for both a metallic and a composite cryotank. Boeing then entered into a task agreement with the Marshall Space Flight Center to provide manufacturing

  13. Process component inventory in a large commercial reprocessing facility

    International Nuclear Information System (INIS)

    Canty, M.J.; Berliner, A.; Spannagel, G.

    1983-01-01

    Using a computer simulation program, the equilibrium operation of the Pu-extraction and purification processes of a reference commercial reprocessing facility was investigated. Particular attention was given to the long-term net fluctuations of Pu inventories in hard-to-measure components such as the solvent-extraction contactors. Comparing the variance of these inventories with the measurement variance for Pu contained in feed, analysis, and buffer tanks, it was concluded that direct or indirect periodic estimation of contactor inventories would not contribute significantly to improving the quality of closed material balances over the process MBA.

  14. How to collect and process large polyhedral viruses of insects

    Science.gov (United States)

    W. D. Rollinson; F. B. Lewis

    1962-01-01

    Polyhedral viruses have proved highly effective and very practical for control of certain pine sawflies; and a method of collecting and processing the small polyhedra (5 microns or less) characteristic of sawflies has been described. There is experimental evidence that the virus diseases of many Lepidopterous insects can be used similarly for direct control. The...

  15. ARMA modelling of neutron stochastic processes with large measurement noise

    International Nuclear Information System (INIS)

    Zavaljevski, N.; Kostic, Lj.; Pesic, M.

    1994-01-01

    An autoregressive moving average (ARMA) model of the neutron fluctuations with large measurement noise is derived from Langevin stochastic equations and validated using time-series data obtained during prompt neutron decay constant measurements at the zero-power reactor RB in Vinca. Model parameters are estimated using the maximum likelihood (ML) off-line algorithm and an adaptive pole estimation algorithm based on the recursive prediction error method (RPE). The results show that subcriticality can be determined from real data with high measurement noise using a much shorter statistical sample than in standard methods. (author)
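The abstract's key point, that the model parameters stay identifiable despite heavy measurement noise, can be illustrated with a toy AR-plus-noise model. This is a simplification of the paper's ARMA/ML machinery, and every number below is invented: an AR(1) signal observed through additive noise is an ARMA(1,1) process, and the measurement noise only inflates the lag-0 autocovariance, so for k ≥ 1 the lagged autocovariances still satisfy r_y(k) = a^k · var(x) and their ratio isolates the pole a.

```python
import numpy as np

rng = np.random.default_rng(2)
a_true = 0.9                       # AR pole, analogous to a decay constant
n = 200_000

w = rng.normal(size=n)             # driving (Langevin-type) noise
x = np.zeros(n)
for t in range(1, n):
    x[t] = a_true * x[t - 1] + w[t]

y = x + 2.0 * rng.normal(size=n)   # heavy measurement noise added

def autocov(z, k):
    """Sample autocovariance of z at lag k."""
    z = z - z.mean()
    return float(np.dot(z[:-k], z[k:]) / (len(z) - k))

# Ratio of lagged autocovariances cancels the unknown noise variance.
a_hat = autocov(y, 2) / autocov(y, 1)
print(round(a_hat, 2))
```

A full ML or recursive prediction-error fit, as in the paper, would also deliver the moving-average coefficient and uncertainty estimates; the ratio trick above only recovers the pole.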

  16. Intelligent Tutor

    Science.gov (United States)

    1990-01-01

    NASA also seeks to advance American education by employing the technology utilization process to develop a computerized, artificial-intelligence-based Intelligent Tutoring System (ITS) to help high school and college physics students. The tutoring system is designed for use with the lecture and laboratory portions of a typical physics instructional program. Its importance lies in its ability to observe continually as a student develops problem solutions and to intervene when appropriate with assistance specifically directed at the student's difficulty and tailored to his skill level and learning style. ITS originated as a project of the Johnson Space Center (JSC). It is being developed by JSC's Software Technology Branch in cooperation with Dr. R. Bowen Loftin at the University of Houston-Downtown. The program is jointly sponsored by NASA and ACOT (Apple Classrooms of Tomorrow). Other organizations providing support include the Texas Higher Education Coordinating Board, the National Research Council, Pennzoil Products Company, and the George R. Brown Foundation. The Physics I class of Clear Creek High School, League City, Texas, is providing the classroom environment for test and evaluation of the system. The ITS is a spinoff of products developed earlier to integrate artificial intelligence into training/tutoring systems for NASA astronauts, flight controllers, and engineers.

  17. Artificial Intelligence, Counseling, and Cognitive Psychology.

    Science.gov (United States)

    Brack, Greg; And Others

    With the exception of a few key writers, counselors largely ignore the benefits that Artificial Intelligence (AI) and Cognitive Psychology (CP) can bring to counseling. It is demonstrated that AI and CP can be integrated into the counseling literature. How AI and CP can offer new perspectives on information processing, cognition, and helping is…

  18. Artificial intelligence

    CERN Document Server

    Hunt, Earl B

    1975-01-01

    Artificial Intelligence provides information pertinent to the fundamental aspects of artificial intelligence. This book presents the basic mathematical and computational approaches to problems in the artificial intelligence field.Organized into four parts encompassing 16 chapters, this book begins with an overview of the various fields of artificial intelligence. This text then attempts to connect artificial intelligence problems to some of the notions of computability and abstract computing devices. Other chapters consider the general notion of computability, with focus on the interaction bet

  19. Intelligent mechatronics

    Energy Technology Data Exchange (ETDEWEB)

    Hashimoto, H. [The University of Tokyo, Tokyo (Japan). Institute of Industrial Science

    1995-10-01

    Intelligent mechatronics (IM) is explained as follows: IM research essentially targets the realization of robots, but at the present stage the target is the creation of new value through the intellectualization of machines, that is, a combination of the information infrastructure and the intelligent machine system. IM can also be regarded as consisting of actively used computers and micromechatronics. The paper then introduces examples of IM research, mainly those the author is involved in, as listed below: sensor gloves, robot hands, robot eyes, teleoperation, three-dimensional object recognition, mobile robots, magnetic bearings, construction of a remotely controlled unmanned dam, robot networks, sensitivity communication using Neuro Baby, etc. 27 figs.

  20. The National Air Intelligence Center Software Process Improvement Effort (NAIC SPI)

    National Research Council Canada - National Science Library

    Blankenship, Donald

    2001-01-01

    ...) Software Process Improvements effort. The objective of this effort was for the contractor to provide engineering and software process improvement for NAIC/SCD to reach SEI's CMM Level 2 in process maturity...

  1. Processing large remote sensing image data sets on Beowulf clusters

    Science.gov (United States)

    Steinwand, Daniel R.; Maddox, Brian; Beckmann, Tim; Schmidt, Gail

    2003-01-01

    High-performance computing is often concerned with the speed at which floating-point calculations can be performed. The architectures of many parallel computers and/or their network topologies are based on these investigations. Often, benchmarks resulting from these investigations are compiled with little regard to how a large dataset would move about in these systems. This part of the Beowulf study addresses that concern by looking at specific applications software and system-level modifications. Applications include an implementation of a smoothing filter for time-series data, a parallel implementation of the decision-tree algorithm used in the Landcover Characterization project, a parallel Kriging algorithm used to fit point data collected in the field on invasive species to a regular grid, and modifications to the Beowulf project's resampling algorithm to handle larger, higher-resolution datasets at a national scale. Systems-level investigations include a feasibility study on Flat Neighborhood Networks and modifications of that concept with Parallel File Systems.
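The smoothing-filter application mentioned above hinges on exactly the data-movement question the study raises: how to split a long series across workers without breaking the filter at chunk boundaries. Here is a hedged sketch of that pattern, with a thread pool standing in for cluster nodes and all sizes invented for illustration: each worker receives its slice plus enough overlap to compute a full moving-average window everywhere.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

WINDOW = 5           # moving-average width (odd)
HALF = WINDOW // 2

def smooth_chunk(padded):
    """Moving average of a chunk padded with HALF samples on each side."""
    kernel = np.ones(WINDOW) / WINDOW
    return np.convolve(padded, kernel, mode="valid")

def parallel_smooth(series, n_workers=4):
    # Pad the series ends so every output sample has a full window.
    padded = np.pad(series, HALF, mode="edge")
    # Overlapping chunks: each worker's slice carries HALF extra
    # samples on each side, so the reassembled result is seamless.
    bounds = np.linspace(0, len(series), n_workers + 1, dtype=int)
    chunks = [padded[b:e + 2 * HALF] for b, e in zip(bounds[:-1], bounds[1:])]
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        parts = list(pool.map(smooth_chunk, chunks))
    return np.concatenate(parts)

series = np.arange(100, dtype=float)
out = parallel_smooth(series)
print(len(out))  # same length as the input
```

On an actual Beowulf cluster, the chunks would be scattered to nodes (e.g. via MPI) rather than threads, but the overlap bookkeeping is the same.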

  2. Artificial intelligence applications concepts for the remote sensing and earth science community

    Science.gov (United States)

    Campbell, W. J.; Roelofs, L. H.

    1984-01-01

    The following potential applications of AI to the study of earth science are described: (1) intelligent data management systems; (2) intelligent processing and understanding of spatial data; and (3) automated systems that perform tasks currently requiring large amounts of scientists' and engineers' time to complete. An example is provided of how an intelligent information system might operate to support an earth science project.

  3. Analytical, Practical and Emotional Intelligence and Line Manager Competencies

    Directory of Open Access Journals (Sweden)

    Anna Baczyńska

    2015-12-01

    Purpose: The research objective was to examine to what extent line manager competencies are linked to intelligence, and more specifically, three types of intelligence: analytical (fluid), practical and emotional. Methodology: The research was carried out with line managers (N=98) who took part in 12 Assessment Centre sessions and completed tests measuring analytical, practical and emotional intelligence. The adopted hypotheses were tested using multiple regression. In the regression model, the dependent variable was a managerial competency (management and striving for results, social skills, openness to change, problem solving, employee development) and the explanatory variables were the three types of intelligence. Five models, one for each management competency, were tested in this way. Findings: It was hypothesized that practical intelligence relates to procedural tacit knowledge and is the strongest indicator of managerial competency. Analysis of the study results indicated that practical intelligence largely accounts for the level of competency used in managerial work (from 21% to 38%). The findings suggest that practical intelligence is a better indicator of managerial competencies among line managers than traditionally measured IQ or emotional intelligence. Originality: This research fills an important gap in the literature, indicating the links between major contemporary selection indicators (i.e., analytical, practical and emotional intelligence) and managerial competencies presented in realistic work simulations measured using the Assessment Centre process.
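The regression design described above is straightforward to sketch. The example below uses synthetic data; the coefficients and noise level are invented, and only the N=98 sample size echoes the study. A competency score is regressed on the three intelligence measures and the variance explained is reported.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 98                                    # sample size mirroring N=98
analytical = rng.normal(size=n)
practical = rng.normal(size=n)
emotional = rng.normal(size=n)

# Synthetic competency score with practical intelligence dominant.
competency = (0.6 * practical + 0.2 * analytical + 0.1 * emotional
              + 0.5 * rng.normal(size=n))

# Multiple regression via ordinary least squares (intercept first).
X = np.column_stack([np.ones(n), analytical, practical, emotional])
coef, *_ = np.linalg.lstsq(X, competency, rcond=None)

# Variance explained (R squared).
pred = X @ coef
ss_res = np.sum((competency - pred) ** 2)
ss_tot = np.sum((competency - competency.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot
print(round(r_squared, 2))
```

With real data, one model like this would be fitted per competency, and the practical-intelligence coefficient compared across the five models.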

  4. Service-oriented architecture of adaptive, intelligent data acquisition and processing systems for long-pulse fusion experiments

    International Nuclear Information System (INIS)

    Gonzalez, J.; Ruiz, M.; Barrera, E.; Lopez, J.M.; Arcas, G. de; Vega, J.

    2010-01-01

    The data acquisition systems used in long-pulse fusion experiments need to implement data reduction and pattern recognition algorithms in real time. In order to accomplish these operations, it is essential to employ software tools that allow for hot swap capabilities throughout the temporal evolution of the experiments. This is very important because processing needs are not equal during different phases of the experiment. The intelligent test and measurement system (ITMS) developed by UPM and CIEMAT is an example of a technology for implementing scalable data acquisition and processing systems based on PXI and CompactPCI hardware. In the ITMS platform, a set of software tools allows the user to define the processing algorithms associated with the different experimental phases using state machines driven by software events. These state machines are specified using the State Chart XML (SCXML) language. The software tools are developed using JAVA, JINI, an SCXML engine and several LabVIEW applications. Within this schema, it is possible to execute data acquisition and processing applications in an adaptive way. The power of SCXML semantics and the ability to work with XML user-defined data types allow for very easy programming of the ITMS platform. With this approach, the ITMS platform is a suitable solution for implementing scalable data acquisition and processing systems based on a service-oriented model with the ability to easily implement remote participation applications.
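
    The record does not reproduce the actual SCXML documents, so the sketch below only illustrates the general idea of experiment phases modeled as an event-driven state machine. It interprets a tiny SCXML-like fragment with Python's standard xml.etree rather than the JAVA/JINI/LabVIEW stack the authors describe; all state and event names are invented.

    ```python
    import xml.etree.ElementTree as ET

    # Minimal SCXML-like document: experiment phases as states,
    # software events as transitions (illustrative, not from the paper).
    SCXML = """
    <scxml initial="idle">
      <state id="idle">
        <transition event="pulse_start" target="acquire"/>
      </state>
      <state id="acquire">
        <transition event="plasma_phase" target="reduce"/>
        <transition event="pulse_end" target="idle"/>
      </state>
      <state id="reduce">
        <transition event="pulse_end" target="idle"/>
      </state>
    </scxml>
    """

    class PhaseMachine:
        """Tiny interpreter for the SCXML subset used above."""
        def __init__(self, doc):
            root = ET.fromstring(doc)
            self.state = root.get("initial")
            # Map (current state, event) -> target state.
            self.table = {
                (s.get("id"), t.get("event")): t.get("target")
                for s in root.findall("state")
                for t in s.findall("transition")
            }

        def fire(self, event):
            # Unknown events leave the state unchanged.
            self.state = self.table.get((self.state, event), self.state)
            return self.state

    m = PhaseMachine(SCXML)
    for ev in ["pulse_start", "plasma_phase", "pulse_end"]:
        print(ev, "->", m.fire(ev))
    ```

    In the ITMS design, each state would additionally attach the data reduction or pattern recognition algorithms to execute during that experimental phase; here only the event-driven phase switching is shown.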

  5. Services oriented architecture for adaptive and intelligent data acquisition and processing systems in long pulse fusion experiments

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez, J.; Ruiz, M.; Barrera, E.; Lopez, J.M.; De Arcas, G. [Universidad Politecnica de Madrid (Spain); Vega, J. [Association EuratomCIEMAT para Fusion, Madrid (Spain)

    2009-07-01

    Data acquisition systems used in long-pulse fusion experiments need to implement data reduction and pattern recognition algorithms in real time. To accomplish these operations, it is essential to have software tools that allow hot swap capabilities throughout the temporal evolution of the experiments. This is very important because processing needs are not equal during the different phases of the experiment. The intelligent test and measurement system (ITMS) developed by UPM and CIEMAT is an example of technology for implementing scalable data acquisition and processing systems based on PXI and CompactPCI hardware. In the ITMS platform, a set of software tools allows the user to define the processing associated with the different experimental phases using state machines driven by software events. These state machines are specified using the State Chart XML (SCXML) language. The software tools are developed using JAVA, JINI, an SCXML engine and several LabVIEW applications. With this schema it is possible to execute data acquisition and processing applications in an adaptive way. The power of SCXML semantics and the ability to work with XML user-defined data types make the ITMS platform very easy to program. With this approach, the ITMS platform is a suitable solution for implementing scalable data acquisition and processing systems, based on a service-oriented model, that can easily support remote participation applications. (authors)

  6. Service-oriented architecture of adaptive, intelligent data acquisition and processing systems for long-pulse fusion experiments

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez, J. [Grupo de Investigacion en Instrumentacion y Acustica Aplicada. Universidad Politecnica de Madrid, Crta. Valencia Km-7 Madrid 28031 (Spain); Ruiz, M., E-mail: mariano.ruiz@upm.e [Grupo de Investigacion en Instrumentacion y Acustica Aplicada. Universidad Politecnica de Madrid, Crta. Valencia Km-7 Madrid 28031 (Spain); Barrera, E.; Lopez, J.M.; Arcas, G. de [Grupo de Investigacion en Instrumentacion y Acustica Aplicada. Universidad Politecnica de Madrid, Crta. Valencia Km-7 Madrid 28031 (Spain); Vega, J. [Asociacion EURATOM/CIEMAT para Fusion, Madrid (Spain)

    2010-07-15

    The data acquisition systems used in long-pulse fusion experiments need to implement data reduction and pattern recognition algorithms in real time. In order to accomplish these operations, it is essential to employ software tools that allow for hot swap capabilities throughout the temporal evolution of the experiments. This is very important because processing needs are not equal during different phases of the experiment. The intelligent test and measurement system (ITMS) developed by UPM and CIEMAT is an example of a technology for implementing scalable data acquisition and processing systems based on PXI and CompactPCI hardware. In the ITMS platform, a set of software tools allows the user to define the processing algorithms associated with the different experimental phases using state machines driven by software events. These state machines are specified using the State Chart XML (SCXML) language. The software tools are developed using JAVA, JINI, an SCXML engine and several LabVIEW applications. Within this schema, it is possible to execute data acquisition and processing applications in an adaptive way. The power of SCXML semantics and the ability to work with XML user-defined data types allow for very easy programming of the ITMS platform. With this approach, the ITMS platform is a suitable solution for implementing scalable data acquisition and processing systems based on a service-oriented model with the ability to easily implement remote participation applications.

  7. How Artificial Intelligence Can Improve Our Understanding of the Genes Associated with Endometriosis: Natural Language Processing of the PubMed Database.

    Science.gov (United States)

    Bouaziz, J; Mashiach, R; Cohen, S; Kedem, A; Baron, A; Zajicek, M; Feldman, I; Seidman, D; Soriano, D

    2018-01-01

    Endometriosis is a disease characterized by the development of endometrial tissue outside the uterus, but its cause remains largely unknown. Numerous genes have been studied and proposed to help explain its pathogenesis. However, the large number of these candidate genes has made functional validation through experimental methodologies nearly impossible. Computational methods could provide a useful alternative for prioritizing those most likely to be susceptibility genes. Using artificial intelligence applied to text mining, this study analyzed the genes involved in the pathogenesis, development, and progression of endometriosis. The data extraction by text mining of the endometriosis-related genes in the PubMed database was based on natural language processing, and the data were filtered to remove false positives. Using data from the text mining and gene network information as input for the web-based tool, 15,207 endometriosis-related genes were ranked according to their score in the database. Characterization of the filtered gene set through gene ontology, pathway, and network analysis provided information about the numerous mechanisms hypothesized to be responsible for the establishment of ectopic endometrial tissue, as well as the migration, implantation, survival, and proliferation of ectopic endometrial cells. Finally, the human genome was scanned through various databases using filtered genes as a seed to determine novel genes that might also be involved in the pathogenesis of endometriosis but which have not yet been characterized. These genes could be promising candidates to serve as useful diagnostic biomarkers and therapeutic targets in the management of endometriosis.
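
    The mention-counting core of such a text-mining pipeline can be sketched as follows. The abstracts and gene symbols below are invented placeholders; a real pipeline would query PubMed, apply natural language processing, and filter false positives as the study describes.

    ```python
    import re
    from collections import Counter

    # Invented placeholder abstracts; a real pipeline would query PubMed.
    abstracts = [
        "VEGFA and ESR1 expression is elevated in ectopic endometrial tissue.",
        "Polymorphisms in ESR1 may contribute to endometriosis susceptibility.",
        "CYP19A1 and VEGFA drive proliferation of ectopic endometrial cells.",
    ]

    # Candidate gene symbols to search for (illustrative subset).
    genes = {"VEGFA", "ESR1", "CYP19A1", "WNT4"}

    # Count abstracts mentioning each candidate; rank by mention frequency.
    counts = Counter()
    for text in abstracts:
        tokens = set(re.findall(r"[A-Z0-9]+", text.upper()))
        for g in genes & tokens:
            counts[g] += 1

    ranking = counts.most_common()
    print(ranking)  # genes ranked by number of supporting abstracts
    ```

    Naive symbol matching like this is exactly where false positives arise (many gene symbols are also ordinary words or abbreviations), which is why the study's filtering step matters.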

  8. Dual cell conductivity during ionic exchange processes: the intelligent transmitter EXA DC 400

    International Nuclear Information System (INIS)

    Mier, A.

    1997-01-01

    Why is differential conductivity important versus standard conductivity measurement? That entirely depends on the application. If we have a process where the conductivity changes, e.g., a cation exchanger, then standard conductivity measurement is not appropriate. With dual cell conductivity we can assess the process and eliminate conductivity changes originating outside the process. Therefore we achieve more precise control or monitoring of that process. (Author)

  9. Dental ethics and emotional intelligence.

    Science.gov (United States)

    Rosenblum, Alvin B; Wolf, Steve

    2014-01-01

    Dental ethics is often taught, viewed, and conducted as an intellectual enterprise, uninformed by other noncognitive factors. Emotional intelligence (EQ) is defined and distinguished from the cognitive intelligence measured by the Intelligence Quotient (IQ). This essay recommends greater inclusion of emotional, noncognitive input in the ethical decision process in dental education and dental practice.

  10. LSSA large area silicon sheet task continuous Czochralski process development

    Science.gov (United States)

    Rea, S. N.

    1978-01-01

    A Czochralski crystal growing furnace was converted to a continuous growth facility by installation of a premelter to provide molten silicon flow into the primary crucible. The basic furnace is operational, and several trial crystals were grown in the batch mode. Numerous premelter configurations were tested, both in laboratory-scale equipment and in the actual furnace. The best arrangement tested to date is a vertical, cylindrical graphite heater containing a small fused silica test-tube liner in which the incoming silicon is melted and flows into the primary crucible. Economic modeling of the continuous Czochralski process indicates that for 10 cm diameter crystals, 100 kg furnace runs of four or five crystals each are near-optimal. Costs tend to asymptote at the 100 kg level, so little additional cost improvement occurs with larger runs. For these conditions, a crystal cost in equivalent wafer area of around $20/sq m, exclusive of polysilicon and slicing, was obtained.

  11. Artificial Intelligence and Moral intelligence

    OpenAIRE

    Laura Pana

    2008-01-01

    We discuss the thesis that the implementation of a moral code in the behaviour of artificial intelligent systems needs a specific form of human and artificial intelligence, not just an abstract intelligence. We present intelligence as a system with an internal structure and the structural levels of the moral system, as well as certain characteristics of artificial intelligent agents which can/must be treated as 1- individual entities (with a complex, specialized, autonomous or self-determined,...

  12. The Acquisition of Context Data of Study Process and their Application in Classroom and Intelligent Tutoring Systems

    Directory of Open Access Journals (Sweden)

    Bicans Janis

    2015-12-01

    Full Text Available Over the last decade, researchers have been investigating the potential of an educational paradigm shift from the traditional “one-size-fits-all” teaching approach to an adaptive and more personalized study process. The availability of fast mobile connections, along with the evolution of portable handheld devices like phones and tablets, enables teachers and learners to communicate and interact with each other in a completely different way and at a different speed. These devices not only deliver tutoring material to the learner, but may also serve as sensors providing data about the learning process itself, e.g., learning conditions, location, detailed information on the learning of tutoring material and other information. These sensor data, put into the context of the study process, can be widely used to improve the student experience in the classroom and in e-learning by providing more precise and detailed information to the teacher and/or an intelligent tutoring system for the selection of an appropriate tutoring strategy. This paper analyses and discusses acquisition, processing, and application scenarios of contextual information.

  13. The Quanzhou large earthquake: environment impact and deep process

    Science.gov (United States)

    WANG, Y.; Gao*, R.; Ye, Z.; Wang, C.

    2017-12-01

    The Quanzhou earthquake is the largest earthquake in China's southeast coast in history. The ancient city of Quanzhou and its adjacent areas suffered serious damage. Analysis of the impact of the Quanzhou earthquake on human activities, the ecological environment and social development will provide an example for research on environment and human interaction. According to historical records, on the night of December 29, 1604, a Ms 8.0 earthquake occurred in the sea area east of Quanzhou (25.0°N, 119.5°E) with a focal depth of 25 kilometers. It was felt to a maximum distance of 220 kilometers from the epicenter and caused serious damage. Quanzhou, which had been known as one of the world's largest trade ports during the Song and Yuan periods, was heavily destroyed by this earthquake. The destruction of the ancient city was very serious and widespread. The city wall collapsed in Putian, Nanan, Tongan and other places. The East and West Towers of Kaiyuan Temple, famous in history for their magnificent architecture, were seriously destroyed. Therefore, an enormous earthquake can exert devastating effects on human activities and social development. It is estimated that an earthquake of more than Ms 5.0 in the economically developed coastal areas of China can directly cause economic losses of more than one hundred million yuan. This devastating large earthquake that severely destroyed the Quanzhou city was triggered under a tectonic-extensional circumstance. In this coastal area of Fujian Province, the crust gradually thins eastward from inland to coast (less than 29 km thick beneath the coast), the lithosphere is also rather thin (60-70 km), and the Poisson's ratio of the crust here appears relatively high. The historical Quanzhou earthquake was probably correlated with the NE-striking Littoral Fault Zone, which is characterized by right-lateral slip and exhibits the most active seismicity in the coastal area of Fujian. Meanwhile, tectonic

  14. Trends in ambient intelligent systems the role of computational intelligence

    CERN Document Server

    Khan, Mohammad; Abraham, Ajith

    2016-01-01

    This book demonstrates the success of Ambient Intelligence in providing possible solutions for the daily needs of humans. The book addresses implications of ambient intelligence in areas of domestic living, elderly care, robotics, communication, philosophy and others. The objective of this edited volume is to show that Ambient Intelligence is a boon to humanity with conceptual, philosophical, methodical and applicative understanding. The book also aims to schematically demonstrate developments in the direction of augmented sensors, embedded systems and behavioral intelligence towards Ambient Intelligent Networks or Smart Living Technology. It contains chapters in the field of Ambient Intelligent Networks, which received highly positive feedback during the review process. The book contains research work, with in-depth state of the art from augmented sensors, embedded technology and artificial intelligence along with cutting-edge research and development of technologies and applications of Ambient Intelligent N...

  15. IMPLEMENTATION OF BUSINESS INTELLIGENCE ON BANKING, RETAIL, AND EDUCATIONAL INDUSTRY

    Directory of Open Access Journals (Sweden)

    Arta Moro Sundjaja

    2013-10-01

    Full Text Available Information technology is useful for automating business processes that involve considerable data transactions on a daily basis. Currently, companies have to tackle large volumes of data transactions that are difficult to handle manually. It is very difficult for a person to manually extract useful information from a large data set, despite the fact that the information may be useful in the decision-making process. This article studied and explored the implementation of business intelligence in the banking, retail, and educational industries. The article begins with an exposition of the role of business intelligence in these industries, continues with an illustration of business intelligence in each, and concludes with the implications of business intelligence implementation.

  16. Crowd-Sourced Intelligence Agency: Prototyping counterveillance

    Directory of Open Access Journals (Sweden)

    Jennifer Gradecki

    2017-02-01

    Full Text Available This paper discusses how an interactive artwork, the Crowd-Sourced Intelligence Agency (CSIA, can contribute to discussions of Big Data intelligence analytics. The CSIA is a publicly accessible Open Source Intelligence (OSINT system that was constructed using information gathered from technical manuals, research reports, academic papers, leaked documents, and Freedom of Information Act files. Using a visceral heuristic, the CSIA demonstrates how the statistical correlations made by automated classification systems are different from human judgment and can produce false-positives, as well as how the display of information through an interface can affect the judgment of an intelligence agent. The public has the right to ask questions about how a computer program determines if they are a threat to national security and to question the practicality of using statistical pattern recognition algorithms in place of human judgment. Currently, the public’s lack of access to both Big Data and the actual datasets intelligence agencies use to train their classification algorithms keeps the possibility of performing effective sous-dataveillance out of reach. Without this data, the results returned by the CSIA will not be identical to those of intelligence agencies. Because we have replicated how OSINT is processed, however, our results will resemble the type of results and mistakes made by OSINT systems. The CSIA takes some initial steps toward contributing to an informed public debate about large-scale monitoring of open source, social media data and provides a prototype for counterveillance and sousveillance tools for citizens.
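
    The false-positive concern raised above is easy to quantify with Bayes' rule: when genuine threats are rare, even a highly accurate classifier flags mostly innocent people. The numbers below are hypothetical and not drawn from any agency's system.

    ```python
    # Illustrative base-rate calculation (all numbers are hypothetical).
    prevalence = 1e-5           # fraction of monitored users who are actual threats
    sensitivity = 0.99          # P(flagged | threat)
    false_positive_rate = 0.01  # P(flagged | not a threat)

    # Bayes' rule: P(threat | flagged)
    p_flagged = (sensitivity * prevalence
                 + false_positive_rate * (1 - prevalence))
    p_threat_given_flag = sensitivity * prevalence / p_flagged

    print(f"P(threat | flagged) = {p_threat_given_flag:.4%}")
    ```

    With these illustrative numbers, fewer than one in a thousand flagged users would be an actual threat, which is the kind of statistical behavior the CSIA makes visible to its audience.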

  17. Individual differences in working memory, secondary memory, and fluid intelligence: evidence from the levels-of-processing span task.

    Science.gov (United States)

    Rose, Nathan S

    2013-12-01

    Individual differences in working memory (WM) are related to performance on secondary memory (SM), and fluid intelligence (gF) tests. However, the source of the relation remains unclear, in part because few studies have controlled for the nature of encoding; therefore, it is unclear whether individual variation is due to encoding, maintenance, or retrieval processes. In the current study, participants performed a WM task (the levels-of-processing span task; Rose, Myerson, Roediger III, & Hale, 2010) and a SM test that tested for both targets and the distracting processing words from the initial WM task. Deeper levels of processing at encoding did not benefit WM, but did benefit subsequent SM, although the amount of benefit was smaller for those with lower WM spans. This result suggests that, despite encoding cues that facilitate retrieval from SM, low spans may have engaged in shallower, maintenance-focused processing to maintain the words in WM. Low spans also recalled fewer targets, more distractors, and more extralist intrusions than high spans, although this was partially due to low spans' poorer recall of targets, which resulted in a greater number of opportunities to commit recall errors. Delayed recall of intrusions and commission of source errors (labeling targets as processing words and vice versa) were significant negative predictors of gF. These results suggest that the ability to use source information to recall relevant information and withhold recall of irrelevant information is a critical source of both individual variation in WM and the relation between WM, SM, and gF. (PsycINFO Database Record (c) 2013 APA, all rights reserved).

  18. An evolutionary approach for business process redesign : towards an intelligent system

    NARCIS (Netherlands)

    Netjes, M.; Limam Mansar, S.; Reijers, H.A.; Aalst, van der W.M.P.; Cardoso, J.; Cordeiro, J.; Filipe, J.

    2007-01-01

    Although extensive literature on BPR is available, there is still a lack of concrete guidance on actually changing processes for the better. It is our goal to provide a redesign approach which describes and supports the steps to derive from an existing process a better performing redesign. In this

  19. Supporting the full BPM life-cycle using process mining and intelligent redesign

    NARCIS (Netherlands)

    Netjes, M.; Reijers, H.A.; Aalst, van der W.M.P.; Siau, K.

    2007-01-01

    Abstract. Business Process Management (BPM) systems provide a broad range of facilities to enact and manage operational business processes. Ideally, these systems should provide support for the complete BPM life-cycle: (re)design, configuration, execution, control, and diagnosis by the FileNet P8

  20. Bio-inspired Artificial Intelligence: А Generalized Net Model of the Regularization Process in MLP

    Directory of Open Access Journals (Sweden)

    Stanimir Surchev

    2013-10-01

    Full Text Available Many objects and processes inspired by nature have been recreated by scientists. The inspiration to create the multilayer neural network came from the human brain. The brain possesses a complicated structure that is difficult to recreate, because it involves many processes requiring different solving methods. The aim of the following paper is to describe one of the methods that improve the learning process of an artificial neural network. The proposed generalized net method models the regularization process in a multilayer neural network. Regularization is commonly used in the neural network training process: it adds a term that keeps the weights and biases at smaller values, protecting the network from overfitting.
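
    The regularization idea the abstract describes can be sketched with a toy model. This shows only the weight-shrinking L2 penalty on a single linear layer, not the generalized-net formalism used in the paper; data and hyperparameters are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Toy regression data.
    X = rng.normal(size=(50, 5))
    true_w = np.array([1.0, -2.0, 0.0, 0.0, 0.5])
    y = X @ true_w + rng.normal(scale=0.1, size=50)

    def train(lam, steps=2000, lr=0.01):
        """Gradient descent on MSE + L2 penalty; lam=0 disables the penalty."""
        w = np.zeros(5)
        for _ in range(steps):
            # Gradient of mean squared error plus the regularization term.
            grad = 2 * X.T @ (X @ w - y) / len(y) + 2 * lam * w
            w -= lr * grad
        return w

    w_plain = train(lam=0.0)
    w_reg = train(lam=0.5)

    # The penalty shrinks the weight vector toward zero, limiting overfitting.
    print(np.linalg.norm(w_plain), np.linalg.norm(w_reg))
    ```

    In a full multilayer network the same penalty is simply summed over all layers' weight matrices and added to the training cost.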

  1. Shifts in information processing level: the speed theory of intelligence revisited.

    Science.gov (United States)

    Sircar, S S

    2000-06-01

    A hypothesis is proposed here to reconcile the inconsistencies observed in the IQ-P3 latency relation. The hypothesis stems from the observation that task-induced increase in P3 latency correlates positively with IQ scores. It is hypothesised that: (a) there are several parallel information processing pathways of varying complexity which are associated with the generation of P3 waves of varying latencies; (b) with increasing workload, there is a shift in the 'information processing level' through progressive recruitment of more complex polysynaptic pathways with greater processing power and inhibition of the oligosynaptic pathways; (c) high-IQ subjects have a greater reserve of higher level processing pathways; (d) a given 'task-load' imposes a greater 'mental workload' in subjects with lower IQ than in those with higher IQ. According to this hypothesis, a meaningful comparison of the P3 correlates of IQ is possible only when the information processing level is pushed to its limits.

  2. Valid knowledge for the professional design of large and complex design processes

    NARCIS (Netherlands)

    Aken, van J.E.

    2004-01-01

    The organization and planning of design processes, which we may regard as design process design, is an important issue. Especially for large and complex design-processes traditional approaches to process design may no longer suffice. The design literature gives quite some design process models. As

  3. Towards Intelligent Supply Chains

    DEFF Research Database (Denmark)

    Siurdyban, Artur; Møller, Charles

    2012-01-01

    Supply chains risk deploying inapt operations, leading to deterioration of profits. To address this problem, we propose a unified business process design framework based on the paradigm of intelligence. Intelligence allows humans and human-designed systems to cope with environmental volatility, and we argue that its principles applied to the context of organizational processes can increase the success rate of business operations. The framework is created using a set of theoretically based constructs grounded in a discussion across several streams of research, including psychology, pedagogy, artificial intelligence, learning, business process management and supply chain management. It outlines a number of system tasks combined in four integrated management perspectives: build, execute, grow and innovate, put forward as business process design propositions for Intelligent Supply Chains.

  4. Artificial Intelligence.

    Science.gov (United States)

    Information Technology Quarterly, 1985

    1985-01-01

    This issue of "Information Technology Quarterly" is devoted to the theme of "Artificial Intelligence." It contains two major articles: (1) "Artificial Intelligence and Law" (D. Peter O'Neill and George D. Wood); (2) "Artificial Intelligence: A Long and Winding Road" (John J. Simon, Jr.). In addition, it contains two sidebars: (1) "Calculating and…

  5. Automated business process management – in times of digital transformation using machine learning or artificial intelligence

    OpenAIRE

    Paschek Daniel; Luminosu Caius Tudor; Draghici Anca

    2017-01-01

    The continuous optimization of business processes is still a challenge for companies. In times of digital transformation, with internal and external framework conditions changing ever faster, new customer expectations for the fastest delivery and best quality of goods, and many more, companies should set up their internal processes in the best way. But what should they do if framework conditions change unexpectedly? The purpose of the paper is to analyse how the digital transformation will impact the Business Proc...

  6. How holistic processing of faces relates to cognitive control and intelligence.

    Science.gov (United States)

    Gauthier, Isabel; Chua, Kao-Wei; Richler, Jennifer J

    2018-04-16

    The Vanderbilt Holistic Processing Test for faces (VHPT-F) is the first standard test designed to measure individual differences in holistic processing. The test measures failures of selective attention to face parts through congruency effects, an operational definition of holistic processing. However, this conception of holistic processing has been challenged by the suggestion that it may tap into the same selective attention or cognitive control mechanisms that yield congruency effects in Stroop and Flanker paradigms. Here, we report data from 130 subjects on the VHPT-F, several versions of Stroop and Flanker tasks, as well as fluid IQ. Results suggested a small degree of shared variance in Stroop and Flanker congruency effects, which did not relate to congruency effects on the VHPT-F. Variability on the VHPT-F was also not correlated with Fluid IQ. In sum, we find no evidence that holistic face processing as measured by congruency in the VHPT-F is accounted for by domain-general control mechanisms.

  7. Airline Applications of Business Intelligence Systems

    Directory of Open Access Journals (Sweden)

    Mihai ANDRONIE

    2015-09-01

    Full Text Available The airline industry is characterized by large quantities of complex, unstructured and rapidly changing data that can be categorized as big data, requiring specialized analysis tools to explore it with the purpose of obtaining useful knowledge as decision support for companies that need to ground their activities and improve the processes they are carrying on. In this context, business intelligence tools are valuable instruments that can optimally process airline-related data so that the activities conducted can be optimized to maximize profits while meeting customer requirements. An airline company that has access to large volumes of data (stored in conventional or big data repositories) has two options to extract useful decision-support information: processing data by using general-purpose business intelligence systems or processing data by using industry-specific business intelligence systems. Each of these two options has both advantages and disadvantages for the airline companies that intend to use them. The present paper presents a comparative study of a number of general-purpose and airline industry-specific business intelligence systems, together with their main advantages and disadvantages.

  8. An Introduction to Intelligent Processing Programs Developed by the Air Force Manufacturing Technology Directorate

    Science.gov (United States)

    Sampson, Paul G.; Sny, Linda C.

    1992-01-01

    The Air Force has numerous on-going manufacturing and integration development programs (machine tools, composites, metals, assembly, and electronics) which are instrumental in improving productivity in the aerospace industry, but more importantly, have identified strategies and technologies required for the integration of advanced processing equipment. An introduction to four current Air Force Manufacturing Technology Directorate (ManTech) manufacturing areas is provided. Research is being carried out in the following areas: (1) machining initiatives for aerospace subcontractors which provide for advanced technology and innovative manufacturing strategies to increase the capabilities of small shops; (2) innovative approaches to advance machine tool products and manufacturing processes; (3) innovative approaches to advance sensors for process control in machine tools; and (4) efforts currently underway to develop, with the support of industry, the Next Generation Workstation/Machine Controller (Low-End Controller Task).

  9. Introduction of artificial intelligence techniques for computerized management of defects in an industrial process

    International Nuclear Information System (INIS)

    Utzel, N.

    1991-06-01

    An optimized management of the Tore Supra Tokamak requires computerized defect management. The aim is the analysis of an inhibited situation that is not corrected by the automatic systems of the process and can be handled only by human intervention. The operator should understand the situation, make a diagnosis and act to restore the system. This report studies an expert system helping the operator to analyze defects of the two main cooling loops (decarbonated water and pressurized water), the management of the malfunction history and the recording of diagnoses, the elaboration of an adapted expert model, and the installation of a methodology for defect management in other processes of Tore Supra [fr

  10. Business Intelligence Integrated Solutions

    Directory of Open Access Journals (Sweden)

    Cristescu Marian Pompiliu

    2017-01-01

    Full Text Available This paper shows how businesses can make better and faster decisions about customers, partners and operations by turning data into valuable business information. The paper describes how to bring together people and business intelligence information to achieve successful business strategies. Business intelligence projects can be developed in large and medium-sized organizations using only the Microsoft product described in the paper, and possible alternatives can be discussed according to the required features.

  11. Pathogen intelligence

    Directory of Open Access Journals (Sweden)

    Michael eSteinert

    2014-01-01

    Full Text Available Different species inhabit different sensory worlds and thus have evolved diverse means of processing information, learning and memory. In the escalated arms race with host defense, each pathogenic bacterium not only has evolved its individual cellular sensing and behaviour, but also collective sensing, interbacterial communication, distributed information processing, joint decision making, dissociative behaviour, and the phenotypic and genotypic heterogeneity necessary for epidemiologic success. Moreover, pathogenic populations take advantage of dormancy strategies and rapid evolutionary speed, which allow them to save co-generated intelligent traits in a collective genomic memory. This review discusses how these mechanisms add further levels of complexity to bacterial pathogenicity and transmission, and how mining for these mechanisms could help to develop new anti-infective strategies.

  12. Intelligent Modeling Combining Adaptive Neuro Fuzzy Inference System and Genetic Algorithm for Optimizing Welding Process Parameters

    Science.gov (United States)

    Gowtham, K. N.; Vasudevan, M.; Maduraimuthu, V.; Jayakumar, T.

    2011-04-01

    Modified 9Cr-1Mo ferritic steel is used as a structural material for steam generator components of power plants. Generally, tungsten inert gas (TIG) welding is preferred for welding of these steels, in which the depth of penetration achievable during autogenous welding is limited. Therefore, activated flux TIG (A-TIG) welding, a novel welding technique, has been developed in-house to increase the depth of penetration. In modified 9Cr-1Mo steel joints produced by the A-TIG welding process, weld bead width, depth of penetration, and heat-affected zone (HAZ) width play an important role in determining the mechanical properties as well as the performance of the weld joints during service. To obtain the desired weld bead geometry and HAZ width, it becomes important to set the welding process parameters appropriately. In this work, an adaptive neuro-fuzzy inference system is used to develop independent models correlating the welding process parameters, such as current, voltage, and torch speed, with weld bead shape parameters, such as depth of penetration, bead width, and HAZ width. A genetic algorithm is then employed to determine the optimum A-TIG welding process parameters that yield the desired weld bead shape parameters and HAZ width.
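
    The inverse-modelling step described above can be sketched with a toy genetic algorithm. The surrogate model, parameter bounds and target value below are purely illustrative stand-ins (in the paper, the trained ANFIS models play the role of `predict_penetration`):

```python
import random

# Hypothetical surrogate in place of the trained ANFIS model: predicts
# depth of penetration (mm) from current (A), voltage (V), torch speed (mm/s).
def predict_penetration(current, voltage, speed):
    return 0.02 * current + 0.1 * voltage - 0.5 * speed

BOUNDS = [(80, 300), (10, 18), (1, 5)]  # illustrative parameter ranges
TARGET = 6.0                            # desired penetration, mm (assumed)

def fitness(ind):
    # Higher is better: negated squared error against the target response.
    return -(predict_penetration(*ind) - TARGET) ** 2

def clip(x, lo, hi):
    return max(lo, min(hi, x))

def genetic_search(pop_size=40, generations=60, seed=1):
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in BOUNDS] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]          # keep the best half
        children = []
        while len(children) < pop_size - len(elite):
            a, b = rng.sample(elite, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]   # averaging crossover
            j = rng.randrange(len(BOUNDS))                # mutate one gene
            lo, hi = BOUNDS[j]
            child[j] = clip(child[j] + rng.gauss(0, 0.05 * (hi - lo)), lo, hi)
            children.append(child)
        pop = elite + children
    return max(pop, key=fitness)

best = genetic_search()
```

    In the actual study the fitness would combine several ANFIS outputs (penetration, bead width, HAZ width) rather than a single target.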

  13. Intelligence Ethics:

    DEFF Research Database (Denmark)

    Rønn, Kira Vrist

    2016-01-01

    Questions concerning what constitutes a morally justified conduct of intelligence activities have received increased attention in recent decades. However, intelligence ethics is not yet homogeneous or embedded as a solid research field. The aim of this article is to sketch the state of the art of intelligence ethics and point out subjects for further scrutiny in future research. The review clusters the literature on intelligence ethics into two groups: respectively, contributions on external topics (i.e., the accountability of and the public trust in intelligence agencies) and internal topics (i.e., the search for an ideal ethical framework for intelligence actions). The article concludes that there are many holes to fill for future studies on intelligence ethics, both in external and internal discussions. Thus, the article is an invitation, especially to moral philosophers and political theorists ...

  14. 10th International Symposium on Intelligent Distributed Computing

    CERN Document Server

    Seghrouchni, Amal; Beynier, Aurélie; Camacho, David; Herpson, Cédric; Hindriks, Koen; Novais, Paulo

    2017-01-01

    This book presents the combined peer-reviewed proceedings of the tenth International Symposium on Intelligent Distributed Computing (IDC’2016), which was held in Paris, France from October 10th to 12th, 2016. The 23 contributions address a range of topics related to theory and application of intelligent distributed computing, including: Intelligent Distributed Agent-Based Systems, Ambient Intelligence and Social Networks, Computational Sustainability, Intelligent Distributed Knowledge Representation and Processing, Smart Networks, Networked Intelligence and Intelligent Distributed Applications, amongst others.

  15. Logic Programs as a Specification and Description Tool in the Design Process of an Intelligent Tutoring System

    OpenAIRE

    Möbus, Claus

    1987-01-01

    We propose the use of logic programs when designing intelligent tutoring systems. With their help we specified the small-step semantics of the learning curriculum, designed the graphical user interface, derived instructions and modelled students' knowledge.

  16. Ultrasonic velocity measurements- a potential sensor for intelligent processing of austenitic stainless steels

    International Nuclear Information System (INIS)

    Venkadesan, S.; Palanichamy, P.; Vasudevan, M.; Baldev Raj

    1996-01-01

    Development of sensors based on Non-Destructive Evaluation (NDE) techniques for on-line sensing of microstructure and properties requires a thorough knowledge of the relation between the sensing mechanism/measurement of an NDE technique and the microstructure. As a first step towards developing an on-line sensor for studying the dynamic microstructural changes during processing of austenitic stainless steels, ultrasonic velocity measurements have been carried out to study the microstructural changes after processing. Velocity measurements could follow the progress of annealing from recovery through the onset and completion of recrystallization, sense the differences in the microstructure obtained after hot deformation, and estimate the grain size. This paper brings out the relation between the sensing method based on ultrasonic velocity measurements and the microstructure in austenitic stainless steel. (author)

  17. Semantic Business Intelligence - a New Generation of Business Intelligence

    Directory of Open Access Journals (Sweden)

    Dinu AIRINEI

    2012-01-01

    Full Text Available Business intelligence solutions are applications used by companies to manage, process and analyze data in order to provide substantiated decisions. With the development of the Semantic Web, the trend is to integrate semantically described unstructured data, so business intelligence solutions must be redesigned in such a manner that they can analyze, process and synthesize, in addition to traditional data, data of another form and structure. This inevitably leads to the appearance of a new generation of BI solutions, called Semantic Business Intelligence.

  18. On-line Cutting Tool Condition Monitoring in Machining Processes Using Artificial Intelligence

    OpenAIRE

    Vallejo, Antonio J.; Morales-Menéndez, Rubén; Alique, J.R.

    2008-01-01

    This chapter presented new ideas for monitoring and diagnosis of the cutting tool condition with two different algorithms for pattern recognition: HMM and ANN. The monitoring and diagnosis system was implemented for the peripheral milling process in HSM, where several aluminium alloys and cutting tools were used. The flank wear (VB) was selected as the criterion to evaluate the tool's life, and four cutting tool conditions were defined to be recognized: new, half new, half worn, and worn conditions.

  19. Investment Cost Model in Business Process Intelligence in Banking And Electricity Company

    Directory of Open Access Journals (Sweden)

    Arta Moro Sundjaja

    2016-06-01

    Full Text Available Higher demand from top management for measuring business process performance is driving the incremental implementation of BPM and BI in enterprises. The problem top managements face is how to integrate data from all the systems used to support the business and turn those data into information able to support decision-making processes. Our literature review elaborates several implementations of BPI at companies in Australia and Germany, the challenges organizations face in developing BPI solutions, and some cost models for calculating the investment in BPI solutions. This paper presents successful BPI applications at banks and insurance companies in Germany and an electricity utility in Australia, aiming to give a vision of the importance of BPI application. The challenges of applying BPI in German and Australian companies, BPI solutions, and data warehouse design are discussed to add insight for future BPI development. Finally, we explain how to analyze the cost associated with investing in a BPI solution.

  20. Intelligence Naturelle et Intelligence Artificielle

    OpenAIRE

    Dubois, Daniel

    2011-01-01

    This article presents a systemic approach to the concept of natural intelligence, with the objective of creating an artificial intelligence. Natural intelligence, human and non-human animal, is a function composed of faculties that make it possible to know and to understand. Moreover, natural intelligence remains inseparable from its structure, namely the organs of the brain and the body. The temptation is great to endow computer systems with an artificial intelligence ...

  1. Artificial Intelligence In Processing A Sequence Of Time-Varying Images

    Science.gov (United States)

    Siler, W.; Tucker, D.; Buckley, J.; Hess, R. G.; Powell, V. G.

    1985-04-01

    A computer system is described for unsupervised analysis of five sets of ultrasound images of the heart. Each set consists of 24 frames taken at 33 millisecond intervals. The images are acquired in real time with computer control of the ultrasound apparatus. After acquisition the images are segmented by a sequence of image-processing programs; features are extracted and stored in a version of the Carnegie-Mellon Blackboard. Region classification is accomplished by a fuzzy logic expert system, FLOPS, based on OPS5. Preliminary results are given.

  2. Intelligent environmental sensing

    CERN Document Server

    Mukhopadhyay, Subhas

    2015-01-01

    Developing environmental sensing and monitoring technologies becomes essential, especially for industries that may cause severe contamination. Intelligent environmental sensing uses novel sensor techniques, intelligent signal and data processing algorithms, and wireless sensor networks to enhance environmental sensing and monitoring. It finds application in many environmental problems such as oil and gas, water quality, and agriculture. This book addresses issues related to three main approaches to intelligent environmental sensing and discusses their latest technological developments. Key contents of the book include: agricultural monitoring; classification, detection, and estimation; data fusion; geological monitoring; motor monitoring; multi-sensor systems; oil reservoir monitoring; sensor motes; water quality monitoring; and wireless sensor network protocols.

  3. Longitudinal Mediation of Processing Speed on Age-Related Change in Memory and Fluid Intelligence

    Science.gov (United States)

    Robitaille, Annie; Piccinin, Andrea M.; Muniz, Graciela; Hoffman, Lesa; Johansson, Boo; Deeg, Dorly J.H.; Aartsen, Marja J.; Comijs, Hannie C.; Hofer, Scott M.

    2014-01-01

    Age-related decline in processing speed has long been considered a key driver of cognitive aging. While the majority of empirical evidence for the processing speed hypothesis has been obtained from analyses of between-person age differences, longitudinal studies provide a direct test of within-person change. Using recent developments in longitudinal mediation analysis, we examine the speed–mediation hypothesis at both the within- and between-person levels in two longitudinal studies, LASA and OCTO-Twin. We found significant within-person indirect effects of change in age, such that increasing age was related to lower speed which, in turn, relates to lower performance across repeated measures on other cognitive outcomes. Although between-person indirect effects were also significant in LASA, they were not in OCTO-Twin. These differing magnitudes of direct and indirect effects across levels demonstrate the importance of separating between- and within-person effects in evaluating theoretical models of age-related change. PMID:23957224

  4. The Professionalization of Intelligence Cooperation

    DEFF Research Database (Denmark)

    Svendsen, Adam David Morgan

    "Providing an in-depth insight into the subject of intelligence cooperation (officially known as liaison), this book explores the complexities of this process. Towards facilitating a general understanding of the professionalization of intelligence cooperation, Svendsen's analysis includes risk management and encourages the realisation of greater resilience. Svendsen discusses the controversial, mixed and uneven characterisations of the process of the professionalization of intelligence cooperation and argues for a degree of 'fashioning method out of mayhem' through greater operational ...

  5. Intelligent tuning of vibration mitigation process for single link manipulator using fuzzy logic

    Directory of Open Access Journals (Sweden)

    Ahmed A. Ali

    2017-08-01

    Full Text Available In this work, active vibration mitigation for a smart single-link manipulator is presented. Two piezoelectric transducers were utilized to act as actuator and sensor, respectively. A classical proportional (P) controller was tested numerically and experimentally, and the comparison between the measured results showed good agreement. The proposed work introduces fuzzy logic for tuning the controller's gain within the finite element method. Classical proportional-integral (PI), fuzzy-P and fuzzy-PI controllers were fully integrated as a series of [IF-Then] rules and solved numerically using the finite element (FE) solver ANSYS. The proposed method paves the way to solving the tuning process entirely within a single FE solver with high efficiency. It achieved mitigation of the overall free response within about 52% and 74% of the manipulator settling time when the fuzzy-P and fuzzy-PI controllers were activated, respectively. This contribution can be utilized for many other applications related to fuzzy control.
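
    The [IF-Then] gain-tuning idea can be illustrated with a minimal fuzzy-P sketch. The membership functions and singleton gain values below are invented for illustration; the real rule surfaces would be tuned for the manipulator:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_p_gain(amplitude):
    """Schedule the P gain from the (normalised, 0..1) vibration amplitude.
    Hypothetical membership functions and singleton consequents."""
    memberships = {
        "small":  tri(amplitude, -0.4, 0.0, 0.4),
        "medium": tri(amplitude,  0.1, 0.5, 0.9),
        "large":  tri(amplitude,  0.6, 1.0, 1.4),
    }
    # IF amplitude IS small THEN gain IS low, etc. (singleton consequents)
    gain_for = {"small": 0.5, "medium": 2.0, "large": 5.0}
    # Weighted-average (centroid-of-singletons) defuzzification.
    num = sum(m * gain_for[k] for k, m in memberships.items())
    den = sum(memberships.values())
    return num / den if den else 0.0
```

    Each evaluation blends the rule consequents in proportion to how strongly their antecedents fire, so the gain varies smoothly between the low, medium and high levels.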

  6. An intelligent approach to optimize the EDM process parameters using utility concept and QPSO algorithm

    Directory of Open Access Journals (Sweden)

    Chinmaya P. Mohanty

    2017-04-01

    Full Text Available Although significant research has gone into the field of electrical discharge machining (EDM), analysis related to the machining efficiency of the process with different electrodes has not been adequately carried out. Copper and brass are frequently used as electrode materials, but graphite can also serve as a potential electrode material due to its high melting temperature and good electrical conductivity. In view of this, the present work compares the machinability of copper, graphite and brass electrodes while machining Inconel 718 super alloy. Taguchi's L27 orthogonal array has been employed to collect data for the study and to analyze the effect of machining parameters on performance measures. The performance measures selected for this study are material removal rate, tool wear rate, surface roughness and radial overcut. The machining parameters considered for analysis are open circuit voltage, discharge current, pulse-on time, duty factor, flushing pressure and electrode material. From the experimental analysis, it is observed that electrode material, discharge current and pulse-on time are the important parameters for all the performance measures. The utility concept has been implemented to transform the multiple performance characteristics into a single equivalent performance characteristic. Non-linear regression analysis is carried out to develop a model relating the process parameters and the overall utility index. Finally, the quantum-behaved particle swarm optimization (QPSO) and particle swarm optimization (PSO) algorithms have been used to compare the optimal levels of cutting parameters. Results demonstrate the elegance of QPSO in terms of convergence and computational effort. The optimal parametric setting obtained through both approaches is validated by conducting confirmation experiments.
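
    A minimal sketch of the QPSO algorithm is shown below, minimizing the sphere function as a stand-in for the (negated) overall utility index. This is a simplified reading of quantum-behaved PSO, not the authors' exact implementation; the particle count, iteration budget and contraction-expansion schedule are illustrative assumptions:

```python
import math
import random

def qpso(objective, dim, bounds, n_particles=30, iters=200, seed=0):
    """Quantum-behaved PSO sketch: particles move around a local attractor
    drawn between their personal best and the global best, with a jump
    scaled by the distance to the mean-best position."""
    rng = random.Random(seed)
    lo, hi = bounds
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    pbest = [x[:] for x in X]
    gbest = min(pbest, key=objective)[:]
    for t in range(iters):
        beta = 1.0 - 0.5 * t / iters  # contraction-expansion coefficient
        mbest = [sum(p[d] for p in pbest) / n_particles for d in range(dim)]
        for i, x in enumerate(X):
            for d in range(dim):
                phi = rng.random()
                attractor = phi * pbest[i][d] + (1 - phi) * gbest[d]
                u = 1.0 - rng.random()  # in (0, 1], keeps log() finite
                step = beta * abs(mbest[d] - x[d]) * math.log(1 / u)
                x[d] = attractor + step if rng.random() < 0.5 else attractor - step
                x[d] = max(lo, min(hi, x[d]))
            if objective(x) < objective(pbest[i]):
                pbest[i] = x[:]
                if objective(x) < objective(gbest):
                    gbest = x[:]
    return gbest

# Example: minimise the sphere function over four variables.
best = qpso(lambda v: sum(c * c for c in v), dim=4, bounds=(-5, 5))
```

    In the paper the objective would be the regression model of the overall utility index over the six machining parameters.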

  7. How semantics can inform the geological mapping process and support intelligent queries

    Science.gov (United States)

    Lombardo, Vincenzo; Piana, Fabrizio; Mimmo, Dario

    2017-04-01

    The geologic mapping process requires the organization of data according to the general knowledge about the objects, namely the geologic units, and to the objectives of a graphic representation of such objects in a map, following an established model of geotectonic evolution. Semantics can greatly help such a process in two respects: on the one hand, the provision of a terminological base to name and classify the objects of the map; on the other, the implementation of a machine-readable encoding of the geologic knowledge base, which supports the application of reasoning mechanisms and the derivation of novel properties and relations about the objects of the map. The OntoGeonous initiative has built a terminological base of geological knowledge in a machine-readable format, following the Semantic Web tenets and the Linked Data paradigm. The major knowledge sources of the OntoGeonous initiative are the GeoScience Markup Language schemata and vocabularies (through its latest version, GeoSciML 4, 2015, published by the IUGS CGI Commission) and the INSPIRE "Data Specification on Geology" directives (an operative simplification of GeoSciML, published by the INSPIRE Thematic Working Group Geology of the European Commission). The Linked Data paradigm has been exploited by linking (without replicating, to avoid inconsistencies) already existing machine-readable encodings for some specific domains, such as the lithology domain (vocabulary Simple Lithology) and the geochronologic time scale (ontology "gts"). Finally, for the upper-level knowledge shared across several geologic domains, we have resorted to the NASA SWEET ontology. The OntoGeonous initiative has also produced a wiki that explains how the geologic knowledge has been encoded from shared geoscience vocabularies (https://www.di.unito.it/wikigeo/). In particular, the sections dedicated to axiomatization will support the construction of an appropriate database schema that can then be filled with the objects of the map. This contribution will discuss ...

  8. Intelligent Machine Vision Based Modeling and Positioning System in Sand Casting Process

    Directory of Open Access Journals (Sweden)

    Shahid Ikramullah Butt

    2017-01-01

    Full Text Available Advanced vision solutions enable manufacturers in the technology sector to reconcile both competitive and regulatory concerns and address the need for immaculate fault detection and quality assurance. Modern manufacturing has completely shifted from manual inspection to machine-assisted vision inspection. Furthermore, research outcomes in industrial automation have revolutionized the whole product development strategy. The purpose of this research paper is to introduce a new scheme of automation in the sand casting process by means of machine vision based technology for mold positioning. Automation has been achieved by developing a novel system in which casting molds of different sizes, having different pouring cup locations and radii, position themselves in front of the induction furnace such that the center of the pouring cup comes directly beneath the pouring point of the furnace. The coordinates of the center of the pouring cup are found by using computer vision algorithms. The output is then transferred to a microcontroller which controls the alignment mechanism on which the mold is placed at the optimum location.
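
    The paper does not specify its vision algorithm, but one minimal way to locate a pouring-cup centre in an image is a thresholded centroid. The function below is a hypothetical pure-Python sketch of that idea, operating on a grey-level image given as a list of rows:

```python
def cup_center(image, threshold=128):
    """Return the (x, y) centroid of all pixels at or above `threshold`.
    `image` is a row-major list of lists of grey levels; a stand-in for
    the real segmentation pipeline."""
    xs = ys = n = 0
    for y, row in enumerate(image):
        for x, v in enumerate(row):
            if v >= threshold:
                xs += x
                ys += y
                n += 1
    if n == 0:
        raise ValueError("no pixels above threshold")
    return xs / n, ys / n

# Synthetic 5x5 frame with a bright 2x2 blob whose centroid is (2.5, 2.5).
frame = [[0] * 5 for _ in range(5)]
for y in (2, 3):
    for x in (2, 3):
        frame[y][x] = 255
cx, cy = cup_center(frame)
```

    In a deployed system the pixel coordinates would then be converted to the mold table's coordinate frame before being sent to the microcontroller.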

  9. Developing a Business Intelligence Process for a Training Module in SharePoint 2010

    Science.gov (United States)

    Schmidtchen, Bryce; Solano, Wanda M.; Albasini, Colby

    2015-01-01

    Prior to this project, training information for the employees of the National Center for Critical Information Processing and Storage (NCCIPS) was stored in an array of unrelated spreadsheets and SharePoint lists that had to be manually updated. By developing a content management system through a web application platform named SharePoint, this training system is now highly automated and provides a much less intensive method of storing training data and scheduling training courses. This system was developed by using SharePoint Designer and laying out the data structure for the interaction between different lists of data about the employees. The automation of data population inside of the lists was accomplished by implementing SharePoint workflows, which essentially lay out the logic for how data is connected and calculated between certain lists. The resulting training system is constructed from a combination of five lists of data, with a single list acting as the user-friendly interface. This interface is populated with the courses required for each employee and includes past and future information about course requirements. The employees of NCCIPS now have the ability to view, log, and schedule their training information and courses with much more ease. This system will relieve a significant amount of manual input and serve as a powerful informational resource for the employees of NCCIPS in the future.

  10. Fetal QRS extraction from abdominal recordings via model-based signal processing and intelligent signal merging

    International Nuclear Information System (INIS)

    Haghpanahi, Masoumeh; Borkholder, David A

    2014-01-01

    Noninvasive fetal ECG (fECG) monitoring has potential applications in diagnosing congenital heart diseases in a timely manner and assisting clinicians to make more appropriate decisions during labor. However, despite advances in signal processing and machine learning techniques, the analysis of fECG signals has still remained in its preliminary stages. In this work, we describe an algorithm to automatically locate QRS complexes in noninvasive fECG signals obtained from a set of four electrodes placed on the mother’s abdomen. The algorithm is based on an iterative decomposition of the maternal and fetal subspaces and filtering of the maternal ECG (mECG) components from the fECG recordings. Once the maternal components are removed, a novel merging technique is applied to merge the signals and detect the fetal QRS (fQRS) complexes. The algorithm was trained and tested on the fECG datasets provided by the PhysioNet/CinC challenge 2013. The final results indicate that the algorithm is able to detect fetal peaks for a variety of signals with different morphologies and strength levels encountered in clinical practice. (paper)
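
    The paper's iterative subspace decomposition is more elaborate, but the core of maternal-ECG cancellation can be sketched as template subtraction at known maternal QRS locations. All names and the synthetic trace below are illustrative, not the authors' code:

```python
def cancel_maternal(signal, m_peaks, half_width):
    """Subtract an averaged maternal-beat template around each maternal
    QRS location, leaving a residual in which fetal peaks can be sought.
    Assumes `m_peaks` holds at least one peak with a full window inside
    the record."""
    w = 2 * half_width + 1
    beats = [signal[p - half_width:p + half_width + 1]
             for p in m_peaks
             if p - half_width >= 0 and p + half_width < len(signal)]
    # Column-wise average of the aligned beats gives the template.
    template = [sum(col) / len(beats) for col in zip(*beats)]
    residual = signal[:]
    for p in m_peaks:
        for k in range(w):
            i = p - half_width + k
            if 0 <= i < len(signal):
                residual[i] -= template[k]
    return residual

# Synthetic trace: four identical maternal beats and no fetal component,
# so cancellation should leave a (numerically) zero residual.
beat = [0.0, 1.0, 5.0, 1.0, 0.0]
signal = beat * 4
m_peaks = [2, 7, 12, 17]
residual = cancel_maternal(signal, m_peaks, half_width=2)
```

    On real abdominal recordings the residual would still contain noise and fetal complexes of varying morphology, which is where the paper's merging and fQRS detection stages take over.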

  11. Designing a framework of intelligent information processing for dentistry administration data.

    Science.gov (United States)

    Amiri, N; Matthews, D C; Gao, Q

    2005-07-01

    This study was designed to test a cumulative view of current data in the clinical database at the Faculty of Dentistry, Dalhousie University. We planned to examine associations among demographic factors and treatments. Three tables were selected from the database of the faculty: patient, treatment and procedures. All fields and record numbers in each table were documented. Data were explored using SQL Server and Visual Basic and then cleaned by removing incongruent fields. After transformation, a data warehouse was created. This was imported to SQL analysis services manager to create an OLAP (Online Analytical Processing) cube. The multidimensional model used for access to data was created using a star schema. Treatment count was the measurement variable. Five dimensions (date, postal code, gender, age group and treatment categories) were used to detect associations. Another data warehouse of 8 tables (international tooth code #1-8) was created and imported to SAS Enterprise Miner to complete data mining. Association nodes were used for each table to find sequential associations, and minimum criteria were set to 2% of cases. Findings of this study confirmed most assumptions of treatment planning procedures. There were some small unexpected patterns of clinical interest. Further developments are recommended to create predictive models. Recent improvements in information technology offer numerous advantages for conversion of raw data from faculty databases to information and subsequently to knowledge. This knowledge can be used by decision makers, managers, and researchers to answer clinical questions, affect policy change and determine future research needs.
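
    A star schema with treatment count as the measure can be sketched with an in-memory SQLite database. The table and column names below are illustrative guesses, not the faculty's actual schema:

```python
import sqlite3

# Minimal star schema: one fact table (treatment_fact) holding the
# treatment-count measure, joined to patient and treatment dimensions.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_patient (patient_id INTEGER PRIMARY KEY,
                          gender TEXT, age_group TEXT, postal_code TEXT);
CREATE TABLE dim_treatment (treatment_id INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE treatment_fact (patient_id INTEGER, treatment_id INTEGER,
                             visit_date TEXT, treatment_count INTEGER);
""")
con.executemany("INSERT INTO dim_patient VALUES (?,?,?,?)",
                [(1, "F", "20-39", "B3H"), (2, "M", "40-59", "B3J")])
con.executemany("INSERT INTO dim_treatment VALUES (?,?)",
                [(10, "restorative"), (11, "periodontal")])
con.executemany("INSERT INTO treatment_fact VALUES (?,?,?,?)",
                [(1, 10, "2004-01-05", 2), (1, 11, "2004-02-10", 1),
                 (2, 10, "2004-03-01", 3)])

# Roll-up along two dimensions: treatment count by gender and category.
cube = con.execute("""
    SELECT p.gender, t.category, SUM(f.treatment_count)
    FROM treatment_fact f
    JOIN dim_patient p USING (patient_id)
    JOIN dim_treatment t USING (treatment_id)
    GROUP BY p.gender, t.category
    ORDER BY p.gender, t.category
""").fetchall()
```

    An OLAP cube generalizes this GROUP BY roll-up by pre-aggregating the measure over every combination of the chosen dimensions.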

  12. Use of Information Intelligent Components for the Analysis of Complex Processes of Marine Energy Systems

    Directory of Open Access Journals (Sweden)

    Chernyi Sergei

    2016-09-01

    Full Text Available Synchronous motors and their modifications (AC converter-fed motors, etc.) make it possible to develop low-noise, reliable and economically efficient electric drive systems. The construction of up-to-date systems based on synchronous machines is impossible without computing software incorporating mathematical and computational simulation. In turn, modelling of synchronous machines is, as a rule, based on the Park-Gorev equations, whose application requires the adoption of a series of simplifying assumptions. In a number of cases these assumptions do not permit adequate simulation results that coincide with the results of field experiments on the systems under review. Moreover, when applying the Park-Gorev equations to systems in which synchronous machines interact with semiconductor converters of electric energy, it is necessary to simulate the formation of their control signals in the frequency domain. If the states of the converter's switches are defined not only by the control impulses but also by the state of the machine currents flowing through them, such an approach is not reasonable.
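
    For reference, the Park-Gorev (dq-frame) stator voltage equations mentioned above can be written, in one common motor-convention form (sign conventions vary between textbooks), as:

```latex
\begin{aligned}
u_d &= r_s i_d + \frac{d\psi_d}{dt} - \omega \psi_q,\\
u_q &= r_s i_q + \frac{d\psi_q}{dt} + \omega \psi_d,
\end{aligned}
```

    where u_d, u_q, i_d, i_q and ψ_d, ψ_q are the d- and q-axis stator voltages, currents and flux linkages, r_s is the stator resistance and ω is the electrical rotor speed. The simplifying assumptions criticized in the abstract (sinusoidal windings, neglected saturation and iron losses, symmetric phases) are what this compact form buys.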

  13. Genes, evolution and intelligence.

    Science.gov (United States)

    Bouchard, Thomas J

    2014-11-01

    I argue that the g factor meets the fundamental criteria of a scientific construct more fully than any other conception of intelligence. I briefly discuss the evidence regarding the relationship of brain size to intelligence. A review of a large body of evidence demonstrates that there is a g factor in a wide range of species and that, in the species studied, it relates to brain size and is heritable. These findings suggest that many species have evolved a general-purpose mechanism (a general biological intelligence) for dealing with the environments in which they evolved. In spite of numerous studies with considerable statistical power, we know of very few genes that influence g, and their effects are very small. Nevertheless, g appears to be highly polygenic. Given the complexity of the human brain, it is not surprising that one of its primary faculties, intelligence, is best explained by the near-infinitesimal model of quantitative genetics.

  14. Advertising and algorithms – the obvious gains and hidden losses of using software with intelligent agent capabilities in the creative process of art directors and copywriters

    OpenAIRE

    Barker, Richie

    2017-01-01

    Situated at the intersection of information technology, advertising and creativity theory, this thesis presents a detailed picture of the influence of autonomous software applications on the creative process of advertising art directors and copywriters. These applications, which are known in the field of information technology as ‘intelligent agents,’ commonly possess the ability to learn from the user and autonomously pursue their own goals. The search engine Google, which employs intelligen...

  15. Artificial intelligence in cardiology

    OpenAIRE

    Bonderman, Diana

    2017-01-01

    Summary Decision-making is complex in modern medicine and should ideally be based on available data, structured knowledge and proper interpretation in the context of an individual patient. Automated algorithms, also termed artificial intelligence, that are able to extract meaningful patterns from data collections and build decisions upon identified patterns may be useful assistants in clinical decision-making processes. In this article, artificial intelligence-based studies in clinical cardiol...

  16. Flow chemistry: intelligent processing of gas-liquid transformations using a tube-in-tube reactor.

    Science.gov (United States)

    Brzozowski, Martin; O'Brien, Matthew; Ley, Steven V; Polyzos, Anastasios

    2015-02-17

    reactive gas in a given reaction mixture. We have developed a tube-in-tube reactor device consisting of a pair of concentric capillaries in which pressurized gas permeates through an inner Teflon AF-2400 tube and reacts with dissolved substrate within a liquid phase that flows within a second gas impermeable tube. This Account examines our efforts toward the development of a simple, unified methodology for the processing of gaseous reagents in flow by way of development of a tube-in-tube reactor device and applications to key C-C, C-N, and C-O bond forming and hydrogenation reactions. We further describe the application to multistep reactions using solid-supported reagents and extend the technology to processes utilizing multiple gas reagents. A key feature of our work is the development of computer-aided imaging techniques to allow automated in-line monitoring of gas concentration and stoichiometry in real time. We anticipate that this Account will illustrate the convenience and benefits of membrane tube-in-tube reactor technology to improve and concomitantly broaden the scope of gas/liquid/solid reactions in organic synthesis.

  17. Intelligence and treaty ratification

    International Nuclear Information System (INIS)

    Sojka, G.L.

    1990-01-01

    What did the intelligence community and the Intelligence Committee do poorly in regard to the treaty ratification process for arms control? We failed to solve the compartmentalization problem. This is a second-order problem, and, in general, analysts try to be very open; but there are problems nevertheless. There are very few people, if any, within the intelligence community who are cleared for everything relevant to our monitoring capability (short of, probably, the Director of Central Intelligence and the president), and this is a major problem. The formal monitoring estimates are drawn up by individuals who do not have access to all the information needed to make the monitoring judgements. This paper reports that the intelligence community did not present a formal document on either Soviet incentives or disincentives to cheat or on possible cheating scenarios, and that was a mistake. However, the intelligence community was very responsive in producing those types of estimates and, ultimately, the evidence behind them in response to questions. Nevertheless, the author thinks the intelligence community would do well to address this issue up front, before a treaty is submitted to the Senate for advice and consent.

  18. The impact of study design and diagnostic approach in a large multi-centre ADHD study: Part 2: Dimensional measures of psychopathology and intelligence

    Directory of Open Access Journals (Sweden)

    Roeyers Herbert

    2011-04-01

    Full Text Available Abstract Background The International Multi-centre ADHD Genetics (IMAGE) project, with 11 participating centres from 7 European countries and Israel, has collected a large behavioural and genetic database for present and future research. Behavioural data were collected from 1068 probands with ADHD and 1446 unselected siblings. The aim was to describe and analyse questionnaire data and IQ measures from all probands and siblings, and in particular to investigate the influence of age, gender, family status (proband vs. sibling), informant, and centre on sample homogeneity in psychopathological measures. Methods Conners' Questionnaires, Strengths and Difficulties Questionnaires, and Wechsler Intelligence Scores were used to describe the phenotype of the sample. Data were analysed by use of robust statistical multi-way procedures. Results Besides main effects of age, gender, informant, and centre, there were considerable interaction effects on questionnaire data. The larger differences between probands and siblings at home than at school may reflect contrast effects in the parents. Furthermore, there were marked gender-by-status effects on the ADHD symptom ratings, with girls scoring one standard deviation higher than boys in the proband sample but lower than boys in the sibling sample. The multi-centre design is another important source of heterogeneity, particularly in its interaction with family status. To a large extent the centres differed from each other with regard to differences between proband and sibling scores. Conclusions When ADHD probands are diagnosed by use of fixed symptom counts, the severity of the disorder in the proband sample may markedly differ between boys and girls and across age, particularly in samples with a large age range. A multi-centre design carries the risk of considerable phenotypic differences between centres and, consequently, of additional heterogeneity of the sample even if standardized diagnostic procedures are used.

  19. The impact of study design and diagnostic approach in a large multi-centre ADHD study: Part 2: Dimensional measures of psychopathology and intelligence.

    LENUS (Irish Health Repository)

    Muller, Ueli C

    2011-04-07

Abstract Background The International Multi-centre ADHD Genetics (IMAGE) project with 11 participating centres from 7 European countries and Israel has collected a large behavioural and genetic database for present and future research. Behavioural data were collected from 1068 probands with ADHD and 1446 unselected siblings. The aim was to describe and analyse questionnaire data and IQ measures from all probands and siblings. In particular, to investigate the influence of age, gender, family status (proband vs. sibling), informant, and centres on sample homogeneity in psychopathological measures. Methods Conners' Questionnaires, Strengths and Difficulties Questionnaires, and Wechsler Intelligence Scores were used to describe the phenotype of the sample. Data were analysed by use of robust statistical multi-way procedures. Results Besides main effects of age, gender, informant, and centre, there were considerable interaction effects on questionnaire data. The larger differences between probands and siblings at home than at school may reflect contrast effects in the parents. Furthermore, there were marked gender by status effects on the ADHD symptom ratings, with girls scoring one standard deviation higher than boys in the proband sample but lower than boys in the sibling sample. The multi-centre design is another important source of heterogeneity, particularly in its interaction with family status. To a large extent the centres differed from each other with regard to differences between proband and sibling scores. Conclusions When ADHD probands are diagnosed by use of fixed symptom counts, the severity of the disorder in the proband sample may markedly differ between boys and girls and across age, particularly in samples with a large age range. A multi-centre design carries the risk of considerable phenotypic differences between centres and, consequently, of additional heterogeneity of the sample even if standardized diagnostic procedures are used.

  20. Assessment of emotion processing skills in acquired brain injury using an ability-based test of emotional intelligence.

    Science.gov (United States)

    Hall, Sarah E; Wrench, Joanne M; Wilson, Sarah J

    2018-04-01

    Social and emotional problems are commonly reported after moderate to severe acquired brain injury (ABI) and pose a significant barrier to rehabilitation. However, progress in assessment of emotional skills has been limited by a lack of validated measurement approaches. This study represents the first formal psychometric evaluation of the use of the Mayer-Salovey-Caruso Emotional Intelligence Test (MSCEIT) V2.0 as a tool for assessing skills in perceiving, using, understanding and managing emotions following ABI. The sample consisted of 82 participants aged 18-80 years in the postacute phase of recovery (2 months-7 years) after moderate to severe ABI. Participants completed the MSCEIT V2.0 and measures of cognition and mood. Sociodemographic and clinical variables were collated from participant interview and medical files. Results revealed deficits across all MSCEIT subscales (approximately 1 SD below the normative mean). Internal consistency was adequate at overall, area, and branch levels, and MSCEIT scores correlated in expected ways with key demographic, clinical, cognitive, and mood variables. MSCEIT performance was related to injury severity and clinician-rated functioning after ABI. Confirmatory factor analysis favored a 3-factor model of EI due to statistical redundancy of the Using Emotions branch. Overall, these findings suggest that the MSCEIT V2.0 is sensitive to emotion processing deficits after moderate to severe ABI, and can yield valid and reliable scores in an ABI sample. In terms of theoretical contributions, our findings support a domain-based, 3-factor approach for characterizing emotion-related abilities in brain-injured individuals. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  1. Large-Deviation Results for Discriminant Statistics of Gaussian Locally Stationary Processes

    Directory of Open Access Journals (Sweden)

    Junichi Hirukawa

    2012-01-01

Full Text Available This paper discusses the large-deviation principle of discriminant statistics for Gaussian locally stationary processes. First, large-deviation theorems for quadratic forms and the log-likelihood ratio for a Gaussian locally stationary process with a mean function are proved. Their asymptotics are described by the large-deviation rate functions. Second, we consider situations where the processes are misspecified as stationary. In these misspecified cases, we formally construct the log-likelihood ratio discriminant statistics and derive large-deviation theorems for them. Since these are complicated, they are evaluated and illustrated by numerical examples. We find that misspecifying the process as stationary seriously affects the discrimination.
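Stated informally (the paper's precise regularity conditions and rate functions are not reproduced here), a large-deviation principle for a normalized discriminant statistic $T_n$ built from $n$ observations says that rare events decay exponentially:

```latex
P\bigl(T_n \in A\bigr) \approx \exp\Bigl(-n \inf_{x \in A} I(x)\Bigr),
\qquad n \to \infty,
```

where the rate function $I$ is typically obtained, via the Gärtner-Ellis theorem, as the Legendre transform $I(x)=\sup_{\theta}\{\theta x - \Lambda(\theta)\}$ of the limiting cumulant generating function $\Lambda(\theta)=\lim_{n\to\infty}\tfrac{1}{n}\log E\bigl[e^{n\theta T_n}\bigr]$.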

  2. Modelling intelligent behavior

    Science.gov (United States)

    Green, H. S.; Triffet, T.

    1993-01-01

    An introductory discussion of the related concepts of intelligence and consciousness suggests criteria to be met in the modeling of intelligence and the development of intelligent materials. Methods for the modeling of actual structure and activity of the animal cortex have been found, based on present knowledge of the ionic and cellular constitution of the nervous system. These have led to the development of a realistic neural network model, which has been used to study the formation of memory and the process of learning. An account is given of experiments with simple materials which exhibit almost all properties of biological synapses and suggest the possibility of a new type of computer architecture to implement an advanced type of artificial intelligence.
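The memory formation and recall described in this record can be illustrated with a toy associative network. This is a generic Hopfield-style sketch using the Hebbian rule, not the authors' cortical model; the six-neuron pattern and all names are invented for illustration.

```python
import numpy as np

def hebbian_train(patterns):
    """Store binary (+/-1) patterns in a Hopfield-style weight matrix
    via the Hebbian rule: neurons that fire together wire together."""
    n = patterns.shape[1]
    w = np.zeros((n, n))
    for p in patterns:
        w += np.outer(p, p)
    np.fill_diagonal(w, 0)  # no self-connections
    return w / patterns.shape[0]

def recall(w, cue, steps=10):
    """Synchronously update the network until it settles on a stored memory."""
    s = cue.copy()
    for _ in range(steps):
        s = np.where(w @ s >= 0, 1, -1)
    return s

memory = np.array([[1, -1, 1, -1, 1, -1]])
w = hebbian_train(memory)
noisy = np.array([1, -1, 1, -1, -1, -1])  # cue with one flipped bit
print(recall(w, noisy))  # recovers the stored pattern
```

With a single stored pattern, one update step already corrects the flipped bit, which is the "formation of memory" effect in its simplest form.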

  3. Alzheimer's disease and intelligence.

    Science.gov (United States)

    Yeo, R A; Arden, R; Jung, R E

    2011-06-01

    A significant body of evidence has accumulated suggesting that individual variation in intellectual ability, whether assessed directly by intelligence tests or indirectly through proxy measures, is related to risk of developing Alzheimer's disease (AD) in later life. Important questions remain unanswered, however, such as the specificity of risk for AD vs. other forms of dementia, and the specific links between premorbid intelligence and development of the neuropathology characteristic of AD. Lower premorbid intelligence has also emerged as a risk factor for greater mortality across myriad health and mental health diagnoses. Genetic covariance contributes importantly to these associations, and pleiotropic genetic effects may impact diverse organ systems through similar processes, including inefficient design and oxidative stress. Through such processes, the genetic underpinnings of intelligence, specifically, mutation load, may also increase the risk of developing AD. We discuss how specific neurobiologic features of relatively lower premorbid intelligence, including reduced metabolic efficiency, may facilitate the development of AD neuropathology. The cognitive reserve hypothesis, the most widely accepted account of the intelligence-AD association, is reviewed in the context of this larger literature.

  4. Computer automation and artificial intelligence

    International Nuclear Information System (INIS)

    Hasnain, S.B.

    1992-01-01

Rapid advances in computing resulting from the microchip revolution have increased its applications manifold, particularly for computer automation. Yet the level of automation available has limited its application to more complex and dynamic systems, which require intelligent computer control. In this paper a review of artificial intelligence techniques used to augment automation is presented. The sequential processing approach usually adopted in artificial intelligence has succeeded in emulating the symbolic processing part of intelligence, but the processing power required to capture the more elusive aspects of intelligence leads towards parallel processing. An overview of parallel processing with emphasis on the transputer is also provided. A fuzzy knowledge-based controller for drug delivery in muscle relaxant anesthesia, implemented on a transputer, is described. 4 figs. (author)
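The record's fuzzy controller itself is not reproduced here; as a hedged illustration of how a fuzzy knowledge-based dosing rule can work, the sketch below maps a relaxation-level error through triangular membership functions and three rules to an infusion-rate adjustment. The membership ranges, rule consequents, and units are hypothetical.

```python
def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_infusion_adjust(error):
    """error = target relaxation level - measured level, in [-1, 1].
    Rules: too relaxed -> decrease dose; on target -> hold; too light -> increase."""
    mu_neg = tri(error, -2.0, -1.0, 0.0)   # patient more relaxed than target
    mu_zero = tri(error, -0.5, 0.0, 0.5)
    mu_pos = tri(error, 0.0, 1.0, 2.0)     # patient less relaxed than target
    # singleton consequents (ml/h change) combined by centroid defuzzification
    num = mu_neg * (-2.0) + mu_zero * 0.0 + mu_pos * (+2.0)
    den = mu_neg + mu_zero + mu_pos
    return num / den if den else 0.0

print(fuzzy_infusion_adjust(0.0))   # 0.0: on target, hold the rate
print(fuzzy_infusion_adjust(0.8))   # 2.0: under-relaxed, increase the rate
```

The appeal of this structure for automation is that each rule is a readable statement of clinical knowledge rather than an opaque gain.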

  5. Informational support of the investment process in a large city economy

    Directory of Open Access Journals (Sweden)

    Tamara Zurabovna Chargazia

    2016-12-01

    Full Text Available Large cities possess a sufficient potential to participate in the investment processes both at the national and international levels. A potential investor’s awareness of the possibilities and prospects of a city development is of a great importance for him or her to make a decision. So, providing a potential investor with relevant, laconic and reliable information, the local authorities increase the intensity of the investment process in the city economy and vice-versa. As a hypothesis, there is a proposition that a large city administration can sufficiently activate the investment processes in the economy of a corresponding territorial entity using the tools of the information providing. The purpose of this article is to develop measures for the improvement of the investment portal of a large city as an important instrument of the information providing, which will make it possible to brisk up the investment processes at the level under analysis. The reasons of the unsatisfactory information providing on the investment process in a large city economy are deeply analyzed; the national and international experience in this sphere is studied; advantages and disadvantages of the information providing of the investment process in the economy of the city of Makeyevka are considered; the investment portals of different cities are compared. There are suggested technical approaches for improving the investment portal of a large city. The research results can be used to improve the investment policy of large cities.

  6. Intelligent Design

    DEFF Research Database (Denmark)

    Hjorth, Poul G.

    2005-01-01

The notion that nature is designed by a divine 'intelligence' is a beautiful philosophical principle. Theories of Intelligent Design as a scientifically based theory are, on the other hand, utterly dreadful.

  7. Intelligent editor/printer enhancements

    Science.gov (United States)

    Woodfill, M. C.; Pheanis, D. C.

    1983-01-01

Microprocessor support hardware, software, and cross assemblers for the Motorola 6800 and 6809 processor systems were developed. Printer controller and intelligent CRT development are discussed. The user's manual, design specifications for the MC6809 version of the intelligent printer controller card, a 132-character by 64-line intelligent CRT display system using a Motorola 6809 MPU, and a one-line assembler and disassembler are provided.

  8. Intelligent Agent Appropriation in the Tracking Phase of an Environmental Scanning Process: A Case Study of a French Trade Union

    Science.gov (United States)

    Lafaye, Christophe

    2009-01-01

    Introduction: The rapid growth of the Internet has modified the boundaries of information acquisition (tracking) in environmental scanning. Despite the numerous advantages of this new medium, information overload is an enormous problem for Internet scanners. In order to help them, intelligent agents (i.e., autonomous, automated software agents…

  9. Exploring Possible Neural Mechanisms of Intelligence Differences Using Processing Speed and Working Memory Tasks: An fMRI Study

    Science.gov (United States)

    Waiter, Gordon D.; Deary, Ian J.; Staff, Roger T.; Murray, Alison D.; Fox, Helen C.; Starr, John M.; Whalley, Lawrence J.

    2009-01-01

    To explore the possible neural foundations of individual differences in intelligence test scores, we examined the associations between Raven's Matrices scores and two tasks that were administered in a functional magnetic resonance imaging (fMRI) setting. The two tasks were an n-back working memory (N = 37) task and inspection time (N = 47). The…

  10. Process variations in surface nano geometries manufacture on large area substrates

    DEFF Research Database (Denmark)

    Calaon, Matteo; Hansen, Hans Nørgaard; Tosello, Guido

    2014-01-01

The need to transport, treat and measure increasingly smaller biomedical samples has pushed the integration of a far-reaching number of nanofeatures over large substrate sizes, relative to the working-area windows of conventional processes. Dimensional stability of nanofabrication processes…

  11. Feasibility of large volume casting cementation process for intermediate level radioactive waste

    International Nuclear Information System (INIS)

    Chen Zhuying; Chen Baisong; Zeng Jishu; Yu Chengze

    1988-01-01

The recent tendency of radioactive waste treatment and disposal, both in China and abroad, is reviewed. The feasibility of the large volume casting cementation process for treating and disposing of the intermediate level radioactive waste from a spent fuel reprocessing plant in shallow land is assessed on the basis of analyses of the experimental results (such as formulation studies and measurement of the properties of the solidified radioactive waste, etc.). It can be concluded that the large volume casting cementation process is a promising, safe and economical process. It is feasible to dispose of the intermediate level radioactive waste from the reprocessing plant if the chosen disposal site has reasonable geological and geographical conditions and some additional effective protection measures are taken

  12. Intelligent Extruder

    Energy Technology Data Exchange (ETDEWEB)

Alper Eker; Mark Giammattia; Paul Houpt; Aditya Kumar; Oscar Montero; Minesh Shah; Norberto Silvi; Timothy Cribbs

    2003-04-24

'Intelligent Extruder', described in this report, is a software system and associated support services for monitoring and control of compounding extruders to improve material quality and reduce waste and energy use, with minimal addition of new sensors or changes to factory-floor system components. Emphasis is on process improvements to the mixing, melting and de-volatilization of base resins, fillers, pigments, fire retardants and other additives in the 'finishing' stage of high-value-added engineering polymer materials. While GE Plastics materials were used for experimental studies throughout the program, the concepts and principles are broadly applicable to other manufacturers' materials. The project involved a joint collaboration among GE Global Research, GE Industrial Systems and Coperion Werner & Pleiderer, USA, a major manufacturer of compounding equipment. The scope of the program included development of algorithms for monitoring process material viscosity without rheological sensors or generating waste streams, a novel scheme for rapid detection of process upsets, and an adaptive feedback control system to compensate for process upsets where at-line adjustments are feasible. Software algorithms were implemented and tested on a laboratory-scale extruder (50 lb/hr) at GE Global Research, and data from a production-scale system (2000 lb/hr) at GE Plastics was used to validate the monitoring and detection software. Although not evaluated experimentally, a new concept for extruder process monitoring through estimation of high-frequency drive torque without strain gauges is developed and demonstrated in simulation. A plan to commercialize the software system is outlined, but commercialization has not been completed.
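One standard way to track melt viscosity from signals a compounding extruder already provides, without a rheological sensor, is to trend specific mechanical energy (SME); this is a generic illustration, not the report's proprietary algorithm, and the operating point below is hypothetical.

```python
import math

def specific_energy(torque_nm, screw_speed_rpm, throughput_kg_h):
    """Specific mechanical energy (kWh/kg) from drive torque, screw speed and
    throughput -- quantities already available on a compounding extruder.
    SME tends to rise with melt viscosity, so trending it can flag viscosity
    drift without a rheological sensor or a waste stream."""
    omega = screw_speed_rpm * 2 * math.pi / 60          # rad/s
    power_kw = torque_nm * omega / 1000.0               # mechanical power, kW
    return power_kw / throughput_kg_h                   # energy per kg, kWh/kg

# hypothetical operating point: 400 N*m, 300 rpm, 50 kg/h
print(round(specific_energy(400, 300, 50), 3))  # 0.251 kWh per kg of material
```

In practice one would trend this value against a baseline for the recipe and alarm on sustained drift, which matches the report's goal of upset detection with no new sensors.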

  13. Progress in Root Cause and Fault Propagation Analysis of Large-Scale Industrial Processes

    Directory of Open Access Journals (Sweden)

    Fan Yang

    2012-01-01

Full Text Available In large-scale industrial processes, a fault can easily propagate between process units due to the interconnections of material and information flows. Thus the problem of fault detection and isolation for these processes is more concerned with the root cause and fault propagation before applying quantitative methods in local models. Process topology and causality, as the key features of the process description, need to be captured from process knowledge and process data. The modelling methods from these two aspects are overviewed in this paper. From process knowledge, structural equation modelling, various causal graphs, rule-based models, and ontological models are summarized. From process data, cross-correlation analysis, Granger causality and its extensions, frequency-domain methods, information-theoretical methods, and Bayesian nets are introduced. Based on these models, inference methods are discussed to find root causes and fault propagation paths under abnormal situations. Some future work is proposed at the end.
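Of the data-driven methods listed, cross-correlation analysis is the simplest: the lag at which two process variables correlate most strongly hints at the direction of fault propagation. A minimal sketch on synthetic data (the variable roles are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)

def lagged_xcorr_direction(x, y, max_lag=20):
    """Return the lag (in samples) at which the cross-correlation of x and y
    peaks; a positive lag suggests x leads y, i.e. a disturbance would
    propagate from the unit measured by x to the unit measured by y."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    lags = range(-max_lag, max_lag + 1)
    corrs = []
    for k in lags:
        if k >= 0:
            c = np.mean(x[: len(x) - k] * y[k:])   # x[t] against y[t + k]
        else:
            c = np.mean(x[-k:] * y[: len(y) + k])  # y leads when k < 0
        corrs.append(c)
    return list(lags)[int(np.argmax(np.abs(corrs)))]

# synthetic example: y is a delayed, noisy copy of x (delay = 5 samples)
x = rng.standard_normal(1000)
y = np.roll(x, 5) + 0.1 * rng.standard_normal(1000)
print(lagged_xcorr_direction(x, y))  # 5 -> x leads y by 5 samples
```

Granger causality and transfer entropy refine this idea for cases where pure lag correlation is ambiguous, e.g. under feedback control loops.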

  14. Extraterrestrial processing and manufacturing of large space systems. Volume 3: Executive summary

    Science.gov (United States)

    Miller, R. H.; Smith, D. B. S.

    1979-01-01

Facilities and equipment are defined for refining processes to commercial grade of lunar material that is delivered to a 'space manufacturing facility' in beneficiated, primary-processed quality. The manufacturing facilities and the equipment for producing elements of large space systems from these materials are also defined, and programmatic assessments of the concepts are provided. In-space production processes of solar cells (by vapor deposition) and arrays, structures and joints, conduits, waveguides, RF equipment radiators, wire cables, converters, and others are described.

  15. Implementation of Business Intelligence on Banking, Retail, and Educational Industry

    OpenAIRE

    Sundjaja, Arta Moro

    2013-01-01

Information technology is useful for automating business processes that involve considerable daily data transactions. Currently, companies have to tackle large data transactions that are difficult to handle manually. It is very difficult for a person to manually extract useful information from a large data set, despite the fact that the information may be useful in decision-making. This article studied and explored the implementation of business intelligence in the banking, retail, and educational industries.

  16. Intelligent playgrounds

    DEFF Research Database (Denmark)

    Larsen, Lasse Juel

    2009-01-01

This paper examines play, gaming and learning with regard to intelligent playware developed for outdoor use. The key question is how these novel artefacts influence the concepts of play, gaming and learning. Until now, play and games have been understood as different activities. This paper examines whether the sharp differentiation between the two can be upheld with regard to intelligent playware for outdoor use. Play and game activities are analysed and viewed in conjunction with learning contexts. The paper stipulates that intelligent playware facilitates rapid shifts in contexts...

  17. Artificial intelligence

    CERN Document Server

    Ennals, J R

    1987-01-01

    Artificial Intelligence: State of the Art Report is a two-part report consisting of the invited papers and the analysis. The editor first gives an introduction to the invited papers before presenting each paper and the analysis, and then concludes with the list of references related to the study. The invited papers explore the various aspects of artificial intelligence. The analysis part assesses the major advances in artificial intelligence and provides a balanced analysis of the state of the art in this field. The Bibliography compiles the most important published material on the subject of

  18. Artificial Intelligence

    CERN Document Server

    Warwick, Kevin

    2011-01-01

    if AI is outside your field, or you know something of the subject and would like to know more then Artificial Intelligence: The Basics is a brilliant primer.' - Nick Smith, Engineering and Technology Magazine November 2011 Artificial Intelligence: The Basics is a concise and cutting-edge introduction to the fast moving world of AI. The author Kevin Warwick, a pioneer in the field, examines issues of what it means to be man or machine and looks at advances in robotics which have blurred the boundaries. Topics covered include: how intelligence can be defined whether machines can 'think' sensory

  19. Bringing Business Intelligence to Health Information Technology Curriculum

    Science.gov (United States)

    Zheng, Guangzhi; Zhang, Chi; Li, Lei

    2015-01-01

Business intelligence (BI) and healthcare analytics are emerging technologies that provide analytical capability to help the healthcare industry improve service quality, reduce cost, and manage risks. However, such a component on analytical healthcare data processing is largely missing from current healthcare information technology (HIT) or health…

  20. Intelligent Speed Assistance (ISA).

    NARCIS (Netherlands)

    2015-01-01

    Intelligent Speed Assistance (ISA) has been a promising type of advanced driver support system for some decades. From a technical point of view, large scale ISA implementation is possible in the short term. The different types of ISA are expected to have different effects on behaviour and traffic

  1. Distributed intelligence at CELLO

    International Nuclear Information System (INIS)

    Boer, W. de

    1981-01-01

    This paper describes the use of distributed intelligence at CELLO, a large 4π detector at PETRA. Besides special purpose hardware processors for online calibration and reformatting of data, several microcomputers are used for monitoring and testing the various detector components. (orig.)

  2. Large break frequency for the SRS (Savannah River Site) production reactor process water system

    International Nuclear Information System (INIS)

    Daugherty, W.L.; Awadalla, N.G.; Sindelar, R.L.; Bush, S.H.

    1989-01-01

The objective of this paper is to present the results and conclusions of an evaluation of the large break frequency for the process water system (primary coolant system), including the piping, reactor tank, heat exchangers, expansion joints and other process water system components. This evaluation was performed to support the ongoing PRA effort and to complement deterministic analyses addressing the credibility of a double-ended guillotine break. This evaluation encompasses three specific areas: the failure probability of large process water piping directly from imposed loads, the indirect failure probability of piping caused by the seismic-induced failure of surrounding structures, and the failure of all other process water components. The first two of these areas are discussed in detail in other papers. This paper primarily addresses the failure frequency of components other than piping, and includes the other two areas as contributions to the overall process water system break frequency.
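The three contributions described above combine by simple addition when the failure modes are rare and approximately independent. The sketch below illustrates the arithmetic only; the frequencies are invented placeholders, not values from the SRS evaluation.

```python
# Combining three contributions to a system large-break frequency
# (per reactor-year). All numbers below are purely illustrative.
contributions = {
    "direct pipe failure under imposed loads": 1.0e-6,
    "seismic-induced indirect pipe failure":   5.0e-7,
    "other components (tank, HX, joints)":     2.0e-6,
}

# For rare, independent failure modes the system frequency is approximately
# the sum of the individual mode frequencies (higher-order terms negligible).
total = sum(contributions.values())
print(f"total large-break frequency ~ {total:.1e} per reactor-year")
```

A full PRA treatment would propagate uncertainty distributions on each term rather than point values, but the point-estimate sum is the usual first-order summary.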

  3. The crustal dynamics intelligent user interface anthology

    Science.gov (United States)

    Short, Nicholas M., Jr.; Campbell, William J.; Roelofs, Larry H.; Wattawa, Scott L.

    1987-01-01

    The National Space Science Data Center (NSSDC) has initiated an Intelligent Data Management (IDM) research effort which has, as one of its components, the development of an Intelligent User Interface (IUI). The intent of the IUI is to develop a friendly and intelligent user interface service based on expert systems and natural language processing technologies. The purpose of such a service is to support the large number of potential scientific and engineering users that have need of space and land-related research and technical data, but have little or no experience in query languages or understanding of the information content or architecture of the databases of interest. This document presents the design concepts, development approach and evaluation of the performance of a prototype IUI system for the Crustal Dynamics Project Database, which was developed using a microcomputer-based expert system tool (M. 1), the natural language query processor THEMIS, and the graphics software system GSS. The IUI design is based on a multiple view representation of a database from both the user and database perspective, with intelligent processes to translate between the views.

  4. Finding competitive intelligence on Internet start-up companies: a study of secondary resource use and information-seeking processes

    Directory of Open Access Journals (Sweden)

    2001-01-01

    Full Text Available The paper reports findings from a study of CI activities involving Internet start-up companies in the telecommunications industry. The CI gathering was conducted by graduate students in library and information science in the context of a class project for a real business client, itself a small Internet start-up company. The primary objective of the study was to provide empirical insights into the applicability of specific types of secondary information resources to finding competitive intelligence information on small Internet start-up companies. An additional objective was to identify the characteristics of research strategies applied in the collection of CI on Internet start-ups from the perspective of current conceptual frameworks of information-seeking behaviour presented in the library and information science literature. This study revealed some interesting findings regarding the types of secondary information resources that can be used to find competitive intelligence on small, Internet start-up companies. The study also provided insight into the characteristics of the overall information-seeking strategies that are applied in this type of competitive intelligence research.

  5. Medical Students Perceive Better Group Learning Processes when Large Classes Are Made to Seem Small

    Science.gov (United States)

    Hommes, Juliette; Arah, Onyebuchi A.; de Grave, Willem; Schuwirth, Lambert W. T.; Scherpbier, Albert J. J. A.; Bos, Gerard M. J.

    2014-01-01

Objective Medical schools struggle with large classes, which might interfere with the effectiveness of learning within small groups due to students being unfamiliar to fellow students. The aim of this study was to assess the effects of making a large class seem small on the students' collaborative learning processes. Design A randomised controlled intervention study was undertaken to make a large class seem small, without the need to reduce the number of students enrolling in the medical programme. The class was divided into subsets: two small subsets (n = 50) as the intervention groups; a control group (n = 102) was mixed with the remaining students (the non-randomised group n∼100) to create one large subset. Setting The undergraduate curriculum of the Maastricht Medical School, applying the Problem-Based Learning principles. In this learning context, students learn mainly in tutorial groups, composed randomly from a large class every 6–10 weeks. Intervention The formal group learning activities were organised within the subsets. Students from the intervention groups met frequently within the formal groups, in contrast to the students from the large subset who hardly enrolled with the same students in formal activities. Main Outcome Measures Three outcome measures assessed students' group learning processes over time: learning within formally organised small groups, learning with other students in the informal context and perceptions of the intervention. Results Formal group learning processes were perceived more positive in the intervention groups from the second study year on, with a mean increase of β = 0.48. Informal group learning activities occurred almost exclusively within the subsets as defined by the intervention from the first week involved in the medical curriculum (E-I indexes>−0.69). Interviews tapped mainly positive effects and negligible negative side effects of the intervention. Conclusion Better group learning processes can be achieved in large medical schools by making large classes seem small.

  6. Medical students perceive better group learning processes when large classes are made to seem small.

    Science.gov (United States)

    Hommes, Juliette; Arah, Onyebuchi A; de Grave, Willem; Schuwirth, Lambert W T; Scherpbier, Albert J J A; Bos, Gerard M J

    2014-01-01

    Medical schools struggle with large classes, which might interfere with the effectiveness of learning within small groups due to students being unfamiliar to fellow students. The aim of this study was to assess the effects of making a large class seem small on the students' collaborative learning processes. A randomised controlled intervention study was undertaken to make a large class seem small, without the need to reduce the number of students enrolling in the medical programme. The class was divided into subsets: two small subsets (n=50) as the intervention groups; a control group (n=102) was mixed with the remaining students (the non-randomised group n∼100) to create one large subset. The undergraduate curriculum of the Maastricht Medical School, applying the Problem-Based Learning principles. In this learning context, students learn mainly in tutorial groups, composed randomly from a large class every 6-10 weeks. The formal group learning activities were organised within the subsets. Students from the intervention groups met frequently within the formal groups, in contrast to the students from the large subset who hardly enrolled with the same students in formal activities. Three outcome measures assessed students' group learning processes over time: learning within formally organised small groups, learning with other students in the informal context and perceptions of the intervention. Formal group learning processes were perceived more positive in the intervention groups from the second study year on, with a mean increase of β=0.48. Informal group learning activities occurred almost exclusively within the subsets as defined by the intervention from the first week involved in the medical curriculum (E-I indexes>-0.69). Interviews tapped mainly positive effects and negligible negative side effects of the intervention. Better group learning processes can be achieved in large medical schools by making large classes seem small.
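The E-I indexes reported in both versions of this record follow the standard Krackhardt-Stern definition, which is easy to compute from tie counts; the example counts below are hypothetical, not data from the study.

```python
def ei_index(external, internal):
    """Krackhardt & Stern's E-I index: (E - I) / (E + I), where E counts ties
    to people outside one's own subset and I counts ties inside it.
    Ranges from -1 (all contacts internal) to +1 (all contacts external)."""
    if external + internal == 0:
        raise ValueError("no ties observed")
    return (external - internal) / (external + internal)

# a student with 2 informal study contacts outside the subset and 15 inside
print(ei_index(2, 15))  # ~ -0.76: learning stays almost entirely within the subset
```

Strongly negative values, as the study reports, mean informal learning ties stayed inside the intervention-defined subsets.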

  7. Intelligent Advertising

    OpenAIRE

    Díaz Pinedo, Edilfredo Eliot

    2012-01-01

Intelligent Advertisement designs and implements an advertising system for mobile devices in a shopping centre, where customers passively receive advertising on their devices while they are inside.

  8. RESOURCE SAVING TECHNOLOGICAL PROCESS OF LARGE-SIZE DIE THERMAL TREATMENT

    Directory of Open Access Journals (Sweden)

    L. A. Glazkov

    2009-01-01

Full Text Available The paper presents the development of a technological process for hardening large-size parts made of die steel. The proposed process uses a water-air mixture instead of the conventional hardening medium, industrial oil. In developing this new technological process it was necessary to solve the following problems: reducing the duration of thermal treatment, reducing the expense of power resources (natural gas and mineral oil), eliminating fire danger, and increasing the ecological efficiency of the process.

  9. A methodology for fault diagnosis in large chemical processes and an application to a multistage flash desalination process: Part II

    International Nuclear Information System (INIS)

    Tarifa, Enrique E.; Scenna, Nicolas J.

    1998-01-01

    In Part I, an efficient method for identifying faults in large processes was presented. The whole plant is divided into sectors by using structural, functional, or causal decomposition. A signed directed graph (SDG) is the model used for each sector. The SDG represents interactions among process variables. This qualitative model is used to carry out qualitative simulation for all possible faults. The output of this step is information about the process behaviour. This information is used to build rules. When a symptom is detected in one sector, its rules are evaluated using on-line data and fuzzy logic to yield the diagnosis. In this paper the proposed methodology is applied to a multistage flash (MSF) desalination process. This process is composed of sequential flash chambers. It was designed for a pilot plant that produces drinkable water for a community in Argentina; that is, it is a real case. Due to the large number of variables, recycles, phase changes, etc., this process is a good challenge for the proposed diagnosis method.
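    The final step, evaluating rules on on-line data with fuzzy logic, can be sketched as follows. This is an illustrative toy, not the authors' implementation; the variable names, the trapezoidal membership function and the min-combination of predicates are all assumptions:

```python
def trapezoid(x, a, b, c, d):
    """Trapezoidal fuzzy membership function: 0 outside [a, d], 1 on [b, c]."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

def fault_score(readings, rule):
    """Degree to which a fault hypothesis matches the on-line readings.
    A rule is a conjunction of fuzzy predicates, combined with min."""
    return min(trapezoid(readings[var], *params) for var, params in rule.items())

# hypothetical rule: fault F1 <=> temperature T1 high AND pressure P2 low
rule_f1 = {"T1": (80.0, 90.0, 1e9, 1e9 + 1), "P2": (-1e9, -1e9 + 1, 2.0, 3.0)}
print(fault_score({"T1": 95.0, "P2": 1.5}, rule_f1))  # → 1.0
```

    The sector whose rules score highest would then be reported as the diagnosis.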

  10. BUSINESS INTELLIGENCE

    OpenAIRE

    Bogdan Mohor Dumitrita

    2011-01-01

    The purpose of this work is to present business intelligence systems. These systems can be extremely complex and important in modern market competition. Their effectiveness is also reflected in their price, so their financial potential has to be explored before investment. These systems have a 20-year history, and during that time many such tools have been developed, but they are rarely still in use. A business intelligence system consists of three main areas: Data Warehouse, ETL tools and tools f...

  11. Intelligent indexing

    International Nuclear Information System (INIS)

    Farkas, J.

    1992-01-01

    In this paper we discuss the relevance of artificial intelligence to the automatic indexing of natural language text. We describe the use of domain-specific semantically-based thesauruses and address the problem of creating adequate knowledge bases for intelligent indexing systems. We also discuss the relevance of the Hilbert space ℓ² to the compact representation of documents and to the definition of the similarity of natural language texts. (author). 17 refs., 2 figs
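    The compact-representation idea can be illustrated by embedding documents in the Hilbert space ℓ² as term-frequency vectors and taking the normalized inner product as the similarity; a toy sketch (an assumption-laden illustration, not the author's system):

```python
import math
from collections import Counter

def tf_vector(text):
    """Embed a document in l2 as a term-frequency vector."""
    return Counter(text.lower().split())

def cosine_similarity(a, b):
    """Normalized inner product of two documents under the l2 norm."""
    dot = sum(count * b[term] for term, count in a.items())
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

doc1 = tf_vector("intelligent indexing of natural language text")
doc2 = tf_vector("automatic indexing of text in natural language")
print(cosine_similarity(doc1, doc2))
```

    In a real system the vector components would come from a thesaurus-informed index vocabulary rather than raw tokens.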

  12. Intelligent indexing

    Energy Technology Data Exchange (ETDEWEB)

    Farkas, J

    1993-12-31

    In this paper we discuss the relevance of artificial intelligence to the automatic indexing of natural language text. We describe the use of domain-specific semantically-based thesauruses and address the problem of creating adequate knowledge bases for intelligent indexing systems. We also discuss the relevance of the Hilbert space ℓ² to the compact representation of documents and to the definition of the similarity of natural language texts. (author). 17 refs., 2 figs.

  13. Computational intelligence for decision support in cyber-physical systems

    CERN Document Server

    Ali, A; Riaz, Zahid

    2014-01-01

    This book is dedicated to applied computational intelligence and soft computing techniques with special reference to decision support in Cyber Physical Systems (CPS), where the physical as well as the communication segment of the networked entities interact with each other. The joint dynamics of such systems result in a complex combination of computers, software, networks and physical processes all combined to establish a process flow at system level. This volume provides the audience with an in-depth vision about how to ensure dependability, safety, security and efficiency in real time by making use of computational intelligence in various CPS applications ranging from the nano-world to large scale wide area systems of systems. Key application areas include healthcare, transportation, energy, process control and robotics where intelligent decision support has key significance in establishing dynamic, ever-changing and high confidence future technologies. A recommended text for graduate students and researche...

  14. Visualization of the Flux Rope Generation Process Using Large Quantities of MHD Simulation Data

    Directory of Open Access Journals (Sweden)

    Y Kubota

    2013-03-01

    Full Text Available We present a new concept of analysis using visualization of large quantities of simulation data. The time development of 3D objects with high temporal resolution provides the opportunity for scientific discovery. We visualize large quantities of simulation data using the visualization application 'Virtual Aurora' based on AVS (Advanced Visual Systems) and the parallel distributed processing on the "Space Weather Cloud" at NICT, based on Gfarm technology. We introduce two results of high temporal resolution visualization: the magnetic flux rope generation process and dayside reconnection using a system of magnetic field line tracing.

  15. Network control stations in the smart grid. Process and information knots for business intelligence applications; Netzleitstellen im Smart Grid. Prozess- und Informationsknoten fuer Business Intelligence Applikationen

    Energy Technology Data Exchange (ETDEWEB)

    Kautsch, Stephan; Kroll, Meinhard [ABB AG, Mannheim (Germany); Schoellhorn, Daniel [EnBW Regional AG, Stuttgart (Germany)

    2012-07-01

    The degree of automation in distribution networks will increase, making more extensive monitoring possible. Smart metering in the local network station replaces the drag pointers. This allows load flows to be determined precisely, and valuable data can be collected about how resources, for example the transformers in the secondary substations, are actually utilized. The amount of available information is increasing steadily, not least because of the ongoing roll-out of smart meters, which also provide valuable information for the operation of the distribution networks. This ''flood'' of data must be processed, filtered and analyzed by the system and prepared for the user in order to be meaningful, but it can also be used to support and optimize many business processes. Although the tasks mentioned are usually not yet allocated within the grid operator's organization, they lend themselves to being placed close to the network control centers, as they pose new challenges but also opportunities. (orig.)

  16. Artificial Intelligence--Applications in Education.

    Science.gov (United States)

    Poirot, James L.; Norris, Cathleen A.

    1987-01-01

    This first in a projected series of five articles discusses artificial intelligence and its impact on education. Highlights include the history of artificial intelligence and the impact of microcomputers; learning processes; human factors and interfaces; computer-assisted instruction and intelligent tutoring systems; logic programming; and expert…

  17. Business Intelligence Solutions for Gaining Competitive Advantage

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available Business Intelligence is the process of increasing a company's competitive advantage through the intelligent use of available data in decision-making. Only a revolutionary Business Intelligence solution, such as the proposed portal-based one, can solve the complex issues faced when evaluating decision support applications and ensure the availability of any business-critical information.

  18. Visual analysis of inter-process communication for large-scale parallel computing.

    Science.gov (United States)

    Muelder, Chris; Gygi, Francois; Ma, Kwan-Liu

    2009-01-01

    In serial computation, program profiling is often helpful for optimization of key sections of code. When moving to parallel computation, not only does the code execution need to be considered but also the communication between the different processes, which can induce delays that are detrimental to performance. As the number of processes increases, so does the impact of the communication delays on performance. For large-scale parallel applications, it is critical to understand how the communication impacts performance in order to make the code more efficient. There are several tools available for visualizing program execution and communications on parallel systems. These tools generally provide either views which statistically summarize the entire program execution or process-centric views. However, process-centric visualizations do not scale well as the number of processes gets very large. In particular, the most common representation of parallel processes is a Gantt chart with a row for each process. As the number of processes increases, these charts can become difficult to work with and can even exceed screen resolution. We propose a new visualization approach that affords more scalability and then demonstrate it on systems running with up to 16,384 processes.
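    One generic way to sidestep the one-row-per-process limit is to aggregate point-to-point messages into a coarse sender/receiver matrix; the sketch below illustrates that idea and is not the authors' tool (the block size and event format are assumptions):

```python
def comm_matrix(events, n_procs, block=64):
    """Collapse point-to-point messages into an (n_procs/block)^2 grid of
    byte counts, so the view stays readable as the process count grows.
    events: iterable of (sender_rank, receiver_rank, nbytes) tuples."""
    n_blocks = (n_procs + block - 1) // block
    m = [[0] * n_blocks for _ in range(n_blocks)]
    for src, dst, nbytes in events:
        m[src // block][dst // block] += nbytes
    return m

# 16,384 ranks collapse into a 256x256 heat-map-ready grid
m = comm_matrix([(0, 1, 1024), (100, 200, 4096), (16000, 15, 512)], 16384)
print(len(m), len(m[0]))  # → 256 256
```

    Rendering the matrix as a heat map keeps the display size fixed regardless of the number of processes.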

  19. PLANNING QUALITY ASSURANCE PROCESSES IN A LARGE SCALE GEOGRAPHICALLY SPREAD HYBRID SOFTWARE DEVELOPMENT PROJECT

    Directory of Open Access Journals (Sweden)

    Святослав Аркадійович МУРАВЕЦЬКИЙ

    2016-02-01

    Full Text Available Key points of operational activities in large-scale, geographically spread software development projects are discussed, and the required structure of QA processes in such projects is examined. Up-to-date methods of integrating quality assurance processes into software development processes are presented. Existing groups of software development methodologies are reviewed: sequential, agile and PRINCE2-based. A condensed overview of the quality assurance processes in each group is given, together with a review of the common challenges that sequential and agile models face in a large, geographically spread hybrid software development project. Recommendations are given for tackling those challenges, and conclusions are drawn about the choice of the best methodology and its application to the particular project.

  20. Artificial intelligence in nanotechnology.

    Science.gov (United States)

    Sacha, G M; Varona, P

    2013-11-15

    During the last decade there has been increasing use of artificial intelligence tools in nanotechnology research. In this paper we review some of these efforts in the context of interpreting scanning probe microscopy, the study of biological nanosystems, the classification of material properties at the nanoscale, theoretical approaches and simulations in nanoscience, and generally in the design of nanodevices. Current trends and future perspectives in the development of nanocomputing hardware that can boost artificial-intelligence-based applications are also discussed. Convergence between artificial intelligence and nanotechnology can shape the path for many technological developments in the field of information sciences that will rely on new computer architectures and data representations, hybrid technologies that use biological entities and nanotechnological devices, bioengineering, neuroscience and a large variety of related disciplines.

  1. Artificial intelligence in nanotechnology

    Science.gov (United States)

    Sacha, G. M.; Varona, P.

    2013-11-01

    During the last decade there has been increasing use of artificial intelligence tools in nanotechnology research. In this paper we review some of these efforts in the context of interpreting scanning probe microscopy, the study of biological nanosystems, the classification of material properties at the nanoscale, theoretical approaches and simulations in nanoscience, and generally in the design of nanodevices. Current trends and future perspectives in the development of nanocomputing hardware that can boost artificial-intelligence-based applications are also discussed. Convergence between artificial intelligence and nanotechnology can shape the path for many technological developments in the field of information sciences that will rely on new computer architectures and data representations, hybrid technologies that use biological entities and nanotechnological devices, bioengineering, neuroscience and a large variety of related disciplines.

  2. Artificial intelligence in nanotechnology

    International Nuclear Information System (INIS)

    Sacha, G M; Varona, P

    2013-01-01

    During the last decade there has been increasing use of artificial intelligence tools in nanotechnology research. In this paper we review some of these efforts in the context of interpreting scanning probe microscopy, the study of biological nanosystems, the classification of material properties at the nanoscale, theoretical approaches and simulations in nanoscience, and generally in the design of nanodevices. Current trends and future perspectives in the development of nanocomputing hardware that can boost artificial-intelligence-based applications are also discussed. Convergence between artificial intelligence and nanotechnology can shape the path for many technological developments in the field of information sciences that will rely on new computer architectures and data representations, hybrid technologies that use biological entities and nanotechnological devices, bioengineering, neuroscience and a large variety of related disciplines. (topical review)

  3. Assessing Cognitive Abilities: Intelligence and More

    Directory of Open Access Journals (Sweden)

    Keith E. Stanovich

    2014-02-01

    Full Text Available In modern cognitive science, rationality and intelligence are measured using different tasks and operations. Furthermore, in several contemporary dual process theories of cognition, rationality is a more encompassing construct than intelligence. Researchers need to continue to develop measures of rational thought without regard to empirical correlations with intelligence. The measurement of individual differences in rationality should not be subsumed by the intelligence concept.

  4. Large-scale continuous process to vitrify nuclear defense waste: operating experience with nonradioactive waste

    International Nuclear Information System (INIS)

    Cosper, M.B.; Randall, C.T.; Traverso, G.M.

    1982-01-01

    The developmental program underway at SRL has demonstrated the vitrification process proposed for the sludge processing facility of the DWPF on a large scale. DWPF design criteria for production rate, equipment lifetime, and operability have all been met. The expected authorization and construction of the DWPF will result in the safe and permanent immobilization of a major quantity of existing high level waste. 11 figures, 4 tables

  5. Process γ*γ → σ at large virtuality of γ*

    International Nuclear Information System (INIS)

    Volkov, M.K.; Radzhabov, A.E.; Yudichev, V.L.

    2004-01-01

    The process γ*γ → σ is investigated in the framework of the SU(2) x SU(2) chiral NJL model, where γ* and γ are photons with large and small virtuality, respectively, and σ is a scalar meson. The form factor of the process is derived for arbitrary virtuality of γ* in the Euclidean kinematic domain. The asymptotic behavior of this form factor resembles that of the γ*γ → π form factor.

  6. Increasing a large petrochemical company efficiency by improvement of decision making process

    OpenAIRE

    Kirin Snežana D.; Nešić Lela G.

    2010-01-01

    The paper shows the results of research conducted in a large petrochemical company, in a country in transition, with the aim to "shed light" on the decision-making process from the aspect of the personal characteristics of the employees, in order to use the results to improve the decision-making process and increase company efficiency. The research was conducted by a survey, i.e. by filling out a questionnaire specially designed for this purpose, in real conditions, during working hours. The sample of...

  7. The testing of thermal-mechanical-hydrological-chemical processes using a large block

    International Nuclear Information System (INIS)

    Lin, W.; Wilder, D.G.; Blink, J.A.; Blair, S.C.; Buscheck, T.A.; Chesnut, D.A.; Glassley, W.E.; Lee, K.; Roberts, J.J.

    1994-01-01

    The radioactive decay heat from nuclear waste packages may, depending on the thermal load, create coupled thermal-mechanical-hydrological-chemical (TMHC) processes in the near-field environment of a repository. A group of tests on a large block (LBT) is planned to provide a timely opportunity to test and calibrate some of the TMHC model concepts. The LBT is advantageous for testing and verifying model concepts because the boundary conditions are controlled, and the block can be characterized before and after the experiment. A block of Topopah Spring tuff of about 3 x 3 x 4.5 m will be sawed and isolated at Fran Ridge, Nevada Test Site. Small blocks of the rock adjacent to the large block will be collected for laboratory testing of some individual thermal-mechanical, hydrological, and chemical processes. A constant load of about 4 MPa will be applied to the top and sides of the large block. The sides will be sealed with moisture and thermal barriers. The large block will be heated with one heater in each borehole and guard heaters on the sides so that a dry-out zone and a condensate zone will exist simultaneously. Temperature, moisture content, pore pressure, chemical composition, stress and displacement will be measured throughout the block during the heating and cool-down phases. The results from the experiments on small blocks and the tests on the large block will provide a better understanding of some concepts of the coupled TMHC processes.

  8. The Faculty Promotion Process. An Empirical Analysis of the Administration of Large State Universities.

    Science.gov (United States)

    Luthans, Fred

    One phase of academic management, the faculty promotion process, is systematically described and analyzed. The study encompasses three parts: (1) the justification of the use of management concepts in the analysis of academic administration; (2) a descriptive presentation of promotion policies and practices in 46 large state universities; and (3)…

  9. Hadronic processes with large transfer momenta and quark counting rules in multiparticle dual amplitude

    International Nuclear Information System (INIS)

    Akkelin, S.V.; Kobylinskij, N.A.; Martynov, E.S.

    1989-01-01

    A dual N-particle amplitude satisfying the quark counting rules for processes with large transferred momenta is constructed. The multiparticle channels are shown to give an essential contribution to the power-law decrease of the amplitude in the hard kinematic limit. 19 refs.; 9 figs.
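    For orientation, the fixed-angle quark (dimensional) counting rule that such amplitudes are required to satisfy is conventionally written as

```latex
\frac{d\sigma}{dt}\,(AB \to CD) \;\sim\; s^{\,2-n}\, f(t/s),
\qquad n = n_A + n_B + n_C + n_D ,
```

    where n_X is the number of elementary constituent fields of hadron X. This is the standard Brodsky-Farrar form, quoted here for context rather than taken from the abstract.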

  10. Large scale production and downstream processing of a recombinant porcine parvovirus vaccine

    NARCIS (Netherlands)

    Maranga, L.; Rueda, P.; Antonis, A.F.G.; Vela, C.; Langeveld, J.P.M.; Casal, J.I.; Carrondo, M.J.T.

    2002-01-01

    Porcine parvovirus (PPV) virus-like particles (VLPs) constitute a potential vaccine for prevention of parvovirus-induced reproductive failure in gilts. Here we report the development of a large scale (25 l) production process for PPV-VLPs with baculovirus-infected insect cells. A low multiplicity of

  11. On conservation of the baryon chirality in the processes with large momentum transfer

    International Nuclear Information System (INIS)

    Ioffe, B.L.

    1976-01-01

    The hypothesis of baryon chirality conservation in processes with large momentum transfer is suggested and some arguments in its favour are made. Experimental implications of this assumption for weak and electromagnetic form factors of transitions in the baryon octet and of the transitions N → Δ, N → Σ* are considered.

  12. Large deviations for the Fleming-Viot process with neutral mutation and selection

    OpenAIRE

    Dawson, Donald; Feng, Shui

    1998-01-01

    Large deviation principles are established for the Fleming-Viot processes with neutral mutation and selection, and the corresponding equilibrium measures as the sampling rate goes to 0. All results are first proved for the finite allele model, and then generalized, through the projective limit technique, to the infinite allele model. Explicit expressions are obtained for the rate functions.
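    As background, a family of laws {P_ε} is said to satisfy a large deviation principle with rate function I when, for every measurable set A,

```latex
-\inf_{x \in A^{\circ}} I(x)
\;\le\; \liminf_{\varepsilon \to 0} \varepsilon \log P_\varepsilon(A)
\;\le\; \limsup_{\varepsilon \to 0} \varepsilon \log P_\varepsilon(A)
\;\le\; -\inf_{x \in \bar{A}} I(x).
```

    This is the standard definition (here ε plays the role of the sampling rate going to 0); the paper's contribution is establishing it, with explicit rate functions, for the Fleming-Viot setting.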

  13. Extraordinary intelligence and the care of infants

    Science.gov (United States)

    Piantadosi, Steven T.; Kidd, Celeste

    2016-01-01

    We present evidence that pressures for early childcare may have been one of the driving factors of human evolution. We show through an evolutionary model that runaway selection for high intelligence may occur when (i) altricial neonates require intelligent parents, (ii) intelligent parents must have large brains, and (iii) large brains necessitate having even more altricial offspring. We test a prediction of this account by showing across primate genera that the helplessness of infants is a particularly strong predictor of the adults’ intelligence. We discuss related implications, including this account’s ability to explain why human-level intelligence evolved specifically in mammals. This theory complements prior hypotheses that link human intelligence to social reasoning and reproductive pressures and explains how human intelligence may have become so distinctive compared with our closest evolutionary relatives. PMID:27217560

  14. Quality Improvement Process in a Large Intensive Care Unit: Structure and Outcomes.

    Science.gov (United States)

    Reddy, Anita J; Guzman, Jorge A

    2016-11-01

    Quality improvement in the health care setting is a complex process, and even more so in the critical care environment. The development of intensive care unit process measures and quality improvement strategies are associated with improved outcomes, but should be individualized to each medical center as structure and culture can differ from institution to institution. The purpose of this report is to describe the structure of quality improvement processes within a large medical intensive care unit while using examples of the study institution's successes and challenges in the areas of stat antibiotic administration, reduction in blood product waste, central line-associated bloodstream infections, and medication errors. © The Author(s) 2015.

  15. A Proactive Complex Event Processing Method for Large-Scale Transportation Internet of Things

    OpenAIRE

    Wang, Yongheng; Cao, Kening

    2014-01-01

    The Internet of Things (IoT) provides a new way to improve the transportation system. The key issue is how to process the numerous events generated by IoT. In this paper, a proactive complex event processing method is proposed for large-scale transportation IoT. Based on a multilayered adaptive dynamic Bayesian model, a Bayesian network structure learning algorithm using search-and-score is proposed to support accurate predictive analytics. A parallel Markov decision process model is designed...
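    The "proactive" part, acting before a predicted event occurs, can be sketched as a one-step expected-cost decision. This toy stands in for the paper's dynamic Bayesian model and Markov decision process; every name and number below is hypothetical:

```python
# learned model: P(congestion event | current traffic level)
P_CONGESTION = {"low": 0.05, "medium": 0.35, "high": 0.80}
ACTION_COST = {"do_nothing": 0.0, "reroute": 2.0}
EVENT_COST = 10.0      # cost incurred if congestion actually happens
REROUTE_EFFECT = 0.5   # rerouting halves the chance of congestion

def choose_action(traffic_level):
    """One-step Markov-decision choice: minimize expected cost,
    acting before the congestion event occurs (the proactive part)."""
    p = P_CONGESTION[traffic_level]
    expected = {
        "do_nothing": ACTION_COST["do_nothing"] + p * EVENT_COST,
        "reroute": ACTION_COST["reroute"] + p * REROUTE_EFFECT * EVENT_COST,
    }
    return min(expected, key=expected.get)

print(choose_action("high"))  # acting early pays off when risk is high
print(choose_action("low"))
```

    The paper's method replaces the fixed table with probabilities predicted by a learned Bayesian network and plans over multiple steps in parallel.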

  16. [Breastfeeding and its influence into the cognitive process of Spanish school-children (6 years old), measured by the Wechsler Intelligence Scale].

    Science.gov (United States)

    Pérez Ruiz, Juan Manuel; Miranda León, María Teresa; Peinado Herreros, José María; Iribar Ibabe, María Concepción

    2013-09-01

    Some scientific evidence supports the idea that better cognitive development during school age is related to breastfeeding. In this study, the potential benefit of breastfeeding duration is evaluated in relation to Verbal Comprehension, Perceptual Reasoning, Working Memory and Processing Speed. A total of 103 six-year-old children in the first year of primary school (47 boys and 56 girls) were included from different schools in the province of Granada (Spain), in urban, semi-urban and rural areas. Global cognitive capability, as well as specific intelligence domains that permit a more precise and deeper analysis of the cognitive processes, was evaluated with the Wechsler Intelligence Scale for Children, version IV. The results show a statistically significant association between the best values of IQ and the other four WISC-IV indexes and longer breastfeeding. There is a highly significant (p = 0.000) association between the best scores and the children who were breastfed for 6 months, which validates our hypothesis. The advice to breastfeed during at least the first six months of life should be reinforced to reduce learning difficulties.

  17. Artificial intelligence in cardiology.

    Science.gov (United States)

    Bonderman, Diana

    2017-12-01

    Decision-making is complex in modern medicine and should ideally be based on available data, structured knowledge and proper interpretation in the context of an individual patient. Automated algorithms, also termed artificial intelligence, that are able to extract meaningful patterns from data collections and build decisions upon identified patterns, may be useful assistants in clinical decision-making processes. In this article, artificial intelligence-based studies in clinical cardiology are reviewed. The text also touches on ethical issues and speculates on the future roles of automated algorithms versus clinicians in cardiology and medicine in general.

  18. Failure of Working Memory Training to Enhance Cognition or Intelligence

    Science.gov (United States)

    Thompson, Todd W.; Waskom, Michael L.; Garel, Keri-Lee A.; Cardenas-Iniguez, Carlos; Reynolds, Gretchen O.; Winter, Rebecca; Chang, Patricia; Pollard, Kiersten; Lala, Nupur; Alvarez, George A.; Gabrieli, John D. E.

    2013-01-01

    Fluid intelligence is important for successful functioning in the modern world, but much evidence suggests that fluid intelligence is largely immutable after childhood. Recently, however, researchers have reported gains in fluid intelligence after multiple sessions of adaptive working memory training in adults. The current study attempted to replicate and expand those results by administering a broad assessment of cognitive abilities and personality traits to young adults who underwent 20 sessions of an adaptive dual n-back working memory training program and comparing their post-training performance on those tests to a matched set of young adults who underwent 20 sessions of an adaptive attentional tracking program. Pre- and post-training measurements of fluid intelligence, standardized intelligence tests, speed of processing, reading skills, and other tests of working memory were assessed. Both training groups exhibited substantial and specific improvements on the trained tasks that persisted for at least 6 months post-training, but no transfer of improvement was observed to any of the non-trained measurements when compared to a third untrained group serving as a passive control. These findings fail to support the idea that adaptive working memory training in healthy young adults enhances working memory capacity in non-trained tasks, fluid intelligence, or other measures of cognitive abilities. PMID:23717453

  19. Failure of working memory training to enhance cognition or intelligence.

    Directory of Open Access Journals (Sweden)

    Todd W Thompson

    Full Text Available Fluid intelligence is important for successful functioning in the modern world, but much evidence suggests that fluid intelligence is largely immutable after childhood. Recently, however, researchers have reported gains in fluid intelligence after multiple sessions of adaptive working memory training in adults. The current study attempted to replicate and expand those results by administering a broad assessment of cognitive abilities and personality traits to young adults who underwent 20 sessions of an adaptive dual n-back working memory training program and comparing their post-training performance on those tests to a matched set of young adults who underwent 20 sessions of an adaptive attentional tracking program. Pre- and post-training measurements of fluid intelligence, standardized intelligence tests, speed of processing, reading skills, and other tests of working memory were assessed. Both training groups exhibited substantial and specific improvements on the trained tasks that persisted for at least 6 months post-training, but no transfer of improvement was observed to any of the non-trained measurements when compared to a third untrained group serving as a passive control. These findings fail to support the idea that adaptive working memory training in healthy young adults enhances working memory capacity in non-trained tasks, fluid intelligence, or other measures of cognitive abilities.

  20. Intelligent environmental data warehouse

    International Nuclear Information System (INIS)

    Ekechukwu, B.

    1998-01-01

    Making quick and effective decisions in environmental management depends on multiple and complex parameters, and a data warehouse is a powerful tool for the overall management of massive environmental information. Selecting the right data from a warehouse is an important consideration for end-users. This paper proposes an intelligent environmental data warehouse system. It consists of a data warehouse that feeds environmental researchers and managers with the environmental information needed for their research studies and decisions, in the form of geometric and attribute data for the study area, plus metadata for the other sources of environmental information. In addition, the proposed intelligent search engine works according to a set of rules, which enables the system to be aware of the environmental data wanted by the end-user. The system development process passes through four stages: data preparation, warehouse development, intelligent engine development and internet platform system development. (author)

  1. Processing graded feedback: electrophysiological correlates of learning from small and large errors.

    Science.gov (United States)

    Luft, Caroline Di Bernardi; Takase, Emilio; Bhattacharya, Joydeep

    2014-05-01

    Feedback processing is important for learning and therefore may affect the consolidation of skills. Considerable research demonstrates electrophysiological differences between correct and incorrect feedback, but how we learn from small versus large errors is usually overlooked. This study investigated electrophysiological differences when processing small or large error feedback during a time estimation task. Data from high-learners and low-learners were analyzed separately. In both high- and low-learners, large error feedback was associated with higher feedback-related negativity (FRN) and small error feedback was associated with a larger P300 and increased amplitude over the motor related areas of the left hemisphere. In addition, small error feedback induced larger desynchronization in the alpha and beta bands with distinctly different topographies between the two learning groups: The high-learners showed a more localized decrease in beta power over the left frontocentral areas, and the low-learners showed a widespread reduction in the alpha power following small error feedback. Furthermore, only the high-learners showed an increase in phase synchronization between the midfrontal and left central areas. Importantly, this synchronization was correlated to how well the participants consolidated the estimation of the time interval. Thus, although large errors were associated with higher FRN, small errors were associated with larger oscillatory responses, which was more evident in the high-learners. Altogether, our results suggest an important role of the motor areas in the processing of error feedback for skill consolidation.

  2. Towards Portable Large-Scale Image Processing with High-Performance Computing.

    Science.gov (United States)

    Huo, Yuankai; Blaber, Justin; Damon, Stephen M; Boyd, Brian D; Bao, Shunxing; Parvathaneni, Prasanna; Noguera, Camilo Bermudez; Chaganti, Shikha; Nath, Vishwesh; Greer, Jasmine M; Lyu, Ilwoo; French, William R; Newton, Allen T; Rogers, Baxter P; Landman, Bennett A

    2018-05-03

    High-throughput, large-scale medical image computing demands tight integration of high-performance computing (HPC) infrastructure for data storage, job distribution, and image processing. The Vanderbilt University Institute for Imaging Science (VUIIS) Center for Computational Imaging (CCI) has constructed a large-scale image storage and processing infrastructure that is composed of (1) a large-scale image database using the eXtensible Neuroimaging Archive Toolkit (XNAT), (2) a content-aware job scheduling platform using the Distributed Automation for XNAT pipeline automation tool (DAX), and (3) a wide variety of encapsulated image processing pipelines called "spiders." The VUIIS CCI medical image data storage and processing infrastructure has housed and processed nearly half a million medical image volumes with the Vanderbilt Advanced Computing Center for Research and Education (ACCRE), the HPC facility at Vanderbilt University. The initial deployment was native (i.e., direct installations on a bare-metal server) within the ACCRE hardware and software environments, which led to issues of portability and sustainability. First, it could be laborious to deploy the entire VUIIS CCI medical image data storage and processing infrastructure to another HPC center with different hardware infrastructure, library availability, and software permission policies. Second, the spiders were not developed in an isolated manner, which has led to software dependency issues during system upgrades or remote software installation. To address such issues, herein we describe recent innovations using containerization techniques with XNAT/DAX which isolate the VUIIS CCI medical image data storage and processing infrastructure from the underlying hardware and software environments. The newly presented XNAT/DAX solution has the following new features: (1) multi-level portability from the system level to the application level, (2) flexible and dynamic software

  3. The key network communication technology in large radiation image cooperative process system

    International Nuclear Information System (INIS)

    Li Zheng; Kang Kejun; Gao Wenhuan; Wang Jingjin

    1998-01-01

    The large container inspection system (LCIS), based on radiation imaging technology, is a powerful tool for customs to check the contents of a large container without opening it. A distributed image network system is composed of an operation manager station, an image acquisition station, an environment control station, an inspection processing station, check-in and check-out stations, and a database station, built using advanced network technology. Mass data, such as container image data, general container information, manifest scanning data, commands and status, must be transferred on-line between the different stations. The advanced network communication technology involved is presented

  4. Asymptotic description of two metastable processes of solidification for the case of large relaxation time

    International Nuclear Information System (INIS)

    Omel'yanov, G.A.

    1995-07-01

    The non-isothermal Cahn-Hilliard equations in the n-dimensional case (n = 2,3) are considered. The interaction length is proportional to a small parameter, and the relaxation time is proportional to a constant. The asymptotic solutions describing two metastable processes are constructed and justified. The soliton type solution describes the first stage of separation in alloy, when a set of ''superheated liquid'' appears inside the ''solid'' part. The Van der Waals type solution describes the free interface dynamics for large time. The smoothness of temperature is established for large time and the Mullins-Sekerka problem describing the free interface is derived. (author). 46 refs

  5. Semantic Business Intelligence - a New Generation of Business Intelligence

    OpenAIRE

    Dinu AIRINEI; Dora-Anca BERTA

    2012-01-01

    Business Intelligence solutions represent applications used by companies to manage, process and analyze data in order to support well-grounded decisions. In the context of Semantic Web development, the trend is to integrate semantic, unstructured data, which requires business intelligence solutions to be redesigned in such a manner that they can analyze, process and synthesize, in addition to traditional data, semantically integrated data of another form and structure. This invariably leads to the appearance of new BI solutio...

  6. A methodology for fault diagnosis in large chemical processes and an application to a multistage flash desalination process: Part I

    International Nuclear Information System (INIS)

    Tarifa, Enrique E.; Scenna, Nicolas J.

    1998-01-01

    This work presents a new strategy for fault diagnosis in large chemical processes (E.E. Tarifa, Fault diagnosis in complex chemical plants: plants of large dimensions and batch processes. Ph.D. thesis, Universidad Nacional del Litoral, Santa Fe, 1995). A special decomposition of the plant into sectors is made; afterwards each sector is studied independently. These steps are carried out in the off-line mode and produce vital information for the diagnosis system. This system works in the on-line mode and is based on a two-tier strategy. When a fault occurs, the upper level identifies the faulty sector; the lower level then carries out an in-depth study that focuses only on the critical sectors to identify the fault. The loss of information produced by the process partition may cause spurious diagnoses. This problem is overcome at the second level using qualitative simulation and fuzzy logic. In the second part of this work, the new methodology is tested to evaluate its performance in practical cases. A multistage flash desalination system (MSF) is chosen because it is a complex system, with many recycles and variables to be supervised. The steps for the knowledge base generation and all the blocks included in the diagnosis system are analyzed. Evaluation of the diagnosis performance is carried out using a rigorous dynamic simulator

  7. Intelligent Universe

    Energy Technology Data Exchange (ETDEWEB)

    Hoyle, F

    1983-01-01

    The subject is covered in chapters, entitled: chance and the universe (synthesis of proteins; the primordial soup); the gospel according to Darwin (discussion of Darwin theory of evolution); life did not originate on earth (fossils from space; life in space); the interstellar connection (living dust between the stars; bacteria in space falling to the earth; interplanetary dust); evolution by cosmic control (microorganisms; genetics); why aren't the others here (a cosmic origin of life); after the big bang (big bang and steady state); the information rich universe; what is intelligence up to; the intelligent universe.

  8. Artificial intelligence

    International Nuclear Information System (INIS)

    Perret-Galix, D.

    1992-01-01

    A vivid example of the growing need for frontier physics experiments to make use of frontier technology is in the field of artificial intelligence and related themes. This was reflected in the second international workshop on 'Software Engineering, Artificial Intelligence and Expert Systems in High Energy and Nuclear Physics' which took place from 13-18 January at France Telecom's Agelonde site at La Londe des Maures, Provence. It was the second in a series, the first having been held at Lyon in 1990

  9. Artificial Intelligence and Moral intelligence

    Directory of Open Access Journals (Sweden)

    Laura Pana

    2008-07-01

    Full Text Available We discuss the thesis that the implementation of a moral code in the behaviour of artificial intelligent systems needs a specific form of human and artificial intelligence, not just an abstract intelligence. We present intelligence as a system with an internal structure and the structural levels of the moral system, as well as certain characteristics of artificial intelligent agents which can/must be treated as: (1) individual entities (with a complex, specialized, autonomous or self-determined, even unpredictable conduct); (2) entities endowed with diverse or even multiple intelligence forms, like moral intelligence; (3) open and even free-conduct performing systems (with specific, flexible and heuristic mechanisms and procedures of decision); (4) systems which are open to education, not just to instruction; (5) entities with a “lifegraphy”, not just a “stategraphy”; (6) entities equipped not just with automatisms but with beliefs (cognitive and affective complexes); (7) entities capable even of reflection (“moral life” is a form of spiritual, not just of conscious, activity); (8) elements/members of some real (corporal or virtual) community; (9) cultural beings: free conduct gives cultural value to the action of a “natural” or artificial being. Implementation of such characteristics does not necessarily suppose efforts to design, construct and educate machines like human beings. The human moral code is irremediably imperfect: it is a morality of preference, of accountability (not of responsibility) and a morality of non-liberty, which cannot be remedied by the invention of ethical systems, by the circulation of ideal values and by ethical (even computing) education. But such an imperfect morality needs perfect instruments for its implementation: applications of special logic fields; efficient psychological (theoretical and technical) attainments to endow the machine not just with intelligence, but with conscience and even spirit; comprehensive technical

  10. Moditored unsaturated soil transport processes as a support for large scale soil and water management

    Science.gov (United States)

    Vanclooster, Marnik

    2010-05-01

    The current societal demand for sustainable soil and water management is very large. The drivers of global and climate change exert many pressures on the soil and water ecosystems, endangering appropriate ecosystem functioning. The unsaturated soil transport processes play a key role in soil-water system functioning as they control the fluxes of water and nutrients from the soil to plants (the pedo-biosphere link), the infiltration flux of precipitated water to groundwater and the evaporative flux, and hence the feedback from the soil to the climate system. Yet, unsaturated soil transport processes are difficult to quantify since they are affected by huge variability of the governing properties at different space-time scales and the intrinsic non-linearity of the transport processes. The incompatibility between the scale at which processes reasonably can be characterized, the scale at which the theoretical process correctly can be described and the scale at which the soil and water system needs to be managed calls for further development of scaling procedures in unsaturated zone science. It also calls for a better integration of theoretical and modelling approaches to elucidate transport processes at the appropriate scales, compatible with the sustainable soil and water management objective. Moditoring science, i.e. the interdisciplinary research domain where modelling and monitoring science are linked, is currently evolving significantly in the unsaturated zone hydrology area. In this presentation, a review of current moditoring strategies/techniques will be given and illustrated for solving large scale soil and water management problems. This will also allow us to identify research needs in the interdisciplinary domain of modelling and monitoring and to improve the integration of unsaturated zone science in solving soil and water management issues. A focus will be given on examples of large scale soil and water management problems in Europe.

  11. Development of polymers for large scale roll-to-roll processing of polymer solar cells

    DEFF Research Database (Denmark)

    Carlé, Jon Eggert

    The potential of conjugated polymers both to absorb light and to transport current, together with the perspective of low-cost, large-scale production, has made these kinds of materials attractive in solar cell research. The research field of polymer solar cells (PSCs) is rapidly progressing along three lines: improvement of efficiency and of stability, together with the introduction of large-scale production methods. All three lines are explored in this work. The thesis describes low band gap polymers and why these are needed. Polymers of this type display broader absorption, resulting in better overlap with the solar spectrum and potentially higher current density. Synthesis, characterization and device performance of three series of polymers illustrate how the absorption spectrum of polymers can be manipulated synthetically...

  12. Modelling hydrologic and hydrodynamic processes in basins with large semi-arid wetlands

    Science.gov (United States)

    Fleischmann, Ayan; Siqueira, Vinícius; Paris, Adrien; Collischonn, Walter; Paiva, Rodrigo; Pontes, Paulo; Crétaux, Jean-François; Bergé-Nguyen, Muriel; Biancamaria, Sylvain; Gosset, Marielle; Calmant, Stephane; Tanimoun, Bachir

    2018-06-01

    Hydrological and hydrodynamic models are core tools for the simulation of large basins and complex river systems associated with wetlands. Recent studies have pointed towards the importance of online coupling strategies, representing feedbacks between floodplain inundation and vertical hydrology. Especially across semi-arid regions, soil-floodplain interactions can be strong. In this study, we included a two-way coupling scheme in a large-scale hydrological-hydrodynamic model (MGB) and tested different model structures, in order to assess which processes are important to simulate in large semi-arid wetlands and how these processes interact with water budget components. To demonstrate the benefits of this coupling over a validation case, the model was applied to the Upper Niger River basin encompassing the Niger Inner Delta, a vast semi-arid wetland in the Sahel. Simulation was carried out from 1999 to 2014 with daily TMPA 3B42 precipitation as forcing, using both in-situ and remotely sensed data for calibration and validation. Model outputs were in good agreement with discharge and water levels at stations both upstream and downstream of the Inner Delta (Nash-Sutcliffe Efficiency (NSE) >0.6 for most gauges), as well as for flooded areas within the Delta region (NSE = 0.6; r = 0.85). Model estimates of annual water losses across the Delta varied between 20.1 and 30.6 km3/yr, while annual evapotranspiration ranged between 760 mm/yr and 1130 mm/yr. Evaluation of model structure indicated that representation of both floodplain channel hydrodynamics (storage, bifurcations, lateral connections) and vertical hydrological processes (floodplain water infiltration into the soil column; evapotranspiration from soil and vegetation; and evaporation of open water) is necessary to correctly simulate flood wave attenuation and evapotranspiration along the basin. Two-way coupled models are necessary to better understand processes in large semi-arid wetlands. Finally, such coupled

  13. Analysis of reforming process of large distorted ring in final enlarging forging

    International Nuclear Information System (INIS)

    Miyazawa, Takeshi; Murai, Etsuo

    2002-01-01

    In the construction of reactors and pressure vessels for oil and chemical plants and nuclear power stations, monobloc open-die forged rings are often utilized. Generally, a large forged ring is manufactured by means of enlarging forging with reductions of the wall thickness. During the enlarging process the circular ring is often distorted and becomes elliptical in shape, and shape control of the ring is complicated work; this problem becomes still worse in the forging of larger rings. In order to achieve precision forging of large rings, we have developed a forging method using a V-shaped anvil. The V-shaped anvil is geometrically adjusted to fit the distorted ring into the final circle and automatically reforms the shape of the ring during enlarging forging. This paper analyzes the reforming process of the distorted ring with a computer program based on the FEM and examines the effect on the precision of ring forging. (author)

  14. A mesh density study for application to large deformation rolling process evaluation

    International Nuclear Information System (INIS)

    Martin, J.A.

    1997-12-01

    When addressing large deformation through an elastic-plastic analysis, the mesh density is paramount in determining the accuracy of the solution. However, given the nonlinear nature of the problem, a highly refined mesh will generally require a prohibitive amount of computer resources. This paper addresses finite element mesh optimization studies, considering accuracy of results and computer resource needs, as applied to large deformation rolling processes. In particular, the simulation of the thread rolling manufacturing process is considered using the MARC software package and a Cray C90 supercomputer. The effects of both mesh density and adaptive meshing on final results are evaluated for both indentation of a rigid body to a specified depth and contact rolling along a predetermined length

  15. Constructing large scale SCI-based processing systems by switch elements

    International Nuclear Information System (INIS)

    Wu, B.; Kristiansen, E.; Skaali, B.; Bogaerts, A.; Divia, R.; Mueller, H.

    1993-05-01

    The goal of this paper is to study some of the design criteria for the switch elements to form the interconnection of large scale SCI-based processing systems. The approved IEEE standard 1596 makes it possible to couple up to 64K nodes together. In order to connect thousands of nodes to construct large scale SCI-based processing systems, one has to interconnect these nodes by switch elements to form different topologies. A summary of the requirements and key points of interconnection networks and switches is presented. Two models of the SCI switch elements are proposed. The authors investigate several examples of systems constructed for 4-switches with simulations and the results are analyzed. Some issues and enhancements are discussed to provide the ideas behind the switch design that can improve performance and reduce latency. 29 refs., 11 figs., 3 tabs

  16. Large-scale functional networks connect differently for processing words and symbol strings.

    Science.gov (United States)

    Liljeström, Mia; Vartiainen, Johanna; Kujala, Jan; Salmelin, Riitta

    2018-01-01

    Reconfigurations of synchronized large-scale networks are thought to be central neural mechanisms that support cognition and behavior in the human brain. Magnetoencephalography (MEG) recordings together with recent advances in network analysis now allow for sub-second snapshots of such networks. In the present study, we compared frequency-resolved functional connectivity patterns underlying reading of single words and visual recognition of symbol strings. Word reading emphasized coherence in a left-lateralized network with nodes in classical perisylvian language regions, whereas symbol processing recruited a bilateral network, including connections between frontal and parietal regions previously associated with spatial attention and visual working memory. Our results illustrate the flexible nature of functional networks, whereby processing of different form categories, written words vs. symbol strings, leads to the formation of large-scale functional networks that operate at distinct oscillatory frequencies and incorporate task-relevant regions. These results suggest that category-specific processing should be viewed not so much as a local process but as a distributed neural process implemented in signature networks. For words, increased coherence was detected particularly in the alpha (8-13 Hz) and high gamma (60-90 Hz) frequency bands, whereas increased coherence for symbol strings was observed in the high beta (21-29 Hz) and low gamma (30-45 Hz) frequency range. These findings attest to the role of coherence in specific frequency bands as a general mechanism for integrating stimulus-dependent information across brain regions.

  17. Microarray Data Processing Techniques for Genome-Scale Network Inference from Large Public Repositories.

    Science.gov (United States)

    Chockalingam, Sriram; Aluru, Maneesha; Aluru, Srinivas

    2016-09-19

    Pre-processing of microarray data is a well-studied problem. Furthermore, all popular platforms come with their own recommended best practices for differential analysis of genes. However, for genome-scale network inference using microarray data collected from large public repositories, these methods filter out a considerable number of genes. This is primarily due to the effects of aggregating a diverse array of experiments with different technical and biological scenarios. Here we introduce a pre-processing pipeline suitable for inferring genome-scale gene networks from large microarray datasets. We show that partitioning of the available microarray datasets according to biological relevance into tissue- and process-specific categories significantly extends the limits of downstream network construction. We demonstrate the effectiveness of our pre-processing pipeline by inferring genome-scale networks for the model plant Arabidopsis thaliana using two different construction methods and a collection of 11,760 Affymetrix ATH1 microarray chips. Our pre-processing pipeline and the datasets used in this paper are made available at http://alurulab.cc.gatech.edu/microarray-pp.
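The partitioning step described above, splitting a heterogeneous chip collection into biologically coherent groups before network construction, can be sketched as a simple grouping operation. The chip identifiers and category labels below are invented for illustration; the actual pipeline is the one at the URL in the abstract.

```python
from collections import defaultdict

# Toy illustration of partitioning microarray samples into tissue-specific
# categories before per-partition network inference. Identifiers and labels
# are invented examples, not data from the study.
samples = [
    ("chip-001", "leaf"), ("chip-002", "root"),
    ("chip-003", "leaf"), ("chip-004", "seed"),
]

def partition_by_category(annotated_samples):
    """Group sample identifiers by their annotated biological category."""
    groups = defaultdict(list)
    for sample_id, category in annotated_samples:
        groups[category].append(sample_id)
    return dict(groups)

# Each partition would then feed a separate network-construction run.
print(partition_by_category(samples))
```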

  18. Exploratory study of the relationship between the musical, visuospatial, bodily-kinesthetic intelligence and drive creativity in the process of learning

    Directory of Open Access Journals (Sweden)

    Paula MARCHENA CRUZ

    2017-12-01

    Full Text Available Currently, the Spanish educational system focuses its attention on the development of priority subjects such as language and mathematics versus secondary ones such as music (Palacios, 2006), without considering the numerous neuropsychological studies that provide new theories of mind and learning which can positively influence the transformation of current educational models (Martin-Lobo, 2015). This research aims to determine the relation between musical intelligence, bodily-kinesthetic intelligence, visuospatial intelligence and motor creativity in a sample of 5-year-old students in the last year of Early Childhood Education. The instrument used to assess the three intelligences, based on Gardner's theory, was the Multiple Intelligences questionnaire for children of pre-school age (Prieto and Ballester, 2003); for the evaluation of motor creativity, the Test of Creative Thinking in Action and Movement (Torrance, Reisman and Floyd, 1981) was used. A descriptive and correlational statistical analysis (using the Pearson correlation index) was carried out with the Microsoft Excel program along with the add-in known as EZAnalyze. The results indicated no significant relationship between musical intelligence and motor creativity (p = 0.988), visuospatial intelligence and motor creativity (p = 0.992), or bodily-kinesthetic intelligence and motor creativity (p = 0.636). There were, however, significant relations between musical and visuospatial intelligence (p = 0.000), musical and bodily-kinesthetic intelligence (p = 0.000), and bodily-kinesthetic and visuospatial intelligence (p = 0.025).
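The Pearson correlation index used in the analysis above can be computed directly from its definition (covariance over the product of the standard deviations); this is a minimal pure-Python sketch, not the Excel/EZAnalyze workflow the study actually used.

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient, computed
    directly from its definition."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Perfectly linear scores correlate at r = 1.0
print(pearson_r([1, 2, 3, 4], [2, 4, 6, 8]))  # → 1.0
```

The p-values reported in the abstract would additionally require a significance test on r given the sample size (e.g. via a t-distribution), which the sketch omits.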

  19. Extraterrestrial processing and manufacturing of large space systems, volume 1, chapters 1-6

    Science.gov (United States)

    Miller, R. H.; Smith, D. B. S.

    1979-01-01

    Space program scenarios for production of large space structures from lunar materials are defined. The concept of the space manufacturing facility (SMF) is presented. The manufacturing processes and equipment for the SMF are defined and the conceptual layouts are described for the production of solar cells and arrays, structures and joints, conduits, waveguides, RF equipment radiators, wire cables, and converters. A 'reference' SMF was designed and its operation requirements are described.

  20. Large-scale methanol plants. [Based on Japanese-developed process

    Energy Technology Data Exchange (ETDEWEB)

    Tado, Y

    1978-02-01

    A study was made of how to produce methanol economically; methanol is expected to be a growth item for use as a material for pollution-free energy or for chemical use. The study centered on the following subjects: (1) improvement of thermal economy, (2) improvement of the process, and (3) hardware problems attending the expansion of scale. The results of this study have already been adopted in actual plants with good results, and large-scale methanol plants are about to be realized.

  1. Large-scale calculations of the beta-decay rates and r-process nucleosynthesis

    Energy Technology Data Exchange (ETDEWEB)

    Borzov, I N; Goriely, S [Inst. d'Astronomie et d'Astrophysique, Univ. Libre de Bruxelles, Campus Plaine, Bruxelles (Belgium)]; Pearson, J M [Inst. d'Astronomie et d'Astrophysique, Univ. Libre de Bruxelles, Campus Plaine, Bruxelles (Belgium); Lab. de Physique Nucleaire, Univ. de Montreal, Montreal (Canada)]

    1998-06-01

    An approximation to a self-consistent model of the ground state and β-decay properties of neutron-rich nuclei is outlined. The structure of the β-strength functions in stable and short-lived nuclei is discussed. The results of large-scale calculations of the β-decay rates for spherical and slightly deformed nuclides of relevance to the r-process are analysed and compared with the results of existing global calculations and recent experimental data. (orig.)

  2. A large-scale circuit mechanism for hierarchical dynamical processing in the primate cortex

    OpenAIRE

    Chaudhuri, Rishidev; Knoblauch, Kenneth; Gariel, Marie-Alice; Kennedy, Henry; Wang, Xiao-Jing

    2015-01-01

    We developed a large-scale dynamical model of the macaque neocortex, which is based on recently acquired directed- and weighted-connectivity data from tract-tracing experiments, and which incorporates heterogeneity across areas. A hierarchy of timescales naturally emerges from this system: sensory areas show brief, transient responses to input (appropriate for sensory processing), whereas association areas integrate inputs over time and exhibit persistent activity (suitable for decision-makin...

  3. A framework for the direct evaluation of large deviations in non-Markovian processes

    International Nuclear Information System (INIS)

    Cavallaro, Massimo; Harris, Rosemary J

    2016-01-01

    We propose a general framework to simulate stochastic trajectories with arbitrarily long memory dependence and efficiently evaluate large deviation functions associated to time-extensive observables. This extends the ‘cloning’ procedure of Giardiná et al (2006 Phys. Rev. Lett. 96 120603) to non-Markovian systems. We demonstrate the validity of this method by testing non-Markovian variants of an ion-channel model and the totally asymmetric exclusion process, recovering results obtainable by other means. (letter)

  4. The effects of large scale processing on caesium leaching from cemented simulant sodium nitrate waste

    International Nuclear Information System (INIS)

    Lee, D.J.; Brown, D.J.

    1982-01-01

    The effects of large scale processing on the properties of cemented simulant sodium nitrate waste have been investigated. Leach tests have been performed on full-size drums, cores and laboratory samples of cement formulations containing Ordinary Portland Cement (OPC), Sulphate Resisting Portland Cement (SRPC) and a blended cement (90% ground granulated blast furnace slag/10% OPC). In addition, the development of the cement hydration exotherms with time and the temperature distribution in 220 dm³ samples have been followed. (author)

  5. Large Data at Small Universities: Astronomical processing using a computer classroom

    Science.gov (United States)

    Fuller, Nathaniel James; Clarkson, William I.; Fluharty, Bill; Belanger, Zach; Dage, Kristen

    2016-06-01

    The use of large computing clusters for astronomy research is becoming more commonplace as datasets expand, but access to these required resources is sometimes difficult for research groups working at smaller universities. As an alternative to purchasing processing time on an off-site computing cluster, or purchasing dedicated hardware, we show how one can easily build a crude on-site cluster by utilizing idle cycles on instructional computers in computer-lab classrooms. Since these computers are maintained as part of the educational mission of the university, the resource impact on the investigator is generally low. By using open source Python routines, it is possible to have a large number of desktop computers working together via a local network to sort through large data sets. By running traditional analysis routines in an “embarrassingly parallel” manner, gains in speed are accomplished without requiring the investigator to learn how to write routines using highly specialized methodology. We demonstrate this concept here applied to (1) photometry of large-format images and (2) statistical significance tests for X-ray lightcurve analysis. In these scenarios, we see a speed-up factor which scales almost linearly with the number of cores in the cluster. Additionally, we show that the usage of the cluster does not severely limit performance for a local user, and indeed the processing can be performed while the computers are in use for classroom purposes.
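The “embarrassingly parallel” pattern described above can be sketched with Python's standard library alone; on the classroom cluster the worker pool would span networked machines rather than local cores, and the per-frame task here is a stand-in for real photometry.

```python
from multiprocessing import Pool

def process_frame(frame_id):
    """Stand-in for an independent per-image task (e.g. photometry on one frame)."""
    return frame_id, sum(i * i for i in range(1000))  # dummy workload

if __name__ == "__main__":
    frames = list(range(8))
    # Each frame is independent, so the work distributes with no coordination
    # between workers -- the "embarrassingly parallel" case.
    with Pool(processes=4) as pool:
        results = pool.map(process_frame, frames)
    print(len(results))  # → 8
```

Because no worker ever needs another worker's result, the speed-up scales almost linearly with the number of cores, which matches the behaviour the abstract reports.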

  6. INTENSITY OF THE DYNAMICS OF PHYSICAL PREPAREDNESS IN THE PROCESS OF STUDENTS GROWING UP WITH DIFFERENT LEVELS OF INTELLIGENCE

    Directory of Open Access Journals (Sweden)

    E. M. Revenko

    2016-01-01

    Full Text Available The purpose of the investigation is the scientific substantiation for the implementation of a differentiated approach to the organization of physical education, on the basis of accounting for students' individual typological features of age-related development. Methods. Motor abilities of students are studied by measuring: force (backbone dynamometry), strength endurance (pulling up on the bar), speed-strength abilities (standing long jump), speed capacity (running at 30, 60 or 100 meters, depending on age) and aerobic endurance (running at 1000 or 3000 m, depending on age). The dynamics of integrated physical preparedness (DIPP) of each student is estimated by calculating the arithmetic mean of the growth rates of the motor abilities listed above (the results of 5 tests). Evaluation of the general intelligence (GI) of schoolchildren in grades 8, 10 and 11 and of students of the 1st to 3rd courses was carried out with the test of R. Amthauer as adapted by L. A. Yasyukova [13], and of pupils of grade 6 with the group intellectual test (GIT) [1]. Results. It was experimentally established that students of grades 6, 8, 10 and 11 and of the 1st to 3rd courses with a higher intelligence level show low dynamics of physical fitness. On the contrary, students with a lower level of intelligence revealed high dynamics of physical fitness. The marked discrepancies in the dynamics of the development of the mental and motor spheres of the maturing personality are interpreted as individual typological features of age-related development. Based on these facts it is concluded that organizing physical education on the basis of a program-regulatory approach, with common unified requirements for all involved, will not create conditions for high-quality physical education of the younger generation and the formation of a sustainable motivation for independent physical exercise. Scientific novelty. Scientific evidence of sustained differences in the dynamics of the

  7. Curbing variations in packaging process through Six Sigma way in a large-scale food-processing industry

    Science.gov (United States)

    Desai, Darshak A.; Kotadiya, Parth; Makwana, Nikheel; Patel, Sonalinkumar

    2015-03-01

    Indian industries need overall operational excellence for sustainable profitability and growth in the present age of global competitiveness. Among the different quality and productivity improvement techniques, Six Sigma has emerged as one of the most effective breakthrough improvement strategies. Though Indian industries are exploring this improvement methodology to their advantage and reaping the benefits, little has been published regarding the experience of Six Sigma in the food-processing industries. This paper exemplifies the application of a Six Sigma quality improvement drive at one of the large-scale food-processing units in India. The paper discusses the phase-wise implementation of define, measure, analyze, improve, and control (DMAIC) on one of the unit's chronic problems: variation in the weight of milk-powder pouches. The paper wraps up with the improvements achieved and the projected bottom-line gain to the unit from the application of the Six Sigma methodology.
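
In the "measure" phase of such a DMAIC project, the filling process is typically quantified against the specification limits of the pouch weight via capability indices. A minimal sketch, with invented specification limits and sample data (the paper's actual figures are not reproduced here):

```python
import statistics

def process_capability(samples, lsl, usl):
    """Cp and Cpk indices for a roughly normal process.
    lsl/usl are the lower/upper specification limits."""
    mean = statistics.fmean(samples)
    sigma = statistics.stdev(samples)                 # sample standard deviation
    cp = (usl - lsl) / (6 * sigma)                    # potential capability
    cpk = min(usl - mean, mean - lsl) / (3 * sigma)   # actual, accounts for centering
    return cp, cpk

# Hypothetical milk-powder pouch weights (grams) against a 500 +/- 10 g spec
weights = [498.2, 501.5, 499.8, 502.1, 497.6, 500.4, 503.0, 499.1]
cp, cpk = process_capability(weights, lsl=490.0, usl=510.0)
print(f"Cp={cp:.2f} Cpk={cpk:.2f}")
```

Reducing the variation (the paper's goal) raises both indices; a centered process has Cpk equal to Cp.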

  8. Social Media- A source of intelligence

    Indian Academy of Sciences (India)

    Any technology that produces large amounts of data, such as social media and CDR, is a source of intelligence for the LEA. Data Mining, Machine learning, Big Data, ...

  9. Plant intelligence

    Science.gov (United States)

    Lipavská, Helena; Žárský, Viktor

    2009-01-01

    The concept of plant intelligence, as proposed by Anthony Trewavas, has raised considerable discussion. However, plant intelligence remains loosely defined; often it is either perceived as practically synonymous to Darwinian fitness, or reduced to a mere decorative metaphor. A more strict view can be taken, emphasizing necessary prerequisites such as memory and learning, which requires clarifying the definition of memory itself. To qualify as memories, traces of past events have to be not only stored, but also actively accessed. We propose a criterion for eliminating false candidates of possible plant intelligence phenomena in this stricter sense: an “intelligent” behavior must involve a component that can be approximated by a plausible algorithmic model involving recourse to stored information about past states of the individual or its environment. Re-evaluation of previously presented examples of plant intelligence shows that only some of them pass our test. “You were hurt?” Kumiko said, looking at the scar. Sally looked down. “Yeah.” “Why didn't you have it removed?” “Sometimes it's good to remember.” “Being hurt?” “Being stupid.”—(W. Gibson: Mona Lisa Overdrive) PMID:19816094

  10. Speech Intelligibility

    Science.gov (United States)

    Brand, Thomas

    Speech intelligibility (SI) is important for different fields of research, engineering and diagnostics in order to quantify very different phenomena, such as the quality of recordings, communication and playback devices, the reverberation of auditoria, characteristics of hearing impairment, the benefit of using hearing aids, or combinations of these.

  11. Response of deep and shallow tropical maritime cumuli to large-scale processes

    Science.gov (United States)

    Yanai, M.; Chu, J.-H.; Stark, T. E.; Nitta, T.

    1976-01-01

    The bulk diagnostic method of Yanai et al. (1973) and a simplified version of the spectral diagnostic method of Nitta (1975) are used for a more quantitative evaluation of the response of various types of cumuliform clouds to large-scale processes, using the same data set in the Marshall Islands area for a 100-day period in 1956. The dependence of the cloud mass flux distribution on radiative cooling, large-scale vertical motion, and evaporation from the sea is examined. It is shown that typical radiative cooling rates in the tropics tend to produce a bimodal distribution of mass spectrum exhibiting deep and shallow clouds. The bimodal distribution is further enhanced when the large-scale vertical motion is upward, and a nearly unimodal distribution of shallow clouds prevails when the relative cooling is compensated by the heating due to the large-scale subsidence. Both deep and shallow clouds are modulated by large-scale disturbances. The primary role of surface evaporation is to maintain the moisture flux at the cloud base.

  12. Research on Francis Turbine Modeling for Large Disturbance Hydropower Station Transient Process Simulation

    Directory of Open Access Journals (Sweden)

    Guangtao Zhang

    2015-01-01

    Full Text Available In the field of hydropower station transient process simulation (HSTPS), the characteristic graph-based iterative hydroturbine model (CGIHM) has been widely used when large disturbance hydroturbine modeling is involved. However, in this model iteration must be used to calculate speed and pressure, and slow convergence or non-convergence may be encountered for reasons such as a particular characteristic graph profile, an inappropriate iterative algorithm, or an inappropriate interpolation algorithm. Other conventional large disturbance hydroturbine models also have disadvantages and are difficult to use widely in HSTPS. Therefore, to obtain an accurate simulation result, a simple method for hydroturbine modeling is proposed. In this method, both the initial operating point and the transfer coefficients of the linear hydroturbine model keep changing during simulation. Hence, it can reflect the nonlinearity of the hydroturbine and be used for Francis turbine simulation under large disturbance conditions. To validate the proposed method, both large disturbance and small disturbance simulations of a single hydro unit supplying a resistive, isolated load were conducted. The simulation results were shown to be consistent with those of the field test. Consequently, the proposed method is an attractive option for HSTPS involving Francis turbine modeling under large disturbance conditions.
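
The idea of a linear model whose operating point and transfer coefficients are refreshed at every simulation step can be illustrated as below. The characteristic surface, variable names and coefficients are assumptions made for the sketch, not the paper's actual turbine equations:

```python
def transfer_coefficients(torque_surface, h, y, x, d=1e-4):
    """Numerically estimate the partial derivatives (transfer coefficients)
    of a turbine torque characteristic around the current operating point
    (h: head, y: gate opening, x: speed), via central differences."""
    eh = (torque_surface(h + d, y, x) - torque_surface(h - d, y, x)) / (2 * d)
    ey = (torque_surface(h, y + d, x) - torque_surface(h, y - d, x)) / (2 * d)
    ex = (torque_surface(h, y, x + d) - torque_surface(h, y, x - d)) / (2 * d)
    return eh, ey, ex

def torque(h, y, x):
    """Analytic stand-in for a Francis turbine torque characteristic."""
    return 1.2 * h * y - 0.5 * x * x

# At each simulation step, re-linearize around the new operating point:
h0, y0, x0 = 1.0, 0.8, 1.0
eh, ey, ex = transfer_coefficients(torque, h0, y0, x0)
mt0 = torque(h0, y0, x0)  # base torque at the operating point
# Linear prediction of torque for small deviations (dh, dy, dx):
dh, dy, dx = 0.01, -0.005, 0.002
mt_lin = mt0 + eh * dh + ey * dy + ex * dx
```

Because the coefficients are recomputed at every step, the piecewise-linear model tracks the turbine's nonlinearity without the iteration that the characteristic-graph model requires.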

  13. Large quantity production of carbon and boron nitride nanotubes by mechano-thermal process

    International Nuclear Information System (INIS)

    Chen, Y.; Fitzgerald, J.D.; Chadderton, L.; Williams, J.S.; Campbell, S.J.

    2002-01-01

    Full text: Nanotube materials, including carbon and boron nitride nanotubes, have excellent properties compared with bulk materials. Their seamless graphene cylinders with a high length-to-diameter ratio make them superstrong fibers, and large amounts of hydrogen can be stored in nanotubes as a future clean fuel source. These applications require nanotube materials in large quantities. However, producing nanotubes in large quantity, with fully controlled quality and at low cost, remains a challenge for the most popular synthesis methods such as arc discharge, laser heating and catalytic chemical decomposition. The discovery of new synthesis methods is still crucial for future industrial application. The new low-temperature mechano-thermal process discovered by the present authors provides an opportunity to develop a commercial method for bulk production. This mechano-thermal process consists of mechanical ball milling followed by thermal annealing. Using this method, both carbon and boron nitride nanotubes have been produced. The lecture will present the mechano-thermal method as a new bulk production technique and will summarise the main results obtained. In the case of carbon nanotubes, different nanosized structures, including multi-walled nanotubes, nanocells and nanoparticles, have been produced in a graphite sample using a mechano-thermal process consisting of mechanical milling at room temperature for up to 150 hours and subsequent thermal annealing at 1400 deg C. Metal particles play an important catalytic role in the formation of the different tubular structures, while the defect structure of the milled graphite appears to be responsible for the formation of small tubes. It is found that the mechanical treatment of graphite powder produces a disordered and microporous structure, which provides nucleation sites for nanotubes as well as free carbon atoms. Multi-walled carbon nanotubes appear to grow via growth of the (002) layers during thermal annealing. 
In the case of BN

  14. ECONOMIC INTELLIGENCE - THEORETICAL AND PRACTICAL ASPECTS

    Directory of Open Access Journals (Sweden)

    VIRGIL - ION POPOVICI

    2014-12-01

    Full Text Available Economic Intelligence (EI) may be a solution in knowledge management, as it involves the collection, evaluation, processing, analysis and dissemination of economic data within organizations. The ultimate goal of economic intelligence is to take advantage of this opportunity to develop and improve methods for identifying relevant information sources, analysing the information collected and manipulating it, so as to give the user everything needed for decisions. The scope of Economic Intelligence focuses on information available outside the organization, covering wide areas from technology to market or legal issues. EI is closely related to other approaches to information management, such as knowledge management and business intelligence, excelling in the use of software tools.

  15. Modelling speech intelligibility in adverse conditions

    DEFF Research Database (Denmark)

    Jørgensen, Søren; Dau, Torsten

    2013-01-01

    Jørgensen and Dau (J Acoust Soc Am 130:1475-1487, 2011) proposed the speech-based envelope power spectrum model (sEPSM) in an attempt to overcome the limitations of the classical speech transmission index (STI) and speech intelligibility index (SII) in conditions with nonlinearly processed speech...... subjected to phase jitter, a condition in which the spectral structure of the speech signal is strongly affected, while the broadband temporal envelope is kept largely intact. In contrast, the effects of this distortion can be predicted successfully by the spectro-temporal modulation...... suggest that the SNRenv might reflect a powerful decision metric, while some explicit across-frequency analysis seems crucial in some conditions. How such across-frequency analysis is "realized" in the auditory system remains unresolved....
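
The SNRenv decision metric compares the envelope power of the noisy speech with that of the noise alone. A heavily simplified, broadband sketch follows; the actual sEPSM uses a peripheral filterbank and a full modulation filterbank, which are omitted here, and the test signals are synthetic stand-ins:

```python
import numpy as np
from scipy.signal import hilbert

def env_mod_power(x, fs, fmod, half_bw=1.0):
    """Power of the temporal (Hilbert) envelope within a narrow
    modulation band around fmod (Hz)."""
    env = np.abs(hilbert(x))
    spec = np.abs(np.fft.rfft(env - env.mean())) ** 2 / len(env)
    freqs = np.fft.rfftfreq(len(env), 1.0 / fs)
    band = (freqs >= fmod - half_bw) & (freqs <= fmod + half_bw)
    return spec[band].sum()

def snr_env_db(mix, noise, fs, fmod=4.0):
    """Simplified SNRenv: excess envelope modulation power of the
    noisy mixture over the noise alone, in dB."""
    p_mix = env_mod_power(mix, fs, fmod)
    p_noise = env_mod_power(noise, fs, fmod)
    return 10.0 * np.log10(max(p_mix - p_noise, 1e-12) / p_noise)

rng = np.random.default_rng(0)
fs = 16000
t = np.arange(fs) / fs
# "Speech-like" carrier: a 500 Hz tone with 4 Hz amplitude modulation
speech_like = (1.0 + 0.8 * np.sin(2 * np.pi * 4 * t)) * np.sin(2 * np.pi * 500 * t)
noise = rng.standard_normal(fs)
# More noise -> lower SNRenv, mirroring reduced intelligibility
low_noise = snr_env_db(speech_like + 0.05 * noise, 0.05 * noise, fs)
high_noise = snr_env_db(speech_like + 0.5 * noise, 0.5 * noise, fs)
```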

  16. Prospecção e monitoramento informacional no processo de inteligência competitiva Information scanning and information mining in the process of competitive intelligence

    Directory of Open Access Journals (Sweden)

    Marta Lígia Pomim Valentim

    2004-01-01

    Full Text Available Information scanning and information mining are base activities for competitive intelligence, understood as a dynamic process composed of information management and knowledge management. The competitive intelligence (C.I.) process in organizations builds on different informational activities, among which are those tied to information scanning and mining. The role of these activities is essential, since they feed the whole process with data, information and knowledge and build diverse formal and informal information structures inside the organization; in addition, the scanning and mining activities generate systematized information services and products with high added value.

  17. Process automation system for integration and operation of Large Volume Plasma Device

    International Nuclear Information System (INIS)

    Sugandhi, R.; Srivastava, P.K.; Sanyasi, A.K.; Srivastav, Prabhakar; Awasthi, L.M.; Mattoo, S.K.

    2016-01-01

    Highlights: • Analysis and design of a process automation system for the Large Volume Plasma Device (LVPD). • Data flow modeling for process model development. • Modbus based data communication and interfacing. • Interface software development for subsystem control in LabVIEW. - Abstract: The Large Volume Plasma Device (LVPD) has been contributing successfully towards the understanding of plasma turbulence driven by the Electron Temperature Gradient (ETG), considered a major contributor to plasma loss in fusion devices. The large size of the device imposes certain difficulties in operation, such as access for diagnostics, manual control of subsystems and monitoring of a large number of signals. To achieve integrated operation of the machine, automation is essential for enhanced performance and operational efficiency. Recently, the machine has been undergoing a major upgrade for new physics experiments. The new operation and control system consists of the following: (1) a PXIe based fast data acquisition system for the equipped diagnostics; (2) a Modbus based Process Automation System (PAS) for the subsystem controls; and (3) a Data Utilization System (DUS) for efficient storage, processing and retrieval of the acquired data. In the ongoing development, a data flow model of the machine's operation has been developed. As a proof of concept, the following two subsystems have been successfully integrated: (1) the Filament Power Supply (FPS) for heating of the W-filament based plasma source and (2) the Probe Positioning System (PPS) for control of 12 linear probe drives over a travel length of 100 cm. The process model of the vacuum production system has been prepared and validated against acquired pressure data. In the next upgrade, all subsystems of the machine will be integrated in a systematic manner. The automation backbone is based on a 4-wire multi-drop serial interface (RS485) using the Modbus communication protocol. Software is developed on the LabVIEW platform using
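
The Modbus protocol named above frames each RS485 request with a CRC-16 checksum. As an illustration of the wire format only (the register addresses are invented, not the LVPD's actual register map), a minimal RTU "read holding registers" request can be built like this:

```python
def crc16_modbus(data: bytes) -> int:
    """CRC-16/MODBUS: init 0xFFFF, reflected polynomial 0xA001."""
    crc = 0xFFFF
    for byte in data:
        crc ^= byte
        for _ in range(8):
            if crc & 1:
                crc = (crc >> 1) ^ 0xA001
            else:
                crc >>= 1
    return crc

def read_holding_registers_frame(slave: int, start_addr: int, count: int) -> bytes:
    """Modbus RTU request: function code 0x03, CRC appended low byte first."""
    pdu = bytes([slave, 0x03]) + start_addr.to_bytes(2, "big") + count.to_bytes(2, "big")
    crc = crc16_modbus(pdu)
    return pdu + bytes([crc & 0xFF, crc >> 8])

# Read one register at address 0 from slave 1 (a standard textbook frame):
frame = read_holding_registers_frame(1, 0, 1)
print(frame.hex())  # → 010300000001840a
```

In practice a library such as LabVIEW's Modbus API or an equivalent client would generate these frames; the sketch only shows what travels on the 4-wire RS485 bus.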

  19. An intelligent GPIB controller

    International Nuclear Information System (INIS)

    Wikne, J.C.

    1987-12-01

    An intelligent GPIB (General Purpose Interface Bus) controller is described. It employs an autonomous slave CPU together with a dedicated controller/talker/listener chip to handle the GPIB bus protocol, thus freeing the host computer from this time-consuming task. Distributing a large part of the necessary software to the slave side assures that the system can be implemented on virtually any computer with a minimum of effort

  20. Large Scale Gaussian Processes for Atmospheric Parameter Retrieval and Cloud Screening

    Science.gov (United States)

    Camps-Valls, G.; Gomez-Chova, L.; Mateo, G.; Laparra, V.; Perez-Suay, A.; Munoz-Mari, J.

    2017-12-01

    Current Earth-observation (EO) applications for image classification have to deal with an unprecedented amount of heterogeneous and complex data sources. Spatio-temporally explicit classification methods are a requirement in a variety of Earth system data processing applications. Upcoming missions such as the super-spectral Copernicus Sentinels, EnMAP and FLEX will soon provide unprecedented data streams. Very high resolution (VHR) sensors like Worldview-3 also pose big challenges to data processing. The challenge is not only attached to optical sensors but also to infrared sounders and radar images, which have increased in spectral, spatial and temporal resolution. Besides, we should not forget the availability of the extremely large remote sensing data archives already collected by several past missions, such as ENVISAT, Cosmo-SkyMED, Landsat, SPOT, or Seviri/MSG. These large-scale data problems require enhanced processing techniques that should be accurate, robust and fast. Standard parameter retrieval and classification algorithms cannot cope with this new scenario efficiently. In this work, we review the field of large-scale kernel methods for both atmospheric parameter retrieval and cloud detection, using infrared sounding IASI data and optical Seviri/MSG imagery. We propose novel Gaussian Processes (GPs) to train on problems with millions of instances and a high number of input features. The algorithms cope with non-linearities efficiently, accommodate multi-output problems, and provide confidence intervals for the predictions. Several strategies to speed up the algorithms are devised: random Fourier features and variational approaches for cloud classification using IASI data and Seviri/MSG, and engineered randomized kernel functions and emulation for temperature, moisture and ozone atmospheric profile retrieval from IASI as a proxy to the upcoming MTG-IRS sensor. An excellent compromise between accuracy and scalability is obtained in all applications.
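
The random Fourier feature speed-up mentioned above replaces the exact RBF kernel with an explicit low-dimensional feature map, so that kernel machines can be trained on millions of instances in time linear in the number of samples. A minimal sketch of the approximation (the lengthscale, feature count and test points are arbitrary choices for illustration):

```python
import numpy as np

def rff_map(X, n_features=2000, lengthscale=1.0, seed=0):
    """Random Fourier features approximating the RBF kernel
    k(x, z) = exp(-||x - z||^2 / (2 * lengthscale^2)) (Rahimi & Recht, 2007)."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.standard_normal((d, n_features)) / lengthscale  # spectral samples
    b = rng.uniform(0.0, 2.0 * np.pi, n_features)           # random phases
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

# The inner product of the feature maps approximates the exact kernel value:
x = np.array([[0.3, -0.2]])
z = np.array([[0.1, 0.4]])
exact = np.exp(-np.sum((x - z) ** 2) / 2.0)
approx = (rff_map(x) @ rff_map(z).T).item()
```

Because the same seed produces the same `W` and `b` for both points, the two maps live in the same feature space; the approximation error shrinks as `n_features` grows, at the cost of a wider linear model.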