WorldWideScience

Sample records for intelligent document processing

  1. Clinical Process Intelligence

    DEFF Research Database (Denmark)

    Vilstrup Pedersen, Klaus

    2006-01-01

    …i.e. local guidelines. From a knowledge management point of view, this externalization of generalized processes gives the opportunity to learn from, evaluate, and optimize the processes. "Clinical Process Intelligence" (CPI) will denote the goal of getting generalized insight into patient-centered health...

  2. Business process intelligence

    NARCIS (Netherlands)

    Castellanos, M.; Alves De Medeiros, A.K.; Mendling, J.; Weber, B.; Weijters, A.J.M.M.; Cardoso, J.; Aalst, van der W.M.P.

    2009-01-01

    Business Process Intelligence (BPI) is an emerging area that is getting increasingly popular for enterprises. The need to improve business process efficiency, to react quickly to changes, and to meet regulatory compliance is among the main drivers for BPI. BPI refers to the application of Business…

  3. An Intelligent System for Document Retrieval in Distributed Office Environments.

    Science.gov (United States)

    Mukhopadhyay, Uttam; And Others

    1986-01-01

    MINDS (Multiple Intelligent Node Document Servers) is a distributed system of knowledge-based query engines for efficiently retrieving multimedia documents in an office environment of distributed workstations. By learning document distribution patterns and user interests and preferences during system usage, it customizes document retrievals for…

  4. Intelligent multivariate process supervision

    International Nuclear Information System (INIS)

    Visuri, Pertti.

    1986-01-01

    This thesis addresses the difficulties encountered in managing large amounts of data in supervisory control of complex systems. Some previous alarm and disturbance analysis concepts are reviewed and a method for improving the supervision of complex systems is presented. The method, called multivariate supervision, is based on adding low level intelligence to the process control system. By using several measured variables linked together by means of deductive logic, the system can take into account the overall state of the supervised system. Thus, it can present to the operators fewer messages with higher information content than the conventional control systems which are based on independent processing of each variable. In addition, the multivariate method contains a special information presentation concept for improving the man-machine interface. (author)
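As a toy illustration (not taken from the thesis), a multivariate supervision rule that links several measured variables by deductive logic, rather than alarming on each variable independently, might look like this; the variable names and thresholds are invented:

```python
# Toy multivariate supervision rule. Variable names and thresholds
# are illustrative, not from the thesis.

def coolant_alarm(flow, temperature, pump_on):
    # Alarm only when the combination of variables indicates a real
    # fault: the pump reports running, yet flow is low and the
    # temperature is high. Each condition alone stays silent.
    return pump_on and flow < 10.0 and temperature > 80.0
```

A single message produced by such a rule replaces three independent threshold alarms, which is the higher-information-content presentation the abstract describes.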

  5. Intelligent Bar Chart Plagiarism Detection in Documents

    Directory of Open Access Journals (Sweden)

    Mohammed Mumtaz Al-Dabbagh

    2014-01-01

    This paper presents a novel features mining approach from documents that could not be mined via optical character recognition (OCR). By identifying the intimate relationship between the text and graphical components, the proposed technique pulls out the Start, End, and Exact values for each bar. Furthermore, the word 2-gram and Euclidean distance methods are used to accurately detect and determine plagiarism in bar charts.

  6. Intelligent bar chart plagiarism detection in documents.

    Science.gov (United States)

    Al-Dabbagh, Mohammed Mumtaz; Salim, Naomie; Rehman, Amjad; Alkawaz, Mohammed Hazim; Saba, Tanzila; Al-Rodhaan, Mznah; Al-Dhelaan, Abdullah

    2014-01-01

    This paper presents a novel features mining approach from documents that could not be mined via optical character recognition (OCR). By identifying the intimate relationship between the text and graphical components, the proposed technique pulls out the Start, End, and Exact values for each bar. Furthermore, the word 2-gram and Euclidean distance methods are used to accurately detect and determine plagiarism in bar charts.

  7. Intelligent Bar Chart Plagiarism Detection in Documents

    Science.gov (United States)

    Al-Dabbagh, Mohammed Mumtaz; Salim, Naomie; Alkawaz, Mohammed Hazim; Saba, Tanzila; Al-Rodhaan, Mznah; Al-Dhelaan, Abdullah

    2014-01-01

    This paper presents a novel features mining approach from documents that could not be mined via optical character recognition (OCR). By identifying the intimate relationship between the text and graphical components, the proposed technique pulls out the Start, End, and Exact values for each bar. Furthermore, the word 2-gram and Euclidean distance methods are used to accurately detect and determine plagiarism in bar charts. PMID:25309952
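The three records above describe the same method: word 2-grams for the textual components and Euclidean distance over the extracted bar values. A minimal sketch of those two ingredients follows; the function names and the Jaccard-style combination of 2-gram sets are our own assumptions, not the paper's exact formulation:

```python
import math

def word_2grams(text):
    # Split a chart label/caption into consecutive word pairs (word 2-grams).
    words = text.lower().split()
    return {(words[i], words[i + 1]) for i in range(len(words) - 1)}

def label_similarity(a, b):
    # Jaccard overlap of the 2-gram sets of two labels (illustrative choice).
    ga, gb = word_2grams(a), word_2grams(b)
    return len(ga & gb) / len(ga | gb) if ga | gb else 0.0

def bar_distance(values_a, values_b):
    # Euclidean distance between two extracted bar-height vectors.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(values_a, values_b)))
```

A pair of charts with near-identical labels (high 2-gram similarity) and near-zero bar distance would then be flagged as a plagiarism candidate.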

  8. Integrated system for automated financial document processing

    Science.gov (United States)

    Hassanein, Khaled S.; Wesolkowski, Slawo; Higgins, Ray; Crabtree, Ralph; Peng, Antai

    1997-02-01

    A system was developed that integrates intelligent document analysis with multiple character/numeral recognition engines in order to achieve high accuracy automated financial document processing. In this system, images are accepted in both their grayscale and binary formats. A document analysis module starts by extracting essential features from the document to help identify its type (e.g. personal check, business check, etc.). These features are also utilized to conduct a full analysis of the image to determine the location of interesting zones such as the courtesy amount and the legal amount. These fields are then made available to several recognition knowledge sources such as courtesy amount recognition engines and legal amount recognition engines through a blackboard architecture. This architecture allows all the available knowledge sources to contribute incrementally and opportunistically to the solution of the given recognition query. Performance results on a test set of machine printed business checks using the integrated system are also reported.
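A blackboard architecture of the kind described can be sketched in a few lines; the class and field names below are illustrative only, and the courtesy/legal-amount recognition engines are reduced to hard-coded posts:

```python
# Minimal blackboard: independent recognition engines post candidate
# field readings with confidences; a controller fuses them.
# All names and values here are illustrative.

class Blackboard:
    def __init__(self):
        self.candidates = []  # (field, value, confidence) triples

    def post(self, field, value, confidence):
        self.candidates.append((field, value, confidence))

    def best(self, field):
        # Opportunistic control reduced to its simplest form:
        # pick the highest-confidence candidate for a field.
        matches = [c for c in self.candidates if c[0] == field]
        return max(matches, key=lambda c: c[2])[1] if matches else None

bb = Blackboard()
bb.post("courtesy_amount", "125.00", 0.92)  # numeral recognition engine
bb.post("courtesy_amount", "126.00", 0.40)  # weaker second engine
bb.post("legal_amount", "125.00", 0.81)     # handwriting engine
```

The point of the architecture is that each knowledge source contributes incrementally; agreement between the courtesy and legal amounts can then serve as a cross-check.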

  9. Documenting the Engineering Design Process

    Science.gov (United States)

    Hollers, Brent

    2017-01-01

    Documentation of ideas and the engineering design process is a critical, daily component of a professional engineer's job. While patent protection is often cited as the primary rationale for documentation, it can also benefit the engineer, the team, the company, and stakeholders by creating a more rigorously designed and purposeful solution.…

  10. Cognitive Process as a Basis for Intelligent Retrieval Systems Design.

    Science.gov (United States)

    Chen, Hsinchun; Dhar, Vasant

    1991-01-01

    Two studies of the cognitive processes involved in online document-based information retrieval were conducted. These studies led to the development of five computational models of online document retrieval which were incorporated into the design of an "intelligent" document-based retrieval system. Both the system and the broader implications of…

  11. Computational Intelligence in Image Processing

    CERN Document Server

    Siarry, Patrick

    2013-01-01

    Computational intelligence based techniques have firmly established themselves as viable, alternate, mathematical tools for more than a decade. They have been extensively employed in many systems and application domains, among these signal processing, automatic control, industrial and consumer electronics, robotics, finance, manufacturing systems, electric power systems, and power electronics. Image processing is also an extremely potent area which has attracted the attention of many researchers who are interested in the development of new computational intelligence-based techniques and their suitable applications, in both research problems and in real-world problems. Part I of the book discusses several image preprocessing algorithms; Part II broadly covers image compression algorithms; Part III demonstrates how computational intelligence-based techniques can be effectively utilized for image analysis purposes; and Part IV shows how pattern recognition, classification and clustering-based techniques can …

  12. Aktuelles Schlagwort: Business Process Intelligence

    NARCIS (Netherlands)

    Mutschler, B.B.; Reichert, M.U.

    Recently, the capture and analysis of real process data (e.g., on the start and end of process activities) has increasingly come into focus. Such data are delivered by most process-oriented information systems. The buzzword Business Process Intelligence (BPI)…

  13. Business Intelligence in Process Control

    Science.gov (United States)

    Kopčeková, Alena; Kopček, Michal; Tanuška, Pavol

    2013-12-01

    Business Intelligence technology, which is a strong tool not only for decision-making support but also has great potential in other fields of application, is discussed in this paper. Necessary fundamental definitions are offered and explained to better understand the basic principles and the role of this technology in company management. The article is logically divided into five main parts. In the first part, the technology is defined and its main advantages are listed. In the second part, an overview of the system architecture is presented with a brief description of the separate building blocks, and the hierarchical nature of the system architecture is shown. The technology life cycle, consisting of four steps that are mutually interconnected into a ring, is described in the third part. In the fourth part, the analytical methods incorporated in online analytical processing and data mining used within business intelligence, as well as the related data mining methodologies, are summarised, and some typical applications of the above-mentioned methods are introduced. In the final part, a proposal for a knowledge discovery system for hierarchical process control is outlined. The focus of this paper is to provide a comprehensive view and to familiarize the reader with Business Intelligence technology and its utilisation.

  14. Machine intelligence and signal processing

    CERN Document Server

    Vatsa, Mayank; Majumdar, Angshul; Kumar, Ajay

    2016-01-01

    This book comprises chapters on key problems in the machine learning and signal processing arenas. The contents of the book are the result of a 2014 Workshop on Machine Intelligence and Signal Processing held at the Indraprastha Institute of Information Technology. Traditionally, signal processing and machine learning were considered to be separate areas of research. In recent times, however, the two communities have been getting closer. In a very abstract fashion, signal processing is the study of operator design; its contributions have been to devise operators for restoration, compression, etc., while applied mathematicians were more interested in operator analysis. Nowadays signal processing research is gravitating towards operator learning: instead of designing operators based on heuristics (for example, wavelets), the trend is to learn these operators (for example, dictionary learning). Thus, the gap between signal processing and machine learning is fast closing. The 2014 Workshop on Machine Intel...

  15. Artificial intelligence and process management

    International Nuclear Information System (INIS)

    Epton, J.B.A.

    1989-01-01

    Techniques derived from work in artificial intelligence over the past few decades are beginning to change the approach in applying computers to process management. To explore this new approach and gain real practical experience of its potential a programme of experimental applications was initiated by Sira in collaboration with the process industry. This programme encompassed a family of experimental applications ranging from process monitoring, through supervisory control and troubleshooting to planning and scheduling. The experience gained has led to a number of conclusions regarding the present level of maturity of the technology, the potential for further developments and the measures required to secure the levels of system integrity necessary in on-line applications to critical processes. (author)

  16. Report : business process intelligence challenge 2013

    NARCIS (Netherlands)

    Dongen, van B.F.; Weber, B.; Ferreira, D.R.; De Weerdt, J.; Lohmann, N.; Song, M.; Wohed, P.

    2014-01-01

    For the third time, the Business Process Intelligence workshop hosted the Business Process Intelligence Challenge. The goal of this challenge is twofold. On the one hand, the challenge allows researchers and practitioners in the field to show their analytical capabilities to a broader audience. On…

  17. Intelligence amplification framework for enhancing scheduling processes

    NARCIS (Netherlands)

    Dobrkovic, Andrej; Liu, Luyao; Iacob, Maria Eugenia; van Hillegersberg, Jos

    2016-01-01

    The scheduling process in a typical business environment consists of predominantly repetitive tasks that have to be completed in limited time and often contain some form of uncertainty. Intelligence amplification is a symbiotic relationship between a human and an intelligent agent. This…

  18. The Predictive Aspect of Business Process Intelligence

    DEFF Research Database (Denmark)

    Pérez, Moisés Lima; Møller, Charles

    2007-01-01

    This paper presents the arguments for a research proposal on predicting business events in a Business Process Intelligence (BPI) context. The paper argues that BPI holds a potential for leveraging enterprise benefits by supporting real-time processes. However, based on the experiences from past...... business intelligence projects the paper argues that it is necessary to establish a new methodology to mine and extract the intelligence on the business level which is different from that, which will improve a business process in an enterprise. In conclusion the paper proposes a new research project aimed...

  19. Recording Process Documentation for Provenance

    NARCIS (Netherlands)

    Groth, P.T.; Moreau, L

    2009-01-01

    Scientific and business communities are adopting large-scale distributed systems as a means to solve a wide range of resource-intensive tasks. These communities also have requirements in terms of provenance. We define the provenance of a result produced by a distributed system as the process that…

  20. User documentation for the MSK and OMS intelligent tutoring systems

    Science.gov (United States)

    Fink, Pamela K.; Herren, L. Tandy; Lincoln, David T.

    1991-01-01

    This user's guide describes how to use the Intelligent Tutoring Systems for the Manual Select Keyboard (MSK) and the Orbital Maneuvering System (OMS) and how to use the C code that runs the mockup version of the MSK.

  1. Definition and documentation of engineering processes

    Energy Technology Data Exchange (ETDEWEB)

    McDonald, G.W. [Sandia National Labs., Albuquerque, NM (United States)]

    1997-11-01

    This tutorial is an extract of a two-day workshop developed under the auspices of the Quality Engineering Department at Sandia National Laboratories. The presentation starts with basic definitions and addresses why processes should be defined and documented. It covers three primary topics: (1) process considerations and rationale, (2) an approach to defining and documenting engineering processes, and (3) an IDEF0 model of the process for defining engineering processes.

  2. Using artificial intelligence to automate remittance processing.

    Science.gov (United States)

    Adams, W T; Snow, G M; Helmick, P M

    1998-06-01

    The consolidated business office of the Allegheny Health Education Research Foundation (AHERF), a large integrated healthcare system based in Pittsburgh, Pennsylvania, sought to improve its cash-related business office activities by implementing an automated remittance processing system that uses artificial intelligence. The goal was to create a completely automated system whereby all monies it processed would be tracked, automatically posted, analyzed, monitored, controlled, and reconciled through a central database. Using a phased approach, the automated payment system has become the central repository for all of the remittances for seven of the hospitals in the AHERF system and has allowed for the complete integration of these hospitals' existing billing systems, document imaging system, and intranet, as well as the new automated payment posting, and electronic cash tracking and reconciling systems. For such new technology, which is designed to bring about major change, factors contributing to the project's success were adequate planning, clearly articulated objectives, marketing, end-user acceptance, and post-implementation plan revision.

  3. Towards a universal competitive intelligence process model

    Directory of Open Access Journals (Sweden)

    Rene Pellissier

    2013-08-01

    Background: Competitive intelligence (CI) provides actionable intelligence, which gives enterprises a competitive edge. However, without a proper process it is difficult to develop actionable intelligence, and there are disagreements about how the CI process should be structured. For CI professionals to focus on producing actionable intelligence, and to do so with simplicity, they need a common CI process model. Objectives: The purpose of this research is to review the current literature on CI, to identify and analyse CI process models, and finally to propose a universal CI process model. Method: The study was qualitative in nature, and content analysis was conducted on all identified sources establishing and analysing CI process models. To identify relevant literature, academic databases and search engines were used. Moreover, a review of references in related studies led to more relevant sources, whose references were further reviewed and analysed. To ensure reliability, only peer-reviewed articles were used. Results: The findings reveal that the majority of scholars view the CI process as a cycle of interrelated phases, where the output of one phase is the input of the next. Conclusion: The CI process is a cycle of interrelated phases; the output of one phase is the input of the next. These phases are influenced by the following factors: decision makers, process and structure, organisational awareness and culture, and feedback.

  4. Using color management in color document processing

    Science.gov (United States)

    Nehab, Smadar

    1995-04-01

    Color Management Systems have been used for several years in Desktop Publishing (DTP) environments. While this development hasn't matured yet, we are already experiencing the next generation of the color imaging revolution: Device Independent Color for the small office/home office (SOHO) environment. Though there are still open technical issues with device-independent color matching, they are not the focal point of this paper. This paper discusses two new and crucial aspects of using color management in color document processing: the management of color objects and their associated color rendering methods, and a proposal for a precedence order and handshaking protocol among the various software components involved in color document processing. As color peripherals become affordable to the SOHO market, color management also becomes a prerequisite for common document authoring applications such as word processors. The first color management solutions were oriented towards DTP environments, whose requirements were largely different. For example, DTP documents are image-centric, as opposed to SOHO documents, which are text- and chart-centric. To achieve optimal reproduction on low-cost SOHO peripherals, it is critical that different color rendering methods are used for the different document object types. The first challenge in using color management in color document processing is therefore the association of rendering methods with object types. As a result of an evolutionary process, color matching solutions are now available as application software, as driver-embedded software, and as operating system extensions. Consequently, document processing faces a new challenge: the correct selection of the color matching solution while avoiding duplicate color corrections.
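The association of rendering methods with object types that the paper identifies as the first challenge can be illustrated with a simple lookup; the ICC rendering-intent names are standard, but this particular mapping is only a plausible example, not the paper's proposal:

```python
# Illustrative mapping from document object type to ICC rendering
# intent. The intent names are standard; the mapping itself is an
# example, not taken from the paper.
RENDERING_INTENT = {
    "text": "saturation",
    "chart": "saturation",
    "photo": "perceptual",
    "logo": "relative_colorimetric",
}

def intent_for(obj_type):
    # Fall back to perceptual for unknown object types.
    return RENDERING_INTENT.get(obj_type, "perceptual")
```

A word processor would query such a table per object when handing pages to the driver, so that text stays crisp and saturated while photographs are rendered perceptually.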

  5. Parallel processing for artificial intelligence 1

    CERN Document Server

    Kanal, LN; Kumar, V; Suttner, CB

    1994-01-01

    Parallel processing for AI problems is of great current interest because of its potential for alleviating the computational demands of AI procedures. The articles in this book consider parallel processing for problems in several areas of artificial intelligence: image processing, knowledge representation in semantic networks, production rules, mechanization of logic, constraint satisfaction, parsing of natural language, data filtering and data mining. The publication is divided into six sections. The first addresses parallel computing for processing and understanding images. The second discus

  6. Intelligent medical image processing by simulated annealing

    International Nuclear Information System (INIS)

    Ohyama, Nagaaki

    1992-01-01

    Image processing is widely used in the medical field and has already become very important, especially for image reconstruction. In this paper, it is shown that image processing can be classified into four categories: passive, active, intelligent, and visual image processing. These four classes are first explained through several examples; the results show that passive image processing does not give better results than the others. Intelligent image processing is then addressed, and the simulated annealing method is introduced. Owing to the flexibility of simulated annealing, formulated intelligence is shown to be easily introduced into an image reconstruction problem. As a practical example, 3D blood vessel reconstruction from a small number of projections, which is insufficient for conventional methods to give good reconstruction, is proposed, and computer simulation clearly shows the effectiveness of the simulated annealing method. Prior to the conclusion, medical file systems such as IS&C (Image Save and Carry) are pointed out to have potential for formulating knowledge, which is indispensable for intelligent image processing. The paper concludes by summarizing the advantages of simulated annealing. (author)
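Simulated annealing itself is compact enough to sketch. The generic minimizer below (with a toy one-dimensional cost, not the paper's 3D vessel reconstruction) shows the accept-worse-moves-with-probability exp(-Δ/T) rule; all parameter values are illustrative:

```python
import math
import random

def simulated_annealing(cost, state, neighbor, t0=1.0, cooling=0.95,
                        steps=500, seed=0):
    # Generic simulated annealing: always accept improving moves,
    # accept worsening moves with probability exp(-delta/T), and
    # cool the temperature geometrically.
    rng = random.Random(seed)
    t = t0
    cur = best = state
    for _ in range(steps):
        cand = neighbor(cur, rng)
        delta = cost(cand) - cost(cur)
        if delta < 0 or rng.random() < math.exp(-delta / t):
            cur = cand
            if cost(cur) < cost(best):
                best = cur
        t *= cooling
    return best

# Toy example: recover the x that minimizes (x - 3)^2.
result = simulated_annealing(
    cost=lambda x: (x - 3.0) ** 2,
    state=0.0,
    neighbor=lambda x, rng: x + rng.uniform(-0.5, 0.5),
)
```

In a reconstruction setting, `cost` would measure the mismatch between the candidate vessel model and the measured projections, plus any prior knowledge terms — the "formulated intelligence" the abstract refers to.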

  7. Intelligent process control operator aid -- An artificial intelligence approach

    International Nuclear Information System (INIS)

    Sharma, D.D.; Miller, D.D.; Hajek, B.; Chandrasekaran, B.

    1986-01-01

    This paper describes an approach for designing intelligent process and power plant control operator aids. It is argued that one of the key aspects of an intelligent operator aid is the capability for dynamic procedure synthesis with an incomplete definition of the initial state, unknown goal states, and a dynamic world situation. The dynamic world state is used to determine the goal, select appropriate plan steps from prespecified procedures to achieve the goal, control the execution of the synthesized plan, and provide for dynamic recovery from failure, often using a goal hierarchy. The dynamic synthesis of a plan requires the integration of various problem-solving capabilities such as plan generation, plan synthesis, plan modification, and failure recovery. The programming language for implementing the DPS framework provides a convenient tool for developing applications. An application of the DPS approach to nuclear power plant emergency procedure synthesis is also described. Initial test results indicate that the approach is successful in dynamically synthesizing the procedures. The authors realize that the DPS framework is not a solution for all control tasks; however, many existing process and plant control problems satisfy the requirements discussed in the paper and should be able to benefit from the framework described.

  8. New Perspectives on Intelligence Collection and Processing

    Science.gov (United States)

    2016-06-01

    MASINT: Measurement and Signature Intelligence; NPS: Naval Postgraduate School; OSINT: Open Source Intelligence; pdf: probability density function; SIGINT: Signals Intelligence. … Measurement and Signature Intelligence (MASINT): different types of sensors; Open Source Intelligence (OSINT): from all open sources; Signals Intelligence (SIGINT): intercepting the…

  9. Application of artificial intelligence in process control

    CERN Document Server

    Krijgsman, A

    1993-01-01

    This book is the result of a united effort of six European universities to create an overall course on the application of artificial intelligence (AI) in process control. The book includes an introduction to key areas, including knowledge representation and expert, logic-, fuzzy-logic-, neural-network-, and object-oriented-based approaches in AI. Part Two covers the application to control engineering; Part Three, real-time issues; Part Four, CAD systems and expert systems; Part Five, intelligent control; and Part Six, supervisory control, monitoring and optimization.

  10. Using Intelligent Agents to Manage Business Processes

    OpenAIRE

    Jennings, N. R.; Faratin, P.; Johnson, M. J.; O'Brien, P.; Wiegand, M. E.

    1996-01-01

    Management of the business process requires pertinent, consistent and up-to-date information gathering and information dissemination. These complex and time consuming tasks prompt organizations to develop an Information Technology system to assist with the management of various aspects of their business processes. Intelligent agents are the strongest solution candidates because of their many advantages, namely: autonomy, social ability, responsiveness and proactiveness. Given these characteri...

  11. Artificial intelligence applied to process signal analysis

    Science.gov (United States)

    Corsberg, Dan

    1988-01-01

    Many space station processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect of the human/machine interface is the analysis and display of process information. Human operators can be overwhelmed by large clusters of alarms that inhibit their ability to diagnose and respond to a disturbance. Using artificial intelligence techniques and a knowledge base approach to this problem, the power of the computer can be used to filter and analyze plant sensor data. This will provide operators with a better description of the process state. Once a process state is recognized, automatic action could be initiated and proper system response monitored.
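The alarm-filtering idea can be illustrated with a tiny rule base that suppresses consequence alarms when their root cause is already active; the alarm names and the single rule below are invented for illustration:

```python
# Knowledge-base sketch: each root-cause alarm lists the consequence
# alarms it explains. Alarm names and rules are illustrative.
CAUSES = {
    "low_steam_pressure": {"turbine_trip", "generator_underfrequency"},
}

def filter_alarms(active):
    # Suppress any active alarm that is explained by an active root
    # cause, leaving the operator a smaller, more informative set.
    suppressed = set()
    for root, consequences in CAUSES.items():
        if root in active:
            suppressed |= consequences & active
    return sorted(active - suppressed)
```

A real system would hold hundreds of such cause-consequence rules derived from plant knowledge, but the filtering step stays this simple.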

  12. Environmental information document defense waste processing facility

    International Nuclear Information System (INIS)

    1981-07-01

    This report documents the impact analysis of a proposed Defense Waste Processing Facility (DWPF) for immobilizing high-level waste currently being stored on an interim basis at the Savannah River Plant (SRP). The DWPF will process the waste into a form suitable for shipment to and disposal in a federal repository, converting the high-level waste into a leach-resistant form containing more than 99.9% of the radioactivity and a residue of slightly contaminated salt. The document describes the SRP site and environs, including population, land and water uses; surface and subsurface soils and waters; meteorology; and ecology. A conceptual integrated facility for concurrently producing glass waste and saltcrete is described, and the environmental effects of constructing and operating the facility are presented. Alternative sites and waste disposal options are addressed, and environmental consultations and permits are discussed.

  13. Intelligent processing for thick composites

    Science.gov (United States)

    Shin, Daniel Dong-Ok

    2000-10-01

    Manufacturing thick composite parts is associated with adverse curing conditions such as large in-plane temperature gradients and exotherms. The condition is further aggravated because the manufacturer's cure cycle and the existing cure-control systems do not adequately counter such effects. In response, a forecast-based thermal control system was developed to provide better cure control for thick composites. An accurate cure kinetic model is crucial for correctly identifying the amount of heat generated for composite process simulation. A new technique for identifying cure parameters for Hercules AS4/3502 prepreg is presented by normalizing the DSC data. The cure kinetics is based on an autocatalytic model, whose parameters are determined from dynamic and isothermal DSC data. Existing models were also used to determine kinetic parameters but proved inadequate because of the material's temperature-dependent final degree of cure. The model predictions obtained from the new technique showed good agreement with both isothermal and dynamic DSC data, and the final degree of cure was also in good agreement with experimental data. A realistic cure simulation model including bleeder-ply analysis and compaction was validated with Hercules AS4/3501-6 based laminates. The nonsymmetrical temperature distribution resulting from the presence of bleeder plies agreed well with the model prediction. Some of the discrepancies in the predicted compaction behavior were attributed to inaccurate viscosity and permeability models. The temperature prediction was quite good for the 3 cm laminate. The validated process simulation model, along with the cure kinetics model for AS4/3502 prepreg, was integrated into the thermal control system. 3 cm Hercules AS4/3501-6 and AS4/3502 laminates were fabricated, and the resulting cure cycles satisfied all imposed requirements by minimizing exotherms and temperature gradients. Although the duration of the cure cycles increased, such phenomena were…
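Autocatalytic cure models of the kind fitted in this thesis are commonly written dα/dt = (k1 + k2·α^m)(1 − α)^n. The sketch below integrates such a model with forward Euler; the rate constants and exponents are placeholders, not the fitted AS4/3502 values:

```python
def cure_rate(alpha, k1=1e-3, k2=5e-3, m=0.5, n=1.5):
    # Autocatalytic model: d(alpha)/dt = (k1 + k2*alpha^m) * (1-alpha)^n.
    # Constants and exponents are placeholders for illustration only.
    return (k1 + k2 * alpha ** m) * (1.0 - alpha) ** n

def integrate_cure(t_end, dt=1.0):
    # Forward-Euler integration of the degree of cure from alpha = 0.
    alpha, t = 0.0, 0.0
    while t < t_end:
        alpha = min(1.0, alpha + cure_rate(alpha) * dt)
        t += dt
    return alpha
```

In a real process simulation the rate constants would be Arrhenius functions of the local temperature, which is what couples the kinetics to the thermal model and produces exotherms in thick sections.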

  14. Document Examination: Applications of Image Processing Systems.

    Science.gov (United States)

    Kopainsky, B

    1989-12-01

    Dealing with images is a familiar business for an expert in questioned documents: microscopic, photographic, infrared, and other optical techniques generate images containing the information he or she is looking for. A recent method for extracting most of this information is digital image processing, ranging from simple contrast and contour enhancement to the advanced restoration of blurred texts. When combined with a sophisticated physical imaging system, an image processing system has proven to be a powerful and fast tool for routine non-destructive scanning of suspect documents. This article reviews frequent applications, comprising techniques to increase legibility, two-dimensional spectroscopy (ink discrimination, alterations, erased entries, etc.), comparison techniques (stamps, typescript letters, photo substitution), and densitometry. Computerized comparison of handwriting is not included. Copyright © 1989 Central Police University.
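"Simple contrast enhancement" in this context usually means something like a linear contrast stretch; a minimal grayscale version, operating on a flat list of pixel values for brevity, is:

```python
def contrast_stretch(pixels, lo=0, hi=255):
    # Linear contrast stretch: map the image's min..max range onto
    # lo..hi, spreading faint tonal differences across the full scale.
    pmin, pmax = min(pixels), max(pixels)
    if pmax == pmin:
        return [lo] * len(pixels)  # flat image: nothing to stretch
    scale = (hi - lo) / (pmax - pmin)
    return [round(lo + (p - pmin) * scale) for p in pixels]
```

Applied to a faint erased entry, such a stretch makes small density differences between ink residue and paper visible without altering the document itself.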

  15. Editorial: "Business process intelligence : connecting data and processes"

    NARCIS (Netherlands)

    Aalst, van der W.M.P.; Zhao, J.L.; Wang, H.; Wang, Harry Jiannan

    2015-01-01

    This introduction to the special issue on Business Process Intelligence (BPI) discusses the relation between data and processes. The recent attention for Big Data illustrates that organizations are aware of the potential of the torrents of data generated by today's information systems. However, at…

  16. Intelligent systems for KSC ground processing

    Science.gov (United States)

    Heard, Astrid E.

    1992-01-01

    The ground processing and launch of Shuttle vehicles and their payloads is the primary task of Kennedy Space Center. It is a process which is largely manual and contains little inherent automation. Business is conducted today much as it was during previous NASA programs such as Apollo. In light of new programs and decreasing budgets, NASA must find more cost-effective ways in which to do business while retaining the quality and safety of activities. Advanced technologies including artificial intelligence could cut manpower and processing time. This paper is an overview of the research and development in AI technology at KSC with descriptions of the systems which have been implemented, as well as a few under development which are promising additions to ground processing software. Projects discussed cover many facets of ground processing activities, including computer sustaining engineering, subsystem monitor and diagnosis tools, and launch team assistants. The deployed AI applications have proven an effectiveness which has helped to demonstrate the benefits of utilizing intelligent software in the ground processing task.

  17. Markov decision processes in artificial intelligence

    CERN Document Server

    Sigaud, Olivier

    2013-01-01

    Markov Decision Processes (MDPs) are a mathematical framework for modeling sequential decision problems under uncertainty as well as Reinforcement Learning problems. Written by experts in the field, this book provides a global view of current research using MDPs in Artificial Intelligence. It starts with an introductory presentation of the fundamental aspects of MDPs (planning in MDPs, Reinforcement Learning, Partially Observable MDPs, Markov games and the use of non-classical criteria). Then it presents more advanced research trends in the domain and gives some concrete examples using illustr
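
    As a minimal illustration of the MDP framework this book surveys, the sketch below runs value iteration on an invented three-state, two-action problem; all transition probabilities and rewards are made up for demonstration and do not come from the book.

```python
import numpy as np

# Toy MDP: 3 states, 2 actions. P[a][s][s'] is the transition probability,
# R[s][a] the immediate reward. All values are invented for illustration.
P = np.array([
    [[0.8, 0.2, 0.0], [0.1, 0.8, 0.1], [0.0, 0.2, 0.8]],  # action 0
    [[0.5, 0.5, 0.0], [0.0, 0.5, 0.5], [0.0, 0.0, 1.0]],  # action 1
])
R = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 2.0]])  # R[s][a]
gamma = 0.9  # discount factor

# Value iteration: V(s) <- max_a [ R(s,a) + gamma * sum_s' P(s'|s,a) V(s') ]
V = np.zeros(3)
for _ in range(500):
    Q = R + gamma * np.einsum('ast,t->sa', P, V)  # Q[s][a]
    V_new = Q.max(axis=1)
    if np.max(np.abs(V_new - V)) < 1e-8:
        V = V_new
        break
    V = V_new

policy = Q.argmax(axis=1)  # greedy policy w.r.t. the converged values
```

    The same machinery underlies the planning and Reinforcement Learning chapters; only the way P and R are known (or estimated from experience) differs.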

  18. Parallel processing for artificial intelligence 2

    CERN Document Server

    Kumar, V; Suttner, CB

    1994-01-01

    With the increasing availability of parallel machines and the rising interest in large-scale and real-world applications, research on parallel processing for Artificial Intelligence (AI) is gaining greater importance in the computer science environment. Many applications have been implemented and delivered but the field is still considered to be in its infancy. This book assembles diverse aspects of research in the area, providing an overview of the current state of technology. It also aims to promote further growth across the discipline. Contributions have been grouped according to their

  19. Intelligent systems/software engineering methodology - A process to manage cost and risk

    Science.gov (United States)

    Friedlander, Carl; Lehrer, Nancy

    1991-01-01

    A systems development methodology is discussed that has been successfully applied to the construction of a number of intelligent systems. This methodology is a refinement of both evolutionary and spiral development methodologies. It is appropriate for development of intelligent systems. The application of advanced engineering methodology to the development of software products and intelligent systems is an important step toward supporting the transition of AI technology into aerospace applications. A description of the methodology and the process model from which it derives is given. Associated documents and tools are described which are used to manage the development process and record and report the emerging design.

  20. The process of implementing Competitive Intelligence in a company

    Directory of Open Access Journals (Sweden)

    František Bartes

    2013-01-01

    It is a common occurrence in business practice that the management of a company, in an effort to jump-start the function of the Competitive Intelligence unit, makes a number of mistakes and errors. Yet it is not difficult to avoid these missteps and achieve the desired level of Competitive Intelligence activities in a purposeful and effective manner. The author believes that a resolution of this problem lies in his concept of Competitive Intelligence viewed as a system application discipline (like value analysis or value engineering), which is why he approaches the problem of actual implementation of Competitive Intelligence in a company by referring to standards ČSN EN 12 973 and ČSN EN 1325-2. The author then proposes his own procedure for implementing Competitive Intelligence in a company. He first describes the various ways of securing the Competitive Intelligence services. Depending on the manner of securing these services, it is necessary to choose the actual method of bringing Competitive Intelligence into the company. The author goes on to list the essentials that every program of Competitive Intelligence implementation should have. The process of Competitive Intelligence implementation unfolds in three stages, those being: 1. Managerial preparation for the introduction of Competitive Intelligence. 2. Personnel-oriented and professional preparation for applying Competitive Intelligence. 3. Organizational preparation for the implementation and practice of Competitive Intelligence. In Discussion, the author points out the most common mistakes he encountered in practice when implementing the Competitive Intelligence function.

  1. Process mining: business intelligence software finally becomes intelligent

    NARCIS (Netherlands)

    Aalst, van der W.M.P.

    2007-01-01

    Business Intelligence is a term that refers to software that can be used to collect data about operational business processes and then analyze them. The goal of BI software is to obtain more knowledge and insight, which can be used to

  2. Forensic intelligence applied to questioned document analysis: A model and its application against organized crime.

    Science.gov (United States)

    De Alcaraz-Fossoul, Josep; Roberts, Katherine A

    2017-07-01

    The capability of forensic sciences to fight crime, especially against organized criminal groups, has become particularly relevant in the recent economic downturn and the war on terrorism. In view of these societal challenges, the methods of combating crime should experience critical changes in order to improve the effectiveness and efficiency of the current resources available. It is obvious that authorities have serious difficulties combating criminal groups of transnational nature. These are characterized as well-structured organizations with international connections, abundant financial resources, and members with significant and diverse expertise. One common practice among organized criminal groups is the use of forged documents that allow for the commission of illegal cross-border activities. Law enforcement can target these movements to identify counterfeits and establish links between these groups. Information on document falsification can become relevant to generate forensic intelligence and to design new strategies against criminal activities of this nature and magnitude. This article discusses a methodology for improving the development of forensic intelligence in the discipline of questioned document analysis. More specifically, it focuses on document forgeries and falsification types used by criminal groups. It also describes the structure of international criminal organizations that use document counterfeits as a means to conduct unlawful activities. The model presented is partially based on practical applications of the system that have resulted in satisfactory outcomes in our laboratory. Copyright © 2017 The Chartered Society of Forensic Sciences. Published by Elsevier B.V. All rights reserved.

  3. Business process modeling for processing classified documents using RFID technology

    Directory of Open Access Journals (Sweden)

    Koszela Jarosław

    2016-01-01

    The article outlines the application of the processing approach to the functional description of the designed IT system supporting the operations of the secret office, which processes classified documents. The article describes the application of the method of incremental modeling of business processes according to the BPMN model to the description of the processes currently implemented manually (“as is”) and the target processes (“to be”) that use RFID technology for the purpose of their automation. Additionally, examples of applying the method of structural and dynamic analysis of the processes (process simulation) to verify their correctness and efficiency are presented. An extension of the process analysis method is the possibility of applying the warehouse of processes and process mining methods.

  4. Tool path strategy and cutting process monitoring in intelligent machining

    Science.gov (United States)

    Chen, Ming; Wang, Chengdong; An, Qinglong; Ming, Weiwei

    2018-06-01

    Intelligent machining is a current focus in advanced manufacturing technology, and is characterized by high accuracy and efficiency. A central technology of intelligent machining—the cutting process online monitoring and optimization—is urgently needed for mass production. In this research, the cutting process online monitoring and optimization in jet engine impeller machining, cranio-maxillofacial surgery, and hydraulic servo valve deburring are introduced as examples of intelligent machining. Results show that intelligent tool path optimization and cutting process online monitoring are efficient techniques for improving the efficiency, quality, and reliability of machining.

  5. Artificial intelligence in the materials processing laboratory

    Science.gov (United States)

    Workman, Gary L.; Kaukler, William F.

    1990-01-01

    Materials science and engineering provides a vast arena for applications of artificial intelligence. Advanced materials research is an area in which challenging requirements confront the researcher, from the drawing board through production and into service. Advanced techniques result in the development of new materials for specialized applications. Hand-in-hand with these new materials come requirements for state-of-the-art inspection methods to determine the integrity or fitness for service of structures fabricated from these materials. Two problems of current interest to the Materials Processing Laboratory at UAH are an expert system to assist in eddy current inspection of graphite epoxy components for aerospace and an expert system to assist in the design of superalloys for high temperature applications. Each project requires a different approach to reach the defined goals. Results to date are described for the eddy current analysis, but only the original concepts and approaches considered are given for the expert system to design superalloys.

  6. System Documentation AS Basis For Company Business Process Improvement

    OpenAIRE

    Pelawi, Dewan

    2012-01-01

    A business process is a set of activities performed together to achieve business goals. A good business process will support the achievement of the organization’s plan to make a profit for the company. In order to understand the business process, the business process needs to be documented and analyzed. The purpose of the research is to use system documentation as a basis to improve or complete the ongoing business process. The research method is system documentation. System documentation is a way to desc...

  7. INTELLIGENT SUPPORT OF EDUCATIONAL PROCESSES AT LEVEL OF SPECIALITY

    Directory of Open Access Journals (Sweden)

    Irina I. Kazmina

    2013-01-01

    The article is devoted to intelligent support of educational processes at the level of speciality with the help of an information system. In this paper the intelligent information system of the Modern Humanitarian Academy is considered and three directions of development of intelligent support within the scope of the developed information system are offered. These directions include: development of a student model, data mining of teaching quality, and prediction of teaching quality in the future.

  8. Process of technical document quality assessment | Djebabra ...

    African Journals Online (AJOL)

    The most used instrument in training and scientific research is obviously the book, which has always occupied a place of choice. Indeed, the book, and more particularly the technical book, is used as a support both in basic training and in ... the document is of good quality in order to have confidence in the services ...

  9. Internet-based intelligent information processing systems

    CERN Document Server

    Tonfoni, G; Ichalkaranje, N S

    2003-01-01

    The Internet/WWW has made it possible to easily access quantities of information never available before. However, both the amount of information and the variation in quality pose obstacles to the efficient use of the medium. Artificial intelligence techniques can be useful tools in this context. Intelligent systems can be applied to searching the Internet and data-mining, interpreting Internet-derived material, the human-Web interface, remote condition monitoring and many other areas. This volume presents the latest research on the interaction between intelligent systems (neural networks, adap

  10. Improving the Product Documentation Process of a Small Software Company

    Science.gov (United States)

    Valtanen, Anu; Ahonen, Jarmo J.; Savolainen, Paula

    Documentation is an important part of the software process, even though it is often neglected in software companies. The eternal question is how much documentation is enough. In this article, we present a practical implementation of a lightweight product documentation process resulting from SPI efforts in a small company. Small companies’ financial and human resources are often limited. The documentation process described here offers a template for creating adequate documentation while consuming a minimal amount of resources. The key element of the documentation process is an open source web-based bug-tracking system that was customized to be used as a documentation tool. The use of the tool enables iterative and well-structured documentation. The solution best serves the needs of a small company with off-the-shelf software products and striving for SPI.

  11. On Intelligent Design and Planning Method of Process Route Based on Gun Breech Machining Process

    Science.gov (United States)

    Hongzhi, Zhao; Jian, Zhang

    2018-03-01

    The paper presents an approach to the intelligent design and planning of the process route for gun breech machining, addressing several problems of the traditional process route, such as the complexity of the machining process, tedious route design, and long design cycles that make the route hard to manage. Based on the gun breech machining process, an intelligent process route design and planning system is developed by virtue of DEST and VC++. The system includes two functional modules: process route intelligent design and process route planning. The process route intelligent design module, through analysis of the gun breech machining process, summarizes breech process knowledge to complete the design of the knowledge base and inference engine, from which the gun breech process route is output intelligently. On the basis of the intelligent route design module, the final process route is made, edited and managed in the process route planning module.

  12. Predicting speech intelligibility in conditions with nonlinearly processed noisy speech

    DEFF Research Database (Denmark)

    Jørgensen, Søren; Dau, Torsten

    2013-01-01

    The speech-based envelope power spectrum model (sEPSM; [1]) was proposed in order to overcome the limitations of the classical speech transmission index (STI) and speech intelligibility index (SII). The sEPSM applies the signal-to-noise ratio in the envelope domain (SNRenv), which was demonstrated to successfully predict speech intelligibility in conditions with nonlinearly processed noisy speech, such as processing with spectral subtraction. Moreover, a multiresolution version (mr-sEPSM) was demonstrated to account for speech intelligibility in various conditions with stationary and fluctuating
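
    The SNRenv idea can be sketched in a much-simplified form: the envelope power of the noisy speech in excess of the noise envelope power, relative to the noise envelope power. In the sketch below, rectification plus a moving average stands in for the sEPSM's modulation filterbank, and the signals are synthetic test tones, not speech; none of this code comes from the paper.

```python
import numpy as np

def envelope_power(x, fs, cutoff=30.0):
    """AC power of the temporal envelope, normalized by its DC power.

    Simplified: rectification followed by a moving-average lowpass whose
    length approximates the cutoff frequency (a crude stand-in for the
    modulation filterbank used in the actual sEPSM).
    """
    env = np.abs(x)
    win = max(1, int(fs / cutoff))
    env = np.convolve(env, np.ones(win) / win, mode='same')
    dc = env.mean()
    return np.mean((env - dc) ** 2) / (dc ** 2)

def snr_env(noisy, noise, fs):
    """SNRenv in dB from the noisy-signal and noise-alone envelopes."""
    p_mix = envelope_power(noisy, fs)
    p_noise = envelope_power(noise, fs)
    eps = 1e-10  # floor to keep the log finite
    return 10 * np.log10(max(p_mix - p_noise, eps) / max(p_noise, eps))

# Demo with synthetic signals: a 4 Hz amplitude-modulated tone vs. noise
fs = 8000
t = np.arange(fs) / fs
speech_like = (1 + 0.8 * np.sin(2 * np.pi * 4 * t)) * np.sin(2 * np.pi * 500 * t)
rng = np.random.default_rng(0)
noise = 0.3 * rng.standard_normal(fs)
snr_db = snr_env(speech_like + noise, noise, fs)
```

    The modulated tone carries strong 4 Hz envelope fluctuations that the noise lacks, so the computed SNRenv is positive; for noise alone it collapses toward the floor.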

  13. Intelligent Transportation Control based on Proactive Complex Event Processing

    Directory of Open Access Journals (Sweden)

    Wang Yongheng

    2016-01-01

    Complex Event Processing (CEP) has become a key part of the Internet of Things (IoT). Proactive CEP can predict future system states and execute actions to avoid unwanted states, which brings new hope to intelligent transportation control. In this paper, we propose a proactive CEP architecture and method for intelligent transportation control. Based on basic CEP technology and predictive analytic technology, a networked distributed Markov decision processes model with predicting states is proposed as the sequential decision model. A Q-learning method is proposed for this model. The experimental evaluations show that this method works well when used to control congestion in intelligent transportation systems.
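
    The Q-learning component of such a controller can be illustrated, in a much-simplified single-intersection form, by the sketch below. The congestion dynamics, state space, and parameters are all invented for illustration and are not taken from the paper.

```python
import random

# Hypothetical toy setup: states are congestion levels 0..4 at one
# intersection; actions are 0 = keep the current signal phase, 1 = switch.
N_STATES, N_ACTIONS = 5, 2

def step(state, action, rng):
    """Invented dynamics: switching tends to reduce congestion."""
    if action == 1:
        nxt = max(0, state - rng.choice([0, 1, 2]))
    else:
        nxt = min(N_STATES - 1, state + rng.choice([0, 0, 1]))
    return nxt, -nxt  # reward penalizes congestion

rng = random.Random(42)
Q = [[0.0] * N_ACTIONS for _ in range(N_STATES)]
alpha, gamma, eps = 0.1, 0.9, 0.1

for _ in range(2000):                     # episodes with random starts
    state = rng.randrange(N_STATES)
    for _ in range(20):
        # epsilon-greedy action selection
        if rng.random() < eps:
            action = rng.randrange(N_ACTIONS)
        else:
            action = max(range(N_ACTIONS), key=lambda a: Q[state][a])
        nxt, reward = step(state, action, rng)
        # Q-learning update toward reward + discounted best next value
        Q[state][action] += alpha * (reward + gamma * max(Q[nxt]) - Q[state][action])
        state = nxt

policy = [max(range(N_ACTIONS), key=lambda a: Q[s][a]) for s in range(N_STATES)]
```

    After training, the greedy policy prefers switching at high congestion levels, which is the behavior the reward shapes; the paper's networked, distributed formulation with predicted states adds substantial structure beyond this sketch.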

  14. Adaptive Algorithms for Automated Processing of Document Images

    Science.gov (United States)

    2011-01-01

    Adaptive Algorithms for Automated Processing of Document Images. Mudit Agrawal, Doctor of Philosophy, 2011. Dissertation submitted to the Faculty of the Graduate School of the University

  15. LEARNING STYLES BASED ADAPTIVE INTELLIGENT TUTORING SYSTEMS: DOCUMENT ANALYSIS OF ARTICLES PUBLISHED BETWEEN 2001. AND 2016.

    Directory of Open Access Journals (Sweden)

    Amit Kumar

    2017-12-01

    Actualizing instructional intercessions to suit learner contrasts has gotten extensive consideration. Among these individual contrast factors, the observational confirmation in regards to the academic benefit of learning styles has been addressed, yet the examination on the issue proceeds. Late improvements in web-based executions have driven researchers to re-examine the learning styles in adaptive tutoring frameworks. Adaptivity in intelligent tutoring systems is strongly influenced by the learning style of a learner. This study involved extensive document analysis of adaptive tutoring systems based on learning styles. Seventy-eight studies in the literature from 2001 to 2016 were collected and classified under select parameters such as main focus, purpose, research types, methods, types and levels of participants, field/area of application, learner modelling, data gathering tools used and research findings. The current studies reveal that the majority of the studies defined a framework or architecture of an adaptive intelligent tutoring system (AITS) while others focused on the impact of AITS on learner satisfaction and academic outcomes. Current trends, gaps in the literature and implications were discussed.

  16. Multiple multichannel spectra acquisition and processing system with intelligent interface

    International Nuclear Information System (INIS)

    Chen Ying; Wei Yixiang; Qu Jianshi; Zheng Futang; Xu Shengkui; Xie Yuanming; Qu Xing; Ji Weitong; Qiu Xuehua

    1986-01-01

    A multiple multichannel spectra acquisition and processing system with an intelligent interface is described. Sixteen spectra measured with various lengths, channel widths, back biases and acquisition times can be identified and collected by the intelligent interface simultaneously while the connected computer is doing data processing. The execution time for the Ge(Li) gamma-ray spectrum analysis software on an IBM PC-XT is about 55 seconds

  17. Intelligent Transportation Control based on Proactive Complex Event Processing

    OpenAIRE

    Wang Yongheng; Geng Shaofeng; Li Qian

    2016-01-01

    Complex Event Processing (CEP) has become the key part of Internet of Things (IoT). Proactive CEP can predict future system states and execute some actions to avoid unwanted states which brings new hope to intelligent transportation control. In this paper, we propose a proactive CEP architecture and method for intelligent transportation control. Based on basic CEP technology and predictive analytic technology, a networked distributed Markov decision processes model with predicting states is p...

  18. The big data processing platform for intelligent agriculture

    Science.gov (United States)

    Huang, Jintao; Zhang, Lichen

    2017-08-01

    Big data technology is another popular technology after the Internet of Things and cloud computing. Big data is widely used in many fields, such as social platforms, e-commerce, and financial analysis. Intelligent agriculture produces, in the course of its operation, large amounts of data of complex structure; fully mining the value of these data will be very meaningful for the development of agriculture. This paper proposes an intelligent data processing platform based on Storm and Cassandra to realize the storage and management of the big data of intelligent agriculture.

  19. Classification process in a text document recommender system

    Directory of Open Access Journals (Sweden)

    Dan MUNTEANU

    2005-12-01

    This paper presents the classification process in a recommender system used for textual documents taken especially from the web. The system uses in the classification process a combination of content filters, event filters and collaborative filters, and it uses implicit and explicit feedback for evaluating documents.
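
    The content-filter component of such a system can be sketched under strong simplifications: plain term-frequency vectors, cosine similarity, and a nearest-neighbour labelling rule over an invented two-document corpus. This is a generic illustration, not the paper's actual filter combination.

```python
import math
from collections import Counter

def tf_vector(text):
    """Term-frequency vector of a document (simple whitespace tokenizer)."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse term-frequency vectors."""
    dot = sum(a[t] * b.get(t, 0) for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def classify(doc, labelled_docs):
    """Assign doc the label of its most similar training document."""
    v = tf_vector(doc)
    best = max(labelled_docs, key=lambda pair: cosine(v, tf_vector(pair[0])))
    return best[1]

# Invented example corpus
training = [
    ("stock market shares trading prices", "finance"),
    ("football match goal league players", "sport"),
]
label = classify("the market prices of shares fell", training)
```

    A production recommender would replace raw term frequencies with TF-IDF weighting and blend this content score with the collaborative and event-filter signals the abstract mentions.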

  20. Sustainability Reporting Process Model using Business Intelligence

    OpenAIRE

    Alxneit, Thorsten Julius

    2015-01-01

    Sustainability, including the reporting requirements, is one of the most relevant topics for companies. In recent years, many software providers have launched new software tools targeting companies committed to implementing sustainability reporting. But it is not only companies willing to use their Business Intelligence (BI) solution; there are also basic principles such as the single source of truth and tendencies to combine sustainability reporting with financial reporting (...

  1. Political and Budgetary Oversight of the Ukrainian Intelligence Community: Processes, Problems and Prospects for Reform

    National Research Council Canada - National Science Library

    Petrov, Oleksii

    2007-01-01

    .... Official government documents, news reports and other literature on the intelligence system in Ukraine, as well as studies of intelligence oversight within democracies are the primary sources of data...

  2. An Approach to quantify the Costs of Business Process Intelligence.

    NARCIS (Netherlands)

    Mutschler, B.B.; Bumiller, J.; Reichert, M.U.; Desel, J.; Frank, U.

    2005-01-01

    Today, enterprises are forced to continuously optimize their business as well as service processes. In this context the process-centered alignment of information systems is crucial. The use of business process intelligence (BPI) tools offers promising perspectives in this respect. However, when

  3. Quality control of the documentation process in electronic economic activities

    Directory of Open Access Journals (Sweden)

    Krutova A.S.

    2017-06-01

    It is argued that the main tool for providing adequate information resources in electronic economic activity is quality control of documentation processes, as the basis of the global information space. Two directions are proposed for evaluating information resources in the documentation process: the development of tools to assess the efficiency of system components (qualitative assessment), and the development of mathematical modeling tools (quantitative evaluation). A qualitative assessment of the electronic documentation of economic activity covers performance, efficiency of communication, document management efficiency, effectiveness of flow control operations, and relationship management effectiveness. A concept of quality control for electronic documents of economic activity is proposed, whose components include: the level of workflow; the adequacy of information forms; the consumer quality of documents; quality attributes; the type of income data; the condition of monitoring systems; and the organizational level of process documentation. The components of the control system for electronic documents of economic entities are grounded, and the components of an IT audit of the economic activity management system are identified: compliance audit; audit of internal control; detailed multilevel analysis; and corporate risk assessment methodology. The stages and methods of processing electronic transactions of economic activity during condition monitoring of electronic economic activity are described.

  4. The Algorithm Theoretical Basis Document for Level 1A Processing

    Science.gov (United States)

    Jester, Peggy L.; Hancock, David W., III

    2012-01-01

    The first process of the Geoscience Laser Altimeter System (GLAS) Science Algorithm Software converts the Level 0 data into the Level 1A Data Products. The Level 1A Data Products are the time ordered instrument data converted from counts to engineering units. This document defines the equations that convert the raw instrument data into engineering units. Required scale factors, bias values, and coefficients are defined in this document. Additionally, required quality assurance and browse products are defined in this document.

  5. Creation of structured documentation templates using Natural Language Processing techniques.

    Science.gov (United States)

    Kashyap, Vipul; Turchin, Alexander; Morin, Laura; Chang, Frank; Li, Qi; Hongsermeier, Tonya

    2006-01-01

    Structured Clinical Documentation is a fundamental component of the healthcare enterprise, linking both clinical (e.g., electronic health record, clinical decision support) and administrative functions (e.g., evaluation and management coding, billing). One of the challenges in creating good quality documentation templates has been the inability to address specialized clinical disciplines and adapt to local clinical practices. A one-size-fits-all approach leads to poor adoption and inefficiencies in the documentation process. On the other hand, the cost associated with manual generation of documentation templates is significant. Consequently there is a need for at least partial automation of the template generation process. We propose an approach and methodology for the creation of structured documentation templates for diabetes using Natural Language Processing (NLP).

  6. Improving the Document Development Process: Integrating Relational Data and Statistical Process Control.

    Science.gov (United States)

    Miller, John

    1994-01-01

    Presents an approach to document numbering, document titling, and process measurement which, when used with fundamental techniques of statistical process control, reveals meaningful process-element variation as well as nominal productivity models. (SR)
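
    The statistical process control technique mentioned here can be illustrated with a minimal individuals-chart computation. The page counts below are invented; 2.66 is the standard individuals-chart constant (3 / d2, with d2 = 1.128 for moving ranges of size two).

```python
# Invented sample data: pages produced per document revision cycle
counts = [12, 14, 13, 15, 12, 14, 13, 30, 14, 13]

mean = sum(counts) / len(counts)
# Average moving range between consecutive observations
mrs = [abs(b - a) for a, b in zip(counts, counts[1:])]
mr_bar = sum(mrs) / len(mrs)

# Individuals-chart control limits: mean +/- 2.66 * average moving range
ucl = mean + 2.66 * mr_bar
lcl = mean - 2.66 * mr_bar

# Points outside the limits signal special-cause (non-random) variation
out_of_control = [x for x in counts if x > ucl or x < lcl]
```

    With these numbers the one anomalous cycle (30 pages) falls above the upper control limit, which is exactly the kind of meaningful process-element variation the article's measurement scheme is designed to expose.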

  7. Artificial intelligence in process design and operation

    International Nuclear Information System (INIS)

    Sudduth, A.L.

    1988-01-01

    Artificial Intelligence (AI) has recently become prominent in the discussion of computer applications in the utility business. In order to assess this technology, a research project was performed to determine whether software development techniques based on AI could be used to facilitate management of information associated with the design of a generating station. The approach taken was the development of an expert system, using a relatively simple set of rules acting on a more complex knowledge base. A successful prototype for the application was developed and its potential extension to a production environment demonstrated. During the course of prototype development, other possible applications of AI in design engineering were discovered, and areas of particular interest selected for further investigation. A plan for AI R and D was formulated. That plan and other possible future work in AI are discussed

  8. 智慧型文件與智慧型系統整合之研究 A Research on the Integration of Intelligent Document and Intelligent System

    Directory of Open Access Journals (Sweden)

    Sinn-Cheng Lin

    2003-06-01

    This article starts from the perspective of the gradual convergence of the two fields of system intelligence and document intelligence, arguing that the degree of intelligence of electronic documents is an important factor affecting the performance of intelligent retrieval systems. Next, observing the development trend of XML, we find that XML-centered technologies have gradually come to play an important role in raising the intelligence of documents. Furthermore, through a series of implementations centered on XML, we built an XML data exchange system, an XML news management and publishing system, an XML/CMARC cataloging system, and a WAP mobile OPAC system, verifying XML's considerable application potential in data exchange, semantic description of content, library automation cataloging, and mobile information retrieval, respectively. This study focuses on the integration of intelligent documents and intelligent systems. First, the paper defines the intelligent document as an electronic document that has extra self-description information, semantically. We believe that the intelligence of the document would be an important factor that impacts the performance of information retrieval systems. Next, by exploring the development of XML, we find that XML-based technologies already became the principle of intelligent documents. Moreover, a series of system implementations have been done in this paper: a data exchange system, a news publication system, an XML-based CMARC cataloging system and a WAP-based OPAC system. These experiments demonstrate the application potential of XML in many fields, such as electronic data exchange, electronic publication, library automation and the mobile information service of libraries.

  9. PROCESS DOCUMENTATION: A MODEL FOR KNOWLEDGE MANAGEMENT IN ORGANIZATIONS.

    Science.gov (United States)

    Haddadpoor, Asefeh; Taheri, Behjat; Nasri, Mehran; Heydari, Kamal; Bahrami, Gholamreza

    2015-10-01

    Continuous and interconnected processes are a chain of activities that turn the inputs of an organization to its outputs and help achieve partial and overall goals of the organization. These activities are carried out by two types of knowledge in the organization, called explicit and implicit knowledge. Among these, implicit knowledge is the knowledge that controls a major part of the activities of an organization, controls these activities internally and will not be transferred to the process owners unless they are present during the organization's work. Therefore the goal of this study is identification of implicit knowledge and its integration with explicit knowledge in order to improve human resources management, physical resource management, information resource management, training of new employees and other activities of Isfahan University of Medical Science. The project for documentation of activities in the department of health of Isfahan University of Medical Science was carried out in several stages. First the main processes and related sub-processes were identified and categorized with the help of a planning expert. The categorization was carried out from smaller processes to larger ones. In this stage the experts of each process wrote down all their daily activities and organized them into general categories based on logical and physical relations between different activities. Then each activity was assigned a specific code. The computer software was designed after understanding the different parts of the processes, including main and sub-processes, and categorization, which will be explained in the following sections. The findings of this study showed that documentation of activities can help expose implicit knowledge because all of the inputs and outputs of a process along with the length, location, tools and different stages of the process, exchanged information, storage location of the information and information flow can be identified using proper

  10. A Document Imaging Technique for Implementing Electronic Loan Approval Process

    Directory of Open Access Journals (Sweden)

    J. Manikandan

    2015-04-01

    Image processing is one of the leading technologies of computer applications. Image processing is a type of signal processing: the input to an image processor is an image or video frame, and the output will be an image or a subset of an image [1]. Computer graphics and computer vision processes use image processing techniques. Image processing systems are used in various environments such as the medical field, computer-aided design (CAD), research, crime investigation and the military. In this paper, we propose a document image processing technique for establishing an electronic loan approval process (E-LAP) [2]. The loan approval process has been a tedious process; the E-LAP system attempts to reduce its complexity. Customers log in to fill in the loan application form online with all details and submit the form. The loan department then processes the submitted form and sends an acknowledgement mail via the E-LAP to the requesting customer with the details about the list of documents required for the loan approval process [3]. The customer can then upload scanned copies of all required documents. All this interaction between customer and bank takes place using the E-LAP system.

  11. Intelligent query processing for semantic mediation of information systems

    Directory of Open Access Journals (Sweden)

    Saber Benharzallah

    2011-11-01

    Full Text Available We propose an intelligent and efficient query processing approach for the semantic mediation of information systems, together with a generic multi-agent architecture that supports it. Our approach focuses on exploiting intelligent agents for query reformulation and on a new technology for semantic representation. The algorithm adapts itself to changes in the environment, offers wide applicability and resolves the various data conflicts dynamically; it reformulates the query using schema mediation for discovered systems and context mediation for the other systems.

  12. Meaning of cognitive processes for creating artificial intelligence

    OpenAIRE

    Pangrác, Vojtěch

    2011-01-01

    This diploma thesis presents an integrated view of the cognitive processes connected with artificial intelligence systems and compares them with the processes observed in nature, including in human beings. A historical background helps frame the whole issue from a certain point of view. The main axis of interest follows the historical overview and comprises: environment -- stimulation -- processing -- reflection in the cognitive system -- reaction to stimulation; I balance...

  13. INFORMATION SYSTEM OF AUTOMATION OF PREPARATION EDUCATIONAL PROCESS DOCUMENTS

    Directory of Open Access Journals (Sweden)

    V. A. Matyushenko

    2016-01-01

    Full Text Available Information technology is rapidly conquering the world, permeating all spheres of human activity, and education is no exception. An important direction in the informatization of education is the development of university management systems. Modern information systems improve and facilitate the management of all of an institution's activities. The purpose of this paper is the development of a system that automates the preparation of accounting documents for the educational process. The article describes the problem of preparing educational process documents. It was decided to design and implement the information system in the Microsoft Access environment. The result is four types of reports produced by the developed system. Use of the system now makes it possible to automate the process and reduce the effort required to prepare accounting documents. All reports were implemented in Microsoft Excel and can be used for further analysis and processing.

  14. Intelligent control for scalable video processing

    NARCIS (Netherlands)

    Wüst, C.C.

    2006-01-01

    In this thesis we study a problem related to cost-effective video processing in software by consumer electronics devices, such as digital TVs. Video processing is the task of transforming an input video signal into an output video signal, for example to improve the quality of the signal. This

  15. The Janus Head Article - On Quality in the Documentation Process

    Directory of Open Access Journals (Sweden)

    Henrik Andersen

    2006-03-01

    Full Text Available The god Janus in Roman mythology was a two-faced god; each face had its own view of the world. Our idea behind the Janus Head article is to give you two different and maybe even contradicting views on a certain topic. In this issue the topic is quality in the documentation process. In the first half of this issue’s Janus Head Article, translators from the international company Grundfos give us their view of quality and how quality is managed in the documentation process at Grundfos. In the second half, scholars from the University of Southern Denmark describe and discuss quality in the documentation process at Grundfos from a researcher’s point of view.

  16. The Janus Head Article - On Quality in the Documentation Process

    Directory of Open Access Journals (Sweden)

    Henrik Andersen

    2012-08-01

    Full Text Available The god Janus in Roman mythology was a two-faced god; each face had its own view of the world. Our idea behind the Janus Head article is to give you two different and maybe even contradicting views on a certain topic. In this issue the topic is quality in the documentation process. In the first half of this issue’s Janus Head Article, translators from the international company Grundfos give us their view of quality and how quality is managed in the documentation process at Grundfos. In the second half, scholars from the University of Southern Denmark describe and discuss quality in the documentation process at Grundfos from a researcher’s point of view.

  17. Advances in Reasoning-Based Image Processing Intelligent Systems Conventional and Intelligent Paradigms

    CERN Document Server

    Nakamatsu, Kazumi

    2012-01-01

    The book puts special stress on contemporary techniques for reasoning-based image processing and analysis: learning-based image representation and advanced video coding; intelligent image processing and analysis in medical vision systems; similarity learning models for image reconstruction; visual perception for mobile robot motion control; simulation of human brain activity in the analysis of video sequences; shape-based invariant feature extraction; essentials of paraconsistent neural networks; and creativity and intelligent representation in computational systems. The book comprises 14 chapters. Each chapter is a small monograph representing the authors' recent investigations in the area. The topics of the chapters cover wide scientific and application areas and complement each other very well. Each chapter's content is based on fundamental theoretical presentations, followed by experimental results and comparisons with similar techniques. The size of the chapters is well balanced, which permits a thorough ...

  18. Intelligent Controller Design for a Chemical Process

    OpenAIRE

    Mr. Glan Devadhas G; Dr.Pushpakumar S.

    2010-01-01

    Chemical process control is a challenging problem due to the strong on-line non-linearity and extreme disturbance sensitivity of the process. Ziegler-Nichols-tuned PI and PID controllers are found to provide poor performance for higher-order and non-linear systems. This paper presents an application of a one-step-ahead fuzzy as well as an ANFIS (adaptive-network-based fuzzy inference system) tuning scheme for a continuous stirred tank reactor (CSTR) process. The controller is designed based ...

  19. Graphics Processing Unit Enhanced Parallel Document Flocking Clustering

    Energy Technology Data Exchange (ETDEWEB)

    Cui, Xiaohui [ORNL; Potok, Thomas E [ORNL; ST Charles, Jesse Lee [ORNL

    2010-01-01

    Analyzing and clustering documents is a complex problem. One explored method of solving this problem borrows from nature, imitating the flocking behavior of birds. One limitation of this method of document clustering is its O(n²) complexity. As the number of documents grows, it becomes increasingly difficult to generate results in a reasonable amount of time. In the last few years, the graphics processing unit (GPU) has received attention for its ability to solve highly parallel and semi-parallel problems much faster than the traditional sequential processor. In this paper, we have conducted research to exploit this architecture and apply its strengths to the flocking-based document clustering problem. Using the CUDA platform from NVIDIA, we developed a document flocking implementation to be run on the NVIDIA GeForce GPU. Performance gains ranged from thirty-six to nearly sixty times improvement of the GPU over the CPU implementation.

  20. A graded approach to safety documentation at processing facilities

    International Nuclear Information System (INIS)

    Cowen, M.L.

    1992-01-01

    Westinghouse Savannah River Company (WSRC) has over 40 major Safety Analysis Reports (SARs) in preparation for non-reactor facilities. These facilities include nuclear material production facilities, waste management facilities, support laboratories and environmental remediation facilities. The SARs for these various projects encompass hazard levels from High to Low, and mission times from startup, through operation, to shutdown. All of these efforts are competing for scarce resources, and therefore some mechanism is required for balancing the documentation requirements. Three of the key variables useful for the decision making process are Depth of Safety Analysis, Urgency of Safety Analysis, and Resource Availability. This report discusses safety documentation at processing facilities

  1. Towards a universal competitive intelligence process model

    Directory of Open Access Journals (Sweden)

    Rene Pellissier

    2013-08-01

    Objectives: The purpose of this research is to review the current literature on competitive intelligence (CI), to identify and analyse existing CI process models, and finally to propose a universal CI process model. Method: The study was qualitative in nature, and content analysis was conducted on all identified sources establishing and analysing CI process models. To identify relevant literature, academic databases and search engines were used. Moreover, a review of references in related studies led to more relevant sources, whose references were further reviewed and analysed. To ensure reliability, only peer-reviewed articles were used. Results: The findings reveal that the majority of scholars view the CI process as a cycle of interrelated phases in which the output of one phase is the input of the next. Conclusion: The CI process is a cycle of interrelated phases, the output of one phase being the input of the next. These phases are influenced by the following factors: decision makers, process and structure, organisational awareness and culture, and feedback.

  2. Flocking-based Document Clustering on the Graphics Processing Unit

    Energy Technology Data Exchange (ETDEWEB)

    Cui, Xiaohui [ORNL; Potok, Thomas E [ORNL; Patton, Robert M [ORNL; ST Charles, Jesse Lee [ORNL

    2008-01-01

    Analyzing and grouping documents by content is a complex problem. One explored method of solving this problem borrows from nature, imitating the flocking behavior of birds. Each bird represents a single document and flies toward other documents that are similar to it. One limitation of this method of document clustering is its O(n²) complexity. As the number of documents grows, it becomes increasingly difficult to receive results in a reasonable amount of time. However, flocking behavior, along with most naturally inspired algorithms such as ant colony optimization and particle swarm optimization, is highly parallel and has found increased performance on expensive cluster computers. In the last few years, the graphics processing unit (GPU) has received attention for its ability to solve highly parallel and semi-parallel problems much faster than the traditional sequential processor. Some applications see a huge increase in performance on this new platform. The cost of these high-performance devices is also marginal when compared with the price of cluster machines. In this paper, we have conducted research to exploit this architecture and apply its strengths to the document flocking problem. Our results highlight the potential benefit the GPU brings to all naturally inspired algorithms. Using the CUDA platform from NVIDIA, we developed a document flocking implementation to be run on the NVIDIA GeForce 8800. Additionally, we developed a similar but sequential implementation of the same algorithm to be run on a desktop CPU. We tested the performance of each on groups of news articles ranging in size from 200 to 3000 documents. The results of these tests were very significant. Performance gains ranged from three to nearly five times improvement of the GPU over the CPU implementation. This dramatic improvement in runtime makes the GPU a potentially revolutionary platform for document clustering algorithms.
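
    The O(n²) similarity-driven movement rule that both implementations parallelize can be sketched on the CPU in a few lines. A minimal sketch, assuming invented feature vectors, neighbourhood radius, similarity threshold and step size — not the authors' parameters:

    ```python
    import numpy as np

    def flocking_step(pos, feats, radius=1.0, sim_thresh=0.5, step=0.1):
        """One O(n^2) document-flocking update: every document is compared
        with every other; it drifts toward nearby similar documents and
        away from nearby dissimilar ones."""
        norms = np.linalg.norm(feats, axis=1, keepdims=True)
        sim = (feats @ feats.T) / (norms @ norms.T)        # cosine similarity
        new_pos = pos.copy()
        for i in range(len(pos)):
            diff = pos - pos[i]                            # vectors to the others
            dist = np.linalg.norm(diff, axis=1)
            near = (dist > 0) & (dist < radius)            # visible neighbours
            if near.any():
                attract = np.where(sim[i] > sim_thresh, 1.0, -1.0)
                new_pos[i] += step * (diff[near] * attract[near, None]).mean(axis=0)
        return new_pos

    # two documents about the same topic drift together ...
    pos = np.array([[0.0, 0.0], [0.5, 0.0]])
    closer = flocking_step(pos, np.array([[1.0, 0.0], [1.0, 0.0]]))
    # ... while two documents about different topics drift apart
    apart = flocking_step(pos, np.array([[1.0, 0.0], [0.0, 1.0]]))
    ```

    Because every document inspects every other document in each step, the update is embarrassingly parallel across documents, which is what makes it a good fit for the GPU.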

  3. The Role of Intelligence Quotient and Emotional Intelligence in Cognitive Control Processes

    Science.gov (United States)

    Checa, Purificación; Fernández-Berrocal, Pablo

    2015-01-01

    The relationship between intelligence quotient (IQ) and cognitive control processes has been extensively established. Several studies have shown that IQ correlates with cognitive control abilities, such as interference suppression, as measured with experimental tasks like the Stroop and Flanker tasks. By contrast, there is a debate about the role of Emotional Intelligence (EI) in individuals' cognitive control abilities. The aim of this study is to examine the relation between IQ and EI, and cognitive control abilities evaluated by a typical laboratory cognitive control task, the Stroop task. Results show a negative correlation between IQ and the interference suppression index, the ability to inhibit processing of irrelevant information. However, the Managing Emotions dimension of EI measured by the Mayer-Salovey-Caruso Emotional Intelligence Test (MSCEIT), but not self-reported EI, negatively correlates with the impulsivity index, the premature execution of the response. These results suggest that not only is IQ crucial, but also competences related to EI are essential to human cognitive control processes. Limitations and implications of these results are also discussed. PMID:26648901

  4. The role of Intelligence Quotient and Emotional Intelligence in cognitive control processes

    Directory of Open Access Journals (Sweden)

    Purificación eCheca

    2015-12-01

    Full Text Available The relationship between intelligence quotient (IQ) and cognitive control processes has been extensively established. Several studies have shown that IQ correlates with cognitive control abilities, such as interference suppression, as measured with experimental tasks like the Stroop and Flanker tasks. By contrast, there is a debate about the role of Emotional Intelligence (EI) in individuals’ cognitive control abilities. The aim of this study is to examine the relation between IQ and EI, and cognitive control abilities evaluated by a typical laboratory cognitive control task, the Stroop task. Results show a negative correlation between IQ and the interference suppression index, the ability to inhibit processing of irrelevant information. However, the Managing Emotions dimension of EI measured by the Mayer-Salovey-Caruso Emotional Intelligence Test, but not self-reported EI, negatively correlates with the impulsivity index, the premature execution of the response. These results suggest that not only is IQ crucial, but also competences related to EI are essential to human cognitive control processes. Limitations and implications of these results are also discussed.

  5. Matching intelligent systems with business process reengineering

    NARCIS (Netherlands)

    Hart, 't M.W.

    1996-01-01

    According to Venkatraman (1991) five degrees of IT-induced business reconfiguration can be distinguished: (1) localized exploitation of IT, (2) internal integration, (3) business process redesign, (4) business network redesign, and (5) business scope redefinition. On each of these levels, different

  6. A document processing pipeline for annotating chemical entities in scientific documents.

    Science.gov (United States)

    Campos, David; Matos, Sérgio; Oliveira, José L

    2015-01-01

    The recognition of drugs and chemical entities in text is a very important task within the field of biomedical information extraction, given the rapid growth in the amount of published texts (scientific papers, patents, patient records) and the relevance of these and other related concepts. If done effectively, this could allow exploiting such textual resources to automatically extract or infer relevant information, such as drug profiles, relations and similarities between drugs, or associations between drugs and potential drug targets. The objective of this work was to develop and validate a document processing and information extraction pipeline for the identification of chemical entity mentions in text. We used the BioCreative IV CHEMDNER task data to train and evaluate a machine-learning based entity recognition system. Using a combination of two conditional random field models, a selected set of features, and a post-processing stage, we achieved F-measure results of 87.48% in the chemical entity mention recognition task and 87.75% in the chemical document indexing task. We present a machine learning-based solution for automatic recognition of chemical and drug names in scientific documents. The proposed approach applies a rich feature set, including linguistic, orthographic, morphological, dictionary matching and local context features. Post-processing modules are also integrated, performing parentheses correction, abbreviation resolution and filtering erroneous mentions using an exclusion list derived from the training data. The developed methods were implemented as a document annotation tool and web service, freely available at http://bioinformatics.ua.pt/becas-chemicals/.
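
    Two of the post-processing modules named above, parentheses correction and exclusion-list filtering, can be sketched directly. The function names and the sample sentence are hypothetical illustrations, not the becas-chemicals API:

    ```python
    def balance_parentheses(mention, text, start, end):
        """Extend a predicted mention whose brackets are unbalanced, in the
        spirit of the pipeline's parentheses-correction module."""
        if mention.count("(") > mention.count(")") and end < len(text) and text[end] == ")":
            return text[start:end + 1], start, end + 1
        if mention.count(")") > mention.count("(") and start > 0 and text[start - 1] == "(":
            return text[start - 1:end], start - 1, end
        return mention, start, end

    def filter_mentions(mentions, exclusion_list):
        """Drop mentions that appear on an exclusion list derived from
        erroneous predictions observed in the training data."""
        return [m for m in mentions if m.lower() not in exclusion_list]

    text = "Treated with 2-(acetyloxy)benzoic acid daily."
    # the CRF tagger truncated the mention before the closing bracket
    fixed, s, e = balance_parentheses("2-(acetyloxy", text, 13, 25)
    kept = filter_mentions(["2-(acetyloxy)benzoic acid", "daily"], {"daily"})
    ```

    Chemical names are unusually bracket-heavy, so repairing truncated mentions this way is a cheap complement to the statistical tagger.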

  7. Intelligent Predictive Control of Nonlinear Processes Using

    DEFF Research Database (Denmark)

    Nørgård, Peter Magnus; Sørensen, Paul Haase; Poulsen, Niels Kjølstad

    1996-01-01

    This paper presents a novel approach to the design of generalized predictive controllers (GPC) for nonlinear processes. A neural network is used for modelling the process and a gain-scheduling type of GPC is subsequently designed. The combination of neural network models and predictive control has frequently been discussed in the neural network community. This paper proposes an approximate scheme, the approximate predictive control (APC), which facilitates the implementation and gives a substantial reduction in the required amount of computations. The method is based on a technique for extracting linear models from a nonlinear neural network and using them in designing the control system. The performance of the controller is demonstrated in a simulation study of a pneumatic servo system.
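
    The model-extraction idea can be illustrated independently of the controller: linearize the neural process model around the current operating point and hand the local linear model to a standard GPC design. A minimal sketch using finite differences, with an invented toy network and operating point (the paper's own extraction technique may differ):

    ```python
    import numpy as np

    def nn_model(x, W1, b1, W2, b2):
        """Tiny one-hidden-layer network standing in for the process model."""
        return W2 @ np.tanh(W1 @ x + b1) + b2

    def extract_linear_model(f, x0, eps=1e-6):
        """Approximate f(x) ~ f(x0) + J (x - x0) by finite differences:
        the kind of local linearisation APC hands to the GPC design."""
        y0 = f(x0)
        J = np.zeros((y0.size, x0.size))
        for j in range(x0.size):
            dx = np.zeros_like(x0)
            dx[j] = eps
            J[:, j] = (f(x0 + dx) - y0) / eps
        return y0, J

    rng = np.random.default_rng(1)
    W1, b1 = rng.normal(size=(4, 2)), rng.normal(size=4)
    W2, b2 = rng.normal(size=(1, 4)), rng.normal(size=1)
    f = lambda x: nn_model(x, W1, b1, W2, b2)
    x0 = np.array([0.2, -0.1])
    y0, J = extract_linear_model(f, x0)   # local linear model at x0
    ```

    Re-extracting the linear model at each operating point gives the gain-scheduling behaviour while keeping the on-line computation far cheaper than optimizing through the full nonlinear network.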

  8. 48 CFR 532.905 - Payment documentation and process.

    Science.gov (United States)

    2010-10-01

    ... payments. The contracting officer or designee must review the processing of invoices or vouchers before ... approval of any payment on (or attached to) the invoice or voucher submitted by the contractor and forward ...

  9. Forest Service National Visitor Use Monitoring Process: Research Method Documentation

    Science.gov (United States)

    Donald B.K. English; Susan M. Kocis; Stanley J. Zarnoch; J. Ross Arnold

    2002-01-01

    In response to the need for improved information on recreational use of National Forest System lands, the authors have developed a nationwide, systematic monitoring process. This report documents the methods they used in estimating recreational use on an annual basis. The basic unit of measure is the volume of visitors exiting a recreation site on a given day. Sites...

  10. Mental Status Documentation: Information Quality and Data Processes.

    Science.gov (United States)

    Weir, Charlene; Gibson, Bryan; Taft, Teresa; Slager, Stacey; Lewis, Lacey; Staggers, Nancy

    2016-01-01

    Delirium is a fluctuating disturbance of cognition and/or consciousness associated with poor outcomes. Caring for patients with delirium requires integration of disparate information across clinicians, settings and time. The goal of this project was to characterize the information processes involved in nurses' assessment, documentation, decision-making and communication regarding patients' mental status in the inpatient setting. VA nurse managers of medical wards (n=18) were systematically selected across the US. A semi-structured telephone interview focused on current assessment, documentation, and communication processes, as well as clinical and administrative decision-making, was conducted, audio-recorded and transcribed. A thematic analytic approach was used. Five themes emerged: 1) Fuzzy Concepts, 2) Grey Data, 3) Process Variability, 4) Context is Critical, and 5) Goal Conflict. This project describes the vague and variable information processes related to delirium and mental status that undermine effective risk assessment, prevention, identification, communication and mitigation of harm.

  11. Building the competitive intelligence knowledge: processes and activities in a corporate organisation

    OpenAIRE

    Sreenivasulu, V.

    1999-01-01

    This paper discusses the process of building and developing comprehensive tools, techniques, support systems, and better methods of harnessing competitive intelligence knowledge processes. The author stresses the need for sophisticated methodological competitive intelligence knowledge acquisition: systematic collection of competitive intelligence knowledge from various sources for critical analysis, processing, organization, synthesis, assessment, screening, filtering and interpreta...

  12. A conceptual framework for intelligent real-time information processing

    Science.gov (United States)

    Schudy, Robert

    1987-01-01

    By combining artificial intelligence concepts with the human information processing model of Rasmussen, a conceptual framework was developed for real-time artificial intelligence systems which provides a foundation for system organization, control and validation. The approach is based on the description of system processing in terms of an abstraction hierarchy of states of knowledge. The states of knowledge are organized along one dimension which corresponds to the extent to which the concepts are expressed in terms of the system inputs or in terms of the system response. Thus organized, the useful states form a generally triangular shape, with the sensors and effectors forming the lower two vertices and the fully evaluated set of courses of action the apex. Within the triangle boundaries are numerous processing paths which shortcut the detailed processing by connecting incomplete levels of analysis to partially defined responses. Shortcuts at different levels of abstraction include reflexes, sensory-motor control, rule-based behavior, and satisficing. This approach was used in the design of a real-time tactical decision aiding system, and in defining an intelligent aiding system for transport pilots.

  13. An intelligent allocation algorithm for parallel processing

    Science.gov (United States)

    Carroll, Chester C.; Homaifar, Abdollah; Ananthram, Kishan G.

    1988-01-01

    The problem of allocating nodes of a program graph to processors in a parallel processing architecture is considered. The algorithm is based on critical path analysis, some allocation heuristics, and the execution granularity of nodes in a program graph. These factors, and the structure of the interprocessor communication network, influence the allocation. To achieve realistic estimates of the execution durations of allocations, the algorithm considers the fact that nodes in a program graph have to communicate through varying numbers of tokens. Coarse and fine granularities have been implemented, with interprocessor token-communication durations varying from zero up to values comparable to the execution durations of individual nodes. The effect of communication network structures on allocation is demonstrated by performing allocations for crossbar (non-blocking) and star (blocking) networks. The algorithm assumes the availability of as many processors as it needs for the optimal allocation of any program graph. Hence, the focus of allocation has been on varying token-communication durations rather than varying the number of processors. The algorithm always utilizes as many processors as necessary for the optimal allocation of any program graph, depending upon granularity and characteristics of the interprocessor communication network.
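
    The critical-path component of such an allocator can be sketched on its own: compute the longest chain of execution through the precedence DAG, which lower-bounds any schedule regardless of processor count. The task graph and durations below are invented for illustration:

    ```python
    from collections import defaultdict

    def critical_path(durations, edges):
        """Length of the longest (critical) path through a DAG of tasks.
        durations: {node: execution time}; edges: (u, v) precedence pairs."""
        succ = defaultdict(list)
        indeg = {n: 0 for n in durations}
        for u, v in edges:
            succ[u].append(v)
            indeg[v] += 1
        # earliest finish time per node, filled in topological order
        finish = {n: durations[n] for n in durations}
        queue = [n for n in durations if indeg[n] == 0]
        while queue:
            u = queue.pop()
            for v in succ[u]:
                finish[v] = max(finish[v], finish[u] + durations[v])
                indeg[v] -= 1
                if indeg[v] == 0:
                    queue.append(v)
        return max(finish.values())

    durations = {"a": 2, "b": 3, "c": 1, "d": 4}
    edges = [("a", "b"), ("a", "c"), ("b", "d"), ("c", "d")]
    length = critical_path(durations, edges)   # a -> b -> d
    ```

    A full allocator would then extend the finish-time recurrence with token-communication delays on edges whose endpoints land on different processors, which is where the network structure (crossbar vs. star) enters.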

  14. Intelligent Adaptation Process for Case Based Systems

    International Nuclear Information System (INIS)

    Nassar, A.M.; Mohamed, A.H.; Mohamed, A.H.

    2014-01-01

    Case Based Reasoning (CBR) systems are among the important decision-making systems applied in many fields all over the world. The effectiveness of any CBR system is based on the quality of the cases stored in the case library. Similar cases can be retrieved and adapted to produce the solution for a new problem. One of the main issues facing CBR systems is the difficulty of obtaining useful cases. The proposed system introduces a new approach that uses the genetic algorithm (GA) technique to automate constructing the cases in the case library and to select the best ones to store for future use. The proposed system can thereby avoid the problems of uncertain and noisy cases, simplify the retrieval and adaptation processes, and so improve the performance of the CBR system. The suggested system can be applied to many real-time problems. It has been applied to diagnosing faults of wireless networks, diagnosing cancer diseases, and software debugging as case studies. The proposed system has proved its performance in this field.
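
    One way a GA can curate a case library is to search for a compact subset of candidate cases that still covers the expected problem space. The bit-mask encoding, the coverage-based fitness and all parameters below are assumptions made for illustration, not the authors' design:

    ```python
    import random

    def fitness(mask, cases, problems, radius=1.5):
        """Reward covering every problem with a nearby case; penalise library size."""
        selected = [c for c, keep in zip(cases, mask) if keep]
        if not selected:
            return -len(problems)
        covered = sum(any(abs(c - p) <= radius for c in selected) for p in problems)
        return covered - 0.1 * len(selected)

    def ga_select(cases, problems, pop=20, gens=40, seed=0):
        """Elitist GA over bit-masks: which candidate cases enter the library."""
        rng = random.Random(seed)
        population = [[rng.random() < 0.5 for _ in cases] for _ in range(pop)]
        for _ in range(gens):
            population.sort(key=lambda m: fitness(m, cases, problems), reverse=True)
            parents = population[: pop // 2]          # keep the best half
            children = []
            while len(children) < pop - len(parents):
                a, b = rng.sample(parents, 2)
                cut = rng.randrange(1, len(cases))    # one-point crossover
                child = a[:cut] + b[cut:]
                i = rng.randrange(len(cases))
                child[i] = not child[i]               # one-bit mutation
                children.append(child)
            population = parents + children
        return max(population, key=lambda m: fitness(m, cases, problems))

    cases = [0.0, 0.2, 5.0, 5.1, 10.0]   # candidate cases (1-D for simplicity)
    problems = [0.1, 5.05, 9.9]          # problems the library must cover
    best = ga_select(cases, problems)
    ```

    The size penalty is what discourages storing near-duplicate or noisy cases, which in turn keeps retrieval and adaptation cheap.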

  15. Employing the intelligence cycle process model within the Homeland Security Enterprise

    OpenAIRE

    Stokes, Roger L.

    2013-01-01

    CHDS State/Local The purpose of this thesis was to examine the employment and adherence of the intelligence cycle process model within the National Network of Fusion Centers and the greater Homeland Security Enterprise by exploring the customary intelligence cycle process model established by the United States Intelligence Community (USIC). This thesis revealed there are various intelligence cycle process models used by the USIC and taught to the National Network. Given the numerous differ...

  16. Artificial intelligence, expert systems, computer vision, and natural language processing

    Science.gov (United States)

    Gevarter, W. B.

    1984-01-01

    An overview of artificial intelligence (AI), its core ingredients, and its applications is presented. The knowledge representation, logic, problem solving approaches, languages, and computers pertaining to AI are examined, and the state of the art in AI is reviewed. The use of AI in expert systems, computer vision, natural language processing, speech recognition and understanding, speech synthesis, problem solving, and planning is examined. Basic AI topics, including automation, search-oriented problem solving, knowledge representation, and computational logic, are discussed.

  17. Working memory and intelligibility of hearing-aid processed speech

    Science.gov (United States)

    Souza, Pamela E.; Arehart, Kathryn H.; Shen, Jing; Anderson, Melinda; Kates, James M.

    2015-01-01

    Previous work suggested that individuals with low working memory capacity may be at a disadvantage in adverse listening environments, including situations with background noise or substantial modification of the acoustic signal. This study explored the relationship between patient factors (including working memory capacity) and intelligibility and quality of modified speech for older individuals with sensorineural hearing loss. The modification was created using a combination of hearing aid processing [wide-dynamic range compression (WDRC) and frequency compression (FC)] applied to sentences in multitalker babble. The extent of signal modification was quantified via an envelope fidelity index. We also explored the contribution of components of working memory by including measures of processing speed and executive function. We hypothesized that listeners with low working memory capacity would perform more poorly than those with high working memory capacity across all situations, and would also be differentially affected by high amounts of signal modification. Results showed a significant effect of working memory capacity for speech intelligibility, and an interaction between working memory, amount of hearing loss and signal modification. Signal modification was the major predictor of quality ratings. These data add to the literature on hearing-aid processing and working memory by suggesting that the working memory-intelligibility effects may be related to aggregate signal fidelity, rather than to the specific signal manipulation. They also suggest that for individuals with low working memory capacity, sensorineural loss may be most appropriately addressed with WDRC and/or FC parameters that maintain the fidelity of the signal envelope. PMID:25999874
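
    The WDRC component of the signal manipulation can be illustrated with a static input/output compression rule: full gain below the compression threshold, progressively less gain above it. The threshold, ratio and gain values below are illustrative, not the study's fitting parameters:

    ```python
    import numpy as np

    def wdrc_gain_db(level_db, threshold_db=45.0, ratio=3.0, max_gain_db=25.0):
        """Static WDRC rule: constant gain below the compression threshold;
        above it, output grows at 1/ratio dB per input dB, so the applied
        gain shrinks as the input level rises."""
        level = np.asarray(level_db, dtype=float)
        above = np.maximum(level - threshold_db, 0.0)
        return max_gain_db - above * (1.0 - 1.0 / ratio)

    # quiet speech gets the full gain, loud speech much less
    quiet_gain = wdrc_gain_db(45.0)
    loud_gain = wdrc_gain_db(75.0)
    ```

    Fast-acting gain changes of this kind modify the signal envelope, which is one way such processing can lower the kind of envelope-fidelity index the study uses to quantify signal modification.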

  18. Advanced multiresponse process optimisation an intelligent and integrated approach

    CERN Document Server

    Šibalija, Tatjana V

    2016-01-01

    This book presents an intelligent, integrated, problem-independent method for multiresponse process optimization. In contrast to traditional approaches, the idea of this method is to provide a unique model for the optimization of various processes, without imposition of assumptions relating to the type of process, the type and number of process parameters and responses, or interdependences among them. The presented method for experimental design of processes with multiple correlated responses is composed of three modules: an expert system that selects the experimental plan based on the orthogonal arrays; the factor effects approach, which performs processing of experimental data based on Taguchi’s quality loss function and multivariate statistical methods; and process modeling and optimization based on artificial neural networks and metaheuristic optimization algorithms. The implementation is demonstrated using four case studies relating to high-tech industries and advanced, non-conventional processes.
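
    The Taguchi quality-loss step in the factor-effects module can be sketched for the multiresponse case as a weighted aggregate of per-response quadratic losses. The weighting scheme below is a simplified assumption, not the book's exact formulation:

    ```python
    def taguchi_loss(y, target, k=1.0):
        """Taguchi quadratic quality loss: zero at the target value,
        growing with the square of the deviation from it."""
        return k * (y - target) ** 2

    def multiresponse_loss(responses, targets, weights):
        """Weighted aggregate of per-response quality losses: a single
        scalar the optimizer can minimize across correlated responses."""
        total = sum(w * taguchi_loss(y, t)
                    for y, t, w in zip(responses, targets, weights))
        return total / sum(weights)

    # one response on target, one off target, equal weights
    loss = multiresponse_loss([1.0, 2.0], [1.0, 1.0], [1.0, 1.0])
    ```

    Collapsing several correlated responses into one scalar loss is what lets a single metaheuristic search drive the process-parameter optimization.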

  19. [INVITED] Computational intelligence for smart laser materials processing

    Science.gov (United States)

    Casalino, Giuseppe

    2018-03-01

    Computational intelligence (CI) involves using computer algorithms to capture hidden knowledge from data and to use it to train an 'intelligent machine' to make complex decisions without human intervention. As simulation is becoming more prevalent from design and planning to manufacturing and operations, laser material processing can also benefit from computers generating knowledge through soft computing. This work is a review of the state of the art in the methodology and applications of CI in laser materials processing (LMP), which is nowadays receiving increasing interest from world-class manufacturers and Industry 4.0. The focus is on methods that have proven effective and robust in solving several problems in welding, cutting, drilling, surface treating and additive manufacturing using the laser beam. After a basic description of the most common computational intelligence techniques employed in manufacturing, four sections, namely laser joining, machining, surface treatment and additive manufacturing, cover the most recent applications in the already extensive literature on CI in LMP. Finally, emerging trends and future challenges are identified and discussed.

  20. Streamlining of the Decontamination and Demolition Document Preparation Process

    International Nuclear Information System (INIS)

    Durand, Nick; Meincke, Carol; Peek, Georgianne

    1999-01-01

    During the past five years, the Sandia National Laboratories Decontamination, Decommissioning, Demolition, and Reuse (D3R) Program has evolved and become more focused and efficient. Historical approaches to project documentation, requirements, and drivers are discussed, detailing key assumptions, oversight authority, and project approvals. Discussion of efforts to streamline the D3R project planning and preparation process includes the incorporation of the principles of graded approach, Total Quality Management, and the Observational Method (CH2M HILL, April 1989).1 Process improvements were realized by clearly defining regulatory requirements for each phase of a project, establishing general guidance for the program and combining project-specific documents to eliminate redundant and unneeded information. Process improvements to cost, schedule, and quality are discussed in detail for several projects.

  1. Artificial intelligence implementation in the APS process diagnostic

    Energy Technology Data Exchange (ETDEWEB)

    Guessasma, Sofiane; Salhi, Zahir; Montavon, Ghislain; Gougeon, Patrick; Coddet, Christian

    2004-07-25

    Thermal spraying is a coating manufacturing technique employing a wide variety of materials and processes. It is characterized by up to 150 processing parameters that influence coating properties. Controlling coating quality requires a robust methodology that accounts for parameter interdependencies and process variability and can quantify the relationships between processing parameters and process responses. The aim of this work is to introduce a new approach, based on artificial intelligence, that meets these requirements. A detailed procedure is presented around an artificial neural network (ANN) structure that implicitly encodes the physical phenomena governing the process. This structure was coupled to experimental results from an optical sensor monitoring the fusion state of powder particles prior to coating formation. The optimization steps are discussed, and the predicted results are compared with the experimental ones, allowing the identification of the control factors.
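
    The parameter-to-response mapping the abstract describes can be sketched in miniature. The parameters, data, and network below are hypothetical stand-ins (the actual APS model is trained on optical-sensor measurements), but they illustrate how a small ANN can encode a processing-parameter/response relationship:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: three hypothetical APS parameters
# (arc current, argon flow, spray distance), normalized to [0, 1],
# and a surrogate "particle temperature" response.
X = rng.random((200, 3))
y = (0.6 * X[:, 0] - 0.3 * X[:, 1] + 0.2 * X[:, 0] * X[:, 2]).reshape(-1, 1)

# One-hidden-layer network: 3 inputs -> 8 tanh units -> 1 linear output.
W1 = rng.normal(0, 0.5, (3, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)

lr = 0.1
for epoch in range(5000):
    h = np.tanh(X @ W1 + b1)              # forward pass
    pred = h @ W2 + b2
    err = pred - y                        # gradient of 0.5 * mean squared error
    gW2 = h.T @ err / len(X); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)      # backprop through tanh
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))
print(f"training MSE: {mse:.5f}")
```

    Once trained on historical runs, such a surrogate can be queried with candidate parameter settings far faster than running the spray process itself.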

  2. Artificial intelligence implementation in the APS process diagnostic

    International Nuclear Information System (INIS)

    Guessasma, Sofiane; Salhi, Zahir; Montavon, Ghislain; Gougeon, Patrick; Coddet, Christian

    2004-01-01

    Thermal spraying is a coating manufacturing technique employing a wide variety of materials and processes. It is characterized by up to 150 processing parameters that influence coating properties. Controlling coating quality requires a robust methodology that accounts for parameter interdependencies and process variability and can quantify the relationships between processing parameters and process responses. The aim of this work is to introduce a new approach, based on artificial intelligence, that meets these requirements. A detailed procedure is presented around an artificial neural network (ANN) structure that implicitly encodes the physical phenomena governing the process. This structure was coupled to experimental results from an optical sensor monitoring the fusion state of powder particles prior to coating formation. The optimization steps are discussed, and the predicted results are compared with the experimental ones, allowing the identification of the control factors.

  3. Natural language processing in psychiatry. Artificial intelligence technology and psychopathology.

    Science.gov (United States)

    Garfield, D A; Rapp, C; Evens, M

    1992-04-01

    The potential benefit of artificial intelligence (AI) technology as a tool of psychiatry has not been well defined. In this essay, the technology of natural language processing and its position with regard to the two main schools of AI are clearly outlined. Past experiments utilizing AI techniques to understand psychopathology are reviewed. Natural language processing can automate the analysis of transcripts and can be used to model theories of language comprehension. In these ways, it can serve as a tool for testing psychological theories of psychopathology and as an effective tool in empirical research on verbal behavior in psychopathology.

  4. Ramp Technology and Intelligent Processing in Small Manufacturing

    Science.gov (United States)

    Rentz, Richard E.

    1992-01-01

    To address the issues of excessive inventories and increasing procurement lead times, the Navy is actively pursuing flexible computer integrated manufacturing (FCIM) technologies, integrated by communication networks to respond rapidly to its requirements for parts. The Rapid Acquisition of Manufactured Parts (RAMP) program, initiated in 1986, is an integral part of this effort. The RAMP program's goal is to reduce the current average production lead times experienced by the Navy's inventory control points by 90 percent. The manufacturing engineering component of the RAMP architecture utilizes an intelligent processing technology built around a knowledge-based shell provided by ICAD, Inc. Rules and databases in the software simulate an expert manufacturing planner's knowledge of shop processes and equipment. This expert system can use Product Data Exchange using STEP (PDES) data to determine what features the required part has, what material is required to manufacture it, what machines and tools are needed, and how the part should be held (fixtured) for machining, among other factors. The program's rule base then indicates, for example, how to make each feature, in what order to make it, and to which machines on the shop floor the part should be routed for processing. This information becomes part of the shop work order. The process planning function under RAMP greatly reduces the time and effort required to complete a process plan. Since the PDES file that drives the intelligent processing is 100 percent complete and accurate to start with, the potential for costly errors is greatly diminished.
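
    The rule-driven planning idea can be illustrated with a toy rule base. The feature names, operations, and machines below are invented for illustration; the real RAMP system derives them from PDES data and an ICAD knowledge base:

```python
# Illustrative rule base (hypothetical features and machines, not the
# actual ICAD shell): map each part feature to an operation and a
# candidate machine, then emit an ordered shop routing.
RULES = {
    "flat_face": ("face milling", "vertical mill"),
    "through_hole": ("drilling", "CNC drill press"),
    "internal_thread": ("tapping", "CNC drill press"),
    "outer_diameter": ("turning", "CNC lathe"),
}

# Roughing-to-finishing precedence used to order the operations.
PRECEDENCE = ["turning", "face milling", "drilling", "tapping"]

def plan(features):
    """Return an ordered list of (operation, machine) steps for a part."""
    steps = [RULES[f] for f in features if f in RULES]
    return sorted(steps, key=lambda s: PRECEDENCE.index(s[0]))

routing = plan(["through_hole", "flat_face", "internal_thread"])
for op, machine in routing:
    print(f"{op:12s} -> {machine}")
```

    A production planner encodes far richer constraints (fixturing, tolerances, tooling availability), but the principle is the same: declarative rules turn a feature list into a work order.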

  5. Document Management and Exchange System – Supporting Education Process

    Directory of Open Access Journals (Sweden)

    Emil Egredzija

    2010-03-01

    Full Text Available Development and implementation of new technologies are very important in education. One of the most challenging tasks in the education process is to build an efficient and cost-friendly system for content management and exchange. The system has to be reliable, easily manageable and open. Centralized storage, secured access, and ubiquitous client technologies have emerged as best-practice solutions in engineering that kind of service. Users can easily publish or exchange documents without worrying about distribution, storage or the technical skills required for efficient document management. The system presented here is built on open source technologies and is deployable on all of today's popular web software platforms. The web server, programming language and operating system used to build and deploy the system are all non-proprietary and completely open, because our mission was to build a system that can be easily extended and is not limited by a corporate license. The system uses security mechanisms such as user-group access policies, operating-system-level security (file system), and secured data storage in the database. Because of the growing need for document management in the education process, we believe that this project will find its place in practice.

  6. The Relationship between Multiple Intelligences with Preferred Science Teaching and Science Process Skills

    Directory of Open Access Journals (Sweden)

    Mohd Ali Samsudin

    2015-02-01

    Full Text Available This study was undertaken to identify the relationship between multiple intelligences and preferred science teaching and science process skills. The design of the study is a survey using three questionnaires reported in the literature: the Multiple Intelligences Questionnaire, the Preferred Science Teaching Questionnaire and the Science Process Skills Questionnaire. The study selected 300 primary school students from five (5) primary schools in Penang, Malaysia. The findings showed a relationship between kinesthetic, logical-mathematical, visual-spatial and naturalistic intelligences and preferred science teaching. In addition, there was a correlation between kinesthetic and visual-spatial intelligences and science process skills, implying that multiple intelligences are related to science learning.

  7. The ethical intelligence: a tool guidance in the process of the negotiation

    Directory of Open Access Journals (Sweden)

    Cristina Seijo

    2014-08-01

    Full Text Available This article is the result of research that presents a theoretical contrast inviting reflection on ethical intelligence as a guiding tool in negotiation. It addresses the different types of intelligence involved: spatial intelligence, rational intelligence and emotional intelligence, among others, and relates associative intelligence to negotiation processes and negotiation tactics. In this respect, ethical intelligence can be understood as the ability to examine the moral standards of the individual and of society, to decide between what is correct and what is incorrect, and thus to resolve the various problems that an individual or a society faces. For this reason, mechanisms of transparency and participation should be set in motion so that ethical intelligence serves as the threshold that guides the negotiation process.

  8. Working memory and intelligibility of hearing-aid processed speech

    Directory of Open Access Journals (Sweden)

    Pamela eSouza

    2015-05-01

    Full Text Available Previous work suggested that individuals with low working memory capacity may be at a disadvantage in adverse listening environments, including situations with background noise or substantial modification of the acoustic signal. This study explored the relationship between patient factors (including working memory capacity) and the intelligibility and quality of modified speech for older individuals with sensorineural hearing loss. The modification was created using a combination of hearing aid processing (wide-dynamic range compression and frequency compression) applied to sentences in multitalker babble. The extent of signal modification was quantified via an envelope fidelity index. We also explored the contribution of components of working memory by including measures of processing speed and executive function. We hypothesized that listeners with low working memory capacity would perform more poorly than those with high working memory capacity across all situations, and would also be differentially affected by high amounts of signal modification. Results showed a significant effect of working memory capacity for speech intelligibility, and an interaction between working memory, amount of hearing loss and signal modification. Signal modification was the major predictor of quality ratings. These data add to the literature on hearing-aid processing and working memory by suggesting that the working memory-intelligibility effects may be related to aggregate signal fidelity, rather than to the specific signal manipulation. They also suggest that for individuals with low working memory capacity, sensorineural loss may be most appropriately addressed with wide-dynamic range compression and/or frequency compression parameters that maintain the fidelity of the signal envelope.
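
    The abstract does not specify the form of the envelope fidelity index. One plausible minimal form, assumed here purely for illustration, is the correlation between the amplitude envelopes of the unprocessed and processed signals:

```python
import numpy as np

def envelope(x, win=160):
    """Amplitude envelope: full-wave rectify, then moving-average smooth."""
    return np.convolve(np.abs(x), np.ones(win) / win, mode="same")

def envelope_fidelity(clean, processed, win=160):
    """Correlation between the envelopes of two signals.
    One plausible 'envelope fidelity index'; the paper's exact metric
    may differ."""
    e1, e2 = envelope(clean, win), envelope(processed, win)
    return float(np.corrcoef(e1, e2)[0, 1])

# Toy demo: a 4 Hz amplitude-modulated tone vs. a heavily compressed copy.
fs = 16000
t = np.arange(fs) / fs
clean = (1 + 0.8 * np.sin(2 * np.pi * 4 * t)) * np.sin(2 * np.pi * 440 * t)
compressed = np.sign(clean) * np.abs(clean) ** 0.3   # heavy compression
print(f"fidelity: {envelope_fidelity(clean, compressed):.3f}")
```

    Compression flattens the modulation depth but preserves its shape, so the index stays high; manipulations that distort the envelope's shape would drive it toward zero.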

  9. Embedding Metadata and Other Semantics in Word Processing Documents

    Directory of Open Access Journals (Sweden)

    Peter Sefton

    2009-10-01

    Full Text Available This paper describes a technique for embedding document metadata, and potentially other semantic references, inline in word processing documents, which the authors have implemented with the help of a software development team. Several assumptions underlie the approach: it must be available across computing platforms and work with both Microsoft Word (because of its user base) and OpenOffice.org (because of its free availability). Further, the application needs to be acceptable to and usable by users, so the initial implementation covers only a small number of features, which will be extended only after user testing. Within these constraints the system provides a mechanism not only for encoding simple metadata, but for inferring hierarchical relationships between metadata elements from a 'flat' word processing file. The paper includes links to open source code implementing the techniques as part of a broader suite of tools for academic writing. This addresses tools and software, the semantic web and data curation, and integrating curation into research workflows, and will provide a platform for integrating work on ontologies, vocabularies and folksonomies into word processing tools.
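
    The hierarchy-inference idea can be sketched as follows. The field names and the dotted sub-field convention are hypothetical (the paper's implementation works inside Word and OpenOffice.org documents), but the sketch shows how hierarchical metadata can be recovered from a flat sequence of inline fields:

```python
# Hypothetical flat metadata, in document order: a dotted key marks a
# sub-field that belongs to the most recent element of its parent type.
flat_fields = [
    ("title", "Embedding Metadata in Word Processing Documents"),
    ("author", "P. Sefton"),
    ("author.affiliation", "Example University A"),
    ("author", "Second Author"),
    ("author.affiliation", "Example University B"),
]

def nest(fields):
    """Rebuild a nested structure from a flat, ordered field sequence."""
    doc, last = {}, {}
    for key, value in fields:
        if "." in key:                        # sub-field: attach to parent
            parent, child = key.split(".", 1)
            last[parent][child] = value
        else:                                 # new top-level element
            entry = {"value": value}
            doc.setdefault(key, []).append(entry)
            last[key] = entry
    return doc

meta = nest(flat_fields)
print(meta["author"])
```

    Document order carries the structure: each affiliation binds to the author that precedes it, which is exactly the kind of inference a flat word-processing file permits.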

  10. Adaptive Moving Object Tracking Integrating Neural Networks And Intelligent Processing

    Science.gov (United States)

    Lee, James S. J.; Nguyen, Dziem D.; Lin, C.

    1989-03-01

    A real-time adaptive scheme is introduced to detect and track moving objects under noisy, dynamic conditions including moving sensors. This approach integrates the adaptiveness and incremental learning characteristics of neural networks with intelligent reasoning and process control. Spatiotemporal filtering is used to detect and analyze motion, exploiting the speed and accuracy of multiresolution processing. A neural network algorithm constitutes the basic computational structure for classification. A recognition and learning controller guides the on-line training of the network, and invokes pattern recognition to determine processing parameters dynamically and to verify detection results. A tracking controller acts as the central control unit, so that tracking goals direct the overall system. Performance is benchmarked against the Widrow-Hoff algorithm, for target detection scenarios presented in diverse FLIR image sequences. Efficient algorithm design ensures that this recognition and control scheme, implemented in software and commercially available image processing hardware, meets the real-time requirements of tracking applications.
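
    As a minimal illustration of the temporal-filtering step (far simpler than the paper's multiresolution pipeline), consecutive frames can be differenced and thresholded to localize a moving target; all data here are synthetic:

```python
import numpy as np

rng = np.random.default_rng(2)

# Two consecutive noisy frames; a bright 4x4 "target" appears in the
# second one. Thresholding the frame difference isolates the motion.
frame_a = rng.normal(0.0, 0.05, (32, 32))            # background + noise
frame_b = frame_a + rng.normal(0.0, 0.05, (32, 32))  # next frame, new noise
frame_b[10:14, 20:24] += 1.0                         # synthetic moving target

motion = np.abs(frame_b - frame_a) > 0.5             # ~10-sigma threshold
ys, xs = np.nonzero(motion)
print(f"motion pixels: {motion.sum()}, "
      f"centroid: ({ys.mean():.1f}, {xs.mean():.1f})")
```

    A tracker would feed such detections to a classifier and update the motion model frame by frame; the paper adds neural classification and adaptive control on top of this basic detection idea.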

  11. Intelligence

    Science.gov (United States)

    Sternberg, Robert J.

    2012-01-01

    Intelligence is the ability to learn from experience and to adapt to, shape, and select environments. Intelligence as measured by (raw scores on) conventional standardized tests varies across the lifespan, and also across generations. Intelligence can be understood in part in terms of the biology of the brain—especially with regard to the functioning in the prefrontal cortex—and also correlates with brain size, at least within humans. Studies of the effects of genes and environment suggest that the heritability coefficient (ratio of genetic to phenotypic variation) is between .4 and .8, although heritability varies as a function of socioeconomic status and other factors. Racial differences in measured intelligence have been observed, but race is a socially constructed rather than biological variable, so such differences are difficult to interpret. PMID:22577301

  12. Intelligence.

    Science.gov (United States)

    Sternberg, Robert J

    2012-03-01

    Intelligence is the ability to learn from experience and to adapt to, shape, and select environments. Intelligence as measured by (raw scores on) conventional standardized tests varies across the lifespan, and also across generations. Intelligence can be understood in part in terms of the biology of the brain-especially with regard to the functioning in the prefrontal cortex-and also correlates with brain size, at least within humans. Studies of the effects of genes and environment suggest that the heritability coefficient (ratio of genetic to phenotypic variation) is between .4 and .8, although heritability varies as a function of socioeconomic status and other factors. Racial differences in measured intelligence have been observed, but race is a socially constructed rather than biological variable, so such differences are difficult to interpret.

  13. Acoustic richness modulates the neural networks supporting intelligible speech processing.

    Science.gov (United States)

    Lee, Yune-Sang; Min, Nam Eun; Wingfield, Arthur; Grossman, Murray; Peelle, Jonathan E

    2016-03-01

    The information contained in a sensory signal plays a critical role in determining what neural processes are engaged. Here we used interleaved silent steady-state (ISSS) functional magnetic resonance imaging (fMRI) to explore how human listeners cope with different degrees of acoustic richness during auditory sentence comprehension. Twenty-six healthy young adults underwent scanning while hearing sentences that varied in acoustic richness (high vs. low spectral detail) and syntactic complexity (subject-relative vs. object-relative center-embedded clause structures). We manipulated acoustic richness by presenting the stimuli as unprocessed full-spectrum speech, or noise-vocoded with 24 channels. Importantly, although the vocoded sentences were spectrally impoverished, all sentences were highly intelligible. These manipulations allowed us to test how intelligible speech processing was affected by orthogonal linguistic and acoustic demands. Acoustically rich speech showed stronger activation than acoustically less-detailed speech in a bilateral temporoparietal network with more pronounced activity in the right hemisphere. By contrast, listening to sentences with greater syntactic complexity resulted in increased activation of a left-lateralized network including left posterior lateral temporal cortex, left inferior frontal gyrus, and left dorsolateral prefrontal cortex. Significant interactions between acoustic richness and syntactic complexity occurred in left supramarginal gyrus, right superior temporal gyrus, and right inferior frontal gyrus, indicating that the regions recruited for syntactic challenge differed as a function of acoustic properties of the speech. Our findings suggest that the neural systems involved in speech perception are finely tuned to the type of information available, and that reducing the richness of the acoustic signal dramatically alters the brain's response to spoken language, even when intelligibility is high.

  14. Defense Waste Processing Facility staged operations: environmental information document

    International Nuclear Information System (INIS)

    1981-11-01

    Environmental information is presented relating to a staged version of the proposed Defense Waste Processing Facility (DWPF) at the Savannah River Plant. The information is intended to provide the basis for an Environmental Impact Statement. In either the integral or the staged design, the DWPF will convert the high-level waste currently stored in tanks into: a leach-resistant form containing about 99.9% of all the radioactivity, and a residual, slightly contaminated salt, which is disposed of as saltcrete. In the first stage of the staged version, the insoluble sludge portion of the waste and the long lived radionuclides contained therein will be vitrified. The waste glass will be sealed in canisters and stored onsite until shipped to a Federal repository. In the second stage, the supernate portion of the waste will be decontaminated by ion exchange. The recovered radionuclides will be transferred to the Stage 1 facility, and mixed with the sludge feed before vitrification. The residual, slightly contaminated salt solution will be mixed with Portland cement to form a concrete product (saltcrete) which will be buried onsite in an engineered landfill. This document describes the conceptual facilities and processes for producing glass waste and decontaminated salt. The environmental effects of facility construction, normal operations, and accidents are then presented. Descriptions of site and environs, alternative sites and waste disposal options, and environmental consultations and permits are given in the base Environmental Information Document

  15. Process Technical Basis Documentation Diagram for a solid-waste processing facility

    International Nuclear Information System (INIS)

    Benar, C.J.; Petersen, C.A.

    1994-02-01

    The Process Technical Basis Documentation Diagram is for a solid-waste processing facility that could be designed to treat, package, and certify contact-handled mixed low-level waste for permanent disposal. The treatment processes include stabilization using cementitious materials and immobilization using a polymer material. The Diagram identifies several engineering/demonstration activities that would confirm the process selection and process design. An independent peer review was conducted at the request of Westinghouse Hanford Company to determine the technical adequacy of the technical approach for waste form development. The peer review panel provided comments and identified documents that it felt were needed in the Diagram as precedence for Title I design. The Diagram is a visual tool to identify traceable documentation of key activities, including those documents suggested by the peer review, and to show how they relate to each other. The Diagram is divided into three sections: (1) the Facility section, which contains documents pertaining to the facility design, (2) the Process Demonstration section, which contains documents pertaining to the process engineering/demonstration work, and (3) the Regulatory section, which contains documents describing the compliance strategy for each acceptance requirement for each feed type, and how this strategy will be implemented.

  16. USE OF ARTIFICIAL INTELLIGENCE TECHNIQUES IN QUALITY IMPROVING PROCESS

    OpenAIRE

    KALİTE İYİLEŞTİRME SÜRECİNDE YAPAY ZEKÂ KAYA; Orhan ENGİN

    2005-01-01

    Today, changing competition conditions and customer preferences have brought many changes to the way firms approach quality. At the same time, improvements in computer technology have accelerated the use of artificial intelligence. Artificial intelligence technologies are being used to solve many industrial problems. In this paper, we investigate the use of artificial intelligence techniques to solve quality problems. The artificial intelligence techniques, which are used in quali...

  17. Fluid Intelligence and Automatic Neural Processes in Facial Expression Perception

    DEFF Research Database (Denmark)

    Liu, Tongran; Xiao, Tong; Li, Xiaoyan

    2015-01-01

    The relationship between human fluid intelligence and social-emotional abilities has been a topic of considerable interest. The current study investigated whether adolescents with different intellectual levels had different automatic neural processing of facial expressions. Two groups of adolescent...... males were enrolled: a high IQ group and an average IQ group. Age and parental socioeconomic status were matched between the two groups. Participants counted the numbers of the central cross changes while paired facial expressions were presented bilaterally in an oddball paradigm. There were two.......2). Participants were required to concentrate on the primary task of counting the central cross changes and to ignore the expressions to ensure that facial expression processing was automatic. Event-related potentials (ERPs) were obtained during the tasks. The visual mismatch negativity (vMMN) components were...

  18. Integrating artificial and human intelligence into tablet production process.

    Science.gov (United States)

    Gams, Matjaž; Horvat, Matej; Ožek, Matej; Luštrek, Mitja; Gradišek, Anton

    2014-12-01

    We developed a new machine learning-based method in order to facilitate the manufacturing processes of pharmaceutical products, such as tablets, in accordance with the Process Analytical Technology (PAT) and Quality by Design (QbD) initiatives. Our approach combines the data available from prior production runs with machine learning algorithms that are assisted by a human operator with expert knowledge of the production process. The process parameters encompass those that relate to the attributes of the precursor raw materials and those that relate to the manufacturing process itself. During manufacturing, our method allows the production operator to inspect the impact of various process parameter settings within their proven acceptable range, in order to choose the most promising values in advance of the actual batch manufacture. The interaction between the human operator and the artificial intelligence system provides improved performance and quality. We successfully implemented the method on data provided by a pharmaceutical company for a particular product, a tablet, under development. We tested the accuracy of the method in comparison with some other machine learning approaches. The method is especially suitable for analyzing manufacturing processes characterized by a limited amount of data.
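
    The workflow described, fitting a model to prior runs and letting the operator scan settings within the proven acceptable range, can be sketched with synthetic data. The parameter names and ranges below are invented, and a plain least-squares model stands in for the machine learning methods used in the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for prior production runs (hypothetical parameters):
# compression force [kN] and granule moisture [%] -> tablet hardness.
force = rng.uniform(5, 25, 80)
moisture = rng.uniform(1, 4, 80)
hardness = 2.0 + 0.35 * force - 0.8 * moisture + rng.normal(0, 0.3, 80)

# Fit a simple linear model to the historical data (least squares).
A = np.column_stack([np.ones_like(force), force, moisture])
coef, *_ = np.linalg.lstsq(A, hardness, rcond=None)

# Operator step: scan compression force over its proven acceptable range
# at a fixed moisture, and pick the setting closest to a target hardness.
target = 8.0
candidates = np.linspace(5, 25, 41)
pred = coef[0] + coef[1] * candidates + coef[2] * 2.5
best = candidates[np.argmin(np.abs(pred - target))]
print(f"suggested compression force: {best:.1f} kN")
```

    The human-in-the-loop element is the scan itself: the model only predicts, while the operator decides which predicted setting to carry into the actual batch.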

  19. Advanced Manufacturing Processes Laboratory Building 878 hazards assessment document

    International Nuclear Information System (INIS)

    Wood, C.; Thornton, W.; Swihart, A.; Gilman, T.

    1994-07-01

    The purpose of this hazards assessment is to document the impact of releases of hazardous materials at the Advanced Manufacturing Processes Laboratory (AMPL) that are significant enough to warrant consideration in Sandia National Laboratories' operational emergency management program. This hazards assessment is prepared in accordance with the Department of Energy Order 5500.3A requirement that facility-specific hazards assessments be prepared, maintained, and used for emergency planning purposes. This hazards assessment provides an analysis of the potential airborne release of chemicals associated with the operations and processes at the AMPL. This research and development laboratory develops advanced manufacturing technologies, practices, and unique equipment and provides the fabrication of prototype hardware to meet the needs of Sandia National Laboratories, Albuquerque, New Mexico (SNL/NM). The focus of the hazards assessment is the airborne release of materials because this requires the most rapid, coordinated emergency response on the part of the AMPL, SNL/NM, collocated facilities, and surrounding jurisdiction to protect workers, the public, and the environment.

  20. Advanced Manufacturing Processes Laboratory Building 878 hazards assessment document

    Energy Technology Data Exchange (ETDEWEB)

    Wood, C.; Thornton, W.; Swihart, A.; Gilman, T.

    1994-07-01

    The purpose of this hazards assessment is to document the impact of releases of hazardous materials at the Advanced Manufacturing Processes Laboratory (AMPL) that are significant enough to warrant consideration in Sandia National Laboratories' operational emergency management program. This hazards assessment is prepared in accordance with the Department of Energy Order 5500.3A requirement that facility-specific hazards assessments be prepared, maintained, and used for emergency planning purposes. This hazards assessment provides an analysis of the potential airborne release of chemicals associated with the operations and processes at the AMPL. This research and development laboratory develops advanced manufacturing technologies, practices, and unique equipment and provides the fabrication of prototype hardware to meet the needs of Sandia National Laboratories, Albuquerque, New Mexico (SNL/NM). The focus of the hazards assessment is the airborne release of materials because this requires the most rapid, coordinated emergency response on the part of the AMPL, SNL/NM, collocated facilities, and surrounding jurisdiction to protect workers, the public, and the environment.

  1. A New Technique For Information Processing of CLIC Technical Documentation

    CERN Document Server

    Tzermpinos, Konstantinos

    2013-01-01

    The scientific work presented in this paper can be described as a novel, systemic approach to the organization of CLIC documentation. The latter refers to processing the various sets of archived data found on various CERN archiving services in a more accessible and organized way. From a physics standpoint, this is equivalent to an initial system characterized by high entropy which, after some transformation of energy and matter, produces a final system of reduced entropy. However, this reduction in entropy is valid only for open systems, which are subsystems of larger isolated systems whose total entropy always increases. Thus, using elements from information theory, systems theory and thermodynamics as a basis, the unorganized data awaiting organization into a higher form are modeled as an initial open subsystem with high entropy which, after the processing of information, yields a final system with decreased entropy. This systemic approach to the ...
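
    The entropy framing can be made concrete with Shannon entropy over where documents are stored. The services and counts below are invented for illustration; the point is only that consolidating scattered records lowers the entropy of the placement distribution:

```python
import math
from collections import Counter

def shannon_entropy(labels):
    """H = -sum p*log2(p) over the empirical distribution of labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

# Toy illustration (not CERN data): storage location of 12 documents
# before and after organization. Scattered -> high entropy; grouped -> low.
before = ["svc1", "svc2", "svc3", "svc4"] * 3
after = ["archive"] * 10 + ["drafts"] * 2

print(f"H(before) = {shannon_entropy(before):.2f} bits")
print(f"H(after)  = {shannon_entropy(after):.2f} bits")
```

    The local decrease is consistent with thermodynamics because the archiving process expends work; the total entropy of the larger isolated system still increases, as the abstract notes.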

  2. Synthetic-Creative Intelligence and Psychometric Intelligence: Analysis of the Threshold Theory and Creative Process

    Science.gov (United States)

    Ferrando, Mercedes; Soto, Gloria; Prieto, Lola; Sáinz, Marta; Ferrándiz, Carmen

    2016-01-01

    There has been an increasing body of research to uncover the relationship between creativity and intelligence. This relationship usually has been examined using traditional measures of intelligence and seldom using new approaches (i.e. Ferrando et al. 2005). In this work, creativity is measured by tools developed based on Sternberg's successful…

  3. The brain as a distributed intelligent processing system: an EEG study.

    Science.gov (United States)

    da Rocha, Armando Freitas; Rocha, Fábio Theoto; Massad, Eduardo

    2011-03-15

    Various neuroimaging studies, both structural and functional, have provided support for the proposal that a distributed brain network is likely to be the neural basis of intelligence. The theory of Distributed Intelligent Processing Systems (DIPS), first developed in the field of Artificial Intelligence, was proposed to adequately model distributed neural intelligent processing. In addition, the neural efficiency hypothesis suggests that individuals with higher intelligence display more focused cortical activation during cognitive performance, resulting in lower total brain activation when compared with individuals who have lower intelligence. This may be understood as a property of the DIPS. In our study, a new EEG brain mapping technique, based on the neural efficiency hypothesis and the notion of the brain as a Distributed Intelligence Processing System, was used to investigate the correlations between IQ evaluated with WAIS (Wechsler Adult Intelligence Scale) and WISC (Wechsler Intelligence Scale for Children), and the brain activity associated with visual and verbal processing, in order to test the validity of a distributed neural basis for intelligence. The present results support these claims and the neural efficiency hypothesis.

  4. The brain as a distributed intelligent processing system: an EEG study.

    Directory of Open Access Journals (Sweden)

    Armando Freitas da Rocha

    Full Text Available BACKGROUND: Various neuroimaging studies, both structural and functional, have provided support for the proposal that a distributed brain network is likely to be the neural basis of intelligence. The theory of Distributed Intelligent Processing Systems (DIPS), first developed in the field of Artificial Intelligence, was proposed to adequately model distributed neural intelligent processing. In addition, the neural efficiency hypothesis suggests that individuals with higher intelligence display more focused cortical activation during cognitive performance, resulting in lower total brain activation when compared with individuals who have lower intelligence. This may be understood as a property of the DIPS. METHODOLOGY AND PRINCIPAL FINDINGS: In our study, a new EEG brain mapping technique, based on the neural efficiency hypothesis and the notion of the brain as a Distributed Intelligence Processing System, was used to investigate the correlations between IQ evaluated with WAIS (Wechsler Adult Intelligence Scale) and WISC (Wechsler Intelligence Scale for Children), and the brain activity associated with visual and verbal processing, in order to test the validity of a distributed neural basis for intelligence. CONCLUSION: The present results support these claims and the neural efficiency hypothesis.

  5. The Brain as a Distributed Intelligent Processing System: An EEG Study

    Science.gov (United States)

    da Rocha, Armando Freitas; Rocha, Fábio Theoto; Massad, Eduardo

    2011-01-01

    Background Various neuroimaging studies, both structural and functional, have provided support for the proposal that a distributed brain network is likely to be the neural basis of intelligence. The theory of Distributed Intelligent Processing Systems (DIPS), first developed in the field of Artificial Intelligence, was proposed to adequately model distributed neural intelligent processing. In addition, the neural efficiency hypothesis suggests that individuals with higher intelligence display more focused cortical activation during cognitive performance, resulting in lower total brain activation when compared with individuals who have lower intelligence. This may be understood as a property of the DIPS. Methodology and Principal Findings In our study, a new EEG brain mapping technique, based on the neural efficiency hypothesis and the notion of the brain as a Distributed Intelligence Processing System, was used to investigate the correlations between IQ evaluated with WAIS (Wechsler Adult Intelligence Scale) and WISC (Wechsler Intelligence Scale for Children), and the brain activity associated with visual and verbal processing, in order to test the validity of a distributed neural basis for intelligence. Conclusion The present results support these claims and the neural efficiency hypothesis. PMID:21423657

  6. Intelligent sensor networks the integration of sensor networks, signal processing and machine learning

    CERN Document Server

    Hu, Fei

    2012-01-01

    Although governments worldwide have invested significantly in intelligent sensor network research and applications, few books cover intelligent sensor networks from a machine learning and signal processing perspective. Filling this void, Intelligent Sensor Networks: The Integration of Sensor Networks, Signal Processing and Machine Learning focuses on the close integration of sensing, networking, and smart signal processing via machine learning. Based on the world-class research of award-winning authors, the book provides a firm grounding in the fundamentals of intelligent sensor networks, incl

  7. UMTRA Surface Project management action process document: Final. Revision 2

    International Nuclear Information System (INIS)

    1996-06-01

    Title 1 of the UMTRCA authorized the DOE to undertake remedial actions at these designated sites and associated vicinity properties (VP), which contain uranium mill tailings and other residual radioactive materials (RRM) derived from the processing sites. Title 2 of the UMTRCA addresses uranium mill sites that were licensed at the time the UMTRCA was enacted. Cleanup of these Title 2 sites is the responsibility of the licensees. The cleanup of the Title 1 sites has been split into two separate projects: the Surface Project, which deals with the mill buildings, tailings, and contaminated soils at the sites and VPs; and the Ground Water Project, which is limited to the contaminated ground water at the sites. This management action process (MAP) document discusses the Uranium Mill Tailings Remedial Action (UMTRA) Surface Project. Since its inception through March 1996, the Surface Project (hereinafter called the Project) has cleaned up 16 of the 24 designated processing sites and approximately 5,000 VPs, reducing the risk to human health and the environment posed by the uranium mill tailings. Two of the 24 sites, Belfield and Bowman, North Dakota, will not be remediated at the request of the state, reducing the total number of sites to 22. By the start of FY1998, the remaining 6 processing sites and associated VPs will be cleaned up. The remedial action activities to be funded in FY1998 by the FY1998 budget request are remediation of the remaining Grand Junction, Colorado, VPs; closure of the Cheney disposal cell in Grand Junction, Colorado; and preparation of the completion reports for 4 completed sites.

  8. UMTRA Surface Project management action process document: Final. Revision 2

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-06-01

    Title 1 of the UMTRCA authorized the DOE to undertake remedial actions at these designated sites and associated vicinity properties (VP), which contain uranium mill tailings and other residual radioactive materials (RRM) derived from the processing sites. Title 2 of the UMTRCA addresses uranium mill sites that were licensed at the time the UMTRCA was enacted. Cleanup of these Title 2 sites is the responsibility of the licensees. The cleanup of the Title 1 sites has been split into two separate projects: the Surface Project, which deals with the mill buildings, tailings, and contaminated soils at the sites and VPs; and the Ground Water Project, which is limited to the contaminated ground water at the sites. This management action process (MAP) document discusses the Uranium Mill Tailings Remedial Action (UMTRA) Surface Project. Since its inception through March 1996, the Surface Project (hereinafter called the Project) has cleaned up 16 of the 24 designated processing sites and approximately 5,000 VPs, reducing the risk to human health and the environment posed by the uranium mill tailings. Two of the 24 sites, Belfield and Bowman, North Dakota, will not be remediated at the request of the state, reducing the total number of sites to 22. By the start of FY1998, the remaining 6 processing sites and associated VPs will be cleaned up. The remedial action activities to be funded in FY1998 by the FY1998 budget request are remediation of the remaining Grand Junction, Colorado, VPs; closure of the Cheney disposal cell in Grand Junction, Colorado; and preparation of the completion reports for 4 completed sites.

  9. Intelligent control system for continuous technological process of alkylation

    Science.gov (United States)

    Gebel, E. S.; Hakimov, R. A.

    2018-01-01

    The relevance of intelligent control for complex dynamic objects and processes is shown in this paper. The model of a virtual analyzer based on a neural network is proposed. Comparative analysis of mathematical models implemented in MATLAB software showed that the most effective, from the point of view of reproducibility of the result, is the model with seven neurons in the hidden layer, trained using the method of scaled conjugate gradients. Comparison of the data from the laboratory analysis and the theoretical model showed that the root-mean-square error does not exceed 3.5, and the calculated value of the correlation coefficient corresponds to a "strong" connection between the values.
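
    The reported figures (a root-mean-square error and a correlation between laboratory values and model output) can be reproduced with two short metrics. A minimal sketch in Python; the data points are invented for illustration and are not the study's measurements:

```python
import math

def rmse(actual, predicted):
    """Root-mean-square error between laboratory values and model output."""
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual))

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

lab   = [10.0, 12.5, 11.0, 14.2, 13.1]   # hypothetical laboratory assay values
model = [10.4, 12.0, 11.3, 14.8, 12.7]   # hypothetical virtual-analyzer output

print(round(rmse(lab, model), 3))    # small error
print(round(pearson(lab, model), 3)) # close to 1: a "strong" connection
```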

  10. The Viewpoint Paradigm: a semiotic based approach for the intelligibility of a cooperative designing process

    Directory of Open Access Journals (Sweden)

    Pierre-Jean Charrel

    2002-11-01

    Full Text Available The concept of viewpoint is studied in the field of modelling and knowledge management concerned with the upstream phases of a designing process. The concept is approached through semiotics, i.e. by dealing with the requirements under which an actor gives sense to an object. This makes it possible to transform the intuitive concepts of viewpoint and relation between viewpoints into the Viewpoint Paradigm: the sense of an object is the integration of the viewpoints which bear on it. The elements of this paradigm are integrated in a general model, which formally defines two concepts: Viewpoint and Correlation of viewpoints. The Viewpoint Paradigm is then implemented in operational concerns related to the intelligibility of the designing process. Two models of viewpoint and correlation are proposed. They arise from viewpoints management as it can be identified in the written documents of a project.

  11. The process of implementing Competitive Intelligence in a company

    OpenAIRE

    František Bartes

    2013-01-01

    It is a common occurrence in business practice that the management of a company, in an effort to jump-start the function of the Competitive Intelligence unit, makes a number of mistakes and errors. Yet it is not difficult to avoid these missteps and achieve the desired level of Competitive Intelligence activities in a purposeful and effective manner. The author believes that a resolution of this problem lies in his concept of Competitive Intelligence viewed as a system application discipline ...

  12. Intelligent Integration between Human Simulated Intelligence and Expert Control Technology for the Combustion Process of Gas Heating Furnace

    Directory of Open Access Journals (Sweden)

    Yucheng Liu

    2014-01-01

    Full Text Available Because the control quality of the combustion process of a gas heating furnace is poor, this paper explored a strongly robust control algorithm to improve it. The paper analyzed the control difficulties in the complex combustion process of a gas heating furnace, summarized the cybernetic characteristics of the complex combustion process, investigated control strategies for uncertain complex processes, discussed the control model of the complex process, presented an intelligent integration of human-simulated intelligence and expert control technology, and constructed the control algorithm for controlling the combustion process of a gas heating furnace. The simulation results showed that the proposed control algorithm not only achieves better dynamic and steady-state quality of the combustion process, but also yields an obvious energy-saving effect, and is a feasible and effective control strategy.

  13. Natural language processing in an intelligent writing strategy tutoring system.

    Science.gov (United States)

    McNamara, Danielle S; Crossley, Scott A; Roscoe, Rod

    2013-06-01

    The Writing Pal is an intelligent tutoring system that provides writing strategy training. A large part of its artificial intelligence resides in the natural language processing algorithms to assess essay quality and guide feedback to students. Because writing is often highly nuanced and subjective, the development of these algorithms must consider a broad array of linguistic, rhetorical, and contextual features. This study assesses the potential for computational indices to predict human ratings of essay quality. Past studies have demonstrated that linguistic indices related to lexical diversity, word frequency, and syntactic complexity are significant predictors of human judgments of essay quality but that indices of cohesion are not. The present study extends prior work by including a larger data sample and an expanded set of indices to assess new lexical, syntactic, cohesion, rhetorical, and reading ease indices. Three models were assessed. The model reported by McNamara, Crossley, and McCarthy (Written Communication 27:57-86, 2010) including three indices of lexical diversity, word frequency, and syntactic complexity accounted for only 6% of the variance in the larger data set. A regression model including the full set of indices examined in prior studies of writing predicted 38% of the variance in human scores of essay quality with 91% adjacent accuracy (i.e., within 1 point). A regression model that also included new indices related to rhetoric and cohesion predicted 44% of the variance with 94% adjacent accuracy. The new indices increased accuracy but, more importantly, afford the means to provide more meaningful feedback in the context of a writing tutoring system.
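
    The adjacent-accuracy figure (a prediction counted correct when within 1 point of the human rating) is straightforward to compute. A minimal sketch, with invented ratings rather than the study's data:

```python
def adjacent_accuracy(human, predicted, tolerance=1):
    """Fraction of predictions within `tolerance` points of the human score."""
    hits = sum(1 for h, p in zip(human, predicted) if abs(h - p) <= tolerance)
    return hits / len(human)

human     = [3, 4, 2, 5, 4, 3, 1, 4]   # hypothetical human essay ratings
predicted = [3, 3, 4, 5, 5, 3, 2, 2]   # hypothetical model scores

print(adjacent_accuracy(human, predicted))  # 0.75
```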

  14. Advances in intelligent process-aware information systems concepts, methods, and technologies

    CERN Document Server

    Oberhauser, Roy; Reichert, Manfred

    2017-01-01

    This book provides a state-of-the-art perspective on intelligent process-aware information systems and presents chapters on specific facets and approaches applicable to such systems. Further, it highlights novel advances and developments in various aspects of intelligent process-aware information systems and business process management systems. Intelligence capabilities are increasingly being integrated into or created in many of today’s software products and services. Process-aware information systems provide critical computing infrastructure to support the various processes involved in the creation and delivery of business products and services. Yet the integration of intelligence capabilities into process-aware information systems is a non-trivial yet necessary evolution of these complex systems. The book’s individual chapters address adaptive process management, case management processes, autonomically-capable processes, process-oriented information logistics, process recommendations, reasoning over ...

  15. Hanford Tanks Initiative requirements and document management process guide

    International Nuclear Information System (INIS)

    Schaus, P.S.

    1998-01-01

    This revision of the guide provides updated references to project management level Program Management and Assessment Configuration Management activities, and provides working level directions for submitting requirements and project documentation related to the Hanford Tanks Initiative (HTI) project. This includes documents and information created by HTI, as well as non-HTI generated materials submitted to the project

  16. Processing United Nations Documents in the University of Michigan Library.

    Science.gov (United States)

    Stolper, Gertrude

    This guide provides detailed instructions for recording documents in the United Nations (UN) card catalog which provides access to the UN depository collection in the Harlan Hatcher Graduate Library at the University of Michigan. Procedures for handling documents when they are received include stamping, counting, and sorting into five categories:…

  17. Data Processing Languages for Business Intelligence. SQL vs. R

    Directory of Open Access Journals (Sweden)

    Marin FOTACHE

    2016-01-01

    Full Text Available As a data-centric approach, Business Intelligence (BI) deals with the storage, integration, processing, exploration and analysis of information gathered from multiple sources in various formats and volumes. BI systems are generally synonymous with costly, complex platforms that require vast organizational resources. But there is also another face of BI, that of a pool of data sources, applications, and services developed at different times using different technologies. This is “democratic” BI or, in some cases, “fragmented”, “patched” (or “chaotic”) BI. Fragmentation not only creates integration problems, but also supports BI agility, as new modules can be quickly developed. Among the various languages and tools that cover large extents of BI activities, SQL and R are instrumental for both BI platform developers and BI users. SQL and R address both monolithic and democratic BI. This paper compares essential data processing features of the two languages, identifying similarities and differences between them as well as their strengths and limits.
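
    The declarative-versus-procedural contrast the paper draws can be illustrated with Python's standard sqlite3 module; the sales table and figures below are hypothetical:

```python
import sqlite3

# Hypothetical sales records used only for illustration.
rows = [("north", 120.0), ("south", 80.0), ("north", 60.0), ("south", 40.0)]

# SQL (declarative): group-and-aggregate in one statement.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (region TEXT, amount REAL)")
con.executemany("INSERT INTO sales VALUES (?, ?)", rows)
sql_totals = dict(con.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region"))
con.close()

# Procedural equivalent: explicit iteration and accumulation.
py_totals = {}
for region, amount in rows:
    py_totals[region] = py_totals.get(region, 0.0) + amount

print(sql_totals == py_totals)  # True: both give {"north": 180.0, "south": 120.0}
```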

  18. Feature-based tolerancing for intelligent inspection process definition

    International Nuclear Information System (INIS)

    Brown, C.W.

    1993-07-01

    This paper describes a feature-based tolerancing capability that complements a geometric solid model with an explicit representation of conventional and geometric tolerances. This capability is focused on supporting an intelligent inspection process definition system. The feature-based tolerance model's benefits include advancing complete product definition initiatives (e.g., STEP -- Standard for the Exchange of Product model data), supplying computer-integrated manufacturing applications (e.g., generative process planning and automated part programming) with product definition information, and assisting in the solution of measurement performance issues. A feature-based tolerance information model was developed based upon the notion of a feature's toleranceable aspects and describes an object-oriented scheme for representing and relating tolerance features, tolerances, and datum reference frames. For easy incorporation, the tolerance feature entities are interconnected with STEP solid model entities. This schema will explicitly represent the tolerance specification for mechanical products, support advanced dimensional measurement applications, and assist in tolerance-related methods divergence issues.
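
    The object-oriented scheme relating tolerance features, tolerances, and datum reference frames might be sketched as follows; all class and field names are hypothetical illustrations, not the paper's schema:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class DatumReferenceFrame:
    """An ordered list of datum labels, e.g. ['A', 'B', 'C']."""
    datums: List[str]

@dataclass
class Tolerance:
    """A single conventional or geometric tolerance on a feature aspect."""
    kind: str                                   # e.g. "position", "flatness"
    value: float                                # tolerance zone size, mm
    frame: Optional[DatumReferenceFrame] = None

@dataclass
class ToleranceFeature:
    """A toleranceable aspect of a solid-model feature."""
    name: str
    tolerances: List[Tolerance] = field(default_factory=list)

hole = ToleranceFeature("mounting_hole")
hole.tolerances.append(
    Tolerance("position", 0.1, DatumReferenceFrame(["A", "B", "C"])))
print(hole.tolerances[0].kind)  # position
```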

  19. Richland Environmental Restoration Project management action process document

    International Nuclear Information System (INIS)

    1996-04-01

    This document is the prescribed means for providing direct input to the US Department of Energy Headquarters regarding the status, accomplishments, strategy, and issues of the Richland Environmental Restoration Project. The project mission, organizational interfaces, and operational history of the Hanford Site are provided. Remediation strategies are analyzed in detail. The document includes a status of Richland Environmental Restoration project activities and accomplishments, and it presents current cost summaries, schedules, and technical baselines

  20. Richland Environmental Restoration Project management action process document

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-04-01

    This document is the prescribed means for providing direct input to the US Department of Energy Headquarters regarding the status, accomplishments, strategy, and issues of the Richland Environmental Restoration Project. The project mission, organizational interfaces, and operational history of the Hanford Site are provided. Remediation strategies are analyzed in detail. The document includes a status of Richland Environmental Restoration project activities and accomplishments, and it presents current cost summaries, schedules, and technical baselines.

  1. 19 CFR 210.7 - Service of process and other documents; publication of notices.

    Science.gov (United States)

    2010-04-01

    ... 19 Customs Duties 3 2010-04-01 2010-04-01 false Service of process and other documents... § 210.7 Service of process and other documents; publication of notices. (a) Manner of service. (1) The service of process and all documents issued by or on behalf of the Commission or the administrative law...

  2. 19 CFR 201.16 - Service of process and other documents.

    Science.gov (United States)

    2010-04-01

    ... 19 Customs Duties 3 2010-04-01 2010-04-01 false Service of process and other documents. 201.16... APPLICATION Initiation and Conduct of Investigations § 201.16 Service of process and other documents. (a) By..., the service of a process or other document of the Commission shall be served by anyone duly authorized...

  3. Simulation research on the process of large scale ship plane segmentation intelligent workshop

    Science.gov (United States)

    Xu, Peng; Liao, Liangchuang; Zhou, Chao; Xue, Rui; Fu, Wei

    2017-04-01

    The large-scale ship plane segmentation intelligent workshop is a new concept, and there is no prior research work in related fields domestically or abroad. The mode of production should be transformed from the existing Industry 2.0, or partial Industry 3.0, from "human brain analysis and judgment + machine manufacturing" to "machine analysis and judgment + machine manufacturing". In this transformation, a great many tasks must be determined in both management and technology, such as workshop structure evolution, development of intelligent equipment, and changes in the business model. Accompanying them is the reformation of the whole workshop. Process simulation in this project verifies the general layout and process flow of the large-scale ship plane segmentation intelligent workshop, and also analyzes the workshop's working efficiency, which is significant for the next step of the transformation of the plane segmentation intelligent workshop.

  4. Information Processing and Coaching Treatments in an Intelligent Tutoring System

    National Research Council Canada - National Science Library

    Dillon, Ronna

    1997-01-01

    The purpose of this effort was to develop an intelligent tutoring system (ITS) to train test administrators how to operate computerized adaptive testing Armed Services Vocational Aptitude Battery (CAT-ASVAB...

  5. Artificial intelligence in NMR imaging and image processing

    International Nuclear Information System (INIS)

    Kuhn, M.H.

    1988-01-01

    NMR tomography offers a wealth of information and data acquisition variants. Artificial intelligence is able to efficiently support the selection of measuring parameters and the evaluation of results. (orig.) [de

  6. Intelligent technologies in process of highly-precise products manufacturing

    Science.gov (United States)

    Vakhidova, K. L.; Khakimov, Z. L.; Isaeva, M. R.; Shukhin, V. V.; Labazanov, M. A.; Ignatiev, S. A.

    2017-10-01

    One of the main control methods for the surface layer of bearing parts is the eddy current testing method. Surface layer defects of bearing parts, such as burns, cracks and some others, are reflected in the results of the rolling surfaces scan. The previously developed method for detecting defects from the image of the raceway was quite effective, but the processing algorithm is complicated and takes about 12 to 16 s. The real non-stationary signals from an eddy current transducer (ECT) consist of short-time high-frequency and long-time low-frequency components; therefore, a transformation that provides different windows for different frequencies is used for their analysis. The wavelet transform meets these conditions. Based on the aforesaid, a methodology for automatically detecting and recognizing local defects in the surface layer of bearing parts has been developed on the basis of wavelet analysis using integral estimates. Some of the defects are recognized by the amplitude component; otherwise an automatic transition to recognition by the phase component of information signals (IS) is carried out. The use of intelligent technologies in the manufacture of bearing parts will, firstly, significantly improve the quality of bearings, and secondly, significantly improve production efficiency by reducing (or eliminating) rejections in the manufacture of products, increasing the period of normal operation of the technological equipment (the inter-adjustment period), implementing a system of flexible facilities maintenance, and reducing production costs.
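
    The role of detail coefficients in localizing a short-time transient can be shown with a single level of the Haar wavelet transform, a minimal stand-in for the wavelet analysis described above; the trace and threshold are invented:

```python
import math

def haar_dwt(signal):
    """One level of the Haar wavelet transform.

    Returns (approximation, detail) coefficient lists; the detail
    coefficients respond to short-time transients in the signal.
    """
    approx, detail = [], []
    for i in range(0, len(signal) - 1, 2):
        a, b = signal[i], signal[i + 1]
        approx.append((a + b) / math.sqrt(2))
        detail.append((a - b) / math.sqrt(2))
    return approx, detail

# Hypothetical eddy-current trace: flat baseline with one sharp spike.
trace = [1.0] * 8
trace[4] = 4.0  # simulated local defect signature

_, detail = haar_dwt(trace)
flagged = [i for i, d in enumerate(detail) if abs(d) > 1.0]
print(flagged)  # [2] -> detail pair 2 covers samples 4-5, where the spike sits
```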

  7. Detection, information fusion, and temporal processing for intelligence in recognition

    Energy Technology Data Exchange (ETDEWEB)

    Casasent, D. [Carnegie Mellon Univ., Pittsburgh, PA (United States)

    1996-12-31

    The use of intelligence in vision recognition uses many different techniques or tools. This presentation discusses several of these techniques for recognition. The recognition process is generally separated into several steps or stages when implemented in hardware, e.g. detection, segmentation and enhancement, and recognition. Several new distortion-invariant filters, biologically-inspired Gabor wavelet filter techniques, and morphological operations that have been found very useful for detection and clutter rejection are discussed. These are all shift-invariant operations that allow multiple object regions of interest in a scene to be located in parallel. We also discuss new algorithm fusion concepts by which the results from different detection algorithms are combined to reduce detection false alarms; these fusion methods utilize hierarchical processing and fuzzy logic concepts. We have found this to be most necessary, since no single detection algorithm is best for all cases. For the final recognition stage, we describe a new method of representing all distorted versions of different classes of objects and determining the object class and pose that most closely matches that of a given input. Besides being efficient in terms of storage and on-line computations required, it overcomes many of the problems that other classifiers have in terms of the required training set size, poor generalization with many hidden layer neurons, etc. It is also attractive in its ability to reject input regions as clutter (non-objects) and to learn new object descriptions. We also discuss its use in processing a temporal sequence of input images of the contents of each local region of interest. We note how this leads to robust results in which estimation errors in individual frames can be overcome. This seems very practical, since in many scenarios a decision need not be made after only one frame of data, as subsequent frames of data enter immediately in sequence.
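
    The fuzzy-logic fusion idea, combining several detectors so that a region survives only when they agree, can be sketched with fuzzy AND/OR operators; the detector confidences below are invented for illustration:

```python
def fuzzy_and(scores):
    """Fuzzy conjunction: the region's score is its weakest detector response."""
    return min(scores)

def fuzzy_or(scores):
    """Fuzzy disjunction: the region's score is its strongest detector response."""
    return max(scores)

# Hypothetical confidences from three detection algorithms for two regions.
target  = [0.9, 0.8, 0.85]   # true object: all detectors agree
clutter = [0.95, 0.1, 0.2]   # clutter: one detector fires spuriously

threshold = 0.5
print(fuzzy_and(target) > threshold)   # True  -> region kept
print(fuzzy_and(clutter) > threshold)  # False -> false alarm rejected
```

Requiring agreement (fuzzy AND) suppresses single-detector false alarms, which matches the observation that no single detection algorithm is best for all cases.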

  8. A MURI Center for Intelligent Biomimetic Image Processing and Classification

    Science.gov (United States)

    2007-11-01

    times, labeled "plage" or "open space" or "natural", the system learns to associate multiple classes with a given input. Testbed image examples have shown...brain color perception and category learning. Commentary on "Coordinating perceptually grounded categories through language" by Luc Steels and Tony...Symposium on Computational Intelligence (ISCI), Kosice, Slovakia, June 2002. 9. Carpenter, G.A., Award from the Slovak Artificial Intelligence Society, 2002

  9. 48 CFR 32.905 - Payment documentation and process.

    Science.gov (United States)

    2010-10-01

    ..., quantity, unit of measure, unit price, and extended price of supplies delivered or services performed. (v... documentation to the designated payment office by the 5th working day after Government acceptance or approval... this section. Acceptance should be completed as expeditiously as possible. The receiving report or...

  10. Racial Equality in Intelligence: Predictions from a Theory of Intelligence as Processing

    Science.gov (United States)

    Fagan, Joseph F.; Holland, Cynthia R.

    2007-01-01

    African-Americans and Whites were asked to solve problems typical of those administered on standard tests of intelligence. Half of the problems were solvable on the basis of information generally available to either race and/or on the basis of information newly learned. Such knowledge did not vary with race. Other problems were only solvable on…

  11. Effect of promoting self-esteem by participatory learning process on emotional intelligence among early adolescents.

    Science.gov (United States)

    Munsawaengsub, Chokchai; Yimklib, Somkid; Nanthamongkolchai, Sutham; Apinanthavech, Suporn

    2009-12-01

    To study the effect of promoting self-esteem through a participatory learning program on emotional intelligence among early adolescents. The quasi-experimental study was conducted in grade 9 students from two schools in Bangbuathong district, Nonthaburi province. The experimental and comparative groups each consisted of 34 students with the lowest scores of emotional intelligence. The instruments were questionnaires, the Program to Develop Emotional Intelligence, and the Handbook of Emotional Intelligence Development. The experimental group attended 8 participatory learning activities over 4 weeks to develop emotional intelligence, while the comparative group received the handbook for self-study. The effectiveness of the program was assessed by pre-test and post-test of emotional intelligence, immediately and 4 weeks later. Implementation and evaluation were done during May 24-August 12, 2005. Data were analyzed by frequency, percentage, mean, standard deviation, Chi-square, independent-sample t-test and paired-sample t-test. Before program implementation, both groups had no statistical difference in mean score of emotional intelligence. After the intervention, the experimental group had a higher mean score of emotional intelligence both immediately and 4 weeks later, with statistical significance (p = 0.001). Promoting self-esteem by a participatory learning process could enhance emotional intelligence in early adolescents. This program could be modified and implemented for early adolescents in the community.

  12. Documentation of a Conduit Flow Process (CFP) for MODFLOW-2005

    Science.gov (United States)

    Shoemaker, W. Barclay; Kuniansky, Eve L.; Birk, Steffen; Bauer, Sebastian; Swain, Eric D.

    2007-01-01

    This report documents the Conduit Flow Process (CFP) for the modular finite-difference ground-water flow model, MODFLOW-2005. The CFP has the ability to simulate turbulent ground-water flow conditions by: (1) coupling the traditional ground-water flow equation with formulations for a discrete network of cylindrical pipes (Mode 1), (2) inserting a high-conductivity flow layer that can switch between laminar and turbulent flow (Mode 2), or (3) simultaneously coupling a discrete pipe network while inserting a high-conductivity flow layer that can switch between laminar and turbulent flow (Mode 3). Conduit flow pipes (Mode 1) may represent dissolution or biological burrowing features in carbonate aquifers, voids in fractured rock, and (or) lava tubes in basaltic aquifers and can be fully or partially saturated under laminar or turbulent flow conditions. Preferential flow layers (Mode 2) may represent: (1) a porous media where turbulent flow is suspected to occur under the observed hydraulic gradients; (2) a single secondary porosity subsurface feature, such as a well-defined laterally extensive underground cave; or (3) a horizontal preferential flow layer consisting of many interconnected voids. In this second case, the input data are effective parameters, such as a very high hydraulic conductivity, representing multiple features. Data preparation is more complex for CFP Mode 1 (CFPM1) than for CFP Mode 2 (CFPM2). Specifically for CFPM1, conduit pipe locations, lengths, diameters, tortuosity, internal roughness, critical Reynolds numbers (NRe), and exchange conductances are required. CFPM1, however, solves the pipe network equations in a matrix that is independent of the porous media equation matrix, which may mitigate numerical instability associated with solution of dual flow components within the same matrix. 
CFPM2 requires less hydraulic information and knowledge about the specific location and hydraulic properties of conduits, and turbulent flow is approximated by
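
    The laminar/turbulent switching described for the CFP hinges on a critical Reynolds number. A hedged sketch using the textbook Hagen-Poiseuille and Blasius friction relations, which stand in for (and are not necessarily) the CFP's internal formulation:

```python
def reynolds(velocity, diameter, kinematic_viscosity):
    """Pipe-flow Reynolds number Re = v * d / nu (dimensionless)."""
    return velocity * diameter / kinematic_viscosity

def friction_factor(re, re_critical=2300.0):
    """Darcy friction factor, switching regime at the critical Reynolds number."""
    if re < re_critical:
        return 64.0 / re            # laminar (Hagen-Poiseuille)
    return 0.316 * re ** -0.25      # turbulent (Blasius, smooth pipe)

nu = 1.0e-6                        # kinematic viscosity of water near 20 C, m^2/s
slow = reynolds(0.001, 0.1, nu)    # Re ~ 100    -> laminar regime
fast = reynolds(0.5, 0.1, nu)      # Re ~ 50 000 -> turbulent regime
print(friction_factor(slow) > friction_factor(fast))  # True
```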

  13. RECOVERY OF DOCUMENT TEXT FROM TORN FRAGMENTS USING IMAGE PROCESSING

    OpenAIRE

    C. Prasad; Dr. Mahesh; Dr. S.A.K. Jilani

    2016-01-01

    Recovery of a document from its torn or damaged fragments plays an important role in the field of forensics and archival study. Manual reconstruction of torn papers with the help of glue, tape, etc. is tedious, time consuming and not satisfactory. For torn image reconstruction we use image mosaicing, where we reconstruct the image using features (corners) and RANSAC with homography. But for the torn fragments there is no such similar overlapping portion between fragments. Hence we propose a ...

  14. Process monitoring for intelligent manufacturing processes - Methodology and application to Robot Assisted Polishing

    DEFF Research Database (Denmark)

    Pilny, Lukas

    Process monitoring provides important information on the product, process and manufacturing system during part manufacturing. Such information can be used for process optimization and detection of undesired processing conditions to initiate timely actions for avoidance of defects, thereby improving... quality assurance. This thesis is aimed at a systematic development of process monitoring solutions, constituting a key element of intelligent manufacturing systems towards zero defect manufacturing. A methodological approach of general applicability is presented in this concern. The approach consists... of six consecutive steps for identification of product Vital Quality Characteristics (VQCs) and Key Process Variables (KPVs), selection and characterization of sensors, optimization of sensor placement, validation of the monitoring solutions, definition of the reference manufacturing performance

  15. Theoretical and Practical Aspects of Logistic Quality Management System Documentation Development Process

    Directory of Open Access Journals (Sweden)

    Linas Šaulinskas

    2013-12-01

    Full Text Available This paper addresses aspects of logistics quality management system documentation development and suggests models for quality management system documentation development, documentation hierarchical systems and authorization approval. It also identifies logistic processes and a responsibilities model and a detailed document development and approval process that can be practically applied. Our results are based upon an analysis of advanced Lithuanian and foreign corporate business practices, a review of current literature and recommendations for quality management system standards.

  16. Decision Support for Software Process Management Teams: An Intelligent Software Agent Approach

    National Research Council Canada - National Science Library

    Church, Lori

    2000-01-01

    ... to market, eliminate redundancy, and ease job stress. This thesis proposes a conceptual model for software process management decision support in the form of an intelligent software agent network...

  17. Integrated Intelligent Industrial Process Sensing and Control: Applied to and Demonstrated on Cupola Furnaces

    Energy Technology Data Exchange (ETDEWEB)

    Mohamed Abdelrahman; roger Haggard; Wagdy Mahmoud; Kevin Moore; Denis Clark; Eric Larsen; Paul King

    2003-02-12

    The final goal of this project was the development of a system that is capable of controlling an industrial process effectively through the integration of information obtained through intelligent sensor fusion and intelligent control technologies. The industry of interest in this project was the metal casting industry as represented by cupola iron-melting furnaces. However, the developed technology is of generic type and hence applicable to several other industries. The system was divided into the following four major interacting components: 1. An object-oriented generic architecture to integrate the developed software and hardware components 2. Generic algorithms for intelligent signal analysis and sensor and model fusion 3. Development of supervisory structure for integration of intelligent sensor fusion data into the controller 4. Hardware implementation of intelligent signal analysis and fusion algorithms
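    One classical building block behind sensor fusion of the kind described above is inverse-variance weighting, which combines redundant noisy readings into a single estimate more precise than any individual sensor. A minimal sketch (the function and its interface are illustrative, not the project's actual algorithms):

```python
def fuse_readings(measurements, variances):
    """Inverse-variance weighted fusion: each sensor reading is weighted
    by the reciprocal of its noise variance, so more reliable sensors
    contribute more to the combined estimate."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    estimate = sum(w * m for w, m in zip(weights, measurements)) / total
    fused_variance = 1.0 / total  # never larger than the best sensor's
    return estimate, fused_variance
```

    Fusing two hypothetical melt-zone temperature readings of 1510 and 1530 with noise variances 4.0 and 16.0 yields an estimate of 1514.0 with fused variance 3.2, tighter than either sensor alone.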

  18. SmartWeld/SmartProcess - intelligent model based system for the design and validation of welding processes

    Energy Technology Data Exchange (ETDEWEB)

    Mitchner, J.

    1996-04-01

    Diagrams are presented on an intelligent model based system for the design and validation of welding processes. Key capabilities identified include 'right the first time' manufacturing, continuous improvement, and on-line quality assurance.

  19. The Relationship between Multiple Intelligences with Preferred Science Teaching and Science Process Skills

    OpenAIRE

    Mohd Ali Samsudin; Noor Hasyimah Haniza; Corrienna Abdul-Talib; Hayani Marlia Mhd Ibrahim

    2015-01-01

    This study was undertaken to identify the relationship between multiple intelligences with preferred science teaching and science process skills. The design of the study is a survey using three questionnaires reported in the literature: Multiple Intelligences Questionnaire, Preferred Science Teaching Questionnaire and Science Process Skills Questionnaire. The study selected 300 primary school students from five (5) primary schools in Penang, Malaysia. The findings showed a relationship betwee...

  20. The application of neural networks with artificial intelligence technique in the modeling of industrial processes

    International Nuclear Information System (INIS)

    Saini, K. K.; Saini, Sanju

    2008-01-01

    Neural networks are a relatively new artificial intelligence technique that emulates the behavior of biological neural systems in digital software or hardware. These networks can automatically 'learn' complex relationships among data. This feature makes the technique very useful in modeling processes for which mathematical modeling is difficult or impossible. The work described here outlines some examples of the application of neural network techniques in the modeling of industrial processes.
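    The 'learning' described above amounts to iteratively adjusting weights to reduce prediction error. A deliberately tiny sketch, a single linear neuron trained by stochastic gradient descent in plain Python (illustrative only; real industrial models use multi-layer networks and dedicated libraries):

```python
def train_neuron(xs, ys, lr=0.05, epochs=500):
    """Fit y ~ w*x + b by stochastic gradient descent on squared error,
    the same principle by which neural networks 'learn' from data."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            err = (w * x + b) - y  # prediction error on one sample
            w -= lr * err * x      # gradient step for the weight
            b -= lr * err          # gradient step for the bias
    return w, b
```

    Trained on points sampled from y = 2x + 1, the neuron recovers w close to 2 and b close to 1 without the relationship ever being stated explicitly.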

  1. Financial intelligence of business process outsourcing professionals in Davao City, Philippines

    Directory of Open Access Journals (Sweden)

    Samantha Ferraren

    2016-12-01

    Full Text Available This research determined financial intelligence using the Kiyosaki Cashflow Quadrant, which classifies employees' financial intelligence as likely to be an investor, big business owner, self-employed or employed. Ordinal Regression was employed to determine the parameters of the chosen variables through Maximum Likelihood Estimation (MLE). The results showed that income is a significant factor in the financial intelligence of Business Process Outsourcing (BPO) employees; the higher the income, the better the financial intelligence. The type of BPO employer was also significantly related to financial intelligence: employees in financial services had higher financial intelligence than those in non-financial services. There is a significant relationship between financial literacy and financial intelligence, although some rank-and-file employees who earned less may still be classified as investors because of their behavior towards money. A financial wellness program for BPO employees in financial and non-financial services alike was recommended to improve financial intelligence and help them achieve financial freedom.

  2. Artificial Intelligence in ADA: Pattern-Directed Processing. Final Report.

    Science.gov (United States)

    Reeker, Larry H.; And Others

    To demonstrate to computer programmers that the programming language Ada provides superior facilities for use in artificial intelligence applications, the three papers included in this report investigate the capabilities that exist within Ada for "pattern-directed" programming. The first paper (Larry H. Reeker, Tulane University) is…

  3. Is general intelligence little more than the speed of higher-order processing?

    Science.gov (United States)

    Schubert, Anna-Lena; Hagemann, Dirk; Frischkorn, Gidon T

    2017-10-01

    Individual differences in the speed of information processing have been hypothesized to give rise to individual differences in general intelligence. Consistent with this hypothesis, reaction times (RTs) and latencies of event-related potentials have been shown to be moderately associated with intelligence. These associations have been explained either in terms of individual differences in some brain-wide property such as myelination, the speed of neural oscillations, or white-matter tract integrity, or in terms of individual differences in specific processes such as the signal-to-noise ratio in evidence accumulation, executive control, or the cholinergic system. Here we show in a sample of 122 participants, who completed a battery of RT tasks at 2 laboratory sessions while an EEG was recorded, that more intelligent individuals have a higher speed of higher-order information processing that explains about 80% of the variance in general intelligence. Our results do not support the notion that individuals with higher levels of general intelligence show advantages in some brain-wide property. Instead, they suggest that more intelligent individuals benefit from a more efficient transmission of information from frontal attention and working memory processes to temporal-parietal processes of memory storage. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  4. Functioning of memory and attention processes in children with intelligence below average

    Directory of Open Access Journals (Sweden)

    Aneta Rita Borkowska

    2014-05-01

    Full Text Available BACKGROUND The aim of the research was to assess memorization and recall of logically connected and unconnected material, coded graphically and linguistically, and the ability to focus attention, in a group of children with intelligence below average, compared to children with average intelligence. PARTICIPANTS AND PROCEDURE The study group included 27 children with intelligence below average. The control group consisted of 29 individuals. All of them were examined using the authors’ experimental trials and the TUS test (Attention and Perceptiveness Test). RESULTS Children with intelligence below average memorized significantly less information contained in the logical material, demonstrated lower ability to memorize the visual material, memorized significantly fewer words in the verbal material learning task, achieved lower results in such indicators of the visual attention process pace as the number of omissions and mistakes, and had a lower pace of perceptual work, compared to children with average intelligence. CONCLUSIONS The results confirm that children with intelligence below average have difficulties with memorizing new material, both logically connected and unconnected. The significantly lower capacity of direct memory is independent of modality. The results of the study on the memory process confirm the hypothesis about lower abilities of children with intelligence below average, in terms of concentration, work pace, efficiency and perception.

  5. DEVELOPMENT OF TECHNOLOGY AND REGULATORY DOCUMENTATION ON PROCESSED BROCCOLI PRODUCT

    Directory of Open Access Journals (Sweden)

    T. I. Kryachko

    2017-01-01

    Full Text Available The aim of the present investigation was the development of an efficient technology for obtaining powders from fresh broccoli; determination of the possibility of using domestic production of broccoli as an import-substituting product; and development of regulatory documentation for broccoli powders for the food industry. The research was carried out jointly with representatives of the Federal Scientific Center of Vegetable Production on an experimental basis in 2016. The domestic Tonus variety of broccoli (Federal Scientific Center of Vegetable Production) and the Maraton F1 hybrid (France), differing in appearance, vegetative period, and biochemical and physical characteristics, were chosen. A technology for broccoli powder production from domestic and imported products was developed using two drying methods: convection and lyophilization. The gentle conditions of freeze drying, compared to convective drying, preserved a higher content of both vitamin C and polyphenols in the final powder. Comparative studies of the organoleptic and physico-chemical properties of powders obtained from domestic and imported broccoli demonstrated close quality parameters, indicating the possibility of effective domestic broccoli utilization and import substitution. For the first time in the Russian Federation, an "Organization Standard" was developed for regulation of the quality parameters of broccoli powders intended for use in the food industry.

  6. Forensic intelligence framework. Part II: Study of the main generic building blocks and challenges through the examples of illicit drugs and false identity documents monitoring.

    Science.gov (United States)

    Baechler, Simon; Morelato, Marie; Ribaux, Olivier; Beavis, Alison; Tahtouh, Mark; Kirkbride, K Paul; Esseiva, Pierre; Margot, Pierre; Roux, Claude

    2015-05-01

    The development of forensic intelligence relies on the expression of suitable models that better represent the contribution of forensic intelligence in relation to the criminal justice system, policing and security. Such models assist in comparing and evaluating methods and new technologies, provide transparency and foster the development of new applications. Interestingly, strong similarities between two separate projects focusing on specific forensic science areas were recently observed. These observations have led to the induction of a general model (Part I) that could guide the use of any forensic science case data in an intelligence perspective. The present article builds upon this general approach by focusing on decisional and organisational issues. The article investigates the comparison process and evaluation system that lie at the heart of the forensic intelligence framework, advocating scientific decision criteria and a structured but flexible and dynamic architecture. These building blocks are crucial and clearly lie within the expertise of forensic scientists. However, it is only part of the problem. Forensic intelligence includes other blocks with their respective interactions, decision points and tensions (e.g. regarding how to guide detection and how to integrate forensic information with other information). Formalising these blocks identifies many questions and potential answers. Addressing these questions is essential for the progress of the discipline. Such a process requires clarifying the role and place of the forensic scientist within the whole process and their relationship to other stakeholders. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  7. The role of across-frequency envelope processing for speech intelligibility

    DEFF Research Database (Denmark)

    Chabot-Leclerc, Alexandre; Jørgensen, Søren; Dau, Torsten

    2013-01-01

    Speech intelligibility models consist of a preprocessing part that transforms the stimuli into some internal (auditory) representation, and a decision metric that quantifies effects of transmission channel, speech interferers, and auditory processing on the speech intelligibility. Here, two recent...... speech intelligibility models, the spectro-temporal modulation index [STMI; Elhilali et al. (2003)] and the speech-based envelope power spectrum model [sEPSM; Jørgensen and Dau (2011)] were evaluated in conditions of noisy speech subjected to reverberation, and to nonlinear distortions through either...

  9. Agriculture and Food Processes Branch program summary document

    Energy Technology Data Exchange (ETDEWEB)

    None

    1980-06-01

    The work of the Agriculture and Food Processes Branch within the US DOE's Office of Industrial Programs is discussed and reviewed. The Branch is responsible for assisting the food and agricultural sectors of the economy in increasing their energy efficiency by cost sharing with industry the development and demonstration of technologies industry by itself would not develop because of a greater than normal risk factor, but which have significant energy conservation benefits. This task is made more difficult by the diversity of agriculture and the food industry. The focus of the program is now on the development and demonstration of energy conservation technology in high energy use industry sectors and agricultural functions (e.g., sugar processing, meat processing, irrigation, and crop drying), high energy use functions common to many sectors of the food industry (e.g., refrigeration, drying, and evaporation), and innovative concepts (e.g., energy integrated farm systems). Specific projects within the program are summarized. (LCL)

  10. Mothers' daily person and process praise: implications for children's theory of intelligence and motivation.

    Science.gov (United States)

    Pomerantz, Eva M; Kempner, Sara G

    2013-11-01

    This research examined if mothers' day-to-day praise of children's success in school plays a role in children's theory of intelligence and motivation. Participants were 120 children (mean age = 10.23 years) and their mothers who took part in a 2-wave study spanning 6 months. During the first wave, mothers completed a 10-day daily interview in which they reported on their use of person (e.g., "You are smart") and process (e.g., "You tried hard") praise. Children's entity theory of intelligence and preference for challenge in school were assessed with surveys at both waves. Mothers' person, but not process, praise was predictive of children's theory of intelligence and motivation: The more person praise mothers used, the more children subsequently held an entity theory of intelligence and avoided challenge over and above their earlier functioning on these dimensions.

  11. UMTRA Ground Water Project management action process document

    International Nuclear Information System (INIS)

    1996-03-01

    A critical U.S. Department of Energy (DOE) mission is to plan, implement, and complete DOE Environmental Restoration (ER) programs at facilities that were operated by or in support of the former Atomic Energy Commission (AEC). These facilities include the 24 inactive processing sites the Uranium Mill Tailings Radiation Control Act (UMTRCA) (42 USC Section 7901 et seq.) identified as Title I sites, which had operated from the late 1940s through the 1970s. In UMTRCA, Congress acknowledged the potentially harmful health effects associated with uranium mill tailings and directed the DOE to stabilize, dispose of, and control the tailings in a safe and environmentally sound manner. The UMTRA Surface Project deals with buildings, tailings, and contaminated soils at the processing sites and any associated vicinity properties (VP). Surface remediation at the processing sites will be completed in 1997 when the Naturita, Colorado, site is scheduled to be finished. The UMTRA Ground Water Project was authorized in an amendment to the UMTRCA (42 USC Section 7922(a)), when Congress directed DOE to comply with U.S. Environmental Protection Agency (EPA) ground water standards. The UMTRA Ground Water Project addresses any contamination derived from the milling operation that is determined to be present at levels above the EPA standards

  12. Intelligent methods for the process parameter determination of plastic injection molding

    Science.gov (United States)

    Gao, Huang; Zhang, Yun; Zhou, Xundao; Li, Dequn

    2018-03-01

    Injection molding is one of the most widely used material processing methods in producing plastic products with complex geometries and high precision. The determination of process parameters is important in obtaining qualified products and maintaining product quality. This article reviews the recent studies and developments of the intelligent methods applied in the process parameter determination of injection molding. These intelligent methods are classified into three categories: case-based reasoning methods, expert system-based methods, and data fitting and optimization methods. A framework of process parameter determination is proposed after comprehensive discussions. Finally, the conclusions and future research topics are discussed.

  13. Cognitive and emotional demands of black humour processing: the role of intelligence, aggressiveness and mood.

    Science.gov (United States)

    Willinger, Ulrike; Hergovich, Andreas; Schmoeger, Michaela; Deckert, Matthias; Stoettner, Susanne; Bunda, Iris; Witting, Andrea; Seidler, Melanie; Moser, Reinhilde; Kacena, Stefanie; Jaeckle, David; Loader, Benjamin; Mueller, Christian; Auff, Eduard

    2017-05-01

    Humour processing is a complex information-processing task that is dependent on cognitive and emotional aspects which presumably influence frame-shifting and conceptual blending, mental operations that underlie humour processing. The aim of the current study was to find distinctive groups of subjects with respect to black humour processing, intellectual capacities, mood disturbance and aggressiveness. A total of 156 adults rated black humour cartoons and completed measurements of verbal and nonverbal intelligence, mood disturbance and aggressiveness. Cluster analysis yielded three groups with the following properties: (1) moderate black humour preference and moderate comprehension; average nonverbal and verbal intelligence; low mood disturbance and moderate aggressiveness; (2) low black humour preference and moderate comprehension; average nonverbal and verbal intelligence, high mood disturbance and high aggressiveness; and (3) high black humour preference and high comprehension; high nonverbal and verbal intelligence; no mood disturbance and low aggressiveness. Age and gender did not differ significantly between the groups, but differences in education level were found. Black humour preference and comprehension are positively associated with higher verbal and nonverbal intelligence as well as higher levels of education. Emotional instability and higher aggressiveness apparently lead to decreased levels of pleasure when dealing with black humour. These results support the hypothesis that humour processing involves cognitive as well as affective components and suggest that these variables influence the execution of frame-shifting and conceptual blending in the course of humour processing.

  14. Realization Of Algebraic Processor For XML Documents Processing

    International Nuclear Information System (INIS)

    Georgiev, Bozhidar; Georgieva, Adriana

    2010-01-01

    In this paper, some possibilities are presented concerning the implementation of an algebraic method for XML hierarchical data processing which speeds up the XML search mechanism. A different point of view is offered for the creation of an advanced algebraic processor (with all necessary software tools and programming modules respectively). This nontraditional approach to fast XML navigation with the presented algebraic processor may help to build an easier, user-friendly interface providing XML transformations, which can avoid the difficulties of the complicated language constructions of XSL, XSLT and XPath. The approach allows comparatively simple search of XML hierarchical data by means of two types of functions: specification functions and so-called built-in functions. The choice of the programming language Java may appear strange at first, but it isn't when you consider that the applications can run on different kinds of computers. The specific search mechanism based on linear algebra theory is faster in comparison with MSXML parsers (by about 30% on the basis of the developed examples). There also exists the possibility of creating new software tools based on linear algebra theory, which cover the whole range of navigation and search techniques characterizing XSLT/XPath. The proposed method is able to replace more complicated operations in other SOA components.
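    Independently of the algebraic machinery proposed here, the kind of hierarchical navigation and predicate-based selection being targeted can be illustrated with Python's standard-library XML tools (a generic sketch with made-up sample data, not the authors' processor):

```python
import xml.etree.ElementTree as ET

# Hypothetical sample document
doc = """<library>
  <book id="b1"><title>Linear Algebra</title><year>2005</year></book>
  <book id="b2"><title>XSLT Basics</title><year>2010</year></book>
</library>"""

root = ET.fromstring(doc)

# Navigation: collect a child value from every matching node
titles = [book.findtext("title") for book in root.findall("book")]

# Selection with a predicate, analogous to an XPath condition
recent = [book.get("id")
          for book in root.findall("book")
          if int(book.findtext("year")) > 2006]
```

    Here `titles` evaluates to `['Linear Algebra', 'XSLT Basics']` and `recent` to `['b2']`; an XPath engine or the proposed algebraic processor expresses the same two operations as path and predicate expressions.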

  15. Relationships among processing speed, working memory, and fluid intelligence in children.

    Science.gov (United States)

    Fry, A F; Hale, S

    2000-10-01

    The present review focuses on three issues: (a) the time course of developmental increases in cognitive abilities; (b) the impact of age on individual differences in these abilities; and (c) the mechanisms by which developmental increases in different aspects of cognition affect each other. We conclude from our review of the literature that the development of processing speed, working memory, and fluid intelligence all follow a similar time course, suggesting that all three abilities develop in concert. Furthermore, the strength of the correlation between speed and intelligence does not appear to change with age, and most of the effect of the age-related increase in speed on intelligence appears to be mediated through the effect of speed on working memory. Finally, most of the effect of the age-related improvement in working memory on intelligence is itself attributable to the effect of the increase in speed on working memory, providing evidence of a cognitive developmental cascade.

  16. Documentation of pain care processes does not accurately reflect pain management delivered in primary care.

    Science.gov (United States)

    Krebs, Erin E; Bair, Matthew J; Carey, Timothy S; Weinberger, Morris

    2010-03-01

    Researchers and quality improvement advocates sometimes use review of chart-documented pain care processes to assess the quality of pain management. Studies have found that primary care providers frequently fail to document pain assessment and management. To assess documentation of pain care processes in an academic primary care clinic and evaluate the validity of this documentation as a measure of pain care delivered. Prospective observational study. 237 adult patients at a university-affiliated internal medicine clinic who reported any pain in the last week. Immediately after a visit, we asked patients to report the pain treatment they received. Patients completed the Brief Pain Inventory (BPI) to assess pain severity at baseline and 1 month later. We extracted documentation of pain care processes from the medical record and used kappa statistics to assess agreement between documentation and patient report of pain treatment. Using multivariable linear regression, we modeled whether documented or patient-reported pain care predicted change in pain at 1 month. Participants' mean age was 53.7 years, 66% were female, and 74% had chronic pain. Physicians documented pain assessment for 83% of visits. Patients reported receiving pain treatment more often (67%) than was documented by physicians (54%). Agreement between documentation and patient report was moderate for receiving a new pain medication (k = 0.50) and slight for receiving pain management advice (k = 0.13). In multivariable models, documentation of new pain treatment was not associated with change in pain (p = 0.134). In contrast, patient-reported receipt of new pain treatment predicted pain improvement (p = 0.005). Chart documentation underestimated pain care delivered, compared with patient report. Documented pain care processes had no relationship with pain outcomes at 1 month, but patient report of receiving care predicted clinically significant improvement. Chart review measures may not accurately ...
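    The kappa statistics quoted above (k = 0.50, k = 0.13) measure agreement beyond chance. A minimal pure-Python sketch of Cohen's kappa for two raters (illustrative only; the study would have used standard statistical software). Values of 1 indicate perfect agreement and 0 chance-level agreement:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: observed agreement corrected for the agreement
    expected by chance from each rater's marginal label frequencies."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)
```

    For example, two raters who agree on 4 of 6 binary judgments with balanced marginals obtain kappa = 1/3: only slight agreement once chance is accounted for, even though raw agreement is 67%.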

  17. Intelligent techniques in signal processing for multimedia security

    CERN Document Server

    Santhi, V

    2017-01-01

    This book proposes new algorithms to ensure secured communications and prevent unauthorized data exchange in secured multimedia systems. Focusing on numerous applications’ algorithms and scenarios, it offers an in-depth analysis of data hiding technologies including watermarking, cryptography, encryption, copy control, and authentication. The authors present a framework for visual data hiding technologies that resolves emerging problems of modern multimedia applications in several contexts including the medical, healthcare, education, and wireless communication networking domains. Further, it introduces several intelligent security techniques with real-time implementation. As part of its comprehensive coverage, the book discusses contemporary multimedia authentication and fingerprinting techniques, while also proposing personal authentication/recognition systems based on hand images, surveillance system security using gait recognition, face recognition under restricted constraints such as dry/wet face condi...

  18. The remarkable cell: Intelligently designed or by evolutionary process?

    Directory of Open Access Journals (Sweden)

    Mark Pretorius

    2013-02-01

    Full Text Available The objective of this article was to deal with the challenging theme of the Origin of Life. Science has been arguing the when and how of the beginning of life for centuries. It is a subject which remains perplexing despite all the technological advances made in science. The first part of the article dealt with the idea of a universe and earth divinely created to sustain life. The second part dealt with the premise that the first life forms were the miraculous work of an intelligent designer, which is revealed by the sophisticated and intricate design of these first life forms. The article concluded with an explanation that these life forms are in stark contrast to the idea of a random Darwinian type evolution for life's origin, frequently referred to as abiogenesis or spontaneous generation.

  19. Towards a New Approach of the Economic Intelligence Process: Basic Concepts, Analysis Methods and Informational Tools

    Directory of Open Access Journals (Sweden)

    Sorin Briciu

    2009-04-01

    Full Text Available One of the obvious trends in the current business environment is increased competition. In this context, organizations are becoming more and more aware of the importance of knowledge as a key factor in obtaining competitive advantage. A possible solution in knowledge management is Economic Intelligence (EI), which involves the collection, evaluation, processing, analysis, and dissemination of economic data (about products, clients, competitors, etc.) inside organizations. The availability of massive quantities of data, correlated with advances in information and communication technology allowing for the filtering and processing of these data, provides new tools for the production of economic intelligence. The research is focused on innovative aspects of the economic intelligence process (models of analysis, activities, methods and informational tools) and provides practical guidelines for initiating this process. In this paper, we try: (a) to contribute to a coherent view on the economic intelligence process (approaches, stages, fields of application); (b) to describe the most important models of analysis related to this process; and (c) to analyze the activities, methods and tools associated with each stage of an EI process.

  20. Using Software Zelio Soft in the Educational Process to Simulate Control Programs for Intelligent Relays

    Science.gov (United States)

    Michalik, Peter; Mital, Dusan; Zajac, Jozef; Brezikova, Katarina; Duplak, Jan; Hatala, Michal; Radchenko, Svetlana

    2016-10-01

    The article deals with the use of intelligent relays and PLC systems in practice, with their architecture, and with the principles of programming and simulation for the education process at all types of schools, from secondary schools to universities. The aim of the article is to propose simple example applications that demonstrate the programming methodology on real, simple practical examples and show the use of chosen instructions. The practical part describes the process of creating schematics and function blocks, together with methodologies for creating programs and simulating output reactions to changing inputs for intelligent relays.

  1. Application of process improvement principles to increase the frequency of complete airway management documentation.

    Science.gov (United States)

    McCarty, L Kelsey; Saddawi-Konefka, Daniel; Gargan, Lauren M; Driscoll, William D; Walsh, John L; Peterfreund, Robert A

    2014-12-01

    Process improvement in healthcare delivery settings can be difficult, even when there is consensus among clinicians about a clinical practice or desired outcome. Airway management is a medical intervention fundamental to the delivery of anesthesia care. Like other medical interventions, a detailed description of the management methods should be documented. Despite this expectation, airway documentation is often insufficient. The authors hypothesized that formal adoption of process improvement methods could be used to increase the rate of "complete" airway management documentation. The authors defined a set of criteria as a local practice standard of "complete" airway management documentation. The authors then employed selected process improvement methodologies over 13 months in three iterative and escalating phases to increase the percentage of records with complete documentation. The criteria were applied retrospectively to determine the baseline frequency of complete records, and prospectively to measure the impact of process improvement efforts over the three phases of implementation. Immediately before the initial intervention, a retrospective review of 23,011 general anesthesia cases over 6 months showed that 13.2% of patient records included complete documentation. At the conclusion of the 13-month improvement effort, documentation improved to a completion rate of 91.6%. Formal process improvement methodologies can improve airway documentation and may be similarly effective in improving other areas of anesthesia clinical practice.

  2. "Intelligent" tools for workflow process redesign : a research agenda

    NARCIS (Netherlands)

    Netjes, M.; Vanderfeesten, I.T.P.; Reijers, H.A.; Bussler, C.; Haller, A.

    2006-01-01

    Although much attention has been paid to business processes during the past decades, the design of business processes, and particularly workflow processes, is still more art than science. In this workshop paper, we present our view on modeling methods for workflow processes and introduce our research...

  3. Competitive Intelligence.

    Science.gov (United States)

    Bergeron, Pierrette; Hiller, Christine A.

    2002-01-01

    Reviews the evolution of competitive intelligence since 1994, including terminology and definitions and analytical techniques. Addresses the issue of ethics; explores how information technology supports the competitive intelligence process; and discusses education and training opportunities for competitive intelligence, including core competencies…

  4. Application of ConceptDraw Office for planning, documenting, monitoring of operating processes

    International Nuclear Information System (INIS)

    Bocharnikov, O.P.; Savenko, S.V.; Nikiforov, N.S.

    2011-01-01

    ConceptDraw Office allows the following tasks to be accomplished effectively: planning, designing, and controlling the implementation of production processes; preparing working documents (engineering diagrams, process flow diagrams, evacuation and emergency plans); creating dashboards with information for organization management; and distributing the organization's resources to increase the efficiency and safety of operating processes.

  5. Processing Speed and Intelligence as Predictors of School Achievement: Mediation or Unique Contribution?

    Science.gov (United States)

    Dodonova, Yulia A.; Dodonov, Yury S.

    2012-01-01

    The relationships between processing speed, intelligence, and school achievement were analyzed on a sample of 184 Russian 16-year-old students. Two speeded tasks required the discrimination of simple geometrical shapes and the recognition of the presented meaningless figures. Raven's Advanced Progressive Matrices and the verbal subtests of…

  6. Optimization of chemical composition in the manufacturing process of flotation balls based on intelligent soft sensing

    Directory of Open Access Journals (Sweden)

    Dučić Nedeljko

    2016-01-01

    Full Text Available This paper presents an application of computational intelligence in the modeling and optimization of parameters of two related production processes: ore flotation and the production of balls for ore flotation. It is proposed that the desired chemical composition of flotation balls (Mn=0.69%; Cr=2.247%; C=3.79%; Si=0.5%), which ensures a minimum wear rate (0.47 g/kg) during copper milling, is determined by combining an artificial neural network (ANN) and a genetic algorithm (GA). Based on the results provided by the neuro-genetic combination, a second neural network was derived as an 'intelligent soft sensor' in the process of white cast iron production. The proposed ANN 12-16-12-4 model demonstrated favourable prediction capacity, and can be recommended as an 'intelligent soft sensor' in the alloying process intended for obtaining a favourable chemical composition of white cast iron for the production of flotation balls. In the development of the intelligent soft sensor, data from the two real production processes were used. [Project of the Serbian Ministry of Science, grants no. TR35037 and no. TR35015]
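The neuro-genetic combination described above can be sketched minimally: a genetic algorithm searches candidate compositions, and a surrogate model, standing in for the trained ANN, scores each candidate by predicted wear rate. The surrogate below is a toy quadratic with a minimum placed at the paper's reported optimum; it is illustrative only and is not the authors' 12-16-12-4 network.

```python
import random

# Toy surrogate standing in for the trained ANN: predicted wear grows with
# squared distance from the reported optimal composition
# (Mn=0.69, Cr=2.247, C=3.79, Si=0.5). Illustrative only.
TARGET = (0.69, 2.247, 3.79, 0.5)

def surrogate_wear(x):
    return 0.47 + sum((a - b) ** 2 for a, b in zip(x, TARGET))

def evolve(pop_size=40, generations=80, seed=1):
    """Minimal elitist GA minimizing the surrogate's predicted wear."""
    rng = random.Random(seed)
    pop = [[rng.uniform(0, 5) for _ in range(4)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=surrogate_wear)          # lower predicted wear is better
        parents = pop[: pop_size // 2]
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, 4)
            child = a[:cut] + b[cut:]         # one-point crossover
            i = rng.randrange(4)
            child[i] += rng.gauss(0, 0.1)     # Gaussian mutation of one gene
            children.append(child)
        pop = parents + children              # elitism: parents survive
    return min(pop, key=surrogate_wear)
```

Running `evolve()` drives the population toward the composition with minimum predicted wear; in the paper the same loop queries the trained ANN instead of a toy function.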

  7. A Multidirectional Model for Assessing Learning Disabled Students' Intelligence: An Information-Processing Framework.

    Science.gov (United States)

    Swanson, H. Lee

    1982-01-01

    An information-processing approach to the assessment of learning disabled students' intellectual performance is presented. The model is based on the assumption that intelligent behavior comprises a variety of problem-solving strategies. An account of child problem solving is explained and illustrated with a "thinking aloud" protocol.…

  8. The influence of masker type on early reflection processing and speech intelligibility (L)

    DEFF Research Database (Denmark)

    Arweiler, Iris; Buchholz, Jörg M.; Dau, Torsten

    2013-01-01

    Arweiler and Buchholz [J. Acoust. Soc. Am. 130, 996-1005 (2011)] showed that, while the energy of early reflections (ERs) in a room improves speech intelligibility, the benefit is smaller than that provided by the energy of the direct sound (DS). In terms of integration of ERs and DS, binaural listening did not provide a benefit from ERs apart from a binaural energy summation, such that monaural auditory processing could account for the data. However, a diffuse speech-shaped noise (SSN) was used in the speech intelligibility experiments, which does not provide distinct binaural cues to the auditory system. In the present study, the monaural and binaural benefit from ERs for speech intelligibility was investigated using three directional maskers presented from 90° azimuth: an SSN, a multi-talker babble, and a reversed two-talker masker, for normal-hearing as well as hearing-impaired listeners...

  9. Integrated Intelligent Modeling, Design and Control of Crystal Growth Processes

    National Research Council Canada - National Science Library

    Prasad, V

    2000-01-01

    ... This MURI program took an integrated approach towards modeling, design and control of crystal growth processes and, in conjunction with growth and characterization experiments, developed much better...

  10. Intelligence and treaty ratification

    International Nuclear Information System (INIS)

    Sojka, G.L.

    1990-01-01

    What did the intelligence community and the Intelligence Committee do poorly in regard to the treaty ratification process for arms control? We failed to solve the compartmentalization problem. This is a second-order problem, and, in general, analysts try to be very open; but there are problems nevertheless. There are very few, if any, people within the intelligence community who are cleared for everything relevant to our monitoring capability (short of, probably, the Director of Central Intelligence and the president), and this is a major problem. The formal monitoring estimates are drawn up by individuals who do not have access to all the information needed to make the monitoring judgements. This paper reports that the intelligence community did not present a formal document on either Soviet incentives or disincentives to cheat or on possible cheating scenarios, and that was a mistake. However, the intelligence community was very responsive in producing those types of estimates and, ultimately, the evidence behind them in response to questions. Nevertheless, the author thinks the intelligence community would do well to address this issue up front before a treaty is submitted to the Senate for advice and consent.

  11. Tank waste remediation system privatization infrastructure program requirements and document management process guide

    International Nuclear Information System (INIS)

    ROOT, R.W.

    1999-01-01

    This guide provides the Tank Waste Remediation System Privatization Infrastructure Program management with processes and requirements to appropriately control information and documents in accordance with the Tank Waste Remediation System Configuration Management Plan (Vann 1998b). This includes documents and information created by the program, as well as non-program generated materials submitted to the project. It provides appropriate approval/control, distribution and filing systems

  12. 22 CFR 92.94 - Replying to inquiries regarding service of process or other documents.

    Science.gov (United States)

    2010-04-01

    ... regarding the service of legal process or documents of like nature, and should render such assistance as... If the person upon whom the process is intended to be served is known to be willing to accept service...

  13. Modeling and Control of Multivariable Process Using Intelligent Techniques

    Directory of Open Access Journals (Sweden)

    Subathra Balasubramanian

    2010-10-01

    Full Text Available For nonlinear dynamic systems, first-principles-based modeling and control is difficult to implement. In this study, a fuzzy controller and a recurrent fuzzy controller are developed for a MIMO process. A fuzzy logic controller is a model-free controller designed based on knowledge about the process. Two types of rule-based fuzzy models are available: the linguistic (Mamdani) model and the Takagi-Sugeno model. Of these two, the Takagi-Sugeno (TS) model has attracted the most attention. The application of fuzzy controllers is limited to static processes due to their feedforward structure. However, most real-time processes are dynamic, and they require the history of input/output data. In order to store past values, a memory unit is needed, which is introduced by the recurrent structure. The proposed recurrent fuzzy structure is used to develop a controller for a two-tank heating process. Both controllers are designed and implemented in a real-time environment and their performance is compared.
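A Takagi-Sugeno controller of the kind described can be sketched minimally: rules fire in parallel with fuzzy membership on the input, and the output is the firing-strength-weighted average of the rule consequents. The membership ranges and consequent values below are illustrative, not taken from the paper.

```python
# Minimal Takagi-Sugeno sketch: triangular membership on a single error
# input, constant consequents, weighted-average defuzzification.

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

RULES = [
    (lambda e: tri(e, -2.0, -1.0, 0.0), -1.0),  # error negative -> decrease actuator
    (lambda e: tri(e, -1.0,  0.0, 1.0),  0.0),  # error near zero -> hold
    (lambda e: tri(e,  0.0,  1.0, 2.0),  1.0),  # error positive -> increase actuator
]

def ts_output(error: float) -> float:
    """Firing-strength-weighted average of the rule consequents."""
    weights = [mf(error) for mf, _ in RULES]
    total = sum(weights)
    if total == 0.0:
        return 0.0
    return sum(w * u for w, (_, u) in zip(weights, RULES)) / total
```

The recurrent variant in the paper additionally feeds past outputs back into the rule inputs; this sketch shows only the static feedforward inference step.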

  14. Business Intelligence Applied to the ALMA Software Integration Process

    Science.gov (United States)

    Zambrano, M.; Recabarren, C.; González, V.; Hoffstadt, A.; Soto, R.; Shen, T.-C.

    2012-09-01

    Software quality assurance and planning of an astronomy project is a complex task, especially if it is a distributed collaborative project such as ALMA, where the development centers are spread across the globe. When you execute a software project there is much valuable information about the process itself that you might be able to collect. One of the ways you can receive this input is via an issue tracking system that gathers the problem reports relative to software bugs captured during the testing of the software, during the integration of the different components, or, even worse, problems that occurred during production time. Usually, there is little time spent on analyzing them, but with some multidimensional processing you can extract valuable information from them, and it might help you with long-term planning and resource allocation. We present an analysis of the information collected at ALMA from a collection of key unbiased indicators. We describe here the extraction, transformation and load process and how the data was processed. The main goal is to assess a software process and get insights from this information.
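The multidimensional processing of problem reports described above amounts to rolling counts up along chosen dimensions. A minimal sketch, with hypothetical field names rather than the ALMA tracker's actual schema:

```python
from collections import Counter

# Sketch: roll problem reports up along arbitrary dimensions (subsystem,
# phase, ...). Field names are hypothetical illustrations.
def rollup(reports, *dimensions):
    """Count reports along a combination of dimensions."""
    return Counter(tuple(r[d] for d in dimensions) for r in reports)

reports = [
    {"subsystem": "correlator", "phase": "integration"},
    {"subsystem": "correlator", "phase": "production"},
    {"subsystem": "archive",    "phase": "integration"},
]
by_phase = rollup(reports, "phase")
```

Cross-tabulating the same reports along two dimensions at once, e.g. `rollup(reports, "subsystem", "phase")`, is the kind of slice that feeds long-term planning indicators.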

  15. Intelligent process mapping through systematic improvement of heuristics

    Science.gov (United States)

    Ieumwananonthachai, Arthur; Aizawa, Akiko N.; Schwartz, Steven R.; Wah, Benjamin W.; Yan, Jerry C.

    1992-01-01

    The present system for the automatic learning/evaluation of novel heuristic methods applicable to the mapping of communication-process sets on a computer network has its basis in the testing of a population of competing heuristic methods within a fixed time constraint. The TEACHER 4.1 prototype learning system, implemented for learning new post-game analysis heuristic methods, iteratively generates and refines the mappings of a set of communicating processes on a computer network. A systematic exploration of the space of possible heuristic methods is shown to promise significant improvement.
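The generate-and-test idea, evaluating a population of competing heuristics within a fixed budget and refining the best performers, can be sketched as follows. The scalar "heuristic" and scoring function are toy stand-ins, not TEACHER's actual heuristic representation.

```python
import random

# Sketch of population-based heuristic improvement: evaluate candidates on
# benchmark instances, keep the best third, and refine them with small
# perturbations. The heuristic here is a single tunable scalar (toy).
def evaluate(heuristic, instances):
    """Mean score of a heuristic on the benchmark instances (higher is better)."""
    return sum(-abs(heuristic - inst) for inst in instances) / len(instances)

def improve(instances, rounds=20, pop=12, seed=0):
    rng = random.Random(seed)
    candidates = [rng.uniform(0, 10) for _ in range(pop)]
    for _ in range(rounds):
        candidates.sort(key=lambda h: evaluate(h, instances), reverse=True)
        survivors = candidates[: pop // 3]
        candidates = survivors + [h + rng.gauss(0, 0.5)   # perturbed offspring
                                  for h in survivors for _ in range(2)]
    return max(candidates, key=lambda h: evaluate(h, instances))
```

A fixed `rounds` budget plays the role of the time constraint under which the competing heuristics are tested.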

  16. QuikForm: Intelligent deformation processing of structural alloys

    Energy Technology Data Exchange (ETDEWEB)

    Bourcier, R.J.; Wellman, G.W.

    1994-09-01

    There currently exists a critical need for tools to enhance the industrial competitiveness and agility of US industries involved in deformation processing of structural alloys. In response to this need, Sandia National Laboratories has embarked upon the QuikForm Initiative. The goal of this program is the development of computer-based tools to facilitate the design of deformation processing operations. The authors are currently focusing their efforts on the definition/development of a comprehensive system for the design of sheet metal stamping operations. The overall structure of the proposed QuikForm system is presented, and the focus of their thrust in each technical area is discussed.

  17. Intelligent workflow driven processing for electronic mail management

    African Journals Online (AJOL)

    Email has been one of the most efficient means of electronic communication for many years, and email management has become a critical issue due to congestion. Different clients/individuals encounter problems while processing their emails due to the large volume of email received and the large number of requests to be replied to.

  18. Enhancing the Scientific Process with Artificial Intelligence: Forest Science Applications

    Science.gov (United States)

    Ronald E. McRoberts; Daniel L. Schmoldt; H. Michael Rauscher

    1991-01-01

    Forestry, as a science, is a process for investigating nature. It consists of repeatedly cycling through a number of steps, including identifying knowledge gaps, creating knowledge to fill them, and organizing, evaluating, and delivering this knowledge. Much of this effort is directed toward creating abstract models of natural phenomena. The cognitive techniques of AI...

  19. Personality and Information Processing Speed: Independent Influences on Intelligent Performance

    Science.gov (United States)

    Bates, Timothy C.; Rock, Andrew

    2004-01-01

    Raven's matrices and inspection time (IT) were recorded from 56 subjects under five arousal levels. Raven's and IT correlated strongly (r = -0.7) as predicted by processing-speed theories of "g." In line with Eysenck's [Eysenck, H. J. (1967). "The biological basis of personality". Springfield, IL: Thomas] arousal theory of extraversion, there was…

  20. The Use Of Computer Intelligent Processing Technologies Among ...

    African Journals Online (AJOL)

    This paper assesses the awareness and usage of a novel approach to data and information processing among scientists, researchers and students in the field of environmental sciences. In-depth, structured interviews were conducted, targeting a population working on a variety of environmental issues. The data ...

  1. Intelligent process control of fiber chemical vapor deposition

    Science.gov (United States)

    Jones, John Gregory

    Chemical Vapor Deposition (CVD) is a widely used process for the application of thin films. In this case, CVD is being used to apply a thin film interface coating to single-crystal monofilament sapphire (Al2O3) fibers for use in Ceramic Matrix Composites (CMCs). The hot-wall reactor operates at near atmospheric pressure, which is maintained using a venturi pump system. Inert gas seals obviate the need for a sealed system. A liquid precursor delivery system has been implemented to provide precise stoichiometry control. Neural networks have been implemented to create real-time process description models trained using data generated from a Navier-Stokes finite difference model of the process. Automation of the process to include full computer control and data logging capability is also presented. In situ sensors including a quadrupole mass spectrometer, thermocouples, a laser scanner, and a Raman spectrometer have been implemented to determine the gas phase reactants and coating quality. A fuzzy logic controller has been developed to regulate either the gas phase or the in situ temperature of the reactor using the oxygen flow rate as an actuator. Scanning electron microscope (SEM) images of various samples are shown. A hierarchical structure upon which the control system is based is also presented.

  2. Software architecture for intelligent image processing using Prolog

    Science.gov (United States)

    Jones, Andrew C.; Batchelor, Bruce G.

    1994-10-01

    We describe a prototype system for interactive image processing using Prolog, implemented by the first author on an Apple Macintosh computer. This system is inspired by Prolog+, but differs from it in two particularly important respects. The first is that whereas Prolog+ assumes the availability of dedicated image processing hardware, with which the Prolog system communicates, our present system implements image processing functions in software using the C programming language. The second difference is that although our present system supports Prolog+ commands, these are implemented in terms of lower-level Prolog predicates which provide a more flexible approach to image manipulation. We discuss the impact of the Apple Macintosh operating system upon the implementation of the image-processing functions, and the interface between these functions and the Prolog system. We also explain how the Prolog+ commands have been implemented. The system described in this paper is a fairly early prototype, and we outline how we intend to develop the system, a task which is expedited by the extensible architecture we have implemented.

  3. 12th International Conference on Intelligent Information Hiding and Multimedia Signal Processing

    CERN Document Server

    Tsai, Pei-Wei; Huang, Hsiang-Cheh

    2017-01-01

    This volume of Smart Innovation, Systems and Technologies contains accepted papers presented at IIH-MSP-2016, the 12th International Conference on Intelligent Information Hiding and Multimedia Signal Processing. The conference this year was technically co-sponsored by the Tainan Chapter of the IEEE Signal Processing Society, Fujian University of Technology, Chaoyang University of Technology, Taiwan Association for Web Intelligence Consortium, Fujian Provincial Key Laboratory of Big Data Mining and Applications (Fujian University of Technology), and Harbin Institute of Technology Shenzhen Graduate School. IIH-MSP 2016 was held on 21-23 November 2016 in Kaohsiung, Taiwan. The conference is an international forum for researchers and professionals in all areas of information hiding and multimedia signal processing.

  4. Predicting speech intelligibility based on the signal-to-noise envelope power ratio after modulation-frequency selective processing

    DEFF Research Database (Denmark)

    Jørgensen, Søren; Dau, Torsten

    2011-01-01

    A model for predicting the intelligibility of processed noisy speech is proposed. The speech-based envelope power spectrum model has a similar structure as the model of Ewert and Dau [(2000). J. Acoust. Soc. Am. 108, 1181-1196], developed to account for modulation detection and masking data. The model estimates the speech-to-noise envelope power ratio, SNRenv, at the output of a modulation filterbank and relates this metric to speech intelligibility using the concept of an ideal observer. Predictions were compared to data on the intelligibility of speech presented in stationary speech... The ... process provides a key measure of speech intelligibility. © 2011 Acoustical Society of America.
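A greatly simplified sketch of the envelope-power idea: extract amplitude envelopes by rectification and smoothing, then compare the envelope power of the noisy speech against that of the noise alone. The actual model uses a modulation filterbank and an ideal-observer back end; this single-band version only illustrates the shape of the metric.

```python
import math

# Single-band sketch of an SNRenv-style metric: envelopes via rectification
# plus moving-average smoothing, then a ratio of AC envelope powers in dB.
def envelope(signal, win=8):
    rect = [abs(s) for s in signal]
    return [sum(rect[max(0, i - win):i + 1]) / (i - max(0, i - win) + 1)
            for i in range(len(rect))]

def env_power(signal):
    """Variance of the envelope (fluctuation power around its mean)."""
    env = envelope(signal)
    mean = sum(env) / len(env)
    return sum((e - mean) ** 2 for e in env) / len(env)

def snr_env_db(noisy_speech, noise):
    """Envelope power of noisy speech relative to noise alone, in dB."""
    return 10.0 * math.log10(env_power(noisy_speech) / env_power(noise))
```

A strongly modulated signal in weak noise yields a positive value; as the noise dominates, the envelope fluctuations of the mixture approach those of the noise and the metric falls toward 0 dB and below.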

  5. A Latent Variable Analysis of Working Memory Capacity, Short-Term Memory Capacity, Processing Speed, and General Fluid Intelligence.

    Science.gov (United States)

    Conway, Andrew R. A.; Cowan, Nelson; Bunting, Michael F.; Therriault, David J.; Minkoff, Scott R. B.

    2002-01-01

    Studied the interrelationships among general fluid intelligence, short-term memory capacity, working memory capacity, and processing speed in 120 young adults and used structural equation modeling to determine the best predictor of general fluid intelligence. Results suggest that working memory capacity, but not short-term memory capacity or…

  6. Influence of Family Processes, Motivation, and Beliefs about Intelligence on Creative Problem Solving of Scientifically Talented Individuals

    Science.gov (United States)

    Cho, Seokhee; Lin, Chia-Yi

    2011-01-01

    Predictive relationships among perceived family processes, intrinsic and extrinsic motivation, incremental beliefs about intelligence, confidence in intelligence, and creative problem-solving practices in mathematics and science were examined. Participants were 733 scientifically talented Korean students in fourth through twelfth grades as well as…

  7. Information processing speed mediates the relationship between white matter and general intelligence in schizophrenia.

    Science.gov (United States)

    Alloza, Clara; Cox, Simon R; Duff, Barbara; Semple, Scott I; Bastin, Mark E; Whalley, Heather C; Lawrie, Stephen M

    2016-08-30

    Several authors have proposed that schizophrenia is the result of impaired connectivity between specific brain regions rather than differences in local brain activity. White matter abnormalities have been suggested as the anatomical substrate for this dysconnectivity hypothesis. Information processing speed may act as a key cognitive resource facilitating higher-order cognition by allowing multiple cognitive processes to be simultaneously available. However, there is a lack of established associations between these variables in schizophrenia. We hypothesised that the relationship between white matter and general intelligence would be mediated by processing speed. White matter water diffusion parameters were studied using Tract-based Spatial Statistics and computed within 46 regions of interest (ROI). Principal component analysis was conducted on these white matter ROI for fractional anisotropy (FA) and mean diffusivity, and on neurocognitive subtests, to extract general factors of white matter structure (gFA, gMD), general intelligence (g) and processing speed (gspeed). There was a positive correlation between g and gFA (r = 0.67, p = 0.001) that was partially and significantly mediated by gspeed (56.22%, CI: 0.10-0.62). These findings suggest a plausible model of structure-function relations in schizophrenia, whereby white matter structure may provide a neuroanatomical substrate for general intelligence, which is partly supported by speed of information processing. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
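The mediation logic tested above (white matter to processing speed to intelligence) can be sketched with ordinary least squares: the indirect effect is the product of the X-to-M slope and the M-to-Y slope controlling for X, computed here via residualization (the Frisch-Waugh idea). Toy data and stdlib only; this is not the authors' statistical machinery.

```python
# Sketch of simple mediation: indirect effect = (X -> M slope) * (M -> Y | X slope).
def slope(xs, ys):
    """OLS slope of ys on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

def residuals(xs, ys):
    """Residuals of ys after regressing out xs (slope and intercept)."""
    b = slope(xs, ys)
    a = sum(ys) / len(ys) - b * sum(xs) / len(xs)
    return [y - (a + b * x) for x, y in zip(xs, ys)]

def indirect_effect(x, m, y):
    """Product-of-paths estimate of the effect of x on y mediated through m."""
    a_path = slope(x, m)                               # X -> M
    b_path = slope(residuals(x, m), residuals(x, y))   # M -> Y controlling for X
    return a_path * b_path
```

With data constructed so that m = 2x (plus noise orthogonal to x) and y = 3m, the indirect effect recovers 2 x 3 = 6 and the direct path is zero, i.e. full mediation.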

  8. Automated business process management – in times of digital transformation using machine learning or artificial intelligence

    Directory of Open Access Journals (Sweden)

    Paschek Daniel

    2017-01-01

    Full Text Available The continuous optimization of business processes is still a challenge for companies. In times of digital transformation, with faster-changing internal and external framework conditions and new customer expectations for the fastest delivery and best quality of goods, and many more, companies should set up their internal processes in the best way. But what to do if framework conditions change unexpectedly? The purpose of the paper is to analyse how the digital transformation will impact Business Process Management (BPM) while using methods like machine learning or artificial intelligence. Therefore, the core components will be explained, compared and set in relation. To identify application areas, interviews and analyses will be conducted with digital companies. The findings of the paper will be recommendations for action in the field of BPM and process optimization through machine learning and artificial intelligence. The approach of optimizing and managing processes via machine learning and artificial intelligence will help companies decide which tool will be the best for automated BPM.

  9. Models and standards for production systems integration: Technological process and documents

    Directory of Open Access Journals (Sweden)

    Lečić Danica

    2005-01-01

    Full Text Available Electronic business demands that production companies collaborate with customers, suppliers and end users and start electronic manufacturing. To achieve this goal, companies have to integrate their subsystems (Application to Application - A2A) and they have to collaborate with their business partners (Business to Business - B2B). For this purpose, models and unique standards for integration are necessary. In this paper, ebXML and OAGI specifications have been used to present a process metamodel as a UML class diagram and a standardized model of the Working Order document for a technological process in the form of an OAGI BOD XML document. Based on this, for an example, the model of the technological process is presented by an activity diagram (DA) in XML form, together with the appearance of the Working Order document. Rules for the transformation of the DA to XML are also presented.

  10. Intelligent Optimization of a Mixed Culture Cultivation Process

    Directory of Open Access Journals (Sweden)

    Petia Koprinkova-Hristova

    2015-04-01

    Full Text Available In the present paper a neural network approach called "Adaptive Critic Design" (ACD) was applied to the optimal tuning of the set point controllers of the three main substrates (sugar, nitrogen source and dissolved oxygen) for a PHB production process. For approximation of the critic and the controllers, a special kind of recurrent neural network called Echo State Networks (ESN) was used. Their structure allows fast training, which will be of crucial importance in on-line applications. The critic network is trained to minimize the temporal difference error using the Recursive Least Squares method. Two approaches - gradient and heuristic - were exploited for training of the controllers. The comparison is made with respect to the achieved improvement of the utility function subject to optimization, as well as with a known expert strategy for controlling the PHB production process.

  11. Improving content marketing processes with the approaches by artificial intelligence

    OpenAIRE

    Kose, Utku; Sert, Selcuk

    2017-01-01

    Content marketing is today one of the most remarkable approaches in the context of the marketing processes of companies. The value of this kind of marketing has grown over time thanks to the latest developments in computer and communication technologies. Nowadays, social media based platforms in particular have a great importance in enabling companies to design multimedia-oriented, interactive content. But on the other hand, there is still something more to do for improved content marketing...

  12. Artificial Intelligence for Inferential Control of Crude Oil Stripping Process

    Directory of Open Access Journals (Sweden)

    Mehdi Ebnali

    2018-01-01

    Full Text Available Stripper columns are used for sweetening crude oil, and they must hold the product hydrogen sulfide content as near the set points as possible in the face of upsets. Since product quality cannot be measured easily and economically online, the control of product quality is often achieved by maintaining a suitable tray temperature near its set point. The tray temperature control method, however, is not a proper option for a multi-component stripping column because the tray temperature does not correspond exactly to the product composition. To overcome this problem, secondary measurements can be used to infer the product quality and adjust the values of the manipulated variables. In this paper, we have used a novel inferential control approach based on an adaptive network fuzzy inference system (ANFIS) for the stripping process. ANFIS with different learning algorithms is used for modeling the process and building a composition estimator to estimate the composition of the bottom product. The developed estimator is tested, and the results show that the predictions made by the ANFIS structure are in good agreement with the results of simulation by the ASPEN HYSYS process simulation package. In addition, inferential control by the implementation of the ANFIS-based online composition estimator in a cascade control scheme is superior to the traditional tray temperature control method, based on a lower integral time absolute error and lower duty consumption in the reboiler.

  13. Suppressive mechanisms in visual motion processing: From perception to intelligence.

    Science.gov (United States)

    Tadin, Duje

    2015-10-01

    Perception operates on an immense amount of incoming information that greatly exceeds the brain's processing capacity. Because of this fundamental limitation, the ability to suppress irrelevant information is a key determinant of perceptual efficiency. Here, I will review a series of studies investigating suppressive mechanisms in visual motion processing, namely perceptual suppression of large, background-like motions. These spatial suppression mechanisms are adaptive, operating only when sensory inputs are sufficiently robust to guarantee visibility. Converging correlational and causal evidence links these behavioral results with inhibitory center-surround mechanisms, namely those in cortical area MT. Spatial suppression is abnormally weak in several special populations, including the elderly and individuals with schizophrenia, a deficit evidenced by better-than-normal direction discriminations of large moving stimuli. Theoretical work shows that this abnormal weakening of spatial suppression should result in motion segregation deficits, but direct behavioral support of this hypothesis is lacking. Finally, I will argue that the ability to suppress information is a fundamental neural process that applies not only to perception but also to cognition in general. Supporting this argument, I will discuss recent research that shows individual differences in spatial suppression of motion signals strongly predict individual variations in IQ scores. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  14. The process of deforestation in weak democracies and the role of Intelligence.

    Science.gov (United States)

    Obydenkova, Anastassia; Nazarov, Zafar; Salahodjaev, Raufhon

    2016-07-01

    This article examines the interconnection between national intelligence, political institutions, and the mismanagement of public resources (deforestation). The paper examines the reasons for deforestation and investigates the factors accountable for it. The analysis builds on an author-compiled cross-national dataset on 185 countries over a time period of twenty years, from 1990 to 2010. We find, first, that a nation's intelligence significantly reduces the level of deforestation in a state. Moreover, a nation's IQ seems to play an offsetting role in natural resource conservation (forest management) in countries with weak democratic institutions. The analysis also discovered the presence of a U-shaped relationship between democracy and deforestation. Intelligence sheds more light on this interconnection and explains the results. Our results are robust to various sample selection strategies and model specifications. The main implication of our study is that intelligence not only shapes formal rules and informal regulations such as social trust, norms and traditions, but also has the ability to reverse the paradoxical process known as the "resource curse." The study contributes to a better understanding of the reasons for deforestation and sheds light on the debated impact of political regime on forest management. Copyright © 2016 Elsevier Inc. All rights reserved.

  15. Recent Technological Advances in Natural Language Processing and Artificial Intelligence

    OpenAIRE

    Shah, Nishal Pradeepkumar

    2012-01-01

    A recent advance in computer technology has permitted scientists to implement and test algorithms that have been known for quite some time (or not) but which were computationally expensive. Two such projects are IBM's Jeopardy effort, part of its DeepQA project [1], and Wolfram's WolframAlpha [2]. Both of these projects implement natural language processing (another goal of AI scientists) and try to answer questions as asked by the user. Though the goals of the two projects are similar, both of them have a ...

  16. 2nd International Symposium on Signal Processing and Intelligent Recognition Systems

    CERN Document Server

    Bandyopadhyay, Sanghamitra; Krishnan, Sri; Li, Kuan-Ching; Mosin, Sergey; Ma, Maode

    2016-01-01

    This Edited Volume contains a selection of refereed and revised papers originally presented at the second International Symposium on Signal Processing and Intelligent Recognition Systems (SIRS-2015), December 16-19, 2015, Trivandrum, India. The program committee received 175 submissions. Each paper was peer reviewed by at least three independent referees of the program committee, and 59 papers were finally selected. The papers offer stimulating insights into biometrics, digital watermarking, recognition systems, image and video processing, signal and speech processing, pattern recognition, machine learning and knowledge-based systems. The book is directed to researchers and scientists engaged in various fields of signal processing and related areas.

  17. An Intelligent System for Modelling, Design and Analysis of Chemical Processes

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    ICAS, Integrated Computer Aided System, is a software that consists of a number of intelligent tools, which are very suitable, among others, for computer aided modelling, sustainable design of chemical and biochemical processes, and design-analysis of product-process monitoring systems. Each...... the computer aided modelling tool will illustrate how to generate a desired process model, how to analyze the model equations, how to extract data and identify the model and make it ready for various types of application. In sustainable process design, the example will highlight the issue of integration...

  18. FLOCKING-BASED DOCUMENT CLUSTERING ON THE GRAPHICS PROCESSING UNIT [Book Chapter]

    Energy Technology Data Exchange (ETDEWEB)

    Charles, J S; Patton, R M; Potok, T E; Cui, X

    2008-01-01

    Analyzing and grouping documents by content is a complex problem. One explored method of solving this problem borrows from nature, imitating the flocking behavior of birds. Each bird represents a single document and flies toward other documents that are similar to it. One limitation of this method of document clustering is its complexity, O(n²). As the number of documents grows, it becomes increasingly difficult to receive results in a reasonable amount of time. However, flocking behavior, along with most naturally inspired algorithms such as ant colony optimization and particle swarm optimization, is highly parallel and has experienced improved performance on expensive cluster computers. In the last few years, the graphics processing unit (GPU) has received attention for its ability to solve highly parallel and semi-parallel problems much faster than the traditional sequential processor. Some applications see a huge increase in performance on this new platform. The cost of these high-performance devices is also marginal when compared with the price of cluster machines. In this paper, we have conducted research to exploit this architecture and apply its strengths to the document flocking problem. Our results highlight the potential benefit the GPU brings to all naturally inspired algorithms. Using the CUDA platform from NVIDIA®, we developed a document flocking implementation to be run on the NVIDIA® GEFORCE 8800. Additionally, we developed a similar but sequential implementation of the same algorithm to be run on a desktop CPU. We tested the performance of each on groups of news articles ranging in size from 200 to 3,000 documents. The results of these tests were very significant. Performance gains ranged from three to nearly five times improvement of the GPU over the CPU implementation. This dramatic improvement in runtime makes the GPU a potentially revolutionary platform for document clustering algorithms.
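    The O(n²) flocking update described above can be sketched in a few lines; this is an illustrative toy, not the authors' CUDA implementation, and the attraction rule, step size, and similarity threshold are assumptions:

```python
import numpy as np

def flock_step(positions, similarity, attract=0.1, threshold=0.5):
    """One naive O(n^2) flocking update: every document-bird compares itself
    with every other document and drifts toward those whose content
    similarity exceeds the threshold."""
    new_pos = positions.copy()
    for i in range(len(positions)):
        pull = np.zeros(positions.shape[1])
        neighbors = 0
        for j in range(len(positions)):
            if i != j and similarity[i, j] > threshold:
                pull += positions[j] - positions[i]
                neighbors += 1
        if neighbors:
            new_pos[i] = positions[i] + attract * pull / neighbors
    return new_pos

# Two similar documents and one dissimilar outlier on a 2-D canvas:
pos = np.array([[0.0, 0.0], [10.0, 0.0], [50.0, 50.0]])
sim = np.array([[1.0, 0.9, 0.1],
                [0.9, 1.0, 0.1],
                [0.1, 0.1, 1.0]])
pos2 = flock_step(pos, sim)   # the similar pair converges; the outlier stays put
```

    Because every document inspects every other document, each step costs O(n²), which is exactly the pairwise structure that maps well onto the GPU's many parallel threads.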

  19. Development of the NRC's Human Performance Investigation Process (HPIP). Volume 3, Development documentation

    Energy Technology Data Exchange (ETDEWEB)

    Paradies, M.; Unger, L. [System Improvements, Inc., Knoxville, TN (United States); Haas, P.; Terranova, M. [Concord Associates, Inc., Knoxville, TN (United States)

    1993-10-01

    The three volumes of this report detail a standard investigation process for use by US Nuclear Regulatory Commission (NRC) personnel when investigating human performance related events at nuclear power plants. The process, called the Human Performance Investigation Process (HPIP), was developed to meet the special needs of NRC personnel, especially NRC resident and regional inspectors. HPIP is a systematic investigation process combining current procedures and field practices, expert experience, NRC human performance research, and applicable investigation techniques. The process is easy to learn and helps NRC personnel perform better field investigations of the root causes of human performance problems. The human performance data gathered through such investigations provides a better understanding of the human performance issues that cause events at nuclear power plants. This document, Volume III, is a detailed documentation of the development effort and the pilot training program.

  20. Network-Capable Application Process and Wireless Intelligent Sensors for ISHM

    Science.gov (United States)

    Figueroa, Fernando; Morris, Jon; Turowski, Mark; Wang, Ray

    2011-01-01

    Intelligent sensor technology and systems are increasingly becoming attractive means to serve as frameworks for intelligent rocket test facilities with embedded intelligent sensor elements, distributed data acquisition elements, and onboard data acquisition elements. Networked intelligent processors enable users and systems integrators to automatically configure their measurement automation systems for analog sensors. NASA and leading sensor vendors are working together to apply the IEEE 1451 standard for adding plug-and-play capabilities for wireless analog transducers through the use of a Transducer Electronic Data Sheet (TEDS) in order to simplify sensor setup, use, and maintenance, to automatically obtain calibration data, and to eliminate manual data entry and error. A TEDS contains the critical information needed by an instrument or measurement system to identify, characterize, interface, and properly use the signal from an analog sensor. A TEDS is deployed for a sensor in one of two ways. First, the TEDS can reside in embedded, nonvolatile memory (typically flash memory) within the intelligent processor. Second, a virtual TEDS can exist as a separate file, downloadable from the Internet. This concept of virtual TEDS extends the benefits of the standardized TEDS to legacy sensors and applications where the embedded memory is not available. An HTML-based user interface provides a visual tool to interface with those distributed sensors that a TEDS is associated with, to automate the sensor management process. Implementing and deploying the IEEE 1451.1-based Network-Capable Application Process (NCAP) can achieve support for intelligent process in Integrated Systems Health Management (ISHM) for the purpose of monitoring, detection of anomalies, diagnosis of causes of anomalies, prediction of future anomalies, mitigation to maintain operability, and integrated awareness of system health by the operator. It can also support local data collection and storage. 
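    The embedded-versus-virtual TEDS lookup described above can be sketched in miniature. The field names and linear calibration form below are illustrative assumptions; a real IEEE 1451 TEDS is a compact binary record, not a Python dictionary:

```python
from dataclasses import dataclass

@dataclass
class TransducerTEDS:
    # Illustrative fields only -- chosen to mirror the "identify, characterize,
    # interface" roles a TEDS plays, not the standard's binary layout.
    manufacturer_id: int
    model_number: int
    serial_number: int
    calibration_slope: float    # engineering units per volt (assumed linear)
    calibration_offset: float

    def to_engineering_units(self, raw_volts: float) -> float:
        """Apply the stored calibration to a raw analog reading."""
        return self.calibration_slope * raw_volts + self.calibration_offset

def load_teds(embedded_memory: dict, virtual_store: dict, sensor_id: str) -> TransducerTEDS:
    """Prefer the TEDS held in the processor's nonvolatile memory; fall back
    to a 'virtual TEDS' fetched from a separate store for legacy sensors."""
    record = embedded_memory.get(sensor_id) or virtual_store.get(sensor_id)
    if record is None:
        raise KeyError(f"no TEDS available for sensor {sensor_id}")
    return TransducerTEDS(**record)

# A smart sensor with an embedded TEDS and a legacy sensor served virtually:
embedded = {"tc-04": dict(manufacturer_id=17, model_number=210, serial_number=9,
                          calibration_slope=100.0, calibration_offset=-50.0)}
virtual = {"legacy-01": dict(manufacturer_id=3, model_number=7, serial_number=1,
                             calibration_slope=2.5, calibration_offset=0.0)}
reading = load_teds(embedded, virtual, "legacy-01").to_engineering_units(4.0)
```

    The fallback path is the point: it is what lets legacy analog sensors without embedded memory still benefit from standardized, automatically applied calibration data.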

  1. The use of artificial intelligence techniques to improve the multiple payload integration process

    Science.gov (United States)

    Cutts, Dannie E.; Widgren, Brian K.

    1992-01-01

    A maximum return of science and products with a minimum expenditure of time and resources is a major goal of mission payload integration. A critical component then, in successful mission payload integration is the acquisition and analysis of experiment requirements from the principal investigator and payload element developer teams. One effort to use artificial intelligence techniques to improve the acquisition and analysis of experiment requirements within the payload integration process is described.

  2. Developing a new theory of knowledge sharing : Documenting and reflecting on a messy process

    NARCIS (Netherlands)

    Martinsons, M.G.; Davison, R.M.; Ou, Carol

    2015-01-01

    Much has been written about theories and how they can be tested. Unfortunately, much less has been written about how to develop them. This paper sheds light on the process of new theory development. We document and reflect on how we developed a context-sensitive indigenous theory of knowledge sharing.

  3. The processed neutron activation cross-section data files of the FENDL project. Summary documentation

    International Nuclear Information System (INIS)

    Ganesan, S.; Pashchenko, A.B.; Lemmle, H.D.; Mann, F.M.

    1994-01-01

    This document summarises a neutron activation cross-section database which has been processed in two formats for input to MCNP Monte Carlo codes and to REAC transmutation codes. The data are available from the IAEA Nuclear Data Section online via INTERNET by FTP command. (author)

  4. Preliminary application of Structure from Motion and GIS to document decomposition and taphonomic processes.

    Science.gov (United States)

    Carlton, Connor D; Mitchell, Samantha; Lewis, Patrick

    2018-01-01

    Over the past decade, Structure from Motion (SfM) has increasingly been used as a means of digital preservation and for documenting archaeological excavations, architecture, and cultural material. However, few studies have tapped the potential of using SfM to document and analyze taphonomic processes affecting burials for forensic science purposes. This project utilizes SfM models to elucidate specific post-depositional events that affected a series of three human cadavers deposited at the South East Texas Applied Forensic Science Facility (STAFS). The aim of this research was to test the ability of untrained researchers to employ spatial software and photogrammetry for data collection purposes. For a period of three months, a single-lens reflex (SLR) camera was used to capture a series of overlapping images at periodic stages in the decomposition process of each cadaver. These images were processed through photogrammetric software that creates a 3D model that can be measured, manipulated, and viewed. This project used photogrammetric and geospatial software to map changes in decomposition and movement of the body from original deposition points. Project results indicate that SfM and GIS are useful tools for documenting decomposition and taphonomic processes, and that photogrammetry is an efficient, relatively simple, and affordable tool for the documentation of decomposition. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Milestones of mathematical model for business process management related to cost estimate documentation in petroleum industry

    Science.gov (United States)

    Khamidullin, R. I.

    2018-05-01

    The paper is devoted to milestones of the optimal mathematical model for a business process related to cost estimate documentation compiled during construction and reconstruction of oil and gas facilities. It describes the study and analysis of fundamental issues in petroleum industry, which are caused by economic instability and deterioration of a business strategy. Business process management is presented as business process modeling aimed at the improvement of the studied business process, namely main criteria of optimization and recommendations for the improvement of the above-mentioned business model.

  6. Honeywell Modular Automation System Computer Software Documentation for the Magnesium Hydroxide Precipitation Process

    International Nuclear Information System (INIS)

    STUBBS, A.M.

    2001-01-01

    The purpose of this Computer Software Document (CSWD) is to provide configuration control of the Honeywell Modular Automation System (MAS) in use at the Plutonium Finishing Plant (PFP) for the Magnesium Hydroxide Precipitation Process in Rm 230C/234-5Z. The magnesium hydroxide process control software, Rev 0, is being updated to include control programming for a second hot plate. The process control programming was performed by the system administrator. Software testing for the additional hot plate was performed per PFP Job Control Work Package 2Z-00-1703. The software testing was verified by Quality Control to comply with OSD-Z-184-00044, Magnesium Hydroxide Precipitation Process.

  7. Developing an Intelligent Automatic Appendix Extraction Method from Ultrasonography Based on Fuzzy ART and Image Processing

    Directory of Open Access Journals (Sweden)

    Kwang Baek Kim

    2015-01-01

    Full Text Available Ultrasound examination (US) plays a key role in the diagnosis and management of patients with clinically suspected appendicitis, which is the most common abdominal surgical emergency. Among the various sonographic findings of appendicitis, the outer diameter of the appendix is the most important. Therefore, clear delineation of the appendix on US images is essential. In this paper, we propose a new intelligent method to extract the appendix automatically from abdominal sonographic images as a basic building block of developing such an intelligent tool for medical practitioners. Knowing that the appendix is located in the lower organ area below the bottom fascia line, we conduct a series of image processing techniques to find the fascia line correctly. We then apply the fuzzy ART learning algorithm to the organ area in order to extract the appendix accurately. The experiment verifies that the proposed method is highly accurate (successful in 38 out of 40 cases) in extracting the appendix.

  8. Intelligent indexing

    International Nuclear Information System (INIS)

    Farkas, J.

    1992-01-01

    In this paper we discuss the relevance of artificial intelligence to the automatic indexing of natural language text. We describe the use of domain-specific semantically-based thesauruses and address the problem of creating adequate knowledge bases for intelligent indexing systems. We also discuss the relevance of the Hilbert space ℓ² to the compact representation of documents and to the definition of the similarity of natural language texts. (author). 17 refs., 2 figs

  9. Intelligent indexing

    Energy Technology Data Exchange (ETDEWEB)

    Farkas, J

    1993-12-31

    In this paper we discuss the relevance of artificial intelligence to the automatic indexing of natural language text. We describe the use of domain-specific semantically-based thesauruses and address the problem of creating adequate knowledge bases for intelligent indexing systems. We also discuss the relevance of the Hilbert space ℓ² to the compact representation of documents and to the definition of the similarity of natural language texts. (author). 17 refs., 2 figs.

  10. Document Object Model and Its Application on XML Document Processing

    Directory of Open Access Journals (Sweden)

    Sinn-cheng Lin

    2001-06-01

    Full Text Available Document Object Model (DOM) is an application programming interface that can be applied to process XML documents. It defines the logical structure, the accessing interfaces, and the operation methods for the document. In the DOM, an original document is mapped to a tree structure; therefore, a computer program can easily traverse the tree and manipulate the nodes in the tree. In this paper, the fundamental models, definitions, and specifications of DOM are surveyed. We then create an experimental DOM-based system called XML On-Line Parser. The front end of the system is a Web-based user interface for XML document input and parsed-result output. The back end of the system is an ASP program, which transforms the original document into a DOM tree for document manipulation. This on-line system can be used with a general-purpose web browser to check the well-formedness and the validity of XML documents.
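    The document-to-tree mapping described above is easy to demonstrate with Python's standard-library DOM implementation (a stand-in for the paper's ASP back end; the sample document is invented):

```python
from xml.dom.minidom import parseString

# A well-formed XML document is mapped to a tree of nodes:
xml_text = "<library><book id='1'>DOM Primer</book><book id='2'>XML Basics</book></library>"
dom = parseString(xml_text)   # raises ExpatError if the input is not well-formed

root = dom.documentElement    # root element of the DOM tree
# Traverse the tree and read node content and attributes:
titles = [b.firstChild.data for b in dom.getElementsByTagName("book")]
ids = [b.getAttribute("id") for b in dom.getElementsByTagName("book")]
```

    The well-formedness check the on-line parser performs falls out of parsing itself: malformed input fails to build a tree at all, while a successful parse yields nodes that can be measured, traversed, and manipulated.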

  11. Intelligent Processing Equipment Developments Within the Navy's Manufacturing Technology Centers of Excellence

    Science.gov (United States)

    Nanzetta, Philip

    1992-01-01

    The U.S. Navy has had an active Manufacturing Technology (MANTECH) Program aimed at developing advanced production processes and equipment since the late 1960s. During the past decade, however, the resources of the MANTECH program were concentrated in Centers of Excellence. Today, the Navy sponsors four manufacturing technology Centers of Excellence: the Automated Manufacturing Research Facility (AMRF); the Electronics Manufacturing Productivity Facility (EMPF); the National Center for Excellence in Metalworking Technology (NCEMT); and the Center of Excellence for Composites Manufacturing Technology (CECMT). This paper briefly describes each of the centers and summarizes typical Intelligent Processing Equipment (IPE) projects that were undertaken.

  12. Development of a process model for intelligent control of gas metal arc welding

    International Nuclear Information System (INIS)

    Smartt, H.B.; Johnson, J.A.; Einerson, C.J.; Watkins, A.D.; Carlson, N.M.

    1991-01-01

    This paper discusses work in progress on the development of an intelligent control scheme for arc welding. A set of four sensors is used to detect weld bead cooling rate, droplet transfer mode, weld pool and joint location and configuration, and weld defects during welding. A neural network is being developed as the bridge between the multiple-sensor set and a conventional proportional-integral controller that provides independent control of process variables. This approach is being developed for the gas metal arc welding process. 20 refs., 8 figs
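    The sensor-to-controller bridge can be pictured as a small feedforward network mapping the four sensed quantities to corrections of two process variables. Everything below is an assumption for illustration: the layer sizes, the choice of outputs, and the weights (random here, where the paper's network would be trained):

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical 4-6-2 network: inputs are the four sensed quantities
# (cooling rate, droplet mode, pool/joint geometry, defect signal);
# outputs are bounded corrections to two process variables.
W1 = rng.normal(scale=0.5, size=(4, 6))
b1 = np.zeros(6)
W2 = rng.normal(scale=0.5, size=(6, 2))
b2 = np.zeros(2)

def forward(sensors):
    h = np.tanh(sensors @ W1 + b1)   # hidden layer
    return np.tanh(h @ W2 + b2)      # corrections squashed into (-1, 1)

corr = forward(np.array([0.2, -0.1, 0.05, 0.0]))
```

    The bounded tanh outputs are one simple way to keep the network's suggestions within a safe adjustment range before they are handed to the proportional-integral controller.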

  13. Study on intelligent processing system of man-machine interactive garment frame model

    Science.gov (United States)

    Chen, Shuwang; Yin, Xiaowei; Chang, Ruijiang; Pan, Peiyun; Wang, Xuedi; Shi, Shuze; Wei, Zhongqian

    2018-05-01

    A man-machine interactive garment frame model intelligent processing system is studied in this paper. The system consists of several sensor device, voice processing module, mechanical parts and data centralized acquisition devices. The sensor device is used to collect information on the environment changes brought by the body near the clothes frame model, the data collection device is used to collect the information of the environment change induced by the sensor device, voice processing module is used for speech recognition of nonspecific person to achieve human-machine interaction, mechanical moving parts are used to make corresponding mechanical responses to the information processed by data collection device.it is connected with data acquisition device by a means of one-way connection. There is a one-way connection between sensor device and data collection device, two-way connection between data acquisition device and voice processing module. The data collection device is one-way connection with mechanical movement parts. The intelligent processing system can judge whether it needs to interact with the customer, realize the man-machine interaction instead of the current rigid frame model.

  14. Comparing Binaural Pre-processing Strategies II: Speech Intelligibility of Bilateral Cochlear Implant Users.

    Science.gov (United States)

    Baumgärtel, Regina M; Hu, Hongmei; Krawczyk-Becker, Martin; Marquardt, Daniel; Herzke, Tobias; Coleman, Graham; Adiloğlu, Kamil; Bomke, Katrin; Plotz, Karsten; Gerkmann, Timo; Doclo, Simon; Kollmeier, Birger; Hohmann, Volker; Dietz, Mathias

    2015-12-30

    Several binaural audio signal enhancement algorithms were evaluated with respect to their potential to improve speech intelligibility in noise for users of bilateral cochlear implants (CIs). 50% speech reception thresholds (SRT50) were assessed using an adaptive procedure in three distinct, realistic noise scenarios. All scenarios were highly nonstationary, complex, and included a significant amount of reverberation. Other aspects, such as the perfectly frontal target position, were idealized laboratory settings, allowing the algorithms to perform better than in corresponding real-world conditions. Eight bilaterally implanted CI users, wearing devices from three manufacturers, participated in the study. In all noise conditions, a substantial improvement in SRT50 compared to the unprocessed signal was observed for most of the algorithms tested, with the largest improvements generally provided by binaural minimum variance distortionless response (MVDR) beamforming algorithms. The largest overall improvement in speech intelligibility was achieved by an adaptive binaural MVDR in a spatially separated, single competing talker noise scenario. A no-pre-processing condition and adaptive differential microphones without a binaural link served as the two baseline conditions. SRT50 improvements provided by the binaural MVDR beamformers surpassed the performance of the adaptive differential microphones in most cases. Speech intelligibility improvements predicted by instrumental measures were shown to account for some but not all aspects of the perceptually obtained SRT50 improvements measured in bilaterally implanted CI users. © The Author(s) 2015.

  15. A New Dimension of Business Intelligence: Location-based Intelligence

    OpenAIRE

    Zeljko Panian

    2012-01-01

    Through the course of this paper we define Location-based Intelligence (LBI), which is growing out of the process of amalgamation of geolocation and Business Intelligence. Amalgamating geolocation with traditional Business Intelligence (BI) results in a new dimension of BI named Location-based Intelligence. LBI is defined as leveraging unified location information for business intelligence. Collectively, enterprises can transform location data into business intelligence applic...

  16. Development and evaluation of an intelligent traceability system for frozen tilapia fillet processing.

    Science.gov (United States)

    Xiao, Xinqing; Fu, Zetian; Qi, Lin; Mira, Trebar; Zhang, Xiaoshuan

    2015-10-01

    The main export varieties in China are brand-name, high-quality bred aquatic products. Among them, tilapia has become the most important and fastest-growing species, since extensive consumer markets in North America and Europe have evolved as a result of commodity prices, year-round availability, and the quality of fresh and frozen products. As the largest tilapia farming country, China has over one-third of its tilapia production devoted to further processing and meeting foreign market demand. Taking tilapia fillet processing as a case, this paper introduces the efforts for developing and evaluating ITS-TF: an intelligent traceability system integrated with statistical process control (SPC) and fault tree analysis (FTA). Observations, literature review and expert questionnaires were used for system requirement and knowledge acquisition; scenario simulation was applied to evaluate and validate ITS-TF performance. The results show that the traceability requirement has evolved from a firefighting model to a proactive model for enhancing process management capacity for food safety; ITS-TF transforms itself into an intelligent system providing functions for early warning and process management through the integrated SPC and FTA. A valuable suggestion, that automatic data acquisition and communication technology should be integrated into ITS-TF, emerged for further system optimization, perfection and performance improvement. © 2014 Society of Chemical Industry.
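    The SPC side of such a traceability system rests on standard control-chart arithmetic: a process parameter is "in control" while subgroup means stay within three standard errors of the grand mean. A minimal sketch (the cold-chain temperatures below are invented for illustration):

```python
import statistics

def xbar_limits(subgroup_means, sigma_within, n):
    """3-sigma limits for an X-bar control chart:
    CL = grand mean, UCL/LCL = CL +/- 3 * sigma / sqrt(n)."""
    grand = statistics.fmean(subgroup_means)
    margin = 3 * sigma_within / n ** 0.5
    return grand - margin, grand, grand + margin

# Hypothetical freezer temperatures (deg C), five subgroups of n=4 readings:
means = [-18.2, -18.0, -18.3, -17.9, -18.1]
lcl, cl, ucl = xbar_limits(means, sigma_within=0.4, n=4)
out_of_control = [m for m in means if not (lcl <= m <= ucl)]   # triggers early warning
```

    In a proactive traceability model, a point falling outside these limits is what converts raw monitoring data into the early warnings the abstract describes.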

  17. An intelligent approach for cooling radiator fault diagnosis based on infrared thermal image processing technique

    International Nuclear Information System (INIS)

    Taheri-Garavand, Amin; Ahmadi, Hojjat; Omid, Mahmoud; Mohtasebi, Seyed Saeid; Mollazade, Kaveh; Russell Smith, Alan John; Carlomagno, Giovanni Maria

    2015-01-01

    This research presents a new intelligent fault diagnosis and condition monitoring system for classifying different conditions of a cooling radiator using infrared thermal images. The system was adopted to classify six types of cooling radiator condition: radiator tube blockage, radiator fin blockage, loose connection between fins and tubes, radiator door failure, coolant leakage, and normal conditions. The proposed system consists of several distinct procedures, including thermal image acquisition, image pre-processing, image processing, the two-dimensional discrete wavelet transform (2D-DWT), feature extraction, feature selection using a genetic algorithm (GA), and finally classification by artificial neural networks (ANNs). The 2D-DWT is implemented to decompose the thermal images. Subsequently, statistical texture features are extracted from the original and decomposed thermal images. The significant selected features are then used to enhance the performance of the designed ANN classifier for the six cooling radiator conditions (output layer). For the tested system, the input layer consisted of 16 neurons based on the feature selection operation. The best performance of the ANN was obtained with a 16-6-6 topology. The classification results demonstrated that this system can be employed satisfactorily as an intelligent condition monitoring and fault diagnosis tool for a class of cooling radiators. - Highlights: • Intelligent fault diagnosis of cooling radiator using thermal image processing. • Thermal image processing in a multiscale representation structure by 2D-DWT. • Selection of features based on a hybrid system that uses both GA and ANN. • Application of ANN as classifier. • Classification accuracy of fault detection up to 93.83%
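    The 2D-DWT decomposition step can be illustrated with the simplest wavelet, the Haar transform; this generic sketch is not the authors' exact transform or feature set, and the feature choices are assumptions:

```python
import numpy as np

def haar_dwt2(img):
    """One level of the 2-D Haar DWT (the simplest 2D-DWT): returns the
    approximation (LL) and horizontal/vertical/diagonal detail subbands.
    Image dimensions must be even."""
    a = (img[0::2, :] + img[1::2, :]) / 2.0   # average adjacent rows
    d = (img[0::2, :] - img[1::2, :]) / 2.0   # difference of adjacent rows
    ll = (a[:, 0::2] + a[:, 1::2]) / 2.0      # then average / difference columns
    lh = (a[:, 0::2] - a[:, 1::2]) / 2.0
    hl = (d[:, 0::2] + d[:, 1::2]) / 2.0
    hh = (d[:, 0::2] - d[:, 1::2]) / 2.0
    return ll, lh, hl, hh

def texture_features(subband):
    """Two simple statistical texture features: mean energy and spread."""
    return float(np.mean(subband ** 2)), float(np.std(subband))

# A flat (featureless) 8x8 "thermal image" keeps all its energy in the LL band:
ll, lh, hl, hh = haar_dwt2(np.ones((8, 8)))
```

    Statistics computed per subband give a feature vector over multiple scales and orientations, from which a GA can select the most discriminative entries for the ANN classifier.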

  18. Women's, midwives' and obstetricians' experiences of a structured process to document refusal of recommended maternity care.

    Science.gov (United States)

    Jenkinson, Bec; Kruske, Sue; Stapleton, Helen; Beckmann, Michael; Reynolds, Maree; Kildea, Sue

    2016-12-01

    Ethical and professional guidance for midwives and obstetricians emphasises informed consent and respect for patient autonomy; the right to refuse care is well established. However, the existing literature is largely silent on the appropriate clinical responses when pregnant women refuse recommended care, and accounts of disrespectful interactions and conflict are numerous. Policies and processes to support women and maternity care providers are rare and unstudied. To document the perspectives of women, midwives and obstetricians following the introduction of a structured process (Maternity Care Plan; MCP) to document refusal of recommended maternity care in a large tertiary maternity unit. A qualitative, interpretive study involved thematic analysis of in-depth semi-structured interviews with women (n=9), midwives (n=12) and obstetricians (n=9). Four major themes were identified including: 'Reassuring and supporting clinicians'; 'Keeping the door open'; 'Varied awareness, criteria and use of the MCP process' and 'No guarantees'. Clinicians felt protected and reassured by the structured documentation and communication process and valued keeping women engaged in hospital care. This, in turn, protected women's access to maternity care. However, the process could not guarantee favourable responses from other clinicians subsequently involved in the woman's care. Ongoing discussions of risk, perceived by women and some midwives to be pressure to consent to recommended care, were still evident. These limitations may have been attributable to the absence of agreed criteria for initiating the MCP process and fragmented care. Varying awareness and use of the process also diminished women's access to it. Copyright © 2016 Australian College of Midwives. Published by Elsevier Ltd. All rights reserved.

  19. Hybrid robust deep and shallow semantic processing for creativity support in document production

    OpenAIRE

    Uszkoreit, Hans; Callmeier, Ulrich; Eisele, Andreas; Schäfer, Ulrich; Siegel, Melanie

    2004-01-01

    The research performed in the DeepThought project (http://www.project-deepthought.net) aims at demonstrating the potential of deep linguistic processing if added to existing shallow methods that ensure robustness. Classical information retrieval is extended by high precision concept indexing and relation detection. We use this approach to demonstrate the feasibility of three ambitious applications, one of which is a tool for creativity support in document production and collective brainstormi...

  20. An alternate way for image documentation in gamma camera processing units

    International Nuclear Information System (INIS)

    Schneider, P.

    1980-01-01

    For documentation of images and curves generated by a gamma camera processing system, a film exposure tool from a CT system was linked to the video monitor by means of a resistance bridge. The machine has a stock capacity of 100 plane films. One advantage is that no interface is needed: the complete information on the monitor is transferred to the plane film, and compared with software-controlled data output on a printer or plotter, the device saves a tremendous amount of time. (orig.) [de]

  1. Role of Interdisciplinary Cooperation in Process of Documentation of Cultural Heritage

    Directory of Open Access Journals (Sweden)

    Jindřich Hodač

    2011-12-01

    Full Text Available This paper is focused on the presentation of results of long-term interdisciplinary cooperation in the process of documentation of Cultural Heritage. There are two sides joined in this cooperation. The first side is a "submitter" - in our case, an architect-historian (Mr. Rykl). The second side is a "contractor" - in our case, a surveyor-photogrammetrist (Mr. Hodač) and his students. We cooperate mostly on projects of metrical documentation of Cultural Heritage buildings and sites. Our cooperation is realized mainly in bachelor's/master's projects. Another opportunity for collaboration is our course [1], which we offer to students of two faculties/specializations (surveyors + architects). Beside the wide range of real results (2D drawings, 3D models, photomaps, etc.), we have also collected quite a lot of experience with the process of collaboration itself. Cooperation and communication between submitter and contractor play key roles in a successful project. It can generally be expected that the submitter will give the "task" and the contractor will try to find the proper technology to solve it. The process of communication should be permanent, because new circumstances and findings arise all the time. It is very important for all to find a common language across specializations in order to understand each other. Surveyors are "slightly pressed" to gain more knowledge about historical building constructions. Architects-historians should gain a basic awareness of various recent technologies for metrical documentation and their "pros and cons".

  2. Working memory - not processing speed - mediates fluid intelligence deficits associated with attention deficit/hyperactivity disorder symptoms.

    Science.gov (United States)

    Brydges, Christopher R; Ozolnieks, Krista L; Roberts, Gareth

    2017-09-01

    Attention deficit/hyperactivity disorder (ADHD) is a psychological condition characterized by inattention and hyperactivity. Cognitive deficits are commonly observed in ADHD patients, including impaired working memory, processing speed, and fluid intelligence, the three of which are theorized to be closely associated with one another. In this study, we aimed to determine if decreased fluid intelligence was associated with ADHD, and was mediated by deficits in working memory and processing speed. This study tested 142 young adults from the general population on a range of working memory, processing speed, and fluid intelligence tasks, and an ADHD self-report symptoms questionnaire. Results showed that total and hyperactive ADHD symptoms correlated significantly and negatively with fluid intelligence, but this association was fully mediated by working memory. However, inattentive symptoms were not associated with fluid intelligence. Additionally, processing speed was not associated with ADHD symptoms at all, and was not uniquely predictive of fluid intelligence. The results provide implications for working memory training programs for ADHD patients, and highlight potential differences between the neuropsychological profiles of ADHD subtypes. © 2015 The British Psychological Society.
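    The mediation logic tested here (ADHD symptoms → working memory → fluid intelligence) can be sketched on synthetic data with two OLS regressions: the total effect of symptoms on fluid intelligence should shrink toward zero once working memory is controlled. The data-generating coefficients below are invented to mimic the full-mediation scenario the paper reports:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
adhd = rng.normal(size=n)              # symptom score (standardized)
wm = -0.6 * adhd + rng.normal(size=n)  # working memory tracks symptoms
gf = 0.7 * wm + rng.normal(size=n)     # fluid intelligence depends on WM only

def coefs(y, X):
    """OLS slopes of y on the columns of X (intercept fitted, then dropped)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    return np.linalg.lstsq(X1, y, rcond=None)[0][1:]

total_effect = coefs(gf, adhd.reshape(-1, 1))[0]           # approx 0.7 * -0.6 = -0.42
direct_effect = coefs(gf, np.column_stack([adhd, wm]))[0]  # approx 0: full mediation
```

    A substantial total effect alongside a near-zero direct effect is the regression signature of full mediation; a formal analysis would add significance tests (e.g. a bootstrap of the indirect effect).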

  3. Optimization process planning using hybrid genetic algorithm and intelligent search for job shop machining.

    Science.gov (United States)

    Salehi, Mojtaba; Bahreininejad, Ardeshir

    2011-08-01

    Optimization of process planning is considered the key technology for computer-aided process planning, which is a rather complex and difficult procedure. A good process plan of a part is built on two elements: (1) the optimized sequence of the part's operations; and (2) the optimized selection of the machine, cutting tool, and tool access direction (TAD) for each operation. In the present work, process planning is divided into preliminary planning and secondary/detailed planning. In the preliminary stage, feasible sequences are generated by analyzing order and clustering constraints as a compulsive constraint aggregation in operation sequencing and by using an intelligent search strategy. In the detailed planning stage, a genetic algorithm that prunes the initial feasible sequences yields the optimized operation sequence and the optimized selection of machine, cutting tool, and TAD for each operation, based on optimization constraints treated as an additive constraint aggregation. The main contribution of this work is the simultaneous optimization of the part's operation sequence and of the machine, cutting tool, and TAD selected for each operation, using intelligent search and a genetic algorithm.
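The two-stage idea, generating precedence-feasible sequences and then letting a genetic algorithm minimize a setup-oriented cost, can be sketched as follows. This is a minimal illustration, not the authors' algorithm; the part, its precedence constraints, machine assignments, and cost function are all invented:

```python
import random

# Hypothetical part: operations, precedence constraints, and the machine
# assigned to each operation (all names are illustrative only).
PREC = {"drill": {"face"}, "bore": {"drill"}, "tap": {"drill"}, "slot": {"face"}}
OPS = ["face", "drill", "bore", "tap", "slot"]
MACHINE = {"face": "mill", "slot": "mill",
           "drill": "drill_press", "bore": "drill_press", "tap": "drill_press"}

def repair(seq):
    """Greedily reorder a permutation so every operation follows its prerequisites."""
    done, out, pending = set(), [], list(seq)
    while pending:
        for op in pending:
            if PREC.get(op, set()) <= done:   # all prerequisites already scheduled
                out.append(op)
                done.add(op)
                pending.remove(op)
                break
    return out

def cost(seq):
    """Number of machine changes along the sequence, a proxy for setup time."""
    return sum(MACHINE[a] != MACHINE[b] for a, b in zip(seq, seq[1:]))

def ga(pop_size=30, generations=60, seed=1):
    rng = random.Random(seed)
    # Stage 1: seed the population with repaired (feasible) random sequences.
    pop = [repair(rng.sample(OPS, len(OPS))) for _ in range(pop_size)]
    # Stage 2: evolve, keeping every child feasible via repair.
    for _ in range(generations):
        pop.sort(key=cost)
        survivors = pop[:pop_size // 2]
        children = []
        for parent in survivors:
            child = parent[:]
            i, j = rng.sample(range(len(child)), 2)   # swap mutation
            child[i], child[j] = child[j], child[i]
            children.append(repair(child))
        pop = survivors + children
    return min(pop, key=cost)

best = ga()
print(best, cost(best))
```

For this toy part the optimum groups the milling operations before the drill-press chain, giving a single machine change.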

  4. Artificial Intelligence.

    Science.gov (United States)

    Wash, Darrel Patrick

    1989-01-01

    Making a machine seem intelligent is not easy. As a consequence, demand has been rising for computer professionals skilled in artificial intelligence and is likely to continue to go up. These workers develop expert systems and solve the mysteries of machine vision, natural language processing, and neural networks. (Editor)

  5. Event Detection Intelligent Camera: Demonstration of flexible, real-time data taking and processing

    Energy Technology Data Exchange (ETDEWEB)

    Szabolics, Tamás, E-mail: szabolics.tamas@wigner.mta.hu; Cseh, Gábor; Kocsis, Gábor; Szepesi, Tamás; Zoletnik, Sándor

    2015-10-15

    Highlights: • We present EDICAM's operating principles. • Firmware test results. • Software test results. • Further developments. - Abstract: An innovative fast camera (EDICAM – Event Detection Intelligent CAMera) was developed by MTA Wigner RCP in the last few years. This new concept was designed for intelligent event-driven processing, able to detect predefined events and track objects in the plasma. The camera provides a moderate frame rate of 400 Hz at full frame resolution (1280 × 1024), and smaller regions of interest can be read out at 1–140 kHz even during exposure of the full image. One of the most important advantages of this hardware is a 10 Gbit/s optical link, which ensures very fast communication and data transfer between the PC and the camera and enables two levels of processing: primitive algorithms in the camera hardware and high-level processing in the PC. With the first version of the firmware, the camera hardware has successfully proven able to monitor the plasma in several fusion devices, for example at ASDEX Upgrade, KSTAR and COMPASS. A new firmware and software package is under development. It allows predefined events to be detected in real time, so the camera can change its own operation or issue warnings, e.g. to the safety system of the experiment. The EDICAM system can handle a huge amount of data (up to TBs) at a high data rate (950 MB/s) and will be used as the central element of the 10-camera overview video diagnostic system of the Wendelstein 7-X (W7-X) stellarator. This paper presents key elements of the newly developed built-in intelligence, stressing the revolutionary new features and the results of tests of the different software elements.

  6. The Virtual UNICOS Process Expert: integration of Artificial Intelligence tools in Control Systems

    CERN Multimedia

    Vilches Calvo, I; Barillere, R

    2009-01-01

    UNICOS is a CERN framework for producing control applications. It provides operators with ways to interact with all process items, from the simplest (e.g. I/O channels) to the most abstract objects (e.g. a part of the plant). This possibility of fine-grained operation is particularly useful for recovering from abnormal situations, provided operators have the required knowledge. The Virtual UNICOS Process Expert project aims at providing operators with means to handle difficult operational cases for which the intervention of process experts is usually requested. The main idea of the project is to use the openness of UNICOS-based applications to integrate tools (e.g. Artificial Intelligence tools) that will act as process experts to analyze complex situations and to propose and execute smooth recovery procedures.

  7. Gaussian process based intelligent sampling for measuring nano-structure surfaces

    Science.gov (United States)

    Sun, L. J.; Ren, M. J.; Yin, Y. H.

    2016-09-01

    Nanotechnology is the science and engineering of manipulating matter at the nano scale, which can be used to create many new materials and devices with a vast range of applications. As nanotech products increasingly enter the commercial marketplace, nanometrology becomes a stringent and enabling technology for the manipulation and quality control of nanotechnology. However, many measuring instruments, for instance scanning probe microscopes, are limited to relatively small areas of hundreds of micrometers and have very low efficiency. Intelligent sampling strategies are therefore required to improve scanning efficiency when measuring large areas. This paper presents a Gaussian process based intelligent sampling method to address this problem. The method uses Gaussian process based Bayesian regression as the mathematical foundation for representing the surface geometry, and the posterior estimate of the Gaussian process is computed by combining the prior probability distribution with the maximum likelihood function. Each sampling point is then selected adaptively by determining the candidate position most likely to lie outside the required tolerance zone, and this point is inserted to update the model iteratively. Simulations on both a nominal surface and a manufactured surface have been conducted on nano-structured surfaces to verify the validity of the proposed method. The results imply that the proposed method significantly improves measurement efficiency for large-area structured surfaces.
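The adaptive loop described above, fitting a GP to the points measured so far and then sampling where the model is least certain, can be sketched in a few lines. This simplified illustration selects the point of maximum posterior variance rather than the paper's tolerance-zone criterion; the test surface, kernel, and parameters are invented:

```python
import numpy as np

def rbf(a, b, length=0.2, var=1.0):
    """Squared-exponential kernel between two 1-D point sets."""
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

def gp_posterior(xs, ys, xq, noise=1e-4):
    """GP posterior mean and variance at query points xq, given data (xs, ys)."""
    K = rbf(xs, xs) + noise * np.eye(len(xs))
    Ks = rbf(xs, xq)
    Kss = rbf(xq, xq)
    sol = np.linalg.solve(K, Ks)
    mean = sol.T @ ys
    var = np.diag(Kss) - np.sum(Ks * sol, axis=0)
    return mean, var

surface = lambda x: np.sin(6 * x)            # stand-in for the measured profile
candidates = np.linspace(0, 1, 101)
xs = np.array([0.0, 1.0])                    # two seed measurements
for _ in range(8):
    ys = surface(xs)
    _, var = gp_posterior(xs, ys, candidates)
    xs = np.append(xs, candidates[np.argmax(var)])  # measure the most uncertain point

mean, var = gp_posterior(xs, surface(xs), candidates)
rmse = np.sqrt(np.mean((mean - surface(candidates)) ** 2))
print(len(xs), round(float(rmse), 3))
```

Each iteration adds one measurement where the posterior is widest, so the model converges with far fewer points than a dense uniform scan.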

  8. Cognitive Processing Speed, Working Memory, and the Intelligibility of Hearing Aid-Processed Speech in Persons with Hearing Impairment

    Directory of Open Access Journals (Sweden)

    Wycliffe Kabaywe Yumba

    2017-08-01

    Full Text Available Previous studies have demonstrated that successful listening with advanced signal processing in digital hearing aids is associated with individual cognitive capacity, particularly working memory capacity (WMC). This study aimed to examine the relationship between cognitive abilities (cognitive processing speed and WMC) and individual listeners' responses to digital signal processing settings in adverse listening conditions. A total of 194 native Swedish speakers (83 women and 111 men), aged 33–80 years (mean = 60.75 years, SD = 8.89), with bilateral, symmetrical mild to moderate sensorineural hearing loss, who had completed a lexical decision speed test (measuring cognitive processing speed) and a semantic word-pair span test (SWPST, capturing WMC), participated in this study. The Hagerman test (capturing speech recognition in noise) was conducted using an experimental hearing aid with three digital signal processing settings: (1) linear amplification without noise reduction (NoP), (2) linear amplification with noise reduction (NR), and (3) non-linear amplification without NR ("fast-acting compression"). The results showed that cognitive processing speed was a better predictor of speech intelligibility in noise, regardless of the type of signal processing algorithm used. That is, there was a stronger association between cognitive processing speed and NR outcomes and fast-acting compression outcomes (in steady-state noise). We observed a weaker relationship between working memory and NR, and WMC did not relate to fast-acting compression. WMC was a relatively weaker predictor of speech intelligibility in noise. These findings might have been different if the participants had been provided with training and/or allowed to acclimatize to binary-masking noise reduction or fast-acting compression.

  9. Research on application of intelligent computation based LUCC model in urbanization process

    Science.gov (United States)

    Chen, Zemin

    2007-06-01

    Global change study is an interdisciplinary, comprehensive research activity with international cooperation that arose in the 1980s and has the largest scope. The interaction between land use and cover change (LUCC), as a research field crossing natural science and social science, has become one of the core subjects of global change study as well as its front edge and hot point. It is necessary to study land use and cover change in the urbanization process and to build an analog model of urbanization in order to describe, simulate, and analyze the dynamic behaviors of urban development change, and to understand the basic characteristics and rules of the urbanization process. This has positive practical and theoretical significance for formulating urban and regional sustainable development strategy. The effect of urbanization on land use and cover change is mainly embodied in changes to the quantity structure and space structure of urban space, and the LUCC model of the urbanization process has been an important research subject of urban geography and urban planning. In this paper, building upon previous research achievements, the writer systematically analyzes research on land use/cover change in the urbanization process with the theories of complexity science and intelligent computation; builds a model for simulating and forecasting the dynamic evolution of urban land use and cover change on the basis of the cellular automaton model of complexity science and multi-agent theory; and expands the Markov model, the traditional CA model, and the agent model, introducing complexity science and intelligent computation theory into the LUCC research model to build an intelligent-computation-based LUCC model for analog research on land use and cover change in urbanization, together with case research. The concrete contents are as follows: 1. Complexity of LUCC research in the urbanization process. Analyze the urbanization process in combination with the contents
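The cellular-automaton core of such an urban LUCC model can be sketched as a simple neighbourhood-driven growth rule. This is a toy illustration, not the model developed in the paper; the grid size, neighbour threshold, growth probability, and toroidal boundary are arbitrary choices:

```python
import numpy as np

def urban_ca_step(grid, rng, threshold=3, p=0.3):
    """One CA transition: a non-urban cell urbanizes with probability p
    when at least `threshold` of its 8 neighbours are already urban.
    np.roll gives a toroidal (wrap-around) boundary for simplicity."""
    n = sum(np.roll(np.roll(grid, dy, 0), dx, 1)
            for dy in (-1, 0, 1) for dx in (-1, 0, 1)
            if (dy, dx) != (0, 0))
    grow = (grid == 0) & (n >= threshold) & (rng.random(grid.shape) < p)
    return grid | grow

rng = np.random.default_rng(42)
grid = np.zeros((50, 50), dtype=int)
grid[24:27, 24:27] = 1                 # a small urban seed in the centre
for _ in range(30):
    grid = urban_ca_step(grid, rng)
print(int(grid.sum()))
```

Coupling such a transition rule to socio-economic agents and Markov transition probabilities is what turns this toy into the kind of hybrid LUCC model the abstract describes.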

  10. Work process and task-based design of intelligent assistance systems in German textile industry

    Science.gov (United States)

    Löhrer, M.; Ziesen, N.; Altepost, A.; Saggiomo, M.; Gloy, Y. S.

    2017-10-01

    The German textile industry, shaped by mid-sized companies, must face social challenges such as demographic change and changing technical processes. Interaction with intelligent systems (on machines) and increasing automation change processes, working structures, and employees' tasks on all levels. Work contents are getting more complex, creating the need for diversified and enhanced competencies. Mobile devices like tablets and smartphones are increasingly finding their way into the workplace. Employees who grew up with new forms of media have certain advantages in using modern technologies compared to older employees. It is therefore necessary to design new systems that help adapt the competencies of both younger and older employees to new automated production processes in the digital work environment. The key to successful integration of technical assistance systems is user-oriented design and development that includes concepts for competency development under consideration of, e.g., ethical and legal aspects.

  11. The research of new type stratified water injection process intelligent measurement technology

    Science.gov (United States)

    Zhao, Xin

    2017-10-01

    To meet the injection and development needs of the Daqing Oilfield, water injection has progressed from general injection in the early stage to subdivided stratified injection, with the aim of improving the degree of reservoir utilization and the qualified rate of water injection, and of improving the performance of the injection string and its matching processes, yielding a set of effective water injection technologies suitable for high water-cut conditions. The new intelligent measurement technology for stratified water injection combines multi-parameter testing and flow control into a unified whole and automatically monitors the operation of each interval over the long term. The process features multi-layer synchronous measurement, continuous monitoring of process parameters, and centralized data recording, which not only meets the requirement of subdivided water injection but also realizes automatic synchronous measurement of each interval, greatly improving the efficiency of stratified injection wells and providing a new means for tapping remaining oil potential.

  12. A Spatially Intelligent Public Participation System for the Environmental Impact Assessment Process

    Directory of Open Access Journals (Sweden)

    Lei Lei

    2013-05-01

    Full Text Available An environmental impact assessment (EIA) is a decision-making process that evaluates the possible significant effects that a proposed project may exert on the environment. The EIA scoping and reviewing stages often involve public participation. Although its importance has long been recognized, public participation in the EIA process is often regarded as ineffective due to time, budget, resource, technical, and procedural constraints, as well as the complexity of environmental information. Geographic Information Systems (GIS) and Volunteered Geographic Information (VGI) have the potential to contribute to data collection, sharing, and presentation; to utilize local user-generated content to benefit decision-making; and to increase public outreach. This research integrated GIS, VGI, social media tools, data mining, and mobile technology to design a spatially intelligent framework that presents and shares EIA information effectively with the public. A spatially intelligent public participation system (SIPPS) was developed as a proof of concept of the framework. The research selected the Tehachapi Renewable Transmission Project (TRTP) as the pilot study area. Survey questionnaires were designed to collect feedback and conduct evaluation. Results show that SIPPS was able to improve the effectiveness of public participation, promote environmental awareness, and achieve good system usability.

  13. Open-source intelligence in the Czech military knowledge system and process design

    OpenAIRE

    Krejci, Roman

    2002-01-01

    Owing to the recent transitions in the Czech Republic, the Czech military must satisfy a large set of new requirements. One way the military intelligence can become more effective and can conserve resources is by increasing the efficiency of open-source intelligence (OSINT), which plays an important part in intelligence gathering in the age of information. When using OSINT effectively, the military intelligence can elevate its responsiveness to different types of crises and can also properly ...

  14. Study of an intelligent system for wells elevation and petroliferous processes control; Estudo de um sistema inteligente para elevacao de pocos e controle de processos petroliferos

    Energy Technology Data Exchange (ETDEWEB)

    Patricio, Antonio Rodrigues

    1996-11-01

    The petroleum production problems were studied by means of an integrated process evaluation of a rod pumping well, a gas lift well, and a process unit for produced fluids. Using artificial intelligence concepts such as fuzzy logic and neural systems, SIEP, an Intelligent System for Production Lift and Process Control, is presented, aimed at the integrated management of the petroleum production process. (author)

  15. Emotional Intelligence Tests: Potential Impacts on the Hiring Process for Accounting Students

    Science.gov (United States)

    Nicholls, Shane; Wegener, Matt; Bay, Darlene; Cook, Gail Lynn

    2012-01-01

    Emotional intelligence is increasingly recognized as being important for professional career success. Skills related to emotional intelligence (e.g. organizational commitment, public speaking, teamwork, and leadership) are considered essential. Human resource professionals have begun including tests of emotional intelligence (EI) in job applicant…

  16. Introduction of self-control of enterprise information system through accounting documentation process of

    Directory of Open Access Journals (Sweden)

    K.О. Volskа

    2017-12-01

    Full Text Available The research is devoted to determining the possibility of implementing self-control in an enterprise information system and to describing the criteria for building an information system that is self-organized and capable of self-analysis. The article considers the concept of self-control and its main criteria, as well as the possibility of implementing self-control in the information system of the enterprise. The study defines intelligent information systems and how to use expert knowledge in them. The article situates self-control (in terms of its organization at the enterprise) within the economic activity of the enterprise and its relation to internal control; as a result, it is suggested that the self-control of the information system be considered one of the methods of internal control. The paper compares the response to an error in the information system under usual control (from the subject of control to the person) and under self-control, which makes it possible to characterize the latter as a method of preventing errors, that is, real-time control during data entry into the information system of the enterprise. It is proposed to divide the control mechanisms in the information system into informational (protection of the information system from a technical point of view) and special (accounting, legal, technological, etc.). The special control mechanisms of the information system should initially be formed by experts of the relevant profile, who should present them as algorithms for preventing possible errors, allowing IT professionals to describe them at the software level and implement one of the criteria for self-control of the information system, namely self-examination. The article proposes to implement self-control at the input of the information system, when entering the data of the primary documents
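The proposed input-level self-control, expert-defined rules checked in real time as primary-document data are entered, might be sketched as follows. The field names, rules, and limits are invented for illustration and are not from the article:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    """One expert-defined control rule for a primary-document entry."""
    field: str
    check: Callable[[dict], bool]
    message: str

# Hypothetical rules an accounting expert might register for an invoice line.
RULES = [
    Rule("quantity", lambda d: d["quantity"] > 0,
         "quantity must be positive"),
    Rule("unit_price", lambda d: d["unit_price"] >= 0,
         "price cannot be negative"),
    Rule("total", lambda d: abs(d["total"] - d["quantity"] * d["unit_price"]) < 0.01,
         "total must equal quantity × unit price"),
]

def validate_entry(document: dict) -> list[str]:
    """Run every rule at data-entry time and report violations immediately,
    so errors are prevented rather than detected after the fact."""
    return [r.message for r in RULES if not r.check(document)]

errors = validate_entry({"quantity": 5, "unit_price": 2.0, "total": 11.0})
print(errors)  # → ['total must equal quantity × unit price']
```

The point of the sketch is the division of labour the article describes: domain experts author the rules, IT specialists encode them, and the system then checks itself on every entry.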

  17. Intelligent monitoring and fault diagnosis for ATLAS TDAQ: a complex event processing solution

    CERN Document Server

    Magnoni, Luca; Luppi, Eleonora

    Effective monitoring and analysis tools are fundamental in modern IT infrastructures to get insights on the overall system behavior and to deal promptly and effectively with failures. In recent years, Complex Event Processing (CEP) technologies have emerged as effective solutions for information processing from the most disparate fields: from wireless sensor networks to financial analysis. This thesis proposes an innovative approach to monitor and operate complex and distributed computing systems, in particular referring to the ATLAS Trigger and Data Acquisition (TDAQ) system currently in use at the European Organization for Nuclear Research (CERN). The result of this research, the AAL project, is currently used to provide ATLAS data acquisition operators with automated error detection and intelligent system analysis. The thesis begins by describing the TDAQ system and the controlling architecture, with a focus on the monitoring infrastructure and the expert system used for error detection and automated reco...

  18. Documentation Protocols to Generate Risk Indicators Regarding Degradation Processes for Cultural Heritage Risk Evaluation

    Science.gov (United States)

    Kioussi, A.; Karoglou, M.; Bakolas, A.; Labropoulos, K.; Moropoulou, A.

    2013-07-01

    Sustainable maintenance and preservation of cultural heritage assets depends highly on their resilience to external or internal alterations and to various hazards. Risk assessment of a heritage asset can be defined as the identification of all potential hazards affecting it and the evaluation of the asset's vulnerability (the conservation state of its building materials and structure). Potential hazards for cultural heritage are complex and varied. The risk of decay and damage associated with monuments is not limited to certain long-term natural processes, sudden events, and human impact (the macroscale of the heritage asset) but is also a function of the degradation processes within materials and structural elements due to physical and chemical procedures. Obviously, these factors cover different scales of the problem. The deteriorating processes in materials may be triggered by external influences or caused by internal chemical and/or physical variations of material properties and characteristics. Therefore, risk evaluation should aim to reveal the specific active decay and damage mechanism at both the mesoscale [type of decay and damage] and microscale [mechanism of the decay phenomenon] levels. A prerequisite for identifying and developing risk indicators is an organised source of comparable and interoperable data about the heritage assets under observation. This unified source of information offers a knowledge-based background on the asset's vulnerability through the diagnosis of the conservation state of building materials and structure, through the identification of all potential hazards affecting them, and through mapping of possible alterations during the asset's entire lifetime. In this framework, the identification and analysis of risks regarding degradation processes for the development of qualitative and quantitative indicators can be supported by documentation protocols. The data investigated by such protocols help

  19. DOCUMENTATION PROTOCOLS TO GENERATE RISK INDICATORS REGARDING DEGRADATION PROCESSES FOR CULTURAL HERITAGE RISK EVALUATION

    Directory of Open Access Journals (Sweden)

    A. Kioussi

    2013-07-01

    Full Text Available Sustainable maintenance and preservation of cultural heritage assets depends highly on their resilience to external or internal alterations and to various hazards. Risk assessment of a heritage asset can be defined as the identification of all potential hazards affecting it and the evaluation of the asset's vulnerability (the conservation state of its building materials and structure). Potential hazards for cultural heritage are complex and varied. The risk of decay and damage associated with monuments is not limited to certain long-term natural processes, sudden events, and human impact (the macroscale of the heritage asset) but is also a function of the degradation processes within materials and structural elements due to physical and chemical procedures. Obviously, these factors cover different scales of the problem. The deteriorating processes in materials may be triggered by external influences or caused by internal chemical and/or physical variations of material properties and characteristics. Therefore, risk evaluation should aim to reveal the specific active decay and damage mechanism at both the mesoscale [type of decay and damage] and microscale [mechanism of the decay phenomenon] levels. A prerequisite for identifying and developing risk indicators is an organised source of comparable and interoperable data about the heritage assets under observation. This unified source of information offers a knowledge-based background on the asset's vulnerability through the diagnosis of the conservation state of building materials and structure, through the identification of all potential hazards affecting them, and through mapping of possible alterations during the asset's entire lifetime. In this framework, the identification and analysis of risks regarding degradation processes for the development of qualitative and quantitative indicators can be supported by documentation protocols. The data investigated by such

  20. UMTRA Surface Project management action process document. Final report: Revision 1

    International Nuclear Information System (INIS)

    1996-04-01

    A critical mission of the US Department of Energy (DOE) is the planning, implementation, and completion of environmental restoration (ER) programs at facilities that were operated by or in support of the former Atomic Energy Commission (AEC) from the late 1940s into the 1970s. Among these facilities are the 24 former uranium mill sites designated in the Uranium Mill Tailings Radiation Control Act (UMTRCA) of 1978 (42 USC section 7901 et seq.). Title 1 of the UMTRCA authorized the DOE to undertake remedial actions at these designated sites and associated vicinity properties (VP), which contain uranium mill tailings and other residual radioactive materials (RRM) derived from the processing sites. Title 2 of the UMTRCA addresses uranium mill sites that were licensed at the time the UMTRCA was enacted. Cleanup of these Title 2 sites is the responsibility of the licensees. The cleanup of the Title 1 sites has been split into two separate projects: the Surface Project, which deals with the mill buildings, tailings, and contaminated soils at the sites and VPs; and the Ground Water Project, which is limited to the contaminated ground water at the sites. This management action process (MAP) document discusses the Uranium Mill Tailings Remedial Action (UMTRA) Surface Project only; a separate MAP document has been prepared for the UMTRA Ground Water Project.

  1. Documenting the use of expert scientific reasoning processes by high school physics students

    Directory of Open Access Journals (Sweden)

    A. Lynn Stephens

    2010-11-01

    Full Text Available We describe a methodology for identifying evidence for the use of three types of scientific reasoning. In two case studies of high school physics classes, we used this methodology to identify multiple instances of students using analogies, extreme cases, and Gedanken experiments. Previous case studies of expert scientists have indicated that these processes can be central during scientific model construction; here we code for their spontaneous use by students. We document evidence for numerous instances of these forms of reasoning in these classes. Most of these instances were associated with motion- and force-indicating depictive gestures, which we take as one kind of evidence for the use of animated mental imagery. Altogether, this methodology shows promise for use in highlighting the role of nonformal reasoning in student learning and for investigating the possible association of animated mental imagery with scientific reasoning processes.

  2. Energy-efficient hierarchical processing in the network of wireless intelligent sensors (WISE)

    Science.gov (United States)

    Raskovic, Dejan

    Sensor network nodes have benefited from technological advances in the field of wireless communication, processing, and power sources. However, the processing power of microcontrollers is often not sufficient to perform sophisticated processing, while the power requirements of digital signal processing boards or handheld computers are usually too demanding for prolonged system use. We are matching the intrinsic hierarchical nature of many digital signal-processing applications with the natural hierarchy in distributed wireless networks, and building the hierarchical system of wireless intelligent sensors. Our goal is to build a system that will exploit the hierarchical organization to optimize the power consumption and extend battery life for the given time and memory constraints, while providing real-time processing of sensor signals. In addition, we are designing our system to be able to adapt to the current state of the environment, by dynamically changing the algorithm through procedure replacement. This dissertation presents the analysis of hierarchical environment and methods for energy profiling used to evaluate different system design strategies, and to optimize time-effective and energy-efficient processing.

  3. Document segmentation via oblique cuts

    Science.gov (United States)

    Svendsen, Jeremy; Branzan-Albu, Alexandra

    2013-01-01

    This paper presents a novel solution for the layout segmentation of graphical elements in Business Intelligence documents. We propose a generalization of the recursive X-Y cut algorithm which allows cutting along arbitrary oblique directions. An intermediate processing step consisting of line and solid-region removal is also necessary due to the presence of decorative elements. The output of the proposed segmentation is a hierarchical structure that allows for the identification of primitives in pie and bar charts. The algorithm was tested on a database composed of charts from business documents. Results are very promising.
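For reference, the classic axis-aligned recursive X-Y cut that the paper generalizes can be sketched as below; the oblique-cut extension and the line and solid-region removal step are not implemented in this sketch:

```python
import numpy as np

def xy_cut(img, y0=0, x0=0):
    """Axis-aligned recursive X-Y cut on a binary page image.
    Splits at blank rows/columns in the projection profiles and returns
    (top, left, bottom, right) boxes of content blocks (ends exclusive)."""
    ys, xs = np.nonzero(img)
    if ys.size == 0:
        return []
    # Tighten to the bounding box of the ink.
    t, b, l, r = ys.min(), ys.max() + 1, xs.min(), xs.max() + 1
    sub = img[t:b, l:r]
    for axis in (0, 1):                       # try horizontal, then vertical cuts
        profile = sub.sum(axis=1 - axis)      # row sums (axis 0) or column sums
        gaps = np.flatnonzero(profile == 0)   # blank lines inside the box
        if gaps.size:
            c = gaps[gaps.size // 2]          # cut at one interior gap
            if axis == 0:
                halves = [(sub[:c], 0, 0), (sub[c + 1:], c + 1, 0)]
            else:
                halves = [(sub[:, :c], 0, 0), (sub[:, c + 1:], 0, c + 1)]
            out = []
            for part, dy, dx in halves:       # recurse with global offsets
                out += xy_cut(part, y0 + t + dy, x0 + l + dx)
            return out
    return [(y0 + t, x0 + l, y0 + b, x0 + r)]  # no cut possible: a leaf block

# A toy page: two "paragraphs" separated by a blank band.
page = np.zeros((20, 10), dtype=int)
page[2:6, 1:9] = 1
page[12:17, 2:8] = 1
print(xy_cut(page))
```

Replacing the horizontal/vertical projections with projections along arbitrary angles is, in essence, the oblique generalization the paper proposes.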

  4. An Integrated Open Approach to Capturing Systematic Knowledge for Manufacturing Process Innovation Based on Collective Intelligence

    Directory of Open Access Journals (Sweden)

    Gangfeng Wang

    2018-02-01

    Full Text Available Process innovation plays a vital role in realizing the manufacture of increasingly complex new products, especially in the context of sustainable development and cleaner production. Knowledge-based innovation design can inspire designers' creative thinking; however, the existing scattered knowledge has not yet been properly captured and organized for Computer-Aided Process Innovation (CAPI). This paper therefore proposes an integrated approach to tackle this non-trivial issue. By analyzing the design process of CAPI and the technical features of open innovation, a novel holistic paradigm of process innovation knowledge capture based on collective intelligence (PIKC-CI) is constructed from the perspective of the knowledge life cycle. Then, a multi-source innovation knowledge fusion algorithm based on semantic element reconfiguration is applied to form new public knowledge. To ensure the credibility and orderliness of innovation knowledge refinement, a collaborative editing strategy based on knowledge locks and a knowledge–social trust degree is explored. Finally, a knowledge management system, MPI-OKCS, integrating the proposed techniques is implemented on the pre-built CAPI general platform, and a welding process innovation example illustrates the feasibility of the proposed approach. It is expected that this work will lay the foundation for future knowledge-inspired CAPI and smart process planning.

  5. ParaText : scalable solutions for processing and searching very large document collections : final LDRD report.

    Energy Technology Data Exchange (ETDEWEB)

    Crossno, Patricia Joyce; Dunlavy, Daniel M.; Stanton, Eric T.; Shead, Timothy M.

    2010-09-01

    This report is a summary of the accomplishments of the 'Scalable Solutions for Processing and Searching Very Large Document Collections' LDRD, which ran from FY08 through FY10. Our goal was to investigate scalable text analysis; specifically, methods for information retrieval and visualization that could scale to extremely large document collections. Towards that end, we designed, implemented, and demonstrated a scalable framework for text analysis - ParaText - as a major project deliverable. Further, we demonstrated the benefits of using visual analysis in text analysis algorithm development, improved performance of heterogeneous ensemble models in data classification problems, and the advantages of information theoretic methods in user analysis and interpretation in cross language information retrieval. The project involved 5 members of the technical staff and 3 summer interns (including one who worked two summers). It resulted in a total of 14 publications, 3 new software libraries (2 open source and 1 internal to Sandia), several new end-user software applications, and over 20 presentations. Several follow-on projects have already begun or will start in FY11, with additional projects currently in proposal.

  6. ECG Signal Processing, Classification and Interpretation: A Comprehensive Framework of Computational Intelligence

    CERN Document Server

    Pedrycz, Witold

    2012-01-01

    Electrocardiogram (ECG) signals are among the most important sources of diagnostic information in healthcare so improvements in their analysis may also have telling consequences. Both the underlying signal technology and a burgeoning variety of algorithms and systems developments have proved successful targets for recent rapid advances in research. ECG Signal Processing, Classification and Interpretation shows how the various paradigms of Computational Intelligence, employed either singly or in combination, can produce an effective structure for obtaining often vital information from ECG signals. Neural networks do well at capturing the nonlinear nature of the signals, information granules realized as fuzzy sets help to confer interpretability on the data and evolutionary optimization may be critical in supporting the structural development of ECG classifiers and models of ECG signals. The contributors address concepts, methodology, algorithms, and case studies and applications exploiting the paradigm of Comp...

  7. Argumentative SOX Compliant and Quality Decision Support Intelligent Expert System over the Suppliers Selection Process

    Directory of Open Access Journals (Sweden)

    Jesus Angel Fernandez Canelas

    2013-01-01

    Full Text Available The objective of this paper is to define a decision support system over SOX (Sarbanes-Oxley Act) compatibility and the quality of the Suppliers Selection Process, based on Artificial Intelligence and Argumentation Theory knowledge and techniques. The SOX Law, currently in effect, was created to improve governmental financial control over US companies. This law has become a de facto standard outside the United States due to several factors, such as present-day globalization, the expansion of US companies, and the key influence of US stock exchange markets worldwide. This paper constitutes a novel approach to this kind of problem due to the following elements: (1) it has an optimized structure for searching for the solution, (2) it has a dynamic learning method to handle the decisions of courts and government control bodies, (3) it uses fuzzy knowledge to improve its performance, and (4) it uses its accumulated past experience to let the system evolve far beyond its initial state.

  8. Study on robot motion control for intelligent welding processes based on the laser tracking sensor

    Science.gov (United States)

    Zhang, Bin; Wang, Qian; Tang, Chen; Wang, Ju

    2017-06-01

    A robot motion control method is presented for intelligent welding of complex spatial free-form curve seams, based on a laser tracking sensor. First, the tip position of the welding torch is calculated from the torch velocity and the seam trajectory detected by the sensor. Then, the optimal pose of the torch is searched for under constraints using a genetic algorithm, so that the intersection point of the weld seam and the sensor's laser plane remains within the detectable range of the sensor, while the angle between the axis of the welding torch and the tangent of the weld seam meets the requirements. The feasibility of the control method is demonstrated by simulation.
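
The constrained pose search described above can be illustrated with a toy genetic algorithm. Everything numeric here is an illustrative assumption, not the paper's model: the torch pose is reduced to a single angle, the sensor's detectable window and the look-ahead offset model are invented, and the 45° target work angle is hypothetical.

```python
import math
import random

random.seed(42)

# Hypothetical toy model: the torch pose is a single angle (rad) relative to
# the seam tangent.  The GA seeks a pose that keeps the seam/laser-plane
# intersection inside the sensor's detectable window (a hard constraint)
# while driving the torch/tangent angle toward a target work angle.
TARGET_ANGLE = math.radians(45)   # assumed required torch/tangent angle
SENSOR_WINDOW = (-0.2, 0.2)       # assumed detectable offset range (rad)

def fitness(angle):
    """Penalize deviation from the target angle; heavily penalize poses
    whose look-ahead offset leaves the sensor window."""
    offset = 0.5 * math.sin(angle - TARGET_ANGLE)  # toy offset model
    penalty = 0.0 if SENSOR_WINDOW[0] <= offset <= SENSOR_WINDOW[1] else 10.0
    return abs(angle - TARGET_ANGLE) + penalty

def ga_search(pop_size=30, generations=60, mut_sigma=0.05):
    pop = [random.uniform(0, math.pi / 2) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]           # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = 0.5 * (a + b)                # arithmetic crossover
            child += random.gauss(0, mut_sigma)  # Gaussian mutation
            children.append(child)
        pop = parents + children                 # elites survive unmutated
    return min(pop, key=fitness)

best = ga_search()
print(round(math.degrees(best), 1))
```

Because the elite half of each population is carried over unchanged, the best feasible pose improves monotonically and converges to the target work angle within a fraction of a degree.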

  9. Development of an intelligent CAI system for a distributed processing environment

    International Nuclear Information System (INIS)

    Fujii, M.; Sasaki, K.; Ohi, T.; Itoh, T.

    1993-01-01

    In order to operate a nuclear power plant optimally in both normal and abnormal situations, the operators are trained using an operator training simulator in addition to classroom instruction. Individual instruction using a CAI (Computer-Assisted Instruction) system has become popular as a method of learning plant information, such as plant dynamics, operational procedures, plant systems, and plant facilities. An outline is given of a proposed network-based intelligent CAI system (ICAI) incorporating multimedia PWR plant dynamics simulation, teaching aids and educational record management, using the following environment: existing standard workstations and graphic workstations with a live video processing function, the TCP/IP protocol of Unix over Ethernet, and the X window system. (Z.S.) 3 figs., 2 refs

  10. Intelligent Processing Equipment Research and Development Programs of the Department of Commerce

    Science.gov (United States)

    Simpson, J. A.

    1992-01-01

    The intelligent processing equipment (IPE) research and development (R&D) programs of the Department of Commerce are carried out within the National Institute of Standards and Technology (NIST). This institute has had work in support of industrial productivity as part of its mission since its founding in 1901. With the advent of factory automation these efforts have increasingly turned to R&D in IPE. The Manufacturing Engineering Laboratory (MEL) of NIST devotes a major fraction of its efforts to this end, while other elements within the organization, notably the Material Science and Engineering Laboratory, have smaller but significant programs. An inventory of all such programs at NIST is presented, along with a representative selection of projects that demonstrates the scope of the efforts.

  11. Service with a smile: do emotional intelligence, gender, and autonomy moderate the emotional labor process?

    Science.gov (United States)

    Johnson, Hazel-Anne M; Spector, Paul E

    2007-10-01

    This survey study of 176 participants from eight customer service organizations investigated how individual factors moderate the impact of emotional labor strategies on employee well-being. Hierarchical regression analyses indicated that gender and autonomy were significant moderators of the relationships between emotional labor strategies and the personal outcomes of emotional exhaustion, affective well-being, and job satisfaction. Females were more likely to experience negative consequences when engaging in surface acting. Autonomy served to alleviate negative outcomes for individuals who used emotional labor strategies often. Contrary to our hypotheses, emotional intelligence did not moderate the relationship between the emotional labor strategies and personal outcomes. Results demonstrated how the emotional labor process can influence employee well-being. (c) 2007 APA, all rights reserved.

  12. The Relationship between Emotional Intelligence and Cool and Hot Cognitive Processes: A Systematic Review

    Science.gov (United States)

    Gutiérrez-Cobo, María José; Cabello, Rosario; Fernández-Berrocal, Pablo

    2016-01-01

    Although emotion and cognition were considered to be separate aspects of the psyche in the past, researchers today have demonstrated the existence of an interplay between the two processes. Emotional intelligence (EI), or the ability to perceive, use, understand, and regulate emotions, is a relatively young concept that attempts to connect both emotion and cognition. While EI has been demonstrated to be positively related to well-being, mental and physical health, and non-aggressive behaviors, little is known about its underlying cognitive processes. The aim of the present study was to systematically review available evidence about the relationship between EI and cognitive processes as measured through “cool” (i.e., not emotionally laden) and “hot” (i.e., emotionally laden) laboratory tasks. We searched Scopus and Medline to find relevant articles in Spanish and English, and classified the studies according to two variables: cognitive processes (hot vs. cool) and EI instruments used (performance-based ability test, self-report ability test, and self-report mixed test). We identified 26 eligible studies. The results provide a fair amount of evidence that performance-based ability EI (but not self-report EI tests) is positively related with efficiency in hot cognitive tasks. EI, however, does not appear to be related with cool cognitive tasks: neither through self-reporting nor through performance-based ability instruments. These findings suggest that performance-based ability EI could improve individuals’ emotional information processing abilities. PMID:27303277

  13. An Intelligent Complex Event Processing with D Numbers under Fuzzy Environment

    Directory of Open Access Journals (Sweden)

    Fuyuan Xiao

    2016-01-01

    Full Text Available Efficient matching of incoming mass events to persistent queries is fundamental to complex event processing systems. Event matching based on pattern rules is an important feature of a complex event processing engine. However, the intrinsic uncertainty in pattern rules, which are pre-decided by experts, increases the difficulty of effective complex event processing: it inevitably involves various types of intrinsic uncertainty, such as imprecision, fuzziness, and incompleteness, due to the limitations of human subjective judgment. D numbers are a new mathematical tool for modeling uncertainty, since they relax the condition that elements on the frame of discernment must be mutually exclusive. To address the above issues, an intelligent complex event processing method with D numbers under a fuzzy environment is proposed, based on the Technique for Order Preference by Similarity to an Ideal Solution (TOPSIS) method. The novel method can fully support decision making in complex event processing systems. Finally, a numerical example is provided to evaluate the efficiency of the proposed method.
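
The TOPSIS ranking step that the method builds on can be sketched as follows. The decision matrix, criteria and weights below are illustrative assumptions (e.g. ranking candidate pattern matches on rule confidence and timeliness); the D-numbers and fuzzy extensions of the paper are not reproduced.

```python
import numpy as np

# Minimal classic TOPSIS: vector-normalize, weight, then score each
# alternative by its relative closeness to the ideal solution.
def topsis(matrix, weights):
    m = np.asarray(matrix, dtype=float)
    norm = m / np.linalg.norm(m, axis=0)        # vector normalization
    v = norm * weights                          # weighted normalized matrix
    ideal, anti = v.max(axis=0), v.min(axis=0)  # benefit criteria only
    d_pos = np.linalg.norm(v - ideal, axis=1)   # distance to ideal
    d_neg = np.linalg.norm(v - anti, axis=1)    # distance to anti-ideal
    return d_neg / (d_pos + d_neg)              # closeness coefficient

# Three hypothetical pattern-match candidates, two benefit criteria
# (assumed: rule confidence, timeliness), assumed weights 0.6 / 0.4.
scores = topsis([[0.9, 0.6], [0.7, 0.8], [0.5, 0.4]], weights=[0.6, 0.4])
print(scores.round(3), "best:", int(np.argmax(scores)))
```

The alternative dominating on the heavier criterion ranks first; the alternative worst on both criteria gets a closeness of zero because it coincides with the anti-ideal.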

  14. Artificial Intelligence Tools for Scaling Up of High Shear Wet Granulation Process.

    Science.gov (United States)

    Landin, Mariana

    2017-01-01

    The results presented in this article demonstrate the potential of artificial intelligence tools for predicting the endpoint of the granulation process in high-speed mixer granulators of different scales, from 25 L to 600 L. The combination of neurofuzzy logic and gene expression programming technologies allowed the modeling of the impeller power as a function of operating conditions and wet granule properties, establishing the critical variables that affect the response and obtaining a unique experimental polynomial equation (transparent model) of high predictability (R² > 86.78%) for equipment of all sizes. Gene expression programming allowed the modeling of the granulation process for granulators of similar and dissimilar geometries, and the models can be improved by implementing additional characteristics of the process, such as composition variables or operating parameters (e.g., batch size, chopper speed). The principles and the methodology proposed here can be applied to understand and control the manufacturing process using any other granulation equipment, including continuous granulation processes. Copyright © 2016 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
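
The "transparent model" idea above is a polynomial in measurable process variables fitted to data. As a stand-in sketch (the predictors, ranges, coefficients and data are all invented, not the paper's), one can fit an ordinary least-squares polynomial in two assumed predictors and check its fit with R²:

```python
import numpy as np

rng = np.random.default_rng(7)
# Hypothetical predictors: impeller speed (rpm) and liquid amount (% w/w).
speed = rng.uniform(100, 500, 80)
liquid = rng.uniform(5, 25, 80)
# Synthetic "measured" impeller power with noise (illustrative only).
power = (0.002 * speed**2 + 1.5 * liquid + 0.01 * speed * liquid
         + rng.normal(0, 5, 80))

# Design matrix of polynomial terms, fitted by ordinary least squares.
X = np.column_stack([np.ones(80), speed, liquid, speed**2, speed * liquid])
coef, *_ = np.linalg.lstsq(X, power, rcond=None)
pred = X @ coef
r2 = 1 - ((power - pred) ** 2).sum() / ((power - power.mean()) ** 2).sum()
print(round(r2, 4))
```

A transparent polynomial like this can be inspected term by term, which is the practical advantage the abstract claims over black-box models.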

  15. Development of automatic radiographic inspection system using digital image processing and artificial intelligence

    International Nuclear Information System (INIS)

    Itoga, Kouyu; Sugimoto, Koji; Michiba, Koji; Kato, Yuhei; Sugita, Yuji; Onda, Katsuhiro.

    1991-01-01

    The application of computers to welding inspection is expanding rapidly. The applications can be classified into the collection, analysis and processing of data, the graphic display of results, the distinction of the kinds of defects, the evaluation of the harmfulness of defects, and the judgement of acceptance or rejection. The application of computer techniques to the automation of data collection was realized at a relatively early stage. Data processing and the graphic display of results are the techniques in progress now, and the application of artificial intelligence to the distinction of the kinds of defects and the evaluation of harmfulness is expected to expand rapidly. In order to computerize radiographic inspection, the abilities of image processing technology and knowledge engineering must be given to computers. The object of this system is butt joints made by arc welding in steel materials of up to 30 mm thickness. The system digitizes radiographs and evaluates their transmissivity and gradation by image processing; only for those images whose picture quality satisfies the standard are the defect images extracted and displayed, their kinds distinguished, and the final judgement made. The techniques of image processing, the knowledge for distinguishing the kinds of defects and the concept of the practical system are reported. (K.I.)

  16. A New Tool for Intelligent Parallel Processing of Radar/SAR Remotely Sensed Imagery

    Directory of Open Access Journals (Sweden)

    A. Castillo Atoche

    2013-01-01

    Full Text Available A novel parallel tool for large-scale image enhancement/reconstruction and postprocessing of radar/SAR sensor systems is addressed. The proposed parallel tool performs the following intelligent processing steps: image formation, with the application of different system-level image degradation effects of a particular remote sensing (RS) system and simulation of random noising effects; enhancement/reconstruction, employing nonparametric robust high-resolution techniques; and image postprocessing using the fuzzy anisotropic diffusion technique, which incorporates a better edge-preserving noise removal effect and a faster diffusion process. This innovative tool allows the processing of high-resolution images provided by different radar/SAR sensor systems, as required by RS end users for environmental monitoring, risk prevention, and resource management. To verify the performance of the proposed parallel framework, the processing steps are developed and specifically tested on graphics processing units (GPUs), achieving considerable speedups compared to the serial version of the same techniques implemented in the C language.
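
The edge-preserving postprocessing step is a variant of anisotropic diffusion. The sketch below implements the classic Perona-Malik scheme (not the paper's fuzzy variant, and on the CPU rather than a GPU) on a synthetic noisy step image: the edge-stopping function g() shuts diffusion off across strong gradients, so noise is smoothed while the edge survives.

```python
import numpy as np

def anisotropic_diffusion(img, iterations=20, kappa=0.1, gamma=0.2):
    """Perona-Malik diffusion with an exponential edge-stopping function.
    gamma <= 0.25 keeps the 4-neighbor explicit scheme stable."""
    u = img.astype(float).copy()
    for _ in range(iterations):
        # finite differences to the four neighbors (periodic via np.roll)
        dn = np.roll(u, -1, axis=0) - u
        ds = np.roll(u, 1, axis=0) - u
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u, 1, axis=1) - u
        g = lambda d: np.exp(-(d / kappa) ** 2)   # edge-stopping function
        u += gamma * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
    return u

rng = np.random.default_rng(0)
step = np.where(np.arange(64) < 32, 0.0, 1.0)       # sharp vertical edge
noisy = step[None, :].repeat(64, 0) + rng.normal(0, 0.05, (64, 64))
smooth = anisotropic_diffusion(noisy)
# noise variance drops in flat regions while the 0 -> 1 edge is preserved
print(noisy[:, :20].std(), smooth[:, :20].std())
```

With kappa well above the noise amplitude but well below the edge contrast, the conduction coefficient is near 1 for noise gradients and near 0 at the edge, which is exactly the edge-preserving behavior the abstract refers to.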

  17. Real-time operation guide system for sintering process with artificial intelligence

    Institute of Scientific and Technical Information of China (English)

    FAN Xiao-hui; CHEN Xu-ling; JIANG Tao; LI Tao

    2005-01-01

    In order to optimize the sintering process, a real-time operation guide system with artificial intelligence was developed, mainly including an online data acquisition subsystem, a sinter chemical composition controller, a sintering process state controller, and an abnormal conditions diagnosis subsystem. A knowledge base for sintering process control was constructed, and the inference engine of the system was established. Sinter chemical compositions were controlled by the strategies of self-adaptive prediction, internal optimization, and centering on basicity, and the state of sintering was stabilized centering on permeability. To accommodate process changes and keep the system transparent, the system has a learning ability and an explanation function. The software of the system was developed in the Visual C++ programming language. The application of the system shows that the prediction accuracy for sinter compositions and the burn-through point is more than 85%; the first-grade rate of sinter chemical composition, the stability rate of the burn-through point, and the stability rate of the sintering process are increased by 3%, 9%, and 4%, respectively.

  18. Artificial Intelligence Mechanisms on Interactive Modified Simplex Method with Desirability Function for Optimising Surface Lapping Process

    Directory of Open Access Journals (Sweden)

    Pongchanun Luangpaiboon

    2014-01-01

    Full Text Available A study has been made to optimise the influential parameters of the surface lapping process. Lapping time, lapping speed, downward pressure, and charging pressure were chosen from preliminary studies as the parameters that determine process performance in terms of material removal, lap width, and clamp force. Nominal-the-best desirability functions were used to compromise the multiple responses into an overall desirability level, or D response. The conventional modified simplex (Nelder-Mead simplex) method and the interactive desirability function are performed to optimise the parameter levels online in order to maximise the D response. To determine the lapping process parameters effectively, this research then applies two powerful artificial intelligence optimisation mechanisms: the harmony search and firefly algorithms. The recommended condition of (lapping time, lapping speed, downward pressure, charging pressure) = (33, 35, 6.0, 5.0) has been verified by performing confirmation experiments. It showed that the D response level increased to 0.96. When compared with the current operating condition, there is a decrease in material removal and lap width, with improved process performance indices of 2.01 and 1.14, respectively. Similarly, there is an increase in clamp force, with an improved process performance index of 1.58.
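
The overall D response above is a Derringer-style composite: each response is mapped to a [0, 1] desirability and the values are combined with a geometric mean. The sketch below shows the nominal-the-best mapping; the targets, limits and response values are illustrative assumptions, not the paper's settings.

```python
import math

def d_nominal_the_best(y, low, target, high):
    """Nominal-the-best desirability: 1 at the target, falling linearly
    to 0 at the lower/upper limits, 0 outside [low, high]."""
    if y < low or y > high:
        return 0.0
    if y <= target:
        return (y - low) / (target - low)
    return (high - y) / (high - target)

def overall_d(desirabilities):
    # geometric mean: any single zero desirability zeroes the whole D
    return math.prod(desirabilities) ** (1.0 / len(desirabilities))

# Hypothetical measured responses with invented targets/limits.
responses = {
    "material_removal": d_nominal_the_best(11.8, 8.0, 12.0, 16.0),
    "lap_width":        d_nominal_the_best(3.1, 2.0, 3.0, 4.0),
    "clamp_force":      d_nominal_the_best(48.0, 40.0, 50.0, 60.0),
}
D = overall_d(list(responses.values()))
print(round(D, 3))
```

The geometric mean is what makes D a strict compromise criterion: a run that fails any one response outright (desirability 0) scores D = 0 regardless of the others.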

  19. An architectural framework for developing intelligent applications for the carbon dioxide capture process

    Energy Technology Data Exchange (ETDEWEB)

    Luo, C.; Zhou, Q.; Chan, C.W. [Regina Univ., SK (Canada)

    2009-07-01

    This presentation reported on the development of automated application solutions for the carbon dioxide (CO{sub 2}) capture process. An architectural framework was presented for developing intelligent systems for this process. The chemical absorption process consists of dozens of components and therefore generates more than a hundred different types of data. Developing automated support for these tasks is desirable because the monitoring, analysis and diagnosis of the data are very complex. The proposed framework interacts with an implemented domain ontology for the CO{sub 2} capture process, which consists of information derived from senior operators of the CO{sub 2} pilot plant at the International Test Centre for Carbon Dioxide Capture at the University of Regina. The well-defined library within the framework reduces development time and cost. The framework also has built-in web-based software components for data monitoring, management, and analysis. These components provide support for generating automated solutions for the CO{sub 2} capture process. An automated monitoring system was also developed based on the architectural framework.

  20. Political and Budgetary Oversight of the Ukrainian Intelligence Community: Processes, Problems and Prospects for Reform

    National Research Council Canada - National Science Library

    Petrov, Oleksii

    2007-01-01

    This thesis addresses the problem of providing policy and budget oversight of Ukrainian intelligence organizations in accordance with norms and practices developed in contemporary Western democracies...

  1. Document Image Processing: Going beyond the Black-and-White Barrier. Progress, Issues and Options with Greyscale and Colour Image Processing.

    Science.gov (United States)

    Hendley, Tom

    1995-01-01

    Discussion of digital document image processing focuses on issues and options associated with greyscale and color image processing. Topics include speed; size of original document; scanning resolution; markets for different categories of scanners, including photographic libraries, publishing, and office applications; hybrid systems; data…

  2. Process querying : enabling business intelligence through query-based process analytics

    NARCIS (Netherlands)

    Polyvyanyy, A.; Ouyang, C.; Barros, A.; van der Aalst, W.M.P.

    2017-01-01

    The volume of process-related data is growing rapidly: more and more business operations are being supported and monitored by information systems. Industry 4.0 and the corresponding industrial Internet of Things are about to generate new waves of process-related data, next to the abundance of event

  3. Prodiag--a hybrid artificial intelligence based reactor diagnostic system for process faults

    International Nuclear Information System (INIS)

    Reifman, J.; Wei, T.Y.C.; Vitela, J.E.; Applequist, C. A.; Chasensky, T.M.

    1996-01-01

    Commonwealth Research Corporation (CRC) and Argonne National Laboratory (ANL) are collaborating on a DOE-sponsored Cooperative Research and Development Agreement (CRADA) project to perform feasibility studies on a novel approach to Artificial Intelligence (AI) based diagnostics for component faults in nuclear power plants. Investigations are being performed into the construction of a first-principles, physics-based, plant-level process diagnostic expert system (ES) and the identification of component-level fault patterns through operating component characteristics using artificial neural networks (ANNs). The purpose of the proof-of-concept project is to develop a computer-based system using this AI approach to assist process plant operators during off-normal plant conditions. The proposed computer-based system will use thermal-hydraulic (T-H) signals, complemented by other non-T-H signals available in the data stream, to provide the process operator with the component that most likely caused the observed process disturbance. To demonstrate the scale-up feasibility of the proposed diagnostic system, it is being developed for use with the Chemical and Volume Control System (CVCS) of a nuclear power plant. A full-scope operator training simulator representing the Commonwealth Edison Braidwood nuclear power plant is being used both as the source of development data and as the means to evaluate the advantages of the proposed diagnostic system. This is an ongoing multi-year project, and this paper presents the results to date of the CRADA phase

  4. Information Design for “Weak Signal” detection and processing in Economic Intelligence: A case study on Health resources

    Directory of Open Access Journals (Sweden)

    Sahbi Sidhom

    2011-12-01

    Full Text Available The topics of this research cover all phases of “Information Design” applied to detect and profit from weak signals in economic intelligence (EI) or business intelligence (BI). The field of information design (ID) applies to the process of translating complex, unorganized or unstructured data into valuable and meaningful information. ID practice requires an interdisciplinary approach, which combines skills in graphic design (writing, analysis, processing and editing), human performance technology and human factors. Applied in the context of an information system, it allows end users to easily detect implicit topics known as “weak signals” (WS). In our approach to implementing ID, the processes cover the development of a knowledge management (KM) process in the context of EI. A case study concerning the monitoring of health resources information is presented, using ID processes to outline weak signals. Both French and American bibliographic databases were used to make the connection to multilingual concepts in the health watch process.

  5. Black-White Differences in Cognitive Processing: A Study of the Planning, Attention, Simultaneous, and Successive Theory of Intelligence

    Science.gov (United States)

    Naglieri, Jack A.; Rojahn, Johannes; Matto, Holly C.; Aquilino, Sally A.

    2005-01-01

    Researchers have typically found a mean difference of about 15 points between Blacks and Whites on traditional measures of intelligence. Some have argued that the difference between Blacks and Whites would be smaller on measures of cognitive processing. This study examined Black (n = 298) and White (n = 1,691) children on Planning, Attention,…

  6. Intelligence in Artificial Intelligence

    OpenAIRE

    Datta, Shoumen Palit Austin

    2016-01-01

    The elusive quest for intelligence in artificial intelligence prompts us to consider that instituting human-level intelligence in systems may be (still) in the realm of utopia. In about a quarter century, we have witnessed the winter of AI (1990) being transformed and transported to the zenith of tabloid fodder about AI (2015). The discussion at hand is about the elements that constitute the canonical idea of intelligence. The delivery of intelligence as a pay-per-use-service, popping out of ...

  7. Developing emergency exercises for hazardous material transportation: process, documents and templates.

    Science.gov (United States)

    Crichton, Margaret; Kelly, Terence

    2012-01-01

    Multi-agency emergency exercises establish and reinforce relationships, and bring people from different areas together to work as a team, realise clear goals, understand roles and responsibilities, and get to know and respect each agency's strengths and weaknesses. However, despite the long-held belief that exercises benefit both the individual and the organisation, there is little evidence of a consistent and clear process for exercise design, especially one identifying the documents that may need to be completed to ensure efficient exercise preparation and performance. This paper reports the results of a project undertaken on behalf of the organisations that form the radioactive material transportation mutual-aid agreement, RADSAFE, to develop a standardised process for designing emergency exercises. Three stages were developed, from identifying the requirement for an exercise (Stage I), through obtaining approval for operational orders (Stage II), to conducting a management review as part of the continuous improvement cycle (Stage III). Although designed for radioactive material transportation events, it is suggested that many of the factors within these three stages can be generalised to the design of exercises in other high-hazard industries.

  8. Process sensors characterization based on noise analysis technique and artificial intelligence

    International Nuclear Information System (INIS)

    Mesquita, Roberto N. de; Perillo, Sergio R.P.; Santos, Roberto C. dos

    2005-01-01

    The time response of pressure and temperature sensors from the Reactor Protection System (RPS) is a requirement that must be satisfied in nuclear power plants; it is also an indicator of sensor degradation and remaining life. The nuclear power industry and others have been eager to implement smart sensor technologies and digital instrumentation concepts to reduce the manpower and effort currently spent on testing and calibration. Process parameter fluctuations during normal operation of a reactor are caused by random variations in neutron flux, heat transfer and other sources. The sensor output noise can be considered as the response of the system to an input representing the statistical nature of the underlying process, which can be modeled using a time series model. Since the noise signal measurements are influenced by many factors, such as the location of sensors, extraneous noise interference, and randomness in temperature and pressure fluctuations, the quantitative estimate of the time response using autoregressive noise modeling is subject to error. This technique has been used as a means of sensor monitoring. In this work, a set of pressure sensors installed in an experimental loop adapted from a flow calibration setup is used to test and analyze signals in a new approach using artificial intelligence techniques. A set of measurements of dynamic signals under different experimental conditions is used to distinguish and identify underlying process sources. A methodology that uses Blind Separation of Sources with a neural network scheme is being developed to improve the reliability of time response estimates in noise analysis. (author)

  9. Process sensors characterization based on noise analysis technique and artificial intelligence

    Energy Technology Data Exchange (ETDEWEB)

    Mesquita, Roberto N. de; Perillo, Sergio R.P.; Santos, Roberto C. dos [Instituto de Pesquisas Energeticas e Nucleares (IPEN), Sao Paulo, SP (Brazil)]. E-mail: rnavarro@ipen.br; sperillo@ipen.br; rcsantos@ipen.br

    2005-07-01

    The time response of pressure and temperature sensors from the Reactor Protection System (RPS) is a requirement that must be satisfied in nuclear power plants; it is also an indicator of sensor degradation and remaining life. The nuclear power industry and others have been eager to implement smart sensor technologies and digital instrumentation concepts to reduce the manpower and effort currently spent on testing and calibration. Process parameter fluctuations during normal operation of a reactor are caused by random variations in neutron flux, heat transfer and other sources. The sensor output noise can be considered as the response of the system to an input representing the statistical nature of the underlying process, which can be modeled using a time series model. Since the noise signal measurements are influenced by many factors, such as the location of sensors, extraneous noise interference, and randomness in temperature and pressure fluctuations, the quantitative estimate of the time response using autoregressive noise modeling is subject to error. This technique has been used as a means of sensor monitoring. In this work, a set of pressure sensors installed in an experimental loop adapted from a flow calibration setup is used to test and analyze signals in a new approach using artificial intelligence techniques. A set of measurements of dynamic signals under different experimental conditions is used to distinguish and identify underlying process sources. A methodology that uses Blind Separation of Sources with a neural network scheme is being developed to improve the reliability of time response estimates in noise analysis. (author)
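
The autoregressive-noise idea mentioned in the abstract can be sketched on synthetic data: a first-order sensor lag driven by white noise is an AR(1) process, so fitting the AR(1) coefficient phi from the lag-1 autocorrelation recovers an effective time constant tau = -dt / ln(phi). The sampling interval, time constant and data below are assumed stand-ins for real plant signals.

```python
import numpy as np

rng = np.random.default_rng(1)
dt, tau_true = 0.01, 0.5                  # s; assumed sampling step and lag
phi_true = np.exp(-dt / tau_true)         # AR(1) coefficient of the lag
n = 200_000

# Simulate the sensor noise record x[k] = phi * x[k-1] + white noise.
x = np.empty(n)
x[0] = 0.0
eps = rng.normal(0, 1.0, n)
for k in range(1, n):
    x[k] = phi_true * x[k - 1] + eps[k]

# Yule-Walker estimate of phi from the lag-1 autocorrelation.
x0 = x - x.mean()
phi_hat = np.dot(x0[1:], x0[:-1]) / np.dot(x0, x0)
tau_hat = -dt / np.log(phi_hat)
print(round(tau_hat, 3))
```

This also illustrates why the abstract calls the estimate error-prone: tau is very sensitive to phi when phi is close to 1, so modest estimation noise in phi translates into a much larger relative error in the recovered time constant.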

  10. Expert system in OPS5 for intelligent processing of the alarms in nuclear plants

    International Nuclear Information System (INIS)

    Cavalcante Junior, Jose Airton Chaves

    1997-11-01

    This work establishes a knowledge representation model, based on an expert system, for safety and operational supervision, applicable generally to monitoring and fault detection in industrial processes. The model structure proposed here lets the system represent the knowledge related to faults in a process using a combination of rules, either basic or associative. In addition, the proposed model has a mechanism of real-time event propagation that acts on this structure, making intelligent alarm processing possible. The rules used by the system define faults from the data acquired by instrumentation (basic rules), or from the establishment of a conjunction of faults already present (associative rules). The computational implementation of the model, called FDAX (FDA Extended), was developed in OPS5 and applied to an example consisting of a shutdown of the Angra-I power plant. For the simulated tests, FDAX was connected to SICA (Integrated System of Angra-I Computers). The results validated the model, confirming its suitability for real-time applications. (author)

  11. Gender Differences in the Relationship between Emotional Intelligence and Right Hemisphere Lateralization for Facial Processing

    Science.gov (United States)

    Castro-Schilo, Laura; Kee, Daniel W.

    2010-01-01

    The present study examined relationships between emotional intelligence, measured by the Mayer-Salovey-Caruso Emotional Intelligence Test, and right-hemisphere dominance on a free-vision chimeric face test. A sample of 122 ethnically diverse college students participated and completed online versions of the aforementioned tests. A hierarchical…

  12. The Impact of Business Intelligence (BI) Competence on Customer Relationship Management (CRM) Process: An Empirical Investigation of the Banking Industry

    Directory of Open Access Journals (Sweden)

    Ali Mortezaei

    2018-03-01

    Full Text Available Nowadays, establishing long-term and effective relationships with customers is a key factor in understanding customers’ needs and preferences and achieving competitive advantage. In addition, companies are facing a growing need for information and analytical knowledge about their customers, market, competitors, organizational environment, and other factors affecting their business. Business intelligence has been considered a response to this need. The purpose of this study is to investigate the role of business intelligence competence in improving the customer relationship management process. Based on the literature review and the competence–capability relationship paradigm, a conceptual model was developed comprising different dimensions of business intelligence competence and customer relationship management processes. The data were collected from the banking sector, and partial least squares structural equation modelling was employed for data analysis. Empirical results showed that organizational business intelligence competence, comprising managerial, technical, and cultural competence, has a significantly positive impact on enhancing the capabilities of the customer relationship management process, including initiation, maintenance, and termination of the relationship.

  13. Intelligent Mission Controller Node

    National Research Council Canada - National Science Library

    Perme, David

    2002-01-01

    The goal of the Intelligent Mission Controller Node (IMCN) project was to improve the process of translating mission taskings between real-world Command, Control, Communications, Computers, and Intelligence (C4I...

  14. Human Document Project

    NARCIS (Netherlands)

    de Vries, Jeroen; Abelmann, Leon; Manz, A; Elwenspoek, Michael Curt

    2012-01-01

    “The Human Document Project” is a project which tries to answer all of the questions related to preserving information about the human race for tens of generations of humans to come, or maybe even for a future intelligence which can emerge in the coming thousands of years. This document mainly

  15. An intelligent system for monitoring and diagnosis of the CO{sub 2} capture process

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Q.; Chan, C.W.; Tontiwachwuthikul, P. [University of Regina, Regina, SK (Canada). Faculty of Engineering

    2011-07-15

    Amine-based carbon dioxide capture has been widely considered a feasible technology for reducing large-scale CO{sub 2} emissions and mitigating global warming. The operation of amine-based CO{sub 2} capture is a complicated task that involves monitoring over 100 process parameters and careful manipulation of numerous valves and pumps. Current research in the field of CO{sub 2} capture has emphasized the need to improve CO{sub 2} capture efficiency and enhance plant performance. In the present study, artificial intelligence techniques were applied to develop a knowledge-based expert system aimed at effectively monitoring and controlling the CO{sub 2} capture process and thereby enhancing CO{sub 2} capture efficiency. In developing the system, the inferential modeling technique (IMT) was applied to analyze the domain knowledge and problem-solving techniques, and a knowledge base was developed on DeltaV Simulate. The expert system helps to enhance CO{sub 2} capture system performance and efficiency by reducing the time required for diagnosis and problem solving when abnormal conditions occur. The expert system can be used as a decision-support tool that helps inexperienced operators control the plant; it can also be used for training novice operators.

  16. Intelligent Technique for Signal Processing to Identify the Brain Disorder for Epilepsy Captures Using Fuzzy Systems

    Directory of Open Access Journals (Sweden)

    Gurumurthy Sasikumar

    2016-01-01

    Understanding the signals generated by the brain is one of the main tasks in brain signal processing. Among neurological disorders, epilepsy is considered one of the most prevalent, and an automated artificial-intelligence detection technique is essential because of the irregular and unpredictable occurrence of epileptic seizures. We propose an improved fuzzy firefly algorithm that enhances the classification of brain signals efficiently with a minimum number of iterations. An important clustering technique based on fuzzy logic is fuzzy c-means. Features obtained from multichannel EEG signals were combined in both the feature domain and the spatial domain by means of fuzzy algorithms. For a more precise segmentation process, the firefly algorithm is applied to optimize the fuzzy c-means membership function, and convergence criteria are set for efficient clustering. Overall, the proposed technique yields more accurate results, giving it an edge over other techniques. The results of the proposed algorithm are compared with those of other algorithms, such as the fuzzy c-means algorithm and the PSO algorithm.
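
    The fuzzy c-means step described in this abstract can be illustrated with a minimal pure-Python sketch; the toy 1-D "feature" values and the deterministic initialization are assumptions of the sketch, not details from the paper:

    ```python
    def fuzzy_c_means(data, c=2, m=2.0, iters=100, eps=1e-6):
        """Minimal fuzzy c-means on 1-D data (illustrative sketch only)."""
        # deterministic initialization: spread centers across the data range
        centers = [min(data) + i * (max(data) - min(data)) / (c - 1) for i in range(c)]
        u = []
        for _ in range(iters):
            # membership update: u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1))
            u = []
            for x in data:
                d = [abs(x - v) or 1e-12 for v in centers]
                u.append([1.0 / sum((d[i] / d[j]) ** (2 / (m - 1)) for j in range(c))
                          for i in range(c)])
            # center update: v_i = sum_k u_ik^m x_k / sum_k u_ik^m
            new = [sum((u[k][i] ** m) * data[k] for k in range(len(data))) /
                   sum(u[k][i] ** m for k in range(len(data))) for i in range(c)]
            if max(abs(a - b) for a, b in zip(new, centers)) < eps:
                centers = new
                break
            centers = new
        return centers, u

    # two well-separated toy clusters; centers converge near the cluster means
    centers, memberships = fuzzy_c_means([0.1, 0.2, 0.15, 5.0, 5.2, 4.9])
    ```

    The firefly-based membership optimization of the paper is not reproduced here; this shows only the plain alternating update that such metaheuristics refine.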

  17. An intelligent signal processing and pattern recognition technique for defect identification using an active sensor network

    Science.gov (United States)

    Su, Zhongqing; Ye, Lin

    2004-08-01

    The practical utilization of elastic waves, e.g. Rayleigh-Lamb waves, in high-performance structural health monitoring techniques is somewhat impeded due to the complicated wave dispersion phenomena, the existence of multiple wave modes, the high susceptibility to diverse interferences, the bulky sampled data and the difficulty in signal interpretation. An intelligent signal processing and pattern recognition (ISPPR) approach using the wavelet transform and artificial neural network algorithms was developed; this was actualized in a signal processing package (SPP). The ISPPR technique comprehensively functions as signal filtration, data compression, characteristic extraction, information mapping and pattern recognition, capable of extracting essential yet concise features from acquired raw wave signals and further assisting in structural health evaluation. For validation, the SPP was applied to the prediction of crack growth in an alloy structural beam and construction of a damage parameter database for defect identification in CF/EP composite structures. It was clearly apparent that the elastic wave propagation-based damage assessment could be dramatically streamlined by introduction of the ISPPR technique.
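
    The wavelet-based data compression and feature extraction the abstract mentions can be sketched with a one-level Haar transform (the simplest wavelet; the signal values below are invented for illustration):

    ```python
    def haar_dwt(signal):
        """One level of the Haar discrete wavelet transform: returns
        (approximation, detail) coefficients. Length must be even."""
        s2 = 2 ** 0.5
        approx = [(signal[i] + signal[i + 1]) / s2 for i in range(0, len(signal), 2)]
        detail = [(signal[i] - signal[i + 1]) / s2 for i in range(0, len(signal), 2)]
        return approx, detail

    def compress(signal, levels=2):
        """Keep only the coarsest approximation as a compact feature vector
        (length must be divisible by 2**levels)."""
        a = signal
        for _ in range(levels):
            a, _ = haar_dwt(a)
        return a

    wave = [1.0, 1.0, 2.0, 2.0, 8.0, 8.0, 3.0, 3.0]
    features = compress(wave)  # 8 samples reduced to 2 coefficients
    ```

    Discarding the detail coefficients is the crudest possible compression; a practical system would threshold them instead, as wavelet denoising packages do.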

  18. Middle manager role and contribution towards the competitive intelligence process: A case of Irish subsidiaries

    Directory of Open Access Journals (Sweden)

    Willie Chinyamurindi

    2016-07-01

    Full Text Available Background: Calls have been made especially during a period of global competition and economic austerity for research that focuses on how competitive intelligence (CI is actually generated within organisations. Objectives: The aim of this study was to understand the views and experiences of middle managers with regard to their role and contribution towards the CI process within Irish subsidiaries of the Multinational Corporation (MNC. Method: The study adopts a qualitative approach using the semi-structured interview technique to generate narratives and themes around how CI is generated using a sample of 15 middle managers drawn from five participating Irish subsidiaries. Results: Based on the analysis of the narratives of the middle managers, three main themes emerged as findings. Firstly, the process of gathering CI was facilitated by the reliance on internal and external tools. Secondly, information gathered from the use of such tools was then communicated by middle managers to top managers to inform the making of strategic decisions. Thus, (and thirdly, middle managers were found to occupy an important role not only through the execution of their management duties but by extending this influence towards the generation of information deemed to affect the competitive position of not just the subsidiary but also the parent company. Conclusion: The study concludes by focusing on the implications and recommendations based on the three themes drawn from the empirical data.

  19. Alexandria: towards an efficient centralised document management. More efficient business processes

    International Nuclear Information System (INIS)

    Couvreur, D.

    2011-01-01

    The capital of SCK-CEN is the knowledge of its staff, and an enormous amount of information circulates within the research centre. Centralised management of all documents is therefore critical to efficiently manage, share and unlock this expertise. Since 2009, SCK-CEN has been working on a document management system, Alexandria. A first test draft was completed in 2010.

  20. Artificial Intelligence Based Selection of Optimal Cutting Tool and Process Parameters for Effective Turning and Milling Operations

    Science.gov (United States)

    Saranya, Kunaparaju; John Rozario Jegaraj, J.; Ramesh Kumar, Katta; Venkateshwara Rao, Ghanta

    2016-06-01

    With the increased trend in automation of the modern manufacturing industry, human intervention in routine, repetitive and data-specific activities of manufacturing is greatly reduced. In this paper, an attempt has been made to reduce human intervention in the selection of optimal cutting tools and process parameters for metal cutting applications, using Artificial Intelligence techniques. Generally, the selection of an appropriate cutting tool and parameters in metal cutting is carried out by an experienced technician or cutting tool expert based on his knowledge base or an extensive search through a huge cutting tool database. The proposed approach replaces the existing practice of physically searching for tools in databooks and tool catalogues with an intelligent knowledge-based selection system. This system employs artificial intelligence techniques such as artificial neural networks, fuzzy logic and genetic algorithms for decision making and optimization. This intelligence-based optimal tool selection strategy was developed and implemented using Mathworks Matlab Version 7.11.0. The cutting tool database was obtained from the tool catalogues of different tool manufacturers. This paper discusses in detail the methodology and strategies employed for the selection of an appropriate cutting tool and the optimization of process parameters based on multi-objective optimization criteria considering material removal rate, tool life and tool cost.
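
    The multi-objective selection over a tool database can be illustrated with a toy weighted-scoring sketch; the tool records, criteria values, and weights below are all invented for the example and are not from the paper:

    ```python
    # Hypothetical tool records: material removal rate, tool life, tool cost.
    tools = [
        {"name": "carbide_insert_A", "mrr": 120.0, "tool_life": 45.0, "cost": 18.0},
        {"name": "coated_insert_B",  "mrr": 150.0, "tool_life": 30.0, "cost": 25.0},
        {"name": "hss_insert_C",     "mrr":  80.0, "tool_life": 60.0, "cost":  9.0},
    ]

    def score(tool, w_mrr=0.4, w_life=0.4, w_cost=0.2):
        """Normalize each criterion against the best value in the database,
        then combine with weights (higher is better; cost is inverted)."""
        best_mrr = max(t["mrr"] for t in tools)
        best_life = max(t["tool_life"] for t in tools)
        best_cost = min(t["cost"] for t in tools)
        return (w_mrr * tool["mrr"] / best_mrr
                + w_life * tool["tool_life"] / best_life
                + w_cost * best_cost / tool["cost"])

    best = max(tools, key=score)
    ```

    A real system of the kind the paper describes would replace this fixed weighting with neural, fuzzy, or genetic components, but the ranking-over-a-database structure is the same.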

  1. An Intelligent System For Arabic Text Categorization

    NARCIS (Netherlands)

    Syiam, M.M.; Tolba, Mohamed F.; Fayed, Z.T.; Abdel-Wahab, Mohamed S.; Ghoniemy, Said A.; Habib, Mena Badieh

    Text Categorization (classification) is the process of classifying documents into a predefined set of categories based on their content. In this paper, an intelligent Arabic text categorization system is presented. Machine learning algorithms are used in this system. Many algorithms for stemming and

  2. TECHNICAL BASIS DOCUMENT FOR AT-POWER SIGNIFICANCE DETERMINATION PROCESS (SDP) NOTEBOOKS

    International Nuclear Information System (INIS)

    AZARM, M.A.; SMANTA, P.K.; MARTINEZ-GURIDI, G.; HIGGINS, J.

    2004-01-01

    To support the assessment of inspection findings as part of the risk-informed inspection in the United States Nuclear Regulatory Commission's (USNRC's) Reactor Oversight Process (ROP), risk inspection notebooks, also called significance determination process (SDP) notebooks, have been developed for each of the operating plants in the United States. These notebooks serve as a tool for assessing the risk significance of inspection findings and provide an engineering understanding of that significance. Plant-specific notebooks are developed to capture plant-specific features, characteristics, and analyses that influence the risk profile of the plant. At the same time, the notebooks follow a consistent set of assumptions and guidelines to assure consistent treatment of inspection findings across plants. To achieve these objectives, the notebooks are designed to provide specific information that is unique both in the manner in which the information is provided and in the way the screening risk assessment is carried out using it. The unique features of the SDP notebooks, the approaches used to present information for the assessment of inspection findings, and the assumptions used for consistent modeling across different plants, with due credit to plant-specific features and analyses, form the technical basis of the SDP notebooks. In this document, the unique features and the technical basis for the notebooks are presented. The types of information that are included and the reasoning/basis for including that information are discussed. The rules and basis for developing the worksheets that are used by the inspectors in the assessment of inspection findings are presented. The approach to modeling plant responses to different initiating events and the specific assumptions/considerations used for each of the reactor types are also discussed.

  3. Visual function and cognitive speed of processing mediate age-related decline in memory span and fluid intelligence.

    Science.gov (United States)

    Clay, Olivio J; Edwards, Jerri D; Ross, Lesley A; Okonkwo, Ozioma; Wadley, Virginia G; Roth, David L; Ball, Karlene K

    2009-06-01

    To evaluate the relationship between sensory and cognitive decline, particularly with respect to speed of processing, memory span, and fluid intelligence. In addition, the common cause, sensory degradation and speed of processing hypotheses were compared. Structural equation modeling was used to investigate the complex relationships among age-related decrements in these areas. Cross-sectional data analyses included 842 older adult participants (M = 73 years). After accounting for age-related declines in vision and processing speed, the direct associations between age and memory span and between age and fluid intelligence were nonsignificant. Older age was associated with visual decline, which was associated with slower speed of processing, which in turn was associated with greater cognitive deficits. The findings support both the sensory degradation and speed of processing accounts of age-related, cognitive decline. Furthermore, the findings highlight positive aspects of normal cognitive aging in that older age may not be associated with a loss of fluid intelligence if visual sensory functioning and processing speed can be maintained.

  4. Planning pesticides usage for herbal and animal pests based on intelligent classification system with image processing and neural networks

    Directory of Open Access Journals (Sweden)

    Dimililer Kamil

    2018-01-01

    Pests in agriculture are divided into two groups, herbal and animal pests, and their detection and the use of minimum pesticides are quite challenging tasks. Over the last three decades, researchers have been improving their studies in this area, and effective, efficient, and intelligent systems have been designed and modelled. In this paper, an intelligent classification system is designed for detecting pests as herbal or animal so that the proper pesticides can be used accordingly. The designed system comprises two main stages. First, images are processed using different image processing techniques, since the images have specific distinguishing geometric patterns. The second stage is a neural network phase for classification: a backpropagation neural network is trained and tested with the processed images. The system is tested, and experimental results show an efficient and effective classification rate. Autonomy and time efficiency in pesticide usage are also discussed.

  5. 2015 Chinese Intelligent Systems Conference

    CERN Document Server

    Du, Junping; Li, Hongbo; Zhang, Weicun; CISC’15

    2016-01-01

    This book presents selected research papers from the 2015 Chinese Intelligent Systems Conference (CISC’15), held in Yangzhou, China. The topics covered include multi-agent systems, evolutionary computation, artificial intelligence, complex systems, computation intelligence and soft computing, intelligent control, advanced control technology, robotics and applications, intelligent information processing, iterative learning control, and machine learning. Engineers and researchers from academia, industry and the government can gain valuable insights into solutions combining ideas from multiple disciplines in the field of intelligent systems.

  6. Artificial intelligence framework for simulating clinical decision-making: a Markov decision process approach.

    Science.gov (United States)

    Bennett, Casey C; Hauser, Kris

    2013-01-01

    In the modern healthcare system, rapidly expanding costs/complexity, the growing myriad of treatment options, and exploding information streams that often do not effectively reach the front lines hinder the ability to choose optimal treatment decisions over time. The goal in this paper is to develop a general purpose (non-disease-specific) computational/artificial intelligence (AI) framework to address these challenges. This framework serves two potential functions: (1) a simulation environment for exploring various healthcare policies, payment methodologies, etc., and (2) the basis for clinical artificial intelligence - an AI that can "think like a doctor". This approach combines Markov decision processes and dynamic decision networks to learn from clinical data and develop complex plans via simulation of alternative sequential decision paths while capturing the sometimes conflicting, sometimes synergistic interactions of various components in the healthcare system. It can operate in partially observable environments (in the case of missing observations or data) by maintaining belief states about patient health status and functions as an online agent that plans and re-plans as actions are performed and new observations are obtained. This framework was evaluated using real patient data from an electronic health record. The results demonstrate the feasibility of this approach; such an AI framework easily outperforms the current treatment-as-usual (TAU) case-rate/fee-for-service models of healthcare. The cost per unit of outcome change (CPUC) was $189 vs. $497 for AI vs. TAU (where lower is considered optimal) - while at the same time the AI approach could obtain a 30-35% increase in patient outcomes. Tweaking certain AI model parameters could further enhance this advantage, obtaining approximately 50% more improvement (outcome change) for roughly half the costs. Given careful design and problem formulation, an AI simulation framework can approximate optimal
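
    The Markov decision process core of such a framework can be sketched with plain value iteration; the two-state "patient condition" model, its transition probabilities, and its rewards below are invented toy numbers, not the paper's clinical model:

    ```python
    def value_iteration(states, actions, P, R, gamma=0.9, eps=1e-8):
        """P[s][a] -> list of (prob, next_state); R[s][a] -> immediate reward.
        Iterates the Bellman optimality update until values stop changing."""
        V = {s: 0.0 for s in states}
        while True:
            delta = 0.0
            for s in states:
                best = max(R[s][a] + gamma * sum(p * V[t] for p, t in P[s][a])
                           for a in actions)
                delta = max(delta, abs(best - V[s]))
                V[s] = best
            if delta < eps:
                return V

    # Toy MDP: treating a sick patient costs less long-term than waiting.
    states = ["healthy", "sick"]
    actions = ["wait", "treat"]
    P = {"healthy": {"wait": [(0.9, "healthy"), (0.1, "sick")],
                     "treat": [(1.0, "healthy")]},
         "sick":    {"wait": [(0.2, "healthy"), (0.8, "sick")],
                     "treat": [(0.8, "healthy"), (0.2, "sick")]}}
    R = {"healthy": {"wait": 1.0, "treat": 0.5},
         "sick":    {"wait": -1.0, "treat": -0.5}}
    V = value_iteration(states, actions, P, R)
    ```

    The paper's framework layers dynamic decision networks and belief states for partial observability on top of this kind of planning core.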

  7. Modeling of steam distillation mechanism during steam injection process using artificial intelligence.

    Science.gov (United States)

    Daryasafar, Amin; Ahadi, Arash; Kharrat, Riyaz

    2014-01-01

    Steam distillation, as one of the important mechanisms, has a great role in oil recovery in thermal methods, so it is important to simulate this process experimentally and theoretically. In this work, the simulation of steam distillation is performed on sixteen sets of crude oil data found in the literature. Artificial intelligence (AI) tools such as the artificial neural network (ANN) and the adaptive neuro-fuzzy inference system (ANFIS) are used in this study as effective methods to simulate the distillate recoveries of these data sets. Thirteen sets of data were used to train the models and three sets were used to test them. The developed models are highly compatible with respect to input oil properties and can predict the distillate yield with minimum input. To show the performance of the proposed models, the simulation of steam distillation is also done using the modified Peng-Robinson equation of state. Comparison between the distillates calculated by the ANFIS and neural network models and by the equation-of-state-based method indicates that the errors of the ANFIS model for the training and test data sets are lower than those of the other methods.

  8. Digital image analysis in breast pathology-from image processing techniques to artificial intelligence.

    Science.gov (United States)

    Robertson, Stephanie; Azizpour, Hossein; Smith, Kevin; Hartman, Johan

    2018-04-01

    Breast cancer is the most common malignant disease in women worldwide. In recent decades, earlier diagnosis and better adjuvant therapy have substantially improved patient outcome. Diagnosis by histopathology has proven to be instrumental to guide breast cancer treatment, but new challenges have emerged as our increasing understanding of cancer over the years has revealed its complex nature. As patient demand for personalized breast cancer therapy grows, we face an urgent need for more precise biomarker assessment and more accurate histopathologic breast cancer diagnosis to make better therapy decisions. The digitization of pathology data has opened the door to faster, more reproducible, and more precise diagnoses through computerized image analysis. Software to assist diagnostic breast pathology through image processing techniques have been around for years. But recent breakthroughs in artificial intelligence (AI) promise to fundamentally change the way we detect and treat breast cancer in the near future. Machine learning, a subfield of AI that applies statistical methods to learn from data, has seen an explosion of interest in recent years because of its ability to recognize patterns in data with less need for human instruction. One technique in particular, known as deep learning, has produced groundbreaking results in many important problems including image classification and speech recognition. In this review, we will cover the use of AI and deep learning in diagnostic breast pathology, and other recent developments in digital image analysis. Copyright © 2017 Elsevier Inc. All rights reserved.

  10. Prediction of Surface Roughness in End Milling Process Using Intelligent Systems: A Comparative Study

    Directory of Open Access Journals (Sweden)

    Abdel Badie Sharkawy

    2011-01-01

    A study is presented to model surface roughness in the end milling process. Three types of intelligent networks have been considered: (i) radial basis function neural networks (RBFNs), (ii) adaptive neuro-fuzzy inference systems (ANFISs), and (iii) genetically evolved fuzzy inference systems (G-FISs). The machining parameters, namely the spindle speed, feed rate, and depth of cut, have been used as inputs to model the workpiece surface roughness. The goal is to obtain the best prediction accuracy. The procedure is illustrated using experimental data from end milling 6061 aluminum alloy. The three networks have been trained using experimental training data. After training, they have been examined using another set of data, that is, validation data. Results are compared with previously published results. It is concluded that ANFIS networks may suffer from the local minima problem, and genetic tuning of fuzzy networks cannot ensure optimality unless suitable parameter settings (population size, number of generations, etc.) and tuning ranges for the FIS parameters are used, which can hardly be satisfied. It is shown that the RBFN model has the best performance (prediction accuracy) in this particular case.
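
    The RBFN idea can be sketched in pure Python as an exact-interpolation radial basis network, with one Gaussian basis per training point and weights found by solving a small linear system; the "spindle speed vs. roughness" numbers are invented for the sketch:

    ```python
    import math

    def rbf_fit(xs, ys, width=1.0):
        """Exact-interpolation RBF network: one Gaussian basis per training
        point; weights solved by Gaussian elimination with partial pivoting."""
        n = len(xs)
        phi = lambda a, b: math.exp(-((a - b) ** 2) / (2 * width ** 2))
        # augmented system: A w = y, A[i][j] = phi(x_i, x_j)
        A = [[phi(xs[i], xs[j]) for j in range(n)] + [ys[i]] for i in range(n)]
        for col in range(n):
            piv = max(range(col, n), key=lambda r: abs(A[r][col]))
            A[col], A[piv] = A[piv], A[col]
            for r in range(col + 1, n):
                f = A[r][col] / A[col][col]
                for c in range(col, n + 1):
                    A[r][c] -= f * A[col][c]
        w = [0.0] * n
        for r in range(n - 1, -1, -1):
            w[r] = (A[r][n] - sum(A[r][c] * w[c] for c in range(r + 1, n))) / A[r][r]
        return lambda x: sum(w[j] * phi(x, xs[j]) for j in range(n))

    # toy "spindle speed -> surface roughness" data (illustrative only)
    speeds = [1.0, 2.0, 3.0, 4.0]
    rough = [3.2, 2.1, 1.5, 1.9]
    model = rbf_fit(speeds, rough)
    ```

    Real RBFN modeling would use fewer centers than data points plus regularization; exact interpolation is shown only because it keeps the sketch short.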

  11. [Support of the nursing process through electronic nursing documentation systems (UEPD) – Initial validation of an instrument].

    Science.gov (United States)

    Hediger, Hannele; Müller-Staub, Maria; Petry, Heidi

    2016-01-01

    Electronic nursing documentation systems with standardized nursing terminology are IT-based systems for recording the nursing process. These systems have the potential to improve the documentation of the nursing process and to support nurses in care delivery. This article describes the development and initial validation of an instrument (known by its German acronym UEPD) to measure the subjectively perceived benefits of an electronic nursing documentation system in care delivery. The validity of the UEPD was examined by means of an evaluation study carried out in an acute care hospital (n = 94 nurses) in German-speaking Switzerland. Construct validity was analyzed by principal components analysis. Initial evidence of the validity of the UEPD could be verified. The analysis showed a stable four-factor model (FS = 0.89) scoring in 25 items. All factors loaded ≥ 0.50 and the scales demonstrated high internal consistency (Cronbach's α = 0.73 - 0.90). Principal component analysis revealed four dimensions of support: establishing nursing diagnoses and goals; recording a case history/an assessment and documenting the nursing process; implementation and evaluation; and information exchange. Further testing with larger samples and with different electronic documentation systems is needed. Another potential direction would be to employ the UEPD in a comparison of various electronic documentation systems.
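
    The internal-consistency statistic reported above (Cronbach's α) is easy to compute directly; the three-item, five-respondent score matrix below is made up for the illustration:

    ```python
    def cronbach_alpha(items):
        """items: list of per-item score lists (same respondents in each).
        alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
        k = len(items)
        n = len(items[0])
        def var(xs):
            m = sum(xs) / len(xs)
            return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
        totals = [sum(item[i] for item in items) for i in range(n)]
        return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

    # three Likert-style items answered by five respondents (toy data)
    scores = [[4, 5, 3, 4, 5],
              [4, 4, 3, 5, 5],
              [5, 5, 2, 4, 4]]
    alpha = cronbach_alpha(scores)  # ≈ 0.81, comparable to the range reported
    ```

    Values in the 0.7-0.9 range, as reported for the UEPD scales, are conventionally read as acceptable to good internal consistency.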

  12. Driver's various information process and multi-ruled decision-making mechanism: a fundamental of intelligent driving shaping model

    Directory of Open Access Journals (Sweden)

    Wuhong Wang

    2011-05-01

    The most difficult but important problem in the development of advanced driver assistance systems is how to measure and model the behavioral response of drivers, focusing on the cognition process. This paper describes drivers' deceleration and acceleration behavior based on driving situation awareness in the car-following process, and then presents several driving models for the analysis of drivers' safe approaching behavior in traffic operation. The emphasis of our work is placed on research into drivers' processing of various information and their multi-ruled decision-making mechanism, considering the complicated control process of driving; the results provide a theoretical basis for an intelligent driving shaping model.

  13. Artificial intelligence versus statistical modeling and optimization of continuous bead milling process for bacterial cell lysis

    Directory of Open Access Journals (Sweden)

    Shafiul Haque

    2016-11-01

    For a commercially viable recombinant intracellular protein production process, efficient cell lysis and protein release are a major bottleneck. The recovery of the recombinant protein cholesterol oxidase (COD) was studied in a continuous bead milling process. A full factorial Response Surface Model (RSM) design was employed and compared to Artificial Neural Networks coupled with a Genetic Algorithm (ANN-GA). Significant process variables, cell slurry feed rate (A), bead load (B), cell load (C) and run time (D), were investigated and optimized for maximizing COD recovery. RSM predicted an optimum feed rate of 310.73 mL/h, bead loading of 79.9% (v/v), cell loading OD600 nm of 74, and run time of 29.9 min, with a recovery of ~3.2 g/L. ANN coupled with GA predicted a maximum COD recovery of ~3.5 g/L at an optimum feed rate of 258.08 mL/h, bead loading of 80% (v/v), cell loading (OD600 nm) of 73.99, and run time of 32 min. An overall 3.7-fold increase in productivity is obtained when compared to a batch process. Optimization and comparison of statistical vs. artificial intelligence techniques in a continuous bead milling process has been attempted for the very first time in our study. We were able to successfully represent the complex non-linear multivariable dependence of enzyme recovery on bead milling parameters. Quadratic second-order response functions are not flexible enough to represent such complex non-linear dependence. An ANN, being a summation function of multiple layers, is capable of representing the complex non-linear dependence of variables, in this case enzyme recovery as a function of bead milling parameters. Since a GA can optimize even discontinuous functions, the present study cites a perfect example of using machine learning (ANN) in combination with evolutionary optimization (GA) for representing undefined biological functions, which is the case for common industrial processes involving biological moieties.
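
    The GA optimization step can be sketched with a minimal real-coded genetic algorithm maximizing a single-variable response surface; the quadratic "recovery vs. feed rate" function and all parameters below are invented stand-ins, not the paper's fitted model:

    ```python
    import random

    def ga_maximize(f, lo, hi, pop_size=30, gens=60, mut=0.2):
        """Minimal real-coded GA sketch: tournament selection,
        arithmetic crossover, Gaussian mutation, clamped to [lo, hi]."""
        random.seed(1)
        pop = [random.uniform(lo, hi) for _ in range(pop_size)]
        for _ in range(gens):
            nxt = []
            for _ in range(pop_size):
                a, b = random.sample(pop, 2)
                parent1 = a if f(a) > f(b) else b           # tournament
                a, b = random.sample(pop, 2)
                parent2 = a if f(a) > f(b) else b
                w = random.random()
                child = w * parent1 + (1 - w) * parent2     # crossover
                if random.random() < mut:                   # mutation
                    child += random.gauss(0, (hi - lo) * 0.05)
                nxt.append(min(max(child, lo), hi))
            pop = nxt
        return max(pop, key=f)

    # hypothetical response surface: recovery (g/L) vs feed rate (mL/h)
    recovery = lambda feed: -((feed - 300.0) ** 2) / 5000.0 + 3.2
    best_feed = ga_maximize(recovery, 100.0, 500.0)
    ```

    In the paper the objective is not a known formula but a trained ANN, which is exactly why a derivative-free optimizer such as a GA is paired with it.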

  14. Artificial intelligence for the modeling and control of combustion processes: a review

    Energy Technology Data Exchange (ETDEWEB)

    Kalogirou, S.A. [Higher Technical Inst., Nicosia, Cyprus (Greece). Dept. of Mechanical Engineering

    2003-07-01

    Artificial intelligence (AI) systems are widely accepted as a technology offering an alternative way to tackle complex and ill-defined problems. They can learn from examples, are fault tolerant in the sense that they are able to handle noisy and incomplete data, are able to deal with non-linear problems, and once trained can perform prediction and generalization at high speed. They have been used in diverse applications in control, robotics, pattern recognition, forecasting, medicine, power systems, manufacturing, optimization, signal processing, and social/psychological sciences. They are particularly useful in system modeling such as in implementing complex mappings and system identification. AI systems comprise areas like expert systems, artificial neural networks, genetic algorithms, fuzzy logic and various hybrid systems, which combine two or more techniques. The major objective of this paper is to illustrate how AI techniques might play an important role in modeling and prediction of the performance and control of combustion processes. The paper outlines an understanding of how AI systems operate by way of presenting a number of problems in the different disciplines of combustion engineering. The various applications of AI are presented in a thematic rather than a chronological or any other order. Problems presented include two main areas: combustion systems and internal combustion (IC) engines. Combustion systems include boilers, furnaces and incinerators modeling and emissions prediction, whereas IC engines include diesel and spark ignition engines and gas engines modeling and control. Results presented in this paper are testimony to the potential of AI as a design tool in many areas of combustion engineering. (author)

  16. Joint Intelligence Analysis Complex: DOD Partially Used Best Practices for Analyzing Alternatives and Should Do So Fully for Future Military Construction Decisions

    Science.gov (United States)

    2016-09-01

EUCOM’s Intelligence Analytic Center, a data processing center, a warehouse, and various supporting facilities. This documentation also explains that...use, and that the shortage degrades the reliability of theater and national communications and intelligence assets. In budget justification documents...1623 of the bill would, if enacted, limit DOD’s fiscal year 2017 obligation or expenditure of funding for intelligence manpower positions for JIAC

  17. MongoDB and Python Patterns and processes for the popular document-oriented database

    CERN Document Server

    O'Higgins, Niall

    2011-01-01

    Learn how to leverage MongoDB with your Python applications, using the hands-on recipes in this book. You get complete code samples for tasks such as making fast geo queries for location-based apps, efficiently indexing your user documents for social-graph lookups, and many other scenarios. This guide explains the basics of the document-oriented database and shows you how to set up a Python environment with it. Learn how to read and write to MongoDB, apply idiomatic MongoDB and Python patterns, and use the database with several popular Python web frameworks. You'll discover how to model your

  18. Computer aided diagnosis based on medical image processing and artificial intelligence methods

    Science.gov (United States)

    Stoitsis, John; Valavanis, Ioannis; Mougiakakou, Stavroula G.; Golemati, Spyretta; Nikita, Alexandra; Nikita, Konstantina S.

    2006-12-01

Advances in imaging technology and computer science have greatly enhanced interpretation of medical images, and contributed to early diagnosis. The typical architecture of a Computer Aided Diagnosis (CAD) system includes image pre-processing, definition of region(s) of interest, feature extraction and selection, and classification. In this paper, the principles of CAD system design and development are demonstrated by means of two examples. The first one focuses on the differentiation between symptomatic and asymptomatic carotid atheromatous plaques. For each plaque, a vector of texture and motion features was estimated, which was then reduced to the most robust ones by means of ANalysis of VAriance (ANOVA). Using fuzzy c-means, the features were then clustered into two classes. Clustering performances of 74%, 79%, and 84% were achieved for texture only, motion only, and combinations of texture and motion features, respectively. The second CAD system presented in this paper supports the diagnosis of focal liver lesions and is able to characterize liver tissue from Computed Tomography (CT) images as normal, hepatic cyst, hemangioma, and hepatocellular carcinoma. Five texture feature sets were extracted for each lesion, while a genetic algorithm based feature selection method was applied to identify the most robust features. The selected feature set was fed into an ensemble of neural network classifiers. The achieved classification performance was 100%, 93.75% and 90.63% in the training, validation and testing set, respectively. It is concluded that computerized analysis of medical images in combination with artificial intelligence can be used in clinical practice and may contribute to more efficient diagnosis.
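The fuzzy c-means clustering step mentioned above can be sketched as follows. This is a generic textbook implementation run on synthetic 2-D points; the study's actual texture/motion features, data, and software are not reproduced here.

```python
import numpy as np

# Minimal fuzzy c-means sketch (generic algorithm, not the paper's code).
def fuzzy_cmeans(X, c=2, m=2.0, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)            # memberships sum to 1
    for _ in range(iters):
        Um = U ** m
        # Weighted cluster centers from fuzzified memberships.
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        # Standard membership update: u_ik = 1 / sum_j (d_ik/d_jk)^(2/(m-1)).
        p = 2.0 / (m - 1.0)
        U = 1.0 / (d ** p * (d ** -p).sum(axis=1, keepdims=True))
    return U, centers

# Two well-separated synthetic "feature vector" clouds (invented data).
rng = np.random.default_rng(1)
X = np.vstack([rng.normal([0.0, 0.0], 0.3, (50, 2)),
               rng.normal([3.0, 3.0], 0.3, (50, 2))])
U, centers = fuzzy_cmeans(X)
labels = U.argmax(axis=1)                        # hard assignment for inspection
```

Each row of `U` is a soft membership vector over the two classes; taking the arg-max yields the two-class split the abstract evaluates.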

  19. Computer aided diagnosis based on medical image processing and artificial intelligence methods

    International Nuclear Information System (INIS)

    Stoitsis, John; Valavanis, Ioannis; Mougiakakou, Stavroula G.; Golemati, Spyretta; Nikita, Alexandra; Nikita, Konstantina S.

    2006-01-01

Advances in imaging technology and computer science have greatly enhanced interpretation of medical images, and contributed to early diagnosis. The typical architecture of a Computer Aided Diagnosis (CAD) system includes image pre-processing, definition of region(s) of interest, feature extraction and selection, and classification. In this paper, the principles of CAD system design and development are demonstrated by means of two examples. The first one focuses on the differentiation between symptomatic and asymptomatic carotid atheromatous plaques. For each plaque, a vector of texture and motion features was estimated, which was then reduced to the most robust ones by means of ANalysis of VAriance (ANOVA). Using fuzzy c-means, the features were then clustered into two classes. Clustering performances of 74%, 79%, and 84% were achieved for texture only, motion only, and combinations of texture and motion features, respectively. The second CAD system presented in this paper supports the diagnosis of focal liver lesions and is able to characterize liver tissue from Computed Tomography (CT) images as normal, hepatic cyst, hemangioma, and hepatocellular carcinoma. Five texture feature sets were extracted for each lesion, while a genetic algorithm based feature selection method was applied to identify the most robust features. The selected feature set was fed into an ensemble of neural network classifiers. The achieved classification performance was 100%, 93.75% and 90.63% in the training, validation and testing set, respectively. It is concluded that computerized analysis of medical images in combination with artificial intelligence can be used in clinical practice and may contribute to more efficient diagnosis.

  20. Computer aided diagnosis based on medical image processing and artificial intelligence methods

    Energy Technology Data Exchange (ETDEWEB)

    Stoitsis, John [National Technical University of Athens, School of Electrical and Computer Engineering, Athens 157 71 (Greece)]. E-mail: stoitsis@biosim.ntua.gr; Valavanis, Ioannis [National Technical University of Athens, School of Electrical and Computer Engineering, Athens 157 71 (Greece); Mougiakakou, Stavroula G. [National Technical University of Athens, School of Electrical and Computer Engineering, Athens 157 71 (Greece); Golemati, Spyretta [National Technical University of Athens, School of Electrical and Computer Engineering, Athens 157 71 (Greece); Nikita, Alexandra [University of Athens, Medical School 152 28 Athens (Greece); Nikita, Konstantina S. [National Technical University of Athens, School of Electrical and Computer Engineering, Athens 157 71 (Greece)

    2006-12-20

Advances in imaging technology and computer science have greatly enhanced interpretation of medical images, and contributed to early diagnosis. The typical architecture of a Computer Aided Diagnosis (CAD) system includes image pre-processing, definition of region(s) of interest, feature extraction and selection, and classification. In this paper, the principles of CAD system design and development are demonstrated by means of two examples. The first one focuses on the differentiation between symptomatic and asymptomatic carotid atheromatous plaques. For each plaque, a vector of texture and motion features was estimated, which was then reduced to the most robust ones by means of ANalysis of VAriance (ANOVA). Using fuzzy c-means, the features were then clustered into two classes. Clustering performances of 74%, 79%, and 84% were achieved for texture only, motion only, and combinations of texture and motion features, respectively. The second CAD system presented in this paper supports the diagnosis of focal liver lesions and is able to characterize liver tissue from Computed Tomography (CT) images as normal, hepatic cyst, hemangioma, and hepatocellular carcinoma. Five texture feature sets were extracted for each lesion, while a genetic algorithm based feature selection method was applied to identify the most robust features. The selected feature set was fed into an ensemble of neural network classifiers. The achieved classification performance was 100%, 93.75% and 90.63% in the training, validation and testing set, respectively. It is concluded that computerized analysis of medical images in combination with artificial intelligence can be used in clinical practice and may contribute to more efficient diagnosis.

  1. Documentation of 50% water conservation in a single process at a beef abattoir

    Science.gov (United States)

    Beef slaughter is water intensive due to stringent food safety requirements. We conducted a study at a commercial beef processor to demonstrate water conservation by modifying the mechanical head wash. We documented the initial nozzle configuration (112 nozzles), water pressure (275 kPa), and flowra...

  2. Documentation of 50% water conservation in a single process at a beef abattoir. Meat Science

    Science.gov (United States)

    Beef slaughter is water intensive due to stringent food safety requirements. We conducted a study at a commercial beef processor to demonstrate water conservation by modifying the mechanical head wash. We documented the initial nozzle configuration (112 nozzles), water pressure (275 kPa), and flowra...

  3. Documentation control process of Brazilian multipurpose reactor: conceptual design and basic design

    International Nuclear Information System (INIS)

    Kibrit, Eduardo; Prates, Jose Eduardo; Longo, Guilherme Carneiro; Salvetti, Tereza Cristina

    2015-01-01

Established in the scope of the Plan of Action of the Ministry of Science, Technology and Innovation (PACTI/MCTI) in 2007, the construction of the Brazilian Multipurpose Reactor (RMB) is under way. This type of reactor has a broad spectrum of applications in the nuclear field and related technologies, such as the production of radioisotopes used as supplies for radiopharmaceuticals, with great benefit to Brazilian society, which is therefore the main goal of the project. The RMB Project consists of the following stages: site selection and site evaluation; design (conceptual design, basic design, detailed design and experimental design); construction (procurement, manufacturing, civil construction, electromechanical construction and assembling); commissioning; operation and decommissioning. Each stage requires adaptation of human resources for the execution of the stage schedule. The implementation of a project of this magnitude requires complex project management, which covers not only technical but also administrative areas. Licensing, financial resources, quality and document control systems, and engineering are some of the areas involved in project success. The development of the conceptual and basic designs involved the participation of three main engineering companies. INTERTECHNE Consultores S.A. was in charge of the conceptual and basic designs for conventional systems of buildings and infrastructure. INVAP S.E. was responsible for preparing the basic design of the reactor core and annexes. MRS Estudos Ambientais Ltda. prepared documents for environmental licensing. This paper describes the procedures used during the conceptual and basic design stages to control design documentation and its flow, involving the analysis and incorporation of comments from experts and the control and storage of approximately 15,000 documents. (author)

  4. Documentation control process of Brazilian multipurpose reactor: conceptual design and basic design

    Energy Technology Data Exchange (ETDEWEB)

    Kibrit, Eduardo; Prates, Jose Eduardo; Longo, Guilherme Carneiro; Salvetti, Tereza Cristina, E-mail: ekibrit@ipen.br, E-mail: jeprates@ipen.br, E-mail: glongo@ipen.br, E-mail: salvetti@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2015-07-01

Established in the scope of the Plan of Action of the Ministry of Science, Technology and Innovation (PACTI/MCTI) in 2007, the construction of the Brazilian Multipurpose Reactor (RMB) is under way. This type of reactor has a broad spectrum of applications in the nuclear field and related technologies, such as the production of radioisotopes used as supplies for radiopharmaceuticals, with great benefit to Brazilian society, which is therefore the main goal of the project. The RMB Project consists of the following stages: site selection and site evaluation; design (conceptual design, basic design, detailed design and experimental design); construction (procurement, manufacturing, civil construction, electromechanical construction and assembling); commissioning; operation and decommissioning. Each stage requires adaptation of human resources for the execution of the stage schedule. The implementation of a project of this magnitude requires complex project management, which covers not only technical but also administrative areas. Licensing, financial resources, quality and document control systems, and engineering are some of the areas involved in project success. The development of the conceptual and basic designs involved the participation of three main engineering companies. INTERTECHNE Consultores S.A. was in charge of the conceptual and basic designs for conventional systems of buildings and infrastructure. INVAP S.E. was responsible for preparing the basic design of the reactor core and annexes. MRS Estudos Ambientais Ltda. prepared documents for environmental licensing. This paper describes the procedures used during the conceptual and basic design stages to control design documentation and its flow, involving the analysis and incorporation of comments from experts and the control and storage of approximately 15,000 documents. (author)

  5. Intelligent systems

    CERN Document Server

    Irwin, J David

    2011-01-01

Technology has now progressed to the point that intelligent systems are replacing humans in decision-making processes as well as aiding in the solution of very complex problems. In many cases intelligent systems are already outperforming human activities. Artificial neural networks are not only capable of learning how to classify patterns, such as images or sequences of events, but they can also effectively model complex nonlinear systems. Their ability to classify sequences of events is probably more popular in industrial applications where there is an inherent need to model nonlinear system

  6. Advanced intelligent systems

    CERN Document Server

    Ryoo, Young; Jang, Moon-soo; Bae, Young-Chul

    2014-01-01

    Intelligent systems have been initiated with the attempt to imitate the human brain. People wish to let machines perform intelligent works. Many techniques of intelligent systems are based on artificial intelligence. According to changing and novel requirements, the advanced intelligent systems cover a wide spectrum: big data processing, intelligent control, advanced robotics, artificial intelligence and machine learning. This book focuses on coordinating intelligent systems with highly integrated and foundationally functional components. The book consists of 19 contributions that features social network-based recommender systems, application of fuzzy enforcement, energy visualization, ultrasonic muscular thickness measurement, regional analysis and predictive modeling, analysis of 3D polygon data, blood pressure estimation system, fuzzy human model, fuzzy ultrasonic imaging method, ultrasonic mobile smart technology, pseudo-normal image synthesis, subspace classifier, mobile object tracking, standing-up moti...

  7. Extensible Markup Language: How Might It Alter the Software Documentation Process and the Role of the Technical Communicator?

    Science.gov (United States)

    Battalio, John T.

    2002-01-01

    Describes the influence that Extensible Markup Language (XML) will have on the software documentation process and subsequently on the curricula of advanced undergraduate and master's programs in technical communication. Recommends how curricula of advanced undergraduate and master's programs in technical communication ought to change in order to…

  8. Advective transport observations with MODPATH-OBS--documentation of the MODPATH observation process

    Science.gov (United States)

    Hanson, R.T.; Kauffman, L.K.; Hill, M.C.; Dickinson, J.E.; Mehl, S.W.

    2013-01-01

The MODPATH-OBS computer program described in this report is designed to calculate simulated equivalents for observations related to advective groundwater transport that can be represented in a quantitative way by using simulated particle-tracking data. The simulated equivalents supported by MODPATH-OBS are (1) distance from a source location at a defined time, or proximity to an observed location; (2) time of travel from an initial location to defined locations, areas, or volumes of the simulated system; (3) concentrations used to simulate groundwater age; and (4) percentages of water derived from contributing source areas. Although particle tracking only simulates the advective component of conservative transport, effects of non-conservative processes such as retardation can be approximated through manipulation of the effective-porosity value used to calculate velocity, based on the properties of selected conservative tracers. This program can also account for simple decay or production, but it cannot account for diffusion. Dispersion can be represented through direct simulation of subsurface heterogeneity and the use of many particles. MODPATH-OBS acts as a postprocessor to MODPATH, so that the sequence of model runs generally required is MODFLOW, MODPATH, and MODPATH-OBS. The versions of MODFLOW and MODPATH that support the version of MODPATH-OBS presented in this report are MODFLOW-2005 or MODFLOW-LGR, and MODPATH-LGR. MODFLOW-LGR is derived from MODFLOW-2005, MODPATH 5, and MODPATH 6 and supports local grid refinement. MODPATH-LGR is derived from MODPATH 5. It supports the forward and backward tracking of particles through locally refined grids and provides the output needed for MODPATH-OBS. For a single grid and no observations, MODPATH-LGR results are equivalent to MODPATH 5. MODPATH-LGR and MODPATH-OBS simulations can use nearly all of the capabilities of MODFLOW-2005 and MODFLOW-LGR; for example, simulations may be steady-state, transient, or a combination
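The advective idea behind such particle tracking, integrating positions through a velocity field and recording travel time to an observation location, can be sketched with a simple forward-Euler integrator. The velocity field and numbers below are hypothetical and do not reproduce MODPATH's semi-analytical scheme.

```python
import numpy as np

# Sketch of advective particle tracking (not MODPATH itself): step a particle
# through a velocity field until it crosses an observation plane at x = x_obs.
def track(p0, velocity, x_obs, dt=0.1, max_steps=10_000):
    """Forward-Euler advection; returns (travel_time, final_position)."""
    p = np.array(p0, dtype=float)
    t = 0.0
    for _ in range(max_steps):
        if p[0] >= x_obs:          # particle has reached the observation plane
            return t, p
        p = p + dt * velocity(p)   # advective step: dp/dt = v(p)
        t += dt
    raise RuntimeError("particle did not reach the observation plane")

# Hypothetical uniform flow: vx = 2 m/d, vy = 0, so 100 m takes about 50 days.
uniform = lambda p: np.array([2.0, 0.0])
t, p = track([0.0, 0.0], uniform, x_obs=100.0)
print(f"simulated travel time: {t:.1f} d")
```

For uniform flow the numerical travel time matches the analytic distance/velocity result, which is the kind of "time of travel" simulated equivalent the program computes.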

  9. The Influence of Cochlear Mechanical Dysfunction, Temporal Processing Deficits, and Age on the Intelligibility of Audible Speech in Noise for Hearing-Impaired Listeners

    Directory of Open Access Journals (Sweden)

    Peter T. Johannesen

    2016-05-01

The aim of this study was to assess the relative importance of cochlear mechanical dysfunction, temporal processing deficits, and age for the ability of hearing-impaired listeners to understand speech in noisy backgrounds. Sixty-eight listeners took part in the study. They were provided with linear, frequency-specific amplification to compensate for their audiometric losses, and intelligibility was assessed for speech-shaped noise (SSN) and a time-reversed two-talker masker (R2TM). Behavioral estimates of cochlear gain loss and residual compression were available from a previous study and were used as indicators of cochlear mechanical dysfunction. Temporal processing abilities were assessed using frequency modulation detection thresholds. Age, audiometric thresholds, and the difference between audiometric threshold and cochlear gain loss were also included in the analyses. Stepwise multiple linear regression models were used to assess the relative importance of the various factors for intelligibility. Results showed that (a) cochlear gain loss was unrelated to intelligibility, (b) residual cochlear compression was related to intelligibility in SSN but not in an R2TM, (c) temporal processing was strongly related to intelligibility in an R2TM and much less so in SSN, and (d) age per se impaired intelligibility. In summary, all factors affected intelligibility, but their relative importance varied across maskers.
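Forward stepwise regression of the kind named in the abstract can be sketched as below. The listeners, measures, and effect sizes are synthetic, constructed so that the temporal-processing measure and age drive the score while cochlear gain loss, by design, does not.

```python
import numpy as np

# Hedged sketch of forward stepwise linear regression; all data are invented.
rng = np.random.default_rng(0)
n = 120
gain_loss = rng.normal(30.0, 10.0, n)       # dB; unrelated by construction
fm_threshold = rng.normal(2.0, 0.5, n)      # temporal-processing measure
age = rng.normal(65.0, 8.0, n)              # years
# Synthetic intelligibility score driven by FM threshold and age only.
intellig = 80.0 - 8.0 * fm_threshold - 0.4 * age + rng.normal(0.0, 1.0, n)

def rss(cols, y):
    """Residual sum of squares of a least-squares fit with an intercept."""
    X = np.column_stack([np.ones(len(y))] + cols)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ beta
    return float(r @ r)

variables = {"gain_loss": gain_loss, "fm_threshold": fm_threshold, "age": age}
order, remaining = [], set(variables)
while remaining:                             # greedily add the best predictor
    best = min(remaining,
               key=lambda v: rss([variables[s] for s in order] + [variables[v]],
                                 intellig))
    order.append(best)
    remaining.remove(best)
print("selection order:", order)
```

The entry order ranks predictors by how much residual variance each removes, which mirrors how stepwise models apportion "relative importance" among factors.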

  10. Standard CMMI Appraisal Method for Process Improvement (SCAMPI) A, Version 1.3: Method Definition Document

    Science.gov (United States)

    2011-03-01

    Assurance Plans • Training Plans • Measurement Plans • Estimating records • Release planning • Workflow planning • Kanban boards • Service...Work flow planning • Kanban boards • Acquisition Strategy Documents • Supplier Evaluation Criteria • Requests for Proposal • Specific...Preliminary Design Reviews, deliveries) • QA Audit records/reports • Measurement reports/repository • Kanban board • Continuous/Cumulative Flow

  11. Artificial intelligence in process control: Knowledge base for the shuttle ECS model

    Science.gov (United States)

    Stiffler, A. Kent

    1989-01-01

    The general operation of KATE, an artificial intelligence controller, is outlined. A shuttle environmental control system (ECS) demonstration system for KATE is explained. The knowledge base model for this system is derived. An experimental test procedure is given to verify parameters in the model.

  12. Joint Intelligence Operations Center (JIOC) Baseline Business Process Model & Capabilities Evaluation Methodology

    Science.gov (United States)

    2012-03-01

Targeting Review Board OPLAN Operations Plan OPORD Operations Order OPSIT Operational Situation OSINT Open Source Intelligence OV...Analysis Evaluate FLTREPs MISREPs Unit Assign Assets Feedback Asset Shortfalls Multi-Int Collection Political & Embassy Law Enforcement HUMINT OSINT ...Embassy Information OSINT Manage Theater HUMINT Law Enforcement Collection Sort Requests Platform Information Agency Information M-I Collect

  13. Overview of the research and development on knowledge information processing and intelligent robots

    Energy Technology Data Exchange (ETDEWEB)

    Tanaka, K

    1982-04-01

    To implement intelligent computers, the problem of formalization of human intellectual activity must be considered. Insight into formalized intellectual activity can be gained by examination of its four abilities: (1) problem-solving; (2) learning, recognition and understanding; (3) language analysis and understanding; and (4) intellectual interaction. These are the topics discussed in the paper. 68 references.

  14. The Efficiency of Requesting Process for Formal Business-Documents in Indonesia: An Implementation of Web Application Base on Secure and Encrypted Sharing Process

    Directory of Open Access Journals (Sweden)

    Aris Budi Setyawan

    2014-12-01

In recent business practice, formal business documents, such as business licenses, business domicile letters, halal certificates, and other formal documents, are constantly needed, and obtaining them poses problems for businesses, especially for small and medium enterprises. The one-stop service unit that was conceived and implemented by the government has not yet been fully integrated. Not all permits (related to formal business documents) can be completed and finished in one place; businesses still have to move from one government department to another to obtain a formal document for their business. Under these practices, not only are a lot of time and cost sacrificed, but businesses must also repeatedly fill out forms with the same fields. This study aims to assess and identify the problems in applying for formal business documents and to use them as input for the development of a web application based on a secure and encrypted sharing process. The study starts with a survey of 200 businesses that have applied for formal documents, to map the initial conditions of applying for formal business documents in Indonesia. With an application built on these needs, it is expected not only that the time, cost, and physical effort of both parties will become more efficient, but also that the negative practices of bureaucratic and economic obstacles in business activities can be minimized, so that the competitiveness of businesses and their contribution to the national economy will increase. Keywords: formal documents, efficiency, web application, secure and encrypted sharing process, SMEs
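One building block of a secure sharing process, an integrity-protected sharing token, can be sketched with the standard library. All names, fields, and the token format here are hypothetical (the paper's actual design is not reproduced), and a real deployment would rely on vetted mechanisms such as TLS and authenticated encryption rather than this minimal HMAC scheme.

```python
import hashlib
import hmac
import json
import secrets
import time

# Hypothetical sketch: the server signs a sharing token so that a tampered
# document reference is detected on verification.
SERVER_KEY = secrets.token_bytes(32)

def issue_token(document_id: str, business_id: str) -> str:
    """Serialize the share request and append an HMAC-SHA256 signature."""
    payload = json.dumps({"doc": document_id, "biz": business_id,
                          "iat": int(time.time())}, sort_keys=True)
    sig = hmac.new(SERVER_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return payload + "." + sig

def verify_token(token: str) -> bool:
    """Recompute the signature and compare in constant time."""
    payload, _, sig = token.rpartition(".")
    expected = hmac.new(SERVER_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected)

tok = issue_token("business-license-001", "sme-42")
print("token valid:", verify_token(tok))
```

Flipping any character of the token makes verification fail, which is the integrity property such a sharing process needs between government departments and businesses.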

  15. Performing the processing required for automatically get a PDF/A version of the CERN Library documentation

    CERN Document Server

    Molina Garcia-Retamero, Antonio

    2015-01-01

The aim of the project was to perform the processing required to automatically generate a PDF/A version of the CERN Library documentation. For this, it is necessary to extract as much metadata as possible from the source files and inject the required data into them, creating new files ready to be compiled with all related dependencies. In addition, I proposed the creation of an HTML version consistent with the PDF and navigable for easy access; I experimented with Natural Language Processing for extracting metadata; and I proposed the injection of the CERN Library documentation into the HTML version of the long write-ups where it is referenced (for instance, when a CERN Library function is referenced in a sample code). Finally, I designed and implemented a Graphical User Interface in order to simplify the process for the user.

  16. Machine listening intelligence

    Science.gov (United States)

    Cella, C. E.

    2017-05-01

    This manifesto paper will introduce machine listening intelligence, an integrated research framework for acoustic and musical signals modelling, based on signal processing, deep learning and computational musicology.

  17. Intelligent Optics Laboratory

    Data.gov (United States)

Federal Laboratory Consortium — The Intelligent Optics Laboratory supports sophisticated investigations on adaptive and nonlinear optics; advanced imaging and image processing; ground-to-ground and...

  18. Vision of the Arc for Quality Documentation and for Closed Loop Control of the Welding Process

    DEFF Research Database (Denmark)

    Kristiansen, Morten; Kristiansen, Ewa; Jensen, Casper Houmann

    2014-01-01

For gas metal arc welding, a vision system was developed that robustly monitors the position of the arc. The monitoring documents the welding quality indirectly, and a closed-loop fuzzy control was implemented to maintain even excess penetration. For welding experiments on a butt-joint with a V-groove with varying root gap, the system demonstrated increased welding quality compared to the system without control. The system was implemented with a low-cost vision system, which makes it interesting to apply in industrial welding automation systems.
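A minimal fuzzy controller of the general kind described might look like the sketch below: triangular memberships over the arc-position error and a weighted average of rule outputs. The membership functions, rule outputs, and units are invented, since the paper's actual rules are not reproduced here.

```python
# Hedged sketch of a tiny fuzzy controller; all numbers are hypothetical.
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def correction(error_mm):
    """Map arc-position error (mm) to a torch offset via three fuzzy rules."""
    neg = tri(error_mm, -2.0, -1.0, 0.0)   # error is Negative
    zero = tri(error_mm, -1.0, 0.0, 1.0)   # error is Zero
    pos = tri(error_mm, 0.0, 1.0, 2.0)     # error is Positive
    # Singleton rule outputs (mm): Negative -> -0.5, Zero -> 0.0, Positive -> 0.5.
    w = neg + zero + pos
    if w == 0.0:                           # outside the defined universe
        return 0.0
    return (neg * -0.5 + zero * 0.0 + pos * 0.5) / w  # centroid defuzzification

print(correction(-0.6), correction(0.0), correction(0.6))
```

The output opposes the error and vanishes when the arc sits on track, the qualitative behavior a closed-loop correction needs.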

  19. 32 CFR 806.27 - Samples of Air Force FOIA processing documents.

    Science.gov (United States)

    2010-07-01

    ... ADMINISTRATION AIR FORCE FREEDOM OF INFORMATION ACT PROGRAM § 806.27 Samples of Air Force FOIA processing... determination within 20 workdays, we have instituted multitrack processing of requests. Based on the information... source; responsive records were part of the Air Force's decision-making process, and the prerelease...

  20. Reading the Music and Understanding the Therapeutic Process: Documentation, Analysis and Interpretation of Improvisational Music Therapy

    Directory of Open Access Journals (Sweden)

    Deborah Parker

    2011-01-01

This article is concerned primarily with the challenges of presenting clinical material from improvisational music therapy. My aim is to propose a model for the transcription of music therapy material, or “musicotherapeutic objects” (comparable to Bion’s “psychoanalytic objects”), which preserves the integrated “gestalt” of the musical experience as far as possible, whilst also supporting detailed analysis and interpretation. Unwilling to resort to the use of visual documentation, but aware that many important indicators in music therapy are non-sounding, I propose a richly annotated score, where traditional music notation is integrated with graphic and verbal additions in order to document non-sounding events. This model is illustrated within the context of a clinical case with a high-functioning autistic woman. The four transcriptions, together with the original audio tracks, present significant moments during the course of music therapy, attesting to the development of the dyadic relationship, with reference to John Bowlby’s concept of a “secure base” as the most appropriate dynamic environment for therapy.

  1. Nursing documentation: experience of the use of the nursing process model in selected hospitals in Ibadan, Oyo State, Nigeria.

    Science.gov (United States)

    Ofi, Bola; Sowunmi, Olanrewaju

    2012-08-01

    The descriptive study was conducted to determine the extent of utilization of the nursing process for documentation of nursing care in three selected hospitals, Ibadan, Nigeria. One hundred fifty nurses and 115 discharged clients' records were selected from the hospitals. Questionnaires and checklists were used to collect data. Utilization of nursing process for care was 100%, 73.6% and 34.8% in the three hospitals. Nurses encountered difficulties in history taking, formulation of nursing diagnoses, objectives, nursing orders and evaluation. Most nurses disagreed or were undecided with the use of authorized abbreviations and symbols (34.3%, 40.3% and 69.5%), recording errors that occurred during care (37.1%, 56.1% and 52.2%) and inclusion of change in clients' condition (54.3%, 56.1% and 73.8%). Most nurses appreciated the significance of documentation. Lack of time, knowledge and need for extensive writing are the major barriers against documentation. Seventy-seven point four per cent of the 115 clients' records from one hospital showed evidence of documentation, no evidence from the other two. Study findings have implications for continuing professional education, practice and supervision. © 2012 Blackwell Publishing Asia Pty Ltd.

  2. Artificial Intelligence Framework for Simulating Clinical Decision-Making: A Markov Decision Process Approach

    OpenAIRE

    Bennett, Casey C.; Hauser, Kris

    2013-01-01

    In the modern healthcare system, rapidly expanding costs/complexity, the growing myriad of treatment options, and exploding information streams that often do not effectively reach the front lines hinder the ability to choose optimal treatment decisions over time. The goal in this paper is to develop a general purpose (non-disease-specific) computational/artificial intelligence (AI) framework to address these challenges. This serves two potential functions: 1) a simulation environment for expl...

  3. Writing cases as a knowledge capture process in a competitive intelligence program

    OpenAIRE

    Mallowan , Monica; Marcon , Christian

    2009-01-01

Students in Competitive Intelligence (CI) programs submit a report following their internship in an organisation. It is proposed that the results of their experiences be shared with their peers, in the form of cases written for in-class analysis. A knowledge base is thus created, which gradually becomes the program's memory and, through its constant renewal and connection with reality, the most useful teaching tool for the professor.

  4. Electronic repository and standardization of processes and electronic documents in transport

    Directory of Open Access Journals (Sweden)

    Tomasz DĘBICKI

    2007-01-01

The article presents the idea of using an electronic repository to store standardised schemes of the processes between a logistics service provider and its business partners. It describes the application of the repository for automatic or semi-automatic configuration of interoperability in electronic data interchange between the information systems of different companies, based on transport-related (road, rail, sea and combined) processes. Standardisation covers processes, schemes of cooperation and the electronic messages related to them.

  5. Browsing and Querying in Online Documentation:A Study of User Interfaces and the Interaction Process

    DEFF Research Database (Denmark)

    Hertzum, Morten; Frøkjær, Erik

    1996-01-01

    A user interface study concerning the usage effectiveness of selected retrieval modes was conducted using an experimental text retrieval system, TeSS, giving access to online documentation of certain programming tools. Four modes of TeSS were compared: (1) browsing, (2) conventional boolean retrieval, (3) Venn diagram retrieval, and (4) a combined mode. In the experiment the use of printed manuals is faster and provides answers of higher quality than any of the electronic modes. Therefore, claims about the effectiveness of computer-based text retrieval have to be wary in situations where printed manuals are manageable to the users. Among the modes of TeSS, browsing is the fastest and the one causing fewest operational errors. On the same two variables, time and operational errors, the Venn diagram mode performs better than conventional boolean retrieval. The combined mode scores worst on the objective performance measures; nonetheless nearly all subjects...

  6. The documental management of the Educational Secretary as guarantee of legality in the control of the university processes

    Directory of Open Access Journals (Sweden)

    Neyda Armenteros Arencibia

    2015-06-01

    Full Text Available The pedagogic professional formation aspires to obtain superior results in documental management in order to guarantee the required confidence in the norms and procedures. This article deals with the theoretical bases that should be taken into account for such an aspiration, emphasizing the characteristics of this process in the faculty of Infantile Education of the UCP of Pinar del Río.

  7. Designing a framework of intelligent information processing for dentistry administration data.

    Science.gov (United States)

    Amiri, N; Matthews, D C; Gao, Q

    2005-07-01

    This study was designed to test a cumulative view of current data in the clinical database at the Faculty of Dentistry, Dalhousie University. We planned to examine associations among demographic factors and treatments. Three tables were selected from the database of the faculty: patient, treatment and procedures. All fields and record numbers in each table were documented. Data was explored using SQL server and Visual Basic and then cleaned by removing incongruent fields. After transformation, a data warehouse was created. This was imported to SQL analysis services manager to create an OLAP (Online Analytical Processing) cube. The multidimensional model used for access to data was created using a star schema. Treatment count was the measurement variable. Five dimensions--date, postal code, gender, age group and treatment categories--were used to detect associations. Another data warehouse of 8 tables (international tooth code # 1-8) was created and imported to SAS enterprise miner to complete data mining. Association nodes were used for each table to find sequential associations and minimum criteria were set to 2% of cases. Findings of this study confirmed most assumptions of treatment planning procedures. There were some small unexpected patterns of clinical interest. Further developments are recommended to create predictive models. Recent improvements in information technology offer numerous advantages for conversion of raw data from faculty databases to information and subsequently to knowledge. This knowledge can be used by decision makers, managers, and researchers to answer clinical questions, affect policy change and determine future research needs.
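
    The association mining step the abstract describes (associations subject to a 2%-of-cases minimum) can be sketched, in simplified form, as pairwise co-occurrence counting with a minimum-support filter. The records below are invented sample data, not the Dalhousie clinical database.

```python
# Pairwise co-occurrence mining with a minimum-support filter (invented data).
from itertools import combinations
from collections import Counter

records = [  # each set: treatments recorded for one patient visit (invented)
    {"filling", "cleaning"},
    {"filling", "cleaning", "xray"},
    {"cleaning", "xray"},
    {"filling", "xray"},
    {"cleaning"},
]

def frequent_pairs(records, min_support=0.02):
    """Return treatment pairs whose support meets the threshold."""
    counts = Counter()
    for r in records:
        for pair in combinations(sorted(r), 2):
            counts[pair] += 1
    n = len(records)
    return {p: c / n for p, c in counts.items() if c / n >= min_support}

pairs = frequent_pairs(records, min_support=0.4)
print(pairs)  # three pairs, each co-occurring in 2 of 5 records
```

    A production miner such as SAS Enterprise Miner adds sequence ordering and confidence measures on top of this basic support count.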

  8. Standard CMMI℠ Appraisal Method for Process Improvement (SCAMPI℠), Version 1.1: Method Definition Document

    National Research Council Canada - National Science Library

    2001-01-01

    The Standard CMMI Appraisal Method for Process Improvement (SCAMPI℠) is designed to provide benchmark quality ratings relative to Capability Maturity Model® Integration (CMMI℠) models...

  9. System design document for the INFLO prototype.

    Science.gov (United States)

    2014-03-01

    This report documents the high-level System Design Document (SDD) for the prototype development and demonstration of the Intelligent Network Flow Optimization (INFLO) application bundle, with a focus on the Speed Harmonization (SPD-HARM) and Queu...

  10. Program Management at the National Nuclear Security Administration Office of Defense Nuclear Security: A Review of Program Management Documents and Underlying Processes

    International Nuclear Information System (INIS)

    Madden, Michael S.

    2010-01-01

    The scope of this paper is to review the National Nuclear Security Administration Office of Defense Nuclear Security (DNS) program management documents and to examine the underlying processes. The purpose is to identify recommendations for improvement and to influence the rewrite of the DNS Program Management Plan (PMP) and the documentation supporting it. As a part of this process, over 40 documents required by DNS or its stakeholders were reviewed. In addition, approximately 12 other documents produced outside of DNS and its stakeholders were reviewed in an effort to identify best practices. The complete list of documents reviewed is provided as an attachment to this paper.

  11. Documenting the invisible – on the ‘how’ of process research

    DEFF Research Database (Denmark)

    Pallesen, Eva Holdflod

    2017-01-01

    Currently, there is a growing field in organization studies, reflecting a stream in social science more broadly, which seeks to encompass a process philosophical view of the world as multiple and in constant becoming. However, this raises new questions and challenges to the field of methodology... For some scholars, this has been an occasion for deeming the discipline of methodology 'dead' or 'emptied'. In contrast to such claims, this article argues that the scholar doing empirical research from approaches drawing on process philosophy must, to no less an extent than other scholars, deal with problems... to think from. The article suggests that process philosophy may open up a methodological thinking that has room for a more connotative, playful way of relating to research material, which does not demand from a method that it overcome the gap between what is there and what is captured, but makes use of this gap...

  12. Crowd-Sourced Intelligence Agency: Prototyping counterveillance

    Directory of Open Access Journals (Sweden)

    Jennifer Gradecki

    2017-02-01

    Full Text Available This paper discusses how an interactive artwork, the Crowd-Sourced Intelligence Agency (CSIA, can contribute to discussions of Big Data intelligence analytics. The CSIA is a publicly accessible Open Source Intelligence (OSINT system that was constructed using information gathered from technical manuals, research reports, academic papers, leaked documents, and Freedom of Information Act files. Using a visceral heuristic, the CSIA demonstrates how the statistical correlations made by automated classification systems are different from human judgment and can produce false-positives, as well as how the display of information through an interface can affect the judgment of an intelligence agent. The public has the right to ask questions about how a computer program determines if they are a threat to national security and to question the practicality of using statistical pattern recognition algorithms in place of human judgment. Currently, the public’s lack of access to both Big Data and the actual datasets intelligence agencies use to train their classification algorithms keeps the possibility of performing effective sous-dataveillance out of reach. Without this data, the results returned by the CSIA will not be identical to those of intelligence agencies. Because we have replicated how OSINT is processed, however, our results will resemble the type of results and mistakes made by OSINT systems. The CSIA takes some initial steps toward contributing to an informed public debate about large-scale monitoring of open source, social media data and provides a prototype for counterveillance and sousveillance tools for citizens.

  13. Examples of transcultural processes in two colonial linguistic documents on Jebero (Peru)

    NARCIS (Netherlands)

    Alexander-Bakkerus, A.

    2015-01-01

    In this paper we bring to light the "transcultural processes" and "the impacts of colonial thinking" as contained in The British Library manuscripts Add. 25,323 and 25,324. The manuscripts deal with Jebero, an indigenous language of North-Peru, as it was spoken in the 18th century. (The language,

  14. Intelligence and treaty ratification

    International Nuclear Information System (INIS)

    Cahn, A.H.

    1990-01-01

    This paper reports that there are two sets of questions applicable to the ratification phase: what is the role of intelligence in the ratification process? What effect did intelligence have on that process. The author attempts to answer these and other questions

  15. Energy analysis handbook. CAC document 214. [Combining process analysis with input-output analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bullard, C. W.; Penner, P. S.; Pilati, D. A.

    1976-10-01

    Methods are presented for calculating the energy required, directly and indirectly, to produce all types of goods and services. Procedures for combining process analysis with input-output analysis are described. This enables the analyst to focus data acquisition cost-effectively, and to achieve a specified degree of accuracy in the results. The report presents sample calculations and provides the tables and charts needed to perform most energy cost calculations, including the cost of systems for producing or conserving energy.

  16. The Management of Law Firms Using Business Process Management, Document Management and Web Services Integration

    OpenAIRE

    Roxana Maria Petculet

    2012-01-01

    The aim of this paper is to present the technical solution implemented in the present context for the management of law firms. The informational system consists of the automation of business processes using a BPM engine and electronic archiving using a DMS. The communication between the two modules is made by invoking web services. The whole system integrates modules like: project management, contract management, invoice management, collection, CRM, reporting.

  17. Intelligible Artificial Intelligence

    OpenAIRE

    Weld, Daniel S.; Bansal, Gagan

    2018-01-01

    Since Artificial Intelligence (AI) software uses techniques like deep lookahead search and stochastic optimization of huge neural networks to fit mammoth datasets, it often results in complex behavior that is difficult for people to understand. Yet organizations are deploying AI algorithms in many mission-critical settings. In order to trust their behavior, we must make it intelligible --- either by using inherently interpretable models or by developing methods for explaining otherwise overwh...

  18. Document reconstruction by layout analysis of snippets

    Science.gov (United States)

    Kleber, Florian; Diem, Markus; Sablatnig, Robert

    2010-02-01

    Document analysis is performed to analyze entire forms (e.g. intelligent form analysis, table detection) or to describe the layout/structure of a document. Skew detection of scanned documents is also performed to support OCR algorithms that are sensitive to skew. In this paper, document analysis is applied to snippets of torn documents to calculate features for their reconstruction. Documents can be destroyed either intentionally, to make the printed content unavailable (e.g. tax fraud investigation, business crime), or through time-induced degeneration of ancient documents (e.g. bad storage conditions). Current reconstruction methods for manually torn documents deal with shape, inpainting, and texture synthesis techniques. In this paper, the possibility of using document analysis techniques on snippets to support the matching algorithm with additional features is shown. This implies a rotational analysis, a color analysis, and a line detection. As future work, it is planned to extend the feature set with the paper type (blank, checked, lined), the type of writing (handwritten vs. machine printed), and the text layout of a snippet (text size, line spacing). Preliminary results show that these pre-processing steps can be performed reliably on a real dataset consisting of 690 snippets.
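
    One of the snippet features mentioned above, color analysis, can be illustrated with a hedged sketch: comparing coarse grayscale histograms of two snippets as a cue for whether torn pieces belong together. The pixel values, bin count, and intersection measure are illustrative assumptions, not the authors' actual feature set.

```python
# Coarse histogram comparison between two snippets (invented pixel data).
from collections import Counter

def histogram(pixels, bins=4):
    """Quantize 8-bit grayscale pixels into `bins` buckets, normalized."""
    counts = Counter(min(p * bins // 256, bins - 1) for p in pixels)
    n = len(pixels)
    return [counts.get(b, 0) / n for b in range(bins)]

def intersection(h1, h2):
    """Histogram intersection: 1.0 means identical distributions."""
    return sum(min(a, b) for a, b in zip(h1, h2))

light_snippet = [240, 250, 245, 230, 235, 248]  # mostly white paper
dark_snippet = [20, 30, 25, 240, 15, 10]        # mostly dark print
sim_same = intersection(histogram(light_snippet), histogram(light_snippet))
sim_diff = intersection(histogram(light_snippet), histogram(dark_snippet))
print(sim_same, sim_diff)  # identical snippets score 1.0, dissimilar ones less
```

    In a real matcher this score would be only one feature, combined with shape, rotation, and detected line structure.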

  19. Analysis of geodetic and legal documentation in the process of expropriation for roads. Krakow case study

    Science.gov (United States)

    Trembecka, Anna

    2013-06-01

    Amendment to the Act on special rules of preparation and implementation of investment in public roads resulted in an accelerated mode of acquisition of land for the development of roads. The decision to authorize the execution of road investment issued on its basis has several effects, i.e. determines the location of a road, approves surveying division, approves construction design and also results in acquisition of a real property by virtue of law by the State Treasury or local government unit, among others. The conducted study revealed that over 3 years, in this mode, the city of Krakow has acquired 31 hectares of land intended for the implementation of road investments. Compensation is determined in separate proceedings based on an appraisal study estimating property value, often at a distant time after the loss of land by the owner. One reason for the lengthy compensation proceedings is challenging the proposed amount of compensation, unregulated legal status of the property as well as imprecise legislation. It is important to properly develop geodetic and legal documentation which accompanies the application for issuance of the decision and is also used in compensation proceedings.

  20. Cobit-Framework and process control engineering. Application of Cobit-Process documentation ME3; Cobit-Framework und die Netzleittechnik. Anwendung der Cobit-Prozessbeschreibung ME3

    Energy Technology Data Exchange (ETDEWEB)

    Bosin, Erwin [Tiwag-Netz AG, Thaur (Austria). Prozessrechner und USV

    2009-11-16

    Cobit (Control Objectives for Information and related Technology) is an internationally recognised framework which offers two approaches to facilitating the operativeness of the power system management. One is aimed at the selection of the relevant processes needed for meeting business requirements and the other at identifying possible improvements to control and steering mechanisms in the selected processes. This is documented by the selected Cobit ME3 process ("Ensure Compliance With External Requirements"). The high security level of the power system management must be maintained in this context.

  1. Principles of artificial intelligence

    CERN Document Server

    Nilsson, Nils J

    1980-01-01

    A classic introduction to artificial intelligence intended to bridge the gap between theory and practice, Principles of Artificial Intelligence describes fundamental AI ideas that underlie applications such as natural language processing, automatic programming, robotics, machine vision, automatic theorem proving, and intelligent data retrieval. Rather than focusing on the subject matter of the applications, the book is organized around general computational concepts involving the kinds of data structures used, the types of operations performed on the data structures, and the properties of th

  2. Hardware-Software Complexes for Processing and Creating Essentially Unformatted Documents on the Basis of Auto-Adaptive Font Technology

    Directory of Open Access Journals (Sweden)

    E. G. Andrianova

    2014-01-01

    Full Text Available The need to translate paper documents into electronic form has driven the development of methods and algorithms for systems that automatically process and web-publish unformatted graphic documents from on-line libraries. Translating scanned images into modern electronic document formats using OCR programs faces serious difficulties, connected with the standardization of font sets and the design of printed documents, as well as the need to preserve the original appearance of such documents in electronic form. The article discusses the possibility of building an extensible adaptive dictionary of the graphic objects that constitute unformatted graphic documents. The dictionary is adjusted automatically as graphics are processed and statistical information accumulates for each new document. This adaptive, extensible dictionary of graphic letters, fonts, and other objects for automated document processing is called an "auto-adaptive font", and the set of methods for applying it is named "auto-adaptive font technology". Based on the theory of estimation algorithms, a mathematical model is designed. It allows all objects of an unformatted graphic document to be represented in a unified manner, a feature vector to be built for each object, and the similarity of these objects to be evaluated in a selected metric. An algorithm for adaptive modelling of graphic images is developed, and a criterion for combining similar objects into one element of the auto-adaptive font is offered, forming the software core of a hardware-software complex for processing unformatted graphic documents. A standard block diagram of such a complex is developed. The article describes all the blocks of this complex, including the document processing station and its interaction with the web server that publishes the electronic documents.
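
    The dictionary-building step described above can be sketched as follows: each glyph is reduced to a feature vector, and a new dictionary entry is added only when no existing entry is sufficiently similar. The cosine metric, the threshold, and the three-component vectors are illustrative assumptions, not the paper's estimation-algorithm model.

```python
# Growing an adaptive glyph dictionary by feature-vector similarity.
import math

def cosine(u, v):
    """Cosine similarity between two feature vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def add_glyph(dictionary, vec, threshold=0.98):
    """Return the index of a matching entry, adding vec if none matches."""
    for i, entry in enumerate(dictionary):
        if cosine(entry, vec) >= threshold:
            return i
    dictionary.append(vec)
    return len(dictionary) - 1

dictionary = []
print(add_glyph(dictionary, [1.0, 0.0, 0.5]))    # new entry
print(add_glyph(dictionary, [0.99, 0.01, 0.5]))  # matches the first entry
print(add_glyph(dictionary, [0.0, 1.0, 0.2]))    # new entry
```

    As more documents are processed, the dictionary stabilizes: most glyphs match existing entries, which is the "auto-adaptive" behavior the abstract describes.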

  3. Preliminary Feasibility Assessment of Integrating CCHP with NW Food Processing Plant #1: Modeling Documentation

    Energy Technology Data Exchange (ETDEWEB)

    Hoffman, Michael G.; Srivastava, Viraj; Wagner, Anne W.; Makhmalbaf, Atefe; Thornton, John

    2014-01-01

    The Pacific Northwest National Laboratory (PNNL) has launched a project funded by the Bonneville Power Administration (BPA) to identify strategies for increasing industrial energy efficiency and reducing energy costs of Northwest Food Processors Association (NWFPA) plants through deployment of novel combinations and designs of variable-output combined heat and power (CHP) distributed generation (DG), combined cooling, heating and electric power (CCHP) DG and energy storage systems. Detailed evaluations and recommendations of CHP and CCHP DG systems will be performed for several Northwest (NW) food processing sites. The objective is to reduce the overall energy use intensity of NW food processors by 25% by 2020 and by 50% by 2030, as well as reducing emissions and understanding potential congestion reduction impacts on the transmission system in the Pacific Northwest.

  4. Reflexive photography: an alternative method for documenting the learning process of cultural competence.

    Science.gov (United States)

    Amerson, Roxanne; Livingston, Wade G

    2014-04-01

    This qualitative descriptive study used reflexive photography to evaluate the learning process of cultural competence during an international service-learning project in Guatemala. Reflexive photography is an innovative qualitative research technique that examines participants' interactions with their environment through their personal reflections on images that they captured during their experience. A purposive sample of 10 baccalaureate nursing students traveled to Guatemala, where they conducted family and community assessments, engaged in home visits, and provided health education. Data collection involved over 100 photographs and a personal interview with each student. The themes developed from the photographs and interviews provided insight into the activities of an international experience that influence the cognitive, practical, and affective learning of cultural competence. Making home visits and teaching others from a different culture increased students' transcultural self-efficacy. Reflexive photography is a more robust method of self-reflection, especially for visual learners.

  5. Comment and response document for the UMTRA Project vitro processing site completion report Salt Lake City, Utah. Revision 1

    International Nuclear Information System (INIS)

    1995-03-01

    This Comment and Response Document is a series of UMTRA document review forms regarding the UMTRA Project Vitro Processing Site Completion Report for Salt Lake City, Utah, dated March 1995. The completion report provides evidence that the final Salt Lake City, Utah, processing site property conditions are in accordance with the approved design and that all U.S. Environmental Protection Agency (EPA) standards have been satisfied. Included as appendices to support the stated conclusions are the record drawings; a summary of grid test results; contract specifications and construction drawings, the EPA standards (40 CFR Part 192); the audit, inspection, and surveillance summary; the permit information; and project photographs. The principal objective of the remedial action at Salt Lake City is to remove the tailings from the processing site, render the site free of contamination to EPA standards, and restore the site to the final design grade elevations. Each section is evaluated in detail to check all aspects of the above report, especially the inclusion of adequate verification data. Each review form contains a section entitled State of Utah Response and Action, which is an explanation or correction of DOE criticisms of the report.

  6. Understanding the Globalization of Intelligence

    DEFF Research Database (Denmark)

    Svendsen, Adam David Morgan

    "This book provides an introduction to the complexities of contemporary Western Intelligence and its dynamics during an era of globalization. Towards an understanding of the globalization of intelligence process, Svendsen focuses on the secretive phenomenon of international or foreign intelligence...... cooperation ('liaison'), as it occurs in both theory and practice. Reflecting a complex coexistence plurality of several different and overlapping concepts in action, the challenging process of the globalization of intelligence emerges as essential for complex issue management purposes during a globalized era...

  7. Understanding Genetic Breast Cancer Risk: Processing Loci of the BRCA Gist Intelligent Tutoring System.

    Science.gov (United States)

    Wolfe, Christopher R; Reyna, Valerie F; Widmer, Colin L; Cedillos-Whynott, Elizabeth M; Brust-Renck, Priscila G; Weil, Audrey M; Hu, Xiangen

    2016-07-01

    The BRCA Gist Intelligent Tutoring System helps women understand and make decisions about genetic testing for breast cancer risk. BRCA Gist is guided by Fuzzy-Trace Theory (FTT) and built using AutoTutor Lite. It responds differently to participants depending on what they say. Seven tutorial dialogues requiring explanation and argumentation are guided by three FTT concepts: forming gist explanations in one's own words, emphasizing decision-relevant information, and deliberating the consequences of decision alternatives. Participants were randomly assigned to BRCA Gist, a control, or impoverished BRCA Gist conditions removing gist explanation dialogues, argumentation dialogues, or FTT images. All BRCA Gist conditions performed significantly better than controls on knowledge, comprehension, and risk assessment. Significant differences in knowledge, comprehension, and fine-grained dialogue analyses demonstrate the efficacy of gist explanation dialogues. FTT images significantly increased knowledge. Providing more elements in arguments against testing correlated with increased knowledge and comprehension.

  8. Pipeline defect prediction using long range ultrasonic testing and intelligent processing

    International Nuclear Information System (INIS)

    Dino Isa; Rajprasad Rajkumar

    2009-01-01

    This paper deals with efforts to improve nondestructive testing (NDT) techniques by using artificial intelligence to detect and predict pipeline defects such as cracks and wall thinning. The main emphasis here will be on the prediction of corrosion-type defects rather than just detection after the fact. Long range ultrasonic testing will be employed, where a ring of piezoelectric transducers is used to generate torsional guided waves. Various defects such as cracks as well as corrosion under insulation (CUI) will be simulated on a test pipe. The machine learning algorithm known as the Support Vector Machine (SVM) will be used to predict and classify transducer signals using regression and large-margin classification. Regression results show that the SVM is able to accurately predict future defects based on trends of previous defects. The classification performance was also exceptional, showing an ability to detect defects at different depths as well as to distinguish closely spaced defects. (author)
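
    The paper applies support vector regression; as a dependency-free stand-in, the sketch below fits a least-squares trend line to simulated wall-thickness readings and extrapolates it, illustrating the idea of predicting future defect growth from past inspections. All readings are invented.

```python
# Trend extrapolation as a simplified stand-in for SV regression.

def fit_line(xs, ys):
    """Ordinary least-squares fit: returns (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Inspection month vs. remaining wall thickness (mm), thinning over time.
months = [0, 6, 12, 18, 24]
thickness = [10.0, 9.6, 9.1, 8.7, 8.2]

slope, intercept = fit_line(months, thickness)
forecast = slope * 36 + intercept  # extrapolate to month 36
print(round(forecast, 2))          # projected remaining thickness in mm
```

    An SVM regressor would replace the straight line with a kernelized fit and a margin of tolerance, but the prediction workflow (fit past inspections, extrapolate forward) is the same.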

  9. Prototype interface facility for intelligent handling and processing of medical image and data

    Science.gov (United States)

    Lymberopoulos, Dimitris C.; Garantziotis, Giannis; Spiropoulos, Kostas V.; Kotsopoulos, Stavros A.; Goutis, Costas E.

    1993-06-01

    This paper introduces an interface facility (IF) developed within the overall framework of a RACE research project. Because the project focuses on remote medical expert consultation, the distances involved, the varied backgrounds of users, and their unfamiliarity with newly introduced methods of medical diagnosis can give rise to considerable deficiencies. The aim was to intelligently assist the user/physician by providing an ergonomic environment that keeps operational and functional deficiencies at the lowest possible levels. The IF energizes and activates system- and application-level commands and procedures, along with the necessary exemplified and instructional help facilities, in order to allow the user to interact with the system safely and easily at all levels.

  10. Supporting the personnel reliability decision-making process with artificial intelligence

    International Nuclear Information System (INIS)

    Harte, D.C.

    1991-01-01

    Recent legislation concerning personnel security has vastly increased the responsibility and accountability of the security manager. Access authorization, fitness for duty, and personnel security access programs require decisions regarding an individual's trustworthiness and reliability based on the findings of a background investigation. While these guidelines provide significant data and are useful as a tool, limited resources are available to help the adjudicator of derogatory information decide what is and is not acceptable in terms of granting access to sensitive areas of nuclear plants. The reason why one individual is deemed unacceptable and the next acceptable may be questioned and cause discriminatory accusations. This paper is a continuation of the discussion on workforce reliability, focusing on the use of artificial intelligence to support the decisions of a security manager. With this support, the benefit of previous decisions helps ensure consistent adjudication of background investigations.

  11. Intelligence analysis – the royal discipline of Competitive Intelligence

    Directory of Open Access Journals (Sweden)

    František Bartes

    2011-01-01

    Full Text Available The aim of this article is to propose work methodology for Competitive Intelligence teams in one of the intelligence cycle’s specific areas, the so-called “Intelligence Analysis”. Intelligence Analysis is one of the stages of the Intelligence Cycle in which data from both the primary and secondary research are analyzed. The main result of the effort is the creation of added value for the information collected. Company Competitive Intelligence, correctly understood and implemented in business practice, is the “forecasting of the future”. That is, forecasting about the future, which forms the basis for strategic decisions made by the company’s top management. To implement that requirement in corporate practice, the author perceives Competitive Intelligence as a systemic application discipline. This approach allows him to propose a “Work Plan” for Competitive Intelligence as a fundamental standardized document to steer Competitive Intelligence team activities. The author divides the Competitive Intelligence team work plan into five basic parts. Those parts are derived from the five-stage model of the intelligence cycle, which, in the author’s opinion, is more appropriate for complicated cases of Competitive Intelligence.

  12. Intelligent Monitoring System with High Temperature Distributed Fiberoptic Sensor for Power Plant Combustion Processes

    Energy Technology Data Exchange (ETDEWEB)

    Kwang Y. Lee; Stuart S. Yin; Andre Boehman

    2006-09-26

    The objective of the proposed work is to develop an intelligent distributed fiber optical sensor system for real-time monitoring of high temperature in a boiler furnace in power plants. Of particular interest is the estimation of spatial and temporal distributions of high temperatures within a boiler furnace, which will be essential in assessing and controlling the mechanisms that form and remove pollutants at the source, such as NOx. The basic approach in developing the proposed sensor system is threefold: (1) development of a high temperature distributed fiber optical sensor capable of measuring temperatures greater than 2000 degrees Celsius with spatial resolution of less than 1 cm; (2) development of distributed parameter system (DPS) models to map the three-dimensional (3D) temperature distribution for the furnace; and (3) development of an intelligent monitoring system for real-time monitoring of the 3D boiler temperature distribution. Under Task 1, we have set up a dedicated high power, ultrafast laser system for fabricating in-fiber gratings in harsh environment optical fibers, successfully fabricated gratings in single crystal sapphire fibers with the high power laser system, and developed highly sensitive long period gratings (LPG) by electric arc. Under Task 2, relevant mathematical modeling studies of NOx formation in practical combustors have been completed. Studies show that in boiler systems with no swirl, the distributed temperature sensor may provide information sufficient to predict trends of NOx at the boiler exit. Under Task 3, we have investigated a mathematical approach to extrapolation of the temperature distribution within a power plant boiler facility, using a combination of a modified neural network architecture and semigroup theory. Given a set of empirical data with no analytic expression, we first developed an analytic description and then extended that model along a single axis.

  13. Automating the generation of lexical patterns for processing free text in clinical documents.

    Science.gov (United States)

    Meng, Frank; Morioka, Craig

    2015-09-01

    Many tasks in natural language processing utilize lexical pattern-matching techniques, including information extraction (IE), negation identification, and syntactic parsing. However, it is generally difficult to derive patterns that achieve acceptable levels of recall while also remaining highly precise. We present a multiple sequence alignment (MSA)-based technique that automatically generates patterns, thereby leveraging language usage to determine the context of words that influence a given target. MSAs capture the commonalities among word sequences and are able to reveal areas of linguistic stability and variation. In this way, MSAs provide a systematic approach to generating lexical patterns that are generalizable, which will both increase recall levels and maintain high levels of precision. The MSA-generated patterns exhibited consistent F1, F0.5, and F2 scores compared to two baseline techniques for IE across four different tasks. Both baseline techniques performed well for some tasks and less well for others, but MSA was found to consistently perform at a high level for all four tasks. The performance of MSA on the four extraction tasks indicates the method's versatility. The results show that the MSA-based patterns are able to handle the extraction of individual data elements as well as relations between two concepts without the need for large amounts of manual intervention. We presented an MSA-based framework for generating lexical patterns that showed consistently high levels of both performance and recall over four different extraction tasks when compared to baseline methods.
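    The core idea of alignment-derived lexical patterns can be illustrated in miniature: align two token sequences, keep the shared context verbatim, and collapse the variable regions into a wildcard. The paper's actual method aligns many sequences at once; this pairwise version, with invented clinical phrases, only sketches the principle.

```python
# Pairwise sketch of alignment-based lexical pattern generation:
# stable (aligned) words become literal context, variable regions become '*'.
from difflib import SequenceMatcher

def pairwise_pattern(a: list[str], b: list[str]) -> list[str]:
    sm = SequenceMatcher(a=a, b=b)
    pattern = []
    for op, i1, i2, j1, j2 in sm.get_opcodes():
        if op == "equal":
            pattern.extend(a[i1:i2])          # linguistically stable context
        elif not pattern or pattern[-1] != "*":
            pattern.append("*")               # area of variation -> wildcard
    return pattern

s1 = "no evidence of acute fracture".split()
s2 = "no evidence of pleural effusion".split()
print(pairwise_pattern(s1, s2))  # → ['no', 'evidence', 'of', '*']
```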

  14. Biblio-MetReS for user-friendly mining of genes and biological processes in scientific documents.

    Science.gov (United States)

    Usie, Anabel; Karathia, Hiren; Teixidó, Ivan; Alves, Rui; Solsona, Francesc

    2014-01-01

    One way to initiate the reconstruction of molecular circuits is by using automated text-mining techniques. Developing more efficient methods for such reconstruction is a topic of active research, and those methods are typically included by bioinformaticians in pipelines used to mine and curate large literature datasets. Nevertheless, experimental biologists have a limited number of available user-friendly tools that use text-mining for network reconstruction and require no programming skills to use. One of these tools is Biblio-MetReS. Originally, this tool permitted an on-the-fly analysis of documents contained in a number of web-based literature databases to identify co-occurrence of proteins/genes. This approach ensured results that were always up-to-date with the latest live version of the databases. However, this 'up-to-dateness' came at the cost of large execution times. Here we report an evolution of the application Biblio-MetReS that permits constructing co-occurrence networks for genes, GO processes, pathways, or any combination of the three types of entities, and graphically representing those entities. We show that the performance of Biblio-MetReS in identifying gene co-occurrence is at least as good as that of other comparable applications (STRING and iHOP). In addition, we also show that the identification of GO processes is on a par with that reported in the latest BioCreAtIvE challenge. Finally, we also report the implementation of a new strategy that combines on-the-fly analysis of new documents with preprocessed information from documents that were encountered in previous analyses. This combination simultaneously decreases program run time and maintains 'up-to-dateness' of the results. http://metres.udl.cat/index.php/downloads, metres.cmb@gmail.com.
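    The co-occurrence counting at the heart of tools like this can be sketched in a few lines: count how often two gene names appear together in the same sentence across a document collection, and treat the counts as edge weights of a network. The gene list and sentences below are toy data, and the substring matching stands in for the real tool's entity recognition.

```python
# Toy sketch of gene co-occurrence network construction from text.
from collections import Counter
from itertools import combinations

def cooccurrence_network(sentences, genes):
    edges = Counter()
    for sent in sentences:
        found = sorted(g for g in genes if g in sent)   # naive entity matching
        for pair in combinations(found, 2):             # sorted -> canonical edge key
            edges[pair] += 1
    return edges

docs = [
    "TP53 interacts with MDM2 in the stress response",
    "MDM2 is a negative regulator of TP53",
    "BRCA1 participates in DNA repair",
]
net = cooccurrence_network(docs, {"TP53", "MDM2", "BRCA1"})
print(net[("MDM2", "TP53")])  # → 2
```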

  15. Biblio-MetReS for user-friendly mining of genes and biological processes in scientific documents

    Directory of Open Access Journals (Sweden)

    Anabel Usie

    2014-02-01

    Full Text Available One way to initiate the reconstruction of molecular circuits is by using automated text-mining techniques. Developing more efficient methods for such reconstruction is a topic of active research, and those methods are typically included by bioinformaticians in pipelines used to mine and curate large literature datasets. Nevertheless, experimental biologists have a limited number of available user-friendly tools that use text-mining for network reconstruction and require no programming skills to use. One of these tools is Biblio-MetReS. Originally, this tool permitted an on-the-fly analysis of documents contained in a number of web-based literature databases to identify co-occurrence of proteins/genes. This approach ensured results that were always up-to-date with the latest live version of the databases. However, this ‘up-to-dateness’ came at the cost of large execution times. Here we report an evolution of the application Biblio-MetReS that permits constructing co-occurrence networks for genes, GO processes, pathways, or any combination of the three types of entities, and graphically representing those entities. We show that the performance of Biblio-MetReS in identifying gene co-occurrence is at least as good as that of other comparable applications (STRING and iHOP). In addition, we also show that the identification of GO processes is on a par with that reported in the latest BioCreAtIvE challenge. Finally, we also report the implementation of a new strategy that combines on-the-fly analysis of new documents with preprocessed information from documents that were encountered in previous analyses. This combination simultaneously decreases program run time and maintains ‘up-to-dateness’ of the results. Availability: http://metres.udl.cat/index.php/downloads, Contact: metres.cmb@gmail.com.

  16. Intelligent Tutor

    Science.gov (United States)

    1990-01-01

    NASA also seeks to advance American education by employing the technology utilization process to develop a computerized, artificial intelligence-based Intelligent Tutoring System (ITS) to help high school and college physics students. The tutoring system is designed for use with the lecture and laboratory portions of a typical physics instructional program. Its importance lies in its ability to observe continually as a student develops problem solutions and to intervene when appropriate with assistance specifically directed at the student's difficulty and tailored to his skill level and learning style. ITS originated as a project of the Johnson Space Center (JSC). It is being developed by JSC's Software Technology Branch in cooperation with Dr. R. Bowen Loftin at the University of Houston-Downtown. The program is jointly sponsored by NASA and ACOT (Apple Classrooms of Tomorrow). Other organizations providing support include the Texas Higher Education Coordinating Board, the National Research Council, Pennzoil Products Company and the George R. Brown Foundation. The Physics I class of Clear Creek High School, League City, Texas is providing the classroom environment for test and evaluation of the system. The ITS is a spinoff of products developed earlier to integrate artificial intelligence into training/tutoring systems for NASA astronauts, flight controllers and engineers.

  17. Artificial intelligence

    CERN Document Server

    Hunt, Earl B

    1975-01-01

    Artificial Intelligence provides information pertinent to the fundamental aspects of artificial intelligence. This book presents the basic mathematical and computational approaches to problems in the artificial intelligence field.Organized into four parts encompassing 16 chapters, this book begins with an overview of the various fields of artificial intelligence. This text then attempts to connect artificial intelligence problems to some of the notions of computability and abstract computing devices. Other chapters consider the general notion of computability, with focus on the interaction bet

  18. Intelligent mechatronics

    Energy Technology Data Exchange (ETDEWEB)

    Hashimoto, H. [The University of Tokyo, Tokyo (Japan). Institute of Industrial Science

    1995-10-01

    Intelligent mechatronics (IM) was explained as follows: although the study of IM ultimately targets the realization of a robot, at the present stage the target is the creation of new value through the intellectualization of machines, that is, the combination of the information infrastructure with intelligent machine systems. IM is also considered to consist of the positive use of computers together with micromechatronics. The paper then introduces examples of IM research, mainly those the author is involved in: sensor gloves, robot hands, robot eyes, teleoperation, three-dimensional object recognition, mobile robots, magnetic bearings, construction of a remotely controlled unmanned dam, robot networks, sensitivity communication using Neuro Baby, etc. 27 figs.

  19. Modern Processing Capabilities of Analog Data from Documentation of the Great Omayyad Mosque in Aleppo, Syria, Damaged in Civil War

    Science.gov (United States)

    Pavelka, K.; Šedina, J.; Raeva, P.; Hůlková, M.

    2017-08-01

    In 1999, a large project for the documentation of the Great Omayyad mosque in Aleppo, Syria was conducted under UNESCO. At the end of the last century, analogue cameras such as the Zeiss UMK and the RolleiMetric system were still in use. Digital cameras and digital automatic data processing were just starting to be on the rise, and laser scanning was not yet relevant. In this situation, photogrammetric measurement used stereo technology for complicated situations and objects, and single-image technology for creating photoplans. Hundreds of photogrammetric images were taken. Because data processing was carried out on digital stereo plotters or workstations, it was necessary to convert all analogue photos to digital form using a photogrammetric scanner. The outputs were adequate for the end of the last century. Now, 19 years later, the photogrammetric materials still exist, but the technology and processing are completely different. Our original measurement is historical and nowadays quite obsolete, so it was decided to explore the possibilities of reprocessing the historical materials. Why? In the last few years there has been civil war in Syria, and the above-mentioned monument was severely damaged. The existing historical materials therefore provide a unique opportunity for possible future reconstruction. This paper describes the completion of the existing materials, their evaluation, and the possibilities of reprocessing them with today's technologies.

  20. MODERN PROCESSING CAPABILITIES OF ANALOG DATA FROM DOCUMENTATION OF THE GREAT OMAYYAD MOSQUE IN ALEPPO, SYRIA, DAMAGED IN CIVIL WAR

    Directory of Open Access Journals (Sweden)

    K. Pavelka

    2017-08-01

    Full Text Available In 1999, a large project for the documentation of the Great Omayyad mosque in Aleppo, Syria was conducted under UNESCO. At the end of the last century, analogue cameras such as the Zeiss UMK and the RolleiMetric system were still in use. Digital cameras and digital automatic data processing were just starting to be on the rise, and laser scanning was not yet relevant. In this situation, photogrammetric measurement used stereo technology for complicated situations and objects, and single-image technology for creating photoplans. Hundreds of photogrammetric images were taken. Because data processing was carried out on digital stereo plotters or workstations, it was necessary to convert all analogue photos to digital form using a photogrammetric scanner. The outputs were adequate for the end of the last century. Now, 19 years later, the photogrammetric materials still exist, but the technology and processing are completely different. Our original measurement is historical and nowadays quite obsolete, so it was decided to explore the possibilities of reprocessing the historical materials. Why? In the last few years there has been civil war in Syria, and the above-mentioned monument was severely damaged. The existing historical materials therefore provide a unique opportunity for possible future reconstruction. This paper describes the completion of the existing materials, their evaluation, and the possibilities of reprocessing them with today’s technologies.

  1. The National Air Intelligence Center Software Process Improvement Effort (NAIC SPI)

    National Research Council Canada - National Science Library

    Blankenship, Donald

    2001-01-01

    ...) Software Process Improvements effort. The objective of this effort was for the contractor to provide engineering and software process improvement for NAIC/SCD to reach SEI's CMM Level 2 in process maturity...

  2. Service-oriented architecture of adaptive, intelligent data acquisition and processing systems for long-pulse fusion experiments

    International Nuclear Information System (INIS)

    Gonzalez, J.; Ruiz, M.; Barrera, E.; Lopez, J.M.; Arcas, G. de; Vega, J.

    2010-01-01

    The data acquisition systems used in long-pulse fusion experiments need to implement data reduction and pattern recognition algorithms in real time. In order to accomplish these operations, it is essential to employ software tools that allow for hot swap capabilities throughout the temporal evolution of the experiments. This is very important because processing needs are not equal during different phases of the experiment. The intelligent test and measurement system (ITMS) developed by UPM and CIEMAT is an example of a technology for implementing scalable data acquisition and processing systems based on PXI and CompactPCI hardware. In the ITMS platform, a set of software tools allows the user to define the processing algorithms associated with the different experimental phases using state machines driven by software events. These state machines are specified using the State Chart XML (SCXML) language. The software tools are developed using JAVA, JINI, an SCXML engine and several LabVIEW applications. Within this schema, it is possible to execute data acquisition and processing applications in an adaptive way. The power of SCXML semantics and the ability to work with XML user-defined data types allow for very easy programming of the ITMS platform. With this approach, the ITMS platform is a suitable solution for implementing scalable data acquisition and processing systems based on a service-oriented model with the ability to easily implement remote participation applications.
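    The phase-dependent processing described above boils down to an event-driven state machine: each experimental phase is a state, and software events switch the active processing routine. The states and events below are invented examples for illustration, not the actual ITMS/SCXML configuration.

```python
# Sketch of an event-driven phase machine of the kind ITMS specifies in SCXML.
class PhaseMachine:
    def __init__(self):
        # (current state, event) -> next state
        self.transitions = {
            ("idle", "start_pulse"): "acquire",     # begin raw acquisition
            ("acquire", "plasma_ramp"): "reduce",   # hot-swap to data reduction
            ("reduce", "end_pulse"): "idle",        # pulse over, back to idle
        }
        self.state = "idle"

    def fire(self, event: str) -> str:
        # Unknown (state, event) pairs leave the state unchanged.
        self.state = self.transitions.get((self.state, event), self.state)
        return self.state

m = PhaseMachine()
for ev in ["start_pulse", "plasma_ramp", "end_pulse"]:
    print(m.fire(ev))  # → acquire, reduce, idle
```

In the real platform these transitions are declared in SCXML and drive which LabVIEW processing application runs during each phase; the hard-coded dictionary here stands in for that declarative specification.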

  3. Services oriented architecture for adaptive and intelligent data acquisition and processing systems in long pulse fusion experiments

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez, J.; Ruiz, M.; Barrera, E.; Lopez, J.M.; De Arcas, G. [Universidad Politecnica de Madrid (Spain); Vega, J. [Association EuratomCIEMAT para Fusion, Madrid (Spain)

    2009-07-01

    Data acquisition systems used in long-pulse fusion experiments need to implement data reduction and pattern recognition algorithms in real time. To accomplish these operations it is essential to have software tools that allow hot-swap capabilities throughout the temporal evolution of the experiments. This is very important because the processing needs are not equal in the different phases of the experiment. The intelligent test and measurement system (ITMS) developed by UPM and CIEMAT is an example of technology for implementing scalable data acquisition and processing systems based on PXI and CompactPCI hardware. In the ITMS platform a set of software tools allows the user to define the processing associated with the different experimental phases using state machines driven by software events. These state machines are specified using the State Chart XML (SCXML) language. The software tools are developed using JAVA, JINI, an SCXML engine and several LabVIEW applications. With this schema it is possible to execute data acquisition and processing applications in an adaptive way. The power of SCXML semantics and the ability to work with XML user-defined data types allow very easy programming of the ITMS platform. With this approach the ITMS platform is a suitable solution for implementing scalable data acquisition and processing systems, based on a service-oriented model, with the ability to easily implement remote participation applications. (authors)

  4. Service-oriented architecture of adaptive, intelligent data acquisition and processing systems for long-pulse fusion experiments

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez, J. [Grupo de Investigacion en Instrumentacion y Acustica Aplicada. Universidad Politecnica de Madrid, Crta. Valencia Km-7 Madrid 28031 (Spain); Ruiz, M., E-mail: mariano.ruiz@upm.e [Grupo de Investigacion en Instrumentacion y Acustica Aplicada. Universidad Politecnica de Madrid, Crta. Valencia Km-7 Madrid 28031 (Spain); Barrera, E.; Lopez, J.M.; Arcas, G. de [Grupo de Investigacion en Instrumentacion y Acustica Aplicada. Universidad Politecnica de Madrid, Crta. Valencia Km-7 Madrid 28031 (Spain); Vega, J. [Asociacion EURATOM/CIEMAT para Fusion, Madrid (Spain)

    2010-07-15

    The data acquisition systems used in long-pulse fusion experiments need to implement data reduction and pattern recognition algorithms in real time. In order to accomplish these operations, it is essential to employ software tools that allow for hot swap capabilities throughout the temporal evolution of the experiments. This is very important because processing needs are not equal during different phases of the experiment. The intelligent test and measurement system (ITMS) developed by UPM and CIEMAT is an example of a technology for implementing scalable data acquisition and processing systems based on PXI and CompactPCI hardware. In the ITMS platform, a set of software tools allows the user to define the processing algorithms associated with the different experimental phases using state machines driven by software events. These state machines are specified using the State Chart XML (SCXML) language. The software tools are developed using JAVA, JINI, an SCXML engine and several LabVIEW applications. Within this schema, it is possible to execute data acquisition and processing applications in an adaptive way. The power of SCXML semantics and the ability to work with XML user-defined data types allow for very easy programming of the ITMS platform. With this approach, the ITMS platform is a suitable solution for implementing scalable data acquisition and processing systems based on a service-oriented model with the ability to easily implement remote participation applications.

  5. Towards Intelligible Query Processing in Relevance Feedback-Based Image Retrieval Systems

    OpenAIRE

    Mohammed, Belkhatir

    2008-01-01

    We have specified within the scope of this paper a framework combining semantics and relational (spatial) characterizations within a coupled architecture in order to address the semantic gap. This framework is instantiated by an operational model based on a sound logic-based formalism, allowing us to define a representation for image documents and a matching function to compare index and query structures. We have specified a query framework coupling keyword-based querying with a relevance feedba...

  6. Dual cell conductivity during ionic exchange processes: the intelligent transmitter EXA DC 400

    International Nuclear Information System (INIS)

    Mier, A.

    1997-01-01

    Why is differential conductivity important compared with standard conductivity measurement? That entirely depends on the application. If we have a process where the conductivity changes, e.g. a cation exchanger, then standard conductivity measurement is not appropriate. With dual-cell conductivity we can rate the process and eliminate conductivity changes originating outside the process, thereby achieving more precise control or monitoring of that process. (Author)
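    The dual-cell principle reduces to a subtraction: measure conductivity before and after the exchanger, and monitor the difference so that disturbances originating upstream of the process cancel out. A minimal sketch with invented values:

```python
# Toy illustration of dual-cell (differential) conductivity monitoring
# of an ion-exchange process. All numbers are invented examples.

def differential_conductivity(inlet_uS_cm: float, outlet_uS_cm: float) -> float:
    """Conductivity removed by the exchanger, in microsiemens/cm.
    Drift common to both cells cancels in the difference."""
    return inlet_uS_cm - outlet_uS_cm

# A shrinking differential signal can flag resin exhaustion even while
# the absolute inlet conductivity drifts.
print(differential_conductivity(12.0, 2.5))  # → 9.5
```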

  7. Dental ethics and emotional intelligence.

    Science.gov (United States)

    Rosenblum, Alvin B; Wolf, Steve

    2014-01-01

    Dental ethics is often taught, viewed, and conducted as an intellectual enterprise, uninformed by other noncognitive factors. Emotional intelligence (EQ) is defined and distinguished from the cognitive intelligence measured by the Intelligence Quotient (IQ). This essay recommends more inclusion of emotional, noncognitive input in the ethical decision process in dental education and dental practice.

  8. Artificial Intelligence and Moral intelligence

    OpenAIRE

    Laura Pana

    2008-01-01

    We discuss the thesis that the implementation of a moral code in the behaviour of artificial intelligent systems needs a specific form of human and artificial intelligence, not just an abstract intelligence. We present intelligence as a system with an internal structure and the structural levels of the moral system, as well as certain characteristics of artificial intelligent agents which can/must be treated as (1) individual entities (with a complex, specialized, autonomous or self-determined,...

  9. The Acquisition of Context Data of Study Process and their Application in Classroom and Intelligent Tutoring Systems

    Directory of Open Access Journals (Sweden)

    Bicans Janis

    2015-12-01

    Full Text Available Over the last decade, researchers have been investigating the potential of an educational paradigm shift from the traditional “one-size-fits-all” teaching approach to an adaptive and more personalized study process. The availability of fast mobile connections, along with the evolution of portable handheld devices such as phones and tablets, enables teachers and learners to communicate and interact with each other in a completely different way and at a completely different speed. These devices not only deliver tutoring material to the learner, but may also serve as sensors providing data about the learning process itself, e.g., learning conditions, location, detailed information on the learning of tutoring material and other information. These sensor data, put into the context of the study process, can be widely used to improve the student experience in the classroom and in e-learning by providing more precise and detailed information to the teacher and/or an intelligent tutoring system for the selection of an appropriate tutoring strategy. This paper analyses and discusses acquisition, processing, and application scenarios of such contextual information.

  10. Trends in ambient intelligent systems the role of computational intelligence

    CERN Document Server

    Khan, Mohammad; Abraham, Ajith

    2016-01-01

    This book demonstrates the success of Ambient Intelligence in providing possible solutions for the daily needs of humans. The book addresses implications of ambient intelligence in areas of domestic living, elderly care, robotics, communication, philosophy and others. The objective of this edited volume is to show that Ambient Intelligence is a boon to humanity with conceptual, philosophical, methodical and applicative understanding. The book also aims to schematically demonstrate developments in the direction of augmented sensors, embedded systems and behavioral intelligence towards Ambient Intelligent Networks or Smart Living Technology. It contains chapters in the field of Ambient Intelligent Networks, which received highly positive feedback during the review process. The book contains research work, with in-depth state of the art from augmented sensors, embedded technology and artificial intelligence along with cutting-edge research and development of technologies and applications of Ambient Intelligent N...

  11. Artificial Intelligence for Controlling Robotic Aircraft

    Science.gov (United States)

    Krishnakumar, Kalmanje

    2005-01-01

    A document consisting mostly of lecture slides presents overviews of artificial-intelligence-based control methods now under development for application to robotic aircraft [called Unmanned Aerial Vehicles (UAVs) in the paper] and spacecraft and to the next generation of flight controllers for piloted aircraft. Following brief introductory remarks, the paper presents background information on intelligent control, including basic characteristics defining intelligent systems and intelligent control and the concept of levels of intelligent control. Next, the paper addresses several concepts in intelligent flight control. The document ends with some concluding remarks, including statements to the effect that (1) intelligent control architectures can guarantee stability of inner control loops and (2) for UAVs, intelligent control provides a robust way to accommodate an outer-loop control architecture for planning and/or related purposes.

  12. Database in Artificial Intelligence.

    Science.gov (United States)

    Wilkinson, Julia

    1986-01-01

    Describes a specialist bibliographic database of literature in the field of artificial intelligence created by the Turing Institute (Glasgow, Scotland) using the BRS/Search information retrieval software. The subscription method for end-users--i.e., annual fee entitles user to unlimited access to database, document provision, and printed awareness…

  13. Individual differences in working memory, secondary memory, and fluid intelligence: evidence from the levels-of-processing span task.

    Science.gov (United States)

    Rose, Nathan S

    2013-12-01

    Individual differences in working memory (WM) are related to performance on secondary memory (SM) and fluid intelligence (gF) tests. However, the source of the relation remains unclear, in part because few studies have controlled for the nature of encoding; therefore, it is unclear whether individual variation is due to encoding, maintenance, or retrieval processes. In the current study, participants performed a WM task (the levels-of-processing span task; Rose, Myerson, Roediger III, & Hale, 2010) and an SM test that tested for both targets and the distracting processing words from the initial WM task. Deeper levels of processing at encoding did not benefit WM, but did benefit subsequent SM, although the amount of benefit was smaller for those with lower WM spans. This result suggests that, despite encoding cues that facilitate retrieval from SM, low spans may have engaged in shallower, maintenance-focused processing to maintain the words in WM. Low spans also recalled fewer targets, more distractors, and more extralist intrusions than high spans, although this was partially due to low spans' poorer recall of targets, which resulted in a greater number of opportunities to commit recall errors. Delayed recall of intrusions and commission of source errors (labeling targets as processing words and vice versa) were significant negative predictors of gF. These results suggest that the ability to use source information to recall relevant information and withhold recall of irrelevant information is a critical source of both individual variation in WM and the relation between WM, SM, and gF.

  14. Transitioning Existing Content: inferring organisation-specific documents

    Directory of Open Access Journals (Sweden)

    Arijit Sengupta

    2000-11-01

    Full Text Available A definition for a document type within an organization represents an organizational norm about the way the organizational actors represent products and supporting evidence of organizational processes. Generating a good organization-specific document structure is, therefore, important since it can capture a shared understanding among the organizational actors about how certain business processes should be performed. Current tools that generate document type definitions focus on the underlying technology, emphasizing tags created in a single instance document. The tools thus fall short of capturing the shared understanding between organizational actors about how a given document type should be represented. We propose a method for inferring organization-specific document structures using multiple instance documents as inputs. The method consists of heuristics that combine individual document definitions, which may have been compiled using standard algorithms. We propose a number of heuristics utilizing artificial intelligence and natural language processing techniques. As the research progresses, the heuristics will be tested on a suite of test cases representing multiple instance documents for different document types. The complete methodology will be implemented as a research prototype.
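    One simple heuristic in the spirit this paper proposes can be sketched directly: given the tag sets of several instance documents, treat tags present in every instance as required and tags present in only some as optional. The tag sets below are invented examples, and real heuristics would also have to infer ordering and nesting.

```python
# Toy heuristic: infer a shared document structure from multiple instances
# by splitting tags into required (in every instance) and optional (in some).

def infer_structure(instances: list[set[str]]) -> dict[str, set[str]]:
    required = set.intersection(*instances)
    optional = set.union(*instances) - required
    return {"required": required, "optional": optional}

docs = [
    {"title", "author", "date", "body"},
    {"title", "author", "body", "appendix"},
    {"title", "author", "date", "body"},
]
result = infer_structure(docs)
print(sorted(result["required"]), sorted(result["optional"]))
# → ['author', 'body', 'title'] ['appendix', 'date']
```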

  15. An evolutionary approach for business process redesign : towards an intelligent system

    NARCIS (Netherlands)

    Netjes, M.; Limam Mansar, S.; Reijers, H.A.; Aalst, van der W.M.P.; Cardoso, J.; Cordeiro, J.; Filipe, J.

    2007-01-01

    Although extensive literature on BPR is available, there is still a lack of concrete guidance on actually changing processes for the better. It is our goal to provide a redesign approach which describes and supports the steps to derive from an existing process a better performing redesign. In this

  16. Supporting the full BPM life-cycle using process mining and intelligent redesign

    NARCIS (Netherlands)

    Netjes, M.; Reijers, H.A.; Aalst, van der W.M.P.; Siau, K.

    2007-01-01

    Business Process Management (BPM) systems provide a broad range of facilities to enact and manage operational business processes. Ideally, these systems should provide support for the complete BPM life-cycle: (re)design, configuration, execution, control, and diagnosis by the FileNet P8

  17. Bio-inspired Artificial Intelligence: А Generalized Net Model of the Regularization Process in MLP

    Directory of Open Access Journals (Sweden)

    Stanimir Surchev

    2013-10-01

    Full Text Available Many objects and processes inspired by nature have been recreated by scientists. The inspiration for creating a multilayer neural network came from the human brain. The brain possesses a complicated structure that is difficult to recreate, because of the many processes involved that require different solving methods. The aim of this paper is to describe one of the methods that improve the learning process of an artificial neural network. The proposed generalized net model presents the regularization process in a multilayer neural network. The purpose of regularization is to protect the neural network from overfitting, and it is commonly used in the neural network training process. Among the many such methods, the subject of interest here is the one known as regularization: it adds a function to the training objective that keeps weights and biases at smaller values in order to protect against overfitting.
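    The regularization described above can be shown in its simplest form: L2 weight decay adds a penalty (lam/2)*w^2 to the loss, so each gradient step also shrinks the weight toward zero. A single scalar "network" keeps the illustration tiny; it is a generic sketch, not the paper's generalized-net model.

```python
# Minimal L2 regularization (weight decay) sketch.

def train_step(w: float, grad_loss: float, lr: float = 0.1, lam: float = 0.01) -> float:
    """One gradient step on loss + (lam/2)*w**2.
    The extra lam*w gradient term continually shrinks the weight."""
    return w - lr * (grad_loss + lam * w)

w = 5.0
for _ in range(100):
    w = train_step(w, grad_loss=0.0)   # zero data gradient: pure decay
print(round(w, 3))  # weight has decayed toward zero
```

With a real data gradient the decay term biases training toward small weights rather than driving them all the way to zero, which is what limits overfitting.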

  18. Shifts in information processing level: the speed theory of intelligence revisited.

    Science.gov (United States)

    Sircar, S S

    2000-06-01

    A hypothesis is proposed here to reconcile the inconsistencies observed in the IQ-P3 latency relation. The hypothesis stems from the observation that task-induced increase in P3 latency correlates positively with IQ scores. It is hypothesised that: (a) there are several parallel information processing pathways of varying complexity which are associated with the generation of P3 waves of varying latencies; (b) with increasing workload, there is a shift in the 'information processing level' through progressive recruitment of more complex polysynaptic pathways with greater processing power and inhibition of the oligosynaptic pathways; (c) high-IQ subjects have a greater reserve of higher level processing pathways; (d) a given 'task-load' imposes a greater 'mental workload' in subjects with lower IQ than in those with higher IQ. According to this hypothesis, a meaningful comparison of the P3 correlates of IQ is possible only when the information processing level is pushed to its limits.

  19. Evaluation of the functional performance and technical quality of an Electronic Documentation System of the Nursing Process.

    Science.gov (United States)

    de Oliveira, Neurilene Batista; Peres, Heloisa Helena Ciqueto

    2015-01-01

    To evaluate the functional performance and the technical quality of the Electronic Documentation System of the Nursing Process of the Teaching Hospital of the University of São Paulo: an exploratory-descriptive study. The evaluation used the Quality Model of standard 25010 and the Evaluation Process defined under standard 25040, both of the International Organization for Standardization/International Electrotechnical Commission. The quality characteristics evaluated were: functional suitability, reliability, usability, performance efficiency, compatibility, security, maintainability and portability. The sample was made up of 37 evaluators. In the evaluation by the specialists in information technology, only the usability characteristic obtained a rate of positive responses of less than 70%. For the nurse lecturers, all the quality characteristics obtained a rate of positive responses of over 70%. The staff nurses of the medical and surgical clinics (with experience in using the system) and staff nurses from other units of the hospital and from other health institutions (without experience in using the system) gave rates of positive responses of more than 70% for functional suitability, usability, and security; however, performance efficiency, reliability and compatibility obtained rates below the parameter established. Overall, the software achieved rates of positive responses of over 70% for the majority of the quality characteristics evaluated.

  20. Towards Intelligent Supply Chains

    DEFF Research Database (Denmark)

    Siurdyban, Artur; Møller, Charles

    2012-01-01

    To address the risk of deploying inapt operations leading to deterioration of profits, we propose a unified business process design framework based on the paradigm of intelligence. Intelligence allows humans and human-designed systems to cope with environmental volatility, and we argue that its principles applied to the context of organizational processes can increase the success rate of business operations. The framework is created using a set of theoretically based constructs grounded in a discussion across several streams of research, including psychology, pedagogy, artificial intelligence, learning, business process management and supply chain management. It outlines a number of system tasks combined in four integrated management perspectives: build, execute, grow and innovate, put forward as business process design propositions for Intelligent Supply Chains.

  1. Artificial Intelligence.

    Science.gov (United States)

    Information Technology Quarterly, 1985

    1985-01-01

    This issue of "Information Technology Quarterly" is devoted to the theme of "Artificial Intelligence." It contains two major articles: (1) Artificial Intelligence and Law" (D. Peter O'Neill and George D. Wood); (2) "Artificial Intelligence: A Long and Winding Road" (John J. Simon, Jr.). In addition, it contains two sidebars: (1) "Calculating and…

  2. Automated business process management – in times of digital transformation using machine learning or artificial intelligence

    OpenAIRE

    Paschek Daniel; Luminosu Caius Tudor; Draghici Anca

    2017-01-01

    The continuous optimization of business processes is still a challenge for companies. In times of digital transformation, with internal and external framework conditions changing ever faster and new customer expectations of fastest delivery, best quality of goods and much more, companies should set up their internal processes in the best possible way. But what should they do if framework conditions change unexpectedly? The purpose of the paper is to analyse how the digital transformation will impact the Business Proc...

  3. The prediction of breast cancer biopsy outcomes using two CAD approaches that both emphasize an intelligible decision process

    International Nuclear Information System (INIS)

    Elter, M.; Schulz-Wendtland, R.; Wittenberg, T.

    2007-01-01

    Mammography is the most effective method for breast cancer screening available today. However, the low positive predictive value of breast biopsy resulting from mammogram interpretation leads to approximately 70% unnecessary biopsies with benign outcomes. To reduce the high number of unnecessary breast biopsies, several computer-aided diagnosis (CAD) systems have been proposed in the last several years. These systems help physicians in their decision to perform a breast biopsy on a suspicious lesion seen in a mammogram or to perform a short-term follow-up examination instead. We present two novel CAD approaches that both emphasize an intelligible decision process to predict breast biopsy outcomes from BI-RADS findings. An intelligible reasoning process is an important requirement for the acceptance of CAD systems by physicians. The first approach induces a global model based on decision-tree learning. The second approach is based on case-based reasoning and applies an entropic similarity measure. We have evaluated the performance of both CAD approaches on two large publicly available mammography reference databases using receiver operating characteristic (ROC) analysis, bootstrap sampling, and the ANOVA statistical significance test. Both approaches outperform the diagnosis decisions of the physicians. Hence, both systems have the potential to reduce the number of unnecessary breast biopsies in clinical practice. A comparison of the performance of the proposed decision-tree and CBR approaches with a state-of-the-art approach based on artificial neural networks (ANN) shows that the CBR approach performs slightly better than the ANN approach, which in turn results in slightly better performance than the decision-tree approach. The differences are statistically significant (p value <0.001). On 2100 masses extracted from the DDSM database, the CBR approach for example resulted in an area under the ROC curve of A(z)=0.89±0.01, the decision-tree approach in A(z)=0.87±0
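
    The A(z) figures quoted are areas under the ROC curve. As an illustration of what that number measures, here is a minimal sketch of ROC-AUC computed via the rank (Mann-Whitney) formulation, on invented scores and biopsy labels rather than the DDSM data:

    ```python
    # Sketch: area under the ROC curve (the A(z) in the abstract) computed
    # directly from classifier scores. AUC equals the probability that a
    # randomly chosen positive case outscores a randomly chosen negative
    # case, with ties counting one half. All values below are synthetic.

    def roc_auc(scores, labels):
        pos = [s for s, l in zip(scores, labels) if l == 1]
        neg = [s for s, l in zip(scores, labels) if l == 0]
        wins = 0.0
        for p in pos:
            for n in neg:
                if p > n:
                    wins += 1.0
                elif p == n:
                    wins += 0.5
        return wins / (len(pos) * len(neg))

    scores = [0.9, 0.8, 0.7, 0.6, 0.55, 0.4, 0.3, 0.2]  # CAD outputs
    labels = [1,   1,   0,   1,   0,    0,   1,   0]    # 1 = malignant
    print(roc_auc(scores, labels))  # → 0.75
    ```

    A perfect classifier would score 1.0; chance performance is 0.5, which puts the reported A(z)=0.89 in context.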

  4. How holistic processing of faces relates to cognitive control and intelligence.

    Science.gov (United States)

    Gauthier, Isabel; Chua, Kao-Wei; Richler, Jennifer J

    2018-04-16

    The Vanderbilt Holistic Processing Test for faces (VHPT-F) is the first standard test designed to measure individual differences in holistic processing. The test measures failures of selective attention to face parts through congruency effects, an operational definition of holistic processing. However, this conception of holistic processing has been challenged by the suggestion that it may tap into the same selective attention or cognitive control mechanisms that yield congruency effects in Stroop and Flanker paradigms. Here, we report data from 130 subjects on the VHPT-F, several versions of Stroop and Flanker tasks, as well as fluid IQ. Results suggested a small degree of shared variance in Stroop and Flanker congruency effects, which did not relate to congruency effects on the VHPT-F. Variability on the VHPT-F was also not correlated with Fluid IQ. In sum, we find no evidence that holistic face processing as measured by congruency in the VHPT-F is accounted for by domain-general control mechanisms.

  5. An Introduction to Intelligent Processing Programs Developed by the Air Force Manufacturing Technology Directorate

    Science.gov (United States)

    Sampson, Paul G.; Sny, Linda C.

    1992-01-01

    The Air Force has numerous on-going manufacturing and integration development programs (machine tools, composites, metals, assembly, and electronics) which are instrumental in improving productivity in the aerospace industry, but more importantly, have identified strategies and technologies required for the integration of advanced processing equipment. An introduction to four current Air Force Manufacturing Technology Directorate (ManTech) manufacturing areas is provided. Research is being carried out in the following areas: (1) machining initiatives for aerospace subcontractors which provide for advanced technology and innovative manufacturing strategies to increase the capabilities of small shops; (2) innovative approaches to advance machine tool products and manufacturing processes; (3) innovative approaches to advance sensors for process control in machine tools; and (4) efforts currently underway to develop, with the support of industry, the Next Generation Workstation/Machine Controller (Low-End Controller Task).

  6. Introduction of artificial intelligence techniques for computerized management of defects in an industrial process

    International Nuclear Information System (INIS)

    Utzel, N.

    1991-06-01

    An optimized management of the Tore Supra Tokamak requires computerized defect management. The aim is the analysis of an inhibited situation that is not corrected by the automatisms of the process and can be handled only by human intervention. The operator should understand, make a diagnosis and act to restore the system. This report studies an expert system helping the operator to analyze defects of the two main cooling loops (decarbonated water and pressurized water), management of the history of malfunctions and recording of diagnoses, elaboration of an adapted expert model, and installation of a methodology for defect management in other processes of Tore Supra [fr

  7. Pathogen intelligence

    Directory of Open Access Journals (Sweden)

    Michael eSteinert

    2014-01-01

    Full Text Available Different species inhabit different sensory worlds and thus have evolved diverse means of processing information, learning and memory. In the escalated arms race with host defense, each pathogenic bacterium not only has evolved its individual cellular sensing and behaviour, but also collective sensing, interbacterial communication, distributed information processing, joint decision making, dissociative behaviour, and the phenotypic and genotypic heterogeneity necessary for epidemiologic success. Moreover, pathogenic populations take advantage of dormancy strategies and rapid evolutionary speed, which allow them to save co-generated intelligent traits in a collective genomic memory. This review discusses how these mechanisms add further levels of complexity to bacterial pathogenicity and transmission, and how mining for these mechanisms could help to develop new anti-infective strategies.

  8. Intelligent Modeling Combining Adaptive Neuro Fuzzy Inference System and Genetic Algorithm for Optimizing Welding Process Parameters

    Science.gov (United States)

    Gowtham, K. N.; Vasudevan, M.; Maduraimuthu, V.; Jayakumar, T.

    2011-04-01

    Modified 9Cr-1Mo ferritic steel is used as a structural material for steam generator components of power plants. Generally, tungsten inert gas (TIG) welding is preferred for welding of these steels, in which the depth of penetration achievable during autogenous welding is limited. Therefore, activated flux TIG (A-TIG) welding, a novel welding technique, has been developed in-house to increase the depth of penetration. In modified 9Cr-1Mo steel joints produced by the A-TIG welding process, weld bead width, depth of penetration, and heat-affected zone (HAZ) width play an important role in determining the mechanical properties as well as the performance of the weld joints during service. To obtain the desired weld bead geometry and HAZ width, it becomes important to set the welding process parameters. In this work, an adaptive neuro-fuzzy inference system is used to develop independent models correlating the welding process parameters like current, voltage, and torch speed with weld bead shape parameters like depth of penetration, bead width, and HAZ width. Then a genetic algorithm is employed to determine the optimum A-TIG welding process parameters to obtain the desired weld bead shape parameters and HAZ width.
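
    The optimization step can be sketched as a simple genetic algorithm. The linear "model" below is only a stand-in for the trained ANFIS models, and the parameter ranges and target depth are invented for illustration:

    ```python
    # Sketch of the GA search described in the abstract: find welding
    # parameters (current, voltage, torch speed) whose predicted depth of
    # penetration matches a target. model() stands in for the ANFIS model;
    # all bounds, coefficients and the target are hypothetical.
    import random

    random.seed(1)
    BOUNDS = [(100.0, 300.0), (10.0, 16.0), (1.0, 5.0)]  # current, voltage, speed
    TARGET_DEPTH = 5.0  # mm, hypothetical

    def model(p):
        cur, volt, speed = p
        return 0.02 * cur + 0.1 * volt - 0.6 * speed  # stand-in for ANFIS

    def fitness(p):
        return -abs(model(p) - TARGET_DEPTH)  # higher is better

    def clip(v, lo, hi):
        return max(lo, min(hi, v))

    pop = [[random.uniform(lo, hi) for lo, hi in BOUNDS] for _ in range(30)]
    for _ in range(60):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:10]                       # elitist selection
        children = []
        while len(children) < 20:
            a, b = random.sample(parents, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]  # crossover
            i = random.randrange(3)                      # mutate one gene
            lo, hi = BOUNDS[i]
            child[i] = clip(child[i] + random.gauss(0, 0.05 * (hi - lo)), lo, hi)
            children.append(child)
        pop = parents + children

    best = max(pop, key=fitness)
    assert abs(model(best) - TARGET_DEPTH) < 0.2  # near the target depth
    ```

    In the paper the same loop would evaluate the ANFIS predictions for depth, bead width, and HAZ width simultaneously; the single-objective version above just shows the mechanics.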

  9. The crustal dynamics intelligent user interface anthology

    Science.gov (United States)

    Short, Nicholas M., Jr.; Campbell, William J.; Roelofs, Larry H.; Wattawa, Scott L.

    1987-01-01

    The National Space Science Data Center (NSSDC) has initiated an Intelligent Data Management (IDM) research effort which has, as one of its components, the development of an Intelligent User Interface (IUI). The intent of the IUI is to develop a friendly and intelligent user interface service based on expert systems and natural language processing technologies. The purpose of such a service is to support the large number of potential scientific and engineering users that have need of space and land-related research and technical data, but have little or no experience in query languages or understanding of the information content or architecture of the databases of interest. This document presents the design concepts, development approach and evaluation of the performance of a prototype IUI system for the Crustal Dynamics Project Database, which was developed using a microcomputer-based expert system tool (M. 1), the natural language query processor THEMIS, and the graphics software system GSS. The IUI design is based on a multiple view representation of a database from both the user and database perspective, with intelligent processes to translate between the views.

  10. Intelligence Ethics:

    DEFF Research Database (Denmark)

    Rønn, Kira Vrist

    2016-01-01

    Questions concerning what constitutes a morally justified conduct of intelligence activities have received increased attention in recent decades. However, intelligence ethics is not yet homogeneous or embedded as a solid research field. The aim of this article is to sketch the state of the art of intelligence ethics and point out subjects for further scrutiny in future research. The review clusters the literature on intelligence ethics into two groups: respectively, contributions on external topics (i.e., the accountability of and the public trust in intelligence agencies) and internal topics (i.e., the search for an ideal ethical framework for intelligence actions). The article concludes that there are many holes to fill for future studies on intelligence ethics, both in external and internal discussions. Thus, the article is an invitation – especially to moral philosophers and political theorists...

  11. 10th International Symposium on Intelligent Distributed Computing

    CERN Document Server

    Seghrouchni, Amal; Beynier, Aurélie; Camacho, David; Herpson, Cédric; Hindriks, Koen; Novais, Paulo

    2017-01-01

    This book presents the combined peer-reviewed proceedings of the tenth International Symposium on Intelligent Distributed Computing (IDC’2016), which was held in Paris, France from October 10th to 12th, 2016. The 23 contributions address a range of topics related to theory and application of intelligent distributed computing, including: Intelligent Distributed Agent-Based Systems, Ambient Intelligence and Social Networks, Computational Sustainability, Intelligent Distributed Knowledge Representation and Processing, Smart Networks, Networked Intelligence and Intelligent Distributed Applications, amongst others.

  12. Intelligent decision technology support in practice

    CERN Document Server

    Neves-Silva, Rui; Jain, Lakhmi; Phillips-Wren, Gloria; Watada, Junzo; Howlett, Robert

    2016-01-01

    This book contains a collection of innovative chapters emanating from topics raised during the 5th KES International Conference on Intelligent Decision Technologies (IDT), held during 2013 at Sesimbra, Portugal. The authors were invited to expand their original papers into a plethora of innovative chapters espousing IDT methodologies and applications. This book documents leading-edge contributions, representing advances in Knowledge-Based and Intelligent Information and Engineering Systems. It acknowledges that researchers recognize that society is familiar with modern Advanced Information Processing and increasingly expects richer IDT systems. Each chapter concentrates on the theory, design, development, implementation, testing or evaluation of IDT techniques or applications. Anyone who wants to work with IDT or simply process knowledge should consider reading one or more chapters and focus on their technique of choice. Most readers will benefit from reading additional chapters to access alternative techniq...

  13. The management of electronic documents generated from compilation and revision processes of nuclear and radiation safety regulations and standards

    International Nuclear Information System (INIS)

    Wang Wenhai; Fan Yun; Shang Zhaorong

    2010-01-01

    As the Secretary Group of the Regulations and Standards Review Committee on nuclear and radiation safety needs to deal with a large number of electronic documents in the course of the regulation and standard review meetings, the article gives a systematic method covering electronic document file naming and management as well as procedures for file transfer, storage and usage. (authors)

  14. Documentation of and satisfaction with the service delivery process of electric powered scooters among adult users in different national contexts

    DEFF Research Database (Denmark)

    Sund, Terje; Iwarsson, Susanne; Andersen, Mette C

    2013-01-01

    A follow-up design was applied, based on a consecutive inclusion of 50 Danish and 86 Norwegian adults as they were about to be provided a scooter. A study-specific structured questionnaire for documentation of the service delivery process (SDP) was administered. The Satisfaction with Assistive Technology Services instrument was used for documenting user satisfaction...

  15. Logic Programs as a Specification and Description Tool in the Design Process of an Intelligent Tutoring System

    OpenAIRE

    Möbus, Claus

    1987-01-01

    We propose the use of logic programs when designing intelligent tutoring systems. With their help we specified the small-step semantics of the learning curriculum, designed the graphical user interface, derived instructions and modelled students' knowledge.

  16. Ultrasonic velocity measurements- a potential sensor for intelligent processing of austenitic stainless steels

    International Nuclear Information System (INIS)

    Venkadesan, S.; Palanichamy, P.; Vasudevan, M.; Baldev Raj

    1996-01-01

    Development of sensors based on Non-Destructive Evaluation (NDE) techniques for on-line sensing of microstructure and properties requires a thorough knowledge on the relation between the sensing mechanism/measurement of an NDE technique and the microstructure. As a first step towards developing an on-line sensor for studying the dynamic microstructural changes during processing of austenitic stainless steels, ultrasonic velocity measurements have been carried out to study the microstructural changes after processing. Velocity measurements could follow the progress of annealing starting from recovery, onset and completion of recrystallization, sense the differences in the microstructure obtained after hot deformation and estimate the grain size. This paper brings out the relation between the sensing method based on ultrasonic velocity measurements and the microstructure in austenitic stainless steel. (author)

  17. Semantic Business Intelligence - a New Generation of Business Intelligence

    Directory of Open Access Journals (Sweden)

    Dinu AIRINEI

    2012-01-01

    Full Text Available Business Intelligence solutions represent applications used by companies to manage, process and analyze data in order to provide substantiated decisions. In the context of Semantic Web development, the trend is to integrate semantic unstructured data, so that business intelligence solutions have to be redesigned in such a manner that they can analyze, process and synthesize, in addition to traditional data, data integrated with another semantic form and structure. This invariably leads to the appearance of a new BI solution, called Semantic Business Intelligence.

  18. On-line Cutting Tool Condition Monitoring in Machining Processes Using Artificial Intelligence

    OpenAIRE

    Vallejo, Antonio J.; Morales-Menéndez, Rubén; Alique, J.R.

    2008-01-01

    This chapter presented new ideas for monitoring and diagnosis of the cutting tool condition with two different algorithms for pattern recognition: HMM, and ANN. The monitoring and diagnosis system was implemented for peripheral milling process in HSM, where several Aluminium alloys and cutting tools were used. The flank wear (VB) was selected as the criterion to evaluate the tool's life and four cutting tool conditions were defined to be recognized: New, half new, half worn, and worn conditio...

  19. Investment Cost Model in Business Process Intelligence in Banking And Electricity Company

    Directory of Open Access Journals (Sweden)

    Arta Moro Sundjaja

    2016-06-01

    Full Text Available Higher demand from top management for measuring business process performance is driving the increasing implementation of BPM and BI in enterprises. The problem faced by top management is how to integrate data from all the systems used to support the business and turn those data into information able to support decision-making processes. Our literature review elaborates several implementations of BPI in companies in Australia and Germany, the challenges faced by organizations in developing BPI solutions, and some cost models for calculating the investment in a BPI solution. This paper shows the success of BPI applications at banks and insurance companies in Germany and an electricity company in Australia, and aims to give a vision of the importance of BPI application. Many challenges in the BPI applications of companies in Germany and Australia, BPI solutions, and data warehouse design development are discussed to add insight for future BPI development. Finally, the paper explains how to analyze the cost associated with a BPI solution investment.

  20. Documents preparation and review

    International Nuclear Information System (INIS)

    1999-01-01

    The Ignalina Safety Analysis Group takes an active role in assisting the regulatory body VATESI to prepare various regulatory documents and in reviewing safety reports and other documentation presented by the Ignalina NPP in the process of licensing of unit 1. The list of main documents prepared and reviewed is presented.

  1. Intelligence Naturelle et Intelligence Artificielle

    OpenAIRE

    Dubois, Daniel

    2011-01-01

    This article presents a systemic approach to the concept of natural intelligence, with the objective of creating an artificial intelligence. Natural intelligence, human and non-human animal, is thus a function composed of faculties that make it possible to know and to understand. Moreover, natural intelligence remains inseparable from its structure, namely the organs of the brain and the body. The temptation is great to endow computer systems with an artificial intelligence ...

  2. Artificial Intelligence In Processing A Sequence Of Time-Varying Images

    Science.gov (United States)

    Siler, W.; Tucker, D.; Buckley, J.; Hess, R. G.; Powell, V. G.

    1985-04-01

    A computer system is described for unsupervised analysis of five sets of ultrasound images of the heart. Each set consists of 24 frames taken at 33 millisecond intervals. The images are acquired in real time with computer control of the ultrasound apparatus. After acquisition the images are segmented by a sequence of image-processing programs; features are extracted and stored in a version of the Carnegie-Mellon Blackboard. Region classification is accomplished by a fuzzy logic expert system FLOPS based on OPS5. Preliminary results are given.

  3. Location Intelligence Solutions

    International Nuclear Information System (INIS)

    Schmidt, D.

    2015-01-01

    Location Intelligence (LI) means using the spatial dimension of information as a key to support business processes. This spatial dimension has to be defined by geographic coordinates. Storing these spatial objects in a database allows for attaching a 'meaning' to them, like 'current position', 'border', 'building' or 'room'. The coordinates then represent real-world objects, which can be relevant for the measurement, documentation, control or optimization of (parameters of) business processes aiming at different business objectives. But LI can only be applied if the locations can be determined with an accuracy (in space and time) appropriate for the business process in question. Therefore the first step in any development of an LI solution is the analysis of the business process itself regarding its requirements for spatial and time resolution and accuracy. The next step is the detailed analysis of the surrounding conditions of the process: Does the process happen indoors and/or outdoors? Are there moving objects? If yes, how fast are they? What does the relevant environment look like? Is technical infrastructure available? Is the process restricted by regulations? As a result, a proper Location Detection Technology (LDT) has to be chosen in order to get reliable and accurate positions of the relevant objects. Under the highly challenging conditions of the business processes IAEA inspectors work with, the chosen LDTs have to deliver reliable positioning at 'room-level' accuracy, even if there is no location-enabling infrastructure in place, the objects (people) are mostly indoors and have to work under strong regulations. The presentation will give insights into innovative LI solutions based on technologies of different LDT providers. Pros and cons of combinations of different LDTs (like multi-GNSS, IMU, camera, and human interaction based positioning) will be discussed from the

  4. Intelligent environmental sensing

    CERN Document Server

    Mukhopadhyay, Subhas

    2015-01-01

    Developing environmental sensing and monitoring technologies becomes essential, especially for industries that may cause severe contamination. Intelligent environmental sensing uses novel sensor techniques, intelligent signal and data processing algorithms, and wireless sensor networks to enhance environmental sensing and monitoring. It finds applications in many environmental problems such as oil and gas, water quality, and agriculture. This book addresses issues related to three main approaches to intelligent environmental sensing and discusses their latest technological developments. Key contents of the book include: agricultural monitoring; classification, detection, and estimation; data fusion; geological monitoring; motor monitoring; multi-sensor systems; oil reservoir monitoring; sensor motes; water quality monitoring; and wireless sensor network protocols.

  5. Longitudinal Mediation of Processing Speed on Age-Related Change in Memory and Fluid Intelligence

    Science.gov (United States)

    Robitaille, Annie; Piccinin, Andrea M.; Muniz, Graciela; Hoffman, Lesa; Johansson, Boo; Deeg, Dorly J.H.; Aartsen, Marja J.; Comijs, Hannie C.; Hofer, Scott M.

    2014-01-01

    Age-related decline in processing speed has long been considered a key driver of cognitive aging. While the majority of empirical evidence for the processing speed hypothesis has been obtained from analyses of between-person age differences, longitudinal studies provide a direct test of within-person change. Using recent developments in longitudinal mediation analysis, we examine the speed–mediation hypothesis at both the within- and between-person levels in two longitudinal studies, LASA and OCTO-Twin. We found significant within-person indirect effects of change in age, such that increasing age was related to lower speed which, in turn, relates to lower performance across repeated measures on other cognitive outcomes. Although between-person indirect effects were also significant in LASA, they were not in OCTO-Twin. These differing magnitudes of direct and indirect effects across levels demonstrate the importance of separating between- and within-person effects in evaluating theoretical models of age-related change. PMID:23957224
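
    The product-of-coefficients logic behind the mediation test (indirect effect = the age-to-speed path a times the speed-to-outcome path b, controlling for age) can be sketched for the simpler between-person case. The age, speed, and memory values below are synthetic, not LASA or OCTO-Twin data, and the longitudinal multilevel machinery of the paper is omitted:

    ```python
    # Sketch of a between-person mediation analysis: age -> processing speed
    # -> memory. The indirect effect is a * b, where a is the slope of speed
    # on age and b is the partial slope of memory on speed controlling for
    # age. All data below are invented for illustration.

    def mean(v):
        return sum(v) / len(v)

    def cov(u, v):
        mu, mv = mean(u), mean(v)
        return sum((x - mu) * (y - mv) for x, y in zip(u, v)) / len(u)

    age    = [60, 65, 70, 75, 80, 85]
    speed  = [52, 50, 45, 44, 40, 37]   # declines with age
    memory = [30, 29, 26, 25, 22, 21]   # tracks speed

    a = cov(age, speed) / cov(age, age)  # age -> speed slope
    # partial slope of memory on speed, controlling for age:
    b = (cov(speed, memory) * cov(age, age) - cov(age, memory) * cov(age, speed)) \
        / (cov(speed, speed) * cov(age, age) - cov(age, speed) ** 2)
    indirect = a * b

    assert a < 0          # older -> slower
    assert b > 0          # faster -> better memory, age held constant
    assert indirect < 0   # age lowers memory *through* speed
    ```

    The within-person version in the paper applies the same a×b decomposition to repeated measures on each individual, which is what allows between- and within-person effects to diverge.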

  6. The Professionalization of Intelligence Cooperation

    DEFF Research Database (Denmark)

    Svendsen, Adam David Morgan

    "Providing an in-depth insight into the subject of intelligence cooperation (officially known as liaison), this book explores the complexities of this process. Towards facilitating a general understanding of the professionalization of intelligence cooperation, Svendsen's analysis includes risk management and encourages the realisation of greater resilience. Svendsen discusses the controversial, mixed and uneven characterisations of the process of the professionalization of intelligence cooperation and argues for a degree of 'fashioning method out of mayhem' through greater operational...

  7. WIPP documentation plan

    International Nuclear Information System (INIS)

    Plung, D.L.; Montgomery, T.T.; Glasstetter, S.R.

    1986-01-01

    In support of the programs at the Waste Isolation Pilot Plant (WIPP), the Publications and Procedures Section developed a documentation plan that provides an integrated document hierarchy; further, this plan affords several unique features: 1) the format for procedures minimizes the writing responsibilities of the technical staff and maximizes use of the writing and editing staff; 2) review cycles have been structured to expedite the processing of documents; and 3) the numbers of documents needed to support the program have been appreciably reduced.

  8. Intelligent tuning of vibration mitigation process for single link manipulator using fuzzy logic

    Directory of Open Access Journals (Sweden)

    Ahmed A. Ali

    2017-08-01

    Full Text Available In this work, active vibration mitigation for a smart single-link manipulator is presented. Two piezoelectric transducers were utilized to act as actuator and sensor, respectively. A classical Proportional (P) controller was tested numerically and experimentally, and the comparison between measured results showed good agreement. The proposed work introduces fuzzy logic for tuning the controller's gain within the finite element method. Classical Proportional-Integral (PI), Fuzzy-P and Fuzzy-PI controllers were fully integrated as a series of [IF-Then] states and solved numerically using the Finite Element (FE) solver ANSYS. The proposed method paves the way to solving the tuning process entirely within a single FE solver with high efficiency. It achieved mitigation of the overall free response within about 52% and 74% of the manipulator settling time when the Fuzzy-P and Fuzzy-PI controllers were activated, respectively. This contribution can be utilized for many other applications related to fuzzy topics.
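
    The [IF-Then] tuning idea can be sketched as a tiny Sugeno-style rule base that maps the measured vibration error to a proportional gain. The membership shapes and gain levels below are invented for illustration, not taken from the paper:

    ```python
    # Sketch of Fuzzy-P gain tuning: triangular memberships over the error
    # magnitude fire [IF-Then] rules whose weighted average gives the gain.
    # Membership breakpoints and gain consequents are hypothetical.

    def tri(x, a, b, c):
        """Triangular membership function peaking at b, zero outside (a, c)."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    RULES = [  # (membership over |error|, gain consequent)
        (lambda e: tri(e, -0.5, 0.0, 0.5), 1.0),  # IF error SMALL  THEN Kp low
        (lambda e: tri(e,  0.0, 0.5, 1.0), 4.0),  # IF error MEDIUM THEN Kp mid
        (lambda e: tri(e,  0.5, 1.0, 1.5), 8.0),  # IF error LARGE  THEN Kp high
    ]

    def fuzzy_gain(error):
        e = min(abs(error), 1.0)  # normalized error magnitude
        w = [mu(e) for mu, _ in RULES]
        return sum(wi * g for wi, (_, g) in zip(w, RULES)) / sum(w)

    assert fuzzy_gain(0.0) == 1.0                            # small error -> low gain
    assert fuzzy_gain(1.0) == 8.0                            # large error -> high gain
    assert fuzzy_gain(0.0) < fuzzy_gain(0.6) < fuzzy_gain(1.0)
    ```

    In the paper this lookup runs inside the FE time-stepping loop, so the P (or PI) gain adapts as the vibration amplitude decays.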

  9. An intelligent approach to optimize the EDM process parameters using utility concept and QPSO algorithm

    Directory of Open Access Journals (Sweden)

    Chinmaya P. Mohanty

    2017-04-01

    Full Text Available Although significant research has gone into the field of electrical discharge machining (EDM, analysis related to the machining efficiency of the process with different electrodes has not been adequately made. Copper and brass are frequently used as electrode materials but graphite can be used as a potential electrode material due to its high melting point temperature and good electrical conductivity. In view of this, the present work attempts to compare the machinability of copper, graphite and brass electrodes while machining Inconel 718 super alloy. Taguchi’s L27 orthogonal array has been employed to collect data for the study and analyze effect of machining parameters on performance measures. The important performance measures selected for this study are material removal rate, tool wear rate, surface roughness and radial overcut. Machining parameters considered for analysis are open circuit voltage, discharge current, pulse-on-time, duty factor, flushing pressure and electrode material. From the experimental analysis, it is observed that electrode material, discharge current and pulse-on-time are the important parameters for all the performance measures. Utility concept has been implemented to transform a multiple performance characteristics into an equivalent performance characteristic. Non-linear regression analysis is carried out to develop a model relating process parameters and overall utility index. Finally, the quantum behaved particle swarm optimization (QPSO and particle swarm optimization (PSO algorithms have been used to compare the optimal level of cutting parameters. Results demonstrate the elegance of QPSO in terms of convergence and computational effort. The optimal parametric setting obtained through both the approaches is validated by conducting confirmation experiments.
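
    The swarm-search step can be sketched with a standard PSO loop. The utility surface below is a synthetic stand-in for the non-linear regression model relating process parameters to the overall utility index; bounds, peak location, and swarm settings are all invented:

    ```python
    # Sketch of PSO maximizing an overall utility index over two EDM
    # parameters (discharge current, pulse-on-time). utility() stands in for
    # the regression model in the paper; its peak is placed arbitrarily.
    import random

    random.seed(7)

    def utility(p):  # hypothetical: best utility at current=10, pulse=400
        cur, pulse = p
        return -((cur - 10.0) ** 2 + ((pulse - 400.0) / 50.0) ** 2)

    BOUNDS = [(5.0, 20.0), (100.0, 800.0)]
    N, STEPS, W, C1, C2 = 20, 100, 0.7, 1.5, 1.5  # swarm size, iters, weights

    pos = [[random.uniform(lo, hi) for lo, hi in BOUNDS] for _ in range(N)]
    vel = [[0.0, 0.0] for _ in range(N)]
    pbest = [p[:] for p in pos]                 # personal bests
    gbest = max(pos, key=utility)[:]            # global best

    for _ in range(STEPS):
        for i in range(N):
            for d in range(2):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (W * vel[i][d]
                             + C1 * r1 * (pbest[i][d] - pos[i][d])
                             + C2 * r2 * (gbest[d] - pos[i][d]))
                lo, hi = BOUNDS[d]
                pos[i][d] = max(lo, min(hi, pos[i][d] + vel[i][d]))
            if utility(pos[i]) > utility(pbest[i]):
                pbest[i] = pos[i][:]
            if utility(pos[i]) > utility(gbest):
                gbest = pos[i][:]

    assert abs(gbest[0] - 10.0) < 0.5 and abs(gbest[1] - 400.0) < 20.0
    ```

    QPSO replaces the velocity update with a quantum-behaved position sampling rule, which is what the paper credits for its faster convergence; the classical update above shows the shared structure.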

  10. Discharge documentation of patients discharged to subacute facilities: a three-year quality improvement process across an integrated health care system.

    Science.gov (United States)

    Gandara, Esteban; Ungar, Jonathan; Lee, Jason; Chan-Macrae, Myrna; O'Malley, Terrence; Schnipper, Jeffrey L

    2010-06-01

    Effective communication among physicians during hospital discharge is critical to patient care. Partners Healthcare (Boston) has been engaged in a multi-year process to measure and improve the quality of documentation of all patients discharged from its five acute care hospitals to subacute facilities. Partners first engaged stakeholders to develop a consensus set of 12 required data elements for all discharges to subacute facilities. A measurement process was established and later refined. Quality improvement interventions were then initiated to address measured deficiencies and included education of physicians and nurses, improvements in information technology, creation of or improvements in discharge documentation templates, training of hospitalists to serve as role models, feedback to physicians and their service chiefs regarding reviewed cases, and case manager review of documentation before discharge. To measure improvement in quality as a result of these efforts, rates of simultaneous inclusion of all 12 applicable data elements ("defect-free rate") were analyzed over time. Some 3,101 discharge documentation packets of patients discharged to subacute facilities from January 1, 2006, through September 2008 were retrospectively studied. During the 11 monitored quarters, the defect-free rate increased from 65% to 96%, a statistically significant improvement. Gains were seen in documentation of preadmission medication lists, allergies, follow-up, and warfarin information. Institution of rigorous measurement, feedback, and multidisciplinary, multimodal quality improvement processes improved the inclusion of data elements in discharge documentation required for safe hospital discharge across a large integrated health care system.
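As a concrete reading of the measure, the "defect-free rate" is simply the share of discharge packets that contain every applicable required element. A minimal sketch, in which the function name and the set-based packet representation are our own, not the study's:

```python
def defect_free_rate(packets, required):
    """Fraction of discharge packets containing every required element.
    Each packet is modeled as a set of documented element names;
    `required` is the full set of applicable required elements."""
    complete = sum(1 for p in packets if required <= p)  # required is a subset
    return complete / len(packets)
```

For example, with two packets of which only one includes both a medication list and allergies, the rate for that two-element requirement is 0.5.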

  11. How semantics can inform the geological mapping process and support intelligent queries

    Science.gov (United States)

    Lombardo, Vincenzo; Piana, Fabrizio; Mimmo, Dario

    2017-04-01

    The geologic mapping process requires the organization of data according to the general knowledge about the objects, namely the geologic units, and to the objectives of a graphic representation of such objects in a map, following an established model of geotectonic evolution. Semantics can greatly help such a process in two respects: on the one hand, the provision of a terminological base to name and classify the objects of the map; on the other, the implementation of a machine-readable encoding of the geologic knowledge base, which supports the application of reasoning mechanisms and the derivation of novel properties and relations among the objects of the map. The OntoGeonous initiative has built a terminological base of geological knowledge in a machine-readable format, following the Semantic Web tenets and the Linked Data paradigm. The major knowledge sources of the OntoGeonous initiative are the GeoScience Markup Language schemata and vocabularies (in its latest version, GeoSciML 4, 2015, published by the IUGS CGI Commission) and the INSPIRE "Data Specification on Geology" directives (an operative simplification of GeoSciML, published by the INSPIRE Thematic Working Group Geology of the European Commission). The Linked Data paradigm has been exploited by linking (without replicating, to avoid inconsistencies) already existing machine-readable encodings for some specific domains, such as the lithology domain (vocabulary Simple Lithology) and the geochronologic time scale (ontology "gts"). Finally, for the upper-level knowledge shared across several geologic domains, we have resorted to the NASA SWEET ontology. The OntoGeonous initiative has also produced a wiki that explains how the geologic knowledge has been encoded from shared geoscience vocabularies (https://www.di.unito.it/wikigeo/). In particular, the sections dedicated to axiomatization will support the construction of an appropriate database schema that can then be filled with the objects of the map.
This contribution will discuss

  12. Intelligent Machine Vision Based Modeling and Positioning System in Sand Casting Process

    Directory of Open Access Journals (Sweden)

    Shahid Ikramullah Butt

    2017-01-01

    Full Text Available Advanced vision solutions enable manufacturers in the technology sector to reconcile both competitive and regulatory concerns and address the need for immaculate fault detection and quality assurance. Modern manufacturing has shifted completely from manual inspection to machine-assisted vision inspection, and research outcomes in industrial automation have revolutionized the whole product development strategy. The purpose of this research paper is to introduce a new scheme of automation in the sand casting process by means of machine vision based technology for mold positioning. Automation has been achieved by developing a novel system in which casting molds of different sizes, having different pouring cup locations and radii, position themselves in front of the induction furnace such that the center of the pouring cup comes directly beneath the pouring point of the furnace. The coordinates of the center of the pouring cup are found by using computer vision algorithms. The output is then transferred to a microcontroller which controls the alignment mechanism on which the mold is placed at the optimum location.
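The abstract does not name the specific vision algorithm; a circle detector such as a Hough transform is a typical choice for locating a round pouring cup. As a hedged stand-in, the toy routine below, whose name and list-of-lists grayscale image format are our own, simply takes the centroid of thresholded dark pixels:

```python
def pouring_cup_center(image, threshold=128):
    """Estimate the pouring-cup center as the centroid of dark pixels in a
    grayscale image (a list of rows of 0-255 intensities). A Hough circle
    transform would be used on real camera frames; the centroid of the
    thresholded blob is the simplest illustration and returns (row, col)."""
    ys, xs = [], []
    for r, row in enumerate(image):
        for c, v in enumerate(row):
            if v < threshold:  # dark pixel assumed to belong to the cup opening
                ys.append(r)
                xs.append(c)
    if not xs:
        raise ValueError("no cup pixels found below threshold")
    return sum(ys) / len(ys), sum(xs) / len(xs)
```

The resulting pixel coordinates would then be converted to alignment-mechanism offsets before being sent to the microcontroller.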

  13. Developing a Business Intelligence Process for a Training Module in SharePoint 2010

    Science.gov (United States)

    Schmidtchen, Bryce; Solano, Wanda M.; Albasini, Colby

    2015-01-01

    Prior to this project, training information for the employees of the National Center for Critical Information Processing and Storage (NCCIPS) was stored in an array of unrelated spreadsheets and SharePoint lists that had to be manually updated. By developing a content management system on the SharePoint web application platform, this training system is now highly automated and provides a much less labor-intensive method of storing training data and scheduling training courses. The system was developed by using SharePoint Designer and laying out the data structure for the interaction between different lists of data about the employees. The automation of data population inside the lists was accomplished by implementing SharePoint workflows, which essentially lay out the logic for how data is connected and calculated between certain lists. The resulting training system is constructed from a combination of five lists of data, with a single list acting as the user-friendly interface. This interface is populated with the courses required for each employee and includes past and future information about course requirements. The employees of NCCIPS now have the ability to view, log, and schedule their training information and courses with much more ease. This system relieves a significant amount of manual input and will serve as a powerful informational resource for the employees of NCCIPS in the future.
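The workflow logic described (connecting employee, course, and completion lists to compute upcoming requirements) can be pictured outside SharePoint as a simple join between lists. All names and the interval-based refresh rule below are illustrative assumptions, not taken from the actual NCCIPS system:

```python
from datetime import date, timedelta

def training_status(completions, requirements, today):
    """Join a completions list with course requirements to flag employees
    whose refresher training is due. `completions` is a list of
    (employee, course, completion_date) tuples; `requirements` maps a
    course name to its refresh interval in days."""
    due = []
    for emp, course, done in completions:
        interval = requirements.get(course)
        if interval is not None and done + timedelta(days=interval) <= today:
            due.append((emp, course))  # refresher window has elapsed
    return due
```

A SharePoint workflow would express the same rule declaratively across lists rather than in code.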

  14. Fetal QRS extraction from abdominal recordings via model-based signal processing and intelligent signal merging

    International Nuclear Information System (INIS)

    Haghpanahi, Masoumeh; Borkholder, David A

    2014-01-01

    Noninvasive fetal ECG (fECG) monitoring has potential applications in diagnosing congenital heart diseases in a timely manner and assisting clinicians to make more appropriate decisions during labor. However, despite advances in signal processing and machine learning techniques, the analysis of fECG signals has still remained in its preliminary stages. In this work, we describe an algorithm to automatically locate QRS complexes in noninvasive fECG signals obtained from a set of four electrodes placed on the mother’s abdomen. The algorithm is based on an iterative decomposition of the maternal and fetal subspaces and filtering of the maternal ECG (mECG) components from the fECG recordings. Once the maternal components are removed, a novel merging technique is applied to merge the signals and detect the fetal QRS (fQRS) complexes. The algorithm was trained and tested on the fECG datasets provided by the PhysioNet/CinC challenge 2013. The final results indicate that the algorithm is able to detect fetal peaks for a variety of signals with different morphologies and strength levels encountered in clinical practice. (paper)
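The full method (iterative maternal/fetal subspace decomposition followed by intelligent channel merging) is beyond the scope of an abstract; the sketch below shows only a final peak-picking stage one might run on a single channel after the maternal ECG has been removed. The threshold factor, refractory period, and function name are illustrative assumptions, not the authors' algorithm:

```python
def detect_qrs(signal, fs=1000, refractory_s=0.2, k=0.5):
    """Pick QRS-like peaks as local maxima exceeding k * max amplitude,
    spaced at least one refractory period apart. `signal` is a list of
    samples from a maternal-ECG-cleaned channel; `fs` is the sample rate."""
    refractory = int(refractory_s * fs)
    thresh = k * max(signal)
    peaks = []
    for i in range(1, len(signal) - 1):
        if signal[i] >= thresh and signal[i] >= signal[i - 1] and signal[i] > signal[i + 1]:
            if not peaks or i - peaks[-1] >= refractory:
                peaks.append(i)  # sample index of a detected fQRS candidate
    return peaks
```

In the published pipeline, candidate peaks from several abdominal channels would then be merged rather than taken from one channel alone.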

  15. Use of Information Intelligent Components for the Analysis of Complex Processes of Marine Energy Systems

    Directory of Open Access Journals (Sweden)

    Chernyi Sergei

    2016-09-01

    Full Text Available Synchronous motors and their modifications (ac converter-fed motors, etc.) make it possible to develop low-noise, reliable and economically efficient electric drive systems. The construction of up-to-date systems based on synchronous machines is impossible without computing software incorporating mathematical and computational simulation. In turn, modelling of synchronous machines is as a rule based on the Park-Gorev equations, whose application requires a series of simplifying assumptions. In a number of cases these assumptions do not yield simulation results that agree with field experiments on the systems under review. Moreover, when the Park-Gorev equations are applied to systems in which synchronous machines interact with semiconductor power converters, the formation of the converters' control signals must be simulated in the frequency domain. If the states of the converter's switches are determined not only by the control pulses but also by the machine currents flowing through them, such an approach is not reasonable.
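The Park-Gorev equations mentioned above are written in rotating d-q coordinates obtained from the standard abc -> dq0 (Park) transformation, which can be sketched as follows. The function name is ours; the formula is the textbook 2/3-scaled (amplitude-invariant) form, not the authors' full machine model:

```python
import math

def park(a, b, c, theta):
    """Standard abc -> dq0 (Park) transformation of three phase quantities
    at rotor electrical angle theta. The Park-Gorev machine equations are
    written in the resulting rotating d-q coordinates."""
    k = 2.0 / 3.0
    d = k * (a * math.cos(theta)
             + b * math.cos(theta - 2 * math.pi / 3)
             + c * math.cos(theta + 2 * math.pi / 3))
    q = -k * (a * math.sin(theta)
              + b * math.sin(theta - 2 * math.pi / 3)
              + c * math.sin(theta + 2 * math.pi / 3))
    zero = (a + b + c) / 3.0
    return d, q, zero
```

A balanced three-phase set aligned with the rotor maps to a constant d component with zero q and zero-sequence components, which is what makes the rotating frame convenient for machine modelling.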

  16. Advertising and algorithms – the obvious gains and hidden losses of using software with intelligent agent capabilities in the creative process of art directors and copywriters

    OpenAIRE

    Barker, Richie

    2017-01-01

    Situated at the intersection of information technology, advertising and creativity theory, this thesis presents a detailed picture of the influence of autonomous software applications on the creative process of advertising art directors and copywriters. These applications, which are known in the field of information technology as ‘intelligent agents,’ commonly possess the ability to learn from the user and autonomously pursue their own goals. The search engine Google, which employs intelligen...

  17. Artificial intelligence in cardiology

    OpenAIRE

    Bonderman, Diana

    2017-01-01

    Summary Decision-making is complex in modern medicine and should ideally be based on available data, structured knowledge and proper interpretation in the context of an individual patient. Automated algorithms, also termed artificial intelligence, that are able to extract meaningful patterns from data collections and build decisions upon identified patterns may be useful assistants in clinical decision-making processes. In this article, artificial intelligence-based studies in clinical cardiol...

  18. Flow chemistry: intelligent processing of gas-liquid transformations using a tube-in-tube reactor.

    Science.gov (United States)

    Brzozowski, Martin; O'Brien, Matthew; Ley, Steven V; Polyzos, Anastasios

    2015-02-17

    reactive gas in a given reaction mixture. We have developed a tube-in-tube reactor device consisting of a pair of concentric capillaries in which pressurized gas permeates through an inner Teflon AF-2400 tube and reacts with dissolved substrate within a liquid phase that flows within a second gas impermeable tube. This Account examines our efforts toward the development of a simple, unified methodology for the processing of gaseous reagents in flow by way of development of a tube-in-tube reactor device and applications to key C-C, C-N, and C-O bond forming and hydrogenation reactions. We further describe the application to multistep reactions using solid-supported reagents and extend the technology to processes utilizing multiple gas reagents. A key feature of our work is the development of computer-aided imaging techniques to allow automated in-line monitoring of gas concentration and stoichiometry in real time. We anticipate that this Account will illustrate the convenience and benefits of membrane tube-in-tube reactor technology to improve and concomitantly broaden the scope of gas/liquid/solid reactions in organic synthesis.

  19. Constituents’ formal participation in the IASB’s due process: New insights into the impact of country and due process document characteristics

    Directory of Open Access Journals (Sweden)

    Michael Dobler

    2016-09-01

    Full Text Available This paper adopts a multi-issue/multi-period approach to provide new insights into key determinants of constituents’ formal participation in the due process of the International Accounting Standards Board (IASB. Based on an analysis of 8,825 comment letters submitted during the period 2006–2012, we find imbalances in the representation of constituents. Multiple regressions reveal that among various economic and cultural variables equity market capitalization and the society’s level of individualism are the key drivers of the country-level of constituents’ participation, and each variable has explanatory power over the other. The level of constituents’ participation is positively associated with the number of input opportunities offered by a due process document but unrelated to the complexity of a standard-setting project. The results are robust across various sub-samples and to additional sensitivity tests. Our findings indicate threats to the input legitimacy of the IASB and suggest avenues to stimulate constituents’ participation.

  20. Redeye: A Digital Library for Forensic Document Triage

    Energy Technology Data Exchange (ETDEWEB)

    Bogen, Paul Logasa [ORNL; McKenzie, Amber T [ORNL; Gillen, Rob [ORNL

    2013-01-01

    Forensic document analysis has become an important aspect of investigation of many different kinds of crimes, from money laundering to fraud and from cybercrime to smuggling. The current workflow for analysts includes powerful tools for moving from evidence to actionable intelligence, such as Palantir and Analyst's Notebook, and tools for finding documents among the millions of files on a hard disk, such as FTK. However, sorting through collections of seized documents to filter the noise from the actual evidence is often left to a highly labor-intensive manual effort. This paper presents the Redeye Analysis Workbench, a tool to help analysts move from manually sorting a collection of documents to performing intelligent document triage over a digital library. We discuss the tools and techniques we build upon, followed by an in-depth discussion of our tool and how it addresses two major use cases we observed analysts performing. Finally, we also include a new layout algorithm for radial graphs that is used to visualize clusters of documents in our system.
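The paper's radial layout algorithm is not described in the abstract. As a hedged illustration of the general idea only, the toy layout below places cluster centres evenly on an outer ring and fans each cluster's documents on a small circle around its centre; every name and constant here is our own:

```python
import math

def radial_layout(clusters, radius=1.0, spread=0.3):
    """Toy radial layout for clustered documents: cluster centres sit
    evenly on an outer ring of the given radius, and each cluster's
    documents fan out on a circle of radius `spread` around their centre.
    Returns {doc_id: (x, y)}."""
    pos = {}
    n = len(clusters)
    for ci, docs in enumerate(clusters):
        a = 2 * math.pi * ci / n                      # angle of cluster centre
        cx, cy = radius * math.cos(a), radius * math.sin(a)
        m = len(docs)
        for di, doc in enumerate(docs):
            b = 2 * math.pi * di / m                  # angle within the cluster
            pos[doc] = (cx + spread * math.cos(b), cy + spread * math.sin(b))
    return pos
```

A production layout would additionally resolve overlaps and size clusters by document count, which this sketch ignores.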

  1. GCE Data Toolbox for MATLAB - a software framework for automating environmental data processing, quality control and documentation

    Science.gov (United States)

    Sheldon, W.; Chamblee, J.; Cary, R. H.

    2013-12-01

    Environmental scientists are under increasing pressure from funding agencies and journal publishers to release quality-controlled data in a timely manner, as well as to produce comprehensive metadata for submitting data to long-term archives (e.g. DataONE, Dryad and BCO-DMO). At the same time, the volume of digital data that researchers collect and manage is increasing rapidly due to advances in high frequency electronic data collection from flux towers, instrumented moorings and sensor networks. However, few pre-built software tools are available to meet these data management needs, and those tools that do exist typically focus on part of the data management lifecycle or one class of data. The GCE Data Toolbox has proven to be both a generalized and effective software solution for environmental data management in the Long Term Ecological Research Network (LTER). This open source MATLAB software library, developed by the Georgia Coastal Ecosystems LTER program, integrates metadata capture, creation and management with data processing, quality control and analysis to support the entire data lifecycle. Raw data can be imported directly from common data logger formats (e.g. SeaBird, Campbell Scientific, YSI, Hobo), as well as delimited text files, MATLAB files and relational database queries. Basic metadata are derived from the data source itself (e.g. parsed from file headers) and by value inspection, and then augmented using editable metadata templates containing boilerplate documentation, attribute descriptors, code definitions and quality control rules. Data and metadata content, quality control rules and qualifier flags are then managed together in a robust data structure that supports database functionality and ensures data validity throughout processing. A growing suite of metadata-aware editing, quality control, analysis and synthesis tools are provided with the software to support managing data using graphical forms and command-line functions, as well as
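The template-driven quality control described above, in which rules attach qualifier flags to data values, can be sketched generically. The flag codes and the first-match-wins policy below are assumptions for illustration, not the toolbox's actual MATLAB rule syntax:

```python
def apply_qc_rules(values, rules):
    """Assign a qualifier flag to each value from an ordered list of
    (flag, predicate) rules, mimicking template-driven quality control:
    the first matching rule wins, and unflagged values get an empty string."""
    flags = []
    for v in values:
        for flag, pred in rules:
            if pred(v):
                flags.append(flag)
                break
        else:
            flags.append("")  # no rule fired: value passes QC
    return flags
```

For example, a water-temperature template might flag missing values as invalid ("I") and out-of-range values as questionable ("Q"); the missing-value rule must come first so range checks never see `None`.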

  2. Assessment of emotion processing skills in acquired brain injury using an ability-based test of emotional intelligence.

    Science.gov (United States)

    Hall, Sarah E; Wrench, Joanne M; Wilson, Sarah J

    2018-04-01

    Social and emotional problems are commonly reported after moderate to severe acquired brain injury (ABI) and pose a significant barrier to rehabilitation. However, progress in assessment of emotional skills has been limited by a lack of validated measurement approaches. This study represents the first formal psychometric evaluation of the use of the Mayer-Salovey-Caruso Emotional Intelligence Test (MSCEIT) V2.0 as a tool for assessing skills in perceiving, using, understanding and managing emotions following ABI. The sample consisted of 82 participants aged 18-80 years in the postacute phase of recovery (2 months-7 years) after moderate to severe ABI. Participants completed the MSCEIT V2.0 and measures of cognition and mood. Sociodemographic and clinical variables were collated from participant interview and medical files. Results revealed deficits across all MSCEIT subscales (approximately 1 SD below the normative mean). Internal consistency was adequate at overall, area, and branch levels, and MSCEIT scores correlated in expected ways with key demographic, clinical, cognitive, and mood variables. MSCEIT performance was related to injury severity and clinician-rated functioning after ABI. Confirmatory factor analysis favored a 3-factor model of EI due to statistical redundancy of the Using Emotions branch. Overall, these findings suggest that the MSCEIT V2.0 is sensitive to emotion processing deficits after moderate to severe ABI, and can yield valid and reliable scores in an ABI sample. In terms of theoretical contributions, our findings support a domain-based, 3-factor approach for characterizing emotion-related abilities in brain-injured individuals. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  3. Artificial Intelligence and Information Management

    Science.gov (United States)

    Fukumura, Teruo

    After reviewing the recent popularization of information transmission and processing technologies, which are supported by progress in electronics, the authors describe how the introduction of opto-electronics into information technology has created the possibility of applying artificial intelligence (AI) techniques to the mechanization of information management. It is pointed out that although AI deals with problems in the mental world, its basic methodology relies upon verification by evidence, so experiments on computers become indispensable for the study of AI. The authors also note that since computers operate by program, the basic intelligence with which AI is concerned is that expressed by languages. As a result, the main tool of AI is logical proof, which involves an intrinsic limitation. To answer the question "Why do you employ AI in your problem solving?", one must have ill-structured problems and intend to conduct deep studies on thinking and inference, and on memory and knowledge representation. Finally, the authors discuss the application of AI techniques to information management, covering the possibility of expert systems, the processing of queries, and the necessity of a document knowledge base.

  4. Modelling intelligent behavior

    Science.gov (United States)

    Green, H. S.; Triffet, T.

    1993-01-01

    An introductory discussion of the related concepts of intelligence and consciousness suggests criteria to be met in the modeling of intelligence and the development of intelligent materials. Methods for the modeling of actual structure and activity of the animal cortex have been found, based on present knowledge of the ionic and cellular constitution of the nervous system. These have led to the development of a realistic neural network model, which has been used to study the formation of memory and the process of learning. An account is given of experiments with simple materials which exhibit almost all properties of biological synapses and suggest the possibility of a new type of computer architecture to implement an advanced type of artificial intelligence.

  5. Standardization Documents

    Science.gov (United States)

    2011-08-01

    Specifications and Standards; Guide Specifications; CIDs; and NGSs. Federal Specifications; a commercial, national, or international standardization document developed by a private sector association, organization, or technical society. Defense Handbooks maintain lessons learned; examples include guidance for application of a technology and lists of options.

  6. Alzheimer's disease and intelligence.

    Science.gov (United States)

    Yeo, R A; Arden, R; Jung, R E

    2011-06-01

    A significant body of evidence has accumulated suggesting that individual variation in intellectual ability, whether assessed directly by intelligence tests or indirectly through proxy measures, is related to risk of developing Alzheimer's disease (AD) in later life. Important questions remain unanswered, however, such as the specificity of risk for AD vs. other forms of dementia, and the specific links between premorbid intelligence and development of the neuropathology characteristic of AD. Lower premorbid intelligence has also emerged as a risk factor for greater mortality across myriad health and mental health diagnoses. Genetic covariance contributes importantly to these associations, and pleiotropic genetic effects may impact diverse organ systems through similar processes, including inefficient design and oxidative stress. Through such processes, the genetic underpinnings of intelligence, specifically, mutation load, may also increase the risk of developing AD. We discuss how specific neurobiologic features of relatively lower premorbid intelligence, including reduced metabolic efficiency, may facilitate the development of AD neuropathology. The cognitive reserve hypothesis, the most widely accepted account of the intelligence-AD association, is reviewed in the context of this larger literature.

  7. Computer automation and artificial intelligence

    International Nuclear Information System (INIS)

    Hasnain, S.B.

    1992-01-01

    Rapid advances in computing resulting from the microchip revolution have increased its applications manifold, particularly in computer automation. Yet the level of automation available has limited its application to more complex and dynamic systems, which require intelligent computer control. In this paper a review of artificial intelligence techniques used to augment automation is presented. The sequential processing approach usually adopted in artificial intelligence has succeeded in emulating the symbolic processing part of intelligence, but the processing power required to capture the more elusive aspects of intelligence leads towards parallel processing. An overview of parallel processing with emphasis on the transputer is also provided. A fuzzy knowledge-based controller for amination drug delivery in muscle relaxant anesthesia on a transputer is described. 4 figs. (author)

  8. Intelligent Design

    DEFF Research Database (Denmark)

    Hjorth, Poul G.

    2005-01-01

    The notion that nature was designed by a divine 'intelligence' is a beautiful philosophical principle. Theories of Intelligent Design as a scientifically based theory, on the other hand, are utterly appalling.

  9. Intelligent editor/printer enhancements

    Science.gov (United States)

    Woodfill, M. C.; Pheanis, D. C.

    1983-01-01

    Microprocessor support hardware, software, and cross assemblers relating to the Motorola 6800 and 6809 processor systems were developed. Printer controller and intelligent CRT development are discussed. The user's manual, design specifications for the MC6809 version of the intelligent printer controller card, a 132-character by 64-line intelligent CRT display system using a Motorola 6809 MPU, and a one-line assembler and disassembler are provided.

  10. Computer-aided recording of automatic endoscope washing and disinfection processes as an integral part of medical documentation for quality assurance purposes

    Directory of Open Access Journals (Sweden)

    Klein Stefanie

    2010-07-01

    Full Text Available Abstract Background The reprocessing of medical endoscopes is carried out using automatic cleaning and disinfection machines. The documentation and archiving of records of properly conducted reprocessing procedures is the last and increasingly important part of the reprocessing cycle for flexible endoscopes. Methods This report describes a new computer program designed to monitor and document the automatic reprocessing of flexible endoscopes and accessories in fully automatic washer-disinfectors; it does not cover or compensate for the manual cleaning step. The program implements the national standards for monitoring hygiene in flexible endoscopes and the guidelines for the reprocessing of medical products. No FDA approval has been obtained up to now. The advantages of this newly developed computer program are, firstly, that it simplifies the documentation procedures for medical endoscopes; secondly, that it can be used universally with any washer-disinfector; and thirdly, that it is independent of the various interfaces and software products provided by the individual suppliers of washer-disinfectors. Results The computer program presented here has been tested on a total of four washer-disinfectors in more than 6000 medical examinations within 9 months. Conclusions We present for the first time an electronic documentation system for automated washer-disinfectors for medical devices, e.g. flexible endoscopes, that can be used with any washer-disinfector, documents the procedures involved in the automatic cleaning process, and can easily be connected to most hospital documentation systems.

  11. Intelligent Agent Appropriation in the Tracking Phase of an Environmental Scanning Process: A Case Study of a French Trade Union

    Science.gov (United States)

    Lafaye, Christophe

    2009-01-01

    Introduction: The rapid growth of the Internet has modified the boundaries of information acquisition (tracking) in environmental scanning. Despite the numerous advantages of this new medium, information overload is an enormous problem for Internet scanners. In order to help them, intelligent agents (i.e., autonomous, automated software agents…

  12. Exploring Possible Neural Mechanisms of Intelligence Differences Using Processing Speed and Working Memory Tasks: An fMRI Study

    Science.gov (United States)

    Waiter, Gordon D.; Deary, Ian J.; Staff, Roger T.; Murray, Alison D.; Fox, Helen C.; Starr, John M.; Whalley, Lawrence J.

    2009-01-01

    To explore the possible neural foundations of individual differences in intelligence test scores, we examined the associations between Raven's Matrices scores and two tasks that were administered in a functional magnetic resonance imaging (fMRI) setting. The two tasks were an n-back working memory (N = 37) task and inspection time (N = 47). The…

  13. Intelligent Extruder

    Energy Technology Data Exchange (ETDEWEB)

    Alper Eker; Mark Giammattia; Paul Houpt; Aditya Kumar; Oscar Montero; Minesh Shah; Norberto Silvi; Timothy Cribbs

    2003-04-24

    'Intelligent Extruder', described in this report, is a software system and associated support services for monitoring and control of compounding extruders to improve material quality and reduce waste and energy use, with minimal addition of new sensors or changes to factory-floor system components. Emphasis is on process improvements to the mixing, melting and de-volatilization of base resins, fillers, pigments, fire retardants and other additives in the 'finishing' stage of high-value-added engineering polymer materials. While GE Plastics materials were used for experimental studies throughout the program, the concepts and principles are broadly applicable to other manufacturers' materials. The project involved a joint collaboration among GE Global Research, GE Industrial Systems and Coperion Werner & Pfleiderer, USA, a major manufacturer of compounding equipment. The scope of the program included development of algorithms for monitoring process material viscosity without rheological sensors or generated waste streams, a novel scheme for rapid detection of process upsets, and an adaptive feedback control system to compensate for process upsets where at-line adjustments are feasible. The software algorithms were implemented and tested on a laboratory-scale extruder (50 lb/hr) at GE Global Research, and data from a production-scale system (2000 lb/hr) at GE Plastics was used to validate the monitoring and detection software. Although not evaluated experimentally, a new concept for extruder process monitoring through estimation of high-frequency drive torque without strain gauges is developed and demonstrated in simulation. A plan to commercialize the software system is outlined, but commercialization has not been completed.

  14. Maury Documentation

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Supporting documentation for the Maury Collection of marine observations. Includes explanations from Maury himself, as well as guides and descriptions by the U.S....

  15. Documentation Service

    International Nuclear Information System (INIS)

    Charnay, J.; Chosson, L.; Croize, M.; Ducloux, A.; Flores, S.; Jarroux, D.; Melka, J.; Morgue, D.; Mottin, C.

    1998-01-01

    This service assures the treatment and diffusion of scientific information and the management of the scientific production of the institute, as well as secretariat operation for the groups and services of the institute. The report on the documentation-library section mentions: management of the documentation holdings; searches in international databases (INIS, Current Contents, INSPEC); and the Prêt-Inter service, which allows access to documents through the DEMOCRITE network of IN2P3. Also mentioned as achievements are: the setup of a video and photo database, the Web home page of the institute's library, follow-up of digitizing the document holdings by integrating CD-ROMs and diskettes, electronic archiving of the scientific production, etc.

  16. Standard CMMI (Registered Trademark) Appraisal Method for Process Improvement (SCAMPI (Service Mark)) A, Version 1.3: Method Definition Document

    Science.gov (United States)

    2011-03-01

    Assurance Plans • Training Plans • Measurement Plans • Estimating records • Release planning • Workflow planning • Kanban boards • Service... • Acquisition Strategy Documents • Supplier Evaluation Criteria • Requests for Proposal • Specific... • (Preliminary Design Reviews, deliveries) • QA Audit records/reports • Measurement reports/repository • Kanban board • Continuous/Cumulative Flow

  17. Computerising documentation

    International Nuclear Information System (INIS)

    Anon.

    1992-01-01

    The nuclear power generation industry is faced with public concern and government pressures over safety, efficiency and risk. Operators throughout the industry are addressing these issues with the aid of a new technology - technical document management systems (TDMS). Used for strategic and tactical advantage, the systems enable users to scan, archive, retrieve, store, edit, distribute worldwide and manage the huge volume of documentation (paper drawings, CAD data and film-based information) generated in building, maintaining and ensuring safety in the UK's power plants. The power generation industry has recognized that the management and modification of operation-critical information is vital to the safety and efficiency of its power plants. Regulatory pressure from the Nuclear Installations Inspectorate (NII) to operate within strict safety margins or lose Site Licences has prompted the need for accurate, up-to-date documentation. A document capture, management and retrieval system provides a powerful, cost-effective solution, giving rapid access to documentation in a tightly controlled environment. The computerisation of documents and plans is discussed in this article. (Author)

  18. Crowd-Sourced Intelligence Agency: Prototyping counterveillance

    OpenAIRE

    Jennifer Gradecki; Derek Curry

    2017-01-01

    This paper discusses how an interactive artwork, the Crowd-Sourced Intelligence Agency (CSIA), can contribute to discussions of Big Data intelligence analytics. The CSIA is a publicly accessible Open Source Intelligence (OSINT) system that was constructed using information gathered from technical manuals, research reports, academic papers, leaked documents, and Freedom of Information Act files. Using a visceral heuristic, the CSIA demonstrates how the statistical correlations made by automate...

  19. Intelligent Multi-Media Integrated Interface Project

    Science.gov (United States)

    1990-06-01

    been devoted to the application of artificial intelligence technology to the development of human-computer interface technology that integrates speech...RADC-TR-90-128 Final Technical Report June 1990 AD-A225 973 INTELLIGENT MULTI-MEDIA INTEGRATED INTERFACE PROJECT Calspan-University of Buffalo...contractual obligations or notices on a specific document require that it be returned. INTELLIGENT MULTI-MEDIA INTEGRATED INTERFACE PROJECT J. G. Neal J. M

  20. Intelligent playgrounds

    DEFF Research Database (Denmark)

    Larsen, Lasse Juel

    2009-01-01

    This paper examines play, gaming and learning in regard to intelligent playware developed for outdoor use. The key question is how these novel artefacts influence the concepts of play, gaming and learning. Up until now play and game have been understood as different activities. This paper...... examines whether the sharp differentiation between the two can be upheld in regard to intelligent playware for outdoor use. Play and game activities will be analysed and viewed in conjunction with learning contexts. This paper will stipulate that intelligent playware facilitates rapid shifts in contexts...

  1. Artificial intelligence

    CERN Document Server

    Ennals, J R

    1987-01-01

    Artificial Intelligence: State of the Art Report is a two-part report consisting of the invited papers and the analysis. The editor first gives an introduction to the invited papers before presenting each paper and the analysis, and then concludes with the list of references related to the study. The invited papers explore the various aspects of artificial intelligence. The analysis part assesses the major advances in artificial intelligence and provides a balanced analysis of the state of the art in this field. The Bibliography compiles the most important published material on the subject of

  2. Artificial Intelligence

    CERN Document Server

    Warwick, Kevin

    2011-01-01

    if AI is outside your field, or you know something of the subject and would like to know more then Artificial Intelligence: The Basics is a brilliant primer.' - Nick Smith, Engineering and Technology Magazine November 2011 Artificial Intelligence: The Basics is a concise and cutting-edge introduction to the fast moving world of AI. The author Kevin Warwick, a pioneer in the field, examines issues of what it means to be man or machine and looks at advances in robotics which have blurred the boundaries. Topics covered include: how intelligence can be defined whether machines can 'think' sensory

  3. Automated Test Requirement Document Generation

    Science.gov (United States)

    1987-11-01

    DIAGNOSTICS BASED ON THE PRINCIPLES OF ARTIFICIAL INTELLIGENCE ", 1984 International Test Conference, 01Oct84, (A3, 3, Cs D3, E2, G2, H2, 13, J6, K) 425...j0O GLOSSARY OF ACRONYMS 0 ABBREVIATION DEFINITION AFSATCOM Air Force Satellite Communication AI Artificial Intelligence ASIC Application Specific...In-Test Equipment (BITE) and AI (Artificial Intelligence) - Expert Systems - need to be fully applied before a completely automated process can be

  4. Activities of Intelligence Services as a Synonymous of Fear and Intimidation

    Directory of Open Access Journals (Sweden)

    MA. Fisnik Sadiku

    2015-12-01

    Full Text Available Intelligence services are an important factor of national security. Their main role is to collect, process, analyze, and disseminate information on threats to the state and its population. Because of their “dark” activity, intelligence services are, for many ordinary citizens, synonymous with violence, fear and intimidation. This is most evident in the Republic of Kosovo, due to the murderous activities of the Serbian secret service in the past. Therefore, we will treat the work of intelligence services under democratic conditions, so that the reader can understand what is legitimate and legal in the activities of these services. In different countries of the world, security challenges continue to evolve and progress every day, and to meet these challenges, the state needs new ways of coordinating and developing the capability to shape the national security environment. However, the expansion of intelligence activities in many countries has raised debates about legal and ethical issues regarding those activities. Therefore, this paper will include a clear explanation of the term, meaning, process, transparency and secrecy, and the role that intelligence services have in analyzing potential threats to national security. The study is based on a wide range of print and electronic literature, including academic and scientific literature, and other documents of various intelligence agencies of developed countries.

  5. Finding competitive intelligence on Internet start-up companies: a study of secondary resource use and information-seeking processes

    Directory of Open Access Journals (Sweden)

    2001-01-01

    Full Text Available The paper reports findings from a study of CI activities involving Internet start-up companies in the telecommunications industry. The CI gathering was conducted by graduate students in library and information science in the context of a class project for a real business client, itself a small Internet start-up company. The primary objective of the study was to provide empirical insights into the applicability of specific types of secondary information resources to finding competitive intelligence information on small Internet start-up companies. An additional objective was to identify the characteristics of research strategies applied in the collection of CI on Internet start-ups from the perspective of current conceptual frameworks of information-seeking behaviour presented in the library and information science literature. This study revealed some interesting findings regarding the types of secondary information resources that can be used to find competitive intelligence on small, Internet start-up companies. The study also provided insight into the characteristics of the overall information-seeking strategies that are applied in this type of competitive intelligence research.

  6. Intelligent Advertising

    OpenAIRE

    Díaz Pinedo, Edilfredo Eliot

    2012-01-01

    Intelligent Advertisement designs and implements an advertising system for mobile devices in a shopping centre, where customers passively receive advertising on their devices while they are inside.

  7. FROM DOCUMENTATION IMAGES TO RESTAURATION SUPPORT TOOLS: A PATH FOLLOWING THE NEPTUNE FOUNTAIN IN BOLOGNA DESIGN PROCESS

    Directory of Open Access Journals (Sweden)

    F. I. Apollonio

    2017-05-01

    Full Text Available The sixteenth-century Fountain of Neptune is one of Bologna's most renowned landmarks. During the recent restoration of the monumental sculpture group, consisting of precious marbles and highly refined bronzes with water jets, a photographic campaign was carried out exclusively to document the current state of preservation of the complex. Nevertheless, the high-quality imagery was put to a further use, namely to create a 3D digital model accurate in shape and color by means of automated photogrammetric techniques and a robust customized pipeline. This 3D model was used as a basic tool to support the many different activities of the restoration site. The paper describes the 3D model construction technique used and the most important applications in which it served as a support tool for restoration: (i) reliable documentation of the actual state; (ii) surface cleaning analysis; (iii) new water system and jets; (iv) new lighting design simulation; (v) support for preliminary analysis and design studies related to hardly accessible areas; (vi) structural analysis; (vii) a base for filling gaps or missing elements through 3D printing; (viii) high-quality visualization and rendering; and (ix) support for data modelling and semantic-based diagrams.

  8. BUSINESS INTELLIGENCE

    OpenAIRE

    Bogdan Mohor Dumitrita

    2011-01-01

    The purpose of this work is to present business intelligence systems. These systems can be extremely complex and important in modern market competition. Their effectiveness is also reflected in their price, so their financial potential must be explored before investment. Such systems have a 20-year history, and during that time many tools have been developed, though few remain in use. A business intelligence system consists of three main areas: data warehouses, ETL tools and tools f...

  9. Applications of artificial intelligence to space station: General purpose intelligent sensor interface

    Science.gov (United States)

    Mckee, James W.

    1988-01-01

    This final report describes the accomplishments of the General Purpose Intelligent Sensor Interface task of the Applications of Artificial Intelligence to Space Station grant for the period from October 1, 1987 through September 30, 1988. Portions of the First Biannual Report not revised will not be included but only referenced. The goal is to develop an intelligent sensor system that will simplify the design and development of expert systems using sensors of the physical phenomena as a source of data. This research will concentrate on the integration of image processing sensors and voice processing sensors with a computer designed for expert system development. The result of this research will be the design and documentation of a system in which the user will not need to be an expert in such areas as image processing algorithms, local area networks, image processor hardware selection or interfacing, television camera selection, voice recognition hardware selection, or analog signal processing. The user will be able to access data from video or voice sensors through standard LISP statements without any need to know about the sensor hardware or software.
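
    The interface goal described above (accessing image and voice data through ordinary program statements, with no knowledge of the underlying hardware) is essentially a registry facade. The sketch below illustrates the idea in Python rather than the LISP environment the report describes; the class, sensor names and stub drivers are all invented for illustration.

```python
# Illustrative sketch (not the grant's actual system): a registry facade
# that hides sensor hardware behind plain function calls, analogous to
# exposing image and voice data through standard LISP statements.
class SensorRegistry:
    def __init__(self):
        self._drivers = {}

    def register(self, name, read_fn):
        """Attach a driver; read_fn encapsulates all hardware detail
        (camera interfacing, signal processing, recognition, ...)."""
        self._drivers[name] = read_fn

    def read(self, name):
        """Uniform access: the caller never touches hardware APIs."""
        return self._drivers[name]()

registry = SensorRegistry()
# Stub drivers stand in for real image/voice processing hardware.
registry.register("camera", lambda: [[0, 255], [255, 0]])
registry.register("microphone", lambda: "recognized: 'open valve'")

frame = registry.read("camera")
utterance = registry.read("microphone")
```

    The design choice mirrors the report's goal: expert-system developers call `read` and never select cameras, image processors or recognition hardware themselves.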

  10. Network control stations in the smart grid. Process and information knots for business intelligence applications; Netzleitstellen im Smart Grid. Prozess- und Informationsknoten fuer Business Intelligence Applikationen

    Energy Technology Data Exchange (ETDEWEB)

    Kautsch, Stephan; Kroll, Meinhard [ABB AG, Mannheim (Germany); Schoellhorn, Daniel [EnBW Regional AG, Stuttgart (Germany)

    2012-07-01

    The degree of automation in the distribution grid will increase, making more extensive monitoring possible. Smart metering in the local network station replaces drag pointers, allowing load flows to be determined precisely and valuable data to be collected about how resources, for example the transformers in secondary substations, are actually utilized. The amount of available information is increasing steadily, not least because of the ongoing rollout of smart meters, which also provide valuable information for the operation of distribution networks. This ''flood'' of data, processed, filtered and analyzed by the system, must be prepared for the user in order to be meaningful, but it can also be used to support and optimize many business processes. Although these tasks are usually not yet allocated within the grid operator's organization, it makes sense to place them close to the network control centers, as they pose new challenges but also offer opportunities. (orig.)

  11. Artificial Intelligence--Applications in Education.

    Science.gov (United States)

    Poirot, James L.; Norris, Cathleen A.

    1987-01-01

    This first in a projected series of five articles discusses artificial intelligence and its impact on education. Highlights include the history of artificial intelligence and the impact of microcomputers; learning processes; human factors and interfaces; computer assisted instruction and intelligent tutoring systems; logic programing; and expert…

  12. Business Intelligence Solutions for Gaining Competitive Advantage

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available Business Intelligence is the process of increasing a company's competitive advantage through intelligent use of available data in decision-making. Only a revolutionary Business Intelligence solution, such as the portal-based one proposed here, can solve the complex issues faced when evaluating decision-support applications and ensure the availability of any business-critical information.

  13. Model of intelligent information searching system

    International Nuclear Information System (INIS)

    Yastrebkov, D.I.

    2004-01-01

    A brief description of the technique used to search for electronic documents in large archives, together with its drawbacks, is presented. A solution close to intelligent information searching systems is proposed. (author)

  14. Post-Revolution Constitutionalism: The Impact of Drafting Processes on the Constitutional Documents in Tunisia and Egypt

    DEFF Research Database (Denmark)

    Elsayed, Ahmed Mohamed Abdelfattah

    2014-01-01

    This paper seeks to address the constitutional paths that followed the Arab awakening in both Tunisia and Egypt. The Tunisian constitutional process, despite some tensions, was largely peaceful and consensual. On the other hand, the process in Egypt of establishing a new constitutional arrangement...... at identifying the factors that have impacted both the constitutional drafting process and the popular perception of the produced constitutions in each of Tunisia and Egypt....

  15. Assessing Cognitive Abilities: Intelligence and More

    Directory of Open Access Journals (Sweden)

    Keith E. Stanovich

    2014-02-01

    Full Text Available In modern cognitive science, rationality and intelligence are measured using different tasks and operations. Furthermore, in several contemporary dual process theories of cognition, rationality is a more encompassing construct than intelligence. Researchers need to continue to develop measures of rational thought without regard to empirical correlations with intelligence. The measurement of individual differences in rationality should not be subsumed by the intelligence concept.

  16. Data transfer based on intelligent ethernet card

    International Nuclear Information System (INIS)

    Zhu Haitao; Chinese Academy of Sciences, Beijing; Chu Yuanping; Zhao Jingwei

    2007-01-01

    Intelligent Ethernet cards are widely used in systems where network throughput is very high, such as the DAQ systems of modern high-energy physics experiments and web services. Using a commercial intelligent Ethernet card as an example, this paper introduces the architecture, principle and processing of intelligent Ethernet cards. In addition, the results of several experiments showing the differences between intelligent Ethernet cards and general ones are presented. (authors)
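
    The difference between the two card types can be illustrated with a toy cost model: an "intelligent" card offloads per-packet protocol work (here, checksumming) to the card, so the host CPU does less work per packet. This is an assumption-laden sketch for illustration, not a reconstruction of the paper's experiments; the work-unit accounting is invented.

```python
# Toy model (illustrative assumption, not this paper's measurements):
# compare host-side work with and without checksum offload to the NIC.
def internet_checksum(data: bytes) -> int:
    """RFC 1071-style 16-bit ones'-complement checksum."""
    if len(data) % 2:
        data += b"\x00"
    total = sum(int.from_bytes(data[i:i + 2], "big")
                for i in range(0, len(data), 2))
    while total >> 16:                      # fold carries back into 16 bits
        total = (total & 0xFFFF) + (total >> 16)
    return ~total & 0xFFFF

def host_work_units(packets, offload_checksum):
    """Count host-side work: descriptor handling always; checksumming
    only when the card does not offload it."""
    work = 0
    for pkt in packets:
        work += 1                           # copy / descriptor handling
        if not offload_checksum:
            internet_checksum(pkt)          # burn host CPU cycles
            work += 1
    return work

packets = [bytes(64)] * 1000
smart = host_work_units(packets, offload_checksum=True)    # intelligent card
plain = host_work_units(packets, offload_checksum=False)   # general card
```

    Under this model the intelligent card halves host work per packet, which is the qualitative effect such cards are designed to deliver at high throughput.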

  17. Documenting the conversion from traditional to Studio Physics formats at the Colorado School of Mines: Process and early results

    Science.gov (United States)

    Kohl, Patrick B.; Kuo, H. Vincent; Ruskell, Todd G.

    2008-10-01

    The Colorado School of Mines (CSM) has taught its first-semester introductory physics course using a hybrid lecture/Studio Physics format for several years. Over the past year we have converted the second semester of our calculus-based introductory physics course (Physics II) to a Studio Physics format, starting from a traditional lecture-based format. In this paper, we document the early stages of this conversion in order to better understand which features succeed and which do not, and in order to develop a model for switching to Studio that keeps the time and resource investment manageable. We describe the recent history of the Physics II course and of Studio at Mines, discuss the PER-based improvements that we are implementing, and characterize our progress via several metrics, including pre/post Conceptual Survey of Electricity and Magnetism (CSEM) scores, Colorado Learning About Science Survey scores (CLASS), solicited student comments, failure rates, and exam scores.
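
    Pre/post concept-inventory comparisons like the CSEM scores mentioned above are conventionally summarized with the Hake normalized gain. The sketch below shows the computation; the score values are invented placeholders, not data from this study.

```python
# Hake normalized gain for pre/post concept inventories (e.g. CSEM).
# The example scores are hypothetical, not this paper's results.
def normalized_gain(pre_pct, post_pct):
    """<g> = (post - pre) / (100 - pre): the fraction of the possible
    improvement that students actually achieved."""
    return (post_pct - pre_pct) / (100.0 - pre_pct)

g = normalized_gain(pre_pct=30.0, post_pct=58.0)  # 0.4, a "medium gain"
```

    Normalized gain is preferred over a raw score difference because it corrects for how much room for improvement each class had at pretest.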

  18. Dynamic models of staged gasification processes. Documentation of gasification simulator; Dynamiske modeller a f trinopdelte forgasningsprocesser. Dokumentation til forgasser simulator

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2005-02-15

    In connection with the ERP project 'Dynamic modelling of staged gasification processes' a gasification simulator has been constructed. The simulator consists of: a mathematical model of the gasification process developed at Technical University of Denmark, a user interface programme, IGSS, and a communication interface between the two programmes. (BA)

  19. 8 CFR 286.9 - Fee for processing applications and issuing documentation at land border Ports-of-Entry.

    Science.gov (United States)

    2010-01-01

    ... SECURITY IMMIGRATION REGULATIONS IMMIGRATION USER FEE § 286.9 Fee for processing applications and issuing... Card, issued by the DOS, or a passport and combined B-1/B-2 visa and non-biometric BCC (or similar...

  20. Post-Revolution Constitutionalism: The Impact of Drafting Processes on the Constitutional Documents in Tunisia and Egypt

    OpenAIRE

    El-Sayed, Ahmed

    2014-01-01

    This paper seeks to address the constitutional paths that followed the Arab awakening in both Tunisia and Egypt. The Tunisian constitutional process, despite some tensions, was largely peaceful and consensual. On the other hand, the process in Egypt of establishing a new constitutional arrangement had been tumultuous with repercussions that are likely to linger on for a protracted period of time. Therefore, despite apparent resemblance in socio-political actors in both countries, (political I...

  1. CMS DOCUMENTATION

    CERN Multimedia

    CMS TALKS AT MAJOR MEETINGS The agenda and talks from major CMS meetings can now be electronically accessed from the iCMS Web site. The following items can be found on: http://cms.cern.ch/iCMS/ General - CMS Weeks (Collaboration Meetings), CMS Weeks Agendas The talks presented at the Plenary Sessions. LHC Symposiums Management - CB - MB - FB - FMC Agendas and minutes are accessible to CMS members through their AFS account (ZH). However some linked documents are restricted to the Board Members. FB documents are only accessible to FB members. LHCC The talks presented at the ‘CMS Meetings with LHCC Referees’ are available on request from the PM or MB Country Representative. Annual Reviews The talks presented at the 2006 Annual reviews are posted.   CMS DOCUMENTS It is considered useful to establish information on the first employment of CMS doctoral students upon completion of their theses. Therefore it is requested that Ph.D students inform the CMS Secretariat a...

  4. CMS DOCUMENTATION

    CERN Multimedia

    CMS TALKS AT MAJOR MEETINGS The agenda and talks from major CMS meetings can now be electronically accessed from the iCMS Web site. The following items can be found on: http://cms.cern.ch/iCMS/ Management- CMS Weeks (Collaboration Meetings), CMS Weeks Agendas The talks presented at the Plenary Sessions. Management - CB - MB - FB Agendas and minutes are accessible to CMS members through their AFS account (ZH). However some linked documents are restricted to the Board Members. FB documents are only accessible to FB members. LHCC The talks presented at the ‘CMS Meetings with LHCC Referees’ are available on request from the PM or MB Country Representative. Annual Reviews The talks presented at the 2007 Annual reviews are posted. CMS DOCUMENTS It is considered useful to establish information on the first employment of CMS doctoral students upon completion of their theses. Therefore it is requested that Ph.D students inform the CMS Secretariat about the nature of employment and ...

  7. [Breastfeeding and its influence into the cognitive process of Spanish school-children (6 years old), measured by the Wechsler Intelligence Scale].

    Science.gov (United States)

    Pérez Ruiz, Juan Manuel; Miranda León, María Teresa; Peinado Herreros, José María; Iribar Ibabe, María Concepción

    2013-09-01

    Some scientific evidence supports the view that better cognitive development during school age is related to breastfeeding. In this study, the potential benefit of breastfeeding duration is evaluated in relation to Verbal Comprehension, Perceptual Reasoning, Working Memory and Processing Speed. A total of 103 six-year-old children in the first year of Primary School (47 boys and 56 girls) were included from schools in urban, semi-urban and rural areas of the province of Granada (Spain). Global cognitive capability, as well as specific intelligence domains that permit a more precise and deeper analysis of cognitive processes, was evaluated with the Wechsler Intelligence Scale for Children, version IV. The results show a statistically significant association between the best values of IQ and the other four WISC-IV indexes and longer breastfeeding. There is a highly significant (p = 0.000) association between the best scores and those children who were breastfed for 6 months, which supports our hypothesis. Advice to breastfeed during at least the first six months of life should be reinforced to reduce learning difficulties.

  8. Artificial intelligence in cardiology.

    Science.gov (United States)

    Bonderman, Diana

    2017-12-01

    Decision-making is complex in modern medicine and should ideally be based on available data, structured knowledge and proper interpretation in the context of an individual patient. Automated algorithms, also termed artificial intelligence, that are able to extract meaningful patterns from data collections and build decisions upon identified patterns may be useful assistants in clinical decision-making processes. In this article, artificial intelligence-based studies in clinical cardiology are reviewed. The text also touches on ethical issues and speculates on the future roles of automated algorithms versus clinicians in cardiology and medicine in general.

  9. Intelligent environmental data warehouse

    International Nuclear Information System (INIS)

    Ekechukwu, B.

    1998-01-01

    Making quick and effective decisions in environmental management depends on multiple and complex parameters, and a data warehouse is a powerful tool for the overall management of massive environmental information. Selecting the right data from a warehouse is an important consideration for end-users. This paper proposes an intelligent environmental data warehouse system. It consists of a data warehouse that supplies environmental researchers and managers with the environmental information they need for their research studies and decisions, in the form of geometric and attribute data for the study area, together with metadata for the other sources of environmental information. In addition, the proposed intelligent search engine works according to a set of rules, which enables the system to be aware of the environmental data wanted by the end-user. The system development process passes through four stages: data preparation, warehouse development, intelligent engine development and internet platform development. (author)
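
    The rule-based search idea above can be sketched as query expansion: before matching against warehouse metadata, the engine applies if-then rules that add terms implied by the user's query. The rules, catalog entries and tags below are invented for illustration; the paper's actual rule set is not given.

```python
# Hypothetical sketch of a rule-driven "intelligent" search over
# warehouse metadata; all rules and catalog contents are invented.
RULES = [
    # (condition on the query terms, extra terms the rule contributes)
    (lambda terms: "pollution" in terms, {"air quality", "emissions"}),
    (lambda terms: "water" in terms, {"hydrology"}),
]

CATALOG = {  # document id -> metadata tags
    "doc1": {"emissions", "transport"},
    "doc2": {"hydrology", "groundwater"},
    "doc3": {"soil", "erosion"},
}

def expand(query):
    """Apply every rule whose condition holds, widening the term set."""
    terms = set(query.lower().split())
    for condition, extra in RULES:
        if condition(terms):
            terms |= extra
    return terms

def search(query):
    """Return documents whose tags overlap the expanded query."""
    terms = expand(query)
    return sorted(doc for doc, tags in CATALOG.items() if tags & terms)

hits = search("pollution sources")  # the pollution rule pulls in "emissions"
```

    The point of the rules is exactly the paper's claim: the engine "knows" that a pollution query implies emissions data, so relevant documents are found even when the user's wording never matches the metadata directly.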

  10. ARTIFICIAL INTELLIGENCE CAPABILITIES FOR INCREASING ORGANIZATIONAL-TECHNOLOGICAL RELIABILITY OF CONSTRUCTION

    Directory of Open Access Journals (Sweden)

    Ginzburg Alexander Vital'evich

    2018-02-01

    Full Text Available The technology of artificial intelligence is being actively mastered worldwide, but its capabilities in the construction industry are rarely discussed, and this issue requires additional elaboration. As a rule, the decision to invest in a particular construction project is made on the basis of an assessment of the organizational-technological reliability of the construction process. Artificial intelligence can be a convenient tool for identifying, analyzing and subsequently controlling the “pure” risks of a construction project, which would not only significantly reduce the financial and time expenditures of the investor's decision-making process but also improve the organizational-technological reliability of the construction process as a whole. Subject: an algorithm for creating artificial intelligence in the field of identification and analysis of potential risk events is presented, which will facilitate the creation of an independent analytical system for different stages of construction production: from the sketch to the working documentation and the conduct of works directly on the construction site. Research objectives: to study the possibility, methods and planning of the algorithm of works for creating artificial intelligence technology in order to improve the organizational-technological reliability of the construction process. Materials and methods: developments in the field of improving the organizational-technological reliability of construction were studied through the analysis and control of potential “pure” risks of the construction project, and work was carried out to integrate artificial intelligence technology into the area under study. Results: an algorithm for creating artificial intelligence in the field of identification of potential “pure” risks of construction projects is presented. Conclusions: the obtained results are useful

  11. Semantic Business Intelligence - a New Generation of Business Intelligence

    OpenAIRE

    Dinu AIRINEI; Dora-Anca BERTA

    2012-01-01

    Business Intelligence solutions are applications used by companies to manage, process and analyze data in order to support well-substantiated decisions. In the context of Semantic Web development, the trend is to integrate semantically annotated unstructured data, so business intelligence solutions must be redesigned in such a manner that they can analyze, process and synthesize, in addition to traditional data, data integrated with semantics of another form and structure. This invariably leads to the appearance of new BI solutio...

  12. A Case Study Documenting the Process by Which Biology Instructors Transition from Teacher-Centered to Learner-Centered Teaching

    Science.gov (United States)

    Marbach-Ad, Gili; Hunt Rietschel, Carly

    2016-01-01

    In this study, we used a case study approach to obtain an in-depth understanding of the change process of two university instructors who were involved with redesigning a biology course. Given the hesitancy of many biology instructors to adopt evidence-based, learner-centered teaching methods, there is a critical need to understand how biology instructors transition from teacher-centered (i.e., lecture-based) instruction to teaching that focuses on the students. Using the innovation-decision model for change, we explored the motivation, decision-making, and reflective processes of the two instructors through two consecutive, large-enrollment biology course offerings. Our data reveal that the change process is somewhat unpredictable, requiring patience and persistence during inevitable challenges that arise for instructors and students. For example, the change process requires instructors to adopt a teacher-facilitator role as opposed to an expert role, to cover fewer course topics in greater depth, and to give students a degree of control over their own learning. Students must adjust to taking responsibility for their own learning, working collaboratively, and relinquishing the anonymity afforded by lecture-based teaching. We suggest implications for instructors wishing to change their teaching and administrators wishing to encourage adoption of learner-centered teaching at their institutions. PMID:27856550

  13. Automation of Survey Data Processing, Documentation and Dissemination: An Application to Large-Scale Self-Reported Educational Survey.

    Science.gov (United States)

    Shim, Eunjae; Shim, Minsuk K.; Felner, Robert D.

    Automation of the survey process has proved successful in many industries, yet it is still underused in educational research. This is largely due to the facts that (1) number crunching is usually carried out using software that was developed before modern information technology existed, and (2) educational research is to a great extent trapped…

  14. Document Models

    Directory of Open Access Journals (Sweden)

    A.A. Malykh

    2017-08-01

    Full Text Available In this paper, the concept of locally simple models is considered. Locally simple models are arbitrarily complex models built from relatively simple components. Many practically important domains of discourse can be described as locally simple models, for example, the business models of enterprises and companies. Up to now, research in the automation of human reasoning has mainly concentrated on the most intellectually intensive activities, such as automated theorem proving. On the other hand, a retailer’s business model is formed from “jobs”, and each “job” can be modelled and automated more or less easily. At the same time, the whole retailer model as an integrated system is extremely complex. In this paper, we offer a variant of the mathematical definition of a locally simple model. This definition is intended for modelling a wide range of domains, so we must also take into account perceptual and psychological issues. Logic is elitist, and if we want to attract as many people as possible to our models, we need to hide this elitism behind some metaphor to which “ordinary” people are accustomed. As such a metaphor, we use the concept of a document, so our locally simple models are called document models. Document models are built in the paradigm of semantic programming. This allows us to achieve another important goal: to make the document models executable. Executable models are models that can act as practical information systems in the described domain of discourse. Thus, if our model is executable, then programming becomes redundant. The direct use of a model, instead of coding it as a program, brings important advantages, for example, a drastic cost reduction for development and maintenance. Moreover, since the model is sound and well defined, and not dissolved within programming modules, we can directly apply AI tools, in particular, machine learning. This significantly expands the possibilities for automation and

  15. Intelligent Universe

    Energy Technology Data Exchange (ETDEWEB)

    Hoyle, F

    1983-01-01

    The subject is covered in chapters, entitled: chance and the universe (synthesis of proteins; the primordial soup); the gospel according to Darwin (discussion of Darwin theory of evolution); life did not originate on earth (fossils from space; life in space); the interstellar connection (living dust between the stars; bacteria in space falling to the earth; interplanetary dust); evolution by cosmic control (microorganisms; genetics); why aren't the others here (a cosmic origin of life); after the big bang (big bang and steady state); the information rich universe; what is intelligence up to; the intelligent universe.

  16. Artificial intelligence

    International Nuclear Information System (INIS)

    Perret-Galix, D.

    1992-01-01

    A vivid example of the growing need for frontier physics experiments to make use of frontier technology is the field of artificial intelligence and related themes. This was reflected in the second international workshop on 'Software Engineering, Artificial Intelligence and Expert Systems in High Energy and Nuclear Physics', which took place from 13 to 18 January at France Telecom's Agelonde site at La Londe des Maures, Provence. It was the second in a series, the first having been held at Lyon in 1990.

  17. Artificial Intelligence and Moral intelligence

    Directory of Open Access Journals (Sweden)

    Laura Pana

    2008-07-01

    Full Text Available We discuss the thesis that the implementation of a moral code in the behaviour of artificial intelligent systems needs a specific form of human and artificial intelligence, not just an abstract intelligence. We present intelligence as a system with an internal structure, together with the structural levels of the moral system, as well as certain characteristics of artificial intelligent agents which can/must be treated as: (1) individual entities (with a complex, specialized, autonomous or self-determined, even unpredictable conduct); (2) entities endowed with diverse or even multiple forms of intelligence, such as moral intelligence; (3) open and even free-conduct performing systems (with specific, flexible and heuristic mechanisms and procedures of decision); (4) systems which are open to education, not just to instruction; (5) entities with a “lifegraphy”, not just a “stategraphy”; (6) entities equipped not just with automatisms but with beliefs (cognitive and affective complexes); (7) entities capable even of reflection (“moral life” is a form of spiritual, not just of conscious, activity); (8) elements/members of some real (corporeal or virtual) community; (9) cultural beings: free conduct gives cultural value to the action of a “natural” or artificial being. Implementation of such characteristics does not necessarily suppose efforts to design, construct and educate machines like human beings. The human moral code is irremediably imperfect: it is a morality of preference, of accountability (not of responsibility), and a morality of non-liberty, which cannot be remedied by the invention of ethical systems, by the circulation of ideal values or by ethical (or even computing) education. But such an imperfect morality needs perfect instruments for its implementation: applications of special fields of logic; efficient psychological (theoretical and technical) attainments to endow the machine not just with intelligence, but with conscience and even spirit; comprehensive technical

  18. A Case Study Documenting the Process by Which Biology Instructors Transition from Teacher-Centered to Learner-Centered Teaching.

    Science.gov (United States)

    Marbach-Ad, Gili; Hunt Rietschel, Carly

    2016-01-01

    In this study, we used a case study approach to obtain an in-depth understanding of the change process of two university instructors who were involved with redesigning a biology course. Given the hesitancy of many biology instructors to adopt evidence-based, learner-centered teaching methods, there is a critical need to understand how biology instructors transition from teacher-centered (i.e., lecture-based) instruction to teaching that focuses on the students. Using the innovation-decision model for change, we explored the motivation, decision-making, and reflective processes of the two instructors through two consecutive, large-enrollment biology course offerings. Our data reveal that the change process is somewhat unpredictable, requiring patience and persistence during inevitable challenges that arise for instructors and students. For example, the change process requires instructors to adopt a teacher-facilitator role as opposed to an expert role, to cover fewer course topics in greater depth, and to give students a degree of control over their own learning. Students must adjust to taking responsibility for their own learning, working collaboratively, and relinquishing the anonymity afforded by lecture-based teaching. We suggest implications for instructors wishing to change their teaching and administrators wishing to encourage adoption of learner-centered teaching at their institutions. © 2016 G. Marbach-Ad and C. H. Rietschel. CBE—Life Sciences Education © 2016 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  19. Hydratools manual version 1.0, documentation for a MATLAB®-based post-processing package for the Sontek Hydra

    Science.gov (United States)

    Martini, Marinna A.; Sherwood, Chris; Horwitz, Rachel; Ramsey, Andree; Lightsom, Fran; Lacy, Jessie; Xu, Jingping

    2006-01-01

    The Sediment Transport Instrumentation Group (STG) at the U.S. Geological Survey (USGS) Woods Hole Science Center has a long-standing commitment to providing scientists with high-quality oceanographic data. To meet this commitment, STG personnel are vigilant in checking data as well as hardware for signs of instrument malfunction. STG data sets are accompanied by processing histories that detail the data-processing procedures that may have modified the natural data signal while removing noise from the data. The history also allows the data to be reprocessed in the light of new insight into instrument function and mooring conditions. This toolbox was compiled to meet these data-quality commitments for data generated by Sontek Hydra systems using both ADV and PCADP probes. In the mid-1990s, the USGS Coastal and Marine Program began frequent deployments of Sontek Hydra systems in support of projects in estuarine, coastal, and continental shelf regions nationwide. Hydra data sets are large and complex in structure, and existing processing and editing tools consisted of fragments of MATLAB code written by USGS scientists to satisfy personal research needs. This code did not meet STG quality-control criteria. This toolbox permits engineers and scientists to monitor data quality by: 1. processing data with interactive critical review;

  20. Let Documents Talk to Each Other: A Computer Model for Connection of Short Documents.

    Science.gov (United States)

    Chen, Z.

    1993-01-01

    Discusses the integration of scientific texts through the connection of documents and describes a computer model that can connect short documents. Information retrieval and artificial intelligence are discussed; a prototype system of the model is explained; and the model is compared to other computer models. (17 references) (LRW)