WorldWideScience

Sample records for intelligent qa interfaces

  1. Intelligent Multi-Media Integrated Interface Project

    Science.gov (United States)

    1990-06-01

been devoted to the application of artificial intelligence technology to the development of human-computer interface technology that integrates speech...RADC-TR-90-128 Final Technical Report June 1990 AD-A225 973 INTELLIGENT MULTI-MEDIA INTEGRATED INTERFACE PROJECT Calspan-University of Buffalo...contractual obligations or notices on a specific document require that it be returned. INTELLIGENT MULTI-MEDIA INTEGRATED INTERFACE PROJECT J. G. Neal J. M

  2. A Chatbot as a Natural Web Interface to Arabic Web QA

    Directory of Open Access Journals (Sweden)

    Bayan Abu Shawar

    2011-03-01

Full Text Available In this paper, we describe a way to access an Arabic Web Question Answering (QA) corpus using a chatbot, without the need for sophisticated natural language processing or logical inference. Any Natural Language (NL) interface to a Question Answering (QA) system is constrained to reply with the given answers, so there is no need for NL generation to recreate well-formed answers, or for deep analysis or logical inference to map user input questions onto a logical ontology; a simple (but large) set of pattern-template matching rules will suffice. In previous research, this approach worked well with English and other European languages. In this paper, we examine how the same chatbot performs on an Arabic Web QA corpus. Initial results show that 93% of answers were correct, but because of many characteristics specific to the Arabic language, rephrasing Arabic questions into other forms may lead to no answers.
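The pattern-template matching the abstract relies on can be sketched in a few lines. The rules and answer templates below are invented for illustration; a real system would compile thousands of rules directly from the QA corpus:

```python
import re

# Hypothetical rule set: (question pattern, answer template) pairs.
RULES = [
    (re.compile(r"what is (.+)\?", re.IGNORECASE),
     "Here is the stored answer about {0}."),
    (re.compile(r"who invented (.+)\?", re.IGNORECASE),
     "The corpus lists the inventor of {0}."),
]

def answer(question: str) -> str:
    """Return the first matching canned answer, or a fallback."""
    for pattern, template in RULES:
        match = pattern.fullmatch(question.strip())
        if match:
            return template.format(*match.groups())
    return "No answer found."
```

Because the templates simply echo stored answers, no NL generation or inference is involved, which is exactly the simplification the paper exploits.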

  3. Human-Centric Interfaces for Ambient Intelligence

    CERN Document Server

    Aghajan, Hamid; Delgado, Ramon Lopez-Cozar

    2009-01-01

To create truly effective human-centric ambient intelligence systems both engineering and computing methods are needed. This is the first book to bridge data processing and intelligent reasoning methods for the creation of human-centered ambient intelligence systems. Interdisciplinary in nature, the book covers topics such as multi-modal interfaces, human-computer interaction, smart environments and pervasive computing, addressing principles, paradigms, methods and applications. This book will be an ideal reference for university researchers, R&D engineers, computer engineers, and graduate students.

  4. The crustal dynamics intelligent user interface anthology

    Science.gov (United States)

    Short, Nicholas M., Jr.; Campbell, William J.; Roelofs, Larry H.; Wattawa, Scott L.

    1987-01-01

The National Space Science Data Center (NSSDC) has initiated an Intelligent Data Management (IDM) research effort which has, as one of its components, the development of an Intelligent User Interface (IUI). The intent of the IUI is to develop a friendly and intelligent user interface service based on expert systems and natural language processing technologies. The purpose of such a service is to support the large number of potential scientific and engineering users that have need of space and land-related research and technical data, but have little or no experience in query languages or understanding of the information content or architecture of the databases of interest. This document presents the design concepts, development approach and evaluation of the performance of a prototype IUI system for the Crustal Dynamics Project Database, which was developed using a microcomputer-based expert system tool (M.1), the natural language query processor THEMIS, and the graphics software system GSS. The IUI design is based on a multiple view representation of a database from both the user and database perspective, with intelligent processes to translate between the views.

  5. The desktop interface in intelligent tutoring systems

    Science.gov (United States)

    Baudendistel, Stephen; Hua, Grace

    1987-01-01

    The interface between an Intelligent Tutoring System (ITS) and the person being tutored is critical to the success of the learning process. If the interface to the ITS is confusing or non-supportive of the tutored domain, the effectiveness of the instruction will be diminished or lost entirely. Consequently, the interface to an ITS should be highly integrated with the domain to provide a robust and semantically rich learning environment. In building an ITS for ZetaLISP on a LISP Machine, a Desktop Interface was designed to support a programming learning environment. Using the bitmapped display, windows, and mouse, three desktops were designed to support self-study and tutoring of ZetaLISP. Through organization, well-defined boundaries, and domain support facilities, the desktops provide substantial flexibility and power for the student and facilitate learning ZetaLISP programming while screening the student from the complex LISP Machine environment. The student can concentrate on learning ZetaLISP programming and not on how to operate the interface or a LISP Machine.

  6. The intelligent user interface for NASA's advanced information management systems

    Science.gov (United States)

    Campbell, William J.; Short, Nicholas, Jr.; Rolofs, Larry H.; Wattawa, Scott L.

    1987-01-01

    NASA has initiated the Intelligent Data Management Project to design and develop advanced information management systems. The project's primary goal is to formulate, design and develop advanced information systems that are capable of supporting the agency's future space research and operational information management needs. The first effort of the project was the development of a prototype Intelligent User Interface to an operational scientific database, using expert systems and natural language processing technologies. An overview of Intelligent User Interface formulation and development is given.

  7. Intelligent Interfaces to Empower People with Disabilities

    Science.gov (United States)

    Betke, Margrit

    Severe motion impairments can result from non-progressive disorders, such as cerebral palsy, or degenerative neurological diseases, such as Amyotrophic Lateral Sclerosis (ALS), Multiple Sclerosis (MS), or muscular dystrophy (MD). They can be due to traumatic brain injuries, for example, due to a traffic accident, or to brainstem strokes [9, 84]. Worldwide, these disorders affect millions of individuals of all races and ethnic backgrounds [4, 75, 52]. Because disease onset of MS and ALS typically occurs in adulthood, afflicted people are usually computer literate. Intelligent interfaces can immensely improve their daily lives by allowing them to communicate and participate in the information society, for example, by browsing the web, posting messages, or emailing friends. However, people with advanced ALS, MS, or MD may reach a point when they cannot control the keyboard and mouse anymore and also cannot rely on automated voice recognition because their speech has become slurred.

  8. The Properties of Intelligent Human-Machine Interface

    Directory of Open Access Journals (Sweden)

    Alexander Alfimtsev

    2012-04-01

Full Text Available Intelligent human-machine interfaces based on multimodal interaction are developed separately in different application areas. No unified opinion exists about what properties these interfaces should have to provide intuitive and natural interaction. Having carried out an analytical survey of papers that deal with intelligent interfaces, we present a set of properties necessary for an intelligent interface between an information system and a human: absolute response, justification, training, personification, adaptiveness, collectivity, security, hidden persistence, portability, and filtering.

  9. Multiple multichannel spectra acquisition and processing system with intelligent interface

    International Nuclear Information System (INIS)

    Chen Ying; Wei Yixiang; Qu Jianshi; Zheng Futang; Xu Shengkui; Xie Yuanming; Qu Xing; Ji Weitong; Qiu Xuehua

    1986-01-01

A multiple multichannel spectra acquisition and processing system with an intelligent interface is described. Sixteen spectra measured with various lengths, channel widths, back biases and acquisition times can be identified and collected by the intelligent interface simultaneously while the connected computer is doing data processing. The execution time for the Ge(Li) gamma-ray spectrum analysis software on an IBM PC-XT is about 55 seconds.

  10. Exploring distributed user interfaces in ambient intelligent environments

    NARCIS (Netherlands)

    Dadlani Mahtani, P.M.; Peregrin Emparanza, J.; Markopoulos, P.; Gallud, J.A.; Tesoriero, R.; Penichet, V.M.R.

    2011-01-01

    In this paper we explore the use of Distributed User Interfaces (DUIs) in the field of Ambient Intelligence (AmI). We first introduce the emerging area of AmI, followed by describing three case studies where user interfaces or ambient displays are distributed and blending in the user’s environments.

  11. Gestures in an Intelligent User Interface

    Science.gov (United States)

    Fikkert, Wim; van der Vet, Paul; Nijholt, Anton

In this chapter we investigate which hand gestures are intuitive for controlling a large display multimedia interface from a user's perspective. Over the course of two sequential user evaluations, we defined a simple gesture set that allows users to fully and intuitively control a large display multimedia interface. First, we evaluated numerous gesture possibilities for a set of commands that can be issued to the interface. These gestures were selected from literature, science fiction movies, and a previous exploratory study. Second, we implemented a working prototype with which the users could interact with both hands, using the preferred hand gestures, with 2D and 3D visualizations of biochemical structures. We found that the gestures are influenced to a significant extent by the fast-paced developments in multimedia interfaces such as the Apple iPhone and the Nintendo Wii, and to no lesser degree by decades of experience with more traditional WIMP-based interfaces.

  12. APPLYING ARTIFICIAL INTELLIGENCE TECHNIQUES TO HUMAN-COMPUTER INTERFACES

    DEFF Research Database (Denmark)

    Sonnenwald, Diane H.

    1988-01-01

    A description is given of UIMS (User Interface Management System), a system using a variety of artificial intelligence techniques to build knowledge-based user interfaces combining functionality and information from a variety of computer systems that maintain, test, and configure customer telephone...... and data networks. Three artificial intelligence (AI) techniques used in UIMS are discussed, namely, frame representation, object-oriented programming languages, and rule-based systems. The UIMS architecture is presented, and the structure of the UIMS is explained in terms of the AI techniques....

  13. A Visualized Message Interface (VMI) for intelligent messaging services

    International Nuclear Information System (INIS)

    Endo, T.; Kasahara, H.; Nakagawa, T.

    1984-01-01

In CCITT, Message Handling Systems (MHS) have been studied from the viewpoint of communications protocol standardization. In addition to MHS services, Message Processing (MP) services, such as image processing, filing and retrieving services, will come into increasing demand in the office automation field. These messaging services, including MHS services, can be thought of as Intelligent Messaging (IM) services. IM services include many basic services, optional user facilities and service parameters. Accordingly, it is necessary to deal with these parameters and MP procedures in as systematic and user-friendly a manner as possible. As one step towards realizing a user-friendly IM services interface, the characteristics of IM service parameters are studied and a Visualized Message Interface (VMI), which resembles a conventional letter exchange format, is presented. The concept of VMI formation is discussed using the generic document structure concept, as well as a Screen Interface and Protocol Interface conversion package

  14. Application of MCU to intelligent interface of high precision magnet power supply

    International Nuclear Information System (INIS)

    Xu Ruinian; Li Deming

    2004-01-01

The application of a high-capability MCU in an intelligent interface is introduced in this paper. A prototype intelligent interface for a high-precision large magnet power supply was developed successfully. The intelligent interface is composed of two parts, an operation panel and a main board, each of which adopts a PIC16F877 MCU. The interface has many advantages, such as small size, low cost and good interference immunity. (authors)

  15. Applications of artificial intelligence to space station: General purpose intelligent sensor interface

    Science.gov (United States)

    Mckee, James W.

    1988-01-01

    This final report describes the accomplishments of the General Purpose Intelligent Sensor Interface task of the Applications of Artificial Intelligence to Space Station grant for the period from October 1, 1987 through September 30, 1988. Portions of the First Biannual Report not revised will not be included but only referenced. The goal is to develop an intelligent sensor system that will simplify the design and development of expert systems using sensors of the physical phenomena as a source of data. This research will concentrate on the integration of image processing sensors and voice processing sensors with a computer designed for expert system development. The result of this research will be the design and documentation of a system in which the user will not need to be an expert in such areas as image processing algorithms, local area networks, image processor hardware selection or interfacing, television camera selection, voice recognition hardware selection, or analog signal processing. The user will be able to access data from video or voice sensors through standard LISP statements without any need to know about the sensor hardware or software.

  16. Automatic figure ranking and user interfacing for intelligent figure search.

    Directory of Open Access Journals (Sweden)

    Hong Yu

    2010-10-01

Full Text Available Figures are important experimental results that are typically reported in full-text bioscience articles. Bioscience researchers need to access figures to validate research facts and to formulate or test novel research hypotheses. On the other hand, the sheer volume of bioscience literature has made it difficult to access figures. Therefore, we are developing an intelligent figure search engine (http://figuresearch.askhermes.org). Existing research in figure search treats each figure equally, but we introduce a novel concept of "figure ranking": figures appearing in a full-text biomedical article can be ranked by their contribution to knowledge discovery. We empirically validated the hypothesis of figure ranking with over 100 bioscience researchers, and then developed unsupervised natural language processing (NLP) approaches to automatically rank figures. Evaluating on a collection of 202 full-text articles in which the authors had ranked the figures by importance, our best system achieved a weighted error rate of 0.2, which is significantly better than several other baseline systems we explored. We further explored a user interfacing application in which we built novel user interfaces (UIs) incorporating figure ranking, allowing bioscience researchers to efficiently access important figures. Our evaluation results show that 92% of the bioscience researchers chose as their top two preferences the user interfaces in which the most important figures are enlarged. With our automatic figure ranking NLP system, bioscience researchers preferred the UIs in which the most important figures were predicted by our NLP system over the UIs in which the most important figures were randomly assigned.
In addition, our results show that there was no statistical difference in bioscience researchers' preference between UIs generated by automatic figure ranking and UIs generated from human ranking annotation. The evaluation results conclude that automatic figure ranking and user
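The abstract does not define its weighted error rate, but one plausible reading is a disagreement count weighted toward the top of the gold ranking, so that misplacing the most important figure costs more than misplacing the least important one. The weighting below is a guess for illustration only:

```python
def weighted_error_rate(gold, predicted):
    """Weighted fraction of positions where two rankings of the same
    figures disagree; position i of the gold ranking carries weight
    n - i, so errors at the top cost most (hypothetical weighting)."""
    n = len(gold)
    weights = [n - i for i in range(n)]  # rank 1 -> weight n, last -> 1
    mismatched = sum(w for w, g, p in zip(weights, gold, predicted) if g != p)
    return mismatched / sum(weights)
```

Under this scheme, an identical ranking scores 0.0 and a fully scrambled one approaches 1.0, which is consistent with reporting "a weighted error rate of 0.2" as a good result.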

  17. Ecological Design of Cooperative Human-Machine Interfaces for Safety of Intelligent Transport Systems

    Directory of Open Access Journals (Sweden)

    Orekhov Aleksandr

    2016-01-01

Full Text Available The paper describes research results in the domain of cooperative intelligent transport systems. The requirements for human-machine interfaces, considering the safety of intelligent transport systems (ITS), are analyzed. Profiling of the requirements for a cooperative human-machine interface (CHMI) for such systems, including requirements for usability and safety, is based on a set of standards for ITSs. An approach and a design technique for cooperative human-machine interfaces for ITSs are suggested. The architecture of a cloud-based CHMI for intelligent transport systems has been developed. The prototype software system CHMI4ITS is described.

  18. i-Car: An Intelligent and Interactive Interface for Driver Assistance ...

    African Journals Online (AJOL)

    i-Car: An Intelligent and Interactive Interface for Driver Assistance System. ... techniques with pattern recognition, feature extraction, machine learning, object recognition, ... The system uses eye closure based decision algorithm to detect driver ...

  19. Intelligent Human Machine Interface Design for Advanced Product Life Cycle Management Systems

    OpenAIRE

    Ahmed, Zeeshan

    2010-01-01

Designing and implementing an intelligent and user-friendly human-machine interface for any kind of software- or hardware-oriented application is always a challenging task for designers and developers, because it is very difficult to understand the psychology of the user, the nature of the work, and the best fit to the environment. This research paper proposes an intelligent, flexible and user-friendly machine interface for Product Life Cycle Management products or PDM Syste...

  20. Humans, Intelligent Technology, and Their Interface: A Study of Brown’s Point

    Science.gov (United States)

    2017-12-01

INTELLIGENT TECHNOLOGY, AND THEIR INTERFACE: A STUDY OF BROWN’S POINT by Jackie L. J. White December 2017 Thesis Advisor: Carolyn Halladay...REPORT TYPE AND DATES COVERED Master’s thesis 4. TITLE AND SUBTITLE HUMANS, INTELLIGENT TECHNOLOGY, AND THEIR INTERFACE: A STUDY OF BROWN’S POINT...with the technology before and during the accident. I combined the findings from the accident investigation with various heuristics regarding the human

  1. QA programme documentation

    International Nuclear Information System (INIS)

    Scheibelt, L.

    1980-01-01

    The present paper deals with the following topics: The need for a documented Q.A. program; Establishing a Q.A. program; Q.A. activities; Fundamental policies; Q.A. policies; Quality objectives Q.A. manual. (orig./RW)

  2. Nuclear spectrum data acquisition intelligent interface based on MCS-51 single chip microcomputer

    International Nuclear Information System (INIS)

    Xia Songjiang; Su Qin

    1991-01-01

The intelligent interface consists of a multichannel buffer and a communication interface. It can acquire 4096-channel nuclear spectrum data. By connecting to a main computer, the data acquisition system achieves high resolution and foreground/background operating functions. The system features simple structure, reliable communication, convenient operation and high cost performance

  3. A Framework for Function Allocation in Intelligent Driver Interface Design for Comfort and Safety

    Directory of Open Access Journals (Sweden)

    Wuhong Wang

    2010-11-01

Full Text Available This paper presents a conceptual framework for ecological function allocation and an optimization matching solution for a human-machine interface with intelligent characteristics, based on a "who does what, when and how" consideration. As a highlighted example in the nature-social system, intelligent transportation systems have been playing an increasingly important role in keeping traffic safe; our research is concerned with identifying human factors problems of In-vehicle Support Systems (ISSs) and revealing the consequences of the effects of ISSs on the driver cognitive interface. The primary objective is to explore new ergonomics principles that can be used to design an intelligent driver interface for comfort and safety, addressing the impact of driver interface layouts, traffic information types, and driving behavioral factors on advanced vehicle safety design.

  4. Intelligent Electric Vehicle Integration - Domain Interfaces and Supporting Informatics

    DEFF Research Database (Denmark)

    Andersen, Peter Bach

    This thesis seeks to apply the field of informatics to the intelligent integration of electric vehicles into the power system. The main goal is to release the potential of electric vehicles in relation to a reliable, economically efficient power system based on renewables. To make intelligent EV...... and services in which the electric vehicle may be best suited to participate. The next stakeholder investigated is the distribution system operator representing the low voltage grid. The challenge is assessed by considering a number of grid impacts studies. Next, a set of grid congestion mitigation strategies...

  5. Brain-Computer Interfacing Embedded in Intelligent and Affective Systems

    NARCIS (Netherlands)

    Nijholt, Antinus

    In this talk we survey recent research views on non-traditional brain-computer interfaces (BCI). That is, interfaces that can process brain activity input, but that are designed for the ‘general population’, rather than for clinical purposes. Control of applications can be made more robust by fusing

  6. Eye gaze in intelligent user interfaces gaze-based analyses, models and applications

    CERN Document Server

    Nakano, Yukiko I; Bader, Thomas

    2013-01-01

    Remarkable progress in eye-tracking technologies opened the way to design novel attention-based intelligent user interfaces, and highlighted the importance of better understanding of eye-gaze in human-computer interaction and human-human communication. For instance, a user's focus of attention is useful in interpreting the user's intentions, their understanding of the conversation, and their attitude towards the conversation. In human face-to-face communication, eye gaze plays an important role in floor management, grounding, and engagement in conversation.Eye Gaze in Intelligent User Interfac

  7. Designing distributed user interfaces for ambient intelligent environments using models and simulations

    OpenAIRE

    LUYTEN, Kris; VAN DEN BERGH, Jan; VANDERVELPEN, Chris; CONINX, Karin

    2006-01-01

    There is a growing demand for design support to create interactive systems that are deployed in ambient intelligent environments. Unlike traditional interactive systems, the wide diversity of situations these type of user interfaces need to work in require tool support that is close to the environment of the end-user on the one hand and provide a smooth integration with the application logic on the other hand. This paper shows how the model-based user interface development methodology can be ...

  8. Intelligent Performance Analysis with a Natural Language Interface

    Science.gov (United States)

    Juuso, Esko K.

    2017-09-01

Performance improvement is taken as the primary goal in asset management. Advanced data analysis is needed to efficiently integrate condition monitoring data into operation and maintenance. Intelligent stress and condition indices have been developed for control and condition monitoring by combining generalized norms with efficient nonlinear scaling. These nonlinear scaling methodologies can also be used to handle performance measures used for management, since management-oriented indicators can be presented on the same scale as intelligent condition and stress indices. Performance indicators are responses of the process, machine or system to the stress contributions analyzed from process and condition monitoring data. Scaled values are used directly in intelligent temporal analysis to calculate fluctuations and trends. All these methodologies can be used in prognostics and fatigue prediction. The meanings of the variables are beneficial in extracting expert knowledge and representing information in natural language. The idea of dividing the problems into variable-specific meanings and directions of interactions provides various improvements for performance monitoring and decision making. The integrated temporal analysis and uncertainty processing facilitate the efficient use of domain expertise. Measurements can be monitored with generalized statistical process control (GSPC) based on the same scaling functions.
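The nonlinear scaling idea can be illustrated with a rough stand-in: a raw measurement is mapped onto a common index range, conventionally [-2, 2] in this line of work, so that stress, condition and performance indicators become directly comparable. The piecewise-linear scaler and corner values below are hypothetical; the actual method fits monotone scaling functions from generalized norms of the data:

```python
import bisect

def make_scaler(corners):
    """Build a scaling function from five corner values of the raw
    measurement, mapped to index levels -2, -1, 0, 1, 2.  Piecewise
    linear here, as a stand-in for the monotone scaling functions
    of the real methodology."""
    levels = [-2.0, -1.0, 0.0, 1.0, 2.0]

    def scale(x):
        if x <= corners[0]:
            return -2.0                       # saturate at the low end
        if x >= corners[-1]:
            return 2.0                        # saturate at the high end
        i = bisect.bisect_right(corners, x) - 1
        frac = (x - corners[i]) / (corners[i + 1] - corners[i])
        return levels[i] + frac               # interpolate between levels

    return scale
```

For example, `make_scaler([0, 10, 20, 30, 40])` maps a reading of 25 to an index of 0.5, halfway between the "normal" and "elevated" levels.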

  9. Intelligent Context-Aware and Adaptive Interface for Mobile LBS

    Directory of Open Access Journals (Sweden)

    Jiangfan Feng

    2015-01-01

Full Text Available A context-aware user interface plays an important role in many human-computer interaction tasks of location based services. Although spatial models for context-aware systems have been studied extensively, how to locate specific spatial information for users is still not well resolved, which is important in the mobile environment, where location based services users are impeded by device limitations. Better context-aware human-computer interaction models for mobile location based services are needed not just to predict performance outcomes, such as whether people will be able to find the information needed to complete a human-computer interaction task, but to understand the human processes that interact in spatial query, which will in turn inform the detailed design of better user interfaces for mobile location based services. In this study, a context-aware adaptive model for mobile location based services interfaces is proposed, which contains three major sections: purpose, adjustment, and adaptation. Based on this model, we describe the process of user operation and interface adaptation through the dynamic interaction between users and the interface. We then show how the model handles users’ demands in a complicated environment, and the experimental results suggest its feasibility.

  10. Intelligent Context-Aware and Adaptive Interface for Mobile LBS.

    Science.gov (United States)

    Feng, Jiangfan; Liu, Yanhong

    2015-01-01

A context-aware user interface plays an important role in many human-computer interaction tasks of location based services. Although spatial models for context-aware systems have been studied extensively, how to locate specific spatial information for users is still not well resolved, which is important in the mobile environment, where location based services users are impeded by device limitations. Better context-aware human-computer interaction models for mobile location based services are needed not just to predict performance outcomes, such as whether people will be able to find the information needed to complete a human-computer interaction task, but to understand the human processes that interact in spatial query, which will in turn inform the detailed design of better user interfaces for mobile location based services. In this study, a context-aware adaptive model for mobile location based services interfaces is proposed, which contains three major sections: purpose, adjustment, and adaptation. Based on this model, we describe the process of user operation and interface adaptation through the dynamic interaction between users and the interface. We then show how the model handles users' demands in a complicated environment, and the experimental results suggest its feasibility.

  11. Interface Design Concepts in the Development of ELSA, an Intelligent Electronic Library Search Assistant.

    Science.gov (United States)

    Denning, Rebecca; Smith, Philip J.

    1994-01-01

    Describes issues and advances in the design of appropriate inference engines and knowledge structures needed by commercially feasible intelligent intermediary systems for information retrieval. Issues associated with the design of interfaces to such functions are discussed in detail. Design principles for guiding implementation of these interfaces…

  12. Brain computer interfaces as intelligent sensors for enhancing human-computer interaction

    NARCIS (Netherlands)

    Poel, M.; Nijboer, F.; Broek, E.L. van den; Fairclough, S.; Nijholt, A.

    2012-01-01

BCIs are traditionally conceived as a way to control apparatus, an interface that allows you to "act on" external devices as a form of input control. We propose an alternative use of BCIs, that of monitoring users as an additional intelligent sensor to enrich traditional means of interaction. This

  13. Brain computer interfaces as intelligent sensors for enhancing human-computer interaction

    NARCIS (Netherlands)

    Poel, Mannes; Nijboer, Femke; van den Broek, Egon; Fairclough, Stephen; Morency, Louis-Philippe; Bohus, Dan; Aghajan, Hamid; Nijholt, Antinus; Cassell, Justine; Epps, Julien

    2012-01-01

    BCIs are traditionally conceived as a way to control apparatus, an interface that allows you to "act on" external devices as a form of input control. We propose an alternative use of BCIs, that of monitoring users as an additional intelligent sensor to enrich traditional means of interaction. This

  14. A Proposed Intelligent Policy-Based Interface for a Mobile eHealth Environment

    Science.gov (United States)

    Tavasoli, Amir; Archer, Norm

    Users of mobile eHealth systems are often novices, and the learning process for them may be very time consuming. In order for systems to be attractive to potential adopters, it is important that the interface should be very convenient and easy to learn. However, the community of potential users of a mobile eHealth system may be quite varied in their requirements, so the system must be able to adapt easily to suit user preferences. One way to accomplish this is to have the interface driven by intelligent policies. These policies can be refined gradually, using inputs from potential users, through intelligent agents. This paper develops a framework for policy refinement for eHealth mobile interfaces, based on dynamic learning from user interactions.

  15. How artificial intelligence can help [man-machine interface

    International Nuclear Information System (INIS)

    Elm, W.C.

    1988-01-01

    The operator is ultimately responsible for the safe and economical operation of the plant, and must evaluate the accuracy of any system-recommended action or other output. Decision support systems offer a means to improve the man-machine interface by explicitly supporting operator problem solving, rather than complicating decision-making by the need to request an explanation of the rationale behind an expert system's advice during a high stress situation. (author)

  16. System Interface for an Integrated Intelligent Safety System (ISS) for Vehicle Applications

    Directory of Open Access Journals (Sweden)

    Mahammad A. Hannan

    2010-01-01

Full Text Available This paper deals with the interface-relevant activity of a vehicle integrated intelligent safety system (ISS) that includes an airbag deployment decision system (ADDS) and a tire pressure monitoring system (TPMS). A program is developed in LabWindows/CVI, using C, for prototype implementation. The prototype is primarily concerned with the interconnection between hardware objects such as a load cell, web camera, accelerometer, TPM tire module and receiver module, DAQ card, CPU card and a touch screen. Several safety subsystems, including image processing, weight sensing and crash detection systems, are integrated, and their outputs are combined to yield intelligent decisions regarding airbag deployment. The integrated safety system also monitors tire pressure and temperature. Testing and experimentation with this ISS suggest that the system is unique, robust, intelligent, and appropriate for in-vehicle applications.
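The combination of subsystem outputs into a deployment decision can be sketched as a toy rule set. The signal names and thresholds below are hypothetical, invented for illustration rather than taken from the paper's ADDS:

```python
def airbag_decision(deceleration_g: float,
                    occupant_weight_kg: float,
                    seat_occupied: bool) -> bool:
    """Toy fusion of crash-detection and weight-sensing outputs.
    All thresholds are hypothetical placeholders."""
    if not seat_occupied:
        return False                 # empty seat: never deploy
    if occupant_weight_kg < 20.0:
        return False                 # small occupant / child seat: suppress
    return deceleration_g > 40.0     # deploy only on severe deceleration
```

A real ADDS would fuse many more inputs (camera-based occupant classification, crash pulse shape, belt status) and be validated against crash-test data, but the rule-cascade structure is the same idea.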

  17. An Object-Oriented Graphical User Interface for a Reusable Rocket Engine Intelligent Control System

    Science.gov (United States)

    Litt, Jonathan S.; Musgrave, Jeffrey L.; Guo, Ten-Huei; Paxson, Daniel E.; Wong, Edmond; Saus, Joseph R.; Merrill, Walter C.

    1994-01-01

    An intelligent control system for reusable rocket engines under development at NASA Lewis Research Center requires a graphical user interface to allow observation of the closed-loop system in operation. The simulation testbed consists of a real-time engine simulation computer, a controls computer, and several auxiliary computers for diagnostics and coordination. The system is set up so that the simulation computer could be replaced by the real engine and the change would be transparent to the control system. Because of the hard real-time requirement of the control computer, putting a graphical user interface on it was not an option. Thus, a separate computer used strictly for the graphical user interface was warranted. An object-oriented LISP-based graphical user interface has been developed on a Texas Instruments Explorer 2+ to indicate the condition of the engine to the observer through plots, animation, interactive graphics, and text.

  18. Accelerator-control-system interface for intelligent power supplies

    International Nuclear Information System (INIS)

    Cohen, S.

    1992-01-01

    A number of high-current, high-precision magnet power supplies have been installed at the proton storage ring at the Los Alamos National Laboratory Accelerator Complex. The units replace existing supplies powering large dipole magnets in the ring. These bending magnets require a high-current supply that is precise and stable. The control and interface design for these power supplies represents a departure from all others on-site. The supplies have sophisticated on-board microprocessor control and communicate with the accelerator control system via RS-422 serial communications. The units, built by Alpha Scientific Electronics, Hayward, CA, use a high-level ASCII control protocol. The low-level ''front-end'' software used by the accelerator control system has been written to accommodate these new devices; they communicate with the control system through a terminal server port connected to the site-wide ethernet backbone. Details of the software implementation for the analog and digital control of the supplies through the accelerator control system are presented.
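The record says only that the supplies speak a high-level ASCII protocol over serial lines, so the command verbs, framing characters, and checksum below are invented for illustration. This is a minimal sketch of how front-end software might frame and validate such commands, not the Alpha Scientific protocol:

```python
# Sketch of a hypothetical ASCII control protocol for an intelligent
# power supply. The SET/READ verbs, "$...*XX\r" framing, and XOR
# checksum are assumptions for illustration.

def frame(command: str) -> str:
    """Wrap a command with a start character and a simple XOR checksum."""
    checksum = 0
    for ch in command:
        checksum ^= ord(ch)
    return f"${command}*{checksum:02X}\r"

def parse(message: str) -> str:
    """Validate framing and checksum, returning the command payload."""
    if not (message.startswith("$") and message.endswith("\r")):
        raise ValueError("bad framing")
    body, _, check = message[1:-1].rpartition("*")
    checksum = 0
    for ch in body:
        checksum ^= ord(ch)
    if f"{checksum:02X}" != check:
        raise ValueError("checksum mismatch")
    return body

# A front-end task might frame a current setpoint and verify the echo.
request = frame("SET CURRENT 1250.0")
assert parse(request) == "SET CURRENT 1250.0"
```

A checksum of this kind lets the front end reject serial-line corruption before it reaches the supply's command interpreter.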

  19. Intelligent Adaptive Systems: Literature Research of Design Guidance for Intelligent Adaptive Automation and Interfaces

    Science.gov (United States)

    2007-09-01

    (Only fragments of this report's abstract were indexed.) The interface described is the audio-visual communication layer between CAMA and the crew; it selects and co-ordinates the information to be shown on a 2D map display, including weighted terrain elevation data and local threat values integrated over the complete flight path.

  20. Intelligent Interfaces for Reservoir Models (Des interfaces intelligentes pour les modèles de gisements)

    Directory of Open Access Journals (Sweden)

    Zucchini P.

    2006-11-01

    Full Text Available Numerical simulation codes often require numerous and varied input data. This paper presents an interactive program that helps build the input data file needed by a simulator of fluid behaviour in a hydrocarbon reservoir during production. An inference engine and an input-screen generator were used to write this interface. The approach offers many advantages for the quality of the resulting software: reliability, extensibility, ease of use, etc. The new prospects opened up by the combined use of expertise rules and an object-oriented language are examined. In conclusion, the extension of this approach to the development of a common interface for exploration-production software is proposed.

  1. An intelligent human-machine system based on an ecological interface design concept

    International Nuclear Information System (INIS)

    Naito, N.

    1995-01-01

    It seems both necessary and promising to develop an intelligent human-machine system, given the objectives of such systems, recent advances in cognitive engineering and artificial intelligence, and the ever-increasing importance of human-factor issues in nuclear power plant operation and maintenance. Such a system should support human operators in their knowledge-based behaviour and allow them to cope with unanticipated abnormal events, including recovery from erroneous human actions. A top-down design approach based on cognitive work analysis was adopted, and (1) an ecological interface, (2) a cognitive model-based advisor and (3) a robust automatic sequence controller were established. These functions were integrated into an experimental control room. A validation test was carried out with the participation of experienced operators and engineers. The results showed the usefulness of this system in supporting the operator's supervisory plant-control tasks.

  2. Intelligent Systems and Advanced User Interfaces for Design, Operation, and Maintenance of Command Management Systems

    Science.gov (United States)

    Mitchell, Christine M.

    1998-01-01

    Historically, Command Management Systems (CMS) have been large, expensive, spacecraft-specific software systems that were costly to build, operate, and maintain. Current and emerging hardware, software, and user interface technologies may offer an opportunity to facilitate the initial formulation and design of a spacecraft-specific CMS, as well as to develop a more generic CMS or a set of core components for CMS systems. Current MOC (mission operations center) hardware and software include Unix workstations, the C/C++ and Java programming languages, and X and Java window interface representations. This configuration provides the power and flexibility to support sophisticated systems and intelligent user interfaces that exploit state-of-the-art technologies in human-machine systems engineering, decision making, artificial intelligence, and software engineering. One of the goals of this research is to explore the extent to which technologies developed in the research laboratory can be productively applied in a complex system such as spacecraft command management. Initial examination of some of the issues in CMS design and operation suggests that applying technologies such as intelligent planning, case-based reasoning, design and analysis tools from a human-machine systems engineering point of view (e.g., operator and designer models), and human-computer interaction tools (e.g., graphics, visualization, and animation) may provide significant savings in the design, operation, and maintenance of a spacecraft-specific CMS, as well as continuity of CMS design and development across spacecraft with varying needs. The savings in this case come from software reuse at all stages of the software engineering process.

  3. Advances in software development for intelligent interfaces for alarm and emergency management consoles

    International Nuclear Information System (INIS)

    Moseley, M.R.; Olson, C.E.

    1986-01-01

    Recent advances in technology allow features like voice synthesis, voice and speech recognition, image understanding, and intelligent data base management to be incorporated in computer driven alarm and emergency management information systems. New software development environments make it possible to do rapid prototyping of custom applications. Three examples using these technologies are discussed. (1) Maximum use is made of high-speed graphics and voice synthesis to implement a state-of-the-art alarm processing and display system with features that make the operator-machine interface efficient and accurate. Although very functional, this system is not portable or flexible; the software would have to be substantially rewritten for other applications. (2) An application generator which has the capability of ''building'' a specific alarm processing and display application in a matter of a few hours, using the site definition developed in the security planning phase to produce the custom application. This package is based on a standardized choice of hardware, within which it is capable of building a system to order, automatically constructing graphics, data tables, alarm prioritization rules, and interfaces to peripherals. (3) A software tool, the User Interface Management System (UIMS), is described which permits rapid prototyping of human-machine interfaces for a variety of applications including emergency management, alarm display and process information display. The object-oriented software of the UIMS achieves rapid prototyping of a new interface by standardizing to a class library of software objects instead of hardware objects

  4. Advances in software development for intelligent interfaces for alarm and emergency management consoles

    International Nuclear Information System (INIS)

    Moseley, M.R.; Olson, C.E.

    1986-01-01

    Recent advances in technology allow features like voice synthesis, voice and speech recognition, image understanding, and intelligent data base management to be incorporated in computer driven alarm and emergency management information systems. New software development environments make it possible to do rapid prototyping of custom applications. Three examples using these technologies are discussed. (1) Maximum use is made of high-speed graphics and voice synthesis to implement a state-of-the-art alarm processing and display system with features that make the operator-machine interface efficient and accurate. (2) An application generator has the capability of ''building'' a specific alarm processing and display application in a matter of a few hours, using the site definition developed in the security planning phase to produce the custom application. (3) A software tool is described which permits rapid prototyping of human-machine interfaces for a variety of applications, including emergency management, alarm display and process information display.

  5. Integration of an intelligent systems behavior simulator and a scalable soldier-machine interface

    Science.gov (United States)

    Johnson, Tony; Manteuffel, Chris; Brewster, Benjamin; Tierney, Terry

    2007-04-01

    As the Army's Future Combat Systems (FCS) introduce emerging technologies and new force structures to the battlefield, soldiers will increasingly face new challenges in workload management. The next generation warfighter will be responsible for effectively managing robotic assets in addition to performing other missions. Studies of future battlefield operational scenarios involving the use of automation, including the specification of existing and proposed technologies, will provide significant insight into potential problem areas regarding soldier workload. The US Army Tank Automotive Research, Development, and Engineering Center (TARDEC) is currently executing an Army technology objective program to analyze and evaluate the effect of automated technologies and their associated control devices with respect to soldier workload. The Human-Robotic Interface (HRI) Intelligent Systems Behavior Simulator (ISBS) is a human performance measurement simulation system that allows modelers to develop constructive simulations of military scenarios with various deployments of interface technologies in order to evaluate operator effectiveness. One such interface is TARDEC's Scalable Soldier-Machine Interface (SMI). The scalable SMI provides a configurable machine interface application that is capable of adapting to several hardware platforms by recognizing the physical space limitations of the display device. This paper describes the integration of the ISBS and Scalable SMI applications, which will ultimately benefit both systems. The ISBS will be able to use the Scalable SMI to visualize the behaviors of virtual soldiers performing HRI tasks, such as route planning, and the scalable SMI will benefit from stimuli provided by the ISBS simulation environment. The paper describes the background of each system and details of the system integration approach.

  6. Development of intelligent interface for simulation execution by module-based simulation system

    International Nuclear Information System (INIS)

    Yoshikawa, Hidekazu; Mizutani, Naoki; Shimoda, Hiroshi; Wakabayashi, Jiro

    1988-01-01

    An intelligent user support for the two phases of simulation execution was newly developed for the Module-based Simulation System (MSS). The MSS has been developed as a flexible simulation environment to improve software productivity in complex, large-scale dynamic simulation of nuclear power plants. AI programming in Smalltalk-80 was applied to implement two user-interface programs for (i) semantic diagnosis of the simulation program generated automatically by MSS, and (ii) a consultation system with which the user can set up the consistent numerical input data files necessary for executing an MSS-generated program. Frame theory was utilized in these interface programs to represent the four knowledge bases, which cover (i) usage information on the module library in MSS and MSS-generated programs, and (ii) expert knowledge on nuclear power plant analysis, such as material properties and reactor system configuration. The capabilities of the interface programs were confirmed by example exercises on LMFBR reactor dynamics calculations, and it was demonstrated that the knowledge-based systemization was effective in improving the software work environment. (author)
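Frame-based knowledge representation of the kind the record describes can be sketched compactly. The Python below is an illustrative translation (the original interfaces were written in Smalltalk-80), and the slot names and values are assumptions, not entries from the MSS knowledge bases:

```python
# Minimal frame representation with slot inheritance, in the spirit of
# the frame-based knowledge bases described above. The slots and values
# (a generic material frame, a sodium-coolant frame) are illustrative.

class Frame:
    def __init__(self, name, parent=None, **slots):
        self.name = name
        self.parent = parent
        self.slots = dict(slots)

    def get(self, slot):
        """Look up a slot locally, then through the parent chain."""
        if slot in self.slots:
            return self.slots[slot]
        if self.parent is not None:
            return self.parent.get(slot)
        raise KeyError(slot)

# A generic material frame supplies defaults that a specific coolant inherits.
material = Frame("material", phase="liquid")
sodium = Frame("sodium-coolant", parent=material, melting_point_K=371.0)

print(sodium.get("melting_point_K"))  # local slot: 371.0
print(sodium.get("phase"))            # inherited from the parent frame
```

The parent chain is what makes frames convenient for plant-analysis knowledge: shared defaults live once in a generic frame, while each component frame records only what is specific to it.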

  7. Interfacing An Intelligent Decision-Maker To A Real-Time Control System

    Science.gov (United States)

    Evers, D. C.; Smith, D. M.; Staros, C. J.

    1984-06-01

    This paper discusses some of the practical aspects of implementing expert systems in a real-time environment. There is a conflict between the needs of a process control system and the computational load imposed by intelligent decision-making software. The computation required to manage a real-time control problem is primarily concerned with routine calculations which must be executed in real time. On most current hardware, non-trivial AI software should not be forced to operate under real-time constraints. In order for the system to work efficiently, the two processes must be separated by a well-defined interface. Although the precise nature of the task separation will vary with the application, the definition of the interface will need to follow certain fundamental principles in order to provide functional separation. This interface was successfully implemented in the expert scheduling software currently running the automated chemical processing facility at Lockheed-Georgia. Potential applications of this concept in the areas of airborne avionics and robotics will be discussed.

  8. Event building in an intelligent network interface card for the LHCb readout network

    CERN Document Server

    Dufey, J P; Neufeld, N; Zuin, M

    2001-01-01

    LHCb is an experiment being constructed at CERN's LHC accelerator for the purpose of precisely studying the CP-violation parameters in the B-B̄ system. Triggering poses special problems, since the interesting events containing B mesons are immersed in a large background of inelastic p-p reactions. Therefore, a four-level triggering scheme (Level 0 to Level 3) has been implemented. Powerful embedded processors, used in modern intelligent Network Interface Cards (smart NICs), make it attractive to use them to handle the event-building protocol in the high-speed data acquisition system of the LHCb experiment. The implementation of an event-building algorithm developed for a specific Gigabit Ethernet NIC is presented and performance data are discussed.

  9. Prototype interface facility for intelligent handling and processing of medical image and data

    Science.gov (United States)

    Lymberopoulos, Dimitris C.; Garantziotis, Giannis; Spiropoulos, Kostas V.; Kotsopoulos, Stavros A.; Goutis, Costas E.

    1993-06-01

    This paper introduces an interface facility (IF) developed within the overall framework of a RACE research project. Because the project focuses on remote medical expert consultation, the distances involved, the varied backgrounds of users, and their unfamiliarity with newly introduced methods of medical diagnosis can give rise to considerable deficiencies. The aim was to intelligently assist the user/physician by providing an ergonomic environment that keeps operational and functional deficiencies to the lowest possible level. The IF energizes and activates system- and application-level commands and procedures, along with the necessary exemplified and instructional help facilities, to allow the user to interact with the system safely and easily at all levels.

  10. Effects of Prior Knowledge in Mathematics on Learner-Interface Interactions in a Learning-by-Teaching Intelligent Tutoring System

    Science.gov (United States)

    Bringula, Rex P.; Basa, Roselle S.; Dela Cruz, Cecilio; Rodrigo, Ma. Mercedes T.

    2016-01-01

    This study attempted to determine the influence of prior knowledge in mathematics of students on learner-interface interactions in a learning-by-teaching intelligent tutoring system. One hundred thirty-nine high school students answered a pretest (i.e., the prior knowledge in mathematics) and a posttest. In between the pretest and posttest, they…

  11. Vision based interface system for hands free control of an intelligent wheelchair

    Directory of Open Access Journals (Sweden)

    Kim Eun

    2009-08-01

    Full Text Available Abstract Background Due to the shift in the age structure of today's populations, the need to develop devices and technologies that support the elderly and disabled has been increasing. Traditionally, the wheelchair, both powered and manual, is the most popular and important rehabilitation/assistive device for the disabled and the elderly. However, it remains highly restrictive, especially for the severely disabled. As a solution to this, Intelligent Wheelchairs (IWs) have received considerable attention as mobility aids. The purpose of this work is to develop an IW interface that is more convenient and efficient for people with disabilities in their limbs. Methods This paper proposes an intelligent wheelchair (IW control system for people with various disabilities. To accommodate a wide variety of user abilities, the proposed system uses face-inclination and mouth-shape information, where the direction of the IW is determined by the inclination of the user's face, while proceeding and stopping are determined by the shape of the user's mouth. The system is composed of an electric powered wheelchair, a data acquisition board, ultrasonic/infra-red sensors, a PC camera, and a vision system. The vision system that analyzes the user's gestures operates in three stages: detector, recognizer, and converter. In the detector, the facial region of the intended user is first obtained using Adaboost; thereafter the mouth region is detected based on edge information. The extracted features are sent to the recognizer, which recognizes the face inclination and mouth shape using statistical analysis and K-means clustering, respectively. These recognition results are then delivered to the converter to control the wheelchair.
    Result & conclusion The advantages of the proposed system include (1) accurate recognition of the user's intention with minimal user motion and (2) robustness to a cluttered background and time-varying illumination
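The K-means step of the recognizer can be illustrated with a toy sketch. The mouth-aspect-ratio feature, the two-cluster setup, and the command mapping below are assumptions for illustration, not the paper's data or code:

```python
# Toy sketch of the K-means clustering step: grouping a mouth-shape
# feature (e.g. mouth aspect ratio) into "closed" vs. "open" clusters.
# The feature values and the stop/proceed mapping are invented.

def kmeans_1d(values, iters=20):
    """Tiny 1-D K-means for two clusters; returns sorted cluster centers."""
    centers = [min(values), max(values)]  # deterministic init for k=2
    for _ in range(iters):
        groups = [[], []]
        for v in values:
            nearest = min(range(2), key=lambda i: abs(v - centers[i]))
            groups[nearest].append(v)
        centers = [sum(g) / len(g) if g else centers[i]
                   for i, g in enumerate(groups)]
    return sorted(centers)

# Mouth aspect ratios from a session: low = closed mouth, high = open mouth.
ratios = [0.11, 0.13, 0.12, 0.55, 0.60, 0.58, 0.14, 0.57]
closed_c, open_c = kmeans_1d(ratios)

def command(ratio):
    """Map a new observation to the nearer cluster's wheelchair command."""
    return "proceed" if abs(ratio - open_c) < abs(ratio - closed_c) else "stop"

print(command(0.50))  # near the open-mouth cluster
```

In the actual system the features would come from the detector stage per video frame; here they are a fixed list so the clustering behaviour is easy to inspect.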

  12. Information, intelligence, and interface: the pillars of a successful medical information system.

    Science.gov (United States)

    Hadzikadic, M; Harrington, A L; Bohren, B F

    1995-01-01

    This paper addresses three key issues facing developers of clinical and/or research medical information systems. 1. INFORMATION. The basic function of every database is to store information about the phenomenon under investigation. There are many ways to organize information in a computer; however, only a few will prove optimal for any real-life situation. Computer science theory has developed several approaches to database structure, with relational theory leading in popularity among end users [8]. Strict conformance to the rules of relational database design rewards the user with consistent data and flexible access to that data. A properly defined database structure minimizes redundancy, i.e., multiple storage of the same information. Redundancy introduces problems when updating a database, since the repeated value has to be updated in all locations; missing even a single value corrupts the whole database, and incorrect reports are produced [8]. To avoid such problems, relational theory offers a formal mechanism for determining the number and content of data files. These files not only preserve the conceptual schema of the application domain, but allow a virtually unlimited number of reports to be efficiently generated. 2. INTELLIGENCE. Flexible access enables the user to harvest additional value from collected data. This value is usually gained via reports defined at the time of database design. Although these reports are indispensable, with proper tools more information can be extracted from the database. For example, machine learning, a sub-discipline of artificial intelligence, has been successfully used to extract knowledge from databases of varying size by uncovering correlations among fields and records [1-6, 9]. This knowledge, represented in the form of decision trees, production rules, and probabilistic networks, clearly adds a flavor of intelligence to the data collection and manipulation system. 3. INTERFACE.
    Despite the obvious importance of collecting
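The redundancy argument in the INFORMATION section can be made concrete with a small sketch using Python's standard sqlite3 module. The patients/visits schema is a hypothetical example, not the paper's database:

```python
# Sketch of the normalization argument: store a repeated value once, so a
# single UPDATE keeps all reports consistent. The schema is hypothetical.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE patients (id INTEGER PRIMARY KEY, name TEXT, phone TEXT);
    CREATE TABLE visits   (id INTEGER PRIMARY KEY,
                           patient_id INTEGER REFERENCES patients(id),
                           note TEXT);
""")
con.execute("INSERT INTO patients VALUES (1, 'A. Jones', '555-0100')")
con.executemany("INSERT INTO visits VALUES (?, 1, ?)",
                [(1, "intake"), (2, "follow-up")])

# The phone number lives in exactly one row, so one UPDATE suffices; a
# denormalized table repeating it per visit would need every row touched,
# and a missed row would yield the inconsistent reports described above.
con.execute("UPDATE patients SET phone = '555-0199' WHERE id = 1")

rows = con.execute("""
    SELECT v.note, p.phone
    FROM visits v JOIN patients p ON p.id = v.patient_id
    ORDER BY v.id
""").fetchall()
print(rows)  # both visits now report the single updated number
```

The join reconstructs the flat view a report needs, while the storage stays free of the update anomalies the paper warns about.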

  13. Decision-Making and the Interface between Human Intelligence and Artificial Intelligence. AIR 1987 Annual Forum Paper.

    Science.gov (United States)

    Henard, Ralph E.

    Possible future developments in artificial intelligence (AI) as well as its limitations are considered that have implications for institutional research in higher education, and especially decision making and decision support systems. It is noted that computer software programs have been developed that store knowledge and mimic the decision-making…

  14. SU-E-T-11: A Cloud Based CT and LINAC QA Data Management System

    Energy Technology Data Exchange (ETDEWEB)

    Wiersma, R; Grelewicz, Z; Belcher, A; Liu, X [The University of Chicago, Chicago, IL (United States)

    2015-06-15

    Purpose: The current status quo of QA data management consists of a mixture of paper-based forms and spreadsheets for recording the results of daily, monthly, and yearly QA tests for both CT scanners and LINACs. Unfortunately, such systems suffer from a host of problems: (1) records can be easily lost or destroyed, (2) data is difficult to access — one must physically hunt down records, (3) poor or no means of historical data analysis, and (4) no remote monitoring of machine performance off-site. To address these issues, a cloud-based QA data management system was developed and implemented. Methods: A responsive tablet interface that optimizes clinic workflow with an easy-to-navigate interface accessible from any web browser was implemented in HTML/JavaScript/CSS to allow user mobility when entering QA data. Automated image QA was performed using a phantom QA kit developed in Python that is applicable to any phantom and is currently being used with the Gammex ACR, Las Vegas, Leeds, and Catphan phantoms for performing automated CT, MV, kV, and CBCT QAs, respectively. A Python based resource management system was used to distribute and manage intensive CPU tasks such as QA phantom image analysis or LaTeX-to-PDF QA report generation to independent process threads or different servers such that website performance is not affected. Results: To date, the cloud QA system has performed approximately 185 QA procedures. Approximately 200 QA parameters are being actively tracked by the system on a monthly basis. Electronic access to historical QA parameter information was successful in proactively identifying a Linac CBCT scanner’s performance degradation. Conclusion: A fully comprehensive cloud-based QA data management system was successfully implemented for the first time. Potential machine performance issues were proactively identified that would have been otherwise missed by a paper or spreadsheet based QA system.
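The pattern of handing CPU-intensive QA jobs to a worker pool so the web interface stays responsive can be sketched generically with the standard concurrent.futures module. The job functions below are stand-ins, not the system's actual analysis or report code:

```python
# Generic sketch of offloading heavy QA jobs (phantom image analysis,
# report generation) to a worker pool so the front end is not blocked.
# analyze_phantom and generate_report are placeholder stand-ins.
from concurrent.futures import ThreadPoolExecutor, as_completed

def analyze_phantom(scan_id):
    # Placeholder for image analysis of one QA phantom scan.
    return {"scan": scan_id, "status": "pass"}

def generate_report(result):
    # Placeholder for LaTeX-to-PDF report generation.
    return f"report-{result['scan']}.pdf"

with ThreadPoolExecutor(max_workers=4) as pool:
    analyses = [pool.submit(analyze_phantom, sid) for sid in ("CT-01", "MV-02")]
    # Consume results as workers finish, rather than in submission order.
    reports = [generate_report(f.result()) for f in as_completed(analyses)]

print(sorted(reports))  # ['report-CT-01.pdf', 'report-MV-02.pdf']
```

In the described system the same idea extends across servers; a pool on one machine is the minimal single-host version of that resource manager.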

  15. SU-E-T-11: A Cloud Based CT and LINAC QA Data Management System

    International Nuclear Information System (INIS)

    Wiersma, R; Grelewicz, Z; Belcher, A; Liu, X

    2015-01-01

    Purpose: The current status quo of QA data management consists of a mixture of paper-based forms and spreadsheets for recording the results of daily, monthly, and yearly QA tests for both CT scanners and LINACs. Unfortunately, such systems suffer from a host of problems: (1) records can be easily lost or destroyed, (2) data is difficult to access — one must physically hunt down records, (3) poor or no means of historical data analysis, and (4) no remote monitoring of machine performance off-site. To address these issues, a cloud-based QA data management system was developed and implemented. Methods: A responsive tablet interface that optimizes clinic workflow with an easy-to-navigate interface accessible from any web browser was implemented in HTML/JavaScript/CSS to allow user mobility when entering QA data. Automated image QA was performed using a phantom QA kit developed in Python that is applicable to any phantom and is currently being used with the Gammex ACR, Las Vegas, Leeds, and Catphan phantoms for performing automated CT, MV, kV, and CBCT QAs, respectively. A Python based resource management system was used to distribute and manage intensive CPU tasks such as QA phantom image analysis or LaTeX-to-PDF QA report generation to independent process threads or different servers such that website performance is not affected. Results: To date, the cloud QA system has performed approximately 185 QA procedures. Approximately 200 QA parameters are being actively tracked by the system on a monthly basis. Electronic access to historical QA parameter information was successful in proactively identifying a Linac CBCT scanner’s performance degradation. Conclusion: A fully comprehensive cloud-based QA data management system was successfully implemented for the first time. Potential machine performance issues were proactively identified that would have been otherwise missed by a paper or spreadsheet based QA system.

  16. QA at Fermilab

    Energy Technology Data Exchange (ETDEWEB)

    Bodnarczuk, M.

    1988-01-01

    This paper opens with a brief overview of the purpose of Fermilab and a historical synopsis of the development and current status of quality assurance (QA) at the Laboratory. The paper subsequently addresses some of the more important aspects of interpreting the national standard ANSI/ASME NQA-1 in pure research environments like Fermilab. Highlights of this discussion include (1) what hermeneutics is and why hermeneutical considerations are relevant for QA, (2) a critical analysis of NQA-1 focussing on teleological aspects of the standard, and (3) a description of the hermeneutical approach to NQA-1 used at Fermilab, which attempts to capture the true intents of the document without violating the deeply ingrained traditions of quality standards and peer review that have been foundational to the overall success of the paradigms of high-energy physics.

  17. Intelligence

    Science.gov (United States)

    Sternberg, Robert J.

    2012-01-01

    Intelligence is the ability to learn from experience and to adapt to, shape, and select environments. Intelligence as measured by (raw scores on) conventional standardized tests varies across the lifespan, and also across generations. Intelligence can be understood in part in terms of the biology of the brain—especially with regard to the functioning in the prefrontal cortex—and also correlates with brain size, at least within humans. Studies of the effects of genes and environment suggest that the heritability coefficient (ratio of genetic to phenotypic variation) is between .4 and .8, although heritability varies as a function of socioeconomic status and other factors. Racial differences in measured intelligence have been observed, but race is a socially constructed rather than biological variable, so such differences are difficult to interpret. PMID:22577301

  18. Intelligence.

    Science.gov (United States)

    Sternberg, Robert J

    2012-03-01

    Intelligence is the ability to learn from experience and to adapt to, shape, and select environments. Intelligence as measured by (raw scores on) conventional standardized tests varies across the lifespan, and also across generations. Intelligence can be understood in part in terms of the biology of the brain-especially with regard to the functioning in the prefrontal cortex-and also correlates with brain size, at least within humans. Studies of the effects of genes and environment suggest that the heritability coefficient (ratio of genetic to phenotypic variation) is between .4 and .8, although heritability varies as a function of socioeconomic status and other factors. Racial differences in measured intelligence have been observed, but race is a socially constructed rather than biological variable, so such differences are difficult to interpret.

  19. NRC overview: Repository QA

    International Nuclear Information System (INIS)

    Kennedy, J.E.

    1988-01-01

    The US Department of Energy (DOE) is on the threshold of an extensive program for characterizing Yucca Mountain in Nevada to determine if it is a suitable site for the permanent disposal of high-level nuclear waste. Earlier this year, the DOE published the Consultation Draft Site Characterization Plan for the Nevada site, which describes in some detail the studies that need to be performed to determine if the site is acceptable. In the near future, the final site characterization plan (SCP) is expected to be issued and large-scale site characterization activities to begin. The data and analyses that will result from the execution of that plan are expected to be the primary basis for the license application to the US Nuclear Regulatory Commission (NRC). Because of the importance of these data and analyses in the assessment of the suitability of the site and in the demonstration of that suitability in the NRC licensing process, the NRC requires in 10CFR60 that site characterization be performed under a quality assurance (QA) program. The QA program is designed to provide confidence that data are valid, retrievable, and reproducible. The documentation produced by the program will form an important part of the record on which the suitability of the site is judged in licensing. In addition, because the NRC staff can review only a selected portion of the data collected, the staff will need to rely on the system of controls in the DOE QA program

  20. Artificial Intelligence, Expert Systems, Natural Language Interfaces, Knowledge Engineering and the Librarian.

    Science.gov (United States)

    Davies, Jim

    This paper begins by examining concepts of artificial intelligence (AI) and discusses various definitions of the concept that have been suggested in the literature. The nesting relationship of expert systems within the broader framework of AI is described, and expert systems are characterized as knowledge-based systems (KBS) which attempt to solve…

  1. Intelligent Tutoring Systems: Formalization as Automata and Interface Design Using Neural Networks

    Science.gov (United States)

    Curilem, S. Gloria; Barbosa, Andrea R.; de Azevedo, Fernando M.

    2007-01-01

    This article proposes a mathematical model of Intelligent Tutoring Systems (ITS), based on observations of the behaviour of these systems. One of the most important problems of pedagogical software is to establish a common language between the knowledge areas involved in their development, basically pedagogical, computing and domain areas. A…

  2. Application of QA geoscience investigations

    International Nuclear Information System (INIS)

    Henderson, J.T.

    1980-01-01

    This paper discusses the evolution of a classical hardware QA program (as currently embodied in DOE/ALO Manual Chapter 08XA; NRC 10CFR Part 50, Appendix B; and other similar documents) into the present geoscience quality assurance programs that address eventual NRC licensing, if required. In the context of this paper, QA will be restricted to the tasks associated with nuclear repositories, i.e. site identification, selection, characterization, verification, and utilization

  3. Emerging Tensions at the Interface of Artificial Intelligence, IPRs & Competition Law in the Health & Life Sciences

    DEFF Research Database (Denmark)

    Minssen, Timo

    This presentation: • describes the interface between Big Data, IPRs & competition law in the life sciences. • highlights selected life-science areas, where tensions and potential clashes are crystallizing. • discusses how these tensions could be addressed...

  4. Advanced interfacing techniques for sensors measurement circuits and systems for intelligent sensors

    CERN Document Server

    Roy, Joyanta; Kumar, V; Mukhopadhyay, Subhas

    2017-01-01

    This book presents ways of interfacing sensors to the digital world, and discusses the marriage between sensor systems and the IoT: the opportunities and challenges. As sensor output is often affected by noise and interference, the book presents effective schemes for recovering the data from a signal that is buried in noise. It also explores interesting applications in the areas of health care, unobtrusive monitoring and the electronic nose and tongue. It is a valuable resource for engineers and scientists in the area of sensors and interfacing wanting to update their knowledge of the latest developments in the field and learn more about sensing applications and challenges.

  5. Intelligent CAMAC crate controller with CC-A2 functionality and VICbus interface

    International Nuclear Information System (INIS)

    Erven, W.; Holzer, J.; Kopp, H.; Loevenich, H.W.; Meiling, W.; Zwoll, K.; Bovier, J.; Re, G.; Worm, F.

    1992-01-01

    This paper reports that for nuclear physics experiments at the Julich Cooler Synchrotron COSY a data acquisition system is under development. With this background, and in order to enhance existing CAMAC systems, an intelligent CAMAC crate controller with CC-A2 functionality was developed. The main enhancement is the replacement of the Branch Highway with a new standard of inter-crate connection: the VICbus. The other highlights are: optional use of a Motorola 68030 microprocessor as a CAMAC list-processor, optimization of CAMAC block transfers, and an optional Ethernet or Cheapernet connection. This controller is commercially available from CES, Geneva, under the name VCC 2117

  6. Testing and interfacing intelligent power supplies for the Los Alamos National Laboratory Accelerator Complex

    International Nuclear Information System (INIS)

    Sturrock, J.C.; Cohen, S.; Weintraub, B.L.; Hayden, D.J.; Archuleta, S.F.

    1992-01-01

    New high-current, high-precision microprocessor-controlled power supplies, built by Alpha Scientific Electronics of Hayward, CA, have been installed at the Los Alamos National Laboratory Accelerator Complex. Each unit has sophisticated microprocessor control on-board and communicates via RS-422 (serial communications). The units use a high-level ASCII-based control protocol. Performance tests were conducted to verify adherence to specification and to ascertain ultimate long-term stability. The ''front-end'' software used by the accelerator control system has been written to accommodate these new devices. The supplies are interfaced to the control system through a terminal server port connected to the site-wide Ethernet backbone. Test design and results as well as details of the software implementation for the analog and digital control of the supplies through the accelerator control system are presented

  7. Waste-management QA training and motivation

    International Nuclear Information System (INIS)

    Henderson, J.T.

    1982-01-01

    Early in the development of a QA Program for the Waste Management and Geotechnical Projects Directorate, thought was given to establishing a QA Training Program commensurate with the needs and appropriate to the motivation of a staff of more than 130 scientists and project leaders. These individuals, i.e., researchers rather than hardware designers, had no prior experience with QA programs and, from their perspective, generally did not believe that such controls had any merit. Therefore, historically proven approaches to QA training had to be quickly modified or totally discarded. For instance, due to the mobility and diversity of backgrounds of personnel at SNL, the QA training program had to accommodate many different levels of QA maturity at any given time. Furthermore, since the application of QA to R and D was continuing to profit from project-specific lessons learned, these improvements in the QA program had to be easily and quickly incorporated into the general staff's evolving awareness of QA. A somewhat novel approach to QA training has been developed that draws heavily upon SNL's existing In-Hours Technical Education Courses (INTEC) studio capabilities. This training attempts to accommodate individual staff needs and to ensure the required QA skills and awareness for the diverse types of programs addressed

  8. A novel proton exchange membrane fuel cell based power conversion system for telecom supply with genetic algorithm assisted intelligent interfacing converter

    International Nuclear Information System (INIS)

    Kaur, Rajvir; Krishnasamy, Vijayakumar; Muthusamy, Kaleeswari; Chinnamuthan, Periasamy

    2017-01-01

    Highlights: • Proton exchange membrane fuel cell based telecom tower supply is proposed. • The use of diesel generator is eliminated and battery size is reduced. • Boost converter based intelligent interfacing unit is implemented. • The genetic algorithm assisted controller is proposed for effective interfacing. • The controller is robust against input and output disturbance rejection. - Abstract: This paper presents the fuel cell based simple electric energy conversion system for supplying the telecommunication towers to reduce the operation and maintenance cost of telecom companies. The telecom industry is at the boom and is penetrating deep into remote rural areas having unreliable or no grid supply. The telecom industry is getting heavily dependent on a diesel generator set and battery bank as a backup for continuously supplying a base transceiver station of telecom towers. This excessive usage of backup supply resulted in increased operational expenditure, the unreliability of power supply and had become a threat to the environment. A significant development and concern of clean energy sources, proton exchange membrane fuel cell based supply for base transceiver station is proposed with intelligent interfacing unit. The necessity of the battery bank capacity is significantly reduced as compared with the earlier solutions. Further, a simple closed loop and genetic algorithm assisted controller is proposed for intelligent interfacing unit which consists of power electronic boost converter for power conditioning. The proposed genetic algorithm assisted controller would ensure the tight voltage regulation at the DC distribution bus of the base transceiver station. Also, it will provide the robust performance of the base transceiver station under telecom load variation and proton exchange membrane fuel cell output voltage fluctuations. The complete electric energy conversion system along with telecom loads is simulated in MATLAB/Simulink platform and
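
    As a rough illustration of the genetic-algorithm-assisted tuning idea described above (a toy sketch, not the authors' controller or converter model; the cost surface and the PI gains `kp`, `ki` are invented for illustration):

```python
import random

# Toy sketch: a genetic algorithm searching for PI gains (kp, ki) that
# minimize a hypothetical voltage-regulation cost for an interfacing
# boost converter, here reduced to a quadratic surrogate with its
# minimum placed (arbitrarily) at kp=2.0, ki=0.5.
def cost(kp, ki):
    return (kp - 2.0) ** 2 + (ki - 0.5) ** 2

def evolve(pop_size=20, generations=40, seed=1):
    rng = random.Random(seed)
    pop = [(rng.uniform(0, 5), rng.uniform(0, 2)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda g: cost(*g))
        parents = pop[: pop_size // 2]          # truncation selection (elitist)
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            # arithmetic crossover plus Gaussian mutation
            kp = 0.5 * (a[0] + b[0]) + rng.gauss(0, 0.1)
            ki = 0.5 * (a[1] + b[1]) + rng.gauss(0, 0.1)
            children.append((kp, ki))
        pop = parents + children
    return min(pop, key=lambda g: cost(*g))

kp, ki = evolve()
```

    In the paper's setting the cost would instead be computed from the simulated DC-bus voltage error under load and fuel-cell voltage fluctuations; the GA machinery (selection, crossover, mutation) is the same.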

  9. QA manpower requirement for nuclear power plants

    International Nuclear Information System (INIS)

    Link, M.

    1980-01-01

    To ensure the quality of the plant, QA activities are to be performed by the owner, the main contractor, the subcontractors and the Licensing Authority. The responsibilities of the QA personnel of these organizations comprise, as a minimum, the control of the quality assurance systems and the proof of the quality requirements. Numbers of required QA personnel designated for different tasks, together with recommended educational levels and professional qualifications, will be given. (orig./RW)

  10. Follow-up utterances in QA dialogue

    NARCIS (Netherlands)

    van Schooten, B.W.; op den Akker, Hendrikus J.A.

    2006-01-01

    The processing of user follow-up utterances by a QA system is a topic which is still in its infant stages, but enjoys growing interest in the QA community. In this paper, we discuss the broader issues related to handling follow-up utterances in a real-life "information kiosk" setting. With help of a

  11. Patient QA systems for rotational radiation therapy

    DEFF Research Database (Denmark)

    Fredh, Anna; Scherman, J.B.; Munck af Rosenschöld, Per Martin

    2013-01-01

    The purpose of the present study was to investigate the ability of commercial patient quality assurance (QA) systems to detect linear accelerator-related errors.

  12. QA

    International Nuclear Information System (INIS)

    Anon.

    1984-01-01

    Koeberg's system for quality assurance was discussed with the Quality Assurance Programme Manager for Koeberg Construction. An American style of quality assurance, practised on French technology is used for Koeberg. The quality assurance that is practised at Koeberg, also affected other industries in South Africa

  13. Introducing AI into MEMS can lead us to brain-computer interfaces and super-human intelligence

    OpenAIRE

    Sanders, David

    2009-01-01

    Last year, I spoke about the progress being made in machine intelligence (Sanders, 2008c) and with sensors and networks of sensors (Sanders, 2008b). Earlier this year (in this journal) I spoke about ambient-intelligence, rapid-prototyping and the role of humans in the factories of the future (Sanders, 2009a). I addressed new applications and technologies such as merging machines with human beings, micro-electromechanics, electro-mechanical systems that can be personalized, smarter than human ...

  14. Applying QA to nuclear-development programs

    International Nuclear Information System (INIS)

    Caplinger, W.H.

    1981-12-01

    The application of quality assurance (QA) principles to developmental programs is usually accomplished by tailoring or selecting appropriate requirements from large QA systems. Developmental work at Westinghouse Hanford Company (WHC) covers the complete range from basic research to in-core reactor tests. Desired requirements are selected from the 18 criteria in ANSI/ASME NQA Standard 1 by the cognizant program engineer in conjunction with the quality engineer. These referenced criteria assure that QA for the program is planned, implemented, and maintained. In addition, the WHC QA Manual provides four categories or levels of QA that are assigned to programs or components within the program. These categories are based on safety, reliability, and consequences of failure to provide a cost effective program

  15. QA CLASSIFICATION ANALYSIS OF GROUND SUPPORT SYSTEMS

    International Nuclear Information System (INIS)

    D. W. Gwyn

    1996-01-01

    The purpose and objective of this analysis is to determine if the permanent function Ground Support Systems (CI: BABEEOOOO) are quality-affecting items and if so, to establish the appropriate Quality Assurance (QA) classification

  16. Construction QA/QC systems: comparative analysis

    International Nuclear Information System (INIS)

    Willenbrock, J.H.; Shepard, S.

    1980-01-01

    An analysis which compares the quality assurance/quality control (QA/QC) systems adopted in the highway, nuclear power plant, and U.S. Navy construction areas with the traditional quality control approach used in building construction is presented. Full participation and support by the owner as well as the contractor and AE firm are required if a QA/QC system is to succeed. Process quality control, acceptance testing and quality assurance responsibilities must be clearly defined in the contract documents. The owner must audit these responsibilities. A contractor quality control plan, indicating the tasks which will be performed and the fact that QA/QC personnel are independent of project time/cost pressures should be submitted for approval. The architect must develop realistic specifications which consider the natural variability of material. Acceptance criteria based on the random sampling technique should be used. 27 refs
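
    The random-sampling acceptance criteria recommended above are commonly framed as single-sampling plans. A minimal sketch, assuming a hypothetical plan of sample size n with acceptance number c (the numbers are not taken from the paper):

```python
from math import comb

# Single-sampling acceptance plan (illustrative): accept the lot if at
# most c defective items appear in a random sample of n items, given a
# true lot defective fraction p. The acceptance probability is the
# binomial CDF at c.
def accept_probability(n, c, p):
    return sum(comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(c + 1))

# Hypothetical plan: sample 50 items, accept with at most 2 defectives.
pa = accept_probability(n=50, c=2, p=0.02)
```

    Plotting `accept_probability` against p gives the plan's operating-characteristic curve, which is how owner and contractor can agree in the contract documents on the risk each side accepts.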

  17. Intelligent Access to Sequence and Structure Databases (IASSD) - an interface for accessing information from major web databases.

    Science.gov (United States)

    Ganguli, Sayak; Gupta, Manoj Kumar; Basu, Protip; Banik, Rahul; Singh, Pankaj Kumar; Vishal, Vineet; Bera, Abhisek Ranjan; Chakraborty, Hirak Jyoti; Das, Sasti Gopal

    2014-01-01

    With the advent of the age of big data and advances in high-throughput technology, accessing data has become one of the most important steps in the entire knowledge discovery process. Most users are not able to decipher the query result that is obtained when non-specific keywords or a combination of keywords are used. Intelligent access to sequence and structure databases (IASSD) is a desktop application for the Windows operating system. It is written in Java and utilizes the web service description language (wsdl) files and Jar files of E-utilities of various databases such as the National Centre for Biotechnology Information (NCBI) and the Protein Data Bank (PDB). Apart from that, IASSD allows the user to view protein structure using a JMOL application which supports conditional editing. The Jar file is freely available through e-mail from the corresponding author.
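
    IASSD is described as wrapping NCBI's E-utilities. As a small illustration of the kind of request such a tool issues, the following builds an ESearch URL (the query terms are invented for illustration; no network call is made here):

```python
from urllib.parse import urlencode

# Base endpoint of NCBI's Entrez E-utilities.
EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils"

def esearch_url(db, term, retmax=20):
    """Build an ESearch request URL for the given database and query."""
    params = urlencode({"db": db, "term": term,
                        "retmax": retmax, "retmode": "json"})
    return f"{EUTILS}/esearch.fcgi?{params}"

# Example query (hypothetical): search the protein database.
url = esearch_url("protein", "human insulin")
```

    A client like IASSD would fetch this URL, read the returned ID list, and then pass the IDs to EFetch to retrieve the actual sequence or structure records.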

  18. Investigations of low qa discharges in the SINP tokamak

    Indian Academy of Sciences (India)

    Low edge safety factor discharges including very low qa (1 qa ... From fluctuation analysis of the external magnetic probe data it has been found that MHD ... To investigate the internal details of these discharges, an internal magnetic probe ...

  19. Peer review, basic research, and engineering: Defining a role for QA professionals in basic research environments

    Energy Technology Data Exchange (ETDEWEB)

    Bodnarczuk, M.

    1989-02-01

    Within the context of doing basic research, this paper seeks to answer four major questions: (1) What is the authority structure of science? (2) What is peer review? (3) Where is the interface between basic physics research and standard engineering? (4) Given the conclusions to the first three questions, what is the role of the QA professional in a basic research environment like Fermilab? 23 refs.

  20. Basic concept of QA for advanced technologies

    International Nuclear Information System (INIS)

    Mijnheer, Ben

    2008-01-01

    The lecture was structured as follows: (1) Rationale for accurate dose determination; (2) Existing recommendations and guidance; (3) Challenges within the current QA paradigm; (4) New paradigm adopted by AAPM TG 100; and (5) Application of new paradigm to IMRT. Attention was paid, i.a., to major accidents in radiotherapy such as Epinal-1. (P.A.)

  1. Radiotherapy QA of the DAHANCA 19 protocol

    DEFF Research Database (Denmark)

    Samsøe, E.; Andersen, E.; Hansen, C. R.

    2015-01-01

    Purpose/Objective: It has been demonstrated that nonadherence to protocol-specified radiotherapy (RT) requirements is associated with reduced survival, local control and potentially increased toxicity [1]. Thus, quality assurance (QA) of RT is important when evaluating the results of clinical...

  2. Distributed intelligence in CAMAC

    International Nuclear Information System (INIS)

    Kunz, P.F.

    1977-01-01

    The CAMAC digital interface standard has served us well since 1969. During this time there have been enormous advances in digital electronics. In particular, low cost microprocessors now make it feasible to consider use of distributed intelligence even in simple data acquisition systems. This paper describes a simple extension of the CAMAC standard which allows distributed intelligence at the crate level

  3. Intelligence in Artificial Intelligence

    OpenAIRE

    Datta, Shoumen Palit Austin

    2016-01-01

    The elusive quest for intelligence in artificial intelligence prompts us to consider that instituting human-level intelligence in systems may be (still) in the realm of utopia. In about a quarter century, we have witnessed the winter of AI (1990) being transformed and transported to the zenith of tabloid fodder about AI (2015). The discussion at hand is about the elements that constitute the canonical idea of intelligence. The delivery of intelligence as a pay-per-use-service, popping out of ...

  4. Minimum requirements on a QA program in radiation oncology

    International Nuclear Information System (INIS)

    Almond, P.R.

    1996-01-01

    In April, 1994, the American Association of Physicists in Medicine published a ''Comprehensive QA for radiation oncology:'' a report of the AAPM Radiation Therapy Committee. This is a comprehensive QA program which is likely to become the standard for such programs in the United States. The program stresses the interdisciplinary nature of QA in radiation oncology involving the radiation oncologists, the radiotherapy technologists (radiographers), dosimetrists, and accelerator engineers, as well as the medical physicists. This paper describes a comprehensive quality assurance program with the main emphasis on the quality assurance in radiation therapy using a linear accelerator. The paper deals with QA for a linear accelerator and simulator and QA for treatment planning computers. Next the treatment planning process and QA for individual patients is described. The main features of this report, which should apply to QA programs in any country, emphasizes the responsibilities of the medical physicist. (author). 7 refs, 9 tabs

  5. Minimum requirements on a QA program in radiation oncology

    Energy Technology Data Exchange (ETDEWEB)

    Almond, P R [Louisville Univ., Louisville, KY (United States). J.G. Brown Cancer Center

    1996-08-01

    In April, 1994, the American Association of Physicists in Medicine published a ''Comprehensive QA for radiation oncology:'' a report of the AAPM Radiation Therapy Committee. This is a comprehensive QA program which is likely to become the standard for such programs in the United States. The program stresses the interdisciplinary nature of QA in radiation oncology involving the radiation oncologists, the radiotherapy technologists (radiographers), dosimetrists, and accelerator engineers, as well as the medical physicists. This paper describes a comprehensive quality assurance program with the main emphasis on the quality assurance in radiation therapy using a linear accelerator. The paper deals with QA for a linear accelerator and simulator and QA for treatment planning computers. Next the treatment planning process and QA for individual patients is described. The main features of this report, which should apply to QA programs in any country, emphasizes the responsibilities of the medical physicist. (author). 7 refs, 9 tabs.

  6. A biological model for construction of meaning to serve as an interface between an intelligent system and its environments

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, W.J. [Univ of California, Berkeley, CA (United States)

    1996-12-31

    There are two main levels of neural function to be modeled with appropriate state variables and operations. Microscopic activity is seen in the fraction of the variance of single neuron pulse trains (>99.9%) that is largely random and uncorrelated with pulse trains of other neurons in the neuropil. Macroscopic activity is revealed in the >0.1% of the total variance of each neuron that is covariant with all other neurons in neuropil comprising a population. It is observed in dendritic potentials recorded as surface EEGs. The ''spontaneous'' background activity of neuropil at both levels arises from mutual excitation within a population of excitatory neurons. Its governing point attractor is set by the macroscopic state, which acts as an order parameter to regulate the contributing neurons. The point attractor manifests a homogeneous field of white noise, which can be modeled by a continuous-time state variable for pulse density. Neuropil comprises both excitatory and inhibitory neurons. Their interactions at the macroscopic level give oscillations, manifesting a limit cycle attractor. Multiple areas of neuropil comprising a sensory system interact. Due to their incommensurate characteristic frequencies and the long axonal delays between them, the system maintains a global chaotic attractor having multiple wings, one for each discriminable class of stimuli. Access to each wing is by stimulus-induced state transitions, causing construction of macroscopic chaotic patterns that are carried to targets of cortical transmission by axon tracts. AM patterns of the carrier are extracted by the targets by spatiotemporal integration, thereby retrieving the covariance comprising the chaotic signal. In digital models, noise serves to stabilize the chaotic attractors. An example will be given of the model operating as an interface between the environment and a pattern classifier, which learns to form its own feature detectors.

  7. SU-E-T-29: A Web Application for GPU-Based Monte Carlo IMRT/VMAT QA with Delivered Dose Verification

    International Nuclear Information System (INIS)

    Folkerts, M; Graves, Y; Tian, Z; Gu, X; Jia, X; Jiang, S

    2014-01-01

    Purpose: To enable an existing web application for GPU-based Monte Carlo (MC) 3D dosimetry quality assurance (QA) to compute “delivered dose” from linac logfile data. Methods: We added significant features to an IMRT/VMAT QA web application which is based on existing technologies (HTML5, Python, and Django). This tool interfaces with python, c-code libraries, and command line-based GPU applications to perform a MC-based IMRT/VMAT QA. The web app automates many complicated aspects of interfacing clinical DICOM and logfile data with cutting-edge GPU software to run a MC dose calculation. The resultant web app is powerful, easy to use, and is able to re-compute both plan dose (from DICOM data) and delivered dose (from logfile data). Both dynalog and trajectorylog file formats are supported. Users upload zipped DICOM RP, CT, and RD data and set the expected statistical uncertainty for the MC dose calculation. A 3D gamma index map, 3D dose distribution, gamma histogram, dosimetric statistics, and DVH curves are displayed to the user. Additionally, the user may upload the delivery logfile data from the linac to compute a 'delivered dose' calculation and corresponding gamma tests. A comprehensive PDF QA report summarizing the results can also be downloaded. Results: We successfully improved a web app for a GPU-based QA tool that consists of logfile parsing, fluence map generation, CT image processing, GPU-based MC dose calculation, gamma index calculation, and DVH calculation. The result is an IMRT and VMAT QA tool that conducts an independent dose calculation for a given treatment plan and delivery log file. The system takes both DICOM data and logfile data to compute plan dose and delivered dose, respectively. Conclusion: We successfully improved a GPU-based MC QA tool to allow for logfile dose calculation. The high efficiency and accessibility will greatly facilitate IMRT and VMAT QA
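
    The gamma index computed by the web app combines a dose-difference criterion with a distance-to-agreement criterion. A minimal 1D sketch of the idea (illustrative only, not the GPU implementation described above; the tolerances and profiles are invented):

```python
import math

# 1D gamma index: for each evaluated point, search the reference profile
# for the point minimizing the combined (dose difference / dose_tol,
# spatial distance / dist_tol) score. A point passes if gamma <= 1.
def gamma_1d(ref_dose, ref_x, eval_dose, eval_x, dose_tol=0.03, dist_tol=3.0):
    gammas = []
    for de, xe in zip(eval_dose, eval_x):
        best = min(
            math.sqrt(((dr - de) / dose_tol) ** 2 + ((xr - xe) / dist_tol) ** 2)
            for dr, xr in zip(ref_dose, ref_x)
        )
        gammas.append(best)
    return gammas

# Sanity check with identical profiles: every gamma is zero.
xs = [0.0, 1.0, 2.0, 3.0]
dose = [1.00, 0.95, 0.80, 0.50]
gam = gamma_1d(dose, xs, dose, xs)
pass_rate = sum(g <= 1.0 for g in gam) / len(gam)
```

    The clinical tool does the same search in 3D over dense dose grids, which is why a brute-force version like this is impractical there and GPU acceleration pays off.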

  8. SU-E-T-29: A Web Application for GPU-Based Monte Carlo IMRT/VMAT QA with Delivered Dose Verification

    Energy Technology Data Exchange (ETDEWEB)

    Folkerts, M [The University of Texas Southwestern Medical Ctr, Dallas, TX (United States); University of California, San Diego, La Jolla, CA (United States); Graves, Y [University of California, San Diego, La Jolla, CA (United States); Tian, Z; Gu, X; Jia, X; Jiang, S [The University of Texas Southwestern Medical Ctr, Dallas, TX (United States)

    2014-06-01

    Purpose: To enable an existing web application for GPU-based Monte Carlo (MC) 3D dosimetry quality assurance (QA) to compute “delivered dose” from linac logfile data. Methods: We added significant features to an IMRT/VMAT QA web application which is based on existing technologies (HTML5, Python, and Django). This tool interfaces with python, c-code libraries, and command line-based GPU applications to perform a MC-based IMRT/VMAT QA. The web app automates many complicated aspects of interfacing clinical DICOM and logfile data with cutting-edge GPU software to run a MC dose calculation. The resultant web app is powerful, easy to use, and is able to re-compute both plan dose (from DICOM data) and delivered dose (from logfile data). Both dynalog and trajectorylog file formats are supported. Users upload zipped DICOM RP, CT, and RD data and set the expected statistical uncertainty for the MC dose calculation. A 3D gamma index map, 3D dose distribution, gamma histogram, dosimetric statistics, and DVH curves are displayed to the user. Additionally, the user may upload the delivery logfile data from the linac to compute a 'delivered dose' calculation and corresponding gamma tests. A comprehensive PDF QA report summarizing the results can also be downloaded. Results: We successfully improved a web app for a GPU-based QA tool that consists of logfile parsing, fluence map generation, CT image processing, GPU-based MC dose calculation, gamma index calculation, and DVH calculation. The result is an IMRT and VMAT QA tool that conducts an independent dose calculation for a given treatment plan and delivery log file. The system takes both DICOM data and logfile data to compute plan dose and delivered dose, respectively. Conclusion: We successfully improved a GPU-based MC QA tool to allow for logfile dose calculation. The high efficiency and accessibility will greatly facilitate IMRT and VMAT QA.

  9. Physics acceptance and QA procedures for IMRT

    International Nuclear Information System (INIS)

    LoSasso, T.; Ling, C.

    2001-01-01

    Full text: Intensity modulated radiation therapy (IMRT) may improve tumor control without compromising normal tissues by facilitating higher, more conformal tumor doses relative to 3D CRT. Intensity modulation (IM) is now possible with inverse planning and radiation delivery using dynamic multileaf collimation. Compared to 3D CRT, certain components in the IMRT process are more obscure to the user. Thus, special quality assurance procedures are required. Hardware and software are still relatively new to many users, and the potential for error is unknown. The relationship between monitor unit (MU) setting and radiation dose for IM beams is much more complex than for non-IM fields. The leaf sequence computer files, which control the MLC position as a function of MU, are large and do not lend themselves to simple manual verification. The 'verification' port film for each IM treatment field, usually obtained with the MLC set at the extreme leaf positions for that field to outline the entire irradiated area, does not verify the intensity modulation pattern. Finally, in IMRT using DMLC (the so-called sliding window technique), a small error in the window (or gap) width will lead to a significant dose error. In earlier papers, we provided an evaluation of the mechanical and dosimetric aspects in the use of a MLC in the dynamic mode. Mechanical tolerances are significantly tighter for DMLC than for static MLC treatments. Transmission through the leaves and through rounded leaf ends and head scatter were shown to be significant to the accuracy of radiation dose delivery using DMLC. With these considerations, we concluded that the present DMLC hardware and software are effective for routine clinical implementation, provided that a carefully designed routine QA procedure is followed to assure the normality of operation. In our earlier studies, an evaluation of the long-term stability of DMLC operation had not yet been performed. 
This paper describes the current status of our

  10. IC design challenges for ambient intelligence

    NARCIS (Netherlands)

    Aarts, E.H.L.; Roovers, R.L.J.

    2003-01-01

    The vision of ambient intelligence opens a world of unprecedented experiences: the interaction of people with electronic devices is changed as contextual awareness, natural interfaces and ubiquitous availability of information are realized. We analyze the consequences of the ambient intelligence

  11. Robot Advanced Intelligent Control developed through Versatile ...

    African Journals Online (AJOL)

    pc

    2018-03-05

    Mar 5, 2018 ... environments of human life exposed to great dangers such as support and repair in .... intelligent control interfaces, network quality of service, shared resources and ..... Artificial Intelligence series, volume 6556, p. 336-349 ...

  12. Analysis of QA audit checklist for equipment suppliers

    International Nuclear Information System (INIS)

    Tian Xuehang

    2012-01-01

    Eleven aspects of equipment manufacturing by suppliers, including the guidelines and objectives of quality assurance, management department review, document and record control, staffing and training, design control, procurement control, control of items, process control, inspection and testing control, non-conformance control, and internal and external QA audit, are analyzed in this article. The detailed QA audit checklists for the above-mentioned aspects are described, and the problems found in actual QA audits are listed in this article. (authors)

  13. A comparative study and analysis of QA requirements for the establishment of a nuclear R and D QA system

    International Nuclear Information System (INIS)

    Kim, Kwan Hyun

    2000-06-01

    This technical report provides recommendations on how to fulfill the requirements of the code in relation to QA activities for the nuclear R and D field. This guide applies to the quality assurance (QA) programmes of the responsible organization, i.e. the organization having overall responsibility for the nuclear power plant, as well as to any other separate QA programmes in each stage of a nuclear R and D project. This guide covers QA work on items, services and processes impacting nuclear safety during siting, design, construction, commissioning, operation and decommissioning of nuclear power plants. The impact on safety may occur during the performance of the QA work, or owing to the application of the results of the QA. This guide may, with appropriate modification, also be usefully applied at nuclear installations other than those in the nuclear R and D field

  14. Manufacturing and QA of adaptors for LHC

    International Nuclear Information System (INIS)

    Madhu Murthy, V.; Dwivedi, J.; Goswami, S.G.; Soni, H.C.; Mainaud Durand, H.; Quesnel, J.P.; )

    2006-01-01

    The LHC low beta quadrupoles have very tight alignment tolerances and are located in areas with strong radiation fields. They require remote re-alignment, by motorized jacks, based on the feedback of the alignment sensors of each magnet. Jacks designed to support the arc cryomagnets of the LHC are modified and motorized with the help of adaptors. Two types of adaptors, for the vertical and transverse axes of the jacks, were developed and supplied through a collaboration between RRCAT, DAE, India and CERN, Geneva. This paper describes their functional requirements, manufacture and quality assurance (QA). (author)

  15. Accounting for human factor in QC and QA inspections

    International Nuclear Information System (INIS)

    Goodman, J.

    1986-01-01

    Two types of human error during QC/QA inspection have been identified, and a method of accounting for the effects of human error in QC/QA inspections was developed. The evaluated proportion of discrepant items in the population is significantly affected by the human factor

  16. The intelligent data recorder

    International Nuclear Information System (INIS)

    Kojima, Mamoru; Hidekuma, Sigeru.

    1985-01-01

    The intelligent data recorder has been developed for data acquisition from a microwave interferometer. The 'RS-232C' standard interface is used for data transmission to the host computer, so it is easy to connect the recorder to any computer with a general-purpose serial port. In this report, the characteristics of the intelligent data recorder and the way the software was developed are described. (author)

  17. Intelligent Information Retrieval: An Introduction.

    Science.gov (United States)

    Gauch, Susan

    1992-01-01

    Discusses the application of artificial intelligence to online information retrieval systems and describes several systems: (1) CANSEARCH, from MEDLINE; (2) Intelligent Interface for Information Retrieval (I3R); (3) Gauch's Query Reformulation; (4) Environmental Pollution Expert (EP-X); (5) PLEXUS (gardening); and (6) SCISOR (corporate…

  18. MO-A-16A-01: QA Procedures and Metrics: In Search of QA Usability

    Energy Technology Data Exchange (ETDEWEB)

    Sathiaseelan, V [Northwestern Memorial Hospital, Chicago, IL (United States); Thomadsen, B [University of Wisconsin, Madison, WI (United States)

    2014-06-15

    Radiation therapy has undergone considerable changes in the past two decades with a surge of new technology and treatment delivery methods. The complexity of radiation therapy treatments has increased and there has been increased awareness and publicity about the associated risks. In response, there has been a proliferation of guidelines for medical physicists to adopt to ensure that treatments are delivered safely. Task Group recommendations are copious, and clinical physicists' hours are longer, stretched to various degrees between site planning and management, IT support, physics QA, and treatment planning responsibilities. Radiation oncology has many quality control practices in place to ensure the delivery of high-quality, safe treatments. Incident reporting systems have been developed to collect statistics about near miss events at many radiation oncology centers. However, tools are lacking to assess the impact of these various control measures. A recent effort to address this shortcoming is the work of Ford et al (2012), who published a methodology enumerating quality control quantification for measuring the effectiveness of safety barriers. Over 4000 near-miss incidents reported from 2 academic radiation oncology clinics were analyzed using quality control quantification, and a profile of the most effective quality control measures (metrics) was identified. There is a critical need to identify a QA metric to help busy clinical physicists focus their limited time and resources most effectively in order to minimize or eliminate errors in the radiation treatment delivery processes. In this symposium the usefulness of workflows and QA metrics to assure safe and high quality patient care will be explored. Two presentations will be given: Quality Metrics and Risk Management with High Risk Radiation Oncology Procedures; Strategies and metrics for quality management in the TG-100 Era. Learning Objectives: Provide an overview and the need for QA usability

  19. MO-A-16A-01: QA Procedures and Metrics: In Search of QA Usability

    International Nuclear Information System (INIS)

    Sathiaseelan, V; Thomadsen, B

    2014-01-01

    Radiation therapy has undergone considerable changes in the past two decades with a surge of new technology and treatment delivery methods. The complexity of radiation therapy treatments has increased and there has been increased awareness and publicity about the associated risks. In response, there has been a proliferation of guidelines for medical physicists to adopt to ensure that treatments are delivered safely. Task Group recommendations are copious, and clinical physicists' hours are longer, stretched to various degrees between site planning and management, IT support, physics QA, and treatment planning responsibilities. Radiation oncology has many quality control practices in place to ensure the delivery of high-quality, safe treatments. Incident reporting systems have been developed to collect statistics about near miss events at many radiation oncology centers. However, tools are lacking to assess the impact of these various control measures. A recent effort to address this shortcoming is the work of Ford et al (2012), who published a methodology enumerating quality control quantification for measuring the effectiveness of safety barriers. Over 4000 near-miss incidents reported from 2 academic radiation oncology clinics were analyzed using quality control quantification, and a profile of the most effective quality control measures (metrics) was identified. There is a critical need to identify a QA metric to help busy clinical physicists focus their limited time and resources most effectively in order to minimize or eliminate errors in the radiation treatment delivery processes. In this symposium the usefulness of workflows and QA metrics to assure safe and high quality patient care will be explored. Two presentations will be given: Quality Metrics and Risk Management with High Risk Radiation Oncology Procedures; Strategies and metrics for quality management in the TG-100 Era. Learning Objectives: Provide an overview and the need for QA usability

  20. Artificial Intelligence--Applications in Education.

    Science.gov (United States)

    Poirot, James L.; Norris, Cathleen A.

    1987-01-01

    This first in a projected series of five articles discusses artificial intelligence and its impact on education. Highlights include the history of artificial intelligence and the impact of microcomputers; learning processes; human factors and interfaces; computer assisted instruction and intelligent tutoring systems; logic programing; and expert…

  1. Intelligible Artificial Intelligence

    OpenAIRE

    Weld, Daniel S.; Bansal, Gagan

    2018-01-01

    Since Artificial Intelligence (AI) software uses techniques like deep lookahead search and stochastic optimization of huge neural networks to fit mammoth datasets, it often results in complex behavior that is difficult for people to understand. Yet organizations are deploying AI algorithms in many mission-critical settings. In order to trust their behavior, we must make it intelligible --- either by using inherently interpretable models or by developing methods for explaining otherwise overwh...

  2. Role of QA in total quality management environment

    International Nuclear Information System (INIS)

    McCarthy, J.B.; Ayres, R.A.

    1992-01-01

    A successful company in today's highly competitive business environment must emphasize quality in all activities at all times. For most companies, this requires a major cultural change to establish appropriate operating attitudes and priorities. A total quality environment is required where quality becomes a way of life, and this process must be carefully managed. It will not be accomplished in a few short months with a simple management pronouncement. Instead, it evolves over a period of years through continuous incremental improvement. This evolution towards total quality requires a dramatic change in the quality assurance (QA) function of most companies. Traditionally, quality was automatically equated to QA and its attendant procedures and personnel. Now, quality is becoming a global concept, and QA can play a significant role in the process. The QA profession must, however, recognize and accept a new role as consultant, coach, and partner in today's total quality game. The days of the hard-line enforcer of procedural requirements are gone

  3. Current Status of QA For Nuclear Power Plants in Japan

    International Nuclear Information System (INIS)

    Nagoshi, Hitohiko

    1986-01-01

    This paper describes the current status of QA and Japanese QA experiences with nuclear power plants against the background of the Japanese social and business environment. In 1972, 'The Guidance for Quality Assurance in Construction of Nuclear Power Plants', based on the U.S. 10 CFR 50 Appendix B, was published by the Japan Electric Association. 'JEAG-4101 The Guide for Quality Assurance of Nuclear Power Plants' has been prepared by referring to the IAEA QA code. The Guide has been accepted by the Japanese nuclear industry and applied to the QA programs of every organization concerned therewith. The Japanese approach to higher quality will naturally be different from that of other countries because of Japan's cultural, social, and economic conditions. Even higher quality is being aimed at through the LWR Improvement and Standardization Program and coordinated quality assurance efforts

  4. Worldwide QA networks for radiotherapy dosimetry

    International Nuclear Information System (INIS)

    Izewska, J.; Svensson, H.; Ibbott, G.

    2002-01-01

    institutions participating in the U.S. National Cancer Institute's (NCI's) co-operative clinical trials. The RPC currently monitors approximately 1300 centres throughout the USA, Canada and several other countries. The audit tools include, in addition to mailed TLD, review of the institution's dosimetry data, the treatment records of patients entered into trials, and the institution's QA programme. Anthropomorphic phantoms have been developed to evaluate specific treatment techniques. Other currently operating external audit programmes have been either associated with national and international clinical trial groups, similarly to RPC, e.g. EORTC (European Organisation for Research in Treatment of Cancer) in Europe, MRC (Medical Research Council) in the UK, or have been one-off national dosimetry intercomparison exercises, carried out to test various levels of radiotherapy dosimetry, e.g. in Sweden, the Netherlands, Belgium, Switzerland, Australia. Some individual countries have set up comprehensive regular audits of radiotherapy centres, including QA programmes, equipment and dosimetry, e.g. Finland, New Zealand. The IAEA supports its Member States in developing national programmes for TLD based QA audits in radiotherapy dosimetry and whenever possible, establishes links between the national programmes and the IAEA's Dosimetry Laboratory. It disseminates its standardised TLD methodology and provides technical back up to national TLD networks assuring at the same time traceability to primary dosimetry standards. There are several countries (Argentina, Algeria, Brazil, China, Colombia, Cuba, Czech Republic, India, Israel, Malaysia, Philippines, Poland and Vietnam) that have established TLD programmes to audit radiotherapy beams in their countries with assistance of the IAEA. 
Recently a new IAEA project has been initiated for national TLD audits in non-reference conditions as significant numbers of deviations in non-reference situations, as used clinically on patients, have been

  5. NRC [Nuclear Regulatory Commission] perspective of software QA [quality assurance] in the nuclear history

    International Nuclear Information System (INIS)

    Weiss, S.H.

    1988-01-01

    Computer technology has been a part of the nuclear industry since its inception. However, it is only recently that computers have been integrated into reactor operations. During the early history of commercial nuclear power in the United States, the US Nuclear Regulatory Commission (NRC) discouraged the use of digital computers for real-time control and monitoring of nuclear power plant operation. At the time, this position was justified since software engineering was in its infancy and horror stories of computer crashes were plentiful. Since the advent of microprocessors and inexpensive computer memories, significant advances have been made in fault-tolerant computer architecture that have resulted in highly reliable, durable computer systems. The NRC's requirement for a safety parameter display system (SPDS) stemmed from the results of studies and investigations conducted on the Three Mile Island Unit 2 (TMI-2) accident. An NRC contractor has prepared a handbook of software QA techniques applicable to the nuclear industry, published as NUREG/CR-4640 in August 1987. Currently, the NRC is considering development of an inspection program covering software QA. Future efforts may address verification and validation as applied to expert systems and artificial intelligence programs

  6. Development of a user friendly interface for database querying in natural language by using concepts and means related to artificial intelligence

    International Nuclear Information System (INIS)

    Pujo, Pascal

    1989-01-01

    This research thesis reports the development of a user-friendly interface in natural language for querying a relational database. The developed system differs from usual approaches in its integrated architecture, as the relational model management is totally controlled by the interface. The author first addresses how to store data in order to make them accessible through an interface in natural language, and more precisely how to organise the data so as to impose as few constraints as possible on query formulation. The author then briefly presents techniques related to automatic processing of natural language, and discusses the implications for better user-friendliness and for error processing. The next part reports the study of the developed interface: selection of data processing tools, interface development, data management at the interface level, and information input by the user. The last chapter proposes an overview of possible evolutions for the interface: use of deductive functionalities, use of an extensional base and of an intentional base to deduce facts from knowledge stored in the extensional base, and handling of complex objects [fr

  7. On user behaviour adaptation under interface change

    CSIR Research Space (South Africa)

    Rosman, Benjamin S

    2014-02-01

    Full Text Available International Conference on Intelligent User Interfaces, Haifa, Israel, 24-27 February 2014 On User Behaviour Adaptation Under Interface Change Benjamin Rosman_ Subramanian Ramamoorthy M. M. Hassan Mahmud School of Informatics University of Edinburgh...

  8. Q&A with Jim Collins.

    Science.gov (United States)

    Mast, Carlotta

    2003-01-01

    Applies to public education the principles that begin to explain why some organizations become great and others do not. States that an organization is able to achieve greatness only by pushing in an intelligent and consistent direction for years and even decades. (MLF)

  9. Moving from gamma passing rates to patient DVH-based QA metrics in pretreatment dose QA

    Energy Technology Data Exchange (ETDEWEB)

    Zhen, Heming; Nelms, Benjamin E.; Tome, Wolfgang A. [Department of Medical Physics, University of Wisconsin, Madison, Wisconsin 53705 (United States); Department of Human Oncology, University of Wisconsin, Madison, Wisconsin 53792 and Canis Lupus LLC, Merrimac, Wisconsin 53561 (United States); Department of Medical Physics, University of Wisconsin, Madison, Wisconsin 53705 and Department of Human Oncology, University of Wisconsin, Madison, Wisconsin 53792 (United States)

    2011-10-15

    Purpose: The purpose of this work is to explore the usefulness of the gamma passing rate metric for per-patient, pretreatment dose QA and to validate a novel patient-dose/DVH-based method and its accuracy and correlation. Specifically, correlations between: (1) gamma passing rates for three 3D dosimeter detector geometries vs clinically relevant patient DVH-based metrics; (2) gamma passing rates of whole patient dose grids vs DVH-based metrics; (3) gamma passing rates filtered by region of interest (ROI) vs DVH-based metrics; and (4) the capability of a novel software algorithm that estimates corrected patient dose-DVH based on conventional phantom QA data are analyzed. Methods: Ninety-six unique "imperfect" step-and-shoot IMRT plans were generated by applying four different types of errors to 24 clinical head/neck patients. The 3D patient doses as well as the dose to a cylindrical QA phantom were then recalculated using an error-free beam model to serve as a simulated measurement for comparison. Resulting deviations of the planned vs simulated measured DVH-based metrics were generated, as were gamma passing rates for a variety of difference/distance criteria covering dose-in-phantom comparisons and dose-in-patient comparisons, with the in-patient results calculated both over the whole grid and per-ROI volume. Finally, patient dose and DVH were predicted using the conventional per-beam planar data as input into a commercial "planned dose perturbation" (PDP) algorithm, and the results of these predicted DVH-based metrics were compared to the known values. Results: A range of weak to moderate correlations were found between clinically relevant patient DVH metrics (CTV-D95, parotid D_mean, spinal cord D1cc, and larynx D_mean) and both 3D detector and 3D patient gamma passing rates (3%/3 mm, 2%/2 mm) for dose-in-phantom along with dose-in-patient for both whole patient volume and filtered per-ROI. There was
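    The correlations examined in this record all hinge on the gamma passing rate itself. As a minimal 1D sketch (not the paper's implementation, which works on 3D dose grids), a global-normalization gamma index with 3%/3 mm criteria can be computed as follows; the profile shape and function names are purely illustrative:

    ```python
    import numpy as np

    def gamma_1d(ref_dose, eval_dose, positions, dose_tol=0.03, dist_tol=3.0):
        """1D gamma index with global dose normalization (e.g. 3%/3 mm).
        For each reference point, search all evaluated points for the minimum
        combined dose-difference / distance-to-agreement metric."""
        norm = ref_dose.max()  # global normalization dose
        gammas = np.empty_like(ref_dose, dtype=float)
        for i, (x_r, d_r) in enumerate(zip(positions, ref_dose)):
            dose_term = (eval_dose - d_r) / (dose_tol * norm)
            dist_term = (positions - x_r) / dist_tol
            gammas[i] = np.sqrt(dose_term ** 2 + dist_term ** 2).min()
        return gammas

    # Passing rate: fraction of points with gamma <= 1
    x = np.linspace(0, 100, 201)           # positions in mm (illustrative)
    ref = np.exp(-((x - 50) / 20) ** 2)    # synthetic reference profile
    ev = ref * 1.01                        # simulated 1% global dose error
    rate = np.mean(gamma_1d(ref, ev, x) <= 1.0) * 100
    ```

    With a 1% global error and a 3% dose tolerance, every point passes, which illustrates the record's point that high passing rates can coexist with real (clinically relevant) dose deviations.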

  10. QA role in advanced energy activities: Reductionism, emergence, and functionalism; presuppositions in designing internal QA audits

    Energy Technology Data Exchange (ETDEWEB)

    Bodnarczuk, M.

    1988-06-01

    After a brief overview of the mission of Fermilab, this paper explores some of the problems associated with designing internal QA audits. The paper begins with several examples of how audits should not be designed, then goes on to analyze two types of presuppositions about organizational structure (reductionism and emergence) that can be misleading and skew the data sample if folded too heavily into the checklist. A third type of presupposition (functionalism) is proposed as a viable way of achieving a more well-rounded measure of the performance of an organization, i.e. its effectiveness, not just compliance.

  11. Development of a user-friendly interface for the searching of a data base in natural language while using concepts and means of artificial intelligence

    International Nuclear Information System (INIS)

    Pujo, Pascal

    1989-01-01

    This research thesis aimed at the development of a natural-language-based user-friendly interface for the searching of relational data bases. The author first addresses how to store data which will be accessible through an interface in natural language: this organisation must result in as few constraints as possible in query formulation. He briefly presents techniques related to the automatic processing of natural language, and highlights the need for a more user-friendly interface. Then, he presents the developed interface and outlines the user-friendliness and ergonomics of implemented procedures. He shows how the interface has been designed to deliver information and explanations on its processing. This allows the user to control the relevance of the answer. He also indicates the classification of mistakes and errors which may be present in queries in natural language. He finally gives an overview of possible evolutions of the interface, briefly presents deductive functionalities which could expand data management. The handling of complex objects is also addressed [fr

  12. Artificial intelligence

    CERN Document Server

    Hunt, Earl B

    1975-01-01

    Artificial Intelligence provides information pertinent to the fundamental aspects of artificial intelligence. This book presents the basic mathematical and computational approaches to problems in the artificial intelligence field.Organized into four parts encompassing 16 chapters, this book begins with an overview of the various fields of artificial intelligence. This text then attempts to connect artificial intelligence problems to some of the notions of computability and abstract computing devices. Other chapters consider the general notion of computability, with focus on the interaction bet

  13. Intelligent mechatronics; Intelligent mechatronics

    Energy Technology Data Exchange (ETDEWEB)

    Hashimoto, H. [The University of Tokyo, Tokyo (Japan). Institute of Industrial Science

    1995-10-01

    Intelligent mechatronics (IM) was explained as follows: a study of IM essentially targets the realization of a robot, but at the present stage the target is the creation of new value through the intellectualization of machines, that is, a combination of the information infrastructure and the intelligent machine system. IM is also thought to be constituted of positively used computers and micromechatronics. The paper next introduces examples of IM study, mainly those the author is concerned with, as shown below: sensor gloves, robot hands, robot eyes, teleoperation, three-dimensional object recognition, mobile robots, magnetic bearings, construction of a remote-controlled unmanned dam, robot networks, sensitivity communication using neuro baby, etc. 27 figs.

  14. A proposal of ubiquitous fuzzy computing for ambient Intelligence

    NARCIS (Netherlands)

    Acampora, G.; Loia, V.

    2008-01-01

    Ambient Intelligence is considered as the composition of three emergent technologies: Ubiquitous Computing, Ubiquitous Communication and Intelligent User Interfaces. The aim of integrating the aforesaid technologies is to widen the interaction between human beings and information technology

  15. Ubiquitous fuzzy computing in open ambient intelligence environments

    NARCIS (Netherlands)

    Acampora, G.; Loia, V.

    2006-01-01

    Ambient intelligence (AmI) is considered as the composition of three emergent technologies: ubiquitous computing, ubiquitous communication and intelligent user interfaces. The aim of integrating the aforesaid technologies is to widen the interaction between human beings and information

  16. Artificial Intelligence and Moral intelligence

    OpenAIRE

    Laura Pana

    2008-01-01

    We discuss the thesis that the implementation of a moral code in the behaviour of artificial intelligent systems needs a specific form of human and artificial intelligence, not just an abstract intelligence. We present intelligence as a system with an internal structure and the structural levels of the moral system, as well as certain characteristics of artificial intelligent agents which can/must be treated as 1- individual entities (with a complex, specialized, autonomous or self-determined,...

  17. Q&A: The AI composer

    Science.gov (United States)

    Spinney, Laura

    2017-09-01

    Computer scientist Luc Steels uses artificial intelligence to explore the origins and evolution of language. He is best known for his 1999-2001 Talking Heads Experiment, in which robots had to construct a language from scratch to communicate with each other. Now Steels, who works at the Free University of Brussels (VUB), has composed an opera based on the legend of Faust, with a twenty-first-century twist. He talks about Mozart as a nascent computer programmer, how music maps onto language, and the blurred boundaries of a digitized world.

  18. The GSPC: Newest Franchise in al-Qa'ida's Global Jihad

    National Research Council Canada - National Science Library

    Boudali, Lianne K

    2007-01-01

    ... of support in Europe. The alignment of the GSPC with al-Qa'ida represents a significant change in the group's strategy; however, its decision to join al-Qa'ida's global jihad should be understood as an act of desperation...

  19. Artificial Intelligence.

    Science.gov (United States)

    Information Technology Quarterly, 1985

    1985-01-01

    This issue of "Information Technology Quarterly" is devoted to the theme of "Artificial Intelligence." It contains two major articles: (1) Artificial Intelligence and Law" (D. Peter O'Neill and George D. Wood); (2) "Artificial Intelligence: A Long and Winding Road" (John J. Simon, Jr.). In addition, it contains two sidebars: (1) "Calculating and…

  20. Competitive Intelligence.

    Science.gov (United States)

    Bergeron, Pierrette; Hiller, Christine A.

    2002-01-01

    Reviews the evolution of competitive intelligence since 1994, including terminology and definitions and analytical techniques. Addresses the issue of ethics; explores how information technology supports the competitive intelligence process; and discusses education and training opportunities for competitive intelligence, including core competencies…

  1. Statistical process control analysis for patient-specific IMRT and VMAT QA.

    Science.gov (United States)

    Sanghangthum, Taweap; Suriyapee, Sivalee; Srisatit, Somyot; Pawlicki, Todd

    2013-05-01

    This work applied statistical process control to establish the control limits of the % gamma pass of patient-specific intensity modulated radiotherapy (IMRT) and volumetric modulated arc therapy (VMAT) quality assurance (QA), and to evaluate the efficiency of the QA process by using the process capability index (Cpml). A total of 278 IMRT QA plans in nasopharyngeal carcinoma were measured with MapCHECK, while 159 VMAT QA plans were undertaken with ArcCHECK. Six-megavolt beams with nine fields were used for the IMRT plans, and 2.5 arcs were used to generate the VMAT plans. The gamma (3%/3 mm) criteria were used to evaluate the QA plans. The % gamma passes were plotted on a control chart. The first 50 data points were employed to calculate the control limits. The Cpml was calculated to evaluate the capability of the IMRT/VMAT QA process. The results showed higher systematic errors in IMRT QA than VMAT QA due to the more complicated setup used in IMRT QA. The variation of random errors was also larger in IMRT QA than VMAT QA because the VMAT plan has more continuity of dose distribution. The average % gamma pass was 93.7% ± 3.7% for IMRT and 96.7% ± 2.2% for VMAT. The Cpml value of IMRT QA was 1.60 and VMAT QA was 1.99, which implied that the VMAT QA process was more accurate than the IMRT QA process. Our lower control limit for % gamma pass of IMRT is 85.0%, while the limit for VMAT is 90%. Both the IMRT and VMAT QA processes are good quality because Cpml values are higher than 1.0.
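    The workflow in this record (control limits estimated from the first 50 points, then a capability index against the lower specification limit) can be sketched as below. This is an assumption-laden illustration, not the authors' code: it uses the conventional individuals-chart factor of 2.66 on the average moving range, and a one-sided Cpml of the form (mean − LSL) / (3 · sqrt(s² + (mean − T)²)) with target T = 100% gamma pass.

    ```python
    import numpy as np

    def spc_limits(samples):
        """Individuals (I) control chart limits from a baseline sample,
        using the standard 2.66 * average-moving-range factor (assumed form)."""
        x = np.asarray(samples, dtype=float)
        center = x.mean()
        m_rbar = np.abs(np.diff(x)).mean()  # average moving range
        return center - 2.66 * m_rbar, center, center + 2.66 * m_rbar

    def cpml(samples, lsl, target=100.0):
        """One-sided lower process capability index (assumed form):
        Cpml = (mean - LSL) / (3 * sqrt(var + (mean - target)^2))."""
        x = np.asarray(samples, dtype=float)
        sigma_t = np.sqrt(x.var(ddof=1) + (x.mean() - target) ** 2)
        return (x.mean() - lsl) / (3.0 * sigma_t)

    # Synthetic baseline mimicking the VMAT figures quoted above (96.7% ± 2.2%)
    rng = np.random.default_rng(0)
    baseline = rng.normal(96.7, 2.2, 50)   # first 50 % gamma pass results
    lcl, cl, ucl = spc_limits(baseline)
    capability = cpml(baseline, lsl=90.0)  # 90% lower limit, as for VMAT
    ```

    A Cpml above 1.0 then indicates the QA process comfortably clears the lower limit, which is the sense in which the record calls both processes "good quality".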

  2. Utility view on QA/QC of WWER-440 fuel design and manufacture

    International Nuclear Information System (INIS)

    Vesely, P.

    1999-01-01

    In this lecture the legislation implemented in the Czech Republic, the QA/QC system at CEZ, the demonstration and development program (from the purchaser's point of view), audits of the QA/QC system for fuel design and manufacturing, as well as QA/QC records are discussed

  3. THE IMPORTANCE OF A SUCCESSFUL QUALITY ASSURANCE (QA) PROGRAM FROM A RESEARCH MANAGER'S PERSPECTIVE

    Science.gov (United States)

    The paper discusses the Air Pollution Prevention and Control Division's Quality Assurance (QA) program and the approaches used to meet QA requirements in the Division. The presentation is a technical manager's perspective of the Division's requirements for and approach to QA in i...

  4. Intelligence Ethics:

    DEFF Research Database (Denmark)

    Rønn, Kira Vrist

    2016-01-01

    Questions concerning what constitutes a morally justified conduct of intelligence activities have received increased attention in recent decades. However, intelligence ethics is not yet homogeneous or embedded as a solid research field. The aim of this article is to sketch the state of the art of intelligence ethics and point out subjects for further scrutiny in future research. The review clusters the literature on intelligence ethics into two groups: respectively, contributions on external topics (i.e., the accountability of and the public trust in intelligence agencies) and internal topics (i.e., the search for an ideal ethical framework for intelligence actions). The article concludes that there are many holes to fill for future studies on intelligence ethics both in external and internal discussions. Thus, the article is an invitation, especially to moral philosophers and political theorists...

  5. Intelligent Decision Technologies : Proceedings of the 4th International Conference on Intelligent Decision Technologies

    CERN Document Server

    Watanabe, Toyohide; Phillips-Wren, Gloria; Howlett, Robert; Jain, Lakhmi

    2012-01-01

    The Intelligent Decision Technologies (IDT) International Conference encourages an interchange of research on intelligent systems and intelligent technologies that enhance or improve decision making. The focus of IDT is interdisciplinary and includes research on all aspects of intelligent decision technologies, from fundamental development to real applications. IDT has the potential to expand their support of decision making in such areas as finance, accounting, marketing, healthcare, medical and diagnostic systems, military decisions, production and operation, networks, traffic management, crisis response, human-machine interfaces, financial and stock market monitoring and prediction, and robotics. Intelligent decision systems implement advances in intelligent agents, fuzzy logic, multi-agent systems, artificial neural networks, and genetic algorithms, among others.  Emerging areas of active research include virtual decision environments, social networking, 3D human-machine interfaces, cognitive interfaces,...

  6. An intelligent GPIB controller

    International Nuclear Information System (INIS)

    Wikne, J.C.

    1987-12-01

    An intelligent GPIB (General Purpose Interface Bus) controller is described. It employs an autonomous slave CPU together with a dedicated controller/talker/listener chip to handle the GPIB bus protocol, thus freeing the host computer from this time-consuming task. Distribution of a large part of the necessary software to the slave side assures that the system can be implemented on virtually any computer with a minimum of effort

  7. QA engineering for the LCP USA magnet manufacturers

    International Nuclear Information System (INIS)

    Childress, C.E.; Batey, J.E.; Burn, P.B.

    1981-01-01

    This paper describes the QA and QC efforts and results used in fabricating the superconducting magnets of competing designs being developed by American manufacturers for testing in the ORNL Large Coil Test Facility. Control of the design, materials and processes to assure proper functioning of the magnets in the test facility, as well as the content of the archival data being compiled, is discussed

  8. Discussion of QA grading for AP1000 NP plant

    International Nuclear Information System (INIS)

    Luo Shuiyun; Zhang Qingchuan

    2012-01-01

    The grading method of quality assurance for the following AP1000 project is presented based on the Westinghouse classification principle, referring to the classification method of the AP1000 self-reliance supporting project and considering the factors of classification, which can meet the requirements of domestic nuclear safety regulation and standard of the QA classification. (authors)

  9. Q&A: Grace Anne Koppel, Living Well with COPD

    Science.gov (United States)

    ... their own lives back is the most rewarding thing we have ever done. Read More "The Challenge of COPD" Articles Q&A: Grace Anne Koppel, Living Well with COPD / What is COPD? / What Causes COPD? / Getting Tested / Am I at Risk? / COPD Quiz Fall ...

  10. Q&A: The Basics of California's School Finance System

    Science.gov (United States)

    EdSource, 2006

    2006-01-01

    In a state as large and complex as California, education financing can become as complicated as rocket science. This two-page Q&A provides a brief, easy-to-understand explanation of California's school finance system and introduces the issues of its adequacy and equity. A list of resources providing additional information is provided.

  11. A community Q&A for HEP Software and Computing ?

    CERN Multimedia

    CERN. Geneva

    2016-01-01

    How often do you use StackOverflow or ServerFault to find information in your daily work? Would you be interested in a community Q&A site for HEP Software and Computing, for instance a dedicated StackExchange site? I looked into this question...

  12. Process control analysis of IMRT QA: implications for clinical trials

    International Nuclear Information System (INIS)

    Pawlicki, Todd; Rice, Roger K; Yoo, Sua; Court, Laurence E; McMillan, Sharon K; Russell, J Donald; Pacyniak, John M; Woo, Milton K; Basran, Parminder S; Boyer, Arthur L; Bonilla, Claribel

    2008-01-01

    The purpose of this study is two-fold: first is to investigate the process of IMRT QA using control charts and second is to compare control chart limits to limits calculated using the standard deviation (σ). Head and neck and prostate IMRT QA cases from seven institutions in both academic and community settings are considered. The percent difference between the point dose measurement in phantom and the corresponding result from the treatment planning system (TPS) is used for analysis. The average of the percent difference calculations defines the accuracy of the process and is called the process target. This represents the degree to which the process meets the clinical goal of 0% difference between the measurements and TPS. IMRT QA process ability defines the ability of the process to meet clinical specifications (e.g. 5% difference between the measurement and TPS). The process ability is defined in two ways: (1) the half-width of the control chart limits, and (2) the half-width of ±3σ limits. Process performance is characterized as being in one of four possible states that describes the stability of the process and its ability to meet clinical specifications. For the head and neck cases, the average process target across institutions was 0.3% (range: -1.5% to 2.9%). The average process ability using control chart limits was 7.2% (range: 5.3% to 9.8%) compared to 6.7% (range: 5.3% to 8.2%) using standard deviation limits. For the prostate cases, the average process target across the institutions was 0.2% (range: -1.8% to 1.4%). The average process ability using control chart limits was 4.4% (range: 1.3% to 9.4%) compared to 5.3% (range: 2.3% to 9.8%) using standard deviation limits. Using the standard deviation to characterize IMRT QA process performance resulted in processes being preferentially placed in one of the four states. This is in contrast to using control charts for process characterization where the IMRT QA processes were spread over three of the
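
The two definitions of process ability in this record can be sketched in a few lines. Below is an illustrative Python sketch (not the study's code; the percent-difference data are hypothetical) computing individuals control-chart limits from the average moving range alongside conventional ±3σ limits:

```python
import statistics

def control_chart_limits(diffs):
    """Individuals (X) chart limits: center +/- 2.66 * mean moving range.

    2.66 is the standard E2 constant for moving ranges of size 2."""
    center = statistics.mean(diffs)
    moving_ranges = [abs(b - a) for a, b in zip(diffs, diffs[1:])]
    half_width = 2.66 * statistics.mean(moving_ranges)
    return center - half_width, center + half_width

def three_sigma_limits(diffs):
    """Conventional mean +/- 3 * sample standard deviation limits."""
    center = statistics.mean(diffs)
    half_width = 3 * statistics.stdev(diffs)
    return center - half_width, center + half_width

# Hypothetical percent differences (measurement minus TPS) for one institution
diffs = [0.5, -1.2, 0.8, 2.1, -0.3, 1.4, 0.2, -0.9, 1.1, 0.6]
print(control_chart_limits(diffs))
print(three_sigma_limits(diffs))
```

The half-width of each limit pair corresponds to the two process-ability measures compared in the study: control-chart limits reflect short-term (point-to-point) variation, while ±3σ limits reflect overall spread.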

  13. Intelligence Naturelle et Intelligence Artificielle

    OpenAIRE

    Dubois, Daniel

    2011-01-01

    This article presents a systemic approach to the concept of natural intelligence, with the objective of creating an artificial intelligence. Natural intelligence, human and non-human animal, is a function composed of faculties for knowing and understanding. Moreover, natural intelligence remains inseparable from its structure, namely the organs of the brain and body. The temptation is great to endow computer systems with an artificial intelligence ...

  14. Design and implementation of intelligent vehicle system based on brain-computer interface%基于脑-机接口的智能小车系统设计与实现

    Institute of Scientific and Technical Information of China (English)

    陈东伟; 吴方; 王震; 韩娜; 黄家良; 韦逸成; 林焕杨

    2013-01-01

    This system builds MindAuto, a mind-controlled race car system, by combining an embedded platform with a brain-computer interface. First, an EEG acquisition device (MindReader) was built around the TGAM chip to collect EEG data. Second, the MindReader was connected to a PC over Bluetooth, and the acquired data (raw signal, attention, meditation, and blink strength) were transferred wirelessly and quantized with the eSense algorithm. Third, an Arduino-based intelligent vehicle was connected to the PC over Bluetooth and controlled through its I/O interface using the quantized EEG data. Finally, the control system was demonstrated on a multi-functional track model.
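
The control step in this record, a quantized attention value driving the vehicle's I/O, might look like the following sketch. The threshold and PWM range here are illustrative assumptions, not values from the paper:

```python
def attention_to_speed(attention, threshold=40, max_pwm=255):
    """Map an eSense attention value (0-100) to a PWM duty value.

    Below the threshold the vehicle stops; above it, speed scales linearly.
    threshold and max_pwm are illustrative assumptions, not from the paper."""
    attention = max(0, min(100, attention))  # clamp to the eSense range
    if attention < threshold:
        return 0
    return round((attention - threshold) / (100 - threshold) * max_pwm)

for a in (10, 40, 70, 100):
    print(a, attention_to_speed(a))
```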

  15. Embedded systems design issues in ambient intelligence

    NARCIS (Netherlands)

    Aarts, E.H.L.; Roovers, R.L.J.; Basten, A.A.; Geilen, M.C.W.; Groot, de H.W.H.

    2003-01-01

    The vision of ambient intelligence opens a world of unprecedented experiences: the interaction of people with electronic devices is changed as context awareness, natural interfaces and ubiquitous availability of information are realized. We analyze the consequences of the ambient intelligence

  16. Freedom and privacy in ambient intelligence

    NARCIS (Netherlands)

    Brey, Philip A.E.

    2006-01-01

    This paper analyzes ethical aspects of the new paradigm of Ambient Intelligence, which is a combination of Ubiquitous Computing and Intelligent User Interfaces (IUI’s). After an introduction to the approach, two key ethical dimensions will be analyzed: freedom and privacy. It is argued that Ambient

  17. IMRT QA using machine learning: A multi-institutional validation.

    Science.gov (United States)

    Valdes, Gilmer; Chan, Maria F; Lim, Seng Boh; Scheuermann, Ryan; Deasy, Joseph O; Solberg, Timothy D

    2017-09-01

    To validate a machine learning approach to Virtual intensity-modulated radiation therapy (IMRT) quality assurance (QA) for accurately predicting gamma passing rates using different measurement approaches at different institutions. A Virtual IMRT QA framework was previously developed using a machine learning algorithm based on 498 IMRT plans, in which QA measurements were performed using diode-array detectors and a 3% local/3 mm with 10% threshold at Institution 1. An independent set of 139 IMRT measurements from a different institution, Institution 2, with QA data based on portal dosimetry using the same gamma index, was used to test the mathematical framework. Only pixels with ≥10% of the maximum calibrated units (CU) or dose were included in the comparison. Plans were characterized by 90 different complexity metrics. A weighted Poisson regression with Lasso regularization was trained to predict passing rates using the complexity metrics as input. The methodology predicted passing rates within 3% accuracy for all composite plans measured using diode-array detectors at Institution 1, and within 3.5% for 120 of 139 plans using portal dosimetry measurements performed on a per-beam basis at Institution 2. The remaining measurements (19) had large areas of low CU, where portal dosimetry has a larger disagreement with the calculated dose and as such, the failure was expected. These beams need further modeling in the treatment planning system to correct the under-response in low-dose regions. Important features selected by Lasso to predict gamma passing rates were as follows: complete irradiated area outline (CIAO), jaw position, fraction of MLC leaves with gaps smaller than 20 or 5 mm, fraction of area receiving less than 50% of the total CU, fraction of the area receiving dose from penumbra, weighted average irregularity factor, and duty cycle. We have demonstrated that Virtual IMRT QA can predict passing rates using different measurement techniques and across multiple
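
The study trains a weighted Poisson regression with Lasso regularization on 90 complexity metrics. As a simplified illustration of how an L1 penalty selects a sparse subset of predictors, here is a plain squared-loss Lasso solved by cyclic coordinate descent, in pure Python. This is not the authors' model, and the two-feature data set is hypothetical:

```python
def lasso_coordinate_descent(X, y, lam, iters=200):
    """Minimize 0.5*||y - Xw||^2 + lam*||w||_1 by cyclic coordinate descent."""
    n, p = len(X), len(X[0])
    w = [0.0] * p
    for _ in range(iters):
        for j in range(p):
            # correlation of feature j with the residual excluding feature j
            rho = sum(X[i][j] * (y[i] - sum(X[i][k] * w[k]
                                            for k in range(p) if k != j))
                      for i in range(n))
            z = sum(X[i][j] ** 2 for i in range(n))
            # soft-thresholding: weakly correlated features are zeroed out
            if rho > lam:
                w[j] = (rho - lam) / z
            elif rho < -lam:
                w[j] = (rho + lam) / z
            else:
                w[j] = 0.0
    return w

# Hypothetical data: passing rate depends on metric 0 only; metric 1 is noise
X = [[1.0, 0.3], [2.0, -0.2], [3.0, 0.1], [4.0, 0.05]]
y = [2.0, 4.0, 6.0, 8.0]
print(lasso_coordinate_descent(X, y, lam=0.5))
```

With a moderate penalty, the irrelevant second coefficient is driven exactly to zero, which is the mechanism by which the study's Lasso picks out the informative complexity metrics.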

  18. Artificial Intelligence.

    Science.gov (United States)

    Wash, Darrel Patrick

    1989-01-01

    Making a machine seem intelligent is not easy. As a consequence, demand has been rising for computer professionals skilled in artificial intelligence and is likely to continue to go up. These workers develop expert systems and solve the mysteries of machine vision, natural language processing, and neural networks. (Editor)

  19. Intelligent Design

    DEFF Research Database (Denmark)

    Hjorth, Poul G.

    2005-01-01

    The notion that nature is designed by a divine 'intelligence' is a beautiful philosophical principle. Theories of Intelligent Design as a science-based theory, on the other hand, are utterly dreadful.

  20. QA [quality assurance] at Fermilab; the hermeneutics of NQA-1

    International Nuclear Information System (INIS)

    Bodnarczuk, M.

    1988-06-01

    This paper opens with a brief overview of the purpose of Fermilab and a historical synopsis of the development and current status of quality assurance (QA) at the Laboratory. The paper subsequently addresses some of the more important aspects of interpreting the national standard ANSI/ASME NQA-1 in pure research environments like Fermilab. Highlights of this discussion include: what hermeneutics is and why hermeneutical considerations are relevant for QA; a critical analysis of NQA-1 focusing on teleological aspects of the standard; and a description of the hermeneutical approach to NQA-1 used at Fermilab, which attempts to capture the true intent of the document without violating the deeply ingrained traditions of quality standards and peer review that have been foundational to the overall success of the paradigms of high-energy physics

  1. QA (quality assurance) at Fermilab; the hermeneutics of NQA-1

    Energy Technology Data Exchange (ETDEWEB)

    Bodnarczuk, M.

    1988-06-01

    This paper opens with a brief overview of the purpose of Fermilab and a historical synopsis of the development and current status of quality assurance (QA) at the Laboratory. The paper subsequently addresses some of the more important aspects of interpreting the national standard ANSI/ASME NQA-1 in pure research environments like Fermilab. Highlights of this discussion include: what hermeneutics is and why hermeneutical considerations are relevant for QA; a critical analysis of NQA-1 focusing on teleological aspects of the standard; and a description of the hermeneutical approach to NQA-1 used at Fermilab, which attempts to capture the true intent of the document without violating the deeply ingrained traditions of quality standards and peer review that have been foundational to the overall success of the paradigms of high-energy physics.

  2. USGS QA Plan: Certification of digital airborne mapping products

    Science.gov (United States)

    Christopherson, J.

    2007-01-01

    To facilitate acceptance of new digital technologies in aerial imaging and mapping, the US Geological Survey (USGS) and its partners have launched a Quality Assurance (QA) Plan for Digital Aerial Imagery. This should provide a foundation for the quality of digital aerial imagery and products. It introduces broader considerations regarding processes employed by aerial flyers in collecting, processing and delivering data, and provides training and information for US producers and users alike.

  3. QA/QC - Practices and procedures in WWER fuel management

    International Nuclear Information System (INIS)

    Keselica, M.

    1999-01-01

    Construction time schedule and commissioning (unit by unit) of the NPP Dukovany as well as structure of electricity generation in the CEZ in 1998 are reviewed. History of QA/QC system establishment and rules (system standards) as well as organization chart of the NPP Dukovany and quality manual of reactor physics department are presented. Standards of worker's qualification and nuclear fuel inspections are discussed. Fuel reliability indicators are presented

  4. Institutional Patient-specific IMRT QA Does Not Predict Unacceptable Plan Delivery

    Energy Technology Data Exchange (ETDEWEB)

    Kry, Stephen F., E-mail: sfkry@mdanderson.org [Imaging and Radiation Oncology Core at Houston, Department of Radiation Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Molineu, Andrea [Imaging and Radiation Oncology Core at Houston, Department of Radiation Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Kerns, James R.; Faught, Austin M.; Huang, Jessie Y.; Pulliam, Kiley B.; Tonigan, Jackie [Imaging and Radiation Oncology Core at Houston, Department of Radiation Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); The University of Texas Health Science Center Houston, Graduate School of Biomedical Sciences, Houston, Texas (United States); Alvarez, Paola [Imaging and Radiation Oncology Core at Houston, Department of Radiation Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Stingo, Francesco [The University of Texas Health Science Center Houston, Graduate School of Biomedical Sciences, Houston, Texas (United States); Department of Biostatistics, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Followill, David S. [Imaging and Radiation Oncology Core at Houston, Department of Radiation Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); The University of Texas Health Science Center Houston, Graduate School of Biomedical Sciences, Houston, Texas (United States)

    2014-12-01

    Purpose: To determine whether in-house patient-specific intensity modulated radiation therapy quality assurance (IMRT QA) results predict Imaging and Radiation Oncology Core (IROC)-Houston phantom results. Methods and Materials: IROC Houston's IMRT head and neck phantoms have been irradiated by numerous institutions as part of clinical trial credentialing. We retrospectively compared these phantom results with those of in-house IMRT QA (following the institution's clinical process) for 855 irradiations performed between 2003 and 2013. The sensitivity and specificity of IMRT QA to detect unacceptable or acceptable plans were determined relative to the IROC Houston phantom results. Additional analyses evaluated specific IMRT QA dosimeters and analysis methods. Results: IMRT QA universally showed poor sensitivity relative to the head and neck phantom, that is, poor ability to predict a failing IROC Houston phantom result. Depending on how the IMRT QA results were interpreted, overall sensitivity ranged from 2% to 18%. For different IMRT QA methods, sensitivity ranged from 3% to 54%. Although the observed sensitivity was particularly poor at clinical thresholds (eg 3% dose difference or 90% of pixels passing gamma), receiver operator characteristic analysis indicated that no threshold showed good sensitivity and specificity for the devices evaluated. Conclusions: IMRT QA is not a reasonable replacement for a credentialing phantom. Moreover, the particularly poor agreement between IMRT QA and the IROC Houston phantoms highlights surprising inconsistency in the QA process.
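
Sensitivity and specificity in this study are computed by treating the IROC phantom result as ground truth for each in-house QA decision. A minimal sketch (the counts below are illustrative only, not the study's data):

```python
def sensitivity_specificity(results):
    """results: (qa_flagged_fail, phantom_actually_failed) boolean pairs.

    Sensitivity = fraction of phantom failures that in-house QA caught;
    specificity = fraction of phantom passes that in-house QA also passed."""
    tp = sum(1 for q, p in results if q and p)
    fn = sum(1 for q, p in results if not q and p)
    tn = sum(1 for q, p in results if not q and not p)
    fp = sum(1 for q, p in results if q and not p)
    sensitivity = tp / (tp + fn) if (tp + fn) else None
    specificity = tn / (tn + fp) if (tn + fp) else None
    return sensitivity, specificity

# Illustrative data only: in-house QA catches 2 of 10 phantom failures
results = ([(True, True)] * 2 + [(False, True)] * 8 +
           [(False, False)] * 85 + [(True, False)] * 5)
print(sensitivity_specificity(results))
```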

  5. KINERJA AKADEMIK PASCA SERTIFIKASI AUN-QA PADA PROGRAM STUDI DI INSTITUT PERTANIAN BOGOR

    Directory of Open Access Journals (Sweden)

    Adelyna Adelyna

    2016-05-01

    The aim of this research is to evaluate the academic performance progress of six study programs of IPB after AUN-QA certification. The research was a case study of six study programs that had been certified by AUN-QA through December 2014. It was conducted with the objectives of defining the relevant indicators of the BSC IPB and AUN-QA criteria, analyzing the AUN-QA criteria values after certification, analyzing academic performance based on the BSC KPIs after certification, and analyzing the problems in improving academic performance as a basis for formulating strategies to improve academic quality. The method used in this research was the balanced scorecard (BSC) approach. The results showed that the AUN-QA certification contains 15 relevant criteria and supports the achievement of the BSC IPB. The key performance indicators (KPI) of the BSC IPB supported by the AUN-QA criteria comprise 21 of the 33 BSC IPB indicators, 14 of which are delegated to department-level BSC indicators. The AUN-QA criteria values for the study programs have increased, with the highest criterion value in student quality and the lowest in support staff quality. The weak criteria requiring improvement include support staff quality, student assessment, stakeholder feedback, and program specification. Keywords: AUN-QA certification, academic performance, balanced scorecard

  6. Institutional Patient-specific IMRT QA Does Not Predict Unacceptable Plan Delivery

    International Nuclear Information System (INIS)

    Kry, Stephen F.; Molineu, Andrea; Kerns, James R.; Faught, Austin M.; Huang, Jessie Y.; Pulliam, Kiley B.; Tonigan, Jackie; Alvarez, Paola; Stingo, Francesco; Followill, David S.

    2014-01-01

    Purpose: To determine whether in-house patient-specific intensity modulated radiation therapy quality assurance (IMRT QA) results predict Imaging and Radiation Oncology Core (IROC)-Houston phantom results. Methods and Materials: IROC Houston's IMRT head and neck phantoms have been irradiated by numerous institutions as part of clinical trial credentialing. We retrospectively compared these phantom results with those of in-house IMRT QA (following the institution's clinical process) for 855 irradiations performed between 2003 and 2013. The sensitivity and specificity of IMRT QA to detect unacceptable or acceptable plans were determined relative to the IROC Houston phantom results. Additional analyses evaluated specific IMRT QA dosimeters and analysis methods. Results: IMRT QA universally showed poor sensitivity relative to the head and neck phantom, that is, poor ability to predict a failing IROC Houston phantom result. Depending on how the IMRT QA results were interpreted, overall sensitivity ranged from 2% to 18%. For different IMRT QA methods, sensitivity ranged from 3% to 54%. Although the observed sensitivity was particularly poor at clinical thresholds (eg 3% dose difference or 90% of pixels passing gamma), receiver operator characteristic analysis indicated that no threshold showed good sensitivity and specificity for the devices evaluated. Conclusions: IMRT QA is not a reasonable replacement for a credentialing phantom. Moreover, the particularly poor agreement between IMRT QA and the IROC Houston phantoms highlights surprising inconsistency in the QA process

  7. Research and applications: Artificial intelligence

    Science.gov (United States)

    Chaitin, L. J.; Duda, R. O.; Johanson, P. A.; Raphael, B.; Rosen, C. A.; Yates, R. A.

    1970-01-01

    The program is reported for developing techniques in artificial intelligence and their application to the control of mobile automatons for carrying out tasks autonomously. Visual scene analysis, short-term problem solving, and long-term problem solving are discussed along with the PDP-15 simulator, LISP-FORTRAN-MACRO interface, resolution strategies, and cost effectiveness.

  8. System for intelligent teleoperation research

    International Nuclear Information System (INIS)

    Orlando, N.E.

    1983-01-01

    The Automation Technology Branch of NASA Langley Research Center is developing a research capability in the field of artificial intelligence, particularly as applicable in teleoperator/robotics development for remote space operations. As a testbed for experimentation in these areas, a system concept has been developed and is being implemented. This system, termed DAISIE (Distributed Artificially Intelligent System for Interacting with the Environment), interfaces the key processes of perception, reasoning, and manipulation by linking hardware sensors and manipulators to a modular artificial intelligence (AI) software system in a hierarchical control structure. Verification experiments have been performed: one experiment used a blocksworld database and planner embedded in the DAISIE system to intelligently manipulate a simple physical environment; the other experiment implemented a joint-space collision avoidance algorithm. Continued system development is planned

  9. Communications interface for plant monitoring system

    International Nuclear Information System (INIS)

    Lee, K.L.; Morgan, F.A.

    1988-01-01

    This paper presents the communications interface for an intelligent color graphic system which PSE&G developed as part of a plant monitoring system. The intelligent graphic system is designed to off-load traditional host functions such as dynamic graphic updates, keyboard handling and alarm display. The distributed system's data and synchronization problems and their solutions are discussed

  10. IMRT QA: Selecting gamma criteria based on error detection sensitivity

    Energy Technology Data Exchange (ETDEWEB)

    Steers, Jennifer M. [Department of Radiation Oncology, Cedars-Sinai Medical Center, Los Angeles, California 90048 and Physics and Biology in Medicine IDP, David Geffen School of Medicine, University of California, Los Angeles, Los Angeles, California 90095 (United States); Fraass, Benedick A., E-mail: benedick.fraass@cshs.org [Department of Radiation Oncology, Cedars-Sinai Medical Center, Los Angeles, California 90048 (United States)

    2016-04-15

    Purpose: The gamma comparison is widely used to evaluate the agreement between measurements and treatment planning system calculations in patient-specific intensity modulated radiation therapy (IMRT) quality assurance (QA). However, recent publications have raised concerns about the lack of sensitivity when employing commonly used gamma criteria. Understanding the actual sensitivity of a wide range of different gamma criteria may allow the definition of more meaningful gamma criteria and tolerance limits in IMRT QA. We present a method that allows the quantitative determination of gamma criteria sensitivity to induced errors which can be applied to any unique combination of device, delivery technique, and software utilized in a specific clinic. Methods: A total of 21 DMLC IMRT QA measurements (ArcCHECK®, Sun Nuclear) were compared to QA plan calculations with induced errors. Three scenarios were studied: MU errors, multi-leaf collimator (MLC) errors, and the sensitivity of the gamma comparison to changes in penumbra width. Gamma comparisons were performed between measurements and error-induced calculations using a wide range of gamma criteria, resulting in a total of over 20 000 gamma comparisons. Gamma passing rates for each error class and case were graphed against error magnitude to create error curves in order to represent the range of missed errors in routine IMRT QA using 36 different gamma criteria. Results: This study demonstrates that systematic errors and case-specific errors can be detected by the error curve analysis. Depending on the location of the error curve peak (e.g., not centered about zero), 3%/3 mm threshold = 10% at 90% pixels passing may miss errors as large as 15% MU errors and ±1 cm random MLC errors for some cases. As the dose threshold parameter was increased for a given %Diff/distance-to-agreement (DTA) setting, error sensitivity was increased by up to a factor of two for select cases. This increased sensitivity with increasing dose
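
The gamma comparison this record evaluates combines a dose-difference criterion with a distance-to-agreement (DTA) criterion. A minimal 1D global-gamma sketch (a simplification for illustration, not the study's implementation; the profile data are hypothetical):

```python
import math

def gamma_point(r_pos, r_dose, eval_points, dta=3.0, dd=0.03, d_norm=1.0):
    """Gamma for one reference sample: the minimum generalized distance to any
    evaluated sample, with position scaled by DTA (mm) and dose by dd*d_norm."""
    return min(math.sqrt(((e_pos - r_pos) / dta) ** 2 +
                         ((e_dose - r_dose) / (dd * d_norm)) ** 2)
               for e_pos, e_dose in eval_points)

def passing_rate(ref_points, eval_points, **criteria):
    """Percentage of reference samples with gamma <= 1."""
    gammas = [gamma_point(p, d, eval_points, **criteria) for p, d in ref_points]
    return 100.0 * sum(g <= 1.0 for g in gammas) / len(gammas)

# Identical hypothetical profiles: every point passes with gamma = 0
profile = [(float(x), math.exp(-x * x / 50.0)) for x in range(-10, 11)]
print(passing_rate(profile, profile, dta=3.0, dd=0.03))
```

Inducing an error (e.g., shifting or scaling one of the profiles) and re-evaluating the passing rate over a grid of (dd, dta, threshold) settings is the essence of the error-curve analysis described above.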

  11. Graded approach for establishment of QA requirements for Type B packaging of radioactive material

    International Nuclear Information System (INIS)

    Fabian, R.R.; Woodruff, K.C.

    1988-01-01

    A study that was conducted by the Nuclear Regulatory Commission for the U.S. Congress to assess the effectiveness of quality assurance (QA) activities has demonstrated a need to modify and improve the application of QA requirements for the nuclear industry. As a result, the packaging community, along with the nuclear industry as a whole, has taken action to increase the efficacy of the QA function. The results of the study indicate that a graded approach for establishing QA requirements is the preferred method. The essence of the graded approach is the establishment of applicable QA requirements to an extent consistent with the importance to safety of an item, component, system, or activity. This paper describes the process that is used to develop the graded approach for QA requirements pertaining to Type B packaging

  12. Intelligent playgrounds

    DEFF Research Database (Denmark)

    Larsen, Lasse Juel

    2009-01-01

    This paper examines play, gaming and learning in regard to intelligent playware developed for outdoor use. The key question is how these novel artefacts influence the concepts of play, gaming and learning. Until now, play and games have been understood as different activities. This paper examines whether the sharp differentiation between the two can be upheld in regard to intelligent playware for outdoor use. Play and game activities will be analysed and viewed in conjunction with learning contexts. This paper will stipulate that intelligent playware facilitates rapid shifts in contexts...

  13. Artificial intelligence

    CERN Document Server

    Ennals, J R

    1987-01-01

    Artificial Intelligence: State of the Art Report is a two-part report consisting of the invited papers and the analysis. The editor first gives an introduction to the invited papers before presenting each paper and the analysis, and then concludes with the list of references related to the study. The invited papers explore the various aspects of artificial intelligence. The analysis part assesses the major advances in artificial intelligence and provides a balanced analysis of the state of the art in this field. The Bibliography compiles the most important published material on the subject of

  14. Artificial Intelligence

    CERN Document Server

    Warwick, Kevin

    2011-01-01

    if AI is outside your field, or you know something of the subject and would like to know more then Artificial Intelligence: The Basics is a brilliant primer.' - Nick Smith, Engineering and Technology Magazine November 2011 Artificial Intelligence: The Basics is a concise and cutting-edge introduction to the fast moving world of AI. The author Kevin Warwick, a pioneer in the field, examines issues of what it means to be man or machine and looks at advances in robotics which have blurred the boundaries. Topics covered include: how intelligence can be defined whether machines can 'think' sensory

  15. Artificial Intelligence Techniques: Applications for Courseware Development.

    Science.gov (United States)

    Dear, Brian L.

    1986-01-01

    Introduces some general concepts and techniques of artificial intelligence (natural language interfaces, expert systems, knowledge bases and knowledge representation, heuristics, user-interface metaphors, and object-based environments) and investigates ways these techniques might be applied to analysis, design, development, implementation, and…

  16. Intelligent Advertising

    OpenAIRE

    Díaz Pinedo, Edilfredo Eliot

    2012-01-01

    Intelligent Advertisement designs and implements an advertising system for mobile devices in a shopping mall, where customers passively receive advertising on their devices while they are inside.

  17. Improvement of QA/QC activities in the construction of nuclear power plant

    International Nuclear Information System (INIS)

    Jinji Tomita; Shigetaka Tomaru

    1987-01-01

    Construction of commercial nuclear power plants in Japan started around 1965. This presentation describes the quality assurance (QA) activities of a plant supplier that is also a manufacturer of the key components. The QA activities to date can be divided into several periods of Japan's construction history. The first term is the 1960s, when QA activities centered on study and implementation through the construction of imported plants. Since then, technologies and procedures of our own have been established and improved for the construction of high-reliability plants. Our present QA activities are based on actively reflecting the lessons learned from past experience. (author)

  18. A proposal of an open ubiquitous fuzzy computing system for Ambient Intelligence

    NARCIS (Netherlands)

    Acampora, G.; Loia, V.; Lee, R.S.T.; Lioa, V.

    2007-01-01

    Ambient Intelligence (AmI) is considered as the composition of three emergent technologies: Ubiquitous Computing, Ubiquitous Communication and Intelligent User Interfaces. The aim of integration of aforesaid technologies is to make wider the interaction between human beings and information

  19. BUSINESS INTELLIGENCE

    OpenAIRE

    Bogdan Mohor Dumitrita

    2011-01-01

    The purpose of this work is to present business intelligence systems. These systems can be extremely complex and important in modern market competition. Their effectiveness is also reflected in price, so we have to explore their financial potential before investment. The systems have a 20-year history, and during that time many such tools have been developed, but they are rarely still in use. A business intelligence system consists of three main areas: Data Warehouse, ETL tools and tools f...

  20. Intelligent indexing

    International Nuclear Information System (INIS)

    Farkas, J.

    1992-01-01

    In this paper we discuss the relevance of artificial intelligence to the automatic indexing of natural language text. We describe the use of domain-specific semantically-based thesauruses and address the problem of creating adequate knowledge bases for intelligent indexing systems. We also discuss the relevance of the Hilbert space ℓ² to the compact representation of documents and to the definition of the similarity of natural language texts. (author). 17 refs., 2 figs
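
As a concrete reading of the ℓ² representation mentioned in this record, documents can be embedded as term-frequency vectors and their similarity defined as the cosine of the angle between them. A small illustrative sketch (not from the paper):

```python
import math
from collections import Counter

def cosine_similarity(doc_a, doc_b):
    """Cosine similarity of term-frequency vectors in l2 (Euclidean) space."""
    a, b = Counter(doc_a.lower().split()), Counter(doc_b.lower().split())
    dot = sum(count * b[term] for term, count in a.items())
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

print(cosine_similarity("intelligent indexing of text",
                        "indexing natural language text"))
```

Identical texts score 1.0 and texts sharing no terms score 0.0, so the value can rank candidate index terms or documents by closeness.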

  1. Intelligent indexing

    Energy Technology Data Exchange (ETDEWEB)

    Farkas, J

    1993-12-31

    In this paper we discuss the relevance of artificial intelligence to the automatic indexing of natural language text. We describe the use of domain-specific semantically-based thesauruses and address the problem of creating adequate knowledge bases for intelligent indexing systems. We also discuss the relevance of the Hilbert space ℓ² to the compact representation of documents and to the definition of the similarity of natural language texts. (author). 17 refs., 2 figs.

  2. Kinetic Interface

    DEFF Research Database (Denmark)

    2009-01-01

    A kinetic interface for orientation detection in a video training system is disclosed. The interface includes a balance platform instrumented with inertial motion sensors. The interface engages a participant's sense of balance in training exercises.

  3. Building Watson: An Overview of the DeepQA Project

    OpenAIRE

    Ferrucci, David; Brown, Eric; Chu-Carroll, Jennifer; Fan, James; Gondek, David; Kalyanpur, Aditya A.; Lally, Adam; Murdock, J. William; Nyberg, Eric; Prager, John; Schlaefer, Nico; Welty, Chris

    2010-01-01

    IBM Research undertook a challenge to build a computer system that could compete at the human champion level in real time on the American TV Quiz show, Jeopardy! The extent of the challenge includes fielding a real-time automatic contestant on the show, not merely a laboratory exercise. The Jeopardy! Challenge helped us address requirements that led to the design of the DeepQA architecture and the implementation of Watson. After 3 years of intense research and development by a core team of ab...

  4. ATLAS IBL Stave QA - In and Around SR1

    CERN Document Server

    Carney, Rebecca

    2013-01-01

    During the Phase-I upgrade, the ATLAS inner tracker will have a whole new layer of pixels inserted between the existing B-layer and a new, smaller beam pipe. Briefly, there are 14 assemblies of 32 single- and double-chip hybrid silicon pixel modules arranged side-by-side on light-weight, thermally conductive carbon-fibre-coated carbon foam supports called staves. When the staves arrive at CERN, fully assembled, they undergo a QA procedure, which checks the power characteristics of sensors and read-out chips and assesses the quality of individual pixels.

  5. Common QA/QM Criteria for Multinational Vendor Inspection

    International Nuclear Information System (INIS)

    2014-01-01

    This VICWG document provides the 'Common QA/QM Criteria' which will be used in Multinational Vendor Inspection. The 'Common QA/QM Criteria' provides the basic considerations when performing Vendor Inspection. These criteria have been developed in conformity with international codes and standards, such as those of the IAEA and ISO, that MDEP member countries have adopted. The purpose of the VICWG is to establish areas of co-operation in Vendor Inspection practices among MDEP member countries, as described in the MDEP issue-specific Terms of Reference (ToR). As part of this, from the beginning, a survey was performed to understand and to identify areas of commonality and differences between regulatory practices of member countries in the area of vendor inspection. The VICWG also collaborated by performing Witnessed Inspections and Joint Inspections. Through these activities, it was recognized that member countries commonly apply the IAEA safety standard (GS-R-3) to their vendor inspection criteria, and almost all European member countries apply the ISO standard (ISO 9001). In the US, the NRC regulatory requirement in 10 CFR, Part 50, Appendix B is used. South Korea uses the same criteria as the US. As a result of the information obtained, a comparison table between codes and standards (IAEA GS-R-3, ISO 9001:2008, 10 CFR 50 Appendix B and ASME NQA-1) has been developed in order to inform the development of the 'Common QA/QM Criteria'. The result is documented in Table 1, 'MDEP CORE QA/QM Requirement and Comparison between Codes and Standards'. In addition, each country's criteria were compared using the US 10 CFR 50 Appendix B as a template. Table 2 shows the VICWG Survey on Quality Assurance Program Requirements. Through these activities, we considered that the core requirements should be consistent with both the IAEA safety standard and the ISO standard, and considered that the common requirements in the US 10CFR50 Appendix B used to the survey

  6. Combinatorial Nano-Bio Interfaces.

    Science.gov (United States)

    Cai, Pingqiang; Zhang, Xiaoqian; Wang, Ming; Wu, Yun-Long; Chen, Xiaodong

    2018-06-08

    Nano-bio interfaces are emerging from the convergence of engineered nanomaterials and biological entities. Despite rapid growth, clinical translation of biomedical nanomaterials is heavily compromised by the lack of comprehensive understanding of biophysicochemical interactions at nano-bio interfaces. In the past decade, a few investigations have adopted a combinatorial approach toward decoding nano-bio interfaces. Combinatorial nano-bio interfaces comprise the design of nanocombinatorial libraries and high-throughput bioevaluation. In this Perspective, we address challenges in combinatorial nano-bio interfaces and call for multiparametric nanocombinatorics (composition, morphology, mechanics, surface chemistry), multiscale bioevaluation (biomolecules, organelles, cells, tissues/organs), and the recruitment of computational modeling and artificial intelligence. Leveraging combinatorial nano-bio interfaces will shed light on precision nanomedicine and its potential applications.

  7. Multimodal follow-up questions to multimodal answers in a QA system

    NARCIS (Netherlands)

    van Schooten, B.W.; op den Akker, Hendrikus J.A.

    2007-01-01

    We are developing a dialogue manager (DM) for a multimodal interactive Question Answering (QA) system. Our QA system presents answers using text and pictures, and the user may pose follow-up questions using text or speech, while indicating screen elements with the mouse. We developed a corpus of

  8. Application of QA grading to Yucca Mountain Site Characterization Project items and activities

    International Nuclear Information System (INIS)

    Murthy, R.B.; Smith, S.C.

    1991-01-01

    Grading is the act of selecting the quality assurance (QA) measures necessary to develop and maintain confidence in the quality of an item or activity. The list of QA measures from which this selection is made is the 20 criteria of the Yucca Mountain Site Characterization Project Quality Assurance Requirements Document

  9. Poster - Thur Eve - 29: Detecting changes in IMRT QA using statistical process control.

    Science.gov (United States)

    Drever, L; Salomons, G

    2012-07-01

    Statistical process control (SPC) methods were used to analyze 239 measurement-based individual IMRT QA events. The selected IMRT QA events were all head and neck (H&N) cases with 70 Gy in 35 fractions, and all prostate cases with 76 Gy in 38 fractions, planned between March 2009 and 2012. The results were used to determine whether the tolerance limits currently being used for IMRT QA were able to indicate whether the process was under control. The SPC calculations were repeated for IMRT QA of the same type of cases planned after the treatment planning system was upgraded from Eclipse version 8.1.18 to version 10.0.39. The initial tolerance limits were found to be acceptable for two of the three metrics tested prior to the upgrade. After the upgrade to the treatment planning system, the SPC analysis found that the a priori limits were no longer capable of indicating control for 2 of the 3 metrics analyzed. The changes in the IMRT QA results were clearly identified using SPC, indicating that it is a useful tool for finding changes in the IMRT QA process. Routine application of SPC to IMRT QA results would help to distinguish unintentional trends and changes from the random variation in the IMRT QA results for individual plans. © 2012 American Association of Physicists in Medicine.
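The SPC approach described in this record can be approximated with a standard individuals (X-mR) control chart; the abstract does not specify the chart type, so this is a hedged sketch, and the passing-rate values below are illustrative, not the study's data:

```python
import numpy as np

def individuals_control_limits(passing_rates):
    """Individuals (X-mR) control chart limits for a series of QA passing rates.

    Uses the standard X-chart rule: centre line +/- 2.66 * mean moving range
    (2.66 = 3 / d2, with d2 = 1.128 for subgroups of size 2).
    """
    x = np.asarray(passing_rates, dtype=float)
    moving_range = np.abs(np.diff(x))   # |x_i - x_{i-1}|
    mr_bar = moving_range.mean()
    centre = x.mean()
    ucl = centre + 2.66 * mr_bar        # upper control limit
    lcl = centre - 2.66 * mr_bar        # lower control limit
    return centre, lcl, ucl

# Illustrative gamma passing rates (%) for consecutive per-patient QA events
rates = [97.1, 98.4, 96.8, 97.9, 98.2, 96.5, 97.4, 98.0]
centre, lcl, ucl = individuals_control_limits(rates)
out_of_control = [r for r in rates if not lcl <= r <= ucl]
```

A process change (such as a planning-system upgrade) would show up as points falling outside `[lcl, ucl]` or as a sustained shift of points to one side of the centre line.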

  10. Technical report on comparative analysis of ASME QA requirements and ISO series

    International Nuclear Information System (INIS)

    Kim, Kwan Hyun

    2000-06-01

    This technical report describes the differences between the ASME and ISO QA requirements in the nuclear field. It applies to the quality assurance (QA) programmes for design under the two sets of requirements. The organization having overall responsibility for the nuclear design, preservation and fabrication is described in this report for each stage of the design project

  11. QA practice for online analyzers in water steam cycles

    International Nuclear Information System (INIS)

    Staub, L.

    2010-01-01

    The liberalization of power markets throughout the world has resulted in more and more power stations being operated in cycling mode, with frequent load changes and multiple daily start-up and shut-down cycles. This more flexible operation also calls for better automation and poses new challenges to water chemistry in water steam cycles, to avoid subsequent damage to vital plant components such as turbines, boilers or condensers. But automation for the most important chemistry control tool, the sampling and online analyzer system, is only possible if chemists can rely on their online analysis equipment. Proof of plausibility, as well as reliability and availability of online analysis results, becomes a major focus. While SOP and standard QA procedures for laboratory equipment are well established and part of daily practice, such measures are widely neglected for online process analyzers. This paper aims to establish a roadmap for the implementation of SOP and QA/QC procedures for online instruments in water steam cycles, leading to reliable chemical information that is trustworthy for process automation and chemistry control in water steam cycles. (author)

  12. QA practice for online analyzers in water steam cycles

    International Nuclear Information System (INIS)

    Staub Lukas

    2009-01-01

    The liberalization of power markets throughout the world has resulted in more and more power stations being operated in cycling mode, with frequent load changes and multiple daily start-up and shut-down cycles. This more flexible operation also calls for better automation and poses new challenges to water chemistry in water steam cycles, to avoid subsequent damage to vital plant components such as turbines, boilers or condensers. But automation for the most important chemistry control tool, the sampling and online analyzer system, is only possible if chemists can rely on their online analysis equipment. Proof of plausibility, as well as reliability and availability of online analysis results, becomes a major focus. While SOP and standard QA procedures for laboratory equipment are well established and part of daily practice, such measures are widely neglected for online process analyzers. This paper aims to establish a roadmap for the implementation of SOP and QA/QC procedures for online instruments in water steam cycles, leading to reliable chemical information that is trustworthy for process automation and chemistry control in water steam cycles. (author)

  13. Poster - 10: QA of Ultrasound Images for Prostate Brachytherapy

    Energy Technology Data Exchange (ETDEWEB)

    Szpala, Stanislaw; Kohli, Kirpal S. [BCCA-Fraser Valley Centre (Canada)

    2016-08-15

    Purpose: The current QA protocol of ultrasound systems used in prostate brachytherapy (TG128) addresses geometrical verifications, but the scope of evaluation of image quality is limited. We recognized the importance of the latter in routine practice, and designed a protocol for QA of the images. Methods: Images of an ultrasound prostate phantom (CIRS053) were collected with a BK Flex Focus 400. The images were saved as bmp after adjusting the gain to 50% for consistent results. Mean pixel values and signal-to-noise ratio were inspected in the representative sections of the phantom, including the mock prostate and the unechoic medium. Constancy of these numbers over a one-year period was examined. Results: The typical intensity in the mock prostate region in the transverse images ranged between 95 and 118 (out of 256), and the signal-to-noise ratio was about 10. The intensity in the urethra region was about 170±40, and in the unechoic medium 2±2. The mean and the signal-to-noise ratio remained almost unchanged after a year, while the signal in the unechoic medium increased to about 7±4. Similar values were obtained in the sagittal images. Conclusions: The image analysis discussed above allows quick evaluation of the constancy of image quality. This may also be useful in troubleshooting image-quality problems during routine exams, which might not be due to deterioration of the US system but to other reasons, e.g. variations in tissue properties or air trapped between the probe and the anatomy.
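The constancy check described in this record reduces to computing the mean pixel value and signal-to-noise ratio inside fixed regions of interest of the saved images. A minimal sketch in Python/NumPy; the ROI coordinates and the synthetic phantom values are illustrative assumptions, not the CIRS053 data:

```python
import numpy as np

def roi_stats(image, y0, y1, x0, x1):
    """Mean pixel value and signal-to-noise ratio (mean / std) inside a
    rectangular region of interest of a grey-scale image array."""
    roi = np.asarray(image, dtype=float)[y0:y1, x0:x1]
    mean = roi.mean()
    std = roi.std()
    snr = mean / std if std > 0 else float("inf")
    return mean, snr

# Synthetic stand-in for the mock-prostate region of a saved bmp frame
rng = np.random.default_rng(0)
phantom = rng.normal(loc=105.0, scale=10.0, size=(256, 256)).clip(0, 255)
mean, snr = roi_stats(phantom, 60, 180, 60, 180)
```

In a constancy test, `mean` and `snr` for each reference ROI would be compared against the baseline values recorded at commissioning.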

  14. Intelligent systems

    CERN Document Server

    Irwin, J David

    2011-01-01

    Technology has now progressed to the point that intelligent systems are replacing humans in decision-making processes as well as aiding in the solution of very complex problems. In many cases intelligent systems are already outperforming human activities. Artificial neural networks are not only capable of learning how to classify patterns, such as images or sequences of events, but they can also effectively model complex nonlinear systems. Their ability to classify sequences of events is probably more popular in industrial applications, where there is an inherent need to model nonlinear system

  15. Intelligent Universe

    Energy Technology Data Exchange (ETDEWEB)

    Hoyle, F

    1983-01-01

    The subject is covered in chapters, entitled: chance and the universe (synthesis of proteins; the primordial soup); the gospel according to Darwin (discussion of Darwin theory of evolution); life did not originate on earth (fossils from space; life in space); the interstellar connection (living dust between the stars; bacteria in space falling to the earth; interplanetary dust); evolution by cosmic control (microorganisms; genetics); why aren't the others here (a cosmic origin of life); after the big bang (big bang and steady state); the information rich universe; what is intelligence up to; the intelligent universe.

  16. Artificial intelligence

    International Nuclear Information System (INIS)

    Perret-Galix, D.

    1992-01-01

    A vivid example of the growing need for frontier physics experiments to make use of frontier technology is in the field of artificial intelligence and related themes. This was reflected in the second international workshop on 'Software Engineering, Artificial Intelligence and Expert Systems in High Energy and Nuclear Physics' which took place from 13-18 January at France Telecom's Agelonde site at La Londe des Maures, Provence. It was the second in a series, the first having been held at Lyon in 1990

  17. Artificial Intelligence and Moral intelligence

    Directory of Open Access Journals (Sweden)

    Laura Pana

    2008-07-01

    We discuss the thesis that the implementation of a moral code in the behaviour of artificial intelligent systems needs a specific form of human and artificial intelligence, not just an abstract intelligence. We present intelligence as a system with an internal structure and the structural levels of the moral system, as well as certain characteristics of artificial intelligent agents which can/must be treated as: 1) individual entities (with a complex, specialized, autonomous or self-determined, even unpredictable conduct); 2) entities endowed with diverse or even multiple intelligence forms, like moral intelligence; 3) open and even free-conduct performing systems (with specific, flexible and heuristic mechanisms and procedures of decision); 4) systems which are open to education, not just to instruction; 5) entities with “lifegraphy”, not just “stategraphy”; 6) entities equipped not just with automatisms but with beliefs (cognitive and affective complexes); 7) entities capable even of reflection (“moral life” is a form of spiritual, not just of conscious, activity); 8) elements/members of some real (corporal or virtual) community; 9) cultural beings: free conduct gives cultural value to the action of a “natural” or artificial being. Implementation of such characteristics does not necessarily suppose efforts to design, construct and educate machines like human beings. The human moral code is irremediably imperfect: it is a morality of preference, of accountability (not of responsibility) and a morality of non-liberty, which cannot be remedied by the invention of ethical systems, by the circulation of ideal values and by ethical (even computing) education. But such an imperfect morality needs perfect instruments for its implementation: applications of special logic fields; efficient psychological (theoretical and technical) attainments to endow the machine not just with intelligence, but with conscience and even spirit; comprehensive technical

  18. Concept of a QA-programme for the recipient country, goals and measures

    International Nuclear Information System (INIS)

    Thomas, F.W.

    1986-04-01

    Ordering, design and erection of an NPP is a complex business even in a country with experience. Therefore a QA-Programme can be helpful to do the work in a planned and organized manner. In the case of a recipient country, the use of administrative QA-measures seems to be a necessary support, especially for the ordering company. It is not the intention of the QA-Programme to say what to do, and so it cannot solve ''political'' questions of the business, but the QA-Programme can say how the work has to be done to bring it to a good end. This lecture points out the most important and interesting questions in the phase of establishing a QA-Programme. Examples of solutions are given. (author). 13 figs

  19. Applications of QA to R&D support of HLW programs

    International Nuclear Information System (INIS)

    Ryder, D.E.

    1988-05-01

    The application of a formal QA program to any discipline or organization can be difficult to achieve, and doing so with a research and development organization poses special challenges that are somewhat unique. This paper describes how a QA program based upon a national consensus standard (developed for application to the design, construction and operation of nuclear facilities) has been successfully applied to some of the research and development activities in support of the High Level Waste Programs. This description includes a discussion of the importance of being creative when interpreting the QA standard, a brief overview of the QA program that was developed, and the results achieved during implementation of the QA program. 4 refs., 4 figs

  20. Adding intelligence to scientific data management

    Science.gov (United States)

    Campbell, William J.; Short, Nicholas M., Jr.; Treinish, Lloyd A.

    1989-01-01

    NASA's plans to solve some of the problems of handling large-scale scientific databases by turning to artificial intelligence (AI) are discussed. The growth of the information glut and the ways that AI can help alleviate the resulting problems are reviewed. The employment of the Intelligent User Interface prototype, in which the user generates his own natural-language query with the assistance of the system, is examined. Spatial data management, scientific data visualization, and data fusion are discussed.

  1. Plant intelligence

    Science.gov (United States)

    Lipavská, Helena; Žárský, Viktor

    2009-01-01

    The concept of plant intelligence, as proposed by Anthony Trewavas, has raised considerable discussion. However, plant intelligence remains loosely defined; often it is either perceived as practically synonymous to Darwinian fitness, or reduced to a mere decorative metaphor. A more strict view can be taken, emphasizing necessary prerequisites such as memory and learning, which requires clarifying the definition of memory itself. To qualify as memories, traces of past events have to be not only stored, but also actively accessed. We propose a criterion for eliminating false candidates of possible plant intelligence phenomena in this stricter sense: an “intelligent” behavior must involve a component that can be approximated by a plausible algorithmic model involving recourse to stored information about past states of the individual or its environment. Re-evaluation of previously presented examples of plant intelligence shows that only some of them pass our test. “You were hurt?” Kumiko said, looking at the scar. Sally looked down. “Yeah.” “Why didn't you have it removed?” “Sometimes it's good to remember.” “Being hurt?” “Being stupid.”—(W. Gibson: Mona Lisa Overdrive) PMID:19816094

  2. Speech Intelligibility

    Science.gov (United States)

    Brand, Thomas

    Speech intelligibility (SI) is important for different fields of research, engineering and diagnostics in order to quantify very different phenomena like the quality of recordings, communication and playback devices, the reverberation of auditoria, characteristics of hearing impairment, benefit using hearing aids or combinations of these things.

  3. Internet-based intelligent information processing systems

    CERN Document Server

    Tonfoni, G; Ichalkaranje, N S

    2003-01-01

    The Internet/WWW has made it possible to easily access quantities of information never available before. However, both the amount of information and the variation in quality pose obstacles to the efficient use of the medium. Artificial intelligence techniques can be useful tools in this context. Intelligent systems can be applied to searching the Internet and data-mining, interpreting Internet-derived material, the human-Web interface, remote condition monitoring and many other areas. This volume presents the latest research on the interaction between intelligent systems (neural networks, adap

  4. QA lessons learned for parameter control from the WIPP Project

    International Nuclear Information System (INIS)

    Richards, R.R.

    1998-01-01

    This paper provides a summary of lessons learned from experiences on the Waste Isolation Pilot Plant (WIPP) Project in implementation of quality assurance controls surrounding inputs for performance assessment analysis. Since the performance assessment (PA) process is inherent in compliance determination for any waste repository, these lessons learned are intended to be useful to investigators, analysts, and Quality Assurance (QA) practitioners working on high level waste disposal projects. On the WIPP Project, PA analyses for regulatory-compliance determination utilized several inter-related computer programs (codes) that mathematically modeled phenomena such as radionuclide release, retardation, and transport. The input information for those codes are the parameters that are the subject of this paper. Parameters were maintained in a computer database, which was then queried electronically by the PA codes whenever input was needed as the analyses were run

  5. A mathematical framework for virtual IMRT QA using machine learning.

    Science.gov (United States)

    Valdes, G; Scheuermann, R; Hung, C Y; Olszanski, A; Bellerive, M; Solberg, T D

    2016-07-01

    It is common practice to perform patient-specific pretreatment verification of clinical IMRT deliveries. This process can be time-consuming and not altogether instructive, due to the myriad sources that may produce a failing result. The purpose of this study was to develop an algorithm capable of predicting IMRT QA passing rates a priori. A total of 498 IMRT plans from all treatment sites were planned in Eclipse version 11 and delivered using a dynamic sliding-window technique on Clinac iX or TrueBeam linacs. 3%/3 mm local dose/distance-to-agreement (DTA) passing rates were recorded using a commercial 2D diode array. Each plan was characterized by 78 metrics that describe different aspects of its complexity that could lead to disagreements between the calculated and measured dose. A Poisson regression with Lasso regularization was trained to learn the relation between the plan characteristics and each passing rate. Passing rates at 3%/3 mm local dose/DTA can be predicted with an error smaller than 3% for all plans analyzed. The most important metrics describing the passing rates were determined to be the MU factor (MU per Gy), small aperture score, irregularity factor, and fraction of the plan delivered at the corners of a 40 × 40 cm field. The higher the value of these metrics, the worse the passing rates. The virtual QA process predicts IMRT passing rates with a high likelihood, allows the detection of failures due to setup errors, and is sensitive enough to detect small differences between matched linacs.
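The model described here is a Poisson regression with an L1 (Lasso) penalty. A minimal sketch of such a model, fitted by proximal gradient descent on synthetic data; the feature matrix below stands in for the 78 plan-complexity metrics and is purely illustrative:

```python
import numpy as np

def poisson_lasso(X, y, alpha=0.01, lr=0.05, n_iter=3000):
    """Poisson regression with an L1 (Lasso) penalty via proximal gradient.

    Minimizes (up to a constant):
        L(w) = mean_i[ exp(x_i . w) - y_i * (x_i . w) ] + alpha * ||w||_1
    """
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iter):
        eta = X @ w
        grad = X.T @ (np.exp(eta) - y) / n                        # smooth part
        w -= lr * grad
        w = np.sign(w) * np.maximum(np.abs(w) - lr * alpha, 0.0)  # soft threshold
    return w

# Synthetic demo: 2 informative metrics out of 6, Poisson-distributed targets
rng = np.random.default_rng(1)
X = rng.normal(size=(400, 6)) * 0.5
w_true = np.array([0.8, -0.6, 0.0, 0.0, 0.0, 0.0])
y = rng.poisson(np.exp(X @ w_true))
w_hat = poisson_lasso(X, y)
```

The L1 penalty drives the coefficients of uninformative metrics toward zero, which is how the study could rank which complexity metrics matter most for the passing rate.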

  6. Retrospective analysis of 'gamma distribution' based IMRT QA criteria

    International Nuclear Information System (INIS)

    Wen, C.; Chappell, R.A.

    2010-01-01

    Full text: IMRT has been implemented in clinical practice at Royal Hobart Hospital (RHH) since mid 2006 for treating patients with head and neck (H&N) or prostate tumours. A local quality assurance (QA) acceptance criterion based on the 'gamma distribution' for approving IMRT plans was developed and implemented in early 2007. A retrospective analysis of this criterion over 194 clinical cases is presented. The RHH IMRT criterion was established on the assumption that the gamma distribution obtained through inter-comparison of 2D dose maps between planned and delivered doses is governed by a positive-half normal distribution. A commercial system, MapCheck, was used for 2D dose-map comparison with a built-in gamma analysis tool. A gamma distribution histogram was generated and recorded for all cases. By retrospectively analysing those distributions using a curve-fitting technique, a statistical gamma distribution can be obtained and evaluated. This analytical result can be used for future IMRT planning and treatment delivery. The analyses indicate that the gamma distribution obtained through MapCheck is well under the normal distribution, particularly for prostate cases. The applied pass/fail criterion is not overly sensitive in identifying 'false fails' but can be further tightened for smaller fields, while for the larger fields found in both H&N and prostate cases the criterion was correctly applied. Non-uniform distribution of detectors in MapCheck and the experience level of planners are two major factors in the variation in gamma distribution among clinical cases. This criterion, derived from clinical statistics, is superior and more accurate than a single-valued criterion for IMRT QA acceptance procedures. (author)
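The curve-fitting step rests on the assumption that per-point gamma values follow a positive half-normal distribution. A sketch of such a fit with SciPy; the gamma values below are synthetic, not MapCheck data:

```python
import numpy as np
from scipy import stats

# Synthetic per-point gamma values for one plan; real values would come from
# the MapCheck comparison of planned vs delivered 2D dose maps.
rng = np.random.default_rng(2)
gamma_values = np.abs(rng.normal(loc=0.0, scale=0.35, size=2000))

# Fit a half-normal with the location fixed at 0; the scale parameter then
# summarizes the plan's gamma distribution in a single number.
loc, scale = stats.halfnorm.fit(gamma_values, floc=0.0)

# Fraction of points passing the usual gamma < 1 criterion, data vs model
pass_rate_data = (gamma_values < 1.0).mean()
pass_rate_model = stats.halfnorm.cdf(1.0, loc=loc, scale=scale)
```

Comparing `pass_rate_data` with `pass_rate_model` is one way to check whether the half-normal assumption holds for a given case, which is the kind of statistical evaluation the retrospective analysis performs.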

  7. Refinement of MLC modeling improves commercial QA dosimetry system for SRS and SBRT patient-specific QA.

    Science.gov (United States)

    Hillman, Yair; Kim, Josh; Chetty, Indrin; Wen, Ning

    2018-04-01

    Mobius 3D (M3D) provides a volumetric dose verification of the treatment planning system's calculated dose using an independent beam model and a collapsed cone convolution superposition algorithm. However, there is a lack of investigation into M3D's accuracy and effectiveness for stereotactic radiosurgery (SRS) and stereotactic body radiotherapy (SBRT) quality assurance (QA). Here, we collaborated with the vendor to develop a revised M3D beam model for SRS/SBRT cases treated with a 6X flattening filter-free (FFF) beam and high-definition multileaf collimator (HD-MLC) on an Edge linear accelerator. Eighty SRS/SBRT cases, planned with the AAA dose algorithm and validated with Gafchromic film, were compared to M3D dose calculations using 3D gamma analysis with 2%/2 mm gamma criteria and a 10% threshold. A revised beam model was developed by refining the HD-MLC model in M3D to improve small-field dose calculation accuracy and beam profile agreement. All cases were reanalyzed using the revised beam model. The impact of heterogeneity corrections for lung cases was investigated by applying lung density overrides to five cases. For the standard and revised beam models, respectively, the mean gamma passing rates were 94.6% [standard deviation (SD): 6.1%] and 98.0% [SD: 1.7%] (for the overall patient), 88.2% [SD: 17.3%] and 93.8% [SD: 6.8%] (for the brain PTV), 71.4% [SD: 18.4%] and 81.5% [SD: 14.3%] (for the lung PTV), 83.3% [SD: 16.7%] and 67.9% [SD: 23.0%] (for the spine PTV), and 78.6% [SD: 14.0%] and 86.8% [SD: 12.5%] (for the PTV of all other sites). The lung PTV mean gamma passing rates improved from 74.1% [SD: 7.5%] to 89.3% [SD: 7.2%] with the lung density overridden. The revised beam model achieved an output factor within 3% of plastic scintillator measurements for a 2 × 2 cm² MLC field size, but larger discrepancies are still seen for smaller field sizes, which necessitates further improvement of the beam model. Special attention needs to be paid to small field

  8. SU-F-T-226: QA Management for a Large Institution with Multiple Campuses for FMEA

    Energy Technology Data Exchange (ETDEWEB)

    Tang, G; Chan, M; Lovelock, D; Lim, S; Febo, R; DeLauter, J; Both, S; Li, X; Ma, R; Saleh, Z; Song, Y; Tang, X; Xiong, W; Hunt, M; LoSasso, T [Memorial Sloan Kettering Cancer Center, New York, NY (United States)

    2016-06-15

    Purpose: To redesign our radiation therapy QA program with the goal to improve quality, efficiency, and consistency among a growing number of campuses at a large institution. Methods: A QA committee was established with at least one physicist representing each of our six campuses (22 linacs). Weekly meetings were scheduled to advise on and update current procedures, to review end-to-end and other test results, and to prepare composite reports for internal and external audits. QA procedures for treatment and imaging equipment were derived from TG Reports 142 and 66, practice guidelines, and feedback from ACR evaluations. The committee focused on reaching a consensus on a single QA program among all campuses using the same type of equipment and reference data. Since the recommendations for tolerances referenced to baseline data were subject to interpretation in some instances, the committee reviewed the characteristics of all machines and quantified any variations before choosing between treatment planning system (i.e. treatment planning system commissioning data that is representative for all machines) or machine-specific values (i.e. commissioning data of the individual machines) as baseline data. Results: The configured QA program will be followed strictly by all campuses. Inventory of available equipment has been compiled, and additional equipment acquisitions for the QA program are made as needed. Dosimetric characteristics are evaluated for all machines using the same methods to ensure consistency of beam data where possible. In most cases, baseline data refer to treatment planning system commissioning data but machine-specific values are used as reference where it is deemed appropriate. Conclusion: With a uniform QA scheme, variations in QA procedures are kept to a minimum. With a centralized database, data collection and analysis are simplified. This program will facilitate uniformity in patient treatments and analysis of large amounts of QA data campus

  9. Interface Consistency

    DEFF Research Database (Denmark)

    Staunstrup, Jørgen

    1998-01-01

    This paper proposes that interface consistency is an important issue for the development of modular designs. By providing a precise specification of component interfaces, it becomes possible to check that separately developed components use a common interface in a coherent manner, thus avoiding a very significant source of design errors. A wide range of interface specifications are possible; the simplest form is a syntactical check of parameter types. However, today it is possible to use more sophisticated forms involving semantic checks.

  10. Artificial Intelligence.

    Science.gov (United States)

    Lawrence, David R; Palacios-González, César; Harris, John

    2016-04-01

    It seems natural to think that the same prudential and ethical reasons for mutual respect and tolerance that one has vis-à-vis other human persons would hold toward newly encountered paradigmatic but nonhuman biological persons. One also tends to think that they would have similar reasons for treating we humans as creatures that count morally in our own right. This line of thought transcends biological boundaries, namely with regard to artificially (super)intelligent persons, but is this a safe assumption? The issue concerns ultimate moral significance: the significance possessed by human persons, persons from other planets, and hypothetical nonorganic persons in the form of artificial intelligence (AI). This article investigates why our possible relations to AI persons could be more complicated than they first might appear, given that they might possess a radically different nature from ours, to the point that civilized or peaceful coexistence in a determinate geographical space could be impossible to achieve.

  11. Third Conference on Artificial Intelligence for Space Applications, part 1

    Science.gov (United States)

    Denton, Judith S. (Compiler); Freeman, Michael S. (Compiler); Vereen, Mary (Compiler)

    1987-01-01

    The application of artificial intelligence to spacecraft and aerospace systems is discussed. Expert systems, robotics, space station automation, fault diagnostics, parallel processing, knowledge representation, scheduling, man-machine interfaces and neural nets are among the topics discussed.

  12. Per-beam, planar IMRT QA passing rates do not predict clinically relevant patient dose errors

    Energy Technology Data Exchange (ETDEWEB)

    Nelms, Benjamin E.; Zhen Heming; Tome, Wolfgang A. [Canis Lupus LLC and Department of Human Oncology, University of Wisconsin, Merrimac, Wisconsin 53561 (United States); Department of Medical Physics, University of Wisconsin, Madison, Wisconsin 53705 (United States); Departments of Human Oncology, Medical Physics, and Biomedical Engineering, University of Wisconsin, Madison, Wisconsin 53792 (United States)

    2011-02-15

    Purpose: The purpose of this work is to determine the statistical correlation between per-beam, planar IMRT QA passing rates and several clinically relevant, anatomy-based dose errors for per-patient IMRT QA. The intent is to assess the predictive power of a common conventional IMRT QA performance metric, the Gamma passing rate per beam. Methods: Ninety-six unique data sets were created by inducing four types of dose errors in 24 clinical head and neck IMRT plans, each planned with 6 MV Varian 120-leaf MLC linear accelerators using a commercial treatment planning system and step-and-shoot delivery. The error-free beams/plans were used as "simulated measurements" (for generating the IMRT QA dose planes and the anatomy dose metrics) to compare to the corresponding data calculated by the error-induced plans. The degree of the induced errors was tuned to mimic IMRT QA passing rates that are commonly achieved using conventional methods. Results: Analysis of clinical metrics (parotid mean doses, spinal cord max and D1cc, CTV D95, and larynx mean) vs IMRT QA Gamma analysis (3%/3 mm, 2/2, 1/1) showed that in all cases, there were only weak to moderate correlations (range of Pearson's r-values: -0.295 to 0.653). Moreover, the moderate correlations actually had positive Pearson's r-values (i.e., clinically relevant metric differences increased with increasing IMRT QA passing rate), indicating that some of the largest anatomy-based dose differences occurred in the cases of high IMRT QA passing rates, which may be called "false negatives." The results also show numerous instances of false positives or cases where low IMRT QA passing rates do not imply large errors in anatomy dose metrics. In none of the cases was there correlation consistent with high predictive power of planar IMRT passing rates, i.e., in none of the cases did high IMRT QA Gamma passing rates predict low errors in anatomy dose metrics or vice versa.
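    The kind of correlation analysis reported in this record (passing rates vs anatomy-based dose errors) can be sketched as below. This is a minimal illustration only: the per-plan passing rates and metric errors are hypothetical values, not data from the study.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-plan data: Gamma passing rate (%) at 3%/3 mm and the
# absolute error (%) in a clinical metric such as parotid mean dose.
gamma_pass = [99.2, 97.5, 95.1, 92.8, 90.3, 88.7]
metric_err = [1.8, 0.4, 2.9, 0.7, 1.2, 2.1]

r = pearson_r(gamma_pass, metric_err)
print(f"Pearson r = {r:.3f}")  # weak correlation: passing rate poorly predicts dose error
```

A weak |r| in such an analysis is what motivates the paper's conclusion that a high Gamma passing rate alone is not evidence of low anatomy-based dose error.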

  13. Quality assurance (QA) and quality control (QC) of image guided radiotherapy (IGRT). Osaka Rosai Hospital experience

    International Nuclear Information System (INIS)

    Tsuboi, Kazuki; Yagi, Masayuki; Fujiwara, Kanta

    2013-01-01

    The linear accelerator with image guided radiation therapy (IGRT) was introduced in May 2010. We performed verification of the IGRT system, i.e., an acceptance test and our original performance test, and confirmed its acceptability for clinical use. We also performed a daily QA/QC program before the start of treatment. One year of experience with the QA/QC program showed excellent stability of the IGRT function compared with our old machine. We further hope to establish a more useful management system and QA/QC program. (author)

  14. Per-beam, planar IMRT QA passing rates do not predict clinically relevant patient dose errors

    International Nuclear Information System (INIS)

    Nelms, Benjamin E.; Zhen Heming; Tome, Wolfgang A.

    2011-01-01

    Purpose: The purpose of this work is to determine the statistical correlation between per-beam, planar IMRT QA passing rates and several clinically relevant, anatomy-based dose errors for per-patient IMRT QA. The intent is to assess the predictive power of a common conventional IMRT QA performance metric, the Gamma passing rate per beam. Methods: Ninety-six unique data sets were created by inducing four types of dose errors in 24 clinical head and neck IMRT plans, each planned with 6 MV Varian 120-leaf MLC linear accelerators using a commercial treatment planning system and step-and-shoot delivery. The error-free beams/plans were used as "simulated measurements" (for generating the IMRT QA dose planes and the anatomy dose metrics) to compare to the corresponding data calculated by the error-induced plans. The degree of the induced errors was tuned to mimic IMRT QA passing rates that are commonly achieved using conventional methods. Results: Analysis of clinical metrics (parotid mean doses, spinal cord max and D1cc, CTV D95, and larynx mean) vs IMRT QA Gamma analysis (3%/3 mm, 2/2, 1/1) showed that in all cases, there were only weak to moderate correlations (range of Pearson's r-values: -0.295 to 0.653). Moreover, the moderate correlations actually had positive Pearson's r-values (i.e., clinically relevant metric differences increased with increasing IMRT QA passing rate), indicating that some of the largest anatomy-based dose differences occurred in the cases of high IMRT QA passing rates, which may be called "false negatives." The results also show numerous instances of false positives or cases where low IMRT QA passing rates do not imply large errors in anatomy dose metrics. In none of the cases was there correlation consistent with high predictive power of planar IMRT passing rates, i.e., in none of the cases did high IMRT QA Gamma passing rates predict low errors in anatomy dose metrics or vice versa. Conclusions: There is a lack of correlation between per-beam, planar IMRT QA passing rates and clinically relevant patient dose errors.

  15. Discussion of Regulatory Guide 7.10, emphasizing the graded approach for establishing QA programs

    International Nuclear Information System (INIS)

    Gordon, L.; Lake, W.H.

    1983-01-01

    To assist applicants in establishing an acceptable QA program that meets the programmatic elements of Appendix E to 10 CFR Part 71, Regulatory Guide 7.10 was developed. Regulatory Guide 7.10 is organized in three self-contained annexes. Guidance applicable to designer/fabricators, to users, and to users of radiographic devices is provided in separate annexes. QA programs for packaging to transport radioactive material are similar in regard to the various operations a licensee may be involved in. However, the appropriate QA/QC effort to verify the program elements may vary significantly. This is referred to as the graded approach. Appendix A in the guide addresses the graded approach.

  16. Interface models

    DEFF Research Database (Denmark)

    Ravn, Anders P.; Staunstrup, Jørgen

    1994-01-01

    This paper proposes a model for specifying interfaces between concurrently executing modules of a computing system. The model does not prescribe a particular type of communication protocol and is aimed at describing interfaces between both software and hardware modules or a combination of the two. The model describes both functional and timing properties of an interface.

  17. SU-F-T-275: A Correlation Study On 3D Fluence-Based QA and 2D Dose Measurement-Based QA

    International Nuclear Information System (INIS)

    Liu, S; Mazur, T; Li, H; Green, O; Sun, B; Mutic, S; Yang, D

    2016-01-01

    Purpose: The aim of this paper was to demonstrate the feasibility and credibility of computing and verifying 3D fluences to assure IMRT and VMAT treatment deliveries, by correlating the passing rates of the 3D fluence-based QA (P(ά)) to the passing rates of 2D dose measurement-based QA (P(Dm)). Methods: 3D volumetric primary fluences are calculated by forward-projecting the beam apertures and modulating them by beam MU values at all gantry angles. We first introduce simulated machine parameter errors (MU, MLC positions, jaw, gantry and collimator) to the plan. Using passing rates of voxel intensity differences (P(Ir)) and 3D gamma analysis (P(γ)), calculated 3D fluences, calculated 3D delivered dose, and measured 2D planar dose in phantom from the original plan are then compared with those from corresponding plans with errors, respectively. The correlations of these three groups of resultant passing rates, i.e. 3D fluence-based QA (P(ά,Ir) and P(ά,γ)), calculated 3D dose (P(Dc,Ir) and P(Dc,γ)), and 2D dose measurement-based QA (P(Dm,Ir) and P(Dm,γ)), were investigated. Results: 20 treatment plans with 5 different types of errors were tested. Spearman's correlations were found between P(ά,Ir) and P(Dc,Ir), and also between P(ά,γ) and P(Dc,γ), with averaged p-values of 0.037 and 0.065, and averaged correlation coefficients ρ of 0.942 and 0.871, respectively. Using Matrixx QA for IMRT plans, Spearman's correlations were also obtained between P(ά,Ir) and P(Dm,Ir) and between P(ά,γ) and P(Dm,γ), with p-values of 0.048 and 0.071 and ρ-values of 0.897 and 0.779, respectively. Conclusion: The demonstrated correlations improve the credibility of using 3D fluence-based QA for assuring treatment deliveries for IMRT/VMAT plans. Together with the advantages of high detection sensitivity and better visualization of machine parameter errors, this study further demonstrates the accuracy and feasibility of 3D fluence-based QA in pre-treatment QA and daily QA. Research
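    The Spearman rank correlation used above to relate fluence-based and dose-based passing rates can be sketched as the Pearson correlation of the rank-transformed samples. The per-plan passing rates below are hypothetical toy values, not data from the study.

```python
import math

def _ranks(xs):
    """1-based ranks, averaging the rank over ties."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average rank of the tied group
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman_rho(xs, ys):
    """Spearman correlation: Pearson correlation of the ranks."""
    rx, ry = _ranks(xs), _ranks(ys)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = math.sqrt(sum((a - mx) ** 2 for a in rx))
    sy = math.sqrt(sum((b - my) ** 2 for b in ry))
    return cov / (sx * sy)

# Hypothetical passing rates (%) for 5 plans: 3D fluence-based QA vs 2D dose QA
p_fluence = [98.1, 94.3, 90.5, 96.7, 88.2]
p_dose    = [97.4, 93.0, 91.1, 95.9, 89.5]
print(f"Spearman rho = {spearman_rho(p_fluence, p_dose):.3f}")  # ranks agree here, so rho = 1.000
```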

  18. Intelligent Tutor

    Science.gov (United States)

    1990-01-01

    NASA also seeks to advance American education by employing the technology utilization process to develop a computerized, artificial intelligence-based Intelligent Tutoring System (ITS) to help high school and college physics students. The tutoring system is designed for use with the lecture and laboratory portions of a typical physics instructional program. Its importance lies in its ability to observe continually as a student develops problem solutions and to intervene when appropriate with assistance specifically directed at the student's difficulty and tailored to his skill level and learning style. ITS originated as a project of the Johnson Space Center (JSC). It is being developed by JSC's Software Technology Branch in cooperation with Dr. R. Bowen Loftin at the University of Houston-Downtown. The program is jointly sponsored by NASA and ACOT (Apple Classrooms of Tomorrow). Other organizations providing support include the Texas Higher Education Coordinating Board, the National Research Council, Pennzoil Products Company and the George R. Brown Foundation. The Physics I class of Clear Creek High School, League City, Texas is providing the classroom environment for test and evaluation of the system. The ITS is a spinoff of a product developed earlier to integrate artificial intelligence into training/tutoring systems for NASA astronauts, flight controllers and engineers.

  19. Using FML and fuzzy technology in adaptive ambient intelligent environments

    NARCIS (Netherlands)

    Acampora, G.; Loia, V.

    2005-01-01

    Ambient Intelligence (AmI, for short) gathers the best results from three key technologies: Ubiquitous Computing, Ubiquitous Communication, and Intelligent User Friendly Interfaces. The functional and spatial distribution of tasks is a natural thrust to employ the multi-agent paradigm to design and

  20. Intelligent Design and Intelligent Failure

    Science.gov (United States)

    Jerman, Gregory

    2015-01-01

    Good Evening, my name is Greg Jerman and for nearly a quarter century I have been performing failure analysis on NASA's aerospace hardware. During that time I had the distinct privilege of keeping the Space Shuttle flying for two thirds of its history. I have analyzed a wide variety of failed hardware from simple electrical cables to cryogenic fuel tanks to high temperature turbine blades. During this time I have found that for all the time we spend intelligently designing things, we need to be equally intelligent about understanding why things fail. The NASA Flight Director for Apollo 13, Gene Kranz, is best known for the expression "Failure is not an option." However, NASA history is filled with failures both large and small, so it might be more accurate to say failure is inevitable. It is how we react and learn from our failures that makes the difference.

  1. Intelligent robotic tracker

    Science.gov (United States)

    Otaguro, W. S.; Kesler, L. O.; Land, K. C.; Rhoades, D. E.

    1987-01-01

    An intelligent tracker capable of robotic applications requiring guidance and control of platforms, robotic arms, and end effectors has been developed. This packaged system capable of supervised autonomous robotic functions is partitioned into a multiple processor/parallel processing configuration. The system currently interfaces to cameras but has the capability to also use three-dimensional inputs from scanning laser rangers. The inputs are fed into an image processing and tracking section where the camera inputs are conditioned for the multiple tracker algorithms. An executive section monitors the image processing and tracker outputs and performs all the control and decision processes. The present architecture of the system is presented with discussion of its evolutionary growth for space applications. An autonomous rendezvous demonstration of this system was performed last year. More realistic demonstrations in planning are discussed.

  2. Intelligent multivariate process supervision

    International Nuclear Information System (INIS)

    Visuri, Pertti.

    1986-01-01

    This thesis addresses the difficulties encountered in managing large amounts of data in supervisory control of complex systems. Some previous alarm and disturbance analysis concepts are reviewed and a method for improving the supervision of complex systems is presented. The method, called multivariate supervision, is based on adding low level intelligence to the process control system. By using several measured variables linked together by means of deductive logic, the system can take into account the overall state of the supervised system. Thus, it can present to the operators fewer messages with higher information content than the conventional control systems which are based on independent processing of each variable. In addition, the multivariate method contains a special information presentation concept for improving the man-machine interface. (author)

  3. Intelligent it outsourcing

    CERN Document Server

    Willcocks, Leslie

    2013-01-01

    Intelligent IT Outsourcing enables practitioners to focus on the essential issues that need to be addressed so that the fundamental structure of their sourcing strategy and its implementation is sound. The authors provide insight into the challenges likely to be faced and give detailed advice on how to pre-empt and manage them. IT and outsourcing continue to be problematic, not least because fundamental learning about this subject fails to be applied systematically, and because IT is inherently difficult to manage. The economics are not obvious and emerging technologies have to be addressed. Furthermore, IT goes to the heart of many enterprises and interfaces with multiple business units and processes, and there are continuous skills shortages. Unfortunately, complexities are not removed in outsourced situations, where additional problems come into play, for example the supplier's capabilities, whether the IT is right for an outsourcing solution, and whether the contract is robust but flexible enough to allow f...

  4. Web Implementation of Quality Assurance (QA) for X-ray Units in Balkanic Medical Institutions.

    Science.gov (United States)

    Urošević, Vlade; Ristić, Olga; Milošević, Danijela; Košutić, Duško

    2015-08-01

    Diagnostic radiology is the major contributor to the total dose of the population from all artificial sources. In order to reduce radiation exposure and optimize diagnostic x-ray image quality, it is necessary to increase the quality and efficiency of quality assurance (QA) and audit programs. This work presents a web application providing completely new QA solutions for x-ray modalities and facilities. The software gives complete online information (using European standards) with which the corresponding institutions and individuals can evaluate and control a facility's Radiation Safety and QA program. The software enables storage of all data in one place and sharing the same information (data), regardless of whether the measured data is used by an individual user or by an authorized institution. The software overcomes the distance and time separation of institutions and individuals who take part in QA. Upgrading the software will enable assessment of the medical exposure level to ionizing radiation.

  5. The GSPC: Newest Franchise in al-Qa'ida's Global Jihad

    National Research Council Canada - National Science Library

    Boudali, Lianne K

    2007-01-01

    .... Some observers have speculated that North Africa may be the next safe haven for al-Qa'ida, and that European countries may face a greater risk of attack if Algerian terrorist groups expand their base...

  6. Portland cement concrete pavement review of QC/QA data 2000 through 2009.

    Science.gov (United States)

    2011-04-01

    This report analyzes the Quality Control/Quality Assurance (QC/QA) data for Portland cement concrete pavement (PCCP) projects awarded in the years 2000 through 2009. Analysis of the overall performance of the projects is accomplished by reviewing the Calc...

  7. DeepQA: Improving the estimation of single protein model quality with deep belief networks

    OpenAIRE

    Cao, Renzhi; Bhattacharya, Debswapna; Hou, Jie; Cheng, Jianlin

    2016-01-01

    Background: Protein quality assessment (QA), useful for ranking and selecting protein models, has long been viewed as one of the major challenges for protein tertiary structure prediction. In particular, estimating the quality of a single protein model, which is important for selecting a few good models out of a large model pool consisting of mostly low-quality models, is still a largely unsolved problem. Results: We introduce a novel single-model quality assessment method DeepQA based on deep belie...

  8. Technical Note: Response time evolution of XR-QA2 GafChromic™ film models.

    Science.gov (United States)

    Aldelaijan, Saad; Tomic, Nada; Papaconstadopoulos, Pavlos; Schneider, James; Seuntjens, Jan; Shih, Shelley; Lewis, David; Devic, Slobodan

    2018-01-01

    To evaluate the response of the newest XR-QA2 GafChromic™ film model in terms of postexposure signal growth and energy response in comparison with the older XR-QA (Version 2) model. Pieces of film were irradiated to air kerma in air values up to 12 cGy with several beam qualities (5.3-8.25 mm Al) commonly used for CT scanning. Film response was scored in terms of net reflectance from scanned film images at various points in time postirradiation, ranging from 1 to 7 days and 5 months postexposure. To reconstruct the measurement signal changes with postirradiation delay, we irradiated one film piece and then scanned it at different points in time, starting from 2 min and up to 3 days postexposure. For all beam qualities and the dose range investigated, it appears that the XR-QA2 film signal completely saturated after 15 h. Compared to the 15 h postirradiation scanning time, the observed variations in net reflectance were 3%, 2%, and 1% for film scanned 2 min, 20 min, and 3 h after exposure, respectively, which is well within the measurement uncertainty of the XR-QA2 based reference radiochromic film dosimetry system. A comparison between the XR-QA (Version 2) and the XR-QA2 film response after several months (relative to their responses after 24 h) shows differences of up to 8% and 1% for each film model, respectively. The replacement of cesium bromide in the older XR-QA (Version 2) film model with bismuth oxide in the newer XR-QA2 film, while keeping the same single sensitive layer structure, led to a significantly more stable postexposure response. © 2017 American Association of Physicists in Medicine.
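    Film response scored "in terms of net reflectance" is commonly computed from mean region-of-interest pixel values of the scanned film before and after exposure. A minimal sketch of that calculation, assuming 16-bit scanner pixel values; the pixel values below are hypothetical:

```python
def net_reflectance_change(pv_before, pv_after, bit_depth=16):
    """Net change in reflectance, computed from mean ROI pixel values of
    the scanned film before and after exposure, with pixel values
    normalized to the scanner's maximum value."""
    max_pv = 2 ** bit_depth - 1
    return (pv_before - pv_after) / max_pv

# Hypothetical mean red-channel pixel values from pre- and post-exposure scans
pv_unexposed = 52000
pv_exposed = 44000
delta_r = net_reflectance_change(pv_unexposed, pv_exposed)
print(f"net delta R = {delta_r:.4f}")  # -> net delta R = 0.1221
```

In a postexposure-growth study like this one, `pv_after` would be re-scanned at each time point (2 min, 20 min, 3 h, ...) and the resulting net delta R values compared against the saturated 15 h value.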

  9. Tolerance design of patient-specific range QA using the DMAIC framework in proton therapy.

    Science.gov (United States)

    Rah, Jeong-Eun; Shin, Dongho; Manger, Ryan P; Kim, Tae Hyun; Oh, Do Hoon; Kim, Dae Yong; Kim, Gwe-Ya

    2018-02-01

    To demonstrate that DMAIC (Define-Measure-Analyze-Improve-Control) can be used to customize patient-specific QA by designing site-specific range tolerances. The DMAIC framework tools (process flow diagram, cause and effect, Pareto chart, control chart, and capability analysis) were utilized to determine the steps that need focus for improving the patient-specific QA. The patient-specific range QA plans were selected according to seven treatment site groups, a total of 1437 cases. The process capability index, Cpm, was used to guide the tolerance design of the patient site-specific ranges. For the prostate field, our results suggested that the patient range measurements were capable at the current tolerance level of ±1 mm in clinical proton plans. For other site-specific ranges, our analysis showed that the tolerances tend to be overdesigned, given the insufficient process capability calculated from the patient-specific QA data. Customized tolerances were calculated for the treatment sites. Control charts were constructed to simulate the patient QA time before and after the new tolerances were implemented. It was found that the total simulated QA time decreased by approximately 20% on average after establishing the new site-specific range tolerances. We also simulated the financial impact of this project: QA failure across the whole process in proton therapy would lead to an increase of approximately 30% in total cost. The DMAIC framework can be used to provide effective QA by setting customized tolerances. When tolerance design is customized, quality is reasonably balanced with time and cost demands. © 2017 American Association of Physicists in Medicine.
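    The capability analysis step can be sketched with the Taguchi capability index, Cpm = (USL - LSL) / (6 * sqrt(sigma^2 + (mean - target)^2)), which penalizes both spread and off-target bias of the measured-minus-planned range differences. The range-difference data below are hypothetical illustrative values.

```python
import math

def cpm(samples, lsl, usl, target):
    """Taguchi process capability index:
    Cpm = (USL - LSL) / (6 * sqrt(sigma^2 + (mean - target)^2)),
    using the population variance of the samples."""
    n = len(samples)
    mu = sum(samples) / n
    var = sum((x - mu) ** 2 for x in samples) / n
    return (usl - lsl) / (6 * math.sqrt(var + (mu - target) ** 2))

# Hypothetical measured-minus-planned range differences (mm) for one site,
# evaluated against a +/-1 mm tolerance centered on zero difference.
diffs = [0.2, -0.1, 0.4, 0.0, -0.3, 0.1, 0.3, -0.2]
print(f"Cpm at +/-1 mm tolerance: {cpm(diffs, -1.0, 1.0, 0.0):.2f}")
```

A low Cpm at a candidate tolerance signals that the tolerance is tighter than the process can support, which is the kind of evidence the study uses to redesign site-specific tolerances.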

  10. The implementing of the training, examination and qualification for QA auditor

    International Nuclear Information System (INIS)

    Ma Xiaozheng; Han Peicong; Zhang Zhongyuan; Zhu Guoliang

    2007-01-01

    China Power Investment Corporation has implemented the training, examination and qualification of QA auditors based on the related requirements of the nuclear safety documents. The bases, planning of the procedure, implementing procedure and suggestions for implementing the training, examination and qualification are described in this article. This can be used as a reference for implementing the training, examination and qualification of QA auditors, as well as for establishment of the related guide. (authors)

  11. QA experience at the University of Wisconsin accredited dosimetry calibration laboratory

    Energy Technology Data Exchange (ETDEWEB)

    DeWard, L.A.; Micka, J.A. [Univ. of Wisconsin, Madison, WI (United States)

    1993-12-31

    The University of Wisconsin Accredited Dosimetry Calibration Laboratory (UW ADCL) employs procedure manuals as part of its Quality Assurance (QA) program. One of these manuals covers the QA procedures and results for all of the UW ADCL measurement equipment. The QA procedures are divided into two main areas: QA for laboratory equipment and QA for external chambers sent for calibration. All internal laboratory equipment is checked and recalibrated on an annual basis, after establishing its consistency on a 6-month basis. QA for external instruments involves checking past calibration history as well as comparing to a range of calibration values for specific instrument models. Generally, the authors find that a chamber will have a variation of less than 0.5 % from previous Co-60 calibration factors, and falls within two standard deviations of previous calibrations. If x-ray calibrations are also performed, the energy response of the chamber is plotted and compared to previous instruments of the same model. These procedures give the authors confidence in the transfer of calibration values from National Institute of Standards and Technology (NIST).

  12. QA experience at the University of Wisconsin accredited dosimetry calibration laboratory

    International Nuclear Information System (INIS)

    DeWard, L.A.; Micka, J.A.

    1993-01-01

    The University of Wisconsin Accredited Dosimetry Calibration Laboratory (UW ADCL) employs procedure manuals as part of its Quality Assurance (QA) program. One of these manuals covers the QA procedures and results for all of the UW ADCL measurement equipment. The QA procedures are divided into two main areas: QA for laboratory equipment and QA for external chambers sent for calibration. All internal laboratory equipment is checked and recalibrated on an annual basis, after establishing its consistency on a 6-month basis. QA for external instruments involves checking past calibration history as well as comparing to a range of calibration values for specific instrument models. Generally, the authors find that a chamber will have a variation of less than 0.5 % from previous Co-60 calibration factors, and falls within two standard deviations of previous calibrations. If x-ray calibrations are also performed, the energy response of the chamber is plotted and compared to previous instruments of the same model. These procedures give the authors confidence in the transfer of calibration values from National Institute of Standards and Technology (NIST)

  13. SU-E-T-468: Implementation of the TG-142 QA Process for Seven Linacs with Enhanced Beam Conformance

    Energy Technology Data Exchange (ETDEWEB)

    Woollard, J; Ayan, A; DiCostanzo, D; Grzetic, S; Hessler, J; Gupta, N [OH State University, Columbus, OH (United States)

    2015-06-15

    Purpose: To develop a TG-142 compliant QA process for 7 Varian TrueBeam linear accelerators (linacs) with enhanced beam conformance and dosimetrically matched beam models. To ensure consistent performance of all 7 linacs, the QA process should include a common set of baseline values for use in routine QA on all linacs. Methods: The TG-142 report provides recommended tests, tolerances and frequencies for quality assurance of medical accelerators. Based on the guidance provided in the report, measurement tests were developed to evaluate each of the applicable parameters listed for daily, monthly and annual QA. These tests were then performed on each of our 7 new linacs as they came on line at our institution. Results: The tolerance values specified in TG-142 for each QA test are either absolute tolerances (i.e. ±2 mm) or require a comparison to a baseline value. The results of our QA tests were first used to ensure that all 7 linacs were operating within the suggested tolerance values provided in TG-142 for those tests with absolute tolerances and that the performance of the linacs was adequately matched. The QA test results were then used to develop a set of common baseline values for those QA tests that require comparison to a baseline value at routine monthly and annual QA. The procedures and baseline values were incorporated into spreadsheets for use in monthly and annual QA. Conclusion: We have developed a set of procedures for daily, monthly and annual QA of our linacs that are consistent with the TG-142 report. A common set of baseline values was developed for routine QA tests. The use of this common set of baseline values for comparison at monthly and annual QA will ensure consistent performance of all 7 linacs.
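    The two kinds of tolerance checks described above (absolute tolerances such as ±2 mm, and comparisons against a common baseline shared by all matched linacs) can be sketched as follows. The function names and the example measurements are hypothetical, not part of the TG-142 report.

```python
def check_absolute(measured, nominal, tol):
    """Pass if the measurement is within an absolute tolerance of nominal
    (e.g., a +/-2 mm positional tolerance)."""
    return abs(measured - nominal) <= tol

def check_vs_baseline(measured, baseline, tol_percent):
    """Pass if the measurement is within a percentage of the common
    baseline value shared by all dosimetrically matched linacs."""
    return abs(measured - baseline) / baseline * 100 <= tol_percent

# Hypothetical monthly QA results for one matched linac:
# laser localization against a +/-1.5 mm absolute tolerance...
print(check_absolute(measured=1.1, nominal=0.0, tol=1.5))            # True
# ...and output constancy against the common baseline at 2% tolerance.
print(check_vs_baseline(measured=1.012, baseline=1.000, tol_percent=2.0))  # True
```

Keeping one shared baseline per test (rather than per-machine baselines) is what lets a single spreadsheet flag any of the 7 matched linacs that drifts away from the fleet.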

  14. SU-E-T-468: Implementation of the TG-142 QA Process for Seven Linacs with Enhanced Beam Conformance

    International Nuclear Information System (INIS)

    Woollard, J; Ayan, A; DiCostanzo, D; Grzetic, S; Hessler, J; Gupta, N

    2015-01-01

    Purpose: To develop a TG-142 compliant QA process for 7 Varian TrueBeam linear accelerators (linacs) with enhanced beam conformance and dosimetrically matched beam models. To ensure consistent performance of all 7 linacs, the QA process should include a common set of baseline values for use in routine QA on all linacs. Methods: The TG-142 report provides recommended tests, tolerances and frequencies for quality assurance of medical accelerators. Based on the guidance provided in the report, measurement tests were developed to evaluate each of the applicable parameters listed for daily, monthly and annual QA. These tests were then performed on each of our 7 new linacs as they came on line at our institution. Results: The tolerance values specified in TG-142 for each QA test are either absolute tolerances (i.e. ±2 mm) or require a comparison to a baseline value. The results of our QA tests were first used to ensure that all 7 linacs were operating within the suggested tolerance values provided in TG-142 for those tests with absolute tolerances and that the performance of the linacs was adequately matched. The QA test results were then used to develop a set of common baseline values for those QA tests that require comparison to a baseline value at routine monthly and annual QA. The procedures and baseline values were incorporated into spreadsheets for use in monthly and annual QA. Conclusion: We have developed a set of procedures for daily, monthly and annual QA of our linacs that are consistent with the TG-142 report. A common set of baseline values was developed for routine QA tests. The use of this common set of baseline values for comparison at monthly and annual QA will ensure consistent performance of all 7 linacs.

  15. BWR shutdown analyzer using artificial intelligence (AI) techniques

    International Nuclear Information System (INIS)

    Cain, D.G.

    1986-01-01

    A prototype alarm system for detecting abnormal reactor shutdowns based on artificial intelligence technology is described. The system incorporates knowledge about Boiling Water Reactor (BWR) plant design and component behavior, as well as the knowledge required to distinguish normal, abnormal, and ATWS accident conditions. The system was developed using a software tool environment for creating knowledge-based applications on a LISP machine. To facilitate prototype implementation and evaluation, a causal simulation of BWR shutdown sequences was developed and interfaced with the alarm system. An intelligent graphics interface for execution and control is described. System performance considerations and general observations on applying artificial intelligence to nuclear power plant problems are provided.

  16. Business Intelligence

    OpenAIRE

    Petersen, Anders

    2001-01-01

    The aim of this bachelor thesis is to introduce Business Intelligence and to survey a development trend that is shaping the form of enterprise Business Intelligence solutions: Business Activity Monitoring. The topic was treated by studying the specialist literature, in both Czech and English. The main contribution of the thesis is a coherent treatment of Business Activity Monitoring written in Czech. The thesis is divided into six main chapters. The first five are devoted to ...

  17. Event tree analysis using artificial intelligence techniques

    International Nuclear Information System (INIS)

    Dixon, B.W.; Hinton, M.F.

    1985-01-01

    Artificial Intelligence (AI) techniques used in Expert Systems and Object Oriented Programming are discussed as they apply to Event Tree Analysis. A SeQUence IMPortance calculator, SQUIMP, is presented to demonstrate the implementation of these techniques. Benefits of using AI methods include ease of programming, efficiency of execution, and flexibility of application. The importance of an appropriate user interface is stressed. 5 figs

  18. Creating Business Intelligence from Course Management Systems

    Science.gov (United States)

    van Dyk, Liezl; Conradie, Pieter

    2007-01-01

    Purpose: This article seeks to address the interface between individual learning facilitators that use course management systems (CMS) data to support decision-making and course design and institutional infrastructure providers that are responsible for institutional business intelligence. Design/methodology/approach: The design of a data warehouse…

  19. Web Intelligence and Artificial Intelligence in Education

    Science.gov (United States)

    Devedzic, Vladan

    2004-01-01

    This paper surveys important aspects of Web Intelligence (WI) in the context of Artificial Intelligence in Education (AIED) research. WI explores the fundamental roles as well as practical impacts of Artificial Intelligence (AI) and advanced Information Technology (IT) on the next generation of Web-related products, systems, services, and…

  20. QA programme in external radiotherapy in Romania - status and perspective

    International Nuclear Information System (INIS)

    Dumitrescu, A.; Milu, C.

    2008-01-01

    Full text: Recognizing the importance of quality assurance in radiotherapy and the need to make access to radiation standards traceable to the international measurement system for every radiotherapy center, the Romanian national secondary standard dosimetry laboratory (SSDL) started in 1999, together with the IAEA, a national quality audit programme in all the centers for external radiotherapy in Romania. At present, there are 17 radiotherapy centers in Romania, with a total of 19 teletherapy units and 4 LINACs. The programme has 3 phases: the first phase was to organize a survey in all radiotherapy centers, to collect general information on their radiotherapists, medical physicists, type of equipment, dosimeters, etc. Following the survey, a quality assurance network was set up, and on-site dosimetry reviews were arranged according to a suitable timetable. The second phase consisted of performing the reference dosimetry and the calibration of the equipment. Then, a quality audit system based on mailed TLDs was applied to all radiation beams produced by cobalt-60 therapy units and medical accelerators, in order to identify discrepancies in dosimetry larger than ± 3%. At the same time, the beam calibration performed by the SSDLs was verified. The results of the first survey were analyzed, and corrective actions were taken. A second survey was then organized, based on the mailed TLDs. This paper presents in detail the entire QA programme, its results, and the actions to be taken in order to improve the accuracy and consistency of dosimetry in clinical radiotherapy in Romania. (author)
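The ± 3% mailed-TLD audit criterion described above reduces to a simple relative-deviation check per beam. A minimal sketch in Python (function names and the example doses are illustrative, not taken from the Romanian programme):

```python
def dose_discrepancy(stated_dose_gy, tld_dose_gy):
    """Relative deviation (%) of the TLD-measured dose from the stated dose."""
    return 100.0 * (tld_dose_gy - stated_dose_gy) / stated_dose_gy

def within_tolerance(stated_dose_gy, tld_dose_gy, tolerance_pct=3.0):
    """True if the beam passes the +/-3% mailed-TLD audit criterion."""
    return abs(dose_discrepancy(stated_dose_gy, tld_dose_gy)) <= tolerance_pct

# Example: a stated 2.00 Gy beam measured at 2.05 Gy by the mailed TLD
print(round(dose_discrepancy(2.00, 2.05), 2))   # 2.5
print(within_tolerance(2.00, 2.05))             # True
```

Beams failing the check would trigger the corrective actions and follow-up survey described in the abstract.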

  1. QA/QC For Radon Concentration Measurement With Charcoal Canister

    International Nuclear Information System (INIS)

    Pantelic, G.; Zivanovic, M.; Rajacic, M.; Krneta Nikolic, J.; Todorovic, D.

    2015-01-01

    The primary concern of any measurement of radon or radon progeny must be the quality of the results. A good quality assurance program, when properly designed and diligently followed, ensures that laboratory staff will be able to produce the type and quality of measurement results that are needed and expected. Activated charcoal detectors are used for testing the concentration of radon in dwellings. The method of measurement is based on radon adsorption on charcoal and measurement of the gamma radiation of the radon daughters. After the detectors are closed, the measurement is carried out once equilibrium between radon and its daughters has been reached (at least 3 hours), using a NaI or HPGe detector. Radon concentrations as well as measurement uncertainties were calculated according to US EPA protocol 520/5-87-005. Detectors used for the measurements were calibrated with a 226Ra standard of known activity in the same geometry. Standard and background canisters are used for QA and QC, as well as for the calibration of the measurement equipment. A standard canister is a sealed canister with the same matrix and geometry as the canisters used for measurements, but with a known activity of radon. A background canister is a regular radon measurement canister that has never been exposed. The detector background and detector efficiency are measured to ascertain whether they are within the warning and acceptance limits. (author).
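The warning/acceptance-limit test applied to the background and efficiency checks is, in essence, a control-chart classification. A hedged sketch, where the 2σ/3σ limits and the numeric values are assumed for illustration rather than taken from the paper:

```python
def qc_status(value, mean, sigma, warn=2.0, action=3.0):
    """Classify a QC measurement (e.g. background count rate or detector
    efficiency) against warning (mean +/- 2 sigma) and acceptance
    (mean +/- 3 sigma) limits on a control chart."""
    z = abs(value - mean) / sigma
    if z <= warn:
        return "in control"
    if z <= action:
        return "warning"
    return "out of control"

# Hypothetical efficiency checks against an established mean of 0.120 (sigma = 0.004)
print(qc_status(0.123, 0.120, 0.004))  # in control
print(qc_status(0.130, 0.120, 0.004))  # warning
print(qc_status(0.135, 0.120, 0.004))  # out of control
```

A "warning" result typically prompts a repeat measurement; "out of control" stops measurements until the cause is found.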

  2. Automated QA framework for PetaScale data challenges

    International Nuclear Information System (INIS)

    Van Buren, G; Didenko, L; Lauret, J; Oldag, E; Ray, L

    2011-01-01

    Over the lifetime of the STAR Experiment, a large investment of workforce time has gone into a variety of QA efforts, including continuous processing of a portion of the data for automated calibration and iterative convergence and quality assurance purposes. A rotating workforce coupled with ever-increasing volumes of information to examine led to sometimes inconsistent or incomplete reporting of issues, eventually leading to additional work. The traditional approach of manually screening a data sample was no longer adequate and doomed to eventual failure with planned future growth in data extents. To prevent this collapse we have developed a new system employing user-defined reference histograms, permitting automated comparisons and nagging of issues. Based on the ROOT framework at its core, the front end is a web based service allowing shift personnel to visualize the results, and to set test parameters and thresholds defining success or failure. The versatile and flexible approach allows for a slew of histograms to be configured and grouped into categories (results and thresholds may depend on experimental triggers and data types) ensuring framework evolution with the years of running to come. Historical information is also saved to track changes and allow for rapid convergence of future tuning. Database storage and processing of data are handled outside the web server for security and fault tolerance.
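The core of such a framework, comparing a freshly produced histogram against a user-defined reference with a configurable threshold, can be sketched as a shape comparison in plain Python (the actual STAR system is built on ROOT, and its thresholds are operator-defined; the function, data, and 10% threshold here are assumptions for illustration):

```python
def histogram_check(test, reference, threshold=0.1):
    """Compare a test histogram to a user-defined reference bin by bin.
    Returns (passed, max_fractional_difference). Histograms are
    normalized first, so the comparison is shape-based and insensitive
    to the size of the data sample. Bins empty in the reference are skipped."""
    t_sum, r_sum = sum(test), sum(reference)
    max_diff = 0.0
    for t, r in zip(test, reference):
        if r == 0:
            continue
        diff = abs(t / t_sum - r / r_sum) / (r / r_sum)
        max_diff = max(max_diff, diff)
    return max_diff <= threshold, max_diff

ref  = [100, 200, 400, 200, 100]
good = [ 52, 101, 198, 103,  49]   # same shape, half the statistics
bad  = [ 52, 101, 300, 103,  49]   # distorted peak
print(histogram_check(good, ref)[0])  # True
print(histogram_check(bad, ref)[0])   # False
```

In the real system a failing comparison would raise a flag for shift personnel rather than just return False, and the thresholds may depend on trigger and data type as the abstract notes.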

  3. Final Hanford Site Transuranic (TRU) Waste Characterization QA Project Plan

    International Nuclear Information System (INIS)

    GREAGER, T.M.

    2000-01-01

    The Quality Assurance Project Plan (QAPjP) has been prepared for waste characterization activities to be conducted by the Transuranic (TRU) Project at the Hanford Site to meet requirements set forth in the Waste Isolation Pilot Plant (WIPP) Hazardous Waste Facility Permit, 4890139088-TSDF, Attachment B, including Attachments B1 through B6 (WAP) (DOE, 1999a). The QAPjP describes the waste characterization requirements and includes test methods, details of planned waste sampling and analysis, and a description of the waste characterization and verification process. In addition, the QAPjP includes a description of the quality assurance/quality control (QA/QC) requirements for the waste characterization program. Before TRU waste is shipped to the WIPP site by the TRU Project, all applicable requirements of the QAPjP shall be implemented. Additional requirements necessary for transportation to waste disposal at WIPP can be found in the ''Quality Assurance Program Document'' (DOE 1999b) and HNF-2600, ''Hanford Site Transuranic Waste Certification Plan.'' TRU mixed waste contains both TRU radioactive and hazardous components, as defined in the WIPP-WAP. The waste is designated and separately packaged as either contact-handled (CH) or remote-handled (RH), based on the radiological dose rate at the surface of the waste container. RH TRU wastes are not currently shipped to the WIPP facility

  4. Reasoning about Users' Actions in a Graphical User Interface.

    Science.gov (United States)

    Virvou, Maria; Kabassi, Katerina

    2002-01-01

    Describes a graphical user interface called IFM (Intelligent File Manipulator) that provides intelligent help to users. Explains two underlying reasoning mechanisms, one an adaptation of human plausible reasoning and one that performs goal recognition based on the effects of users' commands; and presents results of an empirical study that…

  5. Intelligent Growth Automaton of Virtual Plant Based on Physiological Engine

    Science.gov (United States)

    Zhu, Qingsheng; Guo, Mingwei; Qu, Hongchun; Deng, Qingqing

    In this paper, a novel intelligent growth automaton for virtual plants is proposed. The automaton first analyzes the branching pattern, which is controlled by genes, and then builds the plant; moreover, it stores plant growth information, provides the interface between the virtual plant and its environment, and controls the growth and development of the plant on the basis of the environment and the function of the plant organs. The automaton can thus simulate plant growth as governed by the genetic information system, environmental conditions, and organ function. The experimental results show that the intelligent growth automaton can simulate the growth of a plant conveniently and vividly.

  6. WE-G-BRA-02: SafetyNet: Automating Radiotherapy QA with An Event Driven Framework

    Energy Technology Data Exchange (ETDEWEB)

    Hadley, S; Kessler, M [The University of Michigan, Ann Arbor, MI (United States); Litzenberg, D [Univ Michigan, Ann Arbor, MI (United States); Lee, C; Irrer, J; Chen, X; Acosta, E; Weyburne, G; Lam, K; Younge, K; Matuszak, M [University of Michigan, Ann Arbor, MI (United States); Keranen, W [Varian Medical Systems, Palo Alto, CA (United States); Covington, E [University of Michigan Hospital and Health System, Ann Arbor, MI (United States); Moran, J [Univ Michigan Medical Center, Ann Arbor, MI (United States)

    2015-06-15

    Purpose: Quality assurance is an essential task in radiotherapy that often involves many manual steps. We investigate the use of an event-driven framework in conjunction with software agents to automate QA and eliminate wait times. Methods: An in-house-developed subscription-publication service, EventNet, was added to the Aria OIS to act as a message broker for critical events occurring in the OIS and in software agents. The software agents operate without user intervention and perform critical QA steps. The results of the QA are documented, and the resulting event is generated and passed back to EventNet. Users can subscribe to those events and receive messages based on custom filters designed to send passing or failing results to physicists or dosimetrists. Agents were developed to expedite the following QA tasks: Plan Revision, Plan 2nd Check, SRS Winston-Lutz isocenter, Treatment History Audit, and Treatment Machine Configuration. Results: Plan approval in the Aria OIS was used as the event trigger for the Plan Revision QA and Plan 2nd Check agents. The agents pulled the plan data, executed the prescribed QA, stored the results, and updated EventNet for publication. The Winston-Lutz agent reduced QA time from 20 minutes to 4 minutes and provided a more accurate quantitative estimate of the radiation isocenter. The Treatment Machine Configuration agent automatically reports any changes to the treatment machine or HDR unit configuration. The agents are reliable, act immediately, and execute each task identically every time. Conclusion: An event-driven framework has inverted the data chase in our radiotherapy QA process. Rather than having dosimetrists and physicists push data to QA software and pull results back into the OIS, the software agents perform these steps immediately upon receiving the sentinel events from EventNet. Mr Keranen is an employee of Varian Medical Systems. Dr. Moran’s institution receives research support for her effort for a linear accelerator QA project from
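The subscription-publication pattern the abstract describes can be sketched as a minimal broker with one software agent. The class, event names, and payloads here are illustrative only, not the actual EventNet or Aria API:

```python
from collections import defaultdict

class EventBroker:
    """Minimal subscription-publication broker in the spirit of EventNet."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, event_type, handler):
        self._subscribers[event_type].append(handler)

    def publish(self, event_type, payload):
        # Deliver the event to every handler subscribed to this type
        for handler in self._subscribers[event_type]:
            handler(payload)

broker = EventBroker()
results = []

def second_check_agent(plan):
    # A software agent: runs its QA step on the approved plan,
    # then publishes the outcome back to the broker
    outcome = {"plan": plan["id"], "status": "pass"}
    broker.publish("qa_result", outcome)

broker.subscribe("plan_approved", second_check_agent)
broker.subscribe("qa_result", results.append)   # e.g. a physicist's filter

broker.publish("plan_approved", {"id": "Plan-42"})
print(results)  # [{'plan': 'Plan-42', 'status': 'pass'}]
```

This illustrates the inversion the conclusion describes: the agent reacts to the sentinel event immediately, instead of a person pushing data to QA software and pulling results back.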

  7. WE-G-BRA-02: SafetyNet: Automating Radiotherapy QA with An Event Driven Framework

    International Nuclear Information System (INIS)

    Hadley, S; Kessler, M; Litzenberg, D; Lee, C; Irrer, J; Chen, X; Acosta, E; Weyburne, G; Lam, K; Younge, K; Matuszak, M; Keranen, W; Covington, E; Moran, J

    2015-01-01

    Purpose: Quality assurance is an essential task in radiotherapy that often involves many manual steps. We investigate the use of an event-driven framework in conjunction with software agents to automate QA and eliminate wait times. Methods: An in-house-developed subscription-publication service, EventNet, was added to the Aria OIS to act as a message broker for critical events occurring in the OIS and in software agents. The software agents operate without user intervention and perform critical QA steps. The results of the QA are documented, and the resulting event is generated and passed back to EventNet. Users can subscribe to those events and receive messages based on custom filters designed to send passing or failing results to physicists or dosimetrists. Agents were developed to expedite the following QA tasks: Plan Revision, Plan 2nd Check, SRS Winston-Lutz isocenter, Treatment History Audit, and Treatment Machine Configuration. Results: Plan approval in the Aria OIS was used as the event trigger for the Plan Revision QA and Plan 2nd Check agents. The agents pulled the plan data, executed the prescribed QA, stored the results, and updated EventNet for publication. The Winston-Lutz agent reduced QA time from 20 minutes to 4 minutes and provided a more accurate quantitative estimate of the radiation isocenter. The Treatment Machine Configuration agent automatically reports any changes to the treatment machine or HDR unit configuration. The agents are reliable, act immediately, and execute each task identically every time. Conclusion: An event-driven framework has inverted the data chase in our radiotherapy QA process. Rather than having dosimetrists and physicists push data to QA software and pull results back into the OIS, the software agents perform these steps immediately upon receiving the sentinel events from EventNet. Mr Keranen is an employee of Varian Medical Systems. Dr. Moran’s institution receives research support for her effort for a linear accelerator QA project from

  8. Intelligent control and automation technology for nuclear applications

    International Nuclear Information System (INIS)

    Kim, Jae Hee; Kim, Ko Ryeo; Lee, Jae Cheol; Eom, Heung Seop; Lee, Jang Soo

    1994-01-01

    Using recently established intelligent mobile robot theory and high technologies in computer science, we have designed an inspection automation system for the welded parts of the reactor vessel, with the aim of establishing the underlying basic technologies. The recent status of these technologies is surveyed across various application areas, and the characteristics and availability of techniques such as intelligent mobile robots, digital computer control, intelligent user interfaces, real-time data processing, ultrasonic signal processing, and intelligent defect recognition are studied and examined first. A high-performance, compact inspection system is designed which, if implemented, is expected to be very efficient from an economic point of view. In addition, the use of an integrated software system leads to a reduction in human errors. Through the analysis results and experience gained, we investigated the further feasibility of applying these basic technologies to various similar operating systems in NPPs. (Author)

  9. Organic interfaces

    NARCIS (Netherlands)

    Poelman, W.A.; Tempelman, E.

    2014-01-01

    This paper deals with the consequences for product designers resulting from the replacement of traditional interfaces by responsive materials. Part 1 presents a theoretical framework regarding a new paradigm for man-machine interfacing. Part 2 provides an analysis of the opportunities offered by new

  10. Interface Realisms

    DEFF Research Database (Denmark)

    Pold, Søren

    2005-01-01

    This article argues for seeing the interface as an important representational and aesthetic form with implications for postmodern culture and digital aesthetics. The interface emphasizes realism due in part to the desire for transparency in Human-Computer Interaction (HCI) and partly...

  11. Intelligence and negotiating

    International Nuclear Information System (INIS)

    George, D.G.

    1990-01-01

    This paper discusses the role of US intelligence during arms control negotiations between 1982 and 1987. It also covers: the orchestration of intelligence projects; an evaluation of the performance of intelligence activities; the effect intelligence work had on actual arms negotiations; and suggestions for improvements in the future.

  12. Intelligent products : A survey

    NARCIS (Netherlands)

    Meyer, G.G.; Främling, K.; Holmström, J.

    This paper presents an overview of the field of Intelligent Products. As Intelligent Products have many facets, this paper is mainly focused on the concept behind Intelligent Products, the technical foundations, and the achievable practical goals of Intelligent Products. A novel classification of

  13. Intelligence Issues for Congress

    Science.gov (United States)

    2013-04-23

    open source information — OSINT (newspapers...by user agencies. Section 1052 of the Intelligence Reform Act expressed the sense of Congress that there should be an open source intelligence...center to coordinate the collection, analysis, production, and dissemination of open source intelligence to other intelligence agencies. An Open Source

  14. Quality assurance (QA) training at Westinghouse including innovative approaches for achieving an effective QA programme and establishing constructive interaction

    International Nuclear Information System (INIS)

    Chivers, J.H.; Scanga, B.E.

    1982-01-01

    Experience of the Westinghouse Water Reactors Division with indoctrination and training of quality engineers includes training of personnel from Westinghouse divisions in the USA and overseas as well as of customers' personnel. A written plan is prepared for each trainee in order to fit the training to the individual's needs, and to cover the full range of information and activities. The trainee is also given work assignments, working closely with experienced quality engineers. He may prepare inspection plans and audit check lists, assist in the preparation of QA training modules, write procedures, and perform supplier surveillance and data analyses, or make special studies of operating systems. The trainee attends seminars and special courses on work-related technical subjects. Throughout the training period, emphasis is placed on inculcating an attitude of team work in the trainee so that the result of the training is the achievement of both quality and productivity. Certification is extended (given that education/experience/skill requirements are met) to such functions as mechanical equipment quality engineering, electrical equipment quality engineering, and start-up and testing quality engineering. A well-trained quality engineer is equipped to provide technical assistance to other disciplines and, through effective co-operation with others, contributes to the success of the organization's endeavours. (author)

  15. Intelligent Governmentality

    Directory of Open Access Journals (Sweden)

    Willem de Lint

    2008-10-01

    Full Text Available Recently, within liberal democracies, the post-Westphalian consolidation of security and intelligence has ushered in the normalization not only of security in ‘securitization’ but also of intelligence in what is proposed here as ‘intelligencification.’ In outlining the features of intelligencified governance, my aim is to interrogate the view that effects or traces, and productivity rather than negation, is as persuasive as commonly thought by the constructivists. After all, counter-intelligence is both about purging and reconstructing the archive for undisclosed values. In practice, what is being normalized is the authorized and legalized use of release and retention protocols for politically actionable information. The intelligencification of governmentality affords a sovereignty shell-game, or the instrumentalization of sovereign power by interests that are dependent on, yet often inimical to, the power of state, national, and popular sovereignty.

  16. Pathogen intelligence

    Directory of Open Access Journals (Sweden)

    Michael eSteinert

    2014-01-01

    Full Text Available Different species inhabit different sensory worlds and thus have evolved diverse means of processing information, learning and memory. In the escalated arms race with host defense, each pathogenic bacterium not only has evolved its individual cellular sensing and behaviour, but also collective sensing, interbacterial communication, distributed information processing, joint decision making, dissociative behaviour, and the phenotypic and genotypic heterogeneity necessary for epidemiologic success. Moreover, pathogenic populations take advantage of dormancy strategies and rapid evolutionary speed, which allow them to save co-generated intelligent traits in a collective genomic memory. This review discusses how these mechanisms add further levels of complexity to bacterial pathogenicity and transmission, and how mining for these mechanisms could help to develop new anti-infective strategies.

  17. Intelligent Routines

    CERN Document Server

    Anastassiou, George A

    “Intelligent Routines II: Solving Linear Algebra and Differential Geometry with Sage” contains numerous examples and problems, as well as many unsolved problems. The book extensively applies the successful software Sage, which is available free online at http://www.sagemath.org/. Sage is a recent and popular software package for mathematical computation, freely available and simple to use. The book is useful to all applied scientists in mathematics, statistics and engineering, as well as to late undergraduate and graduate students of those subjects. It is the first such book on solving problems in Linear Algebra and Differential Geometry symbolically with Sage. Plenty of Sage applications are given at each step of the exposition.

  18. SU-F-T-287: A Preliminary Study On Patient Specific VMAT Verification Using a Phosphor-Screen Based Geometric QA System (Raven QA)

    International Nuclear Information System (INIS)

    Lee, M; Yi, B; Wong, J; Ding, K

    2016-01-01

    Purpose: The RavenQA system (LAP Laser, Germany) is a QA device with a phosphor-screen detector for performing the QA tasks of TG-142. This study tested whether it is feasible to use the system for patient-specific QA of Volumetric Modulated Arc Therapy (VMAT). Methods: Water-equivalent material (5 cm) is attached to the front of the RavenQA detector plate for dosimetry purposes. The plate is then attached to the gantry to synchronize the movement of the detector and the gantry. Since the detector moves together with the gantry, the ’Reset gantry to 0’ function of the Eclipse planning system (Varian, CA) is used to simulate the measurement situation when calculating the dose to the detector plate. The same gantry setup is used when delivering the treatment beam for feasibility testing. Cumulative dose is acquired for each arc. The optical scatter component of each image captured by the CCD camera is corrected by deconvolving a 2D spatially invariant optical scatter kernel (OSK). We assume that the OSK is a 2D isotropic point spread function that decreases as the inverse square of the radius from the center. Results: Three VMAT plans, head & neck, whole pelvis, and abdomen-pelvis, were tested. Setup time for measurements was less than 5 minutes. Absolute gamma passing rates were 99.3%, 98.2%, and 95.9%, respectively, for the 3%/3mm criteria, and 96.2%, 97.1%, and 86.4% for the 2%/2mm criteria. The abdomen-pelvis plan has long treatment fields (37 cm), longer than the detector plate (25 cm); this plan showed a relatively lower passing rate than the others. Conclusion: An algorithm for IMRT/VMAT verification using the RavenQA has been developed and tested. The model of a spatially invariant OSK works well for deconvolution purposes. It is proved that the RavenQA can be used for patient-specific verification of VMAT. This work is funded in part by a Maryland Industrial Partnership Program grant to University of Maryland and to JPLC who owns the
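The gamma passing rates quoted above come from a gamma-index comparison of measured and calculated dose distributions. A 1D sketch of the global 3%/3mm test (real VMAT QA compares 2D/3D dose images with a finer spatial search; the profiles below are made up for illustration):

```python
import math

def gamma_pass_rate(measured, calculated, spacing_mm, dose_tol=0.03, dist_mm=3.0):
    """Global 1D gamma passing rate (%). For each measured point, find the
    minimum combined dose-difference / distance-to-agreement metric over
    all calculated points; the point passes if that minimum is <= 1."""
    d_max = max(calculated)          # global normalization dose
    passed = 0
    for i, dm in enumerate(measured):
        best = float("inf")
        for j, dc in enumerate(calculated):
            dist = (i - j) * spacing_mm
            dose = (dm - dc) / (dose_tol * d_max)
            best = min(best, math.hypot(dist / dist_mm, dose))
        if best <= 1.0:
            passed += 1
    return 100.0 * passed / len(measured)

calc = [0.0, 0.2, 0.8, 1.0, 0.8, 0.2, 0.0]
meas = [0.0, 0.21, 0.82, 0.99, 0.79, 0.2, 0.0]
print(gamma_pass_rate(meas, calc, spacing_mm=1.0))  # 100.0
```

Tightening `dose_tol` and `dist_mm` to 2%/2mm reproduces the effect seen in the results: the same data yield a lower passing rate under the stricter criteria.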

  19. ZD-I intelligent scaler

    International Nuclear Information System (INIS)

    Zhen Zhihao; Zhou Weimin

    1999-01-01

    The ZD-I Intelligent Scaler is a new high-performance scaler based on a high-speed CMOS96-series single-chip processor. Besides the normal timing and counting functions, it can also supply 0-2000 V high voltage, store or print measured data, and communicate with a PC over an RS232 interface to transfer the measured data. The front panel is essentially improved: a keyboard without dial switches or rheostats performs all operations, and the set parameters are not lost when the scaler is switched off. The ZD-I Intelligent Scaler is thus a thoroughly modern successor to the NIM-style scaler and the HV generator.

  20. ZD-I intelligent scaler

    International Nuclear Information System (INIS)

    Chen Zhihao

    2001-01-01

    The ZD-I Intelligent Scaler is a new high-performance scaler based on a high-speed CMOS96-series single-chip processor. It not only provides the normal timing and counting functions, but can also supply 0-2000 V high voltage, store or print measured data, and communicate with a PC over an RS232 interface to transfer the measured data. The front panel is essentially improved: a keyboard without dial switches or rheostats performs all operations, and the set parameters are not lost when the scaler is shut down. The ZD-I Intelligent Scaler is thus a thoroughly modern update of the NIM-style scaler and the HV generator.

  1. Impact and payback of a QA/QC program for steam-water chemistry

    International Nuclear Information System (INIS)

    Lerman, S.I.; Wilson, D.

    1992-01-01

    QA/QC programs for analytical laboratories and in-line instrumentation are essential if we are to have any faith in the data they produce. When the analytes are at trace levels, as they frequently are in a steam-water cycle, the importance of QA/QC increases by an order of magnitude. The cost and resources of such a program, although worth it, are frequently underestimated. QA/QC is much more than running a standard several times a week. This paper discusses some of the essential elements of such a program, compares them to the cost, and points out the impact of not having such a program. RP-2712-3 showed how essential QA/QC is to understanding the limitations of instruments performing trace analysis of water. What it did not do, nor was it intended to do, is discuss how good reliability can be in your own plant. QA programs that include training of personnel, written procedures, and comprehensive maintenance and inventory programs ensure optimum performance of chemical monitors. QC samples run regularly allow plant personnel to respond to poor performance in a timely manner, appropriate to plant demands. Proper data management establishes the precision information necessary to determine how good our measurements are. Generally, the plant has the advantage of a central laboratory to perform corroborative analysis, and a comprehensive QA/QC program will integrate the plant monitoring operations with the central lab. Where trace analysis is concerned, attention to detail becomes paramount. Instrument performance may be below expected levels, and instruments are probably being run at the bottom end of their optimum range. Without QA/QC the plant manager can have no confidence in analytical results. Poor steam-water chemistry can go unnoticed, causing system deterioration. We can't afford to wait for another RP-2712-3 to tell us how good our data is.

  2. WE-B-BRD-03: MR QA/QC for MRgRT

    Energy Technology Data Exchange (ETDEWEB)

    Layman, R. [Ohio State Univ (United States)

    2015-06-15

    The use of MRI in radiation therapy is rapidly increasing. Applications vary from the MRI simulator, to MRI fused with CT, to the integrated MRI+RT system. Compared with standard MRI QA, a broader scope of QA features has to be defined in order to maximize the benefits of using MRI in radiation therapy. These QA features include geometric fidelity, image registration, motion management, cross-system alignment, and hardware interference. Advanced MRI techniques require a specific type of QA, as they are being widely used in radiation therapy planning, dose calculation, post-implant dosimetry, and prognosis. A rigorous and adaptive QA program is crucial to defining the responsibility of the entire radiation therapy group and detecting deviations from the performance of high-quality treatment. As a drastic departure from CT simulation, MRI simulation requires changes in the workflow of treatment planning and image guidance. MRI-guided radiotherapy platforms are being developed and commercialized to take advantage of advances in knowledge, technology and clinical experience. This symposium will, from an educational perspective, discuss the scope of and specific issues related to MRI-guided radiotherapy. Learning Objectives: Understand the difference between a standard and a radiotherapy-specific MRI QA program. Understand the effects of MRI artifacts (geometric distortion and motion) on radiotherapy. Understand advanced MRI techniques (ultrashort echo, fast MRI including dynamic MRI and 4DMRI, diffusion, perfusion, and MRS) and related QA. Understand the methods used to prepare MRI for treatment planning (electron density assignment, multimodality image registration, segmentation and motion management). Understand the current status of MRI-guided treatment platforms. Dr. Jihong Wang has a research grant with the Elekta-MRL project. Dr. Ke Sheng receives research grants from Varian Medical Systems.

  3. Intelligence: Real or artificial?

    OpenAIRE

    Schlinger, Henry D.

    1992-01-01

    Throughout the history of the artificial intelligence movement, researchers have strived to create computers that could simulate general human intelligence. This paper argues that workers in artificial intelligence have failed to achieve this goal because they adopted the wrong model of human behavior and intelligence, namely a cognitive essentialist model with origins in the traditional philosophies of natural intelligence. An analysis of the word “intelligence” suggests that it originally r...

  4. Environmental analytical laboratory setup operation and QA/QC

    International Nuclear Information System (INIS)

    Hsu, J.P.; Boyd, J.A.; DeViney, S.

    1991-01-01

    Environmental analysis requires precise and timely measurements. Precise measurement is ensured with quality control, and timeliness through an efficient operation. The efficiency of the operation also ensures cost-competitiveness. Environmental analysis plays a very important role in any environmental protection program. Due to possible litigation involvement, most environmental analyses follow stringent criteria, such as the U.S. EPA Contract Laboratory Program procedures, with analytical results documented in an orderly manner. The documentation demonstrates that all quality control steps are followed and facilitates data evaluation to determine the quality and usefulness of the data. Furthermore, the detailed records concerning sample checking, chain-of-custody, standard and surrogate preparation, daily refrigerator and oven temperature monitoring, analytical and extraction logbooks, standard operating procedures, etc., are also an important part of the laboratory documentation. Quality control for environmental analysis is becoming more stringent, required documentation is becoming more detailed, and turnaround times are shorter. At the same time, the business is becoming more cost-competitive, and it appears that this trend will continue. In this paper, we discuss how to manage this high-quality, fast-paced and exacting environmental analysis process at a competitive cost. The key to success in environmental analysis is people: the knowledge and experience of the staff are central to a successful environmental analysis program, and the ability to develop new methods is crucial. In addition, the laboratory information system, laboratory automation and quality assurance/quality control (QA/QC) are major factors in laboratory success. This paper concentrates on these areas.

  5. Gating treatment delivery QA based on a surrogate motion analysis

    International Nuclear Information System (INIS)

    Chojnowski, J.; Simpson, E.

    2011-01-01

    Full text: To develop a methodology to estimate intrafractional target position error during phase-based gated treatment. Westmead Cancer Care Centre uses respiratory-correlated, phase-based gated beam delivery in the treatment of lung cancer. The gating technique is managed by the Varian Real-time Position Management (RPM) system, version 1.7.5. A 6-dot block is placed on the abdomen of the patient and acts as a surrogate for the target motion. During a treatment session, the motion of the surrogate can be recorded by the RPM application. Analysis of the surrogate motion file by in-house software allows the intrafractional error of the treatment session to be computed. To validate the computed error, a simple test involving the introduction of deliberate errors was performed. Errors of up to 1.1 cm were introduced to a metal marker placed on a surrogate using the Varian Breathing Phantom. The moving marker was scanned in prospective mode using a GE Lightspeed 16 CT scanner. Using the CT images, the difference in marker position with and without the introduced errors was compared to the errors calculated from the surrogate motion. The average and standard deviation of the difference between calculated target position errors and the measured, deliberately introduced marker-position errors were 0.02 cm and 0.07 cm, respectively. Conclusion: The calculated target position error based on surrogate motion analysis provides a quantitative measure of intrafractional target position errors during treatment. Routine QA for gated treatment using surrogate motion analysis is relatively quick and simple.
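
    The validation statistic described in the abstract (mean and standard deviation of the difference between calculated and deliberately introduced errors) can be sketched as follows. The sample values below are invented for illustration; the paper's own measurements gave 0.02 cm and 0.07 cm:

```python
# Sketch of the validation statistic: compare errors computed from the
# surrogate motion against the known errors set on the phantom.
# All numbers here are hypothetical.
from statistics import mean, stdev

calculated = [0.32, 0.58, 1.08, 0.75]  # cm, errors computed from surrogate motion
introduced = [0.30, 0.50, 1.10, 0.70]  # cm, known errors set on the phantom

diffs = [c - i for c, i in zip(calculated, introduced)]
print(mean(diffs), stdev(diffs))
```

    A small mean indicates little systematic bias in the surrogate-based estimate, while the standard deviation characterizes its random uncertainty.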

  6. Microprocessor interfacing

    CERN Document Server

    Vears, R E

    2014-01-01

    Microprocessor Interfacing provides the coverage of the Business and Technician Education Council level NIII unit in Microprocessor Interfacing (syllabus U86/335). Composed of seven chapters, the book explains the foundation in microprocessor interfacing techniques in hardware and software that can be used for problem identification and solving. The book focuses on the 6502, Z80, and 6800/02 microprocessor families. The technique starts with signal conditioning, filtering, and cleaning before the signal can be processed. The signal conversion, from analog to digital or vice versa, is expl

  7. QA [Quality Assurance] role in advanced energy activities: Towards an 'orthodox' Quality Program: Canonizing the traditions at Fermilab

    International Nuclear Information System (INIS)

    Bodnarczuk, M.W.

    1988-02-01

    After a brief description of the goal of Fermi National Accelerator Laboratory (Fermilab), this paper poses and answers three questions related to Quality Assurance (QA) at the Laboratory. First, what is the difference between 'orthodox' and 'unorthodox' QA, and is there a place for 'orthodox' QA at a laboratory like Fermilab? Second, are the deeper philosophical and cultural frameworks of high-energy physics accommodating or antagonistic to an 'orthodox' QA Program? Finally, faced with the task of developing an institutional QA program for Fermilab, where does one begin? The paper is based on experience with the ongoing development and implementation of an institutional QA Program at Fermilab. 10 refs

  8. Inter-cooperative collective intelligence techniques and applications

    CERN Document Server

    Bessis, Nik

    2014-01-01

    This book covers the latest advances in the rapidly growing field of inter-cooperative collective intelligence, aiming at the integration and cooperation of various computational resources, networks and intelligent processing paradigms to collectively build intelligence and advanced decision support and interfaces for end-users. The book brings a comprehensive view of the state-of-the-art in the field of integration of sensor networks, IoT and Cloud computing, and massive and intelligent querying and processing of data. As a result, the book presents lessons learned so far and identifies new research issues, challenges and opportunities for further research and development agendas. Emerging areas of application are also identified, and the usefulness of inter-cooperative collective intelligence is envisaged. Researchers, software developers, practitioners and students interested in the field of inter-cooperative collective intelligence will find the comprehensive coverage of this book useful for their research, academic...

  9. Knowledge-based control of an adaptive interface

    Science.gov (United States)

    Lachman, Roy

    1989-01-01

    The analysis, development strategy, and preliminary design for an intelligent, adaptive interface are reported. The design philosophy couples knowledge-based system technology with standard human factors approaches to interface development for computer workstations. An expert system has been designed to drive the interface for application software. The intelligent interface will be linked to application packages, one at a time, that are planned for multiple-application workstations aboard Space Station Freedom. Current requirements call for most Space Station activities to be conducted at the workstation consoles. One set of activities will consist of standard data management services (DMS). DMS software includes text processing, spreadsheets, database management, etc. Text processing was selected for the first intelligent interface prototype because text-processing software can be developed initially as fully functional but limited to a small set of commands; the program's complexity can then be increased incrementally. The rule base includes the operator's behavior and three types of instructions to the underlying application software. A conventional expert-system inference engine searches the data base for antecedents to rules and sends the consequents of fired rules as commands to the underlying software. Plans for putting the expert system on top of a second application, a database management system, will be carried out following behavioral research on the first application. The intelligent interface design is suitable for use with ground-based workstations now common in government, industrial, and educational organizations.
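
    The inference cycle described in the abstract, an engine that matches rule antecedents against a fact base and emits the consequents of fired rules as commands to the underlying application, can be sketched in a few lines. The rule contents and command names below are invented for illustration:

```python
# Minimal forward-chaining sketch: each rule pairs an antecedent predicate
# over the fact base with a consequent command for the application software.
# Facts, rules, and command names are hypothetical.

facts = {"mode": "text_editing", "operator_action": "save_request"}

rules = [
    (lambda f: f.get("operator_action") == "save_request", "APP_CMD_SAVE"),
    (lambda f: f.get("mode") == "spreadsheet", "APP_CMD_RECALC"),
]

def infer(facts, rules):
    """Fire every rule whose antecedent holds; return the commands to send."""
    return [cmd for antecedent, cmd in rules if antecedent(facts)]

print(infer(facts, rules))  # -> ['APP_CMD_SAVE']
```

    In the full system, the fact base would also model the operator's behavior, so that the commands sent to the application adapt to how the operator is working.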

  10. WE-AB-201-03: TPS Commissioning and QA: Incorporating the Entire Planning Process

    International Nuclear Information System (INIS)

    Mutic, S.

    2015-01-01

    Treatment planning systems (TPS) are a cornerstone of modern radiation therapy. Errors in their commissioning or use can have a devastating impact on many patients. To support safe and high quality care, medical physicists must conduct efficient and proper commissioning, good clinical integration, and ongoing quality assurance (QA) of the TPS. AAPM Task Group 53 and related publications have served as seminal benchmarks for TPS commissioning and QA over the past two decades. Over the same time, continuing innovations have made the TPS even more complex and more central to the clinical process. Medical goals are now expressed in terms of the dose and margins around organs and tissues that are delineated from multiple imaging modalities (CT, MR and PET) and even temporally resolved (i.e., 4D) imaging. This information is passed on to optimization algorithms to establish accelerator movements that are programmed directly for IMRT, VMAT and stereotactic treatments. These advances have made commissioning and QA of the TPS much more challenging. This education session reviews up-to-date experience and guidance on this subject, including the recently published AAPM Medical Physics Practice Guideline (MPPG) #5, “Commissioning and QA of Treatment Planning Dose Calculations: Megavoltage Photon and Electron Beams”. Treatment Planning System Commissioning and QA: Challenges and Opportunities (Greg Salomons) This session will provide some key background and review publications describing prominent incidents relating to TPS commissioning and QA. Traditional approaches have been hardware- and feature-oriented: they aim to establish a functional configuration and specifications for regular testing of features (like dose calculation) to assure stable operation and detect failures. With the advent of more complex systems, more patient-specific testing has also been adopted. A number of actual TPS defects will be presented along with heuristics for identifying similar
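
    The baseline-and-tolerance pattern the session describes, where commissioning establishes baseline values and routine QA compares fresh measurements against them, can be sketched as follows. The test names, baseline values, and tolerances below are hypothetical, not from any guideline:

```python
# Sketch of routine QA against commissioning baselines.
# Test names, baselines, and tolerances are illustrative assumptions.

baselines = {"output_cGy_per_MU": 1.000, "pdd10_percent": 66.5}
tolerances = {"output_cGy_per_MU": 0.02, "pdd10_percent": 1.0}  # absolute

def qa_check(measured):
    """Return {test: (passed, deviation_from_baseline)} for each measurement."""
    results = {}
    for test, value in measured.items():
        dev = value - baselines[test]
        results[test] = (abs(dev) <= tolerances[test], dev)
    return results

print(qa_check({"output_cGy_per_MU": 1.015, "pdd10_percent": 68.0}))
```

    A failed check does not by itself localize the cause; it triggers the kind of investigation and heuristics for identifying defects that the session discusses.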

  11. Beam dynamics of mixed high intensity highly charged ion Beams in the Q/A selector

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, X.H., E-mail: zhangxiaohu@impcas.ac.cn [Institute of Modern Physics, Chinese Academy of Sciences, Lanzhou 730000 (China); Yuan, Y.J.; Yin, X.J.; Qian, C.; Sun, L.T. [Institute of Modern Physics, Chinese Academy of Sciences, Lanzhou 730000 (China); Du, H.; Li, Z.S.; Qiao, J.; Wang, K.D. [Institute of Modern Physics, Chinese Academy of Sciences, Lanzhou 730000 (China); University of Chinese Academy of Sciences, Beijing 100049 (China); Zhao, H.W.; Xia, J.W. [Institute of Modern Physics, Chinese Academy of Sciences, Lanzhou 730000 (China)

    2017-06-11

    Electron cyclotron resonance (ECR) ion sources are widely used in heavy ion accelerators for their advantages in producing high quality intense beams of highly charged ions. However, challenges exist in the design of Q/A selection systems for mixed high intensity ion beams: reaching sufficient Q/A resolution while controlling beam emittance growth. Moreover, since the emittance of a beam from an ECR ion source is coupled, the matching of phase space to the post-accelerator, for a wide range of ion beam species with different intensities, must be carefully studied. In this paper, simulation and experimental results for the Q/A selection system at the LECR4 platform are shown. The formation of a hollow cross-section heavy ion beam at the end of the Q/A selector is revealed. A reasonable interpretation is proposed, and a modified design of the Q/A selection system has been adopted for the HIRFL-SSC linac injector. The features of the new design, including beam simulations and experimental results, are also presented.
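
    Why a dipole selects on Q/A: for a non-relativistic ion of mass A·u and charge Q·e extracted through voltage V, the magnetic rigidity is Bρ = p/(Qe) = sqrt(2·A·u·V/(Q·e)), which depends only on the ratio A/Q. A small sketch (the 25 kV extraction voltage and the ion species are illustrative assumptions, not parameters of LECR4):

```python
# Magnetic rigidity of ions after electrostatic extraction: species sharing
# the same A/Q have identical rigidity and cannot be separated by the dipole.
# Extraction voltage and species are assumptions for illustration.
from math import sqrt

U = 1.66053906660e-27  # atomic mass unit, kg
E = 1.602176634e-19    # elementary charge, C
V = 25e3               # assumed extraction voltage, volts

def rigidity(A, Q):
    """B*rho in T*m for a non-relativistic ion of mass A*u and charge Q*e."""
    return sqrt(2 * A * U * V / (Q * E))

# Two species with the same A/Q = 4 are inseparable in the Q/A selector:
print(rigidity(16, 4), rigidity(32, 8))
```

    This is why a mixed beam is dispersed by A/Q rather than by charge state alone, and why unresolved species with near-equal A/Q complicate the selector design.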

  12. WE-AB-201-00: Treatment Planning System Commissioning and QA

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2015-06-15

    Treatment planning systems (TPS) are a cornerstone of modern radiation therapy. Errors in their commissioning or use can have a devastating impact on many patients. To support safe and high quality care, medical physicists must conduct efficient and proper commissioning, good clinical integration, and ongoing quality assurance (QA) of the TPS. AAPM Task Group 53 and related publications have served as seminal benchmarks for TPS commissioning and QA over the past two decades. Over the same time, continuing innovations have made the TPS even more complex and more central to the clinical process. Medical goals are now expressed in terms of the dose and margins around organs and tissues that are delineated from multiple imaging modalities (CT, MR and PET) and even temporally resolved (i.e., 4D) imaging. This information is passed on to optimization algorithms to establish accelerator movements that are programmed directly for IMRT, VMAT and stereotactic treatments. These advances have made commissioning and QA of the TPS much more challenging. This education session reviews up-to-date experience and guidance on this subject, including the recently published AAPM Medical Physics Practice Guideline (MPPG) #5, “Commissioning and QA of Treatment Planning Dose Calculations: Megavoltage Photon and Electron Beams”. Treatment Planning System Commissioning and QA: Challenges and Opportunities (Greg Salomons) This session will provide some key background and review publications describing prominent incidents relating to TPS commissioning and QA. Traditional approaches have been hardware- and feature-oriented: they aim to establish a functional configuration and specifications for regular testing of features (like dose calculation) to assure stable operation and detect failures. With the advent of more complex systems, more patient-specific testing has also been adopted. A number of actual TPS defects will be presented along with heuristics for identifying similar

  13. WE-AB-201-01: Treatment Planning System Commissioning and QA: Challenges and Opportunities

    Energy Technology Data Exchange (ETDEWEB)

    Salomons, G. [Cancer Center of Southeastern Ontario (Canada)

    2015-06-15

    Treatment planning systems (TPS) are a cornerstone of modern radiation therapy. Errors in their commissioning or use can have a devastating impact on many patients. To support safe and high quality care, medical physicists must conduct efficient and proper commissioning, good clinical integration, and ongoing quality assurance (QA) of the TPS. AAPM Task Group 53 and related publications have served as seminal benchmarks for TPS commissioning and QA over the past two decades. Over the same time, continuing innovations have made the TPS even more complex and more central to the clinical process. Medical goals are now expressed in terms of the dose and margins around organs and tissues that are delineated from multiple imaging modalities (CT, MR and PET) and even temporally resolved (i.e., 4D) imaging. This information is passed on to optimization algorithms to establish accelerator movements that are programmed directly for IMRT, VMAT and stereotactic treatments. These advances have made commissioning and QA of the TPS much more challenging. This education session reviews up-to-date experience and guidance on this subject, including the recently published AAPM Medical Physics Practice Guideline (MPPG) #5, “Commissioning and QA of Treatment Planning Dose Calculations: Megavoltage Photon and Electron Beams”. Treatment Planning System Commissioning and QA: Challenges and Opportunities (Greg Salomons) This session will provide some key background and review publications describing prominent incidents relating to TPS commissioning and QA. Traditional approaches have been hardware- and feature-oriented: they aim to establish a functional configuration and specifications for regular testing of features (like dose calculation) to assure stable operation and detect failures. With the advent of more complex systems, more patient-specific testing has also been adopted. A number of actual TPS defects will be presented along with heuristics for identifying similar

  14. WE-AB-201-03: TPS Commissioning and QA: Incorporating the Entire Planning Process

    Energy Technology Data Exchange (ETDEWEB)

    Mutic, S. [Washington University School of Medicine (United States)

    2015-06-15

    Treatment planning systems (TPS) are a cornerstone of modern radiation therapy. Errors in their commissioning or use can have a devastating impact on many patients. To support safe and high quality care, medical physicists must conduct efficient and proper commissioning, good clinical integration, and ongoing quality assurance (QA) of the TPS. AAPM Task Group 53 and related publications have served as seminal benchmarks for TPS commissioning and QA over the past two decades. Over the same time, continuing innovations have made the TPS even more complex and more central to the clinical process. Medical goals are now expressed in terms of the dose and margins around organs and tissues that are delineated from multiple imaging modalities (CT, MR and PET) and even temporally resolved (i.e., 4D) imaging. This information is passed on to optimization algorithms to establish accelerator movements that are programmed directly for IMRT, VMAT and stereotactic treatments. These advances have made commissioning and QA of the TPS much more challenging. This education session reviews up-to-date experience and guidance on this subject, including the recently published AAPM Medical Physics Practice Guideline (MPPG) #5, “Commissioning and QA of Treatment Planning Dose Calculations: Megavoltage Photon and Electron Beams”. Treatment Planning System Commissioning and QA: Challenges and Opportunities (Greg Salomons) This session will provide some key background and review publications describing prominent incidents relating to TPS commissioning and QA. Traditional approaches have been hardware- and feature-oriented: they aim to establish a functional configuration and specifications for regular testing of features (like dose calculation) to assure stable operation and detect failures. With the advent of more complex systems, more patient-specific testing has also been adopted. A number of actual TPS defects will be presented along with heuristics for identifying similar

  15. WE-AB-201-00: Treatment Planning System Commissioning and QA

    International Nuclear Information System (INIS)

    2015-01-01

    Treatment planning systems (TPS) are a cornerstone of modern radiation therapy. Errors in their commissioning or use can have a devastating impact on many patients. To support safe and high quality care, medical physicists must conduct efficient and proper commissioning, good clinical integration, and ongoing quality assurance (QA) of the TPS. AAPM Task Group 53 and related publications have served as seminal benchmarks for TPS commissioning and QA over the past two decades. Over the same time, continuing innovations have made the TPS even more complex and more central to the clinical process. Medical goals are now expressed in terms of the dose and margins around organs and tissues that are delineated from multiple imaging modalities (CT, MR and PET) and even temporally resolved (i.e., 4D) imaging. This information is passed on to optimization algorithms to establish accelerator movements that are programmed directly for IMRT, VMAT and stereotactic treatments. These advances have made commissioning and QA of the TPS much more challenging. This education session reviews up-to-date experience and guidance on this subject, including the recently published AAPM Medical Physics Practice Guideline (MPPG) #5, “Commissioning and QA of Treatment Planning Dose Calculations: Megavoltage Photon and Electron Beams”. Treatment Planning System Commissioning and QA: Challenges and Opportunities (Greg Salomons) This session will provide some key background and review publications describing prominent incidents relating to TPS commissioning and QA. Traditional approaches have been hardware- and feature-oriented: they aim to establish a functional configuration and specifications for regular testing of features (like dose calculation) to assure stable operation and detect failures. With the advent of more complex systems, more patient-specific testing has also been adopted. A number of actual TPS defects will be presented along with heuristics for identifying similar

  16. WE-AB-201-01: Treatment Planning System Commissioning and QA: Challenges and Opportunities

    International Nuclear Information System (INIS)

    Salomons, G.

    2015-01-01

    Treatment planning systems (TPS) are a cornerstone of modern radiation therapy. Errors in their commissioning or use can have a devastating impact on many patients. To support safe and high quality care, medical physicists must conduct efficient and proper commissioning, good clinical integration, and ongoing quality assurance (QA) of the TPS. AAPM Task Group 53 and related publications have served as seminal benchmarks for TPS commissioning and QA over the past two decades. Over the same time, continuing innovations have made the TPS even more complex and more central to the clinical process. Medical goals are now expressed in terms of the dose and margins around organs and tissues that are delineated from multiple imaging modalities (CT, MR and PET) and even temporally resolved (i.e., 4D) imaging. This information is passed on to optimization algorithms to establish accelerator movements that are programmed directly for IMRT, VMAT and stereotactic treatments. These advances have made commissioning and QA of the TPS much more challenging. This education session reviews up-to-date experience and guidance on this subject, including the recently published AAPM Medical Physics Practice Guideline (MPPG) #5, “Commissioning and QA of Treatment Planning Dose Calculations: Megavoltage Photon and Electron Beams”. Treatment Planning System Commissioning and QA: Challenges and Opportunities (Greg Salomons) This session will provide some key background and review publications describing prominent incidents relating to TPS commissioning and QA. Traditional approaches have been hardware- and feature-oriented: they aim to establish a functional configuration and specifications for regular testing of features (like dose calculation) to assure stable operation and detect failures. With the advent of more complex systems, more patient-specific testing has also been adopted. A number of actual TPS defects will be presented along with heuristics for identifying similar

  17. Lymphocytes Negatively Regulate NK Cell Activity via Qa-1b following Viral Infection

    Directory of Open Access Journals (Sweden)

    Haifeng C. Xu

    2017-11-01

    Full Text Available NK cells can reduce anti-viral T cell immunity during chronic viral infections, including infection with the lymphocytic choriomeningitis virus (LCMV. However, regulating factors that maintain the equilibrium between productive T cell and NK cell immunity are poorly understood. Here, we show that a large viral load resulted in inhibition of NK cell activation, which correlated with increased expression of Qa-1b, a ligand for inhibitory NK cell receptors. Qa-1b was predominantly upregulated on B cells following LCMV infection, and this upregulation was dependent on type I interferons. Absence of Qa-1b resulted in increased NK cell-mediated regulation of anti-viral T cells following viral infection. Consequently, anti-viral T cell immunity was reduced in Qa-1b- and NKG2A-deficient mice, resulting in increased viral replication and immunopathology. NK cell depletion restored anti-viral immunity and virus control in the absence of Qa-1b. Taken together, our findings indicate that lymphocytes limit NK cell activity during viral infection in order to promote anti-viral T cell immunity.

  18. QA in the design and fabrication of the TMI-2 rail cask

    International Nuclear Information System (INIS)

    Hayes, G.R.

    1988-01-01

    EG&G Idaho, Inc., acting on behalf of the US Department of Energy, is responsible for transporting core debris from Three Mile Island-Unit 2 to the Idaho National Engineering Laboratory. Transportation of the debris is being accomplished using an NRC-licensed container called the NuPac 125-B. This paper describes the NuPac 125-B rail cask and the quality assurance (QA) requirements for that system. Also discussed are the QA roles of the various organizations involved in designing, building, inspecting and testing the NuPac 125-B. The paper presents the QA/QC systems implemented during the design, procurement, and fabrication of the cask to assure compliance with all applicable technical codes, standards and regulations. It also goes beyond the requirements aspect and describes unique QA/QC measures employed to assure that the cask was built with minimum QA problems. Finally, the lessons learned from the NuPac 125-B project are discussed. 4 refs., 4 figs

  19. Interface Anywhere

    Data.gov (United States)

    National Aeronautics and Space Administration — Current paradigms for crew interfaces to the systems that require control are constrained by decades old technologies which require the crew to be physically near an...

  20. Intelligence-Augmented Rat Cyborgs in Maze Solving.

    Directory of Open Access Journals (Sweden)

    Yipeng Yu

    Full Text Available Cyborg intelligence is an emerging kind of intelligence paradigm. It aims to deeply integrate machine intelligence with biological intelligence by connecting machines and living beings via neural interfaces, enhancing strength by combining the biological cognition capability with the machine computational capability. Cyborg intelligence is considered to be a new way to augment living beings with machine intelligence. In this paper, we build rat cyborgs to demonstrate how they can expedite the maze escape task with integration of machine intelligence. We compare the performance of maze solving by computer, by individual rats, and by computer-aided rats (i.e., rat cyborgs). They were asked to find their way from a constant entrance to a constant exit in fourteen diverse mazes. Performance of maze solving was measured by steps, coverage rates, and time spent. The experimental results with six rats and their intelligence-augmented rat cyborgs show that rat cyborgs have the best performance in escaping from mazes. These results provide a proof-of-principle demonstration for cyborg intelligence. In addition, our novel cyborg intelligent system (rat cyborg) has great potential in various applications, such as search and rescue in complex terrains.

  1. Intelligence-Augmented Rat Cyborgs in Maze Solving.

    Science.gov (United States)

    Yu, Yipeng; Pan, Gang; Gong, Yongyue; Xu, Kedi; Zheng, Nenggan; Hua, Weidong; Zheng, Xiaoxiang; Wu, Zhaohui

    2016-01-01

    Cyborg intelligence is an emerging kind of intelligence paradigm. It aims to deeply integrate machine intelligence with biological intelligence by connecting machines and living beings via neural interfaces, enhancing strength by combining the biological cognition capability with the machine computational capability. Cyborg intelligence is considered to be a new way to augment living beings with machine intelligence. In this paper, we build rat cyborgs to demonstrate how they can expedite the maze escape task with integration of machine intelligence. We compare the performance of maze solving by computer, by individual rats, and by computer-aided rats (i.e. rat cyborgs). They were asked to find their way from a constant entrance to a constant exit in fourteen diverse mazes. Performance of maze solving was measured by steps, coverage rates, and time spent. The experimental results with six rats and their intelligence-augmented rat cyborgs show that rat cyborgs have the best performance in escaping from mazes. These results provide a proof-of-principle demonstration for cyborg intelligence. In addition, our novel cyborg intelligent system (rat cyborg) has great potential in various applications, such as search and rescue in complex terrains.
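
    A toy version of the machine side of the comparison above: a breadth-first solver that records the step count of the escape path and a coverage rate (cells explored divided by total free cells) for a grid maze. The maze layout is invented for illustration; the paper's mazes and metrics are its own:

```python
# Breadth-first maze solver reporting (path_steps, coverage_rate).
# The tiny maze below is hypothetical.
from collections import deque

MAZE = ["S.#",
        ".##",
        "..E"]  # S = entrance, E = exit, # = wall, . = free cell

def solve(maze):
    """Return (steps on shortest path, fraction of free cells explored)."""
    rows, cols = len(maze), len(maze[0])
    cells = [(r, c) for r in range(rows) for c in range(cols)]
    start = next(p for p in cells if maze[p[0]][p[1]] == "S")
    free = sum(1 for r, c in cells if maze[r][c] != "#")
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        (r, c), steps = queue.popleft()
        if maze[r][c] == "E":
            return steps, len(seen) / free
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and maze[nr][nc] != "#" \
                    and (nr, nc) not in seen:
                seen.add((nr, nc))
                queue.append(((nr, nc), steps + 1))
    return None  # no path to the exit

print(solve(MAZE))  # -> (4, 1.0)
```

    A rat (or rat cyborg) explores far less exhaustively than breadth-first search, which is exactly why the paper compares steps and coverage rates across the three solver types.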

  2. Intelligent Vehicle Health Management

    Science.gov (United States)

    Paris, Deidre E.; Trevino, Luis; Watson, Michael D.

    2005-01-01

    objectives: Guidance and Navigation; Communications and Tracking; Vehicle Monitoring; Information Transport and Integration; Vehicle Diagnostics; Vehicle Prognostics; Vehicle mission Planning; Automated Repair and Replacement; Vehicle Control; Human Computer Interface; and Onboard Verification and Validation. Furthermore, the presented framework provides complete vehicle management which not only allows for increased crew safety and mission success through new intelligence capabilities, but also yields a mechanism for more efficient vehicle operations. The representative IVHM technologies for computer platform using heterogeneous communication, 3) coupled electromagnetic oscillators for enhanced communications, 4) Linux-based real-time systems, 5) genetic algorithms, 6) Bayesian Networks, 7) evolutionary algorithms, 8) dynamic systems control modeling, and 9) advanced sensing capabilities. This paper presents IVHM technologies developed under NASA's NFFP pilot project and the integration of these technologies forms the framework for IIVM.

  3. The role of automation and artificial intelligence

    Science.gov (United States)

    Schappell, R. T.

    1983-07-01

    Consideration is given to emerging technologies that are not currently in common use, yet will be mature enough for implementation in a space station. Artificial intelligence (AI) will permit more autonomous operation and improve the man-machine interfaces. Technology goals include the development of expert systems, a natural language query system, automated planning systems, and AI image understanding systems. Intelligent robots and teleoperators will be needed, together with improved sensory systems for the robotics, housekeeping, vehicle control, and spacecraft housekeeping systems. Finally, NASA is developing the ROBSIM computer program to evaluate level of automation, perform parametric studies and error analyses, optimize trajectories and control systems, and assess AI technology.

  4. Development of intelligent supervisory control system

    International Nuclear Information System (INIS)

    Takizawa, Y.; Fukumoto, A.; Makino, M.; Takiguchi, S.

    1994-01-01

    The objective of the development of an intelligent supervisory control system for next generation plants is enhancement of the operational reliability by applying the recent outcome of artificial intelligence and computer technologies. This system consists of the supervisory control and monitoring for automatic operation, the equipment operation support for historical data management and for test scheduling, the operators' decision making support for accidental plant situations and the human-friendly interface of these support functions. The verification test results showed the validity of the functions realized by this system for the next generation control room. (author)

  5. Intelligent operation system for nuclear power plants

    International Nuclear Information System (INIS)

    Morioka, Toshihiko; Fukumoto, Akira; Suto, Osamu; Naito, Norio.

    1987-01-01

    Nuclear power plants consist of many systems and are operated by skillful operators with extensive knowledge and experience of nuclear plants. Recently, plant automation and computerized operator support systems have come to be utilized, but the synthetic judgment of plant operation and management remains a human role. Toshiba is of the opinion that these activities (planning, operation and maintenance) should be integrated, and that the man-machine interface should be human-friendly. We have begun to develop the intelligent operation system, aiming to limit the operator's role to fundamental judgments through the use of artificial intelligence. (author)

  6. Studies on a Q/A selector for the SECRAL electron cyclotron resonance ion source.

    Science.gov (United States)

    Yang, Y; Sun, L T; Feng, Y C; Fang, X; Lu, W; Zhang, W H; Cao, Y; Zhang, X Z; Zhao, H W

    2014-08-01

    Electron cyclotron resonance ion sources are widely used in heavy ion accelerators around the world because they are capable of producing high current beams of highly charged ions. However, the design of the Q/A selector system for these devices is challenging, because it must provide sufficient ion resolution while controlling beam emittance growth. Moreover, this system has to be matched for a wide range of ion beam species with different intensities. In this paper, research on the Q/A selector system at the SECRAL (Superconducting Electron Cyclotron Resonance ion source with Advanced design in Lanzhou) platform, both in experiment and in simulation, is presented. Based on this study, a new Q/A selector system has been designed for SECRAL II. The features of the new design, including beam simulations, are also presented.

  7. Conventional patient specific IMRT QA and 3DVH verification of dose distribution for helical tomotherapy

    International Nuclear Information System (INIS)

    Sharma, Prabhat Krishna; Joshi, Kishore; Epili, D.; Gavake, Umesh; Paul, Siji; Reena, Ph.; Jamema, S.V.

    2016-01-01

    In recent years, patient-specific IMRT QA has transitioned from point-dose measurements with ion chambers, to film, to 2D array measurements. 3DVH software takes this transition a step further by estimating the 3D dose delivered to the patient volume from 2D diode measurements using a planned dose perturbation (PDP) algorithm. This algorithm was developed to determine whether conventional IMRT QA, though sensitive at detecting errors, has any predictive power for dose errors of clinical significance related to dose to the target volume and organs at risk (OAR). The aim of this study is to compare conventional IMRT patient-specific QA and 3DVH dose distributions for patients treated with helical tomotherapy (HT).

  8. WE-PIS-Exhibit Hall-01: Tools for TG-142 Linac Imaging QA II

    International Nuclear Information System (INIS)

    Childress, N; Murray, B

    2014-01-01

    Partners in Solutions is an exciting new program in which AAPM partners with our vendors to present practical “hands-on” information about the equipment and software systems that we use in our clinics. The therapy topic this year is solutions for TG-142 recommendations for linear accelerator imaging QA. Note that the sessions are being held in a special purpose room built on the Exhibit Hall Floor, to encourage further interaction with the vendors. Using DoseLab to Perform TG-142 Imaging QA The goals of this session will be to present a clinical overview of acquiring images for TG-142 Imaging QA, as well as analyzing and evaluating results using DoseLab software. DoseLab supports planar imaging QA analysis using almost any QA phantom provided by numerous vendors. General advantages and disadvantages of selecting each of these phantoms will be briefly summarized. Best practices for selecting image acquisition parameters will be presented. A demonstration of using DoseLab software to perform a series of TG-142 tests will be performed. We will discuss why DoseLab uses its own set of imaging QA formulas, and why imaging QA measurement values of the same nominal properties will vary between TG-142 software packages. Because TG-142 does not specify baseline and tolerance values for imaging QA, the presentation will recommend performing the manufacturer's acceptance test procedure to validate that the equipment is functioning correctly. Afterwards, results can be obtained using the clinic's selected set of phantoms, image acquisition parameters, and TG-142 software to set proper baseline values. This presentation will highlight the reasons why comparing imaging QA results can be trickier than comparing linear accelerator treatment results and what physicists should keep in mind when comparing imaging QA results for different machines. Physicists are often unsure of the next step when there is an issue discovered during Imaging QA. Therefore, a few common examples

  9. WE-PIS-Exhibit Hall-01: Tools for TG-142 Linac Imaging QA II

    Energy Technology Data Exchange (ETDEWEB)

    Childress, N [Mobius Medical Management, LLC,, Houston, TX (United States); Murray, B [ZapIT Medical, Dublin, OH (Ireland)

    2014-06-15

    Partners in Solutions is an exciting new program in which AAPM partners with our vendors to present practical “hands-on” information about the equipment and software systems that we use in our clinics. The therapy topic this year is solutions for TG-142 recommendations for linear accelerator imaging QA. Note that the sessions are being held in a special purpose room built on the Exhibit Hall Floor, to encourage further interaction with the vendors. Using DoseLab to Perform TG-142 Imaging QA The goals of this session will be to present a clinical overview of acquiring images for TG-142 Imaging QA, as well as analyzing and evaluating results using DoseLab software. DoseLab supports planar imaging QA analysis using almost any QA phantom provided by numerous vendors. General advantages and disadvantages of selecting each of these phantoms will be briefly summarized. Best practices for selecting image acquisition parameters will be presented. A demonstration of using DoseLab software to perform a series of TG-142 tests will be performed. We will discuss why DoseLab uses its own set of imaging QA formulas, and why imaging QA measurement values of the same nominal properties will vary between TG-142 software packages. Because TG-142 does not specify baseline and tolerance values for imaging QA, the presentation will recommend performing the manufacturer's acceptance test procedure to validate that the equipment is functioning correctly. Afterwards, results can be obtained using the clinic's selected set of phantoms, image acquisition parameters, and TG-142 software to set proper baseline values. This presentation will highlight the reasons why comparing imaging QA results can be trickier than comparing linear accelerator treatment results and what physicists should keep in mind when comparing imaging QA results for different machines. Physicists are often unsure of the next step when there is an issue discovered during Imaging QA. Therefore, a few common examples

  10. Improvement in QA protocol for TLD based personnel monitoring laboratory in last five year

    International Nuclear Information System (INIS)

    Rakesh, R.B.

    2018-01-01

    Quality Assurance (QA) in personnel monitoring (PM) is a tool to assess the performance of PM laboratories and the reliability of dose estimation with respect to standards laid down by international agencies such as the IAEA (ISO trumpet curve), IEC, ANSI, etc. Reliable personal dose estimation is a basic requirement for radiation protection planning as well as decision making. Continuous improvement in radiation protection is inherent in radiation protection practices and is highly dependent on the accuracy and reliability of the monitoring data. Experience-based evolution of quality control (QC) measures and of the quality assurance (QA) protocol are two important routes towards continuous improvement in the accuracy and reliability of personnel monitoring results. The paper describes improvements in QC measures and QA protocols initiated during the last five years which have led to improvement in the quality of PM services.

  11. The assessment report of QA program through the analysis of quality trend in 1994

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yung Se; Hong, Kyung Sik; Park, Sang Pil; Park, Kun Woo [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1995-04-01

    The effectiveness and adequacy of the KAERI Quality Assurance Program is assessed through the analysis of quality trends. As a result of the assessment, the Quality Assurance System for each project has reached the stage of stabilization; in particular, significant improvement has been made in conformance to QA procedures, control of QA records and documents, and instilling a quality mindset for the job. However, some problems discovered in this trend analysis, i.e., improving the efficiency of quality training and the economy of the design verification system, require preventive actions and appropriate measures. In the future, QA is expected to support the assurance of nuclear safety and the development of advanced technology by making it possible to establish the best quality system suitable for our situation, based on the assessment method for the quality assurance program presented in this study. 5 figs., 30 tabs. (Author).

  12. The assessment report of QA program through the analysis of quality trend in 1994

    International Nuclear Information System (INIS)

    Kim, Yung Se; Hong, Kyung Sik; Park, Sang Pil; Park, Kun Woo

    1995-04-01

    The effectiveness and adequacy of the KAERI Quality Assurance Program is assessed through the analysis of quality trends. As a result of the assessment, the Quality Assurance System for each project has reached the stage of stabilization; in particular, significant improvement has been made in conformance to QA procedures, control of QA records and documents, and instilling a quality mindset for the job. However, some problems discovered in this trend analysis, i.e., improving the efficiency of quality training and the economy of the design verification system, require preventive actions and appropriate measures. In the future, QA is expected to support the assurance of nuclear safety and the development of advanced technology by making it possible to establish the best quality system suitable for our situation, based on the assessment method for the quality assurance program presented in this study. 5 figs., 30 tabs. (Author)

  13. Educational Programs for Intelligence Professionals.

    Science.gov (United States)

    Miller, Jerry P.

    1994-01-01

    Discusses the need for education programs for competitive intelligence professionals. Highlights include definitions of intelligence functions, focusing on business intelligence; information utilization by decision makers; information sources; competencies for intelligence professionals; and the development of formal education programs. (38…

  14. A New Dimension of Business Intelligence: Location-based Intelligence

    OpenAIRE

    Zeljko Panian

    2012-01-01

    Through the course of this paper we define Location-based Intelligence (LBI), which grows out of the process of amalgamating geolocation and Business Intelligence. Amalgamating geolocation with traditional Business Intelligence (BI) results in a new dimension of BI named Location-based Intelligence. LBI is defined as leveraging unified location information for business intelligence. Collectively, enterprises can transform location data into business intelligence applic...

  15. The unusually strong hydrogen bond between the carbonyl of Q(A) and His M219 in the Rhodobacter sphaeroides reaction center is not essential for efficient electron transfer from Q(A)(-) to Q(B).

    Science.gov (United States)

    Breton, Jacques; Lavergne, Jérôme; Wakeham, Marion C; Nabedryk, Eliane; Jones, Michael R

    2007-06-05

    In native reaction centers (RCs) from photosynthetic purple bacteria the primary quinone (QA) and the secondary quinone (QB) are interconnected via a specific His-Fe-His bridge. In Rhodobacter sphaeroides RCs the C4=O carbonyl of QA forms a very strong hydrogen bond with the protonated Npi of His M219, and the Ntau of this residue is in turn coordinated to the non-heme iron atom. The second carbonyl of QA is engaged in a much weaker hydrogen bond with the backbone N-H of Ala M260. In previous work, a Trp side chain was introduced by site-directed mutagenesis at the M260 position in the RC of Rb. sphaeroides, resulting in a complex that is completely devoid of QA and therefore nonfunctional. A photochemically competent derivative of the AM260W mutant was isolated that contains a Cys side chain at the M260 position (denoted AM260(W-->C)). In the present work, the interactions between the carbonyl groups of QA and the protein in the AM260(W-->C) suppressor mutant have been characterized by light-induced FTIR difference spectroscopy of the photoreduction of QA. The QA-/QA difference spectrum demonstrates that the strong interaction between the C4=O carbonyl of QA and His M219 is lost in the mutant, and the coupled CO and CC modes of the QA- semiquinone are also strongly perturbed. In parallel, a band assigned to the perturbation of the C5-Ntau mode of His M219 upon QA- formation in the native RC is lacking in the spectrum of the mutant. Furthermore, a positive band between 2900 and 2400 cm-1 that is related to protons fluctuating within a network of highly polarizable hydrogen bonds in the native RC is reduced in amplitude in the mutant. On the other hand, the QB-/QB FTIR difference spectrum is essentially the same as for the native RC. The kinetics of electron transfer from QA- to QB were measured by the flash-induced absorption changes at 780 nm. Compared to native RCs the absorption transients are slowed by a factor of about 2 for both the slow phase (in the

  16. DeepQA: improving the estimation of single protein model quality with deep belief networks.

    Science.gov (United States)

    Cao, Renzhi; Bhattacharya, Debswapna; Hou, Jie; Cheng, Jianlin

    2016-12-05

    Protein quality assessment (QA), useful for ranking and selecting protein models, has long been viewed as one of the major challenges for protein tertiary structure prediction. In particular, estimating the quality of a single protein model, which is important for selecting a few good models out of a large model pool consisting of mostly low-quality models, is still a largely unsolved problem. We introduce a novel single-model quality assessment method, DeepQA, based on a deep belief network that utilizes a number of selected features describing the quality of a model from different perspectives, such as energy, physio-chemical characteristics, and structural information. The deep belief network is trained on several large datasets consisting of models from the Critical Assessment of Protein Structure Prediction (CASP) experiments, several publicly available datasets, and models generated by our in-house ab initio method. Our experiments demonstrate that the deep belief network has better performance than Support Vector Machines and Neural Networks on the protein model quality assessment problem, and our method DeepQA achieves state-of-the-art performance on the CASP11 dataset. It also outperformed two well-established methods in selecting good outlier models from a large set of models of mostly low quality generated by ab initio modeling methods. DeepQA is a useful deep learning tool for protein single-model quality assessment and protein structure prediction. The source code, executable, documentation, and training/test datasets of DeepQA for Linux are freely available to non-commercial users at http://cactus.rnet.missouri.edu/DeepQA/.
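
    The single-model scoring idea — map a feature vector describing one model to a quality score in [0, 1] — can be sketched with a toy one-hidden-layer network. This is not the DeepQA architecture (which trains a deep belief network on CASP data); the weights, features, and values below are invented purely for illustration.

```python
import math

def sigmoid(x):
    """Logistic squashing function mapping any real number into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def quality_score(features, w_hidden, w_out):
    """Toy single-model quality scorer: features is a list of normalized
    model descriptors (e.g. energy, physio-chemical, structural terms);
    each w_hidden row holds one hidden unit's weights with its bias last;
    w_out holds output weights with the output bias last."""
    hidden = [sigmoid(sum(w * f for w, f in zip(row, features)) + row[-1])
              for row in w_hidden]
    return sigmoid(sum(w * h for w, h in zip(w_out, hidden)) + w_out[-1])

# Hypothetical weights and feature vectors for two candidate models.
w_hidden = [[1.5, -0.8, 2.0, -0.5], [-1.0, 1.2, 0.7, 0.1]]
w_out = [2.2, -1.4, 0.3]
good = [0.9, 0.2, 0.8]   # e.g. low energy, favorable stereochemistry
bad = [0.3, 0.9, 0.2]
print(quality_score(good, w_hidden, w_out), quality_score(bad, w_hidden, w_out))
```

    Ranking a model pool then amounts to sorting by this score and keeping the top few, which is the selection problem the abstract describes.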

  17. TU-C-BRE-01: KEYNOTE PRESENTATION - Emerging Frontiers in IMRT QA

    Energy Technology Data Exchange (ETDEWEB)

    Siebers, J [University of Virginia Health System, Charlottesville, VA (United States)

    2014-06-15

    As IMRT treatment processes advance and mature, so must the quality assurance processes being used to validate their delivery. In some respects, treatment delivery advancements (e.g. VMAT) have out-paced QA advancements. The purpose of this session is to describe new processes that are being implemented to bring IMRT QA up-to-date with the treatment delivery advances. It would explore emerging IMRT QA paradigms, including requirements-based IMRT QA which necessitates definition of delivery errors (e.g. patient dose error, leaf positioning error) and development of processes to ensure reliable error detection. Engineering-based QA approaches, including use of IMRT treatment delivery process trees, fault tree analysis and failure modes effects analysis would be described. Approaches to detect errors such as (1) during treatment delivery validation using exit fluence detectors (e.g. EPIDs); (2) analysis of treatment delivery via use of machine parameter log files; (3) dose recalculation using (3a) treatment planning system; (3b) record-and-verify; or (3c) entrance and exit fluence measurement parameters would be explained. The relative advantages and disadvantages of each method would be discussed. Schemes for error classification and root cause analysis would be described – steps which are essential for future error prevention. For each QA method, testing procedures and results would be presented indicating the types of errors that can be detected, those that cannot be detected, and the reliability of the error detection method (for example determined via ROC analysis). For speakers, we are seeking to engage non-commercially biased experts. Those listed below are a sub-sample of possible qualified individuals.

  18. Application of QA to R&D support of HLW programs

    International Nuclear Information System (INIS)

    Ryder, D.E.

    1988-01-01

    Quality has always been of primary importance in the research and development (R&D) environment. An organization's ability to attract funds for new or continued research is largely dependent on the quality of past performance. However, with the possible exceptions of peer reviews for fund allocation and the referee process prior to publication, past quality assurance (QA) activities were primarily informal good practices. This resulted in standards of acceptable practice that varied from organization to organization. The increasing complexity of R&D projects and the increasing need for project results to be upheld outside the scientific community (i.e., lawsuits and licensing hearings) are encouraging R&D organizations and their clients to adopt more formalized methods for the scientific process and to increase control over support organizations (i.e., suppliers and subcontractors). This has become especially true for R&D organizations that have been involved in high-level waste (HLW) projects for a number of years. PNL began to implement QA program requirements within a few HLW repository preliminary studies in 1978. In 1985, PNL developed a comprehensive QA program for R&D activities in support of two of the proposed repository projects. This QA program was developed by the PNL QA department with a significant amount of support, assistance, and guidance from PNL upper management, the Basalt Waste Isolation Project (BWIP), and the Salt Repository Program Office (SRPO). The QA program has been revised to add a three-level feature and is currently being implemented on projects sponsored by the Office of Geologic Repositories (DOE/OGR), Repository Technology Program (DOE-CH), Nevada Nuclear Waste Storage Investigation (NNWSI) Project, and other HLW projects

  19. TU-C-BRE-01: KEYNOTE PRESENTATION - Emerging Frontiers in IMRT QA

    International Nuclear Information System (INIS)

    Siebers, J

    2014-01-01

    As IMRT treatment processes advance and mature, so must the quality assurance processes being used to validate their delivery. In some respects, treatment delivery advancements (e.g. VMAT) have out-paced QA advancements. The purpose of this session is to describe new processes that are being implemented to bring IMRT QA up-to-date with the treatment delivery advances. It would explore emerging IMRT QA paradigms, including requirements-based IMRT QA which necessitates definition of delivery errors (e.g. patient dose error, leaf positioning error) and development of processes to ensure reliable error detection. Engineering-based QA approaches, including use of IMRT treatment delivery process trees, fault tree analysis and failure modes effects analysis would be described. Approaches to detect errors such as (1) during treatment delivery validation using exit fluence detectors (e.g. EPIDs); (2) analysis of treatment delivery via use of machine parameter log files; (3) dose recalculation using (3a) treatment planning system; (3b) record-and-verify; or (3c) entrance and exit fluence measurement parameters would be explained. The relative advantages and disadvantages of each method would be discussed. Schemes for error classification and root cause analysis would be described – steps which are essential for future error prevention. For each QA method, testing procedures and results would be presented indicating the types of errors that can be detected, those that cannot be detected, and the reliability of the error detection method (for example determined via ROC analysis). For speakers, we are seeking to engage non-commercially biased experts. Those listed below are a sub-sample of possible qualified individuals

  20. Crowd-Sourced Intelligence Agency: Prototyping counterveillance

    Directory of Open Access Journals (Sweden)

    Jennifer Gradecki

    2017-02-01

    This paper discusses how an interactive artwork, the Crowd-Sourced Intelligence Agency (CSIA, can contribute to discussions of Big Data intelligence analytics. The CSIA is a publicly accessible Open Source Intelligence (OSINT system that was constructed using information gathered from technical manuals, research reports, academic papers, leaked documents, and Freedom of Information Act files. Using a visceral heuristic, the CSIA demonstrates how the statistical correlations made by automated classification systems are different from human judgment and can produce false-positives, as well as how the display of information through an interface can affect the judgment of an intelligence agent. The public has the right to ask questions about how a computer program determines if they are a threat to national security and to question the practicality of using statistical pattern recognition algorithms in place of human judgment. Currently, the public’s lack of access to both Big Data and the actual datasets intelligence agencies use to train their classification algorithms keeps the possibility of performing effective sous-dataveillance out of reach. Without this data, the results returned by the CSIA will not be identical to those of intelligence agencies. Because we have replicated how OSINT is processed, however, our results will resemble the type of results and mistakes made by OSINT systems. The CSIA takes some initial steps toward contributing to an informed public debate about large-scale monitoring of open source, social media data and provides a prototype for counterveillance and sousveillance tools for citizens.

  1. Intelligent Extruder

    Energy Technology Data Exchange (ETDEWEB)

    Alper Eker; Mark Giammattia; Paul Houpt; Aditya Kumar; Oscar Montero; Minesh Shah; Norberto Silvi; Timothy Cribbs

    2003-04-24

    “Intelligent Extruder”, described in this report, is a software system and associated support services for monitoring and control of compounding extruders to improve material quality and reduce waste and energy use, with minimal addition of new sensors or changes to the factory floor system components. Emphasis is on process improvements to the mixing, melting and de-volatilization of base resins, fillers, pigments, fire retardants and other additives in the “finishing” stage of high-value-added engineering polymer materials. While GE Plastics materials were used for experimental studies throughout the program, the concepts and principles are broadly applicable to other manufacturers' materials. The project involved a joint collaboration among GE Global Research, GE Industrial Systems and Coperion Werner & Pfleiderer, USA, a major manufacturer of compounding equipment. The scope of the program included development of algorithms for monitoring process material viscosity without rheological sensors or generating waste streams, a novel scheme for rapid detection of process upsets, and an adaptive feedback control system to compensate for process upsets where at-line adjustments are feasible. Software algorithms were implemented and tested on a laboratory-scale extruder (50 lb/hr) at GE Global Research, and data from a production-scale system (2000 lb/hr) at GE Plastics was used to validate the monitoring and detection software. Although not evaluated experimentally, a new concept for extruder process monitoring through estimation of high-frequency drive torque without strain gauges is developed and demonstrated in simulation. A plan to commercialize the software system is outlined, but commercialization has not been completed.

  2. Information for the user in design of intelligent systems

    Science.gov (United States)

    Malin, Jane T.; Schreckenghost, Debra L.

    1993-01-01

    Recommendations are made for improving intelligent system reliability and usability based on the use of information requirements in system development. Information requirements define the task-relevant messages exchanged between the intelligent system and the user by means of the user interface medium. Thus, these requirements affect the design of both the intelligent system and its user interface. Many difficulties that users have in interacting with intelligent systems are caused by information problems. These information problems result from the following: (1) not providing the right information to support domain tasks; and (2) not recognizing that using an intelligent system introduces new user supervisory tasks that require new types of information. These problems are especially prevalent in intelligent systems used for real-time space operations, where data problems and unexpected situations are common. Information problems can be solved by deriving information requirements from a description of user tasks. Using information requirements embeds human-computer interaction design into intelligent system prototyping, resulting in intelligent systems that are more robust and easier to use.

  3. Size Effect of the 2-D Bodies on the Geothermal Gradient and Q-A Plot

    Science.gov (United States)

    Thakur, M.; Blackwell, D. D.

    2009-12-01

    Using numerical models we have investigated some of the criticisms of the Q-A plot related to the effect of the size of the body on the slope and the reduced heat flow. The effects of horizontal conduction depend on the relative difference in radioactivity between the body and the country rock (assuming constant thermal conductivity). Horizontal heat transfer due to different 2-D bodies was numerically studied in order to quantify the resulting temperature differences at the Moho and the errors in the prediction of Qr (reduced heat flow). Using the two end-member distributions of radioactivity, the step model (thickness 10 km) and the exponential model, different 2-D models of horizontal scale (width) ranging from 10-500 km were investigated. Increasing the horizontal size of the body moves observations closer to the 1-D solution. A temperature difference of 50 °C is produced (for the step model) at the Moho between models of width 10 km versus 500 km. In other words, the 1-D solution effectively provides large-scale averaging in terms of heat flow and the temperature field in the lithosphere. For bodies ≤ 100 km wide the geotherms at shallower levels are affected, but at depth they converge and are 50 °C lower than the infinite-plate model temperature. In the case of 2-D bodies, surface heat flow is decreased by horizontal transfer of heat, which shifts the Q-A point vertically downward on the Q-A plot. The smaller the size of the body, the greater the deviation from the 1-D solution and the greater the downward movement of the Q-A point on a Q-A plot. On the Q-A plot, a limited set of points for bodies of different sizes with different radioactivity contrasts (for the step and exponential models) exactly reproduces the reduced heat flow Qr. Thus the size of the body can affect the slope on a Q-A plot, but Qr is not changed. Therefore, Qr ~ 32 mWm-2 obtained from the global terrain average Q-A plot represents the best estimate of stable continental mantle heat
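
    For the step model, the linear Q-A relation is Q = Qr + D·A: surface heat flow Q equals the reduced heat flow Qr plus heat production A times the thickness D of the radiogenic layer, so the slope of a Q-A plot has units of length. Qr and D can be recovered from (A, Q) pairs by ordinary least squares; the sketch below uses synthetic values, not data from this study.

```python
def fit_qa(points):
    """Ordinary least squares fit of Q = Qr + D*A.
    points: [(A, Q), ...] with A in uW/m^3 and Q in mW/m^2.
    Returns (Qr in mW/m^2, D in km), since (mW/m^2)/(uW/m^3) = km."""
    n = len(points)
    sa = sum(a for a, _ in points)
    sq = sum(q for _, q in points)
    saa = sum(a * a for a, _ in points)
    saq = sum(a * q for a, q in points)
    d = (n * saq - sa * sq) / (n * saa - sa * sa)  # slope D (layer thickness, km)
    qr = (sq - d * sa) / n                         # intercept Qr (reduced heat flow)
    return qr, d

# Synthetic terrain data lying exactly on Q = 32 + 10*A
data = [(0.5, 37.0), (1.0, 42.0), (2.0, 52.0), (3.0, 62.0)]
qr, d_km = fit_qa(data)
print(f"reduced heat flow Qr = {qr:.1f} mW/m^2, slope D = {d_km:.1f} km")
```

    The study's point is that finite body width perturbs the slope D recovered this way, while the intercept Qr stays robust.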

  4. SU-F-T-285: Evaluation of a Patient DVH-Based IMRT QA System

    Energy Technology Data Exchange (ETDEWEB)

    Zhen, H; Redler, G; Chu, J; Turian, J [Rush University Medical Center, Chicago, IL (United States)

    2016-06-15

    Purpose: To evaluate the clinical performance of a patient DVH-based QA system for prostate VMAT QA. Methods: Mobius3D(M3D) is a QA software with an independent beam model and dose engine. The MobiusFX(MFX) add-on predicts patient dose using treatment machine log files. We commissioned the Mobius beam model in two steps. First, the stock beam model was customized using machine commissioning data, then verified against the TPS with 12 simple phantom plans and 7 clinical 3D plans. Secondly, the Dosimetric Leaf Gap(DLG) in the Mobius model was fine-tuned for VMAT treatment based on ion chamber measurements for 6 clinical VMAT plans. Upon successful commissioning, we retrospectively performed IMRT QA for 12 VMAT plans with the Mobius system as well as the ArcCHECK-3DVH system. Selected patient DVH values (PTV D95, D50; Bladder D2cc, Dmean; Rectum D2cc) were compared between TPS, M3D, MFX, and 3DVH. Results: During the first commissioning step, TPS and M3D calculated target Dmean for 3D plans agree within 0.7%±0.7%, with 3D gamma passing rates of 98%±2%. In the second commissioning step, the Mobius DLG was adjusted by 1.2mm from the stock value, reducing the average difference between MFX calculation and ion chamber measurement from 3.2% to 0.1%. In retrospective prostate VMAT QA, 5 of 60 MFX calculated DVH values have a deviation greater than 5% compared to TPS. One large deviation at high dose level was identified as a potential QA failure. This echoes the 3DVH QA result, which identified 2 instances of large DVH deviation on the same structure. For all DVHs evaluated, M3D and MFX show a high level of agreement (0.1%±0.2%), indicating that the observed deviation is likely from beam modelling differences rather than delivery errors. Conclusion: The Mobius system provides a viable solution for DVH-based VMAT QA, with the capability of separating TPS and delivery errors.

  5. Development of database and QA systems for post closure performance assessment on a potential HLW repository

    International Nuclear Information System (INIS)

    Hwang, Y. S.; Kim, S. G.; Kang, C. H.

    2002-01-01

    In the TSPA of long-term post-closure radiological safety for permanent disposal of HLW in Korea, appropriate management of input and output data through QA is necessary. A robust QA system has been developed using the T2R3 principles applicable to the five major steps in R&D. The proposed system is implemented as a web-based system so that all participants in the TSPA are able to access it. In addition, an internet-based input database for the TSPA has been developed. Currently, data from literature surveys, domestic laboratory and field experiments, as well as expert elicitation are applied to the TSPA

  6. SU-F-T-285: Evaluation of a Patient DVH-Based IMRT QA System

    International Nuclear Information System (INIS)

    Zhen, H; Redler, G; Chu, J; Turian, J

    2016-01-01

    Purpose: To evaluate the clinical performance of a patient DVH-based QA system for prostate VMAT QA. Methods: Mobius3D(M3D) is a QA software with an independent beam model and dose engine. The MobiusFX(MFX) add-on predicts patient dose using treatment machine log files. We commissioned the Mobius beam model in two steps. First, the stock beam model was customized using machine commissioning data, then verified against the TPS with 12 simple phantom plans and 7 clinical 3D plans. Secondly, the Dosimetric Leaf Gap(DLG) in the Mobius model was fine-tuned for VMAT treatment based on ion chamber measurements for 6 clinical VMAT plans. Upon successful commissioning, we retrospectively performed IMRT QA for 12 VMAT plans with the Mobius system as well as the ArcCHECK-3DVH system. Selected patient DVH values (PTV D95, D50; Bladder D2cc, Dmean; Rectum D2cc) were compared between TPS, M3D, MFX, and 3DVH. Results: During the first commissioning step, TPS and M3D calculated target Dmean for 3D plans agree within 0.7%±0.7%, with 3D gamma passing rates of 98%±2%. In the second commissioning step, the Mobius DLG was adjusted by 1.2mm from the stock value, reducing the average difference between MFX calculation and ion chamber measurement from 3.2% to 0.1%. In retrospective prostate VMAT QA, 5 of 60 MFX calculated DVH values have a deviation greater than 5% compared to TPS. One large deviation at high dose level was identified as a potential QA failure. This echoes the 3DVH QA result, which identified 2 instances of large DVH deviation on the same structure. For all DVHs evaluated, M3D and MFX show a high level of agreement (0.1%±0.2%), indicating that the observed deviation is likely from beam modelling differences rather than delivery errors. Conclusion: The Mobius system provides a viable solution for DVH-based VMAT QA, with the capability of separating TPS and delivery errors.
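
    The DVH-metric comparison described above reduces to computing the percent deviation of each reconstructed metric against the TPS value and flagging those beyond an action level. A minimal sketch; the 5% threshold matches the deviation criterion quoted in the abstract, but the metric names and dose values below are invented for illustration.

```python
def dvh_deviations(tps, recon, threshold=5.0):
    """Percent deviation of reconstructed DVH metrics versus the TPS.
    tps, recon: {metric_name: dose_in_Gy}.
    Returns {metric_name: (percent_deviation, flagged)} where flagged is
    True when |deviation| exceeds the action level in percent."""
    out = {}
    for name, ref in tps.items():
        pct = 100.0 * (recon[name] - ref) / ref
        out[name] = (pct, abs(pct) > threshold)
    return out

# Hypothetical TPS and log-file-reconstructed DVH metrics for one plan.
tps = {"PTV_D95": 74.1, "PTV_D50": 76.0, "Bladder_D2cc": 70.2, "Rectum_D2cc": 68.5}
recon = {"PTV_D95": 73.6, "PTV_D50": 75.8, "Bladder_D2cc": 66.1, "Rectum_D2cc": 68.9}

for name, (pct, flagged) in dvh_deviations(tps, recon).items():
    print(f"{name:13s} {pct:+6.2f}% {'FAIL' if flagged else 'pass'}")
```

    A flagged metric on a critical structure corresponds to the "potential QA failure" case the study describes, prompting investigation of whether the cause is the beam model or the delivery.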

  7. Intelligent Mission Controller Node

    National Research Council Canada - National Science Library

    Perme, David

    2002-01-01

    The goal of the Intelligent Mission Controller Node (IMCN) project was to improve the process of translating mission taskings between real-world Command, Control, Communications, Computers, and Intelligence (C4I...

  8. Algorithms in ambient intelligence

    NARCIS (Netherlands)

    Aarts, E.H.L.; Korst, J.H.M.; Verhaegh, W.F.J.; Verhaegh, W.F.J.; Aarts, E.H.L.; Korst, J.H.M.

    2004-01-01

    In this chapter, we discuss the new paradigm for user-centered computing known as ambient intelligence and its relation with methods and techniques from the field of computational intelligence, including problem solving, machine learning, and expert systems.

  9. Advanced intelligent systems

    CERN Document Server

    Ryoo, Young; Jang, Moon-soo; Bae, Young-Chul

    2014-01-01

    Intelligent systems originated in the attempt to imitate the human brain. People wish to let machines perform intelligent work. Many techniques of intelligent systems are based on artificial intelligence. According to changing and novel requirements, advanced intelligent systems cover a wide spectrum: big data processing, intelligent control, advanced robotics, artificial intelligence and machine learning. This book focuses on coordinating intelligent systems with highly integrated and foundationally functional components. The book consists of 19 contributions that feature social network-based recommender systems, application of fuzzy enforcement, energy visualization, ultrasonic muscular thickness measurement, regional analysis and predictive modeling, analysis of 3D polygon data, blood pressure estimation system, fuzzy human model, fuzzy ultrasonic imaging method, ultrasonic mobile smart technology, pseudo-normal image synthesis, subspace classifier, mobile object tracking, standing-up moti...

  10. Designing Interfaces

    CERN Document Server

    Tidwell, Jenifer

    2010-01-01

    Despite all of the UI toolkits available today, it's still not easy to design good application interfaces. This bestselling book is one of the few reliable sources to help you navigate through the maze of design options. By capturing UI best practices and reusable ideas as design patterns, Designing Interfaces provides solutions to common design problems that you can tailor to the situation at hand. This updated edition includes patterns for mobile apps and social media, as well as web applications and desktop software. Each pattern contains full-color examples and practical design advice th

  11. Artificial Intelligence Project

    Science.gov (United States)

    1990-01-01

    Symposium on Artificial Intelligence and Software Engineering Working Notes, March 1989. Blumenthal, Brad, "An Architecture for Automating...Artificial Intelligence Project Final Technical Report, ARO Contract: DAAG29-84-K-0060, Artificial Intelligence Laboratory, The University of Texas at Austin, Austin, Texas 78712

  12. Monte Carlo-based QA for IMRT of head and neck cancers

    Science.gov (United States)

    Tang, F.; Sham, J.; Ma, C.-M.; Li, J.-S.

    2007-06-01

    It is well-known that the presence of a large air cavity in a dense medium (or patient) introduces significant electronic disequilibrium when irradiated with a megavoltage X-ray field. This condition may be worsened by the possible use of tiny beamlets in intensity-modulated radiation therapy (IMRT). Commercial treatment planning systems (TPSs), in particular those based on the pencil-beam method, do not provide accurate dose computation for the lungs and other cavity-laden body sites such as the head and neck. In this paper we present the use of the Monte Carlo (MC) technique for dose re-calculation of IMRT of head and neck cancers. In our clinic, a turn-key software system is set up for MC calculation and comparison with TPS-calculated treatment plans as part of the quality assurance (QA) programme for IMRT delivery. A set of 10 off-the-shelf PCs is employed as the MC calculation engine with treatment plan parameters imported from the TPS via a graphical user interface (GUI) which also provides a platform for launching remote MC simulation and subsequent dose comparison with the TPS. The TPS-segmented intensity maps are used as input for the simulation hence skipping the time-consuming simulation of the multi-leaf collimator (MLC). The primary objective of this approach is to assess the accuracy of the TPS calculations in the presence of air cavities in the head and neck whereas the accuracy of leaf segmentation is verified by fluence measurement using a fluoroscopic camera-based imaging device. This measurement can also validate the correct transfer of intensity maps to the record-and-verify system. Comparisons between TPS and MC calculations of 6 MV IMRT for typical head and neck treatments reveal regional consistency in dose distribution except at and around the sinuses where our pencil-beam-based TPS sometimes over-predicts the dose by up to 10%, depending on the size of the cavities. In addition, dose re-buildup of up to 4% is observed at the posterior nasopharyngeal

  13. Tolerance limits and methodologies for IMRT measurement-based verification QA: Recommendations of AAPM Task Group No. 218.

    Science.gov (United States)

    Miften, Moyed; Olch, Arthur; Mihailidis, Dimitris; Moran, Jean; Pawlicki, Todd; Molineu, Andrea; Li, Harold; Wijesooriya, Krishni; Shi, Jie; Xia, Ping; Papanikolaou, Nikos; Low, Daniel A

    2018-04-01

    Patient-specific IMRT QA measurements are important components of processes designed to identify discrepancies between calculated and delivered radiation doses. Discrepancy tolerance limits are neither well defined nor consistently applied across centers. The AAPM TG-218 report provides a comprehensive review aimed at improving the understanding and consistency of these processes as well as recommendations for methodologies and tolerance limits in patient-specific IMRT QA. The performance of the dose difference/distance-to-agreement (DTA) and γ dose distribution comparison metrics is investigated. Measurement methods are reviewed and followed by a discussion of the pros and cons of each. Methodologies for absolute dose verification are discussed and new IMRT QA verification tools are presented. Literature on the expected or achievable agreement between measurements and calculations for different types of planning and delivery systems is reviewed and analyzed. Tests of vendor implementations of the γ verification algorithm employing benchmark cases are presented. Operational shortcomings that can reduce the γ tool accuracy and subsequent effectiveness for IMRT QA are described. Practical considerations including spatial resolution, normalization, dose threshold, and data interpretation are discussed. Published data on IMRT QA and the clinical experience of the group members are used to develop guidelines and recommendations on tolerance and action limits for IMRT QA. Steps to check failed IMRT QA plans are outlined. Recommendations on delivery methods, data interpretation, dose normalization, the use of γ analysis routines and choice of tolerance limits for IMRT QA are made with focus on detecting differences between calculated and measured doses via the use of robust analysis methods and an in-depth understanding of IMRT verification metrics. The recommendations are intended to improve the IMRT QA process and establish consistent and comparable IMRT QA
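The γ metric reviewed by TG-218 combines a dose-difference criterion with a distance-to-agreement criterion. The following is a minimal 1-D global-γ sketch using a brute-force search; clinical tools implement optimized 2-D/3-D versions with interpolation, and the profile used in the demo is invented:

```python
import numpy as np

def gamma_1d(ref_dose, eval_dose, positions, dose_tol=0.03, dta_tol=3.0,
             threshold=0.10):
    """Global 1-D gamma passing rate (%). `dose_tol` is a fraction of the
    maximum reference dose; `dta_tol` and `positions` are in mm."""
    d_norm = dose_tol * ref_dose.max()
    cutoff = threshold * ref_dose.max()
    gammas = []
    for x_r, d_r in zip(positions, ref_dose):
        if d_r < cutoff:
            continue  # skip points below the low-dose threshold
        # gamma at a reference point: minimum combined dose-difference/DTA metric
        g2 = ((eval_dose - d_r) / d_norm) ** 2 + ((positions - x_r) / dta_tol) ** 2
        gammas.append(np.sqrt(g2.min()))
    gammas = np.array(gammas)
    return 100.0 * np.mean(gammas <= 1.0)

# Demo: a 1 mm lateral shift of a broad Gaussian profile passes 3%/3 mm easily
x = np.arange(-50.0, 51.0, 1.0)
ref = 100 * np.exp(-x**2 / (2 * 15.0**2))
shifted = 100 * np.exp(-(x - 1.0) ** 2 / (2 * 15.0**2))
print(gamma_1d(ref, shifted, x))  # → 100.0
```

Note how the normalization choice (global, against the maximum reference dose) and the low-dose threshold both directly change the passing rate, which is exactly why TG-218 standardizes them.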

  14. Orchestrating Multiple Intelligences

    Science.gov (United States)

    Moran, Seana; Kornhaber, Mindy; Gardner, Howard

    2006-01-01

    Education policymakers often go astray when they attempt to integrate multiple intelligences theory into schools, according to the originator of the theory, Howard Gardner, and his colleagues. The greatest potential of a multiple intelligences approach to education grows from the concept of a profile of intelligences. Each learner's intelligence…

  15. Algorithms in ambient intelligence

    NARCIS (Netherlands)

    Aarts, E.H.L.; Korst, J.H.M.; Verhaegh, W.F.J.; Weber, W.; Rabaey, J.M.; Aarts, E.

    2005-01-01

    We briefly review the concept of ambient intelligence and discuss its relation with the domain of intelligent algorithms. By means of four examples of ambient intelligent systems, we argue that new computing methods and quantification measures are needed to bridge the gap between the class of

  16. Designing with computational intelligence

    CERN Document Server

    Lopes, Heitor; Mourelle, Luiza

    2017-01-01

    This book discusses a number of real-world applications of computational intelligence approaches. Using various examples, it demonstrates that computational intelligence has become a consolidated methodology for automatically creating new competitive solutions to complex real-world problems. It also presents a concise and efficient synthesis of different systems using computationally intelligent techniques.

  17. Reflection on robotic intelligence

    NARCIS (Netherlands)

    Bartneck, C.

    2006-01-01

    This paper reflects on the development of robots, both their physical shape as well as their intelligence. The latter strongly depends on the progress made in the artificial intelligence (AI) community, which does not yet provide the models and tools necessary to create intelligent robots. It is time

  18. Interface unit

    NARCIS (Netherlands)

    Keyson, D.V.; Freudenthal, A.; De Hoogh, M.P.A.; Dekoven, E.A.M.

    2001-01-01

    The invention relates to an interface unit comprising at least a display unit for communication with a user, which is designed for being coupled with a control unit for at least one or more parameters in a living or working environment, such as the temperature setting in a house, which control unit

  19. Recommendation in Motion: Intelligent Hypertouch Garment Design

    Directory of Open Access Journals (Sweden)

    Shuang Liang

    2013-01-01

    Intelligent CAD garment design is becoming increasingly popular, attracting attention from both manufacturers and professional stylists. The existing garment CAD systems and clothing simulation software fail to provide user-friendly interfaces or dynamic recommendation during the garment creation process. In this paper, we propose an intelligent hypertouch garment design system, which dynamically predicts possible solutions during the intelligent design procedure. User behavioral information and dynamic shape matching are used to learn and predict the desired garment patterns. We also propose a new hypertouch concept of gesture-based interaction for our system. We evaluate our system with a prototype platform. The results show that our system is effective, robust, and easy to use for quick garment design.

  20. Interface superconductivity

    Energy Technology Data Exchange (ETDEWEB)

    Gariglio, S., E-mail: stefano.gariglio@unige.ch [DQMP, Université de Genève, 24 Quai E.-Ansermet, CH-1211 Genève (Switzerland); Gabay, M. [Laboratoire de Physique des Solides, Bat 510, Université Paris-Sud 11, Centre d’Orsay, 91405 Orsay Cedex (France); Mannhart, J. [Max Planck Institute for Solid State Research, 70569 Stuttgart (Germany); Triscone, J.-M. [DQMP, Université de Genève, 24 Quai E.-Ansermet, CH-1211 Genève (Switzerland)

    2015-07-15

    Highlights: • We discuss interfacial superconductivity, a field boosted by the discovery of the superconducting interface between LaAlO{sub 3} and SrTiO{sub 3}. • This system allows the electric field control and the on/off switching of the superconducting state. • We compare superconductivity at the interface and in bulk doped SrTiO{sub 3}. • We discuss the role of the interfacially induced Rashba-type spin–orbit coupling. • We briefly discuss superconductivity in cuprates, in electrical double layer transistor field effect experiments. • Recent observations of a high T{sub c} in a monolayer of FeSe deposited on SrTiO{sub 3} are presented. - Abstract: Low dimensional superconducting systems have been the subject of numerous studies for many years. In this article, we focus our attention on interfacial superconductivity, a field that has been boosted by the discovery of superconductivity at the interface between the two band insulators LaAlO{sub 3} and SrTiO{sub 3}. We explore the properties of this amazing system that allows the electric field control and on/off switching of superconductivity. We discuss the similarities and differences between bulk doped SrTiO{sub 3} and the interface system and the possible role of the interfacially induced Rashba-type spin–orbit coupling. We also, more briefly, discuss interface superconductivity in cuprates, in electrical double layer transistor field effect experiments, and the recent observation of a high T{sub c} in a monolayer of FeSe deposited on SrTiO{sub 3}.

  1. Natural-language processing applied to an ITS interface

    OpenAIRE

    Antonio Gisolfi; Enrico Fischetti

    1994-01-01

    The aim of this paper is to show that, with a subset of a natural language, simple systems running on PCs can be developed that can nevertheless be an effective tool for interfacing purposes in the building of an Intelligent Tutoring System (ITS). After presenting the special characteristics of the Smalltalk/V language, which provides an appropriate environment for the development of an interface, the overall architecture of the interface module is discussed. We then show how sentences are par...

  2. 40 CFR 98.214 - Monitoring and QA/QC requirements.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Monitoring and QA/QC requirements. 98.214 Section 98.214 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS... standard method or other enhanced industry consensus standard method published by an industry consensus...

  3. 40 CFR 98.394 - Monitoring and QA/QC requirements.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Monitoring and QA/QC requirements. 98... by a consensus-based standards organization exists, such a method shall be used. Consensus-based... (NAESB). (ii) Where no appropriate standard method developed by a consensus-based standards organization...

  4. 40 CFR 98.424 - Monitoring and QA/QC requirements.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Monitoring and QA/QC requirements. 98... published by a consensus-based standards organization if such a method exists. Consensus-based standards...). (ii) Where no appropriate standard method developed by a consensus-based standards organization exists...

  5. 40 CFR 98.144 - Monitoring and QA/QC requirements.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Monitoring and QA/QC requirements. 98... fraction for each carbonate consumed based on sampling and chemical analysis using an industry consensus... testing method published by an industry consensus standards organization (e.g., ASTM, ASME, API, etc.). ...

  6. 40 CFR 98.404 - Monitoring and QA/QC requirements.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Monitoring and QA/QC requirements. 98... published by a consensus-based standards organization exists, such a method shall be used. Consensus-based... (NAESB). (ii) Where no appropriate standard method developed by a consensus-based standards organization...

  7. 40 CFR 98.164 - Monitoring and QA/QC requirements.

    Science.gov (United States)

    2010-07-01

    ... Methods for Instrumental Determination of Carbon, Hydrogen, and Nitrogen in Petroleum Products and... Determination of Carbon, Hydrogen, and Nitrogen in Laboratory Samples of Coal (incorporated by reference, see... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Hydrogen Production § 98.164 Monitoring and QA/QC requirements...

  8. How social Q&A sites are changing knowledge sharing in open source software communities

    NARCIS (Netherlands)

    Vasilescu, B.N.; Serebrenik, A.; Devanbu, P.; Filkov, V.

    2014-01-01

    Historically, mailing lists have been the preferred means for coordinating development and user support activities. With the emergence and popularity growth of social Q&A sites such as the StackExchange network (e.g., StackOverflow), this is beginning to change. Such sites offer different

  9. 40 CFR 98.414 - Monitoring and QA/QC requirements.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Monitoring and QA/QC requirements. 98.414 Section 98.414 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Industrial Greenhouse Gases § 98.414 Monitoring...

  10. Using a User-Interactive QA System for Personalized E-Learning

    Science.gov (United States)

    Hu, Dawei; Chen, Wei; Zeng, Qingtian; Hao, Tianyong; Min, Feng; Wenyin, Liu

    2008-01-01

    A personalized e-learning framework based on a user-interactive question-answering (QA) system is proposed, in which a user-modeling approach is used to capture personal information of students and a personalized answer extraction algorithm is proposed for personalized automatic answering. In our approach, a topic ontology (or concept hierarchy)…

  11. A virtual dosimetry audit - Towards transferability of gamma index analysis between clinical trial QA groups.

    Science.gov (United States)

    Hussein, Mohammad; Clementel, Enrico; Eaton, David J; Greer, Peter B; Haworth, Annette; Ishikura, Satoshi; Kry, Stephen F; Lehmann, Joerg; Lye, Jessica; Monti, Angelo F; Nakamura, Mitsuhiro; Hurkmans, Coen; Clark, Catharine H

    2017-12-01

    Quality assurance (QA) for clinical trials is important. Lack of compliance can affect trial outcome. Clinical trial QA groups have different methods of dose distribution verification and analysis, all with the ultimate aim of ensuring trial compliance. The aim of this study was to gain a better understanding of different processes to inform future dosimetry audit reciprocity. Six clinical trial QA groups participated. Intensity modulated treatment plans were generated for three different cases. A range of 17 virtual 'measurements' were generated by introducing a variety of simulated perturbations (such as MLC position deviations, dose differences, gantry rotation errors, Gaussian noise) to three different treatment plan cases. Participants were blinded to the 'measured' data details. Each group analysed the datasets using their own gamma index (γ) technique and using standardised parameters for passing criteria, lower dose threshold, γ normalisation and global γ. For the same virtual 'measured' datasets, different results were observed using local techniques. Even for the standardised γ, differences in the percentage of points passing with γ≤1 were still observed between groups. This virtual audit has been an informative step in understanding differences in the verification of measured dose distributions between different clinical trial QA groups. This work lays the foundations for audit reciprocity between groups, particularly with more clinical trials being open to international recruitment. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. 76 FR 5832 - International Business Machines (IBM), Software Group Business Unit, Optim Data Studio Tools QA...

    Science.gov (United States)

    2011-02-02

    ... DEPARTMENT OF LABOR Employment and Training Administration [TA-W-74,554] International Business Machines (IBM), Software Group Business Unit, Optim Data Studio Tools QA, San Jose, CA; Notice of Affirmative Determination Regarding Application for Reconsideration By application dated November 29, 2010, a worker and a state workforce official...

  13. 40 CFR 98.174 - Monitoring and QA/QC requirements.

    Science.gov (United States)

    2010-07-01

    ... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Iron and Steel Production § 98.174 Monitoring and QA/QC... moisture content of the stack gas. (5) Determine the mass rate of process feed or process production (as... Fusion Techniques (incorporated by reference, see § 98.7) for iron and ferrous scrap. (v) ASM CS-104 UNS...

  14. The Long and Winding Road: Duties of an NHEERL QA Manager from 1999 to 2008

    Science.gov (United States)

    My career as a US EPA Quality Assurance Manager (QAM) started on September 26, 1999 when I was appointed the QA and Records Manager for the Experimental Toxicology Division (ETD) in NHEERL, in the Office of Research and Development (ORD), on the Research Triangle Campus in RTP, N...

  15. Confocal Microscopy and Flow Cytometry System Performance: Assessment of QA Parameters that Affect Data Quantification

    Science.gov (United States)

    Flow and image cytometers can provide useful quantitative fluorescence data. We have devised QA tests to be used on both a flow cytometer and a confocal microscope to assure that the data is accurate, reproducible and precise. Flow Cytometry: We have provided two simple perform...

  16. Expansion of polyalanine tracts in the QA domain may play a critical ...

    Indian Academy of Sciences (India)

    2015-09-03

    Sep 3, 2015 ... role in the clavicular development of cleidocranial dysplasia. LI-ZHENG ... RUNX2 mutations have been identified in nearly 500 families with CCD ... tracts in the QA domain of RUNX2 influences the transcriptional activity of ...

  17. Named Entity Recognition in a Hungarian NL Based QA System

    Science.gov (United States)

    Tikkl, Domonkos; Szidarovszky, P. Ferenc; Kardkovacs, Zsolt T.; Magyar, Gábor

    In the WoW project our purpose is to create a complex search interface with the following features: search in the deep web content of contracted partners' databases, processing Hungarian natural language (NL) questions and transforming them into SQL queries for database access, and image search supported by a visual thesaurus that describes in a structural form the visual content of images (also in Hungarian). This paper primarily focuses on a particular problem of the question-processing task: entity recognition. Before going into details we give a short overview of the project's aims.

  18. Application of graded QA in the nuclear industry

    International Nuclear Information System (INIS)

    Churchill, G.F.

    1987-01-01

    All current schemes for grading of quality assurance are developments or variations on three themes: defining and grouping the planning and systematic actions which constitute the quality assurance approach, defining and classifying the safety, economic and other factors which influence the extent of quality assurance necessary, and using classification or grading schedules to relate grades of quality assurance to classes of factors. The codes and standards which define the actions needed for good management are listed and discussed. A typical equipment classification schedule for a light water reactor is shown. Contract interface documentation and surveillance requirements are discussed. (UK)

  19. Social intelligence, human intelligence and niche construction.

    Science.gov (United States)

    Sterelny, Kim

    2007-04-29

    This paper is about the evolution of hominin intelligence. I agree with defenders of the social intelligence hypothesis in thinking that externalist models of hominin intelligence are not plausible: such models cannot explain the unique cognition and cooperation explosion in our lineage, for changes in the external environment (e.g. increasing environmental unpredictability) affect many lineages. Both the social intelligence hypothesis and the social intelligence-ecological complexity hybrid I outline here are niche construction models. Hominin evolution is hominin response to selective environments that earlier hominins have made. In contrast to social intelligence models, I argue that hominins have both created and responded to a unique foraging mode; a mode that is both social in itself and which has further effects on hominin social environments. In contrast to some social intelligence models, on this view, hominin encounters with their ecological environments continue to have profound selective effects. However, though the ecological environment selects, it does not select on its own. Accidents and their consequences, differential success and failure, result from the combination of the ecological environment an agent faces and the social features that enhance some opportunities and suppress others and that exacerbate some dangers and lessen others. Individuals do not face the ecological filters on their environment alone, but with others, and with the technology, information and misinformation that their social world provides.

  20. On the use of biomathematical models in patient-specific IMRT dose QA

    Energy Technology Data Exchange (ETDEWEB)

    Zhen Heming [UT Southwestern Medical Center, Dallas, Texas 75390 (United States); Nelms, Benjamin E. [Canis Lupus LLC, Merrimac, Wisconsin 53561 (United States); Tome, Wolfgang A. [Department of Radiation Oncology, Division of Medical Physics, Montefiore Medical Center and Institute of Onco-Physics, Albert Einstein College of Medicine, Bronx, New York 10461 (United States)

    2013-07-15

    Purpose: To investigate the use of biomathematical models such as tumor control probability (TCP) and normal tissue complication probability (NTCP) as new quality assurance (QA) metrics. Methods: Five different types of error (MLC transmission, MLC penumbra, MLC tongue and groove, machine output, and MLC position) were intentionally introduced into 40 clinical intensity modulated radiation therapy (IMRT) patient plans (20 H and N cases and 20 prostate cases) to simulate both treatment planning system errors and machine delivery errors in the IMRT QA process. The changes in TCP and NTCP for eight different anatomic structures (H and N: CTV, GTV, both parotids, spinal cord, larynx; prostate: CTV, rectal wall) were calculated as the new QA metrics to quantify the clinical impact on patients. The correlation between the change in TCP/NTCP and the change in selected DVH values was also evaluated. The relation between TCP/NTCP change and the characteristics of the TCP/NTCP curves is discussed. Results: {Delta}TCP and {Delta}NTCP were summarized for each type of induced error and each structure. The changes/degradations in TCP and NTCP caused by the errors vary widely depending on dose patterns unique to each plan, and are good indicators of each plan's 'robustness' to that type of error. Conclusions: In this in silico QA study the authors have demonstrated the possibility of using biomathematical models not only as patient-specific QA metrics but also as objective indicators that quantify, pretreatment, a plan's robustness with respect to possible error types.
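A TCP change of the kind used here as a QA metric can be illustrated with a simple LQ-Poisson model. This is a generic sketch, not the authors' model; the clonogen density, radiosensitivity, and dose values are made-up radiobiological parameters chosen only to show the sensitivity of TCP to a small dose error:

```python
import numpy as np

def tcp_poisson(dose_voxels, n0_per_voxel, alpha):
    """LQ-Poisson tumor control probability for a voxelized dose (Gy):
    TCP = exp(-sum_i N0 * exp(-alpha * D_i)). The beta (quadratic) term
    is omitted for brevity."""
    surviving_clonogens = n0_per_voxel * np.exp(-alpha * dose_voxels)
    return float(np.exp(-surviving_clonogens.sum()))

dose = np.full(1000, 70.0)  # hypothetical uniform CTV dose, 1000 voxels
tcp_nominal = tcp_poisson(dose, n0_per_voxel=1e4, alpha=0.25)
tcp_with_error = tcp_poisson(dose * 0.97, n0_per_voxel=1e4, alpha=0.25)
print(f"nominal TCP: {tcp_nominal:.3f}")
print(f"TCP after a 3% output underdose: {tcp_with_error:.3f}")
```

Even a uniform 3% underdose produces a visible TCP drop in this toy model, which is the kind of clinically oriented signal the abstract proposes to use alongside gamma passing rates.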

  1. SU-C-BRD-03: Closing the Loop On Virtual IMRT QA

    International Nuclear Information System (INIS)

    Valdes, G; Scheuermann, R; Y, H C.; Olszanski, A; Bellerive, M; Solberg, T

    2015-01-01

    Purpose: To develop an algorithm that predicts a priori IMRT QA passing rates. Methods: 416 IMRT plans from all treatment sites were planned in Eclipse version 11 and delivered using a dynamic sliding window technique on Clinac iX or TrueBeam linacs (Varian Medical Systems, Palo Alto, CA). The 3%/3mm and 2%/2mm local distance-to-agreement (DTA) passing rates were recorded during clinical operations using a commercial 2D diode array (MapCHECK 2, Sun Nuclear, Melbourne, FL). Each plan was characterized by 37 metrics that describe different failure modes between the calculated and measured dose. Machine-learning algorithms (MLAs) were trained to learn the relation between the plan characteristics and each passing rate. Minimization of the cross-validated error, together with maximum a posteriori estimation (MAP), was used to choose the model parameters. Results: 3%/3mm local DTA can be predicted with an error smaller than 3% for 98% of the plans. For the remaining 2% of plans, the residual error was within 5%. For 2%/2mm local DTA passing rates, 96% of the plans were successfully predicted with an error smaller than 5%. All high-risk plans that failed the 2%/2mm local criteria were correctly identified by the algorithm. The most important metric to describe the passing rates was determined to be the MU per Gray (modulation factor). Conclusions: Log files and independent dose calculations have been suggested as possible substitutes for measurement-based IMRT QA. However, none of these methods answers the fundamental question of whether a plan can be delivered with a clinically acceptable error given the limitations of the linacs and the treatment planning system. Predicting the IMRT QA passing rates a priori closes that loop. For additional robustness, virtual IMRT QA can be combined with Linac QA and log file analysis to confirm appropriate delivery
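The idea of learning passing rates from plan-complexity metrics can be sketched with a toy linear model on synthetic data. The authors used more sophisticated machine-learning algorithms and 37 real plan features; here both features (MU per Gy, the modulation factor the abstract found most predictive, plus an invented aperture-complexity surrogate) and the "ground truth" passing rates are fabricated for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200

# Synthetic plan-complexity features (both are made up for this sketch)
mu_per_gy = rng.uniform(200.0, 600.0, n)
aperture_complexity = rng.uniform(0.1, 1.0, n)

# Synthetic "ground truth": more modulation -> lower 3%/3mm passing rate
passing_rate = (100.0 - 0.02 * (mu_per_gy - 200.0)
                - 2.0 * aperture_complexity + rng.normal(0.0, 0.5, n))

# Ordinary least squares as a stand-in for the paper's machine-learning models
X = np.column_stack([np.ones(n), mu_per_gy, aperture_complexity])
coef, *_ = np.linalg.lstsq(X, passing_rate, rcond=None)

mae = np.mean(np.abs(X @ coef - passing_rate))
print(f"mean absolute prediction error: {mae:.2f}%")
```

In practice one would cross-validate the model and flag plans whose predicted passing rate falls near or below the action limit before any measurement is made.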

  2. SU-F-P-07: Applying Failure Modes and Effects Analysis to Treatment Planning System QA

    International Nuclear Information System (INIS)

    Mathew, D; Alaei, P

    2016-01-01

    Purpose: To perform a small-scale implementation of Failure Modes and Effects Analysis (FMEA) for treatment planning system QA, utilizing the methodology of the AAPM TG-100 report. Methods: FMEA requires numerical values for severity (S), occurrence (O) and detectability (D) of each mode of failure. The product of these three values gives a risk priority number (RPN). We have implemented FMEA for the treatment planning system (TPS) QA for two clinics which use the Pinnacle and Eclipse TPS. Quantitative monthly QA data dating back 4 years for Pinnacle and 1 year for Eclipse have been used to determine values for severity (deviations from predetermined doses at points or volumes), and occurrence of such deviations. The TPS QA protocol includes a phantom containing solid water and lung- and bone-equivalent heterogeneities. Photon and electron plans have been evaluated in both systems. The dose values at multiple distinct points of interest (POI) within the solid water, lung, and bone-equivalent slabs, as well as mean doses to several volumes of interest (VOI), have been re-calculated monthly using the available algorithms. Results: The computed doses vary slightly month-over-month. There have been more significant deviations following software upgrades, especially if the upgrade involved re-modeling of the beams. TG-100 guidance and the data presented here suggest an occurrence (O) of 2 depending on the frequency of re-commissioning the beams, severity (S) of 3, and detectability (D) of 2, giving an RPN of 12. Conclusion: Computerized treatment planning systems could pose a risk due to dosimetric errors and suboptimal treatment plans. The FMEA analysis presented here suggests that TPS QA should immediately follow software upgrades, but does not need to be performed every month.
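The RPN arithmetic in TG-100-style FMEA is straightforward; a minimal sketch reproducing the scores quoted in the abstract (the 1-10 score range is the common FMEA convention, assumed here):

```python
def risk_priority_number(severity, occurrence, detectability):
    """FMEA risk priority number: RPN = S * O * D.
    Each score is conventionally on a 1-10 scale; higher means riskier."""
    for score in (severity, occurrence, detectability):
        if not 1 <= score <= 10:
            raise ValueError("FMEA scores are conventionally 1-10")
    return severity * occurrence * detectability

# Scores suggested in the abstract for TPS dose deviations after upgrades
print(risk_priority_number(severity=3, occurrence=2, detectability=2))  # → 12
```

Failure modes are then ranked by RPN so that QA effort (e.g. re-checking the TPS right after a software upgrade) is directed at the highest-risk modes first.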

  3. SU-F-P-07: Applying Failure Modes and Effects Analysis to Treatment Planning System QA

    Energy Technology Data Exchange (ETDEWEB)

    Mathew, D; Alaei, P [University Minnesota, Minneapolis, MN (United States)

    2016-06-15

    Purpose: To perform a small-scale implementation of Failure Modes and Effects Analysis (FMEA) for treatment planning system QA, utilizing the methodology of the AAPM TG-100 report. Methods: FMEA requires numerical values for severity (S), occurrence (O) and detectability (D) of each mode of failure. The product of these three values gives a risk priority number (RPN). We have implemented FMEA for the treatment planning system (TPS) QA for two clinics which use the Pinnacle and Eclipse TPS. Quantitative monthly QA data dating back 4 years for Pinnacle and 1 year for Eclipse have been used to determine values for severity (deviations from predetermined doses at points or volumes), and occurrence of such deviations. The TPS QA protocol includes a phantom containing solid water and lung- and bone-equivalent heterogeneities. Photon and electron plans have been evaluated in both systems. The dose values at multiple distinct points of interest (POI) within the solid water, lung, and bone-equivalent slabs, as well as mean doses to several volumes of interest (VOI), have been re-calculated monthly using the available algorithms. Results: The computed doses vary slightly month-over-month. There have been more significant deviations following software upgrades, especially if the upgrade involved re-modeling of the beams. TG-100 guidance and the data presented here suggest an occurrence (O) of 2 depending on the frequency of re-commissioning the beams, severity (S) of 3, and detectability (D) of 2, giving an RPN of 12. Conclusion: Computerized treatment planning systems could pose a risk due to dosimetric errors and suboptimal treatment plans. The FMEA analysis presented here suggests that TPS QA should immediately follow software upgrades, but does not need to be performed every month.

  4. SU-E-T-432: A Rapid and Comprehensive Procedure for Daily Proton QA

    Energy Technology Data Exchange (ETDEWEB)

    Zhao, T; Sun, B; Grantham, K; Knutson, N; Santanam, L; Goddu, S; Klein, E [Washington University, St. Louis, MO (United States)

    2014-06-01

    Purpose: To develop a rapid and comprehensive daily QA procedure, implemented at the S. Lee Kling Proton Therapy Center at Barnes-Jewish Hospital. Methods: A scribed phantom with embedded fiducials is used to check laser accuracy, couch isocentricity, and X-ray imaging congruence with the isocenter. A Daily QA3 device (Sun Nuclear, FL) is used to check output, range, and profiles. Five chambers in the central region possess various build-ups. After converting the thickness of the inherent build-ups into water-equivalent thickness (WET) for protons, the range of any beam can be checked with additional build-up on the Daily QA3 device. In our procedure, 3 beams from 3 bands (large, small, and deep) with a nominal range of 20 cm are checked daily. 17 cm of plastic water with a WET of 16.92 cm is used as additional build-up so that four chambers sit on the SOBP plateau at various depths and one sits on the distal fall-off. Readings from the five chambers are fitted to an error function that has been parameterized to match the SOBP with the same nominal range. The shift of the error function that maximizes the correlation between measurements and the model is taken as the range shift from the nominal value. Results: Couch isocentricity is maintained over 180 degrees. The imaging system is accurate with respect to the imaging and mechanical isocenters. Ranges agree within 1 mm with water-tank measurements, and the method is sensitive to sub-millimeter changes. Data acquired since the start of operation show that outputs, profiles, and ranges stay within 1% or 1 mm of baselines. The whole procedure takes about 40 minutes. Conclusion: Taking advantage of its design turns the Daily QA3, a device originally intended for photon and electron QA, into a comprehensive and rapid tool for daily proton QA.
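    The range-shift fit described above can be sketched as follows. This is a simplified stand-in, not the authors' implementation: it uses a least-squares grid search rather than correlation maximization, and all numerical parameters (the 20 cm nominal range, the 0.3 cm fall-off sigma, the chamber depths) are invented:

```python
import math

# Simplified stand-in for the range check: model the SOBP distal fall-off
# with an error function and grid-search the shift that best fits a few
# chamber readings (least squares here, rather than the correlation
# maximization described in the abstract). All parameters are invented.

def distal_edge(depth_cm, r80=20.0, sigma=0.3):
    """Idealized relative dose on the distal fall-off of an SOBP."""
    return 0.5 * (1.0 - math.erf((depth_cm - r80) / (math.sqrt(2.0) * sigma)))

def estimate_range_shift(depths, readings, nominal_r80=20.0):
    """Return the shift (cm) of the model edge that best fits the readings."""
    best_shift, best_sse = 0.0, float("inf")
    for i in range(-100, 101):               # search +/- 1 cm in 0.01 cm steps
        shift = i / 100.0
        sse = sum((r - distal_edge(d, nominal_r80 + shift)) ** 2
                  for d, r in zip(depths, readings))
        if sse < best_sse:
            best_shift, best_sse = shift, sse
    return best_shift

# Synthetic check: readings generated with a 3 mm overshoot are recovered.
depths = [19.4, 19.7, 20.0, 20.3, 20.6]
readings = [distal_edge(d, r80=20.3) for d in depths]
shift = estimate_range_shift(depths, readings)   # about 0.3 cm
```

    Because the fit uses the whole set of readings rather than a single chamber, it remains sensitive to sub-millimeter range changes even with only five sample points.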

  5. On the use of biomathematical models in patient-specific IMRT dose QA

    International Nuclear Information System (INIS)

    Zhen Heming; Nelms, Benjamin E.; Tomé, Wolfgang A.

    2013-01-01

    Purpose: To investigate the use of biomathematical models such as tumor control probability (TCP) and normal tissue complication probability (NTCP) as new quality assurance (QA) metrics. Methods: Five different types of error (MLC transmission, MLC penumbra, MLC tongue and groove, machine output, and MLC position) were intentionally induced in 40 clinical intensity modulated radiation therapy (IMRT) patient plans (20 H and N cases and 20 prostate cases) to simulate both treatment planning system errors and machine delivery errors in the IMRT QA process. The changes in TCP and NTCP for eight different anatomic structures (H and N: CTV, GTV, both parotids, spinal cord, larynx; prostate: CTV, rectal wall) were calculated as the new QA metrics to quantify the clinical impact on patients. The correlation between the change in TCP/NTCP and the change in selected DVH values was also evaluated. The relation between TCP/NTCP change and the characteristics of the TCP/NTCP curves is discussed. Results: ΔTCP and ΔNTCP were summarized for each type of induced error and each structure. The changes/degradations in TCP and NTCP caused by the errors vary widely depending on dose patterns unique to each plan, and are good indicators of each plan's “robustness” to that type of error. Conclusions: In this in silico QA study, the authors have demonstrated the possibility of using biomathematical models not only as patient-specific QA metrics but also as objective indicators that quantify, pretreatment, a plan's robustness with respect to possible error types.
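    A minimal example of the kind of biomathematical model involved: a Poisson TCP model and the ΔTCP it yields for a hypothetical 2 Gy delivery error. The parameter values are illustrative only and are not taken from the paper:

```python
import math

# A minimal Poisson TCP model, to show the kind of quantity the paper uses
# as a QA metric. Parameter values (10**7 clonogens, alpha = 0.3 per Gy)
# are illustrative only and are not taken from the paper.

def tcp_poisson(dose_gy, n_clonogens=1.0e7, alpha=0.3):
    """TCP = exp(-N * SF) with per-clonogen survival SF = exp(-alpha * D)."""
    return math.exp(-n_clonogens * math.exp(-alpha * dose_gy))

# Delta-TCP for a hypothetical 2 Gy under-dose of a 70 Gy prescription:
planned, delivered = 70.0, 68.0
delta_tcp = tcp_poisson(planned) - tcp_poisson(delivered)  # small positive loss
```

    Because TCP curves are steep near clinical doses, the same dosimetric error produces very different ΔTCP values depending on where a plan sits on the curve, which is exactly the plan-specific "robustness" the paper exploits.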

  6. SU-E-T-432: A Rapid and Comprehensive Procedure for Daily Proton QA

    International Nuclear Information System (INIS)

    Zhao, T; Sun, B; Grantham, K; Knutson, N; Santanam, L; Goddu, S; Klein, E

    2014-01-01

    Purpose: To develop a rapid and comprehensive daily QA procedure, implemented at the S. Lee Kling Proton Therapy Center at Barnes-Jewish Hospital. Methods: A scribed phantom with embedded fiducials is used to check laser accuracy, couch isocentricity, and X-ray imaging congruence with the isocenter. A Daily QA3 device (Sun Nuclear, FL) is used to check output, range, and profiles. Five chambers in the central region possess various build-ups. After converting the thickness of the inherent build-ups into water-equivalent thickness (WET) for protons, the range of any beam can be checked with additional build-up on the Daily QA3 device. In our procedure, 3 beams from 3 bands (large, small, and deep) with a nominal range of 20 cm are checked daily. 17 cm of plastic water with a WET of 16.92 cm is used as additional build-up so that four chambers sit on the SOBP plateau at various depths and one sits on the distal fall-off. Readings from the five chambers are fitted to an error function that has been parameterized to match the SOBP with the same nominal range. The shift of the error function that maximizes the correlation between measurements and the model is taken as the range shift from the nominal value. Results: Couch isocentricity is maintained over 180 degrees. The imaging system is accurate with respect to the imaging and mechanical isocenters. Ranges agree within 1 mm with water-tank measurements, and the method is sensitive to sub-millimeter changes. Data acquired since the start of operation show that outputs, profiles, and ranges stay within 1% or 1 mm of baselines. The whole procedure takes about 40 minutes. Conclusion: Taking advantage of its design turns the Daily QA3, a device originally intended for photon and electron QA, into a comprehensive and rapid tool for daily proton QA.

  7. SU-C-BRD-03: Closing the Loop On Virtual IMRT QA

    Energy Technology Data Exchange (ETDEWEB)

    Valdes, G; Scheuermann, R; Y, H C.; Olszanski, A; Bellerive, M; Solberg, T [University of Pennsylvania, Philadelphia, PA (United States)

    2015-06-15

    Purpose: To develop an algorithm that predicts IMRT QA passing rates a priori. Methods: 416 IMRT plans from all treatment sites were planned in Eclipse version 11 and delivered using a dynamic sliding window technique on Clinac iX or TrueBeam linacs (Varian Medical Systems, Palo Alto, CA). Passing rates at 3%/3mm and 2%/2mm local distance-to-agreement (DTA) were recorded during clinical operations using a commercial 2D diode array (MapCHECK 2, Sun Nuclear, Melbourne, FL). Each plan was characterized by 37 metrics that describe different failure modes between the calculated and measured dose. Machine-learning algorithms (MLAs) were trained to learn the relation between the plan characteristics and each passing rate. Minimization of the cross-validated error, together with maximum a posteriori (MAP) estimation, was used to choose the model parameters. Results: The 3%/3mm local DTA passing rate can be predicted with an error smaller than 3% for 98% of the plans. For the remaining 2% of plans, the residual error was within 5%. For 2%/2mm local DTA passing rates, 96% of the plans were successfully predicted with an error smaller than 5%. All high-risk plans that failed the 2%/2mm local criteria were correctly identified by the algorithm. The most important metric for describing the passing rates was determined to be the MU per Gray (modulation factor). Conclusions: Log files and independent dose calculations have been suggested as possible substitutes for measurement-based IMRT QA. However, neither of these methods answers the fundamental question of whether a plan can be delivered with a clinically acceptable error given the limitations of the linacs and the treatment planning system. Predicting the IMRT QA passing rates a priori closes that loop. For additional robustness, virtual IMRT QA can be combined with linac QA and log file analysis to confirm appropriate delivery.
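    The prediction idea can be sketched with a deliberately reduced stand-in: a single metric (MU per Gray, the modulation factor, reported above as the most important) and ordinary least squares in place of the paper's 37 metrics and machine-learning models. The training data are invented:

```python
# Deliberately reduced stand-in for "virtual IMRT QA": one plan metric
# (MU per Gray, the modulation factor) and ordinary least squares instead
# of 37 metrics and full machine-learning models. Training data invented.

def fit_ols(xs, ys):
    """Closed-form least squares for one predictor: (intercept, slope)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return my - slope * mx, slope

# Invented history: passing rate tends to drop as modulation increases.
mu_per_gy = [150, 200, 250, 300, 350, 400]
passing_rate = [99.5, 99.0, 98.2, 97.6, 96.9, 96.1]
a, b = fit_ols(mu_per_gy, passing_rate)

def predict(mu):
    """Predicted 3%/3mm passing rate for a plan with modulation factor mu."""
    return a + b * mu

# Flag plans predicted to fall below a 97% action level before measuring.
print(predict(380) < 97.0)  # -> True
```

    A real implementation replaces the linear model with the cross-validated machine-learning fit the paper describes, but the workflow is the same: score the plan before it is measured and flag likely failures.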

  8. Quality control of intelligence research

    International Nuclear Information System (INIS)

    Lu Yan; Xin Pingping; Wu Jian

    2014-01-01

    Quality control of intelligence research is the core issue of intelligence management and a central problem in the study of information science. This paper focuses on the performance of intelligence work to explain the significance of quality control in intelligence research. Summing up the results of prior studies on the basis of this analysis, it discusses quality control methods in intelligence research, introduces the experience of foreign intelligence research quality control, and proposes recommendations to improve quality control in intelligence research. (authors)

  9. Interface learning

    DEFF Research Database (Denmark)

    Thorhauge, Sally

    2014-01-01

    "Interface learning - New goals for museum and upper secondary school collaboration" investigates and analyzes the learning that takes place when museums and upper secondary schools in Denmark work together in local partnerships to develop and carry out school-related, museum-based coursework… for students. The research focuses on the learning that the students experience in the interface of the two learning environments: The formal learning environment of the upper secondary school and the informal learning environment of the museum. Focus is also on the learning that the teachers and museum… professionals experience as a result of their collaboration. The dissertation demonstrates how a given partnership’s collaboration affects the students’ learning experiences when they are doing the coursework. The dissertation presents findings that museum-school partnerships can use in order to develop…

  10. Natural language interface for nuclear data bases

    International Nuclear Information System (INIS)

    Heger, A.S.; Koen, B.V.

    1987-01-01

    A natural language interface has been developed for access to information from a data base, simulating a nuclear plant reliability data system (NPRDS), one of the several existing data bases serving the nuclear industry. In the last decade, the importance of information has been demonstrated by the impressive diffusion of data base management systems. The present methods employed to access data bases fall into two main categories: menu-driven systems and data base manipulation languages. Both of these methods are currently used by NPRDS. These methods have proven to be tedious, however, and require extensive training for effective utilization of the data base. Artificial intelligence techniques have been used in the development of several intelligent front ends for data bases in nonnuclear domains. LUNAR is a natural language interface to a data base describing moon rock samples brought back by Apollo. INTELLECT is one of the first commercially available data base question-answering systems, used in the financial area. LADDER is an intelligent data base interface that was developed as a management aid to Navy decision makers. A natural language interface for nuclear data bases that can be used by nonprogrammers with little or no training provides a means of achieving the same ease of access for the nuclear industry.
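    A pattern-template front end of the kind these systems descend from (and that the chatbot QA interfaces elsewhere in these records also use) can be sketched as below; the rules and stored answers are invented for illustration:

```python
import re

# Minimal pattern-template front end: no parsing or inference, just a
# (normally very large) set of pattern -> canned-answer rules. The rules
# and answers below are invented for illustration.
QA_RULES = [
    (re.compile(r"\bwhat is fmea\b", re.IGNORECASE),
     "FMEA scores the severity, occurrence, and detectability of failure modes."),
    (re.compile(r"\bwhat is tg-?142\b", re.IGNORECASE),
     "TG-142 gives quality assurance recommendations for medical linacs."),
]

def answer(question: str) -> str:
    """Return the first stored answer whose pattern matches the question."""
    for pattern, reply in QA_RULES:
        if pattern.search(question):
            return reply
    return "No stored answer matches that question."

print(answer("What is TG-142?"))
```

    Because the interface is constrained to reply with stored answers, no language generation or logical inference is needed; coverage comes from the size of the rule set, not from the sophistication of any single rule.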

  11. Brain Intelligence: Go Beyond Artificial Intelligence

    OpenAIRE

    Lu, Huimin; Li, Yujie; Chen, Min; Kim, Hyoungseop; Serikawa, Seiichi

    2017-01-01

    Artificial intelligence (AI) is an important technology that supports daily social life and economic activities. It contributes greatly to the sustainable growth of Japan's economy and solves various social problems. In recent years, AI has attracted attention as a key for growth in developed regions such as Europe and the United States and in developing countries such as China and India. The attention has been focused mainly on developing new artificial intelligence information communication ...

  12. Design of data sampler in intelligent physical start-up system for nuclear reactor

    International Nuclear Information System (INIS)

    Wang Yinli; Ling Qiu

    2007-01-01

    This paper introduces the design of the data sampler in an intelligent physical start-up system for a nuclear reactor. The hardware framework, built around the STμPSD3234A, and the USB-interface-based firmware design are discussed. (authors)

  13. TU-FG-201-01: 18-Month Clinical Experience of a Linac Daily Quality Assurance (QA) Solution Using Only EPID and OBI

    Energy Technology Data Exchange (ETDEWEB)

    Cai, B; Sun, B; Yaddanapudi, S; Goddu, S; Li, H; Caruthers, D; Kavanaugh, J; Mutic, S [Washington University School of Medicine, Saint Louis, MO (United States)

    2016-06-15

    Purpose: To describe the clinical use of a linear accelerator (Linac) daily QA system using only EPID and OBI, to assess its reliability over an 18-month period, and to improve the robustness of this system based on QA failure analysis. Methods: A daily QA solution utilizing an in-house designed phantom, combined EPID and OBI image acquisitions, and a web-based data analysis and reporting system was commissioned and used in our clinic to measure the geometric, dosimetric, and imaging performance of a Varian TrueBeam Linac. During an 18-month period (335 working days), the daily QA results, including output constancy, beam flatness and symmetry, uniformity, TPR20/10, and MV and kV imaging quality, were collected and analyzed. For the output constancy measurement, an independent monthly QA system with an ionization chamber (IC) and annual/incidental TG-51 measurements with an ADCL IC were performed and cross-compared to the daily QA system. Thorough analyses were performed on the recorded QA failures to evaluate machine performance, optimize the data analysis algorithm, adjust the tolerance settings, and improve the training procedure to prevent future failures. Results: A clinical workflow including beam delivery, data analysis, QA report generation, and physics approval was established and optimized to suit daily clinical operation. The output tests over the 335 working days cross-correlated with the monthly QA system within 1.3% and with TG-51 results within 1%. QA passed on the first attempt on 236 of 335 days. Based on the QA failure analysis, the gamma criterion was revised from (1%, 1 mm) to (2%, 1 mm), considering both QA accuracy and efficiency. The data analysis algorithm was improved to handle multiple entries for a repeated test. Conclusion: We described our 18-month clinical experience with a novel daily QA system using only EPID and OBI. The long-term data presented demonstrate that the system is suitable and reliable for Linac daily QA.
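    Two small pieces of such a workflow can be sketched under our own assumptions (the tolerance value, test names, and the "latest entry wins" rule for repeated tests are illustrative, not the authors' code):

```python
# Sketch of two pieces of a daily QA workflow, under our own assumptions:
# an output-constancy check against baseline, and "latest entry wins"
# handling of a repeated test (tolerance and test names are illustrative).

def output_deviation_pct(reading, baseline):
    """Percent deviation of a daily output reading from its baseline."""
    return 100.0 * (reading - baseline) / baseline

def output_ok(reading, baseline, tolerance_pct=2.0):
    """True when the reading is within the constancy tolerance."""
    return abs(output_deviation_pct(reading, baseline)) <= tolerance_pct

def latest_results(entries):
    """Collapse repeated tests: entries are (test_name, value) in time order."""
    results = {}
    for name, value in entries:
        results[name] = value      # a repeat overwrites the earlier entry
    return results

entries = [("output_6MV", 1.031), ("flatness", 1.2), ("output_6MV", 1.006)]
daily = latest_results(entries)
print(output_ok(daily["output_6MV"], baseline=1.000))  # -> True
```

    Handling repeats this way mirrors the algorithm improvement mentioned above: a re-run of a failed test supersedes the earlier entry instead of producing a spurious double failure.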

  14. Soft Interfaces

    International Nuclear Information System (INIS)

    Strzalkowski, Ireneusz

    1997-01-01

    This book presents an extended form of the 1994 Dirac Memorial Lecture delivered by Pierre Gilles de Gennes at Cambridge University. The main task of the presentation is to show the beauty and richness of structural forms and phenomena which are observed at soft interfaces between two media. They are much more complex than forms and phenomena existing in each phase separately. Problems are discussed including both traditional, classical techniques, such as the contact angle in static and dynamic partial wetting, as well as the latest research methodology, like 'environmental' scanning electron microscopes. The book is not a systematic lecture on phenomena but it can be considered as a compact set of essays on topics which particularly fascinate the author. The continuum theory widely used in the book is based on a deep molecular approach. The author is particularly interested in a broad-minded rheology of liquid systems at interfaces with specific emphasis on polymer melts. To study this, the author has developed a special methodology called anemometry near walls. The second main topic presented in the book is the problem of adhesion. Molecular processes, energy transformations and electrostatic interaction are included in an interesting discussion of the many aspects of the principles of adhesion. The third topic concerns welding between two polymer surfaces, such as A/A and A/B interfaces. Of great worth is the presentation of various unsolved, open problems. The kind of topics and brevity of description indicate that this book is intended for a well prepared reader. However, for any reader it will present an interesting picture of how many mysterious processes are acting in the surrounding world and how these phenomena are perceived by a Nobel Laureate, who won that prize mainly for his investigations in this field. (book review)

  15. Interface Screenings

    DEFF Research Database (Denmark)

    Thomsen, Bodil Marie Stavning

    2015-01-01

    In Wim Wenders' film Until the End of the World (1991), three different diagrams for the visual integration of bodies are presented: 1) GPS tracking and mapping in a landscape, 2) video recordings layered with the memory perception of these recordings, and 3) data-created images from dreams… and memories. From a transvisual perspective, the question is whether or not these (by now realized) diagrammatic modes involving the body in ubiquitous global media can be analysed in terms of the affects and events created in concrete interfaces. The examples used are filmic as felt sensations...

  16. MO-PIS-Exhibit Hall-01: Tools for TG-142 Linac Imaging QA I

    Energy Technology Data Exchange (ETDEWEB)

    Clements, M [RAD Image, Colorado Springs, CO (United States); Wiesmeyer, M [Standard Imaging, Inc., Middleton, WI (United States)

    2014-06-15

    Partners in Solutions is an exciting new program in which AAPM partners with our vendors to present practical “hands-on” information about the equipment and software systems that we use in our clinics. The therapy topic this year is solutions for TG-142 recommendations for linear accelerator imaging QA. Note that the sessions are being held in a special purpose room built on the Exhibit Hall Floor, to encourage further interaction with the vendors. Automated Imaging QA for TG-142 with RIT Presentation Time: 2:45 – 3:15 PM This presentation will discuss software tools for automated imaging QA and phantom analysis for TG-142. All modalities used in radiation oncology will be discussed, including CBCT, planar kV imaging, planar MV imaging, and imaging and treatment coordinate coincidence. Vendor supplied phantoms as well as a variety of third-party phantoms will be shown, along with appropriate analyses, proper phantom setup procedures and scanning settings, and a discussion of image quality metrics. Tools for process automation will be discussed which include: RIT Cognition (machine learning for phantom image identification), RIT Cerberus (automated file system monitoring and searching), and RunQueueC (batch processing of multiple images). In addition to phantom analysis, tools for statistical tracking, trending, and reporting will be discussed. This discussion will include an introduction to statistical process control, a valuable tool in analyzing data and determining appropriate tolerances. An Introduction to TG-142 Imaging QA Using Standard Imaging Products Presentation Time: 3:15 – 3:45 PM Medical Physicists want to understand the logic behind TG-142 Imaging QA. What is often missing is a firm understanding of the connections between the EPID and OBI phantom imaging, the software “algorithms” that calculate the QA metrics, the establishment of baselines, and the analysis and interpretation of the results. The goal of our brief presentation will be to

  17. SU-F-T-558: ArcCheck for Patient Specific QA in Stereotactic Ablative Radiotherapy

    International Nuclear Information System (INIS)

    Ramachandran, P; Tajaldeen, A; Esen, N; Geso, M; Taylor, D; Wanigaratne, D; Roozen, K; Kron, T

    2016-01-01

    Purpose: Stereotactic Ablative Radiotherapy (SABR) is one of the preferred treatment techniques for early stage lung cancer and has been extended to other treatment sites such as spine, liver, scapula, and sternum, which has resulted in increased physics QA time on the machine. In this study, we tested the feasibility of using ArcCheck as an alternative to film dosimetry. Methods: Twelve patients with varied diagnoses of lung, liver, scapula, sternum, and spine lesions undergoing SABR were selected for this study. Pre-treatment QA, which included ionization chamber and film dosimetry, was performed for all patients. The gamma criterion required for each SABR plan to pass QA and proceed to treatment is a 95% passing rate at (3%, 1 mm). In addition to this routine process, the treatment plans were exported onto an ArcCheck phantom. The planned and measured doses from the ArcCheck device were compared using four different gamma criteria: 2%/2 mm, 3%/2 mm, 3%/1 mm, and 3%/3 mm. We also introduced errors to the gantry, collimator, and couch angles to assess the sensitivity of the ArcCheck to potential delivery errors. Results: The mean ArcCheck passing rates for all twelve cases were 76.1%±9.7% for the 3%/1 mm gamma criterion, 89.5%±5.3% for 2%/2 mm, 92.6%±4.2% for 3%/2 mm, and 97.6%±2.4% for 3%/3 mm. When SABR spine cases are excluded, ArcCheck passing rates are higher than 95% for all studied cases at 3%/3 mm, and the ArcCheck results are in acceptable agreement with the film gamma results. Conclusion: Our ArcCheck results at 3%/3 mm correlate well with our routine patient-specific QA results (3%/1 mm) for non-spine sites. We observed a significant reduction in QA time when using ArcCheck for SABR QA. This study shows that ArcCheck could replace film dosimetry for all sites except SABR spine.
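    A simplified 1-D version of the gamma analysis underlying all four criteria can be sketched as follows; clinical tools operate on 2-D/3-D dose grids with interpolation, so this is only an illustration of the metric itself, on invented profiles:

```python
import math

# Simplified 1-D global gamma analysis (clinical tools work on 2-D/3-D
# grids with interpolation; this only illustrates the metric itself).

def gamma_1d(positions_mm, measured, reference, dose_pct=3.0, dta_mm=3.0):
    """Per-point gamma: minimum combined dose/distance metric over reference."""
    d_max = max(reference)                    # global dose normalization
    gammas = []
    for xm, dm in zip(positions_mm, measured):
        best = float("inf")
        for xr, dr in zip(positions_mm, reference):
            dose_term = (100.0 * (dm - dr) / d_max) / dose_pct
            dist_term = (xm - xr) / dta_mm
            best = min(best, math.hypot(dose_term, dist_term))
        gammas.append(best)
    return gammas

def passing_rate_pct(gammas):
    """Percentage of points with gamma <= 1."""
    return 100.0 * sum(g <= 1.0 for g in gammas) / len(gammas)

# Invented 1-D profiles: dose differences within 2%, so every point passes.
xs = [0.0, 2.0, 4.0, 6.0, 8.0]
ref = [1.00, 0.98, 0.90, 0.60, 0.20]
meas = [1.01, 0.99, 0.88, 0.62, 0.21]
print(passing_rate_pct(gamma_1d(xs, meas, ref)))  # -> 100.0
```

    Tightening the criteria (for example 3%/1 mm, as required above for SABR plans) shrinks both denominators, which is why the same measurement yields much lower passing rates under the stricter settings.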

  18. MO-PIS-Exhibit Hall-01: Tools for TG-142 Linac Imaging QA I

    International Nuclear Information System (INIS)

    Clements, M; Wiesmeyer, M

    2014-01-01

    Partners in Solutions is an exciting new program in which AAPM partners with our vendors to present practical “hands-on” information about the equipment and software systems that we use in our clinics. The therapy topic this year is solutions for TG-142 recommendations for linear accelerator imaging QA. Note that the sessions are being held in a special purpose room built on the Exhibit Hall Floor, to encourage further interaction with the vendors. Automated Imaging QA for TG-142 with RIT Presentation Time: 2:45 – 3:15 PM This presentation will discuss software tools for automated imaging QA and phantom analysis for TG-142. All modalities used in radiation oncology will be discussed, including CBCT, planar kV imaging, planar MV imaging, and imaging and treatment coordinate coincidence. Vendor supplied phantoms as well as a variety of third-party phantoms will be shown, along with appropriate analyses, proper phantom setup procedures and scanning settings, and a discussion of image quality metrics. Tools for process automation will be discussed which include: RIT Cognition (machine learning for phantom image identification), RIT Cerberus (automated file system monitoring and searching), and RunQueueC (batch processing of multiple images). In addition to phantom analysis, tools for statistical tracking, trending, and reporting will be discussed. This discussion will include an introduction to statistical process control, a valuable tool in analyzing data and determining appropriate tolerances. An Introduction to TG-142 Imaging QA Using Standard Imaging Products Presentation Time: 3:15 – 3:45 PM Medical Physicists want to understand the logic behind TG-142 Imaging QA. What is often missing is a firm understanding of the connections between the EPID and OBI phantom imaging, the software “algorithms” that calculate the QA metrics, the establishment of baselines, and the analysis and interpretation of the results. The goal of our brief presentation will be to

  19. SU-F-T-558: ArcCheck for Patient Specific QA in Stereotactic Ablative Radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Ramachandran, P [Peter MacCallum Cancer Centre, Melbourne (Australia); RMIT University, Bundoora (Australia); Tajaldeen, A; Esen, N; Geso, M [RMIT University, Bundoora (Australia); Taylor, D; Wanigaratne, D; Roozen, K; Kron, T [Peter MacCallum Cancer Centre, Melbourne (Australia)

    2016-06-15

    Purpose: Stereotactic Ablative Radiotherapy (SABR) is one of the preferred treatment techniques for early stage lung cancer and has been extended to other treatment sites such as spine, liver, scapula, and sternum, which has resulted in increased physics QA time on the machine. In this study, we tested the feasibility of using ArcCheck as an alternative to film dosimetry. Methods: Twelve patients with varied diagnoses of lung, liver, scapula, sternum, and spine lesions undergoing SABR were selected for this study. Pre-treatment QA, which included ionization chamber and film dosimetry, was performed for all patients. The gamma criterion required for each SABR plan to pass QA and proceed to treatment is a 95% passing rate at (3%, 1 mm). In addition to this routine process, the treatment plans were exported onto an ArcCheck phantom. The planned and measured doses from the ArcCheck device were compared using four different gamma criteria: 2%/2 mm, 3%/2 mm, 3%/1 mm, and 3%/3 mm. We also introduced errors to the gantry, collimator, and couch angles to assess the sensitivity of the ArcCheck to potential delivery errors. Results: The mean ArcCheck passing rates for all twelve cases were 76.1%±9.7% for the 3%/1 mm gamma criterion, 89.5%±5.3% for 2%/2 mm, 92.6%±4.2% for 3%/2 mm, and 97.6%±2.4% for 3%/3 mm. When SABR spine cases are excluded, ArcCheck passing rates are higher than 95% for all studied cases at 3%/3 mm, and the ArcCheck results are in acceptable agreement with the film gamma results. Conclusion: Our ArcCheck results at 3%/3 mm correlate well with our routine patient-specific QA results (3%/1 mm) for non-spine sites. We observed a significant reduction in QA time when using ArcCheck for SABR QA. This study shows that ArcCheck could replace film dosimetry for all sites except SABR spine.

  20. Quo Vadis, Artificial Intelligence?

    OpenAIRE

    Berrar, Daniel; Sato, Naoyuki; Schuster, Alfons

    2010-01-01

    Since its conception in the mid 1950s, artificial intelligence with its great ambition to understand and emulate intelligence in natural and artificial environments alike is now a truly multidisciplinary field that reaches out and is inspired by a great diversity of other fields. Rapid advances in research and technology in various fields have created environments into which artificial intelligence could embed itself naturally and comfortably. Neuroscience with its desire to understand nervou...

  1. Principles of artificial intelligence

    CERN Document Server

    Nilsson, Nils J

    1980-01-01

    A classic introduction to artificial intelligence intended to bridge the gap between theory and practice, Principles of Artificial Intelligence describes fundamental AI ideas that underlie applications such as natural language processing, automatic programming, robotics, machine vision, automatic theorem proving, and intelligent data retrieval. Rather than focusing on the subject matter of the applications, the book is organized around general computational concepts involving the kinds of data structures used, the types of operations performed on the data structures, and the properties of th

  2. Intelligence of programs

    Energy Technology Data Exchange (ETDEWEB)

    Novak, D

    1982-01-01

    A general discussion about the level of artificial intelligence in computer programs is presented. The suitability of various languages for the development of complex, intelligent programs is discussed, considering fourth-generation language as well as the well established structured COBOL language. It is concluded that the success of automation in many administrative fields depends to a large extent on the development of intelligent programs.

  3. Intelligence analysis – the royal discipline of Competitive Intelligence

    OpenAIRE

    František Bartes

    2011-01-01

    The aim of this article is to propose a work methodology for Competitive Intelligence teams in one of the intelligence cycle’s specific areas, the so-called “Intelligence Analysis”. Intelligence Analysis is one of the stages of the Intelligence Cycle, in which data from both the primary and secondary research are analyzed. The main result of the effort is the creation of added value for the information collected. Company Competitive Intelligence, correctly understood and implemented in busines...

  4. Machine listening intelligence

    Science.gov (United States)

    Cella, C. E.

    2017-05-01

    This manifesto paper will introduce machine listening intelligence, an integrated research framework for acoustic and musical signals modelling, based on signal processing, deep learning and computational musicology.

  5. STANFORD ARTIFICIAL INTELLIGENCE PROJECT.

    Science.gov (United States)

    ARTIFICIAL INTELLIGENCE , GAME THEORY, DECISION MAKING, BIONICS, AUTOMATA, SPEECH RECOGNITION, GEOMETRIC FORMS, LEARNING MACHINES, MATHEMATICAL MODELS, PATTERN RECOGNITION, SERVOMECHANISMS, SIMULATION, BIBLIOGRAPHIES.

  6. Intelligent Optics Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — The Intelligent Optics Laboratory supports sophisticated investigations on adaptive and nonlinear optics; advancedimaging and image processing; ground-to-ground and...

  7. Intelligence and childlessness.

    Science.gov (United States)

    Kanazawa, Satoshi

    2014-11-01

    Demographers debate why people have children in advanced industrial societies where children are net economic costs. From an evolutionary perspective, however, the important question is why some individuals choose not to have children. Recent theoretical developments in evolutionary psychology suggest that more intelligent individuals may be more likely to prefer to remain childless than less intelligent individuals. Analyses of the National Child Development Study show that more intelligent men and women express a preference for remaining childless early in their reproductive careers, but only more intelligent women (not more intelligent men) are more likely to remain childless by the end of their reproductive careers. Controlling for education and earnings does not at all attenuate the association between childhood general intelligence and lifetime childlessness among women. A one-standard-deviation increase in childhood general intelligence (15 IQ points) decreases women's odds of parenthood by 21-25%. Because women have a greater impact on the average intelligence of future generations, the dysgenic fertility among women is predicted to lead to a decline in the average intelligence of the population in advanced industrial nations. Copyright © 2014 Elsevier Inc. All rights reserved.

  8. Routledge companion to intelligence studies

    CERN Document Server

    Dover, Robert; Hillebrand, Claudia

    2013-01-01

    The Routledge Companion to Intelligence Studies provides a broad overview of the growing field of intelligence studies. The recent growth of interest in intelligence and security studies has led to an increased demand for popular depictions of intelligence and reference works to explain the architecture and underpinnings of intelligence activity. Divided into five comprehensive sections, this Companion provides a strong survey of the cutting-edge research in the field of intelligence studies: Part I: The evolution of intelligence studies; Part II: Abstract approaches to intelligence; Part III: Historical approaches to intelligence; Part IV: Systems of intelligence; Part V: Contemporary challenges. With a broad focus on the origins, practices and nature of intelligence, the book not only addresses classical issues, but also examines topics of recent interest in security studies. The overarching aim is to reveal the rich tapestry of intelligence studies in both a sophisticated and accessible way. This Companion...

  9. Artificial Consciousness or Artificial Intelligence

    OpenAIRE

    Spanache Florin

    2017-01-01

    Artificial intelligence is a tool designed by people for the gratification of their own creative ego, so we cannot confuse conscience with intelligence, nor even intelligence in its human representation with conscience. They are all different concepts with different uses. Philosophically, there are differences between autonomous people and automatic artificial intelligence. This is the difference between intelligence and artificial intelligence, autonomous versus a...

  10. 2015 Chinese Intelligent Systems Conference

    CERN Document Server

    Du, Junping; Li, Hongbo; Zhang, Weicun; CISC’15

    2016-01-01

    This book presents selected research papers from the 2015 Chinese Intelligent Systems Conference (CISC’15), held in Yangzhou, China. The topics covered include multi-agent systems, evolutionary computation, artificial intelligence, complex systems, computational intelligence and soft computing, intelligent control, advanced control technology, robotics and applications, intelligent information processing, iterative learning control, and machine learning. Engineers and researchers from academia, industry and the government can gain valuable insights into solutions combining ideas from multiple disciplines in the field of intelligent systems.

  11. SU-F-T-182: A Stochastic Approach to Daily QA Tolerances On Spot Properties for Proton Pencil Beam Scanning

    International Nuclear Information System (INIS)

    St James, S; Bloch, C; Saini, J

    2016-01-01

    Purpose: Proton pencil beam scanning is used clinically across the United States. There are no current guidelines on tolerances for daily QA specific to pencil beam scanning, specifically related to the individual spot properties (spot width). Using a stochastic method to determine tolerances has the potential to optimize tolerances on individual spots and decrease the number of false positive failures in daily QA. Individual and global spot tolerances were evaluated. Methods: As part of daily QA for proton pencil beam scanning, a field of 16 spots (corresponding to 8 energies) is measured using an array of ion chambers (Matrixx, IBA). Each individual spot is fit to a Gaussian function in each of x and y. The spot widths (σ) in x and y are recorded (32 parameters). Results from the daily QA were retrospectively analyzed for 100 days of data. The deviations of the spot widths were histogrammed and fit to a Gaussian function. The stochastic spot tolerance was taken to be the mean ± 3σ. Using these results, tolerances were developed and tested against known deviations in spot width. Results: The individual spot tolerances derived with the stochastic method decreased in 30/32 instances. Using the previous tolerances (± 20% width), the daily QA would have detected 0/20 days of the deviation. Using a tolerance of any 6 spots failing the stochastic tolerance, 18/20 days of the deviation would have been detected. Conclusion: Using a stochastic method we have been able to tighten the daily tolerances for 30 of the 32 spot widths measured. The stochastic tolerances can lead to detection of deviations that previously would have been picked up on monthly QA and missed by daily QA. This method could be easily extended for evaluation of other QA parameters in proton spot scanning.
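
    The mean ± 3σ tolerance derivation described above can be sketched as follows (the data, the 10-spot deviation, and the 6-failure action level are illustrative, not the authors' measurements):

```python
import numpy as np

def stochastic_tolerance(history, n_sigma=3.0):
    """Per-parameter tolerance band (mean +/- n_sigma * SD) from history.

    history: shape (n_days, n_params), e.g. daily spot-width deviations
    in percent for each of the 32 width parameters (16 spots, x and y).
    """
    mean = history.mean(axis=0)
    std = history.std(axis=0, ddof=1)
    return mean - n_sigma * std, mean + n_sigma * std

def daily_check(widths, low, high, max_failures=6):
    """Flag the day when at least `max_failures` parameters fall
    outside their stochastic tolerance band."""
    failures = (widths < low) | (widths > high)
    return failures.sum() >= max_failures, failures

# hypothetical history: 100 days x 32 spot-width deviations (percent)
rng = np.random.default_rng(0)
history = rng.normal(0.0, 1.5, size=(100, 32))
low, high = stochastic_tolerance(history)

# a day with a systematic widening on 10 spots should trip the check
today = rng.normal(0.0, 1.5, size=32)
today[:10] += 10.0
flagged, failures = daily_check(today, low, high)
```

    Fitting the band to each parameter separately is what lets the method tighten 30 of 32 tolerances while keeping the per-day false-positive rate low.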

  12. MO-FG-202-09: Virtual IMRT QA Using Machine Learning: A Multi-Institutional Validation

    Energy Technology Data Exchange (ETDEWEB)

    Valdes, G; Scheuermann, R; Solberg, T [University of Pennsylvania, Philadelphia, PA (United States); Chan, M; Deasy, J [Memorial Sloan-Kettering Cancer Center, New York, NY (United States)

    2016-06-15

    Purpose: To validate a machine learning approach to Virtual IMRT QA for accurately predicting gamma passing rates using different QA devices at different institutions. Methods: A Virtual IMRT QA was constructed using a machine learning algorithm based on 416 IMRT plans, in which QA measurements were performed using diode-array detectors and a 3% local/3 mm gamma criterion with a 10% threshold. An independent set of 139 IMRT measurements from a different institution, with QA data based on portal dosimetry using the same gamma index and 10% threshold, was used to further test the algorithm. Plans were characterized by 90 different complexity metrics. A weighted Poisson regression with Lasso regularization was trained to predict passing rates using the complexity metrics as input. Results: In addition to predicting passing rates with 3% accuracy for all composite plans using diode-array detectors, passing rates for portal dosimetry on a per-beam basis were predicted with an error <3.5% for 120 IMRT measurements. The remaining measurements (19) had large areas of low CU, where portal dosimetry has larger disagreement with the calculated dose and, as such, large errors were expected. These beams need to be further modeled to correct the under-response in low dose regions. Important features selected by Lasso to predict gamma passing rates were: complete irradiated area outline (CIAO) area, jaw position, fraction of MLC leaves with gaps smaller than 20 mm or 5 mm, fraction of area receiving less than 50% of the total CU, fraction of the area receiving dose from penumbra, weighted average irregularity factor, and duty cycle, among others. Conclusion: We have demonstrated that the Virtual IMRT QA can predict passing rates using different QA devices and across multiple institutions. Prediction of QA passing rates could have profound implications on the current IMRT process.
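
    A Poisson regression with Lasso (L1) regularization of the kind described above can be sketched with a small proximal-gradient fit; the "complexity metrics" here are synthetic stand-ins, not the paper's 90 features:

```python
import numpy as np

def fit_lasso_poisson(X, y, lam=0.05, lr=0.05, n_iter=3000):
    """Poisson regression with an L1 penalty, fit by proximal gradient.

    Minimizes (1/n) * sum(exp(X @ w) - y * (X @ w)) + lam * ||w||_1.
    """
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iter):
        eta = np.clip(X @ w, -30, 30)          # guard against overflow
        grad = X.T @ (np.exp(eta) - y) / n     # Poisson NLL gradient
        w = w - lr * grad
        # soft-thresholding: the proximal operator of the L1 penalty
        w = np.sign(w) * np.maximum(np.abs(w) - lr * lam, 0.0)
    return w

# synthetic "complexity metrics": 5 features, only 2 drive the rate
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 5))
w_true = np.array([0.8, 0.0, -0.5, 0.0, 0.0])
y = rng.poisson(np.exp(X @ w_true))
w_hat = fit_lasso_poisson(X, y)
```

    The L1 penalty drives the coefficients of irrelevant metrics to zero, which is how the study's feature list (CIAO area, jaw position, etc.) falls out of the fit.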

  13. MO-FG-202-09: Virtual IMRT QA Using Machine Learning: A Multi-Institutional Validation

    International Nuclear Information System (INIS)

    Valdes, G; Scheuermann, R; Solberg, T; Chan, M; Deasy, J

    2016-01-01

    Purpose: To validate a machine learning approach to Virtual IMRT QA for accurately predicting gamma passing rates using different QA devices at different institutions. Methods: A Virtual IMRT QA was constructed using a machine learning algorithm based on 416 IMRT plans, in which QA measurements were performed using diode-array detectors and a 3% local/3 mm gamma criterion with a 10% threshold. An independent set of 139 IMRT measurements from a different institution, with QA data based on portal dosimetry using the same gamma index and 10% threshold, was used to further test the algorithm. Plans were characterized by 90 different complexity metrics. A weighted Poisson regression with Lasso regularization was trained to predict passing rates using the complexity metrics as input. Results: In addition to predicting passing rates with 3% accuracy for all composite plans using diode-array detectors, passing rates for portal dosimetry on a per-beam basis were predicted with an error <3.5% for 120 IMRT measurements. The remaining measurements (19) had large areas of low CU, where portal dosimetry has larger disagreement with the calculated dose and, as such, large errors were expected. These beams need to be further modeled to correct the under-response in low dose regions. Important features selected by Lasso to predict gamma passing rates were: complete irradiated area outline (CIAO) area, jaw position, fraction of MLC leaves with gaps smaller than 20 mm or 5 mm, fraction of area receiving less than 50% of the total CU, fraction of the area receiving dose from penumbra, weighted average irregularity factor, and duty cycle, among others. Conclusion: We have demonstrated that the Virtual IMRT QA can predict passing rates using different QA devices and across multiple institutions. Prediction of QA passing rates could have profound implications on the current IMRT process.

  14. The Game is aFoot, Watson: DeepQA systems and the future of HCI

    OpenAIRE

    Keates, Simeon; Varker, Philip

    2012-01-01

    In February 2011, the IBM Watson DeepQA (deep question and answer) system took part in a special challenge, pitting its question and answer capability against former Jeopardy!™ grand champions in a televised match. Watson emerged victorious from the challenge, demonstrating that current question answering technology has advanced to the point where it can arguably be more dependable than human experts. This new system represents a significant breakthrough in humanity’s decades-long endeavour ...

  15. SU-F-T-271: Comparing IMRT QA Pass Rates Before and After MLC Calibration

    Energy Technology Data Exchange (ETDEWEB)

    Mazza, A; Perrin, D; Fontenot, J [Mary Bird Perkins Cancer Center, Baton Rouge, LA (United States)

    2016-06-15

    Purpose: To compare IMRT QA pass rates before and after an in-house MLC leaf calibration procedure. Methods: The MLC leaves and backup jaws on four Elekta linear accelerators with MLCi2 heads were calibrated using the EPID-based RIT Hancock Test as the means for evaluation. The MLCs were considered to be successfully calibrated when they could pass the Hancock Test with criteria of 1 mm jaw position tolerance, and 1 mm leaf position tolerance. IMRT QA results were collected pre- and postcalibration and analyzed using gamma analysis with 3%/3mm DTA criteria. AAPM TG-119 test plans were also compared pre- and post-calibration, at both 2%/2mm DTA and 3%/3mm DTA. Results: A weighted average was performed on the results for all four linear accelerators. The pre-calibration IMRT QA pass rate was 98.3 ± 0.1%, compared with the post-calibration pass rate of 98.5 ± 0.1%. The TG-119 test plan results showed more of an improvement, particularly at the 2%/2mm criteria. The averaged results were 89.1% pre and 96.1% post for the C-shape plan, 94.8% pre and 97.1% post for the multi-target plan, 98.6% pre and 99.7% post for the prostate plan, 94.7% pre and 94.8% post for the head/neck plan. Conclusion: The patient QA results did not show statistically significant improvement at the 3%/3mm DTA criteria after the MLC calibration procedure. However, the TG-119 test cases did show significant improvement at the 2%/2mm level.
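
    The 3%/3mm gamma analysis used above can be illustrated with a simplified 1D implementation; clinical tools operate on 2D/3D dose grids, so this sketch only shows the criterion itself on a made-up profile:

```python
import numpy as np

def gamma_1d(ref_pos, ref_dose, eval_pos, eval_dose,
             dose_tol=0.03, dist_tol=3.0):
    """Simplified 1D gamma index with global dose normalization.

    For each reference point, gamma is the minimum over evaluated
    points of sqrt((dose diff / dose_tol)^2 + (distance / dist_tol)^2);
    a point passes when gamma <= 1.
    """
    norm = ref_dose.max()
    dd = (eval_dose[None, :] - ref_dose[:, None]) / (dose_tol * norm)
    dx = (eval_pos[None, :] - ref_pos[:, None]) / dist_tol
    return np.sqrt(dd**2 + dx**2).min(axis=1)

# a Gaussian "profile"; a 1 mm positional shift still passes 3%/3mm
x = np.linspace(0, 100, 201)                 # positions in mm
dose = np.exp(-((x - 50.0) / 15.0) ** 2)
gamma = gamma_1d(x, dose, x - 1.0, dose)     # evaluated grid shifted 1 mm
pass_rate = 100.0 * (gamma <= 1.0).mean()
```

    Because gamma trades dose difference against distance-to-agreement, small MLC positioning errors can hide inside a 3%/3mm criterion, which is consistent with the study seeing improvement only at the tighter 2%/2mm level.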

  16. mosaicQA - A General Approach to Facilitate Basic Data Quality Assurance for Epidemiological Research.

    Science.gov (United States)

    Bialke, Martin; Rau, Henriette; Schwaneberg, Thea; Walk, Rene; Bahls, Thomas; Hoffmann, Wolfgang

    2017-05-29

    Epidemiological studies are based on a considerable amount of personal, medical and socio-economic data. To answer research questions with reliable results, epidemiological research projects face the challenge of providing high quality data. Consequently, gathered data has to be reviewed continuously during the data collection period. This article describes the development of the mosaicQA-library for non-statistical experts, consisting of a set of reusable R functions to provide support for basic data quality assurance across a wide range of application scenarios in epidemiological research. To generate valid quality reports for various scenarios and data sets, a general and flexible development approach was needed. As a first step, a set of quality-related questions, targeting quality aspects on a more general level, was identified. The next step included the design of specific R-scripts to produce proper reports for metric and categorical data. For more flexibility, the third development step focussed on the generalization of the developed R-scripts, e.g. extracting characteristics and parameters. As a last step, the generic characteristics of the developed R functionalities and generated reports were evaluated using different metric and categorical datasets. The developed mosaicQA-library generates basic data quality reports for multivariate input data. If needed, more detailed results for single-variable data, including definition of units, variables, descriptions, code lists and categories of qualified missings, can easily be produced. The mosaicQA-library enables researchers to generate reports for various kinds of metric and categorical data without the need for computational or scripting knowledge. At the moment, the library focusses on data structure quality and supports the assessment of several quality indicators, including frequency, distribution and plausibility of research variables as well as the occurrence of missing and extreme values.
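
    mosaicQA itself is an R library; a rough Python analogue of the kind of basic report it describes (missingness, frequencies, and extreme values per variable) might look like this, with entirely fabricated example data:

```python
import pandas as pd

def basic_quality_report(df, outlier_sd=3.0):
    """Summarize missingness, distributions and extremes per column.

    Numeric columns get mean/SD and a count of values beyond
    `outlier_sd` standard deviations; categorical columns get
    level frequencies.
    """
    report = {}
    for col in df.columns:
        s = df[col]
        entry = {"n": len(s), "missing": int(s.isna().sum())}
        if pd.api.types.is_numeric_dtype(s):
            mean, sd = s.mean(), s.std()
            entry["mean"], entry["sd"] = mean, sd
            entry["extreme"] = int(((s - mean).abs() > outlier_sd * sd).sum())
        else:
            entry["frequencies"] = s.value_counts(dropna=True).to_dict()
        report[col] = entry
    return report

# hypothetical study extract; the age of 150 is implausible
df = pd.DataFrame({
    "age": [34, 51, 29, None, 47, 150],
    "smoker": ["yes", "no", "no", "yes", None, "no"],
})
report = basic_quality_report(df)
```

    A simple SD-based cutoff can mask gross outliers in small samples (one extreme value inflates the SD), which is one reason dedicated tools configure plausibility limits per variable.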

  17. Establishing QC/QA system in the fabrication of nuclear fuel assemblies

    International Nuclear Information System (INIS)

    Suh, K.S.; Choi, S.K.; Park, H.G.; Park, T.G.; Chung, J.S.

    1980-01-01

    Quality control instruction manuals and inspection methods were established for UO₂ powder and zircaloy materials (material control) and for UO₂ pellets and nuclear fuel rods (process control). For the establishment of the QA programme, the technical specifications of purchased materials, the control regulation of the measuring and testing equipment, and a traceability chart as part of document control have also been provided and practically applied to the fuel fabrication process

  18. Museets interface

    DEFF Research Database (Denmark)

    Pold, Søren

    2007-01-01

    Søren Pold reflects on the museum projects Kongedragter.dk and Stigombord.dk. He argues that the development of web interfaces creates new ways of seeing, understanding and interacting with culture. Users acquire new media habits and patterns of perception that must be taken into account when planning future museum communication. At the same time, museum objects gain a new status as fleeting icons in digital space, and altogether this invites museums to adopt a more open and experimental attitude toward their own practice and their role as cultural institutions.

  19. An intelligent multi-media human-computer dialogue system

    Science.gov (United States)

    Neal, J. G.; Bettinger, K. E.; Byoun, J. S.; Dobes, Z.; Thielman, C. Y.

    1988-01-01

    Sophisticated computer systems are being developed to assist in the human decision-making process for very complex tasks performed under stressful conditions. The human-computer interface is a critical factor in these systems. The human-computer interface should be simple and natural to use, require a minimal learning period, assist the user in accomplishing his task(s) with a minimum of distraction, present output in a form that best conveys information to the user, and reduce cognitive load for the user. In pursuit of this ideal, the Intelligent Multi-Media Interfaces project is devoted to the development of interface technology that integrates speech, natural language text, graphics, and pointing gestures for human-computer dialogues. The objective of the project is to develop interface technology that uses the media/modalities intelligently in a flexible, context-sensitive, and highly integrated manner modelled after the manner in which humans converse in simultaneous coordinated multiple modalities. As part of the project, a knowledge-based interface system, called CUBRICON (CUBRC Intelligent CONversationalist) is being developed as a research prototype. The application domain being used to drive the research is that of military tactical air control.

  20. SU-F-T-262: Commissioning Varian Portal Dosimetry for EPID-Based Patient Specific QA in a Non-Aria Environment

    Energy Technology Data Exchange (ETDEWEB)

    Schmidt, M; Knutson, N [Rhode Island Hospital, Providence RI (United States); University of Rhode Island, Kingston, RI (United States); University of Massachusetts Lowell, Lowell, MA (United States); Herrington, J [University of Rhode Island, Kingston, RI (United States); Price, M [Rhode Island Hospital, Providence RI (United States); University of Rhode Island, Kingston, RI (United States); Alpert Medical School of Brown University, Providence, RI (United States)

    2016-06-15

    Purpose: Development of an in-house program facilitates a workflow that allows Electronic Portal Imaging Device (EPID) patient specific quality assurance (QA) measurements to be acquired and analyzed in the Portal Dosimetry Application (Varian Medical Systems, Palo Alto, CA) using a non-Aria Record and Verify (R&V) system (MOSAIQ, Elekta, Crawley, UK) to deliver beams in standard clinical treatment mode. Methods: Initial calibration of an in-house software tool includes characterization of EPID dosimetry parameters by importing DICOM images of varying delivered MUs to determine linear mapping factors in order to convert image pixel values to Varian-defined Calibrated Units (CU). Using this information, the Portal Dose Image Prediction (PDIP) algorithm was commissioned by converting images of various field sizes to output factors using the Eclipse Scripting Application Programming Interface (ESAPI) and converting a delivered configuration fluence to absolute dose units. To verify the algorithm configuration, an integrated image was acquired, exported directly from the R&V client, automatically converted to a compatible, calibrated dosimetric image, and compared to a PDIP calculated image using Varian’s Portal Dosimetry Application. Results: For two C-Series and one TrueBeam Varian linear accelerators, gamma comparisons (global 3% / 3mm) of PDIP algorithm predicted dosimetric images and images converted via the inhouse system demonstrated agreement for ≥99% of all pixels, exceeding vendor-recommended commissioning guidelines. Conclusion: Combinations of a programmatic image conversion tool and ESAPI allow for an efficient and accurate method of patient IMRT QA incorporating a 3rd party R&V system.
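
    The calibration step described above, a linear mapping from EPID pixel values to calibrated units derived from images of varying delivered MU, can be sketched as follows (the MU series, response slope, and CU scale are illustrative, not Varian's values):

```python
import numpy as np

def fit_pixel_to_cu(mu_delivered, mean_pixel_values, cu_per_mu=0.01):
    """Fit a linear map pixel -> CU from images of known delivered MU.

    Assumes the EPID response is linear in dose; returns (gain, offset)
    such that cu = gain * pixel + offset.
    """
    cu = np.asarray(mu_delivered) * cu_per_mu   # known CU for each image
    gain, offset = np.polyfit(mean_pixel_values, cu, 1)
    return gain, offset

def pixels_to_cu(image, gain, offset):
    """Convert a raw EPID image (or pixel value) to calibrated units."""
    return gain * np.asarray(image, dtype=float) + offset

# hypothetical calibration series: 25..200 MU with a linear detector
mu = np.array([25, 50, 100, 150, 200])
pixels = 512.0 * mu + 40.0                      # fabricated response
gain, offset = fit_pixel_to_cu(mu, pixels)
```

    Once the two mapping factors are known, any integrated image exported from the R&V system can be converted to a dosimetric image and compared against the PDIP prediction.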

  1. [A Quality Assurance (QA) System with a Web Camera for High-dose-rate Brachytherapy].

    Science.gov (United States)

    Hirose, Asako; Ueda, Yoshihiro; Oohira, Shingo; Isono, Masaru; Tsujii, Katsutomo; Inui, Shouki; Masaoka, Akira; Taniguchi, Makoto; Miyazaki, Masayoshi; Teshima, Teruki

    2016-03-01

    The quality assurance (QA) system that simultaneously quantifies the position and duration of a ¹⁹²Ir source (dwell position and time) was developed and the performance of this system was evaluated in high-dose-rate brachytherapy. This QA system has two functions to verify and quantify dwell position and time by using a web camera. The web camera records 30 images per second in a range from 1,425 mm to 1,505 mm. A user verifies the source position from the web camera in real time. The source position and duration were quantified from the recorded video using in-house software that applies a template-matching technique. This QA system allowed verification of the absolute position in real time and quantification of dwell position and time simultaneously. It was evident from the verification of the system that the mean of step size errors was 0.31±0.1 mm and that of dwell time errors 0.1±0.0 s. Absolute position errors can be determined with an accuracy of 1.0 mm at all dwell points in three step sizes and dwell time errors with an accuracy of 0.1% in more than 10.0 s of the planned time. This system provides quick verification and quantification of the dwell position and time with high accuracy at various dwell positions without depending on the step size.
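
    The template-matching step used to localize the source in each frame can be sketched with a plain normalized cross-correlation in 1D (production systems typically match 2D templates on full video frames, e.g. with OpenCV; the profile and source position here are synthetic):

```python
import numpy as np

def match_template_1d(signal, template):
    """Locate a template in a 1D intensity profile by normalized
    cross-correlation; returns (best start index, correlation score)."""
    n, m = len(signal), len(template)
    t = (template - template.mean()) / template.std()
    best_idx, best_score = 0, -np.inf
    for i in range(n - m + 1):
        w = signal[i:i + m]
        sd = w.std()
        if sd == 0:
            continue                      # flat window, no correlation
        score = np.dot((w - w.mean()) / sd, t) / m
        if score > best_score:
            best_idx, best_score = i, score
    return best_idx, best_score

# synthetic camera profile: a bright blob (the source) at pixel 300
x = np.arange(800)
template = np.exp(-((np.arange(21) - 10) / 4.0) ** 2)
frame = 0.05 * np.random.default_rng(2).random(800)     # sensor noise
frame[290:311] += np.exp(-((x[290:311] - 300) / 4.0) ** 2)
idx, score = match_template_1d(frame, template)
```

    With a known mm-per-pixel scale for the 1,425-1,505 mm field of view, the matched index converts directly to an absolute dwell position, and the number of consecutive matching frames (at 30 fps) gives the dwell time.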

  2. Manufacture of Daily Check Device and Efficiency Evaluation for Daily Q.A

    International Nuclear Information System (INIS)

    Kim, Chan Yong; Jae, Young Wan; Park, Heung Deuk; Lee, Jae Hee

    2005-01-01

    Daily QA is an essential step that must precede radiation treatment. In particular, radiation output measurement and checks of the laser alignment and SSD indicator, which relate to reproducible patient set-up, must be confirmed for reliable radiation treatment. Daily QA must proceed accurately and promptly, and needs an objective measurement basis. A device was therefore required that allows output measurement and appliance checks to be confirmed at one time. We manufactured a phantom-type daily check device that can confirm several appliance checks (output measurement, laser alignment, field size, and SSD indicator) with a single set-up, performed measurements on four linear accelerators over four months, and evaluated its efficiency. Laser alignment, field size, and SSD indicator checks could be confirmed at the same time, and output measurement was possible with the same set-up, so daily QA time was reduced and an objective basis for each measured item was established. Over the four months of measurements, output was within ±2%, and laser alignment, field size, and SSD indicator were within ±1 mm. Output measurement and appliance checks could be performed conveniently, time was reduced, and work efficiency increased. Substituting for expensive commercial equipment also brought a cost reduction. In future, the device should be made of strong, lightweight materials to improve its convenience of use.

  3. The Second Round of the PHAR-QA Survey of Competences for Pharmacy Practice

    Directory of Open Access Journals (Sweden)

    Jeffrey Atkinson

    2016-09-01

    This paper presents the results of the second European Delphi round on the ranking of competences for pharmacy practice and compares these data to those of the first round already published. A comparison of the numbers of respondents, distribution by age group, country of residence, etc., shows that whilst the student population of respondents changed from Round 1 to 2, the populations of the professional groups (community, hospital and industrial pharmacists, pharmacists in other occupations and academics) were more stable. Results are given for the consensus of ranking and the scores of ranking of 50 competences for pharmacy practice. This two-stage, large-scale Delphi process harmonized and validated the Quality Assurance in European Pharmacy Education and Training (PHAR-QA) framework and ensured the adoption by the pharmacy profession of a framework proposed by the academic pharmacy community. The process of evaluation and validation of ranking of competences by the pharmacy profession is now complete, and the PHAR-QA consortium will now put forward a definitive PHAR-QA framework of competences for pharmacy practice.

  4. The Second Round of the PHAR-QA Survey of Competences for Pharmacy Practice

    Science.gov (United States)

    Atkinson, Jeffrey; De Paepe, Kristien; Pozo, Antonio Sánchez; Rekkas, Dimitrios; Volmer, Daisy; Hirvonen, Jouni; Bozic, Borut; Skowron, Agnieska; Mircioiu, Constantin; Marcincal, Annie; Koster, Andries; Wilson, Keith; van Schravendijk, Chris

    2016-01-01

    This paper presents the results of the second European Delphi round on the ranking of competences for pharmacy practice and compares these data to those of the first round already published. A comparison of the numbers of respondents, distribution by age group, country of residence, etc., shows that whilst the student population of respondents changed from Round 1 to 2, the populations of the professional groups (community, hospital and industrial pharmacists, pharmacists in other occupations and academics) were more stable. Results are given for the consensus of ranking and the scores of ranking of 50 competences for pharmacy practice. This two-stage, large-scale Delphi process harmonized and validated the Quality Assurance in European Pharmacy Education and Training (PHAR-QA) framework and ensured the adoption by the pharmacy profession of a framework proposed by the academic pharmacy community. The process of evaluation and validation of ranking of competences by the pharmacy profession is now complete, and the PHAR-QA consortium will now put forward a definitive PHAR-QA framework of competences for pharmacy practice. PMID:28970400

  5. The use of a commercial QA device for daily output check of a helical tomotherapy unit

    International Nuclear Information System (INIS)

    Alaei, Parham; Hui, Susanta K.; Higgins, Patrick D.; Gerbi, Bruce J.

    2006-01-01

    Helical tomotherapy radiation therapy units, due to their particular design and differences from a traditional linear accelerator, require different procedures by which to perform routine quality assurance (QA). One of the principal QA tasks that should be performed daily on any radiation therapy equipment is the output constancy check. The daily output check on a Hi-Art TomoTherapy unit is commonly performed utilizing ionization chambers placed inside a solid water phantom. This provides a good check of output at one point, but does not give any information on either energy or symmetry of the beam, unless more than one point is measured. This also has the added disadvantage that it has to be done by the physics staff. To address these issues, and to simplify the process, such that it can be performed by radiation therapists, we investigated the use of a commercially available daily QA device to perform this task. The use of this device simplifies the task of daily output constancy checks and eliminates the need for continued physics involvement. This device can also be used to monitor the constancy of beam energy and cone profile and can potentially be used to detect gross errors in the couch movement or laser alignment
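
    The daily constancy logic such a device supports reduces to comparing each channel's reading against its baseline; a minimal sketch (the channel names, baseline values, and 2% action level are illustrative):

```python
def constancy_check(readings, baselines, tol_pct=2.0):
    """Compare daily detector readings against baseline values.

    Returns per-channel percent deviations and an overall pass flag;
    a channel fails when it deviates more than `tol_pct` percent.
    """
    deviations = {
        name: 100.0 * (readings[name] - baselines[name]) / baselines[name]
        for name in baselines
    }
    passed = all(abs(d) <= tol_pct for d in deviations.values())
    return deviations, passed

# hypothetical baselines for output, energy and symmetry channels
baselines = {"output": 100.0, "energy": 50.0, "symmetry": 1.00}
deviations, ok = constancy_check(
    {"output": 101.2, "energy": 50.1, "symmetry": 1.01}, baselines)
```

    Tracking all channels at once is what lets a single daily measurement catch output, energy, and profile drifts that a one-point ion chamber reading would miss.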

  6. Proposal for a Similar Question Search System on a Q&A Site

    Directory of Open Access Journals (Sweden)

    Katsutoshi Kanamori

    2014-06-01

    There is a service to help Internet users obtain answers to specific questions when they visit a Q&A site. A Q&A site is very useful for the Internet user, but posted questions are often not answered immediately. This delay in answering occurs because in most cases another site user is answering the question manually. In this study, we propose a system that can present a question that is similar to a question posted by a user. An advantage of this system is that a user can refer to an answer to a similar question. This research measures the similarity of a candidate question based on word and dependency parsing. In an experiment, we examined the effectiveness of the proposed system for questions actually posted on the Q&A site. The result indicates that the system can show the questioner the answer to a similar question. However, the system still has a number of aspects that should be improved.
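
    The word-based part of the similarity measure described above can be sketched as a simple Jaccard overlap between token sets (the proposed system also uses dependency parsing, which this omits; the questions are invented examples):

```python
def jaccard_similarity(q1, q2):
    """Word-overlap similarity between two questions, in [0, 1]."""
    a, b = set(q1.lower().split()), set(q2.lower().split())
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

def most_similar(query, posted_questions):
    """Return the posted question most similar to the query."""
    return max(posted_questions, key=lambda q: jaccard_similarity(query, q))

posted = [
    "how do i reset my router password",
    "what is the capital of france",
    "how can i reset a forgotten email password",
]
best = most_similar("i forgot my email password how can i reset it", posted)
```

    Pure word overlap misses paraphrases ("forgot" vs "forgotten" count as different tokens here), which is one motivation for adding dependency-parse features on top.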

  7. A quality assurance (QA) system with a web camera for high-dose-rate brachytherapy

    International Nuclear Information System (INIS)

    Hirose, Asako; Ueda, Yoshihiro; Ohira, Shingo

    2016-01-01

    The quality assurance (QA) system that simultaneously quantifies the position and duration of a ¹⁹²Ir source (dwell position and time) was developed and the performance of this system was evaluated in high-dose-rate brachytherapy. This QA system has two functions to verify and quantify dwell position and time by using a web camera. The web camera records 30 images per second in a range from 1,425 mm to 1,505 mm. A user verifies the source position from the web camera in real time. The source position and duration were quantified from the recorded video using in-house software that applies a template-matching technique. This QA system allowed verification of the absolute position in real time and quantification of dwell position and time simultaneously. It was evident from the verification of the system that the mean of step size errors was 0.3±0.1 mm and that of dwell time errors 0.1 ± 0.0 s. Absolute position errors can be determined with an accuracy of 1.0 mm at all dwell points in three step sizes and dwell time errors with an accuracy of 0.1% in more than 10.0 s of the planned time. This system provides quick verification and quantification of the dwell position and time with high accuracy at various dwell positions without depending on the step size. (author)

  8. Man-machine interface requirements - advanced technology

    Science.gov (United States)

    Remington, R. W.; Wiener, E. L.

    1984-01-01

    Research issues and areas are identified where increased understanding of the human operator and the interaction between the operator and the avionics could lead to improvements in the performance of current and proposed helicopters. Both current and advanced helicopter systems and avionics are considered. Areas critical to man-machine interface requirements include: (1) artificial intelligence; (2) visual displays; (3) voice technology; (4) cockpit integration; and (5) pilot workloads and performance.

  9. Intelligent design som videnskab?

    DEFF Research Database (Denmark)

    Klausen, Søren Harnow

    2007-01-01

    Discusses whether intelligent design can be characterized as science, and argues that, given the absence of clear demarcation criteria, this can hardly be ruled out.

  10. Distributed intelligence in CAMAC

    International Nuclear Information System (INIS)

    Kunz, P.F.

    1977-01-01

    A simple extension of the CAMAC standard is described which allows distributed intelligence at the crate level. By distributed intelligence is meant that there is more than one source of control in a system. This standard is just now emerging from the NIM Dataway Working Group and its European counterpart. 1 figure

  11. Intelligence and treaty ratification

    International Nuclear Information System (INIS)

    Cahn, A.H.

    1990-01-01

    This paper reports that two sets of questions apply to the ratification phase: What is the role of intelligence in the ratification process? And what effect did intelligence have on that process? The author attempts to answer these and other questions.

  12. Applying Multiple Intelligences

    Science.gov (United States)

    Christodoulou, Joanna A.

    2009-01-01

    The ideas of multiple intelligences introduced by Howard Gardner of Harvard University more than 25 years ago have taken form in many ways, both in schools and in other sometimes-surprising settings. The silver anniversary of Gardner's learning theory provides an opportunity to reflect on the ways multiple intelligences theory has taken form and…

  13. Next generation Emotional Intelligence

    Science.gov (United States)

    J. Saveland

    2012-01-01

    Emotional Intelligence has been a hot topic in leadership training since Dan Goleman published his book on the subject in 1995. Emotional intelligence competencies are typically focused on recognition and regulation of emotions in one's self and social situations, yielding four categories: self-awareness, self-management, social awareness and relationship...

  14. Intelligence by consent

    DEFF Research Database (Denmark)

    Diderichsen, Adam; Rønn, Kira Vrist

    2017-01-01

    This article contributes to the current discussions concerning an adequate framework for intelligence ethics. The first part critically scrutinises the use of Just War Theory in intelligence ethics with specific focus on the just cause criterion. We argue that using self-defence as justifying cau...

  15. Intelligence and Physical Attractiveness

    Science.gov (United States)

    Kanazawa, Satoshi

    2011-01-01

    This brief research note aims to estimate the magnitude of the association between general intelligence and physical attractiveness with large nationally representative samples from two nations. In the United Kingdom, attractive children are more intelligent by 12.4 IQ points (r=0.381), whereas in the United States, the correlation between…

  16. Intelligence and treaty ratification

    International Nuclear Information System (INIS)

    Naftzinger, J.E.

    1990-01-01

    This paper describes the atmosphere leading up to the Senate INF hearings and then surveys the broad issues they raised. After that, the author highlights several aspects of the intelligence community's involvement and discusses the specific intelligence-related issues as the Senate committees saw them, notes their impact on the outcome, and finally draws several conclusions and lessons pertinent to the future

  17. Intelligence, Race, and Genetics

    Science.gov (United States)

    Sternberg, Robert J.; Grigorenko, Elena L.; Kidd, Kenneth K.

    2005-01-01

    In this article, the authors argue that the overwhelming portion of the literature on intelligence, race, and genetics is based on folk taxonomies rather than scientific analysis. They suggest that because theorists of intelligence disagree as to what it is, any consideration of its relationships to other constructs must be tentative at best. They…

  18. Multiple Intelligences in Action.

    Science.gov (United States)

    Campbell, Bruce

    1992-01-01

    Describes the investigation of the effects of a four-step model program used with third through fifth grade students to implement Gardner's concepts of seven human intelligences--linguistic, logical/mathematical, visual/spatial, musical, kinesthetic, intrapersonal, and interpersonal intelligence--into daily learning. (BB)

  19. The Reproduction of Intelligence

    Science.gov (United States)

    Meisenberg, Gerhard

    2010-01-01

    Although a negative relationship between fertility and education has been described consistently in most countries of the world, less is known about the relationship between intelligence and reproductive outcomes. Also the paths through which intelligence influences reproductive outcomes are uncertain. The present study uses the NLSY79 to analyze…

  20. Intelligent robot action planning

    Energy Technology Data Exchange (ETDEWEB)

    Vamos, T; Siegler, A

    1982-01-01

    Action planning methods used in intelligent robot control are discussed. Planning is accomplished through environment understanding, environment representation, task understanding and planning, motion analysis and man-machine communication. These fields are analysed in detail. The frames of an intelligent motion planning system are presented. Graphic simulation of the robot's environment and motion is used to support the planning. 14 references.

  1. Computational Intelligence in Intelligent Data Analysis

    CERN Document Server

    Nürnberger, Andreas

    2013-01-01

    Complex systems and their phenomena are ubiquitous, as they can be found in biology, finance, the humanities, management sciences, medicine, physics and similar fields. For many problems in these fields, there are no conventional ways to mathematically or analytically solve them completely at low cost. On the other hand, nature has already solved many optimization problems efficiently. Computational intelligence attempts to mimic nature-inspired problem-solving strategies and methods. These strategies can be used to study, model and analyze complex systems such that it becomes feasible to handle them. Key areas of computational intelligence are artificial neural networks, evolutionary computation and fuzzy systems. Like few other researchers in the field, Rudolf Kruse has contributed in many important ways to the understanding, modeling and application of computational intelligence methods. On the occasion of his 60th birthday, a collection of original papers of leading researchers in the field of computational intell...

  2. An overview of artificial intelligence and robotics. Volume 1: Artificial intelligence. Part B: Applications

    Science.gov (United States)

    Gevarter, W. B.

    1983-01-01

    Artificial Intelligence (AI) is an emerging technology that has recently attracted considerable attention. Many applications are now under development. This report, Part B of a three part report on AI, presents overviews of the key application areas: Expert Systems, Computer Vision, Natural Language Processing, Speech Interfaces, and Problem Solving and Planning. The basic approaches to such systems, the state-of-the-art, existing systems and future trends and expectations are covered.

  3. Intelligence and Prosocial Behavior

    DEFF Research Database (Denmark)

    Han, Ru; Shi, Jiannong; Yong, W.

    2012-01-01

    Results of previous studies of the relationship between prosocial behavior and intelligence have been inconsistent. This study attempts to distinguish the differences between several prosocial tasks, and explores the ways in which cognitive ability influences prosocial behavior. In Study One and Two, we reexamined the relationship between prosocial behavior and intelligence by employing a costly signaling theory with four games. The results revealed that the prosocial level of smarter children is higher than that of other children in more complicated tasks but not so in simple tasks. In Study Three, we tested the moderation effect of the average intelligence across classes, and the results did not show any group intelligence effect on the relationship between intelligence and prosocial behavior.

  4. Business Intelligence Systems

    Directory of Open Access Journals (Sweden)

    Bogdan NEDELCU

    2014-02-01

    Full Text Available The aim of this article is to show the importance of business intelligence and its growing influence. It also shows when the concept of business intelligence was used for the first time and how it evolved over time. The paper discusses the utility of a business intelligence system in any organization and its contribution to daily activities. Furthermore, we highlight the role and the objectives of business intelligence systems inside an organization, and the need to grow incomes and reduce costs, to manage the complexity of the business environment and to cut IT costs so that the organization survives in the current competitive climate. The article contains information about the architectural principles of a business intelligence system and how such a system can be achieved.

  5. Building an Information Resource Center for Competitive Intelligence.

    Science.gov (United States)

    Martin, J. Sperling

    1992-01-01

    Outlines considerations in the design of a Competitive Intelligence Information Resource Center (CIIRC), which is needed by business organizations for effective strategic decision making. Discussed are user needs, user participation, information sources, technology and interface design, operational characteristics, and planning for implementation.…

  6. Interfaces habladas

    Directory of Open Access Journals (Sweden)

    María Teresa Soto Sanfiel

    2012-04-01

    Full Text Available This article describes and reflects on the phenomenon of spoken interfaces (Interfaces habladas, IH) from various points of view and levels of analysis. The text was conceived with three specific objectives: 1. to provide a panoramic view of aspects of the production and communicative consumption of IH; 2. to offer recommendations for their creation and effective use; and 3. to draw attention to their proliferation and inspire their study from a communication perspective. Despite the growing presence of IH in our daily lives, there is a lack of texts that characterize and analyze them in terms of their communicative aspects. The work is pertinent because the phenomenon represents a change with respect to preceding communicative stages, with consequences for the intellectual and emotional conceptions of users. The proliferation of IH opens us to new communicative realities: we talk with machines.

  7. A framework for the intelligent control of nuclear rockets

    International Nuclear Information System (INIS)

    Parlos, A.G.; Metzger, J.D.

    1993-01-01

    An intelligent control system architecture is proposed for nuclear rockets, and its various components are briefly described. The objective of the intelligent controller is the satisfaction of performance, robustness, fault-tolerance and reliability design specifications. The proposed hierarchical architecture consists of three levels: hardware, signal processing, and knowledge processing. The functionality of the intelligent controller is implemented utilizing advanced information processing technologies such as artificial neural networks and fuzzy expert systems. The feasibility of a number of the controller architecture components has been independently validated using computer simulations. Preliminary results are presented demonstrating some of the signal processing capabilities of the intelligent nuclear rocket controller. Further work, currently in progress, is attempting to implement a number of the knowledge processing capabilities of the controller and their interface with the lower levels of the proposed architecture

  8. Business Intelligence & Analytical Intelligence: hou het zakelijk

    OpenAIRE

    Van Nieuwenhuyse, Dries

    2013-01-01

    Technology is democratizing, the market is consolidating, while the amount of data is exploding. It looks like an ideal breeding ground for projects around business intelligence and analytics. "The less the technology makes the difference, the more prominent the business will be."

  9. Social Intelligence Design in Ambient Intelligence

    NARCIS (Netherlands)

    Nijholt, Antinus; Stock, Oliviero; Stock, O.; Nishida, T.; Nishida, Toyoaki

    2009-01-01

    This Special Issue of AI and Society contains a selection of papers presented at the 6th Social Intelligence Design Workshop held at ITC-irst, Povo (Trento, Italy) in July 2007. Being the 6th in a series means that there now is a well-established and also a growing research area. The interest in

  10. Event detection intelligent camera development

    International Nuclear Information System (INIS)

    Szappanos, A.; Kocsis, G.; Molnar, A.; Sarkozi, J.; Zoletnik, S.

    2008-01-01

    A new camera system, the 'event detection intelligent camera' (EDICAM), is being developed for the video diagnostics of the W-7X stellarator, which consists of 10 distinct and standalone measurement channels, each holding a camera. Different operation modes will be implemented for continuous and for triggered readout as well. Hardware-level trigger signals will be generated from real-time image processing algorithms optimized for digital signal processor (DSP) and field programmable gate array (FPGA) architectures. At full resolution a camera sends 12 bit sampled 1280 x 1024 pixels at 444 fps, which means 1.43 terabytes over half an hour. Analysing such a huge amount of data is time consuming and has a high computational complexity. We plan to overcome this problem by EDICAM's preprocessing concepts. The EDICAM camera system integrates all the advantages of CMOS sensor chip technology and fast network connections. EDICAM is built up from three different modules with two interfaces: a sensor module (SM) with reduced hardware and functional elements to reach a small and compact size and robust operation in harsh environments; an image processing and control unit (IPCU) module that handles all user-predefined events and runs image processing algorithms to generate trigger signals; and finally a 10 Gigabit Ethernet compatible image readout card that functions as the network interface for the PC. In this contribution all the concepts of EDICAM and the functions of the distinct modules are described
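    The quoted data volume can be checked with a little arithmetic. The sketch below (all figures taken from the abstract) reproduces the 1.43 terabyte figure, which turns out to be binary terabytes (TiB):

```python
# Verify the EDICAM data-rate figure from the abstract:
# 12-bit samples, 1280 x 1024 pixels, 444 frames per second, half an hour.
BITS_PER_SAMPLE = 12
PIXELS = 1280 * 1024
FPS = 444
SECONDS = 30 * 60  # half an hour

bytes_per_frame = PIXELS * BITS_PER_SAMPLE // 8   # 1,966,080 bytes per frame
total_bytes = bytes_per_frame * FPS * SECONDS
tebibytes = total_bytes / 2**40                   # binary terabytes (TiB)

print(f"{tebibytes:.2f} TiB")  # 1.43 TiB
```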

  11. Spiritual Intelligence, Emotional Intelligence and Auditor’s Performance

    OpenAIRE

    Hanafi, Rustam

    2010-01-01

    The objective of this research was to investigate empirical evidence about the influence of auditor spiritual intelligence on performance, with emotional intelligence as a mediating variable. Linear regression models are developed to examine the hypothesis, along with path analysis. The dependent variable of each model is auditor performance, whereas the independent variable of model 1 is spiritual intelligence, and of model 2, emotional intelligence and spiritual intelligence. The parameters were estima...

  12. Naturalist Intelligence Among the Other Multiple Intelligences [In Bulgarian

    Directory of Open Access Journals (Sweden)

    R. Genkov

    2007-09-01

    Full Text Available The theory of multiple intelligences was presented by Gardner in 1983. The theory was revised later (1999), and among the other intelligences a naturalist intelligence was added. The criteria for distinguishing the different types of intelligences are considered. While Gardner restricted the analysis of naturalist intelligence to examples from living nature only, the present paper considers this problem against a wider background, including objects and persons of the natural sciences.

  13. Intelligence and treaty ratification

    International Nuclear Information System (INIS)

    Sojka, G.L.

    1990-01-01

    What did the intelligence community and the Intelligence Committee do poorly in regard to the treaty ratification process for arms control? We failed to solve the compartmentalization problem. This is a second-order problem, and, in general, analysts try to be very open; but there are problems nevertheless. There are very few, if any, people within the intelligence community who are cleared for everything relevant to our monitoring capability, except perhaps the Director of Central Intelligence and the President, and this is a major problem. The formal monitoring estimates are drawn up by individuals who do not have access to all the information needed to make the monitoring judgements. This paper reports that the intelligence community did not present a formal document on either Soviet incentives or disincentives to cheat or on the possibility of cheating scenarios, and that was a mistake. However, the intelligence community was very responsive in producing those types of estimates and, ultimately, the evidence behind them in response to questions. Nevertheless, the author thinks the intelligence community would do well to address this issue up front before a treaty is submitted to the Senate for advice and consent

  14. The Epistemic Status of Intelligence

    DEFF Research Database (Denmark)

    Rønn, Kira Vrist; Høffding, Simon

    2012-01-01

    We argue that the majority of intelligence definitions fail to recognize that the normative epistemic status of intelligence is knowledge and not an inferior alternative. We refute the counter-arguments that intelligence ought not to be seen as knowledge because of 1) its action-oriented scope...... and robustness of claims to intelligence-knowledge can be assessed....

  15. Moral Intelligence in the Schools

    Science.gov (United States)

    Clarken, Rodney H.

    2009-01-01

    Moral intelligence is newer and less studied than the more established cognitive, emotional and social intelligences, but has great potential to improve our understanding of learning and behavior. Moral intelligence refers to the ability to apply ethical principles to personal goals, values and actions. The construct of moral intelligence consists…

  16. SU-E-J-52: Decreasing Frequency of Performing TG-142 Imaging QA – 5 Year Experience

    Energy Technology Data Exchange (ETDEWEB)

    Lin, T; Ma, C [Fox Chase Cancer Center, Philadelphia, PA (United States)

    2015-06-15

    Purpose: This study is an update to check whether the frequency of imaging QA suggested by AAPM Task Group Report 142 (TG142) is necessary, based on our 5-year experience. TG142 presents recommendations for QA criteria of IGRT treatment. The ACR has adopted it as the requirement for radiotherapy practices; however, we propose to reduce the frequency of image quality QA according to this 5-year study. Method and Materials: This study uses VarianIX2100 and Siemens Artiste Linacs to perform QAs on kV, MV, and CBCT modalities. The QA was designed following the recommendations of TG142. This study reports the daily imaging positioning/repositioning and imaging and treatment coordinate coincidence. QA results on kV, MV and CBCT from 4/7/2010∼3/11/15 are analyzed. kV, MV, and CBCT images are taken with the Varian isocube localized at the isocenter. A digital graticule is used in the software to verify the isocenter position. CBCT images are taken with the cube placed at 1 cm superior, lateral and anterior of the isocenter. In-line fusion software is used to verify the contrived shift. The digital ruler provided in the on-board-imaging software or adaptive-targeting software was used to measure the position differences. The position differences were recorded in the AP, LR, and SI directions. Results: 5-year records on kV, MV, and CBCT show the shifts in all three directions are within the tolerance of 1 mm suggested in TG142 for stereotactic radiation treatment (SRS/SRT). There is no occasion where shifts are outside the 1 mm tolerance. Conclusions: The daily imaging QA suggested in TG142 is useful in ensuring the accuracy needed for SRS/SRT in IGRT. The 5-year measurements presented suggest that decreasing the frequency of imaging QA may be acceptable, in particular for institutions reporting no violation of tolerance over periods of a few years.
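    As a minimal illustration of the daily tolerance check this abstract describes, the sketch below flags any recorded AP/LR/SI shift whose magnitude exceeds the 1 mm SRS/SRT tolerance; the dictionary layout and sample values are hypothetical, not the paper's data.

```python
# Hypothetical daily-QA tolerance check: flag any measured shift (mm)
# whose magnitude exceeds the 1 mm SRS/SRT tolerance suggested in TG142.
TOLERANCE_MM = 1.0

def out_of_tolerance(shifts_mm):
    """Return (axis, value) pairs whose absolute shift exceeds tolerance."""
    return [(axis, v) for axis, v in shifts_mm.items()
            if abs(v) > TOLERANCE_MM]

# Illustrative measurements in the three recorded directions.
daily_record = {"AP": 0.4, "LR": -0.7, "SI": 0.9}
print(out_of_tolerance(daily_record))  # [] -> within tolerance
```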

  17. Experience in the application of the IAEA QA code and guides to the manufacture of nuclear reactor components

    International Nuclear Information System (INIS)

    Dutta, N.G.; Mankame, M.A.; Kulkarni, P.G.; Vijayaraghavan, R.; Balaramamoorthy, K.

    1985-01-01

    India has made considerable progress in the indigenous manufacture of 'Quality' nuclear reactor components. All activities associated with the development of atomic energy from mining of strategic minerals to the design, construction, and operation of nuclear power plants including supporting research and development efforts are mainly carried out by the Department of Atomic Energy (DAE). Through the sustained efforts of DAE, the major industries, both in public and private sectors supplying nuclear components have now adopted the practice of systematic quality assurance (QA). The stringent QA steps are mandatory for achieving the desired quality in the manufactured nuclear components. Control blades for BWRs are now indigenously manufactured by the Atomic Fuels Division (AFD) of Bhabha Atomic Research Centre (BARC), a constituent unit of DAE. For the Project Dhruva, a 100 MW(th) nuclear reactor, constructed at BARC, Trombay, Bombay, an independent cell was formed to carry out quality audit on the manufactured components. The components were designed, fabricated, inspected and tested to the desired quality level. The QA activities were enforced from the procurement of raw materials to the audit of the completed component for monitoring the manufacturer's continued compliance with the design. The major components of Dhruva, viz. calandria, end-shield, coolant channels, heat exchangers, etc., were covered under these quality audit activities. The paper highlights the QA programme implemented in the manufacture of control blades for BWRs, illustrated with a typical example, the end-shield for Dhruva. The authors consider that the recommendations and guidelines provided in the documents 50-SG-QA3, 50-SG-QA8, 50-SG-QA10, etc., were useful in providing a formal and systematic framework, under which various quality assurance functions have been carried out

  18. SU-E-J-52: Decreasing Frequency of Performing TG-142 Imaging QA – 5 Year Experience

    International Nuclear Information System (INIS)

    Lin, T; Ma, C

    2015-01-01

    Purpose: This study is an update to check whether the frequency of imaging QA suggested by AAPM Task Group Report 142 (TG142) is necessary, based on our 5-year experience. TG142 presents recommendations for QA criteria of IGRT treatment. The ACR has adopted it as the requirement for radiotherapy practices; however, we propose to reduce the frequency of image quality QA according to this 5-year study. Method and Materials: This study uses VarianIX2100 and Siemens Artiste Linacs to perform QAs on kV, MV, and CBCT modalities. The QA was designed following the recommendations of TG142. This study reports the daily imaging positioning/repositioning and imaging and treatment coordinate coincidence. QA results on kV, MV and CBCT from 4/7/2010∼3/11/15 are analyzed. kV, MV, and CBCT images are taken with the Varian isocube localized at the isocenter. A digital graticule is used in the software to verify the isocenter position. CBCT images are taken with the cube placed at 1 cm superior, lateral and anterior of the isocenter. In-line fusion software is used to verify the contrived shift. The digital ruler provided in the on-board-imaging software or adaptive-targeting software was used to measure the position differences. The position differences were recorded in the AP, LR, and SI directions. Results: 5-year records on kV, MV, and CBCT show the shifts in all three directions are within the tolerance of 1 mm suggested in TG142 for stereotactic radiation treatment (SRS/SRT). There is no occasion where shifts are outside the 1 mm tolerance. Conclusions: The daily imaging QA suggested in TG142 is useful in ensuring the accuracy needed for SRS/SRT in IGRT. The 5-year measurements presented suggest that decreasing the frequency of imaging QA may be acceptable, in particular for institutions reporting no violation of tolerance over periods of a few years

  19. SU-E-CAMPUS-T-04: Statistical Process Control for Patient-Specific QA in Proton Beams

    Energy Technology Data Exchange (ETDEWEB)

    LAH, J [Myongji Hospital, Goyangsi, Gyeonggi-do (Korea, Republic of); SHIN, D [National Cancer Center, Goyangsi, Gyeonggi-do (Korea, Republic of); Kim, G [UCSD Medical Center, La Jolla, CA (United States)

    2014-06-15

    Purpose: To evaluate and improve the reliability of the proton QA process, and to provide an optimal customized tolerance level using the statistical process control (SPC) methodology. The aim is then to suggest suitable guidelines for the patient-specific QA process. Methods: We investigated the constancy of the dose output and range to see whether it was within the tolerance level of the daily QA process. This study analyzed the difference between the measured and calculated ranges along the central axis to suggest suitable guidelines for patient-specific QA in proton beams by using process capability indices. In this study, patient QA plans were classified into 6 treatment sites: head and neck (41 cases), spinal cord (29 cases), lung (28 cases), liver (30 cases), pancreas (26 cases), and prostate (24 cases). Results: The deviations for the dose output and range of the daily QA process were ±0.84% and ±0.19%, respectively. Our results show that the patient-specific range measurements are capable at a specification limit of ±2% in all treatment sites except the spinal cord cases. In spinal cord cases, comparison of process capability indices (Cp, Cpm, Cpk ≥1, but Cpmk ≤1) indicated that the process is capable but not centered: the process mean deviates from its target value. The UCL (upper control limit), CL (center line) and LCL (lower control limit) for spinal cord cases were 1.37%, −0.27% and −1.89%, respectively. On the other hand, the range differences in prostate cases were in good agreement between calculated and measured values. The UCL, CL and LCL for prostate cases were 0.57%, −0.11% and −0.78%, respectively. Conclusion: The SPC methodology has potential as a useful tool to customize optimal tolerance levels and to suggest suitable guidelines for patient-specific QA in clinical proton beams.
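    The control limits and capability indices named in this abstract follow standard SPC definitions. The sketch below computes them for an individuals chart, assuming CL = mean and UCL/LCL = mean ± 3σ (the paper may use a moving-range estimate of σ instead); the range-difference data are illustrative, not the paper's measurements.

```python
# Standard SPC quantities: the center line (CL) is the sample mean, control
# limits are mean +/- 3 sigma; Cp and Cpk compare the process spread to the
# specification limits (here the +/-2% limit mentioned in the abstract).
import statistics

def control_limits(samples):
    mean = statistics.mean(samples)
    sigma = statistics.stdev(samples)
    return mean + 3 * sigma, mean, mean - 3 * sigma  # UCL, CL, LCL

def cp_cpk(samples, lsl, usl):
    mean = statistics.mean(samples)
    sigma = statistics.stdev(samples)
    cp = (usl - lsl) / (6 * sigma)                   # potential capability
    cpk = min(usl - mean, mean - lsl) / (3 * sigma)  # capability with centering
    return cp, cpk

# Illustrative range differences (%), not the paper's data.
diffs = [-0.3, 0.1, -0.2, 0.4, -0.1, 0.0, 0.2, -0.4]
ucl, cl, lcl = control_limits(diffs)
cp, cpk = cp_cpk(diffs, lsl=-2.0, usl=2.0)
```

A process is deemed capable when the indices exceed 1; Cpk falling below Cp signals that the process mean is off-center, which is exactly the spinal-cord finding reported above.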

  20. Advanced intelligence and mechanism approach

    Institute of Scientific and Technical Information of China (English)

    ZHONG Yixin

    2007-01-01

    Advanced intelligence will be the focus of intelligence research in the next 50 years. An understanding of the concept of advanced intelligence, as well as its importance, will be provided first, followed by a detailed analysis of an approach, the mechanism approach, suitable for advanced intelligence research. The mutual relationship among the mechanism approach, the traditional approaches in artificial intelligence research, and cognitive informatics will then be discussed. It is interesting to discover that the mechanism approach is a good approach to advanced intelligence research and a unified form of the existing approaches to artificial intelligence.

  1. Intelligent Motion and Interaction Within Virtual Environments

    Science.gov (United States)

    Ellis, Stephen R. (Editor); Slater, Mel (Editor); Alexander, Thomas (Editor)

    2007-01-01

    What makes virtual actors and objects in virtual environments seem real? How can the illusion of their reality be supported? What sorts of training or user-interface applications benefit from realistic user-environment interactions? These are some of the central questions that designers of virtual environments face. To be sure, simulation realism is not necessarily the major, or even a required, goal of a virtual environment intended to communicate specific information. But for some applications in entertainment, marketing, or aspects of vehicle simulation training, realism is essential. The following chapters will examine how a sense of truly interacting with dynamic, intelligent agents may arise in users of virtual environments. These chapters are based on presentations at the London conference on Intelligent Motion and Interaction within Virtual Environments, which was held at University College, London, U.K., 15-17 September 2003.

  2. Artificial intelligence applications for operation and maintenance

    International Nuclear Information System (INIS)

    Itoh, M.; Tai, I.; Monta, K.; Sekimizu, K.

    1987-01-01

    A nuclear power plant, as a typical man-machine system of modern industry, needs an efficient human window through which operators can observe every necessary detail of the plant for its safe and reliable operation. Much effort has been devoted to the development of computerized operator support systems (COSS). Recent development of artificial intelligence (AI) seems to offer new possibilities to strengthen the performance of the COSS, such as more powerful diagnosis and procedure synthesis and user-friendly man-machine interfaces. From this point of view, a national project on Advanced Man-Machine System Development for Nuclear Power Plants has been carried out. Artificial intelligence application to nuclear power plant operation and maintenance is also selected as a major theme for the promotion of research and development on frontiers in the recently revised long-term national program for the development and utilization of nuclear energy in Japan

  3. Intelligent environmental sensing

    CERN Document Server

    Mukhopadhyay, Subhas

    2015-01-01

    Developing environmental sensing and monitoring technologies becomes essential, especially for industries that may cause severe contamination. Intelligent environmental sensing uses novel sensor techniques, intelligent signal and data processing algorithms, and wireless sensor networks to enhance environmental sensing and monitoring. It finds applications in many environmental problems such as oil and gas, water quality, and agriculture. This book addresses issues related to three main approaches to intelligent environmental sensing and discusses their latest technological developments. Key contents of the book include: agricultural monitoring; classification, detection, and estimation; data fusion; geological monitoring; motor monitoring; multi-sensor systems; oil reservoir monitoring; sensor motes; water quality monitoring; and wireless sensor network protocols.

  4. Is Intelligence Artificial?

    OpenAIRE

    Greer, Kieran

    2014-01-01

    Our understanding of intelligence is directed primarily at the level of human beings. This paper attempts to give a more unifying definition that can be applied to the natural world in general. The definition would be used more to verify a degree of intelligence, not to quantify it and might help when making judgements on the matter. A version of an accepted test for AI is then put forward as the 'acid test' for Artificial Intelligence itself. It might be what a free-thinking program or robot...

  5. Installation and evaluation of a nuclear power plant operator advisor based on artificial intelligence technology

    International Nuclear Information System (INIS)

    Hajek, B.K.; Miller, D.W.

    1989-01-01

    This report discusses the following topics on a nuclear power plant operator advisor based on artificial intelligence technology: workstation conversion; software conversion; V&V program development; simulator interface development; knowledge base expansion; dynamic testing; database conversion; installation at the Perry simulator; evaluation of operator interaction; design of the man-machine interface; and design of the maintenance facility

  6. Intelligent holographic databases

    Science.gov (United States)

    Barbastathis, George

    Memory is a key component of intelligence. In the human brain, physical structure and functionality jointly provide diverse memory modalities at multiple time scales. How could we engineer artificial memories with similar faculties? In this thesis, we attack both hardware and algorithmic aspects of this problem. A good part is devoted to holographic memory architectures, because they meet high capacity and parallelism requirements. We develop and fully characterize shift multiplexing, a novel storage method that simplifies disk head design for holographic disks. We develop and optimize the design of compact refreshable holographic random access memories, showing several ways that 1 Tbit can be stored holographically in a volume less than 1 m³, with surface density more than 20 times higher than conventional silicon DRAM integrated circuits. To address the issue of photorefractive volatility, we further develop the two-lambda (dual wavelength) method for shift multiplexing, and combine electrical fixing with angle multiplexing to demonstrate 1,000 multiplexed fixed holograms. Finally, we propose a noise model and an information theoretic metric to optimize the imaging system of a holographic memory, in terms of storage density and error rate. Motivated by the problem of interfacing sensors and memories to a complex system with limited computational resources, we construct a computer game of Desert Survival, built as a high-dimensional non-stationary virtual environment in a competitive setting. The efficacy of episodic learning, implemented as a reinforced Nearest Neighbor scheme, and the probability of winning against a control opponent improve significantly by concentrating the algorithmic effort to the virtual desert neighborhood that emerges as most significant at any time. The generalized computational model combines the autonomous neural network and von Neumann paradigms through a compact, dynamic central representation, which contains the most salient features

  7. Experience Using DosimetryCheck software for IMRT and RapidArc Patient Pre-treatment QA and a New Feature for QA during Treatment

    International Nuclear Information System (INIS)

    Pinkerton, Arthur; Hannon, Michael; Kwag, Jae; Renner, Wendel Dean

    2010-01-01

    We have used the DosimetryCheck program with the EPIDs on our Varian 2100EX machines to perform pre-treatment QA on more than 350 patients between the last quarter of 2006 and the present. The software uses the EPID-measured fluences of the treatment fields to reconstruct the dose distribution in the CT planning model of the patient. Since the dose calculation algorithm is different from that used by our Eclipse planning system, this provides an independent check of planning accuracy as well as treatment delivery. 2D and 3D dose distributions, point doses, gamma distributions, DVH statistics and MU calculations can be compared. Absolute differences of reference point doses between DosimetryCheck and Eclipse average 1.20%, which is similar to the ionization chamber dose differences of 1.29% for the same patient verification plans. Examples of cases for various treatment sites and delivery modes will be presented. A Special Report in Medical Physics Vol. 37, No. 6, pp. 2638-2644, from Mans et al at The Netherlands Cancer Institute demonstrated the ability of in vivo EPID dosimetry to detect treatment errors that escaped other QA checks. A new version of DosimetryCheck, awaiting FDA approval, is capable of successfully reconstructing the dose distribution in the patient from the EPID-measured exit fluences. This can also be applied to CBCT images, providing actual patient dose verification for a treatment session. This should be particularly useful for monitoring hypo-fractionated treatment regimens. Examples of this method will also be presented.
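    The gamma distributions mentioned in this abstract refer to the gamma-index comparison widely used in dose verification. A minimal 1-D sketch is given below; the 3%/3 mm criteria are illustrative defaults, not values stated in the abstract.

```python
# Minimal 1-D gamma-index sketch: each reference point gets the minimum,
# over all evaluated points, of the combined dose-difference (%) and
# distance-to-agreement (mm) metric. A point "passes" when gamma <= 1.
import math

def gamma_1d(ref, ev, spacing_mm, dd_pct=3.0, dta_mm=3.0):
    gammas = []
    for i, r in enumerate(ref):
        best = math.inf
        for j, e in enumerate(ev):
            dose_term = ((e - r) / r * 100.0 / dd_pct) ** 2
            dist_term = ((j - i) * spacing_mm / dta_mm) ** 2
            best = min(best, math.sqrt(dose_term + dist_term))
        gammas.append(best)
    return gammas

# Identical profiles pass trivially: every gamma value is 0.
print(gamma_1d([2.0, 2.1, 2.0], [2.0, 2.1, 2.0], spacing_mm=1.0))
```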

  8. Waste assay measurement integration system user interface

    International Nuclear Information System (INIS)

    Mousseau, K.C.; Hempstead, A.R.; Becker, G.K.

    1995-01-01

The Waste Assay Measurement Integration System (WAMIS) is being developed to improve confidence in and lower the uncertainty of waste characterization data. There are two major components to the WAMIS: a data access and visualization component and a data interpretation component. The intent of the access and visualization software is to provide simultaneous access to all data sources that describe the contents of any particular container of waste. The visualization software also allows the user to display data at any level, from raw to reduced output. Depending on user type, the software displays a menuing hierarchy, related to level of access, that allows the user to observe only those data sources s/he has been authorized to view. Access levels include system administrator, physicist, QA representative, shift operations supervisor, and data entry. Data sources are displayed in separate windows and presently include (1) real-time radiography video, (2) gamma spectra, (3) passive and active neutron, (4) radionuclide mass estimates, (5) total alpha activity (Ci), (6) container attributes, (7) thermal power (W), and (8) mass ratio estimates for americium, plutonium, and uranium isotopes. The data interpretation component is in the early phases of design, but will include artificial intelligence, expert system, and neural network techniques. The system is being developed on a Pentium PC using Microsoft Visual C++. Future generations of WAMIS will be UNIX based and will incorporate more generic radiographic/tomographic, gamma spectroscopic/tomographic, neutron, and prompt gamma measurements.
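The role-dependent menuing described in this record is essentially a mapping from access level to viewable data sources; a minimal Python sketch of the idea (the role and source names are illustrative, not the actual WAMIS implementation):

```python
# Illustrative access-control table: each role sees a subset of the data sources.
# Names are hypothetical, modeled loosely on the levels listed in the abstract.
ACCESS = {
    "system_administrator": {"radiography_video", "gamma_spectra", "neutron",
                             "radionuclide_mass", "alpha_activity",
                             "container_attributes", "thermal_power", "mass_ratios"},
    "physicist": {"gamma_spectra", "neutron", "radionuclide_mass", "mass_ratios"},
    "qa_representative": {"radionuclide_mass", "container_attributes"},
    "data_entry": {"container_attributes"},
}

def viewable_sources(role):
    """Return the data sources a user of the given role may display."""
    return sorted(ACCESS.get(role, set()))

print(viewable_sources("physicist"))
```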

  9. Multiple Intelligences and quotient spaces

    OpenAIRE

    Malatesta, Mike; Quintana, Yamilet

    2006-01-01

The Multiple Intelligence Theory (MI) is one of the models that study and describe the cognitive abilities of an individual. In [7], a referential system is presented which allows identifying the Multiple Intelligences of the students of a course and classifying the level of development of such Intelligences. Following this tendency, the purpose of this paper is to describe the model of Multiple Intelligences as a quotient space, and also to study the Multiple Intelligences of an individual in...

  10. Business Intelligence using Software Agents

    OpenAIRE

    Ana-Ramona BOLOGA; Razvan BOLOGA

    2011-01-01

This paper presents some ideas about business intelligence today and the importance of developing real-time business solutions. The authors explore the links between business intelligence and artificial intelligence and focus specifically on the implementation of software agent-based systems in business intelligence. Some of the few solutions proposed so far that use software agent properties for the benefit of business intelligence are briefly presented. The authors then...

  11. Multimodality and Ambient Intelligence

    NARCIS (Netherlands)

    Nijholt, Antinus; Verhaegh, W.; Aarts, E.; Korst, J.

    2004-01-01

In this chapter we discuss multimodal interface technology. We present examples of multimodal interfaces and show problems and opportunities. Fusion of modalities is discussed and some roadmap discussions on research in multimodality are summarized. This chapter also discusses future developments.

  12. SU-G-TeP2-01: Can EPID Based Measurement Replace Traditional Daily Output QA On Megavoltage Linac?

    International Nuclear Information System (INIS)

    Saleh, Z; Tang, X; Song, Y; Obcemea, C; Beeban, N; Chan, M; Li, X; Tang, G; Lim, S; Lovelock, D; LoSasso, T; Mechalakos, J; Both, S

    2016-01-01

Purpose: To investigate the long-term stability and viability of using EPID-based daily output QA, via in-house and vendor-driven protocols, to replace conventional QA tools and improve QA efficiency. Methods: Two Varian TrueBeam machines (TB1&TB2) equipped with electronic portal imaging devices (EPID) were employed in this study. Both machines were calibrated per TG-51 and used clinically since Oct 2014. Daily output measurements for 6/15 MV beams were obtained using the SunNuclear DailyQA3 device as part of morning QA. In addition, an in-house protocol was implemented for EPID output measurement (10×10 cm fields, 100 MU, 100 cm SID, output defined over an ROI of 2×2 cm around the central axis). Moreover, the Varian Machine Performance Check (MPC) was used on both machines to measure machine output. The EPID- and DailyQA3-based measurements of the relative machine output were compared and cross-correlated with monthly machine output as measured by an A12 Exradin 0.65 cc ion chamber (IC) serving as ground truth. The results were correlated using the Pearson test. Results: The correlations among DailyQA3, in-house EPID and Varian MPC output measurements with the IC for 6/15 MV were similar for TB1 (0.83–0.95) and TB2 (0.55–0.67). The machine output for the 6/15 MV beams on both machines showed a similar trend, namely an increase over time as indicated by all measurements, requiring a machine recalibration after 6 months. This drift is due to a known issue with the pressurized monitor chamber, which tends to leak over time. MPC failed occasionally but passed when repeated. Conclusion: The results indicate that the use of EPID for daily output measurements has the potential to become a viable and efficient tool for routine daily LINAC QA, thus eliminating weather (T,P) and human setup variability and increasing the efficiency of the QA process.
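The cross-correlation of daily EPID readings against ion-chamber ground truth uses the Pearson coefficient; a minimal pure-Python sketch, with made-up relative output values (not the study's data) showing the kind of upward drift the abstract describes:

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical relative outputs drifting upward over six months (illustrative only).
epid = [1.000, 1.003, 1.006, 1.010, 1.013, 1.017]
ic   = [1.000, 1.002, 1.007, 1.009, 1.014, 1.016]
print(f"r = {pearson(epid, ic):.3f}")
```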

  13. SU-G-TeP2-01: Can EPID Based Measurement Replace Traditional Daily Output QA On Megavoltage Linac?

    Energy Technology Data Exchange (ETDEWEB)

    Saleh, Z; Tang, X; Song, Y; Obcemea, C; Beeban, N; Chan, M; Li, X; Tang, G; Lim, S; Lovelock, D; LoSasso, T; Mechalakos, J; Both, S [Memorial Sloan-Kettering Cancer Center, NY (United States)

    2016-06-15

Purpose: To investigate the long-term stability and viability of using EPID-based daily output QA, via in-house and vendor-driven protocols, to replace conventional QA tools and improve QA efficiency. Methods: Two Varian TrueBeam machines (TB1&TB2) equipped with electronic portal imaging devices (EPID) were employed in this study. Both machines were calibrated per TG-51 and used clinically since Oct 2014. Daily output measurements for 6/15 MV beams were obtained using the SunNuclear DailyQA3 device as part of morning QA. In addition, an in-house protocol was implemented for EPID output measurement (10×10 cm fields, 100 MU, 100 cm SID, output defined over an ROI of 2×2 cm around the central axis). Moreover, the Varian Machine Performance Check (MPC) was used on both machines to measure machine output. The EPID- and DailyQA3-based measurements of the relative machine output were compared and cross-correlated with monthly machine output as measured by an A12 Exradin 0.65 cc ion chamber (IC) serving as ground truth. The results were correlated using the Pearson test. Results: The correlations among DailyQA3, in-house EPID and Varian MPC output measurements with the IC for 6/15 MV were similar for TB1 (0.83–0.95) and TB2 (0.55–0.67). The machine output for the 6/15 MV beams on both machines showed a similar trend, namely an increase over time as indicated by all measurements, requiring a machine recalibration after 6 months. This drift is due to a known issue with the pressurized monitor chamber, which tends to leak over time. MPC failed occasionally but passed when repeated. Conclusion: The results indicate that the use of EPID for daily output measurements has the potential to become a viable and efficient tool for routine daily LINAC QA, thus eliminating weather (T,P) and human setup variability and increasing the efficiency of the QA process.

  14. Engineering general intelligence

    CERN Document Server

    Goertzel, Ben; Geisweiller, Nil

    2014-01-01

The work outlines a novel conceptual and theoretical framework for understanding Artificial General Intelligence and, based on this framework, outlines a practical roadmap for the development of AGI with capability at the human level and ultimately beyond.

  15. Understanding US National Intelligence

    DEFF Research Database (Denmark)

    Leander, Anna

    2014-01-01

    In July 2010, the Washington Post (WP) published the results of a project on “Top Secret America” on which twenty investigative journalists had been working for two years. The project drew attention to the change and growth in National Intelligence following 9/11 (Washington Post 2010a......). The initial idea had been to work on intelligence generally, but given that this proved overwhelming, the team narrowed down to focus only on intelligence qualified as “top secret.” Even so, the growth in this intelligence activity is remarkable. This public is returning, or in this case expanding...... at an impressive speed confirming the general contention of this volume. Between 2001 and 2010 the budget had increased by 250 percent, reaching $75 billion (the GDP of the Czech Republic). Thirty-three building complexes for top secret work had been or were under construction in the Washington area; 1...

  16. Engineering general intelligence

    CERN Document Server

    Goertzel, Ben; Geisweiller, Nil

    2014-01-01

The work outlines a detailed blueprint for the creation of an Artificial General Intelligence system with capability at the human level and ultimately beyond, according to the CogPrime AGI design and the OpenCog software architecture.

  17. Intelligence Issues for Congress

    National Research Council Canada - National Science Library

    Best, Jr, Richard A

    2007-01-01

    To address the challenges facing the U.S. Intelligence Community in the 21st century, congressional and executive branch initiatives have sought to improve coordination among the different agencies and to encourage better analysis...

  18. Intelligence Issues for Congress

    National Research Council Canada - National Science Library

    Best, Jr, Richard A

    2006-01-01

    To address the challenges facing the U.S. Intelligence Community in the 21st Century, congressional and executive branch initiatives have sought to improve coordination among the different agencies and to encourage better analysis...

  19. Intelligence Issues for Congress

    National Research Council Canada - National Science Library

    Best, Jr, Richard A

    2008-01-01

    To address the challenges facing the U.S. Intelligence Community in the 21st century, congressional and executive branch initiatives have sought to improve coordination among the different agencies and to encourage better analysis...

  20. Intelligent Information Systems Institute

    National Research Council Canada - National Science Library

    Gomes, Carla

    2004-01-01

    ...) at Cornell during the first three years of operation. IISI's mandate is threefold: To perform and stimulate research in computational and data-intensive methods for intelligent decision making systems...

  1. Quo vadis, Intelligent Machine?

    Directory of Open Access Journals (Sweden)

    Rosemarie Velik

    2010-09-01

Full Text Available Artificial Intelligence (AI) is a branch of computer science concerned with making computers behave like humans. At least this was the original idea. However, it turned out that this is no easy task to solve. This article aims to give a comprehensible review of the last 60 years of artificial intelligence from a philosophical viewpoint. It outlines what has happened so far in AI, what is currently going on in this research area, and what can be expected in the future. The goal is to convey an understanding of the developments and changes over time in thinking about how to achieve machine intelligence. The clear message is that AI has to join forces with neuroscience and other brain disciplines in order to make a step towards the development of truly intelligent machines.

  2. Bibliography: Artificial Intelligence.

    Science.gov (United States)

    Smith, Richard L.

    1986-01-01

    Annotates reference material on artificial intelligence, mostly at an introductory level, with applications to education and learning. Topics include: (1) programing languages; (2) expert systems; (3) language instruction; (4) tutoring systems; and (5) problem solving and reasoning. (JM)

  3. Handbook of Intelligent Vehicles

    CERN Document Server

    2012-01-01

The Handbook of Intelligent Vehicles provides complete coverage of the fundamentals, new technologies, and sub-areas essential to the development of intelligent vehicles; it also includes advances made to date, challenges, and future trends. Significant strides in the field have been made to date; however, there has so far been no single book or volume which captures these advances in a comprehensive format, addressing all essential components and subspecialties of intelligent vehicles, as this book does. Since the intended users are engineering practitioners, as well as researchers and graduate students, the book chapters not only cover fundamentals, methods, and algorithms but also describe how software/hardware are implemented, and demonstrate the advances along with their present challenges. Research at both the component and systems levels is required to advance the functionality of intelligent vehicles. This volume covers both of these aspects in addition to the fundamentals listed above.

  4. Genes, evolution and intelligence.

    Science.gov (United States)

    Bouchard, Thomas J

    2014-11-01

I argue that the g factor meets the fundamental criteria of a scientific construct more fully than any other conception of intelligence. I briefly discuss the evidence regarding the relationship of brain size to intelligence. A review of a large body of evidence demonstrates that there is a g factor in a wide range of species and that, in the species studied, it relates to brain size and is heritable. These findings suggest that many species have evolved a general-purpose mechanism (a general biological intelligence) for dealing with the environments in which they evolved. In spite of numerous studies with considerable statistical power, we know of very few genes that influence g and the effects are very small. Nevertheless, g appears to be highly polygenic. Given the complexity of the human brain, it is not surprising that one of its primary faculties, intelligence, is best explained by the near infinitesimal model of quantitative genetics.

  5. Modelling intelligent behavior

    Science.gov (United States)

    Green, H. S.; Triffet, T.

    1993-01-01

    An introductory discussion of the related concepts of intelligence and consciousness suggests criteria to be met in the modeling of intelligence and the development of intelligent materials. Methods for the modeling of actual structure and activity of the animal cortex have been found, based on present knowledge of the ionic and cellular constitution of the nervous system. These have led to the development of a realistic neural network model, which has been used to study the formation of memory and the process of learning. An account is given of experiments with simple materials which exhibit almost all properties of biological synapses and suggest the possibility of a new type of computer architecture to implement an advanced type of artificial intelligence.

  6. Emotional Intelligence: Requiring Attention

    Directory of Open Access Journals (Sweden)

    Monica Tudor

    2016-01-01

Full Text Available This article aims to highlight the need for emotional intelligence. Two methods of measurement are presented in this research, in order to better understand the necessity of a correct result. The results of the research can lead to recommendations for improving levels of emotional intelligence and are useful for obtaining data to better compare past and present results. The papers presented in this research are significant for future study of this subject. The first paper presents the evolution of emotional intelligence in the past two years, more specifically its decrease concerning certain characteristics. The second one presents research on the differences between generations. The third one shows a difference in emotional intelligence levels of children from rural versus urban environments and the obstacles that they encounter in their own development.

  7. Intelligence Issues for Congress

    National Research Council Canada - National Science Library

Best, Jr, Richard A

    2006-01-01

    To address the challenges facing the U.S. Intelligence Community in the 21st century, congressional and executive branch initiatives have sought to improve coordination among the different agencies and to encourage better analysis...

  8. Towards Intelligent Supply Chains

    DEFF Research Database (Denmark)

    Siurdyban, Artur; Møller, Charles

    2012-01-01

... of deploying inapt operations leading to deterioration of profits. To address this problem, we propose a unified business process design framework based on the paradigm of intelligence. Intelligence allows humans and human-designed systems to cope with environmental volatility, and we argue that its principles applied to the context of organizational processes can increase the success rate of business operations. The framework is created using a set of theoretically based constructs grounded in a discussion across several streams of research, including psychology, pedagogy, artificial intelligence, learning, business process management and supply chain management. It outlines a number of system tasks combined in four integrated management perspectives: build, execute, grow and innovate, put forward as business process design propositions for Intelligent Supply Chains.

  9. TH-A-BRC-01: AAPM TG-135U1 QA for Robotic Radiosurgery

    International Nuclear Information System (INIS)

    Dieterich, S.

    2016-01-01

    AAPM TG-135U1 QA for Robotic Radiosurgery - Sonja Dieterich Since the publication of AAPM TG-135 in 2011, the technology of robotic radiosurgery has rapidly developed. AAPM TG-135U1 will provide recommendations on the clinical practice for using the IRIS collimator, fiducial-less real-time motion tracking, and Monte Carlo based treatment planning. In addition, it will summarize currently available literature about uncertainties. Learning Objectives: Understand the progression of technology since the first TG publication Learn which new QA procedures should be implemented for new technologies Be familiar with updates to clinical practice guidelines AAPM TG-178 Gamma Stereotactic Radiosurgery Dosimetry and Quality Assurance - Steven Goetsch Purpose: AAPM Task Group 178 Gamma Stereotactic Radiosurgery Dosimetry and Quality Assurance was formed in August, 2008. The Task Group has 12 medical physicists, two physicians and two consultants. Methods: A round robin dosimetry intercomparison of proposed ionization chambers, electrometer and dosimetry phantoms was conducted over a 15 month period in 2011 and 2012 (Med Phys 42, 11, Nov, 2015). The data obtained at 9 institutions (with ten different Elekta Gamma Knife units) was analyzed by the lead author using several protocols. Results: The most consistent results were obtained using the Elekta ABS 16cm diameter phantom, with the TG-51 protocol modified as recommended by Alfonso et al (Med Phys 35, 11, Nov 2008). A key white paper (Med Phys, in press) sponsored by Elekta Corporation, was used to obtain correction factors for the ionization chambers and phantoms used in this intercomparison. Consistent results were obtained for both Elekta Gamma Knife Model 4C and Gamma Knife Perfexion units as measured with each of two miniature ionization chambers. 
Conclusion: The full report gives clinical history and background of gamma stereotactic radiosurgery, clinical examples and history, quality assurance recommendations and outline

  10. TH-A-BRC-00: New Task Groups for External Beam QA: An Overview

    International Nuclear Information System (INIS)

    2016-01-01

    AAPM TG-135U1 QA for Robotic Radiosurgery - Sonja Dieterich Since the publication of AAPM TG-135 in 2011, the technology of robotic radiosurgery has rapidly developed. AAPM TG-135U1 will provide recommendations on the clinical practice for using the IRIS collimator, fiducial-less real-time motion tracking, and Monte Carlo based treatment planning. In addition, it will summarize currently available literature about uncertainties. Learning Objectives: Understand the progression of technology since the first TG publication Learn which new QA procedures should be implemented for new technologies Be familiar with updates to clinical practice guidelines AAPM TG-178 Gamma Stereotactic Radiosurgery Dosimetry and Quality Assurance - Steven Goetsch Purpose: AAPM Task Group 178 Gamma Stereotactic Radiosurgery Dosimetry and Quality Assurance was formed in August, 2008. The Task Group has 12 medical physicists, two physicians and two consultants. Methods: A round robin dosimetry intercomparison of proposed ionization chambers, electrometer and dosimetry phantoms was conducted over a 15 month period in 2011 and 2012 (Med Phys 42, 11, Nov, 2015). The data obtained at 9 institutions (with ten different Elekta Gamma Knife units) was analyzed by the lead author using several protocols. Results: The most consistent results were obtained using the Elekta ABS 16cm diameter phantom, with the TG-51 protocol modified as recommended by Alfonso et al (Med Phys 35, 11, Nov 2008). A key white paper (Med Phys, in press) sponsored by Elekta Corporation, was used to obtain correction factors for the ionization chambers and phantoms used in this intercomparison. Consistent results were obtained for both Elekta Gamma Knife Model 4C and Gamma Knife Perfexion units as measured with each of two miniature ionization chambers. 
Conclusion: The full report gives clinical history and background of gamma stereotactic radiosurgery, clinical examples and history, quality assurance recommendations and outline

  11. TH-A-BRC-00: New Task Groups for External Beam QA: An Overview

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2016-06-15

    AAPM TG-135U1 QA for Robotic Radiosurgery - Sonja Dieterich Since the publication of AAPM TG-135 in 2011, the technology of robotic radiosurgery has rapidly developed. AAPM TG-135U1 will provide recommendations on the clinical practice for using the IRIS collimator, fiducial-less real-time motion tracking, and Monte Carlo based treatment planning. In addition, it will summarize currently available literature about uncertainties. Learning Objectives: Understand the progression of technology since the first TG publication Learn which new QA procedures should be implemented for new technologies Be familiar with updates to clinical practice guidelines AAPM TG-178 Gamma Stereotactic Radiosurgery Dosimetry and Quality Assurance - Steven Goetsch Purpose: AAPM Task Group 178 Gamma Stereotactic Radiosurgery Dosimetry and Quality Assurance was formed in August, 2008. The Task Group has 12 medical physicists, two physicians and two consultants. Methods: A round robin dosimetry intercomparison of proposed ionization chambers, electrometer and dosimetry phantoms was conducted over a 15 month period in 2011 and 2012 (Med Phys 42, 11, Nov, 2015). The data obtained at 9 institutions (with ten different Elekta Gamma Knife units) was analyzed by the lead author using several protocols. Results: The most consistent results were obtained using the Elekta ABS 16cm diameter phantom, with the TG-51 protocol modified as recommended by Alfonso et al (Med Phys 35, 11, Nov 2008). A key white paper (Med Phys, in press) sponsored by Elekta Corporation, was used to obtain correction factors for the ionization chambers and phantoms used in this intercomparison. Consistent results were obtained for both Elekta Gamma Knife Model 4C and Gamma Knife Perfexion units as measured with each of two miniature ionization chambers. 
Conclusion: The full report gives clinical history and background of gamma stereotactic radiosurgery, clinical examples and history, quality assurance recommendations and outline

  12. TH-A-BRC-01: AAPM TG-135U1 QA for Robotic Radiosurgery

    Energy Technology Data Exchange (ETDEWEB)

    Dieterich, S. [UC Davis Medical Center (United States)

    2016-06-15

    AAPM TG-135U1 QA for Robotic Radiosurgery - Sonja Dieterich Since the publication of AAPM TG-135 in 2011, the technology of robotic radiosurgery has rapidly developed. AAPM TG-135U1 will provide recommendations on the clinical practice for using the IRIS collimator, fiducial-less real-time motion tracking, and Monte Carlo based treatment planning. In addition, it will summarize currently available literature about uncertainties. Learning Objectives: Understand the progression of technology since the first TG publication Learn which new QA procedures should be implemented for new technologies Be familiar with updates to clinical practice guidelines AAPM TG-178 Gamma Stereotactic Radiosurgery Dosimetry and Quality Assurance - Steven Goetsch Purpose: AAPM Task Group 178 Gamma Stereotactic Radiosurgery Dosimetry and Quality Assurance was formed in August, 2008. The Task Group has 12 medical physicists, two physicians and two consultants. Methods: A round robin dosimetry intercomparison of proposed ionization chambers, electrometer and dosimetry phantoms was conducted over a 15 month period in 2011 and 2012 (Med Phys 42, 11, Nov, 2015). The data obtained at 9 institutions (with ten different Elekta Gamma Knife units) was analyzed by the lead author using several protocols. Results: The most consistent results were obtained using the Elekta ABS 16cm diameter phantom, with the TG-51 protocol modified as recommended by Alfonso et al (Med Phys 35, 11, Nov 2008). A key white paper (Med Phys, in press) sponsored by Elekta Corporation, was used to obtain correction factors for the ionization chambers and phantoms used in this intercomparison. Consistent results were obtained for both Elekta Gamma Knife Model 4C and Gamma Knife Perfexion units as measured with each of two miniature ionization chambers. 
Conclusion: The full report gives clinical history and background of gamma stereotactic radiosurgery, clinical examples and history, quality assurance recommendations and outline

  13. Comparability between NQA-1 and the QA programs for analytical laboratories within the nuclear industry and EPA hazardous waste laboratories

    International Nuclear Information System (INIS)

    English, S.L.; Dahl, D.R.

    1989-01-01

There is increasing cooperation between the Department of Energy (DOE), Department of Defense (DOD), and the Environmental Protection Agency (EPA) in the activities associated with monitoring and clean-up of hazardous wastes. Pacific Northwest Laboratory (PNL) examined the quality assurance/quality control programs that the EPA requires of the private sector when performing routine analyses of hazardous wastes to confirm how or if the requirements correspond with PNL's QA program based upon NQA-1. This paper presents the similarities and differences between NQA-1 and the QA programs identified in ASTM-C1009-83, Establishing a QA Program for Analytical Chemistry Laboratories within the Nuclear Industry; EPA QAMS-005/80, Interim Guidelines and Specifications for Preparing Quality Assurance Project Plans, which is referenced in Statements of Work for CERCLA analytical activities; and Chapter 1 of SW-846, which is used in analyses of RCRA samples. The EPA QA programs for hazardous waste analyses are easily encompassed within an already established NQA-1 QA program. A few new terms are introduced and there is an increased emphasis on QC/verification, but many of the same basic concepts appear in all the programs.

  14. WE-AB-201-02: TPS Commissioning and QA: A Process Orientation and Application of Control Charts

    Energy Technology Data Exchange (ETDEWEB)

    Sharpe, M. [The Princess Margaret Cancer Centre - UHN (Canada)

    2015-06-15

    Treatment planning systems (TPS) are a cornerstone of modern radiation therapy. Errors in their commissioning or use can have a devastating impact on many patients. To support safe and high quality care, medical physicists must conduct efficient and proper commissioning, good clinical integration, and ongoing quality assurance (QA) of the TPS. AAPM Task Group 53 and related publications have served as seminal benchmarks for TPS commissioning and QA over the past two decades. Over the same time, continuing innovations have made the TPS even more complex and more central to the clinical process. Medical goals are now expressed in terms of the dose and margins around organs and tissues that are delineated from multiple imaging modalities (CT, MR and PET); and even temporally resolved (i.e., 4D) imaging. This information is passed on to optimization algorithms to establish accelerator movements that are programmed directly for IMRT, VMAT and stereotactic treatments. These advances have made commissioning and QA of the TPS much more challenging. This education session reviews up-to-date experience and guidance on this subject; including the recently published AAPM Medical Physics Practice Guideline (MPPG) #5 “Commissioning and QA of Treatment Planning Dose Calculations: Megavoltage Photon and Electron Beams”. Treatment Planning System Commissioning and QA: Challenges and Opportunities (Greg Salomons) This session will provide some key background and review publications describing prominent incidents relating to TPS commissioning and QA. Traditional approaches have been hardware and feature oriented. They aim to establish a functional configuration and establish specifications for regular testing of features (like dose calculation) to assure stable operation and detect failures. With the advent of more complex systems, more patient-specific testing has also been adopted. A number of actual TPS defects will be presented along with heuristics for identifying similar
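The session title refers to applying control charts to TPS QA: measurements from regular feature tests are tracked against statistically derived limits so that drifts and failures stand out. A minimal Shewhart-style sketch, with illustrative data rather than anything from the session:

```python
import statistics

def control_limits(baseline, k=3.0):
    """Center line and k-sigma limits from a baseline of in-control QA results."""
    mean = statistics.mean(baseline)
    sd = statistics.stdev(baseline)
    return mean, mean - k * sd, mean + k * sd

def out_of_control(values, baseline, k=3.0):
    """Indices of measurements falling outside the k-sigma limits."""
    _, lo, hi = control_limits(baseline, k)
    return [i for i, v in enumerate(values) if not (lo <= v <= hi)]

# Hypothetical output ratios (measured/expected) for a recurring dose-calculation check.
baseline = [1.000, 0.998, 1.002, 0.999, 1.001, 1.000, 0.997, 1.003]
new_runs = [1.001, 0.999, 1.015]  # the third point lies well outside 3 sigma
print(out_of_control(new_runs, baseline))
```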

  15. WE-AB-201-02: TPS Commissioning and QA: A Process Orientation and Application of Control Charts

    International Nuclear Information System (INIS)

    Sharpe, M.

    2015-01-01

    Treatment planning systems (TPS) are a cornerstone of modern radiation therapy. Errors in their commissioning or use can have a devastating impact on many patients. To support safe and high quality care, medical physicists must conduct efficient and proper commissioning, good clinical integration, and ongoing quality assurance (QA) of the TPS. AAPM Task Group 53 and related publications have served as seminal benchmarks for TPS commissioning and QA over the past two decades. Over the same time, continuing innovations have made the TPS even more complex and more central to the clinical process. Medical goals are now expressed in terms of the dose and margins around organs and tissues that are delineated from multiple imaging modalities (CT, MR and PET); and even temporally resolved (i.e., 4D) imaging. This information is passed on to optimization algorithms to establish accelerator movements that are programmed directly for IMRT, VMAT and stereotactic treatments. These advances have made commissioning and QA of the TPS much more challenging. This education session reviews up-to-date experience and guidance on this subject; including the recently published AAPM Medical Physics Practice Guideline (MPPG) #5 “Commissioning and QA of Treatment Planning Dose Calculations: Megavoltage Photon and Electron Beams”. Treatment Planning System Commissioning and QA: Challenges and Opportunities (Greg Salomons) This session will provide some key background and review publications describing prominent incidents relating to TPS commissioning and QA. Traditional approaches have been hardware and feature oriented. They aim to establish a functional configuration and establish specifications for regular testing of features (like dose calculation) to assure stable operation and detect failures. With the advent of more complex systems, more patient-specific testing has also been adopted. A number of actual TPS defects will be presented along with heuristics for identifying similar

  16. Estimation of eye lens dose during brain scans using Gafchromic XR-QA2 film in various multidetector CT scanners

    International Nuclear Information System (INIS)

    Akhilesh, Philomina; Jamhale, Shramika H.; Sharma, S.D.; Kumar, Rajesh; Datta, D.; Kulkarni, Arti R.

    2017-01-01

    The purpose of this study was to estimate eye lens dose during brain scans in 16-, 64-, 128- and 256-slice multidetector computed tomography (CT) scanners in helical acquisition mode and to test the feasibility of using radiochromic film as eye lens dosemeter during CT scanning. Eye lens dose measurements were performed using Gafchromic XR-QA2 film on a polystyrene head phantom designed with outer dimensions equivalent to the head size of a reference Indian man. The response accuracy of XR-QA2 film was validated by using thermoluminescence dosemeters. The eye lens dose measured using XR-QA2 film on head phantom for plain brain scanning in helical mode ranged from 43.8 to 45.8 mGy. The XR-QA2 film measured dose values were in agreement with TLD measured dose values within a maximum variation of 8.9%. The good correlation between the two data sets confirms the viability of using XR-QA2 film for eye lens dosimetry. (authors)

  17. Business Intelligence Integrated Solutions

    Directory of Open Access Journals (Sweden)

    Cristescu Marian Pompiliu

    2017-01-01

    Full Text Available This paper shows how businesses make decisions better and faster with respect to customers, partners and operations by turning data into valuable business information. The paper describes how to bring together people and business intelligence information to achieve successful business strategies. Business intelligence projects can be developed in large and medium-sized organizations using only the Microsoft product described in the paper, and possible alternatives can be discussed according to the required features.

  18. Artificial intelligence in medicine.

    OpenAIRE

    Ramesh, A. N.; Kambhampati, C.; Monson, J. R. T.; Drew, P. J.

    2004-01-01

    INTRODUCTION: Artificial intelligence is a branch of computer science capable of analysing complex medical data. Its potential to exploit meaningful relationships within a data set can be used in diagnosis, treatment and predicting outcome in many clinical scenarios. METHODS: Medline and internet searches were carried out using the keywords 'artificial intelligence' and 'neural networks (computer)'. Further references were obtained by cross-referencing from key articles. An overview of ...

  19. Artificial Intelligence Study (AIS).

    Science.gov (United States)

    1987-02-01

    [Garbled OCR front matter from a scanned DTIC report. Recoverable details: Artificial Intelligence Study (AIS), report CAA-RP-87-1, February 1987, prepared by the US Army Concepts Analysis Agency, Bethesda, MD; the table of contents lists sections on AI architecture and AI hardware.]

  20. Artificial Intelligence in Astronomy

    Science.gov (United States)

    Devinney, E. J.; Prša, A.; Guinan, E. F.; Degeorge, M.

    2010-12-01

    From the perspective (and bias) as Eclipsing Binary researchers, we give a brief overview of the development of Artificial Intelligence (AI) applications, describe major application areas of AI in astronomy, and illustrate the power of an AI approach in an application developed under the EBAI (Eclipsing Binaries via Artificial Intelligence) project, which employs Artificial Neural Network technology for estimating light curve solution parameters of eclipsing binary systems.

  1. Minimally Naturalistic Artificial Intelligence

    OpenAIRE

    Hansen, Steven Stenberg

    2017-01-01

    The rapid advancement of machine learning techniques has re-energized research into general artificial intelligence. While the idea of domain-agnostic meta-learning is appealing, this emerging field must come to terms with its relationship to human cognition and the statistics and structure of the tasks humans perform. The position of this article is that only by aligning our agents' abilities and environments with those of humans do we stand a chance at developing general artificial intellig...

  2. Artificial intelligence in cardiology

    OpenAIRE

    Bonderman, Diana

    2017-01-01

    Summary Decision-making is complex in modern medicine and should ideally be based on available data, structured knowledge and proper interpretation in the context of an individual patient. Automated algorithms, also termed artificial intelligence that are able to extract meaningful patterns from data collections and build decisions upon identified patterns may be useful assistants in clinical decision-making processes. In this article, artificial intelligence-based studies in clinical cardiol...

  3. Intelligent distributed computing

    CERN Document Server

    Thampi, Sabu

    2015-01-01

    This book contains a selection of refereed and revised papers of the Intelligent Distributed Computing Track originally presented at the third International Symposium on Intelligent Informatics (ISI-2014), September 24-27, 2014, Delhi, India.  The papers selected for this Track cover several Distributed Computing and related topics including Peer-to-Peer Networks, Cloud Computing, Mobile Clouds, Wireless Sensor Networks, and their applications.

  4. Intelligent Lighting Control System

    OpenAIRE

    García, Elena; Rodríguez González, Sara; de Paz Santana, Juan F.; Bajo Pérez, Javier

    2014-01-01

    This paper presents an adaptive architecture that allows centralized control of public lighting and intelligent management, in order to economise on lighting and maintain maximum comfort status of the illuminated areas. To carry out this management, architecture merges various techniques of artificial intelligence (AI) and statistics such as artificial neural networks (ANN), multi-agent systems (MAS), EM algorithm, methods based on ANOVA and a Service Oriented Aproach (SOA). It performs optim...

  5. Deep nets vs expert designed features in medical physics: An IMRT QA case study.

    Science.gov (United States)

    Interian, Yannet; Rideout, Vincent; Kearney, Vasant P; Gennatas, Efstathios; Morin, Olivier; Cheung, Joey; Solberg, Timothy; Valdes, Gilmer

    2018-03-30

    The purpose of this study was to compare the performance of Deep Neural Networks against a technique designed by domain experts in the prediction of gamma passing rates for Intensity Modulated Radiation Therapy Quality Assurance (IMRT QA). A total of 498 IMRT plans across all treatment sites were planned in Eclipse version 11 and delivered using a dynamic sliding window technique on Clinac iX or TrueBeam Linacs. Measurements were performed using a commercial 2D diode array, and passing rates for 3%/3 mm local dose/distance-to-agreement (DTA) were recorded. Separately, fluence maps calculated for each plan were used as inputs to a convolutional neural network (CNN). The CNNs were trained to predict IMRT QA gamma passing rates using TensorFlow and Keras. A set of model architectures, inspired by the convolutional blocks of the VGG-16 ImageNet model, were constructed and implemented. Synthetic data, generated by rotating and translating the fluence maps during training, was used to boost the performance of the CNNs. Dropout, batch normalization, and data augmentation were utilized to help train the model. The performance of the CNNs was compared to a generalized Poisson regression model, previously developed for this application, which used 78 expert-designed features. Deep Neural Networks without domain knowledge achieved comparable performance to a baseline system designed by domain experts in the prediction of 3%/3 mm local gamma passing rates. An ensemble of neural nets resulted in a mean absolute error (MAE) of 0.70 ± 0.05 and the domain expert model resulted in an MAE of 0.74 ± 0.06. Convolutional neural networks (CNNs) with transfer learning can predict IMRT QA passing rates by automatically designing features from the fluence maps without human expert supervision. Predictions from CNNs are comparable to a system carefully designed by physicist experts. © 2018 American Association of Physicists in Medicine.
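The augmentation step the abstract describes (rotating and translating fluence maps to create synthetic training data) can be sketched as below. This is a minimal illustration only: the map size, the restriction to 90-degree rotations, and the shift range are assumptions, not values from the study.

```python
import numpy as np

def augment_fluence_map(fmap, rng):
    """Return a randomly rotated and translated copy of a 2D fluence map.

    Rotations are restricted to 90-degree multiples and shifts to a few
    pixels (illustrative choices, not the study's actual parameters).
    """
    # Random rotation by 0, 90, 180 or 270 degrees
    out = np.rot90(fmap, k=int(rng.integers(0, 4)))
    # Random translation of up to +/-3 pixels along each axis;
    # np.roll wraps around, which is harmless when map edges are near zero
    dy, dx = rng.integers(-3, 4, size=2)
    out = np.roll(out, shift=(int(dy), int(dx)), axis=(0, 1))
    return out

rng = np.random.default_rng(0)
fmap = np.zeros((64, 64))
fmap[20:40, 25:35] = 1.0          # toy "open aperture" fluence region
augmented = [augment_fluence_map(fmap, rng) for _ in range(8)]
# Rotation and (wrapped) translation both preserve total fluence
print(all(np.isclose(a.sum(), fmap.sum()) for a in augmented))
```

Because these transforms leave the delivered fluence pattern physically plausible, each plan can yield many label-preserving training samples.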

  6. Developing a mailed phantom to implement a local QA program in Egypt radiotherapy centers

    Science.gov (United States)

    Soliman, H. A.; Aletreby, M.

    2016-07-01

    In this work, a simple method that differs from the IAEA/WHO thermoluminescent dosimeter (TLD) postal quality assurance (QA) program is developed. A small perspex (polymethyl methacrylate, PMMA) phantom measuring 50 mm × 50 mm × 50 mm was constructed for absorbed dose verification of high-energy photon beams in some major radiotherapy centers in Egypt. The phantom weighs only 140.7 g, with two buildup covers weighing 14.8 and 43.19 g for the Cobalt-60 and 6-MV X-ray beams, respectively. This phantom is intended for use in future external audit/QA services in Egypt for the first time. TLD-700 chips are used for testing and investigating a convenient national dosimetry QA program. Although the methodology is comparable to previously introduced systems, the new system is smaller, lighter, and made of a more readily available material. A comparison with previous similar designs is included. Theoretical calculations were performed with the commercial Eclipse treatment planning system, implementing the pencil beam convolution algorithm, to verify the accuracy of the experimentally determined water-to-perspex dose conversion factor. The new small phantom and methodology were applied in 10 participating radiotherapy centers. The absorbed dose was verified under reference conditions for both 60Co and 6-MV high-energy photon beams. The checked beams were within the 5% limit except for four photon beams. There was an agreement of 0.2% between our experimental data and previously published values, confirming the validity of the applied method for verifying radiotherapy absorbed dose.
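The audit logic described above, checking each beam's measured dose against the stated dose with a 5% tolerance, can be sketched as follows. The function name and the dose values are hypothetical; only the 5% limit comes from the abstract.

```python
# Hypothetical audit check: flag beams whose TLD-measured dose deviates
# from the stated dose by more than the 5% tolerance used in the study.
def audit_beams(results, tolerance=0.05):
    flagged = []
    for name, stated, measured in results:
        deviation = (measured - stated) / stated
        if abs(deviation) > tolerance:
            flagged.append((name, round(deviation * 100, 1)))
    return flagged

# Illustrative numbers only (not data from the Egyptian audit)
beams = [
    ("Center A Co-60", 2.00, 2.04),   # +2.0% -> within tolerance
    ("Center B 6 MV",  2.00, 2.13),   # +6.5% -> flagged
]
print(audit_beams(beams))  # -> [('Center B 6 MV', 6.5)]
```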

  7. A novel approach to EPID-based 3D volumetric dosimetry for IMRT and VMAT QA

    Science.gov (United States)

    Alhazmi, Abdulaziz; Gianoli, Chiara; Neppl, Sebastian; Martins, Juliana; Veloza, Stella; Podesta, Mark; Verhaegen, Frank; Reiner, Michael; Belka, Claus; Parodi, Katia

    2018-06-01

    Intensity modulated radiation therapy (IMRT) and volumetric modulated arc therapy (VMAT) are relatively complex treatment delivery techniques and require quality assurance (QA) procedures. Pre-treatment dosimetric verification represents a fundamental QA procedure in daily clinical routine in radiation therapy. The purpose of this study is to develop an EPID-based approach to reconstruct a 3D dose distribution as imparted to a virtual cylindrical water phantom, to be used for plan-specific pre-treatment dosimetric verification of IMRT and VMAT plans. For each depth, the planar 2D dose distributions acquired in air were back-projected and convolved with depth-specific scatter and attenuation kernels. The kernels were obtained by making use of scatter and attenuation models, iteratively estimating the parameters from a set of reference measurements. The derived parameters served as a look-up table for the reconstruction of arbitrary measurements. The summation of the reconstructed 3D dose distributions resulted in the integrated 3D dose distribution of the treatment delivery. The accuracy of the proposed approach was validated on clinical IMRT and VMAT plans by means of gamma evaluation, comparing the reconstructed 3D dose distributions with Octavius measurements. The comparison was carried out using (3%, 3 mm) criteria, scoring 99% and 96% passing rates for IMRT and VMAT, respectively. An accuracy comparable to that of the commercial device for 3D volumetric dosimetry was demonstrated. In addition, five IMRT and five VMAT plans were validated against the 3D dose calculation performed by the TPS in a water phantom using the same passing rate criteria. The median passing rate across the ten treatment plans was 97.3%, and the lowest was 95%. Moreover, the reconstructed 3D distribution is obtained without predictions relying on forward dose calculation and without an external phantom or dosimetric devices. Thus, the approach provides a fully automated, fast and easy QA
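The per-depth reconstruction step (back-project the in-air EPID plane, scale it by an attenuation factor, convolve it with a depth-specific scatter kernel) can be sketched as below. A normalized Gaussian stands in for the study's fitted kernels, and all numbers are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(sigma, size=9):
    """Normalized 2D Gaussian used here as a stand-in scatter kernel."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
    return k / k.sum()

def reconstruct_depth_dose(epid_image, attenuation, sigma):
    """Back-project one EPID plane to a given depth: scale by an
    attenuation factor, then convolve with a depth-specific scatter
    kernel (the study instead fits kernel parameters iteratively to
    reference measurements and stores them in a look-up table)."""
    k = gaussian_kernel(sigma)
    pad = k.shape[0] // 2
    img = np.pad(epid_image * attenuation, pad, mode="edge")
    out = np.zeros_like(epid_image, dtype=float)
    for i in range(epid_image.shape[0]):      # direct 2D convolution
        for j in range(epid_image.shape[1]):
            out[i, j] = np.sum(img[i:i + k.shape[0], j:j + k.shape[1]] * k)
    return out

epid = np.ones((16, 16))                       # toy uniform EPID plane
dose = reconstruct_depth_dose(epid, attenuation=0.8, sigma=1.5)
print(round(float(dose[8, 8]), 3))             # -> 0.8
```

Stacking such planes over all depths, one per delivered segment, and summing them yields the integrated 3D distribution the abstract describes.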

  8. Professionalizing Intelligence Analysis

    Directory of Open Access Journals (Sweden)

    James B. Bruce

    2015-09-01

    Full Text Available This article examines the current state of professionalism in national security intelligence analysis in the U.S. Government. Since the introduction of major intelligence reforms directed by the Intelligence Reform and Terrorism Prevention Act (IRTPA in December, 2004, we have seen notable strides in many aspects of intelligence professionalization, including in analysis. But progress is halting, uneven, and by no means permanent. To consolidate its gains, and if it is to continue improving, the U.S. intelligence community (IC should commit itself to accomplishing a new program of further professionalization of analysis to ensure that it will develop an analytic cadre that is fully prepared to deal with the complexities of an emerging multipolar and highly dynamic world that the IC itself is forecasting. Some recent reforms in intelligence analysis can be assessed against established standards of more fully developed professions; these may well fall short of moving the IC closer to the more fully professionalized analytical capability required for producing the kind of analysis needed now by the United States.

  9. GABA predicts visual intelligence.

    Science.gov (United States)

    Cook, Emily; Hammett, Stephen T; Larsson, Jonas

    2016-10-06

    Early psychological researchers proposed a link between intelligence and low-level perceptual performance. It was recently suggested that this link is driven by individual variations in the ability to suppress irrelevant information, evidenced by the observation of strong correlations between perceptual surround suppression and cognitive performance. However, the neural mechanisms underlying such a link remain unclear. A candidate mechanism is neural inhibition by gamma-aminobutyric acid (GABA), but direct experimental support for GABA-mediated inhibition underlying suppression is inconsistent. Here we report evidence consistent with a global suppressive mechanism involving GABA underlying the link between sensory performance and intelligence. We measured visual cortical GABA concentration, visuo-spatial intelligence and visual surround suppression in a group of healthy adults. Levels of GABA were strongly predictive of both intelligence and surround suppression, with higher levels of intelligence associated with higher levels of GABA and stronger surround suppression. These results indicate that GABA-mediated neural inhibition may be a key factor determining cognitive performance and suggest a physiological mechanism linking surround suppression and intelligence. Copyright © 2016 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  10. Alzheimer's disease and intelligence.

    Science.gov (United States)

    Yeo, R A; Arden, R; Jung, R E

    2011-06-01

    A significant body of evidence has accumulated suggesting that individual variation in intellectual ability, whether assessed directly by intelligence tests or indirectly through proxy measures, is related to risk of developing Alzheimer's disease (AD) in later life. Important questions remain unanswered, however, such as the specificity of risk for AD vs. other forms of dementia, and the specific links between premorbid intelligence and development of the neuropathology characteristic of AD. Lower premorbid intelligence has also emerged as a risk factor for greater mortality across myriad health and mental health diagnoses. Genetic covariance contributes importantly to these associations, and pleiotropic genetic effects may impact diverse organ systems through similar processes, including inefficient design and oxidative stress. Through such processes, the genetic underpinnings of intelligence, specifically, mutation load, may also increase the risk of developing AD. We discuss how specific neurobiologic features of relatively lower premorbid intelligence, including reduced metabolic efficiency, may facilitate the development of AD neuropathology. The cognitive reserve hypothesis, the most widely accepted account of the intelligence-AD association, is reviewed in the context of this larger literature.

  11. Logic Programs as a Specification and Description Tool in the Design Process of an Intelligent Tutoring System

    OpenAIRE

    Möbus, Claus

    1987-01-01

    We propose the use of logic programs when designing intelligent tutoring systems. With their help we specified the small-step semantics of the learning curriculum, designed the graphical user interface, derived instructions and modelled students' knowledge.

  12. Evaluation of Uncertainty of IMRT QA Using 2 Dimensional Array Detector for Head and Neck Patients

    International Nuclear Information System (INIS)

    Ban, Tae Joon; Lee, Woo Suk; Kim, Dae Sup; Baek, Geum Mun; Kwak, Jung Won

    2011-01-01

    IMRT QA using a 2-dimensional array detector is clinically carried out on a discrete dose distribution, which can affect the uncertainty of evaluation using the gamma method. We analyzed gamma index variation according to grid size and suggest a valid range of grid sizes for IMRT QA in our hospital. We performed QA using OmniPro I'mRT system software version 1.7b on 10 head and neck IMRT patients. The reference dose plane (grid size, 0.1 cm; location, [0, 0, 0]) from the RTP was compared with dose planes of different grid sizes (0.1 cm, 0.5 cm, 1.0 cm, 2.0 cm, 4.0 cm) and different locations (along the Y-axis: 0 cm, 0.2 cm, 0.5 cm, 1.0 cm). The gamma index variation was evaluated by observing the changes in gamma pass rate, average signal and standard deviation for each case. The average signal for each grid size showed differences of 0%, -0.19%, -0.04%, -0.46% and -8.32%, and the standard deviation showed differences of 0%, -0.30%, 1.24%, -0.70% and -7.99%. The gamma pass rate for each grid size showed differences of 0%, 0.27%, -1.43%, 5.32% and 5.60%. The gamma evaluation results according to distance, for grid sizes of 0.1 cm to 1.0 cm, showed good agreement with the reference condition (grid size 0.1 cm) within 1.5%, but deviated by over 5% when the grid size was 2.0 cm or greater. We recognize that the grid size used in gamma evaluation can introduce errors into IMRT QA, so the uncertainty of gamma evaluation according to grid size must be considered, and a grid size smaller than 2 cm should be applied clinically to reduce error and increase accuracy.
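The gamma method at the heart of this record compares each reference point against all evaluated points, combining dose difference and spatial distance into one index. A brute-force global-gamma sketch is shown below; it is an illustration of the metric on a toy dose map, not a clinical tool, and the Gaussian test pattern is an assumption.

```python
import numpy as np

def gamma_pass_rate(ref, ev, spacing, dose_tol=0.03, dist_tol=3.0):
    """Brute-force global 2D gamma pass rate: the fraction of reference
    points with gamma <= 1. `spacing` is the grid spacing in mm, and the
    dose tolerance is relative to the reference maximum (global gamma)."""
    dd = dose_tol * ref.max()
    ys, xs = np.meshgrid(np.arange(ref.shape[0]) * spacing,
                         np.arange(ref.shape[1]) * spacing, indexing="ij")
    gammas = np.empty(ref.shape)
    for i in range(ref.shape[0]):
        for j in range(ref.shape[1]):
            # Squared distance and dose terms against every evaluated point
            dist2 = (ys - i * spacing) ** 2 + (xs - j * spacing) ** 2
            dose2 = (ev - ref[i, j]) ** 2
            gammas[i, j] = np.sqrt(np.min(dist2 / dist_tol**2 + dose2 / dd**2))
    return float((gammas <= 1.0).mean())

# Toy Gaussian "dose" map; identical maps must pass everywhere
ref = np.fromfunction(lambda i, j: np.exp(-((i - 10)**2 + (j - 10)**2) / 50),
                      (21, 21))
print(gamma_pass_rate(ref, ref, spacing=1.0))  # -> 1.0
```

Coarsening `spacing` while keeping `dist_tol` fixed changes which neighbors are searched, which is one way the grid-size dependence reported above can arise.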

  13. Beware of Imitators: Al-Qa’ida through the Lens of its Confidential Secretary

    Science.gov (United States)

    2012-06-04

    describe al-Qa`ida’s ideology are distortions which follow typologies devised by the West.45 He rejects all labels, such as “Wahhabis,” “salafi...political or economic target, resulting in the death of 19 people: 14 German tourists; a French citizen; and four Tunisians.74 In addition, Harun lists...decision-making. He disagrees with the attacks against tourists in July 2007,99 deeming them to be fruitless in so far as serving the causes of the umma

  14. From Q&A to Slumdog Millionaire: it’s written

    OpenAIRE

    Bulger, Laura Fernanda

    2009-01-01

    In this paper, we seek to analyse the film adaptation of Vikas Swarup’s novel, Q&A, published in 2005. Slumdog Millionaire was directed by British filmmaker Danny Boyle and released in 2008. Thus, three years after its publication, Vikas Swarup’s novel was turned into a blockbuster earning successive awards including the 2009 Best Picture Award from the Hollywood Academy. Its success was not the result of a mega “business strategy”; it was largely due to Danny Boyle’s direction and Simon Beauf...

  15. Intelligent pressure measurement in multiple sensor arrays

    International Nuclear Information System (INIS)

    Matthews, C.A.

    1995-01-01

    Pressure data acquisition has typically consisted of a group of sensors scanned by an electronic or mechanical multiplexer. The data accuracy was dependent upon the temperature stability of the sensors. This paper describes a new method of pressure measurement that combines individual temperature compensated pressure sensors, a microprocessor, and an A/D converter in one module. Each sensor has its own temperature characteristics stored in a look-up table to minimize sensor thermal errors. The result is an intelligent pressure module that can output temperature compensated engineering units over an Ethernet interface. Calibration intervals can be dramatically extended depending upon system accuracy requirements and calibration techniques used
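The per-sensor look-up-table compensation described above can be sketched as follows: interpolate gain and offset from stored calibration points at the measured temperature, then convert raw counts to engineering units. The table values, units, and function names are hypothetical, not from the paper.

```python
import bisect

# Hypothetical per-sensor calibration table: at each calibration
# temperature (deg C), the raw-count-to-pressure gain and offset.
CAL_TABLE = [
    # (temp_C, gain_kPa_per_count, offset_kPa)
    (0.0,  0.0102, -1.3),
    (25.0, 0.0100,  0.0),
    (50.0, 0.0097,  1.1),
]

def compensated_pressure(raw_counts, temp_c):
    """Linearly interpolate gain/offset from the stored calibration
    table, then convert raw counts to engineering units (kPa)."""
    temps = [t for t, _, _ in CAL_TABLE]
    # Clamp to the table range, then interpolate between bracketing rows
    if temp_c <= temps[0]:
        _, g, o = CAL_TABLE[0]
    elif temp_c >= temps[-1]:
        _, g, o = CAL_TABLE[-1]
    else:
        hi = bisect.bisect_right(temps, temp_c)
        (t0, g0, o0), (t1, g1, o1) = CAL_TABLE[hi - 1], CAL_TABLE[hi]
        f = (temp_c - t0) / (t1 - t0)
        g = g0 + f * (g1 - g0)
        o = o0 + f * (o1 - o0)
    return raw_counts * g + o

print(round(compensated_pressure(10000, 25.0), 2))  # -> 100.0
```

In the module described by the paper, this correction runs on the embedded microprocessor so that only temperature-compensated engineering units leave the device over Ethernet.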

  16. An intelligent interlock design support system

    International Nuclear Information System (INIS)

    Hayashi, Toshifumi; Kamiyama, Masahiko

    1990-01-01

    This paper presents an intelligent interlock design support system, called Handy. BWR plant interlocks have been designed on a conventional CAD system operating on a mini-computer based time sharing system. However, its ability to support interlock designers is limited, mainly due to the system not being capable of manipulating the interlock logic. Handy improves the design efficiency with consistent manipulation of the logic and drawings, interlock simulation, versatile database management, object oriented user interface, high resolution high speed graphics, and automatic interlock outlining with a design support expert system. Handy is now being tested by designers, and is expected to greatly contribute to their efficiency. (author)

  17. Smart Optoelectronic Sensors and Intelligent Sensor Systems

    Directory of Open Access Journals (Sweden)

    Sergey Y. YURISH

    2012-03-01

    Full Text Available Light-to-frequency converters are widely used in various optoelectronic sensor systems. However, the subsequent frequency-to-digital conversion is a bottleneck in such systems due to the broad frequency range of light-to-frequency converters’ outputs. This paper describes an effective OEM design approach, which can be used for smart and intelligent sensor system design. The design is based on a novel, multifunctional integrated circuit, the Universal Sensors & Transducers Interface, especially designed for such sensor applications. Experimental results have confirmed the efficiency of this approach and its high metrological performance.

  18. Artificial intelligence applied to process signal analysis

    Science.gov (United States)

    Corsberg, Dan

    1988-01-01

    Many space station processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect of the human/machine interface is the analysis and display of process information. Human operators can be overwhelmed by large clusters of alarms that inhibit their ability to diagnose and respond to a disturbance. Using artificial intelligence techniques and a knowledge base approach to this problem, the power of the computer can be used to filter and analyze plant sensor data. This will provide operators with a better description of the process state. Once a process state is recognized, automatic action could be initiated and proper system response monitored.

  19. Automated Intelligent Assistant for mass spectrometry operation

    International Nuclear Information System (INIS)

    Filby, E.E.; Rankin, R.A.; Yoshida, D.E.

    1991-01-01

    The Automated Intelligent Assistant is designed to insure that our mass spectrometers produce timely, high-quality measurement data. The design combines instrument interfacing and expert system technology to automate an adaptable set-point damage prevention strategy. When shutdowns occur, the Assistant can help guide troubleshooting efforts. Stored real-time data will help our development program upgrade and improve the system, and also make it possible to re-run previously-observed instrument problems as ''live'' training exercises for the instrument operators. Initial work has focused on implementing the Assistant for the instrument ultra-high vacuum components. 14 refs., 5 figs

  20. Space Communication Artificial Intelligence for Link Evaluation Terminal (SCAILET)

    Science.gov (United States)

    Shahidi, Anoosh K.; Schlegelmilch, Richard F.; Petrik, Edward J.; Walters, Jerry L.

    1992-01-01

    A software application to assist end-users of the high burst rate (HBR) link evaluation terminal (LET) for satellite communications is being developed. The HBR LET system developed at NASA Lewis Research Center is an element of the Advanced Communications Technology Satellite (ACTS) Project. The HBR LET is divided into seven major subsystems, each with its own expert. Programming scripts, which are test procedures defined by the design engineers, set up the HBR LET system. These programming scripts are cryptic, hard to maintain, and require a steep learning curve; they were developed by the system engineers, who will not be available to the end-users of the system. To increase end-user productivity, a friendly interface needs to be added to the system. One possible solution is to provide the user with adequate documentation to perform the needed tasks. Given the complexity of this system, the vast amount of documentation needed would be overwhelming and the information would be hard to retrieve. With limited resources, maintenance is another reason for not using this form of documentation. An advanced form of interaction is being explored using current computer techniques. This application, which incorporates a combination of multimedia and artificial intelligence (AI) techniques to provide end-users with an intelligent interface to the HBR LET system, comprises an intelligent assistant, intelligent tutoring, and hypermedia documentation. The intelligent assistant and tutoring systems address the critical programming needs of the end-user.

  1. Measuring Emotion Regulation with Single Dry Electrode Brain Computer Interface

    NARCIS (Netherlands)

    van der Wal, C.N.; Irrmischer, M.; Guo, Y.; Friston, K.; Faisal, A.; Hill, S.; Peng, H.

    2015-01-01

    Wireless brain computer interfaces (BCIs) are promising for new intelligent applications in which emotions are detected by measuring brain activity. Applications such as serious games and video game therapy measure and use the user’s emotional state in order to determine the intensity

  2. 78 FR 90 - Defense Intelligence Agency National Intelligence University Board of Visitors Closed Meeting

    Science.gov (United States)

    2013-01-02

    ... DEPARTMENT OF DEFENSE Office of the Secretary Defense Intelligence Agency National Intelligence University Board of Visitors Closed Meeting AGENCY: National Intelligence University, Defense Intelligence... hereby given that a closed meeting of the National Intelligence University Board of Visitors has been...

  3. Integrated design of intelligent surveillance systems and their user interface

    NARCIS (Netherlands)

    Toet, A.

    2005-01-01

    Modern complex surveillance systems consisting of multiple and heterogeneous sensors, automatic information registration and data analysis techniques, and decision support tools should provide the human operator an integrated, transparent and easily comprehensible view of the surveyed scene.

  4. Intelligent and Adaptive Interface (IAI) for Cognitive Cockpit (CC)

    Science.gov (United States)

    2004-03-31

    goals3 and plans and generating system plans would be incorporated as task knowledge. The Dialogue Model, which is currently undeveloped in LOCATE...pieces of software. Modularity can also serve to improve the organisational effectiveness of software, whereby a suitable division of labour among...a sophisticated tool in support of future combat aircraft acquisition. While CA can monitor similar activities in countries like the UK and USA we

  5. Profiling nonhuman intelligence: An exercise in developing unbiased tools for describing other "types" of intelligence on earth

    Science.gov (United States)

    Herzing, Denise L.

    2014-02-01

    Intelligence has historically been studied by comparing nonhuman cognitive and language abilities with human abilities. Primate-like species, which show human-like anatomy and share evolutionary lineage, have been the most studied. However, when comparing animals of non-primate origins our abilities to profile the potential for intelligence remains inadequate. Historically our measures for nonhuman intelligence have included a variety of tools: (1) physical measurements - brain to body ratio, brain structure/convolution/neural density, presence of artifacts and physical tools, (2) observational and sensory measurements - sensory signals, complexity of signals, cross-modal abilities, social complexity, (3) data mining - information theory, signal/noise, pattern recognition, (4) experimentation - memory, cognition, language comprehension/use, theory of mind, (5) direct interfaces - one way and two way interfaces with primates, dolphins, birds and (6) accidental interactions - human/animal symbiosis, cross-species enculturation. Because humans tend to focus on "human-like" attributes and measures and scientists are often unwilling to consider other "types" of intelligence that may not be human equated, our abilities to profile "types" of intelligence that differ on a variety of scales is weak. Just as biologists stretch their definitions of life to look at extremophiles in unusual conditions, so must we stretch our descriptions of types of minds and begin profiling, rather than equating, other life forms we may encounter.

  6. Business Intelligence using Software Agents

    Directory of Open Access Journals (Sweden)

    Ana-Ramona BOLOGA

    2011-12-01

    Full Text Available This paper presents some ideas about business intelligence today and the importance of developing real-time business solutions. The authors explore the links between business intelligence and artificial intelligence and focus specifically on the implementation of software agent-based systems in business intelligence. Some of the few solutions proposed so far that use software agents' properties for the benefit of business intelligence are briefly presented. The authors then propose some basic ideas for developing a real-time agent-based software system for business intelligence in supply chain management, using Case-Based Reasoning agents.

  7. Fluid intelligence: A brief history.

    Science.gov (United States)

    Kent, Phillip

    2017-01-01

    The concept of fluid and crystallized intelligence was introduced to the psychological community approximately 75 years ago by Raymond B. Cattell, and it continues to be an area of active research and controversy. The purpose of this paper is to provide a brief overview of the origin of the concept, early efforts to define intelligence and uses of intelligence tests to address pressing social issues, and the ongoing controversies associated with fluid intelligence and the structure of intelligence. The putative neuropsychological underpinnings and neurological substrates of fluid intelligence are discussed.

  8. SU-E-T-392: Evaluation of Ion Chamber/film and Log File Based QA to Detect Delivery Errors

    International Nuclear Information System (INIS)

    Nelson, C; Mason, B; Kirsner, S; Ohrt, J

    2015-01-01

    Purpose: Ion chamber and film (ICAF) is a method used to verify patient dose prior to treatment. More recently, log file based QA has been shown to be an alternative to measurement based QA. In this study, we delivered VMAT plans with and without errors to determine whether ICAF and/or log file based QA was able to detect the errors. Methods: Using two VMAT patients, the original treatment plan plus 7 additional plans with delivery errors introduced were generated and delivered. The erroneous plans had gantry, collimator, MLC, gantry and collimator, collimator and MLC, MLC and gantry, and gantry, collimator, and MLC errors. The gantry and collimator errors were off by 4° for one of the two arcs. The MLC error introduced was one in which the opening aperture did not move throughout the delivery of the field. For each delivery, an ICAF measurement was made as well as a dose comparison based upon log files. Passing criteria for the plans were ion chamber agreement within 5% and at least 90% of film pixels passing a 3 mm/3% gamma analysis (GA). For the log file analysis, the criteria were 90% of voxels passing a 3 mm/3% 3D GA and beam parameters matching the plan. Results: The two original plans were delivered and passed both ICAF and log file based QA. Both ICAF and log file QA met the dosimetric criteria on 4 of the 12 erroneous cases analyzed (2 cases were not analyzed). In the log file analysis, all 12 erroneous plans flagged a mismatch between delivery and plan. The 8 plans that did not meet criteria all had MLC errors. Conclusion: Our study demonstrates that log file based pre-treatment QA was able to detect small errors that may not be detected using ICAF, and that both methods were able to detect larger delivery errors.
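    The log-file check described above compares delivered machine parameters against the plan. A minimal sketch of such a comparison follows; the data layout, parameter names, and tolerances are illustrative assumptions, not the study's actual log format:

```python
# Hypothetical log-file delivery check: compare machine parameters recorded
# during delivery against planned values, control point by control point,
# and flag any deviation beyond a tolerance. Tolerances here are assumed.

PARAM_TOLERANCES = {"gantry_deg": 0.5, "collimator_deg": 0.5, "mlc_mm": 1.0}

def check_delivery(planned, delivered, tolerances=PARAM_TOLERANCES):
    """Return a list of (control_point, parameter, deviation) mismatches."""
    mismatches = []
    for cp, (plan_cp, log_cp) in enumerate(zip(planned, delivered)):
        for param, tol in tolerances.items():
            dev = abs(plan_cp[param] - log_cp[param])
            if dev > tol:
                mismatches.append((cp, param, dev))
    return mismatches

# A 4-degree gantry error, like the one induced in the study, is flagged:
planned   = [{"gantry_deg": 180.0, "collimator_deg": 30.0, "mlc_mm": 12.0}]
delivered = [{"gantry_deg": 184.0, "collimator_deg": 30.0, "mlc_mm": 12.0}]
print(check_delivery(planned, delivered))  # [(0, 'gantry_deg', 4.0)]
```

    Because the comparison is made parameter by parameter rather than on the composite dose, even small rotational errors that wash out of a dose metric are still reported.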

  9. MO-D-213-05: Sensitivity of Routine IMRT QA Metrics to Couch and Collimator Rotations

    International Nuclear Information System (INIS)

    Alaei, P

    2015-01-01

    Purpose: To assess the sensitivity of the gamma index and other IMRT QA metrics to couch and collimator rotations. Methods: Two brain IMRT plans with couch and/or collimator rotations in one or more of the fields were evaluated using the IBA MatriXX ion chamber array and its associated software (OmniPro-I’mRT). The plans were subjected to routine QA by 1) creating a composite planar dose in the treatment planning system (TPS) with the couch/collimator rotations and 2) creating the planar dose after “zeroing” the rotations. Plan deliveries to the MatriXX were performed with all rotations set to zero on a Varian 21ex linear accelerator. This in effect created TPS planar doses with an induced rotation error. Point dose measurements for the delivered plans were also performed in a solid water phantom. Results: The IMRT QA of the plans with couch and collimator rotations showed clear discrepancies in the planar dose and 2D dose profile overlays. The gamma analysis, however, did pass the criteria of 3%/3 mm (for 95% of the points), albeit with a lower percentage pass rate, when one or two of the fields had a rotation. Similar results were obtained with tighter criteria of 2%/2 mm. Other QA metrics such as percentage difference or distance-to-agreement (DTA) histograms produced similar results. The point dose measurements did not clearly indicate the error, owing to the location of the measurement (on the central axis) and the size of the ion chamber used (0.6 cc). Conclusion: Relying on gamma analysis, percentage difference, or DTA to determine the passing of an IMRT QA may miss critical errors in plan delivery due to couch/collimator rotations. A combination of analyses for composite QA plans, or per-beam analysis, would detect these errors.
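    For reference, the gamma analysis used as the passing criterion above can be sketched in a minimal one-dimensional form. This is a textbook-style global gamma (combined dose difference and distance-to-agreement), not the OmniPro-I’mRT implementation:

```python
import math

def gamma_1d(ref_pos, ref_dose, eval_pos, eval_dose,
             dose_tol=0.03, dist_tol=3.0):
    """Global 1D gamma: dose_tol is a fraction of the maximum reference
    dose, dist_tol is in the same units as the positions (e.g. mm)."""
    d_norm = dose_tol * max(ref_dose)  # global dose normalization
    gammas = []
    for xe, de in zip(eval_pos, eval_dose):
        # gamma is the minimum combined dose/distance deviation over the
        # whole reference profile
        g = min(math.sqrt(((xe - xr) / dist_tol) ** 2 +
                          ((de - dr) / d_norm) ** 2)
                for xr, dr in zip(ref_pos, ref_dose))
        gammas.append(g)
    return gammas

def pass_rate(gammas):
    """Percentage of evaluated points with gamma <= 1."""
    return 100.0 * sum(g <= 1.0 for g in gammas) / len(gammas)
```

    With the criteria in the abstract, a plan would pass when pass_rate is at least 95 at 3%/3 mm. The sketch also makes the abstract's point visible: a rigid shift of the whole profile can still yield gamma values below 1 everywhere, so the pass rate alone can hide systematic rotational errors.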

  10. Design of intelligent house system based on Yeelink

    Directory of Open Access Journals (Sweden)

    Lin Zhi-Huang

    2016-01-01

    Full Text Available In order to monitor the security situation of a house in real time, an intelligent house remote monitoring system is designed based on Yeelink cloud services and ZigBee wireless communication technology. The system includes three parts: a ZigBee wireless sensor network, an intelligent house gateway, and Yeelink cloud services. Users can access the Yeelink website or app to get real-time information about the house, including gas concentration and temperature. Remote commands can also be sent from mobile devices to control household appliances. Through a simple and convenient user interface, users can monitor and control the house effectively and feel much safer and more comfortable.

  11. Artificial intelligence: the future in nuclear plant maintenance

    International Nuclear Information System (INIS)

    Norgate, G.

    1984-01-01

    The role of robotics and remote handling equipment in future nuclear power plant maintenance activities is discussed in the context of artificial intelligence applications. Special requirements manipulators, control systems, and man-machine interfaces for nuclear applications are noted. Tasks might include inspection with cameras, eddy current probes, and leak detectors; the collection of material samples; radiation monitoring; and the disassembly, repair and reassembly of a variety of system components. A robot with vision and force sensing and an intelligent control system that can access a knowledge base is schematically described. Recent advances in image interpretation systems are also discussed

  12. New Perspectives on Intelligence Collection and Processing

    Science.gov (United States)

    2016-06-01

    MASINT: Measurement and Signature Intelligence; NPS: Naval Postgraduate School; OSINT: Open Source Intelligence; pdf: Probability Density Function; SIGINT: ... Measurement and Signature Intelligence (MASINT): different types of sensors • Open Source Intelligence (OSINT): from all open sources • Signals Intelligence (SIGINT): intercepting the ...

  13. Trends in ambient intelligent systems the role of computational intelligence

    CERN Document Server

    Khan, Mohammad; Abraham, Ajith

    2016-01-01

    This book demonstrates the success of Ambient Intelligence in providing possible solutions for the daily needs of humans. The book addresses implications of ambient intelligence in areas of domestic living, elderly care, robotics, communication, philosophy and others. The objective of this edited volume is to show that Ambient Intelligence is a boon to humanity with conceptual, philosophical, methodical and applicative understanding. The book also aims to schematically demonstrate developments in the direction of augmented sensors, embedded systems and behavioral intelligence towards Ambient Intelligent Networks or Smart Living Technology. It contains chapters in the field of Ambient Intelligent Networks, which received highly positive feedback during the review process. The book contains research work, with in-depth state of the art from augmented sensors, embedded technology and artificial intelligence along with cutting-edge research and development of technologies and applications of Ambient Intelligent N...

  14. Characterization of a prototype MR-compatible Delta4 QA-system in a 1.5 tesla MR-linac

    NARCIS (Netherlands)

    de Vries, Wilfred J H; Seravalli, Enrica; Houweling, Anette; Woodings, Simon J; van Rooij, Rob; Wolthaus, Jochem W H; Lagendijk, JJW; Raaymakers, Bas W

    2018-01-01

    To perform patient plan quality assurance (QA) on the newly installed MR-linac (MRL), there was a need for an MR-compatible QA device. An MR-compatible device (MR-Delta4) was developed together with Scandidos AB (Uppsala, Sweden). The basic characteristics of the detector response

  15. Intensity-modulated radiation therapy: dynamic MLC (DMLC) therapy, multisegment therapy and tomotherapy. An example of QA in DMLC therapy

    International Nuclear Information System (INIS)

    Webb, S.

    1998-01-01

    Intensity-modulated radiation therapy will make a quantum leap in tumor control. It is the new radiation therapy for the new millennium. The major methods to achieve IMRT are: 1. dynamic multileaf collimator (DMLC) therapy, 2. multisegment therapy, and 3. tomotherapy. The principles of these 3 techniques are briefly reviewed. Each technique presents unique QA issues, which are outlined. As an example, this paper presents the results of a recent study of an important QA concern in DMLC therapy. (orig.) [de]

  16. Summary Report for the Evaluation of Current QA Processes Within the FRMAC FAL and EPA MERL.

    Energy Technology Data Exchange (ETDEWEB)

    Shanks, Sonoya T.; Ted Redding; Lynn Jaussi; Allen, Mark B.; Fournier, Sean Donovan; Leonard, Elliott J.

    2017-04-01

    The Federal Radiological Monitoring and Assessment Center (FRMAC) relies on accurate and defensible analytical laboratory data to support its mission. Therefore, FRMAC must ensure that the environmental analytical laboratories providing analytical services maintain an ongoing capability to provide accurate analytical results to DOE. The more quality assurance (QA) and quality control (QC) measures required of a laboratory, the fewer resources are available for analysis of response samples. Since QA and QC measures generally comprise a major part of a laboratory’s operations, requirements should only be considered if they are deemed “value-added” for the FRMAC mission. This report provides observations of areas for improvement and potential interoperability opportunities in the areas of batch quality control requirements, written communications, data review processes, and data reporting processes, along with lessons learned as they apply to items in the early phase of a response that will be critical for developing a more efficient, integrated response for future interactions between FRMAC and EPA assets.

  17. TU-E-BRB-02: DIR QA Options and Research Development

    International Nuclear Information System (INIS)

    Kirby, N.

    2015-01-01

    Deformable image registration (DIR) is developing rapidly and is poised to substantially improve dose fusion accuracy for adaptive and retreatment planning and motion management and PET fusion to enhance contour delineation for treatment planning. However, DIR dose warping accuracy is difficult to quantify, in general, and particularly difficult to do so on a patient-specific basis. As clinical DIR options become more widely available, there is an increased need to understand the implications of incorporating DIR into clinical workflow. Several groups have assessed DIR accuracy in clinically relevant scenarios, but no comprehensive review material is yet available. This session will also discuss aspects of the AAPM Task Group 132 on the Use of Image Registration and Data Fusion Algorithms and Techniques in Radiotherapy Treatment Planning official report, which provides recommendations for DIR clinical use. We will summarize and compare various commercial DIR software options, outline successful clinical techniques, show specific examples with discussion of appropriate and inappropriate applications of DIR, discuss the clinical implications of DIR, provide an overview of current DIR error analysis research, review QA options and research phantom development and present TG-132 recommendations. Learning Objectives: 1) compare/contrast commercial DIR software and QA options; 2) overview the clinical DIR workflow for retreatment; 3) understand uncertainties introduced by DIR; 4) review TG-132 proposed recommendations.

  18. TU-E-BRB-02: DIR QA Options and Research Development

    Energy Technology Data Exchange (ETDEWEB)

    Kirby, N. [University of Texas HSC SA (United States)

    2015-06-15

    Deformable image registration (DIR) is developing rapidly and is poised to substantially improve dose fusion accuracy for adaptive and retreatment planning and motion management and PET fusion to enhance contour delineation for treatment planning. However, DIR dose warping accuracy is difficult to quantify, in general, and particularly difficult to do so on a patient-specific basis. As clinical DIR options become more widely available, there is an increased need to understand the implications of incorporating DIR into clinical workflow. Several groups have assessed DIR accuracy in clinically relevant scenarios, but no comprehensive review material is yet available. This session will also discuss aspects of the AAPM Task Group 132 on the Use of Image Registration and Data Fusion Algorithms and Techniques in Radiotherapy Treatment Planning official report, which provides recommendations for DIR clinical use. We will summarize and compare various commercial DIR software options, outline successful clinical techniques, show specific examples with discussion of appropriate and inappropriate applications of DIR, discuss the clinical implications of DIR, provide an overview of current DIR error analysis research, review QA options and research phantom development and present TG-132 recommendations. Learning Objectives: 1) compare/contrast commercial DIR software and QA options; 2) overview the clinical DIR workflow for retreatment; 3) understand uncertainties introduced by DIR; 4) review TG-132 proposed recommendations.

  19. Dynamic wedge, electron energy and beam profile Q.A. using an ionization chamber linear array

    International Nuclear Information System (INIS)

    Kenny, M.B.; Todd, S.P.

    1996-01-01

    Since the introduction of multi-modal linacs, the quality assurance workload of a Physical Sciences department has increased dramatically. The advent of dynamic wedges has further complicated matters because of the need to devise accurate methods to perform Q.A. in a reasonable time. We have been using an ionization chamber linear array, the Thebes 7000™ by Victoreen, Inc., for some years to measure X-ray and electron beam profiles. Two years ago we developed software to perform Q.A. on our dynamic wedges using the array, and more recently included a routine to check electron beam energies using the method described by Rosenow, U.F. et al., Med. Phys. 18(1) 19-25. The integrated beam and profile management system has enabled us to maintain a comprehensive quality assurance programme on all our linacs. Both our efficiency and accuracy have increased to the point where we are able to keep up with the greater number of tests required without an increase in staff or hours spent on quality assurance. In changing the processor from the Z80 of the Thebes console to the 486 of the PC, we have also noticed a marked increase in the calibration stability of the array. (author)

  20. A multi-institutional survey evaluating patient related QA – phase II

    Directory of Open Access Journals (Sweden)

    Teichmann Tobias

    2017-09-01

    Full Text Available In phase I of the survey, a planning intercomparison of patient-related QA was performed at 12 institutions. The participating clinics created phantom-based IMRT and VMAT plans, which were measured using the ArcCheck diode array. Mobius3D (M3D) was used in phase II. It acts as a secondary dose verification tool for patient-specific QA, based on average linac beam data collected by Mobius Medical Systems. All Quasimodo linac plans will be analyzed for the continuation of the intercomparison. We aim to determine whether Mobius3D is suited for use with diverse treatment techniques and whether beam model customization is needed. We initially computed first Mobius3D results by transferring all plans from phase I to our Mobius3D server. Because of some larger PTV mean dose differences, we checked whether output factor customization would be beneficial. We performed measurements and output factor correction to account for discrepancies in reference conditions. Compared to Mobius3D’s preconfigured average beam data values, these corrected output factors differed by ±1.5% for field sizes between 7×7 cm² and 30×30 cm², and by −3.9% for 3×3 cm². Our method of correcting the output factors yields good agreement with M3D’s reference values for these medium field sizes.
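    The output factor customization step above amounts to taking ratios of locally measured output factors to the vendor's reference (average beam data) values. A small sketch follows; the field sizes and numeric values are made up for illustration and are not the survey's measured data:

```python
# Sketch: per-field-size output factor correction factors, computed as the
# ratio of locally measured output factors to reference (average beam data)
# values. All numbers below are hypothetical.

def correction_factors(measured, reference):
    """Return measured/reference output factor ratios keyed by field size."""
    return {fs: measured[fs] / reference[fs] for fs in measured}

measured  = {"3x3": 0.827, "10x10": 1.000, "30x30": 1.112}
reference = {"3x3": 0.860, "10x10": 1.000, "30x30": 1.105}

corr = correction_factors(measured, reference)
# percent deviation of the local beam from the reference data, per field size
pct = {fs: 100.0 * (c - 1.0) for fs, c in corr.items()}
```

    With these illustrative numbers, the small-field deviation (about −3.8% at 3×3) dominates, mirroring the pattern reported in the abstract, where small fields deviated most from the preconfigured beam data.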

  1. M073: Monte Carlo generated spectra for QA/QC of automated NAA routine

    International Nuclear Information System (INIS)

    Jackman, K.R.; Biegalski, S.R.

    2004-01-01

    A quality check for an automated system for analyzing large sets of neutron-activated samples has been developed. Activated samples are counted with an HPGe detector, in conjunction with an automated sample changer and spectral analysis tools controlled by the Canberra GENIE 2K and REXX software. After each sample is acquired and analyzed, a Microsoft Visual Basic program imports the results into a template Microsoft Excel file where the final concentrations, uncertainties, and detection limits are determined. Standard reference materials are included in each set of 40 samples as a standard quality assurance/quality control (QA/QC) test. A select group of sample spectra is also visually reviewed to check the peak-fitting routines. A reference spectrum was generated in MCNP 4c2 using an F8 (pulse height) tally with a model of the actual detector used in counting. The detector model matches the detector resolution, energy calibration, and counting geometry. The generated spectrum also contained a radioisotope matrix similar to what was expected in the samples. This spectrum can then be put through the automated system and analyzed along with the other samples. The automated results are then compared to expected results for QA/QC purposes.
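    The final comparison of automated results against expected values can be sketched as a simple tolerance gate. Element names, concentrations, and the 10% limit below are hypothetical illustrations, not the routine's actual acceptance criteria:

```python
# Illustrative QC gate for an automated NAA batch: compare measured
# standard-reference-material concentrations to certified values and flag
# any element whose relative difference exceeds a tolerance.

def qc_check(measured, certified, rel_tol=0.10):
    """Return {element: relative_difference} for elements failing the gate."""
    fails = {}
    for el, cert in certified.items():
        rel = (measured[el] - cert) / cert
        if abs(rel) > rel_tol:
            fails[el] = rel
    return fails

certified = {"As": 10.2, "Sb": 1.06, "Zn": 182.0}   # mg/kg, hypothetical
measured  = {"As": 10.5, "Sb": 1.30, "Zn": 180.0}
flags = qc_check(measured, certified)   # only Sb exceeds the 10% limit here
```

    In an automated workflow, a non-empty result would trigger the visual review of spectra and peak fits described in the abstract rather than silently passing the batch.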

  2. Monte Carlo generated spectra for QA/QC of automated NAA routine

    International Nuclear Information System (INIS)

    Jackman, K.R.; Biegalski, S.R.

    2007-01-01

    A quality check for an automated system for analyzing large sets of neutron-activated samples has been developed. Activated samples are counted with an HPGe detector, in conjunction with an automated sample changer and spectral analysis tools controlled by the Canberra GENIE 2K and REXX software. After each sample is acquired and analyzed, a Microsoft Visual Basic program imports the results into a template Microsoft Excel file where the final concentrations, uncertainties, and detection limits are determined. Standard reference materials are included in each set of 40 samples as a standard quality assurance/quality control (QA/QC) test. A select group of sample spectra is also visually reviewed to check the peak-fitting routines. A reference spectrum was generated in MCNP 4c2 using an F8 (pulse-height) tally with a model of the actual detector used in counting. The detector model matches the detector resolution, energy calibration, and counting geometry. The generated spectrum also contained a radioisotope matrix similar to what was expected in the samples. This spectrum can then be put through the automated system and analyzed along with the other samples. The automated results are then compared to expected results for QA/QC purposes. (author)

  3. Intelligent Home Control System Based on ARM10

    Science.gov (United States)

    Chen, G. X.; Jiang, J.; Zhong, L. H.

    2017-10-01

    Intelligent home is becoming a hot spot of social attention in the 21st century. In China it is still a new industry. However, there is no doubt that the intelligent home will become a new economic growth point of social development; it will change the lifestyle of human beings. To develop the intelligent home, we should keep up with the development trend of technology, which is the reason for discussing the intelligent home control system here. In this paper, an intelligent home control system is designed for alarm and remote control of gas leaks, fire, earthquake prediction, etc., by examining environmental changes around the house. When the intelligent home control system detects an accident, the processor communicates with the GSM module, informing the housekeeper of the accident. The user can send a message from a mobile phone to the system to cut the power. The system accesses DCC through the ARM10 JTAG interface, using DCC to send and receive messages. At the same time, the debugger on the host is mainly used to receive the user’s commands and send them to the debug component in the target system. The data returned from the target system is received and displayed to the user in a certain format.

  4. Social Representations of Intelligence

    Directory of Open Access Journals (Sweden)

    Elena Zubieta

    2016-02-01

    Full Text Available The article stresses the relationship between explicit and implicit theories of intelligence. Following the line of common-sense epistemology and the theory of social representations, a study was carried out to analyze naïve explanations of definitions of intelligence. Based on Mugny & Carugati’s (1989) research, a self-administered questionnaire was designed and filled in by 286 subjects. Results are congruent with the main hypothesis postulated: a general overlap between explicit and implicit theories showed up. According to the results, intelligence appears both as a social attribute related to social adaptation and as a concept defined in relation to contextual variables, similar to current expert discourse. Nevertheless, conceptions based on a “gifted ideology” are still present, stressing the main axes of the intelligence debate: biological and sociological determinism. In the same sense, unfamiliarity and social identity are reaffirmed as organizing principles of social representation. The distance from the object - measured as the belief that differences in intelligence are a solvable or unsolvable problem - and the level of involvement with the topic - teachers/non-teachers - appear as discriminating elements when supporting specific dimensions.

  5. Modelling traffic flows with intelligent cars and intelligent roads

    NARCIS (Netherlands)

    van Arem, Bart; Tampere, Chris M.J.; Malone, Kerry

    2003-01-01

    This paper addresses the modeling of traffic flows with intelligent cars and intelligent roads. It will describe the modeling approach MIXIC and review the results for different ADA systems: Adaptive Cruise Control, a special lane for Intelligent Vehicles, cooperative following and external speed

  6. Intelligence analysis – the royal discipline of Competitive Intelligence

    Directory of Open Access Journals (Sweden)

    František Bartes

    2011-01-01

    Full Text Available The aim of this article is to propose a work methodology for Competitive Intelligence teams in one specific area of the intelligence cycle, the so-called “Intelligence Analysis”. Intelligence Analysis is one of the stages of the Intelligence Cycle, in which data from both primary and secondary research are analyzed. The main result of the effort is the creation of added value for the information collected. Company Competitive Intelligence, correctly understood and implemented in business practice, is the “forecasting of the future”: forecasts that form the basis for strategic decisions made by the company’s top management. To implement that requirement in corporate practice, the author perceives Competitive Intelligence as a systemic application discipline. This approach allows him to propose a “Work Plan” for Competitive Intelligence as a fundamental standardized document to steer Competitive Intelligence team activities. The author divides the Competitive Intelligence team work plan into five basic parts, derived from the five-stage model of the intelligence cycle, which, in the author’s opinion, is more appropriate for complicated cases of Competitive Intelligence.

  7. The Literature of Competitive Intelligence.

    Science.gov (United States)

    Walker, Thomas D.

    1994-01-01

    Describes competitive intelligence (CI) literature in terms of its location, quantity, authorship, length, and problems of bibliographic access. Highlights include subject access; competitive intelligence research; espionage and security; monographs; and journals. (21 references) (LRW)

  8. Artificial intelligence in nanotechnology.

    Science.gov (United States)

    Sacha, G M; Varona, P

    2013-11-15

    During the last decade there has been increasing use of artificial intelligence tools in nanotechnology research. In this paper we review some of these efforts in the context of interpreting scanning probe microscopy, the study of biological nanosystems, the classification of material properties at the nanoscale, theoretical approaches and simulations in nanoscience, and generally in the design of nanodevices. Current trends and future perspectives in the development of nanocomputing hardware that can boost artificial-intelligence-based applications are also discussed. Convergence between artificial intelligence and nanotechnology can shape the path for many technological developments in the field of information sciences that will rely on new computer architectures and data representations, hybrid technologies that use biological entities and nanotechnological devices, bioengineering, neuroscience and a large variety of related disciplines.

  9. Artificial intelligence in nanotechnology

    Science.gov (United States)

    Sacha, G. M.; Varona, P.

    2013-11-01

    During the last decade there has been increasing use of artificial intelligence tools in nanotechnology research. In this paper we review some of these efforts in the context of interpreting scanning probe microscopy, the study of biological nanosystems, the classification of material properties at the nanoscale, theoretical approaches and simulations in nanoscience, and generally in the design of nanodevices. Current trends and future perspectives in the development of nanocomputing hardware that can boost artificial-intelligence-based applications are also discussed. Convergence between artificial intelligence and nanotechnology can shape the path for many technological developments in the field of information sciences that will rely on new computer architectures and data representations, hybrid technologies that use biological entities and nanotechnological devices, bioengineering, neuroscience and a large variety of related disciplines.

  10. Intelligent environmental data warehouse

    International Nuclear Information System (INIS)

    Ekechukwu, B.

    1998-01-01

    Making quick and effective decisions in environmental management is based on multiple and complex parameters; a data warehouse is a powerful tool for the overall management of massive environmental information. Selecting the right data from a warehouse is an important consideration for end-users. This paper proposes an intelligent environmental data warehouse system. It consists of a data warehouse to provide environmental researchers and managers with the environmental information they need for their research studies and decisions, in the form of geometric and attribute data for a study area, plus metadata for other sources of environmental information. In addition, the proposed intelligent search engine works according to a set of rules, which enables the system to be aware of the environmental data wanted by the end-user. The system development process passes through four stages: data preparation, warehouse development, intelligent engine development, and internet platform system development. (author)

  11. Intelligent control systems 1990

    International Nuclear Information System (INIS)

    Shoureshi, R.

    1991-01-01

    The field of artificial intelligence (AI) has generated many useful ideas and techniques that can be integrated into the design of control systems. It is believed, and for special cases has been demonstrated, that integration of AI into control systems would provide the necessary tools for solving many of the complex problems that present control techniques and AI algorithms are unable to solve individually. However, this integration requires the development of basic understanding and new fundamentals to provide a scientific basis for achieving its potential. This book presents an overview of some of the latest research studies in the area of intelligent control systems. The papers present techniques for the formulation of intelligent control and the development of rule-based control systems, as well as applications of control systems in nuclear power plants and HVAC systems.

  12. Artificial Intelligence in Cardiology.

    Science.gov (United States)

    Johnson, Kipp W; Torres Soto, Jessica; Glicksberg, Benjamin S; Shameer, Khader; Miotto, Riccardo; Ali, Mohsin; Ashley, Euan; Dudley, Joel T

    2018-06-12

    Artificial intelligence and machine learning are poised to influence nearly every aspect of the human condition, and cardiology is not an exception to this trend. This paper provides a guide for clinicians on relevant aspects of artificial intelligence and machine learning, reviews selected applications of these methods in cardiology to date, and identifies how cardiovascular medicine could incorporate artificial intelligence in the future. In particular, the paper first reviews predictive modeling concepts relevant to cardiology such as feature selection and frequent pitfalls such as improper dichotomization. Second, it discusses common algorithms used in supervised learning and reviews selected applications in cardiology and related disciplines. Third, it describes the advent of deep learning and related methods collectively called unsupervised learning, provides contextual examples both in general medicine and in cardiovascular medicine, and then explains how these methods could be applied to enable precision cardiology and improve patient outcomes. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.

  13. Artificial intelligence in nanotechnology

    International Nuclear Information System (INIS)

    Sacha, G M; Varona, P

    2013-01-01

    During the last decade there has been increasing use of artificial intelligence tools in nanotechnology research. In this paper we review some of these efforts in the context of interpreting scanning probe microscopy, the study of biological nanosystems, the classification of material properties at the nanoscale, theoretical approaches and simulations in nanoscience, and generally in the design of nanodevices. Current trends and future perspectives in the development of nanocomputing hardware that can boost artificial-intelligence-based applications are also discussed. Convergence between artificial intelligence and nanotechnology can shape the path for many technological developments in the field of information sciences that will rely on new computer architectures and data representations, hybrid technologies that use biological entities and nanotechnological devices, bioengineering, neuroscience and a large variety of related disciplines. (topical review)

  14. Understanding the Globalization of Intelligence

    DEFF Research Database (Denmark)

    Svendsen, Adam David Morgan

    This book provides an introduction to the complexities of contemporary Western intelligence and its dynamics during an era of globalization. Towards an understanding of the globalization of intelligence process, Svendsen focuses on the secretive phenomenon of international or foreign intelligence cooperation ('liaison'), as it occurs in both theory and practice. Reflecting a complex coexistence and plurality of several different and overlapping concepts in action, the challenging process of the globalization of intelligence emerges as essential for complex issue management purposes during a globalized era...

  15. Artificial Intelligence and Economic Theories

    OpenAIRE

    Marwala, Tshilidzi; Hurwitz, Evan

    2017-01-01

    The advent of artificial intelligence has changed many disciplines, such as engineering, social science and economics. Artificial intelligence is a computational technique inspired by natural intelligence, such as the swarming of birds, the working of the brain and the pathfinding of ants. These techniques have an impact on economic theories. This book studies the impact of artificial intelligence on economic theories, a subject that has not been extensively studied. The theories that...

  16. Collective Intelligence in Crises

    DEFF Research Database (Denmark)

    Büscher, Monika; Liegl, Michael; Thomas, Vanessa

    2014-01-01

    New practices of social media use in emergency response seem to enable broader 'situation awareness' and new forms of crisis management. The scale and speed of innovation in this field engenders disruptive innovation or a reordering of social, political, and economic practices of emergency response… By examining these dynamics with the concept of social collective intelligence, important opportunities and challenges can be examined. In this chapter we focus on socio-technical aspects of social collective intelligence in crises to discuss positive and negative frictions and avenues for innovation…

  17. Artificial intelligence executive summary

    International Nuclear Information System (INIS)

    Wamsley, S.J.; Purvis, E.E. III

    1984-01-01

    Artificial intelligence (AI) is a high-technology field that can be used to provide problem-solving diagnosis and guidance, and to support the resolution of problems. It is not a stand-alone discipline; it can also be applied to develop databases that retain the expertise required for its own knowledge base. This provides a way to retain knowledge that might otherwise be lost. AI methodology can provide an automated construction-management decision support system, thereby restoring the manager's emphasis to project management.

  18. Artificial intelligence in cardiology.

    Science.gov (United States)

    Bonderman, Diana

    2017-12-01

    Decision-making is complex in modern medicine and should ideally be based on available data, structured knowledge and proper interpretation in the context of an individual patient. Automated algorithms, also termed artificial intelligence, that are able to extract meaningful patterns from data collections and build decisions upon identified patterns may be useful assistants in clinical decision-making processes. In this article, artificial intelligence-based studies in clinical cardiology are reviewed. The text also touches on the ethical issues and speculates on the future roles of automated algorithms versus clinicians in cardiology and medicine in general.

  19. Intelligent Freight Transport Systems

    DEFF Research Database (Denmark)

    Overø, Helene Martine; Larsen, Allan; Røpke, Stefan

    2009-01-01

    The Danish innovation project entitled “Intelligent Freight Transport Systems” aims at developing prototype systems integrating public intelligent transport systems (ITS) with the technology in vehicles and equipment as well as the IT-systems at various transport companies. The objective is to enhance the efficiency and lower the environmental impact in freight transport. In this paper, a pilot project involving real-time waste collection at a Danish waste collection company is described, and a solution approach is proposed. The problem corresponds to the dynamic version of the waste collection…

  20. Bayesian artificial intelligence

    CERN Document Server

    Korb, Kevin B

    2003-01-01

    As the power of Bayesian techniques has become more fully realized, the field of artificial intelligence has embraced Bayesian methodology and integrated it to the point where an introduction to Bayesian techniques is now a core course in many computer science programs. Unlike other books on the subject, Bayesian Artificial Intelligence keeps mathematical detail to a minimum and covers a broad range of topics. The authors integrate Bayesian network technology with Bayesian network learning and apply both to knowledge engineering. They emphasize understanding and intuition but also provide the algorithms and technical background needed for applications. Software, exercises, and solutions are available on the authors' website.
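
    The Bayesian network technology described above can be illustrated with a minimal sketch. The example below is hypothetical (a small "sprinkler" network with made-up probabilities, not taken from the book): each variable has a conditional probability table given its parents, and a posterior is computed by enumerating the joint distribution.

    ```python
    from itertools import product

    # Hypothetical network: Rain -> Sprinkler, and Rain, Sprinkler -> WetGrass.
    # All numbers are illustrative, not drawn from any real dataset.
    P_rain = {True: 0.2, False: 0.8}                  # P(R)
    P_sprinkler = {True: {True: 0.01, False: 0.99},   # P(S | R=True)
                   False: {True: 0.4, False: 0.6}}    # P(S | R=False)
    P_wet = {(True, True): 0.99, (True, False): 0.9,  # P(W=True | S, R)
             (False, True): 0.8, (False, False): 0.0}

    def joint(s, r, w):
        """Joint probability P(S=s, R=r, W=w) via the chain rule of the net."""
        pw = P_wet[(s, r)]
        return P_rain[r] * P_sprinkler[r][s] * (pw if w else 1 - pw)

    def posterior_rain_given_wet():
        """Inference by enumeration: P(R=True | W=True)."""
        num = sum(joint(s, True, True) for s in (True, False))
        den = sum(joint(s, r, True) for s, r in product((True, False), repeat=2))
        return num / den

    print(round(posterior_rain_given_wet(), 4))  # → 0.4131
    ```

    Enumeration is exponential in the number of variables; the variable-elimination and sampling algorithms covered in such texts exist precisely to avoid this blow-up on larger networks.
    
    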