WorldWideScience

Sample records for automated model abstraction

  1. Automated Predicate Abstraction for Real-Time Models

    Directory of Open Access Journals (Sweden)

    Bahareh Badban

    2009-11-01

    Full Text Available We present a technique designed to automatically compute predicate abstractions for dense real-time models represented as networks of timed automata. We use the CIPM algorithm from our previous work, which computes new invariants for timed automata control locations and prunes the model, to compute a predicate abstraction of the model. We do so by taking information regarding control locations and their newly computed invariants into account.
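
    A generic illustration may help make the idea concrete. The sketch below is a minimal, library-free rendition of predicate abstraction over a finite toy transition system, not the CIPM algorithm itself: concrete states are mapped to truth vectors over a chosen predicate set (here, an invariant-style clock constraint of the kind CIPM-derived invariants provide), and abstract transitions are induced from concrete ones. The example system and all names are illustrative.

```python
def abstract_state(state, predicates):
    """Map a concrete state to a tuple of predicate truth values."""
    return tuple(p(state) for p in predicates)

def predicate_abstraction(transitions, predicates):
    """Build the abstract transition relation induced by `predicates`.

    transitions -- set of (s, s') pairs of concrete states
    predicates  -- list of boolean functions over concrete states
    """
    abs_transitions = set()
    for s, t in transitions:
        abs_transitions.add((abstract_state(s, predicates),
                             abstract_state(t, predicates)))
    return abs_transitions

# Toy example: a one-clock system whose states are clock valuations 0..5
# and where time advances by one unit; the single predicate plays the
# role of a location invariant such as "x <= 3".
transitions = {(x, x + 1) for x in range(5)}
predicates = [lambda x: x <= 3]
print(predicate_abstraction(transitions, predicates))
# {((True,), (True,)), ((True,), (False,)), ((False,), (False,))}
```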

  2. Automated Supernova Discovery (Abstract)

    Science.gov (United States)

    Post, R. S.

    2015-12-01

    (Abstract only) We are developing a system of robotic telescopes for automatic recognition of supernovae as well as other transient events, in collaboration with the Puckett Supernova Search Team. At the SAS2014 meeting, the discovery program, SNARE, was first described. Since then, it has been continuously improved to handle searches under a wide variety of atmospheric conditions. Currently, two telescopes are used to build a reference library while searching for PSNs with a partial library. Since data are taken every cloudless night, we must deal with varying atmospheric conditions and high background illumination from the moon. Software is configured to identify a PSN and reshoot for verification, with options to change the run plan to acquire photometric or spectrographic data. The telescopes are 24-inch CDK24s with Alta U230 cameras, one in CA and one in NM. Images and run plans are sent between sites so the CA telescope can search while photometry is done in NM. Our goal is to find bright PSNs of magnitude 17.5 or less, which is the limit of our planned spectroscopy. We present results from our first automated PSN discoveries and plans for PSN data acquisition.

  3. Automated model building

    CERN Document Server

    Caferra, Ricardo; Peltier, Nicholas

    2004-01-01

    This is the first book on automated model building, a discipline of automated deduction that is of growing importance. Although models and their construction are important per se, automated model building has appeared as a natural enrichment of automated deduction, especially in the attempt to capture the human way of reasoning. The book provides an historical overview of the field of automated deduction, and presents the foundations of different existing approaches to model construction, in particular those developed by the authors. Finite and infinite model building techniques are presented. The main emphasis is on calculi-based methods, and relevant practical results are provided. The book is of interest to researchers and graduate students in computer science, computational logic and artificial intelligence. It can also be used as a textbook in advanced undergraduate courses.

  4. Toward designing for trust in database automation

    Energy Technology Data Exchange (ETDEWEB)

    Duez, P. P.; Jamieson, G. A. [Cognitive Engineering Laboratory, Univ. of Toronto, 5 King' s College Rd., Toronto, Ont. M5S 3G8 (Canada)

    2006-07-01

    Appropriate reliance on system automation is imperative for safe and productive work, especially in safety-critical systems. It is unsafe to rely on automation beyond its designed use; conversely, it can be both unproductive and unsafe to manually perform tasks that are better relegated to automated tools. Operator trust in automated tools mediates reliance, and trust appears to affect how operators use technology. As automated agents become more complex, the question of trust in automation is increasingly important. In order to achieve proper use of automation, we must engender an appropriate degree of trust that is sensitive to changes in operating functions and context. In this paper, we present research concerning trust in automation in the domain of automated tools for relational databases. Lee and See have provided models of trust in automation. One model developed by Lee and See identifies three key categories of information about the automation that lie along a continuum of attributional abstraction. Purpose-, process-, and performance-related information serve, both individually and through inferences between them, to describe automation in such a way as to engender properly calibrated trust. Thus, one can look at information from different levels of attributional abstraction as a general requirements analysis for information key to appropriate trust in automation. The model of information necessary to engender appropriate trust in automation [1] is a general one. Although it describes categories of information, it does not provide insight on how to determine the specific information elements required for a given automated tool. We have applied the Abstraction Hierarchy (AH) to this problem in the domain of relational databases. The AH serves as a formal description of the automation at several levels of abstraction, ranging from a very abstract purpose-oriented description to a more concrete description of the resources involved in the automated process.

  5. Toward designing for trust in database automation

    International Nuclear Information System (INIS)

    Duez, P. P.; Jamieson, G. A.

    2006-01-01

    Appropriate reliance on system automation is imperative for safe and productive work, especially in safety-critical systems. It is unsafe to rely on automation beyond its designed use; conversely, it can be both unproductive and unsafe to manually perform tasks that are better relegated to automated tools. Operator trust in automated tools mediates reliance, and trust appears to affect how operators use technology. As automated agents become more complex, the question of trust in automation is increasingly important. In order to achieve proper use of automation, we must engender an appropriate degree of trust that is sensitive to changes in operating functions and context. In this paper, we present research concerning trust in automation in the domain of automated tools for relational databases. Lee and See have provided models of trust in automation. One model developed by Lee and See identifies three key categories of information about the automation that lie along a continuum of attributional abstraction. Purpose-, process-, and performance-related information serve, both individually and through inferences between them, to describe automation in such a way as to engender properly calibrated trust. Thus, one can look at information from different levels of attributional abstraction as a general requirements analysis for information key to appropriate trust in automation. The model of information necessary to engender appropriate trust in automation [1] is a general one. Although it describes categories of information, it does not provide insight on how to determine the specific information elements required for a given automated tool. We have applied the Abstraction Hierarchy (AH) to this problem in the domain of relational databases. The AH serves as a formal description of the automation at several levels of abstraction, ranging from a very abstract purpose-oriented description to a more concrete description of the resources involved in the automated process.

  6. A Multiple Agent Model of Human Performance in Automated Air Traffic Control and Flight Management Operations

    Science.gov (United States)

    Corker, Kevin; Pisanich, Gregory; Condon, Gregory W. (Technical Monitor)

    1995-01-01

    A predictive model of human operator performance (flight crew and air traffic control (ATC)) has been developed and applied in order to evaluate the impact of automation developments in flight management and air traffic control. The model is used to predict the performance of a two-person flight crew and the ATC operators generating and responding to clearances, aided by the Center TRACON Automation System (CTAS). The purpose of the modeling is to support evaluation and design of automated aids for flight management and airspace management and to predict required changes in procedures, both air and ground, in response to advancing automation in both domains. Additional information is contained in the original extended abstract.

  7. Automation for mineral resource development

    Energy Technology Data Exchange (ETDEWEB)

    Norrie, A.W.; Turner, D.R. (eds.)

    1986-01-01

    A total of 55 papers were presented at the symposium under the following headings: automation and the future of mining; modelling and control of mining processes; transportation for mining; automation and the future of metallurgical processes; modelling and control of metallurgical processes; and general aspects. Fifteen papers have been abstracted separately.

  8. Modeling Increased Complexity and the Reliance on Automation: FLightdeck Automation Problems (FLAP) Model

    Science.gov (United States)

    Ancel, Ersin; Shih, Ann T.

    2014-01-01

    This paper highlights the development of a model that is focused on the safety issue of increasing complexity and reliance on automation systems in transport category aircraft. Recent statistics show an increase in mishaps related to manual handling and automation errors due to pilot complacency and over-reliance on automation, loss of situational awareness, automation system failures and/or pilot deficiencies. Consequently, the aircraft can enter a state outside the flight envelope and/or air traffic safety margins which potentially can lead to loss-of-control (LOC), controlled-flight-into-terrain (CFIT), or runway excursion/confusion accidents, etc. The goal of this modeling effort is to provide NASA's Aviation Safety Program (AvSP) with a platform capable of assessing the impacts of AvSP technologies and products towards reducing the relative risk of automation related accidents and incidents. In order to do so, a generic framework, capable of mapping both latent and active causal factors leading to automation errors, is developed. Next, the framework is converted into a Bayesian Belief Network model and populated with data gathered from Subject Matter Experts (SMEs). With the insertion of technologies and products, the model provides individual and collective risk reduction acquired by technologies and methodologies developed within AvSP.
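
    To make the Bayesian Belief Network mechanics concrete, here is a hand-rolled two-parent network fragment in the spirit of the FLAP model. The node names, structure and all probabilities are illustrative placeholders, not the SME-elicited values from the paper; the point is only how marginalizing over causal factors yields a risk figure, and how a mitigation maps to a changed prior.

```python
from itertools import product

p_complacency = {True: 0.2, False: 0.8}      # latent factor (placeholder)
p_sys_failure = {True: 0.05, False: 0.95}    # active factor (placeholder)

# Conditional probability table: P(automation_error | complacency, failure)
cpt_error = {
    (True,  True):  0.60,
    (True,  False): 0.15,
    (False, True):  0.30,
    (False, False): 0.02,
}

def marginal_error_probability():
    """Marginalize the parent nodes out by full enumeration."""
    total = 0.0
    for comp, fail in product([True, False], repeat=2):
        total += p_complacency[comp] * p_sys_failure[fail] * cpt_error[(comp, fail)]
    return total

baseline = marginal_error_probability()
# A mitigating technology can be modeled as lowering the complacency
# prior; the risk-reduction estimate is the change in the marginal.
p_complacency = {True: 0.1, False: 0.9}
print(f"baseline risk {baseline:.4f} -> mitigated {marginal_error_probability():.4f}")
```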

  9. The triconnected abstraction of process models

    OpenAIRE

    Polyvyanyy, Artem; Smirnov, Sergey; Weske, Mathias

    2009-01-01

    Contents: Artem Polyvyanyy, Sergey Smirnov, and Mathias Weske, The Triconnected Abstraction of Process Models. 1 Introduction; 2 Business Process Model Abstraction; 3 Preliminaries; 4 Triconnected Decomposition (4.1 Basic Approach for Process Component Discovery; 4.2 SPQR-Tree Decomposition; 4.3 SPQR-Tree Fragments in the Context of Process Models); 5 Triconnected Abstraction (5.1 Abstraction Rules; 5.2 Abstraction Algorithm); 6 Related Work and Conclusions.

  10. Abstracting Concepts and Methods.

    Science.gov (United States)

    Borko, Harold; Bernier, Charles L.

    This text provides a complete discussion of abstracts--their history, production, organization, publication--and of indexing. Instructions for abstracting are outlined, and standards and criteria for abstracting are stated. Management, automation, and personnel are discussed in terms of possible economies that can be derived from the introduction…

  11. Temperature control of fimbriation circuit switch in uropathogenic Escherichia coli: quantitative analysis via automated model abstraction.

    Science.gov (United States)

    Kuwahara, Hiroyuki; Myers, Chris J; Samoilov, Michael S

    2010-03-26

    Uropathogenic Escherichia coli (UPEC) represent the predominant cause of urinary tract infections (UTIs). A key UPEC molecular virulence mechanism is type 1 fimbriae, whose expression is controlled by the orientation of an invertible chromosomal DNA element, the fim switch. Temperature has been shown to act as a major regulator of fim switching behavior and is overall an important indicator as well as functional feature of many urologic diseases, including UPEC host-pathogen interaction dynamics. Given this panoptic physiological role of temperature during UTI progression and notable empirical challenges to its direct in vivo studies, in silico modeling of corresponding biochemical and biophysical mechanisms essential to UPEC pathogenicity may significantly aid our understanding of the underlying disease processes. However, rigorous computational analysis of biological systems, such as the fim switch temperature control circuit, has hitherto presented a notoriously demanding problem, due both to the substantial complexity of the gene regulatory networks involved and to their often characteristically discrete and stochastic dynamics. To address these issues, we have developed an approach that enables automated multiscale abstraction of biological system descriptions based on reaction kinetics. Implemented as a computational tool, this method has allowed us to efficiently analyze the modular organization and behavior of the E. coli fimbriation switch circuit at different temperature settings, thus facilitating new insights into this mode of UPEC molecular virulence regulation. In particular, our results suggest that, with respect to its role in shutting down fimbriae expression, the primary function of FimB recombinase may be to effect a controlled down-regulation (rather than increase) of the ON-to-OFF fim switching rate via temperature-dependent suppression of competing dynamics mediated by recombinase FimE. Our computational analysis further implies that this down
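
    The kind of discrete stochastic kinetics this abstraction method targets can be illustrated with a minimal Gillespie simulation of a two-state ON/OFF switch whose flipping rates depend on temperature. The rate law below is a made-up placeholder for illustration, not the paper's fim switch model.

```python
import math
import random

def switching_rates(temperature_c):
    """Placeholder Arrhenius-like temperature dependence of the rates."""
    k_on_to_off = 0.1 * math.exp(0.05 * (temperature_c - 37.0))
    k_off_to_on = 0.05 * math.exp(-0.02 * (temperature_c - 37.0))
    return k_on_to_off, k_off_to_on

def gillespie_switch(t_end, temperature_c, seed=0):
    """Exact stochastic simulation of a single two-state switch."""
    rng = random.Random(seed)
    k_on_off, k_off_on = switching_rates(temperature_c)
    t, state, trajectory = 0.0, "ON", []
    while t < t_end:
        rate = k_on_off if state == "ON" else k_off_on
        t += rng.expovariate(rate)       # waiting time to the next flip
        state = "OFF" if state == "ON" else "ON"
        trajectory.append((t, state))
    return trajectory

for temp in (25.0, 37.0, 42.0):
    flips = len(gillespie_switch(t_end=500.0, temperature_c=temp))
    print(f"{temp} C: {flips} switching events in 500 time units")
```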

  12. Temperature control of fimbriation circuit switch in uropathogenic Escherichia coli: quantitative analysis via automated model abstraction.

    Directory of Open Access Journals (Sweden)

    Hiroyuki Kuwahara

    2010-03-01

    Full Text Available Uropathogenic Escherichia coli (UPEC) represent the predominant cause of urinary tract infections (UTIs). A key UPEC molecular virulence mechanism is type 1 fimbriae, whose expression is controlled by the orientation of an invertible chromosomal DNA element, the fim switch. Temperature has been shown to act as a major regulator of fim switching behavior and is overall an important indicator as well as functional feature of many urologic diseases, including UPEC host-pathogen interaction dynamics. Given this panoptic physiological role of temperature during UTI progression and notable empirical challenges to its direct in vivo studies, in silico modeling of corresponding biochemical and biophysical mechanisms essential to UPEC pathogenicity may significantly aid our understanding of the underlying disease processes. However, rigorous computational analysis of biological systems, such as the fim switch temperature control circuit, has hitherto presented a notoriously demanding problem, due both to the substantial complexity of the gene regulatory networks involved and to their often characteristically discrete and stochastic dynamics. To address these issues, we have developed an approach that enables automated multiscale abstraction of biological system descriptions based on reaction kinetics. Implemented as a computational tool, this method has allowed us to efficiently analyze the modular organization and behavior of the E. coli fimbriation switch circuit at different temperature settings, thus facilitating new insights into this mode of UPEC molecular virulence regulation. In particular, our results suggest that, with respect to its role in shutting down fimbriae expression, the primary function of FimB recombinase may be to effect a controlled down-regulation (rather than increase) of the ON-to-OFF fim switching rate via temperature-dependent suppression of competing dynamics mediated by recombinase FimE. Our computational analysis further implies

  13. An Abstraction-Based Data Model for Information Retrieval

    Science.gov (United States)

    McAllister, Richard A.; Angryk, Rafal A.

    Language ontologies provide an avenue for automated lexical analysis that may be used to supplement existing information retrieval methods. This paper presents a method of information retrieval that takes advantage of WordNet, a lexical database, to generate paths of abstraction, and uses them as the basis for an inverted index structure to be used in the retrieval of documents from an indexed corpus. We present this method as an entree to a line of research on using ontologies to perform word-sense disambiguation and improve the precision of existing information retrieval techniques.
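
    The core idea, indexing documents by hypernym "paths of abstraction", can be sketched with NLTK's WordNet interface. This is an illustration of the concept only; the paper's actual index structure may differ. The snippet assumes NLTK is installed and the WordNet corpus has been downloaded.

```python
# Requires: pip install nltk; then nltk.download('wordnet')
from collections import defaultdict
from nltk.corpus import wordnet as wn

def abstraction_keys(word):
    """All hypernym names on every root-to-synset path for `word`."""
    keys = set()
    for synset in wn.synsets(word, pos=wn.NOUN):
        for path in synset.hypernym_paths():
            keys.update(node.name() for node in path)
    return keys

def build_inverted_index(documents):
    """Map each abstraction key to the set of documents containing it."""
    index = defaultdict(set)
    for doc_id, words in documents.items():
        for word in words:
            for key in abstraction_keys(word):
                index[key].add(doc_id)
    return index

docs = {1: ["dog", "leash"], 2: ["cat"], 3: ["automobile"]}
index = build_inverted_index(docs)
# Querying an abstract concept retrieves documents whose concrete terms
# fall under it; both 'dog' and 'cat' lie below 'carnivore':
print(index.get("carnivore.n.01"))   # -> {1, 2}
```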

  14. Engineering Abstractions in Model Checking and Testing

    DEFF Research Database (Denmark)

    Achenbach, Michael; Ostermann, Klaus

    2009-01-01

    Abstractions are used in model checking to tackle problems like state space explosion or modeling of IO. The application of these abstractions in real software development processes, however, lacks engineering support. This is one reason why model checking is not widely used in practice yet and testing is still state of the art in falsification. We show how user-defined abstractions can be integrated into a Java PathFinder setting with tools like AspectJ or Javassist and discuss implications of remaining weaknesses of these tools. We believe that a principled engineering approach to designing and implementing abstractions will improve the applicability of model checking in practice.

  15. Implementing The Automated Phases Of The Partially-Automated Digital Triage Process Model

    Directory of Open Access Journals (Sweden)

    Gary D Cantrell

    2012-12-01

    Full Text Available Digital triage is a pre-digital-forensic phase that sometimes takes place as a way of gathering quick intelligence. Although effort has been undertaken to model the digital forensics process, little has been done to date to model digital triage. This work discusses the further development of a model that attempts to address digital triage: the Partially-automated Crime Specific Digital Triage Process model. The model itself is presented, along with a description of how its automated functionality was implemented to facilitate model testing.

  16. Recent advances in automated system model extraction (SME)

    International Nuclear Information System (INIS)

    Narayanan, Nithin; Bloomsburgh, John; He Yie; Mao Jianhua; Patil, Mahesh B; Akkaraju, Sandeep

    2006-01-01

    In this paper we present two different techniques for automated extraction of system models from FEA models. We discuss two different algorithms: (i) automated N-DOF SME for electrostatically actuated MEMS and (ii) automated N-DOF SME for MEMS inertial sensors. We present case studies for each of the two algorithms.

  17. Automated Student Model Improvement

    Science.gov (United States)

    Koedinger, Kenneth R.; McLaughlin, Elizabeth A.; Stamper, John C.

    2012-01-01

    Student modeling plays a critical role in developing and improving instruction and instructional technologies. We present a technique for automated improvement of student models that leverages the DataShop repository, crowd sourcing, and a version of the Learning Factors Analysis algorithm. We demonstrate this method on eleven educational…

  18. The Automation of Nowcast Model Assessment Processes

    Science.gov (United States)

    2016-09-01

    ... secondly, provide modelers with the information needed to understand the model errors and how their algorithm changes might mitigate these errors. ... by ARL modelers. 2. Development Environment: The automation of Point-Stat processes (i.e., PSA) was developed using Python 3.5. Python was selected ... because it is easy to use, widely used for scripting, and satisfies all the requirements to automate the implementation of the Point-Stat tool. ...

  19. Agent-based Modeling Automated: Data-driven Generation of Innovation Diffusion Models

    NARCIS (Netherlands)

    Jensen, T.; Chappin, E.J.L.

    2016-01-01

    Simulation modeling is useful to gain insights into driving mechanisms of diffusion of innovations. This study aims to introduce automation to make identification of such mechanisms with agent-based simulation modeling less costly in time and labor. We present a novel automation procedure in which

  20. The Conversion of Cardiovascular Conference Abstracts to Publications

    DEFF Research Database (Denmark)

    Fosbøl, Emil L.; Fosbøl, Philip Loldrup; Harrington, Robert A.

    2012-01-01

    ... a systematic and automated evaluation of rates, timing, and correlates of publication from scientific abstracts presented at 3 major cardiovascular conferences. Methods and Results: Using an automated computer algorithm, we searched the ISI Web of Science to identify peer-reviewed publications of abstracts ... –3, 2.5–5.8) for ESC (P for difference between groups 0.01). Clinical science and population science were less likely to be published compared with basic science. Conclusions: One third of abstracts were translated into publications by 2 years after presentation and less than one half by 5 years after ...
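
    The matching step such an automated pipeline needs, pairing a conference-abstract title with its most similar publication title, can be sketched with standard string similarity. The 0.8 threshold and the example titles are assumptions for illustration; a real pipeline against Web of Science records would also compare authors and publication years.

```python
from difflib import SequenceMatcher

def best_match(abstract_title, publication_titles, threshold=0.8):
    """Return the most similar publication title, or None below threshold."""
    def similarity(a, b):
        return SequenceMatcher(None, a.lower(), b.lower()).ratio()
    scored = [(similarity(abstract_title, t), t) for t in publication_titles]
    score, title = max(scored)
    return (title, score) if score >= threshold else (None, score)

pubs = [
    "Outcomes of primary PCI in elderly patients: a registry study",
    "Conversion of cardiovascular conference abstracts to full publications",
]
print(best_match("The conversion of cardiovascular conference abstracts "
                 "to publications", pubs))
```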

  1. Enhancing Formal Modelling Tool Support with Increased Automation

    DEFF Research Database (Denmark)

    Lausdahl, Kenneth

    Progress report for the qualification exam of PhD student Kenneth Lausdahl. Initial work on enhancing tool support for the formal method VDM, and the concept of unifying an abstract syntax tree with the ability for isolated extensions, is described. The tool support includes a connection to UML and a test automation principle based on traces written as a kind of regular expression.

  2. Automation strategies in five domains - A comparison of levels of automation, function allocation and visualisation of automatic functions

    International Nuclear Information System (INIS)

    Andersson, J.

    2011-01-01

    This study was conducted as a field study where control room operators and engineers from the refinery, heat and power, aviation, shipping and nuclear domains were interviewed regarding the use of automation and the visualisation of automatic functions. The purpose of the study was to collect experiences and best practices from the five studied domains on levels of automation, function allocation and visualisation of automatic functions. In total, nine different control room settings were visited. The studied settings were compared using a systemic approach based on a human-machine systems model. The results show that the 'left-over' principle is still the most commonly applied approach to function allocation, but in high-risk settings the decision whether to automate or not is more carefully considered. Regarding the visualisation of automatic functions, it was found that as long as each display type (process based, functional oriented, situation oriented and task based) is applied so that it corresponds to the same level of abstraction as the technical system, the operator's mental model will be supported. No single display type can, however, readily match all levels of abstraction at the same time - all display types are still needed and serve different purposes. (Author)

  3. Bayesian Safety Risk Modeling of Human-Flightdeck Automation Interaction

    Science.gov (United States)

    Ancel, Ersin; Shih, Ann T.

    2015-01-01

    Usage of automatic systems in airliners has increased fuel efficiency, added extra capabilities, and enhanced safety and reliability, as well as providing improved passenger comfort, since their introduction in the late 1980s. However, the originally expected automation benefits, including reduced flight crew workload, human error and training requirements, were not fully achieved. Instead, automation introduced new failure modes, redistributed (and sometimes increased) workload, brought in new cognitive and attention demands, and increased training requirements. Modern airliners have numerous flight modes, providing more flexibility (and inherently more complexity) to the flight crew. However, the price to pay for the increased flexibility is the need for increased mode awareness, as well as the need to supervise, understand, and predict automated system behavior. Also, over-reliance on automation is linked to manual flight skill degradation and complacency in commercial pilots. As a result, recent accidents involving human errors are often caused by the interactions between humans and the automated systems (e.g., the breakdown in man-machine coordination), deteriorated manual flying skills, and/or loss of situational awareness due to heavy dependence on automated systems. This paper describes the development of the increased complexity and reliance on automation baseline model, named FLAP for FLightdeck Automation Problems. The model development process starts with a comprehensive literature review followed by the construction of a framework comprised of high-level causal factors leading to an automation-related flight anomaly. The framework was then converted into a Bayesian Belief Network (BBN) using the Hugin Software v7.8. The effects of automation on flight crew are incorporated into the model, including flight skill degradation, increased cognitive demand and training requirements along with their interactions. Besides flight crew deficiencies, automation system

  4. SATURATED ZONE FLOW AND TRANSPORT MODEL ABSTRACTION

    International Nuclear Information System (INIS)

    B.W. ARNOLD

    2004-01-01

    The purpose of the saturated zone (SZ) flow and transport model abstraction task is to provide radionuclide-transport simulation results for use in the total system performance assessment (TSPA) for license application (LA) calculations. This task includes assessment of uncertainty in parameters that pertain to both groundwater flow and radionuclide transport in the models used for this purpose. This model report documents the following: (1) The SZ transport abstraction model, which consists of a set of radionuclide breakthrough curves at the accessible environment for use in the TSPA-LA simulations of radionuclide releases into the biosphere. These radionuclide breakthrough curves contain information on radionuclide-transport times through the SZ. (2) The SZ one-dimensional (1-D) transport model, which is incorporated in the TSPA-LA model to simulate the transport, decay, and ingrowth of radionuclide decay chains in the SZ. (3) The analysis of uncertainty in groundwater-flow and radionuclide-transport input parameters for the SZ transport abstraction model and the SZ 1-D transport model. (4) The analysis of the background concentration of alpha-emitting species in the groundwater of the SZ.

  5. Task-focused modeling in automated agriculture

    Science.gov (United States)

    Vriesenga, Mark R.; Peleg, K.; Sklansky, Jack

    1993-01-01

    Machine vision systems analyze image data to carry out automation tasks. Our interest is in machine vision systems that rely on models to achieve their designed task. When the model is interrogated from an a priori menu of questions, the model need not be complete. Instead, the machine vision system can use a partial model that contains a large amount of information in regions of interest and less information elsewhere. We propose an adaptive modeling scheme for machine vision, called task-focused modeling, which constructs a model having just sufficient detail to carry out the specified task. The model is detailed in regions of interest to the task and is less detailed elsewhere. This focusing effect saves time and reduces the computational effort expended by the machine vision system. We illustrate task-focused modeling by an example involving real-time micropropagation of plants in automated agriculture.

  6. Model-based automated testing of critical PLC programs.

    CERN Document Server

    Fernández Adiego, B; Tournier, J-C; González Suárez, V M; Bliudze, S

    2014-01-01

    Testing of critical PLC (Programmable Logic Controller) programs remains a challenging task for control system engineers as it can rarely be automated. This paper proposes a model based approach which uses the BIP (Behavior, Interactions and Priorities) framework to perform automated testing of PLC programs developed with the UNICOS (UNified Industrial COntrol System) framework. This paper defines the translation procedure and rules from UNICOS to BIP which can be fully automated in order to hide the complexity of the underlying model from the control engineers. The approach is illustrated and validated through the study of a water treatment process.

  7. Using Modeling and Simulation to Predict Operator Performance and Automation-Induced Complacency With Robotic Automation: A Case Study and Empirical Validation.

    Science.gov (United States)

    Wickens, Christopher D; Sebok, Angelia; Li, Huiyang; Sarter, Nadine; Gacy, Andrew M

    2015-09-01

    The aim of this study was to develop and validate a computational model of the automation complacency effect, as operators work on a robotic arm task, supported by three different degrees of automation. Some computational models of complacency in human-automation interaction exist, but those are formed and validated within the context of fairly simplified monitoring failures. This research extends model validation to a much more complex task, so that system designers can establish, without need for human-in-the-loop (HITL) experimentation, merits and shortcomings of different automation degrees. We developed a realistic simulation of a space-based robotic arm task that could be carried out with three different levels of trajectory visualization and execution automation support. Using this simulation, we performed HITL testing. Complacency was induced via several trials of correctly performing automation and then was assessed on trials when automation failed. Following a cognitive task analysis of the robotic arm operation, we developed a multicomponent model of the robotic operator and his or her reliance on automation, based in part on visual scanning. The comparison of model predictions with empirical results revealed that the model accurately predicted routine performance and predicted the responses to these failures after complacency developed. However, the scanning models do not account for the entire attention allocation effects of complacency. Complacency modeling can provide a useful tool for predicting the effects of different types of imperfect automation. The results from this research suggest that focus should be given to supporting situation awareness in automation development. © 2015, Human Factors and Ergonomics Society.

  8. RCrane: semi-automated RNA model building

    International Nuclear Information System (INIS)

    Keating, Kevin S.; Pyle, Anna Marie

    2012-01-01

    RCrane is a new tool for the partially automated building of RNA crystallographic models into electron-density maps of low or intermediate resolution. This tool helps crystallographers to place phosphates and bases into electron density and then automatically predicts and builds the detailed all-atom structure of the traced nucleotides. RNA crystals typically diffract to much lower resolutions than protein crystals. This low-resolution diffraction results in unclear density maps, which cause considerable difficulties during the model-building process. These difficulties are exacerbated by the lack of computational tools for RNA modeling. Here, RCrane, a tool for the partially automated building of RNA into electron-density maps of low or intermediate resolution, is presented. This tool works within Coot, a common program for macromolecular model building. RCrane helps crystallographers to place phosphates and bases into electron density and then automatically predicts and builds the detailed all-atom structure of the traced nucleotides. RCrane then allows the crystallographer to review the newly built structure and select alternative backbone conformations where desired. This tool can also be used to automatically correct the backbone structure of previously built nucleotides. These automated corrections can fix incorrect sugar puckers, steric clashes and other structural problems

  9. Automation strategies in five domains - A comparison of levels of automation, function allocation and visualisation of automatic functions

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, J. (Chalmers Univ. of Technology. Division Design and Human factors. Dept. of Product and Production Development, Goeteborg (Sweden))

    2011-01-15

    This study was conducted as a field study where control room operators and engineers from the refinery, heat and power, aviation, shipping and nuclear domains were interviewed regarding the use of automation and the visualisation of automatic functions. The purpose of the study was to collect experiences and best practices from the five studied domains on levels of automation, function allocation and visualisation of automatic functions. In total, nine different control room settings were visited. The studied settings were compared using a systemic approach based on a human-machine systems model. The results show that the 'left-over' principle is still the most commonly applied approach to function allocation, but in high-risk settings the decision whether to automate or not is more carefully considered. Regarding the visualisation of automatic functions, it was found that as long as each display type (process based, functional oriented, situation oriented and task based) is applied so that it corresponds to the same level of abstraction as the technical system, the operator's mental model will be supported. No single display type can, however, readily match all levels of abstraction at the same time - all display types are still needed and serve different purposes. (Author)

  10. Automated data acquisition technology development:Automated modeling and control development

    Science.gov (United States)

    Romine, Peter L.

    1995-01-01

    This report documents the completion of, and improvements made to, the software developed for automated data acquisition and automated modeling and control development on the Texas Micro rack-mounted PCs. This research was initiated because the Metal Processing Branch of NASA Marshall Space Flight Center identified a need for a mobile data acquisition and data analysis system customized for welding measurement and calibration. Several hardware configurations were evaluated and a PC-based system was chosen. The Welding Measurement System (WMS) is a dedicated instrument strictly for data acquisition and data analysis. In addition to the data acquisition functions described in this thesis, WMS also supports many functions associated with process control. The hardware and software requirements for an automated acquisition system for welding process parameters, welding equipment checkout, and welding process modeling were determined in 1992. From these recommendations, NASA purchased the necessary hardware and software. The new welding acquisition system is designed to collect welding parameter data and perform analysis to determine the voltage versus current arc-length relationship for VPPA welding. Once the results of this analysis are obtained, they can be used to develop a RAIL function to control welding startup and shutdown without torch crashing.

  11. Abstracting event-based control models for high autonomy systems

    Science.gov (United States)

    Luh, Cheng-Jye; Zeigler, Bernard P.

    1993-01-01

    A high autonomy system needs many models on which to base control, management, design, and other interventions. These models differ in level of abstraction and in formalism. Concepts and tools are needed to organize the models into a coherent whole. The paper deals with the abstraction processes for systematic derivation of related models for use in event-based control. The multifaceted modeling methodology is briefly reviewed. The morphism concepts needed for application to model abstraction are described. A theory for supporting the construction of DEVS models needed for event-based control is then presented. An implemented morphism on the basis of this theory is also described.

  12. Models of Automation Surprise: Results of a Field Survey in Aviation

    Directory of Open Access Journals (Sweden)

    Robert De Boer

    2017-09-01

    Full Text Available Automation surprises in aviation continue to be a significant safety concern and the community's search for effective strategies to mitigate them is ongoing. The literature has offered two fundamentally divergent directions, based on different ideas about the nature of cognition and collaboration with automation. In this paper, we report the results of a field study that empirically compared and contrasted two models of automation surprises: a normative individual-cognition model and a sensemaking model based on distributed cognition. Our data show a good fit for the sensemaking model. This finding is relevant for aviation safety, since our understanding of the cognitive processes that govern human interaction with automation drives what we need to do to reduce the frequency of automation-induced events.

  13. Automating risk analysis of software design models.

    Science.gov (United States)

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.

  14. Automation model of sewerage rehabilitation planning.

    Science.gov (United States)

    Yang, M D; Su, T C

    2006-01-01

    The major steps of sewerage rehabilitation include inspection of sewerage, assessment of structural conditions, computation of structural condition grades, and determination of rehabilitation methods and materials. Conventionally, sewerage rehabilitation planning relies on experts with professional background that is tedious and time-consuming. This paper proposes an automation model of planning optimal sewerage rehabilitation strategies for the sewer system by integrating image process, clustering technology, optimization, and visualization display. Firstly, image processing techniques, such as wavelet transformation and co-occurrence features extraction, were employed to extract various characteristics of structural failures from CCTV inspection images. Secondly, a classification neural network was established to automatically interpret the structural conditions by comparing the extracted features with the typical failures in a databank. Then, to achieve optimal rehabilitation efficiency, a genetic algorithm was used to determine appropriate rehabilitation methods and substitution materials for the pipe sections with a risk of mal-function and even collapse. Finally, the result from the automation model can be visualized in a geographic information system in which essential information of the sewer system and sewerage rehabilitation plans are graphically displayed. For demonstration, the automation model of optimal sewerage rehabilitation planning was applied to a sewer system in east Taichung, Chinese Taiwan.
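
    The optimization step can be illustrated with a toy genetic algorithm that assigns one rehabilitation method per pipe section and trades off cost against residual failure risk. The methods, costs and risk numbers below are invented placeholders, not values from the paper.

```python
import random

# (name, cost, residual risk multiplier) - illustrative placeholders
METHODS = [("none", 0, 1.0), ("patch", 10, 0.5), ("liner", 25, 0.2),
           ("replace", 60, 0.05)]
SECTION_RISK = [0.9, 0.2, 0.6, 0.8, 0.1]     # per-section condition risk

def fitness(plan):
    """Higher is better: penalize cost plus heavily weighted residual risk."""
    cost = sum(METHODS[g][1] for g in plan)
    risk = sum(SECTION_RISK[i] * METHODS[g][2] for i, g in enumerate(plan))
    return -(cost + 100.0 * risk)

def evolve(generations=200, pop_size=30, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randrange(len(METHODS)) for _ in SECTION_RISK]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]          # elitist selection
        children = []
        for _ in range(pop_size - len(survivors)):
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, len(a))        # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.1:                # mutation
                child[rng.randrange(len(child))] = rng.randrange(len(METHODS))
            children.append(child)
        pop = survivors + children
    best = max(pop, key=fitness)
    return [(i, METHODS[g][0]) for i, g in enumerate(best)]

print(evolve())   # e.g. replace the riskiest sections, leave sound ones alone
```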

  15. Automated protein structure modeling with SWISS-MODEL Workspace and the Protein Model Portal.

    Science.gov (United States)

    Bordoli, Lorenza; Schwede, Torsten

    2012-01-01

    Comparative protein structure modeling is a computational approach to build three-dimensional structural models for proteins using experimental structures of related protein family members as templates. Regular blind assessments of modeling accuracy have demonstrated that comparative protein structure modeling is currently the most reliable technique to model protein structures. Homology models are often sufficiently accurate to substitute for experimental structures in a wide variety of applications. Since the usefulness of a model for specific application is determined by its accuracy, model quality estimation is an essential component of protein structure prediction. Comparative protein modeling has become a routine approach in many areas of life science research since fully automated modeling systems allow also nonexperts to build reliable models. In this chapter, we describe practical approaches for automated protein structure modeling with SWISS-MODEL Workspace and the Protein Model Portal.

  16. Automated structure solution, density modification and model building.

    Science.gov (United States)

    Terwilliger, Thomas C

    2002-11-01

    The approaches that form the basis of automated structure solution in SOLVE and RESOLVE are described. The use of a scoring scheme to convert decision making in macromolecular structure solution to an optimization problem has proven very useful and in many cases a single clear heavy-atom solution can be obtained and used for phasing. Statistical density modification is well suited to an automated approach to structure solution because the method is relatively insensitive to choices of numbers of cycles and solvent content. The detection of non-crystallographic symmetry (NCS) in heavy-atom sites and checking of potential NCS operations against the electron-density map has proven to be a reliable method for identification of NCS in most cases. Automated model building beginning with an FFT-based search for helices and sheets has been successful in automated model building for maps with resolutions as low as 3 Å. The entire process can be carried out in a fully automatic fashion in many cases.

  17. Optimization of automation: I. Estimation method of cognitive automation rates reflecting the effects of automation on human operators in nuclear power plants

    International Nuclear Information System (INIS)

    Lee, Seung Min; Kim, Jong Hyun; Seong, Poong Hyun

    2014-01-01

    Highlights: • We propose an estimation method of the automation rate by taking the advantages of automation as the estimation measures. • We conduct the experiments to examine the validity of the suggested method. • The higher the cognitive automation rate is, the greater the decreased rate of the working time will be. • The usefulness of the suggested estimation method is proved by statistical analyses. - Abstract: Since automation was introduced in various industrial fields, the concept of the automation rate has been used to indicate the inclusion proportion of automation among all work processes or facilities. Expressions of the inclusion proportion of automation are predictable, as is the ability to express the degree of the enhancement of human performance. However, many researchers have found that a high automation rate does not guarantee high performance. Therefore, to reflect the effects of automation on human performance, this paper proposes a new estimation method of the automation rate that considers the effects of automation on human operators in nuclear power plants (NPPs). Automation in NPPs can be divided into two types: system automation and cognitive automation. Some general descriptions and characteristics of each type of automation are provided, and the advantages of automation are investigated. The advantages of each type of automation are used as measures of the estimation method of the automation rate. One advantage was found to be a reduction in the number of tasks, and another was a reduction in human cognitive task loads. The system and the cognitive automation rate were proposed as quantitative measures by taking advantage of the aforementioned benefits. To quantify the required human cognitive task loads and thus suggest the cognitive automation rate, Conant’s information-theory-based model was applied. The validity of the suggested method, especially as regards the cognitive automation rate, was proven by conducting
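
    As a loose illustration of the information-theoretic flavor of the approach (the paper's exact Conant-based formulation is not reproduced in the abstract), cognitive task load can be scored as the Shannon entropy of the information an operator must process, and a cognitive automation rate as the fraction of that entropy removed when automation takes over part of the processing. All distributions below are hypothetical.

```python
import math

def shannon_entropy(probabilities):
    """H = -sum p log2 p over the nonzero outcome probabilities."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Hypothetical distributions over the signals an operator must track,
# before and after some monitoring tasks are automated away.
manual_load = shannon_entropy([0.25, 0.25, 0.25, 0.25])    # 2.0 bits
with_automation = shannon_entropy([0.7, 0.1, 0.1, 0.1])    # ~1.36 bits

cognitive_automation_rate = 1 - with_automation / manual_load
print(f"cognitive automation rate ~ {cognitive_automation_rate:.2f}")
```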

  18. Variability-Specific Abstraction Refinement for Family-Based Model Checking

    DEFF Research Database (Denmark)

    Dimovski, Aleksandar; Wasowski, Andrzej

    2017-01-01

    ... and property, while the number of possible scenarios is very large. In this work, we present an automatic iterative abstraction refinement procedure for family-based model checking. We use Craig interpolation to refine abstract variational models based on the obtained spurious counterexamples (traces ...

  19. Model and information abstraction for description-driven systems

    International Nuclear Information System (INIS)

    Estrella, F.; McClatchey, R.; Kovacs, Z.; Goff, J.-M.L.

    2001-01-01

    A crucial factor in the creation of adaptable systems dealing with changing requirements is the suitability of the underlying technology in allowing the evolution of the system. A reflective system utilizes an open architecture where implicit system aspects are reified to become explicit first-class (meta-data) objects. These implicit system aspects are often fundamental structures which are inaccessible and immutable, and their reification as meta-data objects can serve as the basis for changes and extensions to the system, making it self-describing. To address the evolvability issue, the authors propose a reflective architecture based on two orthogonal abstractions: model abstraction and information abstraction. In this architecture the modeling abstractions allow for the separation of the description meta-data from the system aspects they represent, so that they can be managed and versioned independently, asynchronously and explicitly.

  20. Knowledge acquisition for temporal abstraction.

    Science.gov (United States)

    Stein, A; Musen, M A; Shahar, Y

    1996-01-01

    Temporal abstraction is the task of detecting relevant patterns in data over time. The knowledge-based temporal-abstraction method uses knowledge about a clinical domain's contexts, external events, and parameters to create meaningful interval-based abstractions from raw time-stamped clinical data. In this paper, we describe the acquisition and maintenance of domain-specific temporal-abstraction knowledge. Using the PROTEGE-II framework, we have designed a graphical tool for acquiring temporal knowledge directly from expert physicians, maintaining the knowledge in a sharable form, and converting the knowledge into a suitable format for use by an appropriate problem-solving method. In initial tests, the tool offered significant gains in our ability to rapidly acquire temporal knowledge and to use that knowledge to perform automated temporal reasoning.

  1. RCrane: semi-automated RNA model building.

    Science.gov (United States)

    Keating, Kevin S; Pyle, Anna Marie

    2012-08-01

    RNA crystals typically diffract to much lower resolutions than protein crystals. This low-resolution diffraction results in unclear density maps, which cause considerable difficulties during the model-building process. These difficulties are exacerbated by the lack of computational tools for RNA modeling. Here, RCrane, a tool for the partially automated building of RNA into electron-density maps of low or intermediate resolution, is presented. This tool works within Coot, a common program for macromolecular model building. RCrane helps crystallographers to place phosphates and bases into electron density and then automatically predicts and builds the detailed all-atom structure of the traced nucleotides. RCrane then allows the crystallographer to review the newly built structure and select alternative backbone conformations where desired. This tool can also be used to automatically correct the backbone structure of previously built nucleotides. These automated corrections can fix incorrect sugar puckers, steric clashes and other structural problems.

  2. Coupling Radar Rainfall to Hydrological Models for Water Abstraction Management

    Science.gov (United States)

    Asfaw, Alemayehu; Shucksmith, James; Smith, Andrea; MacDonald, Ken

    2015-04-01

    The impacts of climate change and growing water use are likely to put considerable pressure on water resources and the environment. In the UK, a reform to surface water abstraction policy has recently been proposed which aims to increase the efficiency of using available water resources whilst minimising impacts on the aquatic environment. Key aspects of this reform include the consideration of dynamic rather than static abstraction licensing, as well as introducing water trading concepts. Dynamic licensing will permit varying levels of abstraction depending on environmental conditions (i.e. river flow and quality). The practical implementation of an effective dynamic abstraction strategy requires suitable flow forecasting techniques to inform abstraction asset management. Potentially, the predicted availability of water resources within a catchment can be coupled to predicted demand and current storage to inform a cost-effective water resource management strategy which minimises environmental impacts. The aim of this work is to use a historical analysis of a UK case study catchment to compare potential water resource availability under a modelled dynamic abstraction scenario informed by a flow forecasting model, against observed abstraction under a conventional abstraction regime. The work also demonstrates the impacts of modelling uncertainties on the accuracy of predicted water availability over a range of forecast lead times. The study utilised the PDM (Probability-Distributed Model), a conceptual rainfall-runoff model developed by the Centre for Ecology & Hydrology, set up in the Dove River catchment (UK) using 1 km2 resolution radar rainfall as input and 15 min resolution gauged flow data for calibration and validation. Data assimilation procedures are implemented to improve flow predictions using observed flow data. Uncertainties in the radar rainfall data used in the model are quantified using an artificial statistical error model described by a Gaussian distribution and ...
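
    The forecasting-to-abstraction coupling can be sketched with a drastically simplified single-store rainfall-runoff loop, standing in for PDM in spirit only, feeding a dynamic abstraction rule. All parameters and the hands-off flow threshold are illustrative, not a calibrated Dove catchment setup.

```python
def simple_bucket_model(rainfall_mm, capacity=80.0, drainage=0.05,
                        initial_storage=40.0):
    """Return simulated runoff (mm per step) for a rainfall series."""
    storage, runoff = initial_storage, []
    for rain in rainfall_mm:
        storage += rain
        excess = max(0.0, storage - capacity)    # saturation-excess flow
        storage -= excess
        baseflow = drainage * storage            # slow linear drainage
        storage -= baseflow
        runoff.append(excess + baseflow)
    return runoff

# With radar rainfall as forcing, the predicted flow can feed a dynamic
# abstraction rule, e.g. only permit abstraction above a hands-off flow.
radar_rain = [0, 5, 20, 35, 10, 0, 0, 2]
flows = simple_bucket_model(radar_rain)
HANDS_OFF_FLOW = 4.0
print([f"{'abstract' if q > HANDS_OFF_FLOW else 'hold'} ({q:.1f} mm)"
       for q in flows])
```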

  3. Cellular-automaton fluids: A model for flow in porous media

    International Nuclear Information System (INIS)

    Rothman, D.H.

    1987-01-01

    Because the intrinsic inhomogeneity of porous media makes the application of proper boundary conditions difficult, fluid flow through microgeometric models has typically been achieved with idealized arrays of geometrically simple pores, throats, and cracks. The author proposes here an attractive alternative, capable of freely and accurately modeling fluid flow in grossly irregular geometries. This new method numerically solves the Navier-Stokes equations using the cellular-automaton fluid model introduced by Frisch, Hasslacher, and Pomeau. The cellular-automaton fluid is extraordinarily simple - particles of unit mass traveling with unit velocity reside on a triangular lattice and obey elementary collision rules - but capable of modeling much of the rich complexity of real fluid flow. The author shows how cellular-automaton fluids are applied to the study of porous media. In particular, he discusses issues of scale on the cellular-automaton lattice and presents the results of 2-D simulations, including numerical estimation of permeability and verification of Darcy's law.
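
    The propagate/collide structure of such lattice gases is easy to show in code. The sketch below uses the simpler square-lattice HPP predecessor rather than the triangular-lattice FHP model the paper builds on, since FHP's collision table is lengthy; adding solid sites that bounce particles back would turn this into a crude porous-medium flow.

```python
import numpy as np

def hpp_step(channels):
    """One update of an HPP lattice gas. channels: bool array (4, H, W)."""
    e, w, n, s = channels
    # Collision: exactly a head-on pair (E+W or N+S) scatters 90 degrees.
    ew_pair = e & w & ~n & ~s
    ns_pair = n & s & ~e & ~w
    e, w = (e & ~ew_pair) | ns_pair, (w & ~ew_pair) | ns_pair
    n, s = (n & ~ns_pair) | ew_pair, (s & ~ns_pair) | ew_pair
    # Propagation: shift each channel one site along its velocity
    # (periodic boundaries via np.roll).
    return np.stack([np.roll(e, 1, axis=1), np.roll(w, -1, axis=1),
                     np.roll(n, -1, axis=0), np.roll(s, 1, axis=0)])

rng = np.random.default_rng(0)
state = rng.random((4, 32, 32)) < 0.2          # random initial gas
for _ in range(100):
    state = hpp_step(state)
print("particle count conserved:", int(state.sum()))
```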

  4. Automation of Endmember Pixel Selection in SEBAL/METRIC Model

    Science.gov (United States)

    Bhattarai, N.; Quackenbush, L. J.; Im, J.; Shaw, S. B.

    2015-12-01

    The commonly applied surface energy balance for land (SEBAL) and its variant, mapping evapotranspiration (ET) at high resolution with internalized calibration (METRIC) models require manual selection of endmember (i.e. hot and cold) pixels to calibrate sensible heat flux. Current approaches for automating this process are based on statistical methods and do not appear to be robust under varying climate conditions and seasons. In this paper, we introduce a new approach based on simple machine learning tools and search algorithms that provides an automatic and time efficient way of identifying endmember pixels for use in these models. The fully automated models were applied on over 100 cloud-free Landsat images with each image covering several eddy covariance flux sites in Florida and Oklahoma. Observed land surface temperatures at automatically identified hot and cold pixels were within 0.5% of those from pixels manually identified by an experienced operator (coefficient of determination, R2, ≥ 0.92, Nash-Sutcliffe efficiency, NSE, ≥ 0.92, and root mean squared error, RMSE, ≤ 1.67 K). Daily ET estimates derived from the automated SEBAL and METRIC models were in good agreement with their manual counterparts (e.g., NSE ≥ 0.91 and RMSE ≤ 0.35 mm day-1). Automated and manual pixel selection resulted in similar estimates of observed ET across all sites. The proposed approach should reduce time demands for applying SEBAL/METRIC models and allow for their more widespread and frequent use. This automation can also reduce potential bias that could be introduced by an inexperienced operator and extend the domain of the models to new users.
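
    A compact version of the endmember-selection idea: draw the cold pixel from well-vegetated, cool pixels and the hot pixel from bare, warm pixels via percentile windows on NDVI and land surface temperature. The percentile choices and synthetic rasters are illustrative, not the paper's trained selection rules.

```python
import numpy as np

def select_endmembers(ndvi, lst):
    """Return (row, col) of candidate cold and hot calibration pixels."""
    cold_mask = (ndvi >= np.percentile(ndvi, 95)) & \
                (lst <= np.percentile(lst, 20))
    hot_mask = (ndvi <= np.percentile(ndvi, 10)) & \
               (lst >= np.percentile(lst, 95))
    def pick(mask, temps, coldest):
        rows, cols = np.nonzero(mask)
        if rows.size == 0:
            raise ValueError("no candidate pixels; widen the percentiles")
        order = np.argsort(temps[rows, cols])
        i = order[0] if coldest else order[-1]
        return rows[i], cols[i]
    return pick(cold_mask, lst, True), pick(hot_mask, lst, False)

# Synthetic NDVI and LST rasters with a toy anti-correlation (in K).
rng = np.random.default_rng(42)
ndvi = rng.uniform(0.05, 0.9, (100, 100))
lst = 320.0 - 25.0 * ndvi + rng.normal(0, 1.5, (100, 100))
cold, hot = select_endmembers(ndvi, lst)
print("cold pixel", cold, "hot pixel", hot)
```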

  5. DockoMatic - automated ligand creation and docking

    Directory of Open Access Journals (Sweden)

    Hampikian Greg

    2010-11-01

    Full Text Available Abstract. Background: The application of computational modeling to rationally design drugs and characterize macro biomolecular receptors has proven increasingly useful due to the accessibility of computing clusters and clouds. AutoDock is a well-known and powerful software program used to model ligand-to-receptor binding interactions. In its current version, AutoDock requires significant amounts of user time to set up and run jobs and to collect results. This paper presents DockoMatic, a user-friendly Graphical User Interface (GUI) application that eases and automates the creation and management of AutoDock jobs for high-throughput screening of ligand-to-receptor interactions. Results: DockoMatic allows the user to invoke and manage AutoDock jobs on a single computer or cluster, including jobs for evaluating secondary ligand interactions. It also automates the process of collecting, summarizing, and viewing results. In addition, DockoMatic automates creation of peptide ligand .pdb files from strings of single-letter amino acid abbreviations. Conclusions: DockoMatic significantly reduces the complexity of managing multiple AutoDock jobs by facilitating ligand and AutoDock job creation and management.
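
    The ligand-creation convenience DockoMatic provides can be hinted at by the lookup-and-validation step below, which expands a single-letter amino acid string into the three-letter residue names a .pdb builder would iterate over. Building actual atom coordinates is left to the real tool; this sketch shows the sequence-expansion idea only.

```python
ONE_TO_THREE = {
    "A": "ALA", "R": "ARG", "N": "ASN", "D": "ASP", "C": "CYS",
    "Q": "GLN", "E": "GLU", "G": "GLY", "H": "HIS", "I": "ILE",
    "L": "LEU", "K": "LYS", "M": "MET", "F": "PHE", "P": "PRO",
    "S": "SER", "T": "THR", "W": "TRP", "Y": "TYR", "V": "VAL",
}

def peptide_residues(sequence):
    """Expand a one-letter peptide string into three-letter residue names."""
    try:
        return [ONE_TO_THREE[aa] for aa in sequence.upper()]
    except KeyError as bad:
        raise ValueError(f"unknown amino acid code: {bad}") from None

print(peptide_residues("GHK"))   # -> ['GLY', 'HIS', 'LYS']
```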

  6. Compositional Abstraction of PEPA Models for Transient Analysis

    DEFF Research Database (Denmark)

    Smith, Michael James Andrew

    2010-01-01

    Stochastic process algebras such as PEPA allow complex stochastic models to be described in a compositional way, but this leads to state space explosion problems. To combat this, there has been a great deal of work in developing techniques for abstracting Markov chains. In particular, abstract - or interval - Markov chains allow us to aggregate states in such a way as to safely bound transient probabilities of the original Markov chain. Whilst we can apply this technique directly to a PEPA model, it requires us to obtain the CTMC of the model, whose state space may be too large to construct ...
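
    The interval-abstraction construction itself is compact enough to sketch: group states into blocks, and for each abstract transition keep the minimum and maximum, over concrete states in the source block, of the probability mass entering the target block. The chain below is a toy discrete-time example, not a PEPA-derived CTMC (which would first be uniformized to a DTMC).

```python
import numpy as np

def interval_abstraction(P, blocks):
    """P: (n, n) stochastic matrix; blocks: list of lists of state ids."""
    k = len(blocks)
    lower, upper = np.ones((k, k)), np.zeros((k, k))
    for i, src in enumerate(blocks):
        for j, dst in enumerate(blocks):
            # Probability mass from each concrete source state into `dst`.
            mass = P[np.ix_(src, dst)].sum(axis=1)
            lower[i, j], upper[i, j] = mass.min(), mass.max()
    return lower, upper

P = np.array([[0.5, 0.5, 0.0, 0.0],
              [0.2, 0.3, 0.5, 0.0],
              [0.0, 0.1, 0.4, 0.5],
              [0.0, 0.0, 0.0, 1.0]])
lo, hi = interval_abstraction(P, blocks=[[0, 1], [2, 3]])
print(lo)   # elementwise lower bounds on block-to-block probability
print(hi)   # elementwise upper bounds; transient bounds follow by iteration
```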

  7. Automated particulate sampler field test model operations guide

    Energy Technology Data Exchange (ETDEWEB)

    Bowyer, S.M.; Miley, H.S.

    1996-10-01

    The Automated Particulate Sampler Field Test Model Operations Guide is a collection of documents which provides a complete picture of the Automated Particulate Sampler (APS) and the field test in which it was evaluated. The Pacific Northwest National Laboratory (PNNL) Automated Particulate Sampler was developed for the purpose of radionuclide particulate monitoring under the Comprehensive Test Ban Treaty (CTBT). Its design was directed by anticipated requirements of small size, low power consumption, low noise level, fully automatic operation, and, above all, the sensitivity requirements of the Conference on Disarmament Working Paper 224 (CDWP224). This guide is intended both to serve as a reference document for the APS and to provide detailed instructions on how to operate the sampler. This document provides a complete description of the APS Field Test Model and all the activity related to its evaluation and progression.

  8. A model based message passing approach for flexible and scalable home automation controllers

    Energy Technology Data Exchange (ETDEWEB)

    Bienhaus, D. [INNIAS GmbH und Co. KG, Frankenberg (Germany); David, K.; Klein, N.; Kroll, D. [ComTec Kassel Univ., SE Kassel Univ. (Germany); Heerdegen, F.; Jubeh, R.; Zuendorf, A. [Kassel Univ. (Germany). FG Software Engineering; Hofmann, J. [BSC Computer GmbH, Allendorf (Germany)

    2012-07-01

    There is a large variety of home automation systems, most of them proprietary solutions from different vendors. In addition, the configuration and administration of home automation systems is frequently a very complex task, especially if more complex functionality is to be achieved. Therefore, an open model for home automation was developed that is especially designed for easy integration of various home automation systems. This solution also provides a simple modeling approach that is inspired by typical home automation components like switches, timers, etc. In addition, a model-based technology to achieve rich functionality and usability was implemented. (orig.)

  9. An Abstract Model of Historical Processes

    Directory of Open Access Journals (Sweden)

    Michael Poulshock

    2017-06-01

    Full Text Available A theoretical model is presented which provides a way to simulate, at a very abstract level, power struggles in the social world. In the model, agents can benefit or harm each other, to varying degrees and with differing levels of influence. The agents interact over time, using the power they have to try to get more of it, while being constrained in their strategic choices by social inertia. The outcomes of the model are probabilistic. More research is needed to determine whether the model has any empirical validity.

  10. Applications of Bayesian temperature profile reconstruction to automated comparison with heat transport models and uncertainty quantification of current diffusion

    International Nuclear Information System (INIS)

    Irishkin, M.; Imbeaux, F.; Aniel, T.; Artaud, J.F.

    2015-01-01

    Highlights: • We developed a method for automated comparison of experimental data with models. • A unique platform implements Bayesian analysis and integrated modelling tools. • The method is tokamak-generic and is applied to Tore Supra and JET pulses. • Validation of a heat transport model is carried out. • We quantified the uncertainties due to Te profiles in current diffusion simulations. - Abstract: In the context of present and future long pulse tokamak experiments yielding a growing size of measured data per pulse, automating data consistency analysis and comparisons of measurements with models is a critical matter. To address these issues, the present work describes an expert system that carries out in an integrated and fully automated way (i) a reconstruction of plasma profiles from the measurements, using Bayesian analysis (ii) a prediction of the reconstructed quantities, according to some models and (iii) a comparison of the first two steps. The first application shown is devoted to the development of an automated comparison method between the experimental plasma profiles reconstructed using Bayesian methods and time dependent solutions of the transport equations. The method was applied to model validation of a simple heat transport model with three radial shape options. It has been tested on a database of 21 Tore Supra and 14 JET shots. The second application aims at quantifying uncertainties due to the electron temperature profile in current diffusion simulations. A systematic reconstruction of the Ne, Te, Ti profiles was first carried out for all time slices of the pulse. The Bayesian 95% highest probability intervals on the Te profile reconstruction were then used for (i) data consistency check of the flux consumption and (ii) defining a confidence interval for the current profile simulation. The method has been applied to one Tore Supra pulse and one JET pulse.
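
    The record's use of Bayesian 95% highest probability intervals can be illustrated with a short, generic sketch: given posterior samples of a quantity (e.g. Te at one radius), the narrowest interval containing 95% of the samples is returned. This is a standard construction from MCMC-style samples, not code from the cited platform.

```python
import numpy as np

def hpd_interval(samples, mass=0.95):
    """Narrowest interval containing `mass` of the posterior samples
    (the highest-probability interval for a unimodal posterior)."""
    x = np.sort(np.asarray(samples))
    n = len(x)
    k = int(np.ceil(mass * n))           # samples the interval must cover
    widths = x[k - 1:] - x[:n - k + 1]   # width of every covering window
    i = np.argmin(widths)                # narrowest window wins
    return x[i], x[i + k - 1]

rng = np.random.default_rng(0)
print(hpd_interval(rng.normal(1000.0, 50.0, 20000)))  # roughly (902, 1098)
```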

  11. Constraint-Based Abstraction of a Model Checker for Infinite State Systems

    DEFF Research Database (Denmark)

    Banda, Gourinath; Gallagher, John Patrick

    Abstract interpretation-based model checking provides an approach to verifying properties of infinite-state systems. In practice, most previous work on abstract model checking is either restricted to verifying universal properties, or develops special techniques for temporal logics such as modal t...... to implementation of abstract model checking algorithms for abstract domains based on constraints, making use of an SMT solver....

  12. Aviation Safety: Modeling and Analyzing Complex Interactions between Humans and Automated Systems

    Science.gov (United States)

    Rungta, Neha; Brat, Guillaume; Clancey, William J.; Linde, Charlotte; Raimondi, Franco; Seah, Chin; Shafto, Michael

    2013-01-01

    The on-going transformation from the current US Air Traffic System (ATS) to the Next Generation Air Traffic System (NextGen) will force the introduction of new automated systems and most likely will cause automation to migrate from ground to air. This will yield new function allocations between humans and automation and therefore change the roles and responsibilities in the ATS. Yet, safety in NextGen is required to be at least as good as in the current system. We therefore need techniques to evaluate the safety of the interactions between humans and automation. We think that current human factors studies and simulation-based techniques will fall short in the face of the ATS complexity, and that we need to add more automated techniques to simulations, such as model checking, which offers exhaustive coverage of the non-deterministic behaviors in nominal and off-nominal scenarios. In this work, we present a verification approach based both on simulations and on model checking for evaluating the roles and responsibilities of humans and automation. Models are created using Brahms (a multi-agent framework) and we show that the traditional Brahms simulations can be integrated with automated exploration techniques based on model checking, thus offering a complete exploration of the behavioral space of the scenario. Our formal analysis supports the notion of beliefs and probabilities to reason about human behavior. We demonstrate the technique with the Überlingen accident since it exemplifies authority problems when receiving conflicting advice from human and automated systems.

  13. Automation of electroweak NLO corrections in general models

    Energy Technology Data Exchange (ETDEWEB)

    Lang, Jean-Nicolas [Universitaet Wuerzburg (Germany)

    2016-07-01

    I discuss the automated generation of scattering amplitudes in general quantum field theories at next-to-leading order in perturbation theory. The work is based on Recola, a highly efficient one-loop amplitude generator for the Standard Model, which I have extended so that it can deal with general quantum field theories. Internally, Recola computes off-shell currents, and for new models new rules for off-shell currents emerge, which are derived from the Feynman rules. My work relies on the UFO format, which can be obtained from a suitable model builder, e.g. FeynRules. I have developed tools to derive the necessary counterterm structures and to perform the renormalization within Recola in an automated way. I describe the procedure using the example of the two-Higgs-doublet model.

  14. Automated planning through abstractions in dynamic and stochastic environments

    OpenAIRE

    Martínez Muñoz, Moisés

    2016-01-01

    International Mention in the doctoral degree. Generating sequences of actions - plans - for an automatic system, like a robot, using Automated Planning is particularly difficult in stochastic and/or dynamic environments. These plans are composed of actions whose execution, in certain scenarios, might fail, which in turn prevents the execution of the rest of the actions in the plan. Also, in some environments, plans must be generated fast, both at the start of the execution and after every ex...

  15. Automated comparison of Bayesian reconstructions of experimental profiles with physical models

    International Nuclear Information System (INIS)

    Irishkin, Maxim

    2014-01-01

    In this work we developed an expert system that carries out in an integrated and fully automated way i) a reconstruction of plasma profiles from the measurements, using Bayesian analysis ii) a prediction of the reconstructed quantities, according to some models and iii) an intelligent comparison of the first two steps. This system includes systematic checking of the internal consistency of the reconstructed quantities, enables automated model validation and, if a well-validated model is used, can be applied to help detecting interesting new physics in an experiment. The work shows three applications of this quite general system. The expert system can successfully detect failures in the automated plasma reconstruction and provide (on successful reconstruction cases) statistics of agreement of the models with the experimental data, i.e. information on the model validity. (author) [fr

  16. Abstraction and Model Checking in the PEPA Plug-in for Eclipse

    DEFF Research Database (Denmark)

    Smith, Michael James Andrew

    2010-01-01

    ... lead to very large Markov chains. One way of analysing such models is to use abstraction - constructing a smaller model that bounds the properties of the original. We present an extension to the PEPA plug-in for Eclipse that enables abstracting and model checking of PEPA models. This implements two new...

  17. Resource Allocation Model for Modelling Abstract RTOS on Multiprocessor System-on-Chip

    DEFF Research Database (Denmark)

    Virk, Kashif Munir; Madsen, Jan

    2003-01-01

    Resource allocation is an important problem in RTOSs and has been an active area of research. Numerous approaches have been developed and many different techniques have been combined for a wide range of applications. In this paper, we address the problem of resource allocation in the context of modelling an abstract RTOS on multiprocessor SoC platforms. We discuss the implementation details of a simplified basic priority inheritance protocol for our abstract system model in SystemC.

  18. Model-Based approaches to Human-Automation Systems Design

    DEFF Research Database (Denmark)

    Jamieson, Greg A.; Andersson, Jonas; Bisantz, Ann

    2012-01-01

    Human-automation interaction in complex systems is common, yet design for this interaction is often conducted without explicit consideration of the role of the human operator. Fortunately, there are a number of modeling frameworks proposed for supporting this design activity. However...... (and reportedly one or two critics) can engage one another on several agreed questions about such frameworks. The goal is to aid non-aligned practitioners in choosing between alternative frameworks for their human-automation interaction design challenges....

  19. Automated MRI segmentation for individualized modeling of current flow in the human head.

    Science.gov (United States)

    Huang, Yu; Dmochowski, Jacek P; Su, Yuzhuo; Datta, Abhishek; Rorden, Christopher; Parra, Lucas C

    2013-12-01

    High-definition transcranial direct current stimulation (HD-tDCS) and high-density electroencephalography require accurate models of current flow for precise targeting and current source reconstruction. At a minimum, such modeling must capture the idiosyncratic anatomy of the brain, cerebrospinal fluid (CSF) and skull for each individual subject. Currently, the process to build such high-resolution individualized models from structural magnetic resonance images requires labor-intensive manual segmentation, even when utilizing available automated segmentation tools. Also, accurate placement of many high-density electrodes on an individual scalp is a tedious procedure. The goal was to develop fully automated techniques to reduce the manual effort in such a modeling process. A fully automated segmentation technique based on Statistical Parametric Mapping 8, including an improved tissue probability map and an automated correction routine for segmentation errors, was developed, along with an automated electrode placement tool for high-density arrays. The performance of these automated routines was evaluated against results from manual segmentation on four healthy subjects and seven stroke patients. The criteria include segmentation accuracy, the difference of current flow distributions in resulting HD-tDCS models and the optimized current flow intensities on cortical targets. The segmentation tool can segment out not just the brain but also provide accurate results for CSF, skull and other soft tissues with a field of view extending to the neck. Compared to manual results, automated segmentation deviates by only 7% and 18% for normal and stroke subjects, respectively. The predicted electric fields in the brain deviate by 12% and 29% respectively, which is well within the variability observed for various modeling choices. Finally, optimized current flow intensities on cortical targets do not differ significantly. Fully automated individualized modeling may now be feasible.

  20. Particle Tracking Model and Abstraction of Transport Processes

    Energy Technology Data Exchange (ETDEWEB)

    B. Robinson

    2004-10-21

    The purpose of this report is to document the abstraction model being used in total system performance assessment (TSPA) model calculations for radionuclide transport in the unsaturated zone (UZ). The UZ transport abstraction model uses the particle-tracking method that is incorporated into the finite element heat and mass model (FEHM) computer code (Zyvoloski et al. 1997 [DIRS 100615]) to simulate radionuclide transport in the UZ. This report outlines the assumptions, design, and testing of a model for calculating radionuclide transport in the UZ at Yucca Mountain. In addition, methods for determining and inputting transport parameters are outlined for use in the TSPA for license application (LA) analyses. Process-level transport model calculations are documented in another report for the UZ (BSC 2004 [DIRS 164500]). Three-dimensional, dual-permeability flow fields generated to characterize UZ flow (documented by BSC 2004 [DIRS 169861]; DTN: LB03023DSSCP9I.001 [DIRS 163044]) are converted to make them compatible with the FEHM code for use in this abstraction model. This report establishes the numerical method and demonstrates the use of the model that is intended to represent UZ transport in the TSPA-LA. Capability of the UZ barrier for retarding the transport is demonstrated in this report, and by the underlying process model (BSC 2004 [DIRS 164500]). The technical scope, content, and management of this report are described in the planning document "Technical Work Plan for: Unsaturated Zone Transport Model Report Integration" (BSC 2004 [DIRS 171282]). Deviations from the technical work plan (TWP) are noted within the text of this report, as appropriate. The latest version of this document is being prepared principally to correct parameter values found to be in error due to transcription errors, changes in source data that were not captured in the report, calculation errors, and errors in interpretation of source data.

  1. Particle Tracking Model and Abstraction of Transport Processes

    International Nuclear Information System (INIS)

    Robinson, B.

    2004-01-01

    The purpose of this report is to document the abstraction model being used in total system performance assessment (TSPA) model calculations for radionuclide transport in the unsaturated zone (UZ). The UZ transport abstraction model uses the particle-tracking method that is incorporated into the finite element heat and mass model (FEHM) computer code (Zyvoloski et al. 1997 [DIRS 100615]) to simulate radionuclide transport in the UZ. This report outlines the assumptions, design, and testing of a model for calculating radionuclide transport in the UZ at Yucca Mountain. In addition, methods for determining and inputting transport parameters are outlined for use in the TSPA for license application (LA) analyses. Process-level transport model calculations are documented in another report for the UZ (BSC 2004 [DIRS 164500]). Three-dimensional, dual-permeability flow fields generated to characterize UZ flow (documented by BSC 2004 [DIRS 169861]; DTN: LB03023DSSCP9I.001 [DIRS 163044]) are converted to make them compatible with the FEHM code for use in this abstraction model. This report establishes the numerical method and demonstrates the use of the model that is intended to represent UZ transport in the TSPA-LA. Capability of the UZ barrier for retarding the transport is demonstrated in this report, and by the underlying process model (BSC 2004 [DIRS 164500]). The technical scope, content, and management of this report are described in the planning document "Technical Work Plan for: Unsaturated Zone Transport Model Report Integration" (BSC 2004 [DIRS 171282]). Deviations from the technical work plan (TWP) are noted within the text of this report, as appropriate. The latest version of this document is being prepared principally to correct parameter values found to be in error due to transcription errors, changes in source data that were not captured in the report, calculation errors, and errors in interpretation of source data.

  2. Simulation Model of Automated Peat Briquetting Press Drive

    Directory of Open Access Journals (Sweden)

    A. A. Marozka

    2012-01-01

    Full Text Available The paper presents the developed fully functional simulation model of an automated peat briquetting press drive. The given model makes it possible to reduce financial and time costs while developing, designing and operating a double-stamp peat briquetting press drive.

  3. Automated side-chain model building and sequence assignment by template matching

    International Nuclear Information System (INIS)

    Terwilliger, Thomas C.

    2002-01-01

    A method for automated macromolecular side-chain model building and for aligning the sequence to the map is described. An algorithm is described for automated building of side chains in an electron-density map once a main-chain model is built and for alignment of the protein sequence to the map. The procedure is based on a comparison of electron density at the expected side-chain positions with electron-density templates. The templates are constructed from average amino-acid side-chain densities in 574 refined protein structures. For each contiguous segment of main chain, a matrix with entries corresponding to an estimate of the probability that each of the 20 amino acids is located at each position of the main-chain model is obtained. The probability that this segment corresponds to each possible alignment with the sequence of the protein is estimated using a Bayesian approach and high-confidence matches are kept. Once side-chain identities are determined, the most probable rotamer for each side chain is built into the model. The automated procedure has been implemented in the RESOLVE software. Combined with automated main-chain model building, the procedure produces a preliminary model suitable for refinement and extension by an experienced crystallographer
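
    The Bayesian sequence-to-map alignment described here lends itself to a compact illustration. Assuming a matrix of per-position amino-acid probabilities (rows are main-chain positions, columns the 20 residue types in a fixed order, both assumptions for this sketch), each alignment offset against the protein sequence can be scored by its total log-probability and the best one retained. This is a simplification of the RESOLVE procedure, not its actual code.

```python
import numpy as np

AA = "ACDEFGHIKLMNPQRSTVWY"  # assumed column order of the probability matrix

def best_alignment(prob_matrix, sequence):
    """Score every alignment of a built main-chain segment (rows of
    per-residue amino-acid probabilities) against the protein sequence;
    return the offset with the highest total log-probability."""
    logp = np.log(np.clip(prob_matrix, 1e-9, 1.0))
    seq_idx = np.array([AA.index(a) for a in sequence])
    n_pos, n_res = logp.shape[0], len(sequence)
    scores = [logp[np.arange(n_pos), seq_idx[off:off + n_pos]].sum()
              for off in range(n_res - n_pos + 1)]
    return int(np.argmax(scores)), max(scores)
```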

  4. A catalog of automated analysis methods for enterprise models.

    Science.gov (United States)

    Florez, Hector; Sánchez, Mario; Villalobos, Jorge

    2016-01-01

    Enterprise models are created for documenting and communicating the structure and state of Business and Information Technologies elements of an enterprise. After models are completed, they are mainly used to support analysis. Model analysis is an activity typically based on human skills and, due to the size and complexity of the models, this process can be complicated and omissions or miscalculations are very likely. This situation has fostered research on automated analysis methods for supporting analysts in enterprise analysis processes. By reviewing the literature, we found several analysis methods; nevertheless, they are based on specific situations and different metamodels; thus, some analysis methods might not be applicable to all enterprise models. This paper presents the work of compilation (literature review), classification, structuring, and characterization of automated analysis methods for enterprise models, expressing them in a standardized modeling language. In addition, we have implemented the analysis methods in our modeling tool.

  5. Model Checking - Automated Verification of Computational Systems

    Indian Academy of Sciences (India)

    Madhavan Mukund. General Article, Resonance – Journal of Science Education, Volume 14, Issue 7, July 2009, pp. 667-681.

  6. Automated statistical modeling of analytical measurement systems

    International Nuclear Information System (INIS)

    Jacobson, J.J.

    1992-01-01

    The statistical modeling of analytical measurement systems at the Idaho Chemical Processing Plant (ICPP) has been completely automated through computer software. The statistical modeling of analytical measurement systems is one part of a complete quality control program used by the Remote Analytical Laboratory (RAL) at the ICPP. The quality control program is an integration of automated data input, measurement system calibration, database management, and statistical process control. The quality control program and statistical modeling program meet the guidelines set forth by the American Society for Testing and Materials and the American National Standards Institute. A statistical model is a set of mathematical equations describing any systematic bias inherent in a measurement system and the precision of a measurement system. A statistical model is developed from data generated from the analysis of control standards. Control standards are samples which are made up at precise known levels by an independent laboratory and submitted to the RAL. The RAL analysts who process control standards do not know the values of those control standards. The object behind statistical modeling is to describe real process samples in terms of their bias and precision and to verify that a measurement system is operating satisfactorily. The processing of control standards gives us this ability.
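
    As a generic illustration of what such a statistical model captures, the sketch below estimates systematic bias (as a linear function of the known control-standard level) and precision (relative standard deviation of the residuals) from control-standard data. It is a minimal stand-in, not the ICPP software.

```python
import numpy as np

def fit_measurement_model(known, measured):
    """Estimate systematic bias (linear in the known level) and
    precision (relative standard deviation) from control standards."""
    known = np.asarray(known, float)
    measured = np.asarray(measured, float)
    slope, intercept = np.polyfit(known, measured, 1)  # systematic bias
    residuals = measured - (slope * known + intercept)
    rsd = residuals.std(ddof=2) / known.mean()         # precision
    return {"slope": slope, "intercept": intercept, "rsd": rsd}

print(fit_measurement_model([10, 20, 30, 40], [10.4, 20.5, 31.1, 41.0]))
```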

  7. A Framework for Semi-Automated Implementation of Multidimensional Data Models

    Directory of Open Access Journals (Sweden)

    Ilona Mariana NAGY

    2012-08-01

    Full Text Available Data warehousing solution development represents a challenging task which requires the employment of considerable resources on behalf of enterprises and sustained commitment from the stakeholders. Costs derive mostly from the amount of time invested in the design and physical implementation of these large projects, time that, we consider, may be decreased through the automation of several processes. Thus, we present a framework for semi-automated implementation of multidimensional data models and introduce an automation prototype intended to reduce the time required to generate data structures in the warehousing environment. Our research is focused on the design of an automation component and the development of a corresponding prototype from technical metadata.

  8. Implementing Lumberjacks and Black Swans Into Model-Based Tools to Support Human-Automation Interaction.

    Science.gov (United States)

    Sebok, Angelia; Wickens, Christopher D

    2017-03-01

    The objectives were to (a) implement theoretical perspectives regarding human-automation interaction (HAI) into model-based tools to assist designers in developing systems that support effective performance and (b) conduct validations to assess the ability of the models to predict operator performance. Two key concepts in HAI, the lumberjack analogy and black swan events, have been studied extensively. The lumberjack analogy describes the effects of imperfect automation on operator performance. In routine operations, an increased degree of automation supports performance, but in failure conditions, increased automation results in more significantly impaired performance. Black swans are the rare and unexpected failures of imperfect automation. The lumberjack analogy and black swan concepts have been implemented into three model-based tools that predict operator performance in different systems. These tools include a flight management system, a remotely controlled robotic arm, and an environmental process control system. Each modeling effort included a corresponding validation. In one validation, the software tool was used to compare three flight management system designs, which were ranked in the same order as predicted by subject matter experts. The second validation compared model-predicted operator complacency with empirical performance in the same conditions. The third validation compared model-predicted and empirically determined time to detect and repair faults in four automation conditions. The three model-based tools offer useful ways to predict operator performance in complex systems. The three tools offer ways to predict the effects of different automation designs on operator performance.

  9. An abstract machine model of dynamic module replacement

    OpenAIRE

    Walton, Chris; Kırlı, Dilsun; Gilmore, Stephen

    2000-01-01

    In this paper we define an abstract machine model for the mλ typed intermediate language. This abstract machine is used to give a formal description of the operation of run-time module replacement for the programming language Dynamic ML. The essential technical device which we employ for module replacement is a modification of two-space copying garbage collection. We show how the operation of module replacement could be applied to other garbage-collected languages such as Java.

  10. Automated clearing system and the banking sector performance: the ...

    African Journals Online (AJOL)

    This study investigated the impact of the automated clearing system on the Nigerian banking system. ...

  11. Cost Accounting in the Automated Manufacturing Environment

    Science.gov (United States)

    1988-06-01

    Naval Postgraduate School thesis, Monterey, California, June 1988. Title: Cost Accounting in the Automated Manufacturing Environment. Subject terms: cost accounting; product costing; automated manufacturing; CAD/CAM-CIM.

  12. A semantic approach for business process model abstraction

    NARCIS (Netherlands)

    Smirnov, S.; Reijers, H.A.; Weske, M.H.; Mouratidis, H.; Rolland, C.

    2011-01-01

    Models of business processes can easily become large and difficult to understand. Abstraction has proven to be an effective means to present a readable, high-level view of a business process model, by showing aggregated activities and leaving out irrelevant details. Yet, it is an open question how

  13. Semi-Automated Processing of Trajectory Simulator Output Files for Model Evaluation

    Science.gov (United States)

    2018-01-01

    ARL-TR-8284, January 2018, US Army Research Laboratory: Semi-Automated Processing of Trajectory Simulator Output Files for Model Evaluation. ... although some minor changes may be needed. The program processes a GTRAJ output text file that contains results from 2 or more simulations, where each

  14. A Modal Logic for Abstract Delta Modeling

    NARCIS (Netherlands)

    F.S. de Boer (Frank); M. Helvensteijn (Michiel); J. Winter (Joost)

    2012-01-01

    Abstract Delta Modeling is a technique for implementing (software) product lines. Deltas are put in a partial order which restricts their application and are then sequentially applied to a core product in order to form specific products in the product line. In this paper we explore the

  15. Automating an integrated spatial data-mining model for landfill site selection

    Science.gov (United States)

    Abujayyab, Sohaib K. M.; Ahamad, Mohd Sanusi S.; Yahya, Ahmad Shukri; Ahmad, Siti Zubaidah; Aziz, Hamidi Abdul

    2017-10-01

    An integrated programming environment represents a robust approach to building a valid model for landfill site selection. One of the main challenges in such an integrated model is the complicated processing and modelling caused by the programming stages and their several limitations. An automation process helps avoid these limitations and improves the interoperability between the integrated programming environments. This work targets the automation of a spatial data-mining model for landfill site selection by integrating a spatial programming environment (Python-ArcGIS) with a non-spatial environment (MATLAB). The model was constructed using neural networks and is divided into nine stages distributed between MATLAB and Python-ArcGIS. A case study was taken from the northern part of Peninsular Malaysia. Twenty-two criteria were selected as input data and used to build the training and testing datasets. The outcomes show a high accuracy of 98.2% on the testing dataset using 10-fold cross-validation. The automated spatial data-mining model provides a solid platform for decision makers performing landfill site selection and planning operations on a regional scale.
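
    The record's accuracy figure comes from 10-fold cross-validation of a neural network over 22 criteria. A minimal, self-contained sketch of that evaluation setup, using synthetic stand-in data and scikit-learn rather than the authors' Python-ArcGIS/MATLAB pipeline, might look like this:

```python
import numpy as np
from sklearn.model_selection import KFold, cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for the 22 raster-derived suitability criteria.
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 22))
y = (X[:, :3].sum(axis=1) > 0).astype(int)  # suitable / unsuitable labels

model = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),
)
scores = cross_val_score(model, X, y,
                         cv=KFold(10, shuffle=True, random_state=0))
print(f"10-fold accuracy: {scores.mean():.3f}")
```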

  16. Automated model-based testing of hybrid systems

    NARCIS (Netherlands)

    Osch, van M.P.W.J.

    2009-01-01

    In automated model-based input-output conformance testing, tests are automatically generated from a specification and automatically executed on an implementation. Input is applied to the implementation and output is observed from the implementation. If the observed output is allowed according to

  17. Adaptive Algorithms for Automated Processing of Document Images

    Science.gov (United States)

    2011-01-01

    Title of dissertation: Adaptive Algorithms for Automated Processing of Document Images. Mudit Agrawal, Doctor of Philosophy, 2011. Dissertation submitted to the Faculty of the Graduate School of the University

  18. An Immune-inspired Adaptive Automated Intrusion Response System Model

    Directory of Open Access Journals (Sweden)

    Ling-xi Peng

    2012-09-01

    Full Text Available An immune-inspired adaptive automated intrusion response system model is proposed. Definitions of self, non-self, immunocyte, memory detector, mature detector and immature detector of the network transactions are given, together with the real-time network danger evaluation equations. The automated response policies are then adaptively performed or adjusted according to the real-time network danger. Thus, the model not only accurately evaluates network attacks, but also greatly reduces response times and response costs.

  19. Spike Neural Models Part II: Abstract Neural Models

    Directory of Open Access Journals (Sweden)

    Johnson, Melissa G.

    2018-02-01

    Full Text Available Neurons are complex cells that require a lot of time and resources to model completely. In spiking neural networks (SNNs), though, not all that complexity is required. Therefore simple, abstract models are often used. These models save time, use fewer computer resources, and are easier to understand. This tutorial presents two such models: Izhikevich's model, which is biologically realistic in the resulting spike trains but not in the parameters, and the Leaky Integrate and Fire (LIF) model, which is not biologically realistic but does quickly and easily integrate input to produce spikes. Izhikevich's model is based on Hodgkin-Huxley's model but simplified such that it uses only two differential equations and four parameters to produce various realistic spike patterns. LIF is based on a standard electrical circuit and contains one equation. Either of these two models, or any of the many other models in the literature, can be used in an SNN. Choosing a neural model is an important task that depends on the goal of the research and the resources available. Once a model is chosen, network decisions such as connectivity, delay, and sparseness need to be made. Understanding neural models and how they are incorporated into the network is the first step in creating an SNN.
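
    The LIF model mentioned in this tutorial record is simple enough to state directly: tau * dV/dt = -(V - V_rest) + R*I, with a spike emitted and the voltage reset whenever V reaches threshold. A minimal simulation sketch follows; the parameter values are illustrative, not taken from the tutorial.

```python
import numpy as np

def simulate_lif(current, dt=0.1, tau=10.0, r=1.0,
                 v_rest=-65.0, v_thresh=-50.0, v_reset=-70.0):
    """Leaky integrate-and-fire: tau * dV/dt = -(V - v_rest) + R*I.
    Emits a spike and resets whenever V crosses threshold."""
    v, spikes, trace = v_rest, [], []
    for t, i_t in enumerate(current):
        v += dt / tau * (-(v - v_rest) + r * i_t)  # Euler integration step
        if v >= v_thresh:
            spikes.append(t * dt)                  # record spike time (ms)
            v = v_reset
        trace.append(v)
    return np.array(trace), spikes

trace, spikes = simulate_lif(np.full(1000, 20.0))  # constant 20-unit input
print(f"{len(spikes)} spikes in 100 ms")
```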

  20. Tailored model abstraction in performance assessments

    International Nuclear Information System (INIS)

    Kessler, J.H.

    1995-01-01

    Total System Performance Assessments (TSPAs) are likely to be one of the most significant parts of making safety cases for the continued development and licensing of geologic repositories for the disposal of spent fuel and HLW. Thus, it is critical that the TSPA model capture the 'essence' of the physical processes relevant to demonstrating the appropriate regulation is met. But how much detail about the physical processes must be modeled and understood before there is enough confidence that the appropriate essence has been captured? In this summary the level of model abstraction that is required is discussed. Approaches for subsystem and total system performance analyses are outlined, and the role of best estimate models is examined. It is concluded that a conservative approach for repository performance, based on limited amount of field and laboratory data, can provide sufficient confidence for a regulatory decision

  1. Ecosystem models (a bibliography with abstracts). Report for 1964-Nov 1975

    International Nuclear Information System (INIS)

    Harrison, E.A.

    1975-11-01

    The bibliography contains abstracts which cover marine biology, natural resources, wildlife, plants, water pollution, microorganisms, food chains, radioactive substances, limnology, and diseases as related to ecosystem models. Contains 214 abstracts

  2. Automating sensitivity analysis of computer models using computer calculus

    International Nuclear Information System (INIS)

    Oblow, E.M.; Pin, F.G.

    1985-01-01

    An automated procedure for performing sensitivity analyses has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with ''direct'' and ''adjoint'' sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency consideration and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies. 24 refs., 2 figs
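
    The "computer calculus" idea behind GRESS, generating derivatives mechanically alongside values, is the forward mode of automatic differentiation. The sketch below illustrates it with dual numbers in Python; it is an analogy to what the derivative-generating FORTRAN compiler does, not GRESS itself.

```python
import math

class Dual:
    """Forward-mode automatic differentiation with dual numbers:
    each value carries its derivative through arithmetic."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__
    def __mul__(self, o):  # product rule
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val, self.dot * o.val + self.val * o.dot)
    __rmul__ = __mul__

def dsin(x):  # chain rule for an intrinsic function
    return Dual(math.sin(x.val), math.cos(x.val) * x.dot)

x = Dual(2.0, 1.0)       # seed dx/dx = 1
y = 3 * x * x + dsin(x)  # y = 3x^2 + sin(x)
print(y.val, y.dot)      # value and derivative dy/dx = 6x + cos(x)
```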

  3. Automating sensitivity analysis of computer models using computer calculus

    International Nuclear Information System (INIS)

    Oblow, E.M.; Pin, F.G.

    1986-01-01

    An automated procedure for performing sensitivity analysis has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with direct and adjoint sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency consideration and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies

  4. The Use of AMET & Automated Scripts for Model Evaluation

    Science.gov (United States)

    Brief overview of EPA's new CMAQ website to be launched publicly in June 2017. Details on the upcoming release of the Atmospheric Model Evaluation Tool (AMET) and the creation of automated scripts for post-processing and evaluating air quality model data.

  5. Search for a New Conceptual Bookkeeping Model : Different Levels of Abstraction

    NARCIS (Netherlands)

    Sweere, A.M.J.; van Groenendaal, W.J.H.

    1999-01-01

    Nowadays, every bookkeeping system used in practice is automated. Most bookkeeping software and integrated information systems are based on databases. In this paper, we develop a new conceptual bookkeeping model which is not based on manual techniques, but which is applicable in a database

  6. Modeling the Energy Use of a Connected and Automated Transportation System (Poster)

    Energy Technology Data Exchange (ETDEWEB)

    Gonder, J.; Brown, A.

    2014-07-01

    Early research points to large potential impacts of connected and automated vehicles (CAVs) on transportation energy use - dramatic savings, increased use, or anything in between. Due to a lack of suitable data and integrated modeling tools to explore these complex future systems, analyses to date have relied on simple combinations of isolated effects. This poster proposes a framework for modeling the potential energy implications from increasing penetration of CAV technologies and for assessing technology and policy options to steer them toward favorable energy outcomes. Current CAV modeling challenges include estimating behavior change, understanding potential vehicle-to-vehicle interactions, and assessing traffic flow and vehicle use under different automation scenarios. To bridge these gaps and develop a picture of potential future automated systems, NREL is integrating existing modeling capabilities with additional tools and data inputs to create a more fully integrated CAV assessment toolkit.

  7. Business process model abstraction : a definition, catalog, and survey

    NARCIS (Netherlands)

    Smirnov, S.; Reijers, H.A.; Weske, M.H.; Nugteren, T.

    2012-01-01

    The discipline of business process management aims at capturing, understanding, and improving work in organizations by using process models as central artifacts. Since business-oriented tasks require different information from such models to be highlighted, a range of abstraction techniques has been

  8. Aspect-Oriented Model Development at Different Levels of Abstraction

    NARCIS (Netherlands)

    Alférez, Mauricio; Amálio, Nuno; Ciraci, S.; Fleurey, Franck; Kienzle, Jörg; Klein, Jacques; Kramer, Max; Mosser, Sebastien; Mussbacher, Gunter; Roubtsova, Ella; Zhang, Gefei; France, Robert B.; Kuester, Jochen M.; Bordbar, Behzad; Paige, Richard F.

    2011-01-01

    The last decade has seen the development of diverse aspect- oriented modeling (AOM) approaches. This paper presents eight different AOM approaches that produce models at different level of abstraction. The approaches are different with respect to the phases of the development lifecycle they target,

  9. Automated Model Fit Method for Diesel Engine Control Development

    NARCIS (Netherlands)

    Seykens, X.; Willems, F.P.T.; Kuijpers, B.; Rietjens, C.

    2014-01-01

    This paper presents an automated fit for a control-oriented physics-based diesel engine combustion model. This method is based on the combination of a dedicated measurement procedure and structured approach to fit the required combustion model parameters. Only a data set is required that is

  10. Automated model fit method for diesel engine control development

    NARCIS (Netherlands)

    Seykens, X.L.J.; Willems, F.P.T.; Kuijpers, B.; Rietjens, C.J.H.

    2014-01-01

    This paper presents an automated fit for a control-oriented physics-based diesel engine combustion model. This method is based on the combination of a dedicated measurement procedure and structured approach to fit the required combustion model parameters. Only a data set is required that is

  11. Automated side-chain model building and sequence assignment by template matching.

    Science.gov (United States)

    Terwilliger, Thomas C

    2003-01-01

    An algorithm is described for automated building of side chains in an electron-density map once a main-chain model is built and for alignment of the protein sequence to the map. The procedure is based on a comparison of electron density at the expected side-chain positions with electron-density templates. The templates are constructed from average amino-acid side-chain densities in 574 refined protein structures. For each contiguous segment of main chain, a matrix with entries corresponding to an estimate of the probability that each of the 20 amino acids is located at each position of the main-chain model is obtained. The probability that this segment corresponds to each possible alignment with the sequence of the protein is estimated using a Bayesian approach and high-confidence matches are kept. Once side-chain identities are determined, the most probable rotamer for each side chain is built into the model. The automated procedure has been implemented in the RESOLVE software. Combined with automated main-chain model building, the procedure produces a preliminary model suitable for refinement and extension by an experienced crystallographer.

  12. Automation life-cycle cost model

    Science.gov (United States)

    Gathmann, Thomas P.; Reeves, Arlinda J.; Cline, Rick; Henrion, Max; Ruokangas, Corinne

    1992-01-01

    The problem domain being addressed by this contractual effort can be summarized by the following list: Automation and Robotics (A&R) technologies appear to be viable alternatives to current, manual operations; life-cycle cost models are typically judged with suspicion due to implicit assumptions and little associated documentation; and uncertainty is a reality for increasingly complex problems, yet few models explicitly account for its effect on the solution space. The objectives for this effort range from the near-term (1-2 years) to far-term (3-5 years). In the near-term, the envisioned capabilities of the modeling tool are annotated. In addition, a framework is defined and developed in the Decision Modelling System (DEMOS) environment. Our approach is summarized as follows: Assess desirable capabilities (structure into near- and far-term); Identify useful existing models/data; Identify parameters for utility analysis; Define tool framework; Encode scenario thread for model validation; and Provide transition path for tool development. This report contains all relevant technical progress made on this contractual effort.

  13. From abstract to peer-reviewed publication: country matters

    DEFF Research Database (Denmark)

    Fosbol, E.; Fosbøl, Philip Loldrup; Eapen, Z. J.

    2013-01-01

    ... within 2 years of the conference. Less is known about the relative difference between countries with regard to likelihood of publication. Methods: Using a validated automated computer algorithm, we searched the ISI Web of Science to identify peer-reviewed publications of abstracts presented at the AHA... observed a significant variation among countries in terms of odds of subsequent publication (Figure). Conclusions: Our results show that conversion of science from an abstract into a peer-reviewed publication varies significantly by country. Local national initiatives should be deployed in order to break...

  14. Inference and abstraction of the biometric passport

    NARCIS (Netherlands)

    Aarts, F.; Schmaltz, J.; Vaandrager, F.W.; Margaria, T.; Steffen, B.

    2010-01-01

    Model-based testing is a promising software testing technique for the automation of test generation and test execution. One obstacle to its adoption is the difficulty of developing models. Learning techniques provide tools to automatically derive automata-based models. Automation is obtained at the

  15. A Stepwise Fitting Procedure for automated fitting of Ecopath with Ecosim models

    Directory of Open Access Journals (Sweden)

    Erin Scott

    2016-01-01

    Full Text Available The Stepwise Fitting Procedure automates testing of alternative hypotheses used for fitting Ecopath with Ecosim (EwE) models to observation reference data (Mackinson et al. 2009). The calibration of EwE model predictions to observed data is important to evaluate any model that will be used for ecosystem-based management. Thus far, the model fitting procedure in EwE has been carried out manually: a repetitive task involving setting >1000 specific individual searches to find the statistically 'best fit' model. The novel fitting procedure automates the manual procedure, therefore producing accurate results, and lets the modeller concentrate on investigating the 'best fit' model for ecological accuracy.
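
    The general pattern being automated, trying many candidate fits and keeping the statistically best one, can be illustrated generically. The sketch below uses polynomial degree as a stand-in for EwE's alternative hypotheses and AIC as the selection criterion; it is not the EwE plug-in's code.

```python
import numpy as np

def fit_candidates(time, observed, max_degree=6):
    """Automate a repetitive fitting search: try one hypothesis per
    polynomial degree and keep the statistically best fit by AIC."""
    time = np.asarray(time, float)
    observed = np.asarray(observed, float)
    best = None
    for k in range(1, max_degree + 1):
        pred = np.polyval(np.polyfit(time, observed, k), time)
        sse = ((observed - pred) ** 2).sum()
        n = len(observed)
        aic = n * np.log(sse / n) + 2 * (k + 1)  # penalize extra parameters
        if best is None or aic < best["aic"]:
            best = {"degree": k, "aic": aic, "sse": sse}
    return best
```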

  16. Physical and Chemical Environmental Abstraction Model

    International Nuclear Information System (INIS)

    Nowak, E.

    2000-01-01

    As directed by a written development plan (CRWMS M and O 1999a), Task 1, an overall conceptualization of the physical and chemical environment (P/CE) in the emplacement drift is documented in this Analysis/Model Report (AMR). Included are the physical components of the engineered barrier system (EBS). The intended use of this descriptive conceptualization is to assist the Performance Assessment Department (PAD) in modeling the physical and chemical environment within a repository drift. It is also intended to assist PAD in providing a more integrated and complete in-drift geochemical model abstraction and to answer the key technical issues raised in the U.S. Nuclear Regulatory Commission (NRC) Issue Resolution Status Report (IRSR) for the Evolution of the Near-Field Environment (NFE) Revision 2 (NRC 1999). EBS-related features, events, and processes (FEPs) have been assembled and discussed in "EBS FEPs/Degradation Modes Abstraction" (CRWMS M and O 2000a). Reference AMRs listed in Section 6 address FEPs that have not been screened out. This conceptualization does not directly address those FEPs. Additional tasks described in the written development plan are recommended for future work in Section 7.3. To achieve the stated purpose, the scope of this document includes: (1) the role of in-drift physical and chemical environments in the Total System Performance Assessment (TSPA) (Section 6.1); (2) the configuration of engineered components (features) and critical locations in drifts (Sections 6.2.1 and 6.3, portions taken from EBS Radionuclide Transport Abstraction (CRWMS M and O 2000b)); (3) overview and critical locations of processes that can affect P/CE (Section 6.3); (4) couplings and relationships among features and processes in the drifts (Section 6.4); and (5) identities and uses of parameters transmitted to TSPA by some of the reference AMRs (Section 6.5). This AMR originally considered a design with backfill, and is now being updated (REV 00 ICN1) to address

  17. Automated Test Case Generation

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    I would like to present the concept of automated test case generation. I work on it as part of my PhD and I think it would be interesting also for other people. It is also the topic of a workshop paper that I am introducing in Paris. (abstract below) Please note that the talk itself would be more general and not about the specifics of my PhD, but about the broad field of Automated Test Case Generation. I would introduce the main approaches (combinatorial testing, symbolic execution, adaptive random testing) and their advantages and problems. (oracle problem, combinatorial explosion, ...) Abstract of the paper: Over the last decade code-based test case generation techniques such as combinatorial testing or dynamic symbolic execution have seen growing research popularity. Most algorithms and tool implementations are based on finding assignments for input parameter values in order to maximise the execution branch coverage. Only few of them consider dependencies from outside the Code Under Test’s scope such...

  18. Modeling of prepregs during automated draping sequences

    Science.gov (United States)

    Krogh, Christian; Glud, Jens A.; Jakobsen, Johnny

    2017-10-01

    The behavior of woven prepreg fabric during automated draping sequences is investigated. A drape tool under development with an arrangement of grippers facilitates the placement of a woven prepreg fabric in a mold. It is essential that the draped configuration is free from wrinkles and other defects. The present study aims at setting up a virtual draping framework capable of modeling the draping process from the initial flat fabric to the final double-curved shape, in order to assist the development of an automated drape tool. The virtual draping framework consists of a kinematic mapping algorithm used to generate target points on the mold which are used as input to a draping sequence planner. The draping sequence planner prescribes the displacement history for each gripper in the drape tool and these displacements are then applied to each gripper in a transient model of the draping sequence. The model is based on a transient finite element analysis with the material's constitutive behavior currently being approximated as linear elastic orthotropic. In-plane tensile and bias-extension tests as well as bending tests are conducted and used as input for the model. The virtual draping framework shows a good potential for obtaining a better understanding of the drape process and guiding the development of the drape tool. However, results obtained from using the framework on a simple test case indicate that the generation of draping sequences is non-trivial.

  19. Context based mixture model for cell phase identification in automated fluorescence microscopy

    Directory of Open Access Journals (Sweden)

    Zhou Xiaobo

    2007-01-01

    Full Text Available Background: Automated identification of cell cycle phases of individual live cells in a large population captured via automated fluorescence microscopy is important for cancer drug discovery and cell cycle studies. Time-lapse fluorescence microscopy images provide an important method to study the cell cycle process under different conditions of perturbation. Existing methods are limited in dealing with such time-lapse data sets, while manual analysis is not feasible. This paper presents statistical data analysis and statistical pattern recognition to perform this task. Results: The data are generated from HeLa H2B-GFP cells imaged during a 2-day period, with images acquired 15 minutes apart, using automated time-lapse fluorescence microscopy. The patterns are described with four kinds of features: twelve general features, Haralick texture features, Zernike moment features, and wavelet features. To generate a new set of features with more discriminative power, commonly used feature reduction techniques are applied, including Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), Maximum Margin Criterion (MMC), Stepwise Discriminant Analysis based Feature Selection (SDAFS), and Genetic Algorithm based Feature Selection (GAFS). Then, we propose a Context Based Mixture Model (CBMM) for dealing with the time-series cell sequence information and compare it to other traditional classifiers: Support Vector Machine (SVM), Neural Network (NN), and K-Nearest Neighbor (KNN). Being a standard practice in machine learning, we systematically compare the performance of a number of common feature reduction techniques and classifiers to select an optimal combination of a feature reduction technique and a classifier. A cellular database containing 100 manually labelled subsequences is built for evaluating the performance of the classifiers. The generalization error is estimated using the cross-validation technique. The
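
    The feature-reduction-plus-classifier comparison the authors describe as standard machine-learning practice can be sketched with scikit-learn. The data here are synthetic stand-ins for the per-cell image features, and the PCA/SVM and PCA/KNN pipelines are only two of the combinations the paper compares:

```python
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-in for per-cell image features (texture, moments, ...).
X, y = make_classification(n_samples=400, n_features=60, n_informative=12,
                           n_classes=3, n_clusters_per_class=1,
                           random_state=0)

for name, clf in [("SVM", SVC()), ("KNN", KNeighborsClassifier())]:
    pipe = make_pipeline(StandardScaler(), PCA(n_components=10), clf)
    acc = cross_val_score(pipe, X, y, cv=5).mean()  # cross-validated accuracy
    print(f"PCA+{name}: {acc:.3f}")
```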

  20. Driver-centred vehicle automation: using network analysis for agent-based modelling of the driver in highly automated driving systems.

    Science.gov (United States)

    Banks, Victoria A; Stanton, Neville A

    2016-11-01

    To the average driver, the concept of automation in driving implies that they can become completely 'hands and feet free'. This is a common misconception, however, one that has been shown through the application of Network Analysis to new Cruise Assist technologies that may feature on our roads by 2020. Through the adoption of a Systems Theoretic approach, this paper introduces the concept of driver-initiated automation which reflects the role of the driver in highly automated driving systems. Using a combination of traditional task analysis and the application of quantitative network metrics, this agent-based modelling paper shows how the role of the driver remains an integral part of the driving system, implying the need for designers to ensure drivers are provided with the tools necessary to remain actively in the loop despite increasing opportunities to delegate control to the automated subsystems. Practitioner Summary: This paper describes and analyses a driver-initiated command and control system of automation using representations afforded by task and social networks to understand how drivers remain actively involved in the task. A network analysis of different driver commands suggests that such a strategy does maintain the driver in the control loop.

  1. Selected translated abstracts of Russian-language climate-change publications. 4: General circulation models

    Energy Technology Data Exchange (ETDEWEB)

    Burtis, M.D. [comp.] [Oak Ridge National Lab., TN (United States). Carbon Dioxide Information Analysis Center; Razuvaev, V.N.; Sivachok, S.G. [All-Russian Research Inst. of Hydrometeorological Information--World Data Center, Obninsk (Russian Federation)

    1996-10-01

    This report presents English-translated abstracts of important Russian-language literature concerning general circulation models as they relate to climate change. In addition to the bibliographic citations and abstracts translated into English, this report presents the original citations and abstracts in Russian. Author and title indexes are included to assist the reader in locating abstracts of particular interest.

  2. Model-based object classification using unification grammars and abstract representations

    Science.gov (United States)

    Liburdy, Kathleen A.; Schalkoff, Robert J.

    1993-04-01

    The design and implementation of a high level computer vision system which performs object classification is described. General object labelling and functional analysis require models of classes which display a wide range of geometric variations. A large representational gap exists between abstract criteria such as `graspable' and current geometric image descriptions. The vision system developed and described in this work addresses this problem and implements solutions based on a fusion of semantics, unification, and formal language theory. Object models are represented using unification grammars, which provide a framework for the integration of structure and semantics. A methodology for the derivation of symbolic image descriptions capable of interacting with the grammar-based models is described and implemented. A unification-based parser developed for this system achieves object classification by determining if the symbolic image description can be unified with the abstract criteria of an object model. Future research directions are indicated.

  3. Models and automation technologies for the curriculum development

    Directory of Open Access Journals (Sweden)

    V. N. Volkova

    2016-01-01

    Full Text Available The aim of the research was to determine the sequence of curriculum development stages on the basis of system analysis, and to create models and information technologies for the implementation of these stages. The research draws on the methods and models of systems theory and system analysis, including methods and automated procedures for structuring organizational aims and for organizing complex expertise. An analysis of existing studies in the field of curriculum modeling using formal mathematical language, including optimization models that distribute disciplines over years and semesters subject to the relevant restrictions, shows that the complexity and dimension of these tasks require the development of special software; defining the input data and restrictions demands a large time investment that is difficult to provide under the real conditions of curriculum development, so it is almost impossible to verify the objectivity of the input data and restrictions in such models. For a complete analysis of the curriculum development process, a system definition based on the system-targeted approach is proposed. On the basis of this definition, a reasonable sequence of integrated stages for curriculum development was justified: 1) definition (specification) of the requirements for the educational content; 2) determining the number of subjects included in the curriculum; 3) definition of the sequence of the subjects; 4) distribution of subjects by semesters. Models and technologies for the implementation of these stages are given in the article: 1) models based on the information approach of A. Denisov and the modified degree of compliance with objectives based on Denisov's evaluation index (in the article the idea of evaluating the degree of the impact of disciplines for realization

  4. Complex Automated Negotiations: Theories, Models, and Software Competitions

    CERN Document Server

    Zhang, Minjie; Robu, Valentin; Matsuo, Tokuro

    2013-01-01

    Complex Automated Negotiations are a widely studied, emerging area in the field of Autonomous Agents and Multi-Agent Systems. In general, automated negotiations can be complex, since there are many factors that characterize such negotiations. For this book, we solicited papers on all aspects of such complex automated negotiations, which are studied in the field of Autonomous Agents and Multi-Agent Systems. This book includes two parts: Part I: Agent-based Complex Automated Negotiations and Part II: Automated Negotiation Agents Competition. Each chapter in Part I is an extended version of an ACAN 2011 paper after peer review by three PC members. Part II covers ANAC 2011 (The Second Automated Negotiating Agents Competition), in which automated agents with different negotiation strategies, implemented by different developers, negotiate automatically in several negotiation domains. ANAC is an international competition in which automated negotiation strategies, submitted by a number of...

  5. Integration of drinking water treatment plant process models and emulated process automation software

    NARCIS (Netherlands)

    Worm, G.I.M.

    2012-01-01

    The objective of this research is to limit the risks of fully automated operation of drinking water treatment plants and to improve their operation by using an integrated system of process models and emulated process automation software. This thesis contains the design of such an integrated system.

  6. Parameters Investigation of Mathematical Model of Productivity for Automated Line with Availability by DMAIC Methodology

    Directory of Open Access Journals (Sweden)

    Tan Chan Sin

    2014-01-01

    Full Text Available Automated lines are widely applied in industry, especially for mass production with little product variety. Productivity is one of the important criteria for an automated line, as in industry generally, since it directly reflects outputs and profits. Productivity must be forecast accurately in order to meet customer demand, and the forecast result is calculated using a mathematical model. A mathematical model of productivity with availability for automated lines has been introduced to express productivity in terms of a single level of reliability for stations and mechanisms. Since this mathematical model of productivity with availability cannot achieve close enough productivity compared to the actual one, due to a lack of parameter consideration, enhancement of the mathematical model is required to consider and add the loss parameters that are not considered in the current model. This paper presents the productivity loss parameters investigated by using the DMAIC (Define, Measure, Analyze, Improve, and Control) concept and the PACE Prioritization Matrix (Priority, Action, Consider, and Eliminate). The investigated parameters are important for further improvement of the mathematical model of productivity with availability, to develop a robust mathematical model of productivity for automated lines.

  7. Automated main-chain model building by template matching and iterative fragment extension

    International Nuclear Information System (INIS)

    Terwilliger, Thomas C.

    2003-01-01

    A method for the automated macromolecular main-chain model building of polypeptide backbones is described. The procedure is hierarchical. In the initial stages, many overlapping polypeptide fragments are built. In subsequent stages, the fragments are extended and then connected. Identification of the locations of helical and β-strand regions is carried out by FFT-based template matching. Fragment libraries of helices and β-strands from refined protein structures are then positioned at the potential locations of helices and strands, and the longest segments that fit the electron-density map are chosen. The helices and strands are then extended using fragment libraries consisting of sequences three amino acids long derived from refined protein structures. The resulting segments of polypeptide chain are then connected by choosing those which overlap at two or more Cα positions. The fully automated procedure has been implemented in RESOLVE and is capable of model building at resolutions as low as 3.5 Å. The algorithm is useful for building a preliminary main-chain model that can serve as a basis for refinement and side-chain addition.
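
    The core of the fragment-location step above is FFT-based template matching. As a hedged one-dimensional illustration of that idea only (RESOLVE itself works on 3-D electron-density maps; the profile, motif values, and amplitudes below are invented for the example), cross-correlation via the correlation theorem can be sketched in Python:

      import numpy as np

      def fft_match(density, template):
          """Score every offset of `template` against `density` via the
          correlation theorem: corr = IFFT(FFT(d) * conj(FFT(t)))."""
          n = len(density)
          t = np.zeros(n)
          t[:len(template)] = template - np.mean(template)   # zero-mean template
          d = density - np.mean(density)
          return np.real(np.fft.ifft(np.fft.fft(d) * np.conj(np.fft.fft(t))))

      rng = np.random.default_rng(0)
      density = rng.normal(0.0, 0.5, 256)                # toy 1-D density profile
      template = np.array([0.2, 1.0, 0.4, 1.0, 0.2])     # invented helix-like motif
      density[100:105] += 3 * template                   # plant the motif at 100
      print("best offset:", int(np.argmax(fft_match(density, template))))  # ~100

    Peaks in the correlation score mark candidate placements; in the full method, the analogous 3-D scores are thresholded before fragments are fitted and extended.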

  8. Virtual model of an automated system for the storage of collected waste

    Directory of Open Access Journals (Sweden)

    Enciu George

    2017-01-01

    Full Text Available One of the problems identified in waste collection integrated systems is the storage space. The design process of an automated system for the storage of collected waste includes finding solutions for the optimal exploitation of the limited storage space, given that the equipment for the loading, identification, transport and transfer of the waste covers most of the available space inside the integrated collection system. In the present paper a three-dimensional model of an automated storage system designed by the authors for a business partner is presented. The storage system can be used for the following types of waste: plastic and glass recipients, aluminium cans, paper, cardboard and WEEE (waste electrical and electronic equipment). Special attention has been given to the transfer subsystem, specific for the storage system, which should be able to transfer different types and shapes of waste. The described virtual model of the automated system for the storage of collected waste will be part of the virtual model of the entire integrated waste collection system as requested by the beneficiary.

  9. Automated image analysis of lateral lumbar X-rays by a form model

    International Nuclear Information System (INIS)

    Mahnken, A.H.; Kohnen, M.; Steinberg, S.; Wein, B.B.; Guenther, R.W.

    2001-01-01

    Development of software for the fully automated image analysis of lateral lumbar spine X-rays. Material and method: Using the concept of active shape models, we developed software that produces a form model of the lumbar spine from lateral lumbar spine radiographs and runs an automated image segmentation. This model is able to detect lumbar vertebrae automatically after the filtering of digitized X-ray images. The model was trained with 20 lateral lumbar spine radiographs with no pathological findings before we evaluated the software with 30 further X-ray images, which were sorted by image quality ranging from one (best) to three (worst), with 10 images for each quality. Results: Image recognition strongly depended on image quality. In group one 52 and in group two 51 out of 60 vertebral bodies including the sacrum were recognized, but in group three only 18 vertebral bodies were properly identified. Conclusion: Fully automated and reliable recognition of vertebral bodies from lateral spine radiographs using the concept of active shape models is possible. The precision of this technique is limited by the superposition of different structures. Further improvements are necessary; therefore standardized image quality and enlargement of the training data set are required. (orig.)

  10. A time-use model for the automated vehicle-era

    NARCIS (Netherlands)

    Pudāne, Baiba; Molin, Eric J.E.; Arentze, Theo A.; Maknoon, Yousef; Chorus, Caspar G.

    2018-01-01

    Automated Vehicles (AVs) offer their users a possibility to perform new non-driving activities while being on the way. The effects of this opportunity on travel choices and travel demand have mostly been conceptualised and modelled via a reduced penalty associated with (in-vehicle) travel time. This ...

  11. A Time-use Model for the Automated Vehicle-era

    NARCIS (Netherlands)

    Pudane, B.; Molin, E.J.E.; Arentze, TA; Maknoon, M.Y.; Chorus, C.G.

    2018-01-01

    Automated Vehicles (AVs) offer their users a possibility to perform new non-driving activities while being on the way. The effects of this opportunity on travel choices and travel demand have mostly been conceptualised and modelled via a reduced penalty associated with (in-vehicle) travel time. This ...

  12. Efficient family-based model checking via variability abstractions

    DEFF Research Database (Denmark)

    Dimovski, Aleksandar; Al-Sibahi, Ahmad Salim; Brabrand, Claus

    2016-01-01

    Many software systems are variational: they can be configured to meet diverse sets of requirements. They can produce a (potentially huge) number of related systems, known as products or variants, by systematically reusing common parts. For variational models (variational systems or families of related systems), specialized family-based model checking algorithms allow efficient verification of multiple variants, simultaneously, in a single run. These algorithms, implemented in a tool Snip, scale much better than "the brute force" approach, where all individual systems are verified using ... Variability abstractions replace the family-based model checking of a variational model with the abstract model checking of the concrete high-level variational model. This allows the use of Spin with all its accumulated optimizations for efficient verification of variational models without any knowledge about variability. We have implemented the transformations in a prototype tool, and we illustrate ...

  13. Optimization of automation: III. Development of optimization method for determining automation rate in nuclear power plants

    International Nuclear Information System (INIS)

    Lee, Seung Min; Kim, Jong Hyun; Kim, Man Cheol; Seong, Poong Hyun

    2016-01-01

    Highlights: • We propose an appropriate automation rate that enables the best human performance. • We analyze the shortest working time considering Situation Awareness Recovery (SAR). • The optimized automation rate is estimated by integrating the automation and ostracism rate estimation methods. • The process to derive the optimized automation rate is demonstrated through case studies. - Abstract: Automation has been introduced in various industries, including the nuclear field, because it is commonly believed to promise greater efficiency, lower workloads, and fewer operator errors, thereby enhancing operator and system performance. However, the excessive introduction of automation has deteriorated operator performance due to the side effects of automation, referred to as Out-of-the-Loop (OOTL) effects, and this is a critical issue that must be resolved. Thus, in order to determine the level of automation introduction that assures the best human operator performance, a quantitative method for optimizing automation is proposed in this paper. To derive the appropriate automation levels, the automation rate and ostracism rate, which are estimation methods that quantitatively analyze the positive and negative effects of automation, respectively, are integrated. The integration derives the shortest working time by considering the concept of situation awareness recovery (SAR): the automation rate with the shortest working time assures the best human performance. The process to derive the optimized automation rate is demonstrated through an emergency operation scenario-based case study. In this case study, four types of procedures are assumed by redesigning the original emergency operating procedure according to the introduced automation and ostracism levels. Using the ...

  14. Identifying problems and generating recommendations for enhancing complex systems: applying the abstraction hierarchy framework as an analytical tool.

    Science.gov (United States)

    Xu, Wei

    2007-12-01

    This study adopts J. Rasmussen's (1985) abstraction hierarchy (AH) framework as an analytical tool to identify problems and pinpoint opportunities to enhance complex systems. The process of identifying problems and generating recommendations for complex systems using conventional methods is usually conducted based on incompletely defined work requirements. As the complexity of systems rises, the sheer mass of data generated from these methods becomes unwieldy to manage in a coherent, systematic form for analysis. There is little known work on adopting a broader perspective to fill these gaps. AH was used to analyze an aircraft-automation system in order to further identify breakdowns in pilot-automation interactions. Four steps follow: developing an AH model for the system, mapping the data generated by various methods onto the AH, identifying problems based on the mapped data, and presenting recommendations. The breakdowns lay primarily with automation operations that were more goal directed. Identified root causes include incomplete knowledge content and ineffective knowledge structure in pilots' mental models, lack of effective higher-order functional domain information displayed in the interface, and lack of sufficient automation procedures for pilots to effectively cope with unfamiliar situations. The AH is a valuable analytical tool to systematically identify problems and suggest opportunities for enhancing complex systems. It helps further examine the automation awareness problems and identify improvement areas from a work domain perspective. Applications include the identification of problems and generation of recommendations for complex systems as well as specific recommendations regarding pilot training, flight deck interfaces, and automation procedures.

  15. Automation of program model developing for complex structure control objects

    International Nuclear Information System (INIS)

    Ivanov, A.P.; Sizova, T.B.; Mikhejkina, N.D.; Sankovskij, G.A.; Tyufyagin, A.N.

    1991-01-01

    A brief description is given of software for the automated development of models: an integrating modular programming system, a program module generator, and a program module library providing thermal-hydraulic calculation of process dynamics in power unit equipment components and on-line control system operation simulation. Technical recommendations for model development are based on experience in the creation of concrete models of NPP power units. 8 refs., 1 tab., 4 figs.

  16. AUTOMATED FEATURE BASED TLS DATA REGISTRATION FOR 3D BUILDING MODELING

    OpenAIRE

    K. Kitamura; N. Kochi; S. Kaneko

    2012-01-01

    In this paper we present a novel method for the registration of point cloud data obtained using a terrestrial laser scanner (TLS). The final goal of our investigation is the automated reconstruction of CAD drawings and the 3D modeling of objects surveyed by TLS. Because objects are scanned from multiple positions, the individual point clouds need to be registered to the same coordinate system. We propose in this paper an automated feature-based registration procedure. Our proposed method does not re...
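
    The registration step that follows feature extraction and matching is a rigid-body fit. A minimal sketch of that step alone, assuming matched point correspondences between two scans are already available (the Kabsch/SVD solution below is a standard stand-in, not necessarily the authors' formulation):

      import numpy as np

      def rigid_transform(src, dst):
          """Least-squares R, t with R @ src_i + t ~= dst_i (Kabsch/SVD)."""
          c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
          H = (src - c_src).T @ (dst - c_dst)
          U, _, Vt = np.linalg.svd(H)
          R = Vt.T @ U.T
          if np.linalg.det(R) < 0:      # guard against a reflection solution
              Vt[-1] *= -1
              R = Vt.T @ U.T
          return R, c_dst - R @ c_src

      # Hypothetical matched feature points from two scan positions.
      rng = np.random.default_rng(1)
      src = rng.uniform(size=(50, 3))
      a = np.radians(30)
      R_true = np.array([[np.cos(a), -np.sin(a), 0.0],
                         [np.sin(a),  np.cos(a), 0.0],
                         [0.0,        0.0,       1.0]])
      dst = src @ R_true.T + np.array([1.0, 2.0, 0.5])
      R, t = rigid_transform(src, dst)
      print(np.allclose(R, R_true), np.round(t, 3))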

  17. A model for automation of radioactive dose control

    International Nuclear Information System (INIS)

    Ribeiro, Carlos Henrique Calazans; Zambon, Jose Waldir; Bitelli, Ricardo; Honaiser, Eduardo Henrique Rangel

    2009-01-01

    The paper presents a proposal for automation of the personnel dose control system to be used in nuclear medicine environments. The model has considered the Standards and rules of the National Commission of Nuclear Energy (CNEN) and of the Health Ministry. The advantages of the model is a robust management of the integrated dose and technicians qualification status. The software platform selected to be used was the Lotus Notes and an analysis of the advantages, disadvantages of the use of this platform is also presented. (author)

  18. A model for automation of radioactive dose control

    Energy Technology Data Exchange (ETDEWEB)

    Ribeiro, Carlos Henrique Calazans; Zambon, Jose Waldir; Bitelli, Ricardo; Honaiser, Eduardo Henrique Rangel [Centro Tecnologico da Marinha em Sao Paulo (CTMSP), Sao Paulo, SP (Brazil)], e-mail: calazans@ctmsp.mar.mil.br, e-mail: zambon@ctmsp.mar.mil.br, e-mail: bitelli@ctmsp.mar.mil.br, e-mail: honaiser@ctmsp.mar.mil.br

    2009-07-01

    The paper presents a proposal for automation of the personnel dose control system to be used in nuclear medicine environments. The model has considered the Standards and rules of the National Commission of Nuclear Energy (CNEN) and of the Health Ministry. The advantages of the model is a robust management of the integrated dose and technicians qualification status. The software platform selected to be used was the Lotus Notes and an analysis of the advantages, disadvantages of the use of this platform is also presented. (author)

  19. Advances in automated valuation modeling AVM after the non-agency mortgage crisis

    CERN Document Server

    Kauko, Tom

    2017-01-01

    This book addresses several problems related to automated valuation methodologies (AVM). Following the non-agency mortgage crisis, it offers a variety of approaches to improve the efficiency and quality of an automated valuation methodology (AVM) dealing with emerging problems and different contexts. Spatial issue, evolution of AVM standards, multilevel models, fuzzy and rough set applications and quantitative methods to define comparables are just some of the topics discussed.

  20. Automated cost modeling for coal combustion systems

    International Nuclear Information System (INIS)

    Rowe, R.M.; Anast, K.R.

    1991-01-01

    This paper reports on cost information developed at AMAX R and D Center for coal-water slurry production implemented in an automated spreadsheet (Lotus 123) for personal computer use. The spreadsheet format allows the user toe valuate impacts of various process options, coal feedstock characteristics, fuel characteristics, plant location sites, and plant sizes on fuel cost. Model flexibility reduces time and labor required to determine fuel costs and provides a basis to compare fuels manufactured by different processes. The model input includes coal characteristics, plant flowsheet definition, plant size, and market location. Based on these inputs, selected unit operations are chosen for coal processing

  1. Model abstraction addressing long-term simulations of chemical degradation of large-scale concrete structures

    International Nuclear Information System (INIS)

    Jacques, D.; Perko, J.; Seetharam, S.; Mallants, D.

    2012-01-01

    This paper presents a methodology to assess the spatial-temporal evolution of chemical degradation fronts in real-size concrete structures typical of a near-surface radioactive waste disposal facility. The methodology consists of the abstraction of a so-called full (complicated) model accounting for the multi-component, multi-scale nature of concrete to an abstracted (simplified) model which simulates chemical concrete degradation based on a single component in the aqueous and solid phase. The abstracted model is verified against chemical degradation fronts simulated with the full model under both diffusive and advective transport conditions. Implementation in the multi-physics simulation tool COMSOL allows simulation of the spatial-temporal evolution of chemical degradation fronts in large-scale concrete structures. (authors)
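
    To make the single-component abstraction concrete, here is a minimal sketch, under strong simplifying assumptions (one dissolving solid buffer, diffusion-limited leaching, invented parameter values), of why such a model can track a degradation front at all; the square-root-of-time law is the classical shrinking-core result, not the paper's full model:

      import numpy as np

      D = 1e-11     # effective diffusivity of the aggressive species [m^2/s] (assumed)
      c_eq = 1.0    # boundary concentration driving the leaching (normalized, assumed)
      s0 = 50.0     # buffering solid per unit volume (normalized, assumed)

      def front_position(t_seconds):
          """Depth of the degraded zone: quasi-steady diffusion through the
          leached layer consuming the buffer gives x = sqrt(2*D*c_eq*t/s0)."""
          return np.sqrt(2.0 * D * c_eq * t_seconds / s0)

      for years in (10, 100, 1000):
          t = years * 365.25 * 86400.0
          print(f"{years:5d} y -> front at {front_position(t) * 1000:.1f} mm")

    A single aqueous/solid component with an effective diffusivity reproduces this front behaviour, which is what allows the abstracted model to replace the multi-component chemistry over long simulation times.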

  2. Improving the driver-automation interaction: an approach using automation uncertainty.

    Science.gov (United States)

    Beller, Johannes; Heesen, Matthias; Vollrath, Mark

    2013-12-01

    The aim of this study was to evaluate whether communicating automation uncertainty improves the driver-automation interaction. A false system understanding of infallibility may provoke automation misuse and can lead to severe consequences in case of automation failure. The presentation of automation uncertainty may prevent this false system understanding and, as was shown by previous studies, may have numerous benefits. Few studies, however, have clearly shown the potential of communicating uncertainty information in driving. The current study fills this gap. We conducted a driving simulator experiment, varying the presented uncertainty information between participants (no uncertainty information vs. uncertainty information) and the automation reliability (high vs. low) within participants. Participants interacted with a highly automated driving system while engaging in secondary tasks and were required to cooperate with the automation to drive safely. Quantile regressions and multilevel modeling showed that the presentation of uncertainty information increases the time to collision in the case of automation failure. Furthermore, the data indicated improved situation awareness and better knowledge of fallibility for the experimental group. Consequently, the automation with the uncertainty symbol received higher trust ratings and increased acceptance. The presentation of automation uncertainty through a symbol improves overall driver-automation cooperation. Most automated systems in driving could benefit from displaying reliability information. This display might improve the acceptance of fallible systems and further enhance driver-automation cooperation.

  3. An automation model of Effluent Treatment Plant

    Directory of Open Access Journals (Sweden)

    Luiz Alberto Oliveira Lima Roque

    2012-07-01

    ... on the conservation of water resources, this paper aims to propose an automation model of an Effluent Treatment Plant, using the Ladder programming language and supervisory systems.

  4. Using All-Sky Imaging to Improve Telescope Scheduling (Abstract)

    Science.gov (United States)

    Cole, G. M.

    2017-12-01

    (Abstract only) Automated scheduling makes it possible for a small telescope to observe a large number of targets in a single night. But when used in areas which have less-than-perfect sky conditions such automation can lead to large numbers of observations of clouds and haze. This paper describes the development of a "sky-aware" telescope automation system that integrates the data flow from an SBIG AllSky340c camera with an enhanced dispatch scheduler to make optimum use of the available observing conditions for two highly instrumented backyard telescopes. Using the minute-by-minute time series image stream and a self-maintained reference database, the software maintains a file of sky brightness, transparency, stability, and forecasted visibility at several hundred grid positions. The scheduling software uses this information in real time to exclude targets obscured by clouds and select the best observing task, taking into account the requirements and limits of each instrument.
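
    A toy dispatch pass in the spirit of the system described, where per-cell sky transparency derived from the all-sky camera first filters and then ranks targets. The field names, threshold, and scoring rule below are assumptions for illustration, not the author's implementation:

      from dataclasses import dataclass

      @dataclass
      class Target:
          name: str
          grid_cell: int     # index into the all-sky condition grid
          priority: float    # science priority set by the observer

      def pick_next(targets, transparency, min_transparency=0.6):
          """Return the best observable target, or None if the sky is closed."""
          visible = [t for t in targets
                     if transparency[t.grid_cell] >= min_transparency]
          if not visible:
              return None
          # Favor priority; break ties toward the clearest patch of sky.
          return max(visible, key=lambda t: (t.priority, transparency[t.grid_cell]))

      transparency = {0: 0.95, 1: 0.30, 2: 0.80}   # per-cell sky transparency
      targets = [Target("V1234 Cyg", 1, 9.0),      # behind clouds -> skipped
                 Target("SS Cyg", 0, 8.0),
                 Target("Z Cam", 2, 8.0)]
      print(pick_next(targets, transparency).name)  # SS Cyg

    Re-running such a pass every minute against the refreshed sky grid is what makes the scheduler "sky-aware": a high-priority target behind clouds loses to a clear-sky alternative.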

  5. Automation in Warehouse Development

    CERN Document Server

    Verriet, Jacques

    2012-01-01

    The warehouses of the future will come in a variety of forms, but with a few common ingredients. Firstly, human operational handling of items in warehouses is increasingly being replaced by automated item handling. Extended warehouse automation counteracts the scarcity of human operators and supports the quality of picking processes. Secondly, the development of models to simulate and analyse warehouse designs and their components facilitates the challenging task of developing warehouses that take into account each customer’s individual requirements and logistic processes. Automation in Warehouse Development addresses both types of automation from the innovative perspective of applied science. In particular, it describes the outcomes of the Falcon project, a joint endeavour by a consortium of industrial and academic partners. The results include a model-based approach to automate warehouse control design, analysis models for warehouse design, concepts for robotic item handling and computer vision, and auton...

  6. Automated extraction of knowledge for model-based diagnostics

    Science.gov (United States)

    Gonzalez, Avelino J.; Myler, Harley R.; Towhidnejad, Massood; Mckenzie, Frederic D.; Kladke, Robin R.

    1990-01-01

    The concept of accessing computer aided design (CAD) design databases and extracting a process model automatically is investigated as a possible source for the generation of knowledge bases for model-based reasoning systems. The resulting system, referred to as automated knowledge generation (AKG), uses an object-oriented programming structure and constraint techniques, as well as an internal database of component descriptions, to generate a frame-based structure that describes the model. The procedure has been designed to be general enough to be easily coupled to CAD systems that feature a database capable of providing label and connectivity data for the drawn system. The AKG system is capable of defining knowledge bases in formats required by various model-based reasoning tools.

  7. Modeling Multiple Human-Automation Distributed Systems using Network-form Games

    Science.gov (United States)

    Brat, Guillaume

    2012-01-01

    The paper describes at a high-level the network-form game framework (based on Bayes net and game theory), which can be used to model and analyze safety issues in large, distributed, mixed human-automation systems such as NextGen.

  8. World-wide distribution automation systems

    International Nuclear Information System (INIS)

    Devaney, T.M.

    1994-01-01

    A worldwide power distribution automation system is outlined. Distribution automation is defined and the status of utility automation is discussed. Other topics discussed include distribution management systems; substation, feeder, and customer functions; potential benefits; automation costs; planning and engineering considerations; automation trends; databases; system operation; computer modeling of the system; and distribution management systems.

  9. An Overview of the Automated Dispatch Controller Algorithms in the System Advisor Model (SAM)

    Energy Technology Data Exchange (ETDEWEB)

    DiOrio, Nicholas A [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2017-11-22

    Three automatic dispatch modes have been added to the battery model within the System Advisor Model. These controllers have been developed to perform peak shaving in an automated fashion, providing users with a way to see the benefit of reduced demand charges without manually programming a complicated dispatch control. A flexible input option allows more advanced interaction with the automated controller. This document describes the algorithms in detail and presents brief results on their use and limitations.
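
    As a rough sketch of what an automated peak-shaving rule does (a simplified stand-in, not NREL's algorithm; the threshold logic and all parameters below are assumptions):

      def peak_shave(load_kw, target_kw, capacity_kwh, power_kw, dt_h=1.0):
          """Clip load above target_kw with the battery; recharge below it."""
          soc = capacity_kwh / 2          # start half full (assumption)
          grid = []
          for load in load_kw:
              if load > target_kw:        # discharge to shave the peak
                  p = min(load - target_kw, power_kw, soc / dt_h)
                  soc -= p * dt_h
                  grid.append(load - p)
              else:                       # recharge spare headroom
                  p = min(target_kw - load, power_kw, (capacity_kwh - soc) / dt_h)
                  soc += p * dt_h
                  grid.append(load + p)
          return grid

      load = [40, 45, 80, 95, 70, 50, 42]   # kW, hypothetical hourly feeder load
      print(peak_shave(load, target_kw=60, capacity_kwh=50, power_kw=25))
      # An undersized battery lets the 95 kW hour exceed the 60 kW target.

    The printed profile also shows the main limitation such controllers face: once the battery's power or energy limit binds, the demand target is breached, which is why sizing and dispatch are studied together.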

  10. A Simple Take on Typed Abstract Syntax in Haskell-Like Languages

    DEFF Research Database (Denmark)

    Danvy, Olivier; Rhiger, Morten

    2000-01-01

    We present a simple way to program typed abstract syntax in a language following a Hindley-Milner typing discipline, such as Haskell and ML, and we apply it to automate two proofs about normalization functions as embodied in type-directed partial evaluation for the simply typed lambda calculus...

  11. Modeling take-over performance in level 3 conditionally automated vehicles.

    Science.gov (United States)

    Gold, Christian; Happee, Riender; Bengler, Klaus

    2017-11-28

    Taking over vehicle control from a Level 3 conditionally automated vehicle can be a demanding task for a driver. The take-over determines the controllability of automated vehicle functions and thereby also traffic safety. This paper presents models predicting the main take-over performance variables: take-over time, minimum time-to-collision, brake application and crash probability. These variables are considered in relation to the situational and driver-related factors: time-budget, traffic density, non-driving-related task, repetition, the current lane and driver's age. Regression models were developed using 753 take-over situations recorded in a series of driving simulator experiments. The models were validated with data from five other driving simulator experiments of mostly unrelated authors with another 729 take-over situations. The models accurately captured take-over time, time-to-collision and crash probability, and moderately predicted the brake application. Especially the time-budget, traffic density and the repetition strongly influenced the take-over performance, while the non-driving-related tasks, the lane and drivers' age explained a minor portion of the variance in the take-over performances. Copyright © 2017 Elsevier Ltd. All rights reserved.
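
    Schematically, such regression models take the situational factors as predictors of take-over performance. The toy predictor below shows only the model form; its coefficients are invented placeholders, not the paper's fitted values (the paper does report that time-budget and repetition act in the directions commented):

      def takeover_time_s(time_budget_s, traffic_density, repetition, engaged):
          """Toy linear predictor of take-over time; placeholder coefficients."""
          return (2.0                      # baseline (invented)
                  + 0.15 * time_budget_s   # larger budgets -> later reactions
                  + 0.30 * traffic_density
                  - 0.20 * repetition      # practice speeds the take-over
                  + 0.50 * engaged)        # non-driving task adds delay

      print(takeover_time_s(time_budget_s=7, traffic_density=2,
                            repetition=1, engaged=1))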

  12. Modelling, abstraction, and computation in systems biology: A view from computer science.

    Science.gov (United States)

    Melham, Tom

    2013-04-01

    Systems biology is centrally engaged with computational modelling across multiple scales and at many levels of abstraction. Formal modelling, precise and formalised abstraction relationships, and computation also lie at the heart of computer science--and over the past decade a growing number of computer scientists have been bringing their discipline's core intellectual and computational tools to bear on biology in fascinating new ways. This paper explores some of the apparent points of contact between the two fields, in the context of a multi-disciplinary discussion on conceptual foundations of systems biology. Copyright © 2012 Elsevier Ltd. All rights reserved.

  13. Factors Impacting the Conversion of Abstracts Presented at the Canadian Cardiovascular Congress Meetings to Full Publications

    DEFF Research Database (Denmark)

    Abuzeid, W.; Fosbøl, E.; Fosbøl, Philip Loldrup

    2013-01-01

    ... associated with the transition of abstracts to full publications. Background: The rate of conversion of abstracts presented at scientific meetings into peer-reviewed published manuscripts has been a topic of interest for various medical specialties. Rapid translation of abstracts into manuscripts allows for reliable and rapid communication of scientific knowledge into practice. Methods: Using a previously validated automated computer algorithm, we searched the ISI Web of Science to identify peer-reviewed full manuscript publications of abstracts presented at the CCC. We manually entered information about ...

  14. Automation in organizations: Eternal conflict

    Science.gov (United States)

    Dieterly, D. L.

    1981-01-01

    Some ideas on and insights into the problems associated with automation in organizations are presented with emphasis on the concept of automation, its relationship to the individual, and its impact on system performance. An analogy is drawn, based on an American folk hero, to emphasize the extent of the problems encountered when dealing with automation within an organization. A model is proposed to focus attention on a set of appropriate dimensions. The function allocation process becomes a prominent aspect of the model. The current state of automation research is mentioned in relation to the ideas introduced. Proposed directions for an improved understanding of automation's effect on the individual's efficiency are discussed. The importance of understanding the individual's perception of the system in terms of the degree of automation is highlighted.

  15. The Abstract Machine Model for Transaction-based System Control

    Energy Technology Data Exchange (ETDEWEB)

    Chassin, David P.

    2003-01-31

    Recent work applying statistical mechanics to economic modeling has demonstrated the effectiveness of using thermodynamic theory to address the complexities of large-scale economic systems. Transaction-based control systems depend on the conjecture that when control of thermodynamic systems is based on price-mediated strategies (e.g., auctions, markets), the optimal allocation of resources in a market-based control system results in an emergent optimal control of the thermodynamic system. This paper proposes an abstract machine model as the necessary precursor for demonstrating this conjecture and establishes the dynamic laws as the basis for a special theory of emergence applied to the global behavior and control of complex adaptive systems. The abstract machine in a large system amounts to the analog of a particle in thermodynamic theory. These permit the establishment of a theory of dynamic control of complex system behavior based on statistical mechanics. Thus we may be better able to engineer a few simple control laws for a very small number of device types, which, when deployed in very large numbers and operated as a system of many interacting markets, yields the stable and optimal control of the thermodynamic system.

  16. Automated evolutionary restructuring of workflows to minimise errors via stochastic model checking

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter

    2014-01-01

    This paper presents a framework for the automated restructuring of workflows that allows one to minimise the impact of errors on a production workflow. The framework allows for the modelling of workflows by means of a formalised subset of the Business Process Modelling and Notation (BPMN) language...

  17. Model of informational system for freight insurance automation based on digital signature

    OpenAIRE

    Maxim E. SLOBODYANYUK

    2009-01-01

    In the article, a model of an informational system for freight insurance automation based on digital signature is considered; its architecture, a macro flowchart of information flow in the model, and its components (modules) and their functions are shown. A calculation method for the costs of interactive cargo insurance via the proposed system is described, and the main characteristics and options of existing transport management systems and conceptual cost models are represented.

  18. Model of informational system for freight insurance automation based on digital signature

    Directory of Open Access Journals (Sweden)

    Maxim E. SLOBODYANYUK

    2009-01-01

    Full Text Available In the article, a model of an informational system for freight insurance automation based on digital signature is considered; its architecture, a macro flowchart of information flow in the model, and its components (modules) and their functions are shown. A calculation method for the costs of interactive cargo insurance via the proposed system is described, and the main characteristics and options of existing transport management systems and conceptual cost models are represented.

  19. Initial Assessment and Modeling Framework Development for Automated Mobility Districts: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Hou, Yi [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Young, Stanley E [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Garikapati, Venu [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Chen, Yuche [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Zhu, Lei [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2018-02-07

    Automated vehicles (AVs) are increasingly being discussed as the basis for on-demand mobility services, introducing a new paradigm in which a fleet of AVs displaces private automobiles for day-to-day travel in dense activity districts. This paper examines a concept to displace privately owned automobiles within a region containing dense activity generators (jobs, retail, entertainment, etc.), referred to as an automated mobility district (AMD). This paper reviews several such districts, including airports, college campuses, business parks, downtown urban cores, and military bases, with examples of previous attempts to meet the mobility needs apart from private automobiles, some with automated technology and others with more traditional transit-based solutions. The issues and benefits of AMDs are framed within the perspective of intra-district, inter-district, and border issues, and the requirements for a modeling framework are identified to adequately reflect the breadth of mobility, energy, and emissions impact anticipated with AMDs.

  20. Automated Liquibase Generator and Validator (ALGV)

    Directory of Open Access Journals (Sweden)

    Manik Jain

    2015-08-01

    Full Text Available Abstract This paper presents an automation tool, namely ALGV (Automated Liquibase Generator and Validator), for the automated generation and verification of liquibase scripts. Liquibase is one of the most efficient ways of applying and persisting changes to a database schema. Since its invention by Nathan Voxland [1], it has become the de facto standard for database change management. The advantages of using liquibase scripts over traditional sql queries range from version control to reusing the same scripts over multiple database platforms. Irrespective of these advantages, manual creation of liquibase scripts takes a lot of effort and is sometimes error-prone. ALGV helps to reduce the time-consuming liquibase script generation, the manual typing effort, possible error occurrence, and the manual verification process and time by 75%. Automating the liquibase generation process also helps to remove the burden of recollecting the specific tags to be used for a particular change. Moreover, developers can concentrate on the business logic and business data rather than wasting their precious efforts in writing files.

  1. EBS Radionuclide Transport Abstraction

    International Nuclear Information System (INIS)

    Schreiner, R.

    2001-01-01

    The purpose of this work is to develop the Engineered Barrier System (EBS) radionuclide transport abstraction model, as directed by a written development plan (CRWMS M and O 1999a). This abstraction is the conceptual model that will be used to determine the rate of release of radionuclides from the EBS to the unsaturated zone (UZ) in the total system performance assessment-license application (TSPA-LA). In particular, this model will be used to quantify the time-dependent radionuclide releases from a failed waste package (WP) and their subsequent transport through the EBS to the emplacement drift wall/UZ interface. The development of this conceptual model will allow Performance Assessment Operations (PAO) and its Engineered Barrier Performance Department to provide a more detailed and complete EBS flow and transport abstraction. The results from this conceptual model will allow PAO to address portions of the key technical issues (KTIs) presented in three NRC Issue Resolution Status Reports (IRSRs): (1) the Evolution of the Near-Field Environment (ENFE), Revision 2 (NRC 1999a), (2) the Container Life and Source Term (CLST), Revision 2 (NRC 1999b), and (3) the Thermal Effects on Flow (TEF), Revision 1 (NRC 1998). The conceptual model for flow and transport in the EBS will be referred to as the "EBS RT Abstraction" in this analysis/modeling report (AMR). The scope of this abstraction and report is limited to flow and transport processes. More specifically, this AMR does not discuss elements of the TSPA-SR and TSPA-LA that relate to the EBS but are discussed in other AMRs. These elements include corrosion processes, radionuclide solubility limits, waste form dissolution rates, and concentrations of colloidal particles that are generally represented as boundary conditions or input parameters for the EBS RT Abstraction. In effect, this AMR provides the algorithms for transporting radionuclides using the flow geometry and radionuclide concentrations determined by other ...

  2. Comparison of BrainTool to other UML modeling and model transformation tools

    Science.gov (United States)

    Nikiforova, Oksana; Gusarovs, Konstantins

    2017-07-01

    Over the last 30 years, numerous model-driven software generation systems have been offered, targeting problems with development productivity and the resulting software quality. CASE tools developed to date are advertised as having "complete code-generation capabilities". Nowadays the Object Management Group (OMG) makes similar arguments with regard to Unified Modeling Language (UML) models at different levels of abstraction. It is said that software development using CASE tools enables a significant level of automation. Today's CASE tools usually offer a combination of several features, starting with a model editor and a model repository for the traditional ones, and ending with a code generator (which could use a scripting or domain-specific language (DSL)), a transformation tool to produce new artifacts from the manually created ones, and a transformation definition editor to define new transformations for the most advanced ones. The present paper contains the results of a comparison of CASE tools (mainly UML editors) against the level of automation they offer.

  3. 77 FR 14535 - Agency Information Collection Activities: Application To Use the Automated Commercial Environment...

    Science.gov (United States)

    2012-03-12

    ... Activities: Application To Use the Automated Commercial Environment (ACE) AGENCY: U.S. Customs and Border... Commercial Environment (ACE). This request for comment is being made pursuant to the Paperwork Reduction Act... Number: None. Abstract: The Automated Commercial Environment (ACE) is a trade processing system that will...

  4. Generating Models of Infinite-State Communication Protocols Using Regular Inference with Abstraction

    Science.gov (United States)

    Aarts, Fides; Jonsson, Bengt; Uijen, Johan

    In order to facilitate model-based verification and validation, effort is underway to develop techniques for generating models of communication system components from observations of their external behavior. Most previous such work has employed regular inference techniques which generate modest-size finite-state models. They typically suppress parameters of messages, although these have a significant impact on control flow in many communication protocols. We present a framework, which adapts regular inference to include data parameters in messages and states for generating components with large or infinite message alphabets. A main idea is to adapt the framework of predicate abstraction, successfully used in formal verification. Since we are in a black-box setting, the abstraction must be supplied externally, using information about how the component manages data parameters. We have implemented our techniques by connecting the LearnLib tool for regular inference with the protocol simulator ns-2, and generated a model of the SIP component as implemented in ns-2.
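
    The key ingredient of the framework is the externally supplied abstraction that folds message parameters into a small abstract alphabet the regular-inference learner can handle. A minimal sketch of such a mapper, with SIP-flavoured names chosen for the example rather than taken from the paper:

      def abstract(msg, state):
          """Map a concrete (method, call_id) message to an abstract symbol.
          The mapper remembers the first call_id seen, so later messages
          abstract to CURRENT/OTHER instead of unboundedly many fresh values."""
          method, call_id = msg
          if state.get("call_id") is None:
              state["call_id"] = call_id
              return (method, "NEW")
          return (method, "CURRENT" if call_id == state["call_id"] else "OTHER")

      state = {}
      trace = [("INVITE", 7341), ("ACK", 7341), ("INVITE", 9902)]
      print([abstract(m, state) for m in trace])
      # [('INVITE', 'NEW'), ('ACK', 'CURRENT'), ('INVITE', 'OTHER')]

    Because the learner only ever sees the finite abstract alphabet, a standard regular-inference algorithm can be run unchanged, while the mapper's state carries the data aspects of the protocol.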

  5. Data for Environmental Modeling (D4EM): Background and Applications of Data Automation

    Science.gov (United States)

    The Data for Environmental Modeling (D4EM) project demonstrates the development of a comprehensive set of open source software tools that overcome obstacles to accessing data needed by automating the process of populating model input data sets with environmental data available fr...

  6. Adaptive Automation Design and Implementation

    Science.gov (United States)

    2015-09-17

    ... with an automated system to a real-world adaptive automation system implementation. There have been plenty of adaptive automation ... of systems without increasing manpower requirements by allocating routine tasks to automated aids, improving safety through the use of automated ... between intermediate levels of automation, explicitly defining which human task a given level automates. Each model aids the creation and classification ...

  7. Statistical modelling of networked human-automation performance using working memory capacity.

    Science.gov (United States)

    Ahmed, Nisar; de Visser, Ewart; Shaw, Tyler; Mohamed-Ameen, Amira; Campbell, Mark; Parasuraman, Raja

    2014-01-01

    This study examines the challenging problem of modelling the interaction between individual attentional limitations and decision-making performance in networked human-automation system tasks. Analysis of real experimental data from a task involving networked supervision of multiple unmanned aerial vehicles by human participants shows that both task load and network message quality affect performance, but that these effects are modulated by individual differences in working memory (WM) capacity. These insights were used to assess three statistical approaches for modelling and making predictions with real experimental networked supervisory performance data: classical linear regression, non-parametric Gaussian processes and probabilistic Bayesian networks. It is shown that each of these approaches can help designers of networked human-automated systems cope with various uncertainties in order to accommodate future users by linking expected operating conditions and performance from real experimental data to observable cognitive traits like WM capacity. Practitioner Summary: Working memory (WM) capacity helps account for inter-individual variability in operator performance in networked unmanned aerial vehicle supervisory tasks. This is useful for reliable performance prediction near experimental conditions via linear models; robust statistical prediction beyond experimental conditions via Gaussian process models and probabilistic inference about unknown task conditions/WM capacities via Bayesian network models.
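
    As an illustration of the Gaussian-process option among the three modelling approaches, the scikit-learn sketch below fits performance against hypothetical task-load, message-quality, and WM-span features; the synthetic data and generative rule are placeholders, not the study's data:

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, WhiteKernel

      rng = np.random.default_rng(42)
      X = rng.uniform(0, 1, size=(80, 3))   # [task_load, msg_quality, wm_span]
      # Invented generative rule: quality helps, load hurts, WM moderates load.
      y = 0.6 * X[:, 1] - 0.5 * X[:, 0] * (1 - X[:, 2]) + rng.normal(0, 0.05, 80)

      gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
      gp.fit(X, y)
      mean, std = gp.predict([[0.8, 0.5, 0.9]], return_std=True)
      print(f"predicted performance {mean[0]:.2f} +/- {std[0]:.2f}")

    The predictive standard deviation is the practical payoff the abstract mentions: unlike a plain linear fit, the GP flags when a queried combination of conditions and WM capacity lies far from the observed data.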

  8. Geochemistry Model Abstraction and Sensitivity Studies for the 21 PWR CSNF Waste Package

    International Nuclear Information System (INIS)

    Bernot, P.; LeStrange, S.; Thomas, E.; Zarrabi, K.; Arthur, S.

    2002-01-01

    The CSNF geochemistry model abstraction, as directed by the TWP (BSC 2002b), was developed to provide regression analysis of EQ6 cases to obtain abstracted values of pH (and in some cases HCO3- concentration) for use in the Configuration Generator Model. The pH of the system is the controlling factor over U mineralization, the CSNF degradation rate, and the HCO3- concentration in solution. The abstraction encompasses a large variety of combinations for the degradation rates of materials. The "base case" used EQ6 simulations examining differing steel/alloy corrosion rates, drip rates, and percent fuel exposure. Other values, such as the pH/HCO3--dependent fuel corrosion rate and the corrosion rate of A516, were kept constant. Relationships were developed for pH as a function of these differing rates to be used in the calculation of total C and, subsequently, the fuel rate. An additional refinement to the abstraction was the addition of abstracted pH values for cases where there was limited O2 for waste package corrosion and a flushing fluid other than J-13, which had been used in all EQ6 calculations up to this point. These abstractions also used EQ6 simulations with varying combinations of corrosion rates of materials to abstract the pH (and HCO3- in the limited-O2 cases) as a function of WP material corrosion rates. The goodness of fit for most of the abstracted values was above an R² of 0.9. Those below this value occurred at the very beginning of WP corrosion, when large variations in the system pH are observed. However, the significance of the F-statistic for all the abstractions showed that the variable relationships are significant. For the abstraction, an analysis of the minerals that may form the "sludge" in the waste package was also presented. This analysis indicates that a number of different iron and aluminum minerals may form in the waste package other than those described in the EQ6 output files, which are based on the use ...

  9. Shingle 2.0: generalising self-consistent and automated domain discretisation for multi-scale geophysical models

    Science.gov (United States)

    Candy, Adam S.; Pietrzak, Julie D.

    2018-01-01

    The approaches taken to describe and develop spatial discretisations of the domains required for geophysical simulation models are commonly ad hoc, model- or application-specific, and under-documented. This is particularly acute for simulation models that are flexible in their use of multi-scale, anisotropic, fully unstructured meshes where a relatively large number of heterogeneous parameters are required to constrain their full description. As a consequence, it can be difficult to reproduce simulations, to ensure a provenance in model data handling and initialisation, and a challenge to conduct model intercomparisons rigorously. This paper takes a novel approach to spatial discretisation, considering it much like a numerical simulation model problem of its own. It introduces a generalised, extensible, self-documenting approach to carefully describe, and necessarily fully, the constraints over the heterogeneous parameter space that determine how a domain is spatially discretised. This additionally provides a method to accurately record these constraints, using high-level natural language based abstractions that enable full accounts of provenance, sharing, and distribution. Together with this description, a generalised consistent approach to unstructured mesh generation for geophysical models is developed that is automated, robust and repeatable, quick-to-draft, rigorously verified, and consistent with the source data throughout. This interprets the description above to execute a self-consistent spatial discretisation process, which is automatically validated to expected discrete characteristics and metrics. Library code, verification tests, and examples available in the repository at https://github.com/shingleproject/Shingle. Further details of the project presented at http://shingleproject.org.

  10. Illuminance-based slat angle selection model for automated control of split blinds

    Energy Technology Data Exchange (ETDEWEB)

    Hu, Jia; Olbina, Svetlana [Rinker School of Building Construction, University of Florida, Gainesville, FL 32611-5703 (United States)

    2011-03-15

    Venetian blinds play an important role in controlling daylight in buildings. Automated blinds overcome some limitations of manual blinds; however, the existing automated systems mainly control the direct solar radiation and glare and cannot be used for controlling innovative blind systems such as split blinds. This research developed an Illuminance-based Slat Angle Selection (ISAS) model that predicts the optimum slat angles of split blinds to achieve the designed indoor illuminance. The model was constructed based on a series of multi-layer feed-forward artificial neural networks (ANNs). The illuminance values at the sensor points used to develop the ANNs were obtained with the software EnergyPlus™. The weather determinants (such as horizontal illuminance and sun angles) were used as the input variables for the ANNs. The illuminance level at a sensor point was the output variable for the ANNs. The ISAS model was validated by evaluating the errors in the calculation of the: 1) illuminance and 2) optimum slat angles. The validation results showed that the power of the ISAS model to predict illuminance was 94.7% while its power to calculate the optimum slat angles was 98.5%. For about 90% of the time in the year, the illuminance percentage errors were less than 10%, and the percentage errors in calculating the optimum slat angles were less than 5%. This research offers a new approach for the automated control of split blinds and a guide for future research to utilize the adaptive nature of ANNs to develop a more practical and applicable blind control system. (author)
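
    A minimal stand-in for the ISAS idea: a feed-forward network maps weather determinants plus a candidate slat angle to a predicted illuminance, and scanning candidate angles picks the one closest to the design setpoint. The inputs, units, and toy response function below are assumptions, not the paper's EnergyPlus-trained model:

      import numpy as np
      from sklearn.neural_network import MLPRegressor
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(7)
      # [horizontal_illuminance_klx, solar_altitude_deg, slat_angle_deg]
      X = rng.uniform([5, 5, 0], [100, 80, 90], size=(2000, 3))
      y = X[:, 0] * np.cos(np.radians(X[:, 2])) * (0.3 + 0.7 * X[:, 1] / 80)

      net = make_pipeline(StandardScaler(),
                          MLPRegressor(hidden_layer_sizes=(16, 16),
                                       max_iter=2000, random_state=0))
      net.fit(X, y)   # feed-forward ANN on the toy illuminance response

      def best_angle(hz_illum, altitude, setpoint, angles=np.arange(0, 91, 5)):
          """Scan candidate slat angles; return the one nearest the setpoint."""
          cand = np.column_stack([np.full(angles.shape, hz_illum),
                                  np.full(angles.shape, altitude),
                                  angles.astype(float)])
          return angles[np.argmin(np.abs(net.predict(cand) - setpoint))]

      print(best_angle(60.0, 45.0, setpoint=20.0), "degrees")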

  11. Automated Physico-Chemical Cell Model Development through Information Theory

    Energy Technology Data Exchange (ETDEWEB)

    Peter J. Ortoleva

    2005-11-29

    The objective of this project was to develop predictive models of the chemical responses of microbial cells to variations in their surroundings. The application of these models is the optimization of environmental remediation and energy-producing biotechnical processes. The principles on which our project is based are as follows: chemical thermodynamics and kinetics; automation of calibration through information theory; integration of multiplex data (e.g., cDNA microarrays, NMR, proteomics), cell modeling, and bifurcation theory to overcome cellular complexity; and the use of multiplex data and information theory to calibrate and run an incomplete model. In this report we review four papers summarizing key findings and a web-enabled, multiple-module workflow we have implemented that consists of a set of interoperable systems biology computational modules.

  12. Automated migration analysis based on cell texture: method & reliability

    Directory of Open Access Journals (Sweden)

    Chittenden Thomas W

    2005-03-01

    Full Text Available Abstract Background: In this paper, we present and validate a way to measure automatically the extent of cell migration based on automated examination of a series of digital photographs. It was designed specifically to identify the impact of Second Hand Smoke (SHS) on endothelial cell migration but has broader applications. The analysis has two stages: (1) preprocessing of image texture, and (2) migration analysis. Results: The output is a graphic overlay that indicates the front lines of cell migration superimposed on each original image, with automated reporting of the distance traversed vs. time. Expert comparison to manual placement of the leading edge shows complete equivalence of automated vs. manual leading edge definition for cell migration measurement. Conclusion: Our method is indistinguishable from careful manual determinations of cell front lines, with the advantages of full automation, objectivity, and speed.
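
    A toy version of the two-stage pipeline (texture preprocessing, then migration analysis): threshold a crude local-texture measure and trace the leading edge column by column. The paper's actual texture processing is richer; the window size, threshold, and synthetic frame here are invented:

      import numpy as np

      def leading_edge(image, win=5, thresh=0.1):
          """Per column, first row where a local-std texture measure fires."""
          h, w = image.shape
          edge = np.full(w, -1)
          for x in range(w):
              col = image[:, x]
              tex = np.array([col[max(0, y - win):y + win].std()
                              for y in range(h)])
              hits = np.nonzero(tex > thresh)[0]
              if hits.size:
                  edge[x] = hits[0]
          return edge

      # Hypothetical frame: plain background above row 40, textured cells below.
      rng = np.random.default_rng(3)
      frame = np.vstack([np.zeros((40, 64)), rng.normal(0.0, 1.0, (24, 64))])
      print(leading_edge(frame).mean())   # a few rows above 40: the front line

    Comparing the per-column edge between successive frames then yields the distance-traversed-versus-time report the abstract describes.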

  13. Drivers' communicative interactions: on-road observations and modelling for integration in future automation systems.

    Science.gov (United States)

    Portouli, Evangelia; Nathanael, Dimitris; Marmaras, Nicolas

    2014-01-01

    Social interactions with other road users are an essential component of the driving activity and may prove critical in view of future automation systems; still, up to now they have received only limited attention in the scientific literature. In this paper, it is argued that drivers base their anticipations about the traffic scene to a large extent on observations of the social behaviour of other 'animate human-vehicles'. It is further argued that in cases of uncertainty, drivers seek to establish a mutual situational awareness through deliberate communicative interactions. A linguistic model is proposed for modelling these communicative interactions. Empirical evidence from on-road observations and analysis of concurrent running commentary by 25 experienced drivers supports the proposed model. It is suggested that the integration of a social interactions layer based on illocutionary acts in future driving support and automation systems will improve their performance towards matching the human driver's expectations. Practitioner Summary: Interactions between drivers on the road may play a significant role in traffic coordination. On-road observations and running commentaries are presented as empirical evidence to support a model of such interactions; incorporation of drivers' interactions in future driving support and automation systems may improve their performance towards matching drivers' expectations.

  14. Model-driven design using IEC 61499 a synchronous approach for embedded and automation systems

    CERN Document Server

    Yoong, Li Hsien; Bhatti, Zeeshan E; Kuo, Matthew M Y

    2015-01-01

    This book describes a novel approach for the design of embedded systems and industrial automation systems, using a unified model-driven approach that is applicable in both domains.  The authors illustrate their methodology, using the IEC 61499 standard as the main vehicle for specification, verification, static timing analysis and automated code synthesis.  The well-known synchronous approach is used as the main vehicle for defining an unambiguous semantics that ensures determinism and deadlock freedom. The proposed approach also ensures very efficient implementations either on small-scale embedded devices or on industry-scale programmable automation controllers (PACs). It can be used for both centralized and distributed implementations. Significantly, the proposed approach can be used without the need for any run-time support. This approach, for the first time, blurs the gap between embedded systems and automation systems and can be applied in wide-ranging applications in automotive, robotics, and industri...

  15. Mobile home automation-merging mobile value added services and home automation technologies

    OpenAIRE

    Rosendahl, Andreas; Hampe, Felix J.; Botterweck, Goetz

    2007-01-01

    non-peer-reviewed. In this paper we study mobile home automation, a field that emerges from an integration of mobile application platforms and home automation technologies. In a conceptual introduction we first illustrate the need for such applications by introducing a two-dimensional conceptual model of mobility. Subsequently we suggest an architecture and discuss different options of how a user might access a mobile home automation service and the controlled devices. As another contrib...

  16. Generic process model structures: towards a standard notation for abstract representations

    CSIR Research Space (South Africa)

    Van Der Merwe, A

    2007-10-01

    Full Text Available in the case of objects, or repositories in the case of process models. The creation of the MIT Process Handbook was a step in this direction. However, although the authors used object-oriented concepts in the abstract representations, they did not rigorously...

  17. Proceedings. Fourth international symposium on mine mechanisation and automation

    Energy Technology Data Exchange (ETDEWEB)

    Gurgenci, H.; Hood, M. [eds.]

    1997-12-31

    Papers in the first volume are presented under the following session headings: drilling; mining robotics; machine monitoring; mine automation systems; reliability and maintenance; mine automation - communications; mechanical excavation of medium-strength rock; and new mining equipment technologies. The second volume covers: mechanical excavation of hard rock; autonomous vehicles; mechanical excavation industry experience; machine guidance; applications of rock mechanics; mine planning, management and scheduling; orebody delineation; and safety. Selected papers have been abstracted separately for the IEA Coal Research databases available on CD-ROM and the worldwide web.

  18. Twenty-ninth ORNL/DOE conference on analytical chemistry in energy technology. Abstracts of papers

    International Nuclear Information System (INIS)

    1986-01-01

    This booklet contains separate abstracts of 55 individual papers presented at this conference. Different sections in the book are titled as follows: laser techniques; resonance ionization spectroscopy; laser applications; new developments in mass spectrometry; analytical chemistry of hazardous waste; and automation and data management

  19. e-conome: an automated tissue counting platform of cone photoreceptors for rodent models of retinitis pigmentosa

    Directory of Open Access Journals (Sweden)

    Clérin Emmanuelle

    2011-12-01

    Full Text Available Abstract Background Retinitis pigmentosa is characterized by the sequential loss of rod and cone photoreceptors. The preservation of cones would prevent blindness due to their essential role in human vision. Rod-derived Cone Viability Factor is a thioredoxin-like protein that is secreted by rods and is involved in cone survival. To validate the activity of Rod-derived Cone Viability Factors (RdCVFs) as therapeutic agents for treating retinitis pigmentosa, we have developed e-conome, an automated cell counting platform for retinal flat mounts of rodent models of cone degeneration. This automated quantification method allows for faster data analysis, thereby accelerating translational research. Methods An inverted fluorescent microscope, motorized and coupled to a CCD camera, records images of cones labeled with fluorescent peanut agglutinin lectin on flat-mounted retinas. In an average of 300 fields per retina, nine Z-planes at magnification X40 are acquired after two-stage autofocus individually for each field. The projection of the stack of 9 images is subject to a threshold and filtered to exclude aberrant images based on preset variables. The cones are identified by treating the resulting image using 13 empirically determined variables. The cone density is calculated over the 300 fields. Results The method was validated by comparison to conventional stereological counting. The decrease in cone density in the rd1 mouse was found to be equivalent to the decrease determined by stereological counting. We also studied the spatiotemporal pattern of the degeneration of cones in the rd1 mouse and show that while the reduction in cone density starts in the central part of the retina, cone degeneration progresses at the same speed over the whole retinal surface. We finally show that for mice with an inactivation of the Nucleoredoxin-like genes Nxnl1 or Nxnl2 encoding RdCVFs, the loss of cones is more pronounced in the ventral retina. Conclusion The automated

  20. An efficient approach to bioconversion kinetic model generation based on automated microscale experimentation integrated with model driven experimental design

    DEFF Research Database (Denmark)

    Chen, B. H.; Micheletti, M.; Baganz, F.

    2009-01-01

    Reliable models of enzyme kinetics are required for the effective design of bioconversion processes. Kinetic expressions of the enzyme-catalysed reaction rate, however, are frequently complex, and establishing accurate values of kinetic parameters normally requires a large number of experiments. These can be both time consuming and expensive when working with the types of non-natural chiral intermediates important in pharmaceutical syntheses. This paper presents an automated microscale approach to the rapid and cost-effective generation of reliable kinetic models useful for bioconversion process ... -erythrulose. Experiments were performed using automated microwell studies at the 150 or 800 μL scale. The derived kinetic parameters were then verified in a second round of experiments where model predictions showed excellent agreement with experimental data obtained under conditions not included in the original ...

  1. An Automated Weather Research and Forecasting (WRF)-Based Nowcasting System: Software Description

    Science.gov (United States)

    2013-10-01

    A Web service/Web interface software package has been engineered to address the need for an automated means to run the Weather Research and Forecasting (WRF) model. By Stephen F. Kirby, Brian P. Reen, and Robert E. Dumais Jr., Computational and Information Sciences.

  2. Abstraction of an Affective-Cognitive Decision Making Model Based on Simulated Behaviour and Perception Chains

    Science.gov (United States)

    Sharpanskykh, Alexei; Treur, Jan

    Employing rich internal agent models of actors in large-scale socio-technical systems often results in scalability issues. The problem addressed in this paper is how to improve computational properties of a complex internal agent model, while preserving its behavioral properties. The problem is addressed for the case of an existing affective-cognitive decision making model instantiated for an emergency scenario. For this internal decision model an abstracted behavioral agent model is obtained, which ensures a substantial increase of the computational efficiency at the cost of approximately 1% behavioural error. The abstraction technique used can be applied to a wide range of internal agent models with loops, for example, involving mutual affective-cognitive interactions.

  3. Generation and exploration of aggregation abstractions for scheduling and resource allocation

    Science.gov (United States)

    Lowry, Michael R.; Linden, Theodore A.

    1993-01-01

    This paper presents research on the abstraction of computational theories for scheduling and resource allocation. The paper describes both theory and methods for the automated generation of aggregation abstractions and approximations in which detailed resource allocation constraints are replaced by constraints between aggregate demand and capacity. The interaction of aggregation abstraction generation with the more thoroughly investigated abstractions of weakening operator preconditions is briefly discussed. The purpose of generating abstract theories for aggregated demand and resources includes: answering queries about aggregate properties, such as gross feasibility; reducing computational costs by using the solution of aggregate problems to guide the solution of detailed problems; facilitating reformulating theories to approximate problems for which there are efficient problem-solving methods; and reducing computational costs of scheduling by providing more opportunities for variable and value-ordering heuristics to be effective. Experiments are being developed to characterize the properties of aggregations that make them cost effective. Both abstract and concrete theories are represented in a variant of first-order predicate calculus, which is a parameterized multi-sorted logic that facilitates specification of large problems. A particular problem is conceptually represented as a set of ground sentences that is consistent with a quantified theory.
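
    The gross-feasibility use of such an aggregation can be sketched in a few lines: detailed per-task allocation constraints are replaced by a comparison of aggregate demand against aggregate capacity per time bucket. This is an illustrative reduction only; the paper represents both levels as multi-sorted first-order theories.

        # Aggregate feasibility test: a "no" proves the detailed scheduling
        # problem infeasible; a "yes" only licenses descending to the detailed
        # model. Task and capacity encodings are illustrative.
        from collections import defaultdict

        def gross_feasible(tasks, capacity):
            """tasks: iterable of (bucket, demand); capacity: {bucket: units}."""
            demand = defaultdict(float)
            for bucket, d in tasks:
                demand[bucket] += d
            return all(demand[b] <= capacity.get(b, 0.0) for b in demand)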

  4. Programming with models: modularity and abstraction provide powerful capabilities for systems biology.

    Science.gov (United States)

    Mallavarapu, Aneil; Thomson, Matthew; Ullian, Benjamin; Gunawardena, Jeremy

    2009-03-06

    Mathematical models are increasingly used to understand how phenotypes emerge from systems of molecular interactions. However, their current construction as monolithic sets of equations presents a fundamental barrier to progress. Overcoming this requires modularity, enabling sub-systems to be specified independently and combined incrementally, and abstraction, enabling generic properties of biological processes to be specified independently of specific instances. These, in turn, require models to be represented as programs rather than as datatypes. Programmable modularity and abstraction enables libraries of modules to be created, which can be instantiated and reused repeatedly in different contexts with different components. We have developed a computational infrastructure that accomplishes this. We show here why such capabilities are needed, what is required to implement them and what can be accomplished with them that could not be done previously.
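
    A minimal sketch of the programs-not-datatypes idea follows: a generic reaction module is written once and instantiated repeatedly with different components. The Michaelis-Menten module and all names are illustrative stand-ins for the authors' much richer infrastructure.

        # A reusable "module" returns its contribution to the ODE right-hand
        # side; instantiating it with different species reuses the same code.
        def mm_module(substrate, product, vmax, km):
            def rate(y):
                return vmax * y[substrate] / (km + y[substrate])
            return {substrate: lambda y: -rate(y), product: lambda y: +rate(y)}

        def combine(*modules):
            """Merge module contributions into a single right-hand side."""
            def rhs(y):
                dy = {k: 0.0 for k in y}
                for m in modules:
                    for species, f in m.items():
                        dy[species] += f(y)
                return dy
            return rhs

        # Two instances of the same module, chained A -> B -> C.
        model = combine(mm_module("A", "B", vmax=1.0, km=0.5),
                        mm_module("B", "C", vmax=0.7, km=0.2))
        print(model({"A": 1.0, "B": 0.0, "C": 0.0}))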

  5. Use of noncrystallographic symmetry for automated model building at medium to low resolution.

    Science.gov (United States)

    Wiegels, Tim; Lamzin, Victor S

    2012-04-01

    A novel method is presented for the automatic detection of noncrystallographic symmetry (NCS) in macromolecular crystal structure determination which does not require the derivation of molecular masks or the segmentation of density. It was found that throughout structure determination the NCS-related parts may be differently pronounced in the electron density. This often results in the modelling of molecular fragments of variable length and accuracy, especially during automated model-building procedures. These fragments were used to identify NCS relations in order to aid automated model building and refinement. In a number of test cases higher completeness and greater accuracy of the obtained structures were achieved, specifically at a crystallographic resolution of 2.3 Å or poorer. In the best case, the method allowed the building of up to 15% more residues automatically and a tripling of the average length of the built fragments.

  6. Seismic Consequence Abstraction

    International Nuclear Information System (INIS)

    Gross, M.

    2004-01-01

    The primary purpose of this model report is to develop abstractions for the response of engineered barrier system (EBS) components to seismic hazards at a geologic repository at Yucca Mountain, Nevada, and to define the methodology for using these abstractions in a seismic scenario class for the Total System Performance Assessment - License Application (TSPA-LA). A secondary purpose of this model report is to provide information for criticality studies related to seismic hazards. The seismic hazards addressed herein are vibratory ground motion, fault displacement, and rockfall due to ground motion. The EBS components are the drip shield, the waste package, and the fuel cladding. The requirements for development of the abstractions and the associated algorithms for the seismic scenario class are defined in ''Technical Work Plan For: Regulatory Integration Modeling of Drift Degradation, Waste Package and Drip Shield Vibratory Motion and Seismic Consequences'' (BSC 2004 [DIRS 171520]). The development of these abstractions will provide a more complete representation of flow into and transport from the EBS under disruptive events. The results from this development will also address portions of integrated subissue ENG2, Mechanical Disruption of Engineered Barriers, including the acceptance criteria for this subissue defined in Section 2.2.1.3.2.3 of the ''Yucca Mountain Review Plan, Final Report'' (NRC 2003 [DIRS 163274])

  7. Seismic Consequence Abstraction

    Energy Technology Data Exchange (ETDEWEB)

    M. Gross

    2004-10-25

    The primary purpose of this model report is to develop abstractions for the response of engineered barrier system (EBS) components to seismic hazards at a geologic repository at Yucca Mountain, Nevada, and to define the methodology for using these abstractions in a seismic scenario class for the Total System Performance Assessment - License Application (TSPA-LA). A secondary purpose of this model report is to provide information for criticality studies related to seismic hazards. The seismic hazards addressed herein are vibratory ground motion, fault displacement, and rockfall due to ground motion. The EBS components are the drip shield, the waste package, and the fuel cladding. The requirements for development of the abstractions and the associated algorithms for the seismic scenario class are defined in ''Technical Work Plan For: Regulatory Integration Modeling of Drift Degradation, Waste Package and Drip Shield Vibratory Motion and Seismic Consequences'' (BSC 2004 [DIRS 171520]). The development of these abstractions will provide a more complete representation of flow into and transport from the EBS under disruptive events. The results from this development will also address portions of integrated subissue ENG2, Mechanical Disruption of Engineered Barriers, including the acceptance criteria for this subissue defined in Section 2.2.1.3.2.3 of the ''Yucca Mountain Review Plan, Final Report'' (NRC 2003 [DIRS 163274]).

  8. Automated and model-based assembly of an anamorphic telescope

    Science.gov (United States)

    Holters, Martin; Dirks, Sebastian; Stollenwerk, Jochen; Loosen, Peter

    2018-02-01

    Since the first usage of optical glasses there has been an increasing demand for optical systems which are highly customized for a wide field of applications. To meet the challenge of producing so many unique systems, the development of new techniques and approaches has risen in importance. However, the assembly of precision optical systems with lot sizes of one up to a few tens of systems is still dominated by manual labor. In contrast, highly adaptive and model-based approaches may offer a solution for manufacturing with a high degree of automation and high throughput while maintaining high precision. In this work a model-based automated assembly approach based on ray-tracing is presented. This process runs autonomously and accounts for a wide range of functionality. It firstly identifies the sequence for an optimized assembly and secondly generates and matches intermediate figures of merit to predict the overall optical functionality of the optical system. This process also takes into account the generation of a digital twin of the optical system, by mapping key performance indicators like the first and second moments of intensity into the optical model. This approach is verified by the automatic assembly of an anamorphic telescope within an assembly cell. By continuously measuring and mapping the key performance indicators into the optical model, the quality of the digital twin is determined. Moreover, by measuring the optical quality and geometrical parameters of the telescope, the precision of this approach is determined. Finally, the productivity of the process is evaluated by monitoring the speed of the different steps of the process.

  9. Automated differentiation of computer models for sensitivity analysis

    International Nuclear Information System (INIS)

    Worley, B.A.

    1990-01-01

    Sensitivity analysis of reactor physics computer models is an established discipline after more than twenty years of active development of generalized perturbations theory based on direct and adjoint methods. Many reactor physics models have been enhanced to solve for sensitivities of model results to model data. The calculated sensitivities are usually normalized first derivatives although some codes are capable of solving for higher-order sensitivities. The purpose of this paper is to report on the development and application of the GRESS system for automating the implementation of the direct and adjoint techniques into existing FORTRAN computer codes. The GRESS system was developed at ORNL to eliminate the costly man-power intensive effort required to implement the direct and adjoint techniques into already-existing FORTRAN codes. GRESS has been successfully tested for a number of codes over a wide range of applications and presently operates on VAX machines under both VMS and UNIX operating systems

  10. Automated differentiation of computer models for sensitivity analysis

    International Nuclear Information System (INIS)

    Worley, B.A.

    1991-01-01

    Sensitivity analysis of reactor physics computer models is an established discipline after more than twenty years of active development of generalized perturbations theory based on direct and adjoint methods. Many reactor physics models have been enhanced to solve for sensitivities of model results to model data. The calculated sensitivities are usually normalized first derivatives, although some codes are capable of solving for higher-order sensitivities. The purpose of this paper is to report on the development and application of the GRESS system for automating the implementation of the direct and adjoint techniques into existing FORTRAN computer codes. The GRESS system was developed at ORNL to eliminate the costly man-power intensive effort required to implement the direct and adjoint techniques into already-existing FORTRAN codes. GRESS has been successfully tested for a number of codes over a wide range of applications and presently operates on VAX machines under both VMS and UNIX operating systems. (author). 9 refs, 1 tab

  11. Design And Modeling An Automated Digsilent Power System For Optimal New Load Locations

    Directory of Open Access Journals (Sweden)

    Mohamed Saad

    2015-08-01

    Full Text Available Abstract The electric power utilities seek to take advantage of novel approaches to meet growing energy demand. Utilities are under pressure to evolve their classical topologies to increase the usage of distributed generation. Currently, electrical power engineers in many regions of the world implement manual methods to measure power consumption for further assessment of voltage violations; such a process has proved to be time consuming, costly and inaccurate. In addition, demand response is a grid management technique in which retail or wholesale customers are requested, either electronically or manually, to reduce their load. Therefore, this paper aims to design and model an automated power system for optimal new load locations using DPL (DIgSILENT Programming Language). This study is a diagnostic approach that informs the system operator about any voltage violation cases that would arise when adding a new load to the grid. The process of identifying the optimal bus bar location involves a complicated calculation of the power consumption at each load bus. Accordingly, the DPL program considers all the IEEE 30-bus internal network data and then executes a load flow simulation, adding the new load to each bus in turn. The developed model thus simulates the new load at each available bus bar in the network and generates three analytical reports for each case that capture the over/under voltage and the loading of elements across the grid.
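
    The screening loop described above can be sketched with the open-source pandapower library in place of DIgSILENT's DPL. The 30-bus test case ships with pandapower; the 5 MW trial load and the 0.95/1.05 p.u. limits are illustrative assumptions, not values from the paper.

        import pandapower as pp
        import pandapower.networks as pn

        net = pn.case30()                      # 30-bus test network
        reports = {}
        for bus in net.bus.index:
            load = pp.create_load(net, bus=bus, p_mw=5.0)   # candidate new load
            try:
                pp.runpp(net)                  # AC load flow
                vm = net.res_bus.vm_pu
                reports[bus] = {
                    "undervoltage": int((vm < 0.95).sum()),
                    "overvoltage": int((vm > 1.05).sum()),
                    "max_loading_pct": float(net.res_line.loading_percent.max()),
                }
            finally:
                net.load.drop(load, inplace=True)   # restore for the next trial

        # Optimal location: fewest violations, then lowest line loading.
        best = min(reports, key=lambda b: (reports[b]["undervoltage"]
                                           + reports[b]["overvoltage"],
                                           reports[b]["max_loading_pct"]))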

  12. Automated crack detection in conductive smart-concrete structures using a resistor mesh model

    Science.gov (United States)

    Downey, Austin; D'Alessandro, Antonella; Ubertini, Filippo; Laflamme, Simon

    2018-03-01

    Various nondestructive evaluation techniques are currently used to automatically detect and monitor cracks in concrete infrastructure. However, these methods often lack scalability and cost-effectiveness over large geometries. A solution is the use of self-sensing carbon-doped cementitious materials. These self-sensing materials are capable of providing a measurable change in electrical output that can be related to their damage state. Previous work by the authors showed that a resistor mesh model could be used to track damage in structural components fabricated from electrically conductive concrete, where damage was located through the identification of high-resistance resistors in the mesh. In this work, an automated damage detection strategy is introduced that places high-value resistors into the previously developed resistor mesh model using a sequential Monte Carlo method. Here, high-value resistors are used to mimic the internal condition of damaged cementitious specimens. The proposed automated damage detection method is experimentally validated using a 500 × 500 × 50 mm³ reinforced cement paste plate doped with multi-walled carbon nanotubes exposed to 100 identical impact tests. Results demonstrate that the proposed Monte Carlo method is capable of detecting and localizing the most prominent damage in a structure, demonstrating that automated damage detection in smart-concrete structures is a promising strategy for real-time structural health monitoring of civil infrastructure.
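
    The forward half of such a resistor mesh model is ordinary nodal analysis, sketched below. The paper's sequential Monte Carlo search would repeatedly propose damaged edges, run a forward model of this kind, and weight the proposals by their agreement with measured boundary voltages; the grid size, conductances, and single current drive here are illustrative.

        import numpy as np

        def mesh_voltages(n, damaged=frozenset(), src=0, sink=None,
                          g0=1.0, g_dam=1e-6):
            """Node voltages of an n x n unit-resistor grid, 1 A from src to sink.
            damaged: set of edges (a, b), a < b, whose conductance has collapsed
            to g_dam -- the 'high value resistor' stand-in for a crack."""
            N = n * n
            sink = N - 1 if sink is None else sink
            G = np.zeros((N, N))                       # conductance matrix
            for r in range(n):
                for c in range(n):
                    k = r * n + c
                    right = [k + 1] if c + 1 < n else []
                    down = [k + n] if r + 1 < n else []
                    for nb in right + down:
                        g = g_dam if (k, nb) in damaged else g0
                        G[k, k] += g; G[nb, nb] += g
                        G[k, nb] -= g; G[nb, k] -= g
            I = np.zeros(N)
            I[src], I[sink] = 1.0, -1.0
            keep = [k for k in range(N) if k != sink]  # ground the sink node
            v = np.zeros(N)
            v[keep] = np.linalg.solve(G[np.ix_(keep, keep)], I[keep])
            return v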

  13. Abstraction of man-made shapes

    KAUST Repository

    Mehra, Ravish; Zhou, Qingnan; Long, Jeremy; Sheffer, Alla; Gooch, Amy Ashurst; Mitra, Niloy J.

    2009-01-01

    Man-made objects are ubiquitous in the real world and in virtual environments. While such objects can be very detailed, capturing every small feature, they are often identified and characterized by a small set of defining curves. Compact, abstracted shape descriptions based on such curves are often visually more appealing than the original models, which can appear to be visually cluttered. We introduce a novel algorithm for abstracting three-dimensional geometric models using characteristic curves or contours as building blocks for the abstraction. Our method robustly handles models with poor connectivity, including the extreme cases of polygon soups, common in models of man-made objects taken from online repositories. In our algorithm, we use a two-step procedure that first approximates the input model using a manifold, closed envelope surface and then extracts from it a hierarchical abstraction curve network along with suitable normal information. The constructed curve networks form a compact, yet powerful, representation for the input shapes, retaining their key shape characteristics while discarding minor details and irregularities. © 2009 ACM.

  14. Modal abstractions of concurrent behavior

    DEFF Research Database (Denmark)

    Nielson, Flemming; Nanz, Sebastian; Nielson, Hanne Riis

    2011-01-01

    We present an effective algorithm for the automatic construction of finite modal transition systems as abstractions of potentially infinite concurrent processes. Modal transition systems are recognized as valuable abstractions for model checking because they allow for the validation as well as refutation of safety and liveness properties. However, the algorithmic construction of finite abstractions from potentially infinite concurrent processes is a missing link that prevents their more widespread usage for model checking of concurrent systems. Our algorithm is a worklist algorithm using concepts from abstract interpretation and operating upon mappings from sets to intervals in order to express simultaneous over- and underapproximations of the multisets of process actions available in a particular state. We obtain a finite abstraction that is 3-valued in both states and transitions...
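
    A schematic of the worklist construction is sketched below, with abstract states as maps from actions to intervals. The transfer functions and control-flow structure are placeholders, missing actions are treated as (0, 0), and a real interval domain would additionally need widening to guarantee termination.

        from collections import deque

        def interval_join(a, b):
            """Pointwise hull of two interval maps {action: (lo, hi)}."""
            return {k: (min(a.get(k, (0, 0))[0], b.get(k, (0, 0))[0]),
                        max(a.get(k, (0, 0))[1], b.get(k, (0, 0))[1]))
                    for k in set(a) | set(b)}

        def analyze(cfg, entry, init):
            """cfg: {state: [(transfer, successor), ...]}; every state must be
            a key (possibly with an empty edge list). Each transfer maps an
            interval map to an interval map. Returns the stabilised maps."""
            env = {s: {} for s in cfg}
            env[entry] = init
            work = deque([entry])
            while work:
                s = work.popleft()
                for transfer, t in cfg[s]:
                    new = interval_join(env[t], transfer(env[s]))
                    if new != env[t]:           # grew: repropagate from t
                        env[t] = new
                        work.append(t)
            return env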

  15. Automated Test Assembly for Cognitive Diagnosis Models Using a Genetic Algorithm

    Science.gov (United States)

    Finkelman, Matthew; Kim, Wonsuk; Roussos, Louis A.

    2009-01-01

    Much recent psychometric literature has focused on cognitive diagnosis models (CDMs), a promising class of instruments used to measure the strengths and weaknesses of examinees. This article introduces a genetic algorithm to perform automated test assembly alongside CDMs. The algorithm is flexible in that it can be applied whether the goal is to…

  16. Geometry Based Design Automation : Applied to Aircraft Modelling and Optimization

    OpenAIRE

    Amadori, Kristian

    2012-01-01

    Product development processes are continuously challenged by demands for increased efficiency. As engineering products become more and more complex, efficient tools and methods for integrated and automated design are needed throughout the development process. Multidisciplinary Design Optimization (MDO) is one promising technique that has the potential to drastically improve concurrent design. MDO frameworks combine several disciplinary models with the aim of gaining a holistic perspective of ...

  17. Inventory Abstraction

    International Nuclear Information System (INIS)

    Leigh, C.

    2000-01-01

    The purpose of the inventory abstraction as directed by the development plan (CRWMS M and O 1999b) is to: (1) Interpret the results of a series of relative dose calculations (CRWMS M and O 1999c, 1999d). (2) Recommend, including a basis thereof, a set of radionuclides that should be modeled in the Total System Performance Assessment in Support of the Site Recommendation (TSPA-SR) and the Total System Performance Assessment in Support of the Final Environmental Impact Statement (TSPA-FEIS). (3) Provide initial radionuclide inventories for the TSPA-SR and TSPA-FEIS models. (4) Answer the U.S. Nuclear Regulatory Commission (NRC)'s Issue Resolution Status Report ''Key Technical Issue: Container Life and Source Term'' (CLST IRSR) (NRC 1999) key technical issue (KTI): ''The rate at which radionuclides in SNF [Spent Nuclear Fuel] are released from the EBS [Engineered Barrier System] through the oxidation and dissolution of spent fuel'' (Subissue 3). The scope of the radionuclide screening analysis encompasses the period from 100 years to 10,000 years after the potential repository at Yucca Mountain is sealed for scenarios involving the breach of a waste package and subsequent degradation of the waste form as required for the TSPA-SR calculations. By extending the time period considered to one million years after repository closure, recommendations are made for the TSPA-FEIS. The waste forms included in the inventory abstraction are Commercial Spent Nuclear Fuel (CSNF), DOE Spent Nuclear Fuel (DSNF), High-Level Waste (HLW), naval Spent Nuclear Fuel (SNF), and U.S. Department of Energy (DOE) plutonium waste. The intended use of this analysis is in TSPA-SR and TSPA-FEIS. Based on the recommendations made here, models for release, transport, and possibly exposure will be developed for the isotopes that would be the highest contributors to the dose given a release to the accessible environment. The inventory abstraction is important in assessing system performance because

  18. Process automation system for integration and operation of Large Volume Plasma Device

    International Nuclear Information System (INIS)

    Sugandhi, R.; Srivastava, P.K.; Sanyasi, A.K.; Srivastav, Prabhakar; Awasthi, L.M.; Mattoo, S.K.

    2016-01-01

    Highlights: • Analysis and design of process automation system for Large Volume Plasma Device (LVPD). • Data flow modeling for process model development. • Modbus based data communication and interfacing. • Interface software development for subsystem control in LabVIEW. - Abstract: The Large Volume Plasma Device (LVPD) has been successfully contributing towards understanding of the plasma turbulence driven by the Electron Temperature Gradient (ETG), considered a major contributor to plasma loss in fusion devices. The large size of the device imposes certain operational difficulties, such as access for diagnostics, manual control of subsystems and monitoring of a large number of signals. To achieve integrated operation of the machine, automation is essential for enhanced performance and operational efficiency. Recently, the machine has been undergoing a major upgrade for new physics experiments. The new operation and control system consists of the following: (1) a PXIe based fast data acquisition system for the equipped diagnostics; (2) a Modbus based Process Automation System (PAS) for the subsystem controls; and (3) a Data Utilization System (DUS) for efficient storage, processing and retrieval of the acquired data. In the ongoing development, a data flow model of the machine's operation has been developed. As a proof of concept, the following two subsystems have been successfully integrated: (1) the Filament Power Supply (FPS) for heating the W-filament based plasma source and (2) the Probe Positioning System (PPS) for control of 12 linear probe drives over a travel length of 100 cm. The process model of the vacuum production system has been prepared and validated against acquired pressure data. In the next upgrade, all the subsystems of the machine will be integrated in a systematic manner. The automation backbone is based on a 4-wire multi-drop serial interface (RS485) using the Modbus communication protocol. Software is developed on the LabVIEW platform using
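
    A minimal polling sketch of such a Modbus RTU backbone, written against the pymodbus library (3.x-style API), is shown below; the serial port, slave id and register addresses are invented placeholders rather than the LVPD register map.

        from pymodbus.client import ModbusSerialClient

        # RS-485 multi-drop line; serial settings are illustrative.
        client = ModbusSerialClient(port="/dev/ttyUSB0", baudrate=9600,
                                    parity="N", stopbits=1, bytesize=8)
        if client.connect():
            # e.g. read two holding registers from a power-supply node (slave 1)
            rr = client.read_holding_registers(address=0x00, count=2, slave=1)
            if not rr.isError():
                print("registers:", rr.registers)
            client.close()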

  19. Process automation system for integration and operation of Large Volume Plasma Device

    Energy Technology Data Exchange (ETDEWEB)

    Sugandhi, R., E-mail: ritesh@ipr.res.in; Srivastava, P.K.; Sanyasi, A.K.; Srivastav, Prabhakar; Awasthi, L.M.; Mattoo, S.K.

    2016-11-15

    Highlights: • Analysis and design of process automation system for Large Volume Plasma Device (LVPD). • Data flow modeling for process model development. • Modbus based data communication and interfacing. • Interface software development for subsystem control in LabVIEW. - Abstract: The Large Volume Plasma Device (LVPD) has been successfully contributing towards understanding of the plasma turbulence driven by the Electron Temperature Gradient (ETG), considered a major contributor to plasma loss in fusion devices. The large size of the device imposes certain operational difficulties, such as access for diagnostics, manual control of subsystems and monitoring of a large number of signals. To achieve integrated operation of the machine, automation is essential for enhanced performance and operational efficiency. Recently, the machine has been undergoing a major upgrade for new physics experiments. The new operation and control system consists of the following: (1) a PXIe based fast data acquisition system for the equipped diagnostics; (2) a Modbus based Process Automation System (PAS) for the subsystem controls; and (3) a Data Utilization System (DUS) for efficient storage, processing and retrieval of the acquired data. In the ongoing development, a data flow model of the machine's operation has been developed. As a proof of concept, the following two subsystems have been successfully integrated: (1) the Filament Power Supply (FPS) for heating the W-filament based plasma source and (2) the Probe Positioning System (PPS) for control of 12 linear probe drives over a travel length of 100 cm. The process model of the vacuum production system has been prepared and validated against acquired pressure data. In the next upgrade, all the subsystems of the machine will be integrated in a systematic manner. The automation backbone is based on a 4-wire multi-drop serial interface (RS485) using the Modbus communication protocol. Software is developed on the LabVIEW platform using

  20. Automated main-chain model building by template matching and iterative fragment extension.

    Science.gov (United States)

    Terwilliger, Thomas C

    2003-01-01

    An algorithm for the automated macromolecular model building of polypeptide backbones is described. The procedure is hierarchical. In the initial stages, many overlapping polypeptide fragments are built. In subsequent stages, the fragments are extended and then connected. Identification of the locations of helical and beta-strand regions is carried out by FFT-based template matching. Fragment libraries of helices and beta-strands from refined protein structures are then positioned at the potential locations of helices and strands and the longest segments that fit the electron-density map are chosen. The helices and strands are then extended using fragment libraries consisting of sequences three amino acids long derived from refined protein structures. The resulting segments of polypeptide chain are then connected by choosing those which overlap at two or more C(alpha) positions. The fully automated procedure has been implemented in RESOLVE and is capable of model building at resolutions as low as 3.5 Å. The algorithm is useful for building a preliminary main-chain model that can serve as a basis for refinement and side-chain addition.
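
    The FFT-based template-matching step can be sketched as a cross-correlation computed with FFT convolution; the rotational search and the fragment-extension stages of the real procedure are omitted, and the template is assumed to be a precomputed block of helix density.

        import numpy as np
        from scipy.signal import fftconvolve

        def match_template(density, template, n_peaks=50):
            """Score every offset of a 3-D template against a 3-D map and
            return the n_peaks best grid positions with the score map."""
            t = template - template.mean()
            t /= np.linalg.norm(t) + 1e-12
            # Correlation == convolution with the template flipped on all axes.
            score = fftconvolve(density, t[::-1, ::-1, ::-1], mode="same")
            flat = np.argsort(score, axis=None)[::-1][:n_peaks]
            return np.column_stack(np.unravel_index(flat, score.shape)), score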

  1. Programmable automation systems in PSA

    International Nuclear Information System (INIS)

    Pulkkinen, U.

    1997-06-01

    The Finnish safety authority (STUK) requires plant-specific PSAs, and quantitative safety goals are set on different levels. The reliability analysis is more problematic when critical safety functions are realized by applying programmable automation systems. Conventional modeling techniques do not necessarily apply to the analysis of these systems, and the quantification seems to be impossible. However, it is important to analyze the contribution of programmable automation systems to plant safety, and PSA is the only method with a system-analytical view over safety. This report discusses the applicability of PSA methodology (fault tree analyses, failure modes and effects analyses) in the analysis of programmable automation systems. The problem of how to decompose programmable automation systems for reliability modeling purposes is discussed. In addition to the qualitative analysis and structural reliability modeling issues, the possibility of evaluating failure probabilities of programmable automation systems is considered. One solution to the quantification issue is the use of expert judgements, and the principles for applying expert judgements are discussed in the paper. A framework for applying expert judgements is outlined. Further, the impacts of subjective estimates on the interpretation of PSA results are discussed. (orig.) (13 refs.)

  2. Utilizing Gaze Behavior for Inferring Task Transitions Using Abstract Hidden Markov Models

    Directory of Open Access Journals (Sweden)

    Daniel Fernando Tello Gamarra

    2016-12-01

    Full Text Available We demonstrate an improved method for utilizing observed gaze behavior and show that it is useful in inferring hand movement intent during goal directed tasks. The task dynamics and the relationship between hand and gaze behavior are learned using an Abstract Hidden Markov Model (AHMM. We show that the predicted hand movement transitions occur consistently earlier in AHMM models with gaze than those models that do not include gaze observations.
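
    As a flat stand-in for the Abstract HMM, a Gaussian HMM over gaze features already illustrates the inference step. The sketch below uses the hmmlearn library; the (x, y) feature layout, the three hidden task phases and the random data are assumptions.

        import numpy as np
        from hmmlearn import hmm

        gaze = np.random.rand(500, 2)       # placeholder (x, y) gaze track
        model = hmm.GaussianHMM(n_components=3, covariance_type="diag", n_iter=50)
        model.fit(gaze)                     # unsupervised task-phase learning
        phases = model.predict(gaze)        # most likely phase per sample
        # A transition into a new phase, inferred from gaze alone, is the cue
        # that anticipates the corresponding hand-movement transition.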

  3. Modeling Abstraction Hierarchy Levels of the Cyber Attacks Using Random Process

    OpenAIRE

    Durrieu, Gilles; Frénod, Emmanuel; Morineau, Thierry; Nguyen, Thong

    2017-01-01

    International audience; Aspects of human behavior in cyber security allow more natural security for the user. This research focuses on the anticipation of cyber threats and their abstraction hierarchy levels in the mental picture of humans. The study concerns the modeling of the behavior of the mental states of an individual under cyber attacks. The mental states of agents being unobservable, we propose a non-stationary hidden Markov chain approach to model the agent mental behavio...

  4. AutoLens: Automated Modeling of a Strong Lens's Light, Mass and Source

    Science.gov (United States)

    Nightingale, J. W.; Dye, S.; Massey, Richard J.

    2018-05-01

    This work presents AutoLens, the first entirely automated modeling suite for the analysis of galaxy-scale strong gravitational lenses. AutoLens simultaneously models the lens galaxy's light and mass whilst reconstructing the extended source galaxy on an adaptive pixel-grid. The method's approach to source-plane discretization is amorphous, adapting its clustering and regularization to the intrinsic properties of the lensed source. The lens's light is fitted using a superposition of Sersic functions, allowing AutoLens to cleanly deblend its light from the source. Single-component mass models representing the lens's total mass density profile are demonstrated, which in conjunction with light modeling can detect central images using a centrally cored profile. Decomposed mass modeling is also shown, which can fully decouple a lens's light and dark matter and determine whether the two components are geometrically aligned. The complexity of the light and mass models is automatically chosen via Bayesian model comparison. These steps form AutoLens's automated analysis pipeline, such that all results in this work are generated without any user intervention. This is rigorously tested on a large suite of simulated images, assessing its performance on a broad range of lens profiles, source morphologies and lensing geometries. The method's performance is excellent, with accurate light, mass and source profiles inferred for data sets representative of both existing Hubble imaging and future Euclid wide-field observations.
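
    The building block of the light model is the Sersic function; a minimal implementation of one component and its superposition is sketched below. The b_n approximation (reasonable for roughly 0.5 < n < 10) is a standard shortcut, not necessarily AutoLens's exact parameterization.

        import numpy as np

        def sersic(r, I_e, r_e, n):
            """Surface brightness at radius r: effective intensity I_e,
            effective radius r_e, Sersic index n."""
            b_n = 1.9992 * n - 0.3271          # common approximation for b_n
            return I_e * np.exp(-b_n * ((r / r_e) ** (1.0 / n) - 1.0))

        def light_profile(r, components):
            """Superposition over a list of (I_e, r_e, n) tuples."""
            return sum(sersic(r, *c) for c in components)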

  5. Development of an automated core model for nuclear reactors

    International Nuclear Information System (INIS)

    Mosteller, R.D.

    1998-01-01

    This is the final report of a three-year, Laboratory Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). The objective of this project was to develop an automated package of computer codes that can model the steady-state behavior of nuclear-reactor cores of various designs. As an added benefit, data produced for steady-state analysis also can be used as input to the TRAC transient-analysis code for subsequent safety analysis of the reactor at any point in its operating lifetime. The basic capability to perform steady-state reactor-core analysis already existed in the combination of the HELIOS lattice-physics code and the NESTLE advanced nodal code. In this project, the automated package was completed by (1) obtaining cross-section libraries for HELIOS, (2) validating HELIOS by comparing its predictions to results from critical experiments and from the MCNP Monte Carlo code, (3) validating NESTLE by comparing its predictions to results from numerical benchmarks and to measured data from operating reactors, and (4) developing a linkage code to transform HELIOS output into NESTLE input

  6. The Development Of Mathematical Model For Automated Fingerprint Identification Systems Analysis

    International Nuclear Information System (INIS)

    Ardisasmita, M. Syamsa

    2001-01-01

    Fingerprint has a strong oriented and periodic structure composed of dark lines of raised skin (ridges) and clear lines of lowered skin (furrows) that twist to form a distinct pattern. Although the manner in which the ridges flow is distinctive, other characteristics of the fingerprint, called minutiae, are what are most unique to the individual. These features are particular patterns consisting of terminations or bifurcations of the ridges. To assert whether two fingerprints are from the same finger or not, experts detect those minutiae. AFIS (Automated Fingerprint Identification Systems) extract and compare these features to determine a match. The classic methods of fingerprint recognition are not suitable for direct implementation in the form of computer algorithms. The creation of a finger model was, however, a necessity for the development of new and better analysis algorithms. This paper presents a new numerical method for fingerprint simulation based on a mathematical model of the arrangement of dermatoglyphics and the creation of minutiae. This paper also describes the design and implementation of an automated fingerprint identification system which operates in two stages: minutiae extraction and minutiae matching.
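
    The extraction stage is classically implemented with the crossing-number test on a one-pixel-wide ridge skeleton, sketched below; this is the textbook method rather than the specific model of the paper.

        import numpy as np

        def minutiae(skel):
            """skel: binary ridge skeleton (1 = ridge pixel). Returns lists of
            ridge endings (crossing number 1) and bifurcations (3)."""
            endings, bifurcations = [], []
            # 8-neighbourhood in circular order for the crossing number.
            ring = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
                    (1, 1), (1, 0), (1, -1), (0, -1)]
            for r in range(1, skel.shape[0] - 1):
                for c in range(1, skel.shape[1] - 1):
                    if not skel[r, c]:
                        continue
                    nb = [int(skel[r + dr, c + dc]) for dr, dc in ring]
                    cn = sum(abs(nb[i] - nb[(i + 1) % 8]) for i in range(8)) // 2
                    if cn == 1:
                        endings.append((r, c))
                    elif cn == 3:
                        bifurcations.append((r, c))
            return endings, bifurcations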

  7. The ModelCC Model-Driven Parser Generator

    Directory of Open Access Journals (Sweden)

    Fernando Berzal

    2015-01-01

    Full Text Available Syntax-directed translation tools require the specification of a language by means of a formal grammar. This grammar must conform to the specific requirements of the parser generator to be used. This grammar is then annotated with semantic actions for the resulting system to perform its desired function. In this paper, we introduce ModelCC, a model-based parser generator that decouples language specification from language processing, avoiding some of the problems caused by grammar-driven parser generators. ModelCC receives a conceptual model as input, along with constraints that annotate it. It is then able to create a parser for the desired textual syntax and the generated parser fully automates the instantiation of the language conceptual model. ModelCC also includes a reference resolution mechanism so that ModelCC is able to instantiate abstract syntax graphs, rather than mere abstract syntax trees.

  8. Development of a web-based CANDU core management procedures automation system

    International Nuclear Information System (INIS)

    Lee, S.; Park, D.; Yeom, C.; Suh, H.

    2007-01-01

    This paper introduces the CANDU core management procedures automation system (COMPAS), a web-based application which semi-automates several CANDU core management tasks. It provides various functionalities, including selection and evaluation of refueling channels, detector calibration, coolant flow estimation and thermal power calculation, through automated interfacing with analysis codes (RFSP, NUCIRC, etc.) and plant data. It also utilizes brand new .NET computing technology such as ASP.NET, smart clients, web services and so on. Since almost all functions are abstracted from the previous experience of the current working members of the Wolsong Nuclear Power Plant (NPP), it will lead to efficient and safe operation of CANDU plants. (author)

  9. Development of a web-based CANDU core management procedures automation system

    Energy Technology Data Exchange (ETDEWEB)

    Lee, S.; Park, D.; Yeom, C. [Inst. for Advanced Engineering (IAE), Yongin (Korea, Republic of); Suh, H. [Korea Hydro and Nuclear Power (KHNP), Wolsong (Korea, Republic of)

    2007-07-01

    This paper introduces the CANDU core management procedures automation system (COMPAS), a web-based application which semi-automates several CANDU core management tasks. It provides various functionalities, including selection and evaluation of refueling channels, detector calibration, coolant flow estimation and thermal power calculation, through automated interfacing with analysis codes (RFSP, NUCIRC, etc.) and plant data. It also utilizes brand new .NET computing technology such as ASP.NET, smart clients, web services and so on. Since almost all functions are abstracted from the previous experience of the current working members of the Wolsong Nuclear Power Plant (NPP), it will lead to efficient and safe operation of CANDU plants. (author)

  10. Restructuring of workflows to minimise errors via stochastic model checking: An automated evolutionary approach

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee

    2016-01-01

    This article presents a framework for the automated restructuring of stochastic workflows to reduce the impact of faults. The framework allows for the modelling of workflows by means of a formalised subset of the BPMN workflow language. We extend this modelling formalism to describe faults...

  11. An innovative approach for modeling and simulation of an automated industrial robotic arm operated electro-pneumatically

    Science.gov (United States)

    Popa, L.; Popa, V.

    2017-08-01

    The article is focused on modeling an automated industrial robotic arm operated electro-pneumatically and on simulating the robotic arm's operation. The graphic language FBD (Function Block Diagram) is used to program the robotic arm on a Zelio Logic automation controller. The innovative modeling and simulation procedures address specific problems regarding the development of a new type of technical product in the field of robotics. In this way, new applications were identified for a Programmable Logic Controller (PLC) as a specialized computer performing control functions at a variety of high levels of complexity.

  12. Abstraction of Models for Pitting and Crevice Corrosion of Drip Shield and Waste Package Outer Barrier

    Energy Technology Data Exchange (ETDEWEB)

    K. Mon

    2001-08-29

    This analyses and models report (AMR) was conducted in response to written work direction (CRWMS M and O 1999a). ICN 01 of this AMR was developed following guidelines provided in TWP-MGR-MD-000004 REV 01, ''Technical Work Plan for: Integrated Management of Technical Product Input Department'' (BSC 2001, Addendum B). The purpose and scope of this AMR is to review and analyze upstream process-level models (CRWMS M and O 2000a and CRWMS M and O 2000b) and information relevant to pitting and crevice corrosion degradation of waste package outer barrier (Alloy 22) and drip shield (Titanium Grade 7) materials, and to develop abstractions of the important processes in a form that is suitable for input to the WAPDEG analysis for long-term degradation of waste package outer barrier and drip shield in the repository. The abstraction is developed in a manner that ensures consistency with the process-level models and information and captures the essential behavior of the processes represented. Also considered in the model abstraction are the probable range of exposure conditions in emplacement drifts and local exposure conditions on drip shield and waste package surfaces. The approach, method, and assumptions that are employed in the model abstraction are documented and justified.

  13. Abstraction of Models for Pitting and Crevice Corrosion of Drip Shield and Waste Package Outer Barrier

    International Nuclear Information System (INIS)

    Mon, K.

    2001-01-01

    This analyses and models report (AMR) was conducted in response to written work direction (CRWMS M and O 1999a). ICN 01 of this AMR was developed following guidelines provided in TWP-MGR-MD-000004 REV 01, ''Technical Work Plan for: Integrated Management of Technical Product Input Department'' (BSC 2001, Addendum B). The purpose and scope of this AMR is to review and analyze upstream process-level models (CRWMS M and O 2000a and CRWMS M and O 2000b) and information relevant to pitting and crevice corrosion degradation of waste package outer barrier (Alloy 22) and drip shield (Titanium Grade 7) materials, and to develop abstractions of the important processes in a form that is suitable for input to the WAPDEG analysis for long-term degradation of waste package outer barrier and drip shield in the repository. The abstraction is developed in a manner that ensures consistency with the process-level models and information and captures the essential behavior of the processes represented. Also considered in the model abstraction are the probable range of exposure conditions in emplacement drifts and local exposure conditions on drip shield and waste package surfaces. The approach, method, and assumptions that are employed in the model abstraction are documented and justified.

  14. Automated Eukaryotic Gene Structure Annotation Using EVidenceModeler and the Program to Assemble Spliced Alignments

    Energy Technology Data Exchange (ETDEWEB)

    Haas, B J; Salzberg, S L; Zhu, W; Pertea, M; Allen, J E; Orvis, J; White, O; Buell, C R; Wortman, J R

    2007-12-10

    EVidenceModeler (EVM) is presented as an automated eukaryotic gene structure annotation tool that reports eukaryotic gene structures as a weighted consensus of all available evidence. EVM, when combined with the Program to Assemble Spliced Alignments (PASA), yields a comprehensive, configurable annotation system that predicts protein-coding genes and alternatively spliced isoforms. Our experiments on both rice and human genome sequences demonstrate that EVM produces automated gene structure annotation approaching the quality of manual curation.

  15. Automated smoother for the numerical decoupling of dynamics models.

    Science.gov (United States)

    Vilela, Marco; Borges, Carlos C H; Vinga, Susana; Vasconcelos, Ana Tereza R; Santos, Helena; Voit, Eberhard O; Almeida, Jonas S

    2007-08-21

    Structure identification of dynamic models for complex biological systems is the cornerstone of their reverse engineering. Biochemical Systems Theory (BST) offers a particularly convenient solution because its parameters are kinetic-order coefficients which directly identify the topology of the underlying network of processes. We have previously proposed a numerical decoupling procedure that allows the identification of multivariate dynamic models of complex biological processes. While described here within the context of BST, this procedure has general applicability to signal extraction. Our original implementation relied on artificial neural networks (ANN), which caused slight, undesirable bias during the smoothing of the time courses. As an alternative, we propose here an adaptation of Whittaker's smoother and demonstrate its role within a robust, fully automated structure identification procedure. In this report we propose a robust, fully automated solution for signal extraction from time series, which is the prerequisite for the efficient reverse engineering of biological systems models. Whittaker's smoother is reformulated within the context of information theory and extended by the development of adaptive signal segmentation to account for heterogeneous noise structures. The resulting procedure can be used on arbitrary time series with a nonstationary noise process; it is illustrated here with metabolic profiles obtained from in-vivo NMR experiments. The smoothed solution, which is free of parametric bias, permits differentiation, which is crucial for the numerical decoupling of systems of differential equations. The method is applicable to signal extraction from time series with nonstationary noise structure and to the numerical decoupling of systems of differential equations into algebraic equations, and thus constitutes a rather general tool for the reverse engineering of mechanistic model descriptions from multivariate experimental
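
    The core of Whittaker's smoother is a single sparse linear solve: minimize ||y - z||^2 + lambda ||D z||^2 for a d-th order difference matrix D. A minimal sketch follows; the information-theoretic choice of lambda and the adaptive segmentation proposed in the paper are omitted.

        import numpy as np
        import scipy.sparse as sp
        from scipy.sparse.linalg import spsolve

        def whittaker(y, lam=100.0, d=2):
            """Smooth y by solving (I + lam * D'D) z = y."""
            y = np.asarray(y, dtype=float)
            n = y.size
            D = sp.eye(n, format="csr")
            for _ in range(d):
                D = D[1:] - D[:-1]          # d-th order difference matrix
            A = sp.eye(n, format="csr") + lam * (D.T @ D)
            return spsolve(A.tocsc(), y)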

  16. Deep SOMs for automated feature extraction and classification from big data streaming

    Science.gov (United States)

    Sakkari, Mohamed; Ejbali, Ridha; Zaied, Mourad

    2017-03-01

    In this paper, we propose a deep self-organizing map model (Deep-SOMs) for automated feature extraction and learning from big data streaming, for which we benefit from the Spark framework for real-time streams and highly parallel data processing. The deep SOM architecture is based on the notion of abstraction (patterns are automatically extracted from the raw data, from the less to the more abstract). The proposed model consists of three hidden self-organizing layers, an input and an output layer. Each layer is made up of a multitude of SOMs, each map focusing only on a local sub-region of the input image. Each layer then trains on the local information to generate more global information in the higher layer. The proposed Deep-SOMs model is unique in terms of the layer architecture, the SOM sampling method and learning. During the learning stage we use a set of unsupervised SOMs for feature extraction. We validate the effectiveness of our approach on large data sets such as the Leukemia and SRBCT datasets. Comparison results have shown that the Deep-SOMs model performs better than many existing algorithms for image classification.

  17. Planning for Evolution in a Production Environment: Migration from a Legacy Geometry Code to an Abstract Geometry Modeling Language in STAR

    Science.gov (United States)

    Webb, Jason C.; Lauret, Jerome; Perevoztchikov, Victor

    2012-12-01

    Increasingly detailed descriptions of complex detector geometries are required for the simulation and analysis of today's high-energy and nuclear physics experiments. As new tools for the representation of geometry models become available during the course of an experiment, a fundamental challenge arises: how best to migrate from legacy geometry codes developed over many runs to the new technologies, such as the ROOT/TGeo [1] framework, without losing touch with years of development, tuning and validation. One approach, which has been discussed within the community for a number of years, is to represent the geometry model in a higher-level language independent of the concrete implementation of the geometry. The STAR experiment has used this approach to successfully migrate its legacy GEANT 3-era geometry to an Abstract geometry Modelling Language (AgML), which allows us to create both native GEANT 3 and ROOT/TGeo implementations. The language is supported by parsers and a C++ class library which enables the automated conversion of the original source code to AgML, supports export back to the original AgSTAR[5] representation, and creates the concrete ROOT/TGeo geometry implementation used by our track reconstruction software. In this paper we present our approach, design and experience and will demonstrate physical consistency between the original AgSTAR and new AgML geometry representations.

  18. The development and verification of a highly accurate collision prediction model for automated noncoplanar plan delivery

    International Nuclear Information System (INIS)

    Yu, Victoria Y.; Tran, Angelia; Nguyen, Dan; Cao, Minsong; Ruan, Dan; Low, Daniel A.; Sheng, Ke

    2015-01-01

    Purpose: Significant dosimetric benefits had been previously demonstrated in highly noncoplanar treatment plans. In this study, the authors developed and verified an individualized collision model for the purpose of delivering highly noncoplanar radiotherapy and tested the feasibility of total delivery automation with Varian TrueBeam developer mode. Methods: A hand-held 3D scanner was used to capture the surfaces of an anthropomorphic phantom and a human subject, which were positioned with a computer-aided design model of a TrueBeam machine to create a detailed virtual geometrical collision model. The collision model included gantry, collimator, and couch motion degrees of freedom. The accuracy of the 3D scanner was validated by scanning a rigid cubical phantom with known dimensions. The collision model was then validated by generating 300 linear accelerator orientations corresponding to 300 gantry-to-couch and gantry-to-phantom distances, and comparing the corresponding distance measurements to their corresponding models. The linear accelerator orientations reflected uniformly sampled noncoplanar beam angles to the head, lung, and prostate. The distance discrepancies between measurements on the physical and virtual systems were used to estimate treatment-site-specific safety buffer distances with 0.1%, 0.01%, and 0.001% probability of collision between the gantry and couch or phantom. Plans containing 20 noncoplanar beams to the brain, lung, and prostate optimized via an in-house noncoplanar radiotherapy platform were converted into XML script for automated delivery and the entire delivery was recorded and timed to demonstrate the feasibility of automated delivery. Results: The 3D scanner measured the dimension of the 14 cm cubic phantom within 0.5 mm. The maximal absolute discrepancy between machine and model measurements for gantry-to-couch and gantry-to-phantom was 0.95 and 2.97 cm, respectively. The reduced accuracy of gantry-to-phantom measurements was
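
    The geometric kernel of such a collision model reduces to point-cloud clearance queries, sketched below with a k-d tree. The rigid transform per machine orientation, the point clouds, and the 3 cm buffer are illustrative stand-ins for the scanned surfaces and the site-specific buffers derived in the paper.

        import numpy as np
        from scipy.spatial import cKDTree

        def clearance(gantry_pts, cloud_pts, rotation, translation):
            """Minimum distance between the moved gantry surface points and the
            static couch/patient cloud; both inputs are (N, 3) arrays."""
            moved = gantry_pts @ rotation.T + translation
            dist, _ = cKDTree(cloud_pts).query(moved)
            return dist.min()

        def feasible(gantry_pts, cloud_pts, orientations, buffer_cm=3.0):
            """Keep orientations (rotation, translation) whose clearance
            exceeds the safety buffer."""
            return [o for o in orientations
                    if clearance(gantry_pts, cloud_pts, *o) > buffer_cm]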

  19. The development and verification of a highly accurate collision prediction model for automated noncoplanar plan delivery

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Victoria Y.; Tran, Angelia; Nguyen, Dan; Cao, Minsong; Ruan, Dan; Low, Daniel A.; Sheng, Ke, E-mail: ksheng@mednet.ucla.edu [Department of Radiation Oncology, David Geffen School of Medicine, University of California Los Angeles, Los Angeles, California 90024 (United States)

    2015-11-15

    Purpose: Significant dosimetric benefits have previously been demonstrated in highly noncoplanar treatment plans. In this study, the authors developed and verified an individualized collision model for the purpose of delivering highly noncoplanar radiotherapy and tested the feasibility of total delivery automation with Varian TrueBeam developer mode. Methods: A hand-held 3D scanner was used to capture the surfaces of an anthropomorphic phantom and a human subject, which were positioned with a computer-aided design model of a TrueBeam machine to create a detailed virtual geometrical collision model. The collision model included gantry, collimator, and couch motion degrees of freedom. The accuracy of the 3D scanner was validated by scanning a rigid cubical phantom with known dimensions. The collision model was then validated by generating 300 linear accelerator orientations corresponding to 300 gantry-to-couch and gantry-to-phantom distances, and comparing the measured distances to those predicted by the virtual model. The linear accelerator orientations reflected uniformly sampled noncoplanar beam angles to the head, lung, and prostate. The distance discrepancies between measurements on the physical and virtual systems were used to estimate treatment-site-specific safety buffer distances with 0.1%, 0.01%, and 0.001% probability of collision between the gantry and couch or phantom. Plans containing 20 noncoplanar beams to the brain, lung, and prostate optimized via an in-house noncoplanar radiotherapy platform were converted into XML script for automated delivery, and the entire delivery was recorded and timed to demonstrate the feasibility of automated delivery. Results: The 3D scanner measured the dimensions of the 14 cm cubic phantom to within 0.5 mm. The maximal absolute discrepancy between machine and model measurements for gantry-to-couch and gantry-to-phantom distances was 0.95 and 2.97 cm, respectively. The reduced accuracy of gantry-to-phantom measurements was

  20. The development and verification of a highly accurate collision prediction model for automated noncoplanar plan delivery.

    Science.gov (United States)

    Yu, Victoria Y; Tran, Angelia; Nguyen, Dan; Cao, Minsong; Ruan, Dan; Low, Daniel A; Sheng, Ke

    2015-11-01

    Significant dosimetric benefits have previously been demonstrated in highly noncoplanar treatment plans. In this study, the authors developed and verified an individualized collision model for the purpose of delivering highly noncoplanar radiotherapy and tested the feasibility of total delivery automation with Varian TrueBeam developer mode. A hand-held 3D scanner was used to capture the surfaces of an anthropomorphic phantom and a human subject, which were positioned with a computer-aided design model of a TrueBeam machine to create a detailed virtual geometrical collision model. The collision model included gantry, collimator, and couch motion degrees of freedom. The accuracy of the 3D scanner was validated by scanning a rigid cubical phantom with known dimensions. The collision model was then validated by generating 300 linear accelerator orientations corresponding to 300 gantry-to-couch and gantry-to-phantom distances, and comparing the measured distances to those predicted by the virtual model. The linear accelerator orientations reflected uniformly sampled noncoplanar beam angles to the head, lung, and prostate. The distance discrepancies between measurements on the physical and virtual systems were used to estimate treatment-site-specific safety buffer distances with 0.1%, 0.01%, and 0.001% probability of collision between the gantry and couch or phantom. Plans containing 20 noncoplanar beams to the brain, lung, and prostate optimized via an in-house noncoplanar radiotherapy platform were converted into XML script for automated delivery, and the entire delivery was recorded and timed to demonstrate the feasibility of automated delivery. The 3D scanner measured the dimensions of the 14 cm cubic phantom to within 0.5 mm. The maximal absolute discrepancy between machine and model measurements for gantry-to-couch and gantry-to-phantom distances was 0.95 and 2.97 cm, respectively. The reduced accuracy of gantry-to-phantom measurements was attributed to phantom setup
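
    As an illustration of how treatment-site-specific safety buffers can be derived from such model-versus-measurement discrepancies, the following sketch computes buffer distances for given collision probabilities under a Gaussian assumption on the discrepancy distribution; the sample data and the Gaussian model are our assumptions for illustration, not the authors' published method.

      import numpy as np
      from scipy.stats import norm

      def safety_buffer(discrepancies_cm, collision_prob):
          # Buffer such that P(true clearance < modelled clearance - buffer)
          # stays below collision_prob, assuming Gaussian discrepancies.
          mu = np.mean(discrepancies_cm)
          sigma = np.std(discrepancies_cm)
          return mu + sigma * norm.ppf(1.0 - collision_prob)

      rng = np.random.default_rng(0)
      sample = rng.normal(0.5, 0.8, 300)   # stand-in for 300 measured errors (cm)
      for p in (1e-3, 1e-4, 1e-5):         # 0.1%, 0.01%, 0.001%
          print(f"P(collision) < {p}: buffer = {safety_buffer(sample, p):.2f} cm")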

  1. Assessing the impacts of water abstractions on river ecosystem services: an eco-hydraulic modelling approach

    Energy Technology Data Exchange (ETDEWEB)

    Carolli, Mauro, E-mail: mauro.carolli@unitn.it; Geneletti, Davide, E-mail: davide.geneletti@unitn.it; Zolezzi, Guido, E-mail: guido.zolezzi@unitn.it

    2017-03-15

    The provision of important river ecosystem services (ES) is dependent on the flow regime. This requires methods to assess the impacts on ES caused by interventions on rivers that affect flow regime, such as water abstractions. This study proposes a method to i) quantify the provision of a set of river ES, ii) simulate the effects of water abstraction alternatives that differ in location and abstracted flow, and iii) assess the impact of water abstraction alternatives on the selected ES. The method is based on river modelling science, and integrates spatially distributed hydrological, hydraulic and habitat models at different spatial and temporal scales. The method is applied to the hydropeaked upper Noce River (Northern Italy), which is regulated by hydropower operations. We selected locally relevant river ES: habitat suitability for the adult marble trout, white-water rafting suitability, hydroelectricity production from run-of-river (RoR) plants. Our results quantify the seasonality of river ES response variables and their intrinsic non-linearity, which explains why the same abstracted flow can produce different effects on trout habitat and rafting suitability depending on the morphology of the abstracted reach. An economic valuation of the examined river ES suggests that incomes from RoR hydropower plants are of comparable magnitude to touristic revenue losses related to the decrease in rafting suitability.

  2. Assessing the impacts of water abstractions on river ecosystem services: an eco-hydraulic modelling approach

    International Nuclear Information System (INIS)

    Carolli, Mauro; Geneletti, Davide; Zolezzi, Guido

    2017-01-01

    The provision of important river ecosystem services (ES) is dependent on the flow regime. This requires methods to assess the impacts on ES caused by interventions on rivers that affect flow regime, such as water abstractions. This study proposes a method to i) quantify the provision of a set of river ES, ii) simulate the effects of water abstraction alternatives that differ in location and abstracted flow, and iii) assess the impact of water abstraction alternatives on the selected ES. The method is based on river modelling science, and integrates spatially distributed hydrological, hydraulic and habitat models at different spatial and temporal scales. The method is applied to the hydropeaked upper Noce River (Northern Italy), which is regulated by hydropower operations. We selected locally relevant river ES: habitat suitability for the adult marble trout, white-water rafting suitability, hydroelectricity production from run-of-river (RoR) plants. Our results quantify the seasonality of river ES response variables and their intrinsic non-linearity, which explains why the same abstracted flow can produce different effects on trout habitat and rafting suitability depending on the morphology of the abstracted reach. An economic valuation of the examined river ES suggests that incomes from RoR hydropower plants are of comparable magnitude to touristic revenue losses related to the decrease in rafting suitability.
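
    The non-linearity can be made concrete with a small sketch: if each reach has a bell-shaped habitat suitability curve peaking at a reach-specific optimal flow, the same abstracted flow shifts suitability very differently in different reaches. The curve parameters and flows below are hypothetical, not the paper's calibrated habitat model.

      import numpy as np

      def habitat_suitability(flow, q_opt, width):
          # Bell-shaped suitability peaking at a reach-specific optimal flow.
          return np.exp(-((flow - q_opt) / width) ** 2)

      natural, abstracted = 10.0, 4.0          # m3/s, hypothetical
      residual = natural - abstracted
      for reach, (q_opt, width) in {"narrow reach": (4.0, 2.0),
                                    "wide reach": (8.0, 3.0)}.items():
          before = habitat_suitability(natural, q_opt, width)
          after = habitat_suitability(residual, q_opt, width)
          print(f"{reach}: suitability {before:.2f} -> {after:.2f}")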

  3. Collaborative Model-based Systems Engineering for Cyber-Physical Systems, with a Building Automation Case Study

    DEFF Research Database (Denmark)

    Fitzgerald, John; Gamble, Carl; Payne, Richard

    2016-01-01

    We describe an approach to the model-based engineering of cyber-physical systems that permits the coupling of diverse discrete-event and continuous-time models and their simulators. A case study in the building automation domain demonstrates how such co-models and co-simulation can promote early...

  4. Abstracts of the symposium on unsaturated flow and transport modeling

    International Nuclear Information System (INIS)

    1982-03-01

    Abstract titles are: Recent developments in modeling variably saturated flow and transport; Unsaturated flow modeling as applied to field problems; Coupled heat and moisture transport in unsaturated soils; Influence of climatic parameters on movement of radionuclides in a multilayered saturated-unsaturated medium; Modeling water and solute transport in soil containing roots; Simulation of consolidation in partially saturated soil materials; Modeling of water and solute transport in unsaturated heterogeneous fields; Fluid dynamics and mass transfer in variably-saturated porous media; Solute transport through soils; One-dimensional analytical transport modeling; Convective transport of ideal tracers in unsaturated soils; Chemical transport in macropore-mesopore media under partially saturated conditions; Influence of the tension-saturated zone on contaminant migration in shallow water regimes; Influence of the spatial distribution of velocities in porous media on the form of solute transport; Stochastic vs deterministic models for solute movement in the field; and Stochastic analysis of flow and solute transport

  5. Toward Automated Inventory Modeling in Life Cycle Assessment: The Utility of Semantic Data Modeling to Predict Real-WorldChemical Production

    Science.gov (United States)

    A set of coupled semantic data models, i.e., ontologies, is presented to advance a methodology towards automated inventory modeling of chemical manufacturing in life cycle assessment. The cradle-to-gate life cycle inventory for chemical manufacturing is a detailed collection of ...

  6. Implementation and automated validation of the minimal Z' model in FeynRules

    International Nuclear Information System (INIS)

    Basso, L.; Christensen, N.D.; Duhr, C.; Fuks, B.; Speckner, C.

    2012-01-01

    We describe the implementation of a well-known class of U(1) gauge models, the 'minimal' Z' models, in FeynRules. We also describe a new automated validation tool for FeynRules models, which is controlled by a web interface and allows the user to run a complete set of 2 → 2 processes on different matrix element generators and in different gauges, and to compare the results between them all. Where independent implementations exist, comparison with them is also possible. This tool has been used to validate our implementation of the 'minimal' Z' models. (authors)
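
    The core of such a cross-generator check is easy to state: for every process, the cross sections returned by different generators (or computed in different gauges) must agree within their combined Monte Carlo uncertainties. The sketch below illustrates this with hypothetical numbers and generator names; it is not part of the actual web-based tool.

      # process -> {generator: (cross section in pb, Monte Carlo error)}
      results = {
          "e+ e- > mu+ mu-": {"genA": (1.002, 0.004), "genB": (0.997, 0.005)},
          "u u~ > Z' > e+ e-": {"genA": (0.231, 0.002), "genB": (0.245, 0.002)},
      }

      for process, by_gen in results.items():
          (s1, e1), (s2, e2) = by_gen.values()
          n_sigma = abs(s1 - s2) / (e1 ** 2 + e2 ** 2) ** 0.5
          verdict = "OK" if n_sigma < 3.0 else "DISAGREE"
          print(f"{process}: {verdict} ({n_sigma:.1f} sigma)")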

  7. Demo abstract: Flexhouse-2-an open source building automation platform with a focus on flexible control

    DEFF Research Database (Denmark)

    Gehrke, Oliver; Kosek, Anna Magdalena; Svendsen, Mathias

    2014-01-01

    We present Flexhouse-2, an open-source implementation of a building automation system which has been designed with a strong focus on enabling the integration of the building into a smart power system and dedicated support for the requirements of an R&D environment. We will demonstrate the need for such a platform, discuss...

  8. Assessing user acceptance towards automated and conventional sink use for hand decontamination using the technology acceptance model.

    Science.gov (United States)

    Dawson, Carolyn H; Mackrill, Jamie B; Cain, Rebecca

    2017-12-01

    Hand hygiene (HH) prevents harmful contaminants spreading in settings including domestic, health care and food handling. Strategies to improve HH range from behavioural techniques through to automated sinks that ensure hand surface cleaning. This study aimed to assess user experience and acceptance towards a new automated sink, compared to a normal sink. An adapted version of the technology acceptance model (TAM) assessed each mode of handwashing. A within-subjects design enabled N = 46 participants to evaluate both sinks. Perceived Ease of Use and Satisfaction of Use were significantly lower for the automated sink than for the conventional sink (p ...). We provide recommendations for future HH technology development to contribute a positive user experience, relevant to technology developers, ergonomists and those involved in HH across all sectors. Practitioner Summary: The need to facilitate timely, effective hand hygiene to prevent illness has led to a rise in automated handwashing systems across different contexts. User acceptance is a key factor in system uptake. This paper applies the technology acceptance model as a means to explore and optimise the design of such systems.

  9. Automation of coal mining equipment

    Energy Technology Data Exchange (ETDEWEB)

    Yamada, Ryuji

    1986-12-25

    Major machines used at the working face include the shearer and the self-advancing frame. The shearer has been changed from a radio-controlled model to a microcomputer-operated machine, with various functions automated in the process. In addition, a system for comprehensively examining operating and natural conditions at the working face is being developed for further automation. The self-advancing frame has been modified from a sequence-controlled model to a microcomputer-aided electrohydraulic control system. To proceed further with automation and to introduce robotics, detectors, control units and valves must be made smaller and more reliable. In the future the system will be controlled from above ground: the machines at the working face will be remotely controlled at the gate, with the relevant data transmitted above ground. Thus, an automated working face will be realized. (2 figs, 1 photo)

  10. Formal Test Automation: The Conference protocol with TGV/TorX

    NARCIS (Netherlands)

    Ural, Hasan; Du Bousquet, Lydie; Ramangalahy, Solofo; Probert, Robert L.; von Bochmann, Gregor; Simon, Severine; Viho, Cesar; Belinfante, Axel; de Vries, R.G.

    We present an experiment of automated formal conformance testing of the Conference Protocol Entity as reported in [2]. Our approach differs from other experiments, since it investigates the combination of the tools TGV for abstract test generation and TorX for test execution.

  11. Learning hydraulic and pneumatic systems using Automation Studio; PEMBELAJARAN SISTEM HIDROLIK DAN PNEUMATIK DENGAN MENGGUNAKAN AUTOMATION STUDIO

    Directory of Open Access Journals (Sweden)

    Adi Dewanto

    2015-02-01

    Full Text Available ABSTRACT Students find it difficult to master hydraulic and pneumatic systems because of the difficulty of visualising component movement, which hampers their learning of hydraulic and pneumatic system applications. To solve this problem, the lecturer of the Mechatronics course used the Automation Studio application. This software is helpful for designing various automation set-ups, such as combinations of hydraulic, pneumatic, electrical, and PLC systems. The lecturing process and design simulations were conducted using Automation Studio. In general, the students were greatly helped by this program in mastering the theory and practice of hydraulics and pneumatics. On the other hand, some problems were found in applying Automation Studio in the classroom, relating to the limited menu options as well as to technical aspects such as the number of available computers. The implication of the writers' experience in using Automation Studio is that there is an opportunity for computer programmers to create learning media/software for particular competences that is relevant, accessible, and applicable. Also, software preparation should be conducted by the lecturers and the students before the learning process. Keywords: automation studio program, learning process, Pneumatic and hydraulic learning

  12. Constraint-Based Abstract Semantics for Temporal Logic

    DEFF Research Database (Denmark)

    Banda, Gourinath; Gallagher, John Patrick

    2010-01-01

    Abstract interpretation provides a practical approach to verifying properties of infinite-state systems. We apply the framework of abstract interpretation to derive an abstract semantic function for the modal mu-calculus, which is the basis for abstract model checking. The abstract semantic funct...

  13. Restructuring of workflows to minimise errors via stochastic model checking: An automated evolutionary approach

    International Nuclear Information System (INIS)

    Herbert, L.T.; Hansen, Z.N.L.

    2016-01-01

    This paper presents a framework for the automated restructuring of stochastic workflows to reduce the impact of faults. The framework allows for the modelling of workflows by means of a formalised subset of the BPMN workflow language. We extend this modelling formalism to describe faults and incorporate an intention-preserving stochastic semantics able to model both probabilistic and non-deterministic behaviour. Stochastic model checking techniques are employed to generate the state-space of a given workflow. Possible improvements obtained by restructuring are measured by employing the framework's capacity for tracking real-valued quantities associated with states and transitions of the workflow. The space of possible restructurings of a workflow is explored by means of an evolutionary algorithm, where the goals for improvement are defined in terms of optimising quantities, typically employed to model resources, associated with a workflow. The approach is fully automated, and only the modelling of the production workflows, potential faults and the expression of the goals require manual input. We present the design of a software tool implementing this framework and explore the practical utility of this approach through an industrial case study in which the risk of production failures and their impact are reduced by restructuring the workflow. - Highlights: • We present a framework which allows for the automated restructuring of workflows. • This framework seeks to minimise the impact of errors on the workflow. • We illustrate a scalable software implementation of this framework. • We explore the practical utility of this approach through an industry case. • The impact of errors can be substantially reduced by restructuring the workflow.
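
    The evolutionary search over restructurings can be conveyed with a toy example in which the only restructuring is reordering tasks so that failure-prone steps occur before much work has been invested. The tasks, failure rates, and simple cost metric below are hypothetical stand-ins for the paper's stochastic-model-checking evaluation.

      import random

      FAILURE_RATE = {"cut": 0.02, "weld": 0.05, "paint": 0.01, "inspect": 0.001}

      def expected_waste(order):
          # A fault at step i wastes the i + 1 steps completed so far.
          return sum(FAILURE_RATE[task] * (i + 1) for i, task in enumerate(order))

      def mutate(order):
          a, b = random.sample(range(len(order)), 2)
          order = list(order)
          order[a], order[b] = order[b], order[a]
          return order

      best = list(FAILURE_RATE)
      for _ in range(500):                     # minimal (1+1) evolutionary loop
          candidate = mutate(best)
          if expected_waste(candidate) < expected_waste(best):
              best = candidate
      print(best, round(expected_waste(best), 3))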

  14. Automated adaptive inference of phenomenological dynamical models

    Science.gov (United States)

    Daniels, Bryan

    Understanding the dynamics of biochemical systems can seem impossibly complicated at the microscopic level: detailed properties of every molecular species, including those that have not yet been discovered, could be important for producing macroscopic behavior. The profusion of data in this area has raised the hope that microscopic dynamics might be recovered in an automated search over possible models, yet the combinatorial growth of this space has limited these techniques to systems that contain only a few interacting species. We take a different approach inspired by coarse-grained, phenomenological models in physics. Akin to a Taylor series producing Hooke's Law, forgoing microscopic accuracy allows us to constrain the search over dynamical models to a single dimension. This makes it feasible to infer dynamics with very limited data, including cases in which important dynamical variables are unobserved. We name our method Sir Isaac after its ability to infer the dynamical structure of the law of gravitation given simulated planetary motion data. Applying the method to output from a microscopically complicated but macroscopically simple biological signaling model, it is able to adapt the level of detail to the amount of available data. Finally, using nematode behavioral time series data, the method discovers an effective switch between behavioral attractors after the application of a painful stimulus.
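
    The adaptive trade-off between model detail and data volume can be illustrated with a nested hierarchy of models scored by a complexity-penalised criterion: with little data a simple model wins, and richer models are selected only as data accumulate. The polynomial hierarchy and BIC below are a simplified stand-in, not the Sir Isaac implementation.

      import numpy as np

      def bic(y, y_hat, n_params):
          n = len(y)
          rss = np.sum((y - y_hat) ** 2)
          return n * np.log(rss / n) + n_params * np.log(n)

      rng = np.random.default_rng(1)
      t = np.linspace(0.0, 1.0, 30)
      y = 1.0 + 2.0 * t + rng.normal(0.0, 0.05, t.size)    # truly linear data

      scores = {}
      for degree in range(4):                              # nested hierarchy
          coeffs = np.polyfit(t, y, degree)
          scores[degree] = bic(y, np.polyval(coeffs, t), degree + 1)
      print("selected degree:", min(scores, key=scores.get))   # -> 1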

  15. Technical Work Plan for: Near Field Environment: Engineered System: Radionuclide Transport Abstraction Model Report

    Energy Technology Data Exchange (ETDEWEB)

    J.D. Schreiber

    2006-12-08

    This technical work plan (TWP) describes work activities to be performed by the Near-Field Environment Team. The objective of the work scope covered by this TWP is to generate Revision 03 of EBS Radionuclide Transport Abstraction, referred to herein as the radionuclide transport abstraction (RTA) report. The RTA report is being revised primarily to address condition reports (CRs), to address issues identified by the Independent Validation Review Team (IVRT), to address the potential impact of transport, aging, and disposal (TAD) canister design on transport models, and to ensure integration with other models that are closely associated with the RTA report and being developed or revised in other analysis/model reports in response to IVRT comments. The RTA report will be developed in accordance with the most current version of LP-SIII.10Q-BSC and will reflect current administrative procedures (LP-3.15Q-BSC, ''Managing Technical Product Inputs''; LP-SIII.2Q-BSC, ''Qualification of Unqualified Data''; etc.), and will develop related Document Input Reference System (DIRS) reports and data qualifications as applicable in accordance with prevailing procedures. The RTA report consists of three models: the engineered barrier system (EBS) flow model, the EBS transport model, and the EBS-unsaturated zone (UZ) interface model. The flux-splitting submodel in the EBS flow model will change, so the EBS flow model will be validated again. The EBS transport model and validation of the model will be substantially revised in Revision 03 of the RTA report, which is the main subject of this TWP. The EBS-UZ interface model may be changed in Revision 03 of the RTA report due to changes in the conceptualization of the UZ transport abstraction model (a particle tracker transport model based on the discrete fracture transfer function will be used instead of the dual-continuum transport model previously used). Validation of the EBS-UZ interface model

  16. INVENTORY ABSTRACTION

    International Nuclear Information System (INIS)

    Ragan, G.

    2001-01-01

    The purpose of the inventory abstraction, which has been prepared in accordance with a technical work plan (CRWMS M&O 2000e for ICN 02 of the present analysis, and BSC 2001e for ICN 03 of the present analysis), is to: (1) Interpret the results of a series of relative dose calculations (CRWMS M&O 2000c, 2000f). (2) Recommend, including a basis thereof, a set of radionuclides that should be modeled in the Total System Performance Assessment in Support of the Site Recommendation (TSPA-SR) and the Total System Performance Assessment in Support of the Final Environmental Impact Statement (TSPA-FEIS). (3) Provide initial radionuclide inventories for the TSPA-SR and TSPA-FEIS models. (4) Answer the U.S. Nuclear Regulatory Commission (NRC)'s Issue Resolution Status Report ''Key Technical Issue: Container Life and Source Term'' (CLST IRSR) key technical issue (KTI): ''The rate at which radionuclides in SNF [spent nuclear fuel] are released from the EBS [engineered barrier system] through the oxidation and dissolution of spent fuel'' (NRC 1999, Subissue 3). The scope of the radionuclide screening analysis encompasses the period from 100 years to 10,000 years after the potential repository at Yucca Mountain is sealed for scenarios involving the breach of a waste package and subsequent degradation of the waste form as required for the TSPA-SR calculations. By extending the time period considered to one million years after repository closure, recommendations are made for the TSPA-FEIS. The waste forms included in the inventory abstraction are Commercial Spent Nuclear Fuel (CSNF), DOE Spent Nuclear Fuel (DSNF), High-Level Waste (HLW), naval Spent Nuclear Fuel (SNF), and U.S. Department of Energy (DOE) plutonium waste. The intended use of this analysis is in TSPA-SR and TSPA-FEIS. Based on the recommendations made here, models for release, transport, and possibly exposure will be developed for the isotopes that would be the highest contributors to the dose given a release

  17. flowClust: a Bioconductor package for automated gating of flow cytometry data

    Directory of Open Access Journals (Sweden)

    Lo Kenneth

    2009-05-01

    Full Text Available Abstract Background As a high-throughput technology that offers rapid quantification of multidimensional characteristics for millions of cells, flow cytometry (FCM is widely used in health research, medical diagnosis and treatment, and vaccine development. Nevertheless, there is an increasing concern about the lack of appropriate software tools to provide an automated analysis platform to parallelize the high-throughput data-generation platform. Currently, to a large extent, FCM data analysis relies on the manual selection of sequential regions in 2-D graphical projections to extract the cell populations of interest. This is a time-consuming task that ignores the high-dimensionality of FCM data. Results In view of the aforementioned issues, we have developed an R package called flowClust to automate FCM analysis. flowClust implements a robust model-based clustering approach based on multivariate t mixture models with the Box-Cox transformation. The package provides the functionality to identify cell populations whilst simultaneously handling the commonly encountered issues of outlier identification and data transformation. It offers various tools to summarize and visualize a wealth of features of the clustering results. In addition, to ensure its convenience of use, flowClust has been adapted for the current FCM data format, and integrated with existing Bioconductor packages dedicated to FCM analysis. Conclusion flowClust addresses the issue of a dearth of software that helps automate FCM analysis with a sound theoretical foundation. It tends to give reproducible results, and helps reduce the significant subjectivity and human time cost encountered in FCM analysis. The package contributes to the cytometry community by offering an efficient, automated analysis platform which facilitates the active, ongoing technological advancement.
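
    flowClust itself is an R/Bioconductor package built on multivariate t mixtures with a Box-Cox transform; the sketch below substitutes a Gaussian mixture on arcsinh-transformed synthetic data purely to illustrate model-based (rather than manual) gating, and should not be read as the package's algorithm.

      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(2)
      # Two synthetic cell populations in a 2-D fluorescence space.
      cells = np.vstack([rng.normal((200.0, 50.0), 30.0, (500, 2)),
                         rng.normal((800.0, 600.0), 60.0, (500, 2))])

      transformed = np.arcsinh(cells / 150.0)      # a common FCM transform
      gm = GaussianMixture(n_components=2, random_state=0)
      labels = gm.fit_predict(transformed)
      print("population sizes:", np.bincount(labels))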

  18. In-Package Chemistry Abstraction

    Energy Technology Data Exchange (ETDEWEB)

    P.S. Domski

    2003-07-21

    The work associated with the development of this model report was performed in accordance with the requirements established in ''Technical Work Plan for Waste Form Degradation Modeling, Testing, and Analyses in Support of SR and LA'' (BSC 2002a). The in-package chemistry model and in-package chemistry model abstraction are developed to predict the bulk chemistry inside of a failed waste package and to provide simplified expressions of that chemistry. The purpose of this work is to provide the abstraction model to the Performance Assessment Project and the Waste Form Department for development of geochemical models of the waste package interior. The scope of this model report is to describe the development and validation of the in-package chemistry model and in-package chemistry model abstraction. The in-package chemistry model will consider chemical interactions of water with the waste package materials and the waste form for commercial spent nuclear fuel (CSNF) and codisposed high-level waste glass (HLWG) and N Reactor spent fuel (CDNR). The in-package chemistry model includes two sub-models, the first a water vapor condensation (WVC) model, where water enters a waste package as vapor and forms a film on the waste package components with subsequent film reactions with the waste package materials and waste form--this is a no-flow model, the reacted fluids do not exit the waste package via advection. The second sub-model of the in-package chemistry model is the seepage dripping model (SDM), where water, water that may have seeped into the repository from the surrounding rock, enters a failed waste package and reacts with the waste package components and waste form, and then exits the waste package with no accumulation of reacted water in the waste package. Both of the submodels of the in-package chemistry model are film models in contrast to past in-package chemistry models where all of the waste package pore space was filled with water. The

  19. In-Package Chemistry Abstraction

    International Nuclear Information System (INIS)

    P.S. Domski

    2003-01-01

    The work associated with the development of this model report was performed in accordance with the requirements established in ''Technical Work Plan for Waste Form Degradation Modeling, Testing, and Analyses in Support of SR and LA'' (BSC 2002a). The in-package chemistry model and in-package chemistry model abstraction are developed to predict the bulk chemistry inside of a failed waste package and to provide simplified expressions of that chemistry. The purpose of this work is to provide the abstraction model to the Performance Assessment Project and the Waste Form Department for development of geochemical models of the waste package interior. The scope of this model report is to describe the development and validation of the in-package chemistry model and in-package chemistry model abstraction. The in-package chemistry model will consider chemical interactions of water with the waste package materials and the waste form for commercial spent nuclear fuel (CSNF) and codisposed high-level waste glass (HLWG) and N Reactor spent fuel (CDNR). The in-package chemistry model includes two sub-models, the first a water vapor condensation (WVC) model, where water enters a waste package as vapor and forms a film on the waste package components with subsequent film reactions with the waste package materials and waste form--this is a no-flow model, the reacted fluids do not exit the waste package via advection. The second sub-model of the in-package chemistry model is the seepage dripping model (SDM), where water, water that may have seeped into the repository from the surrounding rock, enters a failed waste package and reacts with the waste package components and waste form, and then exits the waste package with no accumulation of reacted water in the waste package. Both of the submodels of the in-package chemistry model are film models in contrast to past in-package chemistry models where all of the waste package pore space was filled with water. The current in

  20. Model-Based Control for Postal Automation and Baggage Handling

    NARCIS (Netherlands)

    Tarau, A.N.

    2010-01-01

    In this thesis we focus on two specific transportation systems, namely postal automation and baggage handling. Postal automation: During the last decades, the volume of magazines, catalogs, and other plastic-wrapped mail items that have to be processed by post sorting centers has increased

  1. Automated estimation of defects in magnetographic defectoscopy. 1. Automated magnetographic flaw detectors

    International Nuclear Information System (INIS)

    Mikhajlov, S.P.; Vaulin, S.L.; Shcherbinin, V.E.; Shur, M.L.

    1993-01-01

    Specific features and possible functions of equipment for the automated estimation of elongated continuity defects in samples with plane surfaces in magnetographic defectoscopy are discussed. Two models of automated magnetographic flaw detectors, one with a built-in microcomputer and one in the form of a computer attachment, are described. Directions for further research and development are discussed. 35 refs., 6 figs

  2. Abstraction of Drift-Scale Coupled Processes

    International Nuclear Information System (INIS)

    Francis, N.D.; Sassani, D.

    2000-01-01

    This Analysis/Model Report (AMR) describes an abstraction, for the performance assessment total system model, of the near-field host rock water chemistry and gas-phase composition. It also provides an abstracted process model analysis of potentially important differences in the thermal hydrologic (TH) variables used to describe the performance of a geologic repository obtained from models that include fully coupled reactive transport with thermal hydrology and those that include thermal hydrology alone. Specifically, the motivation of the process-level model comparison between fully coupled thermal-hydrologic-chemical (THC) and thermal-hydrologic-only (TH-only) is to provide the necessary justification as to why the in-drift thermodynamic environment and the near-field host rock percolation flux, the essential TH variables used to describe the performance of a geologic repository, can be obtained using a TH-only model and applied directly into a TSPA abstraction without recourse to a fully coupled reactive transport model. Abstraction as used in the context of this AMR refers to an extraction of essential data or information from the process-level model. The abstraction analysis reproduces and bounds the results of the underlying detailed process-level model. The primary purpose of this AMR is to abstract the results of the fully-coupled, THC model (CRWMS M andO 2000a) for effects on water and gas-phase composition adjacent to the drift wall (in the near-field host rock). It is assumed that drift wall fracture water and gas compositions may enter the emplacement drift before, during, and after the heating period. The heating period includes both the preclosure, in which the repository drifts are ventilated, and the postclosure periods, with backfill and drip shield emplacement at the time of repository closure. Although the preclosure period (50 years) is included in the process models, the postclosure performance assessment starts at the end of this initial period

  3. Kotai Antibody Builder: automated high-resolution structural modeling of antibodies.

    Science.gov (United States)

    Yamashita, Kazuo; Ikeda, Kazuyoshi; Amada, Karlou; Liang, Shide; Tsuchiya, Yuko; Nakamura, Haruki; Shirai, Hiroki; Standley, Daron M

    2014-11-15

    Kotai Antibody Builder is a Web service for tertiary structural modeling of antibody variable regions. It consists of three main steps: hybrid template selection by sequence alignment and canonical rules, 3D rendering of alignments and CDR-H3 loop modeling. For the last step, in addition to rule-based heuristics used to build the initial model, a refinement option is available that uses fragment assembly followed by knowledge-based scoring. Using targets from the Second Antibody Modeling Assessment, we demonstrate that Kotai Antibody Builder generates models with an overall accuracy equal to that of the best-performing semi-automated predictors using expert knowledge. Kotai Antibody Builder is available at http://kotaiab.org (contact: standley@ifrec.osaka-u.ac.jp).

  4. Use of noncrystallographic symmetry for automated model building at medium to low resolution

    International Nuclear Information System (INIS)

    Wiegels, Tim; Lamzin, Victor S.

    2012-01-01

    Noncrystallographic symmetry is automatically detected and used to achieve higher completeness and greater accuracy of automatically built protein structures at resolutions of 2.3 Å or poorer. A novel method is presented for the automatic detection of noncrystallographic symmetry (NCS) in macromolecular crystal structure determination which does not require the derivation of molecular masks or the segmentation of density. It was found that throughout structure determination the NCS-related parts may be differently pronounced in the electron density. This often results in the modelling of molecular fragments of variable length and accuracy, especially during automated model-building procedures. These fragments were used to identify NCS relations in order to aid automated model building and refinement. In a number of test cases higher completeness and greater accuracy of the obtained structures were achieved, specifically at a crystallographic resolution of 2.3 Å or poorer. In the best case, the method allowed the building of up to 15% more residues automatically and a tripling of the average length of the built fragments
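
    The geometric test underlying such fragment comparisons is classical: superpose one fragment onto another with the Kabsch algorithm and treat a low RMSD as evidence of an NCS relation. The stand-alone sketch below shows only that test, under our assumptions about the fragments involved; it is not the published procedure.

      import numpy as np

      def kabsch_rmsd(P, Q):
          # Optimal-rotation RMSD between two centred N x 3 coordinate sets.
          P = P - P.mean(axis=0)
          Q = Q - Q.mean(axis=0)
          U, S, Vt = np.linalg.svd(P.T @ Q)
          d = np.sign(np.linalg.det(Vt.T @ U.T))
          R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
          return np.sqrt(np.mean(np.sum((P @ R.T - Q) ** 2, axis=1)))

      frag_a = np.random.default_rng(3).normal(size=(20, 3))
      angle = np.pi / 3
      rot = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                      [np.sin(angle),  np.cos(angle), 0.0],
                      [0.0, 0.0, 1.0]])
      frag_b = frag_a @ rot.T + 5.0                # an exact NCS copy, shifted
      print(f"RMSD = {kabsch_rmsd(frag_a, frag_b):.2e}")   # ~0 => NCS candidate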

  5. Automated home cage assessment shows behavioral changes in a transgenic mouse model of spinocerebellar ataxia type 17.

    Science.gov (United States)

    Portal, Esteban; Riess, Olaf; Nguyen, Huu Phuc

    2013-08-01

    Spinocerebellar Ataxia type 17 (SCA17) is an autosomal dominantly inherited, neurodegenerative disease characterized by ataxia, involuntary movements, and dementia. A novel SCA17 mouse model carrying a 71-polyglutamine repeat expansion in the TATA-binding protein (TBP) has shown an age-related motor deficit in a classic motor test, yet the concomitant weight increase might be a confounding factor for this measurement. In this study we used an automated home cage system to test several motor readouts for this same model to confirm pathological behavior results and evaluate the benefits of the automated home cage in behavior phenotyping. Our results confirm motor deficits in the Tbp/Q71 mice and present previously unrecognized behavioral characteristics obtained from the automated home cage, indicating its use for high-throughput screening and testing, e.g. of therapeutic compounds.

  6. Automated security management

    CERN Document Server

    Al-Shaer, Ehab; Xie, Geoffrey

    2013-01-01

    In this contributed volume, leading international researchers explore configuration modeling and checking, vulnerability and risk assessment, configuration analysis, and diagnostics and discovery. The authors equip readers to understand automated security management systems and techniques that increase overall network assurability and usability. These constantly changing networks defend against cyber attacks by integrating hundreds of security devices such as firewalls, IPSec gateways, IDS/IPS, authentication servers, authorization/RBAC servers, and crypto systems. Automated Security Managemen

  7. EBS Radionuclide Transport Abstraction

    International Nuclear Information System (INIS)

    J.D. Schreiber

    2005-01-01

    The purpose of this report is to develop and analyze the engineered barrier system (EBS) radionuclide transport abstraction model, consistent with Level I and Level II model validation, as identified in ''Technical Work Plan for: Near-Field Environment and Transport: Engineered Barrier System: Radionuclide Transport Abstraction Model Report Integration'' (BSC 2005 [DIRS 173617]). The EBS radionuclide transport abstraction (or EBS RT Abstraction) is the conceptual model used in the total system performance assessment for the license application (TSPA-LA) to determine the rate of radionuclide releases from the EBS to the unsaturated zone (UZ). The EBS RT Abstraction conceptual model consists of two main components: a flow model and a transport model. Both models are developed mathematically from first principles in order to show explicitly what assumptions, simplifications, and approximations are incorporated into the models used in the TSPA-LA. The flow model defines the pathways for water flow in the EBS and specifies how the flow rate is computed in each pathway. Input to this model includes the seepage flux into a drift. The seepage flux is potentially split by the drip shield, with some (or all) of the flux being diverted by the drip shield and some passing through breaches in the drip shield that might result from corrosion or seismic damage. The flux through drip shield breaches is potentially split by the waste package, with some (or all) of the flux being diverted by the waste package and some passing through waste package breaches that might result from corrosion or seismic damage. Neither the drip shield nor the waste package survives an igneous intrusion, so the flux splitting submodel is not used in the igneous scenario class. The flow model is validated in an independent model validation technical review. The drip shield and waste package flux splitting algorithms are developed and validated using experimental data. The transport model considers

  8. EBS Radionuclide Transport Abstraction

    Energy Technology Data Exchange (ETDEWEB)

    J.D. Schreiber

    2005-08-25

    The purpose of this report is to develop and analyze the engineered barrier system (EBS) radionuclide transport abstraction model, consistent with Level I and Level II model validation, as identified in ''Technical Work Plan for: Near-Field Environment and Transport: Engineered Barrier System: Radionuclide Transport Abstraction Model Report Integration'' (BSC 2005 [DIRS 173617]). The EBS radionuclide transport abstraction (or EBS RT Abstraction) is the conceptual model used in the total system performance assessment for the license application (TSPA-LA) to determine the rate of radionuclide releases from the EBS to the unsaturated zone (UZ). The EBS RT Abstraction conceptual model consists of two main components: a flow model and a transport model. Both models are developed mathematically from first principles in order to show explicitly what assumptions, simplifications, and approximations are incorporated into the models used in the TSPA-LA. The flow model defines the pathways for water flow in the EBS and specifies how the flow rate is computed in each pathway. Input to this model includes the seepage flux into a drift. The seepage flux is potentially split by the drip shield, with some (or all) of the flux being diverted by the drip shield and some passing through breaches in the drip shield that might result from corrosion or seismic damage. The flux through drip shield breaches is potentially split by the waste package, with some (or all) of the flux being diverted by the waste package and some passing through waste package breaches that might result from corrosion or seismic damage. Neither the drip shield nor the waste package survives an igneous intrusion, so the flux splitting submodel is not used in the igneous scenario class. The flow model is validated in an independent model validation technical review. The drip shield and waste package flux splitting algorithms are developed and validated using experimental data. The transport

  9. EBS Radionuclide Transport Abstraction

    Energy Technology Data Exchange (ETDEWEB)

    J. Prouty

    2006-07-14

    The purpose of this report is to develop and analyze the engineered barrier system (EBS) radionuclide transport abstraction model, consistent with Level I and Level II model validation, as identified in Technical Work Plan for: Near-Field Environment and Transport: Engineered Barrier System: Radionuclide Transport Abstraction Model Report Integration (BSC 2005 [DIRS 173617]). The EBS radionuclide transport abstraction (or EBS RT Abstraction) is the conceptual model used in the total system performance assessment (TSPA) to determine the rate of radionuclide releases from the EBS to the unsaturated zone (UZ). The EBS RT Abstraction conceptual model consists of two main components: a flow model and a transport model. Both models are developed mathematically from first principles in order to show explicitly what assumptions, simplifications, and approximations are incorporated into the models used in the TSPA. The flow model defines the pathways for water flow in the EBS and specifies how the flow rate is computed in each pathway. Input to this model includes the seepage flux into a drift. The seepage flux is potentially split by the drip shield, with some (or all) of the flux being diverted by the drip shield and some passing through breaches in the drip shield that might result from corrosion or seismic damage. The flux through drip shield breaches is potentially split by the waste package, with some (or all) of the flux being diverted by the waste package and some passing through waste package breaches that might result from corrosion or seismic damage. Neither the drip shield nor the waste package survives an igneous intrusion, so the flux splitting submodel is not used in the igneous scenario class. The flow model is validated in an independent model validation technical review. The drip shield and waste package flux splitting algorithms are developed and validated using experimental data. The transport model considers advective transport and diffusive transport

  10. EBS Radionuclide Transport Abstraction

    International Nuclear Information System (INIS)

    J. Prouty

    2006-01-01

    The purpose of this report is to develop and analyze the engineered barrier system (EBS) radionuclide transport abstraction model, consistent with Level I and Level II model validation, as identified in Technical Work Plan for: Near-Field Environment and Transport: Engineered Barrier System: Radionuclide Transport Abstraction Model Report Integration (BSC 2005 [DIRS 173617]). The EBS radionuclide transport abstraction (or EBS RT Abstraction) is the conceptual model used in the total system performance assessment (TSPA) to determine the rate of radionuclide releases from the EBS to the unsaturated zone (UZ). The EBS RT Abstraction conceptual model consists of two main components: a flow model and a transport model. Both models are developed mathematically from first principles in order to show explicitly what assumptions, simplifications, and approximations are incorporated into the models used in the TSPA. The flow model defines the pathways for water flow in the EBS and specifies how the flow rate is computed in each pathway. Input to this model includes the seepage flux into a drift. The seepage flux is potentially split by the drip shield, with some (or all) of the flux being diverted by the drip shield and some passing through breaches in the drip shield that might result from corrosion or seismic damage. The flux through drip shield breaches is potentially split by the waste package, with some (or all) of the flux being diverted by the waste package and some passing through waste package breaches that might result from corrosion or seismic damage. Neither the drip shield nor the waste package survives an igneous intrusion, so the flux splitting submodel is not used in the igneous scenario class. The flow model is validated in an independent model validation technical review. The drip shield and waste package flux splitting algorithms are developed and validated using experimental data. The transport model considers advective transport and diffusive transport
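
    A toy sketch of the flux-splitting idea described in these reports: seepage entering the drift is partitioned at the drip shield and again at the waste package in proportion to breached-area fractions. The numbers and the simple proportional rule are illustrative assumptions, not the validated TSPA algorithms.

      seepage_flux = 1.0     # m3/yr entering the drift (hypothetical)
      f_ds_breach = 0.05     # fraction passing through drip shield breaches
      f_wp_breach = 0.10     # fraction passing through waste package breaches

      through_ds = seepage_flux * f_ds_breach
      diverted_ds = seepage_flux - through_ds
      through_wp = through_ds * f_wp_breach
      diverted_wp = through_ds - through_wp

      print(f"diverted by drip shield:   {diverted_ds:.3f} m3/yr")
      print(f"diverted by waste package: {diverted_wp:.4f} m3/yr")
      print(f"reaching the waste form:   {through_wp:.4f} m3/yr")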

  11. Efficient abstraction selection in reinforcement learning

    NARCIS (Netherlands)

    Seijen, H. van; Whiteson, S.; Kester, L.

    2013-01-01

    This paper introduces a novel approach for abstraction selection in reinforcement learning problems modelled as factored Markov decision processes (MDPs), for which a state is described via a set of state components. In abstraction selection, an agent must choose an abstraction from a set of

  12. Automata Learning through Counterexample Guided Abstraction Refinement

    DEFF Research Database (Denmark)

    Aarts, Fides; Heidarian, Faranak; Kuppens, Harco

    2012-01-01

    Abstraction is the key when learning behavioral models of realistic systems. Hence, in most practical applications where automata learning is used to construct models of software components, researchers manually define abstractions which, depending on the history, map a large set of concrete events to a small set of abstract events that can be handled by automata learning tools. In this article, we show how such abstractions can be constructed fully automatically for a restricted class of extended finite state machines in which one can test for equality of data parameters, but no operations on data are allowed. Our approach uses counterexample-guided abstraction refinement: whenever the current abstraction is too coarse and induces nondeterministic behavior, the abstraction is refined automatically. Using Tomte, a prototype tool implementing our algorithm, we have succeeded to learn – fully...
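
    The refinement loop can be miniaturised as follows: concrete integer inputs are partitioned into abstract classes, and a class is split whenever two of its members provoke different outputs (observed nondeterminism). This toy example is our own construction, not the Tomte tool.

      def system_under_test(x):
          return "big" if x > 10 else "small"      # hidden concrete behaviour

      boundaries = []                              # current partition of the ints

      def abstract(x):
          return sum(x > b for b in boundaries)    # abstract class index

      observed = {}
      for x in [3, 7, 12, 15, 4, 11]:
          cls, out = abstract(x), system_under_test(x)
          if cls in observed and observed[cls][1] != out:
              # Counterexample: same abstract event, different outputs.
              boundaries.append((x + observed[cls][0]) // 2)
              boundaries.sort()
              observed = {}                        # restart learning
          else:
              observed[cls] = (x, out)
      print("refined boundaries:", boundaries)     # converges near the true 10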

  13. Modeling nurses' attitude toward using automated unit-based medication storage and distribution systems: an extension of the technology acceptance model.

    Science.gov (United States)

    Escobar-Rodríguez, Tomás; Romero-Alonso, María Mercedes

    2013-05-01

    This article analyzes the attitude of nurses toward the use of automated unit-based medication storage and distribution systems and identifies influencing factors. Understanding these factors provides an opportunity to explore actions that might be taken to boost adoption by potential users. The theoretical grounding for this research is the Technology Acceptance Model. The Technology Acceptance Model specifies the causal relationships between perceived usefulness, perceived ease of use, attitude toward using, and actual usage behavior. The research model has six constructs, and nine hypotheses were generated from connections between these six constructs. These constructs include perceived risks, experience level, and training. The findings indicate that these three external variables are related to the perceived ease of use and perceived usefulness of automated unit-based medication storage and distribution systems, and therefore, they have a significant influence on attitude toward the use of these systems.
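
    As a schematic of how TAM path relationships are typically quantified, the sketch below estimates regression paths from perceived usefulness and perceived ease of use onto attitude using synthetic survey scores. The coefficients and data are invented for illustration and do not reproduce the study's six-construct model.

      import numpy as np

      rng = np.random.default_rng(4)
      n = 200
      ease = rng.normal(4.0, 1.0, n)                   # perceived ease of use
      useful = 0.5 * ease + rng.normal(2.0, 0.8, n)    # perceived usefulness
      attitude = 0.4 * useful + 0.3 * ease + rng.normal(0.0, 0.5, n)

      X = np.column_stack([np.ones(n), useful, ease])
      beta, *_ = np.linalg.lstsq(X, attitude, rcond=None)
      print("intercept, usefulness path, ease path:", np.round(beta, 2))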

  14. Human-centred automation: an explorative study

    International Nuclear Information System (INIS)

    Hollnagel, Erik; Miberg, Ann Britt

    1999-05-01

    The purpose of the programme activity on human-centred automation at the HRP is to develop knowledge (in the form of models and theories) and tools (in the form of techniques and simulators) to support the design of automation that ensures effective human performance and comprehension. This report presents the work done on both the analytical and the experimental side of this project. The analytical work has surveyed common definitions of automation and traditional design principles. A general finding is that human-centred automation is usually defined in terms of what it is not, partly due to a lack of adequate models of human-automation interaction. Another result is a clarification of the consequences of automation, in particular with regard to situation awareness and workload. The experimental work has taken place as an explorative experiment in HAMMLAB in collaboration with IPSN (France). The purpose of this experiment was to increase the understanding of how automation influences operator performance in NPP control rooms. Two different types of automation (extensive and limited) were considered in scenarios having two different degrees of complexity (high and low), and involving diagnostic and procedural tasks. Six licensed NPP crews from the NPP at Loviisa, Finland, participated in the experiment. The dependent variables applied were plant performance, operator performance, self-rated crew performance, situation awareness, workload, and operator trust in the automation. The results from the diagnostic scenarios indicated that operators' judgement of crew efficiency was related to their level of trust in the automation, and further that operators trusted the automation least and rated crew performance lowest in situations where crew performance was efficient, and vice versa. The results from the procedural scenarios indicated that extensive automation efficiently supported operators' performance, and further that operators' judgement of crew performance efficiency

  15. A Voyage to Arcturus: A model for automated management of a WLCG Tier-2 facility

    International Nuclear Information System (INIS)

    Roy, Gareth; Crooks, David; Mertens, Lena; Mitchell, Mark; Skipsey, Samuel Cadellin; Britton, David; Purdie, Stuart

    2014-01-01

    With the current trend towards 'On Demand Computing' in big data environments, it is crucial that the deployment of services and resources becomes increasingly automated. Deployment based on cloud platforms is available for large-scale data centre environments, but these solutions can be too complex and heavyweight for smaller, resource-constrained WLCG Tier-2 sites. Along with a growing need for bespoke monitoring and collection of Grid-related metrics, a more lightweight and modular approach is desired. In this paper we present a model for a lightweight automated framework which can be used to build WLCG grid sites, based on 'off the shelf' software components. As part of the research into an automation framework, the use of both IPMI and SNMP for physical device management will be included, as well as the use of SNMP as a monitoring/data-sampling layer such that more comprehensive decision making can take place and potentially be automated. This could lead to reduced downtimes and better performance as services are recognised to be in a non-functional state by autonomous systems.

  16. Understanding human management of automation errors

    Science.gov (United States)

    McBride, Sara E.; Rogers, Wendy A.; Fisk, Arthur D.

    2013-01-01

    Automation has the potential to aid humans with a diverse set of tasks and support overall system performance. Automated systems are not always reliable, and when automation errs, humans must engage in error management, which is the process of detecting, understanding, and correcting errors. However, this process of error management in the context of human-automation interaction is not well understood. Therefore, we conducted a systematic review of the variables that contribute to error management. We examined relevant research in human-automation interaction and human error to identify critical automation, person, task, and emergent variables. We propose a framework for management of automation errors to incorporate and build upon previous models. Further, our analysis highlights variables that may be addressed through design and training to positively influence error management. Additional efforts to understand the error management process will contribute to automation designed and implemented to support safe and effective system performance. PMID:25383042

  17. SCEE 2008 book of abstracts. The 7. international conference on scientific computing in electrical engineering (SCEE 2008)

    Energy Technology Data Exchange (ETDEWEB)

    Roos, J.; Costa, L.R.J. (ed.)

    2008-09-15

    SCEE is an international conference series dedicated to Scientific Computing in Electrical Engineering. The 7th International Conference on Scientific Computing in Electrical Engineering (SCEE 2008) in Espoo, Finland, is organized by the Helsinki University of Technology (TKK); Faculty of Electronics, Communications and Automation (ECA); Department of Radio Science and Engineering (RAD); Circuit Theory Group. (SCEE 2008 web site: http://www.ct.tkk.fi/scee2008/). The aim of the SCEE 2008 conference is to bring together scientists from academia and industry with the goal of intensive discussions on modeling and numerical simulation of electronic circuits and of electromagnetic fields. The conference is mainly directed towards mathematicians and electrical engineers. The SCEE 2008 conference has the following four main topics: 1. Computational Electromagnetics (CE), 2. Circuit Simulation (CS), 3. Coupled Problems (CP), 4. Mathematical and Computational Methods (CM). The selection of abstracts in this book was carried out by the Program Committee; each abstract was reviewed by two or three reviewers. The authors of all accepted abstracts were invited to submit an extended full paper, which will be reviewed as well. The accepted full papers will later be published in a separate post-conference book

  18. A Model of Process-Based Automation: Cost and Quality Implications in the Medication Management Process

    Science.gov (United States)

    Spaulding, Trent Joseph

    2011-01-01

    The objective of this research is to understand how a set of systems, as defined by the business process, creates value. The three studies contained in this work develop the model of process-based automation. The model states that complementarities among systems are specified by handoffs in the business process. The model also provides theory to…

  19. Automated Bayesian model development for frequency detection in biological time series

    Directory of Open Access Journals (Sweden)

    Oldroyd Giles ED

    2011-06-01

    Background: A first step in building a mathematical model of a biological system is often the analysis of the temporal behaviour of key quantities. Mathematical relationships between the time and frequency domain, such as Fourier Transforms and wavelets, are commonly used to extract information about the underlying signal from a given time series. This one-to-one mapping from time points to frequencies inherently assumes that both domains contain the complete knowledge of the system. However, for truncated, noisy time series with background trends this unique mapping breaks down and the question reduces to an inference problem of identifying the most probable frequencies. Results: In this paper we build on the method of Bayesian Spectrum Analysis and demonstrate its advantages over conventional methods by applying it to a number of test cases, including two types of biological time series. Firstly, oscillations of calcium in plant root cells in response to microbial symbionts are non-stationary and noisy, posing challenges to data analysis. Secondly, circadian rhythms in gene expression measured over only two cycles highlight the problem of time series with limited length. The results show that the Bayesian frequency detection approach can provide useful results in specific areas where Fourier analysis can be uninformative or misleading. We demonstrate further benefits of the Bayesian approach for time series analysis, such as direct comparison of different hypotheses, inherent estimation of noise levels and parameter precision, and a flexible framework for modelling the data without pre-processing. Conclusions: Modelling in systems biology often builds on the study of time-dependent phenomena. Fourier Transforms are a convenient tool for analysing the frequency domain of time series. However, there are well-known limitations of this method, such as the introduction of spurious frequencies when handling short and noisy time series, and
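
    For readers unfamiliar with the technique, here is a compact numpy sketch of single-frequency Bayesian spectrum analysis in the Bretthorst style the paper builds on; the test signal, frequency grid, and clipping guard are illustrative choices, not the authors' implementation.

    ```python
    # Single-frequency Bayesian spectrum analysis sketch: the posterior over
    # frequency has a Student-t form built from the Schuster periodogram C(f).
    import numpy as np

    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 10.0, 200)                    # short, noisy series
    d = np.sin(2 * np.pi * 1.3 * t) + 0.8 * rng.normal(size=t.size)
    d -= d.mean()                                      # remove constant trend
    N, d2bar = d.size, np.mean(d ** 2)

    freqs = np.linspace(0.1, 5.0, 2000)
    log_post = np.empty_like(freqs)
    for i, f in enumerate(freqs):
        w = 2 * np.pi * f
        R, I = np.sum(d * np.cos(w * t)), np.sum(d * np.sin(w * t))
        C = (R ** 2 + I ** 2) / N                      # Schuster periodogram
        ratio = min(2 * C / (N * d2bar), 0.999)        # numerical guard
        log_post[i] = (2 - N) / 2 * np.log1p(-ratio)   # Bretthorst-style posterior

    post = np.exp(log_post - log_post.max())           # unnormalised posterior
    print("most probable frequency:", freqs[np.argmax(post)])  # ~1.3 here
    ```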

  20. FESetup: Automating Setup for Alchemical Free Energy Simulations.

    Science.gov (United States)

    Loeffler, Hannes H; Michel, Julien; Woods, Christopher

    2015-12-28

    FESetup is a new pipeline tool which can be used flexibly within larger workflows. The tool aims to support fast and easy setup of alchemical free energy simulations for molecular simulation packages such as AMBER, GROMACS, Sire, or NAMD. Post-processing methods like MM-PBSA and LIE can be set up as well. Ligands are automatically parametrized with AM1-BCC, and atom mappings for a single topology description are computed with a maximum common substructure search (MCSS) algorithm. An abstract molecular dynamics (MD) engine can be used for equilibration prior to free energy setup or standalone. Currently, all modern AMBER force fields are supported. Ease of use, robustness of the code, and automation where it is feasible are the main development goals. The project follows an open development model, and we welcome contributions.
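
    The atom-mapping step mentioned above can be illustrated with RDKit's MCS facilities; this is a sketch in the spirit of FESetup's single-topology mapping, not its actual code, and the example ligands are invented.

    ```python
    # Maximum common substructure (MCSS) atom mapping between two ligands.
    from rdkit import Chem
    from rdkit.Chem import rdFMCS

    lig_a = Chem.MolFromSmiles("c1ccccc1O")   # phenol (toy ligand A)
    lig_b = Chem.MolFromSmiles("c1ccccc1N")   # aniline (toy ligand B)

    mcs = rdFMCS.FindMCS([lig_a, lig_b])      # MCSS search
    core = Chem.MolFromSmarts(mcs.smartsString)

    # Matching atom indices in each ligand define a single-topology mapping.
    map_a = lig_a.GetSubstructMatch(core)
    map_b = lig_b.GetSubstructMatch(core)
    print(mcs.numAtoms, "common atoms; A->B map:", dict(zip(map_a, map_b)))
    ```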

  1. Model for spatial synthesis of automated control system of the GCR type reactor

    Energy Technology Data Exchange (ETDEWEB)

    Lazarevic, B; Matausek, M [Institut za nuklearne nauke ' Boris Kidric' , Vinca, Belgrade (Yugoslavia)

    1966-07-01

    This paper describes a model developed for the synthesis of the spatial distribution of automated control elements in the reactor. It represents a general, reliable mathematical model for analyzing transient states and for the synthesis of the automated control and regulation systems of GCR-type reactors. A one-dimensional system was defined under the assumption that the time dependence of the parameters of the neutron diffusion equation is identical throughout the total volume of the reactor and that the spatial distribution of neutrons is time independent. It is shown that this assumption is satisfactory for the short-term variations that are relevant to safety analysis.

  2. Automation of the Jarrell--Ash model 70-314 emission spectrometer

    International Nuclear Information System (INIS)

    Morris, W.F.; Fisher, E.R.; Taber, L.

    1978-01-01

    Automation of the Jarrell-Ash 3.4-Meter Ebert direct-reading emission spectrometer with digital scaler readout is described. The readout is interfaced to a Data General NOVA 840 minicomputer. The automation code consists of BASIC language programs for interactive routines, data processing, and report generation. Call statements within the BASIC programs invoke assembly language routines for real-time data acquisition and control. In addition, the automation objectives as well as the spectrometer-computer system functions, coding, and operating instructions are presented

  3. Component-based modeling of systems for automated fault tree generation

    International Nuclear Information System (INIS)

    Majdara, Aref; Wakabayashi, Toshio

    2009-01-01

    One of the challenges in the field of automated fault tree construction is to find an efficient modeling approach that can support modeling of different types of systems without ignoring any necessary details. In this paper, we present a new system modeling approach for computer-aided fault tree generation. In this method, every system model is composed of a set of components and different types of flows propagating through them. Each component has a function table that describes its input-output relations. For components having different operational states, there is also a state transition table. Each component can communicate with other components in the system only through its inputs and outputs. A trace-back algorithm is proposed that can be applied to the system model to generate the required fault trees. The system modeling approach and the fault tree construction algorithm are applied to a fire sprinkler system and the results are presented
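
    A toy rendering of the trace-back idea, assuming invented function tables for a miniature sprinkler system; the paper's tables and algorithm are richer (flows, state transitions, AND logic) than this OR-only sketch.

    ```python
    # Each entry maps an output deviation to (component, input deviation) causes;
    # trace-back expands causes recursively into a small OR fault tree.
    function_tables = {
        "no_water_at_sprinkler": [("valve", "valve_stuck_closed"),
                                  ("pipe", "no_water_at_valve")],
        "no_water_at_valve":     [("pump", "pump_failed"),
                                  ("pump", "no_power_to_pump")],
    }

    def trace_back(event: str, depth: int = 0) -> None:
        """Print an indented OR-tree of causes; untabled events are basic events."""
        print("  " * depth + event)
        for component, cause in function_tables.get(event, []):
            print("  " * (depth + 1) + f"[{component}]")
            trace_back(cause, depth + 2)

    trace_back("no_water_at_sprinkler")  # top event of the toy fault tree
    ```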

  4. Future Control and Automation : Proceedings of the 2nd International Conference on Future Control and Automation

    CERN Document Server

    2012-01-01

    This volume, Future Control and Automation - Volume 2, includes the best papers from the 2012 2nd International Conference on Future Control and Automation (ICFCA 2012), held on July 1-2, 2012, in Changsha, China. Future control and automation is the use of control systems and information technologies to reduce the need for human work in the production of goods and services. This volume is divided into six sessions on the basis of the classification of the manuscripts considered: Mathematical Modeling; Analysis and Computation; Control Engineering; Reliable Networks Design; Vehicular Communications and Networking; Automation and Mechatronics.

  5. The Employment-Impact of Automation in Canada

    OpenAIRE

    McLean, Colin Alexander

    2015-01-01

    Standard neoclassical models of labour demand predict that automation does not produce long-term increases in unemployment. Supporting evidence in Canada between 1970 and 2008 is explained by the reallocation of labour from industries with high levels of automation, such as Manufacturing, to industries with low levels of automation, such as Retail and Wholesale Trade and Business Services. Recent evidence indicates, however, that ongoing technological advances are now driving labour automation i...

  6. Semi-automated extraction of longitudinal subglacial bedforms from digital terrain models - Two new methods

    Science.gov (United States)

    Jorge, Marco G.; Brennand, Tracy A.

    2017-07-01

    Relict drumlin and mega-scale glacial lineation (positive relief, longitudinal subglacial bedforms - LSBs) morphometry has been used as a proxy for paleo ice-sheet dynamics. LSB morphometric inventories have relied on manual mapping, which is slow and subjective and thus potentially difficult to reproduce. Automated methods are faster and reproducible, but previous methods for LSB semi-automated mapping have not been highly successful. Here, two new object-based methods for the semi-automated extraction of LSBs (footprints) from digital terrain models are compared in a test area in the Puget Lowland, Washington, USA. As segmentation procedures to create LSB-candidate objects, the normalized closed contour method relies on the contouring of a normalized local relief model addressing LSBs on slopes, and the landform elements mask method relies on the classification of landform elements derived from the digital terrain model. For identifying which LSB-candidate objects correspond to LSBs, both methods use the same LSB operational definition: a ruleset encapsulating expert knowledge, published morphometric data, and the morphometric range of LSBs in the study area. The normalized closed contour method was separately applied to four different local relief models, two computed in moving windows and two hydrology-based. Overall, the normalized closed contour method outperformed the landform elements mask method. The normalized closed contour method applied to a hydrological relief model derived from a multiple-direction flow-routing algorithm performed best. For an assessment of its transferability, the normalized closed contour method was evaluated on a second area, the Chautauqua drumlin field, Pennsylvania and New York, USA, where it performed better than in the Puget Lowland. A broad comparison to previous methods suggests that the normalized closed contour method may be the most capable method to date, but more development is required.
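
    A rough sketch of the local-relief-plus-ruleset idea; the window size, relief threshold, morphometric bounds, and toy surface below are invented stand-ins for the paper's normalized local relief models and LSB operational definition.

    ```python
    # Segment LSB-candidate objects from a DTM via a local relief model, then
    # keep regions passing a crude elongation/length ruleset.
    import numpy as np
    from scipy.ndimage import uniform_filter, label, find_objects

    def lsb_candidates(dtm, window=51, relief_thresh=2.0):
        relief = dtm - uniform_filter(dtm, size=window)   # local relief model
        labels, _ = label(relief > relief_thresh)         # positive-relief objects
        keep = []
        for i, sl in enumerate(find_objects(labels), start=1):
            h, w = sl[0].stop - sl[0].start, sl[1].stop - sl[1].start
            if max(h, w) / max(1, min(h, w)) > 2.0 and max(h, w) > 20:
                keep.append(i)                            # elongated, long enough
        return labels, keep

    dtm = np.random.default_rng(1).normal(0, 1, (500, 500)).cumsum(axis=1)
    labels, keep = lsb_candidates(dtm)
    print(len(keep), "LSB candidates in the toy surface")
    ```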

  7. Building control. Technical building systems: Automation and management; Building Control. Technische Gebaeudesysteme: Automation und Bewirtschaftung

    Energy Technology Data Exchange (ETDEWEB)

    Kranz, H.R.; Baenninger, M.; Bieler, P.; Brettschneider, J.P.; Damnig, A.; Fassbender, H.W.; Friedrichs, K.; Gauchel, J.; Hegewald, B.; Kaelin, W.; Lezius, A.; Markert, H.; Oehler, A.; Otto, J.; Puettmer, M. Jr.; Rohrbacher, H.; Schuerdt, K.; Vogt, D.; Wittling, J.

    1995-12-31

    Cost-optimised management, operation, and maintenance of buildings can no longer be carried out without electronic data processing. The present anthology gives a comprehensive overview of the planning and operation of building automation systems. The following topics are discussed: ecological cooling and facade concepts; facility management; hazard alarm technology; building automation; communication technology, open communication and networks; building system technology and the installation bus; energy management; operator experience; norms and directives; building law. A special abstract has been prepared for each of the 23 chapters. (BWI). 260 figs., 161 refs.

  8. Automated procedures for sizing aerospace vehicle structures /SAVES/

    Science.gov (United States)

    Giles, G. L.; Blackburn, C. L.; Dixon, S. C.

    1972-01-01

    Results from a continuing effort to develop automated methods for structural design are described. A system of computer programs presently under development, called SAVES, is intended to automate the preliminary structural design of a complete aerospace vehicle. Each step in the automated design process of the SAVES system of programs is discussed, with emphasis placed on the use of automated routines for generation of finite-element models. The versatility of these routines is demonstrated by structural models generated for a space shuttle orbiter, an advanced technology transport, and a hydrogen-fueled Mach 3 transport. Illustrative numerical results are presented for the Mach 3 transport wing.

  9. User-Extensible Graphics Using Abstract Structure,

    Science.gov (United States)

    1987-08-01

    Contents fragments: the Algol68 model of the graphical abstract structure; the creation of a PictureDefinition; the making of a picture from a PictureDefinition. An abstract structure is defined as data together with the operations that can be performed on that data.

  10. Technical Work Plan for: Near Field Environment: Engineered Barrier System: Radionuclide Transport Abstraction Model Report

    International Nuclear Information System (INIS)

    J.D. Schreiber

    2006-01-01

    This technical work plan (TWP) describes work activities to be performed by the Near-Field Environment Team. The objective of the work scope covered by this TWP is to generate Revision 03 of EBS Radionuclide Transport Abstraction, referred to herein as the radionuclide transport abstraction (RTA) report. The RTA report is being revised primarily to address condition reports (CRs), to address issues identified by the Independent Validation Review Team (IVRT), to address the potential impact of transport, aging, and disposal (TAD) canister design on transport models, and to ensure integration with other models that are closely associated with the RTA report and being developed or revised in other analysis/model reports in response to IVRT comments. The RTA report will be developed in accordance with the most current version of LP-SIII.10Q-BSC and will reflect current administrative procedures (LP-3.15Q-BSC, ''Managing Technical Product Inputs''; LP-SIII.2Q-BSC, ''Qualification of Unqualified Data''; etc.), and will develop related Document Input Reference System (DIRS) reports and data qualifications as applicable in accordance with prevailing procedures. The RTA report consists of three models: the engineered barrier system (EBS) flow model, the EBS transport model, and the EBS-unsaturated zone (UZ) interface model. The flux-splitting submodel in the EBS flow model will change, so the EBS flow model will be validated again. The EBS transport model and validation of the model will be substantially revised in Revision 03 of the RTA report, which is the main subject of this TWP. The EBS-UZ interface model may be changed in Revision 03 of the RTA report due to changes in the conceptualization of the UZ transport abstraction model (a particle tracker transport model based on the discrete fracture transfer function will be used instead of the dual-continuum transport model previously used). Validation of the EBS-UZ interface model will be revised to be consistent with

  11. Model-Based Security Testing

    Directory of Open Access Journals (Sweden)

    Ina Schieferdecker

    2012-02-01

    Security testing aims at validating software system requirements related to security properties like confidentiality, integrity, authentication, authorization, availability, and non-repudiation. Although security testing techniques have been available for many years, there have been few approaches that allow for the specification of test cases at a higher level of abstraction, for enabling guidance on test identification and specification, as well as for automated test generation. Model-based security testing (MBST) is a relatively new field especially dedicated to the systematic and efficient specification and documentation of security test objectives, security test cases and test suites, as well as to their automated or semi-automated generation. In particular, the combination of security modelling and test generation approaches is still a challenge in research and of high interest for industrial applications. MBST includes, e.g., security functional testing, model-based fuzzing, risk- and threat-oriented testing, and the usage of security test patterns. This paper provides a survey on MBST techniques and the related models, as well as samples of new methods and tools that are under development in the European ITEA2 project DIAMONDS.

  12. Extensible automated dispersive liquid–liquid microextraction

    Energy Technology Data Exchange (ETDEWEB)

    Li, Songqing; Hu, Lu; Chen, Ketao; Gao, Haixiang, E-mail: hxgao@cau.edu.cn

    2015-05-04

    Highlights: • An extensible automated dispersive liquid–liquid microextraction was developed. • A fully automatic SPE workstation with a modified operation program was used. • Ionic liquid-based in situ DLLME was used as the model method. • SPE columns packed with nonwoven polypropylene fiber were used for phase separation. • The approach was applied to the determination of benzoylurea insecticides in water. - Abstract: In this study, a convenient and extensible automated ionic liquid-based in situ dispersive liquid–liquid microextraction (automated IL-based in situ DLLME) was developed. 1-Octyl-3-methylimidazolium bis[(trifluoromethane)sulfonyl]imide ([C8MIM]NTf2) is formed through the reaction between [C8MIM]Cl and lithium bis[(trifluoromethane)sulfonyl]imide (LiNTf2) to extract the analytes. Using a fully automatic SPE workstation, special SPE columns packed with nonwoven polypropylene (NWPP) fiber, and a modified operation program, the procedures of the IL-based in situ DLLME, including the collection of a water sample, injection of an ion exchange solvent, phase separation of the emulsified solution, elution of the retained extraction phase, and collection of the eluent into vials, can be performed automatically. The developed approach, coupled with high-performance liquid chromatography–diode array detection (HPLC–DAD), was successfully applied to the detection and concentration determination of benzoylurea (BU) insecticides in water samples. Parameters affecting the extraction performance were investigated and optimized. Under the optimized conditions, the proposed method achieved extraction recoveries of 80% to 89% for water samples. The limits of detection (LODs) of the method were in the range of 0.16–0.45 ng mL−1. The intra-column and inter-column relative standard deviations (RSDs) were <8.6%. Good linearity (r > 0.9986) was obtained over the calibration range from 2 to 500 ng mL−1. The proposed

  13. Extensible automated dispersive liquid–liquid microextraction

    International Nuclear Information System (INIS)

    Li, Songqing; Hu, Lu; Chen, Ketao; Gao, Haixiang

    2015-01-01

    Highlights: • An extensible automated dispersive liquid–liquid microextraction was developed. • A fully automatic SPE workstation with a modified operation program was used. • Ionic liquid-based in situ DLLME was used as the model method. • SPE columns packed with nonwoven polypropylene fiber were used for phase separation. • The approach was applied to the determination of benzoylurea insecticides in water. - Abstract: In this study, a convenient and extensible automated ionic liquid-based in situ dispersive liquid–liquid microextraction (automated IL-based in situ DLLME) was developed. 1-Octyl-3-methylimidazolium bis[(trifluoromethane)sulfonyl]imide ([C8MIM]NTf2) is formed through the reaction between [C8MIM]Cl and lithium bis[(trifluoromethane)sulfonyl]imide (LiNTf2) to extract the analytes. Using a fully automatic SPE workstation, special SPE columns packed with nonwoven polypropylene (NWPP) fiber, and a modified operation program, the procedures of the IL-based in situ DLLME, including the collection of a water sample, injection of an ion exchange solvent, phase separation of the emulsified solution, elution of the retained extraction phase, and collection of the eluent into vials, can be performed automatically. The developed approach, coupled with high-performance liquid chromatography–diode array detection (HPLC–DAD), was successfully applied to the detection and concentration determination of benzoylurea (BU) insecticides in water samples. Parameters affecting the extraction performance were investigated and optimized. Under the optimized conditions, the proposed method achieved extraction recoveries of 80% to 89% for water samples. The limits of detection (LODs) of the method were in the range of 0.16–0.45 ng mL−1. The intra-column and inter-column relative standard deviations (RSDs) were <8.6%. Good linearity (r > 0.9986) was obtained over the calibration range from 2 to 500 ng mL−1. The proposed method opens a new avenue

  14. Automated Structure Solution with the PHENIX Suite

    Energy Technology Data Exchange (ETDEWEB)

    Zwart, Peter H.; Zwart, Peter H.; Afonine, Pavel; Grosse-Kunstleve, Ralf W.; Hung, Li-Wei; Ioerger, Tom R.; McCoy, A.J.; McKee, Eric; Moriarty, Nigel; Read, Randy J.; Sacchettini, James C.; Sauter, Nicholas K.; Storoni, L.C.; Terwilliger, Tomas C.; Adams, Paul D.

    2008-06-09

    Significant time and effort are often required to solve and complete a macromolecular crystal structure. The development of automated computational methods for the analysis, solution and completion of crystallographic structures has the potential to produce minimally biased models in a short time without the need for manual intervention. The PHENIX software suite is a highly automated system for macromolecular structure determination that can rapidly arrive at an initial partial model of a structure without significant human intervention, given moderate resolution and good quality data. This achievement has been made possible by the development of new algorithms for structure determination, maximum-likelihood molecular replacement (PHASER), heavy-atom search (HySS), template and pattern-based automated model-building (RESOLVE, TEXTAL), automated macromolecular refinement (phenix.refine), and iterative model-building, density modification and refinement that can operate at moderate resolution (RESOLVE, AutoBuild). These algorithms are based on a highly integrated and comprehensive set of crystallographic libraries that have been built and made available to the community. The algorithms are tightly linked and made easily accessible to users through the PHENIX Wizards and the PHENIX GUI.

  15. Automated structure solution with the PHENIX suite

    Energy Technology Data Exchange (ETDEWEB)

    Terwilliger, Thomas C [Los Alamos National Laboratory; Zwart, Peter H [LBNL; Afonine, Pavel V [LBNL; Grosse - Kunstleve, Ralf W [LBNL

    2008-01-01

    Significant time and effort are often required to solve and complete a macromolecular crystal structure. The development of automated computational methods for the analysis, solution, and completion of crystallographic structures has the potential to produce minimally biased models in a short time without the need for manual intervention. The PHENIX software suite is a highly automated system for macromolecular structure determination that can rapidly arrive at an initial partial model of a structure without significant human intervention, given moderate resolution and good quality data. This achievement has been made possible by the development of new algorithms for structure determination, maximum-likelihood molecular replacement (PHASER), heavy-atom search (HySS), template- and pattern-based automated model-building (RESOLVE, TEXTAL), automated macromolecular refinement (phenix.refine), and iterative model-building, density modification and refinement that can operate at moderate resolution (RESOLVE, AutoBuild). These algorithms are based on a highly integrated and comprehensive set of crystallographic libraries that have been built and made available to the community. The algorithms are tightly linked and made easily accessible to users through the PHENIX Wizards and the PHENIX GUI.

  16. MATHEMATICAL MODEL FOR SOFTWARE USABILITY AUTOMATED EVALUATION AND ASSURANCE

    Directory of Open Access Journals (Sweden)

    І. Гученко

    2011-04-01

    The subject of the research is software usability, and the aim is the construction of a mathematical model for evaluating and assuring a specified level of usability. The research uses the methodology of structural analysis, methods of multicriteria optimization and decision-making theory, the convolution method, and scientific methods of analysis and analogy. The result of the work is a model for automated evaluation and assurance of software usability that allows one not only to estimate the current level of usability during every iteration of agile development but also to manage the usability of the software products being created. The results can be used to construct automated support systems for managing software usability.
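
    A minimal sketch of the convolution (weighted aggregation) step such a model might use; the metrics, weights, and score interpretation are invented for illustration.

    ```python
    # Additive convolution of normalized usability criteria into one score.
    weights = {"task_success": 0.4, "time_on_task": 0.3,
               "error_rate": 0.2, "satisfaction": 0.1}

    def usability_score(metrics: dict) -> float:
        """All criteria normalized to [0, 1], higher is better."""
        return sum(weights[k] * metrics[k] for k in weights)

    iteration = {"task_success": 0.9, "time_on_task": 0.7,
                 "error_rate": 0.8, "satisfaction": 0.6}
    print(round(usability_score(iteration), 3))  # compare to the target level
    ```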

  17. Automating CPM-GOMS

    Science.gov (United States)

    John, Bonnie; Vera, Alonso; Matessa, Michael; Freed, Michael; Remington, Roger

    2002-01-01

    CPM-GOMS is a modeling method that combines the task decomposition of a GOMS analysis with a model of human resource usage at the level of cognitive, perceptual, and motor operations. CPM-GOMS models have made accurate predictions about skilled user behavior in routine tasks, but developing such models is tedious and error-prone. We describe a process for automatically generating CPM-GOMS models from a hierarchical task decomposition expressed in a cognitive modeling tool called Apex. Resource scheduling in Apex automates the difficult task of interleaving the cognitive, perceptual, and motor resources underlying common task operators (e.g. mouse move-and-click). Apex's UI automatically generates PERT charts, which allow modelers to visualize a model's complex parallel behavior. Because interleaving and visualization are now automated, it is feasible to construct arbitrarily long sequences of behavior. To demonstrate the process, we present a model of automated teller interactions in Apex and discuss implications for user modeling. Of the techniques available to model human users, the Goals, Operators, Methods, and Selection (GOMS) method [6, 21] has been the most widely used, providing accurate, often zero-parameter, predictions of the routine performance of skilled users in a wide range of procedural tasks [6, 13, 15, 27, 28]. GOMS is meant to model routine behavior. The user is assumed to have methods that apply sequences of operators to achieve a goal. Selection rules are applied when there is more than one method to achieve a goal. Many routine tasks lend themselves well to such decomposition. Decomposition produces a representation of the task as a set of nested goal states that include an initial state and a final state. The iterative decomposition into goals and nested subgoals can terminate in primitives of any desired granularity, the choice of level of detail dependent on the predictions required. Although GOMS has proven useful in HCI, tools to support the
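
    The critical-path computation at the heart of CPM-GOMS can be shown in a few lines; the operator durations and dependencies below are invented, and Apex's scheduler is far more general.

    ```python
    # Earliest-finish (critical path) over a toy operator network; durations in ms.
    from functools import lru_cache

    duration = {"perceive": 100, "cognise": 50, "move_mouse": 300, "click": 100}
    preds = {"cognise": ["perceive"], "move_mouse": ["cognise"], "click": ["move_mouse"]}

    @lru_cache(maxsize=None)
    def finish(op: str) -> int:
        """Earliest finish = own duration + latest predecessor finish."""
        return duration[op] + max((finish(p) for p in preds.get(op, [])), default=0)

    print("predicted task time (ms):", max(finish(op) for op in duration))  # 550
    ```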

  18. The Science of Home Automation

    Science.gov (United States)

    Thomas, Brian Louis

    Smart home technologies and the concept of home automation have become more popular in recent years. This popularity has been accompanied by social acceptance of passive sensors installed throughout the home. The subsequent increase in smart homes facilitates the creation of home automation strategies. We believe that home automation strategies can be generated intelligently by utilizing smart home sensors and activity learning. In this dissertation, we hypothesize that home automation can benefit from activity awareness. To test this, we develop our activity-aware smart automation system, CARL (CASAS Activity-aware Resource Learning). CARL learns the associations between activities and device usage from historical data and utilizes the activity-aware capabilities to control the devices. To help validate CARL we deploy and test three different versions of the automation system in a real-world smart environment. To provide a foundation of activity learning, we integrate existing activity recognition and activity forecasting into CARL home automation. We also explore two alternatives to using human-labeled data to train the activity learning models. The first unsupervised method is Activity Detection, and the second is a modified DBSCAN algorithm that utilizes Dynamic Time Warping (DTW) as a distance metric. We compare the performance of activity learning with human-defined labels and with automatically-discovered activity categories. To provide evidence in support of our hypothesis, we evaluate CARL automation in a smart home testbed. Our results indicate that home automation can be boosted through activity awareness. We also find that the resulting automation has a high degree of usability and comfort for the smart home resident.
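
    A plain dynamic-time-warping distance, the metric paired with the modified DBSCAN above; this generic textbook version is a sketch, not the dissertation's code.

    ```python
    # O(n*m) DTW distance between two 1-D sequences.
    import numpy as np

    def dtw(a, b):
        n, m = len(a), len(b)
        D = np.full((n + 1, m + 1), np.inf)
        D[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                cost = abs(a[i - 1] - b[j - 1])
                D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
        return D[n, m]

    # Sequences offset in time but similar in shape score a small distance.
    print(dtw(np.array([0, 1, 2, 3]), np.array([0, 0, 1, 2, 2, 3])))
    ```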

  19. Automated reconstruction of 3D models from real environments

    Science.gov (United States)

    Sequeira, V.; Ng, K.; Wolfart, E.; Gonçalves, J. G. M.; Hogg, D.

    This paper describes an integrated approach to the construction of textured 3D scene models of building interiors from laser range data and visual images. This approach has been implemented in a collection of algorithms and sensors within a prototype device for 3D reconstruction, known as the EST (Environmental Sensor for Telepresence). The EST can take the form of a push trolley or of an autonomous mobile platform. The Autonomous EST (AEST) has been designed to provide an integrated solution for automating the creation of complete models. Embedded software performs several functions, including triangulation of the range data, registration of video texture, registration and integration of data acquired from different capture points. Potential applications include facilities management for the construction industry and creating reality models to be used in general areas of virtual reality, for example, virtual studios, virtualised reality for content-related applications (e.g., CD-ROMs), social telepresence, architecture and others. The paper presents the main components of the EST/AEST, and presents some example results obtained from the prototypes. The reconstructed model is encoded in VRML format so that it is possible to access and view the model via the World Wide Web.

  20. Automated Protocol for Large-Scale Modeling of Gene Expression Data.

    Science.gov (United States)

    Hall, Michelle Lynn; Calkins, David; Sherman, Woody

    2016-11-28

    With the continued rise of phenotypic- and genotypic-based screening projects, computational methods to analyze, process, and ultimately make predictions in this field take on growing importance. Here we show how automated machine learning workflows can produce models that are predictive of differential gene expression as a function of a compound structure using data from A673 cells as a proof of principle. In particular, we present predictive models with an average accuracy of greater than 70% across a highly diverse ∼1000 gene expression profile. In contrast to the usual in silico design paradigm, where one interrogates a particular target-based response, this work opens the opportunity for virtual screening and lead optimization for desired multitarget gene expression profiles.
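
    A hedged sketch of the workflow shape described: compound structure in, per-gene up/down call out. RDKit fingerprints and a random forest stand in for the authors' pipeline; the SMILES and labels are synthetic.

    ```python
    # One classifier per gene over Morgan-fingerprint features (toy data).
    import numpy as np
    from rdkit import Chem
    from rdkit.Chem import AllChem
    from sklearn.ensemble import RandomForestClassifier

    smiles = ["CCO", "c1ccccc1", "CC(=O)O", "CCN", "c1ccncc1", "CCCC"]
    X = np.stack([np.array(AllChem.GetMorganFingerprintAsBitVect(
        Chem.MolFromSmiles(s), 2, nBits=1024)) for s in smiles])
    Y = np.array([[0, 1, 0, 1, 1], [1, 0, 1, 0, 0], [0, 0, 1, 1, 0],
                  [1, 1, 0, 0, 1], [0, 1, 1, 0, 1], [1, 0, 0, 1, 0]])  # 5 "genes"

    models = [RandomForestClassifier(n_estimators=100, random_state=0).fit(X, Y[:, g])
              for g in range(Y.shape[1])]
    print("expression profile call:", [int(m.predict(X[:1])[0]) for m in models])
    ```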

  1. Automation based on knowledge modeling theory and its applications in engine diagnostic systems using Space Shuttle Main Engine vibrational data. M.S. Thesis

    Science.gov (United States)

    Kim, Jonnathan H.

    1995-01-01

    Humans can perform many complicated tasks without explicit rules. This inherent and advantageous capability becomes a hurdle when a task is to be automated. Modern computers and numerical calculations require explicit rules and discrete numerical values. In order to bridge the gap between human knowledge and automating tools, a knowledge model is proposed. Knowledge modeling techniques are discussed and utilized to automate the labor- and time-intensive task of detecting anomalous bearing wear patterns in the Space Shuttle Main Engine (SSME) High Pressure Oxygen Turbopump (HPOTP).

  2. Explanatory item response modelling of an abstract reasoning assessment: A case for modern test design

    OpenAIRE

    Helland, Fredrik

    2016-01-01

    Assessment is an integral part of society and education, and for this reason it is important to know what you measure. This thesis is about explanatory item response modelling of an abstract reasoning assessment, with the objective of creating a modern test design framework for the automatic generation of valid and precalibrated items of abstract reasoning. Modern test design aims to strengthen the connections between the different components of a test, with a stress on strong theory, systematic it...

  3. Tool-driven Design and Automated Parameterization for Real-time Generic Drivetrain Models

    Directory of Open Access Journals (Sweden)

    Schwarz Christina

    2015-01-01

    Real-time dynamic drivetrain modeling approaches have great potential for development cost reduction in the automotive industry. Even though real-time drivetrain models are available, these solutions are specific to single transmission topologies. In this paper, an environment for parameterization of a solution is proposed, based on a generic method applicable to all types of gear transmission topologies. This enables tool-guided modeling by non-experts in the fields of mechanical engineering and control theory, leading to reduced development and testing efforts. The approach is demonstrated for an exemplary automatic transmission using the environment for automated parameterization. Finally, the parameterization is validated via vehicle measurement data.

  4. On the emergence of pervasive home automation

    DEFF Research Database (Denmark)

    Torbensen, Rune Sonnich

    2012-01-01

    bridges and expandable via USB modules so that new data-communication technologies can be connected to cover the whole residence. An ’IHAP ready’ flexible communication software framework that supports wireless end-device development is proposed. The idea is to bootstrap this new market with open source...... end-devices via the Open Device Service Description Language, which employs abstract service types to describe transformations to the simple end-device application protocols used in home automation. A security system called Trusted Domain grants access for remote control of home automation devices...... by smartphones, M2M applications, and service providers. These Internet nodes and home gateways become the trusted members of a home network capable of spanning several network segments. To include new members, both locally and remotely, a new user-friendly method for establishing trust between remote devices...

  5. Altering users' acceptance of automation through prior automation exposure.

    Science.gov (United States)

    Bekier, Marek; Molesworth, Brett R C

    2017-06-01

    Air navigation service providers worldwide see increased use of automation as one solution to overcome the capacity constraints embedded in the present air traffic management (ATM) system. However, increased use of automation within any system is dependent on user acceptance. The present research sought to determine if the point at which an individual is no longer willing to accept or cooperate with automation can be manipulated. Forty participants underwent training on a computer-based air traffic control programme, followed by two ATM exercises (order counterbalanced), one with and one without the aid of automation. Results revealed that, after exposure to a task with automation assistance, user acceptance of high(er) levels of automation (the 'tipping point') decreased, suggesting it is indeed possible to alter automation acceptance. Practitioner Summary: This paper investigates whether the point at which a user of automation rejects automation (i.e. the 'tipping point') is constant or can be manipulated. The results revealed that, after exposure to a task with automation assistance, user acceptance of high(er) levels of automation decreased, suggesting it is possible to alter automation acceptance.

  6. Abstracted Workflow Framework with a Structure from Motion Application

    Science.gov (United States)

    Rossi, Adam J.

    In scientific and engineering disciplines, from academia to industry, there is an increasing need for the development of custom software to perform experiments, construct systems, and develop products. The natural mindset initially is to shortcut and bypass all overhead and process rigor in order to obtain an immediate result for the problem at hand, with the misconception that the software will simply be thrown away at the end. In a majority of the cases, it turns out the software persists for many years, and likely ends up in production systems for which it was not initially intended. In the current study, a framework that can be used in both industry and academic applications mitigates underlying problems associated with developing scientific and engineering software. This results in software that is much more maintainable, documented, and usable by others, specifically allowing new users to extend capabilities of components already implemented in the framework. There is a multi-disciplinary need in the fields of imaging science, computer science, and software engineering for a unified implementation model, which motivates the development of an abstracted software framework. Structure from motion (SfM) has been identified as one use case where the abstracted workflow framework can improve research efficiencies and eliminate implementation redundancies in scientific fields. The SfM process begins by obtaining 2D images of a scene from different perspectives. Features from the images are extracted and correspondences are established. This provides a sufficient amount of information to initialize the problem for fully automated processing. Transformations are established between views, and 3D points are established via triangulation algorithms. The parameters for the camera models for all views / images are solved through bundle adjustment, establishing a highly consistent point cloud. The initial sparse point cloud and camera matrices are used to generate a dense
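
    The stages named above (features, correspondences, relative pose, triangulation) can be sketched for two views with OpenCV; the image paths and intrinsic matrix K are placeholders, and bundle adjustment is omitted.

    ```python
    # Two-view structure-from-motion skeleton with OpenCV.
    import cv2
    import numpy as np

    img1 = cv2.imread("view1.jpg", cv2.IMREAD_GRAYSCALE)  # placeholder paths
    img2 = cv2.imread("view2.jpg", cv2.IMREAD_GRAYSCALE)
    K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])  # assumed intrinsics

    orb = cv2.ORB_create(2000)                 # feature extraction
    k1, d1 = orb.detectAndCompute(img1, None)
    k2, d2 = orb.detectAndCompute(img2, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(d1, d2)

    p1 = np.float32([k1[m.queryIdx].pt for m in matches])   # correspondences
    p2 = np.float32([k2[m.trainIdx].pt for m in matches])

    E, _ = cv2.findEssentialMat(p1, p2, K, method=cv2.RANSAC, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, p1, p2, K)              # relative pose

    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = K @ np.hstack([R, t])
    X = cv2.triangulatePoints(P1, P2, p1.T, p2.T)           # triangulation
    print((X[:3] / X[3]).T.shape, "sparse 3D points (pre bundle adjustment)")
    ```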

  7. A conceptual model of the automated credibility assessment of the volunteered geographic information

    International Nuclear Information System (INIS)

    Idris, N H; Jackson, M J; Ishak, M H I

    2014-01-01

    The use of Volunteered Geographic Information (VGI) in collecting, sharing and disseminating geospatially referenced information on the Web is increasingly common. This localized and collective information has the potential to complement the maintenance of authoritative mapping data sources and to support the development of Digital Earth. The main barrier to the use of this data in supporting this bottom-up approach is the credibility (trust), completeness, accuracy, and quality of both the data input and the outputs generated. The only feasible approach to assessing these data is to rely on an automated process. This paper describes a conceptual model of indicators (parameters) and practical approaches to automatically assess the credibility of information contributed through VGI, including map mashups, Geo Web and crowd-sourced applications. There are two main components proposed to be assessed in the conceptual model: metadata and data. The metadata component comprises indicators of the hosting (websites) and the sources of data/information. The data component comprises indicators to assess absolute and relative data positioning, attribute, thematic, temporal and geometric correctness and consistency. This paper suggests approaches to assess the components. To assess the metadata component, automated text categorization using supervised machine learning is proposed. To assess the correctness and consistency in the data component, we suggest a matching validation approach using current emerging technologies from Linked Data infrastructures and third-party review validation. This study contributes to the research domain that focuses on the credibility, trust and quality issues of data contributed by web citizen providers
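
    The proposed metadata step, supervised text categorization, might look as follows; the training snippets and labels are invented toys, not the paper's data.

    ```python
    # Credibility classification of hosting-site descriptions (toy example).
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    texts = ["official municipal mapping portal with documented update policy",
             "anonymous mirror site, no contact information, broken links",
             "university research group publishing survey-grade GPS traces",
             "auto-generated aggregator page stuffed with advertising"]
    labels = [1, 0, 1, 0]  # 1 = credible hosting, 0 = not (toy labels)

    clf = make_pipeline(TfidfVectorizer(), LogisticRegression()).fit(texts, labels)
    print(clf.predict(["volunteer mapping community with moderation and edit history"]))
    ```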

  8. Advances in automated noise data acquisition and noise source modeling for power reactors

    International Nuclear Information System (INIS)

    Clapp, N.E. Jr.; Kryter, R.C.; Sweeney, F.J.; Renier, J.A.

    1981-01-01

    A newly expanded program, directed toward achieving a better appreciation of both the strengths and limitations of on-line, noise-based, long-term surveillance programs for nuclear reactors, is described. Initial results in the complementary experimental (acquisition and automated screening of noise signatures) and theoretical (stochastic modeling of likely noise sources) areas of investigation are given

  9. Automated Generation of Formal Models from ST Control Programs for Verification Purposes

    CERN Document Server

    Fernandez Adiego, B; Tournier, J-C; Blanco Vinuela, E; Blech, J-O; Gonzalez Suarez, V

    2014-01-01

    In large industrial control systems such as the ones installed at CERN, one of the main issues is the ability to verify the correct behaviour of the Programmable Logic Controller (PLC) programs. While manual and automated testing can achieve good results, some obvious problems remain unsolved, such as the difficulty of checking safety or liveness properties. This paper proposes a general methodology and a tool to verify PLC programs by automatically generating formal models for different model checkers out of ST code. The proposed methodology defines an automata-based formalism used as an intermediate model (IM) to transform PLC programs written in the ST language into different formal models for verification purposes. A tool based on Xtext has been implemented that automatically generates models for the NuSMV and UPPAAL model checkers and the BIP framework.
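
    The shape of such a pipeline, a small intermediate automaton emitted as model-checker input, can be sketched as below; the automaton, state names, and property are invented, and the paper's Xtext-based tool is far more complete.

    ```python
    # Emit a NuSMV model from a toy automaton (the intermediate-model idea).
    states = ["idle", "running", "fault"]
    transitions = {"idle": ["running"], "running": ["idle", "fault"], "fault": ["fault"]}

    lines = ["MODULE main", "VAR", f"  state : {{{', '.join(states)}}};",
             "ASSIGN", "  init(state) := idle;", "  next(state) := case"]
    for src, dsts in transitions.items():
        lines.append(f"    state = {src} : {{{', '.join(dsts)}}};")
    lines += ["    TRUE : state;", "  esac;",
              "LTLSPEC G (state = fault -> G state = fault)"]  # fault is absorbing
    print("\n".join(lines))  # feed the text to NuSMV for verification
    ```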

  10. Work Practice Simulation of Complex Human-Automation Systems in Safety Critical Situations: The Brahms Generalized Überlingen Model

    Science.gov (United States)

    Clancey, William J.; Linde, Charlotte; Seah, Chin; Shafto, Michael

    2013-01-01

    The transition from the current air traffic system to the next generation air traffic system will require the introduction of new automated systems, including transferring some functions from air traffic controllers to on-board automation. This report describes a new design verification and validation (V&V) methodology for assessing aviation safety. The approach involves a detailed computer simulation of work practices that includes people interacting with flight-critical systems. The research is part of an effort to develop new modeling and verification methodologies that can assess the safety of flight-critical systems, system configurations, and operational concepts. The 2002 Überlingen mid-air collision was chosen for analysis and modeling because one of the main causes of the accident was one crew's response to a conflict between the instructions of the air traffic controller and the instructions of TCAS, an automated on-board Traffic Alert and Collision Avoidance System. It thus furnishes an example of the problem of authority versus autonomy. It provides a starting point for exploring authority/autonomy conflict in the larger system of organization, tools, and practices in which the participants' moment-by-moment actions take place. We have developed a general air traffic system model (not a specific simulation of Überlingen events), called the Brahms Generalized Ueberlingen Model (Brahms-GUeM). Brahms is a multi-agent simulation system that models people, tools, facilities/vehicles, and geography to simulate the current air transportation system as a collection of distributed, interactive subsystems (e.g., airports, air-traffic control towers and personnel, aircraft, automated flight systems and air-traffic tools, instruments, crew). Brahms-GUeM can be configured in different ways, called scenarios, such that anomalous events that contributed to the Überlingen accident can be modeled as functioning according to requirements or in an

  11. A distributed substation automation model based on multi-agent technology

    Energy Technology Data Exchange (ETDEWEB)

    Geus, Klaus de; Milsztajn, Flavio; Kolb, Carlos Jose Johann; Dometerco, Jose Henrique; Souza, Alexandre Mendonca de; Braga, Ciro de Carvalho; Parolin, Emerson Luis; Frisch, Arlenio Carneiro; Fortunato Junior, Luiz Kiss; Erzinger Junior, Augusto; Jonack, Marco Antonio; Guiera, Anderson Juliano Azambuja [Companhia Paranaense de Energia (COPEL), Curitiba, PR (Brazil)]. E-mail: klaus@copel.com; flaviomil@copel.com; kolb@copel.com; dometerc@copel.com; alexandre.mendonca@copel.com; ciro@copel.com; parolin@copel.com; arlenio@copel.com; luiz.kiss@copel.com; aerzinger@copel.com; jonack@copel.com; guiera@copel.com

    2006-10-15

    The main purpose of this paper is to analyse distributed computing technology which can be used in substation automation systems. Based on comparative performance results obtained in the laboratory, a specific model for distributed substation automation is proposed, considering the current model employed at COPEL - Companhia Paranaense de Energia. The proposed model is based on multi-agent technology, which has lately received special attention in the development of distributed systems with local intelligence. (author)

  12. From Abstract Art to Abstracted Artists

    Directory of Open Access Journals (Sweden)

    Romi Mikulinsky

    2016-11-01

    What lineage connects early abstract films and machine-generated YouTube videos? Hans Richter’s famous piece Rhythmus 21 is considered to be the first abstract film in the experimental tradition. The Webdriver Torso YouTube channel is composed of hundreds of thousands of machine-generated test patterns designed to check frequency signals on YouTube. This article discusses geometric abstraction vis-à-vis new vision, conceptual art and algorithmic art. It argues that the Webdriver Torso is an artistic marvel indicative of a form we call mathematical abstraction, which is art performed by computers and, quite possibly, for computers.

  13. Bounded Rationality of Generalized Abstract Fuzzy Economies

    Directory of Open Access Journals (Sweden)

    Lei Wang

    2014-01-01

    By using a nonlinear scalarization technique, the bounded rationality model M for generalized abstract fuzzy economies in finite continuous spaces is established. Furthermore, by using the model M, some new theorems for structural stability and robustness to (λ,ε)-equilibria of generalized abstract fuzzy economies are proved.

  14. A Co-Opetitive Automated Negotiation Model for Vertical Allied Enterprises Teams and Stakeholders

    Directory of Open Access Journals (Sweden)

    Taiguang Gao

    2018-04-01

    Upstream and downstream supply chain enterprises often form a tactical vertical alliance to enhance their operational efficiency and maintain their competitive edges in the market. Hence, it is critical for an alliance to collaborate over its internal resources and resolve the profit conflicts among members, so that the functionality required by stakeholders can be fulfilled. As an effective solution, automated negotiation between the vertically allied enterprise team and the stakeholder makes sufficient use of emerging team advantages and significantly reduces profit conflicts in teams, with group decisions rather than unilateral decisions by some leader. In this paper, an automated negotiation model is designed to describe both the collaborative game process among the team members and the competitive negotiation process between the allied team and the stakeholder. Considering the co-opetitiveness of the vertically allied team, the designed model helps the team members make decisions for their own sake, and the team counter-offers for the ongoing negotiation are generated through a non-cooperative game process, where the profit derived from the negotiation result is distributed with the Shapley value method according to the contribution or importance of each team member. Finally, a case study is given to verify the effectiveness of the designed model.
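
    The Shapley-value split used inside the allied team can be illustrated directly; the three-member characteristic function below is invented.

    ```python
    # Shapley value = average marginal contribution over all join orders.
    from itertools import permutations

    members = ["supplier", "manufacturer", "distributor"]
    payoff = {frozenset(): 0, frozenset({"supplier"}): 10,
              frozenset({"manufacturer"}): 20, frozenset({"distributor"}): 15,
              frozenset(members): 90}

    def v(coalition):
        return payoff.get(coalition, 40)  # any two-member coalition earns 40 (toy)

    shapley = dict.fromkeys(members, 0.0)
    perms = list(permutations(members))
    for order in perms:
        seen = set()
        for m in order:
            shapley[m] += (v(frozenset(seen | {m})) - v(frozenset(seen))) / len(perms)
            seen.add(m)
    print(shapley)  # profit distribution according to contribution
    ```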

  15. Complacency and Automation Bias in the Use of Imperfect Automation.

    Science.gov (United States)

    Wickens, Christopher D; Clegg, Benjamin A; Vieane, Alex Z; Sebok, Angelia L

    2015-08-01

    We examine the effects of two different kinds of decision-aiding automation errors on human-automation interaction (HAI), occurring at the first failure following repeated exposure to correctly functioning automation. The two errors are incorrect advice, triggering the automation bias, and missing advice, reflecting complacency. Contrasts between analogous automation errors in alerting systems, rather than decision aiding, have revealed that alerting false alarms are more problematic to HAI than alerting misses are. Prior research in decision aiding, although contrasting the two aiding errors (incorrect vs. missing), has confounded error expectancy. Participants performed an environmental process control simulation with and without decision aiding. For those with the aid, automation dependence was created through several trials of perfect aiding performance, and an unexpected automation error was then imposed in which automation was either gone (one group) or wrong (a second group). A control group received no automation support. The correct aid supported faster and more accurate diagnosis and lower workload. The aid failure degraded all three variables, but "automation wrong" had a much greater effect on accuracy, reflecting the automation bias, than did "automation gone," reflecting the impact of complacency. Some complacency was manifested for automation gone, by a longer latency and more modest reduction in accuracy. Automation wrong, creating the automation bias, appears to be a more problematic form of automation error than automation gone, reflecting complacency. Decision-aiding automation should indicate its lower degree of confidence in uncertain environments to avoid the automation bias. © 2015, Human Factors and Ergonomics Society.

  16. Model-based security testing

    OpenAIRE

    Schieferdecker, Ina; Großmann, Jürgen; Schneider, Martin

    2012-01-01

    Security testing aims at validating software system requirements related to security properties like confidentiality, integrity, authentication, authorization, availability, and non-repudiation. Although security testing techniques have been available for many years, there have been few approaches that allow for the specification of test cases at a higher level of abstraction, for enabling guidance on test identification and specification, as well as for automated test generation. Model-based security...

  17. Nuclear materials safeguards. Volume II. 1975--March 1976 (a bibliography with abstracts). Report for 1975--Mar 1976

    International Nuclear Information System (INIS)

    Grooms, D.W.

    1976-03-01

    Citations cover the methods of safeguarding nuclear materials through effective management, accountability, nondestructive assays, instrumentation, and automated continuous inventory systems. Problem areas and recommendations for improving the management of nuclear materials are included. (Contains 88 abstracts) See also NTIS/PS-76/0200, Nuclear Materials Safeguards. Vol. 1. 1964-1974 (A title bibliography)

  18. Automated and observer based light field indicator edge evaluation in diagnostic X-ray equipment

    OpenAIRE

    Bottaro, Márcio; Nagy, Balázs Vince; Soares, Fernanda Cristina Salvador; Rosendo, Danilo Cabral

    2017-01-01

    Introduction: To analyze edge detection and optical contrast calculation of light field indicators used in X-ray via automated- and observer-based methods, and to compare them with current standard approaches, which do not give an exact definition for light field edge determination. Methods: An automated light sensor array was used to measure the penumbra zone of the edge in the standard X-ray equipment, while trained and naïve human observers were asked to mark the light field edge according t...

  19. Abstract algebra for physicists

    International Nuclear Information System (INIS)

    Zeman, J.

    1975-06-01

    Certain recent models of composite hadrons involve concepts and theorems from abstract algebra which are unfamiliar to most theoretical physicists. The algebraic apparatus needed for an understanding of these models is summarized here. Particular emphasis is given to algebraic structures which are not assumed to be associative. (2 figures) (auth)

  20. Classification of Automated Search Traffic

    Science.gov (United States)

    Buehrer, Greg; Stokes, Jack W.; Chellapilla, Kumar; Platt, John C.

    As web search providers seek to improve both relevance and response times, they are challenged by the ever-increasing tax of automated search query traffic. Third-party systems interact with search engines for a variety of reasons, such as monitoring a web site’s rank, augmenting online games, or possibly to maliciously alter click-through rates. In this paper, we investigate automated traffic (sometimes referred to as bot traffic) in the query stream of a large search engine provider. We define automated traffic as any search query not generated by a human in real time. We first provide examples of different categories of query logs generated by automated means. We then develop many different features that distinguish between queries generated by people searching for information, and those generated by automated processes. We categorize these features into two classes: interpretations of the physical model of human interactions, and behavioral patterns of automated interactions. Using these detection features, we next classify the query stream using multiple binary classifiers. In addition, a multiclass classifier is then developed to identify subclasses of both normal and automated traffic. An active learning algorithm is used to suggest which user sessions to label to improve the accuracy of the multiclass classifier, while also seeking to discover new classes of automated traffic. A performance analysis is then provided. Finally, the multiclass classifier is used to predict the subclass distribution for the search query stream.
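
    The active-learning step can be sketched with least-confident sampling; the features and labels below are random stand-ins for the query-stream features described above.

    ```python
    # Pick the pool sessions the multiclass model is least sure about.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    X_labeled, y = rng.normal(size=(200, 5)), rng.integers(0, 3, 200)  # 3 classes
    X_pool = rng.normal(size=(1000, 5))                # unlabeled sessions

    clf = LogisticRegression(max_iter=1000).fit(X_labeled, y)
    uncertainty = 1.0 - clf.predict_proba(X_pool).max(axis=1)
    ask_human = np.argsort(uncertainty)[-10:]          # label these next
    print("query these pool indices:", ask_human)
    ```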

  1. Automated sensitivity analysis: New tools for modeling complex dynamic systems

    International Nuclear Information System (INIS)

    Pin, F.G.

    1987-01-01

    Sensitivity analysis is an established methodology used by researchers in almost every field to gain essential insight in design and modeling studies and in performance assessments of complex systems. Conventional sensitivity analysis methodologies, however, have not enjoyed the widespread use they deserve considering the wealth of information they can provide, partly because of their prohibitive cost or the large initial analytical investment they require. Automated systems have recently been developed at ORNL to eliminate these drawbacks. Compilers such as GRESS and EXAP now allow automatic and cost-effective calculation of sensitivities in FORTRAN computer codes. In this paper, these and other related tools are described and their impact and applicability in the general areas of modeling, performance assessment and decision making for radioactive waste isolation problems are discussed
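
    The idea behind derivative-augmenting compilers such as GRESS, which instrument FORTRAN codes, can be illustrated by toy forward-mode automatic differentiation with dual numbers; this is an analogy, not the ORNL implementation.

    ```python
    # Dual numbers carry (value, derivative) through arithmetic automatically.
    class Dual:
        def __init__(self, val, dot=0.0):
            self.val, self.dot = val, dot
        def __add__(self, o):
            o = o if isinstance(o, Dual) else Dual(o)
            return Dual(self.val + o.val, self.dot + o.dot)
        __radd__ = __add__
        def __mul__(self, o):
            o = o if isinstance(o, Dual) else Dual(o)
            return Dual(self.val * o.val, self.dot * o.val + self.val * o.dot)
        __rmul__ = __mul__

    def model(x):
        return 3 * x * x + 2 * x + 1   # stand-in for a code under study

    y = model(Dual(2.0, 1.0))          # seed dx/dx = 1
    print(y.val, y.dot)                # 17.0 and sensitivity dy/dx = 14.0
    ```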

  2. Automating dChip: toward reproducible sharing of microarray data analysis

    Directory of Open Access Journals (Sweden)

    Li Cheng

    2008-05-01

    Background: During the past decade, many software packages have been developed for the analysis and visualization of various types of microarrays. We have developed and maintained the widely used dChip as a microarray analysis software package accessible to both biologists and data analysts. However, challenges arise when dChip users want to analyze large numbers of arrays automatically and share data analysis procedures and parameters. Improvement is also needed when the dChip user support team tries to identify the causes of analysis errors or bugs reported by users. Results: We report here the implementation and application of the dChip automation module. Through this module, dChip automation files can be created to include menu steps, parameters, and data viewpoints to run automatically. A data-packaging function allows convenient transfer from one user to another of the dChip software, microarray data, and analysis procedures, so that the second user can reproduce the entire analysis session of the first user. An analysis report file can also be generated during an automated run, including analysis logs, user comments, and viewpoint screenshots. Conclusion: The dChip automation module is a step toward reproducible research, and it can prompt a more convenient and reproducible mechanism for sharing microarray software, data, and analysis procedures and results. Automation data packages can also be used as publication supplements. Similar automation mechanisms could be valuable to the research community if implemented in other genomics and bioinformatics software packages.

  3. Continuous Automated Model EvaluatiOn (CAMEO) complementing the critical assessment of structure prediction in CASP12.

    Science.gov (United States)

    Haas, Jürgen; Barbato, Alessandro; Behringer, Dario; Studer, Gabriel; Roth, Steven; Bertoni, Martino; Mostaguir, Khaled; Gumienny, Rafal; Schwede, Torsten

    2018-03-01

    Every second year, the community experiment "Critical Assessment of Techniques for Structure Prediction" (CASP) is conducting an independent blind assessment of structure prediction methods, providing a framework for comparing the performance of different approaches and discussing the latest developments in the field. Yet, developers of automated computational modeling methods clearly benefit from more frequent evaluations based on larger sets of data. The "Continuous Automated Model EvaluatiOn (CAMEO)" platform complements the CASP experiment by conducting fully automated blind prediction assessments based on the weekly pre-release of sequences of those structures which are going to be published in the next release of the Protein Data Bank (PDB). CAMEO publishes weekly benchmarking results based on models collected during a 4-day prediction window, on average assessing ca. 100 targets during a time frame of 5 weeks. CAMEO benchmarking data is generated consistently for all participating methods at the same point in time, enabling developers to benchmark and cross-validate their method's performance, and directly refer to the benchmarking results in publications. In order to facilitate server development and promote shorter release cycles, CAMEO sends weekly emails with submission statistics and low performance warnings. Many participants of CASP have successfully employed CAMEO when preparing their methods for upcoming community experiments. CAMEO offers a variety of scores to allow benchmarking diverse aspects of structure prediction methods. By introducing new scoring schemes, CAMEO facilitates new development in areas of active research, for example, modeling quaternary structure, complexes, or ligand binding sites. © 2017 Wiley Periodicals, Inc.

  4. Incremental learning for automated knowledge capture

    Energy Technology Data Exchange (ETDEWEB)

    Benz, Zachary O. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Basilico, Justin Derrick [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Davis, Warren Leon [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dixon, Kevin R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jones, Brian S. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Martin, Nathaniel [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wendt, Jeremy Daniel [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2013-12-01

    People responding to high-consequence national-security situations need tools to help them make the right decision quickly. The dynamic, time-critical, and ever-changing nature of these situations, especially those involving an adversary, require models of decision support that can dynamically react as a situation unfolds and changes. Automated knowledge capture is a key part of creating individualized models of decision making in many situations because it has been demonstrated as a very robust way to populate computational models of cognition. However, existing automated knowledge capture techniques only populate a knowledge model with data prior to its use, after which the knowledge model is static and unchanging. In contrast, humans, including our national-security adversaries, continually learn, adapt, and create new knowledge as they make decisions and witness their effects. This artificial dichotomy between creation and use exists because the majority of automated knowledge capture techniques are based on traditional batch machine-learning and statistical algorithms. These algorithms are primarily designed to optimize the accuracy of their predictions and are only secondarily, if at all, concerned with issues such as speed, memory use, or ability to be incrementally updated. Thus, when new data arrives, batch algorithms used for automated knowledge capture currently require significant recomputation, frequently from scratch, which makes them ill-suited for use in dynamic, time-critical, high-consequence decision making environments. In this work we seek to explore and expand upon the capabilities of dynamic, incremental models that can adapt to an ever-changing feature space.
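
    A minimal sketch of the incremental alternative: an online linear model updated one observation at a time, so new data never forces recomputation from scratch. The feature layout and label rule below are invented for illustration.

        # Online (incremental) learning sketch with scikit-learn's partial_fit.
        import numpy as np
        from sklearn.linear_model import SGDClassifier

        rng = np.random.default_rng(1)
        model = SGDClassifier(loss="log_loss")   # use loss="log" on older scikit-learn
        classes = np.array([0, 1])               # label set must be declared up front

        for step in range(1000):                 # a stream of decision observations
            x = rng.random((1, 5))               # one new observation (5 features)
            y = np.array([int(x[0, 0] > 0.5)])   # stand-in for the observed decision
            model.partial_fit(x, y, classes=classes if step == 0 else None)

        print(model.predict(rng.random((1, 5))))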

  5. An automation of design and modelling tasks in NX Siemens environment with original software - generator module

    Science.gov (United States)

    Zbiciak, M.; Grabowik, C.; Janik, W.

    2015-11-01

    Nowadays the design constructional process is almost exclusively aided with CAD/CAE/CAM systems. It is estimated that nearly 80% of design activities have a routine nature. These routine design tasks are highly susceptible to automation. Design automation is usually made with API tools which allow building original software responsible for aiding different engineering activities. In this paper the original software worked out in order to automate engineering tasks at the stage of a product geometrical shape design is presented. The elaborated software works exclusively in the NX Siemens CAD/CAM/CAE environment and was prepared in Microsoft Visual Studio with application of the .NET technology and the NX SNAP library. The software functionality allows designing and modelling of spur and helicoidal involute gears. Moreover, it is possible to estimate relative manufacturing costs. With the Generator module it is possible to design and model both standard and non-standard gear wheels. The main advantage of the model generated in such a way is its better representation of the involute curve in comparison to those which are drawn with the standard tools of specialized CAD systems. This comes from the fact that usually in CAD systems an involute curve is drawn through 3 points that correspond to points located on the addendum circle, the reference diameter of the gear and the base circle respectively. In the Generator module the involute curve is drawn through 11 points which are located on and above the base and addendum circles; therefore the 3D gear wheel models are highly accurate. Application of the Generator module makes the modelling process very rapid, so that the gear wheel modelling time is reduced to several seconds. During the conducted research an analysis of the differences between the standard 3-point and the 11-point involutes was made. The results and conclusions drawn from the analysis are shown in detail.
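
    The 11-point construction can be reproduced with the standard parametric equations of an involute. The sketch below assumes common gear conventions (module, pressure angle, standard addendum); it is an illustration, not the Generator module's code.

        # Sample n points on the involute of a base circle, out to the tip radius.
        import math

        def involute_points(r_base, r_tip, n=11):
            t_max = math.sqrt((r_tip / r_base) ** 2 - 1.0)   # roll angle at r_tip
            pts = []
            for i in range(n):
                t = t_max * i / (n - 1)
                x = r_base * (math.cos(t) + t * math.sin(t))
                y = r_base * (math.sin(t) - t * math.cos(t))
                pts.append((x, y))
            return pts

        # Example: module 2 mm, 20 teeth, 20 degree pressure angle
        m, z, alpha = 2.0, 20, math.radians(20.0)
        r_ref = m * z / 2.0                  # reference radius
        r_base = r_ref * math.cos(alpha)     # base circle radius
        r_tip = r_ref + m                    # addendum (tip) circle radius
        for x, y in involute_points(r_base, r_tip):
            print(f"{x:8.4f} {y:8.4f}")

    Fitting a spline through these 11 points, rather than 3, is what gives the more accurate tooth flank.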

  6. Automated Feature Extraction of Foredune Morphology from Terrestrial Lidar Data

    Science.gov (United States)

    Spore, N.; Brodie, K. L.; Swann, C.

    2014-12-01

    Foredune morphology is often described in storm impact prediction models using the elevation of the dune crest and dune toe and compared with maximum runup elevations to categorize the storm impact and predicted responses. However, these parameters do not account for other foredune features that may make them more or less erodible, such as alongshore variations in morphology, vegetation coverage, or compaction. The goal of this work is to identify other descriptive features that can be extracted from terrestrial lidar data that may affect the rate of dune erosion under wave attack. Daily, mobile-terrestrial lidar surveys were conducted during a 6-day nor'easter (Hs = 4 m in 6 m water depth) along 20 km of coastline near Duck, North Carolina, which encompassed a variety of foredune forms in close proximity to each other. This abstract will focus on the tools developed for the automated extraction of the morphological features from terrestrial lidar data, while the response of the dune will be presented by Brodie and Spore as an accompanying abstract. Raw point cloud data can be dense and is often under-utilized due to time and personnel constraints required for analysis, since many algorithms are not fully automated. In our approach, the point cloud is first projected into a local coordinate system aligned with the coastline, and then bare earth points are interpolated onto a rectilinear 0.5 m grid creating a high resolution digital elevation model. The surface is analyzed by identifying features along each cross-shore transect. Surface curvature is used to identify the position of the dune toe, and then beach and berm morphology is extracted shoreward of the dune toe, and foredune morphology is extracted landward of the dune toe. Changes in, and magnitudes of, cross-shore slope, curvature, and surface roughness are used to describe the foredune face and each cross-shore transect is then classified using its pre-storm morphology for storm-response analysis.
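
    The curvature-based dune-toe pick lends itself to a short illustration. The sketch below assumes a gridded cross-shore elevation transect at 0.5 m spacing, as in the abstract; the synthetic profile and the simple maximum-curvature criterion are stand-ins for the real algorithm.

        # Locate the dune toe on one transect as the point of maximum curvature.
        import numpy as np

        def dune_toe_index(z, dx=0.5):
            """z: elevations along a cross-shore transect (seaward to landward)."""
            d2z = np.gradient(np.gradient(z, dx), dx)   # second-derivative proxy
            return int(np.argmax(d2z))

        x = np.arange(0, 60, 0.5)
        z = np.where(x < 30, 0.02 * x, 0.6 + 0.25 * (x - 30))   # beach, then dune face
        toe = dune_toe_index(z)
        print("dune toe at x =", x[toe], "m, z =", round(z[toe], 2), "m")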

  7. Data Abstraction Mechanisms in Sina/st

    NARCIS (Netherlands)

    Meyrowitz, N.K.; Aksit, Mehmet; Tripathi, Anand

    1988-01-01

    This paper describes a new data abstraction mechanism in an object-oriented model of computing. The data abstraction mechanism described here has been devised in the context of the design of Sina/st language. In Sina/st no language constructs have been adopted for specifying inheritance or

  8. Use of the DynaLearn learning environment by naïve student modelers : Implications for automated support

    NARCIS (Netherlands)

    Noble, R.; Bredeweg, B.; Biswas, G.; Bull, S.; Kay, J.; Mitrovic, A.

    2011-01-01

    This paper shows that naïve students will require coaching to overcome the difficulties they face in identifying the important concepts to be modeled, and understanding the causal meta-vocabulary needed for conceptual models. The results of this study will be incorporated in the automated feedback

  9. Towards Automation 2.0: A Neurocognitive Model for Environment Recognition, Decision-Making, and Action Execution

    Directory of Open Access Journals (Sweden)

    Zucker Gerhard

    2011-01-01

    Full Text Available The ongoing penetration of building automation by information technology is by far not saturated. Today's systems need not only be reliable and fault tolerant, they also have to regard energy efficiency and flexibility in the overall consumption. Meeting the quality and comfort goals in building automation while at the same time optimizing towards energy, carbon footprint and cost-efficiency requires systems that are able to handle large amounts of information and negotiate system behaviour that resolves conflicting demands: a decision-making process. In recent years, research has started to focus on bionic principles for designing new concepts in this area. The information processing principles of the human mind have turned out to be of particular interest as the mind is capable of processing huge amounts of sensory data and taking adequate decisions for (re-)actions based on these analysed data. In this paper, we discuss how a bionic approach can solve the upcoming problems of energy-optimal systems. A recently developed model for environment recognition and decision-making processes, which is based on research findings from different disciplines of brain research, is introduced. This model is the foundation for applications in intelligent building automation that have to deal with information from home and office environments. All of these applications have in common that they consist of a combination of communicating nodes and have many, partly contradicting goals.

  10. Interoperability between OPC UA and AutomationML

    OpenAIRE

    Henßen, Robert; Schleipen, Miriam

    2014-01-01

    OPC UA (OPC Unified Architecture) is a platform-independent standard series (IEC 62541) [1], [2] for communication of industrial automation devices and systems. The OPC Unified Architecture is an advanced communication technology for process control. Certainly the launching costs for the initial information model are quite high. AutomationML (Automation Markup Language) is an upcoming open standard series (IEC 62714) [3], [4] for describing production plants or plant components. The goal of t...

  11. Studying human-automation interactions: methodological lessons learned from the human-centred automation experiments 1997-2001

    International Nuclear Information System (INIS)

    Massaiu, Salvatore; Skjerve, Ann Britt Miberg; Skraaning, Gyrd Jr.; Strand, Stine; Waeroe, Irene

    2004-04-01

    This report documents the methodological lessons learned from the Human Centred Automation (HCA) programme both in terms of psychometric evaluation of the measurement techniques developed for human-automation interaction study, and in terms of the application of advanced statistical methods for analysis of experiments. The psychometric evaluation is based on data from the four experiments performed within the HCA programme. The result is a single-source reference text of measurement instruments for the study of human-automation interaction, parts of which were specifically developed by the programme. The application of advanced statistical techniques is exemplified by additional analyses performed on the IPSN-HCA experiment of 1998. Special importance is given to the statistical technique Structural Equation Modeling, for the possibility it offers to advance, and empirically test, comprehensive explanations about human-automation interactions. The additional analyses of the IPSN-HCA experiment investigated how the operators formed judgments about their own performance. The issue is of substantive interest for human-automation interaction research because the operators' over- or underestimation of their own performance could be seen as a symptom of human-machine mismatch, and a potential latent failure. These analyses concluded that it is the interplay between several factors that determines the operators' bias in performance self-estimation: (1) the level of automation, (2) the nature of the task, (3) the level of scenario complexity, and (4) the level of trust in the automatic system. A structural model that expresses the interplay of all these factors was empirically evaluated and was found able to provide a concise and elegant explanation of the intricate pattern of relationships between the identified factors. (Author)

  12. Development strategy and process models for phased automation of design and digital manufacturing electronics

    Science.gov (United States)

    Korshunov, G. I.; Petrushevskaya, A. A.; Lipatnikov, V. A.; Smirnova, M. S.

    2018-03-01

    The strategy for assuring the quality of electronics is considered of primary importance. To provide quality, the sequence of processes is considered and modeled as a Markov chain. The improvement is distinguished by simple database support for design-for-manufacturing, allowing future step-by-step development. Phased automation of design and digital manufacturing of electronics is proposed. MATLAB modelling results showed an increase in effectiveness. New tools and software should be more effective. A primary digital model is proposed to represent the product across the sequence of processes, from individual processes up to the whole life cycle.
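
    A process sequence modeled as a Markov chain can be sketched in a few lines. The states and transition probabilities below are invented for illustration; the abstract does not give the actual chain.

        # Toy Markov chain over a design/manufacturing process sequence.
        import numpy as np

        states = ["design", "rework", "manufacture", "test", "done"]
        P = np.array([
            [0.0, 0.2, 0.8, 0.0, 0.0],   # design -> rework or manufacture
            [1.0, 0.0, 0.0, 0.0, 0.0],   # rework -> back to design
            [0.0, 0.0, 0.0, 1.0, 0.0],   # manufacture -> test
            [0.0, 0.3, 0.0, 0.0, 0.7],   # test -> rework or done
            [0.0, 0.0, 0.0, 0.0, 1.0],   # done is absorbing
        ])

        dist = np.array([1.0, 0.0, 0.0, 0.0, 0.0])   # start in "design"
        for _ in range(50):
            dist = dist @ P
        print(dict(zip(states, dist.round(3))))      # mass accumulates in "done"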

  13. A SOA-Based Embedded Systems Development Environment for Industrial Automation

    Directory of Open Access Journals (Sweden)

    Thramboulidis KC

    2008-01-01

    Full Text Available Abstract Currently available toolsets for the development of embedded systems adopt traditional architectural styles and do not cover all the requirements of the development process, with extensibility being the major drawback. In this paper, a service-oriented architectural framework that exploits the semantic web is defined. Features required in the development process are defined as web services and published into the public domain, so as to be used on demand by developers to construct their projects' specific integrated development environments (IDEs). The infrastructure required to build a web service-based IDE is presented. Specific web services are defined and the way these services affect the development process is discussed. Special focus is given to the device model and the ways in which such modelling can significantly improve the development process. A prototype implementation demonstrates the applicability and usefulness of the proposed demand-led development process in the industrial automation domain.

  14. VEST: Abstract vector calculus simplification in Mathematica

    Science.gov (United States)

    Squire, J.; Burby, J.; Qin, H.

    2014-01-01

    We present a new package, VEST (Vector Einstein Summation Tools), that performs abstract vector calculus computations in Mathematica. Through the use of index notation, VEST is able to reduce three-dimensional scalar and vector expressions of a very general type to a well-defined standard form. In addition, utilizing properties of the Levi-Civita symbol, the program can derive types of multi-term vector identities that are not recognized by reduction, subsequently applying these to simplify large expressions. In a companion paper Burby et al. (2013) [12], we employ VEST in the automation of the calculation of high-order Lagrangians for the single particle guiding center system in plasma physics, a computation which illustrates its ability to handle very large expressions. VEST has been designed to be simple and intuitive to use, both for basic checking of work and more involved computations.
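
    As a flavor of the Levi-Civita machinery that VEST applies symbolically, the fragment below numerically verifies the standard epsilon-delta contraction identity. VEST itself is a Mathematica package; this Python check is only an analogy.

        # Verify eps_ijk eps_ilm = delta_jl delta_km - delta_jm delta_kl numerically.
        import numpy as np

        eps = np.zeros((3, 3, 3))
        for i in range(3):
            for j in range(3):
                for k in range(3):
                    eps[i, j, k] = (i - j) * (j - k) * (k - i) / 2

        delta = np.eye(3)
        lhs = np.einsum("ijk,ilm->jklm", eps, eps)
        rhs = (np.einsum("jl,km->jklm", delta, delta)
               - np.einsum("jm,kl->jklm", delta, delta))
        print(np.allclose(lhs, rhs))   # True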

  15. Improved compliance by BPM-driven workflow automation.

    Science.gov (United States)

    Holzmüller-Laue, Silke; Göde, Bernd; Fleischer, Heidi; Thurow, Kerstin

    2014-12-01

    Using methods and technologies of business process management (BPM) for the laboratory automation has important benefits (i.e., the agility of high-level automation processes, rapid interdisciplinary prototyping and implementation of laboratory tasks and procedures, and efficient real-time process documentation). A principal goal of the model-driven development is the improved transparency of processes and the alignment of process diagrams and technical code. First experiences of using the business process model and notation (BPMN) show that easy-to-read graphical process models can achieve and provide standardization of laboratory workflows. The model-based development allows processes to be changed quickly and adapted easily to changing requirements. The process models are able to host work procedures and their scheduling in compliance with predefined guidelines and policies. Finally, the process-controlled documentation of complex workflow results addresses modern laboratory needs of quality assurance. BPMN 2.0 as an automation language to control every kind of activity or subprocess is directed to complete workflows in end-to-end relationships. BPMN is applicable as a system-independent and cross-disciplinary graphical language to document all methods in laboratories (i.e., screening procedures or analytical processes). That means, with the BPM standard, a communication method of sharing process knowledge of laboratories is also available. © 2014 Society for Laboratory Automation and Screening.

  16. Physics of automated driving in framework of three-phase traffic theory.

    Science.gov (United States)

    Kerner, Boris S

    2018-04-01

    We have revealed physical features of automated driving in the framework of the three-phase traffic theory for which there is no fixed time headway to the preceding vehicle. A comparison with the classical model approach to automated driving for which an automated driving vehicle tries to reach a fixed (desired or "optimal") time headway to the preceding vehicle has been made. It turns out that automated driving in the framework of the three-phase traffic theory can exhibit the following advantages in comparison with the classical model of automated driving: (i) The absence of string instability. (ii) Considerably smaller speed disturbances at road bottlenecks. (iii) Automated driving vehicles based on the three-phase theory can decrease the probability of traffic breakdown at the bottleneck in mixed traffic flow consisting of human driving and automated driving vehicles; on the contrary, even a single automated driving vehicle based on the classical approach can provoke traffic breakdown at the bottleneck in mixed traffic flow.
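
    The contrast between the two control philosophies can be written as toy acceleration laws. The gains, gap band, and functional forms below are illustrative simplifications, not the exact models in the paper: the classical controller regulates toward a fixed desired time headway tau, while the three-phase style controller has no fixed headway inside a band of space gaps and only adapts speed to the leader there.

        # Illustrative car-following accelerations (m/s^2); all constants invented.
        def acc_classical(gap, v, v_lead, tau=1.5, k1=0.3, k2=0.5):
            # classical ACC: drive the space gap toward the fixed headway v * tau
            return k1 * (gap - v * tau) + k2 * (v_lead - v)

        def acc_three_phase(gap, v, v_lead, g_safe=10.0, G=60.0, k=0.5, k1=0.3):
            # three-phase style: within [g_safe, G] there is no preferred gap,
            # only speed adaptation to the preceding vehicle
            if g_safe <= gap <= G:
                return k * (v_lead - v)
            # outside the band, steer the gap back toward it
            return k1 * (gap - (g_safe if gap < g_safe else G))

        # Same situation, different responses: 30 m gap, ego 25 m/s, leader 24 m/s
        print(acc_classical(30.0, 25.0, 24.0), acc_three_phase(30.0, 25.0, 24.0))

    In this toy situation the classical controller brakes hard to restore its fixed headway, while the three-phase style controller merely matches the leader's speed, which is the disturbance-damping effect the abstract describes.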

  17. Necessity of supporting situation understanding to prevent over-trust in automation

    International Nuclear Information System (INIS)

    Itoh, Makoto

    2010-01-01

    It is necessary to clarify how a human operator comes to expect that an automation will perform a task successfully even beyond the limit of the automation. This paper proposes a way of modeling trust in automation by which it is possible to discuss how an operator's trust in automation becomes over-trust. On the basis of the model, an experiment using a micro world was conducted to examine whether the range of user's expectation per se surpasses the limit of the capability of the automation. The results revealed that informing the human operator of the functional limit of capability of automation without giving an appropriate reason was effective but not perfect for avoiding human operator's over trust. It was also shown that the understanding of the automation's limitation can be changed through experiences due to confusion about the situation, therefore, it is necessary to support adequate situation understanding of the human operator in order to prevent over-trust in automation. (author)

  18. Application of DEN refinement and automated model building to a difficult case of molecular-replacement phasing: the structure of a putative succinyl-diaminopimelate desuccinylase from Corynebacterium glutamicum.

    Science.gov (United States)

    Brunger, Axel T; Das, Debanu; Deacon, Ashley M; Grant, Joanna; Terwilliger, Thomas C; Read, Randy J; Adams, Paul D; Levitt, Michael; Schröder, Gunnar F

    2012-04-01

    Phasing by molecular replacement remains difficult for targets that are far from the search model or in situations where the crystal diffracts only weakly or to low resolution. Here, the process of determining and refining the structure of Cgl1109, a putative succinyl-diaminopimelate desuccinylase from Corynebacterium glutamicum, at ∼3 Å resolution is described using a combination of homology modeling with MODELLER, molecular-replacement phasing with Phaser, deformable elastic network (DEN) refinement and automated model building using AutoBuild in a semi-automated fashion, followed by final refinement cycles with phenix.refine and Coot. This difficult molecular-replacement case illustrates the power of including DEN restraints derived from a starting model to guide the movements of the model during refinement. The resulting improved model phases provide better starting points for automated model building and produce more significant difference peaks in anomalous difference Fourier maps to locate anomalous scatterers than does standard refinement. This example also illustrates a current limitation of automated procedures that require manual adjustment of local sequence misalignments between the homology model and the target sequence.

  19. Comparison of manual versus automated data collection method for an evidence-based nursing practice study.

    Science.gov (United States)

    Byrne, M D; Jordan, T R; Welle, T

    2013-01-01

    The objective of this study was to investigate and improve the use of automated data collection procedures for nursing research and quality assurance. A descriptive, correlational study analyzed 44 orthopedic surgical patients who were part of an evidence-based practice (EBP) project examining post-operative oxygen therapy at a Midwestern hospital. The automation work attempted to replicate a manually-collected data set from the EBP project. Automation was successful in replicating data collection for study data elements that were available in the clinical data repository. The automation procedures identified 32 "false negative" patients who met the inclusion criteria described in the EBP project but were not selected during the manual data collection. Automating data collection for certain data elements, such as oxygen saturation, proved challenging because of workflow and practice variations and the reliance on disparate sources for data abstraction. Automation also revealed instances of human error including computational and transcription errors as well as incomplete selection of eligible patients. Automated data collection for analysis of nursing-specific phenomenon is potentially superior to manual data collection methods. Creation of automated reports and analysis may require initial up-front investment with collaboration between clinicians, researchers and information technology specialists who can manage the ambiguities and challenges of research and quality assurance work in healthcare.

  1. Tiling as a Durable Abstraction for Parallelism and Data Locality

    Energy Technology Data Exchange (ETDEWEB)

    Unat, Didem [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Chan, Cy P. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Zhang, Weiqun [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Bell, John [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Shalf, John [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2013-11-18

    Tiling is a useful loop transformation for expressing parallelism and data locality. Automated tiling transformations that preserve data-locality are increasingly important due to hardware trends towards massive parallelism and the increasing costs of data movement relative to the cost of computing. We propose TiDA as a durable tiling abstraction that centralizes parameterized tiling information within array data types with minimal changes to the source code. The data layout information can be used by the compiler and runtime to automatically manage parallelism, optimize data locality, and schedule tasks intelligently. In this study, we present the design features and early interface of TiDA along with some preliminary results.
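
    A sketch of the tiling idea in Python (TiDA itself targets C/Fortran codes): the tile size is carried as layout metadata with the array, and computation iterates tile by tile so each tile's working set stays cache-resident. The class and per-tile kernel below are invented for illustration.

        # Tiled array traversal: loops visit one cache-sized block at a time.
        import numpy as np

        class TiledArray:
            def __init__(self, shape, tile=(64, 64)):
                self.data = np.zeros(shape)
                self.tile = tile        # layout metadata carried with the data

            def tiles(self):
                (nx, ny), (tx, ty) = self.data.shape, self.tile
                for i in range(0, nx, tx):
                    for j in range(0, ny, ty):
                        yield self.data[i:i + tx, j:j + ty]   # one tile (a view)

        a = TiledArray((256, 256), tile=(64, 64))
        for t in a.tiles():             # tiles are independent: parallelizable
            t += 1.0                    # stand-in for a per-tile kernel
        print(a.data.sum())             # 65536.0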

  2. ABSTRACTION OF INFORMATION FROM 2- AND 3-DIMENSIONAL PORFLOW MODELS INTO A 1-D GOLDSIM MODEL - 11404

    International Nuclear Information System (INIS)

    Taylor, G.; Hiergesell, R.

    2010-01-01

    The Savannah River National Laboratory has developed a 'hybrid' approach to Performance Assessment modeling which has been used for a number of Performance Assessments. This hybrid approach uses a multi-dimensional modeling platform (PorFlow) to develop deterministic flow fields and perform contaminant transport. The GoldSim modeling platform is used to develop the Sensitivity and Uncertainty analyses. Because these codes are performing complementary tasks, it is essential that they produce very similar results for the deterministic cases. This paper discusses two very different waste forms, one with no engineered barriers and one with engineered barriers, each of which present different challenges to the abstraction of data. The hybrid approach to Performance Assessment modeling used at the SRNL uses a 2-D unsaturated zone (UZ) and a 3-D saturated zone (SZ) model in the PorFlow modeling platform. The UZ model consists of the waste zone and the unsaturated zone between the waste zone and the water table. The SZ model consists of source cells beneath the waste form to the points of interest. Both models contain 'buffer' cells so that modeling domain boundaries do not adversely affect the calculation. The information pipeline between the two models is the contaminant flux. The domain contaminant flux, typically in units of moles (or Curies) per year from the UZ model is used as a boundary condition for the source cells in the SZ. The GoldSim modeling component of the hybrid approach is an integrated UZ-SZ model. The model is a 1-D representation of the SZ, typically 1-D in the UZ, but as discussed below, depending on the waste form being analyzed may contain pseudo-2-D elements. A waste form at the Savannah River Site (SRS) which has no engineered barriers is commonly referred to as a slit trench. A slit trench, as its name implies, is an unlined trench, typically 6 m deep, 6 m wide, and 200 m long. Low level waste consisting of soil, debris, rubble, wood
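
    The information pipeline between the two models can be illustrated with a toy 1-D transport solver: the multi-dimensional UZ model is abstracted to a time series of contaminant flux, which loads the source cell of the 1-D SZ model. All numbers below are invented for the sketch.

        # 1-D saturated-zone transport driven by a UZ flux boundary condition.
        import numpy as np

        nx, dx, dt = 100, 10.0, 0.5          # grid spacing (m), time step (yr)
        v = 5.0                              # pore velocity (m/yr); CFL = 0.25
        conc = np.zeros(nx)                  # concentration (mol/m^3) along the path
        cell_vol = dx * 1.0 * 1.0            # unit cross-section source cell (m^3)

        def uz_flux(t):
            return 2.0 * np.exp(-0.01 * t)   # abstracted UZ output (mol/yr)

        t = 0.0
        for _ in range(2000):
            conc[1:] -= v * dt / dx * (conc[1:] - conc[:-1])   # upwind advection
            conc[0] += uz_flux(t) * dt / cell_vol              # flux enters source cell
            t += dt
        print("concentration at the downgradient point:", round(conc[-1], 4))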

  3. Designing a Software Test Automation Framework

    Directory of Open Access Journals (Sweden)

    Sabina AMARICAI

    2014-01-01

    Full Text Available Testing is an art and science that should ultimately lead to lower cost businesses through increasing control and reducing risk. Testing specialists should thoroughly understand the system or application from both the technical and the business perspective, and then design, build and implement the minimum-cost, maximum-coverage validation framework. Test Automation is an important ingredient for testing large scale applications. In this paper we discuss several test automation frameworks, their advantages and disadvantages. We also propose a custom automation framework model that is suited for applications with very complex business requirements and numerous interfaces.

  4. Automated systems to identify relevant documents in product risk management

    Science.gov (United States)

    2012-01-01

    Background Product risk management involves critical assessment of the risks and benefits of health products circulating in the market. One of the important sources of safety information is the primary literature, especially for newer products which regulatory authorities have relatively little experience with. Although the primary literature provides vast and diverse information, only a small proportion of it is useful for product risk assessment work. Hence, the aim of this study is to explore the possibility of using text mining to automate the identification of useful articles, which will reduce the time taken for literature search and hence improve work efficiency. In this study, term-frequency inverse document-frequency values were computed for predictors extracted from the titles and abstracts of articles related to three tumour necrosis factor-alpha blockers. A general automated system was developed using only general predictors and was tested for its generalizability using articles related to four other drug classes. Several specific automated systems were developed using both general and specific predictors and training sets of different sizes in order to determine the minimum number of articles required for developing such systems. Results The general automated system had an area under the curve value of 0.731 and was able to rank 34.6% and 46.2% of the total number of 'useful' articles among the first 10% and 20% of the articles presented to the evaluators when tested on the generalizability set. However, its use may be limited by the subjective definition of useful articles. For the specific automated system, it was found that only 20 articles were required to develop a specific automated system with a prediction performance (AUC 0.748) that was better than that of the general automated system. Conclusions Specific automated systems can be developed rapidly and avoid problems caused by subjective definition of useful articles. Thus the efficiency of
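
    The tf-idf-based ranking can be sketched with a small scikit-learn pipeline. The titles and labels below are placeholders for the labelled abstracts; this is an illustration, not the study's system.

        # Rank articles by predicted usefulness from tf-idf features.
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.linear_model import LogisticRegression
        from sklearn.pipeline import make_pipeline

        texts = [
            "infliximab associated with serious infection in rheumatoid arthritis",
            "etanercept pharmacokinetics in healthy volunteers",
            "tumour necrosis factor blocker and risk of lymphoma: cohort study",
            "crystal structure of a tnf receptor fragment",
        ]
        useful = [1, 0, 1, 0]   # 1 = useful for risk assessment (toy labels)

        clf = make_pipeline(TfidfVectorizer(stop_words="english"),
                            LogisticRegression())
        clf.fit(texts, useful)
        print(clf.predict_proba(["serious adverse events with adalimumab"])[:, 1])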

  5. Establishment of automated culture system for murine induced pluripotent stem cells

    Directory of Open Access Journals (Sweden)

    Koike Hiroyuki

    2012-11-01

    Full Text Available Abstract Background Induced pluripotent stem (iPS) cells can differentiate into any cell type, which makes them an attractive resource in fields such as regenerative medicine, drug screening, or in vitro toxicology. The most important prerequisite for these industrial applications is a stable supply and uniform quality of iPS cells. Variation in quality largely results from differences in handling skills between operators in laboratories. To minimize these differences, establishment of an automated iPS cell culture system is necessary. Results We developed a standardized mouse iPS cell maintenance culture, using an automated cell culture system housed in a CO2 incubator commonly used in many laboratories. The iPS cells propagated in a chamber uniquely designed for automated culture and showed specific colony morphology, as in manual culture. A cell detachment device in the system passaged iPS cells automatically by dispersing colonies to single cells. In addition, iPS cells were passaged without any change in colony morphology or expression of undifferentiated stem cell markers during the 4 weeks of automated culture. Conclusions Our results show that use of this compact, automated cell culture system facilitates stable iPS cell culture without obvious effects on iPS cell pluripotency or colony-forming ability. The feasibility of iPS cell culture automation may greatly facilitate the use of this versatile cell source for a variety of biomedical applications.

  6. ABSTRACTION OF DRIFT SEEPAGE

    International Nuclear Information System (INIS)

    Wilson, Michael L.

    2001-01-01

    Drift seepage refers to flow of liquid water into repository emplacement drifts, where it can potentially contribute to degradation of the engineered systems and release and transport of radionuclides within the drifts. Because of these important effects, seepage into emplacement drifts is listed as a ''principal factor for the postclosure safety case'' in the screening criteria for grading of data in Attachment 1 of AP-3.15Q, Rev. 2, ''Managing Technical Product Inputs''. Abstraction refers to distillation of the essential components of a process model into a form suitable for use in total-system performance assessment (TSPA). Thus, the purpose of this analysis/model is to put the information generated by the seepage process modeling in a form appropriate for use in the TSPA for the Site Recommendation. This report also supports the Unsaturated-Zone Flow and Transport Process Model Report. The scope of the work is discussed below. This analysis/model is governed by the ''Technical Work Plan for Unsaturated Zone Flow and Transport Process Model Report'' (CRWMS MandO 2000a). Details of this activity are in Addendum A of the technical work plan. The original Work Direction and Planning Document is included as Attachment 7 of Addendum A. Note that the Work Direction and Planning Document contains tasks identified for both Performance Assessment Operations (PAO) and Natural Environment Program Operations (NEPO). Only the PAO tasks are documented here. The planning for the NEPO activities is now in Addendum D of the same technical work plan and the work is documented in a separate report (CRWMS MandO 2000b). The Project has been reorganized since the document was written. The responsible organizations in the new structure are the Performance Assessment Department and the Unsaturated Zone Department, respectively. The work plan for the seepage abstraction calls for determining an appropriate abstraction methodology, determining uncertainties in seepage, and providing

  7. The abstract geometry modeling language (AgML): experience and road map toward eRHIC

    International Nuclear Information System (INIS)

    Webb, Jason; Lauret, Jerome; Perevoztchikov, Victor

    2014-01-01

    The STAR experiment has adopted an Abstract Geometry Modeling Language (AgML) as the primary description of our geometry model. AgML establishes a level of abstraction, decoupling the definition of the detector from the software libraries used to create the concrete geometry model. Thus, AgML allows us to support both our legacy GEANT 3 simulation application and our ROOT/TGeo based reconstruction software from a single source, which is demonstrably self-consistent. While AgML was developed primarily as a tool to migrate away from our legacy FORTRAN-era geometry codes, it also provides a rich syntax geared towards the rapid development of detector models. AgML has been successfully employed by users to quickly develop and integrate the descriptions of several new detectors in the RHIC/STAR experiment including the Forward GEM Tracker (FGT) and Heavy Flavor Tracker (HFT) upgrades installed in STAR for the 2012 and 2013 runs. AgML has furthermore been heavily utilized to study future upgrades to the STAR detector as it prepares for the eRHIC era. With its track record of practical use in a live experiment in mind, we present the status, lessons learned and future of the AgML language as well as our experience in bringing the code into our production and development environments. We will discuss the path toward eRHIC and pushing the current model to accommodate detector misalignment and high-precision physics.

  8. An abstract machine for module replacement

    OpenAIRE

    Walton, Chris; Kırlı, Dilsun; Gilmore, Stephen

    1998-01-01

    In this paper we define an abstract machine model for the mλ typed intermediate language. This abstract machine is used to give a formal description of the operation of run-time module replacement from the programming language Dynamic ML. The essential technical device which we employ for module replacement is a modification of two-space copying garbage collection.

  9. Nuclear materials safeguards. Volume 2. 1975--February 1977 (a bibliography with abstracts). Report for 1975--Feb 77

    International Nuclear Information System (INIS)

    Grooms, D.W.

    1977-04-01

    Citations cover the methods of safeguarding nuclear materials through effective management, accountability, nondestructive assays, instrumentation, and automated continuous inventory systems. Problem areas and recommendations for improving the management of nuclear materials are included. (This updated bibliography contains 143 abstracts, 56 of which are new entries to the previous edition.) See also, NTIS/PS-76/0200, Nuclear Materials Safeguards. Vol. 1. 1964-1974

  10. Abstract interpretation over non-deterministic finite tree automata for set-based analysis of logic programs

    DEFF Research Database (Denmark)

    Gallagher, John Patrick; Puebla, G.

    2002-01-01

    , and describe its implementation. Both goal-dependent and goal-independent analyses are considered. Variations on the abstract domain operations are introduced, and we discuss the associated tradeoffs of precision and complexity. The experimental results indicate that this approach is a practical way

  11. Semi-automated literature mining to identify putative biomarkers of disease from multiple biofluids

    Science.gov (United States)

    2014-01-01

    Background Computational methods for mining of biomedical literature can be useful in augmenting manual searches of the literature using keywords for disease-specific biomarker discovery from biofluids. In this work, we develop and apply a semi-automated literature mining method to mine abstracts obtained from PubMed to discover putative biomarkers of breast and lung cancers in specific biofluids. Methodology A positive set of abstracts was defined by the terms ‘breast cancer’ and ‘lung cancer’ in conjunction with 14 separate ‘biofluids’ (bile, blood, breastmilk, cerebrospinal fluid, mucus, plasma, saliva, semen, serum, synovial fluid, stool, sweat, tears, and urine), while a negative set of abstracts was defined by the terms ‘(biofluid) NOT breast cancer’ or ‘(biofluid) NOT lung cancer.’ More than 5.3 million total abstracts were obtained from PubMed and examined for biomarker-disease-biofluid associations (34,296 positive and 2,653,396 negative for breast cancer; 28,355 positive and 2,595,034 negative for lung cancer). Biological entities such as genes and proteins were tagged using ABNER, and processed using Python scripts to produce a list of putative biomarkers. Z-scores were calculated, ranked, and used to determine significance of putative biomarkers found. Manual verification of relevant abstracts was performed to assess our method’s performance. Results Biofluid-specific markers were identified from the literature, assigned relevance scores based on frequency of occurrence, and validated using known biomarker lists and/or databases for lung and breast cancer [NCBI’s On-line Mendelian Inheritance in Man (OMIM), Cancer Gene annotation server for cancer genomics (CAGE), NCBI’s Genes & Disease, NCI’s Early Detection Research Network (EDRN), and others]. The specificity of each marker for a given biofluid was calculated, and the performance of our semi-automated literature mining method assessed for breast and lung cancer
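
    The z-score ranking step can be made concrete with a binomial approximation: compare how often a candidate marker appears in the positive abstracts against its background rate in the negative set. The statistic and the counts below are assumptions for illustration; the abstract does not spell out the exact formula.

        # z-score for a marker's enrichment in positive vs. negative abstracts.
        import math

        def marker_z(pos_hits, n_pos, neg_hits, n_neg):
            p0 = neg_hits / n_neg                    # background occurrence rate
            expected = n_pos * p0
            sd = math.sqrt(n_pos * p0 * (1 - p0))    # binomial approximation
            return (pos_hits - expected) / sd if sd > 0 else 0.0

        # e.g. a gene tagged in 120 of 34,296 positive abstracts but only
        # 150 of 2,653,396 negative ones (invented counts)
        print(round(marker_z(120, 34296, 150, 2653396), 1))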

  12. Flightdeck Automation Problems (FLAP) Model for Safety Technology Portfolio Assessment

    Science.gov (United States)

    Ancel, Ersin; Shih, Ann T.

    2014-01-01

    NASA's Aviation Safety Program (AvSP) develops and advances methodologies and technologies to improve air transportation safety. The Safety Analysis and Integration Team (SAIT) conducts a safety technology portfolio assessment (PA) to analyze the program content, to examine the benefits and risks of products with respect to program goals, and to support programmatic decision making. The PA process includes systematic identification of current and future safety risks as well as tracking several quantitative and qualitative metrics to ensure the program goals are addressing prominent safety risks accurately and effectively. One of the metrics within the PA process involves using quantitative aviation safety models to gauge the impact of the safety products. This paper demonstrates the role of aviation safety modeling by providing model outputs and evaluating a sample of portfolio elements using the Flightdeck Automation Problems (FLAP) model. The model enables not only ranking of the quantitative relative risk reduction impact of all portfolio elements, but also highlighting the areas with high potential impact via sensitivity and gap analyses in support of the program office. Although the model outputs are preliminary and products are notional, the process shown in this paper is essential to a comprehensive PA of NASA's safety products in the current program and future programs/projects.

  13. Analysis of an Automated Vehicle Routing Problem in Logistics considering Path Interruption

    Directory of Open Access Journals (Sweden)

    Yong Zhang

    2017-01-01

    Full Text Available The application of automated vehicles in logistics can efficiently reduce the cost of logistics and reduce the potential risks in the last mile. Considering the path restriction in the initial stage of the application of automated vehicles in logistics, the conventional model for a vehicle routing problem (VRP) is modified. Thus, the automated vehicle routing problem with time windows (AVRPTW) model considering path interruption is established. Additionally, an improved particle swarm optimisation (PSO) algorithm is designed to solve this problem. Finally, a case study is undertaken to test the validity of the model and the algorithm. Four automated vehicles are designated to execute all delivery tasks required by 25 stores. Capacities of all of the automated vehicles are almost fully utilised. Developing such research into the real problems arising in this initial period is of considerable significance for the promotion of automated vehicles in last-mile applications.
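
    A minimal PSO sketch follows (not the paper's improved variant). For routing problems a common trick, assumed here, is random-key decoding: sorting a continuous position vector yields a visit order for the stores, and the objective is a placeholder for the real AVRPTW cost with capacities, time windows, and path interruptions.

        # Bare-bones particle swarm optimisation with random-key route decoding.
        import numpy as np

        rng = np.random.default_rng(0)
        n_particles, dim, iters = 30, 25, 200        # dim = number of stores

        def cost(order):
            return np.abs(np.diff(order)).sum()      # placeholder route cost

        x = rng.random((n_particles, dim))
        v = np.zeros_like(x)
        pbest = x.copy()
        pbest_cost = np.array([cost(np.argsort(p)) for p in x])
        gbest = pbest[pbest_cost.argmin()].copy()

        for _ in range(iters):
            r1, r2 = rng.random(x.shape), rng.random(x.shape)
            v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
            x += v
            c = np.array([cost(np.argsort(p)) for p in x])
            better = c < pbest_cost
            pbest[better], pbest_cost[better] = x[better], c[better]
            gbest = pbest[pbest_cost.argmin()].copy()

        print("best order:", np.argsort(gbest), "cost:", pbest_cost.min())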

  14. Shot Automation for the National Ignition Facility

    International Nuclear Information System (INIS)

    Lagin, L J; Bettenhausen, R C; Beeler, R G; Bowers, G A; Carey, R.; Casavant, D.D.; Cline, B.D.; Demaret, R.D.; Domyancic, D.M.; Elko, S.D.; Fisher, J.M.; Hermann, M.R.; Krammen, J.E.; Kohut, T.R.; Marshall, C.D.; Mathisen, D.G.; Ludwigsen, A.P.; Patterson, Jr. R.W.; Sanchez, R.J.; Stout, E.A.; Van Arsdall, P.J.; Van Wonterghem, B.M.

    2005-01-01

    A shot automation framework has been developed and deployed during the past year to automate shots performed on the National Ignition Facility (NIF) using the Integrated Computer Control System. This framework automates a 4-8 hour shot sequence that includes inputting shot goals from a physics model, setup of the laser and diagnostics, automatic alignment of laser beams and verification of status. This sequence consists of a set of preparatory verification shots, leading to amplified system shots using a 4-minute countdown, triggering during the last 2 seconds using a high-precision timing system, followed by post-shot analysis and archiving. The framework provides for flexible, model-driven execution of scriptable automation steps called macro steps. The framework is driven by high-level shot director software that provides a restricted set of shot life cycle state transitions to 25 collaboration supervisors that automate eight-beam laser bundles and a common set of shared resources. Each collaboration supervisor commands approximately 10 subsystem shot supervisors that perform automated control and status verification. Collaboration supervisors translate shot life cycle state commands from the shot director into sequences of ''macro steps'' to be distributed to each of its shot supervisors. Each shot supervisor maintains the order of macro steps for each subsystem and supports collaboration between macro steps. They also manage failures, restarts and rejoining into the shot cycle (if necessary) and manage auto/manual macro step execution and collaborations with other collaboration supervisors. Shot supervisors execute macro step shot functions commanded by collaboration supervisors. Each macro step has database-driven verification phases and a scripted perform phase. This provides a highly flexible methodology for performing a variety of NIF shot types. Database tables define the order of work and dependencies (workflow) of macro steps to be performed for a

  15. Automation Marketplace 2010: New Models, Core Systems

    Science.gov (United States)

    Breeding, Marshall

    2010-01-01

    In a year when a difficult economy presented fewer opportunities for immediate gains, the major industry players have defined their business strategies with fundamentally different concepts of library automation. This is no longer an industry where companies compete on the basis of the best or the most features in similar products but one where…

  16. Abstract interpretation of reactive systems : preservation of CTL*

    NARCIS (Netherlands)

    Dams, D.; Grumberg, O.; Gerth, R.

    The advent of ever more complex reactive systems in increasingly critical areas calls for the development of automated verification techniques. Model checking is one such technique, which has proven quite successful. However, the state explosion problem remains the stumbling block in many

  17. Low cost automation

    International Nuclear Information System (INIS)

    1987-03-01

    This book describes methods for building an automation plan and designing automation facilities. It covers automation of cutting and chip processes, including the basics of cutting, NC processing machines and chip handling; automation units such as drilling units, tapping units, boring units, milling units and slide units; the application of hydraulics, including its characteristics and basic hydraulic circuits; the application of pneumatics; and the kinds and applications of automation in processes such as assembly, transportation, automatic machines and factory automation.

  18. Flexible End2End Workflow Automation of Hit-Discovery Research.

    Science.gov (United States)

    Holzmüller-Laue, Silke; Göde, Bernd; Thurow, Kerstin

    2014-08-01

    The article considers a new approach to more complex laboratory automation at the workflow layer. The authors propose the automation of end2end workflows. The combination of all relevant subprocesses (whether automated or manually performed, and regardless of the organizational unit in which they are carried out) results in end2end processes that include all result dependencies. The end2end approach focuses not only on the classical experiments in synthesis or screening, but also on auxiliary processes such as the production and storage of chemicals, cell culturing, and maintenance, as well as preparatory activities and analyses of experiments. Furthermore, the connection of control flow and data flow in the same process model reduces the effort of data transfer between the involved systems, including the necessary data transformations. This end2end laboratory automation can be realized effectively with the modern methods of business process management (BPM). This approach is based on the new standardization of the process-modeling notation Business Process Model and Notation 2.0. In drug discovery, several scientific disciplines act together with manifold modern methods, technologies, and a wide range of automated instruments for the discovery and design of target-based drugs. The article discusses the novel BPM-based automation concept with an implemented example of a high-throughput screening of previously synthesized compound libraries. © 2014 Society for Laboratory Automation and Screening.

  19. Automated Controller Synthesis for non-Deterministic Piecewise-Affine Hybrid Systems

    DEFF Research Database (Denmark)

    Grunnet, Jacob Deleuran

    formations. This thesis uses a hybrid systems model of a satellite formation with possible actuator faults as a motivating example for developing an automated control synthesis method for non-deterministic piecewise-affine hybrid systems (PAHS). The method not only opens an avenue for further research... in fault tolerant satellite formation control, but can be used to synthesise controllers for a wide range of systems where external events can alter the system dynamics. The synthesis method relies on abstracting the hybrid system into a discrete game, finding a winning strategy for the game meeting... game and linear optimisation solvers for controller refinement. To illustrate the efficacy of the method a recurring satellite formation example including actuator faults has been used. The end result is the application of PAHSCTRL on the example, showing synthesis and simulation of a fault tolerant...

  20. 3D model assisted fully automated scanning laser Doppler vibrometer measurements

    Science.gov (United States)

    Sels, Seppe; Ribbens, Bart; Bogaerts, Boris; Peeters, Jeroen; Vanlanduit, Steve

    2017-12-01

    In this paper, a new fully automated scanning laser Doppler vibrometer (LDV) measurement technique is presented. In contrast to existing scanning LDV techniques, which use a 2D camera for the manual selection of sample points, we use a 3D Time-of-Flight camera in combination with a CAD file of the test object to automatically obtain measurements at pre-defined locations. The proposed procedure allows users to test prototypes in a shorter time because physical measurement locations are determined without user interaction. Another benefit of this methodology is that it incorporates automatic mapping between a CAD model and the vibration measurements. This mapping can be used to visualize measurements directly on a 3D CAD model. The proposed method is illustrated with vibration measurements of an unmanned aerial vehicle.

  1. Automation and Job Satisfaction among Reference Librarians.

    Science.gov (United States)

    Whitlatch, Jo Bell

    1991-01-01

    Discussion of job satisfaction and the level of job performance focuses on the effect of automation on job satisfaction among reference librarians. The influence of stress is discussed, a job strain model is explained, and examples of how to design a job to reduce the stress caused by automation are given. (12 references) (LRW)

  2. Automated service quality and its behavioural consequences in CRM Environment: A structural equation modeling and causal loop diagramming approach

    Directory of Open Access Journals (Sweden)

    Arup Kumar Baksi

    2012-08-01

    Full Text Available Information technology induced communications (ICTs) have revolutionized the operational aspects of the service sector and have triggered a perceptual shift in service quality, as rapid dis-intermediation has changed the mode of access to services on the part of consumers. ICT-enabled services have further stimulated the perception of automated service quality, with renewed dimensions and their subsequent significance in influencing the behavioural outcomes of consumers. Customer Relationship Management (CRM) has emerged as an offshoot of this technological breakthrough as it ensures service-encapsulation by integrating people, process and technology. This paper attempts to explore the relationship between automated service quality and its behavioural consequences in a relatively novel business philosophy, CRM. The study has been conducted on the largest public sector bank of India, the State Bank of India (SBI), at Kolkata, which successfully completed its decade-long operational automation in the year 2008. The study used structural equation modeling (SEM) to justify the proposed model construct and causal loop diagramming (CLD) to depict the negative and positive linkages between the variables.

  3. Automated Test Case Generation for an Autopilot Requirement Prototype

    Science.gov (United States)

    Giannakopoulou, Dimitra; Rungta, Neha; Feary, Michael

    2011-01-01

    Designing safety-critical automation with robust human interaction is a difficult task that is susceptible to a number of known Human-Automation Interaction (HAI) vulnerabilities. It is therefore essential to develop automated tools that provide support both in the design and rapid evaluation of such automation. The Automation Design and Evaluation Prototyping Toolset (ADEPT) enables the rapid development of an executable specification for automation behavior and user interaction. ADEPT supports a number of analysis capabilities, thus enabling the detection of HAI vulnerabilities early in the design process, when modifications are less costly. In this paper, we advocate the introduction of a new capability to model-based prototyping tools such as ADEPT. The new capability is based on symbolic execution that allows us to automatically generate quality test suites based on the system design. Symbolic execution is used to generate both user input and test oracles: user input drives the testing of the system implementation, and test oracles ensure that the system behaves as designed. We present early results in the context of a component in the Autopilot system modeled in ADEPT, and discuss the challenges of test case generation in the HAI domain.

  4. Meta-domains for Automated System Identification

    National Research Council Canada - National Science Library

    Easley, Matthew; Bradley, Elizabeth

    2000-01-01

    .... In particular we introduce a new structure for automated model building known as a meta-domain which, when instantiated with domain-specific components, tailors the space of candidate models to the system at hand...

  5. Mining Repair Actions for Guiding Automated Program Fixing

    OpenAIRE

    Martinez , Matias; Monperrus , Martin

    2012-01-01

    Automated program fixing consists of generating source code in order to fix bugs in an automated manner. Our intuition is that automated program fixing can imitate human-based program fixing. Hence, we present a method to mine repair actions from software repositories. A repair action is a small semantic modification on code such as adding a method call. We then decorate repair actions with a probability distribution also learnt from software repositories. Our probabilistic repair models enab...

  6. Geological modeling of a stratified deposit with CAD-Based solid model automation

    Directory of Open Access Journals (Sweden)

    Ayten Eser

    Full Text Available Abstract The planning stages of mining activities require many comprehensive and detailed analyses. Determining the correct orebody model is the first stage and one of the most important. Three-dimensional solid modeling is one of the significant methods for examining the position and shape of an ore deposit. Although there are many different types of mining software for determining a solid model, many users try to build geological models in the computer without knowing how these software packages work. As researchers on the subject, we wanted to answer the question "How would we do it?". For this purpose, a system was developed for generating solid models using data obtained from boreholes. Obtaining this model in an AutoCAD environment will be important for geologists and engineers. The developed programs were first tested with virtual borehole data belonging to a virtual deposit. Then the real borehole data of a cement raw material site were successfully applied. This article allows readers not only to see a clear example of the programming approach to layered deposits but also to produce more complicated software in this context. Our study serves as a window to understanding the geological modeling process.

  7. Autonomy and Automation

    Science.gov (United States)

    Shively, Jay

    2017-01-01

    A significant level of debate and confusion has surrounded the meaning of the terms autonomy and automation. Automation is a multi-dimensional concept, and we propose that Remotely Piloted Aircraft Systems (RPAS) automation should be described with reference to the specific system and task that has been automated, the context in which the automation functions, and other relevant dimensions. In this paper, we present definitions of automation, pilot in the loop, pilot on the loop and pilot out of the loop. We further propose that, in the future, the International Civil Aviation Organization (ICAO) RPAS Panel avoid the use of the terms autonomy and autonomous when referring to automated systems on board RPA. Work Group 7 proposes to develop, in consultation with other workgroups, a taxonomy of Levels of Automation for RPAS.

  8. Workshop on automated beam steering and shaping (ABS). Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Lindroos, M [ed.

    1999-09-10

    A workshop on Automated Beam Steering and Shaping (ABS) was held at CERN in December 1998. This was the first workshop dedicated to this subject. The workshop had two major goals: to review the present status of ABS algorithms and systems around the world and to create a worldwide ABS community. These proceedings contain summary reports from all sessions, contributions from several presentations held at the workshop, and a complete set of abstracts for all presentations. (orig.)

  9. Workshop on automated beam steering and shaping (ABS). Proceedings

    International Nuclear Information System (INIS)

    Lindroos, M.

    1999-01-01

    A workshop on Automated Beam Steering and Shaping (ABS) was held at CERN in December 1998. This was the first workshop dedicated to this subject. The workshop had two major goals: to review the present status of ABS algorithms and systems around the world and to create a worldwide ABS community. These proceedings contain summary reports from all sessions, contributions from several presentations held at the workshop, and a complete set of abstracts for all presentations. (orig.)

  10. Automated cell type discovery and classification through knowledge transfer

    Science.gov (United States)

    Lee, Hao-Chih; Kosoy, Roman; Becker, Christine E.

    2017-01-01

    Abstract Motivation: Recent advances in mass cytometry allow simultaneous measurements of up to 50 markers at single-cell resolution. However, the high dimensionality of mass cytometry data introduces computational challenges for automated data analysis and hinders translation of new biological understanding into clinical applications. Previous studies have applied machine learning to facilitate processing of mass cytometry data. However, manual inspection is still inevitable and is becoming the barrier to reliable large-scale analysis. Results: We present a new algorithm called Automated Cell-type Discovery and Classification (ACDC) that fully automates the classification of canonical cell populations and highlights novel cell types in mass cytometry data. Evaluations on real-world data show ACDC provides accurate and reliable estimations compared to manual gating results. Additionally, ACDC automatically classifies previously ambiguous cell types to facilitate discovery. Our findings suggest that ACDC substantially improves both reliability and interpretability of results obtained from high-dimensional mass cytometry profiling data. Availability and Implementation: A Python package (Python 3) and analysis scripts for reproducing the results are available at https://bitbucket.org/dudleylab/acdc. Contact: brian.kidd@mssm.edu or joel.dudley@mssm.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28158442

  11. Ukrainian Nuclear Society International Conference 'Modernization of the NPP with VVER reactor' (abstracts)

    International Nuclear Information System (INIS)

    Bar'yakhtar, V.G.

    1999-01-01

    Abstracts of the papers presented at the International Conference of the Ukrainian Nuclear Society 'Modernization of the NPP with VVER reactor'. The following problems are considered: improving NPP safety and reliability; reactor modernization and lifetime prolongation; increasing reactor operating characteristics; methods for increasing the capacity factor, including refueling control and maintenance control; technical and economical aspects of NPP modernization; modernization of the automated control systems of the fuel process at NPPs; technical features and methods for continued radiation and technology control at NPPs; and training, increasing staff qualification and NPP modernization

  12. Trust in automation: integrating empirical evidence on factors that influence trust.

    Science.gov (United States)

    Hoff, Kevin Anthony; Bashir, Masooda

    2015-05-01

    We systematically review recent empirical research on factors that influence trust in automation to present a three-layered trust model that synthesizes existing knowledge. Much of the existing research on factors that guide human-automation interaction is centered around trust, a variable that often determines the willingness of human operators to rely on automation. Studies have utilized a variety of different automated systems in diverse experimental paradigms to identify factors that impact operators' trust. We performed a systematic review of empirical research on trust in automation from January 2002 to June 2013. Papers were deemed eligible only if they reported the results of a human-subjects experiment in which humans interacted with an automated system in order to achieve a goal. Additionally, a relationship between trust (or a trust-related behavior) and another variable had to be measured. Altogether, 101 papers containing 127 eligible studies were included in the review. Our analysis revealed three layers of variability in human-automation trust (dispositional trust, situational trust, and learned trust), which we organize into a model. We propose design recommendations for creating trustworthy automation and identify environmental conditions that can affect the strength of the relationship between trust and reliance. Future research directions are also discussed for each layer of trust. Our three-layered trust model provides a new lens for conceptualizing the variability of trust in automation. Its structure can be applied to help guide future research and develop training interventions and design procedures that encourage appropriate trust. © 2014, Human Factors and Ergonomics Society.

  13. Problems of complex automation of process at a NPP

    International Nuclear Information System (INIS)

    Naumov, A.V.

    1981-01-01

    The importance of theoretical investigation in determining the level and quality of NPP automation is discussed. Achievements in this direction are briefly reviewed using the example of domestic NPPs. Two models for solving the problem of distributing functions between the operator and technical means are outlined. The processes subject to automation are enumerated. Developing optimal methods for automatic power control of power units is one of the most important problems of NPP automation. Automation of discrete operations becomes especially important during start-up, shut-down or emergency situations [ru]

  14. AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT ...

    Science.gov (United States)

    The Automated Geospatial Watershed Assessment tool (AGWA) is a GIS interface jointly developed by the USDA Agricultural Research Service, the U.S. Environmental Protection Agency, the University of Arizona, and the University of Wyoming to automate the parameterization and execution of the Soil and Water Assessment Tool (SWAT) and KINEmatic Runoff and EROSion (KINEROS2) hydrologic models. The application of these two models allows AGWA to conduct hydrologic modeling and watershed assessments at multiple temporal and spatial scales. AGWA’s current outputs are runoff (volumes and peaks) and sediment yield, plus nitrogen and phosphorus with the SWAT model. AGWA uses commonly available GIS data layers to fully parameterize, execute, and visualize results from both models. Through an intuitive interface the user selects an outlet from which AGWA delineates and discretizes the watershed using a Digital Elevation Model (DEM) based on the individual model requirements. The watershed model elements are then intersected with soils and land cover data layers to derive the requisite model input parameters. The chosen model is then executed, and the results are imported back into AGWA for visualization. This allows managers to identify potential problem areas where additional monitoring can be undertaken or mitigation activities can be focused. AGWA also has tools to apply an array of best management practices. There are currently two versions of AGWA available; AGWA 1.5 for

  15. Analysis of complex networks using aggressive abstraction.

    Energy Technology Data Exchange (ETDEWEB)

    Colbaugh, Richard; Glass, Kristin.; Willard, Gerald

    2008-10-01

    This paper presents a new methodology for analyzing complex networks in which the network of interest is first abstracted to a much simpler (but equivalent) representation, the required analysis is performed using the abstraction, and analytic conclusions are then mapped back to the original network and interpreted there. We begin by identifying a broad and important class of complex networks which admit abstractions that are simultaneously dramatically simplifying and property preserving -- we call these aggressive abstractions -- and which can therefore be analyzed using the proposed approach. We then introduce and develop two forms of aggressive abstraction: 1) finite state abstraction, in which dynamical networks with uncountable state spaces are modeled using finite state systems, and 2) one-dimensional abstraction, whereby high dimensional network dynamics are captured in a meaningful way using a single scalar variable. In each case, the property preserving nature of the abstraction process is rigorously established and efficient algorithms are presented for computing the abstraction. The considerable potential of the proposed approach to complex networks analysis is illustrated through case studies involving vulnerability analysis of technological networks and predictive analysis for social processes.
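
    As a toy illustration of the first form, the Python sketch below approximates a map on the unit interval (an uncountable state space) by a transition relation between N interval cells. The map, cell count and sampling scheme are invented for the example, and dense sampling only estimates the relation; a sound abstraction of the kind the paper establishes would use interval arithmetic or symbolic reasoning instead.

        # Finite-state abstraction of a continuous map on [0, 1]: the state
        # space is partitioned into N cells, and the reachable cells from
        # each cell are estimated by dense sampling.
        N = 10
        f = lambda x: 3.7 * x * (1.0 - x)        # logistic map (toy system)
        cell = lambda x: min(int(x * N), N - 1)  # which cell a point falls in

        transitions = {i: set() for i in range(N)}
        samples_per_cell = 1000
        for i in range(N):
            for k in range(samples_per_cell):
                x = (i + k / samples_per_cell) / N   # sample inside cell i
                transitions[i].add(cell(f(x)))

        for i, succ in transitions.items():
            print(f"cell {i} -> {sorted(succ)}")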

  16. Towards Abstract Interpretation of Epistemic Logic

    DEFF Research Database (Denmark)

    Ajspur, Mai; Gallagher, John Patrick

    applicable to infinite models. The abstract model-checker allows model-checking with infinite-state models. When applied to the problem of whether M |= φ, it terminates and returns the set of states in M at which φ might hold. If the set is empty, then M definitely does not satisfy φ, while if the set is non...

  17. Abstraction in artificial intelligence and complex systems

    CERN Document Server

    Saitta, Lorenza

    2013-01-01

    Abstraction is a fundamental mechanism underlying both human and artificial perception, representation of knowledge, reasoning and learning. This mechanism plays a crucial role in many disciplines, notably Computer Programming, Natural and Artificial Vision, Complex Systems, Artificial Intelligence and Machine Learning, Art, and Cognitive Sciences. This book first provides the reader with an overview of the notions of abstraction proposed in various disciplines by comparing both commonalities and differences.  After discussing the characterizing properties of abstraction, a formal model, the K

  18. MATHEMATICAL MODEL OF AUTOMATED REHABILITATION SYSTEM WITH BIOLOGICAL FEEDBACK FOR REHABILITATION AND DEVELOPMENT OF MUSCULOSKELETAL SYSTEM

    Directory of Open Access Journals (Sweden)

    Kirill A. Kalyashin

    2013-01-01

    Full Text Available In order to increase the efficiency and safety of rehabilitation of the musculoskeletal system, a model and an algorithm for patient interaction with an automated rehabilitation system with biological feedback were developed, based on the registration and management of a second functional parameter, which prevents the risk of overexertion during intensive exercise.

  19. A comprehensive remote automated mobile robot framework for deployment of compact radiation sensors and campaign management

    International Nuclear Information System (INIS)

    Mukherjee, J.K.

    2005-01-01

    Remote controlled on-line sensing with compact radiation sensors for interactive, fast contamination mapping and source localization needs integrated command control and machine-intelligence-supported operation. Combining remote operation capability with automated sensing requires a comprehensive framework that encompasses a precise real-time remotely controlled agent; reliable remote communication techniques for unified command and sensory data exchange, with bandwidth allocation optimized between real-time low-volume transfers and moderate-speed bulk data transfers; and data abstraction for seamless multi-domain abstraction in a single environment. The paper describes an indigenously developed comprehensive framework that achieves vertical integration of layered services and complex functions, explains its implementation and details its operation with examples of on-line application sessions. Several important features are dealt with, including precise remote control of sensor trajectory generation in real time by digital signal processing, prediction and visualization of the remote agent's locus and attitude, spatial modeling of fixed features of the monitored region, and localization of activity sources over the mapped region. (author)

  20. Désambiguisation lexicale à l'aide d'automates de polarités

    OpenAIRE

    Perrier , Guy

    2008-01-01

    We present a lexical disambiguation method for lexicalized grammars. The method uses an abstraction of the source grammar based on the notion of polarity. It also uses automata to factorize the information retained during the disambiguation process.

  1. Automated parameter tuning applied to sea ice in a global climate model

    Science.gov (United States)

    Roach, Lettie A.; Tett, Simon F. B.; Mineter, Michael J.; Yamazaki, Kuniko; Rae, Cameron D.

    2018-01-01

    This study investigates the hypothesis that a significant portion of spread in climate model projections of sea ice is due to poorly-constrained model parameters. New automated methods for optimization are applied to historical sea ice in a global coupled climate model (HadCM3) in order to calculate the combination of parameters required to reduce the difference between simulation and observations to within the range of model noise. The optimized parameters result in a simulated sea-ice time series which is more consistent with Arctic observations throughout the satellite record (1980-present), particularly in the September minimum, than the standard configuration of HadCM3. Divergence from observed Antarctic trends and mean regional sea ice distribution reflects broader structural uncertainty in the climate model. We also find that the optimized parameters do not cause adverse effects on the model climatology. This simple approach provides evidence for the contribution of parameter uncertainty to spread in sea ice extent trends and could be customized to investigate uncertainties in other climate variables.

  2. An integrated system for buildings’ energy-efficient automation: Application in the tertiary sector

    International Nuclear Information System (INIS)

    Marinakis, Vangelis; Doukas, Haris; Karakosta, Charikleia; Psarras, John

    2013-01-01

    Highlights: ► We developed an interactive software for building automation systems. ► Monitoring of energy consumption in real time. ► Optimization of energy consumption implementing appropriate control scenarios. ► Pilot appraisal on remote control of active systems in the tertiary sector building. ► Significant decrease in energy and operating cost of A/C system. -- Abstract: Although integrated building automation systems have become increasingly popular, an integrated system which includes remote control technology to enable real-time monitoring of the energy consumption by energy end-users, as well as optimization functions is required. To respond to this common interest, the main aim of the paper is to present an integrated system for buildings’ energy-efficient automation. The proposed system is based on a prototype software tool for the simulation and optimization of energy consumption in the building sector, enhancing the interactivity of building automation systems. The system can incorporate energy-efficient automation functions for heating, cooling and/or lighting based on recent guidance and decisions of the National Law, energy efficiency requirements of EN 15232 and ISO 50001 Energy Management Standard among others. The presented system was applied to a supermarket building in Greece and focused on the remote control of active systems.

  3. Modelling and experimental study for automated congestion driving

    NARCIS (Netherlands)

    Urhahne, Joseph; Piastowski, P.; van der Voort, Mascha C.; Bebis, G; Boyle, R.; Parvin, B.; Koracin, D.; Pavlidis, I.; Feris, R.; McGraw, T.; Elendt, M.; Kopper, R.; Ragan, E.; Ye, Z.; Weber, G.

    2015-01-01

    Taking a collaborative approach in automated congestion driving with a Traffic Jam Assist system requires the driver to take over control in certain traffic situations. In order to warn the driver appropriately, warnings are issued (“pay attention” vs. “take action”) due to a control transition

  4. Home Automation

    OpenAIRE

    Ahmed, Zeeshan

    2010-01-01

    In this paper I briefly discuss the importance of home automation systems. Going into the details, I briefly present a real-time, software- and hardware-oriented house automation research project that was designed and implemented, capable of automating a house's electricity and providing a security system to detect the presence of unexpected behavior.

  5. The Complexity of Abstract Machines

    Directory of Open Access Journals (Sweden)

    Beniamino Accattoli

    2017-01-01

    Full Text Available The lambda-calculus is a peculiar computational model whose definition does not come with a notion of machine. Unsurprisingly, implementations of the lambda-calculus have been studied for decades. Abstract machines are implementation schemas for fixed evaluation strategies that are a compromise between theory and practice: they are concrete enough to provide a notion of machine and abstract enough to avoid the many intricacies of actual implementations. There is an extensive literature about abstract machines for the lambda-calculus, and yet—quite mysteriously—the efficiency of these machines with respect to the strategy that they implement has almost never been studied. This paper provides an unusual introduction to abstract machines, based on the complexity of their overhead with respect to the length of the implemented strategies. It is conceived to be a tutorial, focusing on the case study of implementing the weak head (call-by-name) strategy, and yet it is an original re-elaboration of known results. Moreover, some of the observations contained here never appeared in print before.

  6. Set of information technologies and their role in automation of agricultural production

    OpenAIRE

    V. V. Al’t

    2018-01-01

    Modern agro-industrial enterprises are characterized by a high level of automation of technological processes. Their level of technological development corresponds to the fifth and sixth technological revolutions. Automatic and automated technologies in crop and livestock production use data from internet technologies, Global Positioning Satellite surveys and observations, and automated operation of machine and tractor aggregates. The model nucleus and a series of information models of a...

  7. Automating calibration, sensitivity and uncertainty analysis of complex models using the R package Flexible Modeling Environment (FME): SWAT as an example

    Science.gov (United States)

    Wu, Y.; Liu, S.

    2012-01-01

    Parameter optimization and uncertainty issues are a great challenge for the application of large environmental models like the Soil and Water Assessment Tool (SWAT), which is a physically-based hydrological model for simulating water and nutrient cycles at the watershed scale. In this study, we present a comprehensive modeling environment for SWAT, including automated calibration, and sensitivity and uncertainty analysis capabilities through integration with the R package Flexible Modeling Environment (FME). To address challenges (e.g., calling the model in R and transferring variables between Fortran and R) in developing such a two-language coupling framework, 1) we converted the Fortran-based SWAT model to an R function (R-SWAT) using the RFortran platform, and alternatively 2) we compiled SWAT as a Dynamic Link Library (DLL). We then wrapped SWAT (via R-SWAT) with FME to perform complex applications including parameter identifiability, inverse modeling, and sensitivity and uncertainty analysis in the R environment. The final R-SWAT-FME framework has the following key functionalities: automatic initialization of R, running Fortran-based SWAT and R commands in parallel, transferring parameters and model output between SWAT and R, and inverse modeling with visualization. To examine this framework and demonstrate how it works, a case study simulating streamflow in the Cedar River Basin in Iowa in the United States was used, and we compared it with the built-in auto-calibration tool of SWAT in parameter optimization. Results indicate that both methods performed well and similarly in searching a set of optimal parameters. Nonetheless, R-SWAT-FME is more attractive due to its instant visualization, and its potential to take advantage of other R packages (e.g., inverse modeling and statistical graphics). The methods presented in the paper are readily adaptable to other model applications that require capability for automated calibration, and sensitivity and uncertainty
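
    The wrap-and-calibrate pattern described above can be sketched in a few lines of Python rather than R/FME. In the sketch below, the recession-curve "model" is a deliberately simple stand-in for SWAT, the observations are synthetic, and numpy and scipy are assumed to be available.

        import numpy as np
        from scipy.optimize import least_squares

        def model(params, t):
            """Stand-in for a watershed model: a simple recession curve."""
            q0, k = params
            return q0 * np.exp(-k * t)

        t = np.linspace(0, 30, 31)
        true_params = (12.0, 0.15)
        rng = np.random.default_rng(42)
        observed = model(true_params, t) + rng.normal(0, 0.2, t.size)  # synthetic data

        def residuals(params):
            # Inverse modeling: minimize model-observation mismatch.
            return model(params, t) - observed

        fit = least_squares(residuals, x0=[5.0, 0.5], bounds=([0, 0], [100, 5]))
        print("calibrated parameters:", fit.x)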

  8. Automated computer-based CT stratification as a predictor of outcome in hypersensitivity pneumonitis

    International Nuclear Information System (INIS)

    Jacob, Joseph; Mak, S.M.; Mok, W.; Hansell, D.M.; Bartholmai, B.J.; Rajagopalan, S.; Karwoski, R.; Della Casa, G.; Sugino, K.; Walsh, S.L.F.; Wells, A.U.

    2017-01-01

    Hypersensitivity pneumonitis (HP) has a variable clinical course. Modelling of quantitative CALIPER-derived CT data can identify distinct disease phenotypes. Mortality prediction using CALIPER analysis was compared to the interstitial lung disease gender, age, physiology (ILD-GAP) outcome model. CALIPER CT analysis of parenchymal patterns in 98 consecutive HP patients was compared to visual CT scoring by two radiologists. Functional indices, including forced vital capacity (FVC) and diffusion capacity for carbon monoxide (DLco), were evaluated in univariate and multivariate Cox mortality models. Automated stratification of CALIPER scores was evaluated against outcome models. Univariate predictors of mortality included visual and CALIPER CT fibrotic patterns, and all functional indices. Multivariate analyses identified only two independent predictors of mortality: CALIPER reticular pattern (p = 0.001) and DLco (p < 0.0001). Automated stratification distinguished three distinct HP groups (log-rank test p < 0.0001). Substitution of automated stratified groups for FVC and DLco in the ILD-GAP model demonstrated no loss of model strength (C-Index = 0.73 for both models). Model strength improved when automated stratified groups were combined with the ILD-GAP model (C-Index = 0.77). CALIPER-derived variables are the strongest CT predictors of mortality in HP. Automated CT stratification is equivalent to functional indices in the ILD-GAP model for predicting outcome in HP. (orig.)

  9. Automated computer-based CT stratification as a predictor of outcome in hypersensitivity pneumonitis

    Energy Technology Data Exchange (ETDEWEB)

    Jacob, Joseph; Mak, S.M.; Mok, W.; Hansell, D.M. [Royal Brompton and Harefield NHS Foundation Trust, Department of Radiology, Royal Brompton Hospital, London (United Kingdom); Bartholmai, B.J. [Mayo Clinic Rochester, Division of Radiology, Rochester, MN (United States); Rajagopalan, S.; Karwoski, R. [Mayo Clinic Rochester, Biomedical Imaging Resource, Rochester, MN (United States); Della Casa, G. [Universita degli Studi di Modena e Reggio Emilia, Modena, Emilia-Romagna (Italy); Sugino, K. [Toho University Omori Medical Centre, Tokyo (Japan); Walsh, S.L.F. [Kings College Hospital, London (United Kingdom); Wells, A.U. [Royal Brompton and Harefield NHS Foundation Trust, Interstitial Lung Disease Unit, Royal Brompton Hospital, London (United Kingdom)

    2017-09-15

    Hypersensitivity pneumonitis (HP) has a variable clinical course. Modelling of quantitative CALIPER-derived CT data can identify distinct disease phenotypes. Mortality prediction using CALIPER analysis was compared to the interstitial lung disease gender, age, physiology (ILD-GAP) outcome model. CALIPER CT analysis of parenchymal patterns in 98 consecutive HP patients was compared to visual CT scoring by two radiologists. Functional indices, including forced vital capacity (FVC) and diffusion capacity for carbon monoxide (DLco), were evaluated in univariate and multivariate Cox mortality models. Automated stratification of CALIPER scores was evaluated against outcome models. Univariate predictors of mortality included visual and CALIPER CT fibrotic patterns, and all functional indices. Multivariate analyses identified only two independent predictors of mortality: CALIPER reticular pattern (p = 0.001) and DLco (p < 0.0001). Automated stratification distinguished three distinct HP groups (log-rank test p < 0.0001). Substitution of automated stratified groups for FVC and DLco in the ILD-GAP model demonstrated no loss of model strength (C-Index = 0.73 for both models). Model strength improved when automated stratified groups were combined with the ILD-GAP model (C-Index = 0.77). CALIPER-derived variables are the strongest CT predictors of mortality in HP. Automated CT stratification is equivalent to functional indices in the ILD-GAP model for predicting outcome in HP. (orig.)

  10. ASAP: an environment for automated preprocessing of sequencing data

    Directory of Open Access Journals (Sweden)

    Torstenson Eric S

    2013-01-01

    Full Text Available Abstract Background Next-generation sequencing (NGS) has yielded an unprecedented amount of data for genetics research. It is a daunting task to process the data from raw sequence reads to variant calls, and manually processing this data can significantly delay downstream analysis and increase the possibility for human error. The research community has produced tools to properly prepare sequence data for analysis and established guidelines on how to apply those tools to achieve the best results; however, existing pipeline programs to automate the process through its entirety are either inaccessible to investigators, or web-based and require a certain amount of administrative expertise to set up. Findings Advanced Sequence Automated Pipeline (ASAP) was developed to provide a framework for automating the translation of sequencing data into annotated variant calls with the goal of minimizing user involvement without the need for dedicated hardware or administrative rights. ASAP works both on computer clusters and on standalone machines with minimal human involvement and maintains high data integrity, while allowing complete control over the configuration of its component programs. It offers an easy-to-use interface for submitting and tracking jobs as well as resuming failed jobs. It also provides tools for quality checking and for dividing jobs into pieces for maximum throughput. Conclusions ASAP provides an environment for building an automated pipeline for NGS data preprocessing. This environment is flexible for use and future development. It is freely available at http://biostat.mc.vanderbilt.edu/ASAP.

  11. On the engineering design for systematic integration of agent-orientation in industrial automation.

    Science.gov (United States)

    Yu, Liyong; Schüller, Andreas; Epple, Ulrich

    2014-09-01

    In today's automation industry, agent-oriented development of system functionalities appears to have great potential for increasing the autonomy and flexibility of complex operations, while lowering the workload of users. In this paper, we present a reference model for the harmonious and systematic integration of agent-orientation in industrial automation. Considering compatibility with existing automation systems and best practice, this model combines the advantages of function block technology, service orientation and native description methods from the automation standard IEC 61131-3. This approach can be applied as a guideline for the engineering design of future agent-oriented automation systems. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.

  12. Hydrogen abstraction reactions by amide electron adducts

    International Nuclear Information System (INIS)

    Sevilla, M.D.; Sevilla, C.L.; Swarts, S.

    1982-01-01

    Electron reactions with a number of peptide model compounds (amides and N-acetylamino acids) in aqueous glasses at low temperature have been investigated using ESR spectroscopy. The radicals produced by electron attachment to amides, RC(OD)NDR', are found to act as hydrogen abstracting agents. For example, the propionamide electron adduct is found to abstract from its parent propionamide. Electron adducts of the other amides investigated show similar behavior, except for the acetamide electron adduct, which does not abstract from its parent compound but does abstract from other amides. The tendency toward abstraction for amide electron adducts is compared to that of electron adducts of several carboxylic acids, ketones, aldehydes and esters. The comparison suggests the hydrogen abstraction tendency of the various deuterated electron adducts (DEAs) to be in the following order: aldehyde DEA > acid DEA ≈ ester DEA > ketone DEA > amide DEA. In basic glasses the hydrogen abstraction ability of the amide electron adducts is maintained until the concentration of base is increased sufficiently to convert the DEA to its anionic form, RC(O-)ND2. In this form the hydrogen abstracting ability of the radical is greatly diminished. Similar results were found for the ester and carboxylic acid DEAs tested. (author)

  13. Abstracting audit data for lightweight intrusion detection

    KAUST Repository

    Wang, Wei

    2010-01-01

    High-speed processing of massive audit data is crucial for an anomaly Intrusion Detection System (IDS) to achieve real-time performance during detection. Abstracting audit data is a potential solution to improve the efficiency of data processing. In this work, we propose two strategies of data abstraction in order to build a lightweight detection model. The first strategy is exemplar extraction and the second is attribute abstraction. Two clustering algorithms, Affinity Propagation (AP) as well as traditional k-means, are employed to extract the exemplars, and Principal Component Analysis (PCA) is employed to abstract important attributes (a.k.a. features) from the audit data. Real HTTP traffic data collected in our institute as well as KDD 1999 data are used to validate the two strategies of data abstraction. The extensive test results show that the process of exemplar extraction significantly improves the detection efficiency and has a better detection performance than PCA in data abstraction. © 2010 Springer-Verlag.
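
    A hedged sketch of the two strategies on synthetic data is shown below (the HTTP and KDD 1999 data sets themselves are not reproduced). It uses scikit-learn's KMeans and PCA; the record dimensions and cluster counts are arbitrary choices for illustration, and the paper's Affinity Propagation variant is not shown.

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(0)
        audit = rng.normal(size=(5000, 40))   # stand-in audit records

        # Strategy 1: exemplar extraction -- replace the raw records by a
        # small set of cluster centers a detector can be trained on.
        exemplars = KMeans(n_clusters=50, n_init=10,
                           random_state=0).fit(audit).cluster_centers_

        # Strategy 2: attribute abstraction -- project the records onto the
        # principal components that capture most of the variance.
        reduced = PCA(n_components=10).fit_transform(audit)

        print(exemplars.shape, reduced.shape)   # (50, 40) (5000, 10)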

  14. Automated Generation of Fault Management Artifacts from a Simple System Model

    Science.gov (United States)

    Kennedy, Andrew K.; Day, John C.

    2013-01-01

    Our understanding of off-nominal behavior - failure modes and fault propagation - in complex systems is often based purely on engineering intuition; specific cases are assessed in an ad hoc fashion as a (fallible) fault management engineer sees fit. This work is an attempt to provide a more rigorous approach to this understanding and assessment by automating the creation of a fault management artifact, the Failure Modes and Effects Analysis (FMEA) through querying a representation of the system in a SysML model. This work builds off the previous development of an off-nominal behavior model for the upcoming Soil Moisture Active-Passive (SMAP) mission at the Jet Propulsion Laboratory. We further developed the previous system model to more fully incorporate the ideas of State Analysis, and it was restructured in an organizational hierarchy that models the system as layers of control systems while also incorporating the concept of "design authority". We present software that was developed to traverse the elements and relationships in this model to automatically construct an FMEA spreadsheet. We further discuss extending this model to automatically generate other typical fault management artifacts, such as Fault Trees, to efficiently portray system behavior, and depend less on the intuition of fault management engineers to ensure complete examination of off-nominal behavior.
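
    The following Python sketch illustrates the flavor of such a traversal on a minimal stand-in model; the paper itself queries a SysML model, and the components, failure modes and propagation rule here are all hypothetical.

        from dataclasses import dataclass, field

        @dataclass
        class Component:
            name: str
            failure_modes: list
            feeds: list = field(default_factory=list)   # downstream components

        battery = Component("Battery", ["undervoltage", "open circuit"])
        radio = Component("Radio", ["loss of downlink"])
        battery.feeds.append(radio)

        def fmea_rows(root):
            """Depth-first traversal: each failure mode is paired with the
            downstream components it could propagate to."""
            rows, stack = [], [root]
            while stack:
                comp = stack.pop()
                for mode in comp.failure_modes:
                    effects = [d.name for d in comp.feeds] or ["local only"]
                    rows.append((comp.name, mode, "; ".join(effects)))
                stack.extend(comp.feeds)
            return rows

        for row in fmea_rows(battery):   # rows of an FMEA-style table
            print(row)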

  15. Method for automated building of spindle thermal model with use of CAE system

    Science.gov (United States)

    Kamenev, S. V.

    2018-03-01

    The spindle is one of the most important units of a metal-cutting machine tool. Its performance is critical for minimizing machining error, especially thermal error. Various methods are applied to improve the thermal behaviour of spindle units. One of the most important is mathematical modelling based on finite element analysis. The most common approach to its realization is the use of CAE systems. This approach, however, is not capable of addressing a number of important effects that need to be taken into consideration for proper simulation. In the present article, the authors propose a solution to overcome these disadvantages using automated thermal model building for the spindle unit utilizing the CAE system ANSYS.

  16. Framework for Human-Automation Collaboration: Conclusions from Four Studies

    Energy Technology Data Exchange (ETDEWEB)

    Oxstrand, Johanna [Idaho National Lab. (INL), Idaho Falls, ID (United States); Le Blanc, Katya L. [Idaho National Lab. (INL), Idaho Falls, ID (United States); O' Hara, John [Brookhaven National Lab. (BNL), Upton, NY (United States); Joe, Jeffrey C. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Whaley, April M. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Medema, Heather [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2013-11-01

    The Human Automation Collaboration (HAC) research project is investigating how advanced technologies that are planned for Advanced Small Modular Reactors (AdvSMR) will affect the performance and the reliability of the plant from a human factors and human performance perspective. The HAC research effort investigates the consequences of allocating functions between the operators and automated systems. More specifically, the research team is addressing how to best design the collaboration between the operators and the automated systems in a manner that has the greatest positive impact on overall plant performance and reliability. Oxstrand et al. (2013 - March) describe the efforts conducted by the researchers to identify the research needs for HAC. The research team reviewed the literature on HAC, developed a model of HAC, and identified gaps in the existing knowledge of human-automation collaboration. As described in Oxstrand et al. (2013 - June), the team then prioritized the research topics identified based on the specific needs in the context of AdvSMR. The prioritization was based on two sources of input: 1) the preliminary functions and tasks, and 2) the model of HAC. As a result, three analytical studies were planned and conducted: 1) Models of Teamwork, 2) Standardized HAC Performance Measurement Battery, and 3) Initiators and Triggering Conditions for Adaptive Automation. Additionally, one field study was conducted at Idaho Falls Power.

  17. Operator-based metric for nuclear operations automation assessment

    Energy Technology Data Exchange (ETDEWEB)

    Zacharias, G.L.; Miao, A.X.; Kalkan, A. [Charles River Analytics Inc., Cambridge, MA (United States)] [and others

    1995-04-01

    Continuing advances in real-time computational capabilities will support enhanced levels of smart automation and AI-based decision-aiding systems in the nuclear power plant (NPP) control room of the future. To support development of these aids, we describe in this paper a research tool, and more specifically, a quantitative metric, to assess the impact of proposed automation/aiding concepts in a manner that can account for a number of interlinked factors in the control room environment. In particular, we describe a cognitive operator/plant model that serves as a framework for integrating the operator's information-processing capabilities with his procedural knowledge, to provide insight as to how situations are assessed by the operator, decisions made, procedures executed, and communications conducted. Our focus is on the situation assessment (SA) behavior of the operator, the development of a quantitative metric reflecting overall operator awareness, and the use of this metric in evaluating automation/aiding options. We describe the results of a model-based simulation of a selected emergency scenario, and metric-based evaluation of a range of contemplated NPP control room automation/aiding options. The results demonstrate the feasibility of model-based analysis of contemplated control room enhancements, and highlight the need for empirical validation.

  18. Grounding abstractness: Abstract concepts and the activation of the mouth

    Directory of Open Access Journals (Sweden)

    Anna M Borghi

    2016-10-01

    Full Text Available One key issue for theories of cognition is how abstract concepts, such as freedom, are represented. According to the WAT (Words As social Tools) proposal, abstract concepts activate both sensorimotor and linguistic/social information, and their acquisition modality involves linguistic experience more than the acquisition of concrete concepts does. We report an experiment in which participants were presented with abstract and concrete definitions followed by concrete and abstract target-words. When the definition and the word matched, participants were required to press a key, either with the hand or with the mouth. Response times and accuracy were recorded. As predicted, we found that abstract definitions and abstract words yielded slower responses and more errors compared to concrete definitions and concrete words. More crucially, there was an interaction between the target-words and the effector used to respond (hand, mouth). While responses with the mouth were overall slower, the advantage of the hand over the mouth responses was more marked with concrete than with abstract concepts. The results are in keeping with grounded and embodied theories of cognition and support the WAT proposal, according to which abstract concepts evoke linguistic-social information, hence activate the mouth. The mechanisms underlying the mouth activation with abstract concepts (re-enactment of the acquisition experience, or re-explanation of the word meaning, possibly through inner talk) are discussed. To our knowledge this is the first behavioral study demonstrating with real words that the advantage of the hand over the mouth is more marked with concrete than with abstract concepts, likely because of the activation of linguistic information with abstract concepts.

  19. Automated robust generation of compact 3D statistical shape models

    Science.gov (United States)

    Vrtovec, Tomaz; Likar, Bostjan; Tomazevic, Dejan; Pernus, Franjo

    2004-05-01

    Ascertaining the detailed shape and spatial arrangement of anatomical structures is important not only within diagnostic settings but also in the areas of planning, simulation, intraoperative navigation, and tracking of pathology. Robust, accurate and efficient automated segmentation of anatomical structures is difficult because of their complexity and inter-patient variability. Furthermore, the position of the patient during image acquisition, the imaging device and protocol, image resolution, and other factors induce additional variations in shape and appearance. Statistical shape models (SSMs) have proven quite successful in capturing structural variability. A possible approach to obtain a 3D SSM is to extract reference voxels by precisely segmenting the structure in one, reference image. The corresponding voxels in other images are determined by registering the reference image to each other image. The SSM obtained in this way describes statistically plausible shape variations over the given population as well as variations due to imperfect registration. In this paper, we present a completely automated method that significantly reduces shape variations induced by imperfect registration, thus allowing a more accurate description of variations. At each iteration, the derived SSM is used for coarse registration, which is further improved by describing finer variations of the structure. The method was tested on 64 lumbar spinal column CT scans, from which 23, 38, 45, 46 and 42 volumes of interest containing vertebra L1, L2, L3, L4 and L5, respectively, were extracted. Separate SSMs were generated for each vertebra. The results show that the method is capable of reducing the variations induced by registration errors.

  20. Computer simulation and automation of data processing

    International Nuclear Information System (INIS)

    Tikhonov, A.N.

    1981-01-01

    The principles of computerized simulation and automation of data processing are presented. The automated processing system is constructed according to the module-hierarchical principle. The main operating conditions of the system are as follows: preprocessing, installation analysis, interpretation, accuracy analysis and controlling parameters. A definition of the quasireal experiment, which permits planning of the real experiment, is given. It is pointed out that realization of the quasireal experiment by means of the computerized installation model, with subsequent automated processing, makes it possible to examine the quantitative aspects of the system as a whole, and provides optimal design of installation parameters for obtaining maximum resolution [ru]

  1. Transport safety research abstracts. No. 1

    International Nuclear Information System (INIS)

    1991-07-01

    The Transport Safety Research Abstracts is a collection of reports from Member States of the International Atomic Energy Agency, and other international organizations on research in progress or just completed in the area of safe transport of radioactive material. The main aim of TSRA is to draw attention to work that is about to be published, thus enabling interested parties to obtain further information through direct correspondence with the investigators. Information contained in this issue covers work being undertaken in 6 Member States and contracted by 1 international organization; it is hoped with succeeding issues that TSRA will be able to widen this base. TSRA is modelled after other IAEA publications describing work in progress in other programme areas, namely Health Physics Research Abstracts (No. 14 was published in 1989), Waste Management Research Abstracts (No. 20 was published in 1990), and Nuclear Safety Research Abstracts (No. 2 was published in 1990)

  2. An automated cell-counting algorithm for fluorescently-stained cells in migration assays

    Directory of Open Access Journals (Sweden)

    Novielli Nicole M

    2011-10-01

    Full Text Available Abstract A cell-counting algorithm, developed in Matlab®, was created to efficiently count migrated fluorescently-stained cells on membranes from migration assays. At each concentration of cells used (10,000 and 100,000 cells), images were acquired at 2.5×, 5×, and 10× objective magnifications. Automated cell counts strongly correlated to manual counts (r2 = 0.99, P
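
    A rough re-creation of the counting idea, in Python rather than Matlab®, might look like the sketch below. It assumes scikit-image is installed; the image path is a placeholder, and the minimum-area filter of 20 pixels is an invented threshold, not a value from the paper.

        from skimage import io, filters, measure

        # Load a fluorescence image (placeholder path) as grayscale.
        img = io.imread("membrane_field.tif", as_gray=True)

        # Threshold to separate stained cells from background, then label
        # connected components and discard sub-cellular specks.
        mask = img > filters.threshold_otsu(img)
        labels = measure.label(mask)
        count = sum(1 for r in measure.regionprops(labels) if r.area >= 20)
        print("cells counted:", count)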

  3. Process automation

    International Nuclear Information System (INIS)

    Moser, D.R.

    1986-01-01

    Process automation technology has been pursued in the chemical processing industries and to a very limited extent in nuclear fuel reprocessing. Its effective use has been restricted in the past by the lack of diverse and reliable process instrumentation and the unavailability of sophisticated software designed for process control. The Integrated Equipment Test (IET) facility was developed by the Consolidated Fuel Reprocessing Program (CFRP) in part to demonstrate new concepts for control of advanced nuclear fuel reprocessing plants. A demonstration of fuel reprocessing equipment automation using advanced instrumentation and a modern, microprocessor-based control system is nearing completion in the facility. This facility provides for the synergistic testing of all chemical process features of a prototypical fuel reprocessing plant that can be attained with unirradiated uranium-bearing feed materials. The unique equipment and mission of the IET facility make it an ideal test bed for automation studies. This effort will provide for the demonstration of the plant automation concept and for the development of techniques for similar applications in a full-scale plant. A set of preliminary recommendations for implementing process automation has been compiled. Some of these concepts are not generally recognized or accepted. The automation work now under way in the IET facility should be useful to others in helping avoid costly mistakes because of the underutilization or misapplication of process automation. 6 figs

  4. Dockomatic - automated ligand creation and docking.

    Science.gov (United States)

    Bullock, Casey W; Jacob, Reed B; McDougal, Owen M; Hampikian, Greg; Andersen, Tim

    2010-11-08

    The application of computational modeling to rationally design drugs and characterize macro biomolecular receptors has proven increasingly useful due to the accessibility of computing clusters and clouds. AutoDock is a well-known and powerful software program used to model ligand to receptor binding interactions. In its current version, AutoDock requires significant amounts of user time to setup and run jobs, and collect results. This paper presents DockoMatic, a user friendly Graphical User Interface (GUI) application that eases and automates the creation and management of AutoDock jobs for high throughput screening of ligand to receptor interactions. DockoMatic allows the user to invoke and manage AutoDock jobs on a single computer or cluster, including jobs for evaluating secondary ligand interactions. It also automates the process of collecting, summarizing, and viewing results. In addition, DockoMatic automates creation of peptide ligand .pdb files from strings of single-letter amino acid abbreviations. DockoMatic significantly reduces the complexity of managing multiple AutoDock jobs by facilitating ligand and AutoDock job creation and management.

  5. Automated quantification and sizing of unbranched filamentous cyanobacteria by model based object oriented image analysis

    OpenAIRE

    Zeder, M; Van den Wyngaert, S; Köster, O; Felder, K M; Pernthaler, J

    2010-01-01

    Quantification and sizing of filamentous cyanobacteria in environmental samples or cultures are time-consuming and are often performed by using manual or semiautomated microscopic analysis. Automation of conventional image analysis is difficult because filaments may exhibit great variations in length and patchy autofluorescence. Moreover, individual filaments frequently cross each other in microscopic preparations, as deduced by modeling. This paper describes a novel approach based on object-...

  6. Evaluation of Automated Model Calibration Techniques for Residential Building Energy Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Robertson, Joseph [National Renewable Energy Lab. (NREL), Golden, CO (United States); Polly, Ben [National Renewable Energy Lab. (NREL), Golden, CO (United States); Collis, Jon [Colorado School of Mines, Golden, CO (United States)]

    2013-09-01

    This simulation study adapts and applies the general framework described in BESTEST-EX (Judkoff et al 2010) for self-testing residential building energy model calibration methods. BEopt/DOE-2.2 is used to evaluate four mathematical calibration methods in the context of monthly, daily, and hourly synthetic utility data for a 1960's-era existing home in a cooling-dominated climate. The home's model inputs are assigned probability distributions representing uncertainty ranges, random selections are made from the uncertainty ranges to define "explicit" input values, and synthetic utility billing data are generated using the explicit input values. The four calibration methods evaluated in this study are: an ASHRAE 1051-RP-based approach (Reddy and Maor 2006), a simplified simulated annealing optimization approach, a regression metamodeling optimization approach, and a simple output ratio calibration approach. The calibration methods are evaluated for monthly, daily, and hourly cases; various retrofit measures are applied to the calibrated models and the methods are evaluated based on the accuracy of predicted savings, computational cost, repeatability, automation, and ease of implementation.

  7. Evaluation of Automated Model Calibration Techniques for Residential Building Energy Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Robertson, J.; Polly, B.; Collis, J.

    2013-09-01

    This simulation study adapts and applies the general framework described in BESTEST-EX (Judkoff et al 2010) for self-testing residential building energy model calibration methods. BEopt/DOE-2.2 is used to evaluate four mathematical calibration methods in the context of monthly, daily, and hourly synthetic utility data for a 1960's-era existing home in a cooling-dominated climate. The home's model inputs are assigned probability distributions representing uncertainty ranges, random selections are made from the uncertainty ranges to define 'explicit' input values, and synthetic utility billing data are generated using the explicit input values. The four calibration methods evaluated in this study are: an ASHRAE 1051-RP-based approach (Reddy and Maor 2006), a simplified simulated annealing optimization approach, a regression metamodeling optimization approach, and a simple output ratio calibration approach. The calibration methods are evaluated for monthly, daily, and hourly cases; various retrofit measures are applied to the calibrated models and the methods are evaluated based on the accuracy of predicted savings, computational cost, repeatability, automation, and ease of implementation.
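
    As a hedged illustration of the simulated-annealing flavor of calibration, the Python sketch below fits a stand-in monthly energy model to synthetic utility data with scipy's dual_annealing. This is not the BEopt/DOE-2.2 workflow itself; the model form, bounds and data are invented for the example.

        import numpy as np
        from scipy.optimize import dual_annealing

        months = np.arange(12)
        # Synthetic monthly utility data (kWh) for a cooling-dominated home.
        synthetic_use = 800 + 300 * np.cos(2 * np.pi * (months - 6) / 12)

        def simulate(params):
            base, amp = params   # stand-in model inputs
            return base + amp * np.cos(2 * np.pi * (months - 6) / 12)

        def disagreement(params):
            # Objective: squared error between simulation and utility data.
            return np.sum((simulate(params) - synthetic_use) ** 2)

        result = dual_annealing(disagreement, bounds=[(0, 2000), (0, 1000)])
        print("calibrated inputs:", result.x)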

  8. Effective World Modeling: Multisensor Data Fusion Methodology for Automated Driving.

    Science.gov (United States)

    Elfring, Jos; Appeldoorn, Rein; van den Dries, Sjoerd; Kwakkernaat, Maurice

    2016-10-11

    The number of perception sensors on automated vehicles increases due to the increasing number of advanced driver assistance system functions and their increasing complexity. Furthermore, fail-safe systems require redundancy, thereby increasing the number of sensors even further. A one-size-fits-all multisensor data fusion architecture is not realistic due to the enormous diversity in vehicles, sensors and applications. As an alternative, this work presents a methodology that can be used to effectively come up with an implementation to build a consistent model of a vehicle's surroundings. The methodology is accompanied by a software architecture. This combination minimizes the effort required to update the multisensor data fusion system whenever sensors or applications are added or replaced. A series of real-world experiments involving different sensors and algorithms demonstrates the methodology and the software architecture.
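
    The core of such consistency-building can be illustrated with a one-dimensional inverse-variance fusion step in Python. This is a deliberately reduced example, not the paper's architecture, and the sensor readings and variances are made up.

        def fuse(z1, var1, z2, var2):
            """Fuse two noisy measurements of the same quantity by weighting
            each with its inverse variance (the optimal linear combination)."""
            w1, w2 = 1.0 / var1, 1.0 / var2
            estimate = (w1 * z1 + w2 * z2) / (w1 + w2)
            variance = 1.0 / (w1 + w2)
            return estimate, variance

        # Hypothetical case: radar (noisier) and lidar report the same range.
        est, var = fuse(z1=10.4, var1=0.9, z2=10.1, var2=0.2)
        print(f"fused range: {est:.2f} m (variance {var:.2f})")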

  9. Automation, consolidation, and integration in autoimmune diagnostics.

    Science.gov (United States)

    Tozzoli, Renato; D'Aurizio, Federica; Villalta, Danilo; Bizzaro, Nicola

    2015-08-01

    Over the past two decades, we have witnessed an extraordinary change in autoimmune diagnostics, characterized by the progressive evolution of analytical technologies, the availability of new tests, and the explosive growth of molecular biology and proteomics. Aside from these huge improvements, organizational changes have also occurred which brought about a more modern vision of the autoimmune laboratory. The introduction of automation (for harmonization of testing, reduction of human error, reduction of handling steps, increase of productivity, decrease of turnaround time, improvement of safety), consolidation (combining different analytical technologies or strategies on one instrument or on one group of connected instruments) and integration (linking analytical instruments or group of instruments with pre- and post-analytical devices) opened a new era in immunodiagnostics. In this article, we review the most important changes that have occurred in autoimmune diagnostics and present some models related to the introduction of automation in the autoimmunology laboratory, such as automated indirect immunofluorescence and changes in the two-step strategy for detection of autoantibodies; automated monoplex immunoassays and reduction of turnaround time; and automated multiplex immunoassays for autoantibody profiling.

  10. Open Automated Demand Response Communications Specification (Version 1.0)

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary Ann; Ghatikar, Girish; Kiliccote, Sila; Koch, Ed; Hennage, Dan; Palensky, Peter; McParland, Charles

    2009-02-28

    The development of the Open Automated Demand Response Communications Specification, also known as OpenADR or Open Auto-DR, began in 2002 following the California electricity crisis. The work has been carried out by the Demand Response Research Center (DRRC), which is managed by Lawrence Berkeley National Laboratory. This specification describes an open standards-based communications data model designed to facilitate sending and receiving demand response price and reliability signals from a utility or Independent System Operator to electric customers. OpenADR is one element of the Smart Grid information and communications technologies that are being developed to improve optimization between electric supply and demand. The intention of the open automated demand response communications data model is to provide interoperable signals to building and industrial control systems that are preprogrammed to take action based on a demand response signal, enabling a demand response event to be fully automated, with no manual intervention. The OpenADR specification is a flexible infrastructure to facilitate common information exchange between the utility or Independent System Operator and end-use participants. The concept of an open specification is intended to allow anyone to implement the signaling systems, the automation server or the automation clients.

  11. Automated data processing and radioassays.

    Science.gov (United States)

    Samols, E; Barrows, G H

    1978-04-01

    Radioassays include (1) radioimmunoassays, (2) competitive protein-binding assays based on competition for limited antibody or specific binding protein, (3) immunoradiometric assay, based on competition for excess labeled antibody, and (4) radioreceptor assays. Most mathematical models describing the relationship between labeled ligand binding and unlabeled ligand concentration have been based on the law of mass action or the isotope dilution principle. These models provide useful data reduction programs, but are theoretically unsatisfactory because competitive radioassay usually is not based on classical dilution principles, labeled and unlabeled ligand do not have to be identical, antibodies (or receptors) are frequently heterogeneous, equilibrium usually is not reached, and there is probably steric and cooperative influence on binding. An alternative, more flexible mathematical model based on the probability of binding collisions being restricted by the surface area of reactive divalent sites on antibody and on univalent antigen has been derived. Application of these models to automated data reduction allows standard curves to be fitted by a mathematical expression, and unknown values are calculated from binding data. The virtues and pitfalls are presented of point-to-point data reduction, linear transformations, and curvilinear fitting approaches. A third-order polynomial using the square root of concentration closely approximates the mathematical model based on probability, and in our experience this method provides the most acceptable results with all varieties of radioassays. With this curvilinear system, linear point connection should be used between the zero standard and the beginning of significant dose response, and also towards saturation. The importance is stressed of limiting the range of reported automated assay results to that portion of the standard curve that delivers optimal sensitivity. Published methods for automated data reduction of Scatchard plots
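
    The curvilinear fit described above is straightforward to reproduce. The sketch below fits a third-order polynomial in the square root of concentration to a standard curve; all standard concentrations and bound fractions are invented for illustration.

```python
# Sketch of the curvilinear fit described above: a cubic polynomial in the
# square root of concentration, fitted to (invented) standard-curve data.
import numpy as np

conc = np.array([0.5, 1, 2, 5, 10, 20, 50])                  # standards
b_b0 = np.array([0.92, 0.85, 0.74, 0.55, 0.40, 0.27, 0.15])  # bound fraction

coeffs = np.polyfit(np.sqrt(conc), b_b0, 3)  # cubic in sqrt(concentration)

def dose_response(c):
    """Fitted standard curve: bound fraction as a function of dose."""
    return np.polyval(coeffs, np.sqrt(c))

# Unknowns would be read off by inverting this curve, restricting reported
# results to the steep, optimally sensitive part of the standard curve.
print(dose_response(np.array([3.0, 15.0])))
```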

  12. Literature classification for semi-automated updating of biological knowledgebases

    DEFF Research Database (Denmark)

    Olsen, Lars Rønn; Kudahl, Ulrich Johan; Winther, Ole

    2013-01-01

    abstracts yielded classification accuracy of 0.95, thus showing significant value in support of data extraction from the literature. Conclusion: We here propose a conceptual framework for semi-automated extraction of epitope data embedded in scientific literature using principles from text mining...... types of biological data, such as sequence data, are extensively stored in biological databases, functional annotations, such as immunological epitopes, are found primarily in semi-structured formats or free text embedded in primary scientific literature. Results: We defined and applied a machine...

  13. Formal Model-Driven Engineering: Generating Data and Behavioural Components

    Directory of Open Access Journals (Sweden)

    Chen-Wei Wang

    2012-12-01

    Full Text Available Model-driven engineering is the automatic production of software artefacts from abstract models of structure and functionality. By targeting a specific class of system, it is possible to automate aspects of the development process, using model transformations and code generators that encode domain knowledge and implementation strategies. Using this approach, questions of correctness for a complex, software system may be answered through analysis of abstract models of lower complexity, under the assumption that the transformations and generators employed are themselves correct. This paper shows how formal techniques can be used to establish the correctness of model transformations used in the generation of software components from precise object models. The source language is based upon existing, formal techniques; the target language is the widely-used SQL notation for database programming. Correctness is established by giving comparable, relational semantics to both languages, and checking that the transformations are semantics-preserving.

  14. Automated optimization and construction of chemometric models based on highly variable raw chromatographic data.

    Science.gov (United States)

    Sinkov, Nikolai A; Johnston, Brandon M; Sandercock, P Mark L; Harynuk, James J

    2011-07-04

    Direct chemometric interpretation of raw chromatographic data (as opposed to integrated peak tables) has been shown to be advantageous in many circumstances. However, this approach presents two significant challenges: data alignment and feature selection. In order to interpret the data, the time axes must be precisely aligned so that the signal from each analyte is recorded at the same coordinates in the data matrix for each and every analyzed sample. Several alignment approaches exist in the literature and they work well when the samples being aligned are reasonably similar. In cases where the background matrix for a series of samples to be modeled is highly variable, the performance of these approaches suffers. Considering the challenge of feature selection, when the raw data are used, each signal at each time is viewed as an individual, independent variable; with the data rates of modern chromatographic systems, this generates hundreds of thousands of candidate variables, or tens of millions of candidate variables if multivariate detectors such as mass spectrometers are utilized. Consequently, an automated approach to identify and select appropriate variables for inclusion in a model is desirable. In this research we present an alignment approach that relies on a series of deuterated alkanes which act as retention anchors for an alignment signal, and couple this with an automated feature selection routine based on our novel cluster resolution metric for the construction of a chemometric model. The model system that we use to demonstrate these approaches is a series of simulated arson debris samples analyzed by passive headspace extraction, GC-MS, and interpreted using partial least squares discriminant analysis (PLS-DA). Copyright © 2011 Elsevier B.V. All rights reserved.
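
    A minimal sketch of the anchor-based alignment step, under the assumption that the deuterated-alkane anchors give known reference retention times and that the warp between anchors is piecewise linear; all retention times and the signal itself are illustrative placeholders.

```python
# Anchor-based time alignment: known reference retention times of the
# deuterated alkanes define a piecewise-linear warp of each sample's time
# axis, after which every sample is resampled on a common grid.
import numpy as np

reference_anchors = np.array([120.0, 300.0, 540.0, 900.0])  # target times (s)
observed_anchors  = np.array([118.2, 303.5, 536.9, 905.1])  # in this sample

t_sample = np.linspace(100, 920, 4100)     # raw time axis of one sample
signal   = np.random.rand(t_sample.size)   # stand-in chromatogram

# Warp the sample's time axis onto the reference axis, then resample so
# each analyte lands at the same coordinates in every sample's data matrix.
t_aligned      = np.interp(t_sample, observed_anchors, reference_anchors)
common_grid    = np.linspace(120, 900, 4000)
signal_on_grid = np.interp(common_grid, t_aligned, signal)
```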

  15. Learning with Technology: Video Modeling with Concrete-Representational-Abstract Sequencing for Students with Autism Spectrum Disorder

    Science.gov (United States)

    Yakubova, Gulnoza; Hughes, Elizabeth M.; Shinaberry, Megan

    2016-01-01

    The purpose of this study was to determine the effectiveness of a video modeling intervention with concrete-representational-abstract instructional sequence in teaching mathematics concepts to students with autism spectrum disorder (ASD). A multiple baseline across skills design of single-case experimental methodology was used to determine the…

  16. Towards Automated Bargaining in Electronic Markets: A Partially Two-Sided Competition Model

    Science.gov (United States)

    Gatti, Nicola; Lazaric, Alessandro; Restelli, Marcello

    This paper focuses on the prominent issue of automating bargaining agents within electronic markets. Models of bargaining in the literature deal with settings in which there are only two agents; no model satisfactorily captures settings with competition among multiple buyers and, analogously, among multiple sellers. In this paper, we extend the principal bargaining protocol, i.e. the alternating-offers protocol, to capture bargaining in markets. The model we propose is such that, in the presence of a unique buyer and a unique seller, agents' equilibrium strategies are those in the original protocol. Moreover, we study the resulting game game-theoretically, providing the following results: in the presence of one-sided competition (more buyers and one seller, or vice versa) we provide agents' equilibrium strategies for all values of the parameters; in the presence of two-sided competition (more buyers and more sellers) we provide an algorithm that produces agents' equilibrium strategies for a large set of the parameters, and we experimentally evaluate its effectiveness.
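
    A toy simulation of bilateral alternating offers with monotone concessions may help fix the protocol's mechanics; the strategies and numbers below are purely illustrative and are not the equilibrium strategies derived in the paper.

```python
# Toy alternating-offers bargaining: buyer and seller take turns proposing
# a price and concede a fixed step each turn until the offers cross.
# All values and the concession strategy are arbitrary illustrations.
buyer_offer, seller_ask = 0.40, 0.90   # price as fraction of buyer's value
step = 0.05

for t in range(1, 21):
    if t % 2:                          # odd rounds: buyer proposes
        buyer_offer += step
        if buyer_offer >= seller_ask:
            print(f"round {t}: seller accepts at {buyer_offer:.2f}")
            break
    else:                              # even rounds: seller proposes
        seller_ask -= step
        if seller_ask <= buyer_offer:
            print(f"round {t}: buyer accepts at {seller_ask:.2f}")
            break
else:
    print("no agreement within the deadline")
```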

  17. Software complex AS (automation of spectrometry). User interface of experiment automation system implementation

    International Nuclear Information System (INIS)

    Astakhova, N.V.; Beskrovnyj, A.I.; Bogdzel', A.A.; Butorin, P.E.; Vasilovskij, S.G.; Gundorin, N.A.; Zlokazov, V.B.; Kutuzov, S.A.; Salamatin, I.M.; Shvetsov, V.N.

    2003-01-01

    An instrumental software complex for automation of spectrometry (AS) has been developed that enables prompt realization of experiment automation systems for spectrometers which use data buffering. New methods of programming and building automation systems, together with novel net technologies, were employed in its development. It is suggested that programs to schedule and conduct experiments should be based on a parametric model of the spectrometer, an approach that makes it possible to write programs suitable for any FLNP (Frank Laboratory of Neutron Physics) spectrometer and experimental technique applied, and to use different hardware interfaces for introducing the spectrometric data into the data acquisition system. The article describes the possibilities provided to the user in the field of scheduling and control of the experiment, data viewing, and control of the spectrometer parameters. The current spectrometer state, programs and experimental data can be presented on the Internet in the form of dynamically generated protocols and graphs, and the experiment can be controlled via the Internet. To use these Internet facilities on the client side, no applied programs are needed; it suffices to know how to use two programs to carry out experiments in automated mode. The package is designed for experiments in condensed matter and nuclear physics and is ready for use. (author)

  18. Implementing the WebSocket Protocol Based on Formal Modelling and Automated Code Generation

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge; Kristensen, Lars Michael

    2014-01-01

    with pragmatic annotations for automated code generation of protocol software. The contribution of this paper is an application of the approach as implemented in the PetriCode tool to obtain protocol software implementing the IETF WebSocket protocol. This demonstrates the scalability of our approach to real...... protocols. Furthermore, we perform formal verification of the CPN model prior to code generation, and test the implementation for interoperability against the Autobahn WebSocket test-suite resulting in 97% and 99% success rate for the client and server implementation, respectively. The tests show...

  19. PHOTOGRAMMETRY-BASED AUTOMATED MEASUREMENTS FOR TOOTH SHAPE AND OCCLUSION ANALYSIS

    Directory of Open Access Journals (Sweden)

    V. A. Knyaz

    2016-06-01

    Full Text Available Tooth measurements (odontometry) are performed for various scientific and practical applications, including dentistry. Present-day techniques are increasingly based on the use of 3D models, which provide wider prospects in comparison to measurements on real objects: teeth or their plaster copies. The main advantages emerge through the application of new measurement methods which provide the needed degree of non-invasiveness, precision, convenience and detail. Tooth measurement has always been regarded as time-consuming research, even more so with new methods due to their wider opportunities. This is where automation becomes essential for the further development and application of measurement techniques. In our research, automation in obtaining 3D models and automation of measurements provided essential data that was analysed to suggest recommendations for tooth preparation – one of the most responsible clinical procedures in prosthetic dentistry – within a comparatively short period of time. The original photogrammetric 3D reconstruction system makes it possible to generate 3D models of dental arches, reproduce their closure, or occlusion, and perform a set of standard measurements in automated mode.

  20. Photogrammetry-Based Automated Measurements for Tooth Shape and Occlusion Analysis

    Science.gov (United States)

    Knyaz, V. A.; Gaboutchian, A. V.

    2016-06-01

    Tooth measurements (odontometry) are performed for various scientific and practical applications, including dentistry. Present-day techniques are increasingly based on the use of 3D models, which provide wider prospects in comparison to measurements on real objects: teeth or their plaster copies. The main advantages emerge through the application of new measurement methods which provide the needed degree of non-invasiveness, precision, convenience and detail. Tooth measurement has always been regarded as time-consuming research, even more so with new methods due to their wider opportunities. This is where automation becomes essential for the further development and application of measurement techniques. In our research, automation in obtaining 3D models and automation of measurements provided essential data that was analysed to suggest recommendations for tooth preparation - one of the most responsible clinical procedures in prosthetic dentistry - within a comparatively short period of time. The original photogrammetric 3D reconstruction system makes it possible to generate 3D models of dental arches, reproduce their closure, or occlusion, and perform a set of standard measurements in automated mode.

  1. The use of process simulation models in virtual commissioning of process automation software in drinking water treatment plants

    NARCIS (Netherlands)

    Worm, G.I.M.; Kelderman, J.P.; Lapikas, T.; Van der Helm, A.W.C.; Van Schagen, K.M.; Rietveld, L.C.

    2012-01-01

    This research deals with the contribution of process simulation models to the factory acceptance test (FAT) of process automation (PA) software of drinking water treatment plants. Two test teams tested the same piece of modified PA-software. One team used an advanced virtual commissioning (AVC)

  2. Modeling Patient Treatment With Medical Records: An Abstraction Hierarchy to Understand User Competencies and Needs.

    Science.gov (United States)

    St-Maurice, Justin D; Burns, Catherine M

    2017-07-28

    Health care is a complex sociotechnical system. Patient treatment is evolving and needs to incorporate the use of technology and new patient-centered treatment paradigms. Cognitive work analysis (CWA) is an effective framework for understanding complex systems, and work domain analysis (WDA) is useful for understanding complex ecologies. Although previous applications of CWA have described patient treatment, due to their scope of work patients were previously characterized as biomedical machines, rather than patient actors involved in their own care. An abstraction hierarchy that characterizes patients as beings with complex social values and priorities is needed. This can help better understand treatment in a modern approach to care. The purpose of this study was to perform a WDA to represent the treatment of patients with medical records. The methods to develop this model included the analysis of written texts and collaboration with subject matter experts. Our WDA represents the ecology through its functional purposes, abstract functions, generalized functions, physical functions, and physical forms. Compared with other work domain models, this model is able to articulate the nuanced balance between medical treatment, patient education, and limited health care resources. Concepts in the analysis were similar to the modeling choices of other WDAs but combined them into a comprehensive, systematic, and contextual overview. The model is helpful to understand user competencies and needs. Future models could be developed to model the patient's domain and enable the exploration of the shared decision-making (SDM) paradigm. Our work domain model links treatment goals, decision-making constraints, and task workflows. This model can be used by system developers who would like to use ecological interface design (EID) to improve systems. Our hierarchy is the first in a future set that could explore new treatment paradigms. Future hierarchies could model the patient as a

  3. Automated 3-D Radiation Mapping

    International Nuclear Information System (INIS)

    Tarpinian, J. E.

    1991-01-01

    This work describes an automated radiation detection and imaging system which combines several state-of-the-art technologies to produce a portable but very powerful visualization tool for planning work in radiation environments. The system combines a radiation detection system, a computerized radiation imaging program, and computerized 3-D modeling to automatically locate and measure radiation fields. Measurements are automatically collected, and imaging techniques are used to produce colored 'isodose' images of the measured radiation fields. The isodose lines from the images are then superimposed over the 3-D model of the area. The final display shows the various components in a room and their associated radiation fields. The use of an automated radiation detection system increases the quality of the radiation survey measurements obtained. The additional use of a three-dimensional display allows easier visualization of the area and associated radiological conditions than two-dimensional sketches

  4. Automated MAD and MIR structure solution

    International Nuclear Information System (INIS)

    Terwilliger, Thomas C.; Berendzen, Joel

    1999-01-01

    A fully automated procedure for solving MIR and MAD structures has been developed using a scoring scheme to convert the structure-solution process into an optimization problem. Obtaining an electron-density map from X-ray diffraction data can be difficult and time-consuming even after the data have been collected, largely because MIR and MAD structure determinations currently require many subjective evaluations of the qualities of trial heavy-atom partial structures before a correct heavy-atom solution is obtained. A set of criteria for evaluating the quality of heavy-atom partial solutions in macromolecular crystallography has been developed. These have allowed the conversion of the crystal structure-solution process into an optimization problem and have allowed its automation. The SOLVE software has been used to solve MAD data sets with as many as 52 selenium sites in the asymmetric unit. The automated structure-solution process developed is a major step towards the fully automated structure-determination, model-building and refinement procedure which is needed for genomic-scale structure determinations

  5. Generating Protocol Software from CPN Models Annotated with Pragmatics

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge; Kristensen, Lars M.; Kindler, Ekkart

    2013-01-01

    and verify protocol software, but limited work exists on using CPN models of protocols as a basis for automated code generation. The contribution of this paper is a method for generating protocol software from a class of CPN models annotated with code generation pragmatics. Our code generation method...... consists of three main steps: automatically adding so-called derived pragmatics to the CPN model, computing an abstract template tree, which associates pragmatics with code templates, and applying the templates to generate code which can then be compiled. We illustrate our method using a unidirectional...

  6. Automation of plasma-process fulltext bibliography databases. An on-line data-collection, data-mining and data-input system

    International Nuclear Information System (INIS)

    Suzuki, Manabu; Pichl, Lukas; Murakami, Izumi; Kato, Takako; Sasaki, Akira

    2006-01-01

    Searching for relevant data, information retrieval, data extraction and data input are time- and resource-consuming activities in most data centers. Here we develop a Linux system automating the process in the case of bibliography, abstract and fulltext databases. The present system is an open-source free-software low-cost solution that connects the target and provider databases in cyberspace through various web publishing formats. The abstract/fulltext relevance assessment is interfaced to external software modules. (author)

  7. Patients in assisted automated peritoneal dialysis develop strategies for self-care

    DEFF Research Database (Denmark)

    Holch, Kirsten

    Patients in Assisted Automated Peritoneal Dialysis develop strategies for self-care Background: Since 2000 a model for Assisted Automated Peritoneal Dialysis (AAPD) in the patient's own home has been developed at Aarhus University Hospital, Skejby. The patient group consists of physically...

  8. GIBS Geospatial Data Abstraction Library (GDAL)

    Data.gov (United States)

    National Aeronautics and Space Administration — GDAL is an open source translator library for raster geospatial data formats that presents a single abstract data model to the calling application for all supported...

  9. Affective processes in human-automation interactions.

    Science.gov (United States)

    Merritt, Stephanie M

    2011-08-01

    This study contributes to the literature on automation reliance by illuminating the influences of user moods and emotions on reliance on automated systems. Past work has focused predominantly on cognitive and attitudinal variables, such as perceived machine reliability and trust. However, recent work on human decision making suggests that affective variables (i.e., moods and emotions) are also important. Drawing from the affect infusion model, significant effects of affect are hypothesized. Furthermore, a new affectively laden attitude termed liking is introduced. Participants watched video clips selected to induce positive or negative moods, then interacted with a fictitious automated system on an X-ray screening task. At five time points, important variables were assessed, including trust, liking, perceived machine accuracy, user self-perceived accuracy, and reliance. These variables, along with propensity to trust machines and state affect, were integrated in a structural equation model. Happiness significantly increased trust and liking for the system throughout the task. Liking was the only variable that significantly predicted reliance early in the task. Trust predicted reliance later in the task, whereas perceived machine accuracy and user self-perceived accuracy had no significant direct effects on reliance at any time. Affective influences on automation reliance are demonstrated, suggesting that this decision-making process may be less rational and more emotional than previously acknowledged. Liking for a new system may be key to appropriate reliance, particularly early in the task. Positive affect can be easily induced and may be a lever for increasing liking.

  10. Automated EEG sleep staging in the term-age baby using a generative modelling approach

    Science.gov (United States)

    Pillay, Kirubin; Dereymaeker, Anneleen; Jansen, Katrien; Naulaers, Gunnar; Van Huffel, Sabine; De Vos, Maarten

    2018-06-01

    Objective. We develop a method for automated four-state sleep classification of preterm and term-born babies at term-age of 38-40 weeks postmenstrual age (the age since the last menstrual cycle of the mother) using multichannel electroencephalogram (EEG) recordings. At this critical age, EEG differentiates from broader quiet sleep (QS) and active sleep (AS) stages to four, more complex states, and the quality and timing of this differentiation is indicative of the level of brain development. However, existing methods for automated sleep classification remain focussed only on QS and AS classification. Approach. EEG features were calculated from 16 EEG recordings, in 30 s epochs, and personalized feature scaling was used to correct for some of the inter-recording variability, by standardizing each recording’s feature data using its mean and standard deviation. Hidden Markov models (HMMs) and Gaussian mixture models (GMMs) were trained, with the HMM incorporating knowledge of the sleep state transition probabilities. The performance of the GMM and HMM (with and without scaling) was compared, and Cohen’s kappa agreement was calculated between the estimates and clinicians’ visual labels. Main results. For four-state classification, the HMM proved superior to the GMM. With the inclusion of personalized feature scaling, mean kappa (±standard deviation) was 0.62 (±0.16) compared to the GMM value of 0.55 (±0.15). Without feature scaling, kappas for the HMM and GMM dropped to 0.56 (±0.18) and 0.51 (±0.15), respectively. Significance. This is the first study to present a successful method for the automated staging of four states in term-age sleep using multichannel EEG. The results suggest a benefit in incorporating transition information using an HMM, and in correcting for inter-recording variability through personalized feature scaling. The timing and quality of these states are indicative of developmental delays in both preterm and term-born babies that may
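
    A compact sketch of the two ingredients emphasized above, personalized feature scaling followed by a four-state HMM, assuming the third-party hmmlearn package is available; the feature matrices here are random placeholders with illustrative shapes.

```python
# Per-recording feature standardization plus a 4-state Gaussian HMM whose
# transition matrix encodes sleep-state dynamics. Data is synthetic.
import numpy as np
from hmmlearn.hmm import GaussianHMM  # assumes hmmlearn is installed

def personalize(features: np.ndarray) -> np.ndarray:
    """Z-score one recording's (epochs x features) matrix by its own mean
    and standard deviation to reduce inter-recording variability."""
    return (features - features.mean(axis=0)) / features.std(axis=0)

recordings = [np.random.randn(800, 12) for _ in range(16)]  # 30 s epochs
scaled = [personalize(r) for r in recordings]

X = np.vstack(scaled)
lengths = [len(r) for r in scaled]   # epoch counts per recording

model = GaussianHMM(n_components=4, covariance_type="diag", n_iter=50)
model.fit(X, lengths)                # learns emissions and transitions
states = model.predict(scaled[0])    # 4-state sequence for one recording
```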

  11. Automated chemical kinetic modeling via hybrid reactive molecular dynamics and quantum chemistry simulations.

    Science.gov (United States)

    Döntgen, Malte; Schmalz, Felix; Kopp, Wassja A; Kröger, Leif C; Leonhard, Kai

    2018-06-13

    An automated scheme for obtaining chemical kinetic models from scratch using reactive molecular dynamics and quantum chemistry simulations is presented. This methodology combines the phase space sampling of reactive molecular dynamics with the thermochemistry and kinetics prediction capabilities of quantum mechanics. This scheme provides the NASA polynomial and modified Arrhenius equation parameters for all species and reactions that are observed during the simulation and supplies them in the ChemKin format. The ab initio level of theory for predictions is easily exchangeable and the presently used G3MP2 level of theory is found to reliably reproduce hydrogen and methane oxidation thermochemistry and kinetics data. Chemical kinetic models obtained with this approach are ready-to-use for, e.g., ignition delay time simulations, as shown for hydrogen combustion. The presented extension of the ChemTraYzer approach can be used as a basis for methodologically advancing chemical kinetic modeling schemes and as a black-box approach to generate chemical kinetic models.
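
    The two parameterizations named above (modified Arrhenius rate constants and NASA thermochemistry polynomials) can be evaluated directly; the coefficients below are placeholders for illustration, not values produced by the ChemTraYzer scheme.

```python
# Direct evaluation of a modified Arrhenius rate constant and of the heat
# capacity from a 7-coefficient NASA polynomial. Coefficients are mock.
import numpy as np

R = 8.314462618  # gas constant, J/(mol K)

def modified_arrhenius(T, A, n, Ea):
    """k(T) = A * T**n * exp(-Ea / (R*T)), with Ea in J/mol."""
    return A * T**n * np.exp(-Ea / (R * T))

def nasa_cp_over_R(T, a):
    """cp/R = a0 + a1*T + a2*T^2 + a3*T^3 + a4*T^4 (first five of the
    seven NASA coefficients; a5, a6 give enthalpy and entropy offsets)."""
    return a[0] + a[1]*T + a[2]*T**2 + a[3]*T**3 + a[4]*T**4

T = np.array([800.0, 1200.0, 1600.0])
print(modified_arrhenius(T, A=1.0e13, n=0.5, Ea=150e3))
print(nasa_cp_over_R(T, a=[3.3, 8.0e-4, -2.0e-7, 2.0e-11, 0.0, 0.0, 0.0]))
```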

  12. Automated determination of fibrillar structures by simultaneous model building and fiber diffraction refinement.

    Science.gov (United States)

    Potrzebowski, Wojciech; André, Ingemar

    2015-07-01

    For highly oriented fibrillar molecules, three-dimensional structures can often be determined from X-ray fiber diffraction data. However, because of limited information content, structure determination and validation can be challenging. We demonstrate that automated structure determination of protein fibers can be achieved by guiding the building of macromolecular models with fiber diffraction data. We illustrate the power of our approach by determining the structures of six bacteriophage viruses de novo using fiber diffraction data alone and together with solid-state NMR data. Furthermore, we demonstrate the feasibility of molecular replacement from monomeric and fibrillar templates by solving the structure of a plant virus using homology modeling and protein-protein docking. The generated models explain the experimental data to the same degree as deposited reference structures but with improved structural quality. We also developed a cross-validation method for model selection. The results highlight the power of fiber diffraction data as structural constraints.

  13. Effective World Modeling: Multisensor Data Fusion Methodology for Automated Driving

    Directory of Open Access Journals (Sweden)

    Jos Elfring

    2016-10-01

    Full Text Available The number of perception sensors on automated vehicles is growing, driven by the increasing number and complexity of advanced driver assistance system functions. Furthermore, fail-safe systems require redundancy, increasing the number of sensors even further. A one-size-fits-all multisensor data fusion architecture is not realistic due to the enormous diversity in vehicles, sensors and applications. As an alternative, this work presents a methodology that can be used to systematically arrive at an implementation that builds a consistent model of a vehicle’s surroundings. The methodology is accompanied by a software architecture. This combination minimizes the effort required to update the multisensor data fusion system whenever sensors or applications are added or replaced. A series of real-world experiments involving different sensors and algorithms demonstrates the methodology and the software architecture.
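
    As a generic illustration of the fusion step at the heart of such a world model, the sketch below combines two independent, noisy distance measurements by inverse-variance weighting, the elementary building block of Kalman-style fusion. The paper's methodology and architecture are broader; sensor values here are invented.

```python
# Inverse-variance fusion of two independent measurements of one state.
import numpy as np

def fuse(estimates, variances):
    """Combine independent sensor measurements of the same quantity into
    a single estimate with reduced variance."""
    w = 1.0 / np.asarray(variances)
    fused = np.sum(w * np.asarray(estimates)) / np.sum(w)
    fused_var = 1.0 / np.sum(w)
    return fused, fused_var

# Radar and camera both observe the distance (m) to the same object.
pos, var = fuse(estimates=[24.8, 25.6], variances=[0.9, 0.4])
print(f"fused distance {pos:.2f} m, variance {var:.2f}")
```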

  14. Automated Analysis of Corpora Callosa

    DEFF Research Database (Denmark)

    Stegmann, Mikkel Bille; Davies, Rhodri H.

    2003-01-01

    This report describes and evaluates the steps needed to perform modern model-based interpretation of the corpus callosum in MRI. The process is discussed from the initial landmark-free contours to full-fledged statistical models based on the Active Appearance Models framework. Topics treated include landmark placement, background modelling and multi-resolution analysis. Preliminary quantitative and qualitative validation in a cross-sectional study shows that fully automated analysis and segmentation of the corpus callosum are feasible.

  15. Selected Translated Abstracts of Chinese-Language Climate Change Publications

    Energy Technology Data Exchange (ETDEWEB)

    Cushman, R.M.; Burtis, M.D.

    1999-05-01

    This report contains English-translated abstracts of important Chinese-language literature concerning global climate change for the years 1995-1998. This body of literature includes the topics of adaptation, ancient climate change, climate variation, the East Asia monsoon, historical climate change, impacts, modeling, and radiation and trace-gas emissions. In addition to the bibliographic citations and abstracts translated into English, this report presents the original citations and abstracts in Chinese. Author and title indexes are included to assist the reader in locating abstracts of particular interest.

  16. From Functions to Object-Orientation by Abstraction

    OpenAIRE

    Diertens, Bob

    2012-01-01

    In previous work we developed a framework of computational models for function and object execution. The models on a higher level of abstraction in this framework allow for concurrent execution of functions and objects. We show that the computational model for object execution complies with the fundamentals of object-orientation.

  17. Lean automation development : applying lean principles to the automation development process

    OpenAIRE

    Granlund, Anna; Wiktorsson, Magnus; Grahn, Sten; Friedler, Niklas

    2014-01-01

    A broad empirical study indicates that automation development shows potential for improvement. In the paper, 13 lean product development principles are contrasted with the automation development process, and it is suggested why and how these principles can facilitate, support and improve the automation development process. The paper summarises what characterises a lean automation development process and what consequences it entails. Main differences compared to current pr...

  18. Diagnosis of hearing loss using automated audiometry in an asynchronous telehealth model: A pilot accuracy study.

    Science.gov (United States)

    Brennan-Jones, Christopher G; Eikelboom, Robert H; Swanepoel, De Wet

    2017-02-01

    Introduction Standard criteria exist for diagnosing different types of hearing loss, yet audiologists interpret audiograms manually. This pilot study examined the feasibility of standardised interpretations of audiometry in a telehealth model of care. The aim of this study was to examine the diagnostic accuracy of automated audiometry in adults with hearing loss in an asynchronous telehealth model using pre-defined diagnostic protocols. Materials and methods We recruited 42 study participants from a public audiology and otolaryngology clinic in Perth, Western Australia. Manual audiometry was performed by an audiologist either before or after automated audiometry. Diagnostic protocols were applied asynchronously for normal hearing, disabling hearing loss, conductive hearing loss and unilateral hearing loss. Sensitivity and specificity analyses were conducted using a two-by-two matrix and Cohen's kappa was used to measure agreement. Results The overall sensitivity for the diagnostic criteria was 0.88 (range: 0.86-1) and overall specificity was 0.93 (range: 0.86-0.97). Overall kappa (k) agreement was 'substantial', k = 0.80 (95% confidence interval (CI) 0.70-0.89), and statistically significant for the diagnosis of hearing loss. This method has the potential to improve synchronous and asynchronous tele-audiology service delivery.
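
    The accuracy statistics reported above all follow from a two-by-two matrix; the sketch below computes sensitivity, specificity and Cohen's kappa from invented counts.

```python
# Sensitivity, specificity and Cohen's kappa from a 2x2 diagnostic matrix.
# The counts are invented for illustration.
def diagnostics(tp, fp, fn, tn):
    n = tp + fp + fn + tn
    sens = tp / (tp + fn)                   # true positive rate
    spec = tn / (tn + fp)                   # true negative rate
    po = (tp + tn) / n                      # observed agreement
    # Chance agreement from the row and column marginals.
    pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2
    kappa = (po - pe) / (1 - pe)
    return sens, spec, kappa

print(diagnostics(tp=18, fp=2, fn=3, tn=19))
```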

  19. Assume-Guarantee Abstraction Refinement Meets Hybrid Systems

    Science.gov (United States)

    Bogomolov, Sergiy; Frehse, Goran; Greitschus, Marius; Grosu, Radu; Pasareanu, Corina S.; Podelski, Andreas; Strump, Thomas

    2014-01-01

    Compositional verification techniques in the assume-guarantee style have been successfully applied to transition systems to efficiently reduce the search space by leveraging the compositional nature of the systems under consideration. We adapt these techniques to the domain of hybrid systems with affine dynamics. To build assumptions we introduce an abstraction based on location merging. We integrate the assume-guarantee style analysis with automatic abstraction refinement. We have implemented our approach in the symbolic hybrid model checker SpaceEx. The evaluation shows its practical potential. To the best of our knowledge, this is the first work combining assume-guarantee reasoning with automatic abstraction-refinement in the context of hybrid automata.

  20. System Operations Studies for Automated Guideway Transit Systems : Discrete Event Simulation Model Programmer's Manual

    Science.gov (United States)

    1982-07-01

    In order to examine specific automated guideway transit (AGT) developments and concepts, UMTA undertook a program of studies and technology investigations called Automated Guideway Transit Technology (AGTT) Program. The objectives of one segment of t...

  1. Towards Measuring the Abstractness of State Machines based on Mutation Testing

    Directory of Open Access Journals (Sweden)

    Thomas Baar

    2017-01-01

    Full Text Available Abstract. The notation of state machines is widely adopted as a formalism to describe the behaviour of systems. Usually, multiple state machine models can be developed for the very same software system. Some of these models might turn out to be equivalent, but, in many cases, different state machines describing the same system also differ in their level of abstraction. In this paper, we present an approach to actually measure the abstractness level of state machines w.r.t. a given implemented software system. A state machine is considered to be less abstract when it is conceptually closer to the implemented system. In our approach, this distance between state machine and implementation is measured by applying coverage criteria known from software mutation testing. The abstractness of state machines can be considered a new metric. As with other metrics, a known value for the abstractness of a given state machine makes it possible to assess its quality in terms of a simple number. In model-based software development projects, the abstractness metric can help to prevent model degradation, since it can actually measure the semantic distance from the behavioural specification of a system in the form of a state machine to the current implementation of the system. In contrast to other metrics for state machines, abstractness cannot be statically computed from the state machine’s structure, but requires executing both the state machine and the corresponding system implementation. The article is published in the author’s wording.

  2. Both Automation and Paper.

    Science.gov (United States)

    Purcell, Royal

    1988-01-01

    Discusses the concept of a paperless society and the current situation in library automation. Various applications of automation and telecommunications are addressed, and future library automation is considered. Automation at the Monroe County Public Library in Bloomington, Indiana, is described as an example. (MES)

  3. Automated analysis in generic groups

    Science.gov (United States)

    Fagerholm, Edvard

    This thesis studies automated methods for analyzing hardness assumptions in generic group models, following ideas of symbolic cryptography. We define a broad class of generic and symbolic group models for different settings---symmetric or asymmetric (leveled) k-linear groups --- and prove ''computational soundness'' theorems for the symbolic models. Based on this result, we formulate a master theorem that relates the hardness of an assumption to solving problems in polynomial algebra. We systematically analyze these problems identifying different classes of assumptions and obtain decidability and undecidability results. Then, we develop automated procedures for verifying the conditions of our master theorems, and thus the validity of hardness assumptions in generic group models. The concrete outcome is an automated tool, the Generic Group Analyzer, which takes as input the statement of an assumption, and outputs either a proof of its generic hardness or shows an algebraic attack against the assumption. Structure-preserving signatures are signature schemes defined over bilinear groups in which messages, public keys and signatures are group elements, and the verification algorithm consists of evaluating ''pairing-product equations''. Recent work on structure-preserving signatures studies optimality of these schemes in terms of the number of group elements needed in the verification key and the signature, and the number of pairing-product equations in the verification algorithm. While the size of keys and signatures is crucial for many applications, another aspect of performance is the time it takes to verify a signature. The most expensive operation during verification is the computation of pairings. However, the concrete number of pairings is not captured by the number of pairing-product equations considered in earlier work. We consider the question of what is the minimal number of pairing computations needed to verify structure-preserving signatures. We build an

  4. Automated Testing of Event-Driven Applications

    DEFF Research Database (Denmark)

    Jensen, Casper Svenning

    may be tested by selecting an interesting input (i.e. a sequence of events), and deciding if a failure occurs when the selected input is applied to the event-driven application under test. Automated testing promises to reduce the workload for developers by automatically selecting interesting inputs...... and detect failures. However, it is non-trivial to conduct automated testing of event-driven applications because of, for example, infinite input spaces and the absence of specifications of correct application behavior. In this PhD dissertation, we identify a number of specific challenges when conducting...... automated testing of event-driven applications, and we present novel techniques for solving these challenges. First, we present an algorithm for stateless model-checking of event-driven applications with partial-order reduction, and we show how this algorithm may be used to systematically test web...

  5. AUTOMATED RECONSTRUCTION OF WALLS FROM AIRBORNE LIDAR DATA FOR COMPLETE 3D BUILDING MODELLING

    Directory of Open Access Journals (Sweden)

    Y. He

    2012-07-01

    Full Text Available Automated 3D building model generation continues to attract research interests in photogrammetry and computer vision. Airborne Light Detection and Ranging (LIDAR) data with increasing point density and accuracy has been recognized as a valuable source for automated 3D building reconstruction. While considerable achievements have been made in roof extraction, limited research has been carried out in modelling and reconstruction of walls, which constitute important components of a full building model. Low point density and irregular point distribution of LIDAR observations on vertical walls render this task complex. This paper develops a novel approach for wall reconstruction from airborne LIDAR data. The developed method commences with point cloud segmentation using a region growing approach. Seed points for planar segments are selected through principal component analysis, and points in the neighbourhood are collected and examined to form planar segments. Afterwards, segment-based classification is performed to identify roofs, walls and planar ground surfaces. For walls with sparse LIDAR observations, a search is conducted in the neighbourhood of each individual roof segment to collect wall points, and the walls are then reconstructed using geometrical and topological constraints. Finally, walls which were not illuminated by the LIDAR sensor are determined via both reconstructed roof data and neighbouring walls. This leads to the generation of topologically consistent and geometrically accurate and complete 3D building models. Experiments have been conducted in two test sites in the Netherlands and Australia to evaluate the performance of the proposed method. Results show that planar segments can be reliably extracted in the two reported test sites, which have different point density, and the building walls can be correctly reconstructed if the walls are illuminated by the LIDAR sensor.
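
    A minimal sketch of the seed-selection idea: score each point's neighbourhood by how planar it is (the smallest normalized eigenvalue of the local covariance, i.e. principal component analysis) and grow regions from the most planar seeds. The point cloud below is synthetic and the neighbour search is brute-force; a k-d tree would be used in practice.

```python
# PCA-based planarity scoring for region-growing seed selection.
import numpy as np

def planarity(neighbourhood: np.ndarray) -> float:
    """Surface variation: smallest covariance eigenvalue over their sum.
    0 for a perfect plane; lower scores make better seeds."""
    cov = np.cov(neighbourhood.T)
    eig = np.sort(np.linalg.eigvalsh(cov))   # ascending eigenvalues
    return eig[0] / eig.sum()

points = np.random.rand(1000, 3)             # synthetic point cloud
k = 20

# Brute-force k-nearest neighbours (k-d tree recommended at scale).
d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=2)
knn = np.argsort(d, axis=1)[:, 1:k + 1]

scores = np.array([planarity(points[idx]) for idx in knn])
seed_order = np.argsort(scores)              # grow regions from best seeds
```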

  6. Mathematical Abstraction: Constructing Concept of Parallel Coordinates

    Science.gov (United States)

    Nurhasanah, F.; Kusumah, Y. S.; Sabandar, J.; Suryadi, D.

    2017-09-01

    Mathematical abstraction is an important process in teaching and learning mathematics, so pre-service mathematics teachers need to understand and experience this process. One of the theoretical-methodological frameworks for studying this process is Abstraction in Context (AiC). Based on this framework, the abstraction process comprises observable epistemic actions, Recognition, Building-With, Construction, and Consolidation, called the RBC + C model. This study investigates and analyzes how pre-service mathematics teachers constructed and consolidated the concept of Parallel Coordinates in a group discussion. It uses the AiC framework for analyzing the mathematical abstraction of a group of four pre-service teachers learning Parallel Coordinates concepts. The data were collected through video recording, students’ worksheets, a test, and field notes. The result shows that the students’ prior knowledge related to the concept of the Cartesian coordinate system plays a significant role in the process of constructing the Parallel Coordinates concept as new knowledge. The consolidation process is influenced by the social interaction between group members. The abstraction process taking place in this group was dominated by empirical abstraction, which emphasizes identifying characteristics of manipulated or imagined objects during the processes of recognizing and building-with.

  7. Automated Groundwater Screening

    International Nuclear Information System (INIS)

    Taylor, Glenn A.; Collard, Leonard B.

    2005-01-01

    The Automated Intruder Analysis has been extended to include an Automated Ground Water Screening option. This option screens 825 radionuclides while rigorously applying the National Council on Radiation Protection (NCRP) methodology. An extension to that methodology is presented to give a more realistic screening factor for those radionuclides which have significant daughters. The extension has the promise of reducing the number of radionuclides which must be tracked by the customer. By combining the Automated Intruder Analysis with the Automated Groundwater Screening, a consistent set of assumptions and databases is used. A method is proposed to eliminate trigger values by performing rigorous calculation of the screening factor, thereby reducing the number of radionuclides sent to further analysis. Using the same problem definitions as in previous groundwater screenings, the automated groundwater screening found one additional nuclide, Ge-68, which failed the screening. It also found that 18 of the 57 radionuclides contained in NCRP Table 3.1 failed the screening. This report describes the automated groundwater screening computer application

  8. Can automation in radiotherapy reduce costs?

    Science.gov (United States)

    Massaccesi, Mariangela; Corti, Michele; Azario, Luigi; Balducci, Mario; Ferro, Milena; Mantini, Giovanna; Mattiucci, Gian Carlo; Valentini, Vincenzo

    2015-01-01

    Computerized automation is likely to play an increasingly important role in radiotherapy. The objective of this study was to report the results of the first part of a program to implement a model for economic evaluation based on the micro-costing method. To test the efficacy of the model, the financial impact of the introduction of an automation tool was estimated. A single- and multi-center validation of the model by a prospective collection of data is planned as the second step of the program. The model was implemented using an interactive spreadsheet (Microsoft Excel, 2010). The variables to be included were identified across three components: productivity, staff, and equipment. To calculate staff requirements, the workflow of the Gemelli ART center was mapped out and relevant workload measures were defined. Profit and loss, productivity and staffing were identified as significant outcomes. Results were presented in terms of earnings before interest and taxes (EBIT). Three different scenarios were hypothesized: the baseline situation at Gemelli ART (scenario 1); reduction by 2 minutes of the average duration of treatment fractions (scenario 2); and increased incidence of advanced treatment modalities (scenario 3). By using the model, predicted EBIT values for each scenario were calculated across a period of eight years (from 2015 to 2022). For both scenarios 2 and 3, costs are expected to increase slightly compared to the baseline situation, mainly due to a small increase in clinical personnel costs. However, in both cases EBIT values are more favorable than in the baseline situation (EBIT values: scenario 1, 27%; scenario 2, 30%; scenario 3, 28% of revenues). A model based on a micro-costing method was able to estimate the financial consequences of the introduction of an automation tool in our radiotherapy department. A prospective collection of data at Gemelli ART and in a consortium of centers is currently under way to prospectively validate the model.
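
    The underlying spreadsheet logic reduces to simple arithmetic over scenario inputs; the sketch below, with entirely invented figures, computes EBIT as a share of revenues for three hypothetical scenarios.

```python
# Toy micro-costing scenario comparison: EBIT = revenues - (staff + equipment).
# All figures are invented and do not come from the study.
def ebit(fractions_per_year, revenue_per_fraction, staff_cost, equipment_cost):
    revenues = fractions_per_year * revenue_per_fraction
    return revenues - (staff_cost + equipment_cost), revenues

scenarios = {
    "baseline":         (30000, 120.0, 1.60e6, 1.00e6),
    "2 min faster":     (33000, 120.0, 1.65e6, 1.00e6),
    "more advanced RT": (30500, 135.0, 1.62e6, 1.10e6),
}
for name, args in scenarios.items():
    e, rev = ebit(*args)
    print(f"{name}: EBIT = {e / rev:.0%} of revenues")
```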

  9. Optimizing transformations for automated, high throughput analysis of flow cytometry data

    Directory of Open Access Journals (Sweden)

    Weng Andrew

    2010-11-01

    Full Text Available Abstract Background In a high throughput setting, effective flow cytometry data analysis depends heavily on proper data preprocessing. While usual preprocessing steps of quality assessment, outlier removal, normalization, and gating have received considerable scrutiny from the community, the influence of data transformation on the output of high throughput analysis has been largely overlooked. Flow cytometry measurements can vary over several orders of magnitude, cell populations can have variances that depend on their mean fluorescence intensities, and may exhibit heavily-skewed distributions. Consequently, the choice of data transformation can influence the output of automated gating. An appropriate data transformation aids in data visualization and gating of cell populations across the range of data. Experience shows that the choice of transformation is data specific. Our goal here is to compare the performance of different transformations applied to flow cytometry data in the context of automated gating in a high throughput, fully automated setting. We examine the most common transformations used in flow cytometry, including the generalized hyperbolic arcsine, biexponential, linlog, and generalized Box-Cox, all within the BioConductor flowCore framework that is widely used in high throughput, automated flow cytometry data analysis. All of these transformations have adjustable parameters whose effects upon the data are non-intuitive for most users. By making some modelling assumptions about the transformed data, we develop maximum likelihood criteria to optimize parameter choice for these different transformations. Results We compare the performance of parameter-optimized and default-parameter (in flowCore data transformations on real and simulated data by measuring the variation in the locations of cell populations across samples, discovered via automated gating in both the scatter and fluorescence channels. We find that parameter
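
    One way to make the parameter choice concrete: pick the cofactor of the hyperbolic arcsine transform by maximum likelihood under a Gaussian model of the transformed data. This is a simplified stand-in for the flowCore-based optimization described in the paper; the data and the single-population model are illustrative assumptions.

```python
# Maximum-likelihood choice of the arcsinh cofactor b for x -> asinh(x / b),
# assuming the transformed data is approximately Gaussian.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
x = rng.lognormal(mean=4.0, sigma=1.5, size=5000)   # stand-in channel data

def neg_log_lik(log_b: float) -> float:
    """Profile negative log-likelihood of a Gaussian model for asinh(x/b),
    including the transform's Jacobian term."""
    b = np.exp(log_b)
    y = np.arcsinh(x / b)
    log_jacobian = -0.5 * np.log(x**2 + b**2)       # log d/dx asinh(x/b)
    return x.size * np.log(y.std()) - log_jacobian.sum()

best = minimize_scalar(neg_log_lik, bounds=(-2.0, 10.0), method="bounded")
print("optimized cofactor b:", np.exp(best.x))
```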

  10. Abstracting massive data for lightweight intrusion detection in computer networks

    KAUST Repository

    Wang, Wei

    2016-10-15

    Anomaly intrusion detection in big data environments calls for lightweight models that are able to achieve real-time performance during detection. Abstracting audit data provides a solution to improve the efficiency of data processing in intrusion detection. Data abstraction refers to abstracting or extracting the most relevant information from a massive dataset. In this work, we propose three strategies of data abstraction, namely, exemplar extraction, attribute selection and attribute abstraction. We first propose an effective method called exemplar extraction to extract representative subsets from the original massive data prior to building the detection models. Two clustering algorithms, Affinity Propagation (AP) and traditional k-means, are employed to find the exemplars from the audit data. k-Nearest Neighbor (k-NN), Principal Component Analysis (PCA) and one-class Support Vector Machine (SVM) are used for the detection. We then employ another two strategies, attribute selection and attribute extraction, to abstract audit data for anomaly intrusion detection. Two http streams collected from a real computing environment as well as the KDD'99 benchmark data set are used to validate these three strategies of data abstraction. The comprehensive experimental results show that while all the three strategies improve the detection efficiency, the AP-based exemplar extraction achieves the best performance of data abstraction.
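
    A small sketch of the exemplar-extraction strategy using scikit-learn's AffinityPropagation followed by a k-NN detector; the synthetic data, labels and supervised set-up are illustrative simplifications of the paper's experiments.

```python
# Exemplar extraction with Affinity Propagation, then lightweight k-NN
# detection against the (much smaller) exemplar set. Data is synthetic.
import numpy as np
from sklearn.cluster import AffinityPropagation
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
normal  = rng.normal(0.0, 1.0, size=(500, 8))   # benign audit records
attacks = rng.normal(4.0, 1.0, size=(50, 8))    # anomalous records

# Summarize the massive benign traffic by its cluster exemplars.
ap = AffinityPropagation(random_state=0).fit(normal)
exemplars = ap.cluster_centers_                  # far fewer rows than `normal`

# Train k-NN on the exemplars plus a handful of known attacks.
X = np.vstack([exemplars, attacks[:25]])
y = np.array([0] * len(exemplars) + [1] * 25)
detector = KNeighborsClassifier(n_neighbors=3).fit(X, y)

print("fraction of held-out attacks flagged:",
      detector.predict(attacks[25:]).mean())
```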

  11. New Trends in Agent-Based Complex Automated Negotiations

    CERN Document Server

    Zhang, Minjie; Robu, Valentin; Fatima, Shaheen; Matsuo, Tokuro

    2012-01-01

    Complex Automated Negotiations represent an important, emerging area in the field of Autonomous Agents and Multi-Agent Systems. Automated negotiations can be complex, since there are a lot of factors that characterize such negotiations. These factors include the number of issues, dependencies between these issues,  representation of utilities, the negotiation protocol, the number of parties in the negotiation (bilateral or multi-party), time constraints, etc. Software agents can support automation or simulation of such complex negotiations on the behalf of their owners, and can provide them with efficient bargaining strategies. To realize such a complex automated negotiation, we have to incorporate advanced Artificial Intelligence technologies includes search, CSP, graphical utility models, Bayes nets, auctions, utility graphs, predicting and learning methods. Applications could include e-commerce tools, decision-making support tools, negotiation support tools, collaboration tools, etc. This book aims to pro...

  12. Automated quantification of proliferation with automated hot-spot selection in phosphohistone H3/MART1 dual-stained stage I/II melanoma.

    Science.gov (United States)

    Nielsen, Patricia Switten; Riber-Hansen, Rikke; Schmidt, Henrik; Steiniche, Torben

    2016-04-09

    Staging of melanoma includes quantification of a proliferation index, i.e., presumed melanocytic mitoses of H&E stains are counted manually in hot spots. Yet, its reproducibility and prognostic impact increase with immunohistochemical dual staining for phosphohistone H3 (PHH3) and MART1, which may also enable fully automated quantification by image analysis. To ensure manageable workloads and repeatable measurements in modern pathology, the study aimed to present an automated quantification of proliferation with automated hot-spot selection in PHH3/MART1-stained melanomas. Formalin-fixed, paraffin-embedded tissue from 153 consecutive stage I/II melanoma patients was immunohistochemically dual-stained for PHH3 and MART1. Whole slide images were captured, and the number of PHH3/MART1-positive cells was manually and automatically counted in the global tumor area and in a manually and automatically selected hot spot, i.e., a fixed 1-mm² square. Bland-Altman plots and hypothesis tests compared manual and automated procedures, and the Cox proportional hazards model established their prognostic impact. The mean difference between manual and automated global counts was 2.9 cells/mm² (P = 0.0071) and 0.23 cells per hot spot (P = 0.96) for automated counts in manually and automatically selected hot spots. In 77 % of cases, manual and automated hot spots overlapped. Fully manual hot-spot counts yielded the highest prognostic performance, with an adjusted hazard ratio of 5.5 (95 % CI, 1.3-24, P = 0.024) as opposed to 1.3 (95 % CI, 0.61-2.9, P = 0.47) for automated counts with automated hot spots. The automated index and automated hot-spot selection were highly correlated to their manual counterparts, but altogether their prognostic impact was noticeably reduced. Because correct recognition of only one PHH3/MART1-positive cell seems important, extremely high sensitivity and specificity of the algorithm are required for prognostic purposes. Thus, automated
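
    Automated hot-spot selection over a fixed 1-mm² square can be phrased as a moving-window sum; the sketch below does this with a uniform filter over a mock map of positive-cell centroids. The pixel pitch and cell counts are invented, and this is only one plausible implementation of the selection step.

```python
# Moving-window hot-spot search: slide a fixed square over the slide grid
# and keep the position with the most positive cells. Sizes are mock.
import numpy as np
from scipy.ndimage import uniform_filter

rng = np.random.default_rng(1)
density = np.zeros((2500, 2500))                 # slide grid, 4 um / pixel
idx = rng.integers(0, 2500, size=(400, 2))       # positive-cell centroids
density[idx[:, 0], idx[:, 1]] = 1

win = 250                                        # 250 px * 4 um = 1 mm
counts = uniform_filter(density, size=win) * win**2   # window cell counts
r, c = np.unravel_index(np.argmax(counts), counts.shape)
print(f"hot spot centred at pixel ({r}, {c}), {counts[r, c]:.0f} cells")
```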

  13. Standard IEC 61850 substation automation

    Energy Technology Data Exchange (ETDEWEB)

    Bricchi, A.; Mezzadri, D. [Selta, Tortoreto (Italy)

    2008-07-01

    The International Electrotechnical Commission (IEC) 61850 standard is the reference communication protocol for all electrical substation protection and control systems. It creates models of all the elements and functionalities of an electrical substation, including physical elements such as switches or circuit breakers, as well as protection, control and monitoring functionalities. Network managers need to renew power substation automation and control systems in order to improve the efficiency and quality of services offered by electric utilities. Selta has proposed a new integrated solution for the automation of power substations which is fully compliant with the IEC 61850 norms. The solution involves the integration of control, automation, protection, monitoring and maintenance functions and applies leading-edge technology to its systems, particularly for the TERNA network. The system is based on the use of many electronic devices at a power plant, each one with a specialized function, and all interconnected via a station LAN. This solution was tested on the TERNA network in Italy, in VHV and HV stations. It was shown to offer many advantages, such as an architecture based on full interoperability between control, monitoring and protection equipment; centralized and distributed automation; a station LAN that allows full interoperability between different bay units and protection relays in order to integrate equipment from various suppliers; the integration of automation systems in existing bay units and protection relays equipped with standard communication buses or with proprietary interfaces; and time synchronization for the entire system through a station GPS reception system. 10 refs., 1 tab., 7 figs.

  14. Automation and Robotics for Space-Based Systems, 1991

    Science.gov (United States)

    Williams, Robert L., II (Editor)

    1992-01-01

    The purpose of this in-house workshop was to assess the state-of-the-art of automation and robotics for space operations from an LaRC perspective and to identify areas of opportunity for future research. Over half of the presentations came from the Automation Technology Branch, covering telerobotic control, extravehicular activity (EVA) and intra-vehicular activity (IVA) robotics, hand controllers for teleoperation, sensors, neural networks, and automated structural assembly, all applied to space missions. Other talks covered the Remote Manipulator System (RMS) active damping augmentation, space crane work, modeling, simulation, and control of large, flexible space manipulators, and virtual passive controller designs for space robots.

  15. Automation in biological crystallization.

    Science.gov (United States)

    Stewart, Patrick Shaw; Mueller-Dieckmann, Jochen

    2014-06-01

    Crystallization remains the bottleneck in the crystallographic process leading from a gene to a three-dimensional model of the encoded protein or RNA. Automation of the individual steps of a crystallization experiment, from the preparation of crystallization cocktails for initial or optimization screens to the imaging of the experiments, has been the response to this issue. Today, large high-throughput crystallization facilities, many of them open to the general user community, are capable of setting up thousands of crystallization trials per day. It is thus possible to test multiple constructs of each target for their ability to form crystals on a production-line basis. This has improved success rates and made crystallization much more convenient. High-throughput crystallization, however, cannot relieve users of the task of producing samples of high quality. Moreover, the time gained from eliminating manual preparations must now be invested in the careful evaluation of the increased number of experiments. The latter requires a sophisticated data and laboratory information-management system. A review of the current state of automation at the individual steps of crystallization, with specific attention to the automation of optimization, is given.

  16. MATHEMATICAL MODELING, AUTOMATION AND CONTROL OF THE BIOCONVERSION OF SORBITOL TO SORBOSE IN THE VITAMIN C PRODUCTION PROCESS I. MATHEMATICAL MODELING

    Directory of Open Access Journals (Sweden)

    A. Bonomi

    1997-12-01

    Full Text Available In 1990, the Biotechnology and Control Systems Groups of IPT started developing a system for the control and automation of fermentation processes, applied to the oxidation of sorbitol to sorbose by the bacterium Gluconobacter oxydans, the microbial step of the vitamin C production process, which was chosen as a case study. Initially, a thirteen-parameter model was fitted to represent the batch operation of the system, using nonlinear regression analysis (the flexible polyhedron method). Based on these results, a model for the continuous process (with the same kinetic equations) was constructed and its optimum operating point obtained.
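
    The "flexible polyhedron" method is better known as the Nelder-Mead simplex algorithm. As a sketch of this kind of nonlinear regression, assuming an invented two-parameter logistic stand-in for the actual thirteen-parameter batch model and synthetic concentration data:

```python
import numpy as np
from scipy.optimize import minimize

# Synthetic batch data (hours, g/L sorbose) standing in for real runs.
t = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0])
sorbose = np.array([0.5, 8.0, 28.0, 61.0, 88.0, 97.0])

def model(params, t):
    p_max, k = params                              # plateau and rate constant
    return p_max / (1.0 + np.exp(-k * (t - 5.0)))  # toy logistic product curve

def sse(params):
    # Sum of squared errors, the objective minimized by the simplex search.
    return np.sum((model(params, t) - sorbose) ** 2)

result = minimize(sse, x0=[100.0, 1.0], method="Nelder-Mead")
print(result.x)  # fitted (p_max, k)
```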

  17. The ECLSS Advanced Automation Project Evolution and Technology Assessment

    Science.gov (United States)

    Dewberry, Brandon S.; Carnes, James R.; Lukefahr, Brenda D.; Rogers, John S.; Rochowiak, Daniel M.; Mckee, James W.; Benson, Brian L.

    1990-01-01

    Viewgraphs on Environmental Control and Life Support System (ECLSS) advanced automation project evolution and technology assessment are presented. Topics covered include: the ECLSS advanced automation project; descriptions of automatic fault diagnosis of ECLSS subsystems; in-line, real-time chemical and microbial fluid analysis; and a description of object-oriented, distributed chemical and microbial modeling of regenerative environmental control systems.

  18. Automated and observer based light field indicator edge evaluation in diagnostic X-ray equipment

    Directory of Open Access Journals (Sweden)

    Márcio Bottaro

    Full Text Available Abstract Introduction To analyze edge detection and optical contrast calculation of light-field indicators used in X-ray equipment via automated and observer-based methods, and to compare them with current standard approaches, which give no exact definition for determining the light-field edge. Methods An automated light-sensor array was used to measure the penumbra zone of the edge in standard X-ray equipment, while trained and naïve human observers were asked to mark the light-field edge according to their own judgment. Different interpretations of the contrast were then calculated and compared. Results In contrast to automated measurements of edge definition and detection, measurements by human observers showed large inter-observer variation, independent of their training with X-ray equipment. Different contrast calculations based on the different edge definitions gave very different contrast values. Conclusion As the main conclusion, we propose a more exact edge definition for the X-ray light field, corresponding well to the average human observer’s edge determination. The new edge definition method with automated systems would reduce human variability in edge determination. Such errors could potentially affect the approval of X-ray equipment and also increase the radiation dose. The automated measurement based on human observers’ edge definition and the corresponding contrast calculation may lead to a more precise light-field calibration, which enables reduced irradiation doses for radiology patients.
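
    A common automated convention, which the sketch below illustrates, is to place the edge where the measured profile crosses 50% of the way between the dark and bright plateaus; the threshold and the decile-based plateau estimates here are assumptions for illustration, not the paper's proposed definition:

```python
import numpy as np

def edge_position(x_mm, intensity):
    """Locate an edge on a penumbra profile at the 50% crossing between
    the dark and bright plateaus (plateaus estimated from extreme deciles)."""
    lo = np.quantile(intensity, 0.1)
    hi = np.quantile(intensity, 0.9)
    half = lo + 0.5 * (hi - lo)
    i = np.argmax(intensity >= half)      # first sample at/above 50%
    # Linear interpolation between the samples bracketing the crossing.
    x0, x1 = x_mm[i - 1], x_mm[i]
    y0, y1 = intensity[i - 1], intensity[i]
    return x0 + (half - y0) * (x1 - x0) / (y1 - y0)

x = np.linspace(-2, 2, 81)                # mm across the penumbra
profile = 1 / (1 + np.exp(-4 * x))        # synthetic smooth edge
print(f"edge at {edge_position(x, profile):+.3f} mm")
```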

  19. Automics: an integrated platform for NMR-based metabonomics spectral processing and data analysis

    Directory of Open Access Journals (Sweden)

    Qu Lijia

    2009-03-01

    Full Text Available Abstract Background Spectral processing and post-experimental data analysis are the major tasks in NMR-based metabonomics studies. While there are commercial and freely licensed software tools available to assist with these tasks, researchers usually have to use multiple software packages for their studies because software packages generally focus on specific tasks. It would be beneficial to have a highly integrated platform in which these tasks can be completed within one package. Moreover, with an open source architecture, newly proposed algorithms or methods for spectral processing and data analysis can be implemented much more easily and accessed freely by the public. Results In this paper, we report an open source software tool, Automics, which is specifically designed for NMR-based metabonomics studies. Automics is a highly integrated platform that provides functions covering almost all the stages of NMR-based metabonomics studies. Automics provides high-throughput automatic modules implementing recently proposed algorithms, as well as powerful manual modules for 1D NMR spectral processing. In addition to spectral processing functions, powerful features for data organization, data pre-processing, and data analysis have been implemented. Nine statistical methods can be applied, covering feature selection (Fisher's criterion), data reduction (PCA, LDA, ULDA), unsupervised clustering (K-means), and supervised regression and classification (PLS/PLS-DA, KNN, SIMCA, SVM). Moreover, Automics has a user-friendly graphical interface for visualizing NMR spectra and data analysis results. The functional ability of Automics is demonstrated with an analysis of a type 2 diabetes metabolic profile. Conclusion Automics facilitates high-throughput 1D NMR spectral processing and high-dimensional data analysis for NMR-based metabonomics applications. Using Automics, users can complete spectral processing and data analysis within one software package in most cases.
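
    Of the listed methods, PCA is typically the first look at a matrix of binned spectra. A minimal sketch with scikit-learn, using random numbers in place of real preprocessed spectra (the matrix shape and every value are invented):

```python
import numpy as np
from sklearn.decomposition import PCA

# Rows = samples, columns = binned 1D NMR intensities (random stand-ins).
rng = np.random.default_rng(0)
spectra = rng.normal(size=(40, 250))

pca = PCA(n_components=2)
scores = pca.fit_transform(spectra)      # per-sample scores for a 2D plot
print(scores.shape)                      # (40, 2)
print(pca.explained_variance_ratio_)     # variance captured by each PC
```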

  20. Automated vocabulary discovery for geo-parsing online epidemic intelligence

    Directory of Open Access Journals (Sweden)

    Freifeld Clark C

    2009-11-01

    Full Text Available Abstract Background Automated surveillance of the Internet provides a timely and sensitive method for alerting on global emerging infectious disease threats. HealthMap is part of a new generation of online systems designed to monitor and visualize, on a real-time basis, disease outbreak alerts as reported by online news media and public health sources. HealthMap is of specific interest for national and international public health organizations and international travelers. A particular task that makes such surveillance useful is the automated discovery of the geographic references contained in the retrieved outbreak alerts. This task is sometimes referred to as "geo-parsing". A typical approach to geo-parsing would demand an expensive training corpus of alerts manually tagged by a human. Results Given that human readers perform this kind of task by using both their lexical and contextual knowledge, we developed an approach which relies on a relatively small expert-built gazetteer, thus limiting the need for human input, but focuses on learning the context in which geographic references appear. We show in a set of experiments that this approach exhibits a substantial capacity to discover geographic locations outside of its initial lexicon. Conclusion The results of this analysis provide a framework for future automated global surveillance efforts that reduce manual input and improve timeliness of reporting.
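
    A toy sketch of the combination described above: seed gazetteer lookups plus a contextual cue that flags candidate place names missing from the lexicon. The word lists and the capitalization heuristic are invented stand-ins for HealthMap's learned context models:

```python
GAZETTEER = {"nairobi", "jakarta", "lima"}       # small expert-built lexicon
CONTEXT_TRIGGERS = {"in", "near", "outside"}     # words observed to precede places

def geo_parse(text):
    tokens = text.replace(",", "").split()
    hits = []
    for i, tok in enumerate(tokens):
        if tok.lower() in GAZETTEER:
            hits.append((tok, "gazetteer"))      # known location
        elif i > 0 and tokens[i - 1].lower() in CONTEXT_TRIGGERS and tok.istitle():
            hits.append((tok, "context"))        # capitalized word after a trigger
    return hits

print(geo_parse("Cholera cases reported in Mombasa and in Nairobi"))
# -> [('Mombasa', 'context'), ('Nairobi', 'gazetteer')]
```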

  1. Longwall automation 2

    Energy Technology Data Exchange (ETDEWEB)

    David Hainsworth; David Reid; Con Caris; J.C. Ralston; C.O. Hargrave; Ron McPhee; I.N. Hutchinson; A. Strange; C. Wesner [CSIRO (Australia)

    2008-05-15

    This report covers a nominal two-year extension to the Major Longwall Automation Project (C10100). Production standard implementation of Longwall Automation Steering Committee (LASC) automation systems has been achieved at Beltana and Broadmeadow mines. The systems are now used on a 24/7 basis and have provided production benefits to the mines. The LASC Information System (LIS) has been updated and has been implemented successfully in the IT environment of major coal mining houses. This enables 3D visualisation of the longwall environment and equipment to be accessed on line. A simulator has been specified and a prototype system is now ready for implementation. The Shearer Position Measurement System (SPMS) has been upgraded to a modular commercial production standard hardware solution. A compact hardware solution for visual face monitoring has been developed, an approved enclosure for a thermal infrared camera has been produced, and software for providing horizon control through faulted conditions has been delivered. The incorporation of the LASC Cut Model information into OEM horizon control algorithms has been bench- and underground-tested. A prototype system for shield convergence monitoring has been produced, and studies to identify techniques for coal flow optimisation and void monitoring have been carried out. Liaison with equipment manufacturers has been maintained and technology delivery mechanisms for LASC hardware and software have been established.

  2. Automated Analysis of Infinite Scenarios

    DEFF Research Database (Denmark)

    Buchholtz, Mikael

    2005-01-01

    The security of a network protocol crucially relies on the scenario in which the protocol is deployed. This paper describes syntactic constructs for modelling network scenarios and presents an automated analysis tool, which can guarantee that security properties hold in all of the (infinitely many...

  3. Application of quantum dots as analytical tools in automated chemical analysis: A review

    International Nuclear Information System (INIS)

    Frigerio, Christian; Ribeiro, David S.M.; Rodrigues, S. Sofia M.; Abreu, Vera L.R.G.; Barbosa, João A.C.; Prior, João A.V.; Marques, Karine L.; Santos, João L.M.

    2012-01-01

    Highlights: ► Review on quantum dots application in automated chemical analysis. ► Automation by using flow-based techniques. ► Quantum dots in liquid chromatography and capillary electrophoresis. ► Detection by fluorescence and chemiluminescence. ► Electrochemiluminescence and radical generation. - Abstract: Colloidal semiconductor nanocrystals, or quantum dots (QDs), are one of the most relevant developments in the fast-growing world of nanotechnology. Initially proposed as luminescent biological labels, they are finding new important fields of application in analytical chemistry, where their photoluminescent properties have been exploited in environmental monitoring, pharmaceutical and clinical analysis, and food quality control. Despite the enormous variety of applications that have been developed, the automation of QD-based analytical methodologies with tools such as continuous-flow analysis and related techniques is hitherto very limited. Such automation would make it possible to exploit particular features of the nanocrystals: their versatile surface chemistry and ligand-binding ability, their aptitude to generate reactive species, the possibility of encapsulation in different materials while retaining native luminescence (providing the means for implementing renewable chemosensors), and even the use of more drastic, stability-impairing reaction conditions. In this review, we provide insights into the analytical potential of quantum dots, focusing on prospects for their utilization in automated flow-based and flow-related approaches and the future outlook of QD applications in chemical analysis.

  4. Implementation of the Automated Numerical Model Performance Metrics System

    Science.gov (United States)

    2011-09-26

    As of this writing, the DSRC IBM AIX machines DaVinci and Pascal and the Cray XT Einstein all use the PBS batch queuing system. Appendix A (General Automation System) describes a system that provides general-purpose tools and a general way to run jobs automatically.

  5. Is searching full text more effective than searching abstracts?

    Directory of Open Access Journals (Sweden)

    Lin Jimmy

    2009-02-01

    Full Text Available Abstract Background With the growing availability of full-text articles online, scientists and other consumers of the life sciences literature now have the ability to go beyond searching bibliographic records (title, abstract, metadata) to directly access full-text content. Motivated by this emerging trend, I posed the following question: is searching full text more effective than searching abstracts? This question is answered by comparing text retrieval algorithms on MEDLINE® abstracts, full-text articles, and spans (paragraphs) within full-text articles using data from the TREC 2007 genomics track evaluation. Two retrieval models are examined: bm25 and the ranking algorithm implemented in the open-source Lucene search engine. Results Experiments show that treating an entire article as an indexing unit does not consistently yield higher effectiveness compared to abstract-only search. However, retrieval based on spans, or paragraph-sized segments of full-text articles, consistently outperforms abstract-only search. Results suggest that highest overall effectiveness may be achieved by combining evidence from spans and full articles. Conclusion Users searching full text are more likely to find relevant articles than searching only abstracts. This finding affirms the value of full text collections for text retrieval and provides a starting point for future work in exploring algorithms that take advantage of rapidly-growing digital archives. Experimental results also highlight the need to develop distributed text retrieval algorithms, since full-text articles are significantly longer than abstracts and may require the computational resources of multiple machines in a cluster. The MapReduce programming model provides a convenient framework for organizing such computations.
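
    Length normalization is the mechanism that makes the indexing unit matter: under Okapi BM25 (the "bm25" model above), a term occurrence in a long full-text article earns less than the same occurrence in a short span. A self-contained scoring sketch; k1 and b are the common defaults, not necessarily the paper's settings:

```python
import math
from collections import Counter

def bm25_score(query_terms, doc_terms, corpus, k1=1.2, b=0.75):
    """Okapi BM25 score of one tokenized document against a query."""
    N = len(corpus)
    avgdl = sum(len(d) for d in corpus) / N
    tf = Counter(doc_terms)
    score = 0.0
    for term in query_terms:
        df = sum(1 for d in corpus if term in d)
        if df == 0:
            continue
        idf = math.log((N - df + 0.5) / (df + 0.5) + 1.0)
        # Longer documents shrink the term-frequency gain via b and avgdl.
        gain = tf[term] * (k1 + 1) / (
            tf[term] + k1 * (1 - b + b * len(doc_terms) / avgdl))
        score += idf * gain
    return score

docs = [["gene", "expression", "profile"], ["protein", "binding"], ["gene", "therapy"]]
print(bm25_score(["gene"], docs[0], docs))
```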

  6. Automated contour detection in cardiac MRI using active appearance models: the effect of the composition of the training set

    NARCIS (Netherlands)

    Angelié, Emmanuelle; Oost, Elco R.; Hendriksen, Dennis; Lelieveldt, Boudewijn P. F.; van der Geest, Rob J.; Reiber, Johan H. C.

    2007-01-01

    This study addresses the definition of the optimal training set for automated segmentation of short-axis left ventricular magnetic resonance (MR) imaging studies in clinical practice, based on active appearance models (AAM). We investigated the segmentation accuracy by varying the size and composition of the training set

  7. A COMPARATIVE STUDY OF AUTOMATION STRATEGIES AT VOLKSWAGEN IN GERMANY AND SOUTH AFRICA

    Directory of Open Access Journals (Sweden)

    O. Wessel

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: The final car assembly lines at Volkswagen’s production sites in Germany and South Africa are analysed to determine the best automation level based on cost, productivity, quality, and flexibility for a plant location. The methodology used is proposed by the Fraunhofer Institute. The final assembly processes are analysed and classified according to the automation level. The operations are evaluated at every level of automation based on information from existing factories. If the best levels of automation for all the parameters correspond, the optimal level of automation for a plant is reached. Otherwise, improvements and/or additional considerations are required to optimise the automation level. The result of the analysis indicates that the highest automation level is not necessarily the best in terms of cost and quality, and some de-automation is required. The analysis also shows that a low automation level can result in poor product quality and low productivity. The best automation strategy should be based on the analysis of all the aspects of the process in the local context.

    AFRIKAANSE OPSOMMING (translated): The final assembly lines at Volkswagen's plants in Germany and South Africa were analysed to determine the best automation level based on cost, productivity, quality, and adaptability given the location. The methodology followed is that proposed by the Fraunhofer Institute. The final assembly processes were analysed according to automation level. The operations were evaluated at every level of automation based on information from existing manufacturing plants. If the best automation levels for all parameters correspond, the optimal level of automation has been reached. If not, improvements and/or additional considerations are needed to optimise the automation level. The result of the analysis shows that the greatest degree of automation is not necessarily the best in terms of cost and quality

  8. Access to Mobile Communication Technology and Willingness to Participate in Automated Telemedicine Calls Among Chronically Ill Patients in Honduras

    Science.gov (United States)

    Mendoza-Avelares, Milton O.; Milton, Evan C.; Lange, Ilta; Fajardo, Roosevelt

    2010-01-01

    Abstract Objectives: Patients in underdeveloped countries may be left behind by advances in telehealthcare. We surveyed chronically ill patients with low incomes in Honduras to measure their use of mobile technologies and willingness to participate in mobile disease management support. Materials and Methods: 624 chronically ill primary care patients in Honduras were surveyed. We examined variation in telephone access across groups defined by patients' sociodemographic characteristics, diagnoses, and access to care. Logistic regression was used to identify independent correlates of patients' interest in automated telephonic support for disease management. Results: Participants had limited education (mean 4.8 years), and 65% were unemployed. Eighty-four percent had telephone access, and 78% had cell phones. Most respondents had voicemail (61%) and text messaging (58%). Mobile technologies were particularly common among patients who had to forego clinic visits and medications due to cost concerns (each p < 0.05). Most respondents (more than 80%) reported that they would be willing to receive automated calls focused on appointment reminders, medication adherence, health status monitoring, and self-care education. Patients were more likely to be willing to participate in automated telemedicine services if they had to cancel a clinic appointment due to transportation problems or forego medication due to cost pressures. Conclusions: Even in this poor region of Honduras, most chronically ill patients have access to mobile technology, and most are willing to participate in automated telephone disease management support. Given barriers to in-person care, new models of mobile healthcare should be developed for chronically ill patients in developing countries. PMID:21062234

  9. Abstraction of Dynamical Systems by Timed Automata

    DEFF Research Database (Denmark)

    Wisniewski, Rafael; Sloth, Christoffer

    2011-01-01

    To enable formal verification of a dynamical system, given by a set of differential equations, it is abstracted by a finite state model. This allows for the application of methods for model checking. Consequently, it opens the possibility of carrying out the verification of reachability and timing requirements.

  10. Conceptual Model of an Application for Automated Generation of Webpage Mobile Versions

    Directory of Open Access Journals (Sweden)

    Todor Rachovski

    2017-11-01

    Full Text Available Accessing webpages through various types of mobile devices with different screen sizes and using different browsers has put new demands on web developers. The main challenge is the development of websites with responsive design that adapts to the mobile device used. The article presents a conceptual model of an app for automated generation of mobile pages. It has a five-layer architecture: database, database management layer, business logic layer, web services layer, and presentation layer. The database stores all the data needed to run the application. The database management layer uses an ORM model to convert relational data into an object-oriented format and to control access to them. The business logic layer contains components that perform the actual work of building a mobile version of the page, including parsing, building a hierarchical model of the page, and a number of transformations. The web services layer provides external applications with access to lower-level functionalities, and the presentation layer is responsible for choosing and using the appropriate CSS. A web application that uses the proposed model was developed and experiments were conducted.

  11. CERES AuTomAted job Loading SYSTem (CATALYST): An automated workflow manager for satellite data production

    Science.gov (United States)

    Gleason, J. L.; Hillyer, T. N.; Wilkins, J.

    2012-12-01

    The CERES Science Team integrates data from 5 CERES instruments onboard the Terra, Aqua and NPP missions. The processing chain fuses CERES observations with data from 19 other unique sources. The addition of CERES Flight Model 5 (FM5) onboard NPP, coupled with ground processing system upgrades further emphasizes the need for an automated job-submission utility to manage multiple processing streams concurrently. The operator-driven, legacy-processing approach relied on manually staging data from magnetic tape to limited spinning disk attached to a shared memory architecture system. The migration of CERES production code to a distributed, cluster computing environment with approximately one petabyte of spinning disk containing all precursor input data products facilitates the development of a CERES-specific, automated workflow manager. In the cluster environment, I/O is the primary system resource in contention across jobs. Therefore, system load can be maximized with a throttling workload manager. This poster discusses a Java and Perl implementation of an automated job management tool tailored for CERES processing.
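
    A toy sketch of the throttling idea: when I/O is the contended resource, a semaphore caps how many jobs run concurrently while the rest queue. CATALYST itself is a Java and Perl system; the payload, the limit, and the Python rendering below are all illustrative:

```python
import threading
import time

MAX_CONCURRENT_IO = 3                    # invented cap on simultaneous I/O jobs
io_slots = threading.Semaphore(MAX_CONCURRENT_IO)

def run_job(job_id):
    with io_slots:                       # block until an I/O slot frees up
        time.sleep(0.1)                  # stand-in for staging input granules
        print(f"job {job_id} finished")

threads = [threading.Thread(target=run_job, args=(i,)) for i in range(10)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```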

  12. Automation from pictures

    International Nuclear Information System (INIS)

    Kozubal, A.J.

    1992-01-01

    The state transition diagram (STD) model has been helpful in the design of real time software, especially with the emergence of graphical computer aided software engineering (CASE) tools. Nevertheless, the translation of the STD to real time code has in the past been primarily a manual task. At Los Alamos we have automated this process. The designer constructs the STD using a CASE tool (Cadre Teamwork) using a special notation for events and actions. A translator converts the STD into an intermediate state notation language (SNL), and this SNL is compiled directly into C code (a state program). Execution of the state program is driven by external events, allowing multiple state programs to effectively share the resources of the host processor. Since the design and the code are tightly integrated through the CASE tool, the design and code never diverge, and we avoid design obsolescence. Furthermore, the CASE tool automates the production of formal technical documents from the graphic description encapsulated by the CASE tool. (author)
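
    The pipeline (STD, via SNL, to a compiled state program) amounts to table-driven, event-driven dispatch. A minimal sketch of such a generated state program, with invented states, events, and actions, rendered in Python rather than the generated C:

```python
# Transition table compiled from the STD: (state, event) -> (next state, action).
TRANSITIONS = {
    ("idle", "start"):    ("running", lambda: print("pump on")),
    ("running", "stop"):  ("idle",    lambda: print("pump off")),
    ("running", "fault"): ("alarm",   lambda: print("raise alarm")),
}

def dispatch(state, event):
    # Unknown (state, event) pairs leave the state unchanged and do nothing.
    next_state, action = TRANSITIONS.get((state, event), (state, lambda: None))
    action()
    return next_state

state = "idle"
for event in ["start", "fault"]:      # execution is driven by external events
    state = dispatch(state, event)
print(state)  # -> alarm
```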

  13. Towards agile large-scale predictive modelling in drug discovery with flow-based programming design principles.

    Science.gov (United States)

    Lampa, Samuel; Alvarsson, Jonathan; Spjuth, Ola

    2016-01-01

    Predictive modelling in drug discovery is challenging to automate as it often contains multiple analysis steps and might involve cross-validation and parameter tuning that create complex dependencies between tasks. With large-scale data or when using computationally demanding modelling methods, e-infrastructures such as high-performance or cloud computing are required, adding to the existing challenges of fault-tolerant automation. Workflow management systems can aid in many of these challenges, but the currently available systems are lacking in the functionality needed to enable agile and flexible predictive modelling. We here present an approach inspired by elements of the flow-based programming paradigm, implemented as an extension of the Luigi system which we name SciLuigi. We also discuss the experiences from using the approach when modelling a large set of biochemical interactions using a shared computer cluster.
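
    The baseline dependency idiom being extended is that of Luigi tasks, where each task declares what it requires and what it produces. The sketch below is plain Luigi-style code with invented file names; SciLuigi's contribution, per the paper, is to make the connections between such tasks separately defined and flow-based, which this minimal example does not show:

```python
import luigi

class Preprocess(luigi.Task):
    def output(self):
        return luigi.LocalTarget("descriptors.csv")

    def run(self):
        with self.output().open("w") as f:
            f.write("compound,feature\nA,0.3\n")

class TrainModel(luigi.Task):
    def requires(self):
        return Preprocess()              # explicit upstream dependency

    def output(self):
        return luigi.LocalTarget("model.txt")

    def run(self):
        with self.input().open() as f, self.output().open("w") as g:
            g.write(f"trained on {len(f.readlines()) - 1} rows\n")

if __name__ == "__main__":
    luigi.build([TrainModel()], local_scheduler=True)
```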

  14. ERP Human Enhancement Progress Report : Use case and computational model for adaptive maritime automation

    NARCIS (Netherlands)

    Kleij, R. van der; Broek, J. van den; Brake, G.M. te; Rypkema, J.A.; Schilder, C.M.C.

    2015-01-01

    Automation is often applied in order to increase the cost-effectiveness, reliability and safety of maritime ship and offshore operations. Automation of operator tasks has not, however, eliminated human error so much as created opportunities for new kinds of error. The ambition of the Adaptive

  15. DSNF AND OTHER WASTE FORM DEGRADATION ABSTRACTION

    International Nuclear Information System (INIS)

    Thornton, T.A.

    2000-01-01

    The purpose of this analysis/model report (AMR) is to select and/or abstract conservative degradation models for DOE (U.S. Department of Energy)-owned spent nuclear fuel (DSNF) and the immobilized ceramic plutonium (Pu) disposition waste forms for application in the proposed monitored geologic repository (MGR) postclosure Total System Performance Assessment (TSPA). Application of the degradation models abstracted herein for purposes other than TSPA should take into consideration the fact that they are, in general, very conservative. Using these models, the forward reaction rate for the mobilization of radionuclides, as solutes or colloids, away from the waste form/water interface by contact with repository groundwater can then be calculated. This forward reaction rate generally consists of the dissolution reaction at the surface of spent nuclear fuel (SNF) in contact with water, but the degradation models, in some cases, may also include and account for the physical disintegration of the SNF matrix. The models do not, however, account for retardation, precipitation, or inhibition of the migration of the mobilized radionuclides in the engineered barrier system (EBS). These models are based on the assumption that all components of the DSNF waste form are released congruently with the degradation of the matrix.

  16. DSNF and other waste form degradation abstraction

    Energy Technology Data Exchange (ETDEWEB)

    Thornton, Thomas A.

    2000-12-20

    The purpose of this analysis/model report (AMR) is to select and/or abstract conservative degradation models for DOE (U.S. Department of Energy)-owned spent nuclear fuel (DSNF) and the immobilized ceramic plutonium (Pu) disposition waste forms for application in the proposed monitored geologic repository (MGR) postclosure Total System Performance Assessment (TSPA). Application of the degradation models abstracted herein for purposes other than TSPA should take into consideration the fact that they are, in general, very conservative. Using these models, the forward reaction rate for the mobilization of radionuclides, as solutes or colloids, away from the waste form/water interface by contact with repository groundwater can then be calculated. This forward reaction rate generally consists of the dissolution reaction at the surface of spent nuclear fuel (SNF) in contact with water, but the degradation models, in some cases, may also include and account for the physical disintegration of the SNF matrix. The models do not, however, account for retardation, precipitation, or inhibition of the migration of the mobilized radionuclides in the engineered barrier system (EBS). These models are based on the assumption that all components of the DSNF waste form are released congruently with the degradation of the matrix.

  17. GoSam 2.0. Automated one loop calculations within and beyond the standard model

    International Nuclear Information System (INIS)

    Greiner, Nicolas; Deutsches Elektronen-Synchrotron

    2014-10-01

    We present GoSam 2.0, a fully automated framework for the generation and evaluation of one-loop amplitudes in multi-leg processes. The new version offers numerous improvements both on the generational aspects as well as on the reduction side. This leads to faster and more stable code for calculations within and beyond the Standard Model. Furthermore, it contains the extended version of the standardized interface to Monte Carlo programs, which allows for an easy combination with other existing tools. We briefly describe the conceptual innovations and present some phenomenological results.

  18. 78 FR 53466 - Modification of Two National Customs Automation Program (NCAP) Tests Concerning Automated...

    Science.gov (United States)

    2013-08-29

    ... Customs Automation Program (NCAP) Tests Concerning Automated Commercial Environment (ACE) Document Image... National Customs Automation Program (NCAP) tests concerning document imaging, known as the Document Image... the National Customs Automation Program (NCAP) tests concerning document imaging, known as the...

  19. Abstract decomposition theorem and applications

    CERN Document Server

    Grossberg, R; Grossberg, Rami; Lessmann, Olivier

    2005-01-01

    Let K be an Abstract Elementary Class. Under the assumptions that K has a nicely behaved forking-like notion, regular types, and existence of some prime models, we establish a decomposition theorem for such classes. The decomposition implies a main gap result for the class K. The setting is general enough to cover \aleph_0-stable first-order theories (proved by Shelah in 1982), excellent classes of atomic models of a first-order theory (proved by Grossberg and Hart in 1987), and the class of submodels of a large sequentially homogeneous \aleph_0-stable model (which is new).

  20. A logical correspondence between natural semantics and abstract machines

    DEFF Research Database (Denmark)

    Simmons, Robert J.; Zerny, Ian

    2013-01-01

    We present a logical correspondence between natural semantics and abstract machines. This correspondence enables the mechanical and fully-correct construction of an abstract machine from a natural semantics. Our logical correspondence mirrors the Reynolds functional correspondence, but we manipulate semantic specifications encoded in a logical framework instead of manipulating functional programs. Natural semantics and abstract machines are instances of substructural operational semantics. As a byproduct, using a substructural logical framework, we bring concurrent and stateful models

  1. Automated fault-management in a simulated spaceflight micro-world

    Science.gov (United States)

    Lorenz, Bernd; Di Nocera, Francesco; Rottger, Stefan; Parasuraman, Raja

    2002-01-01

    BACKGROUND: As human spaceflight missions extend in duration and distance from Earth, a self-sufficient crew will bear far greater onboard responsibility and authority for mission success. This will increase the need for automated fault management (FM). Human factors issues in the use of such systems include maintenance of cognitive skill, situational awareness (SA), trust in automation, and workload. This study examines the human performance consequences of operator use of intelligent FM support in interaction with an autonomous, space-related, atmospheric control system. METHODS: An expert system representing a model-based reasoning agent supported operators at a low level of automation (LOA) through a computerized fault-finding guide, at a medium LOA through an automated diagnosis and recovery advisory, and at a high LOA through automated diagnosis and recovery implementation, subject to operator approval or veto. Ten percent of the experimental trials involved complete failure of FM support. RESULTS: Benefits of automation were reflected in more accurate diagnoses, shorter fault identification time, and reduced subjective operator workload. Unexpectedly, fault identification times deteriorated more at the medium than at the high LOA during automation failure. Analyses of information sampling behavior showed that offloading operators from recovery implementation during reliable automation enabled operators at high LOA to engage in fault assessment activities. CONCLUSIONS: The potential threat to SA imposed by high-level automation, in which decision advisories are automatically generated, need not inevitably be counteracted by choosing a lower LOA. Instead, freeing operator cognitive resources by automatic implementation of recovery plans at a higher LOA can promote better fault comprehension, so long as the automation interface is designed to support efficient information sampling.

  2. Writing business research article abstracts: A genre approach

    Directory of Open Access Journals (Sweden)

    Carmen Piqué-Noguera

    2012-10-01

    Full Text Available A great deal has been published about oral and written genres in business (e.g., letters, research articles, oral presentations, etc.), but, as many experts would argue, less attention has been paid to business research article abstracts as a written genre. This research intends to raise rhetorical awareness about the role of abstracts in today’s academic world. To this effect, the abstracts of two official publications of the Association for Business Communication, Journal of Business Communication and Business Communication Quarterly, have been analyzed and compared in terms of structure and content according to models published in the specialized literature. The results show an irregular and inconsistent presentation of abstracts, a good number of them following no set pattern and thus lacking important information for researchers. These findings suggest, first of all, that abstracts have a specific mission to fulfil and should not be disregarded; and, secondly, that journal guidelines for authors should be more explicit in their instructions on how to write and structure abstracts.

  3. On problems in defining abstract and metaphysical concepts--emergence of a new model.

    Science.gov (United States)

    Nahod, Bruno; Nahod, Perina Vukša

    2014-12-01

    Basic anthropological terminology is the first project covering terms from the domain of the social sciences under the Croatian Special Field Terminology program (Struna). Problems that had been sporadically noticed, or whose existence could be presumed, during the processing of terms from mainly technical fields and sciences have finally emerged in "anthropology". The principles of the General Theory of Terminology (GTT), which are followed in Struna, were put to a truly exacting test, and sometimes stretched beyond their limits, when applied to concepts that do not necessarily have referents in the physical world; namely, abstract and metaphysical concepts. We are currently developing a new terminographical model based on Idealized Cognitive Models (ICM), which will hopefully ensure a better cross-field implementation of various types of concepts and their relations. The goal of this paper is to introduce the theoretical bases of our model. Additionally, we present a pilot study from the series of experiments in which we are trying to investigate the nature of conceptual categorization in special languages and its proposed difference from categorization in general language.

  4. Universal Verification Methodology Based Register Test Automation Flow.

    Science.gov (United States)

    Woo, Jae Hun; Cho, Yong Kwan; Park, Sun Kyu

    2016-05-01

    In today's SoC design, the number of registers has increased along with the complexity of hardware blocks. Register validation is a time-consuming and error-prone task. Therefore, we need an efficient way to perform verification with less effort in a shorter time. In this work, we suggest a register test automation flow based on UVM (Universal Verification Methodology). UVM provides a standard methodology, called a register model, to facilitate stimulus generation and functional checking of registers. However, it is not easy for designers to create register models for their functional blocks or to integrate the models into a test-bench environment, because doing so requires knowledge of SystemVerilog and the UVM libraries. For the creation of register models, many commercial tools support register model generation from a register specification described in IP-XACT, but it is time-consuming to describe register specifications in the IP-XACT format. For easy creation of register models, we propose a spreadsheet-based register template which is translated to an IP-XACT description, from which register models can be easily generated using commercial tools. On the other hand, we also automate all the steps involved in integrating the test-bench and generating test-cases, so that designers may use the register model without detailed knowledge of UVM or SystemVerilog. This automation flow involves generating and connecting test-bench components (e.g., driver, checker, bus adaptor, etc.) and writing a test sequence for each type of register test-case. With the proposed flow, designers can save a considerable amount of time in verifying the functionality of registers.
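
    A minimal sketch of the spreadsheet-to-register-description step, assuming a toy CSV register spec; the element names only loosely follow IP-XACT conventions, and the output is not a schema-complete IP-XACT document:

```python
import csv
import io
import xml.etree.ElementTree as ET

# Toy spreadsheet export: one register per row.
CSV_SPEC = """name,offset,width,access
CTRL,0x00,32,read-write
STATUS,0x04,32,read-only
"""

block = ET.Element("registerBlock")
for row in csv.DictReader(io.StringIO(CSV_SPEC)):
    reg = ET.SubElement(block, "register")
    ET.SubElement(reg, "name").text = row["name"]
    ET.SubElement(reg, "addressOffset").text = row["offset"]
    ET.SubElement(reg, "size").text = row["width"]
    ET.SubElement(reg, "access").text = row["access"]

print(ET.tostring(block, encoding="unicode"))
```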

  5. A simple viability analysis for unicellular cyanobacteria using a new autofluorescence assay, automated microscopy, and ImageJ

    Directory of Open Access Journals (Sweden)

    Schulze Katja

    2011-11-01

    Full Text Available Abstract Background Currently established methods to identify viable and non-viable cells of cyanobacteria are either time-consuming (e.g., plating) or preparation-intensive (e.g., fluorescent staining). In this paper we present a new and fast viability assay for unicellular cyanobacteria, which uses red chlorophyll fluorescence and an unspecific green autofluorescence to differentiate viable and non-viable cells without the need for sample preparation. Results The viability assay for unicellular cyanobacteria using red and green autofluorescence was established and validated for the model organism Synechocystis sp. PCC 6803. Both autofluorescence signals could be observed simultaneously, allowing a direct classification of viable and non-viable cells. The results were confirmed by plating/colony counts, absorption spectra, and chlorophyll measurements. The use of an automated fluorescence microscope and a novel ImageJ-based image analysis plugin allows a semi-automated analysis. Conclusions The new method simplifies the process of viability analysis and allows a quick and accurate analysis. Furthermore, the results indicate that a combination of the new assay with absorption spectra or chlorophyll concentration measurements allows the estimation of the vitality of cells.

  6. Early Validation of Automation Plant Control Software using Simulation Based on Assumption Modeling and Validation Use Cases

    Directory of Open Access Journals (Sweden)

    Veronika Brandstetter

    2015-10-01

    Full Text Available In automation plants, technical processes must be conducted in a way that products, substances, or services are produced reliably, with sufficient quality, and with minimal strain on resources. A key driver in conducting these processes is the automation plant’s control software, which controls the technical plant components and thereby affects the physical, chemical, and mechanical processes that take place in automation plants. To this end, the control software of an automation plant must adhere to strict process requirements arising from the technical processes and from the physical plant design. Currently, the validation of the control software often starts late in the engineering process, in many cases once the automation plant is almost completely constructed. However, as widely acknowledged, the later the control software of the automation plant is validated, the higher the effort for correcting revealed defects, which can lead to serious budget overruns and project delays. In this article we propose an approach that allows the early validation of automation control software against the technical plant processes and assumptions about the physical plant design by means of simulation. We demonstrate the application of our approach on an actual plant project from the automation industry and present its technical implementation

  7. WIDAFELS flexible automation systems

    International Nuclear Information System (INIS)

    Shende, P.S.; Chander, K.P.; Ramadas, P.

    1990-01-01

    After discussing the various aspects of automation, some typical examples of various levels of automation are given. One of the examples is of automated production line for ceramic fuel pellets. (M.G.B.)

  8. Automation in Clinical Microbiology

    Science.gov (United States)

    Ledeboer, Nathan A.

    2013-01-01

    Historically, the trend toward automation in clinical pathology laboratories has largely bypassed the clinical microbiology laboratory. In this article, we review the historical impediments to automation in the microbiology laboratory and offer insight into the reasons why we believe that we are on the cusp of a dramatic change that will sweep a wave of automation into clinical microbiology laboratories. We review the currently available specimen-processing instruments as well as the total laboratory automation solutions. Lastly, we outline the types of studies that will need to be performed to fully assess the benefits of automation in microbiology laboratories. PMID:23515547

  9. Waste management research abstracts No. 18

    International Nuclear Information System (INIS)

    1987-12-01

    The eighteenth issue of this publication contains over 750 abstracts from 33 IAEA member countries covering various aspects of radioactive waste management. Radioactive waste disposal, processing and storage, geochemical and geological investigations related to waste management, mathematical models, and environmental impacts are reviewed.

  10. CHANNEL MORPHOLOGY TOOL (CMT): A GIS-BASED AUTOMATED EXTRACTION MODEL FOR CHANNEL GEOMETRY

    Energy Technology Data Exchange (ETDEWEB)

    JUDI, DAVID [Los Alamos National Laboratory; KALYANAPU, ALFRED [Los Alamos National Laboratory; MCPHERSON, TIMOTHY [Los Alamos National Laboratory; BERSCHEID, ALAN [Los Alamos National Laboratory

    2007-01-17

    This paper describes an automated Channel Morphology Tool (CMT) developed in the ArcGIS 9.1 environment. The CMT creates cross-sections along a stream centerline and uses a digital elevation model (DEM) to create station points with elevations along each of the cross-sections. The generated cross-sections may then be exported into a hydraulic model. Along with rapid cross-section generation, the CMT also eliminates any cross-section overlaps that might occur due to the sinuosity of the channels, using the Cross-section Overlap Correction Algorithm (COCoA). The CMT was tested by extracting cross-sections from a 5-m DEM for a 50-km channel length in Houston, Texas. The extracted cross-sections were compared directly with surveyed cross-sections in terms of cross-section area. Results indicated that the CMT-generated cross-sections satisfactorily matched the surveyed data.
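
    The core station-point operation can be sketched in a few lines: walk along a cross-section line centered on the stream centerline and sample the DEM at each station. The sketch below uses nearest-neighbor sampling on a toy grid and omits the overlap correction (COCoA); all names and values are illustrative:

```python
import numpy as np

def cross_section(dem, cell_size, center, direction, half_width, n_pts=21):
    """Station/elevation pairs along a line through `center` (map units),
    sampled from the DEM by nearest neighbor."""
    d = np.asarray(direction, dtype=float)
    d /= np.linalg.norm(d)                        # unit vector across channel
    s = np.linspace(-half_width, half_width, n_pts)
    pts = np.asarray(center) + np.outer(s, d)     # (x, y) of each station
    rows = np.clip((pts[:, 1] / cell_size).round().astype(int), 0, dem.shape[0] - 1)
    cols = np.clip((pts[:, 0] / cell_size).round().astype(int), 0, dem.shape[1] - 1)
    return s, dem[rows, cols]

dem = np.add.outer(np.arange(100), np.arange(100)).astype(float)  # toy 5-m DEM
stations, elev = cross_section(dem, cell_size=5.0, center=(250.0, 250.0),
                               direction=(1.0, -1.0), half_width=50.0)
print(stations[:3], elev[:3])
```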

  11. A cellular automation model for the change of public attitude regarding nuclear energy

    International Nuclear Information System (INIS)

    Ohnishi, Teruaki

    1991-01-01

    A cellular automation model was constructed to investigate how public opinion on nuclear energy in Japan depends upon the information environment and personal communication between people. From simulation with this model, the following became clear: (i) society is a highly non-linear system with a self-organizing potential; (ii) in a society composed of one type of constituent member with homogeneous characteristics, the trend of public opinion is substantially changed only when the effort to improve public acceptance over a long period of time, by means such as education, persuasion, and advertisement, exceeds a certain threshold; and (iii) in the case when the amount of information on nuclear risk released by the news media is reduced continuously from now on, the acceptability of nuclear energy is significantly improved, so far as the extent of the reduction exceeds a certain threshold. (author)
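
    A toy rendering of the mechanism the abstract describes: each cell holds an attitude (+1 pro, -1 anti) and updates by the balance of its neighbors plus a global "media field" standing for risk coverage in the news. Grid size, field strength, and the update rule are invented for illustration, not the paper's calibrated model:

```python
import numpy as np

rng = np.random.default_rng(1)
grid = rng.choice([-1, 1], size=(50, 50))   # +1 pro-nuclear, -1 anti
media_field = -0.2                          # negative = risk-dominated coverage

for _ in range(20):                         # synchronous update steps
    neighbors = sum(np.roll(np.roll(grid, dr, 0), dc, 1)
                    for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                    if (dr, dc) != (0, 0))
    influence = neighbors / 8.0 + media_field
    grid = np.where(influence >= 0, 1, -1)  # adopt the locally dominant view

print(f"share pro-nuclear after 20 steps: {(grid == 1).mean():.2f}")
```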

  12. A cellular automation model for the change of public attitude regarding nuclear energy

    Energy Technology Data Exchange (ETDEWEB)

    Ohnishi, Teruaki (CRC Research Inst., Chiba (Japan))

    1991-01-01

    A cellular automation model was constructed to investigate how public opinion on nuclear energy in Japan depends upon the information environment and personal communication between people. From simulation with this model, the following became clear: (i) society is a highly non-linear system with a self-organizing potential; (ii) in a society composed of one type of constituent member with homogeneous characteristics, the trend of public opinion is substantially changed only when the effort to improve public acceptance over a long period of time, by means such as education, persuasion, and advertisement, exceeds a certain threshold; and (iii) in the case when the amount of information on nuclear risk released by the news media is reduced continuously from now on, the acceptability of nuclear energy is significantly improved, so far as the extent of the reduction exceeds a certain threshold. (author).

  13. Advances in Computer, Communication, Control and Automation

    CERN Document Server

    2011 International Conference on Computer, Communication, Control and Automation

    2012-01-01

    The volume includes a set of selected papers extended and revised from the 2011 International Conference on Computer, Communication, Control and Automation (3CA 2011), which was held in Zhuhai, China, November 19-20, 2011. Topics covered in this volume include signal and image processing, speech and audio processing, video processing and analysis, artificial intelligence, computing and intelligent systems, machine learning, sensor and neural networks, knowledge discovery and data mining, fuzzy mathematics and applications, knowledge-based systems, hybrid systems modeling and design, risk analysis and management, and system modeling and simulation. We hope that researchers, graduate students, and other interested readers benefit scientifically from the proceedings and also find it stimulating in the process.

  14. Virtual automation.

    Science.gov (United States)

    Casis, E; Garrido, A; Uranga, B; Vives, A; Zufiaurre, C

    2001-01-01

    Total laboratory automation (TLA) can be substituted in mid-size laboratories by computer-controlled sample workflow (virtual automation). Such a solution has been implemented in our laboratory using PSM, software developed in cooperation with Roche Diagnostics (Barcelona, Spain) for this purpose. This software is connected to the online analyzers and to the laboratory information system, and is able to control and direct the samples, working as an intermediate station. The only difference from TLA is the replacement of transport belts by laboratory personnel. The implementation of this virtual automation system has allowed us to achieve the main advantages of TLA: a workload increase (64%) with a reduction in the cost per test (43%), a significant reduction in the number of biochemistry primary tubes (from 8 to 2), less aliquoting (from 600 to 100 samples/day), automation of functional testing, a drastic reduction of preanalytical errors (from 11.7% to 0.4% of the tubes), and better total response times for both inpatients (from up to 48 hours to up to 4 hours) and outpatients (from up to 10 days to up to 48 hours). As an additional advantage, virtual automation could be implemented without hardware investment and with a significant headcount reduction (15% in our laboratory).

  15. Selected Translated Abstracts of Chinese-Language Climate Change Publications; TOPICAL

    International Nuclear Information System (INIS)

    Cushman, R.M.; Burtis, M.D.

    1999-01-01

    This report contains English-translated abstracts of important Chinese-language literature concerning global climate change for the years 1995-1998. This body of literature includes the topics of adaptation, ancient climate change, climate variation, the East Asia monsoon, historical climate change, impacts, modeling, and radiation and trace-gas emissions. In addition to the bibliographic citations and abstracts translated into English, this report presents the original citations and abstracts in Chinese. Author and title indexes are included to assist the reader in locating abstracts of particular interest

  16. Automation of Test Cases for Web Applications : Automation of CRM Test Cases

    OpenAIRE

    Seyoum, Alazar

    2012-01-01

    The main theme of this project was to design a test automation framework for automating web-related test cases. Automating test cases designed for testing a web interface provides a means of improving the software development process by shortening the testing phase in the software development life cycle. In this project, an existing AutoTester framework and the iMacros test automation tool were used. A CRM Test Agent was developed to integrate AutoTester with iMacros and to enable the AutoTester,...

  17. Exploiting Spatial Abstraction in Predictive Analytics of Vehicle Traffic

    Directory of Open Access Journals (Sweden)

    Natalia Andrienko

    2015-04-01

    Full Text Available By applying visual analytics techniques to vehicle traffic data, we found a way to visualize and study the relationships between traffic intensity and movement speed on the links of a spatially abstracted transportation network. We observed that the traffic intensities and speeds in an abstracted network are interrelated in the same way as they are in a detailed street network at the level of street segments. We developed interactive visual interfaces that support representing these interdependencies by mathematical models. To test the possibility of utilizing them for performing traffic simulations on the basis of abstracted transportation networks, we devised a prototypical simulation algorithm employing these dependency models. The algorithm is embedded in an interactive visual environment for defining traffic scenarios, running simulations, and exploring their results. Our research demonstrates the principal possibility of performing traffic simulations on the basis of spatially abstracted transportation networks using dependency models derived from real traffic data. This possibility needs to be comprehensively investigated and tested in collaboration with transportation domain specialists.

  18. Energy Assessment of Automated Mobility Districts

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Yuche [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-08-03

    Automated vehicles (AVs) are increasingly being discussed as the basis for on-demand mobility services, introducing a new paradigm in which a fleet of AVs displaces private automobiles for day-to-day travel in dense activity districts. This project examines such a concept for displacing privately owned automobiles within a region containing dense activity generators (jobs, retail, entertainment, etc.), referred to as an automated mobility district (AMD). The project reviews several such districts, including airports, college campuses, business parks, downtown urban cores, and military bases, with examples of previous attempts to meet mobility needs apart from private automobiles, some with automated technology and others with more traditional transit-based solutions. The issues and benefits of AMDs are framed from the perspective of intra-district, inter-district, and border issues, and the requirements for a modeling framework are identified to adequately reflect the breadth of the mobility, energy, and emissions impacts anticipated with AMDs.

  19. Is searching full text more effective than searching abstracts?

    Science.gov (United States)

    Lin, Jimmy

    2009-02-03

    With the growing availability of full-text articles online, scientists and other consumers of the life sciences literature now have the ability to go beyond searching bibliographic records (title, abstract, metadata) to directly access full-text content. Motivated by this emerging trend, I posed the following question: is searching full text more effective than searching abstracts? This question is answered by comparing text retrieval algorithms on MEDLINE abstracts, full-text articles, and spans (paragraphs) within full-text articles using data from the TREC 2007 genomics track evaluation. Two retrieval models are examined: bm25 and the ranking algorithm implemented in the open-source Lucene search engine. Experiments show that treating an entire article as an indexing unit does not consistently yield higher effectiveness compared to abstract-only search. However, retrieval based on spans, or paragraph-sized segments of full-text articles, consistently outperforms abstract-only search. Results suggest that highest overall effectiveness may be achieved by combining evidence from spans and full articles. Users searching full text are more likely to find relevant articles than searching only abstracts. This finding affirms the value of full text collections for text retrieval and provides a starting point for future work in exploring algorithms that take advantage of rapidly-growing digital archives. Experimental results also highlight the need to develop distributed text retrieval algorithms, since full-text articles are significantly longer than abstracts and may require the computational resources of multiple machines in a cluster. The MapReduce programming model provides a convenient framework for organizing such computations.

  20. An Automation Planning Primer.

    Science.gov (United States)

    Paynter, Marion

    1988-01-01

    This brief planning guide for library automation incorporates needs assessment and evaluation of options to meet those needs. A bibliography of materials on automation planning and software reviews, library software directories, and library automation journals is included. (CLB)

  1. Automation in Immunohematology

    Directory of Open Access Journals (Sweden)

    Meenu Bajpai

    2012-01-01

    Full Text Available There have been rapid technological advances in blood banking in the South Asian region over the past decade, with an increasing emphasis on the quality and safety of blood products. The conventional test tube technique has given way to newer techniques such as the column agglutination technique, solid-phase red cell adherence assay, and erythrocyte-magnetized technique. These new technologies are adaptable to automation, and major manufacturers in this field have come up with semi- and fully automated equipment for immunohematology tests in the blood bank. Automation improves the objectivity and reproducibility of tests. It reduces human errors in patient identification and transcription errors. Documentation and traceability of tests, reagents, and processes, and archiving of results, is another major advantage of automation. Shifting from manual methods to automation is a major undertaking for any transfusion service seeking to provide quality patient care with shorter turnaround times for an ever-increasing workload. This article discusses the various issues involved in the process.

  2. Abstract probabilistic CNOT gate model based on double encoding: study of the errors and physical realizability

    Science.gov (United States)

    Gueddana, Amor; Attia, Moez; Chatta, Rihab

    2015-03-01

    In this work, we study the error sources behind the imperfect linear optical quantum components composing a non-deterministic quantum CNOT gate model, which performs the CNOT function with a success probability of 4/27 and uses a double-encoding technique to represent photonic qubits at the control and the target. We generalize this model to an abstract probabilistic CNOT version and determine the realizability limits depending on a realistic range of the errors. Finally, we discuss the physical constraints allowing the implementation of the Asymmetric Partially Polarizing Beam Splitter (APPBS), which is at the heart of correctly realizing the CNOT function.

  3. A scheme for a future distribution automation system in Finnish utilities

    Energy Technology Data Exchange (ETDEWEB)

    Lehtonen, M.; Kaerkkaeinen, S. [VTT Energy, Espoo (Finland); Partanen, J. [Lappeenranta Univ. of Technology (Finland)

    1996-12-31

    This presentation summarizes the results of a project, the aim of which was to define the optimal set of functions for the future distribution automation (DA) systems in Finland. The general factors, which affect the automation needs, are first discussed. The benefits of various functions of DA and demand side management (DSM) are then studied. Next a computer model for a DA feasibility analysis is presented, and some computation results are given. From these, the proposed automation scheme is finally concluded.

  4. A scheme for a future distribution automation system in Finnish utilities

    Energy Technology Data Exchange (ETDEWEB)

    Lehtonen, M; Kaerkkaeinen, S [VTT Energy, Espoo (Finland); Partanen, J [Lappeenranta Univ. of Technology (Finland)

    1998-08-01

    This presentation summarizes the results of a project, the aim of which was to define the optimal set of functions for the future distribution automation (DA) systems in Finland. The general factors, which affect the automation needs, are first discussed. The benefits of various functions of DA and demand side management (DSM) are then studied. Next a computer model for a DA feasibility analysis is presented, and some computation results are given. From these, the proposed automation scheme is finally concluded.

  5. A scheme for a future distribution automation system in Finnish utilities

    Energy Technology Data Exchange (ETDEWEB)

    Lehtonen, M; Kaerkkaeinen, S [VTT Energy, Espoo (Finland); Partanen, J [Lappeenranta Univ. of Technology (Finland)

    1997-12-31

    This presentation summarizes the results of a project, the aim of which was to define the optimal set of functions for the future distribution automation (DA) systems in Finland. The general factors, which affect the automation needs, are first discussed. The benefits of various functions of DA and demand side management (DSM) are then studied. Next a computer model for a DA feasibility analysis is presented, and some computation results are given. From these, the proposed automation scheme is finally concluded.

  6. Automated microfluidic devices integrating solid-phase extraction, fluorescent labeling, and microchip electrophoresis for preterm birth biomarker analysis.

    Science.gov (United States)

    Sahore, Vishal; Sonker, Mukul; Nielsen, Anna V; Knob, Radim; Kumar, Suresh; Woolley, Adam T

    2018-01-01

    We have developed multichannel integrated microfluidic devices for automated preconcentration, labeling, purification, and separation of preterm birth (PTB) biomarkers. We fabricated multilayer poly(dimethylsiloxane)-cyclic olefin copolymer (PDMS-COC) devices that perform solid-phase extraction (SPE) and microchip electrophoresis (μCE) for automated PTB biomarker analysis. The PDMS control layer had a peristaltic pump and pneumatic valves for flow control, while the PDMS fluidic layer had five input reservoirs connected to microchannels and a μCE system. The COC layers had a reversed-phase octyl methacrylate porous polymer monolith for SPE and fluorescent labeling of PTB biomarkers. We determined μCE conditions for two PTB biomarkers, ferritin (Fer) and corticotropin-releasing factor (CRF). We used these integrated microfluidic devices to preconcentrate and purify off-chip-labeled Fer and CRF in an automated fashion. Finally, we performed a fully automated on-chip analysis of unlabeled PTB biomarkers, involving SPE, labeling, and μCE separation with 1 h total analysis time. These integrated systems have strong potential to be combined with upstream immunoaffinity extraction, offering a compact sample-to-answer biomarker analysis platform. Graphical abstract: Pressure-actuated integrated microfluidic devices have been developed for automated solid-phase extraction, fluorescent labeling, and microchip electrophoresis of preterm birth biomarkers.

  7. Automated Budget System -

    Data.gov (United States)

    Department of Transportation — The Automated Budget System (ABS) automates management and planning of the Mike Monroney Aeronautical Center (MMAC) budget by providing enhanced capability to plan,...

  8. Bioprocessing automation in cell therapy manufacturing: Outcomes of special interest group automation workshop.

    Science.gov (United States)

    Ball, Oliver; Robinson, Sarah; Bure, Kim; Brindley, David A; Mccall, David

    2018-04-01

    Phacilitate held a Special Interest Group workshop event in Edinburgh, UK, in May 2017. The event brought together leading stakeholders in the cell therapy bioprocessing field to identify present and future challenges and propose potential solutions to automation in cell therapy bioprocessing. Here, we review and summarize discussions from the event. Deep biological understanding of a product, its mechanism of action and indication pathogenesis underpin many factors relating to bioprocessing and automation. To fully exploit the opportunities of bioprocess automation, therapeutics developers must closely consider whether an automation strategy is applicable, how to design an 'automatable' bioprocess and how to implement process modifications with minimal disruption. Major decisions around bioprocess automation strategy should involve all relevant stakeholders; communication between technical and business strategy decision-makers is of particular importance. Developers should leverage automation to implement in-process testing, in turn applicable to process optimization, quality assurance (QA)/quality control (QC), batch failure control, adaptive manufacturing and regulatory demands, but a lack of precedent and technical opportunities can complicate such efforts. Sparse standardization across product characterization, hardware components and software platforms is perceived to complicate efforts to implement automation. The use of advanced algorithmic approaches such as machine learning may have application to bioprocess and supply chain optimization. Automation can substantially de-risk the wider supply chain, including tracking and traceability, cryopreservation and thawing and logistics. The regulatory implications of automation are currently unclear because few hardware options exist and novel solutions require case-by-case validation, but automation can present attractive regulatory incentives. Copyright © 2018 International Society for Cellular Therapy

  9. County-level job automation risk and health: Evidence from the United States.

    Science.gov (United States)

    Patel, Pankaj C; Devaraj, Srikant; Hicks, Michael J; Wornell, Emily J

    2018-04-01

    Previous studies have observed a positive association between automation risk and employment loss. Based on the job insecurity-health risk hypothesis, greater exposure to automation risk could also be negatively associated with health outcomes. The main objective of this paper is to investigate the county-level association between the prevalence of workers in jobs exposed to automation risk and general, physical, and mental health outcomes. As a preliminary assessment of the job insecurity-health risk hypothesis (automation risk → job insecurity → poorer health), a structural equation model was used based on individual-level data in the two cross-sectional waves (2012 and 2014) of the General Social Survey (GSS). Next, using county-level data from County Health Rankings 2017, the American Community Survey (ACS) 2015, and Statistics of US Businesses 2014, Two-Stage Least Squares (2SLS) regression models were fitted to predict county-level health outcomes. Using the 2012 and 2014 waves of the GSS, employees in occupational classes at higher risk of automation reported more job insecurity, which, in turn, was associated with poorer health. The 2SLS estimates show that a 10% increase in automation risk at the county level is associated with 2.38, 0.8, and 0.6 percentage points lower general, physical, and mental health, respectively. Evidence suggests that exposure to automation risk may be negatively associated with health outcomes, plausibly through perceptions of poorer job security. More research is needed on interventions aimed at mitigating the negative influence of automation risk on health. Copyright © 2018 Elsevier Ltd. All rights reserved.
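
    For readers unfamiliar with the estimator, here is a schematic two-stage least squares (2SLS) sketch on synthetic data, illustrating the method the paper applies; the variable names and the instrument are illustrative stand-ins, not the study's actual county-level dataset or instruments.

    ```python
    # Schematic 2SLS with synthetic data (all names/values hypothetical).
    import numpy as np

    rng = np.random.default_rng(0)
    n = 1000
    z = rng.normal(size=n)                            # instrument
    automation_risk = 0.8 * z + rng.normal(size=n)    # endogenous regressor
    health = -0.5 * automation_risk + rng.normal(size=n)

    X = np.column_stack([np.ones(n), automation_risk])
    Z = np.column_stack([np.ones(n), z])

    # Stage 1: project the endogenous regressor onto the instruments.
    X_hat = Z @ np.linalg.lstsq(Z, X, rcond=None)[0]
    # Stage 2: regress the outcome on the fitted values.
    beta = np.linalg.lstsq(X_hat, health, rcond=None)[0]
    print(beta)  # [intercept, 2SLS coefficient] should be close to [0, -0.5]
    ```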

  10. Model-Based Design and Formal Verification Processes for Automated Waterway System Operations

    Directory of Open Access Journals (Sweden)

    Leonard Petnga

    2016-06-01

    Full Text Available Waterway and canal systems are particularly cost-effective in the transport of bulk and containerized goods to support global trade. Yet, despite these benefits, they are among the most under-appreciated forms of transportation engineering systems. Looking ahead, the long-term view is not rosy. Failures, delays, incidents and accidents in aging waterway systems are doing little to attract the technical and economic assistance required for modernization and sustainability. In a step toward overcoming these challenges, this paper argues that programs for waterway and canal modernization and sustainability can benefit significantly from systems thinking, supported by systems engineering techniques. We propose a multi-level multi-stage methodology for the model-based design, simulation and formal verification of automated waterway system operations. At the front-end of development, semi-formal modeling techniques are employed for the representation of project goals and scenarios, requirements and high-level models of behavior and structure. To assure the accuracy of engineering predictions and the correctness of operations, formal modeling techniques are used for the performance assessment and the formal verification of the correctness of functionality. The essential features of this methodology are highlighted in a case study examination of ship and lock-system behaviors in a two-stage lock system.
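
    To give a flavor of what formal verification of lock operations means in practice, here is a toy reachability check in Python (not the paper's models or toolchain, which use semi-formal and formal MBSE notations): it exhaustively explores the states of a single lock chamber and asserts a safety invariant, namely that the upper and lower gates are never open simultaneously.

    ```python
    # Toy state-space exploration for one lock chamber.
    # State: (upper_gate, lower_gate, water_level); gates in {"open","closed"},
    # water level in {"low","high"}. All rules are illustrative assumptions.

    def successors(state):
        upper, lower, level = state
        nxt = []
        if upper == "closed" and lower == "closed":
            # fill or drain the chamber through valves
            nxt.append((upper, lower, "high" if level == "low" else "low"))
            if level == "high":
                nxt.append(("open", lower, level))   # open upper gate
            if level == "low":
                nxt.append((upper, "open", level))   # open lower gate
        if upper == "open":
            nxt.append(("closed", lower, level))     # close upper gate
        if lower == "open":
            nxt.append((upper, "closed", level))     # close lower gate
        return nxt

    def violates(state):
        upper, lower, _ = state
        return upper == "open" and lower == "open"

    # Exhaustive reachability search from the initial state.
    frontier, seen = [("closed", "closed", "low")], set()
    while frontier:
        s = frontier.pop()
        if s in seen:
            continue
        seen.add(s)
        assert not violates(s), f"safety violation reached: {s}"
        frontier.extend(successors(s))

    print(f"{len(seen)} reachable states; gate-safety invariant holds")
    ```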

  11. Debugging systems-on-chip communication-centric and abstraction-based techniques

    CERN Document Server

    Vermeulen, Bart

    2014-01-01

    This book describes an approach and supporting infrastructure to facilitate debugging the silicon implementation of a System-on-Chip (SOC), allowing its associated product to be introduced into the market more quickly. Readers learn step-by-step the key requirements for debugging a modern silicon SOC implementation, nine factors that complicate this debugging task, and a new debug approach that addresses these requirements and complicating factors. The authors' novel communication-centric, scan-based, abstraction-based, run/stop-based (CSAR) debug approach is discussed in detail, showing how it helps to meet debug requirements and address the nine previously identified factors that complicate debugging silicon implementations of SOCs. The authors also derive the debug infrastructure requirements to support debugging of a silicon implementation of an SOC with their CSAR debug approach. This debug infrastructure consists of a generic on-chip debug architecture, a configurable automated design-for-debug ...

  12. Programme and abstracts

    International Nuclear Information System (INIS)

    1975-01-01

    Abstracts of 25 papers presented at the congress are given. The abstracts cover various topics including radiotherapy, radiopharmaceuticals, radioimmunoassay, health physics, radiation protection and nuclear medicine

  13. Three-dimensional eddy current solution of a polyphase machine test model (abstract)

    Science.gov (United States)

    Pahner, Uwe; Belmans, Ronnie; Ostovic, Vlado

    1994-05-01

    This abstract describes a three-dimensional (3D) finite element solution of a test model that has been reported in the literature. The model is a basis for calculating the current redistribution effects in the end windings of turbogenerators. The aim of the study is to see whether the analytical results of the test model can be reproduced using a general-purpose finite element package, thus indicating that the finite element model is accurate enough to treat real end-winding problems. The real end-winding problems cannot be solved analytically, as the geometry is far too complicated. The model consists of a polyphase coil set containing 44 individual coils. This set generates a two-pole mmf distribution on a cylindrical surface. The rotating field causes eddy currents to flow in the inner massive and conducting rotor. The analytical solution assumes a perfectly sinusoidal mmf distribution. The finite element model contains 85824 tetrahedra and 16451 nodes. A complex single scalar potential representation is used in the nonconducting parts. The computation time required was 3 h and 42 min. The flux plots show that the field distribution is acceptable. Furthermore, the induced currents are calculated and compared with the values found from the analytical solution. The distribution of the eddy currents is very close to that of the analytical solution. The most important results are the losses, both local and global. The value of the overall losses is less than 2% away from that of the analytical solution. The local distribution of the losses is at any given point less than 7% away from the analytical solution. The deviations of the results are acceptable and are partially due to the fact that the sinusoidal mmf distribution was not modeled perfectly in the finite element method.

  14. 2018 Congress Podium Abstracts

    Science.gov (United States)

    2018-02-21

    Each abstract has been indexed according to first author. Abstracts appear as they were submitted and have not undergone editing or the Oncology Nursing Forum’s review process. Only abstracts that will be presented appear here. For Congress scheduling information, visit congress.ons.org or check the Congress guide. Data published in abstracts presented at the ONS 43rd Annual Congress are embargoed until the conclusion of the presentation. Coverage and/or distribution of an abstract, poster, or any of its supplemental material to or by the news media, any commercial entity, or individuals, including the authors of said abstract, is strictly prohibited until the embargo is lifted. Promotion of general topics and speakers is encouraged within these guidelines.

  15. AUTOMATION OF CONTROL OF THE BUSINESS PROCESS OF PUBLISHING SCIENTIFIC JOURNALS

    Directory of Open Access Journals (Sweden)

    O. Yu. Sakaliuk

    2016-09-01

    Full Text Available We consider the automation of the business process of publishing scientific journals. The paper describes the focal point of the publishing houses of the Odessa National Academy of Food Technology (ONAFT) and the automation of their business processes. A complex business-process model of publishing scientific journals is developed. The organizational structure of the Coordinating Centre of Scientific Journals' Publishing of ONAFT is analyzed and its model created. Process-model simulation was conducted in the eEPC and BPMN business-process notations. Database design, creation of the file structure and creation of the AIS interface were also carried out, and interaction with a webcam was implemented. Judging the feasibility of the software development and its performance from the resulting petal (radar) chart, it is safe to say that the automated mode is much more efficient than the manual mode. The developed software will accelerate the development of the scientific periodicals of ONAFT, which in turn will improve the Academy's ratings at the global level and enhance its image and credibility.

  16. Automation of measurement of heights waves around a model ship; Mokeisen mawari no hako keisoku no jidoka

    Energy Technology Data Exchange (ETDEWEB)

    Ikehata, M; Kato, M; Yanagida, F [Yokohama National University, Yokohama (Japan). Faculty of Engineering

    1997-10-01

    Trial fabrication and tests were performed on an instrument to automate the measurement of wave heights around a model ship. The electric wave-height measuring instrument currently in use takes a long time for measurement and is therefore inefficient, and the optical image-processing method also has accuracy problems. Therefore, a computer-controlled system was structured using AC servo motors to drive the X and Y axes of a traverse equipment. Equipment was fabricated to automate the wave-height measurement, in which four servo-type wave-height meters are installed on a rack moving in the lateral (Y-axial) direction, so that the wave heights measured by the four meters can be acquired automatically all at once. Wave heights can be measured continuously by moving the rack at a constant speed, verifying that wave shapes in longitudinal cross sections can be acquired with only one towing. The time required for measurements using the instrument was 40 hours net for fixed-point measurement and 12 hours for continuous measurement, or 52 hours in total. By contrast, fixed-point measurement may take as long as 240 hours when the conventional all-point manual traverse equipment is used. Enormous benefits were thus obtained from automating the instrument. Collection of wave-height data will continue, also on tankers and other types of ships. 2 refs., 8 figs., 1 tab.

  17. Human-centred automation programme: review of experiment related studies

    International Nuclear Information System (INIS)

    Grimstad, Tone; Andresen, Gisle; Skjerve, Ann Britt Miberg

    2000-04-01

    Twenty-three empirical studies concerning automation and performance have been reviewed. The purposes of the review are to support experimental studies in the Human-Centred Automation (HCA) programme and to develop a general theory on HCA. Each study was reviewed with regard to twelve study characteristics: domain, type of study, purpose, definition of automation, variables, theoretical basis, models of operator performance, methods applied, experimental design, outcome, stated scope of results, strengths and limitations. Seven of the studies involved domain experts, the rest used students as participants. The majority of the articles originated from the aviation domain: only the study conducted in HAMMLAB considered process control in power plants. In the experimental studies, the independent variable was level of automation, or reliability of automation, while the most common dependent variables were workload, situation awareness, complacency, trust, and criteria of performance, e.g., number of correct responses or response time. Although the studies highlight important aspects of human-automation interaction, it is still unclear how system performance is affected. Nevertheless, the fact that many factors seem to be involved is taken as support for the system-oriented approach of the HCA programme. In conclusion, the review provides valuable input both to the design of experiments and to the development of a general theory. (Author). refs

  18. Correlations and fluctuations '98. Collected abstracts

    International Nuclear Information System (INIS)

    Csoergoe, T.; Hegyi, S.; Hwa, R.C.; Jancso, G.

    1998-01-01

    The proceedings of the 8th International Workshop on Multiparticle Production contain the abstracts of papers on various topics of correlations and fluctuations. Hydrodynamic models, Bose-Einstein correlations, hadron-hadron interactions and heavy ion reactions are discussed in detail. 54 items are indexed separately for the INIS database. (K.A.)

  19. A Federated Enterprise Architecture and MBSE Modeling Framework for Integrating Design Automation into a Global PLM Approach

    OpenAIRE

    Vosgien , Thomas; Rigger , Eugen; Schwarz , Martin; Shea , Kristina

    2017-01-01

    Part 1: PLM Maturity, Implementation and Adoption; International audience; PLM and Design Automation (DA) are two interdependent and necessary approaches to increase the performance and efficiency of product development processes. Often, DA systems’ usability suffers due to a lack of integration in industrial business environments stemming from the independent consideration of PLM and DA. This article proposes a methodological and modeling framework for developing and deploying DA solutions w...

  20. 2018 Congress Poster Abstracts

    Science.gov (United States)

    2018-02-21

    Each abstract has been indexed according to the first author. Abstracts appear as they were submitted and have not undergone editing or the Oncology Nursing Forum’s review process. Only abstracts that will be presented appear here. Poster numbers are subject to change. For updated poster numbers, visit congress.ons.org or check the Congress guide. Data published in abstracts presented at the ONS 43rd Annual Congress are embargoed until the conclusion of the presentation. Coverage and/or distribution of an abstract, poster, or any of its supplemental material to or by the news media, any commercial entity, or individuals, including the authors of said abstract, is strictly prohibited until the embargo is lifted. Promotion of general topics and speakers is encouraged within these guidelines.

  1. A Fully Automated High-Throughput Zebrafish Behavioral Ototoxicity Assay.

    Science.gov (United States)

    Todd, Douglas W; Philip, Rohit C; Niihori, Maki; Ringle, Ryan A; Coyle, Kelsey R; Zehri, Sobia F; Zabala, Leanne; Mudery, Jordan A; Francis, Ross H; Rodriguez, Jeffrey J; Jacob, Abraham

    2017-08-01

    Zebrafish animal models lend themselves to behavioral assays that can facilitate rapid screening of ototoxic, otoprotective, and otoregenerative drugs. Structurally similar to human inner ear hair cells, the mechanosensory hair cells on their lateral line allow the zebrafish to sense water flow and orient head-to-current in a behavior called rheotaxis. This rheotaxis behavior deteriorates in a dose-dependent manner with increased exposure to the ototoxin cisplatin, thereby establishing itself as an excellent biomarker for anatomic damage to lateral line hair cells. Building on work by our group and others, we have built a new, fully automated high-throughput behavioral assay system that uses automated image analysis techniques to quantify rheotaxis behavior. This novel system consists of a custom-designed swimming apparatus and an imaging system of network-controlled Raspberry Pi microcomputers capturing infrared video. Automated analysis techniques detect individual zebrafish, compute their orientation, and quantify the rheotaxis behavior of a zebrafish test population, producing a powerful, high-throughput behavioral assay. Using our fully automated biological assay to test a standardized ototoxic dose of cisplatin against varying doses of compounds that protect or regenerate hair cells may facilitate rapid translation of candidate drugs into preclinical mammalian models of hearing loss.
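
    As an illustration of the orientation computation such image analysis typically relies on, here is a minimal image-moment sketch in Python; this is a standard technique, not necessarily the authors' exact pipeline, and all names and values are illustrative.

    ```python
    # Orientation of a segmented silhouette via image moments.
    # Note: the angle is defined modulo 180 degrees; head/tail
    # disambiguation would need additional cues.
    import numpy as np

    def orientation_deg(mask: np.ndarray) -> float:
        """Axis orientation of a binary silhouette, in degrees."""
        ys, xs = np.nonzero(mask)
        x0, y0 = xs.mean(), ys.mean()      # centroid
        mu20 = ((xs - x0) ** 2).mean()     # central second moments
        mu02 = ((ys - y0) ** 2).mean()
        mu11 = ((xs - x0) * (ys - y0)).mean()
        theta = 0.5 * np.arctan2(2 * mu11, mu20 - mu02)
        return np.degrees(theta)

    # Toy example: a horizontal bar-shaped "fish" should give ~0 degrees.
    frame = np.zeros((40, 60), dtype=bool)
    frame[18:22, 10:50] = True
    print(orientation_deg(frame))
    ```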

  2. 78 FR 66039 - Modification of National Customs Automation Program Test Concerning Automated Commercial...

    Science.gov (United States)

    2013-11-04

    ... Customs Automation Program Test Concerning Automated Commercial Environment (ACE) Cargo Release (Formerly...) plan to both rename and modify the National Customs Automation Program (NCAP) test concerning the... data elements required to obtain release for cargo transported by air. The test will now be known as...

  3. Using graph theory for automated electric circuit solving

    International Nuclear Information System (INIS)

    Toscano, L; Stella, S; Milotti, E

    2015-01-01

    Graph theory plays many important roles in modern physics and in many different contexts, spanning diverse topics such as the description of scale-free networks and the structure of the universe as a complex directed graph in causal set theory. Graph theory is also ideally suited to describing many concepts in computer science. Therefore it is increasingly important for physics students to master the basic concepts of graph theory. Here we describe a student project where we develop a computational approach to electric circuit solving that is based on graph-theoretic concepts. This highly multidisciplinary approach combines abstract mathematics, linear algebra, the physics of circuits, and computer programming to reach the ambitious goal of implementing automated circuit solving. (paper)
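
    A minimal sketch of the idea, under assumptions of our own (the article does not reproduce its code): represent a resistive circuit as a weighted graph, assemble the conductance (weighted Laplacian) matrix, and solve the node-voltage equations that encode Kirchhoff's laws.

    ```python
    # Graph-based nodal analysis of a resistive circuit with a current source.
    import numpy as np

    # Hypothetical circuit: nodes 0..2, node 0 is ground.
    # Edges: (node_a, node_b, conductance in siemens).
    edges = [(0, 1, 1.0), (1, 2, 0.5), (0, 2, 0.2)]
    injected = {1: 1.0}  # 1 A current source into node 1
    n = 3

    G = np.zeros((n, n))  # conductance (weighted Laplacian) matrix
    for a, b, g in edges:
        G[a, a] += g
        G[b, b] += g
        G[a, b] -= g
        G[b, a] -= g

    I = np.zeros(n)
    for node, amps in injected.items():
        I[node] = amps

    # Ground node 0: delete its row/column and solve the reduced system.
    v = np.zeros(n)
    v[1:] = np.linalg.solve(G[1:, 1:], I[1:])
    print(v)  # node voltages relative to ground
    ```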

  4. Design and control of automated guided vehicle systems: A case study

    NARCIS (Netherlands)

    Li, Q.; Adriaansen, A.C.; Udding, J.T.; Pogromski, A.Y.

    2011-01-01

    In this paper, we study the design and control of automated guided vehicle (AGV) systems, with the focus on the quayside container transport in an automated container terminal. We first set up an event-driven model for an AGV system in the zone control framework. Then a number of layouts of the road

  5. Compositional abstractions for long-run properties of stochastic systems

    DEFF Research Database (Denmark)

    Smith, Michael James Andrew

    2011-01-01

    its underlying Markov chain. However, this also leads to state space explosion problems as the number of components in the model increases, which severely limits the size of models we can analyse. Because of this, we look for abstraction techniques that allow us to analyse a smaller model that safely...... be reduced in size. Importantly, we do this compositionally, so that we bound each component of the model separately, and compose these to obtain a bound for the entire model. We present an algorithm for this, based on extending the algorithm by Fourneau et al. to deal with partially-ordered state spaces....... Finally, we present some results from our implementation, which forms part of the PEPA plug-in for Eclipse. We compare the precision and state space reduction with results obtained by computing long-run averages on a CTMDP-based abstraction....

  6. Automation-aided Task Loads Index based on the Automation Rate Reflecting the Effects on Human Operators in NPPs

    International Nuclear Information System (INIS)

    Lee, Seungmin; Seong, Poonghyun; Kim, Jonghyun

    2013-01-01

    Many researchers have found that a high automation rate does not guarantee high performance. Therefore, to reflect the effects of automation on human performance, a new estimation method for the automation rate that considers the effects of automation on human operators in nuclear power plants (NPPs) was suggested. The suggested measures express how much the automation supports human operators, but they cannot express whether the operators' workload increases or decreases. Before considering automation rates, it is useful to estimate in advance whether the adopted automation is beneficial or detrimental. In this study, to estimate the appropriateness of automation according to the change in the human operators' task loads, an automation-aided task loads index is suggested based on the concept of the suggested automation rate. To ensure plant safety and efficiency on behalf of human operators, various automation systems have been installed in NPPs, and much work previously conducted by human operators can now be supported by computer-based operator aids. According to the characteristics of the automation types, estimation methods for the system automation rate and the cognitive automation rate were suggested. The proposed estimation method concentrates on the effects of introducing automation, so it directly expresses how much the automated system supports human operators. Based on the suggested automation rates, this study proposes a way to estimate how much the automated system affects the human operators' cognitive task load. When there is no automation, the calculated index is 1, meaning there is no change in the human operators' task load.

  7. A mobile, high-throughput semi-automated system for testing cognition in large non-primate animal models of Huntington disease.

    Science.gov (United States)

    McBride, Sebastian D; Perentos, Nicholas; Morton, A Jennifer

    2016-05-30

    For reasons of cost and ethical concerns, models of neurodegenerative disorders such as Huntington disease (HD) are currently being developed in farm animals, as an alternative to non-human primates. Developing reliable methods of testing cognitive function is essential to determining the usefulness of such models. Nevertheless, cognitive testing of farm animal species presents a unique set of challenges. The primary aims of this study were to develop and validate a mobile operant system suitable for high-throughput cognitive testing of sheep. We designed a semi-automated testing system with the capability of presenting stimuli (visual, auditory) and reward at six spatial locations. Fourteen normal sheep were used to validate the system using a two-choice visual discrimination task (2CVDT). Four stages of training devised to acclimatise animals to the system are also presented. All sheep progressed rapidly through the training stages, over eight sessions. All sheep learned the 2CVDT and performed at least one reversal stage. The mean number of trials the sheep took to reach criterion was 13.9±1.5 for the first acquisition learning and 19.1±1.8 for the reversal learning. This is the first mobile semi-automated operant system developed for testing cognitive function in sheep. We have designed and validated an automated operant behavioural testing system suitable for high-throughput cognitive testing in sheep and other medium-sized quadrupeds, such as pigs and dogs. Sheep performance in the two-choice visual discrimination task was very similar to that reported for non-human primates and strongly supports the use of farm animals as pre-clinical models for the study of neurodegenerative diseases. Copyright © 2015 Elsevier B.V. All rights reserved.

  8. Automated, model-based analysis of Uj-data for electrolyte-supported SOFC short-stacks

    Energy Technology Data Exchange (ETDEWEB)

    Linder, M.; Hocker, T. [ICP Institute of Comp. Physics, ZHAW Zurich University of Appl. Sciences, Wildbachstrasse 21, CH-8401 Winterthur (Switzerland); Denzler, R.; Mai, A.; Iwanschitz, B. [Hexis AG, Zum Park 5, CH-8404 Winterthur (Switzerland)

    2011-08-15

    A simplified model is presented to analyse the current-voltage characteristics of electrolyte-supported solid oxide fuel cell stacks that run on reformed hydrocarbon fuels such as natural gas. The model takes into account the fuel reforming process, fuel leakages and electrochemical conversions, all in an idealised manner, i.e. it assumes thermodynamic equilibrium. However, the model explicitly accounts for parameter variations that are often unavoidable under realistic testing conditions. These variations include changes in the fuel composition, changes in the fuel mass fluxes and changes in the operation temperature. In addition, internal repeat-unit resistances as extracted from current-voltage data are compared with the ohmic and polarisation losses obtained from independent electrical impedance measurement data. Since the data analysis is done in a highly automated fashion, large sets of current-voltage data can be compared and 'normal' data sets can be discriminated from 'abnormal' ones. The model has the potential to accelerate testing by eliminating variations that lead to corruption and unwanted scattering of current-voltage data. (Copyright © 2011 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)
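
    As a small illustration of one step such automated analysis performs, the sketch below extracts an internal (area-specific) resistance from current-voltage data by a linear fit; the data and names are hypothetical, and the paper's actual model additionally accounts for reforming, leakages and parameter variations.

    ```python
    # Extract an area-specific resistance (ASR) from U-j data by linear fit.
    import numpy as np

    j = np.array([0.05, 0.10, 0.15, 0.20, 0.25])   # current density, A/cm^2
    u = np.array([0.93, 0.89, 0.85, 0.81, 0.77])   # cell voltage, V

    slope, intercept = np.polyfit(j, u, 1)
    asr = -slope     # area-specific resistance, Ohm*cm^2
    ocv = intercept  # extrapolated open-circuit voltage, V
    print(f"ASR = {asr:.2f} Ohm*cm^2, OCV = {ocv:.2f} V")
    ```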

  9. Asleep at the automated wheel-Sleepiness and fatigue during highly automated driving.

    Science.gov (United States)

    Vogelpohl, Tobias; Kühn, Matthias; Hummel, Thomas; Vollrath, Mark

    2018-03-20

    Due to the lack of active involvement in the driving situation and due to monotonous driving environments, drivers with automation may be prone to become fatigued faster than manual drivers (e.g. Schömig et al., 2015). However, little is known about the progression of fatigue during automated driving and its effects on the ability to take back manual control after a take-over request. In this driving simulator study with N = 60 drivers we used a three-factorial 2 × 2 × 12 mixed design to analyze the progression (12 × 5 min; within subjects) of driver fatigue in drivers with automation compared to manual drivers (between subjects). Driver fatigue was induced as either mainly sleep-related or mainly task-related fatigue (between subjects). Additionally, we investigated the drivers' reactions to a take-over request in a critical driving scenario to gain insights into the ability of fatigued drivers to regain manual control and situation awareness after automated driving. Drivers in the automated driving condition exhibited facial indicators of fatigue after 15 to 35 min of driving. Manual drivers only showed similar indicators of fatigue if they suffered from a lack of sleep, and then only after a longer period of driving (approx. 40 min). Several drivers in the automated condition closed their eyes for extended periods of time. In the driving-with-automation condition, mean automation deactivation times after a take-over request were slower for a certain percentage (about 30%) of the drivers with a lack of sleep (M = 3.2 s; SD = 2.1 s) compared to the reaction times after a long drive (M = 2.4 s; SD = 0.9 s). Drivers with automation also took longer than manual drivers to first glance at the speed display after a take-over request and were more likely to stay behind a braking lead vehicle instead of overtaking it. Drivers are unable to stay alert during extended periods of automated driving without non-driving related tasks. Fatigued drivers could

  10. ProofJudge: Automated Proof Judging Tool for Learning Mathematical Logic

    DEFF Research Database (Denmark)

    Villadsen, Jørgen

    2015-01-01

    Today we have software in many artefacts, from medical devices to cars and airplanes, and the software must not only be efficient and intelligent but also reliable and secure. Tests can show the presence of bugs but cannot guarantee their absence. A machine-checked proof using mathematical logic...... pen and paper because no adequate tool was available. The learning problem is how to make abstract concepts of logic as concrete as possible. ProofJudge is a computer system and teaching approach for teaching mathematical logic and automated reasoning which augments the e-learning tool NaDeA (Natural...

  11. ProofJudge: Automated Proof Judging Tool for Learning Mathematical Logic

    DEFF Research Database (Denmark)

    Villadsen, Jørgen

    2016-01-01

    Today we have software in many artefacts, from medical devices to cars and airplanes, and the software must not only be efficient and intelligent but also reliable and secure. Tests can show the presence of bugs but cannot guarantee their absence. A machine-checked proof using mathematical logic...... using pen and paper because no adequate tool was available. The learning problem is how to make abstract concepts of logic as concrete as possible. ProofJudge is a computer system and teaching approach for teaching mathematical logic and automated reasoning which augments the e-learning tool Na...

  12. Procedure automation: the effect of automated procedure execution on situation awareness and human performance

    International Nuclear Information System (INIS)

    Andresen, Gisle; Svengren, Haakan; Heimdal, Jan O.; Nilsen, Svein; Hulsund, John-Einar; Bisio, Rossella; Debroise, Xavier

    2004-04-01

    As advised by the procedure workshop convened in Halden in 2000, the Halden Project conducted an experiment on the effect of automation of Computerised Procedure Systems (CPS) on situation awareness and human performance. The expected outcome of the study was to provide input for guidance on CPS design, and to support the Halden Project's ongoing research on human reliability analysis. The experiment was performed in HAMMLAB using the HAMBO BWR simulator and the COPMA-III CPS. Eight crews of operators from Forsmark 3 and Oskarshamn 3 participated. Three research questions were investigated: 1) Does procedure automation create Out-Of-The-Loop (OOTL) performance problems? 2) Does procedure automation affect situation awareness? 3) Does procedure automation affect crew performance? The independent variable, 'procedure configuration', had four levels: paper procedures, manual CPS, automation with breaks, and full automation. The results showed that the operators experienced OOTL problems in full automation, but that situation awareness and crew performance (response time) were not affected. One possible explanation for this is that the operators monitored the automated procedure execution conscientiously, something which may have prevented the OOTL problems from having negative effects on situation awareness and crew performance. In a debriefing session, the operators clearly expressed their dislike for the full automation condition, but that automation with breaks could be suitable for some tasks. The main reason why the operators did not like the full automation was that they did not feel being in control. A qualitative analysis addressing factors contributing to response time delays revealed that OOTL problems did not seem to cause delays, but that some delays could be explained by the operators having problems with the freeze function of the CPS. Also other factors such as teamwork and operator tendencies were of importance. Several design implications were drawn

  13. 1st Latin American Congress on Automation and Robotics

    CERN Document Server

    Baca, José; Moreno, Héctor; Carrera, Isela; Cardona, Manuel

    2017-01-01

    This book contains the proceedings of the 1st Latin American Congress on Automation and Robotics held in Panama City, Panama, in February 2017. It gathers research work from researchers, scientists, and engineers from academia and private industry, and presents current and exciting research applications and future challenges in Latin America. The scope of this book covers a wide range of themes associated with advances in automation and robotics research encountered in engineering and scientific research and practice. These topics are related to control algorithms, systems automation, perception, mobile robotics, computer vision, educational robotics, robotics modeling and simulation, and robotics and mechanism design. LACAR 2017 has been sponsored by SENACYT (Secretaria Nacional de Ciencia, Tecnologia e Inovacion of Panama).

  14. Machine vision automated visual inspection theory, practice and applications

    CERN Document Server

    Beyerer, Jürgen; Frese, Christian

    2016-01-01

    The book offers a thorough introduction to machine vision. It is organized in two parts. The first part covers image acquisition, which is the crucial component of most automated visual inspection systems. All important methods are described in great detail and are presented with a reasoned structure. The second part deals with the modeling and processing of image signals and pays particular regard to methods that are relevant for automated visual inspection.

  15. On the consistency of Koomen's fair abstraction rule

    NARCIS (Netherlands)

    Bergstra, J.A.; Baeten, J.C.M.; Klop, J.W.

    1987-01-01

    We construct a graph model for ACPτ, the algebra of communicating processes with silent steps, in which Koomen's Fair Abstraction Rule (KFAR) holds, and also versions of the Approximation Induction Principle (AIP) and the Recursive Definition & Specification Principles (RDP & RSP). We use this model

  16. Individualism and Collectivism in Trade Agents (extended abstract)

    NARCIS (Netherlands)

    Hofstede, G.J.; Jonker, C.M.; Verwaart, T.

    2008-01-01

    This paper was originally presented at IEA/AIE 2008 [1]. The current paper is an extended abstract for presentation at BNAIC 2008. Agent-Based Modeling can contribute to the understanding of international trade processes. Models for the effects of culture and cultural differences on agent behavior

  17. Benefit of modelling regarding the quality and efficiency of PLC-programming in process automation; Nutzen von Modellierung fuer die Qualitaet und Effizienz der Steuerungsprogrammierung in der Automatisierungstechnik

    Energy Technology Data Exchange (ETDEWEB)

    Friedrich, D; Vogel-Heuser, B [Bergische Univ. Wuppertal (Germany). Lehrstuhl fuer Automatisierungstechnik/Prozessinformatik

    2006-03-15

    Software development in process automation has many deficiencies in procedures, notations and tool support. As a result, modern software engineering concepts and notations, like object-oriented approaches or UML, are not widespread in this field. Hence, the drawbacks regarding start-up times, additional costs and low software quality are immense. This paper evaluates the benefit of modelling as a design step prior to coding, with regard to cognitive paradigms. Two modelling notations (UML and ICL) are compared by analyzing their impact on the quality of automation software written in IEC 61131. (orig.)

  18. Methodological proposal for validation of the disinfecting efficacy of an automated flexible endoscope reprocessor

    OpenAIRE

    Graziano, Kazuko Uchikawa; Pereira, Marta Elisa Auler; Koda, Elaine

    2016-01-01

    ABSTRACT Objective: to elaborate and apply a method to assess the efficacy of automated flexible endoscope reprocessors at a time when there is no official method or trained laboratories to comply with the requirements described in specific standards for this type of health product in Brazil. Method: the present methodological study was developed based on the following theoretical references: International Organization for Standardization (ISO) standard ISO 15883-4/2008 and Brazilian ...

  19. Automated Inadvertent Intruder Application

    International Nuclear Information System (INIS)

    Koffman, Larry D.; Lee, Patricia L.; Cook, James R.; Wilhite, Elmer L.

    2008-01-01

    The Environmental Analysis and Performance Modeling group of Savannah River National Laboratory (SRNL) conducts performance assessments of the Savannah River Site (SRS) low-level waste facilities to meet the requirements of DOE Order 435.1. These performance assessments, which result in limits on the amounts of radiological substances that can be placed in the waste disposal facilities, consider numerous potential exposure pathways that could occur in the future. One set of exposure scenarios, known as inadvertent intruder analysis, considers the impact on hypothetical individuals who are assumed to inadvertently intrude onto the waste disposal site. Inadvertent intruder analysis considers three distinct scenarios for exposure, referred to as the agriculture scenario, the resident scenario, and the post-drilling scenario. Each of these scenarios has specific exposure pathways that contribute to the overall dose for the scenario. For the inadvertent intruder analysis, the calculation of dose for the exposure pathways is a relatively straightforward algebraic calculation that utilizes dose conversion factors. Prior to 2004, these calculations were performed using an Excel spreadsheet. However, design checks of the spreadsheet calculations revealed that errors could be introduced inadvertently when copying spreadsheet formulas cell by cell, and finding these errors was tedious and time-consuming. This weakness led to the specification of functional requirements to create a software application that would automate the calculations for inadvertent intruder analysis using a controlled source of input parameters. This software application, named the Automated Inadvertent Intruder Application, has undergone rigorous testing of the internal calculations and meets software QA requirements. The Automated Inadvertent Intruder Application was intended to replace the previous spreadsheet analyses with an automated application that was verified to produce the same calculations and
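
    Since the abstract describes the dose computation as a straightforward algebraic calculation built on dose conversion factors, the following sketch illustrates the general shape of such a pathway dose sum. All nuclides, pathway names, and numeric factors are hypothetical placeholders, not SRNL's controlled input parameters.

    ```python
    # Schematic pathway dose sum: dose = sum over nuclides and pathways of
    # concentration x intake/exposure factor x dose conversion factor.

    concentration = {"Tc-99": 1.0e2, "I-129": 5.0e0}        # pCi/g (placeholder)
    exposure = {"soil_ingestion": 1.8e2, "crop_consumption": 9.0e1}  # g/yr (placeholder)
    dcf = {"Tc-99": 1.4e-7, "I-129": 3.2e-4}                # mrem/pCi (placeholder)

    def scenario_dose(concentration, exposure, dcf):
        """Total annual dose (mrem/yr) summed over nuclides and pathways."""
        return sum(
            concentration[nuclide] * intake * dcf[nuclide]
            for nuclide in concentration
            for intake in exposure.values()
        )

    print(f"{scenario_dose(concentration, exposure, dcf):.3e} mrem/yr")
    ```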

  20. Automated registration of tail bleeding in rats.

    Science.gov (United States)

    Johansen, Peter B; Henriksen, Lars; Andresen, Per R; Lauritzen, Brian; Jensen, Kåre L; Juhl, Trine N; Tranholm, Mikael

    2008-05-01

    An automated system for registration of tail bleeding in rats using a camera and a user-designed PC-based software program has been developed. The live and processed images are displayed on the screen and are exported together with a text file for later statistical processing of the data, allowing calculation of e.g. the number of bleeding episodes, bleeding times and bleeding areas. Proof-of-principle was achieved when the camera captured the blood stream after infusion of rat whole blood into saline. Suitability was assessed by recording bleeding profiles in heparin-treated rats, demonstrating that the system was able to capture on/off bleedings and that the data transfer and analysis were conducted successfully. Then, bleeding profiles were visually recorded by two independent observers simultaneously with the automated recordings after tail transection in untreated rats. Linear relationships were found in the number of bleedings, demonstrating, however, a statistically significant difference between observers in the recording of bleeding episodes. Also, the bleeding time was longer for visual compared to automated recording. No correlation was found between blood loss and bleeding time in untreated rats, but in heparinized rats a correlation was suggested. Finally, the blood loss correlated with the automated recording of bleeding area. In conclusion, the automated system has proven suitable for replacing visual recordings of tail bleedings in rats. Inter-observer differences can be eliminated, monotonous repetitive work avoided, and a higher throughput of animals achieved in less time. The automated system will lead to an increased understanding of the nature of bleeding following tail transection in different rodent models.
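
    To make the image-analysis step concrete, here is a schematic sketch of how a bleeding area could be quantified from a video frame; the thresholds, calibration constant and names are illustrative assumptions, not the study's proprietary software.

    ```python
    # Count red-dominant pixels in an RGB frame and convert to mm^2.
    import numpy as np

    MM2_PER_PIXEL = 0.01  # hypothetical calibration from the camera setup

    def bleeding_area_mm2(frame_rgb: np.ndarray) -> float:
        """Area of red-dominant pixels in an RGB frame (H x W x 3, uint8)."""
        r = frame_rgb[..., 0].astype(int)
        g = frame_rgb[..., 1].astype(int)
        b = frame_rgb[..., 2].astype(int)
        mask = (r > 120) & (r - g > 40) & (r - b > 40)  # crude "blood red" test
        return mask.sum() * MM2_PER_PIXEL

    # Toy frame: a 20x30 red patch on a dark background.
    frame = np.zeros((100, 100, 3), dtype=np.uint8)
    frame[40:60, 10:40] = (200, 30, 30)
    print(bleeding_area_mm2(frame))  # 600 pixels -> 6.0 mm^2
    ```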