WorldWideScience

Sample records for automated model abstraction

  1. Automated Supernova Discovery (Abstract)

    Science.gov (United States)

    Post, R. S.

    2015-12-01

    (Abstract only) We are developing a system of robotic telescopes for automatic recognition of supernovae as well as other transient events, in collaboration with the Puckett Supernova Search Team. At the SAS2014 meeting, the discovery program, SNARE, was first described. Since then, it has been continuously improved to handle searches under a wide variety of atmospheric conditions. Currently, two telescopes are used to build a reference library while searching for PSNs with a partial library. Since data are taken every cloud-free night, we must deal with varying atmospheric conditions and high background illumination from the moon. The software is configured to identify a PSN and reshoot for verification, with options to change the run plan to acquire photometric or spectrographic data. The telescopes are 24-inch CDK24s with Alta U230 cameras, one in CA and one in NM. Images and run plans are sent between sites so the CA telescope can search while photometry is done in NM. Our goal is to find bright PSNs of magnitude 17.5 or less, which is the limit of our planned spectroscopy. We present results from our first automated PSN discoveries and plans for PSN data acquisition.
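
    The core of any automated transient search is comparing each new frame against a reference image of the same field. The sketch below is a plain-Python illustration of that difference-imaging idea, not SNARE's actual pipeline; the threshold and frame sizes are invented.

```python
# Toy difference-imaging transient detector. Images are lists of rows of
# pixel brightness values; the threshold (in counts) is an assumption.

def detect_transients(reference, new_image, threshold=50.0):
    """Return (x, y) coordinates whose brightness increased by more than
    `threshold` counts relative to the reference frame."""
    candidates = []
    for y, (ref_row, new_row) in enumerate(zip(reference, new_image)):
        for x, (r, n) in enumerate(zip(ref_row, new_row)):
            if n - r > threshold:
                candidates.append((x, y))
    return candidates

# Usage: a 3x3 frame pair where one pixel brightens sharply.
ref = [[10.0] * 3 for _ in range(3)]
new = [[10.0] * 3 for _ in range(3)]
new[1][2] = 200.0  # a "new" point source
print(detect_transients(ref, new))  # [(2, 1)]
```

A real pipeline would first register and PSF-match the frames; this sketch only shows the thresholded subtraction step.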

  2. Temperature control of fimbriation circuit switch in uropathogenic Escherichia coli: quantitative analysis via automated model abstraction.

    Directory of Open Access Journals (Sweden)

    Hiroyuki Kuwahara

    2010-03-01

    Full Text Available Uropathogenic Escherichia coli (UPEC) represent the predominant cause of urinary tract infections (UTIs). A key UPEC molecular virulence mechanism is type 1 fimbriae, whose expression is controlled by the orientation of an invertible chromosomal DNA element, the fim switch. Temperature has been shown to act as a major regulator of fim switching behavior and is overall an important indicator as well as functional feature of many urologic diseases, including UPEC host-pathogen interaction dynamics. Given this panoptic physiological role of temperature during UTI progression and notable empirical challenges to its direct in vivo studies, in silico modeling of corresponding biochemical and biophysical mechanisms essential to UPEC pathogenicity may significantly aid our understanding of the underlying disease processes. However, rigorous computational analysis of biological systems, such as the fim switch temperature control circuit, has heretofore presented a notoriously demanding problem due to both the substantial complexity of the gene regulatory networks involved as well as their often characteristically discrete and stochastic dynamics. To address these issues, we have developed an approach that enables automated multiscale abstraction of biological system descriptions based on reaction kinetics. Implemented as a computational tool, this method has allowed us to efficiently analyze the modular organization and behavior of the E. coli fimbriation switch circuit at different temperature settings, thus facilitating new insights into this mode of UPEC molecular virulence regulation. In particular, our results suggest that, with respect to its role in shutting down fimbriae expression, the primary function of FimB recombinase may be to effect a controlled down-regulation (rather than increase) of the ON-to-OFF fim switching rate via temperature-dependent suppression of competing dynamics mediated by recombinase FimE. 
Our computational analysis further implies
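
    The discrete, stochastic dynamics the authors abstract over can be illustrated with a Gillespie-style simulation of a two-state switch. This is a toy stand-in for the fim ON/OFF inversion, not the authors' model; all rate values are invented.

```python
import random

# Minimal Gillespie-style stochastic simulation of a two-state switch.
# Rates are illustrative assumptions, not measured fim switching rates.

def simulate_switch(k_on_to_off, k_off_to_on, t_end, seed=1):
    """Simulate ON<->OFF flipping; return the fraction of time spent ON."""
    rng = random.Random(seed)
    t, state, time_on = 0.0, "ON", 0.0
    while t < t_end:
        rate = k_on_to_off if state == "ON" else k_off_to_on
        dt = rng.expovariate(rate)        # exponential waiting time to next flip
        dwell = min(dt, t_end - t)        # clip the last dwell at t_end
        if state == "ON":
            time_on += dwell
        t += dt
        state = "OFF" if state == "ON" else "ON"
    return time_on / t_end

# Lowering the ON-to-OFF rate (as a FimB-mediated suppression would)
# increases the fraction of time the switch spends ON.
print(simulate_switch(0.1, 1.0, 1000.0))  # ~0.9
```

The abstraction techniques the paper describes aim to make exactly this kind of stochastic model tractable when the reaction network is far larger.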

  3. Opponent Modelling in Automated Multi-Issue Negotiation Using Bayesian Learning (extended abstract)

    NARCIS (Netherlands)

    Hindriks, K.V.; Tykhonov, D.

    2008-01-01

    In this paper, we show that it is nonetheless possible to construct an opponent model, i.e. a model of the opponent’s preferences that can be effectively used to improve negotiation outcomes. We provide a generic framework for learning both the preferences associated with issue values as well as the
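
    One common way to realize such opponent learning is Bayesian updating over a small set of candidate preference profiles. The sketch below is a generic illustration under simplified assumptions (linear additive utilities, a "bid high for yourself" likelihood), not the paper's exact framework.

```python
# Bayesian opponent modelling sketch: maintain a distribution over
# hypotheses about the opponent's issue weights and update it after each
# observed bid. Hypotheses and the likelihood model are toy assumptions.

def utility(weights, bid):
    """Linear additive utility of a bid under a weight vector."""
    return sum(w * v for w, v in zip(weights, bid))

def bayes_update(prior, hypotheses, bid, expected=0.9):
    """Re-weight each hypothesis by how close the bid's utility under it is
    to `expected` (opponents are assumed to bid high for themselves)."""
    posterior = [p * max(1e-9, 1.0 - abs(utility(h, bid) - expected))
                 for p, h in zip(prior, hypotheses)]
    total = sum(posterior)
    return [p / total for p in posterior]

# Two hypotheses: the opponent cares mostly about issue 1, or mostly about issue 2.
hyps = [(0.9, 0.1), (0.1, 0.9)]
posterior = bayes_update([0.5, 0.5], hyps, bid=(1.0, 0.0))  # bid great on issue 1
print(posterior)  # probability mass shifts toward the issue-1 hypothesis
```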

  4. Automated model building

    CERN Document Server

    Caferra, Ricardo; Peltier, Nicholas

    2004-01-01

    This is the first book on automated model building, a discipline of automated deduction that is of growing importance. Although models and their construction are important per se, automated model building has appeared as a natural enrichment of automated deduction, especially in the attempt to capture the human way of reasoning. The book provides an historical overview of the field of automated deduction, and presents the foundations of different existing approaches to model construction, in particular those developed by the authors. Finite and infinite model building techniques are presented. The main emphasis is on calculi-based methods, and relevant practical results are provided. The book is of interest to researchers and graduate students in computer science, computational logic and artificial intelligence. It can also be used as a textbook in advanced undergraduate courses.

  5. Automated data model evaluation

    International Nuclear Information System (INIS)

    Kazi, Zoltan; Kazi, Ljubica; Radulovic, Biljana

    2012-01-01

    The modeling process is an essential phase within information systems development and implementation. This paper presents methods and techniques for analysis and evaluation of data model correctness. Recent methodologies and development results regarding automation of the process of model correctness analysis, and its relation to ontology tools, have been presented. Key words: Database modeling, Data model correctness, Evaluation

  6. Semi-automated entry of clinical temporal-abstraction knowledge.

    Science.gov (United States)

    Shahar, Y; Chen, H; Stites, D P; Basso, L V; Kaizer, H; Wilson, D M; Musen, M A

    1999-01-01

    The authors discuss the usability of an automated tool that supports entry, by clinical experts, of the knowledge necessary for forming high-level concepts and patterns from raw time-oriented clinical data. Based on their previous work on the RESUME system for forming high-level concepts from raw time-oriented clinical data, the authors designed a graphical knowledge acquisition (KA) tool that acquires the knowledge required by RESUME. This tool was designed using Protégé, a general framework and set of tools for the construction of knowledge-based systems. The usability of the KA tool was evaluated by three expert physicians and three knowledge engineers in three domains: the monitoring of children's growth, the care of patients with diabetes, and protocol-based care in oncology and in experimental therapy for AIDS. The study evaluated the usability of the KA tool for the entry of previously elicited knowledge. The authors recorded the time required to understand the methodology and the KA tool and to enter the knowledge; they examined the subjects' qualitative comments; and they compared the output abstractions with benchmark abstractions computed from the same data and a version of the same knowledge entered manually by RESUME experts. Understanding RESUME required 6 to 20 hours (median, 15 to 20 hours); learning to use the KA tool required 2 to 6 hours (median, 3 to 4 hours). Entry times for physicians varied by domain: 2 to 20 hours for growth monitoring (median, 3 hours), 6 and 12 hours for diabetes care, and 5 to 60 hours for protocol-based care (median, 10 hours). An increase in speed of up to 25 times (median, 3 times) was demonstrated for all participants when the KA process was repeated. On their first attempt at using the tool to enter the knowledge, the knowledge engineers recorded entry times similar to those of the expert physicians' second attempt at entering the same knowledge. In all cases RESUME, using knowledge entered by means of the KA tool

  7. ABSTRACT MODELS FOR SYSTEM VIRTUALIZATION

    Directory of Open Access Journals (Sweden)

    M. G. Koveshnikov

    2015-05-01

    Full Text Available The paper is dedicated to issues of securing system objects (system files and user system or application configuration files) against unauthorized access, including denial of service attacks. We have suggested the method and developed abstract system virtualization models, which are used to research attack scenarios for different virtualization modes. An estimate of the effectiveness of the system-tools virtualization technology is given. The suggested technology is based on redirection of access requests to system objects shared among access subjects. Whole and partial system virtualization modes have been modeled. The difference between them is the following: in the whole virtualization mode, copies of all accessed system objects are created, whereon subjects' requests are redirected, including corresponding application objects; in the partial virtualization mode, corresponding copies are created only for part of a system, for example, only system objects for applications. The effectiveness of alternative solutions is evaluated for different attack scenarios. We consider a proprietary, approved technical solution which implements the system virtualization method for the Microsoft Windows OS family. Administrative simplicity and the capabilities of correspondingly designed system-object security tools are illustrated by this example. The practical significance of the suggested security method has been confirmed.
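
    The redirection idea can be sketched in a few lines: write requests from a subject to a shared system object are served from a per-subject copy, leaving the original untouched. This is a toy in-memory illustration of the general mechanism, not the Windows implementation the paper describes; all names are invented.

```python
# Copy-on-write redirection of access requests to shared system objects.
# The "object store" is an in-memory dict standing in for a file system.

class VirtualizedStore:
    def __init__(self, system_objects):
        self.system = dict(system_objects)   # shared originals
        self.copies = {}                     # (subject, name) -> private copy

    def read(self, subject, name):
        # A subject sees its own copy if one exists, else the shared original.
        return self.copies.get((subject, name), self.system[name])

    def write(self, subject, name, value):
        # Redirect: never touch the shared original.
        self.copies[(subject, name)] = value

store = VirtualizedStore({"hosts": "127.0.0.1 localhost"})
store.write("malware.exe", "hosts", "tampered")
print(store.read("malware.exe", "hosts"))   # tampered (its own copy)
print(store.read("explorer.exe", "hosts"))  # 127.0.0.1 localhost (untouched)
```

Partial virtualization, in this sketch, would simply restrict which object names are eligible for redirection.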

  8. Automated spatial and thematic generalization using a context transformation model : integrating steering parameters, classification and aggregation hierarchies, reduction factors, and topological structures for multiple abstractions

    NARCIS (Netherlands)

    Richardson, D.E.

    1993-01-01

    This dissertation presents a model for spatial and thematic digital generalization. To do so, the development of digital generalization over the last thirty years is first reviewed

    The approach to generalization taken in this research differs from other existing works as

  9. Automated planning through abstractions in dynamic and stochastic environments

    OpenAIRE

    Martínez Muñoz, Moisés

    2016-01-01

    International Mention in the doctoral degree. Generating sequences of actions - plans - for an automatic system, like a robot, using Automated Planning is particularly difficult in stochastic and/or dynamic environments. These plans are composed of actions whose execution, in certain scenarios, might fail, which in turn prevents the execution of the rest of the actions in the plan. Also, in some environments, plans must be generated fast, both at the start of the execution and after every ex...

  10. An Abstraction Theory for Qualitative Models of Biological Systems

    Directory of Open Access Journals (Sweden)

    Richard Banks

    2010-10-01

    Full Text Available Multi-valued network (MVN) models are an important qualitative modelling approach used widely by the biological community. In this paper we consider developing an abstraction theory for multi-valued network models that allows the state space of a model to be reduced while preserving key properties of the model. This is important as it aids the analysis and comparison of multi-valued networks and, in particular, helps address the well-known problem of state space explosion associated with such analysis. We also consider developing techniques for efficiently identifying abstractions and so provide a basis for the automation of this task. We illustrate the theory and techniques developed by investigating the identification of abstractions for two published MVN models of the lysis-lysogeny switch in the bacteriophage lambda.
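
    At its simplest, such an abstraction is a surjective mapping from concrete activation levels to a smaller abstract domain, with the transition relation lifted through the mapping. A minimal sketch follows; the tiny network is invented for illustration, not one of the bacteriophage lambda models.

```python
# Lift a concrete transition relation to an abstract state space by
# applying a level-merging map to both endpoints of every transition.

def abstract_transitions(transitions, mapping):
    """Return the abstract transition relation induced by `mapping`."""
    return {(mapping[s], mapping[t]) for s, t in transitions}

# A gene with levels 0..3; levels {0,1} behave as "low", {2,3} as "high".
concrete = {(0, 1), (1, 2), (2, 3), (3, 2), (2, 1)}
mapping = {0: "low", 1: "low", 2: "high", 3: "high"}
print(len(abstract_transitions(concrete, mapping)))  # 4 abstract transitions
```

Five concrete transitions collapse to four abstract ones; the paper's contribution is characterizing when such a reduction preserves the properties of interest.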

  11. Abstract

    African Journals Online (AJOL)

    PROF. OLIVER OSUAGWA

    Abstract. Many mathematical models of stochastic dynamical systems were based on the assumption that the drift and volatility coefficients were linear functions of the solution. In this work, we arrive at the drift and the volatility by observing the dynamics of change in the selected stocks in a sufficiently small interval ∆t.

  12. Engineering Abstractions in Model Checking and Testing

    DEFF Research Database (Denmark)

    Achenbach, Michael; Ostermann, Klaus

    2009-01-01

    Abstractions are used in model checking to tackle problems like state space explosion or modeling of IO. The application of these abstractions in real software development processes, however, lacks engineering support. This is one reason why model checking is not widely used in practice yet, and testing is still state of the art in falsification. We show how user-defined abstractions can be integrated into a Java PathFinder setting with tools like AspectJ or Javassist and discuss implications of remaining weaknesses of these tools. We believe that a principled engineering approach to designing...

  13. SATURATED ZONE FLOW AND TRANSPORT MODEL ABSTRACTION

    International Nuclear Information System (INIS)

    B.W. ARNOLD

    2004-01-01

    The purpose of the saturated zone (SZ) flow and transport model abstraction task is to provide radionuclide-transport simulation results for use in the total system performance assessment (TSPA) for license application (LA) calculations. This task includes assessment of uncertainty in parameters that pertain to both groundwater flow and radionuclide transport in the models used for this purpose. This model report documents the following: (1) The SZ transport abstraction model, which consists of a set of radionuclide breakthrough curves at the accessible environment for use in the TSPA-LA simulations of radionuclide releases into the biosphere. These radionuclide breakthrough curves contain information on radionuclide-transport times through the SZ. (2) The SZ one-dimensional (1-D) transport model, which is incorporated in the TSPA-LA model to simulate the transport, decay, and ingrowth of radionuclide decay chains in the SZ. (3) The analysis of uncertainty in groundwater-flow and radionuclide-transport input parameters for the SZ transport abstraction model and the SZ 1-D transport model. (4) The analysis of the background concentration of alpha-emitting species in the groundwater of the SZ
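
    A breakthrough curve of the kind such an abstraction delivers can be illustrated with the classical 1-D advection-dispersion solution for a continuous source. All parameter values below are invented for the sketch, not site data from the SZ model.

```python
import math

# Analytical 1-D advection-dispersion breakthrough curve (continuous
# source, semi-infinite domain): C/C0 = 0.5 * erfc((x - v*t) / (2*sqrt(D*t))).

def breakthrough(x, t, v, D):
    """Relative concentration C/C0 at distance x (m) and time t (yr), for
    pore velocity v (m/yr) and dispersion coefficient D (m^2/yr)."""
    if t <= 0:
        return 0.0
    return 0.5 * math.erfc((x - v * t) / (2.0 * math.sqrt(D * t)))

# 5 km to the accessible environment, v = 10 m/yr, D = 100 m^2/yr.
# C/C0 reaches 0.5 when the mean front (v*t) arrives at x.
for t in (100.0, 500.0, 2000.0):
    print(t, round(breakthrough(5000.0, t, 10.0, 100.0), 3))
```

In a TSPA setting, families of such curves (over sampled parameter values) are what the abstraction hands to the downstream simulator.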

  14. Visual automated macromolecular model building.

    Science.gov (United States)

    Langer, Gerrit G; Hazledine, Saul; Wiegels, Tim; Carolan, Ciaran; Lamzin, Victor S

    2013-04-01

    Automated model-building software aims at the objective interpretation of crystallographic diffraction data by means of the construction or completion of macromolecular models. Automated methods have rapidly gained in popularity as they are easy to use and generate reproducible and consistent results. However, the process of model building has become increasingly hidden and the user is often left to decide on how to proceed further with little feedback on what has preceded the output of the built model. Here, ArpNavigator, a molecular viewer tightly integrated into the ARP/wARP automated model-building package, is presented that directly controls model building and displays the evolving output in real time in order to make the procedure transparent to the user.

  15. An Abstract Model of Historical Processes

    Directory of Open Access Journals (Sweden)

    Michael Poulshock

    2017-06-01

    Full Text Available A theoretical model is presented which provides a way to simulate, at a very abstract level, power struggles in the social world. In the model, agents can benefit or harm each other, to varying degrees and with differing levels of influence. The agents interact over time, using the power they have to try to get more of it, while being constrained in their strategic choices by social inertia. The outcomes of the model are probabilistic. More research is needed to determine whether the model has any empirical validity.
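
    A toy realization of such a model can make the setup concrete: agents repeatedly contest power, stronger agents win more often, and social inertia caps how much can change per interaction. Every rule and number below is this sketch's assumption, not the paper's specification.

```python
import random

# Toy power-struggle simulation: at each step two random agents contest,
# the winner (chosen with probability proportional to current power)
# transfers a small fraction of the loser's power to itself.

def simulate(n_agents=4, steps=1000, seed=0):
    rng = random.Random(seed)
    power = [1.0] * n_agents
    for _ in range(steps):
        a, b = rng.sample(range(n_agents), 2)
        if rng.random() < power[a] / (power[a] + power[b]):
            delta = 0.1 * power[b]   # social inertia: only small transfers
            power[a] += delta
            power[b] -= delta
    return power

final = simulate()
print(round(sum(final), 3))  # total power is conserved: 4.0
```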

  16. ABSTRACT

    African Journals Online (AJOL)

    University Health Services, Ahmadu Bello University, Zaria, Nigeria. ABSTRACT. Physico-chemical methods were used to analyse the commonly used kwalli samples bought from Zaria and Kano local markets. Blood-lead concentrations in kwalli ...

  17. Tailored model abstraction in performance assessments

    International Nuclear Information System (INIS)

    Kessler, J.H.

    1995-01-01

    Total System Performance Assessments (TSPAs) are likely to be one of the most significant parts of making safety cases for the continued development and licensing of geologic repositories for the disposal of spent fuel and HLW. Thus, it is critical that the TSPA model capture the 'essence' of the physical processes relevant to demonstrating that the appropriate regulation is met. But how much detail about the physical processes must be modeled and understood before there is enough confidence that the appropriate essence has been captured? In this summary the level of model abstraction that is required is discussed. Approaches for subsystem and total system performance analyses are outlined, and the role of best-estimate models is examined. It is concluded that a conservative approach for repository performance, based on a limited amount of field and laboratory data, can provide sufficient confidence for a regulatory decision

  18. Abstract

    African Journals Online (AJOL)

    Francis

    Abstract. Aqueous, methanol and chloroform extracts from the leaves of Ficus religiosa, Thespesia populnea and Hibiscus tiliaceus were completely screened for antibacterial and antifungal activity. The chloroform extract of F. religiosa possessed a broad spectrum of antibacterial activity with a zone of inhibition of 10 to 21 ...

  19. Abstract

    African Journals Online (AJOL)

    Abstract. A study was carried out to investigate the effect of oversowing legumes on rangeland performance in Shinyanga region, Tanzania. Four leguminous species, namely Centrosema pubescens, Clitoria ternatea, Macroptilium atropurpureum and Stylosanthes hamata, were oversown in a natural rangeland in a

  20. Abstract

    Indian Academy of Sciences (India)

    65

    Abstract. For well over three hundred years, the monsoon has been considered to be a gigantic land-sea breeze driven by the land-ocean contrast in surface temperature. In this paper, this hypothesis ..... primary driver of the monsoon in many papers and most textbooks (e.g. Lau and Li, 1984,. Webster 1987a, Meehl 1994, ...

  1. ABSTRACT

    African Journals Online (AJOL)

    Email: jameskigera@yahoo.co.uk. ABSTRACT. Background: Implant orthopaedic surgery is associated with a risk of post-operative Surgical Site Infection (SSI). This can have devastating consequences in the case of arthroplasty. Due to the less than ideal circumstances under which surgery is conducted in Africa, there are ...

  2. Abstract

    African Journals Online (AJOL)

    WORKERS ON THEIR JOB PERFORMANCE IN IMO STATE, NIGERIA. NGOZI OKEREKE AND N.O. ONU. ABSTRACT. The study focused on the effect of socioeconomic characteristics of field extension workers on their job performance in Imo State agricultural development programme, Nigeria. Data was collected with the ...

  3. An Office Automation Needs Assessment Model

    Science.gov (United States)

    1985-08-01

    office automation needs of an Army hospital. Based on a literature review and interviews with industry experts, a model was developed to assess office automation needs. The model was applied against the needs of the Clinical Support Division. The author identified a need for a strategic plan for office automation prior to analysis of a specific service for automation. He recommended establishing a Hospital Automation Advisory Council to centralize policy recommendations for office automation

  4. Automated Simulation Model Generation

    NARCIS (Netherlands)

    Huang, Y.

    2013-01-01

    One of today's challenges in the field of modeling and simulation is to model increasingly larger and more complex systems. Complex models take long to develop and incur high costs. With the advances in data collection technologies and more popular use of computer-aided systems, more data has become

  5. Physical and Chemical Environmental Abstraction Model

    International Nuclear Information System (INIS)

    Nowak, E.

    2000-01-01

    As directed by a written development plan (CRWMS M and O 1999a), Task 1, an overall conceptualization of the physical and chemical environment (P/CE) in the emplacement drift is documented in this Analysis/Model Report (AMR). Included are the physical components of the engineered barrier system (EBS). The intended use of this descriptive conceptualization is to assist the Performance Assessment Department (PAD) in modeling the physical and chemical environment within a repository drift. It is also intended to assist PAD in providing a more integrated and complete in-drift geochemical model abstraction and to answer the key technical issues raised in the U.S. Nuclear Regulatory Commission (NRC) Issue Resolution Status Report (IRSR) for the Evolution of the Near-Field Environment (NFE) Revision 2 (NRC 1999). EBS-related features, events, and processes (FEPs) have been assembled and discussed in ''EBS FEPs/Degradation Modes Abstraction'' (CRWMS M and O 2000a). Reference AMRs listed in Section 6 address FEPs that have not been screened out. This conceptualization does not directly address those FEPs. Additional tasks described in the written development plan are recommended for future work in Section 7.3. To achieve the stated purpose, the scope of this document includes: (1) the role of in-drift physical and chemical environments in the Total System Performance Assessment (TSPA) (Section 6.1); (2) the configuration of engineered components (features) and critical locations in drifts (Sections 6.2.1 and 6.3, portions taken from EBS Radionuclide Transport Abstraction (CRWMS M and O 2000b)); (3) overview and critical locations of processes that can affect P/CE (Section 6.3); (4) couplings and relationships among features and processes in the drifts (Section 6.4); and (5) identities and uses of parameters transmitted to TSPA by some of the reference AMRs (Section 6.5). 
This AMR originally considered a design with backfill, and is now being updated (REV 00 ICN1) to address

  6. Scalable Automated Model Search

    Science.gov (United States)

    2014-05-20

    tributed learning environment. Specifically, how to best choose between model families for supervised learning problems and configure the... [Figure 3: Search methods were compared across several learning problems.] ...while several of the methods in this paper may apply to this setting, optimizing over this many hyperparameters for learning problems is not a well

  7. Abstract

    African Journals Online (AJOL)

    Getachew

    realistic distribution of no-show data in modeling the cost function was considered using data collected from the .... the paper models the cost function based on realistic probability distributions based on the historical data is a .... Plot of Revenue generated vs. overbooking for the two-class case (at $500 compensation cost ...

  8. ABSTRACT

    Directory of Open Access Journals (Sweden)

    Michelle de Stefano Sabino

    2011-12-01

    Full Text Available This paper aims to describe and analyze the integration observed in the Sintonia project with respect to the comparison of project management processes to the Stage-Gate® model. The literature addresses these issues conceptually but lacks an alignment between them that is evident in practice. The method used was a single case study. The case reported is the Sintonia project, developed by PRODESP - Data Processing Company of São Paulo. The results show the integration of project management processes with the Stage-Gate model developed during the project life cycle. The formalization of the project was defined in stages, which allowed the exploitation of economies of repetition and recombination for the development of new projects. This study contributes a technical vision of the integration of project management processes. It was concluded that this system represents an attractive way, in terms of creating economic value and technological innovation, for the organization.

  9. Spike Neural Models Part II: Abstract Neural Models

    OpenAIRE

    Johnson, Melissa G.; Chartier, Sylvain

    2018-01-01

    Neurons are complex cells that require a lot of time and resources to model completely. In spiking neural networks (SNN), though, not all that complexity is required. Therefore, simple, abstract models are often used. These models save time, use less computer resources, and are easier to understand. This tutorial presents two such models: Izhikevich's model, which is biologically realistic in the resulting spike trains but not in the parameters, and the Leaky Integrate and Fire (LIF) model whic...
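
    For concreteness, here is a minimal Leaky Integrate-and-Fire neuron of the kind the tutorial covers, Euler-integrated in plain Python; the parameter values are generic textbook choices, not ones prescribed by the tutorial.

```python
# Leaky Integrate-and-Fire neuron: membrane potential decays toward rest,
# is driven by an input current, and fires (then resets) at a threshold.

def lif_run(current, steps, dt=1.0, tau=10.0, v_rest=0.0,
            v_thresh=1.0, v_reset=0.0):
    """Euler-integrate dV/dt = (-(V - v_rest) + I) / tau; return spike times."""
    v, spikes = v_rest, []
    for step in range(steps):
        v += dt * (-(v - v_rest) + current) / tau
        if v >= v_thresh:          # fire and reset
            spikes.append(step * dt)
            v = v_reset
    return spikes

print(len(lif_run(current=2.0, steps=100)))  # supra-threshold drive: regular spiking
print(lif_run(current=0.5, steps=100))       # sub-threshold drive: []
```

The sub-threshold case never spikes because V asymptotes at v_rest + I = 0.5, below the threshold of 1.0.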

  10. Multimedia abstract generation of intensive care data: the automation of clinical processes through AI methodologies.

    Science.gov (United States)

    Jordan, Desmond; Rose, Sydney E

    2010-04-01

    Medical errors from communication failures are enormous during the perioperative period of cardiac surgical patients. As caregivers change shifts or surgical patients change location within the hospital, key information is lost or misconstrued. After a baseline cognitive study of information need and caregiver workflow, we implemented an advanced clinical decision support tool of intelligent agents, medical logic modules, and text generators called the "Inference Engine" to summarize individual patients' raw medical data elements into procedural milestones, illness severity, and care therapies. The system generates two displays: 1) for the continuum of care, multimedia abstract generation of intensive care data (MAGIC), an expert system that automatically generates a physician briefing of a cardiac patient's operative course in a multimodal format; and 2) for an isolated point in time, the "Inference Engine", a system that provides a real-time, high-level, summarized depiction of a patient's clinical status. In our studies, system accuracy and efficacy were judged against clinician performance in the workplace. To test the automated physician briefing, MAGIC, the patient's intraoperative course was reviewed in the intensive care unit before patient arrival. It was then judged against the actual physician briefing and that given in a cohort of patients where the system was not used. To test the real-time representation of the patient's clinical status, system inferences were judged against clinician decisions. Changes in workflow and situational awareness were assessed by questionnaires and process evaluation. MAGIC provides 200% more information, twice the accuracy, and enhances situational awareness. This study demonstrates that the automation of clinical processes through AI methodologies yields positive results.

  11. Model-based Abstraction of Data Provenance

    DEFF Research Database (Denmark)

    Probst, Christian W.; Hansen, René Rydhof

    2014-01-01

    Identifying provenance of data provides insights to the origin of data and intermediate results, and has recently gained increased interest due to data-centric applications. In this work we extend a data-centric system view with actors handling the data and policies restricting actions. This extension is based on provenance analysis performed on system models. System models have been introduced to model and analyse spatial and organisational aspects of organisations, to identify, e.g., potential insider threats. Both the models and analyses are naturally modular; models can be combined to bigger models, and the analyses adapt accordingly. Our approach extends provenance both with the origin of data, the actors and processes involved in the handling of data, and policies applied while doing so. The model and corresponding analyses are based on a formal model of spatial and organisational...

  12. Meta-Domains for Automated Model Building

    National Research Council Canada - National Science Library

    Easley, Matthew; Bradley, Elizabeth

    1999-01-01

    .... In particular, we introduce a new structure for automated model building known as a meta-domain which, when instantiated with components, tailors the space of candidate models to the system at hand...

  13. Old and New Models for Office Automation.

    Science.gov (United States)

    Cole, Eliot

    1983-01-01

    Discusses organization design as context for office automation; mature computer-based systems as one application of organization design variables; and emerging office automation systems (organizational information management, personal information management) as another application of these variables. Management information systems models and…

  14. An Abstraction-Based Data Model for Information Retrieval

    Science.gov (United States)

    McAllister, Richard A.; Angryk, Rafal A.

    Language ontologies provide an avenue for automated lexical analysis that may be used to supplement existing information retrieval methods. This paper presents a method of information retrieval that takes advantage of WordNet, a lexical database, to generate paths of abstraction, and uses them as the basis for an inverted index structure to be used in the retrieval of documents from an indexed corpus. We present this method as an entree to a line of research on using ontologies to perform word-sense disambiguation and improve the precision of existing information retrieval techniques.
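
    The idea of indexing by paths of abstraction can be sketched with a toy hypernym table standing in for WordNet: every ancestor concept of a term is added to the inverted index, so an abstract query retrieves documents mentioning only concrete terms. The mini-ontology and names are assumptions of this sketch, not the paper's data structures.

```python
# Inverted index over "paths of abstraction": each indexed term contributes
# itself plus all of its hypernym ancestors as index keys.

HYPERNYMS = {"dog": "canine", "canine": "animal",
             "cat": "feline", "feline": "animal"}  # toy stand-in for WordNet

def abstraction_path(term):
    """Follow hypernym links from a term up to its most abstract ancestor."""
    path = [term]
    while path[-1] in HYPERNYMS:
        path.append(HYPERNYMS[path[-1]])
    return path

def build_index(docs):
    index = {}
    for doc_id, text in docs.items():
        for word in text.split():
            for concept in abstraction_path(word):
                index.setdefault(concept, set()).add(doc_id)
    return index

index = build_index({1: "the dog barked", 2: "a cat slept"})
print(sorted(index["animal"]))  # [1, 2] -- both documents reach "animal"
```

A real implementation would add word-sense disambiguation before choosing which hypernym chain a token contributes.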

  15. Demo abstract: Flexhouse-2-an open source building automation platform with a focus on flexible control

    DEFF Research Database (Denmark)

    Gehrke, Oliver; Kosek, Anna Magdalena; Svendsen, Mathias

    2014-01-01

    , an open-source implementation of a building automation system which has been designed with a strong focus on enabling the integration of the building into a smart power system and dedicated support for the requirements of an R&D environment. We will demonstrate the need for such a platform, discuss...

  16. Home automation for a sustainable living:modelling a detached house in Northern Finland

    OpenAIRE

    Louis, J.-N. (Jean-Nicolas); Caló, A. (Antonio); Leiviskä, K. (Kauko); Pongrácz, E. (Eva)

    2014-01-01

    Abstract This paper presents a model of a detached house in which home automation has been progressively introduced into the building. The model integrates different factors related to end-user behaviour and decision-making regarding the management of electrical energy consumption, and integrates a gradual end-user response to home automation measures. The presented model aims to show the potential economic benefits obtained by the modelled changes of end-users’ behaviours within a smart e...

  17. An abstract machine model of dynamic module replacement

    OpenAIRE

    Walton, Chris; Kırlı, Dilsun; Gilmore, Stephen

    2000-01-01

    In this paper we define an abstract machine model for the mλ typed intermediate language. This abstract machine is used to give a formal description of the operation of run-time module replacement for the programming language Dynamic ML. The essential technical device which we employ for module replacement is a modification of two-space copying garbage collection. We show how the operation of module replacement could be applied to other garbage-collected languages such as Java.

  18. Automating Derivations of Abstract Machines from Reduction Semantics: A Generic Formalization of Refocusing in Coq

    DEFF Research Database (Denmark)

    Sieczkowski, Filip; Biernacka, Małgorzata; Biernacki, Dariusz

    2011-01-01

    We present a generic formalization of the refocusing transformation for functional languages in the Coq proof assistant. The refocusing technique, due to Danvy and Nielsen, allows for mechanical transformation of an evaluator implementing a reduction semantics into an abstract machine equivalent to it. The article is accompanied by a Coq development that contains the formalization of the refocusing method and a number of case studies that serve both as an illustration of the method and as a sanity check on the axiomatization.

  19. Model Checking - Automated Verification of Computational Systems

    Indian Academy of Sciences (India)

Model Checking - Automated Verification of Computational Systems. Madhavan Mukund. General Article, Resonance – Journal of Science Education, Volume 14, Issue 7, July 2009, pp. 667-681.

  20. USSR and Eastern Europe Scientific Abstracts, Cybernetics, Computers, and Automation Technology, Number 30.

    Science.gov (United States)

    1978-01-23

Systems, College of Transportation, Zilina [Abstract] The article discusses programming of the INTEL 8080 microprocessor developed at the College of...Transportation at Zilina. A typical application program of a microcomputer contains 50 to 4,000 instructions. The means of programming depends on the

  1. Efficient family-based model checking via variability abstractions

    DEFF Research Database (Denmark)

    Dimovski, Aleksandar; Al-Sibahi, Ahmad Salim; Brabrand, Claus

    2016-01-01

    Many software systems are variational: they can be configured to meet diverse sets of requirements. They can produce a (potentially huge) number of related systems, known as products or variants, by systematically reusing common parts. For variational models (variational systems or families...... with the abstract model checking of the concrete high-level variational model. This allows the use of Spin with all its accumulated optimizations for efficient verification of variational models without any knowledge about variability. We have implemented the transformations in a prototype tool, and we illustrate...

  2. An automated framework for QSAR model building.

    Science.gov (United States)

    Kausar, Samina; Falcao, Andre O

    2018-01-16

In-silico quantitative structure-activity relationship (QSAR) model-based tools are widely used to screen huge databases of compounds in order to determine the biological properties of chemical molecules based on their chemical structure. With the passage of time, the exponentially growing amount of data on synthesized and known chemicals demands computationally efficient automated QSAR modeling tools that are available to researchers who may lack extensive knowledge of machine learning modeling. Thus, a fully automated and advanced modeling platform can be an important addition to the QSAR community. In the presented workflow, the process from data preparation to model building and validation has been completely automated. The workflow focuses on the most critical modeling tasks (data curation, evaluation of data set characteristics, variable selection, and validation) that largely influence the performance of QSAR models. It also includes the ability to quickly evaluate the feasibility of modeling a given data set. The developed framework was tested on data sets for thirty different problems. The best-optimized feature selection methodology in the developed workflow is able to remove 62-99% of all redundant data. On average, feature selection reduced prediction error by about 19%, producing a 49% increase in the percentage of variance explained (PVE) compared to models without feature selection. Selecting only the models with a modelability score above 0.6, average PVE scores were 0.71. A strong correlation was verified between the modelability scores and the PVE of the models produced with variable selection. We developed an extendable and highly customizable, fully automated QSAR modeling framework. This workflow does not require any advanced parameterization, nor does it depend on users' decisions or expertise in machine learning/programming. Given just a target or problem, the workflow follows an unbiased standard protocol to develop reliable QSAR models.
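As a hedged illustration of the variable-selection step (the workflow's actual method is not detailed in the abstract), the sketch below drops descriptors that are highly correlated with ones already kept; all descriptor names and values are invented:

```python
# Minimal redundancy filter: keep a descriptor only if it is not highly
# correlated with a descriptor already kept. Illustrative only; not the
# paper's actual feature-selection methodology.

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy) if sx and sy else 0.0

def drop_redundant(features, threshold=0.95):
    """`features` maps descriptor name -> list of values (same compounds)."""
    kept = []
    for name in features:
        if all(abs(pearson(features[name], features[k])) < threshold for k in kept):
            kept.append(name)
    return kept

descriptors = {
    "mol_weight":  [100.0, 150.0, 200.0, 250.0],
    "heavy_atoms": [10.0, 15.0, 20.0, 25.0],   # perfectly correlated with weight
    "logp":        [1.2, 0.8, 2.5, 1.9],
}
selected = drop_redundant(descriptors)  # "heavy_atoms" is dropped as redundant
```

Real workflows typically combine such unsupervised filters with supervised selection against the modeled endpoint.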

  3. Particle Tracking Model and Abstraction of Transport Processes

    International Nuclear Information System (INIS)

    Robinson, B.

    2004-01-01

The purpose of this report is to document the abstraction model being used in total system performance assessment (TSPA) model calculations for radionuclide transport in the unsaturated zone (UZ). The UZ transport abstraction model uses the particle-tracking method that is incorporated into the finite element heat and mass model (FEHM) computer code (Zyvoloski et al. 1997 [DIRS 100615]) to simulate radionuclide transport in the UZ. This report outlines the assumptions, design, and testing of a model for calculating radionuclide transport in the UZ at Yucca Mountain. In addition, methods for determining and inputting transport parameters are outlined for use in the TSPA for license application (LA) analyses. Process-level transport model calculations are documented in another report for the UZ (BSC 2004 [DIRS 164500]). Three-dimensional, dual-permeability flow fields generated to characterize UZ flow (documented by BSC 2004 [DIRS 169861]; DTN: LB03023DSSCP9I.001 [DIRS 163044]) are converted to make them compatible with the FEHM code for use in this abstraction model. This report establishes the numerical method and demonstrates the use of the model that is intended to represent UZ transport in the TSPA-LA. The capability of the UZ barrier to retard transport is demonstrated in this report and by the underlying process model (BSC 2004 [DIRS 164500]). The technical scope, content, and management of this report are described in the planning document ''Technical Work Plan for: Unsaturated Zone Transport Model Report Integration'' (BSC 2004 [DIRS 171282]). Deviations from the technical work plan (TWP) are noted within the text of this report, as appropriate. The latest version of this document is being prepared principally to correct parameter values found to be in error due to transcription errors, changes in source data that were not captured in the report, calculation errors, and errors in interpretation of source data.
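The particle-tracking idea can be illustrated with a textbook one-dimensional random-walk scheme for advection-dispersion (a generic sketch, not the FEHM implementation; all parameter values are arbitrary):

```python
# Random-walk particle tracking for 1D advection-dispersion:
# each particle takes a deterministic advective step plus a Gaussian
# dispersive step with variance 2*D*dt per time step.
import random

def track_particles(n_particles, velocity, dispersion, dt, n_steps, seed=0):
    rng = random.Random(seed)
    positions = [0.0] * n_particles
    for _ in range(n_steps):
        for i in range(n_particles):
            step = velocity * dt + rng.gauss(0.0, (2.0 * dispersion * dt) ** 0.5)
            positions[i] += step
    return positions

# 500 particles, unit velocity, total simulated time 10:
pos = track_particles(n_particles=500, velocity=1.0, dispersion=0.1, dt=0.1, n_steps=100)
mean_pos = sum(pos) / len(pos)  # plume center, expected near velocity * time = 10
```

The spread of the particle cloud approximates the dispersive term of the advection-dispersion equation, which is why the method scales well to the large dual-permeability flow fields described above.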

  4. USSR and Eastern Europe Scientific Abstracts, Cybernetics, Computers and Automation Technology, Number 34

    Science.gov (United States)

    1978-07-26

pp Ul-ljlt manuscript received 9 Mar 77 DOLGOPOLOV, ALEKSANDR SERGYEVICH, senior engineer, Pskov Radio Components Plant (Pskov) and...Jul 76 BORODKIN, A. M., BORODKIN, L. I., GURIN, N. N., KOGAN, YA. A., LYAPICHEVA, N. G. and MUCHNIK, I. B., Moscow [Abstract] The optimal speed...Kiev); GULYAYEV, ALEKSANDR IVANOVICH, candidate in technical sciences, RIUTs, Ministry of Public Health UkrSSR (Kiev); and DEMENTKOVA, ANNA

  5. Automating risk analysis of software design models.

    Science.gov (United States)

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.
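A minimal sketch of the two data structures can convey the idea (the rule names, costs, and schema below are invented for illustration and are not AutSEC's actual representation):

```python
# Toy identification/mitigation structures: identification rules match
# properties of a design element to threats; the mitigation tree maps each
# threat to candidate countermeasures with a relative cost.

identification_rules = {
    # element property -> threat identified when the property holds
    "crosses_trust_boundary": "tampering",
    "stores_credentials": "information_disclosure",
}

mitigation_tree = {
    "tampering": [("message_signing", 3), ("mutual_tls", 5)],
    "information_disclosure": [("encryption_at_rest", 4)],
}

def analyze(element):
    """Return a threat -> mitigation plan for one design element."""
    threats = [t for prop, t in identification_rules.items() if element.get(prop)]
    plan = {}
    for t in threats:
        # cost-aware choice: pick the cheapest mitigation for each threat
        plan[t] = min(mitigation_tree[t], key=lambda m: m[1])[0]
    return plan

data_store = {"stores_credentials": True, "crosses_trust_boundary": False}
plan = analyze(data_store)
```

In a full tool the elements would come from a data flow diagram and the cost model would reflect specification requirements, as the abstract describes.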

  6. Automating Risk Analysis of Software Design Models

    Directory of Open Access Journals (Sweden)

    Maxime Frydman

    2014-01-01

    Full Text Available The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.

  7. Modeling Increased Complexity and the Reliance on Automation: FLightdeck Automation Problems (FLAP) Model

    Science.gov (United States)

    Ancel, Ersin; Shih, Ann T.

    2014-01-01

    This paper highlights the development of a model that is focused on the safety issue of increasing complexity and reliance on automation systems in transport category aircraft. Recent statistics show an increase in mishaps related to manual handling and automation errors due to pilot complacency and over-reliance on automation, loss of situational awareness, automation system failures and/or pilot deficiencies. Consequently, the aircraft can enter a state outside the flight envelope and/or air traffic safety margins which potentially can lead to loss-of-control (LOC), controlled-flight-into-terrain (CFIT), or runway excursion/confusion accidents, etc. The goal of this modeling effort is to provide NASA's Aviation Safety Program (AvSP) with a platform capable of assessing the impacts of AvSP technologies and products towards reducing the relative risk of automation related accidents and incidents. In order to do so, a generic framework, capable of mapping both latent and active causal factors leading to automation errors, is developed. Next, the framework is converted into a Bayesian Belief Network model and populated with data gathered from Subject Matter Experts (SMEs). With the insertion of technologies and products, the model provides individual and collective risk reduction acquired by technologies and methodologies developed within AvSP.

  8. Abstracts of the symposium on unsaturated flow and transport modeling

    International Nuclear Information System (INIS)

    1982-03-01

    Abstract titles are: Recent developments in modeling variably saturated flow and transport; Unsaturated flow modeling as applied to field problems; Coupled heat and moisture transport in unsaturated soils; Influence of climatic parameters on movement of radionuclides in a multilayered saturated-unsaturated media; Modeling water and solute transport in soil containing roots; Simulation of consolidation in partially saturated soil materials; modeling of water and solute transport in unsaturated heterogeneous fields; Fluid dynamics and mass transfer in variably-saturated porous media; Solute transport through soils; One-dimensional analytical transport modeling; Convective transport of ideal tracers in unsaturated soils; Chemical transport in macropore-mesopore media under partially saturated conditions; Influence of the tension-saturated zone on contaminant migration in shallow water regimes; Influence of the spatial distribution of velocities in porous media on the form of solute transport; Stochastic vs deterministic models for solute movement in the field; and Stochastic analysis of flow and solute transport

  9. Automation model of sewerage rehabilitation planning.

    Science.gov (United States)

    Yang, M D; Su, T C

    2006-01-01

The major steps of sewerage rehabilitation include inspection of the sewerage, assessment of structural conditions, computation of structural condition grades, and determination of rehabilitation methods and materials. Conventionally, sewerage rehabilitation planning relies on experts with professional backgrounds, which is tedious and time-consuming. This paper proposes an automation model for planning optimal sewerage rehabilitation strategies by integrating image processing, clustering technology, optimization, and visualization display. Firstly, image processing techniques, such as wavelet transformation and co-occurrence feature extraction, were employed to extract various characteristics of structural failures from CCTV inspection images. Secondly, a classification neural network was established to automatically interpret the structural conditions by comparing the extracted features with the typical failures in a databank. Then, to achieve optimal rehabilitation efficiency, a genetic algorithm was used to determine appropriate rehabilitation methods and substitution materials for the pipe sections at risk of malfunction or even collapse. Finally, the result from the automation model can be visualized in a geographic information system in which essential information about the sewer system and sewerage rehabilitation plans is graphically displayed. For demonstration, the automation model of optimal sewerage rehabilitation planning was applied to a sewer system in east Taichung, Taiwan.
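The genetic-algorithm step can be sketched as follows (method names, costs, strengths, and condition grades are invented; the paper's actual encoding and operators are not reproduced):

```python
# Toy GA choosing one rehabilitation method per pipe section so that the
# method is strong enough for the section's condition grade, at minimum cost.
import random

METHODS = [("patch", 1, 1), ("lining", 3, 2), ("replace", 8, 3)]  # (name, cost, strength)
pipe_grades = [1, 3, 2, 3, 1]  # condition grade per section; higher = worse

def fitness(plan):
    cost = 0
    for grade, m in zip(pipe_grades, plan):
        _, c, strength = METHODS[m]
        cost += c
        if strength < grade:   # method too weak for the damage
            cost += 100        # heavy infeasibility penalty
    return -cost               # higher fitness = cheaper feasible plan

def evolve(generations=200, pop_size=30, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randrange(len(METHODS)) for _ in pipe_grades] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]           # elitist truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, len(pipe_grades))
            child = a[:cut] + b[cut:]              # one-point crossover
            if rng.random() < 0.2:                 # mutation
                child[rng.randrange(len(child))] = rng.randrange(len(METHODS))
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
```

Any plan without penalties has a total cost of at most 40 here, so a fitness above -100 indicates that the GA found a feasible (penalty-free) plan.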

  10. Modeling of Prepregs during Automated Draping Sequences

    DEFF Research Database (Denmark)

    Krogh, Christian; Glud, Jens Ammitzbøll; Jakobsen, Johnny

    2017-01-01

    significant quality variations in the final part. Thus, an automated layup solution is under development where a robot can manipulate and drape the prepregs into the mold. The success of this implementation relies on both accurate and computationally efficient models describing the mechanical behavior...... aspect of the draping must be taken into account. The accurate modeling is accomplished with an explicit Finite Element (FE) scheme with shell elements. Material characterization in the form of uniaxial tensile tests, bias-extension tests (45 ° tensile test) and bending tests provide input for the model...

  11. Integration of Automated Decision Support Systems with Data Mining Abstract: A Client Perspective

    OpenAIRE

    Abdullah Saad AL-Malaise

    2013-01-01

Customers’ behavior and satisfaction always play an important role in increasing an organization’s growth and market value. Customers are the top priority for a growing organization building up its business. This paper presents the architecture of a Decision Support System (DSS) for dealing with customer enquiries and requests. The main purpose behind the proposed model is to enhance customer satisfaction and behavior using DSS. We propose a model by extension in tradition...

  12. Automated statistical modeling of analytical measurement systems

    International Nuclear Information System (INIS)

    Jacobson, J.J.

    1992-01-01

The statistical modeling of analytical measurement systems at the Idaho Chemical Processing Plant (ICPP) has been completely automated through computer software. The statistical modeling of analytical measurement systems is one part of a complete quality control program used by the Remote Analytical Laboratory (RAL) at the ICPP. The quality control program is an integration of automated data input, measurement system calibration, database management, and statistical process control. The quality control program and statistical modeling program meet the guidelines set forth by the American Society for Testing and Materials and the American National Standards Institute. A statistical model is a set of mathematical equations describing any systematic bias inherent in a measurement system and the precision of a measurement system. A statistical model is developed from data generated from the analysis of control standards. Control standards are samples which are made up at precise known levels by an independent laboratory and submitted to the RAL. The RAL analysts who process control standards do not know the values of those control standards. The object behind statistical modeling is to describe real process samples in terms of their bias and precision and to verify that a measurement system is operating satisfactorily. The processing of control standards gives us this ability.
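The two quantities a statistical model captures, systematic bias and precision, can be estimated from repeated analyses of a control standard with a known value, for example (the measurement values below are invented):

```python
# Estimate bias and precision from repeated analyses of a control standard
# whose true value is known to the independent preparing laboratory.

def bias_and_precision(measurements, known_value):
    n = len(measurements)
    mean = sum(measurements) / n
    bias = mean - known_value                            # systematic error
    variance = sum((m - mean) ** 2 for m in measurements) / (n - 1)
    precision = variance ** 0.5                          # sample standard deviation
    return bias, precision

# Control standard prepared at a known level of 50.0 units:
results = [50.4, 49.8, 50.6, 50.2, 50.0]
bias, precision = bias_and_precision(results, 50.0)
```

In practice such estimates feed control charts that flag when a measurement system drifts out of its modeled bias/precision envelope.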

  13. Modeling of prepregs during automated draping sequences

    Science.gov (United States)

    Krogh, Christian; Glud, Jens A.; Jakobsen, Johnny

    2017-10-01

The behavior of woven prepreg fabric during automated draping sequences is investigated. A drape tool under development with an arrangement of grippers facilitates the placement of a woven prepreg fabric in a mold. It is essential that the draped configuration is free from wrinkles and other defects. The present study aims at setting up a virtual draping framework capable of modeling the draping process from the initial flat fabric to the final double-curved shape, and at assisting the development of an automated drape tool. The virtual draping framework consists of a kinematic mapping algorithm used to generate target points on the mold, which are used as input to a draping sequence planner. The draping sequence planner prescribes the displacement history for each gripper in the drape tool, and these displacements are then applied to each gripper in a transient model of the draping sequence. The model is based on a transient finite element analysis with the material's constitutive behavior currently approximated as linear elastic orthotropic. In-plane tensile and bias-extension tests as well as bending tests are conducted and used as input for the model. The virtual draping framework shows good potential for obtaining a better understanding of the drape process and guiding the development of the drape tool. However, results obtained from using the framework on a simple test case indicate that the generation of draping sequences is non-trivial.

  14. The Abstract Machine Model for Transaction-based System Control

    Energy Technology Data Exchange (ETDEWEB)

    Chassin, David P.

    2003-01-31

Recent work applying statistical mechanics to economic modeling has demonstrated the effectiveness of using thermodynamic theory to address the complexities of large scale economic systems. Transaction-based control systems depend on the conjecture that when control of thermodynamic systems is based on price-mediated strategies (e.g., auctions, markets), the optimal allocation of resources in a market-based control system results in an emergent optimal control of the thermodynamic system. This paper proposes an abstract machine model as the necessary precursor for demonstrating this conjecture and establishes the dynamic laws as the basis for a special theory of emergence applied to the global behavior and control of complex adaptive systems. The abstract machine in a large system amounts to the analog of a particle in thermodynamic theory. These permit the establishment of a theory of dynamic control of complex system behavior based on statistical mechanics. Thus we may be better able to engineer a few simple control laws for a very small number of device types, which, when deployed in very large numbers and operated as a system of many interacting markets, yields the stable and optimal control of the thermodynamic system.
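A toy price-mediated allocation conveys the flavor of transaction-based control (a uniform-price clearing sketch with invented devices and bids; the paper's abstract-machine formalism is not shown):

```python
# Uniform-price market clearing: devices bid a price per unit of a shared
# resource; the highest bids are served until capacity runs out, and the
# clearing price is the lowest bid that was served.

def clear_market(bids, capacity):
    """bids: list of (device, price, quantity). Returns (allocations, price)."""
    served, used, price = [], 0, 0.0
    for device, p, q in sorted(bids, key=lambda b: -b[1]):
        take = min(q, capacity - used)
        if take <= 0:
            break
        served.append((device, take))
        used += take
        price = p
    return served, price

bids = [("heater", 0.30, 5), ("charger", 0.10, 5), ("fridge", 0.25, 3)]
served, price = clear_market(bids, capacity=8)
```

The point of the abstract-machine view is that each device only needs this simple local bidding law; system-level optimality is meant to emerge from many such interacting markets.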

  15. A Multiple Agent Model of Human Performance in Automated Air Traffic Control and Flight Management Operations

    Science.gov (United States)

    Corker, Kevin; Pisanich, Gregory; Condon, Gregory W. (Technical Monitor)

    1995-01-01

A predictive model of human operator performance (flight crew and air traffic control (ATC)) has been developed and applied in order to evaluate the impact of automation developments in flight management and air traffic control. The model is used to predict the performance of a two-person flight crew and the ATC operators generating and responding to clearances aided by the Center TRACON Automation System (CTAS). The purpose of the modeling is to support evaluation and design of automated aids for flight management and airspace management, and to predict required changes in procedure, both air and ground, in response to advancing automation in both domains. Additional information is contained in the original extended abstract.

  16. Specification of Interlevel Relations for Agent Models in Multiple Abstraction Dimensions

    NARCIS (Netherlands)

    Treur, J.; Mehrotra et al., K.G.

    2011-01-01

    Multiagent systems for a certain application area can be modelled at multiple levels of abstraction. Interlevel relations are a means to relate models from different abstraction levels. Three dimensions of abstraction often occurring are the process abstraction, temporal abstraction, and agent

  17. An automation model of Effluent Treatment Plant

    Directory of Open Access Journals (Sweden)

    Luiz Alberto Oliveira Lima Roque

    2012-07-01

    on the conservation of water resources, this paper aims to propose an automation model of an Effluent Treatment Plant, using Ladder programming language and supervisory systems.

  18. Automated adaptive inference of phenomenological dynamical models

    Science.gov (United States)

    Daniels, Bryan

    Understanding the dynamics of biochemical systems can seem impossibly complicated at the microscopic level: detailed properties of every molecular species, including those that have not yet been discovered, could be important for producing macroscopic behavior. The profusion of data in this area has raised the hope that microscopic dynamics might be recovered in an automated search over possible models, yet the combinatorial growth of this space has limited these techniques to systems that contain only a few interacting species. We take a different approach inspired by coarse-grained, phenomenological models in physics. Akin to a Taylor series producing Hooke's Law, forgoing microscopic accuracy allows us to constrain the search over dynamical models to a single dimension. This makes it feasible to infer dynamics with very limited data, including cases in which important dynamical variables are unobserved. We name our method Sir Isaac after its ability to infer the dynamical structure of the law of gravitation given simulated planetary motion data. Applying the method to output from a microscopically complicated but macroscopically simple biological signaling model, it is able to adapt the level of detail to the amount of available data. Finally, using nematode behavioral time series data, the method discovers an effective switch between behavioral attractors after the application of a painful stimulus.
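The underlying idea of searching over model complexities and letting the data select the simplest adequate "law" can be sketched with polynomial fits scored by the Bayesian information criterion (an illustration of the principle only, not the Sir Isaac algorithm; the data below are synthetic):

```python
# Fit polynomial models of increasing complexity and pick the one with the
# lowest BIC = n*ln(RSS/n) + k*ln(n), trading fit against parameter count.
import math

def polyfit(xs, ys, degree):
    """Least-squares polynomial fit via normal equations + Gaussian elimination."""
    k = degree + 1
    A = [[sum(x ** (i + j) for x in xs) for j in range(k)] for i in range(k)]
    b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(k)]
    for col in range(k):                       # forward elimination with pivoting
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            A[r] = [a - f * c for a, c in zip(A[r], A[col])]
            b[r] -= f * b[col]
    coef = [0.0] * k
    for i in reversed(range(k)):               # back substitution
        coef[i] = (b[i] - sum(A[i][j] * coef[j] for j in range(i + 1, k))) / A[i][i]
    return coef

def bic(xs, ys, coef):
    n = len(xs)
    rss = sum((y - sum(c * x ** i for i, c in enumerate(coef))) ** 2
              for x, y in zip(xs, ys))
    return n * math.log(rss / n) + len(coef) * math.log(n)

xs = list(range(10))
# A linear "law" y = 2 + 3x plus small deterministic noise:
ys = [2.0 + 3.0 * x + (0.05 if x % 2 == 0 else -0.05) for x in xs]
fits = {d: polyfit(xs, ys, d) for d in (0, 1, 2)}
best_degree = min(fits, key=lambda d: bic(xs, ys, fits[d]))
```

The constant model underfits and the quadratic model pays a complexity penalty without improving the fit, so the linear law is recovered.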

  19. Bayesian Safety Risk Modeling of Human-Flightdeck Automation Interaction

    Science.gov (United States)

    Ancel, Ersin; Shih, Ann T.

    2015-01-01

Usage of automatic systems in airliners has increased fuel efficiency, added extra capabilities, enhanced safety and reliability, and provided improved passenger comfort since their introduction in the late 1980s. However, original automation benefits, including reduced flight crew workload, human errors, or training requirements, were not achieved as originally expected. Instead, automation introduced new failure modes, redistributed and sometimes increased workload, brought in new cognitive and attention demands, and increased training requirements. Modern airliners have numerous flight modes, providing more flexibility (and inherently more complexity) to the flight crew. However, the price to pay for the increased flexibility is the need for increased mode awareness, as well as the need to supervise, understand, and predict automated system behavior. Also, over-reliance on automation is linked to manual flight skill degradation and complacency in commercial pilots. As a result, recent accidents involving human errors are often caused by the interactions between humans and the automated systems (e.g., the breakdown in man-machine coordination), deteriorated manual flying skills, and/or loss of situational awareness due to heavy dependence on automated systems. This paper describes the development of the increased complexity and reliance on automation baseline model, named FLAP for FLightdeck Automation Problems. The model development process starts with a comprehensive literature review followed by the construction of a framework comprised of high-level causal factors leading to an automation-related flight anomaly. The framework was then converted into a Bayesian Belief Network (BBN) using the Hugin Software v7.8. The effects of automation on flight crew are incorporated into the model, including flight skill degradation, increased cognitive demand, and training requirements, along with their interactions. Besides flight crew deficiencies, automation system
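A Bayesian Belief Network of this kind can be illustrated in miniature; the enumeration below uses two invented causal factors and made-up probabilities, not the FLAP model's SME-elicited structure or numbers:

```python
# Tiny Bayesian network: two independent causal factors feed an
# "automation-related anomaly" node. Inference by full enumeration.
from itertools import product

priors = {"complacency": 0.2, "skill_degradation": 0.3}

# P(anomaly | complacency, skill_degradation) -- illustrative values:
p_anomaly = {
    (True, True): 0.60, (True, False): 0.30,
    (False, True): 0.25, (False, False): 0.05,
}

def p_state(c, s, a):
    pc = priors["complacency"] if c else 1 - priors["complacency"]
    ps = priors["skill_degradation"] if s else 1 - priors["skill_degradation"]
    pa = p_anomaly[(c, s)] if a else 1 - p_anomaly[(c, s)]
    return pc * ps * pa

# Marginal probability of an anomaly:
p_a = sum(p_state(c, s, True) for c, s in product([True, False], repeat=2))

# Diagnostic query via Bayes' rule: P(complacency | anomaly)
p_c_given_a = sum(p_state(True, s, True) for s in [True, False]) / p_a
```

Observing an anomaly raises the posterior probability of complacency above its prior, which is the kind of causal/diagnostic reasoning BBN tools such as Hugin automate at scale.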

  20. Implementing The Automated Phases Of The Partially-Automated Digital Triage Process Model

    Directory of Open Access Journals (Sweden)

    Gary D Cantrell

    2012-12-01

Full Text Available Digital triage is a pre-digital-forensic phase that sometimes takes place as a way of gathering quick intelligence. Although effort has been undertaken to model the digital forensics process, little has been done to date to model digital triage. This work discusses the further development of a model that attempts to address digital triage: the Partially-automated Crime Specific Digital Triage Process model. The model itself is presented along with a description of how its automated functionality was implemented to facilitate model testing.

  1. Local martingale and pathwise solutions for an abstract fluids model

    Science.gov (United States)

    Debussche, Arnaud; Glatt-Holtz, Nathan; Temam, Roger

    2011-07-01

    We establish the existence and uniqueness of both local martingale and local pathwise solutions of an abstract nonlinear stochastic evolution system. The primary application of this abstract framework is to infer the local existence of strong, pathwise solutions to the 3D primitive equations of the oceans and atmosphere forced by a nonlinear multiplicative white noise. Instead of developing our results, specifically for the 3D primitive equations we choose to develop them in a slightly abstract framework which covers many related forms of these equations (atmosphere, oceans, coupled atmosphere-ocean, on the sphere, on the β-plane approximation etc. and the incompressible Navier-Stokes equations). In applications, all the details are given for the β-plane approximation of the equations of the ocean.

  2. Selected translated abstracts of Russian-language climate-change publications. 4: General circulation models

    Energy Technology Data Exchange (ETDEWEB)

    Burtis, M.D. [comp.] [Oak Ridge National Lab., TN (United States). Carbon Dioxide Information Analysis Center; Razuvaev, V.N.; Sivachok, S.G. [All-Russian Research Inst. of Hydrometeorological Information--World Data Center, Obninsk (Russian Federation)

    1996-10-01

This report presents English-translated abstracts of important Russian-language literature concerning general circulation models as they relate to climate change. In addition to the bibliographic citations and abstracts translated into English, this report presents the original citations and abstracts in Russian. Author and title indexes are included to assist the reader in locating abstracts of particular interest.

  3. Resource Allocation Model for Modelling Abstract RTOS on Multiprocessor System-on-Chip

    DEFF Research Database (Denmark)

    Virk, Kashif Munir; Madsen, Jan

    2003-01-01

Resource allocation is an important problem in RTOSs and has been an active area of research. Numerous approaches have been developed, and many different techniques have been combined for a wide range of applications. In this paper, we address the problem of resource allocation in the context...... of modelling an abstract RTOS on multiprocessor SoC platforms. We discuss the implementation details of a simplified basic priority inheritance protocol for our abstract system model in SystemC....
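The basic priority inheritance protocol mentioned above can be sketched as follows (a Python modelling illustration with invented task names; the authors' abstract system model is written in SystemC):

```python
# Basic priority inheritance: when a high-priority task blocks on a resource
# held by a low-priority task, the holder temporarily inherits the blocked
# task's priority, bounding priority inversion.

class Task:
    def __init__(self, name, priority):
        self.name = name
        self.base_priority = priority
        self.priority = priority        # effective (possibly inherited) priority

class Resource:
    def __init__(self):
        self.holder = None
        self.waiters = []

    def acquire(self, task):
        if self.holder is None:
            self.holder = task
            return True                 # acquired immediately
        self.waiters.append(task)
        if task.priority > self.holder.priority:
            self.holder.priority = task.priority   # priority inheritance
        return False                    # task blocks

    def release(self):
        old = self.holder
        old.priority = old.base_priority           # drop inherited priority
        self.waiters.sort(key=lambda t: -t.priority)
        self.holder = self.waiters.pop(0) if self.waiters else None
        return old

low = Task("logger", priority=1)
high = Task("control_loop", priority=9)
mutex = Resource()
mutex.acquire(low)        # low-priority task holds the resource
mutex.acquire(high)       # high-priority task blocks ...
boosted = low.priority    # ... and the holder inherits priority 9
mutex.release()
restored = low.priority   # holder returns to its base priority
```

The full protocol also handles nested resources and transitive inheritance, which this sketch omits.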

  4. Model-based automated testing of critical PLC programs.

    CERN Document Server

    Fernández Adiego, B; Tournier, J-C; González Suárez, V M; Bliudze, S

    2014-01-01

    Testing of critical PLC (Programmable Logic Controller) programs remains a challenging task for control system engineers as it can rarely be automated. This paper proposes a model based approach which uses the BIP (Behavior, Interactions and Priorities) framework to perform automated testing of PLC programs developed with the UNICOS (UNified Industrial COntrol System) framework. This paper defines the translation procedure and rules from UNICOS to BIP which can be fully automated in order to hide the complexity of the underlying model from the control engineers. The approach is illustrated and validated through the study of a water treatment process.

  5. Mathematical models in marketing a collection of abstracts

    CERN Document Server

    Funke, Ursula H

    1976-01-01

Mathematical models can be classified in a number of ways, e.g., static and dynamic; deterministic and stochastic; linear and nonlinear; individual and aggregate; descriptive, predictive, and normative; according to the mathematical technique applied or according to the problem area in which they are used. In marketing, the level of sophistication of the mathematical models varies considerably, so that a number of models will be meaningful to a marketing specialist without an extensive mathematical background. To make it easier for the nontechnical user we have chosen to classify the models included in this collection according to the major marketing problem areas in which they are applied. Since the emphasis lies on mathematical models, we shall not as a rule present statistical models, flow chart models, computer models, or the empirical testing aspects of these theories. We have also excluded competitive bidding, inventory and transportation models since these areas do not form the core of the market...

  6. Ecosystem models (a bibliography with abstracts). Report for 1964-Nov 1975

    International Nuclear Information System (INIS)

    Harrison, E.A.

    1975-11-01

The bibliography contains abstracts which cover marine biology, natural resources, wildlife, plants, water pollution, microorganisms, food chains, radioactive substances, limnology, and diseases as related to ecosystem models. Contains 214 abstracts.

  7. Automated workflows for modelling chemical fate, kinetics and toxicity.

    Science.gov (United States)

    Sala Benito, J V; Paini, Alicia; Richarz, Andrea-Nicole; Meinl, Thorsten; Berthold, Michael R; Cronin, Mark T D; Worth, Andrew P

    2017-12-01

Automation is universal in today's society, from operating equipment such as machinery, in factory processes, to self-parking automobile systems. While these examples show the efficiency and effectiveness of automated mechanical processes, automated procedures that support the chemical risk assessment process are still in their infancy. Future human safety assessments will rely increasingly on the use of automated models, such as physiologically based kinetic (PBK) and dynamic models and the virtual cell based assay (VCBA). These biologically-based models will be coupled with chemistry-based prediction models that also automate the generation of key input parameters such as physicochemical properties. The development of automated software tools is an important step in harmonising and expediting the chemical safety assessment process. In this study, we illustrate how the KNIME Analytics Platform can be used to provide a user-friendly graphical interface for these biokinetic models, such as PBK models and VCBA, which simulate the fate of chemicals in vivo within the body and in vitro test systems respectively. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
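A minimal kinetic model of the kind such workflows automate is a one-compartment model with first-order absorption and elimination, integrated here with explicit Euler (illustrative parameter names and values, not a validated PBK model):

```python
# One-compartment kinetics: dose starts in the gut, is absorbed into plasma
# at first-order rate ka, and eliminated at first-order rate ke.

def simulate(dose, ka, ke, dt=0.01, t_end=24.0):
    """Explicit-Euler integration; returns (final plasma amount, peak amount)."""
    gut, plasma, peak = dose, 0.0, 0.0
    for _ in range(int(t_end / dt)):
        absorbed = ka * gut * dt
        eliminated = ke * plasma * dt
        gut -= absorbed
        plasma += absorbed - eliminated
        peak = max(peak, plasma)
    return plasma, peak

# Hypothetical dose of 100 units, ka = 1/h, ke = 0.1/h, simulated for 24 h:
final, peak = simulate(dose=100.0, ka=1.0, ke=0.1)
```

Automated workflows like the one described wrap such simulations in graphical nodes, so that chemistry-based property predictors can supply the rate parameters without manual coding.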

  8. Automation Marketplace 2010: New Models, Core Systems

    Science.gov (United States)

    Breeding, Marshall

    2010-01-01

    In a year when a difficult economy presented fewer opportunities for immediate gains, the major industry players have defined their business strategies with fundamentally different concepts of library automation. This is no longer an industry where companies compete on the basis of the best or the most features in similar products but one where…

  9. Modelling of evapotranspiration at field and landscape scales. Abstract

    DEFF Research Database (Denmark)

    Overgaard, Jesper; Butts, M.B.; Rosbjerg, Dan

    2002-01-01

    observations from a nearby weather station. Detailed land-use and soil maps were used to set up the model. Leaf area index was derived from NDVI (Normalized Difference Vegetation Index) images. To validate the model at field scale the simulated evapotranspiration rates were compared to eddy...

  10. Modelling and simulation of superalloys. Book of abstracts

    Energy Technology Data Exchange (ETDEWEB)

    Rogal, Jutta; Hammerschmidt, Thomas; Drautz, Ralf (eds.)

    2014-07-01

    Superalloys are multi-component materials with complex microstructures that offer unique properties for high-temperature applications. The complexity of the superalloy materials makes it particularly challenging to obtain fundamental insight into their behaviour from the atomic structure to turbine blades. Recent advances in modelling and simulation of superalloys contribute to a better understanding and prediction of materials properties and therefore offer guidance for the development of new alloys. This workshop will give an overview of recent progress in modelling and simulation of materials for superalloys, with a focus on single crystal Ni-base and Co-base alloys. Topics will include electronic structure methods, atomistic simulations, microstructure modelling and modelling of microstructural evolution, solidification and process simulation as well as the modelling of phase stability and thermodynamics.

  11. Abstraction and Model Checking in the PEPA Plug-in for Eclipse

    DEFF Research Database (Denmark)

    Smith, Michael James Andrew

    2010-01-01

    lead to very large Markov chains. One way of analysing such models is to use abstraction - constructing a smaller model that bounds the properties of the original. We present an extension to the PEPA plug-in for Eclipse that enables abstracting and model checking of PEPA models. This implements two new...
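The abstraction idea described here, constructing a smaller model that bounds the properties of the original, can be sketched for a discrete-time Markov chain by lumping states into blocks and recording interval bounds on the transition probabilities. The 4-state chain and partition below are invented for illustration and are not the PEPA plug-in's algorithm.

```python
import numpy as np

# Sketch of abstracting a Markov chain by lumping states: for each pair of
# abstract states, the concrete one-step probabilities into the target block
# are bounded by an interval [lower, upper]. Chain and partition are made up.

P = np.array([
    [0.5, 0.3, 0.2, 0.0],
    [0.4, 0.3, 0.2, 0.1],
    [0.0, 0.1, 0.4, 0.5],
    [0.1, 0.0, 0.2, 0.7],
])
partition = [[0, 1], [2, 3]]  # abstract states A = {0,1}, B = {2,3}

def abstract_bounds(P, partition):
    n = len(partition)
    lower = np.zeros((n, n))
    upper = np.zeros((n, n))
    for i, block_i in enumerate(partition):
        for j, block_j in enumerate(partition):
            # total probability mass from each concrete state into block_j
            mass = [P[s, block_j].sum() for s in block_i]
            lower[i, j] = min(mass)
            upper[i, j] = max(mass)
    return lower, upper

lo, up = abstract_bounds(P, partition)
```

Where `lo` and `up` coincide the chain is exactly lumpable; where they differ, the interval bounds the behaviour of the original model, which is the essence of the bounding abstraction.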

  12. Cooperative Learning Models Simulation: From Abstract to Concrete

    Directory of Open Access Journals (Sweden)

    Agustini Ketut

    2018-01-01

    This study aimed to develop a simulation of cooperative learning models for students who are prospective teachers, to improve the quality of learning and in particular their preparedness for microteaching classroom practice. The outcomes can also be used more widely by teachers and lecturers to improve their professionalism as educators. The method used is research and development (R&D), following the Dick & Carey development model. The main steps were to (a) conduct an in-depth theoretical study of the simulation software to be generated, based on the cooperative learning models to be developed; (b) formulate the design of the simulation software system from the results of that study; and (c) conduct a formative evaluation, comprising reviews by content, design, and media experts of the validity of the simulation media, one-to-one student evaluation, small-group evaluation, and a field-trial evaluation. The results showed that the software could simulate three cooperative learning models well. Student responses to the simulation were 60% very positive and 40% positive. The implication is that student teachers can apply cooperative learning models well when teaching in a training school, provided they are first given realistic simulated examples of how cooperative learning is implemented in class.

  13. Standard State Space Models of Unawareness (Extended Abstract)

    Directory of Open Access Journals (Sweden)

    Peter Fritz

    2016-06-01

    The impossibility theorem of Dekel, Lipman and Rustichini has been thought to demonstrate that standard state-space models cannot be used to represent unawareness. We first show that Dekel, Lipman and Rustichini do not establish this claim. We then distinguish three notions of awareness, and argue that although one of them may not be adequately modeled using standard state spaces, there is no reason to think that standard state spaces cannot provide models of the other two notions. In fact, standard space models of these forms of awareness are attractively simple. They allow us to prove completeness and decidability results with ease, to carry over standard techniques from decision theory, and to add propositional quantifiers straightforwardly.

  14. Tent map as an abstract model of open system evolution

    Science.gov (United States)

    Usychenko, V. G.

    2011-06-01

    The irreversible equations of thermodynamic transport phenomena are transformed into a tent mapping. It is shown that a tent map may serve as a model of an open system capable of self-organization and evolution.
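The tent map itself is a one-line recurrence; a minimal sketch with an illustrative slope and initial condition:

```python
# The tent map x_{n+1} = r * min(x_n, 1 - x_n). For r near 2 the map is
# chaotic on [0, 1]. Slope and initial condition below are illustrative.

def tent(x, r=2.0):
    """One application of the tent map."""
    return r * min(x, 1.0 - x)

def orbit(x0, steps, r=2.0):
    """Iterate the tent map from x0 and return the whole trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(tent(xs[-1], r))
    return xs

trajectory = orbit(0.37, 60, r=1.99)
```

(Using r slightly below 2 avoids the floating-point collapse to 0 that the exact r = 2 map exhibits under binary arithmetic.)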

  15. Enhancing Formal Modelling Tool Support with Increased Automation

    DEFF Research Database (Denmark)

    Lausdahl, Kenneth

    Progress report for the qualification exam of PhD student Kenneth Lausdahl. Initial work on enhancing tool support for the formal method VDM, and the concept of unifying an abstract syntax tree with the ability for isolated extensions, is described. The tool support includes a connection... to UML and a test automation principle based on traces written as a kind of regular expressions...

  16. Particle Tracking Model and Abstraction of Transport Processes

    International Nuclear Information System (INIS)

    Robinson, B.

    2000-01-01

    The purpose of the transport methodology and component analysis is to provide the numerical methods for simulating radionuclide transport and model setup for transport in the unsaturated zone (UZ) site-scale model. The particle-tracking method of simulating radionuclide transport is incorporated into the FEHM computer code and the resulting changes in the FEHM code are to be submitted to the software configuration management system. This Analysis and Model Report (AMR) outlines the assumptions, design, and testing of a model for calculating radionuclide transport in the unsaturated zone at Yucca Mountain. In addition, methods for determining colloid-facilitated transport parameters are outlined for use in the Total System Performance Assessment (TSPA) analyses. Concurrently, process-level flow model calculations are being carried out in a PMR for the unsaturated zone. The computer code TOUGH2 is being used to generate three-dimensional, dual-permeability flow fields that are supplied to the Performance Assessment group for subsequent transport simulations. These flow fields are converted to input files compatible with the FEHM code, which for this application simulates radionuclide transport using the particle-tracking algorithm outlined in this AMR. Therefore, this AMR establishes the numerical method and demonstrates the use of the model, but the specific breakthrough curves presented do not necessarily represent the behavior of the Yucca Mountain unsaturated zone
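The particle-tracking idea can be sketched in one dimension as an advective step plus a random dispersive step per particle; all parameters below are invented for illustration and are unrelated to the FEHM implementation.

```python
import random

# Illustrative 1-D random-walk particle tracker: each particle takes an
# advective step v*dt plus a Gaussian dispersive step with variance 2*D*dt.
# Parameters are made up for demonstration, not taken from the AMR.

def track_particles(n_particles, n_steps, velocity, dispersion, dt, seed=0):
    rng = random.Random(seed)
    positions = [0.0] * n_particles
    sigma = (2.0 * dispersion * dt) ** 0.5
    for _ in range(n_steps):
        for i in range(n_particles):
            positions[i] += velocity * dt + rng.gauss(0.0, sigma)
    return positions

pos = track_particles(n_particles=500, n_steps=100,
                      velocity=1.0, dispersion=0.1, dt=0.1)
mean_pos = sum(pos) / len(pos)  # should cluster around v * T = 10
```

Breakthrough curves are then obtained by recording the times at which particles cross a downstream plane, rather than by solving the advection-dispersion equation on a grid.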

  17. Model-Based approaches to Human-Automation Systems Design

    DEFF Research Database (Denmark)

    Jamieson, Greg A.; Andersson, Jonas; Bisantz, Ann

    2012-01-01

    Human-automation interaction in complex systems is common, yet design for this interaction is often conducted without explicit consideration of the role of the human operator. Fortunately, there are a number of modeling frameworks proposed for supporting this design activity. However, the frameworks are often adapted from other purposes, usually applied to a limited range of problems, sometimes not fully described in the open literature, and rarely critically reviewed in a manner acceptable to proponents and critics alike. The present paper introduces a panel session wherein these proponents (and reportedly one or two critics) can engage one another on several agreed questions about such frameworks. The goal is to aid non-aligned practitioners in choosing between alternative frameworks for their human-automation interaction design challenges.

  18. Modeling Hydraulic Components for Automated FMEA of a Braking System

    Science.gov (United States)

    2014-12-23

    Peter Struss, Alessandro Fraracci (Tech. Univ. of Munich, 85748 Garching). ...the hydraulic part of a vehicle braking system. We describe the FMEA task and the application problem and outline the foundations for automating the... ...causes the diminishing of the wheel brake pressure if the brake pedal is released. When operated under the anti-lock braking system (ABS), the valves...

  19. Dynamics Model Abstraction Scheme Using Radial Basis Functions

    Directory of Open Access Journals (Sweden)

    Silvia Tolu

    2012-01-01

    This paper presents a control model for object manipulation. Properties of objects and environmental conditions influence the motor control and learning. System dynamics depend on an unobserved external context, for example, work load of a robot manipulator. The dynamics of a robot arm change as it manipulates objects with different physical properties, for example, the mass, shape, or mass distribution. We address active sensing strategies to acquire object dynamical models with a radial basis function neural network (RBF). Experiments are done using a real robot’s arm, and trajectory data are gathered during various trials manipulating different objects. Biped robots do not have high force joint servos and the control system hardly compensates all the inertia variation of the adjacent joints and disturbance torque on dynamic gait control. In order to achieve smoother control and lead to more reliable sensorimotor complexes, we evaluate and compare a sparse velocity-driven versus a dense position-driven control scheme.
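A minimal sketch of RBF-based function approximation of the kind described: Gaussian basis functions fitted by least squares to samples of an unknown mapping. The target function, centers, and widths are illustrative, not the paper's robot-arm setup.

```python
import numpy as np

# Minimal radial basis function (RBF) regression sketch: fit the weights of
# Gaussian basis functions to samples of an unknown "dynamics" function.
# Target function and all parameters are invented for illustration.

def rbf_features(x, centers, width):
    """x: (n,) inputs; returns (n, k) Gaussian activations."""
    return np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2 * width ** 2))

rng = np.random.default_rng(0)
x_train = rng.uniform(-1, 1, 200)
y_train = np.sin(3 * x_train)          # stand-in for sampled arm dynamics

centers = np.linspace(-1, 1, 15)
Phi = rbf_features(x_train, centers, width=0.2)
weights, *_ = np.linalg.lstsq(Phi, y_train, rcond=None)

x_test = np.linspace(-0.9, 0.9, 50)
y_pred = rbf_features(x_test, centers, 0.2) @ weights
error = np.max(np.abs(y_pred - np.sin(3 * x_test)))
```

In the context-dependent setting of the paper, a separate weight vector (or an extra context input) would be learned per manipulated object.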

  20. Abstract interpretation over non-deterministic finite tree automata for set-based analysis of logic programs

    DEFF Research Database (Denmark)

    Gallagher, John Patrick; Puebla, G.

    2002-01-01

    Set-based program analysis has many potential applications, including compiler optimisations, type-checking, debugging, verification and planning. One method of set-based analysis is to solve a set of set constraints derived directly from the program text. Another approach is based... constraint analysis of a particular program $P$ could be understood as an abstract interpretation over a finite domain of regular tree grammars, constructed from $P$. In this paper we define such an abstract interpretation for logic programs, formulated over a domain of non-deterministic finite tree automata...

  1. Automated Model Fit Method for Diesel Engine Control Development

    NARCIS (Netherlands)

    Seykens, X.; Willems, F.P.T.; Kuijpers, B.; Rietjens, C.

    2014-01-01

    This paper presents an automated fit for a control-oriented physics-based diesel engine combustion model. This method is based on the combination of a dedicated measurement procedure and structured approach to fit the required combustion model parameters. Only a data set is required that is

  2. The Use of AMET & Automated Scripts for Model Evaluation

    Science.gov (United States)

    Brief overview of EPA’s new CMAQ website to be launched publically in June, 2017. Details on the upcoming release of the Atmospheric Model Evaluation Tool (AMET) and the creation of automated scripts for post-processing and evaluating air quality model data.

  3. MODELING OF AUTOMATION PROCESSES CONCERNING CROP CULTIVATION BY AVIATION

    Directory of Open Access Journals (Sweden)

    V. I. Ryabkov

    2010-01-01

    The paper considers the modeling of automation processes concerning crop cultivation by aviation. Processes that take place in three interconnected environments (human, technical, and movable air objects) are described by a model based on set theory. A stochastic network theory of mass-service systems is proposed for describing the real-time human-machine system.

  4. Parmodel: a web server for automated comparative modeling of proteins.

    Science.gov (United States)

    Uchôa, Hugo Brandão; Jorge, Guilherme Eberhart; Freitas Da Silveira, Nelson José; Camera, João Carlos; Canduri, Fernanda; De Azevedo, Walter Filgueira

    2004-12-24

    Parmodel is a web server for automated comparative modeling and evaluation of protein structures. The aim of this tool is to help inexperienced users to perform modeling, assessment, visualization, and optimization of protein models, as well as crystallographers to evaluate structures solved experimentally. It is subdivided into four modules: Parmodel Modeling, Parmodel Assessment, Parmodel Visualization, and Parmodel Optimization. The main module is Parmodel Modeling, which allows the building of several models for the same protein in a reduced time through the distribution of modeling processes on a Beowulf cluster. Parmodel automates and integrates the main software used in comparative modeling, such as MODELLER, Whatcheck, Procheck, Raster3D, Molscript, and Gromacs. This web server is freely accessible at .

  5. A catalog of automated analysis methods for enterprise models.

    Science.gov (United States)

    Florez, Hector; Sánchez, Mario; Villalobos, Jorge

    2016-01-01

    Enterprise models are created for documenting and communicating the structure and state of Business and Information Technologies elements of an enterprise. After models are completed, they are mainly used to support analysis. Model analysis is an activity typically based on human skills and due to the size and complexity of the models, this process can be complicated and omissions or miscalculations are very likely. This situation has fostered the research of automated analysis methods, for supporting analysts in enterprise analysis processes. By reviewing the literature, we found several analysis methods; nevertheless, they are based on specific situations and different metamodels; then, some analysis methods might not be applicable to all enterprise models. This paper presents the work of compilation (literature review), classification, structuring, and characterization of automated analysis methods for enterprise models, expressing them in a standardized modeling language. In addition, we have implemented the analysis methods in our modeling tool.

  6. Automated particulate sampler field test model operations guide

    Energy Technology Data Exchange (ETDEWEB)

    Bowyer, S.M.; Miley, H.S.

    1996-10-01

    The Automated Particulate Sampler Field Test Model Operations Guide is a collection of documents which provides a complete picture of the Automated Particulate Sampler (APS) and the Field Test in which it was evaluated. The Pacific Northwest National Laboratory (PNNL) Automated Particulate Sampler was developed for the purpose of radionuclide particulate monitoring for use under the Comprehensive Test Ban Treaty (CTBT). Its design was directed by anticipated requirements of small size, low power consumption, low noise level, fully automatic operation, and most predominantly the sensitivity requirements of the Conference on Disarmament Working Paper 224 (CDWP224). This guide is intended to serve as both a reference document for the APS and to provide detailed instructions on how to operate the sampler. This document provides a complete description of the APS Field Test Model and all the activity related to its evaluation and progression.

  7. Automated data acquisition technology development: Automated modeling and control development

    Science.gov (United States)

    Romine, Peter L.

    1995-01-01

    This report documents the completion of, and improvements made to, the software developed for automated data acquisition and automated modeling and control development on the Texas Micro rackmounted PCs. This research was initiated because a need was identified by the Metal Processing Branch of NASA Marshall Space Flight Center for a mobile data acquisition and data analysis system, customized for welding measurement and calibration. Several hardware configurations were evaluated and a PC based system was chosen. The Welding Measurement System (WMS) is a dedicated instrument strictly for data acquisition and data analysis. In addition to the data acquisition functions described in this thesis, WMS also supports many functions associated with process control. The hardware and software requirements for an automated acquisition system for welding process parameters, welding equipment checkout, and welding process modeling were determined in 1992. From these recommendations, NASA purchased the necessary hardware and software. The new welding acquisition system is designed to collect welding parameter data and perform analysis to determine the voltage versus current arc-length relationship for VPPA welding. Once the results of this analysis are obtained, they can then be used to develop a RAIL function to control welding startup and shutdown without torch crashing.

  8. Automation of program model developing for complex structure control objects

    International Nuclear Information System (INIS)

    Ivanov, A.P.; Sizova, T.B.; Mikhejkina, N.D.; Sankovskij, G.A.; Tyufyagin, A.N.

    1991-01-01

    A brief description of software for automated developing the models of integrating modular programming system, program module generator and program module library providing thermal-hydraulic calcualtion of process dynamics in power unit equipment components and on-line control system operation simulation is given. Technical recommendations for model development are based on experience in creation of concrete models of NPP power units. 8 refs., 1 tab., 4 figs

  9. Automated protein structure modeling with SWISS-MODEL Workspace and the Protein Model Portal.

    Science.gov (United States)

    Bordoli, Lorenza; Schwede, Torsten

    2012-01-01

    Comparative protein structure modeling is a computational approach to build three-dimensional structural models for proteins using experimental structures of related protein family members as templates. Regular blind assessments of modeling accuracy have demonstrated that comparative protein structure modeling is currently the most reliable technique to model protein structures. Homology models are often sufficiently accurate to substitute for experimental structures in a wide variety of applications. Since the usefulness of a model for specific application is determined by its accuracy, model quality estimation is an essential component of protein structure prediction. Comparative protein modeling has become a routine approach in many areas of life science research since fully automated modeling systems allow also nonexperts to build reliable models. In this chapter, we describe practical approaches for automated protein structure modeling with SWISS-MODEL Workspace and the Protein Model Portal.

  10. Towards the Automated Annotation of Process Models

    NARCIS (Netherlands)

    Leopold, H.; Meilicke, C.; Fellmann, M.; Pittke, F.; Stuckenschmidt, H.; Mendling, J.

    2016-01-01

    Many techniques for the advanced analysis of process models build on the annotation of process models with elements from predefined vocabularies such as taxonomies. However, the manual annotation of process models is cumbersome and sometimes even hardly manageable taking the size of taxonomies into

  11. Total human exposure and indoor air quality: An automated bibliography (BLIS) with summary abstracts. Volume 2. Final report, January 1987-December 1989

    International Nuclear Information System (INIS)

    Dellarco, M.; Ott, W.

    1990-10-01

    The Bibliographical Literature Information System (BLIS) is a computer database that provides a comprehensive review of available literature on total human exposure to environmental pollution. Brief abstracts (often condensed versions of the original abstract) are included; if the original document had no abstract, one was prepared. Unpublished draft reports are listed, as well as final reports of the U.S. Government and other countries, reports by governmental research contractors, journal articles, and other publications on exposure models, field data, and newly emerging research methodologies. Emphasis is placed on those field studies measuring all the concentrations to which people may be exposed, including indoors, outdoors, and in-transit

  12. Agent-based Modeling Automated: Data-driven Generation of Innovation Diffusion Models

    NARCIS (Netherlands)

    Jensen, T.; Chappin, E.J.L.

    2016-01-01

    Simulation modeling is useful to gain insights into driving mechanisms of diffusion of innovations. This study aims to introduce automation to make identification of such mechanisms with agent-based simulation modeling less costly in time and labor. We present a novel automation procedure in which

  13. Automation of electroweak NLO corrections in general models

    Energy Technology Data Exchange (ETDEWEB)

    Lang, Jean-Nicolas [Universitaet Wuerzburg (Germany)

    2016-07-01

    I discuss the automation of generation of scattering amplitudes in general quantum field theories at next-to-leading order in perturbation theory. The work is based on Recola, a highly efficient one-loop amplitude generator for the Standard Model, which I have extended so that it can deal with general quantum field theories. Internally, Recola computes off-shell currents and for new models new rules for off-shell currents emerge which are derived from the Feynman rules. My work relies on the UFO format which can be obtained by a suited model builder, e.g. FeynRules. I have developed tools to derive the necessary counterterm structures and to perform the renormalization within Recola in an automated way. I describe the procedure using the example of the two-Higgs-doublet model.

  14. An automated in vitro model for the evaluation of ultrasound modalities measuring myocardial deformation

    Directory of Open Access Journals (Sweden)

    Stigö Albin

    2010-09-01

    Background: Echocardiography is the method of choice when one wishes to examine myocardial function. Qualitative assessment of the 2D grey-scale images obtained is subjective, and objective methods are required. Speckle Tracking Ultrasound is an emerging technology offering an objective means of quantifying left ventricular wall motion. However, before a new ultrasound technology can be adopted in the clinic, its accuracy and reproducibility need to be investigated. Aim: It was hypothesized that the collection of ultrasound sample data from an in vitro model could be automated. The aim was to optimize an in vitro model to allow for efficient collection of sample data. Material and Methods: A tissue-mimicking phantom was made from water, gelatin powder, psyllium fibers and a preservative. Sonomicrometry crystals were molded into the phantom. The solid phantom was mounted in a stable stand and cyclically compressed. Peak strain was then measured by Speckle Tracking Ultrasound and sonomicrometry. Results: We succeeded in automating the acquisition and analysis of sample data. Sample data were collected at a rate of 200 measurement pairs in 30 minutes. We found good agreement between Speckle Tracking Ultrasound and sonomicrometry in the in vitro model. Best agreement was 0.83 ± 0.70%; worst agreement was -1.13 ± 6.46%. Conclusions: It has been shown possible to automate a model that can be used for evaluating the in vitro accuracy and precision of ultrasound modalities measuring deformation. Sonomicrometry and Speckle Tracking Ultrasound had acceptable agreement.
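Agreement figures of the form quoted (e.g. 0.83 ± 0.70%) are the mean bias ± standard deviation of paired differences between the two methods, which can be computed as follows; the paired sample values below are synthetic, for illustration only.

```python
# Mean bias +/- SD of paired differences between two measurement methods,
# the statistic behind agreement figures like "0.83 +/- 0.70%".
# The paired strain values below are synthetic, for illustration only.

def agreement(method_a, method_b):
    diffs = [a - b for a, b in zip(method_a, method_b)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = (sum((d - bias) ** 2 for d in diffs) / (n - 1)) ** 0.5  # sample SD
    return bias, sd

speckle = [10.1, 12.3, 9.8, 11.5, 10.9]   # hypothetical peak strain, %
sono    = [9.9, 11.8, 9.5, 11.0, 10.4]    # hypothetical reference values, %
bias, sd = agreement(speckle, sono)
```

With hundreds of automatically collected measurement pairs, as in the record, the same statistic (often plotted as a Bland-Altman diagram) characterizes the modality's accuracy and precision.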

  15. Modeling of Prepregs during Automated Draping Sequences

    DEFF Research Database (Denmark)

    Krogh, Christian; Glud, Jens Ammitzbøll; Jakobsen, Johnny

    pockets and other defects. The models must, among other things, account for the nonlinear anisotropic constitutive behavior, viscoelasticity, possible plasticity, and contact which includes friction between the ply-mold and ply-end effector interfaces. The problem is path dependent and thus the transient...

  16. Automated Qualitative Modeling of Dynamic Physical Systems

    Science.gov (United States)

    1993-01-01

    ...describe a part of a system by using a component name, such as "motor." MM accepts both geometric and component descriptions, and allows the two... not a scientific discovery program along the lines of, say, BACON [20], which could also be said to be constructing models of systems. Thus the first

  17. Toward designing for trust in database automation

    International Nuclear Information System (INIS)

    Duez, P. P.; Jamieson, G. A.

    2006-01-01

    Appropriate reliance on system automation is imperative for safe and productive work, especially in safety-critical systems. It is unsafe to rely on automation beyond its designed use; conversely, it can be both unproductive and unsafe to manually perform tasks that are better relegated to automated tools. Operator trust in automated tools mediates reliance, and trust appears to affect how operators use technology. As automated agents become more complex, the question of trust in automation is increasingly important. In order to achieve proper use of automation, we must engender an appropriate degree of trust that is sensitive to changes in operating functions and context. In this paper, we present research concerning trust in automation in the domain of automated tools for relational databases. Lee and See have provided models of trust in automation. One model developed by Lee and See identifies three key categories of information about the automation that lie along a continuum of attributional abstraction. Purpose-, process-, and performance-related information serve, both individually and through inferences between them, to describe automation in such a way as to engender properly-calibrated trust. Thus, one can look at information from different levels of attributional abstraction as a general requirements analysis for information key to appropriate trust in automation. The model of information necessary to engender appropriate trust in automation [1] is a general one. Although it describes categories of information, it does not provide insight on how to determine the specific information elements required for a given automated tool. We have applied the Abstraction Hierarchy (AH) to this problem in the domain of relational databases.
The AH serves as a formal description of the automation at several levels of abstraction, ranging from a very abstract purpose-oriented description to a more concrete description of the resources involved in the automated process

  18. Abstract behavior types : a foundation model for components and their composition

    NARCIS (Netherlands)

    F. Arbab (Farhad)

    2003-01-01

    textabstractThe notion of Abstract Data Type (ADT) has served as a foundation model for structured and object oriented programming for some thirty years. The current trend in software engineering toward component based systems requires a foundation model as well. The most basic inherent property of

  19. Model-Based approaches to Human-Automation Systems Design

    DEFF Research Database (Denmark)

    Jamieson, Greg A.; Andersson, Jonas; Bisantz, Ann

    2012-01-01

    Human-automation interaction in complex systems is common, yet design for this interaction is often conducted without explicit consideration of the role of the human operator. Fortunately, there are a number of modeling frameworks proposed for supporting this design activity. However, the frameworks are often adapted from other purposes, usually applied to a limited range of problems, sometimes not fully described in the open literature, and rarely critically reviewed in a manner acceptable to proponents and critics alike. The present paper introduces a panel session wherein these proponents (and reportedly one or two critics) can engage one another on several agreed questions about such frameworks. The goal is to aid non-aligned practitioners in choosing between alternative frameworks for their human-automation interaction design challenges.

  20. Assessing the impacts of water abstractions on river ecosystem services: an eco-hydraulic modelling approach

    International Nuclear Information System (INIS)

    Carolli, Mauro; Geneletti, Davide; Zolezzi, Guido

    2017-01-01

    The provision of important river ecosystem services (ES) is dependent on the flow regime. This requires methods to assess the impacts on ES caused by interventions on rivers that affect flow regime, such as water abstractions. This study proposes a method to i) quantify the provision of a set of river ES, ii) simulate the effects of water abstraction alternatives that differ in location and abstracted flow, and iii) assess the impact of water abstraction alternatives on the selected ES. The method is based on river modelling science, and integrates spatially distributed hydrological, hydraulic and habitat models at different spatial and temporal scales. The method is applied to the hydropeaked upper Noce River (Northern Italy), which is regulated by hydropower operations. We selected locally relevant river ES: habitat suitability for the adult marble trout, white-water rafting suitability, hydroelectricity production from run-of-river (RoR) plants. Our results quantify the seasonality of river ES response variables and their intrinsic non-linearity, which explains why the same abstracted flow can produce different effects on trout habitat and rafting suitability depending on the morphology of the abstracted reach. An economic valuation of the examined river ES suggests that incomes from RoR hydropower plants are of comparable magnitude to touristic revenue losses related to the decrease in rafting suitability.
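The non-linearity noted here, where the same abstracted flow produces different effects depending on reach morphology, can be sketched with an invented saturating suitability curve; the curve shape and all numbers are illustrative, not the paper's habitat model.

```python
# Sketch of the non-linear ES response: the same abstracted flow reduces
# habitat suitability by different amounts in two reaches whose (invented)
# suitability curves saturate at different rates.

def suitability(flow, half_sat):
    """Illustrative saturating curve: suitability in [0, 1),
    rising fastest at low flows (half_sat = flow at 0.5 suitability)."""
    return flow / (flow + half_sat)

def loss_from_abstraction(natural_flow, abstracted, half_sat):
    return (suitability(natural_flow, half_sat)
            - suitability(natural_flow - abstracted, half_sat))

# same 5 m^3/s abstraction, two reaches with different morphology
loss_reach_a = loss_from_abstraction(natural_flow=20.0, abstracted=5.0, half_sat=2.0)
loss_reach_b = loss_from_abstraction(natural_flow=20.0, abstracted=5.0, half_sat=10.0)
```

Because the curves are non-linear, `loss_reach_b` exceeds `loss_reach_a` even though the abstracted volume is identical, mirroring the morphology-dependent impacts reported in the abstract.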

  1. Modelling, abstraction, and computation in systems biology: A view from computer science.

    Science.gov (United States)

    Melham, Tom

    2013-04-01

    Systems biology is centrally engaged with computational modelling across multiple scales and at many levels of abstraction. Formal modelling, precise and formalised abstraction relationships, and computation also lie at the heart of computer science--and over the past decade a growing number of computer scientists have been bringing their discipline's core intellectual and computational tools to bear on biology in fascinating new ways. This paper explores some of the apparent points of contact between the two fields, in the context of a multi-disciplinary discussion on conceptual foundations of systems biology. Copyright © 2012 Elsevier Ltd. All rights reserved.

  2. Automated Decomposition of Model-based Learning Problems

    Science.gov (United States)

    Williams, Brian C.; Millar, Bill

    1996-01-01

    A new generation of sensor rich, massively distributed autonomous systems is being developed that has the potential for unprecedented performance, such as smart buildings, reconfigurable factories, adaptive traffic systems and remote earth ecosystem monitoring. To achieve high performance these massive systems will need to accurately model themselves and their environment from sensor information. Accomplishing this on a grand scale requires automating the art of large-scale modeling. This paper presents a formalization of decompositional model-based learning (DML), a method developed by observing a modeler's expertise at decomposing large scale model estimation tasks. The method exploits a striking analogy between learning and consistency-based diagnosis. Moriarty, an implementation of DML, has been applied to thermal modeling of a smart building, demonstrating a significant improvement in learning rate.

  3. Utilizing Gaze Behavior for Inferring Task Transitions Using Abstract Hidden Markov Models

    Directory of Open Access Journals (Sweden)

    Daniel Fernando Tello Gamarra

    2016-12-01

We demonstrate an improved method for utilizing observed gaze behavior and show that it is useful in inferring hand movement intent during goal directed tasks. The task dynamics and the relationship between hand and gaze behavior are learned using an Abstract Hidden Markov Model (AHMM). We show that the predicted hand movement transitions occur consistently earlier in AHMM models with gaze than those models that do not include gaze observations.
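The core inference step of such a model is HMM filtering: a belief over hidden task states is propagated with the transition model and updated with each gaze observation. The sketch below illustrates this mechanism only; the states, gaze regions, and all probabilities are invented for the example and are not taken from the paper.

```python
import numpy as np

# Hypothetical 3-state task model (e.g. reach, grasp, place) with gaze
# observations discretized into 3 regions; all matrices are illustrative.
A = np.array([[0.8, 0.2, 0.0],   # state transition probabilities
              [0.0, 0.8, 0.2],
              [0.1, 0.0, 0.9]])
B = np.array([[0.7, 0.2, 0.1],   # P(gaze region | task state)
              [0.1, 0.7, 0.2],
              [0.1, 0.2, 0.7]])
belief = np.array([1.0, 0.0, 0.0])  # start in state 0

def forward_step(belief, obs):
    """One step of HMM filtering: predict with A, update with B."""
    predicted = belief @ A
    updated = predicted * B[:, obs]
    return updated / updated.sum()

for obs in [0, 1, 1, 2, 2]:         # a gaze sequence leading the hand
    belief = forward_step(belief, obs)

print(belief.argmax())              # most likely current task state
```

Because gaze typically shifts to the next target before the hand moves, the filtered belief crosses over to the upcoming state early, which is the effect the abstract reports.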

  4. Technical Work Plan for: Near Field Environment: Engineered Barrier System: Radionuclide Transport Abstraction Model Report

    International Nuclear Information System (INIS)

    J.D. Schreiber

    2006-01-01

    This technical work plan (TWP) describes work activities to be performed by the Near-Field Environment Team. The objective of the work scope covered by this TWP is to generate Revision 03 of EBS Radionuclide Transport Abstraction, referred to herein as the radionuclide transport abstraction (RTA) report. The RTA report is being revised primarily to address condition reports (CRs), to address issues identified by the Independent Validation Review Team (IVRT), to address the potential impact of transport, aging, and disposal (TAD) canister design on transport models, and to ensure integration with other models that are closely associated with the RTA report and being developed or revised in other analysis/model reports in response to IVRT comments. The RTA report will be developed in accordance with the most current version of LP-SIII.10Q-BSC and will reflect current administrative procedures (LP-3.15Q-BSC, ''Managing Technical Product Inputs''; LP-SIII.2Q-BSC, ''Qualification of Unqualified Data''; etc.), and will develop related Document Input Reference System (DIRS) reports and data qualifications as applicable in accordance with prevailing procedures. The RTA report consists of three models: the engineered barrier system (EBS) flow model, the EBS transport model, and the EBS-unsaturated zone (UZ) interface model. The flux-splitting submodel in the EBS flow model will change, so the EBS flow model will be validated again. The EBS transport model and validation of the model will be substantially revised in Revision 03 of the RTA report, which is the main subject of this TWP. The EBS-UZ interface model may be changed in Revision 03 of the RTA report due to changes in the conceptualization of the UZ transport abstraction model (a particle tracker transport model based on the discrete fracture transfer function will be used instead of the dual-continuum transport model previously used). Validation of the EBS-UZ interface model will be revised to be consistent with

  5. A model based message passing approach for flexible and scalable home automation controllers

    Energy Technology Data Exchange (ETDEWEB)

    Bienhaus, D. [INNIAS GmbH und Co. KG, Frankenberg (Germany); David, K.; Klein, N.; Kroll, D. [ComTec Kassel Univ., SE Kassel Univ. (Germany); Heerdegen, F.; Jubeh, R.; Zuendorf, A. [Kassel Univ. (Germany). FG Software Engineering; Hofmann, J. [BSC Computer GmbH, Allendorf (Germany)

    2012-07-01

There is a large variety of home automation systems, most of them proprietary systems from different vendors. In addition, the configuration and administration of home automation systems is frequently a very complex task, especially if more complex functionality is to be achieved. Therefore, an open model for home automation was developed that is especially designed for easy integration of various home automation systems. This solution also provides a simple modeling approach that is inspired by typical home automation components like switches, timers, etc. In addition, a model based technology to achieve rich functionality and usability was implemented. (orig.)

  6. Abstraction of Models for Pitting and Crevice Corrosion of Drip Shield and Waste Package Outer Barrier

    Energy Technology Data Exchange (ETDEWEB)

    K. Mon

    2001-08-29

This analysis and model report (AMR) was conducted in response to written work direction (CRWMS M and O 1999a). ICN 01 of this AMR was developed following guidelines provided in TWP-MGR-MD-000004 REV 01, ''Technical Work Plan for: Integrated Management of Technical Product Input Department'' (BSC 2001, Addendum B). The purpose and scope of this AMR is to review and analyze upstream process-level models (CRWMS M and O 2000a and CRWMS M and O 2000b) and information relevant to pitting and crevice corrosion degradation of waste package outer barrier (Alloy 22) and drip shield (Titanium Grade 7) materials, and to develop abstractions of the important processes in a form that is suitable for input to the WAPDEG analysis for long-term degradation of the waste package outer barrier and drip shield in the repository. The abstraction is developed in a manner that ensures consistency with the process-level models and information and captures the essential behavior of the processes represented. Also considered in the model abstraction are the probable range of exposure conditions in emplacement drifts and local exposure conditions on drip shield and waste package surfaces. The approach, method, and assumptions that are employed in the model abstraction are documented and justified.

  7. Models and automation technologies for the curriculum development

    Directory of Open Access Journals (Sweden)

    V. N. Volkova

    2016-01-01

The aim of the research was to determine the sequence of curriculum development stages on the basis of system analysis, and to create models and information technologies for the implementation of these stages. The research draws on the methods and models of systems theory and system analysis, including methods and automated procedures for structuring organizational aims and for organizing complex expertise. An analysis of existing studies in the field of curriculum modelling that use formal mathematical language, including optimization models that distribute disciplines across years and semesters subject to the relevant restrictions, shows that the complexity and dimension of these tasks require the development of special software. Moreover, defining the input data and restrictions requires a large time investment that is difficult to provide under real conditions of curriculum development, so it is almost impossible to verify the objectivity of the input data and restrictions in such models. For a complete analysis of the curriculum development process, it is proposed to use a system definition based on the system-targeted approach. On the basis of this definition, a reasonable sequence of integrated stages for the development of the curriculum was justified: 1) definition (specification) of the requirements for the educational content; 2) determination of the number of subjects included in the curriculum; 3) definition of the sequence of the subjects; 4) distribution of subjects by semesters. The models and technologies for the implementation of these stages of curriculum development are given in the article: 1) models based on the information approach of A. Denisov and the modified degree of compliance with objectives based on Denisov's evaluation index (in the article, the idea of evaluating the degree of the impact of disciplines for realization

  8. Geochemistry Model Abstraction and Sensitivity Studies for the 21 PWR CSNF Waste Package

    Energy Technology Data Exchange (ETDEWEB)

    P. Bernot; S. LeStrange; E. Thomas; K. Zarrabi; S. Arthur

    2002-10-29

The CSNF geochemistry model abstraction, as directed by the TWP (BSC 2002b), was developed to provide regression analysis of EQ6 cases to obtain abstracted values of pH (and in some cases HCO₃⁻ concentration) for use in the Configuration Generator Model. The pH of the system is the controlling factor over U mineralization, the CSNF degradation rate, and the HCO₃⁻ concentration in solution. The abstraction encompasses a large variety of combinations of material degradation rates. The ''base case'' used EQ6 simulations examining differing steel/alloy corrosion rates, drip rates, and percent fuel exposure. Other values, such as the pH/HCO₃⁻-dependent fuel corrosion rate and the corrosion rate of A516, were kept constant. Relationships were developed for pH as a function of these differing rates to be used in the calculation of total C and, subsequently, the fuel rate. An additional refinement to the abstraction was the addition of abstracted pH values for cases with limited O₂ for waste package corrosion and a flushing fluid other than J-13, which had been used in all EQ6 calculations up to this point. These abstractions also used EQ6 simulations with varying combinations of material corrosion rates to abstract the pH (and HCO₃⁻ in the limited-O₂ cases) as a function of WP material corrosion rates. The goodness of fit for most of the abstracted values was above an R² of 0.9. Values below this threshold occurred at the very beginning of WP corrosion, when large variations in the system pH are observed. However, the significance of the F-statistic for all the abstractions showed that the variable relationships are significant. For the abstraction, an analysis of the minerals that may form the ''sludge'' in the waste package was also presented. This analysis indicates that a number of different iron and aluminum minerals may form in
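The regression-abstraction idea above — fit a response variable (pH) as a simple function of the varied rates, then judge the fit by R² — can be sketched as follows. The functional form, coefficients, and synthetic data here are assumptions for illustration only, not the report's actual regressions.

```python
import numpy as np

# Illustrative stand-in for a regression abstraction: fit pH as a
# linear function of log10(steel corrosion rate) and log10(drip rate),
# then check the goodness of fit (R^2). All data are synthetic.
rng = np.random.default_rng(0)
log_corr = rng.uniform(-3, 0, 50)        # log10 corrosion rate (assumed range)
log_drip = rng.uniform(-1, 2, 50)        # log10 drip rate (assumed range)
ph = 7.0 - 0.8 * log_corr + 0.3 * log_drip + rng.normal(0, 0.05, 50)

X = np.column_stack([np.ones(50), log_corr, log_drip])
coef, *_ = np.linalg.lstsq(X, ph, rcond=None)   # least-squares fit

resid = ph - X @ coef
r2 = 1 - resid.var() / ph.var()
print(f"R^2 = {r2:.3f}")   # a usable abstraction would exceed 0.9
```

An abstraction of this kind trades the full EQ6 chemistry for a cheap surrogate; the R² check (and an F-test on the coefficients, as in the report) guards against using the surrogate where the fit breaks down.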

  9. MATHEMATICAL MODEL FOR SOFTWARE USABILITY AUTOMATED EVALUATION AND ASSURANCE

    Directory of Open Access Journals (Sweden)

    І. Гученко

    2011-04-01

The subject of the research is software usability, and the aim is the construction of a mathematical model for evaluating and assuring a specified level of usability. The research uses the methodology of structural analysis, methods of multicriteria optimization and decision-making theory, the convolution method, and scientific methods of analysis and analogy. The result of the work is a model for automated software usability evaluation and assurance that allows not only estimating the current level of usability during every iteration of agile development but also managing the usability of the created software products. The results can be used for the construction of automated support systems for managing software usability.

  10. Automated Physico-Chemical Cell Model Development through Information Theory

    Energy Technology Data Exchange (ETDEWEB)

    Peter J. Ortoleva

    2005-11-29

The objective of this project was to develop predictive models of the chemical responses of microbial cells to variations in their surroundings. The application of these models is optimization of environmental remediation and energy-producing biotechnical processes. The principles on which our project is based are as follows: chemical thermodynamics and kinetics; automation of calibration through information theory; integration of multiplex data (e.g. cDNA microarrays, NMR, proteomics), cell modeling, and bifurcation theory to overcome cellular complexity; and the use of multiplex data and information theory to calibrate and run an incomplete model. In this report we review four papers summarizing key findings and a web-enabled, multiple-module workflow we have implemented that consists of a set of interoperable systems biology computational modules.

  11. Automated differentiation of computer models for sensitivity analysis

    International Nuclear Information System (INIS)

    Worley, B.A.

    1991-01-01

Sensitivity analysis of reactor physics computer models is an established discipline after more than twenty years of active development of generalized perturbation theory based on direct and adjoint methods. Many reactor physics models have been enhanced to solve for sensitivities of model results to model data. The calculated sensitivities are usually normalized first derivatives, although some codes are capable of solving for higher-order sensitivities. The purpose of this paper is to report on the development and application of the GRESS system for automating the implementation of the direct and adjoint techniques in existing FORTRAN computer codes. The GRESS system was developed at ORNL to eliminate the costly, manpower-intensive effort required to implement the direct and adjoint techniques in already-existing FORTRAN codes. GRESS has been successfully tested for a number of codes over a wide range of applications and presently operates on VAX machines under both VMS and UNIX operating systems. (author). 9 refs, 1 tab
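GRESS works by instrumenting FORTRAN source so that derivatives propagate alongside values. As a language-agnostic illustration of that underlying idea (not of GRESS itself), here is forward-mode automatic differentiation with dual numbers, where each quantity carries its derivative through the computation:

```python
class Dual:
    """A value paired with its derivative with respect to one input."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)

    __rmul__ = __mul__

def model(k):
    """Toy model result whose sensitivity to parameter k we want."""
    return 3.0 * k * k + 2.0 * k + 1.0

x = Dual(2.0, 1.0)      # seed derivative dk/dk = 1
y = model(x)
print(y.val, y.der)     # value 17.0, sensitivity dy/dk = 6k + 2 = 14.0
```

The sensitivity comes out without any hand-derived adjoint code, which is exactly the manual effort systems like GRESS were built to eliminate.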

  12. Automated Modeling of Microwave Structures by Enhanced Neural Networks

    Directory of Open Access Journals (Sweden)

    Z. Raida

    2006-12-01

The paper describes a methodology for the automated creation of neural models of microwave structures. During the creation process, artificial neural networks are trained using a combination of particle swarm optimization and the quasi-Newton method to avoid critical training problems of conventional neural nets. In the paper, neural networks are used to approximate the behavior of a planar microwave filter (moment method, Zeland IE3D). In order to evaluate the efficiency of neural modeling, global optimizations are performed using both numerical models and neural ones. The two approaches are compared in terms of CPU-time demands and accuracy. Based on the conclusions, methodological recommendations for including neural networks in microwave design are formulated.
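The hybrid training strategy — a global swarm search to find a good basin, followed by a second-order local polish — can be sketched on a toy objective. The objective, swarm parameters, and the use of an exact Newton step (standing in for the quasi-Newton update, whose Hessian would be approximated) are all illustrative assumptions, not the paper's setup.

```python
import numpy as np

# Stand-in for a neural model's training error: a simple quadratic
# with its minimum at (1, -2), minimum value 1.0.
def loss(w):
    return (w[0] - 1.0) ** 2 + 10.0 * (w[1] + 2.0) ** 2 + 1.0

rng = np.random.default_rng(1)
pos = rng.uniform(-5, 5, (30, 2))            # particle positions
vel = np.zeros_like(pos)
pbest, pbest_f = pos.copy(), np.array([loss(p) for p in pos])

for _ in range(50):                          # particle swarm main loop
    gbest = pbest[pbest_f.argmin()]
    vel = (0.7 * vel
           + 1.5 * rng.random((30, 1)) * (pbest - pos)
           + 1.5 * rng.random((30, 1)) * (gbest - pos))
    pos += vel
    f = np.array([loss(p) for p in pos])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]

# Local polish from the swarm's best point. For this quadratic the
# Hessian is known exactly; a quasi-Newton method would approximate it.
H = np.array([[2.0, 0.0], [0.0, 20.0]])
w = pbest[pbest_f.argmin()]
grad = np.array([2.0 * (w[0] - 1.0), 20.0 * (w[1] + 2.0)])
w = w - np.linalg.solve(H, grad)             # Newton step
print(w)                                     # close to (1, -2)
```

The division of labor is the point: the swarm avoids the bad local minima that plague plain gradient training of neural nets, and the (quasi-)Newton step supplies fast final convergence.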

  13. Identifying crop vulnerability to groundwater abstraction: modelling and expert knowledge in a GIS.

    Science.gov (United States)

    Procter, Chris; Comber, Lex; Betson, Mark; Buckley, Dennis; Frost, Andy; Lyons, Hester; Riding, Alison; Voyce, Kevin

    2006-11-01

    Water use is expected to increase and climate change scenarios indicate the need for more frequent water abstraction. Abstracting groundwater may have a detrimental effect on soil moisture availability for crop growth and yields. This work presents an elegant and robust method for identifying zones of crop vulnerability to abstraction. Archive groundwater level datasets were used to generate a composite groundwater surface that was subtracted from a digital terrain model. The result was the depth from surface to groundwater and identified areas underlain by shallow groundwater. Knowledge from an expert agronomist was used to define classes of risk in terms of their depth below ground level. Combining information on the permeability of geological drift types further refined the assessment of the risk of crop growth vulnerability. The nature of the mapped output is one that is easy to communicate to the intended farming audience because of the general familiarity of mapped information. Such Geographic Information System (GIS)-based products can play a significant role in the characterisation of catchments under the EU Water Framework Directive especially in the process of public liaison that is fundamental to the setting of priorities for management change. The creation of a baseline allows the impact of future increased water abstraction rates to be modelled and the vulnerability maps are in a format that can be readily understood by the various stakeholders. This methodology can readily be extended to encompass additional data layers and for a range of groundwater vulnerability issues including water resources, ecological impacts, nitrate and phosphorus.
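The mapping step described above reduces to raster arithmetic: subtract the composite groundwater surface from the terrain model, then bin the resulting depth into expert-defined risk classes. The tiny grids and class thresholds below are assumptions for illustration, not the study's values.

```python
import numpy as np

# Depth to groundwater = terrain elevation minus water-table elevation.
dtm = np.array([[105.0, 103.0],
                [102.0, 101.0]])          # ground elevation (m)
groundwater = np.array([[101.0, 101.5],
                        [100.5, 100.8]])  # composite water table (m)

depth = dtm - groundwater                 # depth below ground level (m)

# Illustrative classes: <1 m high risk (2), 1-3 m moderate (1), >3 m low (0)
risk = np.digitize(depth, bins=[1.0, 3.0])
risk = 2 - risk                           # invert so shallow water = high risk
print(risk)
```

In the study this computation runs over the whole catchment raster, and the class map is then refined with drift permeability before being communicated to farmers.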

  14. A Computational Approach for Automated Posturing of a Human Finite Element Model

    Science.gov (United States)

    2016-07-01

ARL-MR-0934, July 2016. US Army Research Laboratory Memorandum Report: A Computational Approach for Automated Posturing of a Human Finite Element Model, by Justin McKee (Bennett Aerospace, Inc., Cary, NC) and Adam Sokolow (Weapons and Materials Research).

  15. Technical Work Plan for: Near Field Environment: Engineered System: Radionuclide Transport Abstraction Model Report

    Energy Technology Data Exchange (ETDEWEB)

    J.D. Schreiber

    2006-12-08

    This technical work plan (TWP) describes work activities to be performed by the Near-Field Environment Team. The objective of the work scope covered by this TWP is to generate Revision 03 of EBS Radionuclide Transport Abstraction, referred to herein as the radionuclide transport abstraction (RTA) report. The RTA report is being revised primarily to address condition reports (CRs), to address issues identified by the Independent Validation Review Team (IVRT), to address the potential impact of transport, aging, and disposal (TAD) canister design on transport models, and to ensure integration with other models that are closely associated with the RTA report and being developed or revised in other analysis/model reports in response to IVRT comments. The RTA report will be developed in accordance with the most current version of LP-SIII.10Q-BSC and will reflect current administrative procedures (LP-3.15Q-BSC, ''Managing Technical Product Inputs''; LP-SIII.2Q-BSC, ''Qualification of Unqualified Data''; etc.), and will develop related Document Input Reference System (DIRS) reports and data qualifications as applicable in accordance with prevailing procedures. The RTA report consists of three models: the engineered barrier system (EBS) flow model, the EBS transport model, and the EBS-unsaturated zone (UZ) interface model. The flux-splitting submodel in the EBS flow model will change, so the EBS flow model will be validated again. The EBS transport model and validation of the model will be substantially revised in Revision 03 of the RTA report, which is the main subject of this TWP. The EBS-UZ interface model may be changed in Revision 03 of the RTA report due to changes in the conceptualization of the UZ transport abstraction model (a particle tracker transport model based on the discrete fracture transfer function will be used instead of the dual-continuum transport model previously used). Validation of the EBS-UZ interface model

  16. Automated sensitivity analysis: New tools for modeling complex dynamic systems

    International Nuclear Information System (INIS)

    Pin, F.G.

    1987-01-01

    Sensitivity analysis is an established methodology used by researchers in almost every field to gain essential insight in design and modeling studies and in performance assessments of complex systems. Conventional sensitivity analysis methodologies, however, have not enjoyed the widespread use they deserve considering the wealth of information they can provide, partly because of their prohibitive cost or the large initial analytical investment they require. Automated systems have recently been developed at ORNL to eliminate these drawbacks. Compilers such as GRESS and EXAP now allow automatic and cost effective calculation of sensitivities in FORTRAN computer codes. In this paper, these and other related tools are described and their impact and applicability in the general areas of modeling, performance assessment and decision making for radioactive waste isolation problems are discussed

  17. Flightdeck Automation Problems (FLAP) Model for Safety Technology Portfolio Assessment

    Science.gov (United States)

    Ancel, Ersin; Shih, Ann T.

    2014-01-01

    NASA's Aviation Safety Program (AvSP) develops and advances methodologies and technologies to improve air transportation safety. The Safety Analysis and Integration Team (SAIT) conducts a safety technology portfolio assessment (PA) to analyze the program content, to examine the benefits and risks of products with respect to program goals, and to support programmatic decision making. The PA process includes systematic identification of current and future safety risks as well as tracking several quantitative and qualitative metrics to ensure the program goals are addressing prominent safety risks accurately and effectively. One of the metrics within the PA process involves using quantitative aviation safety models to gauge the impact of the safety products. This paper demonstrates the role of aviation safety modeling by providing model outputs and evaluating a sample of portfolio elements using the Flightdeck Automation Problems (FLAP) model. The model enables not only ranking of the quantitative relative risk reduction impact of all portfolio elements, but also highlighting the areas with high potential impact via sensitivity and gap analyses in support of the program office. Although the model outputs are preliminary and products are notional, the process shown in this paper is essential to a comprehensive PA of NASA's safety products in the current program and future programs/projects.

  18. Effective World Modeling: Multisensor Data Fusion Methodology for Automated Driving.

    Science.gov (United States)

    Elfring, Jos; Appeldoorn, Rein; van den Dries, Sjoerd; Kwakkernaat, Maurice

    2016-10-11

    The number of perception sensors on automated vehicles increases due to the increasing number of advanced driver assistance system functions and their increasing complexity. Furthermore, fail-safe systems require redundancy, thereby increasing the number of sensors even further. A one-size-fits-all multisensor data fusion architecture is not realistic due to the enormous diversity in vehicles, sensors and applications. As an alternative, this work presents a methodology that can be used to effectively come up with an implementation to build a consistent model of a vehicle's surroundings. The methodology is accompanied by a software architecture. This combination minimizes the effort required to update the multisensor data fusion system whenever sensors or applications are added or replaced. A series of real-world experiments involving different sensors and algorithms demonstrates the methodology and the software architecture.

  19. Effective World Modeling: Multisensor Data Fusion Methodology for Automated Driving

    Directory of Open Access Journals (Sweden)

    Jos Elfring

    2016-10-01

The number of perception sensors on automated vehicles increases due to the increasing number of advanced driver assistance system functions and their increasing complexity. Furthermore, fail-safe systems require redundancy, thereby increasing the number of sensors even further. A one-size-fits-all multisensor data fusion architecture is not realistic due to the enormous diversity in vehicles, sensors and applications. As an alternative, this work presents a methodology that can be used to effectively come up with an implementation to build a consistent model of a vehicle’s surroundings. The methodology is accompanied by a software architecture. This combination minimizes the effort required to update the multisensor data fusion system whenever sensors or applications are added or replaced. A series of real-world experiments involving different sensors and algorithms demonstrates the methodology and the software architecture.

  20. Automated reconstruction of 3D models from real environments

    Science.gov (United States)

    Sequeira, V.; Ng, K.; Wolfart, E.; Gonçalves, J. G. M.; Hogg, D.

    This paper describes an integrated approach to the construction of textured 3D scene models of building interiors from laser range data and visual images. This approach has been implemented in a collection of algorithms and sensors within a prototype device for 3D reconstruction, known as the EST (Environmental Sensor for Telepresence). The EST can take the form of a push trolley or of an autonomous mobile platform. The Autonomous EST (AEST) has been designed to provide an integrated solution for automating the creation of complete models. Embedded software performs several functions, including triangulation of the range data, registration of video texture, registration and integration of data acquired from different capture points. Potential applications include facilities management for the construction industry and creating reality models to be used in general areas of virtual reality, for example, virtual studios, virtualised reality for content-related applications (e.g., CD-ROMs), social telepresence, architecture and others. The paper presents the main components of the EST/AEST, and presents some example results obtained from the prototypes. The reconstructed model is encoded in VRML format so that it is possible to access and view the model via the World Wide Web.

  1. Automated robust generation of compact 3D statistical shape models

    Science.gov (United States)

    Vrtovec, Tomaz; Likar, Bostjan; Tomazevic, Dejan; Pernus, Franjo

    2004-05-01

    Ascertaining the detailed shape and spatial arrangement of anatomical structures is important not only within diagnostic settings but also in the areas of planning, simulation, intraoperative navigation, and tracking of pathology. Robust, accurate and efficient automated segmentation of anatomical structures is difficult because of their complexity and inter-patient variability. Furthermore, the position of the patient during image acquisition, the imaging device and protocol, image resolution, and other factors induce additional variations in shape and appearance. Statistical shape models (SSMs) have proven quite successful in capturing structural variability. A possible approach to obtain a 3D SSM is to extract reference voxels by precisely segmenting the structure in one, reference image. The corresponding voxels in other images are determined by registering the reference image to each other image. The SSM obtained in this way describes statistically plausible shape variations over the given population as well as variations due to imperfect registration. In this paper, we present a completely automated method that significantly reduces shape variations induced by imperfect registration, thus allowing a more accurate description of variations. At each iteration, the derived SSM is used for coarse registration, which is further improved by describing finer variations of the structure. The method was tested on 64 lumbar spinal column CT scans, from which 23, 38, 45, 46 and 42 volumes of interest containing vertebra L1, L2, L3, L4 and L5, respectively, were extracted. Separate SSMs were generated for each vertebra. The results show that the method is capable of reducing the variations induced by registration errors.
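Once the per-image correspondences are established by registration, building the statistical shape model is a principal-component decomposition of the stacked landmark vectors. The sketch below shows that final step on synthetic 2-D landmarks; the data, landmark count, and mode coefficient are invented for illustration.

```python
import numpy as np

# Minimal statistical shape model: registered landmark shapes are
# stacked, the mean shape and principal modes of variation extracted,
# and new plausible shapes generated as mean + sum(b_i * mode_i).
rng = np.random.default_rng(2)
base = np.array([0.0, 0.0, 1.0, 0.0, 1.0, 1.0, 0.0, 1.0])  # 4 (x, y) landmarks
shapes = base + rng.normal(0, 0.05, (20, 8))   # 20 synthetic training shapes

mean = shapes.mean(axis=0)
centered = shapes - mean
# principal modes of variation via SVD of the centered data matrix
_, s, vt = np.linalg.svd(centered, full_matrices=False)

b = np.array([2.0])                  # coefficient for the first mode
new_shape = mean + b @ vt[:1]        # a plausible instance of the model
print(new_shape.shape)
```

The paper's contribution sits upstream of this step: by iterating registration against the evolving model, it removes registration-induced variance from `shapes`, so the extracted modes describe anatomy rather than alignment error.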

  2. Individual Differences in Response to Automation: The Five Factor Model of Personality

    Science.gov (United States)

    Szalma, James L.; Taylor, Grant S.

    2011-01-01

    This study examined the relationship of operator personality (Five Factor Model) and characteristics of the task and of adaptive automation (reliability and adaptiveness--whether the automation was well-matched to changes in task demand) to operator performance, workload, stress, and coping. This represents the first investigation of how the Five…

  3. Integration of drinking water treatment plant process models and emulated process automation software

    NARCIS (Netherlands)

    Worm, G.I.M.

    2012-01-01

    The objective of this research is to limit the risks of fully automated operation of drinking water treatment plants and to improve their operation by using an integrated system of process models and emulated process automation software. This thesis contains the design of such an integrated system.

  4. Implementing Lumberjacks and Black Swans Into Model-Based Tools to Support Human-Automation Interaction.

    Science.gov (United States)

    Sebok, Angelia; Wickens, Christopher D

    2017-03-01

    The objectives were to (a) implement theoretical perspectives regarding human-automation interaction (HAI) into model-based tools to assist designers in developing systems that support effective performance and (b) conduct validations to assess the ability of the models to predict operator performance. Two key concepts in HAI, the lumberjack analogy and black swan events, have been studied extensively. The lumberjack analogy describes the effects of imperfect automation on operator performance. In routine operations, an increased degree of automation supports performance, but in failure conditions, increased automation results in more significantly impaired performance. Black swans are the rare and unexpected failures of imperfect automation. The lumberjack analogy and black swan concepts have been implemented into three model-based tools that predict operator performance in different systems. These tools include a flight management system, a remotely controlled robotic arm, and an environmental process control system. Each modeling effort included a corresponding validation. In one validation, the software tool was used to compare three flight management system designs, which were ranked in the same order as predicted by subject matter experts. The second validation compared model-predicted operator complacency with empirical performance in the same conditions. The third validation compared model-predicted and empirically determined time to detect and repair faults in four automation conditions. The three model-based tools offer useful ways to predict operator performance in complex systems. The three tools offer ways to predict the effects of different automation designs on operator performance.

  5. Abstract Modelling of the Impact of Activities of Economic Entities on the Social System

    Directory of Open Access Journals (Sweden)

    Dana Bernardová

    2017-01-01

Economic entities, as integral parts of the social system, have an impact on it. Complex structures and uncertainty of behaviour, both conditioned by the human factor, are typical characteristics of economic entities and the social system, as is the lack of precise measurement data and precise information. Methods of creating computer models of such systems must therefore be based on uncertain, incomplete or approximate data and hypothetical assumptions. The paper deals with the synthesis of an abstract model of an expert system for determining the level of corporate social responsibility (CSR) of an enterprise with the use of methods of artificial intelligence. The linguistic rule model is built on the basis of expert determination of the level of CSR from the level of care for employees, the level of supplier-customer relations, the level of ecological behaviour, and compliance with legal obligations. The linguistic modelling method is based on the theoretical approach of fuzzy set mathematics and fuzzy logic. The aim of the paper is the presentation of the system for determining the level of CSR with the use of non-conventional, non-numerical methods, together with a simulated demonstration of its functions. The above-mentioned expert system is a module of the built hierarchical structure aimed at the research of impacts of activities of economic entities on the social system.
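A linguistic rule model of this kind evaluates fuzzy membership grades for each input and combines them with fuzzy AND/OR operators. The sketch below shows the mechanism with two toy inputs and two rules; the membership functions, rule set, and thresholds are invented for illustration and are not the paper's model.

```python
# Triangular-edge membership functions on a [0, 1] input scale (assumed).
def low(x):  return max(0.0, min(1.0, (0.5 - x) / 0.5))
def high(x): return max(0.0, min(1.0, (x - 0.5) / 0.5))

def csr_level(care, eco):
    """Two toy rules: high employee care AND high eco behaviour -> 'high'
    CSR; a low grade on either input -> 'low'. Fuzzy AND = min, OR = max."""
    fire_high = min(high(care), high(eco))
    fire_low = max(low(care), low(eco))
    return "high" if fire_high > fire_low else "low"

print(csr_level(0.9, 0.8))   # strong inputs fire the 'high' rule
```

A real system of this type would use more inputs, overlapping linguistic terms ("low", "medium", "high"), and a defuzzification step rather than a winner-take-all label.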

  6. Abstract Tree Architectures in 3d Canopy Reflectance Models: Impact on Simulated Satellite Observations

    Science.gov (United States)

    Widlowski, J.; Cote, J.; Beland, M.

    2013-12-01

Current operational retrieval algorithms for terrestrial essential climate variables, like LAI and FAPAR, rely on simulations from physically based radiative transfer models that make inherent assumptions about the structure of the vegetation. This work investigates the biases that can be expected when validated Monte Carlo ray-tracing models are used to simulate the bi-directional reflectance properties of savanna environments at different levels of architectural realism. More specifically, tree crowns are gradually abstracted with voxels that increase in size from 0.1 m to 0.9 m side length. This reduces the spatial variability of the foliage density in the tree crown, alters the outer silhouette of the tree, and changes its directional cross-sectional area. To assess the impact of such structural changes on the radiative properties of savanna environments, highly detailed 3-D tree reconstructions, originally derived from terrestrial LIDAR scans acquired in Mali, are used as reference. Canopy reflectance simulations of these reference targets are then compared with the same data from the voxelised canopy representations (with and without woody structures) at multiple spatial resolutions, spectral domains, and illumination and viewing configurations. The goal of this study is to find the least detailed tree architecture representations that are still suitable for the interpretation of space-borne data. Figure: graphical depiction of the savanna trees (top left) that served as architectural reference for canopy reflectance simulations, compared with crown abstractions based on different voxel sizes (middle and right panels) as well as ellipsoids (bottom left).

  7. Bottom-Up Abstract Modelling of Optical Networks-on-Chip: From Physical to Architectural Layer

    Directory of Open Access Journals (Sweden)

    Alberto Parini

    2012-01-01

    Full Text Available This work presents a bottom-up abstraction procedure based on the design-flow FDTD + SystemC suitable for the modelling of optical Networks-on-Chip. In this procedure, a complex network is decomposed into elementary switching elements whose input-output behavior is described by means of scattering parameters models. The parameters of each elementary block are then determined through 2D-FDTD simulation, and the resulting analytical models are exported within functional blocks in SystemC environment. The inherent modularity and scalability of the S-matrix formalism are preserved inside SystemC, thus allowing the incremental composition and successive characterization of complex topologies typically out of reach for full-vectorial electromagnetic simulators. The consistency of the outlined approach is verified, in the first instance, by performing a SystemC analysis of a four-input, four-output ports switch and making a comparison with the results of 2D-FDTD simulations of the same device. Finally, a further complex network encompassing 160 microrings is investigated, the losses over each routing path are calculated, and the minimum amount of power needed to guarantee an assigned BER is determined. This work is a basic step in the direction of an automatic technology-aware network-level simulation framework capable of assembling complex optical switching fabrics, while at the same time assessing the practical feasibility and effectiveness at the physical/technological level.
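The S-matrix composition idea described in this abstract can be sketched in a few lines. The example below is a hedged illustration under simplifying assumptions: 2-port blocks only (the paper's SystemC blocks model multi-port switching elements), made-up scattering values, and one common S-to-T conversion convention. Cascading is done by converting each block's S-matrix to a transfer (T) matrix, multiplying, and converting back.

```python
# Sketch: composing scattering-parameter models of elementary blocks by
# cascading their transfer (T) matrices. Values and the 2-port restriction
# are illustrative assumptions, not taken from the paper.

def s_to_t(s11, s12, s21, s22):
    # Convert a 2-port S-matrix to a transfer (T) matrix (one convention).
    return ((s12 - s11 * s22 / s21, s11 / s21),
            (-s22 / s21, 1 / s21))

def t_mul(a, b):
    # 2x2 matrix product: cascading two blocks back-to-back.
    return tuple(
        tuple(sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2))
        for i in range(2)
    )

def t_to_s(t):
    # Inverse conversion, consistent with s_to_t above.
    (t11, t12), (t21, t22) = t
    return (t12 / t22, t11 - t12 * t21 / t22, 1 / t22, -t21 / t22)

# Two identical lossy through-paths (|S21| = 0.9, matched ports).
block = s_to_t(0.0, 0.9, 0.9, 0.0)
s11, s12, s21, s22 = t_to_s(t_mul(block, block))
print(abs(s21))  # cascade transmission = product of the per-block |S21|
```

The modularity the abstract exploits is visible here: once each elementary element is characterized (by 2D-FDTD in the paper), arbitrary topologies reduce to matrix products.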

  8. Using Modeling and Simulation to Predict Operator Performance and Automation-Induced Complacency With Robotic Automation: A Case Study and Empirical Validation.

    Science.gov (United States)

    Wickens, Christopher D; Sebok, Angelia; Li, Huiyang; Sarter, Nadine; Gacy, Andrew M

    2015-09-01

The aim of this study was to develop and validate a computational model of the automation complacency effect as operators work on a robotic arm task supported by three different degrees of automation. Some computational models of complacency in human-automation interaction exist, but those were formed and validated within the context of fairly simplified monitoring failures. This research extends model validation to a much more complex task, so that system designers can establish, without need for human-in-the-loop (HITL) experimentation, the merits and shortcomings of different automation degrees. We developed a realistic simulation of a space-based robotic arm task that could be carried out with three different levels of trajectory visualization and execution automation support, and used this simulation to perform HITL testing. Complacency was induced via several trials of correctly performing automation and then assessed on trials when the automation failed. Following a cognitive task analysis of the robotic arm operation, we developed a multicomponent model of the robotic operator and his or her reliance on automation, based in part on visual scanning. Comparison of model predictions with empirical results revealed that the model accurately predicted routine performance as well as the responses to automation failures once complacency had developed. However, the scanning models do not account for all of the attention-allocation effects of complacency. Complacency modeling can provide a useful tool for predicting the effects of different types of imperfect automation. The results from this research suggest that focus should be given to supporting situation awareness in automation development. © 2015, Human Factors and Ergonomics Society.

  9. A Stepwise Fitting Procedure for automated fitting of Ecopath with Ecosim models

    Science.gov (United States)

    Scott, Erin; Serpetti, Natalia; Steenbeek, Jeroen; Heymans, Johanna Jacomina

The Stepwise Fitting Procedure automates testing of the alternative hypotheses used for fitting Ecopath with Ecosim (EwE) models to observation reference data (Mackinson et al. 2009). Calibration of EwE model predictions against observed data is important for evaluating any model that will be used for ecosystem-based management. Thus far, the model fitting procedure in EwE has been carried out manually: a repetitive task involving setting up > 1000 specific individual searches to find the statistically 'best fit' model. The novel fitting procedure automates this manual process, producing accurate results and letting the modeller concentrate on investigating the 'best fit' model for ecological accuracy.

  10. An Overview of the Automated Dispatch Controller Algorithms in the System Advisor Model (SAM)

    Energy Technology Data Exchange (ETDEWEB)

    DiOrio, Nicholas A [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2017-11-22

Three automatic dispatch modes have been added to the battery model within the System Advisor Model (SAM). These controllers have been developed to perform peak shaving in an automated fashion, providing users with a way to see the benefit of reduced demand charges without manually programming a complicated dispatch control. A flexible input option allows more advanced interaction with the automated controller. This document describes the algorithms in detail and presents brief results on their use and limitations.
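The peak-shaving behaviour that such a controller automates can be sketched minimally. The following is an illustrative toy, not SAM's actual algorithm: the battery discharges to clip load above a demand target and recharges from the headroom below it; all numbers are made up.

```python
# Minimal peak-shaving sketch (illustrative only; SAM's automated dispatch
# controllers are more involved and handle efficiency, forecasts, etc.).

def peak_shave(load_kw, target_kw, capacity_kwh, power_kw, dt_h=1.0):
    soc = capacity_kwh                      # assume the battery starts full
    grid = []
    for load in load_kw:
        if load > target_kw:
            # discharge, limited by the power rating and remaining energy
            dis = min(load - target_kw, power_kw, soc / dt_h)
            soc -= dis * dt_h
            grid.append(load - dis)
        else:
            # recharge with the headroom below the target
            chg = min(target_kw - load, power_kw, (capacity_kwh - soc) / dt_h)
            soc += chg * dt_h
            grid.append(load + chg)
    return grid

profile = [40, 60, 90, 120, 80, 50]         # hypothetical hourly load, kW
shaved = peak_shave(profile, target_kw=100, capacity_kwh=30, power_kw=25)
print(max(shaved))                          # peak clipped from 120 to 100 kW
```

The demand-charge benefit the abstract mentions comes from the reduced grid peak (here 100 kW instead of 120 kW).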

  11. Automated MRI segmentation for individualized modeling of current flow in the human head

    Science.gov (United States)

    Huang, Yu; Dmochowski, Jacek P.; Su, Yuzhuo; Datta, Abhishek; Rorden, Christopher; Parra, Lucas C.

    2013-12-01

Objective. High-definition transcranial direct current stimulation (HD-tDCS) and high-density electroencephalography require accurate models of current flow for precise targeting and current source reconstruction. At a minimum, such modeling must capture the idiosyncratic anatomy of the brain, cerebrospinal fluid (CSF) and skull for each individual subject. Currently, the process to build such high-resolution individualized models from structural magnetic resonance images requires labor-intensive manual segmentation, even when utilizing available automated segmentation tools. Also, accurate placement of many high-density electrodes on an individual scalp is a tedious procedure. The goal was to develop fully automated techniques to reduce the manual effort in such a modeling process. Approach. A fully automated segmentation technique based on Statistical Parametric Mapping 8, including an improved tissue probability map and an automated correction routine for segmentation errors, was developed, along with an automated electrode placement tool for high-density arrays. The performance of these automated routines was evaluated against results from manual segmentation on four healthy subjects and seven stroke patients. The criteria include segmentation accuracy, the difference of current flow distributions in resulting HD-tDCS models and the optimized current flow intensities on cortical targets. Main results. The segmentation tool can segment out not just the brain but also provide accurate results for CSF, skull and other soft tissues with a field of view extending to the neck. Compared to manual results, automated segmentation deviates by only 7% and 18% for normal and stroke subjects, respectively. The predicted electric fields in the brain deviate by 12% and 29% respectively, which is well within the variability observed for various modeling choices. Finally, optimized current flow intensities on cortical targets do not differ significantly. Significance. Fully

  12. Design And Modeling An Automated Digsilent Power System For Optimal New Load Locations

    Directory of Open Access Journals (Sweden)

    Mohamed Saad

    2015-08-01

Full Text Available Abstract The electric power utilities seek to take advantage of novel approaches to meet growing energy demand. Utilities are under pressure to evolve their classical topologies to increase the usage of distributed generation. Currently, electrical power engineers in many regions of the world use manual methods to measure power consumption for further assessment of voltage violations. Such a process has proved to be time-consuming, costly and inaccurate. Demand response is a grid management technique in which retail or wholesale customers are requested, either electronically or manually, to reduce their load. Therefore, this paper aims to design and model an automated power system for optimal new load locations using DPL (DIgSILENT Programming Language). This study is a diagnostic approach that informs the system operator about any voltage violation cases that would occur when adding a new load to the grid. The process of identifying the optimal bus bar location involves a complicated calculation of the power consumption at each load bus. As a result, the DPL program considers all the IEEE 30-bus internal network data and then executes a load flow simulation to add the new load to the first bus in the network. The developed model then simulates the new load at each available bus bar in the network and generates three analytical reports for each case, capturing the over/under-voltage and the loading of elements in the grid.

  13. Industrial Automation Mechanic Model Curriculum Project. Final Report.

    Science.gov (United States)

    Toledo Public Schools, OH.

    This document describes a demonstration program that developed secondary level competency-based instructional materials for industrial automation mechanics. Program activities included task list compilation, instructional materials research, learning activity packet (LAP) development, construction of lab elements, system implementation,…

  14. Modelling and experimental study for automated congestion driving

    NARCIS (Netherlands)

    Urhahne, Joseph; Piastowski, P.; van der Voort, Mascha C.; Bebis, G; Boyle, R.; Parvin, B.; Koracin, D.; Pavlidis, I.; Feris, R.; McGraw, T.; Elendt, M.; Kopper, R.; Ragan, E.; Ye, Z.; Weber, G.

    2015-01-01

    Taking a collaborative approach in automated congestion driving with a Traffic Jam Assist system requires the driver to take over control in certain traffic situations. In order to warn the driver appropriately, warnings are issued (“pay attention” vs. “take action”) due to a control transition

  15. Automation of a dust sampling train | Akinola | Journal of Modeling ...

    African Journals Online (AJOL)

    The results obtained from this work show that the flue gas sampling process can be automated using microprocessor-based control system and a sampling train. Using the sampling train developed by the British Coal Utilization Research Association (BCURA) with both the internal and external flow-meter arrangements, the ...

  16. Côte de Resyste -- Automated Model Based Testing

    NARCIS (Netherlands)

    Tretmans, G.J.; Brinksma, Hendrik; Schweizer, M.

    2002-01-01

Systematic testing is very important for assessing and improving the quality of embedded software. Yet, testing turns out to be expensive, laborious, time-consuming and error-prone. The project Côte de Resyste has been working since 1998 on methods, techniques and tools for automating specification

  17. TorX: Automated Model-Based Testing

    NARCIS (Netherlands)

    Tretmans, G.J.; Brinksma, Hendrik; Hartman, A.; Dussa-Ziegler, K.

    2003-01-01

    Systematic testing is very important for assessing and improving the quality of software systems. Yet, testing turns out to be expensive, laborious, time-consuming and error-prone. The Dutch research and development project Côte de Resyste worked on methods, techniques and tools for automating

  18. METHOD AND ABSTRACT MODEL FOR CONTROL AND ACCESS RIGHTS BY REQUESTS REDIRECTION

    Directory of Open Access Journals (Sweden)

    K. A. Shcheglov

    2015-11-01

Full Text Available We have investigated problems in implementing control over the access rights of subjects to objects in modern computer systems. We suggest an access control method based on redirection of object access requests. The method has a distinctive feature compared to discretionary access control: when writing (object modification) must be denied to a subject, it is not denied but redirected (access rights are not changed, but the operation is performed on another object). This makes it possible to implement access policies to system objects without breaking the operability of the system and applications, and to share access objects correctly between subjects. This important property of the suggested access control method enables fundamentally new approaches to securing system objects, such as virtualization of system resources aimed at protecting system objects from attacks by users and applications. We have created an abstract model which shows that this method (control of access from subjects to objects based on request redirection) can be used as a self-sufficient access control method implementing any access control policy (from subjects to objects), thus being an alternative to the discretionary access control method.

  19. Automated 4D analysis of dendritic spine morphology: applications to stimulus-induced spine remodeling and pharmacological rescue in a disease model

    Directory of Open Access Journals (Sweden)

    Swanger Sharon A

    2011-10-01

    Full Text Available Abstract Uncovering the mechanisms that regulate dendritic spine morphology has been limited, in part, by the lack of efficient and unbiased methods for analyzing spines. Here, we describe an automated 3D spine morphometry method and its application to spine remodeling in live neurons and spine abnormalities in a disease model. We anticipate that this approach will advance studies of synapse structure and function in brain development, plasticity, and disease.

  20. Context based mixture model for cell phase identification in automated fluorescence microscopy

    Directory of Open Access Journals (Sweden)

    Zhou Xiaobo

    2007-01-01

Full Text Available Abstract Background Automated identification of the cell cycle phases of individual live cells in a large population, captured via automated fluorescence microscopy, is important for cancer drug discovery and cell cycle studies. Time-lapse fluorescence microscopy images provide an important method to study the cell cycle process under different conditions of perturbation. Existing methods are limited in dealing with such time-lapse data sets, while manual analysis is not feasible. This paper presents statistical data analysis and statistical pattern recognition to perform this task. Results The data is generated from HeLa H2B-GFP cells imaged during a 2-day period, with images acquired 15 minutes apart using automated time-lapse fluorescence microscopy. The patterns are described with four kinds of features: twelve general features, Haralick texture features, Zernike moment features, and wavelet features. To generate a new set of features with more discriminative power, commonly used feature reduction techniques are applied, including Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), Maximum Margin Criterion (MMC), Stepwise Discriminant Analysis based Feature Selection (SDAFS), and Genetic Algorithm based Feature Selection (GAFS). Then, we propose a Context Based Mixture Model (CBMM) for dealing with the time-series cell sequence information and compare it to other traditional classifiers: Support Vector Machine (SVM), Neural Network (NN), and K-Nearest Neighbor (KNN). Being a standard practice in machine learning, we systematically compare the performance of a number of common feature reduction techniques and classifiers to select an optimal combination of a feature reduction technique and a classifier. A cellular database containing 100 manually labelled subsequences is built for evaluating the performance of the classifiers. The generalization error is estimated using the cross validation technique. The
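The classification step compared in this abstract can be illustrated with the simplest of the listed classifiers, KNN, voting over reduced feature vectors. The data below are synthetic and the 2-D feature space is an assumption for illustration; the paper classifies PCA/LDA-reduced texture, moment and wavelet features from real fluorescence images.

```python
# Toy sketch of the classification step: a k-nearest-neighbour vote over
# reduced feature vectors (synthetic data, hypothetical phase labels).
from collections import Counter
import math

def knn_predict(train, labels, x, k=3):
    # rank training samples by Euclidean distance to the query, vote on k
    nearest = sorted(range(len(train)), key=lambda i: math.dist(train[i], x))[:k]
    return Counter(labels[i] for i in nearest).most_common(1)[0][0]

# two synthetic cell-phase clusters in a 2-D reduced feature space
feats  = [(0.10, 0.20), (0.20, 0.10), (0.15, 0.25),
          (0.90, 0.80), (1.00, 0.90), (0.85, 0.95)]
phases = ["interphase"] * 3 + ["mitosis"] * 3
print(knn_predict(feats, phases, (0.95, 0.85)))  # query near the 2nd cluster
```

The paper's contribution, the Context Based Mixture Model, additionally exploits the temporal ordering of frames in a cell's sequence, which a per-frame classifier like this one ignores.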

  1. ABSTRACTION OF INFORMATION FROM 2- AND 3-DIMENSIONAL PORFLOW MODELS INTO A 1-D GOLDSIM MODEL - 11404

    International Nuclear Information System (INIS)

    Taylor, G.; Hiergesell, R.

    2010-01-01

The Savannah River National Laboratory has developed a 'hybrid' approach to Performance Assessment modeling which has been used for a number of Performance Assessments. This hybrid approach uses a multi-dimensional modeling platform (PorFlow) to develop deterministic flow fields and perform contaminant transport. The GoldSim modeling platform is used to develop the Sensitivity and Uncertainty analyses. Because these codes are performing complementary tasks, it is incumbent upon them that for the deterministic cases they produce very similar results. This paper discusses two very different waste forms, one with no engineered barriers and one with engineered barriers, each of which present different challenges to the abstraction of data. The hybrid approach to Performance Assessment modeling used at the SRNL uses a 2-D unsaturated zone (UZ) and a 3-D saturated zone (SZ) model in the PorFlow modeling platform. The UZ model consists of the waste zone and the unsaturated zone between the waste zone and the water table. The SZ model consists of source cells beneath the waste form to the points of interest. Both models contain 'buffer' cells so that modeling domain boundaries do not adversely affect the calculation. The information pipeline between the two models is the contaminant flux. The domain contaminant flux, typically in units of moles (or Curies) per year, from the UZ model is used as a boundary condition for the source cells in the SZ. The GoldSim modeling component of the hybrid approach is an integrated UZ-SZ model. The model is a 1-D representation of the SZ, typically 1-D in the UZ, but as discussed below, depending on the waste form being analyzed may contain pseudo-2-D elements. A waste form at the Savannah River Site (SRS) which has no engineered barriers is commonly referred to as a slit trench. A slit trench, as its name implies, is an unlined trench, typically 6 m deep, 6 m wide, and 200 m long. Low level waste consisting of soil, debris, rubble, wood

  2. Automated Eukaryotic Gene Structure Annotation Using EVidenceModeler and the Program to Assemble Spliced Alignments

    Energy Technology Data Exchange (ETDEWEB)

    Haas, B J; Salzberg, S L; Zhu, W; Pertea, M; Allen, J E; Orvis, J; White, O; Buell, C R; Wortman, J R

    2007-12-10

    EVidenceModeler (EVM) is presented as an automated eukaryotic gene structure annotation tool that reports eukaryotic gene structures as a weighted consensus of all available evidence. EVM, when combined with the Program to Assemble Spliced Alignments (PASA), yields a comprehensive, configurable annotation system that predicts protein-coding genes and alternatively spliced isoforms. Our experiments on both rice and human genome sequences demonstrate that EVM produces automated gene structure annotation approaching the quality of manual curation.

  3. Advances in automated valuation modeling AVM after the non-agency mortgage crisis

    CERN Document Server

    Kauko, Tom

    2017-01-01

This book addresses several problems related to automated valuation methodologies (AVM). Following the non-agency mortgage crisis, it offers a variety of approaches to improve the efficiency and quality of an automated valuation methodology (AVM), dealing with emerging problems and different contexts. Spatial issues, the evolution of AVM standards, multilevel models, fuzzy and rough set applications, and quantitative methods to define comparables are just some of the topics discussed.

  4. Automated finite element modelling of 3D woven textiles

    OpenAIRE

    Zeng, Xuesen; Long, A.C.; Clifford, M.J.; Probst-Schendzielorz, S.; Schmitt, M.W.

    2011-01-01

The advance of 3D fabric technology allows tailored material structure in different directions for optimised performance. 3D fabrics open up increasing applications in automotive, medical, energy and many other areas. This paper explores highly automated techniques to simulate 3D fabric geometry and mechanical behaviour. The basis of the work starts from TexGen, an open source software package developed at the University of Nottingham. A complex variety of 3D fabrics can be defined as subclass...

  5. Automated comparison of Bayesian reconstructions of experimental profiles with physical models

    International Nuclear Information System (INIS)

    Irishkin, Maxim

    2014-01-01

    In this work we developed an expert system that carries out in an integrated and fully automated way i) a reconstruction of plasma profiles from the measurements, using Bayesian analysis ii) a prediction of the reconstructed quantities, according to some models and iii) an intelligent comparison of the first two steps. This system includes systematic checking of the internal consistency of the reconstructed quantities, enables automated model validation and, if a well-validated model is used, can be applied to help detecting interesting new physics in an experiment. The work shows three applications of this quite general system. The expert system can successfully detect failures in the automated plasma reconstruction and provide (on successful reconstruction cases) statistics of agreement of the models with the experimental data, i.e. information on the model validity. (author) [fr
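The "intelligent comparison" step of such an expert system can be illustrated with a standard agreement metric. The sketch below is a hedged, generic illustration (not the thesis's actual criteria): a reduced chi-square between reconstructed profile values and model predictions, using the posterior uncertainties that a Bayesian reconstruction provides; all numbers are synthetic.

```python
# Generic sketch of model-vs-reconstruction agreement via reduced chi-square.
# Profile values, uncertainties and predictions below are made up.

def reduced_chi2(reconstructed, sigma, predicted):
    n = len(reconstructed)
    return sum((r - p) ** 2 / s ** 2
               for r, s, p in zip(reconstructed, sigma, predicted)) / n

recon = [1.0, 0.8, 0.5, 0.2]        # e.g. a normalised plasma profile
sigma = [0.1, 0.1, 0.05, 0.05]      # posterior uncertainties
model = [1.05, 0.75, 0.52, 0.18]    # model prediction at the same points
print(reduced_chi2(recon, sigma, model))  # values around 1 or below suggest agreement
```

Aggregating such per-profile statistics over many discharges is one way to build the "statistics of agreement of the models with the experimental data" that the abstract mentions.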

  6. A Model of Process-Based Automation: Cost and Quality Implications in the Medication Management Process

    Science.gov (United States)

    Spaulding, Trent Joseph

    2011-01-01

    The objective of this research is to understand how a set of systems, as defined by the business process, creates value. The three studies contained in this work develop the model of process-based automation. The model states that complementarities among systems are specified by handoffs in the business process. The model also provides theory to…

  7. Model of informational system for freight insurance automation based on digital signature

    Directory of Open Access Journals (Sweden)

    Maxim E. SLOBODYANYUK

    2009-01-01

Full Text Available The article considers a model of an informational system for freight insurance automation based on digital signature, showing its architecture, a macro flowchart of information flow in the model, and its components (modules) and their functions. A calculation method for the costs of interactive cargo insurance via the proposed system is described, along with the main characteristics and options of existing transport management systems and conceptual cost models.

  8. Model-driven design using IEC 61499 a synchronous approach for embedded and automation systems

    CERN Document Server

    Yoong, Li Hsien; Bhatti, Zeeshan E; Kuo, Matthew M Y

    2015-01-01

    This book describes a novel approach for the design of embedded systems and industrial automation systems, using a unified model-driven approach that is applicable in both domains.  The authors illustrate their methodology, using the IEC 61499 standard as the main vehicle for specification, verification, static timing analysis and automated code synthesis.  The well-known synchronous approach is used as the main vehicle for defining an unambiguous semantics that ensures determinism and deadlock freedom. The proposed approach also ensures very efficient implementations either on small-scale embedded devices or on industry-scale programmable automation controllers (PACs). It can be used for both centralized and distributed implementations. Significantly, the proposed approach can be used without the need for any run-time support. This approach, for the first time, blurs the gap between embedded systems and automation systems and can be applied in wide-ranging applications in automotive, robotics, and industri...

  9. Modeling the Energy Use of a Connected and Automated Transportation System (Poster)

    Energy Technology Data Exchange (ETDEWEB)

    Gonder, J.; Brown, A.

    2014-07-01

    Early research points to large potential impacts of connected and automated vehicles (CAVs) on transportation energy use - dramatic savings, increased use, or anything in between. Due to a lack of suitable data and integrated modeling tools to explore these complex future systems, analyses to date have relied on simple combinations of isolated effects. This poster proposes a framework for modeling the potential energy implications from increasing penetration of CAV technologies and for assessing technology and policy options to steer them toward favorable energy outcomes. Current CAV modeling challenges include estimating behavior change, understanding potential vehicle-to-vehicle interactions, and assessing traffic flow and vehicle use under different automation scenarios. To bridge these gaps and develop a picture of potential future automated systems, NREL is integrating existing modeling capabilities with additional tools and data inputs to create a more fully integrated CAV assessment toolkit.

  10. HIERARCHICAL DATAFLOW MODEL WITH AUTOMATED FILE MANAGEMENT FOR ENGINEERING AND SCIENTIFIC APPLICATIONS

    Directory of Open Access Journals (Sweden)

    2016-01-01

Full Text Available Solving modern scientific and engineering problems typically implies using multiple task-specific software applications, and often a complex sequence of computations must be performed. The adopted approach to achieve the required level of automation is to use one of the many available scientific and engineering workflow systems, which can be based on different workflow models. This paper introduces a workflow model targeted to provide natural automation and distributed execution of complex iterative computation processes, where the calculation chain contains multiple task-specific software applications which exchange files during the process. The proposed workflow model addresses a wide range of applications and targets complex cases when a single iteration of a top-level process may contain multiple nested execution loops. Typical requirements for process automation are considered as well: execution isolation, data re-use and caching, parallel execution, and data provenance tracking.
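The file-exchange-with-caching requirement described above can be sketched with a tiny task wrapper: a task re-runs only when the content of an input file has changed. All names here are illustrative and not the paper's actual system.

```python
# Sketch: a file-exchanging task that is skipped when its inputs are
# unchanged (simple content-hash caching). Hypothetical example code.
import hashlib, os, tempfile
from pathlib import Path

def digest(path):
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()

class Task:
    def __init__(self, fn, inputs, output):
        self.fn, self.inputs, self.output = fn, inputs, output
        self._seen = None                 # input hashes from the last run
    def run(self):
        key = tuple(digest(p) for p in self.inputs)
        if key == self._seen and os.path.exists(self.output):
            return False                  # inputs unchanged: serve from cache
        self.fn(self.inputs, self.output)
        self._seen = key
        return True

d = Path(tempfile.mkdtemp())
src, out = d / "a.txt", d / "b.txt"
src.write_text("1.5 2.5\n")

def double(ins, outp):                    # a stand-in task-specific application
    vals = [float(v) * 2 for v in Path(ins[0]).read_text().split()]
    Path(outp).write_text(" ".join(map(str, vals)))

t = Task(double, [src], out)
print(t.run(), t.run())                   # executes once, then hits the cache
```

In a full workflow system the same hash keys would also drive provenance tracking and safe parallel execution of independent tasks.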

  11. Planning for Evolution in a Production Environment: Migration from a Legacy Geometry Code to an Abstract Geometry Modeling Language in STAR

    Science.gov (United States)

    Webb, Jason C.; Lauret, Jerome; Perevoztchikov, Victor

    2012-12-01

    Increasingly detailed descriptions of complex detector geometries are required for the simulation and analysis of today's high-energy and nuclear physics experiments. As new tools for the representation of geometry models become available during the course of an experiment, a fundamental challenge arises: how best to migrate from legacy geometry codes developed over many runs to the new technologies, such as the ROOT/TGeo [1] framework, without losing touch with years of development, tuning and validation. One approach, which has been discussed within the community for a number of years, is to represent the geometry model in a higher-level language independent of the concrete implementation of the geometry. The STAR experiment has used this approach to successfully migrate its legacy GEANT 3-era geometry to an Abstract geometry Modelling Language (AgML), which allows us to create both native GEANT 3 and ROOT/TGeo implementations. The language is supported by parsers and a C++ class library which enables the automated conversion of the original source code to AgML, supports export back to the original AgSTAR[5] representation, and creates the concrete ROOT/TGeo geometry implementation used by our track reconstruction software. In this paper we present our approach, design and experience and will demonstrate physical consistency between the original AgSTAR and new AgML geometry representations.

  12. Individual differences in response to automation: the five factor model of personality.

    Science.gov (United States)

    Szalma, James L; Taylor, Grant S

    2011-06-01

This study examined the relationship of operator personality (Five Factor Model) and characteristics of the task and of adaptive automation (reliability and adaptiveness, i.e., whether the automation was well-matched to changes in task demand) to operator performance, workload, stress, and coping. This represents the first investigation of how the Five Factors relate to human response to automation. One hundred sixty-one college students experienced either 75% or 95% reliable automation provided with task loads of either two or four displays to be monitored. The task required threat detection in a simulated uninhabited ground vehicle (UGV) task. Task demand exerted the strongest influence on outcome variables. Automation characteristics did not directly impact workload or stress, but effects did emerge in the context of trait-task interactions that varied as a function of the dimension of workload and stress. The pattern of relationships of traits to dependent variables was generally moderated by at least one task factor. Neuroticism was related to poorer performance in some conditions, and all five traits were associated with at least one measure of workload and stress. Neuroticism generally predicted increased workload and stress, and the other traits predicted decreased levels of these states. However, in the case of the relation of Extraversion and Agreeableness to Worry, Frustration, and avoidant coping, the direction of effects varied across task conditions. The results support the incorporation of individual differences into automation design by identifying the relevant person characteristics and using the information to determine what functions to automate and the form and level of automation.

  13. Data for Environmental Modeling (D4EM): Background and Applications of Data Automation

    Science.gov (United States)

    The Data for Environmental Modeling (D4EM) project demonstrates the development of a comprehensive set of open source software tools that overcome obstacles to accessing data needed by automating the process of populating model input data sets with environmental data available fr...

  14. A predictive model of flight crew performance in automated air traffic control and flight management operations

    Science.gov (United States)

    1995-01-01

    Prepared ca. 1995. This paper describes Air-MIDAS, a model of pilot performance in interaction with varied levels of automation in flight management operations. The model was used to predict the performance of a two person flight crew responding to c...

  15. Teaching Subtraction and Multiplication with Regrouping Using the Concrete-Representational-Abstract Sequence and Strategic Instruction Model

    Science.gov (United States)

    Flores, Margaret M.; Hinton, Vanessa; Strozier, Shaunita D.

    2014-01-01

    Based on Common Core Standards (2010), mathematics interventions should emphasize conceptual understanding of numbers and operations as well as fluency. For students at risk for failure, the concrete-representational-abstract (CRA) sequence and the Strategic Instruction Model (SIM) have been shown effective in teaching computation with an emphasis…

  16. Parameters Investigation of Mathematical Model of Productivity for Automated Line with Availability by DMAIC Methodology

    Directory of Open Access Journals (Sweden)

    Tan Chan Sin

    2014-01-01

Full Text Available Automated lines are widely applied in industry, especially for mass production with low product variety. Productivity is one of the important criteria for an automated line, as it directly reflects outputs and profits. Industry needs to forecast productivity accurately in order to meet customer demand, and the forecast is calculated using a mathematical model. A mathematical model of productivity with availability for automated lines has been introduced to express productivity in terms of a single level of reliability for stations and mechanisms. Since this mathematical model of productivity with availability cannot predict productivity closely enough compared to actual values, owing to parameters it leaves out of consideration, enhancement of the model is required to include the loss parameters that the current model does not consider. This paper presents the productivity loss parameters investigated by using the DMAIC (Define, Measure, Analyze, Improve, and Control) concept and the PACE Prioritization Matrix (Priority, Action, Consider, and Eliminate). The investigated parameters are important for further improvement of the mathematical model of productivity with availability, in order to develop a robust mathematical model of productivity for automated lines.
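One common textbook form of the "productivity with availability" model for a serial automated line can be sketched as follows. This is a hedged illustration of the general model family the abstract refers to, not necessarily the paper's exact equations, and every number below is made up.

```python
# Illustrative form: Q = [1 / (t_m / q + t_aux)] * A, with availability
# A = 1 / (1 + sum(failure_rates) * t_repair). Assumed formula and data.

def productivity(t_m, t_aux, q, failure_rates, t_repair):
    cycle = t_m / q + t_aux                       # balanced-line cycle time, min
    availability = 1.0 / (1.0 + sum(failure_rates) * t_repair)
    return (1.0 / cycle) * availability           # parts per minute

rate = productivity(t_m=6.0, t_aux=0.5, q=10,
                    failure_rates=[0.01] * 10, t_repair=2.0)
print(round(rate, 3))
```

The loss parameters the paper investigates via DMAIC would enter such a model as additional terms reducing either the effective cycle rate or the availability factor.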

  17. Abstract algebra

    CERN Document Server

    Garrett, Paul B

    2007-01-01

    Designed for an advanced undergraduate- or graduate-level course, Abstract Algebra provides an example-oriented, less heavily symbolic approach to abstract algebra. The text emphasizes specifics such as basic number theory, polynomials, finite fields, as well as linear and multilinear algebra. This classroom-tested, how-to manual takes a more narrative approach than the stiff formalism of many other textbooks, presenting coherent storylines to convey crucial ideas in a student-friendly, accessible manner. An unusual feature of the text is the systematic characterization of objects by universal

  18. BALWOIS: Abstracts

    International Nuclear Information System (INIS)

    Morell, Morell; Todorovik, Olivija; Dimitrov, Dobri

    2004-01-01

    anthropogenic pressures and international shared water. Here are the 320 abstracts proposed by authors and accepted by the Scientific Committee. More than 200 papers are presented during the Conference on 8 topics related to Hydrology, Climatology and Hydro biology: - Climate and Environment; - Hydrological regimes and water balances; - Droughts and Floods; -Integrated Water Resources Management; -Water bodies Protection and Eco hydrology; -Lakes; -Information Systems for decision support; -Hydrological modelling. Papers relevant to INIS are indexed separately

  19. An OMNeT++ model of the control system of large-scale concentrator photovoltaic power plants: Poster abstract

    OpenAIRE

    Benoit, P.; Fey, S.; Rohbogner, G.; Kreifels, N.; Kohrs, R.

    2013-01-01

    The communication system of a large-scale concentrator photovoltaic power plant is very challenging. Manufacturers are building power plants having thousands of sun tracking systems equipped with communication and distributed over a wide area. Research is necessary to build a scalable communication system enabling modern control strategies. This poster abstract describes the ongoing work on the development of a simulation model of such power plants in OMNeT++. The model uses the INET Framewor...

  20. Automated side-chain model building and sequence assignment by template matching

    International Nuclear Information System (INIS)

    Terwilliger, Thomas C.

    2002-01-01

    A method for automated macromolecular side-chain model building and for aligning the sequence to the map is described. An algorithm is described for automated building of side chains in an electron-density map once a main-chain model is built and for alignment of the protein sequence to the map. The procedure is based on a comparison of electron density at the expected side-chain positions with electron-density templates. The templates are constructed from average amino-acid side-chain densities in 574 refined protein structures. For each contiguous segment of main chain, a matrix with entries corresponding to an estimate of the probability that each of the 20 amino acids is located at each position of the main-chain model is obtained. The probability that this segment corresponds to each possible alignment with the sequence of the protein is estimated using a Bayesian approach and high-confidence matches are kept. Once side-chain identities are determined, the most probable rotamer for each side chain is built into the model. The automated procedure has been implemented in the RESOLVE software. Combined with automated main-chain model building, the procedure produces a preliminary model suitable for refinement and extension by an experienced crystallographer
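
The alignment step described above, scoring each possible placement of a main-chain segment against the protein sequence and keeping only high-confidence matches, can be sketched as follows. This is an illustrative reconstruction, not RESOLVE's actual code; the probability profile, cutoff value, and function name are hypothetical.

```python
import math

def align_segment(prob, sequence, log_odds_cutoff=5.0):
    """Slide a segment's per-position amino-acid probability profile
    along the sequence; return the best offset only if it beats the
    runner-up by a clear log-probability margin (high confidence)."""
    n = len(prob)  # prob[i] maps an amino-acid letter -> probability
    scores = []
    for off in range(len(sequence) - n + 1):
        logp = sum(math.log(max(prob[i].get(sequence[off + i], 0.0), 1e-9))
                   for i in range(n))
        scores.append((logp, off))
    scores.sort(reverse=True)
    if len(scores) > 1 and scores[0][0] - scores[1][0] < log_odds_cutoff:
        return None  # ambiguous: no single alignment dominates
    return scores[0][1]
```

A segment whose density strongly matches tryptophan then tyrosine aligns only where the sequence contains that motif; everywhere else the score collapses toward the floor probability, so ambiguous placements are rejected rather than guessed.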

  1. Article Abstract

    African Journals Online (AJOL)

    Abstract. Simple learning tools to improve clinical laboratory practical skills training. B Taye, BSc, MPH. Addis Ababa University, College of Health Sciences, Addis Ababa, ... concerns about the competence of medical laboratory science graduates. ... standardised practical learning guides and assessment checklists would.

  2. Abstract Introduction

    African Journals Online (AJOL)

Abstract. Cyclic ovarian activity and plasma progesterone (P4) concentrations were assessed for 179 days in 5 (free grazing) and 6 (free grazing + high energy and protein-supplemented) normocyclic donkeys. In addition, plasma P4 and cortisol were measured in blood samples collected at 15-min intervals in the.

  3. Abstract Introduction

    African Journals Online (AJOL)

Abstract. Hemoglobin is a tetrameric protein which is able to dissociate into dimers. The dimers can in turn reassociate into tetramers. It has been found that dimers are more reactive than tetramers. The difference in the reactivity of these two species has been used to determine the tetramer-dimer dissociation constant of ...

  4. Modeling strategic behavior in human-automation interaction - Why an 'aid' can (and should) go unused

    Science.gov (United States)

    Kirlik, Alex

    1993-01-01

    Task-offload aids (e.g., an autopilot, an 'intelligent' assistant) can be selectively engaged by the human operator to dynamically delegate tasks to automation. Introducing such aids eliminates some task demands but creates new ones associated with programming, engaging, and disengaging the aiding device via an interface. The burdens associated with managing automation can sometimes outweigh the potential benefits of automation to improved system performance. Aid design parameters and features of the overall multitask context combine to determine whether or not a task-offload aid will effectively support the operator. A modeling and sensitivity analysis approach is presented that identifies effective strategies for human-automation interaction as a function of three task-context parameters and three aid design parameters. The analysis and modeling approaches provide resources for predicting how a well-adapted operator will use a given task-offload aid, and for specifying aid design features that ensure that automation will provide effective operator support in a multitask environment.

  5. Formally verifying human-automation interaction as part of a system model: limitations and tradeoffs.

    Science.gov (United States)

    Bolton, Matthew L; Bass, Ellen J

    2010-03-25

    Both the human factors engineering (HFE) and formal methods communities are concerned with improving the design of safety-critical systems. This work discusses a modeling effort that leveraged methods from both fields to perform formal verification of human-automation interaction with a programmable device. This effort utilizes a system architecture composed of independent models of the human mission, human task behavior, human-device interface, device automation, and operational environment. The goals of this architecture were to allow HFE practitioners to perform formal verifications of realistic systems that depend on human-automation interaction in a reasonable amount of time using representative models, intuitive modeling constructs, and decoupled models of system components that could be easily changed to support multiple analyses. This framework was instantiated using a patient controlled analgesia pump in a two phased process where models in each phase were verified using a common set of specifications. The first phase focused on the mission, human-device interface, and device automation; and included a simple, unconstrained human task behavior model. The second phase replaced the unconstrained task model with one representing normative pump programming behavior. Because models produced in the first phase were too large for the model checker to verify, a number of model revisions were undertaken that affected the goals of the effort. While the use of human task behavior models in the second phase helped mitigate model complexity, verification time increased. Additional modeling tools and technological developments are necessary for model checking to become a more usable technique for HFE.

  6. A Stepwise Fitting Procedure for automated fitting of Ecopath with Ecosim models

    Directory of Open Access Journals (Sweden)

    Erin Scott

    2016-01-01

The Stepwise Fitting Procedure automates testing of the alternative hypotheses used for fitting Ecopath with Ecosim (EwE) models to observation reference data (Mackinson et al. 2009). The calibration of EwE model predictions to observed data is important in evaluating any model that will be used for ecosystem-based management. Thus far, the model-fitting procedure in EwE has been carried out manually: a repetitive task involving setting up >1000 specific individual searches to find the statistically ‘best fit’ model. The novel fitting procedure automates this manual procedure, producing accurate results and letting the modeller concentrate on investigating the ‘best fit’ model for ecological accuracy.
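
Automating ">1000 specific individual searches" amounts to a loop that fits each candidate hypothesis and keeps the statistically best one. A generic sketch, using polynomial terms and AIC as stand-ins for EwE's vulnerability searches and its fit statistic (these specifics are illustrative, not from the paper):

```python
import numpy as np

def aic(rss, n_obs, k):
    # Akaike information criterion for a least-squares fit with k parameters
    return n_obs * np.log(rss / n_obs) + 2 * k

def stepwise_fit(x, y, max_degree=5):
    """Grow the model one term at a time, keeping each new term only
    while it improves AIC; stop at the first non-improving step."""
    best_deg, best_aic = 0, None
    for deg in range(max_degree + 1):
        coeffs = np.polyfit(x, y, deg)
        rss = max(float(np.sum((np.polyval(coeffs, x) - y) ** 2)), 1e-12)
        score = aic(rss, len(x), deg + 1)
        if best_aic is None or score < best_aic:
            best_deg, best_aic = deg, score
        else:
            break  # an extra term no longer pays for itself
    return best_deg
```

Penalising each added parameter is what keeps an automated search from always preferring the most complex hypothesis.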

  7. Tensor contraction engine: Abstraction and automated parallel implementation of configuration-interaction, coupled-cluster, and many-body perturbation theories

    International Nuclear Information System (INIS)

    Hirata, So

    2003-01-01

    We develop a symbolic manipulation program and program generator (Tensor Contraction Engine or TCE) that automatically derives the working equations of a well-defined model of second-quantized many-electron theories and synthesizes efficient parallel computer programs on the basis of these equations. Provided an ansatz of a many-electron theory model, TCE performs valid contractions of creation and annihilation operators according to Wick's theorem, consolidates identical terms, and reduces the expressions into the form of multiple tensor contractions acted by permutation operators. Subsequently, it determines the binary contraction order for each multiple tensor contraction with the minimal operation and memory cost, factorizes common binary contractions (defines intermediate tensors), and identifies reusable intermediates. The resulting ordered list of binary tensor contractions, additions, and index permutations is translated into an optimized program that is combined with the NWChem and UTChem computational chemistry software packages. The programs synthesized by TCE take advantage of spin symmetry, Abelian point-group symmetry, and index permutation symmetry at every stage of calculations to minimize the number of arithmetic operations and storage requirement, adjust the peak local memory usage by index range tiling, and support parallel I/O interfaces and dynamic load balancing for parallel executions. We demonstrate the utility of TCE through automatic derivation and implementation of parallel programs for various models of configuration-interaction theory (CISD, CISDT, CISDTQ), many-body perturbation theory[MBPT(2), MBPT(3), MBPT(4)], and coupled-cluster theory (LCCD, CCD, LCCSD, CCSD, QCISD, CCSDT, and CCSDTQ)
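
The core cost-minimisation step, choosing a binary contraction order for a multiple tensor contraction, can be illustrated with an exhaustive search over pairings. This toy models only operation count (not memory, symmetry, or intermediate reuse), and the names here are illustrative rather than TCE's:

```python
from itertools import combinations

def pair_cost(a, b, dims):
    # flops ~ product of the extents of all indices involved in the pair
    cost = 1
    for idx in a | b:
        cost *= dims[idx]
    return cost

def min_contraction_cost(tensors, dims):
    """Exhaustive search for the cheapest binary contraction order.
    Tensors are frozensets of index labels; an index shared by the
    two operands is summed over and vanishes from the result."""
    if len(tensors) == 1:
        return 0
    best = None
    for i, j in combinations(range(len(tensors)), 2):
        result = tensors[i] ^ tensors[j]  # shared indices contracted away
        rest = [t for k, t in enumerate(tensors) if k not in (i, j)]
        total = pair_cost(tensors[i], tensors[j], dims) + \
                min_contraction_cost(rest + [result], dims)
        if best is None or total < best:
            best = total
    return best
```

For the matrix chain A_ij B_jk C_kl with |i| = |k| = 10 and |j| = |l| = 2, contracting B with C first costs 40 + 40 = 80 flop units, whereas starting with A·B costs 400; this is the kind of ordering decision TCE makes automatically.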

  8. The importance of information goods abstraction levels for information commerce process models

    NARCIS (Netherlands)

    Wijnhoven, Alphonsus B.J.M.

    2002-01-01

    A process model, in the context of e-commerce, is an organized set of activities for the creation, (re-)production, trade and delivery of goods. Electronic commerce studies have created important process models for the trade of physical goods via Internet. These models are not easily suitable for

  9. Automated modelling of complex refrigeration cycles through topological structure analysis

    International Nuclear Information System (INIS)

    Belman-Flores, J.M.; Riesco-Avila, J.M.; Gallegos-Munoz, A.; Navarro-Esbri, J.; Aceves, S.M.

    2009-01-01

We have developed a computational method for analysis of refrigeration cycles. The method is well suited for automated analysis of complex refrigeration systems. The refrigerator is specified through a description of flows representing thermodynamic states at system locations; components that modify the thermodynamic state of a flow; and controls that specify flow characteristics at selected points in the diagram. A system of equations is then established for the refrigerator, based on mass, energy and momentum balances for each of the system components. Controls specify the values of certain system variables, thereby reducing the number of unknowns. It is found that the system of equations for the refrigerator may contain a number of redundant or duplicate equations, and therefore further equations are necessary for a full characterization. The number of additional equations is related to the number of loops in the cycle, and this is calculated by a matrix-based topological method. The methodology is demonstrated through an analysis of a two-stage refrigeration cycle.
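
The number of additional equations tied to loops is the graph-theoretic cyclomatic number E − V + C. A minimal sketch, using union-find over an edge list rather than the paper's matrix formulation (equivalent for counting):

```python
def independent_loops(nodes, edges):
    """Count independent loops in a flow diagram as E - V + C,
    where C is the number of connected components."""
    parent = {n: n for n in nodes}

    def find(n):
        while parent[n] != n:
            parent[n] = parent[parent[n]]  # path halving
            n = parent[n]
        return n

    for a, b in edges:
        parent[find(a)] = find(b)
    components = len({find(n) for n in nodes})
    return len(edges) - len(nodes) + components
```

A single-loop vapour-compression cycle (compressor, condenser, valve, evaporator, back to compressor) gives 4 − 4 + 1 = 1 extra equation; adding a branch between two of those nodes raises the count to 2.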

  10. Refinement-Based Student Modeling and Automated Bug Library Construction.

    Science.gov (United States)

    Baffes, Paul; Mooney, Raymond

    1996-01-01

    Discussion of student modeling and intelligent tutoring systems focuses on the development of the ASSERT algorithm (Acquiring Stereotypical Student Errors by Refining Theories). Topics include overlay modeling; bug libraries (databases of student misconceptions); dynamic modeling; refinement-based modeling; and experimental results from tests at…

  11. Beyond modeling abstractions: Learning nouns over developmental time in atypical populations and individuals

    Directory of Open Access Journals (Sweden)

    Clare eSims

    2013-11-01

    Full Text Available Connectionist models that capture developmental change over time have much to offer in the field of language development research. Several models in the literature have made good contact with developmental data, effectively captured behavioral tasks, and accurately represented linguistic input available to young children. However, fewer models of language development have truly captured the process of developmental change over time. In this review paper, we discuss several prominent connectionist models of early word learning, focusing on semantic development, as well as our recent work modeling the emergence of word learning biases in different populations. We also discuss the potential of these kinds of models to capture children’s language development at the individual level. We argue that a modeling approach that truly captures change over time has the potential to inform theory, guide research, and lead to innovations in early language intervention.

  12. Advances in automated noise data acquisition and noise source modeling for power reactors

    International Nuclear Information System (INIS)

    Clapp, N.E. Jr.; Kryter, R.C.; Sweeney, F.J.; Renier, J.A.

    1981-01-01

    A newly expanded program, directed toward achieving a better appreciation of both the strengths and limitations of on-line, noise-based, long-term surveillance programs for nuclear reactors, is described. Initial results in the complementary experimental (acquisition and automated screening of noise signatures) and theoretical (stochastic modeling of likely noise sources) areas of investigation are given

  13. Modeling Multiple Human-Automation Distributed Systems using Network-form Games

    Science.gov (United States)

    Brat, Guillaume

    2012-01-01

    The paper describes at a high-level the network-form game framework (based on Bayes net and game theory), which can be used to model and analyze safety issues in large, distributed, mixed human-automation systems such as NextGen.

  14. Automated Test Assembly for Cognitive Diagnosis Models Using a Genetic Algorithm

    Science.gov (United States)

    Finkelman, Matthew; Kim, Wonsuk; Roussos, Louis A.

    2009-01-01

    Much recent psychometric literature has focused on cognitive diagnosis models (CDMs), a promising class of instruments used to measure the strengths and weaknesses of examinees. This article introduces a genetic algorithm to perform automated test assembly alongside CDMs. The algorithm is flexible in that it can be applied whether the goal is to…
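
A genetic algorithm for test assembly evolves candidate item subsets toward a target attribute-coverage profile. The following is a hedged toy sketch of the general technique, not the authors' algorithm; the fitness function, operators, and parameter values are all assumptions for illustration.

```python
import random

def assemble_test(items, test_len, target, pop=60, gens=100, seed=0):
    """Toy genetic algorithm for automated test assembly: choose
    `test_len` items whose combined skill coverage best matches
    `target` (desired count of items measuring each skill).
    `items[i]` lists the attribute indices item i measures."""
    rng = random.Random(seed)

    def fitness(pick):
        cover = [0] * len(target)
        for i in pick:
            for skill in items[i]:
                cover[skill] += 1
        return -sum(abs(c - t) for c, t in zip(cover, target))

    def crossover(a, b):
        pool = sorted(set(a) | set(b))
        return tuple(sorted(rng.sample(pool, test_len)))

    def mutate(pick):
        pick = list(pick)
        spare = [j for j in range(len(items)) if j not in pick]
        if spare:  # swap one item for an unused one
            pick[rng.randrange(test_len)] = rng.choice(spare)
        return tuple(sorted(pick))

    popn = [tuple(sorted(rng.sample(range(len(items)), test_len)))
            for _ in range(pop)]
    for _ in range(gens):
        popn.sort(key=fitness, reverse=True)
        elite = popn[: pop // 2]  # elitism: the best forms survive
        children = []
        while len(elite) + len(children) < pop:
            child = crossover(*rng.sample(elite, 2))
            if rng.random() < 0.3:
                child = mutate(child)
            children.append(child)
        popn = elite + children
    best = max(popn, key=fitness)
    return best, fitness(best)
```

The appeal of a GA here is flexibility: swapping in a different fitness function retargets the same machinery at a different assembly goal.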

  15. Development and Evaluation of a Model for Modular Automation in Plant Manufacturing

    Directory of Open Access Journals (Sweden)

    Uwe Katzke

    2005-08-01

The benefit of modular concepts in plant automation is viewed ambivalently: on the one hand modularity offers advantages; on the other it imposes requirements on the system structure as well as on the designer's discipline. The main reasons to use modularity in systems design for industrial automation applications are reusability and reduction of complexity, but up to now modular concepts remain rare in plant automation. This paper analyses the reasons and proposes measures and solution concepts. An analysis of the workflows and working results of companies in several branches reveals different notions of modularity. These different notions from production and process engineering are integrated into one model and represent different perspectives of an integrated system.

  16. Initial Assessment and Modeling Framework Development for Automated Mobility Districts: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Hou, Yi [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Young, Stanley E [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Garikapati, Venu [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Chen, Yuche [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Zhu, Lei [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2018-02-07

Automated vehicles (AVs) are increasingly being discussed as the basis for on-demand mobility services, introducing a new paradigm in which a fleet of AVs displaces private automobiles for day-to-day travel in dense activity districts. This paper examines a concept to displace privately owned automobiles within a region containing dense activity generators (jobs, retail, entertainment, etc.), referred to as an automated mobility district (AMD). This paper reviews several such districts, including airports, college campuses, business parks, downtown urban cores, and military bases, with examples of previous attempts to meet the mobility needs apart from private automobiles, some with automated technology and others with more traditional transit-based solutions. The issues and benefits of AMDs are framed within the perspective of intra-district, inter-district, and border issues, and the requirements for a modeling framework are identified to adequately reflect the breadth of mobility, energy, and emissions impact anticipated with AMDs.

  17. Design Studies Suggested by an Abstract Model for a Medical Information System

    Science.gov (United States)

    Cox, J. R.; Kimura, T. D.; Moore, P.; Gillett, W.; Stucki, M. J.

    1980-01-01

    We have developed a formal model of a database system that is unusual in that it has the ability to represent information about its own structure and to insure semantic consistency. The model distinguishes general laws from instances of events and objects, but many of its mechanisms serve both categories of information. The model forms a substrate upon which an information structure appropriate to neonatology is being developed. Some example queries are shown and a design study for an associative memory suggested by the model is described briefly.

  18. Automated evolutionary restructuring of workflows to minimise errors via stochastic model checking

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter

    2014-01-01

This paper presents a framework for the automated restructuring of workflows that allows one to minimise the impact of errors on a production workflow. The framework allows for the modelling of workflows by means of a formalised subset of the Business Process Modelling and Notation (BPMN) language, a well-established visual language for modelling workflows in a business context. The framework’s modelling language is extended to include the tracking of real-valued quantities associated with the process (such as time, cost, temperature). In addition, this language also allows for an intention … by means of a case study from the food industry. Through this case study we explore the extent to which the risk of production faults can be reduced and the impact of these can be minimised, primarily through restructuring of the production workflows. This approach is fully automated and only the modelling...

  19. Information Exchange in Global Logistics Chains : An application for Model-based Auditing (abstract)

    NARCIS (Netherlands)

    Veenstra, A.W.; Hulstijn, J.; Christiaanse, R.; Tan, Y.

    2013-01-01

    An integrated data pipeline has been proposed to meet requirements for supply chain visibility and control. How can data integration be used for risk assessment, monitoring and control in global supply chains? We argue that concepts from model-based auditing can be used to model the ‘ideal’ flow of

  20. The Use of AMET and Automated Scripts for Model Evaluation

    Science.gov (United States)

    The Atmospheric Model Evaluation Tool (AMET) is a suite of software designed to facilitate the analysis and evaluation of meteorological and air quality models. AMET matches the model output for particular locations to the corresponding observed values from one or more networks ...

  1. Automated Bayesian model development for frequency detection in biological time series

    Directory of Open Access Journals (Sweden)

    Oldroyd Giles ED

    2011-06-01

Background: A first step in building a mathematical model of a biological system is often the analysis of the temporal behaviour of key quantities. Mathematical relationships between the time and frequency domain, such as Fourier Transforms and wavelets, are commonly used to extract information about the underlying signal from a given time series. This one-to-one mapping from time points to frequencies inherently assumes that both domains contain the complete knowledge of the system. However, for truncated, noisy time series with background trends this unique mapping breaks down and the question reduces to an inference problem of identifying the most probable frequencies. Results: In this paper we build on the method of Bayesian Spectrum Analysis and demonstrate its advantages over conventional methods by applying it to a number of test cases, including two types of biological time series. Firstly, oscillations of calcium in plant root cells in response to microbial symbionts are non-stationary and noisy, posing challenges to data analysis. Secondly, circadian rhythms in gene expression measured over only two cycles highlight the problem of time series with limited length. The results show that the Bayesian frequency detection approach can provide useful results in specific areas where Fourier analysis can be uninformative or misleading. We demonstrate further benefits of the Bayesian approach for time series analysis, such as direct comparison of different hypotheses, inherent estimation of noise levels and parameter precision, and a flexible framework for modelling the data without pre-processing. Conclusions: Modelling in systems biology often builds on the study of time-dependent phenomena. Fourier Transforms are a convenient tool for analysing the frequency domain of time series. However, there are well-known limitations of this method, such as the introduction of spurious frequencies when handling short and noisy time series, and
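
The advantage over a raw Fourier transform can be sketched with a simplified single-frequency model: for each candidate frequency, fit a sinusoid plus a background trend and score the residual. This is a stand-in for full Bayesian Spectrum Analysis (amplitudes profiled out by least squares under a Jeffreys-style prior), not the authors' code.

```python
import numpy as np

def bayes_freq(t, y, freqs):
    """Posterior over candidate frequencies for one sinusoid plus a
    linear background trend; amplitudes are marginalised via least
    squares, so short or trended series are handled gracefully."""
    log_post = []
    for f in freqs:
        # design matrix: sinusoid at f, plus constant and linear trend
        X = np.column_stack([np.cos(2 * np.pi * f * t),
                             np.sin(2 * np.pi * f * t),
                             np.ones_like(t), t])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        rss = float(np.sum((y - X @ beta) ** 2))
        # marginal log-likelihood ~ -(N - k)/2 * log(rss)
        log_post.append(-0.5 * (len(t) - X.shape[1]) * np.log(rss))
    log_post = np.asarray(log_post)
    return float(freqs[int(np.argmax(log_post))]), log_post
```

Because the trend columns absorb background drift, a slow ramp in the data cannot masquerade as a low-frequency oscillation the way it can in a plain periodogram.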

  2. Lateral-Directional Parameter Estimation on the X-48B Aircraft Using an Abstracted, Multi-Objective Effector Model

    Science.gov (United States)

    Ratnayake, Nalin A.; Waggoner, Erin R.; Taylor, Brian R.

    2011-01-01

The problem of parameter estimation on hybrid-wing-body aircraft is complicated by the fact that many design candidates for such aircraft involve a large number of aerodynamic control effectors that act in coplanar motion. This adds to the complexity already present in the parameter estimation problem for any aircraft with a closed-loop control system. Decorrelation of flight and simulation data must be performed in order to ascertain individual surface derivatives with any sort of mathematical confidence. Non-standard control surface configurations, such as clamshell surfaces and drag-rudder modes, further complicate the modeling task. In this paper, time-decorrelation techniques are applied to a model structure selected through stepwise regression for simulated and flight-generated lateral-directional parameter estimation data. A virtual effector model that uses mathematical abstractions to describe the multi-axis effects of clamshell surfaces is developed and applied. Comparisons are made between time history reconstructions and observed data in order to assess the accuracy of the regression model. The Cramér-Rao lower bounds of the estimated parameters are used to assess the uncertainty of the regression model relative to alternative models. Stepwise regression was found to be a useful technique for lateral-directional model design for hybrid-wing-body aircraft, as suggested by available flight data. Based on the results of this study, linear regression parameter estimation methods using abstracted effectors are expected to perform well for hybrid-wing-body aircraft properly equipped for the task.

  3. Measurements of the thickness of model sea ice by UHF waves (abstract)

    OpenAIRE

    Takashima,Hayao; Yamakoshi,Hisao; Maeda,Toshio; Sakurai,Akio

    1993-01-01

It is indispensable to know the dielectric constant of model sea ice in order to detect the ice thickness by radar. The authors measured the dielectric constant of model sea ice by the space reflection method using UHF waves. A UHF signal is swept from 200 MHz to 1000 MHz and is transmitted from an antenna toward the model sea ice set on a metal sheet. The transmitting antenna is a conical log spiral antenna for a right circular polarized wave. The receiving antenna is an inverse type antenna s...

  4. Abstract machine based execution model for computer architecture design and efficient implementation of logic programs in parallel

    Energy Technology Data Exchange (ETDEWEB)

    Hermenegildo, M.V.

    1986-01-01

    The term Logic Programming refers to a variety of computer languages and execution models based on the traditional concept of Symbolic Logic. The expressive power of these languages offers promise to be of great assistance in facing the programming challenges of present and future symbolic processing applications in artificial intelligence, knowledge-based systems, and many other areas of computing. This dissertation presents an efficient parallel execution model for logic programs. The model is described from the source language level down to an Abstract Machine level, suitable for direct implementation on existing parallel systems or for the design of special purpose parallel architectures. Few assumptions are made at the source language level and, therefore, the techniques developed and the general Abstract Machine design are applicable to a variety of logic (and also functional) languages. These techniques offer efficient solutions to several areas of parallel Logic Programming implementation previously considered problematic or a source of considerable overhead, such as the detection and handling of variable binding conflicts in AND-parallelism, the specification of control and management of the execution tree, the treatment of distributed backtracking, and goal scheduling and memory management issues, etc. A parallel Abstract Machine design is offered, specifying data areas, operation, and a suitable instruction set.

  5. Generic process model structures: towards a standard notation for abstract representations

    CSIR Research Space (South Africa)

    Van Der Merwe, A

    2007-10-01

The identification of process model structures is usually complex and costly. If these structures can be reused across boundaries, this could not only benefit the internal structure of one application domain, but could also benefit organizations...

  6. Collaboration and abstract representations: towards predictive models based on raw speech and eye-tracking data

    OpenAIRE

    Nüssli, Marc-Antoine; Jermann, Patrick; Sangin, Mirweis; Dillenbourg, Pierre

    2009-01-01

    This study aims to explore the possibility of using machine learning techniques to build predictive models of performance in collaborative induction tasks. More specifically, we explored how signal-level data, like eye-gaze data and raw speech may be used to build such models. The results show that such low level features have effectively some potential to predict performance in such tasks. Implications for future applications design are shortly discussed.

  7. Thermomechanical Modeling of Sintered Silver - A Fracture Mechanics-based Approach: Extended Abstract: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Paret, Paul P [National Renewable Energy Laboratory (NREL), Golden, CO (United States); DeVoto, Douglas J [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Narumanchi, Sreekant V [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-09-01

Sintered silver has proven to be a promising candidate for use as a die-attach and substrate-attach material in automotive power electronics components. It holds promise of greater reliability than lead-based and lead-free solders, especially at higher temperatures (greater than 200 degrees Celsius). Accurate predictive lifetime models of sintered silver need to be developed and its failure mechanisms thoroughly characterized before it can be deployed as a die-attach or substrate-attach material in wide-bandgap device-based packages. We present a finite element method (FEM) modeling methodology that can offer greater accuracy in predicting the failure of sintered silver under accelerated thermal cycling. A fracture mechanics-based approach is adopted in the FEM model, and J-integral/thermal cycle values are computed. In this paper, we outline the procedures for obtaining the J-integral/thermal cycle values in a computational model and report on the possible advantage of using these values as modeling parameters in a predictive lifetime model.

  8. Automated generation of compact models for fluidic microsystems

    Science.gov (United States)

    Turowski, Marek; Chen, Zhijian; Przekwas, Andrzej J.

    2000-04-01

Simulation and design of microfluidic systems require models at various levels: high-fidelity models for design and optimization of particular elements and devices, as well as system-level models allowing for VLSI-scale simulation of such systems. For the latter purpose, reduced or compact models are necessary to make such system simulations computationally feasible. In this paper, we present a design methodology and practical approach for generation of compact models of microfluidic elements. In this procedure we use high-fidelity 3D simulations of the microfluidic devices to extract their characteristics for compact models and, subsequently, to validate the compact model behavior in various regimes of operation. The compact models are generated automatically in formats that can be directly used in SPICE or SABER. As an example of a nonlinear fluidic device, the generation of a compact model for the 'Tesla valve' is described in detail. The Tesla valve is one of the no-moving-parts valves used in micropumps in MEMS. Its principle of operation is based on rectification of the fluid flow, so it may be considered a 'fluidic diode'.
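
The extraction step, fitting a compact model to characteristics sampled from high-fidelity simulation, can be sketched as follows. The quadratic resistance form and the synthetic samples are assumptions for illustration; the paper's actual model forms are not reproduced here.

```python
import numpy as np

def extract_compact_model(q, dp):
    """Fit a direction-dependent quadratic resistance model
    dp = r*q + k*q*|q| separately for forward and reverse flow,
    capturing the Tesla valve's 'fluidic diode' asymmetry.
    q: flow-rate samples; dp: corresponding pressure drops
    (e.g. extracted from 3D CFD runs)."""
    models = {}
    for name, mask in (("forward", q > 0), ("reverse", q < 0)):
        X = np.column_stack([q[mask], q[mask] * np.abs(q[mask])])
        (r, k), *_ = np.linalg.lstsq(X, dp[mask], rcond=None)
        models[name] = (float(r), float(k))
    return models
```

The fitted (r, k) pairs per direction could then be emitted into a SPICE- or SABER-style netlist template as a nonlinear controlled source, which is the spirit of the automated generation described above.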

  9. Implementing the WebSocket Protocol Based on Formal Modelling and Automated Code Generation

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge; Kristensen, Lars Michael

    2014-01-01

    Model-based software engineering offers several attractive benefits for the implementation of protocols, including automated code generation for different platforms from design-level models. In earlier work, we have proposed a template-based approach using Coloured Petri Net formal models with pragmatic annotations for automated code generation of protocol software. The contribution of this paper is an application of the approach as implemented in the PetriCode tool to obtain protocol software implementing the IETF WebSocket protocol. This demonstrates the scalability of our approach to real protocols. Furthermore, we perform formal verification of the CPN model prior to code generation, and test the implementation for interoperability against the Autobahn WebSocket test-suite, resulting in 97% and 99% success rates for the client and server implementation, respectively. The tests show…

  10. Automated search-model discovery and preparation for structure solution by molecular replacement.

    Science.gov (United States)

    Keegan, Ronan M; Winn, Martyn D

    2007-04-01

    A novel automation pipeline for macromolecular structure solution by molecular replacement is described. There is a special emphasis on the discovery and preparation of a large number of search models, all of which can be passed to the core molecular-replacement programs. For routine molecular-replacement problems, the pipeline automates what a crystallographer might do and its value is simply one of convenience. For more difficult cases, the pipeline aims to discover the particular template structure and model edits required to produce a viable search model and may succeed in finding an efficacious combination that would be missed otherwise. The pipeline is described in detail and a number of examples are given. The examples are chosen to illustrate successes in real crystallography problems and also particular features of the pipeline. It is concluded that exploring a range of search models automatically can be valuable in many cases.
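The discovery stage can be caricatured in a few lines. This is a toy stand-in, not the pipeline's actual logic: the template names and identity scores below are invented, and the real pipeline queries sequence databases and edits the template structures rather than merely ranking them.

```python
def rank_search_models(templates, min_identity=0.25):
    """Rank candidate molecular-replacement search models by sequence
    identity to the target, discarding candidates below a cut-off.
    'templates' maps a template name to its fractional sequence identity."""
    kept = [(ident, name) for name, ident in templates.items()
            if ident >= min_identity]
    return [name for ident, name in sorted(kept, reverse=True)]

# Hypothetical candidates with made-up identity scores.
templates = {"1abc": 0.62, "2xyz": 0.31, "3pqr": 0.18}
order = rank_search_models(templates)
```

Everything surviving the cut-off would then be passed, in order, to the core molecular-replacement programs.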

  11. Modelling Venting and Pressure Build-up in a 18650 LCO Cell during Thermal Runaway (ABSTRACT)

    DEFF Research Database (Denmark)

    Coman, Paul Tiberiu; Veje, Christian; White, Ralph

    Li-ion batteries are a very popular type of electric storage device that possess high energy density when compared to other battery chemistries. Due to this property, when operating under abusive conditions such as high ambient temperature, the batteries can experience thermal runaway, which may lead to fires and explosions. To prevent this, it is therefore important to model thermal runaway considering different events such as venting and the pressure development inside the battery cell, which is the main purpose of this paper. A model consisting of the different decomposition reactions in the anode, cathode and SEI, as well as electrochemical reactions and boiling of the electrolyte, is developed for a cylindrical 18650 LCO (Lithium Cobalt Oxide) cell. For determining the pressure and the temperature after venting, the isentropic flow equations are included in the model…

  12. Integration of Diagnostic Microbiology in a Model of Total Laboratory Automation.

    Science.gov (United States)

    Da Rin, Giorgio; Zoppelletto, Maira; Lippi, Giuseppe

    2016-02-01

    Although automation has become widely utilized in certain areas of diagnostic testing, its adoption in diagnostic microbiology has proceeded much more slowly. To describe our real-world experience of integrating an automated instrument for diagnostic microbiology (Walk-Away Specimen Processor, WASPLab) within a model of total laboratory automation (TLA). The implementation process was divided into 2 phases. The former period, lasting approximately 6 weeks, entailed the installation of the WASPLab processor to operate as a stand-alone instrument, whereas the latter, lasting approximately 2 weeks, involved physical connection of the WASPLab with the automation. Using the WASPLab instrument in conjunction with the TLA model, we obtained a time savings equivalent to the work of 1.2 full-time laboratory technicians for diagnostic microbiology. The connection of WASPLab to TLA allowed its management by a generalist or clinical chemistry technician, with no need for microbiology skills on the part of either worker. Hence, diagnostic microbiology could be performed by the staff already using the TLA, extending their activities to include processing urgent clinical chemistry and hematology specimens. The time to result was also substantially improved. According to our experience, using the WASPLab instrument as part of a TLA in diagnostic microbiology holds great promise for optimizing laboratory workflow and improving the quality of testing.

  13. Interval Abstraction Refinement for Model Checking of Timed-Arc Petri Nets

    DEFF Research Database (Denmark)

    Viesmose, Sine Lyhne; Jacobsen, Thomas Stig; Jensen, Jacob Jon

    2014-01-01

    …can be considerably faster but it does not in general guarantee conclusive answers. We implement the algorithms within the open-source model checker TAPAAL and demonstrate on a number of experiments that our approximation techniques often result in a significant speed-up of the verification.

  14. Automating an integrated spatial data-mining model for landfill site selection

    Science.gov (United States)

    Abujayyab, Sohaib K. M.; Ahamad, Mohd Sanusi S.; Yahya, Ahmad Shukri; Ahmad, Siti Zubaidah; Aziz, Hamidi Abdul

    2017-10-01

    An integrated programming environment represents a robust approach to building a valid model for landfill site selection. One of the main challenges in an integrated model is the complicated processing and modelling across the programming stages, along with several limitations. An automation process helps avoid these limitations and improves the interoperability between the integrated programming environments. This work targets the automation of a spatial data-mining model for landfill site selection by integrating a spatial programming environment (Python-ArcGIS) with a non-spatial environment (MATLAB). The model was constructed using neural networks and is divided into nine stages distributed between MATLAB and Python-ArcGIS. A case study was taken from the northern part of Peninsular Malaysia. 22 criteria were selected as input data and used to build the training and testing datasets. The outcomes show a high performance accuracy of 98.2% on the testing dataset using 10-fold cross-validation. The automated spatial data-mining model provides a solid platform for decision makers to perform landfill site selection and planning operations on a regional scale.
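The 10-fold cross-validation quoted above can be sketched without any GIS machinery. The split logic below (balanced fold sizes, each sample tested exactly once) is a minimal, generic stand-in for the evaluation stage, not the authors' MATLAB/ArcGIS code.

```python
def k_fold_indices(n, k=10):
    """Yield (train, test) index lists for k-fold cross-validation.
    The first n % k folds get one extra sample so fold sizes stay balanced."""
    sizes = [n // k + (1 if i < n % k else 0) for i in range(k)]
    start = 0
    for size in sizes:
        test = list(range(start, start + size))
        train = list(range(0, start)) + list(range(start + size, n))
        yield train, test
        start += size

# 25 samples split into 10 folds: five folds of 3 and five folds of 2.
folds = list(k_fold_indices(25, k=10))
```

Each fold's network would be trained on `train` and scored on `test`; the 98.2% figure is the mean test accuracy over all ten folds.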

  15. The Integration and Abstraction of EBS Models in Yucca Mountain Performance Assessment

    International Nuclear Information System (INIS)

    S.D. Sevougian; V. Jain; A.V. Luik

    2006-01-01

    The safety strategy for geological disposal of radioactive waste at Yucca Mountain relies on a multi-barrier system to contain the waste and isolate it from the biosphere. The multi-barrier system consists of the natural barrier provided by the geological setting and the engineered barrier system (EBS). In the case of Yucca Mountain (YM) the geologic setting is the unsaturated-zone host rock, consisting of about 600 meters of layered ash-flow volcanic tuffs above the water table, and the saturated zone beneath the water table. Both the unsaturated and saturated rocks are part of a closed hydrologic basin in a desert surface environment. The waste is to be buried about halfway between the desert surface and the water table. The primary engineered barriers at YM consist of metal components that are highly durable in an oxidizing environment. The two primary components of the engineered barrier system are highly corrosion-resistant metal waste packages, made from a nickel-chromium-molybdenum alloy, Alloy 22, and titanium drip shields that protect the waste packages from corrosive dripping water and falling rocks. Design and performance assessment of the EBS requires models that describe how the EBS and near field behave under anticipated repository-relevant conditions. These models must describe coupled hydrologic, thermal, chemical, and mechanical (THCM) processes that drive radionuclide transport in a highly fractured host rock, consisting of a relatively permeable network of conductive fractures in a setting of highly impermeable tuff rock matrix. 
An integrated performance assessment of the EBS must include a quantification of the uncertainties that arise from (1) incomplete understanding of processes and (2) lack of data representative of the large spatial scales and long time scales relevant to radioactive waste disposal (e.g., long-term metal corrosion rates and heterogeneities in rock properties over the large 5 km² emplacement area of the repository).

  16. Radionuclide Transport Modelling: Current Status and Future Needs. Synthesis, Work Group Reports and Extended Abstracts

    International Nuclear Information System (INIS)

    2002-06-01

    The workshop identified a set of critical issues for the Swedish Nuclear Power Inspectorate (SKI) and the Swedish Radiation Protection Authority (SSI) to address in preparing for future reviews of license applications, which have subsequently been considered in preparing this synthesis. Structure for organising expert participation: A structure for organising expert participation in future reviews is proposed based on clearinghouses for (1) regulatory application and context, (2) engineered barrier systems, (3) geosphere, (4) biosphere, and (5) performance assessment integration and calculations. As part of their work, these clearinghouses could identify key issues that need to be resolved prior to future reviews. Performance assessment strategy and review context: Future reviews will be conducted in the context of regulations based on risk criteria; this leads to a need to review the methods used in probabilistic risk assessment, as well as the underlying process models. A plan is needed for accomplishing both aims. Despite the probabilistic framework, a need is anticipated for targeted, deterministic calculations to check particular assumptions. Priorities and ambition level for reviews: SKI's and SSI's resources can be more efficiently utilised by an early review of SKB's safety case, so that if necessary the authorities can make an early start on evaluating topics that are of primary significance to the safety case. As a guide to planning for allocation of effort in future reviews, this workshop produced a preliminary ranking of technical issues on a scale from 'non-controversial' to 'requiring independent modelling'. Analysis of repository system and scenarios: Systems analysis tools including features/events/processes encyclopaedias, process-influence diagrams, and assessment-model flowcharts should be used as review tools, to check the processes and influences considered in SKB's analyses, and to evaluate the comprehensiveness of the scenarios that are…

  17. Automation of Safety Analysis with SysML Models Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This project was a small proof-of-concept case study, generating SysML model information as a side effect of safety analysis. A prototype FMEA Assistant was...

  18. Automation of Safety Analysis with SysML Models

    Data.gov (United States)

    National Aeronautics and Space Administration — To provide economical, reliable and safe access to space, design weaknesses should be identified earlier in the engineering life cycle, using model-based systems...

  19. Automated mask creation from a 3D model using Faethm.

    Energy Technology Data Exchange (ETDEWEB)

    Schiek, Richard Louis; Schmidt, Rodney Cannon

    2007-11-01

    We have developed and implemented a method which, given a three-dimensional object, can infer from its topology the two-dimensional masks needed to produce that object with surface micro-machining. The masks produced by this design tool can be generic, process-independent masks or, if given process constraints, masks specific to a target process. This design tool calculates the two-dimensional mask set required to produce a given three-dimensional model by investigating the vertical topology of the model.
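The core idea (inspecting vertical topology to derive masks) can be caricatured with a voxel model: each structural layer's mask is the 2D footprint of the material inside that vertical slab. The sketch below uses an invented 'T'-shaped part and is not the Faethm algorithm itself, which works on solid-model topology rather than voxels.

```python
import numpy as np

def masks_from_voxels(model, layers):
    """Given a 3D boolean voxel model indexed (z, y, x) and a list of
    (z_start, z_stop) slabs, return one 2D mask per structural layer:
    the footprint of any material inside that slab."""
    return [model[z0:z1].any(axis=0) for z0, z1 in layers]

# Invented example: a wide cap sitting on a narrow stem.
model = np.zeros((4, 5, 5), dtype=bool)
model[0:2, 2, 2] = True        # stem fills the two lower levels
model[2:4, 1:4, 1:4] = True    # cap fills the two upper levels
masks = masks_from_voxels(model, [(0, 2), (2, 4)])
```

The first mask is a single pixel (the stem), the second a 3x3 square (the cap); a real tool would additionally translate footprints into deposition and etch steps for the target process.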

  20. Poster Abstract: A Practical Model for Human-Smart Appliances Interaction

    DEFF Research Database (Denmark)

    Fürst, Jonathan; Fruergaard, Andreas; Johannesen, Marco Høvinghof

    2016-01-01

    Buildings are increasingly equipped with smart appliances that allow a fine-grained adaption to personal comfort requirements. Such comfort adaption should be based on a human-feedback loop and not on a centralized comfort model. We argue that this feedback loop should be achieved through local interaction with smart appliances. Two issues stand out: (1) How to impose logical locality when interacting with a smart appliance? (2) How to mediate conflicts between several persons in a room, or between building-wide policies and user preferences? We approach both problems by defining a general model for human-smart appliance interaction. We present a prototype implementation with an off-the-shelf smart lighting and heating system in a shared office space. Our approach minimizes the need for location metadata. It relies on a human-feedback loop (both sensor based and manual) to identify the optimal…

  1. Neighbourhood Abstraction in GROOVE

    NARCIS (Netherlands)

    Rensink, Arend; Zambon, Eduardo; De Lara, J.; Varro, D.

    2011-01-01

    Important classes of graph grammars have infinite state spaces and therefore cannot be verified with traditional model checking techniques. One way to address this problem is to perform graph abstraction, which allows us to generate a finite abstract state space that over-approximates the original

  2. Driver-centred vehicle automation: using network analysis for agent-based modelling of the driver in highly automated driving systems.

    Science.gov (United States)

    Banks, Victoria A; Stanton, Neville A

    2016-11-01

    To the average driver, the concept of automation in driving implies that they can become completely 'hands and feet free'. This, however, is a common misconception, as has been shown through the application of Network Analysis to new Cruise Assist technologies that may feature on our roads by 2020. Through the adoption of a Systems Theoretic approach, this paper introduces the concept of driver-initiated automation, which reflects the role of the driver in highly automated driving systems. Using a combination of traditional task analysis and the application of quantitative network metrics, this agent-based modelling paper shows how the role of the driver remains an integral part of the driving system, indicating the need for designers to ensure drivers are provided with the tools necessary to remain actively in-the-loop despite being given increasing opportunities to delegate their control to the automated subsystems. Practitioner Summary: This paper describes and analyses a driver-initiated command and control system of automation using representations afforded by task and social networks to understand how drivers remain actively involved in the task. A network analysis of different driver commands suggests that such a strategy does maintain the driver in the control loop.
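Degree centrality is one of the simplest of the quantitative network metrics referred to above. The toy network below uses invented node names, not the paper's task network; it merely shows the kind of computation by which the driver emerges as the most connected agent.

```python
def degree_centrality(edges):
    """Degree centrality of each node in an undirected network given as
    (a, b) edge pairs: degree divided by the maximum possible degree n - 1."""
    nodes = {n for edge in edges for n in edge}
    degree = {n: 0 for n in nodes}
    for a, b in edges:
        degree[a] += 1
        degree[b] += 1
    n = len(nodes)
    return {node: d / (n - 1) for node, d in degree.items()}

# Invented driver-automation network: the driver links to every subsystem.
edges = [("driver", "cruise"), ("driver", "steering"),
         ("cruise", "steering"), ("driver", "display")]
centrality = degree_centrality(edges)
```

A centrality of 1.0 for the driver, against lower values for the subsystems, is the network-level signature of the driver remaining in the control loop.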

  3. Bim Automation: Advanced Modeling Generative Process for Complex Structures

    Science.gov (United States)

    Banfi, F.; Fai, S.; Brumana, R.

    2017-08-01

    The new paradigm of the complexity of modern and historic structures, which are characterised by complex forms and morphological and typological variables, is one of the greatest challenges for building information modelling (BIM). The generation of complex parametric models requires new scientific knowledge concerning new digital technologies. These elements are helpful for storing a vast quantity of information during the life cycle of buildings (LCB). The latest developments of parametric applications do not provide advanced tools, resulting in time-consuming work for the generation of models. This paper presents a method capable of processing and creating complex parametric Building Information Models (BIM) with Non-Uniform Rational Basis Splines (NURBS) at multiple levels of detail (Mixed and ReverseLoD), based on accurate 3D photogrammetric and laser scanning surveys. Complex 3D elements are converted into parametric BIM software and finite element applications (BIM to FEA) using specific exchange formats and new modelling tools. The proposed approach has been applied to different case studies: the BIM of a modern structure for the courtyard of the West Block on Parliament Hill in Ottawa (Ontario) and the BIM of Masegra Castel in Sondrio (Italy), encouraging the dissemination and interaction of scientific results without losing information during the generative process.

  4. LOFT test support branch data abstract report: one-sixth scale model BWR jet pump test

    International Nuclear Information System (INIS)

    Crapo, H.S.

    1979-01-01

    Pump performance data are presented for a 1/6 scale model jet pump in tests conducted at the LOFT Test Support Blowdown Facility. Steady-state subcooled pump characterization tests were performed over a wide range of forward and reverse flow conditions, both at room temperature and at elevated temperature (555 K). Blowdown tests were also performed to obtain two-phase performance data in configurations simulating the flow patterns in the intact and broken loops of a BWR during a recirculation line break transient.

  5. Learning Methods for Dynamic Topic Modeling in Automated Behavior Analysis.

    Science.gov (United States)

    Isupova, Olga; Kuzin, Danil; Mihaylova, Lyudmila

    2017-09-27

    Semisupervised and unsupervised systems provide operators with invaluable support and can tremendously reduce the operators' load. In the light of the necessity to process large volumes of video data and provide autonomous decisions, this paper proposes new learning algorithms for activity analysis in video. The activities and behaviors are described by a dynamic topic model. Two novel learning algorithms based on the expectation maximization approach and variational Bayes inference are proposed. Theoretical derivations of the posterior estimates of model parameters are given. The designed learning algorithms are compared with the Gibbs sampling inference scheme introduced earlier in the literature. A detailed comparison of the learning algorithms is presented on real video data. We also propose an anomaly localization procedure, elegantly embedded in the topic modeling framework. It is shown that the developed learning algorithms can achieve 95% success rate. The proposed framework can be applied to a number of areas, including transportation systems, security, and surveillance.

  6. Fast and Accurate Circuit Design Automation through Hierarchical Model Switching.

    Science.gov (United States)

    Huynh, Linh; Tagkopoulos, Ilias

    2015-08-21

    In computer-aided biological design, the trifecta of characterized part libraries, accurate models and optimal design parameters is crucial for producing reliable designs. As the number of parts and model complexity increase, however, it becomes exponentially more difficult for any optimization method to search the solution space, hence creating a trade-off that hampers efficient design. To address this issue, we present a hierarchical computer-aided design architecture that uses a two-step approach for biological design. First, a simple model of low computational complexity is used to predict circuit behavior and assess candidate circuit branches through branch-and-bound methods. Then, a complex, nonlinear circuit model is used for a fine-grained search of the reduced solution space, thus achieving more accurate results. Evaluation with a benchmark of 11 circuits and a library of 102 experimental designs with known characterization parameters demonstrates a speed-up of 3 orders of magnitude when compared to other design methods that provide optimality guarantees.
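The two-step coarse/fine pattern is generic enough to sketch in a few lines. The scoring functions below are hypothetical quadratics, not the kinetic circuit models the paper uses; the point is only the control flow: shortlist with the cheap model, then spend the expensive model only on the survivors.

```python
def two_step_search(candidates, coarse, fine, keep=3):
    """Hierarchical model switching: shortlist candidates with a cheap
    coarse score, then evaluate only the shortlist with the expensive
    fine model and return the best design."""
    shortlist = sorted(candidates, key=coarse, reverse=True)[:keep]
    return max(shortlist, key=fine)

candidates = range(20)
coarse = lambda x: -(x - 11) ** 2   # cheap approximation of the objective
fine = lambda x: -(x - 12) ** 2     # expensive "true" objective
best = two_step_search(candidates, coarse, fine)
```

Because the coarse model only has to rank well enough to keep the true optimum in the shortlist, the fine model still recovers the correct answer even though the coarse-only winner differs.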

  7. Automated economic analysis model for hazardous waste minimization

    International Nuclear Information System (INIS)

    Dharmavaram, S.; Mount, J.B.; Donahue, B.A.

    1990-01-01

    The US Army has established a policy of achieving a 50 percent reduction in hazardous waste generation by the end of 1992. To assist the Army in reaching this goal, the Environmental Division of the US Army Construction Engineering Research Laboratory (USACERL) designed the Economic Analysis Model for Hazardous Waste Minimization (EAHWM). The EAHWM was designed to allow the user to evaluate the life cycle costs for various techniques used in hazardous waste minimization and to compare them to the life cycle costs of current operating practices. The program was developed in C language on an IBM compatible PC and is consistent with other pertinent models for performing economic analyses. The potential hierarchical minimization categories used in EAHWM include source reduction, recovery and/or reuse, and treatment. Although treatment is no longer an acceptable minimization option, its use is widespread and has therefore been addressed in the model. The model allows for economic analysis for minimization of the Army's six most important hazardous waste streams. These include solvents, paint-stripping wastes, metal-plating wastes, industrial waste sludges, used oils, and batteries and battery electrolytes. The EAHWM also includes a general application which can be used to calculate and compare the life cycle costs for minimization alternatives of any waste stream, hazardous or non-hazardous. The EAHWM has been fully tested and implemented in more than 60 Army installations in the United States.
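A life-cycle cost comparison of the kind EAHWM automates reduces to discounted cash flows. The sketch below uses made-up costs and a made-up discount rate, not values from the model; it only illustrates comparing current practice against a minimization alternative on a present-value basis.

```python
def life_cycle_cost(initial, annual, years, rate):
    """Present-value life-cycle cost: up-front outlay plus a constant
    annual cost discounted over the analysis period at the given rate."""
    pv_annuity = (1 - (1 + rate) ** -years) / rate
    return initial + annual * pv_annuity

# Made-up comparison: current disposal practice vs. a solvent-recovery
# alternative with higher capital cost but lower annual cost.
current = life_cycle_cost(initial=0.0, annual=12000.0, years=10, rate=0.05)
recovery = life_cycle_cost(initial=30000.0, annual=5000.0, years=10, rate=0.05)
```

With these illustrative numbers the recovery option wins on life-cycle cost despite its capital outlay, which is exactly the kind of trade-off the tool surfaces.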

  8. Microscopic Models and Network Transformations for Automated Railway Traffic Planning

    NARCIS (Netherlands)

    Besinovic, Nikola; Goverde, R.M.P.; Quaglietta, Egidio

    2016-01-01

    This article tackles the real-world planning problem of railway operations. Improving the timetable planning process will result in more reliable product plans and a higher quality of service for passengers and freight operators. We focus on the microscopic models for computing accurate track

  9. Automated model-based quantitative analysis of phantoms with spherical inserts in FDG PET scans.

    Science.gov (United States)

    Ulrich, Ethan J; Sunderland, John J; Smith, Brian J; Mohiuddin, Imran; Parkhurst, Jessica; Plichta, Kristin A; Buatti, John M; Beichel, Reinhard R

    2018-01-01

    Quality control plays an increasingly important role in quantitative PET imaging and is typically performed using phantoms. The purpose of this work was to develop and validate a fully automated analysis method for two common PET/CT quality assurance phantoms: the NEMA NU-2 IQ and SNMMI/CTN oncology phantom. The algorithm was designed to only utilize the PET scan to enable the analysis of phantoms with thin-walled inserts. We introduce a model-based method for automated analysis of phantoms with spherical inserts. Models are first constructed for each type of phantom to be analyzed. A robust insert detection algorithm uses the model to locate all inserts inside the phantom. First, candidates for inserts are detected using a scale-space detection approach. Second, candidates are given an initial label using a score-based optimization algorithm. Third, a robust model fitting step aligns the phantom model to the initial labeling and fixes incorrect labels. Finally, the detected insert locations are refined and measurements are taken for each insert and several background regions. In addition, an approach for automated selection of NEMA and CTN phantom models is presented. The method was evaluated on a diverse set of 15 NEMA and 20 CTN phantom PET/CT scans. NEMA phantoms were filled with radioactive tracer solution at 9.7:1 activity ratio over background, and CTN phantoms were filled with 4:1 and 2:1 activity ratio over background. For quantitative evaluation, an independent reference standard was generated by two experts using PET/CT scans of the phantoms. In addition, the automated approach was compared against manual analysis, which represents the current clinical standard approach, of the PET phantom scans by four experts. The automated analysis method successfully detected and measured all inserts in all test phantom scans. It is a deterministic algorithm (zero variability), and the insert detection RMS error (i.e., bias) was 0.97, 1.12, and 1.48 mm for phantom

  10. A cellular automation model accounting for bicycle's group behavior

    Science.gov (United States)

    Tang, Tie-Qiao; Rui, Ying-Xu; Zhang, Jian; Shang, Hua-Yan

    2018-02-01

    Recently, the bicycle has again become an important traffic tool in China. Due to the merits of the bicycle, group behavior widely exists in urban traffic systems. However, little effort has been made to explore the impacts of group behavior on bicycle flow. In this paper, we propose a CA (cellular automaton) model with group behavior to explore the complex traffic phenomena caused by shoulder group behavior and following group behavior on an open road. The numerical results illustrate that the proposed model can qualitatively describe the impacts of the two kinds of group behavior on bicycle flow and that the effects are related to the mode and size of the group behavior. The results can help us to better understand the impacts of bicycles' group behaviors on urban traffic systems and effectively control bicycle group behavior.
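The flavour of such a CA can be shown with the simplest possible open-road update rule, essentially elementary rule 184 with open boundaries. This is a generic sketch, not the paper's model, which layers shoulder- and following-group rules on top of a base rule like this one.

```python
def ca_step(road):
    """One parallel update of a minimal open-road cellular automaton:
    an occupied cell (True) advances one cell if the cell ahead is empty;
    a rider in the last cell exits through the open boundary."""
    n = len(road)
    new = [False] * n
    for i in range(n):
        if not road[i]:
            continue
        if i == n - 1:
            continue              # exits the open boundary
        if road[i + 1]:
            new[i] = True         # blocked by the rider ahead: stay
        else:
            new[i + 1] = True     # free cell ahead: advance
    return new
```

All decisions use the old configuration, so the update is genuinely parallel; group behavior would enter as additional rules coupling neighbouring riders' moves.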

  11. Implementation and automated validation of the minimal Z' model in FeynRules

    International Nuclear Information System (INIS)

    Basso, L.; Christensen, N.D.; Duhr, C.; Fuks, B.; Speckner, C.

    2012-01-01

    We describe the implementation of a well-known class of U(1) gauge models, the 'minimal' Z' models, in FeynRules. We also describe a new automated validation tool for FeynRules models which is controlled by a web interface and allows the user to run a complete set of 2 → 2 processes on different matrix element generators, in different gauges, and compare among them all. Where independent implementations exist, comparison with them is also possible. This tool has been used to validate our implementation of the 'minimal' Z' models. (authors)

  12. Gaia: automated quality assessment of protein structure models.

    Science.gov (United States)

    Kota, Pradeep; Ding, Feng; Ramachandran, Srinivas; Dokholyan, Nikolay V

    2011-08-15

    Increasing use of structural modeling for understanding structure-function relationships in proteins has led to the need to ensure that the protein models being used are of acceptable quality. Quality of a given protein structure can be assessed by comparing various intrinsic structural properties of the protein to those observed in high-resolution protein structures. In this study, we present tools to compare a given structure to high-resolution crystal structures. We assess packing by calculating the total void volume, the percentage of unsatisfied hydrogen bonds, the number of steric clashes and the scaling of the accessible surface area. We assess covalent geometry by determining bond lengths, angles, dihedrals and rotamers. The statistical parameters for the above measures, obtained from high-resolution crystal structures enable us to provide a quality-score that points to specific areas where a given protein structural model needs improvement. We provide these tools that appraise protein structures in the form of a web server Gaia (http://chiron.dokhlab.org). Gaia evaluates the packing and covalent geometry of a given protein structure and provides quantitative comparison of the given structure to high-resolution crystal structures. dokh@unc.edu Supplementary data are available at Bioinformatics online.
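One of the packing measures, steric clashes, is easy to illustrate: count atom pairs closer than a cut-off distance. The 2.2 Å threshold and the coordinates below are illustrative values, not Gaia's actual parameters, which are derived from high-resolution crystal structures.

```python
def count_clashes(coords, cutoff=2.2):
    """Count steric clashes: unordered pairs of atoms whose separation is
    below 'cutoff' (angstroms).  coords is a list of (x, y, z) tuples."""
    clashes = 0
    for i in range(len(coords)):
        for j in range(i + 1, len(coords)):
            d2 = sum((a - b) ** 2 for a, b in zip(coords[i], coords[j]))
            if d2 < cutoff ** 2:
                clashes += 1
    return clashes

# Three collinear atoms: only the first pair is within clash distance.
atoms = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (5.0, 0.0, 0.0)]
```

A real checker would additionally exclude covalently bonded pairs and use element-dependent van der Waals radii rather than a single cut-off.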

  13. Simple full micromagnetic model of exchange bias behavior in ferro/antiferromagnetic layered structures (abstract)

    Science.gov (United States)

    Koon, Norman C.

    1997-04-01

    It is shown using full micromagnetic relaxation calculations that exchange bias behavior is predicted for single-crystal ferro/antiferromagnetic layers with a fully compensated interface. The particular example most fully studied has a bcc/bct lattice structure with a fully compensated (110) interface plane. Only bilinear Heisenberg exchange was assumed, with anisotropy only in the antiferromagnet. In spite of the intuitive notion that exchange coupling between a ferromagnet and an antiferromagnet across a fully compensated plane of the antiferromagnet should be zero, we find strong coupling, comparable to the bilinear exchange, with a 90° angle between the ferromagnetic and antiferromagnetic axes of layers far from the interface in absence of an applied field. Even though the 90° coupling has characteristics resembling "biquadratic" exchange, it originates entirely from frustrated bilinear exchange. The development of exchange bias is found to originate from the formation of a domain wall in the antiferromagnet via the strong 90° exchange coupling and pinning of the wall by the magnetocrystalline anisotropy in the antiferromagnet. Because the large demagnetizing factor of the ferromagnet tends to confine its magnetization to the plane, the exchange bias is found to depend mainly on the strength and the symmetry of the in-plane component of anisotropy. Although little effort was made to analyze specific systems, the model reproduces many of the qualitative features observed in real exchange bias systems and gives reasonable semiquantitative estimates for the bias field when exchange and anisotropy values consistent with real systems are used.

  14. Automated main-chain model building by template matching and iterative fragment extension

    International Nuclear Information System (INIS)

    Terwilliger, Thomas C.

    2003-01-01

    An algorithm for the automated macromolecular model building of polypeptide backbones is described. The procedure is hierarchical. In the initial stages, many overlapping polypeptide fragments are built. In subsequent stages, the fragments are extended and then connected. Identification of the locations of helical and β-strand regions is carried out by FFT-based template matching. Fragment libraries of helices and β-strands from refined protein structures are then positioned at the potential locations of helices and strands, and the longest segments that fit the electron-density map are chosen. The helices and strands are then extended using fragment libraries consisting of sequences three amino acids long derived from refined protein structures. The resulting segments of polypeptide chain are then connected by choosing those which overlap at two or more Cα positions. The fully automated procedure has been implemented in RESOLVE and is capable of model building at resolutions as low as 3.5 Å. The algorithm is useful for building a preliminary main-chain model that can serve as a basis for refinement and side-chain addition.
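FFT-based template matching reduces to computing a cross-correlation in the Fourier domain. The toy below is a 1-D analogue (the real procedure correlates 3-D helix and strand templates against the electron-density map); the signal and motif are invented.

```python
import numpy as np

def fft_match(signal, template):
    """Offset at which 'template' best matches 'signal', found by taking
    the full cross-correlation as an FFT-accelerated convolution of the
    signal with the reversed template."""
    n = len(signal) + len(template) - 1
    spec = np.fft.rfft(signal, n) * np.fft.rfft(template[::-1], n)
    corr = np.fft.irfft(spec, n)
    return int(np.argmax(corr)) - (len(template) - 1)

# The [1, 2, 1] motif sits at index 3 of this toy density profile.
signal = np.array([0.0, 0.0, 0.0, 1.0, 2.0, 1.0, 0.0, 0.0])
template = np.array([1.0, 2.0, 1.0])
offset = fft_match(signal, template)
```

The payoff of the FFT route is that one transform pair scores the template against every possible offset at once, which is what makes exhaustive template searches over a whole map practical.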

  15. Drivers' communicative interactions: on-road observations and modelling for integration in future automation systems.

    Science.gov (United States)

    Portouli, Evangelia; Nathanael, Dimitris; Marmaras, Nicolas

    2014-01-01

    Social interactions with other road users are an essential component of the driving activity and may prove critical in view of future automation systems; yet up to now they have received only limited attention in the scientific literature. In this paper, it is argued that drivers base their anticipations about the traffic scene to a large extent on observations of the social behaviour of other 'animate human-vehicles'. It is further argued that in cases of uncertainty, drivers seek to establish a mutual situational awareness through deliberate communicative interactions. A linguistic model is proposed for modelling these communicative interactions. Empirical evidence from on-road observations and analysis of concurrent running commentary by 25 experienced drivers supports the proposed model. It is suggested that the integration of a social interactions layer based on illocutionary acts in future driving support and automation systems will improve their performance towards matching human drivers' expectations. Practitioner Summary: Interactions between drivers on the road may play a significant role in traffic coordination. On-road observations and running commentaries are presented as empirical evidence to support a model of such interactions; incorporation of drivers' interactions in future driving support and automation systems may improve their performance towards matching drivers' expectations.

  16. Geological modeling of a stratified deposit with CAD-Based solid model automation

    Directory of Open Access Journals (Sweden)

    Ayten Eser

    Full Text Available The planning stages of mining activities require many comprehensive and detailed analyses. Determining the correct orebody model is the first stage and one of the most important. Three-dimensional solid modeling is one of the significant methods that can examine the position and shape of the ore deposit. Although there are many different types of mining software for determining a solid model, many users try to build geological models in the computer without knowing how these software packages work. As researchers on the subject, we wanted to answer the question "How would we do it?" For this purpose, a system was developed for generating solid models using data obtained from boreholes. Obtaining this model in an AutoCAD environment will be important for geologists and engineers. The developed programs were first tested with virtual borehole data belonging to a virtual deposit. Then the real borehole data of a cement raw material site were successfully applied. This article allows readers not only to see a clear example of the programming approach to layered deposits but also to produce more complicated software in this context. Our study serves as a window to understanding the geological modeling process.
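
The core of building a layered-deposit model from borehole data is interpolating stratum boundaries between drill holes. As a minimal illustrative sketch (the function and its linear-interpolation simplification are mine, not taken from the article):

```python
def layer_elevation(x, picks):
    # Linearly interpolate the elevation of a stratum top at position x
    # from (position, elevation) picks taken from borehole logs.
    pts = sorted(picks)
    for (x0, z0), (x1, z1) in zip(pts, pts[1:]):
        if x0 <= x <= x1:
            return z0 + (z1 - z0) * (x - x0) / (x1 - x0)
    raise ValueError("x lies outside the drilled area")
```

Repeating this for every layer boundary along a grid of positions yields the cross-sections from which a 3D solid can be lofted in a CAD environment.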

  17. Optimization of automation: I. Estimation method of cognitive automation rates reflecting the effects of automation on human operators in nuclear power plants

    International Nuclear Information System (INIS)

    Lee, Seung Min; Kim, Jong Hyun; Seong, Poong Hyun

    2014-01-01

    Highlights: • We propose an estimation method of the automation rate by taking the advantages of automation as the estimation measures. • We conduct the experiments to examine the validity of the suggested method. • The higher the cognitive automation rate is, the greater the decreased rate of the working time will be. • The usefulness of the suggested estimation method is proved by statistical analyses. - Abstract: Since automation was introduced in various industrial fields, the concept of the automation rate has been used to indicate the inclusion proportion of automation among all work processes or facilities. Expressions of the inclusion proportion of automation are predictable, as is the ability to express the degree of the enhancement of human performance. However, many researchers have found that a high automation rate does not guarantee high performance. Therefore, to reflect the effects of automation on human performance, this paper proposes a new estimation method of the automation rate that considers the effects of automation on human operators in nuclear power plants (NPPs). Automation in NPPs can be divided into two types: system automation and cognitive automation. Some general descriptions and characteristics of each type of automation are provided, and the advantages of automation are investigated. The advantages of each type of automation are used as measures of the estimation method of the automation rate. One advantage was found to be a reduction in the number of tasks, and another was a reduction in human cognitive task loads. The system and the cognitive automation rate were proposed as quantitative measures by taking advantage of the aforementioned benefits. To quantify the required human cognitive task loads and thus suggest the cognitive automation rate, Conant’s information-theory-based model was applied. The validity of the suggested method, especially as regards the cognitive automation rate, was proven by conducting
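
The two rates described above can be illustrated with a rough sketch (the function names and the Shannon-entropy stand-in for Conant's information-theoretic load measure are assumptions of this illustration, not taken from the paper):

```python
import math

def system_automation_rate(n_tasks_total, n_tasks_automated):
    # Fraction of the work-process tasks taken over by the machine.
    return n_tasks_automated / n_tasks_total

def entropy_bits(probabilities):
    # Shannon entropy in bits: a simple stand-in for an
    # information-theoretic measure of cognitive task load.
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

def cognitive_automation_rate(manual_load_bits, automated_load_bits):
    # Fractional reduction in the operator's cognitive task load.
    return 1.0 - automated_load_bits / manual_load_bits
```

For example, automating 4 of 10 tasks gives a system automation rate of 0.4, while cutting an 8-bit information load to 2 bits gives a cognitive automation rate of 0.75.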

  18. Towards Continuous Integration in Model-Based Engineering of Automated Production Systems

    OpenAIRE

    Mund, Jakob; Badr, Iman; Bougouffa, Safa; Vogel-Heuser, Birgit

    2017-01-01

    Continuous integration (CI) is widely used in software engineering. The observed benefits include reduced efforts for system integration, which is particularly appealing for engineering automated production systems (aPS) due to the different disciplines involved. Yet, while many individual quality assurance means for aPS have been proposed, their adequacy for and systematic use in CI remains unclear. In this article, we provide two key contributions: First, we propose a quality model for mode...

  19. Piloted Simulation of a Model-Predictive Automated Recovery System

    Science.gov (United States)

    Liu, James (Yuan); Litt, Jonathan; Sowers, T. Shane; Owens, A. Karl; Guo, Ten-Huei

    2014-01-01

    This presentation describes a model-predictive automatic recovery system for aircraft on the verge of a loss-of-control situation. The system determines when it must intervene to prevent an imminent accident, resulting from a poor approach. It estimates the altitude loss that would result from a go-around maneuver at the current flight condition. If the loss is projected to violate a minimum altitude threshold, the maneuver is automatically triggered. The system deactivates to allow landing once several criteria are met. Piloted flight simulator evaluation showed the system to provide effective envelope protection during extremely unsafe landing attempts. The results demonstrate how flight and propulsion control can be integrated to recover control of the vehicle automatically and prevent a potential catastrophe.
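
The trigger logic described, estimating the altitude lost by a go-around and intervening when a minimum-altitude threshold would be violated, can be sketched as follows (the loss model, parameter names, and default values are illustrative assumptions, not from the presentation):

```python
def predicted_altitude_loss(sink_rate, spoolup_time, climb_accel):
    # Altitude lost while the engines spool up (constant sink), plus the
    # kinematic loss while the sink rate is arrested at climb_accel.
    return sink_rate * spoolup_time + sink_rate ** 2 / (2.0 * climb_accel)

def must_trigger_go_around(altitude, min_altitude, sink_rate,
                           spoolup_time=4.0, climb_accel=1.5):
    # Intervene when the projected post-maneuver altitude would violate
    # the minimum-altitude threshold.
    loss = predicted_altitude_loss(sink_rate, spoolup_time, climb_accel)
    return altitude - loss < min_altitude
```

At a 5 m/s sink rate the sketch predicts roughly 28 m of loss, so an approach at 40 m against a 15 m floor would trigger the automatic maneuver, while one at 100 m would not.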

  20. Prototype of Automated PLC Model Checking Using Continuous Integration Tools

    CERN Document Server

    Lettrich, Michael

    2015-01-01

    To deal with the complexity of operating and supervising large scale industrial installations at CERN, Programmable Logic Controllers (PLCs) are often used. A failure in these control systems can cause a disaster in terms of economic losses, environmental damage or human losses. Therefore the requirements on software quality are very high. To provide PLC developers with a way to verify proper functionality against requirements, a Java tool named PLCverif has been developed which encapsulates and thus simplifies the use of third party model checkers. One of our goals in this project is to integrate PLCverif in the development process of PLC programs. When the developer changes the program, all the requirements should be verified again, as a change to the code can produce collateral effects and violate one or more requirements. For that reason, PLCverif has been extended to work with Jenkins CI in order to automatically trigger the verification cases when the developer changes the PLC program. This prototype has been...

  1. Automated crack detection in conductive smart-concrete structures using a resistor mesh model

    Science.gov (United States)

    Downey, Austin; D'Alessandro, Antonella; Ubertini, Filippo; Laflamme, Simon

    2018-03-01

    Various nondestructive evaluation techniques are currently used to automatically detect and monitor cracks in concrete infrastructure. However, these methods often lack scalability and cost-effectiveness when applied over large geometries. A solution is the use of self-sensing carbon-doped cementitious materials. These self-sensing materials are capable of providing a measurable change in electrical output that can be related to their damage state. Previous work by the authors showed that a resistor mesh model could be used to track damage in structural components fabricated from electrically conductive concrete, where damage was located through the identification of high resistance value resistors in a resistor mesh model. In this work, an automated damage detection strategy is introduced that places high value resistors into the previously developed resistor mesh model using a sequential Monte Carlo method. Here, high value resistors are used to mimic the internal condition of damaged cementitious specimens. The proposed automated damage detection method is experimentally validated using a 500 × 500 × 50 mm3 reinforced cement paste plate doped with multi-walled carbon nanotubes exposed to 100 identical impact tests. Results demonstrate that the proposed Monte Carlo method is capable of detecting and localizing the most prominent damage in a structure, demonstrating that automated damage detection in smart-concrete structures is a promising strategy for real-time structural health monitoring of civil infrastructure.
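
The final localization step, flagging mesh edges whose identified resistance far exceeds the undamaged baseline, can be sketched as below. Note this replaces the paper's sequential Monte Carlo estimation with a plain threshold purely for illustration; the function name and factor are assumptions:

```python
def locate_damage(edge_resistances, baseline_ohms, factor=5.0):
    # Edges whose identified resistance greatly exceeds the undamaged
    # baseline are taken to mark cracked regions of the mesh.
    return sorted(edge for edge, r in edge_resistances.items()
                  if r > factor * baseline_ohms)
```

Given per-edge resistances estimated from electrical measurements, the returned edge list localizes the damage within the mesh.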

  2. Modelling of series of types of automated trenchless works tunneling

    Science.gov (United States)

    Gendarz, P.; Rzasinski, R.

    2016-08-01

    Microtunneling is the newest method for constructing underground installations. The method builds on the experience and techniques of earlier trenchless underground-works methods. To develop this particular earthworks method further, it is considered reasonable to elaborate a series of types of tunneling-machine constructions. Many machine design solutions exist, but the current goal is to develop a robotized trenchless machine. Erosion machines for tunnel diameters of 1600, 2000, 2500 and 3150 mm are designed with the use of computer-aided methods. The creation of the series of types of tunneling-machine constructions was preceded by an analysis of the current state of the art. Verification of the practical methodology for creating the systematic part series was based on the designed series of erosion machines. The following were developed: a method of construction similarity for the erosion machines, algorithmic methods for variant analysis of quantitative construction attributes in the advanced graphical program I-DEAS, and relational and program parameterization. A manufacturing process for the parts will then be created, which allows the technological process to be verified on CNC machines. The models of the designed machines will be modified and the constructions consulted with erosion-machine users and manufacturers such as Tauber Rohrbau GmbH & Co. KG from Münster and OHL ZS a.s. from Brno. The companies' acceptance will result in practical verification by the JUMARPOL company.

  3. A semi-automated vascular access system for preclinical models

    Science.gov (United States)

    Berry-Pusey, B. N.; Chang, Y. C.; Prince, S. W.; Chu, K.; David, J.; Taschereau, R.; Silverman, R. W.; Williams, D.; Ladno, W.; Stout, D.; Tsao, T. C.; Chatziioannou, A.

    2013-08-01

    Murine models are used extensively in biological and translational research. For many of these studies it is necessary to access the vasculature for the injection of biologically active agents. Among the possible methods for accessing the mouse vasculature, tail vein injections are a routine but critical step for many experimental protocols. To perform successful tail vein injections, a high skill set and experience are required, leaving most scientists ill-suited to perform this task. This can lead to a high variability between injections, which can impact experimental results. To allow more scientists to perform tail vein injections and to decrease the variability between injections, a vascular access system (VAS) that semi-automatically inserts a needle into the tail vein of a mouse was developed. The VAS uses near infrared light, image processing techniques, computer controlled motors, and a pressure feedback system to insert the needle and to validate its proper placement within the vein. The VAS was tested by injecting a commonly used radiolabeled probe (FDG) into the tail veins of five mice. These mice were then imaged using micro-positron emission tomography to measure the percentage of the injected probe remaining in the tail. These studies showed that, on average, the VAS leaves 3.4% of the injected probe in the tail. With these preliminary results, the VAS system demonstrates the potential for improving the accuracy of tail vein injections in mice.

  4. Development strategy and process models for phased automation of design and digital manufacturing electronics

    Science.gov (United States)

    Korshunov, G. I.; Petrushevskaya, A. A.; Lipatnikov, V. A.; Smirnova, M. S.

    2018-03-01

    The strategy for assuring the quality of electronics is regarded as most important. To provide quality, the process sequence is considered and modeled by a Markov chain. The improvement is distinguished by simple database means of design for manufacturing, allowing future step-by-step development. Phased automation of the design and digital manufacturing of electronics is proposed. MATLAB modelling results showed an increase in effectiveness. New tools and software should be more effective still. A primary digital model is proposed to represent the product across the process sequence, from individual processes up to the whole life cycle.

  5. Tool-driven Design and Automated Parameterization for Real-time Generic Drivetrain Models

    Directory of Open Access Journals (Sweden)

    Schwarz Christina

    2015-01-01

    Full Text Available Real-time dynamic drivetrain modeling approaches have a great potential for development cost reduction in the automotive industry. Even though real-time drivetrain models are available, these solutions are specific to single transmission topologies. In this paper an environment for parameterization of a solution is proposed based on a generic method applicable to all types of gear transmission topologies. This enables tool-guided modeling by non-experts in the fields of mechanical engineering and control theory, leading to reduced development and testing efforts. The approach is demonstrated for an exemplary automatic transmission using the environment for automated parameterization. Finally, the parameterization is validated via vehicle measurement data.

  6. Automated Protein Structure Modeling with SWISS-MODEL Workspace and the Protein Model Portal

    OpenAIRE

    Bordoli, Lorenza; Schwede, Torsten

    2012-01-01

    Comparative protein structure modeling is a computational approach to build three-dimensional structural models for proteins using experimental structures of related protein family members as templates. Regular blind assessments of modeling accuracy have demonstrated that comparative protein structure modeling is currently the most reliable technique to model protein structures. Homology models are often sufficiently accurate to substitute for experimental structures in a wide variety of appl...

  7. An Automated Method for Landmark Identification and Finite-Element Modeling of the Lumbar Spine.

    Science.gov (United States)

    Campbell, Julius Quinn; Petrella, Anthony J

    2015-11-01

    The purpose of this study was to develop a method for the automated creation of finite-element models of the lumbar spine. Custom scripts were written to extract bone landmarks of lumbar vertebrae and assemble L1-L5 finite-element models. End-plate borders, ligament attachment points, and facet surfaces were identified. Landmarks were identified to maintain mesh correspondence between meshes for later use in statistical shape modeling. 90 lumbar vertebrae were processed creating 18 subject-specific finite-element models. Finite-element model surfaces and ligament attachment points were reproduced within 1e-5 mm of the bone surface, including the critical contact surfaces of the facets. Element quality exceeded specifications in 97% of elements for the 18 models created. The current method is capable of producing subject-specific finite-element models of the lumbar spine with good accuracy, quality, and robustness. The automated methods developed represent advancement in the state of the art of subject-specific lumbar spine modeling to a scale not possible with prior manual and semiautomated methods.

  8. Automated model integration at source code level: An approach for implementing models into the NASA Land Information System

    Science.gov (United States)

    Wang, S.; Peters-Lidard, C. D.; Mocko, D. M.; Kumar, S.; Nearing, G. S.; Arsenault, K. R.; Geiger, J. V.

    2014-12-01

    Model integration bridges the data flow between modeling frameworks and models. However, models usually do not fit directly into a particular modeling environment if not designed for it. An example includes implementing different types of models into the NASA Land Information System (LIS), a software framework for land-surface modeling and data assimilation. Model implementation requires scientific knowledge and software expertise and may take a developer months to learn LIS and the model software structure. Debugging and testing of the model implementation is also time-consuming due to not fully understanding LIS or the model. This time spent is costly for research and operational projects. To address this issue, an approach has been developed to automate model integration into LIS. With this in mind, a general model interface was designed to retrieve forcing inputs, parameters, and state variables needed by the model and to provide state variables and outputs back to LIS. Every model can be wrapped to comply with the interface, usually with a FORTRAN 90 subroutine. Development efforts need only knowledge of the model and basic programming skills. With such wrappers, the logic is the same for implementing all models. Code templates defined for this general model interface could be re-used with any specific model. Therefore, the model implementation can be done automatically. An automated model implementation toolkit was developed with Microsoft Excel and its built-in VBA language. It allows model specifications in three worksheets and contains FORTRAN 90 code templates in VBA programs. According to the model specification, the toolkit generates data structures and procedures within FORTRAN modules and subroutines, which transfer data between LIS and the model wrapper. Model implementation is standardized, and about 80 - 90% of the development load is reduced. In this presentation, the automated model implementation approach is described along with LIS programming
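
The general model interface idea, the framework supplies forcings and parameters, the wrapped model advances a step and hands states and outputs back, can be sketched as below (Python rather than the FORTRAN 90 used by LIS, and all names here are illustrative assumptions):

```python
class ModelWrapper:
    # Generic interface: the framework hands the wrapper forcings and
    # parameters; the wrapped model advances one step and returns its
    # updated state and outputs.
    def __init__(self, step_fn, initial_state):
        self.step = step_fn          # (state, forcing, params) -> (state, output)
        self.state = initial_state

    def advance(self, forcing, params):
        self.state, output = self.step(self.state, forcing, params)
        return output

# A toy "bucket" land model wrapped to fit the interface: a fraction k
# of the stored water (plus new precipitation) leaves as runoff.
def bucket_step(storage, precip, k):
    runoff = k * (storage + precip)
    return storage + precip - runoff, runoff
```

Because every model is driven through the same `advance` call, the framework-side logic is identical regardless of which model is wrapped, which is what makes code generation from templates possible.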

  9. An Automated Planning Model for RoF Heterogeneous Wireless Networks

    DEFF Research Database (Denmark)

    Shawky, Ahmed; Bergheim, Hans; Ragnarsson, Ólafur

    2010-01-01

    The number of users in wireless WANs is increasing like never before, at the same time as users' bandwidth demands increase. The structure of third-generation wireless WANs makes it expensive for wireless ISPs to meet these demands. The FUTON architecture is a RoF heterogeneous wireless network architecture under development that will be cheaper to deploy and operate. This paper shows a method to plan an implementation of this architecture. The planning is done as automatically as possible, covering radio planning, fiber planning and network dimensioning. The outcome of the paper is a planning process that uses GIS data to automate planning for the entire architecture. The automated model uses a collection of scripts that can easily be modified for planning a FUTON architecture anywhere. The scripts are made using functions for the different tasks, in order to make them easy to extend and modify.

  10. Monitoring arid-land groundwater abstraction through optimization of a land surface model with remote sensing-based evaporation

    KAUST Repository

    Lopez Valencia, Oliver Miguel

    2018-02-01

    The increase in irrigated agriculture in Saudi Arabia is having a large impact on its limited groundwater resources. While large-scale water storage changes can be estimated using satellite data, monitoring groundwater abstraction rates is largely non-existent at either farm or regional level, so water management decisions remain ill-informed. Although determining water use from space at high spatiotemporal resolutions remains challenging, a number of approaches have shown promise, particularly in the retrieval of crop water use via evaporation. Apart from satellite-based estimates, land surface models offer a continuous spatial-temporal evolution of full land-atmosphere water and energy exchanges. In this study, we first examine recent trends in terrestrial water storage depletion within the Arabian Peninsula and explore its relation to increased agricultural activity in the region using satellite data. Next, we evaluate a number of large-scale remote sensing-based evaporation models, giving insight into the challenges of evaporation retrieval in arid environments. Finally, we present a novel method aimed to retrieve groundwater abstraction rates used in irrigated fields by constraining a land surface model with remote sensing-based evaporation observations. The approach is used to reproduce reported irrigation rates over 41 center-pivot irrigation fields presenting a range of crop dynamics over the course of one year. The results of this application are promising, with mean absolute errors below 3 mm/day, a bias of -1.6 mm/day, and a first rough estimate of total annual abstractions of 65.8 Mm3 (close to the estimated value using reported farm data, 69.42 Mm3). However, further efforts to address the overestimation of bare soil evaporation in the model are required. 
The uneven coverage of satellite data within the study site allowed us to evaluate its impact on the optimization, with a better match between observed and obtained irrigation rates on fields with
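
The constraint at the heart of the approach, that irrigation must close the field's water balance, can be illustrated with a minimal sketch (the function, its inputs, and the drastic simplification of the land surface model are mine, not from the study):

```python
def abstraction_estimate_mm(evaporation_mm, precipitation_mm, storage_change_mm=0.0):
    # Water-balance sketch: irrigation (groundwater abstraction) must
    # supply the evaporation not met by rainfall, net of any change in
    # soil-water storage. Negative estimates are clipped to zero.
    return max(evaporation_mm - precipitation_mm + storage_change_mm, 0.0)
```

For an arid-land field evaporating 6 mm/day against 0.5 mm/day of rain, the sketch attributes 5.5 mm/day to groundwater abstraction; summing daily estimates over a field's area yields annual volumes comparable to the Mm3 figures reported above.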

  11. An automation of design and modelling tasks in NX Siemens environment with original software - generator module

    Science.gov (United States)

    Zbiciak, M.; Grabowik, C.; Janik, W.

    2015-11-01

    Nowadays the constructional design process is almost exclusively aided by CAD/CAE/CAM systems. It is estimated that nearly 80% of design activities have a routine nature. These routine design tasks are highly susceptible to automation. Design automation is usually achieved with API tools which allow building original software for aiding different engineering activities. In this paper, original software worked out in order to automate engineering tasks at the stage of a product's geometrical shape design is presented. The elaborated software works exclusively in the NX Siemens CAD/CAM/CAE environment and was prepared in Microsoft Visual Studio with application of the .NET technology and the NX SNAP library. The software functionality allows designing and modelling of spur and helical involute gears. Moreover, it is possible to estimate relative manufacturing costs. With the Generator module it is possible to design and model both standard and non-standard gear wheels. The main advantage of a model generated in this way is its better representation of the involute curve in comparison to those drawn in the standard tools of specialized CAD systems. This stems from the fact that in CAD systems an involute curve is usually drawn through 3 points, corresponding to points located on the addendum circle, the reference diameter of the gear and the base circle respectively. In the Generator module the involute curve is drawn through 11 involute points, located on and above the base and addendum circles, so the 3D gear wheel models are highly accurate. Application of the Generator module makes the modelling process very rapid, reducing the gear wheel modelling time to several seconds. During the conducted research an analysis of the differences between the standard 3-point and the 11-point involutes was made. The results and conclusions drawn from this analysis are presented in detail.
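
Sampling many points along an involute is straightforward from its parametric form, x = r_b(cos t + t sin t), y = r_b(sin t - t cos t), where the radius at parameter t is r_b·sqrt(1 + t²). A minimal sketch (function name and sampling scheme are illustrative, not the Generator module's actual code):

```python
import math

def involute_points(r_base, r_addendum, n=11):
    # Sample n points along the involute from the base circle out to the
    # addendum circle; more points give a more accurate tooth flank.
    t_max = math.sqrt((r_addendum / r_base) ** 2 - 1.0)
    pts = []
    for i in range(n):
        t = t_max * i / (n - 1)
        x = r_base * (math.cos(t) + t * math.sin(t))
        y = r_base * (math.sin(t) - t * math.cos(t))
        pts.append((x, y))
    return pts
```

A spline fitted through 11 such points follows the true curve far more closely than one forced through only 3 points, which is the accuracy advantage the abstract describes.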

  12. Shingle 2.0: generalising self-consistent and automated domain discretisation for multi-scale geophysical models

    Directory of Open Access Journals (Sweden)

    A. S. Candy

    2018-01-01

    Full Text Available The approaches taken to describe and develop spatial discretisations of the domains required for geophysical simulation models are commonly ad hoc, model- or application-specific, and under-documented. This is particularly acute for simulation models that are flexible in their use of multi-scale, anisotropic, fully unstructured meshes where a relatively large number of heterogeneous parameters are required to constrain their full description. As a consequence, it can be difficult to reproduce simulations, to ensure a provenance in model data handling and initialisation, and a challenge to conduct model intercomparisons rigorously. This paper takes a novel approach to spatial discretisation, considering it much like a numerical simulation model problem of its own. It introduces a generalised, extensible, self-documenting approach to carefully describe, and necessarily fully, the constraints over the heterogeneous parameter space that determine how a domain is spatially discretised. This additionally provides a method to accurately record these constraints, using high-level natural language based abstractions that enable full accounts of provenance, sharing, and distribution. Together with this description, a generalised consistent approach to unstructured mesh generation for geophysical models is developed that is automated, robust and repeatable, quick-to-draft, rigorously verified, and consistent with the source data throughout. This interprets the description above to execute a self-consistent spatial discretisation process, which is automatically validated to expected discrete characteristics and metrics. Library code, verification tests, and examples available in the repository at https://github.com/shingleproject/Shingle. Further details of the project presented at http://shingleproject.org.

  13. Shingle 2.0: generalising self-consistent and automated domain discretisation for multi-scale geophysical models

    Science.gov (United States)

    Candy, Adam S.; Pietrzak, Julie D.

    2018-01-01

    The approaches taken to describe and develop spatial discretisations of the domains required for geophysical simulation models are commonly ad hoc, model- or application-specific, and under-documented. This is particularly acute for simulation models that are flexible in their use of multi-scale, anisotropic, fully unstructured meshes where a relatively large number of heterogeneous parameters are required to constrain their full description. As a consequence, it can be difficult to reproduce simulations, to ensure a provenance in model data handling and initialisation, and a challenge to conduct model intercomparisons rigorously. This paper takes a novel approach to spatial discretisation, considering it much like a numerical simulation model problem of its own. It introduces a generalised, extensible, self-documenting approach to carefully describe, and necessarily fully, the constraints over the heterogeneous parameter space that determine how a domain is spatially discretised. This additionally provides a method to accurately record these constraints, using high-level natural language based abstractions that enable full accounts of provenance, sharing, and distribution. Together with this description, a generalised consistent approach to unstructured mesh generation for geophysical models is developed that is automated, robust and repeatable, quick-to-draft, rigorously verified, and consistent with the source data throughout. This interprets the description above to execute a self-consistent spatial discretisation process, which is automatically validated to expected discrete characteristics and metrics. Library code, verification tests, and examples available in the repository at https://github.com/shingleproject/Shingle. Further details of the project presented at http://shingleproject.org.

  14. Abstract Storage Devices

    OpenAIRE

    Koenig, Robert; Maurer, Ueli; Tessaro, Stefano

    2007-01-01

    A quantum storage device differs radically from a conventional physical storage device. Its state can be set to any value in a certain (infinite) state space, but in general every possible read operation yields only partial information about the stored state. The purpose of this paper is to initiate the study of a combinatorial abstraction, called abstract storage device (ASD), which models deterministic storage devices with the property that only partial information about the state can be re...

  15. Automation reliability in unmanned aerial vehicle control: a reliance-compliance model of automation dependence in high workload.

    Science.gov (United States)

    Dixon, Stephen R; Wickens, Christopher D

    2006-01-01

    Two experiments were conducted in which participants navigated a simulated unmanned aerial vehicle (UAV) through a series of mission legs while searching for targets and monitoring system parameters. The goal of the study was to highlight the qualitatively different effects of automation false alarms and misses as they relate to operator compliance and reliance, respectively. Background data suggest that automation false alarms cause reduced compliance, whereas misses cause reduced reliance. In two studies, 32 and 24 participants, including some licensed pilots, performed in-lab UAV simulations that presented the visual world and collected dependent measures. Results indicated that with the low-reliability aids, false alarms correlated with poorer performance in the system failure task, whereas misses correlated with poorer performance in the concurrent tasks. Compliance and reliance do appear to be affected by false alarms and misses, respectively, and are relatively independent of each other. Practical implications are that automated aids must be fairly reliable to provide global benefits and that false alarms and misses have qualitatively different effects on performance.

  16. Component-based modeling of systems for automated fault tree generation

    International Nuclear Information System (INIS)

    Majdara, Aref; Wakabayashi, Toshio

    2009-01-01

    One of the challenges in the field of automated fault tree construction is to find an efficient modeling approach that can support modeling of different types of systems without ignoring any necessary details. In this paper, we present a new system modeling approach for computer-aided fault tree generation. In this method, every system model is composed of some components and different types of flows propagating through them. Each component has a function table that describes its input-output relations. For the components having different operational states, there is also a state transition table. Each component can communicate with other components in the system only through its inputs and outputs. A trace-back algorithm is proposed that can be applied to the system model to generate the required fault trees. The system modeling approach and the fault tree construction algorithm are applied to a fire sprinkler system and the results are presented.
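
The essence of a trace-back algorithm is to expand a top-event deviation recursively into the input deviations that can cause it, following the components' input-output relations. A minimal sketch (the flat cause-table representation and OR-only gating are simplifications of mine, not the paper's full method):

```python
def trace_back(cause_table, event):
    # Recursively expand a deviation (event) into the input deviations
    # that can cause it, using the components' function tables collapsed
    # into one cause table. Events with no listed causes are basic events.
    causes = cause_table.get(event)
    if not causes:
        return event                  # basic event: leaf of the fault tree
    # OR-gate: any one listed cause suffices to produce the event.
    return {event: [trace_back(cause_table, c) for c in causes]}
```

Applied to a toy sprinkler-style table, the nested dictionary that results is the fault tree: each key is a gate event and its list holds the subtrees or basic events beneath it.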

  17. Automated Generation of Formal Models from ST Control Programs for Verification Purposes

    CERN Document Server

    Fernandez Adiego, B; Tournier, J-C; Blanco Vinuela, E; Blech, J-O; Gonzalez Suarez, V

    2014-01-01

    In large industrial control systems such as the ones installed at CERN, one of the main issues is the ability to verify the correct behaviour of the Programmable Logic Controller (PLC) programs. While manual and automated testing can achieve good results, some obvious problems remain unsolved such as the difficulty to check safety or liveness properties. This paper proposes a general methodology and a tool to verify PLC programs by automatically generating formal models for different model checkers out of ST code. The proposed methodology defines an automata-based formalism used as intermediate model (IM) to transform PLC programs written in ST language into different formal models for verification purposes. A tool based on Xtext has been implemented that automatically generates models for the NuSMV and UPPAAL model checkers and the BIP framework.

  18. Automated modelling of spatially-distributed glacier ice thickness and volume

    Science.gov (United States)

    James, William H. M.; Carrivick, Jonathan L.

    2016-07-01

    Ice thickness distribution and volume are both key parameters for glaciological and hydrological applications. This study presents VOLTA (Volume and Topography Automation), which is a Python script tool for ArcGIS™ that requires just a digital elevation model (DEM) and glacier outline(s) to model distributed ice thickness, volume and bed topography. Ice thickness is initially estimated at points along an automatically generated centreline network based on the perfect-plasticity rheology assumption, taking into account a valley side drag component of the force balance equation. Distributed ice thickness is subsequently interpolated using a glaciologically correct algorithm. For five glaciers with independent field-measured bed topography, VOLTA modelled volumes were between 26.5% (underestimate) and 16.6% (overestimate) of that derived from field observations. Greatest differences were where an asymmetric valley cross section shape was present or where significant valley infill had occurred. Compared with other methods of modelling ice thickness and volume, key advantages of VOLTA are: a fully automated approach and a user-friendly graphical user interface (GUI), GIS consistent geometry, fully automated centreline generation, inclusion of a side drag component in the force balance equation, estimation of glacier basal shear stress for each individual glacier, fully distributed ice thickness output and the ability to process multiple glaciers rapidly. VOLTA is capable of regional scale ice volume assessment, which is a key parameter for exploring glacier response to climate change. VOLTA also permits subtraction of modelled ice thickness from the input surface elevation to produce an ice-free DEM, which is a key input for reconstruction of former glaciers. VOLTA could assist with prediction of future glacier geometry changes and hence in projection of future meltwater fluxes.
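
The perfect-plasticity estimate with a valley side-drag term is commonly written h = τ / (F ρ g sin α), where τ is the basal shear stress, F < 1 a shape factor for side drag, ρ ice density, g gravity, and α the surface slope. A minimal sketch of this standard relation (default values are textbook assumptions, not VOLTA's internals):

```python
import math

def ice_thickness(basal_shear_stress_pa, surface_slope_rad,
                  shape_factor=0.8, rho=917.0, g=9.81):
    # Perfect-plasticity estimate: h = tau / (F * rho * g * sin(alpha)).
    # shape_factor F < 1 accounts for drag from the valley sides.
    return basal_shear_stress_pa / (
        shape_factor * rho * g * math.sin(surface_slope_rad))
```

With a typical basal shear stress of 100 kPa and a slope of about 5.7 degrees (sin α = 0.1), this gives roughly 140 m of ice, the kind of centreline point estimate that is then interpolated into a distributed thickness field.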

  19. Semi-automated extraction of longitudinal subglacial bedforms from digital terrain models - Two new methods

    Science.gov (United States)

    Jorge, Marco G.; Brennand, Tracy A.

    2017-07-01

    Relict drumlin and mega-scale glacial lineation (positive relief, longitudinal subglacial bedforms - LSBs) morphometry has been used as a proxy for paleo ice-sheet dynamics. LSB morphometric inventories have relied on manual mapping, which is slow and subjective and thus potentially difficult to reproduce. Automated methods are faster and reproducible, but previous methods for semi-automated LSB mapping have not been highly successful. Here, two new object-based methods for the semi-automated extraction of LSBs (footprints) from digital terrain models are compared in a test area in the Puget Lowland, Washington, USA. As segmentation procedures to create LSB-candidate objects, the normalized closed contour method relies on the contouring of a normalized local relief model addressing LSBs on slopes, and the landform elements mask method relies on the classification of landform elements derived from the digital terrain model. For identifying which LSB-candidate objects correspond to LSBs, both methods use the same LSB operational definition: a ruleset encapsulating expert knowledge, published morphometric data, and the morphometric range of LSBs in the study area. The normalized closed contour method was separately applied to four different local relief models, two computed in moving windows and two hydrology-based. Overall, the normalized closed contour method outperformed the landform elements mask method; it performed best on the hydrology-based relief model derived from a multiple-direction flow-routing algorithm. For an assessment of its transferability, the normalized closed contour method was evaluated on a second area, the Chautauqua drumlin field, Pennsylvania and New York, USA, where it performed better than in the Puget Lowland. A broad comparison to previous methods suggests that the normalized closed contour method may be the most capable method to date, but more development is required.
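As a rough illustration of the relief normalisation underlying the first segmentation procedure, the sketch below detrends a DEM with a moving-window mean and rescales the residual to [0, 1]. The window size and the min-max normalisation are illustrative assumptions, not the published algorithm:

```python
import numpy as np

def normalized_local_relief(dem, window=5):
    """Subtract a moving-window mean surface from the DEM to expose local
    positive-relief features (such as LSBs) on regional slopes, then
    min-max normalise the residual relief to [0, 1]."""
    pad = window // 2
    padded = np.pad(dem, pad, mode="edge")
    trend = np.zeros_like(dem, dtype=float)
    for i in range(dem.shape[0]):
        for j in range(dem.shape[1]):
            trend[i, j] = padded[i:i + window, j:j + window].mean()
    relief = dem - trend
    return (relief - relief.min()) / (relief.max() - relief.min())

dem = np.add.outer(np.arange(10.0), np.arange(10.0))  # a planar regional slope
dem[4:6, 4:6] += 2.0                                   # a small bedform on it
nlr = normalized_local_relief(dem)
```

On this toy grid the bedform stands out in the normalised relief even though, in absolute elevation, it is lower than the upslope end of the plane, which is the point of detrending before contouring.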

  20. The development and verification of a highly accurate collision prediction model for automated noncoplanar plan delivery

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Victoria Y.; Tran, Angelia; Nguyen, Dan; Cao, Minsong; Ruan, Dan; Low, Daniel A.; Sheng, Ke, E-mail: ksheng@mednet.ucla.edu [Department of Radiation Oncology, David Geffen School of Medicine, University of California Los Angeles, Los Angeles, California 90024 (United States)

    2015-11-15

    Purpose: Significant dosimetric benefits have previously been demonstrated in highly noncoplanar treatment plans. In this study, the authors developed and verified an individualized collision model for the purpose of delivering highly noncoplanar radiotherapy and tested the feasibility of total delivery automation with Varian TrueBeam developer mode. Methods: A hand-held 3D scanner was used to capture the surfaces of an anthropomorphic phantom and a human subject, which were positioned with a computer-aided design model of a TrueBeam machine to create a detailed virtual geometrical collision model. The collision model included gantry, collimator, and couch motion degrees of freedom. The accuracy of the 3D scanner was validated by scanning a rigid cubical phantom with known dimensions. The collision model was then validated by generating 300 linear accelerator orientations corresponding to 300 gantry-to-couch and gantry-to-phantom distances, and comparing the measured distances to the corresponding model predictions. The linear accelerator orientations reflected uniformly sampled noncoplanar beam angles to the head, lung, and prostate. The distance discrepancies between measurements on the physical and virtual systems were used to estimate treatment-site-specific safety buffer distances with 0.1%, 0.01%, and 0.001% probability of collision between the gantry and couch or phantom. Plans containing 20 noncoplanar beams to the brain, lung, and prostate optimized via an in-house noncoplanar radiotherapy platform were converted into XML script for automated delivery and the entire delivery was recorded and timed to demonstrate the feasibility of automated delivery. Results: The 3D scanner measured the dimensions of the 14 cm cubic phantom to within 0.5 mm. The maximal absolute discrepancy between machine and model measurements for gantry-to-couch and gantry-to-phantom was 0.95 and 2.97 cm, respectively. The reduced accuracy of gantry-to-phantom measurements was
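The buffer estimation step can be sketched as a one-sided tail quantile over the model-vs-measurement discrepancies. The normality assumption and the discrepancy values below are illustrative; the paper's exact statistical procedure may differ:

```python
from statistics import NormalDist, mean, stdev

def safety_buffer(discrepancies_cm, collision_prob):
    """Buffer distance such that, under a normal model of the discrepancies
    between measured and modelled clearances, the chance of the model
    overstating clearance by more than the buffer equals collision_prob."""
    mu, sigma = mean(discrepancies_cm), stdev(discrepancies_cm)
    z = NormalDist().inv_cdf(1.0 - collision_prob)  # one-sided tail quantile
    return mu + z * sigma

# Hypothetical gantry-to-couch discrepancies (cm) from repeated comparisons:
d = [0.1, -0.2, 0.3, 0.0, 0.5, -0.1, 0.2, 0.4, -0.3, 0.1]
buffer_cm = safety_buffer(d, 0.001)  # buffer for a 0.1% collision probability
```

Lower acceptable collision probabilities push the quantile further into the tail and so produce larger, more conservative buffers, which matches the site-specific 0.1%/0.01%/0.001% buffers reported above.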

  1. The development and verification of a highly accurate collision prediction model for automated noncoplanar plan delivery

    International Nuclear Information System (INIS)

    Yu, Victoria Y.; Tran, Angelia; Nguyen, Dan; Cao, Minsong; Ruan, Dan; Low, Daniel A.; Sheng, Ke

    2015-01-01

    Purpose: Significant dosimetric benefits have previously been demonstrated in highly noncoplanar treatment plans. In this study, the authors developed and verified an individualized collision model for the purpose of delivering highly noncoplanar radiotherapy and tested the feasibility of total delivery automation with Varian TrueBeam developer mode. Methods: A hand-held 3D scanner was used to capture the surfaces of an anthropomorphic phantom and a human subject, which were positioned with a computer-aided design model of a TrueBeam machine to create a detailed virtual geometrical collision model. The collision model included gantry, collimator, and couch motion degrees of freedom. The accuracy of the 3D scanner was validated by scanning a rigid cubical phantom with known dimensions. The collision model was then validated by generating 300 linear accelerator orientations corresponding to 300 gantry-to-couch and gantry-to-phantom distances, and comparing the measured distances to the corresponding model predictions. The linear accelerator orientations reflected uniformly sampled noncoplanar beam angles to the head, lung, and prostate. The distance discrepancies between measurements on the physical and virtual systems were used to estimate treatment-site-specific safety buffer distances with 0.1%, 0.01%, and 0.001% probability of collision between the gantry and couch or phantom. Plans containing 20 noncoplanar beams to the brain, lung, and prostate optimized via an in-house noncoplanar radiotherapy platform were converted into XML script for automated delivery and the entire delivery was recorded and timed to demonstrate the feasibility of automated delivery. Results: The 3D scanner measured the dimensions of the 14 cm cubic phantom to within 0.5 mm. The maximal absolute discrepancy between machine and model measurements for gantry-to-couch and gantry-to-phantom was 0.95 and 2.97 cm, respectively. The reduced accuracy of gantry-to-phantom measurements was

  2. An architecture and model for cognitive engineering simulation analysis - Application to advanced aviation automation

    Science.gov (United States)

    Corker, Kevin M.; Smith, Barry R.

    1993-01-01

    The process of designing crew stations for large-scale, complex automated systems is made difficult because of the flexibility of roles that the crew can assume, and by the rapid rate at which system designs become fixed. Modern cockpit automation frequently involves multiple layers of control and display technology in which human operators must exercise equipment in augmented, supervisory, and fully automated control modes. In this context, we maintain that effective human-centered design is dependent on adequate models of human/system performance in which representations of the equipment, the human operator(s), and the mission tasks are available to designers for manipulation and modification. The joint Army-NASA Aircrew/Aircraft Integration (A3I) Program, with its attendant Man-machine Integration Design and Analysis System (MIDAS), was initiated to meet this challenge. MIDAS provides designers with a test bed for analyzing human-system integration in an environment in which both cognitive human function and 'intelligent' machine function are described in similar terms. This distributed object-oriented simulation system, its architecture and assumptions, and our experiences from its application in advanced aviation crew stations are described.

  3. Journal Abstracts

    Directory of Open Access Journals (Sweden)

    Mete Korkut Gülmen

    1996-07-01

    ...were generated from feces and compared with the sequences obtained from blood samples of the same individuals. Sequences from blood and feces samples were identical, but nucleotide differences between individuals were observed in the range of 1-10, with an average of 4.88 per approximately 400 bp. Among the various extraction protocols applied in this study, the highest success in binding and purifying DNA was achieved with an organic extraction method based on magnetic particle separation. STR analysis was not routinely feasible with DNA extracted from feces. POSTMORTEM DIFFUSION OF DRUGS FROM GASTRIC RESIDUE. Postmortem diffusion of drugs from gastric residue, an experimental study. Pounder DJ, Fuke C, Cox DE, Smith D, Kuroda N. Am J Forensic Med Pathol. 1996; 17(1): 1-7. Postmortem diffusion of drugs from gastric residue was studied in a human cadaver model. Fifty milligrams of amitriptyline (Ami) and 5 g of paracetamol (Par) suspended in 350 ml of 10% methanol and 0.1 N HCl, together with 50 g of urografin, were instilled into the stomach through an esophageal tube placed by neck dissection, with 5 g of lithium carbonate (alkaline model) or without lithium carbonate (acidic model). Multiple samples were taken after 48 hours at room temperature (mean hourly room temperature range: 15.6-20.7°C, n = 9). The pH of the gastric contents had no significant effect (alkaline model range = 8.3-8.9, n = 5; acidic model range = 3.4-3.8, n = 5). Drug diffusion was most evident at the base of the left lung, at concentrations of 0.1-13.9 µg/g for Ami, 65-524 for Par, and 13-161 for lithium. Similarly, substantial concentrations were found in the left lobe of the liver (Ami, 0.1-54.9; Par, 7-218; lithium, 7-39), the spleen (Ami, 0.6-24.3; Par, 104-663; lithium, 27-106), and the pericardial fluid (Ami, 0-4.5; Par, 48-641; lithium, 12-56). Into the gallbladder, cardiac blood, aortic blood and the inferior vena cava

  4. a Psycholinguistic Model for Simultaneous Translation, and Proficiency Assessment by Automated Acoustic Analysis of Discourse.

    Science.gov (United States)

    Yaghi, Hussein M.

    Two separate but related issues are addressed: how simultaneous translation (ST) works on a cognitive level and how such translation can be objectively assessed. Both of these issues are discussed in the light of qualitative and quantitative analyses of a large corpus of recordings of ST and shadowing. The proposed ST model utilises knowledge derived from a discourse analysis of the data, many accepted facts in the psychology tradition, and evidence from controlled experiments that are carried out here. This model has three advantages: (i) it is based on analyses of extended spontaneous speech rather than word-, syllable-, or clause -bound stimuli; (ii) it draws equally on linguistic and psychological knowledge; and (iii) it adopts a non-traditional view of language called 'the linguistic construction of reality'. The discourse-based knowledge is also used to develop three computerised systems for the assessment of simultaneous translation: one is a semi-automated system that treats the content of the translation; and two are fully automated, one of which is based on the time structure of the acoustic signals whilst the other is based on their cross-correlation. For each system, several parameters of performance are identified, and they are correlated with assessments rendered by the traditional, subjective, qualitative method. Using signal processing techniques, the acoustic analysis of discourse leads to the conclusion that quality in simultaneous translation can be assessed quantitatively with varying degrees of automation. It identifies as measures of performance (i) three content-based standards; (ii) four time management parameters that reflect the influence of the source on the target language time structure; and (iii) two types of acoustical signal coherence. Proficiency in ST is shown to be directly related to coherence and speech rate but inversely related to omission and delay. High proficiency is associated with a high degree of simultaneity and

  5. An automated method to build groundwater model hydrostratigraphy from airborne electromagnetic data and lithological borehole logs

    DEFF Research Database (Denmark)

    Marker, Pernille Aabye; Foged, N.; He, X.

    2015-01-01

    of electrical resistivity and clay fraction are classified into hydrostratigraphic zones using k-means clustering. Hydraulic conductivity values of the zones are estimated by hydrological calibration using hydraulic head and stream discharge observations. The method is applied to a Danish case study....... Benchmarking hydrological performance by comparison of simulated hydrological state variables, the cluster model performed competitively. Calibrations of 11 hydrostratigraphic cluster models with 1–11 hydraulic conductivity zones showed improved hydrological performance with increasing number of clusters....... Beyond the 5-cluster model hydrological performance did not improve. Due to reproducibility and possibility of method standardization and automation, we believe that hydrostratigraphic model generation with the proposed method has important prospects for groundwater models used in water resources...
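The clustering step named above can be sketched with a minimal k-means over (resistivity, clay-fraction) pairs. The sample values are invented, and the subsequent hydrological calibration of zone conductivities is not reproduced here:

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Minimal k-means over (resistivity, clay-fraction) pairs. Only the
    zonation step is sketched; hydraulic conductivities of the resulting
    zones would then be estimated by hydrological calibration."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda c: sum((a - b) ** 2
                                            for a, b in zip(p, centroids[c])))
            clusters[nearest].append(p)
        centroids = [tuple(sum(xs) / len(xs) for xs in zip(*cl)) if cl else centroids[i]
                     for i, cl in enumerate(clusters)]
    return centroids, clusters

# Invented (log-resistivity, clay-fraction) samples forming two obvious zones:
pts = [(1.0, 0.80), (1.1, 0.75), (0.9, 0.85), (2.5, 0.10), (2.6, 0.15), (2.4, 0.05)]
centroids, zones = kmeans(pts, 2)
```

Increasing `k` corresponds to the 1-11 zone experiments described in the abstract, where hydrological performance stopped improving beyond five clusters.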

  6. Intelligent sensor-model automated control of PMR-15 autoclave processing

    Science.gov (United States)

    Hart, S.; Kranbuehl, D.; Loos, A.; Hinds, B.; Koury, J.

    1992-01-01

    An intelligent sensor model system has been built and used for automated control of the PMR-15 cure process in the autoclave. The system uses frequency-dependent FM sensing (FDEMS), the Loos processing model, and the Air Force QPAL intelligent software shell. The Loos model is used to predict and optimize the cure process including the time-temperature dependence of the extent of reaction, flow, and part consolidation. The FDEMS sensing system in turn monitors, in situ, the removal of solvent, changes in the viscosity, reaction advancement and cure completion in the mold continuously throughout the processing cycle. The sensor information is compared with the optimum processing conditions from the model. The QPAL composite cure control system allows comparison of the sensor monitoring with the model predictions to be broken down into a series of discrete steps and provides a language for making decisions on what to do next regarding time-temperature and pressure.

  7. Modeling take-over performance in level 3 conditionally automated vehicles.

    Science.gov (United States)

    Gold, Christian; Happee, Riender; Bengler, Klaus

    2017-11-28

    Taking over vehicle control from a Level 3 conditionally automated vehicle can be a demanding task for a driver. The take-over determines the controllability of automated vehicle functions and thereby also traffic safety. This paper presents models predicting the main take-over performance variables take-over time, minimum time-to-collision, brake application and crash probability. These variables are considered in relation to the situational and driver-related factors time-budget, traffic density, non-driving-related task, repetition, the current lane and driver's age. Regression models were developed using 753 take-over situations recorded in a series of driving simulator experiments. The models were validated with data from five other driving simulator experiments of mostly unrelated authors with another 729 take-over situations. The models accurately captured take-over time, time-to-collision and crash probability, and moderately predicted the brake application. Especially the time-budget, traffic density and the repetition strongly influenced the take-over performance, while the non-driving-related tasks, the lane and drivers' age explained a minor portion of the variance in the take-over performances. Copyright © 2017 Elsevier Ltd. All rights reserved.
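The modelling approach amounts to least-squares regression of take-over performance on situational factors. A sketch on synthetic data (the coefficients and noise level are invented, not the published estimates):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
time_budget = rng.uniform(3.0, 9.0, n)   # seconds available for the take-over
traffic = rng.integers(0, 30, n)         # surrounding vehicles (density proxy)
# Synthetic ground truth: take-over time grows with both factors.
take_over_time = 1.5 + 0.25 * time_budget + 0.03 * traffic + rng.normal(0, 0.2, n)

# Ordinary least squares: intercept, time-budget and traffic-density coefficients.
X = np.column_stack([np.ones(n), time_budget, traffic])
coef, *_ = np.linalg.lstsq(X, take_over_time, rcond=None)
```

Validation against held-out experiments, as the authors did with 729 additional take-over situations, is the step that distinguishes a predictive model from an in-sample fit.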

  8. GoSam 2.0. Automated one loop calculations within and beyond the standard model

    International Nuclear Information System (INIS)

    Greiner, Nicolas; Deutsches Elektronen-Synchrotron

    2014-10-01

    We present GoSam 2.0, a fully automated framework for the generation and evaluation of one-loop amplitudes in multi-leg processes. The new version offers numerous improvements both on the generation side and on the reduction side. This leads to faster and more stable code for calculations within and beyond the Standard Model. Furthermore, it contains the extended version of the standardized interface to Monte Carlo programs, which allows for an easy combination with other existing tools. We briefly describe the conceptual innovations and present some phenomenological results.

  9. Electronic design automation of analog ICs combining gradient models with multi-objective evolutionary algorithms

    CERN Document Server

    Rocha, Frederico AE; Lourenço, Nuno CC; Horta, Nuno CG

    2013-01-01

    This book applies to the scientific area of electronic design automation (EDA) and addresses the automatic sizing of analog integrated circuits (ICs). Particularly, this book presents an approach to enhance a state-of-the-art layout-aware circuit-level optimizer (GENOM-POF), by embedding statistical knowledge from an automatically generated gradient model into the multi-objective multi-constraint optimization kernel based on the NSGA-II algorithm. The results shown allow the designer to explore the different trade-offs of the solution space, both through the achieved device sizes and the resp

  10. Quantification of Pulmonary Fibrosis in a Bleomycin Mouse Model Using Automated Histological Image Analysis.

    Directory of Open Access Journals (Sweden)

    Jean-Claude Gilhodes

    Full Text Available Current literature on pulmonary fibrosis induced in animal models highlights the need for an accurate, reliable and reproducible quantitative histological analysis. One of the major limits of histological scoring is that it is observer-dependent and consequently subject to variability, which may preclude comparative studies between different laboratories. To achieve a reliable and observer-independent quantification of lung fibrosis we developed automated software for histological image analysis performed on digital images of entire lung sections. This automated analysis was compared to standard evaluation methods with regard to its validation as an end-point measure of fibrosis. Lung fibrosis was induced in mice by intratracheal administration of bleomycin (BLM) at 0.25, 0.5, 0.75 and 1 mg/kg. A detailed characterization of BLM-induced fibrosis was performed 14 days after BLM administration using lung function testing, micro-computed tomography and Ashcroft scoring analysis. Quantification of fibrosis by automated analysis was assessed based on pulmonary tissue density measured from thousands of micro-tiles processed from digital images of entire lung sections. Prior to analysis, large bronchi and vessels were manually excluded from the original images. Measurement of fibrosis has been expressed by two indexes: the mean pulmonary tissue density and the high pulmonary tissue density frequency. We showed that tissue density indexes gave access to a very accurate and reliable quantification of morphological changes induced by BLM even at the lowest concentration used (0.25 mg/kg). A reconstructed 2D image of the entire lung section at high resolution (3.6 μm/pixel) was generated from tissue density values, allowing the visualization of their distribution throughout fibrotic and non-fibrotic regions. A significant correlation (p<0.0001) was found between automated analysis and the above standard evaluation methods. This correlation
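The two tissue-density indexes described above can be sketched as simple statistics over per-tile densities; the high-density threshold here is a placeholder, not the published value:

```python
import numpy as np

def fibrosis_indexes(tile_densities, high_threshold=0.6):
    """Mean pulmonary tissue density and high-density tile frequency, computed
    from per-tile tissue densities (fraction of tissue pixels per micro-tile).
    The 0.6 threshold for a 'high density' tile is an illustrative assumption."""
    d = np.asarray(tile_densities, dtype=float)
    return float(d.mean()), float((d > high_threshold).mean())

# Invented per-tile densities: low values ~ aerated lung, high values ~ fibrosis.
tiles = [0.2, 0.3, 0.25, 0.7, 0.8, 0.65, 0.1, 0.9]
mean_density, high_freq = fibrosis_indexes(tiles)
```

Both indexes are observer-independent by construction once bronchi and vessels are excluded, which is the property the study needed for cross-laboratory comparability.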

  11. Automated Translation and Thermal Zoning of Digital Building Models for Energy Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Nathaniel L. [Cornell University; McCrone, Colin J. [Cornell University; Walter, Bruce J. [Cornell University; Pratt, Kevin B. [Cornell University; Greenberg, Donald P. [Cornell University

    2013-08-26

    Building energy simulation is valuable during the early stages of design, when decisions can have the greatest impact on energy performance. However, preparing digital design models for building energy simulation typically requires tedious manual alteration. This paper describes a series of five automated steps to translate geometric data from an unzoned CAD model into a multi-zone building energy model. First, CAD input is interpreted as geometric surfaces with materials. Second, surface pairs defining walls of various thicknesses are identified. Third, normal directions of unpaired surfaces are determined. Fourth, space boundaries are defined. Fifth, optionally, settings from previous simulations are applied, and spaces are aggregated into a smaller number of thermal zones. Building energy models created quickly using this method can offer guidance throughout the design process.

  12. Restructuring of workflows to minimise errors via stochastic model checking: An automated evolutionary approach

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee

    2016-01-01

    This article presents a framework for the automated restructuring of stochastic workflows to reduce the impact of faults. The framework allows for the modelling of workflows by means of a formalised subset of the BPMN workflow language. We extend this modelling formalism to describe faults...... and incorporate an intention preserving stochastic semantics able to model both probabilistic- and non-deterministic behaviour. Stochastic model checking techniques are employed to generate the state-space of a given workflow. Possible improvements obtained by restructuring are measured by employing the framework......'s capacity for tracking real-valued quantities associated with states and transitions of the workflow. The space of possible restructurings of a workflow are explored by means of an evolutionary algorithm, where the goals for improvement are defined in terms of optimising quantities, typically employed...

  13. Modeling and simulation of networked automation and control systems in Modelica; Modellierung und Simulation vernetzter Automatisierungs- und Regelungssysteme in Modelica

    Energy Technology Data Exchange (ETDEWEB)

    Frey, Georg; Liu, Liu [Universitaet des Saarlandes, Saarbruecken (Germany). Lehrstuhl fuer Automatisierungstechnik

    2009-07-01

    The use of network technologies in automation systems is increasing. The analysis of the resulting systems by simulation requires libraries of models that describe the temporal behavior of automation components and communication networks. In this paper, such a library is presented. It was developed using the modeling language Modelica. The resulting models can be simulated, for example, in the tool Dymola. The application of the presented models in open-loop response time analysis as well as in closed-loop analysis of networked control systems is illustrated by examples. Additionally, an approach to reduce the computational cost in the resulting hybrid simulation is presented. (orig.)

  14. Spud and FLML: generalising and automating the user interfaces of scientific computer models

    Science.gov (United States)

    Ham, D. A.; Farrell, P. E.; Maddison, J. R.; Gorman, G. J.; Wilson, C. R.; Kramer, S. C.; Shipton, J.; Collins, G. S.; Cotter, C. J.; Piggott, M. D.

    2009-04-01

    The interfaces by which users specify the scenarios to be simulated by scientific computer models are frequently primitive, under-documented and ad hoc text files, which make using the model in question difficult and error-prone and significantly increase the development cost of the model. We present a model-independent system, Spud [1], which formalises the specification of model input formats in terms of formal grammars. This is combined with an automatically generated graphical user interface which guides users to create valid model inputs based on the grammar provided, and a generic options-reading module which minimises the development cost of adding model options. We further present FLML, the Fluidity Markup Language. FLML applies Spud to the Imperial College Ocean Model (ICOM), resulting in a graphically driven system which radically improves the usability of ICOM. As well as a step forward for ICOM, FLML illustrates how the Spud system can be applied to an existing complex ocean model, highlighting the potential of Spud as a user interface for other codes in the ocean modelling community. [1] Ham, D. A. et al., Spud 1.0: generalising and automating the user interfaces of scientific computer models, Geosci. Model Dev. Discuss., 1, 125-146, 2008.
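The core idea, validating model options against a declarative description of the input format before the model runs, can be sketched with a toy schema checker. The schema format below is invented for illustration; Spud itself works on an XML options tree validated against a formal grammar:

```python
# Toy grammar-driven option validation: options are checked against a
# declarative schema up front, so invalid inputs are rejected before a run.
SCHEMA = {
    "timestep": {"type": float, "required": True},
    "solver":   {"type": str,   "required": True, "choices": {"cg", "gmres"}},
    "verbose":  {"type": bool,  "required": False},
}

def validate(options, schema):
    """Return a list of human-readable errors; an empty list means valid."""
    errors = []
    for key, rule in schema.items():
        if key not in options:
            if rule.get("required"):
                errors.append(f"missing required option: {key}")
            continue
        val = options[key]
        if not isinstance(val, rule["type"]):
            errors.append(f"{key}: expected {rule['type'].__name__}")
        elif "choices" in rule and val not in rule["choices"]:
            errors.append(f"{key}: must be one of {sorted(rule['choices'])}")
    for key in options:
        if key not in schema:
            errors.append(f"unknown option: {key}")
    return errors

errs = validate({"timestep": 0.1, "solver": "sor"}, SCHEMA)
```

A GUI generated from the same schema can only ever produce inputs that pass this check, which is the usability gain the abstract describes.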

  15. Generating Phenotypical Erroneous Human Behavior to Evaluate Human-automation Interaction Using Model Checking.

    Science.gov (United States)

    Bolton, Matthew L; Bass, Ellen J; Siminiceanu, Radu I

    2012-11-01

    Breakdowns in complex systems often occur as a result of system elements interacting in unanticipated ways. In systems with human operators, human-automation interaction associated with both normative and erroneous human behavior can contribute to such failures. Model-driven design and analysis techniques provide engineers with formal methods tools and techniques capable of evaluating how human behavior can contribute to system failures. This paper presents a novel method for automatically generating task analytic models encompassing both normative and erroneous human behavior from normative task models. The generated erroneous behavior is capable of replicating Hollnagel's zero-order phenotypes of erroneous action for omissions, jumps, repetitions, and intrusions. Multiple phenotypical acts can occur in sequence, thus allowing for the generation of higher order phenotypes. The task behavior model pattern capable of generating erroneous behavior can be integrated into a formal system model so that system safety properties can be formally verified with a model checker. This allows analysts to prove that a human-automation interactive system (as represented by the model) will or will not satisfy safety properties with both normative and generated erroneous human behavior. We present benchmarks related to the size of the statespace and verification time of models to show how the erroneous human behavior generation process scales. We demonstrate the method with a case study: the operation of a radiation therapy machine. A potential problem resulting from a generated erroneous human action is discovered. A design intervention is presented which prevents this problem from occurring. We discuss how our method could be used to evaluate larger applications and recommend future paths of development.
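A simplified flavour of the phenotype generation over a flat action sequence (the paper instead generates these within a formal task model for model checking; the action names are hypothetical):

```python
def zero_order_phenotypes(actions, alphabet):
    """Enumerate simple zero-order erroneous-action phenotypes for a normative
    action sequence: omissions, repetitions, and intrusions (insertions of a
    foreign action). Jumps and higher-order phenotypes are omitted for brevity."""
    variants = set()
    for i in range(len(actions)):
        variants.add(tuple(actions[:i] + actions[i + 1:]))   # omission of step i
        variants.add(tuple(actions[:i + 1] + actions[i:]))   # repetition of step i
        for a in alphabet:                                   # intrusion before step i
            variants.add(tuple(actions[:i] + [a] + actions[i:]))
    variants.discard(tuple(actions))  # keep only genuinely erroneous sequences
    return variants

normative = ["select_mode", "enter_dose", "confirm"]
erroneous = zero_order_phenotypes(normative, alphabet=["cancel"])
```

Composing such variants in sequence yields the higher-order phenotypes mentioned above, and feeding them into a formal system model lets a model checker test safety properties under erroneous as well as normative behavior.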

  16. Generating Phenotypical Erroneous Human Behavior to Evaluate Human-automation Interaction Using Model Checking

    Science.gov (United States)

    Bolton, Matthew L.; Bass, Ellen J.; Siminiceanu, Radu I.

    2012-01-01

    Breakdowns in complex systems often occur as a result of system elements interacting in unanticipated ways. In systems with human operators, human-automation interaction associated with both normative and erroneous human behavior can contribute to such failures. Model-driven design and analysis techniques provide engineers with formal methods tools and techniques capable of evaluating how human behavior can contribute to system failures. This paper presents a novel method for automatically generating task analytic models encompassing both normative and erroneous human behavior from normative task models. The generated erroneous behavior is capable of replicating Hollnagel’s zero-order phenotypes of erroneous action for omissions, jumps, repetitions, and intrusions. Multiple phenotypical acts can occur in sequence, thus allowing for the generation of higher order phenotypes. The task behavior model pattern capable of generating erroneous behavior can be integrated into a formal system model so that system safety properties can be formally verified with a model checker. This allows analysts to prove that a human-automation interactive system (as represented by the model) will or will not satisfy safety properties with both normative and generated erroneous human behavior. We present benchmarks related to the size of the statespace and verification time of models to show how the erroneous human behavior generation process scales. We demonstrate the method with a case study: the operation of a radiation therapy machine. A potential problem resulting from a generated erroneous human action is discovered. A design intervention is presented which prevents this problem from occurring. We discuss how our method could be used to evaluate larger applications and recommend future paths of development. PMID:23105914

  17. A Co-Opetitive Automated Negotiation Model for Vertical Allied Enterprises Teams and Stakeholders

    Directory of Open Access Journals (Sweden)

    Taiguang Gao

    2018-04-01

    Full Text Available Upstream and downstream supply chain enterprises often form a tactical vertical alliance to enhance their operational efficiency and maintain their competitive edges in the market. Hence, it is critical for an alliance to collaborate over its internal resources and resolve the profit conflicts among members, so that the functionality required by stakeholders can be fulfilled. As an effective solution, automated negotiation between the vertically allied enterprise team and the stakeholder makes full use of emerging team advantages and significantly reduces profit conflicts within the team through group decisions rather than unilateral decisions by a leader. In this paper, an automated negotiation model is designed to describe both the collaborative game process among the team members and the competitive negotiation process between the allied team and the stakeholder. Considering the co-opetitive nature of the vertically allied team, the designed model helps each team member make decisions in its own interest, and the team counter-offers for the ongoing negotiation are generated through a non-cooperative game process, where the profit derived from the negotiation result is distributed with the Shapley value method according to the contribution or importance of each team member. Finally, a case study is given to demonstrate the effectiveness of the designed model.
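The Shapley value profit distribution mentioned above averages each member's marginal contribution over all orders in which the coalition could form. A toy two-player sketch with an invented characteristic function:

```python
from itertools import permutations
from math import factorial

def shapley_values(players, value):
    """Shapley value profit split: average each player's marginal contribution
    over all join orders. `value` maps a frozenset of players to the profit
    that coalition can secure on its own."""
    phi = {p: 0.0 for p in players}
    for order in permutations(players):
        coalition = frozenset()
        for p in order:
            phi[p] += value(coalition | {p}) - value(coalition)
            coalition = coalition | {p}
    n_fact = factorial(len(players))
    return {p: v / n_fact for p, v in phi.items()}

# Toy alliance: supplier S and manufacturer M each earn 1 alone, 4 together.
v = {frozenset(): 0, frozenset({"S"}): 1, frozenset({"M"}): 1,
     frozenset({"S", "M"}): 4}
split = shapley_values(["S", "M"], lambda s: v[s])
```

In the paper the characteristic function would come from the negotiated outcome rather than a fixed table; the split rewards each member in proportion to what it adds to the coalition.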

  18. Collaborative Model-based Systems Engineering for Cyber-Physical Systems, with a Building Automation Case Study

    DEFF Research Database (Denmark)

    Fitzgerald, John; Gamble, Carl; Payne, Richard

    2016-01-01

    We describe an approach to the model-based engineering of cyber-physical systems that permits the coupling of diverse discrete-event and continuous-time models and their simulators. A case study in the building automation domain demonstrates how such co-models and co-simulation can promote early...

  19. A conceptual model of the automated credibility assessment of the volunteered geographic information

    International Nuclear Information System (INIS)

    Idris, N H; Jackson, M J; Ishak, M H I

    2014-01-01

    The use of Volunteered Geographic Information (VGI) in collecting, sharing and disseminating geospatially referenced information on the Web is increasingly common. The potential of this localized and collective information has been seen to complement the maintenance process of authoritative mapping data sources and to support the development of Digital Earth. The main barrier to the use of these data in supporting this bottom-up approach is the credibility (trust), completeness, accuracy, and quality of both the data input and the outputs generated. The only feasible approach to assessing these data is to rely on an automated process. This paper describes a conceptual model of indicators (parameters) and practical approaches to automatically assess the credibility of information contributed through VGI, including map mashups, Geo Web and crowd-sourced applications. Two main components are proposed to be assessed in the conceptual model: metadata and data. The metadata component comprises indicators for the hosting websites and the sources of data / information. The data component comprises indicators to assess absolute and relative data positioning, attribute, thematic, temporal and geometric correctness and consistency. This paper suggests approaches to assess both components. To assess the metadata component, automated text categorization using supervised machine learning is proposed. To assess correctness and consistency in the data component, we suggest a matching validation approach using current emerging technologies from Linked Data infrastructures and third-party review validation. This study contributes to the research domain that focuses on the credibility, trust and quality issues of data contributed by citizen web providers.
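The proposed supervised text categorization of metadata can be sketched with a minimal bag-of-words Naive Bayes classifier; the training snippets and credibility labels are invented for illustration:

```python
from collections import Counter
import math

def train(docs):
    """docs: list of (text, label). Returns label counts, per-label word
    counts, and the vocabulary for Laplace-smoothed Naive Bayes."""
    labels = Counter(label for _, label in docs)
    words = {label: Counter() for label in labels}
    for text, label in docs:
        words[label].update(text.lower().split())
    vocab = {w for counts in words.values() for w in counts}
    return labels, words, vocab

def classify(text, model):
    """Pick the label maximising the smoothed log-posterior of the text."""
    labels, words, vocab = model
    total = sum(labels.values())
    best, best_lp = None, -math.inf
    for label in labels:
        lp = math.log(labels[label] / total)
        denom = sum(words[label].values()) + len(vocab)
        for w in text.lower().split():
            lp += math.log((words[label][w] + 1) / denom)  # Laplace smoothing
        if lp > best_lp:
            best, best_lp = label, lp
    return best

model = train([("official government mapping agency portal", "credible"),
               ("peer reviewed survey data archive", "credible"),
               ("anonymous blog unverified rumors", "not_credible"),
               ("spam link farm advertisements", "not_credible")])
label = classify("national mapping agency data portal", model)
```

In the conceptual model this text score would be one indicator among several, combined with the positional, attribute and temporal consistency checks on the data component.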

  20. Abstraction and Learning for Infinite-State Compositional Verification

    Directory of Open Access Journals (Sweden)

    Dimitra Giannakopoulou

    2013-09-01

    Full Text Available Despite many advances that enable the application of model checking techniques to the verification of large systems, the state-explosion problem remains the main challenge for scalability. Compositional verification addresses this challenge by decomposing the verification of a large system into the verification of its components. Recent techniques use learning-based approaches to automate compositional verification based on the assume-guarantee style reasoning. However, these techniques are only applicable to finite-state systems. In this work, we propose a new framework that interleaves abstraction and learning to perform automated compositional verification of infinite-state systems. We also discuss the role of learning and abstraction in the related context of interface generation for infinite-state components.

  1. Automation, Control and Modeling of Compound Semiconductor Thin-Film Growth

    Energy Technology Data Exchange (ETDEWEB)

    Breiland, W.G.; Coltrin, M.E.; Drummond, T.J.; Horn, K.M.; Hou, H.Q.; Klem, J.F.; Tsao, J.Y.

    1999-02-01

    This report documents the results of a laboratory-directed research and development (LDRD) project on control and agile manufacturing in the critical metalorganic chemical vapor deposition (MOCVD) and molecular beam epitaxy (MBE) materials growth processes essential to high-speed microelectronics and optoelectronic components. This effort is founded on a modular and configurable process automation system that serves as a backbone allowing integration of process-specific models and sensors. We have developed and integrated MOCVD- and MBE-specific models in this system, and demonstrated the effectiveness of sensor-based feedback control in improving the accuracy and reproducibility of semiconductor heterostructures. In addition, within this framework we have constructed ''virtual reactor'' models for growth processes, with the goal of greatly shortening the epitaxial growth process development cycle.

  2. System Operations Studies for Automated Guideway Transit Systems : Discrete Event Simulation Model Programmer's Manual

    Science.gov (United States)

    1982-07-01

    In order to examine specific automated guideway transit (AGT) developments and concepts, UMTA undertook a program of studies and technology investigations called Automated Guideway Transit Technology (AGTT) Program. The objectives of one segment of t...

  3. Automated 3D Damaged Cavity Model Builder for Lower Surface Acreage Tile on Orbiter

    Science.gov (United States)

    Belknap, Shannon; Zhang, Michael

    2013-01-01

The 3D Automated Thermal Tool for Damaged Acreage Tile Math Model builder was developed to quickly and accurately perform 3D thermal analyses on damaged lower surface acreage tiles and on the structures beneath the damaged locations on a Space Shuttle Orbiter. The 3D model builder created both TRASYS geometric math models (GMMs) and SINDA thermal math models (TMMs) to simulate an idealized damaged cavity in the damaged tile(s). The GMMs are processed in TRASYS to generate radiation conductors between the surfaces in the cavity. The radiation conductors are inserted into the TMMs, which are processed in SINDA to generate temperature histories for all of the nodes on each layer of the TMM. The invention allows a thermal analyst to quickly and accurately create a 3D model of a damaged lower surface tile on the orbiter. The 3D model builder can generate a GMM and the corresponding TMM in one or two minutes, with the damaged cavity included in the tile material. A separate program creates a configuration file, which takes a couple of minutes to edit. This configuration file is read by the model builder program to determine the location of the damage and the correct tile type, tile thickness, structure thickness, and SIP thickness at the damage site, so that the model builder program can build an accurate model at the specified location. Once the models are built, they are processed by TRASYS and SINDA.

  4. EBS Radionuclide Transport Abstraction

    International Nuclear Information System (INIS)

    Schreiner, R.

    2001-01-01

The purpose of this work is to develop the Engineered Barrier System (EBS) radionuclide transport abstraction model, as directed by a written development plan (CRWMS M and O 1999a). This abstraction is the conceptual model that will be used to determine the rate of release of radionuclides from the EBS to the unsaturated zone (UZ) in the total system performance assessment-license application (TSPA-LA). In particular, this model will be used to quantify the time-dependent radionuclide releases from a failed waste package (WP) and their subsequent transport through the EBS to the emplacement drift wall/UZ interface. The development of this conceptual model will allow Performance Assessment Operations (PAO) and its Engineered Barrier Performance Department to provide a more detailed and complete EBS flow and transport abstraction. The results from this conceptual model will allow PAO to address portions of the key technical issues (KTIs) presented in three NRC Issue Resolution Status Reports (IRSRs): (1) the Evolution of the Near-Field Environment (ENFE), Revision 2 (NRC 1999a), (2) the Container Life and Source Term (CLST), Revision 2 (NRC 1999b), and (3) the Thermal Effects on Flow (TEF), Revision 1 (NRC 1998). The conceptual model for flow and transport in the EBS will be referred to as the ''EBS RT Abstraction'' in this analysis/modeling report (AMR). The scope of this abstraction and report is limited to flow and transport processes. More specifically, this AMR does not discuss elements of the TSPA-SR and TSPA-LA that relate to the EBS but are discussed in other AMRs. These elements include corrosion processes, radionuclide solubility limits, waste form dissolution rates, and concentrations of colloidal particles that are generally represented as boundary conditions or input parameters for the EBS RT Abstraction.
In effect, this AMR provides the algorithms for transporting radionuclides using the flow geometry and radionuclide concentrations determined by other

  5. The Development Of Mathematical Model For Automated Fingerprint Identification Systems Analysis

    International Nuclear Information System (INIS)

    Ardisasmita, M. Syamsa

    2001-01-01

A fingerprint has a strongly oriented and periodic structure composed of dark lines of raised skin (ridges) and clear lines of lowered skin (furrows) that twist to form a distinct pattern. Although the manner in which the ridges flow is distinctive, other characteristics of the fingerprint, called minutiae, are what are most unique to the individual. These features are particular patterns consisting of terminations or bifurcations of the ridges. To assert whether two fingerprints are from the same finger or not, experts detect those minutiae. AFIS (Automated Fingerprint Identification Systems) extract and compare these features to determine a match. The classic methods of fingerprint recognition are not suitable for direct implementation as computer algorithms; the creation of a finger model was therefore necessary for the development of new and better algorithms of analysis. This paper presents a new numerical method for fingerprint simulation based on a mathematical model of the arrangement of dermatoglyphics and the creation of minutiae. This paper also describes the design and implementation of an automated fingerprint identification system which operates in two stages: minutiae extraction and minutiae matching
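The minutiae-matching stage of an AFIS can be sketched as a greedy one-to-one pairing of (x, y, angle) triplets under distance and angle tolerances. This is a generic illustration rather than the paper's numerical model; the tolerances and point sets below are invented.

```python
import math

def match_minutiae(set_a, set_b, dist_tol=10.0, angle_tol=math.pi / 8):
    """Greedy one-to-one matching of minutiae between two prints.
    Each minutia is (x, y, theta); returns the number of matched pairs."""
    unused = list(set_b)
    matches = 0
    for (xa, ya, ta) in set_a:
        best, best_d = None, dist_tol
        for m in unused:
            xb, yb, tb = m
            d = math.hypot(xa - xb, ya - yb)
            # angular difference wrapped into [0, pi]
            dt = abs((ta - tb + math.pi) % (2 * math.pi) - math.pi)
            if d <= best_d and dt <= angle_tol:
                best, best_d = m, d
        if best is not None:
            unused.remove(best)  # each minutia may match at most once
            matches += 1
    return matches

# Two invented minutiae sets; the first two points correspond, the third does not
a = [(10, 10, 0.0), (50, 40, 1.0), (80, 20, 2.0)]
b = [(12, 11, 0.1), (49, 42, 1.1), (200, 200, 0.0)]
print(match_minutiae(a, b))  # 2
```

A real AFIS would first align the two prints (rotation and translation) and score the match count against the total number of minutiae.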

  6. EBS Radionuclide Transport Abstraction

    International Nuclear Information System (INIS)

    J. Prouty

    2006-01-01

    The purpose of this report is to develop and analyze the engineered barrier system (EBS) radionuclide transport abstraction model, consistent with Level I and Level II model validation, as identified in Technical Work Plan for: Near-Field Environment and Transport: Engineered Barrier System: Radionuclide Transport Abstraction Model Report Integration (BSC 2005 [DIRS 173617]). The EBS radionuclide transport abstraction (or EBS RT Abstraction) is the conceptual model used in the total system performance assessment (TSPA) to determine the rate of radionuclide releases from the EBS to the unsaturated zone (UZ). The EBS RT Abstraction conceptual model consists of two main components: a flow model and a transport model. Both models are developed mathematically from first principles in order to show explicitly what assumptions, simplifications, and approximations are incorporated into the models used in the TSPA. The flow model defines the pathways for water flow in the EBS and specifies how the flow rate is computed in each pathway. Input to this model includes the seepage flux into a drift. The seepage flux is potentially split by the drip shield, with some (or all) of the flux being diverted by the drip shield and some passing through breaches in the drip shield that might result from corrosion or seismic damage. The flux through drip shield breaches is potentially split by the waste package, with some (or all) of the flux being diverted by the waste package and some passing through waste package breaches that might result from corrosion or seismic damage. Neither the drip shield nor the waste package survives an igneous intrusion, so the flux splitting submodel is not used in the igneous scenario class. The flow model is validated in an independent model validation technical review. The drip shield and waste package flux splitting algorithms are developed and validated using experimental data. 
The transport model considers advective transport and diffusive transport

  7. EBS Radionuclide Transport Abstraction

    Energy Technology Data Exchange (ETDEWEB)

    J. Prouty

    2006-07-14

    The purpose of this report is to develop and analyze the engineered barrier system (EBS) radionuclide transport abstraction model, consistent with Level I and Level II model validation, as identified in Technical Work Plan for: Near-Field Environment and Transport: Engineered Barrier System: Radionuclide Transport Abstraction Model Report Integration (BSC 2005 [DIRS 173617]). The EBS radionuclide transport abstraction (or EBS RT Abstraction) is the conceptual model used in the total system performance assessment (TSPA) to determine the rate of radionuclide releases from the EBS to the unsaturated zone (UZ). The EBS RT Abstraction conceptual model consists of two main components: a flow model and a transport model. Both models are developed mathematically from first principles in order to show explicitly what assumptions, simplifications, and approximations are incorporated into the models used in the TSPA. The flow model defines the pathways for water flow in the EBS and specifies how the flow rate is computed in each pathway. Input to this model includes the seepage flux into a drift. The seepage flux is potentially split by the drip shield, with some (or all) of the flux being diverted by the drip shield and some passing through breaches in the drip shield that might result from corrosion or seismic damage. The flux through drip shield breaches is potentially split by the waste package, with some (or all) of the flux being diverted by the waste package and some passing through waste package breaches that might result from corrosion or seismic damage. Neither the drip shield nor the waste package survives an igneous intrusion, so the flux splitting submodel is not used in the igneous scenario class. The flow model is validated in an independent model validation technical review. The drip shield and waste package flux splitting algorithms are developed and validated using experimental data. 
The transport model considers advective transport and diffusive transport

  8. Reflective Abstraction and Representation.

    Science.gov (United States)

    Lewin, Philip

    Piaget's theory of reflective abstraction can supplement cognitive science models of representation by specifying both the act of construction and the component steps through which knowers pass as they acquire knowledge. But, while approaches suggested by cognitive science supplement Piaget by awakening researchers to the role of auxiliary factors…

  9. Testing abstract behavioral specifications

    NARCIS (Netherlands)

    P.Y.H. Wong; R. Bubel (Richard); F.S. de Boer (Frank); C.P.T. de Gouw (Stijn); M. Gómez-Zamalloa; R Haehnle; K. Meinke; M.A. Sindhu

    2015-01-01

We present a range of testing techniques for the Abstract Behavioral Specification (ABS) language and apply them to an industrial case study. ABS is a formal modeling language for highly variable, concurrent, component-based systems. The nature of these systems makes them susceptible to

  10. Abstracts and Abstracting in Knowledge Discovery.

    Science.gov (United States)

    Pinto, Maria; Lancaster, F. W.

    1999-01-01

    Presents various levels of criteria for judging the quality of abstracts and abstracting. Requirements for abstracts to be read by humans are compared with requirements for those to be searched by computer. Concludes that the wide availability of complete text in electronic form does not reduce the value of abstracts for information retrieval.…

  11. Quantification of Pulmonary Fibrosis in a Bleomycin Mouse Model Using Automated Histological Image Analysis.

    Science.gov (United States)

    Gilhodes, Jean-Claude; Julé, Yvon; Kreuz, Sebastian; Stierstorfer, Birgit; Stiller, Detlef; Wollin, Lutz

    2017-01-01

Current literature on pulmonary fibrosis induced in animal models highlights the need for an accurate, reliable and reproducible histological quantitative analysis. One of the major limits of histological scoring is that it is observer-dependent and consequently subject to variability, which may preclude comparative studies between different laboratories. To achieve a reliable and observer-independent quantification of lung fibrosis, we developed automated software for histological image analysis performed on digital images of entire lung sections. This automated analysis was compared to standard evaluation methods with regard to its validation as an end-point measure of fibrosis. Lung fibrosis was induced in mice by intratracheal administration of bleomycin (BLM) at 0.25, 0.5, 0.75 and 1 mg/kg. A detailed characterization of BLM-induced fibrosis was performed 14 days after BLM administration using lung function testing, micro-computed tomography and Ashcroft scoring analysis. Quantification of fibrosis by automated analysis was assessed based on pulmonary tissue density measured from thousands of micro-tiles processed from digital images of entire lung sections. Prior to analysis, large bronchi and vessels were manually excluded from the original images. Measurement of fibrosis is expressed by two indexes: the mean pulmonary tissue density and the high pulmonary tissue density frequency. We showed that the tissue density indexes give access to a very accurate and reliable quantification of the morphological changes induced by BLM, even at the lowest concentration used (0.25 mg/kg). A reconstructed 2D image of the entire lung section at high resolution (3.6 μm/pixel) was generated from the tissue density values, allowing visualization of their distribution throughout fibrotic and non-fibrotic regions. A significant correlation (p < …) was found, and this automated quantification of fibrosis in mice will be very valuable for future preclinical drug explorations.
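The tile-based tissue-density indexes can be illustrated on a synthetic image: the density of each micro-tile is the fraction of "tissue" pixels, and the per-tile densities are summarized by the two indexes the record defines. The tile size, intensity threshold, and image below are invented stand-ins, not the paper's parameters.

```python
def tile_densities(img, tile=4, thresh=128):
    """Tissue density per tile: fraction of pixels whose intensity
    exceeds `thresh` (a stand-in for stained-tissue detection)."""
    h, w = len(img), len(img[0])
    out = []
    for r in range(0, h - tile + 1, tile):
        for c in range(0, w - tile + 1, tile):
            pix = [img[r + i][c + j] for i in range(tile) for j in range(tile)]
            out.append(sum(p > thresh for p in pix) / (tile * tile))
    return out

def fibrosis_indexes(densities, high=0.5):
    """The two summary indexes: mean pulmonary tissue density and
    high pulmonary tissue density frequency."""
    mean_density = sum(densities) / len(densities)
    high_freq = sum(d >= high for d in densities) / len(densities)
    return mean_density, high_freq

# Synthetic 8x8 "section": left half dense tissue, right half airspace
img = [[200] * 4 + [50] * 4 for _ in range(8)]
d = tile_densities(img)
print(fibrosis_indexes(d))  # (0.5, 0.5)
```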

  12. Chemical Kinetics of Hydrogen Atom Abstraction from Allylic Sites by 3O2; Implications for Combustion Modeling and Simulation.

    Science.gov (United States)

    Zhou, Chong-Wen; Simmie, John M; Somers, Kieran P; Goldsmith, C Franklin; Curran, Henry J

    2017-03-09

Hydrogen atom abstraction from allylic C-H bonds by molecular oxygen plays a very important role in determining the reactivity of fuel molecules having allylic hydrogen atoms. Rate constants for hydrogen atom abstraction by molecular oxygen from molecules with allylic sites have been calculated. A series of molecules with primary, secondary, tertiary, and super secondary allylic hydrogen atoms from the alkene, furan, and alkylbenzene families is taken into consideration. These molecules include propene, 2-butene, isobutene, 2-methylfuran, and toluene, containing primary allylic hydrogen atoms; 1-butene, 1-pentene, 2-ethylfuran, ethylbenzene, and n-propylbenzene, containing secondary allylic hydrogen atoms; 3-methyl-1-butene, 2-isopropylfuran, and isopropylbenzene, containing tertiary allylic hydrogen atoms; and 1,4-pentadiene, containing super secondary allylic hydrogen atoms. The M06-2X/6-311++G(d,p) level of theory was used to optimize the geometries of all of the reactants, transition states, and products, and to apply hindered-rotation treatments to the lower-frequency modes. The G4 level of theory was used to calculate the electronic single-point energies for those species to determine the 0 K barriers to reaction. Conventional transition state theory with Eckart tunnelling corrections was used to calculate the rate constants. The comparison of our calculated rate constants with the available experimental results from the literature shows good agreement for the reactions of propene and isobutene with molecular oxygen. The rate constant for toluene with O2 is about an order of magnitude slower than that experimentally derived from a comprehensive model proposed by Oehlschlaeger and coauthors. The results clearly indicate the need for a more detailed investigation of the combustion kinetics of toluene oxidation and its key pyrolysis and oxidation intermediates. Despite this, our computed barriers and rate constants retain an important internal consistency.
Rate constants
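The conventional transition-state-theory expression underlying such calculations can be evaluated directly. This sketch uses the schematic form k = κ·(kB·T/h)·exp(−ΔG‡/RT) for a unimolecular step, omits the Eckart tunnelling correction (κ is left as a plain multiplicative factor), and uses an invented illustrative barrier, not the paper's computed values.

```python
import math

KB = 1.380649e-23    # Boltzmann constant, J/K
H  = 6.62607015e-34  # Planck constant, J*s
R  = 8.314462618     # gas constant, J/(mol*K)

def tst_rate(T, dG_act, kappa=1.0):
    """Conventional TST rate constant (s^-1 for a unimolecular step):
    k = kappa * (kB*T/h) * exp(-dG_act / (R*T)),
    with dG_act the Gibbs free energy of activation in J/mol."""
    return kappa * (KB * T / H) * math.exp(-dG_act / (R * T))

# Illustrative 160 kJ/mol barrier, of the order expected for
# abstraction by 3O2 (invented for demonstration)
for T in (800.0, 1200.0, 1600.0):
    print(f"{T:6.0f} K  k = {tst_rate(T, 160e3):.3e} s^-1")
```

The strong temperature dependence of the exponential term is what makes these abstraction channels important only at combustion-relevant temperatures.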

  13. PredicT-ML: a tool for automating machine learning model building with big clinical data.

    Science.gov (United States)

    Luo, Gang

    2016-01-01

    Predictive modeling is fundamental to transforming large clinical data sets, or "big clinical data," into actionable knowledge for various healthcare applications. Machine learning is a major predictive modeling approach, but two barriers make its use in healthcare challenging. First, a machine learning tool user must choose an algorithm and assign one or more model parameters called hyper-parameters before model training. The algorithm and hyper-parameter values used typically impact model accuracy by over 40 %, but their selection requires many labor-intensive manual iterations that can be difficult even for computer scientists. Second, many clinical attributes are repeatedly recorded over time, requiring temporal aggregation before predictive modeling can be performed. Many labor-intensive manual iterations are required to identify a good pair of aggregation period and operator for each clinical attribute. Both barriers result in time and human resource bottlenecks, and preclude healthcare administrators and researchers from asking a series of what-if questions when probing opportunities to use predictive models to improve outcomes and reduce costs. This paper describes our design of and vision for PredicT-ML (prediction tool using machine learning), a software system that aims to overcome these barriers and automate machine learning model building with big clinical data. The paper presents the detailed design of PredicT-ML. PredicT-ML will open the use of big clinical data to thousands of healthcare administrators and researchers and increase the ability to advance clinical research and improve healthcare.

  14. SPECIAL LIBRARIES OF FRAGMENTS OF ALGORITHMIC NETWORKS TO AUTOMATE THE DEVELOPMENT OF ALGORITHMIC MODELS

    Directory of Open Access Journals (Sweden)

    V. E. Marley

    2015-01-01

Full Text Available Summary. The concept of the algorithmic model arose from the algorithmic approach, in which the simulated object or phenomenon is represented as a process governed by the strict rules of an algorithm. An algorithmic model is understood as a formalized description of a subject-matter expert's scenario for the simulated process, whose structure corresponds to the causal and temporal relationships between events of the process being modeled, together with all the information necessary for its software implementation. The structure of algorithmic models is represented by algorithmic networks, normally defined as loaded finite directed graphs whose vertices are mapped to operators and whose arcs are the variables bound by those operators. The language of algorithmic networks is highly expressive; the algorithms it can represent cover the class of all random algorithms. Existing automated modeling systems based on algorithmic networks mainly use operators that work with real numbers. Although this reduces their expressive power, it is sufficient for modeling a wide class of problems related to the economy, the environment, transport, and technical processes. The task of modeling the execution of schedules and network diagrams is relevant and useful. Many systems can compute network graphs; however, their monitoring is based on the analysis of gaps and deadlines in the graphs, with no predictive analysis of schedule execution. The library described here is designed to build such predictive models: given the source data, it produces a set of projections from which one is chosen and adopted as the new plan.
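An algorithmic network can be sketched as a loaded directed graph of operators evaluated over named variables (the arcs). The scheduling example below is invented for illustration and assumes the operators are listed in a valid topological order.

```python
def evaluate_network(operators, inputs):
    """Evaluate an algorithmic network: vertices are operators, arcs are
    named variables. `operators` is a list of (out_var, func, in_vars)
    triples, assumed to be in a valid (topological) order."""
    env = dict(inputs)
    for out_var, func, in_vars in operators:
        env[out_var] = func(*(env[v] for v in in_vars))
    return env

# A toy plan-duration network: total = prep + max(taskA, taskB),
# i.e. two tasks run in parallel after a preparation step
net = [
    ("parallel", max, ("taskA", "taskB")),
    ("total", lambda a, b: a + b, ("prep", "parallel")),
]
print(evaluate_network(net, {"prep": 2, "taskA": 5, "taskB": 3})["total"])  # 7
```

A predictive schedule model in this style would re-evaluate the network under projected task durations to produce the set of candidate plans the abstract mentions.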

  15. Seismic Consequence Abstraction

    International Nuclear Information System (INIS)

    Gross, M.

    2004-01-01

    The primary purpose of this model report is to develop abstractions for the response of engineered barrier system (EBS) components to seismic hazards at a geologic repository at Yucca Mountain, Nevada, and to define the methodology for using these abstractions in a seismic scenario class for the Total System Performance Assessment - License Application (TSPA-LA). A secondary purpose of this model report is to provide information for criticality studies related to seismic hazards. The seismic hazards addressed herein are vibratory ground motion, fault displacement, and rockfall due to ground motion. The EBS components are the drip shield, the waste package, and the fuel cladding. The requirements for development of the abstractions and the associated algorithms for the seismic scenario class are defined in ''Technical Work Plan For: Regulatory Integration Modeling of Drift Degradation, Waste Package and Drip Shield Vibratory Motion and Seismic Consequences'' (BSC 2004 [DIRS 171520]). The development of these abstractions will provide a more complete representation of flow into and transport from the EBS under disruptive events. The results from this development will also address portions of integrated subissue ENG2, Mechanical Disruption of Engineered Barriers, including the acceptance criteria for this subissue defined in Section 2.2.1.3.2.3 of the ''Yucca Mountain Review Plan, Final Report'' (NRC 2003 [DIRS 163274])

  16. Seismic Consequence Abstraction

    Energy Technology Data Exchange (ETDEWEB)

    M. Gross

    2004-10-25

    The primary purpose of this model report is to develop abstractions for the response of engineered barrier system (EBS) components to seismic hazards at a geologic repository at Yucca Mountain, Nevada, and to define the methodology for using these abstractions in a seismic scenario class for the Total System Performance Assessment - License Application (TSPA-LA). A secondary purpose of this model report is to provide information for criticality studies related to seismic hazards. The seismic hazards addressed herein are vibratory ground motion, fault displacement, and rockfall due to ground motion. The EBS components are the drip shield, the waste package, and the fuel cladding. The requirements for development of the abstractions and the associated algorithms for the seismic scenario class are defined in ''Technical Work Plan For: Regulatory Integration Modeling of Drift Degradation, Waste Package and Drip Shield Vibratory Motion and Seismic Consequences'' (BSC 2004 [DIRS 171520]). The development of these abstractions will provide a more complete representation of flow into and transport from the EBS under disruptive events. The results from this development will also address portions of integrated subissue ENG2, Mechanical Disruption of Engineered Barriers, including the acceptance criteria for this subissue defined in Section 2.2.1.3.2.3 of the ''Yucca Mountain Review Plan, Final Report'' (NRC 2003 [DIRS 163274]).

  17. CHANNEL MORPHOLOGY TOOL (CMT): A GIS-BASED AUTOMATED EXTRACTION MODEL FOR CHANNEL GEOMETRY

    Energy Technology Data Exchange (ETDEWEB)

    JUDI, DAVID [Los Alamos National Laboratory; KALYANAPU, ALFRED [Los Alamos National Laboratory; MCPHERSON, TIMOTHY [Los Alamos National Laboratory; BERSCHEID, ALAN [Los Alamos National Laboratory

    2007-01-17

    This paper describes an automated Channel Morphology Tool (CMT) developed in ArcGIS 9.1 environment. The CMT creates cross-sections along a stream centerline and uses a digital elevation model (DEM) to create station points with elevations along each of the cross-sections. The generated cross-sections may then be exported into a hydraulic model. Along with the rapid cross-section generation the CMT also eliminates any cross-section overlaps that might occur due to the sinuosity of the channels using the Cross-section Overlap Correction Algorithm (COCoA). The CMT was tested by extracting cross-sections from a 5-m DEM for a 50-km channel length in Houston, Texas. The extracted cross-sections were compared directly with surveyed cross-sections in terms of the cross-section area. Results indicated that the CMT-generated cross-sections satisfactorily matched the surveyed data.
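The cross-section generation step can be sketched as sampling DEM elevations along station points on lines perpendicular to each centerline segment. The overlap-correction step (COCoA) is omitted, and the DEM, geometry, and widths below are invented.

```python
import math

def cross_section_stations(centerline, half_width, n_stations, dem):
    """For each centerline segment, build a cross-section through the
    segment midpoint, perpendicular to the segment, and sample DEM
    elevations at `n_stations` evenly spaced station points."""
    sections = []
    for (x0, y0), (x1, y1) in zip(centerline, centerline[1:]):
        mx, my = (x0 + x1) / 2, (y0 + y1) / 2
        seg = math.hypot(x1 - x0, y1 - y0)
        # unit normal to the segment
        nx, ny = -(y1 - y0) / seg, (x1 - x0) / seg
        stations = []
        for i in range(n_stations):
            t = -half_width + 2 * half_width * i / (n_stations - 1)
            px, py = mx + t * nx, my + t * ny
            stations.append((px, py, dem(px, py)))
        sections.append(stations)
    return sections

# Toy DEM: a V-shaped valley along the x-axis; straight centerline on y=0
dem = lambda x, y: abs(y)
secs = cross_section_stations([(0, 0), (10, 0), (20, 0)],
                              half_width=4, n_stations=5, dem=dem)
print(secs[0])  # stations from (5,-4) to (5,4) with elevations 4,2,0,2,4
```

Each list of station points corresponds to one exportable cross-section; a real tool would read the DEM from a raster rather than a function.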

  18. 3D model assisted fully automated scanning laser Doppler vibrometer measurements

    Science.gov (United States)

    Sels, Seppe; Ribbens, Bart; Bogaerts, Boris; Peeters, Jeroen; Vanlanduit, Steve

    2017-12-01

In this paper, a new fully automated scanning laser Doppler vibrometer (LDV) measurement technique is presented. In contrast to existing scanning LDV techniques which use a 2D camera for the manual selection of sample points, we use a 3D Time-of-Flight camera in combination with a CAD file of the test object to automatically obtain measurements at pre-defined locations. The proposed procedure allows users to test prototypes in a shorter time because physical measurement locations are determined without user interaction. Another benefit of this methodology is that it incorporates automatic mapping between a CAD model and the vibration measurements. This mapping can be used to visualize measurements directly on a 3D CAD model. The proposed method is illustrated with vibration measurements of an unmanned aerial vehicle.

19. A cellular automaton model for the change of public attitude regarding nuclear energy

    International Nuclear Information System (INIS)

    Ohnishi, Teruaki

    1991-01-01

A cellular automaton model was constructed to investigate how public opinion on nuclear energy in Japan depends upon the information environment and on personal communication between people. From simulation with this model, the following became clear: (i) society is a highly non-linear system with a self-organizing potential; (ii) in a society composed of one type of constituent member with homogeneous characteristics, the trend of public opinion changes substantially only when the effort to improve public acceptance over a long period of time, by means such as education, persuasion and advertisement, exceeds a certain threshold; and (iii) if the amount of information on nuclear risk released by the news media is continuously reduced from now on, the acceptability of nuclear energy improves significantly, provided the extent of the reduction exceeds a certain threshold. (author)
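A minimal one-dimensional cellular automaton of attitude change illustrates the kind of threshold and self-organizing behaviour described: each cell takes the sign of its neighbourhood sum plus a global media-influence term. The rule, ring topology, and parameters are invented for illustration and are not Ohnishi's model.

```python
def step(attitudes, media_bias=0):
    """One synchronous update of a 1-D ring of attitudes (-1 anti,
    +1 pro). Each cell moves to the sign of its 3-cell neighbourhood
    sum plus a global media term; a tie keeps the old attitude."""
    n = len(attitudes)
    new = []
    for i in range(n):
        s = (attitudes[(i - 1) % n] + attitudes[i]
             + attitudes[(i + 1) % n] + media_bias)
        new.append(1 if s > 0 else -1 if s < 0 else attitudes[i])
    return new

state = [-1, -1, 1, -1, 1, 1, 1, -1]
for _ in range(4):
    state = step(state)
print(state)  # [-1, -1, -1, 1, 1, 1, 1, -1]: opinions settle into domains
```

Even this toy rule reaches a stable self-organized configuration of opinion "domains"; adding a persistent media bias shifts which domains can survive.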

  20. A Collaborative System Software Solution for Modeling Business Flows Based on Automated Semantic Web Service Composition

    Directory of Open Access Journals (Sweden)

    Ion SMEUREANU

    2009-01-01

Full Text Available Nowadays, business interoperability is one of the key factors in assuring competitive advantage for the participating business partners. In order to implement business cooperation, scalable, distributed and portable collaborative systems have to be implemented. This article presents some of the most widely used technologies in this field. Furthermore, it presents a software application architecture based on the Business Process Modeling Notation standard and automated semantic web service coupling for modeling business flows in a collaborative manner. The main business processes will be represented in a single, hierarchic flow diagram. Each element of the diagram will represent calls to semantic web services. The business logic (the business rules and constraints) will be structured with the help of OWL (Ontology Web Language). Moreover, OWL will also be used to create the semantic web service specifications.

  1. Use of the DynaLearn learning environment by naïve student modelers : Implications for automated support

    NARCIS (Netherlands)

    Noble, R.; Bredeweg, B.; Biswas, G.; Bull, S.; Kay, J.; Mitrovic, A.

    2011-01-01

    This paper shows that naïve students will require coaching to overcome the difficulties they face in identifying the important concepts to be modeled, and understanding the causal meta-vocabulary needed for conceptual models. The results of this study will be incorporated in the automated feedback

  2. Automated optimization and construction of chemometric models based on highly variable raw chromatographic data.

    Science.gov (United States)

    Sinkov, Nikolai A; Johnston, Brandon M; Sandercock, P Mark L; Harynuk, James J

    2011-07-04

    Direct chemometric interpretation of raw chromatographic data (as opposed to integrated peak tables) has been shown to be advantageous in many circumstances. However, this approach presents two significant challenges: data alignment and feature selection. In order to interpret the data, the time axes must be precisely aligned so that the signal from each analyte is recorded at the same coordinates in the data matrix for each and every analyzed sample. Several alignment approaches exist in the literature and they work well when the samples being aligned are reasonably similar. In cases where the background matrix for a series of samples to be modeled is highly variable, the performance of these approaches suffers. Considering the challenge of feature selection, when the raw data are used each signal at each time is viewed as an individual, independent variable; with the data rates of modern chromatographic systems, this generates hundreds of thousands of candidate variables, or tens of millions of candidate variables if multivariate detectors such as mass spectrometers are utilized. Consequently, an automated approach to identify and select appropriate variables for inclusion in a model is desirable. In this research we present an alignment approach that relies on a series of deuterated alkanes which act as retention anchors for an alignment signal, and couple this with an automated feature selection routine based on our novel cluster resolution metric for the construction of a chemometric model. The model system that we use to demonstrate these approaches is a series of simulated arson debris samples analyzed by passive headspace extraction, GC-MS, and interpreted using partial least squares discriminant analysis (PLS-DA). Copyright © 2011 Elsevier B.V. All rights reserved.
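The anchor-based alignment can be sketched as piecewise-linear interpolation of retention times between anchor pairs, with the deuterated alkanes playing the role of the anchors. The anchor times and query time below are invented.

```python
def align_time(t, observed_anchors, reference_anchors):
    """Map a retention time from an observed run onto the reference
    time axis by piecewise-linear interpolation between anchor pairs
    (e.g. deuterated-alkane retention anchors). Times outside the
    anchor range are clamped to the nearest anchor."""
    pairs = sorted(zip(observed_anchors, reference_anchors))
    if t <= pairs[0][0]:
        return pairs[0][1]
    for (o0, r0), (o1, r1) in zip(pairs, pairs[1:]):
        if t <= o1:
            return r0 + (t - o0) / (o1 - o0) * (r1 - r0)
    return pairs[-1][1]

obs = [1.0, 2.2, 4.1]  # anchor times in this run (drifted)
ref = [1.0, 2.0, 4.0]  # canonical anchor times in the reference run
print(align_time(3.15, obs, ref))  # midway between anchors -> 3.0
```

Applying this mapping to every scan's time stamp places each analyte's signal at the same matrix coordinates across samples, which is the prerequisite for raw-data chemometrics noted in the abstract.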

  3. Automated Reconstruction of Walls from Airborne LIDAR Data for Complete 3D Building Modelling

    Science.gov (United States)

    He, Y.; Zhang, C.; Awrangjeb, M.; Fraser, C. S.

    2012-07-01

    Automated 3D building model generation continues to attract research interest in photogrammetry and computer vision. Airborne Light Detection and Ranging (LIDAR) data with increasing point density and accuracy has been recognized as a valuable source for automated 3D building reconstruction. While considerable achievements have been made in roof extraction, limited research has been carried out in modelling and reconstruction of walls, which constitute important components of a full building model. Low point density and irregular point distribution of LIDAR observations on vertical walls render this task complex. This paper develops a novel approach for wall reconstruction from airborne LIDAR data. The developed method commences with point cloud segmentation using a region growing approach. Seed points for planar segments are selected through principal component analysis, and points in the neighbourhood are collected and examined to form planar segments. Afterwards, segment-based classification is performed to identify roofs, walls and planar ground surfaces. For walls with sparse LIDAR observations, a search is conducted in the neighbourhood of each individual roof segment to collect wall points, and the walls are then reconstructed using geometrical and topological constraints. Finally, walls which were not illuminated by the LIDAR sensor are determined via both reconstructed roof data and neighbouring walls. This leads to the generation of topologically consistent and geometrically accurate and complete 3D building models. Experiments have been conducted in two test sites in the Netherlands and Australia to evaluate the performance of the proposed method. Results show that planar segments can be reliably extracted in the two reported test sites, which have different point densities, and the building walls can be correctly reconstructed if the walls are illuminated by the LIDAR sensor.
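
The seed-selection step can be illustrated with the standard PCA planarity criterion: the smallest eigenvalue of a neighbourhood's covariance matrix, relative to the total variance, is near zero when the points lie on a plane. This is a generic sketch of that criterion, not the authors' implementation; the synthetic point sets are invented.

```python
import numpy as np

def planarity(points):
    """Surface variation lambda_min / (l1 + l2 + l3) from the eigenvalues
    of the neighbourhood covariance (classic PCA criterion). Near 0 means
    the points lie on a plane -- a good seed for region growing."""
    centred = points - points.mean(axis=0)
    cov = centred.T @ centred / len(points)
    eigvals = np.sort(np.linalg.eigvalsh(cov))  # ascending
    return eigvals[0] / eigvals.sum()

rng = np.random.default_rng(0)
# Synthetic wall patch: points on the plane x = 5.
wall = np.column_stack([np.full(50, 5.0),
                        rng.uniform(0, 2, 50),
                        rng.uniform(0, 3, 50)])
flat_score = planarity(wall)            # ~0 for a perfect plane

scatter = rng.uniform(0, 1, (50, 3))    # volumetric noise: far from planar
noisy_score = planarity(scatter)
```

Thresholding such a score separates plausible planar seeds from cluttered neighbourhoods before the growing step.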

  4. AUTOMATED FEATURE BASED TLS DATA REGISTRATION FOR 3D BUILDING MODELING

    Directory of Open Access Journals (Sweden)

    K. Kitamura

    2012-07-01

    Full Text Available In this paper we present a novel method for the registration of point cloud data obtained using a terrestrial laser scanner (TLS). The final goal of our investigation is the automated reconstruction of CAD drawings and the 3D modeling of objects surveyed by TLS. Because objects are scanned from multiple positions, individual point clouds need to be registered to the same coordinate system. We propose in this paper an automated feature-based registration procedure. Our proposed method does not require the definition of initial values or the placement of targets and is robust against noise and background elements. A feature extraction procedure is performed for each point cloud as pre-processing. The registration of the point clouds from different viewpoints is then performed by utilizing the extracted features. The feature extraction method which we had developed previously (Kitamura, 2010) is used: planes and edges are extracted from the point cloud. By utilizing these features, the amount of information to process is reduced and the efficiency of the whole registration procedure is increased. In this paper, we describe the proposed algorithm and, in order to demonstrate its effectiveness, we show the results obtained by using real data.
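
Once features have been matched between two scans, the rigid transform that registers them is commonly recovered in closed form with the Kabsch/SVD method. The sketch below uses synthetic matched points (standing in for, e.g., corners where extracted planes and edges intersect); it illustrates the general technique, not the authors' specific algorithm.

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rotation R and translation t with dst ~ src @ R.T + t,
    via the Kabsch/SVD method on matched feature points."""
    src_c, dst_c = src.mean(0), dst.mean(0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t

# Hypothetical matched features in two scanner coordinate systems.
rng = np.random.default_rng(1)
src = rng.uniform(-5, 5, (6, 3))
theta = 0.4
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
dst = src @ R_true.T + np.array([2.0, -1.0, 0.5])

R, t = rigid_transform(src, dst)
residual = np.abs(src @ R.T + t - dst).max()  # ~0 for exact correspondences
```

With noisy correspondences the same closed form gives the least-squares optimum, which is why feature quality matters more than the solver.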

  5. AUTOMATED FORCE FIELD PARAMETERIZATION FOR NON-POLARIZABLE AND POLARIZABLE ATOMIC MODELS BASED ON AB INITIO TARGET DATA.

    Science.gov (United States)

    Huang, Lei; Roux, Benoît

    2013-08-13

    Classical molecular dynamics (MD) simulations based on atomistic models are increasingly used to study a wide range of biological systems. A prerequisite for meaningful results from such simulations is an accurate molecular mechanical force field. Most biomolecular simulations are currently based on the widely used AMBER and CHARMM force fields, which were parameterized and optimized to cover a small set of basic compounds corresponding to the natural amino acids and nucleic acid bases. Atomic models of additional compounds are commonly generated by analogy to the parameter set of a given force field. While this procedure yields models that are internally consistent, the accuracy of the resulting models can be limited. In this work, we propose a method, General Automated Atomic Model Parameterization (GAAMP), for automatically generating the parameters of atomic models of small molecules using the results from ab initio quantum mechanical (QM) calculations as target data. Force fields that were previously developed for a wide range of model compounds serve as the initial guess, although any of the final parameters can be optimized. The electrostatic parameters (partial charges, polarizabilities and shielding) are optimized on the basis of the QM electrostatic potential (ESP) and, if applicable, the interaction energies between the compound and water molecules. The soft dihedrals are automatically identified and parameterized by targeting QM dihedral scans as well as the energies of stable conformers. To validate the approach, the solvation free energy is calculated for more than 200 small molecules and MD simulations of 3 different proteins are carried out.
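
One step of such a pipeline, fitting soft-dihedral parameters to a QM torsion scan, reduces to linear least squares once the periodicities are fixed. The sketch below uses a synthetic "QM" profile and a generic CHARMM-style dihedral series; it illustrates the idea, not GAAMP's actual fitting code.

```python
import numpy as np

# Synthetic "QM" torsion scan (kcal/mol) over a full rotation.
phi = np.deg2rad(np.arange(0, 360, 15))
qm_energy = 1.5 * (1 + np.cos(phi)) + 0.4 * (1 + np.cos(3 * phi))

# Fit amplitudes V_n of a series E(phi) = sum_n V_n * (1 + cos(n*phi)),
# with phase angles fixed at 0 for simplicity, by linear least squares.
periodicities = (1, 2, 3)
basis = np.column_stack([1 + np.cos(n * phi) for n in periodicities])
V, *_ = np.linalg.lstsq(basis, qm_energy, rcond=None)

fitted = basis @ V
rmse = np.sqrt(np.mean((fitted - qm_energy) ** 2))  # ~0: scan lies in the basis
```

Real fits add phase angles, restraints toward the initial force field, and conformer energies to the target, but the linear-algebra core is the same.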

  6. Retrieval-travel-time model for free-fall-flow-rack automated storage and retrieval system

    Science.gov (United States)

    Metahri, Dhiyaeddine; Hachemi, Khalid

    2018-03-01

    Automated storage and retrieval systems (AS/RSs) are material handling systems that are frequently used in manufacturing and distribution centers. The modelling of the retrieval-travel time of an AS/RS (expected product delivery time) is practically important, because it allows us to evaluate and improve the system throughput. The free-fall-flow-rack AS/RS has emerged as a new technology for drug distribution. This system is a new variation of flow-rack AS/RS that uses an operator or a single machine for storage operations, and uses a combination of free-fall movement and a transport conveyor for retrieval operations. The main contribution of this paper is to develop an analytical model of the expected retrieval-travel time for the free-fall flow-rack under a dedicated storage assignment policy. The proposed model, which is based on a continuous approach, is compared for accuracy, via simulation, with a discrete model. The obtained results show that the maximum deviation between the continuous model and the simulation is less than 5%, which demonstrates the accuracy of our model in estimating the retrieval time. The analytical model is useful to optimise the dimensions of the rack, assess the system throughput, and evaluate different storage policies.
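
To make the continuous-versus-discrete comparison concrete, here is a generic (and deliberately simple) example in the same spirit: a square-in-time rack where reaching cell (x, y) costs max(x, y), with the continuous expectation in closed form and the discrete expectation averaged over cells. This is a textbook stand-in, not the paper's free-fall-flow-rack model.

```python
import numpy as np

def continuous_expected_time():
    # E[max(X, Y)] for X, Y ~ U(0, 1): closed form 2/3.
    return 2.0 / 3.0

def discrete_expected_time(n):
    # Average travel time over an n x n grid of storage cells (cell centres),
    # retrievals uniformly distributed over the rack face.
    centres = (np.arange(n) + 0.5) / n
    x, y = np.meshgrid(centres, centres)
    return np.maximum(x, y).mean()

cont = continuous_expected_time()
disc = discrete_expected_time(10)
deviation = abs(cont - disc) / disc   # shrinks as the rack grid gets finer
```

For this toy cost function the discrete mean is 2/3 - 1/(6 n^2), so the continuous approximation error vanishes quadratically in the grid resolution, mirroring the small deviations the paper reports.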

  7. An innovative approach for modeling and simulation of an automated industrial robotic arm operated electro-pneumatically

    Science.gov (United States)

    Popa, L.; Popa, V.

    2017-08-01

    The article is focused on modeling an automated industrial robotic arm operated electro-pneumatically and on simulating the robotic arm operation. The graphic language FBD (Function Block Diagram) is used to program the robotic arm on the Zelio Logic automation platform. The innovative modeling and simulation procedures address specific problems regarding the development of a new type of technical product in the field of robotics. Thus, new applications were identified for a Programmable Logic Controller (PLC) as a specialized computer performing control functions of varying, often high, complexity.

  8. Automated Generation of Fault Management Artifacts from a Simple System Model

    Science.gov (United States)

    Kennedy, Andrew K.; Day, John C.

    2013-01-01

    Our understanding of off-nominal behavior - failure modes and fault propagation - in complex systems is often based purely on engineering intuition; specific cases are assessed in an ad hoc fashion as a (fallible) fault management engineer sees fit. This work is an attempt to provide a more rigorous approach to this understanding and assessment by automating the creation of a fault management artifact, the Failure Modes and Effects Analysis (FMEA), through querying a representation of the system in a SysML model. This work builds on the previous development of an off-nominal behavior model for the upcoming Soil Moisture Active-Passive (SMAP) mission at the Jet Propulsion Laboratory. We further developed the previous system model to more fully incorporate the ideas of State Analysis, and restructured it in an organizational hierarchy that models the system as layers of control systems while also incorporating the concept of "design authority". We present software that was developed to traverse the elements and relationships in this model to automatically construct an FMEA spreadsheet. We further discuss extending this model to automatically generate other typical fault management artifacts, such as Fault Trees, to efficiently portray system behavior and to depend less on the intuition of fault management engineers to ensure complete examination of off-nominal behavior.
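
The traversal-to-spreadsheet step can be sketched generically: walk the components and propagation relationships of a (here, drastically simplified) system representation and emit one FMEA row per failure mode. The dictionary model, component names, and "feeds" relation below are invented stand-ins for the SysML elements and relationships the paper queries.

```python
import csv
import io

# Hypothetical miniature system model: components, their failure modes,
# and "feeds" edges along which failure effects propagate.
model = {
    "Battery":   {"failure_modes": ["cell short", "depletion"], "feeds": ["Power Bus"]},
    "Power Bus": {"failure_modes": ["open circuit"], "feeds": ["Radar"]},
    "Radar":     {"failure_modes": ["no output"], "feeds": []},
}

def downstream(component):
    """All components reachable along 'feeds' edges (effect propagation)."""
    seen, stack = [], list(model[component]["feeds"])
    while stack:
        c = stack.pop()
        if c not in seen:
            seen.append(c)
            stack.extend(model[c]["feeds"])
    return seen

def build_fmea(model):
    out = io.StringIO()
    writer = csv.writer(out)
    writer.writerow(["Component", "Failure Mode", "Affected Components"])
    for comp, info in model.items():
        for mode in info["failure_modes"]:
            writer.writerow([comp, mode, "; ".join(downstream(comp)) or "none"])
    return out.getvalue()

fmea_csv = build_fmea(model)
```

A real SysML query would replace the dictionary with model-API calls, but the shape of the artifact generation (enumerate, propagate, tabulate) is the same.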

  9. Automated home cage assessment shows behavioral changes in a transgenic mouse model of spinocerebellar ataxia type 17.

    Science.gov (United States)

    Portal, Esteban; Riess, Olaf; Nguyen, Huu Phuc

    2013-08-01

    Spinocerebellar Ataxia type 17 (SCA17) is an autosomal dominantly inherited, neurodegenerative disease characterized by ataxia, involuntary movements, and dementia. A novel SCA17 mouse model having a 71-polyglutamine repeat expansion in the TATA-binding protein (TBP) has shown an age-related motor deficit using a classic motor test, yet a concomitant weight increase might be a confounding factor for this measurement. In this study we used an automated home cage system to test several motor readouts for this same model, to confirm pathological behavior results and to evaluate the benefits of automated home cages in behavioral phenotyping. Our results confirm motor deficits in the Tbp/Q71 mice and present previously unrecognized behavioral characteristics obtained from the automated home cage, indicating its use for high-throughput screening and testing, e.g. of therapeutic compounds. Copyright © 2013 Elsevier B.V. All rights reserved.

  10. Automating Construction of Machine Learning Models With Clinical Big Data: Proposal Rationale and Methods.

    Science.gov (United States)

    Luo, Gang; Stone, Bryan L; Johnson, Michael D; Tarczy-Hornoch, Peter; Wilcox, Adam B; Mooney, Sean D; Sheng, Xiaoming; Haug, Peter J; Nkoy, Flory L

    2017-08-29

    To improve health outcomes and cut health care costs, we often need to conduct prediction/classification using large clinical datasets (aka, clinical big data), for example, to identify high-risk patients for preventive interventions. Machine learning has been proposed as a key technology for doing this. Machine learning has won most data science competitions and could support many clinical activities, yet only 15% of hospitals use it for even limited purposes. Despite familiarity with data, health care researchers often lack machine learning expertise to directly use clinical big data, creating a hurdle in realizing value from their data. Health care researchers can work with data scientists with deep machine learning knowledge, but it takes time and effort for both parties to communicate effectively. Facing a shortage in the United States of data scientists and hiring competition from companies with deep pockets, health care systems have difficulty recruiting data scientists. Building and generalizing a machine learning model often requires hundreds to thousands of manual iterations by data scientists to select the following: (1) hyper-parameter values and complex algorithms that greatly affect model accuracy and (2) operators and periods for temporally aggregating clinical attributes (eg, whether a patient's weight kept rising in the past year). This process becomes infeasible with limited budgets. This study's goal is to enable health care researchers to directly use clinical big data, make machine learning feasible with limited budgets and data scientist resources, and realize value from data. This study will allow us to achieve the following: (1) finish developing the new software, Automated Machine Learning (Auto-ML), to automate model selection for machine learning with clinical big data and validate Auto-ML on seven benchmark modeling problems of clinical importance; (2) apply Auto-ML and novel methodology to two new modeling problems crucial for care

  11. Automating Construction of Machine Learning Models With Clinical Big Data: Proposal Rationale and Methods

    Science.gov (United States)

    Stone, Bryan L; Johnson, Michael D; Tarczy-Hornoch, Peter; Wilcox, Adam B; Mooney, Sean D; Sheng, Xiaoming; Haug, Peter J; Nkoy, Flory L

    2017-01-01

    Background To improve health outcomes and cut health care costs, we often need to conduct prediction/classification using large clinical datasets (aka, clinical big data), for example, to identify high-risk patients for preventive interventions. Machine learning has been proposed as a key technology for doing this. Machine learning has won most data science competitions and could support many clinical activities, yet only 15% of hospitals use it for even limited purposes. Despite familiarity with data, health care researchers often lack machine learning expertise to directly use clinical big data, creating a hurdle in realizing value from their data. Health care researchers can work with data scientists with deep machine learning knowledge, but it takes time and effort for both parties to communicate effectively. Facing a shortage in the United States of data scientists and hiring competition from companies with deep pockets, health care systems have difficulty recruiting data scientists. Building and generalizing a machine learning model often requires hundreds to thousands of manual iterations by data scientists to select the following: (1) hyper-parameter values and complex algorithms that greatly affect model accuracy and (2) operators and periods for temporally aggregating clinical attributes (eg, whether a patient’s weight kept rising in the past year). This process becomes infeasible with limited budgets. Objective This study’s goal is to enable health care researchers to directly use clinical big data, make machine learning feasible with limited budgets and data scientist resources, and realize value from data. Methods This study will allow us to achieve the following: (1) finish developing the new software, Automated Machine Learning (Auto-ML), to automate model selection for machine learning with clinical big data and validate Auto-ML on seven benchmark modeling problems of clinical importance; (2) apply Auto-ML and novel methodology to two new

  12. Modelling and simulating the forming of new dry automated lay-up reinforcements for primary structures

    Science.gov (United States)

    Bouquerel, Laure; Moulin, Nicolas; Drapier, Sylvain; Boisse, Philippe; Beraud, Jean-Marc

    2017-10-01

    While weight has been so far the main driver for the development of prepreg based-composites solutions for aeronautics, a new weight-cost trade-off tends to drive choices for next-generation aircrafts. As a response, Hexcel has designed a new dry reinforcement type for aircraft primary structures, which combines the benefits of automation, out-of-autoclave process cost-effectiveness, and mechanical performances competitive to prepreg solutions: HiTape® is a unidirectional (UD) dry carbon reinforcement with thermoplastic veil on each side designed for aircraft primary structures [1-3]. One privileged process route for HiTape® in high volume automated processes consists in forming initially flat dry reinforcement stacks, before resin infusion [4] or injection. Simulation of the forming step aims at predicting the geometry and mechanical properties of the formed stack (so-called preform) for process optimisation. Extensive work has been carried out on prepreg and dry woven fabrics forming behaviour and simulation, but the interest for dry non-woven reinforcements has emerged more recently. Some work has been achieved on non crimp fabrics but studies on the forming behaviour of UDs are seldom and deal with UD prepregs only. Tension and bending in the fibre direction, along with inter-ply friction have been identified as the main mechanisms controlling the HiTape® response during forming. Bending has been characterised using a modified Peirce's flexometer [5] and inter-ply friction study is under development. Anisotropic hyperelastic constitutive models have been selected to represent the assumed decoupled deformation mechanisms. Model parameters are then identified from associated experimental results. For forming simulation, a continuous approach at the macroscopic scale has been selected first, and simulation is carried out in the Zset framework [6] using proper shell finite elements.

  13. Development and Evaluation of a Model for Modular Automation in Plant Manufacturing

    OpenAIRE

    Uwe Katzke; Katja Fischer; Birgit Vogel-Heuser

    2005-01-01

    The benefit of modular concepts in plant automation is viewed with ambivalence. On one hand, modularity offers advantages; on the other hand, it sets requirements on the system structure as well as on the discipline of the designer. The main reasons to use modularity in systems design for automation applications in industry are reusability and reduction of complexity, but up to now modular concepts are rare in plant automation. This paper analyses the reasons and proposes measures and solution concepts. An analysis ...

  14. Conceptual Model of an Application for Automated Generation of Webpage Mobile Versions

    Directory of Open Access Journals (Sweden)

    Todor Rachovski

    2017-11-01

    Full Text Available Accessing webpages through various types of mobile devices with different screen sizes and using different browsers has put new demands on web developers. The main challenge is the development of websites with responsive design that is adaptable depending on the mobile device used. The article presents a conceptual model of an app for automated generation of mobile pages. It has a five-layer architecture: database, database management layer, business logic layer, web services layer and a presentation layer. The database stores all the data needed to run the application. The database management layer uses an ORM model to convert relational data into an object-oriented format and control the access to them. The business logic layer contains components that perform the actual work on building a mobile version of the page, including parsing, building a hierarchical model of the page and a number of transformations. The web services layer provides external applications with access to lower-level functionalities, and the presentation layer is responsible for choosing and using the appropriate CSS. A web application that uses the proposed model was developed and experiments were conducted.
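
The parsing and transformation work of the business logic layer can be sketched with the standard library alone: build a hierarchical model of the page, then apply a mobile-oriented transformation to it. The tree format, the "desktop-only" class, and the transformation rule are invented for illustration.

```python
from html.parser import HTMLParser

class PageTreeBuilder(HTMLParser):
    """Build a hierarchical model of a page as nested (tag, attrs, children)
    nodes -- a minimal stand-in for the parsing step described above."""
    def __init__(self):
        super().__init__()
        self.root = {"tag": "document", "attrs": {}, "children": []}
        self.stack = [self.root]

    def handle_starttag(self, tag, attrs):
        node = {"tag": tag, "attrs": dict(attrs), "children": []}
        self.stack[-1]["children"].append(node)
        if tag not in ("br", "img", "meta", "link", "input"):  # void elements
            self.stack.append(node)

    def handle_endtag(self, tag):
        if len(self.stack) > 1 and self.stack[-1]["tag"] == tag:
            self.stack.pop()

def mobile_transform(node):
    """Example transformation: drop elements flagged as desktop-only."""
    node["children"] = [c for c in node["children"]
                        if c["attrs"].get("class") != "desktop-only"]
    for child in node["children"]:
        mobile_transform(child)

builder = PageTreeBuilder()
builder.feed('<div><span class="desktop-only">sidebar</span><p>body</p></div>')
mobile_transform(builder.root)
```

The presentation layer would then serialize the transformed tree with mobile-appropriate CSS.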

  15. Model-Based Design and Formal Verification Processes for Automated Waterway System Operations

    Directory of Open Access Journals (Sweden)

    Leonard Petnga

    2016-06-01

    Full Text Available Waterway and canal systems are particularly cost effective in the transport of bulk and containerized goods to support global trade. Yet, despite these benefits, they are among the most under-appreciated forms of transportation engineering systems. Looking ahead, the long-term view is not rosy. Failures, delays, incidents and accidents in aging waterway systems are doing little to attract the technical and economic assistance required for modernization and sustainability. In a step toward overcoming these challenges, this paper argues that programs for waterway and canal modernization and sustainability can benefit significantly from system thinking, supported by systems engineering techniques. We propose a multi-level multi-stage methodology for the model-based design, simulation and formal verification of automated waterway system operations. At the front-end of development, semi-formal modeling techniques are employed for the representation of project goals and scenarios, requirements and high-level models of behavior and structure. To assure the accuracy of engineering predictions and the correctness of operations, formal modeling techniques are used for the performance assessment and the formal verification of the correctness of functionality. The essential features of this methodology are highlighted in a case study examination of ship and lock-system behaviors in a two-stage lock system.
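
The kind of property one formally verifies for lock operations can be illustrated with a toy explicit-state model: enumerate every reachable state of a two-gate lock whose controller only opens a gate when the chamber level matches that side, and check that no reachable state has both gates open. This is a minimal sketch of state-space exploration under invented transition rules, not the paper's actual lock-system model.

```python
# State = (upper_gate_open, lower_gate_open, chamber_level).

def successors(state):
    upper, lower, level = state
    nxt = []
    if not upper and not lower:                 # may change the water level
        nxt.append((upper, lower, "high" if level == "low" else "low"))
    if not lower and level == "high":           # interlock: level must match
        nxt.append((not upper, lower, level))   # toggle upper gate
    if not upper and level == "low":
        nxt.append((upper, not lower, level))   # toggle lower gate
    return nxt

def reachable(start):
    """Explicit-state reachability: every state the controller can enter."""
    seen, stack = {start}, [start]
    while stack:
        for s in successors(stack.pop()):
            if s not in seen:
                seen.add(s)
                stack.append(s)
    return seen

states = reachable((False, False, "low"))
unsafe = [s for s in states if s[0] and s[1]]   # both gates open at once
```

Model checkers automate exactly this exploration (plus temporal properties) on far larger state spaces.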

  16. The use of process simulation models in virtual commissioning of process automation software in drinking water treatment plants

    NARCIS (Netherlands)

    Worm, G.I.M.; Kelderman, J.P.; Lapikas, T.; Van der Helm, A.W.C.; Van Schagen, K.M.; Rietveld, L.C.

    2012-01-01

    This research deals with the contribution of process simulation models to the factory acceptance test (FAT) of process automation (PA) software of drinking water treatment plants. Two test teams tested the same piece of modified PA-software. One team used an advanced virtual commissioning (AVC)

  17. Degree of anisotropy as an automated indicator of rip channels in high resolution bathymetric models

    Science.gov (United States)

    Trimble, S. M.; Houser, C.; Bishop, M. P.

    2017-12-01

    A rip current is a concentrated seaward flow of water that forms in the surf zone of a beach as a result of alongshore variations in wave breaking. Rips can carry swimmers swiftly into deep water, and they are responsible for hundreds of fatal drownings and thousands of rescues worldwide each year. These currents form regularly alongside hard structures like piers and jetties, and can also form along sandy coasts when there is a three-dimensional bar morphology. This latter rip type tends to be variable in strength and location, making such rips arguably the most dangerous to swimmers and the most difficult to identify. These currents form in characteristic rip channels in surf zone bathymetry, in which the primary axis of self-similarity is oriented shore-normal. This paper demonstrates a new method for automating the identification of such rip channels in bathymetric digital surface models (DSMs) using bathymetric data collected by various remote sensing methods. Degree of anisotropy is used to detect rip channels and distinguishes between sandbars, rip channels, and other beach features. This has implications for coastal geomorphology theory and safety practices. As technological advances increase access and accuracy of topobathy mapping methods in the surf zone, frequent nearshore bathymetric DSMs could be more easily captured and processed, then analyzed with this method to result in localized, automated, and frequent detection of rip channels. This could ultimately reduce rip-related fatalities worldwide (i) in present mitigation, by identifying the present location of rip channels, (ii) in forecasting, by tracking the channel's evolution through multiple DSMs, and (iii) in rip education, by improving local lifeguard knowledge of the rip hazard. Although this paper only applies analysis of degree of anisotropy to the identification of rip channels, this parameter can be applied to multiple facets of barrier island morphological analysis.
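
One common way to turn "degree of anisotropy" into a number is the gradient structure tensor: the normalized spread of its eigenvalues is near 0 for direction-free relief and approaches 1 for strongly oriented features such as shore-normal channels. The sketch below applies this generic measure to synthetic surfaces; the paper's exact formulation may differ.

```python
import numpy as np

def degree_of_anisotropy(dsm):
    """Eigenvalue contrast of the gradient structure tensor of a DSM patch:
    ~0 for isotropic relief, ->1 for strongly oriented relief."""
    gy, gx = np.gradient(dsm.astype(float))
    J = np.array([[(gx * gx).sum(), (gx * gy).sum()],
                  [(gx * gy).sum(), (gy * gy).sum()]])
    l2, l1 = np.sort(np.linalg.eigvalsh(J))     # l1 >= l2 >= 0
    return (l1 - l2) / (l1 + l2)

y, x = np.mgrid[0:64, 0:64]
channel = np.sin(2 * np.pi * x / 16)                        # relief varies in x only
isotropic = np.sin(2 * np.pi * x / 16) * np.sin(2 * np.pi * y / 16)  # no preferred axis

a_channel = degree_of_anisotropy(channel)    # close to 1
a_iso = degree_of_anisotropy(isotropic)      # close to 0
```

Applied in sliding windows over a nearshore DSM, high-anisotropy patches with shore-normal orientation are candidate rip channels.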

  18. EBS Radionuclide Transport Abstraction

    Energy Technology Data Exchange (ETDEWEB)

    J.D. Schreiber

    2005-08-25

    The purpose of this report is to develop and analyze the engineered barrier system (EBS) radionuclide transport abstraction model, consistent with Level I and Level II model validation, as identified in "Technical Work Plan for: Near-Field Environment and Transport: Engineered Barrier System: Radionuclide Transport Abstraction Model Report Integration" (BSC 2005 [DIRS 173617]). The EBS radionuclide transport abstraction (or EBS RT Abstraction) is the conceptual model used in the total system performance assessment for the license application (TSPA-LA) to determine the rate of radionuclide releases from the EBS to the unsaturated zone (UZ). The EBS RT Abstraction conceptual model consists of two main components: a flow model and a transport model. Both models are developed mathematically from first principles in order to show explicitly what assumptions, simplifications, and approximations are incorporated into the models used in the TSPA-LA. The flow model defines the pathways for water flow in the EBS and specifies how the flow rate is computed in each pathway. Input to this model includes the seepage flux into a drift. The seepage flux is potentially split by the drip shield, with some (or all) of the flux being diverted by the drip shield and some passing through breaches in the drip shield that might result from corrosion or seismic damage. The flux through drip shield breaches is potentially split by the waste package, with some (or all) of the flux being diverted by the waste package and some passing through waste package breaches that might result from corrosion or seismic damage. Neither the drip shield nor the waste package survives an igneous intrusion, so the flux splitting submodel is not used in the igneous scenario class. The flow model is validated in an independent model validation technical review. The drip shield and waste package flux splitting algorithms are developed and validated using experimental data. The transport

  19. EBS Radionuclide Transport Abstraction

    International Nuclear Information System (INIS)

    J.D. Schreiber

    2005-01-01

    The purpose of this report is to develop and analyze the engineered barrier system (EBS) radionuclide transport abstraction model, consistent with Level I and Level II model validation, as identified in "Technical Work Plan for: Near-Field Environment and Transport: Engineered Barrier System: Radionuclide Transport Abstraction Model Report Integration" (BSC 2005 [DIRS 173617]). The EBS radionuclide transport abstraction (or EBS RT Abstraction) is the conceptual model used in the total system performance assessment for the license application (TSPA-LA) to determine the rate of radionuclide releases from the EBS to the unsaturated zone (UZ). The EBS RT Abstraction conceptual model consists of two main components: a flow model and a transport model. Both models are developed mathematically from first principles in order to show explicitly what assumptions, simplifications, and approximations are incorporated into the models used in the TSPA-LA. The flow model defines the pathways for water flow in the EBS and specifies how the flow rate is computed in each pathway. Input to this model includes the seepage flux into a drift. The seepage flux is potentially split by the drip shield, with some (or all) of the flux being diverted by the drip shield and some passing through breaches in the drip shield that might result from corrosion or seismic damage. The flux through drip shield breaches is potentially split by the waste package, with some (or all) of the flux being diverted by the waste package and some passing through waste package breaches that might result from corrosion or seismic damage. Neither the drip shield nor the waste package survives an igneous intrusion, so the flux splitting submodel is not used in the igneous scenario class. The flow model is validated in an independent model validation technical review. The drip shield and waste package flux splitting algorithms are developed and validated using experimental data. The transport model considers

  20. Modelling and interpreting biologically crusted dryland soil sub-surface structure using automated micropenetrometry

    Science.gov (United States)

    Hoon, Stephen R.; Felde, Vincent J. M. N. L.; Drahorad, Sylvie L.; Felix-Henningsen, Peter

    2015-04-01

    Soil penetrometers are used routinely to determine the shear strength of soils and deformable sediments, both at the surface and throughout a depth profile, in disciplines as diverse as soil science, agriculture, geoengineering and alpine avalanche-safety (e.g. Grunwald et al. 2001, Van Herwijnen et al. 2009). Generically, penetrometers comprise two principal components: an advancing probe, and a transducer; the latter measures the pressure or force required to cause the probe to penetrate or advance through the soil or sediment. The force transducer employed to determine the pressure can range, for example, from a simple mechanical spring gauge to an automatically data-logged electronic transducer. Automated computer control of the penetrometer step size and probe advance rate enables precise measurements to be made down to a resolution of tens of microns (e.g. the automated electronic micropenetrometer (EMP) described by Drahorad 2012). Here we discuss the determination, modelling and interpretation of biologically crusted dryland soil sub-surface structures using automated micropenetrometry. We outline a model enabling the interpretation of depth-dependent penetration resistance (PR) profiles and their spatial differentials using the model equations σ(z) = σ_c0 + Σ_1^n [σ_n(z) + a_n·z + b_n·z²] and dσ/dz = Σ_1^n [dσ_n(z)/dz + Fr_n(z)], where σ_c0 and σ_n are the plastic deformation stresses for the surface and the nth soil structure (e.g. soil crust, layer, horizon or void) respectively, and Fr_n(z)·dz is the frictional work done per unit volume by sliding the penetrometer rod an incremental distance, dz, through the nth layer. Both σ_n(z) and Fr_n(z) are related to soil structure. They determine the form of σ(z) measured by the EMP transducer. The model enables pores (regions of zero deformation stress) to be distinguished from changes in layer structure or probe friction. We have applied this method to both artificial calibration soils in the
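
A minimal numerical reading of the layered model: each structure contributes its plastic deformation stress plus linear and quadratic depth terms over its own depth interval, and a pore shows up as a depth range where only the surface term remains. All layer boundaries and stress values below are invented for illustration.

```python
import numpy as np

z = np.linspace(0.0, 10.0, 1001)        # depth (mm)
SIGMA_C0 = 0.05                          # surface deformation stress (MPa)

layers = [                               # (z_top, z_bot, sigma_n, a_n, b_n)
    (0.0, 2.0, 0.30, 0.05, 0.000),       # biological crust: stiff near surface
    (2.0, 4.0, 0.00, 0.00, 0.000),       # pore/void: zero deformation stress
    (4.0, 10.1, 0.10, 0.02, 0.001),      # underlying mineral layer
]

def sigma(z, layers, sigma_c0=SIGMA_C0):
    """Evaluate sigma(z) = sigma_c0 + sum_n [sigma_n + a_n*z + b_n*z^2],
    each layer contributing only over its own depth interval."""
    total = np.full_like(z, sigma_c0)
    for z_top, z_bot, s_n, a_n, b_n in layers:
        in_layer = (z >= z_top) & (z < z_bot)
        total[in_layer] += s_n + a_n * z[in_layer] + b_n * z[in_layer] ** 2
    return total

profile = sigma(z, layers)
pore = profile <= SIGMA_C0 + 1e-12       # only the surface term remains
```

Scanning `pore` down the profile flags void intervals, which is how zero-deformation-stress regions are separated from layer changes or friction effects.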

  1. Evaluation of Automated Model Calibration Techniques for Residential Building Energy Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Robertson, J.; Polly, B.; Collis, J.

    2013-09-01

    This simulation study adapts and applies the general framework described in BESTEST-EX (Judkoff et al 2010) for self-testing residential building energy model calibration methods. BEopt/DOE-2.2 is used to evaluate four mathematical calibration methods in the context of monthly, daily, and hourly synthetic utility data for a 1960's-era existing home in a cooling-dominated climate. The home's model inputs are assigned probability distributions representing uncertainty ranges, random selections are made from the uncertainty ranges to define 'explicit' input values, and synthetic utility billing data are generated using the explicit input values. The four calibration methods evaluated in this study are: an ASHRAE 1051-RP-based approach (Reddy and Maor 2006), a simplified simulated annealing optimization approach, a regression metamodeling optimization approach, and a simple output ratio calibration approach. The calibration methods are evaluated for monthly, daily, and hourly cases; various retrofit measures are applied to the calibrated models and the methods are evaluated based on the accuracy of predicted savings, computational cost, repeatability, automation, and ease of implementation.
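
Of the four methods, the output ratio approach is simple enough to sketch in a few lines: scale the model's predictions by the ratio of observed to predicted totals, then apply the same factor when predicting retrofit savings. The monthly figures below are synthetic, not from the study.

```python
import numpy as np

# Hypothetical monthly energy use (kWh): model prediction vs. "billing" data.
predicted = np.array([900., 840., 760., 600., 520., 480.,
                      470., 500., 560., 650., 780., 880.])
observed = 1.12 * predicted              # synthetic utility data, 12% high

# Output ratio calibration: one scalar factor matching annual totals.
ratio = observed.sum() / predicted.sum()
calibrated = predicted * ratio           # reproduces the observed totals

retrofit_predicted = predicted * 0.85    # e.g. model with a retrofit applied
retrofit_calibrated = retrofit_predicted * ratio
```

Because the same factor multiplies both cases, the *fractional* savings are unchanged by this calibration, which is one reason the study also evaluates optimization-based methods that adjust individual inputs.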

  2. Evaluation of Automated Model Calibration Techniques for Residential Building Energy Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Robertson, Joseph [National Renewable Energy Lab. (NREL), Golden, CO (United States)]; Polly, Ben [National Renewable Energy Lab. (NREL), Golden, CO (United States)]; Collis, Jon [Colorado School of Mines, Golden, CO (United States)]

    2013-09-01

    This simulation study adapts and applies the general framework described in BESTEST-EX (Judkoff et al 2010) for self-testing residential building energy model calibration methods. BEopt/DOE-2.2 is used to evaluate four mathematical calibration methods in the context of monthly, daily, and hourly synthetic utility data for a 1960's-era existing home in a cooling-dominated climate. The home's model inputs are assigned probability distributions representing uncertainty ranges, random selections are made from the uncertainty ranges to define "explicit" input values, and synthetic utility billing data are generated using the explicit input values. The four calibration methods evaluated in this study are: an ASHRAE 1051-RP-based approach (Reddy and Maor 2006), a simplified simulated annealing optimization approach, a regression metamodeling optimization approach, and a simple output ratio calibration approach. The calibration methods are evaluated for monthly, daily, and hourly cases; various retrofit measures are applied to the calibrated models and the methods are evaluated based on the accuracy of predicted savings, computational cost, repeatability, automation, and ease of implementation.

  3. Towards Automated Bargaining in Electronic Markets: A Partially Two-Sided Competition Model

    Science.gov (United States)

    Gatti, Nicola; Lazaric, Alessandro; Restelli, Marcello

    This paper focuses on the prominent issue of automating bargaining agents within electronic markets. Bargaining models in the literature deal with settings in which there are only two agents; no model satisfactorily captures settings with competition among multiple buyers and, analogously, among multiple sellers. In this paper, we extend the principal bargaining protocol, i.e. the alternating-offers protocol, to capture bargaining in markets. The model we propose is such that, in the presence of a unique buyer and a unique seller, agents' equilibrium strategies are those of the original protocol. Moreover, we study the resulting game game-theoretically and provide the following results: in the presence of one-sided competition (several buyers and one seller, or vice versa) we provide agents' equilibrium strategies for all values of the parameters; in the presence of two-sided competition (several buyers and several sellers) we provide an algorithm that produces agents' equilibrium strategies for a large set of the parameters, and we experimentally evaluate its effectiveness.
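    As background for the bilateral case the market model reduces to, the subgame-perfect split of the classic infinite-horizon alternating-offers game (Rubinstein 1982) can be computed directly. This is a reference point, not the paper's market algorithm:

```python
# Equilibrium split of a unit surplus in the bilateral, infinite-horizon
# alternating-offers game (Rubinstein 1982): with per-round discount
# factors delta1 (first proposer) and delta2 (responder), the first
# proposer obtains (1 - delta2) / (1 - delta1 * delta2).

def rubinstein_split(delta1, delta2):
    proposer = (1 - delta2) / (1 - delta1 * delta2)
    return proposer, 1 - proposer

# Symmetric patience: the first mover still keeps a strict advantage.
share1, share2 = rubinstein_split(0.9, 0.9)
```

    The paper's contribution is showing how such equilibrium strategies change once several buyers and/or sellers compete for the same trade.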

  4. FULLY AUTOMATED GENERATION OF ACCURATE DIGITAL SURFACE MODELS WITH SUB-METER RESOLUTION FROM SATELLITE IMAGERY

    Directory of Open Access Journals (Sweden)

    J. Wohlfeil

    2012-07-01

    Full Text Available Modern pixel-wise image matching algorithms like Semi-Global Matching (SGM) are able to compute high-resolution digital surface models from airborne and spaceborne stereo imagery. Although image matching itself can be performed automatically, there are prerequisites, like high geometric accuracy, which are essential for ensuring the high quality of the resulting surface models. Especially for line cameras, these prerequisites currently require laborious manual interaction using standard tools, which is a growing problem due to the continually increasing demand for such surface models. The tedious work includes partly or fully manual selection of tie and/or ground control points for ensuring the required accuracy of the relative orientation of images for stereo matching. It also includes masking of large water areas that seriously reduce the quality of the results. Furthermore, a good estimate of the depth range is required, since accurate estimates can greatly reduce the processing time for stereo matching. In this paper an approach is presented that allows all these steps to be performed fully automatically. It includes very robust and precise tie point selection, enabling the accurate calculation of the images' relative orientation via bundle adjustment. It is also shown how water masking and elevation range estimation can be performed automatically on the basis of freely available SRTM data. Extensive tests with a large number of different satellite images from QuickBird and WorldView are presented as proof of the robustness and reliability of the proposed method.

  5. Automated Decisional Model for Optimum Economic Order Quantity Determination Using Price Regressive Rates

    Science.gov (United States)

    Roşu, M. M.; Tarbă, C. I.; Neagu, C.

    2016-11-01

    The current models for inventory management are complementary, but together they offer a large palette of elements for solving the complex problems companies face when establishing the optimum economic order quantity for unfinished products, raw materials, goods, etc. The main objective of this paper is to elaborate an automated decisional model for calculating the economic order quantity, taking into account price regressive rates for the total order quantity. This model has two main objectives: first, to determine the ordering periodicity n or the order quantity q; second, to determine the stock levels: alert level, security stock, etc. In this way we can answer two fundamental questions: How much must be ordered? When must it be ordered? In current practice, a business's relationships with its suppliers are based on regressive price rates. This means that suppliers may grant discounts from a certain level of quantities ordered. Thus, the unit price of the products is a variable which depends on the order size, so the most important element in choosing the optimum economic order quantity is the total ordering cost, which depends on the following elements: the average price per unit, the stock-holding cost, the ordering cost, etc.
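    The effect of regressive price rates on the optimum order quantity can be illustrated with the textbook all-units quantity-discount EOQ procedure. This is a standard formalization, not necessarily the paper's exact decision model, and all figures are hypothetical:

```python
import math

# Hedged sketch: EOQ with all-units price breaks ("regressive rates").
# Annual total cost = purchase + ordering + holding, with holding cost
# proportional to the unit price. For each price level, take its EOQ if
# it falls in that level's quantity range (raised to the range minimum
# if below); levels whose EOQ lies above their range are skipped, since
# a larger order already earns a cheaper price.

def eoq_with_discounts(demand, order_cost, holding_rate, price_breaks):
    """price_breaks: [(min_qty, unit_price), ...], ascending min_qty,
    descending price. Returns (best_qty, best_annual_total_cost)."""
    best = None
    for i, (lo, price) in enumerate(price_breaks):
        hi = price_breaks[i + 1][0] if i + 1 < len(price_breaks) else math.inf
        h = holding_rate * price                    # holding cost per unit/year
        q = math.sqrt(2 * demand * order_cost / h)  # unconstrained EOQ
        q = max(q, lo)
        if q >= hi:
            continue
        total = demand * price + demand / q * order_cost + h * q / 2
        if best is None or total < best[1]:
            best = (q, total)
    return best

# Hypothetical data: 1000 units/year, $10 per order, 20% holding rate,
# unit price $5.00 below 100 units and $4.50 from 100 units upward.
qty, total_cost = eoq_with_discounts(1000, 10.0, 0.2, [(1, 5.0), (100, 4.5)])
```

    In this example the discount pulls the optimum above the break point: the EOQ at the discounted price is feasible and dominates any quantity at the undiscounted price.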

  6. ERP Human Enhancement Progress Report : Use case and computational model for adaptive maritime automation

    NARCIS (Netherlands)

    Kleij, R. van der; Broek, J. van den; Brake, G.M. te; Rypkema, J.A.; Schilder, C.M.C.

    2015-01-01

    Automation is often applied in order to increase the cost-effectiveness, reliability and safety of maritime ship and offshore operations. Automation of operator tasks has not, however, eliminated human error so much as created opportunities for new kinds of error. The ambition of the Adaptive

  7. The modeling of transfer of steering between automated vehicle and human driver using hybrid control framework

    NARCIS (Netherlands)

    Kaustubh, M.; Willemsen, DMC; Mazo Espinosa, M.; Sjöberg, J.; Morris, B.

    2016-01-01

    Proponents of autonomous driving pursue driverless technologies, whereas others foresee a gradual transition where there will be automated driving systems that share the control of the vehicle with the driver. With such advances it becomes pertinent that the developed automated systems need to be

  8. The modeling of transfer of steering between automated vehicle and human driver using hybrid control framework

    NARCIS (Netherlands)

    Kaustubh, M.; Willemsen, D.M.C.; Mazo, M.

    2016-01-01

    Proponents of autonomous driving pursue driverless technologies, whereas others foresee a gradual transition where there will be automated driving systems that share the control of the vehicle with the driver. With such advances it becomes pertinent that the developed automated systems need to be

  9. From Abstract Art to Abstracted Artists

    Directory of Open Access Journals (Sweden)

    Romi Mikulinsky

    2016-11-01

    Full Text Available What lineage connects early abstract films and machine-generated YouTube videos? Hans Richter’s famous piece Rhythmus 21 is considered to be the first abstract film in the experimental tradition. The Webdriver Torso YouTube channel is composed of hundreds of thousands of machine-generated test patterns designed to check frequency signals on YouTube. This article discusses geometric abstraction vis-à-vis new vision, conceptual art and algorithmic art. It argues that the Webdriver Torso is an artistic marvel indicative of a form we call mathematical abstraction, which is art performed by computers and, quite possibly, for computers.

  10. ABSTRACTION OF DRIFT SEEPAGE

    International Nuclear Information System (INIS)

    Wilson, Michael L.

    2001-01-01

    Drift seepage refers to flow of liquid water into repository emplacement drifts, where it can potentially contribute to degradation of the engineered systems and release and transport of radionuclides within the drifts. Because of these important effects, seepage into emplacement drifts is listed as a "principal factor for the postclosure safety case" in the screening criteria for grading of data in Attachment 1 of AP-3.15Q, Rev. 2, "Managing Technical Product Inputs". Abstraction refers to distillation of the essential components of a process model into a form suitable for use in total-system performance assessment (TSPA). Thus, the purpose of this analysis/model is to put the information generated by the seepage process modeling in a form appropriate for use in the TSPA for the Site Recommendation. This report also supports the Unsaturated-Zone Flow and Transport Process Model Report. The scope of the work is discussed below. This analysis/model is governed by the "Technical Work Plan for Unsaturated Zone Flow and Transport Process Model Report" (CRWMS M&O 2000a). Details of this activity are in Addendum A of the technical work plan. The original Work Direction and Planning Document is included as Attachment 7 of Addendum A. Note that the Work Direction and Planning Document contains tasks identified for both Performance Assessment Operations (PAO) and Natural Environment Program Operations (NEPO). Only the PAO tasks are documented here. The planning for the NEPO activities is now in Addendum D of the same technical work plan and the work is documented in a separate report (CRWMS M&O 2000b). The Project has been reorganized since the document was written. The responsible organizations in the new structure are the Performance Assessment Department and the Unsaturated Zone Department, respectively. The work plan for the seepage abstraction calls for determining an appropriate abstraction methodology, determining uncertainties in seepage, and providing

  11. Automated de novo phasing and model building of coiled-coil proteins.

    Science.gov (United States)

    Rämisch, Sebastian; Lizatović, Robert; André, Ingemar

    2015-03-01

    Models generated by de novo structure prediction can be very useful starting points for molecular replacement for systems where suitable structural homologues cannot be readily identified. Protein-protein complexes and de novo-designed proteins are examples of systems that can be challenging to phase. In this study, the potential of de novo models of protein complexes for use as starting points for molecular replacement is investigated. The approach is demonstrated using homomeric coiled-coil proteins, which are excellent model systems for oligomeric systems. Despite the stereotypical fold of coiled coils, initial phase estimation can be difficult and many structures have to be solved with experimental phasing. A method was developed for automatic structure determination of homomeric coiled coils from X-ray diffraction data. In a benchmark set of 24 coiled coils, ranging from dimers to pentamers with resolutions down to 2.5 Å, 22 systems were automatically solved, 11 of which had previously been solved by experimental phasing. The generated models contained 71-103% of the residues present in the deposited structures, had the correct sequence and had free R values that deviated on average by 0.01 from those of the respective reference structures. The electron-density maps were of sufficient quality that only minor manual editing was necessary to produce final structures. The method, named CCsolve, combines methods for de novo structure prediction, initial phase estimation and automated model building into one pipeline. CCsolve is robust against errors in the initial models and can readily be modified to make use of alternative crystallographic software. The results demonstrate the feasibility of de novo phasing of protein-protein complexes, an approach that could also be employed for other small systems beyond coiled coils.

  12. Automated quantitative analysis to assess motor function in different rat models of impaired coordination and ataxia.

    Science.gov (United States)

    Kyriakou, Elisavet I; van der Kieft, Jan G; de Heer, Raymond C; Spink, Andrew; Nguyen, Huu Phuc; Homberg, Judith R; van der Harst, Johanneke E

    2016-08-01

    An objective and automated method for assessing alterations in gait and motor coordination in different animal models is important for proper gait analysis. The CatWalk system has been used in pain research, ischemia, arthritis, spinal cord injury and some animal models for neurodegenerative diseases. Our goals were to obtain a comprehensive gait analysis of three different rat models and to identify which motor coordination parameters are affected and are the most suitable and sensitive to describe and detect ataxia with a secondary focus on possible training effects. Both static and dynamic parameters showed significant differences in all three models: enriched housed rats show higher walking and swing speed and longer stride length, ethanol-induced ataxia affects mainly the hind part of the body, and the SCA17 rats show coordination disturbances. Coordination changes were revealed only in the case of the ethanol-induced ataxia and the SCA17 rat model. Although training affected some gait parameters, it did not obscure group differences when those were present. To our knowledge, a comparative gait assessment in rats with enriched housing conditions, ethanol-induced ataxia and SCA17 has not been presented before. There is no gold standard for the use of CatWalk. Dependent on the specific effects expected, the protocol can be adjusted. By including all sessions in the analysis, any training effect should be detectable and the development of the performance over the sessions can provide insight in effects attributed to intervention, treatment or injury. Copyright © 2015 Elsevier B.V. All rights reserved.

  13. An Automated BIM Model to Conceptually Design, Analyze, Simulate, and Assess Sustainable Building Projects

    Directory of Open Access Journals (Sweden)

    Farzad Jalaei

    2014-01-01

    Full Text Available Quantifying the environmental impacts and simulating the energy consumption of a building's components at the conceptual design stage are very helpful for designers needing to make decisions related to the selection of the best design alternative, one that would lead to a more energy-efficient building. Building Information Modeling (BIM) offers designers the ability to assess different design alternatives at the conceptual stage of the project so that energy and life cycle assessment (LCA) strategies and systems can be attained. This paper proposes an automated model that links BIM, LCA, energy analysis, and lighting simulation tools with green building certification systems. The implementation consists of plug-ins, developed on a BIM tool, capable of measuring the environmental impacts (EI) and embodied energy of building components. Using this method, designers will be provided with a new way to visualize and to identify the potential gain or loss of energy for the building as a whole and for each of its associated components. Furthermore, designers will be able to detect and evaluate the sustainability of the proposed buildings based on the Leadership in Energy and Environmental Design (LEED) rating system. An actual building project is used to illustrate the workability of the proposed methodology.

  14. Programme and abstracts

    International Nuclear Information System (INIS)

    1975-01-01

    Abstracts of 25 papers presented at the congress are given. The abstracts cover various topics including radiotherapy, radiopharmaceuticals, radioimmunoassay, health physics, radiation protection and nuclear medicine

  15. Assessing user acceptance towards automated and conventional sink use for hand decontamination using the technology acceptance model.

    Science.gov (United States)

    Dawson, Carolyn H; Mackrill, Jamie B; Cain, Rebecca

    2017-12-01

    Hand hygiene (HH) prevents harmful contaminants spreading in settings including domestic, health care and food handling. Strategies to improve HH range from behavioural techniques through to automated sinks that ensure hand surface cleaning. This study aimed to assess user experience and acceptance of a new automated sink, compared to a conventional sink. An adapted version of the technology acceptance model (TAM) assessed each mode of handwashing. A within-subjects design enabled N = 46 participants to evaluate both sinks. Perceived Ease of Use and Satisfaction of Use were significantly lower for the automated sink than for the conventional sink. Design features including jet strength, water temperature and device affordance may improve HH technology. We provide recommendations for future HH technology development to contribute a positive user experience, relevant to technology developers, ergonomists and those involved in HH across all sectors. Practitioner Summary: The need to facilitate timely, effective hand hygiene to prevent illness has led to a rise in automated handwashing systems across different contexts. User acceptance is a key factor in system uptake. This paper applies the technology acceptance model as a means to explore and optimise the design of such systems.

  16. World-wide distribution automation systems

    International Nuclear Information System (INIS)

    Devaney, T.M.

    1994-01-01

    A worldwide power distribution automation system is outlined. Distribution automation is defined and the status of utility automation is discussed. Other topics discussed include a distribution management system, substation feeder, and customer functions, potential benefits, automation costs, planning and engineering considerations, automation trends, databases, system operation, computer modeling of system, and distribution management systems

  17. Modal abstractions of concurrent behavior

    DEFF Research Database (Denmark)

    Nielson, Flemming; Nanz, Sebastian; Nielson, Hanne Riis

    2011-01-01

    We present an effective algorithm for the automatic construction of finite modal transition systems as abstractions of potentially infinite concurrent processes. Modal transition systems are recognized as valuable abstractions for model checking because they allow for the validation as well as re...

  18. Automation of the Jarrell-Ash model 70-314 emission spectrometer

    International Nuclear Information System (INIS)

    Morris, W.F.; Fisher, E.R.; Taber, L.

    1978-01-01

    Automation of the Jarrell-Ash 3.4-Meter Ebert direct-reading emission spectrometer with digital scaler readout is described. The readout is interfaced to a Data General NOVA 840 minicomputer. The automation code consists of BASIC language programs for interactive routines, data processing, and report generation. Call statements within the BASIC programs invoke assembly language routines for real-time data acquisition and control. In addition, the automation objectives as well as the spectrometer-computer system functions, coding, and operating instructions are presented

  19. A Demonstration of Automated DNA Sequencing.

    Science.gov (United States)

    Latourelle, Sandra; Seidel-Rogol, Bonnie

    1998-01-01

    Details a simulation that employs a paper-and-pencil model to demonstrate the principles behind automated DNA sequencing. Discusses the advantages of automated sequencing as well as the chemistry of automated DNA sequencing. (DDR)

  20. Using Visual Specifications in Verification of Industrial Automation Controllers

    Directory of Open Access Journals (Sweden)

    Bouzon Gustavo

    2008-01-01

    Full Text Available This paper deals with further development of a graphical specification language resembling timing diagrams and allowing specification of partially ordered events in input and output signals. The language specifically aims at application in modular modelling of industrial automation systems and their formal verification via model checking. The graphical specifications are translated into a model which is connected with the original model under study.

  1. Data Structure Analysis to Represent Basic Models of Finite State Automation

    Directory of Open Access Journals (Sweden)

    V. V. Gurenko

    2015-01-01

    Full Text Available Complex system engineering based on automaton models requires a well-reasoned selection of the data structures used to implement them. The problem of automaton representation and of selecting the data structures for it has been understudied. Arbitrary data structure selection for the software implementation of an automaton model leads to unnecessary computational burden and reduces the efficiency of the developed system. This article proposes an approach to the reasoned selection of data structures to represent the basic models of finite algorithmic automata and gives practical considerations based on it. Static and dynamic data structures are proposed for the three main ways to assign Mealy and Moore automata: a transition table, a coupling matrix and a transition graph. A three-dimensional array, a rectangular matrix and a matrix of lists are the static structures. The dynamic structures are list-oriented: two-level and three-level Iliffe vectors and a multi-linked list. These structures allow us to store all required information about the components of a finite state automaton model - characteristic set cardinalities and the data of the transition and output functions. A criterion system is proposed for the comparative evaluation of data structures with respect to the algorithmic features of automata-theory problems. The criteria focus on the space and time complexity of the operations performed in tasks such as equivalent automaton conversions, proofs of automaton equivalence and isomorphism, and automaton minimization. A comparative analysis of data structures based on the criterion system was done for both the static and the dynamic types. The analysis showed advantages of the three-dimensional array, the matrix and the two-level Iliffe vector. These are the structures that assign an automaton by its transition table. For these structures an experiment was done to measure the execution time of the automaton operations included in the criterion system. The analysis of the experiment results showed that a dynamic structure - two
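    The transition-table representation discussed above can be made concrete with a small Moore automaton. The machine itself (a detector for the substring "11" in a binary string) is a hypothetical example, not taken from the article:

```python
# Illustrative sketch: a Moore automaton assigned by a transition table
# (the static rectangular-matrix representation). States are row
# indices, input symbols are column indices, and the Moore output is a
# per-state array. The example machine is hypothetical: it reports
# whether its input contains the substring "11".

STATES = ["s0", "s1", "s2"]      # s2 = "seen 11", absorbing
ALPHABET = {"0": 0, "1": 1}      # symbol -> column index

# Transition table: rows = states, columns = input symbols.
TABLE = [
    [0, 1],  # s0: on '0' stay, on '1' go to s1
    [0, 2],  # s1: on '0' back to s0, on '1' accept
    [2, 2],  # s2: absorbing
]
OUTPUT = [False, False, True]    # Moore output attached to each state

def run(word):
    state = 0
    for ch in word:
        state = TABLE[state][ALPHABET[ch]]
    return OUTPUT[state]
```

    With this layout, a transition lookup is a constant-time double index, which is the kind of operation cost the article's criterion system compares across representations.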

  2. Ecosystem models. Volume 3. November, 1977--October, 1978 (a bibliography with abstracts). Report for Nov 1977--Oct 1978

    International Nuclear Information System (INIS)

    Harrison, E.A.

    1978-10-01

    The preparation and use of ecosystem models are covered in this bibliography of Federally-funded research. Models for marine biology, wildlife, plants, water pollution, microorganisms, food chains, radioactive substances, limnology, and diseases as related to ecosystems are included

  3. Benefits Estimation Model for Automated Vehicle Operations: Phase 2 Final Report

    Science.gov (United States)

    2018-01-01

    Automated vehicles have the potential to bring about transformative safety, mobility, energy, and environmental benefits to the surface transportation system. They are also being introduced into a complex transportation system, where second-order imp...

  4. Semi-automated operation of Mars Climate Simulation chamber - MCSC modelled for biological experiments

    Science.gov (United States)

    Tarasashvili, M. V.; Sabashvili, Sh. A.; Tsereteli, S. L.; Aleksidze, N. D.; Dalakishvili, O.

    2017-10-01

    The Mars Climate Simulation Chamber (MCSC) (GEO PAT 12 522/01) is designed for the investigation of the possible past and present habitability of Mars, as well as for the solution of practical tasks necessary for the colonization and Terraformation of the Planet. There are specific tasks, such as the experimental investigation of the biological parameters that allow many terrestrial organisms to adapt to the imitated Martian conditions: chemistry of the ground, atmosphere, temperature, radiation, etc. The MCSC is set up for conducting various biological experiments, as well as for the selection of extremophile microorganisms for possible Settlement, Ecopoesis and/or Terraformation purposes and the investigation of their physiological functions. For long-term purposes, it is possible to cultivate genetically modified organisms (e.g., plants) adapted to the Martian conditions for a future Martian agriculture to sustain human Mars missions and permanent settlements. The size of the chamber allows preliminary testing of the functionality of space-station mini-models and personal protection devices such as space suits, covering and building materials and other structures. The reliability of the experimental biotechnological materials can also be tested over a period of years. Complex and thorough research has been performed to acquire the most appropriate technical tools for the accurate engineering of the MCSC and the precise programmed simulation of Martian environmental conditions. This paper describes the construction and technical details of the equipment of the MCSC, which allows its semi-automated, long-term operation.

  5. Automated segmentation and geometrical modeling of the tricuspid aortic valve in 3D echocardiographic images.

    Science.gov (United States)

    Pouch, Alison M; Wang, Hongzhi; Takabe, Manabu; Jackson, Benjamin M; Sehgal, Chandra M; Gorman, Joseph H; Gorman, Robert C; Yushkevich, Paul A

    2013-01-01

    The aortic valve has been described with variable anatomical definitions, and the consistency of 2D manual measurement of valve dimensions in medical image data has been questionable. Given the importance of image-based morphological assessment in the diagnosis and surgical treatment of aortic valve disease, there is considerable need to develop a standardized framework for 3D valve segmentation and shape representation. Towards this goal, this work integrates template-based medial modeling and multi-atlas label fusion techniques to automatically delineate and quantitatively describe aortic leaflet geometry in 3D echocardiographic (3DE) images, a challenging task that has been explored only to a limited extent. The method makes use of expert knowledge of aortic leaflet image appearance, generates segmentations with consistent topology, and establishes a shape-based coordinate system on the aortic leaflets that enables standardized automated measurements. In this study, the algorithm is evaluated on 11 3DE images of normal human aortic leaflets acquired at mid systole. The clinical relevance of the method is its ability to capture leaflet geometry in 3DE image data with minimal user interaction while producing consistent measurements of 3D aortic leaflet geometry.

  6. Thermal Edge-Effects Model for Automated Tape Placement of Thermoplastic Composites

    Science.gov (United States)

    Costen, Robert C.

    2000-01-01

    Two-dimensional thermal models for automated tape placement (ATP) of thermoplastic composites neglect the diffusive heat transport that occurs between the newly placed tape and the cool substrate beside it. Such lateral transport can cool the tape edges prematurely and weaken the bond. The three-dimensional, steady state, thermal transport equation is solved by the Green's function method for a tape of finite width being placed on an infinitely wide substrate. The isotherm for the glass transition temperature on the weld interface is used to determine the distance inward from the tape edge that is prematurely cooled, called the cooling incursion Delta a. For the Langley ATP robot, Delta a = 0.4 mm for a unidirectional lay-up of PEEK/carbon fiber composite, and Delta a = 1.2 mm for an isotropic lay-up. A formula for Delta a is developed and applied to a wide range of operating conditions. A surprise finding is that Delta a need not decrease as the Peclet number Pe becomes very large, where Pe is the dimensionless ratio of inertial to diffusive heat transport. Conformable rollers that increase the consolidation length would also increase Delta a, unless other changes are made, such as proportionally increasing the material speed. To compensate for premature edge cooling, the thermal input could be extended past the tape edges by the amount Delta a. This method should help achieve uniform weld strength and crystallinity across the width of the tape.

  7. Methods for Automating Analysis of Glacier Morphology for Regional Modelling: Centerlines, Extensions, and Elevation Bands

    Science.gov (United States)

    Viger, R. J.; Van Beusekom, A. E.

    2016-12-01

    The treatment of glaciers in modeling requires information about their shape and extent. This presentation discusses new methods and their application in a new glacier-capable variant of the USGS PRMS model, a physically-based, spatially distributed daily time-step model designed to simulate the runoff and evolution of glaciers through time. In addition to developing parameters describing PRMS land surfaces (hydrologic response units, HRUs), several of the analyses and products are likely of interest to the cryospheric science community in general. The first method is a (fully automated) variation of logic previously presented in the literature for definition of the glacier centerline. Given that the surface of a glacier might be convex, using traditional topographic analyses based on a DEM to trace a path down the glacier is not reliable. Instead a path is derived based on a cost function. Although only a single path is presented in our results, the method can be easily modified to delineate a branched network of centerlines for each glacier. The second method extends the glacier terminus downslope by an arbitrary distance, according to local surface topography. This product can be used to explore possible, if unlikely, scenarios under which glacier area grows. More usefully, this method can be used to approximate glacier extents from previous years without needing historical imagery. The final method presents an approach for segmenting the glacier into altitude-based HRUs. Successful integration of this information with traditional approaches for discretizing the non-glacierized portions of a basin requires several additional steps. These include synthesizing the glacier centerline network with one developed with a traditional DEM analysis, ensuring that flow can be routed under and beyond glaciers to a basin outlet. Results are presented based on analysis of the Copper River Basin, Alaska.
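    The cost-function idea behind the centerline method can be sketched with a least-cost path search on a small raster. The Dijkstra formulation, the 3x3 cost grid and the endpoints below are illustrative assumptions; the authors' actual cost function is not reproduced here:

```python
import heapq

# Hedged sketch: rather than steepest descent on a (possibly convex)
# glacier DEM, trace a centerline as the least-cost path between two
# cells of a per-cell cost raster, using Dijkstra's algorithm with
# 4-connected moves. The cost of a path is the sum of the costs of
# the cells it visits, including the start cell.

def least_cost_path(cost, start, goal):
    rows, cols = len(cost), len(cost[0])
    dist = {start: cost[start[0]][start[1]]}
    prev = {}
    pq = [(dist[start], start)]
    while pq:
        d, node = heapq.heappop(pq)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry
        r, c = node
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = node
                    heapq.heappush(pq, (nd, (nr, nc)))
    path, node = [], goal
    while node != start:
        path.append(node)
        node = prev[node]
    path.append(start)
    return path[::-1]

# Hypothetical raster: a cheap channel down the left edge and along
# the bottom row, standing in for the glacier's central flow line.
grid = [
    [1, 9, 9],
    [1, 9, 9],
    [1, 1, 1],
]
centerline = least_cost_path(grid, (0, 0), (2, 2))
```

    Extending this single path to a branched centerline network, as the presentation notes, amounts to repeating the search from multiple tributary seed points.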

  8. Forecasting Macroeconomic Variables using Neural Network Models and Three Automated Model Selection Techniques

    DEFF Research Database (Denmark)

    Kock, Anders Bredahl; Teräsvirta, Timo

    In this paper we consider the forecasting performance of a well-defined class of flexible models, the so-called single hidden-layer feedforward neural network models. A major aim of our study is to find out whether they, due to their flexibility, are as useful tools in economic forecasting as some...... previous studies have indicated. When forecasting with neural network models one faces several problems, all of which influence the accuracy of the forecasts. First, neural networks are often hard to estimate due to their highly nonlinear structure. In fact, their parameters are not even globally...... on the linearisation idea: the Marginal Bridge Estimator and Autometrics. Second, one must decide whether forecasting should be carried out recursively or directly. Comparisons of these two methods exist for linear models and here these comparisons are extended to neural networks. Finally, a nonlinear model...
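    The recursive-versus-direct distinction discussed above can be shown without neural networks. The sketch below uses a plain least-squares autoregressive fit on synthetic data so the mechanics stay visible; the data-generating process and horizon are illustrative assumptions:

```python
import random

# Hedged sketch: recursive vs. direct multi-step forecasting.
# Recursive: fit a one-step model y[t+1] ~ y[t] and iterate it h times.
# Direct: fit a separate model y[t+h] ~ y[t] and apply it once.
# The data come from a hypothetical noisy AR(1) process.

random.seed(1)
series = [1.0]
for _ in range(200):
    series.append(0.5 * series[-1] + 1.0 + random.gauss(0, 0.1))

def fit_linear(horizon):
    """Least-squares fit of y[t+horizon] = a + b * y[t]."""
    xs, ys = series[:-horizon], series[horizon:]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

h = 3
a1, b1 = fit_linear(1)
recursive = series[-1]
for _ in range(h):
    recursive = a1 + b1 * recursive   # iterate the one-step model h times

ah, bh = fit_linear(h)
direct = ah + bh * series[-1]         # one shot with the h-step model
```

    For a correctly specified linear process the two forecasts nearly coincide; the interesting comparisons in the paper arise when the model is nonlinear and misspecification or estimation error makes the two strategies diverge.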

  9. Method of Modeling Questions for Automated Grading of Students’ Responses in E-Learning Systems

    Directory of Open Access Journals (Sweden)

    A. A. Gurchenkov

    2015-01-01

    Full Text Available Introduction. Problem relevance. The capability to check solutions of practical problems automatically is an important functionality of any learning management system (LMS). Complex types of questions, implying a creative approach to problem solving, are of particular interest. There are many studies presenting automated scoring algorithms for students' answers, such as mathematical expressions, graphs, molecules, etc. However, the most common types of problems in the open LMSs that are being actively implemented in Russian and foreign universities (Moodle, Sakai, Ilias, etc.) remain simple question types such as, for example, multiple choice. Study subject and goal. The purpose of the study is to create a method that allows integrating arbitrary answer-scoring algorithms into any existing LMS, as well as its practical implementation in the form of an independent software module, which will handle questions in the LMS. Method. A model for objects of type "algorithmic question" is considered. A unified format for storing objects of this type, allowing their state to be kept, is developed. The algorithm is a set of variables, which defines the responses versus input data (or vice versa). Basis (input) variables are selected pseudo-randomly from a predetermined range, and from these values the resulting variables (responses) are calculated. This approach allows us to synthesize variations of the same question. The state of the question is saved by means of the "seed" of a pseudo-random number generator. A set of algorithmic problems was used to build the lifecycle management functions, namely: initialization create(), rendering render(), and evaluation answer(). These functions lay the foundation for the Application Program Interface (API) and allow us to control the software module responsible for the questions in the LMS. Practical results. This study is completed with the implementation of a software module responsible for mapping the interaction with the student and automated
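    The seed-based create/render/answer lifecycle can be sketched as below; the class name, the arithmetic question, and the scoring rule are hypothetical stand-ins for the module the paper describes:

```python
import random

class AlgorithmicQuestion:
    """Sketch of an 'algorithmic question': input variables are drawn
    pseudo-randomly from fixed ranges, the answer is computed from them,
    and the question state is reproducible from the PRNG seed alone."""

    def __init__(self, seed):
        self.seed = seed              # the only state that must be stored
        rng = random.Random(seed)     # seeded PRNG -> reproducible variant
        self.a = rng.randint(2, 9)    # basis (input) variables
        self.b = rng.randint(2, 9)

    def render(self):
        # rendering: the text shown to the student for this variant
        return f"Compute {self.a} * {self.b}."

    def answer(self, response):
        # evaluation: score the response against the computed result
        return 1.0 if response == self.a * self.b else 0.0

# Regenerating the question from the same seed yields the same variant.
q1 = AlgorithmicQuestion(seed=42)
q2 = AlgorithmicQuestion(seed=42)
assert q1.render() == q2.render()
assert q1.answer(q1.a * q1.b) == 1.0
```

    Because only the seed is persisted, the LMS can recreate the exact variant a student saw when it later grades the response.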

  10. SAHM - Simplification of one-dimensional hydraulic networks by automated processes evaluated on 1D/2D deterministic flood models

    DEFF Research Database (Denmark)

    Löwe, Roland; Davidsen, Steffen; Thrysøe, Cecilie

    We present an algorithm for automated simplification of 1D pipe network models. The impact of the simplifications on the flooding simulated by coupled 1D-2D models is evaluated in an Australian case study. Significant reductions of the simulation time of the coupled model are achieved by reducing...... the 1D network model. The simplifications lead to an underestimation of flooded area because interaction points between network and surface are removed and because water is transported downstream faster. These effects can be mitigated by maintaining nodes in flood-prone areas in the simplification......
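    The kind of node-merging such a simplification performs can be sketched as collapsing pass-through nodes while preserving nodes flagged as flood-prone; the data layout and merge rule here are assumptions for illustration, not the published algorithm:

```python
def simplify_network(edges, keep):
    """Collapse chains of pipes: a node with exactly two incident pipes that
    is NOT in `keep` (e.g. a node in a flood-prone area) is removed and its
    two pipes are fused into one, summing their lengths.
    edges: {(node_a, node_b): length}; returns a simplified copy."""
    edges = dict(edges)
    changed = True
    while changed:
        changed = False
        adj = {}                          # node -> incident edges
        for (a, b) in edges:
            adj.setdefault(a, []).append((a, b))
            adj.setdefault(b, []).append((a, b))
        for node, inc in adj.items():
            if node in keep or len(inc) != 2:
                continue
            e1, e2 = inc
            ends = [n for e in (e1, e2) for n in e if n != node]
            if ends[0] == ends[1]:        # merging would create a self-loop
                continue
            length = edges.pop(e1) + edges.pop(e2)
            edges[tuple(sorted(ends))] = length
            changed = True
            break                         # adjacency is now stale; rebuild
    return edges

# Chain A-B-C-D: collapses fully unless an intermediate node must be kept.
net = {("A", "B"): 10.0, ("B", "C"): 5.0, ("C", "D"): 5.0}
print(simplify_network(net, keep={"A", "D"}))        # {('A', 'D'): 20.0}
print(simplify_network(net, keep={"A", "C", "D"}))   # C kept: two pipes remain
```

    Keeping flood-prone nodes in `keep` corresponds to the mitigation the abstract describes: interaction points between network and surface survive the simplification.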

  11. Continuous Automated Model EvaluatiOn (CAMEO) complementing the critical assessment of structure prediction in CASP12.

    Science.gov (United States)

    Haas, Jürgen; Barbato, Alessandro; Behringer, Dario; Studer, Gabriel; Roth, Steven; Bertoni, Martino; Mostaguir, Khaled; Gumienny, Rafal; Schwede, Torsten

    2018-03-01

    Every second year, the community experiment "Critical Assessment of Techniques for Structure Prediction" (CASP) is conducting an independent blind assessment of structure prediction methods, providing a framework for comparing the performance of different approaches and discussing the latest developments in the field. Yet, developers of automated computational modeling methods clearly benefit from more frequent evaluations based on larger sets of data. The "Continuous Automated Model EvaluatiOn (CAMEO)" platform complements the CASP experiment by conducting fully automated blind prediction assessments based on the weekly pre-release of sequences of those structures which are going to be published in the next release of the Protein Data Bank (PDB). CAMEO publishes weekly benchmarking results based on models collected during a 4-day prediction window, on average assessing ca. 100 targets during a time frame of 5 weeks. CAMEO benchmarking data are generated consistently for all participating methods at the same point in time, enabling developers to benchmark and cross-validate their method's performance, and directly refer to the benchmarking results in publications. In order to facilitate server development and promote shorter release cycles, CAMEO sends a weekly email with submission statistics and low-performance warnings. Many participants of CASP have successfully employed CAMEO when preparing their methods for upcoming community experiments. CAMEO offers a variety of scores to allow benchmarking diverse aspects of structure prediction methods. By introducing new scoring schemes, CAMEO facilitates new development in areas of active research, for example, modeling quaternary structure, complexes, or ligand binding sites. © 2017 Wiley Periodicals, Inc.

  12. Coupling surface water (Delft3D) to groundwater (MODFLOW) in the Bay-Delta community model: the effect of major abstractions in the Delta

    Science.gov (United States)

    Hendriks, D.; Ball, S. M.; Van der Wegen, M.; Verkaik, J.; van Dam, A.

    2016-12-01

    We present a coupled groundwater-surface water model for the San Francisco Bay and Sacramento Valley that consists of a combination of a spatially distributed groundwater model (Modflow) based on the USGS Central Valley model (1) and the Flexible Mesh (FM) surface water model of the Bay Area (2). With this coupled groundwater-surface water model, we assessed the effects of climate, surface water abstractions and groundwater pumping on surface water and groundwater levels, groundwater-surface water interaction and infiltration/seepage fluxes. Results show that the effect of climate (high flow and low flow) on surface water and groundwater is significant and most prominent in upstream areas. The surface water abstractions cause significant local surface water level decreases (over 2 m), which may cause inflow of bay water during low-flow periods, resulting in salinization of surface water in more upstream areas. Groundwater level drawdown due to surface water withdrawal is moderate and limited to the area of the withdrawals. The groundwater pumping causes large groundwater level drawdowns (up to 0.8 m) and significant changes in seepage/infiltration fluxes in the model. However, the effect on groundwater-surface water exchange is relatively small. The presented model instrument gives a sound first impression of the effects of climate and water abstraction on both surface water and groundwater. The combination of Modflow and Flexible Mesh has potential for modelling of groundwater-surface water exchange in deltaic areas, also in other parts of the world. However, various improvements need to be made in order to make the simulation results useful in practice. In addition, a water quality component could be added to assess salinization processes as well as groundwater-surface water aspects of water and soil pollution. (1) http://ca.water.usgs.gov/projects/central-valley/central-valley-hydrologic-model.html (2) www.d3d-baydelta.org

  13. Design automation for integrated optics

    Science.gov (United States)

    Condrat, Christopher

    Recent breakthroughs in silicon photonics technology are enabling the integration of optical devices into silicon-based semiconductor processes. Photonics technology enables high-speed, high-bandwidth, and high-fidelity communications on the chip-scale---an important development in an increasingly communications-oriented semiconductor world. Significant developments in silicon photonic manufacturing and integration are also enabling investigations into applications beyond that of traditional telecom: sensing, filtering, signal processing, quantum technology---and even optical computing. In effect, we are now seeing a convergence of communications and computation, where the traditional roles of optics and microelectronics are becoming blurred. As the applications for opto-electronic integrated circuits (OEICs) are developed, and manufacturing capabilities expand, design support is necessary to fully exploit the potential of this optics technology. Such design support for moving beyond custom-design to automated synthesis and optimization is not well developed. Scalability requires abstractions, which in turn enables and requires the use of optimization algorithms and design methodology flows. Design automation represents an opportunity to take OEIC design to a larger scale, facilitating design-space exploration, and laying the foundation for current and future optical applications---thus fully realizing the potential of this technology. This dissertation proposes design automation for integrated optic system design. Using a building-block model for optical devices, we provide an EDA-inspired design flow and methodologies for optical design automation. Underlying these flows and methodologies are new supporting techniques in behavioral and physical synthesis, as well as device-resynthesis techniques for thermal-aware system integration. We also provide modeling for optical devices and determine optimization and constraint parameters that guide the automation

  14. A distributed substation automation model based on the multi-agents technology; Um modelo distribuido de automacao de subestacoes baseado em tecnologia multiagentes

    Energy Technology Data Exchange (ETDEWEB)

    Geus, Klaus de; Milsztajn, Flavio; Kolb, Carlos Jose Johann; Dometerco, Jose Henrique; Souza, Alexandre Mendonca de; Braga, Ciro de Carvalho; Parolin, Emerson Luis; Frisch, Arlenio Carneiro; Fortunato Junior, Luiz Kiss; Erzinger Junior, Augusto; Jonack, Marco Antonio; Guiera, Anderson Juliano Azambuja [Companhia Paranaense de Energia (COPEL), Curitiba, PR (Brazil)]. E-mail: klaus@copel.com; flaviomil@copel.com; kolb@copel.com; dometerc@copel.com; alexandre.mendonca@copel.com; ciro@copel.com; parolin@copel.com; arlenio@copel.com; luiz.kiss@copel.com; aerzinger@copel.com; jonack@copel.com; guiera@copel.com

    2006-10-15

    The main purpose of this paper is to analyse distributed computing technology which can be used in substation automation systems. Based on performance comparative results obtained in laboratory, a specific model for distributed substation automation is proposed considering the current model employed at COPEL - Companhia Paranaense de Energia. The proposed model is based on the multi-agents technology, which has lately received special attention in the development of distributed systems with local intelligence. (author)

  15. Model of the management program for a means complex of the design works automation as a finite-state automaton

    Directory of Open Access Journals (Sweden)

    Zakharchenko V. P.

    2017-12-01

    Full Text Available For software development it is necessary to have a mathematical model of the software. It is established that, for a complex of design-automation tools, a finite-state automaton is the best choice of model. The chosen automaton has a single feedback state, which asynchronously initiates the execution of those design procedures for which Terms of Reference exist. For this, an additional request automaton is used, which selects design work according to the status of the initiated design procedure. Designers' commands are also processed by a separate automaton. Situations that arise in the automated design process and are associated with designers' commands are divided into five groups.

  16. Evidence evaluation in fingerprint comparison and automated fingerprint identification systems--Modeling between finger variability.

    Science.gov (United States)

    Egli Anthonioz, N M; Champod, C

    2014-02-01

    In the context of the investigation of the use of automated fingerprint identification systems (AFIS) for the evaluation of fingerprint evidence, the current study presents investigations into the variability of scores from an AFIS system when fingermarks from a known donor are compared to fingerprints that are not from the same source. The ultimate goal is to propose a model, based on likelihood ratios, which allows the evaluation of mark-to-print comparisons. In particular, this model, through its use of AFIS technology, benefits from the possibility of using a large amount of data, as well as from an already built-in proximity measure, the AFIS score. More precisely, the numerator of the LR is obtained from scores issued from comparisons between impressions from the same source and showing the same minutia configuration. The denominator of the LR is obtained by extracting scores from comparisons of the questioned mark with a database of non-matching sources. This paper focuses solely on the assignment of the denominator of the LR. We refer to it by the generic term of between-finger variability. The issues addressed in this paper in relation to between-finger variability are the required sample size, the influence of the finger number and general pattern, as well as that of the number of minutiae included and their configuration on a given finger. Results show that reliable estimation of between-finger variability is feasible with 10,000 scores. These scores should come from the appropriate finger number/general pattern combination as defined by the mark. Furthermore, strategies of obtaining between-finger variability when these elements cannot be conclusively seen on the mark (and its position with respect to other marks for finger number) have been presented. These results immediately allow case-by-case estimation of the between-finger variability in an operational setting. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
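    The score-based likelihood-ratio construction can be sketched as follows; fitting normal densities to the two score samples is a simplifying assumption made here for illustration (the paper does not prescribe this parametric form), and the score distributions are synthetic:

```python
from statistics import NormalDist, mean, stdev
import random

def likelihood_ratio(score, same_source_scores, non_match_scores):
    """Score-based LR sketch: fit a density to AFIS scores from same-source
    comparisons (numerator) and to scores from comparing the mark against a
    database of non-matching fingers (denominator, i.e. the between-finger
    variability), then evaluate both densities at the observed score."""
    num = NormalDist(mean(same_source_scores), stdev(same_source_scores))
    den = NormalDist(mean(non_match_scores), stdev(non_match_scores))
    return num.pdf(score) / den.pdf(score)

rng = random.Random(0)
same = [rng.gauss(120, 15) for _ in range(500)]       # same-source scores
between = [rng.gauss(40, 10) for _ in range(10_000)]  # 10,000 non-match scores
print(likelihood_ratio(110.0, same, between) > 1)     # high score favors same source
```

    The 10,000 non-match scores mirror the sample size the study found sufficient for a reliable estimate of between-finger variability; in casework they would be drawn from the appropriate finger number/general pattern combination.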

  17. Design, analysis and modeling of a novel hybrid powertrain system based on hybridized automated manual transmission

    Science.gov (United States)

    Wu, Guang; Dong, Zuomin

    2017-09-01

    Hybrid electric vehicles are widely accepted as a promising short- to mid-term technical solution due to noticeably improved efficiency and lower emissions at competitive costs. In recent years, various hybrid powertrain systems have been proposed and implemented based on different types of conventional transmission. Power-split systems, including the Toyota Hybrid System and the Ford Hybrid System, are well-known examples. However, their relatively low torque capacity, and the drive for alternative and more advanced designs, encouraged other innovative hybrid system designs. In this work, a new type of hybrid powertrain system based on a hybridized automated manual transmission (HAMT) is proposed. By using the concept of a torque gap filler (TGF), this new hybrid powertrain type has the potential to overcome the issue of the torque gap during gearshifts. The HAMT design (patent pending) is described in detail, from gear layout and design of gear ratios (EV mode and HEV mode) to torque paths at different gears. As an analytical tool, a multi-body model of a vehicle equipped with this HAMT was built to analyze powertrain dynamics in various steady and transient modes. A gearshift was decomposed and analyzed based on basic modes. Furthermore, a Simulink-SimDriveline hybrid vehicle model was built for the new transmission, driveline and vehicle modules. A control strategy has also been built to harmonically coordinate different powertrain components to realize the TGF function. A vehicle launch simulation test has been completed at 30% accelerator pedal position to reveal details during gearshift. Simulation results showed that this HAMT can eliminate most of the torque gap that has been a persistent issue of traditional AMTs, improving both drivability and performance. This work demonstrated a new type of transmission that features high torque capacity, high efficiency and improved drivability.

  18. Abstracts and program proceedings of the 1994 meeting of the International Society for Ecological Modelling North American Chapter

    Energy Technology Data Exchange (ETDEWEB)

    Kercher, J.R.

    1994-06-01

    This document contains information about the 1994 meeting of the International Society for Ecological Modelling North American Chapter. The topics discussed include: extinction risk assessment modelling, ecological risk analysis of uranium mining, impacts of pesticides, demography, habitats, atmospheric deposition, and climate change.

  19. Book of Abstracts. International workshop on the terrestrial water cycle: Modeling and data assimilation across catchment scales, workshop

    NARCIS (Netherlands)

    Teuling, A.J.; Leijnse, H.; Troch, P.A.A.; Sheffield, J.; Wood, E.F.

    2004-01-01

    Scope of the International Workshop was bringing together experts in hydrological modeling to discuss new modeling strategies, and the potential of using advanced data assimilation methods to improve parameterization and predictability of distributed and semi-distributed catchment-scale hydrological

  20. Beyond captions: linking figures with abstract sentences in biomedical articles.

    Directory of Open Access Journals (Sweden)

    Joseph P Bockhorst

    Full Text Available Although figures in scientific articles have high information content and concisely communicate many key research findings, they are currently underutilized by literature search and retrieval systems. Many systems ignore figures, and those that do not typically only consider caption text. This study describes and evaluates a fully automated approach for associating figures in the body of a biomedical article with sentences in its abstract. We use supervised methods to learn probabilistic language models, hidden Markov models, and conditional random fields for predicting associations between abstract sentences and figures. Three kinds of evidence are used: text in abstract sentences and figures, relative positions of sentences and figures, and the patterns of sentence/figure associations across an article. Each information source is shown to have predictive value, and models that use all kinds of evidence are more accurate than models that do not. Our most accurate method has an F1-score of 69% on a cross-validation experiment, is competitive with the accuracy of human experts, has significantly better predictive accuracy than state-of-the-art methods and enables users to access figures associated with an abstract sentence with an average of 1.82 fewer mouse clicks. A user evaluation shows that human users find our system beneficial. The system is available at http://FigureItOut.askHERMES.org.

  1. Beyond captions: linking figures with abstract sentences in biomedical articles.

    Science.gov (United States)

    Bockhorst, Joseph P; Conroy, John M; Agarwal, Shashank; O'Leary, Dianne P; Yu, Hong

    2012-01-01

    Although figures in scientific articles have high information content and concisely communicate many key research findings, they are currently underutilized by literature search and retrieval systems. Many systems ignore figures, and those that do not typically only consider caption text. This study describes and evaluates a fully automated approach for associating figures in the body of a biomedical article with sentences in its abstract. We use supervised methods to learn probabilistic language models, hidden Markov models, and conditional random fields for predicting associations between abstract sentences and figures. Three kinds of evidence are used: text in abstract sentences and figures, relative positions of sentences and figures, and the patterns of sentence/figure associations across an article. Each information source is shown to have predictive value, and models that use all kinds of evidence are more accurate than models that do not. Our most accurate method has an F1-score of 69% on a cross-validation experiment, is competitive with the accuracy of human experts, has significantly better predictive accuracy than state-of-the-art methods and enables users to access figures associated with an abstract sentence with an average of 1.82 fewer mouse clicks. A user evaluation shows that human users find our system beneficial. The system is available at http://FigureItOut.askHERMES.org.

  2. Solving the AI Planning Plus Scheduling Problem Using Model Checking via Automatic Translation from the Abstract Plan Preparation Language (APPL) to the Symbolic Analysis Laboratory (SAL)

    Science.gov (United States)

    Butler, Ricky W.; Munoz, Cesar A.; Siminiceanu, Radu I.

    2007-01-01

    This paper describes a translator from a new planning language named the Abstract Plan Preparation Language (APPL) to the Symbolic Analysis Laboratory (SAL) model checker. This translator has been developed in support of the Spacecraft Autonomy for Vehicles and Habitats (SAVH) project sponsored by the Exploration Technology Development Program, which is seeking to mature autonomy technology for the vehicles and operations centers of Project Constellation.

  3. Gene clustering by latent semantic indexing of MEDLINE abstracts.

    Science.gov (United States)

    Homayouni, Ramin; Heinrich, Kevin; Wei, Lai; Berry, Michael W

    2005-01-01

    A major challenge in the interpretation of high-throughput genomic data is understanding the functional associations between genes. Previously, several approaches have been described to extract gene relationships from various biological databases using term-matching methods. However, more flexible automated methods are needed to identify functional relationships (both explicit and implicit) between genes from the biomedical literature. In this study, we explored the utility of Latent Semantic Indexing (LSI), a vector space model for information retrieval, to automatically identify conceptual gene relationships from titles and abstracts in MEDLINE citations. We found that LSI identified gene-to-gene and keyword-to-gene relationships with high average precision. In addition, LSI identified implicit gene relationships based on word usage patterns in the gene abstract documents. Finally, we demonstrate here that pairwise distances derived from the vector angles of gene abstract documents can be effectively used to functionally group genes by hierarchical clustering. Our results provide proof-of-principle that LSI is a robust automated method to elucidate both known (explicit) and unknown (implicit) gene relationships from the biomedical literature. These features make LSI particularly useful for the analysis of novel associations discovered in genomic experiments. The 50-gene document collection used in this study can be interactively queried at http://shad.cs.utk.edu/sgo/sgo.html.
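    The LSI pipeline described (term-document matrix, truncated SVD, similarity from vector angles) can be sketched as below; the toy term-document counts are invented for illustration:

```python
import numpy as np

# Toy "gene abstract documents": rows = terms, columns = gene documents.
terms = ["kinase", "phosphorylation", "apoptosis", "caspase"]
docs = ["geneA", "geneB", "geneC"]
A = np.array([[3, 2, 0],     # kinase
              [2, 3, 0],     # phosphorylation
              [0, 1, 2],     # apoptosis
              [0, 0, 3]],    # caspase
             dtype=float)

# LSI: a truncated SVD projects documents into a low-rank concept space.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
doc_vecs = (np.diag(s[:k]) @ Vt[:k]).T   # one k-dim vector per gene document

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Gene-to-gene similarity from vector angles: the two kinase-themed
# documents should be closer to each other than to the apoptosis one.
print(cosine(doc_vecs[0], doc_vecs[1]) > cosine(doc_vecs[0], doc_vecs[2]))
```

    1 - cosine gives the pairwise distance matrix that could then be fed to hierarchical clustering (e.g. scipy.cluster.hierarchy.linkage) to group genes functionally.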

  4. Automation in Warehouse Development

    CERN Document Server

    Verriet, Jacques

    2012-01-01

    The warehouses of the future will come in a variety of forms, but with a few common ingredients. Firstly, human operational handling of items in warehouses is increasingly being replaced by automated item handling. Extended warehouse automation counteracts the scarcity of human operators and supports the quality of picking processes. Secondly, the development of models to simulate and analyse warehouse designs and their components facilitates the challenging task of developing warehouses that take into account each customer’s individual requirements and logistic processes. Automation in Warehouse Development addresses both types of automation from the innovative perspective of applied science. In particular, it describes the outcomes of the Falcon project, a joint endeavour by a consortium of industrial and academic partners. The results include a model-based approach to automate warehouse control design, analysis models for warehouse design, concepts for robotic item handling and computer vision, and auton...

  5. Identity Management Processes Automation

    Directory of Open Access Journals (Sweden)

    A. Y. Lavrukhin

    2010-03-01

    Full Text Available Implementation of identity management systems consists of two main parts: consulting and automation. The consulting part includes development of a role model and a description of identity management processes. The automation part is based on the results of the consulting part. This article describes the most important aspects of IdM implementation.

  6. Object-Based Integration of Photogrammetric and LiDAR Data for Automated Generation of Complex Polyhedral Building Models

    Science.gov (United States)

    Kim, Changjae; Habib, Ayman

    2009-01-01

    This research is concerned with a methodology for automated generation of polyhedral building models for complex structures, whose rooftops are bounded by straight lines. The process starts by utilizing LiDAR data for building hypothesis generation and derivation of individual planar patches constituting building rooftops. Initial boundaries of these patches are then refined through the integration of LiDAR and photogrammetric data and hierarchical processing of the planar patches. Building models for complex structures are finally produced using the refined boundaries. The performance of the developed methodology is evaluated through qualitative and quantitative analysis of the generated building models from real data. PMID:22346722

  7. Towards reduction of Paradigm coordination models

    NARCIS (Netherlands)

    S. Andova; L.P.J. Groenewegen; E.P. de Vink (Erik Peter); L. Aceto (Luca); M.R. Mousavi

    2011-01-01

    The coordination modelling language Paradigm addresses collaboration between components in terms of dynamic constraints. Within a Paradigm model, component dynamics are consistently specified at a detailed and a global level of abstraction. To enable automated verification of Paradigm

  8. Program and abstracts

    International Nuclear Information System (INIS)

    1975-01-01

    Abstracts of the papers given at the conference are presented. The abstracts are arranged under sessions entitled:Theoretical Physics; Nuclear Physics; Solid State Physics; Spectroscopy; Physics Education; SANCGASS; Astronomy; Plasma Physics; Physics in Industry; Applied and General Physics

  9. A model composition for Mars derived from the oxygen isotopic ratios of martian/SNC meteorites. [Abstract only

    Science.gov (United States)

    Delaney, J. S.

    1994-01-01

    Oxygen is the most abundant element in most meteorites, yet the ratios of its isotopes are seldom used to constrain the compositional history of achondrites. The two major achondrite groups have O isotope signatures that differ from any plausible chondritic precursors and lie between the ordinary and carbonaceous chondrite domains. If the assumption is made that the present global sampling of chondritic meteorites reflects the variability of O reservoirs at the time of planetesimal/planet aggregation in the early nebula, then the O in these groups must reflect mixing between known chondritic reservoirs. This approach, in combination with constraints based on Fe-Mn-Mg systematics, has been used previously to model the composition of the basaltic achondrite parent body (BAP) and provides a model precursor composition that is generally consistent with previous eucrite parent body (EPB) estimates. The same approach is applied to Mars, exploiting the assumption that the SNC and related meteorites sample the martian lithosphere. Model planet and planetesimal compositions can be derived by mixing of known chondritic components using O isotope ratios as the fundamental compositional constraint. The major- and minor-element composition for Mars derived here and that derived previously for the basaltic achondrite parent body are, in many respects, compatible with model compositions generated using completely independent constraints. The role of volatile elements and alkalis in particular remains a major difficulty in applying such models.

  10. A novel automated rodent tracker (ART), demonstrated in a mouse model of amyotrophic lateral sclerosis.

    Science.gov (United States)

    Hewitt, Brett M; Yap, Moi Hoon; Hodson-Tole, Emma F; Kennerley, Aneurin J; Sharp, Paul S; Grant, Robyn A

    2018-04-15

    Generating quantitative metrics of rodent locomotion and general behaviours from video footage is important in behavioural neuroscience studies. However, there is not yet a free software system that can process large amounts of video data with minimal user intervention. Here we propose a new, automated rodent tracker (ART) that uses a simple rule-based system to quickly and robustly track rodent nose and body points, with minimal user input. Tracked points can then be used to identify behaviours, approximate body size and provide locomotion metrics, such as speed and distance. ART was demonstrated here on video recordings of a SOD1 mouse model of amyotrophic lateral sclerosis, aged 30, 60, 90 and 120 days. Results showed a robust decline in locomotion speeds, as well as a reduction in object exploration and forward movement, with an increase in the time spent still. Body size approximations (centroid width) showed a significant decrease from P30. ART performed to a very similar accuracy as manual tracking and Ethovision (a commercially available alternative), with average differences in coordinate points of 0.6 and 0.8 mm, respectively. However, it required much less user intervention than Ethovision (6 as opposed to 30 mouse clicks) and worked robustly over more videos. ART provides an open-source option for behavioural analysis of rodents, performing to the same standards as commercially available software. It can be considered a validated, and accessible, alternative for researchers for whom non-invasive quantification of natural rodent behaviour is desirable. Copyright © 2017 Elsevier B.V. All rights reserved.
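    The speed and distance metrics derived from tracked points reduce to simple geometry; a minimal sketch, assuming body-centre coordinates sampled at a fixed frame rate (the function name and units are illustrative, not ART's API):

```python
import math

def locomotion_metrics(points, fps):
    """Total distance (in the coordinates' units) and mean speed from a
    sequence of tracked body-centre points, as a tracker might report.
    points: [(x, y), ...] sampled at `fps` frames per second."""
    steps = [math.dist(p, q) for p, q in zip(points, points[1:])]
    distance = sum(steps)
    duration = (len(points) - 1) / fps
    return distance, distance / duration   # total distance, mean speed

# Four frames at 2 fps, moving 5 mm per frame along x.
track = [(0, 0), (5, 0), (10, 0), (15, 0)]
dist, speed = locomotion_metrics(track, fps=2)
print(dist, speed)   # 15.0 10.0
```

    The same per-frame step lengths can also feed a "time spent still" measure by thresholding near-zero displacements.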

  11. A novel automated behavioral test battery assessing cognitive rigidity in two genetic mouse models of autism.

    Directory of Open Access Journals (Sweden)

    Alicja ePuścian

    2014-04-01

    Full Text Available Repetitive behaviors are a key feature of many pervasive developmental disorders, such as autism. As a heterogeneous group of symptoms, repetitive behaviors are conceptualized into two main subgroups: sensory/motor (lower-order) and cognitive rigidity (higher-order). Although lower-order repetitive behaviors are measured in mouse models in several paradigms, so far there have been no high-throughput tests directly measuring cognitive rigidity. We describe a novel approach for monitoring repetitive behaviors during reversal learning in mice in the automated IntelliCage system. During the reward-motivated place preference reversal learning, designed to assess cognitive abilities of mice, visits to the previously rewarded places were recorded to measure cognitive flexibility. Thereafter, emotional flexibility was assessed by measuring conditioned fear extinction. Additionally, to look for neuronal correlates of cognitive impairments, we measured CA3-CA1 hippocampal long-term potentiation (LTP). To standardize the designed tests we used C57BL/6 and BALB/c mice, representing two genetic backgrounds, for induction of autism by prenatal exposure to sodium valproate. We found impairments of place learning related to perseveration and no LTP impairments in C57BL/6 valproate-treated mice. In contrast, BALB/c valproate-treated mice displayed severe deficits of place learning not associated with perseverative behaviors and accompanied by hippocampal LTP impairments. Alterations of cognitive flexibility observed in C57BL/6 valproate-treated mice were related neither to a restricted exploration pattern nor to emotional flexibility. Altogether, we showed that the designed tests of cognitive performance and perseverative behaviors are efficient and highly replicable. Moreover, the results suggest that genetic background is crucial for the behavioral effects of prenatal valproate treatment.

  12. Automated Detection and Predictive Modeling of Flux Transfer Events using CLUSTER Data

    Science.gov (United States)

    Sipes, T. B.; Karimabadi, H.; Driscoll, J.; Wang, Y.; Lavraud, B.; Slavin, J. A.

    2006-12-01

Almost all statistical studies of flux ropes (FTEs) and traveling compression regions (TCRs) have been based on (i) visual inspection of data to compile a list of events and (ii) use of histograms and simple linear correlation analysis to study their properties and potential causes and dependencies. This approach has several major drawbacks, including being highly subjective and inefficient. The traditional use of histograms and simple linear correlation analysis is also only useful for analysis of systems that show dominant dependencies on one or two variables at the most. However, if the system has complex dependencies, more sophisticated statistical techniques are required. For example, Wang et al. [2006] showed evidence that FTE occurrence rates are affected by IMF Bygsm, Bzgsm, and magnitude, and by the IMF clock, tilt, spiral, and cone angles. If the initial findings that FTEs occur only during periods of southward IMF were correct, one could use the direction of the IMF as a predictor of the occurrence of FTEs. But in light of the Wang et al. result, one cannot draw quantitative conclusions about the conditions under which FTEs occur. It may be that a certain combination of these parameters is the true controlling parameter; to uncover this, one needs to deploy more sophisticated techniques. We have developed a new, sophisticated data mining tool called MineTool. MineTool is highly accurate and flexible, and handles difficult and even noisy datasets extremely well. It has the ability to outperform standard data mining tools such as artificial neural networks, decision/regression trees and support vector machines. Here we present preliminary results of the application of this tool to the CLUSTER data to perform two tasks: (i) automated detection of FTEs, and (ii) predictive modeling of occurrences of FTEs based on IMF and magnetospheric conditions.

  13. Automated detection of arterial input function in DSC perfusion MRI in a stroke rat model

    Energy Technology Data Exchange (ETDEWEB)

    Yeh, M-Y; Liu, H-L [Graduate Institute of Medical Physics and Imaging Science, Chang Gung University, Taoyuan, Taiwan (China); Lee, T-H; Yang, S-T; Kuo, H-H [Stroke Section, Department of Neurology, Chang Gung Memorial Hospital and Chang Gung University, Taoyuan, Taiwan (China); Chyi, T-K [Molecular Imaging Center Chang Gung Memorial Hospital, Taoyuan, Taiwan (China)], E-mail: hlaliu@mail.cgu.edu.tw

    2009-05-15

Quantitative cerebral blood flow (CBF) estimation requires deconvolution of the tissue concentration time curves with an arterial input function (AIF). However, image-based determination of the AIF in rodents is challenging due to limited spatial resolution. We evaluated the feasibility of quantitative analysis using automated AIF detection and compared the results with the commonly applied semi-quantitative analysis. Permanent occlusion of the bilateral or unilateral common carotid artery was used to induce cerebral ischemia in rats. Imaging using the dynamic susceptibility contrast method was performed on a 3-T magnetic resonance scanner with a spin-echo echo-planar-image sequence (TR/TE = 700/80 ms, FOV = 41 mm, matrix = 64, 3 slices, SW = 2 mm), starting from 7 s prior to contrast injection (1.2 ml/kg), at four different time points. For quantitative analysis, CBF was calculated by deconvolution with the AIF, which was obtained from the 10 voxels with the greatest contrast enhancement. For semi-quantitative analysis, relative CBF was estimated as the integral divided by the first moment of the relaxivity time curves. We observed that if the AIFs obtained in the three different ROIs (whole brain, hemisphere without lesion, and hemisphere with lesion) were similar, the CBF ratios (lesion/normal) between quantitative and semi-quantitative analyses might show a similar trend at the different operative time points; if the AIFs were different, the CBF ratios might differ as well. We concluded that, using the local maximum, one can define a proper AIF without knowing the anatomical location of the arteries in a stroke rat model.
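The semi-quantitative estimate described in this record (the integral of the relaxivity time curve divided by its first moment) can be sketched in a few lines. This is an illustrative implementation assuming a discretely sampled curve and trapezoidal integration, not the authors' code; the function name is invented:

```python
def relative_cbf(times, conc):
    """Semi-quantitative relative CBF from a relaxivity time curve:
    area under the curve (proportional to relative CBV) divided by its
    normalized first moment (mean transit time), via trapezoidal rule."""
    area = 0.0   # integral of C(t) dt, proportional to relative CBV
    tmom = 0.0   # integral of t*C(t) dt
    for i in range(1, len(times)):
        dt = times[i] - times[i - 1]
        area += 0.5 * (conc[i] + conc[i - 1]) * dt
        tmom += 0.5 * (times[i] * conc[i] + times[i - 1] * conc[i - 1]) * dt
    mtt = tmom / area   # normalized first moment, a mean-transit-time proxy
    return area / mtt   # central volume principle: CBF proportional to CBV/MTT
```

For a symmetric curve the normalized first moment lands at the curve's center of mass, so the ratio behaves as expected without any absolute calibration.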

  14. An initial abstraction and constant loss model, and methods for estimating unit hydrographs, peak streamflows, and flood volumes for urban basins in Missouri

    Science.gov (United States)

    Huizinga, Richard J.

    2014-01-01

    Streamflow data, basin characteristics, and rainfall data from 39 streamflow-gaging stations for urban areas in and adjacent to Missouri were used by the U.S. Geological Survey in cooperation with the Metropolitan Sewer District of St. Louis to develop an initial abstraction and constant loss model (a time-distributed basin-loss model) and a gamma unit hydrograph (GUH) for urban areas in Missouri. Study-specific methods to determine peak streamflow and flood volume for a given rainfall event also were developed.
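The time-distributed basin-loss concept named in this record can be illustrated with a minimal sketch: incremental rainfall first fills an initial abstraction, after which a constant loss rate is subtracted each time step and the remainder becomes excess (runoff-producing) rainfall. Parameter names and numbers are hypothetical, not the study's calibrated values:

```python
def excess_rainfall(rain, ia, cl):
    """Incremental excess rainfall under an initial-abstraction-and-
    constant-loss model.

    rain -- incremental rainfall depths per time step (e.g. inches)
    ia   -- initial abstraction satisfied before any excess occurs
    cl   -- constant loss per time step once the abstraction is filled
    """
    remaining_ia = ia
    excess = []
    for r in rain:
        absorbed = min(r, remaining_ia)   # fill the initial abstraction first
        remaining_ia -= absorbed
        r -= absorbed
        excess.append(max(0.0, r - cl) if r > 0 else 0.0)
    return excess
```

For example, with `rain=[0.2, 0.5, 0.4, 0.1]`, `ia=0.3`, and `cl=0.1`, the first step is fully absorbed by the initial abstraction and only the middle two steps produce excess.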

  15. Automated service quality and its behavioural consequences in CRM Environment: A structural equation modeling and causal loop diagramming approach

    Directory of Open Access Journals (Sweden)

    Arup Kumar Baksi

    2012-08-01

Full Text Available Information and communication technologies (ICTs) have revolutionized the operational aspects of the service sector and have triggered a perceptual shift in service quality, as rapid dis-intermediation has changed the mode by which consumers access services. ICT-enabled services have further stimulated the perception of automated service quality, with renewed dimensions and subsequent significance in influencing the behavioural outcomes of consumers. Customer Relationship Management (CRM) has emerged as an offshoot of this technological breakthrough, as it ensures service encapsulation by integrating people, process and technology. This paper attempts to explore the relationship between automated service quality and its behavioural consequences in a relatively novel business philosophy, CRM. The study has been conducted at the largest public sector bank of India, State Bank of India (SBI), at Kolkata, which successfully completed its decade-long operational automation in the year 2008. The study used structural equation modeling (SEM) to justify the proposed model construct and causal loop diagramming (CLD) to depict the negative and positive linkages between the variables.

  16. Towards Automation 2.0: A Neurocognitive Model for Environment Recognition, Decision-Making, and Action Execution

    Directory of Open Access Journals (Sweden)

    Zucker Gerhard

    2011-01-01

Full Text Available The ongoing penetration of building automation by information technology is far from saturated. Today's systems must not only be reliable and fault tolerant, they also have to regard energy efficiency and flexibility in overall consumption. Meeting the quality and comfort goals in building automation while at the same time optimizing for energy, carbon footprint and cost-efficiency requires systems that are able to handle large amounts of information and negotiate system behaviour that resolves conflicting demands: a decision-making process. In recent years, research has started to focus on bionic principles for designing new concepts in this area. The information processing principles of the human mind have turned out to be of particular interest, as the mind is capable of processing huge amounts of sensory data and taking adequate decisions for (re)actions based on these analysed data. In this paper, we discuss how a bionic approach can solve the upcoming problems of energy-optimal systems. A recently developed model for environment recognition and decision-making processes, based on research findings from different disciplines of brain research, is introduced. This model is the foundation for applications in intelligent building automation that have to deal with information from home and office environments. All of these applications have in common that they consist of a combination of communicating nodes and have many, partly contradicting, goals.

  17. Automated Segmentation of Cardiac Magnetic Resonance Images

    DEFF Research Database (Denmark)

    Stegmann, Mikkel Bille; Nilsson, Jens Chr.; Grønning, Bjørn A.

    2001-01-01

    Magnetic resonance imaging (MRI) has been shown to be an accurate and precise technique to assess cardiac volumes and function in a non-invasive manner and is generally considered to be the current gold-standard for cardiac imaging [1]. Measurement of ventricular volumes, muscle mass and function...... is based on determination of the left-ventricular endocardial and epicardial borders. Since manual border detection is laborious, automated segmentation is highly desirable as a fast, objective and reproducible alternative. Automated segmentation will thus enhance comparability between and within cardiac...... studies and increase accuracy by allowing acquisition of thinner MRI-slices. This abstract demonstrates that statistical models of shape and appearance, namely the deformable models: Active Appearance Models, can successfully segment cardiac MRIs....

  18. Abstract ID: 240 A probabilistic-based nuclear reaction model for Monte Carlo ion transport in particle therapy.

    Science.gov (United States)

    Maria Jose, Gonzalez Torres; Jürgen, Henniger

    2018-01-01

To expand the Monte Carlo transport program AMOS to particle therapy applications, an ion module is being developed by the radiation physics group (ASP) at TU Dresden. This module simulates the three main interactions of ions in matter for the therapy energy range: elastic scattering, inelastic collisions and nuclear reactions. The simulation of the elastic scattering is based on the Binary Collision Approximation and that of the inelastic collisions on the Bethe-Bloch theory. The nuclear reactions, which are the focus of the module, are implemented according to a probabilistic-based model developed in the group. The developed model uses probability density functions to sample the occurrence of a nuclear reaction given the initial energy of the projectile particle, as well as the energy at which this reaction will take place. The particle is transported until the reaction energy is reached and then the nuclear reaction is simulated. This approach allows a fast evaluation of the nuclear reactions. The theory and application of the proposed model will be addressed in this presentation. The results of the simulation of a proton beam colliding with tissue will also be presented. Copyright © 2017.
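Sampling a reaction energy from a probability density function, as described in this record, is commonly realized by inverse-transform sampling on a tabulated distribution. The sketch below is a generic illustration under that assumption, not the AMOS implementation; the energy grid and weights are invented:

```python
import bisect
import random

def sample_reaction_energy(energies, pdf, rng=None):
    """Draw the energy at which a nuclear reaction occurs from a
    tabulated (possibly unnormalized) probability density, by
    inverse-transform sampling on the discrete cumulative distribution."""
    rng = rng or random.random
    cdf, total = [], 0.0
    for p in pdf:
        total += p
        cdf.append(total)          # running cumulative weight
    u = rng() * total              # uniform draw scaled to total weight
    i = bisect.bisect_left(cdf, u)
    return energies[min(i, len(energies) - 1)]
```

Precomputing the cumulative table once per projectile energy keeps each sample to a single uniform draw plus a binary search, which is what makes this kind of model fast.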

  19. A new automated method for analysis of gated-SPECT images based on a three-dimensional heart shaped model

    DEFF Research Database (Denmark)

    Lomsky, Milan; Richter, Jens; Johansson, Lena

    2005-01-01

    A new automated method for quantification of left ventricular function from gated-single photon emission computed tomography (SPECT) images has been developed. The method for quantification of cardiac function (CAFU) is based on a heart shaped model and the active shape algorithm. The model....... In the patient group the EDV calculated using QGS and CAFU showed good agreement for large hearts and higher CAFU values compared with QGS for the smaller hearts. In the larger hearts, ESV was much larger for QGS than for CAFU both in the phantom and patient studies. In the smallest hearts there was good...

  20. Models, methods and software for distributed knowledge acquisition for the automated construction of integrated expert systems knowledge bases

    International Nuclear Information System (INIS)

    Dejneko, A.O.

    2011-01-01

Based on an analysis of existing models, methods and means of acquiring knowledge, a base method of automated knowledge acquisition has been chosen. On the basis of this method, a new approach to integrating information acquired from knowledge sources of different typologies has been proposed, and the concept of distributed knowledge acquisition, with the aim of computerized formation of the most complete and consistent models of problem areas, has been introduced. An original algorithm for distributed knowledge acquisition from databases, based on the construction of binary decision trees, has been developed [ru]
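The record mentions an algorithm based on constructing binary decision trees from databases. As a point of reference only (the source's original algorithm is not described in detail here), a minimal ID3-style binary tree over boolean attributes can be sketched as follows; all names are illustrative:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label multiset."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def build_tree(rows, labels, attrs):
    """Greedy binary decision tree over boolean attributes:
    split each node on the attribute with the highest information gain."""
    if len(set(labels)) == 1 or not attrs:
        return Counter(labels).most_common(1)[0][0]   # leaf: majority label

    def gain(a):
        split = {True: [], False: []}
        for row, lab in zip(rows, labels):
            split[row[a]].append(lab)
        rem = sum(len(s) / len(labels) * entropy(s) for s in split.values() if s)
        return entropy(labels) - rem

    best = max(attrs, key=gain)
    node = {}
    for v in (True, False):
        sub = [(r, l) for r, l in zip(rows, labels) if r[best] == v]
        if not sub:
            node[v] = Counter(labels).most_common(1)[0][0]
        else:
            srows, slabs = zip(*sub)
            node[v] = build_tree(list(srows), list(slabs),
                                 [a for a in attrs if a != best])
    return (best, node)

def classify(tree, row):
    """Follow branches until a leaf label is reached."""
    while isinstance(tree, tuple):
        attr, node = tree
        tree = node[row[attr]]
    return tree
```

Training on the truth table of `a AND b`, for instance, yields a two-level tree that reproduces the function exactly.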

  1. Feasibility of rapid and automated importation of 3D echocardiographic left ventricular (LV) geometry into a finite element (FEM) analysis model

    Directory of Open Access Journals (Sweden)

    Nathan Nadia S

    2004-10-01

Full Text Available Abstract Background Finite element method (FEM) analysis for intraoperative modeling of the left ventricle (LV) is presently not possible. Since 3D structural data of the LV is now obtainable using standard transesophageal echocardiography (TEE) devices intraoperatively, the present study describes a method to transfer this data into a commercially available FEM analysis system: ABAQUS©. Methods In this prospective study TomTec LV Analysis TEE© Software was used for semi-automatic endocardial border detection, reconstruction, and volume-rendering of the clinical 3D echocardiographic data. A newly developed software program, MVCP FemCoGen©, written in Delphi, reformats the TomTec file structures in five patients for use in ABAQUS and allows visualization of regional deformation of the LV. Results This study demonstrates that a fully automated importation of 3D TEE data into FEM modeling is feasible and can be efficiently accomplished in the operating room. Conclusion For complete intraoperative 3D LV finite element analysis, three input elements are necessary: 1. time-gated, reality-based structural information, 2. continuous LV pressure and 3. instantaneous tissue elastance. The first of these elements is now available using the methods presented herein.

  2. Automated Battle Planning for Combat Models with Maneuver and Fire Support

    Science.gov (United States)

    2017-03-01

World Terrain includes features to import roads, buildings, rivers, trees, and grass from map data, but we have not experimented with these. ... level commensurate with an average commander. In other words, we would prefer automated planning to be done intelligently, but not ingeniously

  3. Models of Automation surprise : results of a field survey in aviation

    NARCIS (Netherlands)

    De Boer, Robert; Dekker, Sidney

    2017-01-01

Automation surprises in aviation continue to be a significant safety concern and the community's search for effective strategies to mitigate them is ongoing. The literature has offered two fundamentally divergent directions, based on different ideas about the nature of cognition and collaboration

  4. The Value of Information in Automated Negotiation: A Decision Model for Eliciting User Preferences

    NARCIS (Netherlands)

    T. Baarslag (Tim); M. Kaisers (Michael)

    2017-01-01

Consider an agent that can autonomously negotiate and coordinate with others in our stead, to reach outcomes and agreements in our interest. Such automated negotiation agents are already common practice in areas such as high-frequency trading, and are now finding applications in domains

  5. Transcranial Magnetic Stimulation: An Automated Procedure to Obtain Coil-specific Models for Field Calculations

    DEFF Research Database (Denmark)

    Madsen, Kristoffer Hougaard; Ewald, Lars; Siebner, Hartwig R.

    2015-01-01

    potential of the TMS coils. Objective: To develop an approach to reconstruct the magnetic vector potential based on automated measurements. Methods: We implemented a setup that simultaneously measures the three components of the magnetic field with high spatial resolution. This is complemented by a novel...

  6. Using detailed inter-network simulation and model abstraction to investigate and evaluate joint battlespace infosphere (JBI) support technologies

    Science.gov (United States)

    Green, David M.; Dallaire, Joel D.; Reaper, Jerome H.

    2004-08-01

The Joint Battlespace Infosphere (JBI) program is performing a technology investigation into global communications, data mining and warehousing, and data fusion technologies by focusing on techniques and methodologies that support twenty-first-century military distributed collaboration. Advancement of these technologies is vitally important if military decision makers are to have the right data, in the right format, at the right time and place to support making the right decisions within available timelines. The individual and combinational effects arising from the application of technologies within a framework are presently far too complex to evaluate quantitatively at more than a cursory depth. In order to facilitate quantitative analysis under these circumstances, the Distributed Information Enterprise Modeling and Simulation (DIEMS) team was formed to apply modeling and simulation (M&S) techniques to help address JBI analysis challenges. The DIEMS team has been tasked with utilizing collaborative distributed M&S architectures to quantitatively evaluate JBI technologies and tradeoffs. This paper first presents a high-level view of the DIEMS project. Once this approach has been established, a more concentrated view of the detailed communications simulation techniques used in generating the underlying support data sets is presented.

  7. Agent-Based Modelling of Agricultural Water Abstraction in Response to Climate, Policy, and Demand Changes: Results from East Anglia, UK

    Science.gov (United States)

    Swinscoe, T. H. A.; Knoeri, C.; Fleskens, L.; Barrett, J.

    2014-12-01

    Freshwater is a vital natural resource for multiple needs, such as drinking water for the public, industrial processes, hydropower for energy companies, and irrigation for agriculture. In the UK, crop production is the largest in East Anglia, while at the same time the region is also the driest, with average annual rainfall between 560 and 720 mm (1971 to 2000). Many water catchments of East Anglia are reported as over licensed or over abstracted. Therefore, freshwater available for agricultural irrigation abstraction in this region is becoming both increasingly scarce due to competing demands, and increasingly variable and uncertain due to climate and policy changes. It is vital for water users and policy makers to understand how these factors will affect individual abstractors and water resource management at the system level. We present first results of an Agent-based Model that captures the complexity of this system as individual abstractors interact, learn and adapt to these internal and external changes. The purpose of this model is to simulate what patterns of water resource management emerge on the system level based on local interactions, adaptations and behaviours, and what policies lead to a sustainable water resource management system. The model is based on an irrigation abstractor typology derived from a survey in the study area, to capture individual behavioural intentions under a range of water availability scenarios, in addition to farm attributes, and demographics. Regional climate change scenarios, current and new abstraction licence reforms by the UK regulator, such as water trading and water shares, and estimated demand increases from other sectors were used as additional input data. Findings from the integrated model provide new understanding of the patterns of water resource management likely to emerge at the system level.
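The agent-based approach described in this record can be caricatured in a few lines of code: abstractor agents with licensed volumes adapt their demand to yearly water availability, and a regulator applies pro-rata restrictions in scarce years. The agent types, adaptation rules, and numbers below are invented for illustration and are far simpler than the survey-derived typology in the study:

```python
import random

class Irrigator:
    """Hypothetical abstractor agent. 'Expansionist' agents raise demand
    after fully satisfied years (up to 150% of licence); every agent cuts
    back after a restricted year. Types and rules are illustrative only."""
    def __init__(self, licence, kind):
        self.licence = licence          # licensed annual abstraction volume
        self.demand = licence
        self.kind = kind

    def adapt(self, satisfied):
        if satisfied and self.kind == "expansionist":
            self.demand = min(1.5 * self.licence, 1.1 * self.demand)
        elif not satisfied:
            self.demand *= 0.9          # cut back after a restricted year

def simulate(agents, mean_supply, years, seed=0):
    """Return total water abstracted each year under pro-rata restriction."""
    rng = random.Random(seed)
    abstracted = []
    for _ in range(years):
        supply = mean_supply * rng.uniform(0.7, 1.3)   # climate variability
        demand = sum(a.demand for a in agents)
        ratio = min(1.0, supply / demand)              # regulator's pro-rata cut
        for a in agents:
            a.adapt(satisfied=ratio >= 1.0)
        abstracted.append(demand * ratio)
    return abstracted
```

Running the loop over many agents and scenarios is what lets system-level abstraction patterns emerge from the local adaptation rules.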

  8. Automated segmentation of mouse OCT volumes (ASiMOV): Validation & clinical study of a light damage model.

    Science.gov (United States)

    Antony, Bhavna Josephine; Kim, Byung-Jin; Lang, Andrew; Carass, Aaron; Prince, Jerry L; Zack, Donald J

    2017-01-01

    The use of spectral-domain optical coherence tomography (SD-OCT) is becoming commonplace for the in vivo longitudinal study of murine models of ophthalmic disease. Longitudinal studies, however, generate large quantities of data, the manual analysis of which is very challenging due to the time-consuming nature of generating delineations. Thus, it is of importance that automated algorithms be developed to facilitate accurate and timely analysis of these large datasets. Furthermore, as the models target a variety of diseases, the associated structural changes can also be extremely disparate. For instance, in the light damage (LD) model, which is frequently used to study photoreceptor degeneration, the outer retina appears dramatically different from the normal retina. To address these concerns, we have developed a flexible graph-based algorithm for the automated segmentation of mouse OCT volumes (ASiMOV). This approach incorporates a machine-learning component that can be easily trained for different disease models. To validate ASiMOV, the automated results were compared to manual delineations obtained from three raters on healthy and BALB/cJ mice post LD. It was also used to study a longitudinal LD model, where five control and five LD mice were imaged at four timepoints post LD. The total retinal thickness and the outer retina (comprising the outer nuclear layer, and inner and outer segments of the photoreceptors) were unchanged the day after the LD, but subsequently thinned significantly (p < 0.01). The retinal nerve fiber-ganglion cell complex and the inner plexiform layers, however, remained unchanged for the duration of the study.

  9. Work Practice Simulation of Complex Human-Automation Systems in Safety Critical Situations: The Brahms Generalized Ueberlingen Model

    Science.gov (United States)

    Clancey, William J.; Linde, Charlotte; Seah, Chin; Shafto, Michael

    2013-01-01

The transition from the current air traffic system to the next generation air traffic system will require the introduction of new automated systems, including transferring some functions from air traffic controllers to on-board automation. This report describes a new design verification and validation (V&V) methodology for assessing aviation safety. The approach involves a detailed computer simulation of work practices that includes people interacting with flight-critical systems. The research is part of an effort to develop new modeling and verification methodologies that can assess the safety of flight-critical systems, system configurations, and operational concepts. The 2002 Ueberlingen mid-air collision was chosen for analysis and modeling because one of the main causes of the accident was one crew's response to a conflict between the instructions of the air traffic controller and the instructions of TCAS, an automated on-board Traffic Alert and Collision Avoidance System. It thus furnishes an example of the problem of authority versus autonomy, and provides a starting point for exploring authority/autonomy conflict in the larger system of organization, tools, and practices in which the participants' moment-by-moment actions take place. We have developed a general air traffic system model (not a specific simulation of Ueberlingen events), called the Brahms Generalized Ueberlingen Model (Brahms-GUeM). Brahms is a multi-agent simulation system that models people, tools, facilities/vehicles, and geography to simulate the current air transportation system as a collection of distributed, interactive subsystems (e.g., airports, air-traffic control towers and personnel, aircraft, automated flight systems and air-traffic tools, instruments, crew). Brahms-GUeM can be configured in different ways, called scenarios, such that anomalous events that contributed to the Ueberlingen accident can be modeled as functioning according to requirements or in an

  10. Automation of measurement of heights of waves around a model ship; Mokeisen mawari no hako keisoku no jidoka

    Energy Technology Data Exchange (ETDEWEB)

    Ikehata, M.; Kato, M.; Yanagida, F. [Yokohama National University, Yokohama (Japan). Faculty of Engineering

    1997-10-01

Trial fabrication and tests were performed on an instrument that automates the measurement of wave heights around a model ship. The currently used electric wave height measuring instrument takes a long time for measurement and is hence inefficient, and the method of processing optical images also has a problem in accuracy. Therefore, a computer-controlled system was constructed using AC servo motors to drive the X and Y axes of a traverse equipment. In the fabricated equipment, four servo-type wave height meters are installed on a rack moving in the lateral (Y-axial) direction so that four wave heights can be measured automatically all at once. Wave heights can also be measured continuously by moving the rack at a constant speed, and it was verified that wave shapes in longitudinal cross sections can be acquired in only one towing. The time required for the measurements using the instrument was 40 hours net for fixed-point measurement and 12 hours for continuous measurement, or 52 hours in total, whereas the time may reach 240 hours for fixed-point measurement when the conventional all-point manual traverse equipment is used. Enormous savings were thus obtained by automating the instrument. Collection of wave height data will continue on tankers and other types of ships. 2 refs., 8 figs., 1 tab.

  11. Compilation of Theses Abstracts

    National Research Council Canada - National Science Library

    2005-01-01

    This publication contains unclassified/unrestricted abstracts of classified or restricted theses submitted for the degrees of Doctor of Philosophy, Master of Business Administration, Master of Science...

  12. Computational Abstraction Steps

    DEFF Research Database (Denmark)

    Thomsen, Lone Leth; Thomsen, Bent; Nørmark, Kurt

    2010-01-01

In this paper we discuss computational abstraction steps as a way to create class abstractions from concrete objects, and from examples. Computational abstraction steps are regarded as symmetric counterparts to computational concretisation steps, which are well-known in terms of function calls... and class instantiations. Our teaching experience shows that many novice programmers find it difficult to write programs with abstractions that materialise to concrete objects later in the development process. The contribution of this paper is the idea of initiating a programming process by creating...

  13. Nuclear medicine. Abstracts; Nuklearmedizin 2000. Abstracts

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    2000-07-01

This issue of the journal contains the abstracts of the 183 conference papers as well as 266 posters presented at the conference. Subject fields covered are: Neurology, psychology, oncology, pediatrics, radiopharmacy, endocrinology, EDP, measuring equipment and methods, radiological protection, cardiology, and therapy. (orig./CB)

  14. Development of a semi-automated method for mitral valve modeling with medial axis representation using 3D ultrasound.

    Science.gov (United States)

    Pouch, Alison M; Yushkevich, Paul A; Jackson, Benjamin M; Jassar, Arminder S; Vergnat, Mathieu; Gorman, Joseph H; Gorman, Robert C; Sehgal, Chandra M

    2012-02-01

    Precise 3D modeling of the mitral valve has the potential to improve our understanding of valve morphology, particularly in the setting of mitral regurgitation (MR). Toward this goal, the authors have developed a user-initialized algorithm for reconstructing valve geometry from transesophageal 3D ultrasound (3D US) image data. Semi-automated image analysis was performed on transesophageal 3D US images obtained from 14 subjects with MR ranging from trace to severe. Image analysis of the mitral valve at midsystole had two stages: user-initialized segmentation and 3D deformable modeling with continuous medial representation (cm-rep). Semi-automated segmentation began with user-identification of valve location in 2D projection images generated from 3D US data. The mitral leaflets were then automatically segmented in 3D using the level set method. Second, a bileaflet deformable medial model was fitted to the binary valve segmentation by Bayesian optimization. The resulting cm-rep provided a visual reconstruction of the mitral valve, from which localized measurements of valve morphology were automatically derived. The features extracted from the fitted cm-rep included annular area, annular circumference, annular height, intercommissural width, septolateral length, total tenting volume, and percent anterior tenting volume. These measurements were compared to those obtained by expert manual tracing. Regurgitant orifice area (ROA) measurements were compared to qualitative assessments of MR severity. The accuracy of valve shape representation with cm-rep was evaluated in terms of the Dice overlap between the fitted cm-rep and its target segmentation. The morphological features and anatomic ROA derived from semi-automated image analysis were consistent with manual tracing of 3D US image data and with qualitative assessments of MR severity made on clinical radiology. The fitted cm-reps accurately captured valve shape and demonstrated patient-specific differences in valve
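The validation metric used in this record, the Dice overlap between the fitted cm-rep and its target segmentation, is straightforward to compute. A minimal sketch, assuming the two binary segmentations are flattened into equal-length 0/1 label sequences:

```python
def dice_overlap(seg_a, seg_b):
    """Dice coefficient between two binary segmentations, given as
    equal-length sequences of 0/1 voxel labels: 2|A∩B| / (|A| + |B|)."""
    inter = sum(1 for a, b in zip(seg_a, seg_b) if a and b)
    size = sum(seg_a) + sum(seg_b)
    # two empty segmentations are conventionally treated as a perfect match
    return 2.0 * inter / size if size else 1.0
```

A value of 1.0 indicates identical segmentations; values near 0 indicate little shared volume.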

  15. Abstraction of Drift Seepage

    Energy Technology Data Exchange (ETDEWEB)

    J.T. Birkholzer

    2004-11-01

This model report documents the abstraction of drift seepage, conducted to provide seepage-relevant parameters and their probability distributions for use in Total System Performance Assessment for License Application (TSPA-LA). Drift seepage refers to the flow of liquid water into waste emplacement drifts. Water that seeps into drifts may contact waste packages and potentially mobilize radionuclides, and may result in advective transport of radionuclides through breached waste packages ["Risk Information to Support Prioritization of Performance Assessment Models" (BSC 2003 [DIRS 168796], Section 3.3.2)]. The unsaturated rock layers overlying and hosting the repository form a natural barrier that reduces the amount of water entering emplacement drifts by natural subsurface processes. For example, drift seepage is limited by the capillary barrier forming at the drift crown, which decreases or even eliminates water flow from the unsaturated fractured rock into the drift. During the first few hundred years after waste emplacement, when above-boiling rock temperatures will develop as a result of heat generated by the decay of the radioactive waste, vaporization of percolation water is an additional factor limiting seepage. Estimating the effectiveness of these natural barrier capabilities and predicting the amount of seepage into drifts is an important aspect of assessing the performance of the repository. The TSPA-LA therefore includes a seepage component that calculates the amount of seepage into drifts ["Total System Performance Assessment (TSPA) Model/Analysis for the License Application" (BSC 2004 [DIRS 168504], Section 6.3.3.1)]. The TSPA-LA calculation is performed with a probabilistic approach that accounts for the spatial and temporal variability and inherent uncertainty of seepage-relevant properties and processes. Results are used for subsequent TSPA-LA components that may handle, for example, waste package

  16. Abstraction of Drift Seepage

    International Nuclear Information System (INIS)

    J.T. Birkholzer

    2004-01-01

This model report documents the abstraction of drift seepage, conducted to provide seepage-relevant parameters and their probability distributions for use in Total System Performance Assessment for License Application (TSPA-LA). Drift seepage refers to the flow of liquid water into waste emplacement drifts. Water that seeps into drifts may contact waste packages and potentially mobilize radionuclides, and may result in advective transport of radionuclides through breached waste packages ["Risk Information to Support Prioritization of Performance Assessment Models" (BSC 2003 [DIRS 168796], Section 3.3.2)]. The unsaturated rock layers overlying and hosting the repository form a natural barrier that reduces the amount of water entering emplacement drifts by natural subsurface processes. For example, drift seepage is limited by the capillary barrier forming at the drift crown, which decreases or even eliminates water flow from the unsaturated fractured rock into the drift. During the first few hundred years after waste emplacement, when above-boiling rock temperatures will develop as a result of heat generated by the decay of the radioactive waste, vaporization of percolation water is an additional factor limiting seepage. Estimating the effectiveness of these natural barrier capabilities and predicting the amount of seepage into drifts is an important aspect of assessing the performance of the repository. The TSPA-LA therefore includes a seepage component that calculates the amount of seepage into drifts ["Total System Performance Assessment (TSPA) Model/Analysis for the License Application" (BSC 2004 [DIRS 168504], Section 6.3.3.1)]. The TSPA-LA calculation is performed with a probabilistic approach that accounts for the spatial and temporal variability and inherent uncertainty of seepage-relevant properties and processes. Results are used for subsequent TSPA-LA components that may handle, for example, waste package corrosion or radionuclide transport

  17. Mathematical model of elements of automated system of loose materials dosing

    OpenAIRE

    Kozak, Andriy

    2016-01-01

    Automatic dosing system for loose materials is widely used in construction, food and pharmaceutical industries to prepare various mixtures. The main criterion for optimization of such systems is the accuracy of dosing of each component of the mixture, which depends on the speed component dosing and other process factors. Technological requirements for product quality in production and the high cost of individual components of the mixture strictly regulate the developers of the automated syste...

  18. Truthful Monadic Abstractions

    DEFF Research Database (Denmark)

    Brock-Nannestad, Taus; Schürmann, Carsten

    2012-01-01

    indefinitely, finding neither a proof nor a disproof of a given subgoal. In this paper we characterize a family of truth-preserving abstractions from intuitionistic first-order logic to the monadic fragment of classical first-order logic. Because they are truthful, these abstractions can be used to disprove...

  19. Check Sample Abstracts.

    Science.gov (United States)

    Alter, David; Grenache, David G; Bosler, David S; Karcher, Raymond E; Nichols, James; Rajadhyaksha, Aparna; Camelo-Piragua, Sandra; Rauch, Carol; Huddleston, Brent J; Frank, Elizabeth L; Sluss, Patrick M; Lewandrowski, Kent; Eichhorn, John H; Hall, Janet E; Rahman, Saud S; McPherson, Richard A; Kiechle, Frederick L; Hammett-Stabler, Catherine; Pierce, Kristin A; Kloehn, Erica A; Thomas, Patricia A; Walts, Ann E; Madan, Rashna; Schlesinger, Kathie; Nawgiri, Ranjana; Bhutani, Manoop; Kanber, Yonca; Abati, Andrea; Atkins, Kristen A; Farrar, Robert; Gopez, Evelyn Valencerina; Jhala, Darshana; Griffin, Sonya; Jhala, Khushboo; Jhala, Nirag; Bentz, Joel S; Emerson, Lyska; Chadwick, Barbara E; Barroeta, Julieta E; Baloch, Zubair W; Collins, Brian T; Middleton, Owen L; Davis, Gregory G; Haden-Pinneri, Kathryn; Chu, Albert Y; Keylock, Joren B; Ramoso, Robert; Thoene, Cynthia A; Stewart, Donna; Pierce, Arand; Barry, Michelle; Aljinovic, Nika; Gardner, David L; Barry, Michelle; Shields, Lisa B E; Arnold, Jack; Stewart, Donna; Martin, Erica L; Rakow, Rex J; Paddock, Christopher; Zaki, Sherif R; Prahlow, Joseph A; Stewart, Donna; Shields, Lisa B E; Rolf, Cristin M; Falzon, Andrew L; Hudacki, Rachel; Mazzella, Fermina M; Bethel, Melissa; Zarrin-Khameh, Neda; Gresik, M Vicky; Gill, Ryan; Karlon, William; Etzell, Joan; Deftos, Michael; Karlon, William J; Etzell, Joan E; Wang, Endi; Lu, Chuanyi M; Manion, Elizabeth; Rosenthal, Nancy; Wang, Endi; Lu, Chuanyi M; Tang, Patrick; Petric, Martin; Schade, Andrew E; Hall, Geraldine S; Oethinger, Margret; Hall, Geraldine; Picton, Avis R; Hoang, Linda; Imperial, Miguel Ranoa; Kibsey, Pamela; Waites, Ken; Duffy, Lynn; Hall, Geraldine S; Salangsang, Jo-Anne M; Bravo, Lulette Tricia C; Oethinger, Margaret D; Veras, Emanuela; Silva, Elvia; Vicens, Jimena; Silva, Elvio; Keylock, Joren; Hempel, James; Rushing, Elizabeth; Posligua, Lorena E; Deavers, Michael T; Nash, Jason W; Basturk, Olca; Perle, Mary Ann; Greco, Alba; Lee, Peng; Maru, Dipen; 
Weydert, Jamie Allen; Stevens, Todd M; Brownlee, Noel A; Kemper, April E; Williams, H James; Oliverio, Brock J; Al-Agha, Osama M; Eskue, Kyle L; Newlands, Shawn D; Eltorky, Mahmoud A; Puri, Puja K; Royer, Michael C; Rush, Walter L; Tavora, Fabio; Galvin, Jeffrey R; Franks, Teri J; Carter, James Elliot; Kahn, Andrea Graciela; Lozada Muñoz, Luis R; Houghton, Dan; Land, Kevin J; Nester, Theresa; Gildea, Jacob; Lefkowitz, Jerry; Lacount, Rachel A; Thompson, Hannis W; Refaai, Majed A; Quillen, Karen; Lopez, Ana Ortega; Goldfinger, Dennis; Muram, Talia; Thompson, Hannis

    2009-02-01

    The following abstracts are compiled from Check Sample exercises published in 2008. These peer-reviewed case studies assist laboratory professionals with continuing medical education and are developed in the areas of clinical chemistry, cytopathology, forensic pathology, hematology, microbiology, surgical pathology, and transfusion medicine. Abstracts for all exercises published in the program will appear annually in AJCP.

  20. Program and abstracts

    International Nuclear Information System (INIS)

    1976-01-01

    Abstracts of the papers given at the conference are presented. The abstracts are arranged under sessions entitled: Theoretical Physics; Nuclear Physics; Solid State Physics; Spectroscopy; Plasma Physics; Solar-Terrestrial Physics; Astrophysics and Astronomy; Radioastronomy; General Physics; Applied Physics; Industrial Physics

  1. In-Package Chemistry Abstraction

    Energy Technology Data Exchange (ETDEWEB)

    P.S. Domski

    2003-07-21

The work associated with the development of this model report was performed in accordance with the requirements established in ''Technical Work Plan for Waste Form Degradation Modeling, Testing, and Analyses in Support of SR and LA'' (BSC 2002a). The in-package chemistry model and in-package chemistry model abstraction are developed to predict the bulk chemistry inside of a failed waste package and to provide simplified expressions of that chemistry. The purpose of this work is to provide the abstraction model to the Performance Assessment Project and the Waste Form Department for development of geochemical models of the waste package interior. The scope of this model report is to describe the development and validation of the in-package chemistry model and in-package chemistry model abstraction. The in-package chemistry model will consider chemical interactions of water with the waste package materials and the waste form for commercial spent nuclear fuel (CSNF) and codisposed high-level waste glass (HLWG) and N Reactor spent fuel (CDNR). The in-package chemistry model includes two sub-models. The first is a water vapor condensation (WVC) model, where water enters a waste package as vapor and forms a film on the waste package components, with subsequent film reactions with the waste package materials and waste form; this is a no-flow model in which the reacted fluids do not exit the waste package via advection. The second sub-model of the in-package chemistry model is the seepage dripping model (SDM), where water that may have seeped into the repository from the surrounding rock enters a failed waste package, reacts with the waste package components and waste form, and then exits the waste package with no accumulation of reacted water in the waste package. Both of the sub-models of the in-package chemistry model are film models, in contrast to past in-package chemistry models where all of the waste package pore space was filled with water. The

  2. Water planning in a mixed land use Mediterranean area: point-source abstraction and pollution scenarios by a numerical model of varying stream-aquifer regime.

    Science.gov (United States)

    Du, Mingxuan; Fouché, Olivier; Zavattero, Elodie; Ma, Qiang; Delestre, Olivier; Gourbesville, Philippe

    2018-02-22

    Integrated hydrodynamic modelling is an efficient approach for making semi-quantitative scenarios reliable enough for groundwater management, provided that the numerical simulations are from a validated model. The model set-up, however, involves many inputs due to the complexity of both the hydrological system and the land use. The case study of a Mediterranean alluvial unconfined aquifer in the lower Var valley (Southern France) is useful to test a method to estimate lacking data on water abstraction by small farms in urban context. With this estimation of the undocumented pumping volumes, and after calibration of the exchange parameters of the stream-aquifer system with the help of a river model, the groundwater flow model shows a high goodness of fit with the measured potentiometric levels. The consistency between simulated results and real behaviour of the system, with regard to the observed effects of lowering weirs and previously published hydrochemistry data, confirms reliability of the groundwater flow model. On the other hand, accuracy of the transport model output may be influenced by many parameters, many of which are not derived from field measurements. In this case study, for which river-aquifer feeding is the main control, the partition coefficient between direct recharge and runoff does not show a significant effect on the transport model output, and therefore, uncertainty of the hydrological terms such as evapotranspiration and runoff is not a first-rank issue to the pollution propagation. The simulation of pollution scenarios with the model returns expected pessimistic outputs, with regard to hazard management. The model is now ready to be used in a decision support system by the local water supply managers.

  3. An Analysis Technique/Automated Tool for Comparing and Tracking Analysis Modes of Different Finite Element Models

    Science.gov (United States)

    Towner, Robert L.; Band, Jonathan L.

    2012-01-01

    An analysis technique was developed to compare and track mode shapes for different Finite Element Models. The technique may be applied to a variety of structural dynamics analyses, including model reduction validation (comparing unreduced and reduced models), mode tracking for various parametric analyses (e.g., launch vehicle model dispersion analysis to identify sensitivities to modal gain for Guidance, Navigation, and Control), comparing models of different mesh fidelity (e.g., a coarse model for a preliminary analysis compared to a higher-fidelity model for a detailed analysis) and mode tracking for a structure with properties that change over time (e.g., a launch vehicle from liftoff through end-of-burn, with propellant being expended during the flight). Mode shapes for different models are compared and tracked using several numerical indicators, including traditional Cross-Orthogonality and Modal Assurance Criteria approaches, as well as numerical indicators obtained by comparing modal strain energy and kinetic energy distributions. This analysis technique has been used to reliably identify correlated mode shapes for complex Finite Element Models that would otherwise be difficult to compare using traditional techniques. This improved approach also utilizes an adaptive mode tracking algorithm that allows for automated tracking when working with complex models and/or comparing a large group of models.
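The Modal Assurance Criterion comparison mentioned above can be sketched in a few lines. The mode shapes below are made-up 2-mode, 3-DOF examples, not data from the cited tool; the second model is simply the first with its modes swapped, so the tracking step should pair them crosswise.

```python
import numpy as np

def mac(phi_a: np.ndarray, phi_b: np.ndarray) -> np.ndarray:
    """Modal Assurance Criterion matrix between two sets of mode
    shapes (columns are modes). MAC near 1 => correlated shapes."""
    num = np.abs(phi_a.T @ phi_b) ** 2
    den = np.outer(np.sum(phi_a * phi_a, axis=0),
                   np.sum(phi_b * phi_b, axis=0))
    return num / den

# Hypothetical 3-DOF mode shapes; phi2 is phi1 with columns swapped.
phi1 = np.array([[1.0, 1.0], [2.0, -1.0], [3.0, 0.5]])
phi2 = phi1[:, ::-1]

m = mac(phi1, phi2)
tracked = m.argmax(axis=1)  # naive mode tracking: best MAC match
print(tracked)  # -> [1 0]
```

A production tracker would add the cross-orthogonality (mass-weighted) variant and the strain/kinetic-energy indicators the abstract lists, but the matching step reduces to picking the best-scoring column as above.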

  4. Completeness of Lyapunov Abstraction

    DEFF Research Database (Denmark)

    Wisniewski, Rafal; Sloth, Christoffer

    2013-01-01

This paper addresses the generation of complete abstractions of polynomial dynamical systems by timed automata. For the proposed abstraction, the state space is divided into cells by sublevel sets of functions. We identify a relation between these functions and their directional derivatives along the vector field, which allows the generation of a complete abstraction. To compute the functions that define the subdivision of the state space in an algorithm, we formulate a sum of squares optimization problem. This optimization problem finds the best subdivisioning functions, with respect to the ability...

  5. Toward the virtual cell: Automated approaches to building models of subcellular organization “learned” from microscopy images

    Science.gov (United States)

    Buck, Taráz E.; Li, Jieyue; Rohde, Gustavo K.; Murphy, Robert F.

    2012-01-01

    We review state-of-the-art computational methods for constructing, from image data, generative statistical models of cellular and nuclear shapes and the arrangement of subcellular structures and proteins within them. These automated approaches allow consistent analysis of images of cells for the purposes of learning the range of possible phenotypes, discriminating between them, and informing further investigation. Such models can also provide realistic geometry and initial protein locations to simulations in order to better understand cellular and subcellular processes. To determine the structures of cellular components and how proteins and other molecules are distributed among them, the generative modeling approach described here can be coupled with high throughput imaging technology to infer and represent subcellular organization from data with few a priori assumptions. We also discuss potential improvements to these methods and future directions for research. PMID:22777818

  6. Adaptive Automation Based on an Object-Oriented Task Model: Implementation and Evaluation in a Realistic C2 Environment

    NARCIS (Netherlands)

    Greef, T.E.; Arciszewski, H.F.R.; Neerincx, M.A.

    2010-01-01

    Staffing reduction initiatives and more complicated military operations lead to a higher cognitive workload in command and control (C2) environments. Extending automation with adaptive capabilities can aid the human in overcoming cognitive workload challenges. At present, most adaptive automation

  7. Science meeting. Abstracts

    International Nuclear Information System (INIS)

    2000-01-01

The document is a collection of the science meeting abstracts in the fields of nuclear physics, medical sciences, chemistry, agriculture, environment, engineering, material sciences, and different aspects of energy, and presents research done in these fields in 2000.

  8. The deleuzian abstract machines

    DEFF Research Database (Denmark)

    Werner Petersen, Erik

    2005-01-01

To most people the concept of abstract machines is connected to the name of Alan Turing and the development of the modern computer. The Turing machine is universal, axiomatic and symbolic (e.g. operating on symbols). Inspired by Foucault, Deleuze and Guattari extended the concept of abstract machines to singular, non-axiomatic and diagrammatic machines, that is: machines which constitute becomings. This presentation gives a survey of the development of the concept of abstract machines in the philosophy of Deleuze and Guattari and the function of these abstract machines in the creation of works of art. From Difference and Repetition to Anti-Oedipus, the machines are conceived as binary machines based on the exclusive or inclusive use, respectively, of the three syntheses: conexa, disjuncta and conjuncta. The machines have a twofold embedment: in the desiring-production and in the social...

  9. Mathematical games, abstract games

    CERN Document Server

    Neto, Joao Pedro

    2013-01-01

    User-friendly, visually appealing collection offers both new and classic strategic board games. Includes abstract games for two and three players and mathematical games such as Nim and games on graphs.
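As a concrete taste of the mathematical games the book covers, the classic winning strategy for Nim reduces to the XOR ("Nim-sum") of the heap sizes: the player to move is losing exactly when the Nim-sum is zero, and otherwise can always move to make it zero.

```python
from functools import reduce
from operator import xor

def winning_move(heaps):
    """Return (heap_index, new_size) leaving Nim-sum 0,
    or None if the position is already lost for the mover."""
    s = reduce(xor, heaps, 0)
    if s == 0:
        return None  # every move hands the opponent a winning reply
    for i, h in enumerate(heaps):
        target = h ^ s
        if target < h:          # a legal reduction of heap i exists
            return i, target
    return None  # unreachable when s != 0

print(winning_move([3, 4, 5]))  # -> (0, 1), since 1 ^ 4 ^ 5 == 0
```

This is a standard result (the Sprague-Grundy theory for Nim), not something specific to this book's presentation.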

  10. Introduction to abstract algebra

    CERN Document Server

    Smith, Jonathan D H

    2008-01-01

    Taking a slightly different approach from similar texts, Introduction to Abstract Algebra presents abstract algebra as the main tool underlying discrete mathematics and the digital world. It helps students fully understand groups, rings, semigroups, and monoids by rigorously building concepts from first principles. A Quick Introduction to Algebra The first three chapters of the book show how functional composition, cycle notation for permutations, and matrix notation for linear functions provide techniques for practical computation. The author also uses equivalence relations to introduc
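The functional composition and cycle notation for permutations highlighted in those opening chapters can be sketched as follows (zero-indexed permutations; a minimal illustration, not the text's own code):

```python
def compose(p, q):
    """Compose permutations given as tuples: (p o q)(i) = p[q[i]]."""
    return tuple(p[i] for i in q)

def cycles(p):
    """Disjoint-cycle notation of a permutation of {0, ..., n-1},
    omitting fixed points."""
    seen, out = set(), []
    for start in range(len(p)):
        if start in seen:
            continue
        cyc, i = [], start
        while i not in seen:
            seen.add(i)
            cyc.append(i)
            i = p[i]
        if len(cyc) > 1:
            out.append(tuple(cyc))
    return out

p = (1, 0, 2)   # transposition swapping 0 and 1
q = (1, 2, 0)   # 3-cycle 0 -> 1 -> 2 -> 0
print(cycles(compose(p, q)))  # -> [(1, 2)]
```

Composing the 3-cycle with the transposition collapses to a single transposition, which is exactly the kind of computation cycle notation makes routine.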

  11. Abstracts of contributed papers

    Energy Technology Data Exchange (ETDEWEB)

    1994-08-01

    This volume contains 571 abstracts of contributed papers to be presented during the Twelfth US National Congress of Applied Mechanics. Abstracts are arranged in the order in which they fall in the program -- the main sessions are listed chronologically in the Table of Contents. The Author Index is in alphabetical order and lists each paper number (matching the schedule in the Final Program) with its corresponding page number in the book.

  12. Extending and applying active appearance models for automated, high precision segmentation in different image modalities

    DEFF Research Database (Denmark)

    Stegmann, Mikkel Bille; Fisker, Rune; Ersbøll, Bjarne Kjær

    2001-01-01

...object class description, which can be employed to rapidly search images for new object instances. The proposed extensions concern enhanced shape representation, handling of homogeneous and heterogeneous textures, refinement optimization using Simulated Annealing, and robust statistics. Finally, an initialization scheme is designed, thus making the usage of AAMs fully automated. Using these extensions it is demonstrated that AAMs can segment bone structures in radiographs, pork chops in perspective images, and the left ventricle in cardiovascular magnetic resonance images in a robust, fast and accurate...

  13. Automated Instrumentation System Verification.

    Science.gov (United States)

    1983-04-01

[Partially recoverable OCR text from the scanned report AFWL-TR-82-137; security-classification form debris removed.] ...automatic measurement should arise. ... II. TRADITIONAL PROCEDURES: The necessity to measure data ... measurement (Ref. 8). Finally, when the necessity for automation was recognized and funds were provided, the effort described in this report was started.

  14. Automated ISMS control auditability

    OpenAIRE

    Suomu, Mikko

    2015-01-01

This thesis focuses on researching a possible reference model for automated ISMS (Information Security Management System) technical control auditability. The main objective was to develop a generic framework for automated compliance status monitoring of the ISO27001:2013 standard which could be re-used in any ISMS system. The framework was tested with Proof of Concept (PoC) empirical research in a test infrastructure which simulates the framework target deployment environment. To fulfi...

  15. Automated security management

    CERN Document Server

    Al-Shaer, Ehab; Xie, Geoffrey

    2013-01-01

    In this contributed volume, leading international researchers explore configuration modeling and checking, vulnerability and risk assessment, configuration analysis, and diagnostics and discovery. The authors equip readers to understand automated security management systems and techniques that increase overall network assurability and usability. These constantly changing networks defend against cyber attacks by integrating hundreds of security devices such as firewalls, IPSec gateways, IDS/IPS, authentication servers, authorization/RBAC servers, and crypto systems. Automated Security Managemen

  16. A knowledge- and model-based system for automated weaning from mechanical ventilation: technical description and first clinical application.

    Science.gov (United States)

    Schädler, Dirk; Mersmann, Stefan; Frerichs, Inéz; Elke, Gunnar; Semmel-Griebeler, Thomas; Noll, Oliver; Pulletz, Sven; Zick, Günther; David, Matthias; Heinrichs, Wolfgang; Scholz, Jens; Weiler, Norbert

    2014-10-01

    To describe the principles and the first clinical application of a novel prototype automated weaning system called Evita Weaning System (EWS). EWS allows an automated control of all ventilator settings in pressure controlled and pressure support mode with the aim of decreasing the respiratory load of mechanical ventilation. Respiratory load takes inspired fraction of oxygen, positive end-expiratory pressure, pressure amplitude and spontaneous breathing activity into account. Spontaneous breathing activity is assessed by the number of controlled breaths needed to maintain a predefined respiratory rate. EWS was implemented as a knowledge- and model-based system that autonomously and remotely controlled a mechanical ventilator (Evita 4, Dräger Medical, Lübeck, Germany). In a selected case study (n = 19 patients), ventilator settings chosen by the responsible physician were compared with the settings 10 min after the start of EWS and at the end of the study session. Neither unsafe ventilator settings nor failure of the system occurred. All patients were successfully transferred from controlled ventilation to assisted spontaneous breathing in a mean time of 37 ± 17 min (± SD). Early settings applied by the EWS did not significantly differ from the initial settings, except for the fraction of oxygen in inspired gas. During the later course, EWS significantly modified most of the ventilator settings and reduced the imposed respiratory load. A novel prototype automated weaning system was successfully developed. The first clinical application of EWS revealed that its operation was stable, safe ventilator settings were defined and the respiratory load of mechanical ventilation was decreased.
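A heavily simplified, rule-based sketch of the kind of stepwise support reduction such a system might perform is shown below. The thresholds, step size, and limits are invented for illustration and are NOT the EWS algorithm; the real system controls all ventilator settings via a knowledge- and model-based engine.

```python
def next_support(pressure_support: int, controlled_breaths: int,
                 spont_ok_threshold: int = 2) -> int:
    """Toy weaning rule: if few controlled breaths were needed to hold
    the target respiratory rate (good spontaneous activity), lower
    pressure support by 2 cmH2O; otherwise raise it. Clamp to [5, 25]."""
    step = -2 if controlled_breaths <= spont_ok_threshold else +2
    return max(5, min(25, pressure_support + step))

print(next_support(12, 1))  # good spontaneous activity -> 10
print(next_support(12, 6))  # poor spontaneous activity -> 14
```

Even this toy shows the closed-loop idea: an observable proxy for spontaneous breathing activity drives bounded, incremental changes in the imposed respiratory load.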

  17. Library Automation

    OpenAIRE

    Dhakne, B. N.; Giri, V. V; Waghmode, S. S.

    2010-01-01

New technologies provide libraries with several new materials, media, and modes of storing and communicating information. Library automation reduces the drudgery of repeated manual effort in library routines such as collection, storage, administration, processing, preservation, and communication.

  18. H-Abstraction reactions by OH, HO2, O, O2 and benzyl radical addition to O2 and their implications for kinetic modelling of toluene oxidation.

    Science.gov (United States)

    Pelucchi, M; Cavallotti, C; Faravelli, T; Klippenstein, S J

    2018-02-01

Alkylated aromatics constitute a significant fraction of the components commonly found in commercial fuels. Toluene is typically considered as a reference fuel. Together with n-heptane and iso-octane, it allows for realistic emulations of the behavior of real fuels by means of surrogate mixture formulations. Moreover, it is a key precursor for the formation of poly-aromatic hydrocarbons, which are of relevance to understanding soot growth and oxidation mechanisms. In this study the POLIMI kinetic model is first updated based on the literature and on recent kinetic modelling studies of toluene pyrolysis and oxidation. Then, important reaction pathways are investigated by means of high-level theoretical methods, thereby advancing the present knowledge on toluene oxidation. H-abstraction reactions by OH, HO2, O and O2, and the reactivity on the multi-well benzyl-oxygen (C6H5CH2 + O2) potential energy surface (PES), were investigated using electronic structure calculations, transition state theory in its conventional, variational, and variable reaction coordinate forms (VRC-TST), and master equation calculations. Exploration of the effect on POLIMI model performance of literature rate constants and of the present calculations provides valuable guidelines for implementation of the new rate parameters in existing toluene kinetic models.
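The conventional transition-state-theory rate constants discussed above follow the Eyring form k(T) = (kB*T/h) * exp(-dG_act/RT). A sketch with an illustrative barrier (the 100 kJ/mol value is invented, not a fitted toluene parameter):

```python
import math

KB = 1.380649e-23    # Boltzmann constant, J/K
H = 6.62607015e-34   # Planck constant, J*s
R = 8.314462618      # gas constant, J/(mol*K)

def tst_rate(delta_g_act_kj_mol: float, temperature: float) -> float:
    """Conventional TST (Eyring) rate constant for a unimolecular
    step, k(T) = (kB*T/h) * exp(-dG_act / (R*T)), in 1/s."""
    return (KB * temperature / H) * math.exp(
        -delta_g_act_kj_mol * 1e3 / (R * temperature))

# Rate rises steeply with temperature for a fixed barrier:
for T in (800.0, 1200.0):
    print(f"T = {T:.0f} K  k = {tst_rate(100.0, T):.3e} 1/s")
```

Variational and VRC-TST refinements, and the master-equation treatment of the multi-well C6H5CH2 + O2 surface, go well beyond this one-line expression, but the conventional form is the baseline they improve on.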

  19. Automation of block assignment planning using a diagram-based scenario modeling method

    Science.gov (United States)

    Hwang, In Hyuck; Kim, Youngmin; Lee, Dong Kun; Shin, Jong Gye

    2014-03-01

Most shipbuilding scheduling research so far has focused on the load level on the dock plan. This is because the dock is the least extendable resource in shipyards, and its overloading is difficult to resolve. However, once dock scheduling is completed, making a plan that makes the best use of the rest of the resources in the shipyard to minimize any additional cost is also important. Block assignment planning is one of the midterm planning tasks; it assigns a block to the facility (factory/shop or surface plate) that will actually manufacture the block according to the block characteristics and current situation of the facility. It is one of the most heavily loaded midterm planning tasks and is carried out manually by experienced workers. In this study, a method of representing the block assignment rules using a diagram was suggested through analysis of the existing manual process. A block allocation program was developed which automated the block assignment process according to the rules represented by the diagram. The planning scenario was validated through a case study that compared the manual assignment and two automated block assignment results.

  20. MeGARA: Menu-based Game Abstraction and Abstraction Refinement of Markov Automata

    Directory of Open Access Journals (Sweden)

    Bettina Braitling

    2014-06-01

Markov automata combine continuous time, probabilistic transitions, and nondeterminism in a single model. They represent an important and powerful way to model a wide range of complex real-life systems. However, such models tend to be large and difficult to handle, making abstraction and abstraction refinement necessary. In this paper we present an abstraction and abstraction refinement technique for Markov automata, based on the game-based and menu-based abstraction of probabilistic automata. First experiments show that a significant reduction in size is possible using abstraction.

  1. Metacognition and abstract reasoning.

    Science.gov (United States)

    Markovits, Henry; Thompson, Valerie A; Brisson, Janie

    2015-05-01

    The nature of people's meta-representations of deductive reasoning is critical to understanding how people control their own reasoning processes. We conducted two studies to examine whether people have a metacognitive representation of abstract validity and whether familiarity alone acts as a separate metacognitive cue. In Study 1, participants were asked to make a series of (1) abstract conditional inferences, (2) concrete conditional inferences with premises having many potential alternative antecedents and thus specifically conducive to the production of responses consistent with conditional logic, or (3) concrete problems with premises having relatively few potential alternative antecedents. Participants gave confidence ratings after each inference. Results show that confidence ratings were positively correlated with logical performance on abstract problems and concrete problems with many potential alternatives, but not with concrete problems with content less conducive to normative responses. Confidence ratings were higher with few alternatives than for abstract content. Study 2 used a generation of contrary-to-fact alternatives task to improve levels of abstract logical performance. The resulting increase in logical performance was mirrored by increases in mean confidence ratings. Results provide evidence for a metacognitive representation based on logical validity, and show that familiarity acts as a separate metacognitive cue.

  2. The construct of state-level suspicion: a model and research agenda for automated and information technology (IT) contexts.

    Science.gov (United States)

    Bobko, Philip; Barelka, Alex J; Hirshfield, Leanne M

    2014-05-01

    The objective was to review and integrate available research about the construct of state-level suspicion as it appears in social science literatures and apply the resulting findings to information technology (IT) contexts. Although the human factors literature is replete with articles about trust (and distrust) in automation, there is little on the related, but distinct, construct of "suspicion" (in either automated or IT contexts). The construct of suspicion--its precise definition, theoretical correlates, and role in such applications--deserves further study. Literatures that consider suspicion are reviewed and integrated. Literatures include communication, psychology, human factors, management, marketing, information technology, and brain/neurology. We first develop a generic model of state-level suspicion. Research propositions are then derived within IT contexts. Fundamental components of suspicion include (a) uncertainty, (b) increased cognitive processing (e.g., generation of alternative explanations for perceived discrepancies), and (c) perceptions of (mal)intent. State suspicion is defined as the simultaneous occurrence of these three components. Our analysis also suggests that trust inhibits suspicion, whereas distrust can be a catalyst of state-level suspicion. Based on a three-stage model of state-level suspicion, associated research propositions and questions are developed. These propositions and questions are intended to help guide future work on the measurement of suspicion (self-report and neurological), as well as the role of the construct of suspicion in models of decision making and detection of deception. The study of suspicion, including its correlates, antecedents, and consequences, is important. We hope that the social sciences will benefit from our integrated definition and model of state suspicion. The research propositions regarding suspicion in IT contexts should motivate substantial research in human factors and related fields.

  3. Automated quantification and sizing of unbranched filamentous cyanobacteria by model-based object-oriented image analysis.

    Science.gov (United States)

    Zeder, Michael; Van den Wyngaert, Silke; Köster, Oliver; Felder, Kathrin M; Pernthaler, Jakob

    2010-03-01

    Quantification and sizing of filamentous cyanobacteria in environmental samples or cultures are time-consuming and are often performed by using manual or semiautomated microscopic analysis. Automation of conventional image analysis is difficult because filaments may exhibit great variations in length and patchy autofluorescence. Moreover, individual filaments frequently cross each other in microscopic preparations, as deduced by modeling. This paper describes a novel approach based on object-oriented image analysis to simultaneously determine (i) filament number, (ii) individual filament lengths, and (iii) the cumulative filament length of unbranched cyanobacterial morphotypes in fluorescent microscope images in a fully automated high-throughput manner. Special emphasis was placed on correct detection of overlapping objects by image analysis and on appropriate coverage of filament length distribution by using large composite images. The method was validated with a data set for Planktothrix rubescens from field samples and was compared with manual filament tracing, the line intercept method, and the Utermöhl counting approach. The computer program described allows batch processing of large images from any appropriate source and annotation of detected filaments. It requires no user interaction, is available free, and thus might be a useful tool for basic research and drinking water quality control.
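The object-counting and cumulative-length idea can be sketched with a synthetic binary image. This toy uses pixel counts of 1-pixel-wide connected components as a length proxy and is not the published object-oriented pipeline, which additionally handles crossing filaments and patchy fluorescence.

```python
import numpy as np
from scipy import ndimage

# Synthetic binary image with two 1-pixel-wide "filaments":
# a horizontal one of 30 px and a vertical one of 20 px.
img = np.zeros((64, 64), dtype=bool)
img[10, 5:35] = True   # horizontal filament, 30 px
img[30:50, 40] = True  # vertical filament, 20 px

# Label connected components with 8-connectivity, so diagonally
# touching pixels belong to the same filament.
labels, n = ndimage.label(img, structure=np.ones((3, 3)))
sizes = ndimage.sum_labels(img, labels, index=range(1, n + 1))

print(n)                          # -> 2 filaments
print(sorted(sizes.tolist()))     # -> [20.0, 30.0] per-filament lengths
print(float(sizes.sum()))         # -> 50.0 cumulative length
```

Replacing the pixel count with a skeleton-based length estimate, and adding a model for separating overlapping objects, is where the real method's effort goes.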

  4. Abstract Objects of Verbs

    DEFF Research Database (Denmark)

    Robering, Klaus

    2014-01-01

Verbs do often take arguments of quite different types. In an orthodox type-theoretic framework this results in an extreme polysemy of many verbs. In this article, it is shown that this unwanted consequence can be avoided when a theory of "abstract objects" is adopted according to which these obj...

  5. An Automated Feedback System Based on Adaptive Testing: Extending the Model

    Directory of Open Access Journals (Sweden)

    Trevor Barker

    2010-06-01

The results of the recent National Student Survey (NSS) revealed that a major problem in HE today is that of student feedback. Research carried out by members of the project team in the past has led to the development of an automated student feedback system for use with objective formative testing. This software relies on an 'intelligent' engine to determine the most appropriate individual feedback, based on test performance, relating not only to answers but also to Bloom's cognitive levels. The system also recommends additional materials and challenges for each individual learner. Detailed evaluation with more than 500 students and 100 university staff has shown that the system is highly valued by learners and seen by staff as an important addition to the methods available. The software has been used on two modules so far over a two-year period.
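A toy version of such a Bloom-level-driven feedback rule might look like this. The levels, score thresholds, and messages are all hypothetical; the paper's actual rule engine is not reproduced here.

```python
# Hypothetical subset of Bloom's cognitive levels, lowest first.
BLOOM = ["knowledge", "comprehension", "application", "analysis"]

def feedback(scores: dict) -> list:
    """Return one feedback line per Bloom level, chosen by the
    fraction-correct score the student achieved at that level."""
    out = []
    for level in BLOOM:
        s = scores.get(level, 0.0)   # unattempted levels count as 0
        if s >= 0.8:
            out.append(f"{level}: strong - try the extension challenge")
        elif s >= 0.5:
            out.append(f"{level}: adequate - review worked examples")
        else:
            out.append(f"{level}: weak - revisit the core material")
    return out

for line in feedback({"knowledge": 0.9, "comprehension": 0.6,
                      "application": 0.3}):
    print(line)
```

The essential design point survives the simplification: feedback keys on the cognitive level of the questions missed, not merely on which answers were wrong.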

  6. Process automation

    International Nuclear Information System (INIS)

    Moser, D.R.

    1986-01-01

    Process automation technology has been pursued in the chemical processing industries and to a very limited extent in nuclear fuel reprocessing. Its effective use has been restricted in the past by the lack of diverse and reliable process instrumentation and the unavailability of sophisticated software designed for process control. The Integrated Equipment Test (IET) facility was developed by the Consolidated Fuel Reprocessing Program (CFRP) in part to demonstrate new concepts for control of advanced nuclear fuel reprocessing plants. A demonstration of fuel reprocessing equipment automation using advanced instrumentation and a modern, microprocessor-based control system is nearing completion in the facility. This facility provides for the synergistic testing of all chemical process features of a prototypical fuel reprocessing plant that can be attained with unirradiated uranium-bearing feed materials. The unique equipment and mission of the IET facility make it an ideal test bed for automation studies. This effort will provide for the demonstration of the plant automation concept and for the development of techniques for similar applications in a full-scale plant. A set of preliminary recommendations for implementing process automation has been compiled. Some of these concepts are not generally recognized or accepted. The automation work now under way in the IET facility should be useful to others in helping avoid costly mistakes because of the underutilization or misapplication of process automation. 6 figs

  7. Benefit of modelling regarding the quality and efficiency of PLC-programming in process automation; Nutzen von Modellierung fuer die Qualitaet und Effizienz der Steuerungsprogrammierung in der Automatisierungstechnik

    Energy Technology Data Exchange (ETDEWEB)

    Friedrich, D.; Vogel-Heuser, B. [Bergische Univ. Wuppertal (Germany). Lehrstuhl fuer Automatisierungstechnik/Prozessinformatik

    2006-03-15

    Software development in process automation has many deficiencies in procedures, notations and tool support. As a result, modern software engineering concepts and notations, such as object-oriented approaches or UML, are not widespread in this field. Hence, the drawbacks regarding start-up times, additional costs and low software quality are immense. This paper evaluates the benefit of modelling as a design step prior to coding, with regard to cognitive paradigms. Two modelling notations (UML and ICL) are compared by analyzing their impact on the quality of automation software written in IEC 61131. (orig.)

  8. Formal Abstractions for Automated Verification and Synthesis of Stochastic Systems

    NARCIS (Netherlands)

    Esmaeil Zadeh Soudjani, S.

    2014-01-01

    Stochastic hybrid systems involve the coupling of discrete, continuous, and probabilistic phenomena, in which the composition of continuous and discrete variables captures the behavior of physical systems interacting with digital, computational devices. Because of their versatility and generality,

  9. Automating calibration, sensitivity and uncertainty analysis of complex models using the R package Flexible Modeling Environment (FME): SWAT as an example

    Science.gov (United States)

    Wu, Y.; Liu, S.

    2012-01-01

    Parameter optimization and uncertainty issues are a great challenge for the application of large environmental models like the Soil and Water Assessment Tool (SWAT), which is a physically-based hydrological model for simulating water and nutrient cycles at the watershed scale. In this study, we present a comprehensive modeling environment for SWAT, including automated calibration, and sensitivity and uncertainty analysis capabilities through integration with the R package Flexible Modeling Environment (FME). To address challenges (e.g., calling the model in R and transferring variables between Fortran and R) in developing such a two-language coupling framework, 1) we converted the Fortran-based SWAT model to an R function (R-SWAT) using the RFortran platform, and alternatively 2) we compiled SWAT as a Dynamic Link Library (DLL). We then wrapped SWAT (via R-SWAT) with FME to perform complex applications including parameter identifiability, inverse modeling, and sensitivity and uncertainty analysis in the R environment. The final R-SWAT-FME framework has the following key functionalities: automatic initialization of R, running Fortran-based SWAT and R commands in parallel, transferring parameters and model output between SWAT and R, and inverse modeling with visualization. To examine this framework and demonstrate how it works, a case study simulating streamflow in the Cedar River Basin in Iowa in the United States was used, and we compared it with the built-in auto-calibration tool of SWAT in parameter optimization. Results indicate that both methods performed well and similarly in searching for a set of optimal parameters. Nonetheless, the R-SWAT-FME is more attractive due to its instant visualization, and potential to take advantage of other R packages (e.g., inverse modeling and statistical graphics).
The methods presented in the paper are readily adaptable to other model applications that require capability for automated calibration, and sensitivity and uncertainty
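
The inverse-modeling step this abstract describes (adjust parameters until simulated output matches observations) can be sketched without SWAT itself. The following Python sketch uses a toy two-parameter linear-reservoir "model" as a stand-in for the wrapped simulator; the model, parameter names, and data are assumptions for illustration only.

```python
import numpy as np
from scipy.optimize import least_squares

def toy_model(params, rain):
    """Toy stand-in for a hydrological simulator: a leaky linear reservoir."""
    k, frac = params
    flow = np.zeros_like(rain)
    store = 0.0
    for i, r in enumerate(rain):
        store += frac * r          # a fraction of rainfall enters storage
        flow[i] = k * store        # outflow proportional to storage
        store -= flow[i]
    return flow

rng = np.random.default_rng(0)
rain = rng.uniform(0, 10, 50)
true_params = np.array([0.3, 0.6])
observed = toy_model(true_params, rain) + rng.normal(0, 0.01, 50)

# Inverse modeling: minimize residuals between simulation and observations,
# analogous to what FME does around the wrapped R-SWAT function.
result = least_squares(lambda p: toy_model(p, rain) - observed,
                       x0=[0.1, 0.9], bounds=([0.01, 0.01], [1.0, 1.0]))
print(result.x)
```

In the real framework the objective function would call SWAT (via R-SWAT or the DLL) instead of `toy_model`, but the calibration loop has the same shape.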

  10. Monadic abstract interpreters

    DEFF Research Database (Denmark)

    Sergey, Ilya; Devriese, Dominique; Might, Matthew

    2013-01-01

    -insensitive analysis. To achieve this unification, we develop a systematic method for transforming a concrete semantics into a monadically-parameterized abstract machine. Changing the monad changes the behavior of the machine. By changing the monad, we recover a spectrum of machines—from the original concrete...

  11. WWNPQFT-2013 - Abstracts

    International Nuclear Information System (INIS)

    Cessac, B.; Bianchi, E.; Bellon, M.; Fried, H.; Krajewski, T.; Schubert, C.; Barre, J.; Hofmann, R.; Muller, B.; Raffaelli, B.

    2014-01-01

    The object of this Workshop is to consolidate and publicize new efforts in non-perturbative Field Theories, relying on Functional Methods, Renormalization Group, and Dyson-Schwinger Equations. A presentation deals with effective vertices and photon-photon scattering in SU(2) Yang-Mills thermodynamics. This document gathers the abstracts of the presentations

  12. 2002 NASPSA Conference Abstracts.

    Science.gov (United States)

    Journal of Sport & Exercise Psychology, 2002

    2002-01-01

    Contains abstracts from the 2002 conference of the North American Society for the Psychology of Sport and Physical Activity. The publication is divided into three sections: the preconference workshop, "Effective Teaching Methods in the Classroom;" symposia (motor development, motor learning and control, and sport psychology); and free…

  13. The Abstraction Engine

    DEFF Research Database (Denmark)

    Fortescue, Michael David

    The main thesis of this book is that abstraction, far from being confined to higher forms of cognition, language and logical reasoning, has actually been a major driving force throughout the evolution of creatures with brains. It is manifest in emotive as well as rational thought. Wending its way th...

  14. Composing Interfering Abstract Protocols

    Science.gov (United States)

    2016-04-01

    Tecnologia, Universidade Nova de Lisboa, Caparica, Portugal. This document is a companion technical report of the paper, "Composing Interfering Abstract... a Ciência e Tecnologia (Portuguese Foundation for Science and Technology) through the Carnegie Mellon Portugal Program under grant SFRH/BD/33765

  15. Abstract Film and Beyond.

    Science.gov (United States)

    Le Grice, Malcolm

    A theoretical and historical account of the main preoccupations of makers of abstract films is presented in this book. The book's scope includes discussion of nonrepresentational forms as well as examination of experiments in the manipulation of time in films. The ten chapters discuss the following topics: art and cinematography, the first…

  16. Abstract Objects of Verbs

    DEFF Research Database (Denmark)

    2014-01-01

    Verbs do often take arguments of quite different types. In an orthodox type-theoretic framework this results in an extreme polysemy of many verbs. In this article, it is shown that this unwanted consequence can be avoided when a theory of "abstract objects" is adopted according to which these obj...

  17. Abstracts of submitted papers

    International Nuclear Information System (INIS)

    1987-01-01

    The conference proceedings contain 152 abstracts of presented papers relating to various aspects of personnel dosimetry, the dosimetry of the working and living environment, various types of dosemeters and spectrometers, the use of radionuclides in various industrial fields, the migration of radionuclides on Czechoslovak territory after the Chernobyl accident, theoretical studies of some parameters of ionizing radiation detectors, and their calibration. (M.D.)

  18. Metaphors in Abstract Thought

    NARCIS (Netherlands)

    I. Boot (Inge)

    2010-01-01

    The aim of the dissertation was to investigate the Conceptual Metaphor Theory (CMT, Lakoff & Johnson, 1980, 1999). The CMT proposes that abstract concepts are partly structured by concrete concepts through the mechanism of metaphorical mapping. In Chapter 2 we wanted to investigate the

  19. SPR 2015. Abstracts

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2015-04-01

    The volume contains the abstracts of the SPR (society for pediatric radiology) 2015 meeting covering the following issues: fetal imaging, musculoskeletal imaging, cardiac imaging, chest imaging, oncologic imaging, tools for process improvement, child abuse, contrast enhanced ultrasound, image gently - update of radiation dose recording/reporting/monitoring - meaningful or useless meaning?, pediatric thoracic imaging, ALARA.

  20. Building Safe Concurrency Abstractions

    DEFF Research Database (Denmark)

    Madsen, Ole Lehrmann

    2014-01-01

    Concurrent object-oriented programming in Beta is based on semaphores and coroutines and the ability to define high-level concurrency abstractions like monitors, and rendezvous-based communication, and their associated schedulers. The coroutine mechanism of SIMULA has been generalized into the no...

  1. Poster Session - Extended Abstracts

    Science.gov (United States)

    Jack D. Alexander III; Jean Findley; Brenda K. Kury; Jan L. Beyers; Douglas S. Cram; Terrell T. Baker; Jon C. Boren; Carl Edminster; Sue A. Ferguson; Steven McKay; David Nagel; Trent Piepho; Miriam Rorig; Casey Anderson; Jeanne Hoadley; Paulette L. Ford; Mark C. Andersen; Ed L. Fredrickson; Joe Truett; Gary W. Roemer; Brenda K. Kury; Jennifer Vollmer; Christine L. May; Danny C. Lee; James P. Menakis; Robert E. Keane; Zhi-Liang Zhu; Carol Miller; Brett Davis; Katharine Gray; Ken Mix; William P. Kuvlesky Jr.; D. Lynn Drawe; Marcia G. Narog; Roger D. Ottmar; Robert E. Vihnanek; Clinton S. Wright; Timothy E. Paysen; Burton K. Pendleton; Rosemary L. Pendleton; Carleton S. White; John Rogan; Doug Stow; Janet Franklin; Jennifer Miller; Lisa Levien; Chris Fischer; Emma Underwood; Robert Klinger; Peggy Moore; Clinton S. Wright

    2008-01-01

    Titles found within Poster Session-Extended Abstracts include:Assessment of emergency fire rehabilitation of four fires from the 2000 fire season on the Vale, Oregon, BLM district: review of the density sampling materials and methods: p. 329 Growth of regreen, seeded for erosion control, in the...

  2. Abstract Introduction Materials & Methods

    African Journals Online (AJOL)

    plzfg

    Abstract. Oral administration to male rats of 200 mg kg-1 body weight of an extract of Calendula officinalis flowers every day for 60 days did not cause loss of body weight, but decreased significantly the weight of the testis, epididymis, seminal vesicle and ventral prostate. Sperm motility as well as sperm density were reduced ...

  3. Impredicative concurrent abstract predicates

    DEFF Research Database (Denmark)

    Svendsen, Kasper; Birkedal, Lars

    2014-01-01

    We present impredicative concurrent abstract predicates (iCAP), a program logic for modular reasoning about concurrent, higher-order, reentrant, imperative code. Building on earlier work, iCAP uses protocols to reason about shared mutable state. A key novel feature of iCAP is the ability to define...

  4. Leadership Abstracts, 2002.

    Science.gov (United States)

    Wilson, Cynthia, Ed.; Milliron, Mark David, Ed.

    2002-01-01

    This 2002 volume of Leadership Abstracts contains issue numbers 1-12. Articles include: (1) "Skills Certification and Workforce Development: Partnering with Industry and Ourselves," by Jeffrey A. Cantor; (2) "Starting Again: The Brookhaven Success College," by Alice W. Villadsen; (3) "From Digital Divide to Digital Democracy," by Gerardo E. de los…

  5. Circularity and Lambda Abstraction

    DEFF Research Database (Denmark)

    Danvy, Olivier; Thiemann, Peter; Zerny, Ian

    2013-01-01

    unknowns from what is done to them, which we lambda-abstract with functions. The circular unknowns then become dead variables, which we eliminate. The result is a strict circular program a la Pettorossi. This transformation is reversible: given a strict circular program a la Pettorossi, we introduce...

  6. Abstract decomposition theorem and applications

    CERN Document Server

    Grossberg, R; Grossberg, Rami; Lessmann, Olivier

    2005-01-01

    Let K be an Abstract Elementary Class. Under the assumptions that K has a nicely behaved forking-like notion, regular types and existence of some prime models we establish a decomposition theorem for such classes. The decomposition implies a main gap result for the class K. The setting is general enough to cover \aleph_0-stable first-order theories (proved by Shelah in 1982), Excellent Classes of atomic models of a first-order theory (proved by Grossberg and Hart 1987) and the class of submodels of a large sequentially homogeneous \aleph_0-stable model (which is new).

  7. Impact of Office Automation: An Empirical Assessment

    Science.gov (United States)

    1988-12-01

    NAVAL POSTGRADUATE SCHOOL, Monterey, California. Thesis: Impact of Office Automation: An Empirical Assessment. Keywords: Productivity Assessment; SACONS; Office Automation.

  8. GoSam-2.0. A tool for automated one-loop calculations within the Standard Model and beyond

    International Nuclear Information System (INIS)

    Cullen, Gavin; Deurzen, Hans van; Greiner, Nicolas

    2014-05-01

    We present the version 2.0 of the program package GoSam for the automated calculation of one-loop amplitudes. GoSam is devised to compute one-loop QCD and/or electroweak corrections to multi-particle processes within and beyond the Standard Model. The new code contains improvements in the generation and in the reduction of the amplitudes, performs better in computing time and numerical accuracy, and has an extended range of applicability. The extended version of the "Binoth-Les-Houches-Accord" interface to Monte Carlo programs is also implemented. We give a detailed description of installation and usage of the code, and illustrate the new features in dedicated examples.

  9. Laser performance operations model (LPOM): a computational system that automates the setup and performance analysis of the national ignition facility

    Energy Technology Data Exchange (ETDEWEB)

    Shaw, M; House, R; Williams, W; Haynam, C; White, R; Orth, C; Sacks, R [Lawrence Livermore National Laboratory, 7000 East Avenue, Livermore, CA, 94550 (United States)], E-mail: shaw7@llnl.gov

    2008-05-15

    The National Ignition Facility (NIF) is a stadium-sized facility containing a 192-beam, 1.8 MJ, 500-TW, 351-nm laser system together with a 10-m diameter target chamber with room for many target diagnostics. NIF will be the world's largest laser experimental system, providing a national center to study inertial confinement fusion and the physics of matter at extreme energy densities and pressures. A computational system, the Laser Performance Operations Model (LPOM), has been developed and deployed that automates the laser setup process and accurately predicts laser energetics. LPOM determines the settings of the injection laser system required to achieve the desired main laser output, provides equipment protection, determines the diagnostic setup, and supplies post-shot data analysis and reporting.

  10. Automated identification of stream-channel geomorphic features from high‑resolution digital elevation models in West Tennessee watersheds

    Science.gov (United States)

    Cartwright, Jennifer M.; Diehl, Timothy H.

    2017-01-17

    High-resolution digital elevation models (DEMs) derived from light detection and ranging (lidar) enable investigations of stream-channel geomorphology with much greater precision than previously possible. The U.S. Geological Survey has developed the DEM Geomorphology Toolbox, containing seven tools to automate the identification of sites of geomorphic instability that may represent sediment sources and sinks in stream-channel networks. These tools can be used to modify input DEMs on the basis of known locations of stormwater infrastructure, derive flow networks at user-specified resolutions, and identify possible sites of geomorphic instability including steep banks, abrupt changes in channel slope, or areas of rough terrain. Field verification of tool outputs identified several tool limitations but also demonstrated their overall usefulness in highlighting likely sediment sources and sinks within channel networks. In particular, spatial clusters of outputs from multiple tools can be used to prioritize field efforts to assess and restore eroding stream reaches.
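
The "steep banks" detection the toolbox abstract describes amounts to thresholding DEM-derived slope. A minimal sketch in Python with NumPy follows; the mini-DEM, cell size, and 45° threshold are assumptions for illustration, not values from the USGS toolbox.

```python
import numpy as np

# Hypothetical mini-DEM (elevations in meters); a sharp bank runs between
# columns 1 and 2.
dem = np.array([
    [10.0, 10.0, 5.0, 5.0],
    [10.0, 10.0, 5.0, 5.0],
    [10.0, 10.0, 5.0, 5.0],
])
cell_size = 1.0  # meters per cell

# Slope magnitude from finite differences, a common DEM-derived proxy
# for sites of geomorphic instability.
dz_dy, dz_dx = np.gradient(dem, cell_size)
slope_deg = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))

# Flag cells steeper than a chosen threshold as candidate steep banks.
steep = slope_deg > 45.0
print(np.argwhere(steep))
```

A production workflow (as in the toolbox) would also condition the DEM for stormwater infrastructure and restrict the search to a derived flow network before thresholding.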

  11. TOWARD AUTOMATED FAÇADE TEXTURE GENERATION FOR 3D PHOTOREALISTIC CITY MODELLING WITH SMARTPHONES OR TABLET PCS

    Directory of Open Access Journals (Sweden)

    S. Wang

    2012-07-01

    Full Text Available An automated model-image fitting algorithm is proposed in this paper for generating façade texture images from pictures taken by smartphones or tablet PCs. Façade texture generation requires tremendous labour and has thus been the bottleneck of 3D photo-realistic city modelling. With advanced developments in micro electro mechanical systems (MEMS), a camera, global positioning system (GPS), and gyroscope (G-sensors) can all be integrated into a smartphone or a tablet PC. These sensors bring the possibility of direct georeferencing for the pictures taken by smartphones or tablet PCs. Since the accuracy of these sensors cannot be compared to that of surveying instruments, the image position and orientation derived from them are not capable of supporting photogrammetric measurements. This paper adopted the least-squares model-image fitting (LSMIF) algorithm to iteratively improve the image's exterior orientation. The image position from GPS and the image orientation from the gyroscope are treated as initial values. By fitting the projection of the wireframe model to the extracted edge pixels in the image, the exterior orientation elements are solved when the optimal fit is achieved. With exact exterior orientation elements, the wireframe model of the building can be correctly projected onto the image and, therefore, the façade texture image can be extracted from the picture.
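
The core LSMIF idea (start from sensor-grade pose values, then least-squares-refine them until the projected model fits the image) can be sketched on a toy 2D analogue. Everything here is an assumption for illustration: a rigid 2D transform stands in for the full exterior orientation, and ideal "edge" points stand in for extracted edge pixels.

```python
import numpy as np
from scipy.optimize import least_squares

# Model "wireframe" corner points of a rectangular façade (arbitrary units).
model_pts = np.array([[0.0, 0.0], [4.0, 0.0], [4.0, 3.0], [0.0, 3.0]])

def project(pose, pts):
    """Toy 'projection': rotate by theta and translate by (tx, ty)."""
    theta, tx, ty = pose
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    return pts @ R.T + [tx, ty]

true_pose = np.array([0.20, 1.5, -0.7])
observed = project(true_pose, model_pts)      # stand-in for extracted edges

# Sensor-grade initial guess (GPS/gyro analogue): offset from the truth.
initial = true_pose + [0.05, 0.3, -0.2]
fit = least_squares(lambda p: (project(p, model_pts) - observed).ravel(),
                    x0=initial)
print(fit.x)
```

The real algorithm works with a 3D wireframe, a full camera model, and noisy edge pixels, but the structure (residual = projected model minus image evidence, refined from sensor initial values) is the same.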

  12. Norddesign 2012 - Book of Abstract

    DEFF Research Database (Denmark)

    has been organized in line with the original ideas. The topics mentioned in the call for abstracts were: Product Development: Integrated, Multidisciplinary, Product life oriented and Distributed. Multi-product Development. Innovation and Business Models. Engineering Design and Industrial Design....... Conceptualisation and Innovative thinking. Research approaches and topics: Human Behaviour and Cognition. Cooperation and Multidisciplinary Design. Staging and Management of Design. Communication in Design. Design education and teaching: Programmes and Syllabuses. New Courses. Integrated and Multi-disciplinary. We...

  13. Modeling groundwater/surface-water interactions in an Alpine valley (the Aosta Plain, NW Italy): the effect of groundwater abstraction on surface-water resources

    Science.gov (United States)

    Stefania, Gennaro A.; Rotiroti, Marco; Fumagalli, Letizia; Simonetto, Fulvio; Capodaglio, Pietro; Zanotti, Chiara; Bonomi, Tullia

    2018-02-01

    A groundwater flow model of the Alpine valley aquifer in the Aosta Plain (NW Italy) showed that well pumping can induce river streamflow depletions as a function of well location. Analysis of the water budget showed that ˜80% of the water pumped during 2 years by a selected well in the downstream area comes from the baseflow of the main river discharge. Alluvial aquifers hosted in Alpine valleys fall within a particular hydrogeological context where groundwater/surface-water relationships change from upstream to downstream as well as seasonally. A transient groundwater model using MODFLOW2005 and the Streamflow-Routing (SFR2) Package is here presented, aimed at investigating water exchanges between the main regional river (Dora Baltea River, a left-hand tributary of the Po River), its tributaries and the underlying shallow aquifer, which is affected by seasonal oscillations. The three-dimensional distribution of the hydraulic conductivity of the aquifer was obtained by means of a specific coding system within the database TANGRAM. Both head and flux targets were used to perform the model calibration using PEST. Results showed that the fluctuations of the water table play an important role in groundwater/surface-water interconnections. In upstream areas, groundwater is recharged by water leaking through the riverbed and the well abstraction component of the water budget changes as a function of the hydraulic conditions of the aquifer. In downstream areas, groundwater is drained by the river and most of the water pumped by wells comes from the base flow component of the river discharge.
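
Streamflow depletion by well pumping, which the MODFLOW/SFR2 study above quantifies numerically, also has a classical analytical approximation (the Glover-Balmer solution for an idealized aquifer). The sketch below uses it with assumed round-number aquifer properties, not the Aosta Plain calibration.

```python
import numpy as np
from scipy.special import erfc

def depletion_fraction(t_days, d=500.0, S=0.15, T=1500.0):
    """Glover-Balmer estimate of the fraction of the pumping rate supplied
    by stream capture after t_days of pumping, for an idealized homogeneous
    aquifer. d: well-stream distance (m), S: storativity (-),
    T: transmissivity (m2/day). Values here are illustrative assumptions."""
    return erfc(np.sqrt(d * d * S / (4.0 * T * t_days)))

# Depletion grows toward 1 as pumping continues, consistent with most
# pumped water eventually coming from river baseflow.
for t in (10, 100, 730):
    print(t, depletion_fraction(t))
```

Such closed-form screening estimates are useful sanity checks on a numerical model, though they cannot represent the upstream/downstream and seasonal changes in river-aquifer exchange that motivate the MODFLOW approach.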

  14. DEGRO 2017. Abstracts

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2017-06-15

    The volume includes abstracts of the Annual DEGRO Meeting 2017 covering lectures and poster sessions with the following issues: lymphoma, biology, physics, radioimmunotherapy, sarcomas and rare tumors, prostate carcinoma, lung tumors, benign lesions and new media, mamma carcinoma, gastrointestinal tumors, quality of life, care science and quality assurance, high-technology methods and palliative situation, head-and-neck tumors, brain tumors, central nervous system metastases, guidelines, radiation sensitivity, radiotherapy, radioimmunotherapy.

  15. The deleuzian abstract machines

    DEFF Research Database (Denmark)

    Werner Petersen, Erik

    2005-01-01

    in emphasis from the three syntheses to mappings and rhizomatic diagrams that cut across semiotics or "blow apart regimes of signs". The aim here is the absolute deterritorialization. Deleuze has shown how abstract machines operate in the philosophy of Foucault, the literature of Proust and Kafka..., and the painting of Bacon. We will finish our presentation by showing how these machines apply to architecture....

  16. SPR 2014. Abstracts

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2014-05-15

    The proceedings of the SPR 2014 meeting include abstracts on the following topics: Body imaging techniques: practical advice for clinic work; thoracic imaging: focus on the lungs; gastrointestinal imaging: focus on the pancreas and bowel; genitourinary imaging: focus on gonadal radiology; musculoskeletal imaging: focus on oncology; child abuse and non-child abuse: focus on radiography; impact of NMR and CT imaging on management of CHD; education and communication: art and practice in pediatric radiology.

  17. SPR 2014. Abstracts

    International Nuclear Information System (INIS)

    2014-01-01

    The proceedings of the SPR 2014 meeting include abstracts on the following topics: Body imaging techniques: practical advice for clinic work; thoracic imaging: focus on the lungs; gastrointestinal imaging: focus on the pancreas and bowel; genitourinary imaging: focus on gonadal radiology; musculoskeletal imaging: focus on oncology; child abuse and non-child abuse: focus on radiography; impact of NMR and CT imaging on management of CHD; education and communication: art and practice in pediatric radiology.

  18. WWNPQFT-2011 - Abstracts

    International Nuclear Information System (INIS)

    Bianchi, E.; Bender, C.; Culetu, H.; Fried, H.; Grossmann, A.; Hofmann, R.; Le Bellac, M.; Martinetti, P.; Muller, B.; Patras, F.; Raffaeli, B.; Vitting Andersen, J.

    2013-01-01

    The object of this workshop is to consolidate and publicize new efforts in non-perturbative field theories. This year the presentations deal with quantum gravity, non-commutative geometry, fat-tailed wave-functions, strongly coupled field theories, space-times two time-like dimensions, and multiplicative renormalization. A presentation is dedicated to the construction of a nucleon-nucleon potential from an analytical, non-perturbative gauge invariant QCD. This document gathers the abstracts of the presentations

  19. The Complexity of Abstract Machines

    Directory of Open Access Journals (Sweden)

    Beniamino Accattoli

    2017-01-01

    Full Text Available The lambda-calculus is a peculiar computational model whose definition does not come with a notion of machine. Unsurprisingly, implementations of the lambda-calculus have been studied for decades. Abstract machines are implementation schemas for fixed evaluation strategies that are a compromise between theory and practice: they are concrete enough to provide a notion of machine and abstract enough to avoid the many intricacies of actual implementations. There is an extensive literature about abstract machines for the lambda-calculus, and yet—quite mysteriously—the efficiency of these machines with respect to the strategy that they implement has almost never been studied. This paper provides an unusual introduction to abstract machines, based on the complexity of their overhead with respect to the length of the implemented strategies. It is conceived to be a tutorial, focusing on the case study of implementing the weak head (call-by-name) strategy, and yet it is an original re-elaboration of known results. Moreover, some of the observations contained here never appeared in print before.
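
The weak head call-by-name strategy discussed above is classically implemented by the Krivine abstract machine. A minimal Python sketch follows, using de Bruijn indices; the tuple encoding and step counter are implementation choices of this sketch, not the paper's formulation.

```python
# Terms in de Bruijn notation: ('var', n), ('lam', body), ('app', f, a).
# Machine state: current term, environment (list of closures), stack of
# argument closures. A closure pairs a term with its environment.

def krivine(term):
    env, stack = [], []
    steps = 0
    while True:
        steps += 1
        tag = term[0]
        if tag == 'app':                 # push the argument closure, go left
            stack.append((term[2], env))
            term = term[1]
        elif tag == 'lam' and stack:     # pop a closure into the environment
            env = [stack.pop()] + env
            term = term[1]
        elif tag == 'var':               # look up the closure and enter it
            term, env = env[term[1]]
        else:                            # lambda, empty stack: weak head NF
            return term, steps

# (\x. x) (\y. y)  evaluates to  \y. y
ident = ('lam', ('var', 0))
result, steps = krivine(('app', ident, ident))
print(result, steps)
```

Counting `steps` against the length of the implemented strategy is exactly the kind of overhead analysis the tutorial is about; note the machine diverges on non-normalizing terms such as Omega, as call-by-name prescribes.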

  20. Bounded Rationality of Generalized Abstract Fuzzy Economies

    Directory of Open Access Journals (Sweden)

    Lei Wang

    2014-01-01

    Full Text Available By using a nonlinear scalarization technique, the bounded rationality model M for generalized abstract fuzzy economies in finite continuous spaces is established. Furthermore, by using the model M, some new theorems for structural stability and robustness to (λ,ϵ)-equilibria of generalized abstract fuzzy economies are proved.

  1. Modelling and automation of the process of phosphate ion removal from waste waters

    Directory of Open Access Journals (Sweden)

    L. Lupa

    2008-03-01

    Full Text Available Phosphate removal from waste waters has become an environmental necessity, since these phosphates stimulate the growth of aquatic plants and planktons and contribute to the eutrophication process in general. The physicochemical methods of phosphate ion removal are the most effective and reliable. This paper presents studies on the removal of phosphate ions from waste waters of the fertiliser industry, using the method of co-precipitation with iron salts and with calcium hydroxide as the neutralizing agent. The optimal process conditions were established as those that allow achievement of a maximum degree of separation of the phosphate ions. The precipitate resulting from the co-precipitation process was analysed to determine its chemical composition and its thermal and structural stability, and to establish in which form the phosphate ions occur in the precipitate. Based on these considerations, the experimental data obtained in the process of phosphate ion removal from waste waters were analysed mathematically, and equations were formulated for the dependence of the degree of phosphate separation and of the residual concentration on the main parameters of the process. In this paper an automated scheme for phosphate ion removal from waste waters by co-precipitation is presented.
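
Fitting an empirical equation for separation degree versus a process parameter, as the abstract describes, can be sketched as a simple least-squares regression. The dose/separation numbers below are invented for illustration, not the paper's measurements, and the quadratic form is just one common empirical choice.

```python
import numpy as np

# Hypothetical data: coagulant dose (mol Fe per mol P) vs. achieved
# phosphate separation degree (%).
dose = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0])
separation = np.array([41.0, 63.0, 78.0, 88.0, 94.0, 97.0])

# Fit an empirical second-degree polynomial separation = f(dose).
coeffs = np.polyfit(dose, separation, deg=2)
predicted = np.polyval(coeffs, dose)

# Coefficient of determination as a goodness-of-fit check.
r2 = 1 - np.sum((separation - predicted) ** 2) \
       / np.sum((separation - separation.mean()) ** 2)
print(coeffs, r2)
```

The same pattern extends to several parameters at once (e.g. dose, pH, temperature) by replacing `polyfit` with a multivariate linear least-squares fit, yielding the kind of dependence equations the paper formulates.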

  2. MODEL PERPINDAHAN PANAS DAN MASSA SELAMA PENGGORENGAN BUAH PADA KEADAAN VAKUM Model of Heat and Mass Transfer during Vacuum Fruit Frying Abstract

    Directory of Open Access Journals (Sweden)

    Jamalludin Jamalludin

    2012-05-01

    Full Text Available Recently, vacuum-fried flaky products have become popular with consumers, as they have specific characteristics: good taste, crispness, and crunchiness. During the vacuum frying process, heat and mass transfer occur simultaneously. Heat is transferred from the hot frying oil to the fruit, and water in the fruit comes out; at the same time, the fruit absorbs oil. The objective of this research is to develop a mathematical model of simultaneous heat and mass transfer during vacuum fruit frying. The sample of the research is jackfruit, vacuum fried at temperatures of 70-100 oC, frying durations of 15-60 minutes, and pressures of 13-23 kPa. The model includes changes of water, oil, extract, sucrose, reducing glucose, and β-carotene content in the product. The developed model is based on first-order simultaneous ordinary differential equations solved by the Runge-Kutta numerical method. Simulation results for the increasing temperature, decreasing water content, and oil absorption during the vacuum frying process showed that the developed mathematical model explains well the simultaneous heat and mass transfer phenomena during vacuum fruit frying.   ABSTRACT (translated from Indonesian): Fruit chips produced by vacuum frying are now popularly consumed, because they have distinctive properties and are tasty, savoury, and crunchy to eat. During vacuum frying, heat and mass transfer take place simultaneously. Heat moves from the hot oil to the surface and then propagates into the fruit, the water content of the fruit moves out to the surface, and at the same time the fruit absorbs oil. This research aims to develop a mathematical model of simultaneous heat and mass transfer in vacuum fruit frying. The research sample is jackfruit fried at 70-100 oC, with frying times of 15-60 minutes and vacuum pressures of 13-23 kPa. The model covers changes in water content, oil content
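
A model of the kind described (coupled first-order ODEs for moisture loss and oil uptake, integrated with a Runge-Kutta method) can be sketched in a few lines. The rate constants and equilibrium values below are assumed round numbers, not the paper's fitted parameters, and the two-equation system is a deliberate simplification of the full six-component model.

```python
from scipy.integrate import solve_ivp

# Assumed illustrative parameters (per minute; dry-basis fractions).
k_w, k_o = 0.12, 0.05        # drying and oil-uptake rate constants
X_eq, O_max = 0.03, 0.25     # equilibrium moisture, maximum oil content

def rhs(t, y):
    X, O = y                  # moisture content, oil content
    dX = -k_w * (X - X_eq)    # drying driven by distance from equilibrium
    dO = k_o * (O_max - O)    # oil absorption saturating at O_max
    return [dX, dO]

# Integrate over a 60-minute fry with Runge-Kutta (solve_ivp's default RK45).
sol = solve_ivp(rhs, (0, 60), [0.80, 0.0])
X60, O60 = sol.y[:, -1]
print(X60, O60)
```

The full model would add similar equations for extract, sucrose, reducing glucose, and β-carotene, with temperature- and pressure-dependent rate constants.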

  3. Energy Modelling and Automated Calibrations of Ancient Building Simulations: A Case Study of a School in the Northwest of Spain

    Directory of Open Access Journals (Sweden)

    Ana Ogando

    2017-06-01

    Full Text Available In the present paper, the energy performance of the buildings forming a school centre in the northwest of Spain was analyzed using a transient simulation of the energy model of the school, developed with TRNSYS, software of proven reliability in the field of thermal simulations. A deterministic calibration approach was applied to the initial building model to adjust its predictions to the actual performance of the school, using data acquired during a temperature measurement campaign. The buildings under study were in a deteriorated condition due to poor maintenance over the years, making reliable modelling and simulation a considerable challenge. The results showed that the proposed methodology succeeds in producing calibrated thermal models of these types of damaged buildings, as the metrics employed to verify the final error showed a reduced normalized mean bias error (NMBE) of 2.73%. It was verified that a decrease of approximately 60% in NMBE and 17% in the coefficient of variation of the root mean square error (CV(RMSE)) was achieved due to the calibration process. Subsequent steps were performed with the aid of new software, developed under a European project, which enabled the automated calibration of the simulations.
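
The two calibration metrics quoted above have standard definitions, sketched below in Python. The temperature series are invented illustrative data, and note that sign conventions for NMBE vary between guidelines (simulated minus measured is used here).

```python
import numpy as np

# Illustrative measured vs. simulated indoor temperatures (°C).
measured  = np.array([19.8, 20.4, 21.1, 22.0, 21.5, 20.7])
simulated = np.array([20.1, 20.2, 21.4, 21.8, 21.9, 20.5])

def nmbe(meas, sim):
    """Normalized mean bias error, in percent of the measured mean."""
    return 100.0 * np.sum(sim - meas) / (len(meas) * np.mean(meas))

def cv_rmse(meas, sim):
    """Coefficient of variation of the RMSE, in percent."""
    rmse = np.sqrt(np.mean((sim - meas) ** 2))
    return 100.0 * rmse / np.mean(meas)

print(nmbe(measured, simulated), cv_rmse(measured, simulated))
```

A calibration loop then adjusts uncertain model inputs (infiltration, envelope conductance, internal gains) until both metrics fall below the tolerances the chosen guideline prescribes.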

  4. Semi-automated curation of metabolic models via flux balance analysis: a case study with Mycoplasma gallisepticum.

    Directory of Open Access Journals (Sweden)

    Eddy J Bautista

    Full Text Available Primarily used for metabolic engineering and synthetic biology, genome-scale metabolic modeling shows tremendous potential as a tool for fundamental research and curation of metabolism. Through a novel integration of flux balance analysis and genetic algorithms, a strategy to curate metabolic networks and facilitate identification of metabolic pathways that may not be directly inferable solely from genome annotation was developed. Specifically, metabolites involved in unknown reactions can be determined, and potentially erroneous pathways can be identified. The procedure developed allows for new fundamental insight into metabolism, as well as acting as a semi-automated curation methodology for genome-scale metabolic modeling. To validate the methodology, a genome-scale metabolic model for the bacterium Mycoplasma gallisepticum was created. Several reactions not predicted by the genome annotation were postulated and validated via the literature. The model predicted an average growth rate of 0.358±0.12[Formula: see text], closely matching the experimentally determined growth rate of M. gallisepticum of 0.244±0.03[Formula: see text]. This work presents a powerful algorithm for facilitating the identification and curation of previously known and new metabolic pathways, as well as presenting the first genome-scale reconstruction of M. gallisepticum.
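    Flux balance analysis, the core of the curation loop described above, reduces to a linear program: maximize a biomass flux subject to steady-state stoichiometry S·v = 0 and flux bounds. A toy three-reaction sketch follows; the network, bounds, and use of SciPy are assumptions for illustration and are far from genome scale:

    ```python
    # Minimal flux balance analysis on a toy network:
    #   R1: uptake -> A,  R2: A -> B,  R3: B -> biomass.
    # A genome-scale model would have thousands of reactions.
    import numpy as np
    from scipy.optimize import linprog

    # Stoichiometric matrix S: rows are metabolites (A, B), columns R1..R3.
    S = np.array([
        [1.0, -1.0, 0.0],    # A: produced by uptake, consumed by A -> B
        [0.0, 1.0, -1.0],    # B: produced by A -> B, consumed by biomass
    ])
    c = np.array([0.0, 0.0, -1.0])             # maximize v3 (linprog minimizes)
    bounds = [(0, 10), (0, None), (0, None)]   # uptake capped at 10 units

    res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
    biomass_flux = res.x[2]
    ```

    At the optimum, the whole chain carries the uptake-limited flux, which is the kind of constraint a curation procedure can exploit: a reaction whose maximal flux is forced to zero signals a gap or an erroneous pathway in the reconstruction.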

  5. Automated Text Data Mining Analysis of Five Decades of Educational Leadership Research Literature: Probabilistic Topic Modeling of "EAQ" Articles From 1965 to 2014

    Science.gov (United States)

    Wang, Yinying; Bowers, Alex J.; Fikis, David J.

    2017-01-01

    Purpose: The purpose of this study is to describe the underlying topics and the topic evolution in the 50-year history of educational leadership research literature. Method: We used automated text data mining with probabilistic latent topic models to examine the full text of the entire publication history of all 1,539 articles published in…

  6. Dockomatic - automated ligand creation and docking

    Directory of Open Access Journals (Sweden)

    Hampikian Greg

    2010-11-01

    Full Text Available Abstract Background The application of computational modeling to rationally design drugs and characterize macromolecular receptors has proven increasingly useful due to the accessibility of computing clusters and clouds. AutoDock is a well-known and powerful software program used to model ligand-to-receptor binding interactions. In its current version, AutoDock requires significant amounts of user time to set up and run jobs and to collect results. This paper presents DockoMatic, a user-friendly Graphical User Interface (GUI) application that eases and automates the creation and management of AutoDock jobs for high-throughput screening of ligand-to-receptor interactions. Results DockoMatic allows the user to invoke and manage AutoDock jobs on a single computer or cluster, including jobs for evaluating secondary ligand interactions. It also automates the process of collecting, summarizing, and viewing results. In addition, DockoMatic automates the creation of peptide ligand .pdb files from strings of single-letter amino acid abbreviations. Conclusions DockoMatic significantly reduces the complexity of managing multiple AutoDock jobs by facilitating ligand and AutoDock job creation and management.
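    The peptide-ligand step described above starts from one-letter amino acid strings; the first stage of any such pipeline is expanding the codes to residue names. The mapping table is the standard IUPAC one, but the helper itself is a hypothetical sketch, not DockoMatic's code:

    ```python
    # Hypothetical first step of one-letter-string -> .pdb generation:
    # expand single-letter amino acid codes to three-letter residue names.
    THREE_LETTER = {
        "A": "ALA", "R": "ARG", "N": "ASN", "D": "ASP", "C": "CYS",
        "Q": "GLN", "E": "GLU", "G": "GLY", "H": "HIS", "I": "ILE",
        "L": "LEU", "K": "LYS", "M": "MET", "F": "PHE", "P": "PRO",
        "S": "SER", "T": "THR", "W": "TRP", "Y": "TYR", "V": "VAL",
    }

    def expand_peptide(sequence):
        """Expand e.g. 'GRGDS' to ['GLY', 'ARG', 'GLY', 'ASP', 'SER']."""
        try:
            return [THREE_LETTER[aa] for aa in sequence.upper()]
        except KeyError as err:
            raise ValueError(f"unknown amino acid code: {err}") from None
    ```

    A real generator would then emit ATOM records with backbone coordinates for each residue; that geometric step is where tools like DockoMatic add the actual value.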

  7. Automated detection of healthcare associated infections: external validation and updating of a model for surveillance of drain-related meningitis.

    Directory of Open Access Journals (Sweden)

    Maaike S M van Mourik

    Full Text Available OBJECTIVE: Automated surveillance of healthcare-associated infections can improve the efficiency and reliability of surveillance. The aim was to validate and update a previously developed multivariable prediction model for the detection of drain-related meningitis (DRM). DESIGN: Retrospective cohort study using traditional surveillance by infection control professionals as the reference standard. PATIENTS: Patients receiving an external cerebrospinal fluid drain, either ventricular (EVD) or lumbar (ELD), in a tertiary medical care center. Children, patients with simultaneous drains, <1 day of follow-up, or pre-existing meningitis were excluded, leaving 105 patients in the validation set (2010-2011) and 653 in the updating set (2004-2011). METHODS: For validation, the original model was applied. Discrimination, classification and calibration were assessed. For updating, data from all available years were used to optimally re-estimate coefficients and determine whether extension with new predictors was necessary. The updated model was validated and adjusted for optimism (overfitting) using bootstrapping techniques. RESULTS: In model validation, the rate of DRM was 17.4/1000 days at risk. All cases were detected by the model. The area under the ROC curve was 0.951. The positive predictive value was 58.8% (95% CI 40.7-75.4) and calibration was good. The revised model also includes Gram stain results. The area under the ROC curve after correction for optimism was 0.963 (95% CI 0.953-0.974). Group-level prediction was adequate. CONCLUSIONS: The previously developed multivariable prediction model maintains discriminatory power and calibration in an independent patient population. The updated model incorporates all available data and performs well, also after elaborate adjustment for optimism.
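    The "adjustment for optimism" mentioned above is commonly done with Harrell-style bootstrapping: refit the model on bootstrap resamples, average the drop from resample AUC to original-data AUC, and subtract that from the apparent AUC. A sketch with a deliberately trivial univariate "model" and synthetic data (both are assumptions, not the study's model):

    ```python
    # Bootstrap optimism correction for AUC, Harrell style, on a toy model.
    import random

    def auc(scores, labels):
        """Rank-based AUC: probability a positive outranks a negative."""
        pos = [s for s, y in zip(scores, labels) if y == 1]
        neg = [s for s, y in zip(scores, labels) if y == 0]
        wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
        return wins / (len(pos) * len(neg))

    def fit(xs, ys):
        """'Fit' the toy model: score orientation from class means."""
        mean_pos = sum(x for x, y in zip(xs, ys) if y == 1) / sum(ys)
        mean_neg = sum(x for x, y in zip(xs, ys) if y == 0) / (len(ys) - sum(ys))
        return 1.0 if mean_pos >= mean_neg else -1.0

    def optimism_corrected_auc(xs, ys, n_boot=200, rng=random.Random(42)):
        direction = fit(xs, ys)
        apparent = auc([direction * x for x in xs], ys)
        optimism = 0.0
        for _ in range(n_boot):
            idx = [rng.randrange(len(xs)) for _ in xs]   # resample w/ replacement
            bx, by = [xs[i] for i in idx], [ys[i] for i in idx]
            if 0 < sum(by) < len(by):                    # need both classes;
                d = fit(bx, by)                          # skipped draws add zero
                boot_auc = auc([d * x for x in bx], by)  # AUC on resample
                test_auc = auc([d * x for x in xs], ys)  # AUC back on originals
                optimism += (boot_auc - test_auc) / n_boot
        return apparent - optimism
    ```

    The corrected value reported in the abstract (0.963) is exactly this kind of quantity: apparent performance minus the average resampling optimism.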

  8. Towards an Ontology for the Global Geodynamics Project: Automated Extraction of Resource Descriptions from an XML-Based Data Model

    Science.gov (United States)

    Lumb, L. I.; Aldridge, K. D.

    2005-12-01

    Using the Earth Science Markup Language (ESML), an XML-based data model for the Global Geodynamics Project (GGP) was recently introduced [Lumb & Aldridge, Proc. HPCS 2005, Kotsireas & Stacey, eds., IEEE, 2005, 216-222]. This data model possesses several key attributes -i.e., it: makes use of XML schema; supports semi-structured ASCII format files; includes Earth Science affinities; and is on track for compliance with emerging Grid computing standards (e.g., the Global Grid Forum's Data Format Description Language, DFDL). Favorable attributes notwithstanding, metadata (i.e., data about data) was identified [Lumb & Aldridge, 2005] as a key challenge for progress in enabling the GGP for Grid computing. Even in projects of small-to-medium scale like the GGP, the manual introduction of metadata has the potential to be the rate-determining metric for progress. Fortunately, an automated approach for metadata introduction has recently emerged. Based on Gleaning Resource Descriptions from Dialects of Languages (GRDDL, http://www.w3.org/2004/01/rdxh/spec), this bottom-up approach allows for the extraction of Resource Description Format (RDF) representations from the XML-based data model (i.e., the ESML representation of GGP data) subject to rules of transformation articulated via eXtensible Stylesheet Language Transformations (XSLT). In addition to introducing relationships into the GGP data model, and thereby addressing the metadata requirement, the syntax and semantics of RDF comprise a requisite for a GGP ontology - i.e., ``the common words and concepts (the meaning) used to describe and represent an area of knowledge'' [Daconta et al., The Semantic Web, Wiley, 2003]. After briefly reviewing the XML-based model for the GGP, attention focuses on the automated extraction of an RDF representation via GRDDL with XSLT-delineated templates. This bottom-up approach, in tandem with a top-down approach based on the Protege integrated development environment for ontologies (http

  9. Automata Learning through Counterexample Guided Abstraction Refinement

    DEFF Research Database (Denmark)

    Aarts, Fides; Heidarian, Faranak; Kuppens, Harco

    2012-01-01

    Abstraction is the key when learning behavioral models of realistic systems. Hence, in most practical applications where automata learning is used to construct models of software components, researchers manually define abstractions which, depending on the history, map a large set of concrete events...... to a small set of abstract events that can be handled by automata learning tools. In this article, we show how such abstractions can be constructed fully automatically for a restricted class of extended finite state machines in which one can test for equality of data parameters, but no operations on data...
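    The history-dependent mapping from concrete to abstract events can be illustrated with a minimal mapper that only tests equality of data parameters, in the spirit of the restricted class the article treats; all names here are hypothetical:

    ```python
    # Minimal sketch of a history-dependent abstraction for automata learning:
    # concrete events carry data parameters, and the mapper reduces each
    # parameter to "fresh" vs. "equal to the i-th value seen so far".
    class EqualityAbstraction:
        def __init__(self):
            self.seen = []            # history of concrete parameter values

        def abstract(self, action, param):
            """Map one concrete event to a small abstract alphabet."""
            if param in self.seen:
                return f"{action}(eq[{self.seen.index(param)}])"
            self.seen.append(param)
            return f"{action}(fresh)"

    mapper = EqualityAbstraction()
    trace = [("login", 1001), ("query", 1001), ("query", 2002), ("logout", 1001)]
    abstract_trace = [mapper.abstract(a, p) for a, p in trace]
    ```

    The unbounded space of concrete parameter values collapses to a few abstract symbols, which is what makes the trace digestible for an automata learning tool.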

  10. Comparison of different inspiratory triggering settings in automated ventilators during cardiopulmonary resuscitation in a porcine model.

    Science.gov (United States)

    Tan, Dingyu; Xu, Jun; Shao, Shihuan; Fu, Yangyang; Sun, Feng; Zhang, Yazhi; Hu, Yingying; Walline, Joseph; Zhu, Huadong; Yu, Xuezhong

    2017-01-01

    Mechanical ventilation via automated in-hospital ventilators is quite common during cardiopulmonary resuscitation. It is not known whether different inspiratory triggering sensitivity settings of ordinary ventilators have different effects on actual ventilation, gas exchange and hemodynamics during resuscitation. The 18 pigs enrolled in this study were anaesthetized and intubated. Continuous chest compressions and mechanical ventilation (volume-controlled mode, 100% O2, respiratory rate 10/min, and tidal volume 10 ml/kg) were performed after 3 minutes of ventricular fibrillation. Groups trig-4, trig-10 and trig-20 (six pigs each) were characterized by triggering sensitivities of 4, 10 and 20 (cmH2O for pressure-triggering and L/min for flow-triggering), respectively. Additionally, each pig in each group was mechanically ventilated using three types of inspiratory triggering (pressure-triggering, flow-triggering and turned-off triggering) of 5 minutes' duration each, and each animal was matched with one of six random orderings of the three different triggering settings. Blood gas samples and respiratory and hemodynamic parameters for each period were collected and analyzed. In each group, significantly lower actual respiratory rate, minute ventilation volume, mean airway pressure, arterial pH and PaO2, and higher end-tidal carbon dioxide, aortic blood pressure, coronary perfusion pressure, PaCO2 and venous oxygen saturation were observed in the ventilation periods with a turned-off triggering setting compared to those with pressure- or flow-triggering (all P…). Ventilation with pressure- or flow-triggering tends to induce hyperventilation and deteriorating gas exchange and hemodynamics during CPR. A turned-off patient triggering or a pressure-triggering of 20 cmH2O is preferred for ventilation when an ordinary inpatient hospital ventilator is used during resuscitation.

  11. APPLICATION OF RANKING BASED ATTRIBUTE SELECTION FILTERS TO PERFORM AUTOMATED EVALUATION OF DESCRIPTIVE ANSWERS THROUGH SEQUENTIAL MINIMAL OPTIMIZATION MODELS

    Directory of Open Access Journals (Sweden)

    C. Sunil Kumar

    2014-10-01

    Full Text Available In this paper, we study the performance of various models for automated evaluation of descriptive answers, using rank-based feature selection filters for dimensionality reduction. We quantitatively identify the best of five rank-based feature selection techniques, namely the Chi-squared filter, Information gain filter, Gain ratio filter, Relief filter and Symmetrical uncertainty filter. We use Sequential Minimal Optimization with a polynomial kernel to build models and evaluate them on parameters such as accuracy, time to build the model, Kappa, mean absolute error and root mean squared error. Except with the Relief filter, the accuracies obtained with all filtered models are at least 4% better than those obtained with no filter applied. The accuracies recorded are the same across the Chi-squared, Information gain, Gain ratio and Symmetrical uncertainty filters; therefore accuracy alone does not determine the best filter. The time taken to build models, Kappa, mean absolute error and root mean squared error played a major role in determining the effectiveness of the filters. The overall rank aggregation metric of the Symmetrical uncertainty filter is 45, better by 1 rank than that of the Information gain filter, its nearest contender, and better by 3, 6 and 112 ranks, respectively, than those of the Chi-squared, Gain ratio and Relief filters. Through these quantitative measurements, we conclude that Symmetrical uncertainty attribute evaluation is the overall best-performing rank-based feature selection algorithm for automated evaluation of descriptive answers.
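    A rank-based filter of the kind compared above scores each attribute independently and sorts; for instance, information gain is H(class) − H(class | feature). A sketch on a made-up toy dataset (two discrete features, one perfectly predictive):

    ```python
    # Rank-based feature selection by information gain, on invented data.
    import math
    from collections import Counter

    def entropy(labels):
        n = len(labels)
        return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

    def information_gain(feature_values, labels):
        """IG = H(class) - H(class | feature), for a discrete feature."""
        n = len(labels)
        cond = 0.0
        for value in set(feature_values):
            subset = [l for f, l in zip(feature_values, labels) if f == value]
            cond += len(subset) / n * entropy(subset)
        return entropy(labels) - cond

    def rank_features(rows, labels):
        """Return feature indices sorted by descending information gain."""
        n_features = len(rows[0])
        gains = [information_gain([r[i] for r in rows], labels)
                 for i in range(n_features)]
        return sorted(range(n_features), key=lambda i: -gains[i]), gains

    # Toy answers: feature 0 (answer length) predicts the grade, feature 1 doesn't.
    rows = [("long", "yes"), ("long", "no"), ("short", "yes"), ("short", "no")]
    labels = ["good", "good", "poor", "poor"]
    order, gains = rank_features(rows, labels)
    ```

    Chi-squared, gain ratio and symmetrical uncertainty filters have the same structure; only the per-feature score changes, which is why the paper's comparison hinges on secondary metrics once accuracies tie.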

  12. Program and abstracts

    International Nuclear Information System (INIS)

    1978-01-01

    This volume contains the program and abstracts of the conference. The following topics are included: metal vapor molecular lasers, magnetohydrodynamics, rare gas halide and nuclear pumped lasers, transfer mechanisms in arcs, kinetic processes in rare gas halide lasers, arcs and flows, XeF kinetics and lasers, fundamental processes in excimer lasers, electrode effects and vacuum arcs, electron and ion transport, ion interactions and mobilities, glow discharges, diagnostics and afterglows, dissociative recombination, electron ionization and excitation, rare gas excimers and group VI lasers, breakdown, novel laser pumping techniques, electrode-related discharge phenomena, photon interactions, attachment, plasma chemistry and infrared lasers, electron scattering, and reactions of excited species

  13. ESPR 2015. Abstracts

    International Nuclear Information System (INIS)

    2015-01-01

    The volume includes the abstracts of the ESPR 2015 covering the following topics: PCG (post graduate courses): Radiography; fluoroscopy and general issues; nuclear medicine, interventional radiology and hybrid imaging, pediatric CT, pediatric ultrasound; MRI in childhood. Scientific sessions and task force sessions: International aspects; neuroradiology, neonatal imaging, engineering techniques to simulate injury in child abuse, CT - dose and quality, challenges in the chest, cardiovascular and chest, musculoskeletal, oncology, pediatric uroradiology and abdominal imaging, fetal and postmortem imaging, education and global challenges, neuroradiology - head and neck, gastrointestinal and genitourinary.

  14. ESPR 2015. Abstracts

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2015-05-10

    The volume includes the abstracts of the ESPR 2015 covering the following topics: PCG (post graduate courses): Radiography; fluoroscopy and general issues; nuclear medicine, interventional radiology and hybrid imaging, pediatric CT, pediatric ultrasound; MRI in childhood. Scientific sessions and task force sessions: International aspects; neuroradiology, neonatal imaging, engineering techniques to simulate injury in child abuse, CT - dose and quality, challenges in the chest, cardiovascular and chest, musculoskeletal, oncology, pediatric uroradiology and abdominal imaging, fetal and postmortem imaging, education and global challenges, neuroradiology - head and neck, gastrointestinal and genitourinary.

  15. IPR 2016. Abstracts

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2016-05-15

    The volume on the meeting of pediatric radiology includes abstracts on the following issues: chest, cardiovascular system, neuroradiology, CT radiation DRLs (diagnostic reference levels) and dose reporting guidelines, genitourinary imaging, gastrointestinal radiology, oncology and nuclear medicine, whole body imaging, fetal/neonatal imaging, child abuse, oncology and hybrid imaging, value added imaging, musculoskeletal imaging, dose and radiation safety, imaging children - immobilization and distraction techniques, information - education - QI and healthcare policy, ALARA, the knowledge, skills and competences for a technologist/radiographer in pediatric radiology, full exploitation of new technological features in pediatric CT, image quality issues in pediatrics, abdominal imaging, interventional radiology, MR contrast agents, tumor - mass imaging, cardiothoracic imaging, ultrasonography.

  16. SPR 2017. Abstracts

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2017-05-15

    The conference proceedings SPR 2017 include abstracts on the following issues: gastrointestinal radiography - inflammatory bowel diseases, cardiovascular CTA, general musculoskeletal radiology, musculoskeletal congenital development diseases, general pediatric radiology - chest, musculoskeletal imaging - marrow and infectious disorders, state-of-the-art body MR imaging, practical pediatric sonography, quality and professionalism, CT imaging in congenital heart diseases, radiographic courses, body MR techniques, contrast enhanced ultrasound, machine learning, forensic imaging, the radiation dose conundrum - reconciling imaging, imagining and managing, the practice of radiology, interventional radiology, neuroradiology, PET/MR.

  17. Beyond the abstractions?

    DEFF Research Database (Denmark)

    Olesen, Henning Salling

    2006-01-01

    The anniversary of the International Journal of Lifelong Education takes place in the middle of a conceptual landslide from lifelong education to lifelong learning. Contemporary discourses of lifelong learning etc. are, however, abstractions behind which new functions and agendas for adult education...... are set. The ideological discourse of recent policies seems to neglect the fact that history and resources for lifelong learning are different across Europe, and also neglects the multiplicity of adult learners. Instead of refusing the new agendas, however, adult education research should try to dissolve...... learning. Adult education research must fulfil its potential conversion from normative philosophy to critical and empirical social science....

  18. Parameterized Dataflow (Extended Abstract

    Directory of Open Access Journals (Sweden)

    Dominic Duggan

    2016-10-01

    Full Text Available Dataflow networks have application in various forms of stream processing, for example for parallel processing of multimedia data. The description of dataflow graphs, including their firing behavior, is typically non-compositional and not amenable to separate compilation. This article considers a dataflow language with a type and effect system that captures the firing behavior of actors. This system allows definitions to abstract over actor firing rates, supporting the definition and safe composition of actor definitions where firing rates are not instantiated until a dataflow graph is launched.

  19. An Outcrop-based Detailed Geological Model to Test Automated Interpretation of Seismic Inversion Results

    NARCIS (Netherlands)

    Feng, R.; Sharma, S.; Luthi, S.M.; Gisolf, A.

    2015-01-01

    Previously, Tetyukhina et al. (2014) developed a geological and petrophysical model based on the Book Cliffs outcrops that contained eight lithotypes. For reservoir modelling purposes, this model is judged to be too coarse because in the same lithotype it contains reservoir and non-reservoir

  20. BATMAN--an R package for the automated quantification of metabolites from nuclear magnetic resonance spectra using a Bayesian model.

    Science.gov (United States)

    Hao, Jie; Astle, William; De Iorio, Maria; Ebbels, Timothy M D

    2012-08-01

    Nuclear Magnetic Resonance (NMR) spectra are widely used in metabolomics to obtain metabolite profiles in complex biological mixtures. Common methods used to assign and estimate concentrations of metabolites involve either expert manual peak fitting or extra pre-processing steps, such as peak alignment and binning. Peak fitting is very time consuming and subject to human error. Conversely, alignment and binning can introduce artefacts and limit immediate biological interpretation of models. We present the Bayesian automated metabolite analyser for NMR spectra (BATMAN), an R package that deconvolutes peaks from one-dimensional NMR spectra, automatically assigns them to specific metabolites from a target list and obtains concentration estimates. The Bayesian model incorporates information on characteristic peak patterns of metabolites and is able to account for shifts in the position of peaks commonly seen in NMR spectra of biological samples. It applies a Markov chain Monte Carlo algorithm to sample from a joint posterior distribution of the model parameters and obtains concentration estimates with reduced error compared with conventional numerical integration, comparable to manual deconvolution by experienced spectroscopists. http://www1.imperial.ac.uk/medicine/people/t.ebbels/ t.ebbels@imperial.ac.uk.
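    The MCMC step described above can be illustrated with a toy Metropolis sampler estimating a single peak amplitude from a synthetic spectrum; the peak shape, noise level, flat prior, and all constants are simplifications for illustration, not BATMAN's actual model:

    ```python
    # Toy Metropolis sampler in the spirit of BATMAN: infer one peak amplitude
    # A from a noisy synthetic spectrum with a known peak shape.
    import math, random

    random.seed(7)
    CENTER, WIDTH, SIGMA, TRUE_A = 3.0, 0.1, 0.05, 2.0
    xs = [2.5 + i * 0.01 for i in range(101)]                    # ppm axis
    shape = [math.exp(-(x - CENTER) ** 2 / (2 * WIDTH ** 2)) for x in xs]
    data = [TRUE_A * s + random.gauss(0, SIGMA) for s in shape]  # synthetic data

    def log_post(a):
        """Log posterior: Gaussian likelihood, flat prior on a >= 0."""
        if a < 0:
            return -math.inf
        return -sum((d - a * s) ** 2 for d, s in zip(data, shape)) / (2 * SIGMA ** 2)

    def metropolis(n_iter=5000, step=0.05, burn_in=1000):
        a = 1.0                                  # deliberately poor start
        lp = log_post(a)
        samples = []
        for _ in range(n_iter):
            prop = a + random.gauss(0, step)     # symmetric random-walk proposal
            lp_prop = log_post(prop)
            if random.random() < math.exp(min(0.0, lp_prop - lp)):
                a, lp = prop, lp_prop            # accept
            samples.append(a)
        return samples[burn_in:]

    samples = metropolis()
    posterior_mean = sum(samples) / len(samples)
    ```

    BATMAN's real joint posterior additionally covers peak positions (to absorb the chemical-shift variation the abstract mentions), multiplet patterns and many metabolites at once, but the accept/reject core is the same.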

  1. Abstract Cauchy problems three approaches

    CERN Document Server

    Melnikova, Irina V

    2001-01-01

    Although the theory of well-posed Cauchy problems is reasonably well understood, ill-posed problems, involved in numerous mathematical models in physics, engineering, and finance, can be approached in a variety of ways. Historically, there have been three major strategies for dealing with such problems: semigroup, abstract distribution, and regularization methods. Semigroup and distribution methods restore well-posedness in a modern weak sense. Regularization methods provide approximate solutions to ill-posed problems. Although these approaches were extensively developed over the last decades by many researchers, nowhere could one find a comprehensive treatment of all three approaches. Abstract Cauchy Problems: Three Approaches provides an innovative, self-contained account of these methods and, furthermore, demonstrates and studies some of the profound connections between them. The authors discuss the application of different methods not only to the Cauchy problem that is not well-posed in the classical sense, b...

  2. Early Validation of Automation Plant Control Software using Simulation Based on Assumption Modeling and Validation Use Cases

    Directory of Open Access Journals (Sweden)

    Veronika Brandstetter

    2015-10-01

    Full Text Available In automation plants, technical processes must be conducted in a way that products, substances, or services are produced reliably, with sufficient quality, and with minimal strain on resources. A key driver in conducting these processes is the automation plant's control software, which controls the technical plant components and thereby affects the physical, chemical, and mechanical processes that take place in the plant. To this end, the control software of an automation plant must adhere to strict process requirements arising from the technical processes and from the physical plant design. Currently, validation of the control software often starts late in the engineering process, in many cases once the automation plant is almost completely constructed. However, as widely acknowledged, the later the control software of the automation plant is validated, the higher the effort for correcting revealed defects, which can lead to serious budget overruns and project delays. In this article we propose an approach that allows the early validation of automation control software against the technical plant processes and assumptions about the physical plant design by means of simulation. We demonstrate the application of our approach using the example of an actual plant project from the automation industry and present its technical implementation.
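    The early-validation idea, running the control logic against an assumed model of the physical plant and checking a process requirement, can be sketched as follows; the tank model, flow rates, and thresholds are all hypothetical:

    ```python
    # Sketch of early control-software validation by simulation: exercise the
    # control logic against a simplified plant model and check a requirement.
    def controller(level, pump_on, low=20.0, high=80.0):
        """Hysteresis control: start the pump when low, stop when high."""
        if level <= low:
            return True
        if level >= high:
            return False
        return pump_on

    def simulate_plant(steps=1000, inflow=1.2, outflow=0.7, level=50.0):
        """Assumed plant model: pump adds inflow per step, drain removes outflow."""
        pump_on, trace = False, []
        for _ in range(steps):
            pump_on = controller(level, pump_on)
            level = level + (inflow if pump_on else 0.0) - outflow
            trace.append(level)
        return trace

    trace = simulate_plant()
    # Validation use case: the level must stay inside the vessel's safe range.
    assert all(0.0 <= lvl <= 100.0 for lvl in trace)
    ```

    Finding a violated assertion here, before construction, is exactly the cost saving the article argues for: the same defect discovered on the built plant would be far more expensive to correct.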

  3. On the Application of Macros to the Automation of different Dating Models Using 210Pb

    International Nuclear Information System (INIS)

    Gasco, C.; Anton, M. P.; Ampudia, J.

    2002-01-01

    Different dating models based on 210Pb measurements, used for verifying recent events, are shown in this report, as well as models that describe different processes affecting the vertical distribution of radionuclides in lacustrine and marine sediments. Macro-commands are programmes included in calculation worksheets that allow automated operations to run. In this report macros are used to: a) obtain 210Pb results from a database created from different sampling campaigns; b) apply different dating models automatically; c) optimise the diffusion coefficient employed by the models through standard deviation calculations between experimental values and those obtained by the model. (Author) 21 refs
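    One of the standard 210Pb dating models that such macros automate, the constant initial concentration (CIC) model, dates a layer from the radioactive decay law t(z) = (1/λ) ln(C0/C(z)), with λ = ln 2 / 22.3 y⁻¹ for 210Pb. The activity profile below is invented for illustration:

    ```python
    # Sketch of the constant initial concentration (CIC) 210Pb dating model.
    import math

    HALF_LIFE_PB210 = 22.3                  # years
    LAMBDA = math.log(2) / HALF_LIFE_PB210  # decay constant, 1/years

    def cic_age(c0, cz):
        """Age of a layer whose unsupported 210Pb activity decayed from c0 to cz."""
        if not 0 < cz <= c0:
            raise ValueError("activity must satisfy 0 < C(z) <= C0")
        return math.log(c0 / cz) / LAMBDA

    surface_activity = 120.0                # Bq/kg at the surface (assumed)
    profile = [120.0, 60.0, 30.0, 15.0]     # assumed activities down the core
    ages = [cic_age(surface_activity, c) for c in profile]
    ```

    Each halving of the unsupported activity adds one half-life (22.3 years) to the layer age; a spreadsheet macro applying this model simply evaluates the same formula down the measured profile.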

  4. Constraint-Based Abstract Semantics for Temporal Logic

    DEFF Research Database (Denmark)

    Banda, Gourinath; Gallagher, John Patrick

    2010-01-01

    Abstract interpretation provides a practical approach to verifying properties of infinite-state systems. We apply the framework of abstract interpretation to derive an abstract semantic function for the modal mu-calculus, which is the basis for abstract model checking. The abstract semantic...

  5. Statistical colour models: an automated digital image analysis method for quantification of histological biomarkers.

    Science.gov (United States)

    Shu, Jie; Dolman, G E; Duan, Jiang; Qiu, Guoping; Ilyas, Mohammad

    2016-04-27

    Colour is the most important feature used in quantitative immunohistochemistry (IHC) image analysis; IHC is used to provide information relating to aetiology and to confirm malignancy. Statistical modelling is a technique widely used for colour detection in computer vision. We have developed a statistical model of colour detection applicable to detection of stain colour in digital IHC images. The model was first trained on a large set of colour pixels collected semi-automatically. To speed up the training and detection processes, we removed the luminance (Y) channel of the YCbCr colour space and chose 128 histogram bins, found to be the optimal number. A maximum likelihood classifier is used to automatically classify pixels in digital slides as positively or negatively stained. The model-based tool was developed within ImageJ to quantify targets identified using IHC and histochemistry. The purpose of the evaluation was to compare the computer model with human evaluation. Several large datasets were prepared and obtained from human oesophageal cancer, colon cancer and liver cirrhosis with different colour stains. Experimental results have demonstrated that the model-based tool achieves more accurate results than colour deconvolution and the CMYK model in the detection of brown colour, and is comparable to colour deconvolution in the detection of pink colour. We have also demonstrated that the proposed model has little inter-dataset variation. A robust and effective statistical model is introduced in this paper. The model-based interactive tool in ImageJ, which can create a visual representation of the statistical model and detect a specified colour automatically, is easy to use and available freely at http://rsb.info.nih.gov/ij/plugins/ihc-toolbox/index.html . Testing of the tool by different users showed only minor inter-observer variations in results.
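    The pipeline described above (drop the Y channel, quantize the chroma channels, compare per-class histogram likelihoods) can be sketched as follows; the training pixels are synthetic stand-ins for the semi-automatically collected stain pixels, and coarser bins than the paper's optimal 128 are used because of the tiny toy sample:

    ```python
    # Sketch of a statistical colour model for stain detection: chroma-only
    # histograms with a maximum likelihood pixel classifier.
    def rgb_to_cbcr(r, g, b):
        """ITU-R BT.601 RGB -> YCbCr, keeping only the chroma channels Cb, Cr."""
        cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
        cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
        return cb, cr

    def to_bin(cb, cr, bins=64):
        # The paper found 128 bins optimal; coarser bins are used here only
        # because the toy training set has three pixels per class.
        return (int(cb * bins / 256), int(cr * bins / 256))

    def train(pixels):
        """Normalized histogram over chroma bins: estimates P(bin | class)."""
        hist = {}
        for r, g, b in pixels:
            key = to_bin(*rgb_to_cbcr(r, g, b))
            hist[key] = hist.get(key, 0) + 1
        total = sum(hist.values())
        return {k: v / total for k, v in hist.items()}

    def classify(pixel, positive_model, negative_model):
        """Maximum likelihood: positively stained if P(bin|pos) > P(bin|neg)."""
        key = to_bin(*rgb_to_cbcr(*pixel))
        return positive_model.get(key, 0.0) > negative_model.get(key, 0.0)

    brown = [(120, 80, 40), (130, 85, 45), (110, 75, 35)]  # DAB-like pixels
    blue = [(70, 90, 160), (65, 85, 150), (75, 95, 170)]   # counterstain-like
    pos_model, neg_model = train(brown), train(blue)
    ```

    Discarding Y makes the model largely invariant to staining intensity and illumination, which is the stated motivation for working in chroma space.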

  6. Formal Development of a Tool for Automated Modelling and Verification of Relay Interlocking Systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Kjær, Andreas A.; Le Bliguet, Marie

    2011-01-01

    This paper describes a tool for formally modelling relay interlocking systems and explains how it has been stepwise, formally developed using the RAISE method. The developed tool takes the circuit diagrams of a relay interlocking system as input and produces as its result a state transition system modelling

  7. Evaluation of automated statistical shape model based knee kinematics from biplane fluoroscopy

    DEFF Research Database (Denmark)

    Baka, Nora; Kaptein, Bart L.; Giphart, J. Erik

    2014-01-01

    State-of-the-art fluoroscopic knee kinematic analysis methods require the patient-specific bone shapes segmented from CT or MRI. Substituting the patient-specific bone shapes with personalizable models, such as statistical shape models (SSM), could eliminate the CT/MRI acquisitions, and thereby d...

  8. Systems Operation Studies for Automated Guideway Transit Systems: Feeder Systems Model Functional Specification

    Science.gov (United States)

    1981-01-01

    This document specifies the functional requirements for the AGT-SOS Feeder Systems Model (FSM), the type of hardware required, and the modeling techniques employed by the FSM. The objective of the FSM is to map the zone-to-zone transit patronage dema...

  9. A semi-automated approach for generating natural language requirements documents based on business process models

    NARCIS (Netherlands)

    Aysolmaz, Banu; Leopold, Henrik; Reijers, Hajo A.; Demirörs, Onur

    2018-01-01

    Context: The analysis of requirements for business-related software systems is often supported by using business process models. However, the final requirements are typically still specified in natural language. This means that the knowledge captured in process models must be consistently

  10. IEEE conference record -- Abstracts

    Energy Technology Data Exchange (ETDEWEB)

    1994-01-01

    This conference covers the following areas: computational plasma physics; vacuum electronic; basic phenomena in fully ionized plasmas; plasma, electron, and ion sources; environmental/energy issues in plasma science; space plasmas; plasma processing; ball lightning/spherical plasma configurations; plasma processing; fast wave devices; magnetic fusion; basic phenomena in partially ionized plasma; dense plasma focus; plasma diagnostics; basic phenomena in weakly ionized gases; fast opening switches; MHD; fast z-pinches and x-ray lasers; intense ion and electron beams; laser-produced plasmas; microwave plasma interactions; EM and ETH launchers; solid state plasmas and switches; intense beam microwaves; and plasmas for lighting. Separate abstracts were prepared for 416 papers in this conference.

  11. IEEE conference record -- Abstracts

    International Nuclear Information System (INIS)

    Anon.

    1994-01-01

    This conference covers the following areas: computational plasma physics; vacuum electronic; basic phenomena in fully ionized plasmas; plasma, electron, and ion sources; environmental/energy issues in plasma science; space plasmas; plasma processing; ball lightning/spherical plasma configurations; plasma processing; fast wave devices; magnetic fusion; basic phenomena in partially ionized plasma; dense plasma focus; plasma diagnostics; basic phenomena in weakly ionized gases; fast opening switches; MHD; fast z-pinches and x-ray lasers; intense ion and electron beams; laser-produced plasmas; microwave plasma interactions; EM and ETH launchers; solid state plasmas and switches; intense beam microwaves; and plasmas for lighting. Separate abstracts were prepared for 416 papers in this conference

  12. Problems in abstract algebra

    CERN Document Server

    Wadsworth, A R

    2017-01-01

    This is a book of problems in abstract algebra for strong undergraduates or beginning graduate students. It can be used as a supplement to a course or for self-study. The book provides more variety and more challenging problems than are found in most algebra textbooks. It is intended for students wanting to enrich their learning of mathematics by tackling problems that take some thought and effort to solve. The book contains problems on groups (including the Sylow Theorems, solvable groups, presentation of groups by generators and relations, and structure and duality for finite abelian groups); rings (including basic ideal theory and factorization in integral domains and Gauss's Theorem); linear algebra (emphasizing linear transformations, including canonical forms); and fields (including Galois theory). Hints to many problems are also included.

  13. ICENES 2007 Abstracts

    International Nuclear Information System (INIS)

    Sahin, S.

    2007-01-01

    This book contains the conference program and abstracts of the 13th International Conference on Emerging Nuclear Energy Systems (ICENES 2007), held 03-08 June 2007 in Istanbul, Turkey. The main objective of the International Conference series on Emerging Nuclear Energy Systems (ICENES) is to provide an international scientific and technical forum for scientists, engineers, industry leaders, policy makers, decision makers and young professionals who will shape future energy supply and technology, for a broad review and discussion of various advanced, innovative and non-conventional nuclear energy production systems. The main topics of the 159 accepted papers from 35 countries are fusion science and technology, fission reactors, accelerator-driven systems, transmutation, lasers in nuclear technology, radiation shielding, nuclear reactions, hydrogen energy, solar energy, low-energy physics and societal issues.

  14. Forecasting performance of three automated modelling techniques during the economic crisis 2007-2009

    DEFF Research Database (Denmark)

    Kock, Anders Bredahl; Teräsvirta, Timo

    In this work we consider forecasting macroeconomic variables during an economic crisis. The focus is on a specific class of models, the so-called single hidden-layer feedforward autoregressive neural network models. What makes these models interesting in the present context is that they form a class of universal approximators and may be expected to work well during exceptional periods such as major economic crises. These models are often difficult to estimate, and we follow the idea of White (2006) to transform the specification and nonlinear estimation problem into a linear model selection and estimation ... during the economic crisis 2007–2009. Forecast accuracy is measured by the root mean square forecast error. Hypothesis testing is also used to compare the performance of the different techniques with each other.
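    The estimation strategy described in the abstract lends itself to a compact illustration. Below is a minimal sketch (not the authors' code) of a single hidden-layer feedforward autoregressive neural network in which, following the White (2006) idea mentioned above, the hidden-layer weights are drawn at random so that only a linear least-squares problem remains; the simulated series, lag order, and network size are all invented for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated "macroeconomic" series: a noisy AR(1) process
n = 300
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.7 * y[t - 1] + rng.normal(scale=0.5)

p, H = 3, 20  # lag order and number of hidden units (arbitrary choices)

# Lagged design matrix: column k holds lag k+1 of the series
X = np.column_stack([y[p - k - 1:n - k - 1] for k in range(p)])
target = y[p:]

# White (2006)-style trick, as described in the abstract: fix random
# hidden-layer weights so that estimation reduces to a *linear* problem.
W = rng.normal(size=(p, H))
b = rng.normal(size=H)
Hidden = np.tanh(X @ W + b)

# Linear least-squares estimation of the output weights
beta, *_ = np.linalg.lstsq(Hidden, target, rcond=None)

# One-step-ahead forecast from the last p observations
h_last = np.tanh(y[-p:][::-1] @ W + b)
forecast = h_last @ beta

# In-sample root mean square error, the accuracy measure used in the paper
rmse = np.sqrt(np.mean((Hidden @ beta - target) ** 2))
print(round(float(rmse), 3), round(float(forecast), 3))
```

    Real applications would add the model-selection step over hidden units that the paper emphasizes; this sketch only shows why the random-hidden-weights transformation makes estimation linear.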

  15. Mathematical model as means of optimization of the automation system of the process of incidents of information security management

    Directory of Open Access Journals (Sweden)

    Yulia G. Krasnozhon

    2018-03-01

    Full Text Available Modern information technologies are of increasing importance for the development dynamics and management structure of an enterprise. The efficiency of implementing modern information technologies is directly related to the quality of information security incident management. However, the impact of information security incident management on the quality and efficiency of the enterprise management system is not sufficiently covered in either Russian or foreign literature. The main approach to these problems is the optimization of the automation system for the information security incident management process. Today, special attention is paid to IT technologies for dealing with information security incidents at mission-critical facilities in the Russian Federation, such as the Federal Tax Service of Russia (FTS). It is proposed to use the mathematical apparatus of queueing theory in order to build a mathematical model for optimizing the system. The developed model allows one to estimate the quality of management, taking into account the rules and restrictions imposed on the system by the effects of information security incidents. An example is given to demonstrate the system at work, and the obtained statistical data are shown. An implementation of the system discussed here will improve the quality of the Russian FTS services and make responses to information security incidents faster.
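    As a rough illustration of how queueing theory can quantify incident-management quality, the sketch below computes the Erlang-C waiting probability and the mean waiting time for an M/M/c incident queue. The arrival rate, service rate, and team sizes are hypothetical; this is not the paper's model, only the standard formulas such a model builds on.

```python
import math

def erlang_c(c, lam, mu):
    """Probability that an arriving incident must wait (M/M/c queue)."""
    a = lam / mu                      # offered load in Erlangs
    if a >= c:
        raise ValueError("unstable queue: need lam/mu < c")
    summ = sum(a ** k / math.factorial(k) for k in range(c))
    top = a ** c / math.factorial(c) * c / (c - a)
    return top / (summ + top)

def mean_wait(c, lam, mu):
    """Mean time an incident waits before an analyst picks it up."""
    return erlang_c(c, lam, mu) / (c * mu - lam)

# Hypothetical numbers: 12 incidents/hour arrive; each analyst resolves 5/hour
lam, mu = 12.0, 5.0
for c in (3, 4, 5):
    print(c, round(mean_wait(c, lam, mu), 4))
```

    Varying `c` shows the trade-off the optimization has to resolve: adding a fourth analyst cuts the mean wait by roughly a factor of six in this toy setting.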

  16. Automated Learning of Subcellular Variation among Punctate Protein Patterns and a Generative Model of Their Relation to Microtubules.

    Directory of Open Access Journals (Sweden)

    Gregory R Johnson

    2015-12-01

    Full Text Available Characterizing the spatial distribution of proteins directly from microscopy images is a difficult problem with numerous applications in cell biology (e.g. identifying motor-related proteins) and clinical research (e.g. identification of cancer biomarkers). Here we describe the design of a system that provides automated analysis of punctate protein patterns in microscope images, including quantification of their relationships to microtubules. We constructed the system using confocal immunofluorescence microscopy images from the Human Protein Atlas project for 11 punctate proteins in three cultured cell lines. These proteins have previously been characterized as being primarily located in punctate structures, but their images had all been annotated by visual examination as being simply "vesicular". We were able to show that these patterns could be distinguished from each other with high accuracy, and we were able to assign to one of these subclasses hundreds of proteins whose subcellular localization had not previously been well defined. In addition to providing these novel annotations, we built a generative approach to modeling of punctate distributions that captures the essential characteristics of the distinct patterns. Such models are expected to be valuable for representing and summarizing each pattern and for constructing systems biology simulations of cell behaviors.

  17. Automated Learning of Subcellular Variation among Punctate Protein Patterns and a Generative Model of Their Relation to Microtubules.

    Science.gov (United States)

    Johnson, Gregory R; Li, Jieyue; Shariff, Aabid; Rohde, Gustavo K; Murphy, Robert F

    2015-12-01

    Characterizing the spatial distribution of proteins directly from microscopy images is a difficult problem with numerous applications in cell biology (e.g. identifying motor-related proteins) and clinical research (e.g. identification of cancer biomarkers). Here we describe the design of a system that provides automated analysis of punctate protein patterns in microscope images, including quantification of their relationships to microtubules. We constructed the system using confocal immunofluorescence microscopy images from the Human Protein Atlas project for 11 punctate proteins in three cultured cell lines. These proteins have previously been characterized as being primarily located in punctate structures, but their images had all been annotated by visual examination as being simply "vesicular". We were able to show that these patterns could be distinguished from each other with high accuracy, and we were able to assign to one of these subclasses hundreds of proteins whose subcellular localization had not previously been well defined. In addition to providing these novel annotations, we built a generative approach to modeling of punctate distributions that captures the essential characteristics of the distinct patterns. Such models are expected to be valuable for representing and summarizing each pattern and for constructing systems biology simulations of cell behaviors.

  18. Automated External Defibrillator

    Science.gov (United States)

    Automated External Defibrillator. Also known as ... What Is an automated external ... in survival. Training To Use an Automated External Defibrillator: Learning how to use an AED and taking ...

  19. Automated Modeling and Simulation Using the Bond Graph Method for the Aerospace Industry

    Science.gov (United States)

    Granda, Jose J.; Montgomery, Raymond C.

    2003-01-01

    Bond graph modeling was originally developed in the late 1950s by the late Prof. Henry M. Paynter of M.I.T. Prof. Paynter acted well before his time, as the main advantage of his creation, other than the modeling insight that it provides and the ability to deal effectively with Mechatronics, came to fruition only with the recent advent of modern computer technology and the tools derived as a result of it, including symbolic manipulation, MATLAB, SIMULINK, and the Computer Aided Modeling Program (CAMPG). Thus, only recently have these tools been available, allowing one to fully utilize the advantages that the bond graph method has to offer. The purpose of this paper is to help fill the knowledge void concerning the use of bond graphs in the aerospace industry. The paper first presents simple examples to serve as a tutorial on bond graphs for those not familiar with the technique. The reader is given the basic understanding needed to appreciate the applications that follow. After that, several aerospace applications are developed, such as modeling of an arresting system for aircraft carrier landings, suspension models used for landing gears, and multibody dynamics. The paper also presents an update on NASA's progress in modeling the International Space Station (ISS) using bond graph techniques, and an advanced actuation system utilizing shape memory alloys. The latter covers the Mechatronics advantages of the bond graph method: applications that simultaneously involve mechanical, hydraulic, thermal, and electrical subsystem modeling.
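    As a taste of what the method yields, the state equations that a bond graph produces for the classic I-C-R (mass-spring-damper) tutorial example can be simulated in a few lines. This is a hand-derived sketch, not CAMPG output, and the parameter values are arbitrary.

```python
import numpy as np

# State equations read off a simple I-C-R bond graph (mass-spring-damper):
# the states are the I-element momentum p and the C-element displacement q.
m, k, b = 1.0, 4.0, 0.5   # hypothetical inertia, stiffness, damping

def deriv(p, q, F=0.0):
    dp = F - b * (p / m) - k * q   # effort balance at the 1-junction
    dq = p / m                     # flow into the C element
    return dp, dq

# Forward-Euler simulation from a displaced initial condition
p, q, dt = 0.0, 1.0, 1e-3
energy0 = 0.5 * p**2 / m + 0.5 * k * q**2
for _ in range(20000):
    dp, dq = deriv(p, q)
    p, q = p + dt * dp, q + dt * dq
energy = 0.5 * p**2 / m + 0.5 * k * q**2
print(round(energy0, 3), round(energy, 6))
```

    The stored energy (kinetic in the I element plus potential in the C element) decays monotonically through the R element, which is exactly the power-flow bookkeeping that bond graphs make explicit.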

  20. Automating the segmentation of medical images for the production of voxel tomographic computational models

    International Nuclear Information System (INIS)

    Caon, M.

    2001-01-01

    Radiation dosimetry for the diagnostic medical imaging procedures performed on humans requires anatomically accurate, computational models. These may be constructed from medical images as voxel-based tomographic models. However, they are time consuming to produce and as a consequence, there are few available. This paper discusses the emergence of semi-automatic segmentation techniques and describes an application (iRAD) written in Microsoft Visual Basic that allows the bitmap of a medical image to be segmented interactively and semi-automatically while displayed in Microsoft Excel. iRAD will decrease the time required to construct voxel models. Copyright (2001) Australasian College of Physical Scientists and Engineers in Medicine
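    The kind of interactive, threshold-based segmentation that iRAD supports can be sketched as follows. The synthetic image, intensity windows, and tissue labels are invented for illustration and stand in for the user's interactive choices in the actual application.

```python
import numpy as np

# Synthetic 2-D "slice": background ~20, soft tissue ~100, bone ~200
rng = np.random.default_rng(1)
img = np.full((64, 64), 20.0)
img[16:48, 16:48] = 100.0
img[28:36, 28:36] = 200.0
img += rng.normal(scale=5.0, size=img.shape)

def segment(image, lo, hi, label):
    """Assign 'label' to every pixel whose intensity falls in [lo, hi)."""
    return np.where((image >= lo) & (image < hi), label, 0)

# User-chosen intensity windows stand in for the interactive step
labels = segment(img, 60, 150, 1) + segment(img, 150, 255, 2)
print(int((labels == 1).sum()), int((labels == 2).sum()))
```

    Stacking such labeled slices voxel by voxel is what produces the tomographic computational model the abstract describes; the interactive part is choosing the windows per region.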

  1. An automated and simple method for brain MR image extraction

    Directory of Open Access Journals (Sweden)

    Zhu Zixin

    2011-09-01

    Full Text Available Abstract. Background: The extraction of brain tissue from magnetic resonance head images is an important image processing step for the analysis of neuroimage data. The authors have developed an automated and simple brain extraction method using an improved geometric active contour model. Methods: The method uses an improved geometric active contour model which not only solves the boundary leakage problem but is also less sensitive to intensity inhomogeneity. The method defines the initial function as a binary level set function to improve computational efficiency. The method is applied to both our data and Internet brain MR data provided by the Internet Brain Segmentation Repository. Results: The results obtained from our method are compared with manual segmentation results using multiple indices. In addition, the method is compared to two popular methods, the Brain Extraction Tool and the Model-based Level Set. Conclusions: The proposed method provides automated and accurate brain extraction results with high efficiency.

  2. Bringing Automated Model Checking to PLC Program Development - A CERN Case Study

    CERN Document Server

    Fernandez Adiego, B; Tournier, J-C; Blanco Vinuela, E; Gonzalez Suarez, V M

    2014-01-01

    Verification of critical software is a high priority but a challenging task for industrial control systems. Model checking appears to be an appropriate approach for this purpose. However, this technique is not widely used in industry yet, due to some obstacles. The main obstacles encountered when trying to apply formal verification techniques at industrial installations are the difficulty of creating models out of PLC programs and defining formally the specification requirements. In addition, models produced out of real-life programs have a huge state space, thus preventing the verification due to performance issues. Our work at CERN (European Organization for Nuclear Research) focuses on developing efficient automatic verification methods for industrial critical installations based on PLC (Programmable Logic Controller) control systems. In this paper, we present a tool generating automatically formal models out of PLC code. The tool implements a general methodology which can support several input languages, ...
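    The core of such a verification step, once a formal model has been generated from the PLC code, is reachability analysis over the program's state space. The toy sketch below checks a safety property on a hypothetical two-bit pump/valve interlock; it is not the CERN tool, only an illustration of explicit-state model checking.

```python
from collections import deque

# Guarded commands abstracted from a (hypothetical) PLC interlock program:
# the pump may start only while the valve is open, and closing the valve
# forces the pump off. State = (pump_on, valve_open).
def successors(state):
    pump, valve = state
    succs = {(pump, True),            # open_valve
             (False, False),          # close_valve (forces pump off)
             (False, valve)}          # stop_pump
    if valve:
        succs.add((True, valve))      # start_pump, guarded by the valve
    return succs

def check_safety(init, bad):
    """Explicit-state BFS: return (True, None) if no 'bad' state is
    reachable from init, else (False, counterexample_state)."""
    seen, queue = {init}, deque([init])
    while queue:
        s = queue.popleft()
        if bad(s):
            return False, s
        for t in successors(s):
            if t not in seen:
                seen.add(t)
                queue.append(t)
    return True, None

safe, witness = check_safety((False, False), lambda s: s[0] and not s[1])
print(safe, witness)
```

    Real PLC programs yield state spaces far too large for such naive enumeration, which is exactly the performance obstacle the abstract mentions; production model checkers rely on symbolic representations and reductions instead.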

  3. Guidelines for Applying the Capability Maturity Model Analysis to Connected and Automated Vehicle Deployment

    Science.gov (United States)

    2017-11-23

    The Federal Highway Administration (FHWA) has adapted the Transportation Systems Management and Operations (TSMO) Capability Maturity Model (CMM) to describe the operational maturity of Infrastructure Owner-Operator (IOO) agencies across a range of i...

  4. Using integrated environmental modeling to automate a process-based Quantitative Microbial Risk Assessment

    Science.gov (United States)

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, an...

  5. Using Integrated Environmental Modeling to Automate a Process-Based Quantitative Microbial Risk Assessment (presentation)

    Science.gov (United States)

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, and...

  6. AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT: A GIS-BASED HYDROLOGIC MODELING TOOL

    Science.gov (United States)

    Planning and assessment in land and water resource management are evolving toward complex, spatially explicit regional assessments. These problems have to be addressed with distributed models that can compute runoff and erosion at different spatial and temporal scales. The extens...

  7. Automated Geospatial Watershed Assessment (AGWA) Tool for hydrologic modeling and watershed assessment

    Science.gov (United States)

    Using basic, easily attainable GIS data, AGWA provides a simple, direct, and repeatable methodology for hydrologic model setup, execution, and visualization. AGWA has users in over 170 countries and has been downloaded over 11,000 times.

  8. Forecasting performances of three automated modelling techniques during the economic crisis 2007-2009

    DEFF Research Database (Denmark)

    Kock, Anders Bredahl; Teräsvirta, Timo

    2014-01-01

    In this work we consider the forecasting of macroeconomic variables during an economic crisis. The focus is on a specific class of models, the so-called single hidden-layer feed-forward autoregressive neural network models. What makes these models interesting in the present context is the fact that they form a class of universal approximators and may be expected to work well during exceptional periods such as major economic crises. Neural network models are often difficult to estimate, and we follow the idea of White (2006) of transforming the specification and nonlinear estimation problem ... Scandinavian ones, and focus on forecasting during the economic crisis 2007–2009. The forecast accuracy is measured using the root mean square forecast error. Hypothesis testing is also used to compare the performances of the different techniques.

  9. An abstract machine for module replacement

    OpenAIRE

    Walton, Chris; Kırlı, Dilsun; Gilmore, Stephen

    1998-01-01

    In this paper we define an abstract machine model for the mλ typed intermediate language. This abstract machine is used to give a formal description of the operation of run-time module replacement from the programming language Dynamic ML. The essential technical device which we employ for module replacement is a modification of two-space copying garbage collection.

  10. Interactional Metadiscourse in Research Article Abstracts

    Science.gov (United States)

    Gillaerts, Paul; Van de Velde, Freek

    2010-01-01

    This paper deals with interpersonality in research article abstracts analysed in terms of interactional metadiscourse. The evolution in the distribution of three prominent interactional markers comprised in Hyland's (2005a) model, viz. hedges, boosters and attitude markers, is investigated in three decades of abstract writing in the field of…

  11. Automated prostate cancer detection via comprehensive multi-parametric magnetic resonance imaging texture feature models

    International Nuclear Information System (INIS)

    Khalvati, Farzad; Wong, Alexander; Haider, Masoom A.

    2015-01-01

    Prostate cancer is the most common form of cancer and the second leading cause of cancer death in North America. Auto-detection of prostate cancer can play a major role in early detection of prostate cancer, which has a significant impact on patient survival rates. While multi-parametric magnetic resonance imaging (MP-MRI) has shown promise in diagnosis of prostate cancer, the existing auto-detection algorithms do not take advantage of the abundance of data available in MP-MRI to improve detection accuracy. The goal of this research was to design a radiomics-based auto-detection method for prostate cancer by utilizing MP-MRI data. In this work, we present new MP-MRI texture feature models for radiomics-driven detection of prostate cancer. In addition to commonly used non-invasive imaging sequences in conventional MP-MRI, namely T2-weighted MRI (T2w) and diffusion-weighted imaging (DWI), our proposed MP-MRI texture feature models incorporate computed high-b DWI (CHB-DWI) and a new diffusion imaging modality called correlated diffusion imaging (CDI). Moreover, the proposed texture feature models incorporate features from individual b-value images. A comprehensive set of texture features was calculated for both the conventional MP-MRI and new MP-MRI texture feature models. We performed feature selection analysis for each individual modality and then combined the best features from each modality to construct the optimized texture feature models. The performance of the proposed MP-MRI texture feature models was evaluated via leave-one-patient-out cross-validation using a support vector machine (SVM) classifier trained on 40,975 cancerous and healthy tissue samples obtained from real clinical MP-MRI datasets. The proposed MP-MRI texture feature models outperformed the conventional model (i.e., T2w+DWI) with regard to cancer detection accuracy. Comprehensive texture feature models were developed for improved radiomics-driven detection of prostate cancer using MP-MRI.

  12. AUTOMATING THREE DIMENSIONAL (3D) MODEL CREATION OF CIRCUIT CARD ASSEMBLIES

    Science.gov (United States)

    2017-07-01

    If done correctly, both decals will now appear with the proper orientation, verifiable by the proper alignment of the through holes to the decal... Figure 3 (Printed wiring board with decal) shows a properly aligned graphical decal of the solid board model. Now that a model of the... interference between CCA components and features of other CCAs or parts, insufficient clearances with the CCA enclosure, misaligned mounting holes

  13. Library Automation.

    Science.gov (United States)

    Husby, Ole

    1990-01-01

    The challenges and potential benefits of automating university libraries are reviewed, with special attention given to cooperative systems. Aspects discussed include database size, the role of the university computer center, storage modes, multi-institutional systems, resource sharing, cooperative system management, networking, and intelligent…

  14. Rapid automated superposition of shapes and macromolecular models using spherical harmonics.

    Science.gov (United States)

    Konarev, Petr V; Petoukhov, Maxim V; Svergun, Dmitri I

    2016-06-01

    A rapid algorithm to superimpose macromolecular models in Fourier space is proposed and implemented (SUPALM). The method uses a normalized integrated cross-term of the scattering amplitudes as a proximity measure between two three-dimensional objects. The reciprocal-space algorithm allows for direct matching of heterogeneous objects including high- and low-resolution models represented by atomic coordinates, beads or dummy residue chains as well as electron microscopy density maps and inhomogeneous multi-phase models (e.g. of protein-nucleic acid complexes). Using spherical harmonics for the computation of the amplitudes, the method is up to an order of magnitude faster than the real-space algorithm implemented in SUPCOMB by Kozin & Svergun [J. Appl. Cryst. (2001), 34, 33-41]. The utility of the new method is demonstrated in a number of test cases and compared with the results of SUPCOMB. The spherical harmonics algorithm is best suited for low-resolution shape models, e.g. those provided by solution scattering experiments, but also facilitates a rapid cross-validation against structural models obtained by other methods.
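    The normalized cross-term proximity measure can be illustrated on toy densities. The sketch below uses a plain FFT rather than the spherical-harmonics expansion, so it is only an analogue of the SUPALM measure, but it shows why identical shapes score 1 while displaced ones score less.

```python
import numpy as np

def proximity(rho_a, rho_b):
    """Normalized integrated cross-term of Fourier amplitudes:
    a toy analogue of the SUPALM proximity measure."""
    A, B = np.fft.fftn(rho_a), np.fft.fftn(rho_b)
    cross = np.abs(np.vdot(A, B))                 # |sum conj(A) * B|
    return cross / np.sqrt(np.vdot(A, A).real * np.vdot(B, B).real)

# Two toy 3-D bead densities: a sphere and the same sphere, translated
grid = np.indices((24, 24, 24)).astype(float)
def ball(center, r=5.0):
    d2 = sum((g - c) ** 2 for g, c in zip(grid, center))
    return (d2 <= r * r).astype(float)

same = proximity(ball((12, 12, 12)), ball((12, 12, 12)))
shifted = proximity(ball((12, 12, 12)), ball((8, 12, 12)))
print(round(float(same), 3), round(float(shifted), 3))
```

    A superposition algorithm then searches over rotations and translations of one object to maximize this score; working with amplitudes in reciprocal space is what makes heterogeneous representations (atoms, beads, density maps) directly comparable.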

  15. Interpolation Algorithm and Mathematical Model in Automated Welding of Saddle-Shaped Weld

    Directory of Open Access Journals (Sweden)

    Lianghao Xue

    2018-01-01

    Full Text Available This paper presents a welding-torch pose model and an interpolation algorithm for trajectory control of the saddle-shaped weld formed by the intersection of two pipes; the working principle, interpolation algorithm, welding experiments, and simulation results of the automatic welding system for the saddle-shaped weld are described. A variable-angle interpolation method is used to control the trajectory and pose of the welding torch, which guarantees a constant linear terminal velocity. Mathematical models of the trajectory and pose of the welding torch are established. Simulation and experiment have been carried out to verify the effectiveness of the proposed algorithm and mathematical model. The results demonstrate that the interpolation algorithm meets the interpolation requirements of the saddle-shaped weld and achieves ideal feed-rate stability.
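    The saddle-shaped seam and the constant-linear-velocity requirement can be sketched numerically: sample the intersection curve of the two pipes, accumulate arc length, and resample at equal arc-length steps. The pipe radii and step count below are invented, and this illustrates the geometry rather than the paper's specific algorithm.

```python
import numpy as np

R, r = 60.0, 25.0   # main-pipe and branch-pipe radii (hypothetical, mm)

def saddle(theta):
    """Intersection of the branch cylinder (radius r, axis z) with the
    main pipe (radius R, axis y): the saddle-shaped weld seam."""
    x = r * np.cos(theta)
    y = r * np.sin(theta)
    z = np.sqrt(R**2 - x**2)
    return np.stack([x, y, z], axis=1)

# Dense parameter sampling, then arc-length reparametrization so that
# equally spaced interpolation points give a constant torch speed.
t = np.linspace(0.0, 2 * np.pi, 4001)
pts = saddle(t)
seg = np.linalg.norm(np.diff(pts, axis=0), axis=1)
s = np.concatenate([[0.0], np.cumsum(seg)])

n_steps = 200
s_uniform = np.linspace(0.0, s[-1], n_steps + 1)
t_uniform = np.interp(s_uniform, s, t)
path = saddle(t_uniform)

step = np.linalg.norm(np.diff(path, axis=0), axis=1)
print(round(float(s[-1]), 2), round(float(step.std() / step.mean()), 4))
```

    Uniform parameter steps in theta would give uneven spatial steps (the seam rises and falls in z), so the torch would speed up and slow down; the arc-length resampling is what keeps the terminal velocity constant.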

  16. Towards product design automation based on parameterized standard model with diversiform knowledge

    Science.gov (United States)

    Liu, Wei; Zhang, Xiaobing

    2017-04-01

    Product standardization based on CAD software is an effective way to improve design efficiency. In the past, research and development on standardization mainly focused on the level of the component, and standardization of the entire product as a whole was rarely taken into consideration. In this paper, the size and structure of 3D product models are both driven by Excel datasheets, based on which a parameterized model library is established. Diversiform knowledge, including associated parameters and default properties, is embedded into the templates in advance to simplify their reuse. Through simple operations, we can obtain the correct product with finished 3D models, including single parts or complex assemblies. Two examples are illustrated later to validate the idea, which will greatly improve design efficiency.

  17. A novel level set model with automated initialization and controlling parameters for medical image segmentation.

    Science.gov (United States)

    Liu, Qingyi; Jiang, Mingyan; Bai, Peirui; Yang, Guang

    2016-03-01

    In this paper, a level set model is proposed for medical image segmentation that requires neither a manually generated initial contour nor manually set controlling parameters. The contribution of this paper is mainly manifested in three points. First, we propose a novel adaptive mean shift clustering method based on global image information to guide the evolution of the level set. By simple threshold processing, the results of mean shift clustering can automatically and speedily generate an initial contour for level set evolution. Second, we devise several new functions to estimate the controlling parameters of the level set evolution based on the clustering results and image characteristics. Third, the reaction diffusion method is adopted to supersede the distance regularization term of the RSF level set model, which can improve the accuracy and speed of segmentation effectively with less manual intervention. Experimental results demonstrate the performance and efficiency of the proposed model for medical image segmentation. Copyright © 2015 Elsevier Ltd. All rights reserved.

  18. An automated toolchain for the data-driven and dynamical modeling of combined sewer systems.

    Science.gov (United States)

    Troutman, Sara C; Schambach, Nathaniel; Love, Nancy G; Kerkez, Branko

    2017-12-01

    The recent availability and affordability of sensors and wireless communications is poised to transform our understanding and management of water systems. This will enable a new generation of adaptive water models that can ingest large quantities of sensor feeds and provide the best possible estimates of current and future conditions. To that end, this paper presents a novel data-driven identification/learning toolchain for combined sewer and stormwater systems. The toolchain uses Gaussian Processes to model dry-weather flows (domestic wastewater) and dynamical System Identification to represent wet-weather flows (rainfall runoff). By using a large and high-resolution sensor dataset across a real-world combined sewer system, we illustrate that relatively simple models can achieve good forecasting performance, subject to a finely-tuned and continuous re-calibration procedure. The data requirements of the proposed toolchain are evaluated, showing sensitivity to spatial heterogeneity and unique time-scales across which models of individual sites remain representative. We identify a near-optimal time record, or data "age," for which historical measurements must be available to ensure good forecasting performance. We also show that more data do not always lead to a better model due to system uncertainty, such as shifts in climate or seasonal wastewater patterns. Furthermore, the individual components of the model (wet- and dry-weather) often require different volumes of historical observations for optimal forecasting performance, thus highlighting the need for a flexible re-calibration toolchain rather than a one-size-fits-all approach. Copyright © 2017 Elsevier Ltd. All rights reserved.
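    A minimal version of the Gaussian-process component for dry-weather flow can be sketched with a squared-exponential kernel. The diurnal flow profile, kernel hyperparameters, and noise level below are all invented for illustration and are not the paper's calibrated values.

```python
import numpy as np

def rbf(a, b, ell=2.0, var=1.0):
    """Squared-exponential covariance between hour-of-day inputs."""
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / ell) ** 2)

rng = np.random.default_rng(2)

# Hypothetical dry-weather diurnal flow: smooth daily pattern plus noise
hours = np.arange(0.0, 24.0, 1.0)
flow = 1.0 + 0.5 * np.sin(2 * np.pi * (hours - 6) / 24) ** 2 + \
       rng.normal(scale=0.05, size=hours.size)

# GP posterior mean and variance on a finer grid (noise variance 0.05**2)
K = rbf(hours, hours) + 0.05**2 * np.eye(hours.size)
grid = np.linspace(0.0, 23.0, 93)
Ks = rbf(grid, hours)
alpha = np.linalg.solve(K, flow)
mean = Ks @ alpha
var = rbf(grid, grid).diagonal() - \
      np.einsum('ij,ji->i', Ks, np.linalg.solve(K, Ks.T))
print(round(float(mean.mean()), 3), bool((var > -1e-6).all()))
```

    The continuous re-calibration the toolchain emphasizes would correspond to refitting the hyperparameters and sliding the training window as new sensor data arrive, rather than fitting once as done here.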

  19. Automated acquisition and processing of data from measurements on aerodynamic models

    International Nuclear Information System (INIS)

    Mantlik, F.; Pilat, M.; Schmid, J.

    1981-01-01

    Hardware and software are described for processing data measured in model research on local hydrodynamic conditions in fluid flow through channels with a complex cross-sectional geometry, obtained using aerodynamic models of parts of fast reactor fuel assemblies of the HEM-1 and HEM-2 type. A system for automatic control of the experiments and acquisition of measured data was proposed and is being implemented. Basic information is given on the programs for processing and storing the results using a GIER computer. A CAMAC system is primarily used as part of the hardware. (B.S.)

  20. Exoplanets and Multiverses (Abstract)

    Science.gov (United States)

    Trimble, V.

    2016-12-01

    (Abstract only) To the ancients, the Earth was the Universe, of a size to be crossed by a god in a day, by boat or chariot, and by humans in a lifetime. Thus an exoplanet would have been a multiverse. The ideas gradually separated over centuries, with gradual acceptance of a sun-centered solar system, the stars as suns likely to have their own planets, other galaxies beyond the Milky Way, and so forth. And whenever the community divided between "just one" of anything versus "many," the "manies" have won. Discoveries beginning in 1991 and 1995 have gradually led to a battalion or two of planets orbiting other stars, very few like our own little family, and to moderately serious consideration of even larger numbers of other universes, again very few like our own. I'm betting, however, on habitable (though not necessarily inhabited) exoplanets to be found, and habitable (though again not necessarily inhabited) universes. Only the former will yield pretty pictures.