WorldWideScience

Sample records for description analysis simulation

  1. Simulation framework and XML detector description for the CMS experiment

    CERN Document Server

    Arce, P; Boccali, T; Case, M; de Roeck, A; Lara, V; Liendl, M; Nikitenko, A N; Schröder, M; Strässner, A; Wellisch, H P; Wenzel, H

    2003-01-01

    Currently, CMS event simulation is based on GEANT3, while the detector description is built from different sources for simulation and reconstruction. A new simulation framework based on GEANT4 is under development. A full description of the detector is available, and the tuning of GEANT4 performance and the checking of the ability of the physics processes to describe the detector response are ongoing. Its integration into the CMS mass production system and the GRID is also under development. The Detector Description Database (DDD) project aims at providing a common source of information for Simulation, Reconstruction, Analysis, and Visualisation, while allowing for different representations as well as specific information for each application. A functional prototype, based on XML, has already been released. Examples of the integration of the DDD in the GEANT4 simulation and in reconstruction applications are also provided.

  2. An activity theory perspective of how scenario-based simulations support learning: a descriptive analysis.

    Science.gov (United States)

    Battista, Alexis

    2017-01-01

    The dominant frameworks for describing how simulations support learning emphasize increasing access to structured practice and the provision of feedback which are commonly associated with skills-based simulations. By contrast, studies examining student participants' experiences during scenario-based simulations suggest that learning may also occur through participation. However, studies directly examining student participation during scenario-based simulations are limited. This study examined the types of activities student participants engaged in during scenario-based simulations and then analyzed their patterns of activity to consider how participation may support learning. Drawing from Engeström's first-, second-, and third-generation activity systems analysis, an in-depth descriptive analysis was conducted. The study drew from multiple qualitative methods, namely narrative, video, and activity systems analysis, to examine student participants' activities and interaction patterns across four video-recorded simulations depicting common motivations for using scenario-based simulations (e.g., communication, critical patient management). The activity systems analysis revealed that student participants' activities encompassed three clinically relevant categories, including (a) use of physical clinical tools and artifacts, (b) social interactions, and (c) performance of structured interventions. Role assignment influenced participants' activities and the complexity of their engagement. Importantly, participants made sense of the clinical situation presented in the scenario by reflexively linking these three activities together. Specifically, student participants performed structured interventions, relying upon the use of physical tools, clinical artifacts, and social interactions together with interactions between students, standardized patients, and other simulated participants to achieve their goals. When multiple student participants were present, such as in a

  3. Reproducible computational biology experiments with SED-ML--the Simulation Experiment Description Markup Language.

    Science.gov (United States)

    Waltemath, Dagmar; Adams, Richard; Bergmann, Frank T; Hucka, Michael; Kolpakov, Fedor; Miller, Andrew K; Moraru, Ion I; Nickerson, David; Sahle, Sven; Snoep, Jacky L; Le Novère, Nicolas

    2011-12-15

    The increasing use of computational simulation experiments to inform modern biological research creates new challenges to annotate, archive, share and reproduce such experiments. The recently published Minimum Information About a Simulation Experiment (MIASE) proposes a minimal set of information that should be provided to allow the reproduction of simulation experiments among users and software tools. In this article, we present the Simulation Experiment Description Markup Language (SED-ML). SED-ML encodes in a computer-readable exchange format the information required by MIASE to enable reproduction of simulation experiments. It has been developed as a community project and it is defined in a detailed technical specification and additionally provides an XML schema. The version of SED-ML described in this publication is Level 1 Version 1. It covers the description of the most frequent type of simulation experiments in the area, namely time course simulations. SED-ML documents specify which models to use in an experiment, modifications to apply on the models before using them, which simulation procedures to run on each model, what analysis results to output, and how the results should be presented. These descriptions are independent of the underlying model implementation. SED-ML is a software-independent format for encoding the description of simulation experiments; it is not specific to particular simulation tools. Here, we demonstrate that with the growing software support for SED-ML we can effectively exchange executable simulation descriptions. With SED-ML, software can exchange simulation experiment descriptions, enabling the validation and reuse of simulation experiments in different tools. Authors of papers reporting simulation experiments can make their simulation protocols available for other scientists to reproduce the results. 
Because SED-ML is agnostic about exact modeling language(s) used, experiments covering models from different fields of research

  4. Reproducible computational biology experiments with SED-ML - The Simulation Experiment Description Markup Language

    Science.gov (United States)

    2011-01-01

    Background The increasing use of computational simulation experiments to inform modern biological research creates new challenges to annotate, archive, share and reproduce such experiments. The recently published Minimum Information About a Simulation Experiment (MIASE) proposes a minimal set of information that should be provided to allow the reproduction of simulation experiments among users and software tools. Results In this article, we present the Simulation Experiment Description Markup Language (SED-ML). SED-ML encodes in a computer-readable exchange format the information required by MIASE to enable reproduction of simulation experiments. It has been developed as a community project and it is defined in a detailed technical specification and additionally provides an XML schema. The version of SED-ML described in this publication is Level 1 Version 1. It covers the description of the most frequent type of simulation experiments in the area, namely time course simulations. SED-ML documents specify which models to use in an experiment, modifications to apply on the models before using them, which simulation procedures to run on each model, what analysis results to output, and how the results should be presented. These descriptions are independent of the underlying model implementation. SED-ML is a software-independent format for encoding the description of simulation experiments; it is not specific to particular simulation tools. Here, we demonstrate that with the growing software support for SED-ML we can effectively exchange executable simulation descriptions. Conclusions With SED-ML, software can exchange simulation experiment descriptions, enabling the validation and reuse of simulation experiments in different tools. Authors of papers reporting simulation experiments can make their simulation protocols available for other scientists to reproduce the results. 
Because SED-ML is agnostic about exact modeling language(s) used, experiments covering models from

  5. Simulation Experiment Description Markup Language (SED-ML) Level 1 Version 2.

    Science.gov (United States)

    Bergmann, Frank T; Cooper, Jonathan; Le Novère, Nicolas; Nickerson, David; Waltemath, Dagmar

    2015-09-04

    The number, size and complexity of computational models of biological systems are growing at an ever increasing pace. It is imperative to build on existing studies by reusing and adapting existing models and parts thereof. The description of the structure of models is not sufficient to enable the reproduction of simulation results. One also needs to describe the procedures the models are subjected to, as recommended by the Minimum Information About a Simulation Experiment (MIASE) guidelines. This document presents Level 1 Version 2 of the Simulation Experiment Description Markup Language (SED-ML), a computer-readable format for encoding simulation and analysis experiments to apply to computational models. SED-ML files are encoded in the Extensible Markup Language (XML) and can be used in conjunction with any XML-based model encoding format, such as CellML or SBML. A SED-ML file includes details of which models to use, how to modify them prior to executing a simulation, which simulation and analysis procedures to apply, which results to extract and how to present them. Level 1 Version 2 extends the format by allowing the encoding of repeated and chained procedures.
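The ingredients a SED-ML file lists (which models, which simulation procedures, which tasks bind them, and which outputs to report) can be sketched as an XML skeleton. The Python snippet below is a minimal illustrative sketch using only the standard library; the element names follow SED-ML Level 1 conventions, but the attributes are pared down and no namespace is declared, so this is an assumption-laden illustration, not a schema-valid SED-ML document.

```python
import xml.etree.ElementTree as ET

# Simplified SED-ML-like skeleton: which model to use, which
# simulation to run, the task binding them, and an output.
# Element names follow SED-ML Level 1 conventions; attributes are
# reduced to a minimum for illustration and are NOT schema-valid.
sed = ET.Element("sedML", level="1", version="2")

models = ET.SubElement(sed, "listOfModels")
ET.SubElement(models, "model", id="model1",
              language="urn:sedml:language:sbml",
              source="oscillator.xml")

sims = ET.SubElement(sed, "listOfSimulations")
ET.SubElement(sims, "uniformTimeCourse", id="sim1",
              initialTime="0", outputStartTime="0",
              outputEndTime="100", numberOfPoints="1000")

tasks = ET.SubElement(sed, "listOfTasks")
ET.SubElement(tasks, "task", id="task1",
              modelReference="model1", simulationReference="sim1")

outputs = ET.SubElement(sed, "listOfOutputs")
ET.SubElement(outputs, "plot2D", id="plot1")

xml_text = ET.tostring(sed, encoding="unicode")
print(xml_text[:40])
```

Note how the task layer keeps the model reference and the simulation reference separate, which is what lets the same model be reused under different procedures.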

  6. Production Logistics Simulation Supported by Process Description Languages

    Directory of Open Access Journals (Sweden)

    Bohács Gábor

    2016-03-01

    Process description languages used in business may also be useful in the optimization of logistics processes. They are an obvious candidate for process control: handling the main sources of faults and providing a correct list of what to do during the logistics process. The paper first presents the main features of the common process description languages. The following section describes the process modelling languages currently most used in the areas of production and construction logistics. In addition, the paper gives some examples of logistics simulation, another very important field of logistics system modelling. The main contribution of the paper is logistics simulation supported by process description languages: a comparison of a Petri net formal representation and a Simul8 model, carried out through a construction logistics model.
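Since the paper's contribution compares a Petri net representation with a Simul8 model, a toy Petri net interpreter helps make the formalism concrete. The sketch below is a minimal Python illustration; the place names and the single transition are invented for this example, and a real logistics model would carry many more places, transitions, and arc weights.

```python
# Minimal Petri net interpreter (illustrative sketch, not Simul8):
# places hold token counts; a transition fires when every input
# place holds at least one token, moving tokens to output places.

def enabled(marking, transition):
    inputs, _ = transition
    return all(marking[p] >= 1 for p in inputs)

def fire(marking, transition):
    inputs, outputs = transition
    m = dict(marking)
    for p in inputs:
        m[p] -= 1
    for p in outputs:
        m[p] += 1
    return m

# Toy logistics flow (names invented): goods wait at "dock"; a free
# "forklift" is needed to move one item to "storage", after which
# the forklift is released again.
marking = {"dock": 2, "forklift": 1, "storage": 0}
move = (("dock", "forklift"), ("storage", "forklift"))

while enabled(marking, move):
    marking = fire(marking, move)

print(marking)  # all dock items moved, forklift released
```

Running the net to quiescence ends with both items in storage and the forklift token back in place, which is exactly the kind of resource-constrained flow such formalisms capture.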

  7. High-Alpha Research Vehicle Lateral-Directional Control Law Description, Analyses, and Simulation Results

    Science.gov (United States)

    Davidson, John B.; Murphy, Patrick C.; Lallman, Frederick J.; Hoffler, Keith D.; Bacon, Barton J.

    1998-01-01

    This report contains a description of a lateral-directional control law designed for the NASA High-Alpha Research Vehicle (HARV). The HARV is a F/A-18 aircraft modified to include a research flight computer, spin chute, and thrust-vectoring in the pitch and yaw axes. Two separate design tools, CRAFT and Pseudo Controls, were integrated to synthesize the lateral-directional control law. This report contains a description of the lateral-directional control law, analyses, and nonlinear simulation (batch and piloted) results. Linear analysis results include closed-loop eigenvalues, stability margins, robustness to changes in various plant parameters, and servo-elastic frequency responses. Step time responses from nonlinear batch simulation are presented and compared to design guidelines. Piloted simulation task scenarios, task guidelines, and pilot subjective ratings for the various maneuvers are discussed. Linear analysis shows that the control law meets the stability margin guidelines and is robust to stability and control parameter changes. Nonlinear batch simulation analysis shows the control law exhibits good performance and meets most of the design guidelines over the entire range of angle-of-attack. This control law (designated NASA-1A) was flight tested during the Summer of 1994 at NASA Dryden Flight Research Center.

  8. Efficient generation of connectivity in neuronal networks from simulator-independent descriptions

    Directory of Open Access Journals (Sweden)

    Mikael eDjurfeldt

    2014-04-01

    Simulator-independent descriptions of connectivity in neuronal networks promise greater ease of model sharing, improved reproducibility of simulation results, and reduced programming effort for computational neuroscientists. However, until now, enabling the use of such descriptions in a given simulator in a computationally efficient way has entailed considerable work for simulator developers, which must be repeated for each new connectivity-generating library that is developed. We have developed a generic connection generator interface that provides a standard way to connect a connectivity-generating library to a simulator, such that one library can easily be replaced by another, according to the modeller's needs. We have used the connection generator interface to connect C++ and Python implementations of the connection-set algebra to the NEST simulator. We also demonstrate how the simulator-independent modelling framework PyNN can transparently take advantage of this, passing a connection description through to the simulator layer for rapid processing in C++ where a simulator supports the connection generator interface, and falling back to slower iteration in Python otherwise. A set of benchmarks demonstrates the good performance of the interface.
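The idea of a connection generator can be illustrated with a toy example: a declarative connectivity description is expanded into explicit connection pairs that any simulator backend could then instantiate. The sketch below is a minimal Python illustration; the dictionary format, rule names, and function signature are invented for this sketch and do not reproduce the actual NEST, PyNN, or connection-set algebra APIs.

```python
import random

def generate_connections(description, n_pre, n_post, seed=0):
    """Expand a declarative connectivity description (a dict with a
    'rule' key, invented for this sketch) into explicit
    (source, target) index pairs a simulator could instantiate."""
    rule = description["rule"]
    if rule == "all_to_all":
        return [(i, j) for i in range(n_pre) for j in range(n_post)]
    if rule == "fixed_probability":
        rng = random.Random(seed)  # seeded for reproducibility
        p = description["p"]
        return [(i, j) for i in range(n_pre) for j in range(n_post)
                if rng.random() < p]
    raise ValueError(f"unknown rule: {rule}")

conns = generate_connections({"rule": "all_to_all"}, n_pre=3, n_post=2)
print(len(conns))  # 3 * 2 = 6 connections
```

The point of the interface is that the description stays the same while the expansion step can be swapped for a fast C++ implementation when the simulator supports it.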

  9. Descriptive data analysis.

    Science.gov (United States)

    Thompson, Cheryl Bagley

    2009-01-01

    This 13th article of the Basics of Research series is the first in a short series on statistical analysis. These articles will discuss creating your statistical analysis plan, levels of measurement, descriptive statistics, probability theory, inferential statistics, and general considerations for interpretation of the results of a statistical analysis.
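As a concrete illustration of the descriptive statistics the series introduces, the following sketch computes the usual summaries for a small sample using only Python's standard statistics module; the data values are invented for illustration.

```python
import statistics

# Hypothetical sample of ten measurements (invented for illustration).
data = [4.1, 4.8, 5.0, 5.2, 5.2, 5.6, 5.9, 6.3, 6.8, 7.1]

summary = {
    "n": len(data),
    "mean": statistics.mean(data),
    "median": statistics.median(data),
    "mode": statistics.mode(data),     # most frequent value
    "stdev": statistics.stdev(data),   # sample standard deviation
    "range": max(data) - min(data),
}

for name, value in summary.items():
    print(f"{name}: {value}")
```

These summaries describe the sample itself; inferential statistics, covered later in the series, would instead use them to draw conclusions about the population the sample came from.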

  10. Simulator training analysis

    International Nuclear Information System (INIS)

    Hollnagel, E.; Rasmussen, J.

    1981-08-01

    This paper presents a suggestion for systematic collection of data during the normal use of training simulators, with the double purpose of supporting trainee debriefing and providing data for further theoretical studies of operator performance. The method is based on previously described models of operator performance and decision-making, and is a specific instance of the general method for analysis of operator performance data. The method combines a detailed transient-specific description of the expected performance with transient-independent tools for observation of critical activities. (author)

  11. Uncertainty analysis of a one-dimensional constitutive model for shape memory alloy thermomechanical description

    DEFF Research Database (Denmark)

    Oliveira, Sergio A.; Savi, Marcelo A.; Santos, Ilmar F.

    2014-01-01

    The use of shape memory alloys (SMAs) in engineering applications has increased interest in the accuracy of their thermomechanical description. This work presents an uncertainty analysis related to experimental tensile tests conducted with shape memory alloy wires. Experimental data are compared with numerical simulations obtained from a constitutive model with internal constraints employed to describe the thermomechanical behavior of SMAs. The idea is to evaluate whether the numerical simulations are within the uncertainty range of the experimental data. A parametric analysis is also developed...

  12. Conception of a PWR simulator as a tool for safety analysis

    International Nuclear Information System (INIS)

    Lanore, J.M.; Bernard, P.; Romeyer Dherbey, J.; Bonnet, C.; Quilchini, P.

    1982-09-01

    A simulator can be a very useful tool for safety analysis, for studying accident sequences involving malfunctions of the systems and operator interventions. The main characteristics of the simulator SALAMANDRE (description of the systems, physical models, programming organization, control desk) have therefore been selected according to the objectives of safety analysis.

  13. A uniform geometry description for simulation, reconstruction and visualization in the BESIII experiment

    Energy Technology Data Exchange (ETDEWEB)

    Liang Yutie [School of Physics and State Key Laboratory of Nuclear Physics and Technology, Peking University, Beijing 100871 (China)], E-mail: liangyt@hep.pku.edu.cn; Zhu Bo; You Zhengyun; Liu Kun; Ye Hongxue; Xu Guangming; Wang Siguang [School of Physics and State Key Laboratory of Nuclear Physics and Technology, Peking University, Beijing 100871 (China); Li Weidong; Liu Huaimin; Mao Zepu [Institute of High Energy Physics, CAS, Beijing 100049 (China); Mao Yajun [School of Physics and State Key Laboratory of Nuclear Physics and Technology, Peking University, Beijing 100871 (China)

    2009-05-21

    In the BESIII experiment, the simulation, reconstruction and visualization were designed to use the same geometry description in order to ensure the consistency of the geometry for different applications. Geometry Description Markup Language (GDML), an application-independent persistent format for describing the geometries of detectors, was chosen and met our requirement. The detector of BESIII was described with GDML and then used in Geant4-based simulation and ROOT-based reconstruction and visualization.

  14. A uniform geometry description for simulation, reconstruction and visualization in the BESIII experiment

    International Nuclear Information System (INIS)

    Liang Yutie; Zhu Bo; You Zhengyun; Liu Kun; Ye Hongxue; Xu Guangming; Wang Siguang; Li Weidong; Liu Huaimin; Mao Zepu; Mao Yajun

    2009-01-01

    In the BESIII experiment, the simulation, reconstruction and visualization were designed to use the same geometry description in order to ensure the consistency of the geometry for different applications. Geometry Description Markup Language (GDML), an application-independent persistent format for describing the geometries of detectors, was chosen and met our requirement. The detector of BESIII was described with GDML and then used in Geant4-based simulation and ROOT-based reconstruction and visualization.
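To illustrate what an application-independent geometry description looks like in practice, the sketch below parses a hand-written GDML-like fragment with Python's standard library and lists the declared volumes. The element names echo GDML conventions, but the fragment is simplified (no namespaces, materials, or positions) and the solid and volume names are illustrative, not taken from the actual BESIII geometry files.

```python
import xml.etree.ElementTree as ET

# A minimal GDML-like fragment (hand-written for illustration; real
# GDML files declare namespaces, materials, positions, and solids
# in full).
GDML_SNIPPET = """
<gdml>
  <solids>
    <box name="worldBox" x="2000" y="2000" z="2000" lunit="mm"/>
    <tube name="driftChamber" rmax="810" z="2380" lunit="mm"/>
  </solids>
  <structure>
    <volume name="World">
      <solidref ref="worldBox"/>
    </volume>
    <volume name="MDC">
      <solidref ref="driftChamber"/>
    </volume>
  </structure>
</gdml>
"""

root = ET.fromstring(GDML_SNIPPET)
# Map each logical volume to the solid it references.
volumes = {v.get("name"): v.find("solidref").get("ref")
           for v in root.iter("volume")}
print(volumes)
```

Because the file is plain XML, the same description can feed a Geant4-based simulation and a ROOT-based reconstruction, which is precisely the consistency the abstract describes.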

  15. BWR Full Integral Simulation Test (FIST) program: facility description report

    International Nuclear Information System (INIS)

    Stephens, A.G.

    1984-09-01

    A new boiling water reactor safety test facility (FIST, Full Integral Simulation Test) is described. It will be used to investigate small breaks and operational transients and to tie results from such tests to earlier large-break test results determined in the TLTA. The new facility's full height and prototypical components constitute a major scaling improvement over earlier test facilities. A heated feedwater system, permitting steady-state operation, and a large increase in the number of measurements are other significant improvements. The program background is outlined and program objectives defined. The design basis is presented together with a detailed, complete description of the facility and measurements to be made. An extensive component scaling analysis and prediction of performance are presented

  16. The Total In-Flight Simulator (TIFS) aerodynamics and systems: Description and analysis. [maneuver control and gust alleviators

    Science.gov (United States)

    Andrisani, D., II; Daughaday, H.; Dittenhauser, J.; Rynaski, E.

    1978-01-01

    The aerodynamics, control system, instrumentation complement, and recording system of the USAF Total In-Flight Simulator (TIFS) airplane are described. A control system that would allow the ailerons to be operated collectively as well as differentially, to enhance the ability of the vehicle to perform the dual function of maneuver load control and gust alleviation, is emphasized. Mathematical prediction of the rigid-body and flexible equations of longitudinal motion using the level 2.01 FLEXSTAB program is included, along with a definition of the vehicle geometry, the mass and stiffness distribution, the calculated mode frequencies and mode shapes, and the resulting aerodynamic equations of motion of the flexible vehicle. A complete description of the control and instrumentation system of the aircraft is presented, including analysis, ground test, and flight data comparisons of the performance and bandwidth of the aerodynamic surface servos. Proposed modifications for improved performance of the servos are also presented.

  17. Multidimensional nonlinear descriptive analysis

    CERN Document Server

    Nishisato, Shizuhiko

    2006-01-01

    Quantification of categorical, or non-numerical, data is a problem that scientists face across a wide range of disciplines. Exploring data analysis in various areas of research, such as the social sciences and biology, Multidimensional Nonlinear Descriptive Analysis presents methods for analyzing categorical data that are not necessarily sampled randomly from a normal population and often involve nonlinear relations. This reference not only provides an overview of multidimensional nonlinear descriptive analysis (MUNDA) of discrete data, it also offers new results in a variety of fields. The first part of the book covers conceptual and technical preliminaries needed to understand the data analysis in subsequent chapters. The next two parts contain applications of MUNDA to diverse data types, with each chapter devoted to one type of categorical data, a brief historical comment, and basic skills peculiar to the data types. The final part examines several problems and then concludes with suggestions for futu...

  18. Aircraft/Air Traffic Management Functional Analysis Model: Technical Description. 2.0

    Science.gov (United States)

    Etheridge, Melvin; Plugge, Joana; Retina, Nusrat

    1998-01-01

    The Aircraft/Air Traffic Management Functional Analysis Model, Version 2.0 (FAM 2.0), is a discrete event simulation model designed to support analysis of alternative concepts in air traffic management and control. FAM 2.0 was developed by the Logistics Management Institute (LMI) under a National Aeronautics and Space Administration (NASA) contract. This document provides a technical description of FAM 2.0 and its computer files to enable the modeler and programmer to make enhancements or modifications to the model. Those interested in a guide for using the model in analysis should consult the companion document, Aircraft/Air Traffic Management Functional Analysis Model, Version 2.0 Users Manual.

  19. Nucleonica. Web-based software tools for simulation and analysis

    International Nuclear Information System (INIS)

    Magill, J.; Dreher, R.; Soti, Z.

    2014-01-01

    The authors present a description of the Nucleonica web-based portal for simulation and analysis for a wide range of commonly encountered nuclear science applications. Advantages of a web-based approach include availability wherever there is internet access, intuitive user-friendly interface, remote access to high-power computing resources, and continual maintenance, improvement, and addition of tools and techniques common to the nuclear science industry. A description of the nuclear data resources, and some applications is given.

  20. Simulation Experiment Description Markup Language (SED-ML) Level 1 Version 2

    Directory of Open Access Journals (Sweden)

    Bergmann Frank T.

    2015-06-01

    The number, size and complexity of computational models of biological systems are growing at an ever increasing pace. It is imperative to build on existing studies by reusing and adapting existing models and parts thereof. The description of the structure of models is not sufficient to enable the reproduction of simulation results. One also needs to describe the procedures the models are subjected to, as recommended by the Minimum Information About a Simulation Experiment (MIASE) guidelines.

  1. Operationalizing Healthcare Simulation Psychological Safety: A Descriptive Analysis of an Intervention.

    Science.gov (United States)

    Henricksen, Jared W; Altenburg, Catherine; Reeder, Ron W

    2017-10-01

    Despite efforts to prepare a psychologically safe environment, simulation participants are occasionally psychologically distressed. Instructing simulation educators about participant psychological risks and having a participant psychological distress action plan available to simulation educators may assist them as they seek to keep all participants psychologically safe. A Simulation Participant Psychological Safety Algorithm was designed to aid simulation educators as they debrief simulation participants perceived to have psychological distress and categorize these events as mild (level 1), moderate (level 2), or severe (level 3). A prebrief dedicated to creating a psychologically safe learning environment was held constant. The algorithm was used for 18 months in an active pediatric simulation program. Data collected included the level of participant psychological distress as perceived and categorized by the simulation team using the algorithm, the type of simulation that participants went through, who debriefed, and the timing of when psychological distress was perceived to occur during the simulation session. The Kruskal-Wallis test was used to evaluate the relationship between events and simulation type, events and the simulation educator team who debriefed, and the timing of the event during the simulation session. A total of 3900 participants went through 399 simulation sessions between August 1, 2014, and January 26, 2016. Thirty-four simulation participants from 27 sessions (7%) were perceived to have an event. One participant was perceived to have a severe (level 3) psychological distress event. Events occurred more commonly in high-intensity simulations, with novice learners, and with specific educator teams. Simulation type and simulation educator team were associated with the occurrence of events. Psychological distress perceived by simulation personnel using the Simulation Participant Psychological Safety Algorithm is rare, with mild and moderate events being more common. The algorithm was used to teach

  2. Simulation Experiment Description Markup Language (SED-ML) Level 1 Version 3 (L1V3)

    Directory of Open Access Journals (Sweden)

    Bergmann Frank T.

    2018-03-01

    The creation of computational simulation experiments to inform modern biological research poses challenges to reproduce, annotate, archive, and share such experiments. Efforts such as SBML or CellML standardize the formal representation of computational models in various areas of biology. The Simulation Experiment Description Markup Language (SED-ML) describes what procedures the models are subjected to, and the details of those procedures. These standards, together with further COMBINE standards, describe models sufficiently well for the reproduction of simulation studies among users and software tools. The Simulation Experiment Description Markup Language (SED-ML) is an XML-based format that encodes, for a given simulation experiment, (i) which models to use; (ii) which modifications to apply to models before simulation; (iii) which simulation procedures to run on each model; (iv) how to post-process the data; and (v) how these results should be plotted and reported. SED-ML Level 1 Version 1 (L1V1) implemented support for the encoding of basic time course simulations. SED-ML L1V2 added support for more complex types of simulations, specifically repeated tasks and chained simulation procedures. SED-ML L1V3 extends L1V2 by means to describe which datasets and subsets thereof to use within a simulation experiment.

  3. Simulation Experiment Description Markup Language (SED-ML) Level 1 Version 3 (L1V3).

    Science.gov (United States)

    Bergmann, Frank T; Cooper, Jonathan; König, Matthias; Moraru, Ion; Nickerson, David; Le Novère, Nicolas; Olivier, Brett G; Sahle, Sven; Smith, Lucian; Waltemath, Dagmar

    2018-03-19

    The creation of computational simulation experiments to inform modern biological research poses challenges to reproduce, annotate, archive, and share such experiments. Efforts such as SBML or CellML standardize the formal representation of computational models in various areas of biology. The Simulation Experiment Description Markup Language (SED-ML) describes what procedures the models are subjected to, and the details of those procedures. These standards, together with further COMBINE standards, describe models sufficiently well for the reproduction of simulation studies among users and software tools. The Simulation Experiment Description Markup Language (SED-ML) is an XML-based format that encodes, for a given simulation experiment, (i) which models to use; (ii) which modifications to apply to models before simulation; (iii) which simulation procedures to run on each model; (iv) how to post-process the data; and (v) how these results should be plotted and reported. SED-ML Level 1 Version 1 (L1V1) implemented support for the encoding of basic time course simulations. SED-ML L1V2 added support for more complex types of simulations, specifically repeated tasks and chained simulation procedures. SED-ML L1V3 extends L1V2 by means to describe which datasets and subsets thereof to use within a simulation experiment.

  4. Fractional-Order Nonlinear Systems Modeling, Analysis and Simulation

    CERN Document Server

    Petráš, Ivo

    2011-01-01

    "Fractional-Order Nonlinear Systems: Modeling, Analysis and Simulation" presents a study of fractional-order chaotic systems accompanied by Matlab programs for simulating their state space trajectories, which are shown in the illustrations in the book. The description of the chaotic systems is clearly presented, and their analysis and numerical solution are done in an easy-to-follow manner. Simulink models for the selected fractional-order systems are also presented. Readers will understand the fundamentals of the fractional calculus, how real dynamical systems can be described using fractional derivatives and fractional differential equations, how such equations can be solved, and how to simulate and explore chaotic systems of fractional order. The book is addressed to mathematicians, physicists, engineers, and other scientists interested in chaos phenomena or in fractional-order systems. It can be used in courses on dynamical systems, control theory, and applied mathematics at graduate or postgraduate level. ...

  5. Domain Endurants: An Analysis and Description Process Model

    DEFF Research Database (Denmark)

    Bjørner, Dines

    2014-01-01

    We present a summary, Sect. 2, of a structure of domain analysis and description concepts: techniques and tools. And we link, in Sect. 3, these concepts, embodied in domain analysis prompts and domain description prompts, in a model of how a diligent domain analyser cum describer would use them. We...

  6. Description of textures by a structural analysis.

    Science.gov (United States)

    Tomita, F; Shirai, Y; Tsuji, S

    1982-02-01

    A structural analysis system for describing natural textures is introduced. The analyzer automatically extracts the texture elements in an input image, measures their properties, classifies them into some distinctive classes (one "ground" class and some "figure" classes), and computes the distributions of the gray level, the shape, and the placement of the texture elements in each class. These descriptions are used for classification of texture images. An analysis-by-synthesis method for evaluating texture analyzers is also presented. We propose a synthesizer which generates a texture image based on the descriptions. By comparing the reconstructed image with the original one, we can see what information is preserved and what is lost in the descriptions.

  7. Multi-Level Simulated Fault Injection for Data Dependent Reliability Analysis of RTL Circuit Descriptions

    Directory of Open Access Journals (Sweden)

    NIMARA, S.

    2016-02-01

    This paper proposes a data-dependent reliability evaluation methodology for digital systems described at Register Transfer Level (RTL). It uses a hybrid hierarchical approach, combining the accuracy provided by Gate Level (GL) Simulated Fault Injection (SFI) with the low simulation overhead of RTL fault injection. The methodology comprises the following steps: correct simulation of the RTL system according to a set of input vectors, hierarchical decomposition of the system into basic RTL blocks, logic synthesis of the basic RTL blocks, data-dependent SFI for the GL netlists, and RTL SFI. The proposed methodology has been validated in terms of accuracy on a medium-sized circuit, the parallel comparator used in the Check Node Unit (CNU) of Low-Density Parity-Check (LDPC) decoders. The methodology has been applied to the reliability analysis of a 128-bit Advanced Encryption Standard (AES) crypto-core, for which GL simulation was prohibitive in terms of required computational resources.
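    As a loose illustration of the statistical side of simulated fault injection (not the paper's hybrid RTL/GL flow), one can inject single bit flips into a behavioural model of a small block and count how often the output diverges from the golden run. All names below are hypothetical.

    ```python
    import random

    def comparator(a, b):
        """Golden model of an 8-bit magnitude comparator."""
        return 1 if a > b else 0

    def fault_injection_campaign(trials=10000, seed=1):
        """Estimate the fraction of single bit flips on operand `a`
        that propagate to the comparator output (toy single-event-upset model)."""
        rng = random.Random(seed)
        failures = 0
        for _ in range(trials):
            a, b = rng.randrange(256), rng.randrange(256)
            faulty_a = a ^ (1 << rng.randrange(8))   # flip one random bit of a
            if comparator(faulty_a, b) != comparator(a, b):
                failures += 1
        return failures / trials
    ```

    A data-dependent analysis in the spirit of the paper would replace the uniform operands with the actual input vectors exercised by the workload.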

  8. An analysis of simulated and observed storm characteristics

    Science.gov (United States)

    Benestad, R. E.

    2010-09-01

    A calculus-based cyclone identification (CCI) method has been applied to the most recent re-analysis (ERAINT) from the European Centre for Medium-range Weather Forecasts and to results from regional climate model (RCM) simulations. The storm frequency for events with central pressure below a threshold value of 960-990 hPa was examined, and the gradient wind from the simulated storm systems was compared with corresponding estimates from the re-analysis. The analysis also yielded estimates for the spatial extent of the storm systems, which was also included in the regional climate model cyclone evaluation. A comparison is presented between a number of RCMs and the ERAINT re-analysis in terms of their description of the gradient winds, number of cyclones, and spatial extent. Furthermore, a comparison between geostrophic wind estimated through triangles of interpolated or station measurements of SLP is presented. Wind still represents one of the more challenging variables to model realistically.
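    The geostrophic wind referred to above follows from the horizontal pressure gradient and the Coriolis parameter. A minimal sketch (pure geostrophic balance, constant air density assumed; not the CCI code):

    ```python
    import math

    def geostrophic_speed(dp_dx, dp_dy, lat_deg, rho=1.25):
        """Geostrophic wind speed (m/s) from horizontal SLP gradients (Pa/m).

        V_g = |grad p| / (rho * |f|), with f the Coriolis parameter.
        """
        omega = 7.2921e-5                                   # Earth's rotation rate (rad/s)
        f = 2.0 * omega * math.sin(math.radians(lat_deg))   # Coriolis parameter
        return math.hypot(dp_dx, dp_dy) / (rho * abs(f))
    ```

    For example, a gradient of 1 hPa per 100 km (1e-3 Pa/m) at 60° latitude gives a geostrophic speed of roughly 6 m/s; estimating the gradient from a triangle of SLP stations is a finite-difference version of the same balance.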

  9. Simulation model for wind energy storage systems. Volume III. Program descriptions. [SIMWEST CODE

    Energy Technology Data Exchange (ETDEWEB)

    Warren, A.W.; Edsinger, R.W.; Burroughs, J.D.

    1977-08-01

    The effort developed a comprehensive computer program for the modeling of wind energy/storage systems utilizing any combination of five types of storage (pumped hydro, battery, thermal, flywheel and pneumatic). An acronym for the program is SIMWEST (Simulation Model for Wind Energy Storage). The level of detail of SIMWEST is consistent with a role of evaluating the economic feasibility as well as the general performance of wind energy systems. The software package consists of two basic programs and a library of system, environmental, and load components. Volume III, the SIMWEST program description, contains program descriptions, flow charts and program listings for the SIMWEST Model Generation Program, the Simulation program, the File Maintenance program and the Printer Plotter program. Volume III generally would not be required by the SIMWEST user.

  10. Descriptive Topology in Selected Topics of Functional Analysis

    CERN Document Server

    Kakol, J; Pellicer, Manuel Lopez

    2011-01-01

    "Descriptive Topology in Selected Topics of Functional Analysis" is a collection of recent developments in the field of descriptive topology, specifically focused on the classes of infinite-dimensional topological vector spaces that appear in functional analysis. Such spaces include Frechet spaces, (LF)-spaces and their duals, and the space of continuous real-valued functions C(X) on a completely regular Hausdorff space X, to name a few. These vector spaces appear in functional analysis in distribution theory, differential equations, complex analysis, and various other analytical set

  11. Reliability analysis of neutron transport simulation using Monte Carlo method

    International Nuclear Information System (INIS)

    Souza, Bismarck A. de; Borges, Jose C.

    1995-01-01

    This work presents a statistical and reliability analysis covering data obtained by computer simulation of the neutron transport process, using the Monte Carlo method. A general description of the method and its applications is presented. Several simulations, corresponding to slowing-down and shielding problems, have been accomplished. The influence of the physical dimensions of the materials and of the sample size on the reliability level of results was investigated. The objective was to optimize the sample size, in order to obtain reliable results while optimizing computation time. (author). 5 refs, 8 figs
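    The sample-size/reliability trade-off described in this record can be illustrated with a deliberately simplified Monte Carlo slab-shielding model (absorption only, mono-directional source; a sketch, not the study's actual code):

    ```python
    import math
    import random

    def transmission_probability(mfp_thickness, n, seed=0):
        """Monte Carlo estimate of the fraction of neutrons crossing a slab.

        The slab thickness is given in mean free paths; each neutron's free
        path is sampled from the exponential distribution.  Returns the
        estimate and its binomial standard error.
        """
        rng = random.Random(seed)
        transmitted = sum(
            1 for _ in range(n)
            if -math.log(1.0 - rng.random()) > mfp_thickness  # sampled free path
        )
        p = transmitted / n
        std_err = math.sqrt(p * (1.0 - p) / n)
        return p, std_err
    ```

    The analytic answer for this toy model is exp(-thickness), and the standard error shrinks as 1/sqrt(n), which is exactly the sample-size optimization discussed in the abstract.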

  12. The experiences of last-year student midwives with High-Fidelity Perinatal Simulation training: A qualitative descriptive study.

    Science.gov (United States)

    Vermeulen, Joeri; Beeckman, Katrien; Turcksin, Rivka; Van Winkel, Lies; Gucciardo, Léonardo; Laubach, Monika; Peersman, Wim; Swinnen, Eva

    2017-06-01

    Simulation training is a powerful and evidence-based teaching method in healthcare. It allows students to develop essential competences that are often difficult to achieve during internships. High-Fidelity Perinatal Simulation exposes them to real-life scenarios in a safe environment. Although student midwives' experiences need to be considered to make the simulation training work, these have been overlooked so far. To explore the experiences of last-year student midwives with High-Fidelity Perinatal Simulation training. A qualitative descriptive study, using three focus group conversations with last-year student midwives (n=24). Audio tapes were transcribed and a thematic content analysis was performed. The entire data set was coded according to recurrent or common themes. To achieve investigator triangulation and confirm themes, discussions among the researchers were incorporated into the analysis. Students found High-Fidelity Perinatal Simulation training to be a positive learning method that increased both their competence and confidence. Their experiences varied over the different phases of the High-Fidelity Perinatal Simulation training. Although uncertainty, tension, confusion and disappointment were experienced throughout the simulation trajectory, they reported that this did not affect their learning and confidence-building. As High-Fidelity Perinatal Simulation training constitutes a helpful learning experience in midwifery education, it could have a positive influence on maternal and neonatal outcomes. In the long term, it could therefore enhance the midwifery profession in several ways. The present study is an important first step in opening up the debate about the pedagogical use of High-Fidelity Perinatal Simulation training within midwifery education. Copyright © 2017 Australian College of Midwives. Published by Elsevier Ltd. All rights reserved.

  13. Solar Pilot Plant, Phase I. Preliminary design report. Volume II. System description and system analysis. CDRL item 2

    Energy Technology Data Exchange (ETDEWEB)

    None

    1977-05-01

    Honeywell conducted a parametric analysis of the 10-MW(e) solar pilot plant requirements and expected performance and established an optimum system design. The main analytical simulation tools were the optical (ray trace) and the dynamic simulation models. These are described in detail in Books 2 and 3 of this volume under separate cover. In making design decisions, available performance and cost data were used to provide a design reflecting the overall requirements and economics of a commercial-scale plant. This volume contains a description of this analysis/design process and resultant system/subsystem design and performance.

  14. Simulation Use in Paramedic Education Research (SUPER): A Descriptive Study.

    Science.gov (United States)

    McKenna, Kim D; Carhart, Elliot; Bercher, Daniel; Spain, Andrew; Todaro, John; Freel, Joann

    2015-01-01

    The purpose of this research was to characterize the use of simulation in initial paramedic education programs in order to assist stakeholders' efforts to target educational initiatives and resources. This group sought to provide a snapshot of what simulation resources programs have or have access to and how they are used; faculty perceptions about simulation; whether program characteristics, resources, or faculty training influence simulation use; and if simulation resources are uniform for patients of all ages. This was a cross-sectional census survey of paramedic programs that were accredited or had a Letter of Review from the Committee on Accreditation of Educational Programs for the EMS Professions at the time of the study. The data were analyzed using descriptive statistics and chi-square analyses. Of the 638 surveys sent, 389 valid responses (61%) were analyzed. Paramedic programs reported they have or have access to a wide range of simulation resources (task trainers [100%], simple manikins [100%], intermediate manikins [99%], advanced/fully programmable manikins [91%], live simulated patients [83%], computer-based [71%], and virtual reality [19%]); however, they do not consistently use them, particularly advanced (71%), live simulated patients (66%), computer-based (games, scenarios) (31%), and virtual reality (4%). Simulation equipment (of any type) reportedly sits idle and unused in 31% of programs. Lack of training was cited as the most common reason. Personnel support specific to simulation was available in 44% of programs. Programs reported using simulation to replace skills more frequently than to replace field or clinical hours. Simulation goals included assessment, critical thinking, and problem-solving most frequently, and patient and crew safety least often. Programs using advanced manikins report manufacturers as their primary means of training (87%) and that 19% of faculty had no training specific to those manikins. Many (78%) respondents felt

  15. Description and Analysis Pattern for Theses and Dissertations

    Directory of Open Access Journals (Sweden)

    Sirous Alidousti

    2009-07-01

    Dissertations and theses generated in the course of research at PhD and Masters levels are considered important scientific documents in every country. Description and analysis of the data in such documents, collected together, could automatically provide new and very valuable information, especially when compared with data from other resources. Nevertheless, no comprehensive, integrated pattern exists for such description and analysis. The present paper offers the findings of a research project conducted to devise an information analysis pattern for dissertations and theses. It also puts forward information categories derived from such documents that could be described and analyzed.

  16. ASPEN: A fully kinetic, reduced-description particle-in-cell model for simulating parametric instabilities

    International Nuclear Information System (INIS)

    Vu, H.X.; Bezzerides, B.; DuBois, D.F.

    1999-01-01

    A fully kinetic, reduced-description particle-in-cell (RPIC) model is presented in which deviations from quasineutrality, electron and ion kinetic effects, and nonlinear interactions between low-frequency and high-frequency parametric instabilities are modeled correctly. The model is based on a reduced description where the electromagnetic field is represented by three separate temporal envelopes in order to model parametric instabilities with low-frequency and high-frequency daughter waves. Because temporal envelope approximations are invoked, the simulation can be performed on the electron time scale instead of the time scale of the light waves. The electrons and ions are represented by discrete finite-size particles, permitting electron and ion kinetic effects to be modeled properly. The Poisson equation is utilized to ensure that space-charge effects are included. The RPIC model is fully three dimensional and has been implemented in two dimensions on the Accelerated Strategic Computing Initiative (ASCI) parallel computer at Los Alamos National Laboratory, and the resulting simulation code has been named ASPEN. The authors believe this code is the first particle-in-cell code capable of simulating the interaction between low-frequency and high-frequency parametric instabilities in multiple dimensions. Test simulations of stimulated Raman scattering, stimulated Brillouin scattering, and Langmuir decay instability are presented

  17. Descriptions of positron defect analysis capabilities

    International Nuclear Information System (INIS)

    Howell, R.H.

    1994-10-01

    A series of descriptive papers and graphics appropriate for distribution to potential collaborators has been assembled. These describe the capabilities for defect analysis using positron annihilation spectroscopy. The application of positrons to problems in the polymer and semiconductor industries is addressed

  18. Baseline process description for simulating plutonium oxide production for precalc project

    Energy Technology Data Exchange (ETDEWEB)

    Pike, J. A. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-10-26

    Savannah River National Laboratory (SRNL) started a multi-year project, the PreCalc Project, to develop a computational simulation of a plutonium oxide (PuO2) production facility with the objective to study the fundamental relationships between morphological and physicochemical properties. This report provides a detailed baseline process description to be used by SRNL personnel and collaborators to facilitate the initial design and construction of the simulation. The PreCalc Project team selected the HB-Line Plutonium Finishing Facility as the basis for a nominal baseline process since the facility is operational and significant model validation data can be obtained. The process boundary as well as process and facility design details necessary for multi-scale, multi-physics models are provided.

  19. Analysis of individual brain activation maps using hierarchical description and multiscale detection

    International Nuclear Information System (INIS)

    Poline, J.B.; Mazoyer, B.M.

    1994-01-01

    The authors propose a new method for the analysis of brain activation images that aims at detecting activated volumes rather than pixels. The method is based on Poisson process modeling, hierarchical description, and multiscale detection (MSD). Its performances have been assessed using both Monte Carlo simulated images and experimental PET brain activation data. As compared to other methods, the MSD approach shows enhanced sensitivity with a controlled overall type I error, and has the ability to provide an estimate of the spatial limits of the detected signals. It is applicable to any kind of difference image for which the spatial autocorrelation function can be approximated by a stationary Gaussian function

  20. Analogue circuits simulation

    Energy Technology Data Exchange (ETDEWEB)

    Mendo, C

    1988-09-01

    Most analogue simulators have evolved from SPICE. The history and description of SPICE-like simulators are given. From a mathematical formulation of the electronic circuit the following analyses are possible: DC, AC, transient, noise, distortion, Worst Case and Statistical.
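    At the core of the DC analysis mentioned here, SPICE-like simulators reduce a linear circuit to a system of nodal equations G·v = I and solve it. A minimal sketch with a hypothetical two-node divider (values chosen for illustration):

    ```python
    import numpy as np

    def dc_nodal_solve(G, I):
        """Solve the nodal equations G*v = I for the node voltages
        (linear DC operating point; nonlinear elements would require
        Newton iteration around this solve)."""
        return np.linalg.solve(G, I)

    # Hypothetical circuit: a 1 A current source into node 1,
    # R1 = 1 kOhm from node 1 to node 2, R2 = 2 kOhm from node 2 to ground.
    g1, g2 = 1 / 1000, 1 / 2000           # conductances in siemens
    G = np.array([[g1, -g1],
                  [-g1, g1 + g2]])        # nodal conductance matrix
    I = np.array([1.0, 0.0])              # injected currents
    v = dc_nodal_solve(G, I)              # node voltages: [3000 V, 2000 V]
    ```

    Transient analysis extends the same formulation by stamping companion models of capacitors and inductors at each time step, and AC analysis solves the complex-valued version at each frequency.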

  1. The College of Anaesthetists of Ireland Simulation Training programme: a descriptive report and analysis of course participants' feedback.

    Science.gov (United States)

    Cafferkey, Aine; Coyle, Elizabeth; Greaney, David; Harte, Sinead; Hayes, Niamh; Langdon, Miriam; Straub, Birgitt; Burlacu, Crina

    2018-03-20

    Simulation-based education is a modern training modality that allows healthcare professionals to develop knowledge and practice skills in a safe learning environment. The College of Anaesthetists of Ireland (CAI) was the first Irish postgraduate medical training body to introduce mandatory simulation training into its curriculum. Extensive quality assurance and improvement data has been collected on all simulation courses to date. Describe The College of Anaesthetists of Ireland Simulation Training (CAST) programme and report the analysis of course participants' feedback. A retrospective review of feedback forms from four simulation courses from March 2010 to August 2016 took place. Qualitative and quantitative data from 1069 participants who attended 112 courses was analysed. Feedback was overall very positive. Course content and delivery were deemed to be appropriate. Participants agreed that course participation would influence their future practice. A statistically significant difference (P simulation training in specialist anaesthesia training in Ireland.

  2. Comparison of descriptive sensory analysis and chemical analysis for oxidative changes in milk

    DEFF Research Database (Denmark)

    Hedegaard, R V; Kristensen, D; Nielsen, Jacob Holm

    2006-01-01

    Oxidation in 3 types of bovine milk with different fatty acid profiles obtained through manipulation of feed was evaluated by analytical methods quantifying the content of potential antioxidants, the tendency of formation of free radicals, and the accumulation of primary and secondary oxidation products. The milk samples were evaluated in parallel by descriptive sensory analysis by a trained panel, and the correlation between the chemical analysis and the descriptive sensory analysis was evaluated. The fatty acid composition of the 3 types of milk was found to influence the oxidative and lipolytic changes occurring in the milk during chill storage for 4 d. Sensory analysis and chemical analysis showed high correlation between the typical descriptors for oxidation such as cardboard, metallic taste, and boiled milk and specific chemical markers for oxidation such as hexanal. Notably, primary...

  3. Comparison of descriptive sensory analysis and chemical analysis for oxidative changes in milk

    DEFF Research Database (Denmark)

    Hedegaard, Rikke Susanne Vingborg; Kristensen, D.; Nielsen, J. H.

    2006-01-01

    Oxidation in 3 types of bovine milk with different fatty acid profiles obtained through manipulation of feed was evaluated by analytical methods quantifying the content of potential antioxidants, the tendency of formation of free radicals, and the accumulation of primary and secondary oxidation products. The milk samples were evaluated in parallel by descriptive sensory analysis by a trained panel, and the correlation between the chemical analysis and the descriptive sensory analysis was evaluated. The fatty acid composition of the 3 types of milk was found to influence the oxidative and lipolytic changes occurring in the milk during chill storage for 4 d. Sensory analysis and chemical analysis showed high correlation between the typical descriptors for oxidation such as cardboard, metallic taste, and boiled milk and specific chemical markers for oxidation such as hexanal. Notably, primary oxidation products (i.e., lipid hydroperoxides) and even the tendency of formation of radicals as measured by electron spin resonance spectroscopy were also highly correlated to the sensory descriptors for oxidation. Electron spin resonance spectroscopy should accordingly be further explored as a routine...

  4. CarSim: Automatic 3D Scene Generation of a Car Accident Description

    OpenAIRE

    Egges, A.; Nijholt, A.; Nugues, P.

    2001-01-01

    The problem of generating a 3D simulation of a car accident from a written description can be divided into two subtasks: the linguistic analysis and the virtual scene generation. As a means of communication between these two system parts, we designed a template formalism to represent a written accident report. The CarSim system processes formal descriptions of accidents and creates corresponding 3D simulations. A planning component models the trajectories and temporal values of every vehicle ...

  5. Evaluating the Effect of Virtual Reality Temporal Bone Simulation on Mastoidectomy Performance: A Meta-analysis.

    Science.gov (United States)

    Lui, Justin T; Hoy, Monica Y

    2017-06-01

    Background The increasing prevalence of virtual reality simulation in temporal bone surgery warrants an investigation to assess training effectiveness. Objectives To determine if temporal bone simulator use improves mastoidectomy performance. Data Sources Ovid Medline, Embase, and PubMed databases were systematically searched per the PRISMA guidelines. Review Methods Inclusion criteria were peer-reviewed publications that utilized quantitative data of mastoidectomy performance following the use of a temporal bone simulator. The search was restricted to human studies published in English. Studies were excluded if they were in non-peer-reviewed format, were descriptive in nature, or failed to provide surgical performance outcomes. Meta-analysis calculations were then performed. Results A meta-analysis based on the random-effects model revealed an improvement in overall mastoidectomy performance following training on the temporal bone simulator. A standardized mean difference of 0.87 (95% CI, 0.38-1.35) was generated in the setting of a heterogeneous study population (I² = 64.3%, P virtual reality simulation temporal bone surgery studies, meta-analysis calculations demonstrate an improvement in trainee mastoidectomy performance with virtual simulation training.
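    The pooled standardized mean difference and I² statistic quoted in this record come from a random-effects model. A generic DerSimonian-Laird sketch is shown below with illustrative inputs (not the study's data):

    ```python
    import math

    def random_effects_smd(effects, variances):
        """DerSimonian-Laird random-effects pooling of standardized mean
        differences.  Returns (pooled effect, 95% CI, I^2 in percent)."""
        # Fixed-effect weights and pooled estimate
        w = [1.0 / v for v in variances]
        fixed = sum(wi * ei for wi, ei in zip(w, effects)) / sum(w)
        # Cochran's Q and the between-study variance tau^2
        q = sum(wi * (ei - fixed) ** 2 for wi, ei in zip(w, effects))
        df = len(effects) - 1
        c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
        tau2 = max(0.0, (q - df) / c)
        # Random-effects weights and pooled estimate
        w_star = [1.0 / (v + tau2) for v in variances]
        pooled = sum(wi * ei for wi, ei in zip(w_star, effects)) / sum(w_star)
        se = math.sqrt(1.0 / sum(w_star))
        i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
        return pooled, (pooled - 1.96 * se, pooled + 1.96 * se), i2

    # Hypothetical per-study SMDs and their variances
    pooled, ci, i2 = random_effects_smd([0.5, 1.2, 0.9], [0.04, 0.09, 0.06])
    ```

    I² expresses the share of total variability attributable to between-study heterogeneity, which is why the abstract reports it alongside the pooled SMD.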

  6. Description of occupant behaviour in building energy simulation: state-of-art and concepts for improvements

    DEFF Research Database (Denmark)

    Fabi, Valentina; Andersen, Rune Vinther; Corgnati, Stefano Paolo

    2011-01-01

    ...of basic assumptions that affect the results. Therefore, the calculated energy performance may differ significantly from the real energy consumption. One of the key reasons is the current inability to properly model occupant behaviour and to quantify the associated uncertainties in building performance predictions. By consequence, a better description of parameters related to occupant behaviour is highly required. In this paper, the state of the art in occupant behaviour modelling within energy simulation tools is analysed and some concepts related to possible improvements of simulation tools are proposed...

  7. Combining a building simulation with energy systems analysis to assess the benefits of natural ventilation

    DEFF Research Database (Denmark)

    Oropeza-Perez, Ivan; Østergaard, Poul Alberg; Remmen, Arne

    2013-01-01

    ...a thermal air flow simulation program - into the energy systems analysis model. Descriptions of the energy systems in two geographical locations, i.e. Mexico and Denmark, are set up as inputs. Then, the assessment is done by calculating the energy impacts as well as environmental benefits in the energy...

  8. APROS 3-D core models for simulators and plant analyzers

    International Nuclear Information System (INIS)

    Puska, E.K.

    1999-01-01

    The 3-D core models of the APROS simulation environment can be used in simulator and plant analyzer applications, as well as in safety analysis. The key feature of the APROS models is that the same physical models can be used in all applications. For three-dimensional reactor cores the APROS models cover both quadratic BWR and PWR cores and the hexagonal lattice VVER-type cores. In the APROS environment the user can select the number of flow channels in the core and either a five- or six-equation thermal hydraulic model for these channels. The thermal hydraulic model and the channel description have a decisive effect on the calculation time of the 3-D core model, and thus these selections currently make the major difference between a safety analysis model and a training simulator model. The paper presents examples of various types of 3-D LWR-type core descriptions for simulator and plant analyzer use and discusses the differences in calculation speed and physical results between a typical safety analysis model description and a real-time simulator model description in transients. (author)

  9. Critical slowing down and error analysis in lattice QCD simulations

    Energy Technology Data Exchange (ETDEWEB)

    Virotta, Francesco

    2012-02-21

    In this work we investigate the critical slowing down of lattice QCD simulations. We perform a preliminary study in the quenched approximation where we find that our estimate of the exponential auto-correlation time scales as τ_exp(a) ∝ a^(-5), where a is the lattice spacing. In unquenched simulations with O(a) improved Wilson fermions we do not obtain a scaling law but find results compatible with the behavior that we find in the pure gauge theory. The discussion is supported by a large set of ensembles both in pure gauge and in the theory with two degenerate sea quarks. We have moreover investigated the effect of slow algorithmic modes in the error analysis of the expectation value of typical lattice QCD observables (hadronic matrix elements and masses). In the context of simulations affected by slow modes we propose and test a method to obtain reliable estimates of statistical errors. The method is supposed to help in the typical algorithmic setup of lattice QCD, namely when the total statistics collected is of O(10)τ_exp. This is the typical case when simulating close to the continuum limit where the computational costs for producing two independent data points can be extremely large. We finally discuss the scale setting in N_f = 2 simulations using the Kaon decay constant f_K as physical input. The method is explained together with a thorough discussion of the error analysis employed. A description of the publicly available code used for the error analysis is included.

  10. Critical slowing down and error analysis in lattice QCD simulations

    International Nuclear Information System (INIS)

    Virotta, Francesco

    2012-01-01

    In this work we investigate the critical slowing down of lattice QCD simulations. We perform a preliminary study in the quenched approximation where we find that our estimate of the exponential auto-correlation time scales as τ_exp(a) ∝ a^(-5), where a is the lattice spacing. In unquenched simulations with O(a) improved Wilson fermions we do not obtain a scaling law but find results compatible with the behavior that we find in the pure gauge theory. The discussion is supported by a large set of ensembles both in pure gauge and in the theory with two degenerate sea quarks. We have moreover investigated the effect of slow algorithmic modes in the error analysis of the expectation value of typical lattice QCD observables (hadronic matrix elements and masses). In the context of simulations affected by slow modes we propose and test a method to obtain reliable estimates of statistical errors. The method is supposed to help in the typical algorithmic setup of lattice QCD, namely when the total statistics collected is of O(10)τ_exp. This is the typical case when simulating close to the continuum limit where the computational costs for producing two independent data points can be extremely large. We finally discuss the scale setting in N_f = 2 simulations using the Kaon decay constant f_K as physical input. The method is explained together with a thorough discussion of the error analysis employed. A description of the publicly available code used for the error analysis is included.
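    The integrated autocorrelation time that underlies the error analysis described in this record can be estimated from a Monte Carlo history with a self-consistent window. The sketch below is in the spirit of the Madras-Sokal automatic window; it is a generic illustration, not the publicly available code the abstract mentions.

    ```python
    import numpy as np

    def integrated_autocorr_time(x, c=5.0, max_lag=1000):
        """Windowed estimate of the integrated autocorrelation time tau_int.

        Sums the normalized autocorrelation function up to the first lag W
        satisfying W >= c * tau (the self-consistent window criterion).
        """
        x = np.asarray(x, dtype=float) - np.mean(x)
        n = len(x)
        var = (x @ x) / n
        tau = 0.5
        for t in range(1, min(max_lag, n)):
            rho = (x[:n - t] @ x[t:]) / ((n - t) * var)  # ACF at lag t
            tau += rho
            if t >= c * tau:                             # window condition met
                break
        return tau
    ```

    For an uncorrelated chain tau_int is 0.5 in this convention; for an AR(1) process with coefficient phi it approaches 0.5 * (1 + phi) / (1 - phi), which makes a convenient check. The statistical error of an observable then inflates by a factor sqrt(2 * tau_int) relative to the naive estimate.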

  11. Supplemental description of ROSA-IV/LSTF with No.1 simulated fuel-rod assembly

    International Nuclear Information System (INIS)

    1989-09-01

    Forty-two integral simulation tests of PWR small break LOCA (loss-of-coolant accident) and transient were conducted at the ROSA-IV Large-Scale Test Facility (LSTF) with the No.1 simulated fuel-rod assembly between March 1985 and August 1988. Described in the report are supplemental information on modifications of the system hardware and measuring systems, results of system characteristics tests including the initial fluid mass inventory and heat loss distribution for the primary system, and thermal properties for the heater rod materials. These are necessary to establish the correct boundary conditions of each LSTF experiment with the No.1 core assembly in addition to the system data given in the system description report (JAERI-M 84-237). (author)

  12. Teaching Workflow Analysis and Lean Thinking via Simulation: A Formative Evaluation

    Science.gov (United States)

    Campbell, Robert James; Gantt, Laura; Congdon, Tamara

    2009-01-01

    This article presents the rationale for the design and development of a video simulation used to teach lean thinking and workflow analysis to health services and health information management students enrolled in a course on the management of health information. The discussion includes a description of the design process, a brief history of the use of simulation in healthcare, and an explanation of how video simulation can be used to generate experiential learning environments. Based on the results of a survey given to 75 students as part of a formative evaluation, the video simulation was judged effective because it allowed students to visualize a real-world process (concrete experience), contemplate the scenes depicted in the video along with the concepts presented in class in a risk-free environment (reflection), develop hypotheses about why problems occurred in the workflow process (abstract conceptualization), and develop solutions to redesign a selected process (active experimentation). PMID:19412533

  13. Description and Simulation of a Fast Packet Switch Architecture for Communication Satellites

    Science.gov (United States)

    Quintana, Jorge A.; Lizanich, Paul J.

    1995-01-01

    The NASA Lewis Research Center has been developing the architecture for a multichannel communications signal processing satellite (MCSPS) as part of a flexible, low-cost meshed-VSAT (very small aperture terminal) network. The MCSPS architecture is based on a multifrequency, time-division-multiple-access (MF-TDMA) uplink and a time-division multiplex (TDM) downlink. There are eight uplink MF-TDMA beams, and eight downlink TDM beams, with eight downlink dwells per beam. The information-switching processor, which decodes, stores, and transmits each packet of user data to the appropriate downlink dwell onboard the satellite, has been fully described by using VHSIC (Very High Speed Integrated-Circuit) Hardware Description Language (VHDL). This VHDL code, which was developed in-house to simulate the information switching processor, showed that the architecture is both feasible and viable. This paper describes a shared-memory-per-beam architecture, its VHDL implementation, and the simulation efforts.

  14. The role of XML in the CMS detector description

    International Nuclear Information System (INIS)

    Liendl, M.; Lingen, F.van; Todorov, T.; Arce, P.; Furtjes, A.; Innocente, V.; Roeck, A. de; Case, M.

    2001-01-01

    Offline software applications such as Simulation, Reconstruction, Analysis, and Visualisation are all in need of a detector description. These applications have several common but also many specific requirements for the detector description in order to build up their internal representations. To achieve this in a consistent and coherent manner a common source of information, the detector description database, will be consulted by each of the applications. The role and suitability of XML in the design of the detector description database in the scope of the CMS detector at the LHC is discussed. Different aspects such as data modelling capabilities of XML, tool support, integration to C++ representations of data models are treated and recent results of prototype implementations are presented

  15. Evaluating best educational practices, student satisfaction, and self-confidence in simulation: A descriptive study.

    Science.gov (United States)

    Zapko, Karen A; Ferranto, Mary Lou Gemma; Blasiman, Rachael; Shelestak, Debra

    2018-01-01

    The National League for Nursing (NLN) has endorsed simulation as a necessary teaching approach to prepare students for the demanding role of professional nursing. Questions arise about the suitability of simulation experiences to educate students. Empirical support for the effect of simulation on patient outcomes is sparse. Most studies on simulation report only anecdotal results rather than data obtained using evaluative tools. The aim of this study was to examine student perception of best educational practices in simulation and to evaluate their satisfaction and self-confidence in simulation. This study was a descriptive study designed to explore students' perceptions of the simulation experience over a two-year period. Using the Jeffries framework, a Simulation Day was designed consisting of serial patient simulations using high and medium fidelity simulators and live patient actors. The setting for the study was a regional campus of a large Midwestern Research 2 university. The convenience sample consisted of 199 participants and included sophomore, junior, and senior nursing students enrolled in the baccalaureate nursing program. The Simulation Days consisted of serial patient simulations using high and medium fidelity simulators and live patient actors. Participants rotated through four scenarios that corresponded to their level in the nursing program. Data was collected in two consecutive years. Participants completed both the Educational Practices Questionnaire (Student Version) and the Student Satisfaction and Self-Confidence in Learning Scale. Results provide strong support for using serial simulation as a learning tool. Students were satisfied with the experience, felt confident in their performance, and felt the simulations were based on sound educational practices and were important for learning. Serial simulations and having students experience simulations more than once in consecutive years is a valuable method of clinical instruction. When

  16. Intensive care nurses' perceptions of simulation-based team training for building patient safety in intensive care: a descriptive qualitative study.

    Science.gov (United States)

    Ballangrud, Randi; Hall-Lord, Marie Louise; Persenius, Mona; Hedelin, Birgitta

    2014-08-01

    To describe intensive care nurses' perceptions of simulation-based team training for building patient safety in intensive care. Failures in team processes are found to be contributory factors to incidents in an intensive care environment. Simulation-based training is recommended as a method to make health-care personnel aware of the importance of team working and to improve their competencies. The study uses a qualitative descriptive design. Individual qualitative interviews were conducted with 18 intensive care nurses from May to December 2009, all of whom had attended a simulation-based team training programme. The interviews were analysed by qualitative content analysis. One main category emerged to illuminate the intensive care nurses' perception: "training increases awareness of clinical practice and acknowledges the importance of structured work in teams". Three generic categories were found: "realistic training contributes to safe care", "reflection and openness motivates learning" and "finding a common understanding of team performance". Simulation-based team training makes intensive care nurses more prepared to care for severely ill patients. Team training creates a common understanding of how to work in teams with regard to patient safety. Copyright © 2014 Elsevier Ltd. All rights reserved.

  17. The X-Ray Pebble Recirculation Experiment (X-PREX): Facility Description, Preliminary Discrete Element Method Simulation Validation Studies, and Future Test Program

    International Nuclear Information System (INIS)

    Laufer, Michael R.; Bickel, Jeffrey E.; Buster, Grant C.; Krumwiede, David L.; Peterson, Per F.

    2014-01-01

    This paper presents a facility description, preliminary results, and future test program of the new X-Ray Pebble Recirculation Experiment (X-PREX), which is now operational and being used to collect data on the behavior of slow dense granular flows relevant to pebble bed reactor core designs. The X-PREX facility uses digital x-ray tomography methods to track both the translational and rotational motion of spherical pebbles, which provides unique experimental results that can be used to validate discrete element method (DEM) simulations of pebble motion. The validation effort supported by the X-PREX facility provides a means to build confidence in analysis of pebble bed configuration and residence time distributions that impact the neutronics, thermal hydraulics, and safety analysis of pebble bed reactor cores. Preliminary experimental and DEM simulation results are reported for silo drainage, a classical problem in the granular flow literature, at several hopper angles. These studies include conventional converging and novel diverging geometries that provide additional flexibility in the design of pebble bed reactor cores. Excellent agreement is found between the X-PREX experimental and DEM simulation results. Finally, this paper discusses additional studies in progress relevant to the design and analysis of pebble bed reactor cores including pebble recirculation in cylindrical core geometries and evaluation of forces on shut down blades inserted directly into a packed pebble bed. (author)

  18. Intensive care nursing students' perceptions of simulation for learning confirming communication skills: A descriptive qualitative study.

    Science.gov (United States)

    Karlsen, Marte-Marie Wallander; Gabrielsen, Anita Kristin; Falch, Anne Lise; Stubberud, Dag-Gunnar

    2017-10-01

    The aim of this study was to explore intensive care nursing students' experiences with confirming communication skills training in a simulation-based environment. The study has a qualitative, exploratory and descriptive design. The participants were students in a post-graduate program in intensive care nursing who had attended a one-day confirming communication course. Three focus group interviews lasting between 60 and 80 minutes were conducted with 14 participants. The interviews were transcribed verbatim. Thematic analysis was performed using Braun & Clarke's seven steps. The analysis resulted in three main themes: "awareness", "ice-breaker" and "challenging learning environment". The participants felt that it was a challenge to see themselves on the video recordings afterwards; however, receiving feedback resulted in better self-confidence in mastering complex communication. The main finding of the study is that the students reported improved communication skills after the confirming communication course. However, it is uncertain how these skills can be transferred to clinical practice to improve patient outcomes. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. System description and analysis. Part 1: Feasibility study for helicopter/VTOL wide-angle simulation image generation display system

    Science.gov (United States)

    1977-01-01

    A preliminary design for a helicopter/VSTOL wide angle simulator image generation display system is studied. The visual system is to become part of a simulator capability to support Army aviation systems research and development within the near term. As required for the Army to simulate a wide range of aircraft characteristics, versatility and ease of changing cockpit configurations were primary considerations of the study. Due to the Army's interest in low altitude flight and descents into and landing in constrained areas, particular emphasis is given to wide field of view, resolution, brightness, contrast, and color. The visual display study includes a preliminary design, demonstrated feasibility of advanced concepts, and a plan for subsequent detail design and development. Analysis and tradeoff considerations for various visual system elements are outlined and discussed.

  20. Application of descriptive statistics in analysis of experimental data

    OpenAIRE

    Mirilović Milorad; Pejin Ivana

    2008-01-01

    Statistics today represent a group of scientific methods for the quantitative and qualitative investigation of variations in mass appearances. In fact, statistics present a group of methods that are used for the accumulation, analysis, presentation and interpretation of data necessary for reaching certain conclusions. Statistical analysis is divided into descriptive statistical analysis and inferential statistics. The values which represent the results of an experiment, and which are the subj...
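    A minimal sketch of the descriptive side of this split, using only Python's standard library (the sample values are made up for illustration):

```python
import statistics

# Descriptive statistics summarize a sample; inferential statistics would
# go on to test hypotheses about the population the sample was drawn from.
data = [4.1, 4.8, 5.0, 5.2, 5.9, 6.3, 7.0]

summary = {
    "n": len(data),
    "mean": statistics.mean(data),
    "median": statistics.median(data),
    "stdev": statistics.stdev(data),   # sample standard deviation
    "range": max(data) - min(data),
}
```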

  1. Job Analysis, Job Descriptions, and Performance Appraisal Systems.

    Science.gov (United States)

    Sims, Johnnie M.; Foxley, Cecelia H.

    1980-01-01

    Job analysis, job descriptions, and performance appraisal can benefit student services administration in many ways. Involving staff members in the development and implementation of these techniques can increase commitment to and understanding of the overall objectives of the office, as well as communication and cooperation among colleagues.…

  2. A CAD based geometry model for simulation and analysis of particle detector data

    Energy Technology Data Exchange (ETDEWEB)

    Milde, Michael; Losekamm, Martin; Poeschl, Thomas; Greenwald, Daniel; Paul, Stephan [Technische Universitaet Muenchen, 85748 Garching (Germany)

    2016-07-01

    The development of a new particle detector requires a good understanding of its setup. A detailed model of the detector's geometry is not only needed during construction, but also for simulation and data analysis. To arrive at a consistent description of the detector geometry a representation is needed that can be easily implemented in different software tools used during data analysis. We developed a geometry representation based on CAD files that can be easily used within the Geant4 simulation framework and analysis tools based on the ROOT framework. This talk presents the structure of the geometry model and shows its implementation using the example of the event reconstruction developed for the Multi-purpose Active-target Particle Telescope (MAPT). The detector consists of scintillating plastic fibers and can be used as a tracking detector and calorimeter with omnidirectional acceptance. To optimize the angular resolution and the energy reconstruction of measured particles, a detailed detector model is needed at all stages of the reconstruction.

  3. Simulation of forced-ventilation fires

    International Nuclear Information System (INIS)

    Krause, F.R.; Gregory, W.S.

    1982-01-01

    Fire hazard descriptions and compartment fire models are assessed as input to airflow network analysis methods that simulate the exposure of ventilation system components to fire products. The assessment considered the availability of hazard descriptions and models for predicting simultaneous heat and mass release at special compartment openings that are characterized by a one-dimensional and controllable volumetric flux

  4. An Integrated Model for Computer Aided Reservoir Description : from Outcrop Study to Fluid Flow Simulations Un logiciel intégré pour une description des gisements assistée par ordinateur : de l'étude d'un affleurement aux simulations de l'écoulement des fluides

    Directory of Open Access Journals (Sweden)

    Guerillot D.

    2006-11-01

    An accurate understanding of the internal architecture of a reservoir is required to improve reservoir management for oil recovery. Geostatistical methods give an image of this architecture. The purpose of this paper is to show how this lithological description can be used for reservoir simulation. For this purpose, scale-averaging problems must be solved for non-additive variables. A method giving a full effective permeability matrix is proposed. The integrated software described here starts from core analysis and lithologic logs to provide data for reservoir simulators. Each of the steps of this interactive, graphical system is explained here.
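    The non-additivity of permeability mentioned in the abstract is commonly shown with the classic layered-medium example: the effective value depends on the flow direction, so no single average applies. A minimal sketch with toy thicknesses and permeabilities (not the paper's full-matrix method):

```python
def effective_permeability_1d(layers):
    """Upscaling sketch for a stack of homogeneous layers, given as
    (thickness, permeability) pairs in arbitrary consistent units.
    Flow along the layers averages arithmetically (thickness-weighted);
    flow across them averages harmonically."""
    H = sum(h for h, _ in layers)
    k_parallel = sum(h * k for h, k in layers) / H   # along the layers
    k_series = H / sum(h / k for h, k in layers)     # across the layers
    return k_parallel, k_series

# One permeable and one tight layer of equal thickness.
kp, ks = effective_permeability_1d([(1.0, 100.0), (1.0, 1.0)])
```

    The harmonic mean is always at most the arithmetic mean, which is why a tight layer dominates flow across a stack but barely affects flow along it.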

  5. Virtual Environment Computer Simulations to Support Human Factors Engineering and Operations Analysis for the RLV Program

    Science.gov (United States)

    Lunsford, Myrtis Leigh

    1998-01-01

    The Army-NASA Virtual Innovations Laboratory (ANVIL) was recently created to provide virtual reality tools for performing Human Engineering and operations analysis for both NASA and the Army. The author's summer research project consisted of developing and refining these tools for NASA's Reusable Launch Vehicle (RLV) program. Several general simulations were developed for use by the ANVIL for the evaluation of the X34 Engine Changeout procedure. These simulations were developed with the software tool dVISE 4.0.0 produced by Division Inc. All software was run on an SGI Indigo2 High Impact. This paper describes the simulations, various problems encountered with the simulations, other summer activities, and possible work for the future. We first begin with a brief description of virtual reality systems.

  6. TASAC a computer program for thermal analysis of severe accident conditions. Version 3/01, Dec 1991. Model description and user's guide

    International Nuclear Information System (INIS)

    Stempniewicz, M.; Marks, P.; Salwa, K.

    1992-06-01

    TASAC (Thermal Analysis of Severe Accident Conditions) is a computer code developed at the Institute of Atomic Energy and written in FORTRAN 77 for the digital computer analysis of PWR rod bundle behaviour during severe accident conditions. The code has the ability to model an early stage of core degradation, including heat transfer inside the rods, convective and radiative heat exchange, cladding interactions with coolant and fuel, hydrogen generation, and melting, relocation, and refreezing of fuel rod materials with dissolution of UO2 and ZrO2 in the liquid phase. The code was applied to the simulation of International Standard Problem number 28, performed on the PHEBUS test facility. This report contains a description of the program's physical models, a detailed description of the input data requirements, and the results of code verification. The main directions for future TASAC code development are formulated. (author). 20 refs, 39 figs, 4 tabs

  7. High-Alpha Research Vehicle (HARV) longitudinal controller: Design, analyses, and simulation results

    Science.gov (United States)

    Ostroff, Aaron J.; Hoffler, Keith D.; Proffitt, Melissa S.; Brown, Philip W.; Phillips, Michael R.; Rivers, Robert A.; Messina, Michael D.; Carzoo, Susan W.; Bacon, Barton J.; Foster, John F.

    1994-01-01

    This paper describes the design, analysis, and nonlinear simulation results (batch and piloted) for a longitudinal controller which is scheduled to be flight-tested on the High-Alpha Research Vehicle (HARV). The HARV is an F-18 airplane modified for and equipped with multi-axis thrust vectoring. The paper includes a description of the facilities, a detailed review of the feedback controller design, linear analysis results of the feedback controller, a description of the feed-forward controller design, nonlinear batch simulation results, and piloted simulation results. Batch simulation results include maximum pitch stick agility responses, angle-of-attack (alpha) captures, and alpha regulation for full lateral stick rolls at several alphas. Piloted simulation results include task descriptions for several types of maneuvers, task guidelines, the corresponding Cooper-Harper ratings from three test pilots, and some pilot comments. The ratings show that desirable criteria are achieved for almost all of the piloted simulation tasks.

  8. Stochastic modeling analysis and simulation

    CERN Document Server

    Nelson, Barry L

    1995-01-01

    A coherent introduction to the techniques for modeling dynamic stochastic systems, this volume also offers a guide to the mathematical, numerical, and simulation tools of systems analysis. Suitable for advanced undergraduates and graduate-level industrial engineers and management science majors, it proposes modeling systems in terms of their simulation, regardless of whether simulation is employed for analysis. Beginning with a view of the conditions that permit a mathematical-numerical analysis, the text explores Poisson and renewal processes, Markov chains in discrete and continuous time, se
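    One of the topics this volume covers, Markov chains in discrete time, lends itself to a short simulation sketch; the two-state transition matrix below is illustrative only:

```python
import random

def simulate_chain(P, start, steps, seed=0):
    """Simulate a discrete-time Markov chain with transition matrix P
    (rows sum to 1) and return the visit frequency of each state."""
    rng = random.Random(seed)
    state, visits = start, [0] * len(P)
    for _ in range(steps):
        visits[state] += 1
        u, acc = rng.random(), 0.0
        for j, p in enumerate(P[state]):
            acc += p
            if u < acc:
                state = j
                break
    return [v / steps for v in visits]

# Two-state chain whose stationary distribution is (0.75, 0.25):
# balance gives pi_0 * 0.1 = pi_1 * 0.3.
P = [[0.9, 0.1],
     [0.3, 0.7]]
freqs = simulate_chain(P, start=0, steps=100_000)
```

    For long runs the visit frequencies approach the stationary distribution, which is the empirical counterpart of the analytical treatment in texts like this one.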

  9. Analysis of laparoscopic port site complications: A descriptive study

    Directory of Open Access Journals (Sweden)

    Somu Karthik

    2013-01-01

    Context: The rate of port site complications following conventional laparoscopic surgery is about 21 per 100,000 cases. It has shown a proportional rise with increase in the size of the port site incision and trocar. Although rare, complications that occur at the port site include infection, bleeding, and port site hernia. Aims: To determine the morbidity associated with ports at the site of their insertion in laparoscopic surgery and to identify risk factors for complications. Settings and Design: Prospective descriptive study. Materials and Methods: In the present descriptive study, a total of 570 patients who underwent laparoscopic surgeries for various ailments between August 2009 and July 2011 at our institute were observed for port site complications prospectively and the complications were reviewed. Statistical Analysis Used: Descriptive statistical analysis was carried out in the present study. The statistical software, namely, SPSS 15.0 was used for the analysis of the data. Results: Of the 570 patients undergoing laparoscopic surgery, 17 (3%) had developed complications specifically related to the port site during a minimum follow-up of three months; port site infection (PSI) was the most frequent (n = 10, 1.8%), followed by port site bleeding (n = 4, 0.7%), omentum-related complications (n = 2; 0.35%), and port site metastasis (n = 1, 0.175%). Conclusions: Laparoscopic surgeries are associated with minimal port site complications. Complications are related to the increased number of ports. Umbilical port involvement is the commonest. Most complications are manageable with minimal morbidity, and can be further minimized with meticulous surgical technique during entry and exit.

  10. Analysis of laparoscopic port site complications: A descriptive study

    Science.gov (United States)

    Karthik, Somu; Augustine, Alfred Joseph; Shibumon, Mundunadackal Madhavan; Pai, Manohar Varadaraya

    2013-01-01

    CONTEXT: The rate of port site complications following conventional laparoscopic surgery is about 21 per 100,000 cases. It has shown a proportional rise with increase in the size of the port site incision and trocar. Although rare, complications that occur at the port site include infection, bleeding, and port site hernia. AIMS: To determine the morbidity associated with ports at the site of their insertion in laparoscopic surgery and to identify risk factors for complications. SETTINGS AND DESIGN: Prospective descriptive study. MATERIALS AND METHODS: In the present descriptive study, a total of 570 patients who underwent laparoscopic surgeries for various ailments between August 2009 and July 2011 at our institute were observed for port site complications prospectively and the complications were reviewed. STATISTICAL ANALYSIS USED: Descriptive statistical analysis was carried out in the present study. The statistical software, namely, SPSS 15.0 was used for the analysis of the data. RESULTS: Of the 570 patients undergoing laparoscopic surgery, 17 (3%) had developed complications specifically related to the port site during a minimum follow-up of three months; port site infection (PSI) was the most frequent (n = 10, 1.8%), followed by port site bleeding (n = 4, 0.7%), omentum-related complications (n = 2; 0.35%), and port site metastasis (n = 1, 0.175%). CONCLUSIONS: Laparoscopic surgeries are associated with minimal port site complications. Complications are related to the increased number of ports. Umbilical port involvement is the commonest. Most complications are manageable with minimal morbidity, and can be further minimized with meticulous surgical technique during entry and exit. PMID:23741110

  11. Description of the artificial parameters in EGS4-Monte Carlo simulation and their influence on the absorbed depth dose from electrons in water

    International Nuclear Information System (INIS)

    Sandborg, M.; Alm Carlsson, G.

    1990-01-01

    This report describes the background of the EGS4 Monte Carlo code. It gives a short description of the interaction between electrons and matter and of the artificial parameters used in EGS4 Monte Carlo simulations. It also gives advice on choosing the right artificial parameters. (K.A.E)

  12. Development of BWR [boiling water reactor] and PWR [pressurized water reactor] event descriptions for nuclear facility simulator training

    International Nuclear Information System (INIS)

    Carter, R.J.; Bovell, C.R.

    1987-01-01

    A number of tools that can aid nuclear facility training developers in designing realistic simulator scenarios have been developed. This paper describes each of the tools, i.e., event lists, events-by-competencies matrices, and event descriptions, and illustrates how the tools can be used to construct scenarios

  13. Hardware description languages

    Science.gov (United States)

    Tucker, Jerry H.

    1994-01-01

    Hardware description languages are special purpose programming languages. They are primarily used to specify the behavior of digital systems and are rapidly replacing traditional digital system design techniques. This is because they allow the designer to concentrate on how the system should operate rather than on implementation details. Hardware description languages allow a digital system to be described with a wide range of abstraction, and they support top down design techniques. A key feature of any hardware description language environment is its ability to simulate the modeled system. The two most important hardware description languages are Verilog and VHDL. Verilog has been the dominant language for the design of application specific integrated circuits (ASIC's). However, VHDL is rapidly gaining in popularity.

  14. Robotics/Automated Systems Task Analysis and Description of Required Job Competencies Report. Task Analysis and Description of Required Job Competencies of Robotics/Automated Systems Technicians.

    Science.gov (United States)

    Hull, Daniel M.; Lovett, James E.

    This task analysis report for the Robotics/Automated Systems Technician (RAST) curriculum project first provides a RAST job description. It then discusses the task analysis, including the identification of tasks, the grouping of tasks according to major areas of specialty, and the comparison of the competencies to existing or new courses to…

  15. Correlation of Descriptive Analysis and Instrumental Puncture Testing of Watermelon Cultivars.

    Science.gov (United States)

    Shiu, J W; Slaughter, D C; Boyden, L E; Barrett, D M

    2016-06-01

    The textural properties of 5 seedless watermelon cultivars were assessed by descriptive analysis and the standard puncture test using a hollow probe with increased shearing properties. The use of descriptive analysis methodology was an effective means of quantifying watermelon sensory texture profiles for characterizing specific cultivars' characteristics. Of the 10 cultivars screened, 71% of the variation in the sensory attributes was measured using the first two principal components. Pairwise correlation of the hollow puncture probe and sensory parameters determined that initial slope, maximum force, and work after maximum force measurements all correlated well to the sensory attributes crisp and firm. These findings confirm that maximum force correlates well with not only firmness in watermelon, but crispness as well. The initial slope parameter also captures the sensory crispness of watermelon, but is not as practical to measure in the field as maximum force. The work after maximum force parameter is thought to reflect cellular arrangement and membrane integrity that in turn impact sensory firmness and crispness. Watermelon cultivar types were correctly predicted by puncture test measurements in heart tissue 87% of the time, although descriptive analysis was correct 54% of the time. © 2016 Institute of Food Technologists®
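    The pairwise correlation underlying studies like this one is the plain Pearson coefficient; the force and firmness values below are invented for illustration, not the study's data:

```python
def pearson(xs, ys):
    """Pearson correlation coefficient, computed from first principles."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical data: puncture-test maximum force (N) vs. panel firmness score.
max_force = [8.2, 9.1, 10.4, 11.0, 12.3, 13.5]
firmness  = [3.1, 3.6,  4.5,  4.9,  5.8,  6.2]
r = pearson(max_force, firmness)
```

    A value of r near 1 is what "maximum force correlates well with firmness" means quantitatively; the study reports the same pattern for crispness.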

  16. TASAC a computer program for thermal analysis of severe accident conditions. Version 3/01, Dec 1991. Model description and user's guide

    Energy Technology Data Exchange (ETDEWEB)

    Stempniewicz, M; Marks, P; Salwa, K

    1992-06-01

    TASAC (Thermal Analysis of Severe Accident Conditions) is a computer code developed at the Institute of Atomic Energy and written in FORTRAN 77 for the digital computer analysis of PWR rod bundle behaviour during severe accident conditions. The code has the ability to model an early stage of core degradation, including heat transfer inside the rods, convective and radiative heat exchange, cladding interactions with coolant and fuel, hydrogen generation, and melting, relocation, and refreezing of fuel rod materials with dissolution of UO{sub 2} and ZrO{sub 2} in the liquid phase. The code was applied to the simulation of International Standard Problem number 28, performed on the PHEBUS test facility. This report contains a description of the program's physical models, a detailed description of the input data requirements, and the results of code verification. The main directions for future TASAC code development are formulated. (author). 20 refs, 39 figs, 4 tabs.

  17. Simulation modeling and analysis with Arena

    CERN Document Server

    Altiok, Tayfur

    2007-01-01

    Simulation Modeling and Analysis with Arena is a highly readable textbook which treats the essentials of the Monte Carlo discrete-event simulation methodology, and does so in the context of the popular Arena simulation environment. It treats simulation modeling as an in-vitro laboratory that facilitates the understanding of complex systems and experimentation with what-if scenarios in order to estimate their performance metrics. The book contains chapters on the simulation modeling methodology and the underpinnings of discrete-event systems, as well as the relevant underlying probability, statistics, stochastic processes, input analysis, model validation and output analysis. All simulation-related concepts are illustrated in numerous Arena examples, encompassing production lines, manufacturing and inventory systems, transportation systems, and computer information systems in networked settings.· Introduces the concept of discrete event Monte Carlo simulation, the most commonly used methodology for modeli...
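    A minimal discrete-event-flavoured sketch, in Python rather than Arena: a single-server M/M/1 queue advanced customer by customer via the Lindley recursion (the rates and customer count are arbitrary illustrative choices):

```python
import random

def mm1_mean_time_in_system(lam, mu, n_customers, seed=1):
    """Single-server queue with exponential interarrival (rate lam) and
    service (rate mu) times. Each customer starts service when both they
    and the server are available; returns the mean time in system."""
    rng = random.Random(seed)
    t_arrive, server_free, total = 0.0, 0.0, 0.0
    for _ in range(n_customers):
        t_arrive += rng.expovariate(lam)
        start = max(t_arrive, server_free)       # wait if server is busy
        server_free = start + rng.expovariate(mu)
        total += server_free - t_arrive          # waiting + service time
    return total / n_customers

# Theory predicts mean time in system 1/(mu - lam) = 2.0 for these rates.
w = mm1_mean_time_in_system(lam=0.5, mu=1.0, n_customers=200_000)
```

    Comparing the simulated mean against the closed-form M/M/1 result is exactly the kind of output-analysis exercise the book builds Arena models around.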

  18. Equilibration and analysis of first-principles molecular dynamics simulations of water

    Science.gov (United States)

    Dawson, William; Gygi, François

    2018-03-01

    First-principles molecular dynamics (FPMD) simulations based on density functional theory are becoming increasingly popular for the description of liquids. In view of the high computational cost of these simulations, the choice of an appropriate equilibration protocol is critical. We assess two methods of estimation of equilibration times using a large dataset of first-principles molecular dynamics simulations of water. The Gelman-Rubin potential scale reduction factor [A. Gelman and D. B. Rubin, Stat. Sci. 7, 457 (1992)] and the marginal standard error rule heuristic proposed by White [Simulation 69, 323 (1997)] are evaluated on a set of 32 independent 64-molecule simulations of 58 ps each, amounting to a combined cumulative time of 1.85 ns. The availability of multiple independent simulations also allows for an estimation of the variance of averaged quantities, both within MD runs and between runs. We analyze atomic trajectories, focusing on correlations of the Kohn-Sham energy, pair correlation functions, number of hydrogen bonds, and diffusion coefficient. The observed variability across samples provides a measure of the uncertainty associated with these quantities, thus facilitating meaningful comparisons of different approximations used in the simulations. We find that the computed diffusion coefficient and average number of hydrogen bonds are affected by a significant uncertainty in spite of the large size of the dataset used. A comparison with classical simulations using the TIP4P/2005 model confirms that the variability of the diffusivity is also observed after long equilibration times. Complete atomic trajectories and simulation output files are available online for further analysis.
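    The Gelman-Rubin potential scale reduction factor used in this study is straightforward to compute from multiple independent runs. Below is a minimal sketch using synthetic Gaussian "chains" in place of FPMD trajectories:

```python
import random

def gelman_rubin(chains):
    """Potential scale reduction factor (R-hat) for m chains of length n,
    from between-chain (B) and within-chain (W) variances. Values near 1
    indicate the chains sample the same distribution, i.e. equilibration."""
    m, n = len(chains), len(chains[0])
    means = [sum(c) / n for c in chains]
    grand = sum(means) / m
    B = n / (m - 1) * sum((mu - grand) ** 2 for mu in means)
    W = sum(sum((x - mu) ** 2 for x in c) / (n - 1)
            for c, mu in zip(chains, means)) / m
    var_plus = (n - 1) / n * W + B / n
    return (var_plus / W) ** 0.5

rng = random.Random(0)
# Four well-mixed chains sampling the same distribution: R-hat should be ~1.
chains = [[rng.gauss(0.0, 1.0) for _ in range(5000)] for _ in range(4)]
rhat = gelman_rubin(chains)
```

    Chains that have not converged to the same distribution inflate B relative to W and push R-hat well above 1, which is the diagnostic the paper applies to Kohn-Sham energies and structural observables.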

  19. Analysis of laparoscopic port site complications: A descriptive study.

    Science.gov (United States)

    Karthik, Somu; Augustine, Alfred Joseph; Shibumon, Mundunadackal Madhavan; Pai, Manohar Varadaraya

    2013-04-01

    The rate of port site complications following conventional laparoscopic surgery is about 21 per 100,000 cases. It has shown a proportional rise with increase in the size of the port site incision and trocar. Although rare, complications that occur at the port site include infection, bleeding, and port site hernia. To determine the morbidity associated with ports at the site of their insertion in laparoscopic surgery and to identify risk factors for complications. Prospective descriptive study. In the present descriptive study, a total of 570 patients who underwent laparoscopic surgeries for various ailments between August 2009 and July 2011 at our institute were observed for port site complications prospectively and the complications were reviewed. Descriptive statistical analysis was carried out in the present study. The statistical software, namely, SPSS 15.0 was used for the analysis of the data. Of the 570 patients undergoing laparoscopic surgery, 17 (3%) had developed complications specifically related to the port site during a minimum follow-up of three months; port site infection (PSI) was the most frequent (n = 10, 1.8%), followed by port site bleeding (n = 4, 0.7%), omentum-related complications (n = 2; 0.35%), and port site metastasis (n = 1, 0.175%). Laparoscopic surgeries are associated with minimal port site complications. Complications are related to the increased number of ports. Umbilical port involvement is the commonest. Most complications are manageable with minimal morbidity, and can be further minimized with meticulous surgical technique during entry and exit.

  20. Aerospace Toolbox---a flight vehicle design, analysis, simulation ,and software development environment: I. An introduction and tutorial

    Science.gov (United States)

    Christian, Paul M.; Wells, Randy

    2001-09-01

    This paper presents a demonstrated approach to significantly reduce the cost and schedule of non-real-time modeling and simulation, real-time HWIL simulation, and embedded code development. The tool and the methodology presented capitalize on a paradigm that has become a standard operating procedure in the automotive industry. The tool described is known as the Aerospace Toolbox, and it is based on the MathWorks Matlab/Simulink framework, which is a COTS application. Extrapolation of automotive industry data and initial applications in the aerospace industry show that the use of the Aerospace Toolbox can make significant contributions in the quest by NASA and other government agencies to meet aggressive cost reduction goals in development programs. Part I of this paper provides a detailed description of the GUI-based Aerospace Toolbox and how it is used in every step of a development program; from quick prototyping of concept developments that leverage built-in point-of-departure simulations through to detailed design, analysis, and testing. Some of the attributes addressed include its versatility in modeling 3 to 6 degrees of freedom, its library of flight-test-validated models (including physics, environments, hardware, and error sources), and its built-in Monte Carlo capability. Other topics to be covered in this part include flight vehicle models and algorithms, and the covariance analysis package, Navigation System Covariance Analysis Tools (NavSCAT). Part II of this paper, to be published at a later date, will conclude with a description of how the Aerospace Toolbox is an integral part of developing embedded code directly from the simulation models by using the Mathworks Real Time Workshop and optimization tools. It will also address how the Toolbox can be used as a design hub for Internet based collaborative engineering tools such as NASA's Intelligent Synthesis Environment (ISE) and Lockheed Martin's Interactive Missile Design Environment

  1. Graphic Description: The Mystery of Ibn Khafaja's Success in Description

    Directory of Open Access Journals (Sweden)

    جواد رنجبر

    2009-12-01

    Full Text Available Graphic Description: The Mystery of Ibn Khafaja's Success in Description. Ali Bagher Taheriniya*, Javad Ranjbar**. Abstract: Ibn Khafaja is one of the poets and men of letters of Muslim Spain, known as the Sanobari of Spain and one of the masters of description. An analysis of the techniques behind his successful descriptive art can therefore illuminate the way for others. Al-Taswir al-harfi (graphic description) is a term denoting the highest and most detailed descriptive poetry; on this basis, the best descriptive poem is the one closest to a painting. Ibn Khafaja employed what are called the conforming elements of description, namely imagination, feeling, faculty, and dialogue, together with three further elements: an inborn gift for description, an enchanting natural setting, and a comfortable life. This article analyzes the reasons for Ibn Khafaja's success in description and portrait making. Key words: Ibn Khafaja, poetry, description, portrait. * Associate Professor, Bu Ali Sina University of Hamadan, E-mail: bTaheriniya@yahoo.com ** M.A. in Arabic Language and Literature

  2. Optimization and Simulation in Drug Development - Review and Analysis

    DEFF Research Database (Denmark)

    Schjødt-Eriksen, Jens; Clausen, Jens

    2003-01-01

    We give a review of pharmaceutical R&D and mathematical simulation and optimization methods used to support decision making within the pharmaceutical development process. The complex nature of drug development is pointed out through a description of the various phases of the pharmaceutical development process. A part of the paper is dedicated to the use of simulation techniques to support clinical trials. The paper ends with a section describing portfolio modelling methods in the context of the pharmaceutical industry.

  3. Description of the grout system dynamic simulation

    International Nuclear Information System (INIS)

    Zimmerman, B.D.

    1993-07-01

    The grout system dynamic computer simulation was created to allow investigation of the ability of the grouting system to meet established milestones for various assumed system configurations and parameters. The simulation tracks the movement of tank waste through the system over time, from initial storage tanks, through feed tanks and the grout plant, and finally to a grout vault. It properly accounts for: (1) the time required to perform various actions or processes, (2) delays involved in gaining regulatory approval, (3) random system component failures, (4) limitations on equipment capacities, (5) available parallel components, and (6) different possible strategies for vault filling. The user can set a variety of system parameters for each simulation run. Currently, the output of a run consists primarily of a plot of projected grouting campaigns completed versus time, for comparison with milestones; other outputs involving any model component can also be quickly created or deleted as desired. In particular, sensitivity runs, in which the effect of varying a model parameter (flow rates, delay times, number of feed tanks available, etc.) on the ability of the system to meet milestones is examined, can be made easily. The grout system simulation was implemented using the ITHINK* simulation language for Macintosh** computers
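    The flow described above (waste moving from storage tanks through feed tanks and the grout plant to vaults, with random plant failures and capacity limits) can be sketched as a simple time-stepped simulation. All rates, capacities, and the failure probability below are illustrative assumptions, not values from the report:

    ```python
    import random

    random.seed(42)

    # Illustrative parameters (not from the report)
    FEED_RATE = 40.0         # waste moved to the feed tank per day
    GROUT_RATE = 30.0        # plant capacity: waste grouted per day
    VAULT_CAPACITY = 1000.0  # waste per vault; one full vault = one campaign
    FAIL_PROB = 0.02         # daily chance the grout plant is down
    TOTAL_WASTE = 5000.0

    def simulate(days=400):
        """Return the day on which each grouting campaign completed."""
        storage, feed, vault_fill, campaigns = TOTAL_WASTE, 0.0, 0.0, []
        for day in range(days):
            # Storage tanks -> feed tank (limited by remaining inventory)
            moved = min(FEED_RATE, storage)
            storage -= moved
            feed += moved
            # Feed tank -> grout plant -> vault, unless the plant failed today
            if random.random() > FAIL_PROB:
                grouted = min(GROUT_RATE, feed)
                feed -= grouted
                vault_fill += grouted
                if vault_fill >= VAULT_CAPACITY:  # vault full: campaign done
                    vault_fill -= VAULT_CAPACITY
                    campaigns.append(day)
        return campaigns

    print(simulate())  # campaign completion days, for comparison with milestones
    ```

    Plotting campaigns completed versus time, or rerunning with a different `FAIL_PROB` or `FEED_RATE`, mirrors the sensitivity runs the abstract describes.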

  4. Descriptive Research

    DEFF Research Database (Denmark)

    Wigram, Anthony Lewis

    2003-01-01

    Descriptive research is described by Lathom-Radocy and Radocy (1995) to include Survey research, ex post facto research, case studies and developmental studies. Descriptive research also includes a review of the literature in order to provide both quantitative and qualitative evidence of the effect...... starts will allow effect size calculations to be made in order to evaluate effect over time. Given the difficulties in undertaking controlled experimental studies in the creative arts therapies, descriptive research methods offer a way of quantifying effect through descriptive statistical analysis...

  5. CarSim: Automatic 3D Scene Generation of a Car Accident Description

    NARCIS (Netherlands)

    Egges, A.; Nijholt, A.; Nugues, P.

    2001-01-01

    The problem of generating a 3D simulation of a car accident from a written description can be divided into two subtasks: the linguistic analysis and the virtual scene generation. As a means of communication between these two system parts, we designed a template formalism to represent a written

  6. Descriptive analysis of YouTube music therapy videos.

    Science.gov (United States)

    Gooding, Lori F; Gregory, Dianne

    2011-01-01

    The purpose of this study was to conduct a descriptive analysis of music therapy-related videos on YouTube. Preliminary searches using the keywords music therapy, music therapy session, and "music therapy session" resulted in listings of 5000, 767, and 59 videos respectively. The narrowed down listing of 59 videos was divided between two investigators and reviewed in order to determine their relationship to actual music therapy practice. A total of 32 videos were determined to be depictions of music therapy sessions. These videos were analyzed using a 16-item investigator-created rubric that examined both video specific information and therapy specific information. Results of the analysis indicated that audio and visual quality was adequate, while narrative descriptions and identification information were ineffective in the majority of the videos. The top 5 videos (based on the highest number of viewings in the sample) were selected for further analysis in order to investigate demonstration of the Professional Level of Practice Competencies set forth in the American Music Therapy Association (AMTA) Professional Competencies (AMTA, 2008). Four of the five videos met basic competency criteria, with the quality of the fifth video precluding evaluation of content. Of particular interest is the fact that none of the videos included credentialing information. Results of this study suggest the need to consider ways to ensure accurate dissemination of music therapy-related information in the YouTube environment, ethical standards when posting music therapy session videos, and the possibility of creating AMTA standards for posting music therapy related video.

  7. Final safety and hazards analysis for the Battelle LOCA simulation tests in the NRU reactor

    International Nuclear Information System (INIS)

    Axford, D.J.; Martin, I.C.; McAuley, S.J.

    1981-04-01

    This is the final safety and hazards report for the proposed Battelle LOCA simulation tests in NRU. A brief description of equipment test design and operating procedure precedes a safety analysis and hazards review of the project. The hazards review addresses potential equipment failures as well as potential for a metal/water reaction and evaluates the consequences. The operation of the tests as proposed does not present an unacceptable risk to the NRU Reactor, CRNL personnel or members of the public. (author)

  8. A reliability simulation language for reliability analysis

    International Nuclear Information System (INIS)

    Deans, N.D.; Miller, A.J.; Mann, D.P.

    1986-01-01

    The results of work being undertaken to develop a Reliability Description Language (RDL) which will enable reliability analysts to describe complex reliability problems in a simple, clear and unambiguous way are described. Component and system features can be stated in a formal manner and subsequently used, along with control statements to form a structured program. The program can be compiled and executed on a general-purpose computer or special-purpose simulator. (DG)

  9. Analysis of Waves in Space Plasma (WISP) near field simulation and experiment

    Science.gov (United States)

    Richie, James E.

    1992-01-01

    The WISP payload, scheduled for a 1995 space transportation system (shuttle) flight, will include a large power transmitter on board operating over a wide range of frequencies. The levels of electromagnetic interference/electromagnetic compatibility (EMI/EMC) must be addressed to ensure the safety of the shuttle crew. This report is concerned with the simulation and experimental verification of EMI/EMC for the WISP payload in the shuttle cargo bay. The simulations have been carried out using the method of moments for both thin wires and patches to simulate closed solids. Data obtained from simulation are compared with experimental results, and an investigation of the accuracy of the modeling approach is included. The report begins with a description of the WISP experiment, followed by a description of the model used to simulate the cargo bay. The results of the simulation are compared to experimental data on the input impedance of the WISP antenna with the cargo bay present. A discussion of the methods used to verify the accuracy of the model illustrates appropriate ways of obtaining this information. Finally, suggestions for future work are provided.

  10. Manifest domains: analysis and description

    DEFF Research Database (Denmark)

    Bjørner, Dines

    2017-01-01

    _static_attribute, is_dynamic_attribute, is_inert_attribute, is_reactive_attribute, is_active_attribute, is_autonomous_attribute, is_biddable_attribute and is_programmable_attribute. The twist suggests ways of modeling “access” to the values of these kinds of attributes: the static attributes by simply “copying” them...... processes. C.A.R. Hoare series in computer science. Prentice-Hall International, London, 2004). We show how to model essential aspects of perdurants in terms of their signatures based on the concepts of endurants. And we show how one can “compile” descriptions of endurant parts into descriptions...

  11. Research reactor job analysis - A project description

    International Nuclear Information System (INIS)

    Yoder, John; Bessler, Nancy J.

    1988-01-01

    Addressing the need for improved training in the nuclear industry, nuclear utilities established training program guidelines based on Performance-Based Training (PBT) concepts. A comparison of commercial nuclear power facilities with research and test reactors owned by the U.S. Department of Energy (DOE), made in an independent review of personnel selection, training, and qualification requirements for DOE-owned reactors, pointed out that the complexity of the most critical tasks in research reactors is less than that in power reactors. DOE therefore started a project by commissioning Oak Ridge Associated Universities (ORAU) to conduct a job analysis survey of representative research reactor facilities. The output of the project consists of two publications: Volume 1 - Research Reactor Job Analysis: Overview, which contains an Introduction, Project Description, Project Methodology, and An Overview of Performance-Based Training (PBT); and Volume 2 - Research Reactor Job Analysis: Implementation, which contains Guidelines for Application of Preliminary Task Lists and Preliminary Task Lists for Reactor Operators and Supervisory Reactor Operators

  12. Can Raters with Reduced Job Descriptive Information Provide Accurate Position Analysis Questionnaire (PAQ) Ratings?

    Science.gov (United States)

    Friedman, Lee; Harvey, Robert J.

    1986-01-01

    Job-naive raters provided with job descriptive information made Position Analysis Questionnaire (PAQ) ratings which were validated against ratings of job analysts who were also job content experts. None of the reduced job descriptive information conditions enabled job-naive raters to obtain either acceptable levels of convergent validity with…

  13. A computational description of simple mediation analysis

    Directory of Open Access Journals (Sweden)

    Caron, Pier-Olivier

    2018-04-01

    Full Text Available Simple mediation analysis is an increasingly popular statistical analysis in psychology and other social sciences. However, there are very few detailed accounts of the computations within the model; articles more often explain mediation analysis conceptually rather than mathematically. The purpose of the current paper is therefore to introduce the computations behind simple mediation analysis, accompanied by examples in R. First, mediation analysis is described. Then, a method to simulate data in R (with standardized coefficients) is presented. Finally, the bootstrap method, the Sobel test, and the Baron and Kenny test, all used to evaluate mediation (i.e., the indirect effect), are developed. The R code implementing the computations is offered, as well as a script to carry out a power analysis and a complete example.
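    The paper's examples are in R; as an illustration of the same pipeline, a minimal Python sketch (simulating standardized data for a mediation model X → M → Y, estimating the indirect effect a·b by OLS, and bootstrapping a percentile confidence interval) might look like the following. The sample size and path coefficients are illustrative assumptions, not values from the paper:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Simulate standardized data for X -> M -> Y with illustrative
    # path coefficients a (X->M), b (M->Y) and direct effect c'.
    n, a, b, c_prime = 500, 0.5, 0.4, 0.2
    X = rng.standard_normal(n)
    M = a * X + np.sqrt(1 - a**2) * rng.standard_normal(n)
    Y = b * M + c_prime * X + rng.standard_normal(n)

    def indirect_effect(X, M, Y):
        """Estimate a*b from two OLS fits: M ~ X, then Y ~ X + M."""
        a_hat = np.polyfit(X, M, 1)[0]            # slope of M on X
        Z = np.column_stack([np.ones_like(X), X, M])
        coefs, *_ = np.linalg.lstsq(Z, Y, rcond=None)
        b_hat = coefs[2]                          # partial slope of Y on M
        return a_hat * b_hat

    # Percentile bootstrap CI for the indirect effect
    boot = np.empty(2000)
    for i in range(2000):
        idx = rng.integers(0, n, n)
        boot[i] = indirect_effect(X[idx], M[idx], Y[idx])
    ci = np.percentile(boot, [2.5, 97.5])
    print(indirect_effect(X, M, Y), ci)
    ```

    A CI excluding zero is the usual bootstrap evidence for mediation; the Sobel test would instead divide a·b by its delta-method standard error.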

  14. A Descriptive Analysis of Instructional Coaches' Data Use in Science

    Science.gov (United States)

    Snodgrass Rangel, Virginia; Bell, Elizabeth R.; Monroy, Carlos

    2017-01-01

    A key assumption of accountability policies is that educators will use data to improve their instruction. In practice, however, data use is quite hard, and more districts are looking to instructional coaches to support their teachers. The purpose of this descriptive analysis is to examine how instructional coaches in elementary and middle school…

  15. Neoclassical MHD descriptions of tokamak plasmas

    International Nuclear Information System (INIS)

    Callen, J.D.; Kim, Y.B.; Sundaram, A.K.

    1988-01-01

    Considerable progress has been made in extending neoclassical MHD theory and in exploring the linear instabilities, nonlinear behavior and turbulence models it implies for tokamak plasmas. The areas highlighted in this paper include: extension of the neoclassical MHD equations to include temperature-gradient and heat flow effects; the free energy and entropy evolution implied by this more complete description; a proper ballooning mode formalism analysis of the linear instabilities; a new rippling mode type instability; numerical simulation of the linear instabilities which exhibit a smooth transition from resistive ballooning modes at high collisionality to neoclassical MHD modes at low collisionality; numerical simulation of the nonlinear growth of a single helicity tearing mode; and a Direct-Interaction-Approximation model of neoclassical MHD turbulence and the anomalous transport it induces which substantially improves upon previous mixing length model estimates. 34 refs., 2 figs

  16. Holistic Nursing Simulation: A Concept Analysis.

    Science.gov (United States)

    Cohen, Bonni S; Boni, Rebecca

    2018-03-01

    Simulation as a technology and holistic nursing care as a philosophy are two components within nursing programs that have merged during the process of knowledge and skill acquisition in the care of the patients as whole beings. Simulation provides opportunities to apply knowledge and skill through the use of simulators, standardized patients, and virtual settings. Concerns with simulation have been raised regarding the integration of the nursing process and recognizing the totality of the human being. Though simulation is useful as a technology, the nursing profession places importance on patient care, drawing on knowledge, theories, and expertise to administer patient care. There is a need to promptly and comprehensively define the concept of holistic nursing simulation to provide consistency and a basis for quality application within nursing curricula. This concept analysis uses Walker and Avant's approach to define holistic nursing simulation by defining antecedents, consequences, and empirical referents. The concept of holism and the practice of holistic nursing incorporated into simulation require an analysis of the concept of holistic nursing simulation by developing a language and model to provide direction for educators in design and development of holistic nursing simulation.

  17. Predicate Argument Structure Analysis for Use Case Description Modeling

    Science.gov (United States)

    Takeuchi, Hironori; Nakamura, Taiga; Yamaguchi, Takahira

    In a large software system development project, many documents are prepared and updated frequently. In such a situation, support is needed for looking through these documents easily to identify inconsistencies and to maintain traceability. In this research, we focus on the requirements documents such as use cases and consider how to create models from the use case descriptions in unformatted text. In the model construction, we propose a few semantic constraints based on the features of the use cases and use them for a predicate argument structure analysis to assign semantic labels to actors and actions. With this approach, we show that we can assign semantic labels without enhancing any existing general lexical resources such as case frame dictionaries and design a less language-dependent model construction architecture. By using the constructed model, we consider a system for quality analysis of the use cases and automated test case generation to keep the traceability between document sets. We evaluated the reuse of the existing use cases and generated test case steps automatically with the proposed prototype system from real-world use cases in the development of a system using a packaged application. Based on the evaluation, we show how to construct models with high precision from English and Japanese use case data. Also, we could generate good test cases for about 90% of the real use cases through the manual improvement of the descriptions based on the feedback from the quality analysis system.

  18. Description of mathematical models and computer programs

    International Nuclear Information System (INIS)

    1977-01-01

    The paper gives a description of mathematical models and computer programs for analysing possible strategies for spent fuel management, with emphasis on economic analysis. The computer programs developed, describe the material flows, facility construction schedules, capital investment schedules and operating costs for the facilities used in managing the spent fuel. The computer programs use a combination of simulation and optimization procedures for the economic analyses. Many of the fuel cycle steps (such as spent fuel discharges, storage at the reactor, and transport to the RFCC) are described in physical and economic terms through simulation modeling, while others (such as reprocessing plant size and commissioning schedules, interim storage facility commissioning schedules etc.) are subjected to economic optimization procedures to determine the approximate lowest-cost plans from among the available feasible alternatives

  19. Seismic analysis of structures by simulation

    International Nuclear Information System (INIS)

    Sundararajan, C.; Gangadharan, A.C.

    1977-01-01

    The paper presents a state-of-the-art survey, and recommendations for future work, in the area of stochastic seismic analysis by Monte Carlo simulation. First, the Monte Carlo simulation procedure is described, with special emphasis on a 'unified approach' for the digital generation of artificial earthquake motions. Next, the advantages and disadvantages of the method over the power spectral method are discussed; finally, an efficient 'Hybrid Monte Carlo-Power Spectral Method' is developed. The Monte Carlo simulation procedure consists of the following tasks: (1) digital generation of artificial earthquake motions, (2) response analysis of the structure to a number of sample motions, and (3) statistical analysis of the structural responses.
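    The three tasks can be illustrated with a toy version in Python: artificial motions built from cosines with random phases under an amplitude envelope, time-stepped response of a single-degree-of-freedom oscillator, and statistics over the sample of peak responses. All frequencies, damping values, and amplitudes below are illustrative assumptions, not from the paper:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    dt, T = 0.01, 20.0
    t = np.arange(0.0, T, dt)

    def artificial_quake(rng):
        """Spectral-representation motion: sum of cosines with random
        phases over a flat band, shaped by a build-up/decay envelope."""
        freqs = np.linspace(0.5, 10.0, 60)                     # Hz
        phases = rng.uniform(0, 2 * np.pi, freqs.size)
        a = np.sum(np.cos(2*np.pi*freqs[:, None]*t + phases[:, None]), axis=0)
        env = (t / 2.0) * np.exp(1 - t / 2.0)
        return 0.05 * env * a                                  # m/s^2

    def sdof_peak(ag, fn=2.0, zeta=0.05):
        """Peak displacement of a damped SDOF oscillator driven by
        ground acceleration ag, via explicit central differences."""
        wn = 2 * np.pi * fn
        u = np.zeros_like(ag)
        for i in range(1, len(ag) - 1):
            u[i+1] = (dt**2 * (-ag[i] - 2*zeta*wn*(u[i]-u[i-1])/dt - wn**2*u[i])
                      + 2*u[i] - u[i-1])
        return np.abs(u).max()

    # Task 3: statistics of the peak response over a sample of motions
    peaks = np.array([sdof_peak(artificial_quake(rng)) for _ in range(50)])
    print(peaks.mean(), peaks.std())
    ```

    The power spectral method would instead work with the response spectrum analytically; the hybrid method the abstract mentions combines both viewpoints.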

  20. Seismic analysis of structures by simulation

    International Nuclear Information System (INIS)

    Sundararajan, C.; Gangadharan, A.C.

    1977-01-01

    The paper presents a state-of-the-art survey, and recommendations for future work, in the area of stochastic seismic analysis by Monte Carlo simulation. First, the Monte Carlo simulation procedure is described, with special emphasis on a 'unified approach' for the digital generation of artificial earthquake motions. Next, the advantages and disadvantages of the method over the power spectral method are discussed; finally, an efficient 'Hybrid Monte Carlo-Power Spectral Method' is developed. The Monte Carlo simulation procedure consists of the following tasks: (1) digital generation of artificial earthquake motions, (2) response analysis of the structure to a number of sample motions, and (3) statistical analysis of the structural responses. (Auth.)

  1. A regional climate model for northern Europe: model description and results from the downscaling of two GCM control simulations

    Science.gov (United States)

    Rummukainen, M.; Räisänen, J.; Bringfelt, B.; Ullerstig, A.; Omstedt, A.; Willén, U.; Hansson, U.; Jones, C.

    This work presents a regional climate model, the Rossby Centre regional Atmospheric model (RCA1), recently developed from the High Resolution Limited Area Model (HIRLAM). The changes in the HIRLAM parametrizations, necessary for climate-length integrations, are described. A regional Baltic Sea ocean model and a modeling system for the Nordic inland lake systems have been coupled with RCA1. The coupled system has been used to downscale 10-year time slices from two different general circulation model (GCM) simulations to provide high-resolution regional interpretation of large-scale modeling. A selection of the results from the control runs, i.e. the present-day climate simulations, are presented: large-scale free atmospheric fields, the surface temperature and precipitation results and results for the on-line simulated regional ocean and lake surface climates. The regional model modifies the surface climate description compared to the GCM simulations, but it is also substantially affected by the biases in the GCM simulations. The regional model also improves the representation of the regional ocean and the inland lakes, compared to the GCM results.

  2. A regional climate model for northern Europe: model description and results from the downscaling of two GCM control simulations

    Energy Technology Data Exchange (ETDEWEB)

    Rummukainen, M.; Raeisaenen, J.; Bringfelt, B.; Ullerstig, A.; Omstedt, A.; Willen, U.; Hansson, U.; Jones, C. [Rossby Centre, Swedish Meteorological and Hydrological Inst., Norrkoeping (Sweden)

    2001-03-01

    This work presents a regional climate model, the Rossby Centre regional Atmospheric model (RCA1), recently developed from the High Resolution Limited Area Model (HIRLAM). The changes in the HIRLAM parametrizations, necessary for climate-length integrations, are described. A regional Baltic Sea ocean model and a modeling system for the Nordic inland lake systems have been coupled with RCA1. The coupled system has been used to downscale 10-year time slices from two different general circulation model (GCM) simulations to provide high-resolution regional interpretation of large-scale modeling. A selection of the results from the control runs, i.e. the present-day climate simulations, are presented: large-scale free atmospheric fields, the surface temperature and precipitation results and results for the on-line simulated regional ocean and lake surface climates. The regional model modifies the surface climate description compared to the GCM simulations, but it is also substantially affected by the biases in the GCM simulations. The regional model also improves the representation of the regional ocean and the inland lakes, compared to the GCM results. (orig.)

  3. Physics detector simulation facility system software description

    International Nuclear Information System (INIS)

    Allen, J.; Chang, C.; Estep, P.; Huang, J.; Liu, J.; Marquez, M.; Mestad, S.; Pan, J.; Traversat, B.

    1991-12-01

    Large and costly detectors will be constructed during the next few years to study the interactions produced by the SSC. Efficient, cost-effective designs for these detectors will require careful thought and planning. Because it is not possible to fully test a proposed design in a scaled-down version, the adequacy of a proposed design will be determined from a detailed computer model of the detectors. Physics and detector simulations will be performed on the computer model using the high-powered computing systems at the Physics Detector Simulation Facility (PDSF). The SSCL has particular computing requirements for high-energy physics (HEP) Monte Carlo calculations for the simulation of SSCL physics and detectors. The numerical calculations to be performed in each simulation are lengthy and detailed; they could require many months per run on a VAX 11/780 computer and may produce several gigabytes of data per run. Consequently, a distributed computing environment of several networked high-speed computing engines is envisioned to meet these needs. These networked computers will form the basis of a centralized facility for SSCL physics and detector simulation work. Our computer planning groups have determined that the most efficient, cost-effective way to provide these high-performance computing resources at this time is with RISC-based UNIX workstations. The modeling and simulation application software that will run on the computing system is usually written by physicists in FORTRAN and may need thousands of hours of supercomputing time. The system software is the 'glue' that integrates the distributed workstations and allows them to be managed as a single entity. This report addresses the computing strategy for the SSC.

  4. Water Quality Analysis Simulation

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Water Quality Analysis Simulation Program, an enhancement of the original WASP. This model helps users interpret and predict water quality responses to natural...

  5. Two-step rating-based 'double-faced applicability' test for sensory analysis of spread products as an alternative to descriptive analysis with trained panel.

    Science.gov (United States)

    Kim, In-Ah; den-Hollander, Elyn; Lee, Hye-Seong

    2018-03-01

    Descriptive analysis with a trained sensory panel has thus far been the most well-defined methodology for characterizing various products. In practical terms, however, the intensive training required for descriptive analysis has been recognized as a serious drawback. To overcome this limitation, various novel rapid sensory profiling methodologies have been suggested in the literature. Among these, attribute-based methodologies such as check-all-that-apply (CATA) questions have shown results comparable to those of conventional sensory descriptive analysis. Kim, Hopkinson, van Hout, and Lee (2017a, 2017b) proposed a novel attribute-based methodology, the two-step rating-based 'double-faced applicability' test, with a novel output measure of applicability magnitude (d'A) for measuring consumers' product usage experience throughout various product usage stages. In this paper, the potential of the two-step rating-based 'double-faced applicability' test with d'A was investigated as an alternative to conventional sensory descriptive analysis in terms of sensory characterization and product discrimination. Twelve commercial spread products were evaluated using both conventional sensory descriptive analysis with a trained sensory panel and the two-step rating-based 'double-faced applicability' test with an untrained sensory panel. The results demonstrated that the 'double-faced applicability' test provides a direct measure of the applicability magnitude of the sensory attributes of the samples tested, in terms of d'A, for sensory characterization of individual samples and multiple sample comparisons. This suggests that, when an appropriate list of attributes for the questionnaire is already available, the two-step rating-based 'double-faced applicability' test with d'A can be used as a more efficient alternative to conventional descriptive analysis, without requiring any intensive training process. Copyright © 2017 Elsevier Ltd. All rights reserved.
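    The d'A measure above is the authors' applicability analog of the signal-detection d'. As background only (this is the standard signal-detection computation, not the authors' exact d'A estimator), a generic d' for yes/no response counts can be computed from z-transformed hit and false-alarm rates:

    ```python
    from statistics import NormalDist

    def d_prime(hits, misses, false_alarms, correct_rejections):
        """Signal-detection d' from yes/no counts, with a log-linear
        correction so rates of exactly 0 or 1 stay finite."""
        h = (hits + 0.5) / (hits + misses + 1)
        f = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
        z = NormalDist().inv_cdf  # inverse standard-normal CDF
        return z(h) - z(f)

    # A panel that says "applicable" far more often for the target
    # attribute than for a non-applicable one yields a large d'.
    print(d_prime(40, 10, 10, 40))
    ```

    Larger d' values mean better discrimination; d' near zero means the attribute does not separate the samples.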

  6. Simulation of the Compact Ignition Tokamak (CIT) conceptual design

    International Nuclear Information System (INIS)

    Carlson, K.E.; Wareing, T.L.

    1988-01-01

    Calculations have been made using the Advanced Thermal Hydraulic Energy Network Analysis (ATHENA) code to simulate the cooldown of the cryostat and the performance of the condensing heat exchanger. The purpose of this simulation was to confirm the estimated 30-minute cooldown time and to size a condensing heat exchanger for the CIT liquid nitrogen cooling system. This report includes a brief description of the ATHENA code, descriptions of the proposed CIT cryostat and condenser designs, and the associated ATHENA models representing these designs. This is followed by the ATHENA calculated results and conclusions concerning those results. 6 refs., 17 figs

  7. Description language for the modelling and analysis of temporal change of instrumentation and control system structures

    International Nuclear Information System (INIS)

    Goering, Markus Heinrich

    2013-01-01

    The utilisation of computer-based I and C, as a result of the technological advancements in the computer industry, represents an up-to-date challenge for I and C engineers in nuclear power plants throughout the world. In comparison with the time-proven, hard-wired I and C, the engineering must consider the novel characteristics of computer-based technology during the implementation, these are primarily constituted by higher performance and the utilisation of software. On one hand, this allows for implementing more complex I and C functions and integrating several I and C functions on to single components, although on the other hand, the minimisation of the CCF probability is of high priority to the engineering. Furthermore, the engineering must take the implementation of the deterministic safety concept for the I and C design into consideration. This includes engineering the redundancy, diversity, physical separation, and independence design features, and is complemented by the analysis of the I and C design with respect to the superposition of pre-defined event sequences and postulated failure combinations, so as to secure the safe operation of the nuclear power plant. The focus of this thesis is on the basic principles of engineering, i.e. description languages and methods, which the engineering relies on for a highly qualitative and efficient computer-based I and C implementation. The analysis of the deterministic safety concept and computer-based I and C characteristics yields the relevant technical requirements for the engineering, these are combined with the general structuring principles of standard IEC 81346 and the extended description language evaluation criteria, which are based on the guideline VDI/VDE-3681, resulting in target criteria for evaluating description languages. The analysis and comparison of existing description languages reveals that no description language satisfactorily fulfils all target criteria, which is constituted in the

  8. Description language for the modelling and analysis of temporal change of instrumentation and control system structures

    Energy Technology Data Exchange (ETDEWEB)

    Goering, Markus Heinrich

    2013-10-25

    The utilisation of computer-based I and C, as a result of the technological advancements in the computer industry, represents an up-to-date challenge for I and C engineers in nuclear power plants throughout the world. In comparison with the time-proven, hard-wired I and C, the engineering must consider the novel characteristics of computer-based technology during the implementation, these are primarily constituted by higher performance and the utilisation of software. On one hand, this allows for implementing more complex I and C functions and integrating several I and C functions on to single components, although on the other hand, the minimisation of the CCF probability is of high priority to the engineering. Furthermore, the engineering must take the implementation of the deterministic safety concept for the I and C design into consideration. This includes engineering the redundancy, diversity, physical separation, and independence design features, and is complemented by the analysis of the I and C design with respect to the superposition of pre-defined event sequences and postulated failure combinations, so as to secure the safe operation of the nuclear power plant. The focus of this thesis is on the basic principles of engineering, i.e. description languages and methods, which the engineering relies on for a highly qualitative and efficient computer-based I and C implementation. The analysis of the deterministic safety concept and computer-based I and C characteristics yields the relevant technical requirements for the engineering, these are combined with the general structuring principles of standard IEC 81346 and the extended description language evaluation criteria, which are based on the guideline VDI/VDE-3681, resulting in target criteria for evaluating description languages. The analysis and comparison of existing description languages reveals that no description language satisfactorily fulfils all target criteria, which is constituted in the

  9. Water Quality Analysis Simulation

    Science.gov (United States)

    The Water Quality Analysis Simulation Program is an enhancement of the original WASP. This model helps users interpret and predict water quality responses to natural phenomena and man-made pollution for various pollution management decisions.

  10. Concept of APDL, the atomic process description language

    International Nuclear Information System (INIS)

    Sasaki, Akira

    2004-01-01

    The concept of APDL, the Atomic Process Description Language, which provides a simple and complete description of atomic models, is presented. The syntax for describing electron orbitals and configurations is defined for use in atomic structure, kinetics, and spectral synthesis simulation codes. (author)

  11. SIMONE: Tool for Data Analysis and Simulation

    International Nuclear Information System (INIS)

    Chudoba, V.; Hnatio, B.; Sharov, P.; Papka, Paul

    2013-06-01

    SIMONE is a software tool based on the ROOT Data Analysis Framework, developed in collaboration between FLNR JINR and iThemba LABS. It is intended for physicists planning experiments and analysing experimental data. The goal of the SIMONE framework is to provide a flexible, user-friendly, efficient, and well-documented system for the simulation of a wide range of nuclear physics experiments. The most significant conditions and physical processes can be taken into account during simulation of the experiment. The user can create his own experimental setup through access to predefined detector geometries. Simulated data are made available in the same format as for the real experiment, allowing identical analysis of both experimental and simulated data. A significant time reduction is expected during experiment planning and data analysis. (authors)

  12. Detector Simulation: Data Treatment and Analysis Methods

    CERN Document Server

    Apostolakis, J

    2011-01-01

    Detector Simulation in 'Data Treatment and Analysis Methods', part of 'Landolt-Börnstein - Group I Elementary Particles, Nuclei and Atoms: Numerical Data and Functional Relationships in Science and Technology, Volume 21B1: Detectors for Particles and Radiation. Part 1: Principles and Methods'. This document is part of Part 1 'Principles and Methods' of Subvolume B 'Detectors for Particles and Radiation' of Volume 21 'Elementary Particles' of Landolt-Börnstein - Group I 'Elementary Particles, Nuclei and Atoms'. It contains the Section '4.1 Detector Simulation' of Chapter '4 Data Treatment and Analysis Methods' with the content: 4.1 Detector Simulation 4.1.1 Overview of simulation 4.1.1.1 Uses of detector simulation 4.1.2 Stages and types of simulation 4.1.2.1 Tools for event generation and detector simulation 4.1.2.2 Level of simulation and computation time 4.1.2.3 Radiation effects and background studies 4.1.3 Components of detector simulation 4.1.3.1 Geometry modeling 4.1.3.2 External fields 4.1.3.3 Intro...

  13. Milestone M4900: Simulant Mixing Analytical Results

    Energy Technology Data Exchange (ETDEWEB)

    Kaplan, D.I.

    2001-07-26

    This report addresses Milestone M4900, ''Simulant Mixing Sample Analysis Results,'' and contains the data generated during the ''Mixing of Process Heels, Process Solutions, and Recycle Streams: Small-Scale Simulant'' task. The Task Technical and Quality Assurance Plan for this task is BNF-003-98-0079A. A report with a narrative description and discussion of the data will be issued separately.

  14. An overview of the design and analysis of simulation experiments for sensitivity analysis

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2005-01-01

    Sensitivity analysis may serve validation, optimization, and risk analysis of simulation models. This review surveys 'classic' and 'modern' designs for experiments with simulation models. Classic designs were developed for real, non-simulated systems in agriculture, engineering, etc. These designs

  15. Lone ranger decision making versus consensus decision making: Descriptive analysis

    OpenAIRE

    Maite Sara Mashego

    2015-01-01

    Consensus decision making concerns group members making decisions together, with the requirement of reaching a consensus, that is, all members abiding by the decision outcome. Lone ranging worked for some time in an autocratic environment. Researchers are now pointing to consensus decision making in organizations as bringing dividends to many organizations. This article used a descriptive analysis to compare the merits of consensus decision making and lone-ranger decision making. This art...

  16. Quantitative descriptive analysis and principal component analysis for sensory characterization of Indian milk product cham-cham.

    Science.gov (United States)

    Puri, Ritika; Khamrui, Kaushik; Khetra, Yogesh; Malhotra, Ravinder; Devraja, H C

    2016-02-01

    Promising development and expansion in the market of cham-cham, a traditional Indian dairy product, is expected in the coming future with the organized production of this milk product by some large dairies. The objective of this study was to document the extent of variation in sensory properties of market samples of cham-cham collected from four different locations known for their excellence in cham-cham production, and to find out the attributes that govern much of the variation in sensory scores of this product, using quantitative descriptive analysis (QDA) and principal component analysis (PCA). QDA revealed significant differences in the sensory attributes of cham-cham among the market samples. PCA identified four significant principal components that accounted for 72.4 % of the variation in the sensory data. Factor scores of each of the four principal components, which primarily correspond to sweetness/shape/dryness of interior, surface appearance/surface dryness, rancid and firmness attributes, specify the location of each market sample along each of the axes in 3-D graphs. These findings demonstrate the utility of quantitative descriptive analysis for identifying and measuring attributes of cham-cham that contribute most to its sensory acceptability.

  17. Cyclotron resonant scattering feature simulations. II. Description of the CRSF simulation process

    Science.gov (United States)

    Schwarm, F.-W.; Ballhausen, R.; Falkner, S.; Schönherr, G.; Pottschmidt, K.; Wolff, M. T.; Becker, P. A.; Fürst, F.; Marcu-Cheatham, D. M.; Hemphill, P. B.; Sokolova-Lapa, E.; Dauser, T.; Klochkov, D.; Ferrigno, C.; Wilms, J.

    2017-05-01

    Context. Cyclotron resonant scattering features (CRSFs) are formed by scattering of X-ray photons off quantized plasma electrons in the strong magnetic field (of the order 1012 G) close to the surface of an accreting X-ray pulsar. Due to the complex scattering cross-sections, the line profiles of CRSFs cannot be described by an analytic expression. Numerical methods, such as Monte Carlo (MC) simulations of the scattering processes, are required in order to predict precise line shapes for a given physical setup, which can be compared to observations to gain information about the underlying physics in these systems. Aims: A versatile simulation code is needed for the generation of synthetic cyclotron lines. Sophisticated geometries should be investigatable by making their simulation possible for the first time. Methods: The simulation utilizes the mean free path tables described in the first paper of this series for the fast interpolation of propagation lengths. The code is parallelized to make the very time-consuming simulations possible on convenient time scales. Furthermore, it can generate responses to monoenergetic photon injections, producing Green's functions, which can be used later to generate spectra for arbitrary continua. Results: We develop a new simulation code to generate synthetic cyclotron lines for complex scenarios, allowing for unprecedented physical interpretation of the observed data. An associated XSPEC model implementation is used to fit synthetic line profiles to NuSTAR data of Cep X-4. The code has been developed with the main goal of overcoming previous geometrical constraints in MC simulations of CRSFs. By applying this code also to more simple, classic geometries used in previous works, we furthermore address issues of code verification and cross-comparison of various models. The XSPEC model and the Green's function tables are available online (see link in footnote, page 1).
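The Green's-function approach described above can be illustrated with a toy sketch: once the responses to monoenergetic injections are tabulated, a spectrum for an arbitrary continuum is just their weighted superposition. The energy grid, table values, and continuum below are invented placeholders, not data from the paper.

```python
import numpy as np

# Illustrative sketch of using tabulated Green's functions: G[i, j] is the
# simulated response in output energy bin i to a monoenergetic injection in
# input bin j. The table here is a random placeholder, not physical data.
n_bins = 5
rng = np.random.default_rng(0)
G = np.eye(n_bins) + 0.1 * rng.random((n_bins, n_bins))

# An arbitrary injected continuum, e.g. a power law over an invented
# energy grid (keV, for illustration only).
energies = np.linspace(20.0, 40.0, n_bins)
continuum = energies ** -1.5

# The emergent spectrum is the superposition of Green's functions
# weighted by the continuum: s_i = sum_j G[i, j] * c_j
spectrum = G @ continuum
print(spectrum.shape)  # (5,)
```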

  18. An Overview of the Design and Analysis of Simulation Experiments for Sensitivity Analysis

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2004-01-01

    Sensitivity analysis may serve validation, optimization, and risk analysis of simulation models. This review surveys classic and modern designs for experiments with simulation models. Classic designs were developed for real, non-simulated systems in agriculture, engineering, etc. These designs assume a

  19. Simulating Silvicultural Treatments Using FIA Data

    Science.gov (United States)

    Christopher W. Woodall; Carl E. Fiedler

    2005-01-01

    Potential uses of the Forest Inventory and Analysis Database (FIADB) extend far beyond descriptions and summaries of current forest resources. Silvicultural treatments, although typically conducted at the stand level, may be simulated using the FIADB for predicting future forest conditions and resources at broader scales. In this study, silvicultural prescription...

  20. Employing Picture Description to Assess the Students' Descriptive Paragraph Writing

    Directory of Open Access Journals (Sweden)

    Ida Ayu Mega Cahyani

    2018-03-01

    Writing is considered an important skill in the learning process which needs to be mastered by students. However, in the teaching-learning process at schools and universities, the assessment of writing skill is often not the focus of the learning process, and the assessment is administered inappropriately. In the present study, the researcher assessed the descriptive paragraph writing ability of students through picture description, employing an ex post facto research design. The study was intended to answer the research problem concerning the extent of the students’ achievement in descriptive paragraph writing as assessed through picture description. The samples of the study were 40 students determined by means of a random sampling technique with a lottery system. The data were collected by administering picture description as the research instrument. The obtained data were analyzed using a norm-referenced measure of five standard values. The results of the data analysis showed that 67.50% of the samples were successful in writing a descriptive paragraph, while 32.50% were unsuccessful, as assessed by the picture description test.

  1. Experimental Design for Sensitivity Analysis of Simulation Models

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2001-01-01

    This introductory tutorial gives a survey on the use of statistical designs for what-if or sensitivity analysis in simulation. This analysis uses regression analysis to approximate the input/output transformation that is implied by the simulation model; the resulting regression model is also known as
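The regression-metamodel idea can be sketched as follows; the stand-in simulation model, the 2^2 factorial design, and the coefficients below are illustrative assumptions, not material from the tutorial:

```python
import numpy as np

# Minimal sketch: approximate a simulation model's input/output
# transformation with a first-order-plus-interaction regression
# metamodel, fitted to runs from a classic 2^2 factorial design.
def simulate(x1, x2):
    # stand-in for an expensive simulation model (invented response)
    return 3.0 + 2.0 * x1 - 1.5 * x2 + 0.5 * x1 * x2

# Design points in coded units (-1, +1)
design = [(-1, -1), (-1, 1), (1, -1), (1, 1)]
y = np.array([simulate(x1, x2) for x1, x2 in design])

# Regressor matrix: intercept, two main effects, one interaction
X = np.array([[1.0, x1, x2, x1 * x2] for x1, x2 in design])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta)  # estimated effects; beta[1], beta[2] quantify sensitivity
```

With four runs and four coefficients the fit is exact, so the estimated effects recover the stand-in model's coefficients.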

  2. Simulation of FIB-SEM images for analysis of porous microstructures.

    Science.gov (United States)

    Prill, Torben; Schladitz, Katja

    2013-01-01

    Focused ion beam nanotomography-scanning electron microscopy tomography yields high-quality three-dimensional images of materials microstructures at the nanometer scale combining serial sectioning using a focused ion beam with SEM. However, FIB-SEM tomography of highly porous media leads to shine-through artifacts preventing automatic segmentation of the solid component. We simulate the SEM process in order to generate synthetic FIB-SEM image data for developing and validating segmentation methods. Monte-Carlo techniques yield accurate results, but are too slow for the simulation of FIB-SEM tomography requiring hundreds of SEM images for one dataset alone. Nevertheless, a quasi-analytic description of the specimen and various acceleration techniques, including a track compression algorithm and an acceleration for the simulation of secondary electrons, cut down the computing time by orders of magnitude, allowing for the first time to simulate FIB-SEM tomography. © Wiley Periodicals, Inc.

  3. Coupling an analytical description of anti-scatter grids with simulation software of radiographic systems using Monte Carlo code

    International Nuclear Information System (INIS)

    Rinkel, J.; Dinten, J.M.; Tabary, J.

    2004-01-01

    The use of focused anti-scatter grids on digital radiographic systems with two-dimensional detectors produces acquisitions with a decreased scatter-to-primary ratio and thus improved contrast and resolution. Simulation software is of great interest in optimizing grid configuration according to a specific application. Classical simulators are based on complete, detailed geometric descriptions of the grid. They are accurate but very time consuming, since they use Monte Carlo code to simulate scatter within the high-frequency grids. We propose a new practical method which couples an analytical simulation of the grid interaction with a radiographic system simulation program. First, a two-dimensional matrix of probability depending on the grid is created offline, in which the first dimension represents the angle of impact with respect to the normal to the grid lines and the other the energy of the photon. This matrix of probability is then used by the Monte Carlo simulation software in order to provide the final scattered flux image. To evaluate the gain in CPU time, we define the increasing factor as the increase in CPU time of the simulation with, as opposed to without, the grid. Increasing factors were calculated with the new model and with classical methods representing the grid with its CAD model as part of the object. With the new method, increasing factors are smaller by one to two orders of magnitude than with the classical one. These results were obtained with a difference in calculated scatter of less than five percent between the new and the classical method. (authors)
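The precomputed probability table idea might be sketched like this; the angle/energy grids, the table values, and the transmission model are invented placeholders, not a real grid description:

```python
import numpy as np

# Hedged sketch of an offline (angle, energy) -> probability table:
# rows index the angle of impact w.r.t. the normal to the grid lines,
# columns index photon energy. Values are illustrative only.
angles = np.linspace(0.0, 30.0, 31)       # degrees
energies = np.linspace(20.0, 120.0, 101)  # keV
prob = np.clip(
    0.9 - 0.02 * angles[:, None] + 0.001 * (energies[None, :] - 20.0),
    0.0, 1.0,
)

def transmission(angle_deg, energy_kev):
    """Nearest-neighbour table lookup used in place of a full MC grid model."""
    i = int(np.argmin(np.abs(angles - angle_deg)))
    j = int(np.argmin(np.abs(energies - energy_kev)))
    return prob[i, j]

# Inside the Monte Carlo transport loop, each photon reaching the grid
# would be kept with probability transmission(angle, energy).
rng = np.random.default_rng(1)
kept = transmission(10.0, 60.0) > rng.random()
```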

  4. Novel 3D/VR interactive environment for MD simulations, visualization and analysis.

    Science.gov (United States)

    Doblack, Benjamin N; Allis, Tim; Dávila, Lilian P

    2014-12-18

    The increasing development of computing (hardware and software) in the last decades has impacted scientific research in many fields including materials science, biology, chemistry and physics among many others. A new computational system for the accurate and fast simulation and 3D/VR visualization of nanostructures is presented here, using the open-source molecular dynamics (MD) computer program LAMMPS. This alternative computational method uses modern graphics processors, NVIDIA CUDA technology and specialized scientific codes to overcome processing speed barriers common to traditional computing methods. In conjunction with a virtual reality system used to model materials, this enhancement allows the addition of accelerated MD simulation capability. The motivation is to provide a novel research environment which simultaneously allows visualization, simulation, modeling and analysis. The research goal is to investigate the structure and properties of inorganic nanostructures (e.g., silica glass nanosprings) under different conditions using this innovative computational system. The work presented outlines a description of the 3D/VR Visualization System and basic components, an overview of important considerations such as the physical environment, details on the setup and use of the novel system, a general procedure for the accelerated MD enhancement, technical information, and relevant remarks. The impact of this work is the creation of a unique computational system combining nanoscale materials simulation, visualization and interactivity in a virtual environment, which is both a research and teaching instrument at UC Merced.

  5. Freud: a software suite for high-throughput simulation analysis

    Science.gov (United States)

    Harper, Eric; Spellings, Matthew; Anderson, Joshua; Glotzer, Sharon

    Computer simulation is an indispensable tool for the study of a wide variety of systems. As simulations scale to fill petascale and exascale supercomputing clusters, so too does the size of the data produced, as well as the difficulty in analyzing these data. We present Freud, an analysis software suite for efficient analysis of simulation data. Freud makes no assumptions about the system being analyzed, allowing for general analysis methods to be applied to nearly any type of simulation. Freud includes standard analysis methods such as the radial distribution function, as well as new methods including the potential of mean force and torque and local crystal environment analysis. Freud combines a Python interface with fast, parallel C++ analysis routines to run efficiently on laptops, workstations, and supercomputing clusters. Data analysis on clusters reduces data transfer requirements, a prohibitive cost for petascale computing. Used in conjunction with simulation software, Freud allows for smart simulations that adapt to the current state of the system, enabling the study of phenomena such as nucleation and growth, intelligent investigation of phases and phase transitions, and determination of effective pair potentials.
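One of the standard analyses mentioned, the radial distribution function, can be sketched in plain NumPy; this standalone illustration does not use the Freud API, and the particle configuration is randomly generated:

```python
import numpy as np

# Illustrative radial distribution function g(r) for particles in a
# cubic periodic box (minimum-image convention, r_max < L/2).
def rdf(positions, box_length, r_max, n_bins=50):
    n = len(positions)
    edges = np.linspace(0.0, r_max, n_bins + 1)
    counts = np.zeros(n_bins)
    for i in range(n):
        d = positions - positions[i]
        d -= box_length * np.round(d / box_length)  # minimum image
        r = np.linalg.norm(d, axis=1)
        r = r[(r > 0) & (r < r_max)]                # drop self-distance
        counts += np.histogram(r, bins=edges)[0]
    # normalise by the ideal-gas expectation in each spherical shell
    rho = n / box_length**3
    shell = 4.0 / 3.0 * np.pi * (edges[1:]**3 - edges[:-1]**3)
    centers = 0.5 * (edges[1:] + edges[:-1])
    return centers, counts / (n * rho * shell)

rng = np.random.default_rng(2)
pos = rng.random((200, 3)) * 10.0           # ideal-gas-like configuration
r, g = rdf(pos, box_length=10.0, r_max=4.0)  # g(r) should hover near 1
```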

  6. Groundwater flow simulations in support of the Local Scale Hydrogeological Description developed within the Laxemar Methodology Test Project

    International Nuclear Information System (INIS)

    Follin, Sven; Svensson, Urban

    2002-05-01

    The deduced Site Descriptive Model of the Laxemar area has been parameterised from a hydraulic point of view and subsequently put into practice in terms of a numerical flow model. The intention of the subproject has been to explore the adaptation of a numerical flow model to site-specific surface and borehole data, and to identify potential needs for development and improvement in the planned modelling methodology and tools. The experiences made during this process and the outcome of the simulations have been presented to the methodology test project group in the course of the project. The discussion and conclusions made in this particular report mainly concern two issues: (i) the use of numerical simulations as a means of gaining credibility, e.g. discrimination between alternative geological models, and (ii) calibration and conditioning of probabilistic (Monte Carlo) realisations.

  7. Xyce parallel electronic simulator design : mathematical formulation, version 2.0.

    Energy Technology Data Exchange (ETDEWEB)

    Hoekstra, Robert John; Waters, Lon J.; Hutchinson, Scott Alan; Keiter, Eric Richard; Russo, Thomas V.

    2004-06-01

    This document is intended to contain a detailed description of the mathematical formulation of Xyce, a massively parallel SPICE-style circuit simulator developed at Sandia National Laboratories. The target audience of this document is people in the role of 'service provider'. An example of such a person would be a linear solver expert who is spending a small fraction of his time developing solver algorithms for Xyce. Such a person probably is not an expert in circuit simulation, and would benefit from a description of the equations solved by Xyce. In this document, modified nodal analysis (MNA) is described in detail, with a number of examples. Issues that are unique to circuit simulation, such as voltage limiting, are also described in detail.
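A minimal MNA example in the spirit of this formulation (not Xyce code; the circuit and component values are invented) shows how the voltage source's branch current becomes the extra "modified" unknown alongside the node voltages:

```python
import numpy as np

# Modified nodal analysis of a 5 V source driving a two-resistor
# divider: source between node 1 and ground, R1 between nodes 1 and 2,
# R2 between node 2 and ground. R1 = R2 = 1 kOhm.
G1 = G2 = 1.0 / 1000.0  # conductances in siemens

# Unknowns: [v1, v2, i_V] -- node voltages plus the source branch
# current, the extra unknown that makes nodal analysis "modified".
A = np.array([
    [ G1, -G1,      1.0],  # KCL at node 1 (source current enters here)
    [-G1,  G1 + G2, 0.0],  # KCL at node 2
    [1.0,  0.0,     0.0],  # branch equation for the source: v1 = 5 V
])
b = np.array([0.0, 0.0, 5.0])

v1, v2, i_v = np.linalg.solve(A, b)
print(v1, v2, i_v)  # 5.0 2.5 -0.0025 (2.5 mA delivered by the source)
```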

  8. Database for Simulation of Electron Spectra for Surface Analysis (SESSA)

    Science.gov (United States)

    SRD 100 Database for Simulation of Electron Spectra for Surface Analysis (SESSA) (PC database for purchase). This database has been designed to facilitate quantitative interpretation of Auger-electron and X-ray photoelectron spectra and to improve the accuracy of quantitation in routine analysis. The database contains all physical data needed to perform quantitative interpretation of an electron spectrum for a thin-film specimen of given composition. A simulation module provides an estimate of peak intensities as well as the energy and angular distributions of the emitted electron flux.

  9. An application of sedimentation simulation in Tahe oilfield

    Science.gov (United States)

    Tingting, He; Lei, Zhao; Xin, Tan; Dongxu, He

    2017-12-01

    The braided river delta develops in the Triassic low oil formation in block 9 of the Tahe oilfield, but its sedimentation evolution process is unclear. Using sedimentation simulation technology, the sedimentation process and distribution of the braided river delta are studied based on geological parameters including sequence stratigraphic division, initial sedimentation environment, relative lake level change and accommodation change, source supply, and sedimentary transport pattern. The simulation result shows that the error between the simulated and actual strata thickness is small, and the single-well analysis result of the simulation is highly consistent with the actual analysis, indicating that the model is reliable. The study area underwent a braided river delta retrogradation evolution process, which provides a favorable basis for fine reservoir description and prediction.

  10. An adaptive maneuvering logic computer program for the simulation of one-on-one air-to-air combat. Volume 1: General description

    Science.gov (United States)

    Burgin, G. H.; Fogel, L. J.; Phelps, J. P.

    1975-01-01

    A technique for computer simulation of air combat is described. Volume 1 describes the computer program and its development in general terms. Two versions of the program exist. Both incorporate a logic for selecting and executing air combat maneuvers with performance models of specific fighter aircraft. In the batch processing version the flight paths of two aircraft engaged in interactive aerial combat and controlled by the same logic are computed. The realtime version permits human pilots to fly air-to-air combat against the adaptive maneuvering logic (AML) in the Langley Differential Maneuvering Simulator (DMS). Volume 2 consists of a detailed description of the computer programs.

  11. Sensitivity Analysis of Multidisciplinary Rotorcraft Simulations

    Science.gov (United States)

    Wang, Li; Diskin, Boris; Biedron, Robert T.; Nielsen, Eric J.; Bauchau, Olivier A.

    2017-01-01

    A multidisciplinary sensitivity analysis of rotorcraft simulations involving tightly coupled high-fidelity computational fluid dynamics and comprehensive analysis solvers is presented and evaluated. An unstructured sensitivity-enabled Navier-Stokes solver, FUN3D, and a nonlinear flexible multibody dynamics solver, DYMORE, are coupled to predict the aerodynamic loads and structural responses of helicopter rotor blades. A discretely-consistent adjoint-based sensitivity analysis available in FUN3D provides sensitivities arising from unsteady turbulent flows and unstructured dynamic overset meshes, while a complex-variable approach is used to compute DYMORE structural sensitivities with respect to aerodynamic loads. The multidisciplinary sensitivity analysis is conducted through integrating the sensitivity components from each discipline of the coupled system. Numerical results verify accuracy of the FUN3D/DYMORE system by conducting simulations for a benchmark rotorcraft test model and comparing solutions with established analyses and experimental data. Complex-variable implementation of sensitivity analysis of DYMORE and the coupled FUN3D/DYMORE system is verified by comparing with real-valued analysis and sensitivities. Correctness of adjoint formulations for FUN3D/DYMORE interfaces is verified by comparing adjoint-based and complex-variable sensitivities. Finally, sensitivities of the lift and drag functions obtained by complex-variable FUN3D/DYMORE simulations are compared with sensitivities computed by the multidisciplinary sensitivity analysis, which couples adjoint-based flow and grid sensitivities of FUN3D and FUN3D/DYMORE interfaces with complex-variable sensitivities of DYMORE structural responses.
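The complex-variable approach referred to above rests on the complex-step derivative approximation, df/dx ≈ Im(f(x + ih))/h, which avoids the subtractive cancellation of finite differences. The stand-in function below is illustrative, not a DYMORE structural response:

```python
import numpy as np

# Complex-step derivative: perturb the input along the imaginary axis
# and read the derivative off the imaginary part of the output.
def f(x):
    return np.exp(x) * np.sin(x)  # stand-in for a structural response

x0, h = 1.0, 1e-30
deriv = np.imag(f(x0 + 1j * h)) / h

# Analytic derivative for comparison: d/dx [e^x sin x] = e^x (sin x + cos x)
exact = np.exp(x0) * (np.sin(x0) + np.cos(x0))
print(abs(deriv - exact))  # agrees to machine precision despite tiny h
```

Because there is no subtraction of nearly equal numbers, the step h can be made arbitrarily small without losing accuracy, unlike a real-valued finite difference.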

  12. Psychometric analysis of simulated psychopathology during sick leave

    Directory of Open Access Journals (Sweden)

    Ignacio Jáuregui Lobera

    2018-01-01

    Simulation, from a categorical or diagnostic perspective, has turned into a more dimensional point of view, so it is possible to establish different “levels” of simulation. In order to analyse, from a psychometric perspective, the possible prediction of simulated behaviour based on common measures of general psychopathology, the objective of the current study was to analyse possible predictors of the Structured Symptomatic Simulation Inventory (SIMS) scores, considering as dependent variables the total SIMS score, the SIMS subscales scores, and the cut-off points usually suggested to discriminate between “no suspected simulation” and “suspected simulation”, which are usually 14 and 16. As possible predictors, a set of variables was established: (a) categorical: sex, type of treatment (psychopharmacological, psychotherapeutic, combined), type of work activity, being self-employed or not, presence or absence of a history of psychopathology (both familial and personal), presence or not of associated physical pathology, diagnosis (according to ICD-10), and the final proposal (return to work, sick leave extended, proposal of permanent work incapacity); and (b) continuous: perceived stress (general and current), self-esteem, results of a screening questionnaire for personality disorders, and scores on a symptoms questionnaire. In addition, a descriptive study of all variables was carried out and possible gender differences were analysed.

  13. A Student Assessment Tool for Standardized Patient Simulations (SAT-SPS): Psychometric analysis.

    Science.gov (United States)

    Castro-Yuste, Cristina; García-Cabanillas, María José; Rodríguez-Cornejo, María Jesús; Carnicer-Fuentes, Concepción; Paloma-Castro, Olga; Moreno-Corral, Luis Javier

    2018-05-01

    The evaluation of the level of clinical competence acquired by the student is a complex process that must meet various requirements to ensure its quality. The psychometric analysis of the data collected by the assessment tools used is a fundamental aspect to guarantee the student's competence level. To conduct a psychometric analysis of an instrument which assesses clinical competence in nursing students at simulation stations with standardized patients in OSCE-format tests. The construct of clinical competence was operationalized as a set of observable and measurable behaviors, measured by the newly-created Student Assessment Tool for Standardized Patient Simulations (SAT-SPS), which was comprised of 27 items. The categories assigned to the items were 'incorrect or not performed' (0), 'acceptable' (1), and 'correct' (2). 499 nursing students. Data were collected by two independent observers during the assessment of the students' performance at a four-station OSCE with standardized patients. Descriptive statistics were used to summarize the variables. The difficulty levels and floor and ceiling effects were determined for each item. Reliability was analyzed using internal consistency and inter-observer reliability. The validity analysis was performed considering face validity, content and construct validity (through exploratory factor analysis), and criterion validity. Internal reliability and inter-observer reliability were higher than 0.80. The construct validity analysis suggested a three-factor model accounting for 37.1% of the variance. These three factors were named 'Nursing process', 'Communication skills', and 'Safe practice'. A significant correlation was found between the scores obtained and the students' grades in general, as well as with the grades obtained in subjects with clinical content. 
The assessment tool has proven to be sufficiently reliable and valid for the assessment of the clinical competence of nursing students using standardized patients.

  14. Vehicle Technology Simulation and Analysis Tools | Transportation Research

    Science.gov (United States)

    NREL developed the following modeling, simulation, and analysis tools to investigate novel design goals (e.g., fuel economy versus performance) and find cost-competitive solutions. ADOPT: a vehicle simulator to analyze the performance and fuel economy of conventional and advanced light- and

  15. MPEG-7-based description infrastructure for an audiovisual content analysis and retrieval system

    Science.gov (United States)

    Bailer, Werner; Schallauer, Peter; Hausenblas, Michael; Thallinger, Georg

    2005-01-01

    We present a case study of establishing a description infrastructure for an audiovisual content-analysis and retrieval system. The description infrastructure consists of an internal metadata model and an access tool for using it. Based on an analysis of requirements, we have selected, out of a set of candidates, MPEG-7 as the basis of our metadata model. The openness and generality of MPEG-7 allow using it in a broad range of applications, but increase complexity and hinder interoperability. Profiling has been proposed as a solution, with the focus on selecting and constraining description tools. Semantic constraints are currently only described in textual form. Conformance in terms of semantics can thus not be evaluated automatically, and mappings between different profiles can only be defined manually. As a solution, we propose an approach to formalize the semantic constraints of an MPEG-7 profile using a formal vocabulary expressed in OWL, which allows automated processing of semantic constraints. We have defined the Detailed Audiovisual Profile as the profile to be used in our metadata model, and we show how some of the semantic constraints of this profile can be formulated using ontologies. To work practically with the metadata model, we have implemented an MPEG-7 library and a client/server document access infrastructure.

  16. Contribution to the aid to computer-aided design. Simulation of digital and logical sets. The CHAMBOR software

    International Nuclear Information System (INIS)

    Mansuy, Guy

    1973-01-01

    This report presents simulation software which belongs to a set of tools aimed at the design, analysis, test, and tracing of electronic and logical assemblies. This software simulates operation over time, considering the propagation of signals through the network elements and taking into account the delay introduced by each of them. The author presents some generalities (modules, description, library, simulation of a network as a function of time), then proposes a general and a detailed description of the software: data interpretation, processing of dynamic data and network simulation, and display of results on a graphical workstation.

  17. Thermal simulation of storage in TSS-Galleries

    International Nuclear Information System (INIS)

    Lain Huerta, R.; Martinez Santiago, T.; Ramirez Oyangueren, P.

    1993-01-01

    This report describes the experiment ''thermal simulation of storage in TSS-galleries'', which is being developed in the Asse salt mine, Germany. The report has three parts: 1) analysis of objectives and general description of boundary layers; 2) geomechanical parameters of the salt mine; 3) thermal modelling, thermomechanical modelling, and data acquisition.

  18. Parity simulation for nuclear plant analysis

    International Nuclear Information System (INIS)

    Hansen, K.F.; Depiente, E.

    1986-01-01

    The analysis of the transient performance of nuclear plants is sufficiently complex that simulation tools are needed for design and safety studies. These tools are normally digital because of the speed, flexibility, generality, and repeatability of digital computers. However, communication with digital computers is an awkward matter, requiring special skill or training. The designer wishing to gain insight into system behavior must expend considerable effort in learning to use computer codes, or else have an intermediary communicate with the machine. There has been a recent development in analog simulation that simplifies the user interface with the simulator while at the same time improving the performance of analog computers. This development is termed parity simulation and is now in routine use in analyzing power electronic network transients. The authors describe the concept of parity simulation and present some results of using the approach to simulate neutron kinetics problems.

  19. Verifying Real-Time Systems using Explicit-time Description Methods

    Directory of Open Access Journals (Sweden)

    Hao Wang

    2009-12-01

    Full Text Available Timed model checking has been extensively researched in recent years. Many new formalisms with time extensions and tools based on them have been presented. On the other hand, Explicit-Time Description Methods aim to verify real-time systems with general untimed model checkers. Lamport presented an explicit-time description method using a clock-ticking process (Tick to simulate the passage of time together with a group of global variables for time requirements. This paper proposes a new explicit-time description method with no reliance on global variables. Instead, it uses rendezvous synchronization steps between the Tick process and each system process to simulate time. This new method achieves better modularity and facilitates usage of more complex timing constraints. The two explicit-time description methods are implemented in DIVINE, a well-known distributed-memory model checker. Preliminary experiment results show that our new method, with better modularity, is comparable to Lamport's method with respect to time and memory efficiency.
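    The rendezvous-based tick scheme described above can be caricatured in a few lines. This is an illustrative sketch only (the names and the deadline-checking logic are invented here, not taken from the paper or from DIVINE): a Tick process advances the clock and performs one synchronization step per system process per tick, so timing requirements can be checked by an untimed engine.

```python
def make_process(name, deadline, events):
    """A system process that must fire an action at its deadline tick."""
    def step(now):
        if now == deadline:
            events.append((name, now))  # timed action fires
    return step

def run(processes, horizon):
    events = []
    steps = [make_process(n, d, events) for n, d in processes]
    for now in range(horizon + 1):  # the Tick process advances the clock...
        for step in steps:          # ...and rendezvouses once with each process
            step(now)
    return events

print(run([("p1", 2), ("p2", 5)], horizon=6))
# each process fires exactly once, at its own deadline tick
```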

  20. Regional hydrogeological simulations for Forsmark - numerical modelling using DarcyTools. Preliminary site description Forsmark area version 1.2

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2005-12-15

    A numerical model is developed on a regional-scale (hundreds of square kilometres) to study the zone of influence for variable-density groundwater flow that affects the Forsmark area. Transport calculations are performed by particle tracking from a local-scale release area (a few square kilometres) to test the sensitivity to different hydrogeological uncertainties and the need for far-field realism. The main objectives of the regional flow modelling were to achieve the following: I. Palaeo-hydrogeological understanding: An improved understanding of the palaeohydrogeological conditions is necessary in order to gain credibility for the site descriptive model in general and the hydrogeological description in particular. This requires modelling of the groundwater flow from the last glaciation up to present-day with comparisons against measured TDS and other hydro-geochemical measures. II. Simulation of flow paths: The simulation and visualisation of flow paths from a tentative repository area is a means for describing the role of the current understanding of the modelled hydrogeological conditions in the target volume, i.e. the conditions of primary interest for Safety Assessment. Of particular interest here is demonstration of the need for detailed far-field realism in the numerical simulations. The motivation for a particular model size (and resolution) and set of boundary conditions for a realistic description of the recharge and discharge connected to the flow at repository depth is an essential part of the groundwater flow path simulations. The numerical modelling was performed by two separate modelling teams, the ConnectFlow Team and the DarcyTools Team. The work presented in this report was based on the computer code DarcyTools developed by Computer-aided Fluid Engineering. DarcyTools is a kind of equivalent porous media (EPM) flow code specifically designed to treat flow and salt transport in sparsely fractured crystalline rock intersected by transmissive

  1. Regional hydrogeological simulations for Forsmark - numerical modelling using DarcyTools. Preliminary site description Forsmark area version 1.2

    International Nuclear Information System (INIS)

    Follin, Sven; Stigsson, Martin; Svensson, Urban

    2005-12-01

    A numerical model is developed on a regional-scale (hundreds of square kilometres) to study the zone of influence for variable-density groundwater flow that affects the Forsmark area. Transport calculations are performed by particle tracking from a local-scale release area (a few square kilometres) to test the sensitivity to different hydrogeological uncertainties and the need for far-field realism. The main objectives of the regional flow modelling were to achieve the following: I. Palaeo-hydrogeological understanding: An improved understanding of the palaeohydrogeological conditions is necessary in order to gain credibility for the site descriptive model in general and the hydrogeological description in particular. This requires modelling of the groundwater flow from the last glaciation up to present-day with comparisons against measured TDS and other hydro-geochemical measures. II. Simulation of flow paths: The simulation and visualisation of flow paths from a tentative repository area is a means for describing the role of the current understanding of the modelled hydrogeological conditions in the target volume, i.e. the conditions of primary interest for Safety Assessment. Of particular interest here is demonstration of the need for detailed far-field realism in the numerical simulations. The motivation for a particular model size (and resolution) and set of boundary conditions for a realistic description of the recharge and discharge connected to the flow at repository depth is an essential part of the groundwater flow path simulations. The numerical modelling was performed by two separate modelling teams, the ConnectFlow Team and the DarcyTools Team. The work presented in this report was based on the computer code DarcyTools developed by Computer-aided Fluid Engineering. DarcyTools is a kind of equivalent porous media (EPM) flow code specifically designed to treat flow and salt transport in sparsely fractured crystalline rock intersected by transmissive

  2. Analog reactor simulator RAS; Reaktorski analogni simulator RAS

    Energy Technology Data Exchange (ETDEWEB)

    Radanovic, Lj; Bingulac, S; Popovic, D [The Institute of Nuclear Sciences Boris Kidric, Vinca, Beograd (Yugoslavia)

    1961-07-01

    Analog computer RAS was designed as a nuclear reactor simulator, but it can be simultaneously used for solving a number of other problems. This paper contains a brief description of the simulator parts and their principal characteristics.

  3. Biomimicry: Descriptive analysis of biodiversity strategy adoption for business sustainable performance

    Directory of Open Access Journals (Sweden)

    Sivave Mashingaidze

    2014-06-01

    Full Text Available Biomimicry is a novel interdisciplinary field that mimics nature's best ideas and processes to solve human problems. The objective of this article was to conduct a descriptive documentary analysis of the biodiversity literature and to recommend biodiversity strategies for business adoption in pursuit of sustainable performance. The research was based on nine (9) Life's Principles inspired by nature. The findings indicated that most business theories and strategies can mimic nature in order to achieve sustainable performance. The research is conceptual and therefore does not offer a direct practical proposition; its value lies in describing the ideas and strategies found in nature and outlining their fundamental principles, since biodiversity has a track record of sustainability without human interference that humanity can mimic.

  4. Social interaction, globalization and computer-aided analysis a practical guide to developing social simulation

    CERN Document Server

    Osherenko, Alexander

    2014-01-01

    This thorough, multidisciplinary study discusses the findings of social interaction and social simulation research using understandable global examples. It shows the reader how to acquire intercultural data, illustrating each step with descriptive comments and program code.

  5. PRODUCTION SYSTEM MODELING AND SIMULATION USING DEVS FORMALISM

    OpenAIRE

    Amaya Hurtado, Darío; Castillo Estepa, Ricardo Andrés; Avilés Montaño, Óscar Fernando; Ramos Sandoval, Olga Lucía

    2014-01-01

    This article presents the Discrete Event System Specification (DEVS) formalism in its atomic and coupled configurations, which are used for discrete event systems modeling and simulation. The work first reviews discrete event systems concepts and their applicability. It then gives a comprehensive description of the structure of the DEVS formalism, in order to model and simulate an industrial process, taking into account changes in parameters such as process service time, each...
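    As a hedged illustration of the formalism the abstract names (not the authors' code), an atomic DEVS model is defined by a time-advance function, internal and external transition functions, and an output function; a minimal single-server model with an assumed fixed service time might look like:

```python
class AtomicServer:
    """Minimal atomic DEVS model: a single server with a fixed service time."""
    def __init__(self, service_time):
        self.service_time = service_time
        self.queue = []      # jobs waiting or in service
        self.phase = "idle"

    def ta(self):            # time advance: delay until the next internal event
        return self.service_time if self.phase == "busy" else float("inf")

    def delta_ext(self, job):  # external transition: a job arrives
        self.queue.append(job)
        self.phase = "busy"

    def output(self):        # output function: emitted just before delta_int
        return self.queue[0]

    def delta_int(self):     # internal transition: the current job completes
        self.queue.pop(0)
        self.phase = "busy" if self.queue else "idle"

srv = AtomicServer(service_time=3.0)
srv.delta_ext("job-1")           # arrival: server becomes busy
assert srv.ta() == 3.0           # next internal event after 3.0 time units
assert srv.output() == "job-1"
srv.delta_int()                  # completion: server idle again
assert srv.ta() == float("inf")
```

    A coupled DEVS model would wire the output of one such atomic model to the external transition of another; a simulator advances time by the minimum `ta()` over all components.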

  6. A descriptive analysis of studies on behavioural treatment of drooling (1970-2005).

    NARCIS (Netherlands)

    Burg, J.J.W. van der; Didden, R.; Jongerius, P.H.; Rotteveel, J.J.

    2007-01-01

    A descriptive analysis was conducted on studies on the behavioural treatment of drooling (published between 1970 and 2005). The 17 articles that met the inclusion criteria described 53 participants (mean age 14y 7mo, [SD 4y 9mo]; range 6-28y). Sex of 87% of the participants was reported: 28 male, 18

  7. A descriptive analysis of studies on behavioural treatment of drooling (1970-2005)

    NARCIS (Netherlands)

    Burg, J.J.W. van der; Didden, H.C.M.; Jongerius, P.H.; Rotteveel, J.J.

    2007-01-01

    A descriptive analysis was conducted on studies on the behavioural treatment of drooling (published between 1970 and 2005). The 17 articles that met the inclusion criteria described 53 participants (mean age 14y 7mo, [SD 4y 9mo]; range 6-28y). Sex of 87% of the participants was reported: 28 male, 18

  8. Visualizing human communication in business process simulations

    Science.gov (United States)

    Groehn, Matti; Jalkanen, Janne; Haho, Paeivi; Nieminen, Marko; Smeds, Riitta

    1999-03-01

    In this paper a description of business process simulation is given. A crucial part of the simulation of business processes is the analysis of the social contacts between the participants. We introduce a tool to collect log data and show how this log data can be effectively analyzed using two different kinds of methods: discussion flow charts and self-organizing maps. Discussion flow charts reveal the communication patterns, and self-organizing maps are a very effective way of clustering the participants into development groups.

  9. A simulation program for the VIRGO experiment

    International Nuclear Information System (INIS)

    Caron, B.; Dominjon, A.; Flaminio, R.; Marion, F.; Massonet, L.; Morand, R.; Mours, B.; Verkindt, D.; Yvert, M.

    1994-07-01

    Within the VIRGO experiment a simulation program is being developed that provides an accurate description of the behaviour of the interferometric antenna, taking into account all sources of noise. Besides its future use as a tool for data analysis and for the commissioning of the apparatus, the simulation helps finalize the design of the detector. Emphasis is currently put on the study of the stability of the optical components involved in the global feedback control system of the interferometer. (author). 5 refs., 4 figs

  10. A job analysis design for the rail industry : description and model analysis of the job of freight conductor.

    Science.gov (United States)

    2013-10-01

    This document provides a step-by-step description of the design and execution of a strategic job analysis, using the position of Freight Conductor as an example. This document was created to be useful for many different needs, and can be used as an e...

  11. Contribution to aroma characteristics of mutton process flavor from the enzymatic hydrolysate of sheep bone protein assessed by descriptive sensory analysis and gas chromatography olfactometry.

    Science.gov (United States)

    Zhan, Ping; Tian, Honglei; Zhang, Xiaoming; Wang, Liping

    2013-03-15

    Changes in the aroma characteristics of mutton process flavors (MPFs) prepared from sheep bone protein hydrolysates (SBPHs) with different degrees of hydrolysis (DH) were evaluated using gas chromatography-mass spectrometry (GC-MS), gas chromatography-olfactometry (GC-O), and descriptive sensory analysis (DSA). Five attributes (muttony, meaty, roasted, mouthful, and simulate) were selected to assess MPFs. The results of DSA showed a distinct difference among the control sample MPF0 and other MPF samples with added SBPHs for different DHs of almost all sensory attributes. MPF5 (DH 25.92%) was the strongest in the muttony, meaty, and roasted attributes, whereas MPF6 (DH 30.89%) was the strongest in the simulate and roasted attributes. Thirty-six compounds were identified as odor-active compounds for the evaluation of the sensory characteristics of MPFs via GC-MS-O analysis. The results of correlation analysis among odor-active compounds, molecular weight, and DSA further confirmed that the SBPH with a DH range of 25.92-30.89% may be a desirable precursor for the sensory characteristics of MPF. Copyright © 2013 Elsevier B.V. All rights reserved.

  12. Deliverable 6.2 - Software: upgraded MC simulation tools capable of simulating a complete in-beam ET experiment, from the beam to the detected events. Report with the description of one (or few) reference clinical case(s), including the complete patient model and beam characteristics

    CERN Document Server

    The ENVISION Collaboration

    2014-01-01

    Deliverable 6.2 - Software: upgraded MC simulation tools capable of simulating a complete in-beam ET experiment, from the beam to the detected events. Report with the description of one (or few) reference clinical case(s), including the complete patient model and beam characteristics

  13. Simulators for nuclear power plants

    International Nuclear Information System (INIS)

    Ancarani, A.; Zanobetti, D.

    1983-01-01

    The different types of simulator for nuclear power plants depend on the kind of programme and the degree of representation to be achieved, which in turn determines the functions to duplicate. Different degrees correspond to different simulators and hence to different choices in the functions. Training of nuclear power plant operators takes advantage of the contribution of simulators of various degrees of complexity and fidelity. Reduced scope simulators are best for understanding basic phenomena; replica simulators are best used for formal qualification and requalification of personnel, while modular mini simulators of single parts of a plant are best for replay and assessment of malfunctions. Another category consists of simulators for the development of assistance during operation, with the inclusion of disturbance and alarm analysis. The only existing standard on simulators is, at present, the one adopted in the United States. This is too stringent and is never complied with by present simulators. A description of possible advantages of a European standard is therefore offered: it rests on methods of measurement of basic simulator characteristics such as fidelity in values and time. (author)

  14. Computer image analysis of seed shape and seed color for flax cultivar description

    Czech Academy of Sciences Publication Activity Database

    Wiesnerová, Dana; Wiesner, Ivo

    2008-01-01

    Roč. 61, č. 2 (2008), s. 126-135 ISSN 0168-1699 R&D Projects: GA ČR GA521/03/0019 Institutional research plan: CEZ:AV0Z50510513 Keywords : image analysis * cultivar description * flax Subject RIV: EA - Cell Biology Impact factor: 1.273, year: 2008

  15. Natural Language Description of Emotion

    Science.gov (United States)

    Kazemzadeh, Abe

    2013-01-01

    This dissertation studies how people describe emotions with language and how computers can simulate this descriptive behavior. Although many non-human animals can express their current emotions as social signals, only humans can communicate about emotions symbolically. This symbolic communication of emotion allows us to talk about emotions that we…

  16. Design and Analysis of simulation experiments : Tutorial

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2017-01-01

    This tutorial reviews the design and analysis of simulation experiments. These experiments may have various goals: validation, prediction, sensitivity analysis, optimization (possibly robust), and risk or uncertainty analysis. These goals may be realized through metamodels. Two types of metamodels

  17. Coupling an analytical description of anti-scatter grids with simulation software of radiographic systems using Monte Carlo code; Couplage d'une methode de description analytique de grilles anti diffusantes avec un logiciel de simulation de systemes radiographiques base sur un code Monte Carlo

    Energy Technology Data Exchange (ETDEWEB)

    Rinkel, J.; Dinten, J.M.; Tabary, J

    2004-07-01

    The use of focused anti-scatter grids on digital radiographic systems with two-dimensional detectors produces acquisitions with a decreased scatter-to-primary ratio and thus improved contrast and resolution. Simulation software is of great interest for optimizing the grid configuration for a specific application. Classical simulators are based on complete, detailed geometric descriptions of the grid. They are accurate but very time consuming, since they use a Monte Carlo code to simulate scatter within the high-frequency grids. We propose a new practical method that couples an analytical simulation of the grid interaction with a radiographic system simulation program. First, a two-dimensional matrix of probabilities depending on the grid is created offline, in which the first dimension represents the angle of impact with respect to the normal to the grid lines and the other the energy of the photon. This matrix of probabilities is then used by the Monte Carlo simulation software to provide the final scattered flux image. To evaluate the gain in CPU time, we define the increasing factor as the factor by which the CPU time of the simulation increases with, as opposed to without, the grid. Increasing factors were calculated with the new model and with the classical method, which represents the grid by its CAD model as part of the object. With the new method, increasing factors are lower by one to two orders of magnitude. These results were obtained with a difference in calculated scatter of less than five percent between the new and the classical method. (authors)
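    The two-step scheme described above (an offline angle/energy probability table, then run-time lookup inside the Monte Carlo photon loop) can be sketched as follows. The attenuation formula and all parameter values below are invented for illustration and are not the authors' grid physics:

```python
import numpy as np

def build_table(angles_deg, energies_kev):
    """Offline step: transmission probability per (angle, energy) bin.
    Toy model: transmission falls with obliquity, rises with energy."""
    a, e = np.meshgrid(angles_deg, energies_kev, indexing="ij")
    return np.clip(np.cos(np.radians(a)) * (e / e.max()) ** 0.3, 0.0, 1.0)

angles = np.linspace(0.0, 20.0, 41)       # impact angle from grid-line normal, deg
energies = np.linspace(20.0, 120.0, 101)  # photon energy, keV
table = build_table(angles, energies)     # shape (41, 101)

def transmitted(rng, angle, energy):
    """Run-time step: accept or reject one Monte Carlo photon by table lookup."""
    i = min(np.searchsorted(angles, angle), len(angles) - 1)
    j = min(np.searchsorted(energies, energy), len(energies) - 1)
    return rng.random() < table[i, j]

rng = np.random.default_rng(0)
hits = sum(transmitted(rng, 0.0, 120.0) for _ in range(1000))
# normal-incidence photons at the top energy are always transmitted in this toy model
```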

  18. The geometry description markup language

    International Nuclear Information System (INIS)

    Chytracek, R.

    2001-01-01

    Currently, a lot of effort is being put into designing complex detectors. A number of simulation and reconstruction frameworks and applications have been developed with the aim of making this job easier. A very important role in this activity is played by the geometry description of the detector apparatus layout and its working environment. However, no common approach to representing geometry data exists, and such data can be found in various forms, from custom semi-structured text files and source code (C/C++/FORTRAN) to XML and database solutions. XML (Extensible Markup Language) has proven to provide an interesting approach for describing detector geometries, with several different but incompatible XML-based solutions in existence. Therefore, interoperability and geometry data exchange among different frameworks is not possible at present. The author introduces a markup language for geometry descriptions. Its aim is to define a common approach for the sharing and exchange of geometry description data. Its requirements and design have been driven by experience and user feedback from existing projects which have their geometry description in XML.

  19. Image-Based Reconstruction and Analysis of Dynamic Scenes in a Landslide Simulation Facility

    Science.gov (United States)

    Scaioni, M.; Crippa, J.; Longoni, L.; Papini, M.; Zanzi, L.

    2017-12-01

    The application of image processing and photogrammetric techniques to the dynamic reconstruction of landslide simulations in a scaled-down facility is described. The simulations are also used here for active-learning purposes: students are helped to understand how physical processes happen and which kinds of observations may be obtained from a sensor network. In particular, the use of digital images to obtain multi-temporal information is presented. On one side, using a multi-view sensor set-up based on four synchronized GoPro 4 Black® cameras, a 4D (3D spatial position and time) reconstruction of the dynamic scene is obtained through the composition of several 3D models derived from dense image matching. The final textured 4D model allows one to revisit a completed experiment at any time in a dynamic and interactive mode. On the other side, a digital image correlation (DIC) technique has been used to track surface point displacements in the image sequence obtained from the camera in front of the simulation facility. While the 4D model provides a qualitative description and documentation of the running experiment, the DIC analysis outputs quantitative information such as local point displacements and velocities, to be related to physical processes and to other observations. All the hardware and software equipment adopted for the photogrammetric reconstruction is based on low-cost and open-source solutions.

  20. IMAGE-BASED RECONSTRUCTION AND ANALYSIS OF DYNAMIC SCENES IN A LANDSLIDE SIMULATION FACILITY

    Directory of Open Access Journals (Sweden)

    M. Scaioni

    2017-12-01

    Full Text Available The application of image processing and photogrammetric techniques to the dynamic reconstruction of landslide simulations in a scaled-down facility is described. The simulations are also used here for active-learning purposes: students are helped to understand how physical processes happen and which kinds of observations may be obtained from a sensor network. In particular, the use of digital images to obtain multi-temporal information is presented. On one side, using a multi-view sensor set-up based on four synchronized GoPro 4 Black® cameras, a 4D (3D spatial position and time) reconstruction of the dynamic scene is obtained through the composition of several 3D models derived from dense image matching. The final textured 4D model allows one to revisit a completed experiment at any time in a dynamic and interactive mode. On the other side, a digital image correlation (DIC) technique has been used to track surface point displacements in the image sequence obtained from the camera in front of the simulation facility. While the 4D model provides a qualitative description and documentation of the running experiment, the DIC analysis outputs quantitative information such as local point displacements and velocities, to be related to physical processes and to other observations. All the hardware and software equipment adopted for the photogrammetric reconstruction is based on low-cost and open-source solutions.
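    The DIC step described above amounts to template matching between successive frames. A toy sketch (illustrative only; real DIC adds subset shape functions and subpixel interpolation, which are omitted here) might be:

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation of two equal-size patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom else 0.0

def track(frame1, frame2, y, x, size=5, search=3):
    """Return the (dy, dx) displacement of the template at (y, x)."""
    tpl = frame1[y:y + size, x:x + size]
    best, best_d = -2.0, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            win = frame2[y + dy:y + dy + size, x + dx:x + dx + size]
            score = ncc(tpl, win)
            if score > best:
                best, best_d = score, (dy, dx)
    return best_d

rng = np.random.default_rng(1)
frame1 = rng.random((40, 40))
frame2 = np.roll(frame1, shift=(2, -1), axis=(0, 1))  # scene moves 2 down, 1 left
print(track(frame1, frame2, 15, 15))  # recovers the imposed (2, -1) shift
```

    Dividing displacements by the inter-frame interval then yields the point velocities the abstract mentions.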

  1. Simulation of CIFF (Centralized IFF) remote control displays

    Science.gov (United States)

    Tucker, D. L.; Leibowitz, L. M.

    1986-06-01

    This report presents the software simulation of the Remote-Control-Display (RCS) proposed to be used in the Centralized IFF (CIFF) system. A description of the simulation programs along with simulated menu formats are presented. A sample listing of the simulation programs and a brief description of the program operation are also included.

  2. Sensitivity Analysis of Simulation Models

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2009-01-01

    This contribution presents an overview of sensitivity analysis of simulation models, including the estimation of gradients. It covers classic designs and their corresponding (meta)models; namely, resolution-III designs including fractional-factorial two-level designs for first-order polynomial

  3. High Performance Descriptive Semantic Analysis of Semantic Graph Databases

    Energy Technology Data Exchange (ETDEWEB)

    Joslyn, Cliff A.; Adolf, Robert D.; al-Saffar, Sinan; Feo, John T.; Haglin, David J.; Mackey, Greg E.; Mizell, David W.

    2011-06-02

    As semantic graph database technology grows to address components ranging from extant large triple stores to SPARQL endpoints over SQL-structured relational databases, it will become increasingly important to be able to understand their inherent semantic structure, whether codified in explicit ontologies or not. Our group is researching novel methods for what we call descriptive semantic analysis of RDF triplestores, to serve purposes of analysis, interpretation, visualization, and optimization. But data size and computational complexity make it increasingly necessary to bring high-performance computational resources to bear on this task. Our research group built a novel high-performance hybrid system comprising computational capability for semantic graph database processing, utilizing the large multi-threaded architecture of the Cray XMT platform, conventional servers, and large data stores. In this paper we describe that architecture and our methods, and present the results of our analyses of basic properties, connected components, namespace interaction, and typed paths for the Billion Triple Challenge 2010 dataset.
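    One of the analyses listed above, connected components, can be sketched over an RDF-like edge list with a union-find pass. This is an illustrative sketch only, not the Cray XMT implementation:

```python
def components(triples):
    """Connected components of the undirected graph induced by (s, p, o) triples."""
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    for s, _p, o in triples:               # union subject and object roots
        parent[find(s)] = find(o)
    groups = {}
    for node in parent:
        groups.setdefault(find(node), set()).add(node)
    return list(groups.values())

triples = [("a", "knows", "b"), ("b", "knows", "c"), ("x", "knows", "y")]
print(sorted(map(sorted, components(triples))))
# two components: ['a', 'b', 'c'] and ['x', 'y']
```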

  4. 3D-finite element impact simulation on concrete structures

    Energy Technology Data Exchange (ETDEWEB)

    Heider, N.

    1989-12-15

    The analysis of impact processes is an interesting application of full 3D Finite Element calculations. This work presents a simulation of the penetration process of a Kinetic Energy projectile into a concrete target. Such a calculation requires an adequate FE model, especially a proper description of the crack opening process in front of the projectile. The aim is the prediction of the structural survival of the penetrator case with the help of an appropriate failure criterion. Also, the computer simulation allows a detailed analysis of the physical phenomena during impact. (orig.) With 4 refs., 14 figs.

  5. Modeling and simulation of equivalent circuits in description of biological systems - a fractional calculus approach

    Directory of Open Access Journals (Sweden)

    José Francisco Gómez Aguilar

    2012-07-01

    Full Text Available Using the fractional calculus approach, we present the Laplace analysis of an equivalent electrical circuit for a multilayered system, which includes distributed elements of the Cole model type. The Bode graphs are obtained from the numerical simulation of the corresponding transfer functions, using arbitrary electrical parameters in order to illustrate the methodology. A numerical Laplace transform is used for the simulation of the fractional differential equations. From the results of the analysis, we obtain the formula for the equivalent electrical circuit of a simple spectrum, such as that generated by a real sample of blood tissue, and the corresponding Nyquist diagrams. In addition to maintaining consistency in the fitted electrical parameters, the advantage of using fractional differential equations in the study of impedance spectra is made clear in the analysis used to determine a compact formula for the equivalent electrical circuit, which includes the Cole model and a simple RC model as special cases.
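    A Cole-type distributed element such as the one described above has impedance Z(w) = R_inf + (R0 - R_inf) / (1 + (j w tau)^alpha), where the fractional exponent alpha reduces to the simple RC case at alpha = 1. The sketch below computes its Bode magnitude and phase with assumed parameter values (not those of the article):

```python
import numpy as np

def cole_impedance(w, r0=1e4, r_inf=1e2, tau=1e-3, alpha=0.8):
    """Cole-type element; fractional exponent alpha in (0, 1]."""
    return r_inf + (r0 - r_inf) / (1.0 + (1j * w * tau) ** alpha)

w = np.logspace(0, 6, 200)             # angular frequency, rad/s
z = cole_impedance(w)
mag_db = 20.0 * np.log10(np.abs(z))    # Bode magnitude
phase_deg = np.degrees(np.angle(z))    # Bode phase

# limiting behaviour: |Z| -> r0 at low frequency, |Z| -> r_inf at high frequency
print(abs(z[0]), abs(z[-1]))
```

    Plotting the real part of `z` against its negative imaginary part gives the depressed-semicircle Nyquist diagram characteristic of the Cole model.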

  6. Shock Mechanism Analysis and Simulation of High-Power Hydraulic Shock Wave Simulator

    Directory of Open Access Journals (Sweden)

    Xiaoqiu Xu

    2017-01-01

    Full Text Available The simulation of a regular shock wave (e.g., half-sine) can be achieved by a traditional rubber shock simulator, but a practical high-power shock wave, characterized by a steep pre-peak and a gentle post-peak, is hard to realize with the same equipment. To tackle this disadvantage, a novel high-power hydraulic shock wave simulator based on the live-firing muzzle shock principle is proposed in the current work. The influence of the typical shock characteristic parameters on the shock force wave was investigated via both theoretical deduction and software simulation. The obtained data indicate that the developed hydraulic shock wave simulator can be applied to simulate the real conditions of the shocked system. Further, a similarity evaluation of the shock wave simulation was carried out based on the curvature distance, and the results showed that the simulation method is reasonable and that structural optimization based on software simulation is beneficial to increased efficiency. Finally, the combination of theoretical analysis and simulation for the development of an artillery recoil tester is a comprehensive approach to the design and structural optimization of the recoil system.

  7. Flow analysis of HANARO flow simulated test facility

    International Nuclear Information System (INIS)

    Park, Yong-Chul; Cho, Yeong-Garp; Wu, Jong-Sub; Jun, Byung-Jin

    2002-01-01

    The HANARO, a multi-purpose research reactor of the 30 MWth open-tank-in-pool type, has been in normal operation since its initial criticality in February 1995. Many experiments should be safely performed to expand the utilization of the HANARO. A flow simulated test facility is being developed for the endurance testing of reactivity control units over extended lifetimes and for the verification of the structural integrity of experimental facilities prior to loading in the HANARO. This test facility is composed of three major parts: a half-core structure assembly, a flow circulation system and a support system. The half-core structure assembly is composed of a plenum, a grid plate, core channels with flow tubes, a chimney and a dummy pool. The flow channels are to be fitted with flow orifices to simulate the core channels. The test facility must reproduce flow characteristics similar to those of the HANARO. This paper therefore describes an analysis of the flow behavior of the test facility. The computational flow analysis was performed to verify the flow structure and similarity of the test facility, assuming that the flow rates and pressure differences of the core channel are constant. The shapes of the flow orifices were determined by trial and error based on the design requirements of the core channel. A computational analysis program with the standard k-ε turbulence model was applied in a three-dimensional analysis. The results of the flow simulation showed flow characteristics similar to those of the HANARO and satisfied the design requirements of the test facility. The shape of the flow orifices used in this numerical simulation can be adapted to manufacturing requirements. The flow rate and the pressure difference through the core channel proved by this simulation can be used as design requirements for the flow system. The analysis results will be verified against the results of the flow test after construction of the flow system. (author)

  8. Simulation of machine-specific topographic indices for use across platforms.

    Science.gov (United States)

    Mahmoud, Ashraf M; Roberts, Cynthia; Lembach, Richard; Herderick, Edward E; McMahon, Timothy T

    2006-09-01

The objective of this project was to simulate currently published topographic indices used for the detection and evaluation of keratoconus, allowing their application to maps acquired from multiple topographic machines. A retrospective analysis was performed on 21 eyes of 14 previously diagnosed keratoconus patients from a single practice using a Tomey TMS-1, an Alcon EyeMap, and a Keratron Topographer. Maps that could not be processed or that contained processing errors were excluded from analysis. Topographic indices native to each of the three devices were recorded from each map. Software was written in ANSI standard C to simulate the indices based on the published formulas and/or descriptions, extending the functionality of The Ohio State University Corneal Topography Tool (OSUCTT), a software package designed to accept input from many corneal topographic devices and provide consistent display and analysis. Twenty indices were simulated. Linear regression analysis was performed between each simulated index and the corresponding native index. A cross-platform comparison using regression analysis was also performed. All simulated indices were significantly correlated with the corresponding native indices. Cross-platform comparisons may be limited for specific indices.
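The per-index validation step, regressing each simulated index against the machine's native index, can be illustrated with a minimal ordinary-least-squares helper (the sample data are made up; OSUCTT's actual indices and formulas are not reproduced here):

```python
import statistics

def linear_regression(x, y):
    """Ordinary least squares fit y = a + b*x; returns (a, b, r)."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx                    # slope
    a = my - b * mx                  # intercept
    r = sxy / (sxx * syy) ** 0.5     # Pearson correlation
    return a, b, r
```

Each native index would be paired with its simulated counterpart across the 21 eyes; an r close to 1 indicates the simulation reproduces the machine's index well.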

  9. A descriptive analysis of quantitative indices for multi-objective block layout

    Directory of Open Access Journals (Sweden)

    Amalia Medina Palomera

    2013-01-01

Full Text Available Layout generation methods provide alternative solutions whose feasibility and quality must be evaluated. Indices must be used to distinguish among the feasible solutions (involving different criteria) obtained for block layout and to identify a solution's suitability according to the set objectives. This paper provides an accurate and descriptive analysis of the geometric indices used in designing facility layouts (during the block layout phase). The indices studied here have advantages and disadvantages which should be considered by an analyst before attempting to resolve the facility layout problem. New equations are proposed for measuring geometric indices. The analysis revealed redundant indices, and that a minimum number of indices covering the overall quality criteria may be used when selecting alternative solutions.
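Two common geometric indices for block layout, how fully a department fills its bounding box and how close that box is to a square, can be written directly; these are generic textbook-style indices, not necessarily the specific equations proposed in the paper:

```python
def fill_index(department_area, bbox_w, bbox_h):
    """How fully a department fills its bounding box (1.0 = perfect rectangle)."""
    return department_area / (bbox_w * bbox_h)

def aspect_index(bbox_w, bbox_h):
    """Closeness of the bounding box to a square (1.0 = square)."""
    return min(bbox_w, bbox_h) / max(bbox_w, bbox_h)
```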

  10. Towards reproducible descriptions of neuronal network models.

    Directory of Open Access Journals (Sweden)

    Eilen Nordlie

    2009-08-01

    Full Text Available Progress in science depends on the effective exchange of ideas among scientists. New ideas can be assessed and criticized in a meaningful manner only if they are formulated precisely. This applies to simulation studies as well as to experiments and theories. But after more than 50 years of neuronal network simulations, we still lack a clear and common understanding of the role of computational models in neuroscience as well as established practices for describing network models in publications. This hinders the critical evaluation of network models as well as their re-use. We analyze here 14 research papers proposing neuronal network models of different complexity and find widely varying approaches to model descriptions, with regard to both the means of description and the ordering and placement of material. We further observe great variation in the graphical representation of networks and the notation used in equations. Based on our observations, we propose a good model description practice, composed of guidelines for the organization of publications, a checklist for model descriptions, templates for tables presenting model structure, and guidelines for diagrams of networks. The main purpose of this good practice is to trigger a debate about the communication of neuronal network models in a manner comprehensible to humans, as opposed to machine-readable model description languages. We believe that the good model description practice proposed here, together with a number of other recent initiatives on data-, model-, and software-sharing, may lead to a deeper and more fruitful exchange of ideas among computational neuroscientists in years to come. We further hope that work on standardized ways of describing--and thinking about--complex neuronal networks will lead the scientific community to a clearer understanding of high-level concepts in network dynamics, and will thus lead to deeper insights into the function of the brain.

  11. Water Quality Analysis Simulation Program (WASP)

    Science.gov (United States)

    The Water Quality Analysis Simulation Program (WASP) model helps users interpret and predict water quality responses to natural phenomena and manmade pollution for various pollution management decisions.

  12. Information system analysis of an e-learning system used for dental restorations simulation.

    Science.gov (United States)

    Bogdan, Crenguţa M; Popovici, Dorin M

    2012-09-01

The goal of using virtual and augmented reality technologies for the simulation of therapeutic interventions in fixed prosthodontics (the VirDenT project) is to increase the quality of the educational process in dental faculties by assisting students in learning how to prepare teeth for all-ceramic restorations. Its main component is an e-learning virtual reality-based software system that will be used to develop the tooth-grinding skills needed for all-ceramic restorations. The complexity of the domain problem that the software system deals with made an analysis of the information system supported by VirDenT necessary. The analysis contains the following activities: identification and classification of the system stakeholders, description of the business processes, formulation of the business rules, and modelling of business objects. During this stage, we constructed the context diagram, the business use case diagram, the activity diagrams and the class diagram of the domain model. These models are useful for the further development of the software system that implements the VirDenT information system. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  13. LCDD: A complete detector description package

    Energy Technology Data Exchange (ETDEWEB)

    Graf, Norman, E-mail: Norman.Graf@slac.stanford.edu; McCormick, Jeremy, E-mail: Jeremy.McCormick@slac.stanford.edu

    2015-07-21

    LCDD has been developed to provide a complete detector description package for physics detector simulations using Geant4. All aspects of the experimental setup, such as the physical geometry, magnetic fields, and sensitive detector readouts, as well as control of the physics simulations, such as physics processes, interaction models and kinematic limits, are defined at runtime. Users are therefore able to concentrate on the design of the detector system without having to master the intricacies of C++ programming or being proficient in setting up their own Geant4 application. We describe both the XML-based file format and the processors which communicate this information to the underlying Geant4 simulation toolkit.
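A runtime, XML-driven geometry description of the kind LCDD provides can be illustrated with a toy example; the element and attribute names below are invented for illustration and are not the actual LCDD schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical, LCDD-like snippet (NOT the real LCDD file format).
DETECTOR_XML = """
<detector>
  <box name="tracker"     x="100" y="100" z="30"  material="Silicon"/>
  <box name="calorimeter" x="200" y="200" z="120" material="Tungsten"/>
</detector>
"""

def load_volumes(xml_text):
    """Build a runtime geometry table from the XML description."""
    root = ET.fromstring(xml_text)
    return {
        elem.get("name"): {
            "material": elem.get("material"),
            "size_mm": tuple(float(elem.get(axis)) for axis in ("x", "y", "z")),
        }
        for elem in root.findall("box")
    }
```

The point of the runtime approach is visible even in the toy: changing the detector means editing the XML, not recompiling C++.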

  14. Simulation Experiments in Practice: Statistical Design and Regression Analysis

    OpenAIRE

    Kleijnen, J.P.C.

    2007-01-01

    In practice, simulation analysts often change only one factor at a time, and use graphical analysis of the resulting Input/Output (I/O) data. The goal of this article is to change these traditional, naïve methods of design and analysis, because statistical theory proves that more information is obtained when applying Design Of Experiments (DOE) and linear regression analysis. Unfortunately, classic DOE and regression analysis assume a single simulation response that is normally and independen...
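The DOE-plus-regression approach advocated here can be sketched for a 2^k full factorial design: simulate at coded factor levels -1/+1, then estimate main effects (which equal twice the corresponding regression coefficients). The toy response below stands in for a real simulation model:

```python
from itertools import product

def full_factorial(k):
    """All 2**k runs with factors coded -1 / +1."""
    return list(product((-1, 1), repeat=k))

def main_effects(design, responses):
    """Main effect per factor: mean(y | factor = +1) - mean(y | factor = -1)."""
    k = len(design[0])
    effects = []
    for j in range(k):
        hi = [y for run, y in zip(design, responses) if run[j] == 1]
        lo = [y for run, y in zip(design, responses) if run[j] == -1]
        effects.append(sum(hi) / len(hi) - sum(lo) / len(lo))
    return effects
```

Unlike one-factor-at-a-time experimentation, all runs contribute to every effect estimate, which is the efficiency gain the article argues for.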

  15. Facility/equipment performance evaluation using microcomputer simulation analysis

    International Nuclear Information System (INIS)

    Chockie, A.D.; Hostick, C.J.

    1985-08-01

    A computer simulation analysis model was developed at the Pacific Northwest Laboratory to assist in assuring the adequacy of the Monitored Retrievable Storage facility design to meet the specified spent nuclear fuel throughput requirements. The microcomputer-based model was applied to the analysis of material flow, equipment capability and facility layout. The simulation analysis evaluated uncertainties concerning both facility throughput requirements and process duration times as part of the development of a comprehensive estimate of facility performance. The evaluations provided feedback into the design review task to identify areas where design modifications should be considered
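A throughput evaluation of the kind described, propagating uncertain process durations through a facility model, can be sketched as a minimal single-station simulation; the cask counts and handling times are illustrative, not MRS design figures:

```python
import random

def casks_processed(n_casks, mean_h, spread_h, hours_available, seed=1):
    """Count casks that clear a single handling station before time runs out.

    Process durations are sampled uniformly in [mean - spread, mean + spread]
    to represent uncertain process duration times.
    """
    rng = random.Random(seed)
    clock = 0.0
    done = 0
    for _ in range(n_casks):
        duration = rng.uniform(mean_h - spread_h, mean_h + spread_h)
        if clock + duration > hours_available:
            break
        clock += duration
        done += 1
    return done
```

Repeating the run over many seeds yields a throughput distribution rather than a single point estimate, which is the kind of feedback the design review used.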

  16. Descriptive analysis of bacon smoked with Brazilian woods from reforestation: methodological aspects, statistical analysis, and study of sensory characteristics.

    Science.gov (United States)

    Saldaña, Erick; Castillo, Luiz Saldarriaga; Sánchez, Jorge Cabrera; Siche, Raúl; de Almeida, Marcio Aurélio; Behrens, Jorge H; Selani, Miriam Mabel; Contreras-Castillo, Carmen J

    2018-06-01

    The aim of this study was to perform a descriptive analysis (DA) of bacons smoked with woods from reforestation and liquid smokes in order to investigate their sensory profile. Six samples of bacon were selected: three smoked bacons with different wood species (Eucalyptus citriodora, Acacia mearnsii, and Bambusa vulgaris), two artificially smoked bacon samples (liquid smoke) and one negative control (unsmoked bacon). Additionally, a commercial bacon sample was also evaluated. DA was developed successfully, presenting a good performance in terms of discrimination, consensus and repeatability. The study revealed that the smoking process modified the sensory profile by intensifying the "saltiness" and differentiating the unsmoked from the smoked samples. The results from the current research represent the first methodological development of descriptive analysis of bacon and may be used by food companies and other stakeholders to understand the changes in sensory characteristics of bacon due to traditional smoking process. Copyright © 2018 Elsevier Ltd. All rights reserved.

  17. Uncertainty and sensitivity analysis of control strategies using the benchmark simulation model No1 (BSM1).

    Science.gov (United States)

    Flores-Alsina, Xavier; Rodriguez-Roda, Ignasi; Sin, Gürkan; Gernaey, Krist V

    2009-01-01

The objective of this paper is to perform an uncertainty and sensitivity analysis of the predictions of the Benchmark Simulation Model (BSM) No. 1, when comparing four activated sludge control strategies. The Monte Carlo simulation technique is used to evaluate the uncertainty in the BSM1 predictions, considering the ASM1 bio-kinetic parameters and influent fractions as input uncertainties while the Effluent Quality Index (EQI) and the Operating Cost Index (OCI) are focused on as model outputs. The resulting Monte Carlo simulations are presented using descriptive statistics indicating the degree of uncertainty in the predicted EQI and OCI. Next, the Standard Regression Coefficients (SRC) method is used for sensitivity analysis to identify which input parameters influence the uncertainty in the EQI predictions the most. The results show that control strategies including an ammonium (S(NH)) controller reduce uncertainty in both overall pollution removal and effluent total Kjeldahl nitrogen. Also, control strategies with an external carbon source reduce the effluent nitrate (S(NO)) uncertainty, increasing both their economic cost and variability as a trade-off. Finally, the maximum specific autotrophic growth rate (micro(A)) causes most of the variance in the effluent for all the evaluated control strategies. The influence of denitrification related parameters, e.g. eta(g) (anoxic growth rate correction factor) and eta(h) (anoxic hydrolysis rate correction factor), becomes less important when a S(NO) controller manipulating an external carbon source addition is implemented.
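The Monte Carlo plus Standardized Regression Coefficients (SRC) workflow can be sketched in a few lines: sample uncertain inputs, propagate them through a model, and standardize the fitted slopes. The two-parameter toy model below merely mimics the structure of the analysis; it is not BSM1:

```python
import random
import statistics

def standardized_regression_coeffs(inputs, output):
    """SRC_j = b_j * sd(x_j) / sd(y), with b_j from univariate OLS.

    This shortcut is valid when the Monte Carlo inputs are sampled
    independently, as in plain (non-correlated) Monte Carlo.
    """
    sy = statistics.stdev(output)
    my = statistics.fmean(output)
    srcs = []
    for col in inputs:
        mx = statistics.fmean(col)
        sxx = sum((v - mx) ** 2 for v in col)
        sxy = sum((v - mx) * (w - my) for v, w in zip(col, output))
        srcs.append((sxy / sxx) * statistics.stdev(col) / sy)
    return srcs

# Toy Monte Carlo: two uncertain "bio-kinetic" parameters, one output index.
rng = random.Random(42)
mu_a = [rng.uniform(0.5, 1.0) for _ in range(2000)]   # dominant parameter
eta_g = [rng.uniform(0.5, 0.9) for _ in range(2000)]  # weak parameter
index = [10.0 * a + e + rng.gauss(0.0, 0.1) for a, e in zip(mu_a, eta_g)]
src_mu_a, src_eta_g = standardized_regression_coeffs([mu_a, eta_g], index)
```

An SRC near 1 flags the input driving most of the output variance, which is how the paper identifies the autotrophic growth rate as dominant.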

  18. In situ visualization and data analysis for turbidity currents simulation

    Science.gov (United States)

    Camata, Jose J.; Silva, Vítor; Valduriez, Patrick; Mattoso, Marta; Coutinho, Alvaro L. G. A.

    2018-01-01

Turbidity currents are underflows responsible for sediment deposits that generate geological formations of interest for the oil and gas industry. LibMesh-sedimentation is an application built upon the libMesh library to simulate turbidity currents. In this work, we present the integration of libMesh-sedimentation with in situ visualization and in transit data analysis tools. DfAnalyzer is a solution based on provenance data to extract and relate strategic simulation data in transit from multiple sources for online queries. We integrate libMesh-sedimentation and ParaView Catalyst to perform in situ data analysis and visualization. We present a parallel performance analysis for two turbidity currents simulations showing that the overhead for both in situ visualization and in transit data analysis is negligible. We show that our tools enable monitoring the sediments appearance at runtime and steering the simulation based on the solver convergence and visual information on the sediment deposits, thus enhancing the analytical power of turbidity currents simulations.

  19. A program code generator for multiphysics biological simulation using markup languages.

    Science.gov (United States)

    Amano, Akira; Kawabata, Masanari; Yamashita, Yoshiharu; Rusty Punzalan, Florencio; Shimayoshi, Takao; Kuwabara, Hiroaki; Kunieda, Yoshitoshi

    2012-01-01

To cope with the complexity of biological function simulation models, model representation with description languages is becoming popular. However, the simulation software itself becomes complex in these environments, and thus it is difficult to modify the simulation conditions, target computation resources, or calculation methods. Complex biological function simulation software involves 1) model equations, 2) boundary conditions and 3) calculation schemes. A description model file is useful for the first point and partly for the second, but the third is difficult to handle, since various calculation schemes are required for simulation models constructed from two or more elementary models. We introduce a simulation software generation system that uses a description-language-based specification of the coupling calculation scheme together with the cell model description file. Using this system, we can easily generate biological simulation code with a variety of coupling calculation schemes. To show the efficiency of our system, an example of a coupling calculation scheme with three elementary models is presented.
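Description-driven code generation of the kind outlined here can be sketched by emitting an explicit-Euler stepper from a {variable: rate-expression} map; the template and model below are invented for illustration and bear no relation to the authors' actual description languages:

```python
STEP_TEMPLATE = """def step(state, dt):
    new_state = dict(state)
{updates}
    return new_state
"""

def generate_stepper(model):
    """Emit explicit-Euler update code from a {variable: rate-expression} map."""
    updates = "\n".join(
        f"    new_state['{var}'] = state['{var}'] + dt * ({rhs})"
        for var, rhs in model.items()
    )
    return STEP_TEMPLATE.format(updates=updates)

# Generate and load a stepper for a single decaying state variable.
source = generate_stepper({"v": "-state['v'] / 10.0"})
namespace = {}
exec(source, namespace)
step = namespace["step"]
```

Swapping the template (e.g. for a different integration scheme) changes the generated code without touching the model description, which is the separation the paper aims for.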

  20. Advancement in tritium transport simulations for solid breeding blanket system

    Energy Technology Data Exchange (ETDEWEB)

    Ying, Alice, E-mail: ying@fusion.ucla.edu [Mechanical and Aerospace Engineering Department, UCLA, Los Angeles, CA 90095 (United States); Zhang, Hongjie [Mechanical and Aerospace Engineering Department, UCLA, Los Angeles, CA 90095 (United States); Merrill, Brad J. [Idaho National Laboratory, Idaho Falls, ID 83415 (United States); Ahn, Mu-Young [National Fusion Research Institute, Daejeon (Korea, Republic of)

    2016-11-01

In this paper, advances in tritium transport simulation are demonstrated for the solid breeder blanket HCCR TBS, where multi-physics and detailed engineering descriptions are considered using a commercial simulation code. The physics involved includes compressible purge-gas flow, heat transfer, chemical reaction, the isotope swamping effect, and mass transport of tritium isotopes. The strategy adopted here is to develop numerical procedures and techniques that allow critical details of the material, geometric and operational heterogeneity of a complete engineering description of the TBS to be incorporated into the simulation. Our application focuses on transient assessment, in view of ITER's pulsed operation. An immediate advantage is a more realistic predictive and design analysis tool that accounts for the temperature variations induced by pulsed operation, which impact the helium purge gas flow as well as the time and space evolution of the Q{sub 2} composition concentration in the breeding regions. This affords a more accurate prediction of tritium permeation into the He coolant by accounting for the correct temperature and partial-pressure effects and realistic diffusion paths. The analysis also shows that introducing a by-pass line in the TES loop to accommodate ITER pulsed operation allows the tritium extraction design to be more cost-effective.

  1. System Design Description Salt Well Liquid Pumping Dynamic Simulation

    International Nuclear Information System (INIS)

    HARMSEN, R.W.

    1999-01-01

The Salt Well Liquid (SWL) Pumping Dynamic Simulation used by the single-shell tank (SST) Interim Stabilization Project is described. A graphical dynamic simulation predicts SWL removal from 29 SSTs using an exponential function with a unique time constant for each SST. Increasing quarterly efficiencies are applied to adjust the pumping rates during fiscal year 2000.
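The exponential removal model described above is a one-liner, and the quarterly-efficiency adjustment amounts to scaling the exponent per quarter; the initial volume, time constant, and efficiencies below are placeholders, not values for any actual SST:

```python
import math

def swl_remaining(v0_gal, tau_days, t_days, efficiency=1.0):
    """Salt well liquid remaining after t days: V(t) = V0 * exp(-eff * t / tau)."""
    return v0_gal * math.exp(-efficiency * t_days / tau_days)

def pumped_per_quarter(v0_gal, tau_days, efficiencies):
    """Volume removed in each ~91-day quarter, with a per-quarter efficiency."""
    removed, v = [], v0_gal
    for eff in efficiencies:
        v_next = v * math.exp(-eff * 91.0 / tau_days)
        removed.append(v - v_next)
        v = v_next
    return removed
```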

  2. LArSoft: toolkit for simulation, reconstruction and analysis of liquid argon TPC neutrino detectors

    Science.gov (United States)

    Snider, E. L.; Petrillo, G.

    2017-10-01

LArSoft is a set of detector-independent software tools for the simulation, reconstruction and analysis of data from liquid argon (LAr) neutrino experiments. The common features of LAr time projection chambers (TPCs) enable sharing of algorithm code across detectors of very different size and configuration. LArSoft is currently used in production simulation and reconstruction by the ArgoNeuT, DUNE, LArIAT, MicroBooNE, and SBND experiments. The software suite offers a wide selection of algorithms and utilities, including those for associated photo-detectors and the handling of auxiliary detectors outside the TPCs. Available algorithms cover the full range of simulation and reconstruction, from raw waveforms to high-level reconstructed objects, event topologies and classification. The common code within LArSoft is contributed by adopting experiments, which also provide detector-specific geometry descriptions and code for the treatment of electronic signals. LArSoft is also a collaboration of experiments, Fermilab and associated software projects which cooperate in setting requirements, priorities, and schedules. In this talk, we outline the general architecture of the software and the interaction with external libraries and detector-specific code. We also describe the dynamics of LArSoft software development between the contributing experiments, the projects supporting the software infrastructure LArSoft relies on, and the core LArSoft support project.

  3. Simulation Development and Analysis of Crew Vehicle Ascent Abort

    Science.gov (United States)

    Wong, Chi S.

    2016-01-01

Unlike the coursework I have taken thus far, which focuses on pure logic, simulation code mimics the physical world with some approximation and can suffer from inaccuracies or numerical instabilities. Learning from my mistakes, I adopted new methods to analyze these different simulations. One method was to numerically plot various physical parameters in MATLAB to confirm the mechanical behavior of the system, and to compare the data against the output of a separate simulation tool called FAST. By having full control over what was output from the simulation, I could choose which parameters to change and to plot, as well as how to plot them, allowing for an in-depth analysis of the data. Another method of analysis was to convert the output data into a graphical animation. Unlike the numerical plots, where all of the physical components are displayed separately, the graphical display gives a combined view of the simulation output that makes it much easier to see the physical behavior of the model. A process for converting SOMBAT output for EDGE graphical display had to be developed. With some guidance from other EDGE users, I developed a process and created a script that allows one to easily display simulations graphically. Another limitation of the SOMBAT model was the inability of the capsule to have the main parachutes instantly deployed with a large angle between the airspeed vector and the chute drag vector. To explore this problem, I had to learn about the different coordinate frames used in Guidance, Navigation & Control (J2000, ECEF, ENU, etc.) to describe the motion of a vehicle, and about Euler angles (e.g., roll, pitch, yaw) to describe its orientation. With a thorough explanation from my mentor of each coordinate frame, as well as how to use a direction cosine matrix to transform one frame into another, I investigated the problem by simulating different capsule orientations.
In the end
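The frame transformations mentioned in this record can be illustrated with a 3-2-1 (yaw-pitch-roll) direction cosine matrix; this is a standard textbook construction, not code from SOMBAT or EDGE:

```python
import math

def dcm_321(yaw, pitch, roll):
    """Direction cosine matrix for a 3-2-1 (yaw-pitch-roll) Euler sequence,
    mapping a vector from the reference frame into the body frame."""
    cy, sy = math.cos(yaw), math.sin(yaw)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cr, sr = math.cos(roll), math.sin(roll)
    return [
        [cp * cy,                cp * sy,                -sp],
        [sr * sp * cy - cr * sy, sr * sp * sy + cr * cy, sr * cp],
        [cr * sp * cy + sr * sy, cr * sp * sy - sr * cy, cr * cp],
    ]

def rotate(m, v):
    """Apply a 3x3 matrix to a 3-vector."""
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]
```

Chaining such matrices (e.g. J2000 to ECEF to body) is how one orientation description is transformed into another.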

  4. Simulation Application for the LHCb Experiment

    CERN Document Server

    Belyaev, I; Easo, S; Mato, P; Palacios, J; Pokorski, Witold; Ranjard, F; Van Tilburg, J; Charpentier, Ph.

    2003-01-01

    We describe the LHCb detector simulation application (Gauss) based on the Geant4 toolkit. The application is built using the Gaudi software framework, which is used for all event-processing applications in the LHCb experiment. The existence of an underlying framework allows several common basic services such as persistency, interactivity, as well as detector geometry description or particle data to be shared between simulation, reconstruction and analysis applications. The main benefits of such common services are coherence between different event-processing stages as well as reduced development effort. The interfacing to Geant4 toolkit is realized through a facade (GiGa) which minimizes the coupling to the simulation engine and provides a set of abstract interfaces for configuration and event-by-event communication. The Gauss application is composed of three main blocks, i.e. event generation, detector response simulation and digitization which reflect the different stages performed during the simulation job...

  5. Simulation data analysis by virtual reality system

    International Nuclear Information System (INIS)

    Ohtani, Hiroaki; Mizuguchi, Naoki; Shoji, Mamoru; Ishiguro, Seiji; Ohno, Nobuaki

    2010-01-01

We introduce new software for the analysis of time-varying simulation data and a new approach for connecting simulation with experiment through virtual reality (VR) technology. In the new software, objects of a time-varying field are visualized in VR space, and particle trajectories in the time-varying electromagnetic field are traced. In the new approach, both simulation results and experimental device data are visualized simultaneously in VR space. These developments enhance the study of phenomena in plasma physics and fusion plasmas. (author)

  6. Job Analysis and the Preparation of Job Descriptions. Mendip Papers MP 037.

    Science.gov (United States)

    Saunders, Bob

    This document provides guidelines for conducting job analyses and writing job descriptions. It covers the following topics: the rationale for job descriptions, the terminology of job descriptions, who should write job descriptions, getting the information to write job descriptions, preparing for staff interviews, conducting interviews, writing the…

  7. An in-depth description of bipolar resistive switching in Cu/HfOx/Pt devices, a 3D kinetic Monte Carlo simulation approach

    Science.gov (United States)

    Aldana, S.; Roldán, J. B.; García-Fernández, P.; Suñe, J.; Romero-Zaliz, R.; Jiménez-Molinos, F.; Long, S.; Gómez-Campos, F.; Liu, M.

    2018-04-01

    A simulation tool based on a 3D kinetic Monte Carlo algorithm has been employed to analyse bipolar conductive bridge RAMs fabricated with Cu/HfOx/Pt stacks. Resistive switching mechanisms are described accounting for the electric field and temperature distributions within the dielectric. The formation and destruction of conductive filaments (CFs) are analysed taking into consideration redox reactions and the joint action of metal ion thermal diffusion and electric field induced drift. Filamentary conduction is considered when different percolation paths are formed in addition to other conventional transport mechanisms in dielectrics. The simulator was tuned by using the experimental data for Cu/HfOx/Pt bipolar devices that were fabricated. Our simulation tool allows for the study of different experimental results, in particular, the current variations due to the electric field changes between the filament tip and the electrode in the High Resistance State. In addition, the density of metallic atoms within the CF can also be characterized along with the corresponding CF resistance description.
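The core step of a kinetic Monte Carlo simulation of this kind, selecting the next event (e.g. an ion hop or a redox reaction) with probability proportional to its rate and advancing the clock by an exponentially distributed waiting time, can be sketched as follows; the event names and rates are illustrative only:

```python
import math
import random

def kmc_step(rates, rng):
    """One kinetic Monte Carlo step: choose an event with probability
    proportional to its rate, then advance the clock by an exponentially
    distributed waiting time (residence-time / Gillespie algorithm)."""
    total = sum(rates.values())
    r = rng.random() * total
    acc = 0.0
    for event, rate in rates.items():
        acc += rate
        if r < acc:
            break
    dt = -math.log(1.0 - rng.random()) / total
    return event, dt
```

A full CF-formation simulation would recompute the rate table each step from the local electric field and temperature before calling such a step function.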

  8. Tolerance analysis through computational imaging simulations

    Science.gov (United States)

    Birch, Gabriel C.; LaCasse, Charles F.; Stubbs, Jaclynn J.; Dagel, Amber L.; Bradley, Jon

    2017-11-01

    The modeling and simulation of non-traditional imaging systems require holistic consideration of the end-to-end system. We demonstrate this approach through a tolerance analysis of a random scattering lensless imaging system.

  9. Challenges in coupled thermal-hydraulics and neutronics simulations for LWR safety analysis

    International Nuclear Information System (INIS)

    Ivanov, Kostadin; Avramova, Maria

    2007-01-01

    The simulation of nuclear power plant accident conditions requires three-dimensional (3D) modeling of the reactor core to ensure a realistic description of physical phenomena. The operational flexibility of Light Water Reactor (LWR) plants can be improved by utilizing accurate 3D coupled neutronics/thermal-hydraulics calculations for safety margins evaluations. There are certain requirements to the coupling of thermal-hydraulic system codes and neutron-kinetics codes that ought to be considered. The objective of these requirements is to provide accurate solutions in a reasonable amount of CPU time in coupled simulations of detailed operational transient and accident scenarios. These requirements are met by the development and implementation of six basic components of the coupling methodologies: ways of coupling (internal or external coupling); coupling approach (integration algorithm or parallel processing); spatial mesh overlays; coupled time-step algorithms; coupling numerics (explicit, semi-implicit and implicit schemes); and coupled convergence schemes. These principles of the coupled simulations are discussed in details along with the scientific issues associated with the development of appropriate neutron cross-section libraries for coupled code transient modeling. The current trends in LWR nuclear power generation and regulation as well as the design of next generation LWR reactor concepts along with the continuing computer technology progress stimulate further development of these coupled code systems. These efforts have been focused towards extending the analysis capabilities as well as refining the scale and level of detail of the coupling. This article analyses the coupled phenomena and modeling challenges on both global (assembly-wise) and local (pin-wise) levels. The issues related to the consistent qualification of coupled code systems as well as their application to different types of LWR transients are presented. 
Finally, the advances in numerical

  10. Empirical evidence from an inter-industry descriptive analysis of overall materiality measures

    OpenAIRE

    N. Pecchiari; C. Emby; G. Pogliani

    2013-01-01

    This study presents an empirical cross-industry descriptive analysis of overall quantitative materiality measures. We examine the behaviour of four commonly used quantitative materiality measures within and across industries with respect to their size, relative size and stability, over ten years. The sample consists of large- and medium-sized European companies, representing 24 different industry categories for the years 1998 through 2007 (a total sample of over 36,000 data points). Our resul...

  11. Modern analysis of ion channeling data by Monte Carlo simulations

    Energy Technology Data Exchange (ETDEWEB)

Nowicki, Lech [Andrzej Soltan Institute for Nuclear Studies, ul. Hoza 69, 00-681 Warsaw (Poland)]. E-mail: lech.nowicki@fuw.edu.pl; Turos, Andrzej [Institute of Electronic Materials Technology, Wolczynska 133, 01-919 Warsaw (Poland); Ratajczak, Renata [Andrzej Soltan Institute for Nuclear Studies, ul. Hoza 69, 00-681 Warsaw (Poland); Stonert, Anna [Andrzej Soltan Institute for Nuclear Studies, ul. Hoza 69, 00-681 Warsaw (Poland); Garrido, Frederico [Centre de Spectrometrie Nucleaire et Spectrometrie de Masse, CNRS-IN2P3-Universite Paris-Sud, 91405 Orsay (France)

    2005-10-15

The basic scheme of Monte Carlo simulation of ion channeling spectra is reformulated in terms of statistical sampling. The McChasy simulation code is described and two examples of its application are presented: calculation of the projectile flux in a uranium dioxide crystal, and defect analysis for an ion-implanted InGaAsP/InP superlattice. The virtues and pitfalls of defect analysis using Monte Carlo simulations are discussed.
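Statistical sampling in the sense used here can be illustrated with a toy planar-channel estimate: sample entry positions uniformly across the channel and count close encounters with the bounding atomic rows. The geometry is schematic and unrelated to the McChasy code itself:

```python
import random

def close_encounter_fraction(n_ions, row_radius, half_width, seed=7):
    """Toy statistical-sampling estimate: fraction of ions entering a planar
    channel within row_radius of one of the bounding atomic rows."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_ions):
        x = rng.uniform(-half_width, half_width)   # transverse entry position
        if half_width - abs(x) < row_radius:
            hits += 1
    return hits / n_ions
```

The sample estimate converges to the analytic ratio row_radius / half_width as the number of sampled ions grows.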

  12. Scientific tourism communication in Brazil: Descriptive analysis of national journals from 1990 to 2012

    Directory of Open Access Journals (Sweden)

    Glauber Eduardo de Oliveira Santos

    2013-04-01

Full Text Available This paper provides a descriptive analysis of 2,126 articles published in 20 Brazilian tourism journals from 1990 to 2012. It offers a comprehensive and objective picture of these journals, contributing to the debate about editorial policies as well as to a broader understanding of the Brazilian academic research developed in this period. The study analyses the evolution of the number of published papers and descriptive statistics on the length of articles, titles and abstracts. The authors with the largest number of publications and the most recurrent keywords are identified. The level of integration among journals is analyzed, pointing out which publications are closest to the center of the Brazilian tourism scientific publishing network.

  13. Rethinking Sensitivity Analysis of Nuclear Simulations with Topology

    Energy Technology Data Exchange (ETDEWEB)

    Dan Maljovec; Bei Wang; Paul Rosen; Andrea Alfonsi; Giovanni Pastore; Cristian Rabiti; Valerio Pascucci

    2016-01-01

In nuclear engineering, understanding the safety margins of the nuclear reactor via simulations is arguably of paramount importance in predicting and preventing nuclear accidents. It is therefore crucial to perform sensitivity analysis to understand how changes in the model inputs affect the outputs. Modern nuclear simulation tools rely on numerical representations of the sensitivity information -- inherently lacking in visual encodings -- offering limited effectiveness in communicating and exploring the generated data. In this paper, we design a framework for sensitivity analysis and visualization of multidimensional nuclear simulation data using partition-based, topology-inspired regression models and report on its efficacy. We rely on the established Morse-Smale regression technique, which allows us to partition the domain into monotonic regions where easily interpretable linear models can be used to assess the influence of inputs on the output variability. The underlying computation is augmented with an intuitive and interactive visual design to effectively communicate sensitivity information to the nuclear scientists. Our framework is being deployed into the multi-purpose probabilistic risk assessment and uncertainty quantification framework RAVEN (Reactor Analysis and Virtual Control Environment). We evaluate our framework using a simulation dataset studying nuclear fuel performance.
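The idea behind Morse-Smale-style partition-based regression, splitting the domain at extrema into monotonic regions and fitting an easily interpreted linear model in each, can be sketched in one dimension (a crude stand-in for the actual multidimensional decomposition used in the paper):

```python
def monotonic_segments(ys):
    """Split sampled data at local extrema of y into monotonic pieces
    (a 1-D stand-in for a Morse-Smale decomposition)."""
    segments, start = [], 0
    for i in range(1, len(ys) - 1):
        if (ys[i] - ys[i - 1]) * (ys[i + 1] - ys[i]) < 0:  # slope changes sign
            segments.append((start, i))
            start = i
    segments.append((start, len(ys) - 1))
    return segments

def segment_slopes(xs, ys, segments):
    """Least-squares slope of y on x within each monotonic segment:
    the easily interpretable linear model per partition."""
    slopes = []
    for a, b in segments:
        x, y = xs[a:b + 1], ys[a:b + 1]
        mx, my = sum(x) / len(x), sum(y) / len(y)
        num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
        den = sum((xi - mx) ** 2 for xi in x)
        slopes.append(num / den)
    return slopes
```

The per-segment slopes play the role of local sensitivities: within each monotonic region, one number summarizes how the output responds to the input.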

  14. The coupled chemistry-climate model LMDz-REPROBUS: description and evaluation of a transient simulation of the period 1980–1999

    Directory of Open Access Journals (Sweden)

    L. Jourdain

    2008-06-01

    Full Text Available We present a description and evaluation of the Chemistry-Climate Model (CCM LMDz-REPROBUS, which couples interactively the extended version of the Laboratoire de Météorologie Dynamique General Circulation Model (LMDz GCM and the stratospheric chemistry module of the REactive Processes Ruling the Ozone BUdget in the Stratosphere (REPROBUS model. The transient simulation evaluated here covers the period 1980–1999. The introduction of an interactive stratospheric chemistry module improves the model dynamical climatology, with a substantial reduction of the temperature biases in the lower tropical stratosphere. However, at high latitudes in the Southern Hemisphere, a negative temperature bias, already present in the GCM version albeit with a smaller magnitude, leads to an overestimation of the ozone depletion and its vertical extent in the CCM. This in turn contributes to maintaining low polar temperatures in the vortex, delaying the break-up of the vortex and the recovery of polar ozone. The latitudinal and vertical variation of the mean age of air compares favourably with estimates derived from long-lived species measurements, though the model mean age of air is 1–3 years too young in the middle stratosphere. The model also reproduces the observed "tape recorder" in tropical total hydrogen (=H2O+2×CH4), but its propagation is about 30% too fast and its signal fades away slightly too quickly. The analysis of the global distributions of CH4 and N2O suggests that the subtropical transport barriers are correctly represented in the simulation. LMDz-REPROBUS also reproduces fairly well most of the spatial and seasonal variations of the stratospheric chemical species, in particular ozone. However, because of the Antarctic cold bias, large discrepancies are found for most species at high latitudes in the Southern Hemisphere during the spring and early summer. In the Northern Hemisphere, polar ozone depletion and its variability are underestimated

  15. Simulation and Analysis of Roller Chain Drive Systems

    DEFF Research Database (Denmark)

    Pedersen, Sine Leergaard

    The subject of this thesis is simulation and analysis of large roller chain drive systems, such as those used in marine diesel engines. The aim of developing a chain drive simulation program is to analyse dynamic phenomena of chain drive systems and investigate different design changes...... mathematical models, and compare with prior research. Even though the model is developed at first for the analysis of chain drive systems in marine engines, the methods can, with small changes, be used in general, e.g. for chain drives in industrial machines, car engines and motorbikes. A novel...

  16. YT: A Multi-Code Analysis Toolkit for Astrophysical Simulation Data

    Energy Technology Data Exchange (ETDEWEB)

    Turk, Matthew J.; /San Diego, CASS; Smith, Britton D.; /Michigan State U.; Oishi, Jeffrey S.; /KIPAC, Menlo Park /Stanford U., Phys. Dept.; Skory, Stephen; Skillman, Samuel W.; /Colorado U., CASA; Abel, Tom; /KIPAC, Menlo Park /Stanford U., Phys. Dept.; Norman, Michael L.; /aff San Diego, CASS

    2011-06-23

    The analysis of complex multiphysics astrophysical simulations presents a unique and rapidly growing set of challenges: reproducibility, parallelization, and vast increases in data size and complexity chief among them. In order to meet these challenges, and in order to open up new avenues for collaboration between users of multiple simulation platforms, we present yt (available at http://yt.enzotools.org/) an open source, community-developed astrophysical analysis and visualization toolkit. Analysis and visualization with yt are oriented around physically relevant quantities rather than quantities native to astrophysical simulation codes. While originally designed for handling Enzo's structure adaptive mesh refinement data, yt has been extended to work with several different simulation methods and simulation codes including Orion, RAMSES, and FLASH. We report on its methods for reading, handling, and visualizing data, including projections, multivariate volume rendering, multi-dimensional histograms, halo finding, light cone generation, and topologically connected isocontour identification. Furthermore, we discuss the underlying algorithms yt uses for processing and visualizing data, and its mechanisms for parallelization of analysis tasks.

  17. yt: A MULTI-CODE ANALYSIS TOOLKIT FOR ASTROPHYSICAL SIMULATION DATA

    International Nuclear Information System (INIS)

    Turk, Matthew J.; Norman, Michael L.; Smith, Britton D.; Oishi, Jeffrey S.; Abel, Tom; Skory, Stephen; Skillman, Samuel W.

    2011-01-01

    The analysis of complex multiphysics astrophysical simulations presents a unique and rapidly growing set of challenges: reproducibility, parallelization, and vast increases in data size and complexity chief among them. In order to meet these challenges, and in order to open up new avenues for collaboration between users of multiple simulation platforms, we present yt (available at http://yt.enzotools.org/) an open source, community-developed astrophysical analysis and visualization toolkit. Analysis and visualization with yt are oriented around physically relevant quantities rather than quantities native to astrophysical simulation codes. While originally designed for handling Enzo's structure adaptive mesh refinement data, yt has been extended to work with several different simulation methods and simulation codes including Orion, RAMSES, and FLASH. We report on its methods for reading, handling, and visualizing data, including projections, multivariate volume rendering, multi-dimensional histograms, halo finding, light cone generation, and topologically connected isocontour identification. Furthermore, we discuss the underlying algorithms yt uses for processing and visualizing data, and its mechanisms for parallelization of analysis tasks.

  18. An integral time series on simulated labeling using fractal structure

    International Nuclear Information System (INIS)

    Djainal, D.D.

    1997-01-01

    This research deals with the analysis of time series from vertical two-phase flow, in an attempt to develop an objective indicator of time-series flow patterns. One new method is fractal analysis, which can complement conventional methods in the description of highly irregular fluctuations. In the present work, fractal analysis is applied to simulated boiling-coolant signals. These simulated signals are built by summing random elements in small subchannels of the coolant channel. Two modes are defined, each characterized by its void fraction. In the case of unimodal-PDF signals, the difference between these modes is relatively small; bimodal-PDF signals, on the other hand, cover a relatively large range. In this research, the fractal dimension can indicate the character of these simulated signals.
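As a concrete illustration of the fractal approach, Higuchi's method is a standard way to estimate the fractal dimension of a time series; it is one plausible choice for the kind of analysis described, not necessarily the estimator used in the study. A smooth ramp has dimension 1, while an irregular (noisy) signal scores closer to 2.

```python
import numpy as np

def higuchi_fd(x, kmax=8):
    """Higuchi fractal dimension of a 1-D signal: the slope of
    log L(k) versus log(1/k), where L(k) is the mean curve length
    at coarse-graining scale k."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    lengths = []
    for k in range(1, kmax + 1):
        lk = []
        for m in range(k):
            idx = np.arange(m, n, k)
            if len(idx) < 2:
                continue
            dist = np.sum(np.abs(np.diff(x[idx])))
            # Higuchi's normalisation of the sub-series curve length
            lk.append(dist * (n - 1) / ((len(idx) - 1) * k) / k)
        lengths.append(np.mean(lk))
    k_vals = np.arange(1, kmax + 1)
    slope, _ = np.polyfit(np.log(1.0 / k_vals), np.log(lengths), 1)
    return slope

rng = np.random.default_rng(0)
fd_line = higuchi_fd(np.arange(1000.0))            # smooth ramp: dimension ~1
fd_noise = higuchi_fd(rng.standard_normal(1000))   # irregular signal: higher
```

Applied to the simulated coolant signals, a higher dimension would flag the more irregular (bimodal-PDF) fluctuation pattern.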

  19. aRTist - analytical RT inspection simulation tool

    International Nuclear Information System (INIS)

    Bellon, C.; Jaenisch, G.R.

    2007-01-01

    The computer simulation of radiography is applicable for different purposes in NDT such as for the qualification of NDT systems, the prediction of its reliability, the optimization of system parameters, feasibility analysis, model-based data interpretation, education and training of NDT/NDE personnel, and others. Within the framework of the integrated project FilmFree the radiographic testing (RT) simulation software developed by BAM is being further developed to meet practical requirements for inspection planning in digital industrial radiology. It combines analytical modelling of the RT inspection process with the CAD-orientated object description applicable to various industrial sectors such as power generation, railways and others. (authors)

  20. DESCRIPTIVE ANALYSIS OF CORPORATE CULTURE FOLLOWING THE CHANGES

    Directory of Open Access Journals (Sweden)

    Elenko Zahariev

    2016-09-01

    Full Text Available Corporate culture meaningfully complements economic knowledge and accompanies strategy and tactics in management. It is felt in the manners and overall activity of the organization: in empathy and tolerance, respect and responsibility. The new corporate culture transforms each participant, changing his or her mindset, collaborations and working habits. It also requires an improved management style: it is no longer enough for the leader to rule, administer and control; the leader must lead and inspire, set challenging targets, optimize the performance of the teams, fuel an optimistic mood and faith, build agreement among workers, and monitor and evaluate the work fairly. The current study raises the problem of interpreting cultural profiles in modern organizations and analyzes corporate culture following the changes of the transition period in Bulgaria. The descriptive analysis of corporate culture allows relatively precise identification of its various types based on the accepted classification criteria.

  1. Generator dynamics in aeroelastic analysis and simulations

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, T.J.; Hansen, M.H.; Iov, F.

    2003-05-01

    This report contains a description of a dynamic model for a doubly-fed induction generator implemented in the aeroelastic code HAWC. The model has physical input parameters (resistance, reactance etc.) and input variables (stator and rotor voltage and rotor speed). The model can be used to simulate the generator torque as well as the rotor and stator currents, active and reactive power. A perturbation method has been used to reduce the original generator model equations to a set of equations which can be solved with the same time steps as a typical aeroelastic code. The method is used to separate the fast transients of the model from the slow variations and deduce a reduced order expression for the slow part. Dynamic effects of the first order terms in the model as well as the influence on drive train eigenfrequencies and damping have been investigated. Load responses during time simulations of the wind turbine have been compared to simulations with the linear static generator model originally implemented in HAWC. A 2 MW turbine has been modelled in the aeroelastic code HAWC. When using the new dynamic generator model there is an interesting coupling between the generator dynamics and a global turbine vibration mode at 4.5 Hz, which only occurs when a dynamic formulation of the generator equations is applied. This frequency can especially be seen in the electrical power of the generator and the rotational speed of the generator, but also as torque variations in the drive train. (au)

  2. Track Simulation and Reconstruction in the ATLAS experiment

    CERN Document Server

    Salzburger, Andreas; Elsing, Markus

    The reconstruction and simulation of particle trajectories is an inevitable part of the analysis strategies for data taken with the ATLAS detector. Many aspects and necessary parts of a high-quality track reconstruction will be presented and discussed in this work. At first, the technical realisation of the data model and the reconstruction geometry will be given; the reconstruction geometry is characterised by a newly developed navigation model and an automated procedure for the synchronisation of the detailed simulation geometry description with the simplified reconstruction geometry model, which allows a precise description of the tracker material in track reconstruction. Both components help the coherent and fast integration of material effects in a newly established track extrapolation package, that is discussed in the following. The extrapolation engine enables a highly precise transport of the track parameterisation and the associated covariances through the complex magnetic field and the detec...

  3. Full-scope training simulators

    International Nuclear Information System (INIS)

    Ugedo, E.

    1986-01-01

    The following topics to be covered in this report are: Reasons justifying the use of full-scope simulators for operator qualification. Full-scope simulator description: the control room, the physical models, the computer complex, the instructor's console. Main features of full-scope simulators. Merits of simulator training. The role of full-scope simulators in the training programs. The process of ordering and acquiring a full-scope simulator. Maintaining and updating simulator capabilities. (orig./GL)

  4. Simulation Application for the LHCb Experiment

    CERN Document Server

    Pokorski, Witold

    2003-01-01

    We describe the LHCb detector simulation application (Gauss) based on the Geant4 toolkit. The application is built using the Gaudi software framework, which is used for all event-processing applications in the LHCb experiment. The existence of an underlying framework allows several common basic services such as persistency, interactivity, as well as detector geometry description or particle data to be shared between simulation, reconstruction and analysis applications. The main benefits of such common services are coherence between different event-processing stages as well as reduced development effort. The interfacing to Geant4 toolkit is realized through a façade (GiGa) which minimizes the coupling to the simulation engine and provides a set of abstract interfaces for configuration and event-by-event communication. The Gauss application is composed of three main blocks, i.e. event generation, detector response simulation and digitization which reflect the different stages performed during the simulation jo...

  5. Assessment of competence in simulated flexible bronchoscopy using motion analysis

    DEFF Research Database (Denmark)

    Collela, Sara; Svendsen, Morten Bo Søndergaard; Konge, Lars

    2015-01-01

    Background: Flexible bronchoscopy should be performed with a correct posture and a straight scope to optimize bronchoscopy performance and at the same time minimize the risk of work-related injuries and endoscope damage. Objectives: We aimed to test whether an automatic motion analysis system could...... intermediates and 9 experienced bronchoscopy operators performed 3 procedures each on a bronchoscopy simulator. The Microsoft Kinect system was used to automatically measure the total deviation of the scope from a perfectly straight, vertical line. Results: The low-cost motion analysis system could measure...... with the performance on the simulator (virtual-reality simulator score; p ... The motion analysis system could discriminate between different levels of experience. Automatic feedback on correct movements during self-directed training on simulators might help new bronchoscopists learn how to handle...
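The deviation metric described, the total deviation of the scope from a straight vertical line, can be sketched as follows. The data layout (an (n, 3) array of tracked positions) is our assumption for illustration, not the Kinect system's actual output format.

```python
import numpy as np

def total_deviation(points, line_xy=(0.0, 0.0)):
    """Sum of horizontal distances of tracked (x, y, z) positions from a
    vertical reference line passing through line_xy."""
    pts = np.asarray(points, dtype=float)
    horiz = pts[:, :2] - np.asarray(line_xy, dtype=float)
    return float(np.sum(np.linalg.norm(horiz, axis=1)))

# ten samples sitting exactly on the line, then the same samples shifted
on_line = np.column_stack([np.zeros(10), np.zeros(10), np.linspace(0, 1, 10)])
dev_zero = total_deviation(on_line)                                # no deviation
dev_shift = total_deviation(on_line + np.array([3.0, 4.0, 0.0]))   # 10 points, 5.0 each
```

A lower accumulated deviation corresponds to a straighter scope, which is the quantity the study correlates with operator experience.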

  6. Quantitative descriptive analysis of Italian polenta produced with different corn cultivars.

    Science.gov (United States)

    Zeppa, Giuseppe; Bertolino, Marta; Rolle, Luca

    2012-01-30

    Polenta is a porridge-like dish, generally made by mixing cornmeal with salt water and stirring constantly while cooking over a low heat. It can be eaten plain, straight from the pan, or topped with various foods (cheeses, meat, sausages, fish, etc.). It is most popular in northern Italy but can also be found in Switzerland, Austria, Croatia, Argentina and other countries in Eastern Europe and South America. Despite this diffusion, there are no data concerning the sensory characteristics of this product. A research study was therefore carried out to define the lexicon for a sensory profile of polenta and relationships with corn cultivars. A lexicon with 13 sensory parameters was defined and validated before references were determined. After panel training, the sensory profiles of 12 autochthonous maize cultivars were defined. The results of this research highlighted that quantitative descriptive analysis can also be used for the sensory description of polenta, and that the defined lexicon can be used to describe the sensory qualities of polenta for both basic research, such as maize selection, and product development. Copyright © 2011 Society of Chemical Industry.

  7. Kinematics Simulation Analysis of Packaging Robot with Joint Clearance

    Science.gov (United States)

    Zhang, Y. W.; Meng, W. J.; Wang, L. Q.; Cui, G. H.

    2018-03-01

    Considering the influence of joint clearance on motion error, repeated positioning accuracy and the overall position of the machine, this paper presents a simulation analysis of a packaging robot, a 2-degree-of-freedom (DOF) planar parallel robot, motivated by the high precision and speed required of packaging equipment. The motion constraint equation of the mechanism is established, and the motion error is analysed and simulated for the case of clearance in the revolute joint. The simulation results show that the size of the joint clearance affects the movement accuracy and packaging efficiency of the packaging robot. The analysis provides a reference for packaging equipment design and selection criteria and is of great significance for automation in the packaging industry.
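A minimal Monte Carlo sketch of the clearance effect: model the pin centre as wandering uniformly inside the circular clearance region and propagate the offset to the tip position. This deliberately simplifies the paper's constraint-equation model; the parameter values are illustrative only.

```python
import numpy as np

def tip_error_stats(clearance_radius, n=5000, seed=42):
    """Monte Carlo spread of the tip-position error induced by a revolute
    joint whose pin centre lies anywhere in a disc of the given clearance
    radius (the offset carries rigidly to the tip in this simplification)."""
    rng = np.random.default_rng(seed)
    r = clearance_radius * np.sqrt(rng.random(n))   # uniform over the disc
    phi = 2.0 * np.pi * rng.random(n)
    err = np.hypot(r * np.cos(phi), r * np.sin(phi))
    return err.mean(), err.max()

mean_small, max_small = tip_error_stats(0.001)   # 1 mm clearance
mean_large, max_large = tip_error_stats(0.002)   # 2 mm clearance
```

Even this crude model reproduces the paper's qualitative finding: the positional error grows with the size of the joint clearance.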

  8. Virtual reality based surgery simulation for endoscopic gynaecology.

    Science.gov (United States)

    Székely, G; Bajka, M; Brechbühler, C; Dual, J; Enzler, R; Haller, U; Hug, J; Hutter, R; Ironmonger, N; Kauer, M; Meier, V; Niederer, P; Rhomberg, A; Schmid, P; Schweitzer, G; Thaler, M; Vuskovic, V; Tröster, G

    1999-01-01

    Virtual reality (VR) based surgical simulator systems offer very elegant possibilities to both enrich and enhance traditional education in endoscopic surgery. However, while a wide range of VR simulator systems have been proposed and realized in the past few years, most of these systems are far from able to provide a reasonably realistic surgical environment. We explore the basic approaches to the current limits of realism and ultimately seek to extend these based on our description and analysis of the most important components of a VR-based endoscopic simulator. The feasibility of the proposed techniques is demonstrated on a first modular prototype system implementing the basic algorithms for VR-training in gynaecologic laparoscopy.

  9. Summer Computer Simulation Conference, Washington, DC, July 15-17, 1981, Proceedings

    International Nuclear Information System (INIS)

    Anon.

    1981-01-01

    Aspects of simulation technology are discussed, taking into account microcomputers in simulation, heuristic/adaptive systems, differential equations approaches, available simulation packages, selected operations research applications, and mathematical and statistical tools. Hybrid systems are discussed along with topics of chemical sciences. Subjects related to physical and engineering sciences are explored, giving attention to aeronautics and astronautics, physical processes, nuclear/electrical power technology, advanced computational methods and systems, avionics systems, dynamic systems analysis and control, and industrial systems. Environmental sciences are considered along with biomedical systems, managerial and social sciences, questions of simulation credibility and validation, and energy systems. A description is provided of simulation facilities, and topics related to system engineering and transportation are investigated

  10. Analysis of Medication Errors in Simulated Pediatric Resuscitation by Residents

    Directory of Open Access Journals (Sweden)

    Evelyn Porter

    2014-07-01

    Full Text Available Introduction: The objective of our study was to estimate the incidence of prescribing medication errors specifically made by a trainee and identify factors associated with these errors during the simulated resuscitation of a critically ill child. Methods: The results of the simulated resuscitation are described. We analyzed data from the simulated resuscitation for the occurrence of a prescribing medication error. We performed univariate analysis of each variable against the medication error rate and a separate multiple logistic regression analysis on the significant univariate variables to assess the associations between the selected variables. Results: We reviewed 49 simulated resuscitations. The final medication error rate for the simulation was 26.5% (95% CI 13.7% - 39.3%. On univariate analysis, statistically significant findings for decreased prescribing medication error rates included senior residents in charge, presence of a pharmacist, sleeping greater than 8 hours prior to the simulation, and a visual analog scale score showing more confidence in caring for critically ill children. Multiple logistic regression analysis using the above significant variables showed only the presence of a pharmacist to remain significantly associated with decreased medication error, odds ratio of 0.09 (95% CI 0.01 - 0.64. Conclusion: Our results indicate that the presence of a clinical pharmacist during the resuscitation of a critically ill child reduces the medication errors made by resident physician trainees.
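The association reported (OR 0.09 for pharmacist presence) comes from logistic regression; for a single binary factor, the unadjusted odds ratio and its Woolf 95% confidence interval can be computed directly from a 2x2 table, as sketched below. The counts used here are hypothetical, chosen only to illustrate the calculation; they are not the study's data.

```python
import math

def odds_ratio_ci(a, b, c, d):
    """Odds ratio and Woolf 95% CI for a 2x2 table:
    a = errors with pharmacist,    b = no errors with pharmacist,
    c = errors without pharmacist, d = no errors without pharmacist."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se_log)
    hi = math.exp(math.log(or_) + 1.96 * se_log)
    return or_, lo, hi

# hypothetical counts: 1/15 error rate with a pharmacist, 12/34 without
or_, lo, hi = odds_ratio_ci(1, 14, 12, 22)   # OR well below 1
```

An OR below 1 with a CI excluding 1 would indicate, as in the study, that pharmacist presence is associated with fewer prescribing errors.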

  11. Quantitative Image Simulation and Analysis of Nanoparticles

    DEFF Research Database (Denmark)

    Madsen, Jacob; Hansen, Thomas Willum

    High Resolution Transmission Electron Microscopy (HRTEM) has become a routine analysis tool for structural characterization at atomic resolution, and with the recent development of in-situ TEMs, it is now possible to study catalytic nanoparticles under reaction conditions. However, the connection between an experimental image and the underlying...... physical phenomena or structure is not always straightforward. The aim of this thesis is to use image simulation to better understand observations from HRTEM images. Surface strain is known to be important for the performance of nanoparticles. Using simulation, we estimate the precision and accuracy...... of strain measurements from TEM images, and investigate the stability of these measurements to microscope parameters. This is followed by our efforts toward simulating metal nanoparticles on a metal-oxide support using the Charge Optimized Many Body (COMB) interatomic potential. The simulated interface...

  12. Simulation-based training for nurses: Systematic review and meta-analysis.

    Science.gov (United States)

    Hegland, Pål A; Aarlie, Hege; Strømme, Hilde; Jamtvedt, Gro

    2017-07-01

    Simulation-based training is a widespread strategy to improve health-care quality. However, its effect on registered nurses has previously not been established in systematic reviews. The aim of this systematic review is to evaluate the effect of simulation-based training on nurses' skills and knowledge. We searched CDSR, DARE, HTA, CENTRAL, CINAHL, MEDLINE, Embase, ERIC, and SveMed+ for randomised controlled trials (RCT) evaluating the effect of simulation-based training among nurses. Searches were completed in December 2016. Two reviewers independently screened abstracts and full-text, extracted data, and assessed risk of bias. We compared simulation-based training to other learning strategies, high-fidelity simulation to other simulation strategies, and different organisation of simulation training. Data were analysed through meta-analysis and narrative syntheses. GRADE was used to assess the quality of evidence. Fifteen RCTs met the inclusion criteria. For the comparison of simulation-based training to other learning strategies on nurses' skills, six studies in the meta-analysis showed a significant, but small effect in favour of simulation (SMD -1.09, CI -1.72 to -0.47). There was large heterogeneity (I² = 85%). For the other comparisons, there was large between-study variation in results. The quality of evidence for all comparisons was graded as low. The effect of simulation-based training varies substantially between studies. Our meta-analysis showed a significant effect of simulation training compared to other learning strategies, but the quality of evidence was low, indicating uncertainty. Other comparisons showed inconsistency in results. Based on our findings, simulation training appears to be an effective strategy to improve nurses' skills, but further good-quality RCTs with adequate sample sizes are needed. Copyright © 2017 Elsevier Ltd. All rights reserved.
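The pooled estimate and heterogeneity statistic in a review like this are typically obtained with an inverse-variance random-effects model; below is a sketch of the DerSimonian-Laird procedure together with I². The input effect sizes are synthetic, not the review's extracted data.

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Random-effects pooling of effect sizes (e.g. SMDs) using the
    DerSimonian-Laird between-study variance; also returns I^2."""
    y = np.asarray(effects, dtype=float)
    v = np.asarray(variances, dtype=float)
    w = 1.0 / v
    fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - fixed) ** 2)               # Cochran's Q
    df = len(y) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                  # between-study variance
    w_star = 1.0 / (v + tau2)
    pooled = np.sum(w_star * y) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    i2 = 100.0 * max(0.0, (q - df) / q) if q > 0 else 0.0
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se), i2

# synthetic study results: heterogeneous effects favouring simulation
pooled, ci, i2 = dersimonian_laird([-2.0, -0.2, -1.0], [0.05, 0.05, 0.05])
```

With strongly divergent study effects, I² comes out high, mirroring the large heterogeneity (I² = 85%) the review reports.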

  13. Hanford Site Composite Analysis Technical Approach Description: Groundwater

    Energy Technology Data Exchange (ETDEWEB)

    Budge, T. J. [CH2M HILL Plateau Remediation Company, Richland, WA (United States)

    2017-10-02

    The groundwater facet of the revised CA is responsible for generating predicted contaminant concentration values over the entire analysis spatial and temporal domain. These estimates will be used as part of the groundwater pathway dose calculation facet to estimate dose for exposure scenarios. Based on the analysis of existing models and available information, the P2R Model was selected as the numerical simulator to provide these estimates over the 10,000-year temporal domain of the CA. The P2R Model will use inputs from initial plume distributions, updated for a start date of 1/1/2017, and inputs from the vadose zone facet, created by a tool under development as part of the ICF, to produce estimates of hydraulic head, transmissivity, and contaminant concentration over time. A recommendation of acquiring 12 computer processors and 2 TB of hard drive space is made to ensure that the work can be completed within the anticipated schedule of the revised CA.

  14. CUBESIM, Hypercube and Denelcor Hep Parallel Computer Simulation

    International Nuclear Information System (INIS)

    Dunigan, T.H.

    1988-01-01

    1 - Description of program or function: CUBESIM is a set of subroutine libraries and programs for the simulation of message-passing parallel computers and shared-memory parallel computers. Subroutines are supplied to simulate the Intel hypercube and the Denelcor HEP parallel computers. The system permits a user to develop and test parallel programs written in C or FORTRAN on a single processor. The user may alter such hypercube parameters as message startup times, packet size, and the computation-to-communication ratio. The simulation generates a trace file that can be used for debugging, performance analysis, or graphical display. 2 - Method of solution: The CUBESIM simulator is linked with the user's parallel application routines to run as a single UNIX process. The simulator library provides a small operating system to perform process and message management. 3 - Restrictions on the complexity of the problem: Up to 128 processors can be simulated with a virtual memory limit of 6 million bytes. Up to 1000 processes can be simulated
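The core mechanism, simulating message passing on a single processor with tunable startup time and per-byte cost, can be sketched as below. The class and parameter names are illustrative; they are not CUBESIM's actual interface.

```python
import collections

class ToyCubeSim:
    """Single-process simulator of a message-passing machine: each node
    carries its own simulated clock; a send costs startup + per_byte * size."""
    def __init__(self, nodes, startup=100.0, per_byte=0.5):
        self.queues = collections.defaultdict(collections.deque)
        self.clock = [0.0] * nodes
        self.startup, self.per_byte = startup, per_byte
        self.trace = []                            # for debugging / analysis

    def send(self, src, dst, payload):
        arrive = self.clock[src] + self.startup + self.per_byte * len(payload)
        self.clock[src] += self.startup            # sender only pays the startup
        self.queues[dst].append((arrive, payload))
        self.trace.append(("send", src, dst, arrive))

    def recv(self, node):
        arrive, payload = self.queues[node].popleft()
        self.clock[node] = max(self.clock[node], arrive)   # wait if needed
        self.trace.append(("recv", node, self.clock[node]))
        return payload

sim = ToyCubeSim(nodes=2)
sim.send(0, 1, b"hello")
msg = sim.recv(1)
```

As in CUBESIM, the accumulated trace can then be inspected for debugging or performance analysis, and the startup/per-byte parameters varied to study the computation-to-communication ratio.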

  15. Refined reservoir description to maximize oil recovery

    International Nuclear Information System (INIS)

    Flewitt, W.E.

    1975-01-01

    To assure maximized oil recovery from older pools, reservoir description has been advanced by fully integrating original open-hole logs and the recently introduced interpretive techniques made available through cased-hole wireline saturation logs. A refined reservoir description utilizing normalized original wireline porosity logs has been completed in the Judy Creek Beaverhill Lake ''A'' Pool, a reefal carbonate pool with current potential productivity of 100,000 BOPD and 188 active wells. Continuous porosity was documented within a reef rim and cap while discontinuous porous lenses characterized an interior lagoon. With the use of pulsed neutron logs and production data a separate water front and pressure response was recognized within discrete environmental units. The refined reservoir description aided in reservoir simulation model studies and quantifying pool performance. A pattern water flood has now replaced the original peripheral bottom water drive to maximize oil recovery

  16. Process Simulation Analysis of HF Stripping

    Directory of Open Access Journals (Sweden)

    Thaer A. Abdulla

    2015-02-01

    Full Text Available The HYSYS process simulator is used for the analysis of an existing HF stripping column in a LAB plant (Arab Detergent Company, Baiji, Iraq). Simulated column performance and profile curves are constructed. The variables considered are the thermodynamic model option, bottom temperature, feed temperature, and the column profiles for temperature, vapor flow rate, liquid flow rate and composition. The five thermodynamic model options used (Margules, UNIQUAC, Van Laar, Antoine, and Zudkevitch-Joffee) affect the results within 0.1-58% variation in most cases. The simulated results show that about 4% of the paraffins (C10 and C11) are present in the top stream, which may cause a problem in the LAB production plant. The largest variations were noticed in the total top vapor flow rate with bottom temperature and with feed composition. The column profiles remain fairly constant from tray 5 to tray 18. The results correspond well with real plant operating data, giving evidence of a successful simulation with HYSYS.
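Of the thermodynamic options listed, the Antoine correlation is the simplest to reproduce: log10(P) = A - B / (C + T). The sketch below uses commonly tabulated constants for water as an illustrative assumption; plant work would use component- and range-specific constants.

```python
def antoine_pressure_mmhg(temp_c, A, B, C):
    """Antoine vapour-pressure correlation: log10(P) = A - B / (C + T),
    with T in degC and P in mmHg for these constant conventions."""
    return 10.0 ** (A - B / (C + temp_c))

# Widely tabulated constants for water, roughly valid for 1-100 degC
# (illustrative values, not plant calibration data)
A, B, C = 8.07131, 1730.63, 233.426

p_100 = antoine_pressure_mmhg(100.0, A, B, C)   # near 760 mmHg at boiling
p_25 = antoine_pressure_mmhg(25.0, A, B, C)     # much lower at ambient
```

Differences between such correlations and full activity-coefficient models (Margules, UNIQUAC, Van Laar) are one source of the 0.1-58% spread the study reports across thermodynamic options.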

  17. Augmenting health care failure modes and effects analysis with simulation

    DEFF Research Database (Denmark)

    Staub-Nielsen, Ditte Emilie; Dieckmann, Peter; Mohr, Marlene

    2014-01-01

    This study explores whether simulation plays a role in health care failure mode and effects analysis (HFMEA); it does this by evaluating whether additional data are found when a traditional HFMEA is augmented with simulation. Two multidisciplinary teams identified vulnerabilities in a process...... by brainstorming, followed by simulation. Two means of adding simulation were investigated as follows: just simulating the process and interrupting the simulation between substeps of the process. By adding simulation to a traditional HFMEA, both multidisciplinary teams identified additional data that were relevant...

  18. Description of the TREBIL, CRESSEX and STREUSL computer programs, that belongs to RALLY computer code pack for the analysis of reliability systems

    International Nuclear Information System (INIS)

    Fernandes Filho, T.L.

    1982-11-01

    The RALLY computer code pack (RALLY pack) is a set of computer codes for the reliability analysis of complex systems, aimed at risk analysis. Three of the six codes are discussed, presenting their purpose, input description, calculation methods and the results obtained with each one. The computer codes are: TREBIL, to obtain the logical equivalent of the fault tree; CRESSEX, to obtain the minimal cut sets and the point values of the unreliability and unavailability of the system; and STREUSL, to calculate the dispersion of those values around the mean. Although CRESSEX, in the version available at CNEN, uses a rather slow method to obtain the minimal cut sets on an HB-CNEN system, the three computer programs show good results, especially STREUSL, which permits the simulation of various components. (E.G.) [pt
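The kind of computation CRESSEX performs, extracting minimal cut sets from a fault tree, can be illustrated on a toy tree; the gate layout here is invented for the example and is not from the RALLY documentation.

```python
from itertools import product

# Toy fault tree: TOP fails if both subsystems fail; B is a shared component
TREE = {"TOP": ("AND", ["G1", "G2"]),
        "G1":  ("OR",  ["A", "B"]),
        "G2":  ("OR",  ["B", "C"])}

def cut_sets(node):
    """Expand a gate into cut sets: sets of basic events failing the top."""
    if node not in TREE:                           # a basic event
        return [frozenset([node])]
    op, kids = TREE[node]
    child = [cut_sets(k) for k in kids]
    if op == "OR":                                 # union of the children's sets
        return [s for sets in child for s in sets]
    # AND: every combination of one cut set per child
    return [frozenset().union(*combo) for combo in product(*child)]

def minimal(sets):
    """Discard any cut set that contains another as a proper subset."""
    return {s for s in sets if not any(t < s for t in sets)}

mcs = minimal(set(cut_sets("TOP")))   # the shared component B dominates
```

The single-event cut set {B} shows why shared components dominate system unreliability, which is exactly the information the point-value calculations build on.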

  19. Description of Measurements on Biogas Stations

    Directory of Open Access Journals (Sweden)

    Ladislav Novosád

    2016-08-01

    Full Text Available This paper focuses mainly on performance analysis of three biogas stations situated within the territory of the Czech Republic. It contains basic details of the individual biogas stations as well as a description of their types. It also gives a general description of the measurement gauge involved, with specifications of its potential use. The final part of the paper deals with the analysis of the time courses of the data obtained, with special regard to voltage, current, active power and reactive power.
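The electrical quantities analysed (active and reactive power) follow from the sampled voltage and current waveforms; below is a sketch under a sinusoidal steady-state assumption, with illustrative 50 Hz values rather than the stations' measured data.

```python
import numpy as np

def power_metrics(v, i):
    """Active (P), apparent (S) and reactive (Q) power from synchronously
    sampled voltage and current over a whole number of periods."""
    p = float(np.mean(v * i))                                # active power
    s = float(np.sqrt(np.mean(v ** 2)) * np.sqrt(np.mean(i ** 2)))  # apparent
    q = float(np.sqrt(max(s ** 2 - p ** 2, 0.0)))            # sinusoidal assumption
    return p, s, q

t = np.linspace(0.0, 0.04, 4000, endpoint=False)    # two 50 Hz periods
v = 325.0 * np.sin(2 * np.pi * 50 * t)              # ~230 V rms phase voltage
i = 14.1 * np.sin(2 * np.pi * 50 * t - np.pi / 3)   # current lagging 60 deg
P, S, Q = power_metrics(v, i)
```

Sampling over a whole number of periods keeps the rms and mean estimates exact; with distorted (non-sinusoidal) station waveforms, Q would instead need a harmonic decomposition.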

  20. Methods for simulation-based analysis of fluid-structure interaction.

    Energy Technology Data Exchange (ETDEWEB)

    Barone, Matthew Franklin; Payne, Jeffrey L.

    2005-10-01

    Methods for analysis of fluid-structure interaction using high fidelity simulations are critically reviewed. First, a literature review of modern numerical techniques for simulation of aeroelastic phenomena is presented. The review focuses on methods contained within the arbitrary Lagrangian-Eulerian (ALE) framework for coupling computational fluid dynamics codes to computational structural mechanics codes. The review treats mesh movement algorithms, the role of the geometric conservation law, time advancement schemes, wetted surface interface strategies, and some representative applications. The complexity and computational expense of coupled Navier-Stokes/structural dynamics simulations point to the need for reduced order modeling to facilitate parametric analysis. The proper orthogonal decomposition (POD)/Galerkin projection approach for building a reduced order model (ROM) is presented, along with ideas for extension of the methodology to allow construction of ROMs based on data generated from ALE simulations.

  1. Halo statistics analysis within medium volume cosmological N-body simulation

    Directory of Open Access Journals (Sweden)

    Martinović N.

    2015-01-01

    Full Text Available In this paper we present a halo statistics analysis of a ΛCDM N-body cosmological simulation (from first halo formation until z = 0). We study the mean major merger rate as a function of time, considering both per-redshift and per-Gyr dependence. For the latter we find that it scales as the well-known power law (1 + z)^n, for which we obtain n = 2.4. The halo mass function and halo growth function are derived and compared with both analytical and empirical fits. We analyse halo growth throughout the entire simulation, making it possible to continuously monitor the evolution of halo number density within given mass ranges. The halo formation redshift is studied, exploring the possibility of a new simple preliminary analysis during the simulation run. A visualization of the simulation is presented as well. At redshifts z = 0-7 the halos from the simulation have good statistics for further analysis, especially in the mass range of 10^11 - 10^14 M⊙/h. [176021 'Visible and invisible matter in nearby galaxies: theory and observations']
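
    The quoted scaling can be checked with a few lines of Python: generate synthetic merger rates that follow (1 + z)^n with n = 2.4 plus scatter, then recover the exponent by a least-squares fit in log space. All numbers here are illustrative, not the paper's data:

```python
import numpy as np

# Synthetic merger-rate data following rate ∝ (1+z)^n with n = 2.4,
# plus lognormal scatter standing in for measurement noise.
rng = np.random.default_rng(42)
z = np.linspace(0.0, 7.0, 15)
rate = 0.04 * (1.0 + z) ** 2.4 * rng.lognormal(0.0, 0.05, z.size)

# The power law is linear in log space: log(rate) = log(A) + n * log(1+z),
# so the slope of an ordinary least-squares fit estimates n.
n_fit, logA = np.polyfit(np.log(1.0 + z), np.log(rate), 1)
print(f"fitted n = {n_fit:.2f}")
```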

  2. ANCA: Anharmonic Conformational Analysis of Biomolecular Simulations.

    Science.gov (United States)

    Parvatikar, Akash; Vacaliuc, Gabriel S; Ramanathan, Arvind; Chennubhotla, S Chakra

    2018-05-08

    Anharmonicity in time-dependent conformational fluctuations is noted to be a key feature of functional dynamics of biomolecules. Although anharmonic events are rare, long-timescale (μs-ms and beyond) simulations facilitate probing of such events. We have previously developed quasi-anharmonic analysis to resolve higher-order spatial correlations and characterize anharmonicity in biomolecular simulations. In this article, we have extended this toolbox to resolve higher-order temporal correlations and built a scalable Python package called anharmonic conformational analysis (ANCA). ANCA has modules to: 1) measure anharmonicity in the form of higher-order statistics and its variation as a function of time, 2) output a storyboard representation of the simulations to identify key anharmonic conformational events, and 3) identify putative anharmonic conformational substates and visualization of transitions between these substates. Copyright © 2018 Biophysical Society. Published by Elsevier Inc. All rights reserved.
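
    A minimal sketch of the idea behind module 1 (higher-order statistics as an anharmonicity signal), using invented data rather than ANCA itself: windows of a Gaussian (harmonic) time series have near-zero excess kurtosis, while a window containing a rare two-state transition stands out:

```python
import numpy as np

# Toy stand-in for a simulation observable: Gaussian (harmonic) fluctuations
# with a single rare jump between two states at a hypothetical frame 2250.
rng = np.random.default_rng(1)
n, window = 4000, 500
x = rng.normal(0.0, 1.0, n)
x[2250:] += 6.0

def excess_kurtosis(s):
    """Fourth standardized moment minus 3 (near zero for Gaussian data)."""
    s = s - s.mean()
    return (s**4).mean() / (s**2).mean()**2 - 3.0

# Score each window by the magnitude of its excess kurtosis; the bimodal
# window containing the transition dominates.
scores = [abs(excess_kurtosis(x[i:i + window])) for i in range(0, n, window)]
flagged = int(np.argmax(scores)) * window
print("most anharmonic window starts at frame", flagged)
```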

  3. PDB4DNA: Implementation of DNA geometry from the Protein Data Bank (PDB) description for Geant4-DNA Monte-Carlo simulations

    Science.gov (United States)

    Delage, E.; Pham, Q. T.; Karamitros, M.; Payno, H.; Stepan, V.; Incerti, S.; Maigne, L.; Perrot, Y.

    2015-07-01

    This paper describes PDB4DNA, a new Geant4 user application based on an independent, cross-platform, free and open-source C++ library, PDBlib, which enables the use of an atomic-level description of the DNA molecule in Geant4 Monte Carlo particle transport simulations. For the evaluation of direct damage induced on the DNA molecule by ionizing particles, the application uses an algorithm that determines the atom in the DNA molecule closest to each energy deposition. Both the PDB4DNA application and the PDBlib library are available as free and open source software under the Geant4 license.

  4. Application of three-dimensional simulation at lecturing on descriptive geometry

    Directory of Open Access Journals (Sweden)

    Tel'noy Viktor Ivanovich

    2014-05-01

    Full Text Available Teaching descriptive geometry has its own characteristics. It is necessary not only to convey to students a certain amount of knowledge on the subject, but also to develop their spatial imagination and their skills of logical thinking. Practice in teaching the discipline has shown that students face serious difficulties in studying it. This is due to the relatively low level of their schooling in geometry and technical drawing, and to an insufficiently developed spatial imagination. They find it difficult to picture the geometrical image of the object of study and mentally project it onto the plane. Hence the need to find ways to teach the discipline «Descriptive Geometry» effectively at university. In the context of global informatization and computerization of the educational process, the use of graphics programs for the development of design documentation and 3D modeling is one of the most promising applications of information technology for solving these problems. With the help of three-dimensional models, the best visibility in the classroom is achieved. When delivering lectures on descriptive geometry it is advisable to use three-dimensional modeling not only as a didactic means (a means of visualization), but also as a method of teaching (a learning tool for solving various graphics tasks). With this in mind, the essence of 3D modeling is revealed with the aim of better understanding the algorithms for solving both positional and metric tasks using spatial representation of graphic constructions. It is shown that the possibility of viewing the built model from different angles is of particular importance, as is the use of transparency properties for illustrating the results of solving geometric problems. Using 3D models together with their display on the plane, as well as text information, promotes better assimilation and more lasting memorization of the

  5. Simulation of containment atmosphere stratification experiment using local instantaneous description

    International Nuclear Information System (INIS)

    Babic, M.; Kljenak, I.

    2004-01-01

    An experiment on mixing and stratification in the atmosphere of a nuclear power plant containment at accident conditions was simulated with the CFD code CFX4.4. The original experiment was performed in the TOSQAN experimental facility. Simulated nonhomogeneous temperature, species concentration and velocity fields are compared to experimental results. (author)

  6. Generator dynamics in aeroelastic analysis and simulations

    DEFF Research Database (Denmark)

    Larsen, Torben J.; Hansen, Morten Hartvig; Iov, F.

    2003-01-01

    This report contains a description of a dynamic model for a doubly-fed induction generator. The model has physical input parameters (voltage, resistance, reactance etc.) and can be used to calculate rotor and stator currents, hence active and reactive power. A perturbation method has been used...... to reduce the original generator model equations to a set of equations which can be solved with the same time steps as a typical aeroelastic code. The method is used to separate the fast transients of the model from the slow variations and deduce a reduced order expression for the slow part. Dynamic effects...... of the first order terms in the model, as well as the influence on drive train eigenfrequencies and damping, have been investigated. Load response during time simulation of wind turbine response has been compared to simulations with a traditional static generator model based entirely on the slip angle. A 2 MW...

  7. Simulation Loop between CAD systems, Geant4 and GeoModel: Implementation and Results

    CERN Document Server

    Sharmazanashvili, Alexander; The ATLAS collaboration

    2015-01-01

    Data-vs-Monte-Carlo discrepancy is one of the most important fields of investigation for ATLAS simulation studies. There are several reasons for such discrepancies, but the primary interest falls on geometry studies and on investigating how adequately the geometry descriptions of the detector used in simulation represent the "as-built" descriptions. Shape consistency and level of detail are less important, while the adequacy of the volumes and weights of the detector components is essential for tracking. There are two main causes of faults in the geometry descriptions used in simulation: 1) inconsistency with the "as-built" geometry descriptions; 2) internal inaccuracies introduced by the simulation packages themselves. The Georgian engineering team developed a hub on the basis of the CATIA platform, together with several tools that make it possible to read into CATIA the different descriptions used by simulation packages: XML/Persint->CATIA; IV/VP1->CATIA; GeoModel->CATIA; Geant4->CATIA. As a result it becomes possible to compare the different descriptions with each othe...

  8. Man-machine interfaces analysis system based on computer simulation

    International Nuclear Information System (INIS)

    Chen Xiaoming; Gao Zuying; Zhou Zhiwei; Zhao Bingquan

    2004-01-01

    The paper describes a software assessment system, Dynamic Interaction Analysis Support (DIAS), based on computer simulation technology, for the man-machine interfaces (MMI) of a control room. It employs a computer to simulate operators' procedures on the man-machine interfaces of a control room, provides a quantified assessment, and at the same time analyses operator error rates by means of human error rate prediction techniques. Problems in the placement of man-machine interfaces and the arrangement of instruments in a control room can be detected from the simulation results. The DIAS system can provide good technical support for the design and improvement of the man-machine interfaces of the main control room of a nuclear power plant

  9. Stochastic analysis for finance with simulations

    CERN Document Server

    Choe, Geon Ho

    2016-01-01

    This book is an introduction to stochastic analysis and quantitative finance; it includes both theoretical and computational methods. Topics covered are stochastic calculus, option pricing, optimal portfolio investment, and interest rate models. Also included are simulations of stochastic phenomena, numerical solutions of the Black–Scholes–Merton equation, Monte Carlo methods, and time series. Basic measure theory is used as a tool to describe probabilistic phenomena. The level of familiarity with computer programming is kept to a minimum. To make the book accessible to a wider audience, some background mathematical facts are included in the first part of the book and also in the appendices. This work attempts to bridge the gap between mathematics and finance by using diagrams, graphs and simulations in addition to rigorous theoretical exposition. Simulations are not only used as the computational method in quantitative finance, but they can also facilitate an intuitive and deeper understanding of theoret...

  10. Microstructure-based analysis and simulation of flow and mass transfer in chromatographic stationary phases

    Science.gov (United States)

    Koku, Harun

    -dimensional pore size distributions obtained for the CIM disk using image processing algorithms were found to deviate significantly from the manufacturer-reported experimental mercury intrusion results, the difference being attributed to the local nature of the image-based methods or assumptions and limitations inherent in the experimental mercury intrusion method. A probe placement algorithm was introduced to estimate solute capacity from the explicit geometry of the monolith. To enable a precise description of both flow and the geometry for a rigorous analysis of dispersion, a three-dimensional sample block for the CIM disk was reconstructed using serial imaging and sectioning, and the flow and mass transfer were simulated using a lattice-Boltzmann method and a particle-based random-walk method, respectively. Flow simulations hinted at the partitioning of flow into high- and low-velocity regions and the dispersion simulations obtained on top of the velocity field revealed artifacts in particle trajectories due to the symmetry of the lateral flow with respect to the periodic boundaries. Constraining the simulation length to reduce the effect of this symmetry yielded dispersion behavior suggestive of channeling, hinting that the sample geometry might not be representative of the macroscopic structure. Simulations of the local behavior of finite particles predicted the experimentally observed entrapment behavior, as well as the increase of the entrapment level with flow rate. Analysis of trajectories provided support for a previously hypothesized mechanism for entrapment.
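
    The particle-based random-walk part of such dispersion simulations can be sketched in a few lines of Python: for Gaussian steps of variance 2·D·dt, the ensemble mean-squared displacement grows as 2·D·t, so a diffusion/dispersion coefficient is recoverable from the MSD slope. The value of D and all sizes below are illustrative, not taken from the thesis:

```python
import numpy as np

rng = np.random.default_rng(3)
D, dt = 1.5e-9, 1e-3                      # assumed diffusivity (m^2/s), time step (s)
n_particles, n_steps = 2000, 1000

# Independent 1-D Brownian steps; the cumulative sum gives each trajectory.
steps = rng.normal(0.0, np.sqrt(2 * D * dt), (n_particles, n_steps))
x = np.cumsum(steps, axis=1)
msd = (x**2).mean(axis=0)                 # ensemble mean-squared displacement

t = dt * np.arange(1, n_steps + 1)
D_est = np.polyfit(t, msd, 1)[0] / 2.0    # MSD slope = 2*D in one dimension
print(f"recovered D = {D_est:.2e} m^2/s")
```

    In the actual simulations the walkers would additionally be advected by the lattice-Boltzmann velocity field; only the stochastic part is sketched here.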

  11. Psychological analysis of primary school pupils self-description in a computer game

    Directory of Open Access Journals (Sweden)

    I. D. Spirina

    2017-06-01

    Full Text Available Objective. The aim of this study was to reveal the specific impact of computer games on the consciousness of primary school children. Materials and Methods. 30 children aged from 6 to 11 years were examined. Qualitative research methods were used to describe the children's computer game experience according to the main stages of structured phenomenological research. A questionnaire for children's self-description in a computer game was developed, and a qualitative analysis of these descriptions was conducted. Results. The analysis of the descriptions revealed difficulty in separating "true" from "false", the use of personal pronouns, the absence of a proper distinction between the "Self" as a game character and the "Self" of the child as a whole, the attribution of the properties of living creatures to virtual "opponents" or "partners", and confusion in the use of temporal and spatial terms when the children described the game. The children described only the outer plan of the game, such as plot, "events", "actions" and the difficulties occurring in the game, but no emotions were reflected at all. While describing the "events" occurring in the game, the children were not able to focus on themselves either then or during the game. Conclusions. The involvement of a child in a computer game causes, first of all, a disorder in the functioning of the emotional sphere, whereby emotions are not understood by the child. Discrepancies in the children's descriptions of themselves, of their nature and of the trends of their favourite games were exposed, indicating disorders in the formation of the child's self-attitude and self-esteem. While playing a computer game a special "operation mode" of the child's mind emerges, in which the impact of an unreal image on the child's mind can distort the natural course of cognitive and emotional reflection of reality.

  12. Structure optimization and simulation analysis of the quartz micromachined gyroscope

    Directory of Open Access Journals (Sweden)

    Xuezhong Wu

    2014-02-01

    Full Text Available Structure optimization and simulation analysis of the quartz micromachined gyroscope are reported in this paper. The relationships between the structure parameters and the frequencies of the working modes were analysed by finite element analysis. The structure parameters of the quartz micromachined gyroscope were optimized to reduce the difference between the frequencies of the drive mode and the sense mode. The simulation results were verified by testing a prototype gyroscope fabricated by micro-electromechanical systems (MEMS) technology. The frequencies of the drive mode and the sense mode can therefore be matched through structure optimization and simulation analysis, which is helpful in the design of high-sensitivity quartz micromachined gyroscopes.

  13. Regional hydrogeological simulations. Numerical modelling using ConnectFlow. Preliminary site description Simpevarp sub area - version 1.2

    Energy Technology Data Exchange (ETDEWEB)

    Hartley, Lee; Hoch, Andrew; Hunter, Fiona; Jackson, Peter [Serco Assurance, Risley (United Kingdom); Marsic, Niko [Kemakta Konsult, Stockholm (Sweden)

    2005-02-01

    objective of this study is to support the development of a preliminary Site Description of the Simpevarp area on a regional scale, based on the data available in August 2004 (Data Freeze S1.2) and the previous Site Description. A more specific objective of this study is to assess the role of known and unknown hydrogeological conditions in the present-day distribution of saline groundwater in the Simpevarp area on a regional scale. An improved understanding of the paleo-hydrogeology is necessary in order to gain credibility for the Site Description in general and the hydrogeological description in particular. This is to serve as a basis for describing the present hydrogeological conditions on a local scale as well as for predictions of future hydrogeological conditions. Other key objectives were to identify the model domain required to simulate regional flow and solute transport in the Simpevarp area and to incorporate a new geological model of the deformation zones produced for Version S1.2. Another difference from Version S1.1 is the increased effort invested in conditioning the hydrogeological property models to the fracture boremap and hydraulic data. A new methodology was developed for interpreting the discrete fracture network (DFN) by integrating the geological description of the DFN (GeoDFN) with the hydraulic test data from Posiva Flow-Log and Pipe-String System double-packer techniques to produce a conditioned Hydro-DFN model. This was done in a systematic way that addressed the uncertainties associated with the assumptions made in interpreting the data, such as the relationship between fracture transmissivity and length. Consistent hydraulic data were only available for three boreholes, and therefore only relatively simplistic models were proposed, as there is not sufficient data to justify extrapolating the DFN away from the boreholes based on rock domain, for example.
Significantly, a far greater quantity of hydro-geochemical data was available for calibration in the

  14. Description of surface systems. Preliminary site description. Forsmark area Version 1.2

    Energy Technology Data Exchange (ETDEWEB)

    Lindborg, Tobias [ed.

    2005-06-01

    the biosphere. Methodologies for developing descriptive- and ecosystem models are only described briefly in this report, but for thorough methodology descriptions see references. The work has been conducted by the project group SurfaceNet together with other discipline-specific collaborators, engaged by members of the project group. The members of the project group represent the disciplines ecology, hydrology, Quaternary geology, soil science, limnology, oceanography, hydrogeology, hydrogeochemistry, environmental science, physical geography and human geography. In addition, some group members have specific qualifications of importance, e.g. experts in GIS modelling and in statistical data analysis.

  15. Description of surface systems. Preliminary site description. Forsmark area Version 1.2

    International Nuclear Information System (INIS)

    Lindborg, Tobias

    2005-06-01

    the biosphere. Methodologies for developing descriptive- and ecosystem models are only described briefly in this report, but for thorough methodology descriptions see references. The work has been conducted by the project group SurfaceNet together with other discipline-specific collaborators, engaged by members of the project group. The members of the project group represent the disciplines ecology, hydrology, Quaternary geology, soil science, limnology, oceanography, hydrogeology, hydrogeochemistry, environmental science, physical geography and human geography. In addition, some group members have specific qualifications of importance, e.g. experts in GIS modelling and in statistical data analysis

  16. Environmental decision support system on base of geoinformational technologies for the analysis of nuclear accident consequences

    International Nuclear Information System (INIS)

    Haas, T.C.; Maigan, M.; Arutyunyan, R.V.; Bolshov, L.A.; Demianov, V.V.

    1996-01-01

    The report deals with a description of the concept and prototype of an environmental decision support system (EDSS) for the analysis of late off-site consequences of severe nuclear accidents and for the analysis, processing and presentation of spatially distributed radioecological data. A general description of the available software and the use of modern achievements of geostatistics and stochastic simulation for the analysis of spatial data are presented and discussed

  17. Tokamak control simulator

    International Nuclear Information System (INIS)

    Edelbaum, T.N.; Serben, S.; Var, R.E.

    1976-01-01

    A computer model of a tokamak experimental power reactor and its control system is being constructed. This simulator will allow the exploration of various open loop and closed loop strategies for reactor control. This paper provides a brief description of the simulator and some of the potential control problems associated with this class of tokamaks

  18. AN ANALYSIS OF STUDENT‘S DESCRIPTIVE TEXT: SYSTEMIC FUNCTIONAL LINGUISTICS PERSPECTIVES

    Directory of Open Access Journals (Sweden)

    Rizka Maulina Wulandari

    2017-12-01

    Full Text Available In Indonesia, where different languages co-exist and English is used as a foreign language, learners' capabilities in writing English play an important role in formulating effective learning methods. This descriptive qualitative research aimed to investigate a student's errors in writing descriptive text from an SFL perspective. A secondary, yet important, objective of this research was to design appropriate pedagogical plans, based on the results, that can be used for junior high school students in Indonesian education. The results indicated that the student has good control of the schematic structure of descriptive text, although many of his ideas still draw on an Indonesian context, which may confuse the reader in understanding his meaning. It can be concluded that there is interference from the L1, Indonesian, in his writing of descriptive text. Hence, the study highlights that cooperative learning could be an option as an appropriate learning method to solve students' problems in writing descriptive text.

  19. The Monte Carlo Simulation Method for System Reliability and Risk Analysis

    CERN Document Server

    Zio, Enrico

    2013-01-01

    Monte Carlo simulation is one of the best tools for performing realistic analysis of complex systems as it allows most of the limiting assumptions on system behavior to be relaxed. The Monte Carlo Simulation Method for System Reliability and Risk Analysis comprehensively illustrates the Monte Carlo simulation method and its application to reliability and system engineering. Readers are given a sound understanding of the fundamentals of Monte Carlo sampling and simulation and its application for realistic system modeling.   Whilst many of the topics rely on a high-level understanding of calculus, probability and statistics, simple academic examples will be provided in support to the explanation of the theoretical foundations to facilitate comprehension of the subject matter. Case studies will be introduced to provide the practical value of the most advanced techniques.   This detailed approach makes The Monte Carlo Simulation Method for System Reliability and Risk Analysis a key reference for senior undergra...
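
    As a minimal illustration of the sampling idea the book builds on (a generic sketch, not an example from the book), a Monte Carlo estimate of the reliability of a small series/parallel system can be checked against the analytic value; the failure probabilities below are invented:

```python
import numpy as np

# System: component 1 in series with a parallel pair (2, 3).
rng = np.random.default_rng(7)
p_fail = np.array([0.05, 0.10, 0.20])     # failure prob. of components 1, 2, 3
n = 200_000

fails = rng.random((n, 3)) < p_fail       # sampled component states per trial
system_fail = fails[:, 0] | (fails[:, 1] & fails[:, 2])
mc = 1.0 - system_fail.mean()             # Monte Carlo reliability estimate

exact = (1 - 0.05) * (1 - 0.10 * 0.20)    # analytic series/parallel result
print(f"MC {mc:.4f}  vs exact {exact:.4f}")
```

    The value of the method, as the book stresses, is that the sampling loop stays the same when the simple independence assumptions made here are relaxed.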

  20. Tutorial: Parallel Computing of Simulation Models for Risk Analysis.

    Science.gov (United States)

    Reilly, Allison C; Staid, Andrea; Gao, Michael; Guikema, Seth D

    2016-10-01

    Simulation models are widely used in risk analysis to study the effects of uncertainties on outcomes of interest in complex problems. Often, these models are computationally complex and time consuming to run. This latter point may be at odds with time-sensitive evaluations or may limit the number of parameters that are considered. In this article, we give an introductory tutorial focused on parallelizing simulation code to better leverage modern computing hardware, enabling risk analysts to better utilize simulation-based methods for quantifying uncertainty in practice. This article is aimed primarily at risk analysts who use simulation methods but do not yet utilize parallelization to decrease the computational burden of these models. The discussion is focused on conceptual aspects of embarrassingly parallel computer code and software considerations. Two complementary examples are shown using the languages MATLAB and R. A brief discussion of hardware considerations is located in the Appendix. © 2016 Society for Risk Analysis.
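
    The embarrassingly parallel pattern the tutorial describes amounts to mapping independent replications over a worker pool. A small Python sketch follows (the tutorial's own examples are in MATLAB and R; a thread pool is used here for portability, and for pure-Python CPU-bound work a `ProcessPoolExecutor` guarded by `if __name__ == "__main__":` would be the usual choice):

```python
import concurrent.futures as cf
import numpy as np

def one_replication(seed):
    """One independent simulation run: a toy Monte Carlo estimate of pi.

    Replications share nothing, so they are embarrassingly parallel; each
    gets its own seed to keep the streams independent and reproducible.
    """
    rng = np.random.default_rng(seed)
    pts = rng.random((100_000, 2))
    return 4.0 * np.mean((pts**2).sum(axis=1) <= 1.0)

# Map the replications over a pool of workers and pool the results.
with cf.ThreadPoolExecutor(max_workers=4) as pool:
    estimates = list(pool.map(one_replication, range(8)))

print("pooled estimate:", sum(estimates) / len(estimates))
```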

  1. ANALYSIS OF INPATIENT HOSPITAL STAFF MENTAL WORKLOAD BY MEANS OF DISCRETE-EVENT SIMULATION

    Science.gov (United States)

    2016-03-24

    ANALYSIS OF INPATIENT HOSPITAL STAFF MENTAL WORKLOAD BY MEANS OF DISCRETE-EVENT SIMULATION. AFIT-ENV-MS-16-M-166. Erich W

  2. Simulation Analysis of Helicopter Ground Resonance Nonlinear Dynamics

    Science.gov (United States)

    Zhu, Yan; Lu, Yu-hui; Ling, Ai-min

    2017-07-01

    In order to accurately predict the dynamic instability of helicopter ground resonance, a modeling and simulation method for helicopter ground resonance is presented that considers the nonlinear dynamic characteristics of components (rotor lead-lag damper, landing gear wheel and absorber). A numerical integration method is used to calculate the transient responses of the body and rotor under a simulated disturbance. To quantify the instabilities, the Fast Fourier Transform (FFT) is used to estimate the modal frequencies, and a moving rectangular window method is employed to predict the modal damping from the response time history. Simulation results show that the ground resonance simulation test can accurately identify the blade lead-lag regressive mode frequency, and that the modal damping obtained from the attenuation curves is close to the test results. The simulation test results are consistent with the actual accident situation and prove the correctness of the simulation method. This analysis method for ground resonance simulation tests gives results that accord with real helicopter engineering tests.
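
    The frequency-and-damping identification step described above can be sketched on a synthetic single-mode response. The 2 Hz frequency and 3% damping ratio below are invented, and the logarithmic decrement of successive peaks stands in for the paper's moving-window method:

```python
import numpy as np

# Synthetic decaying response standing in for a measured mode time history.
fs, f0, zeta = 200.0, 2.0, 0.03                  # sample rate, frequency, damping
t = np.arange(0, 20, 1 / fs)
y = np.exp(-zeta * 2 * np.pi * f0 * t) * np.cos(2 * np.pi * f0 * np.sqrt(1 - zeta**2) * t)

# Modal frequency from the FFT peak of the response.
spec = np.abs(np.fft.rfft(y))
freqs = np.fft.rfftfreq(y.size, 1 / fs)
f_est = freqs[np.argmax(spec)]

# Damping ratio from the logarithmic decrement over 5 positive peaks.
peaks = [i for i in range(1, y.size - 1)
         if y[i] > y[i - 1] and y[i] > y[i + 1] and y[i] > 0]
delta = np.log(y[peaks[0]] / y[peaks[5]]) / 5
zeta_est = delta / np.sqrt(4 * np.pi**2 + delta**2)
print(f"f = {f_est:.2f} Hz, zeta = {zeta_est:.3f}")
```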

  3. Hydrodynamic analysis and simulation of a flow cell ammonia electrolyzer

    International Nuclear Information System (INIS)

    Diaz, Luis A.; Botte, Gerardine G.

    2015-01-01

    Highlights: • The NH_3 electrooxidation mechanism was validated in a bench-scale electrolyzer. • All kinetic parameters for NH_3 electro-oxidation were calculated and verified. • The hydrodynamic behavior of the NH_3 electrolyzer was properly described as a CSTR. • The CSTR model was successfully applied to simulate a flow ammonia electrolyzer. - Abstract: The hydrodynamic analysis and simulation of a non-ideal single-pass flow-cell alkaline ammonia electrolyzer was performed after the scale-up of a well-characterized deposited polycrystalline Pt on Ni anode. The hydrodynamic analysis was performed using the residence time distribution (RTD) test. The results of the hydrodynamic investigation provide additional insights for the kinetic analysis of the ammonia electrooxidation reaction on polycrystalline Pt electrocatalysts (which is typically performed under a controlled flow regime, e.g., with a rotating disk electrode) by including the flow non-uniformity present in the electrolyzer. Based on the RTD function, the ammonia electrolyzer performance was simulated as a non-steady-state continuous stirred-tank reactor (CSTR), and the unknown kinetic parameters were obtained by fitting the simulation results to an experimental current profile, yielding an adequate prediction of the ammonia conversion. This simplified approach to the simulation of the ammonia electrolyzer could be implemented in process simulation packages and could be used for the design and scale-up of the process for hydrogen production and wastewater remediation.
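
    The CSTR interpretation of the RTD test can be illustrated with a short Python sketch: for an ideal CSTR the residence-time distribution is E(t) = exp(-t/τ)/τ, and the mean residence time τ is the first moment of the measured curve. The τ = 12 s below is an invented value, and the tracer curve is synthetic:

```python
import numpy as np

def trapezoid(y, x):
    """Trapezoidal rule, written out to avoid NumPy version differences."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

# Synthetic tracer-response curve for an ideal CSTR with tau = 12 s.
tau = 12.0
t = np.linspace(0.0, 120.0, 2000)
E = np.exp(-t / tau) / tau

area = trapezoid(E, t)                     # ~1 for a normalized RTD
tau_est = trapezoid(t * E, t) / area       # mean residence time (first moment)
print(f"area = {area:.3f}, tau = {tau_est:.2f} s")
```

    Deviations of a measured E(t) from this exponential shape are what quantify the non-ideality that the paper folds into the kinetic fit.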

  4. Time-dependent description of quantum interference nanotransistor

    International Nuclear Information System (INIS)

    Konopka, M.; Bokes, P.

    2012-01-01

    In this contribution we present simulations of the electron current response to applied gate potentials in a ring-shaped quantum interference device. Such a device could function as a current-switching quantum-interference transistor. We demonstrate the capability of our approach to describe this kind of system while keeping full quantum coherence in the description for extended periods of time. This has been achieved thanks to a unique feature of our method, which allows for explicit simulations of small quantum subsystems with open boundary conditions. Further generalisation of the method is needed to reduce the number of basis set functions required to describe the system. (authors)

  5. Needs analysis for developing a virtual-reality NOTES simulator.

    Science.gov (United States)

    Sankaranarayanan, Ganesh; Matthes, Kai; Nemani, Arun; Ahn, Woojin; Kato, Masayuki; Jones, Daniel B; Schwaitzberg, Steven; De, Suvranu

    2013-05-01

    INTRODUCTION AND STUDY AIM: Natural orifice translumenal endoscopic surgery (NOTES) is an emerging surgical technique that requires a cautious adoption approach to ensure patient safety. High-fidelity virtual-reality-based simulators allow the development of new surgical procedures and tools and the training of medical personnel without risk to human patients. As part of a project funded by the National Institutes of Health, we are developing the virtual transluminal endoscopic surgery trainer (VTEST) for this purpose. The objective of this study is to conduct a structured needs analysis to identify the design parameters for such a virtual-reality-based simulator for NOTES. A 30-point questionnaire was distributed at the 2011 Natural Orifice Surgery Consortium for Assessment and Research meeting to obtain responses from experts. Ordinal logistic regression and the Wilcoxon rank-sum test were used for analysis. A total of 22 NOTES experts participated in the study. Cholecystectomy (CE, 68 %) followed by appendectomy (AE, 63 %) (CE vs AE, p = 0.0521) was selected as the first choice for simulation. Flexible (FL, 47 %) and hybrid (HY, 47 %) approaches were equally favorable compared with rigid (RI, 6 %). The utility of a virtual NOTES simulator in training and testing new tools for NOTES was rated very high by the participants. Our study reinforces the importance of developing a virtual NOTES simulator and clearly presents expert preferences. The results of this analysis will direct our initial development of the VTEST platform.

  6. A data integration approach for cell cycle analysis oriented to model simulation in systems biology

    Directory of Open Access Journals (Sweden)

    Mosca Ettore

    2007-08-01

    Full Text Available Abstract Background The cell cycle is one of the biological processes most frequently investigated in systems biology studies and it involves the knowledge of a large number of genes and networks of protein interactions. A deep knowledge of the molecular aspects of this biological process can contribute to making cancer research more accurate and innovative. In this context the mathematical modelling of the cell cycle plays a relevant role in quantifying the behaviour of each component of the system. The mathematical modelling of a biological process such as the cell cycle allows a systemic description that helps to highlight features, such as emergent properties, which could be hidden when the analysis is performed only from a reductionist point of view. Moreover, in modelling complex systems, a complete annotation of all the components is equally important to understand the interaction mechanisms inside the network: for this reason data integration of the model components has high relevance in systems biology studies. Description In this work, we present a resource, the Cell Cycle Database, intended to support systems biology analysis of the cell cycle process in two organisms, yeast and mammals. The database integrates information about genes and proteins involved in the cell cycle process, stores complete models of the interaction networks and allows the mathematical simulation over time of the quantitative behaviour of each component. To accomplish this task, we developed a web interface for browsing information related to cell cycle genes, proteins and mathematical models. In this framework, we have implemented a pipeline which allows users to deal with the mathematical part of the models, in order to solve, using different variables, the ordinary differential equation systems that describe the biological process. 
Conclusion This integrated system is freely available in order to support systems biology research on the cell cycle and
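    Solving the ordinary differential equation systems behind such models reduces to numerical integration. The sketch below uses a generic fixed-step Runge-Kutta integrator on a toy two-variable activator-inhibitor loop; the equations and parameters are invented for illustration and are not one of the database's curated models:

```python
def rk4_step(f, t, y, h):
    """One classical fourth-order Runge-Kutta step for y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h/2, [yi + h/2*ki for yi, ki in zip(y, k1)])
    k3 = f(t + h/2, [yi + h/2*ki for yi, ki in zip(y, k2)])
    k4 = f(t + h, [yi + h*ki for yi, ki in zip(y, k3)])
    return [yi + h/6*(a + 2*b + 2*c + d)
            for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]

def cell_cycle(t, y):
    """Toy cyclin/Cdk feedback loop (hypothetical rate constants)."""
    cyclin, cdk = y
    dcyclin = 0.1 - 0.2 * cdk * cyclin   # synthesis minus Cdk-driven degradation
    dcdk = 0.5 * cyclin - 0.3 * cdk      # activation by cyclin, first-order decay
    return [dcyclin, dcdk]

t, y, h = 0.0, [0.5, 0.1], 0.01
for _ in range(1000):
    y = rk4_step(cell_cycle, t, y, h)
    t += h
```

    Real cell cycle models are stiffer and larger, so production pipelines typically use adaptive or implicit solvers, but the interface (a right-hand-side function plus a stepper) is the same.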

  7. Environmental management policy analysis using complex system simulation

    International Nuclear Information System (INIS)

    Van Eeckhout, E.; Roberts, D.; Oakes, R.; Shieh, A.; Hardie, W.; Pope, P.

    1999-01-01

    The two primary modules of Envirosim (the model of Los Alamos TA-55 and the WIPP transport/storage model) have been combined into one application, with the simulated waste generated by TA-55 operations being fed to storage, packaging, and transport simulation entities. Three simulation scenarios were executed which demonstrate the usefulness of Envirosim as a policy analysis tool for use in planning shipments to WIPP. A graphical user interface (GUI) has been implemented using IDL (Interactive Data Language) which allows the analyst to easily view simulation results. While IDL is not necessarily the graphics interface that would be selected for a production version of Envirosim, it does provide some powerful data manipulation capabilities, and it runs on a variety of platforms

  8. User's guide of DETRAS system-3. Description of the simulated reactor plant

    International Nuclear Information System (INIS)

    Yamaguchi, Yukichi

    2006-12-01

    DETRAS is a PWR reactor simulator system for operation training whose distinguishing feature is that it can be operated from a location remote from the simulator site. This document, the third of the three volumes of the DETRAS user's guide, describes first an outline of the simulated reactor system, then the user interface needed to operate the simulator, and finally the procedures for starting up the simulated reactor and shutting it down from its rated operation state. (author)

  9. A multi-subject evaluation of uncertainty in anatomical landmark location on shoulder kinematic description.

    Science.gov (United States)

    Langenderfer, Joseph E; Rullkoetter, Paul J; Mell, Amy G; Laz, Peter J

    2009-04-01

    An accurate assessment of shoulder kinematics is useful for understanding healthy normal and pathological mechanics. Small variability in identifying and locating anatomical landmarks (ALs) has the potential to affect reported shoulder kinematics. The objectives of this study were to quantify the effect of landmark location variability on scapular and humeral kinematic descriptions for multiple subjects using probabilistic analysis methods, and to evaluate the consistency in results across multiple subjects. Data from 11 healthy subjects performing humeral elevation in the scapular plane were used to calculate Euler angles describing humeral and scapular kinematics. Probabilistic analyses were performed for each subject to simulate uncertainty in the locations of 13 upper-extremity ALs. For standard deviations of 4 mm in landmark location, the analysis predicted Euler angle envelopes between the 1st and 99th percentile bounds of up to 16.6 degrees. While absolute kinematics varied with the subject, the average 1-99% kinematic ranges for the motion were consistent across subjects and sensitivity factors showed no statistically significant differences between subjects. The description of humeral kinematics was most sensitive to the location of landmarks on the thorax, while landmarks on the scapula had the greatest effect on the description of scapular elevation. The findings of this study can provide a better understanding of kinematic variability, which can aid in making accurate clinical diagnoses and refining kinematic measurement techniques.
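    The probabilistic approach described, propagating landmark-location uncertainty into joint angles, can be illustrated with a small Monte Carlo sketch. The landmark positions, segment geometry, and angle definition below are hypothetical stand-ins; only the 4 mm standard deviation is taken from the study:

```python
import math, random

def elevation_angle(proximal, distal):
    """Angle (degrees) of the proximal->distal segment from the downward vertical."""
    dx, dy, dz = (distal[i] - proximal[i] for i in range(3))
    return math.degrees(math.atan2(math.hypot(dx, dy), -dz))

def perturb(point, sd, rng):
    """Add independent Gaussian noise to each coordinate of a landmark."""
    return tuple(c + rng.gauss(0.0, sd) for c in point)

rng = random.Random(42)
shoulder = (0.0, 0.0, 0.0)       # hypothetical landmark positions, mm
elbow = (150.0, 0.0, -260.0)     # arm elevated roughly 30 deg from vertical
sd = 4.0                         # landmark standard deviation (mm), as in the study

n = 5000
angles = sorted(elevation_angle(perturb(shoulder, sd, rng),
                                perturb(elbow, sd, rng)) for _ in range(n))
lo, hi = angles[int(0.01 * n)], angles[int(0.99 * n) - 1]
spread = hi - lo   # width of the 1st-99th percentile envelope, degrees
```

    The study itself used 13 landmarks and full Euler angle decompositions, but the principle, sampling landmark positions and reading percentile bounds off the resulting angle distribution, is the same.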

  10. Study on Real-Time Simulation Analysis and Inverse Analysis System for Temperature and Stress of Concrete Dam

    Directory of Open Access Journals (Sweden)

    Lei Zhang

    2015-01-01

    Full Text Available In concrete dam construction it is necessary to strengthen real-time monitoring and scientific management of concrete temperature control. This paper constructs a simulation analysis and inverse analysis system for temperature and stress, based on the various data collected in real time during concrete construction. The system automatically produces the data files for temperature and stress calculation and achieves remote real-time simulation of temperature stress using high-performance computing techniques, so that inverse analysis can be carried out on the basis of the monitoring data in the database. It performs automatic feedback calculation according to the error requirement and generates the corresponding curves and charts after automatically processing and analysing the results. The system automates the complex data preparation and analysis work of the simulation process and the data adjustment of the inverse analysis process, which facilitates real-time tracking simulation and feedback analysis of concrete temperature stress during construction, so that problems can be discovered and addressed promptly, construction schemes adjusted in time, and project quality assured.

  11. Analysis for Parallel Execution without Performing Hardware/Software Co-simulation

    OpenAIRE

    Muhammad Rashid

    2014-01-01

    Hardware/software co-simulation improves the performance of embedded applications by executing the applications on a virtual platform before the actual hardware is available in silicon. However, the virtual platform of the target architecture is often not available during early stages of the embedded design flow. Consequently, analysis for parallel execution without performing hardware/software co-simulation is required. This article presents an analysis methodology for parallel execution of ...

  12. Frictional effects between Overton sand and a simulated casing for a bore hole

    International Nuclear Information System (INIS)

    Dong, R.G.

    1975-01-01

    A series of tests were run to simulate the frictional effects between Overton sand and the casing for a bore hole for an underground nuclear test. The objective was to find a description for this frictional interaction which can be applied to an analysis of stemming materials under field conditions

  13. Learning and Learning-to-Learn by Doing: Simulating Corporate Practice in Law School.

    Science.gov (United States)

    Okamoto, Karl S.

    1995-01-01

    A law school course in advanced corporate legal practice is described. The course, a series of simulated lawyering tasks centered on a hypothetical leveraged buyout transaction, is designed to go beyond basic legal analysis to develop professional expertise in legal problem solving. The course description includes goals, syllabus design,…

  14. Software Geometry in Simulations

    Science.gov (United States)

    Alion, Tyler; Viren, Brett; Junk, Tom

    2015-04-01

    The Long Baseline Neutrino Experiment (LBNE) involves many detectors. The experiment's near detector (ND) facility may ultimately involve several detectors. The far detector (FD) will be significantly larger than any other Liquid Argon (LAr) detector yet constructed; many prototype detectors are being constructed and studied to motivate a plethora of proposed FD designs. Whether it be a constructed prototype or a proposed ND/FD design, every design must be simulated and analyzed. This presents a considerable challenge to LBNE software experts; each detector geometry must be described to the simulation software in an efficient way which allows multiple authors to collaborate easily. Furthermore, different geometry versions must be tracked throughout their use. We present a framework called General Geometry Description (GGD), written and developed by LBNE software collaborators for managing software to generate geometries. Though GGD is flexible enough to be used by any experiment working with detectors, we present its first use in generating Geometry Description Markup Language (GDML) files to interface with LArSoft, a framework of detector simulations, event reconstruction, and data analyses written for all LAr technology users at Fermilab. Brett Viren is the author of the framework discussed here, the General Geometry Description (GGD).
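    GDML files of the kind GGD generates are plain XML. As a rough illustration (not GGD's actual output, which also carries materials, positions, and a schema reference), a minimal GDML-like document can be assembled with the standard library; the volume names are invented:

```python
import xml.etree.ElementTree as ET

def make_gdml(world_size_cm, box_name, box_size_cm):
    """Build a skeletal GDML-style document: two box solids, a logical
    volume for each, and a world volume placing the inner box."""
    gdml = ET.Element("gdml")
    solids = ET.SubElement(gdml, "solids")
    ET.SubElement(solids, "box", name="worldSolid", lunit="cm",
                  x=str(world_size_cm), y=str(world_size_cm), z=str(world_size_cm))
    ET.SubElement(solids, "box", name=box_name + "Solid", lunit="cm",
                  x=str(box_size_cm), y=str(box_size_cm), z=str(box_size_cm))
    structure = ET.SubElement(gdml, "structure")
    vol = ET.SubElement(structure, "volume", name=box_name)
    ET.SubElement(vol, "solidref", ref=box_name + "Solid")
    world = ET.SubElement(structure, "volume", name="World")
    ET.SubElement(world, "solidref", ref="worldSolid")
    phys = ET.SubElement(world, "physvol")
    ET.SubElement(phys, "volumeref", ref=box_name)
    setup = ET.SubElement(gdml, "setup", name="Default", version="1.0")
    ET.SubElement(setup, "world", ref="World")
    return ET.tostring(gdml, encoding="unicode")

doc = make_gdml(1000, "cryostat", 400)
```

    Generating geometry programmatically like this, rather than hand-editing XML, is precisely what makes versioning and multi-author collaboration tractable.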

  15. A comparison of two follow-up analyses after multiple analysis of variance, analysis of variance, and descriptive discriminant analysis: A case study of the program effects on education-abroad programs

    Science.gov (United States)

    Alvin H. Yu; Garry. Chick

    2010-01-01

    This study compared the utility of two different post-hoc tests after detecting significant differences within factors on multiple dependent variables using multivariate analysis of variance (MANOVA). We compared the univariate F test (the Scheffé method) to descriptive discriminant analysis (DDA) using an educational-tour survey of university study-...

  16. Metabolic control analysis of biochemical pathways based on a thermokinetic description of reaction rates

    DEFF Research Database (Denmark)

    Nielsen, Jens Bredal

    1997-01-01

    Metabolic control analysis is a powerful technique for the evaluation of flux control within biochemical pathways. Its foundation is the elasticity coefficients and the flux control coefficients (FCCs). On the basis of a thermokinetic description of reaction rates it is here shown...... that the elasticity coefficients can be calculated directly from the pool levels of metabolites at steady state. The only requirement is that one thermodynamic parameter be known, namely the reaction affinity at the intercept of the tangent in the inflection point of the curve of reaction rate against reaction...... of the thermokinetic description of reaction rates to include the influence of effecters. Here the reaction rate is written as a linear function of the logarithm of the metabolite concentrations. With this type of rate function it is shown that the approach of Delgado and Liao [Biochem. J. (1992) 282, 919-927] can...

  17. Conformational analysis of oligosaccharides and polysaccharides using molecular dynamics simulations.

    Science.gov (United States)

    Frank, Martin

    2015-01-01

    Complex carbohydrates usually have a large number of rotatable bonds and consequently a large number of theoretically possible conformations can be generated (combinatorial explosion). The application of systematic search methods for conformational analysis of carbohydrates is therefore limited to disaccharides and trisaccharides in a routine analysis. An alternative approach is to use Monte-Carlo methods or (high-temperature) molecular dynamics (MD) simulations to explore the conformational space of complex carbohydrates. This chapter describes how to use MD simulation data to perform a conformational analysis (conformational maps, hydrogen bonds) of oligosaccharides and how to build realistic 3D structures of large polysaccharides using Conformational Analysis Tools (CAT).
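    Conformational maps of glycosidic linkages are built from torsion angles measured frame by frame along the trajectory. A minimal sketch of that core computation, the dihedral angle defined by four atom positions, is below (generic geometry code, not part of CAT):

```python
import math

def dihedral(p1, p2, p3, p4):
    """Torsion angle in degrees defined by four points
    (convention: cis = 0, trans = 180)."""
    sub = lambda a, b: (a[0] - b[0], a[1] - b[1], a[2] - b[2])
    cross = lambda a, b: (a[1]*b[2] - a[2]*b[1],
                          a[2]*b[0] - a[0]*b[2],
                          a[0]*b[1] - a[1]*b[0])
    dot = lambda a, b: a[0]*b[0] + a[1]*b[1] + a[2]*b[2]
    b1, b2, b3 = sub(p2, p1), sub(p3, p2), sub(p4, p3)
    n1, n2 = cross(b1, b2), cross(b2, b3)      # normals of the two planes
    norm = math.sqrt(dot(b2, b2))
    m1 = cross(n1, tuple(c / norm for c in b2))
    return math.degrees(math.atan2(dot(m1, n2), dot(n1, n2)))

# A planar trans arrangement of four (hypothetical) atom positions:
phi = dihedral((0.0, 1.0, 0.0), (0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (1.0, -1.0, 0.0))
```

    Evaluating this for the phi/psi atom quadruples of each linkage over all frames, then histogramming the (phi, psi) pairs, yields the conformational maps described above.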

  18. Statistical analysis and interpretation of prenatal diagnostic imaging studies, Part 2: descriptive and inferential statistical methods.

    Science.gov (United States)

    Tuuli, Methodius G; Odibo, Anthony O

    2011-08-01

    The objective of this article is to discuss the rationale for common statistical tests used for the analysis and interpretation of prenatal diagnostic imaging studies. Examples from the literature are used to illustrate descriptive and inferential statistics. The uses and limitations of linear and logistic regression analyses are discussed in detail.

  19. Mochovce NPP simulator

    International Nuclear Information System (INIS)

    Ziakova, M.

    1998-01-01

    The basic features and a detailed description of the characteristics of the Mochovce NPP simulator are presented, including its performance, certification and application for the training of NPP operators, as well as the training scenarios

  20. Detailed GEANT description of the SDC central calorimeters

    International Nuclear Information System (INIS)

    Glagolev, V.V.; Li, W.

    1994-01-01

    This article presents a very detailed simulation model of the SDC central calorimeters and some results obtained with that model. The central calorimeter structure was coded on the GEANT 3.15 base in the frame of the SDCSIM environment. SDCSIM is the general shell for simulation of the SDC set-up. The calorimeter geometry has been coded according to the FNAL and ANL engineering drawings and an engineering data file. The detailed description of the SDC central calorimeters is extremely useful for different simulation tasks, for tuning the parameters of fast simulation programs, for studying specific geometry effects (local response nonuniformity from bulkheads in the e.m. calorimeter and from coil supports, among many others) and for the interpretation of experimental data from the calorimeters. This simulation model is also very useful for test-beam calibration of calorimeter modules and for in situ calibration of the calorimeter. 3 refs., 8 figs

  1. Sequentially linear analysis for simulating brittle failure

    NARCIS (Netherlands)

    van de Graaf, A.V.

    2017-01-01

    The numerical simulation of brittle failure at structural level with nonlinear finite element analysis (NLFEA) remains a challenge due to robustness issues. We attribute these problems to the dimensions of real-world structures combined with softening behavior and negative tangent stiffness at

  2. Equation-oriented specification of neural models for simulations

    Directory of Open Access Journals (Sweden)

    Marcel eStimberg

    2014-02-01

    Full Text Available Simulating biological neuronal networks is a core method of research in computational neuroscience. A full specification of such a network model includes a description of the dynamics and state changes of neurons and synapses, as well as the synaptic connectivity patterns and the initial values of all parameters. A standard approach in neuronal modelling software is to build models based on a library of pre-defined models and mechanisms; if a model component does not yet exist, it has to be defined in a special-purpose or general low-level language and potentially be compiled and linked with the simulator. Here we propose an alternative approach that allows flexible definition of models by writing textual descriptions based on mathematical notation. We demonstrate that this approach allows the definition of a wide range of models with minimal syntax. Furthermore, such explicit model descriptions allow the generation of executable code for various target languages and devices, since the description is not tied to an implementation. Finally, this approach also has advantages for readability and reproducibility, because the model description is fully explicit, and because it can be automatically parsed and transformed into formatted descriptions. The presented approach has been implemented in the Brian2 simulator.
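    The idea of defining models as text rather than code can be sketched in a few lines. The mini "syntax" below (a dict of derivative strings plus threshold/reset expressions) is invented for illustration and is not Brian2's actual notation; it integrates a leaky integrate-and-fire neuron with forward Euler:

```python
import math

def simulate(eqs, params, state, threshold, reset, dt=1e-4, t_stop=0.1):
    """Integrate a textual model description with forward Euler.
    `eqs` maps each state variable to the string of its derivative,
    e.g. {'v': '(I - v) / tau'}; `threshold` is a boolean expression and
    `reset` maps variables to expressions, all evaluated in one namespace."""
    spikes = []
    t = 0.0
    while t < t_stop:
        ns = dict(params); ns.update(state); ns['t'] = t
        derivs = {var: eval(expr, {'math': math}, ns) for var, expr in eqs.items()}
        for var, d in derivs.items():
            state[var] += dt * d
        ns.update(state)
        if eval(threshold, {'math': math}, ns):
            spikes.append(t)
            for var, expr in reset.items():
                state[var] = eval(expr, {'math': math}, ns)
        t += dt
    return spikes

# A leaky integrate-and-fire neuron written as text (hypothetical parameters):
spikes = simulate(
    eqs={'v': '(I - v) / tau'},
    params={'I': 2.0, 'tau': 0.01},
    state={'v': 0.0},
    threshold='v > 1.0',
    reset={'v': '0.0'},
)
```

    Because the model lives in strings, the same description could equally be translated into C or GPU code, which is the code-generation advantage the abstract describes (a real implementation would parse the expressions properly rather than use `eval`).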

  3. Description of waste pretreatment and interfacing systems dynamic simulation model

    International Nuclear Information System (INIS)

    Garbrick, D.J.; Zimmerman, B.D.

    1995-05-01

    The Waste Pretreatment and Interfacing Systems Dynamic Simulation Model was created to investigate the required pretreatment facility processing rates for both high level and low level waste so that the vitrification of tank waste can be completed according to the milestones defined in the Tri-Party Agreement (TPA). In order to achieve this objective, the processes upstream and downstream of the pretreatment facilities must also be included. The simulation model starts with retrieval of tank waste and ends with vitrification for both low level and high level wastes. This report describes the results of three simulation cases: one based on suggested average facility processing rates, one with facility rates determined so that approximately 6 new DSTs are required, and one with facility rates determined so that approximately no new DSTs are required. It appears, based on the simulation results, that reasonable facility processing rates can be selected so that no new DSTs are required by the TWRS program. However, this conclusion must be viewed with respect to the modeling assumptions, described in detail in the report. Also included in the report, in an appendix, are results of two sensitivity cases: one with glass plant water recycle streams recycled versus not recycled, and one employing the TPA SST retrieval schedule versus a more uniform SST retrieval schedule. Both recycling and the retrieval schedule appear to have a significant impact on overall tank usage

  4. An efficiency improvement in warehouse operation using simulation analysis

    Science.gov (United States)

    Samattapapong, N.

    2017-11-01

    In general, industry requires an efficient system for warehouse operation. Many important factors must be considered when designing an efficient warehouse system, the most important being an effective warehouse operation system that can help transfer raw material, reduce costs and support transportation. Given these factors, the researchers studied the work systems and warehouse distribution. We started by collecting the data important for storage, such as information on products, on size and location, on data collection and on production, and used all this information to build a simulation model in the Flexsim® simulation software. The simulation analysis found that the conveyor belt was a bottleneck in the warehouse operation. Therefore, several scenarios to relieve that bottleneck were generated and tested through simulation analysis. The results showed that the average queuing time was reduced from 89.8% to 48.7% and the ability to transport products increased from 10.2% to 50.9%. Thus, this can be regarded as the best of the tested methods for increasing efficiency in the warehouse operation.
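    Bottleneck behaviour of the kind reported for the conveyor can be reproduced with an elementary single-server queue simulation. The arrival and service rates below are invented for illustration and have nothing to do with the Flexsim® model:

```python
import random

def conveyor_wait(n_items, interarrival, service, seed=1):
    """Single-server queue (the conveyor): items arrive on average every
    `interarrival` minutes and need `service` minutes on average, both
    exponentially distributed. Returns the mean time spent waiting in queue."""
    rng = random.Random(seed)
    conveyor_free = 0.0   # time at which the conveyor next becomes idle
    arrival = 0.0
    total_wait = 0.0
    for _ in range(n_items):
        arrival += rng.expovariate(1.0 / interarrival)
        start = max(arrival, conveyor_free)   # wait if the conveyor is busy
        total_wait += start - arrival
        conveyor_free = start + rng.expovariate(1.0 / service)
    return total_wait / n_items

slow = conveyor_wait(20000, interarrival=1.0, service=0.9)   # ~90% utilisation
fast = conveyor_wait(20000, interarrival=1.0, service=0.5)   # after a speed-up
```

    The steep drop in mean waiting time when utilisation falls illustrates why relieving the conveyor bottleneck reduced the queuing percentage so sharply in the study.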

  5. TYPES OF SCIENTIFIC DESCRIPTIONS IN ROMANIAN GEOGRAPHY TEXTBOOKS

    Directory of Open Access Journals (Sweden)

    VIORICA BLÎNDA

    2012-01-01

    Full Text Available This study provides a brief look into the numerous aspects of description as a unit of discourse, as well as into its distinctive discursive methods. The perspectives of the proposed analysis emphasize that description as a unit of discourse is no longer denigrated and that it has regained its well-defined place within discourse (especially within the discourse of geography) as a primary unit of discourse. The analysis is based on a corpus of studies represented by geography texts available in geography textbooks. The study outlines a number of methods and strategies of the discursive process of description.

  6. Meta-analyses and Forest plots using a microsoft excel spreadsheet: step-by-step guide focusing on descriptive data analysis.

    Science.gov (United States)

    Neyeloff, Jeruza L; Fuchs, Sandra C; Moreira, Leila B

    2012-01-20

    Meta-analyses are necessary to synthesize data obtained from primary research, and in many situations reviews of observational studies are the only available alternative. General purpose statistical packages can meta-analyze data, but usually require external macros or coding. Commercial specialist software is available, but may be expensive and focused on a particular type of primary data. Most available software packages have limitations in dealing with descriptive data, and the graphical display of summary statistics such as incidence and prevalence is unsatisfactory. Analyses can be conducted using Microsoft Excel, but no guide was previously available. We constructed a step-by-step guide to perform a meta-analysis in a Microsoft Excel spreadsheet, using either fixed-effect or random-effects models. We have also developed a second spreadsheet capable of producing customized forest plots. It is possible to conduct a meta-analysis using only Microsoft Excel. More important, to our knowledge this is the first description of a method for producing a statistically adequate but graphically appealing forest plot summarizing descriptive data, using widely available software.
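    The core arithmetic such a spreadsheet implements, inverse-variance weighting under a fixed-effect model, fits in a few lines. The study estimates below are hypothetical prevalence values chosen for illustration:

```python
import math

def fixed_effect(estimates, std_errors):
    """Inverse-variance fixed-effect pooling: each study is weighted by
    1/SE^2; returns the pooled estimate, its standard error, and a 95% CI."""
    weights = [1.0 / se**2 for se in std_errors]
    pooled = sum(w * e for w, e in zip(weights, estimates)) / sum(weights)
    pooled_se = 1.0 / math.sqrt(sum(weights))
    ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)
    return pooled, pooled_se, ci

# Hypothetical prevalence estimates (%) and standard errors from three studies:
est, se, ci = fixed_effect([12.0, 15.0, 10.0], [2.0, 3.0, 1.5])
```

    A forest plot then draws one row per study (estimate plus CI, marker sized by weight) and a diamond for the pooled row; a random-effects version additionally inflates the weights' denominators by the between-study variance.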

  7. Meta-analyses and Forest plots using a microsoft excel spreadsheet: step-by-step guide focusing on descriptive data analysis

    Directory of Open Access Journals (Sweden)

    Neyeloff Jeruza L

    2012-01-01

    Full Text Available Abstract Background Meta-analyses are necessary to synthesize data obtained from primary research, and in many situations reviews of observational studies are the only available alternative. General purpose statistical packages can meta-analyze data, but usually require external macros or coding. Commercial specialist software is available, but may be expensive and focused on a particular type of primary data. Most available software packages have limitations in dealing with descriptive data, and the graphical display of summary statistics such as incidence and prevalence is unsatisfactory. Analyses can be conducted using Microsoft Excel, but no guide was previously available. Findings We constructed a step-by-step guide to perform a meta-analysis in a Microsoft Excel spreadsheet, using either fixed-effect or random-effects models. We have also developed a second spreadsheet capable of producing customized forest plots. Conclusions It is possible to conduct a meta-analysis using only Microsoft Excel. More important, to our knowledge this is the first description of a method for producing a statistically adequate but graphically appealing forest plot summarizing descriptive data, using widely available software.

  8. Simulation and analysis of plutonium reprocessing plant data

    International Nuclear Information System (INIS)

    Burr, T.; Coulter, A.; Wangen, L.

    1996-01-01

    It will be difficult for large-throughput reprocessing plants to meet International Atomic Energy Agency (IAEA) detection goals for protracted diversion of plutonium by materials accounting alone. Therefore, the IAEA is considering supplementing traditional material balance analysis with analysis of solution monitoring data (frequent snapshots of such solution parameters as level, density, and temperature for all major process vessels). Analysis of solution monitoring data will enhance safeguards by improving anomaly detection and resolution, maintaining continuity of knowledge, and validating and improving measurement error models. However, there are costs associated with accessing and analyzing the data. To minimize these costs, analysis methods should be as complete as possible, simple to implement, and require little human effort. As a step toward that goal, the authors have implemented simple analysis methods for use in an off-line situation. These methods use solution level to recognize major tank activities, such as tank-to-tank transfers and sampling. In this paper, the authors describe their application to realistic simulated data (the methods were developed by using both real and simulated data), and they present some quantifiable benefits of solution monitoring
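    The level-based activity recognition described above can be sketched as a change-point scan over a level time series: contiguous runs of large level changes are flagged as candidate transfers. The trace and threshold below are synthetic, not real solution monitoring data:

```python
def detect_transfers(levels, step_threshold=0.5):
    """Flag candidate tank transfers: contiguous runs where the level
    changes by more than `step_threshold` (level units) per sample.
    Returns a list of (start_index, end_index, net_change) tuples."""
    events = []
    i, n = 1, len(levels)
    while i < n:
        if abs(levels[i] - levels[i - 1]) > step_threshold:
            start = i - 1
            while i < n and abs(levels[i] - levels[i - 1]) > step_threshold:
                i += 1
            events.append((start, i - 1, levels[i - 1] - levels[start]))
        else:
            i += 1
    return events

# Synthetic level trace: steady, a receipt (+10), steady, a send (-6), steady:
trace = [50.0]*5 + [52.0, 54.0, 56.0, 58.0, 60.0] + [60.0]*5 \
        + [58.0, 56.0, 54.0] + [54.0]*4
events = detect_transfers(trace)
```

    In a real system, the sign and magnitude of the net change, combined with density and temperature snapshots, would distinguish receipts, sends, sampling, and evaporation.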

  9. A simulation of the FASTBUS protocols

    International Nuclear Information System (INIS)

    Booth, A.W.

    1981-01-01

    FASTBUS is a standard bus system being developed for high speed data acquisition and processing in the next generation of large scale physics experiments. Prototypes are being built according to a draft specification. The FASTBUS protocols have been simulated using a powerful software tool which is a computer description language. This Instruction Set Processor Specification language, ISPS, has been used in the design and development of several microprocessor systems. Its applications are diverse, including automated design and the generation of machine relative software, as well as simulation. The results of the FASTBUS simulation are presented, with an overview of the ISPS hardware description language. An additional facility is discussed, which supplements the simulation by providing a visual presentation of the FASTBUS signals, that is, a timing-graph generator. (orig.)

  10. A Description of the Ship Combat System Simulation

    Science.gov (United States)

    1984-09-01

    developed by NWC, NOSC, NSWC, and CACI, Inc. SCSS is supported by naval and industrial laboratories throughout the country. Any user (such as a contractor), Navy or private industry, can obtain the simulation by becoming a member of this Users' Group. Additional information regarding the

  11. Semiautomated analysis of optical coherence tomography crystalline lens images under simulated accommodation.

    Science.gov (United States)

    Kim, Eon; Ehrmann, Klaus; Uhlhorn, Stephen; Borja, David; Arrieta-Quintero, Esdras; Parel, Jean-Marie

    2011-05-01

    Presbyopia is an age related, gradual loss of accommodation, mainly due to changes in the crystalline lens. As part of research efforts to understand and cure this condition, ex vivo, cross-sectional optical coherence tomography images of crystalline lenses were obtained by using the Ex-Vivo Accommodation Simulator (EVAS II) instrument and analyzed to extract their physical and optical properties. Various filters and edge detection methods were applied to isolate the edge contour. An ellipse is fitted to the lens outline to obtain a central reference point for transforming the pixel data into the analysis coordinate system. This allows for the fitting of a high order equation to obtain a mathematical description of the edge contour, which obeys constraints of continuity as well as zero to infinite surface slopes from apex to equator. Geometrical parameters of the lens were determined for the lens images captured at different accommodative states. Various curve fitting functions were developed to mathematically describe the anterior and posterior surfaces of the lens. Their differences were evaluated and their suitability for extracting optical performance of the lens was assessed. The robustness of these algorithms was tested by analyzing the same images multiple times.

  12. Wind turbine blockset in Saber. General overview and description of the models

    DEFF Research Database (Denmark)

    Iov, Florin; Timbus, Adrian Vasile; Hansen, Anca Daniela

    This report presents a new developed Saber Toolbox for wind turbine applications. This toolbox has been developed during the research project “Simulation Platform to model, optimize and design wind turbines”. The report provides a quick overview of the Saber and then explains the structure...... of this simulation package, which is different than other tools e.g. Matlab/Simulink. Then the structure of the toolbox is shown as well as the description of the developed models. The main focus here is to underline the special structure of the models, which are a mixture of Saber built-in blocks and new developed...... blocks. Since the developed models are based on Saber built-in blocks, a description of the libraries from Saber is given. Then some simulation results using the developed models are shown. Finally some general conclusions regarding this new developed Toolbox as well as some directions for future work...

  13. Wind Turbine Blockset in Saber. General Overview and Description of the Model

    DEFF Research Database (Denmark)

    Iov, Florin; Timbus, Adrian Vasile; Hansen, A. D.

    This report presents a new developed Saber Toolbox for wind turbine applications. This toolbox has been developed during the research project “Simulation Platform to model, optimize and design wind turbines”. The report provides a quick overview of the Saber and then explains the structure...... of this simulation package, which is different than other tools e.g. Matlab/Simulink. Then the structure of the toolbox is shown as well as the description of the developed models. The main focus here is to underline the special structure of the models, which are a mixture of Saber built-in blocks and new developed...... blocks. Since the developed models are based on Saber built-in blocks, a description of the libraries from Saber is given. Then some simulation results using the developed models are shown. Finally some general conclusions regarding this new developed Toolbox as well as some directions for future work...

  14. Isentropic Analysis of a Simulated Hurricane

    Science.gov (United States)

    Mrowiec, Agnieszka A.; Pauluis, Olivier; Zhang, Fuqing

    2016-01-01

    Hurricanes, like many other atmospheric flows, are associated with turbulent motions over a wide range of scales. Here the authors adapt a new technique based on the isentropic analysis of convective motions to study the thermodynamic structure of the overturning circulation in hurricane simulations. This approach separates the vertical mass transport in terms of the equivalent potential temperature of air parcels. In doing so, one separates the rising air parcels at high entropy from the subsiding air at low entropy. This technique filters out oscillatory motions associated with gravity waves and separates convective overturning from the secondary circulation. This approach is applied here to study the flow of an idealized hurricane simulation with the Weather Research and Forecasting (WRF) Model. The isentropic circulation for a hurricane exhibits similar characteristics to that of moist convection, with a maximum mass transport near the surface associated with shallow convection and entrainment. There are also important differences. For instance, ascent in the eyewall can be readily identified in the isentropic analysis as an upward mass flux of air with unusually high equivalent potential temperature. The isentropic circulation is further compared here to the Eulerian secondary circulation of the simulated hurricane to show that the mass transport in the isentropic circulation is much larger than that in the secondary circulation. This difference can be directly attributed to the mass transport by convection in the outer rainband and confirms that, even for a strongly organized flow like a hurricane, most of the atmospheric overturning is tied to the smaller scales.
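    The central operation, sorting vertical mass transport by equivalent potential temperature rather than by horizontal position, reduces to a binning step. The parcels below are idealised (theta-e in K, mass flux per unit area) values chosen for illustration:

```python
def isentropic_mass_flux(parcels, bin_width=5.0):
    """Accumulate parcel vertical mass fluxes into equivalent potential
    temperature bins, as in isentropic analysis of convective overturning.
    `parcels` is an iterable of (theta_e, mass_flux) pairs."""
    bins = {}
    for theta_e, mass_flux in parcels:
        key = int(theta_e // bin_width) * bin_width   # lower edge of the bin
        bins[key] = bins.get(key, 0.0) + mass_flux
    return dict(sorted(bins.items()))

# Idealised parcels: warm, moist updrafts and cooler, drier downdrafts:
parcels = [
    (352.0, +0.8), (354.0, +1.1), (351.0, +0.6),   # high theta-e ascent
    (338.0, -0.5), (336.0, -0.7), (339.0, -0.4),   # low theta-e descent
    (345.0, +0.1), (344.0, -0.2),                  # weak mid-range motions
]
flux = isentropic_mass_flux(parcels)
```

    Plotting such binned fluxes against height gives the isentropic streamfunction: net upward transport concentrated at high theta-e (the eyewall signature) and net subsidence at low theta-e, while gravity-wave oscillations, which move mass up and down at the same theta-e, cancel within each bin.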

  15. Design and realization of simulators

    International Nuclear Information System (INIS)

    Mathey, C.

    1984-01-01

    The two main categories of simulators are training simulators, whose aim is the education of nuclear power plant operators, and study simulators. The French fleet of simulators is reviewed, along with their fields of use. Simulator design is then addressed: general description, calculation tools, middleware and programming, mathematical models and numerical methods. The instructor stations of EDF's simulators are described in particular detail. The realization of a simulator comprises two main stages: hardware development and software development [fr

  16. Development and simulation of various methods for neutron activation analysis

    International Nuclear Information System (INIS)

    Otgooloi, B.

    1993-01-01

    Simple methods for neutron activation analysis have been developed. Results are presented for an installation that determines fluorine in fluorite ores directly on the lorry by fast neutron activation analysis. Nitrogen in organic materials was determined via 14 N and 15 N activation. The new 'FLUORITE' equipment for the fluorite factory is briefly described. Pu and Be isotopes in organic materials, including wheat, were measured. 25 figs, 19 tabs. (Author, Translated by J.U)

  17. Accident simulator development for probabilistic safety analysis

    International Nuclear Information System (INIS)

    Cacciabue, P.C.; Amendola, A.; Mancini, G.

    1985-01-01

    This paper describes the basic features of a new concept of incident simulator, the Response System Analyzer (RSA), which is being developed within the CEC JRC Research Program on Reactor Safety. Focusing on somewhat different aims than existing simulators, RSA development extends the field of application of simulators to the area of risk and reliability analysis, and in particular to the identification of relevant sequences, to the modeling of human behavior, and to the validation of operating procedures. The fundamental components of the project, i.e. the deterministic transient model of the plant, the automatic probabilistic driver, and the modeling of possible human interventions, are discussed in connection with the problem of their dynamic interaction. The analyses so far performed by separately testing RSA on significant study cases have shown encouraging results and have proven the feasibility of the overall program

  18. Audio Description as a Pedagogical Tool

    Directory of Open Access Journals (Sweden)

    Georgina Kleege

    2015-05-01

    Full Text Available Audio description is the process of translating visual information into words for people who are blind or have low vision. Typically such description has focused on films, museum exhibitions, images and video on the internet, and live theater. Because it allows people with visual impairments to experience a variety of cultural and educational texts that would otherwise be inaccessible, audio description is a mandated aspect of disability inclusion, although it remains markedly underdeveloped and underutilized in our classrooms and in society in general. Along with increasing awareness of disability, audio description pushes students to practice close reading of visual material, deepen their analysis, and engage in critical discussions around the methodology, standards and values, language, and role of interpretation in a variety of academic disciplines. We outline a few pedagogical interventions that can be customized to different contexts to develop students' writing and critical thinking skills through guided description of visual material.

  19. Token Economy: A Systematic Review of Procedural Descriptions.

    Science.gov (United States)

    Ivy, Jonathan W; Meindl, James N; Overley, Eric; Robson, Kristen M

    2017-09-01

    The token economy is a well-established and widely used behavioral intervention. A token economy comprises six procedural components: the target response(s), a token that functions as a conditioned reinforcer, backup reinforcers, and three interconnected schedules of reinforcement. Despite decades of applied research, the extent to which the procedures of a token economy are described in complete and replicable detail has not been evaluated. Given the inherent complexity of a token economy, an analysis of the procedural descriptions may benefit future token economy research and practice. Articles published between 2000 and 2015 that included implementation of a token economy within an applied setting were identified and reviewed with a focus on evaluating the thoroughness of procedural descriptions. The results show that token economy components are regularly omitted or described in vague terms. Of the articles included in this analysis, only 19% (18 of 96 articles reviewed) included replicable and complete descriptions of all primary components. Missing or vague component descriptions could negatively affect future research or applied practice. Recommendations are provided to improve component descriptions.

  20. Data collection on the unit control room simulator as a method of operator reliability analysis

    International Nuclear Information System (INIS)

    Holy, J.

    1998-01-01

    The report consists of the following chapters: (1) Probabilistic assessment of nuclear power plant operation safety and human factor reliability analysis; (2) Simulators and simulations as human reliability analysis tools; (3) DOE project for using the collection and analysis of data from the unit control room simulator in human factor reliability analysis at the Paks nuclear power plant; (4) General requirements for the organization of the simulator data collection project; (5) Full-scale simulator at the Nuclear Power Plants Research Institute in Trnava, Slovakia, used as a training means for operators of the Dukovany NPP; (6) Assessment of the feasibility of quantification of important human actions modelled within a PSA study by employing simulator data analysis; (7) Assessment of the feasibility of using the various exercise topics for the quantification of the PSA model; (8) Assessment of the feasibility of employing the simulator in the analysis of the individual factors affecting the operator's activity; and (9) Examples of application of statistical methods in the analysis of the human reliability factor. (P.A.)

  1. The SocioEconomic Analysis of Repository Siting (SEARS): Technical description: Final draft

    International Nuclear Information System (INIS)

    1984-11-01

    Socioeconomic impacts must be assessed both for the near term and for the future. One means of addressing the need for the assessment of such impacts has been through the development of the computerized socioeconomic assessment model called the SocioEconomic Analysis of Repository Siting (SEARS) model. The SEARS model was developed for the Battelle Project Management Division. It was refined and adapted from state-of-the-art computerized projection models and thoroughly validated and is now available for use in projecting the likely socioeconomic impacts of a repository facility. This Technical Description is one of six major products that describe the SEARS modeling system. 61 refs., 11 figs., 9 tabs

  2. Research on neutron noise analysis stochastic simulation method for α calculation

    International Nuclear Information System (INIS)

    Zhong Bin; Shen Huayun; She Ruogu; Zhu Shengdong; Xiao Gang

    2014-01-01

    The prompt decay constant α has significant applications in the physical design and safety analysis of nuclear facilities. To overcome the difficulty of α calculation with the Monte Carlo method and to improve the precision, a new method based on neutron noise analysis technology is presented. This method employs stochastic simulation and the theory of neutron noise analysis. Firstly, the evolution of the stochastic neutron population was simulated by a discrete-event Monte Carlo method based on the theory of generalized semi-Markov processes, and the neutron noise in detectors was extracted from the neutron signal. Secondly, neutron noise analysis methods such as the Rossi-α method, the Feynman-α method, the zero-probability method, and the cross-correlation method were used to calculate the α value. All of the parameters used in the neutron noise analysis methods were calculated with an auto-adaptive algorithm. The α values from these methods agree with each other; the largest relative deviation is 7.9%, which proves the feasibility of α calculation based on neutron noise analysis stochastic simulation. (authors)
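
    As a rough illustration of one of the listed techniques, the Feynman-α (variance-to-mean) method estimates α from how the excess variance of detector counts grows with the counting-gate width, Y(T) = Y∞[1 − (1 − e^{−αT})/(αT)]. The sketch below is not the authors' discrete-event code: it generates surrogate correlated detection times with a simple cluster process and computes Y(T); all rates and parameters are assumed for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    alpha = 50.0     # assumed prompt decay constant [1/s]
    T_total = 200.0  # measurement time [s]

    # Cluster-process surrogate for fission chains: Poisson source events,
    # each spawning a random number of detections with exp(alpha) delays.
    n_src = rng.poisson(1000.0 * T_total)
    src_t = rng.uniform(0.0, T_total, n_src)
    mult = rng.poisson(2.0, n_src)  # detections per chain
    det_t = np.repeat(src_t, mult) + rng.exponential(1.0 / alpha, mult.sum())
    det_t = np.sort(det_t[det_t < T_total])

    def feynman_Y(times, gate):
        """Variance-to-mean minus one of counts in consecutive gates."""
        counts = np.bincount((times / gate).astype(int))
        return counts.var() / counts.mean() - 1.0

    gates = np.array([0.001, 0.002, 0.005, 0.01, 0.02, 0.05, 0.1])
    Y = np.array([feynman_Y(det_t, T) for T in gates])
    # Y grows toward its asymptote as 1 - (1 - exp(-alpha*T)) / (alpha*T),
    # so fitting this curve to the measured Y(T) yields alpha.
    print(np.round(Y, 3))
    ```

    In practice α would be obtained by least-squares fitting the Feynman curve to the measured Y(T) points.
    
    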

  3. Intelligent simulations for on-line transient analysis

    International Nuclear Information System (INIS)

    Hassberger, J.A.; Lee, J.C.

    1987-01-01

    A unique combination of simulation, parameter estimation, and expert systems technology is applied to the problem of diagnosing nuclear power plant transients. Knowledge-based reasoning is used to monitor plant data and hypothesize about the status of the plant. Fuzzy logic is employed as the inferencing mechanism, and an implication scheme based on observations is developed and employed to handle scenarios involving competing failures. Hypothesis testing is performed by simulating the behavior of faulted components using numerical models. A filter has been developed for systematically adjusting key model parameters to force agreement between simulations and actual plant data. Pattern recognition is employed as a decision analysis technique for choosing among several hypotheses based on simulation results. An artificial intelligence framework based on a critical-functions approach is used to deal with the complexity of a nuclear plant system. Detailed simulation results of various nuclear power plant accident scenarios are presented to demonstrate the performance and robustness properties of the diagnostic algorithm developed. The system is shown to be successful in diagnosing and identifying fault parameters for a normal reactor scram, loss-of-feedwater (LOFW), and small loss-of-coolant (LOCA) transients occurring together in a scenario similar to the accident at Three Mile Island

  4. Management of Industrial Performance Indicators: Regression Analysis and Simulation

    Directory of Open Access Journals (Sweden)

    Walter Roberto Hernandez Vergara

    2017-11-01

    Full Text Available Stochastic methods can be used in problem solving and in the explanation of natural phenomena through the application of statistical procedures. The article aims to combine regression analysis and systems simulation in order to facilitate the practical understanding of data analysis. The algorithms were developed in Microsoft Office Excel software, using statistical techniques such as regression theory, ANOVA, and Cholesky factorization, which made it possible to create models of single and multiple systems with up to five independent variables. For the analysis of these models, Monte Carlo simulation and analysis of industrial performance indicators were used, resulting in numerical indices intended to improve the management of goal-compliance indicators by identifying system instability, correlation, and anomalies. The analytical models presented in the survey indicated satisfactory results with numerous possibilities for industrial and academic applications, as well as the potential for deployment in new analytical techniques.
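
    A minimal numpy sketch of the combination described above: an ordinary least-squares fit of a performance indicator, followed by Monte Carlo simulation of correlated input scenarios generated via Cholesky factorization of an assumed covariance matrix. The data, covariance, and goal threshold are all illustrative, not taken from the article.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Hypothetical historical data: a KPI y driven by two factors.
    X = rng.normal(size=(200, 2))
    y = 5.0 + 1.5 * X[:, 0] - 0.8 * X[:, 1] + rng.normal(0.0, 0.5, 200)

    # Ordinary least-squares fit of y = b0 + b1*x1 + b2*x2.
    A = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)

    # Monte Carlo: draw correlated factor scenarios via the Cholesky factor
    # of an assumed covariance matrix, then propagate through the model.
    cov = np.array([[1.0, 0.6], [0.6, 1.0]])
    L = np.linalg.cholesky(cov)
    scenarios = rng.normal(size=(10_000, 2)) @ L.T
    y_sim = beta[0] + scenarios @ beta[1:]

    target = 4.0
    p_meet = (y_sim >= target).mean()  # share of scenarios meeting the goal
    print(round(p_meet, 3))
    ```

    The same steps map directly onto spreadsheet formulas (LINEST, a Cholesky block, and a data table of random draws), which is essentially what the article implements in Excel.
    
    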

  5. Dealing with behavioral ambiguity in textual process descriptions

    NARCIS (Netherlands)

    van der Aa, Han; Leopold, Henrik; Reijers, Hajo A.

    2016-01-01

    Textual process descriptions are widely used in organizations since they can be created and understood by virtually everyone. The inherent ambiguity of natural language, however, impedes the automated analysis of textual process descriptions. While human readers can use their context knowledge to

  6. Summary description of the Fast Flux Test Facility

    International Nuclear Information System (INIS)

    Cabell, C.P.

    1980-12-01

    This document has been compiled and issued to provide an illustrated engineering summary description of the FFTF. The document is limited to a description of the plant and its functions, and does not cover the extensive associated programs that have been carried out in the fields of design, design analysis, safety analysis, fuels development, equipment development and testing, quality assurance, equipment fabrication, plant construction, acceptance testing, operations planning and training, and the like

  7. Learning motivation and student achievement : description analysis and relationships both

    Directory of Open Access Journals (Sweden)

    Ari Riswanto

    2017-03-01

    Full Text Available Education is very important for humans; through education, societies throughout the world increasingly flourish. However, in learning activities, not a few students have little motivation to learn. This results in a less than optimal learning process and, in turn, affects student achievement. This study focuses on matters relating to learning motivation and student achievement, with the aim of reinforcing the importance of motivation in the learning process and clarifying its relationship with student achievement. The method used is descriptive analysis and simple correlation, applied to 97 students taking introductory courses in Microeconomics and Indonesian. The study concludes that students perform well when they are well motivated, and that the relationship between learning motivation and achievement differs between the two courses.

  8. Simulation Experiments in Practice : Statistical Design and Regression Analysis

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2007-01-01

    In practice, simulation analysts often change only one factor at a time, and use graphical analysis of the resulting Input/Output (I/O) data. Statistical theory proves that more information is obtained when applying Design Of Experiments (DOE) and linear regression analysis. Unfortunately, classic

  9. Macroscopic simulation of isotropic permanent magnets

    International Nuclear Information System (INIS)

    Bruckner, Florian; Abert, Claas; Vogler, Christoph; Heinrichs, Frank; Satz, Armin; Ausserlechner, Udo; Binder, Gernot; Koeck, Helmut; Suess, Dieter

    2016-01-01

    Accurate simulations of isotropic permanent magnets require to take the magnetization process into account and consider the anisotropic, nonlinear, and hysteretic material behaviour near the saturation configuration. An efficient method for the solution of the magnetostatic Maxwell equations including the description of isotropic permanent magnets is presented. The algorithm can easily be implemented on top of existing finite element methods and does not require a full characterization of the hysteresis of the magnetic material. Strayfield measurements of an isotropic permanent magnet and simulation results are in good agreement and highlight the importance of a proper description of the isotropic material. - Highlights: • Simulations of isotropic permanent magnets. • Accurate calculation of remanence magnetization and strayfield. • Comparison with strayfield measurements and anisotropic magnet simulations. • Efficient 3D FEM–BEM coupling for solution of Maxwell equations.

  10. Subset simulation for structural reliability sensitivity analysis

    International Nuclear Information System (INIS)

    Song Shufang; Lu Zhenzhou; Qiao Hongwei

    2009-01-01

    Based on two procedures for efficiently generating conditional samples, i.e. Markov chain Monte Carlo (MCMC) simulation and importance sampling (IS), two reliability sensitivity (RS) algorithms are presented. On the basis of the reliability analysis of subset simulation (Subsim), the RS of the failure probability with respect to the distribution parameter of the basic variable is transformed into a set of RS of conditional failure probabilities with respect to the distribution parameter of the basic variable. By use of the conditional samples generated by MCMC simulation and IS, procedures are established to estimate the RS of the conditional failure probabilities. The formulae of the RS estimator, its variance, and its coefficient of variation are derived in detail. The results of the illustrations show the high efficiency and high precision of the presented algorithms, which are suitable for highly nonlinear limit state equations and structural systems with single and multiple failure modes
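
    For orientation, a bare-bones subset simulation (Subsim) estimator of a small failure probability is sketched below, using component-wise modified Metropolis sampling in standard normal space. It computes only the failure probability, not the sensitivities derived in the paper; the limit state, sample sizes, and conditional probability p0 = 0.1 are assumed for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    def g(x):
        """Limit state: failure when g(x) <= 0 (exact Pf = Phi(-3) ~ 1.35e-3)."""
        return 3.0 - x.sum(axis=-1) / np.sqrt(x.shape[-1])

    def subset_simulation(g, ndim=10, n=2000, p0=0.1, max_levels=10):
        x = rng.normal(size=(n, ndim))
        gx = g(x)
        pf = 1.0
        for _ in range(max_levels):
            c = np.quantile(gx, p0)   # intermediate threshold
            if c <= 0.0:              # failure domain reached
                break
            pf *= p0
            seeds = x[gx <= c]        # conditional samples become seeds
            reps = int(np.ceil(n / len(seeds)))
            cur = np.repeat(seeds, reps, axis=0)[:n]
            # Modified Metropolis: component-wise moves against the standard
            # normal, rejecting candidates that leave the subset {g <= c}.
            for _ in range(5):
                prop = cur + rng.uniform(-1.0, 1.0, size=cur.shape)
                ratio = np.exp(0.5 * (cur**2 - prop**2))
                reject = rng.random(cur.shape) >= np.minimum(1.0, ratio)
                prop[reject] = cur[reject]
                ok = g(prop) <= c
                cur[ok] = prop[ok]
            x, gx = cur, g(cur)
        return pf * (gx <= 0.0).mean()

    pf = subset_simulation(g)
    print(pf)  # roughly Phi(-3), i.e. order 1e-3
    ```

    The paper's RS algorithms reuse exactly these conditional samples, differentiating each conditional failure probability with respect to the distribution parameters.
    
    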

  11. The compact simulator for Tihange nuclear plant

    International Nuclear Information System (INIS)

    Gueben, M.

    1982-01-01

    After an introduction about the simulators for nuclear plants, a description is given of the compact simulator for the Tihange nuclear power plant as well as the simulated circuits and equipments such as the primary and secondary coolant circuits. The extent of simulation, the functions used by the instructor, the use of the simulator, the formation programme and construction planning are described. (AF)

  12. The simulation for the ATLAS experiment Present status and outlook

    CERN Document Server

    Rimoldi, A; Gallas, M; Nairz, A; Boudreau, J; Tsulaia, V; Costanzo, D

    2004-01-01

    The simulation program for the ATLAS experiment is presently operational in a full OO environment. This important physics application has been successfully integrated into ATLAS's common analysis framework, ATHENA. In the last year, following a well-stated strategy of transition from a GEANT3- to a GEANT4-based simulation, a careful validation programme confirmed the reliability, performance, and robustness of this new tool, as well as the consistency of its results with those of the previous simulation. The generation, simulation, and digitization steps were tested for performance on different sets of full physics events. The same software used to simulate the full ATLAS detector is also used with testbeam configurations. Comparisons to real data in the testbeam validate both the detector description and the physics processes within each subcomponent. In this paper we present the current status of the ATLAS GEANT4 simulation, describe the functionality tests performed during its validation phase, and the experience with distrib...

  13. Fast simulation of the trigger system of the ATLAS detector at LHC

    International Nuclear Information System (INIS)

    Epp, B.; Ghete, V.M.; Kuhn, D.; Zhang, Y.J.

    2004-01-01

    The trigger system of the ATLAS detector aims to maximize the physics coverage and to be open to new and possibly unforeseen physics signatures. It is a multi-level system, composed of a hardware trigger at level-1 followed by the high-level trigger (level-2 and event filter). In order to understand its performance, to optimize it, and to reduce its total cost, the trigger system requires a detailed simulation which is time- and resource-consuming. An alternative to the full detector simulation is a so-called 'fast simulation', which starts the analysis from particle level and replaces the full detector simulation and the detailed particle tracking with parametrized distributions obtained from the full simulation and/or a simplified detector geometry. The fast simulation offers a less precise description of trigger performance, but it is faster and less resource-consuming. (author)

  14. Comparative Analysis of Disruption Tolerant Network Routing Simulations in the One and NS-3

    Science.gov (United States)

    2017-12-01

    Naval Postgraduate School, Monterey, California. Thesis, 03-23-2016 to 12-15-2017. The added levels of simulation increase the processing required by a simulation; ns-3's simulation of other layers of the network stack permits...

  15. Discrete-Event Simulation in Chemical Engineering.

    Science.gov (United States)

    Schultheisz, Daniel; Sommerfeld, Jude T.

    1988-01-01

    Gives examples, descriptions, and uses for various types of simulation systems, including the Flowtran, Process, Aspen Plus, Design II, GPSS, Simula, and Simscript. Explains similarities in simulators, terminology, and a batch chemical process. Tables and diagrams are included. (RT)

  16. Dynamical twisted mass fermions with light quarks. Simulation and analysis details

    Energy Technology Data Exchange (ETDEWEB)

    Boucaud, P. [Paris-11 Univ., 91 - Orsay (France). Lab. de Physique Theorique; Dimopoulos, P. [Rome-2 Univ. (Italy). Dipt. di Fisica; Farchioni, F. [Muenster Univ. (DE). Inst. fuer Theoretische Physik] (and others)

    2008-03-15

    In a recent paper (2007) we presented precise lattice QCD results of our European Twisted Mass Collaboration (ETMC). They were obtained by employing two mass-degenerate flavours of twisted mass fermions at maximal twist. In the present paper we give details on our simulations and the computation of physical observables. In particular, we discuss the problem of tuning to maximal twist, the techniques we have used to compute correlators and error estimates. In addition, we provide more information on the algorithm used, the autocorrelation times and scale determination, the evaluation of disconnected contributions and the description of our data by means of chiral perturbation theory formulae. (orig.)

  17. Dynamical twisted mass fermions with light quarks. Simulation and analysis details

    International Nuclear Information System (INIS)

    Boucaud, P.; Dimopoulos, P.; Farchioni, F.

    2008-03-01

    In a recent paper (2007) we presented precise lattice QCD results of our European Twisted Mass Collaboration (ETMC). They were obtained by employing two mass-degenerate flavours of twisted mass fermions at maximal twist. In the present paper we give details on our simulations and the computation of physical observables. In particular, we discuss the problem of tuning to maximal twist, the techniques we have used to compute correlators and error estimates. In addition, we provide more information on the algorithm used, the autocorrelation times and scale determination, the evaluation of disconnected contributions and the description of our data by means of chiral perturbation theory formulae. (orig.)

  18. Digital Simulation-Based Training: A Meta-Analysis

    Science.gov (United States)

    Gegenfurtner, Andreas; Quesada-Pallarès, Carla; Knogler, Maximilian

    2014-01-01

    This study examines how design characteristics in digital simulation-based learning environments moderate self-efficacy and transfer of learning. Drawing on social cognitive theory and the cognitive theory of multimedia learning, the meta-analysis psychometrically cumulated k = 15 studies of 25 years of research with a total sample size of…

  19. Ten Years of Simulation in Healthcare: A Thematic Analysis of Editorials.

    Science.gov (United States)

    Nestel, Debra

    2017-10-01

    In this commentary, I review 38 articles published as editorials in Simulation in Healthcare from inception to April 2016. Of the 27 authors, there was a predominance of medical doctors (63%), male authors (67%), and work originating in the United States (86%). The founding Editor-in-Chief Dr David Gaba contributed to half of the editorials. Using inductive thematic analysis, the following five themes were identified: "embedding" simulation, simulation responding to clinical practice, educational considerations for simulation, research practices, and communicating leadership and scholarship about the community. After thematic analysis, the theoretical notion of communities of practice was used to make further meaning of the themes. This theorizing process reveals that editorial content aligns with the features of an evolving community of practice. The editorials seem to have responded to and shaped contemporary simulation practices. The editorial is a powerful forum in which to frame issues relevant to the healthcare simulation community. As the founding Editor-in-Chief, Gaba has made an extraordinary contribution to the Society for Simulation in Healthcare, in these editorials and the broader healthcare simulation community. Under the leadership of the Editor-in-Chief, Dr Mark Scerbo, I am confident that the editorial voice will continue in the true spirit of scholarship.

  20. Comparison of scale analysis and numerical simulation for saturated zone convective mixing processes

    International Nuclear Information System (INIS)

    Oldenburg, C.M.

    1998-01-01

    Scale analysis can be used to predict a variety of quantities arising from natural systems where processes are described by partial differential equations. For example, scale analysis can be applied to estimate the effectiveness of convective mixing on the dilution of contaminants in groundwater. Scale analysis involves substituting simple quotients for partial derivatives and identifying and equating the dominant terms in an order-of-magnitude sense. For free convection due to sidewall heating of saturated porous media, scale analysis shows that the vertical convective velocity in the thermal boundary layer region is proportional to the Rayleigh number, the horizontal convective velocity is proportional to the square root of the Rayleigh number, and the thermal boundary layer thickness is proportional to the inverse square root of the Rayleigh number. These scale analysis estimates are corroborated by numerical simulations of an idealized system. A scale analysis estimate of the mixing time for a tracer mixing by hydrodynamic dispersion in a convection cell also agrees well with numerical simulation for two different Rayleigh numbers. Scale analysis for the heating-from-below scenario produces estimates of maximum velocity one-half as large as in the sidewall case. At small values of the Rayleigh number, this estimate is confirmed by numerical simulation. For larger Rayleigh numbers, simulation results suggest maximum velocities are similar to those of the sidewall heating scenario. In general, the agreement between scale analysis estimates and numerical simulation results serves to validate the method of scale analysis. The application is to radioactive waste repositories
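
    The three sidewall-heating scalings quoted above can be written as v_z ~ (κ/H)·Ra, v_x ~ (κ/H)·Ra^(1/2), and δ ~ H·Ra^(−1/2); a short script makes the order-of-magnitude behaviour concrete (H, κ, and the Rayleigh numbers are arbitrary illustrative values, not from the study):

    ```python
    import numpy as np

    # Generic porous-layer parameters: H is domain height [m], kappa is
    # thermal diffusivity [m^2/s]; both are assumed for the sketch.
    H, kappa = 10.0, 1e-6

    for Ra in (100.0, 400.0):
        v_vert = (kappa / H) * Ra             # vertical velocity   ~ Ra
        v_horiz = (kappa / H) * np.sqrt(Ra)   # horizontal velocity ~ Ra^(1/2)
        delta = H / np.sqrt(Ra)               # boundary-layer width ~ Ra^(-1/2)
        print(f"Ra={Ra:6.0f}  v_vert={v_vert:.2e} m/s  "
              f"v_horiz={v_horiz:.2e} m/s  delta={delta:.2f} m")

    # Quadrupling Ra quadruples the vertical velocity but only doubles the
    # horizontal velocity and halves the boundary-layer thickness.
    ```
    
    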

  1. Implementation of Grid-computing Framework for Simulation in Multi-scale Structural Analysis

    Directory of Open Access Journals (Sweden)

    Data Iranata

    2010-05-01

    Full Text Available A new grid-computing framework for simulation in multi-scale structural analysis is presented. Two levels of parallel processing are involved in this framework: multiple local distributed-computing environments connected by a local network form a grid-based cluster-to-cluster distributed computing environment. To perform the simulation, a large-scale structural system task is decomposed into the simulations of a simplified global model and several detailed component models at various scales. These correlated multi-scale structural system tasks are distributed among clusters, connected together in a multi-level hierarchy, and then coordinated over the internet. The software framework supporting the multi-scale structural simulation approach is also presented. The program architecture design allows the integration of several multi-scale models as clients and servers under a single platform. To check its feasibility, a prototype software system has been designed and implemented to realize the proposed concept. The simulation results show that the software framework can increase the speedup performance of the structural analysis. Based on this result, the proposed grid-computing framework is suitable for performing the simulation of multi-scale structural analysis.

  2. PRANAS: A New Platform for Retinal Analysis and Simulation

    Directory of Open Access Journals (Sweden)

    Bruno Cessac

    2017-09-01

    Full Text Available The retina encodes visual scenes as trains of action potentials that are sent to the brain via the optic nerve. In this paper, we describe new free-access end-user software that allows this coding to be better understood. It is called PRANAS (https://pranas.inria.fr), standing for Platform for Retinal ANalysis And Simulation. PRANAS targets neuroscientists and modelers by providing a unique set of retina-related tools. PRANAS integrates a retina simulator allowing large-scale simulations while keeping strong biological plausibility, and a toolbox for the analysis of spike train population statistics. The statistical method (entropy maximization under constraints) takes into account both spatial and temporal correlations as constraints, allowing analysis of the effects of memory on statistics. PRANAS also integrates a tool for computing and representing in 3D (time-space) receptive fields. All these tools are accessible through a friendly graphical user interface. The most CPU-costly of them have been implemented to run in parallel.

  3. The ATLAS Simulation Infrastructure

    CERN Document Server

    Aad, G.; Abdallah, J.; Abdelalim, A.A.; Abdesselam, A.; Abdinov, O.; Abi, B.; Abolins, M.; Abramowicz, H.; Abreu, H.; Acharya, B.S.; Adams, D.L.; Addy, T.N.; Adelman, J.; Adorisio, C.; Adragna, P.; Adye, T.; Aefsky, S.; Aguilar-Saavedra, J.A.; Aharrouche, M.; Ahlen, S.P.; Ahles, F.; Ahmad, A.; Ahmed, H.; Ahsan, M.; Aielli, G.; Akdogan, T.; Akesson, T.P.A.; Akimoto, G.; Akimov, A.V.; Aktas, A.; Alam, M.S.; Alam, M.A.; Albrand, S.; Aleksa, M.; Aleksandrov, I.N.; Alexa, C.; Alexander, G.; Alexandre, G.; Alexopoulos, T.; Alhroob, M.; Aliev, M.; Alimonti, G.; Alison, J.; Aliyev, M.; Allport, P.P.; Allwood-Spiers, S.E.; Almond, J.; Aloisio, A.; Alon, R.; Alonso, A.; Alviggi, M.G.; Amako, K.; Amelung, C.; Amorim, A.; Amoros, G.; Amram, N.; Anastopoulos, C.; Andeen, T.; Anders, C.F.; Anderson, K.J.; Andreazza, A.; Andrei, V.; Anduaga, X.S.; Angerami, A.; Anghinolfi, F.; Anjos, N.; Annovi, A.; Antonaki, A.; Antonelli, M.; Antonelli, S.; Antos, J.; Antunovic, B.; Anulli, F.; Aoun, S.; Arabidze, G.; Aracena, I.; Arai, Y.; Arce, A.T.H.; Archambault, J.P.; Arfaoui, S.; Arguin, J-F.; Argyropoulos, T.; Arik, M.; Armbruster, A.J.; Arnaez, O.; Arnault, C.; Artamonov, A.; Arutinov, D.; Asai, M.; Asai, S.; Asfandiyarov, R.; Ask, S.; Asman, B.; Asner, D.; Asquith, L.; Assamagan, K.; Astbury, A.; Astvatsatourov, A.; Atoian, G.; Auerbach, B.; Augsten, K.; Aurousseau, M.; Austin, N.; Avolio, G.; Avramidou, R.; Axen, D.; Ay, C.; Azuelos, G.; Azuma, Y.; Baak, M.A.; Bach, A.M.; Bachacou, H.; Bachas, K.; Backes, M.; Badescu, E.; Bagnaia, P.; Bai, Y.; Bain, T.; Baines, J.T.; Baker, O.K.; Baker, M.D.; Baker, S; Baltasar Dos Santos Pedrosa, F.; Banas, E.; Banerjee, P.; Banerjee, S.; Banfi, D.; Bangert, A.; Bansal, V.; Baranov, S.P.; Baranov, S.; Barashkou, A.; Barber, T.; Barberio, E.L.; Barberis, D.; Barbero, M.; Bardin, D.Y.; Barillari, T.; Barisonzi, M.; Barklow, T.; Barlow, N.; Barnett, B.M.; Barnett, R.M.; Baroncelli, A.; Barr, A.J.; Barreiro, F.; Barreiro Guimaraes da Costa, J.; 
Barrillon, P.; Bartoldus, R.; Bartsch, D.; Bates, R.L.; Batkova, L.; Batley, J.R.; Battaglia, A.; Battistin, M.; Bauer, F.; Bawa, H.S.; Bazalova, M.; Beare, B.; Beau, T.; Beauchemin, P.H.; Beccherle, R.; Becerici, N.; Bechtle, P.; Beck, G.A.; Beck, H.P.; Beckingham, M.; Becks, K.H.; Beddall, A.J.; Beddall, A.; Bednyakov, V.A.; Bee, C.; Begel, M.; Behar Harpaz, S.; Behera, P.K.; Beimforde, M.; Belanger-Champagne, C.; Bell, P.J.; Bell, W.H.; Bella, G.; Bellagamba, L.; Bellina, F.; Bellomo, M.; Belloni, A.; Belotskiy, K.; Beltramello, O.; Ben Ami, S.; Benary, O.; Benchekroun, D.; Bendel, M.; Benedict, B.H.; Benekos, N.; Benhammou, Y.; Benincasa, G.P.; Benjamin, D.P.; Benoit, M.; Bensinger, J.R.; Benslama, K.; Bentvelsen, S.; Beretta, M.; Berge, D.; Bergeaas Kuutmann, E.; Berger, N.; Berghaus, F.; Berglund, E.; Beringer, J.; Bernat, P.; Bernhard, R.; Bernius, C.; Berry, T.; Bertin, A.; Besana, M.I.; Besson, N.; Bethke, S.; Bianchi, R.M.; Bianco, M.; Biebel, O.; Biesiada, J.; Biglietti, M.; Bilokon, H.; Bindi, M.; Binet, S.; Bingul, A.; Bini, C.; Biscarat, C.; Bitenc, U.; Black, K.M.; Blair, R.E.; Blanchard, J-B; Blanchot, G.; Blocker, C.; Blondel, A.; Blum, W.; Blumenschein, U.; Bobbink, G.J.; Bocci, A.; Boehler, M.; Boek, J.; Boelaert, N.; Boser, S.; Bogaerts, J.A.; Bogouch, A.; Bohm, C.; Bohm, J.; Boisvert, V.; Bold, T.; Boldea, V.; Bondarenko, V.G.; Bondioli, M.; Boonekamp, M.; Bordoni, S.; Borer, C.; Borisov, A.; Borissov, G.; Borjanovic, I.; Borroni, S.; Bos, K.; Boscherini, D.; Bosman, M.; Boterenbrood, H.; Bouchami, J.; Boudreau, J.; Bouhova-Thacker, E.V.; Boulahouache, C.; Bourdarios, C.; Boveia, A.; Boyd, J.; Boyko, I.R.; Bozovic-Jelisavcic, I.; Bracinik, J.; Braem, A.; Branchini, P.; Brandenburg, G.W.; Brandt, A.; Brandt, G.; Brandt, O.; Bratzler, U.; Brau, B.; Brau, J.E.; Braun, H.M.; Brelier, B.; Bremer, J.; Brenner, R.; Bressler, S.; Britton, D.; Brochu, F.M.; Brock, I.; Brock, R.; Brodet, E.; Bromberg, C.; Brooijmans, G.; Brooks, W.K.; Brown, G.; Bruckman 
de Renstrom, P.A.; Bruncko, D.; Bruneliere, R.; Brunet, S.; Bruni, A.; Bruni, G.; Bruschi, M.; Bucci, F.; Buchanan, J.; Buchholz, P.; Buckley, A.G.; Budagov, I.A.; Budick, B.; Buscher, V.; Bugge, L.; Bulekov, O.; Bunse, M.; Buran, T.; Burckhart, H.; Burdin, S.; Burgess, T.; Burke, S.; Busato, E.; Bussey, P.; Buszello, C.P.; Butin, F.; Butler, B.; Butler, J.M.; Buttar, C.M.; Butterworth, J.M.; Byatt, T.; Caballero, J.; Cabrera Urban, S.; Caforio, D.; Cakir, O.; Calafiura, P.; Calderini, G.; Calfayan, P.; Calkins, R.; Caloba, L.P.; Calvet, D.; Camarri, P.; Cameron, D.; Campana, S.; Campanelli, M.; Canale, V.; Canelli, F.; Canepa, A.; Cantero, J.; Capasso, L.; Capeans Garrido, M.D.M.; Caprini, I.; Caprini, M.; Capua, M.; Caputo, R.; Caramarcu, C.; Cardarelli, R.; Carli, T.; Carlino, G.; Carminati, L.; Caron, B.; Caron, S.; Carrillo Montoya, G.D.; Carron Montero, S.; Carter, A.A.; Carter, J.R.; Carvalho, J.; Casadei, D.; Casado, M.P.; Cascella, M.; Castaneda Hernandez, A.M.; Castaneda-Miranda, E.; Castillo Gimenez, V.; Castro, N.F.; Cataldi, G.; Catinaccio, A.; Catmore, J.R.; Cattai, A.; Cattani, G.; Caughron, S.; Cauz, D.; Cavalleri, P.; Cavalli, D.; Cavalli-Sforza, M.; Cavasinni, V.; Ceradini, F.; Cerqueira, A.S.; Cerri, A.; Cerrito, L.; Cerutti, F.; Cetin, S.A.; Chafaq, A.; Chakraborty, D.; Chan, K.; Chapman, J.D.; Chapman, J.W.; Chareyre, E.; Charlton, D.G.; Chavda, V.; Cheatham, S.; Chekanov, S.; Chekulaev, S.V.; Chelkov, G.A.; Chen, H.; Chen, S.; Chen, X.; Cheplakov, A.; Chepurnov, V.F.; Cherkaoui El Moursli, R.; Tcherniatine, V.; Chesneanu, D.; Cheu, E.; Cheung, S.L.; Chevalier, L.; Chevallier, F.; Chiarella, V.; Chiefari, G.; Chikovani, L.; Childers, J.T.; Chilingarov, A.; Chiodini, G.; Chizhov, V.; Choudalakis, G.; Chouridou, S.; Christidi, I.A.; Christov, A.; Chromek-Burckhart, D.; Chu, M.L.; Chudoba, J.; Ciapetti, G.; Ciftci, A.K.; Ciftci, R.; Cinca, D.; Cindro, V.; Ciobotaru, M.D.; Ciocca, C.; Ciocio, A.; Cirilli, M.; Citterio, M.; Clark, A.; Clark, P.J.; 
Cleland, W.; Clemens, J.C.; Clement, B.; Clement, C.; Coadou, Y.; Cobal, M.; Coccaro, A.; Cochran, J.; Coggeshall, J.; Cogneras, E.; Colijn, A.P.; Collard, C.; Collins, N.J.; Collins-Tooth, C.; Collot, J.; Colon, G.; Conde Muino, P.; Coniavitis, E.; Consonni, M.; Constantinescu, S.; Conta, C.; Conventi, F.; Cooke, M.; Cooper, B.D.; Cooper-Sarkar, A.M.; Cooper-Smith, N.J.; Copic, K.; Cornelissen, T.; Corradi, M.; Corriveau, F.; Corso-Radu, A.; Cortes-Gonzalez, A.; Cortiana, G.; Costa, G.; Costa, M.J.; Costanzo, D.; Costin, T.; Cote, D.; Coura Torres, R.; Courneyea, L.; Cowan, G.; Cowden, C.; Cox, B.E.; Cranmer, K.; Cranshaw, J.; Cristinziani, M.; Crosetti, G.; Crupi, R.; Crepe-Renaudin, S.; Cuenca Almenar, C.; Cuhadar Donszelmann, T.; Curatolo, M.; Curtis, C.J.; Cwetanski, P.; Czyczula, Z.; D'Auria, S.; D'Onofrio, M.; D'Orazio, A.; Da Via, C; Dabrowski, W.; Dai, T.; Dallapiccola, C.; Dallison, S.J.; Daly, C.H.; Dam, M.; Danielsson, H.O.; Dannheim, D.; Dao, V.; Darbo, G.; Darlea, G.L.; Davey, W.; Davidek, T.; Davidson, N.; Davidson, R.; Davies, M.; Davison, A.R.; Dawson, I.; Daya, R.K.; De, K.; de Asmundis, R.; De Castro, S.; De Castro Faria Salgado, P.E.; De Cecco, S.; de Graat, J.; De Groot, N.; de Jong, P.; De Mora, L.; De Oliveira Branco, M.; De Pedis, D.; De Salvo, A.; De Sanctis, U.; De Santo, A.; De Vivie De Regie, J.B.; De Zorzi, G.; Dean, S.; Dedovich, D.V.; Degenhardt, J.; Dehchar, M.; Del Papa, C.; Del Peso, J.; Del Prete, T.; Dell'Acqua, A.; Dell'Asta, L.; Della Pietra, M.; della Volpe, D.; Delmastro, M.; Delsart, P.A.; Deluca, C.; Demers, S.; Demichev, M.; Demirkoz, B.; Deng, J.; Deng, W.; Denisov, S.P.; Derkaoui, J.E.; Derue, F.; Dervan, P.; Desch, K.; Deviveiros, P.O.; Dewhurst, A.; DeWilde, B.; Dhaliwal, S.; Dhullipudi, R.; Di Ciaccio, A.; Di Ciaccio, L.; Di Domenico, A.; Di Girolamo, A.; Di Girolamo, B.; Di Luise, S.; Di Mattia, A.; Di Nardo, R.; Di Simone, A.; Di Sipio, R.; Diaz, M.A.; Diblen, F.; Diehl, E.B.; Dietrich, J.; Dietzsch, T.A.; Diglio, 
S.; Dindar Yagci, K.; Dingfelder, J.; Dionisi, C.; Dita, P.; Dita, S.; Dittus, F.; Djama, F.; Djilkibaev, R.; Djobava, T.; do Vale, M.A.B.; Do Valle Wemans, A.; Doan, T.K.O.; Dobos, D.; Dobson, E.; Dobson, M.; Doglioni, C.; Doherty, T.; Dolejsi, J.; Dolenc, I.; Dolezal, Z.; Dolgoshein, B.A.; Dohmae, T.; Donega, M.; Donini, J.; Dopke, J.; Doria, A.; Dos Anjos, A.; Dotti, A.; Dova, M.T.; Doxiadis, A.; Doyle, A.T.; Drasal, Z.; Dris, M.; Dubbert, J.; Duchovni, E.; Duckeck, G.; Dudarev, A.; Dudziak, F.; Duhrssen, M.; Duflot, L.; Dufour, M-A.; Dunford, M.; Duran Yildiz, H.; Dushkin, A.; Duxfield, R.; Dwuznik, M.; Duren, M.; Ebenstein, W.L.; Ebke, J.; Eckweiler, S.; Edmonds, K.; Edwards, C.A.; Egorov, K.; Ehrenfeld, W.; Ehrich, T.; Eifert, T.; Eigen, G.; Einsweiler, K.; Eisenhandler, E.; Ekelof, T.; El Kacimi, M.; Ellert, M.; Elles, S.; Ellinghaus, F.; Ellis, K.; Ellis, N.; Elmsheuser, J.; Elsing, M.; Emeliyanov, D.; Engelmann, R.; Engl, A.; Epp, B.; Eppig, A.; Erdmann, J.; Ereditato, A.; Eriksson, D.; Ermoline, I.; Ernst, J.; Ernst, M.; Ernwein, J.; Errede, D.; Errede, S.; Ertel, E.; Escalier, M.; Escobar, C.; Espinal Curull, X.; Esposito, B.; Etienvre, A.I.; Etzion, E.; Evans, H.; Fabbri, L.; Fabre, C.; Facius, K.; Fakhrutdinov, R.M.; Falciano, S.; Fang, Y.; Fanti, M.; Farbin, A.; Farilla, A.; Farley, J.; Farooque, T.; Farrington, S.M.; Farthouat, P.; Fassnacht, P.; Fassouliotis, D.; Fatholahzadeh, B.; Fayard, L.; Fayette, F.; Febbraro, R.; Federic, P.; Fedin, O.L.; Fedorko, W.; Feligioni, L.; Felzmann, C.U.; Feng, C.; Feng, E.J.; Fenyuk, A.B.; Ferencei, J.; Ferland, J.; Fernandes, B.; Fernando, W.; Ferrag, S.; Ferrando, J.; Ferrara, V.; Ferrari, A.; Ferrari, P.; Ferrari, R.; Ferrer, A.; Ferrer, M.L.; Ferrere, D.; Ferretti, C.; Fiascaris, M.; Fiedler, F.; Filipcic, A.; Filippas, A.; Filthaut, F.; Fincke-Keeler, M.; Fiolhais, M.C.N.; Fiorini, L.; Firan, A.; Fischer, G.; Fisher, M.J.; Flechl, M.; Fleck, I.; Fleckner, J.; Fleischmann, P.; Fleischmann, S.; Flick, T.; Flores 
Castillo, L.R.; Flowerdew, M.J.; Fonseca Martin, T.; Formica, A.; Forti, A.; Fortin, D.; Fournier, D.; Fowler, A.J.; Fowler, K.; Fox, H.; Francavilla, P.; Franchino, S.; Francis, D.; Franklin, M.; Franz, S.; Fraternali, M.; Fratina, S.; Freestone, J.; French, S.T.; Froeschl, R.; Froidevaux, D.; Frost, J.A.; Fukunaga, C.; Fullana Torregrosa, E.; Fuster, J.; Gabaldon, C.; Gabizon, O.; Gadfort, T.; Gadomski, S.; Gagliardi, G.; Gagnon, P.; Galea, C.; Gallas, E.J.; Gallo, V.; Gallop, B.J.; Gallus, P.; Galyaev, E.; Gan, K.K.; Gao, Y.S.; Gaponenko, A.; Garcia-Sciveres, M.; Garcia, C.; Garcia Navarro, J.E.; Gardner, R.W.; Garelli, N.; Garitaonandia, H.; Garonne, V.; Gatti, C.; Gaudio, G.; Gautard, V.; Gauzzi, P.; Gavrilenko, I.L.; Gay, C.; Gaycken, G.; Gazis, E.N.; Ge, P.; Gee, C.N.P.; Geich-Gimbel, Ch.; Gellerstedt, K.; Gemme, C.; Genest, M.H.; Gentile, S.; Georgatos, F.; George, S.; Gershon, A.; Ghazlane, H.; Ghodbane, N.; Giacobbe, B.; Giagu, S.; Giakoumopoulou, V.; Giangiobbe, V.; Gianotti, F.; Gibbard, B.; Gibson, A.; Gibson, S.M.; Gilbert, L.M.; Gilchriese, M.; Gilewsky, V.; Gingrich, D.M.; Ginzburg, J.; Giokaris, N.; Giordani, M.P.; Giordano, R.; Giorgi, F.M.; Giovannini, P.; Giraud, P.F.; Girtler, P.; Giugni, D.; Giusti, P.; Gjelsten, B.K.; Gladilin, L.K.; Glasman, C.; Glazov, A.; Glitza, K.W.; Glonti, G.L.; Godfrey, J.; Godlewski, J.; Goebel, M.; Gopfert, T.; Goeringer, C.; Gossling, C.; Gottfert, T.; Goggi, V.; Goldfarb, S.; Goldin, D.; Golling, T.; Gomes, A.; Gomez Fajardo, L.S.; Goncalo, R.; Gonella, L.; Gong, C.; Gonzalez de la Hoz, S.; Gonzalez Silva, M.L.; Gonzalez-Sevilla, S.; Goodson, J.J.; Goossens, L.; Gordon, H.A.; Gorelov, I.; Gorfine, G.; Gorini, B.; Gorini, E.; Gorisek, A.; Gornicki, E.; Gosdzik, B.; Gosselink, M.; Gostkin, M.I.; Gough Eschrich, I.; Gouighri, M.; Goujdami, D.; Goulette, M.P.; Goussiou, A.G.; Goy, C.; Grabowska-Bold, I.; Grafstrom, P.; Grahn, K-J.; Grancagnolo, S.; Grassi, V.; Gratchev, V.; Grau, N.; Gray, H.M.; Gray, J.A.; Graziani, 
E.; Green, B.; Greenshaw, T.; Greenwood, Z.D.; Gregor, I.M.; Grenier, P.; Griesmayer, E.; Griffiths, J.; Grigalashvili, N.; Grillo, A.A.; Grimm, K.; Grinstein, S.; Grishkevich, Y.V.; Groh, M.; Groll, M.; Gross, E.; Grosse-Knetter, J.; Groth-Jensen, J.; Grybel, K.; Guicheney, C.; Guida, A.; Guillemin, T.; Guler, H.; Gunther, J.; Guo, B.; Gupta, A.; Gusakov, Y.; Gutierrez, A.; Gutierrez, P.; Guttman, N.; Gutzwiller, O.; Guyot, C.; Gwenlan, C.; Gwilliam, C.B.; Haas, A.; Haas, S.; Haber, C.; Hadavand, H.K.; Hadley, D.R.; Haefner, P.; Hartel, R.; Hajduk, Z.; Hakobyan, H.; Haller, J.; Hamacher, K.; Hamilton, A.; Hamilton, S.; Han, L.; Hanagaki, K.; Hance, M.; Handel, C.; Hanke, P.; Hansen, J.R.; Hansen, J.B.; Hansen, J.D.; Hansen, P.H.; Hansl-Kozanecka, T.; Hansson, P.; Hara, K.; Hare, G.A.; Harenberg, T.; Harrington, R.D.; Harris, O.M.; Harrison, K; Hartert, J.; Hartjes, F.; Harvey, A.; Hasegawa, S.; Hasegawa, Y.; Hashemi, K.; Hassani, S.; Haug, S.; Hauschild, M.; Hauser, R.; Havranek, M.; Hawkes, C.M.; Hawkings, R.J.; Hayakawa, T.; Hayward, H.S.; Haywood, S.J.; Head, S.J.; Hedberg, V.; Heelan, L.; Heim, S.; Heinemann, B.; Heisterkamp, S.; Helary, L.; Heller, M.; Hellman, S.; Helsens, C.; Hemperek, T.; Henderson, R.C.W.; Henke, M.; Henrichs, A.; Henriques Correia, A.M.; Henrot-Versille, S.; Hensel, C.; Henss, T.; Hernandez Jimenez, Y.; Hershenhorn, A.D.; Herten, G.; Hertenberger, R.; Hervas, L.; Hessey, N.P.; Higon-Rodriguez, E.; Hill, J.C.; Hiller, K.H.; Hillert, S.; Hillier, S.J.; Hinchliffe, I.; Hines, E.; Hirose, M.; Hirsch, F.; Hirschbuehl, D.; Hobbs, J.; Hod, N.; Hodgkinson, M.C.; Hodgson, P.; Hoecker, A.; Hoeferkamp, M.R.; Hoffman, J.; Hoffmann, D.; Hohlfeld, M.; Holy, T.; Holzbauer, J.L.; Homma, Y.; Horazdovsky, T.; Hori, T.; Horn, C.; Horner, S.; Horvat, S.; Hostachy, J-Y.; Hou, S.; Hoummada, A.; Howe, T.; Hrivnac, J.; Hryn'ova, T.; Hsu, P.J.; Hsu, S.C.; Huang, G.S.; Hubacek, Z.; Hubaut, F.; Huegging, F.; Hughes, E.W.; Hughes, G.; Hurwitz, M.; Husemann, U.; 
Huseynov, N.; Huston, J.; Huth, J.; Iacobucci, G.; Iakovidis, G.; Ibragimov, I.; Iconomidou-Fayard, L.; Idarraga, J.; Iengo, P.; Igonkina, O.; Ikegami, Y.; Ikeno, M.; Ilchenko, Y.; Iliadis, D.; Ince, T.; Ioannou, P.; Iodice, M.; Irles Quiles, A.; Ishikawa, A.; Ishino, M.; Ishmukhametov, R.; Isobe, T.; Issakov, V.; Issever, C.; Istin, S.; Itoh, Y.; Ivashin, A.V.; Iwanski, W.; Iwasaki, H.; Izen, J.M.; Izzo, V.; Jackson, B.; Jackson, J.N.; Jackson, P.; Jaekel, M.R.; Jain, V.; Jakobs, K.; Jakobsen, S.; Jakubek, J.; Jana, D.K.; Jansen, E.; Jantsch, A.; Janus, M.; Jared, R.C.; Jarlskog, G.; Jeanty, L.; Jen-La Plante, I.; Jenni, P.; Jez, P.; Jezequel, S.; Ji, W.; Jia, J.; Jiang, Y.; Jimenez Belenguer, M.; Jin, S.; Jinnouchi, O.; Joffe, D.; Johansen, M.; Johansson, K.E.; Johansson, P.; Johnert, S; Johns, K.A.; Jon-And, K.; Jones, G.; Jones, R.W.L.; Jones, T.J.; Jorge, P.M.; Joseph, J.; Juranek, V.; Jussel, P.; Kabachenko, V.V.; Kaci, M.; Kaczmarska, A.; Kado, M.; Kagan, H.; Kagan, M.; Kaiser, S.; Kajomovitz, E.; Kalinin, S.; Kalinovskaya, L.V.; Kalinowski, A.; Kama, S.; Kanaya, N.; Kaneda, M.; Kantserov, V.A.; Kanzaki, J.; Kaplan, B.; Kapliy, A.; Kaplon, J.; Kar, D.; Karagounis, M.; Karagoz Unel, M.; Kartvelishvili, V.; Karyukhin, A.N.; Kashif, L.; Kasmi, A.; Kass, R.D.; Kastanas, A.; Kastoryano, M.; Kataoka, M.; Kataoka, Y.; Katsoufis, E.; Katzy, J.; Kaushik, V.; Kawagoe, K.; Kawamoto, T.; Kawamura, G.; Kayl, M.S.; Kayumov, F.; Kazanin, V.A.; Kazarinov, M.Y.; Keates, J.R.; Keeler, R.; Keener, P.T.; Kehoe, R.; Keil, M.; Kekelidze, G.D.; Kelly, M.; Kenyon, M.; Kepka, O.; Kerschen, N.; Kersevan, B.P.; Kersten, S.; Kessoku, K.; Khakzad, M.; Khalil-zada, F.; Khandanyan, H.; Khanov, A.; Kharchenko, D.; Khodinov, A.; Khomich, A.; Khoriauli, G.; Khovanskiy, N.; Khovanskiy, V.; Khramov, E.; Khubua, J.; Kim, H.; Kim, M.S.; Kim, P.C.; Kim, S.H.; Kind, O.; Kind, P.; King, B.T.; Kirk, J.; Kirsch, G.P.; Kirsch, L.E.; Kiryunin, A.E.; Kisielewska, D.; Kittelmann, T.; Kiyamura, H.; 
Kladiva, E.; Klein, M.; Klein, U.; Kleinknecht, K.; Klemetti, M.; Klier, A.; Klimentov, A.; Klingenberg, R.; Klinkby, E.B.; Klioutchnikova, T.; Klok, P.F.; Klous, S.; Kluge, E.E.; Kluge, T.; Kluit, P.; Klute, M.; Kluth, S.; Knecht, N.S.; Kneringer, E.; Ko, B.R.; Kobayashi, T.; Kobel, M.; Koblitz, B.; Kocian, M.; Kocnar, A.; Kodys, P.; Koneke, K.; Konig, A.C.; Koenig, S.; Kopke, L.; Koetsveld, F.; Koevesarki, P.; Koffas, T.; Koffeman, E.; Kohn, F.; Kohout, Z.; Kohriki, T.; Kolanoski, H.; Kolesnikov, V.; Koletsou, I.; Koll, J.; Kollar, D.; Kolos, S.; Kolya, S.D.; Komar, A.A.; Komaragiri, J.R.; Kondo, T.; Kono, T.; Konoplich, R.; Konovalov, S.P.; Konstantinidis, N.; Koperny, S.; Korcyl, K.; Kordas, K.; Korn, A.; Korolkov, I.; Korolkova, E.V.; Korotkov, V.A.; Kortner, O.; Kostka, P.; Kostyukhin, V.V.; Kotov, S.; Kotov, V.M.; Kotov, K.Y.; Kourkoumelis, C.; Koutsman, A.; Kowalewski, R.; Kowalski, H.; Kowalski, T.Z.; Kozanecki, W.; Kozhin, A.S.; Kral, V.; Kramarenko, V.A.; Kramberger, G.; Krasny, M.W.; Krasznahorkay, A.; Kreisel, A.; Krejci, F.; Kretzschmar, J.; Krieger, N.; Krieger, P.; Kroeninger, K.; Kroha, H.; Kroll, J.; Kroseberg, J.; Krstic, J.; Kruchonak, U.; Kruger, H.; Krumshteyn, Z.V.; Kubota, T.; Kuehn, S.; Kugel, A.; Kuhl, T.; Kuhn, D.; Kukhtin, V.; Kulchitsky, Y.; Kuleshov, S.; Kummer, C.; Kuna, M.; Kunkle, J.; Kupco, A.; Kurashige, H.; Kurata, M.; Kurchaninov, L.L.; Kurochkin, Y.A.; Kus, V.; Kwee, R.; La Rotonda, L.; Labbe, J.; Lacasta, C.; Lacava, F.; Lacker, H.; Lacour, D.; Lacuesta, V.R.; Ladygin, E.; Lafaye, R.; Laforge, B.; Lagouri, T.; Lai, S.; Lamanna, M.; Lampen, C.L.; Lampl, W.; Lancon, E.; Landgraf, U.; Landon, M.P.J.; Lane, J.L.; Lankford, A.J.; Lanni, F.; Lantzsch, K.; Lanza, A.; Laplace, S.; Lapoire, C.; Laporte, J.F.; Lari, T.; Larner, A.; Lassnig, M.; Laurelli, P.; Lavrijsen, W.; Laycock, P.; Lazarev, A.B.; Lazzaro, A.; Le Dortz, O.; Le Guirriec, E.; Le Menedeu, E.; Le Vine, M.; Lebedev, A.; Lebel, C.; LeCompte, T.; Ledroit-Guillon, F.; Lee, 
H.; Lee, J.S.H.; Lee, S.C.; Lefebvre, M.; Legendre, M.; LeGeyt, B.C.; Legger, F.; Leggett, C.; Lehmacher, M.; Lehmann Miotto, G.; Lei, X.; Leitner, R.; Lellouch, D.; Lellouch, J.; Lendermann, V.; Leney, K.J.C.; Lenz, T.; Lenzen, G.; Lenzi, B.; Leonhardt, K.; Leroy, C.; Lessard, J-R.; Lester, C.G.; Leung Fook Cheong, A.; Leveque, J.; Levin, D.; Levinson, L.J.; Leyton, M.; Li, H.; Li, S.; Li, X.; Liang, Z.; Liang, Z.; Liberti, B.; Lichard, P.; Lichtnecker, M.; Lie, K.; Liebig, W.; Lilley, J.N.; Lim, H.; Limosani, A.; Limper, M.; Lin, S.C.; Linnemann, J.T.; Lipeles, E.; Lipinsky, L.; Lipniacka, A.; Liss, T.M.; Lissauer, D.; Lister, A.; Litke, A.M.; Liu, C.; Liu, D.; Liu, H.; Liu, J.B.; Liu, M.; Liu, T.; Liu, Y.; Livan, M.; Lleres, A.; Lloyd, S.L.; Lobodzinska, E.; Loch, P.; Lockman, W.S.; Lockwitz, S.; Loddenkoetter, T.; Loebinger, F.K.; Loginov, A.; Loh, C.W.; Lohse, T.; Lohwasser, K.; Lokajicek, M.; Long, R.E.; Lopes, L.; Lopez Mateos, D.; Losada, M.; Loscutoff, P.; Lou, X.; Lounis, A.; Loureiro, K.F.; Lovas, L.; Love, J.; Love, P.A.; Lowe, A.J.; Lu, F.; Lubatti, H.J.; Luci, C.; Lucotte, A.; Ludwig, A.; Ludwig, D.; Ludwig, I.; Luehring, F.; Luisa, L.; Lumb, D.; Luminari, L.; Lund, E.; Lund-Jensen, B.; Lundberg, B.; Lundberg, J.; Lundquist, J.; Lynn, D.; Lys, J.; Lytken, E.; Ma, H.; Ma, L.L.; Macana Goia, J.A.; Maccarrone, G.; Macchiolo, A.; Macek, B.; Machado Miguens, J.; Mackeprang, R.; Madaras, R.J.; Mader, W.F.; Maenner, R.; Maeno, T.; Mattig, P.; Mattig, S.; Magalhaes Martins, P.J.; Magradze, E.; Mahalalel, Y.; Mahboubi, K.; Mahmood, A.; Maiani, C.; Maidantchik, C.; Maio, A.; Majewski, S.; Makida, Y.; Makouski, M.; Makovec, N.; Malecki, Pa.; Malecki, P.; Maleev, V.P.; Malek, F.; Mallik, U.; Malon, D.; Maltezos, S.; Malyshev, V.; Malyukov, S.; Mambelli, M.; Mameghani, R.; Mamuzic, J.; Mandelli, L.; Mandic, I.; Mandrysch, R.; Maneira, J.; Mangeard, P.S.; Manjavidze, I.D.; Manning, P.M.; Manousakis-Katsikakis, A.; Mansoulie, B.; Mapelli, A.; Mapelli, L.; March, L.; 
Marchand, J.F.; Marchese, F.; Marchiori, G.; Marcisovsky, M.; Marino, C.P.; Marroquim, F.; Marshall, Z.; Marti-Garcia, S.; Martin, A.J.; Martin, A.J.; Martin, B.; Martin, B.; Martin, F.F.; Martin, J.P.; Martin, T.A.; Martin dit Latour, B.; Martinez, M.; Martinez Outschoorn, V.; Martini, A.; Martyniuk, A.C.; Marzano, F.; Marzin, A.; Masetti, L.; Mashimo, T.; Mashinistov, R.; Masik, J.; Maslennikov, A.L.; Massa, I.; Massol, N.; Mastroberardino, A.; Masubuchi, T.; Matricon, P.; Matsunaga, H.; Matsushita, T.; Mattravers, C.; Maxfield, S.J.; Mayne, A.; Mazini, R.; Mazur, M.; Mazzanti, M.; Mc Donald, J.; Mc Kee, S.P.; McCarn, A.; McCarthy, R.L.; McCubbin, N.A.; McFarlane, K.W.; McGlone, H.; Mchedlidze, G.; McMahon, S.J.; McPherson, R.A.; Meade, A.; Mechnich, J.; Mechtel, M.; Medinnis, M.; Meera-Lebbai, R.; Meguro, T.M.; Mehlhase, S.; Mehta, A.; Meier, K.; Meirose, B.; Melachrinos, C.; Mellado Garcia, B.R.; Mendoza Navas, L.; Meng, Z.; Menke, S.; Meoni, E.; Mermod, P.; Merola, L.; Meroni, C.; Merritt, F.S.; Messina, A.M.; Metcalfe, J.; Mete, A.S.; Meyer, J-P.; Meyer, J.; Meyer, J.; Meyer, T.C.; Meyer, W.T.; Miao, J.; Michal, S.; Micu, L.; Middleton, R.P.; Migas, S.; Mijovic, L.; Mikenberg, G.; Mikestikova, M.; Mikuz, M.; Miller, D.W.; Mills, W.J.; Mills, C.M.; Milov, A.; Milstead, D.A.; Milstein, D.; Minaenko, A.A.; Minano, M.; Minashvili, I.A.; Mincer, A.I.; Mindur, B.; Mineev, M.; Ming, Y.; Mir, L.M.; Mirabelli, G.; Misawa, S.; Miscetti, S.; Misiejuk, A.; Mitrevski, J.; Mitsou, V.A.; Miyagawa, P.S.; Mjornmark, J.U.; Mladenov, D.; Moa, T.; Moed, S.; Moeller, V.; Monig, K.; Moser, N.; Mohr, W.; Mohrdieck-Mock, S.; Moles-Valls, R.; Molina-Perez, J.; Monk, J.; Monnier, E.; Montesano, S.; Monticelli, F.; Moore, R.W.; Mora Herrera, C.; Moraes, A.; Morais, A.; Morel, J.; Morello, G.; Moreno, D.; Moreno Llacer, M.; Morettini, P.; Morii, M.; Morley, A.K.; Mornacchi, G.; Morozov, S.V.; Morris, J.D.; Moser, H.G.; Mosidze, M.; Moss, J.; Mount, R.; Mountricha, E.; Mouraviev, S.V.; 
Moyse, E.J.W.; Mudrinic, M.; Mueller, F.; Mueller, J.; Mueller, K.; Muller, T.A.; Muenstermann, D.; Muir, A.; Munwes, Y.; Murillo Garcia, R.; Murray, W.J.; Mussche, I.; Musto, E.; Myagkov, A.G.; Myska, M.; Nadal, J.; Nagai, K.; Nagano, K.; Nagasaka, Y.; Nairz, A.M.; Nakamura, K.; Nakano, I.; Nakatsuka, H.; Nanava, G.; Napier, A.; Nash, M.; Nation, N.R.; Nattermann, T.; Naumann, T.; Navarro, G.; Nderitu, S.K.; Neal, H.A.; Nebot, E.; Nechaeva, P.; Negri, A.; Negri, G.; Nelson, A.; Nelson, T.K.; Nemecek, S.; Nemethy, P.; Nepomuceno, A.A.; Nessi, M.; Neubauer, M.S.; Neusiedl, A.; Neves, R.N.; Nevski, P.; Newcomer, F.M.; Nickerson, R.B.; Nicolaidou, R.; Nicolas, L.; Nicoletti, G.; Nicquevert, B.; Niedercorn, F.; Nielsen, J.; Nikiforov, A.; Nikolaev, K.; Nikolic-Audit, I.; Nikolopoulos, K.; Nilsen, H.; Nilsson, P.; Nisati, A.; Nishiyama, T.; Nisius, R.; Nodulman, L.; Nomachi, M.; Nomidis, I.; Nordberg, M.; Nordkvist, B.; Notz, D.; Novakova, J.; Nozaki, M.; Nozicka, M.; Nugent, I.M.; Nuncio-Quiroz, A.E.; Nunes Hanninger, G.; Nunnemann, T.; Nurse, E.; O'Neil, D.C.; O'Shea, V.; Oakham, F.G.; Oberlack, H.; Ochi, A.; Oda, S.; Odaka, S.; Odier, J.; Ogren, H.; Oh, A.; Oh, S.H.; Ohm, C.C.; Ohshima, T.; Ohshita, H.; Ohsugi, T.; Okada, S.; Okawa, H.; Okumura, Y.; Okuyama, T.; Olchevski, A.G.; Oliveira, M.; Oliveira Damazio, D.; Oliver, J.; Oliver Garcia, E.; Olivito, D.; Olszewski, A.; Olszowska, J.; Omachi, C.; Onofre, A.; Onyisi, P.U.E.; Oram, C.J.; Oreglia, M.J.; Oren, Y.; Orestano, D.; Orlov, I.; Oropeza Barrera, C.; Orr, R.S.; Ortega, E.O.; Osculati, B.; Ospanov, R.; Osuna, C.; Ottersbach, J.P; Ould-Saada, F.; Ouraou, A.; Ouyang, Q.; Owen, M.; Owen, S.; Oyarzun, A; Ozcan, V.E.; Ozone, K.; Ozturk, N.; Pacheco Pages, A.; Padilla Aranda, C.; Paganis, E.; Pahl, C.; Paige, F.; Pajchel, K.; Palestini, S.; Pallin, D.; Palma, A.; Palmer, J.D.; Pan, Y.B.; Panagiotopoulou, E.; Panes, B.; Panikashvili, N.; Panitkin, S.; Pantea, D.; Panuskova, M.; Paolone, V.; Papadopoulou, Th.D.; Park, 
S.J.; Park, W.; Parker, M.A.; Parker, S.I.; Parodi, F.; Parsons, J.A.; Parzefall, U.; Pasqualucci, E.; Passeri, A.; Pastore, F.; Pastore, Fr.; Pasztor, G.; Pataraia, S.; Pater, J.R.; Patricelli, S.; Patwa, A.; Pauly, T.; Peak, L.S.; Pecsy, M.; Pedraza Morales, M.I.; Peleganchuk, S.V.; Peng, H.; Penson, A.; Penwell, J.; Perantoni, M.; Perez, K.; Perez Codina, E.; Perez Garcia-Estan, M.T.; Perez Reale, V.; Perini, L.; Pernegger, H.; Perrino, R.; Persembe, S.; Perus, P.; Peshekhonov, V.D.; Petersen, B.A.; Petersen, T.C.; Petit, E.; Petridou, C.; Petrolo, E.; Petrucci, F.; Petschull, D; Petteni, M.; Pezoa, R.; Phan, A.; Phillips, A.W.; Piacquadio, G.; Piccinini, M.; Piegaia, R.; Pilcher, J.E.; Pilkington, A.D.; Pina, J.; Pinamonti, M.; Pinfold, J.L.; Pinto, B.; Pizio, C.; Placakyte, R.; Plamondon, M.; Pleier, M.A.; Poblaguev, A.; Poddar, S.; Podlyski, F.; Poffenberger, P.; Poggioli, L.; Pohl, M.; Polci, F.; Polesello, G.; Policicchio, A.; Polini, A.; Poll, J.; Polychronakos, V.; Pomeroy, D.; Pommes, K.; Ponsot, P.; Pontecorvo, L.; Pope, B.G.; Popeneciu, G.A.; Popovic, D.S.; Poppleton, A.; Popule, J.; Portell Bueso, X.; Porter, R.; Pospelov, G.E.; Pospisil, S.; Potekhin, M.; Potrap, I.N.; Potter, C.J.; Potter, C.T.; Potter, K.P.; Poulard, G.; Poveda, J.; Prabhu, R.; Pralavorio, P.; Prasad, S.; Pravahan, R.; Pribyl, L.; Price, D.; Price, L.E.; Prichard, P.M.; Prieur, D.; Primavera, M.; Prokofiev, K.; Prokoshin, F.; Protopopescu, S.; Proudfoot, J.; Prudent, X.; Przysiezniak, H.; Psoroulas, S.; Ptacek, E.; Puigdengoles, C.; Purdham, J.; Purohit, M.; Puzo, P.; Pylypchenko, Y.; Qi, M.; Qian, J.; Qian, W.; Qin, Z.; Quadt, A.; Quarrie, D.R.; Quayle, W.B.; Quinonez, F.; Raas, M.; Radeka, V.; Radescu, V.; Radics, B.; Rador, T.; Ragusa, F.; Rahal, G.; Rahimi, A.M.; Rajagopalan, S.; Rammensee, M.; Rammes, M.; Rauscher, F.; Rauter, E.; Raymond, M.; Read, A.L.; Rebuzzi, D.M.; Redelbach, A.; Redlinger, G.; Reece, R.; Reeves, K.; Reinherz-Aronis, E.; Reinsch, A; Reisinger, I.; Reljic, 
D.; Rembser, C.; Ren, Z.L.; Renkel, P.; Rescia, S.; Rescigno, M.; Resconi, S.; Resende, B.; Reznicek, P.; Rezvani, R.; Richards, A.; Richards, R.A.; Richter, R.; Richter-Was, E.; Ridel, M.; Rijpstra, M.; Rijssenbeek, M.; Rimoldi, A.; Rinaldi, L.; Rios, R.R.; Riu, I.; Rizatdinova, F.; Rizvi, E.; Roa Romero, D.A.; Robertson, S.H.; Robichaud-Veronneau, A.; Robinson, D.; Robinson, JEM; Robinson, M.; Robson, A.; Rocha de Lima, J.G.; Roda, C.; Roda Dos Santos, D.; Rodriguez, D.; Rodriguez Garcia, Y.; Roe, S.; Rohne, O.; Rojo, V.; Rolli, S.; Romaniouk, A.; Romanov, V.M.; Romeo, G.; Romero Maltrana, D.; Roos, L.; Ros, E.; Rosati, S.; Rosenbaum, G.A.; Rosselet, L.; Rossetti, V.; Rossi, L.P.; Rotaru, M.; Rothberg, J.; Rousseau, D.; Royon, C.R.; Rozanov, A.; Rozen, Y.; Ruan, X.; Ruckert, B.; Ruckstuhl, N.; Rud, V.I.; Rudolph, G.; Ruhr, F.; Ruggieri, F.; Ruiz-Martinez, A.; Rumyantsev, L.; Rurikova, Z.; Rusakovich, N.A.; Rutherfoord, J.P.; Ruwiedel, C.; Ruzicka, P.; Ryabov, Y.F.; Ryan, P.; Rybkin, G.; Rzaeva, S.; Saavedra, A.F.; Sadrozinski, H.F-W.; Sadykov, R.; Sakamoto, H.; Salamanna, G.; Salamon, A.; Saleem, M.S.; Salihagic, D.; Salnikov, A.; Salt, J.; Salvachua Ferrando, B.M.; Salvatore, D.; Salvatore, F.; Salvucci, A.; Salzburger, A.; Sampsonidis, D.; Samset, B.H.; Sandaker, H.; Sander, H.G.; Sanders, M.P.; Sandhoff, M.; Sandhu, P.; Sandstroem, R.; Sandvoss, S.; Sankey, D.P.C.; Sanny, B.; Sansoni, A.; Santamarina Rios, C.; Santoni, C.; Santonico, R.; Saraiva, J.G.; Sarangi, T.; Sarkisyan-Grinbaum, E.; Sarri, F.; Sasaki, O.; Sasao, N.; Satsounkevitch, I.; Sauvage, G.; Savard, P.; Savine, A.Y.; Savinov, V.; Sawyer, L.; Saxon, D.H.; Says, L.P.; Sbarra, C.; Sbrizzi, A.; Scannicchio, D.A.; Schaarschmidt, J.; Schacht, P.; Schafer, U.; Schaetzel, S.; Schaffer, A.C.; Schaile, D.; Schamberger, R.D.; Schamov, A.G.; Schegelsky, V.A.; Scheirich, D.; Schernau, M.; Scherzer, M.I.; Schiavi, C.; Schieck, J.; Schioppa, M.; Schlenker, S.; Schmidt, E.; Schmieden, K.; Schmitt, C.; Schmitz, 
M.; Schott, M.; Schouten, D.; Schovancova, J.; Schram, M.; Schreiner, A.; Schroeder, C.; Schroer, N.; Schroers, M.; Schultes, J.; Schultz-Coulon, H.C.; Schumacher, J.W.; Schumacher, M.; Schumm, B.A.; Schune, Ph.; Schwanenberger, C.; Schwartzman, A.; Schwemling, Ph.; Schwienhorst, R.; Schwierz, R.; Schwindling, J.; Scott, W.G.; Searcy, J.; Sedykh, E.; Segura, E.; Seidel, S.C.; Seiden, A.; Seifert, F.; Seixas, J.M.; Sekhniaidze, G.; Seliverstov, D.M.; Sellden, B.; Semprini-Cesari, N.; Serfon, C.; Serin, L.; Seuster, R.; Severini, H.; Sevior, M.E.; Sfyrla, A.; Shabalina, E.; Shamim, M.; Shan, L.Y.; Shank, J.T.; Shao, Q.T.; Shapiro, M.; Shatalov, P.B.; Shaw, K.; Sherman, D.; Sherwood, P.; Shibata, A.; Shimojima, M.; Shin, T.; Shmeleva, A.; Shochet, M.J.; Shupe, M.A.; Sicho, P.; Sidoti, A.; Siegert, F; Siegrist, J.; Sijacki, Dj.; Silbert, O.; Silva, J.; Silver, Y.; Silverstein, D.; Silverstein, S.B.; Simak, V.; Simic, Lj.; Simion, S.; Simmons, B.; Simonyan, M.; Sinervo, P.; Sinev, N.B.; Sipica, V.; Siragusa, G.; Sisakyan, A.N.; Sivoklokov, S.Yu.; Sjoelin, J.; Sjursen, T.B.; Skovpen, K.; Skubic, P.; Slater, M.; Slavicek, T.; Sliwa, K.; Sloper, J.; Sluka, T.; Smakhtin, V.; Smirnov, S.Yu.; Smirnov, Y.; Smirnova, L.N.; Smirnova, O.; Smith, B.C.; Smith, D.; Smith, K.M.; Smizanska, M.; Smolek, K.; Snesarev, A.A.; Snow, S.W.; Snow, J.; Snuverink, J.; Snyder, S.; Soares, M.; Sobie, R.; Sodomka, J.; Soffer, A.; Solans, C.A.; Solar, M.; Solc, J.; Solfaroli Camillocci, E.; Solodkov, A.A.; Solovyanov, O.V.; Soluk, R.; Sondericker, J.; Sopko, V.; Sopko, B.; Sosebee, M.; Soukharev, A.; Spagnolo, S.; Spano, F.; Spencer, E.; Spighi, R.; Spigo, G.; Spila, F.; Spiwoks, R.; Spousta, M.; Spreitzer, T.; Spurlock, B.; St. 
Denis, R.D.; Stahl, T.; Stahlman, J.; Stamen, R.; Stancu, S.N.; Stanecka, E.; Stanek, R.W.; Stanescu, C.; Stapnes, S.; Starchenko, E.A.; Stark, J.; Staroba, P.; Starovoitov, P.; Stastny, J.; Stavina, P.; Steele, G.; Steinbach, P.; Steinberg, P.; Stekl, I.; Stelzer, B.; Stelzer, H.J.; Stelzer-Chilton, O.; Stenzel, H.; Stevenson, K.; Stewart, G.A.; Stockton, M.C.; Stoerig, K.; Stoicea, G.; Stonjek, S.; Strachota, P.; Stradling, A.R.; Straessner, A.; Strandberg, J.; Strandberg, S.; Strandlie, A.; Strauss, M.; Strizenec, P.; Strohmer, R.; Strom, D.M.; Stroynowski, R.; Strube, J.; Stugu, B.; Soh, D.A.; Su, D.; Sugaya, Y.; Sugimoto, T.; Suhr, C.; Suk, M.; Sulin, V.V.; Sultansoy, S.; Sumida, T.; Sun, X.H.; Sundermann, J.E.; Suruliz, K.; Sushkov, S.; Susinno, G.; Sutton, M.R.; Suzuki, T.; Suzuki, Y.; Sykora, I.; Sykora, T.; Szymocha, T.; Sanchez, J.; Ta, D.; Tackmann, K.; Taffard, A.; Tafirout, R.; Taga, A.; Takahashi, Y.; Takai, H.; Takashima, R.; Takeda, H.; Takeshita, T.; Talby, M.; Talyshev, A.; Tamsett, M.C.; Tanaka, J.; Tanaka, R.; Tanaka, S.; Tanaka, S.; Tapprogge, S.; Tardif, D.; Tarem, S.; Tarrade, F.; Tartarelli, G.F.; Tas, P.; Tasevsky, M.; Tassi, E.; Tatarkhanov, M.; Taylor, C.; Taylor, F.E.; Taylor, G.N.; Taylor, R.P.; Taylor, W.; Teixeira-Dias, P.; Ten Kate, H.; Teng, P.K.; Tennenbaum-Katan, Y.D.; Terada, S.; Terashi, K.; Terron, J.; Terwort, M.; Testa, M.; Teuscher, R.J.; Thioye, M.; Thoma, S.; Thomas, J.P.; Thompson, E.N.; Thompson, P.D.; Thompson, P.D.; Thompson, R.J.; Thompson, A.S.; Thomson, E.; Thun, R.P.; Tic, T.; Tikhomirov, V.O.; Tikhonov, Y.A.; Tipton, P.; Tique Aires Viegas, F.J.; Tisserant, S.; Toczek, B.; Todorov, T.; Todorova-Nova, S.; Toggerson, B.; Tojo, J.; Tokar, S.; Tokushuku, K.; Tollefson, K.; Tomasek, L.; Tomasek, M.; Tomoto, M.; Tompkins, L.; Toms, K.; Tonoyan, A.; Topfel, C.; Topilin, N.D.; Torrence, E.; Torro Pastor, E.; Toth, J.; Touchard, F.; Tovey, D.R.; Trefzger, T.; Tremblet, L.; Tricoli, A.; Trigger, I.M.; Trincaz-Duvoid, S.; 
Trinh, T.N.; Tripiana, M.F.; Triplett, N.; Trischuk, W.; Trivedi, A.; Trocme, B.; Troncon, C.; Trzupek, A.; Tsarouchas, C.; Tseng, J.C-L.; Tsiakiris, M.; Tsiareshka, P.V.; Tsionou, D.; Tsipolitis, G.; Tsiskaridze, V.; Tskhadadze, E.G.; Tsukerman, I.I.; Tsulaia, V.; Tsung, J.W.; Tsuno, S.; Tsybychev, D.; Tuggle, J.M.; Turecek, D.; Turk Cakir, I.; Turlay, E.; Tuts, P.M.; Twomey, M.S.; Tylmad, M.; Tyndel, M.; Uchida, K.; Ueda, I.; Ugland, M.; Uhlenbrock, M.; Uhrmacher, M.; Ukegawa, F.; Unal, G.; Undrus, A.; Unel, G.; Unno, Y.; Urbaniec, D.; Urkovsky, E.; Urquijo, P.; Urrejola, P.; Usai, G.; Uslenghi, M.; Vacavant, L.; Vacek, V.; Vachon, B.; Vahsen, S.; Valente, P.; Valentinetti, S.; Valkar, S.; Valladolid Gallego, E.; Vallecorsa, S.; Valls Ferrer, J.A.; Van Berg, R.; van der Graaf, H.; van der Kraaij, E.; van der Poel, E.; van der Ster, D.; van Eldik, N.; van Gemmeren, P.; van Kesteren, Z.; van Vulpen, I.; Vandelli, W.; Vaniachine, A.; Vankov, P.; Vannucci, F.; Vari, R.; Varnes, E.W.; Varouchas, D.; Vartapetian, A.; Varvell, K.E.; Vasilyeva, L.; Vassilakopoulos, V.I.; Vazeille, F.; Vellidis, C.; Veloso, F.; Veneziano, S.; Ventura, A.; Ventura, D.; Venturi, M.; Venturi, N.; Vercesi, V.; Verducci, M.; Verkerke, W.; Vermeulen, J.C.; Vetterli, M.C.; Vichou, I.; Vickey, T.; Viehhauser, G.H.A.; Villa, M.; Villani, E.G.; Villaplana Perez, M.; Vilucchi, E.; Vincter, M.G.; Vinek, E.; Vinogradov, V.B.; Viret, S.; Virzi, J.; Vitale, A.; Vitells, O.; Vivarelli, I.; Vives Vaque, F.; Vlachos, S.; Vlasak, M.; Vlasov, N.; Vogel, A.; Vokac, P.; Volpi, M.; von der Schmitt, H.; von Loeben, J.; von Radziewski, H.; von Toerne, E.; Vorobel, V.; Vorwerk, V.; Vos, M.; Voss, R.; Voss, T.T.; Vossebeld, J.H.; Vranjes, N.; Vranjes Milosavljevic, M.; Vrba, V.; Vreeswijk, M.; Vu Anh, T.; Vudragovic, D.; Vuillermet, R.; Vukotic, I.; Wagner, P.; Walbersloh, J.; Walder, J.; Walker, R.; Walkowiak, W.; Wall, R.; Wang, C.; Wang, H.; Wang, J.; Wang, S.M.; Warburton, A.; Ward, C.P.; Warsinsky, M.; Wastie, 
R.; Watkins, P.M.; Watson, A.T.; Watson, M.F.; Watts, G.; Watts, S.; Waugh, A.T.; Waugh, B.M.; Weber, M.D.; Weber, M.; Weber, M.S.; Weber, P.; Weidberg, A.R.; Weingarten, J.; Weiser, C.; Wellenstein, H.; Wells, P.S.; Wen, M.; Wenaus, T.; Wendler, S.; Wengler, T.; Wenig, S.; Wermes, N.; Werner, M.; Werner, P.; Werth, M.; Werthenbach, U.; Wessels, M.; Whalen, K.; White, A.; White, M.J.; White, S.; Whitehead, S.R.; Whiteson, D.; Whittington, D.; Wicek, F.; Wicke, D.; Wickens, F.J.; Wiedenmann, W.; Wielers, M.; Wienemann, P.; Wiglesworth, C.; Wiik, L.A.M.; Wildauer, A.; Wildt, M.A.; Wilkens, H.G.; Williams, E.; Williams, H.H.; Willocq, S.; Wilson, J.A.; Wilson, M.G.; Wilson, A.; Wingerter-Seez, I.; Winklmeier, F.; Wittgen, M.; Wolter, M.W.; Wolters, H.; Wosiek, B.K.; Wotschack, J.; Woudstra, M.J.; Wraight, K.; Wright, C.; Wright, D.; Wrona, B.; Wu, S.L.; Wu, X.; Wulf, E.; Wynne, B.M.; Xaplanteris, L.; Xella, S.; Xie, S.; Xu, D.; Xu, N.; Yamada, M.; Yamamoto, A.; Yamamoto, K.; Yamamoto, S.; Yamamura, T.; Yamaoka, J.; Yamazaki, T.; Yamazaki, Y.; Yan, Z.; Yang, H.; Yang, U.K.; Yang, Z.; Yao, W-M.; Yao, Y.; Yasu, Y.; Ye, J.; Ye, S.; Yilmaz, M.; Yoosoofmiya, R.; Yorita, K.; Yoshida, R.; Young, C.; Youssef, S.P.; Yu, D.; Yu, J.; Yuan, L.; Yurkewicz, A.; Zaidan, R.; Zaitsev, A.M.; Zajacova, Z.; Zambrano, V.; Zanello, L.; Zaytsev, A.; Zeitnitz, C.; Zeller, M.; Zemla, A.; Zendler, C.; Zenin, O.; Zenis, T.; Zenonos, Z.; Zenz, S.; Zerwas, D.; Zevi della Porta, G.; Zhan, Z.; Zhang, H.; Zhang, J.; Zhang, Q.; Zhang, X.; Zhao, L.; Zhao, T.; Zhao, Z.; Zhemchugov, A.; Zhong, J.; Zhou, B.; Zhou, N.; Zhou, Y.; Zhu, C.G.; Zhu, H.; Zhu, Y.; Zhuang, X.; Zhuravlov, V.; Zimmermann, R.; Zimmermann, S.; Zimmermann, S.; Ziolkowski, M.; Zivkovic, L.; Zobernig, G.; Zoccoli, A.; zur Nedden, M.; Zutshi, V.

    2010-01-01

The simulation software for the ATLAS Experiment at the Large Hadron Collider is being used for large-scale production of events on the LHC Computing Grid. This simulation requires many components, from the generators that model particle collisions to the packages that simulate the response of the various detectors and triggers. All of these components come together under the ATLAS simulation infrastructure. In this paper, that infrastructure is discussed, including the parts supporting the detector description, interfacing the event generation, and combining the GEANT4 simulation of the response of the individual detectors. Also described are the tools for software validation, performance testing, and validation of the simulated output against known physics processes.
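The staged chain the abstract describes (event generation feeding detector-response simulation, followed by validation of the output) can be illustrated with a toy sketch. Every function name and model below is an illustrative stand-in, not the actual ATLAS (Athena) interface:

```python
import random

# Minimal sketch of a staged simulation chain: event generation, detector
# response, and a validation pass. All names and models here are
# illustrative stand-ins, not the actual ATLAS (Athena) interfaces.

def generate_event(seed):
    """Stand-in event generator: returns a list of particle energies (GeV)."""
    rng = random.Random(seed)
    return [rng.uniform(1.0, 100.0) for _ in range(rng.randint(2, 6))]

def simulate_response(event, resolution=0.05):
    """Stand-in detector simulation: applies a bounded relative smearing."""
    rng = random.Random(len(event))
    return [e * (1.0 + rng.uniform(-resolution, resolution)) for e in event]

def validate(truth, measured, tolerance=0.1):
    """Toy validation: every measured energy stays within tolerance of truth."""
    return all(abs(m - t) <= tolerance * t for t, m in zip(truth, measured))

events = [generate_event(seed) for seed in range(3)]
responses = [simulate_response(event) for event in events]
all_valid = all(validate(t, m) for t, m in zip(events, responses))
print(all_valid)  # True: the smearing is bounded well inside the tolerance
```

The point of the sketch is the separation of stages: each component only consumes the previous stage's output, which is what lets an infrastructure swap generators or detector-response packages independently.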

  4. Descriptive Analysis Of Nigerian Children Human Figure Drawings ...

    African Journals Online (AJOL)

    Art is a symbolic means of communication through which man can communicate his inner needs, desires and worries. Children find art a good means of communicating what ordinarily, they may not be able to describe orally. This study was on description of Nigerian children human figure art using Lowenfeld and Brittain ...

  5. Primary Numbers Database for ATLAS Detector Description Parameters

    CERN Document Server

    Vaniachine, A; Malon, D; Nevski, P; Wenaus, T

    2003-01-01

    We present the design and the status of the database for detector description parameters in ATLAS experiment. The ATLAS Primary Numbers are the parameters defining the detector geometry and digitization in simulations, as well as certain reconstruction parameters. Since the detailed ATLAS detector description needs more than 10,000 such parameters, a preferred solution is to have a single verified source for all these data. The database stores the data dictionary for each parameter collection object, providing schema evolution support for object-based retrieval of parameters. The same Primary Numbers are served to many different clients accessing the database: the ATLAS software framework Athena, the Geant3 heritage framework Atlsim, the Geant4 developers framework FADS/Goofy, the generator of XML output for detector description, and several end-user clients for interactive data navigation, including web-based browsers and ROOT. The choice of the MySQL database product for the implementation provides addition...

  6. Unsupervised Idealization of Ion Channel Recordings by Minimum Description Length

    DEFF Research Database (Denmark)

    Gnanasambandam, Radhakrishnan; Nielsen, Morten S; Nicolai, Christopher

    2017-01-01

    and characterize an idealization algorithm based on Rissanen's Minimum Description Length (MDL) Principle. This method uses minimal assumptions and idealizes ion channel recordings without requiring a detailed user input or a priori assumptions about channel conductance and kinetics. Furthermore, we demonstrate...... that correlation analysis of conductance steps can resolve properties of single ion channels in recordings contaminated by signals from multiple channels. We first validated our methods on simulated data defined with a range of different signal-to-noise levels, and then showed that our algorithm can recover...... channel currents and their substates from recordings with multiple channels, even under conditions of high noise. We then tested the MDL algorithm on real experimental data from human PIEZO1 channels and found that our method revealed the presence of substates with alternate conductances....
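The MDL idealization idea above can be illustrated with a minimal sketch (not the authors' published algorithm): idealize a noisy step-like trace by greedy binary segmentation, accepting each new changepoint only while a two-part description length — the residual coding cost plus a model cost for the extra level and breakpoint — keeps decreasing. The BIC-like penalty form below is an assumed choice for illustration:

```python
import numpy as np

def segment_rss(y):
    """Residual sum of squares of a segment around its mean."""
    return float(np.sum((y - y.mean()) ** 2))

def best_split(y):
    """Best single split of y: returns (rss_left + rss_right, split_index)."""
    n = len(y)
    best = (segment_rss(y), None)
    c1, c2 = np.cumsum(y), np.cumsum(y ** 2)  # prefix sums -> O(n) split scan
    for i in range(1, n):
        rss_l = c2[i - 1] - c1[i - 1] ** 2 / i
        rss_r = (c2[-1] - c2[i - 1]) - (c1[-1] - c1[i - 1]) ** 2 / (n - i)
        if rss_l + rss_r < best[0]:
            best = (rss_l + rss_r, i)
    return best

def mdl_idealize(y, max_segments=10):
    """Greedy binary segmentation; keep splitting while description length drops."""
    n = len(y)
    segs = [(0, n)]

    def total_rss(segs):
        return sum(segment_rss(y[a:b]) for a, b in segs)

    def dl(rss, k):
        # two-part MDL (an assumed BIC-like form): data coding cost plus the
        # cost of encoding k segment levels and k - 1 breakpoints
        return 0.5 * n * np.log(max(rss, 1e-12) / n) + (2 * k - 1) * np.log(n)

    while len(segs) < max_segments:
        gains = []
        for j, (a, b) in enumerate(segs):
            if b - a < 2:
                continue
            rss_split, i = best_split(y[a:b])
            if i is not None:
                gains.append((segment_rss(y[a:b]) - rss_split, j, a + i))
        if not gains:
            break
        gain, j, cut = max(gains)
        if dl(total_rss(segs) - gain, len(segs) + 1) >= dl(total_rss(segs), len(segs)):
            break  # the best available split no longer pays for its description
        a, b = segs.pop(j)
        segs[j:j] = [(a, cut), (cut, b)]
    segs.sort()
    ideal = np.concatenate([np.full(b - a, y[a:b].mean()) for a, b in segs])
    return ideal, segs
```

On a simulated multi-level trace this recovers a piecewise-constant idealization without any prior on the number of levels or their conductances, which is the property the abstract emphasizes.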

  7. A simplified description of the three-dimensional structure of agroforestry trees for use with a radiative transfer model

    International Nuclear Information System (INIS)

    Meloni, S.

    1998-01-01

    To simulate transmitted radiation in agroforestry systems, radiative transfer models usually require a detailed three-dimensional description of the tree canopy. We propose here a simplification of the description of the three-dimensional structure of wild cherry trees (Prunus avium). The simplified tree description was tested against the detailed one for five-year-old wild cherry. It allowed accurate simulation of transmitted radiation and avoided tedious measurements of tree structure. The simplified description was then applied to older trees. Allometric relationships were used to compute the parameters not available on free-grown trees. The transmitted radiation in an agroforestry system was simulated at four different ages: 5, 10, 15 and 20 years. The trees were planted on a 5 m square grid. Two row orientations, chosen to provide different transmitted radiation patterns, were tested: north/south and north- east/south-west. The simulations showed that the daily mean transmitted radiation was reduced from 92% of incident radiation under five-year-old trees to 37% under 20-year-old trees. The variability of transmitted radiation increased with tree growth. The row orientation had only small effects on the shaded area at the beginning and end of the day when solar elevation was low. (author)

  8. Global sensitivity analysis using emulators, with an example analysis of large fire plumes based on FDS simulations

    Energy Technology Data Exchange (ETDEWEB)

    Kelsey, Adrian [Health and Safety Laboratory, Harpur Hill, Buxton (United Kingdom)

    2015-12-15

    Uncertainty in model predictions of the behaviour of fires is an important issue in fire safety analysis in nuclear power plants. A global sensitivity analysis can help identify the input parameters or sub-models that have the most significant effect on model predictions. However, performing a global sensitivity analysis using Monte Carlo sampling might require thousands of simulations and therefore would not be practical for an analysis based on a complex fire code using computational fluid dynamics (CFD). An alternative approach is to perform a global sensitivity analysis using an emulator. Gaussian process emulators can be built using a limited number of simulations, and once built, a global sensitivity analysis can be performed on the emulator rather than on the simulations directly. Typically, reliable emulators can be built using about ten simulations per parameter under consideration, allowing a global sensitivity analysis to be performed even for a complex computer code. In this paper we use the example of a large-scale pool fire to demonstrate an emulator-based approach to global sensitivity analysis. In that work an emulator-based global sensitivity analysis was used to identify the key uncertain model inputs affecting the entrainment rates and flame heights in large Liquefied Natural Gas (LNG) fire plumes. The pool fire simulations were performed using the Fire Dynamics Simulator (FDS) software. Five model inputs were varied: the fire diameter, burn rate, radiative fraction, computational grid cell size and choice of turbulence model. The ranges used for these parameters in the analysis were determined from experiment and literature. The Gaussian process emulators used in the analysis were created using 127 FDS simulations. The emulators were checked for reliability, and then used to perform a global sensitivity analysis and uncertainty analysis. Large-scale ignited releases of LNG on water were performed by Sandia National Laboratories.
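The emulator-based workflow this record describes can be sketched in a few lines: fit a Gaussian process to a modest number of model runs, then estimate first-order sensitivity indices on the cheap emulator instead of the expensive code. The snippet below uses scikit-learn and the standard Ishigami test function as a stand-in for FDS; the sample sizes, kernel, and grid are illustrative assumptions, not the paper's setup:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def expensive_model(x):
    # stand-in for an expensive CFD code (Ishigami test function)
    return (np.sin(x[:, 0]) + 7.0 * np.sin(x[:, 1]) ** 2
            + 0.1 * x[:, 2] ** 4 * np.sin(x[:, 0]))

rng = np.random.default_rng(1)
d = 3
X_train = rng.uniform(-np.pi, np.pi, size=(50, d))   # a few dozen "runs"
y_train = expensive_model(X_train)

gp = GaussianProcessRegressor(
    kernel=ConstantKernel() * RBF(length_scale=[1.0] * d),
    normalize_y=True, n_restarts_optimizer=3, random_state=0)
gp.fit(X_train, y_train)

# First-order indices on the emulator: S_i = Var(E[Y | x_i]) / Var(Y),
# with the conditional means estimated by freezing one input at a time.
N = 2000
X = rng.uniform(-np.pi, np.pi, size=(N, d))
var_total = np.var(gp.predict(X))
S = []
for i in range(d):
    cond_means = []
    for g in np.linspace(-np.pi, np.pi, 40):
        Xi = X.copy()
        Xi[:, i] = g                       # freeze input i, average over the rest
        cond_means.append(gp.predict(Xi).mean())
    S.append(np.var(cond_means) / var_total)
```

The 240,000 emulator evaluations here replace what would be 240,000 runs of the simulator itself — the economy the abstract is pointing at.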

  9. Validation of SIMULATE-3K for stability analysis of Laguna Verde nuclear plant

    Energy Technology Data Exchange (ETDEWEB)

    Castillo, Rogelio, E-mail: rogelio.castillo@inin.gob.mx [Instituto Nacional de Investigaciones Nucleares, Carretera México-Toluca s/n, La Marquesa, Ocoyoacac, Estado de México 52750 (Mexico); Alonso, Gustavo, E-mail: gustavo.alonso@inin.gob.mx [Instituto Nacional de Investigaciones Nucleares, Carretera México-Toluca s/n, La Marquesa, Ocoyoacac, Estado de México 52750 (Mexico); Instituto Politecnico Nacional, Unidad Profesional Adolfo Lopez Mateos, Ed. 9, Lindavista, D.F. 07300 (Mexico); Ramírez, J. Ramón, E-mail: ramon.ramirez@inin.gob.mx [Instituto Nacional de Investigaciones Nucleares, Carretera México-Toluca s/n, La Marquesa, Ocoyoacac, Estado de México 52750 (Mexico)

    2013-12-15

    Highlights: • A neutronic/thermal-hydraulic event in Laguna Verde is modeled. • Good agreement is obtained between SIMULATE-3K results and plant data for frequency and DR. • Other noise analysis techniques are used for the same purpose with good agreement. • Validation of SIMULATE-3K for stability analysis of Laguna Verde is confirmed - Abstract: Boiling Water Reactors are two-phase flow systems that are susceptible to different types of flow instabilities. Among these are the coupled neutronic/thermal-hydraulic instabilities, which may compromise established fuel safety limits. These instabilities are characterized by periodic core-power and hydraulic oscillations. The SIMULATE-3K code has been tested against several stability benchmarks; however, to qualify the code for a particular power plant, a plant-specific analysis must be done. In this paper, the plant model of Laguna Verde Nuclear Power Plant is built and SIMULATE-3K is tested against the 1995 coupled neutronic/thermal-hydraulic instability event of Laguna Verde. The results obtained show the adequacy of this code for Laguna Verde plant-specific stability analysis.

  10. Failure analysis of parameter-induced simulation crashes in climate models

    Science.gov (United States)

    Lucas, D. D.; Klein, R.; Tannahill, J.; Ivanova, D.; Brandon, S.; Domyancic, D.; Zhang, Y.

    2013-08-01

    Simulations using IPCC (Intergovernmental Panel on Climate Change)-class climate models are subject to fail or crash for a variety of reasons. Quantitative analysis of the failures can yield useful insights to better understand and improve the models. During the course of uncertainty quantification (UQ) ensemble simulations to assess the effects of ocean model parameter uncertainties on climate simulations, we experienced a series of simulation crashes within the Parallel Ocean Program (POP2) component of the Community Climate System Model (CCSM4). About 8.5% of our CCSM4 simulations failed for numerical reasons at combinations of POP2 parameter values. We applied support vector machine (SVM) classification from machine learning to quantify and predict the probability of failure as a function of the values of 18 POP2 parameters. A committee of SVM classifiers readily predicted model failures in an independent validation ensemble, as assessed by the area under the receiver operating characteristic (ROC) curve metric (AUC > 0.96). The causes of the simulation failures were determined through a global sensitivity analysis. Combinations of 8 parameters related to ocean mixing and viscosity from three different POP2 parameterizations were the major sources of the failures. This information can be used to improve POP2 and CCSM4 by incorporating correlations across the relevant parameters. Our method can also be used to quantify, predict, and understand simulation crashes in other complex geoscientific models.
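The failure-classification approach this record describes — learning the probability of a crash as a function of parameter values and scoring it by ROC AUC — can be mimicked on synthetic data. Everything below (the 18-parameter ensemble and the failure rule) is fabricated for illustration; only the method, an RBF-kernel SVM with probability outputs, follows the abstract:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n, d = 1000, 18                          # ensemble members x POP2-like parameters
X = rng.uniform(0, 1, size=(n, d))

# Hypothetical failure rule (invented for illustration): crashes cluster
# where two mixing-related parameters are jointly large, plus some noise.
logit = 8.0 * X[:, 0] * X[:, 1] - 3.0 + rng.normal(0, 0.5, n)
y = (logit > 0).astype(int)              # 1 = simulation crashed

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = SVC(kernel="rbf", C=10.0, gamma="scale", probability=True, random_state=0)
clf.fit(X_tr, y_tr)

p_fail = clf.predict_proba(X_te)[:, 1]   # predicted probability of failure
auc = roc_auc_score(y_te, p_fail)        # validation as in the paper's AUC check
```

In the paper's workflow a committee of such classifiers is validated on an independent ensemble, and a global sensitivity analysis over the learned probability surface then attributes the failures to specific parameter combinations.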

  11. Simulation analysis for integrated evaluation of technical and commercial risk

    International Nuclear Information System (INIS)

    Gutleber, D.S.; Heiberger, E.M.; Morris, T.D.

    1995-01-01

    Decisions to invest in oil- and gasfield acquisitions or participating interests often are based on the perceived ability to enhance the economic value of the underlying asset. A multidisciplinary approach integrating reservoir engineering, operations and drilling, and deal structuring with Monte Carlo simulation modeling can overcome weaknesses of deterministic analysis and significantly enhance investment decisions. This paper discusses the use of spreadsheets and Monte Carlo simulation to generate probabilistic outcomes for key technical and economic parameters for ultimate identification of the economic volatility and value of potential deal concepts for a significant opportunity. The approach differs from a simple risk analysis for an individual well by incorporating detailed, full-field simulations that vary the reservoir parameters, capital and operating cost assumptions, and schedules on timing in the framework of various deal structures
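A minimal sketch of the spreadsheet-plus-Monte-Carlo analysis the abstract describes, with entirely hypothetical input distributions for reserves, price and costs: the point is that the output is a distribution of NPV (e.g. P10/P50/P90 and a probability of loss) rather than a single deterministic figure.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000  # Monte Carlo trials

# Illustrative input distributions (all values hypothetical)
reserves = rng.lognormal(mean=np.log(20e6), sigma=0.35, size=n)   # bbl recoverable
price = rng.normal(60, 8, size=n)                                  # $/bbl
capex = rng.triangular(150e6, 180e6, 250e6, size=n)                # $
opex_per_bbl = rng.uniform(12, 18, size=n)                         # $/bbl

years, discount = 10, 0.10
annual_prod = reserves / years                     # flat production profile
cash_flow = annual_prod * (price - opex_per_bbl)   # $/yr
dfac = sum(1 / (1 + discount) ** t for t in range(1, years + 1))
npv = cash_flow * dfac - capex

p10, p50, p90 = np.percentile(npv, [10, 50, 90])
prob_loss = np.mean(npv < 0)                       # economic volatility of the deal
```

A full-field study would replace the one-line production profile with reservoir-simulation outputs and layer alternative deal structures on top, as the abstract describes, but the probabilistic framing is the same.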

  12. Data Science Programs in U.S. Higher Education: An Exploratory Content Analysis of Program Description, Curriculum Structure, and Course Focus

    Science.gov (United States)

    Tang, Rong; Sae-Lim, Watinee

    2016-01-01

    In this study, an exploratory content analysis of 30 randomly selected Data Science (DS) programs from eight disciplines revealed significant gaps in current DS education in the United States. The analysis centers on linguistic patterns of program descriptions, curriculum requirements, and DS course focus as pertaining to key skills and domain…

  13. Simulation Experiments in Practice : Statistical Design and Regression Analysis

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2007-01-01

    In practice, simulation analysts often change only one factor at a time, and use graphical analysis of the resulting Input/Output (I/O) data. The goal of this article is to change these traditional, naïve methods of design and analysis, because statistical theory proves that more information is

  14. Simulation program description for the fusion by inertial confinement

    International Nuclear Information System (INIS)

    Ferro Fontan, C.; Mancini, R.C.

    1982-01-01

    The physical model and the numerical technique used to describe the evolution of a target with plane, cylindrical or spherical symmetry, irradiated with a laser pulse, are presented. As a simulation example, the results obtained for an aluminium plane target illuminated with a short pulse are shown. (L.C.) [pt

  15. Description of Simulated Small Satellite Operation Data Sets

    Science.gov (United States)

    Kulkarni, Chetan S.; Guarneros Luna, Ali

    2018-01-01

    A set of two BP930 batteries (identified as PK31 and PK35) was operated continuously through a single cycle of a simulated satellite operation profile. The battery packs were charged to an initial voltage of around 8.35 V for 100% SOC before the experiment was started. This document explains the structure of the battery data sets. Please cite this paper when using this dataset: Z. Cameron, C. Kulkarni, A. Guarneros, K. Goebel, S. Poll, "A Battery Certification Testbed for Small Satellite Missions", IEEE AUTOTESTCON 2015, Nov 2-5, 2015, National Harbor, MD

  16. Description, prescription and the choice of discount rates

    International Nuclear Information System (INIS)

    Baum, Seth D.

    2009-01-01

    The choice of discount rates is a key issue in the analysis of long-term societal issues, in particular environmental issues such as climate change. Approaches to choosing discount rates are generally placed into two categories: the descriptive approach and the prescriptive approach. The descriptive approach is often justified on grounds that it uses a description of how society discounts instead of having analysts impose their own discounting views on society. This paper analyzes the common forms of the descriptive and prescriptive approaches and finds that, in contrast with customary thinking, both forms are equally descriptive and prescriptive. The prescriptions concern who has standing (i.e. who is included) in society, how the views of these individuals are measured, and how the measurements are aggregated. Such prescriptions are necessary to choose from among the many possible descriptions of how society discounts. The descriptions are the measurements made given a choice of measurement technique. Thus, the labels 'descriptive approach' and 'prescriptive approach' are deeply misleading, as analysts cannot avoid imposing their own views on society. (author)

  17. Towards understanding of magnetization reversal in Nd-Fe-B nanocomposites: analysis by high-throughput micromagnetic simulations

    Science.gov (United States)

    Erokhin, Sergey; Berkov, Dmitry; Ito, Masaaki; Kato, Akira; Yano, Masao; Michels, Andreas

    2018-03-01

    We demonstrate how micromagnetic simulations can be employed in order to characterize and analyze the magnetic microstructure of nanocomposites. For the example of nanocrystalline Nd-Fe-B, which is a potential material for future permanent-magnet applications, we have compared three different models for the micromagnetic analysis of this material class: (i) a description of the nanocomposite microstructure in terms of Stoner-Wohlfarth particles with and without the magnetodipolar interaction; (ii) a model based on the core-shell representation of the nanograins; (iii) the latter model including a contribution of superparamagnetic clusters. The relevant parameter spaces have been systematically scanned with the aim of establishing which micromagnetic approach can most adequately describe experimental data for this material. According to our results, only the last, most sophisticated model is able to provide an excellent agreement with the measured hysteresis loop. The presented methodology is generally applicable to multiphase magnetic nanocomposites and it highlights the complex interrelationship between the microstructure, magnetic interactions, and the macroscopic magnetic properties.
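Model (i) in this record — a non-interacting Stoner-Wohlfarth ensemble — is simple enough to sketch directly. The snippet below sweeps a reduced field over a 2D ensemble of particles with random easy axes and follows each particle's local energy minimum by gradient descent, producing a hysteresis loop. The parameters are illustrative, and this is of course far simpler than the core-shell models the authors actually needed:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 500
psi = rng.uniform(0, np.pi, n)     # random easy-axis angles (2D ensemble)
theta = np.zeros(n)                # start saturated along the field

def relax(theta, psi, h, steps=500, lr=0.1):
    """Follow the local minimum of e(theta) = sin^2(theta - psi) - 2 h cos(theta)."""
    for _ in range(steps):
        grad = np.sin(2.0 * (theta - psi)) + 2.0 * h * np.sin(theta)
        theta = theta - lr * grad
    return theta

# Sweep the reduced field down and back up, carrying each particle's state along
# so that metastability (and hence hysteresis) emerges naturally.
fields = np.concatenate([np.linspace(1.5, -1.5, 61), np.linspace(-1.5, 1.5, 61)])
loop = []
for h in fields:
    theta = relax(theta, psi, h)
    loop.append(np.cos(theta).mean())  # net magnetization along the field
loop = np.array(loop)

remanence = loop[30]                   # h = 0 on the descending branch
```

Because each particle stays in its current energy minimum until that minimum disappears, the descending and ascending branches differ at zero field, which is the remanence the paper compares against measured loops.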

  18. MULTIREGION: a simulation-forecasting model of BEA economic area population and employment. [Bureau of Economic Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Olsen, R.J.; Westley, G.W.; Herzog, H.W. Jr.; Kerley, C.R.; Bjornstad, D.J.; Vogt, D.P.; Bray, L.G.; Grady, S.T.; Nakosteen, R.A.

    1977-10-01

    This report documents the development of MULTIREGION, a computer model of regional and interregional socio-economic development. The MULTIREGION model interprets the economy of each BEA economic area as a labor market, measures all activity in terms of people as members of the population (labor supply) or as employees (labor demand), and simultaneously simulates or forecasts the demands and supplies of labor in all BEA economic areas at five-year intervals. In general the outputs of MULTIREGION are intended to resemble those of the Water Resource Council's OBERS projections and to be put to similar planning and analysis purposes. This report has been written at two levels to serve the needs of multiple audiences. The body of the report serves as a fairly nontechnical overview of the entire MULTIREGION project; a series of technical appendixes provide detailed descriptions of the background empirical studies of births, deaths, migration, labor force participation, natural resource employment, manufacturing employment location, and local service employment used to construct the model.

  19. Development of Nuclear Plant Specific Analysis Simulators with ATLAS

    International Nuclear Information System (INIS)

    Jakubowski, Z.; Draeger, P.; Horche, W.; Pointner, W.

    2006-01-01

    The simulation software ATLAS, based on the best-estimate code ATHLET, has been developed by GRS for a range of applications in the field of nuclear plant safety analysis. Through versatile simulation tools and graphical interfaces, the user should be able to analyse all essential accident scenarios with ATLAS. Detailed analysis simulators for several German and Russian NPPs are being constructed on the basis of ATLAS. An overview of ATLAS is presented in the paper, describing its configuration, the functions performed by the main components, and the relationships among them. A significant part of any power plant simulator is the set of balance-of-plant (BOP) models, not only because all plant transients and non-LOCA accidents can be initiated by the operation of BOP systems, but also because the response of the plant to transients or accidents is strongly influenced by the automatic operation of BOP systems. Modelling aspects of BOP systems are shown in detail, as is the interface between the process model and the BOP systems. Special emphasis has been put on the BOP model builder, based on a methodology developed at GRS. This BOP modeller, called GCSM-Generator, is an object-oriented tool which runs on the online expert system G2. It is equipped with utilities to edit the BOP models, to verify them, and to generate GCSM code specific to ATLAS. The communication system of ATLAS presents the results of the simulation graphically and allows the user to interactively influence the execution of the simulation process (malfunctions, manual control). Displays for communication with the simulated processes and for the presentation of calculation results are also presented. In the framework of the verification of simulation models, different tools are used, e.g. the PC code MATHCAD for calculation and documentation, the ATHLET input graphic for checking geometry data, and the expert system G2 for the development of BOP models. The validation procedure and selected analysis results

  20. A comparative study of volatile components in Dianhong teas from fresh leaves of four tea cultivars by using chromatography-mass spectrometry, multivariate data analysis, and descriptive sensory analysis.

    Science.gov (United States)

    Wang, Chao; Zhang, Chenxia; Kong, Yawen; Peng, Xiaopei; Li, Changwen; Liu, Shunhang; Du, Liping; Xiao, Dongguang; Xu, Yongquan

    2017-10-01

    Dianhong teas produced from fresh leaves of different tea cultivars (YK is Yunkang No. 10, XY is Xueya 100, CY is Changyebaihao, SS is Shishengmiao) were compared in terms of volatile compounds and descriptive sensory analysis. A total of 73 volatile compounds in 16 tea samples were tentatively identified. YK, XY, CY, and SS contained 55, 53, 49, and 51 volatile compounds, respectively. Partial least squares-discriminant analysis (PLS-DA) and hierarchical cluster analysis (HCA) were used to classify the samples, and 40 key components were selected based on variable importance in the projection. Moreover, 11 flavor attributes, namely, floral, fruity, grass/green, woody, sweet, roasty, caramel, mellow and thick, bitter, astringent, and sweet aftertaste were identified through descriptive sensory analysis (DSA). In general, innate differences among the tea varieties significantly affected the intensities of most of the key sensory attributes of Dianhong teas, possibly because of the different amounts of aroma-active and taste components in Dianhong teas. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. LISA. A code for safety assessment in nuclear waste disposals program description and user guide

    International Nuclear Information System (INIS)

    Saltelli, A.; Bertozzi, G.; Stanners, D.A.

    1984-01-01

    The code LISA (Long-term Isolation Safety Assessment), developed at the Joint Research Centre, Ispra, is a useful tool in the analysis of the hazard due to the disposal of nuclear waste in geological formations. The risk linked to pre-established release scenarios is assessed by the code in terms of the dose rate to a maximally exposed individual. The various submodels in the code simulate the system of barriers - both natural and man-made - which are interposed between the contaminants and man. After a description of the code features, a user guide is supplied and a test case is presented

  2. First experiences of high-fidelity simulation training in junior nursing students in Korea.

    Science.gov (United States)

    Lee, Suk Jeong; Kim, Sang Suk; Park, Young-Mi

    2015-07-01

    This study was conducted to explore first experiences of high-fidelity simulation training in Korean nursing students, in order to develop and establish more effective guidelines for future simulation training in Korea. Thirty-three junior nursing students participated in high-fidelity simulation training for the first time. Using both qualitative and quantitative methods, data were collected from reflective journals and simulation-effectiveness questionnaires after simulation training. Descriptive statistics were used to analyze simulation effectiveness, and content analysis was performed with the reflective journal data. Five dimensions and 31 domains, covering both positive and negative experiences, emerged from qualitative analysis: (i) machine-human interaction in a safe environment; (ii) perceived learning capability; (iii) observational learning; (iv) reconciling practice with theory; and (v) follow-up debriefing effect. More than 70% of students scored high on increased ability to identify changes in the patient's condition, critical thinking, decision-making, effectiveness of peer observation, and debriefing on the simulation-effectiveness questionnaire. This study reported both positive and negative experiences of simulation. The results of this study could be used to set the level of task difficulty in simulation. Future simulation programs can be designed by reinforcing the positive experiences and modifying the negative results. © 2014 The Authors. Japan Journal of Nursing Science © 2014 Japan Academy of Nursing Science.

  3. Qualitative Descriptive Methods in Health Science Research.

    Science.gov (United States)

    Colorafi, Karen Jiggins; Evans, Bronwynne

    2016-07-01

    The purpose of this methodology paper is to describe an approach to qualitative design known as qualitative descriptive that is well suited to junior health sciences researchers because it can be used with a variety of theoretical approaches, sampling techniques, and data collection strategies. It is often difficult for junior qualitative researchers to pull together the tools and resources they need to embark on a high-quality qualitative research study and to manage the volumes of data they collect during qualitative studies. This paper seeks to pull together much needed resources and provide an overview of methods. A step-by-step guide to planning a qualitative descriptive study and analyzing the data is provided, utilizing exemplars from the authors' research. This paper presents steps to conducting a qualitative descriptive study under the following headings: describing the qualitative descriptive approach, designing a qualitative descriptive study, steps to data analysis, and ensuring rigor of findings. The qualitative descriptive approach results in a summary in everyday, factual language that facilitates understanding of a selected phenomenon across disciplines of health science researchers. © The Author(s) 2016.

  4. HYDRASTAR - a code for stochastic simulation of groundwater flow

    International Nuclear Information System (INIS)

    Norman, S.

    1992-05-01

    The computer code HYDRASTAR was developed as a tool for groundwater flow and transport simulations in the SKB 91 safety analysis project. Its conceptual ideas can be traced back to a report by Shlomo Neuman in 1988, see the reference section. The main idea of the code is the treatment of the rock as a stochastic continuum which separates it from the deterministic methods previously employed by SKB and also from the discrete fracture models. The current report is a comprehensive description of HYDRASTAR including such topics as regularization or upscaling of a hydraulic conductivity field, unconditional and conditional simulation of stochastic processes, numerical solvers for the hydrology and streamline equations and finally some proposals for future developments
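The stochastic-continuum idea at the heart of HYDRASTAR — treating log-conductivity as a spatially correlated random field — can be illustrated with an unconditional simulation via Cholesky factorization of an exponential covariance, followed by a simple regularization (upscaling) step. The grid size, variance, correlation length and geometric-mean upscaling rule are all illustrative assumptions, not SKB's actual choices, and conditioning on borehole data is omitted:

```python
import numpy as np

rng = np.random.default_rng(5)

# Grid of cell centres (kept small so the dense Cholesky stays cheap)
nx = ny = 20
xs, ys = np.meshgrid(np.arange(nx), np.arange(ny))
pts = np.column_stack([xs.ravel(), ys.ravel()]).astype(float)

# Exponential covariance of log10-conductivity: variance 1.0, range 5 cells
dist = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
cov = np.exp(-dist / 5.0)

# Unconditional realization: Y = mean + L z, with C = L L^T
L = np.linalg.cholesky(cov + 1e-8 * np.eye(len(pts)))
mean_logK = -7.0                        # hypothetical mean log10 K
logK = mean_logK + L @ rng.standard_normal(len(pts))
K = (10.0 ** logK).reshape(ny, nx)      # conductivity field on the grid

# Regularization (upscaling) to 5x5 blocks via the geometric mean of K
K_block = 10.0 ** logK.reshape(4, 5, 4, 5).mean(axis=(1, 3))
```

In a conditional simulation, each unconditional realization would additionally be corrected by kriging so that it honours the measured conductivities at borehole locations; repeating the draw many times yields the ensemble of flow fields over which safety-assessment statistics are computed.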

  5. Validation of analysis methods to simulate national and international impact experiments. Final report; Validierung von Analysemethoden zur Simulation von Aufprallversuchen im In- und Ausland. Abschlussbericht

    Energy Technology Data Exchange (ETDEWEB)

    Heckoetter, C.; Sievers, J.

    2012-08-15

    Within the framework of project RS1182, ''Validierung von Analysemethoden zur Simulation von Aufprallversuchen im In- und Ausland'' (validation of analysis methods for the simulation of impact tests in Germany and abroad), different mechanical phenomena that can occur during the impact of deformable, rigid or liquid-filled missiles on robust structures were examined. The safety-related significance of the work lies in evaluating the accuracy of the analysis methods employed to assess the load-bearing capacity of building structures subjected to intentional external hazards. Simulations of selected impact tests were conducted with the analysis code ANSYS AUTODYN. The central task was the examination of impact tests on reinforced concrete target structures, including intermediate-scale tests (carried out at VTT) as well as nearly full-scale tests (carried out in Meppen and at SNL). Besides the behaviour of the missiles, the description of the damage processes in the reinforced concrete structures was a particular priority. The relevant damage mechanisms include global bending and crack formation, local concrete scabbing and spalling, punching, penetration of the missile, and perforation. Furthermore, the effects of a liquid infill of the missile on the load-time function and the structural damage were investigated. Test results were exchanged through bilateral co-operation with organisations in Germany and abroad. In addition, selected comparative calculations carried out within the IRIS2010 activity of the WGIAGE of the CSNI of the OECD/NEA and within the frame of the VTT IMPACT project contributed to enhancing the predictive accuracy of the employed analysis methods. For the characterisation of impact-loaded concrete, the RHT model was comprehensively tested. Basically, the simulation of the behaviour of reinforced concrete structures under impact loading exhibits dependencies on physical and numerical modelling parameters, which could also be concluded from the

  6. Introduction to co-simulation of software and hardware in embedded processor systems

    Energy Technology Data Exchange (ETDEWEB)

    Dreike, P.L.; McCoy, J.A.

    1996-09-01

    From the dawn of the first use of microprocessors and microcontrollers in embedded systems, the software has been blamed for products being late to market. This is due to software being developed after the hardware is fabricated. During the past few years, the use of Hardware Description (or Design) Languages (HDLs) and digital simulation has advanced to a point where the concurrent development of software and hardware can be contemplated using simulation environments. This offers the potential of 50% or greater reductions in time-to-market for embedded systems. This paper is a tutorial on the technical issues that underlie software-hardware (sw-hw) co-simulation and the current state of the art. We review the traditional sequential hardware-software design paradigm and suggest a paradigm for concurrent design, which is supported by co-simulation of software and hardware. This is followed by sections on HDL modeling and simulation; hardware-assisted approaches to simulation; microprocessor modeling methods; brief descriptions of four commercial products for sw-hw co-simulation; and a description of our own experiments to develop a co-simulation environment.

  7. NeuroManager: A workflow analysis based simulation management engine for computational neuroscience

    Directory of Open Access Journals (Sweden)

    David Bruce Stockton

    2015-10-01

    We developed NeuroManager, an object-oriented simulation management software engine for computational neuroscience. NeuroManager automates the workflow of simulation job submissions when using heterogeneous computational resources, simulators, and simulation tasks. The object-oriented approach (1) provides flexibility to adapt to a variety of neuroscience simulators, (2) simplifies the use of heterogeneous computational resources, from desktops to supercomputer clusters, and (3) improves tracking of simulator/simulation evolution. We implemented NeuroManager in Matlab, a widely used engineering and scientific language, for its signal and image processing tools, its prevalence in electrophysiology analysis, and its increasing use in college biology education. To design and develop NeuroManager we analyzed the workflow of simulation submission for a variety of simulators, operating systems, and computational resources, including the handling of input parameters, data, models, results, and analyses. This resulted in twenty-two stages of simulation submission workflow. The software incorporates progress notification; automatic organization, labeling, and time-stamping of data and results; and integrated access to Matlab's analysis and visualization tools. NeuroManager provides users with the tools to automate daily tasks, and assists principal investigators in tracking and recreating the evolution of research projects performed by multiple people. Overall, NeuroManager provides the infrastructure needed to improve workflow, manage multiple simultaneous simulations, and maintain provenance of the potentially large amounts of data produced during the course of a research project.

  8. Optoelectronic Devices Advanced Simulation and Analysis

    CERN Document Server

    Piprek, Joachim

    2005-01-01

    Optoelectronic devices transform electrical signals into optical signals and vice versa by utilizing the sophisticated interaction of electrons and light within micro- and nano-scale semiconductor structures. Advanced software tools for design and analysis of such devices have been developed in recent years. However, the large variety of materials, devices, physical mechanisms, and modeling approaches often makes it difficult to select appropriate theoretical models or software packages. This book presents a review of devices and advanced simulation approaches written by leading researchers and software developers. It is intended for scientists and device engineers in optoelectronics, who are interested in using advanced software tools. Each chapter includes the theoretical background as well as practical simulation results that help to better understand internal device physics. The software packages used in the book are available to the public, on a commercial or noncommercial basis, so that the interested r...

  9. Cost Analysis of Poor Quality Using a Software Simulation

    Directory of Open Access Journals (Sweden)

    Jana Fabianová

    2017-02-01

    Full Text Available The issues of quality, the cost of poor quality, and the factors affecting quality are crucial to maintaining competitiveness in business. The use of software applications and computer simulation enables more effective quality management. Simulation tools make it possible to incorporate the variability of several input variables in experiments and to evaluate their combined impact on the final output. The article presents a case study on applying Monte Carlo simulation to quality management. Two approaches to determining the cost of poor quality are introduced. The first is retrospective: the cost of poor quality in the production process is calculated from historical data. The second uses the probabilistic characteristics of the input variables by means of simulation, and thus gives a prospective view of the cost of poor quality. Simulation outputs in the form of tornado and sensitivity charts complement the risk analysis.
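As an illustration of the prospective, simulation-based approach described in this abstract, a minimal Monte Carlo sketch in Python; the input distributions for defect rate, production volume, and unit rework cost are invented for illustration, not taken from the case study:

```python
import random

def simulate_poor_quality_cost(n_runs=10_000, seed=42):
    """Monte Carlo estimate of the annual cost of poor quality (CoPQ).

    Each run samples three independent uncertain inputs (all illustrative
    assumptions): defect rate, annual production volume, and unit rework
    cost, then multiplies them into a cost outcome.
    """
    rng = random.Random(seed)
    costs = []
    for _ in range(n_runs):
        defect_rate = rng.triangular(0.01, 0.05, 0.02)   # low, high, mode
        units = rng.gauss(100_000, 5_000)                # annual production
        rework_cost = rng.uniform(8.0, 12.0)             # cost per defective unit
        costs.append(defect_rate * units * rework_cost)
    mean = sum(costs) / len(costs)
    return mean, min(costs), max(costs)

mean, lo, hi = simulate_poor_quality_cost()
```

The resulting sample of cost outcomes is what a tornado or sensitivity chart would then be built from, by varying one input at a time.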

  10. Feature-Based Statistical Analysis of Combustion Simulation Data

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, J; Krishnamoorthy, V; Liu, S; Grout, R; Hawkes, E; Chen, J; Pascucci, V; Bremer, P T

    2011-11-18

    We present a new framework for feature-based statistical analysis of large-scale scientific data and demonstrate its effectiveness by analyzing features from Direct Numerical Simulations (DNS) of turbulent combustion. Turbulent flows are ubiquitous and account for transport and mixing processes in combustion, astrophysics, fusion, and climate modeling, among other disciplines. They are also characterized by coherent structure or organized motion, i.e., nonlocal entities whose geometrical features can directly impact molecular mixing and reactive processes. While traditional multi-point statistics provide correlative information, they lack nonlocal structural information and hence fail to provide mechanistic causality information between organized fluid motion and mixing and reactive processes. Hence, it is of great interest to capture and track flow features and their statistics together with their correlation with relevant scalar quantities, e.g. temperature or species concentrations. In our approach we encode the set of all possible flow features by pre-computing merge trees augmented with attributes, such as statistical moments of various scalar fields, e.g. temperature, as well as length scales computed via spectral analysis. The computation is performed in an efficient streaming manner in a pre-processing step and results in a collection of meta-data that is orders of magnitude smaller than the original simulation data. This meta-data is sufficient to support a fully flexible and interactive analysis of the features, allowing for arbitrary thresholds, providing per-feature statistics, and creating various global diagnostics such as Cumulative Density Functions (CDFs), histograms, or time-series. We combine the analysis with a rendering of the features in a linked-view browser that enables scientists to interactively explore, visualize, and analyze the equivalent of one terabyte of simulation data. We highlight the utility of this new framework for combustion.
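The merge-tree machinery itself is involved, but the core idea of reducing a scalar field to compact per-feature statistics can be illustrated with a much simpler threshold-based segmentation (a hypothetical stand-in for the authors' pipeline, not their method):

```python
def extract_features(field, threshold):
    """Segment a 1-D scalar field into connected regions at or above a
    threshold and report per-feature statistics. A much-simplified
    stand-in for merge-tree-based feature extraction."""
    features, current = [], []
    for i, v in enumerate(field):
        if v >= threshold:
            current.append((i, v))      # extend the current feature
        elif current:
            features.append(current)    # close the feature at a gap
            current = []
    if current:
        features.append(current)
    return [
        {"size": len(f),
         "max": max(v for _, v in f),
         "mean": sum(v for _, v in f) / len(f)}
        for f in features
    ]

# Toy scalar field with two regions above the 0.5 threshold.
field = [0.1, 0.9, 0.8, 0.2, 0.05, 0.7, 0.95, 0.6, 0.1]
stats = extract_features(field, threshold=0.5)
```

Computing such per-feature summaries once, for all thresholds, is exactly what the augmented merge tree provides, at a fraction of the raw data size.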

  11. Performance of Geant4 in simulating semiconductor particle detector response in the energy range below 1 MeV

    Science.gov (United States)

    Soti, G.; Wauters, F.; Breitenfeldt, M.; Finlay, P.; Kraev, I. S.; Knecht, A.; Porobić, T.; Zákoucký, D.; Severijns, N.

    2013-11-01

    Geant4 simulations play a crucial role in the analysis and interpretation of experiments providing low energy precision tests of the Standard Model. This paper focuses on the accuracy of the description of the electron processes in the energy range between 100 and 1000 keV. The effect of the different simulation parameters and multiple scattering models on the backscattering coefficients is investigated. Simulations of the response of HPGe and passivated implanted planar Si detectors to β particles are compared to experimental results. An overall good agreement is found between Geant4 simulations and experimental data.

  12. ASAS: Computational code for Analysis and Simulation of Atomic Spectra

    Directory of Open Access Journals (Sweden)

    Jhonatha R. dos Santos

    2017-01-01

    Full Text Available The laser isotope separation process is based on the principle of selective photoionization, and it is therefore necessary to know the absorption spectrum of the atom of interest. Computational resources have become indispensable for planning experiments and analyzing the acquired data. The ASAS (Analysis and Simulation of Atomic Spectra) software presented here is a helpful tool for studies involving atomic spectroscopy. The simulation input is user-friendly and essentially requires a database containing the energy levels and spectral lines of the atoms to be studied.

  13. Uncertainty quantification and sensitivity analysis with CASL Core Simulator VERA-CS

    International Nuclear Information System (INIS)

    Brown, C.S.; Zhang, Hongbin

    2016-01-01

    VERA-CS (Virtual Environment for Reactor Applications, Core Simulator) is a coupled neutron transport and thermal-hydraulics code under development by the Consortium for Advanced Simulation of Light Water Reactors (CASL). An approach to uncertainty quantification and sensitivity analysis with VERA-CS was developed and a new toolkit was created to perform uncertainty quantification and sensitivity analysis. A 2 × 2 fuel assembly model was developed and simulated by VERA-CS, and uncertainty quantification and sensitivity analysis were performed with fourteen uncertain input parameters. The minimum departure from nucleate boiling ratio (MDNBR), maximum fuel center-line temperature, and maximum outer clad surface temperature were chosen as the selected figures of merit. Pearson, Spearman, and partial correlation coefficients were considered for all of the figures of merit in sensitivity analysis and coolant inlet temperature was consistently the most influential parameter. Parameters used as inputs to the critical heat flux calculation with the W-3 correlation were shown to be the most influential on the MDNBR, maximum fuel center-line temperature, and maximum outer clad surface temperature.
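The sensitivity measures named in this abstract are straightforward to compute from sampled input/output pairs. A minimal Python sketch follows; the data are invented stand-ins for (inlet temperature, MDNBR) samples, not VERA-CS results, and the rank helper assumes no tied values:

```python
def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def spearman(x, y):
    """Spearman correlation: Pearson applied to ranks (no tie handling)."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order):
            r[i] = rank
        return r
    return pearson(ranks(x), ranks(y))

# Invented (inlet temperature [K], MDNBR) samples for illustration only.
t_inlet = [560, 565, 558, 570, 562, 568]
mdnbr = [2.10, 1.95, 2.15, 1.80, 2.05, 1.88]
r_p = pearson(t_inlet, mdnbr)   # strongly negative: hotter inlet, lower margin
r_s = spearman(t_inlet, mdnbr)
```

A large negative coefficient for inlet temperature against MDNBR is the kind of result that would mark it as the most influential input.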

  14. Coupled multiscale simulation and optimization in nanoelectronics

    CERN Document Server

    2015-01-01

    Designing complex integrated circuits relies heavily on mathematical methods and calls for suitable simulation and optimization tools. The current design approach involves simulations and optimizations in different physical domains (device, circuit, thermal, electromagnetic) and in a range of electrical engineering disciplines (logic, timing, power, crosstalk, signal integrity, system functionality). COMSON was a Marie Curie Research Training Network created to meet these new scientific and training challenges by (a) developing new descriptive models that take these mutual dependencies into account, (b) combining these models with existing circuit descriptions in new simulation strategies, and (c) developing new optimization techniques that will accommodate new designs. The book presents the main project results in the fields of PDAE modeling and simulation, model order reduction techniques and optimization, based on merging the know-how of three major European semiconductor companies with the combined expe...

  15. Coarse-graining stochastic biochemical networks: adiabaticity and fast simulations

    Energy Technology Data Exchange (ETDEWEB)

    Nemenman, Ilya [Los Alamos National Laboratory; Sinitsyn, Nikolai [Los Alamos National Laboratory; Hengartner, Nick [Los Alamos National Laboratory

    2008-01-01

    We propose a universal approach for analysis and fast simulations of stiff stochastic biochemical kinetics networks, which rests on elimination of fast chemical species without a loss of information about mesoscopic, non-Poissonian fluctuations of the slow ones. Our approach, which is similar to the Born-Oppenheimer approximation in quantum mechanics, follows from the stochastic path integral representation of the cumulant generating function of reaction events. In applications with a small number of chemical reactions, it produces analytical expressions for cumulants of chemical fluxes between the slow variables. This allows for a low-dimensional, interpretable representation and can be used for coarse-grained numerical simulation schemes with a small computational complexity and yet high accuracy. As an example, we derive the coarse-grained description for a chain of biochemical reactions, and show that the coarse-grained and the microscopic simulations are in agreement, but the coarse-grained simulations are three orders of magnitude faster.
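The microscopic benchmark for such coarse-grained schemes is typically an exact stochastic simulation. Below is a hedged sketch of the standard Gillespie algorithm for a one-species birth-death network; the rate constants are illustrative, not taken from the paper:

```python
import random

def gillespie_birth_death(k_birth, k_death, x0, t_end, seed=1):
    """Exact stochastic simulation (Gillespie SSA) of a birth-death
    process: X -> X+1 at rate k_birth, X -> X-1 at rate k_death * X."""
    rng = random.Random(seed)
    t, x = 0.0, x0
    traj = [(0.0, x0)]
    while t < t_end:
        a_birth = k_birth
        a_death = k_death * x
        a_total = a_birth + a_death
        t += rng.expovariate(a_total)       # waiting time to next reaction
        if rng.random() < a_birth / a_total:
            x += 1                          # birth event
        else:
            x -= 1                          # death event
        traj.append((t, x))
    return traj

# Stationary distribution is Poisson with mean k_birth / k_death = 100.
traj = gillespie_birth_death(k_birth=10.0, k_death=0.1, x0=0, t_end=200.0)
```

Simulating every reaction event this way is what makes stiff networks expensive; eliminating the fast species, as the abstract describes, removes most of these events.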

  16. SOSA – a new model to simulate the concentrations of organic vapours and sulphuric acid inside the ABL – Part 1: Model description and initial evaluation

    DEFF Research Database (Denmark)

    Boy, M.; Sogachev, Andrey; Lauros, J.

    2010-01-01

    Chemistry in the atmospheric boundary layer (ABL) is controlled by complex processes of surface fluxes, flow, turbulent transport, and chemical reactions. We present a new model SOSA (model to simulate the concentration of organic vapours and sulphuric acid) and attempt to reconstruct the emissions...... in the surface layer we were able to get a reasonable description of turbulence and other quantities through the ABL. As a first application of the model, we present vertical profiles of organic compounds and discuss their relation to newly formed particles....

  17. SOSA – a new model to simulate the concentrations of organic vapours and sulphuric acid inside the ABL – Part 1: Model description and initial evaluation

    DEFF Research Database (Denmark)

    Boy, M.; Sogachev, Andrey; Lauros, J.

    2011-01-01

    Chemistry in the atmospheric boundary layer (ABL) is controlled by complex processes of surface fluxes, flow, turbulent transport, and chemical reactions. We present a new model SOSA (model to simulate the concentration of organic vapours and sulphuric acid) and attempt to reconstruct the emissions...... in the surface layer we were able to get a reasonable description of turbulence and other quantities through the ABL. As a first application of the model, we present vertical profiles of organic compounds and discuss their relation to newly formed particles....

  18. Simulation analysis of globally integrated logistics and recycling strategies

    Energy Technology Data Exchange (ETDEWEB)

    Song, S.J.; Hiroshi, K. [Hiroshima Inst. of Tech., Graduate School of Mechanical Systems Engineering, Dept. of In formation and Intelligent Systems Engineering, Hiroshima (Japan)

    2004-07-01

    This paper focuses on the optimal analysis of world-wide recycling activities associated with managing the logistics and production activities of global manufacturing whose activities stretch across national boundaries. The globally integrated logistics and recycling strategies comprise the home country and two free-trade economic blocs, NAFTA and ASEAN, where significant differences are found in production and disassembly costs, tax rates, local content rules, and regulations. An optimal analysis of the globally integrated value chain was developed by applying simulation optimization as a decision-making tool. The simulation model was developed and analyzed using the ProModel package, and the results help to identify some of the conditions required to make well-performing logistics and recycling plans in a globally collaborative manufacturing environment. (orig.)

  19. Simulation and Analysis of Converging Shock Wave Test Problems

    Energy Technology Data Exchange (ETDEWEB)

    Ramsey, Scott D. [Los Alamos National Laboratory; Shashkov, Mikhail J. [Los Alamos National Laboratory

    2012-06-21

    Results and analysis pertaining to the simulation of the Guderley converging shock wave test problem (and associated code verification hydrodynamics test problems involving converging shock waves) in the LANL ASC radiation-hydrodynamics code xRAGE are presented. One-dimensional (1D) spherical and two-dimensional (2D) axi-symmetric geometric setups are utilized and evaluated in this study, as is an instantiation of the xRAGE adaptive mesh refinement capability. For the 2D simulations, a 'Surrogate Guderley' test problem is developed and used to obviate subtleties inherent to the true Guderley solution's initialization on a square grid, while still maintaining a high degree of fidelity to the original problem, and minimally straining the general credibility of associated analysis and conclusions.

  20. Simulator For The Linear Collider (SLIC): A Tool For ILC Detector Simulations

    International Nuclear Information System (INIS)

    Graf, Norman; McCormick, Jeremy

    2006-01-01

    The Simulator for the Linear Collider (SLIC) is a detector simulation program based on the GEANT4 toolkit. It is intended to enable end users to easily model detector concepts by providing the ability to fully describe detectors using plain text files read in by a common executable at runtime. The detector geometry, typically the most complex part of a detector simulation, is described at runtime using the Linear Collider Detector Description (LCDD). This system allows end users to create complex detector geometries in a standard XML format rather than procedural code such as C++. The LCDD system is based on the Geometry Description Markup Language (GDML) from the LHC Applications Group (LCG). The geometry system facilitates the study of different full detector design and their variations. SLIC uses the StdHep format to read input created by event generators and outputs events in the Linear Collider IO (LCIO) format. The SLIC package provides a binding to GEANT4 and many additional commands and features for the end user

  1. Simulator for the Linear Collider (SLIC): a Tool for ILC Detector Simulations

    International Nuclear Information System (INIS)

    Graf, N.; McCormick, J.

    2007-01-01

    The Simulator for the Linear Collider (SLIC) is a detector simulation program based on the GEANT4 toolkit. It is intended to enable end users to easily model detector concepts by providing the ability to fully describe detectors using plain text files read in by a common executable at runtime. The detector geometry, typically the most complex part of a detector simulation, is described at runtime using the Linear Collider Detector Description (LCDD). This system allows end users to create complex detector geometries in a standard XML format rather than procedural code such as C++. The LCDD system is based on the Geometry Description Markup Language (GDML) from the LHC Applications Group (LCG). The geometry system facilitates the study of different full detector design and their variations. SLIC uses the StdHep format to read input created by event generators and outputs events in the Linear Collider IO (LCIO) format. The SLIC package provides a binding to GEANT4 and many additional commands and features for the end user

  2. Modeling, simulation, and design of SAW grating filters

    Science.gov (United States)

    Schwelb, Otto; Adler, E. L.; Slaboszewicz, J. K.

    1990-05-01

    A systematic procedure for modeling, simulating, and designing SAW (surface acoustic wave) grating filters, taking losses into account, is described. Grating structures and IDTs (interdigital transducers) coupling to SAWs are defined by cascadable transmission-matrix building blocks. Driving point and transfer characteristics (immittances) of complex architectures consisting of gratings, transducers, and coupling networks are obtained by chain-multiplying building-block matrices. This modular approach to resonator filter analysis and design combines the elements of lossy filter synthesis with the transmission-matrix description of SAW components. A multipole filter design procedure based on a lumped-element-model approximation of one-pole two-port resonator building blocks is given and the range of validity of this model examined. The software for simulating the performance of SAW grating devices based on this matrix approach is described, and its performance, when linked to the design procedure to form a CAD/CAA (computer-aided design and analysis) multiple-filter design package, is illustrated with a resonator filter design example.
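The chain-multiplication of cascadable building-block matrices described in this abstract can be sketched compactly; the "lossless delay section" below is an idealized stand-in for the paper's grating and transducer blocks, chosen only so the composition property is easy to verify:

```python
import cmath

def matmul2(a, b):
    """Product of two 2x2 matrices given as ((a11, a12), (a21, a22))."""
    return (
        (a[0][0] * b[0][0] + a[0][1] * b[1][0],
         a[0][0] * b[0][1] + a[0][1] * b[1][1]),
        (a[1][0] * b[0][0] + a[1][1] * b[1][0],
         a[1][0] * b[0][1] + a[1][1] * b[1][1]),
    )

def cascade(blocks):
    """Chain-multiply the transmission matrices of cascaded blocks."""
    result = ((1.0, 0.0), (0.0, 1.0))   # identity: the empty cascade
    for m in blocks:
        result = matmul2(result, m)
    return result

def section(theta):
    """Idealized lossless delay section of phase length theta."""
    return ((cmath.exp(1j * theta), 0.0), (0.0, cmath.exp(-1j * theta)))

# Two delay sections compose into one with the summed phase length.
two = cascade([section(0.3), section(0.4)])
one = section(0.7)
```

Real grating and transducer blocks have off-diagonal (reflection/coupling) terms and loss; the cascading mechanics are identical.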

  3. Sensory characterization of a ready-to-eat sweetpotato breakfast cereal by descriptive analysis

    Science.gov (United States)

    Dansby, M. A.; Bovell-Benjamin, A. C.

    2003-01-01

    The sweetpotato [Ipomoea batatas (L.) Lam], an important industry in the United States, has been selected by NASA as a candidate crop to be grown on future long-duration space missions. Raw sweetpotato roots were processed into flour, which was used to formulate a ready-to-eat breakfast cereal (RTEBC). Twelve trained panelists evaluated the sensory attributes of the extruded RTEBC using descriptive analysis. The samples differed significantly, and sensory attributes that could be used to differentiate the appearance, texture, and flavor of sweetpotato RTEBC were described. The data could be used to optimize the RTEBC and to design studies of its consumer acceptance.

  4. Overview description of the base scenario derived from FEP analysis

    International Nuclear Information System (INIS)

    Locke, J.; Bailey, L.

    1998-01-01

    , subsequent evolution and the processes affecting radionuclide transport for the groundwater and gas pathways. This report uses the conceptual models developed from the FEP analysis to present a description of the base scenario, in terms of the processes to be represented in detailed models. This report does not present an assessment of the base scenario, but rather seeks to provide a summary of those features, events and processes that should be represented, at an appropriate level of detail, within numerical models. The requirements for the development of appropriate models for representing the base scenario are described in an underlying report within the model development document suite. (author)

  5. Fusion Engineering Device. Volume II. Design description

    International Nuclear Information System (INIS)

    1981-10-01

    This volume summarizes the design of the FED. It includes a description of the major systems and subsystems, the supporting plasma design analysis, a projected device cost and associated construction schedule, and a description of the facilities to house and support the device. This effort represents the culmination of the FY81 studies conducted at the Fusion Engineering Design Center (FEDC). Unique in these design activities has been the collaborative involvement of the Design Center personnel and numerous resource physicists from the fusion community who have made significant contributions in the physics design analysis as well as the physics support of the engineering design of the major FED systems and components

  6. The PHMC algorithm for simulations of dynamical fermions; 1, description and properties

    CERN Document Server

    Frezzotti, R

    1999-01-01

    We give a detailed description of the so-called Polynomial Hybrid Monte Carlo (PHMC) algorithm. The effects of the correction factor, which is introduced to render the algorithm exact, are discussed, stressing their relevance for the statistical fluctuations and (almost) zero mode contributions to physical observables. We also investigate rounding-error effects and propose several ways to reduce memory requirements.

  7. Analysis of Time Delay Simulation in Networked Control System

    OpenAIRE

    Nyan Phyo Aung; Zaw Min Naing; Hla Myo Tun

    2016-01-01

    The paper presents a PD controller for Networked Control Systems (NCS) with delay. The major challenge in an NCS is the delay of data transmission across the communication network. A comparative performance analysis is carried out for different network-medium delays. In this paper, simulation is carried out on an AC servo motor control system using a CAN bus as the communication network medium. The TrueTime toolbox of MATLAB is used for simulation to analy...
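The effect of a network transmission delay on a PD loop can be sketched in a few lines of discrete-time simulation; the first-order plant and the gains below are illustrative assumptions, not the AC servo model or TrueTime setup from the paper:

```python
def simulate_pd_with_delay(kp=2.0, kd=0.05, delay_steps=3, dt=0.01, steps=2000):
    """PD loop closed over a network that delays the measurement by
    delay_steps samples. Plant: dx/dt = -x + u, stepped by forward Euler.
    With no integral action the loop settles at kp / (1 + kp) of the
    setpoint rather than at the setpoint itself."""
    x, setpoint, prev_err = 0.0, 1.0, 0.0
    buffer = [0.0] * delay_steps          # transport delay of the sensor path
    for _ in range(steps):
        buffer.append(x)
        measured = buffer.pop(0)          # measurement from delay_steps ago
        err = setpoint - measured
        u = kp * err + kd * (err - prev_err) / dt
        prev_err = err
        x += dt * (-x + u)                # forward-Euler plant update
    return x

final = simulate_pd_with_delay()          # settles near 2/3 for these gains
```

Increasing `delay_steps` or the gains eventually destabilizes the loop, which is the trade-off such comparative delay studies quantify.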

  8. A Formal Semantics for Concept Understanding relying on Description Logics

    DEFF Research Database (Denmark)

    Badie, Farshad

    2017-01-01

    In this research, Description Logics (DLs) will be employed for logical description, logical characterisation, logical modelling and ontological description of concept understanding in terminological systems. It’s strongly believed that using a formal descriptive logic could support us in reveali...... logical assumptions whose discovery may lead us to a better understanding of ‘concept understanding’. The Structure of Observed Learning Outcomes (SOLO) model as an appropriate model of increasing complexity of humans’ understanding has supported the formal analysis....

  9. A Formal Semantics for Concept Understanding relying on Description Logics

    DEFF Research Database (Denmark)

    Badie, Farshad

    2017-01-01

    logical assumptions whose discovery may lead us to a better understanding of ‘concept understanding’. The Structure of Observed Learning Outcomes (SOLO) model as an appropriate model of increasing complexity of humans’ understanding has supported the formal analysis.......In this research, Description Logics (DLs) will be employed for logical description, logical characterisation, logical modelling and ontological description of concept understanding in terminological systems. It’s strongly believed that using a formal descriptive logic could support us in revealing...

  10. Abnormal transient analysis by using PWR plant simulator, (2)

    International Nuclear Information System (INIS)

    Naitoh, Akira; Murakami, Yoshimitsu; Yokobayashi, Masao.

    1983-06-01

    This report describes the results of abnormal transient analysis using a PWR plant simulator. The simulator is based on an existing 822 MWe power plant with 3 loops, and is designed to cover a wide range of plant operation from cold shutdown to full power at EOL. In the simulator, malfunctions are provided for abnormal conditions of equipment failures; in this report, 17 malfunctions of the secondary system and 4 malfunctions of the nuclear instrumentation systems were simulated. The abnormal conditions include turbine and generator trips, failures of the condenser, feedwater system and valves, and detector failures for pressure and water level. Furthermore, failures of nuclear instrumentation are included, such as the source range channel, intermediate range channel, and audio counter. Transient behaviors caused by the added malfunctions were reasonable, and detailed information on the dynamic characteristics of the turbine-condenser system was obtained. (author)

  11. Uncertainty and sensitivity analysis in the neutronic parameters generation for BWR and PWR coupled thermal-hydraulic–neutronic simulations

    International Nuclear Information System (INIS)

    Ánchel, F.; Barrachina, T.; Miró, R.; Verdú, G.; Juanas, J.; Macián-Juan, R.

    2012-01-01

    Highlights: ► Best-estimate codes are affected by uncertainty in their methods and models. ► The influence of uncertainty in the macroscopic cross sections is analyzed for BWR and PWR RIA accidents. ► The fast diffusion coefficient, the scattering cross section, and both fission cross sections are the most influential factors. ► The absorption cross sections have very little influence. ► Using a normal pdf yields more “conservative” power peaks than quantifying the uncertainty with a uniform pdf. - Abstract: Best-estimate analysis consists of a coupled thermal-hydraulic and neutronic description of the nuclear system's behavior; uncertainties from both aspects should be included and jointly propagated. This paper presents a study of the influence of uncertainty in the macroscopic neutronic information that describes a three-dimensional core model on the most relevant results of the simulation of a Reactivity Induced Accident (RIA). The analyses of a BWR RIA and a PWR RIA have been carried out with three-dimensional thermal-hydraulic and neutronic models for the coupled systems TRACE-PARCS and RELAP-PARCS. The cross-section information has been generated by the SIMTAB methodology, based on the joint use of CASMO-SIMULATE. The statistically based methodology performs Monte Carlo sampling of the uncertainty in the macroscopic cross sections. The sample size is determined by the characteristics of the tolerance intervals, obtained by applying the Noether–Wilks formulas. A number of simulations equal to the sample size have been carried out, in which the cross sections used by PARCS are directly modified with uncertainty, and non-parametric statistical methods are applied to the resulting sample of output-variable values to determine their tolerance intervals.
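The Noether–Wilks sample-size determination mentioned above follows from the Wilks tolerance-interval formula, which can be evaluated directly. A sketch of the standard one-sided variant (whether the paper uses the one- or two-sided form is not stated here):

```python
import math

def wilks_sample_size(coverage=0.95, confidence=0.95, order=1):
    """Smallest number of code runs n such that the order-th largest of
    the n sampled outputs bounds the `coverage` quantile of the output
    with probability `confidence` (one-sided Wilks tolerance limit)."""
    n = order
    while True:
        # j counts runs below the coverage quantile; the limit holds when
        # at least `order` runs exceed it, i.e. when j <= n - order.
        conf = sum(math.comb(n, j) * coverage**j * (1 - coverage)**(n - j)
                   for j in range(n - order + 1))
        if conf >= confidence:
            return n
        n += 1

n_first = wilks_sample_size()             # classic 95%/95% first-order case
n_second = wilks_sample_size(order=2)     # second order needs more runs
```

For first order the sum reduces to 1 - coverage**n >= confidence, which for the 95%/95% case gives the familiar 59 runs.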

  12. Analysis of Satellite-Based Navigation Signal Reflectometry: Simulations and Observations

    DEFF Research Database (Denmark)

    von Benzon, Hans-Henrik; Høeg, Per; Durgonics, Tibor

    2016-01-01

    on different ocean characteristics. The spectra of the simulated surface reflections are analyzed, and the results from the simulations are compared to measured GPS surface reflections. The measurements were performed using a space-qualified GPS receiver placed on a mountain at the Haleakala observatory...... on the Hawaiian island of Maui. The GPS receiver was during the experiments running in an open-loop configuration. The analysis of both the simulated surface-reflection signals and the measured reflection signals will in general reveal spectral structures of the reflected signals that can lead to extraction...

  13. Econ Simulation Cited as Success

    Science.gov (United States)

    Workman, Robert; Maher, John

    1973-01-01

    A brief description of a computerized economics simulation model which provides students with an opportunity to apply microeconomic principles along with elementary accounting and statistical techniques. (Author/AK)

  14. Description of CORSET: a computer program for quantitative x-ray fluorescence analysis

    International Nuclear Information System (INIS)

    Stohl, F.V.

    1980-08-01

    Quantitative x-ray fluorescence analysis requires a method of correcting for absorption and secondary fluorescence effects due to the sample matrix. The computer program CORSET carries out these corrections without requiring a knowledge of the spectral distribution of the x-ray source, and only requires one standard per element or one standard containing all the elements. Sandia's version of CORSET has been divided into three separate programs to fit Sandia's specific requirements for on-line analysis in a melt facility. The melt facility is used to fabricate new alloys with very variable compositions and requires very rapid analyses during a run. Therefore, the standards must be analyzed several days in advance. Program DAT1 is used to set up a permanent file consisting of all the data related to the standards. Program UNINT is used to set up a permanent file with the intensities, background counts and counting times of the unknowns. Program CORSET uses the files created in UNINT and DAT1 to carry out the analysis. This report contains descriptions, listings, and sample runs for these programs. The accuracy of the analyses carried out with these three programs is about 1 to 2% relative with an elemental concentration of about 10 wt %

  15. Automating the simulator testing and data collection process

    Energy Technology Data Exchange (ETDEWEB)

    Magi, T.; Dimitri-Hakim, R. [L-3 Communications MAPPS Inc., Montreal, Quebec (Canada)

    2012-07-01

    Scenario-based training is a key process in the use of Full Scope Simulators (FSS) for operator training. Scenario-based training can be defined as any set of simulated plant operations performed with a specific training objective in mind. In order to meet this training objective, the ANSI/ANS-3.5-2009 standard requires that certain simulator training scenarios be tested to ensure that they reproduce the expected plant responses, that all plant procedures can be followed, and that scenario-based training objectives can be met. While malfunction testing provided a narrow view of the simulator performance revolving around the malfunction itself, scenario testing provides a broader, overall view. The concept of instructor validation of simulator scenarios to be used for training and evaluation, and oversight of simulator performance during the validation process, work hand-in-hand. This is where Scenario-Based Testing comes into play. With the description of Scenario-Based Testing (SBT) within Nuclear Energy Institute NEI 09-09 white paper and within the ANSI/ANS-3.5-2009 standard, the industry now has a way forward that reduces the regulatory uncertainty. Together, scenario-based testing and scenario-based training combine to produce better simulators which in turn can be used to more effectively and efficiently train new and existing power plant operators. However, they also impose a significant data gathering and analysis burden on FSS users. L-3 MAPPS Orchid Instructor Station (Orchid IS) facilitates this data gathering and analysis by providing features that automate this process with a simple, centralized, easy to use interface. (author)

  16. HANFORD TANK WASTE OPERATIONS SIMULATOR VERSION DESCRIPTION DOCUMENT

    International Nuclear Information System (INIS)

    ALLEN, G.K.

    2003-01-01

    This document describes the software version controls established for the Hanford Tank Waste Operations Simulator (HTWOS). It defines the methods employed to control the configuration of HTWOS; the version of each of the 26 separate modules for version 1.0 of HTWOS; the numbering rules for incrementing the version number of each module; and a requirement to include module version numbers in the documentation of each case's results. Version 1.0 of HTWOS is the first version under formal software version control. HTWOS carries separate revision numbers for each of its 26 modules; individual module version numbers do not reflect the major-release HTWOS configured version number.

  17. The Design and the Formative Evaluation of a Web-Based Course for Simulation Analysis Experiences

    Science.gov (United States)

    Tao, Yu-Hui; Guo, Shin-Ming; Lu, Ya-Hui

    2006-01-01

    Simulation output analysis has received little attention compared to modeling and programming in real-world simulation applications. This is further evidenced by our observation that students and beginners acquire neither adequate knowledge nor relevant experience of simulation output analysis in traditional classroom learning. With…

  18. A Description and Linguistic Analysis of the Tai Khuen Writing System

    Directory of Open Access Journals (Sweden)

    R. Wyn Owen

    2017-03-01

    Full Text Available This article provides a description and linguistic analysis of the Tai Tham script-based orthography of Tai Khuen, a Tai-Kadai language spoken in Eastern Shan State, Myanmar. The language has a long history of writing flowing out of the literary and religious culture nurtured by the Lan Na Kingdom from the 13th century onwards. Comparison of phoneme and grapheme inventories shows that the orthography is well able to represent the spoken language as well as ancient Pali religious texts. Apart from spelling conventions reflecting the etymology of borrowed Pali and Sanskrit morphemes, sound changes over time have also decreased the phonological transparency of the orthography, although it still serves more conservative varieties which preserve distinctions lost elsewhere. Despite the complexities of the script, literacy rates in Khuen are remarkably high for a minority language not taught in the government school system.

  19. Emulation of dynamic simulators with application to hydrology

    Energy Technology Data Exchange (ETDEWEB)

    Machac, David, E-mail: david.machac@eawag.ch [Eawag, Swiss Federal Institute of Aquatic Science and Technology, Department of Systems Analysis, Integrated Assessment and Modelling, 8600 Dübendorf (Switzerland); ETH Zurich, Department of Environmental Systems Science, 8092 Zurich (Switzerland); Reichert, Peter [Eawag, Swiss Federal Institute of Aquatic Science and Technology, Department of Systems Analysis, Integrated Assessment and Modelling, 8600 Dübendorf (Switzerland); ETH Zurich, Department of Environmental Systems Science, 8092 Zurich (Switzerland); Albert, Carlo [Eawag, Swiss Federal Institute of Aquatic Science and Technology, Department of Systems Analysis, Integrated Assessment and Modelling, 8600 Dübendorf (Switzerland)

    2016-05-15

    Many simulation-intensive tasks in the applied sciences, such as sensitivity analysis, parameter inference or real-time control, are hampered by slow simulators. Emulators provide the opportunity of speeding up simulations at the cost of introducing some inaccuracy. An emulator is a fast approximation to a simulator that interpolates between design input–output pairs of the simulator. Increasing the number of design data sets is a computationally demanding way of improving the accuracy of emulation. We investigate the complementary approach of increasing emulation accuracy by including knowledge about the mechanisms of the simulator into the formulation of the emulator. To approximately reproduce the output of dynamic simulators, we consider emulators that are based on a system of linear, ordinary or partial stochastic differential equations with a noise term formulated as a Gaussian process of the parameters to be emulated. This stochastic model is then conditioned on the design data so that it mimics the behavior of the nonlinear simulator as a function of the parameters. The drift terms of the linear model are designed to provide a simplified description of the simulator as a function of its key parameters so that the required corrections by the conditioned Gaussian process noise are as small as possible. The goal of this paper is to compare the gain in accuracy of these emulators by enlarging the design data set and by varying the degree of simplification of the linear model. We apply this framework to a simulator for the shallow water equations in a channel and compare emulation accuracy for emulators based on different spatial discretization levels of the channel and for a standard non-mechanistic emulator. Our results indicate that we have a large gain in accuracy already when using the simplest mechanistic description by a single linear reservoir to formulate the drift term of the linear model. Adding some more reservoirs does not lead to a significant
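    The core of such an emulator, a Gaussian process conditioned on design input–output pairs, can be sketched in a few lines. This is a generic, non-mechanistic emulator of the kind the paper uses as a baseline; the toy simulator, kernel, and parameter values are illustrative assumptions, not taken from the study:

    ```python
    import numpy as np

    def rbf_kernel(a, b, length=0.3, var=1.0):
        # Squared-exponential covariance between 1-D parameter values
        d2 = (a[:, None] - b[None, :]) ** 2
        return var * np.exp(-0.5 * d2 / length**2)

    def emulate(x_design, y_design, x_new, jitter=1e-8):
        # Condition the GP on the design data: posterior mean at new inputs
        K = rbf_kernel(x_design, x_design) + jitter * np.eye(len(x_design))
        K_s = rbf_kernel(x_new, x_design)
        return K_s @ np.linalg.solve(K, y_design)

    # Toy "slow simulator": a function we only evaluate at a few design points
    simulator = lambda x: np.sin(3 * x)
    x_d = np.linspace(0.0, 2.0, 8)              # design inputs
    y_d = simulator(x_d)                        # design outputs
    y_hat = emulate(x_d, y_d, np.array([0.7]))  # fast prediction between points
    ```

    The mechanistic variants discussed in the paper replace this purely data-driven interpolant with a simplified linear model whose residual is modeled by the Gaussian process.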

  20. Computer simulations for the nano-scale

    International Nuclear Information System (INIS)

    Stich, I.

    2007-01-01

    A review of methods for computations for the nano-scale is presented. The paper should provide a convenient starting point into computations for the nano-scale as well as a more in-depth presentation for those already working in the field of atomic/molecular-scale modeling. The argument is divided into chapters covering the methods for description of the (i) electrons, (ii) ions, and (iii) techniques for efficient solving of the underlying equations. A fairly broad view is taken covering the Hartree-Fock approximation, density functional techniques and quantum Monte-Carlo techniques for electrons. The customary quantum chemistry methods, such as post Hartree-Fock techniques, are only briefly mentioned. Description of both classical and quantum ions is presented. The techniques cover Ehrenfest, Born-Oppenheimer, and Car-Parrinello dynamics. The strong and weak points of both principal and technical nature are analyzed. In the second part we introduce a number of applications to demonstrate the different approximations and techniques introduced in the first part. They cover a wide range of applications such as non-simple liquids, surfaces, molecule-surface interactions, applications in nano technology, etc. These more in-depth presentations, while certainly not exhaustive, should provide information on technical aspects of the simulations, typical parameters used, and ways of analysis of the huge amounts of data generated in these large-scale supercomputer simulations. (author)

  1. Thermal site descriptive model. A strategy for the model development during site investigations - version 2

    International Nuclear Information System (INIS)

    Back, Paer-Erik; Sundberg, Jan

    2007-09-01

    This report presents a strategy for describing, predicting and visualising the thermal aspects of the site descriptive model. The strategy is an updated version of an earlier strategy applied in all SDM versions during the initial site investigation phase at the Forsmark and Oskarshamn areas. The previous methodology for thermal modelling did not take the spatial correlation fully into account during simulation. The result was that the variability of thermal conductivity in the rock mass was not sufficiently well described. Experience from earlier thermal SDMs indicated that development of the methodology was required in order to describe the spatial distribution of thermal conductivity in the rock mass in a sufficiently reliable way, taking both variability within rock types and between rock types into account. A good description of the thermal conductivity distribution is especially important for the lower tail. This tail is important for the design of a repository because it affects the canister spacing. The presented approach is developed to be used for final SDM regarding thermal properties, primarily thermal conductivity. Specific objectives for the strategy of thermal stochastic modelling are: Description: statistical description of the thermal conductivity of a rock domain. Prediction: prediction of thermal conductivity in a specific rock volume. Visualisation: visualisation of the spatial distribution of thermal conductivity. The thermal site descriptive model should include the temperature distribution and thermal properties of the rock mass. The temperature is the result of the thermal processes in the repository area. Determination of thermal transport properties can be made using different methods, such as laboratory investigations, field measurements, modelling from mineralogical composition and distribution, modelling from density logging and modelling from temperature logging. The different types of data represent different scales, which has to be

  2. Thermal site descriptive model. A strategy for the model development during site investigations - version 2

    Energy Technology Data Exchange (ETDEWEB)

    Back, Paer-Erik; Sundberg, Jan [Geo Innova AB (Sweden)

    2007-09-15

    This report presents a strategy for describing, predicting and visualising the thermal aspects of the site descriptive model. The strategy is an updated version of an earlier strategy applied in all SDM versions during the initial site investigation phase at the Forsmark and Oskarshamn areas. The previous methodology for thermal modelling did not take the spatial correlation fully into account during simulation. The result was that the variability of thermal conductivity in the rock mass was not sufficiently well described. Experience from earlier thermal SDMs indicated that development of the methodology was required in order to describe the spatial distribution of thermal conductivity in the rock mass in a sufficiently reliable way, taking both variability within rock types and between rock types into account. A good description of the thermal conductivity distribution is especially important for the lower tail. This tail is important for the design of a repository because it affects the canister spacing. The presented approach is developed to be used for final SDM regarding thermal properties, primarily thermal conductivity. Specific objectives for the strategy of thermal stochastic modelling are: Description: statistical description of the thermal conductivity of a rock domain. Prediction: prediction of thermal conductivity in a specific rock volume. Visualisation: visualisation of the spatial distribution of thermal conductivity. The thermal site descriptive model should include the temperature distribution and thermal properties of the rock mass. The temperature is the result of the thermal processes in the repository area. Determination of thermal transport properties can be made using different methods, such as laboratory investigations, field measurements, modelling from mineralogical composition and distribution, modelling from density logging and modelling from temperature logging. The different types of data represent different scales, which has to be

  3. Isotropic-nematic transition in a mixture of hard spheres and hard spherocylinders: scaled particle theory description

    Directory of Open Access Journals (Sweden)

    M.F. Holovko

    2017-12-01

    Full Text Available The scaled particle theory is developed for the description of thermodynamic properties of a mixture of hard spheres and hard spherocylinders. Analytical expressions for the free energy, pressure and chemical potentials are derived. From the minimization of the free energy, a nonlinear integral equation for the orientational singlet distribution function is formulated. An isotropic-nematic phase transition in this mixture is investigated from the bifurcation analysis of this equation. It is shown that with an increase of the concentration of hard spheres, the total packing fraction of the mixture on the phase boundaries slightly increases. The obtained results are compared with computer simulation data.

  4. Modeling and Simulation of Low Voltage Arcs

    NARCIS (Netherlands)

    Ghezzi, L.; Balestrero, A.

    2010-01-01

    Modeling and Simulation of Low Voltage Arcs is an attempt to improve the physical understanding, mathematical modeling and numerical simulation of the electric arcs that are found during current interruptions in low voltage circuit breakers. An empirical description is gained by refined electrical

  5. Analysis and simulation of an electrostatic FN Tandem accelerator

    International Nuclear Information System (INIS)

    Ugarte, Ricardo

    2007-01-01

    An analysis, modeling, and simulation of a positive-ion FN Tandem electrostatic accelerator has been performed. This required a detailed study of all physical components inside the accelerator tank: the terminal control stabilizer (TPS), the corona point, the capacitor pick-off (CPO), and the generating voltmeter (GVM) signals. The parameters of the model were obtained using Prediction Error estimation Methods (PEM) together with classical circuit-analysis techniques. The resulting model was used to check and increase the stability of the terminal voltage with Matlab software tools. The simulation results were compared with the real facility, and the stability of the terminal voltage was successfully improved. The facility belongs to ARN (Argentina) and was originally installed to develop an AMS system. (author)

  6. Discrete event simulation versus conventional system reliability analysis approaches

    DEFF Research Database (Denmark)

    Kozine, Igor

    2010-01-01

    Discrete Event Simulation (DES) environments are rapidly developing and appear to be promising tools for building reliability and risk analysis models of safety-critical systems and human operators. If properly developed, they are an alternative to the conventional human reliability analysis models...... and systems analysis methods such as fault and event trees and Bayesian networks. As one part, the paper describes briefly the author’s experience in applying DES models to the analysis of safety-critical systems in different domains. The other part of the paper is devoted to comparing conventional approaches...
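    As a minimal illustration of the DES style of reliability modeling discussed here, in contrast to static fault or event trees, the sketch below simulates a single repairable component with exponential failure and repair times and estimates its availability from the event history. The rates and horizon are made-up values for illustration:

    ```python
    import heapq, random

    def simulate_availability(fail_rate, repair_rate, horizon, seed=1):
        # Discrete event simulation of one repairable component: alternate
        # exponential up- and down-times and accumulate total uptime.
        rng = random.Random(seed)
        t, up, uptime = 0.0, True, 0.0
        events = [(rng.expovariate(fail_rate), "fail")]   # event queue
        while events:
            time, kind = heapq.heappop(events)
            time = min(time, horizon)
            if up:
                uptime += time - t
            t, up = time, (kind == "repair")
            if t >= horizon:
                break
            rate = repair_rate if kind == "fail" else fail_rate
            nxt = "repair" if kind == "fail" else "fail"
            heapq.heappush(events, (t + rng.expovariate(rate), nxt))
        return uptime / horizon

    # Steady-state availability approaches repair_rate / (fail_rate + repair_rate)
    avail = simulate_availability(fail_rate=0.1, repair_rate=1.0, horizon=10000.0)
    ```

    A conventional analysis would compute this availability analytically; the advantage of the DES formulation is that arbitrary repair policies, dependencies, or operator actions can be added to the event loop without changing the analysis method.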

  7. Description and preliminary evaluation of a diabetes technology simulation course.

    Science.gov (United States)

    Wilson, Rebecca D; Bailey, Marilyn; Boyle, Mary E; Seifert, Karen M; Cortez, Karla Y; Baker, Leslie J; Hovan, Michael J; Stepanek, Jan; Cook, Curtiss B

    2013-11-01

    We aim to provide data on a diabetes technology simulation course (DTSC) that instructs internal medicine residents in the use of continuous subcutaneous insulin infusion (CSII) and continuous glucose monitoring system (CGMS) devices. The DTSC was implemented during calendar year 2012 and conducted in the institution's simulation center. It consisted of a set of prerequisites, a practicum, and completion of a web-based inpatient CSII-ordering simulation. DTSC participants included only those residents in the outpatient endocrinology rotation. Questionnaires were used to determine whether course objectives were met and to assess the satisfaction of residents with the course. Questionnaires were also administered before and after the endocrine rotation to gauge improvement in familiarity with CSII and CGMS technologies. During the first year, 12 of 12 residents in the outpatient endocrinology rotation completed the DTSC. Residents reported that the course objectives were fully met. The mean satisfaction score with the course ranged from 4.0 to 4.9 (maximum, 5), with most variables rated above 4.5. Self-reported familiarity with the operation of CSII and CGMS devices increased significantly in the postrotation survey compared with that on the prerotation survey (both p technologies. In light of these preliminary findings, the course will continue to be offered, with further data accrual. Future work will involve piloting the DTSC approach among other types of providers, such as residents in other specialties or inpatient nursing staff. © 2013 Diabetes Technology Society.

  8. Simulation based analysis of laser beam brazing

    Science.gov (United States)

    Dobler, Michael; Wiethop, Philipp; Schmid, Daniel; Schmidt, Michael

    2016-03-01

    Laser beam brazing is a well-established joining technology in car body manufacturing with main applications in the joining of divided tailgates and the joining of roof and side panels. A key advantage of laser brazed joints is the seam's visual quality, which satisfies the highest requirements. However, the laser beam brazing process is very complex and process dynamics are only partially understood. In order to gain deeper knowledge of the laser beam brazing process, to determine optimal process parameters and to test process variants, a transient three-dimensional simulation model of laser beam brazing is developed. This model takes into account energy input, heat transfer as well as fluid and wetting dynamics that lead to the formation of the brazing seam. A validation of the simulation model is performed by metallographic analysis and thermocouple measurements for different parameter sets of the brazing process. These results show that the multi-physical simulation model not only can be used to gain insight into the laser brazing process but also offers the possibility of process optimization in industrial applications. The model's capabilities in determining optimal process parameters are illustrated using the laser power as an example. Small deviations in the energy input can affect the brazing results significantly. Therefore, the simulation model is used to analyze the effect of the lateral laser beam position on the energy input and the resulting brazing seam.

  9. Analysis by simulation of the disposition of nuclear fuel waste

    International Nuclear Information System (INIS)

    Turek, J.L.

    1980-09-01

    A descriptive simulation model is developed which includes all aspects of nuclear waste disposition. The model is comprised of two systems, the second system orchestrated by GASP IV. A spent fuel generation prediction module is interfaced with the AFR Program Management Information System and a repository scheduling information module. The user is permitted a wide range of options with which to tailor the simulation to any desired storage scenario. The model projects storage requirements through the year 2020. The outputs are evaluations of the impact that alternative decision policies and milestone date changes have on the demand for, the availability of, and the utilization of spent fuel storage capacities. Both graphs and detailed listings are available. These outputs give a comprehensive view of the particular scenario under observation, including the tracking, by year, of each discharge from every reactor. Included within the work is a review of the status of spent fuel disposition based on input data accurate as of August 1980. The results indicate that some temporary storage techniques (e.g., transshipment of fuel and/or additional at-reactor storage pools) must be utilized to prevent reactor shutdowns. These techniques will be required until the 1990's when several AFR facilities, and possibly one repository, can become operational

  10. Development of a multi-media crew-training program for the terminal configured vehicle mission simulator

    Science.gov (United States)

    Rhouck, J. A.; Markos, A. T.

    1980-01-01

    This paper describes the work being done at the National Aeronautics and Space Administration's (NASA) Langley Research Center on the development of a multi-media crew-training program for the Terminal Configured Vehicle (TCV) Mission Simulator. Brief descriptions of the goals and objectives of the TCV Program and of the TCV Mission Simulator are presented. A detailed description of the training program is provided along with a description of the performance of the first group of four commercial pilots to be qualified in the TCV Mission Simulator.

  11. U-10Mo Baseline Fuel Fabrication Process Description

    Energy Technology Data Exchange (ETDEWEB)

    Hubbard, Lance R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Arendt, Christina L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Dye, Daniel F. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Clayton, Christopher K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Lerchen, Megan E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Lombardo, Nicholas J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Lavender, Curt A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Zacher, Alan H. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2017-09-27

    This document provides a description of the U.S. High Power Research Reactor (USHPRR) low-enriched uranium (LEU) fuel fabrication process. This document is intended to be used in conjunction with the baseline process flow diagram (PFD) presented in Appendix A. The baseline PFD is used to document the fabrication process, communicate gaps in technology or manufacturing capabilities, convey alternatives under consideration, and as the basis for a dynamic simulation model of the fabrication process. The simulation model allows for the assessment of production rates, costs, and manufacturing requirements (manpower, fabrication space, numbers and types of equipment, etc.) throughout the lifecycle of the USHPRR program. This document, along with the accompanying PFD, is updated regularly

  12. Unified Modeling Language description of the object-oriented multi-scale adaptive finite element method for Step-and-Flash Imprint Lithography Simulations

    International Nuclear Information System (INIS)

    Paszynski, Maciej; Gurgul, Piotr; Sieniek, Marcin; Pardo, David

    2010-01-01

    In the first part of the paper we present the multi-scale simulation of the Step-and-Flash Imprint Lithography (SFIL), a modern patterning process. The simulation utilizes the hp adaptive Finite Element Method (hp-FEM) coupled with a Molecular Statics (MS) model. Thus, we consider the multi-scale problem, with molecular statics applied in the areas of the mesh where the highest accuracy is required, and the continuous linear elasticity with thermal expansion coefficient applied in the remaining part of the domain. The degrees of freedom from macro-scale element's nodes located on the macro-scale side of the interface have been identified with particles from nano-scale elements located on the nano-scale side of the interface. In the second part of the paper we present a Unified Modeling Language (UML) description of the resulting multi-scale application (hp-FEM coupled with MS). We investigated classical, procedural codes from the point of view of the object-oriented (O-O) programming paradigm. The discovered hierarchical structure of classes and algorithms makes the UML project as independent of the spatial dimension of the problem as possible. The O-O UML project was defined at an abstract level, independent of the programming language used.

  13. Achievement of a training simulator for PWR power plant: application to control parametric studies

    International Nuclear Information System (INIS)

    Salomon-Sigogne, A.

    1982-09-01

    A simulation tool adapted to training tasks is developed. The simulator is described, and the management of a model by NEPTUN X2 is studied. A general description of a 900 MW PWR power station and the modelling of the power station are presented. The results obtained with the FIDIANE version of the simulator are then analyzed [fr

  14. Towards a standard model for research in agent-based modeling and simulation

    Directory of Open Access Journals (Sweden)

    Nuno Fachada

    2015-11-01

    Full Text Available Agent-based modeling (ABM) is a bottom-up modeling approach, where each entity of the system being modeled is uniquely represented as an independent decision-making agent. ABMs are very sensitive to implementation details. Thus, it is very easy to inadvertently introduce changes which modify model dynamics. Such problems usually arise due to the lack of transparency in model descriptions, which constrains how models are assessed, implemented and replicated. In this paper, we present PPHPC, a model which aims to serve as a standard in agent-based modeling research, namely, but not limited to, conceptual model specification, statistical analysis of simulation output, model comparison and parallelization studies. This paper focuses on the first two aspects (conceptual model specification and statistical analysis of simulation output, also providing a canonical implementation of PPHPC. The paper serves as a complete reference to the presented model, and can be used as a tutorial for simulation practitioners who wish to improve the way they communicate their ABMs.
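    The sensitivity to implementation details noted above can be demonstrated even with a toy ABM. The sketch below (a made-up resource-consumption model, not PPHPC itself) makes one such detail explicit: the order in which agents act each step, here randomized with a seeded generator so runs are reproducible:

    ```python
    import random

    class Agent:
        def __init__(self, energy=5):
            self.energy = energy

        def step(self, resources):
            # Independent decision: consume a unit of resource if any remains
            if resources > 0:
                self.energy += 1
                return resources - 1
            self.energy -= 1
            return resources

    def run(n_agents=50, resources=200, steps=20, seed=42):
        rng = random.Random(seed)
        agents = [Agent() for _ in range(n_agents)]
        for _ in range(steps):
            rng.shuffle(agents)   # agent update order: an implementation detail
            for a in agents:
                resources = a.step(resources)
            agents = [a for a in agents if a.energy > 0]  # starved agents die
            resources += 5        # resource regrowth per step
        return len(agents)

    survivors = run()   # deterministic for a fixed seed
    ```

    Changing the shuffle to a fixed iteration order, or removing the seed, changes which agents eat when resources run short, which is exactly the kind of unstated detail a transparent model specification must pin down.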

  15. Simulated, Emulated, and Physical Investigative Analysis (SEPIA) of networked systems.

    Energy Technology Data Exchange (ETDEWEB)

    Burton, David P.; Van Leeuwen, Brian P.; McDonald, Michael James; Onunkwo, Uzoma A.; Tarman, Thomas David; Urias, Vincent E.

    2009-09-01

    This report describes recent progress made in developing and utilizing hybrid Simulated, Emulated, and Physical Investigative Analysis (SEPIA) environments. Many organizations require advanced tools to analyze their information system's security, reliability, and resilience against cyber attack. Today's security analyses utilize real systems such as computers, network routers and other network equipment, computer emulations (e.g., virtual machines) and simulation models separately to analyze the interplay between threats and safeguards. In contrast, this work developed new methods to combine these three approaches to provide integrated hybrid SEPIA environments. Our SEPIA environments enable an analyst to rapidly configure hybrid environments to pass network traffic and perform, from the outside, like real networks. This provides higher fidelity representations of key network nodes while still leveraging the scalability and cost advantages of simulation tools. The result is to rapidly produce large yet relatively low-cost multi-fidelity SEPIA networks of computers and routers that let analysts quickly investigate threats and test protection approaches.

  16. Unsteady, Cooled Turbine Simulation Using a PC-Linux Analysis System

    Science.gov (United States)

    List, Michael G.; Turner, Mark G.; Chen, Jen-Pimg; Remotigue, Michael G.; Veres, Joseph P.

    2004-01-01

    The first stage of the high-pressure turbine (HPT) of the GE90 engine was simulated with a three-dimensional unsteady Navier-Stokes solver, MSU Turbo, which uses source terms to simulate the cooling flows. In addition to the solver, its pre-processor, GUMBO, and a post-processing and visualization tool, Turbomachinery Visual3 (TV3), were run in a Linux environment to carry out the simulation and analysis. The solver was run both with and without cooling. The introduction of cooling flow on the blade surfaces, case, and hub and its effects on both rotor-vane interaction as well as the effects on the blades themselves were the principal motivations for this study. The studies of the cooling flow show the large amount of unsteadiness in the turbine and the corresponding hot streak migration phenomenon. This research on the GE90 turbomachinery has also led to a procedure for running unsteady, cooled turbine analysis on commodity PCs running the Linux operating system.

  17. Simulation tools

    CERN Document Server

    Jenni, F

    2006-01-01

    In the last two decades, simulation tools made a significant contribution to the great progress in development of power electronics. Time to market was shortened and development costs were reduced drastically. Falling costs, as well as improved speed and precision, opened new fields of application. Today, continuous and switched circuits can be mixed. A comfortable number of powerful simulation tools is available. The users have to choose the best suitable for their application. Here a simple rule applies: The best available simulation tool is the tool the user is already used to (provided, it can solve the task). Abilities, speed, user friendliness and other features are continuously being improved—even though they are already powerful and comfortable. This paper aims at giving the reader an insight into the simulation of power electronics. Starting with a short description of the fundamentals of a simulation tool as well as properties of tools, several tools are presented. Starting with simplified models ...

  18. Analytical simulations in the field of two-phase flow

    International Nuclear Information System (INIS)

    Karwat, H.

    1978-01-01

    Power reactors are designed with engineered safeguards to cope with the consequences of possible failures or malfunctions. Experiments are carried out to verify the analytical simulations used in the design of these engineered safeguards. The paper discusses the basis for the verification of the analytical simulations, the requirements of corresponding experiments used to validate the analysis, and the necessary boundary conditions of the experiment as well as of the reactor systems. A detailed description of a typical boundary condition for real reactor systems is shown to be important if experimental observations are to be interpreted correctly. Finally, the question will be addressed whether experiments on a larger scale than 1/1000 or 1/100 are necessary to extrapolate experimental observations to a full-scale reactor situation. (author)

  19. Material control system simulator program reference manual

    Energy Technology Data Exchange (ETDEWEB)

    Hollstien, R.B.

    1978-01-24

    A description is presented of a Material Control System Simulator (MCSS) program for determination of material accounting uncertainty and system response to particular adversary action sequences that constitute plausible material diversion attempts. The program is intended for use in situations where randomness, uncertainty, or interaction of adversary actions and material control system components make it difficult to assess safeguards effectiveness against particular material diversion attempts. Although MCSS may be used independently in the design or analysis of material handling and processing systems, it has been tailored toward the determination of material accountability and the response of material control systems to adversary action sequences.

  20. Material control system simulator program reference manual

    International Nuclear Information System (INIS)

    Hollstien, R.B.

    1978-01-01

    A description is presented of a Material Control System Simulator (MCSS) program for determination of material accounting uncertainty and system response to particular adversary action sequences that constitute plausible material diversion attempts. The program is intended for use in situations where randomness, uncertainty, or interaction of adversary actions and material control system components make it difficult to assess safeguards effectiveness against particular material diversion attempts. Although MCSS may be used independently in the design or analysis of material handling and processing systems, it has been tailored toward the determination of material accountability and the response of material control systems to adversary action sequences

  1. Passion fruit juice with different sweeteners: sensory profile by descriptive analysis and acceptance.

    Science.gov (United States)

    Rocha, Izabela Furtado de Oliveira; Bolini, Helena Maria André

    2015-03-01

    This study evaluated the effect of different sweeteners on the sensory profile, acceptance, and drivers of preference of passion fruit juice samples sweetened with sucrose, aspartame, sucralose, stevia, cyclamate/saccharin blend 2:1, and neotame. Sensory profiling was performed by 12 trained assessors using quantitative descriptive analysis (QDA). Acceptance tests (appearance, aroma, flavor, texture and overall impression) were performed with 124 consumers of tropical fruit juice. Samples with sucrose, aspartame and sucralose showed similar sensory profile (P Passion fruit flavor affected positively and sweet aftertaste affected negatively the acceptance of the samples. Samples sweetened with aspartame, sucralose, and sucrose presented higher acceptance scores for the attributes flavor, texture, and overall impression, with no significant (P passion fruit juice.

  2. Epsilon. A System Description Language

    DEFF Research Database (Denmark)

    Jensen, Kurt; Kyng, Morten

    This paper discusses the use of Petri nets as a semantic tool in the design of languages and in the construction and analysis of system descriptions. The topics treated are: -- Languages based on nets. -- The problem of time in nets. -- Nets and related models. -- Nets and formal semantics...

  3. Innovations in systems engineering and analysis for the simulation of beyond design-basis accidents

    International Nuclear Information System (INIS)

    Frisch, W.; Beraha, D.

    1990-01-01

    An important goal in improving reactor safety is to achieve the most realistic possible computer simulation of beyond-design-basis accidents. This paper presents new developments in ATHLET, further developments (the description of the thermo-fluid-dynamic conditions in the core and cooling circuits during severe accidents in the computer code ATHLET-SA) and extensions (coupling to RALOC). RALOC is a computer code describing the thermodynamic conditions inside the containment during design-basis accidents and accidents involving core meltdown. Further research is dedicated to code acceleration. (DG) [de

  4. Development of a simulation program to study error propagation in the reprocessing input accountancy measurements

    International Nuclear Information System (INIS)

    Sanfilippo, L.

    1987-01-01

    A physical model and a computer program have been developed to simulate all the measurement operations involved in the Isotopic Dilution Analysis technique currently applied in the Volume-Concentration method for the Reprocessing Input Accountancy, together with their errors and uncertainties. The simulator readily addresses a number of problems related to the measurement activities of the plant operator and the inspector. The program, written in Fortran 77, is based on a particular Monte Carlo technique named 'Random Sampling'; a full description of the code is reported.

  5. Choosing a suitable sample size in descriptive sampling

    International Nuclear Information System (INIS)

    Lee, Yong Kyun; Choi, Dong Hoon; Cha, Kyung Joon

    2010-01-01

    Descriptive sampling (DS) is an alternative to crude Monte Carlo sampling (CMCS) in finding solutions to structural reliability problems. It is known to be an effective sampling method in approximating the distribution of a random variable because it uses the deterministic selection of sample values and their random permutation. However, because this method is difficult to apply to complex simulations, the sample size is occasionally determined without thorough consideration. Input sample variability may cause the sample size to change between runs, leading to poor simulation results. This paper proposes a numerical method for choosing a suitable sample size for use in DS. Using this method, one can estimate a more accurate probability of failure in a reliability problem while running a minimal number of simulations. The method is then applied to several examples and compared with CMCS and conventional DS to validate its usefulness and efficiency.
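
    The mechanics of DS, deterministic equal-probability strata mapped through the inverse CDF and then randomly permuted, can be sketched for a toy reliability problem (the limit state, sample size, and seed below are illustrative assumptions, not taken from the paper):

```python
import random
from statistics import NormalDist

def descriptive_sample(n, rng):
    # Deterministic selection: midpoints of n equal-probability strata,
    # mapped through the inverse normal CDF, then randomly permuted.
    values = [NormalDist().inv_cdf((i + 0.5) / n) for i in range(n)]
    rng.shuffle(values)
    return values

def failure_probability_ds(n, seed=0):
    rng = random.Random(seed)
    x1 = descriptive_sample(n, rng)  # two independent standard-normal inputs
    x2 = descriptive_sample(n, rng)
    # Toy limit state: "failure" when x1 + x2 exceeds 3.0.
    return sum(1 for a, b in zip(x1, x2) if a + b > 3.0) / n
```

    Because the marginal sample values are fixed and only their pairing is random, the estimator's variance comes from the permutation alone, which is what makes DS more stable than CMCS at the same sample size.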

  6. Analysis of manufacturing based on object oriented discrete event simulation

    Directory of Open Access Journals (Sweden)

    Eirik Borgen

    1990-01-01

    Full Text Available This paper describes SIMMEK, a computer-based tool for performing analysis of manufacturing systems, developed at the Production Engineering Laboratory, NTH-SINTEF. Its main use will be in the analysis of job shop type manufacturing, but certain facilities make it suitable for FMS as well as production line manufacturing. This type of simulation is very useful in the analysis of any type of change that occurs in a manufacturing system. These changes may be investments in new machines or equipment, a change in layout, a change in product mix, use of late shifts, etc. The effects these changes have on, for instance, the throughput, the amount of WIP (work in process), the costs or the net profit can be analysed, and this can be done before the changes are made, and without disturbing the real system. Unlike other tools for the analysis of manufacturing systems, simulation takes into consideration uncertainty in arrival rates, process and operation times, and machine availability. It also shows the interaction effects that a job which is late in one machine has on the remaining machines in its route through the layout. It is these effects that cause production plans not to be fulfilled completely. SIMMEK is based on discrete event simulation, and the modeling environment is object oriented. The object oriented models are transformed by an object linker into data structures executable by the simulation kernel. The processes of the entity objects, i.e. the products, are broken down into events and put into an event list. The user-friendly graphical modeling environment makes it possible for end users to build models in a quick and reliable way, using terms from manufacturing. Various tests and a check of model logic are helpful functions when testing the validity of the models. Integration with software packages, with business graphics and statistical functions, is convenient in the result presentation phase.
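
    SIMMEK itself is not publicly available, but the event-list mechanics behind this kind of discrete event simulation can be shown with a minimal single-machine sketch (the exponential arrival and service distributions and all parameters are assumptions for illustration):

```python
import heapq
import random

def simulate_shop(n_jobs=200, mean_interarrival=1.0, mean_service=0.8, seed=42):
    rng = random.Random(seed)
    events = []  # event list: (time, kind, job); kind 0 = arrival, 1 = completion
    t = 0.0
    for job in range(n_jobs):
        t += rng.expovariate(1.0 / mean_interarrival)
        heapq.heappush(events, (t, 0, job))
    queue = []          # arrival times of jobs waiting for the machine
    busy = False
    total_wait = 0.0
    finish_times = []
    while events:
        now, kind, job = heapq.heappop(events)
        if kind == 0:                                   # arrival event
            if busy:
                queue.append(now)
            else:
                busy = True
                svc = rng.expovariate(1.0 / mean_service)
                heapq.heappush(events, (now + svc, 1, job))
        else:                                           # completion event
            finish_times.append(now)
            if queue:
                arrived = queue.pop(0)                  # FIFO dispatch
                total_wait += now - arrived
                svc = rng.expovariate(1.0 / mean_service)
                heapq.heappush(events, (now + svc, 1, job))
            else:
                busy = False
    throughput = len(finish_times) / finish_times[-1]
    return throughput, total_wait / n_jobs
```

    Extending the sketch toward a job shop means giving each job a route over several such machines, which is exactly where the late-job interaction effects described above emerge.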

  7. Bridging the gap: linking molecular simulations and systemic descriptions of cellular compartments.

    Directory of Open Access Journals (Sweden)

    Tihamér Geyer

    Full Text Available Metabolic processes in biological cells are commonly either characterized at the level of individual enzymes and metabolites or at the network level. Often these two paradigms are considered as mutually exclusive because concepts from neither side are suited to describe the complete range of scales. Additionally, when modeling metabolic or regulatory cellular systems, often a large fraction of the required kinetic parameters are unknown. This even applies to such simple and extensively studied systems like the photosynthetic apparatus of purple bacteria. Using the chromatophore vesicles of Rhodobacter sphaeroides as a model system, we show that a consistent kinetic model emerges when fitting the dynamics of a molecular stochastic simulation to a set of time dependent experiments even though about two thirds of the kinetic parameters in this system are not known from experiment. Those kinetic parameters that were previously known all came out in the expected range. The simulation model was built from independent protein units composed of elementary reactions processing single metabolites. This pools-and-proteins approach naturally compiles the wealth of available molecular biological data into a systemic model and can easily be extended to describe other systems by adding new protein or nucleic acid types. The automated parameter optimization, performed with an evolutionary algorithm, reveals the sensitivity of the model to the value of each parameter and the relative importances of the experiments used. Such an analysis identifies the crucial system parameters and guides the setup of new experiments that would add most knowledge for a systemic understanding of cellular compartments. The successful combination of the molecular model and the systemic parametrization presented here on the example of the simple machinery for bacterial photosynthesis shows that it is actually possible to combine molecular and systemic modeling. 
    This framework can now ...
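
    The automated parameter optimization can be sketched, in spirit, as a simple evolution strategy that mutates the unknown kinetic parameters and keeps a mutation only when it improves the fit; the loss function and all settings below are illustrative assumptions, not the paper's actual algorithm:

```python
import random

def fit_parameters(loss, start, sigma=0.1, iterations=500, seed=2):
    """(1+1) evolution strategy: accept a mutation only if it lowers the loss."""
    rng = random.Random(seed)
    best_x, best_loss = list(start), loss(start)
    for _ in range(iterations):
        candidate = [x + rng.gauss(0.0, sigma) for x in best_x]
        candidate_loss = loss(candidate)
        if candidate_loss < best_loss:
            best_x, best_loss = candidate, candidate_loss
    return best_x, best_loss

# Toy stand-in for "fit the simulation to time-dependent experiments":
# recover rate constants that reproduce synthetic target observations.
TARGET = [1.0, 2.0, 0.5]

def fit_error(params):
    return sum((p - t) ** 2 for p, t in zip(params, TARGET))
```

    Repeating such runs with individual parameters held fixed is one way to expose the parameter sensitivities the abstract mentions.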

  8. Methodology for Validating Building Energy Analysis Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Judkoff, R.; Wortman, D.; O'Doherty, B.; Burch, J.

    2008-04-01

    The objective of this report was to develop a validation methodology for building energy analysis simulations, collect high-quality, unambiguous empirical data for validation, and apply the validation methodology to the DOE-2.1, BLAST-2MRT, BLAST-3.0, DEROB-3, DEROB-4, and SUNCAT 2.4 computer programs. This report covers background information, literature survey, validation methodology, comparative studies, analytical verification, empirical validation, comparative evaluation of codes, and conclusions.

  9. Challenges in reducing the computational time of QSTS simulations for distribution system analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Deboever, Jeremiah [Georgia Inst. of Technology, Atlanta, GA (United States); Zhang, Xiaochen [Georgia Inst. of Technology, Atlanta, GA (United States); Reno, Matthew J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Broderick, Robert Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Grijalva, Santiago [Georgia Inst. of Technology, Atlanta, GA (United States); Therrien, Francis [CME International T&D, St. Bruno, QC (Canada)

    2017-06-01

    The rapid increase in penetration of distributed energy resources on the electric power distribution system has created a need for more comprehensive interconnection modelling and impact analysis. Unlike conventional scenario-based studies, quasi-static time-series (QSTS) simulations can realistically model time-dependent voltage controllers and the diversity of potential impacts that can occur at different times of year. However, to accurately model a distribution system with all its controllable devices, a yearlong simulation at 1-second resolution is often required, which could take conventional computers 10 to 120 hours of computational time when an actual unbalanced distribution feeder is modeled. This computational burden is a clear limitation to the adoption of QSTS simulations in interconnection studies and for determining optimal control solutions for utility operations. Our ongoing research to improve the speed of QSTS simulation has revealed many unique aspects of distribution system modelling and sequential power flow analysis that make fast QSTS a very difficult problem to solve. In this report, the most relevant challenges in reducing the computational time of QSTS simulations are presented: number of power flows to solve, circuit complexity, time dependence between time steps, multiple valid power flow solutions, controllable element interactions, and extensive accurate simulation analysis.
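
    The "time dependence between time steps" challenge comes from state-carrying controllers: a regulator's tap position at one step is the starting point for the next, so the power flows cannot be solved independently or out of order. A minimal sketch with a linearized voltage model and made-up regulator settings (not any utility's actual parameters):

```python
def qsts(load_profile, v_source=1.0, drop_per_mw=0.05,
         v_low=0.95, v_high=1.05, tap_step=0.00625):
    """Sequential 'power flow' loop: the tap state couples consecutive steps."""
    tap = 0
    voltages, taps = [], []
    for load_mw in load_profile:
        # Crude linearized voltage at the regulated bus.
        v = v_source - drop_per_mw * load_mw + tap_step * tap
        if v < v_low:        # regulator raises voltage for the next step
            tap += 1
        elif v > v_high:     # or lowers it
            tap -= 1
        voltages.append(v)
        taps.append(tap)
    return voltages, taps
```

    Replaying the same loads in a different order generally produces a different tap trajectory, which is why a QSTS study must sweep the whole year sequentially rather than sampling independent snapshots.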

  10. Efficient Analysis of Simulations of the Sun's Magnetic Field

    Science.gov (United States)

    Scarborough, C. W.; Martínez-Sykora, J.

    2014-12-01

    Dynamics in the solar atmosphere, including solar flares, coronal mass ejections, micro-flares and different types of jets, are powered by the evolution of the Sun's intense magnetic field. 3D radiative magnetohydrodynamic (MHD) computer simulations have furthered our understanding of the processes involved: when non-aligned magnetic field lines reconnect, the alteration of the magnetic topology causes stored magnetic energy to be converted into thermal and kinetic energy. Detailed analysis of this evolution entails tracing magnetic field lines, an operation which is not time-efficient on a single processor. By utilizing a graphics card (GPU) to trace lines in parallel, such analysis becomes feasible. We applied our GPU implementation to the most advanced 3D radiative-MHD simulations (Bifrost, Gudiksen et al. 2011) of the solar atmosphere in order to better understand the evolution of the modeled field lines.
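
    Tracing a field line means integrating a position along the local field direction; each seed point is independent, which is what makes the operation map naturally onto many parallel GPU threads. A CPU sketch of one midpoint-rule tracer follows (the analytic 2D test field is an assumption for illustration, not the Bifrost field):

```python
def trace_field_line(b_field, seed, step=0.01, n_steps=628):
    """Advance a point along the unit field direction with the midpoint rule."""
    x, y = seed
    path = [(x, y)]
    for _ in range(n_steps):
        bx, by = b_field(x, y)
        norm = (bx * bx + by * by) ** 0.5 or 1.0
        # Half step to the midpoint, then re-evaluate the field there.
        mx, my = x + 0.5 * step * bx / norm, y + 0.5 * step * by / norm
        bx, by = b_field(mx, my)
        norm = (bx * bx + by * by) ** 0.5 or 1.0
        x, y = x + step * bx / norm, y + step * by / norm
        path.append((x, y))
    return path

def circular(x, y):
    """Test field whose field lines are concentric circles about the origin."""
    return (-y, x)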

  11. Monte Carlo simulations to advance characterisation of landmines by pulsed fast/thermal neutron analysis

    NARCIS (Netherlands)

    Maucec, M.; Rigollet, C.

    The performance of a detection system based on the pulsed fast/thermal neutron analysis technique was assessed using Monte Carlo simulations. The aim was to develop and implement simulation methods, to support and advance the data analysis techniques of the characteristic gamma-ray spectra,

  12. Uncertainty analysis for the assembly and core simulation of BEAVRS at the HZP conditions

    Energy Technology Data Exchange (ETDEWEB)

    Wan, Chenghui [School of Nuclear Science and Technology, Xi’an Jiaotong University, Xi’an 710049 (China); Cao, Liangzhi, E-mail: caolz@mail.xjtu.edu.cn [School of Nuclear Science and Technology, Xi’an Jiaotong University, Xi’an 710049 (China); Wu, Hongchun [School of Nuclear Science and Technology, Xi’an Jiaotong University, Xi’an 710049 (China); Shen, Wei [School of Nuclear Science and Technology, Xi’an Jiaotong University, Xi’an 710049 (China); Canadian Nuclear Safety Commission, Ottawa, Ontario (Canada)

    2017-04-15

    Highlights: • Uncertainty analysis has been completed based on the “two-step” scheme. • Uncertainty analysis has been performed to BEAVRS at HZP. • For lattice calculations, the few-group constant’s uncertainty was quantified. • For core simulation, uncertainties of k{sub eff} and power distributions were quantified. - Abstract: Based on the “two-step” scheme for the reactor-physics calculations, the capability of uncertainty analysis for the core simulations has been implemented in the UNICORN code, an in-house code for the sensitivity and uncertainty analysis of the reactor-physics calculations. Applying the statistical sampling method, the nuclear-data uncertainties can be propagated to the important predictions of the core simulations. The uncertainties of the few-group constants introduced by the uncertainties of the multigroup microscopic cross sections are quantified first for the lattice calculations; the uncertainties of the few-group constants are then propagated to the core multiplication factor and core power distributions for the core simulations. Up to now, our in-house lattice code NECP-CACTI and the neutron-diffusion solver NECP-VIOLET have been implemented in UNICORN for the steady-state core simulations based on the “two-step” scheme. With NECP-CACTI and NECP-VIOLET, the modeling and simulation of the steady-state BEAVRS benchmark problem at the HZP conditions was performed, and the results were compared with those obtained by CASMO-4E. Based on the modeling and simulation, the UNICORN code has been applied to perform the uncertainty analysis for BAEVRS at HZP. The uncertainty results of the eigenvalues and two-group constants for the lattice calculations and the multiplication factor and the power distributions for the steady-state core simulations are obtained and analyzed in detail.

  13. Uncertainty analysis for the assembly and core simulation of BEAVRS at the HZP conditions

    International Nuclear Information System (INIS)

    Wan, Chenghui; Cao, Liangzhi; Wu, Hongchun; Shen, Wei

    2017-01-01

    Highlights: • Uncertainty analysis has been completed based on the “two-step” scheme. • Uncertainty analysis has been performed to BEAVRS at HZP. • For lattice calculations, the few-group constant’s uncertainty was quantified. • For core simulation, uncertainties of k_e_f_f and power distributions were quantified. - Abstract: Based on the “two-step” scheme for the reactor-physics calculations, the capability of uncertainty analysis for the core simulations has been implemented in the UNICORN code, an in-house code for the sensitivity and uncertainty analysis of the reactor-physics calculations. Applying the statistical sampling method, the nuclear-data uncertainties can be propagated to the important predictions of the core simulations. The uncertainties of the few-group constants introduced by the uncertainties of the multigroup microscopic cross sections are quantified first for the lattice calculations; the uncertainties of the few-group constants are then propagated to the core multiplication factor and core power distributions for the core simulations. Up to now, our in-house lattice code NECP-CACTI and the neutron-diffusion solver NECP-VIOLET have been implemented in UNICORN for the steady-state core simulations based on the “two-step” scheme. With NECP-CACTI and NECP-VIOLET, the modeling and simulation of the steady-state BEAVRS benchmark problem at the HZP conditions was performed, and the results were compared with those obtained by CASMO-4E. Based on the modeling and simulation, the UNICORN code has been applied to perform the uncertainty analysis for BAEVRS at HZP. The uncertainty results of the eigenvalues and two-group constants for the lattice calculations and the multiplication factor and the power distributions for the steady-state core simulations are obtained and analyzed in detail.

  14. Tool Support for Parametric Analysis of Large Software Simulation Systems

    Science.gov (United States)

    Schumann, Johann; Gundy-Burlet, Karen; Pasareanu, Corina; Menzies, Tim; Barrett, Tony

    2008-01-01

    The analysis of large and complex parameterized software systems, e.g., systems simulation in aerospace, is very complicated and time-consuming due to the large parameter space, and the complex, highly coupled nonlinear nature of the different system components. Thus, such systems are generally validated only in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. We have addressed the factors deterring such an analysis with a tool to support envelope assessment: we utilize a combination of advanced Monte Carlo generation with n-factor combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. Additional test-cases, automatically generated from models (e.g., UML, Simulink, Stateflow) improve the coverage. The distributed test runs of the software system produce vast amounts of data, making manual analysis impossible. Our tool automatically analyzes the generated data through a combination of unsupervised Bayesian clustering techniques (AutoBayes) and supervised learning of critical parameter ranges using the treatment learner TAR3. The tool has been developed around the Trick simulation environment, which is widely used within NASA. We will present this tool with a GN&C (Guidance, Navigation and Control) simulation of a small satellite system.

  15. 2D Numerical Simulation and Sensitive Analysis of H-Darrieus Wind Turbine

    Directory of Open Access Journals (Sweden)

    Seyed Mohammad E. Saryazdi

    2018-02-01

    Full Text Available Recently, a lot of attention has been devoted to the use of Darrieus wind turbines in urban areas. The aerodynamics of a Darrieus turbine are very complex due to dynamic stall and changing forces on the turbine triggered by changing horizontal angles. In this study, the aerodynamics of H-rotor vertical axis wind turbine (VAWT has been studied using computational fluid dynamics via two different turbulence models. Shear stress transport (SST k-ω turbulence model was used to simulate a 2D unsteady model of the H-Darrieus turbine. In order to complete this simulation, sensitivity analysis of the effective turbine parameters such as solidity factor, airfoil shape, wind velocity and shaft diameter were done. To simulate the flow through the turbine, a 2D simplified computational domain has been generated. Then fine mesh for each case consisting of different turbulence models and dimensions has been generated. Each mesh in this simulation dependent on effective parameters consisted of domain size, mesh quality, time step and total revolution. The sliding mesh method was applied to evaluate the unsteady interaction between the stationary and rotating components. Previous works just simulated turbine, while in our study sensitivity analysis of effective parameters was done. The simulation results closely match the experimental data, providing an efficient and reliable foundation to study wind turbine aerodynamics. This also demonstrates computing the best value of the effective parameter. The sensitivity analysis revealed best value of the effective parameter that could be used in the process of designing turbine. This work provides the first step in developing an accurate 3D aerodynamic modeling of Darrieus wind turbines. Article History: Received :August 19th 2017; Received: December 15th 2017; Accepted: Januari 14th 2018; Available online How to Cite This Article: Saryazdi, S. M. E. and Boroushaki, M. (2018 2D Numerical Simulation and Sensitive

  16. ROSA-IV Large Scale Test Facility (LSTF) system description for second simulated fuel assembly

    International Nuclear Information System (INIS)

    1990-10-01

    The ROSA-IV Program's Large Scale Test Facility (LSTF) is a test facility for integral simulation of thermal-hydraulic response of a pressurized water reactor (PWR) during small break loss-of-coolant accidents (LOCAs) and transients. In this facility, the PWR core nuclear fuel rods are simulated using electric heater rods. The simulated fuel assembly which was installed during the facility construction was replaced with a new one in 1988. The first test with this second simulated fuel assembly was conducted in December 1988. This report describes the facility configuration and characteristics as of this date (December 1988) including the new simulated fuel assembly design and the facility changes which were made during the testing with the first assembly as well as during the renewal of the simulated fuel assembly. (author)

  17. Pyroprocess Deployment Analysis and Remote Accessibility Experiment using Digital Mockup and Simulation

    International Nuclear Information System (INIS)

    Kim, K. H.; Park, H. S.; Kim, S. H.; Choi, C. H.; Lee, H. J.; Park, B. S.; Yoon, G. S.; Kim, K. H.; Kim, H. D.

    2009-11-01

    Nuclear fuel cycle facility that treats with spent fuel must be designed and manufactured a Pyroprcess facility and process with considering a speciality as every process have to be processed remotely. To prevent an unexpected accident under a circumstance that must operate with a remote manipulator after done the Pyroprocess facility, an procedure related Pyroprocess operation and maintenance need to establish it in the early design stage. To develop the simulator that is mixed by 3D modelling and simulation, a system architecture was designed. A full-scale digital mockup with a real pyroprocess facility was designed and manufactured. An inverse kinematics algorithm of remote manipulator was created in order to simulate an accident and repair that could happen in pyroprocess operation and maintenance under a virtual digital mockup environment. Deployment analysis of process devices through a workspace analysis was carried out and Accessibility analysis by using haptic device was examined

  18. Integral Pressurized Water Reactor Simulator Manual

    International Nuclear Information System (INIS)

    2017-01-01

    This publication provides detailed explanations of the theoretical concepts that the simulator users have to know to gain a comprehensive understanding of the physics and technology of integral pressurized water reactors. It provides explanations of each of the simulator screens and various controls that a user can monitor and modify. A complete description of all the simulator features is also provided. A detailed set of exercises is provided in the Exercise Handbook accompanying this publication.

  19. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    Science.gov (United States)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two part model framework characterizes both the demand using a probability distribution for each type of service request as well as enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.

  20. On geometric simulating in nuclear reactor calculations by the Monte-Carlo method

    International Nuclear Information System (INIS)

    Ostashenko, S.V.

    1988-01-01

    Analysis of existing geometric modules makes it possible to reveal their disadvantages and to formulate requirements list, which should be satisfied by any usefull geometry system. Short description of GDL language used for complex reactor systems simulating is given. GDL language applies hierarchical representation scheme to assemblies, which aids to reduce significantly amount of input data. The language is part of GDL geometry system designed for MCU package and implemented on ES computers

  1. A system for designing and simulating particle physics experiments

    International Nuclear Information System (INIS)

    Zelazny, R.; Strzalkowski, P.

    1987-01-01

    In view of the rapid development of experimental facilities and their costs, the systematic design and preparation of particle physics experiments have become crucial. A software system is proposed as an aid for the experimental designer, mainly for experimental geometry analysis and experimental simulation. The following model is adopted: the description of an experiment is formulated in a language (here called XL) and put by its processor in a data base. The language is based on the entity-relationship-attribute approach. The information contained in the data base can be reported and analysed by an analyser (called XA) and modifications can be made at any time. In particular, the Monte Carlo methods can be used in experiment simulation for both physical phenomena in experimental set-up and detection analysis. The general idea of the system is based on the design concept of ISDOS project information systems. The characteristics of the simulation module are similar to those of the CERN Geant system, but some extensions are proposed. The system could be treated as a component of greater, integrated software environment for the design of particle physics experiments, their monitoring and data processing. (orig.)

  2. Simulation-Based Approach to Operating Costs Analysis of Freight Trucking

    Directory of Open Access Journals (Sweden)

    Ozernova Natalja

    2015-12-01

    Full Text Available The article is devoted to the problem of costs uncertainty in road freight transportation services. The article introduces the statistical approach, based on Monte Carlo simulation on spreadsheets, to the analysis of operating costs. The developed model gives an opportunity to estimate operating freight trucking costs under different configuration of cost factors. Important conclusions can be made after running simulations regarding sensitivity to different factors, optimal decisions and variability of operating costs.

  3. KAFEPA-II program users' manual and description

    International Nuclear Information System (INIS)

    Suk, H. C.; Hwang, W.; Kim, B. G.; Sim, K. S.; Heo, Y. H.; Byun, T. S.; Park, G. S.

    1992-04-01

    KAFEPA-II is a computer program for simulating the behaviour of UO 2 fuel elements under normal operating conditions of a CANDU reactor. It computes the one-dimensional temperature distribution and thermal expansion of the fuel pellets. The amount of gas released during irradiation of the fuel is also computed. Thermal expansion and gas pressure inside the fuel element are then used to compute the strains and stresses in the sheath. This document is intended as a user's manual and description for KAFEPA-II. (Author)

  4. The parametrized simulation of electromagnetic showers

    International Nuclear Information System (INIS)

    Peters, S.

    1992-09-01

    The simulation of electromagnetic showers in calorimeters by detailed tracking of all secondary particles is extremely computer time consuming. Without loosing considerably in precision, the use of parametrizations for global shower properties may reduce the computing time by factors of 10 1 to 10 4 , depending on the energy, the degree of parametrization, and the complexity in the material description and the cut off energies in the detailed simulation. To arrive at a high degree of universality, parametrizations of individual electromagnetic showers in homogeneous media are developed, taking the dependence of the shower development on the material into account. In sampling calorimeters, the inhomogeneous material distribution leads to additional effects which can be taken into account by geometry dependent terms in the parametrization of the longitudinal and radial energy density distributions. Comparisons with detailed simulations of homogeneous and sampling calorimeters show very good agreement in the fluctuations, correlations, and signal averages of spatial energy distributions. Verifications of the algorithms for the simulation of the H1 detector are performed using calorimeter test data for different moduls of the H1 liquid argon calorimeter. Special attention has been paid to electron pion separation, which is of great importance for physics analysis. (orig.) [de

  5. Mining skeletal phenotype descriptions from scientific literature.

    Directory of Open Access Journals (Sweden)

    Tudor Groza

    Full Text Available Phenotype descriptions are important for our understanding of genetics, as they enable the computation and analysis of a varied range of issues related to the genetic and developmental bases of correlated characters. The literature contains a wealth of such phenotype descriptions, usually reported as free-text entries, similar to typical clinical summaries. In this paper, we focus on creating and making available an annotated corpus of skeletal phenotype descriptions. In addition, we present and evaluate a hybrid Machine Learning approach for mining phenotype descriptions from free text. Our hybrid approach uses an ensemble of four classifiers and experiments with several aggregation techniques. The best scoring technique achieves an F-1 score of 71.52%, which is close to the state-of-the-art in other domains, where training data exists in abundance. Finally, we discuss the influence of the features chosen for the model on the overall performance of the method.

  6. Comparison of Analysis, Simulation, and Measurement of Wire-to-Wire Crosstalk. Part 1

    Science.gov (United States)

    Bradley, Arthur T.; Yavoich, Brian James; Hodson, Shame M.; Godley, Richard Franklin

    2010-01-01

    In this investigation, we compare crosstalk analysis, simulation, and measurement results for electrically short configurations. Methods include hand calculations, PSPICE simulations, Microstripes transient field solver, and empirical measurement. In total, four representative physical configurations are examined, including a single wire over a ground plane, a twisted pair over a ground plane, generator plus receptor wires inside a cylindrical conduit, and a single receptor wire inside a cylindrical conduit. Part 1 addresses the first two cases, and Part 2 addresses the final two. Agreement between the analysis, simulation, and test data is shown to be very good.

  7. A Descriptive Analysis of Tactical Casualty Care Interventions Performed by Law Enforcement Personnel in the State of Wisconsin, 2010-2015.

    Science.gov (United States)

    Stiles, Chad M; Cook, Christopher; Sztajnkrycer, Matthew D

    2017-06-01

    Introduction Based upon military experience, law enforcement has developed guidelines for medical care during high-threat conditions. The purpose of the current study was to provide a descriptive analysis of reported outcomes of law enforcement medical interventions. This was a descriptive analysis of a convenience sample of cases submitted to the Wisconsin Tactical Medicine Initiative (Wisconsin USA), after the provision of successful patient care, between January 2010 and December 2015. The study was reviewed by the Mayo Foundation Institutional Review Board (Rochester, Minnesota USA) and deemed exempt. Nineteen agencies submitted information during the study period. Of the 56 episodes of care reported, four (7.1%) cases involved care provided to injured officers while 52 (92.9%) involved care to injured civilians, including suspects. In at least two cases, on-going threats existed during the provision of medical care to an injured civilian. Law enforcement rendered care prior to Emergency Medical Services (EMS) arrival in all but two cases. The current case series demonstrates the life-saving potential for law enforcement personnel trained and equipped under current Tactical Combat Casualty Care (TCCC)/ Committee on Tactical Emergency Casualty Care (C-TECC) tactical casualty care guidelines. Although originally developed to save the lives of wounded combat personnel, in the civilian sector, the training appears more likely to save victims rather than law enforcement personnel. Stiles CM , Cook C , Sztajnkrycer MD . A descriptive analysis of tactical casualty care interventions performed by law enforcement personnel in the State of Wisconsin, 2010-2015. Prehosp Disaster Med. 2017;32(3):284-288.

  8. Simulation Approach to Mission Risk and Reliability Analysis, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — It is proposed to develop and demonstrate an integrated total-system risk and reliability analysis approach that is based on dynamic, probabilistic simulation. This...

  9. Physics Detector Simulation Facility Phase II system software description

    International Nuclear Information System (INIS)

    Scipioni, B.; Allen, J.; Chang, C.; Huang, J.; Liu, J.; Mestad, S.; Pan, J.; Marquez, M.; Estep, P.

    1993-05-01

    This paper presents the Physics Detector Simulation Facility (PDSF) Phase II system software. A key element in the design of a distributed computing environment for the PDSF has been the separation and distribution of the major functions. The facility has been designed to support batch and interactive processing, and to incorporate the file and tape storage systems. By distributing these functions, it is often possible to provide higher throughput and resource availability. Similarly, the design is intended to exploit event-level parallelism in an open distributed environment.

  10. Simulation-Based Analysis of Ship Motions in Short-Crested Irregular Seas

    Directory of Open Access Journals (Sweden)

    Kıvanç Ali ANIL

    2017-03-01

    Presenting seakeeping calculation results in forms other than polar diagrams and Cartesian plots is important during the initial and detail design stages of naval platforms, because numerical simulations (time-series data) are needed for the design and validation of the systems on board. These time-series simulations are called "real-time computer experiments". Similar simulation algorithms for ship motions and wave elevation are also used by ship-handling simulators for realistic visualization. The goal of this paper is to create a basis for the simulation-based analysis of ship motions and wave elevation for future design and validation studies, for both the naval platform itself and the systems on board. The focus of this paper is the clarification of the theoretical background of this process, i.e. all formulations required to create and validate a ship motion and wave surface simulation are given in detail. The results of this study may also be used in ship-handling simulators or in helicopter landing-on-ship simulations.

  11. Statistical analysis and Monte Carlo simulation of growing self-avoiding walks on percolation

    Energy Technology Data Exchange (ETDEWEB)

    Zhang Yuxia [Department of Physics, Wuhan University, Wuhan 430072 (China); Sang Jianping [Department of Physics, Wuhan University, Wuhan 430072 (China); Department of Physics, Jianghan University, Wuhan 430056 (China); Zou Xianwu [Department of Physics, Wuhan University, Wuhan 430072 (China)]. E-mail: xwzou@whu.edu.cn; Jin Zhunzhi [Department of Physics, Wuhan University, Wuhan 430072 (China)

    2005-09-26

    The two-dimensional growing self-avoiding walk on percolation was investigated by statistical analysis and Monte Carlo simulation. We obtained expressions for the mean square displacement and the effective exponent as functions of time and percolation probability by statistical analysis, and compared them with simulations. We also obtained a reduced time that scales the motion of walkers in growing self-avoiding walks on regular and percolation lattices.
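
The kind of Monte Carlo experiment described here can be sketched in a few lines: generate a site-percolation lattice, grow self-avoiding walks on the occupied sites, and measure the mean square displacement. Lattice size, occupation probability, and walk counts below are illustrative assumptions, not the paper's parameters.

```python
import random

def percolation_lattice(n, p, rng):
    """n x n site-percolation lattice; True marks an occupied site."""
    return [[rng.random() < p for _ in range(n)] for _ in range(n)]

def gsaw(lattice, start, max_steps, rng):
    """Growing self-avoiding walk: at each step move uniformly at random
    to an occupied, not-yet-visited nearest neighbour; stop if trapped."""
    n = len(lattice)
    pos = start
    visited = {start}
    path = [start]
    for _ in range(max_steps):
        x, y = pos
        nbrs = [(x + dx, y + dy)
                for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
                if 0 <= x + dx < n and 0 <= y + dy < n
                and lattice[x + dx][y + dy]
                and (x + dx, y + dy) not in visited]
        if not nbrs:
            break  # the walker is trapped
        pos = rng.choice(nbrs)
        visited.add(pos)
        path.append(pos)
    return path

def mean_square_displacement(paths, t):
    """<R^2(t)> averaged over walks that survived at least t steps."""
    sq = [(p[t][0] - p[0][0]) ** 2 + (p[t][1] - p[0][1]) ** 2
          for p in paths if len(p) > t]
    return sum(sq) / len(sq)

rng = random.Random(42)
n = 81
walks = []
for _ in range(300):
    lat = percolation_lattice(n, 0.8, rng)  # p above the 2D site threshold
    lat[n // 2][n // 2] = True              # make sure the start is occupied
    walks.append(gsaw(lat, (n // 2, n // 2), 150, rng))
msd10 = mean_square_displacement(walks, 10)
```

Plotting `mean_square_displacement` against `t` for several occupation probabilities would reproduce the kind of effective-exponent analysis the abstract describes.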

  12. Small-signal modelling and analysis of switching converters using MATLAB

    NARCIS (Netherlands)

    Duarte, J.L.

    1998-01-01

    A general procedure for the description of power electronic circuit dynamics is proposed, intended for control system design and discrete-time system simulation. The approach is especially suited for use with computer-aided analysis and synthesis software packages such as MATLAB.

  13. Simulation analysis of cascade controller for DC-DC bank converter

    International Nuclear Information System (INIS)

    Mahar, M.A.; Abro, M.R.; Larik, A.S.

    2009-01-01

    Power electronic converters are periodic variable-structure systems due to their switched operation. During the last few decades several new dc-dc converter topologies have emerged. The buck converter, being simple in topology, has recently drawn the attention of many researchers. Basically, a buck converter is a highly underdamped system. In order to overcome the oscillations that develop in the output of this converter, various control techniques have been proposed. However, these techniques suffer from many drawbacks. This paper focuses on a cascade-controller-based buck topology. A steady-state analysis is given, showing the output voltage and inductor current in detail. A dynamic analysis for line and load variation is also presented. The buck topology is implemented and simulated in MATLAB/Simulink, and the simulated results are presented. (author)

  14. Automated analysis for detecting beams in laser wakefield simulations

    International Nuclear Information System (INIS)

    Ushizima, Daniela M.; Rubel, Oliver; Prabhat, Mr.; Weber, Gunther H.; Bethel, E. Wes; Aragon, Cecilia R.; Geddes, Cameron G.R.; Cormier-Michel, Estelle; Hamann, Bernd; Messmer, Peter; Hagen, Hans

    2008-01-01

    Laser wakefield particle accelerators have shown the potential to generate electric fields thousands of times higher than those of conventional accelerators. The resulting extremely short particle acceleration distance could yield a potential new compact source of energetic electrons and radiation, with wide applications from medicine to physics. Physicists investigate laser-plasma internal dynamics by running particle-in-cell simulations; however, this generates a large dataset that requires time-consuming, manual inspection by experts in order to detect key features such as beam formation. This paper describes a framework to automate the data analysis and classification of simulation data. First, we propose a new method to identify locations with high density of particles in the space-time domain, based on maximum extremum point detection on the particle distribution. We analyze high-density electron regions using a lifetime diagram by organizing and pruning the maximum extrema as nodes in a minimum spanning tree. Second, we partition the multivariate data using fuzzy clustering to detect time steps in an experiment that may contain a high quality electron beam. Finally, we combine results from fuzzy clustering and bunch lifetime analysis to estimate spatially confined beams. We demonstrate our algorithms successfully on four different simulation datasets.
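
The first step, locating high particle densities via maximum extrema of the distribution, can be caricatured in one dimension. This toy version is only a stand-in for the authors' space-time method with lifetime-diagram pruning:

```python
def local_maxima(density, threshold):
    """Indices of local maxima above threshold in a 1-D density profile."""
    return [i for i in range(1, len(density) - 1)
            if density[i] >= threshold
            and density[i - 1] < density[i] >= density[i + 1]]

# Toy particle-density histogram with two candidate "beam" peaks.
profile = [0, 1, 3, 1, 0, 2, 5, 2, 0]
peaks = local_maxima(profile, 2)  # indices of the two peaks
```

In the paper this idea is applied to a full space-time particle distribution, with the surviving extrema organized in a minimum spanning tree before clustering.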

  15. SED-ML web tools: generate, modify and export standard-compliant simulation studies.

    Science.gov (United States)

    Bergmann, Frank T; Nickerson, David; Waltemath, Dagmar; Scharm, Martin

    2017-04-15

    The Simulation Experiment Description Markup Language (SED-ML) is a standardized format for exchanging simulation studies independently of software tools. We present the SED-ML Web Tools, an online application for creating, editing, simulating and validating SED-ML documents. The Web Tools implement all current SED-ML specifications and, thus, support complex modifications and co-simulation of models in SBML and CellML formats. Ultimately, the Web Tools lower the bar on working with SED-ML documents and help users create valid simulation descriptions. http://sysbioapps.dyndns.org/SED-ML_Web_Tools/ . fbergman@caltech.edu . © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  16. Regional hydrogeological simulations for Forsmark - numerical modelling using CONNECTFLOW. Preliminary site description Forsmark area - version 1.2

    Energy Technology Data Exchange (ETDEWEB)

    Hartley, Lee; Cox, Ian; Hunter, Fiona; Jackson, Peter; Joyce, Steve; Swift, Ben [Serco Assurance, Risley (United Kingdom); Gylling, Bjoern; Marsic, Niko [Kemakta Konsult AB, Stockholm (Sweden)

    2005-05-01

    The Swedish Nuclear Fuel and Waste Management Company (SKB) carries out site investigations in two different candidate areas in Sweden with the objective of describing the in-situ conditions for a bedrock repository for spent nuclear fuel. The site characterisation work is divided into two phases, an initial site investigation phase (IPLU) and a complete site investigation phase (KPLU). The results of IPLU are used as a basis for deciding on a subsequent KPLU phase. On the basis of the KPLU investigations a decision is made as to whether detailed characterisation will be performed (including sinking of a shaft). An integrated component in the site characterisation work is the development of site descriptive models. These comprise basic models in three dimensions with an accompanying text description. Central in the modelling work is the geological model, which provides the geometrical context in terms of a model of deformation zones and the rock mass between the zones. Using the geological and geometrical description models as a basis, descriptive models for other geo-disciplines (hydrogeology, hydro-geochemistry, rock mechanics, thermal properties and transport properties) will be developed. Great care is taken to arrive at a general consistency in the description of the various models and assessment of uncertainty and possible needs of alternative models. Here, a numerical model is developed on a regional-scale (hundreds of square kilometres) to understand the zone of influence for groundwater flow that affects the Forsmark area. Transport calculations are then performed by particle tracking from a local-scale release area (a few square kilometres) to identify potential discharge areas for the site and using greater grid resolution. The main objective of this study is to support the development of a preliminary Site Description of the Forsmark area on a regional-scale based on the available data of 30 June 2004 and the previous Site Description. 
A more specific

  17. A Method for Functional Task Alignment Analysis of an Arthrocentesis Simulator.

    Science.gov (United States)

    Adams, Reid A; Gilbert, Gregory E; Buckley, Lisa A; Nino Fong, Rodolfo; Fuentealba, I Carmen; Little, Erika L

    2018-05-16

    During simulation-based education, simulators are subjected to procedures composed of a variety of tasks and processes. Simulators should functionally represent a patient in response to the physical action of these tasks. The aim of this work was to describe a method for determining whether a simulator does or does not have sufficient functional task alignment (FTA) to be used in a simulation. Potential performance checklist items were gathered from published arthrocentesis guidelines and aggregated into a performance checklist using Lawshe's method. An expert panel used this performance checklist and an FTA analysis questionnaire to evaluate a simulator's ability to respond to the physical actions required by the performance checklist. Thirteen items, from a pool of 39, were included on the performance checklist. Experts had mixed reviews of the simulator's FTA and its suitability for use in simulation. Unexpectedly, some positive FTA was found for several tasks where the simulator lacked functionality. By developing a detailed list of specific tasks required to complete a clinical procedure, and surveying experts on the simulator's response to those actions, educators can gain insight into the simulator's clinical accuracy and suitability. Unexpected positive FTA ratings for functional deficits suggest that further revision of the survey method is required.

  18. Theoretical and Empirical Descriptions of Thermospheric Density

    Science.gov (United States)

    Solomon, S. C.; Qian, L.

    2004-12-01

    The longest-term and most accurate overall description of the density of the upper thermosphere is provided by analysis of changes in the ephemerides of Earth-orbiting satellites. Empirical models of the thermosphere developed in part from these measurements can do a reasonable job of describing thermospheric properties on a climatological basis, but the promise of first-principles global general circulation models of the coupled thermosphere/ionosphere system is that a true high-resolution, predictive capability may ultimately be developed for thermospheric density. However, several issues are encountered when attempting to tune such models so that they accurately represent absolute densities as a function of altitude, and their changes on solar-rotational and solar-cycle time scales. Among these are the crucial ones of getting the heating rates (from both solar and auroral sources) right, getting the cooling rates right, and establishing the appropriate boundary conditions. However, there are several ancillary issues as well, such as the problem of registering a pressure-coordinate model onto an altitude scale, and dealing with possible departures from hydrostatic equilibrium in empirical models. Thus, tuning a theoretical model to match empirical climatology may be difficult, even in the absence of high temporal or spatial variation of the energy sources. We will discuss some of the challenges involved, and show comparisons of simulations using the NCAR Thermosphere-Ionosphere-Electrodynamics General Circulation Model (TIE-GCM) to empirical model estimates of neutral thermosphere density and temperature. We will also show some recent simulations using measured solar irradiance from the TIMED/SEE instrument as input to the TIE-GCM.

  19. Distinguishing Features and Similarities Between Descriptive Phenomenological and Qualitative Description Research.

    Science.gov (United States)

    Willis, Danny G; Sullivan-Bolyai, Susan; Knafl, Kathleen; Cohen, Marlene Z

    2016-09-01

    Scholars who research phenomena of concern to the discipline of nursing are challenged with making wise choices about different qualitative research approaches. Ultimately, they want to choose an approach that is best suited to answer their research questions. Such choices are predicated on having made distinctions between qualitative methodology, methods, and analytic frames. In this article, we distinguish two qualitative research approaches widely used for descriptive studies: descriptive phenomenological and qualitative description. Providing a clear basis that highlights the distinguishing features and similarities between descriptive phenomenological and qualitative description research will help students and researchers make more informed choices in deciding upon the most appropriate methodology in qualitative research. We orient the reader to distinguishing features and similarities associated with each approach and the kinds of research questions descriptive phenomenological and qualitative description research address. © The Author(s) 2016.

  20. The DART general equilibrium model: A technical description

    OpenAIRE

    Springer, Katrin

    1998-01-01

    This paper provides a technical description of the Dynamic Applied Regional Trade (DART) General Equilibrium Model. The DART model is a recursive dynamic, multi-region, multi-sector computable general equilibrium model. All regions are fully specified and linked by bilateral trade flows. The DART model can be used to project economic activities, energy use and trade flows for each of the specified regions to simulate various trade policy as well as environmental policy scenarios, and to analy...

  1. Modeling and simulation for RF system design

    CERN Document Server

    Frevert, Ronny; Jancke, Roland; Knöchel, Uwe; Schwarz, Peter; Kakerow, Ralf; Darianian, Mohsen

    2005-01-01

    Focusing on RF specific modeling and simulation methods, and system and circuit level descriptions, this work contains application-oriented training material. Accompanied by a CD- ROM, it combines the presentation of a mixed-signal design flow, an introduction into VHDL-AMS and Verilog-A, and the application of commercially available simulators.

  2. Nuclear magnetic resonance provides a quantitative description of protein conformational flexibility on physiologically important time scales.

    Science.gov (United States)

    Salmon, Loïc; Bouvignies, Guillaume; Markwick, Phineus; Blackledge, Martin

    2011-04-12

    A complete description of biomolecular activity requires an understanding of the nature and the role of protein conformational dynamics. In recent years, novel nuclear magnetic resonance-based techniques that provide hitherto inaccessible detail concerning biomolecular motions occurring on physiologically important time scales have emerged. Residual dipolar couplings (RDCs) provide precise information about time- and ensemble-averaged structural and dynamic processes with correlation times up to the millisecond and thereby encode key information for understanding biological activity. In this review, we present the application of two very different approaches to the quantitative description of protein motion using RDCs. The first is purely analytical, describing backbone dynamics in terms of diffusive motions of each peptide plane, using extensive statistical analysis to validate the proposed dynamic modes. The second is based on restraint-free accelerated molecular dynamics simulation, providing statistically sampled free energy-weighted ensembles that describe conformational fluctuations occurring on time scales from pico- to milliseconds, at atomic resolution. Remarkably, the results from these two approaches converge closely in terms of distribution and absolute amplitude of motions, suggesting that this kind of combination of analytical and numerical models is now capable of providing a unified description of protein conformational dynamics in solution.

  3. XIMPOL: a new x-ray polarimetry observation-simulation and analysis framework

    Science.gov (United States)

    Omodei, Nicola; Baldini, Luca; Pesce-Rollins, Melissa; di Lalla, Niccolò

    2017-08-01

    We present a new simulation framework, XIMPOL, based on the Python programming language and the SciPy stack, specifically developed for X-ray polarimetric applications. XIMPOL is not tied to any specific mission or instrument design and is meant to produce fast and yet realistic observation-simulations, given as basic inputs: (i) an arbitrary source model including morphological, temporal, spectral and polarimetric information, and (ii) the response functions of the detector under study, i.e., the effective area, the energy dispersion, the point-spread function and the modulation factor. The format of the response files is OGIP compliant, and the framework has the capability of producing output files that can be directly fed into the standard visualization and analysis tools used by the X-ray community, including XSPEC, making it a useful tool not only for simulating physical systems but also for developing and testing end-to-end analysis chains.

  4. Collaborative Classroom Simulation (CCS): An Innovative Pedagogy Using Simulation in Nursing Education.

    Science.gov (United States)

    Berndt, Jodi; Dinndorf-Hogenson, Georgia; Herheim, Rena; Hoover, Carrie; Lanc, Nicole; Neuwirth, Janet; Tollefson, Bethany

    2015-01-01

    Collaborative Classroom Simulation (CCS) is a pedagogy designed to provide a simulation learning experience for a classroom of students simultaneously through the use of unfolding case scenarios. The purpose of this descriptive study was to explore the effectiveness of CCS based on student perceptions. Baccalaureate nursing students (n = 98) participated in the study by completing a survey after participation in the CCS experience. Opportunities for collaboration, clinical judgment, and participation as both observer and active participant were seen as strengths of the experience. Developed as a method to overcome barriers to simulation, CCS was shown to be an effective active learning technique that may prove to be sustainable.

  5. Model extension and improvement for simulator-based software safety analysis

    Energy Technology Data Exchange (ETDEWEB)

    Huang, H.-W. [Department of Engineering and System Science, National Tsing Hua University (NTHU), 101 Section 2 Kuang Fu Road, Hsinchu, Taiwan (China) and Institute of Nuclear Energy Research (INER), No. 1000 Wenhua Road, Chiaan Village, Longtan Township, Taoyuan County 32546, Taiwan (China)]. E-mail: hwhwang@iner.gov.tw; Shih Chunkuan [Department of Engineering and System Science, National Tsing Hua University (NTHU), 101 Section 2 Kuang Fu Road, Hsinchu, Taiwan (China); Yih Swu [Department of Computer Science and Information Engineering, Ching Yun University, 229 Chien-Hsin Road, Jung-Li, Taoyuan County 320, Taiwan (China); Chen, M.-H. [Institute of Nuclear Energy Research (INER), No. 1000Wenhua Road, Chiaan Village, Longtan Township, Taoyuan County 32546, Taiwan (China); Lin, J.-M. [Taiwan Power Company (TPC), 242 Roosevelt Road, Section 3, Taipei 100, Taiwan (China)

    2007-05-15

    One of the major concerns when employing digital I&C systems in nuclear power plants is that digital systems may introduce new failure modes, which differ from those of previous analog I&C systems. Various techniques are under development to analyze hazards originating from software faults in digital systems. Preliminary hazard analysis, failure modes and effects analysis, and fault tree analysis are the most extensively used techniques. However, these are static analysis methods that cannot capture dynamic behaviour or the interactions among systems. This research utilizes the 'simulator/plant model testing' technique classified in IEEE Std 7-4.3.2-2003 (IEEE Standard for Digital Computers in Safety Systems of Nuclear Power Generating Stations) to identify hazards which might be induced by nuclear I&C software defects. The recirculation flow system, control rod system, feedwater system, steam line model, dynamic power-core flow map, and related control systems of the PCTran-ABWR model were successfully extended and improved. The benchmark against the ABWR SAR shows that this modified model is capable of accomplishing dynamic, system-level software safety analysis and performs better than the static methods. This improved plant simulation can further be applied to hazard analysis of operator/digital I&C interface interaction failures and to hardware-in-the-loop fault injection studies.

  6. A verification study and trend analysis of simulated boundary layer wind fields over Europe

    Energy Technology Data Exchange (ETDEWEB)

    Lindenberg, Janna

    2011-07-01

    Simulated wind fields from regional climate models (RCMs) are increasingly used as a surrogate for observations, which are costly and prone to homogeneity deficiencies. Compounding the problem, a lack of reliable observations makes the validation of the simulated wind fields a non-trivial exercise. Whilst the literature shows that RCMs tend to underestimate strong winds over land, these investigations mainly relied on comparisons with near-surface measurements and extrapolated model wind fields. In this study a new approach is proposed, using measurements from high towers and a robust validation process. Tower-height wind data are smoother and thus more representative of regional winds. As a benefit, this approach circumvents the need to extrapolate simulated wind fields. The performance of two models using different downscaling techniques is evaluated. The influence of the boundary conditions on the simulation of wind statistics is investigated. Both models demonstrate a reasonable performance over flat homogeneous terrain and deficiencies over complex terrain, such as the Upper Rhine Valley, due to a too coarse spatial resolution (~50 km). When the spatial resolution is increased to 10 and 20 km, respectively, a benefit is found for the simulation of the wind direction only. A sensitivity analysis shows major deviations between international land cover datasets. A time series analysis of dynamically downscaled simulations is conducted. While the annual cycle and the interannual variability are well simulated, the models are less effective at simulating small-scale fluctuations and the diurnal cycle. The hypothesis that strong winds are underestimated by RCMs is supported by means of a storm analysis. Only two-thirds of the observed storms are simulated by the model using a spectral nudging approach. In addition, 'False Alarms' are simulated which are not detected in the observations. A trend analysis over the period 1961-2000 is conducted

  7. [Recontextualization of nursing clinical simulation based on Basil Bernstein: semiology of pedagogical practice].

    Science.gov (United States)

    dos Santos, Mateus Casanova; Leite, Maria Cecília Lorea; Heck, Rita Maria

    2010-12-01

    This is an investigative case study of a descriptive and participative character, based on an educational experience with simulation in nursing as a learning trigger. It was carried out during the second semester of the first cycle of the Faculdade de Enfermagem (FEN), Universidade Federal de Pelotas (UFPel). The aim is to study the recontextualization of the pedagogical practice of simulation in light of the theories developed by Basil Bernstein, a sociologist of education, and to contribute to the improvement of educational planning and, especially, the evaluation of the learning trigger. The research shows that Bernstein's theory is a powerful semiotic tool for analysing pedagogical practices, one that contributes to the planning and analysis of this curricular educational device.

  8. Critical slowing down and error analysis in lattice QCD simulations

    Energy Technology Data Exchange (ETDEWEB)

    Schaefer, Stefan [Humboldt-Universitaet, Berlin (Germany). Inst. fuer Physik; Sommer, Rainer; Virotta, Francesco [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC

    2010-09-15

    We study the critical slowing down towards the continuum limit of lattice QCD simulations with Hybrid Monte Carlo type algorithms. In particular for the squared topological charge we find it to be very severe with an effective dynamical critical exponent of about 5 in pure gauge theory. We also consider Wilson loops which we can demonstrate to decouple from the modes which slow down the topological charge. Quenched observables are studied and a comparison to simulations of full QCD is made. In order to deal with the slow modes in the simulation, we propose a method to incorporate the information from slow observables into the error analysis of physical observables and arrive at safer error estimates. (orig.)
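
The error-analysis idea hinges on the integrated autocorrelation time of the Monte Carlo history. A generic windowed estimator, sketched below, is not the authors' specific method (which additionally feeds information from slow observables into the error estimate), but shows the basic quantity involved:

```python
import random

def autocorrelation(series, t):
    """Normalized autocorrelation rho(t) of a Monte Carlo time series."""
    n = len(series)
    mean = sum(series) / n
    var = sum((v - mean) ** 2 for v in series) / n
    cov = sum((series[i] - mean) * (series[i + t] - mean)
              for i in range(n - t)) / (n - t)
    return cov / var

def tau_int(series, window):
    """Integrated autocorrelation time, summed up to a fixed window."""
    return 0.5 + sum(autocorrelation(series, t) for t in range(1, window + 1))

def error_estimate(series, window):
    """Naive standard error of the mean inflated by 2*tau_int."""
    n = len(series)
    mean = sum(series) / n
    var = sum((v - mean) ** 2 for v in series) / (n - 1)
    return (2 * tau_int(series, window) * var / n) ** 0.5

# Demo on an AR(1) chain with coupling 0.9, whose exact integrated
# autocorrelation time is (1 + 0.9) / (2 * (1 - 0.9)) = 9.5.
rng = random.Random(3)
series = [0.0]
for _ in range(20000):
    series.append(0.9 * series[-1] + rng.gauss(0.0, 1.0))
tau = tau_int(series, 60)
```

For an observable coupled to the topological charge, tau_int grows rapidly towards the continuum limit, which is exactly why the naive error would be badly underestimated without such a correction.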

  9. Critical slowing down and error analysis in lattice QCD simulations

    International Nuclear Information System (INIS)

    Schaefer, Stefan; Sommer, Rainer; Virotta, Francesco

    2010-09-01

    We study the critical slowing down towards the continuum limit of lattice QCD simulations with Hybrid Monte Carlo type algorithms. In particular for the squared topological charge we find it to be very severe with an effective dynamical critical exponent of about 5 in pure gauge theory. We also consider Wilson loops which we can demonstrate to decouple from the modes which slow down the topological charge. Quenched observables are studied and a comparison to simulations of full QCD is made. In order to deal with the slow modes in the simulation, we propose a method to incorporate the information from slow observables into the error analysis of physical observables and arrive at safer error estimates. (orig.)

  10. Tube Bulge Process : Theoretical Analysis and Finite Element Simulations

    International Nuclear Information System (INIS)

    Velasco, Raphael; Boudeau, Nathalie

    2007-01-01

    This paper is focused on the determination of the mechanical characteristics of tubular materials using the tube bulge process. A comparative study is made between two different models: a theoretical model and a finite element analysis. The theoretical model is fully developed, based first on a geometrical analysis of the tube profile during bulging, which is assumed to deform in arcs of circles. Strain and stress analyses complete the theoretical model, which makes it possible to evaluate the tube thickness and the state of stress at any point of the free bulge region. Free bulging of a 304L stainless steel is simulated using Ls-Dyna 970. To validate the FE simulation approach, a comparison between the theoretical and finite element models is carried out on several parameters, such as: thickness variation at the free bulge region pole with bulge height, tube thickness variation with the z axial coordinate, and von Mises stress variation with plastic strain. Finally, the influence of deviations in geometrical parameters on the flow stress curve is examined using the analytical model: deviations of the tube outer diameter, its initial thickness, and the bulge height measurement are taken into account to obtain the resulting error on plastic strain and von Mises stress.
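
The arc-of-circle geometrical analysis can be illustrated with a small calculation: given the die-opening length and the bulge height, the meridian radius follows from chord-and-sagitta geometry, and a volume-constancy argument gives a rough pole thickness. This is a simplified sketch of the modelling idea, not the paper's full strain/stress model; all dimensions are hypothetical.

```python
import math

def bulge_pole_state(r0, t0, L, h):
    """Arc-of-circle estimate of the tube wall state at the bulge pole.

    r0, t0 : initial tube radius and wall thickness
    L      : axial length of the free bulge region (die opening)
    h      : bulge height measured at the pole
    Returns (meridian radius of curvature, estimated pole thickness),
    assuming a circular-arc profile and plastic incompressibility.
    """
    R = L ** 2 / (8 * h) + h / 2     # circle radius from chord L and sagitta h
    theta = math.asin(L / (2 * R))   # half-opening angle of the arc
    lam_z = 2 * R * theta / L        # meridional stretch = arc length / chord
    lam_t = (r0 + h) / r0            # hoop stretch at the pole
    return R, t0 / (lam_z * lam_t)   # volume constancy: lam_z*lam_t*lam_r = 1

# Hypothetical geometry in mm: 50 mm diameter tube, 1 mm wall,
# 60 mm die opening, 10 mm bulge height.
R, t_pole = bulge_pole_state(25.0, 1.0, 60.0, 10.0)
```

Evaluating such expressions along the whole arc, not only at the pole, is what allows the theoretical model to predict the thickness profile that the FE simulation is checked against.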

  11. FE-Analysis of Stretch-Blow Moulded Bottles Using an Integrative Process Simulation

    Science.gov (United States)

    Hopmann, C.; Michaeli, W.; Rasche, S.

    2011-05-01

    The two-stage stretch-blow moulding process has been established for the large scale production of high quality PET containers with excellent mechanical and optical properties. The total production costs of a bottle are significantly driven by the material costs. Due to this dominant share of the bottle material, the PET industry is interested in reducing the total production costs through optimised material efficiency. However, a reduced material inventory means decreasing wall thicknesses and therewith a reduction of the bottle properties (e.g. mechanical properties, barrier properties). Therefore, there is often a trade-off between a minimal bottle weight and adequate properties of the bottle. In order to achieve these objectives, Computer-Aided Engineering (CAE) techniques can assist the designer of new stretch-blow moulded containers. Hence, tools such as process simulation and structural analysis have become important in the blow moulding sector. The Institute of Plastics Processing (IKV) at RWTH Aachen University, Germany, has developed an integrative three-dimensional process simulation which models the complete path of a preform through a stretch-blow moulding machine. At first, the reheating of the preform is calculated by a thermal simulation. Afterwards, the inflation of the preform to a bottle is calculated by finite element analysis (FEA). The results of this step are e.g. the local wall thickness distribution and the local biaxial stretch ratios. Not only the material distribution but also the material properties that result from the deformation history of the polymer have significant influence on the bottle properties. Therefore, a correlation between the material properties and stretch ratios is considered in an integrative simulation approach developed at IKV. The results of the process simulation (wall thickness, stretch ratios) are transferred to a further simulation program and mapped onto the bottle's FE mesh. This approach allows a local

  12. A short introduction to digital simulations in electrochemistry: simulating the Cottrell experiment in NI LabVIEW

    Directory of Open Access Journals (Sweden)

    Soma Vesztergom

    2018-05-01

    A brief introduction to the use of digital simulations in electrochemistry is given through a detailed description of the simulation of Cottrell's experiment in the LabVIEW programming language. A step-by-step approach is followed, and different simulation techniques (explicit and implicit Euler, Runge–Kutta and Crank–Nicolson methods) are applied. The applied techniques are introduced and discussed on the basis of Padé approximants. The paper may be found useful by undergraduate and graduate students familiarizing themselves with the digital simulation of electrochemical problems, as well as by university lecturers involved in the teaching of theoretical electrochemistry.
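
A minimal sketch of the explicit-Euler technique the abstract describes, written here in Python rather than LabVIEW; the grid dimensions and the stability parameter are illustrative choices. The simulated flux should follow Cottrell's 1/sqrt(t) law at long times.

```python
import math

def simulate_cottrell(n_x=200, n_t=1000, lam=0.45):
    """Explicit (forward) Euler finite-difference simulation of Cottrell's
    experiment in dimensionless form: concentration 1 in the bulk, held at
    0 on the electrode (node 0). lam = D*dt/dx**2 must stay below 0.5 for
    stability. Returns the dimensionless flux at the electrode per step."""
    c = [1.0] * n_x
    c[0] = 0.0
    flux = []
    for _ in range(n_t):
        new = c[:]
        for j in range(1, n_x - 1):
            new[j] = c[j] + lam * (c[j - 1] - 2 * c[j] + c[j + 1])
        c = new
        flux.append(c[1] - c[0])  # two-point gradient at the electrode
    return flux

flux = simulate_cottrell()
# Cottrell's equation predicts current ~ 1/sqrt(t), so flux * sqrt(t)
# should level off to a constant at long times.
k = [flux[t] * math.sqrt(t + 0.5) for t in (200, 400, 800)]
```

The implicit Euler and Crank–Nicolson variants discussed in the paper replace the interior update with a linear solve, trading per-step cost for unconditional stability.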

  13. Discrete processes modelling and geometry description in RTS and T code

    International Nuclear Information System (INIS)

    Degtyarev, I.I.; Liashenko, O.A.; Lokhovitskii, A.E.; Yazynin, I.A.; Belyakov-Bodin, V.I.; Blokhin, A.I.

    2001-01-01

    This paper describes the recent version of the RTS and T code system. RTS and T performs detailed simulations of many types of particles transport in complex 3D geometries in the energy range from a part of eV up to 20 TeV. A description of the main processes is given. (author)

  14. Simulation-based power calculations for planning a two-stage individual participant data meta-analysis.

    Science.gov (United States)

    Ensor, Joie; Burke, Danielle L; Snell, Kym I E; Hemming, Karla; Riley, Richard D

    2018-05-18

    Researchers and funders should consider the statistical power of planned Individual Participant Data (IPD) meta-analysis projects, as they are often time-consuming and costly. We propose simulation-based power calculations utilising a two-stage framework, and illustrate the approach for a planned IPD meta-analysis of randomised trials with continuous outcomes, where the aim is to identify treatment-covariate interactions. The simulation approach has four steps: (i) specify an underlying (data-generating) statistical model for trials in the IPD meta-analysis; (ii) use readily available information (e.g. from publications) and prior knowledge (e.g. the number of studies promising IPD) to specify model parameter values (e.g. control group mean, intervention effect, treatment-covariate interaction); (iii) simulate an IPD meta-analysis dataset of a particular size from the model, and apply a two-stage IPD meta-analysis to obtain the summary estimate of interest (e.g. the interaction effect) and its associated p-value; (iv) repeat the previous step (e.g. thousands of times), then estimate the power to detect a genuine effect as the proportion of summary estimates with a significant p-value. In a planned IPD meta-analysis of lifestyle interventions to reduce weight gain in pregnancy, 14 trials (1183 patients) promised their IPD to examine a treatment-BMI interaction (i.e. whether baseline BMI modifies the intervention effect on weight gain). Using our simulation-based approach, a two-stage IPD meta-analysis was found to be appropriate. Pre-specified adjustment for prognostic factors would increase power further. Incorrect dichotomisation of BMI would reduce power by over 20%, similar to immediately throwing away IPD from ten trials. Simulation-based power calculations could inform the planning and funding of IPD projects, and should be used routinely.
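    Steps (i)-(iv) can be prototyped in a few dozen lines. The sketch below is an illustrative toy, not the authors' code: it assumes a deliberately simple two-stage estimator (the per-trial interaction is the difference between the covariate-outcome slopes of the two arms, pooled by fixed-effect inverse-variance weighting), and every parameter value is an arbitrary assumption.

    ```python
    import math
    import random

    def slope_and_var(z, y):
        """OLS slope of y on z and its sampling variance (simple linear regression)."""
        n = len(z)
        mz, my = sum(z) / n, sum(y) / n
        sxx = sum((a - mz) ** 2 for a in z)
        sxy = sum((a - mz) * (b - my) for a, b in zip(z, y))
        slope = sxy / sxx
        resid_var = (sum((b - my) ** 2 for b in y) - slope * sxy) / (n - 2)
        return slope, resid_var / sxx

    def ipd_power(n_trials=10, n_per_arm=50, interaction=0.5, n_sim=200, seed=3):
        """Steps (i)-(iv) of a simulation-based power calculation, in miniature."""
        rng = random.Random(seed)
        significant = 0
        for _ in range(n_sim):
            weights_sum = weighted_sum = 0.0
            for _ in range(n_trials):                  # step (iii): simulate one IPD dataset
                est_by_arm = []
                for arm_effect in (0.0, interaction):  # control arm, then treated arm
                    z = [rng.gauss(0.0, 1.0) for _ in range(n_per_arm)]
                    y = [0.2 + arm_effect * zi + rng.gauss(0.0, 1.0) for zi in z]
                    est_by_arm.append(slope_and_var(z, y))
                (b0, v0), (b1, v1) = est_by_arm
                gamma_hat, var = b1 - b0, v0 + v1      # first stage: per-trial interaction
                weights_sum += 1.0 / var
                weighted_sum += gamma_hat / var
            pooled = weighted_sum / weights_sum        # second stage: inverse-variance pooling
            if abs(pooled) * math.sqrt(weights_sum) > 1.96:
                significant += 1
        return significant / n_sim                     # step (iv): power estimate
    ```

    Varying the interaction size, the number of trials, or the patients per arm then answers exactly the kind of what-if question this simulation-based approach is intended for.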

  15. A comparative analysis of currently used microscopic and macroscopic traffic simulation software

    International Nuclear Information System (INIS)

    Ratrout Nedal T; Rahman Syed Masiur

    2009-01-01

    The significant advancements of information technology have contributed to the increased development of traffic simulation models. These include microscopic models, and the areas of application have broadened, ranging from the modeling of specific components of the transportation system to whole networks with different kinds of intersections and links, in a few cases even incorporating travel demand models. This paper reviews the features of traditionally used macroscopic and microscopic traffic simulation models, along with a comparative analysis focusing on freeway operations, urban congested networks, project-level emission modeling, and variations in delay and capacity estimates. The models AIMSUN, CORSIM, and VISSIM are found to be suitable for congested arterials and freeways, and for integrated networks of freeways and surface streets. The features of AIMSUN are favorable for creating large urban and regional networks. The models AIMSUN, PARAMICS, INTEGRATION, and CORSIM are potentially useful for Intelligent Transportation Systems (ITS). A few simulation models, such as MITSIMLab, have been developed with a specific focus on ITS. The TRAF-family and HUTSIM models attempt a system-level simulation approach and develop open environments in which several analysis models can be used interactively to solve traffic simulation problems. In Saudi Arabia, use of simulation software capable of analyzing an integrated system of freeways and surface streets has not been reported, although calibration and validation of simulation software for either freeways or surface streets has been reported. This paper suggests that researchers evaluate the state-of-the-art simulation tools and identify the tools or approaches suited to the local conditions of Saudi Arabia. (author)

  16. Thermodynamic approach to rheological modeling and simulations at the configuration space level of description

    NARCIS (Netherlands)

    Jongschaap, R.J.J.; Denneman, A.I.M.; Denneman, A.I.M.; Conrads, W.

    1997-01-01

    The so-called matrix model is a general thermodynamic framework for microrheological modeling. This model has already been proven to be applicable for a wide class of systems, in particular to models formulated at the configuration tensor level of description. For models formulated at the

  17. Analysis and Simulation of Multi-target Echo Signals from a Phased Array Radar

    OpenAIRE

    Jia Zhen; Zhou Rui

    2017-01-01

    The construction of digital radar simulation systems has been a research hotspot in the radar field. This paper focuses on the theoretical analysis and simulation of multi-target echo signals produced in a phased array radar system, and constructs an array antenna element and a signal-generation environment. The antenna element is able to simulate planar arrays and optimizes these arrays by adding window functions. The signal environment can model and simulate radar transmission signals, rada...

  18. Development and new applications of quantum chemical simulation methodology

    International Nuclear Information System (INIS)

    Weiss, A. K. H.

    2012-01-01

    The Division of Theoretical Chemistry at the University of Innsbruck focuses on the study of chemical compounds in aqueous solution, mainly by means of hybrid quantum mechanical / molecular mechanical molecular dynamics (QM/MM MD) simulations. Besides the standard means of data analysis employed for such simulations, this study presents several advanced and capable algorithms for the description of the structural and dynamic properties of the simulated species and their hydration. The first part of this thesis further presents selected exemplary simulations, in particular a comparative study of formamide and N-methylformamide, guanidinium, and urea. An included review article summarizes the major advances of these studies. The computer programs developed in the course of this thesis are by now well established in the research field. The second part presents the theory and a development guide for a quantum chemical program, QuMuLuS, which is now used as the QM program for recent QM/MM simulations at the division. This part also presents newly developed algorithms for electron-integral evaluation and point-charge embedding. The program is validated by benchmark computations, and the associated theory is presented in detail to serve as a source for contemporary and future studies in the division. The third and final part addresses further investigations of related topics, covering additional schemes of molecular simulation analysis, new software, and a mathematical investigation of a non-standard two-electron integral. (author)

  19. The ADAQ framework: An integrated toolkit for data acquisition and analysis with real and simulated radiation detectors

    International Nuclear Information System (INIS)

    Hartwig, Zachary S.

    2016-01-01

    The ADAQ framework is a collection of software tools that is designed to streamline the acquisition and analysis of radiation detector data produced in modern digital data acquisition (DAQ) systems and in Monte Carlo detector simulations. The purpose of the framework is to maximize user scientific productivity by minimizing the effort and expertise required to fully utilize radiation detectors in a variety of scientific and engineering disciplines. By using a single set of tools to span the real and simulation domains, the framework eliminates redundancy and provides an integrated workflow for high-fidelity comparison between experimental and simulated detector performance. Built on the ROOT data analysis framework, the core of the ADAQ framework is a set of C++ and Python libraries that enable high-level control of digital DAQ systems and detector simulations with data stored into standardized binary ROOT files for further analysis. Two graphical user interface programs utilize the libraries to create powerful tools: ADAQAcquisition handles control and readout of real-world DAQ systems and ADAQAnalysis provides data analysis and visualization methods for experimental and simulated data. At present, the ADAQ framework supports digital DAQ hardware from CAEN S.p.A. and detector simulations performed in Geant4; however, the modular design will facilitate future extension to other manufacturers and simulation platforms. - Highlights: • A new software framework for radiation detector data acquisition and analysis. • Integrated acquisition and analysis of real-world and simulated detector data. • C++ and Python libraries for data acquisition hardware control and readout. • Graphical program for control and readout of digital data acquisition hardware. • Graphical program for comprehensive analysis of real-world and simulated data.

  20. The ADAQ framework: An integrated toolkit for data acquisition and analysis with real and simulated radiation detectors

    Energy Technology Data Exchange (ETDEWEB)

    Hartwig, Zachary S., E-mail: hartwig@mit.edu

    2016-04-11

    The ADAQ framework is a collection of software tools that is designed to streamline the acquisition and analysis of radiation detector data produced in modern digital data acquisition (DAQ) systems and in Monte Carlo detector simulations. The purpose of the framework is to maximize user scientific productivity by minimizing the effort and expertise required to fully utilize radiation detectors in a variety of scientific and engineering disciplines. By using a single set of tools to span the real and simulation domains, the framework eliminates redundancy and provides an integrated workflow for high-fidelity comparison between experimental and simulated detector performance. Built on the ROOT data analysis framework, the core of the ADAQ framework is a set of C++ and Python libraries that enable high-level control of digital DAQ systems and detector simulations with data stored into standardized binary ROOT files for further analysis. Two graphical user interface programs utilize the libraries to create powerful tools: ADAQAcquisition handles control and readout of real-world DAQ systems and ADAQAnalysis provides data analysis and visualization methods for experimental and simulated data. At present, the ADAQ framework supports digital DAQ hardware from CAEN S.p.A. and detector simulations performed in Geant4; however, the modular design will facilitate future extension to other manufacturers and simulation platforms. - Highlights: • A new software framework for radiation detector data acquisition and analysis. • Integrated acquisition and analysis of real-world and simulated detector data. • C++ and Python libraries for data acquisition hardware control and readout. • Graphical program for control and readout of digital data acquisition hardware. • Graphical program for comprehensive analysis of real-world and simulated data.

  1. FLAG Simulations of the Elasticity Test Problem of Gavrilyuk et al.

    Energy Technology Data Exchange (ETDEWEB)

    Kamm, James R. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Runnels, Scott R. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Canfield, Thomas R. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Carney, Theodore C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-04-23

    This report contains a description of the impact problem used to compare hypoelastic and hyperelastic material models, as described by Gavrilyuk, Favrie & Saurel. That description is used to set up hypoelastic simulations in the FLAG hydrocode.

  2. Simulation and Non-Simulation Based Human Reliability Analysis Approaches

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald Laurids [Idaho National Lab. (INL), Idaho Falls, ID (United States); Shirley, Rachel Elizabeth [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey Clark [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2014-12-01

    Part of the U.S. Department of Energy’s Light Water Reactor Sustainability (LWRS) Program, the Risk-Informed Safety Margin Characterization (RISMC) Pathway develops approaches to estimating and managing safety margins. RISMC simulations pair deterministic plant physics models with probabilistic risk models. As human interactions are an essential element of plant risk, it is necessary to integrate human actions into the RISMC risk model. In this report, we review simulation-based and non-simulation-based human reliability assessment (HRA) methods. Chapter 2 surveys non-simulation-based HRA methods. Conventional HRA methods target static Probabilistic Risk Assessments for Level 1 events. These methods would require significant modification for use in dynamic simulation of Level 2 and Level 3 events. Chapter 3 is a review of human performance models. A variety of methods and models simulate dynamic human performance; however, most of these human performance models were developed outside the risk domain and have not been used for HRA. The exception is the ADS-IDAC model, which can be thought of as a virtual operator program. This model is resource-intensive but provides a detailed model of every operator action in a given scenario, along with models of numerous factors that can influence operator performance. Finally, Chapter 4 reviews the treatment of timing of operator actions in HRA methods. This chapter is an example of one of the critical gaps between existing HRA methods and the needs of dynamic HRA. This report summarizes the foundational information needed to develop a feasible approach to modeling human interactions in the RISMC simulations.

  3. A study of the feasibility of statistical analysis of airport performance simulation

    Science.gov (United States)

    Myers, R. H.

    1982-01-01

    The feasibility of conducting a statistical analysis of simulation experiments to study airport capacity is investigated. First, the form of the distribution of airport capacity is studied. Since the distribution is non-Gaussian, it is important to determine the effect of this distribution on standard analysis-of-variance techniques and power calculations. Next, power computations are made in order to determine how economical simulation experiments would be if designed to detect capacity changes from condition to condition. Many of the conclusions drawn are the results of Monte Carlo techniques.
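    The Monte Carlo power computation described above can be illustrated with a small sketch. This is not the paper's model: as illustrative assumptions, capacity is drawn from a skewed lognormal distribution to mimic the non-Gaussian capacities, the test is a large-sample two-sample z test on means, and all sample sizes and parameters are arbitrary.

    ```python
    import math
    import random

    def mc_power(shift, n=60, n_sim=500, seed=7):
        """Monte Carlo power of detecting a capacity change between two conditions.

        Draws n capacity observations per condition from a skewed (lognormal)
        distribution, applies a large-sample two-sample z test at the 5% level,
        and returns the fraction of simulated experiments that reject.
        """
        rng = random.Random(seed)
        z_crit = 1.96
        rejections = 0
        for _ in range(n_sim):
            a = [rng.lognormvariate(0.0, 0.5) for _ in range(n)]          # condition A
            b = [rng.lognormvariate(0.0, 0.5) + shift for _ in range(n)]  # condition B, shifted
            ma, mb = sum(a) / n, sum(b) / n
            va = sum((x - ma) ** 2 for x in a) / (n - 1)
            vb = sum((x - mb) ** 2 for x in b) / (n - 1)
            z = (mb - ma) / math.sqrt(va / n + vb / n)
            if abs(z) > z_crit:
                rejections += 1
        return rejections / n_sim
    ```

    Calling `mc_power(0.0)` estimates the type-I error rate under the skewed distribution, while `mc_power` with a nonzero shift estimates the power to detect a capacity change of that size.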

  4. Evaluating current trends in psychiatric music therapy: a descriptive analysis.

    Science.gov (United States)

    Silverman, Michael J

    2007-01-01

    , improvisation, songwriting, lyric analysis, and music and movement to address consumer objectives. Participants indicated they used therapeutic verbal skills and techniques such as humor, redirection, reinforcement, empathy, and affirmation in their clinical practice. Additionally, the results of this survey were compared to the psychiatric portion of a music therapy descriptive study published in 1979. Similarities and differences are discussed.

  5. Micromagnetic simulations of cylindrical magnetic nanowires

    KAUST Repository

    Ivanov, Yurii P.; Chubykalo-Fesenko, O.

    2015-01-01

    This chapter reviews micromagnetic simulations of cylindrical magnetic nanowires and their ordered arrays. It starts with a description of the theoretical background of micromagnetism. The chapter discusses main magnetization reversal modes, domain

  6. Integrating atomistic molecular dynamics simulations, experiments, and network analysis to study protein dynamics

    DEFF Research Database (Denmark)

    Papaleo, Elena

    2015-01-01

    that we observe and the functional properties of these important cellular machines. To make progresses in this direction, we need to improve the physical models used to describe proteins and solvent in molecular dynamics, as well as to strengthen the integration of experiments and simulations to overcome...... with the possibility to validate simulation methods and physical models against a broad range of experimental observables. On the other side, it also allows a complementary and comprehensive view on protein structure and dynamics. What is needed now is a better understanding of the link between the dynamic properties...... simulations with attention to the effects that can be propagated over long distances and are often associated to important biological functions. In this context, approaches inspired by network analysis can make an important contribution to the analysis of molecular dynamics simulations....

  7. BRENDA: a dynamic simulator for a sodium-cooled fast reactor power plant

    International Nuclear Information System (INIS)

    Hetrick, D.L.; Sowers, G.W.

    1978-06-01

    This report is a users' manual for one version of BRENDA (Breeder Reactor Nuclear Dynamic Analysis), which is a digital program for simulating the dynamic behavior of a sodium-cooled fast reactor power plant. This version, which contains 57 differential equations, represents a simplified model of the Clinch River Breeder Reactor Project (CRBRP). BRENDA is an input deck for DARE P (Differential Analyzer Replacement, Portable), which is a continuous-system simulation language developed at the University of Arizona. This report contains brief descriptions of DARE P and BRENDA, instructions for using BRENDA in conjunction with DARE P, and some sample output. A list of variable names and a listing for BRENDA are included as appendices

  8. Automated detection and analysis of particle beams in laser-plasma accelerator simulations

    International Nuclear Information System (INIS)

    Ushizima, Daniela Mayumi; Geddes, C.G.; Cormier-Michel, E.; Bethel, E. Wes; Jacobsen, J.; Prabhat; Ruebel, O.; Weber, G.; Hamann, B.

    2010-01-01

    Numerical simulations of laser-plasma wakefield (particle) accelerators model the acceleration of electrons trapped in plasma oscillations (wakes) left behind when an intense laser pulse propagates through the plasma. The goal of these simulations is to better understand the processes involved in plasma wake generation and how electrons are trapped and accelerated by the wake. Such accelerators offer high accelerating gradients, potentially reducing the size and cost of new accelerators. One operating regime of interest is where a trapped subset of electrons loads the wake and forms an isolated group of accelerated particles with low spread in momentum and position, desirable characteristics for many applications. The electrons trapped in the wake may be accelerated to high energies, the plasma gradient in the wake reaching up to a gigaelectronvolt per centimeter. High-energy electron accelerators power intense radiation sources, from X-ray to terahertz, and are used in many applications including medical radiotherapy and imaging. To extract information from the simulation about the quality of the beam, a typical approach is to examine plots of the entire dataset, visually determining the parameters necessary to select a subset of particles, which is then further analyzed. This procedure requires laborious examination of massive data sets over many time steps using several plots, a routine that is infeasible for large data collections. Demand for automated analysis is growing along with the volume and size of simulations. Current 2D LWFA simulation datasets are typically between 1 GB and 100 GB in size, but 3D simulations are of the order of terabytes. The increase in the number of datasets and dataset sizes leads to a need for automatic routines to recognize particle patterns as particle bunches (beams of electrons) for subsequent analysis. Because of the growth in dataset size, the application of machine learning techniques for

  9. Congress ANPCONT: descriptive and evaluative bibliometric analysis of the articles published from 2007 to 2011 - doi: 10.4025/enfoque.v31i3.16946

    Directory of Open Access Journals (Sweden)

    Eduardo Bona Safe de Matos

    2012-12-01

    Full Text Available The analysis of congresses, scientific journals, and scientific production is carried out in different areas of science with the objective of understanding authors' profiles and production characteristics, or of evaluating their quality. Despite using different methods and tools, the common aim is to develop science and to characterize production in different areas. This paper has two objectives, related to two different branches of bibliometrics, evaluative and descriptive. The general objective is to analyse and characterize the papers published at the ANPCONT Congress; the specific objectives are to profile the authors, their production, and their use of references. The method used is bibliometric: for the descriptive analysis, Lotka's Law, which describes author productivity, is applied; for the evaluative analysis, the typology of the references used is studied. Regarding the authors' profiles, they are in general connected to academia, professors and doctors, with a predominance of authors from the Universidade de São Paulo, Fucape Business School, and Universidade Regional de Blumenau. The descriptive analysis shows that the production does not fit Lotka's Law, and the evaluative analysis shows a predominance of international papers among the references. It is concluded that the production developed over the years, and that some authors, universities, and states stand out in the congress's production.

  10. [Clinical Simulation and Emotional Learning].

    Science.gov (United States)

    Afanador, Adalberto Amaya

    2012-01-01

    At present, clinical simulation has been incorporated into the medical school curriculum. Simulation is considered useful for developing skills, which accounts for its diffusion. Within the acquisition of skills, meaningful learning has an essential emotional component for the student, and this point is essential to optimize the results of the simulation experience. A narrative description of simulation and the degree of "emotionality" is provided. The taxonomy of types of clinical simulation fidelity is described and correlated with the degree of emotionality required to achieve significant and lasting learning by students. It is essential to take the student's level of emotion into account in the learning process when simulation is used as a strategy. Copyright © 2012 Asociación Colombiana de Psiquiatría. Publicado por Elsevier España. All rights reserved.

  11. A Descriptive Analysis of Care Provided by Law Enforcement Prior to EMS Arrival in the United States.

    Science.gov (United States)

    Klassen, Aaron B; Core, S Brent; Lohse, Christine M; Sztajnkrycer, Matthew D

    2018-04-01

    Study Objectives: Law enforcement is increasingly viewed as a key component in the out-of-hospital chain of survival, with expanded roles in cardiac arrest, narcotic overdose, and traumatic bleeding. Little is known about the nature of care provided by law enforcement prior to the arrival of Emergency Medical Services (EMS) assets. The purpose of the current study was to perform a descriptive analysis of events reported to a national EMS database. This study was a descriptive analysis of the 2014 National Emergency Medical Services Information System (NEMSIS) public release research data set, containing EMS emergency response data from 41 states. Code E09_02 1200 specifically identifies care provided by law enforcement prior to EMS arrival. A total of 25,835,729 unique events were reported. Of events in which pre-arrival care was documented, 2.0% received prior aid by law enforcement. Patients receiving law enforcement care prior to EMS arrival were more likely to be younger (52.8 [SD=23.3] years versus 58.7 [SD=23.3] years), male (54.8% versus 46.7%), and white (80.3% versus 77.5%). A Basic Life Support (BLS) EMS response was twice as likely for patients receiving prior aid by law enforcement. Multiple-casualty incidents were five times more likely with prior aid by law enforcement. Compared with prior aid by other services, law enforcement pre-arrival care was more likely at motor vehicle accidents, firearm assaults, knife assaults, blunt assaults, and drug overdoses, and less likely at falls and childbirths. Cardiac arrest was significantly more common in patients receiving prior aid by law enforcement (16.5% versus 2.6%). Tourniquet application and naloxone administration were more common in the law enforcement prior aid group. Where noted, law enforcement pre-arrival care occurs in 2.0% of EMS patient encounters. The majority of cases involve cardiac arrest, motor vehicle accidents, and assaults. Better understanding of the nature of law enforcement care is

  12. Dynamic wind turbine models in power system simulation tool DIgSILENT

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, A.C.; Jauch, C.; Soerensen, P.; Iov, F.; Blaabjerg, F.

    2003-12-01

    The present report describes the dynamic wind turbine models implemented in the power system simulation tool DIgSILENT (Version 12.0). The developed models are part of the results of a national research project whose overall objective is to create a model database in different simulation tools. This model database should be able to support the analysis of the interaction between the mechanical structure of the wind turbine and the electrical grid during different operational modes. The report provides a description of the wind turbine modelling, both at the component level and at the system level. It contains both the description of DIgSILENT built-in models for the electrical components of a grid-connected wind turbine (e.g. induction generators, power converters, transformers) and the models developed by the user, in the dynamic simulation language DSL of DIgSILENT, for the non-electrical components of the wind turbine (wind model, aerodynamic model, mechanical model). Initialisation of the wind turbine models in the power system simulation is also presented. The main attention in this report, however, is given to system-level modelling of two wind turbine concepts: 1. an active stall wind turbine with induction generator; 2. a variable-speed, variable-pitch wind turbine with doubly fed induction generator. These wind turbine concept models can be used and even extended for the study of different aspects, e.g. the assessment of power quality, control strategies, connection of the wind turbine to different types of grid, and storage systems. For both concepts, control strategies are developed and implemented, and their performance is assessed and discussed by means of simulations. (au)

  13. Fluid Flow Simulation and Energetic Analysis of Anomalocarididae Locomotion

    Science.gov (United States)

    Mikel-Stites, Maxwell; Staples, Anne

    2014-11-01

    While an abundance of animal locomotion simulations have been performed to model the motions of living arthropods and aquatic animals, little quantitative simulation and reconstruction of gait parameters has been done to model the locomotion of extinct animals, many of which bear little physical resemblance to their modern descendants. To that end, this project seeks to analyze potential swimming patterns used by the anomalocaridid family (specifically Anomalocaris canadensis, a Cambrian-era aquatic predator) and to determine the most probable modes of movement. This will serve either to verify or to cast into question the currently assumed movement patterns and properties of these animals, and to create a bridge between similar flexible-bodied swimmers and their robotic counterparts. This will be accomplished by particle-based fluid flow simulations of the flow around the fins of the animal, as well as an energy analysis of a variety of sample gaits. The energy analysis will then be compared to the extant information regarding speed/energy-use curves in an attempt to determine which modes of swimming were most energy-efficient for a given range of speeds. These results will provide a better understanding of how these long-extinct animals moved, possibly allowing an improved understanding of their behavioral patterns, and may also lead to a novel potential platform for bio-inspired underwater autonomous vehicles (UAVs).

  14. Visualization and Analysis of Climate Simulation Performance Data

    Science.gov (United States)

    Röber, Niklas; Adamidis, Panagiotis; Behrens, Jörg

    2015-04-01

    Visualization is the key process of transforming abstract (scientific) data into a graphical representation to aid in the understanding of the information hidden within the data. Climate simulation data sets are typically quite large, time-varying, and consist of many different variables sampled on an underlying grid. A large variety of climate models - and sub-models - exist to simulate various aspects of the climate system. Generally, one is mainly interested in the physical variables produced by the simulation runs, but model developers are also interested in performance data measured along with these simulations. Climate simulation models are carefully developed complex software systems, designed to run in parallel on large HPC systems. An important goal thereby is to utilize the entire hardware as efficiently as possible, that is, to distribute the workload as evenly as possible among the individual components. This is a very challenging task, and detailed performance data, such as timings, cache misses etc., have to be used to locate and understand performance problems in order to optimize the model implementation. Furthermore, the correlation of performance data to the processes of the application and the sub-domains of the decomposed underlying grid is vital when addressing communication and load-imbalance issues. High-resolution climate simulations are carried out on tens to hundreds of thousands of cores, thus yielding a vast amount of profiling data, which cannot be analyzed without appropriate visualization techniques. This PICO presentation displays and discusses the ICON simulation model, which is jointly developed by the Max Planck Institute for Meteorology and the German Weather Service in partnership with DKRZ. The visualization and analysis of the model's performance data allow us to optimize and fine-tune the model, as well as to understand its execution on the HPC system. We show and discuss our workflow, as well as present new ideas and

  15. User's manual for the Composite HTGR Analysis Program (CHAP-1)

    International Nuclear Information System (INIS)

    Gilbert, J.S.; Secker, P.A. Jr.; Vigil, J.C.; Wecksung, M.J.; Willcutt, G.J.E. Jr.

    1977-03-01

    CHAP-1 is the first release version of an HTGR overall plant simulation program with both steady-state and transient solution capabilities. It consists of a model-independent systems analysis program and a collection of linked modules, each representing one or more components of the HTGR plant. Detailed instructions on the operation of the code and detailed descriptions of the HTGR model are provided. Information is also provided to allow the user to easily incorporate additional component modules, to modify or replace existing modules, or to incorporate a completely new simulation model into the CHAP systems analysis framework

  16. Desktop Modeling and Simulation: Parsimonious, yet Effective Discrete-Event Simulation Analysis

    Science.gov (United States)

    Bradley, James R.

    2012-01-01

    This paper evaluates how quickly students can be trained to construct useful discrete-event simulation models using Excel. The typical supply chain used by many large national retailers is described, and an Excel-based simulation model of it is constructed. The set of programming and simulation skills required for development of that model is then determined; we conclude that six hours of training are required to teach the skills to MBA students. The simulation presented here contains all fundamental functionality of a simulation model, and so our result holds for any discrete-event simulation model. We argue, therefore, that industry workers with the same technical skill set as students who have completed one year in an MBA program can be quickly trained to construct simulation models. This result gives credence to the efficacy of Desktop Modeling and Simulation, whereby simulation analyses can be quickly developed, run, and analyzed with widely available software, namely Excel.
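The fundamental mechanics such a model needs can be seen in a few lines outside of Excel as well. A minimal single-server queue sketch (a generic illustration of the discrete-event logic, not the paper's retail supply-chain model):

```python
# Minimal discrete-event logic: a single server processing arrivals in
# order. Illustrative only; not the paper's Excel supply-chain model.

def simulate(arrivals, service_time):
    """Single-server FIFO queue: return departure times for given arrivals."""
    departures = []
    server_free_at = 0.0
    for t in arrivals:
        start = max(t, server_free_at)   # wait if the server is busy
        server_free_at = start + service_time
        departures.append(server_free_at)
    return departures

print(simulate([0.0, 1.0, 1.5], service_time=2.0))  # [2.0, 4.0, 6.0]
```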

  17. Charge transfer through single molecule contacts: How reliable are rate descriptions?

    Directory of Open Access Journals (Sweden)

    Denis Kast

    2011-08-01

    Background: The trend for the fabrication of electrical circuits with nanoscale dimensions has led to impressive progress in the field of molecular electronics in the last decade. However, a theoretical description of molecular contacts as the building blocks of future devices is challenging, as it has to combine the properties of Fermi liquids in the leads with charge and phonon degrees of freedom on the molecule. Outside of ab initio schemes for specific set-ups, generic models reveal the characteristics of transport processes. Particularly appealing are descriptions based on transfer rates successfully used in other contexts such as mesoscopic physics and intramolecular electron transfer. However, a detailed analysis of this scheme in comparison with numerically exact solutions is still elusive. Results: We show that a formulation in terms of transfer rates provides a quantitatively accurate description even in domains of parameter space where strictly it is expected to fail, e.g., at lower temperatures. Typically, intramolecular phonons are distributed according to a voltage driven steady state that can only roughly be captured by a thermal distribution with an effective elevated temperature (heating). An extension of a master equation for the charge–phonon complex, to effectively include the impact of off-diagonal elements of the reduced density matrix, provides very accurate solutions even for stronger electron–phonon coupling. Conclusion: Rate descriptions and master equations offer a versatile model to describe and understand charge transfer processes through molecular junctions. Such methods are computationally orders of magnitude less expensive than elaborate numerical simulations that, however, provide exact solutions as benchmarks. Adjustable parameters obtained, e.g., from ab initio calculations, allow for the treatment of various realizations. Even though not as rigorously formulated as, e.g., nonequilibrium Green’s function

  18. Flight Technical Error Analysis of the SATS Higher Volume Operations Simulation and Flight Experiments

    Science.gov (United States)

    Williams, Daniel M.; Consiglio, Maria C.; Murdoch, Jennifer L.; Adams, Catherine H.

    2005-01-01

    This paper provides an analysis of Flight Technical Error (FTE) from recent SATS experiments, called the Higher Volume Operations (HVO) Simulation and Flight experiments, which NASA conducted to determine pilot acceptability of the HVO concept for normal operating conditions. Reported are FTE results from simulation and flight experiment data indicating the SATS HVO concept is viable and acceptable to low-time instrument-rated pilots when compared with today's system (baseline). Described is the comparative FTE analysis of lateral, vertical, and airspeed deviations from the baseline and SATS HVO experimental flight procedures. Based on FTE analysis, all evaluation subjects, low-time instrument-rated pilots, flew the HVO procedures safely and proficiently in comparison to today's system. In all cases, the results of the flight experiment validated the results of the simulation experiment and confirm the utility of the simulation platform for comparative Human in the Loop (HITL) studies of SATS HVO and Baseline operations.

  19. Water hammer analysis. Dynamic simulation model check valve piston; Analisis del golpe de ariete. Modelo de simulacion dinamica de valvula de retencion piston

    Energy Technology Data Exchange (ETDEWEB)

    Royo, B.; Valdes, R.

    2012-11-01

    This report contains the description and the results of the dynamic simulation model that has been developed to predict the behaviour of one of our lift check valve designs. The aim of the model is not only to simulate the closing process of the valve and to predict the magnitude of the water hammer effect that may appear immediately after the valve closes, but also to simulate several design versions until the optimum, which minimizes that effect, is obtained. The input data used for this study ensure reliable results since they represent a real system. (Author)
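For orientation, the magnitude of a worst-case water hammer surge is often estimated with the textbook Joukowsky relation dP = rho * a * dv for an instantaneous closure. A back-of-envelope sketch (generic parameter values, not the report's valve model):

```python
# Joukowsky estimate of water hammer surge pressure for instantaneous
# valve closure. Illustrative values only; the report uses a detailed
# dynamic valve model instead of this upper-bound formula.

def joukowsky_surge(rho, wave_speed, delta_v):
    """Pressure surge in Pa: rho [kg/m^3] * a [m/s] * dv [m/s]."""
    return rho * wave_speed * delta_v

# Water (rho = 1000 kg/m^3), typical pipeline wave speed ~1200 m/s,
# flow stopped from 2 m/s:
dp = joukowsky_surge(1000.0, 1200.0, 2.0)
print(f"surge: {dp/1e5:.1f} bar")  # 24.0 bar
```

A controlled closure produces a lower surge than this bound, which is precisely what optimizing the valve design aims to exploit.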

  20. Acoustic Analysis and Design of the E-STA MSA Simulator

    Science.gov (United States)

    Bittinger, Samantha A.

    2016-01-01

    The Orion European Service Module Structural Test Article (E-STA) Acoustic Test was completed in May 2016 to verify that the European Service Module (ESM) can withstand qualification acoustic environments. The test article required an aft closeout to simulate the Multi-Purpose Crew Vehicle (MPCV) Stage Adapter (MSA) cavity; however, the flight MSA design was cost-prohibitive to build. NASA Glenn Research Center (GRC) had 6 months to design an MSA Simulator that could recreate the qualification prediction MSA cavity sound pressure level to within a reasonable tolerance. This paper summarizes the design and analysis process to arrive at a design for the MSA Simulator, and then compares its performance to the final prediction models created prior to test.

  1. Status of CHAP: composite HTGR analysis program

    International Nuclear Information System (INIS)

    Secker, P.A.; Gilbert, J.S.

    1975-12-01

    Development of an HTGR accident simulation program is in progress for the prediction of the overall HTGR plant transient response to various initiating events. The status of the digital computer program named CHAP (Composite HTGR Analysis Program) as of June 30, 1975, is given. The philosophy, structure, and capabilities of the CHAP code are discussed. Mathematical descriptions are given for those HTGR components that have been modeled. Component model validation and evaluation using auxiliary analysis codes are also discussed

  2. Security Analysis of Smart Grid Cyber Physical Infrastructures Using Modeling and Game Theoretic Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Abercrombie, Robert K [ORNL; Sheldon, Frederick T. [University of Idaho

    2015-01-01

    Cyber physical computing infrastructures typically consist of a number of interconnected sites. Their operation critically depends on both cyber components and physical components. Both types of components are subject to attacks of different kinds and frequencies, which must be accounted for in the initial provisioning and subsequent operation of the infrastructure via information security analysis. Information security analysis can be performed using game theory implemented in dynamic Agent Based Game Theoretic (ABGT) simulations. Such simulations can be verified with the results from game theory analysis and further used to explore larger scale, real world scenarios involving multiple attackers, defenders, and information assets. We concentrated our analysis on the electric sector failure scenarios and impact analyses from the NESCOR Working Group Study. From the Section 5 electric sector representative failure scenarios, we extracted the four generic failure scenarios and grouped them into three specific threat categories (confidentiality, integrity, and availability). These specific failure scenarios serve as a demonstration of our simulation. The analysis using our ABGT simulation demonstrates how to model the electric sector functional domain using a set of rationalized game theoretic rules decomposed from the failure scenarios in terms of how those scenarios might impact the cyber physical infrastructure network with respect to CIA.

  3. Progress on RTSS simulation-based analysis for real-time systems development at two laboratories

    International Nuclear Information System (INIS)

    Shu, Y.; Jia, M.; Fei, Y.; Zhang, Y.; Liu, G.; Yang, S.; Chen, Y.

    1996-01-01

    A new object-oriented Real Time System Simulator (RTSS) with the capability for simulation graphics and animation, has been developed and used for modeling the distributed data acquisition and processing systems at JET and ASIPP. Simulation allows estimates of response time, throughput and resource utilization for a variety of configurations to be investigated. Performance measurements, simulation and analysis are used together to calibrate and validate each other

  4. Magnetic field simulation and shimming analysis of 3.0T superconducting MRI system

    Science.gov (United States)

    Yue, Z. K.; Liu, Z. Z.; Tang, G. S.; Zhang, X. C.; Duan, L. J.; Liu, W. C.

    2018-04-01

    The 3.0T superconducting magnetic resonance imaging (MRI) system has become the mainstream of modern clinical MRI systems because of its high field intensity and its high degree of uniformity and stability. It has broad prospects in scientific research and other fields. We analyze the principles of magnet design in this paper. We also perform the magnetic field simulation and shimming analysis of the first 3.0T/850 superconducting MRI system in the world using the Ansoft Maxwell simulation software. We guide the production and optimization of the prototype based on the results of the simulation analysis, so that the magnetic field strength, uniformity and stability of the prototype achieve the expected targets.
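MRI field uniformity is conventionally quoted in parts per million (ppm) over a diameter of spherical volume (DSV). A hedged sketch of that metric, with invented sample values rather than Maxwell simulation output:

```python
# Peak-to-peak field homogeneity in ppm over sample points on a DSV.
# The |B| values below are invented for illustration.

def homogeneity_ppm(field_samples):
    """Peak-to-peak inhomogeneity in ppm relative to the mean field."""
    mean = sum(field_samples) / len(field_samples)
    return (max(field_samples) - min(field_samples)) / mean * 1e6

# Simulated |B| values (tesla) at points on a 40 cm DSV (invented):
b = [3.000002, 2.999999, 3.000001, 2.999998]
print(f"{homogeneity_ppm(b):.2f} ppm")
```

Shimming analysis then asks how added shim elements reduce this number toward the design target.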

  5. Regional hydrogeological simulations using CONECTFLOW. Preliminary site description. Laxemar sub area - version 1.2

    International Nuclear Information System (INIS)

    Hartley, Lee; Hunter, Fiona; Jackson, Peter; McCarthy, Rachel; Gylling, Bjoern; Marsic, Niko

    2006-04-01

    The main objective of this study is to support the development of a preliminary Site Description of the Laxemar subarea on a regional-scale based on the available data of November 2004 (Data Freeze L1.2). A more specific objective of this study is to assess the role of both known and less quantified hydrogeological conditions in determining the present-day distribution of saline groundwater in the Laxemar subarea on a regional-scale. An improved understanding of the palaeo-hydrogeology is necessary in order to gain credibility for the Site Description in general and the hydrogeological description in particular. This is to serve as a basis for describing the present hydrogeological conditions on a local-scale, as well as predictions of future hydrogeological conditions. Another objective is to assess the flow-paths from the local-scale model domain, based on the present-day flow conditions, to assess the distribution of discharge and recharge areas connected to the flow at the approximate repository depth to inform the Preliminary Safety Evaluation. Significant new features incorporated in the modelling include: a depth variation in hydraulic properties within the deformation zones; a dependence on rock domain and depth in the rock mass properties in regional-scale models; a more detailed model of the overburden in terms of a layered system of spatially variable thickness made up of several different types of Quaternary deposits has been implemented; and several variants on the position of the watertable have been tried. The motivation for introducing a dependence on rock domain was guided by the hydrogeological interpretation with the aim of honouring the observed differences in hydraulic properties measured at the boreholes

  6. Regional hydrogeological simulations using CONECTFLOW. Preliminary site description. Laxemar sub area - version 1.2

    Energy Technology Data Exchange (ETDEWEB)

    Hartley, Lee; Hunter, Fiona; Jackson, Peter; McCarthy, Rachel [Serco Assurance, Risley (United Kingdom); Gylling, Bjoern; Marsic, Niko [Kemakta Konsult AB, Stockholm (Sweden)

    2006-04-15

    The main objective of this study is to support the development of a preliminary Site Description of the Laxemar subarea on a regional-scale based on the available data of November 2004 (Data Freeze L1.2). A more specific objective of this study is to assess the role of both known and less quantified hydrogeological conditions in determining the present-day distribution of saline groundwater in the Laxemar subarea on a regional-scale. An improved understanding of the palaeo-hydrogeology is necessary in order to gain credibility for the Site Description in general and the hydrogeological description in particular. This is to serve as a basis for describing the present hydrogeological conditions on a local-scale, as well as predictions of future hydrogeological conditions. Another objective is to assess the flow-paths from the local-scale model domain, based on the present-day flow conditions, to assess the distribution of discharge and recharge areas connected to the flow at the approximate repository depth to inform the Preliminary Safety Evaluation. Significant new features incorporated in the modelling include: a depth variation in hydraulic properties within the deformation zones; a dependence on rock domain and depth in the rock mass properties in regional-scale models; a more detailed model of the overburden in terms of a layered system of spatially variable thickness made up of several different types of Quaternary deposits has been implemented; and several variants on the position of the watertable have been tried. The motivation for introducing a dependence on rock domain was guided by the hydrogeological interpretation with the aim of honouring the observed differences in hydraulic properties measured at the boreholes.

  7. Computer code for the atomistic simulation of lattice defects and dynamics

    International Nuclear Information System (INIS)

    Schiffgens, J.O.; Graves, N.J.; Oster, C.A.

    1980-04-01

    This document has been prepared to satisfy the need for a detailed, up-to-date description of a computer code that can be used to simulate phenomena on an atomistic level. COMENT was written in FORTRAN IV and COMPASS (CDC assembly language) to solve the classical equations of motion for a large number of atoms interacting according to a given force law, and to perform the desired ancillary analysis of the resulting data. COMENT is a dual-purpose code, intended to describe static defect configurations as well as the detailed motion of atoms in a crystal lattice. It can be used to simulate the effect of temperature, impurities, and pre-existing defects on radiation-induced defect production mechanisms, defect migration, and defect stability.
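Codes of this kind integrate the classical equations of motion numerically; velocity Verlet is a standard scheme for that. A minimal 1-D harmonic-oscillator sketch (illustrative only; COMENT's force laws and integrator details are not reproduced here):

```python
import math

# Velocity Verlet integration of m*x'' = f(x); a standard scheme in
# molecular dynamics. Harmonic test case, not COMENT's actual force law.

def velocity_verlet(x, v, force, mass, dt, steps):
    a = force(x) / mass
    for _ in range(steps):
        x += v * dt + 0.5 * a * dt * dt   # position update
        a_new = force(x) / mass           # force at new position
        v += 0.5 * (a + a_new) * dt       # velocity update
        a = a_new
    return x, v

# Harmonic force f = -k*x with k = m = 1: the period is 2*pi.
x, v = velocity_verlet(1.0, 0.0, lambda x: -x, 1.0, 0.001,
                       int(2 * math.pi / 0.001))
print(x, v)  # close to (1.0, 0.0) after one full period
```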

  8. Self-care 3 months after attending chronic obstructive pulmonary disease patient education: a qualitative descriptive analysis

    DEFF Research Database (Denmark)

    Mousing, Camilla A; Lomborg, Kirsten

    2012-01-01

    Purpose: The authors performed a qualitative descriptive analysis to explore how group patient education influences the self-care of patients with chronic obstructive pulmonary disease. Patients and methods: In the period 2009–2010, eleven patients diagnosed with chronic obstructive pulmonary...... their symptoms, and that the social aspect of patient education had motivated them to utilize their new habits after finishing the course. The data indicate that patients need a period of adjustment (a "ripening period"): it took time for patients to integrate new habits and competencies into everyday life...

  9. Learning with STEM Simulations in the Classroom: Findings and Trends from a Meta-Analysis

    Science.gov (United States)

    D'Angelo, Cynthia M.; Rutstein, Daisy; Harris, Christopher J.

    2016-01-01

    This article presents a summary of the findings of a systematic review and meta-analysis of the literature on computer-based interactive simulations for K-12 science, technology, engineering, and mathematics (STEM) learning topics. For achievement outcomes, simulations had a moderate to strong effect on student learning. Overall, simulations have…
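Meta-analyses of this kind typically aggregate standardized mean differences across studies. A minimal Cohen's d computation (the scores below are invented illustration, not data from the reviewed studies):

```python
import statistics

# Cohen's d: standardized mean difference with pooled standard deviation.
# Invented example scores; not the meta-analysis dataset.

def cohens_d(treatment, control):
    n1, n2 = len(treatment), len(control)
    s1 = statistics.variance(treatment)   # sample variances (n-1)
    s2 = statistics.variance(control)
    pooled_sd = (((n1 - 1) * s1 + (n2 - 1) * s2) / (n1 + n2 - 2)) ** 0.5
    return (statistics.mean(treatment) - statistics.mean(control)) / pooled_sd

sim_group = [78, 85, 90, 82, 88]      # taught with simulations (invented)
lecture_group = [70, 75, 80, 72, 78]  # taught without (invented)
print(f"d = {cohens_d(sim_group, lecture_group):.2f}")
```

By the usual conventions, d around 0.5 is "moderate" and above 0.8 "strong", which is the scale behind the article's "moderate to strong effect" summary.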

  10. RELAP5 Model Description and Validation for the BR2 Loss-of-Flow Experiments

    Energy Technology Data Exchange (ETDEWEB)

    Licht, J. R. [Argonne National Lab. (ANL), Argonne, IL (United States); Dionne, B. [Argonne National Lab. (ANL), Argonne, IL (United States); Van den Branden, G. [Argonne National Lab. (ANL), Argonne, IL (United States); Sikik, E. [Argonne National Lab. (ANL), Argonne, IL (United States); Koonen, E. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-07-01

    This paper presents a description of the RELAP5 model, the calibration method used to obtain the minor loss coefficients from the available hydraulic data and the LOFA simulation results compared to the 1963 experimental tests for HEU fuel.

  11. Simulation of reactor noise analysis measurement for light-water critical assembly TCA using MCNP-DSP

    International Nuclear Information System (INIS)

    Yamamoto, Toshihiro; Sakurai, Kiyoshi; Tonoike, Kotaro; Miyoshi, Yoshinori

    2001-01-01

    Reactor noise analysis methods using Monte Carlo technique have been proposed and developed in the field of nuclear criticality safety. The Monte Carlo simulation for noise analysis can be made by simulating physical phenomena in the course of neutron transport in a nuclear fuel as practically as possible. MCNP-DSP was developed by T. Valentine of ORNL for this purpose and it is a modified version of MCNP-4A. The authors applied this code to frequency analysis measurements performed in light-water critical assembly TCA. Prompt neutron generation times for critical and subcritical cores were measured by doing the frequency analysis of detector signals. The Monte Carlo simulations for these experiments were carried out using MCNP-DSP, and prompt neutron generation times were calculated. (author)
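In frequency-domain noise analysis, the detector-signal power spectrum shows a Lorentzian break at the prompt neutron decay constant alpha; at delayed criticality alpha = beta_eff / Lambda, so the generation time follows from the measured break frequency. A hedged numerical illustration (the parameter values are invented, not TCA measurements):

```python
import math

# Prompt neutron generation time from the APSD break frequency.
# alpha = beta_eff / Lambda at delayed criticality, so
# Lambda = beta_eff / (2*pi*f_break). Invented parameter values.

def generation_time(beta_eff, break_freq_hz):
    alpha = 2.0 * math.pi * break_freq_hz  # decay constant in rad/s
    return beta_eff / alpha

# beta_eff ~ 0.0075 and a break frequency of ~25 Hz (invented):
print(f"Lambda = {generation_time(0.0075, 25.0):.2e} s")
```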

  12. Optical ensemble analysis of intraocular lens performance through a simulated clinical trial with ZEMAX.

    Science.gov (United States)

    Zhao, Huawei

    2009-01-01

    A ZEMAX model was constructed to simulate a clinical trial of intraocular lenses (IOLs) based on a clinically oriented Monte Carlo ensemble analysis using postoperative ocular parameters. The purpose of this model is to test the feasibility of streamlining and optimizing both the design process and the clinical testing of IOLs. This optical ensemble analysis (OEA) is also validated. Simulated pseudophakic eyes were generated by using the tolerancing and programming features of ZEMAX optical design software. OEA methodology was verified by demonstrating that the results of clinical performance simulations were consistent with previously published clinical performance data using the same types of IOLs. From these results we conclude that the OEA method can objectively simulate the potential clinical trial performance of IOLs.

  13. A proposed descriptive methodology for environmental geologic (envirogeologic) site characterization

    International Nuclear Information System (INIS)

    Schwarz, D.L.; Snyder, W.S.

    1994-01-01

    We propose a descriptive methodology for use in environmental geologic (envirogeologic) site characterization. The method uses traditional sedimentologic descriptions augmented by environmental data needs, and facies analysis. Most other environmental methodologies for soil and sediment characterization use soil engineering and engineering geology techniques that classify by texture and engineering properties. This technique is inadequate for envirogeologic characterization of sediments. In part, this inadequacy is due to differences in grain-size between the Unified Soil Classification and the Udden-Wentworth scales. Use of the soil grain-size classification could easily cause confusion when attempting to relate descriptions based on this classification to our basic understanding of sedimentary depositional systems. The proposed envirogeologic method uses descriptive parameters to characterize a sediment sample, suggests specific tests on samples for adequate characterization, and provides guidelines for subsurface facies analysis, based on data retrieved from shallow boreholes, that will allow better predictive models to be developed. This methodology should allow for both a more complete site assessment, and provide sufficient data for selection of the appropriate remediation technology, including bioremediation. 50 refs

  14. Frozen-shower simulation of electromagnetic showers in the ATLAS forward calorimeter

    CERN Document Server

    Gasnikova, Ksenia; The ATLAS collaboration

    2016-01-01

    Accurate simulation of calorimeter response for high energy electromagnetic particles is essential for the LHC experiments. Detailed simulation of the electromagnetic showers using Geant4 is however very CPU intensive and various fast simulation methods were proposed instead. The frozen shower simulation substitutes the full propagation of the showers for energies below 1 GeV by showers taken from a pre-simulated library. The method is used for production of the main ATLAS Monte Carlo samples, greatly improving the production time. The frozen showers describe shower shapes, sampling fraction, sampling and noise-related fluctuations very well, while description of the constant term, related to calorimeter non-uniformity, requires a careful choice of the shower library binning. A new method is proposed to tune the binning variables, using multivariate techniques. The method is tested and optimized for the description of the ATLAS forward calorimeter.

  15. Discrete event simulation tool for analysis of qualitative models of continuous processing systems

    Science.gov (United States)

    Malin, Jane T. (Inventor); Basham, Bryan D. (Inventor); Harris, Richard A. (Inventor)

    1990-01-01

    An artificial intelligence design and qualitative modeling tool is disclosed for creating computer models and simulating continuous activities, functions, and/or behavior using developed discrete event techniques. Conveniently, the tool is organized in four modules: library design module, model construction module, simulation module, and experimentation and analysis. The library design module supports the building of library knowledge including component classes and elements pertinent to a particular domain of continuous activities, functions, and behavior being modeled. The continuous behavior is defined discretely with respect to invocation statements, effect statements, and time delays. The functionality of the components is defined in terms of variable cluster instances, independent processes, and modes, further defined in terms of mode transition processes and mode dependent processes. Model construction utilizes the hierarchy of libraries and connects them with appropriate relations. The simulation executes a specialized initialization routine and executes events in a manner that includes selective inherency of characteristics through a time and event schema until the event queue in the simulator is emptied. The experimentation and analysis module supports analysis through the generation of appropriate log files and graphics developments and includes the ability of log file comparisons.
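The core loop of any such simulator ("executes events ... until the event queue in the simulator is emptied") can be sketched with a priority queue. A generic illustration, not the disclosed tool's implementation:

```python
import heapq

# Generic discrete-event loop: pop the earliest event, log it, schedule
# any follow-up events, and stop when the queue is empty.

def run(events):
    """events: list of (time, name, follow_ups), where follow_ups is a
    list of (delay, name, follow_ups) tuples or None."""
    queue = list(events)
    heapq.heapify(queue)
    log = []
    while queue:                              # run until the queue empties
        time, name, follow_ups = heapq.heappop(queue)
        log.append((time, name))
        for delay, nxt_name, nxt in (follow_ups or []):
            heapq.heappush(queue, (time + delay, nxt_name, nxt))
    return log

log = run([(0.0, "pump_on", [(2.0, "valve_open", None)]),
           (1.0, "sensor_read", None)])
print(log)  # [(0.0, 'pump_on'), (1.0, 'sensor_read'), (2.0, 'valve_open')]
```

Modeling continuous behavior "discretely with respect to invocation statements, effect statements, and time delays", as the abstract puts it, amounts to encoding each continuous process as follow-up events with appropriate delays.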

  16. The Satisfaction of Entrepreneurs in Terengganu Private Limited Companies toward the Concept of Corporate Entrepreneurship: A Descriptive Analysis

    OpenAIRE

    Muhammad Abi Sofian Abdul Halim; Hasyaniza Yahya; Syahrul Hezrin Mahmud; Fatimah Zainab

    2011-01-01

    This study determines the level of satisfaction among Terengganu private limited companies towards the concept of corporate entrepreneurship, in the context of corporate venturing, strategic renewal, and internalization. The study was based on a survey, carried out via an administered questionnaire, involving 105 private limited companies which operate their business in Terengganu. By using the descriptive analysis on the level of satisfaction among the Terengganu private limi...

  17. Multiwaveband simulation-based signature analysis of camouflaged human dismounts in cluttered environments with TAIThermIR and MuSES

    Science.gov (United States)

    Packard, Corey D.; Klein, Mark D.; Viola, Timothy S.; Hepokoski, Mark A.

    2016-10-01

    The ability to predict electro-optical (EO) signatures of diverse targets against cluttered backgrounds is paramount for signature evaluation and/or management. Knowledge of target and background signatures is essential for a variety of defense-related applications. While there is no substitute for measured target and background signatures to determine contrast and detection probability, the capability to simulate any mission scenario with desired environmental conditions is a tremendous asset for defense agencies. In this paper, a systematic process for the thermal and visible-through-infrared simulation of camouflaged human dismounts in cluttered outdoor environments is presented. This process, utilizing the thermal and EO/IR radiance simulation tool TAIThermIR (and MuSES), provides a repeatable and accurate approach for analyzing contrast, signature and detectability of humans in multiple wavebands. The engineering workflow required to combine natural weather boundary conditions and the human thermoregulatory module developed by ThermoAnalytics is summarized. The procedure includes human geometry creation, human segmental physiology description and transient physical temperature prediction using environmental boundary conditions and active thermoregulation. Radiance renderings, which use Sandford-Robertson BRDF optical surface property descriptions and are coupled with MODTRAN for the calculation of atmospheric effects, are demonstrated. Sensor effects such as optical blurring and photon noise can be optionally included, increasing the accuracy of detection probability outputs that accompany each rendering. This virtual evaluation procedure has been extensively validated and provides a flexible evaluation process that minimizes the difficulties inherent in human-subject field testing. Defense applications such as detection probability assessment, camouflage pattern evaluation, conspicuity tests and automatic target recognition are discussed.

  18. Battery Simulation Tool for Worst Case Analysis and Mission Evaluations

    Directory of Open Access Journals (Sweden)

    Lefeuvre Stéphane

    2017-01-01

    The first part of this paper presents the PSpice models, including their respective variable parameters at SBS and cell level. The second part introduces the model parameters that were chosen and identified to perform Monte Carlo Analysis (MCA) simulations. The third part reflects some MCA results for a VES16 battery module. Finally, the reader will see some other simulations that were performed by re-using the battery model for another Saft battery cell type (MP XTD) for a specific space application at high temperature.

  19. Uncovering the Hidden Dimensions of Meaning in Descriptions of Educational Practice.

    Science.gov (United States)

    Harris, Ilene B.

    Descriptions of educational practice offer an array of important, but typically hidden dimensions of meaning which provide potentially rich resources for understanding the practices. This paper illustrates: (1) how analysis, interpretations, and assessments interpenetrate what appear to be descriptions and suggest how readers can tease out these…

  20. Multiscale simulation of water flow past a C540 fullerene

    DEFF Research Database (Denmark)

    Walther, Jens Honore; Praprotnik, Matej; Kotsalis, Evangelos M.

    2012-01-01

    We present a novel, three-dimensional, multiscale algorithm for simulations of water flow past a fullerene. We employ the Schwarz alternating overlapping domain method to couple molecular dynamics (MD) of liquid water around the C540 buckyball with a Lattice–Boltzmann (LB) description for the Nav...

  1. CHEMICAL REACTIONS ON ADSORBING SURFACE: KINETIC LEVEL OF DESCRIPTION

    Directory of Open Access Journals (Sweden)

    P.P.Kostrobii

    2003-01-01

    Based on the effective Hubbard model, we suggest a statistical description of reaction-diffusion processes for bimolecular chemical reactions of gas particles adsorbed on a metallic surface. The system of transport equations for the description of particle diffusion as well as reactions is obtained. We carry out an analysis of the contributions of all physical processes to the formation of diffusion coefficients and chemical reaction constants.
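Transport equations of this reaction-diffusion type can be illustrated with a minimal explicit finite-difference discretization for one species with a second-order (bimolecular) loss term. This is a generic numerical sketch, not the paper's Hubbard-model kinetics:

```python
# Explicit Euler step of du/dt = D*u_xx - k*u^2 on a periodic 1-D grid.
# Generic reaction-diffusion illustration; parameters are invented.

def step(u, D, k, dx, dt):
    n = len(u)
    return [u[i] + dt * (D * (u[(i - 1) % n] - 2 * u[i] + u[(i + 1) % n]) / dx**2
                         - k * u[i] ** 2)
            for i in range(n)]

u = [0.0, 1.0, 0.0, 0.0]          # initial adsorbate coverage profile
for _ in range(100):
    u = step(u, D=0.1, k=0.5, dx=1.0, dt=0.05)
print([round(v, 3) for v in u])   # spread out and partially consumed
```

Diffusion conserves total coverage on a periodic grid, while the -k*u^2 term removes it, mirroring the competition between diffusion and reaction the abstract analyzes.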

  2. An Efficient Explicit-time Description Method for Timed Model Checking

    Directory of Open Access Journals (Sweden)

    Hao Wang

    2009-12-01

    Timed model checking, the method to formally verify real-time systems, is attracting increasing attention from both the model checking community and the real-time community. Explicit-time description methods verify real-time systems using general model constructs found in standard un-timed model checkers. Lamport proposed an explicit-time description method using a clock-ticking process (Tick) to simulate the passage of time together with a group of global variables to model time requirements. Two methods, the Sync-based Explicit-time Description Method using rendezvous synchronization steps and the Semaphore-based Explicit-time Description Method using only one global variable, were proposed; they both achieve better modularity than Lamport's method in modeling the real-time systems. In contrast to timed automata based model checkers like UPPAAL, explicit-time description methods can access and store the current time instant for future calculations necessary for many real-time systems, especially those with pre-emptive scheduling. However, the Tick process in the above three methods increments the time by one unit in each tick; the state spaces therefore grow relatively fast as the time parameters increase, a problem when the system's time period is relatively long. In this paper, we propose a more efficient method which enables the Tick process to leap multiple time units in one tick. Preliminary experimental results in a high performance computing environment show that this new method significantly reduces the state space and improves both the time and memory efficiency.
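The intuition behind the time-leaping Tick process can be sketched outside any model checker: instead of ticking one unit per step, jump straight to the next pending timer expiry, so far fewer "tick" states are explored. An illustrative sketch only (the paper's method operates inside a model checker's state space, not in plain Python):

```python
# Compare one-unit ticking with leaping to the next timer expiry.
# Invented timer values; illustrates why leaping shrinks the state count.

def ticks_needed(timers, leap):
    """Count Tick steps until all timers expire."""
    now, steps = 0, 0
    remaining = sorted(timers)
    while remaining:
        if leap:
            now = remaining[0]   # leap directly to the next expiry
        else:
            now += 1             # classic one-unit tick
        steps += 1
        remaining = [t for t in remaining if t > now]
    return steps

timers = [10, 500, 10000]
print(ticks_needed(timers, leap=False), ticks_needed(timers, leap=True))
# 10000 vs 3
```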

  3. Portable microcomputer for the analysis of plutonium gamma-ray spectra. Volume II. Software description and listings

    International Nuclear Information System (INIS)

    Ruhter, W.D.

    1984-05-01

    A portable microcomputer has been developed and programmed for the International Atomic Energy Agency (IAEA) to perform in-field analysis of plutonium gamma-ray spectra. The unit includes a 16-bit LSI-11/2 microprocessor, 32K words of memory, a 20-character display for user prompting, a numeric keyboard for user responses, and a 20-character thermal printer for hard-copy output of results. The unit weighs 11 kg and has dimensions of 33.5 x 30.5 x 23.0 cm. This compactness allows the unit to be stored under an airline seat. Only the positions of the 148-keV 241Pu and 208-keV 237U peaks are required for the spectral analysis, which gives plutonium isotopic ratios and weight percent abundances. Volume I of this report provides a detailed description of the data analysis methodology, operation instructions, hardware, and maintenance and troubleshooting. Volume II describes the software and provides software listings.

  4. Description and validation of ANTEO, an optimised PC code for the thermalhydraulic analysis of fuel bundles

    International Nuclear Information System (INIS)

    Cevolani, S.

    1995-01-01

    The paper describes a Personal Computer oriented subchannel code devoted to the steady-state thermal-hydraulic analysis of nuclear reactor fuel bundles. The development of such a code was made possible by two facts: first, the increase in the computing power of desktop machines; second, several years of experience in operating subchannel codes have shown how to simplify many of the physical models without a significant loss of accuracy. For the sake of validation, the developed code was compared with a traditional subchannel code, COBRA. The results of the comparison show very good agreement between the two codes. (author)

  5. Descriptive sensory analysis of Aceto Balsamico Tradizionale di Modena DOP and Aceto Balsamico Tradizionale di Reggio Emilia DOP.

    Science.gov (United States)

    Zeppa, Giuseppe; Gambigliani Zoccoli, Mario; Nasi, Enrico; Masini, Giovanni; Meglioli, Giuseppe; Zappino, Matteo

    2013-12-01

    Aceto Balsamico Tradizionale (ABT) is a typical Italian vinegar available in two different forms: Aceto Balsamico Tradizionale di Modena DOP (ABTM) and Aceto Balsamico Tradizionale di Reggio Emilia DOP (ABTRE). ABT is obtained by alcoholic fermentation and acetic bio-oxidation of cooked grape must and aged at least 12 years in wooden casks and is known and sold around the world. Despite this widespread recognition, data on sensory characteristics of these products are very scarce. Therefore a descriptive analysis was conducted to define a lexicon for the ABT sensory profile and to create a simple, stable and reproducible synthetic ABT for training panellists. A lexicon of 20 sensory parameters was defined and validated and a synthetic ABT was prepared as standard reference. Simple standards for panellist training were also defined and the sensory profiles of ABTM and ABTRE were obtained. The obtained results confirm that descriptive analysis can be used for the sensory characterisation of ABT and that the sensory profiles of ABTM and ABTRE are very different. Furthermore, the results demonstrate that a lexicon and proper standard references are essential for describing the sensory qualities of ABT both for technical purposes and to protect the product from commercial fraud. © 2013 Society of Chemical Industry.

  6. ELESTRES.M11K program users' manual and description

    International Nuclear Information System (INIS)

    Suk, H. C.; Hwang, W.; Kim, B. G.; Sim, K. S.; Heo, Y. H.; Byun, T. S.; Park, G. S.

    1992-12-01

    ELESTRES.M11K is a computer program for simulating the behaviour of UO2 fuel elements under normal operating conditions of a CANDU reactor. It computes the one-dimensional temperature distribution and thermal expansion of the fuel pellets, and computes two-dimensional pellet deformation using the finite element method (FEM). The amount of fission gas released and the sheath strain/stress are also computed. This document is intended as a users' manual and description for the ELESTRES.M11K program. (Author)

  7. On Monte Carlo Simulation and Analysis of Electricity Markets

    International Nuclear Information System (INIS)

    Amelin, Mikael

    2004-07-01

    This dissertation is about how Monte Carlo simulation can be used to analyse electricity markets. There is a wide range of applications for simulation; for example, players in the electricity market can use simulation to decide whether or not an investment can be expected to be profitable, and authorities can by means of simulation find out which consequences a certain market design can be expected to have on electricity prices, environmental impact, etc. The first part of the dissertation focuses on which electricity market models are suitable for Monte Carlo simulation. The starting point is a definition of an ideal electricity market. Such an electricity market is partly practical from a mathematical point of view (it is simple to formulate and does not require overly complex calculations) and partly a representation of the best possible resource utilisation. The definition of the ideal electricity market is followed by an analysis of how reality differs from the ideal model, what consequences the differences have on the rules of the electricity market and the strategies of the players, and how non-ideal properties can be included in a mathematical model. In particular, questions about environmental impact, forecast uncertainty and grid costs are studied. The second part of the dissertation treats the Monte Carlo technique itself. To reduce the number of samples necessary to obtain accurate results, variance reduction techniques can be used. Here, six different variance reduction techniques are studied and possible applications are pointed out. The conclusions of these studies are turned into a method for efficient simulation of basic electricity markets. The method is applied to some test systems, and the results show that the chosen variance reduction techniques can produce equal or better results using 99% fewer samples compared to when the same system is simulated without any variance reduction technique. More complex electricity market models
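To illustrate what variance reduction buys, here is one classic technique, antithetic variates, applied to a toy expectation. The example is illustrative only; the six techniques studied in the dissertation are not reproduced here.

```python
# Antithetic variates on the toy expectation E[f(U)] with f(u) = u**2
# (true value 1/3): each sample u is paired with its mirror 1-u, and the
# negative correlation between f(u) and f(1-u) lowers estimator variance.
import random

def plain_mc(f, n, rng):
    return sum(f(rng.random()) for _ in range(n)) / n

def antithetic_mc(f, n, rng):
    total = 0.0
    for _ in range(n // 2):
        u = rng.random()
        total += f(u) + f(1.0 - u)   # one draw yields a mirrored pair
    return total / n

f = lambda u: u * u
rng = random.Random(42)
print(plain_mc(f, 10_000, rng), antithetic_mc(f, 10_000, rng))
```

Both estimators target the same value, but the antithetic one typically needs fewer draws for the same accuracy, which is the effect the dissertation exploits at larger scale.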

  8. Qualitative analysis of the rare earth element by simulation of inductively coupled plasma emission spectra

    International Nuclear Information System (INIS)

    Hashimoto, M.S.; Tobishima, Taeko; Kamitake, Seigo; Yasuda, Kazuo.

    1985-01-01

    Emission lines for the qualitative analysis of rare earth elements by a simulation technique for ICP spectra are proposed. The spectra were simulated using a Gaussian (or, at high concentrations, a Lorentzian) profile. The simulated spectra corresponded quite well with the observed ones. The emission lines were selected so that interference was as small as possible. The qualitative analysis is based on a pattern recognition method in which the observed intensity ratios of the emission lines of each element are compared with those of a single analyte element. The qualitative analysis was performed for twelve standard solutions each containing a single rare earth element and for eight standard solutions containing elements other than rare earths. The selection of the emission lines and the algorithm of the qualitative analysis were justified. (author)
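The two ingredients named in the abstract, a Gaussian line profile for spectrum simulation and identification by comparing line-intensity ratios, can be sketched as follows. All wavelengths and intensities below are invented for illustration.

```python
# Sketch of (1) Gaussian line profiles for simulating an emission
# spectrum and (2) identification by comparing observed line-intensity
# ratios against single-element references. Values are hypothetical.
import math

def gaussian_line(wl, center, intensity, fwhm):
    sigma = fwhm / (2 * math.sqrt(2 * math.log(2)))
    return intensity * math.exp(-((wl - center) ** 2) / (2 * sigma ** 2))

def simulate_spectrum(lines, wavelengths, fwhm=0.02):
    # each line is a (center_wavelength, intensity) pair
    return [sum(gaussian_line(wl, c, i, fwhm) for c, i in lines)
            for wl in wavelengths]

def ratio_match(observed, reference):
    # normalise each line set to its strongest line and compare ratios;
    # a smaller score means a better pattern match
    obs = [i / max(observed) for i in observed]
    ref = [i / max(reference) for i in reference]
    return sum((o - r) ** 2 for o, r in zip(obs, ref))

# hypothetical line intensities for two candidate elements
observed  = [100.0, 55.0, 20.0]
element_a = [100.0, 54.0, 21.0]   # good match
element_b = [100.0, 10.0, 80.0]   # poor match
print(ratio_match(observed, element_a) < ratio_match(observed, element_b))  # True
```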

  9. Nuclear Fuel Cycle Analysis and Simulation Tool (FAST)

    Energy Technology Data Exchange (ETDEWEB)

    Ko, Won Il; Kwon, Eun Ha; Kim, Ho Dong

    2005-06-15

    This paper describes the Nuclear Fuel Cycle Analysis and Simulation Tool (FAST), which has been developed by the Korea Atomic Energy Research Institute (KAERI). Categorizing various mixes of nuclear reactors and fuel cycles into 11 scenario groups, FAST calculates all the required quantities for each nuclear fuel cycle component, such as mining, conversion, enrichment and fuel fabrication, for each scenario. A major advantage of FAST is that the code employs an MS Excel spreadsheet with Visual Basic for Applications, allowing users to manipulate it with ease. The calculation is also quick enough to make comparisons among different options in a considerably short time. This user-friendly simulation code is expected to be beneficial to further studies on the nuclear fuel cycle to find the best options for the future, with proliferation risk, environmental impact and economic costs all considered.

  10. THEXSYST - a knowledge based system for the control and analysis of technical simulation calculations

    International Nuclear Information System (INIS)

    Burger, B.

    1991-07-01

    This system (THEXSYST) will be used for the control, analysis and presentation of thermal hydraulic simulation calculations of light water reactors. THEXSYST is a modular system consisting of an expert shell with user interface, a data base, and a simulation program, and uses techniques available in RSYST. A knowledge base, created to control the simulation calculations of pressurized water reactors, covers both the steady-state calculation and the transient calculation during the depressurization resulting from a small-break loss-of-coolant accident. The methods developed are tested using a simulation calculation with RELAP5/Mod2. It is shown that the application of knowledge-base techniques can be a helpful tool to support existing solutions, especially in graphical analysis. (orig./HP) [de

  11. LNG pool fire simulation for domino effect analysis

    International Nuclear Information System (INIS)

    Masum Jujuly, Muhammad; Rahman, Aziz; Ahmed, Salim; Khan, Faisal

    2015-01-01

    A three-dimensional computational fluid dynamics (CFD) simulation of liquefied natural gas (LNG) pool fire has been performed using ANSYS CFX-14. The CFD model solves the fundamental governing equations of fluid dynamics, namely, the continuity, momentum and energy equations. Several built-in sub-models are used to capture the characteristics of pool fire. The Reynolds-averaged Navier–Stokes (RANS) equation for turbulence and the eddy-dissipation model for non-premixed combustion are used. For thermal radiation, the Monte Carlo (MC) radiation model is used with the Magnussen soot model. The CFD results are compared with a set of experimental data for validation; the results are consistent with the experimental data. CFD results show that the wind speed has a significant effect on the behavior of the pool fire and its domino effects. The radiation contours are also obtained from CFD post-processing and can be applied in risk analysis. The outcome of this study will be helpful for better understanding of the domino effects of pool fire in complex geometrical settings of process industries. - Highlights: • Simulation of pool fire using a computational fluid dynamics (CFD) model. • Integration of the CFD-based pool fire model with domino effect analysis. • Application of the integrated CFD-based domino effect analysis

  12. Detector simulations with DD4hep

    CERN Document Server

    AUTHOR|(SzGeCERN)668365; Frank, Markus; Gaede, Frank-Dieter; Lu, Shaojun; Nikiforou, Nikiforos; Sailer, Andre

    2017-01-01

    Detector description is a key component of detector design studies, test beam analyses, and most particle physics experiments, which require the simulation of more and more different detector geometries and event types. This paper describes DD4hep, an easy-to-use yet flexible and powerful detector description framework that can be used for detector simulation and also extended to the specific needs of a particular working environment. The linear collider detector concepts ILD, SiD and CLICdp as well as the detector development collaborations CALICE and FCal have chosen to adopt the DD4hep geometry framework and its DDG4 pathway to Geant4 as their core simulation and reconstruction tools. The DDG4 plugins suite includes a wide variety of input formats, provides access to the Geant4 particle gun or general particle source, and allows for handling of Monte Carlo truth information, e.g. by linking hits and the primary particle that caused them, which is indispensable for performance and efficiency studies. An extend...

  13. Thermodynamic description of polymorphism in Q- and N-rich peptide aggregates revealed by atomistic simulation.

    Science.gov (United States)

    Berryman, Joshua T; Radford, Sheena E; Harris, Sarah A

    2009-07-08

    Amyloid fibrils are long, helically symmetric protein aggregates that can display substantial variation (polymorphism), including alterations in twist and structure at the beta-strand and protofilament levels, even when grown under the same experimental conditions. The structural and thermodynamic origins of this behavior are not yet understood. We performed molecular-dynamics simulations to determine the thermodynamic properties of different polymorphs of the peptide GNNQQNY, modeling fibrils containing different numbers of protofilaments based on the structure of amyloid-like cross-beta crystals of this peptide. We also modeled fibrils with new orientations of the side chains, as well as a de novo designed structure based on antiparallel beta-strands. The simulations show that these polymorphs are approximately isoenergetic under a range of conditions. Structural analysis reveals a dynamic reorganization of electrostatics and hydrogen bonding in the main and side chains of the Gln and Asn residues that characterize this peptide sequence. Q/N-rich stretches are found in several amyloidogenic proteins and peptides, including the yeast prions Sup35-N and Ure2p, as well as in the human poly-Q disease proteins, including the ataxins and huntingtin. Based on our results, we propose that these residues imbue a unique structural plasticity to the amyloid fibrils that they comprise, rationalizing the ability of proteins enriched in these amino acids to form prion strains with heritable and different phenotypic traits.

  14. One-dimensional, time dependent simulation of the planetary boundary layer over a 48-hour period

    International Nuclear Information System (INIS)

    Haschke, D.; Gassmann, F.; Rudin, F.

    1978-05-01

    Results of a one-dimensional, time dependent simulation of the planetary boundary layer are given. First, a description of the mathematical model used is given and its approximations are discussed. Then a description of the initial and boundary conditions used for the simulation is given. Results are discussed with respect to their agreement with observed data and their precision. It can be demonstrated that a simulation of the planetary boundary layer is possible with satisfactory precision. The incompleteness of observed data gives, however, problems with their use and thus introduces uncertainties into the simulation. As a consequence, the report tries to point to the inherent limitations of such a simulation. (Auth.)

  15. Sensitivity analysis of simulated SOA loadings using a variance-based statistical approach

    Energy Technology Data Exchange (ETDEWEB)

    Shrivastava, Manish [Pacific Northwest National Laboratory, Richland Washington USA; Zhao, Chun [Pacific Northwest National Laboratory, Richland Washington USA; Easter, Richard C. [Pacific Northwest National Laboratory, Richland Washington USA; Qian, Yun [Pacific Northwest National Laboratory, Richland Washington USA; Zelenyuk, Alla [Pacific Northwest National Laboratory, Richland Washington USA; Fast, Jerome D. [Pacific Northwest National Laboratory, Richland Washington USA; Liu, Ying [Pacific Northwest National Laboratory, Richland Washington USA; Zhang, Qi [Department of Environmental Toxicology, University of California Davis, California USA; Guenther, Alex [Department of Earth System Science, University of California, Irvine California USA

    2016-04-08

    We investigate the sensitivity of secondary organic aerosol (SOA) loadings simulated by a regional chemical transport model to 7 selected tunable model parameters: 4 involving emissions of anthropogenic and biogenic volatile organic compounds, anthropogenic semi-volatile and intermediate volatility organics (SIVOCs), and NOx, 2 involving dry deposition of SOA precursor gases, and one involving particle-phase transformation of SOA to low volatility. We adopt a quasi-Monte Carlo sampling approach to effectively sample the high-dimensional parameter space, and perform a 250 member ensemble of simulations using a regional model, accounting for some of the latest advances in SOA treatments based on our recent work. We then conduct a variance-based sensitivity analysis using the generalized linear model method to study the responses of simulated SOA loadings to the tunable parameters. Analysis of SOA variance from all 250 simulations shows that the volatility transformation parameter, which controls whether particle-phase transformation of semi-volatile SOA to non-volatile SOA is on or off, is the dominant contributor to the variance of simulated surface-level daytime SOA (65% domain average contribution). We also split the simulations into 2 subsets of 125 each, depending on whether the volatility transformation is turned on or off. For each subset, the SOA variances are dominated by the parameters involving biogenic VOC and anthropogenic SIVOC emissions. Furthermore, biogenic VOC emissions have a larger contribution to SOA variance when the SOA transformation to non-volatile is on, while anthropogenic SIVOC emissions have a larger contribution when the transformation is off. NOx contributes less than 4.3% to SOA variance, and this low contribution is mainly attributed to the dominance of intermediate to high NOx conditions throughout the simulated domain. The two parameters related to dry deposition of SOA precursor gases also have very low contributions to SOA variance.
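The variance-based approach can be illustrated on a toy model. The sketch below estimates a first-order sensitivity index, Var(E[Y|Xi])/Var(Y), by binning on the parameter; the model Y = 4·x1 + x2 + noise is hypothetical and unrelated to the SOA simulations, and the study itself used a generalized linear model method rather than binning.

```python
# First-order variance-based sensitivity, estimated by binning samples
# on one parameter and measuring the variance of the conditional means.
# The toy model and its coefficients are invented for illustration.
import random, statistics

def first_order_index(x, y, bins=10):
    lo, hi = min(x), max(x)
    width = (hi - lo) / bins or 1.0
    groups = [[] for _ in range(bins)]
    for xi, yi in zip(x, y):
        groups[min(int((xi - lo) / width), bins - 1)].append(yi)
    cond_means = [statistics.fmean(g) for g in groups if g]
    # Var(E[Y | X_i]) / Var(Y): the fraction of output variance
    # attributable to this parameter alone
    return statistics.pvariance(cond_means) / statistics.pvariance(y)

rng = random.Random(0)
x1 = [rng.random() for _ in range(20_000)]
x2 = [rng.random() for _ in range(20_000)]
y  = [4 * a + b + 0.1 * rng.gauss(0, 1) for a, b in zip(x1, x2)]

# x1 should dominate the output variance, much as the volatility
# transformation parameter dominates SOA variance in the study
print(first_order_index(x1, y) > first_order_index(x2, y))  # True
```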

  16. Comparative analysis of methods and tools for open and closed fuel cycles modeling: MESSAGE and DESAE

    International Nuclear Information System (INIS)

    Andrianov, A.A.; Korovin, Yu.A.; Murogov, V.M.; Fedorova, E.V.; Fesenko, G.A.

    2006-01-01

    A comparative analysis of optimization and simulation methods, taking the MESSAGE and DESAE programs as examples, is carried out for modeling nuclear power prospects and advanced fuel cycles. Test calculations are performed for open and two-component nuclear power systems and a closed fuel cycle. An auxiliary simulation-dynamic model is developed to clarify the differences between the MESSAGE and DESAE modeling approaches. A description of the model is given [ru

  17. Application of subset simulation methods to dynamic fault tree analysis

    International Nuclear Information System (INIS)

    Liu Mengyun; Liu Jingquan; She Ding

    2015-01-01

    Although fault tree analysis has been implemented in the nuclear safety field over the past few decades, it has been criticized for its inability to model time-dependent behaviors. Several methods have been proposed to overcome this disadvantage, and the dynamic fault tree (DFT) has become one of the research highlights. By introducing additional dynamic gates, a DFT is able to describe dynamic behaviors such as the replacement of spare components or the priority of failure events. Using the Monte Carlo simulation (MCS) approach to solve DFTs has attracted rising attention, because it can model the authentic behaviors of systems and avoid the limitations of analytical methods. This paper provides an overview of MCS for DFT analysis, including the sampling of basic events and the propagation rules for logic gates. When calculating rare-event probabilities, standard MCS requires a large number of simulations. To address this weakness, the subset simulation (SS) approach is applied. Using the concept of conditional probability and the Markov chain Monte Carlo (MCMC) technique, the SS method is able to accelerate the exploration of the failure region. Two cases are tested to illustrate the performance of the SS approach, and the numerical results suggest that it gives high efficiency when calculating complicated systems with small failure probabilities. (author)
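A compact sketch of the subset simulation idea follows: the rare-event probability is factored into a product of larger conditional probabilities across intermediate thresholds, each estimated with a short Metropolis chain. The limit-state function and all numbers are illustrative, not taken from the paper.

```python
# Subset simulation sketch for a rare event P(g(X) >= b) under a
# standard normal input model. Intermediate thresholds are set at the
# p0-quantile of each population; samples above a threshold seed
# Metropolis chains conditioned on staying above it. Illustrative only.
import math, random

def subset_simulation(g, dim, b, n=1000, p0=0.1, seed=1, max_levels=20):
    rng = random.Random(seed)
    samples = [[rng.gauss(0, 1) for _ in range(dim)] for _ in range(n)]
    prob = 1.0
    for _ in range(max_levels):
        scored = sorted(samples, key=g, reverse=True)
        keep = int(p0 * n)
        level = g(scored[keep - 1])            # intermediate threshold
        if level >= b:                         # final level reached
            return prob * sum(g(s) >= b for s in samples) / n
        prob *= p0                             # conditional probability factor
        # regrow the population with Metropolis moves conditioned on g >= level
        samples = []
        for seed_x in scored[:keep]:
            x = list(seed_x)
            for _ in range(n // keep):
                cand = [xi + rng.gauss(0, 0.5) for xi in x]
                # accept under the standard normal density, rejecting
                # moves that would fall below the current level
                log_ratio = sum(xi * xi - ci * ci
                                for xi, ci in zip(x, cand)) / 2.0
                if math.log(rng.random() + 1e-300) < log_ratio and g(cand) >= level:
                    x = cand
                samples.append(list(x))
    return prob  # did not converge within max_levels

# rare event: X1 + X2 >= 6 for independent standard normals
# (true probability is on the order of 1e-5, far too small for
# plain Monte Carlo with n=1000 samples)
g = lambda x: x[0] + x[1]
print(subset_simulation(g, dim=2, b=6.0))
```

The key property is that each level only needs to estimate a probability around p0 (here 0.1), so a handful of cheap stages replaces one astronomically expensive direct estimate.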

  18. The electricity portfolio simulation model (EPSim) technical description.

    Energy Technology Data Exchange (ETDEWEB)

    Drennen, Thomas E.; Klotz, Richard (Hobart and William Smith Colleges, Geneva, NY)

    2005-09-01

    Stakeholders often have competing interests when selecting or planning new power plants. The purpose of developing this preliminary Electricity Portfolio Simulation Model (EPSim) is to provide a first-cut, dynamic methodology and approach to this problem, one that can subsequently be refined and validated and that may help energy planners, policy makers, and energy students better understand the tradeoffs associated with competing electricity portfolios. EPSim allows the user to explore competing electricity portfolios annually from 2002 to 2025 in terms of five criteria: cost, environmental impacts, energy dependence, health and safety, and sustainability. Four additional criteria (infrastructure vulnerability, service limitations, policy needs, and science and technology needs) may be added in future versions of the model. Using an analytic hierarchy process (AHP) approach, users or groups of users apply weights to each of the criteria. The default energy assumptions of the model mimic the Department of Energy's (DOE) electricity portfolio to 2025 (EIA, 2005). At any time, the user can compare alternative portfolios to this reference case portfolio.
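The AHP weighting step the abstract mentions can be sketched briefly: users supply pairwise importance judgements between criteria, and the priority weights are the normalised principal eigenvector of the comparison matrix, computed here by power iteration. The matrix values below are invented.

```python
# AHP priority weights from a pairwise comparison matrix, via power
# iteration on the principal eigenvector. The judgements are invented
# for illustration, not taken from EPSim.

def ahp_weights(m, iterations=100):
    n = len(m)
    w = [1.0 / n] * n
    for _ in range(iterations):
        # one power-iteration step: multiply by the matrix, renormalise
        w = [sum(m[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [x / s for x in w]
    return w

# hypothetical pairwise judgements for three criteria, e.g. cost vs
# environmental impacts vs energy dependence; m[i][j] is the stated
# importance of criterion i relative to criterion j
m = [[1,     3,   5],
     [1 / 3, 1,   2],
     [1 / 5, 1 / 2, 1]]
w = ahp_weights(m)
print([round(x, 3) for x in w])  # weights sum to 1; cost dominates here
```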

  19. The daylighting dashboard - A simulation-based design analysis for daylit spaces

    Energy Technology Data Exchange (ETDEWEB)

    Reinhart, Christoph F. [Harvard University, Graduate School of Design, 48 Quincy Street, Cambridge, MA 02138 (United States); Wienold, Jan [Fraunhofer Institute for Solar Energy Systems, Heidenhofstrasse 2, 79110 Freiburg (Germany)

    2011-02-15

    This paper presents a vision of how state-of-the-art computer-based analysis techniques can be effectively used during the design of daylit spaces. Following a review of recent advances in dynamic daylight computation capabilities, climate-based daylighting metrics, occupant behavior and glare analysis, a fully integrated design analysis method is introduced that simultaneously considers annual daylight availability, visual comfort and energy use: annual daylight glare probability profiles are combined with an occupant behavior model in order to determine annual shading profiles and visual comfort conditions throughout a space. The shading profiles are then used to calculate daylight autonomy plots, energy loads, operational energy costs and greenhouse gas emissions. The paper then shows how simulation results for a sidelit space can be visually presented to simulation non-experts using the concept of a daylighting dashboard. The paper ends with a discussion of how the daylighting dashboard could be practically implemented using technologies that are available today. (author)
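Daylight autonomy, one of the metrics combined in the dashboard, has a simple definition: the fraction of occupied hours in which daylight alone meets a target illuminance. A minimal sketch with invented hourly values:

```python
# Daylight autonomy: fraction of occupied hours in which daylight alone
# reaches a target illuminance. The hourly values are invented; a real
# calculation would use a full annual (8760-hour) climate-based series.

def daylight_autonomy(illuminance_lux, occupied, target=300.0):
    hours = [e for e, occ in zip(illuminance_lux, occupied) if occ]
    return sum(e >= target for e in hours) / len(hours)

illum    = [0, 120, 450, 800, 610, 290, 50]   # hourly illuminance (lux)
occupied = [0, 1,   1,   1,   1,   1,   0]    # occupancy schedule
print(daylight_autonomy(illum, occupied))     # -> 0.6
```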

  20. Magnetic Testing, and Modeling, Simulation and Analysis for Space Applications

    Science.gov (United States)

    Boghosian, Mary; Narvaez, Pablo; Herman, Ray

    2012-01-01

    The Aerospace Corporation (Aerospace) and Lockheed Martin Space Systems (LMSS) participated with the Jet Propulsion Laboratory (JPL) in the implementation of a magnetic cleanliness program for the NASA/JPL JUNO mission. The magnetic cleanliness program was applied from early flight system development through system-level environmental testing. It required setting up a specialized magnetic test facility at Lockheed Martin Space Systems for testing the flight system, and a testing program, with a facility at JPL, for testing system parts and subsystems. The magnetic modeling, simulation and analysis capability was set up and performed by Aerospace to provide qualitative and quantitative magnetic assessments of magnetic parts, components, and subsystems prior to, or in lieu of, magnetic tests. Because of the sensitive nature of the fields-and-particles scientific measurements being conducted by the JUNO space mission to Jupiter, the imposition of stringent magnetic control specifications required a magnetic control program to ensure that the spacecraft's science magnetometers and plasma wave search coil were not contaminated by flight system magnetic interference. With Aerospace's magnetic modeling, simulation and analysis, JPL's system modeling and testing approach, and LMSS's test support, the project achieved a cost-effective path to a magnetically clean spacecraft. This paper presents lessons learned from the JUNO magnetic testing approach and Aerospace's modeling, simulation and analysis activities, used to solve problems such as remnant magnetization and the performance of hard and soft magnetic materials within the targeted space system in applied external magnetic fields.