WorldWideScience

Sample records for high performance simulation

  1. High-Performance Beam Simulator for the LANSCE Linac

    International Nuclear Information System (INIS)

    Pang, Xiaoying; Rybarcyk, Lawrence J.; Baily, Scott A.

    2012-01-01

    A high performance multiparticle tracking simulator is currently under development at Los Alamos. The heart of the simulator is based upon the beam dynamics simulation algorithms of the PARMILA code, but implemented in C++ on Graphics Processing Unit (GPU) hardware using NVIDIA's CUDA platform. Linac operating set points are provided to the simulator via the EPICS control system so that changes to the real-time linac parameters are tracked and the simulation results updated automatically. This simulator will provide valuable insight into the beam dynamics along a linac in pseudo real-time, especially where direct measurements of the beam properties do not exist. Details regarding the approach, benefits and performance are presented.

  2. MUMAX: A new high-performance micromagnetic simulation tool

    International Nuclear Information System (INIS)

    Vansteenkiste, A.; Van de Wiele, B.

    2011-01-01

    We present MUMAX, a general-purpose micromagnetic simulation tool running on graphical processing units (GPUs). MUMAX is designed for high-performance computations and specifically targets large simulations, for which speedups of over a factor of 100 can be obtained compared with the CPU-based OOMMF program developed at NIST. MUMAX aims to be general and broadly applicable. It solves the classical Landau-Lifshitz equation taking into account the magnetostatic, exchange and anisotropy interactions, thermal effects and spin-transfer torque. Periodic boundary conditions can optionally be imposed. A spatial discretization using finite differences in two or three dimensions can be employed. MUMAX is publicly available as open-source software. It can thus be freely used and extended by the community. Due to its high computational performance, MUMAX should open up the possibility of running extensive simulations that would be nearly inaccessible with typical CPU-based simulators. - Highlights: → Novel, open-source micromagnetic simulator on GPU hardware. → Speedup of ∼100× compared to other widely used tools. → Extensively validated against standard problems. → Makes previously infeasible simulations accessible.
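The Landau-Lifshitz dynamics that MUMAX integrates can be illustrated for a single macrospin. The sketch below is ours, not MUMAX code: plain NumPy, explicit Euler with renormalization, and illustrative values for the gyromagnetic constant, damping and field.

```python
import numpy as np

GAMMA_MU0 = 2.21e5   # gyromagnetic ratio times mu0, m/(A*s); field given in A/m
ALPHA = 0.1          # Gilbert damping constant (illustrative)

def llg_rhs(m, h_eff):
    """Landau-Lifshitz-Gilbert right-hand side dm/dt for a unit vector m."""
    precession = -GAMMA_MU0 * np.cross(m, h_eff)
    damping = -GAMMA_MU0 * ALPHA * np.cross(m, np.cross(m, h_eff))
    return (precession + damping) / (1.0 + ALPHA**2)

m = np.array([1.0, 0.0, 0.0])   # initial magnetization direction
h = np.array([0.0, 0.0, 1e5])   # static effective field along z, A/m
dt = 1e-12                      # timestep, s
for _ in range(20000):          # 20 ns of damped precession
    m = m + dt * llg_rhs(m, h)
    m /= np.linalg.norm(m)      # keep |m| = 1, as the physics requires
print(m)                        # relaxes toward the field direction, ~[0, 0, 1]
```

A real micromagnetic solver recomputes `h_eff` every step from the magnetostatic, exchange and anisotropy fields on a finite-difference grid; this toy keeps it constant.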

  3. High performance ultrasonic field simulation on complex geometries

    Science.gov (United States)

    Chouh, H.; Rougeron, G.; Chatillon, S.; Iehl, J. C.; Farrugia, J. P.; Ostromoukhov, V.

    2016-02-01

    Ultrasonic field simulation is a key ingredient for the design of new testing methods as well as a crucial step for NDT inspection simulation. As presented in a previous paper [1], CEA-LIST has worked on the acceleration of these simulations focusing on simple geometries (planar interfaces, isotropic materials). In this context, significant accelerations were achieved on multicore processors and GPUs (Graphics Processing Units), bringing the execution time of realistic computations into the 0.1 s range. In this paper, we present recent work that aims at similar performance on a wider range of configurations. We adapted the physical model used by the CIVA platform to design and implement a new algorithm providing a fast ultrasonic field simulation that yields nearly interactive results for complex cases. The improvements over the CIVA pencil-tracing method include adaptive strategies for pencil subdivision to achieve a good refinement of the sensor geometry while keeping a reasonable number of ray-tracing operations. Also, interpolation of the times of flight was used to avoid time-consuming computations in the impulse response reconstruction stage. To achieve the best performance, our algorithm runs on multi-core superscalar CPUs and uses high performance specialized libraries such as Intel Embree for ray-tracing, Intel MKL for signal processing and Intel TBB for parallelization. We validated the simulation results by comparing them to the ones produced by CIVA on identical test configurations including mono-element and multiple-element transducers, homogeneous, meshed 3D CAD specimens, isotropic and anisotropic materials and wave paths that can involve several interactions with interfaces. We show performance results on complete simulations that achieve computation times in the 1 s range.
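The time-of-flight interpolation idea is easy to sketch: evaluate the expensive quantity on a coarse grid once, then interpolate everywhere else. The function below is a hypothetical stand-in for a pencil-traced time of flight, not CIVA's model, and the grid sizes are arbitrary.

```python
import numpy as np

# Hypothetical stand-in for an expensive time-of-flight evaluation
# (a real tool would obtain this by pencil/ray tracing through the part).
def tof_exact(x):
    return np.sqrt(0.01 + x**2) / 5900.0   # path length / P-wave speed (m, m/s)

x_coarse = np.linspace(0.0, 0.05, 33)      # evaluate exactly on 33 points only
t_coarse = tof_exact(x_coarse)

x_fine = np.linspace(0.0, 0.05, 10001)     # interpolate on 10001 points
t_interp = np.interp(x_fine, x_coarse, t_coarse)

max_err = np.max(np.abs(t_interp - tof_exact(x_fine)))
print(max_err)  # well below one sampling period (1e-8 s at a 100 MHz digitizer)
```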

  4. High performance real-time flight simulation at NASA Langley

    Science.gov (United States)

    Cleveland, Jeff I., II

    1994-01-01

    In order to meet the stringent time-critical requirements for real-time man-in-the-loop flight simulation, computer processing operations must be deterministic and be completed in as short a time as possible. This includes simulation mathematical model computation and data input/output to the simulators. In 1986, in response to increased demands for flight simulation performance, personnel at NASA's Langley Research Center (LaRC), working with the contractor, developed extensions to a standard input/output system to provide for high-bandwidth, low-latency data acquisition and distribution. The Computer Automated Measurement and Control technology (IEEE standard 595) was extended to meet the performance requirements for real-time simulation. This technology extension increased the effective bandwidth by a factor of ten and increased the performance of modules necessary for simulator communications. This technology is being used by more than 80 leading technological developers in the United States, Canada, and Europe. Included among the commercial applications of this technology are nuclear process control, power grid analysis, process monitoring, real-time simulation, and radar data acquisition. Personnel at LaRC have completed the development of the use of supercomputers for simulation mathematical model computation to support real-time flight simulation. This includes the development of a real-time operating system and the development of specialized software and hardware for the CAMAC simulator network. This work, coupled with the use of an open systems software architecture, has advanced the state of the art in real-time flight simulation. The data acquisition technology innovation and experience with recent developments in this technology are described.

  5. Comprehensive Simulation Lifecycle Management for High Performance Computing Modeling and Simulation, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — There are significant logistical barriers to entry-level high performance computing (HPC) modeling and simulation (M&S). IllinoisRocstar sets up the infrastructure for...

  6. Simulations of KSTAR high performance steady state operation scenarios

    International Nuclear Information System (INIS)

    Na, Yong-Su; Kessel, C.E.; Park, J.M.; Yi, Sumin; Kim, J.Y.; Becoulet, A.; Sips, A.C.C.

    2009-01-01

    We report the results of predictive modelling of high performance steady state operation scenarios in KSTAR. Firstly, the capabilities of steady state operation are investigated with time-dependent simulations using a free-boundary plasma equilibrium evolution code coupled with transport calculations. Secondly, the reproducibility of high performance steady state operation scenarios developed in the DIII-D tokamak, of similar size to that of KSTAR, is investigated using the experimental data taken from DIII-D. Finally, the capability of ITER-relevant steady state operation is investigated in KSTAR. It is found that KSTAR is able to establish high performance steady state operation scenarios: βN above 3, H98(y,2) up to 2.0, fBS up to 0.76 and fNI equal to 1.0. In this work, a realistic density profile is newly introduced for predictive simulations by employing the scaling law of the density peaking factor. The influence of the current ramp-up scenario and of the transport model is discussed with respect to the fusion performance and the non-inductive current drive fraction in the transport simulations. As observed in the experiments, both the heating and the plasma current waveforms in the current ramp-up phase strongly affect the q-profile, the fusion performance and the non-inductive current drive fraction in the current flattop phase. A criterion in terms of qmin is found for establishing ITER-relevant steady state operation scenarios. This will provide a guideline for designing the current ramp-up phase in KSTAR. The transport model also affects the predicted fusion performance and non-inductive current drive fraction: the Weiland model predicts the highest values and the GLF23 model the lowest. ITER-relevant advanced scenarios cannot be obtained with the GLF23 model under the conditions given in this work.

  7. An Advanced, Interactive, High-Performance Liquid Chromatography Simulator and Instructor Resources

    Science.gov (United States)

    Boswell, Paul G.; Stoll, Dwight R.; Carr, Peter W.; Nagel, Megan L.; Vitha, Mark F.; Mabbott, Gary A.

    2013-01-01

    High-performance liquid chromatography (HPLC) simulation software has long been recognized as an effective educational tool, yet many existing HPLC simulators are either too expensive or outdated, or lack important features necessary to make them widely useful for educational purposes. Here, a free, open-source HPLC simulator is…

  8. Crystal and molecular simulation of high-performance polymers.

    Science.gov (United States)

    Colquhoun, H M; Williams, D J

    2000-03-01

    Single-crystal X-ray analyses of oligomeric models for high-performance aromatic polymers, interfaced to computer-based molecular modeling and diffraction simulation, have enabled the determination of a range of previously unknown polymer crystal structures from X-ray powder data. Materials which have been successfully analyzed using this approach include aromatic polyesters, polyetherketones, polythioetherketones, polyphenylenes, and polycarboranes. Pure macrocyclic homologues of noncrystalline polyethersulfones afford high-quality single crystals, even at very large ring sizes, and have provided the first examples of a "protein crystallographic" approach to the structures of conventionally amorphous synthetic polymers.

  9. High performance MRI simulations of motion on multi-GPU systems.

    Science.gov (United States)

    Xanthis, Christos G; Venetis, Ioannis E; Aletras, Anthony H

    2014-07-04

    MRI physics simulators have been developed in the past for optimizing imaging protocols and for training purposes. However, these simulators have only addressed motion within a limited scope. The purpose of this study was the incorporation of realistic motion, such as cardiac motion, respiratory motion and flow, within MRI simulations in a high performance multi-GPU environment. Three different motion models were introduced in the Magnetic Resonance Imaging SIMULator (MRISIMUL) of this study: cardiac motion, respiratory motion and flow. Simulation of a simple Gradient Echo pulse sequence and a CINE pulse sequence on the corresponding anatomical model was performed. Myocardial tagging was also investigated. In pulse sequence design, software crushers were introduced to accommodate the long execution times and to avoid spurious echo formation. The displacement of the anatomical model isochromats was calculated within the Graphics Processing Unit (GPU) kernel for every timestep of the pulse sequence. Experiments that would allow simulation of custom anatomical and motion models were also performed. Finally, simulations of motion with MRISIMUL on single-node and multi-node multi-GPU systems were examined. Gradient Echo and CINE images of the three motion models were produced and motion-related artifacts were demonstrated. The temporal evolution of the contractility of the heart was presented through the application of myocardial tagging. Better simulation performance and image quality were achieved through the introduction of software crushers without the need to further increase the computational load and GPU resources. MRISIMUL demonstrated almost linearly scalable performance with the increasing number of available GPU cards, in both single-node and multi-node multi-GPU computer systems. MRISIMUL is the first MR physics simulator to have implemented motion with a 3D large computational load on a single-computer multi-GPU configuration.
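The per-timestep isochromat update that this kind of simulator parallelizes is essentially a rotation for precession plus exponential relaxation. The NumPy sketch below is our illustration of that kernel, not MRISIMUL code; the tissue constants and off-resonance spread are generic.

```python
import numpy as np

def bloch_step(M, dw, dt, T1=0.85, T2=0.05, M0=1.0):
    """Advance isochromats by one timestep: free precession about z at the
    local off-resonance dw (rad/s), then T1/T2 relaxation."""
    phi = dw * dt
    mx = M[:, 0] * np.cos(phi) + M[:, 1] * np.sin(phi)
    my = -M[:, 0] * np.sin(phi) + M[:, 1] * np.cos(phi)
    e1, e2 = np.exp(-dt / T1), np.exp(-dt / T2)
    return np.column_stack([mx * e2, my * e2, M[:, 2] * e1 + M0 * (1.0 - e1)])

# 1000 isochromats tipped into the transverse plane, with a spread of
# off-resonance frequencies so they dephase over time.
M = np.tile([1.0, 0.0, 0.0], (1000, 1))
dw = np.linspace(-200.0, 200.0, 1000)          # rad/s
for _ in range(500):                           # 500 steps of 0.1 ms = 50 ms
    M = bloch_step(M, dw, 1e-4)
signal = abs(np.mean(M[:, 0] + 1j * M[:, 1]))  # net signal: T2 decay + dephasing
```

In a GPU implementation each isochromat maps naturally to one thread, which is why this workload scales almost linearly across cards; motion enters by displacing each isochromat before evaluating its local field.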

  10. High performance simulation for the Silva project using the tera computer

    International Nuclear Information System (INIS)

    Bergeaud, V.; La Hargue, J.P.; Mougery, F.; Boulet, M.; Scheurer, B.; Le Fur, J.F.; Comte, M.; Benisti, D.; Lamare, J. de; Petit, A.

    2003-01-01

    In the context of the SILVA Project (Atomic Vapor Laser Isotope Separation), numerical simulation of the plant-scale propagation of laser beams through uranium vapour was a great challenge. The PRODIGE code has been developed to achieve this goal. Here we focus on the task of achieving high performance simulation on the TERA computer. We describe the main issues in optimizing the parallelization of the PRODIGE code on TERA and discuss the advantages and drawbacks of the implemented diagonal parallelization scheme. It proved fruitful to tune the code in three respects: memory allocation, MPI communications and interconnection-network bandwidth usage. We stress the value of MPI-IO in this context and the benefit obtained for production computations on TERA. Finally, we illustrate our developments with performance measurements reflecting the good parallelization properties of PRODIGE on the TERA computer. The code is currently used for demonstrating the feasibility of laser propagation at plant enrichment level and for preparing the 2003 Menphis experiment. We conclude by emphasizing the contribution of high performance TERA simulation to the project. (authors)

  11. High performance simulation for the Silva project using the tera computer

    Energy Technology Data Exchange (ETDEWEB)

    Bergeaud, V.; La Hargue, J.P.; Mougery, F. [CS Communication and Systemes, 92 - Clamart (France); Boulet, M.; Scheurer, B. [CEA Bruyeres-le-Chatel, 91 - Bruyeres-le-Chatel (France); Le Fur, J.F.; Comte, M.; Benisti, D.; Lamare, J. de; Petit, A. [CEA Saclay, 91 - Gif sur Yvette (France)

    2003-07-01

    In the context of the SILVA Project (Atomic Vapor Laser Isotope Separation), numerical simulation of the plant-scale propagation of laser beams through uranium vapour was a great challenge. The PRODIGE code has been developed to achieve this goal. Here we focus on the task of achieving high performance simulation on the TERA computer. We describe the main issues in optimizing the parallelization of the PRODIGE code on TERA and discuss the advantages and drawbacks of the implemented diagonal parallelization scheme. It proved fruitful to tune the code in three respects: memory allocation, MPI communications and interconnection-network bandwidth usage. We stress the value of MPI-IO in this context and the benefit obtained for production computations on TERA. Finally, we illustrate our developments with performance measurements reflecting the good parallelization properties of PRODIGE on the TERA computer. The code is currently used for demonstrating the feasibility of laser propagation at plant enrichment level and for preparing the 2003 Menphis experiment. We conclude by emphasizing the contribution of high performance TERA simulation to the project. (authors)

  12. High performance cellular level agent-based simulation with FLAME for the GPU.

    Science.gov (United States)

    Richmond, Paul; Walker, Dawn; Coakley, Simon; Romano, Daniela

    2010-05-01

    Driven by the availability of experimental data and the ability to simulate a biological scale of immediate interest, the cellular scale is fast emerging as an ideal candidate for middle-out modelling. As with 'bottom-up' simulation approaches, cellular-level simulations demand a high degree of computational power, which in large-scale simulations can only be achieved through parallel computing. The flexible large-scale agent modelling environment (FLAME) is a template-driven framework for agent-based modelling (ABM) on parallel architectures ideally suited to the simulation of cellular systems. It is available for both high performance computing clusters (www.flame.ac.uk) and GPU hardware (www.flamegpu.com) and uses a formal specification technique that acts as a universal modelling format. This not only creates an abstraction from the underlying hardware architectures, but avoids the steep learning curve associated with programming them. In benchmarking tests and simulations of advanced cellular systems, FLAME GPU has reported massive improvement in performance over more traditional ABM frameworks. This allows the time spent in the development and testing stages of modelling to be drastically reduced and creates the possibility of real-time visualisation for simple visual face-validation.
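A flavour of cellular-level ABM can be given in a few lines: agents "broadcast" their positions and each agent reacts to the messages within a contact radius. This is a toy NumPy sketch in the spirit of FLAME's message model, not its template/specification workflow; the repulsion rule and constants are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

def step(pos, radius=1.0, push=0.05):
    """One synchronous update: every cell reads all position 'messages' and
    moves away from neighbours closer than the contact radius."""
    diff = pos[:, None, :] - pos[None, :, :]        # pairwise displacements
    dist = np.linalg.norm(diff, axis=-1)
    np.fill_diagonal(dist, np.inf)                  # ignore self-messages
    near = (dist < radius)[..., None]
    force = np.where(near, diff / dist[..., None], 0.0).sum(axis=1)
    return pos + push * force

def mean_nn(p):
    """Mean nearest-neighbour distance, a simple crowding measure."""
    d = np.linalg.norm(p[:, None] - p[None, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    return d.min(axis=1).mean()

pos = rng.uniform(0.0, 2.0, size=(50, 2))           # 50 crowded cells
before = mean_nn(pos)
for _ in range(100):
    pos = step(pos)
print(before, mean_nn(pos))   # contact repulsion spreads the population out
```

On a GPU, exactly this kind of synchronous, message-driven update is what each agent kernel evaluates in parallel.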

  13. High performance electromagnetic simulation tools

    Science.gov (United States)

    Gedney, Stephen D.; Whites, Keith W.

    1994-10-01

    Army Research Office Grant #DAAH04-93-G-0453 has supported the purchase of 24 additional compute nodes that were installed in the Intel iPSC/860 hypercube at the University of Kentucky (UK), rendering a 32-node multiprocessor. This facility has allowed the investigators to explore and extend the boundaries of electromagnetic simulation for important areas of defense concern, including microwave monolithic integrated circuit (MMIC) design/analysis and electromagnetic materials research and development. The iPSC/860 has also provided an ideal platform for MMIC circuit simulations. A number of parallel methods based on direct time-domain solutions of Maxwell's equations have been developed on the iPSC/860, including a parallel finite-difference time-domain (FDTD) algorithm, and a parallel planar generalized Yee-algorithm (PGY). The iPSC/860 has also provided an ideal platform on which to develop a 'virtual laboratory' to numerically analyze, scientifically study and develop new types of materials with beneficial electromagnetic properties. These materials simulations are capable of assembling hundreds of microscopic inclusions from which an electromagnetic full-wave solution will be obtained in toto. This powerful simulation tool has enabled research of the full-wave analysis of complex multicomponent MMIC devices and the electromagnetic properties of many types of materials to be performed numerically rather than strictly in the laboratory.
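Of the methods listed, the FDTD algorithm is the easiest to sketch. Below is a textbook one-dimensional Yee update in normalized units (Courant number of 1), not the parallel 3-D code from the grant; grid size and source are arbitrary choices of ours.

```python
import numpy as np

nz, nsteps, imp0 = 400, 200, 377.0   # grid cells, timesteps, free-space impedance
ez = np.zeros(nz)                    # E-field nodes
hy = np.zeros(nz)                    # H-field nodes, staggered half a cell
for t in range(nsteps):
    hy[:-1] += np.diff(ez) / imp0               # update H from the curl of E
    ez[1:] += np.diff(hy) * imp0                # update E from the curl of H
    ez[0] = np.exp(-((t - 30.0) ** 2) / 100.0)  # Gaussian hard source, left edge

peak = int(np.argmax(ez))
print(peak)   # the pulse travels one cell per step: peak near cell 170
```

Parallelizing this on a hypercube amounts to splitting the grid into slabs and exchanging the boundary field values between neighbouring nodes each timestep.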

  14. High Performance Electrical Modeling and Simulation Verification Test Suite - Tier I

    International Nuclear Information System (INIS)

    SCHELLS, REGINA L.; BOGDAN, CAROLYN W.; WIX, STEVEN D.

    2001-01-01

    This document describes the High Performance Electrical Modeling and Simulation (HPEMS) Global Verification Test Suite (VERTS). The VERTS is a regression test suite used to verify the electrical circuit simulation codes currently being developed by the HPEMS code development team. This document contains descriptions of the Tier I test cases.

  15. High Performance Wideband CMOS CCI and its Application in Inductance Simulator Design

    Directory of Open Access Journals (Sweden)

    ARSLAN, E.

    2012-08-01

    In this paper, a new, differential-pair-based, low-voltage, high performance and wideband CMOS first-generation current conveyor (CCI) is proposed. The proposed CCI has high voltage swings on ports X and Y and very low equivalent impedance on port X due to a super-source-follower configuration. It also has high voltage swings (close to the supply voltages) on its input and output ports and wideband current and voltage transfer ratios. Furthermore, two novel grounded inductance simulator circuits are proposed as application examples. Using HSpice, it is shown that the simulation results for the proposed CCI and for the presented inductance simulators are in very good agreement with the expected ones.
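The principle behind a grounded inductance simulator is the gyrator relation: a capacitor seen through two conveyor/transconductor stages looks inductive, with L = C·R1·R2. The component values below are illustrative, and the paper's actual circuit topologies are not reproduced here.

```python
import numpy as np

R1, R2, C = 1e3, 1e3, 1e-9    # 1 kOhm, 1 kOhm, 1 nF (illustrative values)
L = C * R1 * R2               # simulated inductance: 1 mH
f = np.logspace(3, 7, 5)      # 1 kHz .. 10 MHz
Zin = 1j * 2 * np.pi * f * L  # ideal input impedance of the simulator
print(L, np.abs(Zin))         # |Zin| grows linearly with frequency, like a coil
```

A real CCI-based realization departs from this ideal at high frequency, which is why the paper emphasizes wideband current and voltage transfer ratios.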

  16. Simulation model of a twin-tail, high performance airplane

    Science.gov (United States)

    Buttrill, Carey S.; Arbuckle, P. Douglas; Hoffler, Keith D.

    1992-01-01

    The mathematical model and associated computer program to simulate a twin-tailed high performance fighter airplane (McDonnell Douglas F/A-18) are described. The simulation program is written in the Advanced Continuous Simulation Language. The simulation math model includes the nonlinear six degree-of-freedom rigid-body equations, an engine model, sensors, and first order actuators with rate and position limiting. A simplified form of the F/A-18 digital control laws (version 8.3.3) is implemented. The simulated control law includes only inner loop augmentation in the up and away flight mode. The aerodynamic forces and moments are calculated from a wind-tunnel-derived database using table look-ups with linear interpolation. The aerodynamic database has an angle-of-attack range of -10 to +90 degrees and a sideslip range of -20 to +20 degrees. The effects of elastic deformation are incorporated in a quasi-static-elastic manner. Elastic degrees of freedom are not actively simulated. In the engine model, the throttle-commanded steady-state thrust level and the dynamic response characteristics of the engine are based on airflow rate as determined from a table look-up. Afterburner dynamics are switched in at a threshold based on the engine airflow and commanded thrust.
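The wind-tunnel-database lookup described above amounts to (bi)linear interpolation over breakpoints in angle of attack and sideslip. The sketch below uses a synthetic coefficient surface, not F/A-18 data, and the helper name `lookup` is ours.

```python
import numpy as np

# Breakpoints matching the quoted ranges: alpha -10..+90 deg, beta -20..+20 deg.
alpha_brk = np.linspace(-10.0, 90.0, 21)
beta_brk = np.linspace(-20.0, 20.0, 9)
A, B = np.meshgrid(alpha_brk, beta_brk, indexing="ij")
CL_table = 0.06 * A * np.cos(np.radians(B))   # toy lift-coefficient surface

def lookup(table, xs, ys, x, y):
    """Bilinear table lookup with clamping at the table edges."""
    i = np.clip(np.searchsorted(xs, x) - 1, 0, len(xs) - 2)
    j = np.clip(np.searchsorted(ys, y) - 1, 0, len(ys) - 2)
    tx = (x - xs[i]) / (xs[i + 1] - xs[i])
    ty = (y - ys[j]) / (ys[j + 1] - ys[j])
    return ((1 - tx) * (1 - ty) * table[i, j] + tx * (1 - ty) * table[i + 1, j]
            + (1 - tx) * ty * table[i, j + 1] + tx * ty * table[i + 1, j + 1])

cl = lookup(CL_table, alpha_brk, beta_brk, 12.5, 0.0)
print(cl)   # 0.06 * 12.5 = 0.75, exact because the toy surface is linear in alpha
```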

  17. Simulating Effects of High Angle of Attack on Turbofan Engine Performance

    Science.gov (United States)

    Liu, Yuan; Claus, Russell W.; Litt, Jonathan S.; Guo, Ten-Huei

    2013-01-01

    A method of investigating the effects of high angle of attack (AOA) flight on turbofan engine performance is presented. The methodology involves combining a suite of diverse simulation tools. Three-dimensional, steady-state computational fluid dynamics (CFD) software is used to model the change in performance of a commercial aircraft-type inlet and fan geometry due to various levels of AOA. Parallel compressor theory is then applied to assimilate the CFD data with a zero-dimensional, nonlinear, dynamic turbofan engine model. The combined model shows that high AOA operation degrades fan performance and, thus, negatively impacts compressor stability margins and engine thrust. In addition, the engine response to high AOA conditions is shown to be highly dependent upon the type of control system employed.

  18. High-Performance Modeling of Carbon Dioxide Sequestration by Coupling Reservoir Simulation and Molecular Dynamics

    KAUST Repository

    Bao, Kai; Yan, Mi; Allen, Rebecca; Salama, Amgad; Lu, Ligang; Jordan, Kirk E.; Sun, Shuyu; Keyes, David E.

    2015-01-01

    The present work describes a parallel computational framework for carbon dioxide (CO2) sequestration simulation by coupling reservoir simulation and molecular dynamics (MD) on massively parallel high-performance-computing (HPC) systems.

  19. Aging analysis of high performance FinFET flip-flop under Dynamic NBTI simulation configuration

    Science.gov (United States)

    Zainudin, M. F.; Hussin, H.; Halim, A. K.; Karim, J.

    2018-03-01

    A mechanism known as Negative-Bias Temperature Instability (NBTI) degrades the main electrical parameters of a circuit, especially its performance. Most circuit designs available at present focus only on high performance without considering reliability and robustness. In this paper, the main performance metrics of a high performance FinFET flip-flop, such as delay time and power, were studied in the presence of NBTI degradation. The aging analysis was verified using a 16 nm High Performance Predictive Technology Model (PTM) with different commands available in Synopsys HSPICE. The results show that longer dynamic NBTI stress produces the largest increase in gate delay and reduction in average power between the fresh simulation and the aged stress time under nominal conditions. In addition, circuit performance under varied stress conditions, such as temperature and negative gate stress bias, was also studied.
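A reduced-order view of what such aging simulations compute can be sketched with the commonly used power-law NBTI model: ΔVth = A·t^n with n ≈ 1/6 for reaction-diffusion, and gate delay scaling with the shrinking overdrive voltage. The constants below are illustrative, not fitted to the 16 nm PTM cards or to HSPICE results.

```python
# Power-law NBTI threshold shift and a first-order delay-degradation estimate.
A_COEF, N_EXP = 3e-3, 1.0 / 6.0   # illustrative prefactor and time exponent
VDD, VTH0 = 0.85, 0.30            # nominal supply and fresh threshold (V)

def delta_vth(t_stress_s):
    """Threshold-voltage shift (V) after t_stress_s seconds of stress."""
    return A_COEF * t_stress_s ** N_EXP

def delay_factor(t_stress_s):
    """Gate-delay multiplier: delay ~ 1 / (VDD - Vth) to first order."""
    return (VDD - VTH0) / (VDD - VTH0 - delta_vth(t_stress_s))

YEAR = 365 * 24 * 3600.0
for years in (1, 3, 10):
    print(years, round(delta_vth(years * YEAR), 4),
          round(delay_factor(years * YEAR), 3))
```

The slow t^(1/6) growth is why aged-versus-fresh comparisons are run at multi-year stress times rather than simulated transients.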

  20. Correlations between the simulated military tasks performance and physical fitness tests at high altitude

    Directory of Open Access Journals (Sweden)

    Eduardo Borba Neves

    2017-11-01

    The aim of this study was to investigate the correlations between simulated military task performance and physical fitness tests at high altitude. This research is part of a project to modernize the physical fitness test of the Colombian Army. Data collection was performed at the 13th Battalion of Instruction and Training, located 30 km south of Bogota D.C. at 3100 m above sea level, with temperatures ranging from 1ºC to 23ºC during the study period. The sample was composed of 60 volunteers from three different platoons, who started the data collection protocol after 2 weeks of acclimatization at this altitude. The main results were a high positive correlation between the 3 Assault wall in succession and simulated military task performance (r = 0.764, p < 0.001) and a moderate negative correlation between pull-ups and simulated military task performance (r = -0.535, p < 0.001). The 20-consecutive overtaking of the 3 Assault wall in succession can therefore be recommended as a good estimator of performance in operational tasks involving assault walls, networks of wires, military climbing nets and the Tarzan jump, among others, at high altitude.
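The statistic behind these results is Pearson's r. A minimal implementation with invented paired scores (the study's data are not reproduced) looks like:

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

# Invented paired scores: obstacle-wall repetitions vs. simulated-task points.
wall = [10, 12, 13, 15, 17, 18, 21, 22]
task = [41, 45, 44, 50, 52, 55, 58, 60]
r = pearson_r(wall, task)
print(round(r, 3))   # strongly positive, in the spirit of the reported r = 0.764
```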

  1. High correlation between performance on a virtual-reality simulator and real-life cataract surgery

    DEFF Research Database (Denmark)

    Thomsen, Ann Sofia Skou; Smith, Phillip; Subhi, Yousif

    2017-01-01

    PURPOSE: To investigate the correlation in performance of cataract surgery between a virtual-reality simulator and real-life surgery using two objective assessment tools with evidence of validity. METHODS: Cataract surgeons with varying levels of experience were included in the study. All... antitremor training, forceps training, bimanual training, capsulorhexis and phaco divide and conquer. RESULTS: Eleven surgeons were enrolled. After a designated warm-up period, the proficiency-based test on the EyeSi simulator was strongly correlated to real-life performance measured by motion-tracking software of cataract surgical videos, with a Pearson correlation coefficient of -0.70 (p = 0.017). CONCLUSION: Performance on the EyeSi simulator is significantly and highly correlated to real-life surgical performance. However, it is recommended that performance assessments are made using multiple data...

  2. Cognitive load, emotion, and performance in high-fidelity simulation among beginning nursing students: a pilot study.

    Science.gov (United States)

    Schlairet, Maura C; Schlairet, Timothy James; Sauls, Denise H; Bellflowers, Lois

    2015-03-01

    Establishing the impact of the high-fidelity simulation environment on student performance, as well as identifying factors that could predict learning, would refine simulation outcome expectations among educators. The purpose of this quasi-experimental pilot study was to explore the impact of simulation on emotion and cognitive load among beginning nursing students. Forty baccalaureate nursing students participated in teaching simulations, rated their emotional state and cognitive load, and completed evaluation simulations. Two principal components of emotion were identified, representing the pleasant activation and pleasant deactivation components of affect. Mean rating of cognitive load following simulation was high. Linear regression identified slight but statistically nonsignificant positive associations between principal components of emotion and cognitive load. Logistic regression identified a negative but statistically nonsignificant effect of cognitive load on assessment performance. Among lower-ability students, a more pronounced effect of cognitive load on assessment performance was observed; this also was statistically nonsignificant. Copyright 2015, SLACK Incorporated.

  3. High-Fidelity Simulation in Occupational Therapy Curriculum: Impact on Level II Fieldwork Performance

    Directory of Open Access Journals (Sweden)

    Rebecca Ozelie

    2016-10-01

    Simulation experiences provide experiential learning opportunities during artificially produced real-life medical situations in a safe environment. Evidence supports using simulation in health care education, yet limited quantitative evidence exists in occupational therapy. This study aimed to evaluate the differences in scores on the AOTA Fieldwork Performance Evaluation for the Occupational Therapy Student between Level II occupational therapy students who received high-fidelity simulation training and students who did not. A retrospective analysis of 180 students from a private university was used. Independent-samples nonparametric tests examined mean differences between Fieldwork Performance Evaluation scores of those who did and did not receive simulation experiences in the curriculum. Mean ranks were also analyzed for subsection scores and practice settings. Results of this study found no significant difference in overall Fieldwork Performance Evaluation scores between the two groups. The students who completed simulation and had fieldwork in inpatient rehabilitation had the greatest increase in mean rank scores and increases in several subsections. The outcome measure used in this study was found to have limited discriminatory capability, which may have affected the results; however, this study finds that simulation may be a beneficial supplement to didactic coursework in occupational therapy curricula.

  4. Driving Simulator Development and Performance Study

    OpenAIRE

    Juto, Erik

    2010-01-01

    The driving simulator is a vital tool for much of the research performed at the Swedish National Road and Transport Institute (VTI). Currently VTI possesses three driving simulators: two high-fidelity simulators developed and constructed by VTI, and a medium-fidelity simulator from the German company Dr.-Ing. Reiner Foerst GmbH. The two high-fidelity simulators run the same simulation software, developed at VTI. The medium-fidelity simulator runs proprietary simulation software. At VTI there is...

  5. Simulating and stimulating performance: Introducing distributed simulation to enhance musical learning and performance

    Directory of Open Access Journals (Sweden)

    Aaron eWilliamon

    2014-02-01

    Musicians typically rehearse far away from their audiences and in practice rooms that differ significantly from the concert venues in which they aspire to perform. Due to the high costs and inaccessibility of such venues, much current international music training lacks repeated exposure to realistic performance situations, with students learning all too late (or not at all) how to manage performance stress and the demands of their audiences. Virtual environments have been shown to be an effective training tool in the fields of medicine and sport, offering practitioners access to real-life performance scenarios but with lower risk of negative evaluation and outcomes. The aim of this research was to design and test the efficacy of simulated performance environments in which conditions of "real" performance could be recreated. Advanced violin students (n = 11) were recruited to perform in two simulations: a solo recital with a small virtual audience and an audition situation with three "expert" virtual judges. Each simulation contained back-stage and on-stage areas, life-sized interactive virtual observers, and pre- and post-performance protocols designed to match those found at leading international performance venues. Participants completed a questionnaire on their experiences of using the simulations. Results show that both simulated environments offered realistic experience of performance contexts and were rated particularly useful for developing performance skills. For a subset of 7 violinists, state anxiety and electrocardiographic data were collected during the simulated audition and an actual audition with real judges. Results display comparable levels of reported state anxiety and patterns of heart rate variability in both situations, suggesting that responses to the simulated audition closely approximate those of a real audition. The findings are discussed in relation to their implications, both generalizable and individual-specific, for

  6. Simulating and stimulating performance: introducing distributed simulation to enhance musical learning and performance.

    Science.gov (United States)

    Williamon, Aaron; Aufegger, Lisa; Eiholzer, Hubert

    2014-01-01

    Musicians typically rehearse far away from their audiences and in practice rooms that differ significantly from the concert venues in which they aspire to perform. Due to the high costs and inaccessibility of such venues, much current international music training lacks repeated exposure to realistic performance situations, with students learning all too late (or not at all) how to manage performance stress and the demands of their audiences. Virtual environments have been shown to be an effective training tool in the fields of medicine and sport, offering practitioners access to real-life performance scenarios but with lower risk of negative evaluation and outcomes. The aim of this research was to design and test the efficacy of simulated performance environments in which conditions of "real" performance could be recreated. Advanced violin students (n = 11) were recruited to perform in two simulations: a solo recital with a small virtual audience and an audition situation with three "expert" virtual judges. Each simulation contained back-stage and on-stage areas, life-sized interactive virtual observers, and pre- and post-performance protocols designed to match those found at leading international performance venues. Participants completed a questionnaire on their experiences of using the simulations. Results show that both simulated environments offered realistic experience of performance contexts and were rated particularly useful for developing performance skills. For a subset of 7 violinists, state anxiety and electrocardiographic data were collected during the simulated audition and an actual audition with real judges. Results display comparable levels of reported state anxiety and patterns of heart rate variability in both situations, suggesting that responses to the simulated audition closely approximate those of a real audition. 
The findings are discussed in relation to their implications, both generalizable and individual-specific, for performance training.

  7. High performance stream computing for particle beam transport simulations

    International Nuclear Information System (INIS)

    Appleby, R; Bailey, D; Higham, J; Salt, M

    2008-01-01

    Understanding modern particle accelerators requires simulating charged particle transport through the machine elements. These simulations can be very time consuming due to the large number of particles and the need to consider many turns of a circular machine. Stream computing offers an attractive way to dramatically improve the performance of such simulations by calculating the simultaneous transport of many particles using dedicated hardware. Modern Graphics Processing Units (GPUs) are powerful and affordable stream computing devices. The results of simulations of particle transport through the booster-to-storage-ring transfer line of the DIAMOND synchrotron light source using an NVidia GeForce 7900 GPU are compared to the standard transport code MAD. It is found that particle transport calculations are suitable for stream processing and large performance increases are possible. The accuracy and potential speed gains are compared and the prospects for future work in the area are discussed.
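The per-particle independence that makes beam transport a good fit for stream hardware can be sketched with linear transfer maps. The element parameters and lattice below are purely illustrative (they are not DIAMOND's transfer line); each particle's update depends only on its own coordinates, which is exactly what a GPU parallelizes:

```python
import random

def drift(L):
    # 2x2 transverse transfer matrix (x, x') for a drift of length L metres
    return [[1.0, L], [0.0, 1.0]]

def thin_quad(f):
    # Thin-lens quadrupole of focal length f metres
    return [[1.0, 0.0], [-1.0 / f, 1.0]]

def apply(M, particles):
    # Apply one transfer matrix to every (x, x') pair; the per-particle
    # independence is what maps well onto stream/GPU hardware.
    return [(M[0][0] * x + M[0][1] * xp, M[1][0] * x + M[1][1] * xp)
            for x, xp in particles]

random.seed(1)
beam = [(random.gauss(0, 1e-3), random.gauss(0, 1e-4)) for _ in range(10000)]
for element in [drift(2.0), thin_quad(5.0), drift(2.0)]:
    beam = apply(element, beam)
print(len(beam))  # -> 10000
```

On a GPU each `(x, x')` pair would be handled by its own thread; the CPU list comprehension above is the serial stand-in for that.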

  8. Assessing Technical Performance and Determining the Learning Curve in Cleft Palate Surgery Using a High-Fidelity Cleft Palate Simulator.

    Science.gov (United States)

    Podolsky, Dale J; Fisher, David M; Wong Riff, Karen W; Szasz, Peter; Looi, Thomas; Drake, James M; Forrest, Christopher R

    2018-06-01

    This study assessed technical performance in cleft palate repair using a newly developed assessment tool and high-fidelity cleft palate simulator through a longitudinal simulation training exercise. Three residents performed five and one resident performed nine consecutive endoscopically recorded cleft palate repairs using a cleft palate simulator. Two fellows in pediatric plastic surgery and two expert cleft surgeons also performed recorded simulated repairs. The Cleft Palate Objective Structured Assessment of Technical Skill (CLOSATS) and end-product scales were developed to assess performance. Two blinded cleft surgeons assessed the recordings and the final repairs using the CLOSATS, end-product scale, and a previously developed global rating scale. The average procedure-specific (CLOSATS), global rating, and end-product scores increased logarithmically after each successive simulation session for the residents. Reliability of the CLOSATS (average item intraclass correlation coefficient (ICC), 0.85 ± 0.093) and global ratings (average item ICC, 0.91 ± 0.02) among the raters was high. Reliability of the end-product assessments was lower (average item ICC, 0.66 ± 0.15). Standard setting linear regression using an overall cutoff score of 7 of 10 corresponded to a pass score for the CLOSATS and the global score of 44 (maximum, 60) and 23 (maximum, 30), respectively. Using logarithmic best-fit curves, 6.3 simulation sessions are required to reach the minimum standard. A high-fidelity cleft palate simulator has been developed that improves technical performance in cleft palate repair. The simulator and technical assessment scores can be used to determine performance before operating on patients.

  9. OpenMM 4: A Reusable, Extensible, Hardware Independent Library for High Performance Molecular Simulation.

    Science.gov (United States)

    Eastman, Peter; Friedrichs, Mark S; Chodera, John D; Radmer, Randall J; Bruns, Christopher M; Ku, Joy P; Beauchamp, Kyle A; Lane, Thomas J; Wang, Lee-Ping; Shukla, Diwakar; Tye, Tony; Houston, Mike; Stich, Timo; Klein, Christoph; Shirts, Michael R; Pande, Vijay S

    2013-01-08

    OpenMM is a software toolkit for performing molecular simulations on a range of high performance computing architectures. It is based on a layered architecture: the lower layers function as a reusable library that can be invoked by any application, while the upper layers form a complete environment for running molecular simulations. The library API hides all hardware-specific dependencies and optimizations from the users and developers of simulation programs: they can be run without modification on any hardware on which the API has been implemented. The current implementations of OpenMM include support for graphics processing units using the OpenCL and CUDA frameworks. In addition, OpenMM was designed to be extensible, so new hardware architectures can be accommodated and new functionality (e.g., energy terms and integrators) can be easily added.
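The layered, hardware-independent design described above can be illustrated with a toy analogue. This is not OpenMM's actual API; the class names are invented to show the pattern: the upper simulation layer talks only to an abstract platform interface, so a GPU backend could replace the CPU one without any caller-visible change:

```python
from abc import ABC, abstractmethod

class Platform(ABC):
    """Hardware backend interface: callers never see what is below it."""
    @abstractmethod
    def integrate_step(self, positions, velocities, forces, dt):
        ...

class ReferenceCPU(Platform):
    def integrate_step(self, positions, velocities, forces, dt):
        # Plain Euler step; a CUDA/OpenCL platform would fulfil the same
        # contract on a GPU without any change to the upper layer.
        v = [vi + fi * dt for vi, fi in zip(velocities, forces)]
        x = [xi + vi * dt for xi, vi in zip(positions, v)]
        return x, v

class Simulation:
    """Upper layer: a driver that only talks to the Platform API."""
    def __init__(self, platform, positions, velocities):
        self.platform, self.x, self.v = platform, positions, velocities
    def step(self, forces, dt):
        self.x, self.v = self.platform.integrate_step(self.x, self.v, forces, dt)

sim = Simulation(ReferenceCPU(), [0.0], [1.0])
sim.step(forces=[0.0], dt=0.5)
print(sim.x)  # -> [0.5]
```

Extensibility in this pattern amounts to adding a new `Platform` subclass; existing drivers run unmodified, which is the design property the abstract emphasizes.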

  10. 20th Joint Workshop on Sustained Simulation Performance

    CERN Document Server

    Bez, Wolfgang; Focht, Erich; Patel, Nisarg; Kobayashi, Hiroaki

    2016-01-01

    The book presents the state of the art in high-performance computing and simulation on modern supercomputer architectures. It explores general trends in hardware and software development, and then focuses specifically on the future of high-performance systems and heterogeneous architectures. It also covers applications such as computational fluid dynamics, material science, medical applications and climate research and discusses innovative fields like coupled multi-physics or multi-scale simulations. The papers included were selected from the presentations given at the 20th Workshop on Sustained Simulation Performance at the HLRS, University of Stuttgart, Germany in December 2015, and the subsequent Workshop on Sustained Simulation Performance at Tohoku University in February 2016.

  11. Performance of high-rate TRD prototypes for the CBM experiment in test beam and simulation

    Energy Technology Data Exchange (ETDEWEB)

    Klein-Boesing, Melanie [Institut fuer Kernphysik, Muenster (Germany)

    2008-07-01

    The goal of the future Compressed Baryonic Matter (CBM) experiment is to explore the QCD phase diagram in the region of high baryon densities not covered by other experiments. Among other detectors, it will employ a Transition Radiation Detector (TRD) for tracking of charged particles and electron identification. To meet the demands for tracking and for electron identification at large particle densities and very high interaction rates, high efficiency TRD prototypes have been developed. These prototypes with double-sided pad plane electrodes based on Multiwire Proportional Chambers (MWPC) have been tested at GSI and implemented in the simulation framework of CBM. Results of the performance in a test beam and in simulations are shown. In addition, we present a study of the performance of CBM for electron identification and dilepton reconstruction with this new detector layout.

  12. High-Performance Modeling of Carbon Dioxide Sequestration by Coupling Reservoir Simulation and Molecular Dynamics

    KAUST Repository

    Bao, Kai

    2015-10-26

    The present work describes a parallel computational framework for carbon dioxide (CO2) sequestration simulation by coupling reservoir simulation and molecular dynamics (MD) on massively parallel high-performance-computing (HPC) systems. In this framework, a parallel reservoir simulator, reservoir-simulation toolbox (RST), solves the flow and transport equations that describe the subsurface flow behavior, whereas the MD simulations are performed to provide the required physical parameters. Technologies from several different fields are used to make this novel coupled system work efficiently. One of the major applications of the framework is the modeling of large-scale CO2 sequestration for long-term storage in subsurface geological formations, such as depleted oil and gas reservoirs and deep saline aquifers, which has been proposed as one of the few attractive and practical solutions to reduce CO2 emissions and address the global-warming threat. Fine grids and accurate prediction of the properties of fluid mixtures under geological conditions are essential for accurate simulations. In this work, CO2 sequestration is presented as a first example for coupling reservoir simulation and MD, although the framework can be extended naturally to the full multiphase multicomponent compositional flow simulation to handle more complicated physical processes in the future. Accuracy and scalability analysis are performed on an IBM BlueGene/P and on an IBM BlueGene/Q, the latest IBM supercomputer. Results show good accuracy of our MD simulations compared with published data, and good scalability is observed with the massively parallel HPC systems. The performance and capacity of the proposed framework are well-demonstrated with several experiments with hundreds of millions to one billion cells. To the best of our knowledge, the present work represents the first attempt to couple reservoir simulation and molecular simulation for large-scale modeling. Because of the complexity of
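The coupling pattern described, a flow solver querying MD for fluid properties, can be sketched as follows. Everything here is a stand-in (the "MD" call is a toy correlation, not the RST/MD interface); the point is the structure: the time-stepping loop asks a property service for densities, and caching avoids re-running the expensive calculation for repeated thermodynamic states:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def md_density(pressure_mpa, temperature_k):
    # Stand-in for an expensive MD run; a real framework would dispatch a
    # molecular-dynamics job and return the equilibrated fluid density.
    return 1000.0 * pressure_mpa / (0.3 * temperature_k)

def flow_step(cells):
    # Toy "transport" update consuming the MD-provided densities.
    return [(p * 1.01, t) for p, t in cells]

cells = [(10.0, 320.0), (12.0, 330.0)]   # (pressure MPa, temperature K) per cell
for _ in range(3):                       # time-stepping loop
    rho = [md_density(round(p, 1), t) for p, t in cells]
    cells = flow_step(cells)
print(len(rho))  # -> 2
```

Rounding the state before the lookup coarsens the table so that nearby cells share one MD evaluation, a common trick when the property calculation dominates the cost.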

  13. High-Performance Modeling and Simulation of Anchoring in Granular Media for NEO Applications

    Science.gov (United States)

    Quadrelli, Marco B.; Jain, Abhinandan; Negrut, Dan; Mazhar, Hammad

    2012-01-01

    NASA is interested in designing a spacecraft capable of visiting a near-Earth object (NEO), performing experiments, and then returning safely. Certain periods of this mission would require the spacecraft to remain stationary relative to the NEO, in an environment characterized by very low gravity levels; such situations require an anchoring mechanism that is compact, easy to deploy, and, upon mission completion, easy to remove. The design philosophy used in this task relies on the simulation capability of a high-performance multibody dynamics physics engine. On Earth, it is difficult to create low-gravity conditions, and testing in low-gravity environments, whether artificial or in space, can be costly and very difficult to achieve. Through simulation, the effect of gravity can be controlled with great accuracy, making it ideally suited to analyze the problem at hand. Using Chrono::Engine, a simulation package capable of utilizing massively parallel Graphics Processing Unit (GPU) hardware, several validation experiments were performed. Modeling of the regolith interaction has been carried out, after which the anchor penetration tests were performed and analyzed. The regolith was modeled by a granular medium composed of very large numbers of convex three-dimensional rigid bodies, subject to microgravity levels and interacting with each other through contact, friction, and cohesional forces. The multibody dynamics simulation approach used for simulating anchors penetrating a soil uses a differential variational inequality (DVI) methodology to solve the contact problem posed as a linear complementarity problem (LCP). Implemented within a GPU processing environment, collision detection is greatly accelerated compared to traditional CPU (central processing unit)-based collision detection. Hence, systems of millions of particles interacting with complex dynamic systems can be efficiently analyzed, and design recommendations can be made in a much shorter time. The figure
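A minimal example of the complementarity formulation mentioned above, solved with projected Gauss-Seidel, a common fixed-point sweep for such contact problems (the two-contact system below is hypothetical, not taken from the paper):

```python
def projected_gauss_seidel(M, q, iters=100):
    """Solve the LCP  w = M z + q,  z >= 0,  w >= 0,  z . w = 0
    by projected Gauss-Seidel: sweep the unknowns, solving each row's
    equation and projecting the result onto the non-negative orthant."""
    n = len(q)
    z = [0.0] * n
    for _ in range(iters):
        for i in range(n):
            # Residual of row i with the diagonal contribution removed
            r = q[i] + sum(M[i][j] * z[j] for j in range(n)) - M[i][i] * z[i]
            z[i] = max(0.0, -r / M[i][i])   # project onto z_i >= 0
    return z

# Two "contacts": M symmetric positive definite, q plays the role of the
# pre-impact relative velocities (illustrative numbers).
M = [[2.0, 1.0], [1.0, 2.0]]
q = [-1.0, 1.0]
z = projected_gauss_seidel(M, q)
w = [q[i] + sum(M[i][j] * z[j] for j in range(2)) for i in range(2)]
print(z)  # -> [0.5, 0.0]: contact 1 pushes, contact 2 stays inactive
```

In a contact solver, `z` corresponds to contact impulses and `w` to post-impact separation velocities; the complementarity condition says a contact either pushes (impulse > 0, separation = 0) or is inactive (impulse = 0, separation >= 0).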

  14. Direct numerical simulation of reactor two-phase flows enabled by high-performance computing

    Energy Technology Data Exchange (ETDEWEB)

    Fang, Jun; Cambareri, Joseph J.; Brown, Cameron S.; Feng, Jinyong; Gouws, Andre; Li, Mengnan; Bolotnov, Igor A.

    2018-04-01

    Nuclear reactor two-phase flows remain a great engineering challenge, where high-resolution two-phase flow databases that can inform practical model development are still sparse due to the extreme reactor operating conditions and measurement difficulties. Owing to the rapid growth of computing power, direct numerical simulation (DNS) is enjoying a renewed interest for investigating the related flow problems. A combination of DNS and an interface tracking method provides a unique opportunity to study two-phase flows based on first-principles calculations. More importantly, state-of-the-art high-performance computing (HPC) facilities are helping unlock this great potential. This paper reviews the recent research progress of two-phase flow DNS related to reactor applications. The progress in large-scale bubbly flow DNS has focused not only on the sheer size of those simulations in terms of resolved Reynolds number, but also on the associated advanced modeling and analysis techniques. Specifically, the current areas of active research include modeling of sub-cooled boiling, bubble coalescence, as well as the advanced post-processing toolkit for bubbly flow simulations in reactor geometries. A novel bubble tracking method has been developed to track the evolution of bubbles in two-phase bubbly flow. Also, spectral analysis of the DNS database in different geometries has been performed to investigate the modulation of the energy spectrum slope due to bubble-induced turbulence. In addition, single- and two-phase analysis results are presented for turbulent flows within the pressurized water reactor (PWR) core geometries. The related simulations are possible to carry out only with world-leading HPC platforms. These simulations are allowing more complex turbulence model development and validation for use in 3D multiphase computational fluid dynamics (M-CFD) codes.

  15. STEMsalabim: A high-performance computing cluster friendly code for scanning transmission electron microscopy image simulations of thin specimens

    International Nuclear Information System (INIS)

    Oelerich, Jan Oliver; Duschek, Lennart; Belz, Jürgen; Beyer, Andreas; Baranovskii, Sergei D.; Volz, Kerstin

    2017-01-01

    Highlights: • We present STEMsalabim, a modern implementation of the multislice algorithm for simulation of STEM images. • Our package is highly parallelizable on high-performance computing clusters, combining shared and distributed memory architectures. • With STEMsalabim, computationally and memory expensive STEM image simulations can be carried out within reasonable time. - Abstract: We present a new multislice code for the computer simulation of scanning transmission electron microscope (STEM) images based on the frozen lattice approximation. Unlike existing software packages, the code is optimized to perform well on highly parallelized computing clusters, combining distributed and shared memory architectures. This enables efficient calculation of large lateral scanning areas of the specimen within the frozen lattice approximation and fine-grained sweeps of parameter space.

  16. STEMsalabim: A high-performance computing cluster friendly code for scanning transmission electron microscopy image simulations of thin specimens

    Energy Technology Data Exchange (ETDEWEB)

    Oelerich, Jan Oliver, E-mail: jan.oliver.oelerich@physik.uni-marburg.de; Duschek, Lennart; Belz, Jürgen; Beyer, Andreas; Baranovskii, Sergei D.; Volz, Kerstin

    2017-06-15

    Highlights: • We present STEMsalabim, a modern implementation of the multislice algorithm for simulation of STEM images. • Our package is highly parallelizable on high-performance computing clusters, combining shared and distributed memory architectures. • With STEMsalabim, computationally and memory expensive STEM image simulations can be carried out within reasonable time. - Abstract: We present a new multislice code for the computer simulation of scanning transmission electron microscope (STEM) images based on the frozen lattice approximation. Unlike existing software packages, the code is optimized to perform well on highly parallelized computing clusters, combining distributed and shared memory architectures. This enables efficient calculation of large lateral scanning areas of the specimen within the frozen lattice approximation and fine-grained sweeps of parameter space.

  17. Comparison of turbulence measurements from DIII-D low-mode and high-performance plasmas to turbulence simulations and models

    International Nuclear Information System (INIS)

    Rhodes, T.L.; Leboeuf, J.-N.; Sydora, R.D.; Groebner, R.J.; Doyle, E.J.; McKee, G.R.; Peebles, W.A.; Rettig, C.L.; Zeng, L.; Wang, G.

    2002-01-01

    Measured turbulence characteristics (correlation lengths, spectra, etc.) in low-confinement (L-mode) and high-performance plasmas in the DIII-D tokamak [Luxon et al., Proceedings Plasma Physics and Controlled Nuclear Fusion Research 1986 (International Atomic Energy Agency, Vienna, 1987), Vol. I, p. 159] show many similarities with the characteristics determined from turbulence simulations. Radial correlation lengths Δr of density fluctuations from L-mode discharges are found to be numerically similar to the ion poloidal gyroradius ρ_θ,s, or 5-10 times the ion gyroradius ρ_s, over the radial region 0.2 < r/a < 0.8. To determine whether Δr scales with ρ_θ,s or with 5-10 times ρ_s, an experiment was performed which modified ρ_θ,s while keeping other plasma parameters approximately fixed. It was found that the experimental Δr did not scale as ρ_θ,s, which was similar to low-resolution UCAN simulations. Finally, both experimental measurements and gyrokinetic simulations indicate a significant reduction in the radial correlation length in high-performance quiescent double-barrier discharges, as compared to normal L-mode, consistent with reduced transport in these high-performance plasmas.

  18. GPU-based high performance Monte Carlo simulation in neutron transport

    Energy Technology Data Exchange (ETDEWEB)

    Heimlich, Adino; Mol, Antonio C.A.; Pereira, Claudio M.N.A. [Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil). Lab. de Inteligencia Artificial Aplicada], e-mail: cmnap@ien.gov.br

    2009-07-01

    Graphics Processing Units (GPU) are high performance co-processors intended, originally, to improve the use and quality of computer graphics applications. Since researchers and practitioners realized the potential of using GPU for general purpose, their application has been extended to other fields out of computer graphics scope. The main objective of this work is to evaluate the impact of using GPU in neutron transport simulation by Monte Carlo method. To accomplish that, GPU- and CPU-based (single and multicore) approaches were developed and applied to a simple, but time-consuming problem. Comparisons demonstrated that the GPU-based approach is about 15 times faster than a parallel 8-core CPU-based approach also developed in this work. (author)
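The embarrassingly parallel structure that the GPU exploits can be seen in a minimal analog Monte Carlo sketch: every neutron history is independent, so histories map one-to-one onto GPU threads. The one-dimensional slab and cross-sections below are illustrative, not the paper's benchmark problem:

```python
import math
import random

def transmit_fraction(n, sigma_t=1.0, sigma_s=0.6, width=2.0, seed=42):
    """Analog Monte Carlo for a 1D slab: total cross-section sigma_t,
    scattering cross-section sigma_s; estimate the fraction of neutrons
    that leak through a slab of the given width."""
    rng = random.Random(seed)
    leaked = 0
    for _ in range(n):                    # each history is independent
        x, mu = 0.0, 1.0                  # start at left face, moving right
        while True:
            x += mu * (-math.log(rng.random()) / sigma_t)  # sample free flight
            if x >= width:
                leaked += 1               # escaped through the far face
                break
            if x < 0.0:
                break                     # backscattered out of the slab
            if rng.random() < sigma_s / sigma_t:
                mu = 2.0 * rng.random() - 1.0   # isotropic scatter
            else:
                break                     # absorbed
    return leaked / n

print(transmit_fraction(50_000))
```

As a sanity check, with scattering switched off (`sigma_s=0`) the transmitted fraction should approach the analytic attenuation `exp(-sigma_t * width)`, which is how a GPU port of such a kernel can be validated against the CPU version.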

  19. GPU-based high performance Monte Carlo simulation in neutron transport

    International Nuclear Information System (INIS)

    Heimlich, Adino; Mol, Antonio C.A.; Pereira, Claudio M.N.A.

    2009-01-01

    Graphics Processing Units (GPU) are high performance co-processors intended, originally, to improve the use and quality of computer graphics applications. Since researchers and practitioners realized the potential of using GPU for general purpose, their application has been extended to other fields out of computer graphics scope. The main objective of this work is to evaluate the impact of using GPU in neutron transport simulation by Monte Carlo method. To accomplish that, GPU- and CPU-based (single and multicore) approaches were developed and applied to a simple, but time-consuming problem. Comparisons demonstrated that the GPU-based approach is about 15 times faster than a parallel 8-core CPU-based approach also developed in this work. (author)

  20. High-resolution 3D simulations of NIF ignition targets performed on Sequoia with HYDRA

    Science.gov (United States)

    Marinak, M. M.; Clark, D. S.; Jones, O. S.; Kerbel, G. D.; Sepke, S.; Patel, M. V.; Koning, J. M.; Schroeder, C. R.

    2015-11-01

    Developments in the multiphysics ICF code HYDRA enable it to perform large-scale simulations on the Sequoia machine at LLNL. With an aggregate computing power of 20 Petaflops, Sequoia offers an unprecedented capability to resolve the physical processes in NIF ignition targets for a more complete, consistent treatment of the sources of asymmetry. We describe modifications to HYDRA that enable it to scale to over one million processes on Sequoia. These include new options for replicating parts of the mesh over a subset of the processes, to avoid strong scaling limits. We consider results from a 3D full ignition capsule-only simulation performed using over one billion zones run on 262,000 processors which resolves surface perturbations through modes l = 200. We also report progress towards a high-resolution 3D integrated hohlraum simulation performed using 262,000 processors which resolves surface perturbations on the ignition capsule through modes l = 70. These aim for the most complete calculations yet of the interactions and overall impact of the various sources of asymmetry for NIF ignition targets. This work was performed under the auspices of the Lawrence Livermore National Security, LLC, (LLNS) under Contract No. DE-AC52-07NA27344.

  1. The computer program LIAR for the simulation and modeling of high performance linacs

    International Nuclear Information System (INIS)

    Assmann, R.; Adolphsen, C.; Bane, K.; Emma, P.; Raubenheimer, T.O.; Siemann, R.; Thompson, K.; Zimmermann, F.

    1997-07-01

    High performance linear accelerators are the central components of the proposed next generation of linear colliders. They must provide acceleration of up to 750 GeV per beam while maintaining small normalized emittances. Standard simulation programs, mainly developed for storage rings, did not meet the specific requirements for high performance linacs with high bunch charges and strong wakefields. The authors present the program LIAR (LInear Accelerator Research code), which includes single- and multi-bunch wakefield effects, a 6D coupled beam description, specific optimization algorithms and other advanced features. LIAR has been applied to and checked against the existing Stanford Linear Collider (SLC), the linacs of the proposed Next Linear Collider (NLC) and the proposed Linac Coherent Light Source (LCLS) at SLAC. Its modular structure allows easy extension for different purposes. The program is available for UNIX workstations and Windows PCs.

  2. Development of three-dimensional neoclassical transport simulation code with high performance Fortran on a vector-parallel computer

    International Nuclear Information System (INIS)

    Satake, Shinsuke; Okamoto, Masao; Nakajima, Noriyoshi; Takamaru, Hisanori

    2005-11-01

    A neoclassical transport simulation code (FORTEC-3D) applicable to three-dimensional configurations has been developed using High Performance Fortran (HPF). Adoption of computing techniques for parallelization and a hybrid simulation model to the δf Monte-Carlo method transport simulation, including non-local transport effects in three-dimensional configurations, makes it possible to simulate the dynamism of global, non-local transport phenomena with a self-consistent radial electric field within a reasonable computation time. In this paper, development of the transport code using HPF is reported. Optimization techniques in order to achieve both high vectorization and parallelization efficiency, adoption of a parallel random number generator, and also benchmark results, are shown. (author)

  3. Improving the Performance of the Extreme-scale Simulator

    Energy Technology Data Exchange (ETDEWEB)

    Engelmann, Christian [ORNL; Naughton III, Thomas J [ORNL

    2014-01-01

    Investigating the performance of parallel applications at scale on future high-performance computing (HPC) architectures and the performance impact of different architecture choices is an important component of HPC hardware/software co-design. The Extreme-scale Simulator (xSim) is a simulation-based toolkit for investigating the performance of parallel applications at scale. xSim scales to millions of simulated Message Passing Interface (MPI) processes. The overhead introduced by a simulation tool is an important performance and productivity aspect. This paper documents two improvements to xSim: (1) a new deadlock resolution protocol to reduce the parallel discrete event simulation management overhead and (2) a new simulated MPI message matching algorithm to reduce the oversubscription management overhead. The results clearly show a significant performance improvement, such as reducing the simulation overhead for running the NAS Parallel Benchmark suite inside the simulator from 1,020% to 238% for the conjugate gradient (CG) benchmark and from 102% to 0% for the embarrassingly parallel (EP) benchmark, as well as from 37,511% to 13,808% for CG and from 3,332% to 204% for EP with accurate process failure simulation.
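MPI message matching of the kind item (2) simulates conventionally uses two queues: an arriving send is checked against posted receives, and an unmatched send waits in an unexpected-message queue that later receives must search first. A sketch under assumed simplifications (tag/source matching only, no communicators; the class and names are illustrative, not xSim's implementation):

```python
from collections import deque

ANY_SOURCE = -1   # wildcard, in the spirit of MPI_ANY_SOURCE

class MatchEngine:
    """Two-queue MPI-style message matching: posted receives on one side,
    unexpected (early-arriving) messages on the other."""
    def __init__(self):
        self.posted, self.unexpected = deque(), deque()

    def arrive(self, source, tag, payload):
        # An incoming message first tries to match a posted receive, in order.
        for i, (src, t, cb) in enumerate(self.posted):
            if src in (source, ANY_SOURCE) and t == tag:
                del self.posted[i]
                return cb(payload)
        self.unexpected.append((source, tag, payload))

    def post_recv(self, source, tag, cb):
        # A new receive must first search the unexpected queue (ordering rule).
        for i, (src, t, payload) in enumerate(self.unexpected):
            if source in (src, ANY_SOURCE) and tag == t:
                del self.unexpected[i]
                return cb(payload)
        self.posted.append((source, tag, cb))

inbox = []
eng = MatchEngine()
eng.arrive(source=3, tag=7, payload="early")           # no receive posted yet
eng.post_recv(source=ANY_SOURCE, tag=7, cb=inbox.append)
print(inbox)  # -> ['early']
```

Both queue searches are linear, which is why oversubscribed simulations with many pending messages pay a management overhead; reducing that cost is the stated goal of the improvement.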

  4. 24th & 25th Joint Workshop on Sustained Simulation Performance

    CERN Document Server

    Bez, Wolfgang; Focht, Erich; Gienger, Michael; Kobayashi, Hiroaki

    2017-01-01

    This book presents the state of the art in High Performance Computing on modern supercomputer architectures. It addresses trends in hardware and software development in general, as well as the future of High Performance Computing systems and heterogeneous architectures. The contributions cover a broad range of topics, from improved system management to Computational Fluid Dynamics, High Performance Data Analytics, and novel mathematical approaches for large-scale systems. In addition, they explore innovative fields like coupled multi-physics and multi-scale simulations. All contributions are based on selected papers presented at the 24th Workshop on Sustained Simulation Performance, held at the University of Stuttgart’s High Performance Computing Center in Stuttgart, Germany in December 2016 and the subsequent Workshop on Sustained Simulation Performance, held at the Cyberscience Center, Tohoku University, Japan in March 2017.

  5. Optimized Parallel Discrete Event Simulation (PDES) for High Performance Computing (HPC) Clusters

    National Research Council Canada - National Science Library

    Abu-Ghazaleh, Nael

    2005-01-01

    The aim of this project was to study the communication subsystem performance of state of the art optimistic simulator Synchronous Parallel Environment for Emulation and Discrete-Event Simulation (SPEEDES...

  6. High performance computer code for molecular dynamics simulations

    International Nuclear Information System (INIS)

    Levay, I.; Toekesi, K.

    2007-01-01

    Complete text of publication follows. Molecular Dynamics (MD) simulation is a widely used technique for modeling complicated physical phenomena. Since 2005 we have been developing an MD simulation code for PC computers. The computer code is written in the C++ object-oriented programming language. The aim of our work is twofold: a) to develop a fast computer code for the study of the random walk of guest atoms in a Be crystal, b) 3-dimensional (3D) visualization of the particles' motion. In this case we mimic the motion of the guest atoms in the crystal (diffusion-type motion) and the motion of atoms in the crystal lattice (crystal deformation). Nowadays, it is common to use graphics devices for intensive computational problems. There are several ways to use this extreme processing performance, but never before was it so easy to program these devices as now. The CUDA (Compute Unified Device Architecture) introduced by nVidia Corporation in 2007 is very useful for every processor-hungry application. A unified-architecture GPU includes 96-128 or more stream processors, so the raw calculation performance is 576(!) GFLOPS. It is ten times faster than the fastest dual-core CPU [Fig. 1]. Our improved MD simulation software uses this new technology, which speeds it up; the code runs 10 times faster in the critical calculation code segment. Although the GPU is a very powerful tool, it has a strongly parallel structure. This means that we have to create an algorithm which works on several processors without deadlock. Our code currently uses 256 threads and shared and constant on-chip memory instead of global memory, which is 100 times slower. It is possible to implement the total algorithm on the GPU, therefore we do not need to download and upload the data in every iteration. For maximal throughput, every thread runs with the same instructions.
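As an illustration of the kind of per-particle update an MD hot loop performs, here is a velocity-Verlet step for a harmonic test potential (a generic sketch, not the paper's Be-crystal model). Each particle's update is independent, which is what lets the critical segment run one GPU thread per particle:

```python
def velocity_verlet(x, v, force, dt, steps):
    """Velocity-Verlet integration, the standard symplectic MD stepper:
    half-kick, drift, recompute forces, half-kick."""
    f = [force(xi) for xi in x]
    for _ in range(steps):
        v = [vi + 0.5 * dt * fi for vi, fi in zip(v, f)]   # half kick
        x = [xi + dt * vi for xi, vi in zip(x, v)]         # drift
        f = [force(xi) for xi in x]                        # new forces
        v = [vi + 0.5 * dt * fi for vi, fi in zip(v, f)]   # half kick
    return x, v

# Harmonic well (unit mass, unit spring constant): total energy should
# stay close to its initial value of 0.625 under a symplectic integrator.
x, v = velocity_verlet([1.0, -0.5], [0.0, 0.0], force=lambda xi: -xi,
                       dt=0.01, steps=1000)
energy = sum(0.5 * vi * vi + 0.5 * xi * xi for xi, vi in zip(x, v))
print(round(energy, 3))  # -> 0.625
```

The bounded energy error is the usual quick correctness check when porting such a loop to new hardware, e.g. comparing a CUDA kernel against this reference.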

  7. Simulation-Driven Development and Optimization of a High-Performance Six-Dimensional Wrist Force/Torque Sensor

    Directory of Open Access Journals (Sweden)

    Qiaokang LIANG

    2010-05-01

    Full Text Available This paper describes the Simulation-Driven Development and Optimization (SDDO) of a six-dimensional force/torque sensor with high performance. Through the SDDO, the developed sensor simultaneously achieves high sensitivity, linearity, stiffness and repeatability, which is difficult for a traditional force/torque sensor. The integrated approach provided by the software ANSYS was used to streamline and speed up the process chain and thereby deliver results significantly faster than traditional approaches. The calibration experiment shows impressive characteristics; the developed force/torque sensor can therefore be usefully applied in industry, and the design methods can also be used to develop other industrial products.

  8. High Performance Electrical Modeling and Simulation Software Normal Environment Verification and Validation Plan, Version 1.0; TOPICAL

    International Nuclear Information System (INIS)

    WIX, STEVEN D.; BOGDAN, CAROLYN W.; MARCHIONDO JR., JULIO P.; DEVENEY, MICHAEL F.; NUNEZ, ALBERT V.

    2002-01-01

    The requirements in modeling and simulation are driven by two fundamental changes in the nuclear weapons landscape: (1) The Comprehensive Test Ban Treaty and (2) The Stockpile Life Extension Program which extends weapon lifetimes well beyond their originally anticipated field lifetimes. The move from confidence based on nuclear testing to confidence based on predictive simulation forces a profound change in the performance asked of codes. The scope of this document is to improve the confidence in the computational results by demonstration and documentation of the predictive capability of electrical circuit codes and the underlying conceptual, mathematical and numerical models as applied to a specific stockpile driver. This document describes the High Performance Electrical Modeling and Simulation software normal environment Verification and Validation Plan

  9. Effects of reflex-based self-defence training on police performance in simulated high-pressure arrest situations

    NARCIS (Netherlands)

    Renden, Peter G.; Savelsbergh, Geert J. P.; Oudejans, Raoul R. D.

    2017-01-01

    We investigated the effects of reflex-based self-defence training on police performance in simulated high-pressure arrest situations. Police officers received this training as well as a regular police arrest and self-defence skills training (control training) in a crossover design. Officers’

  10. High Fidelity BWR Fuel Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, Su Jong [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-08-01

This report describes the Consortium for Advanced Simulation of Light Water Reactors (CASL) work conducted for completion of the Thermal Hydraulics Methods (THM) Level 3 milestone THM.CFD.P13.03: High Fidelity BWR Fuel Simulation. High fidelity computational fluid dynamics (CFD) simulation of a Boiling Water Reactor (BWR) was conducted to investigate the applicability and robustness of BWR closures. As a preliminary study, a CFD model with simplified ferrule spacer grid geometry from the NUPEC BWR Full-size Fine-mesh Bundle Test (BFBT) benchmark was implemented, and the performance of the multiphase segregated solver with baseline boiling closures was evaluated. Although the mean values of void fraction and exit quality in the CFD results for BFBT case 4101-61 agreed with experimental data, the local void distribution was not predicted accurately. Mesh quality was one of the critical factors in obtaining a converged result, and the stability and robustness of the simulation were mainly affected by the mesh quality and the combination of BWR closure models. In addition, CFD modeling of the fully detailed spacer grid geometry, including the mixing vanes, is necessary to improve the accuracy of the CFD simulation.

  11. High-performance computing using FPGAs

    CERN Document Server

    Benkrid, Khaled

    2013-01-01

This book is concerned with the emerging field of High Performance Reconfigurable Computing (HPRC), which aims to harness the high performance and relatively low power of reconfigurable hardware, in the form of Field Programmable Gate Arrays (FPGAs), in High Performance Computing (HPC) applications. It presents the latest developments in this field from the points of view of applications, architecture, and tools and methodologies. We hope that this work will form a reference for existing researchers in the field, and entice new researchers and developers to join the HPRC community.  The book includes:  Thirteen application chapters which present the most important application areas tackled by high performance reconfigurable computers, namely: financial computing, bioinformatics and computational biology, data search and processing, stencil computation (e.g. computational fluid dynamics and seismic modeling), cryptanalysis, astronomical N-body simulation, and circuit simulation.     Seven architecture chapters which...

  12. Simulation of the High Performance Time to Digital Converter for the ATLAS Muon Spectrometer trigger upgrade

    International Nuclear Information System (INIS)

    Meng, X.T.; Levin, D.S.; Chapman, J.W.; Zhou, B.

    2016-01-01

The ATLAS Muon Spectrometer endcap thin-Resistive Plate Chamber trigger project complements the New Small Wheel endcap Phase-1 upgrade for higher luminosity LHC operation. These new trigger chambers, located in a high rate region of ATLAS, will improve overall trigger acceptance and reduce the fake muon trigger incidence. These chambers must generate a low level muon trigger to be delivered to a remote high level processor within a stringent latency requirement of 43 bunch crossings (1075 ns). To help meet this requirement the High Performance Time to Digital Converter (HPTDC), a multi-channel ASIC designed by the CERN Microelectronics group, has been proposed for the digitization of the fast front end detector signals. This paper investigates the HPTDC performance in the context of the overall muon trigger latency, employing detailed behavioral Verilog simulations in which the latency in triggerless mode is measured for a range of configurations and under realistic hit rate conditions. The simulation results show that various HPTDC operational configurations, including leading edge and pair measurement modes, can provide high efficiency (>98%) to capture and digitize hits within a time interval satisfying the Phase-1 latency tolerance.

  13. Simulator experiments: effects of NPP operator experience on performance

    International Nuclear Information System (INIS)

    Beare, A.N.; Gray, L.H.

    1984-01-01

    During the FY83 research, a simulator experiment was conducted at the control room simulator for a GE Boiling Water Reactor (BWR) NPP. The research subjects were licensed operators undergoing requalification training and shift technical advisors (STAs). This experiment was designed to investigate the effects of senior reactor operator (SRO) experience, operating crew augmentation with an STA and practice, as a crew, upon crew and individual operator performance, in response to anticipated plant transients. Sixteen two-man crews of licensed operators were employed in a 2 x 2 factorial design. The SROs leading the crews were split into high and low experience groups on the basis of their years of experience as an SRO. One half of the high- and low-SRO experience groups were assisted by an STA. The crews responded to four simulated plant casualties. A five-variable set of content-referenced performance measures was derived from task analyses of the procedurally correct responses to the four casualties. System parameters and control manipulations were recorded by the computer controlling the simulator. Data on communications and procedure use were obtained from analysis of videotapes of the exercises. Questionnaires were used to collect subject biographical information and data on subjective workload during each simulated casualty. For four of the five performance measures, no significant differences were found between groups led by high (25 to 114 months) and low (1 to 17 months as an SRO) experience SROs. However, crews led by low experience SROs tended to have significantly shorter task performance times than crews led by high experience SROs. The presence of the STA had no significant effect on overall team performance in responding to the four simulated casualties. The FY84 experiments are a partial replication and extension of the FY83 experiment, but with PWR operators and simulator

  14. High performance thermal stress analysis on the earth simulator

    International Nuclear Information System (INIS)

    Noriyuki, Kushida; Hiroshi, Okuda; Genki, Yagawa

    2003-01-01

In this study, a thermal stress finite element analysis code optimized for the Earth Simulator was developed. Each processor node of the Earth Simulator is an 8-way vector processor, and the processors communicate using the Message Passing Interface. Thus, there are two ways to parallelize the finite element method on the Earth Simulator: the first is to assign one processor per sub-domain; the second is to assign one node (= 8 processors) per sub-domain, using shared-memory parallelization within the node. Considering that the preconditioned conjugate gradient (PCG) method, one of the linear equation solvers best suited to large-scale parallel finite element methods, converges better when the number of domains is smaller, we employed PCG with hybrid parallelization, based on both shared- and distributed-memory parallelization. It is generally considered hard to obtain good parallel or vector performance with the finite element method, since it is based on unstructured grids. In such a situation, reordering is indispensable for improving computational performance [2]. In this study, we used three reordering methods: Reverse Cuthill-McKee (RCM), cyclic multicolor (CM) and diagonal jagged descending storage (DJDS) [3]. RCM provides good convergence of the incomplete lower-upper (ILU) PCG but causes load imbalance. On the other hand, CM provides good load balance but worsens the convergence of ILU PCG if the vector length is too long. Therefore, we used a method combining RCM and CM. DJDS stores sparse matrices such that longer vector lengths can be obtained. For efficient inter-node parallelization, partitioning methods such as recursive coordinate bisection (RCB) or MeTIS have been used. The computational performance on practical large-scale engineering problems will be shown at the meeting. (author)
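The bandwidth-reducing effect of Cuthill-McKee-style reordering, which drives the ILU PCG convergence behavior discussed above, can be illustrated on a small graph. The sketch below (a toy adjacency structure, not the paper's meshes) implements the classic BFS-based ordering with minimum-degree tie-breaking:

```python
from collections import deque

def reverse_cuthill_mckee(adj):
    """Reverse Cuthill-McKee ordering of an undirected graph given as
    {node: set(neighbors)}. Returns a permutation (list of nodes)."""
    visited, order = set(), []
    # visit each component, starting from a minimum-degree node
    for start in sorted(adj, key=lambda n: (len(adj[n]), n)):
        if start in visited:
            continue
        visited.add(start)
        queue = deque([start])
        while queue:
            u = queue.popleft()
            order.append(u)
            for v in sorted(adj[u] - visited, key=lambda n: (len(adj[n]), n)):
                visited.add(v)
                queue.append(v)
    return order[::-1]  # reversing the BFS order gives RCM

def bandwidth(adj, order):
    """Matrix bandwidth induced by a node ordering."""
    pos = {n: i for i, n in enumerate(order)}
    return max(abs(pos[u] - pos[v]) for u in adj for v in adj[u])

# toy mesh-like graph
adj = {0: {4, 5}, 1: {2, 6}, 2: {1, 3}, 3: {2, 6}, 4: {0}, 5: {0, 6}, 6: {1, 3, 5}}
natural = list(adj)
rcm = reverse_cuthill_mckee(adj)
print(bandwidth(adj, natural), "->", bandwidth(adj, rcm))  # 5 -> 2
```

A smaller bandwidth clusters the nonzeros near the diagonal, which is what makes ILU factors cheaper and vector loads more regular.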

  15. High Performance Computing in Science and Engineering '02 : Transactions of the High Performance Computing Center

    CERN Document Server

    Jäger, Willi

    2003-01-01

This book presents the state-of-the-art in modeling and simulation on supercomputers. Leading German research groups present their results achieved on high-end systems of the High Performance Computing Center Stuttgart (HLRS) for the year 2002. Reports cover all fields of supercomputing simulation, ranging from computational fluid dynamics to computer science. Special emphasis is given to industrially relevant applications. Moreover, by presenting results for both vector systems and microprocessor-based systems, the book allows the reader to compare the performance levels and usability of a variety of supercomputer architectures. It therefore becomes an indispensable guidebook for assessing the impact of the Japanese Earth Simulator project on supercomputing in the years to come.

  16. A lattice-particle approach for the simulation of fracture processes in fiber-reinforced high-performance concrete

    NARCIS (Netherlands)

    Montero-Chacón, F.; Schlangen, H.E.J.G.; Medina, F.

    2013-01-01

    The use of fiber-reinforced high-performance concrete (FRHPC) is becoming more extended; therefore it is necessary to develop tools to simulate and better understand its behavior. In this work, a discrete model for the analysis of fracture mechanics in FRHPC is presented. The plain concrete matrix,

  17. 20th and 21st Joint Workshop on Sustained Simulation Performance

    CERN Document Server

    Bez, Wolfgang; Focht, Erich; Kobayashi, Hiroaki; Qi, Jiaxing; Roller, Sabine

    2015-01-01

The book presents the state of the art in high-performance computing and simulation on modern supercomputer architectures. It covers trends in hardware and software development in general, and the future of high-performance systems and heterogeneous architectures specifically. The application contributions cover computational fluid dynamics, material science, medical applications and climate research. Innovative fields like coupled multi-physics or multi-scale simulations are also discussed. All papers were chosen from presentations given at the 20th Workshop on Sustained Simulation Performance in December 2014 at the HLRS, University of Stuttgart, Germany, and the subsequent Workshop on Sustained Simulation Performance at Tohoku University in February 2015.

  18. Critical thinking skills in nursing students: comparison of simulation-based performance with metrics

    Science.gov (United States)

    Fero, Laura J.; O’Donnell, John M.; Zullo, Thomas G.; Dabbs, Annette DeVito; Kitutu, Julius; Samosky, Joseph T.; Hoffman, Leslie A.

    2018-01-01

Aim This paper is a report of an examination of the relationship between metrics of critical thinking skills and performance in simulated clinical scenarios. Background Paper and pencil assessments are commonly used to assess critical thinking but may not reflect simulated performance. Methods In 2007, a convenience sample of 36 nursing students participated in measurement of critical thinking skills and simulation-based performance using videotaped vignettes, high-fidelity human simulation, the California Critical Thinking Disposition Inventory and California Critical Thinking Skills Test. Simulation-based performance was rated as ‘meeting’ or ‘not meeting’ overall expectations. Test scores were categorized as strong, average, or weak. Results Most (75·0%) students did not meet overall performance expectations using videotaped vignettes or high-fidelity human simulation; most difficulty related to problem recognition and reporting findings to the physician. There was no difference between overall performance based on method of assessment (P = 0·277). More students met subcategory expectations for initiating nursing interventions (P ≤ 0·001) using high-fidelity human simulation. The relationship between videotaped vignette performance and critical thinking disposition or skills scores was not statistically significant, except for problem recognition and overall critical thinking skills scores (Cramer’s V = 0·444, P = 0·029). There was a statistically significant relationship between overall high-fidelity human simulation performance and overall critical thinking disposition scores (Cramer’s V = 0·413, P = 0·047). Conclusion Students’ performance reflected difficulty meeting expectations in simulated clinical scenarios. High-fidelity human simulation performance appeared to approximate scores on metrics of critical thinking best. Further research is needed to determine if simulation-based performance correlates with critical thinking skills

  19. Critical thinking skills in nursing students: comparison of simulation-based performance with metrics.

    Science.gov (United States)

    Fero, Laura J; O'Donnell, John M; Zullo, Thomas G; Dabbs, Annette DeVito; Kitutu, Julius; Samosky, Joseph T; Hoffman, Leslie A

    2010-10-01

    This paper is a report of an examination of the relationship between metrics of critical thinking skills and performance in simulated clinical scenarios. Paper and pencil assessments are commonly used to assess critical thinking but may not reflect simulated performance. In 2007, a convenience sample of 36 nursing students participated in measurement of critical thinking skills and simulation-based performance using videotaped vignettes, high-fidelity human simulation, the California Critical Thinking Disposition Inventory and California Critical Thinking Skills Test. Simulation-based performance was rated as 'meeting' or 'not meeting' overall expectations. Test scores were categorized as strong, average, or weak. Most (75.0%) students did not meet overall performance expectations using videotaped vignettes or high-fidelity human simulation; most difficulty related to problem recognition and reporting findings to the physician. There was no difference between overall performance based on method of assessment (P = 0.277). More students met subcategory expectations for initiating nursing interventions (P ≤ 0.001) using high-fidelity human simulation. The relationship between videotaped vignette performance and critical thinking disposition or skills scores was not statistically significant, except for problem recognition and overall critical thinking skills scores (Cramer's V = 0.444, P = 0.029). There was a statistically significant relationship between overall high-fidelity human simulation performance and overall critical thinking disposition scores (Cramer's V = 0.413, P = 0.047). Students' performance reflected difficulty meeting expectations in simulated clinical scenarios. High-fidelity human simulation performance appeared to approximate scores on metrics of critical thinking best. Further research is needed to determine if simulation-based performance correlates with critical thinking skills in the clinical setting. © 2010 The Authors. Journal of Advanced
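Cramer's V, the effect-size measure reported throughout this study, is defined as V = sqrt(chi2 / (n (k - 1))), where chi2 is the chi-square statistic of the contingency table, n the sample size, and k the smaller table dimension. A minimal sketch with made-up counts (not the study's data):

```python
from math import sqrt

def cramers_v(table):
    """Cramer's V for a 2-D contingency table (list of rows of counts)."""
    row_sums = [sum(row) for row in table]
    col_sums = [sum(col) for col in zip(*table)]
    n = sum(row_sums)
    chi2 = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_sums[i] * col_sums[j] / n  # independence model
            chi2 += (observed - expected) ** 2 / expected
    k = min(len(table), len(table[0]))
    return sqrt(chi2 / (n * (k - 1)))

# hypothetical 2x2 table: performance (met / not met) vs. test score (strong / weak)
table = [[20, 10],
         [5, 15]]
print(f"{cramers_v(table):.3f}")  # 0.408
```

For a 2x2 table k - 1 = 1, so V reduces to the phi coefficient; values near 0·4, as in the study, indicate a moderate association.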

  20. Virtual reality simulation training of mastoidectomy - studies on novice performance.

    Science.gov (United States)

    Andersen, Steven Arild Wuyts

    2016-08-01

    Virtual reality (VR) simulation-based training is increasingly used in surgical technical skills training including in temporal bone surgery. The potential of VR simulation in enabling high-quality surgical training is great and VR simulation allows high-stakes and complex procedures such as mastoidectomy to be trained repeatedly, independent of patients and surgical tutors, outside traditional learning environments such as the OR or the temporal bone lab, and with fewer of the constraints of traditional training. This thesis aims to increase the evidence-base of VR simulation training of mastoidectomy and, by studying the final-product performances of novices, investigates the transfer of skills to the current gold-standard training modality of cadaveric dissection, the effect of different practice conditions and simulator-integrated tutoring on performance and retention of skills, and the role of directed, self-regulated learning. Technical skills in mastoidectomy were transferable from the VR simulation environment to cadaveric dissection with significant improvement in performance after directed, self-regulated training in the VR temporal bone simulator. Distributed practice led to a better learning outcome and more consolidated skills than massed practice and also resulted in a more consistent performance after three months of non-practice. Simulator-integrated tutoring accelerated the initial learning curve but also caused over-reliance on tutoring, which resulted in a drop in performance when the simulator-integrated tutor-function was discontinued. The learning curves were highly individual but often plateaued early and at an inadequate level, which related to issues concerning both the procedure and the VR simulator, over-reliance on the tutor function and poor self-assessment skills. Future simulator-integrated automated assessment could potentially resolve some of these issues and provide trainees with both feedback during the procedure and immediate

  1. Mixed-Language High-Performance Computing for Plasma Simulations

    Directory of Open Access Journals (Sweden)

    Quanming Lu

    2003-01-01

Full Text Available Java is receiving increasing attention as the most popular platform for distributed computing. However, programmers are still reluctant to embrace Java as a tool for writing scientific and engineering applications due to its still noticeable performance drawbacks compared with other programming languages such as Fortran or C. In this paper, we present a hybrid Java/Fortran implementation of a parallel particle-in-cell (PIC) algorithm for plasma simulations. In our approach, the time-consuming components of this application are designed and implemented as Fortran subroutines, while less calculation-intensive components usually involved in building the user interface are written in Java. The two types of software modules have been glued together using the Java native interface (JNI). Our mixed-language PIC code was tested and its performance compared with pure Java and Fortran versions of the same algorithm on a Sun E6500 SMP system and a Linux cluster of Pentium III machines.
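The computational core that such PIC codes delegate to Fortran is dominated by charge deposition (scatter) and field interpolation (gather) with linear weighting. A language-neutral sketch of cloud-in-cell deposition on a periodic 1-D grid (illustrative only, not the authors' hybrid Java/Fortran code):

```python
def deposit_charge(positions, q, ngrid, dx):
    """Cloud-in-cell (linear-weighting) charge deposition onto a periodic 1-D grid."""
    rho = [0.0] * ngrid
    for x in positions:
        cell = int(x / dx) % ngrid
        frac = x / dx - int(x / dx)          # fractional position within the cell
        rho[cell] += q * (1.0 - frac) / dx   # share to the left grid point
        rho[(cell + 1) % ngrid] += q * frac / dx  # share to the right grid point
    return rho

# 4 particles of unit charge on an 8-cell periodic grid
ngrid, dx, q = 8, 1.0, 1.0
positions = [0.25, 2.5, 2.75, 6.1]
rho = deposit_charge(positions, q, ngrid, dx)
total = sum(r * dx for r in rho)
print(f"{total:.6f}")  # 4.000000 -- total charge is conserved
```

The per-particle loop with no data dependencies is exactly the kind of kernel that benefits from a compiled-language implementation behind a JNI boundary.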

  2. 18th and 19th Workshop on Sustained Simulation Performance

    CERN Document Server

    Bez, Wolfgang; Focht, Erich; Kobayashi, Hiroaki; Patel, Nisarg

    2015-01-01

    This book presents the state of the art in high-performance computing and simulation on modern supercomputer architectures. It covers trends in hardware and software development in general and the future of high-performance systems and heterogeneous architectures in particular. The application-related contributions cover computational fluid dynamics, material science, medical applications and climate research; innovative fields such as coupled multi-physics and multi-scale simulations are highlighted. All papers were chosen from presentations given at the 18th Workshop on Sustained Simulation Performance held at the HLRS, University of Stuttgart, Germany in October 2013 and subsequent Workshop of the same name held at Tohoku University in March 2014.  

  3. Application of Nuclear Power Plant Simulator for High School Student Training

    Energy Technology Data Exchange (ETDEWEB)

    Kong, Chi Dong; Choi, Soo Young; Park, Min Young; Lee, Duck Jung [Ulsan National Institute of Science and Technology, Ulsan (Korea, Republic of)

    2014-10-15

In this context, two lectures on a nuclear power plant simulator and practical training were provided to high school students in 2014. The education contents were composed of two parts: the micro-physics simulator and the macro-physics simulator. The micro-physics simulator treats only in-core phenomena, whereas the macro-physics simulator describes the whole system of a nuclear power plant but treats the reactor core as a point. The high school students showed strong interest because they operated the simulations themselves. This abstract reports the training details and an evaluation of the effectiveness of the training. Lectures on the nuclear power plant simulator and practical exercises were performed at Ulsan Energy High School and Ulsan Meister High School. Two simulators were used: the macro- and micro-physics simulators. Using the macro-physics simulator, the following five simulations were performed: reactor power increase/decrease, reactor trip, single reactor coolant pump trip, large-break loss of coolant accident, and station black-out with D.C. power loss. Using the micro-physics simulator, the following three analyses were performed: transient analysis, fuel rod performance analysis, and thermal-hydraulics analysis. The students at both high schools showed interest in and strong support for the simulator-based training. After the training, the students responded enthusiastically that the education had helped them develop an interest in nuclear power plants.

  4. Application of Nuclear Power Plant Simulator for High School Student Training

    International Nuclear Information System (INIS)

    Kong, Chi Dong; Choi, Soo Young; Park, Min Young; Lee, Duck Jung

    2014-01-01

In this context, two lectures on a nuclear power plant simulator and practical training were provided to high school students in 2014. The education contents were composed of two parts: the micro-physics simulator and the macro-physics simulator. The micro-physics simulator treats only in-core phenomena, whereas the macro-physics simulator describes the whole system of a nuclear power plant but treats the reactor core as a point. The high school students showed strong interest because they operated the simulations themselves. This abstract reports the training details and an evaluation of the effectiveness of the training. Lectures on the nuclear power plant simulator and practical exercises were performed at Ulsan Energy High School and Ulsan Meister High School. Two simulators were used: the macro- and micro-physics simulators. Using the macro-physics simulator, the following five simulations were performed: reactor power increase/decrease, reactor trip, single reactor coolant pump trip, large-break loss of coolant accident, and station black-out with D.C. power loss. Using the micro-physics simulator, the following three analyses were performed: transient analysis, fuel rod performance analysis, and thermal-hydraulics analysis. The students at both high schools showed interest in and strong support for the simulator-based training. After the training, the students responded enthusiastically that the education had helped them develop an interest in nuclear power plants

  5. COMSOL-PHREEQC: a tool for high performance numerical simulation of reactive transport phenomena

    International Nuclear Information System (INIS)

    Nardi, Albert; Vries, Luis Manuel de; Trinchero, Paolo; Idiart, Andres; Molinero, Jorge

    2012-01-01

    Document available in extended abstract form only. Comsol Multiphysics (COMSOL, from now on) is a powerful Finite Element software environment for the modelling and simulation of a large number of physics-based systems. The user can apply variables, expressions or numbers directly to solid and fluid domains, boundaries, edges and points, independently of the computational mesh. COMSOL then internally compiles a set of equations representing the entire model. The availability of extremely powerful pre and post processors makes COMSOL a numerical platform well known and extensively used in many branches of sciences and engineering. On the other hand, PHREEQC is a freely available computer program for simulating chemical reactions and transport processes in aqueous systems. It is perhaps the most widely used geochemical code in the scientific community and is openly distributed. The program is based on equilibrium chemistry of aqueous solutions interacting with minerals, gases, solid solutions, exchangers, and sorption surfaces, but also includes the capability to model kinetic reactions with rate equations that are user-specified in a very flexible way by means of Basic statements directly written in the input file. Here we present COMSOL-PHREEQC, a software interface able to communicate and couple these two powerful simulators by means of a Java interface. The methodology is based on Sequential Non Iterative Approach (SNIA), where PHREEQC is compiled as a dynamic subroutine (iPhreeqc) that is called by the interface to solve the geochemical system at every element of the finite element mesh of COMSOL. The numerical tool has been extensively verified by comparison with computed results of 1D, 2D and 3D benchmark examples solved with other reactive transport simulators. COMSOL-PHREEQC is parallelized so that CPU time can be highly optimized in multi-core processors or clusters. Then, fully 3D detailed reactive transport problems can be readily simulated by means of
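Under SNIA, each time step first transports all components and then calls the chemistry solver independently at every node of the mesh, with no iteration between the two operators. A schematic sketch, with explicit upwind advection and first-order decay standing in for COMSOL's transport and PHREEQC's chemistry (illustrative assumptions only, not the actual COMSOL-PHREEQC interface):

```python
def snia_step(conc, velocity, dx, dt, rate):
    """One Sequential Non-Iterative Approach step:
    (1) explicit upwind advection for all nodes,
    (2) local chemistry applied node-by-node (here: first-order decay)."""
    n = len(conc)
    c = velocity * dt / dx  # Courant number, must be <= 1 for stability
    transported = [
        conc[i] - c * (conc[i] - conc[i - 1]) if i > 0 else conc[0]
        for i in range(n)
    ]
    # chemistry operator: in the real code this is where iPhreeqc is called
    return [ci * (1.0 - rate * dt) for ci in transported]

conc = [1.0] + [0.0] * 9          # pulse held at the left (inlet) boundary
for _ in range(20):
    conc = snia_step(conc, velocity=0.5, dx=1.0, dt=1.0, rate=0.05)
print(all(c >= 0 for c in conc))  # True: the split scheme stays non-negative
```

Because the per-node chemistry calls are independent, they parallelize trivially, which is why the SNIA coupling scales well on multi-core machines and clusters.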

  6. Simulation on following Performance of High-Speed Railway In Situ Testing System

    Directory of Open Access Journals (Sweden)

    Fei-Long Zheng

    2013-01-01

Full Text Available Subgrade bears both the weight of superstructures and the impacts of running trains. Its stability directly affects line smoothness, but in situ testing methods for it are inadequate. This paper presents a railway roadbed in situ testing device whose key component is an excitation hydraulic servo cylinder that outputs static and dynamic pressure simultaneously to simulate the force of trains on the subgrade. The principle of the excitation system is briefly introduced, and the transfer function of the closed-loop force control system is derived and simulated; the simulation shows that, without a control algorithm, the dynamic response is slow and the following performance is poor. Therefore, an improved adaptive model following control (AMFC) algorithm based on the direct state method is adopted. A control block diagram is then built and simulated with inputs of different waveforms and frequencies. The simulation results show that the system is greatly improved: the output waveform follows the input signal much better, apart from a little distortion when the signal varies rapidly, and the following performance improves further as the load stiffness increases.
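The principle of model-following control is to add to the command a correction proportional to the error between a reference model with the desired dynamics and the actual plant output. A discrete-time toy version (first-order plant and model, hypothetical time constants and gain, not the paper's AMFC design):

```python
def simulate(follow_gain, steps=200, dt=0.01):
    """Track a unit force step: a fast reference model sets the desired
    response; the command to the slower plant is augmented by a correction
    proportional to the model-plant error. Returns the final following error."""
    tau_model, tau_plant = 0.05, 0.20  # hypothetical time constants (s)
    ym = yp = 0.0
    r = 1.0  # commanded force
    for _ in range(steps):
        ym += dt / tau_model * (r - ym)   # reference model: desired dynamics
        u = r + follow_gain * (ym - yp)   # model-following correction
        yp += dt / tau_plant * (u - yp)   # plant response to the command
    return abs(ym - yp)

# the correction shrinks the following error relative to open-loop command
print(simulate(follow_gain=0.0) > simulate(follow_gain=5.0))  # True
```

A full AMFC scheme would adapt `follow_gain` online from the plant state rather than fixing it, but the error-driven correction shown here is the mechanism being adapted.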

  7. Investigating the Mobility of Light Autonomous Tracked Vehicles using a High Performance Computing Simulation Capability

    Science.gov (United States)

    Negrut, Dan; Mazhar, Hammad; Melanz, Daniel; Lamb, David; Jayakumar, Paramsothy; Letherwood, Michael; Jain, Abhinandan; Quadrelli, Marco

    2012-01-01

    This paper is concerned with the physics-based simulation of light tracked vehicles operating on rough deformable terrain. The focus is on small autonomous vehicles, which weigh less than 100 lb and move on deformable and rough terrain that is feature rich and no longer representable using a continuum approach. A scenario of interest is, for instance, the simulation of a reconnaissance mission for a high mobility lightweight robot where objects such as a boulder or a ditch that could otherwise be considered small for a truck or tank, become major obstacles that can impede the mobility of the light autonomous vehicle and negatively impact the success of its mission. Analyzing and gauging the mobility and performance of these light vehicles is accomplished through a modeling and simulation capability called Chrono::Engine. Chrono::Engine relies on parallel execution on Graphics Processing Unit (GPU) cards.

  8. Progress on H5Part: A Portable High Performance Parallel Data Interface for Electromagnetics Simulations

    International Nuclear Information System (INIS)

    Adelmann, Andreas; Gsell, Achim; Oswald, Benedikt; Schietinger, Thomas; Bethel, Wes; Shalf, John; Siegerist, Cristina; Stockinger, Kurt

    2007-01-01

Significant problems facing all experimental and computational sciences arise from growing data size and complexity. Common to all these problems is the need to perform efficient data I/O on diverse computer architectures. In our scientific application, the largest parallel particle simulations generate vast quantities of six-dimensional data. Such a simulation run produces data for an aggregate data size up to several TB per run. Motivated by the need to address data I/O and access challenges, we have implemented H5Part, an open source data I/O API that simplifies the use of the Hierarchical Data Format v5 library (HDF5). HDF5 is an industry standard for high performance, cross-platform data storage and retrieval that runs on all contemporary architectures from large parallel supercomputers to laptops. H5Part, which is oriented to the needs of the particle physics and cosmology communities, provides support for parallel storage and retrieval of particles and of structured and, in the future, unstructured meshes. In this paper, we describe recent work focusing on I/O support for particles and structured meshes and provide data showing performance on modern supercomputer architectures like the IBM POWER 5

  9. Use of high performance networks and supercomputers for real-time flight simulation

    Science.gov (United States)

    Cleveland, Jeff I., II

    1993-01-01

    In order to meet the stringent time-critical requirements for real-time man-in-the-loop flight simulation, computer processing operations must be consistent in processing time and be completed in as short a time as possible. These operations include simulation mathematical model computation and data input/output to the simulators. In 1986, in response to increased demands for flight simulation performance, NASA's Langley Research Center (LaRC), working with the contractor, developed extensions to the Computer Automated Measurement and Control (CAMAC) technology which resulted in a factor of ten increase in the effective bandwidth and reduced latency of modules necessary for simulator communication. This technology extension is being used by more than 80 leading technological developers in the United States, Canada, and Europe. Included among the commercial applications are nuclear process control, power grid analysis, process monitoring, real-time simulation, and radar data acquisition. Personnel at LaRC are completing the development of the use of supercomputers for mathematical model computation to support real-time flight simulation. This includes the development of a real-time operating system and development of specialized software and hardware for the simulator network. This paper describes the data acquisition technology and the development of supercomputing for flight simulation.
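The timing consistency demanded of real-time man-in-the-loop simulation amounts to a fixed-timestep loop in which every model update must complete before its frame deadline. A minimal scheduler sketch (hypothetical frame rate, standard library only):

```python
import time

def run_frames(update, frame_time, n_frames):
    """Run `update` once per frame on a fixed deadline grid; count overruns."""
    overruns = 0
    next_deadline = time.monotonic() + frame_time
    for _ in range(n_frames):
        update()
        remaining = next_deadline - time.monotonic()
        if remaining > 0:
            time.sleep(remaining)    # idle until the frame boundary
        else:
            overruns += 1            # model took longer than one frame
        next_deadline += frame_time  # fixed grid prevents cumulative drift
    return overruns

# trivial model with negligible work, 20 ms frames (50 Hz)
print(run_frames(lambda: None, frame_time=0.02, n_frames=10))
```

Advancing the deadline on a fixed grid, rather than relative to each frame's finish time, is what keeps frame boundaries evenly spaced even when individual frames jitter.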

  10. Approaching Sentient Building Performance Simulation Systems

    DEFF Research Database (Denmark)

    Negendahl, Kristoffer; Perkov, Thomas; Heller, Alfred

    2014-01-01

Sentient BPS systems can combine one or more high-precision BPS and provide near-instantaneous performance feedback directly in the design tool, thus providing speed and precision of building performance in the early design stages. Sentient BPS systems essentially combine: 1) design tools, 2) parametric tools, 3) BPS tools, 4) dynamic databases, 5) interpolation techniques and 6) prediction techniques into a fast and valid simulation system for the early design stage.

  11. High performance computing on vector systems

    CERN Document Server

    Roller, Sabine

    2008-01-01

Presents the developments in high-performance computing and simulation on modern supercomputer architectures. This book covers trends in hardware and software development in general, and specifically vector-based systems and heterogeneous architectures. It presents innovative fields like coupled multi-physics and multi-scale simulations.

  12. A High-Throughput, High-Accuracy System-Level Simulation Framework for System on Chips

    Directory of Open Access Journals (Sweden)

    Guanyi Sun

    2011-01-01

Full Text Available Today's System-on-Chip (SoC) design is extremely challenging because it involves complicated design tradeoffs and heterogeneous design expertise. To explore the large solution space, system architects have to rely on system-level simulators to identify an optimized SoC architecture. In this paper, we propose a system-level simulation framework, the System Performance Simulation Implementation Mechanism (SPSIM). Based on SystemC TLM2.0, the framework consists of an executable SoC model, a simulation tool chain, and a modeling methodology. Compared with the large body of existing research in this area, this work aims at delivering high simulation throughput while, at the same time, guaranteeing high accuracy on real industrial applications. Integrating leading TLM techniques, our simulator attains a simulation speed within a factor of 35 of hardware execution on a set of real-world applications. SPSIM incorporates effective timing models, which can achieve high accuracy after hardware-based calibration. Experimental results on a set of mobile applications show that the difference between simulated and measured timing performance is within 10%, a level of accuracy that previously could only be attained by cycle-accurate models.
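The throughput-versus-accuracy trade-off described in this record hinges on annotating transaction-level models with timing instead of simulating every clock cycle. As a hedged illustration of the loosely-timed TLM idea (not SPSIM's actual code; all class and parameter names here are invented), a model can simply accumulate an annotated delay per transaction:

```python
# Minimal loosely-timed transaction-level model: each transaction
# advances simulated time by an annotated delay rather than being
# simulated cycle by cycle. Names and delays are illustrative only.

class LooselyTimedBus:
    def __init__(self, read_delay_ns=10, write_delay_ns=15):
        self.time_ns = 0                      # simulated time, not wall-clock
        self.read_delay_ns = read_delay_ns
        self.write_delay_ns = write_delay_ns
        self.memory = {}

    def write(self, addr, value):
        self.memory[addr] = value
        self.time_ns += self.write_delay_ns   # timing annotation

    def read(self, addr):
        self.time_ns += self.read_delay_ns    # timing annotation
        return self.memory.get(addr, 0)

bus = LooselyTimedBus()
bus.write(0x100, 42)
assert bus.read(0x100) == 42
# 1 write (15 ns) + 1 read (10 ns) = 25 ns of simulated time
print(bus.time_ns)  # -> 25
```

Hardware-based calibration, as the abstract describes it, would amount to fitting the per-transaction delays so that accumulated simulated time matches measured execution time.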

  13. The Validity and Incremental Validity of Knowledge Tests, Low-Fidelity Simulations, and High-Fidelity Simulations for Predicting Job Performance in Advanced-Level High-Stakes Selection

    Science.gov (United States)

    Lievens, Filip; Patterson, Fiona

    2011-01-01

    In high-stakes selection among candidates with considerable domain-specific knowledge and experience, investigations of whether high-fidelity simulations (assessment centers; ACs) have incremental validity over low-fidelity simulations (situational judgment tests; SJTs) are lacking. Therefore, this article integrates research on the validity of…

  14. Performance of space charge simulations using High Performance Computing (HPC) cluster

    CERN Document Server

    Bartosik, Hannes; CERN. Geneva. ATS Department

    2017-01-01

In 2016 a collaboration agreement between CERN and the Istituto Nazionale di Fisica Nucleare (INFN), through its Centro Nazionale Analisi Fotogrammi (CNAF, Bologna), was signed [1], which foresaw the purchase and installation at CNAF of a cluster of 20 nodes with 32 cores each, connected with InfiniBand, for use by CERN members to develop parallelized codes and to conduct massive simulation campaigns with already available parallelized tools. As outlined in [1], after the installation and set-up of the first 12 nodes, the green light to proceed with the procurement and installation of the remaining 8 nodes could be given only after successfully passing an acceptance test based on two specific benchmark runs. This condition is necessary to consider the first batch of the cluster operational and compliant with the desired performance specifications. In this brief note, we report the results of the above-mentioned acceptance test.

  15. On the performance simulation of inter-stage turbine reheat

    International Nuclear Information System (INIS)

    Pellegrini, Alvise; Nikolaidis, Theoklis; Pachidis, Vassilios; Köhler, Stephan

    2017-01-01

Highlights: • An innovative gas turbine performance simulation methodology is proposed. • It allows DP and OD performance calculations for complex engine layouts. • It is essential for inter-turbine reheat (ITR) engine performance calculation. • A detailed description is provided for fast and flexible implementation. • The methodology is successfully verified against a commercial closed-source software. - Abstract: Several authors have suggested the implementation of reheat in high By-Pass Ratio (BPR) aero engines to improve engine performance. In contrast to military afterburning, civil aero engines would aim at reducing Specific Fuel Consumption (SFC) by introducing ‘Inter-stage Turbine Reheat’ (ITR). To maximise benefits, the second combustor should be placed at an early stage of the expansion process, e.g. between the first and second High-Pressure Turbine (HPT) stages. The aforementioned cycle design requires the accurate simulation of two or more turbine stages on the same shaft. The Design Point (DP) performance can be easily evaluated by defining a Turbine Work Split (TWS) ratio between the turbine stages. However, the performance simulation of Off-Design (OD) operating points requires the calculation of the TWS parameter for every OD step, taking into account the thermodynamic behaviour of each turbine stage as represented by its respective map. No analytical solution to this problem is currently available in the public domain. This paper presents an analytical methodology by which ITR can be simulated at DP and OD. Results show excellent agreement with a commercial, closed-source performance code; discrepancies range from 0% to 3.48% and are ascribed to the different gas models implemented in the codes.
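To make the turbine work split concrete: at the design point, the TWS ratio fixes how the overall expansion work is shared between the two HPT stages; off-design, the split must be recomputed from each stage's own operating point. A toy Python sketch of the design-point calculation (illustrative only, with made-up cycle numbers; this is not the paper's methodology):

```python
# Toy design-point turbine work split (TWS) for two turbine stages
# on one shaft, with a reheat (ITR) combustor between the stages.
# All numbers below are illustrative, not from the cited paper.

def stage_power(mdot_kg_s, cp_J_kgK, t_in_K, t_out_K):
    """Shaft power of one turbine stage from its enthalpy drop (W)."""
    return mdot_kg_s * cp_J_kgK * (t_in_K - t_out_K)

mdot = 100.0        # core mass flow, kg/s (assumed)
cp_gas = 1244.0     # hot-gas specific heat, J/(kg K) (assumed)

# Stage 1 expands from the main combustor exit; the ITR combustor
# then raises the temperature again before stage 2 expands.
w1 = stage_power(mdot, cp_gas, t_in_K=1700.0, t_out_K=1450.0)
w2 = stage_power(mdot, cp_gas, t_in_K=1600.0, t_out_K=1380.0)

tws = w1 / w2       # design-point work-split ratio
print(round(tws, 3))  # -> 1.136
```

Off-design, as the abstract notes, this single ratio is no longer fixed: each stage's inlet state and map-derived efficiency change with the operating point, so the TWS must be re-solved at every OD step.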

  16. High-performance modeling of CO2 sequestration by coupling reservoir simulation and molecular dynamics

    KAUST Repository

    Bao, Kai

    2013-01-01

The present work describes a parallel computational framework for CO2 sequestration simulation that couples reservoir simulation and molecular dynamics (MD) on massively parallel HPC systems. In this framework, a parallel reservoir simulator, the Reservoir Simulation Toolbox (RST), solves the flow and transport equations that describe the subsurface flow behavior, while molecular dynamics simulations are performed to provide the required physical parameters. Numerous technologies from different fields are employed to make this novel coupled system work efficiently. One of the major applications of the framework is the modeling of large-scale CO2 sequestration for long-term storage in subsurface geological formations, such as depleted reservoirs and deep saline aquifers, which has been proposed as one of the most attractive and practical solutions for reducing CO2 emissions and addressing the global-warming threat. To solve such problems effectively, fine grids and accurate prediction of the properties of fluid mixtures are essential. In this work, CO2 sequestration is presented as a first example of coupling reservoir simulation and molecular dynamics, while the framework can be extended naturally to full multiphase multicomponent compositional flow simulation to handle more complicated physical processes in the future. Accuracy and scalability analyses are performed on an IBM BlueGene/P and on an IBM BlueGene/Q, the latest IBM supercomputer. Results show good accuracy of our MD simulations compared with published data, and good scalability is observed on the massively parallel HPC systems. The performance and capacity of the proposed framework are demonstrated in several experiments involving hundreds of millions to a billion cells. To the best of our knowledge, this work represents the first attempt to couple reservoir simulation and molecular simulation for large-scale modeling. Due to the complexity of the subsurface systems

  17. THC-MP: High performance numerical simulation of reactive transport and multiphase flow in porous media

    Science.gov (United States)

    Wei, Xiaohui; Li, Weishan; Tian, Hailong; Li, Hongliang; Xu, Haixiao; Xu, Tianfu

    2015-07-01

Numerical simulation of multiphase flow and reactive transport in porous media for complex subsurface problems is a computationally intensive application. To meet the increasing computational requirements, this paper presents a parallel computing method and architecture. Derived from TOUGHREACT, a well-established code for simulating subsurface multiphase flow and reactive transport problems, we developed THC-MP, a high-performance code for massively parallel computers that greatly extends the computational capability of the original code. The domain decomposition method was applied to the coupled numerical computing procedure in THC-MP. We designed the distributed data structure, and implemented the data initialization and the exchange between the computing nodes and the core solving module using a hybrid parallel iterative and direct solver. Numerical accuracy of THC-MP was verified on a CO2 injection-induced reactive transport problem by comparing the results obtained from parallel computing with those from sequential computing (the original code). Execution efficiency and code scalability were examined through field-scale carbon sequestration applications on a multicore cluster. The results successfully demonstrate the enhanced performance of THC-MP on parallel computing facilities.
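Domain decomposition, as used above, assigns contiguous blocks of grid cells to compute nodes and exchanges only the cells on partition boundaries (halo or ghost cells) between neighbors. A minimal 1-D sketch of the partitioning step (illustrative only; THC-MP's actual distributed data structure is not described in this abstract):

```python
# Minimal 1-D domain decomposition: split n_cells grid cells as
# evenly as possible across n_ranks compute nodes, and report the
# halo (ghost) cells each rank must receive from its neighbors.

def partition(n_cells, n_ranks):
    """Return [(start, end)] half-open cell ranges, one per rank."""
    base, extra = divmod(n_cells, n_ranks)
    ranges, start = [], 0
    for rank in range(n_ranks):
        size = base + (1 if rank < extra else 0)
        ranges.append((start, start + size))
        start += size
    return ranges

def halo_cells(ranges, rank):
    """Boundary cells this rank must receive from its neighbors."""
    halos = []
    if rank > 0:
        halos.append(ranges[rank - 1][1] - 1)  # last cell of left neighbor
    if rank < len(ranges) - 1:
        halos.append(ranges[rank + 1][0])      # first cell of right neighbor
    return halos

ranges = partition(10, 3)
print(ranges)                 # -> [(0, 4), (4, 7), (7, 10)]
print(halo_cells(ranges, 1))  # -> [3, 7]
```

In 2-D or 3-D the same idea applies per axis; the halo exchange then dominates communication cost, which is why minimizing the surface-to-volume ratio of each subdomain matters for scalability.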

  18. Prediction of SFL Interruption Performance from the Results of Arc Simulation during High-Current Phase

    Science.gov (United States)

    Lee, Jong-Chul; Lee, Won-Ho; Kim, Woun-Jea

    2015-09-01

The design and development of SF6 gas circuit breakers is still largely based on trial and error through testing, even as development costs rise every year. Computation cannot yet replace testing entirely, because not all of the real processes are taken into account. However, numerical simulation of arc behavior and of the thermal flow inside the interrupter is often more informative than experiment, owing to the difficulty of obtaining physical quantities experimentally and to the reduction of computational costs in recent years. In this paper, in order to gain further insight into the interruption process of an SF6 self-blast interrupter, which is based on a combination of thermal expansion and the arc-rotation principle, gas-flow simulations with CFD arc modeling are performed over the whole switching process: the high-current period, the pre-current-zero period, and the current-zero period. From this complete treatment, the pressure rise and the ramp of the pressure inside the chamber before current zero, as well as the post-arc current after current zero, are shown to be good criteria for predicting the short-line fault interruption performance of interrupters.

  19. Performance simulation in high altitude platforms (HAPs) communications systems

    Science.gov (United States)

    Ulloa-Vásquez, Fernando; Delgado-Penin, J. A.

    2002-07-01

This paper considers the analysis by simulation of a digital narrowband communication system for a scenario consisting of a High-Altitude aeronautical Platform (HAP) and fixed/mobile terrestrial transceivers. The aeronautical channel is modelled with geometrical arguments (angle of elevation vs. horizontal distance of the terrestrial reflectors) as well as statistical ones, and under these circumstances a serially concatenated coded digital transmission is analysed for several hypotheses related to radio-electric coverage areas. The results indicate good feasibility of the proposed communication system.

  20. High-Fidelity Contrast Reaction Simulation Training: Performance Comparison of Faculty, Fellows, and Residents.

    Science.gov (United States)

    Pfeifer, Kyle; Staib, Lawrence; Arango, Jennifer; Kirsch, John; Arici, Mel; Kappus, Liana; Pahade, Jay

    2016-01-01

Reactions to contrast material are uncommon in diagnostic radiology and vary in clinical presentation from urticaria to life-threatening anaphylaxis. Prior studies have demonstrated a high error rate in contrast reaction management, with smaller simulation-based studies reporting variable data on effectiveness. We sought to assess the effectiveness of high-fidelity simulation in teaching contrast reaction management to residents, fellows, and attendings. A 20-question multiple-choice test assessing contrast reaction knowledge, with Likert-scale questions assessing subjective comfort with management of contrast reactions, was created. Three simulation scenarios, representing a moderate reaction, a severe reaction, and a contrast reaction mimic, were completed in a one-hour period in a simulation laboratory. All participants completed a pretest and a posttest at one month. A six-month delayed posttest was given but was optional for all participants. A total of 150 radiologists participated (residents = 52; fellows = 24; faculty = 74) in the pretest and posttest; 105 participants completed the delayed posttest (residents = 31; fellows = 17; faculty = 57). A statistically significant increase was found in the one-month posttest (P < .00001) and six-month posttest scores (P < .00001), and in Likert scores (P < .001) assessing comfort level in managing all contrast reactions, compared with the pretest. Test scores and comfort level for moderate and severe reactions decreased significantly at six months compared with the one-month posttest (P < .05). High-fidelity simulation is an effective learning tool, allowing practice of "high-acuity" situation management in a nonthreatening environment; the simulation training resulted in significant improvement in test scores, as well as an increase in subjective comfort in management of reactions, across all levels of training.
A six-month refresher course is suggested, to maintain knowledge and comfort level in

  1. Measurement and simulation of the performance of high energy physics data grids

    Science.gov (United States)

    Crosby, Paul Andrew

    This thesis describes a study of resource brokering in a computational Grid for high energy physics. Such systems are being devised in order to manage the unprecedented workload of the next generation particle physics experiments such as those at the Large Hadron Collider. A simulation of the European Data Grid has been constructed, and calibrated using logging data from a real Grid testbed. This model is then used to explore the Grid's middleware configuration, and suggest improvements to its scheduling policy. The expansion of the simulation to include data analysis of the type conducted by particle physicists is then described. A variety of job and data management policies are explored, in order to determine how well they meet the needs of physicists, as well as how efficiently they make use of CPU and network resources. Appropriate performance indicators are introduced in order to measure how well jobs and resources are managed from different perspectives. The effects of inefficiencies in Grid middleware are explored, as are methods of compensating for them. It is demonstrated that a scheduling algorithm should alter its weighting on load balancing and data distribution, depending on whether data transfer or CPU requirements dominate, and also on the level of job loading. It is also shown that an economic model for data management and replication can improve the efficiency of network use and job processing.
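The closing observation of this record, that a Grid scheduler should reweight load balancing against data locality depending on whether CPU or data transfer dominates, can be illustrated with a toy cost function (entirely hypothetical; this is not the European Data Grid middleware's actual resource broker):

```python
# Toy Grid resource broker: score each site by a weighted sum of
# queue load and data-transfer cost, shifting the weight toward
# data locality when jobs are transfer-bound. Purely illustrative.

def site_score(queue_len, transfer_gb, w_load):
    """Lower is better; w_load in [0, 1] weights load vs. data."""
    return w_load * queue_len + (1.0 - w_load) * transfer_gb

def pick_site(sites, transfer_bound):
    # Transfer-bound jobs weight data locality more heavily.
    w_load = 0.2 if transfer_bound else 0.8
    return min(sites, key=lambda s: site_score(s["queue"], s["gb"], w_load))

sites = [
    {"name": "A", "queue": 2, "gb": 10.0},   # idle, but data is remote
    {"name": "B", "queue": 10, "gb": 0.0},   # busy, but holds the data
]
print(pick_site(sites, transfer_bound=True)["name"])   # -> B
print(pick_site(sites, transfer_bound=False)["name"])  # -> A
```

The thesis's finding is precisely that no single fixed weighting serves both regimes: the broker must adapt `w_load` (here a hypothetical parameter) to the job mix and loading level.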

  2. Prospective randomized study of contrast reaction management curricula: Computer-based interactive simulation versus high-fidelity hands-on simulation

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Carolyn L., E-mail: wangcl@uw.edu [Department of Radiology, University of Washington, Box 357115, 1959 NE Pacific Street, Seattle, WA 98195-7115 (United States); Schopp, Jennifer G.; Kani, Kimia [Department of Radiology, University of Washington, Box 357115, 1959 NE Pacific Street, Seattle, WA 98195-7115 (United States); Petscavage-Thomas, Jonelle M. [Penn State Hershey Medical Center, Department of Radiology, 500 University Drive, Hershey, PA 17033 (United States); Zaidi, Sadaf; Hippe, Dan S.; Paladin, Angelisa M.; Bush, William H. [Department of Radiology, University of Washington, Box 357115, 1959 NE Pacific Street, Seattle, WA 98195-7115 (United States)

    2013-12-01

    Purpose: We developed a computer-based interactive simulation program for teaching contrast reaction management to radiology trainees and compared its effectiveness to high-fidelity hands-on simulation training. Materials and methods: IRB approved HIPAA compliant prospective study of 44 radiology residents, fellows and faculty who were randomized into either the high-fidelity hands-on simulation group or computer-based simulation group. All participants took separate written tests prior to and immediately after their intervention. Four months later participants took a delayed written test and a hands-on high-fidelity severe contrast reaction scenario performance test graded on predefined critical actions. Results: There was no statistically significant difference between the computer and hands-on groups’ written pretest, immediate post-test, or delayed post-test scores (p > 0.6 for all). Both groups’ scores improved immediately following the intervention (p < 0.001). The delayed test scores 4 months later were still significantly higher than the pre-test scores (p ≤ 0.02). The computer group's performance was similar to the hands-on group on the severe contrast reaction simulation scenario test (p = 0.7). There were also no significant differences between the computer and hands-on groups in performance on the individual core competencies of contrast reaction management during the contrast reaction scenario. Conclusion: It is feasible to develop a computer-based interactive simulation program to teach contrast reaction management. Trainees that underwent computer-based simulation training scored similarly on written tests and on a hands-on high-fidelity severe contrast reaction scenario performance test as those trained with hands-on high-fidelity simulation.

  3. Prospective randomized study of contrast reaction management curricula: Computer-based interactive simulation versus high-fidelity hands-on simulation

    International Nuclear Information System (INIS)

    Wang, Carolyn L.; Schopp, Jennifer G.; Kani, Kimia; Petscavage-Thomas, Jonelle M.; Zaidi, Sadaf; Hippe, Dan S.; Paladin, Angelisa M.; Bush, William H.

    2013-01-01

    Purpose: We developed a computer-based interactive simulation program for teaching contrast reaction management to radiology trainees and compared its effectiveness to high-fidelity hands-on simulation training. Materials and methods: IRB approved HIPAA compliant prospective study of 44 radiology residents, fellows and faculty who were randomized into either the high-fidelity hands-on simulation group or computer-based simulation group. All participants took separate written tests prior to and immediately after their intervention. Four months later participants took a delayed written test and a hands-on high-fidelity severe contrast reaction scenario performance test graded on predefined critical actions. Results: There was no statistically significant difference between the computer and hands-on groups’ written pretest, immediate post-test, or delayed post-test scores (p > 0.6 for all). Both groups’ scores improved immediately following the intervention (p < 0.001). The delayed test scores 4 months later were still significantly higher than the pre-test scores (p ≤ 0.02). The computer group's performance was similar to the hands-on group on the severe contrast reaction simulation scenario test (p = 0.7). There were also no significant differences between the computer and hands-on groups in performance on the individual core competencies of contrast reaction management during the contrast reaction scenario. Conclusion: It is feasible to develop a computer-based interactive simulation program to teach contrast reaction management. Trainees that underwent computer-based simulation training scored similarly on written tests and on a hands-on high-fidelity severe contrast reaction scenario performance test as those trained with hands-on high-fidelity simulation

  4. The Effect of High and Low Antiepileptic Drug Dosage on Simulated Driving Performance in Person's with Seizures: A Pilot Study

    Directory of Open Access Journals (Sweden)

    Alexander M. Crizzle

    2015-10-01

Full Text Available Background: Prior studies examining driving performance have not examined the effects of antiepileptic drugs (AEDs) or their dosages in persons with epilepsy. AEDs are the primary form of treatment to control seizures, but they are shown to affect cognition, attention, and vision, all of which may impair driving. The purpose of this study was to describe the characteristics of high and low AED dosages on simulated driving performance in persons with seizures. Method: Patients (N = 11; mean age 42.1 ± 6.3; 55% female; 100% Caucasian) were recruited from the Epilepsy Monitoring Unit and had their driving assessed on a simulator. Results: No differences emerged in total or specific types of driving errors between high and low AED dosages. However, high AED dosage was significantly associated with errors of lane maintenance (r = .67, p < .05) and gap acceptance (r = .66, p < .05). The findings suggest that higher AED dosages may adversely affect driving performance, irrespective of a diagnosis of epilepsy, conversion disorder, or other medical conditions. Conclusion: Future studies with larger samples are required to examine whether AED dosage or seizure focus alone can impair driving performance in persons with and without seizures.

  5. Reusable Object-Oriented Solutions for Numerical Simulation of PDEs in a High Performance Environment

    Directory of Open Access Journals (Sweden)

    Andrea Lani

    2006-01-01

Full Text Available Object-oriented platforms developed for the numerical solution of PDEs must combine flexibility and reusability in order to ease the integration of new functionalities and algorithms. When designing such frameworks, built-in support for high performance should be provided and enforced transparently, especially in parallel simulations. The paper presents solutions developed to effectively tackle these and other more specific problems (data handling and storage, implementation of physical models and numerical methods) that have arisen in the development of COOLFluiD, an environment for PDE solvers. Particular attention is devoted to describing a data storage facility, highly suitable for both serial and parallel computing, and to discussing the application of two design patterns, Perspective and Method-Command-Strategy, that support extensibility and run-time flexibility in the implementation of physical models and generic numerical algorithms, respectively.
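The Strategy side of the Method-Command-Strategy pattern mentioned above lets a solver swap numerical algorithms at run time without touching the solver loop. A minimal Python sketch (hedged: COOLFluiD is a C++ framework, and the class names below are invented for illustration, not its actual API):

```python
# Minimal Strategy pattern for a PDE solver: the solver delegates
# face-flux computation to an interchangeable strategy object, so
# new numerical schemes plug in without changing the solver loop.
# Class names are illustrative, not COOLFluiD's actual API.

class CentralFlux:
    def flux(self, u_left, u_right):
        return 0.5 * (u_left + u_right)        # central average

class UpwindFlux:
    def flux(self, u_left, u_right):
        return u_left                          # take the upwind value

class Solver:
    def __init__(self, flux_strategy):
        self.flux_strategy = flux_strategy     # swappable at run time

    def face_fluxes(self, cells):
        return [self.flux_strategy.flux(cells[i], cells[i + 1])
                for i in range(len(cells) - 1)]

cells = [1.0, 3.0, 5.0]
print(Solver(CentralFlux()).face_fluxes(cells))  # -> [2.0, 4.0]
print(Solver(UpwindFlux()).face_fluxes(cells))   # -> [1.0, 3.0]
```

The Command half of the pattern would wrap each numerical step (assemble, solve, update) as an object to be queued and replayed, which is what gives such frameworks their run-time configurability.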

  6. Extended-Term Dynamic Simulations with High Penetrations of Photovoltaic Generation.

    Energy Technology Data Exchange (ETDEWEB)

    Concepcion, Ricky James [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Elliott, Ryan Thomas [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Donnelly, Matt [Montana Tech., Butte, MT (United States); Sanchez-Gasca, Juan [GE Energy, Schenectady, NY (United States)

    2016-01-01

The uncontrolled, intermittent availability of renewable energy sources makes integration of such devices into today's grid a challenge. Thus, it is imperative that dynamic simulation tools used to analyze power system performance are able to support systems with high amounts of photovoltaic (PV) generation. Additionally, simulation durations extending beyond minutes into hours must be supported. This report aims to identify the path forward for dynamic simulation tools to accommodate these needs by characterizing the properties of power systems with high PV penetration, analyzing how these properties affect dynamic simulation software, and offering solutions for potential problems. We present a study of fixed time step, explicit numerical integration schemes that may be more suitable for these goals, based on identified requirements for simulating high-PV-penetration systems. We also present the alternative of variable time step integration. To help determine the characteristics of systems with high PV generation, we performed small-signal stability studies and time-domain simulations of two representative systems. Along with feedback from stakeholders and vendors, we identify the current gaps in power system modeling, including fast and slow dynamics, and propose a new simulation framework to improve our ability to model and simulate longer-term dynamics.
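Fixed time step, explicit integration of the kind surveyed in this report can be illustrated with the simplest such scheme, forward Euler, applied to a toy first-order dynamic model (illustrative only; production power-system tools use higher-order or variable-step schemes):

```python
# Forward Euler, the simplest fixed-time-step explicit integrator,
# applied to the toy dynamics x' = -x (exact solution: exp(-t)).
import math

def forward_euler(f, x0, t_end, dt):
    x, t = x0, 0.0
    while t < t_end - 1e-12:
        x += dt * f(x)     # explicit update: only the current state is used
        t += dt
    return x

x = forward_euler(lambda x: -x, x0=1.0, t_end=1.0, dt=0.001)
print(abs(x - math.exp(-1.0)) < 1e-3)  # -> True
```

The report's tension is visible even here: an explicit scheme's step size is bounded by the fastest dynamics for stability, so extended-term simulation of hours of slow dynamics with a step sized for fast PV-inverter dynamics becomes prohibitively expensive, motivating the proposed framework.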

  7. Micromagnetics on high-performance workstation and mobile computational platforms

    Science.gov (United States)

    Fu, S.; Chang, R.; Couture, S.; Menarini, M.; Escobar, M. A.; Kuteifan, M.; Lubarda, M.; Gabay, D.; Lomakin, V.

    2015-05-01

The feasibility of using high-performance desktop and embedded mobile computational platforms for micromagnetics is presented, including multi-core Intel central processing units, Nvidia desktop graphics processing units, and the Nvidia Jetson TK1 platform. The FastMag finite-element-method-based micromagnetic simulator is used as a testbed, showing high efficiency on all the platforms. Aspects of optimizing the performance of the mobile systems are discussed. The high performance, low cost, low power consumption, and rapid performance increase of embedded mobile systems make them a promising candidate for micromagnetic simulations. Such architectures can be used as standalone systems or can be combined into low-power computing clusters.

  8. The Fuel Accident Condition Simulator (FACS) furnace system for high temperature performance testing of VHTR fuel

    Energy Technology Data Exchange (ETDEWEB)

    Demkowicz, Paul A., E-mail: paul.demkowicz@inl.gov [Idaho National Laboratory, 2525 Fremont Avenue, MS 3860, Idaho Falls, ID 83415-3860 (United States); Laug, David V.; Scates, Dawn M.; Reber, Edward L.; Roybal, Lyle G.; Walter, John B.; Harp, Jason M. [Idaho National Laboratory, 2525 Fremont Avenue, MS 3860, Idaho Falls, ID 83415-3860 (United States); Morris, Robert N. [Oak Ridge National Laboratory, 1 Bethel Valley Road, Oak Ridge, TN 37831 (United States)

    2012-10-15

Highlights: • A system has been developed for safety testing of irradiated coated particle fuel. • The FACS system is designed to facilitate remote operation in a shielded hot cell. • The system will measure release of fission gases and condensable fission products. • Fuel performance can be evaluated at temperatures as high as 2000 °C in flowing helium. - Abstract: The AGR-1 irradiation of TRISO-coated particle fuel specimens was recently completed and represents the most successful such irradiation in US history, reaching peak burnups of greater than 19% FIMA with zero failures out of 300,000 particles. An extensive post-irradiation examination (PIE) campaign will be conducted on the AGR-1 fuel in order to characterize the irradiated fuel properties, assess the in-pile fuel performance in terms of coating integrity and fission metals release, and determine the fission product retention behavior during high temperature safety testing. A new furnace system has been designed, built, and tested to perform high temperature accident tests. The Fuel Accident Condition Simulator furnace system is designed to heat fuel specimens at temperatures up to 2000 °C in helium while monitoring the release of volatile fission metals (e.g. Cs, Ag, Sr, and Eu), iodine, and fission gases (Kr, Xe). Fission gases released from the fuel to the sweep gas are monitored in real time using dual cryogenic traps fitted with high purity germanium detectors. Condensable fission products are collected on a plate attached to a water-cooled cold finger that can be exchanged periodically without interrupting the test. Analysis of fission products on the condensation plates involves dry gamma counting followed by chemical analysis of selected isotopes. This paper will describe design and operational details of the Fuel Accident Condition Simulator furnace system and the associated

  9. Computer Simulation Performed for Columbia Project Cooling System

    Science.gov (United States)

    Ahmad, Jasim

    2005-01-01

This demo shows a high-fidelity simulation of the air flow in the main computer room housing the Columbia (10,240 Intel Itanium processors) system. The simulation assesses the performance of the cooling system, identifies deficiencies, and recommends modifications to eliminate them. It used two in-house software packages on NAS supercomputers: Chimera Grid Tools to generate a geometric model of the computer room, and the OVERFLOW-2 code for fluid and thermal simulation. This state-of-the-art technology can be easily extended to provide a general capability for air flow analyses of any modern computer room.

  10. Equipment and performance upgrade of compact nuclear simulator

    International Nuclear Information System (INIS)

    Park, J. C.; Kwon, K. C.; Lee, D. Y.; Hwang, I. K.; Park, W. M.; Cha, K. H.; Song, S. J.; Lee, J. W.; Kim, B. G.; Kim, H. J.

    1999-01-01

The simulator at the Nuclear Training Center in KAERI became old and could no longer be used effectively for nuclear training and research, owing to problems such as aging of the equipment, the difficulty and high cost of obtaining consumables, and a shortage of personnel able to handle the old equipment. To solve these problems, this study was performed to recover the functions of the simulator through technical design and the replacement of components with new ones. As a result of this study, our tests after the replacement showed the same simulation status as before, and the new graphic displays added to the simulator were effective for training and easy to maintain. This study is meaningful as a demonstration of how to upgrade nuclear training simulators that have lost their function through obsolescence and the unavailability of components.

  11. High Performance Computing in Science and Engineering '15 : Transactions of the High Performance Computing Center

    CERN Document Server

    Kröner, Dietmar; Resch, Michael

    2016-01-01

    This book presents the state-of-the-art in supercomputer simulation. It includes the latest findings from leading researchers using systems from the High Performance Computing Center Stuttgart (HLRS) in 2015. The reports cover all fields of computational science and engineering ranging from CFD to computational physics and from chemistry to computer science with a special emphasis on industrially relevant applications. Presenting findings of one of Europe’s leading systems, this volume covers a wide variety of applications that deliver a high level of sustained performance. The book covers the main methods in high-performance computing. Its outstanding results in achieving the best performance for production codes are of particular interest for both scientists and engineers. The book comes with a wealth of color illustrations and tables of results.

  12. High Performance Computing in Science and Engineering '17 : Transactions of the High Performance Computing Center

    CERN Document Server

    Kröner, Dietmar; Resch, Michael; HLRS 2017

    2018-01-01

This book presents the state-of-the-art in supercomputer simulation. It includes the latest findings from leading researchers using systems from the High Performance Computing Center Stuttgart (HLRS) in 2017. The reports cover all fields of computational science and engineering ranging from CFD to computational physics and from chemistry to computer science with a special emphasis on industrially relevant applications. Presenting findings of one of Europe’s leading systems, this volume covers a wide variety of applications that deliver a high level of sustained performance. The book covers the main methods in high-performance computing. Its outstanding results in achieving the best performance for production codes are of particular interest for both scientists and engineers. The book comes with a wealth of color illustrations and tables of results.

  13. Integrated plasma control for high performance tokamaks

    International Nuclear Information System (INIS)

    Humphreys, D.A.; Deranian, R.D.; Ferron, J.R.; Johnson, R.D.; LaHaye, R.J.; Leuer, J.A.; Penaflor, B.G.; Walker, M.L.; Welander, A.S.; Jayakumar, R.J.; Makowski, M.A.; Khayrutdinov, R.R.

    2005-01-01

    Sustaining high performance in a tokamak requires controlling many equilibrium shape and profile characteristics simultaneously with high accuracy and reliability, while suppressing a variety of MHD instabilities. Integrated plasma control, the process of designing high-performance tokamak controllers based on validated system response models and confirming their performance in detailed simulations, provides a systematic method for achieving and ensuring good control performance. For present-day devices, this approach can greatly reduce the need for machine time traditionally dedicated to control optimization, and can allow determination of high-reliability controllers prior to ever producing the target equilibrium experimentally. A full set of tools needed for this approach has recently been completed and applied to present-day devices including DIII-D, NSTX and MAST. This approach has proven essential in the design of several next-generation devices including KSTAR, EAST, JT-60SC, and ITER. We describe the method, results of design and simulation tool development, and recent research producing novel approaches to equilibrium and MHD control in DIII-D. (author)

  14. A Simulation Approach for Performance Validation during Embedded Systems Design

    Science.gov (United States)

    Wang, Zhonglei; Haberl, Wolfgang; Herkersdorf, Andreas; Wechs, Martin

    Due to the time-to-market pressure, it is highly desirable to design hardware and software of embedded systems in parallel. However, hardware and software are developed mostly using very different methods, so that performance evaluation and validation of the whole system is not an easy task. In this paper, we propose a simulation approach to bridge the gap between model-driven software development and simulation based hardware design, by merging hardware and software models into a SystemC based simulation environment. An automated procedure has been established to generate software simulation models from formal models, while the hardware design is originally modeled in SystemC. As the simulation models are annotated with timing information, performance issues are tackled in the same pass as system functionality, rather than in a dedicated approach.
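
The central idea of the paper, running timing-annotated software models and a hardware model inside one event-driven simulation kernel, can be sketched in miniature. The kernel, task names, and latencies below are hypothetical stand-ins for the SystemC machinery the authors actually use:

```python
import heapq

class Simulator:
    """Minimal discrete-event kernel: processes post events at simulated times."""
    def __init__(self):
        self.now = 0.0
        self.queue = []
        self.log = []

    def schedule(self, delay, name):
        heapq.heappush(self.queue, (self.now + delay, name))

    def run(self):
        while self.queue:
            self.now, name = heapq.heappop(self.queue)
            self.log.append((self.now, name))

sim = Simulator()
# Software model: tasks annotated with estimated execution times (hypothetical).
for t, task in [(0.0, "sw:init"), (5.0, "sw:compute")]:
    sim.schedule(t, task)
# Hardware model: a bus transaction that completes after a fixed latency.
sim.schedule(2.5, "hw:bus_xfer")
sim.run()
print(sim.log)  # events interleaved in simulated-time order
```

Because both sides share one notion of simulated time, performance questions (does the bus transfer finish before the compute task needs its data?) are answered in the same pass as functional ones, which is the point the abstract makes.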

  15. Visualization and Analysis of Climate Simulation Performance Data

    Science.gov (United States)

    Röber, Niklas; Adamidis, Panagiotis; Behrens, Jörg

    2015-04-01

    Visualization is the key process of transforming abstract (scientific) data into a graphical representation, to aid in the understanding of the information hidden within the data. Climate simulation data sets are typically quite large, time varying, and consist of many different variables sampled on an underlying grid. A large variety of climate models - and sub models - exist to simulate various aspects of the climate system. Generally, one is mainly interested in the physical variables produced by the simulation runs, but model developers are also interested in performance data measured along with these simulations. Climate simulation models are carefully developed complex software systems, designed to run in parallel on large HPC systems. An important goal thereby is to utilize the entire hardware as efficiently as possible, that is, to distribute the workload as evenly as possible among the individual components. This is a very challenging task, and detailed performance data, such as timings and cache misses, have to be used to locate and understand performance problems in order to optimize the model implementation. Furthermore, the correlation of performance data to the processes of the application and the sub-domains of the decomposed underlying grid is vital when addressing communication and load imbalance issues. High resolution climate simulations are carried out on tens to hundreds of thousands of cores, thus yielding a vast amount of profiling data, which cannot be analyzed without appropriate visualization techniques. This PICO presentation displays and discusses the ICON simulation model, which is jointly developed by the Max Planck Institute for Meteorology and the German Weather Service, in partnership with DKRZ. The visualization and analysis of the model's performance data allows us to optimize and fine tune the model, as well as to understand its execution on the HPC system. We show and discuss our workflow, as well as present new ideas and
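
A first step in analysing such per-rank profiling data, before any visualization, is a simple load-imbalance metric over the measured timings. The rank timings below are hypothetical, not taken from the ICON runs described above:

```python
# Per-rank compute times from one simulated timestep (hypothetical, seconds).
rank_times = [12.1, 11.8, 12.0, 17.5]  # rank 3 is overloaded

max_t = max(rank_times)
avg_t = sum(rank_times) / len(rank_times)
# A common imbalance metric: max/avg - 1. Zero means perfectly balanced;
# every rank then waits (max - avg) seconds on average at each synchronization.
imbalance = max_t / avg_t - 1.0
print(f"load imbalance: {imbalance:.1%}")
```

Correlating such a metric with the sub-domains of the decomposed grid is what points developers at the communication or partitioning problem behind it.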

  16. Doctors' stress responses and poor communication performance in simulated bad-news consultations.

    Science.gov (United States)

    Brown, Rhonda; Dunn, Stewart; Byrnes, Karen; Morris, Richard; Heinrich, Paul; Shaw, Joanne

    2009-11-01

    No studies have previously evaluated factors associated with high stress levels and poor communication performance in breaking bad news (BBN) consultations. This study determined factors that were most strongly related to doctors' stress responses and poor communication performance during a simulated BBN task. In 2007, the authors recruited 24 doctors comprising 12 novices (i.e., interns/residents with 1-3 years' experience) and 12 experts (i.e., registrars, medical/radiation oncologists, or cancer surgeons, with more than 4 years' experience). Doctors participated in simulated BBN consultations and a number of control tasks. Five-minute-epoch heart rate (HR), HR variability, and communication performance were assessed in all participants. Subjects also completed a short questionnaire asking about their prior experience with BBN, perceived stress, psychological distress (i.e., anxiety, depression), fatigue, and burnout. High stress responses were related to inexperience with BBN, fatigue, and giving bad versus good news. Poor communication performance in the consultation was related to high burnout and fatigue scores. These results suggest that BBN was a stressful experience for doctors even in a simulated encounter, especially for those who were inexperienced and/or fatigued. Poor communication performance was related to burnout and fatigue, but not inexperience with BBN. These results likely indicate that burnout and fatigue contributed to stress and poor work performance in some doctors during the simulated BBN task.

  17. H5Part A Portable High Performance Parallel Data Interface for Particle Simulations

    CERN Document Server

    Adelmann, Andreas; Shalf, John M; Siegerist, Cristina

    2005-01-01

    The largest parallel particle simulations, in six-dimensional phase space, generate vast amounts of data. It is also desirable to share data and data analysis tools such as ParViT (Particle Visualization Toolkit) among other groups who are working on particle-based accelerator simulations. We define a very simple file schema built on top of HDF5 (Hierarchical Data Format version 5) as well as an API that simplifies the reading/writing of the data to the HDF5 file format. HDF5 offers a self-describing machine-independent binary file format that supports scalable parallel I/O performance for MPI codes on a variety of supercomputing systems and works equally well on laptop computers. The API is available for C, C++, and Fortran codes. The file format will enable disparate research groups with very different simulation implementations to share data transparently and share data analysis tools. For instance, the common file format will enable groups that depend on completely different simulation implementations to share c...
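
The schema idea is simple enough to mock up: one group per time step, one dataset per particle attribute. A real implementation would write HDF5 (e.g. via h5py or the H5Part C API); here a plain dict stands in for the file, and the step/attribute names are illustrative:

```python
# Mock of an H5Part-like layout: a "Step#n" group per time step, one named
# array per particle attribute inside it. A dict stands in for the HDF5 file.
particle_file = {}

def write_step(step, **attrs):
    """e.g. write_step(0, x=[...], px=[...]) - one array per attribute."""
    particle_file[f"Step#{step}"] = {name: list(vals) for name, vals in attrs.items()}

def read_step(step, name):
    return particle_file[f"Step#{step}"][name]

write_step(0, x=[0.0, 1.0], px=[0.1, -0.1])
write_step(1, x=[0.01, 0.99], px=[0.1, -0.1])
print(read_step(1, "x"))  # [0.01, 0.99]
```

Because every step is self-describing (named attributes, no fixed ordering), two codes with different internal particle layouts can read each other's files, which is the interoperability goal the abstract describes.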

  18. StagBL : A Scalable, Portable, High-Performance Discretization and Solver Layer for Geodynamic Simulation

    Science.gov (United States)

    Sanan, P.; Tackley, P. J.; Gerya, T.; Kaus, B. J. P.; May, D.

    2017-12-01

    StagBL is an open-source parallel solver and discretization library for geodynamic simulation, encapsulating and optimizing operations essential to staggered-grid finite volume Stokes flow solvers. It provides a parallel staggered-grid abstraction with a high-level interface in C and Fortran. On top of this abstraction, tools are available to define boundary conditions and interact with particle systems. Tools and examples to efficiently solve Stokes systems defined on the grid are provided in small (direct solver), medium (simple preconditioners), and large (block factorization and multigrid) model regimes. By working directly with leading application codes (StagYY, I3ELVIS, and LaMEM) and providing an API and examples to integrate with others, StagBL aims to become a community tool supplying scalable, portable, reproducible performance toward novel science in regional- and planet-scale geodynamics and planetary science. By implementing kernels used by many research groups beneath a uniform abstraction layer, the library will enable optimization for modern hardware, thus reducing community barriers to large- or extreme-scale parallel simulation on modern architectures. In particular, the library will include CPU-, Manycore-, and GPU-optimized variants of matrix-free operators and multigrid components. The common layer provides a framework upon which to introduce innovative new tools. StagBL will leverage p4est to provide distributed adaptive meshes, and incorporate a multigrid convergence analysis tool. These options, in addition to a wealth of solver options provided by an interface to PETSc, will make the most modern solution techniques available from a common interface. StagBL in turn provides a PETSc interface, DMStag, to its central staggered grid abstraction. We present public version 0.5 of StagBL, including preliminary integration with application codes and demonstrations with its own demonstration application, StagBLDemo. Central to StagBL is the notion of an
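
The staggered-grid abstraction at the heart of such libraries places pressures at cell centres and velocities at cell faces, so that the discrete divergence in each cell uses exactly the velocities on its two faces. A minimal 1D sketch (values and grid size hypothetical, not StagBL's API):

```python
# 1D staggered grid: pressure p at cell centres, velocity u at faces.
# nx cells -> nx cell-centred pressures and nx+1 face velocities.
nx, dx = 4, 0.25
u = [0.0, 1.0, 2.0, 3.0, 4.0]   # face velocities (hypothetical)
p = [0.0] * nx                  # cell-centred pressures

# Discrete divergence of u in cell i: (u[i+1] - u[i]) / dx.
div = [(u[i + 1] - u[i]) / dx for i in range(nx)]
print(div)  # [4.0, 4.0, 4.0, 4.0]
```

Keeping this index bookkeeping (and its 2D/3D generalization) behind one abstraction is what lets a library swap CPU, manycore, or GPU kernels underneath without touching application code.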

  19. Performance measurement system for training simulators. Interim report

    International Nuclear Information System (INIS)

    Bockhold, G. Jr.; Roth, D.R.

    1978-05-01

    In the first project phase, the project team has designed, installed, and test run on the Browns Ferry nuclear power plant training simulator a performance measurement system capable of automatic recording of statistical information on operator actions and plant response. Key plant variables and operator actions were monitored and analyzed by the simulator computer for a selected set of four operating and casualty drills. The project has the following objectives: (1) To provide an empirical data base for statistical analysis of operator reliability and for allocation of safety and control functions between operators and automated controls; (2) To develop a method for evaluation of the effectiveness of control room designs and operating procedures; and (3) To develop a system for scoring aspects of operator performance to assist in training evaluations and to support operator selection research. The performance measurement system has shown potential for meeting the research objectives. However, the cost of training simulator time is high; to keep research program costs reasonable, the measurement system is being designed to be an integral part of operator training programs. In the pilot implementation, participating instructors judged the measurement system to be a valuable and objective extension of their abilities to monitor trainee performance

  20. Applying GIS and high performance agent-based simulation for managing an Old World Screwworm fly invasion of Australia.

    Science.gov (United States)

    Welch, M C; Kwan, P W; Sajeev, A S M

    2014-10-01

    Agent-based modelling has proven to be a promising approach for developing rich simulations for complex phenomena that provide decision support functions across a broad range of areas including biological, social and agricultural sciences. This paper demonstrates how high performance computing technologies, namely General-Purpose Computing on Graphics Processing Units (GPGPU), and commercial Geographic Information Systems (GIS) can be applied to develop a national scale, agent-based simulation of an incursion of Old World Screwworm fly (OWS fly) into the Australian mainland. The development of this simulation model leverages the combination of massively data-parallel processing capabilities supported by NVidia's Compute Unified Device Architecture (CUDA) and the advanced spatial visualisation capabilities of GIS. These technologies have enabled the implementation of an individual-based, stochastic lifecycle and dispersal algorithm for the OWS fly invasion. The simulation model draws upon a wide range of biological data as input to stochastically determine the reproduction and survival of the OWS fly through the different stages of its lifecycle and dispersal of gravid females. Through this model, a highly efficient computational platform has been developed for studying the effectiveness of control and mitigation strategies and their associated economic impact on livestock industries. Copyright © 2014 International Atomic Energy Agency. Published by Elsevier B.V. All rights reserved.
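
A stochastic, stage-structured lifecycle of the kind described can be sketched in a few lines. The stage names mirror a generic fly lifecycle, but the survival and progression probabilities are hypothetical, and this serial loop merely stands in for the data-parallel CUDA kernel the paper uses:

```python
import random

random.seed(42)  # reproducible run

STAGES = ["egg", "larva", "pupa", "adult"]
# Hypothetical per-day survival and stage-progression probabilities.
SURVIVAL = {"egg": 0.9, "larva": 0.85, "pupa": 0.9, "adult": 0.95}
ADVANCE = {"egg": 0.5, "larva": 0.3, "pupa": 0.4}

def step(population):
    """One simulated day: each agent survives stochastically, then may advance."""
    survivors = []
    for stage in population:
        if random.random() >= SURVIVAL[stage]:
            continue  # agent died this day
        if stage != "adult" and random.random() < ADVANCE[stage]:
            stage = STAGES[STAGES.index(stage) + 1]
        survivors.append(stage)
    return survivors

pop = ["egg"] * 1000
for _ in range(30):
    pop = step(pop)
print(len(pop), "agents alive after 30 days")
```

Because each agent's update is independent, the inner loop maps naturally onto one GPU thread per agent, which is what makes national-scale populations tractable.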

  1. Effects of Dietary Nitrate Supplementation on Physiological Responses, Cognitive Function, and Exercise Performance at Moderate and Very-High Simulated Altitude

    Directory of Open Access Journals (Sweden)

    Oliver M. Shannon

    2017-06-01

    Purpose: Nitric oxide (NO) bioavailability is reduced during acute altitude exposure, contributing toward the decline in physiological and cognitive function in this environment. This study evaluated the effects of nitrate (NO3−) supplementation on NO bioavailability, physiological and cognitive function, and exercise performance at moderate and very-high simulated altitude. Methods: Ten males (mean (SD) V˙O2max: 60.9 (10.1) ml·kg−1·min−1) rested and performed exercise twice at moderate (~14.0% O2; ~3,000 m) and twice at very-high (~11.7% O2; ~4,300 m) simulated altitude. Participants ingested either 140 ml concentrated NO3−-rich (BRJ; ~12.5 mmol NO3−) or NO3−-deplete (PLA; 0.01 mmol NO3−) beetroot juice 2 h before each trial. Participants rested for 45 min in normobaric hypoxia prior to completing an exercise task. Exercise comprised a 45 min walk at 30% V˙O2max and a 3 km time-trial (TT), both conducted on a treadmill at a 10% gradient whilst carrying a 10 kg backpack to simulate altitude hiking. Plasma nitrite concentration ([NO2−]), peripheral oxygen saturation (SpO2), pulmonary oxygen uptake (V˙O2), muscle and cerebral oxygenation, and cognitive function were measured throughout. Results: Pre-exercise plasma [NO2−] was significantly elevated in BRJ compared with PLA (p = 0.001). Pulmonary V˙O2 was reduced (p = 0.020), and SpO2 was elevated (p = 0.005), during steady-state exercise in BRJ compared with PLA, with similar effects at both altitudes. BRJ supplementation enhanced 3 km TT performance relative to PLA by 3.8% [1,653.9 (261.3) vs. 1,718.7 (213.0) s] and 4.2% [1,809.8 (262.0) vs. 1,889.1 (203.9) s] at 3,000 and 4,300 m, respectively (p = 0.019). Oxygenation of the gastrocnemius was elevated during the TT consequent to BRJ (p = 0.011). The number of false alarms during the Rapid Visual Information Processing Task tended to be lower with BRJ compared with PLA prior to altitude exposure (p = 0.056). Performance in all other cognitive tasks

  2. Importance of debriefing in high-fidelity simulations

    Directory of Open Access Journals (Sweden)

    Igor Karnjuš

    2014-04-01

    Debriefing has been identified as one of the most important parts of a high-fidelity simulation learning process. During debriefing, the mentor invites learners to critically assess the knowledge and skills used during the execution of a scenario. Despite the abundance of studies that have examined simulation-based education, debriefing is still poorly defined. The present article examines the essential features of debriefing, including its phases, techniques and methods, through a systematic review of recent publications. It emphasizes the mentor’s role, since the effectiveness of debriefing largely depends on the mentor’s skill in conducting it. The guidelines that allow the mentor to evaluate his performance in conducting debriefing are also presented. We underline the importance of debriefing in clinical settings as part of a continuous learning process. Debriefing allows medical teams to assess their performance and develop new strategies to achieve higher competencies. Although debriefing is the cornerstone of the high-fidelity simulation learning process, it also represents an important learning strategy in the clinical setting. Many important aspects of debriefing are still poorly explored and understood, and this part of the learning process should therefore be given greater attention in the future.

  3. A predictive analytic model for high-performance tunneling field-effect transistors approaching non-equilibrium Green's function simulations

    International Nuclear Information System (INIS)

    Salazar, Ramon B.; Appenzeller, Joerg; Ilatikhameneh, Hesameddin; Rahman, Rajib; Klimeck, Gerhard

    2015-01-01

    A new compact modeling approach is presented which describes the full current-voltage (I-V) characteristic of high-performance (aggressively scaled-down) tunneling field-effect-transistors (TFETs) based on homojunction direct-bandgap semiconductors. The model is based on an analytic description of two key features, which capture the main physical phenomena related to TFETs: (1) the potential profile from source to channel and (2) the elliptic curvature of the complex bands in the bandgap region. It is proposed to use 1D Poisson's equations in the source and the channel to describe the potential profile in homojunction TFETs. This makes it possible to quantify the impact of source/drain doping on device performance, an aspect usually ignored in TFET modeling but highly relevant in ultra-scaled devices. The compact model is validated by comparison with state-of-the-art quantum transport simulations using a 3D full band atomistic approach based on non-equilibrium Green's functions. It is shown that the model reproduces with good accuracy the data obtained from the simulations in all regions of operation: the on/off states and the n/p branches of conduction. This approach allows calculation of energy-dependent band-to-band tunneling currents in TFETs, a feature that yields deep insight into the underlying device physics. The simplicity and accuracy of the approach provide a powerful tool to explore in a quantitative manner how a wide variety of parameters (material-, size-, and/or geometry-dependent) impact the TFET performance under any bias conditions. The proposed model thus presents a practical complement to computationally expensive simulations such as the 3D NEGF approach.
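
The elliptic complex-band idea can be illustrated with a textbook WKB estimate of the band-to-band tunneling probability under a uniform junction field. This is not the paper's compact model: the device parameters below are hypothetical, and the sketch only compares a numerical integral of the elliptic imaginary dispersion against the Kane-type closed form it implies:

```python
import math

# Physical constants (SI)
hbar = 1.0546e-34   # J*s
q = 1.602e-19       # C
m0 = 9.109e-31      # kg

# Hypothetical device parameters
m_eff = 0.1 * m0    # tunneling effective mass
Eg = 0.5 * q        # bandgap (0.5 eV), in joules
F = 1.0e8           # uniform junction field, V/m

def kappa(eps):
    """Imaginary wavevector inside the gap, elliptic two-band form."""
    return math.sqrt(2 * m_eff) / hbar * math.sqrt(eps * (Eg - eps) / Eg)

# WKB: T = exp(-2 * integral of kappa dx); uniform field gives dx = d(eps)/(q*F).
n = 10000
de = Eg / n
integral = sum(kappa((i + 0.5) * de) for i in range(n)) * de / (q * F)
T_numeric = math.exp(-2 * integral)

# Kane-type closed form from the same elliptic dispersion.
T_analytic = math.exp(-math.pi * math.sqrt(m_eff) * Eg**1.5
                      / (2 * math.sqrt(2) * hbar * q * F))
print(T_numeric, T_analytic)  # the two agree closely
```

The exponent's strong dependence on field and bandgap is what makes the source-side potential profile (and hence doping) so important in ultra-scaled TFETs.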

  4. Multi-Bunch Simulations of the ILC for Luminosity Performance Studies

    CERN Document Server

    White, Glen; Walker, Nicholas J

    2005-01-01

    To study the luminosity performance of the International Linear Collider (ILC) with different design parameters, a simulation was constructed that tracks a multi-bunch representation of the beam from the Damping Ring extraction through to the Interaction Point. The simulation code PLACET is used to simulate the LINAC, MatMerlin is used to track through the Beam Delivery System and GUINEA-PIG for the beam-beam interaction. Included in the simulation are ground motion and wakefield effects, intra-train fast feedback and luminosity-based feedback systems. To efficiently study multiple parameters/multiple seeds, the simulation is deployed on the Queen Mary High-Throughput computing cluster at Queen Mary, University of London, where 100 simultaneous simulation seeds can be run.

  5. INEX simulations of the optical performance of the AFEL

    International Nuclear Information System (INIS)

    Goldstein, J.C.; Wang, T.S.F.; Sheffield, R.L.

    1991-01-01

    The AFEL (Advanced Free-Electron Laser) Project at Los Alamos National Laboratory is presently under construction. The project's goal is to produce a very high-brightness electron beam which will be generated by a photocathode injector and a 20 MeV rf-linac. Initial laser experiments will be performed with a 1-cm-period permanent magnet wiggler which will generate intense optical radiation near a wavelength of 3.7 μm. Future experiments will operate with ''slotted-tube'' electromagnetic wigglers (formerly called ''pulsed-wire'' wigglers). Experiments at both fundamental and higher-harmonic wavelengths are planned. This paper presents results of INEX (Integrated Numerical EXperiment) simulations of the optical performance of the AFEL. These simulations use the electron micropulse produced by the accelerator/beam transport code PARMELA in the 3-D FEL simulation code FELEX. 9 refs., 4 figs., 6 tabs

  6. A High-Fidelity Batch Simulation Environment for Integrated Batch and Piloted Air Combat Simulation Analysis

    Science.gov (United States)

    Goodrich, Kenneth H.; McManus, John W.; Chappell, Alan R.

    1992-01-01

    A batch air combat simulation environment known as the Tactical Maneuvering Simulator (TMS) is presented. The TMS serves as a tool for developing and evaluating tactical maneuvering logics. The environment can also be used to evaluate the tactical implications of perturbations to aircraft performance or supporting systems. The TMS is capable of simulating air combat between any number of engagement participants, with practical limits imposed by computer memory and processing power. Aircraft are modeled using equations of motion, control laws, aerodynamics and propulsive characteristics equivalent to those used in high-fidelity piloted simulation. Databases representative of a modern high-performance aircraft with and without thrust-vectoring capability are included. To simplify the task of developing and implementing maneuvering logics in the TMS, an outer-loop control system known as the Tactical Autopilot (TA) is implemented in the aircraft simulation model. The TA converts guidance commands issued by computerized maneuvering logics in the form of desired angle-of-attack and wind axis-bank angle into inputs to the inner-loop control augmentation system of the aircraft. This report describes the capabilities and operation of the TMS.

  7. LIAR -- A computer program for the modeling and simulation of high performance linacs

    International Nuclear Information System (INIS)

    Assmann, R.; Adolphsen, C.; Bane, K.; Emma, P.; Raubenheimer, T.; Siemann, R.; Thompson, K.; Zimmermann, F.

    1997-04-01

    The computer program LIAR (LInear Accelerator Research Code) is a numerical modeling and simulation tool for high performance linacs. Among other applications, it addresses the needs of state-of-the-art linear colliders where low emittance, high-intensity beams must be accelerated to energies in the 0.05-1 TeV range. LIAR is designed to be used for a variety of different projects. LIAR allows the study of single- and multi-particle beam dynamics in linear accelerators. It calculates emittance dilutions due to wakefield deflections, linear and non-linear dispersion and chromatic effects in the presence of multiple accelerator imperfections. Both single-bunch and multi-bunch beams can be simulated. Several basic and advanced optimization schemes are implemented. Present limitations arise from the incomplete treatment of bending magnets and sextupoles. A major objective of the LIAR project is to provide an open programming platform for the accelerator physics community. Due to its design, LIAR allows straightforward access to its internal FORTRAN data structures. The program can easily be extended and its interactive command language ensures maximum ease of use. Presently, versions of LIAR are compiled for UNIX and MS Windows operating systems. An interface for the graphical visualization of results is provided. Scientific graphs can be saved in the PS and EPS file formats. In addition a Mathematica interface has been developed. LIAR now contains more than 40,000 lines of source code in more than 130 subroutines. This report describes the theoretical basis of the program, provides a reference for existing features and explains how to add further commands. The LIAR home page and the ONLINE version of this manual can be accessed under: http://www.slac.stanford.edu/grp/arb/rwa/liar.htm

  8. Burnout among pilots: psychosocial factors related to happiness and performance at simulator training.

    Science.gov (United States)

    Demerouti, Evangelia; Veldhuis, Wouter; Coombes, Claire; Hunter, Rob

    2018-06-18

    In this study among airline pilots, we aim to uncover the work characteristics (job demands and resources) and the outcomes (job crafting, happiness and simulator training performance) that are related to burnout for this occupational group. Using a large sample of airline pilots, we showed that 40% of the participating pilots experience high burnout. In line with Job Demands-Resources theory, job demands were detrimental for simulator training performance because they made pilots more exhausted and less able to craft their job, whereas job resources had a favourable effect because they reduced feelings of disengagement and increased job crafting. Moreover, burnout was negatively related to pilots' happiness with life. These findings highlight the importance of psychosocial factors and health for valuable outcomes for both pilots and airlines. Practitioner Summary: Using an online survey among the members of a European pilots' professional association, we examined the relationship between psychosocial factors (work characteristics, burnout) and outcomes (simulator training performance, happiness). Forty per cent of the participating pilots experience high burnout. Job demands were detrimental, whereas job resources were favourable for simulator training performance/happiness. Twitter text: 40% of airline pilots experience burnout and psychosocial work factors and burnout relate to performance at pilots' simulator training.

  9. Comparison between the performance of some KEK-klystrons and simulation results

    Energy Technology Data Exchange (ETDEWEB)

    Fukuda, Shigeki [National Lab. for High Energy Physics, Tsukuba, Ibaraki (Japan)

    1997-04-01

    Recent developments of various klystron simulation codes have enabled us to realistically design klystrons. This paper presents various simulation results using the FCI code and the performances of tubes manufactured based on this code. Upgrading a 30-MW S-band klystron and developing a 50-MW S-band klystron for the KEKB projects are successful examples based on FCI-code predictions. Mass-production of these tubes has already started. On the other hand, a discrepancy has been found between the FCI simulation results and the performance of real tubes. In some cases, the simulation results lead to high-efficiency results, while manufactured tubes show the usual value, or a lower value, of the efficiency. One possible cause may come from a data mismatch between the electron-gun simulation and the input data set of the FCI code for the gun region. This kind of discrepancy has been observed in 30-MW S-band pulsed tubes, sub-booster pulsed tubes and L-band high-duty pulsed klystrons. Sometimes, JPNDSK (a one-dimensional disk-model code) gives similar results. Some examples using the FCI code are given in this article. An Arsenal-MSU code could be applied to the 50-MW klystron under collaboration with Moscow State University; a good agreement has been found between the prediction of the code and performance. (author)

  10. Quantum Accelerators for High-performance Computing Systems

    Energy Technology Data Exchange (ETDEWEB)

    Humble, Travis S. [ORNL; Britt, Keith A. [ORNL; Mohiyaddin, Fahd A. [ORNL

    2017-11-01

    We define some of the programming and system-level challenges facing the application of quantum processing to high-performance computing. Alongside barriers to physical integration, prominent differences in the execution of quantum and conventional programs challenge the intersection of these computational models. Following a brief overview of the state of the art, we discuss recent advances in programming and execution models for hybrid quantum-classical computing. We discuss a novel quantum-accelerator framework that uses specialized kernels to offload select workloads while integrating with existing computing infrastructure. We elaborate on the role of the host operating system to manage these unique accelerator resources, the prospects for deploying quantum modules, and the requirements placed on the language hierarchy connecting these different system components. We draw on recent advances in the modeling and simulation of quantum computing systems with the development of architectures for hybrid high-performance computing systems and the realization of software stacks for controlling quantum devices. Finally, we present simulation results that describe the expected system-level behavior of high-performance computing systems composed from compute nodes with quantum processing units. We describe performance for these hybrid systems in terms of time-to-solution, accuracy, and energy consumption, and we use simple application examples to estimate the performance advantage of quantum acceleration.
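
The offload model described, where the host treats a quantum kernel like any other accelerator call, can be sketched with a tiny statevector simulation standing in for the quantum processing unit. This is an illustration of the programming pattern, not the ORNL framework; all names here are hypothetical:

```python
import math

def h_gate(state, qubit, n):
    """Apply a Hadamard to `qubit` of an n-qubit statevector (list of amplitudes)."""
    s = 1 / math.sqrt(2)
    out = list(state)
    for i in range(len(state)):
        if not (i >> qubit) & 1:       # visit each amplitude pair once
            j = i | (1 << qubit)
            a, b = state[i], state[j]
            out[i], out[j] = s * (a + b), s * (a - b)
    return out

def qpu_kernel(n=1):
    """The 'quantum kernel' the host offloads: prepare |0...0>, apply H, return probabilities."""
    state = [1.0] + [0.0] * (2 ** n - 1)
    state = h_gate(state, 0, n)
    return [abs(a) ** 2 for a in state]

# Host-side code invokes the kernel exactly as it would a GPU or FPGA offload.
probs = qpu_kernel()
print(probs)  # ~[0.5, 0.5]: an equal superposition measured in the computational basis
```

In a real hybrid node, the host operating system would schedule such kernels onto the QPU and marshal results back, which is the resource-management role the abstract discusses.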

  11. Undergraduate nursing students' performance in recognising and responding to sudden patient deterioration in high psychological fidelity simulated environments: an Australian multi-centre study.

    Science.gov (United States)

    Bogossian, Fiona; Cooper, Simon; Cant, Robyn; Beauchamp, Alison; Porter, Joanne; Kain, Victoria; Bucknall, Tracey; Phillips, Nicole M

    2014-05-01

    Early recognition and situation awareness of sudden patient deterioration, a timely appropriate clinical response, and teamwork are critical to patient outcomes. High fidelity simulated environments provide the opportunity for undergraduate nursing students to develop and refine recognition and response skills. This paper reports the quantitative findings of the first phase of a larger program of ongoing research: Feedback Incorporating Review and Simulation Techniques to Act on Clinical Trends (FIRST2ACT™). It specifically aims to identify the characteristics that may predict primary outcome measures of clinical performance, teamwork and situation awareness in the management of deteriorating patients. Mixed-method multi-centre study. High fidelity simulated acute clinical environment in three Australian universities. A convenience sample of 97 final year nursing students enrolled in an undergraduate Bachelor of Nursing or combined Bachelor of Nursing degree were included in the study. In groups of three, participants proceeded through three phases: (i) pre-briefing and completion of a multi-choice question test, (ii) three video-recorded simulated clinical scenarios where actors substituted real patients with deteriorating conditions, and (iii) post-scenario debriefing. Clinical performance, teamwork and situation awareness were evaluated, using a validated standard checklist (OSCE), Team Emergency Assessment Measure (TEAM) score sheet and Situation Awareness Global Assessment Technique (SAGAT). A Modified Angoff technique was used to establish cut points for clinical performance. Student teams engaged in 97 simulation experiences across the three scenarios and achieved a level of clinical performance consistent with the experts' identified pass level point in only 9 (1%) of the simulation experiences. Knowledge was significantly associated with overall teamwork (p=.034), overall situation awareness (p=.05) and clinical performance in two of the three scenarios

  12. libRoadRunner: a high performance SBML simulation and analysis library.

    Science.gov (United States)

    Somogyi, Endre T; Bouteiller, Jean-Marie; Glazier, James A; König, Matthias; Medley, J Kyle; Swat, Maciej H; Sauro, Herbert M

    2015-10-15

    This article presents libRoadRunner, an extensible, high-performance, cross-platform, open-source software library for the simulation and analysis of models expressed using the Systems Biology Markup Language (SBML). SBML is the most widely used standard for representing dynamic networks, especially biochemical networks. libRoadRunner is fast enough to support large-scale problems such as tissue models, studies that require large numbers of repeated runs and interactive simulations. libRoadRunner is a self-contained library, able to run both as a component inside other tools via its C++ and C bindings, and interactively through its Python interface. Its Python Application Programming Interface (API) is similar to the APIs of MATLAB (www.mathworks.com) and SciPy (http://www.scipy.org/), making it fast and easy to learn. libRoadRunner uses a custom Just-In-Time (JIT) compiler built on the widely used LLVM JIT compiler framework. It compiles SBML-specified models directly into native machine code for a variety of processors, making it appropriate for solving extremely large models or repeated runs. libRoadRunner is flexible, supporting the bulk of the SBML specification (except for delay and non-linear algebraic equations) including several SBML extensions (composition and distributions). It offers multiple deterministic and stochastic integrators, as well as tools for steady-state analysis, stability analysis and structural analysis of the stoichiometric matrix. libRoadRunner binary distributions are available for Mac OS X, Linux and Windows. The library is licensed under Apache License Version 2.0. libRoadRunner is also available for ARM-based computers such as the Raspberry Pi. http://www.libroadrunner.org provides online documentation, full build instructions, binaries and a git source repository. Contact: hsauro@u.washington.edu or somogyie@indiana.edu. Supplementary data are available at Bioinformatics online. Published by Oxford University Press 2015. 
This work is written
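In Python, a libRoadRunner time course is typically obtained through `RoadRunner.simulate()`; the stdlib sketch below only illustrates the underlying idea, integrating a model's rate equations over time, for a hypothetical mass-action network A → B → C (species, rate constants and step count are invented for illustration; libRoadRunner itself JIT-compiles the rate rules and uses adaptive solvers):

```python
# Sketch of the kind of ODE time course a simulator computes from an
# SBML model. Hypothetical network A -> B -> C with mass-action kinetics,
# integrated with a fixed-step classical Runge-Kutta (RK4) scheme.

def rates(y, k1=0.5, k2=0.3):
    a, b, c = y
    v1, v2 = k1 * a, k2 * b          # reaction rates
    return (-v1, v1 - v2, v2)        # dA/dt, dB/dt, dC/dt

def rk4_step(y, h):
    def add(u, v, s):                # elementwise u + s * v
        return tuple(ui + s * vi for ui, vi in zip(u, v))
    k1 = rates(y)
    k2 = rates(add(y, k1, h / 2))
    k3 = rates(add(y, k2, h / 2))
    k4 = rates(add(y, k3, h))
    return tuple(yi + h / 6 * (a + 2 * b + 2 * c + d)
                 for yi, a, b, c, d in zip(y, k1, k2, k3, k4))

def simulate(y0, t_end=10.0, steps=1000):
    y, h = y0, t_end / steps
    for _ in range(steps):
        y = rk4_step(y, h)
    return y

final = simulate((1.0, 0.0, 0.0))
print(final)  # total mass A + B + C stays 1.0 throughout
```

The real library replaces the hand-written `rates` function with machine code compiled from the SBML model, which is where its speed on large models comes from.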

  13. A high-performance model for shallow-water simulations in distributed and heterogeneous architectures

    Science.gov (United States)

    Conde, Daniel; Canelas, Ricardo B.; Ferreira, Rui M. L.

    2017-04-01

    unstructured nature of the mesh topology, with the corresponding solution, based on space-filling curves, analyzed and discussed. Intra-node parallelism is achieved through OpenMP for CPUs and CUDA for GPUs, depending on the kind of device the process is running on. Here the main difficulty is associated with the Object-Oriented approach, where the presence of complex data structures can degrade model performance considerably. STAV-2D now supports fully distributed and heterogeneous simulations in which multiple different devices can be used to accelerate computation. The advantages, shortcomings and specific solutions of the unified Object-Oriented approach, in which the source code for CPU and GPU shares the same compilation units (with no device-specific branches, as seen in other available models), are discussed and quantified with a thorough scalability and performance analysis. The assembled parallel model is expected to achieve faster-than-real-time simulations at high resolutions (from meters to sub-meter) for large-scale problems (from cities to watersheds), effectively bridging the gap between detailed and timely simulation results. Acknowledgements: This research was partially supported by Portuguese and European funds, within programs COMPETE2020 and PORL-FEDER, through project PTDC/ECM-HID/6387/2014 and Doctoral Grant SFRH/BD/97933/2013 granted by the National Foundation for Science and Technology (FCT). References: Canelas, R.; Murillo, J. & Ferreira, R. M. L. (2013), Two-dimensional depth-averaged modelling of dam-break flows over mobile beds. Journal of Hydraulic Research, 51(4), 392-407. Conde, D. A. S.; Baptista, M. A. V.; Sousa Oliveira, C. & Ferreira, R. M. L. (2013), A shallow-flow model for the propagation of tsunamis over complex geometries and mobile beds, Nat. Hazards and Earth Syst. Sci., 13, 2533-2542. Conde, D. A. S.; Telhado, M. J.; Viana Baptista, M. A. & Ferreira, R. M. L. 
(2015) Severity and exposure associated with tsunami actions in
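The space-filling-curve ordering mentioned for the mesh topology can be illustrated with a Morton (Z-order) curve, one common choice for laying out unstructured-mesh cells so that neighbours in space stay close in memory; whether STAV-2D uses Morton or another curve is not stated in the abstract, so this is a generic sketch:

```python
# Morton (Z-order) index: interleave the bits of the (i, j) cell
# coordinates. Sorting cells by this key keeps spatially close cells
# close in memory, which improves cache behaviour and eases partitioning.

def part1by1(n):
    """Spread the low 16 bits of n so one zero bit separates each bit."""
    n &= 0xFFFF
    n = (n | (n << 8)) & 0x00FF00FF
    n = (n | (n << 4)) & 0x0F0F0F0F
    n = (n | (n << 2)) & 0x33333333
    n = (n | (n << 1)) & 0x55555555
    return n

def morton(i, j):
    return part1by1(i) | (part1by1(j) << 1)

cells = [(i, j) for i in range(4) for j in range(4)]
ordered = sorted(cells, key=lambda c: morton(*c))
print(ordered[:4])  # → [(0, 0), (1, 0), (0, 1), (1, 1)]
```

Chopping the sorted list into contiguous chunks then gives each MPI process a spatially compact set of cells, which keeps halo-exchange traffic low.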

  14. Water desalination price from recent performances: Modelling, simulation and analysis

    International Nuclear Information System (INIS)

    Metaiche, M.; Kettab, A.

    2005-01-01

    The subject of the present article is the technical simulation of seawater desalination by a one-stage reverse-osmosis system. Its objectives are a current valuation of the cost price using new membrane and permeator performances, the use of new means of simulation and modelling of desalination parameters, and identification of the main parameters influencing the cost price. We have taken as the simulation example the seawater desalting centre of Djannet (Boumerdes, Algeria). Present performances allow water desalting at a price of 0.5 $/m³, an attractive and promising price, with a very acceptable product water quality of about 269 ppm. It is advantageous to run reverse-osmosis desalting systems under high pressure, which further decreases the desalting cost and yields good-quality water. A poor choice of operating conditions produces high prices and unacceptable quality; conversely, the price can be decreased by relaxing the requirements on product quality. The seawater temperature affects both cost price and quality, and the installation of large desalting centres contributes to decreasing prices. The calculation involved is very long and tedious, and impossible to conduct without programming and informatics tools; the simulation model has proved highly effective in the design of desalination centres that can perform at much improved prices. (author)

  15. Performance simulation of an absorption heat transformer operating with partially miscible mixtures

    Energy Technology Data Exchange (ETDEWEB)

    Alonso, D.; Cachot, T.; Hornut, J.M. [LSGC-CNRS-ENSIC, Nancy (France); Univ. Henri Poincare, Nancy (France). IUT

    2002-07-08

    This paper studies the thermodynamic performance of a new absorption heat-transformer cycle in which the separation step is obtained by cooling and settling of a partially miscible mixture at low temperature. This new cycle has been called an absorption-demixing heat transformer (ADHT) cycle. A numerical simulation code has been written and used to evaluate the temperature lift and thermal yield of two working pairs. High performance has been obtained both qualitatively and quantitatively, demonstrating the feasibility of, and industrial interest in, such a cycle. Moreover, a comparison of the simulation results with the performance actually obtained on an experimental ADHT has confirmed the pertinence of the simulation code. (author)

  16. Propagation Diagnostic Simulations Using High-Resolution Equatorial Plasma Bubble Simulations

    Science.gov (United States)

    Rino, C. L.; Carrano, C. S.; Yokoyama, T.

    2017-12-01

    In a recent paper, under review, equatorial plasma bubble (EPB) simulations were used to conduct a comparative analysis of EPB spectral characteristics against high-resolution in-situ measurements from the C/NOFS satellite. EPB realizations sampled in planes perpendicular to the magnetic field lines provided well-defined EPB structure at altitudes penetrating both high- and low-density regions. The average C/NOFS structure in highly disturbed regions showed nearly identical two-component inverse-power-law spectral characteristics to the measured EPB structure. This paper describes the results of PWE simulations using the same two-dimensional cross-field EPB realizations. New Irregularity Parameter Estimation (IPE) diagnostics, which are based on two-dimensional equivalent-phase-screen theory [A theory of scintillation for two-component power law irregularity spectra: Overview and numerical results, by Charles Carrano and Charles Rino, DOI: 10.1002/2015RS005903], have been successfully applied to extract two-component inverse-power-law parameters from measured intensity spectra. The EPB simulations [Low and Midlatitude Ionospheric Plasma Density Irregularities and Their Effects on Geomagnetic Field, by Tatsuhiro Yokoyama and Claudia Stolle, DOI: 10.1007/s11214-016-0295-7] have sufficient resolution to populate the structure scales (tens of km to hundreds of meters) that cause strong scintillation at GPS frequencies. The simulations provide an ideal geometry whereby the ramifications of varying structure along the propagation path can be investigated. It is well known that path integration increases the one-dimensional spectral index by one; the relation requires decorrelation along the propagation path. Correlated structure would instead be interpreted as stochastic total electron content (TEC). The simulations are performed with unmodified structure. Because the EPB structure is confined to the central region of the sample planes, edge effects are minimized. 
Consequently
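A two-component inverse-power-law spectrum of the kind extracted by the IPE diagnostics can be sketched as a piecewise power law: a shallow index below a break wavenumber and a steeper index above it. The parameter names (`cs`, `q0`, `p1`, `p2`) and values here are illustrative, not the notation of the cited papers:

```python
import math

# Two-component inverse-power-law irregularity spectrum (sketch):
# slope -p1 below the break wavenumber q0, slope -p2 above it,
# with the two branches matched (continuous) at the break.

def two_component_sdf(q, cs=1.0, q0=1.0, p1=1.5, p2=2.5):
    if q <= q0:
        return cs * q ** (-p1)
    return cs * q0 ** (p2 - p1) * q ** (-p2)   # equals cs * q0**-p1 at q0

def loglog_slope(f, q, eps=1e-6):
    """Numerical log-log slope of f at q."""
    return (math.log(f(q * (1 + eps))) - math.log(f(q))) / math.log(1 + eps)

print(round(loglog_slope(two_component_sdf, 0.1), 3))   # shallow branch, ≈ -1.5
print(round(loglog_slope(two_component_sdf, 10.0), 3))  # steep branch, ≈ -2.5
```

Fitting `p1`, `p2` and `q0` to a measured intensity spectrum is, in outline, what the IPE procedure does through phase-screen theory.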

  17. High Performance Numerical Computing for High Energy Physics: A New Challenge for Big Data Science

    International Nuclear Information System (INIS)

    Pop, Florin

    2014-01-01

    Modern physics is based on both theoretical analysis and experimental validation. Complex scenarios such as subatomic dimensions, high energies, and low absolute temperatures are frontiers for many theoretical models. Simulation with stable numerical methods represents an excellent instrument for high-accuracy analysis, experimental validation, and visualization. High-performance computing makes such simulations possible at large scale and in parallel, but the volume of data generated by these experiments creates a new challenge for Big Data Science. This paper presents existing computational methods for high-energy physics (HEP), analyzed from two perspectives: numerical methods and high-performance computing. The computational methods presented are Monte Carlo methods and simulations of HEP processes, Markovian Monte Carlo, unfolding methods in particle physics, kernel estimation in HEP, and Random Matrix Theory used in the analysis of particle spectra. All of these methods produce data-intensive applications, which introduce new challenges and requirements for ICT systems architecture, programming paradigms, and storage capabilities.
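Among the methods listed, plain Monte Carlo integration is the simplest to sketch; its statistical error falls as 1/sqrt(N) independent of dimension, which is what makes it practical for HEP phase-space integrals. A generic, self-contained example (not code from the paper):

```python
import random, math

# Generic Monte Carlo estimate of an integral: here the integral of
# sin(pi * x) over [0, 1], whose exact value is 2/pi. The estimator's
# standard error shrinks as 1/sqrt(N) regardless of dimension, the
# property that makes MC usable for high-dimensional phase space.

def mc_integrate(f, n, rng):
    total = sum(f(rng.random()) for _ in range(n))
    return total / n

rng = random.Random(42)                      # seeded for reproducibility
estimate = mc_integrate(lambda x: math.sin(math.pi * x), 100_000, rng)
print(estimate)  # close to 2/pi ≈ 0.6366
```

With N = 100,000 samples the standard error is about 0.001, so the estimate lands well within one percent of the exact value.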

  18. Nuclear Energy Advanced Modeling and Simulation (NEAMS) waste Integrated Performance and Safety Codes (IPSC): gap analysis for high fidelity and performance assessment code development

    International Nuclear Information System (INIS)

    Lee, Joon H.; Siegel, Malcolm Dean; Arguello, Jose Guadalupe Jr.; Webb, Stephen Walter; Dewers, Thomas A.; Mariner, Paul E.; Edwards, Harold Carter; Fuller, Timothy J.; Freeze, Geoffrey A.; Jove-Colon, Carlos F.; Wang, Yifeng

    2011-01-01

    This report describes a gap analysis performed in the process of developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with rigorous verification, validation, and software quality requirements. The gap analyses documented in this report were performed during an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC, and during follow-on activities that delved into more detailed assessments of the various codes that were acquired, studied, and tested. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and to develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. The gap analysis indicates that significant capabilities may already exist in existing THC codes, although no single code is able to fully account for all physical and chemical processes involved in a waste disposal system. Large gaps exist in modeling chemical processes and their couplings with other processes. The coupling of chemical processes with flow transport and mechanical deformation remains challenging. The data for extreme environments (e.g., for elevated temperature and high ionic strength media) that are

  19. Nuclear Energy Advanced Modeling and Simulation (NEAMS) waste Integrated Performance and Safety Codes (IPSC) : gap analysis for high fidelity and performance assessment code development.

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Joon H.; Siegel, Malcolm Dean; Arguello, Jose Guadalupe, Jr.; Webb, Stephen Walter; Dewers, Thomas A.; Mariner, Paul E.; Edwards, Harold Carter; Fuller, Timothy J.; Freeze, Geoffrey A.; Jove-Colon, Carlos F.; Wang, Yifeng

    2011-03-01

    This report describes a gap analysis performed in the process of developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with rigorous verification, validation, and software quality requirements. The gap analyses documented in this report were performed during an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC, and during follow-on activities that delved into more detailed assessments of the various codes that were acquired, studied, and tested. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and to develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. The gap analysis indicates that significant capabilities may already exist in existing THC codes, although no single code is able to fully account for all physical and chemical processes involved in a waste disposal system. Large gaps exist in modeling chemical processes and their couplings with other processes. The coupling of chemical processes with flow transport and mechanical deformation remains challenging. The data for extreme environments (e.g., for elevated temperature and high ionic strength media) that are

  20. High Performance Computing in Science and Engineering '16 : Transactions of the High Performance Computing Center, Stuttgart (HLRS) 2016

    CERN Document Server

    Kröner, Dietmar; Resch, Michael

    2016-01-01

    This book presents the state-of-the-art in supercomputer simulation. It includes the latest findings from leading researchers using systems from the High Performance Computing Center Stuttgart (HLRS) in 2016. The reports cover all fields of computational science and engineering ranging from CFD to computational physics and from chemistry to computer science with a special emphasis on industrially relevant applications. Presenting findings of one of Europe’s leading systems, this volume covers a wide variety of applications that deliver a high level of sustained performance. The book covers the main methods in high-performance computing. Its outstanding results in achieving the best performance for production codes are of particular interest for both scientists and engineers. The book comes with a wealth of color illustrations and tables of results.

  1. High-performance modeling of CO2 sequestration by coupling reservoir simulation and molecular dynamics

    KAUST Repository

    Bao, Kai; Yan, Mi; Lu, Ligang; Allen, Rebecca; Salam, Amgad; Jordan, Kirk E.; Sun, Shuyu

    2013-01-01

    multicomponent compositional flow simulation to handle more complicated physical processes in the future. Accuracy and scalability analyses are performed on an IBM BlueGene/P and on an IBM BlueGene/Q, the latest IBM supercomputer. Results show good accuracy of our

  2. Simulation and design of omni-directional high speed multibeam transmitter system

    Science.gov (United States)

    Tang, Jaw-Luen; Jui, Ping-Chang; Wang, Sun-Chen

    2006-09-01

    For future high-speed indoor wireless communication, diffuse wireless optical links are more robust against shadowing than line-of-sight links. However, their performance may be degraded by multipath dispersion resulting from surface reflections. We have developed a multipath diffusive propagation model capable of providing channel impulse response data, intended for the design and simulation of any multi-beam transmitter under a variety of indoor environments. In this paper, a multi-beam transmitter system with a semi-sphere structure is proposed to combat the adverse effects of multipath distortion, albeit at the cost of increased laser power and cost. Simulation of multiple impulse responses showed that this type of multi-beam transmitter can significantly improve BER performance for high-bit-rate applications. We present the performance and simulation results for both line-of-sight and diffuse link configurations.
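A multipath propagation model of the kind described accumulates the arrival delay and received power of each ray into a channel impulse response. The sketch below is heavily simplified, with one line-of-sight path and a few single-bounce ceiling reflections; the geometry and reflectivity value are invented, and a real model integrates over Lambertian reflectors and higher-order bounces:

```python
import math

C = 3.0e8  # speed of light, m/s

# Toy channel impulse response: one line-of-sight (LOS) path plus
# first-order reflections off discrete ceiling points. Each tap is a
# (delay_s, relative_power) pair; power uses a bare 1/d^2 falloff.

def los_tap(tx, rx):
    d = math.dist(tx, rx)
    return d / C, 1.0 / d ** 2

def impulse_response(tx, rx, ceiling_points, rho=0.6):
    taps = [los_tap(tx, rx)]            # direct path
    for p in ceiling_points:            # single-bounce paths tx -> p -> rx
        d = math.dist(tx, p) + math.dist(p, rx)
        taps.append((d / C, rho / d ** 2))
    return sorted(taps)                 # ordered by arrival time

tx, rx = (0.0, 0.0, 1.0), (4.0, 0.0, 1.0)
ceiling = [(x, 0.0, 3.0) for x in (1.0, 2.0, 3.0)]
h = impulse_response(tx, rx, ceiling)
```

The spread between the first and last taps is the delay dispersion that limits the achievable bit rate; a multi-beam transmitter reduces it by concentrating power into short, direct-like paths.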

  3. Simulant Basis for the Standard High Solids Vessel Design

    Energy Technology Data Exchange (ETDEWEB)

    Peterson, Reid A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Fiskum, Sandra K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Suffield, Sarah R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Daniel, Richard C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Gauglitz, Phillip A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wells, Beric E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2017-09-30

    The Waste Treatment and Immobilization Plant (WTP) is working to develop a Standard High Solids Vessel Design (SHSVD) process vessel. To support testing of this new design, WTP engineering staff requested that a Newtonian simulant and a non-Newtonian simulant be developed that would represent the Most Adverse Design Conditions (in development) with respect to mixing performance as specified by WTP. The majority of the simulant requirements are specified in 24590-PTF-RPT-PE-16-001, Rev. 0. The first step in this process is to develop the basis for these simulants. This document describes the basis for the properties of these two simulant types. The simulant recipes that meet this basis will be provided in a subsequent document.

  4. Simulated Performances of a Very High Energy Tomograph for Non-Destructive Characterization of large objects

    Science.gov (United States)

    Kistler, Marc; Estre, Nicolas; Merle, Elsa

    2018-01-01

    As part of its R&D activities on high-energy X-ray imaging for non-destructive characterization, the Nuclear Measurement Laboratory has started an upgrade of its imaging system currently implemented at the CEA-Cadarache center. The goals are to achieve a sub-millimeter spatial resolution and the ability to perform tomographies of very large objects (more than 100 cm of standard concrete or 40 cm of steel). This paper presents results on the detection part of the imaging system. The upgrade of the detection part requires a thorough study of the performance of two detectors: a series of CdTe semiconductor sensors and two arrays of segmented CdWO4 scintillators with different pixel sizes. The study consists of a Quantum Accounting Diagram (QAD) analysis coupled with Monte Carlo simulations. The scintillator arrays are able to detect millimeter details through 140 cm of concrete but are limited to 120 cm for smaller details. The CdTe sensors have lower but more stable performance, with a 0.5 mm resolution through 90 cm of concrete. The choice of detector then depends on the preferred characteristic: the spatial resolution or the use on large volumes. Combining the features of the source with the detector studies gives the expected performance of the whole system in terms of signal-to-noise ratio (SNR), spatial resolution and acquisition time.

  5. High performance in software development

    CERN Multimedia

    CERN. Geneva; Haapio, Petri; Liukkonen, Juha-Matti

    2015-01-01

    What are the ingredients of high-performing software? Software development, especially for large high-performance systems, is one of the most complex tasks mankind has ever attempted. Technological change leads to huge opportunities but challenges our old ways of working. Processing large data sets, possibly in real time or under other tight computational constraints, requires an efficient solution architecture. Efficiency requirements span from distributed storage and the large-scale organization of computation and data down to the lowest level of processor and data-bus behavior. Integrating performance behavior across these levels is especially important when the computation is resource-bounded, as it is in numerics: physical simulation, machine learning, estimation of statistical models, etc. For example, memory locality and utilization of vector processing are essential for harnessing the computing power of modern processor architectures, due to the deep memory hierarchies of modern general-purpose computers. As a r...

  6. Building performance simulation for sustainable buildings

    NARCIS (Netherlands)

    Hensen, J.L.M.

    2010-01-01

    This paper aims to provide a general view of the background and current state of building performance simulation, which has the potential to deliver, directly or indirectly, substantial benefits to building stakeholders and to the environment. However the building simulation community faces many

  7. Validation of High-resolution Climate Simulations over Northern Europe.

    Science.gov (United States)

    Muna, R. A.

    2005-12-01

    Two AMIP2-type (Gates 1992) experiments have been performed with climate versions of the ARPEGE/IFS model for the North Atlantic, northern Europe and the Norwegian region, to analyze the effect of increasing resolution on the simulated biases. The ECMWF reanalysis (ERA-15) has been used to validate the simulations. Each simulation is an integration over the period 1979 to 1996, using observed monthly mean sea surface temperatures (SST) as the lower boundary condition. All aspects but the horizontal resolution are identical in the two simulations. The first simulation has a uniform horizontal resolution of T63. The second has a variable resolution (T106c3) with the highest resolution over the Norwegian Sea. Both simulations have 31 vertical layers at the same locations. For each simulation the results were divided into two seasons: winter (DJF) and summer (JJA). The parameters investigated were mean sea level pressure, and geopotential and temperature at 850 hPa and 500 hPa. To find the causes of the summer temperature bias, latent and sensible heat flux, total cloud cover and total precipitation were analyzed. The high-resolution simulation exhibits a broadly realistic climate over the Nordic, Arctic and European regions. Overall, essentially all fields investigated improve with increasing resolution over the target area, both in winter (DJF) and summer (JJA).
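The validation described, comparing simulated seasonal-mean fields against ERA-15, reduces to computing bias and RMSE over the target region; a toy sketch with invented values (not ERA-15 data):

```python
import math

# Toy bias / RMSE validation of a simulated field against a reanalysis
# field sampled on the same grid points. Values are illustrative.

def bias(sim, ref):
    return sum(s - r for s, r in zip(sim, ref)) / len(sim)

def rmse(sim, ref):
    return math.sqrt(sum((s - r) ** 2 for s, r in zip(sim, ref)) / len(sim))

sim = [271.2, 272.8, 270.1, 273.5]   # simulated T850 (K), hypothetical
ref = [271.0, 272.0, 270.5, 273.0]   # reanalysis T850 (K), hypothetical
print(round(bias(sim, ref), 3), round(rmse(sim, ref), 3))  # → 0.275 0.522
```

In practice both fields would be seasonal means on a common grid, and the sums would be area-weighted by grid-cell size.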

  8. Damaris: Addressing performance variability in data management for post-petascale simulations

    International Nuclear Information System (INIS)

    Dorier, Matthieu; Antoniu, Gabriel; Cappello, Franck; Snir, Marc; Sisneros, Robert

    2016-01-01

    With exascale computing on the horizon, reducing performance variability in data management tasks (storage, visualization, analysis, etc.) is becoming a key challenge in sustaining high performance. Here, this variability significantly impacts the overall application performance at scale and its predictability over time. In this article, we present Damaris, a system that leverages dedicated cores in multicore nodes to offload data management tasks, including I/O, data compression, scheduling of data movements, in situ analysis, and visualization. We evaluate Damaris with the CM1 atmospheric simulation and the Nek5000 computational fluid dynamic simulation on four platforms, including NICS’s Kraken and NCSA’s Blue Waters. Our results show that (1) Damaris fully hides the I/O variability as well as all I/O-related costs, thus making simulation performance predictable; (2) it increases the sustained write throughput by a factor of up to 15 compared with standard I/O approaches; (3) it allows almost perfect scalability of the simulation up to over 9,000 cores, as opposed to state-of-the-art approaches that fail to scale; and (4) it enables a seamless connection to the VisIt visualization software to perform in situ analysis and visualization in a way that impacts neither the performance of the simulation nor its variability. In addition, we extended our implementation of Damaris to also support the use of dedicated nodes and conducted a thorough comparison of the two approaches—dedicated cores and dedicated nodes—for I/O tasks with the aforementioned applications.
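The dedicated-core idea behind Damaris can be mimicked in miniature with a queue and a dedicated writer thread, so the compute loop never blocks on I/O. This is a toy single-process sketch, whereas Damaris itself dedicates whole cores or nodes inside an MPI job:

```python
import queue, threading

# Toy version of the Damaris pattern: the compute loop hands each
# snapshot to a dedicated writer via a queue and continues immediately,
# so I/O cost and its variability are hidden from the simulation loop.

out = []                      # stand-in for a file or visualization sink
q = queue.Queue()

def writer():
    while True:
        item = q.get()
        if item is None:      # sentinel: shut down the writer
            break
        out.append(item)      # "write" the snapshot

t = threading.Thread(target=writer)
t.start()

for step in range(5):         # simulation loop: enqueue and move on
    snapshot = {"step": step, "field": [step] * 3}
    q.put(snapshot)

q.put(None)                   # signal completion
t.join()
print(len(out))  # → 5
```

The queue decouples the producer's timing from the writer's, which is exactly the predictability effect the article measures at scale.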

  9. The effectiveness of and satisfaction with high-fidelity simulation to teach cardiac surgical resuscitation skills to nurses.

    Science.gov (United States)

    McRae, Marion E; Chan, Alice; Hulett, Renee; Lee, Ai Jin; Coleman, Bernice

    2017-06-01

    There are few reports of the effectiveness of, or satisfaction with, simulation for learning cardiac surgical resuscitation skills. To test the effect of simulation on nurses' self-confidence in performing cardiac surgical resuscitation and their satisfaction with the simulation experience. A convenience sample of sixty nurses rated their self-confidence to perform cardiac surgical resuscitation skills before and after two simulations. Simulation performance was assessed. Subjects completed the Satisfaction with Simulation Experience scale and demographics. Self-confidence scores to perform all cardiac surgical skills, as measured by paired t-tests, were significantly increased after the simulation (d=-0.50 to 1.78). Self-confidence and cardiac surgical work experience were not correlated with time to performance. Total satisfaction scores were high (mean 80.2, SD 1.06), indicating satisfaction with the simulation. There was no correlation of the satisfaction scores with cardiac surgical work experience (τ=-0.05, ns). Self-confidence scores to perform cardiac surgical resuscitation procedures were higher after the simulation. Nurses were highly satisfied with the simulation experience. Copyright © 2016 Elsevier Ltd. All rights reserved.
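The paired t-test used for the self-confidence scores compares each subject's post-simulation rating with their own pre-simulation rating; a stdlib sketch of the statistic on hypothetical scores (not the study's data):

```python
import math

# Paired t statistic: t = mean(d) / (sd(d) / sqrt(n)), where d is the
# per-subject post-minus-pre difference, with n - 1 degrees of freedom.
# The score lists below are hypothetical, not data from the study.

def paired_t(pre, post):
    d = [b - a for a, b in zip(pre, post)]
    n = len(d)
    mean = sum(d) / n
    var = sum((x - mean) ** 2 for x in d) / (n - 1)   # sample variance
    return mean / math.sqrt(var / n)

pre  = [3.0, 2.5, 4.0, 3.5, 2.0, 3.0]
post = [4.0, 3.5, 4.5, 4.0, 3.5, 4.0]
print(round(paired_t(pre, post), 3))  # → 5.966
```

Pairing removes between-subject variability, which is why it is the natural design for pre/post self-confidence ratings.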

  10. Experimental Investigation and High Resolution Simulation of In-Situ Combustion Processes

    Energy Technology Data Exchange (ETDEWEB)

    Margot Gerritsen; Tony Kovscek

    2008-04-30

    This final technical report describes work performed for the project 'Experimental Investigation and High Resolution Numerical Simulator of In-Situ Combustion Processes', DE-FC26-03NT15405. In summary, this work improved our understanding of in-situ combustion (ISC) process physics and oil recovery. This understanding was translated into improved conceptual models and a suite of software algorithms that extended predictive capabilities. We pursued experimental, theoretical, and numerical tasks during the performance period. The specific project objectives were (i) experimental identification of chemical additives/injectants that improve combustion performance, and delineation of the physics of the improved performance, (ii) establishment of a benchmark one-dimensional experimental data set for verification of in-situ combustion dynamics computed by simulators, (iii) development of improved numerical methods that describe in-situ combustion more accurately, and (iv) laying the underpinnings of a highly efficient, 3D, in-situ combustion simulator using adaptive mesh refinement techniques and parallelization. We believe that project goals were met and exceeded, as discussed.

  11. Impact of High-Fidelity Simulation and Pharmacist-Specific Didactic Lectures in Addition to ACLS Provider Certification on Pharmacy Resident ACLS Performance.

    Science.gov (United States)

    Bartel, Billie J

    2014-08-01

    This pilot study explored the use of multidisciplinary high-fidelity simulation and additional pharmacist-focused training methods in training postgraduate year 1 (PGY1) pharmacy residents to provide Advanced Cardiovascular Life Support (ACLS) care. Pharmacy resident confidence and comfort level were assessed after completing these training requirements. The ACLS training requirements for pharmacy residents were revised to include didactic instruction on ACLS pharmacology and rhythm recognition and participation in multidisciplinary high-fidelity simulation ACLS experiences in addition to ACLS provider certification. Surveys were administered to participating residents to assess the impact of this additional education on resident confidence and comfort level in cardiopulmonary arrest situations. The new ACLS didactic and simulation training requirements resulted in increased resident confidence and comfort level in all assessed functions. Residents felt more confident in all areas except providing recommendations for dosing and administration of medications and rhythm recognition after completing the simulation scenarios than with ACLS certification training and the didactic components alone. All residents felt the addition of lectures and simulation experiences better prepared them to function as a pharmacist in the ACLS team. Additional ACLS training requirements for pharmacy residents increased overall awareness of pharmacist roles and responsibilities and greatly improved resident confidence and comfort level in performing most essential pharmacist functions during ACLS situations. © The Author(s) 2013.

  12. SEAscan 3.5: A simulator performance analyzer

    International Nuclear Information System (INIS)

    Dennis, T.; Eisenmann, S.

    1990-01-01

    SEAscan 3.5 is a personal-computer-based tool developed to analyze the dynamic performance of nuclear power plant training simulators, with integrated features providing a user-oriented interface to its analysis functions. In this paper, the program is described as a tool for the analysis of training simulator performance, and the structure and operating characteristics of SEAscan 3.5 are presented. Hardcopy documents are shown to aid in verification of conformance to ANSI/ANS-3.5-1985

  13. Evaluating the performance of coupled snow-soil models in SURFEXv8 to simulate the permafrost thermal regime at a high Arctic site

    Science.gov (United States)

    Barrere, Mathieu; Domine, Florent; Decharme, Bertrand; Morin, Samuel; Vionnet, Vincent; Lafaysse, Matthieu

    2017-09-01

    Climate change projections still suffer from a limited representation of the permafrost-carbon feedback. Predicting the response of permafrost temperature to climate change requires accurate simulations of Arctic snow and soil properties. This study assesses the capacity of the coupled land surface and snow models ISBA-Crocus and ISBA-ES to simulate snow and soil properties at Bylot Island, a high Arctic site. Field measurements complemented with ERA-Interim reanalyses were used to drive the models and to evaluate simulation outputs. Snow height, density, temperature, thermal conductivity and thermal insulance are examined to determine the critical variables involved in the soil and snow thermal regime. Simulated soil properties are compared to measurements of thermal conductivity, temperature and water content. The simulated snow density profiles are unrealistic, which is most likely caused by the lack of representation in snow models of the upward water vapor fluxes generated by the strong temperature gradients within the snowpack. The resulting vertical profiles of thermal conductivity are inverted compared to observations, with high simulated values at the bottom of the snowpack. Still, ISBA-Crocus manages to successfully simulate the soil temperature in winter. Results are satisfactory in summer, but the temperature of the top soil could be better reproduced by adequately representing surface organic layers, i.e., mosses and litter, and in particular their water retention capacity. Transition periods (soil freezing and thawing) are the least well reproduced because the high basal snow thermal conductivity induces an excessively rapid heat transfer between the soil and the snow in simulations. Hence, global climate models should carefully consider Arctic snow thermal properties, and especially the thermal conductivity of the basal snow layer, to perform accurate predictions of the permafrost evolution under climate change.
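
    The thermal insulance examined above is the layer-summed ratio of thickness to thermal conductivity. A quick sketch with hypothetical layer values (not the Bylot Island measurements) shows why a low-conductivity basal layer such as depth hoar dominates the snowpack's insulating effect:

```python
# Thermal insulance of a layered snowpack: R = sum(thickness_i / k_i).
# Layer values below are hypothetical, not the Bylot Island measurements.

layers = [                 # (thickness in m, conductivity in W m^-1 K^-1)
    (0.05, 0.30),          # wind slab near the surface
    (0.10, 0.15),          # rounded grains
    (0.15, 0.04),          # low-conductivity depth hoar at the base
]

insulance = sum(dz / k for dz, k in layers)      # m^2 K W^-1
basal_share = (0.15 / 0.04) / insulance          # fraction from the basal layer
```

    Because the basal layer contributes most of the insulance in this sketch, overestimating its conductivity (as the inverted simulated profiles do) short-circuits the snowpack and produces excessively rapid soil-snow heat transfer.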

  14. Numerical simulations on a high-temperature particle moving in coolant

    International Nuclear Information System (INIS)

    Li Xiaoyan; Shang Zhi; Xu Jijun

    2006-01-01

    This study considers the coupling effect between film boiling heat transfer and evaporation drag around a hot particle in cold liquid. Taking the momentum and energy equations of the vapor film into account, a transient single-particle model under FCI (fuel-coolant interaction) conditions has been established. Numerical simulations of a high-temperature particle moving in coolant have been performed using the Gear algorithm. An adaptive dynamic boundary method is adopted during the simulation to track the moving boundary caused by changes in the vapor film. Based on this method, the transient process of high-temperature particles moving in coolant can be simulated. The experimental results confirm the validity of the HPMC model. (authors)

  15. Manufacturing plant performance evaluation by discrete event simulation

    International Nuclear Information System (INIS)

    Rosli Darmawan; Mohd Rasid Osman; Rosnah Mohd Yusuff; Napsiah Ismail; Zulkiflie Leman

    2002-01-01

    A case study was conducted to evaluate the performance of a manufacturing plant using the discrete event simulation technique. The study was carried out on an animal feed production plant, the Sterifeed plant at the Malaysian Institute for Nuclear Technology Research (MINT), Selangor, Malaysia. The plant was modelled based on the actual manufacturing activities recorded by the operators, and the simulation was carried out using a discrete event simulation software package. The model was validated by comparing the simulation results with the actual operational data of the plant. The simulation results revealed some weaknesses in the current plant design, and proposals were made to improve the plant performance. (Author)
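
    As an illustration of the discrete event approach (a generic single-machine line with assumed exponential times, not the recorded Sterifeed plant data), a simulation advances an event clock through arrival and finish events and accumulates statistics such as machine utilization:

```python
import heapq
import random

def simulate_line(n_jobs, mean_interarrival, mean_service, seed=1):
    """Event-driven simulation of one machine with a FIFO queue.

    Returns (makespan, busy_time, finished); utilization = busy_time / makespan.
    """
    rng = random.Random(seed)
    events, seq, t = [], 0, 0.0
    for job in range(n_jobs):                 # pre-schedule all arrivals
        t += rng.expovariate(1.0 / mean_interarrival)
        heapq.heappush(events, (t, seq, "arrive", job)); seq += 1
    queue, busy, busy_time, now, finished = [], False, 0.0, 0.0, 0
    while events:
        now, _, kind, job = heapq.heappop(events)
        if kind == "arrive":
            queue.append(job)
        else:                                 # "finish" event frees the machine
            busy, finished = False, finished + 1
        if queue and not busy:                # start the next waiting job
            busy = True
            service = rng.expovariate(1.0 / mean_service)
            busy_time += service
            heapq.heappush(events, (now + service, seq, "finish", queue.pop(0)))
            seq += 1
    return now, busy_time, finished

makespan, busy_time, finished = simulate_line(
    200, mean_interarrival=2.0, mean_service=1.0)
```

    Replacing the exponential draws with the distributions fitted from operator records, and adding events for breakdowns or shift changes, is how such a model is grown into a plant-level study.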

  16. SLC positron source: Simulation and performance

    International Nuclear Information System (INIS)

    Pitthan, R.; Braun, H.; Clendenin, J.E.; Ecklund, S.D.; Helm, R.H.; Kulikov, A.V.; Odian, A.C.; Pei, G.X.; Ross, M.C.; Woodley, M.D.

    1991-06-01

    Performance of the source was found to be in good general agreement with computer simulations including S-band acceleration; where it was not, the simulations led to the identification of problems, in particular the underestimated impact of linac misalignments caused by the 1989 Loma Prieta earthquake. 13 refs., 7 figs.

  17. Simulating Performance Risk for Lighting Retrofit Decisions

    Directory of Open Access Journals (Sweden)

    Jia Hu

    2015-05-01

    Full Text Available In building retrofit projects, dynamic simulations are performed to simulate building performance. Uncertainty may negatively affect model calibration and predicted lighting energy savings, which increases the chance of default on performance-based contracts. Therefore, the aim of this paper is to develop a simulation-based method that can analyze lighting performance risk in lighting retrofit decisions. The method uses a surrogate model, which is constructed by adaptively selecting sample points and generating approximation surfaces with fast computing time; the surrogate replaces the computation-intensive simulation process. A statistical method is developed to generate extreme weather profiles based on 20 years of historical weather data. A stochastic occupancy model was created using actual occupancy data to generate realistic occupancy patterns. Energy usage of lighting and of heating, ventilation, and air conditioning (HVAC) is simulated using EnergyPlus. The method can evaluate the influence of different risk factors (e.g., variation of luminaire input wattage, varying weather conditions) on lighting and HVAC energy consumption and lighting electricity demand. Probability distributions are generated to quantify the risk values. A case study was conducted to demonstrate and validate the methods. The surrogate model is a good solution for quantifying the risk factors and the probability distribution of the building performance.
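
    The core idea, replacing an expensive simulation with a cheap approximation that Monte Carlo sampling can then query thousands of times, can be sketched as follows. The linear "expensive" model, the sample points, and the wattage distribution are all invented for illustration; the paper uses EnergyPlus runs and adaptive sample selection.

```python
import bisect
import random

# Hypothetical stand-in for an expensive EnergyPlus run: annual lighting
# energy (kWh) as a function of luminaire wattage, with assumed burn hours.
def expensive_model(wattage):
    burn_hours = 3000.0                     # assumed annual burn hours
    return wattage * burn_hours / 1000.0    # kWh per year

# Step 1: evaluate the costly model at a handful of sample points.
xs = [40.0, 45.0, 50.0, 55.0, 60.0]         # candidate luminaire wattages, W
ys = [expensive_model(x) for x in xs]

# Step 2: cheap surrogate = piecewise-linear interpolation over the samples.
def surrogate(x):
    i = min(max(bisect.bisect_left(xs, x), 1), len(xs) - 1)
    w = (x - xs[i - 1]) / (xs[i] - xs[i - 1])
    return ys[i - 1] + w * (ys[i] - ys[i - 1])

# Step 3: Monte Carlo the risk factor (wattage variation) through the
# surrogate and read off a quantile for the performance contract.
rng = random.Random(0)
samples = sorted(
    surrogate(min(max(rng.gauss(50.0, 2.0), 40.0), 60.0))
    for _ in range(5000)
)
p95 = samples[int(0.95 * len(samples))]     # 95th-percentile annual energy
```

    The resulting distribution, rather than a single point estimate, is what lets a contractor quote a savings figure with a known probability of shortfall.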

  18. Hand ultrasound: a high-fidelity simulation of lung sliding.

    Science.gov (United States)

    Shokoohi, Hamid; Boniface, Keith

    2012-09-01

    Simulation training has been effectively used to integrate didactic knowledge and technical skills in emergency and critical care medicine. In this article, we introduce a novel model of simulating lung ultrasound and the features of lung sliding and pneumothorax by performing a hand ultrasound. The simulation model involves scanning the palmar aspect of the hand to create normal lung sliding in varying modes of scanning and to mimic ultrasound features of pneumothorax, including "stratosphere/barcode sign" and "lung point." The simple, reproducible, and readily available simulation model we describe demonstrates a high-fidelity simulation surrogate that can be used to rapidly illustrate the signs of normal and abnormal lung sliding at the bedside. © 2012 by the Society for Academic Emergency Medicine.

  19. An exploration of the relationship between knowledge and performance-related variables in high-fidelity simulation: designing instruction that promotes expertise in practice.

    Science.gov (United States)

    Hauber, Roxanne P; Cormier, Eileen; Whyte, James

    2010-01-01

    Increasingly, high-fidelity patient simulation (HFPS) is becoming essential to nursing education. Much remains unknown about how classroom learning is connected to student decision-making in simulation scenarios and the degree to which transference takes place between the classroom setting and actual practice. The present study was part of a larger pilot study aimed at determining the relationship between nursing students' clinical ability to prioritize their actions and the associated cognitions and physiologic outcomes of care using HFPS. In an effort to better explain the knowledge base being used by nursing students in HFPS, the investigators explored the relationship between common measures of knowledge and performance-related variables. Findings are discussed within the context of the expert performance approach and concepts from cognitive psychology, such as cognitive architecture, cognitive load, memory, and transference.

  20. Aircraft Performance for Open Air Traffic Simulations

    NARCIS (Netherlands)

    Metz, I.C.; Hoekstra, J.M.; Ellerbroek, J.; Kugler, D.

    2016-01-01

    The BlueSky Open Air Traffic Simulator developed by the Control & Simulation section of TU Delft aims at supporting research for analysing Air Traffic Management concepts by providing an open source simulation platform. The goal of this study was to complement BlueSky with aircraft performance models.

  1. An empirical investigation of operator performance in cognitively demanding simulated emergencies

    International Nuclear Information System (INIS)

    Roth, E.M.; Mumaw, R.J.; Lewis, P.M.

    1994-07-01

    This report documents the results of an empirical study of nuclear power plant operator performance in cognitively demanding simulated emergencies. During emergencies operators follow highly prescriptive written procedures. The objectives of the study were to understand and document what role higher-level cognitive activities such as diagnosis, or more generally 'situation assessment', play in guiding operator performance, given that operators utilize procedures in responding to the events. The study examined crew performance in variants of two emergencies: (1) an Interfacing System Loss of Coolant Accident and (2) a Loss of Heat Sink scenario. Data on operator performance were collected using training simulators at two plant sites. Up to 11 crews from each plant participated in each of two simulated emergencies for a total of 38 cases. Crew performance was videotaped and partial transcripts were produced and analyzed. The results revealed a number of instances where higher-level cognitive activities such as situation assessment and response planning enabled crews to handle aspects of the situation that were not fully addressed by the procedures. This report documents these cases and discusses their implications for the development and evaluation of training and control room aids, as well as for human reliability analyses

  2. Simulation of plasma loading of high-pressure RF cavities

    Energy Technology Data Exchange (ETDEWEB)

    Yu, K. [Brookhaven National Lab. (BNL), Upton, NY (United States). Computational Science Initiative; Samulyak, R. [Brookhaven National Lab. (BNL), Upton, NY (United States). Computational Science Initiative; Stony Brook Univ., NY (United States). Dept. of Applied Mathematics and Statistics; Yonehara, K. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Freemire, B. [Northern Illinois Univ., DeKalb, IL (United States)

    2018-01-11

    Muon beam-induced plasma loading of radio-frequency (RF) cavities filled with high-pressure hydrogen gas with a 1% dry air dopant has been studied via numerical simulations. The electromagnetic code SPACE, which resolves relevant atomic physics processes, including ionization by the muon beam, electron attachment to dopant molecules, and electron-ion and ion-ion recombination, has been used. Simulation studies have also been performed in the range of parameters typical for practical muon cooling channels.
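
    The atomic processes listed, beam ionization, electron attachment to the dopant, and electron-ion / ion-ion recombination, can be caricatured by zero-dimensional rate equations. All coefficients below are invented round numbers, not the SPACE inputs:

```python
# 0-D rate-equation sketch of plasma loading in a doped, gas-filled cavity.
# n_e: electrons; n_neg: negative ions (from attachment); quasi-neutrality
# gives n_pos = n_e + n_neg. All coefficients are assumed, illustrative values.

Q    = 1.0e16   # beam ionization rate, cm^-3 s^-1
eta  = 1.0e7    # electron attachment rate to the dopant, s^-1
b_ei = 1.0e-6   # electron-ion recombination coefficient, cm^3 s^-1
b_ii = 1.0e-6   # ion-ion recombination coefficient, cm^3 s^-1

n_e = n_neg = 0.0
dt, steps = 1.0e-8, 20000   # integrate to t = 2e-4 s, past equilibration
for _ in range(steps):
    n_pos = n_e + n_neg
    dn_e   = Q - eta * n_e - b_ei * n_e * n_pos
    dn_neg = eta * n_e - b_ii * n_neg * n_pos
    n_e   += dt * dn_e
    n_neg += dt * dn_neg
```

    With fast attachment, the electron density settles near Q/eta while ion-ion recombination balances the source (n_neg ≈ sqrt(Q/b_ii)); it is these residual densities that determine how strongly the plasma loads the cavity fields.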

  3. Simulation of plasma loading of high-pressure RF cavities

    Science.gov (United States)

    Yu, K.; Samulyak, R.; Yonehara, K.; Freemire, B.

    2018-01-01

    Muon beam-induced plasma loading of radio-frequency (RF) cavities filled with high-pressure hydrogen gas with a 1% dry air dopant has been studied via numerical simulations. The electromagnetic code SPACE, which resolves relevant atomic physics processes, including ionization by the muon beam, electron attachment to dopant molecules, and electron-ion and ion-ion recombination, has been used. Simulation studies have been performed in the range of parameters typical for practical muon cooling channels.

  4. MDT Performance in a High Rate Background Environment

    CERN Document Server

    Aleksa, Martin; Hessey, N P; Riegler, W

    1998-01-01

    A Cs137 gamma source with different lead filters in the SPS beam-line X5 has been used to simulate the ATLAS background radiation. This note shows the impact of high background rates on the MDT efficiency and resolution for three kinds of pulse shaping and compares the results with GARFIELD simulations. Furthermore it explains how the performance can be improved by time slewing corrections and double track separation.

  5. A Grid-Based Cyber Infrastructure for High Performance Chemical Dynamics Simulations

    Directory of Open Access Journals (Sweden)

    Khadka Prashant

    2008-10-01

    Full Text Available Chemical dynamics simulation is an effective means to study atomic-level motions of molecules, collections of molecules, liquids, surfaces, interfaces of materials, and chemical reactions. To make chemical dynamics simulations globally accessible to a broad range of users, a cyber infrastructure was recently developed that provides an online portal to VENUS, a popular chemical dynamics simulation program package, allowing people to submit simulation jobs to be executed on the web server machine. In this paper, we report new developments of the cyber infrastructure that improve its quality of service: submitted simulation jobs are now dispatched from the web server machine onto a cluster of workstations for execution, and an animation tool optimized for animating the simulation results has been added. The separation of the server machine from the simulation-running machines improves service quality by increasing the capacity to serve more requests simultaneously with reduced web response time, and allows the execution of large-scale, time-consuming simulation jobs on the powerful workstation cluster. With the addition of the animation tool, the cyber infrastructure automatically converts, upon the selection of the user, simulation results into an animation file that can be viewed in ordinary web browsers without requiring installation of any special software on the user's computer. Since animation is essential for understanding the results of chemical dynamics simulations, this capacity provides a better way of understanding the details of the chemical dynamics. By combining computing resources at locations under different administrative controls, this cyber infrastructure constitutes a grid environment providing physically and administratively distributed functionalities through a single easy-to-use online portal.

  6. High performance computing system in the framework of the Higgs boson studies

    CERN Document Server

    Belyaev, Nikita; The ATLAS collaboration

    2017-01-01

    Higgs boson physics is one of the most important and promising fields of study in modern high energy physics. To perform precision measurements of the Higgs boson properties, fast and efficient instruments for Monte Carlo event simulation are required. Due to the increasing amount of data and the growing complexity of the simulation software tools, the computing resources currently available for Monte Carlo simulation on the LHC GRID are not sufficient. One possibility to address this shortfall of computing resources is the use of institute computer clusters, commercial computing resources and supercomputers. In this paper, a brief description of Higgs boson physics and of Monte Carlo generation and event simulation techniques is presented. A description of modern high performance computing systems and tests of their performance are also discussed. These studies have been performed on the Worldwide LHC Computing Grid and the Kurchatov Institute Data Processing Center, including Tier...

  7. Performance simulation of a MRPC-based PET imaging system

    Science.gov (United States)

    Roy, A.; Banerjee, A.; Biswas, S.; Chattopadhyay, S.; Das, G.; Saha, S.

    2014-10-01

    The less expensive, high-resolution Multi-gap Resistive Plate Chamber (MRPC) opens up the possibility of an efficient alternative detector for Time-of-Flight (TOF) based Positron Emission Tomography (PET), where the sensitivity of the system depends largely on the time resolution of the detector. In a layered structure, suitable converters can be used to increase the photon detection efficiency. In this work, we perform a detailed GEANT4 simulation to optimize the converter thickness towards improving the efficiency of photon conversion. A Monte Carlo based procedure has been developed to simulate the time resolution of the MRPC-based system, making it possible to simulate its response for PET imaging applications. The results of the test of a six-gap MRPC, operating in avalanche mode, with a 22Na source are also discussed.
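
    The thickness optimization has a simple Monte Carlo caricature: a thicker converter stops more 511 keV photons, but conversion electrons born too deep cannot escape into the gas gaps. With an assumed photon attenuation length and electron escape depth (toy numbers, not the GEANT4 physics), the conversion-and-escape efficiency peaks near the escape depth:

```python
import random

# Toy model: a photon converts at exponential depth d (attenuation length
# LAM); the conversion electron reaches the gas gap only if it was born
# within R_ESC of the converter's back face. Both lengths are assumed.
LAM, R_ESC = 5.0, 1.0            # arbitrary length units
rng = random.Random(42)

def efficiency(thickness, n=50000):
    """Monte Carlo estimate of P(convert inside AND electron escapes)."""
    hits = 0
    for _ in range(n):
        d = rng.expovariate(1.0 / LAM)            # conversion depth
        if d < thickness and thickness - d < R_ESC:
            hits += 1
    return hits / n

thicknesses = [0.25 * i for i in range(1, 13)]    # scan 0.25 .. 3.0
effs = {t: efficiency(t) for t in thicknesses}
best = max(effs, key=effs.get)                    # peaks near t = R_ESC
```

    Below the escape depth the efficiency grows as 1 - exp(-t/LAM); beyond it, conversions deep in the converter are wasted and the efficiency falls off, which is exactly the trade-off a full GEANT4 scan resolves with real cross sections.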

  8. Simulations of depleted CMOS sensors for high-radiation environments

    CERN Document Server

    Liu, J.; Bhat, S.; Breugnon, P.; Caicedo, I.; Chen, Z.; Degerli, Y.; Godiot-Basolo, S.; Guilloux, F.; Hemperek, T.; Hirono, T.; Hügging, F.; Krüger, H.; Moustakas, K.; Pangaud, P.; Rozanov, A.; Rymaszewski, P.; Schwemling, P.; Wang, M.; Wang, T.; Wermes, N.; Zhang, L.

    2017-01-01

    After the Phase II upgrade of the Large Hadron Collider (LHC), the increased luminosity calls for a new upgraded Inner Tracker (ITk) for the ATLAS experiment. As a possible option for the ATLAS ITk, a new pixel detector based on High Voltage/High Resistivity CMOS (HV/HR CMOS) technology is under study. Meanwhile, a new CMOS pixel sensor is also under development for the tracker of the Circular Electron Positron Collider (CEPC). In order to explore the sensors' electrical properties, such as the breakdown voltage and charge collection efficiency, 2D/3D Technology Computer Aided Design (TCAD) simulations have been performed for both of the above-mentioned prototypes. In this paper, the guard-ring simulation for an HV/HR CMOS sensor developed for the ATLAS ITk and the charge collection efficiency simulation for a CMOS sensor explored for the CEPC tracker are discussed in detail. Comparisons between the simulations and the latest measurements are also presented.

  9. De Novo Ultrascale Atomistic Simulations On High-End Parallel Supercomputers

    Energy Technology Data Exchange (ETDEWEB)

    Nakano, A; Kalia, R K; Nomura, K; Sharma, A; Vashishta, P; Shimojo, F; van Duin, A; Goddard, III, W A; Biswas, R; Srivastava, D; Yang, L H

    2006-09-04

    We present a de novo hierarchical simulation framework for first-principles based predictive simulations of materials and their validation on high-end parallel supercomputers and geographically distributed clusters. In this framework, high-end chemically reactive and non-reactive molecular dynamics (MD) simulations explore a wide solution space to discover microscopic mechanisms that govern macroscopic material properties, into which highly accurate quantum mechanical (QM) simulations are embedded to validate the discovered mechanisms and quantify the uncertainty of the solution. The framework includes an embedded divide-and-conquer (EDC) algorithmic framework for the design of linear-scaling simulation algorithms with minimal bandwidth complexity and tight error control. The EDC framework also enables adaptive hierarchical simulation with automated model transitioning assisted by graph-based event tracking. A tunable hierarchical cellular decomposition parallelization framework then maps the O(N) EDC algorithms onto petaflops computers, while achieving performance tunability through a hierarchy of parameterized cell data/computation structures, as well as its implementation using hybrid Grid remote procedure call + message passing + threads programming. High-end computing platforms such as IBM BlueGene/L, SGI Altix 3000 and the NSF TeraGrid provide an excellent test ground for the framework. On these platforms, we have achieved unprecedented scales of quantum-mechanically accurate and well validated, chemically reactive atomistic simulations--1.06 billion-atom fast reactive force-field MD and 11.8 million-atom (1.04 trillion grid points) quantum-mechanical MD in the framework of the EDC density functional theory on adaptive multigrids--in addition to 134 billion-atom non-reactive space-time multiresolution MD, with a parallel efficiency as high as 0.998 on 65,536 dual-processor BlueGene/L nodes. We have also achieved an automated execution of hierarchical QM

  10. Optical Characterization and Energy Simulation of Glazing for High-Performance Windows

    International Nuclear Information System (INIS)

    Jonsson, Andreas

    2010-01-01

    This thesis focuses on one important component of the energy system - the window. Windows are installed in buildings mainly to create visual contact with the surroundings and to let in daylight, and should also be heat and sound insulating. This thesis covers four important aspects of windows: antireflection coatings, switchable coatings, energy simulations and optical measurements. Energy simulations have been used to compare different windows and also to estimate the performance of smart or switchable windows, whose transmittance can be regulated. The results from this thesis show the potential of the emerging technology of smart windows, not only from a daylight and an energy perspective, but also for comfort and well-being. The importance of a well-functioning control system for such windows is pointed out. To fulfill all requirements of modern windows, they often have two or more panes. Each glass surface leads to reflection of light and therefore less daylight is transmitted. It is therefore of interest to find ways to increase the transmittance. In this thesis antireflection coatings, similar to those found on eye-glasses and LCD screens, have been investigated. For large area applications such as windows, it is necessary to use techniques which can easily be adapted to large scale manufacturing at low cost. Such a technique is dip-coating in a sol-gel of porous silica. Antireflection coatings have been deposited on glass and plastic materials to study both visual and energy performance and it has been shown that antireflection coatings increase the transmittance of windows without negatively affecting the thermal insulation and the energy efficiency. Optical measurements are important for quantifying product properties for comparisons and evaluations. It is important that new measurement routines are simple and applicable to standard commercial instruments. Different systematic error sources for optical measurements of patterned light diffusing samples using

  11. Cavitation performance improvement of high specific speed mixed-flow pump

    International Nuclear Information System (INIS)

    Chen, T; Sun, Y B; Wu, D Z; Wang, L Q

    2012-01-01

    Cavitation performance improvement of large hydraulic machinery such as pumps and turbines has been a hot topic for decades. During the design process, in order to minimize size, weight and cost, centrifugal and mixed-flow pump impellers are required to operate at the highest possible rotational speed; the rotational speed is limited by the phenomenon of cavitation. The hydraulic model of a high-speed mixed-flow pump with large flow rate and high pumping head, designed based on the traditional method, always involves poor cavitation performance. In this paper, on the basis of the same hydraulic design parameters, two hydraulic models of a high-speed mixed-flow pump were designed using different methods. In order to investigate the cavitation and hydraulic performance of the two models, computational fluid dynamics (CFD) was adopted for internal flow simulation of the high specific speed mixed-flow pump. Based on the results of the numerical simulation, the influences of impeller parameters and three-dimensional configuration on the pressure distribution of the blades' suction surfaces were analyzed. The numerical simulation results show a better pressure distribution and lower pressure drop around the leading edge of the improved model. The research results could provide a reference for the design and optimization of anti-cavitation blades.

  12. Simulations of High Speed Fragment Trajectories

    Science.gov (United States)

    Yeh, Peter; Attaway, Stephen; Arunajatesan, Srinivasan; Fisher, Travis

    2017-11-01

    Flying shrapnel from an explosion is capable of traveling at supersonic speeds and to distances much farther than expected due to aerodynamic interactions. Predicting the trajectories and stable tumbling modes of arbitrarily shaped fragments is a fundamental problem applicable to range safety calculations, damage assessment, and military technology. Traditional approaches rely on characterizing fragment flight using a single drag coefficient, which may be inaccurate for fragments with large aspect ratios. In our work we develop a procedure to simulate trajectories of arbitrarily shaped fragments with higher fidelity using high performance computing. We employ a two-step approach in which the force and moment coefficients are first computed as a function of orientation using compressible computational fluid dynamics. The force and moment data are then input into a six-degree-of-freedom rigid body dynamics solver to integrate trajectories in time. Results of these high fidelity simulations allow us to further understand the flight dynamics and tumbling modes of a single fragment. Furthermore, we use these results to determine the validity and uncertainty of inexpensive methods such as the single drag coefficient model.
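
    A drastically reduced planar version of the two-step approach can be sketched as follows: step one is replaced by a small assumed table of drag coefficient versus orientation (standing in for the CFD-computed force and moment data), and step two by point-mass integration that looks the coefficient up as the fragment tumbles at a fixed rate. The real solver is six-degree-of-freedom; every number here is illustrative.

```python
import math

# Step 1 stand-in: an assumed table of drag coefficient vs. orientation
# angle (symmetric about pi/2), in place of the CFD-computed coefficients.
cd_table = [(0.0, 0.6), (math.pi / 4, 1.0), (math.pi / 2, 1.4)]

def cd_of(angle):
    """Interpolate Cd, folding any angle into [0, pi/2] by symmetry."""
    a = abs(math.fmod(angle, math.pi))
    a = min(a, math.pi - a)
    for (a0, c0), (a1, c1) in zip(cd_table, cd_table[1:]):
        if a <= a1:
            return c0 + (c1 - c0) * (a - a0) / (a1 - a0)
    return cd_table[-1][1]

# Step 2 stand-in: planar point-mass trajectory with a fixed tumble rate
# (the real solver integrates full six-degree-of-freedom rigid-body motion).
def fly(v0, elevation, spin, mass, area, dt=1e-3):
    rho, g = 1.225, 9.81                     # air density (kg/m^3), gravity
    vx, vy = v0 * math.cos(elevation), v0 * math.sin(elevation)
    x = y = theta = 0.0
    while y >= 0.0:
        v = math.hypot(vx, vy)
        drag = 0.5 * rho * v * v * area * cd_of(theta)
        ax = -drag * vx / (v * mass)
        ay = -g - drag * vy / (v * mass)
        vx, vy = vx + ax * dt, vy + ay * dt
        x, y = x + vx * dt, y + vy * dt
        theta += spin * dt                   # tumbling at a fixed rate
    return x                                 # downrange distance at impact

impact_range = fly(300.0, math.radians(30.0), spin=50.0, mass=0.02, area=4e-4)
```

    Even this caricature shows why a single fixed Cd misleads: the effective drag a tumbling fragment experiences depends on how its orientation history samples the coefficient table.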

  13. Microsurgical Performance After Sleep Interruption: A NeuroTouch Simulator Study.

    Science.gov (United States)

    Micko, Alexander; Knopp, Karoline; Knosp, Engelbert; Wolfsberger, Stefan

    2017-10-01

    Amid the ubiquitous debate about doctors' working hour restrictions, it remains questionable whether physician performance is impaired by high workload and long shifts. In this study, we evaluated the impact of sleep interruption on neurosurgical performance. Ten medical students and 10 neurosurgical residents were tested on the virtual-reality simulator NeuroTouch by performing an identical microsurgical task, first well rested (baseline test) and then after sleep interruption at night (stress test). Deviations in total score, timing, and excessive force on tissue were evaluated. In addition, vital parameters and self-assessments were analyzed. After sleep interruption, the total performance score increased significantly (45.1 vs. 48.7, baseline vs. stress test, P = 0.048) while timing remained stable (10.1 vs. 10.4 minutes, baseline vs. stress test, P > 0.05) for both students and residents. Excessive force decreased in both groups during the stress test for the nondominant hand (P = 0.05); for the dominant hand, an increase of excessive force was encountered in the group of residents (P = 0.05). In contrast to these results, participants in both groups assessed their performance as worse during the stress test. In our study, we found an increase in neurosurgical simulator performance in neurosurgical residents and medical students under simulated night shift conditions; furthermore, microsurgical dexterity remained unchanged. Based on our results and the data in the available literature, we cannot confirm that working hour restrictions will have a positive effect on neurosurgical performance. Copyright © 2017 Elsevier Inc. All rights reserved.

  14. Impact of Loss Synchronization on Reliable High Speed Networks: A Model Based Simulation

    Directory of Open Access Journals (Sweden)

    Suman Kumar

    2014-01-01

    Full Text Available The contemporary nature of network evolution demands simulation models which are flexible, scalable, and easily implementable. In this paper, we propose a fluid-based model for performance analysis of reliable high speed networks. In particular, this paper aims to study the dynamic relationship between congestion control algorithms and queue management schemes, in order to develop a better understanding of the causal linkages between the two. We propose a loss synchronization module which is user configurable. We validate our model through simulations under controlled settings. Also, we present a performance analysis to provide insights into two important issues concerning 10 Gbps high speed networks: (i) the impact of bottleneck buffer size on the performance of 10 Gbps high speed networks and (ii) the impact of the level of loss synchronization on link utilization-fairness tradeoffs. The practical impact of the proposed work is to provide design guidelines along with a powerful simulation tool to protocol designers and network developers.
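
    The interplay the paper models can be caricatured in a few lines: windows grow additively each round, a bottleneck overflow triggers a loss event, and a user-configurable synchronization level decides what fraction of flows back off. All constants below are invented; the paper develops a proper fluid model of 10 Gbps links.

```python
import random

# Toy model of N AIMD flows sharing one bottleneck. Windows grow by one
# unit per round; when aggregate demand exceeds capacity, a loss event
# halves the windows of a configurable fraction of flows (the loss-
# synchronization level). All constants are invented for illustration.

def run(n_flows, capacity, sync_level, ticks, seed=0):
    rng = random.Random(seed)
    w = [1.0] * n_flows
    util_acc = 0.0
    for _ in range(ticks):
        total = sum(w)
        util_acc += min(total, capacity) / capacity
        if total > capacity:                      # bottleneck overflow
            k = max(1, int(sync_level * n_flows))
            for i in rng.sample(range(n_flows), k):
                w[i] /= 2.0                       # multiplicative decrease
        w = [wi + 1.0 for wi in w]                # additive increase
    s1 = sum(w)
    s2 = sum(wi * wi for wi in w)
    fairness = s1 * s1 / (n_flows * s2)           # Jain's fairness index
    return util_acc / ticks, fairness

util_full, fair_full = run(10, 500.0, sync_level=1.0, ticks=2000)
util_part, fair_part = run(10, 500.0, sync_level=0.3, ticks=2000)
```

    Fully synchronized losses keep the flows identical (perfect fairness) but leave capacity idle after every collective backoff, while partial synchronization fills the link better at the price of unequal windows, the utilization-fairness tradeoff the paper quantifies.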

  15. Kinetic Energy from Supernova Feedback in High-resolution Galaxy Simulations

    Science.gov (United States)

    Simpson, Christine M.; Bryan, Greg L.; Hummels, Cameron; Ostriker, Jeremiah P.

    2015-08-01

    We describe a new method for adding a prescribed amount of kinetic energy to simulated gas modeled on a Cartesian grid by directly altering grid cells' mass and velocity in a distributed fashion. The method is explored in the context of supernova (SN) feedback in high-resolution (~10 pc) hydrodynamic simulations of galaxy formation. Resolution dependence is a primary consideration in our application of the method, and simulations of isolated explosions (performed at different resolutions) motivate a resolution-dependent scaling for the injected fraction of kinetic energy that we apply in cosmological simulations of a 10^9 M⊙ dwarf halo. We find that in high-density media (≳50 cm^-3) with coarse resolution (≳4 pc per cell), results are sensitive to the initial kinetic energy fraction due to early and rapid cooling. In our galaxy simulations, the deposition of small amounts of SN energy in kinetic form (as little as 1%) has a dramatic impact on the evolution of the system, resulting in an order-of-magnitude suppression of stellar mass. The overall behavior of the galaxy in the two highest resolution simulations we perform appears to converge. We discuss the resulting distribution of stellar metallicities, an observable sensitive to galactic wind properties, and find that while the new method demonstrates increased agreement with observed systems, significant discrepancies remain, likely due to simplistic assumptions that neglect contributions from SNe Ia and stellar winds.
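
    The bookkeeping at the heart of the method, choosing cell velocity kicks so that the injected kinetic energy equals a prescribed, resolution-dependent fraction of the supernova energy, can be sketched with a uniform radial kick over a 3x3x3 stencil. The paper's actual mass-and-velocity alteration scheme is more elaborate; masses, stencil, and f_kin below are illustrative assumptions.

```python
import math

# Assign a uniform radial speed to the cells surrounding the explosion so
# that the kinetic energy added to gas initially at rest equals f_kin * E_SN.
# Masses, stencil size, and f_kin are illustrative assumptions.

E_SN, f_kin = 1.0e51, 0.05           # erg; resolution-dependent kinetic fraction

cells = []                           # (mass in g, (i, j, k) offset from center)
for i in (-1, 0, 1):
    for j in (-1, 0, 1):
        for k in (-1, 0, 1):
            cells.append((2.0e33, (i, j, k)))

# Only off-center cells receive a kick (the center cell has no radial direction).
m_kick = sum(m for m, (i, j, k) in cells if (i, j, k) != (0, 0, 0))
speed = math.sqrt(2.0 * f_kin * E_SN / m_kick)

injected = 0.0
for m, (i, j, k) in cells:
    r = math.sqrt(i * i + j * j + k * k)
    if r == 0.0:
        continue
    vx, vy, vz = speed * i / r, speed * j / r, speed * k / r   # radial kick
    injected += 0.5 * m * (vx * vx + vy * vy + vz * vz)
```

    Because the kick speed is solved from the total kicked mass, the injected kinetic energy matches the target fraction by construction; the resolution dependence enters through how f_kin is chosen as a function of cell size.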

  16. GROMACS 4.5: A high-throughput and highly parallel open source molecular simulation toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Pronk, Sander [Science for Life Lab., Stockholm (Sweden); KTH Royal Institute of Technology, Stockholm (Sweden); Pall, Szilard [Science for Life Lab., Stockholm (Sweden); KTH Royal Institute of Technology, Stockholm (Sweden); Schulz, Roland [Univ. of Tennessee, Knoxville, TN (United States); Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Larsson, Per [Univ. of Virginia, Charlottesville, VA (United States); Bjelkmar, Par [Science for Life Lab., Stockholm (Sweden); Stockholm Univ., Stockholm (Sweden); Apostolov, Rossen [Science for Life Lab., Stockholm (Sweden); KTH Royal Institute of Technology, Stockholm (Sweden); Shirts, Michael R. [Univ. of Virginia, Charlottesville, VA (United States); Smith, Jeremy C. [Univ. of Tennessee, Knoxville, TN (United States); Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Kasson, Peter M. [Univ. of Virginia, Charlottesville, VA (United States); van der Spoel, David [Science for Life Lab., Stockholm (Sweden); Uppsala Univ., Uppsala (Sweden); Hess, Berk [Science for Life Lab., Stockholm (Sweden); KTH Royal Institute of Technology, Stockholm (Sweden); Lindahl, Erik [Science for Life Lab., Stockholm (Sweden); KTH Royal Institute of Technology, Stockholm (Sweden); Stockholm Univ., Stockholm (Sweden)

    2013-02-13

    Molecular simulation has historically been a low-throughput technique, but faster computers and increasing amounts of genomic and structural data are changing this by enabling large-scale automated simulation of, for instance, many conformers or mutants of biomolecules with or without a range of ligands. At the same time, advances in performance and scaling now make it possible to model complex biomolecular interaction and function in a manner directly testable by experiment. These applications share a need for fast and efficient software that can be deployed on massive scale in clusters, web servers, distributed computing or cloud resources. Here we present a range of new simulation algorithms and features developed during the past 4 years, leading up to the GROMACS 4.5 software package. The software now automatically handles wide classes of biomolecules, such as proteins, nucleic acids and lipids, and comes with all commonly used force fields for these molecules built-in. GROMACS supports several implicit solvent models, as well as new free-energy algorithms, and the software now uses multithreading for efficient parallelization even on low-end systems, including Windows-based workstations. Together with hand-tuned assembly kernels and state-of-the-art parallelization, this provides extremely high performance and cost efficiency for high-throughput as well as massively parallel simulations.

  17. High Performance Computing in Science and Engineering '14

    CERN Document Server

    Kröner, Dietmar; Resch, Michael

    2015-01-01

    This book presents the state-of-the-art in supercomputer simulation. It includes the latest findings from leading researchers using systems from the High Performance Computing Center Stuttgart (HLRS). The reports cover all fields of computational science and engineering ranging from CFD to computational physics and from chemistry to computer science with a special emphasis on industrially relevant applications. Presenting findings of one of Europe’s leading systems, this volume covers a wide variety of applications that deliver a high level of sustained performance. The book covers the main methods in high-performance computing. Its outstanding results in achieving the best performance for production codes are of particular interest for both scientists and engineers. The book comes with a wealth of color illustrations and tables of results.

  18. The effects of fatigue on performance in simulated nursing work.

    Science.gov (United States)

    Barker, Linsey M; Nussbaum, Maury A

    2011-09-01

    Fatigue is associated with increased rates of medical errors and healthcare worker injuries, yet existing research in this sector has not considered multiple dimensions of fatigue simultaneously. This study evaluated hypothesised causal relationships between mental and physical fatigue and performance. High and low levels of mental and physical fatigue were induced in 16 participants during simulated nursing work tasks in a laboratory setting. Task-induced changes in fatigue dimensions were quantified using both subjective and objective measures, as were changes in performance on physical and mental tasks. Completing the simulated work tasks increased total fatigue, mental fatigue and physical fatigue in all experimental conditions. Higher physical fatigue adversely affected measures of physical and mental performance, whereas higher mental fatigue had a positive effect on one measure of mental performance. Overall, these results suggest causal effects between manipulated levels of mental and physical fatigue and task-induced changes in mental and physical performance. STATEMENT OF RELEVANCE: Nurse fatigue and performance has implications for patient and provider safety. Results from this study demonstrate the importance of a multidimensional view of fatigue in understanding the causal relationships between fatigue and performance. The findings can guide future work aimed at predicting fatigue-related performance decrements and designing interventions.

  19. Design of DSP-based high-power digital solar array simulator

    Science.gov (United States)

    Zhang, Yang; Liu, Zhilong; Tong, Weichao; Feng, Jian; Ji, Yibo

    2013-12-01

    With increasing global energy consumption, photovoltaic (PV) systems are receiving more and more research attention. A digital high-power solar array simulator provides technical support for research on high-power grid-connected PV systems. This paper introduces a design scheme for a high-power digital solar array simulator based on the TMS320F28335 DSP. A DC-DC full-bridge topology is used in the system's main circuit; the IGBT switching frequency is 25 kHz, the maximum output voltage is 900 V and the maximum output current is 20 A. The simulator can pre-store solar panel I-V curves, each composed of 128 discrete points. While the system is running, the main-circuit voltage and current values are fed back to the DSP in real time by the voltage and current sensors, and the DSP controls the simulator in a closed loop through an incremental PI algorithm. Experimental data show that the simulator's output voltage and current follow a preset solar panel I-V curve. Connected to a high-power inverter, the system becomes a grid-connected PV system; the inverter can find the simulator's maximum power point and the output power can be stabilized at the maximum power point (MPP).
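
    A minimal sketch of the control loop described above (the curve shape, gains and idealised converter model are our own assumptions, not the paper's design): a 128-point pre-stored I-V curve is interpolated to obtain the current setpoint at the measured voltage, and an incremental PI law drives the output toward the curve.

```python
import numpy as np

V_OC, I_SC = 900.0, 20.0                     # open-circuit V, short-circuit A
v_tab = np.linspace(0.0, V_OC, 128)          # 128 stored curve points
i_tab = I_SC * (1.0 - (v_tab / V_OC) ** 8)   # toy I-V shape (assumed)

def iv_setpoint(v_meas):
    """Interpolate the pre-stored curve at the measured voltage."""
    return np.interp(v_meas, v_tab, i_tab)

def run_loop(r_load=30.0, kp=0.05, ki=0.2, steps=200):
    """Incremental PI: du = kp*(e_k - e_{k-1}) + ki*e_k."""
    u, e_prev, i_out = 0.0, 0.0, 0.0
    for _ in range(steps):
        v_meas = i_out * r_load              # voltage across the load
        e = iv_setpoint(v_meas) - i_out      # current error
        u += kp * (e - e_prev) + ki * e      # incremental PI update
        e_prev = e
        i_out = u                            # ideal converter: current tracks u
    return v_meas, i_out

v, i = run_loop()   # operating point settles on the stored I-V curve
```

    The fixed point satisfies i = IV(i * r_load), i.e. the simulated operating point lies on the intersection of the stored panel curve and the load line.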

  20. Simulated astigmatism impairs academic-related performance in children.

    Science.gov (United States)

    Narayanasamy, Sumithira; Vincent, Stephen J; Sampson, Geoff P; Wood, Joanne M

    2015-01-01

    Astigmatism is an important refractive condition in children. However, the functional impact of uncorrected astigmatism in this population is not well established, particularly with regard to academic performance. This study investigated the impact of simulated bilateral astigmatism on academic-related tasks before and after sustained near work in children. Twenty visually normal children (mean age: 10.8 ± 0.7 years; six males and 14 females) completed a range of standardised academic-related tests with and without 1.50 D of simulated bilateral astigmatism (with both the academic-related tests and the visual condition administered in a randomised order). The simulated astigmatism was induced using a positive cylindrical lens while maintaining a plano spherical equivalent. Performance was assessed before and after 20 min of sustained near work, during two separate testing sessions. Academic-related measures included a standardised reading test (the Neale Analysis of Reading Ability), visual information processing tests (Coding and Symbol Search subtests from the Wechsler Intelligence Scale for Children) and a reading-related eye movement test (the Developmental Eye Movement test). Each participant was systematically assigned either with-the-rule (WTR, axis 180°) or against-the-rule (ATR, axis 90°) simulated astigmatism to evaluate the influence of axis orientation on any decrements in performance. Reading, visual information processing and reading-related eye movement performance were all significantly impaired by simulated bilateral astigmatism (p < 0.05). Simulated astigmatism led to a reduction of between 5% and 12% in performance across the academic-related outcome measures, but there was no significant effect of the axis (WTR or ATR) of astigmatism (p > 0.05). Simulated bilateral astigmatism impaired children's performance on a range of academic-related outcome measures irrespective of the orientation of the astigmatism. These findings have

  1. Proving test on the performance of a Multiple-Excitation Simulator

    International Nuclear Information System (INIS)

    Fujita, Katsuhisa; Ito, Tomohiro; Kojima, Nobuyuki; Sasaki, Yoichi; Abe, Hiroshi; Kuroda, Katsuhiko

    1995-01-01

    A seismic excitation test on large-scale piping systems is scheduled to be carried out by the Nuclear Power Engineering Corporation (NUPEC) using the large-scale, high-performance vibration table at the Tadotsu Engineering Laboratory, under the sponsorship of the Ministry of International Trade and Industry (MITI). In the test, the piping systems simulate the main steam piping system and the main feed water piping system in nuclear power plants. In this study, a fundamental test was carried out to prove the performance of the Multiple Excitation Simulator, which consists of a hydraulic actuator and a control system. An L-shaped piping system and a hydraulic actuator were installed on the shaking table, and the acceleration and displacement generated by the actuator were measured. The performance of the actuator and the control system was assessed by comparing the measured values with the target values on the time histories and the response spectra of the acceleration. As a result, it was proved that the actuator and the control system have good performance and will be applicable to the verification test

  2. MAPPS (Maintenance Personnel Performance Simulation): a computer simulation model for human reliability analysis

    International Nuclear Information System (INIS)

    Knee, H.E.; Haas, P.M.

    1985-01-01

    A computer model capable of generating reliable estimates of human performance measures in the nuclear power plant (NPP) maintenance context has been developed, sensitivity tested, and evaluated. The model, entitled MAPPS (Maintenance Personnel Performance Simulation), is of the simulation type and is task-oriented. It addresses a number of person-machine, person-environment, and person-person variables and is capable of providing the user with a rich spectrum of important performance measures, including the mean time for successful task performance by a maintenance team and the maintenance team's probability of task success. These two measures are particularly important as input to probabilistic risk assessment (PRA) studies, which were the primary impetus for the development of MAPPS. The simulation nature of the model, along with its generous input parameters and output variables, allows its usefulness to extend beyond providing input to PRA
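
    The two PRA-oriented outputs named above can be illustrated with a crude Monte Carlo toy (distributions and parameters invented for illustration; MAPPS itself is task-oriented and far richer): task success probability and the mean time of successful task performance are estimated from sampled completion times and error events.

```python
import numpy as np

rng = np.random.default_rng(7)

def simulate_team_task(n=50000, t_limit=120.0, p_err=0.05):
    """Toy MC: a task succeeds if no unrecovered error occurs and it
    finishes within t_limit minutes; completion times are lognormal."""
    times = rng.lognormal(mean=4.0, sigma=0.4, size=n)   # minutes
    failed = rng.random(n) < p_err                       # unrecovered error
    success = ~failed & (times <= t_limit)
    # the two PRA-style measures: P(success) and mean successful time
    return success.mean(), times[success].mean()

p_success, mean_success_time = simulate_team_task()
```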

  3. Alcohol consumption for simulated driving performance: A systematic review.

    Science.gov (United States)

    Rezaee-Zavareh, Mohammad Saeid; Salamati, Payman; Ramezani-Binabaj, Mahdi; Saeidnejad, Mina; Rousta, Mansoureh; Shokraneh, Farhad; Rahimi-Movaghar, Vafa

    2017-06-01

    Alcohol consumption can lead to risky driving and increase the frequency of traffic accidents, injuries and mortalities. The main purpose of our study was to compare simulated driving performance between two groups of drivers, one consumed alcohol and the other not consumed, using a systematic review. In this systematic review, electronic resources and databases including Medline via Ovid SP, EMBASE via Ovid SP, PsycINFO via Ovid SP, PubMed, Scopus, Cumulative Index to Nursing and Allied Health Literature (CINHAL) via EBSCOhost were comprehensively and systematically searched. The randomized controlled clinical trials that compared simulated driving performance between two groups of drivers, one consumed alcohol and the other not consumed, were included. Lane position standard deviation (LPSD), mean of lane position deviation (MLPD), speed, mean of speed deviation (MSD), standard deviation of speed deviation (SDSD), number of accidents (NA) and line crossing (LC) were considered as the main parameters evaluating outcomes. After title and abstract screening, the articles were enrolled for data extraction and they were evaluated for risk of bias. Thirteen papers were included in our qualitative synthesis. All included papers were classified as at high risk of bias. Alcohol consumption mostly deteriorated the following performance outcomes in descending order: SDSD, LPSD, speed, MLPD, LC and NA. Our systematic review encountered troublesome heterogeneity. Alcohol consumption may decrease simulated driving performance in alcohol consumed people compared with non-alcohol consumed people via changes in SDSD, LPSD, speed, MLPD, LC and NA. More well-designed randomized controlled clinical trials are recommended. Copyright © 2017. Production and hosting by Elsevier B.V.

  4. Alcohol consumption for simulated driving performance: A systematic review

    Institute of Scientific and Technical Information of China (English)

    Mohammad Saeid Rezaee-Zavareh; Payman Salamati; Mahdi Ramezani-Binabaj; Mina Saeidnejad; Mansoureh Rousta; Farhad Shokraneh; Vafa Rahimi-Movaghar

    2017-01-01

    Purpose: Alcohol consumption can lead to risky driving and increase the frequency of traffic accidents, injuries and mortalities. The main purpose of our study was to compare simulated driving performance between two groups of drivers, one consumed alcohol and the other not consumed, using a systematic review. Methods: In this systematic review, electronic resources and databases including Medline via Ovid SP, EMBASE via Ovid SP, PsycINFO via Ovid SP, PubMed, Scopus, Cumulative Index to Nursing and Allied Health Literature (CINHAL) via EBSCOhost were comprehensively and systematically searched. The randomized controlled clinical trials that compared simulated driving performance between two groups of drivers, one consumed alcohol and the other not consumed, were included. Lane position standard deviation (LPSD), mean of lane position deviation (MLPD), speed, mean of speed deviation (MSD), standard deviation of speed deviation (SDSD), number of accidents (NA) and line crossing (LC) were considered as the main parameters evaluating outcomes. After title and abstract screening, the articles were enrolled for data extraction and they were evaluated for risk of bias. Results: Thirteen papers were included in our qualitative synthesis. All included papers were classified as at high risk of bias. Alcohol consumption mostly deteriorated the following performance outcomes in descending order: SDSD, LPSD, speed, MLPD, LC and NA. Our systematic review encountered troublesome heterogeneity. Conclusion: Alcohol consumption may decrease simulated driving performance in alcohol consumed people compared with non-alcohol consumed people via changes in SDSD, LPSD, speed, MLPD, LC and NA. More well-designed randomized controlled clinical trials are recommended.

  5. Simulation of a high efficiency multi-bed adsorption heat pump

    International Nuclear Information System (INIS)

    TeGrotenhuis, W.E.; Humble, P.H.; Sweeney, J.B.

    2012-01-01

    Attaining high energy efficiency with adsorption heat pumps is challenging due to thermodynamic losses that occur when the sorbent beds are thermally cycled without effective heat recuperation. The multi-bed concept described here enables high efficiency by effectively transferring heat from beds being cooled to beds being heated. A simplified lumped-parameter model and detailed finite element analysis are used to simulate a sorption compressor, which is used to project the overall heat pump coefficient of performance. Results are presented for ammonia refrigerant and a nano-structured monolithic carbon sorbent specifically modified for the application. The effects of bed geometry and number of beds on system performance are explored, and the majority of the performance benefit is obtained with four beds. Results indicate that a COP of 1.24 based on heat input is feasible at AHRI standard test conditions for residential HVAC equipment. When compared on a basis of primary energy input, performance equivalent to SEER 13 or 14 is theoretically attainable with this system. - Highlights: ► A multi-bed concept for adsorption heat pumps is capable of high efficiency. ► Modeling is used to simulate sorption compressor and overall heat pump performance. ► Results are presented for ammonia refrigerant and a nano-structured monolithic carbon sorbent. ► The majority of the efficiency benefit is obtained with four beds. ► Predicted COP as high as 1.24 for cooling is comparable to SEER 13 or 14 for electric heat pumps.
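
    A back-of-envelope check of the SEER comparison above (the grid conversion efficiency is our assumption, not stated in the abstract): a SEER rating divides Btu of cooling by watt-hours of electricity, so the electrical COP is SEER / 3.412, and scaling by a typical ~33% fuel-to-electricity conversion gives a primary-energy COP comparable to the adsorption unit's heat-driven COP of 1.24.

```python
BTU_PER_WH = 3.412   # SEER = Btu of cooling per Wh of electricity

def primary_cop(seer, grid_eff=0.33):
    """Cooling COP per unit of primary (fuel) energy for an electric
    unit, assuming ~33% fuel-to-electricity conversion at the grid."""
    return seer / BTU_PER_WH * grid_eff

cop_seer13 = primary_cop(13)   # ~1.26, close to the adsorption COP of 1.24
cop_seer14 = primary_cop(14)   # ~1.35
```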

  6. Comparative Performance of Four Single Extreme Outlier Discordancy Tests from Monte Carlo Simulations

    Directory of Open Access Journals (Sweden)

    Surendra P. Verma

    2014-01-01

    Using highly precise and accurate Monte Carlo simulations of 20,000,000 replications and 102 independent simulation experiments with extremely low simulation errors and total uncertainties, we evaluated the performance of four single outlier discordancy tests (Grubbs test N2, Dixon test N8, skewness test N14, and kurtosis test N15) for normal samples of sizes 5 to 20. Statistical contaminations of a single observation resulting from parameters called δ from ±0.1 up to ±20 for modeling the slippage of central tendency or ε from ±1.1 up to ±200 for slippage of dispersion, as well as no contamination (δ=0 and ε=±1), were simulated. Because of the use of precise and accurate random and normally distributed simulated data, very large numbers of replications, and a large number of independent experiments, this paper presents a novel approach for precise and accurate estimations of power functions of four popular discordancy tests and, therefore, should not be considered as a simple simulation exercise unrelated to probability and statistics. From both the Power of Test criterion proposed by Hayes and Kinsella and the Test Performance Criterion of Barnett and Lewis, Dixon test N8 performs less well than the other three tests. The overall performance of these four tests could be summarized as N2≅N15>N14>N8.
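
    A drastically scaled-down version of this Monte Carlo procedure can be sketched as follows (far fewer replications than the 20,000,000 used in the paper, and only the Grubbs test N2): simulate the null distribution of the statistic to obtain its critical value, then estimate the power against slippage of the central tendency by δ in one observation.

```python
import numpy as np

rng = np.random.default_rng(1)

def grubbs_stat(x):
    """Grubbs single-outlier statistic N2: largest absolute deviation
    from the mean in units of the sample standard deviation."""
    return np.max(np.abs(x - x.mean())) / x.std(ddof=1)

def critical_value(n, reps=20000, alpha=0.05):
    """Simulated (1 - alpha) percentage point under the null."""
    stats = [grubbs_stat(rng.standard_normal(n)) for _ in range(reps)]
    return np.quantile(stats, 1 - alpha)

def power(n, delta, crit, reps=20000):
    """Detection rate when one observation slips by delta."""
    hits = 0
    for _ in range(reps):
        x = rng.standard_normal(n)
        x[0] += delta              # contaminate a single observation
        hits += grubbs_stat(x) > crit
    return hits / reps

crit = critical_value(10)          # ~2.2 for n = 10
p0 = power(10, 0.0, crit)          # empirical size, ~alpha
p5 = power(10, 5.0, crit)          # power against delta = 5
```

    Tracing the power as a function of δ for each test, over many such experiments, yields the power functions compared in the paper.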

  7. The Effect of Natural or Simulated Altitude Training on High-Intensity Intermittent Running Performance in Team-Sport Athletes: A Meta-Analysis.

    Science.gov (United States)

    Hamlin, Michael J; Lizamore, Catherine A; Hopkins, Will G

    2018-02-01

    While adaptation to hypoxia at natural or simulated altitude has long been used with endurance athletes, it has only recently gained popularity for team-sport athletes. To analyse the effect of hypoxic interventions on high-intensity intermittent running performance in team-sport athletes. A systematic literature search of five journal databases was performed. Percent change in performance (distance covered) in the Yo-Yo intermittent recovery test (level 1 and level 2 were used without differentiation) in hypoxic (natural or simulated altitude) and control (sea level or normoxic placebo) groups was meta-analyzed with a mixed model. The modifying effects of study characteristics (type and dose of hypoxic exposure, training duration, post-altitude duration) were estimated with fixed effects, random effects allowed for repeated measurement within studies and residual real differences between studies, and the standard-error weighting factors were derived or imputed via standard deviations of change scores. Effects and their uncertainty were assessed with magnitude-based inference, with a smallest important improvement of 4% estimated via between-athlete standard deviations of performance at baseline. Ten studies qualified for inclusion, but two were excluded owing to small sample size and risk of publication bias. Hypoxic interventions occurred over a period of 7-28 days, and the range of total hypoxic exposure (in effective altitude-hours) was 4.5-33 km·h in the intermittent-hypoxia studies and 180-710 km·h in the live-high studies. There were 11 control and 15 experimental study-estimates in the final meta-analysis. Training effects were moderate and very likely beneficial in the control groups at 1 week (20 ± 14%, percent estimate, ± 90% confidence limits) and 4-week post-intervention (25 ± 23%). The intermittent and live-high hypoxic groups experienced additional likely beneficial gains at 1 week (13 ± 16%; 13 ± 15%) and 4-week post
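
    The standard-error weighting at the core of such a meta-analytic model can be illustrated with a minimal fixed-effect inverse-variance sketch (the full mixed model in the paper is richer, with fixed modifiers and random study effects). Here we pool the two 1-week hypoxic estimates quoted above, converting their ±90% confidence limits to standard errors by dividing by 1.645.

```python
import numpy as np

def pooled_estimate(effects, ses):
    """Fixed-effect inverse-variance pooling: weight each study-estimate
    by 1/SE^2; pooled SE is the square root of the inverse weight sum."""
    w = 1.0 / np.asarray(ses, dtype=float) ** 2
    est = float(np.sum(w * effects) / np.sum(w))
    se = float(np.sqrt(1.0 / np.sum(w)))
    return est, se

# the two 1-week hypoxic-group gains quoted above: 13 +/- 16% and
# 13 +/- 15% (90% confidence limits -> SE = CL / 1.645)
est, se = pooled_estimate([13.0, 13.0], [16.0 / 1.645, 15.0 / 1.645])
cl90 = 1.645 * se   # pooled 90% confidence limits, ~ +/- 11%
```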

  8. Teaching childbirth with high-fidelity simulation. Is it better observing the scenario during the briefing session?

    Science.gov (United States)

    Cuerva, Marcos J; Piñel, Carlos S; Martin, Lourdes; Espinosa, Jose A; Corral, Octavio J; Mendoza, Nicolás

    2018-02-12

    The design of optimal courses for undergraduate obstetric teaching is a relevant question. This study evaluates two designs of a simulator-based learning activity on childbirth with regard to respect for the patient, obstetric manoeuvres, interpretation of cardiotocography (CTG) tracings and infection prevention. In this randomised experimental study, two groups of undergraduate students performed the same simulator-based learning activity on childbirth but received different briefing sessions. The first group's briefing included observation of a scenario performed properly by the teachers according to Spanish clinical practice guidelines on care in normal childbirth, whereas the second group observed the properly performed scenario only after the simulation. The group that observed the properly performed scenario after the simulation obtained worse grades during the simulation, but better grades during the debriefing and evaluation. Simulator use in childbirth may therefore be more fruitful when medical students observe correct performance at the completion of the scenario rather than at its start. Impact statement What is already known on this subject? There is a scarcity of literature about the design of optimal high-fidelity simulation training in childbirth. Preparing simulator-based learning activities is a complex process. Simulator-based learning includes the following steps: briefing, simulation, debriefing and evaluation. The most important part of high-fidelity simulation is the debriefing, and a good briefing and simulation are of high relevance for a fruitful debriefing session. What do the results of this study add? Our study describes a full simulator-based learning activity on childbirth that can be reproduced in similar facilities. The findings of this study add that high-fidelity simulation training in

  9. A high performance scientific cloud computing environment for materials simulations

    Science.gov (United States)

    Jorissen, K.; Vila, F. D.; Rehr, J. J.

    2012-09-01

    We describe the development of a scientific cloud computing (SCC) platform that offers high performance computation capability. The platform consists of a scientific virtual machine prototype containing a UNIX operating system and several materials science codes, together with essential interface tools (an SCC toolset) that offers functionality comparable to local compute clusters. In particular, our SCC toolset provides automatic creation of virtual clusters for parallel computing, including tools for execution and monitoring performance, as well as efficient I/O utilities that enable seamless connections to and from the cloud. Our SCC platform is optimized for the Amazon Elastic Compute Cloud (EC2). We present benchmarks for prototypical scientific applications and demonstrate performance comparable to local compute clusters. To facilitate code execution and provide user-friendly access, we have also integrated cloud computing capability in a JAVA-based GUI. Our SCC platform may be an alternative to traditional HPC resources for materials science or quantum chemistry applications.

  10. Collective efficacy in a high-fidelity simulation of an airline operations center

    Science.gov (United States)

    Jinkerson, Shanna

    This study investigated the relationships between collective efficacy, teamwork, and team performance. Participants were placed into teams, where they worked together in a high-fidelity simulation of an airline operations center. Each individual was assigned a different role representing a different job within an airline (Flight Operations Coordinator, Crew Scheduling, Maintenance, Weather, Flight Scheduling, or Flight Planning). Participants completed a total of three simulations with an After Action Review between each. Within this setting, both team performance and teamwork behaviors were shown to be positively related to expectations for subsequent performance (collective efficacy). However, teamwork and collective efficacy were not shown to be concomitantly related to subsequent team performance. A chi-square test was used to evaluate the existence of performance spirals, which were not supported. The results of this study were likely impacted by a lack of power, as well as a lack of consistency across the three simulations.

  11. Enabling high performance computational science through combinatorial algorithms

    International Nuclear Information System (INIS)

    Boman, Erik G; Bozdag, Doruk; Catalyurek, Umit V; Devine, Karen D; Gebremedhin, Assefaw H; Hovland, Paul D; Pothen, Alex; Strout, Michelle Mills

    2007-01-01

    The Combinatorial Scientific Computing and Petascale Simulations (CSCAPES) Institute is developing algorithms and software for combinatorial problems that play an enabling role in scientific and engineering computations. Discrete algorithms will be increasingly critical for achieving high performance for irregular problems on petascale architectures. This paper describes recent contributions by researchers at the CSCAPES Institute in the areas of load balancing, parallel graph coloring, performance improvement, and parallel automatic differentiation
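
    One of the combinatorial kernels named above, graph coloring, can be sketched in its simplest sequential form (our own illustration, not CSCAPES code; the Institute's parallel algorithms build on and extend this first-fit idea): each vertex receives the smallest color not already used by a colored neighbor.

```python
def greedy_color(adj):
    """First-fit greedy distance-1 coloring: for each vertex, pick the
    smallest color not used by an already-colored neighbor."""
    color = {}
    for v in sorted(adj):            # any sequential vertex order works
        used = {color[u] for u in adj[v] if u in color}
        c = 0
        while c in used:
            c += 1
        color[v] = c
    return color

# a 5-cycle: an odd cycle needs three colors
c5 = {0: [1, 4], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3, 0]}
coloring = greedy_color(c5)
```

    In parallel variants, vertices are colored speculatively and conflicts between neighbors colored concurrently are detected and recolored in a follow-up pass.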

  12. Enabling high performance computational science through combinatorial algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Boman, Erik G [Discrete Algorithms and Math Department, Sandia National Laboratories (United States); Bozdag, Doruk [Biomedical Informatics, and Electrical and Computer Engineering, Ohio State University (United States); Catalyurek, Umit V [Biomedical Informatics, and Electrical and Computer Engineering, Ohio State University (United States); Devine, Karen D [Discrete Algorithms and Math Department, Sandia National Laboratories (United States); Gebremedhin, Assefaw H [Computer Science and Center for Computational Science, Old Dominion University (United States); Hovland, Paul D [Mathematics and Computer Science Division, Argonne National Laboratory (United States); Pothen, Alex [Computer Science and Center for Computational Science, Old Dominion University (United States); Strout, Michelle Mills [Computer Science, Colorado State University (United States)

    2007-07-15

    The Combinatorial Scientific Computing and Petascale Simulations (CSCAPES) Institute is developing algorithms and software for combinatorial problems that play an enabling role in scientific and engineering computations. Discrete algorithms will be increasingly critical for achieving high performance for irregular problems on petascale architectures. This paper describes recent contributions by researchers at the CSCAPES Institute in the areas of load balancing, parallel graph coloring, performance improvement, and parallel automatic differentiation.

  13. Thermomechanical simulations and experimental validation for high speed incremental forming

    Science.gov (United States)

    Ambrogio, Giuseppina; Gagliardi, Francesco; Filice, Luigino; Romero, Natalia

    2016-10-01

    Incremental sheet forming (ISF) consists in deforming only a small region of the workpiece through a punch driven by an NC machine. The drawback of this process is its slowness. In this study, a high-speed variant has been investigated from both numerical and experimental points of view. The aim has been the design of an FEM model able to reproduce the material behavior during the high-speed process by defining a thermomechanical model. An experimental campaign has been performed on a CNC lathe at high speed to test process feasibility. The first results have shown that the material presents the same performance as in conventional-speed ISF and, in some cases, better material behavior due to the temperature increment. An accurate numerical simulation has been performed to investigate the material behavior during the high-speed process, substantially confirming the experimental evidence.
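
    The punch motion in ISF is commonly programmed as a helical toolpath; a hypothetical generator (parameters invented for illustration, not taken from the paper) can be sketched as:

```python
import numpy as np

def helical_path(radius=40.0, step_down=0.5, revs=20, pts_per_rev=90):
    """Helical ISF toolpath: constant wall radius in x-y, with a
    continuous step-down of `step_down` mm per revolution in z."""
    t = np.linspace(0.0, 2.0 * np.pi * revs, revs * pts_per_rev)
    x = radius * np.cos(t)
    y = radius * np.sin(t)
    z = -step_down * t / (2.0 * np.pi)   # depth grows with the angle swept
    return np.column_stack([x, y, z])

path = helical_path()   # (N, 3) array of punch positions in mm
```

    In the high-speed variant, the same path is traversed at a much higher feed rate, which is what drives the temperature increment discussed above.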

  14. Team Culture and Business Strategy Simulation Performance

    Science.gov (United States)

    Ritchie, William J.; Fornaciari, Charles J.; Drew, Stephen A. W.; Marlin, Dan

    2013-01-01

    Many capstone strategic management courses use computer-based simulations as core pedagogical tools. Simulations are touted as assisting students in developing much-valued skills in strategy formation, implementation, and team management in the pursuit of superior strategic performance. However, despite their rich nature, little is known regarding…

  15. GROMACS 4.5: a high-throughput and highly parallel open source molecular simulation toolkit.

    Science.gov (United States)

    Pronk, Sander; Páll, Szilárd; Schulz, Roland; Larsson, Per; Bjelkmar, Pär; Apostolov, Rossen; Shirts, Michael R; Smith, Jeremy C; Kasson, Peter M; van der Spoel, David; Hess, Berk; Lindahl, Erik

    2013-04-01

    Molecular simulation has historically been a low-throughput technique, but faster computers and increasing amounts of genomic and structural data are changing this by enabling large-scale automated simulation of, for instance, many conformers or mutants of biomolecules with or without a range of ligands. At the same time, advances in performance and scaling now make it possible to model complex biomolecular interaction and function in a manner directly testable by experiment. These applications share a need for fast and efficient software that can be deployed on massive scale in clusters, web servers, distributed computing or cloud resources. Here, we present a range of new simulation algorithms and features developed during the past 4 years, leading up to the GROMACS 4.5 software package. The software now automatically handles wide classes of biomolecules, such as proteins, nucleic acids and lipids, and comes with all commonly used force fields for these molecules built-in. GROMACS supports several implicit solvent models, as well as new free-energy algorithms, and the software now uses multithreading for efficient parallelization even on low-end systems, including windows-based workstations. Together with hand-tuned assembly kernels and state-of-the-art parallelization, this provides extremely high performance and cost efficiency for high-throughput as well as massively parallel simulations. GROMACS is an open source and free software available from http://www.gromacs.org. Supplementary data are available at Bioinformatics online.

  16. Performance evaluation of sea surface simulation methods for target detection

    Science.gov (United States)

    Xia, Renjie; Wu, Xin; Yang, Chen; Han, Yiping; Zhang, Jianqi

    2017-11-01

    With the fast development of sea surface target detection by optoelectronic sensors, machine learning has been adopted to improve detection performance, since many features can be learned automatically from training images. However, field images of sea surface targets are not sufficient as training data, and 3D scene simulation is a promising method to address this problem. For ocean scene simulation, sea surface height field generation is the key point in achieving high fidelity. In this paper, two spectrum-based height field generation methods are evaluated: the linear superposition method and the linear filter method are compared quantitatively with a statistical model. 3D ocean scene simulation results show the different features of the two methods, which can serve as a reference for synthesizing sea surface target images under different ocean conditions.

  17. Evaluating TCMS Train-to-Ground communication performances based on the LTE technology and discreet event simulations

    DEFF Research Database (Denmark)

    Bouaziz, Maha; Yan, Ying; Kassab, Mohamed

    2018-01-01

    The paper considers the LTE (Long Term Evolution) network as an alternative communication technology for TCMS train-to-ground exchanges, instead of GSM-R (Global System for Mobile communications-Railway), because of some capacity and capability limits of the latter. In a first step, a pure simulation is used to evaluate the network load for a high-speed scenario, when the LTE network is shared between the train and different passengers. The simulation is based on the discrete-event network simulator Riverbed Modeler. The second step focusses on a co-simulation testbed, to evaluate performance with real traffic, based on Hardware-In-The-Loop and OpenAirInterface modules. Preliminary simulation and co-simulation results show that LTE provides good performance for the TCMS traffic exchange in terms of packet delay and data integrity.

  18. A high performance scientific cloud computing environment for materials simulations

    OpenAIRE

    Jorissen, Kevin; Vila, Fernando D.; Rehr, John J.

    2011-01-01

    We describe the development of a scientific cloud computing (SCC) platform that offers high performance computation capability. The platform consists of a scientific virtual machine prototype containing a UNIX operating system and several materials science codes, together with essential interface tools (an SCC toolset) that offers functionality comparable to local compute clusters. In particular, our SCC toolset provides automatic creation of virtual clusters for parallel computing, including...

  19. Development and verification of a high performance multi-group SP3 transport capability in the ARTEMIS core simulator

    International Nuclear Information System (INIS)

    Van Geemert, Rene

    2008-01-01

    For satisfaction of future global customer needs, dedicated efforts are being coordinated internationally and pursued continuously at AREVA NP. The currently ongoing CONVERGENCE project is committed to the development of the ARCADIA® next generation core simulation software package. ARCADIA® will be put to global use by all AREVA NP business regions, for the entire spectrum of core design processes, licensing computations and safety studies. As part of the currently ongoing trend towards more sophisticated neutronics methodologies, an SP3 nodal transport concept has been developed for ARTEMIS, which is the steady-state and transient core simulation part of ARCADIA®. For enabling a high computational performance, the SPN calculations are accelerated by applying multi-level coarse mesh re-balancing. In the current implementation, SP3 is about 1.4 times as expensive computationally as SP1 (diffusion). The developed SP3 solution concept is foreseen as the future computational workhorse for many-group 3D pin-by-pin full core computations by ARCADIA®. With the entire numerical workload being highly parallelizable through domain decomposition techniques, associated CPU-time requirements that adhere to the efficiency needs in the nuclear industry can be expected to become feasible in the near future. The accuracy enhancement obtainable by using SP3 instead of SP1 has been verified by a detailed comparison of ARTEMIS 16-group pin-by-pin SPN results with KAERI's DeCART reference results for the 2D pin-by-pin Purdue UO2/MOX benchmark. This article presents the accuracy enhancement verification and quantifies the achieved ARTEMIS-SP3 computational performance for a number of 2D and 3D multi-group and multi-box (up to pin-by-pin) core computations. (authors)

  20. Turbocharged molecular discovery of OLED emitters: from high-throughput quantum simulation to highly efficient TADF devices

    Science.gov (United States)

    Gómez-Bombarelli, Rafael; Aguilera-Iparraguirre, Jorge; Hirzel, Timothy D.; Ha, Dong-Gwang; Einzinger, Markus; Wu, Tony; Baldo, Marc A.; Aspuru-Guzik, Alán.

    2016-09-01

    Discovering new OLED emitters requires many experiments to synthesize candidates and test performance in devices. Large scale computer simulation can greatly speed this search process but the problem remains challenging enough that brute force application of massive computing power is not enough to successfully identify novel structures. We report a successful High Throughput Virtual Screening study that leveraged a range of methods to optimize the search process. The generation of candidate structures was constrained to contain combinatorial explosion. Simulations were tuned to the specific problem and calibrated with experimental results. Experimentalists and theorists actively collaborated such that experimental feedback was regularly utilized to update and shape the computational search. Supervised machine learning methods prioritized candidate structures prior to quantum chemistry simulation to prevent wasting compute on likely poor performers. With this combination of techniques, each multiplying the strength of the search, this effort managed to navigate an area of molecular space and identify hundreds of promising OLED candidate structures. An experimentally validated selection of this set shows emitters with external quantum efficiencies as high as 22%.
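    The prioritization strategy described above (a cheap learned surrogate ranks candidates so the expensive quantum-chemistry budget is spent only on promising ones) can be sketched generically. The candidate set, surrogate, and simulator below are hypothetical placeholders, not the paper's models.

```python
def screen(candidates, surrogate_score, expensive_sim, budget):
    """Rank candidates by a cheap surrogate and simulate only the top `budget`.

    Returns {candidate: simulated_result} for the prioritized subset; the rest
    of the pool never consumes expensive simulation time.
    """
    ranked = sorted(candidates, key=surrogate_score, reverse=True)
    return {c: expensive_sim(c) for c in ranked[:budget]}

# Toy example: true merit is c * (10 - c); the surrogate sees a shifted copy.
candidates = range(10)
results = screen(candidates,
                 surrogate_score=lambda c: c * (10 - c) + 0.5,
                 expensive_sim=lambda c: c * (10 - c),
                 budget=3)
```

    In practice the surrogate would be retrained as new simulation results arrive, mirroring the experimental-feedback loop the abstract emphasizes.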

  1. High performance computing applied to simulation of the flow in pipes; Computacao de alto desempenho aplicada a simulacao de escoamento em dutos

    Energy Technology Data Exchange (ETDEWEB)

    Cozin, Cristiane; Lueders, Ricardo; Morales, Rigoberto E.M. [Universidade Tecnologica Federal do Parana (UTFPR), Curitiba, PR (Brazil). Dept. de Engenharia Mecanica

    2008-07-01

    In recent years, computer clusters have emerged as a real alternative for solving problems that require high-performance computing, driving the development of new applications. Among them, flow simulation represents a real computational burden, especially for large systems. This work presents a study of the use of parallel computing for numerical simulation of fluid flow in pipelines. A mathematical flow model is solved numerically. In general, this procedure leads to a tridiagonal system of equations suitable for solution by a parallel algorithm. In this work, this is accomplished by a parallel odd-even reduction method found in the literature, implemented in the Fortran programming language. A computational platform composed of twelve processors was used. Measurements of CPU time for different tridiagonal system sizes and numbers of processors were obtained, highlighting the communication time between processors as an important issue to be considered when evaluating the performance of parallel applications. (author)
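    A sketch of the odd-even (cyclic) reduction idea referenced in the abstract: each level eliminates the odd-indexed unknowns with row operations that are mutually independent (hence parallelizable), halving the system until a single equation remains. This serial pure-Python version only illustrates the recurrence; the paper's implementation is in Fortran on a processor cluster.

```python
def odd_even_reduction(a, b, c, d):
    """Solve the tridiagonal system a[i]*x[i-1] + b[i]*x[i] + c[i]*x[i+1] = d[i]
    (with a[0] == c[-1] == 0) by recursive odd-even (cyclic) reduction."""
    n = len(b)
    if n == 1:
        return [d[0] / b[0]]
    na, nb, nc, nd = [], [], [], []
    for i in range(0, n, 2):            # these row updates are independent
        alpha = a[i] / b[i - 1] if i > 0 else 0.0
        gamma = c[i] / b[i + 1] if i < n - 1 else 0.0
        na.append(-alpha * a[i - 1] if i > 0 else 0.0)
        bb = b[i]
        if i > 0:
            bb -= alpha * c[i - 1]
        if i < n - 1:
            bb -= gamma * a[i + 1]
        nb.append(bb)
        nc.append(-gamma * c[i + 1] if i < n - 1 else 0.0)
        dd = d[i]
        if i > 0:
            dd -= alpha * d[i - 1]
        if i < n - 1:
            dd -= gamma * d[i + 1]
        nd.append(dd)
    x = [0.0] * n
    for j, v in enumerate(odd_even_reduction(na, nb, nc, nd)):
        x[2 * j] = v                    # even unknowns from the reduced solve
    for i in range(1, n, 2):            # back-substitute eliminated odd unknowns
        right = c[i] * x[i + 1] if i < n - 1 else 0.0
        x[i] = (d[i] - a[i] * x[i - 1] - right) / b[i]
    return x
```

    On a parallel machine each level's eliminations run concurrently, which is why communication time between processors, rather than arithmetic, tends to dominate the scaling behaviour the authors measure.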

  2. Highly immersive virtual reality laparoscopy simulation: development and future aspects.

    Science.gov (United States)

    Huber, Tobias; Wunderling, Tom; Paschold, Markus; Lang, Hauke; Kneist, Werner; Hansen, Christian

    2018-02-01

    Virtual reality (VR) applications with head-mounted displays (HMDs) have had an impact on information and multimedia technologies. The current work aimed to describe the process of developing a highly immersive VR simulation for laparoscopic surgery. We combined a VR laparoscopy simulator (LapSim) and a VR-HMD to create a user-friendly VR simulation scenario. Continuous clinical feedback was an essential aspect of the development process. We created an artificial VR (AVR) scenario by integrating the simulator video output with VR game components of figures and equipment in an operating room. We also created a highly immersive VR surrounding (IVR) by integrating the simulator video output with a [Formula: see text] video of a standard laparoscopy scenario in the department's operating room. Clinical feedback led to optimization of the visualization, synchronization, and resolution of the virtual operating rooms (in both the IVR and the AVR). Preliminary testing results revealed that individuals experienced a high degree of exhilaration and presence, with rare events of motion sickness. The technical performance showed no significant difference compared to that achieved with the standard LapSim. Our results provided a proof of concept for the technical feasibility of a custom highly immersive VR-HMD setup. Future technical research is needed to improve the visualization, immersion, and capability of interacting within the virtual scenario.

  3. Parameters that affect parallel processing for computational electromagnetic simulation codes on high performance computing clusters

    Science.gov (United States)

    Moon, Hongsik

    What is the impact of multicore and associated advanced technologies on computational software for science? Most researchers and students have multicore laptops or desktops for their research and they need computing power to run computational software packages. Computing power was initially derived from Central Processing Unit (CPU) clock speed. That changed when increases in clock speed became constrained by power requirements. Chip manufacturers turned to multicore CPU architectures and associated technological advancements to create the CPUs for the future. Most software applications benefited by the increased computing power the same way that increases in clock speed helped applications run faster. However, for Computational ElectroMagnetics (CEM) software developers, this change was not an obvious benefit - it appeared to be a detriment. Developers were challenged to find a way to correctly utilize the advancements in hardware so that their codes could benefit. The solution was parallelization and this dissertation details the investigation to address these challenges. Prior to multicore CPUs, advanced computer technologies were compared with the performance using benchmark software and the metric was FLoating-point Operations Per Second (FLOPS), which indicates system performance for scientific applications that make heavy use of floating-point calculations. Is FLOPS an effective metric for parallelized CEM simulation tools on new multicore systems? Parallel CEM software needs to be benchmarked not only by FLOPS but also by the performance of other parameters related to type and utilization of the hardware, such as CPU, Random Access Memory (RAM), hard disk, network, etc. The codes need to be optimized for more than just FLOPs and new parameters must be included in benchmarking. In this dissertation, the parallel CEM software named High Order Basis Based Integral Equation Solver (HOBBIES) is introduced. This code was developed to address the needs of the

  4. Maintenance Personnel Performance Simulation (MAPPS) model

    International Nuclear Information System (INIS)

    Siegel, A.I.; Bartter, W.D.; Wolf, J.J.; Knee, H.E.; Haas, P.M.

    1984-01-01

    A stochastic computer model for simulating the actions and behavior of nuclear power plant maintenance personnel is described. The model considers personnel, environmental, and motivational variables to yield predictions of maintenance performance quality and time to perform. The model has been fully developed and sensitivity tested. Additional evaluation of the model is now taking place.

  5. LIAR: A COMPUTER PROGRAM FOR THE SIMULATION AND MODELING OF HIGH PERFORMANCE LINACS

    International Nuclear Information System (INIS)

    Adolphsen, Chris

    2003-01-01

    The computer program LIAR ("LInear Accelerator Research code") is a numerical simulation and tracking program for linear colliders. The LIAR project was started at SLAC in August 1995 in order to provide a computing and simulation tool that specifically addresses the needs of high energy linear colliders. LIAR is designed to be used for a variety of different linear accelerators. It has been applied for and checked against the existing Stanford Linear Collider (SLC) as well as the linacs of the proposed Next Linear Collider (NLC) and the proposed Linac Coherent Light Source (LCLS). The program includes wakefield effects, a 4D coupled beam description, specific optimization algorithms and other advanced features. We describe the most important concepts and highlights of the program. After having presented the LIAR program at the LINAC96 and the PAC97 conferences, we now introduce it to the European particle accelerator community.

  6. Development of the McGill simulator for endoscopic sinus surgery: a new high-fidelity virtual reality simulator for endoscopic sinus surgery.

    Science.gov (United States)

    Varshney, Rickul; Frenkiel, Saul; Nguyen, Lily H P; Young, Meredith; Del Maestro, Rolando; Zeitouni, Anthony; Tewfik, Marc A

    2014-01-01

    The technical challenges of endoscopic sinus surgery (ESS) and the high risk of complications support the development of alternative modalities to train residents in these procedures. Virtual reality simulation is becoming a useful tool for training the skills necessary for minimally invasive surgery; however, there are currently no ESS virtual reality simulators available with valid evidence supporting their use in resident education. Our aim was to develop a new rhinology simulator, as well as to define potential performance metrics for trainee assessment. The McGill simulator for endoscopic sinus surgery (MSESS), a new sinus surgery virtual reality simulator with haptic feedback, was developed (a collaboration between the McGill University Department of Otolaryngology-Head and Neck Surgery, the Montreal Neurologic Institute Simulation Lab, and the National Research Council of Canada). A panel of experts in education, performance assessment, rhinology, and skull base surgery convened to identify core technical abilities that would need to be taught by the simulator, as well as performance metrics to be developed and captured. The MSESS allows the user to perform basic sinus surgery skills, such as an ethmoidectomy and sphenoidotomy, through the use of endoscopic tools in a virtual nasal model. The performance metrics were developed by an expert panel and include measurements of safety, quality, and efficiency of the procedure. The MSESS incorporates novel technological advancements to create a realistic platform for trainees. To our knowledge, this is the first simulator to combine novel tools such as the endonasal wash and elaborate anatomic deformity with advanced performance metrics for ESS.

  7. Highly automated driving, secondary task performance, and driver state.

    Science.gov (United States)

    Merat, Natasha; Jamson, A Hamish; Lai, Frank C H; Carsten, Oliver

    2012-10-01

    A driving simulator study compared the effect of changes in workload on performance in manual and highly automated driving. Changes in driver state were also observed by examining variations in blink patterns. With the addition of a greater number of advanced driver assistance systems in vehicles, the driver's role is likely to alter in the future from an operator in manual driving to a supervisor of highly automated cars. Understanding the implications of such advancements on drivers and road safety is important. A total of 50 participants were recruited for this study and drove the simulator in both manual and highly automated mode. As well as comparing the effect of adjustments in driving-related workload on performance, the effect of a secondary Twenty Questions Task was also investigated. In the absence of the secondary task, drivers' response to critical incidents was similar in manual and highly automated driving conditions. The worst performance was observed when drivers were required to regain control of driving in the automated mode while distracted by the secondary task. Blink frequency patterns were more consistent for manual than automated driving but were generally suppressed during conditions of high workload. Highly automated driving did not have a deleterious effect on driver performance, when attention was not diverted to the distracting secondary task. As the number of systems implemented in cars increases, an understanding of the implications of such automation on drivers' situation awareness, workload, and ability to remain engaged with the driving task is important.

  8. Key performance indicators for successful simulation projects

    OpenAIRE

    Jahangirian, M; Taylor, SJE; Young, T; Robinson, S

    2016-01-01

    There are many factors that may contribute to the successful delivery of a simulation project. To provide a structured approach to assessing the impact various factors have on project success, we propose a top-down framework whereby 15 Key Performance Indicators (KPI) are developed that represent the level of successfulness of simulation projects from various perspectives. They are linked to a set of Critical Success Factors (CSF) as reported in the simulation literature. A single measure cal...

  9. 2D simulation and performance evaluation of bifacial rear local contact c-Si solar cells under variable illumination conditions

    KAUST Repository

    Katsaounis, Theodoros; Kotsovos, Konstantinos; Gereige, Issam; Al-Saggaf, Ahmed; Tzavaras, Athanasios

    2017-01-01

    A customized 2D computational tool has been developed to simulate bifacial rear local contact PERC type PV structures based on the numerical solution of the transport equations through the finite element method. Simulations were performed under various device material parameters and back contact geometry configurations in order to optimize bifacial solar cell performance under different simulated illumination conditions. Bifacial device maximum power output was also compared with the monofacial equivalent one and the industrial standard Al-BSF structure. The performance of the bifacial structure during highly diffused irradiance conditions commonly observed in the Middle East region due to high concentrations of airborne dust particles was also investigated. Simulation results demonstrated that such conditions are highly favorable for the bifacial device because of the significantly increased diffuse component of the solar radiation which enters the back cell surface.


  11. High performance APCS conceptual design and evaluation scoping study

    International Nuclear Information System (INIS)

    Soelberg, N.; Liekhus, K.; Chambers, A.; Anderson, G.

    1998-02-01

    This Air Pollution Control System (APCS) Conceptual Design and Evaluation study was conducted to evaluate a high-performance APC system for minimizing air emissions from mixed waste thermal treatment systems. Seven variations of high-performance APCS designs were conceptualized using several design objectives. One of the system designs was selected for detailed process simulation using ASPEN PLUS to determine material and energy balances and evaluate performance. Installed system capital costs were also estimated. Sensitivity studies were conducted to evaluate the incremental cost and benefit of added carbon adsorber beds for mercury control, selective catalytic reduction for NOx control, and offgas retention tanks for holding the offgas until sample analysis is conducted to verify that the offgas meets emission limits. Results show that the high-performance dry-wet APCS can easily meet all expected emission limits except for possibly mercury. The capability to achieve high levels of mercury control (potentially necessary for thermally treating some DOE mixed streams) could not be validated using current performance data for mercury control technologies. The engineering approach and ASPEN PLUS modeling tool developed and used in this study identified APC equipment and system performance, size, cost, and other issues that are not yet resolved. These issues need to be addressed in feasibility studies and conceptual designs for new facilities or for determining how to modify existing facilities to meet expected emission limits. The ASPEN PLUS process simulation with current and refined input assumptions and calculations can be used to provide system performance information for decision-making, identifying best options, estimating costs, reducing the potential for emission violations, providing information needed for waste flow analysis, incorporating new APCS technologies in existing designs, or performing facility design and permitting activities.

  12. Simulation and performance of brushless DC motor actuators

    OpenAIRE

    Gerba, Alex

    1985-01-01

    The simulation model for a Brushless D.C. Motor and the associated commutation power conditioner transistor model are presented. The necessary conditions for maximum power output while operating at steady-state speed with sinusoidally distributed air-gap flux are developed. Comparisons of the simulated model with the measured performance of a typical motor are made both for time-response waveforms and for average performance characteristics. These preliminary results indicate good ...

  13. SLC injector simulation and tuning for high charge transport

    International Nuclear Information System (INIS)

    Yeremian, A.D.; Miller, R.H.; Clendenin, J.E.; Early, R.A.; Ross, M.C.; Turner, J.L.; Wang, J.W.

    1992-01-01

    We have simulated the SLC injector from the thermionic gun through the first accelerating section and used the resulting parameters to tune the injector for optimum performance and high charge transport. Simulations are conducted using PARMELA, a three-dimensional space-charge model. The magnetic field profile due to the existing magnetic optics is calculated using POISSON, while SUPERFISH is used to calculate the space harmonics of the various bunchers and the accelerator cavities. The initial beam conditions in the PARMELA code are derived from the EGUN model of the gun. The resulting injector parameters from the PARMELA simulation are used to prescribe experimental settings of the injector components. The experimental results are in agreement with the results of the integrated injector model. (Author) 5 figs., 7 refs

  14. The effects of bedrest on crew performance during simulated shuttle reentry. Volume 2: Control task performance

    Science.gov (United States)

    Jex, H. R.; Peters, R. A.; Dimarco, R. J.; Allen, R. W.

    1974-01-01

    A simplified space shuttle reentry simulation performed on the NASA Ames Research Center Centrifuge is described. Anticipating potentially deleterious effects of physiological deconditioning from orbital living (simulated here by 10 days of enforced bedrest) upon a shuttle pilot's ability to manually control his aircraft (should that be necessary in an emergency) a comprehensive battery of measurements was made roughly every 1/2 minute on eight military pilot subjects, over two 20-minute reentry Gz vs. time profiles, one peaking at 2 Gz and the other at 3 Gz. Alternate runs were made without and with g-suits to test the help or interference offered by such protective devices to manual control performance. A very demanding two-axis control task was employed, with a subcritical instability in the pitch axis to force a high attentional demand and a severe loss-of-control penalty. The results show that pilots experienced in high Gz flying can easily handle the shuttle manual control task during 2 Gz or 3 Gz reentry profiles, provided the degree of physiological deconditioning is no more than induced by these 10 days of enforced bedrest.

  15. Predictors of laparoscopic simulation performance among practicing obstetrician gynecologists.

    Science.gov (United States)

    Mathews, Shyama; Brodman, Michael; D'Angelo, Debra; Chudnoff, Scott; McGovern, Peter; Kolev, Tamara; Bensinger, Giti; Mudiraj, Santosh; Nemes, Andreea; Feldman, David; Kischak, Patricia; Ascher-Walsh, Charles

    2017-11-01

    While simulation training has been established as an effective method for improving laparoscopic surgical performance in surgical residents, few studies have focused on its use for attending surgeons, particularly in obstetrics and gynecology. Surgical simulation may have a role in improving and maintaining proficiency in the operating room for practicing obstetrician gynecologists. We sought to determine if parameters of performance for validated laparoscopic virtual simulation tasks correlate with surgical volume and characteristics of practicing obstetricians and gynecologists. All gynecologists with laparoscopic privileges (n = 347) from 5 academic medical centers in New York City were required to complete a laparoscopic surgery simulation assessment. The physicians took a presimulation survey gathering physician self-reported characteristics and then performed 3 basic skills tasks (enforced peg transfer, lifting/grasping, and cutting) on the LapSim virtual reality laparoscopic simulator (Surgical Science Ltd, Gothenburg, Sweden). The association between simulation outcome scores (time, efficiency, and errors) and self-rated clinical skills measures (self-rated laparoscopic skill score or surgical volume category) were examined with regression models. The average number of laparoscopic procedures per month was a significant predictor of total time on all 3 tasks (P = .001 for peg transfer; P = .041 for lifting and grasping; P simulation performance as it correlates to active physician practice, further studies may help assess skill and individualize training to maintain skill levels as case volumes fluctuate. Copyright © 2017 Elsevier Inc. All rights reserved.

  16. High-performance computing on GPUs for resistivity logging of oil and gas wells

    Science.gov (United States)

    Glinskikh, V.; Dudaev, A.; Nechaev, O.; Surodina, I.

    2017-10-01

    We developed and implemented in software an algorithm for high-performance simulation of electrical logs from oil and gas wells using high-performance heterogeneous computing. The numerical solution of the 2D forward problem is based on the finite-element method and the Cholesky decomposition for solving a system of linear algebraic equations (SLAE). Software implementations of the algorithm were made using the NVIDIA CUDA technology and computing libraries, allowing us to perform the decomposition of the SLAE and find its solution on the central processing unit (CPU) and graphics processing unit (GPU). The calculation time is analyzed depending on the matrix size and the number of its non-zero elements. We estimated the computing speed on CPU and GPU, including high-performance heterogeneous CPU-GPU computing. Using the developed algorithm, we simulated resistivity data in realistic models.
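    The solver structure described above (factor the symmetric positive-definite finite-element matrix once with Cholesky, then perform forward and backward triangular solves) can be sketched in dense pure Python. This only illustrates the shape of the algorithm; the paper's implementation operates on sparse matrices with CUDA kernels.

```python
import math

def cholesky_solve(A, b):
    """Solve A x = b for symmetric positive-definite A via A = L L^T."""
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):                       # factorization A = L L^T
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = math.sqrt(A[i][i] - s)
            else:
                L[i][j] = (A[i][j] - s) / L[j][j]
    y = [0.0] * n                            # forward solve: L y = b
    for i in range(n):
        y[i] = (b[i] - sum(L[i][k] * y[k] for k in range(i))) / L[i][i]
    x = [0.0] * n                            # backward solve: L^T x = y
    for i in reversed(range(n)):
        x[i] = (y[i] - sum(L[k][i] * x[k] for k in range(i + 1, n))) / L[i][i]
    return x
```

    Because the factorization is computed once, repeated solves for many right-hand sides (e.g. many logging-tool positions along a well) reuse the same factor, which is where most of the speedup of this decomposition approach comes from.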

  17. The COD Model: Simulating Workgroup Performance

    Science.gov (United States)

    Biggiero, Lucio; Sevi, Enrico

    Though the question of the determinants of workgroup performance is one of the most central in organization science, precise theoretical frameworks and formal demonstrations are still missing. To fill this gap, the COD agent-based simulation model is presented here and used to study the effects of task interdependence and bounded rationality on workgroup performance. The first relevant finding is an algorithmic demonstration of the ordering of interdependencies in terms of complexity, showing that the parallel mode is the simplest, followed by the sequential and then by the reciprocal. This result is far from being new in organization science, but what is remarkable is that now it has the strength of an algorithmic demonstration instead of being based on the authoritativeness of some scholar or on some episodic empirical finding. The second important result is that the progressive introduction of realistic limits to agents' rationality dramatically reduces workgroup performance and points to a rather interesting result: when agents' rationality is severely bounded, simple norms work better than complex norms. The third main finding is that when the complexity of interdependence is high, then the appropriate coordination mechanism is agents' direct and active collaboration, which means teamwork.
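    The complexity ordering of interdependence modes can be illustrated with a toy Monte Carlo. This is not the COD model itself: each agent's task here needs a geometric number of attempts, and the rules for how attempt counts combine per mode (max for parallel, sum for sequential, doubled sum for reciprocal mutual adjustment) are illustrative assumptions.

```python
import random

def attempts(p_success, rng):
    """Number of attempts until one success (a geometric draw)."""
    n = 1
    while rng.random() > p_success:
        n += 1
    return n

def completion_time(mode, n_agents, p_success, rng):
    t = [attempts(p_success, rng) for _ in range(n_agents)]
    if mode == "parallel":      # independent tasks run concurrently
        return max(t)
    if mode == "sequential":    # each task waits for the previous one
        return sum(t)
    if mode == "reciprocal":    # mutual adjustment: every task is revisited
        return 2 * sum(t)       # illustrative assumption, not the COD rule
    raise ValueError(mode)

rng = random.Random(0)
means = {m: sum(completion_time(m, 4, 0.7, rng) for _ in range(2000)) / 2000
         for m in ("parallel", "sequential", "reciprocal")}
```

    Even this crude sketch reproduces the qualitative ordering the abstract reports: expected completion time grows from parallel through sequential to reciprocal interdependence.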

  18. Proficiency performance benchmarks for removal of simulated brain tumors using a virtual reality simulator NeuroTouch.

    Science.gov (United States)

    AlZhrani, Gmaan; Alotaibi, Fahad; Azarnoush, Hamed; Winkler-Schwartz, Alexander; Sabbagh, Abdulrahman; Bajunaid, Khalid; Lajoie, Susanne P; Del Maestro, Rolando F

    2015-01-01

    Assessment of neurosurgical technical skills involved in the resection of cerebral tumors in operative environments is complex. Educators emphasize the need to develop and use objective and meaningful assessment tools that are reliable and valid for assessing trainees' progress in acquiring surgical skills. The purpose of this study was to develop proficiency performance benchmarks for a newly proposed set of objective measures (metrics) of neurosurgical technical skills performance during simulated brain tumor resection using a new virtual reality simulator (NeuroTouch). Each participant performed the resection of 18 simulated brain tumors of different complexity using the NeuroTouch platform. Surgical performance was computed using Tier 1 and Tier 2 metrics derived from NeuroTouch simulator data consisting of (1) safety metrics, including (a) volume of surrounding simulated normal brain tissue removed, (b) sum of forces utilized, and (c) maximum force applied during tumor resection; (2) quality of operation metric, which involved the percentage of tumor removed; and (3) efficiency metrics, including (a) instrument total tip path lengths and (b) frequency of pedal activation. All studies were conducted in the Neurosurgical Simulation Research Centre, Montreal Neurological Institute and Hospital, McGill University, Montreal, Canada. A total of 33 participants were recruited, including 17 experts (board-certified neurosurgeons) and 16 novices (7 senior and 9 junior neurosurgery residents). The results demonstrated that "expert" neurosurgeons resected less surrounding simulated normal brain tissue and less tumor tissue than residents. These data are consistent with the concept that "experts" focused more on safety of the surgical procedure compared with novices. By analyzing experts' neurosurgical technical skills performance on these different metrics, we were able to establish benchmarks for goal proficiency performance training of neurosurgery residents. This

  19. Measurements and simulation-based optimization of TIGRESS HPGe detector array performance

    International Nuclear Information System (INIS)

    Schumaker, M.A.

    2005-01-01

    TIGRESS is a new γ-ray detector array being developed for installation at the new ISAC-II facility at TRIUMF in Vancouver. When complete, it will consist of twelve large-volume segmented HPGe clover detectors, fitted with segmented Compton suppression shields. The combined operation of prototypes of both a TIGRESS detector and a suppression shield has been tested. Peak-to-total ratios, relative photopeak efficiencies, and energy resolution functions have been determined in order to characterize the performance of TIGRESS. This information was then used to refine a GEANT4 simulation of the full detector array. Using this simulation, methods to overcome the degradation of the photopeak efficiency and peak-to-total response that occurs with high γ-ray multiplicity events were explored. These methods take advantage of the high segmentation of both the HPGe clovers and the suppression shields to suppress or sum detector interactions selectively. For a range of γ-ray energies and multiplicities, optimal analysis methods have been determined, which has resulted in significant gains in the expected performance of TIGRESS. (author)

  20. Performance of technology-driven simulators for medical students--a systematic review.

    Science.gov (United States)

    Michael, Michael; Abboudi, Hamid; Ker, Jean; Shamim Khan, Mohammed; Dasgupta, Prokar; Ahmed, Kamran

    2014-12-01

    Simulation-based education has evolved as a key training tool in high-risk industries such as aviation and the military. In parallel with these industries, the benefits of incorporating specialty-oriented simulation training within medical schools are vast. Adoption of simulators into medical school education programs has shown great promise and has the potential to revolutionize modern undergraduate education. An English literature search was carried out using the MEDLINE, EMBASE, and PsycINFO databases to identify all randomized controlled studies pertaining to "technology-driven" simulators used in undergraduate medical education. A validity framework incorporating the "framework for technology enhanced learning" report by the Department of Health, United Kingdom, was used to evaluate the capabilities of each technology-driven simulator. Information was collected regarding the simulator type, characteristics, and brand name. Where possible, we extracted information from the studies on the simulators' performance with respect to validity status, reliability, feasibility, educational impact, acceptability, and cost effectiveness. We identified 19 studies, analyzing simulators for medical students across a variety of procedure-based specialties, including cardiovascular (n = 2), endoscopy (n = 3), laparoscopic surgery (n = 8), vascular access (n = 2), ophthalmology (n = 1), obstetrics and gynecology (n = 1), anesthesia (n = 1), and pediatrics (n = 1). Incorporation of simulators has so far been on an institutional level; no national or international trends have yet emerged. Simulators are capable of providing a highly educational and realistic experience for medical students within a variety of specialty-oriented teaching sessions. Further research is needed to establish how best to incorporate simulators into a more primary stage of medical education: preclinical and clinical undergraduate medicine. Copyright © 2014 Elsevier Inc. All rights

  1. Milking performance evaluation and factors affecting milking claw vacuum levels with flow simulator.

    Science.gov (United States)

    Enokidani, Masafumi; Kawai, Kazuhiro; Shinozuka, Yasunori; Watanabe, Aiko

    2017-08-01

    Milking performance of milking machines that matches the production capability of dairy cows is important in reducing the risk of mastitis, particularly in high-producing cows. This study used a simulated milking device to examine the milking performance of the milking systems of 73 dairy farms and to analyze the factors affecting claw vacuum. Mean claw vacuum and the range of fluctuation of claw vacuum (claw vacuum range) were measured at three different flow rates: 5.7, 7.6 and 8.7 kg/min. At the highest flow rate, only 16 farms (21.9%) met both standards of mean claw vacuum ≥35 kPa and claw vacuum range ≤7 kPa, showing that milking systems currently have poor milking performance. The factors affecting mean claw vacuum were claw type, milk meter and vacuum shut-off device; the factor affecting claw vacuum range was claw type. Examining the milking performance of a milking system with a simulated milking device makes it possible to verify whether the system can cope with high-producing cows, indicating the possibility of reducing the risk of mastitis caused by inappropriate claw vacuum. © 2016 Japanese Society of Animal Science.
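    The two pass/fail criteria used at the highest flow rate (mean claw vacuum ≥35 kPa, claw vacuum range ≤7 kPa) reduce to a simple predicate. A sketch with invented farm readings, not the study's data:

```python
def meets_standards(mean_vacuum_kpa, vacuum_range_kpa):
    """Check the two criteria applied in the study at the highest
    flow rate: mean claw vacuum >= 35 kPa AND claw vacuum
    fluctuation range <= 7 kPa."""
    return mean_vacuum_kpa >= 35.0 and vacuum_range_kpa <= 7.0

# illustrative readings (hypothetical farms, not study data)
farms = {"A": (38.2, 5.1), "B": (33.0, 6.0), "C": (36.5, 9.4)}
passing = [name for name, (m, r) in farms.items() if meets_standards(m, r)]
print(passing)  # ['A']
```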

  2. Simulation of press-forming for automobile part using ultra high tension steel

    Directory of Open Access Journals (Sweden)

    Tanabe I.

    2012-08-01

    Full Text Available In recent years, ultra high tension steel has gradually been used in the automobile industry. The development of press-forming technology is now essential by reason of its high productivity and high product quality. In this study, tensile tests were performed with a view to understanding the material properties. Press-forming tests were then carried out with regard to the behaviors of spring back and deep-drawability, and manufacturing a real product. The ultra high tension steel used in the experiments had a thickness of 1 mm and a tensile strength of 1000 MPa. Finally, simulations of spring back, deep-drawability and manufacturing a real product in ultra high tension steel were conducted and evaluated in order to calculate the optimum press-forming conditions and the optimum shape of the die. FEM with non-linear and dynamic analysis using Euler-Lagrange elements was used for the simulations. It is concluded from the results that (1) the simulations conformed to the results of the experiments, and (2) the simulations proved very effective for calculating the optimum press conditions and die shape.

  3. Cognitive load predicts point-of-care ultrasound simulator performance.

    Science.gov (United States)

    Aldekhyl, Sara; Cavalcanti, Rodrigo B; Naismith, Laura M

    2018-02-01

    The ability to maintain good performance with low cognitive load is an important marker of expertise. Incorporating cognitive load measurements in the context of simulation training may help to inform judgements of competence. This exploratory study investigated relationships between demographic markers of expertise, cognitive load measures, and simulator performance in the context of point-of-care ultrasonography. Twenty-nine medical trainees and clinicians at the University of Toronto with a range of clinical ultrasound experience were recruited. Participants answered a demographic questionnaire then used an ultrasound simulator to perform targeted scanning tasks based on clinical vignettes. Participants were scored on their ability to both acquire and interpret ultrasound images. Cognitive load measures included participant self-report, eye-based physiological indices, and behavioural measures. Data were analyzed using a multilevel linear modelling approach, wherein observations were clustered by participants. Experienced participants outperformed novice participants on ultrasound image acquisition. Ultrasound image interpretation was comparable between the two groups. Ultrasound image acquisition performance was predicted by level of training, prior ultrasound training, and cognitive load. There was significant convergence between cognitive load measurement techniques. A marginal model of ultrasound image acquisition performance including prior ultrasound training and cognitive load as fixed effects provided the best overall fit for the observed data. In this proof-of-principle study, the combination of demographic and cognitive load measures provided more sensitive metrics to predict ultrasound simulator performance. Performance assessments which include cognitive load can help differentiate between levels of expertise in simulation environments, and may serve as better predictors of skill transfer to clinical practice.
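    As a toy illustration of the modelling direction only (not the study's multilevel model, which clustered observations within participants), an ordinary least-squares slope of an invented acquisition score against an invented cognitive-load rating; all numbers below are fabricated for illustration:

```python
# Invented illustration data: self-reported cognitive-load rating
# versus an ultrasound image-acquisition score. The study fit a
# multilevel (mixed-effects) model; this flattens that to a single
# fixed effect to show the direction of the relationship.
load  = [2, 3, 4, 5, 6, 7, 8, 9]
score = [92, 88, 85, 80, 74, 70, 66, 60]

n = len(load)
mx = sum(load) / n
my = sum(score) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(load, score))
         / sum((x - mx) ** 2 for x in load))

print(round(slope, 2))  # -4.58: higher load predicts a lower score
```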

  4. GROMACS: High performance molecular simulations through multi-level parallelism from laptops to supercomputers

    Directory of Open Access Journals (Sweden)

    Mark James Abraham

    2015-09-01

    Full Text Available GROMACS is one of the most widely used open-source and free software codes in chemistry, used primarily for dynamical simulations of biomolecules. It provides a rich set of calculation types, preparation and analysis tools. Several advanced techniques for free-energy calculations are supported. In version 5, it reaches new performance heights, through several new and enhanced parallelization algorithms. These work on every level: SIMD registers inside cores, multithreading, heterogeneous CPU–GPU acceleration, state-of-the-art 3D domain decomposition, and ensemble-level parallelization through built-in replica exchange and the separate Copernicus framework. The latest best-in-class compressed trajectory storage format is supported.
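    The ensemble-level parallelization mentioned above rests on the standard replica-exchange acceptance rule: a swap between replicas at inverse temperatures β_i, β_j with potential energies E_i, E_j is accepted with probability min(1, exp[(β_i − β_j)(E_i − E_j)]). A minimal sketch of that criterion (a textbook illustration, not GROMACS code):

```python
import math
import random

def swap_accept(beta_i, beta_j, e_i, e_j, rng):
    """Metropolis acceptance test for exchanging configurations
    between two replicas at inverse temperatures beta_i, beta_j
    holding potential energies e_i, e_j."""
    delta = (beta_i - beta_j) * (e_i - e_j)
    # delta >= 0 is always accepted; otherwise accept with prob e^delta
    return delta >= 0 or rng.random() < math.exp(delta)

rng = random.Random(0)
# the colder replica (larger beta) holds the higher energy: always swap
print(swap_accept(1.0, 0.5, -80.0, -120.0, rng))  # True
```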

  5. Outcomes and challenges of global high-resolution non-hydrostatic atmospheric simulations using the K computer

    Science.gov (United States)

    Satoh, Masaki; Tomita, Hirofumi; Yashiro, Hisashi; Kajikawa, Yoshiyuki; Miyamoto, Yoshiaki; Yamaura, Tsuyoshi; Miyakawa, Tomoki; Nakano, Masuo; Kodama, Chihiro; Noda, Akira T.; Nasuno, Tomoe; Yamada, Yohei; Fukutomi, Yoshiki

    2017-12-01

    This article reviews the major outcomes of a 5-year (2011-2016) project using the K computer to perform global numerical atmospheric simulations based on the non-hydrostatic icosahedral atmospheric model (NICAM). The K computer was made available to the public in September 2012 and was used as a primary resource for Japan's Strategic Programs for Innovative Research (SPIRE), an initiative to investigate five strategic research areas; the NICAM project fell under the research area of climate and weather simulation sciences. Combining NICAM with high-performance computing has created new opportunities in three areas of research: (1) higher resolution global simulations that produce more realistic representations of convective systems, (2) multi-member ensemble simulations that are able to perform extended-range forecasts 10-30 days in advance, and (3) multi-decadal simulations for climatology and variability. Before the K computer era, NICAM was used to demonstrate realistic simulations of intra-seasonal oscillations including the Madden-Julian oscillation (MJO), merely as a case study approach. Thanks to the big leap in computational performance of the K computer, we could greatly increase the number of MJO events covered by numerical simulations, in addition to extending integration time and horizontal resolution. We conclude that the high-resolution global non-hydrostatic model, as used in this five-year project, improves the ability to forecast intra-seasonal oscillations and associated tropical cyclogenesis compared with that of the relatively coarser operational models currently in use. The impacts of the sub-kilometer resolution simulation and the multi-decadal simulations using NICAM are also reviewed.

  6. Status report on high fidelity reactor simulation

    International Nuclear Information System (INIS)

    Palmiotti, G.; Smith, M.; Rabiti, C.; Lewis, E.; Yang, W.; Leclere, M.; Siegel, A.; Fischer, P.; Kaushik, D.; Ragusa, J.; Lottes, J.; Smith, B.

    2006-01-01

    This report presents the effort under way at Argonne National Laboratory toward a comprehensive, integrated computational tool intended mainly for the high-fidelity simulation of sodium-cooled fast reactors. The main activities carried out involved neutronics, thermal hydraulics, coupling strategies, software architecture, and high-performance computing. A new neutronics code, UNIC, is being developed. The first phase involves the application of a spherical harmonics method to a general, unstructured three-dimensional mesh. The method also has been interfaced with a method of characteristics. The spherical harmonics equations were implemented in a stand-alone code that was then used to solve several benchmark problems. For thermal hydraulics, a computational fluid dynamics code called Nek5000, developed in the Mathematics and Computer Science Division for coupled hydrodynamics and heat transfer, has been applied to a single-pin, periodic cell in the wire-wrap geometry typical of advanced burner reactors. Numerical strategies for multiphysics coupling have been considered and higher-accuracy efficient methods proposed to finely simulate coupled neutronic/thermal-hydraulic reactor transients. Initial steps have been taken in order to couple UNIC and Nek5000, and simplified problems have been defined and solved for testing. Furthermore, we have begun developing a lightweight computational framework, based in part on carefully selected open source tools, to nonobtrusively and efficiently integrate the individual physics modules into a unified simulation tool

  7. Blaze-DEMGPU: Modular high performance DEM framework for the GPU architecture

    Directory of Open Access Journals (Sweden)

    Nicolin Govender

    2016-01-01

    Full Text Available Blaze-DEMGPU is a modular GPU-based discrete element method (DEM) framework that supports polyhedral shaped particles. The high-level performance is attributed to the lightweight design and the Single Instruction Multiple Data (SIMD) model that the GPU architecture offers. Blaze-DEMGPU offers suitable algorithms to conduct DEM simulations on the GPU and these algorithms can be extended and modified. Since a large number of scientific simulations are particle based, many of the algorithms and strategies for GPU implementation present in Blaze-DEMGPU can be applied to other fields. Blaze-DEMGPU will make it easier for new researchers to use high performance GPU computing as well as stimulate wider GPU research efforts by the DEM community.
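    The per-pair contact-force loop that a DEM kernel parallelizes can be sketched in one dimension with an assumed linear spring contact model and symplectic Euler time stepping (Blaze-DEMGPU itself handles polyhedral particles on the GPU; everything below is a simplified serial illustration):

```python
# Two equal spheres approach head-on, collide elastically through a
# linear repulsive spring acting on their overlap, and rebound.
# Parameters (radius, stiffness, mass, time step) are arbitrary.
R, K, M, DT = 0.5, 1.0e4, 1.0, 1.0e-4

x = [0.0, 2.0]   # centre positions
v = [1.0, -1.0]  # velocities

for _ in range(20000):
    overlap = 2 * R - (x[1] - x[0])
    f = K * overlap if overlap > 0 else 0.0  # repulsive contact force
    v[0] -= f / M * DT                       # equal and opposite impulses
    v[1] += f / M * DT
    x[0] += v[0] * DT                        # symplectic Euler position update
    x[1] += v[1] * DT

print(round(v[0] + v[1], 9))  # 0.0: total momentum is conserved
print(v[0] < 0 < v[1])        # True: the spheres have rebounded
```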

  8. Integrated heat transport simulation of high ion temperature plasma of LHD

    International Nuclear Information System (INIS)

    Murakami, S.; Yamaguchi, H.; Sakai, A.

    2014-10-01

    A first dynamical simulation of a high ion temperature plasma with carbon pellet injection in LHD is performed with the integrated simulation GNET-TD + TASK3D. The NBI heating deposition of the time-evolving plasma is evaluated by the 5D drift kinetic equation solver GNET-TD, and the heat transport of the multi-ion-species plasma (e, H, He, C) is studied by the integrated transport simulation code TASK3D. Achievement of the high ion temperature plasma is attributed to (1) an increase of heating power per ion due to the temporal increase of effective charge, (2) a reduction of effective neoclassical transport with impurities, and (3) a reduction of turbulent transport. The reduction of turbulent transport is the most significant contribution to achieving the high ion temperature, and the reduction of turbulent transport relative to the L-mode plasma (normal hydrogen plasma) is evaluated to be a factor of about five using the integrated heat transport simulation code. Applying a Z-effective-dependent turbulence reduction model, we obtain a time behavior of the ion temperature after the C pellet injection similar to the experimental results. (author)

  9. High-performance scientific computing in the cloud

    Science.gov (United States)

    Jorissen, Kevin; Vila, Fernando; Rehr, John

    2011-03-01

    Cloud computing has the potential to open up high-performance computational science to a much broader class of researchers, owing to its ability to provide on-demand, virtualized computational resources. However, before such approaches can become commonplace, user-friendly tools must be developed that hide the unfamiliar cloud environment and streamline the management of cloud resources for many scientific applications. We have recently shown that high-performance cloud computing is feasible for parallelized x-ray spectroscopy calculations. We now present benchmark results for a wider selection of scientific applications focusing on electronic structure and spectroscopic simulation software in condensed matter physics. These applications are driven by an improved portable interface that can manage virtual clusters and run various applications in the cloud. We also describe a next generation of cluster tools, aimed at improved performance and a more robust cluster deployment. Supported by NSF grant OCI-1048052.

  10. An accurate behavioral model for single-photon avalanche diode statistical performance simulation

    Science.gov (United States)

    Xu, Yue; Zhao, Tingchen; Li, Ding

    2018-01-01

    An accurate behavioral model is presented to simulate important statistical performance of single-photon avalanche diodes (SPADs), such as dark count and after-pulsing noise. The derived simulation model takes into account all important generation mechanisms of the two kinds of noise. For the first time, thermal agitation, trap-assisted tunneling and band-to-band tunneling mechanisms are simultaneously incorporated in the simulation model to evaluate dark count behavior of SPADs fabricated in deep sub-micron CMOS technology. Meanwhile, a complete carrier trapping and de-trapping process is considered in the after-pulsing model and a simple analytical expression is derived to estimate after-pulsing probability. In particular, the key model parameters of avalanche triggering probability and electric field dependence of excess bias voltage are extracted from Geiger-mode TCAD simulation, and this behavioral simulation model does not include any empirical parameters. The developed SPAD model is implemented in the Verilog-A behavioral hardware description language and successfully operated on the commercial Cadence Spectre simulator, showing good universality and compatibility. The model simulation results are in good accordance with the test data, validating high simulation accuracy.
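    The two noise sources the model captures can be mimicked with a toy Monte Carlo: primary dark counts as a Poisson process, with each avalanche triggering a correlated afterpulse at a fixed probability. The rate and probability below are invented placeholders, whereas the paper derives them from the underlying trap physics:

```python
import random

rng = random.Random(42)
DARK_RATE = 200.0     # primary dark counts per second (assumed value)
P_AFTERPULSE = 0.05   # afterpulse probability per avalanche (assumed value)
T_TOTAL = 10.0        # observation window, seconds

t, dark, after = 0.0, 0, 0
while True:
    t += rng.expovariate(DARK_RATE)   # exponential inter-arrival time
    if t > T_TOTAL:
        break
    dark += 1
    if rng.random() < P_AFTERPULSE:   # correlated secondary avalanche
        after += 1

# roughly DARK_RATE * T_TOTAL primary counts, ~5% of them afterpulsing
print(dark, after)
```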

  11. High-performance computational fluid dynamics: a custom-code approach

    International Nuclear Information System (INIS)

    Fannon, James; Náraigh, Lennon Ó; Loiseau, Jean-Christophe; Valluri, Prashant; Bethune, Iain

    2016-01-01

    We introduce a modified and simplified version of the pre-existing fully parallelized three-dimensional Navier–Stokes flow solver known as TPLS. We demonstrate how the simplified version can be used as a pedagogical tool for the study of computational fluid dynamics (CFDs) and parallel computing. TPLS is at its heart a two-phase flow solver, and uses calls to a range of external libraries to accelerate its performance. However, in the present context we narrow the focus of the study to basic hydrodynamics and parallel computing techniques, and the code is therefore simplified and modified to simulate pressure-driven single-phase flow in a channel, using only relatively simple Fortran 90 code with MPI parallelization, but no calls to any other external libraries. The modified code is analysed in order to both validate its accuracy and investigate its scalability up to 1000 CPU cores. Simulations are performed for several benchmark cases in pressure-driven channel flow, including a turbulent simulation, wherein the turbulence is incorporated via the large-eddy simulation technique. The work may be of use to advanced undergraduate and graduate students as an introductory study in CFDs, while also providing insight for those interested in more general aspects of high-performance computing. (paper)

  12. High-performance computational fluid dynamics: a custom-code approach

    Science.gov (United States)

    Fannon, James; Loiseau, Jean-Christophe; Valluri, Prashant; Bethune, Iain; Náraigh, Lennon Ó.

    2016-07-01

    We introduce a modified and simplified version of the pre-existing fully parallelized three-dimensional Navier-Stokes flow solver known as TPLS. We demonstrate how the simplified version can be used as a pedagogical tool for the study of computational fluid dynamics (CFDs) and parallel computing. TPLS is at its heart a two-phase flow solver, and uses calls to a range of external libraries to accelerate its performance. However, in the present context we narrow the focus of the study to basic hydrodynamics and parallel computing techniques, and the code is therefore simplified and modified to simulate pressure-driven single-phase flow in a channel, using only relatively simple Fortran 90 code with MPI parallelization, but no calls to any other external libraries. The modified code is analysed in order to both validate its accuracy and investigate its scalability up to 1000 CPU cores. Simulations are performed for several benchmark cases in pressure-driven channel flow, including a turbulent simulation, wherein the turbulence is incorporated via the large-eddy simulation technique. The work may be of use to advanced undergraduate and graduate students as an introductory study in CFDs, while also providing insight for those interested in more general aspects of high-performance computing.
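    The simplest of the benchmark cases mentioned, steady pressure-driven laminar flow between parallel plates, can be reproduced with a few lines of serial finite-difference code and checked against the analytic Poiseuille profile (a pedagogical sketch in the spirit of the simplified solver, not its Fortran/MPI implementation; grid size, viscosity and pressure gradient are arbitrary choices):

```python
# Solve mu * u''(y) = -G on 0 < y < H with u(0) = u(H) = 0 using
# second-order central differences and the Thomas (tridiagonal) algorithm.
N, H, MU, G = 49, 1.0, 1.0, 8.0   # interior nodes, gap, viscosity, -dp/dx
h = H / (N + 1)

# Tridiagonal system: -u[i-1] + 2 u[i] - u[i+1] = h^2 G / MU
a = [-1.0] * N                 # sub-diagonal
b = [2.0] * N                  # diagonal
c = [-1.0] * N                 # super-diagonal
d = [h * h * G / MU] * N       # right-hand side

for i in range(1, N):          # forward elimination
    w = a[i] / b[i - 1]
    b[i] -= w * c[i - 1]
    d[i] -= w * d[i - 1]
u = [0.0] * N
u[-1] = d[-1] / b[-1]
for i in range(N - 2, -1, -1):  # back substitution
    u[i] = (d[i] - c[i] * u[i + 1]) / b[i]

# analytic Poiseuille profile: u(y) = G/(2 mu) * y * (H - y)
exact = [G / (2 * MU) * (j * h) * (H - j * h) for j in range(1, N + 1)]
err = max(abs(ui - ei) for ui, ei in zip(u, exact))
print(err < 1e-10)  # True: central differences are exact for a parabola
```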

  13. COMPUTERS: Teraflops for Europe; EEC Working Group on High Performance Computing

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1991-03-15

    In little more than a decade, simulation on high performance computers has become an essential tool for theoretical physics, capable of solving a vast range of crucial problems inaccessible to conventional analytic mathematics. In many ways, computer simulation has become the calculus for interacting many-body systems, a key to the study of transitions from isolated to collective behaviour.

  14. COMPUTERS: Teraflops for Europe; EEC Working Group on High Performance Computing

    International Nuclear Information System (INIS)

    Anon.

    1991-01-01

    In little more than a decade, simulation on high performance computers has become an essential tool for theoretical physics, capable of solving a vast range of crucial problems inaccessible to conventional analytic mathematics. In many ways, computer simulation has become the calculus for interacting many-body systems, a key to the study of transitions from isolated to collective behaviour

  15. Wall modeling for the simulation of highly non-isothermal unsteady flows

    International Nuclear Information System (INIS)

    Devesa, A.

    2006-12-01

    Nuclear industry flows are most of the time characterized by their high Reynolds number, density variations (at low Mach numbers) and a highly unsteady behaviour (low to moderate frequencies). High Reynolds numbers are unaffordable for direct numerical simulation (DNS), and simulations must either be performed by solving averaged equations (RANS), or by solving only the large eddies (LES), both using a wall model. A first investigation of this thesis dealt with the derivation and test of two variable density wall models: an algebraic law (CWM) and a zonal approach dedicated to LES (TBLE-ρ). These models were validated in quasi-isothermal cases, before being used in academic and industrial non-isothermal flows with satisfactory results. Then, a numerical experiment of pulsed passive scalars was performed by DNS, where two forcing conditions were considered: oscillations are imposed in the outer flow; oscillations come from the wall. Several frequencies and amplitudes of oscillations were taken into account in order to gain insight into unsteady effects in the boundary layer, and to create a database for validating wall models in such a context. The temporal behaviour of the two wall models (algebraic and zonal) was studied and showed that the zonal model produced better results when used in the simulation of unsteady flows. (author)
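    A central ingredient of such an algebraic wall model is inverting the logarithmic law of the wall for the friction velocity. A sketch with the standard constants κ = 0.41, B = 5.2 and invented flow values (the thesis's CWM and TBLE-ρ models are more elaborate, accounting for density variations):

```python
import math

KAPPA, B = 0.41, 5.2  # standard log-law constants

def friction_velocity(U, y, nu, u_tau=0.05):
    """Solve U/u_tau = (1/kappa) ln(y u_tau / nu) + B for u_tau
    by Newton iteration, given the velocity U sampled at wall
    distance y in a fluid of kinematic viscosity nu."""
    for _ in range(50):
        yplus = y * u_tau / nu
        f = u_tau * (math.log(yplus) / KAPPA + B) - U
        df = math.log(yplus) / KAPPA + B + 1.0 / KAPPA
        u_tau -= f / df
    return u_tau

u_tau = friction_velocity(U=10.0, y=0.01, nu=1e-5)  # invented flow values
# residual of the log law at the returned friction velocity
res = u_tau * (math.log(0.01 * u_tau / 1e-5) / KAPPA + B) - 10.0
print(abs(res) < 1e-8)  # True: the log law is satisfied
```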

  16. High-Speed, High-Performance DQPSK Optical Links with Reduced Complexity VDFE Equalizers

    Directory of Open Access Journals (Sweden)

    Maki Nanou

    2017-02-01

    Full Text Available Optical transmission technologies optimized for optical network segments sensitive to power consumption and cost comprise modulation formats with direct detection technologies. Specifically, non-return to zero differential quaternary phase shift keying (NRZ-DQPSK in deployed fiber plants, combined with high-performance, low-complexity electronic equalizers to compensate residual impairments at the receiver end, can prove to be a viable solution for high-performance, high-capacity optical links. Joint processing of the constructive and the destructive signals at the single-ended DQPSK receiver provides improved performance compared to the balanced configuration, albeit at the expense of higher hardware requirements, a fact that cannot be neglected especially in the case of high-speed optical links. To overcome this bottleneck, the use of partially joint constructive/destructive DQPSK equalization is investigated in this paper. Symbol-by-symbol equalization is performed by means of Volterra decision feedback-type equalizers, driven by a reduced subset of signals selected from the constructive and the destructive ports of the optical detectors. The proposed approach offers a low-complexity alternative for electronic equalization, without sacrificing much of the performance compared to the fully-deployed counterpart. The efficiency of the proposed equalizers is demonstrated by means of computer simulation in a typical optical transmission scenario.
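    The feedback idea behind such equalizers can be illustrated with a toy linear DFE, far simpler than the paper's Volterra decision-feedback structure: BPSK symbols pass through an assumed two-tap ISI channel, and the previous decision is fed back to cancel the post-cursor ISI before slicing. The channel taps are invented for illustration:

```python
import random

rng = random.Random(1)
H1 = 0.4  # post-cursor tap of an assumed ISI channel h = [1.0, 0.4]
symbols = [rng.choice((-1.0, 1.0)) for _ in range(200)]

# received signal: each symbol plus ISI from the previous symbol
received = [
    s + H1 * (symbols[i - 1] if i else 0.0)
    for i, s in enumerate(symbols)
]

decisions, prev = [], 0.0
for y in received:
    z = y - H1 * prev          # feedback path cancels post-cursor ISI
    d = 1.0 if z >= 0 else -1.0  # slicer (hard decision)
    decisions.append(d)
    prev = d

print(decisions == symbols)  # True in this noise-free sketch
```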

  17. Measured and simulated performance of Compton-suppressed TIGRESS HPGe clover detectors

    Science.gov (United States)

    Schumaker, M. A.; Hackman, G.; Pearson, C. J.; Svensson, C. E.; Andreoiu, C.; Andreyev, A.; Austin, R. A. E.; Ball, G. C.; Bandyopadhyay, D.; Boston, A. J.; Chakrawarthy, R. S.; Churchman, R.; Drake, T. E.; Finlay, P.; Garrett, P. E.; Grinyer, G. F.; Hyland, B.; Jones, B.; Maharaj, R.; Morton, A. C.; Phillips, A. A.; Sarazin, F.; Scraggs, H. C.; Smith, M. B.; Valiente-Dobón, J. J.; Waddington, J. C.; Watters, L. M.

    2007-01-01

    Tests of the performance of a 32-fold segmented HPGe clover detector coupled to a 20-fold segmented Compton-suppression shield, which form a prototype element of the TRIUMF-ISAC Gamma-Ray Escape-Suppressed Spectrometer (TIGRESS), have been made. Peak-to-total ratios and relative efficiencies have been measured for a variety of γ-ray energies. These measurements were used to validate a GEANT4 simulation of the TIGRESS detectors, which was then used to create a simulation of the full 12-detector array. Predictions of the expected performance of TIGRESS are presented. These predictions indicate that TIGRESS will be capable, for single 1 MeV γ rays, of absolute detection efficiencies of 17% and 9.4%, and peak-to-total ratios of 54% and 61% for the "high-efficiency" and "optimized peak-to-total" configurations of the array, respectively.

  18. Numerical Simulation of Thermal Performance of Glass-Fibre-Reinforced Polymer

    Science.gov (United States)

    Zhao, Yuchao; Jiang, Xu; Zhang, Qilin; Wang, Qi

    2017-10-01

    Glass-Fibre-Reinforced Polymer (GFRP), as a developing construction material, has seen rapidly increasing application in civil engineering, especially bridge engineering, in recent years, mainly as decorative material and reinforcing bars for now. Compared with traditional construction materials, these composite materials have obvious advantages such as high strength, low density, resistance to corrosion and ease of processing. There are different processing methods to form members, such as pultrusion and resin transfer moulding (RTM), which process the raw material directly into the desired shape; meanwhile GFRP, as a polymer composite, possesses several particular physical and mechanical properties, the thermal property being one of them. The polymer matrix behaves distinctively when heated, which gives these composite materials potential for hot processing but also poor fire resistance. This paper focuses on the thermal performance of GFRP panels, and corresponding research is conducted. First, a dynamic thermomechanical analysis (DMA) experiment is conducted to obtain the glass transition temperature (Tg) of the studied GFRP, and the curve of bending elastic modulus against temperature is calculated from the experimental data. The values of various other thermal parameters are then estimated from the DMA experiment and the literature, and numerical simulations are conducted under two conditions: (1) the heat transfer process in a GFRP panel heated directly on the surface above Tg, and hot processing under this temperature field; (2) the physical and mechanical performance of a GFRP panel under fire conditions. Condition (1) is mainly used to guide the development of high-temperature processing equipment, while condition (2) indicates that GFRP's performance under fire is unsatisfactory, so protective measures must be taken when it is adopted. Since composite materials' properties differ from each other

  19. SLC injector simulation and tuning for high charge transport

    International Nuclear Information System (INIS)

    Yeremian, A.D.; Miller, R.H.; Clendenin, J.E.; Early, R.A.; Ross, M.C.; Turner, J.L.; Wang, J.W.

    1992-08-01

    We have simulated the SLC injector from the thermionic gun through the first accelerating section and used the resulting parameters to tune the injector for optimum performance and high charge transport. Simulations are conducted using PARMELA, a three-dimensional ray-trace code with a two-dimensional space-charge model. The magnetic field profile due to the existing magnetic optics is calculated using POISSON, while SUPERFISH is used to calculate the space harmonics of the various bunchers and the accelerator cavities. The initial beam conditions in the PARMELA code are derived from the EGUN model of the gun. The resulting injector parameters from the PARMELA simulation are used to prescribe experimental settings of the injector components. The experimental results are in agreement with the results of the integrated injector model

  20. Simulation studies for a high resolution time projection chamber at the international linear collider

    Energy Technology Data Exchange (ETDEWEB)

    Muennich, A.

    2007-03-26

    The International Linear Collider (ILC) is planned to be the next large accelerator. The ILC will be able to perform high precision measurements only possible in the clean environment of electron positron collisions. In order to reach this high accuracy, the requirements for the detector performance are challenging. Several detector concepts are currently under study. The understanding of the detector and its performance will be crucial to extract the desired physics results from the data. To optimise the detector design, simulation studies are needed. Simulation packages like GEANT4 make it possible to model the detector geometry and simulate the energy deposit in the different materials. However, the detector response, taking into account the transportation of the produced charge to the readout devices and the effects of the readout electronics, cannot be described in detail. These processes in the detector will change the measured position of the energy deposit relative to the point of origin. The determination of this detector response is the task of detailed simulation studies, which have to be carried out for each subdetector. A high resolution Time Projection Chamber (TPC) with gas amplification based on micro pattern gas detectors is one of the options for the main tracking system at the ILC. In the present thesis a detailed simulation tool to study the performance of a TPC was developed. Its goal is to find the optimal settings to reach an excellent momentum and spatial resolution. After an introduction to the present status of particle physics and the ILC project with special focus on the TPC as central tracker, the simulation framework is presented. The basic simulation methods and implemented processes are introduced. Within this stand-alone simulation framework each electron produced by primary ionisation is transferred through the gas volume and amplified using Gas Electron Multipliers (GEMs).
The output format of the simulation is identical to the raw data from a
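    One ingredient such a stand-alone TPC simulation must model is transverse diffusion during drift, where the r.m.s. displacement of a drift electron grows as σ = D_t √L for drift length L. A toy Monte Carlo with an assumed diffusion coefficient (a generic placeholder, not a measured gas property from the thesis):

```python
import math
import random

rng = random.Random(7)
D_T = 0.2    # transverse diffusion coefficient, mm/sqrt(cm) (assumed)
L = 100.0    # drift length, cm

sigma = D_T * math.sqrt(L)  # expected r.m.s. spread: 2.0 mm

# smear the transverse arrival position of each drifted electron
hits = [rng.gauss(0.0, sigma) for _ in range(20000)]

mean = sum(hits) / len(hits)
rms = math.sqrt(sum((x - mean) ** 2 for x in hits) / len(hits))
print(abs(rms - sigma) < 0.1)  # sample r.m.s. close to 2.0 mm
```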

  1. Living high-training low: effect on erythropoiesis and aerobic performance in highly-trained swimmers

    DEFF Research Database (Denmark)

    Robach, P.; Schmitt, L.; Brugniaux, J.V.

    2006-01-01

    The "living high-training low" model (LHTL), i.e., training in normoxia but sleeping/living in hypoxia, is designed to improve athletes' performance. However, LHTL efficacy still remains controversial and little is known about the duration of its potential benefit. This study tested whether LHTL enhances aerobic performance in athletes, and if any positive effect may last for up to 2 weeks after the LHTL intervention. Eighteen swimmers trained for 13 days at 1,200 m while sleeping/living at 1,200 m in ambient air (control, n=9) or in hypoxic rooms (LHTL, n=9, 5 days at a simulated altitude of 2,500 m followed by 8 days at a simulated altitude of 3,000 m, 16 h day(-1)). Measures were done before, 1-2 days after (POST-1) and 2 weeks after the intervention (POST-15). Aerobic performance was assessed from two swimming trials, exploring VO2max and endurance performance (2,000-m time trial), respectively...

  2. Dose rate laser simulation tests adequacy: Shadowing and high intensity effects analysis

    International Nuclear Information System (INIS)

    Nikiforov, A.Y.; Skorobogatov, P.K.

    1996-01-01

    The adequacy of laser-based simulation of flash X-ray effects in microcircuits may be compromised mainly by laser radiation shadowing by the metallization and by non-linear absorption in the high-intensity range. A joint numerical solution of the optical equations and the fundamental system of equations in a two-dimensional approximation was performed to establish the valid application range of laser simulation. As a result, the correspondence between equivalent dose rate and laser intensity was established, taking into account the shadowing as well as the high-intensity effects. The simulation adequacy was verified in the range up to 4·10^11 rad(Si)/s with comparative laser tests of a specially designed test structure.

  3. A high-performance channel engineered charge-plasma-based MOSFET with high-κ spacer

    Science.gov (United States)

    Shan, Chan; Wang, Ying; Luo, Xin; Bao, Meng-tian; Yu, Cheng-hao; Cao, Fei

    2017-12-01

    In this paper, the performance of a graded-channel double-gate MOSFET (GC-DGFET) that utilizes the charge-plasma concept and a high-κ spacer is investigated through 2-D device simulations. The results demonstrate that the GC-DGFET with a high-κ spacer can effectively improve the ON-state driving current (ION) and reduce the OFF-state leakage current (IOFF). We find that the reduction of the initial energy barrier between the source and the channel is the origin of the ION enhancement. The IOFF reduction is attributed to the extension of the effective channel length owing to the fringing field through the high-κ spacers. Consequently, these devices offer enhanced performance by reducing the total gate-to-gate capacitance (Cgg) and decreasing the intrinsic delay (τ).

  4. First experiences of high-fidelity simulation training in junior nursing students in Korea.

    Science.gov (United States)

    Lee, Suk Jeong; Kim, Sang Suk; Park, Young-Mi

    2015-07-01

    This study was conducted to explore first experiences of high-fidelity simulation training in Korean nursing students, in order to develop and establish more effective guidelines for future simulation training in Korea. Thirty-three junior nursing students participated in high-fidelity simulation training for the first time. Using both qualitative and quantitative methods, data were collected from reflective journals and questionnaires of simulation effectiveness after simulation training. Descriptive statistics were used to analyze simulation effectiveness and content analysis was performed with the reflective journal data. Five dimensions and 31 domains, both positive and negative experiences, emerged from qualitative analysis: (i) machine-human interaction in a safe environment; (ii) perceived learning capability; (iii) observational learning; (iv) reconciling practice with theory; and (v) follow-up debriefing effect. More than 70% of students scored high on increased ability to identify changes in the patient's condition, critical thinking, decision-making, effectiveness of peer observation, and debriefing in effectiveness of simulation. This study reported both positive and negative experiences of simulation. The results of this study could be used to set the level of task difficulty in simulation. Future simulation programs can be designed by reinforcing the positive experiences and modifying the negative results. © 2014 The Authors. Japan Journal of Nursing Science © 2014 Japan Academy of Nursing Science.

  5. A task-based parallelism and vectorized approach to 3D Method of Characteristics (MOC) reactor simulation for high performance computing architectures

    Science.gov (United States)

    Tramm, John R.; Gunow, Geoffrey; He, Tim; Smith, Kord S.; Forget, Benoit; Siegel, Andrew R.

    2016-05-01

    In this study we present and analyze a formulation of the 3D Method of Characteristics (MOC) technique applied to the simulation of full core nuclear reactors. Key features of the algorithm include a task-based parallelism model that allows independent MOC tracks to be assigned to threads dynamically, ensuring load balancing, and a wide vectorizable inner loop that takes advantage of modern SIMD computer architectures. The algorithm is implemented in a set of highly optimized proxy applications in order to investigate its performance characteristics on CPU, GPU, and Intel Xeon Phi architectures. Speed, power, and hardware cost efficiencies are compared. Additionally, performance bottlenecks are identified for each architecture in order to determine the prospects for continued scalability of the algorithm on next generation HPC architectures.
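    The two key algorithmic features named in this record, dynamic assignment of independent tracks to threads and a wide vectorizable inner loop over energy groups, can be illustrated with a minimal Python sketch. NumPy array expressions stand in for the SIMD inner loop and a thread pool for the dynamic task queue; the attenuation update below is the textbook MOC segment form, not the proxy applications' actual code, and all names are invented for illustration.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def sweep_track(seg_lengths, sigma_t, q, psi0=0.0):
    # Attenuate the angular flux psi along one characteristic track.
    # The update for all energy groups of a segment is a single flat
    # array expression, mirroring the wide vectorizable inner loop.
    psi = np.full_like(sigma_t, psi0)
    for s in seg_lengths:                     # outer loop over segments
        atten = np.exp(-sigma_t * s)          # "SIMD" loop over groups
        psi = psi * atten + (q / sigma_t) * (1.0 - atten)
    return psi

def sweep_all(tracks, sigma_t, q, workers=4):
    # Independent tracks are handed to a thread pool, which assigns them
    # to workers dynamically, giving the load balancing described above.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(lambda t: sweep_track(t, sigma_t, q), tracks))
```

Because tracks vary greatly in length in a full-core geometry, pulling them from a shared queue rather than statically partitioning them is what keeps all threads busy.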

  6. High-performance computing in accelerating structure design and analysis

    International Nuclear Information System (INIS)

    Li Zenghai; Folwell, Nathan; Ge Lixin; Guetz, Adam; Ivanov, Valentin; Kowalski, Marc; Lee, Lie-Quan; Ng, Cho-Kuen; Schussman, Greg; Stingelin, Lukas; Uplenchwar, Ravindra; Wolf, Michael; Xiao, Liling; Ko, Kwok

    2006-01-01

    Future high-energy accelerators such as the Next Linear Collider (NLC) will accelerate multi-bunch beams of high current and low emittance to obtain high luminosity, which put stringent requirements on the accelerating structures for efficiency and beam stability. While numerical modeling has been quite standard in accelerator R and D, designing the NLC accelerating structure required a new simulation capability because of the geometric complexity and level of accuracy involved. Under the US DOE Advanced Computing initiatives (first the Grand Challenge and now SciDAC), SLAC has developed a suite of electromagnetic codes based on unstructured grids and utilizing high-performance computing to provide an advanced tool for modeling structures at accuracies and scales previously not possible. This paper will discuss the code development and computational science research (e.g. domain decomposition, scalable eigensolvers, adaptive mesh refinement) that have enabled the large-scale simulations needed for meeting the computational challenges posed by the NLC as well as projects such as the PEP-II and RIA. Numerical results will be presented to show how high-performance computing has made a qualitative improvement in accelerator structure modeling for these accelerators, either at the component level (single cell optimization), or on the scale of an entire structure (beam heating and long-range wakefields)

  7. Comparison of driving simulator performance and neuropsychological testing in narcolepsy.

    Science.gov (United States)

    Kotterba, Sylvia; Mueller, Nicole; Leidag, Markus; Widdig, Walter; Rasche, Kurt; Malin, Jean-Pierre; Schultze-Werninghaus, Gerhard; Orth, Maritta

    2004-09-01

    Daytime sleepiness and cataplexy can increase automobile accident rates in narcolepsy. Several countries have produced guidelines for issuing a driving license. The aim of the study was to compare driving simulator performance and neuropsychological test results in narcolepsy in order to evaluate their predictive value regarding driving ability. Thirteen patients with narcolepsy (age: 41.5+/-12.9 years) and 10 healthy controls (age: 55.1+/-7.8 years) were investigated. Vigilance, alertness and divided attention were assessed by computer-assisted neuropsychological testing. In a driving simulator, patients and controls had to drive on a highway for 60 min (mean speed of 100 km/h). Different weather and daytime conditions and obstacles were presented. Epworth Sleepiness Scale scores were significantly raised in patients (narcolepsy patients: 16.7+/-5.1, controls: 6.6+/-3.6). Scores for divided attention (56.9+/-25.4) and vigilance (58.7+/-26.8) were in a normal range, with, however, high inter-individual differences. There was no correlation between driving performance and neuropsychological test results or ESS score. Neuropsychological test results did not change significantly in the follow-up. The difficulties encountered by narcolepsy patients in remaining alert may account for sleep-related motor vehicle accidents. Driving simulator investigations are more closely related to real traffic situations than isolated neuropsychological tests. At the present time the driving simulator seems to be a useful instrument for judging driving ability, especially in cases with ambiguous neuropsychological results.

  8. Design and Simulation of a High Performance Emergency Data Delivery Protocol

    DEFF Research Database (Denmark)

    Swartz, Kevin; Wang, Di

    2007-01-01

    The purpose of this project was to design a high-performance data delivery protocol, capable of delivering data as quickly as possible to a base station or target node. This protocol was designed particularly for wireless network topologies, but could also be applied to a wired system. An emergency is defined as any event with high priority that needs to be handled immediately. It is assumed that such an emergency event is important enough that energy efficiency is not a factor in the protocol. The desired effect is delivery to the base station as fast as possible, for rapid event handling.

  9. DNS/LES Simulations of Separated Flows at High Reynolds Numbers

    Science.gov (United States)

    Balakumar, P.

    2015-01-01

    Direct numerical simulations (DNS) and large-eddy simulations (LES) of flow through a periodic channel with a constriction are performed using the dynamic Smagorinsky model at two Reynolds numbers, 2800 and 10595. The LES equations are solved using higher-order compact schemes. DNS are performed for the lower Reynolds number case using a fine grid, and the data are used to validate the LES results obtained with a coarse and a medium-size grid. LES are also performed for the higher Reynolds number case using a coarse and a medium-size grid. The results are compared with an existing reference data set, and the DNS and LES results agree well with it. Reynolds stresses, sub-grid eddy viscosity, and the budgets for the turbulent kinetic energy are also presented. It is found that the turbulent fluctuations in the normal and spanwise directions have the same magnitude. The turbulent kinetic energy budget shows that production peaks near the separation point and that the production-to-dissipation ratio is very high in this region, on the order of five. It is also observed that production is balanced by advection, diffusion, and dissipation in the shear layer region, where the dominant term is the turbulent diffusion, which is about two times the molecular dissipation.

  10. Numerical simulation of realistic high-temperature superconductors

    International Nuclear Information System (INIS)

    1997-01-01

    One of the main obstacles in the development of practical high-temperature superconducting (HTS) materials is dissipation, caused by the motion of magnetic flux quanta called vortices. Numerical simulations provide a promising new approach for studying these vortices. By exploiting the extraordinary memory and speed of massively parallel computers, researchers can obtain the extremely fine temporal and spatial resolution needed to model complex vortex behavior. The results may help identify new mechanisms to increase the current-carrying capabilities and to predict the performance characteristics of HTS materials intended for industrial applications.

  11. Toward high-efficiency and detailed Monte Carlo simulation study of the granular flow spallation target

    Science.gov (United States)

    Cai, Han-Jie; Zhang, Zhi-Lei; Fu, Fen; Li, Jian-Yang; Zhang, Xun-Chao; Zhang, Ya-Ling; Yan, Xue-Song; Lin, Ping; Xv, Jian-Ya; Yang, Lei

    2018-02-01

    The dense granular flow spallation target is a new target concept chosen for the Accelerator-Driven Subcritical (ADS) project in China. For the R&D of this target concept, a dedicated Monte Carlo (MC) program named GMT was developed to simulate the beam-target interaction. Owing to the complexity of the target geometry, the MC simulation of particle tracks is computationally expensive, so improving computational efficiency is essential for detailed MC simulation studies of the dense granular target. Here we present the special design of the GMT program and its high-efficiency performance. In addition, the speedup potential of GPU-accelerated spallation models is discussed.
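    Why a granular geometry makes track sampling expensive can be illustrated with a toy Monte Carlo: when only a packing fraction of the volume is material, the effective mean free path stretches and each particle crosses many more geometric boundaries. The functions below are a generic exponential free-path sketch under that simplifying assumption, not GMT's actual algorithm.

```python
import math
import random

def sample_depth(mfp_cm, packing_fraction, rng=random):
    # Sample the depth of a beam particle's first interaction.  In a
    # granular target only part of the volume is material, so the
    # effective mean free path stretches by 1 / packing_fraction
    # (valid when grains are small compared with the mean free path).
    effective_mfp = mfp_cm / packing_fraction
    return -effective_mfp * math.log(1.0 - rng.random())

def mean_interaction_depth(mfp_cm, packing_fraction, n=50000, rng=random):
    # Monte Carlo estimate of the average first-interaction depth.
    total = sum(sample_depth(mfp_cm, packing_fraction, rng) for _ in range(n))
    return total / n
```

In a real granular-target code the geometry lookup at every grain boundary dominates the cost, which is exactly the part that benefits from the specialized design and GPU acceleration the record mentions.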

  12. Performance Modeling and Optimization of a High Energy Colliding Beam Simulation Code

    Energy Technology Data Exchange (ETDEWEB)

    Shan, Hongzhang; Strohmaier, Erich; Qiang, Ji; Bailey, David H.; Yelick, Kathy

    2006-06-01

    An accurate modeling of the beam-beam interaction is essential to maximizing the luminosity in existing and future colliders. BeamBeam3D was the first parallel code that can be used to study this interaction fully self-consistently on high-performance computing platforms. Various all-to-all personalized communication (AAPC) algorithms dominate its communication patterns, for which we developed a sequence of performance models using a series of micro-benchmarks. We find that for SMP-based systems the most important performance constraint is node-adapter contention, while for 3D-torus topologies good performance models are not possible without considering link contention. The best average model prediction error is very low on SMP-based systems, at 3% to 7%. On torus-based systems errors are higher, up to 29%, but optimized performance can again be predicted within 8% in some cases. These results across five different systems indicate that this methodology for performance modeling can be applied to a large class of algorithms.

  13. Performance Modeling and Optimization of a High Energy Colliding Beam Simulation Code

    International Nuclear Information System (INIS)

    Shan, Hongzhang; Strohmaier, Erich; Qiang, Ji; Bailey, David H.; Yelick, Kathy

    2006-01-01

    An accurate modeling of the beam-beam interaction is essential to maximizing the luminosity in existing and future colliders. BeamBeam3D was the first parallel code that can be used to study this interaction fully self-consistently on high-performance computing platforms. Various all-to-all personalized communication (AAPC) algorithms dominate its communication patterns, for which we developed a sequence of performance models using a series of micro-benchmarks. We find that for SMP-based systems the most important performance constraint is node-adapter contention, while for 3D-torus topologies good performance models are not possible without considering link contention. The best average model prediction error is very low on SMP-based systems, at 3% to 7%. On torus-based systems errors are higher, up to 29%, but optimized performance can again be predicted within 8% in some cases. These results across five different systems indicate that this methodology for performance modeling can be applied to a large class of algorithms.

  14. High performance parallel computing of flows in complex geometries: II. Applications

    International Nuclear Information System (INIS)

    Gourdain, N; Gicquel, L; Staffelbach, G; Vermorel, O; Duchaine, F; Boussuge, J-F; Poinsot, T

    2009-01-01

    Present regulations in terms of pollutant emissions, noise and economic constraints require new approaches and designs in the fields of energy supply and transportation. It is now well established that the next breakthrough will come from a better understanding of unsteady flow effects and from considering the entire system rather than only isolated components. However, these aspects are still neither well captured by numerical approaches nor well understood, whatever the design stage considered. The main challenge lies in the computational requirements that such complex systems impose when they are to be simulated on supercomputers. This paper shows how these challenges can be addressed by using parallel computing platforms for distinct elements of more complex systems as encountered in aeronautical applications. Based on numerical simulations performed with modern aerodynamic and reactive flow solvers, this work underlines the value of high-performance computing for solving flow in complex industrial configurations such as aircraft, combustion chambers and turbomachines. Performance indicators related to parallel computing efficiency are presented, showing that establishing fair criteria is a difficult task for complex industrial applications. Examples of numerical simulations performed in industrial systems are also described, with particular attention to the computational time and the potential design improvements obtained with high-fidelity and multi-physics computing methods. These simulations use either unsteady Reynolds-averaged Navier-Stokes methods or large eddy simulation and deal with turbulent unsteady flows, such as coupled flow phenomena (thermo-acoustic instabilities, buffet, etc.). Some examples of the difficulties with grid generation and data analysis are also presented for these complex industrial applications.

  15. Application of High Performance Computing for Simulations of N-Dodecane Jet Spray with Evaporation

    Science.gov (United States)

    2016-11-01

    The aim is to incorporate these models into future simulations of turbulent jet sprays and to develop a predictive theory for comparison with laboratory measurements of turbulent diesel sprays.

  16. High Performance Computing Software Applications for Space Situational Awareness

    Science.gov (United States)

    Giuliano, C.; Schumacher, P.; Matson, C.; Chun, F.; Duncan, B.; Borelli, K.; Desonia, R.; Gusciora, G.; Roe, K.

    The High Performance Computing Software Applications Institute for Space Situational Awareness (HSAI-SSA) has completed its first full year of applications development. The emphasis of our work in this first year was on improving space surveillance sensor models and image enhancement software. These applications are the Space Surveillance Network Analysis Model (SSNAM), the Air Force Space Fence simulation (SimFence), and the physically constrained iterative de-convolution (PCID) image enhancement software tool. Specifically, we have demonstrated order-of-magnitude speed-ups in those codes running on the latest Cray XD-1 Linux supercomputer (Hoku) at the Maui High Performance Computing Center. The software application improvements that HSAI-SSA has made have had a significant impact on the warfighter and have fundamentally changed the role of high performance computing in SSA.

  17. Prospective randomized comparison of standard didactic lecture versus high-fidelity simulation for radiology resident contrast reaction management training.

    Science.gov (United States)

    Wang, Carolyn L; Schopp, Jennifer G; Petscavage, Jonelle M; Paladin, Angelisa M; Richardson, Michael L; Bush, William H

    2011-06-01

    The objective of our study was to assess whether high-fidelity simulation-based training is more effective than traditional didactic lecture to train radiology residents in the management of contrast reactions. This was a prospective study of 44 radiology residents randomized into a simulation group versus a lecture group. All residents attended a contrast reaction didactic lecture. Four months later, baseline knowledge was assessed with a written test, which we refer to as the "pretest." After the pretest, the 21 residents in the lecture group attended a repeat didactic lecture and the 23 residents in the simulation group underwent high-fidelity simulation-based training with five contrast reaction scenarios. Next, all residents took a second written test, which we refer to as the "posttest." Two months after the posttest, both groups took a third written test, which we refer to as the "delayed posttest," and underwent performance testing with a high-fidelity severe contrast reaction scenario graded on predefined critical actions. There was no statistically significant difference between the simulation and lecture group pretest, immediate posttest, or delayed posttest scores. The simulation group performed better than the lecture group on the severe contrast reaction simulation scenario (p = 0.001). The simulation group reported improved comfort in identifying and managing contrast reactions and administering medications after the simulation training (p ≤ 0.04) and was more comfortable than the control group (p = 0.03), which reported no change in comfort level after the repeat didactic lecture. When compared with didactic lecture, high-fidelity simulation-based training of contrast reaction management shows equal results on written test scores but improved performance during a high-fidelity severe contrast reaction simulation scenario.

  18. A quasi-3-dimensional simulation method for a high-voltage level-shifting circuit structure

    International Nuclear Information System (INIS)

    Liu Jizhi; Chen Xingbi

    2009-01-01

    A new quasi-three-dimensional (quasi-3D) numeric simulation method for a high-voltage level-shifting circuit structure is proposed. The performances of the 3D structure are analyzed by combining some 2D device structures; the 2D devices are in two planes perpendicular to each other and to the surface of the semiconductor. In comparison with Davinci, the full 3D device simulation tool, the quasi-3D simulation method can give results for the potential and current distribution of the 3D high-voltage level-shifting circuit structure with appropriate accuracy and the total CPU time for simulation is significantly reduced. The quasi-3D simulation technique can be used in many cases with advantages such as saving computing time, making no demands on the high-end computer terminals, and being easy to operate. (semiconductor integrated circuits)

  19. A quasi-3-dimensional simulation method for a high-voltage level-shifting circuit structure

    Energy Technology Data Exchange (ETDEWEB)

    Liu Jizhi; Chen Xingbi, E-mail: jzhliu@uestc.edu.c [State Key Laboratory of Electronic Thin Films and Integrated Devices, University of Electronic Science and Technology of China, Chengdu 610054 (China)

    2009-12-15

    A new quasi-three-dimensional (quasi-3D) numeric simulation method for a high-voltage level-shifting circuit structure is proposed. The performances of the 3D structure are analyzed by combining some 2D device structures; the 2D devices are in two planes perpendicular to each other and to the surface of the semiconductor. In comparison with Davinci, the full 3D device simulation tool, the quasi-3D simulation method can give results for the potential and current distribution of the 3D high-voltage level-shifting circuit structure with appropriate accuracy and the total CPU time for simulation is significantly reduced. The quasi-3D simulation technique can be used in many cases with advantages such as saving computing time, making no demands on the high-end computer terminals, and being easy to operate. (semiconductor integrated circuits)

  20. ATES/heat pump simulations performed with ATESSS code

    Science.gov (United States)

    Vail, L. W.

    1989-01-01

    Modifications to the Aquifer Thermal Energy Storage System Simulator (ATESSS) allow simulation of aquifer thermal energy storage (ATES)/heat pump systems. The heat pump algorithm requires a coefficient of performance (COP) relationship of the form COP = COP_base + α(T_ref − T_base). Initial applications of the modified ATESSS code to synthetic building load data for two sizes of buildings in two U.S. cities showed an insignificant performance advantage of a series ATES heat pump system over a conventional groundwater heat pump system. The addition of algorithms for a cooling tower and solar array improved performance slightly. Small values of α in the COP relationship are the principal reason for the limited improvement in system performance. Future studies at Pacific Northwest Laboratory (PNL) are planned to investigate methods to increase system performance using alternative system configurations and operations scenarios.
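    The linear COP relation COP_base + α(T_ref − T_base) used by the heat pump algorithm is simple enough to sketch directly; the numbers below (COP_base = 3.0 at a 10 °C base temperature, α = 0.05 per °C) are illustrative placeholders, not values from the ATESSS study, but they show why a small α limits the benefit of ATES-warmed source water.

```python
def heat_pump_cop(t_ref_c, t_base_c=10.0, cop_base=3.0, alpha=0.05):
    # COP = COP_base + alpha * (T_ref - T_base); all parameter values
    # here are illustrative assumptions, not the study's coefficients.
    return cop_base + alpha * (t_ref_c - t_base_c)

# Raising the source from 10 C (conventional groundwater) to 25 C
# (ATES-warmed water) lifts the COP only from 3.0 to 3.75 with this alpha,
# which mirrors the record's conclusion about small alpha values.
cop_groundwater = heat_pump_cop(10.0)
cop_ates = heat_pump_cop(25.0)
```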

  1. High-Performance Tiled WMS and KML Web Server

    Science.gov (United States)

    Plesea, Lucian

    2007-01-01

    This software is an Apache 2.0 module implementing a high-performance map server to support interactive map viewers and virtual planet client software. It can be used in applications that require access to very-high-resolution geolocated images, such as GIS, virtual planet applications, and flight simulators. It serves Web Map Service (WMS) requests that comply with a given request grid from an existing tile dataset. It also generates the KML super-overlay configuration files required to access the WMS image tiles.

  2. Simulation and performance of brushless dc motor actuators

    Science.gov (United States)

    Gerba, A., Jr.

    1985-12-01

    The simulation model for a brushless DC motor and the associated commutation power conditioner transistor model are presented. The necessary conditions for maximum power output while operating at steady-state speed with sinusoidally distributed air-gap flux are developed. Comparisons of the simulated model with the measured performance of a typical motor are made both on time-response waveforms and on average performance characteristics. These preliminary results indicate good agreement. Plans for model improvement and for testing a motor-driven positioning device for model evaluation are outlined.

  3. A High Performance Chemical Simulation Preprocessor and Source Code Generator, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Numerical simulations of chemical kinetics are a critical component of aerospace research, Earth systems research, and energy research. These simulations enable a...

  4. Simulated Performances of a Very High Energy Tomograph for Non-Destructive Characterization of large objects

    Directory of Open Access Journals (Sweden)

    Kistler Marc

    2018-01-01

    The upgrade of the detection part requires a thorough study of the performance of two detectors: a series of CdTe semiconductor sensors and two arrays of segmented CdWO4 scintillators with different pixel sizes. This study consists in a Quantum Accounting Diagram (QAD) analysis coupled with Monte-Carlo simulations. The scintillator arrays are able to detect millimeter details through 140 cm of concrete, but are limited to 120 cm for smaller ones. The CdTe sensors have lower but more stable performance, with a 0.5 mm resolution for 90 cm of concrete. The choice of detector then depends on the preferred characteristic: the spatial resolution or the use on large volumes. Combining the features of the source with the studies on the detectors gives the expected performance of the whole equipment, in terms of signal-to-noise ratio (SNR), spatial resolution and acquisition time.

  5. Methodology for the preliminary design of high performance schools in hot and humid climates

    Science.gov (United States)

    Im, Piljae

    A methodology to develop an easy-to-use toolkit for the preliminary design of high performance schools in hot and humid climates was presented. The toolkit proposed in this research will allow decision makers without simulation knowledge to evaluate energy-efficient measures for K-5 schools easily and accurately, which would contribute to the accelerated dissemination of energy-efficient design. For the development of the toolkit, first, a survey was performed to identify high performance measures available today that are being implemented in new K-5 school buildings. Then an existing case-study school building in a hot and humid climate was selected and analyzed to understand the energy use pattern in a school building and to be used in developing a calibrated simulation. Based on the information from the previous step, an as-built and calibrated simulation was then developed. To accomplish this, five calibration steps were performed to match the simulation results with the measured energy use. The five steps include: (1) using an actual 2006 weather file with measured solar radiation; (2) modifying the lighting and equipment schedule using ASHRAE's RP-1093 methods; (3) using actual equipment performance curves (i.e., scroll chiller); (4) using Winkelmann's method for the underground floor heat transfer; and (5) modifying the HVAC and room setpoint temperatures based on the measured field data. Next, the calibrated simulation of the case-study K-5 school was compared to an ASHRAE Standard 90.1-1999 code-compliant school. In the next step, the energy savings potentials from the application of several high performance measures to an equivalent ASHRAE Standard 90.1-1999 code-compliant school were evaluated. The high performance measures applied included the recommendations from the ASHRAE Advanced Energy Design Guides (AEDG) for K-12 and other high performance measures from the literature review, as well as a daylighting strategy and solar PV and thermal systems. The results show that the net

  6. A Comparison of Robotic Simulation Performance on Basic Virtual Reality Skills: Simulator Subjective Versus Objective Assessment Tools.

    Science.gov (United States)

    Dubin, Ariel K; Smith, Roger; Julian, Danielle; Tanaka, Alyssa; Mattingly, Patricia

    To answer the question of whether there is a difference between robotic virtual reality simulator performance assessment and validated human reviewers. Current surgical education relies heavily on simulation. Several assessment tools are available to the trainee, including the actual robotic simulator assessment metrics and the Global Evaluative Assessment of Robotic Skills (GEARS) metrics, both of which have been independently validated. GEARS is a rating scale through which human evaluators can score trainees' performances on 6 domains: depth perception, bimanual dexterity, efficiency, force sensitivity, autonomy, and robotic control. Each domain is scored on a 5-point Likert scale with anchors. We used 2 common robotic simulators, the dV-Trainer (dVT; Mimic Technologies Inc., Seattle, WA) and the da Vinci Skills Simulator (dVSS; Intuitive Surgical, Sunnyvale, CA), to compare the performance metrics of robotic surgical simulators with the GEARS for a basic robotic task on each simulator. A prospective single-blinded randomized study. A surgical education and training center. Surgeons and surgeons in training. Demographic information was collected including sex, age, level of training, specialty, and previous surgical and simulator experience. Subjects performed 2 trials of ring and rail 1 (RR1) on each of the 2 simulators (dVSS and dVT) after undergoing randomization and warm-up exercises. The second RR1 trial simulator performance was recorded, and the deidentified videos were sent to human reviewers using GEARS. Eight different simulator assessment metrics were identified and paired with a similar performance metric in the GEARS tool. The GEARS evaluation scores and simulator assessment scores were paired and a Spearman rho calculated for their level of correlation. Seventy-four subjects were enrolled in this randomized study with 9 subjects excluded for missing or incomplete data. 
There was a strong correlation between the GEARS score and the simulator metric
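    The pairing of each simulator metric with a GEARS domain and the Spearman rho used in this record can be sketched in plain Python. The rank-based computation below is the standard definition of Spearman's coefficient (Pearson correlation of rank vectors with tie-averaged ranks); the function names are invented and no study data are reproduced.

```python
def ranks(values):
    # Average ranks (1-based); tied values share the mean of their ranks.
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2.0 + 1.0
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman_rho(xs, ys):
    # Spearman's rho: Pearson correlation of the two rank vectors.
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5
```

A rho near +1 between, say, a simulator's "economy of motion" score and the GEARS efficiency domain would indicate the automated metric tracks the human raters.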

  7. Simulated LOCA Test and Characterization Study Related to High Burn-Up Issue

    International Nuclear Information System (INIS)

    Park, D. J.; Jung, Y. I.; Choi, B. K.; Park, S. Y.; Kim, H. G.; Park, J. Y.

    2012-01-01

    For the safety evaluation of fuel cladding during the injection of emergency core coolant, a simulated loss-of-coolant accident (LOCA) test was performed using Zircaloy-4 fuel cladding samples. Zircaloy-4 tube samples with and without prehydriding were oxidized in a steam environment at a test temperature of 1200 °C. Prehydrided cladding was prepared from as-fabricated Zircaloy-4 to study the effects of hydrogen on the mechanical properties of cladding under high-temperature oxidation and quench conditions. To measure the ductility of the tube samples embrittled by quench water, a ring compression test was carried out on 8 mm ring samples sectioned from the oxidized tubes, and microstructural analysis was also performed after the simulated LOCA test. The results showed that hydrogen increases oxygen solubility and the pickup rate in the beta layer, which reduces the ductility of prehydrided fuel cladding compared with as-fabricated cladding. The trend of decreasing ductility for prehydrided samples under simulated LOCA conditions was very similar to data obtained from tests conducted with irradiated high burn-up fuel claddings
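High-temperature steam oxidation of zirconium alloys is commonly described by a parabolic rate law with an Arrhenius rate constant. The sketch below illustrates only that functional form; the pre-exponential factor `A` and activation energy `Q` are placeholder values, not the measured Zircaloy-4 constants, which must be taken from correlations such as Cathcart-Pawel:

```python
import math

R = 8.314  # universal gas constant, J/(mol*K)

def oxide_thickness(t_s, T_K, A=1.0e-4, Q=1.7e5):
    """Parabolic oxidation kinetics: delta^2 = 2*K*t with K = A*exp(-Q/(R*T)).
    A [m^2/s] and Q [J/mol] are illustrative placeholders only."""
    K = A * math.exp(-Q / (R * T_K))
    return math.sqrt(2.0 * K * t_s)
```

The parabolic form implies the oxide layer grows with the square root of time, so doubling the exposure time thickens the layer by a factor of sqrt(2), and the growth accelerates strongly with temperature.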

  8. Multi-scale high-performance fluid flow: Simulations through porous media

    KAUST Repository

    Perović, Nevena

    2016-08-03

    Computational fluid dynamic (CFD) calculations on geometrically complex domains such as porous media require high geometric discretisation for accurately capturing the tested physical phenomena. Moreover, when considering a large area and analysing local effects, it is necessary to deploy a multi-scale approach that is both memory-intensive and time-consuming. Hence, this type of analysis must be conducted on a high-performance parallel computing infrastructure. In this paper, the coupling of two different scales based on the Navier–Stokes equations and Darcy's law is described, followed by the generation of complex geometries, and their discretisation and numerical treatment. Subsequently, the necessary parallelisation techniques and a rather specific tool, which is capable of retrieving data from the supercomputing servers and visualising them during the computation runtime (i.e. in situ), are described. All advantages and possible drawbacks of this approach, together with the preliminary results and sensitivity analyses, are discussed in detail.
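The coarse scale in such a coupling is governed by Darcy's law, which relates the superficial velocity in the porous medium to the pressure gradient. A minimal one-dimensional sketch (the sample values are illustrative, not from the paper):

```python
def darcy_flux(k, mu, dp_dx):
    """Darcy's law in 1D: q = -(k / mu) * dp/dx
    k     : permeability [m^2]
    mu    : dynamic viscosity [Pa*s]
    dp_dx : pressure gradient [Pa/m]
    Returns the superficial (Darcy) velocity [m/s]."""
    return -(k / mu) * dp_dx

# Example: water-like fluid (mu = 1e-3 Pa*s) through a sample with
# k = 1e-12 m^2 (about 1 darcy) under a 1 bar/m pressure drop.
q = darcy_flux(k=1e-12, mu=1e-3, dp_dx=-1e5)  # flows down the gradient
```

At the fine scale the Navier–Stokes equations resolve the pore geometry directly; the multi-scale coupling the paper describes exchanges pressures and fluxes between the two descriptions at the interface.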

  9. Multi-scale high-performance fluid flow: Simulations through porous media

    KAUST Repository

    Perović, Nevena; Frisch, Jérôme; Salama, Amgad; Sun, Shuyu; Rank, Ernst; Mundani, Ralf Peter

    2016-01-01

    Computational fluid dynamic (CFD) calculations on geometrically complex domains such as porous media require high geometric discretisation for accurately capturing the tested physical phenomena. Moreover, when considering a large area and analysing local effects, it is necessary to deploy a multi-scale approach that is both memory-intensive and time-consuming. Hence, this type of analysis must be conducted on a high-performance parallel computing infrastructure. In this paper, the coupling of two different scales based on the Navier–Stokes equations and Darcy's law is described, followed by the generation of complex geometries, and their discretisation and numerical treatment. Subsequently, the necessary parallelisation techniques and a rather specific tool, which is capable of retrieving data from the supercomputing servers and visualising them during the computation runtime (i.e. in situ), are described. All advantages and possible drawbacks of this approach, together with the preliminary results and sensitivity analyses, are discussed in detail.

  10. Conceptual design and performance simulations of super-compact electromagnetic calorimeter

    Directory of Open Access Journals (Sweden)

    Skoda Libor

    2013-11-01

    Full Text Available Measurements of particle production at forward rapidities in high-energy p-p, p-A and A-A collisions provide access to physics processes at very low Bjorken x. These measurements will make it possible to study the gluon saturation scale and improve our knowledge of parton distributions in nuclei. Specific requirements must be fulfilled for a calorimeter to operate successfully in the high-multiplicity forward region, often within stringent space limits. Here we present a study of the conceptual design of a super-compact electromagnetic calorimeter being developed at the Czech Technical University in Prague. The design of the sampling calorimeter is based on a sandwich structure of thin tungsten and scintillator layers oriented parallel to the beam. The optical readout of individual scintillator pads guarantees the required high radiation hardness of the detector. We present simulations of the expected performance of the optical pad readout together with the overall detector performance. The detector is intended to allow the measurement of high energy photons (1

  11. Shock Mechanism Analysis and Simulation of High-Power Hydraulic Shock Wave Simulator

    Directory of Open Access Journals (Sweden)

    Xiaoqiu Xu

    2017-01-01

    Full Text Available The simulation of a regular shock wave (e.g., half-sine) can be achieved by a traditional rubber shock simulator, but the practical high-power shock wave, characterized by a steep prepeak and a gentle postpeak, is hard to realize with the same device. To overcome this limitation, a novel high-power hydraulic shock wave simulator based on the live-firing muzzle shock principle is proposed in the current work. The influence of the typical shock characteristic parameters on the shock force wave was investigated via both theoretical deduction and software simulation. From the comparison of the obtained data with the results, it can be concluded that the developed hydraulic shock wave simulator can be applied to simulate the real conditions of the shocked system. Further, a similarity evaluation of the shock wave simulation was performed based on the curvature distance, and the results showed that the simulation method is reasonable and that structural optimization based on software simulation also benefits efficiency. Finally, the combination of theoretical analysis and simulation for the development of an artillery recoil tester is a comprehensive approach to the design and structural optimization of the recoil system.
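The distinction the abstract draws between the two waveforms can be made concrete with two toy pulse models: a symmetric half-sine versus a steep linear prepeak followed by a gentle exponential decay. The parameter values below are illustrative, not taken from the paper:

```python
import math

def half_sine_pulse(t, A=1.0, T=0.010):
    """Classical half-sine shock: symmetric rise and fall over duration T [s]."""
    return A * math.sin(math.pi * t / T) if 0.0 <= t <= T else 0.0

def muzzle_shock_pulse(t, A=1.0, t_rise=0.0005, tau=0.003):
    """Muzzle-shock-like pulse: steep linear prepeak reaching A at t_rise,
    then a gentle exponential postpeak with time constant tau (illustrative)."""
    if t < 0.0:
        return 0.0
    if t <= t_rise:
        return A * t / t_rise
    return A * math.exp(-(t - t_rise) / tau)
```

At t = 0.5 ms the muzzle-like pulse has already peaked while the half-sine is still near the start of its rise, which is exactly the asymmetry a rubber simulator cannot reproduce.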

  12. Simulation of performance of centrifugal circulators with vaneless diffuser for GCR applications

    International Nuclear Information System (INIS)

    Tauveron, N.; Dor, I.

    2010-01-01

    In the frame of the Generation IV international forum, CEA has selected various innovative concepts of gas-cooled nuclear reactors. Thermal hydraulic performance is a key design issue. For transient conditions and decay heat removal situations, the thermal hydraulic performance must remain as high as possible. In this context, all transient situations and incidental and accidental scenarios must be evaluated by a validated system code able to correctly describe, in particular, the thermal hydraulics of the whole plant. As the concepts use a helium compressor to maintain the flow in the core, special emphasis must be placed on compressor modelling. Centrifugal circulators with a vaneless diffuser have attractive properties in terms of simplicity, cost, and the ability to operate over a wide range of conditions. The objective of this paper is to present a dedicated description of the centrifugal compressor, based on a one-dimensional approach. This type of model requires various correlations as input data. The present contribution consists in establishing and validating the numerical simulations (including different sets of correlations) by comparison with representative experimental data. The results obtained show qualitatively correct behaviour of the model compared with open-literature cases from the aircraft gas turbine community and helium circulators of high temperature gas reactors. The model is finally used in a depressurised transient simulation of a small-power gas fast reactor (the ALLEGRO concept). Advantages of this model over first preliminary simulations are shown. Further work on modelling and validation is nevertheless needed to gain better confidence in the simulation predictions.
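A one-dimensional centrifugal-compressor model of the kind described typically starts from the Euler work equation and an isentropic pressure-rise relation. The sketch below is a generic illustration with helium gas properties; the slip factor and polytropic efficiency are assumed values standing in for the paper's correlations:

```python
def pressure_ratio(U2, T01, slip=0.9, eta_p=0.80,
                   cp=5193.0, gamma=5.0 / 3.0):
    """One-dimensional centrifugal-compressor estimate (helium defaults).
    Euler work with no inlet swirl: dh0 = slip * U2^2, where U2 is the
    impeller tip speed [m/s] and T01 the inlet total temperature [K].
    slip and eta_p are illustrative assumptions, not validated correlations."""
    dh0 = slip * U2 ** 2                              # specific work [J/kg]
    return (1.0 + eta_p * dh0 / (cp * T01)) ** (gamma / (gamma - 1.0))
```

A full circulator model adds loss correlations (incidence, friction, diffusion in the vaneless diffuser) on top of this ideal estimate, which is why the paper emphasizes correlation selection and validation.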

  13. Simulation of performance of centrifugal circulators with vaneless diffuser for GCR applications

    Energy Technology Data Exchange (ETDEWEB)

    Tauveron, N., E-mail: nicolas.tauveron@cea.f [CEA, DEN, DER/SSTH, 17 rue des Martyrs, F-38054 Grenoble (France); Dor, I., E-mail: isabelle.dor@cea.f [CEA, DEN, DER/SSTH, 17 rue des Martyrs, F-38054 Grenoble (France)

    2010-10-15

    In the frame of the Generation IV international forum, CEA has selected various innovative concepts of gas-cooled nuclear reactors. Thermal hydraulic performance is a key design issue. For transient conditions and decay heat removal situations, the thermal hydraulic performance must remain as high as possible. In this context, all transient situations and incidental and accidental scenarios must be evaluated by a validated system code able to correctly describe, in particular, the thermal hydraulics of the whole plant. As the concepts use a helium compressor to maintain the flow in the core, special emphasis must be placed on compressor modelling. Centrifugal circulators with a vaneless diffuser have attractive properties in terms of simplicity, cost, and the ability to operate over a wide range of conditions. The objective of this paper is to present a dedicated description of the centrifugal compressor, based on a one-dimensional approach. This type of model requires various correlations as input data. The present contribution consists in establishing and validating the numerical simulations (including different sets of correlations) by comparison with representative experimental data. The results obtained show qualitatively correct behaviour of the model compared with open-literature cases from the aircraft gas turbine community and helium circulators of high temperature gas reactors. The model is finally used in a depressurised transient simulation of a small-power gas fast reactor (the ALLEGRO concept). Advantages of this model over first preliminary simulations are shown. Further work on modelling and validation is nevertheless needed to gain better confidence in the simulation predictions.

  14. Integrating Soft Set Theory and Fuzzy Linguistic Model to Evaluate the Performance of Training Simulation Systems.

    Science.gov (United States)

    Chang, Kuei-Hu; Chang, Yung-Chia; Chain, Kai; Chung, Hsiang-Yu

    2016-01-01

    The advancement of high technologies and the arrival of the information age have changed modern warfare. The military forces of many countries have partially replaced live training drills with training simulation systems to achieve combat readiness. However, many types of training simulation systems are used in military settings. In addition, differences in system set-up time, functions, the environment, and the competency of system operators, as well as incomplete information, have made it difficult to evaluate the performance of training simulation systems. To address these problems, this study integrated the analytic hierarchy process (AHP), soft set theory, and the fuzzy linguistic representation model to evaluate the performance of various training simulation systems. Furthermore, importance-performance analysis was adopted to examine the influence of cost savings and training safety of training simulation systems. The findings of this study are expected to facilitate the application of military training simulation systems, avoid waste of resources (e.g., low utility and idle time), and provide data for subsequent applications and analysis. To verify the proposed method, numerical examples of the performance evaluation of training simulation systems were adopted and compared with the numerical results of an AHP and a novel AHP-based ranking technique. The results verified that not only could expert-provided questionnaire information be fully considered to lower the repetition rate of performance rankings, but a two-dimensional graph could also be used to help administrators allocate limited resources, thereby enhancing the investment benefits and training effectiveness of a training simulation system.
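The AHP component of such a method derives criterion weights as the principal eigenvector of a pairwise comparison matrix. A pure-Python power-iteration sketch (the criteria and comparison values are hypothetical, not from the study):

```python
def ahp_weights(M, iters=200):
    """Principal-eigenvector weights of an AHP pairwise comparison matrix,
    computed by power iteration; M[i][j] states how much more important
    criterion i is than criterion j."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]   # renormalize so weights sum to 1
    return w

# Three hypothetical criteria: cost saving, training safety, fidelity.
# This matrix is perfectly consistent (ratios 4 : 2 : 1).
M = [[1.0,  2.0, 4.0],
     [0.5,  1.0, 2.0],
     [0.25, 0.5, 1.0]]
weights = ahp_weights(M)
```

For a perfectly consistent matrix the weights equal the normalized ratio vector (here 4/7, 2/7, 1/7); for inconsistent expert judgments the eigenvector gives a compromise, and a consistency ratio check is normally added.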

  15. High performance computing in science and engineering Garching/Munich 2016

    Energy Technology Data Exchange (ETDEWEB)

    Wagner, Siegfried; Bode, Arndt; Bruechle, Helmut; Brehm, Matthias (eds.)

    2016-11-01

    Computer simulations are the well-established third pillar of the natural sciences, along with theory and experimentation. High performance computing in particular is growing fast and constantly demands more and more powerful machines. To keep pace with this development, in spring 2015 the Leibniz Supercomputing Centre installed the high performance computing system SuperMUC Phase 2, only three years after the inauguration of its sibling, SuperMUC Phase 1, thereby more than doubling the compute capabilities. This book covers the time frame June 2014 until June 2016. Readers will find many examples of outstanding research in the more than 130 projects covered in this book, each of which used at least 4 million core-hours on SuperMUC. The largest scientific communities using SuperMUC in the last two years were computational fluid dynamics, chemistry and materials sciences, astrophysics, and life sciences.

  16. Fracture Simulation of Highly Crosslinked Polymer Networks: Triglyceride-Based Adhesives

    Science.gov (United States)

    Lorenz, Christian; Stevens, Mark; Wool, Richard

    2003-03-01

    The ACRES program at the University of Delaware has shown that triglyceride oils derived from plants are a favorable alternative to traditional adhesives. The triglyceride networks are formed from an initial mixture of styrene monomers, free-radical initiators, and triglycerides. We have performed simulations to study the effect of the physical composition and physical characteristics of the triglyceride network on its strength. A coarse-grained, bead-spring model of the triglyceride system is used. The average triglyceride consists of 6 beads per chain, the styrenes are represented as single beads, and the initiators are two-bead chains. The polymer network is formed using an off-lattice 3D Monte Carlo simulation, in which the initiators activate the styrene and triglyceride reactive sites and bonds are then randomly formed between the styrene and active triglyceride monomers, producing a highly crosslinked polymer network. Molecular dynamics simulations of the network under tensile and shear strains were performed to determine the strength as a function of the network composition. The relationship between the network structure and its strength will also be discussed.
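The random bond-formation step can be illustrated with a toy Monte Carlo sketch: activated sites attempt to bond free styrene beads with some acceptance probability. This is a drastically simplified stand-in for the paper's off-lattice 3D model (no spatial positions, and the site counts and reaction probability are made up):

```python
import random

def crosslink_network(n_styrene=1000, n_active_sites=600,
                      p_react=0.5, seed=1):
    """Toy crosslinking Monte Carlo step: each activated triglyceride site
    tries once to bond a free styrene bead with probability p_react.
    Returns the fraction of active sites that formed a crosslink."""
    rng = random.Random(seed)          # seeded for reproducibility
    free_styrene = n_styrene
    bonds = 0
    for _ in range(n_active_sites):
        if free_styrene > 0 and rng.random() < p_react:
            bonds += 1
            free_styrene -= 1
    return bonds / n_active_sites

conversion = crosslink_network()
```

In the actual model, bond formation is constrained by bead positions and bond-length criteria, and the resulting connectivity graph determines the network strength probed in the subsequent MD runs.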

  17. Nuclear fuel cycle system simulation tool based on high-fidelity component modeling

    Energy Technology Data Exchange (ETDEWEB)

    Ames, David E.

    2014-02-01

    The DOE is currently directing extensive research into developing fuel cycle technologies that will enable the safe, secure, economic, and sustainable expansion of nuclear energy. The task is formidable considering the numerous fuel cycle options, the large dynamic systems that each represents, and the necessity to accurately predict their behavior. The path to successfully developing and implementing an advanced fuel cycle is highly dependent on the modeling capabilities and simulation tools available for performing useful, relevant analysis to assist stakeholders in decision making. Therefore, a high-fidelity fuel cycle simulation tool that performs system analysis, including uncertainty quantification and optimization, was developed. The resulting simulator also includes the capability to calculate environmental impact measures for individual components and the system. An integrated system method and analysis approach that provides consistent and comprehensive evaluations of advanced fuel cycles was developed. A general approach was utilized, allowing the system to be modified to provide analysis for other systems with similar attributes. This approach provides the framework for simulating many different fuel cycle options. Two example fuel cycle configurations were developed to take advantage of used fuel recycling and transmutation capabilities in waste management scenarios leading to minimized waste inventories.

  18. Virtual Design Studio (VDS) - Development of an Integrated Computer Simulation Environment for Performance Based Design of Very-Low Energy and High IEQ Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Yixing [Building Energy and Environmental Systems Lab. (BEESL), Syracuse, NY (United States); Zhang, Jianshun [Syracuse Univ., NY (United States); Pelken, Michael [Syracuse Univ., NY (United States); Gu, Lixing [Univ. of Central Florida, Orlando, FL (United States); Rice, Danial [Building Energy and Environmental Systems Lab. (BEESL), Syracuse, NY (United States); Meng, Zhaozhou [Building Energy and Environmental Systems Lab. (BEESL), Syracuse, NY (United States); Semahegn, Shewangizaw [Building Energy and Environmental Systems Lab. (BEESL), Syracuse, NY (United States); Feng, Wei [Building Energy and Environmental Systems Lab. (BEESL), Syracuse, NY (United States); Ling, Francesca [Syracuse Univ., NY (United States); Shi, Jun [Building Energy and Environmental Systems Lab. (BEESL), Syracuse, NY (United States); Henderson, Hugh [CDH Energy, Cazenovia, NY (United States)

    2013-09-01

    Executive Summary The objective of this study was to develop a “Virtual Design Studio (VDS)”: a software platform for integrated, coordinated and optimized design of green building systems with low energy consumption, high indoor environmental quality (IEQ), and a high level of sustainability. The VDS is intended to assist collaborating architects, engineers and project management team members from the early phases through the detailed building design stages. It can be used to plan design tasks and workflow, and to evaluate the potential impacts of various green building strategies on the building performance, using state-of-the-art simulation tools as well as industrial/professional standards and guidelines for green building system design. The VDS was developed by a multi-disciplinary research team that included architects, engineers, and software developers. Based on a review and analysis of how existing professional practices in building systems design operate, particularly those used in the U.S., Germany and the UK, a generic process for performance-based building design, construction and operation was proposed. It divides the whole process into five distinct stages: Assess, Define, Design, Apply, and Monitoring (ADDAM). The current VDS is focused on the first three stages. The VDS treats building design as a multi-dimensional process involving multiple design teams, design factors, and design stages. The intersection among these three dimensions defines a specific design task in terms of “who”, “what” and “when”. It also treats building design as a multi-objective process that aims to enhance five aspects of performance for green building systems: site sustainability, materials and resource efficiency, water utilization efficiency, energy efficiency and impacts on the atmospheric environment, and IEQ. The current VDS development has been limited to energy efficiency and IEQ performance, with particular focus

  19. Mental abilities and performance efficacy under a simulated 480 meters helium-oxygen saturation diving

    Directory of Open Access Journals (Sweden)

    Gonglin Hou

    2015-07-01

    Full Text Available Stress in extreme environments severely disrupts human physiology and mental abilities. The present study investigated the cognition and performance efficacy of four divers during a simulated 480 meter helium-oxygen saturation dive. We analyzed spatial memory, 2D/3D mental rotation functioning, grip strength, and hand-eye coordination in the four divers during the 0-480 meter compression and decompression phases of the simulated dive. The results showed that, apart from a mild decrease in grip strength, the high-pressure condition significantly impaired hand-eye coordination (especially at 300 meters), the reaction time and correct rate of mental rotation, and spatial memory (especially at 410 meters), with high individual variability. We conclude that human cognition and performance efficacy are significantly affected during deep-water saturation diving.

  20. Sodium bicarbonate supplementation prevents skilled tennis performance decline after a simulated match

    Directory of Open Access Journals (Sweden)

    Huang Ming-Hsiang

    2010-10-01

    Full Text Available Supplementation with sodium bicarbonate (NaHCO3) can increase performance or delay fatigue in intermittent high-intensity exercise. Prolonged tennis matches result in fatigue, which impairs skilled performance. The aim of this study was to investigate the effect of NaHCO3 supplementation on skilled tennis performance after a simulated match. Nine male college tennis players were recruited for this randomized cross-over, placebo-controlled, double-blind study. The participants consumed NaHCO3 (0.3 g/kg) or NaCl (0.209 g/kg) before the trial. An additional supplement of 0.1 g/kg NaHCO3 or 0.07 g/kg NaCl was ingested after the third game of the simulated match. The Loughborough Tennis Skill Test was performed before and after the simulated match. Post-match [HCO3-] and base excess were significantly higher in the bicarbonate trial than in the placebo trial. Blood [lactate] increased significantly in both the placebo (pre: 1.22 ± 0.54; post: 2.17 ± 1.46 mM) and bicarbonate (pre: 1.23 ± 0.41; post: 3.21 ± 1.89 mM) trials. The match-induced change in blood [lactate] was significantly higher in the bicarbonate trial. Blood pH remained unchanged in the placebo trial (pre: 7.37 ± 0.32; post: 7.37 ± 0.14) but increased significantly in the bicarbonate trial (pre: 7.37 ± 0.26; post: 7.45 ± 0.63), indicating a more alkaline environment. The service and forehand ground stroke consistency scores declined significantly after the simulated match in the placebo trial, while they were maintained in the bicarbonate trial. The match-induced declines in the consistency scores were significantly larger in the placebo trial than in the bicarbonate trial. This study suggests that NaHCO3 supplementation can prevent the decline in skilled tennis performance after a simulated match.
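The link between the measured [HCO3-] and blood pH is the Henderson-Hasselbalch relation for the bicarbonate buffer system, which shows directly why raising plasma bicarbonate alkalinizes the blood (the example values below are typical physiological numbers, not the study's data):

```python
import math

def blood_ph(hco3_mmol_l, pco2_mmhg):
    """Henderson-Hasselbalch for the bicarbonate buffer:
    pH = 6.1 + log10([HCO3-] / (0.03 * pCO2)),
    with [HCO3-] in mmol/L and pCO2 in mmHg."""
    return 6.1 + math.log10(hco3_mmol_l / (0.03 * pco2_mmhg))

ph_normal = blood_ph(24.0, 40.0)   # typical resting values -> pH near 7.40
```

Holding pCO2 constant, an increase in [HCO3-] raises the ratio inside the logarithm and therefore the pH, consistent with the more alkaline post-match values in the bicarbonate trial.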

  1. High-performance simulation-based algorithms for an alpine ski racer’s trajectory optimization in heterogeneous computer systems

    Directory of Open Access Journals (Sweden)

    Dębski Roman

    2014-09-01

    Full Text Available Effective simulation-based trajectory optimization algorithms adapted to heterogeneous computers are studied with reference to a problem taken from alpine ski racing (the presented solution is probably the most general one published so far). The key idea behind these algorithms is to use a grid-based discretization scheme to transform the continuous optimization problem into a search problem over a specially constructed finite graph, and then to apply dynamic programming to find an approximation of the global solution. In the analyzed example this is the minimum-time ski line, represented as a piecewise-linear function (a method for eliminating infeasible solutions is proposed). Serial and parallel versions of the basic optimization algorithm are presented in detail (pseudo-code, time and memory complexity). Possible extensions of the basic algorithm are also described. The implementation of these algorithms is based on OpenCL. The included experimental results show that contemporary heterogeneous computers can be treated as μ-HPC platforms: they offer high performance (the best speedup was 128) while remaining energy- and cost-efficient, which is crucial in embedded systems, e.g., trajectory planners of autonomous robots. The presented algorithms can be applied to many trajectory optimization problems, including those with a black-box performance measure.
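The core idea — discretize the lateral position at each gate, then run stage-by-stage dynamic programming over the resulting graph — can be sketched in a few lines. This is a drastically simplified serial illustration (constant speed instead of a ski-dynamics simulation, which is the part the paper actually evaluates per edge):

```python
import math

def min_time_line(gates, speed=10.0, dx=10.0):
    """Dynamic programming over a stage graph. gates[k] is a list of
    candidate lateral positions at gate k; consecutive gates are dx apart.
    The line is piecewise linear and each segment costs length / speed
    (constant speed is an illustrative simplification).
    Returns (node indices per gate, minimum total time)."""
    n = len(gates)
    INF = float("inf")
    cost = [[INF] * len(g) for g in gates]
    prev = [[0] * len(g) for g in gates]
    cost[0] = [0.0] * len(gates[0])
    for k in range(1, n):                     # forward DP sweep
        for j, y in enumerate(gates[k]):
            for i, y0 in enumerate(gates[k - 1]):
                c = cost[k - 1][i] + math.hypot(dx, y - y0) / speed
                if c < cost[k][j]:
                    cost[k][j] = c
                    prev[k][j] = i
    j = min(range(len(gates[-1])), key=lambda m: cost[-1][m])
    best = cost[-1][j]
    path = []
    for k in range(n - 1, -1, -1):            # backtrack the optimal line
        path.append(j)
        j = prev[k][j]
    return path[::-1], best
```

The inner double loop over node pairs at consecutive stages is exactly what the paper parallelizes on the GPU: all `(i, j)` edge evaluations within a stage are independent.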

  2. Utilization of Short-Simulations for Tuning High-Resolution Climate Model

    Science.gov (United States)

    Lin, W.; Xie, S.; Ma, P. L.; Rasch, P. J.; Qian, Y.; Wan, H.; Ma, H. Y.; Klein, S. A.

    2016-12-01

    Many physical parameterizations in atmospheric models are sensitive to resolution. Tuning models that involve a multitude of parameters at high resolution is computationally expensive, particularly when relying primarily on multi-year simulations. This work describes a complementary set of strategies for tuning high-resolution atmospheric models, using ensembles of short simulations to reduce the computational cost and elapsed time. Specifically, we utilize the hindcast approach developed through the DOE Cloud Associated Parameterization Testbed (CAPT) project for high-resolution model tuning, which is guided by a combination of short simulations. Such tests have been found to be effective in numerous previous studies in identifying model biases due to parameterized fast physics, and we demonstrate that they are also useful for tuning. After the most egregious errors are addressed through an initial "rough" tuning phase, longer simulations are performed to "hone in" on model features that evolve over longer timescales. We explore these strategies to tune the DOE ACME (Accelerated Climate Modeling for Energy) model. For the ACME model at 0.25° resolution, it is confirmed that, given the same parameters, major biases in global mean statistics and many spatial features are consistent between Atmospheric Model Intercomparison Project (AMIP)-type simulations and CAPT-type hindcasts, with just a small number of short-term simulations for the latter over the corresponding season. The use of CAPT hindcasts to find parameter choices that reduce large model biases dramatically improves the turnaround time for tuning at high resolution. Improvement seen in CAPT hindcasts generally translates to improved AMIP-type simulations. An iterative CAPT-AMIP tuning approach is therefore adopted during each major tuning cycle, with the former used to survey the likely responses and narrow the parameter space, and the latter to verify the results in a climate context along with assessment in

  3. Two-step simulation of velocity and passive scalar mixing at high Schmidt number in turbulent jets

    Science.gov (United States)

    Rah, K. Jeff; Blanquart, Guillaume

    2016-11-01

    Simulation of a passive scalar in a high Schmidt number turbulent mixing process requires a higher computational cost than that of the velocity fields, because the scalar is associated with smaller length scales than the velocity. Thus, a full simulation of both velocity and a passive scalar with high Sc for a practical configuration is difficult to perform. In this work, a new approach to simulating velocity and passive scalar mixing at high Sc is suggested to reduce the computational cost. First, the velocity fields are resolved by Large Eddy Simulation (LES). Then, using the velocity information extracted from the LES, the scalar inside a moving fluid blob is simulated by Direct Numerical Simulation (DNS). This two-step simulation method is applied to a turbulent jet and provides a new way to examine a scalar mixing process in a practical application at smaller computational cost. NSF, Samsung Scholarship.

  4. Computational Fluid Dynamics and Building Energy Performance Simulation

    DEFF Research Database (Denmark)

    Nielsen, Peter Vilhelm; Tryggvason, T.

    1998-01-01

    An interconnection between a building energy performance simulation program and a Computational Fluid Dynamics program (CFD) for room air distribution will be introduced for improvement of the predictions of both the energy consumption and the indoor environment. The building energy performance simulation program requires a detailed description of the energy flow in the air movement which can be obtained by a CFD program. The paper describes an energy consumption calculation in a large building, where the building energy simulation program is modified by CFD predictions of the flow between three zones connected by open areas with pressure and buoyancy driven air flow. The two programs are interconnected in an iterative procedure. The paper shows also an evaluation of the air quality in the main area of the buildings based on CFD predictions. It is shown that an interconnection between a CFD

  5. Computational Fluid Dynamics and Building Energy Performance Simulation

    DEFF Research Database (Denmark)

    Nielsen, Peter V.; Tryggvason, Tryggvi

    An interconnection between a building energy performance simulation program and a Computational Fluid Dynamics program (CFD) for room air distribution will be introduced for improvement of the predictions of both the energy consumption and the indoor environment. The building energy performance...

  6. Pharmacy Students' Learning and Satisfaction With High-Fidelity Simulation to Teach Drug-Induced Dyspepsia

    Science.gov (United States)

    2013-01-01

    Objective. To assess second-year pharmacy students’ acquisition of pharmacotherapy knowledge and clinical competence from participation in a high-fidelity simulation, and to determine the impact on the simulation experience of implementing feedback from previous students. Design. A high-fidelity simulation was used to present a patient case scenario of drug-induced dyspepsia with gastrointestinal bleeding. The simulation was revised based on feedback from a previous class of students to include a smaller group size, provision of session material to students in advance, and an improved learning environment. Assessment. Student performance on pre- and post-simulation knowledge and clinical competence tests documented significant improvements in students' knowledge of dyspepsia and associated symptoms, with the greatest improvement on questions relating to the hemodynamic effects of gastrointestinal bleeding. Students were more satisfied with the simulation experience compared to students in the earlier study. Conclusion. Participation in a high-fidelity simulation allowed pharmacy students to apply knowledge and skills learned in the classroom. Improved student satisfaction with the simulation suggests that implementing feedback obtained through student course evaluations can be an effective means of improving the curriculum. PMID:23519773

  7. Simulation of high SNR photodetector with L-C coupling and transimpedance amplifier circuit and its verification

    Science.gov (United States)

    Wang, Shaofeng; Xiang, Xiao; Zhou, Conghua; Zhai, Yiwei; Quan, Runai; Wang, Mengmeng; Hou, Feiyan; Zhang, Shougang; Dong, Ruifang; Liu, Tao

    2017-01-01

    In this paper, a model for simulating the optical response and noise performance of photodetectors with an L-C coupling and transimpedance amplification circuit is presented. To verify the simulation, two kinds of photodetectors, based on the same printed-circuit-board (PCB) design and PIN photodiode but different operational amplifiers, were developed and experimentally investigated. Comparisons between the numerical simulation results and the experimentally obtained data show excellent agreement, demonstrating that the model provides a highly efficient guide for the development of a high signal-to-noise-ratio photodetector. Furthermore, the parasitic capacitances on the developed PCB, which are difficult to measure directly but have a non-negligible influence on the photodetectors' performance, are estimated.
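Two textbook quantities dominate such a transimpedance-amplifier (TIA) design: the closed-loop bandwidth set by the op-amp gain-bandwidth product against the feedback resistor and total input capacitance, and the thermal current noise of the feedback resistor. The sketch below uses standard first-order approximations with illustrative component values, not the paper's circuit:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def tia_bandwidth(gbw_hz, r_f, c_in):
    """Approximate -3 dB bandwidth of a resistive-feedback TIA:
    f3dB ~ sqrt(GBW / (2*pi*Rf*Cin)), where Cin is the total input
    capacitance (photodiode + op-amp + PCB parasitics)."""
    return math.sqrt(gbw_hz / (2.0 * math.pi * r_f * c_in))

def rf_thermal_noise(r_f, T=300.0):
    """Thermal current-noise density of the feedback resistor:
    i_n = sqrt(4*k*T/Rf), in A/sqrt(Hz)."""
    return math.sqrt(4.0 * K_B * T / r_f)

# Illustrative values: GBW = 100 MHz op-amp, Rf = 10 kOhm, Cin = 10 pF.
f3db = tia_bandwidth(gbw_hz=100e6, r_f=10e3, c_in=10e-12)
i_n = rf_thermal_noise(10e3)
```

These relations make the role of the hard-to-measure PCB parasitics concrete: extra input capacitance directly lowers the achievable bandwidth, which is why the paper estimates the parasitics by fitting the simulated response to the measured one.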

  8. Investigation of the impact of high liquid viscosity on jet atomization in crossflow via high-fidelity simulations

    Science.gov (United States)

    Li, Xiaoyi; Gao, Hui; Soteriou, Marios C.

    2017-08-01

    Atomization of extremely high-viscosity liquids is of interest for many applications in the aerospace, automotive, pharmaceutical, and food industries. While detailed atomization measurements usually face grand challenges, high-fidelity numerical simulations offer the opportunity to comprehensively explore the atomization details. In this work, a previously validated high-fidelity first-principles simulation code, HiMIST, is utilized to simulate high-viscosity liquid jet atomization in crossflow. The code is used to perform a parametric study of the atomization process over a wide range of Ohnesorge numbers (Oh = 0.004-2) and Weber numbers (We = 10-160). Direct comparisons between the present study and previously published low-viscosity jet-in-crossflow results are performed. The effects of viscous damping and slowing on jet penetration, liquid surface instabilities, ligament formation/breakup, and subsequent droplet formation are investigated. Complex variations in near-field and far-field jet penetration with increasing Oh at different We are observed and linked with the underlying jet deformation and breakup physics. A transition in breakup regimes and an increase in droplet size with increasing Oh are observed, mostly consistent with literature reports. The detailed simulations elucidate a distinctive edge-ligament-breakup dominated process with long-surviving ligaments for the higher Oh cases, as opposed to a two-stage edge-stripping/column-fracture process for the lower Oh counterparts. The trend of decreasing column deflection with increasing We is reversed as Oh increases. A predominantly unimodal droplet size distribution is predicted at higher Oh, in contrast to the bimodal distribution at lower Oh. It has been found that neither Rayleigh-Taylor nor Kelvin-Helmholtz linear stability theory can be easily applied to interpret the distinct edge breakup process, and further study of the underlying physics is needed.

  9. Mastoidectomy performance assessment of virtual simulation training using final-product analysis

    DEFF Research Database (Denmark)

    Andersen, Steven A W; Cayé-Thomasen, Per; Sørensen, Mads S

    2015-01-01

    a modified Welling scale. The simulator gathered basic metrics on time, steps, and volumes in relation to the on-screen tutorial and collisions with vital structures. RESULTS: Substantial inter-rater reliability (kappa = 0.77) for virtual simulation and moderate inter-rater reliability (kappa = 0.59) for dissection final-product assessment was found. The simulation and dissection performance scores had significant correlation (P = .014). None of the basic simulator metrics correlated significantly with the final-product score except for number of steps completed in the simulator. CONCLUSIONS: A modified version of a validated final-product performance assessment tool can be used to assess mastoidectomy on virtual temporal bones. Performance assessment of virtual mastoidectomy could potentially save the use of cadaveric temporal bones for more advanced training when a basic level of competency...

  10. [Lack of correlation between performances in a simulator and in reality].

    Science.gov (United States)

    Konge, Lars; Bitsch, Mikael

    2010-12-13

    Simulation-based training provides obvious benefits for patients and for doctors in education. However, virtual reality simulators are frequently expensive, and the evidence for their efficacy is poor, often as a result of studies with weak methodology and few test participants. In simulated medical training and evaluation programmes, transfer to the real clinical world is always in question. To illustrate this problem, a study was conducted comparing subjects' performance on a bowling simulator with their performance in a real bowling alley. Twenty-five test subjects played two rounds of bowling on a Nintendo Wii and, 25 days later, in a real bowling alley. Correlations between the scores in the first and second rounds (test-retest reliability) and between the scores on the simulator and in reality (criterion validity) were examined, and any difference between female and male performance was tested for. The intraclass correlation coefficient was 0.76, i.e. the simulator measured participant performance fairly accurately. In contrast, there was no correlation at all between participants' real bowling abilities and their scores on the simulator (Pearson's r = 0.06). There was no significant difference between female and male abilities. Simulation-based testing and training must be based on evidence, and more studies including an adequate number of subjects are needed. Bowling competence should not be assessed from Nintendo Wii measurements. Simulated training and evaluation programmes should be validated before introduction, to ensure consistency with the real world.
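
    The study's criterion-validity statistic can be sketched in a few lines. The scores below are invented for illustration and are not the study's data.

```python
# Pearson's r between simulator scores and real-alley scores
# (criterion validity).  Scores below are invented for illustration.
def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

wii_scores   = [120, 95, 140, 110, 130]  # illustrative Wii scores
alley_scores = [100, 130, 110, 95, 120]  # illustrative real-alley scores
r = pearson_r(wii_scores, alley_scores)  # compare against the study's r = 0.06
```

    A value of r near zero, as the study found, means the simulator scores carry essentially no information about real bowling ability.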

  11. Comparison of electron cloud simulation and experiments in the high-current experiment

    International Nuclear Information System (INIS)

    Cohen, R.H.; Friedman, A.; Covo, M. Kireeff; Lund, S.M.; Molvik, A.W.; Bieniosek, F.M.; Seidl, P.A.; Vay, J.-L.; Verboncoeur, J.; Stoltz, P.; Veitzer, S.

    2004-01-01

    A set of experiments has been performed on the High-Current Experiment (HCX) facility at LBNL, in which the ion beam is allowed to collide with an end plate and thereby induce a copious supply of desorbed electrons. Through the use of combinations of biased and grounded electrodes positioned in between and downstream of the quadrupole magnets, the flow of electrons upstream into the magnets can be turned on or off. Properties of the resultant ion beam are measured under each condition. The experiment is modeled via a full three-dimensional, two species (electron and ion) particle simulation, as well as via reduced simulations (ions with appropriately chosen model electron cloud distributions, and a high-resolution simulation of the region adjacent to the end plate). The three-dimensional simulations are the first of their kind and the first to make use of a timestep-acceleration scheme that allows the electrons to be advanced with a timestep that is not small compared to the highest electron cyclotron period. The simulations reproduce qualitative aspects of the experiments, illustrate some unanticipated physical effects, and serve as an important demonstration of a developing simulation capability

  12. GNES-R: Global nuclear energy simulator for reactors task 1: High-fidelity neutron transport

    International Nuclear Information System (INIS)

    Clarno, K.; De Almeida, V.; D'Azevedo, E.; De Oliveira, C.; Hamilton, S.

    2006-01-01

    A multi-laboratory, multi-university collaboration has formed to advance the state of the art in high-fidelity, coupled-physics simulation of nuclear energy systems. We are embarking on the first phase in the development of a new suite of simulation tools dedicated to the advancement of nuclear science and engineering technologies. We seek to develop and demonstrate a new generation of multi-physics simulation tools that will explore the scientific phenomena of tightly coupled physics parameters within nuclear systems, support the design and licensing of advanced nuclear reactors, and provide benchmark-quality solutions for code validation. In this paper, we present the general scope of the collaborative project and discuss the specific challenges of high-fidelity neutronics for nuclear reactor simulation and the inroads we have made along this path. The high-performance computing neutronics code system utilizes the latest version of SCALE to generate accurate, problem-dependent cross sections, which are used in NEWTRNX - a new 3-D, general-geometry, discrete-ordinates solver based on the Slice-Balance Approach. The Global Nuclear Energy Simulator for Reactors (GNES-R) team is embarking on a long-term simulation development project that encompasses multiple laboratories and universities for the expansion of high-fidelity coupled-physics simulation of nuclear energy systems. (authors)

  13. Design of the HELICS High-Performance Transmission-Distribution-Communication-Market Co-Simulation Framework

    Energy Technology Data Exchange (ETDEWEB)

    Palmintier, Bryan S [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Krishnamurthy, Dheepak [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Top, Philip [Lawrence Livermore National Laboratories; Smith, Steve [Lawrence Livermore National Laboratories; Daily, Jeff [Pacific Northwest National Laboratory; Fuller, Jason [Pacific Northwest National Laboratory

    2017-10-12

    This paper describes the design rationale for a new cyber-physical-energy co-simulation framework for electric power systems. This new framework will support very large-scale (100,000+ federates) co-simulations with off-the-shelf power-systems, communication, and end-use models. Other key features include cross-platform operating system support, integration of both event-driven (e.g. packetized communication) and time-series (e.g. power flow) simulation, and the ability to co-iterate among federates to ensure model convergence at each time step. After describing requirements, we begin by evaluating existing co-simulation frameworks, including HLA and FMI, and conclude that none provide the required features. Then we describe the design for the new layered co-simulation architecture.

  14. A SOFTWARE TOOL TO COMPARE MEASURED AND SIMULATED BUILDING ENERGY PERFORMANCE DATA

    Energy Technology Data Exchange (ETDEWEB)

    Maile, Tobias; Bazjanac, Vladimir; O' Donnell, James; Garr, Matthew

    2011-11-01

    Building energy performance is often inadequate when compared to design goals. To link design goals to actual operation, one can compare measured with simulated energy performance data. Our previously developed comparison approach is the Energy Performance Comparison Methodology (EPCM), which enables the identification of performance problems based on a comparison of measured and simulated performance data. In the context of this method, we developed a software tool that provides graphing and data processing capabilities for the two performance data sets. The software tool, called SEE IT (Stanford Energy Efficiency Information Tool), eliminates the need for manual generation of data plots and data reformatting. SEE IT makes the generation of time series, scatter and carpet plots independent of the source of data (measured or simulated) and provides a valuable tool for comparing measurements with simulation results. SEE IT also allows assigning data points to a predefined building object hierarchy and supports different versions of simulated performance data. This paper briefly introduces the EPCM, describes the SEE IT tool and illustrates its use in the context of a building case study.
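
    As an illustration of the kind of comparison such a tool automates, the snippet below aligns a measured and a simulated hourly series and computes CV(RMSE), a common calibration metric for building energy models. The data and the choice of metric are assumptions for illustration, not details from the paper.

```python
# Measured vs. simulated hourly energy use (illustrative numbers, e.g. kWh).
measured  = [20.0, 22.5, 25.0, 24.0, 21.5]
simulated = [19.5, 23.0, 26.0, 23.0, 21.0]

n = len(measured)
# Root-mean-square error between the two series...
rmse = (sum((m - s) ** 2 for m, s in zip(measured, simulated)) / n) ** 0.5
# ...normalised by the measured mean to give CV(RMSE).
cv_rmse = rmse / (sum(measured) / n)
```

    A low CV(RMSE) indicates the simulation tracks the measurements closely; plotting the same two series as time-series or carpet plots is what reveals *when* the discrepancies occur.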

  15. Adaptive Performance-Constrained in Situ Visualization of Atmospheric Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Dorier, Matthieu; Sisneros, Roberto; Bautista Gomez, Leonard; Peterka, Tom; Orf, Leigh; Rahmani, Lokman; Antoniu, Gabriel; Bouge, Luc

    2016-09-12

    While many parallel visualization tools now provide in situ visualization capabilities, the trend has been to feed such tools with large amounts of unprocessed output data and let them render everything at the highest possible resolution. This leads to an increased run time of simulations that still have to complete within a fixed-length job allocation. In this paper, we tackle the challenge of enabling in situ visualization under performance constraints. Our approach shuffles data across processes according to its content and filters out part of it in order to feed a visualization pipeline with only a reorganized subset of the data produced by the simulation. Our framework leverages fast, generic evaluation procedures to score blocks of data, using information theory, statistics, and linear algebra. It monitors its own performance and adapts dynamically to achieve appropriate visual fidelity within predefined performance constraints. Experiments on the Blue Waters supercomputer with the CM1 simulation show that our approach enables a 5× speedup with respect to the initial visualization pipeline and is able to meet performance constraints.
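
    A minimal sketch of the content-based scoring idea: rank blocks by the Shannon entropy of their value histogram and keep only the top fraction within the rendering budget. The block contents, bin count, and 50% budget below are illustrative assumptions, not the framework's actual procedures.

```python
import math

def block_entropy(block, bins=8):
    """Shannon entropy (bits) of a block's value histogram."""
    lo, hi = min(block), max(block)
    if hi == lo:
        return 0.0  # constant block: nothing worth visualizing
    counts = [0] * bins
    for v in block:
        counts[min(int((v - lo) / (hi - lo) * bins), bins - 1)] += 1
    total = len(block)
    return -sum(c / total * math.log2(c / total) for c in counts if c)

blocks = [
    [0.0] * 16,                         # featureless block
    [float(i % 4) for i in range(16)],  # some structure
    [float(i) for i in range(16)],      # most structure
]
scores = [block_entropy(b) for b in blocks]
# Keep only the highest-scoring half of the blocks (the "budget").
keep = sorted(range(len(blocks)), key=lambda i: -scores[i])[:len(blocks) // 2]
```

    In the actual framework the scores drive a shuffle and filter across processes, and the kept fraction adapts at run time to stay within the performance constraint.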

  16. Design and experimental measurement of a high-performance metamaterial filter

    Science.gov (United States)

    Xu, Ya-wen; Xu, Jing-cheng

    2018-03-01

    Metamaterial filters are a promising kind of optoelectronic device. In this paper, a metal/dielectric/metal (M/D/M) metamaterial filter is simulated and measured. Simulation results indicate that perfect impedance matching between the metamaterial filter and free space produces the transmission band. Measured results show that the proposed metamaterial filter achieves high-performance transmission for both TM and TE polarizations. Moreover, a high transmission rate is maintained even at incidence angles up to 45°. Further measurements show that the transmission band can be broadened, and its central frequency adjusted, by optimizing the structural parameters. The physical mechanism behind the central-frequency shift is explained by establishing an equivalent resonant-circuit model.

  17. Investigation of the performance of 0-D and 3-D combustion simulation software for modelling an HCCI engine with high air excess ratios

    Directory of Open Access Journals (Sweden)

    Gökhan Coşkun

    2017-10-01

    Full Text Available In this study, the performance of zero- and three-dimensional simulation codes used to simulate a homogeneous charge compression ignition (HCCI) engine fueled with Primary Reference Fuel (PRF; 85% iso-octane and 15% n-heptane) was investigated. The 0-D code, SRM Suite (Stochastic Reactor Model), simulates engine combustion using a stochastic reactor model technique. Ansys-Fluent, a computational fluid dynamics (CFD) code, was used for the 3-D engine combustion simulations. Both commercial codes were evaluated in terms of combustion, heat transfer and emissions in an HCCI engine. A chemical kinetic mechanism developed by Tsurushima, including 33 species and 38 reactions for the surrogate PRF fuel, was used for the combustion simulations. The analysis showed that each code has advantages over the other.

  18. Performance Optimization of the ATLAS Detector Simulation

    CERN Document Server

    AUTHOR|(CDS)2091018

    In the thesis at hand the current performance of the ATLAS detector simulation, part of the Athena framework, is analyzed and possible optimizations are examined. For this purpose the event based sampling profiler VTune Amplifier by Intel is utilized. As the most important metric to measure improvements, the total execution time of the simulation of $t\\bar{t}$ events is also considered. All efforts are focused on structural changes, which do not influence the simulation output and can be attributed to CPU specific issues, especially front end stalls and vectorization. The most promising change is the activation of profile guided optimization for Geant4, which is a critical external dependency of the simulation. Profile guided optimization gives an average improvement of $8.9\\%$ and $10.0\\%$ for the two considered cases at the cost of one additional compilation (instrumented binaries) and execution (training to obtain profiling data) at build time.

  19. Physiological responses and performance in a simulated trampoline gymnastics competition in elite male gymnasts

    DEFF Research Database (Denmark)

    Jensen, Peter; Scott, Suzanne; Krustrup, Peter

    2013-01-01

    Abstract Physiological responses and performance were examined during and after a simulated trampoline competition (STC). Fifteen elite trampoline gymnasts participated, of which eight completed two routines (EX1 and EX2) and a competition final (EX3). Trampoline-specific activities were...... gymnastic competition includes a high number of repeated explosive and energy-demanding jumps, which impairs jump performance during and 24 h post-competition....

  20. High-performance zig-zag and meander inductors embedded in ferrite material

    International Nuclear Information System (INIS)

    Stojanovic, Goran; Damnjanovic, Mirjana; Desnica, Vladan; Zivanov, Ljiljana; Raghavendra, Ramesh; Bellew, Pat; Mcloughlin, Neil

    2006-01-01

    This paper describes the design, modeling, simulation and fabrication of zig-zag and meander inductors embedded in low- or high-permeability soft ferrite material. These microinductors have been developed with ceramic coprocessing technology. We compare the electrical properties of zig-zag and meander inductor structures installed as surface-mount devices. An equivalent model of the new structures is presented, suitable for design, circuit simulation and prediction of the performance of the proposed inductors. The relatively high impedance values allow these microinductors to be used in high-frequency suppressors. The components were tested in the frequency range of 1 MHz-3 GHz using an Agilent 4287A RF LCR meter. The measurements confirm the validity of the analytical model.

  1. Comparison of performance of simulation models for floor heating

    DEFF Research Database (Denmark)

    Weitzmann, Peter; Svendsen, Svend

    2005-01-01

    This paper describes the comparison of performance of simulation models for floor heating with different level of detail in the modelling process. The models are compared in an otherwise identical simulation model containing room model, walls, windows, ceiling and ventilation system. By exchanging...

  2. Building Performance Simulation for Sustainable Energy Use in Buildings

    NARCIS (Netherlands)

    Hensen, J.L.M.

    2010-01-01

    This paper aims to provide a general view of the background and current state of building performance simulation, which has the potential to deliver, directly or indirectly, substantial benefits to building stakeholders and to the environment. However the building simulation community faces many

  3. Building performance simulation for sustainable building design and operation

    NARCIS (Netherlands)

    Hensen, J.L.M.

    2011-01-01

    This paper aims to provide a general view of the background and current state of building performance simulation, which has the potential to deliver, directly or indirectly, substantial benefits to building stakeholders and to the environment. However the building simulation community faces many

  4. Simulating extreme environments: Ergonomic evaluation of Chinese pilot performance and heat stress tolerance.

    Science.gov (United States)

    Li, Jing; Tian, Yinsheng; Ding, Li; Zou, Huijuan; Ren, Zhaosheng; Shi, Liyong; Feathers, David; Wang, Ning

    2015-06-05

    High temperatures in the cockpit environment can adversely influence pilot behavior and performance. This study investigated the impact of high thermal environments on Chinese pilot performance in a simulated cockpit environment. Ten subjects volunteered to participate in tests under 40°C and 45°C high-temperature simulations in an environmentally controlled chamber. Measures such as grip strength, perception, dexterity, somatic-sense reaction, and analytical reasoning were taken, and the results were compared to the Combined Index of Heat Stress (CIHS). At 40°C, CIHS exceeded the heat-stress safety limit after 45 min; grip strength decreased by 12% and somatic perception grew to 2.89 times its initial value. At 45°C, CIHS exceeded the safety limit after only 20 min, while grip strength decreased by just 3.2% and somatic perception increased to 4.36 times its initial value. Reaction and finger dexterity were not statistically different from baseline measurements, but the error rate on the analytical reasoning test rose markedly. Somatic perception was the most sensitive index to high temperature, followed by grip strength. The results of this paper may help to improve the environmental control design of new fighter cockpits and to support pilot physiology and cockpit-environment ergonomics research for Chinese pilots.

  5. A high precision dual feedback discrete control system designed for satellite trajectory simulator

    Science.gov (United States)

    Liu, Ximin; Liu, Liren; Sun, Jianfeng; Xu, Nan

    2005-08-01

    Cooperating with free-space laser communication terminals, the satellite trajectory simulator is used to test the acquisition, pointing, tracking and communication performance of the terminals, so it plays an important role in terminal ground testing and verification. Using a double prism, Sun et al. in our group designed a satellite trajectory simulator. In this paper, a high-precision dual-feedback discrete control system designed for the simulator is given, and a digital fabrication of the simulator is made correspondingly. In the dual-feedback discrete control system, a proportional-integral controller is used in the velocity feedback loop and a proportional-integral-derivative controller is used in the position feedback loop. In the controller design, the simplex method is introduced and an improvement to the method is made. According to the transfer function of the control system in the Z domain, the digital fabrication of the simulator is given when it is exposed to mechanism error and moment disturbance. Typically, when the mechanism error is 100 µrad, the residual standard errors of the pitch angle, azimuth angle, x-coordinate position and y-coordinate position are 0.49 µrad, 6.12 µrad, 4.56 µrad and 4.09 µrad respectively. When the moment disturbance is 0.1 rad, the corresponding residual standard errors are 0.26 µrad, 0.22 µrad, 0.16 µrad and 0.15 µrad. The digital fabrication results demonstrate that the dual-feedback discrete control system designed for the simulator can achieve the anticipated high-precision performance.
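
    The cascaded loop structure described above can be sketched as follows. The gains, timestep, and one-step example are illustrative assumptions, not the paper's design values.

```python
# Discrete PID update (PI is the special case kd = 0).  The outer
# position loop produces a velocity command; the inner velocity loop
# produces the actuator command.  All numbers are illustrative.
def make_pid(kp, ki, kd, dt):
    state = {"integral": 0.0, "prev_err": 0.0}
    def step(err):
        state["integral"] += err * dt
        deriv = (err - state["prev_err"]) / dt
        state["prev_err"] = err
        return kp * err + ki * state["integral"] + kd * deriv
    return step

dt = 0.001                                               # 1 kHz loop (assumed)
position_pid = make_pid(kp=8.0, ki=2.0, kd=0.05, dt=dt)  # outer PID loop
velocity_pi  = make_pid(kp=4.0, ki=1.0, kd=0.0,  dt=dt)  # inner PI loop

# One control step: 10 mrad position error, measured velocity zero.
vel_cmd  = position_pid(0.01)
actuator = velocity_pi(vel_cmd - 0.0)
```

    Closing the fast velocity loop inside the position loop is what lets the system reject moment disturbances before they accumulate into position error.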

  6. Impact of reactive settler models on simulated WWTP performance

    DEFF Research Database (Denmark)

    Gernaey, Krist; Jeppsson, Ulf; Batstone, Damien J.

    2006-01-01

    for an ASM1 case study. Simulations with a whole-plant model including the non-reactive Takacs settler model are used as a reference, and are compared to simulation results considering two reactive settler models. The first is a return-sludge model block removing oxygen and a user-defined fraction of nitrate, combined with a non-reactive Takacs settler. The second is a fully reactive ASM1 Takacs settler model. Simulations with the ASM1 reactive settler model predicted a 15.3% and 7.4% improvement of the simulated N removal performance, for constant (steady-state) and dynamic influent conditions respectively. The oxygen/nitrate return-sludge model block predicts a 10% improvement of N removal performance under dynamic conditions, and might be the better modelling option for ASM1 plants: it is computationally more efficient and it will not overrate the importance of decay processes in the settler.

  7. Polysilicon high frequency devices for large area electronics: Characterization, simulation and modeling

    Energy Technology Data Exchange (ETDEWEB)

    Botrel, J L [CEA-LETI 17, rue des Martyrs 38054 Grenoble (France); IMEP 23, rue des Martyrs 38016 Grenoble (France)], E-mail: jean-loius.botrel@cea.fr; Savry, O; Rozeau, O; Templier, F [CEA-LETI 17, rue des Martyrs 38054 Grenoble (France); Jomaah, J [IMEP 23, rue des Martyrs 38016 Grenoble (France)

    2007-07-16

    Laser-crystallised polysilicon thin-film transistors now have sufficiently good conduction properties to be used in high-frequency applications. In this work, we report results for 5 µm long polysilicon TFTs obtained at frequencies up to several hundred MHz for applications such as RFID tags or system-on-panel. To investigate the device operation, DC and AC two-dimensional simulations of these devices in the effective-medium framework were performed. In the light of the simulation results, the effects of carrier trapping and carrier transit on the device capacitances as a function of dimensions are analysed and compared. An equivalent small-signal circuit which accounts for the behaviour of these transistors in all regions of operation is proposed, and a model for the most relevant elements of this circuit is presented. To validate our simulation results, scattering-parameter (S-parameter) measurements were performed for several structures such as multi-finger, serpentine and linear architectures, and the most meaningful parameters are given. Cut-off frequencies as high as 300 MHz and maximum oscillation frequencies of about 600 MHz have been extracted.

  8. High Fidelity Simulation of Primary Atomization in Diesel Engine Sprays

    Science.gov (United States)

    Ivey, Christopher; Bravo, Luis; Kim, Dokyun

    2014-11-01

    A high-fidelity numerical simulation of jet breakup and spray formation from a complex diesel fuel injector at ambient conditions has been performed. A full understanding of the primary atomization process in diesel fuel injection has not been achieved, in part because of the difficulty of optically accessing the dense near-nozzle region. Owing to recent advances in numerical methods and computing resources, high-fidelity simulations of atomizing flows are becoming available to provide new insights into the process. In the present study, an unstructured, un-split volume-of-fluid (VoF) method coupled to a stochastic Lagrangian spray model is employed to simulate the atomization process. A common-rail fuel injector is simulated using a nozzle geometry available through the Engine Combustion Network. The working conditions correspond to a single-orifice (90 μm) JP-8 fueled injector operating at an injection pressure of 90 bar and an ambient condition of 29 bar and 300 K with 100% nitrogen, giving Rel = 16,071 and Wel = 75,334 and setting the spray in the full atomization mode. The experimental dataset from the Army Research Laboratory is used for validation in terms of global spray parameters and local droplet distributions. The quantitative comparison will be presented and discussed. Supported by Oak Ridge Associated Universities and the Army Research Laboratory.

  9. Validation of high-resolution aerosol optical thickness simulated by a global non-hydrostatic model against remote sensing measurements

    Science.gov (United States)

    Goto, Daisuke; Sato, Yousuke; Yashiro, Hisashi; Suzuki, Kentaroh; Nakajima, Teruyuki

    2017-02-01

    A high-performance computing resource allows us to conduct numerical simulations with a horizontal grid spacing sufficiently fine to resolve cloud systems. The cutting-edge computational capability provided by the K computer at RIKEN in Japan enabled the authors to perform long-term, global simulations of air pollution and clouds with unprecedentedly high horizontal resolution. In this study, a next-generation model capable of simulating global air pollution with O(10 km) grid spacing, built by coupling an atmospheric chemistry model to the Non-hydrostatic Icosahedral Atmospheric Model (NICAM), was used. With the newly developed model, month-long simulations for July were conducted with 14 km grid spacing on the K computer. Regarding the global distribution of aerosol optical thickness (AOT), the correlation coefficient (CC) between the simulation and AERONET measurements was approximately 0.7, and the normalized mean bias was -10%. The simulated AOT was also compared with satellite-retrieved values; the CC was approximately 0.6. The radiative effects of each chemical species (dust, sea salt, organics, and sulfate) were also calculated and compared with multiple measurements. The simulated fluxes of upward shortwave radiation at the top of the atmosphere and at the surface compared well with the observed values, whereas the downward shortwave radiation at the surface was underestimated even when all aerosol components were considered. However, the aerosol radiative effects on the downward shortwave flux at the surface were found to be as large as 10 W/m2 on a global scale; simulated aerosol distributions can therefore strongly affect the simulated air temperature and dynamic circulation.
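
    The two headline validation metrics are straightforward to state precisely. The sketch below computes them on invented numbers, not on AERONET or satellite data.

```python
# Pearson correlation coefficient (CC) and normalized mean bias (NMB)
# between simulated and observed AOT.  Values below are invented.
def correlation(sim, obs):
    n = len(sim)
    ms, mo = sum(sim) / n, sum(obs) / n
    cov = sum((s - ms) * (o - mo) for s, o in zip(sim, obs))
    ss = sum((s - ms) ** 2 for s in sim) ** 0.5
    so = sum((o - mo) ** 2 for o in obs) ** 0.5
    return cov / (ss * so)

def normalized_mean_bias(sim, obs):
    return sum(s - o for s, o in zip(sim, obs)) / sum(obs)

sim_aot = [0.18, 0.25, 0.40, 0.31, 0.22]  # illustrative simulated AOT
obs_aot = [0.20, 0.30, 0.45, 0.33, 0.25]  # illustrative observed AOT
cc = correlation(sim_aot, obs_aot)
nmb = normalized_mean_bias(sim_aot, obs_aot)  # negative -> underestimation
```

    An NMB of -10%, as reported above, means the simulated AOT sits on average 10% below the measurements even when the spatial pattern (captured by CC) is well reproduced.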

  10. The Effects of Training on Anxiety and Task Performance in Simulated Suborbital Spaceflight.

    Science.gov (United States)

    Blue, Rebecca S; Bonato, Frederick; Seaton, Kimberly; Bubka, Andrea; Vardiman, Johnené L; Mathers, Charles; Castleberry, Tarah L; Vanderploeg, James M

    2017-07-01

    In commercial spaceflight, anxiety could become mission-impacting, causing negative experiences or endangering the flight itself. We studied layperson responses to four training programs of varied length (ranging from 1 h to 2 d of preparation) prior to centrifuge simulation of the launch and re-entry acceleration profiles expected during suborbital spaceflight. We examined subjects' task execution, evaluating performance under high-stress conditions, and sought to identify any trends in demographics, hemodynamics, or similar factors among subjects with the highest anxiety or poorest tolerance of the experience. Volunteers participated in one of four centrifuge training programs of varied complexity and duration, culminating in two simulated suborbital spaceflights. At most, subjects underwent seven centrifuge runs over 2 d, including two +Gz runs (peak +3.5 Gz, Run 2) and two +Gx runs (peak +6.0 Gx, Run 4), followed by three runs approximating suborbital spaceflight profiles (combined +Gx and +Gz, peak +6.0 Gx and +4.0 Gz). Two cohorts also received dedicated anxiety-mitigation training. Subjects were evaluated on their performance on various tasks, including a simulated emergency. A total of 148 subjects participated in 2-7 centrifuge exposures (105 men, 43 women; age range 19-72 yr, mean 39.4 ± 13.2 yr; body mass index range 17.3-38.1, mean 25.1 ± 3.7). Ten subjects withdrew or limited their G exposure; a history of motion sickness was associated with opting out. Shorter training programs were associated with elevated hemodynamic responses, and single-direction G training did not significantly improve tolerance. Training programs appear most effective when they are high fidelity, and sequential exposures may improve tolerance of the physical and psychological stressors of flight. The studied variables did not predict anxiety-related responses to these centrifuge profiles.

  11. Use of high performance computing to examine the effectiveness of aquifer remediation

    International Nuclear Information System (INIS)

    Tompson, A.F.B.; Ashby, S.F.; Falgout, R.D.; Smith, S.G.; Fogwell, T.W.; Loosmore, G.A.

    1994-06-01

    Large-scale simulation of fluid flow and chemical migration is being used to study the effectiveness of pump-and-treat restoration of a contaminated, saturated aquifer. A three-element approach focusing on geostatistical representations of heterogeneous aquifers, high-performance computing strategies for simulating flow, migration, and reaction processes in large three-dimensional systems, and highly-resolved simulations of flow and chemical migration in porous formations will be discussed. Results from a preliminary application of this approach to examine pumping behavior at a real, heterogeneous field site will be presented. Future activities will emphasize parallel computations in larger, dynamic, and nonlinear (two-phase) flow problems as well as improved interpretive methods for defining detailed material property distributions

  12. Hydraulic performance numerical simulation of high specific speed mixed-flow pump based on quasi three-dimensional hydraulic design method

    International Nuclear Information System (INIS)

    Zhang, Y X; Su, M; Hou, H C; Song, P F

    2013-01-01

    This research adopts the quasi three-dimensional hydraulic design method for the impeller of a high specific speed mixed-flow pump, with the purpose of verifying the hydraulic design method and improving hydraulic performance. Based on the two-families-of-stream-surface theory, the direct problem is completed when the meridional flow field of the impeller is obtained by employing iterative calculation to solve the continuity and momentum equations of the fluid. The inverse problem is completed by using the meridional flow field calculated in the direct problem. After several iterations of the direct and inverse problems, the shape of the impeller and the flow field information are obtained once the result of the iteration satisfies the convergence criteria. Subsequently the internal flow field of the designed pump is simulated by using the RANS equations with the RNG k-ε two-equation turbulence model. The static pressure and streamline distributions at the symmetrical cross-section, the vector velocity distribution around the blades and the reflux phenomenon are analyzed. The numerical results show that the quasi three-dimensional hydraulic design method for high specific speed mixed-flow pumps improves the hydraulic performance, reveals the main characteristics of the internal flow of the mixed-flow pump, and provides a basis for judging the rationality of the hydraulic design and for improvement and optimization of the hydraulic model.

  13. Cooperative simulation of lithography and topography for three-dimensional high-aspect-ratio etching

    Science.gov (United States)

    Ichikawa, Takashi; Yagisawa, Takashi; Furukawa, Shinichi; Taguchi, Takafumi; Nojima, Shigeki; Murakami, Sadatoshi; Tamaoki, Naoki

    2018-06-01

    A topography simulation of high-aspect-ratio etching considering transports of ions and neutrals is performed, and the mechanism of reactive ion etching (RIE) residues in three-dimensional corner patterns is revealed. Limited ion flux and CF2 diffusion from the wide space of the corner is found to have an effect on the RIE residues. Cooperative simulation of lithography and topography is used to solve the RIE residue problem.

  14. Improving the performance of a filling line based on simulation

    Science.gov (United States)

    Jasiulewicz-Kaczmarek, M.; Bartkowiak, T.

    2016-08-01

    The paper describes a method of improving the performance of a filling line based on simulation. The study concerns a production line located in a manufacturing centre of an FMCG company. A discrete event simulation model was built using data provided by the maintenance data acquisition system. Two types of failures were identified in the system and were approximated using continuous statistical distributions. The model was validated taking into consideration line performance measures. A brief Pareto analysis of line failures was conducted to identify potential areas of improvement. Two improvement scenarios were proposed and tested via simulation. The outcomes of the simulations were the basis of a financial analysis. NPV and ROI values were calculated taking into account depreciation, profits, losses, the current CIT rate and inflation. A validated simulation model can be a useful tool in the maintenance decision-making process.
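The NPV and ROI appraisal described in this record can be sketched as follows. All figures (investment, savings, discount rate) are hypothetical placeholders for illustration, not values from the study:

```python
def npv(rate, cash_flows):
    """Net present value; cash_flows[0] occurs now (period 0)."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

def roi(gain, cost):
    """Simple return on investment: net gain relative to cost."""
    return (gain - cost) / cost

# Hypothetical improvement scenario: 100k invested now, 40k annual
# savings (after CIT) for 4 years, discounted at a rate meant to
# reflect cost of capital and inflation.
cash_flows = [-100_000, 40_000, 40_000, 40_000, 40_000]
discount_rate = 0.08

print(round(npv(discount_rate, cash_flows), 2))
print(round(roi(4 * 40_000, 100_000), 2))
```

A positive NPV under the chosen discount rate is the usual accept criterion for a scenario; ROI here ignores the time value of money, which is why both measures are reported together.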

  15. submitter Simulation-Based Performance Analysis of the ALICE Mass Storage System

    CERN Document Server

    Vickovic, L; Celar, S

    2016-01-01

    CERN, the European Organization for Nuclear Research, is today, in the era of big data, one of the biggest data generators in the world. Especially interesting is the transient data storage system of the ALICE experiment. With the goal of optimizing its performance, this paper discusses a dynamic, discrete event simulation model of a disk-based Storage Area Network (SAN) and its usage for performance analyses. The storage system model is based on a modular, bottom-up approach, and the differences between measured and simulated values vary between 1.5% and 4% depending on the simulated component. Once finished, the simulation model was used for detailed performance analyses. Among other findings, it showed that system performance can be seriously affected if the array stripe size is larger than the size of the cache on the individual disks in the array, which so far has been completely ignored in the literature.
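The configuration rule implied by the study's main finding can be expressed as a simple check. The function name and the example sizes are illustrative, not from the paper:

```python
def stripe_config_ok(stripe_size_kib, disk_cache_kib):
    """Flag the condition the SAN study found harmful: an array stripe
    larger than the per-disk cache defeats the disk's read-ahead cache
    on stripe-sized accesses."""
    return stripe_size_kib <= disk_cache_kib

# Hypothetical array: 1 MiB stripes, 512 KiB of cache per disk.
print(stripe_config_ok(1024, 512))  # degraded performance expected
```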

  16. On the increase of predictive performance with high-level data fusion

    International Nuclear Information System (INIS)

    Doeswijk, T.G.; Smilde, A.K.; Hageman, J.A.; Westerhuis, J.A.; Eeuwijk, F.A. van

    2011-01-01

    The combination of different data sources for classification purposes, also called data fusion, can be done at different levels: low-level, i.e. concatenating data matrices; medium-level, i.e. concatenating data matrices after feature selection; and high-level, i.e. combining model outputs. In this paper the predictive performance of high-level data fusion is investigated. Partial least squares is used on each of the data sets, with dummy variables representing the classes as response variables. Based on the estimated response ŷ_j for data set j and class k, a Gaussian distribution p(g_k|ŷ_j) is fitted. A simulation study is performed that shows the theoretical performance of high-level data fusion for two classes and two data sets. Within-group correlations of the predicted responses of the two models and differences between the predictive ability of each of the separate models and the fused models are studied. Results show that the error rate is always less than or equal to that of the best performing subset and can theoretically approach zero. Negative within-group correlations always improve the predictive performance. However, if the data sets have a joint basis, as with metabolomics data, this is not likely to happen. For equally performing individual classifiers the best results are expected for small within-group correlations. Fusion of a non-predictive classifier with a classifier that exhibits discriminative ability leads to increased predictive performance if the within-group correlations are strong. An example with real-life data shows the applicability of the simulation results.
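A minimal sketch of high-level fusion in the spirit of this record: each data set's model produces a predicted response, a Gaussian density is fitted per class, and the densities are combined with an independence assumption. The function names, the naive-Bayes-style product combination, and all parameter values are illustrative assumptions, not the paper's exact procedure:

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def fuse(yhat, params):
    """High-level fusion: multiply per-data-set class-conditional
    Gaussian densities (assumed independent) and pick the best class.

    yhat   -- predicted responses, one per data set j
    params -- params[j][k] = (mu, sigma) of class k's predicted
              response under the model for data set j
    """
    n_classes = len(params[0])
    scores = []
    for k in range(n_classes):
        score = 1.0
        for j, y in enumerate(yhat):
            mu, sigma = params[j][k]
            score *= gaussian_pdf(y, mu, sigma)
        scores.append(score)
    return max(range(n_classes), key=lambda k: scores[k])

# Hypothetical two-class, two-data-set setup: dummy responses near 0
# for class 0 and near 1 for class 1, as with PLS on dummy variables.
params = [
    [(0.0, 0.3), (1.0, 0.3)],  # data set 1
    [(0.1, 0.4), (0.9, 0.4)],  # data set 2
]
print(fuse([0.8, 0.7], params))  # both models lean toward class 1
```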

  17. Scalability of DL_POLY on High Performance Computing Platform

    Directory of Open Access Journals (Sweden)

    Mabule Samuel Mabakane

    2017-12-01

    Full Text Available This paper presents a case study on the scalability of several versions of the molecular dynamics code (DL_POLY) performed on South Africa's Centre for High Performance Computing e1350 IBM Linux cluster, Sun system and Lengau supercomputers. Within this study different problem sizes were designed and the same chosen systems were employed in order to test the performance of DL_POLY using weak and strong scalability. It was found that the speed-up results for the small systems were better than for large systems on both the Ethernet and Infiniband networks. However, simulations of large systems in DL_POLY performed well using the Infiniband network on the Lengau cluster as compared to the e1350 and Sun supercomputers.
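The strong- and weak-scaling measures used in studies like this one reduce to simple ratios of run times. The timings below are hypothetical, not the paper's data:

```python
def speedup(t_serial, t_parallel):
    """Strong-scaling speed-up: same problem size, more cores."""
    return t_serial / t_parallel

def strong_efficiency(t1, tp, p):
    """Parallel efficiency on p cores for a fixed problem size."""
    return t1 / (p * tp)

def weak_efficiency(t1, tp):
    """Weak-scaling efficiency: problem size grows with core count,
    so the ideal run time stays constant."""
    return t1 / tp

# Hypothetical timings (seconds) for a fixed-size molecular dynamics run:
t1, t16 = 640.0, 50.0
print(speedup(t1, t16))                # 12.8x on 16 cores
print(strong_efficiency(t1, t16, 16))  # 80% of ideal
```

Strong-scaling efficiency typically degrades for small systems at high core counts (too little work per core relative to communication), which is consistent with the interconnect-dependent behaviour the study reports.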

  18. Fracture modelling of a high performance armour steel

    Science.gov (United States)

    Skoglund, P.; Nilsson, M.; Tjernberg, A.

    2006-08-01

    The fracture characteristics of the high performance armour steel Armox 500T are investigated. Tensile mechanical experiments using samples with different notch geometries are used to investigate the effect of multi-axial stress states on the strain to fracture. The experiments are numerically simulated and from the simulation the stress at the point of fracture initiation is determined as a function of strain, and these data are then used to extract parameters for fracture models. A fracture model based on quasi-static experiments is suggested and the model is tested against independent experiments done at both static and dynamic loading. The results show that the fracture model gives reasonably good agreement between simulations and experiments under both static and dynamic loading conditions. This indicates that multi-axial loading is more important to the strain to fracture than the deformation rate in the investigated loading range. However, ongoing work will further characterise the fracture behaviour of Armox 500T.

  19. Effects of incentives on psychosocial performances in simulated space-dwelling groups

    Science.gov (United States)

    Hienz, Robert D.; Brady, Joseph V.; Hursh, Steven R.; Gasior, Eric D.; Spence, Kevin R.; Emurian, Henry H.

    Prior research with individually isolated 3-person crews in a distributed, interactive, planetary exploration simulation examined the effects of communication constraints and crew configuration changes on crew performance and psychosocial self-report measures. The present report extends these findings to a model of performance maintenance that operationalizes conditions under which disruptive affective responses by crew participants might be anticipated to emerge. Experiments evaluated the effects of changes in incentive conditions on crew performance and self-report measures in simulated space-dwelling groups. Crews participated in a simulated planetary exploration mission that required identification, collection, and analysis of geologic samples. Results showed that crew performance effectiveness was unaffected by either positive or negative incentive conditions, while self-report measures were differentially affected—negative incentive conditions produced pronounced increases in negative self-report ratings and decreases in positive self-report ratings, while positive incentive conditions produced increased positive self-report ratings only. Thus, incentive conditions associated with simulated spaceflight missions can significantly affect psychosocial adaptation without compromising task performance effectiveness in trained and experienced crews.

  20. Relating Standardized Visual Perception Measures to Simulator Visual System Performance

    Science.gov (United States)

    Kaiser, Mary K.; Sweet, Barbara T.

    2013-01-01

    Human vision is quantified through the use of standardized clinical vision measurements. These measurements typically include visual acuity (near and far), contrast sensitivity, color vision, stereopsis (a.k.a. stereo acuity), and visual field periphery. Simulator visual system performance is specified in terms such as brightness, contrast, color depth, color gamut, gamma, resolution, and field-of-view. How do these simulator performance characteristics relate to the perceptual experience of the pilot in the simulator? In this paper, visual acuity and contrast sensitivity will be related to simulator visual system resolution, contrast, and dynamic range; similarly, color vision will be related to color depth/color gamut. Finally, we will consider how some characteristics of human vision not typically included in current clinical assessments could be used to better inform simulator requirements (e.g., relating dynamic characteristics of human vision to update rate and other temporal display characteristics).

  1. Numerical simulation investigation on centrifugal compressor performance of turbocharger

    International Nuclear Information System (INIS)

    Li, Jie; Yin, Yuting; Li, Shuqi; Zhang, Jizhong

    2013-01-01

    In this paper, the mathematical model of the flow field in the centrifugal compressor of a turbocharger was studied. Based on the theory of computational fluid dynamics (CFD), performance curves and parameter distributions of the compressor were obtained from 3-D numerical simulation using CFX. Meanwhile, the influences of grid number and distribution on compressor performance were investigated, and the numerical calculation method was analyzed and validated against test data. The results show that increasing the grid number has little influence on compressor performance once the grid number of a single passage exceeds 300,000. The results also show that the numerically calculated mass flow rate at compressor choke agrees well with test results, and that the maximum difference in diffuser exit pressure between simulation and experiment decreases to 3.5% under the assumption of a 6 kPa additional total pressure loss at the compressor inlet. The numerical simulation method in this paper can be used to predict compressor performance: the difference in total pressure ratio between calculation and test is less than 7%, and the total-to-total efficiency also agrees well with the test data.

  2. Numerical simulation investigation on centrifugal compressor performance of turbocharger

    Energy Technology Data Exchange (ETDEWEB)

    Li, Jie [China Iron and Steel Research Institute Group, Beijing (China); Yin, Yuting [China North Engine Research Institute, Datong (China); Li, Shuqi; Zhang, Jizhong [Science and Technology Diesel Engine Turbocharging Laboratory, Datong (China)

    2013-06-15

    In this paper, the mathematical model of the flow field in the centrifugal compressor of a turbocharger was studied. Based on the theory of computational fluid dynamics (CFD), performance curves and parameter distributions of the compressor were obtained from 3-D numerical simulation using CFX. Meanwhile, the influences of grid number and distribution on compressor performance were investigated, and the numerical calculation method was analyzed and validated against test data. The results show that increasing the grid number has little influence on compressor performance once the grid number of a single passage exceeds 300,000. The results also show that the numerically calculated mass flow rate at compressor choke agrees well with test results, and that the maximum difference in diffuser exit pressure between simulation and experiment decreases to 3.5% under the assumption of a 6 kPa additional total pressure loss at the compressor inlet. The numerical simulation method in this paper can be used to predict compressor performance: the difference in total pressure ratio between calculation and test is less than 7%, and the total-to-total efficiency also agrees well with the test data.

  3. Simulation training improves medical students' learning experiences when performing real vaginal deliveries.

    Science.gov (United States)

    Dayal, Ashlesha K; Fisher, Nelli; Magrane, Diane; Goffman, Dena; Bernstein, Peter S; Katz, Nadine T

    2009-01-01

    To determine the relationship between simulation training for vaginal delivery maneuvers and subsequent participation in live deliveries during the clinical rotation and to assess medical students' performance and confidence in vaginal delivery maneuvers with and without simulation training. Medical students were randomized to receive or not to receive simulation training for vaginal delivery maneuvers on a mannequin simulator at the start of a 6-week clerkship. Both groups received traditional didactic and clinical teaching. One researcher, blinded to randomization, scored student competence of delivery maneuvers and overall delivery performance on simulator. Delivery performance was scored (1-5, with 5 being the highest) at weeks 1 and 5 of the clerkship. Students were surveyed to assess self-confidence in the ability to perform delivery maneuvers at weeks 1 and 5, and participation in live deliveries was evaluated using student obstetric patient logs. Thirty-three students were randomized, 18 to simulation training [simulation group (SIM)] and 15 to no simulation training [control group (CON)]. Clerkship logs demonstrated that SIM students participated in more deliveries than CON students (9.8 +/- 3.7 versus 6.2 +/- 2.8, P < 0.005). SIM reported increased confidence in ability to perform a vaginal delivery, when compared with CON at the end of the clerkship (3.81 +/- 0.83 versus 3.00 +/- 1.0, respectively, P < 0.05). The overall delivery performance score was significantly higher in SIM, when compared with CON at week 1 (3.94 +/- 0.94 versus 2.07 +/- 1.22, respectively, P < 0.001) and week 5 (4.88 +/- 0.33 versus 4.31 +/- 0.63, P < 0.001) in the simulated environment. Students who receive simulation training participate more actively in the clinical environment during the course of the clerkship. 
Student simulation training is beneficial to learn obstetric skills in a minimal risk environment, demonstrate competency with maneuvers, and translate this competence

  4. A Mesoscopic Simulation for the Early-Age Shrinkage Cracking Process of High Performance Concrete in Bridge Engineering

    Directory of Open Access Journals (Sweden)

    Guodong Li

    2017-01-01

    Full Text Available On a mesoscopic level, high performance concrete (HPC) was assumed to be a heterogeneous composite material consisting of aggregates, mortar, and pores. A concrete mesoscopic structure model was established based on CT image reconstruction. By combining this model with continuum mechanics, damage mechanics, and fracture mechanics, a relatively complete system for concrete mesoscopic mechanics analysis was established to simulate the process of early-age shrinkage cracking in HPC. This process was based on the dispersion crack model. The results indicated that the interface between the aggregate and mortar was the initiation point of shrinkage cracking in HPC. The locations of early-age shrinkage cracks in HPC were associated with the spacing and size of the aggregate particles. However, the shrinkage deformation of the mortar was related to the extent of concrete cracking and was independent of the crack position. Whereas lower water-to-cement ratios can improve the early strength of concrete, they cannot control early-age shrinkage cracks in HPC.

  5. SSC High Energy Booster resonance corrector and dynamic tune scanning simulation

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, P.; Machida, S.

    1993-05-01

    A resonance correction system for the High Energy Booster (HEB) of the Superconducting Super Collider (SSCL) was investigated by means of dynamic multiparticle tracking. In the simulation the operating tune is scanned as a function of time so that the bunch goes through a resonance. The performance of the half integer and third integer resonance correction system is demonstrated.

  6. Dual Arm Work Package performance estimates and telerobot task network simulation

    International Nuclear Information System (INIS)

    Draper, J.V.

    1997-01-01

    This paper describes the methodology and results of a network simulation study of the Dual Arm Work Package (DAWP), to be employed for dismantling the Argonne National Laboratory CP-5 reactor. The development of the simulation model was based upon the results of a task analysis for the same system. This study was performed by the Oak Ridge National Laboratory (ORNL), in the Robotics and Process Systems Division. Funding was provided by the US Department of Energy's Office of Technology Development, Robotics Technology Development Program (RTDP). The RTDP is developing methods of computer simulation to estimate telerobotic system performance. Data were collected to provide point estimates to be used in a task network simulation model. Three skilled operators performed six repetitions of a pipe cutting task representative of typical teleoperation cutting operations.

  7. High Sodium Simulant Testing To Support SB8 Sludge Preparation

    International Nuclear Information System (INIS)

    Newell, J. D.

    2012-01-01

    Scoping studies were completed for high sodium simulant SRAT/SME cycles to determine any impact on CPC processing. Two SRAT/SME cycles were performed with a simulant having a sodium supernate concentration of 1.9M at 130% and 100% of the Koopman Minimum Acid requirement. Both of these failed to meet DWPF processing objectives related to nitrite destruction and hydrogen generation. Another set of SRAT/SME cycles was performed with a simulant having a sodium supernate concentration of 1.6M at 130%, 125%, 110%, and 100% of the Koopman Minimum Acid requirement. Only the run at 110% met DWPF processing objectives. Neither simulant had a stoichiometric factor window of 30% between nitrite destruction and excessive hydrogen generation. Based on the 2M-110 results it was anticipated that the 2.5M stoichiometric window for processing would likely be smaller than 110-130%, since it appeared that it would be necessary to increase the KMA factor by at least 10% above the minimum calculated requirement to achieve nitrite destruction due to the high oxalate content. The 2.5M-130 run exceeded the DWPF hydrogen limits in both the SRAT and SME cycles. Therefore, testing of this wash endpoint was halted. This wash endpoint, with this minimum acid requirement and mercury-noble metal concentration profile, appears to be something DWPF should not process due to an overly narrow window of stoichiometry. The 2M case was potentially processable in DWPF, but modifications would likely be needed, such as occasionally accepting SRAT batches with undestroyed nitrite for further acid addition and reprocessing, running near the bottom of the as yet ill-defined window of allowable stoichiometric factors, potentially extending the SRAT cycle to burn off unreacted formic acid before transferring to the SME cycle, and eliminating formic acid additions in the frit slurry.

  8. The effects of laboratory inquiry-based experiments and computer simulations on high school students' performance and cognitive load in physics teaching

    Directory of Open Access Journals (Sweden)

    Radulović Branka

    2016-01-01

    Full Text Available The main goal of this study was to examine the extent to which different teaching instructions focused on the application of laboratory inquiry-based experiments (LIBEs) and interactive computer-based simulations (ICBSs) improved understanding of physics content in high school students, compared to a traditional teaching approach. Additionally, the study examined how the applied instructions influenced students' assessment of invested cognitive load. A convenience sample for this research included 187 high school students. A multiple-choice test of knowledge was used as a measuring instrument for the students' performance. Each task in the test was followed by a five-point Likert-type scale for the evaluation of invested cognitive load. In addition to descriptive statistics, a one-factor analysis of variance and Tukey's post-hoc test were computed to determine significant differences in performance and cognitive load and to calculate the instructional efficiency of the applied instructional designs. The findings indicate that teaching instructions based on the use of LIBEs and ICBSs contribute equally to an increase in students' performance and a reduction of cognitive load, unlike traditional teaching of physics. The results obtained by the students from the LIBE and ICBS groups for calculated instructional efficiency suggest that the applied teaching strategies represent effective teaching instructions. [Projekat Ministarstva nauke Republike Srbije, br. 179010: The Quality of Education System in Serbia from European Perspective]
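One widely used formulation of instructional efficiency combines standardized performance and mental-effort scores as E = (z_performance - z_effort) / sqrt(2), in the style of Paas and van Merriënboer. Whether this study used exactly this formula is an assumption, and the group means below are invented for illustration:

```python
import math
from statistics import mean, stdev

def z_scores(values):
    """Standardize values to zero mean and unit (sample) variance."""
    m, s = mean(values), stdev(values)
    return [(v - m) / s for v in values]

def instructional_efficiency(performance, effort):
    """E = (z_performance - z_effort) / sqrt(2) per condition.
    Positive E: performance is higher than the invested effort predicts."""
    zp, ze = z_scores(performance), z_scores(effort)
    return [(p - e) / math.sqrt(2) for p, e in zip(zp, ze)]

# Hypothetical condition means (test score, cognitive-load rating 1-5)
# for three groups: LIBE, ICBS, traditional teaching.
performance = [78.0, 76.0, 61.0]
effort = [2.4, 2.5, 3.9]
E = instructional_efficiency(performance, effort)
print([round(e, 2) for e in E])
```

With these invented numbers, the first two conditions come out with positive efficiency and the third negative, the qualitative pattern the abstract reports for the two active instructions versus traditional teaching.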

  9. Exploring the use of high-fidelity simulation training to enhance clinical skills.

    Science.gov (United States)

    Ann Kirkham, Lucy

    2018-02-07

    The use of interprofessional simulation training to enhance nursing students' performance of technical and non-technical clinical skills is becoming increasingly common. Simulation training can involve the use of role play, virtual reality or patient simulator manikins to replicate clinical scenarios and assess the nursing student's ability to, for example, undertake clinical observations or work as part of a team. Simulation training enables nursing students to practise clinical skills in a safe environment. Effective simulation training requires extensive preparation, and debriefing is necessary following a simulated training session to review any positive or negative aspects of the learning experience. This article discusses a high-fidelity simulated training session that was used to assess a group of third-year nursing students and foundation level 1 medical students. This involved the use of a patient simulator manikin in a scenario that required the collaborative management of a deteriorating patient. ©2018 RCN Publishing Company Ltd. All rights reserved. Not to be copied, transmitted or recorded in any way, in whole or part, without prior permission of the publishers.

  10. High performance gamma measurements of equipment retrieved from Hanford high-level nuclear waste tanks

    International Nuclear Information System (INIS)

    Troyer, G.L.

    1997-01-01

    The cleanup of high level defense nuclear waste at the Hanford site presents several progressive challenges. Among these is the removal and disposal of various components from buried active waste tanks to allow new equipment insertion or hazards mitigation. A unique automated retrieval system at the tank provides for retrieval, high pressure washing, inventory measurement, and containment for disposal. Key to the inventory measurement is a three-detector HPGe high performance gamma spectroscopy system capable of recovering data at up to 90% saturation (200,000 counts per second). Data recovery is based on a unique embedded electronic pulser and specialized software to report the inventory. Each of the detectors has different shielding, specified through Monte Carlo simulation with the MCNP program. This shielding provides performance over a dynamic range of eight orders of magnitude. System description, calibration issues and operational experiences are discussed.

  11. High performance gamma measurements of equipment retrieved from Hanford high-level nuclear waste tanks

    Energy Technology Data Exchange (ETDEWEB)

    Troyer, G.L.

    1997-03-17

    The cleanup of high level defense nuclear waste at the Hanford site presents several progressive challenges. Among these is the removal and disposal of various components from buried active waste tanks to allow new equipment insertion or hazards mitigation. A unique automated retrieval system at the tank provides for retrieval, high pressure washing, inventory measurement, and containment for disposal. Key to the inventory measurement is a three-detector HPGe high performance gamma spectroscopy system capable of recovering data at up to 90% saturation (200,000 counts per second). Data recovery is based on a unique embedded electronic pulser and specialized software to report the inventory. Each of the detectors has different shielding, specified through Monte Carlo simulation with the MCNP program. This shielding provides performance over a dynamic range of eight orders of magnitude. System description, calibration issues and operational experiences are discussed.

  12. A high-fidelity, six-degree-of-freedom batch simulation environment for tactical guidance research and evaluation

    Science.gov (United States)

    Goodrich, Kenneth H.

    1993-01-01

    A batch air combat simulation environment, the tactical maneuvering simulator (TMS), is presented. The TMS is a tool for developing and evaluating tactical maneuvering logics, but it can also be used to evaluate the tactical implications of perturbations to aircraft performance or supporting systems. The TMS can simulate air combat between any number of engagement participants, with practical limits imposed by computer memory and processing power. Aircraft are modeled using equations of motion, control laws, aerodynamics, and propulsive characteristics equivalent to those used in high-fidelity piloted simulations. Data bases representative of a modern high-performance aircraft with and without thrust-vectoring capability are included. To simplify the task of developing and implementing maneuvering logics in the TMS, an outer-loop control system, the tactical autopilot (TA), is implemented in the aircraft simulation model. The TA converts desired angle-of-attack and wind-axis bank-angle commands generated by computerized maneuvering logics into inputs for the inner-loop control augmentation system of the aircraft. The capabilities and operation of the TMS and the TA are described.

  13. The design and simulated performance of a fast Level 1 track trigger for the ATLAS High Luminosity Upgrade

    CERN Document Server

    Martensson, Mikael; The ATLAS collaboration

    2017-01-01

    The ATLAS experiment at the High Luminosity LHC will face a fivefold increase in the number of interactions per bunch crossing relative to the ongoing Run 2. This will require a proportional improvement in rejection power at the earliest levels of the detector trigger system, while preserving good signal efficiency. One critical aspect of this improvement will be the implementation of precise track reconstruction, through which sharper trigger turn-on curves can be achieved, and b-tagging and tau-tagging techniques can in principle be implemented. The challenge of such a project comes in the development of a fast, custom electronic device integrated in the hardware based first trigger level of the experiment. This article will discuss the requirements, architecture and projected performance of the system in terms of tracking, timing and physics, based on detailed simulations. Studies are carried out using data from the strip subsystem only or both strip and pixel subsystems.

  14. The design and simulated performance of a fast Level 1 track trigger for the ATLAS High Luminosity Upgrade

    CERN Document Server

    Martensson, Mikael; The ATLAS collaboration

    2017-01-01

    The ATLAS experiment at the high-luminosity LHC will face a five-fold increase in the number of interactions per collision relative to the ongoing Run 2. This will require a proportional improvement in rejection power at the earliest levels of the detector trigger system, while preserving good signal efficiency. One critical aspect of this improvement will be the implementation of precise track reconstruction, through which sharper trigger turn-on curves can be achieved, and b-tagging and tau-tagging techniques can in principle be implemented. The challenge of such a project comes in the development of a fast, custom electronic device integrated in the hardware-based first trigger level of the experiment, with repercussions propagating as far as the detector read-out philosophy. This talk will discuss the requirements, architecture and projected performance of the system in terms of tracking, timing and physics, based on detailed simulations. Studies are carried out comparing two detector geometries and using...

  15. High Performance Systolic Array Core Architecture Design for DNA Sequencer

    Directory of Open Access Journals (Sweden)

    Saiful Nurdin Dayana

    2018-01-01

    Full Text Available This paper presents a high performance systolic array (SA) core architecture design for a Deoxyribonucleic Acid (DNA) sequencer. The core implements the affine-gap-penalty-score Smith-Waterman (SW) algorithm. This time-consuming local alignment algorithm guarantees optimal alignment between DNA sequences, but it requires quadratic computation time when performed on standard desktop computers. The use of a linear SA decreases the time complexity from quadratic to linear. In addition, with the exponential growth of DNA databases, the SA architecture is used to overcome the timing issue. In this work, the SW algorithm has been captured using Verilog Hardware Description Language (HDL) and simulated using the Xilinx ISIM simulator. The proposed design has been implemented on a Xilinx Virtex-6 Field Programmable Gate Array (FPGA) and achieves a 90% reduction in core area.
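The affine-gap Smith-Waterman recurrences that such a core implements can be stated in plain software form using Gotoh's three-matrix formulation: H holds match/mismatch scores, and E and F track gaps being extended in each sequence. The scoring parameters below are illustrative defaults, not those of the hardware design:

```python
def smith_waterman_affine(a, b, match=2, mismatch=-1, gap_open=-2, gap_extend=-1):
    """Local alignment score with affine gap penalties (Gotoh recurrences).
    Opening a gap costs gap_open; each further extension costs gap_extend."""
    NEG = float("-inf")
    m, n = len(a), len(b)
    H = [[0.0] * (n + 1) for _ in range(m + 1)]   # best local score ending at (i, j)
    E = [[NEG] * (n + 1) for _ in range(m + 1)]   # gap open/extended in a
    F = [[NEG] * (n + 1) for _ in range(m + 1)]   # gap open/extended in b
    best = 0.0
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            E[i][j] = max(E[i][j - 1] + gap_extend, H[i][j - 1] + gap_open)
            F[i][j] = max(F[i - 1][j] + gap_extend, H[i - 1][j] + gap_open)
            s = match if a[i - 1] == b[j - 1] else mismatch
            H[i][j] = max(0.0, H[i - 1][j - 1] + s, E[i][j], F[i][j])
            best = max(best, H[i][j])
    return best

print(smith_waterman_affine("GGTTGACTA", "TGTTACGG"))
```

In the systolic-array realization, each processing element computes one anti-diagonal cell of these matrices per cycle, which is what reduces the run time from quadratic to linear in the sequence length.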

  16. Fast and accurate methods for the performance testing of highly-efficient c-Si photovoltaic modules using a 10 ms single-pulse solar simulator and customized voltage profiles

    International Nuclear Information System (INIS)

    Virtuani, A; Rigamonti, G; Friesen, G; Chianese, D; Beljean, P

    2012-01-01

Performance testing of highly efficient, highly capacitive c-Si modules with pulsed solar simulators requires particular care. These devices in fact usually require a steady-state solar simulator or pulse durations longer than 100–200 ms in order to avoid measurement artifacts. The aim of this work was to validate an alternative method for the testing of highly capacitive c-Si modules using a 10 ms single-pulse solar simulator. Our approach attempts to reconstruct a quasi-steady-state I–V (current–voltage) curve of a highly capacitive device during one single 10 ms flash by applying customized voltage profiles, in place of a conventional V ramp, to the terminals of the device under test. The most promising results were obtained by using V profiles which we name 'dragon-back' (DB) profiles. When compared to the reference I–V measurement (obtained by using a multi-flash approach with approximately 20 flashes), the DB V profile method provides excellent results, with differences in the estimation of Pmax (as well as of Isc, Voc and FF) below ±0.5%. For the testing of highly capacitive devices the method is accurate, fast (two flashes, possibly one, required), cost-effective and has proven its validity with several technologies, making it particularly interesting for in-line testing. (paper)
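The quoted ±0.5% comparison rests on extracting the standard parameters (Isc, Voc, Pmax, FF) from each reconstructed I–V curve. A hedged sketch of that extraction is given below; it assumes a curve sampled with ascending voltage starting at V = 0 and crossing I = 0, and the function name and interpolation choice are illustrative, not taken from the paper.

```python
def iv_parameters(v, i):
    """Extract Isc, Voc, Pmax and fill factor from sampled I-V points.

    Assumes v is ascending starting at 0 V and that the current crosses
    zero somewhere in the sampled range.
    """
    isc = i[0]  # current at V = 0 (short-circuit current)
    # Voc: linear interpolation at the first sign change of the current.
    voc = None
    for k in range(1, len(v)):
        if i[k] <= 0 < i[k - 1]:
            t = i[k - 1] / (i[k - 1] - i[k])
            voc = v[k - 1] + t * (v[k] - v[k - 1])
            break
    # Maximum power point over the sampled curve.
    pmax = max(vk * ik for vk, ik in zip(v, i))
    ff = pmax / (isc * voc)
    return isc, voc, pmax, ff
```

FF = Pmax/(Isc·Voc) is the fill factor referred to in the abstract; in practice Pmax would be refined by fitting around the sampled maximum rather than taking the raw sample.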

  17. Hybrid Building Performance Simulation Models for Industrial Energy Efficiency Applications

    Directory of Open Access Journals (Sweden)

    Peter Smolek

    2018-06-01

    Full Text Available In the challenge of achieving environmental sustainability, industrial production plants, as large contributors to the overall energy demand of a country, are prime candidates for applying energy efficiency measures. A modelling approach using cubes is used to decompose a production facility into manageable modules. All aspects of the facility are considered, classified into the building, energy system, production and logistics. This approach leads to specific challenges for building performance simulations since all parts of the facility are highly interconnected. To meet this challenge, models for the building, thermal zones, energy converters and energy grids are presented and the interfaces to the production and logistics equipment are illustrated. The advantages and limitations of the chosen approach are discussed. In an example implementation, the feasibility of the approach and models is shown. Different scenarios are simulated to highlight the models and the results are compared.
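As a concrete illustration of the kind of thermal-zone module such a building performance simulation composes, a first-order lumped-capacitance (RC) zone model is sketched below. The parameter values and interface are assumptions for illustration, not the paper's cube models.

```python
def simulate_zone(t_out, t0=20.0, R=0.005, C=2e6, q_hvac=0.0, dt=60.0):
    """First-order lumped RC thermal zone:  C * dT/dt = (T_out - T)/R + Q_hvac.

    t_out:  outdoor temperature per time step (deg C), one value per dt seconds
    R:      envelope thermal resistance (K/W), C: zone heat capacity (J/K)
    q_hvac: constant heating/cooling power (W); values here are illustrative.
    Returns the zone temperature trajectory (explicit Euler integration).
    """
    t = t0
    traj = []
    for to in t_out:
        t += dt / C * ((to - t) / R + q_hvac)
        traj.append(t)
    return traj
```

With a constant heating input the zone settles at T_out + R·Q_hvac; exchanging exactly this kind of state and load information is what the interfaces between the zone, energy-converter and production modules have to support.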

  18. High Performance Networks From Supercomputing to Cloud Computing

    CERN Document Server

    Abts, Dennis

    2011-01-01

Datacenter networks provide the communication substrate for large parallel computer systems that form the ecosystem for high performance computing (HPC) systems and modern Internet applications. The design of new datacenter networks is motivated by an array of applications ranging from communication intensive climatology, complex material simulations and molecular dynamics to such Internet applications as Web search, language translation, collaborative Internet applications, streaming video and voice-over-IP. For both Supercomputing and Cloud Computing the network enables distributed applications.

  19. Wavy channel transistor for area efficient high performance operation

    KAUST Repository

    Fahad, Hossain M.

    2013-04-05

We report a wavy channel FinFET-like transistor where the channel is wavy to increase its width without any area penalty, thereby increasing its drive current. Through simulation and experiments, we show that such a device architecture is capable of high performance operation compared to conventional FinFETs, with higher area efficiency, lower chip latency and lower power consumption.

  20. High-performance, scalable optical network-on-chip architectures

    Science.gov (United States)

    Tan, Xianfang

The rapid advance of technology enables a large number of processing cores to be integrated into a single chip, which is called a Chip Multiprocessor (CMP) or a Multiprocessor System-on-Chip (MPSoC) design. The on-chip interconnection network, which is the communication infrastructure for these processing cores, plays a central role in a many-core system. With the continuously increasing complexity of many-core systems, traditional metallic wired electronic networks-on-chip (NoC) have become a bottleneck because of unbearable latency in data transmission and extremely high energy consumption on chip. Optical networks-on-chip (ONoC) have been proposed as a promising alternative paradigm to electronic NoC, with the benefits of optical signaling such as extremely high bandwidth, negligible latency, and low power consumption. This dissertation focuses on the design of high-performance and scalable ONoC architectures and the contributions are highlighted as follows: 1. A micro-ring resonator (MRR)-based Generic Wavelength-routed Optical Router (GWOR) is proposed. A method for developing a GWOR of any size is introduced. GWOR is a scalable non-blocking ONoC architecture with a simple structure, low cost and high power efficiency compared to existing ONoC designs. 2. To expand the bandwidth and improve the fault tolerance of the GWOR, a redundant GWOR architecture is designed by cascading different types of GWORs into one network. 3. A redundant GWOR built with MRR-based comb switches is proposed. Comb switches can expand the bandwidth while keeping the topology of the GWOR unchanged by replacing the general MRRs with comb switches. 4. A butterfly fat tree (BFT)-based hybrid optoelectronic NoC (HONoC) architecture is developed in which GWORs are used for global communication and electronic routers are used for local communication. The proposed HONoC uses fewer electronic routers and links than its electronic BFT-based NoC counterpart. It takes advantage of

  1. Analysis for Parallel Execution without Performing Hardware/Software Co-simulation

    OpenAIRE

    Muhammad Rashid

    2014-01-01

    Hardware/software co-simulation improves the performance of embedded applications by executing the applications on a virtual platform before the actual hardware is available in silicon. However, the virtual platform of the target architecture is often not available during early stages of the embedded design flow. Consequently, analysis for parallel execution without performing hardware/software co-simulation is required. This article presents an analysis methodology for parallel execution of ...

  2. Behavioral Simulation and Performance Evaluation of Multi-Processor Architectures

    Directory of Open Access Journals (Sweden)

    Ausif Mahmood

    1996-01-01

    Full Text Available The development of multi-processor architectures requires extensive behavioral simulations to verify the correctness of design and to evaluate its performance. A high level language can provide maximum flexibility in this respect if the constructs for handling concurrent processes and a time mapping mechanism are added. This paper describes a novel technique for emulating hardware processes involved in a parallel architecture such that an object-oriented description of the design is maintained. The communication and synchronization between hardware processes is handled by splitting the processes into their equivalent subprograms at the entry points. The proper scheduling of these subprograms is coordinated by a timing wheel which provides a time mapping mechanism. Finally, a high level language pre-processor is proposed so that the timing wheel and the process emulation details can be made transparent to the user.
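The timing wheel described above can be sketched as a circular array of slots, each holding the subprograms due to resume at that simulated tick. The minimal Python version below is illustrative; the class name and API are assumptions, not the paper's implementation.

```python
class TimingWheel:
    """Minimal timing wheel: callbacks are scheduled a bounded number of
    ticks ahead and fired as simulated time advances slot by slot."""

    def __init__(self, slots):
        self.slots = [[] for _ in range(slots)]
        self.now = 0

    def schedule(self, delay, callback):
        # Delays must fit within one revolution of the wheel.
        assert 0 < delay < len(self.slots), "delay must fit in the wheel"
        self.slots[(self.now + delay) % len(self.slots)].append(callback)

    def tick(self):
        """Advance simulated time by one unit and fire all due callbacks."""
        self.now += 1
        slot = self.now % len(self.slots)
        pending, self.slots[slot] = self.slots[slot], []
        for cb in pending:
            cb()
```

Scheduling and firing are O(1) per event as long as delays fit within one revolution, which is why timing wheels are a common time-mapping mechanism in discrete-event and hardware-process simulators.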

  3. Power efficient and high performance VLSI architecture for AES algorithm

    Directory of Open Access Journals (Sweden)

    K. Kalaiselvi

    2015-09-01

    Full Text Available Advanced encryption standard (AES algorithm has been widely deployed in cryptographic applications. This work proposes a low power and high throughput implementation of AES algorithm using key expansion approach. We minimize the power consumption and critical path delay using the proposed high performance architecture. It supports both encryption and decryption using 256-bit keys with a throughput of 0.06 Gbps. The VHDL language is utilized for simulating the design and an FPGA chip has been used for the hardware implementations. Experimental results reveal that the proposed AES architectures offer superior performance than the existing VLSI architectures in terms of power, throughput and critical path delay.

  4. Measuring cognitive load: performance, mental effort and simulation task complexity.

    Science.gov (United States)

    Haji, Faizal A; Rojas, David; Childs, Ruth; de Ribaupierre, Sandrine; Dubrowski, Adam

    2015-08-01

Interest in applying cognitive load theory in health care simulation is growing. This line of inquiry requires measures that are sensitive to changes in cognitive load arising from different instructional designs. Recently, mental effort ratings and secondary task performance have shown promise as measures of cognitive load in health care simulation. We investigate the sensitivity of these measures to predicted differences in intrinsic load arising from variations in task complexity and learner expertise during simulation-based surgical skills training. We randomly assigned 28 novice medical students to simulation training on a simple or complex surgical knot-tying task. Participants completed 13 practice trials, interspersed with computer-based video instruction. On trials 1, 5, 9 and 13, knot-tying performance was assessed using time and movement efficiency measures, and cognitive load was assessed using subjective rating of mental effort (SRME) and simple reaction time (SRT) on a vibrotactile stimulus-monitoring secondary task. Significant improvements in knot-tying performance (F(1.04,24.95) = 41.1) and significant reductions in cognitive load (F(2.3,58.5) = 57.7) were observed across the practice trials. These findings support mental effort ratings and secondary task performance as measures of cognitive load among novices engaged in simulation-based learning. These measures can be used to track cognitive load during skills training. Mental effort ratings are also sensitive to small differences in intrinsic load arising from variations in the physical complexity of a simulation task. The complementary nature of these subjective and objective measures suggests their combined use is advantageous in simulation instructional design research. © 2015 John Wiley & Sons Ltd.

  5. Effect of video-game experience and position of flight stick controller on simulated-flight performance.

    Science.gov (United States)

    Cho, Bo-Keun; Aghazadeh, Fereydoun; Al-Qaisi, Saif

    2012-01-01

The purpose of this study was to determine the effects of video-game experience and flight-stick position on flying performance. The study divided participants into two groups, center- and side-stick groups, which were further divided into high and low video-game experience subgroups. The experiment consisted of 7 sessions of simulated flying, and in the last session the flight stick controller was switched to the other position. Flight performance was measured in terms of the deviation of heading, altitude, and airspeed from their respective requirements. Participants with high experience in video games performed significantly better than those with low experience. After switching from a side- to a center-stick controller, performance scores showed a slight increase (0.78%); however, after switching from a center- to a side-stick controller, performance scores decreased (4.8%).

  6. The Relationship between Tests of Neurocognition and Performance on a Laparoscopic Simulator

    Directory of Open Access Journals (Sweden)

    Oumar Kuzbari

    2010-01-01

    Full Text Available Objective. To estimate if there is a relationship between the results of tests of neurocognition and performance on a laparoscopic surgery simulator. Methods and Materials. Twenty participants with no prior laparoscopic experience had baseline cognitive tests administered (Trail Making Test, Part A and B (TMT-A and TMT-B, Grooved Peg Board Test, Symbol Digit Modalities Test, Symbol Digit Recall Test, and Stroop Interference Test, completed a demographic questionnaire, and then performed laparoscopy using a simulator. We correlated the results of cognitive tests with laparoscopic surgical performance. Results. One cognitive test sensitive to frontal lobe function, TMT-A, significantly correlated with laparoscopic surgical performance on the simulator (correlation coefficient of 0.534 with P<.05. However, the correlation between performance and other cognitive tests (TMT-B, Grooved Peg Board Test, Symbol Digit Modalities Test, Symbol Digit Recall Test, and Stroop Interference Test was not statistically significant. Conclusion. Laparoscopic performance may be related to measures of frontal lobe function. Neurocognitive tests may predict motor skills abilities and performance on laparoscopic simulator.

  7. The Impact of Computer Simulations as Interactive Demonstration Tools on the Performance of Grade 11 Learners in Electromagnetism

    Science.gov (United States)

    Kotoka, Jonas; Kriek, Jeanne

    2014-01-01

    The impact of computer simulations on the performance of 65 grade 11 learners in electromagnetism in a South African high school in the Mpumalanga province is investigated. Learners did not use the simulations individually, but teachers used them as an interactive demonstration tool. Basic concepts in electromagnetism are difficult to understand…

  8. Modeling and Simulation of the Multi-module High Temperature Gas-cooled Reactor

    International Nuclear Information System (INIS)

    Liu Dan; Sun Jun; Sui Zhe; Xu Xiaolin; Ma Yuanle; Sun Yuliang

    2014-01-01

The modular high temperature gas-cooled reactor (MHTGR) is characterized by inherent safety. To enhance its economic benefit, the capital cost of an MHTGR plant can be decreased by combining several reactor modules into one unit and realizing batch construction under the concept of modularization. In the research and design of multi-module plants, one difficulty is clarifying the coupling effects between modules during operation, which arise from the feed water and main steam systems shared by the secondary loops. Exploiting real-time simulation and coupled calculation of the different modules and sub-systems, the operation of multi-module plants can be studied and analyzed to understand the range and extent of these coupling effects. In the current paper, an engineering simulator for multi-module plants, based on the research experience of the HTR-PM engineering simulator, was realized and runs on high-performance computers. The models, including the primary and secondary loops, are introduced in detail. The steady state at full power operation was demonstrated to show the good performance of a six-module plant. Typical dynamic processes, such as adjusting feed water flow rates and shutting down one reactor, were also tested to study the coupling effects in multi-module plants. (author)

  9. An applied artificial intelligence approach towards assessing building performance simulation tools

    Energy Technology Data Exchange (ETDEWEB)

    Yezioro, Abraham [Faculty of Architecture and Town Planning, Technion IIT (Israel); Dong, Bing [Center for Building Performance and Diagnostics, School of Architecture, Carnegie Mellon University (United States); Leite, Fernanda [Department of Civil and Environmental Engineering, Carnegie Mellon University (United States)

    2008-07-01

With the development of modern computer technology, a large number of building energy simulation tools are available in the market. When choosing which simulation tool to use in a project, the user must consider the tool's accuracy and reliability, given the building information they have at hand, which will serve as input for the tool. This paper presents an approach for assessing building performance simulation results against actual measurements, using artificial neural networks (ANN) for predicting building energy performance. Training and testing of the ANN were carried out with energy consumption data acquired over 1 week in the case building, called the Solar House. The predictions show a good fit, with a mean absolute error of 0.9%. Moreover, four building simulation tools were selected in this study in order to compare their results with the ANN-predicted energy consumption: Energy-10, the Green Building Studio web tool, eQuest and EnergyPlus. The results showed that the more detailed simulation tools have the best simulation performance in terms of heating and cooling electricity consumption, with a mean absolute error within 3%. (author)
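The prediction/scoring workflow, fitting a model to a week of consumption data and then comparing predictions by mean absolute error, can be sketched independently of the network architecture (which the abstract does not detail). The stand-in below uses a simple least-squares fit by gradient descent in place of the ANN; all names and data are illustrative.

```python
def train_linear(xs, ys, lr=0.01, epochs=2000):
    """Least-squares fit y ~ w*x + b by batch gradient descent.

    A deliberately simple stand-in for the paper's ANN: the predict/score
    workflow is the same regardless of the model used.
    """
    w = b = 0.0
    n = len(xs)
    for _ in range(epochs):
        gw = gb = 0.0
        for x, y in zip(xs, ys):
            e = (w * x + b) - y          # prediction error on one sample
            gw += 2 * e * x / n          # gradient of MSE w.r.t. w
            gb += 2 * e / n              # gradient of MSE w.r.t. b
        w -= lr * gw
        b -= lr * gb
    return w, b

def mean_absolute_error(pred, actual):
    """The MAE score used to compare predictions against measurements."""
    return sum(abs(p - a) for p, a in zip(pred, actual)) / len(actual)
```

The same mean_absolute_error scoring applies unchanged whether the predictor is this linear stand-in, an ANN, or the output of one of the four simulation tools.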

  10. Virtual Learning Simulations in High School

    DEFF Research Database (Denmark)

    Thisgaard, Malene Warming; Makransky, Guido

    2017-01-01

The present study compared the value of a virtual learning simulation with traditional lessons on the topic of evolution, and investigated whether the virtual learning simulation could serve as a catalyst for STEM academic and career development, based on social cognitive career theory. The investigation was conducted using a crossover repeated measures design with a sample of 128 high school biology/biotech students. The results showed that the virtual learning simulation increased knowledge of evolution significantly, compared to the traditional lesson. No significant differences between the simulation and lesson were found in their ability to increase the non-cognitive measures. Both interventions increased self-efficacy significantly, and neither had a significant effect on motivation. In addition, the results showed that the simulation increased interest in biology related tasks...

  11. Noise Simulations of the High-Lift Common Research Model

    Science.gov (United States)

    Lockard, David P.; Choudhari, Meelan M.; Vatsa, Veer N.; O'Connell, Matthew D.; Duda, Benjamin; Fares, Ehab

    2017-01-01

The PowerFLOW™ code has been used to perform numerical simulations of the high-lift version of the Common Research Model (HL-CRM) that will be used for experimental testing of airframe noise. Time-averaged surface pressure results from PowerFLOW™ are found to be in reasonable agreement with those from steady-state computations using FUN3D. Surface pressure fluctuations are highest around the slat break and nacelle/pylon region, and synthetic array beamforming results also indicate that this region is the dominant noise source on the model. The gap between the slat and pylon on the HL-CRM is not realistic for modern aircraft, and most nacelles include a chine that is absent in the baseline model. To account for those effects, additional simulations were completed with a chine and with the slat extended into the pylon. The case with the chine was nearly identical to the baseline, and the slat extension resulted in higher surface pressure fluctuations but slightly reduced radiated noise. The full-span slat geometry without the nacelle/pylon was also simulated and found to be around 10 dB quieter than the baseline over almost the entire frequency range. The current simulations are still considered preliminary, as changes in the radiated acoustics are still being observed with grid refinement, and additional simulations with finer grids are planned.

  12. An approach to high speed ship ride quality simulation

    Science.gov (United States)

    Malone, W. L.; Vickery, J. M.

    1975-01-01

    The high speeds attained by certain advanced surface ships result in a spectrum of motion which is higher in frequency than that of conventional ships. This fact along with the inclusion of advanced ride control features in the design of these ships resulted in an increased awareness of the need for ride criteria. Such criteria can be developed using data from actual ship operations in varied sea states or from clinical laboratory experiments. A third approach is to simulate ship conditions using measured or calculated ship motion data. Recent simulations have used data derived from a math model of Surface Effect Ship (SES) motion. The model in turn is based on equations of motion which have been refined with data from scale models and SES of up to 101 600-kg (100-ton) displacement. Employment of broad band motion emphasizes the use of the simulators as a design tool to evaluate a given ship configuration in several operational situations and also serves to provide data as to the overall effect of a given motion on crew performance and physiological status.

  13. Load management strategy for Particle-In-Cell simulations in high energy particle acceleration

    Energy Technology Data Exchange (ETDEWEB)

    Beck, A., E-mail: beck@llr.in2p3.fr [Laboratoire Leprince-Ringuet, École polytechnique, CNRS-IN2P3, Palaiseau 91128 (France); Frederiksen, J.T. [Niels Bohr Institute, University of Copenhagen, Blegdamsvej 17, 2100 København Ø (Denmark); Dérouillat, J. [CEA, Maison de La Simulation, 91400 Saclay (France)

    2016-09-01

    In the wake of the intense effort made for the experimental CILEX project, numerical simulation campaigns have been carried out in order to finalize the design of the facility and to identify optimal laser and plasma parameters. These simulations bring, of course, important insight into the fundamental physics at play. As a by-product, they also characterize the quality of our theoretical and numerical models. In this paper, we compare the results given by different codes and point out algorithmic limitations both in terms of physical accuracy and computational performances. These limitations are illustrated in the context of electron laser wakefield acceleration (LWFA). The main limitation we identify in state-of-the-art Particle-In-Cell (PIC) codes is computational load imbalance. We propose an innovative algorithm to deal with this specific issue as well as milestones towards a modern, accurate high-performance PIC code for high energy particle acceleration.
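A common building block of the load-management strategies discussed here is redistributing patches of cells across ranks according to their measured particle cost. The sketch below shows a greedy longest-processing-time (LPT) assignment as an illustrative baseline; it is not the paper's proposed algorithm, and all names are assumptions.

```python
import heapq

def balance(costs, nranks):
    """Greedy LPT scheduling: assign each patch (by decreasing cost) to the
    currently lightest rank.

    costs:  per-patch computational cost (e.g. particle count)
    Returns (per-rank load, assignment patch_index -> rank).
    """
    heap = [(0.0, r) for r in range(nranks)]   # (current load, rank)
    heapq.heapify(heap)
    loads = [0.0] * nranks
    assign = [None] * len(costs)
    # Sorting by decreasing cost first gives the classic LPT bound.
    for idx in sorted(range(len(costs)), key=lambda k: -costs[k]):
        load, r = heapq.heappop(heap)
        assign[idx] = r
        loads[r] = load + costs[idx]
        heapq.heappush(heap, (loads[r], r))
    return loads, assign
```

The ratio max(loads)/mean(loads) is the load-imbalance factor that a PIC load-management algorithm tries to keep close to 1 as particles bunch up in the wakefield.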

  14. Bimanual Psychomotor Performance in Neurosurgical Resident Applicants Assessed Using NeuroTouch, a Virtual Reality Simulator.

    Science.gov (United States)

    Winkler-Schwartz, Alexander; Bajunaid, Khalid; Mullah, Muhammad A S; Marwa, Ibrahim; Alotaibi, Fahad E; Fares, Jawad; Baggiani, Marta; Azarnoush, Hamed; Zharni, Gmaan Al; Christie, Sommer; Sabbagh, Abdulrahman J; Werthner, Penny; Del Maestro, Rolando F

    Current selection methods for neurosurgical residents fail to include objective measurements of bimanual psychomotor performance. Advancements in computer-based simulation provide opportunities to assess cognitive and psychomotor skills in surgically naive populations during complex simulated neurosurgical tasks in risk-free environments. This pilot study was designed to answer 3 questions: (1) What are the differences in bimanual psychomotor performance among neurosurgical residency applicants using NeuroTouch? (2) Are there exceptionally skilled medical students in the applicant cohort? and (3) Is there an influence of previous surgical exposure on surgical performance? Participants were instructed to remove 3 simulated brain tumors with identical visual appearance, stiffness, and random bleeding points. Validated tier 1, tier 2, and advanced tier 2 metrics were used to assess bimanual psychomotor performance. Demographic data included weeks of neurosurgical elective and prior operative exposure. This pilot study was carried out at the McGill Neurosurgical Simulation Research and Training Center immediately following neurosurgical residency interviews at McGill University, Montreal, Canada. All 17 medical students interviewed were asked to participate, of which 16 agreed. Performances were clustered in definable top, middle, and bottom groups with significant differences for all metrics. Increased time spent playing music, increased applicant self-evaluated technical skills, high self-ratings of confidence, and increased skin closures statistically influenced performance on univariate analysis. A trend for both self-rated increased operating room confidence and increased weeks of neurosurgical exposure to increased blood loss was seen in multivariate analysis. Simulation technology identifies neurosurgical residency applicants with differing levels of technical ability. 
These results provide information for longitudinal studies being developed on the

  15. Three-dimensional simulations of low foot and high foot implosion experiments on the National Ignition Facility

    International Nuclear Information System (INIS)

    Clark, D. S.; Weber, C. R.; Milovich, J. L.; Salmonson, J. D.; Kritcher, A. L.; Haan, S. W.; Hammel, B. A.; Hinkel, D. E.; Hurricane, O. A.; Jones, O. S.; Marinak, M. M.; Patel, P. K.; Robey, H. F.; Sepke, S. M.; Edwards, M. J.

    2016-01-01

    In order to achieve the several hundred Gbar stagnation pressures necessary for inertial confinement fusion ignition, implosion experiments on the National Ignition Facility (NIF) [E. I. Moses et al., Phys. Plasmas 16, 041006 (2009)] require the compression of deuterium-tritium fuel layers by a convergence ratio as high as forty. Such high convergence implosions are subject to degradation by a range of perturbations, including the growth of small-scale defects due to hydrodynamic instabilities, as well as longer scale modulations due to radiation flux asymmetries in the enclosing hohlraum. Due to the broad range of scales involved, and also the genuinely three-dimensional (3D) character of the flow, accurately modeling NIF implosions remains at the edge of current simulation capabilities. This paper describes the current state of progress of 3D capsule-only simulations of NIF implosions aimed at accurately describing the performance of specific NIF experiments. Current simulations include the effects of hohlraum radiation asymmetries, capsule surface defects, the capsule support tent and fill tube, and use a grid resolution shown to be converged in companion two-dimensional simulations. The results of detailed simulations of low foot implosions from the National Ignition Campaign are contrasted against results for more recent high foot implosions. While the simulations suggest that low foot performance was dominated by ablation front instability growth, especially the defect seeded by the capsule support tent, high foot implosions appear to be dominated by hohlraum flux asymmetries, although the support tent still plays a significant role. For both implosion types, the simulations show reasonable, though not perfect, agreement with the data and suggest that a reliable predictive capability is developing to guide future implosions toward ignition.

  17. Using simulation to evaluate the performance of resilience strategies and process failures

    Energy Technology Data Exchange (ETDEWEB)

    Levy, Scott N.; Topp, Bryan Embry; Arnold, Dorian C; Ferreira, Kurt Brian; Widener, Patrick; Hoefler, Torsten

    2014-01-01

Fault-tolerance has been identified as a major challenge for future extreme-scale systems. Current predictions suggest that, as systems grow in size, failures will occur more frequently. Because increases in failure frequency reduce the performance and scalability of these systems, significant effort has been devoted to developing and refining resilience mechanisms to mitigate the impact of failures. However, effective evaluation of these mechanisms has been challenging. Current systems are smaller and have significantly different architectural features (e.g., interconnect, persistent storage) than we expect to see in next-generation systems. To overcome these challenges, we propose the use of simulation. Simulation has been shown to be an effective tool for investigating the performance characteristics of applications on future systems. In this work, we: identify the set of system characteristics that are necessary for accurate performance prediction of resilience mechanisms for HPC systems and applications; demonstrate how these system characteristics can be incorporated into an existing large-scale simulator; and evaluate the predictive performance of our modified simulator. We also describe how we were able to optimize the simulator for large temporal and spatial scales, allowing the simulator to run 4x faster and use over 100x less memory.
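For checkpoint/restart, the most widely studied resilience mechanism such simulators evaluate, a classical first-order model (Young's approximation, a textbook result rather than something from this paper) relates checkpoint cost and mean time between failures (MTBF) to the optimal checkpoint interval:

```python
import math

def young_interval(checkpoint_cost, mtbf):
    """Young's first-order approximation of the optimal interval between
    checkpoints: tau = sqrt(2 * C * MTBF), with C and MTBF in seconds."""
    return math.sqrt(2.0 * checkpoint_cost * mtbf)

def waste_fraction(tau, checkpoint_cost, mtbf):
    """First-order waste model: checkpoint overhead per interval plus the
    expected rework (half an interval, on average) per failure."""
    return checkpoint_cost / tau + tau / (2.0 * mtbf)
```

First-order models like this are exactly what large-scale simulation is meant to stress-test: the simulator can expose where the model's assumptions (independent failures, constant checkpoint cost) break down on realistic architectures.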

  18. On the performance of a high head Francis turbine at design and off-design conditions

    International Nuclear Information System (INIS)

    Aakti, B; Amstutz, O; Casartelli, E; Romanelli, G; Mangani, L

    2015-01-01

In the present paper, fully 360-degree transient and steady-state simulations of a Francis turbine were performed at three operating conditions, namely at part load (PL), best efficiency point (BEP), and high load (HL), using different numerical approaches for the pressure-velocity coupling. The simulation domain includes the spiral casing with stay and guide vanes, the runner and the draft tube. The main target of the investigations is the numerical prediction of the overall performance of the high head Francis turbine model, as well as of local and integral quantities of the complete machine in different operating conditions. All results were compared with experimental data published by the workshop organization. All CFD simulations were performed at model scale with a new in-house, 3D, unstructured, object-oriented finite volume code within the framework of the open source OpenFOAM library. The novel fully coupled pressure-based solver is designed to solve the incompressible RANS equations and is capable of handling multiple reference frames (MRF). The obtained results show that the overall performance is well captured by the simulations. Regarding the local flow distributions within the inlet section of the draft tube, the axial velocity is better estimated than the circumferential component.
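The overall performance referred to above is conventionally reported as the hydraulic efficiency, eta = P_shaft / (rho g Q H). A small helper illustrates the definition; the numbers in the usage below are illustrative, not the workshop's model-test data.

```python
def hydraulic_efficiency(p_shaft, q, head, rho=997.0, g=9.81):
    """Turbine efficiency: shaft power over available hydraulic power.

    p_shaft: mechanical shaft power (W)
    q:       volumetric flow rate (m^3/s)
    head:    net head (m); rho in kg/m^3, g in m/s^2.
    """
    p_hydraulic = rho * g * q * head
    return p_shaft / p_hydraulic
```

For example, a model turbine delivering 55 kW at 0.2 m^3/s under 30 m of head (with rho = 1000 kg/m^3) would be running at roughly 93% efficiency; comparing this number between CFD and experiment at PL, BEP and HL is the headline check in such validation studies.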

  19. Performance comparison of low and high temperature polymer electrolyte membrane fuel cells. Experimental examinations, modelling and numerical simulation; Leistungsvergleich von Nieder- und Hochtemperatur-Polymerelektrolytmembran-Brennstoffzellen. Experimentelle Untersuchungen, Modellierung und numerische Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Loehn, Helmut

    2010-11-03

    danger of washing out of the phosphoric acid. In an additional test series the Celtec-P-1000 HT-MEA was subjected to temperature change cycles (40 to 160 °C), which led to irreversible voltage losses. In a final test series, performance tests were carried out with a HT-PEM fuel cell stack (16 cells / 1 kW), developed in the fuel cell research centre of Volkswagen with a special gas diffusion electrode intended to avoid degradation at low temperatures. In these examinations no irreversible voltage losses could be detected, but the tests had to be aborted because of leakage problems. The insights gained from the experimental examinations into the superior operating behaviour and the further advantages of the HT-PEMFC in comparison to the LT-PEMFC were crucial for the construction of a simulation model for a single HT-PEM fuel cell in the theoretical part of this thesis, which should also be suitable as a process simulation model for the computer-based development of a virtual fuel cell within the interdisciplinary project "Virtual Fuel Cell" at the TU Darmstadt. The model is a numerical 2D "along the channel" model constructed with the finite element software COMSOL Multiphysics (version 3.5a). The stationary, single-phase model comprises ten dependent variables in seven application modules, forming a highly complex, coupled nonlinear system of equations with 33713 degrees of freedom (1675 rectangle elements with 1768 nodes). The simulation model describes the mass transport processes and the electro-chemical reactions in a HT-PEM fuel cell with good accuracy; the model was validated by comparing its results with experimental data. The 2D model is therefore basically suitable as a process simulation model for the design of a virtual HT-PEM fuel cell. (orig.)

  20. Interprofessional education in pharmacology using high-fidelity simulation.

    Science.gov (United States)

    Meyer, Brittney A; Seefeldt, Teresa M; Ngorsuraches, Surachat; Hendrickx, Lori D; Lubeck, Paula M; Farver, Debra K; Heins, Jodi R

    2017-11-01

    This study examined the feasibility of an interprofessional high-fidelity pharmacology simulation and its impact on pharmacy and nursing students' perceptions of interprofessionalism and pharmacology knowledge. Pharmacy and nursing students participated in a pharmacology simulation using a high-fidelity patient simulator. Faculty-facilitated debriefing included discussion of the case and collaboration. To determine the impact of the activity on students' perceptions of interprofessionalism and their ability to apply pharmacology knowledge, surveys were administered to students before and after the simulation. Attitudes Toward Health Care Teams scale (ATHCT) scores improved from 4.55 to 4.72 on a scale of 1-6 (p = 0.005). Almost all (over 90%) of the students stated their pharmacology knowledge and their ability to apply that knowledge improved following the simulation. A simulation in pharmacology is feasible and favorably affected students' interprofessionalism and pharmacology knowledge perceptions. Pharmacology is a core science course required by multiple health professions in early program curricula, making it favorable for incorporation of interprofessional learning experiences. However, reports of high-fidelity interprofessional simulation in pharmacology courses are limited. This manuscript contributes to the literature in the field of interprofessional education by demonstrating that an interprofessional simulation in pharmacology is feasible and can favorably affect students' perceptions of interprofessionalism. This manuscript provides an example of a pharmacology interprofessional simulation that faculty in other programs can use to build similar educational activities. Copyright © 2017 Elsevier Inc. All rights reserved.

  1. Integrated State Estimation and Contingency Analysis Software Implementation using High Performance Computing Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Yousu; Glaesemann, Kurt R.; Rice, Mark J.; Huang, Zhenyu

    2015-12-31

    Power system simulation tools have traditionally been developed in sequential mode, with codes optimized for single-core computing only. However, the increasing complexity of power grid models requires more intensive computation, and traditional simulation tools will soon be unable to meet grid operation requirements. Power system simulation tools therefore need to evolve accordingly to provide faster and better results for grid operations. This paper presents an integrated state estimation and contingency analysis software implementation using high performance computing techniques. The software is able to solve large-scale state estimation problems within one second and achieves a near-linear speedup of 9,800 with 10,000 cores for the contingency analysis application. A performance evaluation is presented to show the software's effectiveness.
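
    The reported scaling can be sanity-checked with the textbook definitions of speedup and parallel efficiency; the sketch below simply plugs in the paper's figures (a speedup of 9,800 on 10,000 cores) and is not part of the authors' code:

```python
def speedup(t_serial: float, t_parallel: float) -> float:
    """Classic speedup: serial runtime over parallel runtime."""
    return t_serial / t_parallel

def parallel_efficiency(s: float, cores: int) -> float:
    """Efficiency: achieved speedup as a fraction of ideal (one per core)."""
    return s / cores

reported_speedup = 9_800.0
eff = parallel_efficiency(reported_speedup, 10_000)  # 0.98, i.e. near-linear
```

An efficiency of 0.98 is what justifies the abstract's "near-linear" claim: each of the 10,000 cores contributes 98% of its ideal share.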

  2. Simulation design of P–I–N-type all-perovskite solar cells with high efficiency

    International Nuclear Information System (INIS)

    Du Hui-Jing; Wang Wei-Chao; Gu Yi-Fan

    2017-01-01

    According to the good charge transporting property of perovskite, we design and simulate a p–i–n-type all-perovskite solar cell using a one-dimensional device simulator. The perovskite charge transporting layers and the perovskite absorber constitute the all-perovskite cell. By modulating the cell parameters, such as layer thickness values, doping concentrations and energy bands of the n-, i-, and p-type perovskite layers, the all-perovskite solar cell obtains a high power conversion efficiency of 25.84%. The band-matched cell shows appreciably improved performance with a widened absorption spectrum and a lowered recombination rate, so we obtain a high J_sc of 32.47 mA/cm². The small series resistance of the all-perovskite solar cell also benefits the high J_sc. The simulation suggests a novel approach to designing perovskite solar cells with a simple fabrication process, low production cost and a highly efficient structure to help address the energy problem. (paper)
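
    Given the reported J_sc and efficiency, the implied Voc x FF product follows from the standard definition of power conversion efficiency. The sketch below assumes the usual AM1.5G input power of 100 mW/cm² (an assumption; the abstract does not state the illumination condition):

```python
def implied_voc_ff(pce_percent: float, jsc_mA_cm2: float,
                   pin_mW_cm2: float = 100.0) -> float:
    """Back out Voc * FF (in volts) from PCE = Jsc * Voc * FF / Pin."""
    return (pce_percent / 100.0) * pin_mW_cm2 / jsc_mA_cm2

# Paper's reported figures: PCE = 25.84%, Jsc = 32.47 mA/cm^2
voc_ff = implied_voc_ff(25.84, 32.47)   # ~0.796 V
```

The implied Voc x FF of roughly 0.8 V is plausible for a perovskite absorber, which is a quick consistency check on the reported numbers.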

  3. A database for human performance under simulated emergencies of nuclear power plants

    International Nuclear Information System (INIS)

    Park, Jin Kyun; Jung, Won Dea

    2005-01-01

    Reliable human performance is a prerequisite for securing the safety of complicated process systems such as nuclear power plants. However, the amount of available knowledge that can explain why operators deviate from an expected performance level is small because of the infrequency of real accidents. Therefore, in this study, a database containing useful information extracted from simulated emergencies was developed in order to provide important clues for understanding the change in operators' performance under stressful conditions (i.e., real accidents). The database was developed under the Microsoft Windows environment using Microsoft Access 97 and Microsoft Visual Basic 6.0. In the database, operators' performance data obtained from the analysis of over 100 audio-visual records of simulated emergencies were stored using twenty distinctive data fields. A total of ten kinds of operators' performance data are available from the developed database. Although it is still difficult to predict operators' performance under stressful conditions based on the results of simulated emergencies, simulation studies remain the most feasible way to scrutinize performance. Accordingly, it is expected that the performance data of this study will provide a concrete foundation for understanding the change in operators' performance in emergency situations.

  4. Geant4 simulation of a 3D high resolution gamma camera

    International Nuclear Information System (INIS)

    Akhdar, H.; Kezzar, K.; Aksouh, F.; Assemi, N.; AlGhamdi, S.; AlGarawi, M.; Gerl, J.

    2015-01-01

    The aim of this work is to develop a 3D gamma camera with high position resolution and sensitivity, relying on both distance/absorption and Compton scattering techniques and without using any passive collimation. The proposed gamma camera is simulated in order to predict its performance, taking full benefit of Geant4 features that allow construction of the needed detector geometry, full control of the incident gamma particles, and study of the detector response in order to test the suggested geometries. Three different geometries are simulated and each configuration is tested with three different scintillation materials (LaBr3, LYSO and CeBr3).
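
    Compton-based cameras reconstruct event cones from the standard Compton kinematics relating incident energy, scattered energy, and scattering angle. A minimal sketch of that relation (textbook physics, not the authors' Geant4 code):

```python
import math

M_E_C2_KEV = 511.0  # electron rest energy in keV

def scattered_energy(e_kev: float, theta_rad: float) -> float:
    """Compton formula: energy of the photon after scattering by theta."""
    return e_kev / (1.0 + (e_kev / M_E_C2_KEV) * (1.0 - math.cos(theta_rad)))

def scatter_angle(e_kev: float, e_scattered_kev: float) -> float:
    """Invert the Compton formula: recover the scattering angle from the
    incident and scattered photon energies measured in the detector."""
    cos_theta = 1.0 - M_E_C2_KEV * (1.0 / e_scattered_kev - 1.0 / e_kev)
    return math.acos(cos_theta)

e1 = scattered_energy(662.0, math.pi / 2)   # Cs-137 photon scattered at 90 deg
theta = scatter_angle(662.0, e1)            # recovers pi/2
```

In a Compton camera, `scatter_angle` applied to the two measured energy deposits defines the opening angle of the cone on which the source must lie.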

  5. Performance of a 2-megawatt high voltage test load

    International Nuclear Information System (INIS)

    Horan, D.; Kustom, R.; Ferguson, M.

    1995-01-01

    A high-power, water-cooled resistive load which simulates the electrical load characteristics of a high-power klystron, capable of 2 megawatts dissipation at 95 kV DC, was built and installed at the Advanced Photon Source for use in load-testing high voltage power supplies. During this testing, the test load has logged approximately 35 hours of operation at power levels in excess of one megawatt. Slight variations in the resistance of the load during operation indicate that leakage currents in the cooling water may be a significant factor affecting the performance of the load. Sufficient performance data have been collected to indicate that leakage current through the deionized (DI) water coolant shunts roughly 15 percent of the full-load current around the load resistor elements. The leakage current could cause deterioration of internal components of the load. The load pressure vessel was disassembled and inspected internally for any signs of significant wear and distress. Results of this inspection and possible modifications for improved performance will be discussed.
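
    The reported 15 percent shunt can be pictured as the DI-water path sitting electrically in parallel with the load resistor elements. The sketch below derives hypothetical resistances from the nominal 95 kV / 2 MW rating; the numbers are illustrative, not measured values from the paper:

```python
def leakage_fraction(r_load_ohm: float, r_water_ohm: float) -> float:
    """Fraction of total current shunted through a parallel water path
    across the same voltage: f = R_load / (R_load + R_water)."""
    return r_load_ohm / (r_load_ohm + r_water_ohm)

# Nominal load resistance implied by the rating: R = V^2 / P
r_load = 95_000.0 ** 2 / 2.0e6            # 4512.5 ohm
# Hypothetical water-path resistance that would yield the reported 15% shunt
r_water = r_load * (1.0 / 0.15 - 1.0)     # ~25.6 kohm
frac = leakage_fraction(r_load, r_water)  # 0.15
```

The point of the sketch is that even a water-path resistance several times the load resistance diverts a non-trivial share of the current, consistent with the observed resistance drift.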

  6. Interactive Data Exploration for High-Performance Fluid Flow Computations through Porous Media

    KAUST Repository

    Perovic, Nevena

    2014-09-01

    © 2014 IEEE. The advent of huge data in high-performance computing (HPC) applications such as fluid flow simulations usually hinders the interactive processing and exploration of simulation results. Such interactive data exploration not only allows scientists to 'play' with their data but also to visualise huge (distributed) data sets in both an efficient and easy way. Therefore, we propose an HPC data exploration service based on a sliding window concept that enables researchers to access remote data (available on a supercomputer or cluster) during simulation runtime without exceeding any bandwidth limitations between the HPC back-end and the user front-end.
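
    A sliding-window service of this kind amounts to serving bounded index ranges of a much larger remote array, so only one window's worth of data crosses the network at a time. A minimal sketch of the windowing logic (hypothetical, not the authors' implementation):

```python
def sliding_windows(n_total: int, window: int, step: int):
    """Yield (start, stop) index ranges covering a dataset of n_total
    elements; only one window is transferred to the front-end at a time."""
    start = 0
    while start < n_total:
        yield start, min(start + window, n_total)
        start += step

# Cover a million-element remote dataset in non-overlapping 4096-element views
views = list(sliding_windows(n_total=10**6, window=4096, step=4096))
```

Choosing `step < window` would give overlapping windows, which is useful when the front-end needs seamless panning rather than paging.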

  7. High performance homes

    DEFF Research Database (Denmark)

    Beim, Anne; Vibæk, Kasper Sánchez

    2014-01-01

    Can prefabrication contribute to the development of high performance homes? To answer this question, this chapter defines high performance in more broadly inclusive terms, acknowledging the technical, architectural, social and economic conditions under which energy consumption and production occur....... Consideration of all these factors is a precondition for a truly integrated practice and as this chapter demonstrates, innovative project delivery methods founded on the manufacturing of prefabricated buildings contribute to the production of high performance homes that are cost effective to construct, energy...

  8. Influence of high energy β-radiation on thermoelectric performance of filled skutterudites compounds

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Jikun, E-mail: jikunchen@seas.harvard.edu [CAS Key Laboratory of Materials for Energy Conversion, Shanghai Institute of Ceramics, Chinese Academy of Sciences, Shanghai 200050 (China); Zha, Hao [Department of Engineering Physics, Tsinghua University, Beijing 100084 (China); Key Laboratory of Particle & Radiation Imaging, Tsinghua University, Ministry of Education, Beijing (China); Xia, Xugui; Qiu, Pengfei; Li, Yulong [CAS Key Laboratory of Materials for Energy Conversion, Shanghai Institute of Ceramics, Chinese Academy of Sciences, Shanghai 200050 (China); Wang, Chuanjing; Han, Yunsheng [Nuctech Company Limited, Beijing (China); Shi, Xun; Chen, Lidong [CAS Key Laboratory of Materials for Energy Conversion, Shanghai Institute of Ceramics, Chinese Academy of Sciences, Shanghai 200050 (China); Jin, Qingxiu [Department of Engineering Physics, Tsinghua University, Beijing 100084 (China); Key Laboratory of Particle & Radiation Imaging, Tsinghua University, Ministry of Education, Beijing (China); Chen, Huaibi, E-mail: chenhb@mail.tsinghua.edu.cn [Department of Engineering Physics, Tsinghua University, Beijing 100084 (China); Key Laboratory of Particle & Radiation Imaging, Tsinghua University, Ministry of Education, Beijing (China)

    2015-08-15

    Highlights: • The impact of MeV β-ray irradiation on a skutterudite TE material was investigated. • A Monte-Carlo simulation was used to estimate the deposited energy from the irradiation. • The high deposited energy does not change the TE performance. • Light irradiation does not have a significant impact on TE materials. - Abstract: The influence of MeV β-ray irradiation on the thermoelectric performance of an n-type filled skutterudite material has been investigated using an electron accelerator. Using a Monte-Carlo simulation based on the Fluka code, the energy deposited in the sample material by the irradiation is estimated, showing a large deposited power of around 50 W/mm. Nevertheless, the thermoelectric performances of the filled skutterudite samples are compared before and after irradiation. The comparison indicates that the thermoelectric material will not be easily jeopardized by ‘light’ irradiation with energies below the MeV range.
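
    The deposited-energy estimate relies on Monte-Carlo particle transport. The toy sketch below illustrates only the sampling idea, using a simple exponential absorption model (an assumption for illustration; the actual Fluka physics for MeV electrons is far more detailed):

```python
import math
import random

def mc_absorbed_fraction(mu_per_mm: float, thickness_mm: float,
                         n: int = 100_000, seed: int = 1) -> float:
    """Toy Monte Carlo: fraction of particles stopped within a slab,
    assuming exponential attenuation (NOT real electron transport)."""
    rng = random.Random(seed)
    absorbed = sum(rng.expovariate(mu_per_mm) < thickness_mm
                   for _ in range(n))
    return absorbed / n

frac = mc_absorbed_fraction(mu_per_mm=0.5, thickness_mm=2.0)
analytic = 1.0 - math.exp(-0.5 * 2.0)   # closed-form check, ~0.632
```

With 100,000 histories the Monte-Carlo estimate agrees with the closed-form value to well under one percent, which is the same convergence logic a full Fluka run relies on at far greater physical fidelity.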

  9. Influence of high energy β-radiation on thermoelectric performance of filled skutterudites compounds

    International Nuclear Information System (INIS)

    Chen, Jikun; Zha, Hao; Xia, Xugui; Qiu, Pengfei; Li, Yulong; Wang, Chuanjing; Han, Yunsheng; Shi, Xun; Chen, Lidong; Jin, Qingxiu; Chen, Huaibi

    2015-01-01

    Highlights: • The impact of MeV β-ray irradiation on a skutterudite TE material was investigated. • A Monte-Carlo simulation was used to estimate the deposited energy from the irradiation. • The high deposited energy does not change the TE performance. • Light irradiation does not have a significant impact on TE materials. - Abstract: The influence of MeV β-ray irradiation on the thermoelectric performance of an n-type filled skutterudite material has been investigated using an electron accelerator. Using a Monte-Carlo simulation based on the Fluka code, the energy deposited in the sample material by the irradiation is estimated, showing a large deposited power of around 50 W/mm. Nevertheless, the thermoelectric performances of the filled skutterudite samples are compared before and after irradiation. The comparison indicates that the thermoelectric material will not be easily jeopardized by ‘light’ irradiation with energies below the MeV range.

  10. Remote Numerical Simulations of the Interaction of High Velocity Clouds with Random Magnetic Fields

    Science.gov (United States)

    Santillan, Alfredo; Hernandez--Cervantes, Liliana; Gonzalez--Ponce, Alejandro; Kim, Jongsoo

    The numerical simulations associated with the interaction of High Velocity Clouds (HVC) with the Magnetized Galactic Interstellar Medium (ISM) are a powerful tool to describe the evolution of these objects in our Galaxy. In this work we present a new project referred to as Theoretical Virtual Observatories. It is oriented toward performing numerical simulations in real time through a Web page. This powerful astrophysical computational tool consists of an intuitive graphical user interface (GUI) and a database produced by numerical calculations. In this Website the user can make use of existing numerical simulations from the database or run a new simulation by introducing initial conditions such as temperatures, densities, velocities, and magnetic field intensities for both the ISM and the HVC. The prototype is programmed using Linux, Apache, MySQL, and PHP (LAMP), based on the open source philosophy. All simulations were performed with the MHD code ZEUS-3D, which solves the ideal MHD equations by finite differences on a fixed Eulerian mesh. Finally, we present typical results that can be obtained with this tool.

  11. High-Performance Computer Modeling of the Cosmos-Iridium Collision

    Energy Technology Data Exchange (ETDEWEB)

    Olivier, S; Cook, K; Fasenfest, B; Jefferson, D; Jiang, M; Leek, J; Levatin, J; Nikolaev, S; Pertica, A; Phillion, D; Springer, K; De Vries, W

    2009-08-28

    This paper describes the application of a new, integrated modeling and simulation framework, encompassing the space situational awareness (SSA) enterprise, to the recent Cosmos-Iridium collision. This framework is based on a flexible, scalable architecture to enable efficient simulation of the current SSA enterprise, and to accommodate future advancements in SSA systems. In particular, the code is designed to take advantage of massively parallel, high-performance computer systems available, for example, at Lawrence Livermore National Laboratory. We will describe the application of this framework to the recent collision of the Cosmos and Iridium satellites, including (1) detailed hydrodynamic modeling of the satellite collision and resulting debris generation, (2) orbital propagation of the simulated debris and analysis of the increased risk to other satellites, (3) calculation of the radar and optical signatures of the simulated debris and modeling of debris detection with space surveillance radar and optical systems, (4) determination of simulated debris orbits from modeled space surveillance observations and analysis of the resulting orbital accuracy, and (5) comparison of these modeling and simulation results with Space Surveillance Network observations. We will also discuss the use of this integrated modeling and simulation framework to analyze the risks and consequences of future satellite collisions and to assess strategies for mitigating or avoiding future incidents, including the addition of new sensor systems, used in conjunction with the Space Surveillance Network, for improving space situational awareness.

  12. Effectiveness of simulation-based learning on student nurses' self-efficacy and performance while learning fundamental nursing skills.

    Science.gov (United States)

    Lin, Hsin-Hsin

    2015-01-01

    It has been noted worldwide that, while learning fundamental skills and facing skills assessments, nursing students seem to experience low confidence and high anxiety levels. Could simulation-based learning help to enhance students' self-efficacy and performance? Its effectiveness remains largely unexamined. This study was conducted to provide a shared experience to give nurse educators confidence and an insight into how simulation-based teaching can fit into nursing skills learning. A pilot study was completed with 50 second-year undergraduate nursing students, and the main study included 98 students; a pretest-posttest design was adopted. Data were gathered through four questionnaires and a performance assessment under scrutinized controls such as previous experience, lecturers' teaching skills, duration of teaching, the procedure of the skills performance assessment and inter-rater reliability. The results showed that simulation-based learning significantly improved students' self-efficacy regarding skills learning and the skills performance that nurse educators wish students to acquire. However, technology anxiety, examiners' critical attitudes towards students' performance and their unpredicted verbal and non-verbal expressions were found to be possible confounding factors. Simulation-based learning proved to have a powerful positive effect on students' achievement outcomes. Nursing skills learning is one area that can benefit greatly from this kind of teaching and learning method.

  13. The development of high performance numerical simulation code for transient groundwater flow and reactive solute transport problems based on local discontinuous Galerkin method

    International Nuclear Information System (INIS)

    Suzuki, Shunichi; Motoshima, Takayuki; Naemura, Yumi; Kubo, Shin; Kanie, Shunji

    2009-01-01

    The authors develop a numerical code based on the Local Discontinuous Galerkin method for transient groundwater flow and reactive solute transport problems, in order to make three-dimensional performance assessment of radioactive waste repositories possible at the earliest stage. The Local Discontinuous Galerkin method is a mixed finite element method that is more accurate than standard finite element methods. In this paper, the developed numerical code is applied to several problems for which analytical solutions are available, in order to examine its accuracy and flexibility. The results of the simulations show that the new code gives highly accurate numerical solutions. (author)
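
    The governing transient groundwater flow equation in one dimension, S ∂h/∂t = T ∂²h/∂x², can be illustrated with a plain explicit finite-difference sketch. This is a deliberately simpler stand-in for intuition; it does not reproduce the paper's Local Discontinuous Galerkin discretization:

```python
def step_head(h, dt, dx, T, S):
    """One explicit finite-difference step of S*dh/dt = T*d2h/dx2,
    with fixed-head (Dirichlet) boundaries at both ends."""
    r = T * dt / (S * dx * dx)
    assert r <= 0.5, "explicit scheme stability limit violated"
    new = h[:]
    for i in range(1, len(h) - 1):
        new[i] = h[i] + r * (h[i - 1] - 2.0 * h[i] + h[i + 1])
    return new

# Relax toward the linear steady state between two fixed heads (10 m and 0 m)
h = [10.0] + [0.0] * 10            # 11 nodes, dx = 1 m
for _ in range(2000):
    h = step_head(h, dt=0.02, dx=1.0, T=1.0, S=0.05)
# h approaches the linear profile h(x) = 10 * (1 - x / 10)
```

The explicit scheme's stability limit (r <= 1/2) is exactly the kind of restriction that higher-order methods such as LDG, combined with implicit time stepping, are designed to avoid while also improving accuracy.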

  14. Fully Coupled Simulation of Lithium Ion Battery Cell Performance

    Energy Technology Data Exchange (ETDEWEB)

    Trembacki, Bradley L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Murthy, Jayathi Y. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Roberts, Scott Alan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

    Lithium-ion battery particle-scale (non-porous electrode) simulations applied to resolved electrode geometries predict localized phenomena and can lead to better informed decisions on electrode design and manufacturing. This work develops and implements a fully-coupled finite volume methodology for the simulation of the electrochemical equations in a lithium-ion battery cell. The model implementation is used to investigate 3D battery electrode architectures that offer potential energy density and power density improvements over traditional layer-by-layer particle bed battery geometries. Advancement of micro-scale additive manufacturing techniques has made it possible to fabricate these 3D electrode microarchitectures. A variety of 3D battery electrode geometries are simulated and compared across various battery discharge rates and length scales in order to quantify performance trends and investigate geometrical factors that improve battery performance. The energy density and power density of the 3D battery microstructures are compared in several ways, including a uniform surface area to volume ratio comparison as well as a comparison requiring a minimum manufacturable feature size. Significant performance improvements over traditional particle bed electrode designs are observed, and electrode microarchitectures derived from minimal surfaces are shown to be superior. A reduced-order volume-averaged porous electrode theory formulation for these unique 3D batteries is also developed, allowing simulations on the full-battery scale. Electrode concentration gradients are modeled using the diffusion length method, and results for plate and cylinder electrode geometries are compared to particle-scale simulation results. Additionally, effective diffusion lengths that minimize error with respect to particle-scale results for gyroid and Schwarz P electrode microstructures are determined.
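
    The "uniform surface area to volume ratio comparison" mentioned above can be grounded with the closed-form S/V ratios of the simplest electrode shapes (textbook geometry; the gyroid and Schwarz P surfaces studied in the work require numerical evaluation and are omitted here):

```python
def sa_to_vol_plate(thickness: float) -> float:
    """Infinite plate electrode: both faces electrochemically active."""
    return 2.0 / thickness

def sa_to_vol_cylinder(radius: float) -> float:
    """Long cylinder electrode: lateral surface only."""
    return 2.0 / radius

def sa_to_vol_sphere(radius: float) -> float:
    """Spherical particle, as in a traditional particle bed."""
    return 3.0 / radius

# Equal-S/V comparison: a plate of thickness t matches a cylinder of radius t,
# while a sphere needs a radius 1.5x larger for the same ratio.
ratios = (sa_to_vol_plate(2.0), sa_to_vol_cylinder(2.0), sa_to_vol_sphere(3.0))
```

Fixing S/V across candidate geometries is what makes the energy-density and power-density comparison fair: differences that remain are attributable to the architecture, not simply to more reactive surface.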

  15. Design of the HELICS High-Performance Transmission-Distribution-Communication-Market Co-Simulation Framework: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Palmintier, Bryan S [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Krishnamurthy, Dheepak [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Top, Philip [Lawrence Livermore National Laboratories; Smith, Steve [Lawrence Livermore National Laboratories; Daily, Jeff [Pacific Northwest National Laboratory; Fuller, Jason [Pacific Northwest National Laboratory

    2017-09-12

    This paper describes the design rationale for a new cyber-physical-energy co-simulation framework for electric power systems. This new framework will support very large-scale (100,000+ federates) co-simulations with off-the-shelf power-systems, communication, and end-use models. Other key features include cross-platform operating system support, integration of both event-driven (e.g. packetized communication) and time-series (e.g. power flow) simulation, and the ability to co-iterate among federates to ensure model convergence at each time step. After describing requirements, we begin by evaluating existing co-simulation frameworks, including HLA and FMI, and conclude that none provide the required features. Then we describe the design for the new layered co-simulation architecture.
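
    Co-iteration among federates, one of the listed requirements, is essentially a fixed-point loop at each time step: each federate recomputes its outputs from the others' latest values until nothing changes. A minimal sketch with two toy federates (a generic illustration, not the HELICS API):

```python
def co_iterate(f_a, f_b, x0: float, tol: float = 1e-9,
               max_iter: int = 100) -> float:
    """Fixed-point co-iteration at one time step: federate A computes
    y = f_a(x), federate B computes x = f_b(y), until x stops changing."""
    x = x0
    for _ in range(max_iter):
        y = f_a(x)
        x_new = f_b(y)
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    raise RuntimeError("co-iteration did not converge at this time step")

# Toy federates: jointly solve y = 0.5*x and x = 0.5*y + 1  =>  x = 4/3
x_star = co_iterate(lambda x: 0.5 * x, lambda y: 0.5 * y + 1.0, x0=0.0)
```

Convergence here relies on the coupled update being a contraction; in a real co-simulation, relaxation factors play that role when the physical coupling is stiff.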

  16. Simulations of dimensionally reduced effective theories of high temperature QCD

    CERN Document Server

    Hietanen, Ari

    Quantum chromodynamics (QCD) is the theory describing the interaction between quarks and gluons. At low temperatures, quarks are confined, forming hadrons, e.g. protons and neutrons. However, at extremely high temperatures the hadrons break apart and the matter transforms into a plasma of individual quarks and gluons. In this thesis the quark gluon plasma (QGP) phase of QCD is studied using lattice techniques in the framework of the dimensionally reduced effective theories EQCD and MQCD. Two quantities are of particular interest: the pressure (or grand potential) and the quark number susceptibility. At high temperatures the pressure admits a generalised coupling constant expansion, where some coefficients are non-perturbative. We determine the first such contribution, of order g^6, by performing lattice simulations in MQCD. This requires high precision lattice calculations, which we perform with different numbers of colors N_c to obtain the N_c-dependence of the coefficient. The quark number susceptibility is studied by perf...

  17. Design and performance simulation of a segmented-absorber based muon detection system for high energy heavy ion collision experiments

    International Nuclear Information System (INIS)

    Ahmad, S.; Bhaduri, P.P.; Jahan, H.; Senger, A.; Adak, R.; Samanta, S.; Prakash, A.; Dey, K.; Lebedev, A.; Kryshen, E.; Chattopadhyay, S.; Senger, P.; Bhattacharjee, B.; Ghosh, S.K.; Raha, S.; Irfan, M.; Ahmad, N.; Farooq, M.; Singh, B.

    2015-01-01

    A muon detection system (MUCH) based on a novel concept using a segmented and instrumented absorber has been designed for high-energy heavy-ion collision experiments. The system consists of 6 hadron absorber blocks and 6 tracking detector triplets. Behind each absorber block a detector triplet is located which measures the tracks of charged particles traversing the absorber. The performance of such a system has been simulated for the CBM experiment at FAIR (Germany), which is scheduled to start taking data in heavy-ion collisions in the beam energy range of 6–45 A GeV from 2019. The muon detection system is mounted downstream of a Silicon Tracking System (STS) that is located in a large-aperture dipole magnet, which together provide momentum information for the charged-particle tracks. The reconstructed tracks from the STS are matched to the hits measured by the muon detector triplets behind the absorber segments. This method allows the identification of muon tracks over a broad range of momenta, including tracks of soft muons which do not pass through all the absorber layers. Pairs of oppositely charged muons identified by MUCH can therefore be combined to measure invariant masses in a wide range, from low-mass vector mesons (LMVM) up to charmonia. The properties of the absorber (material, thickness, position) and of the tracking chambers (granularity, geometry) have been varied in simulations of heavy-ion collision events generated with the UrQMD generator and propagated through the setup using GEANT3, the particle transport code. The tracks are reconstructed by a Cellular Automaton algorithm followed by a Kalman Filter. The simulations demonstrate that low-mass vector mesons and charmonia can be clearly identified in central Au+Au collisions at beam energies provided by the international Facility for Antiproton and Ion Research (FAIR).
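
    Once a pair of oppositely charged muons is identified, the invariant mass follows directly from the reconstructed momenta. A minimal sketch of that kinematic step (standard relativistic kinematics, not the CBM reconstruction code):

```python
import math

M_MU = 0.1057  # muon mass in GeV/c^2 (approximate)

def invariant_mass(p1, p2) -> float:
    """Invariant mass of a muon pair from the two 3-momenta (GeV/c):
    M^2 = (E1 + E2)^2 - |p1 + p2|^2, with E = sqrt(m^2 + |p|^2)."""
    e1 = math.sqrt(M_MU ** 2 + sum(c * c for c in p1))
    e2 = math.sqrt(M_MU ** 2 + sum(c * c for c in p2))
    px, py, pz = (a + b for a, b in zip(p1, p2))
    return math.sqrt((e1 + e2) ** 2 - (px * px + py * py + pz * pz))

# Illustrative back-to-back 1.5 GeV/c muons in the pair rest frame:
# the pair mass lands near the charmonium region (~3 GeV/c^2)
m = invariant_mass((0.0, 0.0, 1.5), (0.0, 0.0, -1.5))
```

Histogramming this quantity over all identified pairs is what produces the LMVM and charmonia peaks the simulations demonstrate.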

  18. Acquiring skills in malignant hyperthermia crisis management: comparison of high-fidelity simulation versus computer-based case study

    Directory of Open Access Journals (Sweden)

    Vilma Mejía

    Full Text Available Abstract Introduction: The primary purpose of this study was to compare the effect of high fidelity simulation versus computer-based case solving self-study on skills acquisition in malignant hyperthermia among first year anesthesiology residents. Methods: After institutional ethics committee approval, 31 first year anesthesiology residents were enrolled in this prospective randomized single-blinded study. Participants were randomized to either a high fidelity simulation scenario or a computer-based case study about malignant hyperthermia. After the intervention, all subjects' performance was assessed through a high fidelity simulation scenario using a previously validated assessment rubric. Additionally, knowledge tests and a satisfaction survey were applied. Finally, a semi-structured interview was done to assess self-perception of the reasoning process and decision-making. Results: 28 first year residents successfully finished the study. Residents' management skill scores were globally higher with high fidelity simulation than with the case study, but significantly so in only 4 of the 8 performance rubric elements: recognizing signs and symptoms (p = 0.025), prioritization of initial management actions (p = 0.003), recognizing complications (p = 0.025) and communication (p = 0.025). Average scores on the pre- and post-test knowledge questionnaires improved from 74% to 85% in the high fidelity simulation group, and decreased from 78% to 75% in the case study group (p = 0.032). Regarding the qualitative analysis, there was no difference in the factors influencing the students' process of reasoning and decision-making between the two teaching strategies. Conclusion: Simulation-based training with a malignant hyperthermia high-fidelity scenario was superior to a computer-based case study, improving knowledge and skills in malignant hyperthermia crisis management, with a very good satisfaction level among anesthesia residents.

  19. Computer science of the high performance; Informatica del alto rendimiento

    Energy Technology Data Exchange (ETDEWEB)

    Moraleda, A.

    2008-07-01

    High performance computing is taking shape as a powerful accelerator of innovation, drastically reducing the waiting time for access to results and findings in a growing number of processes and activities as complex and important as medicine, genetics, pharmacology, the environment, natural resources management or the simulation of complex processes in a wide variety of industries. (Author)

  20. SEMICONDUCTOR INTEGRATED CIRCUITS: A quasi-3-dimensional simulation method for a high-voltage level-shifting circuit structure

    Science.gov (United States)

    Jizhi, Liu; Xingbi, Chen

    2009-12-01

    A new quasi-three-dimensional (quasi-3D) numerical simulation method for a high-voltage level-shifting circuit structure is proposed. The performance of the 3D structure is analyzed by combining several 2D device structures; the 2D devices lie in two planes perpendicular to each other and to the surface of the semiconductor. In comparison with Davinci, a full 3D device simulation tool, the quasi-3D simulation method gives results for the potential and current distribution of the 3D high-voltage level-shifting circuit structure with appropriate accuracy, while the total CPU time for simulation is significantly reduced. The quasi-3D simulation technique can be used in many cases, with advantages such as saving computing time, making no demands on high-end computing hardware, and being easy to operate.

  1. High-Performance Computing in Neuroscience for Data-Driven Discovery, Integration, and Dissemination

    International Nuclear Information System (INIS)

    Bouchard, Kristofer E.

    2016-01-01

    A lack of coherent plans to analyze, manage, and understand data threatens the various opportunities offered by new neuro-technologies. High-performance computing will allow exploratory analysis of massive datasets stored in standardized formats, hosted in open repositories, and integrated with simulations.

  2. Performance Test of Core Protection and Monitoring Algorithm with DLL for SMART Simulator Implementation

    International Nuclear Information System (INIS)

    Koo, Bonseung; Hwang, Daehyun; Kim, Keungkoo

    2014-01-01

    A multi-purpose best-estimate simulator for SMART is being established, intended to be used as a tool to evaluate the impacts of design changes on safety performance and to improve and/or optimize the operating procedures of SMART. In keeping with these intentions, a real-time model of the digital core protection and monitoring systems was developed and its real-time performance was verified for various simulation scenarios. In this paper, a performance test of the core protection and monitoring algorithm with a DLL file for the SMART simulator implementation was performed. A DLL file of the simulator application code was made, and several real-time evaluation tests were conducted for steady-state and transient conditions as well as various other scenarios, using simulated system variables. The results of all test cases showed good agreement with the reference results, and features caused by algorithm changes were properly reflected in the DLL results. Therefore, it was concluded that the SCOPS S SIM and SCOMS S SIM algorithms and calculational capabilities are appropriate for the core protection and monitoring program in the SMART simulator.
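    The acceptance check described above — DLL outputs matched against reference results for each scenario — can be sketched as a small comparison harness. The function names, tolerance, and sample trace values below are illustrative assumptions, not taken from the SMART codebase:

```python
# Hypothetical sketch of a DLL-vs-reference comparison harness. All names,
# tolerances, and sample values are assumptions for illustration.

def max_relative_error(reference, candidate):
    """Largest relative deviation between two equal-length result traces."""
    return max(
        abs(c - r) / abs(r) if r != 0 else abs(c - r)
        for r, c in zip(reference, candidate)
    )

def passes_acceptance(reference, candidate, tol=1e-3):
    """A trace is accepted when every sample agrees within `tol`."""
    return max_relative_error(reference, candidate) <= tol

# invented steady-state setpoint trace: reference code vs. DLL build
ref = [0.985, 0.987, 0.990, 0.991]
dll = [0.985, 0.9872, 0.990, 0.991]
print(passes_acceptance(ref, dll))  # True: agreement within 0.1%
```

    In practice such a harness would also check that each DLL call returns before its real-time deadline, which is the other acceptance criterion the record mentions.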

  3. The performance of Dräger Oxylog ventilators at simulated altitude.

    Science.gov (United States)

    Flynn, J G; Singh, B

    2008-07-01

    Ventilated patients frequently require transport by air in a hypobaric environment. Previous studies have demonstrated significant changes in the performance of ventilators with changes in cabin pressure (altitude), but no studies have been published on the function of modern ventilators at altitude. This experiment set out to evaluate ventilatory parameters (tidal volume and respiratory rate) of three commonly used transport ventilators (the Dräger Oxylog 1000, 2000 and 3000) in a simulated hypobaric environment. Ventilators were assessed using either air-mix (60% oxygen) or 100% oxygen and tested against models simulating a normal lung, a low-compliance (Acute Respiratory Distress Syndrome) lung and a high-resistance (asthma) lung. Ventilators were tested at a range of simulated altitudes between sea level and 3048 m. Over this range, tidal volume delivered by the Oxylog 1000 increased by 68% and respiratory rate decreased by 28%. Tidal volume delivered by the Oxylog 2000 ventilator increased by 29% over the same range of altitudes but there was no significant change in respiratory rate. Tidal volume and respiratory rate remained constant with the Oxylog 3000 over the same range of altitudes. Changes were consistent with each ventilator regardless of oxygen content or lung model. It is important that clinicians involved in critical care transport in a hypobaric environment are aware that individual ventilators perform differently at altitude, and of the characteristics of the particular ventilator they are using.
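    As a rough plausibility check (an illustration, not part of the study): if delivered gas volume were governed purely by gas expansion, Boyle's law would predict the increase from the fall in ambient pressure. The pressure value assumed for 3048 m is approximate:

```python
# Back-of-the-envelope Boyle's-law estimate: a fixed mass of gas expands
# inversely with ambient pressure. Pressure at 3048 m is an approximation.
P_SEA_LEVEL_KPA = 101.3   # standard atmosphere at sea level
P_3048M_KPA = 69.7        # approximate ambient pressure at 3048 m

predicted_increase = (P_SEA_LEVEL_KPA / P_3048M_KPA - 1) * 100
print(f"Boyle's-law tidal volume increase: {predicted_increase:.0f}%")
```

    This yields roughly 45%, suggesting that the 68% rise measured for the Oxylog 1000 cannot be explained by gas expansion alone, while the Oxylog 2000's 29% is below the purely physical prediction.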

  4. Design of High Performance Permanent-Magnet Synchronous Wind Generators

    Directory of Open Access Journals (Sweden)

    Chun-Yu Hsiao

    2014-11-01

    This paper is devoted to the analysis and design of high-performance permanent-magnet synchronous wind generators (PMSGs). A systematic and sequential methodology for the design of PMSGs is proposed, with a high-performance wind generator as the design model. Aiming at high induced voltage, low harmonic distortion and high generator efficiency, optimal generator parameters such as the pole-arc to pole-pitch ratio and the stator-slot-shoe dimensions are determined with the proposed technique using Maxwell 2-D, Matlab software and the Taguchi method. The proposed double three-phase and six-phase winding configurations, which consist of six windings in the stator, can provide evenly distributed current for versatile applications regarding practical voltage and current demands. Specifically, windings are connected in series to increase the output voltage at low wind speed, and in parallel at high wind speed so that electricity is still generated even if one winding fails, thereby also enhancing reliability. A PMSG is designed and implemented based on the proposed method. When the simulation is performed with a 6 Ω load, the output power for the double three-phase winding and the six-phase winding is 10.64 and 11.13 kW, respectively. In addition, 24 Ω load experiments show that the efficiencies of the double three-phase winding and the six-phase winding are 96.56% and 98.54%, respectively, verifying the proposed high-performance operation.

  5. Simulation of the hydraulic performance of highway filter drains through laboratory models and stormwater management tools.

    Science.gov (United States)

    Sañudo-Fontaneda, Luis A; Jato-Espino, Daniel; Lashford, Craig; Coupe, Stephen J

    2017-05-23

    Road drainage is one of the most relevant assets in transport infrastructure due to its inherent influence on traffic management and road safety. Highway filter drains (HFDs), also known as "French Drains", are the main drainage system currently in use in the UK, throughout 7000 km of its strategic road network. Despite being a widespread technique across the whole country, little research has been completed on their design considerations and the subsequent impact on their hydraulic performance, representing a gap in the field. Laboratory experiments have been proven to be a reliable indicator for the simulation of the hydraulic performance of stormwater best management practices (BMPs). In addition to this, stormwater management tools (SMT) have been preferentially chosen as a design tool for BMPs by practitioners from all over the world. In this context, this research aims to investigate the hydraulic performance of HFDs by comparing the results from laboratory simulation with two widely used SMT, the US EPA's stormwater management model (SWMM) and MicroDrainage®. Statistical analyses were applied to a series of simulated rainfall scenarios, showing a high level of accuracy between the results obtained in the laboratory and those obtained using SMT, as indicated by the high Nash-Sutcliffe and R² coefficients and the low root-mean-square error (RMSE) values reached, which validated the usefulness of SMT for determining the hydraulic performance of HFDs.
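    The agreement statistics this study relies on can be computed directly from paired hydrographs. The flow values below are invented for illustration, not taken from the paper:

```python
import math

# Goodness-of-fit measures of the kind used to compare laboratory
# hydrographs with SWMM/MicroDrainage output. Sample flows are invented.

def nash_sutcliffe(observed, simulated):
    """NSE = 1 - SS_res / SS_tot; 1.0 means perfect agreement."""
    mean_obs = sum(observed) / len(observed)
    ss_res = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return 1 - ss_res / ss_tot

def rmse(observed, simulated):
    """Root-mean-square error between the two traces."""
    n = len(observed)
    return math.sqrt(sum((o - s) ** 2 for o, s in zip(observed, simulated)) / n)

lab  = [0.0, 0.8, 2.1, 3.0, 2.2, 1.1, 0.4]   # l/s, laboratory rig
swmm = [0.0, 0.7, 2.0, 3.1, 2.3, 1.0, 0.4]   # l/s, SWMM simulation

print(round(nash_sutcliffe(lab, swmm), 3))   # 0.993
print(round(rmse(lab, swmm), 3))             # 0.085
```

    An NSE close to 1 together with a small RMSE is the pattern the abstract describes as "high and low values" validating the SMT results.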

  6. Simulator training and non-technical factors improve laparoscopic performance among OBGYN trainees.

    Science.gov (United States)

    Ahlborg, Liv; Hedman, Leif; Nisell, Henry; Felländer-Tsai, Li; Enochsson, Lars

    2013-10-01

    To investigate how simulator training and non-technical factors affect laparoscopic performance among residents in obstetrics and gynecology. In this prospective study, trainees were randomized into three groups. The first group was allocated to proficiency-based training in the LapSimGyn(®) virtual reality simulator. The second group received additional structured mentorship during subsequent laparoscopies. The third group served as control group. At baseline an operation was performed and visuospatial ability, flow and self-efficacy were assessed. All groups subsequently performed three tubal occlusions. Self-efficacy and flow were assessed before and/or after each operation. Simulator training was conducted at the Center for Advanced Medical Simulation and Training, Karolinska University Hospital. Sterilizations were performed at each trainee's home clinic. Twenty-eight trainees/residents from 21 hospitals in Sweden were included. Visuospatial ability was tested by the Mental Rotation Test-A. Flow and self-efficacy were assessed by validated scales and questionnaires. Laparoscopic performance was measured as the duration of surgery. Visuospatial ability, self-efficacy and flow were correlated to the laparoscopic performance using Spearman's correlations. Differences between groups were analyzed by the Mann-Whitney U-test. No differences across groups were detected at baseline. Self-efficacy scores before and flow scores after the third operation were significantly higher in the trained groups. Duration of surgery was significantly shorter in the trained groups. Flow and self-efficacy correlate positively with laparoscopic performance. Simulator training and non-technical factors appear to improve the laparoscopic performance among trainees/residents in obstetrics and gynecology. © 2013 Nordic Federation of Societies of Obstetrics and Gynecology.
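    The Spearman correlation used here to relate non-technical factors to laparoscopic performance can be sketched without a statistics library (it is simply Pearson correlation on ranks). The scores below are invented, chosen so the two variables are perfectly inversely ranked:

```python
# Minimal Spearman rank correlation, as used to relate visuospatial
# ability or flow scores to duration of surgery. Data are invented.

def ranks(values):
    """1-based average ranks, handling ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Pearson correlation computed on the ranks of x and y."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

mrt_score  = [10, 14, 9, 18, 12, 16]   # Mental Rotation Test-A (invented)
op_minutes = [32, 25, 35, 18, 28, 22]  # duration of surgery (invented)
print(round(spearman(mrt_score, op_minutes), 2))  # -1.0: higher ability, faster surgery
```

    A negative coefficient here matches the study's finding that better visuospatial ability and flow correlate with shorter (better) operating times.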

  7. Driving simulator performance of veterans from the Iraq and Afghanistan wars.

    Science.gov (United States)

    Amick, Melissa M; Kraft, Melissa; McGlinchey, Regina

    2013-01-01

    Driving simulator performance was examined in Operation Iraqi Freedom/Operation Enduring Freedom (OIF/OEF) Veterans to objectively evaluate driving abilities among this cohort who self-report poorer driving safety postdeployment. OIF/OEF Veterans (n = 25) and age- and education-matched civilian controls (n = 25) participated in a 30 min driving simulator assessment that measured the frequency of minor, moderate, and severe driving errors. Frequency of errors in specific content domains (speed regulation, positioning, and signaling) was also calculated. All participants answered questions about number of lifetime traffic "warnings," moving violation tickets, and accidents. Veterans completed the Posttraumatic Stress Disorder (PTSD) Checklist-Military Version. On the driving simulator assessment, Veterans committed more minor, moderate, severe, and speeding errors and reported poorer lifetime driving records than the civilian control group. Exploratory analyses revealed an association between increasing errors on the driving simulator with increasing symptoms of PTSD, although statistically this correlation did not reach significance. These findings suggest that Veterans perform more poorly on an objective evaluation of driving safety and that the presence of PTSD could be associated with worse performance on this standardized driving simulator assessment.

  8. Simulation study of the high intensity S-Band photoinjector

    Energy Technology Data Exchange (ETDEWEB)

    Zhu, Xiongwei; Nakajima, Kazuhisa [High Energy Accelerator Research Organization, Tsukuba, Ibaraki (Japan)

    2001-10-01

    In this paper, we report the results of a simulation study of the high intensity S-Band photoinjector. The aim of the simulation study is to transport a high bunch charge with low emittance growth. The simulation results show that a 7 nC bunch with an rms emittance of 22.3 {pi} mm mrad can be obtained at the exit of the photoinjector. (author)

  9. Simulation study of the high intensity S-Band photoinjector

    International Nuclear Information System (INIS)

    Zhu, Xiongwei; Nakajima, Kazuhisa

    2001-01-01

    In this paper, we report the results of a simulation study of the high intensity S-Band photoinjector. The aim of the simulation study is to transport a high bunch charge with low emittance growth. The simulation results show that a 7 nC bunch with an rms emittance of 22.3 π mm mrad can be obtained at the exit of the photoinjector. (author)
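    The rms emittance figure quoted in these two records is the usual statistical emittance, ε_rms = sqrt(⟨x²⟩⟨x′²⟩ − ⟨xx′⟩²). A minimal sketch with synthetic particle coordinates (not PARMILA or photoinjector data):

```python
import math
import random

# Statistical (rms) emittance of a particle bundle. The coordinates below
# are synthetic Gaussians, not output from any photoinjector simulation.

def rms_emittance(x, xp):
    """epsilon_rms = sqrt(<x^2><x'^2> - <x x'>^2), centroids removed."""
    n = len(x)
    mx, mxp = sum(x) / n, sum(xp) / n
    x2 = sum((a - mx) ** 2 for a in x) / n
    xp2 = sum((b - mxp) ** 2 for b in xp) / n
    xxp = sum((a - mx) * (b - mxp) for a, b in zip(x, xp)) / n
    # max() guards against tiny negative values from floating-point rounding
    return math.sqrt(max(0.0, x2 * xp2 - xxp ** 2))

random.seed(1)
x  = [random.gauss(0.0, 1.0) for _ in range(10_000)]   # mm
xp = [random.gauss(0.0, 1.0) for _ in range(10_000)]   # mrad, uncorrelated
print(round(rms_emittance(x, xp), 2))  # ~1.0 mm mrad for unit, uncorrelated beams
```

    A perfectly correlated beam (x′ proportional to x) has zero rms emittance, which is why emittance measures the phase-space area rather than the beam size.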

  10. Wall modeling for the simulation of highly non-isothermal unsteady flows; Modelisation de paroi pour la simulation d'ecoulements instationnaires non-isothermes

    Energy Technology Data Exchange (ETDEWEB)

    Devesa, A

    2006-12-15

    Nuclear industry flows are most of the time characterized by their high Reynolds number, density variations (at low Mach numbers) and a highly unsteady behaviour (low to moderate frequencies). High Reynolds numbers are not affordable with direct numerical simulation (DNS), and simulations must either be performed by solving averaged equations (RANS), or by solving only the large eddies (LES), both using a wall model. A first investigation of this thesis dealt with the derivation and testing of two variable-density wall models: an algebraic law (CWM) and a zonal approach dedicated to LES (TBLE-{rho}). These models were validated in quasi-isothermal cases before being used in academic and industrial non-isothermal flows, with satisfactory results. Then, a numerical experiment of pulsed passive scalars was performed by DNS, where two forcing conditions were considered: oscillations imposed in the outer flow, and oscillations coming from the wall. Several frequencies and amplitudes of oscillation were taken into account in order to gain insight into unsteady effects in the boundary layer, and to create a database for validating wall models in such a context. The temporal behaviour of the two wall models (algebraic and zonal) was studied and showed that the zonal model produced better results when used in the simulation of unsteady flows. (author)

  11. Validation of the updated ArthroS simulator: face and construct validity of a passive haptic virtual reality simulator with novel performance metrics.

    Science.gov (United States)

    Garfjeld Roberts, Patrick; Guyver, Paul; Baldwin, Mathew; Akhtar, Kash; Alvand, Abtin; Price, Andrew J; Rees, Jonathan L

    2017-02-01

    To assess the construct and face validity of ArthroS, a passive haptic VR simulator. A secondary aim was to evaluate the novel performance metrics produced by this simulator. Two groups of 30 participants, each divided into novice, intermediate or expert based on arthroscopic experience, completed three separate tasks on either the knee or shoulder module of the simulator. Performance was recorded using 12 automatically generated performance metrics and video footage of the arthroscopic procedures. The videos were blindly assessed using a validated global rating scale (GRS). Participants completed a survey about the simulator's realism and training utility. This new simulator demonstrated construct validity of its tasks when evaluated against a GRS (p ≤ 0.003 in all cases). Regarding its automatically generated performance metrics, established outputs such as time taken (p ≤ 0.001) and instrument path length (p ≤ 0.007) also demonstrated good construct validity. However, two-thirds of the proposed 'novel metrics' the simulator reports could not distinguish participants based on arthroscopic experience. Face validity assessment rated the simulator as a realistic and useful tool for trainees, but the passive haptic feedback (a key feature of this simulator) was rated as less realistic. The ArthroS simulator has good task construct validity based on established objective outputs, but some of the novel performance metrics could not distinguish between levels of surgical experience. The passive haptic feedback of the simulator also needs improvement. If simulators could offer automated and validated performance feedback, this would facilitate improvements in the delivery of training by allowing trainees to practise and self-assess.

  12. Numerical Simulation and Performance Analysis of Twin Screw Air Compressors

    Directory of Open Access Journals (Sweden)

    W. S. Lee

    2001-01-01

    A theoretical model is proposed in this paper in order to study the performance of oil-less and oil-injected twin screw air compressors. Based on this model, a computer simulation program is developed, and the effects of different design parameters including rotor profile, geometric clearance, oil-injection angle, oil temperature, oil flow rate, built-in volume ratio and other operating conditions on the performance of twin screw air compressors are investigated. The simulation program outputs variables such as specific power, compression ratio, compression efficiency, volumetric efficiency, and discharge temperature. Some of these results are then compared with experimentally measured data, and good agreement is found between the simulation results and the measured data.
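    Two of the simulator's output variables, volumetric efficiency and specific power, follow from standard textbook definitions. The operating numbers below are invented, not from the paper:

```python
# Standard definitions of two screw-compressor performance figures.
# All numerical values are invented for illustration.

def volumetric_efficiency(actual_flow_m3min, displacement_m3rev, rpm):
    """Delivered flow divided by the swept (theoretical) flow."""
    theoretical_flow = displacement_m3rev * rpm   # m^3/min
    return actual_flow_m3min / theoretical_flow

def specific_power(shaft_power_kw, actual_flow_m3min):
    """Shaft power per unit of delivered air flow, kW/(m^3/min)."""
    return shaft_power_kw / actual_flow_m3min

q = 10.2       # m^3/min delivered (invented)
disp = 0.004   # m^3 swept per rotor revolution (invented)
rpm = 3000

print(round(volumetric_efficiency(q, disp, rpm), 3))  # 0.85
print(round(specific_power(55.0, q), 2))              # 5.39 kW/(m^3/min)
```

    In the paper's model, leakage through the geometric clearances is what pulls volumetric efficiency below 1, which is why clearance is one of the design parameters studied.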

  13. Proceedings of eSim 2006 : IBPSA-Canada's 4. biennial building performance simulation conference

    International Nuclear Information System (INIS)

    Kesik, T.

    2006-01-01

    This conference was attended by professionals, academics and students interested in promoting the science of building performance simulation in order to optimize design, construction, operation and maintenance of new and existing buildings around the world. This biennial conference and exhibition covered all topics related to computerized simulation of a building's energy performance and energy efficiency. Computerized simulation is widely used to predict the environmental performance of buildings during all stages of a building's life cycle, from the design, commissioning, construction, occupancy and management stages. Newly developed simulation methods for optimal comfort in new and existing buildings were evaluated. The themes of the conference were: recent developments for modelling the physical processes relevant to buildings; algorithms for modelling conventional and innovative HVAC systems; methods for modelling whole-building performance; building simulation software development; the use of building simulation tools in code compliance; moving simulation into practice; validation of building simulation software; architectural design; and optimization approaches in building design. The conference also covered the modeling of energy supply systems with reference to renewable energy sources such as ground source heat pumps or hybrid systems incorporating solar energy. The conference featured 32 presentations, of which 28 have been catalogued separately for inclusion in this database. refs., tabs., figs

  14. Computer simulation of high energy displacement cascades

    International Nuclear Information System (INIS)

    Heinisch, H.L.

    1990-01-01

    A methodology developed for modeling many aspects of high energy displacement cascades with molecular level computer simulations is reviewed. The initial damage state is modeled in the binary collision approximation (using the MARLOWE computer code), and the subsequent disposition of the defects within a cascade is modeled with a Monte Carlo annealing simulation (the ALSOME code). There are few adjustable parameters, and none are set to physically unreasonable values. The basic configurations of the simulated high energy cascades in copper, i.e., the number, size and shape of damage regions, compare well with observations, as do the measured numbers of residual defects and the fractions of freely migrating defects. The success of these simulations is somewhat remarkable, given the relatively simple models of defects and their interactions that are employed. The reason for this success is that the behavior of the defects is very strongly influenced by their initial spatial distributions, which the binary collision approximation adequately models. The MARLOWE/ALSOME system, with input from molecular dynamics and experiments, provides a framework for investigating the influence of high energy cascades on microstructure evolution. (author)
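    The Monte Carlo annealing stage can be caricatured in a few lines: defects random-walk and a vacancy-interstitial pair recombines when the two meet. This is a drastically simplified 1-D toy in the spirit of ALSOME, not its actual model:

```python
import random

# Toy Monte Carlo annealing of a displacement cascade: point defects
# random-walk on a 1-D lattice; co-located vacancy-interstitial pairs
# recombine. A gross simplification of codes like ALSOME.

def anneal(vacancies, interstitials, steps, seed=0):
    rng = random.Random(seed)
    v = list(vacancies)
    i = list(interstitials)
    for _ in range(steps):
        v = [p + rng.choice((-1, 1)) for p in v]   # vacancy hops
        i = [p + rng.choice((-1, 1)) for p in i]   # interstitial hops
        for site in set(v) & set(i):               # recombine one pair per shared site
            v.remove(site)
            i.remove(site)
    return len(v) + len(i)                         # residual defect count

survivors = anneal(vacancies=range(0, 40, 4), interstitials=range(2, 42, 4), steps=200)
print(survivors)
```

    Even this caricature shows the point the record makes: the residual defect count depends strongly on the initial spatial distribution of the defects, which is why an adequate binary-collision model of the initial damage state matters.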

  15. Physiological responses and performance in a simulated trampoline gymnastics competition in elite male gymnasts.

    Science.gov (United States)

    Jensen, Peter; Scott, Suzanne; Krustrup, Peter; Mohr, Magni

    2013-01-01

    Physiological responses and performance were examined during and after a simulated trampoline competition (STC). Fifteen elite trampoline gymnasts participated, of whom eight completed two routines (EX1 and EX2) and a competition final (EX3). Trampoline-specific activities were quantified by video analysis. Countermovement jump (CMJ) and 20 maximal trampoline jump (20-MTJ) performances were assessed. Heart rate (HR) and quadriceps muscle temperature (Tm) were recorded and venous blood was drawn. A total of 252 ± 16 jumps were performed during the STC. CMJ performance declined (P < 0.05). A trampoline gymnastics competition thus includes a high number of repeated explosive and energy-demanding jumps, which impairs jump performance during and 24 h post-competition.

  16. Distributed dynamic simulations of networked control and building performance applications.

    Science.gov (United States)

    Yahiaoui, Azzedine

    2018-02-01

    The use of computer-based automation and control systems for smart sustainable buildings, often so-called Automated Buildings (ABs), has become an effective way to automatically control, optimize, and supervise a wide range of building performance applications over a network while achieving the minimum energy consumption possible, an approach generally referred to as Building Automation and Control Systems (BACS) architecture. Instead of costly and time-consuming experiments, this paper focuses on using distributed dynamic simulations to analyze the real-time performance of network-based building control systems in ABs and to improve the functions of the BACS technology. The paper also presents the development and design of a distributed dynamic simulation environment capable of representing the BACS architecture in simulation by run-time coupling two or more different software tools over a network. The application and capability of this new dynamic simulation environment are demonstrated by an experimental design in this paper.

  17. Influence of carbohydrate supplementation on skill performance during a soccer match simulation.

    Science.gov (United States)

    Russell, Mark; Benton, David; Kingsley, Michael

    2012-07-01

    This study investigated the influence of carbohydrate supplementation on skill performance throughout exercise that replicates soccer match-play. Experimentation was conducted in a randomised, double-blind, cross-over study design. After familiarization, 15 professional academy soccer players completed a soccer match simulation incorporating passing, dribbling and shooting on two separate occasions. Participants received a 6% carbohydrate-electrolyte solution (CHO) or an electrolyte solution (PL). Precision, success rate, ball speed and an overall index (speed-precision-success; SPS) were determined for all skills. Blood samples were taken at rest, immediately before exercise, every 15 min during exercise (first half: 15, 30 and 45 min; second half: 60, 75 and 90 min), and 10 min into half-time. Carbohydrate supplementation influenced shooting performance (time × treatment interaction: p < 0.05) and dribbling performance (time × treatment interaction: p < 0.05). Carbohydrate supplementation attenuated decrements in shooting performance during simulated soccer match-play; however, further research is warranted to optimise carbohydrate supplementation regimes for high-intensity intermittent sports. Copyright © 2012 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.

  18. High fidelity simulation based team training in urology: a preliminary interdisciplinary study of technical and nontechnical skills in laparoscopic complications management.

    Science.gov (United States)

    Lee, Jason Y; Mucksavage, Phillip; Canales, Cecilia; McDougall, Elspeth M; Lin, Sharon

    2012-04-01

    Simulation based team training provides an opportunity to develop interdisciplinary communication skills and address potential medical errors in a high fidelity, low stakes environment. We evaluated the implementation of a novel simulation based team training scenario and assessed the technical and nontechnical performance of urology and anesthesiology residents. Urology residents were randomly paired with anesthesiology residents to participate in a simulation based team training scenario involving the management of 2 scripted critical events during laparoscopic radical nephrectomy: the vasovagal response to pneumoperitoneum and renal vein injury during hilar dissection. A novel kidney surgical model and a high fidelity mannequin simulator were used for the simulation. A debriefing session followed each simulation based team training scenario. Assessments of technical and nontechnical performance were made using task specific checklists and global rating scales. A total of 16 residents participated, of whom 94% rated the simulation based team training scenario as useful for communication skill training. Also, 88% of urology residents believed that the kidney surgical model was useful for technical skill training. Urology resident training level correlated with technical performance (p=0.004) and blood loss during renal vein injury management (p=0.022) but not with nontechnical performance. Anesthesia resident training level correlated with nontechnical performance (p=0.036). Urology residents consistently rated themselves higher on nontechnical performance than did faculty (p=0.033). Anesthesia residents did not differ in their self-assessment of nontechnical performance compared to faculty assessments. Residents rated the simulation based team training scenario as useful for interdisciplinary communication skill training. Urology resident training level correlated with technical performance but not with nontechnical performance.

  19. Design and evaluation of dynamic replication strategies for a high-performance data grid

    International Nuclear Information System (INIS)

    Ranganathan, K.; Foster, I.

    2001-01-01

    Physics experiments that generate large amounts of data need to be able to share them with researchers around the world. High performance grids facilitate the distribution of such data to geographically remote places. Dynamic replication can be used as a technique to reduce bandwidth consumption and access latency when accessing these huge amounts of data. The authors describe a simulation framework they have developed to model a grid scenario, which enables comparative studies of alternative dynamic replication strategies. They present preliminary results obtained with this simulator, evaluating the performance of six different replication strategies for three different kinds of access patterns. The simulation results show that the best strategy achieves significant savings in latency and bandwidth consumption when the access patterns contain a moderate amount of geographical locality.
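    A miniature version of the kind of comparison such a simulator enables — total wide-area transfer volume with no replication versus a simple LRU replica cache at the client site — can be written directly. The access trace, file size, and cache capacity below are invented, and this is not one of the six strategies from the paper:

```python
from collections import OrderedDict

# Toy comparison of two replication policies on an invented access trace
# with temporal locality. File size and cache capacity are assumptions.

FILE_SIZE_GB = 1.0

def no_replication(accesses):
    """Every access fetches the file over the wide-area network."""
    return len(accesses) * FILE_SIZE_GB

def lru_cache_replication(accesses, capacity=3):
    """Replicate on first access; keep the most recently used files."""
    cache, transferred = OrderedDict(), 0.0
    for f in accesses:
        if f in cache:
            cache.move_to_end(f)              # local replica hit, no transfer
        else:
            transferred += FILE_SIZE_GB       # remote fetch + replicate
            cache[f] = True
            if len(cache) > capacity:
                cache.popitem(last=False)     # evict least recently used
    return transferred

trace = ["a", "b", "a", "c", "a", "b", "d", "a", "b", "c"]
print(no_replication(trace))          # 10.0 GB
print(lru_cache_replication(trace))   # 5.0 GB
```

    Halving the transferred volume on a locality-rich trace is exactly the kind of bandwidth saving the record reports for the better replication strategies.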

  20. High Performance Gigabit Ethernet Switches for DAQ Systems

    CERN Document Server

    Barczyk, Artur

    2005-01-01

    Commercially available high performance Gigabit Ethernet (GbE) switches are optimized mostly for Internet and standard LAN application traffic. DAQ systems, on the other hand, usually make use of very specific traffic patterns, with e.g. deterministic arrival times. Industry's accepted "loss-less" limit of 99.999% delivered packets may still correspond to an unacceptably high loss rate for DAQ purposes, as e.g. in the case of the LHCb readout system. In addition, even switches passing this criterion under random traffic can show significantly higher loss rates when subjected to our traffic pattern, mainly due to buffer memory limitations. We have evaluated the performance of several switches, ranging from "pizza-box" devices with 24 or 48 ports up to chassis-based core switches, in a test-bed capable of emulating realistic traffic patterns as expected in the readout system of our experiment. The results obtained in our tests have been used to refine and parametrize our packet-level simulation of the complete LHCb readout network. In this paper we report on the...

  1. Hot and Hypoxic Environments Inhibit Simulated Soccer Performance and Exacerbate Performance Decrements When Combined

    Science.gov (United States)

    Aldous, Jeffrey W. F.; Chrismas, Bryna C. R.; Akubat, Ibrahim; Dascombe, Ben; Abt, Grant; Taylor, Lee

    2016-01-01

    The effects of heat and/or hypoxia have been well-documented in match-play data. However, large match-to-match variation for key physical performance measures makes environmental inferences difficult to ascertain from soccer match-play. Therefore, the present study aims to investigate the hot (HOT), hypoxic (HYP), and hot-hypoxic (HH) mediated-decrements during a non-motorized treadmill based soccer-specific simulation. Twelve male University soccer players completed three familiarization sessions and four randomized crossover experimental trials of the intermittent Soccer Performance Test (iSPT) in normoxic-temperate (CON: 18°C 50% rH), HOT (30°C; 50% rH), HYP (1000 m; 18°C 50% rH), and HH (1000 m; 30°C; 50% rH). Physical performance and its performance decrements, body temperatures (rectal, skin, and estimated muscle temperature), heart rate (HR), arterial blood oxygen saturation (SaO2), perceived exertion, thermal sensation (TS), body mass changes, blood lactate, and plasma volume were all measured. Performance decrements were similar in HOT and HYP [total distance (−4%), high-speed distance (~−8%), and variable run distance (~−12%) covered] and exacerbated in HH [total distance (−9%), high-speed distance (−15%), and variable run distance (−15%)] compared to CON. Peak sprint speed was 4% greater in HOT compared with CON and HYP and 7% greater in HH. Sprint distance covered was unchanged (p > 0.05) in HOT and HYP and only decreased in HH (−8%) compared with CON. Body mass (−2%), temperatures (+2–5%), and TS (+18%) were altered in HOT. Furthermore, SaO2 (−8%) and HR (+3%) were changed in HYP. Changes in body mass, temperatures, HR, TS, and SaO2 in HH were similar to those in HOT and HYP; however, blood lactate responses also differed (p < 0.05). All three environmental conditions impaired physical performance during iSPT. Future interventions should address the increases in TS and body temperatures to attenuate these decrements in soccer performance. PMID:26793122

  2. The reliability and validity of a soccer-specific nonmotorised treadmill simulation (intermittent soccer performance test).

    Science.gov (United States)

    Aldous, Jeffrey W F; Akubat, Ibrahim; Chrismas, Bryna C R; Watkins, Samuel L; Mauger, Alexis R; Midgley, Adrian W; Abt, Grant; Taylor, Lee

    2014-07-01

    This study investigated the reliability and validity of a novel nonmotorised treadmill (NMT)-based soccer simulation using a novel activity category called a "variable run" to quantify fatigue during high-speed running. Twelve male University soccer players completed 3 familiarization sessions and 1 peak speed assessment before completing the intermittent soccer performance test (iSPT) twice. The 2 iSPTs were separated by 6-10 days. The total distance, sprint distance, and high-speed running distance (HSD) were 8,968 ± 430 m, 980 ± 75 m and 2,122 ± 140 m, respectively. No significant difference (p > 0.05) was found between repeated trials of the iSPT for all physiological and performance variables. Reliability measures between iSPT1 and iSPT2 showed good agreement (coefficient of variation: 0.80). Furthermore, the variable run phase showed that HSD significantly decreased (p ≤ 0.05) in the last 15 minutes (89 ± 6 m) compared with the first 15 minutes (85 ± 7 m), quantifying decrements in high-speed exercise in line with the previous literature. This study validates the iSPT as an NMT-based soccer simulation against previous match-play data and shows it to be a reliable tool for assessing and monitoring physiological and performance variables in soccer players. The iSPT could be used in a number of ways, including player rehabilitation, understanding the efficacy of nutritional interventions, and the quantification of environmentally mediated decrements in soccer-specific performance.

  3. Toolbox for Urban Mobility Simulation: High Resolution Population Dynamics for Global Cities

    Science.gov (United States)

    Bhaduri, B. L.; Lu, W.; Liu, C.; Thakur, G.; Karthik, R.

    2015-12-01

In this rapidly urbanizing world, the unprecedented rate of population growth is mirrored not only by increasing demand for energy, food, water, and other natural resources, but also by detrimental impacts on environmental and human security. Transportation simulations are frequently used for mobility assessment in urban planning, traffic operation, and emergency management. Previous research, ranging from purely analytical techniques to simulations capturing behavior, has investigated questions and scenarios regarding the relationships among energy, emissions, air quality, and transportation. The primary limitations of past attempts have been the availability of input data, useful "energy and behavior focused" models, validation data, and adequate computational capability to allow an adequate understanding of the interdependencies of our transportation system. With the increasing availability and quality of traditional and crowdsourced data, we have utilized the OpenStreetMap road network and integrated high-resolution population data with traffic simulation to create a Toolbox for Urban Mobility Simulations (TUMS) at global scale. TUMS consists of three major components: data processing, traffic simulation models, and Internet-based visualizations. It integrates OpenStreetMap, LandScanTM population, and other open data (Census Transportation Planning Products, National Household Travel Survey, etc.) to generate both normal traffic operation and emergency evacuation scenarios. TUMS integrates TRANSIMS and MITSIM as traffic simulation engines, which are open-source and widely accepted for scalable traffic simulations. A consistent data and simulation platform allows quick adaptation to various geographic areas, as has been demonstrated for multiple cities across the world. We are combining the strengths of geospatial data sciences, high performance simulations, transportation planning, and emissions, vehicle and energy technology development to design and develop a simulation

  4. Time Step Considerations when Simulating Dynamic Behavior of High Performance Homes

    Energy Technology Data Exchange (ETDEWEB)

    Tabares-Velasco, Paulo Cesar

    2016-09-01

Building energy simulations, especially those concerning pre-cooling strategies and cooling/heating peak demand management, require careful analysis and detailed understanding of building characteristics. Accurate modeling of the building thermal response and of material properties for thermally massive walls or advanced materials like phase change materials (PCMs) is critically important.
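Why the time step matters can be illustrated with a minimal lumped RC model of a thermally massive zone integrated with explicit Euler; all parameter values are illustrative assumptions, not taken from the report:

```python
import math

def simulate_room(dt, t_end, R=5.0, C=1e4, T_out=0.0, T0=20.0):
    """Explicit-Euler integration of a lumped RC building model:
    C dT/dt = (T_out - T) / R.

    With tau = R*C, explicit Euler is stable only for dt < 2*tau and
    accurate only for dt << tau."""
    T, t = T0, 0.0
    while t < t_end:
        T += dt * (T_out - T) / (R * C)
        t += dt
    return T

tau = 5.0 * 1e4                           # time constant R*C (s)
exact = 20.0 * math.exp(-3.0)             # analytic value after 3 time constants
coarse = simulate_room(dt=tau / 2, t_end=3 * tau)
fine = simulate_room(dt=tau / 500, t_end=3 * tau)
```

With dt = tau/2 the scheme is still stable (dt < 2*tau) but the decay toward T_out is badly distorted, which is exactly the kind of error that corrupts pre-cooling and peak-demand studies.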

  5. Disruption simulation experiment using high-frequency rastering electron beam as the heat source

    International Nuclear Information System (INIS)

    Yamazaki, S.; Seki, M.

    1987-01-01

A disruption is a serious event which can reduce the lifetime of plasma-interactive components, so the effects of the resulting high heat flux on the wall materials must be clearly identified. The authors performed disruption simulation experiments to investigate melting, evaporation, and crack initiation behaviors using an electron beam facility as the heat source. The facility was improved with a high-frequency beam rastering system which provided a spatially and temporally uniform heat flux on wider test surfaces. Along with the experiments, thermal and mechanical analyses were also performed. A two-dimensional disruption thermal analysis code (DREAM) was developed for the analyses.
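The scale of the thermal problem can be checked against the classical semi-infinite-solid solution for a constant surface heat flux; the flux, pulse length, and material properties below are illustrative assumptions, not values from the experiments:

```python
import math

def surface_temp_rise(q_flux, t, k, rho, cp):
    """Surface temperature rise (K) of a semi-infinite solid under a
    constant heat flux q_flux (W/m^2) applied for time t (s):

        dT = 2 * q'' * sqrt(t / (pi * k * rho * cp))
    """
    return 2.0 * q_flux * math.sqrt(t / (math.pi * k * rho * cp))

# Illustrative disruption-like pulse on stainless steel (assumed properties)
q = 100e6                            # 100 MW/m^2 heat flux
k, rho, cp = 16.0, 8000.0, 500.0     # W/(m K), kg/m^3, J/(kg K)
dT = surface_temp_rise(q, 1e-3, k, rho, cp)   # 1 ms pulse
```

Even a millisecond-scale pulse at this flux produces a surface excursion of several hundred kelvin, which is why melting and crack initiation are the behaviors of interest.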

  6. Analysis of human performance observed under simulated emergencies of nuclear power plants

    International Nuclear Information System (INIS)

    Park, Jin Kyun; Jung, Won Dea; Kim, Jae Whan; Ha, Jae Joo

    2005-01-01

Previous studies have continuously and commonly revealed that human performance is a decisive factor affecting the safety of complicated process systems. Subsequently, extensive effort has been spent on suggesting serviceable countermeasures for human performance related problems under emergencies. However, several obstacles, including the very limited amount of available data, have hindered researchers from elucidating effective ways to cope with human performance related problems. In this study, human performance data under simulated emergencies were extracted using a full-scope simulator located in the reference NPP. The main purpose of this study is to provide plant-specific and domain-specific human performance data that can be used to anticipate human performance related problems under emergencies. To accomplish this goal, over 100 records collected from retraining sessions for licensed MCR operators were analyzed using time-line and protocol analysis techniques. As a result, many kinds of useful information that can play a remarkable role in scrutinizing human performance related problems have been secured. Although caution is needed in predicting human performance in real situations on the basis of simulated ones, the simulator remains a basic tool for observing human behavior under emergencies. Thus, it is strongly believed that the human performance data obtained from this study will be a concrete foundation for scrutinizing the change of human performance under emergencies.

  7. Educational program in crisis management for cardiac surgery teams including high realism simulation.

    Science.gov (United States)

    Stevens, Louis-Mathieu; Cooper, Jeffrey B; Raemer, Daniel B; Schneider, Robert C; Frankel, Allan S; Berry, William R; Agnihotri, Arvind K

    2012-07-01

Cardiac surgery demands effective teamwork for safe, high-quality care. The objective of this pilot study was to develop a comprehensive program to sharpen performance of experienced cardiac surgical teams in acute crisis management. We developed and implemented an educational program for cardiac surgery based on high-realism acute crisis simulation scenarios and an interactive whole-unit workshop. The impact of these interventions was assessed with postintervention questionnaires, preintervention and 6-month postintervention surveys, and structured interviews. The realism of the acute crisis simulation scenarios gradually improved; most participants rated both the simulation and the whole-unit workshop as very good or excellent. Repeat simulation training was recommended every 6 to 12 months by 82% of the participants. Participants in the interactive workshop identified 2 areas of highest priority: encouraging speaking up about critical information and interprofessional information sharing. They also stressed the importance of briefings, early communication of the surgical plan, knowing the members of the team, and continued simulation for practice. The pre/post survey response rates were 70% (55/79) and 66% (52/79), respectively. The concept of working as a team improved between surveys (P = .028), with a trend for improvement in gaining a common understanding of the plan before a procedure (P = .075) and in appropriate resolution of disagreements (P = .092). Interviewees reported that the training had a positive effect on their personal behaviors and patient care, including speaking up more readily and communicating more clearly. Comprehensive team training using simulation and a whole-unit interactive workshop can be successfully deployed for experienced cardiac surgery teams with demonstrable benefits in participants' perception of team performance. Copyright © 2012 The American Association for Thoracic Surgery. Published by Mosby, Inc. All rights reserved.

  8. Performance Simulations for a Spaceborne Methane Lidar Mission

    Science.gov (United States)

    Kiemle, C.; Kawa, Stephan Randolph; Quatrevalet, Mathieu; Browell, Edward V.

    2014-01-01

Future spaceborne lidar measurements of key anthropogenic greenhouse gases are expected to close current observational gaps, particularly over remote, polar, and aerosol-contaminated regions, where current in situ and passive remote sensing observation techniques have difficulties. For methane, a "Methane Remote Lidar Mission" was proposed by Deutsches Zentrum fuer Luft- und Raumfahrt and Centre National d'Etudes Spatiales in the frame of a German-French climate monitoring initiative. Simulations assess the performance of this mission with the help of Moderate Resolution Imaging Spectroradiometer and Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations of the earth's surface albedo and atmospheric optical depth. These are key environmental parameters for integrated path differential absorption lidar, which uses the surface backscatter to measure the total atmospheric methane column. Results show that a lidar with an average optical power of 0.45 W at 1.6 µm wavelength and a telescope diameter of 0.55 m, installed on a low Earth orbit platform (506 km), will measure methane columns at precisions of 1.2%, 1.7%, and 2.1% over land, water, and snow or ice surfaces, respectively, for monthly aggregated measurement samples within areas of 50 × 50 km². Globally, the mean precision for the simulated year 2007 is 1.6%, with a standard deviation of 0.7%. At high latitudes, a lower reflectance due to snow and ice is compensated by denser measurements, owing to the orbital pattern. Over key methane source regions such as densely populated areas, boreal and tropical wetlands, or permafrost, our simulations show that the measurement precision will be between 1 and 2%.
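The monthly aggregation behind these precision numbers follows the standard 1/sqrt(N) averaging of independent random errors; the single-shot precision and shot count below are hypothetical, not mission parameters:

```python
import math

def aggregated_precision(single_shot_precision, n_shots):
    """Relative precision (%) of an average over n_shots independent
    measurements: sigma_N = sigma_1 / sqrt(N)."""
    return single_shot_precision / math.sqrt(n_shots)

# Hypothetical values: 50% single-shot precision averaged over ~1000
# shots falling in one 50 x 50 km^2 cell during a month
monthly = aggregated_precision(50.0, 1000)
```

This scaling is also why the denser high-latitude sampling produced by the orbital pattern can compensate for the lower reflectance of snow and ice.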

  9. Facility/equipment performance evaluation using microcomputer simulation analysis

    International Nuclear Information System (INIS)

    Chockie, A.D.; Hostick, C.J.

    1985-08-01

A computer simulation analysis model was developed at the Pacific Northwest Laboratory to assist in assuring the adequacy of the Monitored Retrievable Storage facility design to meet the specified spent nuclear fuel throughput requirements. The microcomputer-based model was applied to the analysis of material flow, equipment capability, and facility layout. The simulation analysis evaluated uncertainties concerning both facility throughput requirements and process duration times as part of the development of a comprehensive estimate of facility performance. The evaluations provided feedback into the design review task to identify areas where design modifications should be considered.
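The kind of throughput-under-uncertainty question such a model answers can be sketched as a small Monte Carlo simulation; the distribution and all durations below are hypothetical, not taken from the MRS analysis:

```python
import random

def simulate_throughput(n_runs, hours_per_year=5000, seed=1):
    """Monte Carlo estimate of annual cask throughput when the per-cask
    process time is uncertain (triangular distribution, hours)."""
    random.seed(seed)
    results = []
    for _ in range(n_runs):
        hours, casks = 0.0, 0
        while True:
            duration = random.triangular(6.0, 14.0, 9.0)  # low, high, mode
            if hours + duration > hours_per_year:
                break
            hours += duration
            casks += 1
        results.append(casks)
    return results

runs = simulate_throughput(200)
mean_casks = sum(runs) / len(runs)
```

Repeating the year many times turns uncertain process durations into a distribution of achievable throughput, which can then be compared against the design requirement.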

  10. Development and testing of high-performance fuel pin simulators for boiling experiments in liquid metal flow

    International Nuclear Information System (INIS)

    Casal, V.

    1976-01-01

Local and integral boiling events in the core of sodium-cooled fast breeder reactors involve phenomena that are still unknown. Therefore, out-of-pile boiling experiments have been performed at GfK using electrically heated dummies of fuel element bundles. The success of these tests and the amount of information derived from them depend exclusively on the successful simulation of the fuel pins by electrically heated rods as regards the essential physical properties. The report deals with the development and testing of heater rods for sodium boiling experiments in bundles including up to 91 heated pins.

  11. Numerical simulation and experimental research of the integrated high-power LED radiator

    Science.gov (United States)

    Xiang, J. H.; Zhang, C. L.; Gan, Z. J.; Zhou, C.; Chen, C. G.; Chen, S.

    2017-01-01

Thermal management has become an urgent problem with the increasing power and integration density of LED (light emitting diode) chips. In order to eliminate the contact resistance of the radiator, this paper presented an integrated high-power LED radiator based on phase-change heat transfer, which realized a seamless connection between the vapor chamber and the cooling fins. The radiator was optimized by combining numerical simulation and experimental research. The effects of the chamber diameter and the fin parameters on heat dissipation performance were analyzed, and the numerical simulation results were compared with values measured in experiments. The results showed that the factors affecting radiator performance were, in descending order of influence, the fin thickness, the fin number, the fin height, and the chamber diameter.
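The influence of fin geometry on dissipation can be illustrated with the textbook straight-fin model with an adiabatic tip; the dimensions and coefficients below are assumed for illustration, not taken from the paper:

```python
import math

def fin_heat_rate(h, k, thickness, width, length, dT):
    """Heat rate (W) from a straight rectangular fin with an adiabatic tip:

        q = sqrt(h*P*k*Ac) * dT * tanh(m*L),  m = sqrt(h*P / (k*Ac))

    h: convection coefficient, k: fin conductivity, dT: base excess
    temperature over ambient."""
    P = 2.0 * (width + thickness)   # fin perimeter (m)
    Ac = width * thickness          # fin cross-section (m^2)
    m = math.sqrt(h * P / (k * Ac))
    return math.sqrt(h * P * k * Ac) * dT * math.tanh(m * length)

# Illustrative aluminium fin on an LED vapor chamber (assumed values)
q_thin = fin_heat_rate(h=25.0, k=200.0, thickness=0.001,
                       width=0.04, length=0.03, dT=50.0)
q_thick = fin_heat_rate(h=25.0, k=200.0, thickness=0.003,
                        width=0.04, length=0.03, dT=50.0)
```

Comparing geometries this way shows why fin thickness can dominate the ranking: it changes both the conduction cross-section and the fin efficiency at once.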

  12. Neurocognitive Correlates of Young Drivers' Performance in a Driving Simulator.

    Science.gov (United States)

    Guinosso, Stephanie A; Johnson, Sara B; Schultheis, Maria T; Graefe, Anna C; Bishai, David M

    2016-04-01

Differences in neurocognitive functioning may contribute to driving performance among young drivers. However, few studies have examined this relation. This pilot study investigated whether common neurocognitive measures were associated with driving performance among young drivers in a driving simulator. Young drivers (mean age 19.8 years, standard deviation [SD] = 1.9; N = 74) participated in a battery of neurocognitive assessments measuring general intellectual capacity (Full-Scale Intelligence Quotient, FSIQ) and executive functioning, including the Stroop Color-Word Test (cognitive inhibition), Wisconsin Card Sort Test-64 (cognitive flexibility), and Attention Network Task (alerting, orienting, and executive attention). Participants then drove in a simulated vehicle under two conditions: a baseline condition and a driving challenge. During the driving challenge, participants completed a verbal working memory task to increase demand on executive attention. Multiple regression models were used to evaluate the relations between the neurocognitive measures and driving performance under the two conditions. FSIQ, cognitive inhibition, and alerting were associated with better driving performance at baseline. FSIQ and cognitive inhibition were also associated with better driving performance during the verbal challenge. Measures of cognitive flexibility, orienting, and conflict executive control were not associated with driving performance under either condition. FSIQ and, to some extent, measures of executive function are associated with driving performance in a driving simulator. Further research is needed to determine if executive function is associated with more advanced driving performance under conditions that demand greater cognitive load. Copyright © 2016 Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.

  13. Statistics of Deep Convection in the Congo Basin Derived From High-Resolution Simulations.

    Science.gov (United States)

    White, B.; Stier, P.; Kipling, Z.; Gryspeerdt, E.; Taylor, S.

    2016-12-01

Convection transports moisture, momentum, heat and aerosols through the troposphere, and so the temporal variability of convection is a major driver of global weather and climate. The Congo basin is home to some of the most intense convective activity on the planet and is under strong seasonal influence of biomass burning aerosol. However, deep convection in the Congo basin remains understudied compared to other regions of tropical storm systems, especially the neighbouring, relatively well-understood West African climate system. We use the WRF model to perform a high-resolution, cloud-system-resolving simulation to investigate convective storm systems in the Congo. Our setup pushes the boundaries of current computational resources, using a 1 km grid length over a domain covering millions of square kilometres and for a time period of one month. This allows us to draw statistical conclusions on the nature of the simulated storm systems. Comparing data from satellite observations and the model enables us to quantify the diurnal variability of deep convection in the Congo basin, and to evaluate our simulations despite the lack of in-situ observational data. This provides a more comprehensive analysis of the diurnal cycle than has previously been shown. Further, we show that high-resolution convection-permitting simulations performed over near-seasonal timescales can be used in conjunction with satellite observations as an effective tool to evaluate new convection parameterisations.

  14. Characterization of a novel, highly integrated tubular solid oxide fuel cell system using high-fidelity simulation tools

    Science.gov (United States)

    Kattke, K. J.; Braun, R. J.

    2011-08-01

A novel, highly integrated tubular SOFC system intended for small-scale power is characterized through a series of sensitivity analyses and parametric studies using a previously developed high-fidelity simulation tool. The high-fidelity tubular SOFC system modeling tool is utilized to simulate system-wide performance and capture the thermofluidic coupling between system components. Stack performance prediction is based on 66 anode-supported tubular cells individually evaluated with a 1-D electrochemical cell model coupled to a 3-D computational fluid dynamics model of the cell surroundings. Radiation is the dominant stack cooling mechanism, accounting for 66-92% of total heat loss at the outer surface of all cells at baseline conditions. An average temperature difference of nearly 125 °C provides a large driving force for radiation heat transfer from the stack to the cylindrical enclosure surrounding the tube bundle. Consequently, cell power and voltage disparities within the stack are largely a function of the radiation view factor from an individual tube to the surrounding stack can wall. The cells, which are connected in electrical series, vary in power from 7.6 to 10.8 W (with a standard deviation σ = 1.2 W), and cell voltage varies from 0.52 to 0.73 V (with σ = 81 mV) at the simulation baseline conditions. High cell voltage and power outputs directly correspond to tubular cells with the smallest radiation view factor to the enclosure wall, and vice versa for tubes exhibiting low performance. Results also reveal effective control variables and operating strategies, along with an improved understanding of the effect that design modifications have on system performance. By decreasing the air flowrate into the system by 10%, the stack can wall temperature increases by about 6%, which increases the minimum cell voltage to 0.62 V and reduces deviations in cell power and voltage by 31%. A low baseline fuel utilization is increased by decreasing the
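The dependence of cell cooling on the view factor can be sketched with a gray-surface radiation estimate; the geometry, temperatures (a cell near 800 °C with a wall about 125 K cooler, as in the abstract), and emissivity are assumed for illustration:

```python
import math

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiative_loss(area, view_factor, t_cell, t_wall, emissivity=0.9):
    """Net radiative heat rate (W) from a tubular cell surface at t_cell (K)
    to the enclosure wall at t_wall (K); a gray-surface sketch that folds
    reflections into a single effective emissivity."""
    return emissivity * SIGMA * area * view_factor * (t_cell**4 - t_wall**4)

# Assumed geometry: 1 cm diameter, 15 cm long tube
area = math.pi * 0.01 * 0.15
q_high_F = radiative_loss(area, view_factor=0.6, t_cell=1073.0, t_wall=948.0)
q_low_F = radiative_loss(area, view_factor=0.2, t_cell=1073.0, t_wall=948.0)
```

Because the net exchange is linear in the view factor, a tube seeing three times the view factor to the cool wall loses three times the heat, which is the mechanism behind the power and voltage spread reported above.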

  15. Component-based software for high-performance scientific computing

    Energy Technology Data Exchange (ETDEWEB)

    Alexeev, Yuri; Allan, Benjamin A; Armstrong, Robert C; Bernholdt, David E; Dahlgren, Tamara L; Gannon, Dennis; Janssen, Curtis L; Kenny, Joseph P; Krishnan, Manojkumar; Kohl, James A; Kumfert, Gary; McInnes, Lois Curfman; Nieplocha, Jarek; Parker, Steven G; Rasmussen, Craig; Windus, Theresa L

    2005-01-01

    Recent advances in both computational hardware and multidisciplinary science have given rise to an unprecedented level of complexity in scientific simulation software. This paper describes an ongoing grass roots effort aimed at addressing complexity in high-performance computing through the use of Component-Based Software Engineering (CBSE). Highlights of the benefits and accomplishments of the Common Component Architecture (CCA) Forum and SciDAC ISIC are given, followed by an illustrative example of how the CCA has been applied to drive scientific discovery in quantum chemistry. Thrusts for future research are also described briefly.

  16. Component-based software for high-performance scientific computing

    International Nuclear Information System (INIS)

    Alexeev, Yuri; Allan, Benjamin A; Armstrong, Robert C; Bernholdt, David E; Dahlgren, Tamara L; Gannon, Dennis; Janssen, Curtis L; Kenny, Joseph P; Krishnan, Manojkumar; Kohl, James A; Kumfert, Gary; McInnes, Lois Curfman; Nieplocha, Jarek; Parker, Steven G; Rasmussen, Craig; Windus, Theresa L

    2005-01-01

Recent advances in both computational hardware and multidisciplinary science have given rise to an unprecedented level of complexity in scientific simulation software. This paper describes an ongoing grass roots effort aimed at addressing complexity in high-performance computing through the use of Component-Based Software Engineering (CBSE). Highlights of the benefits and accomplishments of the Common Component Architecture (CCA) Forum and SciDAC ISIC are given, followed by an illustrative example of how the CCA has been applied to drive scientific discovery in quantum chemistry. Thrusts for future research are also described briefly.

  17. Simulation-based Assessment to Reliably Identify Key Resident Performance Attributes.

    Science.gov (United States)

    Blum, Richard H; Muret-Wagstaff, Sharon L; Boulet, John R; Cooper, Jeffrey B; Petrusa, Emil R; Baker, Keith H; Davidyuk, Galina; Dearden, Jennifer L; Feinstein, David M; Jones, Stephanie B; Kimball, William R; Mitchell, John D; Nadelberg, Robert L; Wiser, Sarah H; Albrecht, Meredith A; Anastasi, Amanda K; Bose, Ruma R; Chang, Laura Y; Culley, Deborah J; Fisher, Lauren J; Grover, Meera; Klainer, Suzanne B; Kveraga, Rikante; Martel, Jeffrey P; McKenna, Shannon S; Minehart, Rebecca D; Mitchell, John D; Mountjoy, Jeremi R; Pawlowski, John B; Pilon, Robert N; Shook, Douglas C; Silver, David A; Warfield, Carol A; Zaleski, Katherine L

    2018-04-01

    Obtaining reliable and valid information on resident performance is critical to patient safety and training program improvement. The goals were to characterize important anesthesia resident performance gaps that are not typically evaluated, and to further validate scores from a multiscenario simulation-based assessment. Seven high-fidelity scenarios reflecting core anesthesiology skills were administered to 51 first-year residents (CA-1s) and 16 third-year residents (CA-3s) from three residency programs. Twenty trained attending anesthesiologists rated resident performances using a seven-point behaviorally anchored rating scale for five domains: (1) formulate a clear plan, (2) modify the plan under changing conditions, (3) communicate effectively, (4) identify performance improvement opportunities, and (5) recognize limits. A second rater assessed 10% of encounters. Scores and variances for each domain, each scenario, and the total were compared. Low domain ratings (1, 2) were examined in detail. Interrater agreement was 0.76; reliability of the seven-scenario assessment was r = 0.70. CA-3s had a significantly higher average total score (4.9 ± 1.1 vs. 4.6 ± 1.1, P = 0.01, effect size = 0.33). CA-3s significantly outscored CA-1s for five of seven scenarios and domains 1, 2, and 3. CA-1s had a significantly higher proportion of worrisome ratings than CA-3s (chi-square = 24.1, P < 0.01, effect size = 1.50). Ninety-eight percent of residents rated the simulations more educational than an average day in the operating room. Sensitivity of the assessment to CA-1 versus CA-3 performance differences for most scenarios and domains supports validity. No differences, by experience level, were detected for two domains associated with reflective practice. Smaller score variances for CA-3s likely reflect a training effect; however, worrisome performance scores for both CA-1s and CA-3s suggest room for improvement.

  18. High-fidelity simulation among bachelor students in simulation groups and use of different roles.

    Science.gov (United States)

    Thidemann, Inger-Johanne; Söderhamn, Olle

    2013-12-01

Cost limitations might challenge the use of high-fidelity simulation as a teaching-learning method. This article presents the results of a Norwegian project including two simulation studies in which simulation teaching and learning were studied among students in the second year of a three-year bachelor nursing programme. The students were organised into small simulation groups with different roles: nurse, physician, family member and observer. Based on experiences in different roles, the students evaluated the simulation design characteristics and educational practices used in the simulation. In addition, three simulation outcomes were measured: knowledge (learning), Student Satisfaction and Self-confidence in Learning. The simulation was evaluated to be a valuable teaching-learning method to develop professional understanding and insight independent of roles. Overall, the students rated Student Satisfaction and Self-confidence in Learning as high. Knowledge about the specific patient focus increased after the simulation activity. Students can develop practical, communication and collaboration skills through experiencing the nurse's role. Assuming the observer role, students have the potential for vicarious learning, which could increase the learning value. Both methods of learning (practical experience or vicarious learning) may bridge the gap between theory and practice and contribute to the development of skills in reflective and critical thinking. Copyright © 2012 Elsevier Ltd. All rights reserved.

  19. Quality and sensitivity of high-resolution numerical simulation of urban heat islands

    Science.gov (United States)

    Li, Dan; Bou-Zeid, Elie

    2014-05-01

High-resolution numerical simulations of the urban heat island (UHI) effect with the widely-used Weather Research and Forecasting (WRF) model are assessed. Both the sensitivity of the results to the simulation setup, and the quality of the simulated fields as representations of the real world, are investigated. Results indicate that the WRF-simulated surface temperatures are more sensitive to the planetary boundary layer (PBL) scheme choice during nighttime, and more sensitive to the surface thermal roughness length parameterization during daytime. The urban surface temperatures simulated by WRF are also highly sensitive to the urban canopy model (UCM) used. The implementation in this study of an improved UCM (the Princeton UCM or PUCM) that allows the simulation of heterogeneous urban facets and of key hydrological processes, together with the so-called CZ09 parameterization for the thermal roughness length, significantly reduces the bias in the simulated surface temperatures. Changing UCMs and PBL schemes does not significantly alter the performance of WRF in reproducing bulk boundary layer temperature profiles. The results illustrate the wide range of urban environmental conditions that various configurations of WRF can produce, and the significant biases that should be assessed before inferences are made based on WRF outputs. The optimal set-up of WRF-PUCM developed in this paper also paves the way for a confident exploration of the city-scale impacts of UHI mitigation strategies in the companion paper (Li et al 2014).

  20. Verification of high resolution simulation of precipitation and wind in Portugal

    Science.gov (United States)

    Menezes, Isilda; Pereira, Mário; Moreira, Demerval; Carvalheiro, Luís; Bugalho, Lourdes; Corte-Real, João

    2017-04-01

Demand for energy and freshwater continues to grow as the global population and its demands increase. Precipitation feeds the freshwater ecosystems which provide a wealth of goods and services for society, and river flow sustains native species and natural ecosystem functions. The adoption of wind and hydro-electric power supplies will sustain energy demands without restricting economic growth under accelerated-policy scenarios. However, the international meteorological observation network is not sufficiently dense to directly support high-resolution climatic research. In this sense, coupled global and regional atmospheric models constitute the most appropriate physical and numerical tool for weather forecasting and downscaling on high-resolution grids, with the capacity to mitigate problems resulting from the lack of observed data and from measuring errors. Thus, this study aims to calibrate and validate the WRF regional model for the simulation of precipitation and wind fields on a high-spatial-resolution grid covering Portugal. The simulations were performed in two-way nesting with three grids of increasing resolution (60 km, 20 km and 5 km), and model performance was assessed for the winter and summer months (January and July), using input variables from two different reanalysis and forecast databases (ERA-Interim and NCEP-FNL) and different forcing schemes. The verification procedure included: (i) the use of several statistical error estimators, correlation-based measures and relative error descriptors; and (ii) an observed dataset composed of time series of hourly precipitation, wind speed and direction, provided by the Portuguese meteorological institute for a comprehensive set of weather stations. Main results suggested the good ability of WRF to: (i) reproduce the spatial patterns of the mean and total observed fields; (ii) do so with relatively small bias and other errors; and (iii) achieve good temporal correlation. These findings are in good
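The error estimators and correlation measures used in such a verification can be sketched as follows; the hourly series below are hypothetical, not data from the Portuguese stations:

```python
import math

def verification_stats(obs, sim):
    """Bias (mean error), RMSE, and Pearson correlation between observed
    and simulated series of equal length."""
    n = len(obs)
    bias = sum(s - o for o, s in zip(obs, sim)) / n
    rmse = math.sqrt(sum((s - o) ** 2 for o, s in zip(obs, sim)) / n)
    mo, ms = sum(obs) / n, sum(sim) / n
    cov = sum((o - mo) * (s - ms) for o, s in zip(obs, sim))
    var_o = sum((o - mo) ** 2 for o in obs)
    var_s = sum((s - ms) ** 2 for s in sim)
    corr = cov / math.sqrt(var_o * var_s)
    return bias, rmse, corr

# Hypothetical hourly precipitation (mm) at one station
observed = [0.0, 0.2, 1.5, 3.0, 2.2, 0.4, 0.0, 0.1]
simulated = [0.1, 0.3, 1.2, 2.6, 2.5, 0.6, 0.0, 0.0]
bias, rmse, corr = verification_stats(observed, simulated)
```

Reporting bias, RMSE, and correlation together separates systematic offsets from random scatter and from timing errors, which is why verification studies typically quote all three.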

  1. Concurrent Probabilistic Simulation of High Temperature Composite Structural Response

    Science.gov (United States)

    Abdi, Frank

    1996-01-01

A computational structural/material analysis and design tool which would meet industry's future demand for expedience and reduced cost is presented. This unique software, 'GENOA', is dedicated to parallel and high-speed analysis to perform probabilistic evaluation of the high-temperature composite response of aerospace systems. The development is based on detailed integration and modification of diverse fields of specialized analysis techniques and mathematical models to combine their latest innovative capabilities into a commercially viable software package. The technique is specifically designed to exploit the availability of processors to perform computationally intense probabilistic analysis assessing uncertainties in structural reliability analysis and composite micromechanics. The primary objectives achieved in the development were: (1) utilization of the power of parallel processing and static/dynamic load balancing optimization to make the complex simulation of the structure, material and processing of high-temperature composites affordable; (2) computational integration and synchronization of probabilistic mathematics, structural/material mechanics and parallel computing; (3) implementation of an innovative multi-level domain decomposition technique to identify the inherent parallelism and increase convergence rates through high- and low-level processor assignment; (4) creation of the framework for a portable parallel architecture for machine-independent Multiple Instruction Multiple Data (MIMD), Single Instruction Multiple Data (SIMD), hybrid and distributed workstation types of computers; and (5) market evaluation. The results of the Phase 2 effort provide a good basis for continuation and warrant a Phase 3 government and industry partnership.

  2. How to use MPI communication in highly parallel climate simulations more easily and more efficiently.

    Science.gov (United States)

    Behrens, Jörg; Hanke, Moritz; Jahns, Thomas

    2014-05-01

In this talk we present a way to facilitate efficient use of MPI communication for developers of climate models. Exploiting the performance potential of today's highly parallel supercomputers with real-world simulations is a complex task. This is partly caused by the low-level nature of the MPI communication library, which is the dominant communication tool at least for inter-node communication. In order to manage the complexity of the task, climate simulations with non-trivial communication patterns often use an internal abstraction layer above MPI without exploiting the benefits of communication aggregation or MPI datatypes. The solution we propose for the complexity and performance problems is the communication library YAXT. This library is built on top of MPI and takes high-level descriptions of arbitrary domain decompositions and automatically derives an efficient collective data exchange. Several exchanges can be aggregated in order to reduce latency costs. Examples are given which demonstrate the simplicity and the performance gains for selected climate applications.

  3. Unravelling the structure of matter on high-performance computers

    International Nuclear Information System (INIS)

    Kieu, T.D.; McKellar, B.H.J.

    1992-11-01

    The various phenomena and the different forms of matter in nature are believed to be the manifestation of only a handful of fundamental building blocks, the elementary particles, which interact through the four fundamental forces. In the study of the structure of matter at this level one has to consider forces which are not sufficiently weak to be treated as small perturbations to the system; an example is the strong force that binds the nucleons together. High-performance computers, both vector and parallel machines, have facilitated the necessary non-perturbative treatments. The principles and techniques of computer simulations applied to Quantum Chromodynamics are explained; examples include the strong interactions and the calculation of the masses of nucleons and their decay rates. Some commercial and special-purpose high-performance machines for such calculations are also mentioned. 3 refs., 2 tabs

  4. High performance visual display for HENP detectors

    International Nuclear Information System (INIS)

    McGuigan, Michael; Smith, Gordon; Spiletic, John; Fine, Valeri; Nevski, Pavel

    2001-01-01

    A high-end visual display for High Energy Nuclear Physics (HENP) detectors is necessary because of the sheer size and complexity of the detectors. For BNL this display is of special interest because of STAR and ATLAS. To load, rotate, query, and debug simulation code with a modern detector simply takes too long even on a powerful workstation. To visualize the HENP detectors with maximal performance we have developed software with the following characteristics. We develop a visual display of HENP detectors on a BNL multiprocessor visualization server at multiple levels of detail. We work with a general and generic detector framework consistent with ROOT, GAUDI, etc., to avoid conflicts with the many graphics development groups associated with specific detectors like STAR and ATLAS. We develop advanced OpenGL features such as transparency and polarized stereoscopy. We enable collaborative viewing of detectors and events by directly running the analysis in the BNL stereoscopic theatre. We construct enhanced interactive controls, including the ability to slice, search and mark areas of the detector. We incorporate the ability to make a high-quality still image of a view of the detector, and to generate animations and fly-throughs of the detector with output to MPEG or VRML models. We develop data compression hardware and software so that remote interactive visualization will be possible among dispersed collaborators. We obtain a real-time visual display for events accumulated during simulations.

  5. Transient performance simulation of aircraft engine integrated with fuel and control systems

    International Nuclear Information System (INIS)

    Wang, C.; Li, Y.G.; Yang, B.Y.

    2017-01-01

    Highlights: • A new performance simulation method for engine hydraulic fuel systems is introduced. • The time delay of engine performance due to the fuel system model is noticeable but small. • The method provides details of fuel system behavior in engine transient processes. • The method could be used to support engine and fuel system designs. - Abstract: A new method for the simulation of gas turbine fuel systems based on an inter-component volume method has been developed. It is able to simulate the performance of each of the hydraulic components of a fuel system using physics-based models, which potentially offers more accurate results than transfer functions. A transient performance simulation system has been set up for gas turbine engines based on an inter-component volume (ICV) method. A proportional-integral (PI) control strategy is used for the simulation of the engine controller. An integrated model of an engine and its control and hydraulic fuel systems has been set up to investigate their coupling effect during engine transient processes. The developed simulation system has been applied to a model aero engine. The results show that the delay of the engine transient response due to the inclusion of the fuel system model is noticeable although relatively small. The developed method is generic and can be applied to other gas turbines and their control and fuel systems.
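
    The PI control strategy mentioned above can be sketched in a few lines. The plant below is an assumed first-order lag standing in for the engine/fuel-system dynamics, not the paper's ICV model; gains and time constants are made up for illustration.

```python
def simulate_pi(setpoint, kp, ki, tau, dt, steps):
    """Discrete PI loop driving a first-order plant dy/dt = (u - y)/tau."""
    y, integral = 0.0, 0.0
    for _ in range(steps):
        error = setpoint - y
        integral += error * dt
        u = kp * error + ki * integral   # controller output (e.g. fuel demand)
        y += (u - y) / tau * dt          # explicit-Euler plant update
    return y

final = simulate_pi(setpoint=1.0, kp=2.0, ki=1.0, tau=0.5, dt=0.01, steps=2000)
print(final)  # settles near the setpoint 1.0 thanks to the integral term
```

    The integral term drives the steady-state error to zero, which is why a pure proportional controller would not be sufficient for tight shaft-speed or fuel-flow tracking.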

  6. High-speed LWR transients simulation for optimizing emergency response

    International Nuclear Information System (INIS)

    Wulff, W.; Cheng, H.S.; Lekach, S.V.; Mallen, A.N.; Stritar, A.

    1984-01-01

    The purpose of computer-assisted emergency response in nuclear power plants, and the requirements for achieving such a response, are presented. An important requirement is the attainment of realistic high-speed plant simulations at the reactor site. Currently pursued development programs for plant simulations are reviewed. Five modeling principles are established and a criterion is presented for selecting numerical procedures and efficient computer hardware to achieve high-speed simulations. A newly developed technology for high-speed power plant simulation is described and results are presented. It is shown that simulation speeds ten times greater than real-time process-speeds are possible, and that plant instrumentation can be made part of the computational loop in a small, on-site minicomputer. Additional technical issues are presented which must still be resolved before the newly developed technology can be implemented in a nuclear power plant

  7. Imaging Performance Analysis of Simbol-X with Simulations

    Science.gov (United States)

    Chauvin, M.; Roques, J. P.

    2009-05-01

    Simbol-X is an X-ray telescope operating in formation flight. This means that its optical performance will strongly depend on the drift of the two spacecraft and on its ability to measure these drifts for image reconstruction. We built a dynamical ray-tracing code to study the impact of these parameters on the optical performance of Simbol-X (see Chauvin et al., these proceedings). Using this simulation tool, we have conducted detailed analyses of the impact of the different parameters on the imaging performance of the Simbol-X telescope.

  8. Imaging Performance Analysis of Simbol-X with Simulations

    International Nuclear Information System (INIS)

    Chauvin, M.; Roques, J. P.

    2009-01-01

    Simbol-X is an X-ray telescope operating in formation flight. This means that its optical performance will strongly depend on the drift of the two spacecraft and on its ability to measure these drifts for image reconstruction. We built a dynamical ray-tracing code to study the impact of these parameters on the optical performance of Simbol-X (see Chauvin et al., these proceedings). Using this simulation tool, we have conducted detailed analyses of the impact of the different parameters on the imaging performance of the Simbol-X telescope.

  9. Effects of two doses of alcohol on simulator driving performance in adults with attention-deficit/hyperactivity disorder.

    Science.gov (United States)

    Barkley, Russell A; Murphy, Kevin R; O'Connell, Trisha; Anderson, Deborah; Connor, Daniel F

    2006-01-01

    Prior studies have documented greater impairments in driving performance and greater alcohol consumption among adults with attention-deficit/hyperactivity disorder (ADHD). This study examined whether alcohol consumption produces a differentially greater impairment in driving among adults with ADHD in comparison to a community control group. The present study compared 50 adults with ADHD (mean age 33 years) and 40 control adults (mean age 29 years) on the effects of 2 single, acute doses of alcohol (0.04 and 0.08 blood alcohol concentration) and a placebo on their driving performance. The authors used a virtual reality driving simulator, examiner and self-ratings of simulator performance, and a continuous performance test (CPT) to evaluate attention and inhibition. Approximately half of the adults in each group were randomized to either the low- or high-dose alcohol treatment arm. Alcohol consumption produced a greater impact on the CPT inattention measures of the ADHD group than of the control group. Similar results were obtained for the behavioral observations taken during the operation of the driving simulator. Driving simulator scores, however, showed mainly a deleterious effect of alcohol on all participants but no differentially greater effect on the ADHD group. The present results demonstrate that alcohol may have a greater detrimental effect on some aspects of driving performance in adults with ADHD than in control adults.

  10. [Acquiring skills in malignant hyperthermia crisis management: comparison of high-fidelity simulation versus computer-based case study].

    Science.gov (United States)

    Mejía, Vilma; Gonzalez, Carlos; Delfino, Alejandro E; Altermatt, Fernando R; Corvetto, Marcia A

    The primary purpose of this study was to compare the effect of high-fidelity simulation versus computer-based case-solving self-study on the acquisition of skills in malignant hyperthermia management by first-year anesthesiology residents. After institutional ethics committee approval, 31 first-year anesthesiology residents were enrolled in this prospective randomized single-blinded study. Participants were randomized to either a high-fidelity simulation scenario or a computer-based case study about malignant hyperthermia. After the intervention, each subject's performance was assessed in a high-fidelity simulation scenario using a previously validated assessment rubric. Additionally, knowledge tests and a satisfaction survey were applied. Finally, a semi-structured interview was conducted to assess self-perception of the reasoning process and decision-making. 28 first-year residents successfully finished the study. Residents' management skill scores were globally higher with high-fidelity simulation than with the case study, with significant differences in 4 of the 8 performance rubric elements: recognizing signs and symptoms (p = 0.025), prioritization of initial management actions (p = 0.003), recognizing complications (p = 0.025) and communication (p = 0.025). Average scores on the pre- and post-test knowledge questionnaires improved from 74% to 85% in the high-fidelity simulation group, and decreased from 78% to 75% in the case study group (p = 0.032). Regarding the qualitative analysis, there was no difference in the factors influencing the students' process of reasoning and decision-making between the two teaching strategies. Simulation-based training with a malignant hyperthermia high-fidelity scenario was superior to computer-based case study, improving knowledge and skills in malignant hyperthermia crisis management, with a very good satisfaction level among anesthesia residents. Copyright © 2018 Sociedade Brasileira de Anestesiologia. Published by Elsevier Editora Ltda. All rights reserved.

  11. High-fidelity hybrid simulation of allergic emergencies demonstrates improved preparedness for office emergencies in pediatric allergy clinics.

    Science.gov (United States)

    Kennedy, Joshua L; Jones, Stacie M; Porter, Nicholas; White, Marjorie L; Gephardt, Grace; Hill, Travis; Cantrell, Mary; Nick, Todd G; Melguizo, Maria; Smith, Chris; Boateng, Beatrice A; Perry, Tamara T; Scurlock, Amy M; Thompson, Tonya M

    2013-01-01

    Simulation models using high-fidelity mannequins have shown promise in medical education, particularly for cases in which the event is uncommon. Allergy physicians encounter emergencies in their offices, and these can be the source of much trepidation. The objective was to determine if case-based simulations with high-fidelity mannequins are effective for the teaching and retention of emergency management team skills. Allergy clinics were invited to the Arkansas Children's Hospital Pediatric Understanding and Learning through Simulation Education center for a 1-day workshop to evaluate skills concerning the management of allergic emergencies. A Clinical Emergency Preparedness Team Performance Evaluation was developed to evaluate the competence of teams in several areas: leadership and/or role clarity, closed-loop communication, team support, situational awareness, and scenario-specific skills. Four cases, which focus on common allergic emergencies, were simulated by using high-fidelity mannequins and standardized patients. Teams were evaluated by multiple reviewers by using video recording and standardized scoring. Ten to 12 months after the initial training, an unannounced in situ case was performed to determine retention of the skills training. Clinics showed significant improvements in role clarity, teamwork, situational awareness, and scenario-specific skills during the 1-day workshop, and these gains persisted at the in situ follow-up (all P ≤ .004). Clinical Emergency Preparedness Team Performance Evaluation scores demonstrated improved team management skills with simulation training in office emergencies. Significant recall of team emergency management skills was demonstrated months after the initial training. Copyright © 2013 American Academy of Allergy, Asthma & Immunology. Published by Elsevier Inc. All rights reserved.

  12. Performance analyses of naval ships based on engineering level of simulation at the initial design stage

    Directory of Open Access Journals (Sweden)

    Dong-Hoon Jeong

    2017-07-01

    Full Text Available Naval ships are assigned many and varied missions. Their performance is critical for mission success, and depends on the specifications of the components. This is why performance analyses of naval ships are required at the initial design stage. Since the design and construction of naval ships take a very long time and incur a huge cost, Modeling and Simulation (M&S) is an effective method for performance analyses. Thus in this study, a simulation core is proposed to analyze the performance of naval ships considering their specifications. This simulation core can perform engineering-level simulations, considering mathematical models for naval ships such as maneuvering equations and passive sonar equations. Also, the simulation models of the simulation core follow the Discrete EVent system Specification (DEVS) and Discrete Time System Specification (DTSS) formalisms, so that simulations can progress over discrete events and discrete times. In addition, applying the DEVS and DTSS formalisms makes the structure of the simulation models flexible and reusable. To verify its applicability, the simulation core was applied to the performance analysis of a submarine in an Anti-SUrface Warfare (ASUW) mission. These simulations comprised two scenarios. The first scenario, submarine diving, examined maneuvering performance by analyzing the pitch angle and depth variation of the submarine over time. The second scenario, submarine detection, examined detection performance by analyzing how well the sonar of the submarine resolves adjacent targets. The results of these simulations confirm that the simulation core of this study can be applied to the performance analyses of naval ships considering their specifications.
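
    The discrete-event side of such a simulation core rests on a time-ordered event queue. The sketch below is a minimal event loop, not the DEVS formalism itself, and the "ping"/"detect" events are invented for illustration:

```python
import heapq
import itertools

def run_simulation(initial_events, until):
    """Minimal discrete-event loop: pop the earliest event, run its
    handler, and push whatever follow-up events the handler returns."""
    seq = itertools.count()          # tie-breaker for simultaneous events
    heap = [(t, next(seq), name, h) for t, name, h in initial_events]
    heapq.heapify(heap)
    trace = []
    while heap:
        t, _, name, handler = heapq.heappop(heap)
        if t > until:
            break
        trace.append((t, name))
        for nt, nname, nh in handler(t):
            heapq.heappush(heap, (nt, next(seq), nname, nh))
    return trace

# Hypothetical sonar "ping" every 2 s, plus one one-off detection event:
def ping(t):
    return [(t + 2.0, "ping", ping)]

trace = run_simulation([(0.0, "ping", ping), (3.0, "detect", lambda t: [])],
                       until=5.0)
print(trace)  # [(0.0, 'ping'), (2.0, 'ping'), (3.0, 'detect'), (4.0, 'ping')]
```

    A DEVS implementation layers state, time-advance and transition functions on top of exactly this kind of scheduler, which is what makes the models reusable across scenarios.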

  13. Driving simulator sickness: Impact on driving performance, influence of blood alcohol concentration, and effect of repeated simulator exposures.

    Science.gov (United States)

    Helland, Arne; Lydersen, Stian; Lervåg, Lone-Eirin; Jenssen, Gunnar D; Mørland, Jørg; Slørdal, Lars

    2016-09-01

    Simulator sickness is a major obstacle to the use of driving simulators for research, training and driver assessment purposes. The purpose of the present study was to investigate the possible influence of simulator sickness on driving performance measures such as the standard deviation of lateral position (SDLP), and the effect of alcohol or repeated simulator exposure on the degree of simulator sickness. Twenty healthy male volunteers underwent three simulated driving trials of 1 h duration with a curvy rural road scenario, and rated their degree of simulator sickness after each trial. Subjects drove sober and with blood alcohol concentrations (BAC) of approximately 0.5 g/L and 0.9 g/L in a randomized order. The simulator sickness score (SSS) did not influence the primary outcome measure, SDLP. A higher SSS significantly predicted lower average speed and a lower frequency of steering wheel reversals. These effects seemed to be mitigated by alcohol. A higher BAC significantly predicted a lower SSS, suggesting that alcohol inebriation alleviates simulator sickness. The negative relation between the number of previous exposures to the simulator and SSS was not statistically significant, but is consistent with habituation to the sickness-inducing effects, as shown in other studies. Overall, the results suggest no influence of simulator sickness on SDLP or several other driving performance measures. However, simulator sickness seems to cause test subjects to drive more carefully, with lower average speed and fewer steering wheel reversals, hampering the interpretation of these outcomes as measures of driving impairment and safety. BAC and repeated simulator exposures may act as confounding variables by influencing the degree of simulator sickness in experimental studies. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. A virtual reality dental simulator predicts performance in an operative dentistry manikin course.

    Science.gov (United States)

    Imber, S; Shapira, G; Gordon, M; Judes, H; Metzger, Z

    2003-11-01

    This study was designed to test the ability of a virtual reality dental simulator to predict the performance of students in a traditional operative dentistry manikin course. Twenty-six dental students were pre-tested on the simulator, prior to the course. They were briefly instructed and asked to prepare 12 class I cavities which were automatically graded by the simulator. The instructors in the manikin course that followed were unaware of the students' performances in the simulator pre-test. The scores achieved by each student in the last six simulator cavities were compared to their final comprehensive grades in the manikin course. Class standing of the students in the simulator pre-test positively correlated with their achievements in the manikin course with a correlation coefficient of 0.49 (P = 0.012). Eighty-nine percent of the students in the lower third of the class in the pre-test remained in the low performing half of the class in the manikin course. These results indicate that testing students in a dental simulator, prior to a manikin course, may be an efficient way to allow early identification of those who are likely to perform poorly. This in turn could enable early allocation of personal tutors to these students in order to improve their chances of success.
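
    The reported correlation coefficient is a plain Pearson r between the two score sets. A self-contained sketch of that computation, using made-up scores for five hypothetical students (not the study's data):

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation between two score lists (e.g. simulator
    pre-test scores vs. manikin-course grades)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Illustrative scores only; a value near +1 means the pre-test ranks
# students much as the later course does.
sim_pretest = [60, 70, 75, 80, 90]
manikin = [65, 68, 74, 72, 88]
print(round(pearson_r(sim_pretest, manikin), 2))  # → 0.92
```

    In the study itself r = 0.49 (P = 0.012), i.e. a moderate positive association, which is why the authors propose the pre-test as a screening aid rather than a replacement for the manikin course.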

  15. High performance shallow water kernels for parallel overland flow simulations based on FullSWOF2D

    KAUST Repository

    Wittmann, Roland

    2017-01-25

    We describe code optimization and parallelization procedures applied to the sequential overland flow solver FullSWOF2D. A major difficulty when simulating overland flows is dealing with high-resolution datasets of large-scale areas, which cannot be computed on a single node, either due to the limited amount of memory or due to the large number of (time step) iterations resulting from the CFL condition. We address these issues with two major contributions. First, we demonstrate a generic step-by-step transformation of the second-order finite volume scheme in FullSWOF2D towards MPI parallelization. Second, the computational kernels are optimized by the use of templates and a portable vectorization approach. We discuss the load imbalance of the flux computation due to dry and wet cells and propose a solution using an efficient cell-counting approach. Finally, scalability results are shown for different test scenarios along with a flood simulation benchmark using the Shaheen II supercomputer.
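
    The wet/dry load-balancing idea can be sketched as follows: partition consecutive grid rows so that each rank receives roughly equal numbers of wet (costly) cells rather than equal numbers of rows. This is a toy illustration of the cell-counting approach, not FullSWOF2D's implementation; all names are invented.

```python
def balance_rows(wet_per_row, nranks):
    """Split consecutive rows into nranks chunks with roughly equal
    wet-cell counts (wet cells dominate the flux-computation cost)."""
    total = sum(wet_per_row)
    target = total / nranks
    chunks, current, acc = [], [], 0
    for row, wet in enumerate(wet_per_row):
        current.append(row)
        acc += wet
        if acc >= target and len(chunks) < nranks - 1:
            chunks.append(current)
            current, acc = [], 0
    chunks.append(current)
    return chunks

# 8 rows: a dry region (0 wet cells) next to a flooded one.
wet = [0, 0, 10, 10, 10, 10, 0, 0]
print(balance_rows(wet, 2))  # [[0, 1, 2, 3], [4, 5, 6, 7]] -> 20 wet cells each
```

    An equal-rows split ([0..3] vs [4..7]) would happen to balance this example too, but shift the flood to one side and the wet-cell criterion departs from it, keeping per-rank flux work even as the inundation front moves.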

  16. A High-Speed Train Operation Plan Inspection Simulation Model

    Directory of Open Access Journals (Sweden)

    Yang Rui

    2018-01-01

    Full Text Available We developed a train operation simulation tool to inspect a train operation plan. Applying an improved Petri net, the train was regarded as a token, and the lines and stations were regarded as places, in accordance with high-speed train operation characteristics and network function. Location changes and running-information transfer of the high-speed train were realized by customizing a variety of transitions. The model was built on the concept of component combination, considering random disturbances in the process of train running. The simulation framework can be generated quickly and the system operation completed according to the different test requirements and the required network data. We tested the simulation tool on the real-world Wuhan to Guangzhou high-speed line. The results showed that the proposed model is workable, that the simulation results basically coincide with reality, and that the tool can not only test the feasibility of a high-speed train operation plan, but also serve as a support model for a simulation platform with more capabilities.
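
    The token/place idea can be illustrated with a minimal Petri-net firing step: a transition fires only when every input place holds enough tokens, here moving a "train" token onto a track segment. Place and transition names are made up for illustration, not taken from the paper's model.

```python
def fire(marking, transition):
    """Fire a Petri-net transition if enabled; return the new marking.

    transition = (inputs, outputs): dicts mapping place -> token count.
    """
    inputs, outputs = transition
    if any(marking.get(p, 0) < n for p, n in inputs.items()):
        return marking  # not enabled: leave the marking unchanged
    new = dict(marking)
    for p, n in inputs.items():
        new[p] -= n
    for p, n in outputs.items():
        new[p] = new.get(p, 0) + n
    return new

# One train token departs 'station_A' onto 'line_AB' (segment must be free):
marking = {"station_A": 1, "line_AB_free": 1, "line_AB": 0}
depart = ({"station_A": 1, "line_AB_free": 1}, {"line_AB": 1})
print(fire(marking, depart))
# {'station_A': 0, 'line_AB_free': 0, 'line_AB': 1}
```

    The "free segment" token is what enforces headway: a second train cannot fire the same transition until the first one releases the segment, which is exactly the kind of conflict an operation-plan inspection needs to surface.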

  17. Analysis of TIMS performance subjected to simulated wind blast

    Science.gov (United States)

    Jaggi, S.; Kuo, S.

    1992-01-01

    The results of the performance of the Thermal Infrared Multispectral Scanner (TIMS) when subjected to various wind conditions in the laboratory are described. The wind conditions were simulated using a 24-inch fan or combinations of air-jet streams blowing toward either or both of the blackbody surfaces. The fan was used to simulate a large volume of air flow at moderate speeds (up to 30 mph). The small-diameter air jets were used to probe the TIMS system response to localized wind perturbations. The maximum nozzle speed of the air jet was 60 mph. A range of wind directions and speeds was set up in the laboratory during the test. The majority of the wind tests were conducted under ambient conditions, with the room temperature fluctuating no more than 2 C. The temperature of the high-speed air jet was determined to be within 1 C of the room temperature. The TIMS response was recorded on analog tape. Additional thermistor readouts of the blackbody temperatures and a thermocouple readout of the ambient temperature were recorded manually to be compared with the housekeeping data recorded on the tape. Additional tests were conducted under conditions of elevated and cooled room temperatures. The room temperature was varied between 19.5 and 25.5 C in these tests. The calibration parameters needed for quantitative analysis of TIMS data were first plotted on a scanline-by-scanline basis. These parameters are the low and high blackbody temperature readings as recorded by the TIMS and their corresponding digitized count values. Using these values, the system transfer equations were calculated. This equation allows the flux for any video count to be computed from the slope and intercept of the straight line that relates the flux to the digital count. The actual video of the target (the lab floor in this case) was then compared with a simulated target. This simulated target was assumed to be a blackbody with an emissivity of 0.95 and the temperature was
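
    The two-point transfer equation described above is just a straight line through the (count, flux) pairs of the two blackbody references. A sketch with made-up numbers (the function name and values are illustrative, not from the TIMS data system):

```python
def calibration(count_lo, flux_lo, count_hi, flux_hi):
    """Return (slope, intercept) of the line mapping digital count to flux,
    fixed by the low and high blackbody reference points."""
    slope = (flux_hi - flux_lo) / (count_hi - count_lo)
    intercept = flux_lo - slope * count_lo
    return slope, intercept

# Hypothetical scanline: low blackbody at count 40, high at count 200.
slope, intercept = calibration(40, 10.0, 200, 50.0)
flux = slope * 120 + intercept   # flux for a mid-range video count
print(slope, intercept, flux)    # 0.25 0.0 30.0
```

    Because the slope and intercept are recomputed scanline by scanline, any wind-induced drift of a blackbody temperature shows up directly as a shift in these two parameters.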

  18. A One System Integrated Approach to Simulant Selection for Hanford High Level Waste Mixing and Sampling Tests

    International Nuclear Information System (INIS)

    Thien, Mike G.; Barnes, Steve M.

    2013-01-01

    The Hanford Tank Operations Contractor (TOC) and the Hanford Waste Treatment and Immobilization Plant (WTP) contractor are both engaged in demonstrating mixing, sampling, and transfer system capabilities using simulated Hanford High-Level Waste (HLW) formulations. This represents one of the largest remaining technical issues in the high-level waste treatment mission at Hanford. Previous testing focused on very specific TOC or WTP test objectives, and consequently the simulants were narrowly tailored to those test needs. A key attribute of Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 2010-2 is ensuring that testing is performed with a simulant that represents the broad spectrum of Hanford waste. The One System Integrated Project Team is a new joint TOC and WTP organization intended to ensure technical integration of specific TOC and WTP systems and testing. A new approach to simulant definition has been mutually developed that will meet both TOC and WTP test objectives for the delivery and receipt of HLW. The process used to identify critical simulant characteristics, incorporate lessons learned from previous testing, and identify specific simulant targets that ensure TOC and WTP testing addresses the broad spectrum of Hanford waste characteristics important to mixing, sampling, and transfer performance is described

  19. In Patients with Cirrhosis, Driving Simulator Performance is Associated With Real-life Driving

    DEFF Research Database (Denmark)

    Lauridsen, Mette Enok Munk; Thacker, Leroy R; White, Melanie B

    2016-01-01

    BACKGROUND & AIMS: Minimal hepatic encephalopathy (MHE) has been linked to higher real-life rates of automobile crashes and poor performance in driving simulation studies, but the link between driving simulator performance and real-life automobile crashes has not been clearly established. Further......, not all patients with MHE are unsafe drivers, but it is unclear how to distinguish them from unsafe drivers. We investigated the link between performance on driving simulators and real-life automobile accidents and traffic violations. We also aimed to identify features of unsafe drivers with cirrhosis...... and evaluated changes in simulated driving skills and MHE status after 1 year. METHODS: We performed a study of outpatients with cirrhosis (n=205; median 55 years old; median model for end-stage liver disease score, 9.5; none with overt hepatic encephalopathy or alcohol or illicit drug use within previous 6...

  20. High performance data acquisition with InfiniBand

    International Nuclear Information System (INIS)

    Adamczewski, Joern; Essel, Hans G.; Kurz, Nikolaus; Linev, Sergey

    2008-01-01

    For the new experiments at FAIR, new data acquisition concepts have to be developed, such as the distribution of self-triggered, time-stamped data streams over high-performance networks for event building. In this concept any data filtering is done behind the network. Therefore the network must achieve up to 1 GByte/s bi-directional data transfer per node. Detailed simulations have been done to optimize scheduling mechanisms for such event-building networks. For real performance tests, InfiniBand has been chosen as one of the fastest available network technologies. The measurements of network event building have been performed on different Linux clusters, from four to over one hundred nodes. Several InfiniBand libraries have been tested, such as uDAPL, Verbs, and MPI. The tests have been integrated into the data acquisition backbone core software DABC, a general-purpose data acquisition library. Detailed results are presented. In the worst cases (over one hundred nodes), 50% of the required bandwidth can already be achieved. It seems possible to improve these results with further investigations

  1. A contribution to the development of the modular neutron detector (DEMON): performance evaluation through measurements and simulations; Contribution a la realisation du detecteur modulaire de neutrons (DEMON): etudes des performances par mesures et simulations

    Energy Technology Data Exchange (ETDEWEB)

    Mouatassim, S

    1994-07-01

    The modular neutron detector is dedicated to the study of heavy-ion reaction mechanisms. Monte Carlo simulations are performed for the optimization of the NE213 scintillator cell size and the general geometrical setup of the DEMON neutron multidetector with a minimum of cross-talk. Tests are performed with various types of photomultiplier tubes and scintillators. Using high-energy neutron beams, more than six different reaction processes were identified with pulse-shape discrimination by the charge comparison method. Cross sections were estimated. The light yields of the charged particles p, d, t and alpha in the NE213 organic scintillator were analyzed using different theoretical approaches, and the intrinsic efficiency of the DEMON modules was measured and compared to Monte Carlo calculations. The DEMON experimental filter was simulated and associated with the Gemini physical-event generator to study the performance of such a multidetector. Thus, the DEMON response to neutron evaporation from excited nuclei and its influence on energy measurement and temperature determination were studied. The same filter was used to simulate pre- and post-fission emission of neutrons for the fission of the composite {sup 126}Ba system formed in the {sup 19}F + {sup 107}Ag entrance channel. (from author) 70 figs., 99 refs.

  2. 3rd International Conference on High Performance Scientific Computing

    CERN Document Server

    Kostina, Ekaterina; Phu, Hoang; Rannacher, Rolf

    2008-01-01

    This proceedings volume contains a selection of papers presented at the Third International Conference on High Performance Scientific Computing held at the Hanoi Institute of Mathematics, Vietnamese Academy of Science and Technology (VAST), March 6-10, 2006. The conference has been organized by the Hanoi Institute of Mathematics, Interdisciplinary Center for Scientific Computing (IWR), Heidelberg, and its International PhD Program ``Complex Processes: Modeling, Simulation and Optimization'', and Ho Chi Minh City University of Technology. The contributions cover the broad interdisciplinary spectrum of scientific computing and present recent advances in theory, development of methods, and applications in practice. Subjects covered are mathematical modelling, numerical simulation, methods for optimization and control, parallel computing, software development, applications of scientific computing in physics, chemistry, biology and mechanics, environmental and hydrology problems, transport, logistics and site loca...

  3. Simulation of a Local Collision of SC Wall Using High Energy Absorbing Steel

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, H. K.; Chung, C. H.; Park, J.; Lee, J. W. [Dankook University, Yongin (Korea, Republic of); Kim, S. Y. [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2011-05-15

    Local damage evaluations for nuclear power plant (NPP) design are performed against turbine impact, tornado impact, airplane engine impact, etc., where the turbine is an internal source of impact, whereas the tornado and airplane engine are external sources of impact. The thickness of the NPP wall structure is determined at the initial design stage so that it will not be penetrated by local impacts. This study investigated the local damage of an NPP substructure against internal turbine impact. Simulations of local collisions on an SC wall in an NPP structure were performed with two models: one using general steel and the other using high-energy-absorbing steel. The performance of the SC wall under local collisions can be greatly improved by using ductile high-energy-absorbing steel compared with general steel

  4. Simulation of a Local Collision of SC Wall Using High Energy Absorbing Steel

    International Nuclear Information System (INIS)

    Yoo, H. K.; Chung, C. H.; Park, J.; Lee, J. W.; Kim, S. Y.

    2011-01-01

    Local damage evaluations for nuclear power plant (NPP) design are performed against turbine impact, tornado impact, airplane engine impact, etc., where the turbine is an internal source of impact, whereas the tornado and airplane engine are external sources of impact. The thickness of the NPP wall structure is determined at the initial design stage so that it will not be penetrated by local impacts. This study investigated the local damage of an NPP substructure against internal turbine impact. Simulations of local collisions on an SC wall in an NPP structure were performed with two models: one using general steel and the other using high-energy-absorbing steel. The performance of the SC wall under local collisions can be greatly improved by using ductile high-energy-absorbing steel compared with general steel

  5. Flight simulation program for high altitude long endurance unmanned vehicle; Kokodo mujinki no hiko simulation program

    Energy Technology Data Exchange (ETDEWEB)

    Suzuki, H.; Hashidate, M. [National Aerospace Laboratory, Tokyo (Japan)

    1995-11-01

    An altitude of about 20 km has an atmospheric density too dilute for common aircraft and an air resistance too great for satellites. Attention has been drawn in recent years to a high-altitude long-endurance unmanned vehicle that flies at this altitude for a long period of time to serve as a wave-relaying base and perform traffic control. Therefore, a flight simulation program was developed to evaluate and discuss guidance and control laws for the high-altitude unmanned vehicle. Equations of motion were derived for three-dimensional six-degree-of-freedom and three-degree-of-freedom motion. The aerodynamic characteristics of an unmanned vehicle having a rectenna wing were estimated, and the winds that the unmanned vehicle is anticipated to encounter at an altitude of 20 km were formulated according to past research results. Focusing on motion in the horizontal plane, a guidance law that follows a given path was proposed. A flight simulation was carried out, establishing the prospect that the unmanned vehicle can be kept within a limited airspace even when it encounters a relatively strong wind. 18 refs., 20 figs., 1 tab.

  6. Correspondence between Simulator and On-Road Drive Performance: Implications for Assessment of Driving Safety.

    Science.gov (United States)

    Aksan, Nazan; Hacker, Sarah D; Sager, Lauren; Dawson, Jeffrey; Anderson, Steven; Rizzo, Matthew

    2016-03-01

    Forty-two younger (mean age = 35) and 37 older drivers (mean age = 77) completed four similar simulated drives. In addition, 32 younger and 30 older drivers completed a standard on-road drive in an instrumented vehicle. Performance in the simulated drives was evaluated using both electronic drive data and video review of errors. Safety errors during the on-road drive were evaluated by a certified driving instructor blind to simulator performance, using state Department of Transportation criteria. We examined the degree of convergence in performance across the two platforms on various driving tasks including lane changes, lane keeping, speed control, stopping, turns, and overall performance. Differences based on age group indicated a pattern of strong relative validity for simulator measures. However, relative rank-order in specific metrics of performance suggested a pattern of moderate relative validity. The findings have implications for the use of simulators in assessments of driving safety as well as in training and rehabilitation settings.

  7. Very high-resolution regional climate simulations over Scandinavia: present climate

    DEFF Research Database (Denmark)

    Christensen, Ole B.; Christensen, Jens H.; Machenhauer, Bennert

    1998-01-01

    Model verification of near-surface temperature and precipitation is made using a new gridded climatology based on a high-density station network for the Scandinavian countries compiled for the present study. The simulated runoff is compared with observed data from Sweden extracted from a Swedish climatological atlas; these runoff data indicate that the precipitation analyses underestimate the true precipitation. It is found in particular that in mountainous regions the high-resolution simulation shows improvements in the simulation of hydrologically relevant fields such as runoff and snow cover. Also, the distribution of precipitation on different intensity classes is most realistically simulated in the high-resolution simulation. It does, however, inherit certain large-scale systematic errors from the driving GCM, and in many cases these errors increase with increasing resolution.

  8. Numerical simulation and characterization of trapping noise in InGaP-GaAs heterojunctions devices at high injection

    Science.gov (United States)

    Nallatamby, Jean-Christophe; Abdelhadi, Khaled; Jacquet, Jean-Claude; Prigent, Michel; Floriot, Didier; Delage, Sylvain; Obregon, Juan

    2013-03-01

    Commercially available simulators present considerable advantages in performing accurate DC, AC and transient simulations of semiconductor devices, including many fundamental and parasitic effects which are not generally taken into account in in-house simulators. Nevertheless, while the public-domain TCAD simulators we have tested give accurate results for the simulation of diffusion noise, none of them simulates trap-assisted generation-recombination (GR) noise accurately. To overcome this problem, we propose a robust solution to accurately simulate GR noise due to traps. It is based on numerical processing of the output data of one of the simulators available in the public domain, namely SENTAURUS (from Synopsys). We have linked together, through a dedicated Data Access Component (DAC), the deterministic output data available from SENTAURUS and a powerful, customizable post-processing tool developed on the mathematical SCILAB software package. Thus, robust simulations of GR noise in semiconductor devices can be performed by using GR Langevin sources associated with the scalar Green-function responses of the device. Our method takes advantage of the accuracy of the deterministic simulations of electronic devices obtained with SENTAURUS. A comparison between 2-D simulations and measurements of low-frequency noise on InGaP-GaAs heterojunctions, at low as well as high injection levels, demonstrates the validity of the proposed simulation tool.

  9. Evaluation of simulation-based training on the ability of birth attendants to correctly perform bimanual compression as obstetric first aid.

    Science.gov (United States)

    Andreatta, Pamela; Gans-Larty, Florence; Debpuur, Domitilla; Ofosu, Anthony; Perosky, Joseph

    2011-10-01

    Maternal mortality from postpartum hemorrhage remains high globally, in large part because women give birth in rural communities where unskilled providers (traditional birth attendants) care for delivering mothers. Traditional attendants are neither trained nor equipped to recognize or manage postpartum hemorrhage as a life-threatening emergent condition. Recommended treatment includes using uterotonic agents and physical manipulation to aid uterine contraction. In resource-limited areas where obstetric first aid may be the only care option, physical methods such as bimanual uterine compression are easily taught, highly practical and, if performed correctly, highly effective. A simulator with objective performance feedback was designed to teach skilled and unskilled birth attendants to perform the technique. The aim was to evaluate the impact of simulation-based training on the ability of birth attendants to correctly perform bimanual compression in response to postpartum hemorrhage from uterine atony. Simulation-based training was conducted for skilled (N=111) and unskilled birth attendants (N=14) at two regional (Kumasi, Tamale) and two district (Savelugu, Sene) medical centers in Ghana. Training was evaluated using Kirkpatrick's 4-level model. All participants significantly increased their bimanual uterine compression skills after training (p=0.000). There were no significant differences between 2-week delayed post-test performances, indicating retention (p=0.52). Applied behavioral and clinical outcomes were reported for 9 months from a subset of birth attendants in Sene District: 425 births and 13 postpartum hemorrhages were reported without concomitant maternal mortality. The results of this study suggest that simulation-based training for skilled and unskilled birth attendants to perform bimanual uterine compression as postpartum hemorrhage obstetric first aid leads to improved applied procedural skills. Results from a smaller subset of the sample suggest that these skills

  10. Investigation of high-alpha lateral-directional control power requirements for high-performance aircraft

    Science.gov (United States)

    Foster, John V.; Ross, Holly M.; Ashley, Patrick A.

    1993-01-01

    Designers of the next-generation fighter and attack airplanes are faced with the requirements of good high angle-of-attack maneuverability as well as efficient high speed cruise capability with low radar cross section (RCS) characteristics. As a result, they are challenged with the task of making critical design trades to achieve the desired levels of maneuverability and performance. This task has highlighted the need for comprehensive, flight-validated lateral-directional control power design guidelines for high angles of attack. A joint NASA/U.S. Navy study has been initiated to address this need and to investigate the complex flight dynamics characteristics and controls requirements for high angle-of-attack lateral-directional maneuvering. A multi-year research program is underway which includes ground-based piloted simulation and flight validation. This paper will give a status update of this program that will include a program overview, description of test methodology and preliminary results.

  11. Alternative High-Performance Ceramic Waste Forms

    Energy Technology Data Exchange (ETDEWEB)

    Sundaram, S. K. [Alfred Univ., NY (United States)

    2017-02-01

    This final report (M5NU-12-NY-AU # 0202-0410) summarizes the results of the project titled “Alternative High-Performance Ceramic Waste Forms,” funded in FY12 by the Nuclear Energy University Program (NEUP Project # 12-3809) being led by Alfred University in collaboration with Savannah River National Laboratory (SRNL). The overall focus of the project is to advance fundamental understanding of crystalline ceramic waste forms and to demonstrate their viability as alternative waste forms to borosilicate glasses. We processed single- and multiphase hollandite waste forms based on simulated waste streams compositions provided by SRNL based on the advanced fuel cycle initiative (AFCI) aqueous separation process developed in the Fuel Cycle Research and Development (FCR&D). For multiphase simulated waste forms, oxide and carbonate precursors were mixed together via ball milling with deionized water using zirconia media in a polyethylene jar for 2 h. The slurry was dried overnight and then separated from the media. The blended powders were then subjected to melting or spark plasma sintering (SPS) processes. Microstructural evolution and phase assemblages of these samples were studied using x-ray diffraction (XRD), scanning electron microscopy (SEM), energy dispersion analysis of x-rays (EDAX), wavelength dispersive spectrometry (WDS), transmission electron spectroscopy (TEM), selective area x-ray diffraction (SAXD), and electron backscatter diffraction (EBSD). These results showed that the processing methods have significant effect on the microstructure and thus the performance of these waste forms. The Ce substitution into zirconolite and pyrochlore materials was investigated using a combination of experimental (in situ XRD and x-ray absorption near edge structure (XANES)) and modeling techniques to study these single phases independently. In zirconolite materials, a transition from the 2M to the 4M polymorph was observed with increasing Ce content. The resulting

  12. Simulation-Based Stochastic Sensitivity Analysis of a Mach 4.5 Mixed-Compression Intake Performance

    Science.gov (United States)

    Kato, H.; Ito, K.

    2009-01-01

    A sensitivity analysis of a supersonic mixed-compression intake of a variable-cycle turbine-based combined cycle (TBCC) engine is presented. The TBCC engine is designed to power a long-range Mach 4.5 transport capable of antipodal missions studied in the framework of an EU FP6 project, LAPCAT. The nominal intake geometry was designed using the DLR abpi cycle analysis program by taking into account various operating requirements of a typical mission profile. The intake consists of two movable external compression ramps followed by an isolator section with bleed channel. The compressed air is then diffused through a rectangular-to-circular subsonic diffuser. A multi-block Reynolds-averaged Navier-Stokes (RANS) solver with Srinivasan-Tannehill equilibrium air model was used to compute the total pressure recovery and mass capture fraction. While RANS simulation of the nominal intake configuration provides more realistic performance characteristics of the intake than the cycle analysis program, the intake design must also take into account in-flight uncertainties for robust intake performance. In this study, we focus on the effects of the geometric uncertainties on pressure recovery and mass capture fraction, and propose a practical approach to simulation-based sensitivity analysis. The method begins by constructing a lightweight analytical model, a radial-basis function (RBF) network, trained via adaptively sampled RANS simulation results. Using the RBF network as the response surface approximation, stochastic sensitivity analysis is performed using the analysis of variance (ANOVA) technique by Sobol. This approach makes it possible to perform a generalized multi-input-multi-output sensitivity analysis based on high-fidelity RANS simulation. The resulting Sobol's influence indices allow the engineer to identify dominant parameters as well as the degree of interaction among multiple parameters, which can then be fed back into the design cycle.

  13. RELAP5: Applications to high fidelity simulation

    International Nuclear Information System (INIS)

    Johnsen, G.W.; Chen, Y.S.

    1988-01-01

    RELAP5 is a pressurized water reactor system transient simulation code for use in nuclear power plant safety analysis. The latest version, MOD2, may be used to simulate and study a wide variety of abnormal events, including loss-of-coolant accidents, operational transients, and transients in which the entire secondary system must be modeled. In this paper, a basic overview of the code is given, its assessment and application illustrated, and progress toward its use as a high fidelity simulator described. 7 refs., 7 figs

  14. Development and Testing of Screen-Based and Psychometric Instruments for Assessing Resident Performance in an Operating Room Simulator

    Directory of Open Access Journals (Sweden)

    Richard R. McNeer

    2016-01-01

    Full Text Available Introduction. Medical simulators are used for assessing clinical skills and increasingly for testing hypotheses. We developed and tested an approach for assessing performance in anesthesia residents using screen-based simulation that ensures expert raters remain blinded to subject identity and experimental condition. Methods. Twenty anesthesia residents managed emergencies in an operating room simulator by logging actions through a custom graphical user interface. Two expert raters rated performance based on these entries using custom Global Rating Scale (GRS) and Crisis Management Checklist (CMC) instruments. Interrater reliability was measured by calculating intraclass correlation coefficients (ICC), and internal consistency of the instruments was assessed with Cronbach’s alpha. Agreement between GRS and CMC was measured using Spearman rank correlation (SRC). Results. Interrater agreement (GRS: ICC = 0.825; CMC: ICC = 0.878) and internal consistency (GRS: alpha = 0.838; CMC: alpha = 0.886) were good for both instruments. Subscale analysis indicated that several instrument items can be discarded. GRS and CMC scores were highly correlated (SRC = 0.948). Conclusions. In this pilot study, we demonstrated that screen-based simulation can allow blinded assessment of performance. GRS and CMC instruments demonstrated good rater agreement and internal consistency. We plan to further test construct validity of our instruments by measuring performance in our simulator as a function of training level.
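As a rough illustration of the reliability statistics named in this record, the sketch below computes Cronbach's alpha and a Spearman rank correlation on synthetic ratings; the data, dimensions, and noise levels are invented and do not come from the study:

```python
import numpy as np

# Hypothetical ratings: 20 residents x 5 instrument items, generated as a
# latent "ability" plus item-level noise (all values are made up).
rng = np.random.default_rng(1)
ability = rng.normal(3.0, 1.0, size=(20, 1))
items = ability + rng.normal(0.0, 0.5, size=(20, 5))

def cronbach_alpha(scores):
    """alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = scores.shape[1]
    item_var = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

def spearman_rho(a, b):
    """Spearman rank correlation = Pearson correlation of the ranks (no ties here)."""
    ra = np.argsort(np.argsort(a)).astype(float)
    rb = np.argsort(np.argsort(b)).astype(float)
    return np.corrcoef(ra, rb)[0, 1]

alpha = cronbach_alpha(items)
rho = spearman_rho(items.sum(axis=1), ability.ravel())
print(alpha, rho)
```

With correlated items and a strong latent signal, both statistics come out high, mirroring the "good internal consistency" pattern reported above; real analyses would use a library routine (e.g. SciPy's `spearmanr`) that also handles tied ranks.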

  15. Sustained effect of simulation-based ultrasound training on clinical performance

    DEFF Research Database (Denmark)

    Tolsgaard, M G; Ringsted, C; Dreisler, E

    2015-01-01

    on a virtual-reality transvaginal ultrasound simulator until an expert performance level was attained followed by training on a pelvic mannequin. After two months of clinical training, one transvaginal ultrasound scan was recorded for assessment of participants' clinical performance. Two blinded ultrasound...

  16. Association Between Endovascular Performance in a Simulated Setting and in the Catheterization Laboratory

    DEFF Research Database (Denmark)

    Räder, Sune B E W; Abildgaard, Ulrik; Jørgensen, Erik

    2014-01-01

    performance in the catheterization laboratory is not linear. The novel rating scale for CA (CARS) seems to be a valid proficiency assessment instrument in the catheterization laboratory. Familiarity with the simulator may overestimate proficiency, which means that simulator performance as a predictor...

  17. Numerical simulation of aerodynamic performance of a couple multiple units high-speed train

    Science.gov (United States)

    Niu, Ji-qiang; Zhou, Dan; Liu, Tang-hong; Liang, Xi-feng

    2017-05-01

    To determine how the coupling region affects the aerodynamic performance of coupled multiple-unit trains, both when they run and pass each other in the open air and when they enter and pass each other in a tunnel, these scenarios were simulated in Fluent 14.0. The numerical algorithm employed in this study was verified against data from scaled and full-scale train tests, with differences within an acceptable range. The results demonstrate that the coupling region alters the distribution of aerodynamic forces on the train cars; however, it has a marginal effect on the drag and lateral force on the whole train under crosswind, and the lateral force on the train cars is more sensitive to the coupling than the other two force coefficients. The coupling region also increases the fluctuation of the aerodynamic coefficients of each train car under crosswind. In the open air, the coupling region introduces a positive pressure pulse into the alternating pressure produced by trains passing each other and decreases the amplitude of that alternating pressure; in the tunnel, the amplitude of the alternating pressure on the train and on the tunnel is significantly decreased by the coupling region. This does not alter the distribution law of pressure on the train and tunnel; moreover, the effect of the coupling region is stronger for trains passing each other in the tunnel than for a single train passing through the tunnel.

  18. A high-order particle-in-cell method for low density plasma flow and the simulation of gyrotron resonator devices

    International Nuclear Information System (INIS)

    Stock, Andreas

    2013-01-01

    Within this thesis a parallelized, transient, three-dimensional, high-order discontinuous Galerkin Particle-in-Cell solver is developed and used to simulate the resonant cavity of a gyrotron. The high-order discontinuous Galerkin approach - a Finite-Element-type method - provides a fast and efficient algorithm to numerically solve Maxwell's equations within this thesis. Besides its outstanding dissipation and dispersion properties, the discontinuous Galerkin approach easily allows the use of unstructured grids, as required to simulate complex-shaped engineering devices. The discontinuous Galerkin approach approximates a wavelength with significantly fewer degrees of freedom than other methods, e.g. Finite Difference methods. Furthermore, the parallelization capabilities of the discontinuous Galerkin framework are excellent due to the very local dependencies between the elements. These properties are essential for the efficient numerical treatment of the Vlasov-Maxwell system with the Particle-in-Cell method. This system describes the self-consistent interaction of charged particles and the electromagnetic field. As the central application within this thesis, gyrotron resonators are simulated with the discontinuous Galerkin Particle-in-Cell method on high-performance computers. The gyrotron is a high-power millimeter-wave source used for the electron cyclotron resonance heating of magnetically confined fusion plasma, e.g. in the Wendelstein 7-X experimental fusion reactor. Compared to state-of-the-art simulation tools used for the design of gyrotron resonators, the Particle-in-Cell method does not make any significant physical simplifications w.r.t. the modelling of the particle-field interaction, the geometry and the wave spectrum. Hence, it is the method of choice for validating current simulation tools that are restricted by these simplifications. So far, the Particle-in-Cell method was restricted to demonstration calculations only, because

  19. A high-order particle-in-cell method for low density plasma flow and the simulation of gyrotron resonator devices

    Energy Technology Data Exchange (ETDEWEB)

    Stock, Andreas

    2013-04-26

    Within this thesis a parallelized, transient, three-dimensional, high-order discontinuous Galerkin Particle-in-Cell solver is developed and used to simulate the resonant cavity of a gyrotron. The high-order discontinuous Galerkin approach - a Finite-Element-type method - provides a fast and efficient algorithm to numerically solve Maxwell's equations within this thesis. Besides its outstanding dissipation and dispersion properties, the discontinuous Galerkin approach easily allows the use of unstructured grids, as required to simulate complex-shaped engineering devices. The discontinuous Galerkin approach approximates a wavelength with significantly fewer degrees of freedom than other methods, e.g. Finite Difference methods. Furthermore, the parallelization capabilities of the discontinuous Galerkin framework are excellent due to the very local dependencies between the elements. These properties are essential for the efficient numerical treatment of the Vlasov-Maxwell system with the Particle-in-Cell method. This system describes the self-consistent interaction of charged particles and the electromagnetic field. As the central application within this thesis, gyrotron resonators are simulated with the discontinuous Galerkin Particle-in-Cell method on high-performance computers. The gyrotron is a high-power millimeter-wave source used for the electron cyclotron resonance heating of magnetically confined fusion plasma, e.g. in the Wendelstein 7-X experimental fusion reactor. Compared to state-of-the-art simulation tools used for the design of gyrotron resonators, the Particle-in-Cell method does not make any significant physical simplifications w.r.t. the modelling of the particle-field interaction, the geometry and the wave spectrum. Hence, it is the method of choice for validating current simulation tools that are restricted by these simplifications. So far, the Particle-in-Cell method was restricted to demonstration calculations only, because

  20. High Performance Multiphase Combustion Tool Using Level Set-Based Primary Atomization Coupled with Flamelet Models, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovative methodologies proposed in this STTR Phase 2 project will enhance Loci-STREAM which is a high performance, high fidelity simulation tool already being...

  1. High Performance Multiphase Combustion Tool Using Level Set-Based Primary Atomization Coupled with Flamelet Models, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovative methodologies proposed in this STTR Phase 1 project will enhance Loci-STREAM which is a high performance, high fidelity simulation tool already being...

  2. Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC).

    Energy Technology Data Exchange (ETDEWEB)

    Schultz, Peter Andrew

    2011-12-01

    The objective of the U.S. Department of Energy Office of Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) is to provide an integrated suite of computational modeling and simulation (M&S) capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive-waste storage facility or disposal repository. Achieving the objective of modeling the performance of a disposal scenario requires describing processes involved in waste form degradation and radionuclide release at the subcontinuum scale, beginning with mechanistic descriptions of chemical reactions and chemical kinetics at the atomic scale, and upscaling into effective, validated constitutive models for input to high-fidelity continuum scale codes for coupled multiphysics simulations of release and transport. Verification and validation (V&V) is required throughout the system to establish evidence-based metrics for the level of confidence in M&S codes and capabilities, including at the subcontinuum scale and the constitutive models they inform or generate. This Report outlines the nature of the V&V challenge at the subcontinuum scale, an approach to incorporate V&V concepts into subcontinuum scale modeling and simulation (M&S), and a plan to incrementally incorporate effective V&V into subcontinuum scale M&S destined for use in the NEAMS Waste IPSC work flow to meet requirements of quantitative confidence in the constitutive models informed by subcontinuum scale phenomena.

  3. Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC)

    International Nuclear Information System (INIS)

    Schultz, Peter Andrew

    2011-01-01

    The objective of the U.S. Department of Energy Office of Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) is to provide an integrated suite of computational modeling and simulation (M and S) capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive-waste storage facility or disposal repository. Achieving the objective of modeling the performance of a disposal scenario requires describing processes involved in waste form degradation and radionuclide release at the subcontinuum scale, beginning with mechanistic descriptions of chemical reactions and chemical kinetics at the atomic scale, and upscaling into effective, validated constitutive models for input to high-fidelity continuum scale codes for coupled multiphysics simulations of release and transport. Verification and validation (V and V) is required throughout the system to establish evidence-based metrics for the level of confidence in M and S codes and capabilities, including at the subcontinuum scale and the constitutive models they inform or generate. This Report outlines the nature of the V and V challenge at the subcontinuum scale, an approach to incorporate V and V concepts into subcontinuum scale modeling and simulation (M and S), and a plan to incrementally incorporate effective V and V into subcontinuum scale M and S destined for use in the NEAMS Waste IPSC work flow to meet requirements of quantitative confidence in the constitutive models informed by subcontinuum scale phenomena.

  4. Simulations

    CERN Document Server

    Ngada, Narcisse

    2015-06-15

    The complexity and cost of building and running high-power electrical systems make the use of simulations unavoidable. The simulations available today provide great understanding about how systems really operate. This paper helps the reader to gain an insight into simulation in the field of power converters for particle accelerators. Starting with the definition and basic principles of simulation, two simulation types, as well as their leading tools, are presented: analog and numerical simulations. Some practical applications of each simulation type are also considered. The final conclusion then summarizes the main important items to keep in mind before opting for a simulation tool or before performing a simulation.

  5. Feasibility of performing high resolution cloud-resolving simulations of historic extreme events: The San Fruttuoso (Liguria, Italy) case of 1915.

    Science.gov (United States)

    Parodi, Antonio; Boni, Giorgio; Ferraris, Luca; Gallus, William; Maugeri, Maurizio; Molini, Luca; Siccardi, Franco

    2017-04-01

    Recent studies show that highly localized and persistent back-building mesoscale convective systems represent one of the most dangerous flash-flood producing storms in the north-western Mediterranean area. Substantial warming of the Mediterranean Sea in recent decades raises concerns over possible increases in frequency or intensity of these types of events, as increased atmospheric temperatures generally support increases in water vapor content. Analyses of available historical records do not provide a univocal answer, since these are likely affected by a lack of detailed observations for older events. In the present study, 20th Century Reanalysis Project initial and boundary condition data in ensemble mode are used to address the feasibility of performing cloud-resolving simulations with 1 km horizontal grid spacing of a historic extreme event that occurred over Liguria (Italy): the San Fruttuoso case of 1915. The proposed approach focuses on the ensemble Weather Research and Forecasting (WRF) model runs, as they are the ones most likely to best simulate the event. It is found that these WRF runs generally do show wind and precipitation fields that are consistent with the occurrence of highly localized and persistent back-building mesoscale convective systems, although precipitation peak amounts are underestimated. Systematic small north-westward position errors with regard to the heaviest rain and strongest convergence areas imply that the Reanalysis members may not be adequately representing the amount of cool air over the Po Plain outflowing into the Liguria Sea through the Apennines gap. Regarding the role of historical data sources, this study shows that in addition to Reanalysis products, unconventional data, such as historical meteorological bulletins, newspapers and even photographs, can be very valuable sources of knowledge in the reconstruction of past extreme events.

  6. Lasertron performance simulation

    International Nuclear Information System (INIS)

    Dubrovin, A.; Coulon, J.P.

    1987-05-01

    This report presents a comparative simulation study of the Lasertron at different frequencies and emission conditions, with a view to establishing selection criteria for future experiments. The RING program used for these simulations is an improved version of the one presented in another report. The self-consistent treatment of the R.F. extraction zone has been added to it, together with the possibility of varying initial conditions to better describe the laser illumination and the electron extraction from the cathode. Plane or curved cathodes are used [fr]

  7. Generating performance portable geoscientific simulation code with Firedrake (Invited)

    Science.gov (United States)

    Ham, D. A.; Bercea, G.; Cotter, C. J.; Kelly, P. H.; Loriant, N.; Luporini, F.; McRae, A. T.; Mitchell, L.; Rathgeber, F.

    2013-12-01

    This presentation will demonstrate how a change in simulation programming paradigm can be exploited to deliver sophisticated simulation capability which is far easier to programme than are conventional models, is capable of exploiting different emerging parallel hardware, and is tailored to the specific needs of geoscientific simulation. Geoscientific simulation represents a grand challenge computational task: many of the largest computers in the world are tasked with this field, and the requirements of resolution and complexity of scientists in this field are far from being sated. However, single thread performance has stalled, even sometimes decreased, over the last decade, and has been replaced by ever more parallel systems: both as conventional multicore CPUs and in the emerging world of accelerators. At the same time, the needs of scientists to couple ever-more complex dynamics and parametrisations into their models makes the model development task vastly more complex. The conventional approach of writing code in low level languages such as Fortran or C/C++ and then hand-coding parallelism for different platforms by adding library calls and directives forces the intermingling of the numerical code with its implementation. This results in an almost impossible set of skill requirements for developers, who must simultaneously be domain science experts, numericists, software engineers and parallelisation specialists. Even more critically, it requires code to be essentially rewritten for each emerging hardware platform. Since new platforms are emerging constantly, and since code owners do not usually control the procurement of the supercomputers on which they must run, this represents an unsustainable development load. The Firedrake system, conversely, offers the developer the opportunity to write PDE discretisations in the high-level mathematical language UFL from the FEniCS project (http://fenicsproject.org). Non-PDE model components, such as parametrisations

  8. PERFORMANCE EVALUATION OF SOLAR COLLECTORS USING A SOLAR SIMULATOR

    Directory of Open Access Journals (Sweden)

    M. Norhafana

    2015-11-01

    Full Text Available Solar water heating is one of the applications of solar energy. One of the components of a solar water heating system is a solar collector, which contains an absorber, and the performance of the system depends on that absorber. In countries with unsuitable weather conditions, indoor testing of solar collectors with a solar simulator is preferred. This study therefore uses a multilayered absorber in the solar collector of a solar water heating system and evaluates the performance of the collector, in terms of the useful heat of the multilayered absorber, using the multidirectional ability of a solar simulator at several values of solar radiation. The collector is operated at three values of solar radiation, 400 W/m², 550 W/m² and 700 W/m², and at three angles, 0°, 45° and 90°. The results show that the multilayer absorber in the solar collector performs best at the 45° simulator angle across the different radiation intensities; at this angle the maximum values of useful heat and temperature difference are achieved. KEYWORDS: solar water heating system; solar collector; multilayered absorber; solar simulator; solar radiation
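The "useful heat" figure of merit for a collector is conventionally Q = m_dot · c_p · (T_out - T_in), normalised by the incident radiation to give an instantaneous efficiency. A minimal sketch (the numbers below are hypothetical, not taken from the study):

```python
def useful_heat(m_dot, cp, t_in, t_out):
    """Useful heat gain of a collector: Q = m_dot * cp * (T_out - T_in), in W."""
    return m_dot * cp * (t_out - t_in)

def efficiency(q_useful, irradiance, area):
    """Instantaneous efficiency = useful heat / incident solar power."""
    return q_useful / (irradiance * area)

# Hypothetical operating point for illustration (not measured values):
# 0.02 kg/s of water (cp = 4186 J/kg.K) heated from 30 C to 38 C,
# under 700 W/m2 on a 2 m2 collector.
q = useful_heat(m_dot=0.02, cp=4186.0, t_in=30.0, t_out=38.0)
eta = efficiency(q, irradiance=700.0, area=2.0)
```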

  9. In-flight simulation of high agility through active control: Taming complexity by design

    Science.gov (United States)

    Padfield, Gareth D.; Bradley, Roy

    1993-01-01

    The motivation for research into helicopter agility stems from the realization that marked improvements relative to current operational types are possible, yet there is a dearth of useful criteria for flying qualities at high performance levels. Several research laboratories are currently investing resources in developing second generation airborne rotorcraft simulators. The UK's focus has been the exploitation of agility through active control technology (ACT); this paper reviews the results of studies conducted to date. The conflict between safety and performance in flight research is highlighted and the various forms of safety net to protect against system failures are described. The role of the safety pilot, and the use of actuator and flight envelope limiting are discussed. It is argued that the deep complexity of a research ACT system can only be tamed through a requirement specification assembled using design principles and cast in an operational simulation form. Work along these lines conducted at DRA is described, including the use of the Jackson System Development method and associated Ada simulation.

  10. SIMULATIONS OF HIGH-VELOCITY CLOUDS. I. HYDRODYNAMICS AND HIGH-VELOCITY HIGH IONS

    International Nuclear Information System (INIS)

    Kwak, Kyujin; Henley, David B.; Shelton, Robin L.

    2011-01-01

    We present hydrodynamic simulations of high-velocity clouds (HVCs) traveling through the hot, tenuous medium in the Galactic halo. A suite of models was created using the FLASH hydrodynamics code, sampling various cloud sizes, densities, and velocities. In all cases, the cloud-halo interaction ablates material from the clouds. The ablated material falls behind the clouds where it mixes with the ambient medium to produce intermediate-temperature gas, some of which radiatively cools to less than 10,000 K. Using a non-equilibrium ionization algorithm, we track the ionization levels of carbon, nitrogen, and oxygen in the gas throughout the simulation period. We present observation-related predictions, including the expected H I and high ion (C IV, N V, and O VI) column densities on sightlines through the clouds as functions of evolutionary time and off-center distance. The predicted column densities overlap those observed for Complex C. The observations are best matched by clouds that have interacted with the Galactic environment for tens to hundreds of megayears. Given the large distances across which the clouds would travel during such time, our results are consistent with Complex C having an extragalactic origin. The destruction of HVCs is also of interest; the smallest cloud (initial mass ∼120 M⊙) lost most of its mass during the simulation period (60 Myr), while the largest cloud (initial mass ∼4 × 10⁵ M⊙) remained largely intact, although deformed, during its simulation period (240 Myr).
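The predicted column densities come from integrating number density along a sightline, N = ∫ n dl. A minimal sketch of that integral (the uniform density profile is a hypothetical stand-in, not the FLASH output):

```python
def column_density(n_of_l, l_max, steps=1000):
    """Column density N = integral of n(l) dl along a sightline, by the
    trapezoidal rule. With n in cm^-3 and l in cm, N comes out in cm^-2."""
    h = l_max / steps
    total = 0.5 * (n_of_l(0.0) + n_of_l(l_max))
    for i in range(1, steps):
        total += n_of_l(i * h)
    return total * h

# Hypothetical example: a uniform n = 0.1 cm^-3 cloud, 100 pc deep.
PC_CM = 3.086e18                                  # centimetres per parsec
N = column_density(lambda l: 0.1, 100 * PC_CM)    # about 3.1e19 cm^-2
```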

  11. Development of High-Performance Cast Crankshafts. Final Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Bauer, Mark E [General Motors, Detroit, MI (United States)

    2017-03-31

    The objective of this project was to develop technologies that would enable the production of cast crankshafts that can replace high performance forged steel crankshafts. To achieve this, the Ultimate Tensile Strength (UTS) of the new material needs to be 850 MPa with a desired minimum Yield Strength (YS; 0.2% offset) of 615 MPa and at least 10% elongation. Perhaps more challenging, the cast material needs to be able to achieve sufficient local fatigue properties to satisfy the durability requirements in today’s high performance gasoline and diesel engine applications. The project team focused on the development of cast steel alloys for application in crankshafts to take advantage of the higher stiffness over other potential material choices. The material and process developed should be able to produce high-performance crankshafts at no more than 110% of the cost of current production cast units, perhaps the most difficult objective to achieve. To minimize costs, the primary alloy design strategy was to design compositions that can achieve the required properties with minimal alloying and post-casting heat treatments. An Integrated Computational Materials Engineering (ICME) based approach was utilized, rather than relying only on traditional trial-and-error methods, which has been proven to accelerate alloy development time. Prototype melt chemistries designed using ICME were cast as test specimens and characterized iteratively to develop an alloy design within a stage-gate process. Standard characterization and material testing were done to validate the alloy performance against design targets and provide feedback to material design and manufacturing process models. Finally, the project called for Caterpillar and General Motors (GM) to develop optimized crankshaft designs using the final material and manufacturing processing path developed. A multi-disciplinary effort was to integrate finite element analyses by engine designers and geometry-specific casting

  12. Performance evaluation by simulation and analysis with applications to computer networks

    CERN Document Server

    Chen, Ken

    2015-01-01

    This book is devoted to the most used methodologies for performance evaluation: simulation using specialized software and mathematical modeling. An important part is dedicated to simulation, particularly its theoretical framework and the precautions to be taken in the implementation of the experimental procedure. These principles are illustrated by concrete examples achieved through operational simulation languages (OMNeT++, OPNET). The mathematical method, presented as a complementary approach, is essential to the simulation. Both methodologies based largely on the theory of
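The book's central discipline, cross-checking a simulation against the corresponding analytic model, can be illustrated with the simplest queueing example. This sketch (plain Python rather than OMNeT++ or OPNET) simulates an M/M/1 queue via the Lindley recursion and compares the measured mean sojourn time with the exact result E[T] = 1/(μ − λ):

```python
import random

def mm1_mean_sojourn(lam, mu, n_customers, seed=1):
    """Simulate an M/M/1 queue with the Lindley recursion and return the
    mean sojourn time (waiting + service) over n_customers arrivals."""
    rng = random.Random(seed)
    wait, total = 0.0, 0.0
    for _ in range(n_customers):
        service = rng.expovariate(mu)          # exponential service time
        total += wait + service
        inter = rng.expovariate(lam)           # exponential inter-arrival time
        wait = max(0.0, wait + service - inter)  # Lindley recursion
    return total / n_customers

lam, mu = 0.5, 1.0
simulated = mm1_mean_sojourn(lam, mu, 200_000)
analytic = 1.0 / (mu - lam)                    # M/M/1: E[T] = 1/(mu - lam) = 2.0
```

The agreement between `simulated` and `analytic` is exactly the kind of validation step the book recommends before trusting a simulator on models with no closed-form answer.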

  13. Evaluating the Effect of Virtual Reality Temporal Bone Simulation on Mastoidectomy Performance: A Meta-analysis.

    Science.gov (United States)

    Lui, Justin T; Hoy, Monica Y

    2017-06-01

    Background The increasing prevalence of virtual reality simulation in temporal bone surgery warrants an investigation to assess training effectiveness. Objectives To determine if temporal bone simulator use improves mastoidectomy performance. Data Sources Ovid Medline, Embase, and PubMed databases were systematically searched per the PRISMA guidelines. Review Methods Inclusion criteria were peer-reviewed publications that utilized quantitative data of mastoidectomy performance following the use of a temporal bone simulator. The search was restricted to human studies published in English. Studies were excluded if they were in non-peer-reviewed format, were descriptive in nature, or failed to provide surgical performance outcomes. Meta-analysis calculations were then performed. Results A meta-analysis based on the random-effects model revealed an improvement in overall mastoidectomy performance following training on the temporal bone simulator. A standardized mean difference of 0.87 (95% CI, 0.38-1.35) was generated in the setting of a heterogeneous study population (I² = 64.3%). Conclusions Despite the heterogeneity of the virtual reality simulation temporal bone surgery studies, meta-analysis calculations demonstrate an improvement in trainee mastoidectomy performance with virtual simulation training.
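The pooled standardized mean difference reported here comes from a random-effects model; the DerSimonian-Laird estimator is the standard computation. A sketch with hypothetical study-level inputs (not the data of this review):

```python
import math

def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate (DerSimonian-Laird), 95% CI and I^2."""
    k = len(effects)
    w = [1.0 / v for v in variances]                       # fixed-effect weights
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)                     # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]         # random-effect weights
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    i2 = max(0.0, (q - (k - 1)) / q) * 100 if q > 0 else 0.0
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se), i2

# Hypothetical per-study SMDs and variances (NOT the studies of this review):
smds = [0.4, 1.1, 0.9, 1.4, 0.3]
vars_ = [0.05, 0.08, 0.06, 0.10, 0.07]
pooled, ci, i2 = dersimonian_laird(smds, vars_)
```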

  14. High Performance Marine Vessels

    CERN Document Server

    Yun, Liang

    2012-01-01

    High Performance Marine Vessels (HPMVs) range from the Fast Ferries to the latest high speed Navy Craft, including competition power boats and hydroplanes, hydrofoils, hovercraft, catamarans and other multi-hull craft. High Performance Marine Vessels covers the main concepts of HPMVs and discusses historical background, design features, services that have been successful and not so successful, and some sample data of the range of HPMVs to date. Included is a comparison of all HPMV craft and the differences between them, and descriptions of performance (hydrodynamics and aerodynamics). Readers will find a comprehensive overview of the design, development and building of HPMVs. In summary, this book: Focuses on technology at the aero-marine interface Covers the full range of high performance marine vessel concepts Explains the historical development of various HPMVs Discusses ferries, racing and pleasure craft, as well as utility and military missions High Performance Marine Vessels is an ideal book for student...

  15. Standardised simulation-based emergency and intensive care nursing curriculum to improve nursing students' performance during simulated resuscitation: A quasi-experimental study.

    Science.gov (United States)

    Chen, Jie; Yang, Jian; Hu, Fen; Yu, Si-Hong; Yang, Bing-Xiang; Liu, Qian; Zhu, Xiao-Ping

    2018-03-14

    Simulation-based curricula have been demonstrated as crucial to nursing education in the development of students' critical thinking and complex clinical skills during a resuscitation simulation. Few studies have comprehensively examined the effectiveness of a standardised simulation-based emergency and intensive care nursing curriculum on the performance of students in a resuscitation simulation. To evaluate the impact of a standardised simulation-based emergency and intensive care nursing curriculum on nursing students' response time in a resuscitation simulation. Two-group, non-randomised quasi-experimental design. A simulation centre in a Chinese University School of Nursing. Third-year nursing students (N = 39) in the Emergency and Intensive Care course were divided into a control group (CG, n = 20) and an experimental group (EG, n = 19). The experimental group participated in a standardised high-technology, simulation-based emergency and intensive care nursing curriculum. The standardised simulation-based curriculum for third-year nursing students consists of three modules: disaster response, emergency care, and intensive care, which include clinical priorities (e.g. triage), basic resuscitation skills, airway/breathing management, circulation management and team work, with eighteen lecture hours, six skill-practice hours and twelve simulation hours. The control group took part in the traditional curriculum. This course included the same three modules with thirty-four lecture hours and two skill-practice hours (trauma). Perceived benefits included decreased median (interquartile range, IQR) seconds to start compressions [CG 32 (25-75) vs. EG 20 (18-38); p < 0.05], with no significant difference for defibrillation [CG 222 (194-254) vs. EG 221 (214-248); p > 0.05] at the beginning of the course. A simulation-based emergency and intensive care nursing curriculum was created and well received by third-year nursing students and associated with decreased response time in a resuscitation simulation.

  16. Numerical simulations of novel high-power high-brightness diode laser structures

    Science.gov (United States)

    Boucke, Konstantin; Rogg, Joseph; Kelemen, Marc T.; Poprawe, Reinhart; Weimann, Guenter

    2001-07-01

    One of the key topics in today's semiconductor laser development activities is to increase the brightness of high-power diode lasers. Although structures showing increased brightness have been developed, specific drawbacks of these structures leave a strong demand for the investigation of alternative concepts. Especially for the investigation of fundamentally novel structures, easy-to-use and fast simulation tools are essential to avoid unnecessary, cost- and time-consuming experiments. A diode laser simulation tool based on finite difference representations of the Helmholtz equation in 'wide-angle' approximation and the carrier diffusion equation has been developed. An optimized numerical algorithm leads to short execution times of a few seconds per resonator round-trip on a standard PC. After each round-trip, characteristics such as optical output power, beam profile and beam parameters are calculated. A graphical user interface allows online monitoring of the simulation results. The simulation tool is used to investigate a novel high-power, high-brightness diode laser structure, the so-called 'Z-Structure'. In this structure an increased brightness is achieved by reducing the divergence angle of the beam through angular filtering: the round-trip path of the beam is twice folded using internal total reflection at surfaces defined by a small index step in the semiconductor material, forming a stretched 'Z'. The sharp decrease of the reflectivity for angles of incidence above the angle of total reflection leads to a narrowing of the angular spectrum of the beam. The simulations of the 'Z-Structure' indicate an increase of the beam quality by a factor of five to ten compared to standard broad-area lasers.
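The angular filtering described rests on the sharp reflectivity cut-off at the critical angle θ_c = arcsin(n₂/n₁): for a small index step, θ_c sits just below 90°, so only rays within a few degrees of the guide axis survive the folded reflections. A sketch with hypothetical index values (not the device's actual parameters):

```python
import math

def critical_angle_deg(n_core, n_outside):
    """Critical angle for total internal reflection, measured from the
    surface normal: theta_c = arcsin(n_outside / n_core)."""
    return math.degrees(math.asin(n_outside / n_core))

# Hypothetical effective indices across a small built-in index step:
n1, n2 = 3.35, 3.34
theta_c = critical_angle_deg(n1, n2)       # just below 90 degrees
filter_half_angle = 90.0 - theta_c         # only rays within this angle of
                                           # the axis are totally reflected
```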

  17. End-to-end System Performance Simulation: A Data-Centric Approach

    Science.gov (United States)

    Guillaume, Arnaud; Laffitte de Petit, Jean-Luc; Auberger, Xavier

    2013-08-01

    In the early times of the space industry, the feasibility of Earth observation missions was directly driven by what could be achieved by the satellite. It was clear to everyone that the ground segment would be able to deal with the small amount of data sent by the payload. Over the years, the amounts of data processed by spacecraft have been increasing drastically, putting more and more constraints on ground segment performance, in particular on timeliness. Nowadays, many space systems require high data throughputs and short response times, with information coming from multiple sources and involving complex algorithms. It has become necessary to perform thorough end-to-end analyses of the full system in order to optimise its cost and efficiency, and sometimes even to assess the feasibility of the mission. This paper presents a novel framework developed by Astrium Satellites in order to meet these needs of timeliness evaluation and optimisation. This framework, named ETOS (for “End-to-end Timeliness Optimisation of Space systems”), provides a modelling process with associated tools, models and GUIs. These are integrated thanks to a common data model and suitable adapters, with the aim of building space system simulators of the full end-to-end chain. A big challenge for such an environment is integrating heterogeneous tools (each one being well adapted to part of the chain) into a relevant timeliness simulation.

  18. Problem reporting management system performance simulation

    Science.gov (United States)

    Vannatta, David S.

    1993-01-01

    This paper proposes the Problem Reporting Management System (PRMS) model as an effective discrete simulation tool for determining the risks involved during the development phase of a Trouble Tracking Reporting Data Base replacement system. The model considers the type of equipment and networks which will be used in the replacement system as well as varying user loads, size of the database, and expected operational availability. The paper discusses the dynamics, stability, and application of the PRMS and addresses concepts suggested to enhance and enrich service performance.
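A discrete simulation of this kind is typically organised as an event loop over a priority queue. The sketch below is a generic single-server skeleton (Poisson arrivals of problem reports, FIFO service), not the PRMS model itself; the arrival and service rates are placeholders:

```python
import heapq
import random

def simulate_reports(arrival_rate, service_rate, horizon, seed=7):
    """Minimal discrete-event sketch: problem reports arrive as a Poisson
    stream, one analyst closes them FIFO; returns the mean time-in-system
    of closed reports over the simulated horizon."""
    rng = random.Random(seed)
    events = [(rng.expovariate(arrival_rate), "arrival")]
    queue, busy_until = [], 0.0
    closed, total_time = 0, 0.0
    while events:
        t, kind = heapq.heappop(events)
        if t > horizon:
            break
        if kind == "arrival":
            queue.append(t)                       # remember when it was opened
            heapq.heappush(events, (t + rng.expovariate(arrival_rate), "arrival"))
        # Start service whenever the analyst is free and work is queued
        # (covers both a fresh arrival to an idle analyst and a completion).
        if queue and busy_until <= t:
            opened = queue.pop(0)
            busy_until = t + rng.expovariate(service_rate)
            total_time += busy_until - opened
            closed += 1
            heapq.heappush(events, (busy_until, "service_done"))
    return total_time / closed if closed else 0.0

mean_tis = simulate_reports(arrival_rate=0.5, service_rate=1.0, horizon=10_000)
```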

  19. Optimizing the Performance of Reactive Molecular Dynamics Simulations for Multi-core Architectures

    Energy Technology Data Exchange (ETDEWEB)

    Aktulga, Hasan Metin [Michigan State Univ., East Lansing, MI (United States); Coffman, Paul [Argonne National Lab. (ANL), Argonne, IL (United States); Shan, Tzu-Ray [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Knight, Chris [Argonne National Lab. (ANL), Argonne, IL (United States); Jiang, Wei [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-12-01

    Hybrid parallelism allows high performance computing applications to better leverage the increasing on-node parallelism of modern supercomputers. In this paper, we present a hybrid parallel implementation of the widely used LAMMPS/ReaxC package, where the construction of bonded and nonbonded lists and evaluation of complex ReaxFF interactions are implemented efficiently using OpenMP parallelism. Additionally, the performance of the QEq charge equilibration scheme is examined and a dual-solver is implemented. We present the performance of the resulting ReaxC-OMP package on a state-of-the-art multi-core architecture Mira, an IBM BlueGene/Q supercomputer. For system sizes ranging from 32 thousand to 16.6 million particles, speedups in the range of 1.5-4.5x are observed using the new ReaxC-OMP software. Sustained performance improvements have been observed for up to 262,144 cores (1,048,576 processes) of Mira with a weak scaling efficiency of 91.5% in larger simulations containing 16.6 million particles.
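The two figures of merit quoted, speedup from hybridisation and weak-scaling efficiency, are simple ratios worth stating precisely: under weak scaling the problem size grows with the core count, so the ideal time per step is flat and efficiency is the reference time divided by the time at scale. A sketch with hypothetical timings (not the measured values from the paper):

```python
def speedup(t_baseline, t_optimized):
    """Speedup of an optimized run over a baseline, on the same problem."""
    return t_baseline / t_optimized

def weak_scaling_efficiency(t_ref, t_at_scale):
    """Weak scaling: problem size grows with cores, ideal time is constant,
    so efficiency = T(reference) / T(at scale)."""
    return t_ref / t_at_scale

# Hypothetical seconds-per-MD-step timings, for illustration only:
s = speedup(12.0, 3.2)                       # e.g. MPI-only vs hybrid MPI/OpenMP
eff = weak_scaling_efficiency(1.00, 1.09)    # ~0.92, i.e. ~92% efficiency
```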

  20. High aspect ratio problem in simulation of a fault current limiter based on superconducting tapes

    Energy Technology Data Exchange (ETDEWEB)

    Velichko, A V; Coombs, T A [Electrical Engineering Division, University of Cambridge (United Kingdom)

    2006-06-15

    We are offering a solution for the high-aspect-ratio problem relevant to the numerical simulation of AC loss in superconductors and metals with high aspect (width-to-thickness) ratio. This is particularly relevant to simulation of fault current limiters (FCLs) based on second generation YBCO tapes on RABiTS. By assuming a linear scaling of the electric and thermal properties with the size of the structure, we can replace the real sample with an effective sample of a reduced aspect ratio by introducing size multipliers into the equations that govern the physics of the system. The simulation is performed using both a proprietary equivalent circuit software and a commercial FEM software. The correctness of the procedure is verified by simulating temperature and current distributions for samples with all three dimensions varying within 10⁻³-10³ of the original size. Qualitatively the distributions for the original and scaled samples are indistinguishable, whereas quantitative differences in the worst case do not exceed 10%.
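The essence of the size-multiplier trick is that certain combinations of geometry and material properties can be rescaled together without changing the governing solution. A deliberately simplified 1-D illustration (not the paper's full electro-thermal model): scaling a layer's thickness and its conductivity by the same factor leaves the conduction resistance, and hence the temperature drop, unchanged.

```python
def thermal_resistance(length, conductivity, area):
    """1-D conduction resistance: R = L / (k * A)."""
    return length / (conductivity * area)

# A thin, wide sample vs. an "effective" sample of reduced aspect ratio:
# thickness scaled up by s, conductivity scaled up by s as well.
# (Illustrative numbers; the real FCL model couples electric and thermal
# equations and scales several properties consistently.)
s = 100.0
r_original = thermal_resistance(length=1e-6, conductivity=10.0, area=4e-3)
r_scaled = thermal_resistance(length=1e-6 * s, conductivity=10.0 * s, area=4e-3)
```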

  2. The current role of simulators for performance evaluations and licensing (case of Mexico)

    International Nuclear Information System (INIS)

    Maldonado A, H.

    1997-01-01

    The main purpose of this paper is to share the experience acquired by the National Commission of Nuclear Safety and Safeguards (CNSNS) in administering both certification and licensing operational exams to Senior Reactor Operators (SRO) and Reactor Operators (RO) using a full-scope simulator. The licensing operational exams are administered to examine candidates for an SRO or RO license, while the certification operational exams are administered to all personnel who hold an SRO or RO license in order to renew their licenses within a six-year period. A general description is given of the most important simulator milestones, from the initial authorization to use it for the ''Initial Simulator Training Course'' up to the present, when installation and performance testing of new computer equipment has begun that will improve and extend the simulation capability of the Laguna Verde Nuclear Power Plant (LVNPP) simulator. The paper also describes the process the CNSNS will apply during the next verification of the simulator certification, which must be performed because the old computer equipment is being replaced with more modern equipment to improve the simulation capability. The verification process was discussed with the utility personnel and, as a result, an agreement has been established to carry out this demanding task. Finally, conclusions and recommendations from the regulator's point of view are presented regarding the importance of performing a thorough evaluation and verification of simulator performance. (author)

  3. The experiences of last-year student midwives with High-Fidelity Perinatal Simulation training: A qualitative descriptive study.

    Science.gov (United States)

    Vermeulen, Joeri; Beeckman, Katrien; Turcksin, Rivka; Van Winkel, Lies; Gucciardo, Léonardo; Laubach, Monika; Peersman, Wim; Swinnen, Eva

    2017-06-01

    Simulation training is a powerful and evidence-based teaching method in healthcare. It allows students to develop essential competences that are often difficult to achieve during internships. High-Fidelity Perinatal Simulation exposes them to real-life scenarios in a safe environment. Although student midwives' experiences need to be considered to make the simulation training work, these have been overlooked so far. To explore the experiences of last-year student midwives with High-Fidelity Perinatal Simulation training. A qualitative descriptive study, using three focus group conversations with last-year student midwives (n=24). Audio tapes were transcribed and a thematic content analysis was performed. The entire data set was coded according to recurrent or common themes. To achieve investigator triangulation and confirm themes, discussions among the researchers were incorporated in the analysis. Students found High-Fidelity Perinatal Simulation training to be a positive learning method that increased both their competence and confidence. Their experiences varied over the different phases of the High-Fidelity Perinatal Simulation training. Although uncertainty, tension, confusion and disappointment were experienced throughout the simulation trajectory, they reported that this did not affect their learning and confidence-building. As High-Fidelity Perinatal Simulation training constitutes a helpful learning experience in midwifery education, it could have a positive influence on maternal and neonatal outcomes. In the long term, it could therefore enhance the midwifery profession in several ways. The present study is an important first step in opening up the debate about the pedagogical use of High-Fidelity Perinatal Simulation training within midwifery education. Copyright © 2017 Australian College of Midwives. Published by Elsevier Ltd. All rights reserved.

  4. HIGH-FIDELITY SIMULATION-DRIVEN MODEL DEVELOPMENT FOR COARSE-GRAINED COMPUTATIONAL FLUID DYNAMICS

    Energy Technology Data Exchange (ETDEWEB)

    Hanna, Botros N.; Dinh, Nam T.; Bolotnov, Igor A.

    2016-06-01

    Nuclear reactor safety analysis requires identifying various credible accident scenarios and determining their consequences. For a full-scale nuclear power plant system behavior, it is impossible to obtain sufficient experimental data for a broad range of risk-significant accident scenarios. In single-phase flow convective problems, Direct Numerical Simulation (DNS) and Large Eddy Simulation (LES) can provide us with high-fidelity results when physical data are unavailable. However, these methods are computationally expensive and cannot be afforded for simulation of long transient scenarios in nuclear accidents despite extraordinary advances in high performance scientific computing over the past decades. The major issue is the inability to make the transient computation parallel, which makes the number of time steps required by high-fidelity methods unaffordable for long transients. In this work, we propose to apply a high-fidelity simulation-driven approach to model sub-grid scale (SGS) effects in Coarse-Grained Computational Fluid Dynamics (CG-CFD). This approach aims to develop a statistical surrogate model instead of a deterministic SGS model. We chose to start with a turbulent natural convection case with volumetric heating in a horizontal fluid layer with a rigid, insulated lower boundary and an isothermal (cold) upper boundary. This scenario of unstable stratification is relevant to turbulent natural convection in a molten corium pool during a severe nuclear reactor accident, as well as in containment mixing and passive cooling. The presented approach demonstrates how to create a correction for the CG-CFD solution by modifying the energy balance equation. A global correction for the temperature equation proves to achieve a significant improvement to the prediction of the steady-state temperature distribution through the fluid layer.
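The proposed statistical surrogate amounts to learning a correction term from paired coarse-grid features and high-fidelity residuals. A toy stand-in for that idea, fitting a linear correction by ordinary least squares on synthetic data (the numbers and the linear form are illustrative, not from the study, which does not specify this model):

```python
def fit_linear(xs, ys):
    """Ordinary least squares for y ~ a*x + b, in closed form."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

# Synthetic stand-in: paired (coarse-grid feature, high-fidelity residual)
# samples, roughly residual = 2 * feature plus noise.
coarse_feature = [0.1, 0.4, 0.7, 1.0, 1.3]
hf_residual = [0.22, 0.81, 1.38, 2.05, 2.61]
a, b = fit_linear(coarse_feature, hf_residual)
# The fitted (a, b) would then supply the correction term added to the
# coarse-grid energy balance equation.
```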

  5. Performance of highly connected photonic switching lossless metro-access optical networks

    Science.gov (United States)

    Martins, Indayara Bertoldi; Martins, Yara; Barbosa, Felipe Rudge

    2018-03-01

    The present work analyzes the performance of photonic switching networks, optical packet switching (OPS) and optical burst switching (OBS), in mesh topologies of different sizes and configurations. The "lossless" photonic switching node is based on a semiconductor optical amplifier, demonstrated and validated with experimental results on optical power gain, noise figure, and spectral range. The network performance was evaluated through computer simulations based on parameters such as average number of hops, optical packet loss fraction, and optical transport delay. The combination of these elements leads to a consistent account of performance, in terms of network traffic and packet delivery, for OPS and OBS metropolitan networks. Results show that highly connected mesh topologies combined with an ingress e-buffer deliver high efficiency and throughput, with very low packet loss and low latency, ensuring fast data delivery to the final receiver.
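Average hop count, one of the performance parameters used here, is a shortest-path average over the topology. A sketch computing it for a small regular mesh by breadth-first search (a generic grid, not the authors' specific network):

```python
from collections import deque

def average_hops(n):
    """Average shortest-path hop count over all distinct node pairs of an
    n x n mesh (4-neighbour grid), via BFS from every node."""
    def neighbours(x, y):
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            if 0 <= x + dx < n and 0 <= y + dy < n:
                yield x + dx, y + dy
    total, pairs = 0, 0
    for sx in range(n):
        for sy in range(n):
            dist = {(sx, sy): 0}
            q = deque([(sx, sy)])
            while q:
                x, y = q.popleft()
                for nb in neighbours(x, y):
                    if nb not in dist:
                        dist[nb] = dist[(x, y)] + 1
                        q.append(nb)
            total += sum(dist.values())
            pairs += len(dist) - 1      # distinct pairs only
    return total / pairs

avg = average_hops(4)    # 4x4 mesh: average hop count is 8/3
```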

  6. In Patients With Cirrhosis, Driving Simulator Performance Is Associated With Real-life Driving.

    Science.gov (United States)

    Lauridsen, Mette M; Thacker, Leroy R; White, Melanie B; Unser, Ariel; Sterling, Richard K; Stravitz, Richard T; Matherly, Scott; Puri, Puneet; Sanyal, Arun J; Gavis, Edith A; Luketic, Velimir; Siddiqui, Muhammad S; Heuman, Douglas M; Fuchs, Michael; Bajaj, Jasmohan S

    2016-05-01

    Minimal hepatic encephalopathy (MHE) has been linked to higher real-life rates of automobile crashes and poor performance in driving simulation studies, but the link between driving simulator performance and real-life automobile crashes has not been clearly established. Furthermore, not all patients with MHE are unsafe drivers, but it is unclear how to distinguish them from unsafe drivers. We investigated the link between performance on driving simulators and real-life automobile accidents and traffic violations. We also aimed to identify features of unsafe drivers with cirrhosis and evaluated changes in simulated driving skills and MHE status after 1 year. We performed a study of outpatients with cirrhosis (n = 205; median 55 years old; median model for end-stage liver disease score, 9.5; none with overt hepatic encephalopathy or alcohol or illicit drug use within previous 6 months) seen at the Virginia Commonwealth University and McGuire Veterans Administration Medical Center, from November 2008 through April 2014. All participants were given paper-pencil tests to diagnose MHE (98 had MHE; 48%), and 163 patients completed a standardized driving simulation. Data were collected on traffic violations and automobile accidents from the Virginia Department of Motor Vehicles and from participants' self-assessments when they entered the study, and from 73 participants 1 year later. Participants also completed a questionnaire about alcohol use and cessation patterns. The driving simulator measured crashes, run-time, road center and edge excursions, and illegal turns during navigation; before and after each driving simulation session, patients were asked to rate their overall driving skills. Drivers were classified as safe or unsafe based on crashes and violations reported on official driving records; simulation results were compared with real-life driving records. Multivariable regression analyses of real-life crashes and violations was performed using data on

  7. The effects of anticipating a high-stress task on sleep and performance during simulated on-call work.

    Science.gov (United States)

    Sprajcer, Madeline; Jay, Sarah M; Vincent, Grace E; Vakulin, Andrew; Lack, Leon; Ferguson, Sally A

    2018-04-22

    On-call work is used to manage around-the-clock working requirements in a variety of industries. Often, tasks that must be performed while on-call are highly important, difficult and/or stressful by nature and, as such, may impact the level of anxiety that is experienced by on-call workers. Heightened anxiety is associated with poor sleep, which affects next-day cognitive performance. Twenty-four male participants (20-35 years old) spent an adaptation, a control and two counterbalanced on-call nights in a time-isolated sleep laboratory. On one of the on-call nights they were told that they would be required to do a speech upon waking (high-stress condition), whereas on the other night they were instructed that they would be required to read to themselves (low-stress condition). Pre-bed anxiety was measured by the State-Trait Anxiety Inventory Form X-1, and polysomnography and quantitative electroencephalogram analyses were used to investigate sleep. Performance was assessed across each day using the 10-min psychomotor vigilance task (09:30 hours, 12:00 hours, 14:30 hours, 17:00 hours). The results indicated that participants experienced no significant changes in pre-bed anxiety or sleep between conditions. However, performance on the psychomotor vigilance task was best in the high-stress condition, possibly as a result of heightened physiological arousal caused by performing the stressful task that morning. This suggests that performing a high-stress task may be protective of cognitive performance to some degree when sleep is not disrupted. © 2018 European Sleep Research Society.
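Psychomotor vigilance task data are conventionally summarised by mean reaction time, lapse count (responses of 500 ms or more) and mean response speed. A sketch over hypothetical trial data (not the study's measurements):

```python
def pvt_summary(reaction_times_ms):
    """Common PVT summaries: mean reaction time (ms), lapse count
    (RT >= 500 ms), and mean response speed (1/RT, in 1/s)."""
    n = len(reaction_times_ms)
    mean_rt = sum(reaction_times_ms) / n
    lapses = sum(1 for rt in reaction_times_ms if rt >= 500)
    mean_speed = sum(1000.0 / rt for rt in reaction_times_ms) / n
    return mean_rt, lapses, mean_speed

# Hypothetical trial reaction times in ms, for illustration only:
rts = [250, 310, 290, 505, 260, 720, 240, 300]
mean_rt, lapses, mean_speed = pvt_summary(rts)
```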

  8. High Performance Computation of a Jet in Crossflow by Lattice Boltzmann Based Parallel Direct Numerical Simulation

    Directory of Open Access Journals (Sweden)

    Jiang Lei

    2015-01-01

Direct numerical simulation (DNS) of a round jet in crossflow based on the lattice Boltzmann method (LBM) is carried out on a multi-GPU cluster. The data-parallel SIMT (single instruction multiple thread) characteristic of the GPU matches the parallelism of the LBM well, which leads to the high efficiency of the GPU-based LBM solver. With the present GPU configuration (6 Nvidia Tesla K20M cards), the DNS simulation can be completed in several hours. A grid system of 1.5 × 10^8 nodes is adopted and the largest jet Reynolds number reaches 3000. The jet-to-free-stream velocity ratio is set to 3.3. The jet is orthogonal to the mainstream flow direction. The validated code shows good agreement with experiments. Vortical structures, including the counter-rotating vortex pair (CRVP), shear-layer vortices and horseshoe vortices, are presented and analyzed based on velocity fields and vorticity distributions. Turbulent statistical quantities of Reynolds stress are also displayed. Coherent structures are revealed in a very fine resolution based on the second invariant of the velocity gradients.
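In an LBM-BGK solver of the kind described above, the physical Reynolds number fixes the lattice relaxation time. A minimal sketch of this parameter bookkeeping follows; the lattice velocity and the number of cells across the jet diameter are invented for illustration and are not taken from the paper:

```python
# Lattice-Boltzmann (BGK) parameter bookkeeping: for a chosen lattice
# velocity U and characteristic length D (in cells), the Reynolds number
# fixes the lattice viscosity nu = U*D/Re and hence the BGK relaxation
# time tau = 3*nu + 0.5 (D2Q9/D3Q19 convention, lattice units, cs^2 = 1/3).

def bgk_relaxation_time(reynolds, u_lattice, d_cells):
    nu = u_lattice * d_cells / reynolds   # lattice kinematic viscosity
    return 3.0 * nu + 0.5

# Hypothetical resolution: jet diameter resolved by 60 cells, U = 0.05.
tau = bgk_relaxation_time(3000.0, 0.05, 60.0)
print(round(tau, 4))  # stability requires tau > 0.5
```

As the sketch shows, high Reynolds numbers push tau toward the stability limit of 0.5, which is why DNS at Re = 3000 demands the fine grids (and hence the GPU throughput) discussed in the abstract.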

  9. High-resolution global climate modelling: the UPSCALE project, a large-simulation campaign

    Directory of Open Access Journals (Sweden)

    M. S. Mizielinski

    2014-08-01

The UPSCALE (UK on PRACE: weather-resolving Simulations of Climate for globAL Environmental risk) project constructed and ran an ensemble of HadGEM3 (Hadley Centre Global Environment Model 3) atmosphere-only global climate simulations over the period 1985–2011, at resolutions of N512 (25 km), N216 (60 km) and N96 (130 km), as used in current global weather forecasting, seasonal prediction and climate modelling respectively. Alongside these present-climate simulations, a parallel ensemble looking at extremes of future climate was run, using a time-slice methodology to consider conditions at the end of this century. These simulations were primarily performed using a 144 million core hour, single year grant of computing time from PRACE (the Partnership for Advanced Computing in Europe) in 2012, with additional resources supplied by the Natural Environment Research Council (NERC) and the Met Office. Almost 400 terabytes of simulation data were generated on the HERMIT supercomputer at the High Performance Computing Center Stuttgart (HLRS), and transferred to the JASMIN super-data cluster provided by the Science and Technology Facilities Council Centre for Data Archival (STFC CEDA) for analysis and storage. In this paper we describe the implementation of the project, present the technical challenges in terms of optimisation, data output, transfer and storage that such a project involves, and include details of the model configuration and the composition of the UPSCALE data set. This data set is available for scientific analysis to allow assessment of the value of model resolution in both present and potential future climate conditions.

  10. Pharmacy practice simulations: performance of senior pharmacy students at a University in southern Brazil

    Directory of Open Access Journals (Sweden)

    Galato D

    2011-09-01

Objective: A simulation process known as the objective structured clinical examination (OSCE) was applied to assess the pharmacy practice of senior pharmacy students. Methods: A cross-sectional study was conducted based on documentary analysis of performance evaluation records of pharmacy practice simulations that occurred between 2005 and 2009. These simulations were related to the processes of self-medication and dispensing, and were performed with the use of simulated patients. The simulations were filmed to facilitate the evaluation process. The paper presents the OSCE educational experience of pharmacy trainees at the University of Southern Santa Catarina, as assessed by two evaluators. The students' general performance was analyzed, and the criteria for pharmacy practice assessment often identified trainees in difficulty. Results: The results of 291 simulations showed that students had an average performance score of 70.0%. Several difficulties were encountered, such as lack of information about the selected/prescribed treatment regimen (65.1%), inadequate communication style (21.9%), lack of identification of patients' needs (7.7%) and inappropriate drug selection for self-medication (5.3%). Conclusions: These data show that there is a need for reorientation of clinical pharmacy students: they need to improve their communication skills and acquire a deeper knowledge of medicines and health problems in order to properly orient their patients.

  11. Integrated Simulation for HVAC Performance Prediction: State-of-the-Art Illustration

    NARCIS (Netherlands)

    Hensen, J.L.M.; Clarke, J.A.

    2000-01-01

This paper aims to outline the current state-of-the-art in integrated building simulation for performance prediction of heating, ventilating and air-conditioning (HVAC) systems. The ESP-r system is used as an example where integrated simulation is a core philosophy behind the development.

  12. Simulation of the behaviour of nuclear fuel under high burnup conditions

    International Nuclear Information System (INIS)

    Soba, Alejandro; Lemes, Martin; González, Martin Emilio; Denis, Alicia; Romero, Luis

    2014-01-01

Highlights: • Increasing the residence time of nuclear fuel in the reactor generates a high burnup structure. • We analyze models to simulate high burnup scenarios for UO2 nuclear fuel. • We include these models in the DIONISIO 2.0 code. • Tests of our models show very good agreement with experimental data. • We extend the range of predictability of our code up to 60 MWd/kgU average burnup. - Abstract: In this paper we summarize the models included in the latest version of the DIONISIO code related to the high burnup scenario. Owing to the extended permanence of nuclear fuels under irradiation, physical and chemical modifications develop in the fuel material, especially in the outer rim of the pellet. Codes devoted to simulating rod behaviour under irradiation need to incorporate new models in order to describe those phenomena and be capable of predicting behaviour over the whole operating range of a typical pressurized water reactor. A complex group of subroutines has been included in the code in order to predict the radial distribution of power density, burnup, concentration of diverse nuclides and porosity within the pellet. The behaviour of gadolinium as a burnable poison is also modelled in the code. The results of some of the simulations performed with DIONISIO are presented to show the good agreement with the data selected from the FUMEX I/II/III exercises, compiled in the NEA data bank.

  13. Development and testing of high performance pseudo random number generator for Monte Carlo simulation

    International Nuclear Information System (INIS)

    Chakraborty, Brahmananda

    2009-01-01

Random numbers play an important role in any Monte Carlo simulation. The accuracy of the results depends on the quality of the sequence of random numbers employed in the simulation: randomness of the numbers, uniformity of their distribution, absence of correlation and a long period. In a typical Monte Carlo simulation of particle transport in a nuclear reactor core, the history of a particle from its birth in a fission event until its death by an absorption or leakage event is tracked. The geometry of the core and the surrounding materials is exactly modeled in the simulation. To track a neutron history one needs random numbers for determining the inter-collision distance, the nature of the collision, the direction of the scattered neutron, etc. Neutrons are tracked in batches; in one batch approximately 2000-5000 neutrons are tracked. The statistical accuracy of the results of the simulation depends on the total number of particles tracked (the number of particles in one batch multiplied by the number of batches). The number of histories to be generated is usually large for a typical radiation transport problem. To track a very large number of histories one needs to generate a long sequence of independent random numbers. In other words, the cycle length of the random number generator (RNG) should exceed the total number of random numbers required for simulating the given transport problem. The word size of the machine generally limits the cycle length: for a binary machine of p bits the maximum cycle length is 2^p. To achieve a longer cycle on the same machine one has to use either multi-word (register) arithmetic or bit manipulation techniques.
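The cycle-length consideration above can be made concrete with a toy multiplicative congruential generator. The modulus and multiplier below are illustrative small values chosen for the demonstration, not the parameters of any production RNG:

```python
# Toy multiplicative congruential generator (MCG): x_{n+1} = a * x_n mod m.
# Parameters are deliberately tiny; production Monte Carlo codes use much
# larger moduli (e.g. 2^59 or 2^61 - 1) so that the cycle length exceeds
# the number of random draws needed by the whole simulation.

def mcg(seed, a=5, m=2**13 - 1):
    """Yield x_1, x_2, ... of the recurrence starting from seed."""
    x = seed
    while True:
        x = (a * x) % m
        yield x

def cycle_length(seed, a=5, m=2**13 - 1):
    """Count steps until the generator state first repeats."""
    gen = mcg(seed, a, m)
    first = next(gen)
    length = 1
    for x in gen:
        if x == first:
            return length
        length += 1

# If a simulation draws more numbers than this, the "random" neutron
# histories start repeating.
print(cycle_length(1))
```

For a prime modulus m the period is the multiplicative order of a, which divides m - 1; exceeding it silently reuses the same sequence, which is exactly the failure mode the abstract warns about.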

  14. School physics teacher class management, laboratory practice, student engagement, critical thinking, cooperative learning and use of simulations effects on student performance

    Science.gov (United States)

    Riaz, Muhammad

The purpose of this study was to examine how class management, laboratory practice, student engagement, critical thinking, cooperative learning, and use of simulations predicted the percentage of students achieving a grade point average of B or higher and their academic performance as reported by teachers in secondary school physics classes. The target population consisted of secondary school physics teachers who were members of Science, Technology, Engineering and Mathematics Teachers of New York City (STEMteachersNYC) and the American Modeling Teachers Association (AMTA), and who used simulations in their physics classes in the 2013 and 2014 school years. Subjects for this study were volunteers. A survey about instructional practice in physics was constructed based on a literature review, and eighty-two physics teachers completed it. All respondents were anonymous. Classroom management was the only predictor of the percentage of students achieving a grade point average of B or higher in high school physics class. Cooperative learning, use of simulations, and student engagement were predictors of teachers' views of student academic performance in high school physics class. All other variables -- class management, laboratory practice, critical thinking, and teacher self-efficacy -- were not predictors of teachers' views of student academic performance in high school physics class. The implications of these findings are discussed and recommendations for physics teachers to improve student learning are presented.

  15. High fidelity simulation effectiveness in nursing students' transfer of learning.

    Science.gov (United States)

    Kirkman, Tera R

    2013-07-13

Members of nursing faculty are utilizing interactive teaching tools to improve nursing students' clinical judgment; one method that has been found to be potentially effective is high-fidelity simulation (HFS). The purpose of this time-series design study was to determine whether undergraduate nursing students were able to transfer knowledge and skills learned from classroom lecture and an HFS clinical to the traditional clinical setting. Students (n=42) were observed and rated on their ability to perform a respiratory assessment. The observations and ratings took place at the bedside, prior to a respiratory lecture, following the respiratory lecture, and following the simulation clinical. The findings indicated that there was a significant difference (p < 0.001) in transfer of learning demonstrated over time. Transfer of learning was demonstrated, and the use of HFS was found to be an effective learning and teaching method. Implications of the results are discussed.

  16. A One System Integrated Approach to Simulant Selection for Hanford High Level Waste Mixing and Sampling Tests - 13342

    Energy Technology Data Exchange (ETDEWEB)

    Thien, Mike G. [Washington River Protection Solutions, LLC, P.O Box 850, Richland WA, 99352 (United States); Barnes, Steve M. [Waste Treatment Plant, 2435 Stevens Center Place, Richland WA 99354 (United States)

    2013-07-01

The Hanford Tank Operations Contractor (TOC) and the Hanford Waste Treatment and Immobilization Plant (WTP) contractor are both engaged in demonstrating mixing, sampling, and transfer system capabilities using simulated Hanford High-Level Waste (HLW) formulations. This represents one of the largest remaining technical issues with the high-level waste treatment mission at Hanford. Previous testing has focused on very specific TOC or WTP test objectives, and consequently the simulants were narrowly focused on those test needs. A key attribute in the Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 2010-2 is to ensure testing is performed with a simulant that represents the broad spectrum of Hanford waste. The One System Integrated Project Team is a new joint TOC and WTP organization intended to ensure technical integration of specific TOC and WTP systems and testing. A new approach to simulant definition has been mutually developed that will meet both TOC and WTP test objectives for the delivery and receipt of HLW. The process used to identify critical simulant characteristics, incorporate lessons learned from previous testing, and identify specific simulant targets that ensure TOC and WTP testing addresses the broad spectrum of Hanford waste characteristics important to mixing, sampling, and transfer performance is described. (authors)

  18. Alcohol consumption for simulated driving performance: A systematic review

    Directory of Open Access Journals (Sweden)

    Mohammad Saeid Rezaee-Zavareh

    2017-06-01

Conclusion: Alcohol consumption may impair simulated driving performance, relative to no alcohol, via changes in SDSD, LPSD, speed, MLPD, LC and NA. More well-designed randomized controlled clinical trials are recommended.

  19. Very high performance pseudo-random number generation on DAP

    Science.gov (United States)

    Smith, K. A.; Reddaway, S. F.; Scott, D. M.

    1985-07-01

Since the National DAP Service began at QMC in 1980, extensive use has been made of pseudo-random numbers in Monte Carlo simulation. Matrices of uniform numbers have been produced by various generators: (a) multiplicative (x_{n+1} = 13^13 x_n mod 2^59); (b) very long period shift register (x^4423 + x^271 + 1); (c) multiple shorter-period (x^127 + x^7 + 1) shift registers generating several matrices per iteration. The above uniform generators can also feed a normal distribution generator that uses the Box-Muller transformation. This paper describes briefly the generators, their implementation and speed. Generator (b) has been greatly speeded up by re-implementation, and now produces more than 100 × 10^6 high quality 16-bit numbers/s. Generator (c) is under development and will achieve even higher performance, mainly due to producing data in greater bulk. High quality numbers are expected, and performance will range from 400 to 800 × 10^6 numbers/s, depending on how the generator is used.
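The multiplicative recurrence and the Box-Muller stage described above can be sketched in a few lines. This is a scalar Python illustration of the arithmetic only (the DAP implementation was massively parallel), and the seed choice is arbitrary:

```python
import math

# Sketch of the DAP-style pipeline above: the multiplicative congruential
# recurrence x_{n+1} = 13^13 * x_n (mod 2^59) yields uniform variates,
# which the Box-Muller transformation turns into normal variates.

A = 13 ** 13
M = 2 ** 59

def uniform_stream(seed=1):
    """Uniform variates in (0, 1); an odd seed keeps the state nonzero."""
    x = seed
    while True:
        x = (A * x) % M
        yield x / M

def box_muller(uniforms):
    """Turn pairs of uniform variates into standard normal variates."""
    it = iter(uniforms)
    while True:
        u1, u2 = next(it), next(it)
        r = math.sqrt(-2.0 * math.log(u1))
        yield r * math.cos(2.0 * math.pi * u2)
        yield r * math.sin(2.0 * math.pi * u2)
```

Each pair of uniforms yields two independent normals, which is convenient when, as on the DAP, numbers are produced as whole matrices at a time.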

  20. Enhancement of High-Intensity Actions and Physical Performance During a Simulated Brazilian Jiu-Jitsu Competition With a Moderate Dose of Caffeine.

    Science.gov (United States)

    Diaz-Lara, Francisco Javier; Del Coso, Juan; Portillo, Javier; Areces, Francisco; García, Jose Manuel; Abián-Vicén, Javier

    2016-10-01

Although caffeine is one of the most commonly used substances in combat sports, information about its ergogenic effects on these disciplines is very limited. To determine the effectiveness of ingesting a moderate dose of caffeine to enhance overall performance during a simulated Brazilian jiu-jitsu (BJJ) competition. Fourteen elite BJJ athletes participated in a double-blind, placebo-controlled experimental design. In a random order, the athletes ingested either 3 mg/kg body mass of caffeine or a placebo (cellulose, 0 mg/kg) and performed 2 simulated BJJ combats (with 20 min rest between them), following official BJJ rules. Specific physical tests such as maximal handgrip dynamometry, maximal height during a countermovement jump, permanence during a maximal static-lift test, peak power in a bench-press exercise, and blood lactate concentration were measured at 3 specific times: before the first combat and immediately after the first and second combats. The combats were video-recorded to analyze fight actions. After the caffeine ingestion, participants spent more time in offensive actions in both combats and revealed higher blood lactate values (P < .05). Performance in all physical tests carried out before the first combat was enhanced with caffeine (P < .05) compared with placebo. Caffeine might be an effective ergogenic aid for improving intensity and physical performance during successive elite BJJ combats.

  1. Performance of the general circulation models in simulating temperature and precipitation over Iran

    Science.gov (United States)

    Abbasian, Mohammadsadegh; Moghim, Sanaz; Abrishamchi, Ahmad

    2018-03-01

General Circulation Models (GCMs) are advanced tools for impact assessment and climate change studies. Previous studies show that the performance of GCMs in simulating climate variables varies significantly over different regions. This study evaluates the performance of the Coupled Model Intercomparison Project phase 5 (CMIP5) GCMs in simulating temperature and precipitation over Iran. Simulations from 37 GCMs and observations from the Climatic Research Unit (CRU) were obtained for the period 1901-2005. Six statistical measures (mean bias, root mean square error (RMSE), Nash-Sutcliffe efficiency (NSE), linear correlation coefficient (r), Kolmogorov-Smirnov statistic (KS) and Sen's slope estimator), together with the Taylor diagram, are used for the evaluation. GCMs are ranked based on each statistic at seasonal and annual time scales. Results show that most GCMs perform reasonably well in simulating the annual and seasonal temperature over Iran. The majority of the GCMs have poor skill in simulating precipitation, particularly at the seasonal scale. Based on the results, the best GCMs to represent temperature and precipitation over Iran are CMCC-CMS (Euro-Mediterranean Center on Climate Change) and MRI-CGCM3 (Meteorological Research Institute), respectively. The results are valuable for climate and hydrometeorological studies and can help water resources planners and managers choose the proper GCM based on their criteria.
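Several of the ranking statistics named above are one-liners. The following sketch, with invented numbers rather than data from the study, shows how a single model-observation pair would be scored:

```python
import math

# Illustrative computation of four of the evaluation measures named above
# (mean bias, RMSE, Nash-Sutcliffe efficiency, linear correlation) for a
# single model-observation pair; the data values are invented for the demo.

def skill_scores(sim, obs):
    n = len(sim)
    mean_obs = sum(obs) / n
    mean_sim = sum(sim) / n
    bias = sum(s - o for s, o in zip(sim, obs)) / n
    sq_err = sum((s - o) ** 2 for s, o in zip(sim, obs))
    rmse = math.sqrt(sq_err / n)
    nse = 1.0 - sq_err / sum((o - mean_obs) ** 2 for o in obs)
    cov = sum((s - mean_sim) * (o - mean_obs) for s, o in zip(sim, obs))
    var_s = sum((s - mean_sim) ** 2 for s in sim)
    var_o = sum((o - mean_obs) ** 2 for o in obs)
    r = cov / math.sqrt(var_s * var_o)
    return {"bias": bias, "rmse": rmse, "nse": nse, "r": r}

obs = [10.0, 12.0, 15.0, 14.0, 11.0]   # hypothetical observed temperature
sim = [11.0, 12.5, 14.0, 15.0, 10.5]   # hypothetical GCM simulation
scores = skill_scores(sim, obs)
print(scores)
```

A perfect simulation scores bias = 0, RMSE = 0, NSE = 1 and r = 1; NSE drops below 0 when the model is a worse predictor than the observed mean, which is the sense in which the study ranks "poor skill" for precipitation.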

  2. High-Performance Data Converters

    DEFF Research Database (Denmark)

    Steensgaard-Madsen, Jesper

…-resolution internal D/A converters are required. Unit-element mismatch-shaping D/A converters are analyzed, and the concept of mismatch-shaping is generalized to include scaled-element D/A converters. Several types of scaled-element mismatch-shaping D/A converters are proposed. Simulations show that, when implemented in a standard CMOS technology, they can be designed to yield 100 dB performance at 10 times oversampling. The proposed scaled-element mismatch-shaping D/A converters are well suited for use as the feedback stage in oversampled delta-sigma quantizers. It is, however, not easy to make full use of their potential. …-order difference of the output signal from the loop filter's first integrator stage. This technique avoids the need for accurate matching of analog and digital filters that characterizes the MASH topology, and it preserves the signal-band suppression of quantization errors. Simulations show that quantizers …
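Unit-element mismatch shaping of the kind analyzed above is commonly illustrated by data-weighted averaging (DWA), in which the elements are used in rotation so that each element's static mismatch error is first-order shaped. The sketch below uses invented element weights and input codes, and shows the generic DWA selection logic rather than the scaled-element scheme proposed in the thesis:

```python
# Minimal data-weighted averaging (DWA) sketch for a unit-element DAC:
# elements are used in rotating order, so static element mismatch is
# first-order noise-shaped instead of appearing as a fixed code-dependent
# error. Element weights and input codes are invented for the example.

def dwa_select(code, n_elements, pointer):
    """Indices of the elements used for this code, plus the updated
    rotation pointer."""
    idx = [(pointer + k) % n_elements for k in range(code)]
    return idx, (pointer + code) % n_elements

def dac_output(code, weights, pointer):
    """Analog output for one code: the sum of the selected elements."""
    idx, pointer = dwa_select(code, len(weights), pointer)
    return sum(weights[i] for i in idx), pointer

# 8 nominally unit elements with small static mismatch:
weights = [1.00, 1.02, 0.99, 1.01, 0.98, 1.00, 1.03, 0.97]

pointer = 0
outputs = []
for code in [3, 5, 2, 6, 3, 5]:   # hypothetical input codes
    y, pointer = dac_output(code, weights, pointer)
    outputs.append(y)
print(outputs)
```

Because the rotation pointer advances by the code value, every element is exercised equally often on average, pushing the mismatch error to high frequencies where the oversampled system filters it out.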

  3. Computer simulation of steady-state performance of air-to-air heat pumps

    Energy Technology Data Exchange (ETDEWEB)

    Ellison, R D; Creswick, F A

    1978-03-01

    A computer model by which the performance of air-to-air heat pumps can be simulated is described. The intended use of the model is to evaluate analytically the improvements in performance that can be effected by various component improvements. The model is based on a trio of independent simulation programs originated at the Massachusetts Institute of Technology Heat Transfer Laboratory. The three programs have been combined so that user intervention and decision making between major steps of the simulation are unnecessary. The program was further modified by substituting a new compressor model and adding a capillary tube model, both of which are described. Performance predicted by the computer model is shown to be in reasonable agreement with performance data observed in our laboratory. Planned modifications by which the utility of the computer model can be enhanced in the future are described. User instructions and a FORTRAN listing of the program are included.

  4. Novel high-fidelity realistic explosion damage simulation for urban environments

    Science.gov (United States)

    Liu, Xiaoqing; Yadegar, Jacob; Zhu, Youding; Raju, Chaitanya; Bhagavathula, Jaya

    2010-04-01

Realistic building damage simulation has a significant impact on modern modeling and simulation systems, especially in the wide range of military and civil applications where these systems are used for personnel training, critical mission planning, disaster management, etc. Realistic building damage simulation should incorporate accurate physics-based explosion models, rubble generation, rubble flyout, and interactions between flying rubble and surrounding entities. However, none of the existing building damage simulation systems faithfully realizes the level of realism required for effective military applications. In this paper, we present a novel physics-based, high-fidelity and runtime-efficient explosion simulation system to realistically simulate destruction to buildings. In the proposed system, a family of novel blast models is applied to accurately and realistically simulate explosions based on static and/or dynamic detonation conditions. The system also takes account of rubble pile formation and applies a generic and scalable multi-component-based object representation to describe scene entities, and a highly scalable agent-subsumption architecture and scheduler to schedule clusters of sequential and parallel events. The proposed system utilizes a highly efficient and scalable tetrahedral decomposition approach to realistically simulate rubble formation. Experimental results demonstrate that the proposed system has the capability to realistically simulate rubble generation, rubble flyout and their primary and secondary impacts on surrounding objects including buildings, constructions, vehicles and pedestrians in clusters of sequential and parallel damage events.

  5. Measurement and numerical simulation of high intensity focused ultrasound field in water

    Science.gov (United States)

    Lee, Kang Il

    2017-11-01

    In the present study, the acoustic field of a high intensity focused ultrasound (HIFU) transducer in water was measured by using a commercially available needle hydrophone intended for HIFU use. To validate the results of hydrophone measurements, numerical simulations of HIFU fields were performed by integrating the axisymmetric Khokhlov-Zabolotskaya-Kuznetsov (KZK) equation from the frequency-domain perspective with the help of a MATLAB-based software package developed for HIFU simulation. Quantitative values for the focal waveforms, the peak pressures, and the size of the focal spot were obtained in various regimes of linear, quasilinear, and nonlinear propagation up to the source pressure levels when the shock front was formed in the waveform. The numerical results with the HIFU simulator solving the KZK equation were compared with the experimental data and found to be in good agreement. This confirms that the numerical simulation based on the KZK equation is capable of capturing the nonlinear pressure field of therapeutic HIFU transducers well enough to make it suitable for HIFU treatment planning.

  6. Three-Dimensional Unsteady Simulation of Aerodynamics and Heat Transfer in a Modern High Pressure Turbine Stage

    Science.gov (United States)

    Shyam, Vikram; Ameri, Ali

    2009-01-01

    Unsteady 3-D RANS simulations have been performed on a highly loaded transonic turbine stage and results are compared to steady calculations as well as to experiment. A low Reynolds number k-epsilon turbulence model is employed to provide closure for the RANS system. A phase-lag boundary condition is used in the tangential direction. This allows the unsteady simulation to be performed by using only one blade from each of the two rows. The objective of this work is to study the effect of unsteadiness on rotor heat transfer and to glean any insight into unsteady flow physics. The role of the stator wake passing on the pressure distribution at the leading edge is also studied. The simulated heat transfer and pressure results agreed favorably with experiment. The time-averaged heat transfer predicted by the unsteady simulation is higher than the heat transfer predicted by the steady simulation everywhere except at the leading edge. The shock structure formed due to stator-rotor interaction was analyzed. Heat transfer and pressure at the hub and casing were also studied. Thermal segregation was observed that leads to the heat transfer patterns predicted by steady and unsteady simulations to be different.

  7. Simulation and rubrics: technology and grading student performance in nurse anesthesia education.

    Science.gov (United States)

    Overstreet, Maria; McCarver, Lewis; Shields, John; Patterson, Jordan

    2015-06-01

The use of simulation technology has introduced a challenge for simulation nurse educators: evaluation of student performance. Such evaluation has been subjective and in need of improvement, and it is imperative to provide learners with clear and consistent information about the expectations for their performance. Educators use objectives to define for the learner the primary focus of the learning activities. Creating rubrics to replace checklists for evaluating learner performance is a team task. Improved rubrics help instructors provide valuable immediate and post-activity feedback, promote consistency among instructors, and improve inter-rater reliability. Copyright © 2015 Elsevier Inc. All rights reserved.

  8. Learning through simulated independent practice leads to better future performance in a simulated crisis than learning through simulated supervised practice.

    Science.gov (United States)

    Goldberg, A; Silverman, E; Samuelson, S; Katz, D; Lin, H M; Levine, A; DeMaria, S

    2015-05-01

Anaesthetists may fail to recognize and manage certain rare intraoperative events. Simulation has been shown to be an effective educational adjunct to typical operating room-based education to train for these events. It is yet unclear, however, why simulation has any benefit. We hypothesize that learners who are allowed to manage a scenario independently and allowed to fail, thus causing simulated morbidity, will consequently perform better when re-exposed to a similar scenario. Using a randomized, controlled, observer-blinded design, 24 first-year residents were exposed to an oxygen pipeline contamination scenario, either where patient harm occurred (independent group, n=12) or where a simulated attending anaesthetist intervened to prevent harm (supervised group, n=12). Residents were brought back 6 months later and exposed to a different scenario (pipeline contamination) with the same end point. Participants' proper treatment, time to diagnosis, and non-technical skills (measured using the Anaesthetists' Non-Technical Skills Checklist, ANTS) were measured. No participants provided proper treatment in the initial exposure. In the repeat encounter 6 months later, 67% in the independent group vs 17% in the supervised group resumed adequate oxygen delivery (P=0.013). The independent group also had better ANTS scores [median (interquartile range): 42.3 (31.5-53.1) vs 31.3 (21.6-41), P=0.015]. There was no difference in time to treatment if proper management was provided [602 (490-820) vs 610 (420-800) s, P=0.79]. Allowing residents to practise independently in the simulation laboratory, and subsequently allowing them to fail, can be an important part of simulation-based learning. This is not feasible in real clinical practice but appeared to improve resident performance in this study. The purposeful use of independent practice and its potentially negative outcomes thus sets simulation-based learning apart from traditional operating room learning.

  9. A high-orbit collimating infrared earth simulator

    International Nuclear Information System (INIS)

    Zhang Guoyu; Jiang Huilin; Fang Yang; Yu Huadong; Xu Xiping; Wang, Lingyun; Liu Xuli; Huang Lan; Yue Shixin; Peng Hui

    2007-01-01

The earth simulator is the most important ground-based test equipment for the infrared earth sensor, and it is also a key component in the satellite control system. For three orbit heights, 18000 km, 35786 km and 42000 km, we adopt a design based on collimation with replaceable earth diaphragms and develop a high-orbit collimating earth simulator. This simulator can provide three field angles, 15.19°, 17.46° and 30.42°, simulating on the ground the earth as it would be seen from space by the satellite. In this paper we describe in detail the components and overall structure of the earth simulator and the method of testing its earth field angles. The germanium collimation lens is the most important component of the earth simulator. According to the optical configuration parameters of the germanium collimation lens, we determine the location and size of the earth diaphragm and the hot earth by theoretical analysis and optical calculation, which provides the design basis for the study of the earth simulator. The earth angle is the index that characterizes the precision of the earth simulator. We tested the three angles by experiment, and the results indicate that the errors of all three angles are less than ±0.05°.

  10. Review of Methods Related to Assessing Human Performance in Nuclear Power Plant Control Room Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Katya L Le Blanc; Ronald L Boring; David I Gertman

    2001-11-01

    With the increased use of digital systems in Nuclear Power Plant (NPP) control rooms comes a need to thoroughly understand the human performance issues associated with digital systems. A common way to evaluate human performance is to test operators and crews in NPP control room simulators; however, it is often challenging to characterize human performance in meaningful ways in such simulations. A review of the literature on NPP simulator studies reveals a variety of ways to measure human performance in control room simulations, including direct observation, automated computer logging, recordings from physiological equipment, self-report techniques, protocol analysis and structured debriefs, and application of model-based evaluation. These methods and the particular measures used are summarized and evaluated.

  11. Prior video game utilization is associated with improved performance on a robotic skills simulator.

    Science.gov (United States)

    Harbin, Andrew C; Nadhan, Kumar S; Mooney, James H; Yu, Daohai; Kaplan, Joshua; McGinley-Hence, Nora; Kim, Andrew; Gu, Yiming; Eun, Daniel D

    2017-09-01

    Laparoscopic surgery and robotic surgery, two forms of minimally invasive surgery (MIS), have recently experienced a large increase in utilization. Prior studies have shown that video game experience (VGE) may be associated with improved laparoscopic surgery skills; however, similar data supporting a link between VGE and proficiency on a robotic skills simulator (RSS) are lacking. The objective of our study was to determine whether the volume or timing of VGE had any impact on RSS performance. Pre-clinical medical students completed a comprehensive questionnaire detailing previous VGE across several time periods. Seventy-five subjects were ultimately evaluated in 11 training exercises on the daVinci Si Skills Simulator. RSS skill was measured by overall score, time to completion, economy of motion, average instrument collisions, and improvement in Ring Walk 3 score. Using nonparametric tests and linear regression, these metrics were analyzed for systematic differences between non-users, light, and heavy video game users based on their volume of use in each of four time periods: the past 3 months, past year, past 3 years, and high school. Univariate analyses revealed significant differences between heavy users and non-users in all five performance metrics. These differences weakened as the period of VGE receded further into the past. Our study showed a positive association between video game experience and robotic skills simulator performance that is stronger for more recent periods of video game use. The findings may have important implications for the evolution of robotic surgery training.
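    The group comparisons above rely on nonparametric tests. The core of such a test, the Mann-Whitney U statistic, is simple enough to compute directly; the sketch below uses made-up scores (in practice one would use a library routine such as scipy.stats.mannwhitneyu, which also supplies the p-value):

```python
def mann_whitney_u(group_a, group_b):
    """Mann-Whitney U statistic for two independent samples: count,
    over every (a, b) pair, how often a exceeds b; ties count 0.5."""
    u = 0.0
    for a in group_a:
        for b in group_b:
            if a > b:
                u += 1.0
            elif a == b:
                u += 0.5
    return u

# Hypothetical simulator overall scores for heavy users vs non-users.
heavy = [88, 92, 85, 90, 95]
non = [70, 78, 74, 82, 69]
u = mann_whitney_u(heavy, non)
print(f"U = {u} of {len(heavy) * len(non)} possible")
```

A U equal to the number of pairs means every heavy-user score exceeded every non-user score; a value near half the pair count would indicate no group difference.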

  12. High-reliability emergency response teams in the hospital: improving quality and safety using in situ simulation training.

    Science.gov (United States)

    Wheeler, Derek S; Geis, Gary; Mack, Elizabeth H; LeMaster, Tom; Patterson, Mary D

    2013-06-01

    In situ simulation training is a team-based training technique conducted on actual patient care units using equipment and resources from that unit, and involving actual members of the healthcare team. We describe our experience with in situ simulation training in a major children's medical centre. In situ simulations were conducted using standardised scenarios approximately twice per month on inpatient hospital units on a rotating basis. Simulations were scheduled so that each unit participated in at least two in situ simulations per year. Simulations were conducted on a revolving schedule alternating on the day and night shifts and were unannounced. Scenarios were preselected to maximise the educational experience, and frequently involved clinical deterioration to cardiopulmonary arrest. We performed 64 of the scheduled 112 (57%) in situ simulations on all shifts and all units over 21 months. We identified 134 latent safety threats and knowledge gaps during these in situ simulations, which we categorised as medication, equipment, and/or resource/system threats. Identification of these errors resulted in modification of systems to reduce the risk of error. In situ simulations also provided a method to reinforce teamwork behaviours, such as the use of assertive statements, role clarity, performance of frequent updating, development of a shared mental model, performance of independent double checks of high-risk medicines, and overcoming authority gradients between team members. Participants stated that the training programme was effective and did not disrupt patient care. In situ simulations can identify latent safety threats, identify knowledge gaps, and reinforce teamwork behaviours when used as part of an organisation-wide safety programme.

  13. Patterns of communication in high-fidelity simulation.

    Science.gov (United States)

    Anderson, Judy K; Nelson, Kimberly

    2015-01-01

    High-fidelity simulation is commonplace in nursing education. However, critical thinking, decision making, and psychomotor skills scenarios are emphasized. Scenarios involving communication occur in interprofessional or intraprofessional settings. The importance of effective nurse-patient communication is reflected in statements from the American Nurses Association and Quality and Safety Education for Nurses, and in the graduate outcomes of most nursing programs. This qualitative study examined the patterns of communication observed in video recordings of a medical-surgical scenario with 71 senior students in a baccalaureate program. Thematic analysis revealed patterns of (a) focusing on tasks, (b) communicating-in-action, and (c) being therapeutic. Additional categories under the patterns included missing opportunities, viewing the "small picture," relying on informing, speaking in "medical tongues," offering choices…okay?, feeling uncomfortable, and using therapeutic techniques. The findings suggest the importance of using high-fidelity simulation to develop expertise in communication. In addition, the findings reinforce the recommendation to prioritize communication aspects of scenarios and debriefing for all simulations. Copyright 2015, SLACK Incorporated.

  14. Improved performance of maternal-fetal medicine staff after maternal cardiac arrest simulation-based training.

    Science.gov (United States)

    Fisher, Nelli; Eisen, Lewis A; Bayya, Jyothshna V; Dulu, Alina; Bernstein, Peter S; Merkatz, Irwin R; Goffman, Dena

    2011-09-01

    To determine the impact of simulation-based maternal cardiac arrest training on performance, knowledge, and confidence among Maternal-Fetal Medicine staff. Maternal-Fetal Medicine staff (n = 19) participated in a maternal arrest simulation program. Based on evaluation of performance during initial simulations, an intervention was designed that included a basic life support course, a lecture on pregnancy modifications to advanced cardiac life support, and simulation practice. Postintervention evaluative simulations were then performed. All simulations included a knowledge test, a confidence survey, and debriefing. A checklist with 9 pregnancy modification (maternal) and 16 critical care tasks (25 total) was used for scoring. Postintervention scores reflected statistically significant improvement. Maternal-Fetal Medicine staff demonstrated statistically significant improvement in timely initiation of cardiopulmonary resuscitation (120 vs 32 seconds, P = .042) and cesarean delivery (240 vs 159 seconds, P = .017). Prompt initiation of cardiopulmonary resuscitation and application of pregnancy modifications are critical to maternal and fetal survival during cardiac arrest. Simulation is a useful tool for Maternal-Fetal Medicine staff to improve skills, knowledge, and confidence in the management of this catastrophic event. Published by Mosby, Inc.

  15. Simulation and high performance computing-Building a predictive capability for fusion

    International Nuclear Information System (INIS)

    Strand, P.I.; Coelho, R.; Coster, D.; Eriksson, L.-G.; Imbeaux, F.; Guillerminet, Bernard

    2010-01-01

    The Integrated Tokamak Modelling Task Force (ITM-TF) is developing an infrastructure in which validation needs, formulated in terms of multi-device data access and detailed physics comparisons and aiming at the inclusion of synthetic diagnostics in the simulation chain, are key components. As the activity and the modelling tools are aimed at general use, although focused on ITER plasmas, a device-independent approach to data transport and a standardized approach to data management (data structures, naming, and access) are being developed in order to allow cross-validation between different fusion devices using a single toolset. Extensive work has already gone, and continues to go, into the development of standardized descriptions of the data (Consistent Physical Objects). The longer-term aim is a complete simulation platform that is expected to last, and to be extended in different ways, over the coming 30 years. The technical underpinning is therefore of vital importance. In particular, the platform needs to be extensible and open-ended, able to take full advantage not only of today's most advanced technologies but also of future developments. As full-scale comprehensive prediction of ITER physics rapidly becomes expensive in terms of computing resources, the simulation framework needs to be able to use both grid and HPC computing facilities. Hence data access and code-coupling technologies are required to be available for a heterogeneous, possibly distributed, environment. The developments in this area are pursued in a separate project, EUFORIA (EU Fusion for ITER Applications), which provides about 15 professional person-years (ppy) per annum from 14 different institutes. The range and size of the activity is not only technically challenging but also presents some unique management challenges, in that a large and geographically distributed team (a truly pan-European set of researchers) needs to be coordinated on a fairly detailed

  16. Mixing-to-eruption timescales: an integrated model combining numerical simulations and high-temperature experiments with natural melts

    Science.gov (United States)

    Montagna, Chiara; Perugini, Diego; De Campos, Christina; Longo, Antonella; Dingwell, Donald Bruce; Papale, Paolo

    2015-04-01

    Arrival of magma from depth into shallow reservoirs and the associated mixing processes have been documented as possible triggers of explosive eruptions. Quantifying the time from the beginning of mixing to eruption is of fundamental importance in volcanology in order to place constraints on the possible onset of a new eruption. Here we integrate numerical simulations and high-temperature experiments performed with natural melts with the aim of identifying mixing-to-eruption timescales. We performed two-dimensional numerical simulations of the arrival of gas-rich magmas into shallow reservoirs, solving the fluid dynamics of the two interacting magmas and evaluating the space-time evolution of the physical properties of the mixture. Convection and mingling develop quickly in the chamber and the feeding conduit/dyke. Over timescales of hours, the magmas in the reservoir appear to have mingled throughout, and convective patterns become harder to identify. High-temperature magma mixing experiments were performed in a centrifuge using basaltic and phonolitic melts from Campi Flegrei (Italy) as initial end-members. Concentration Variance Decay (CVD), an inevitable consequence of magma mixing, is exponential in time. The rate of CVD is a powerful new geochronometer for the time from mixing to eruption/quenching. The mingling-to-eruption times of three explosive volcanic eruptions from Campi Flegrei (Italy) are on the order of tens of minutes. These results agree with the numerical simulations, which suggest a maximum mixing time of a few hours to obtain a hybrid mixture. We show that the integration of numerical simulations and high-temperature experiments can provide unprecedented results about mixing processes in volcanic systems. The combined application of numerical simulations and the CVD geochronometer to the eruptive products of active volcanoes could be decisive for hazard mitigation during volcanic unrest.
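    The CVD geochronometer described above rests on the concentration variance of a chemical component decaying exponentially during mixing, σ²(t) = σ²(0)·exp(−Rt); once the rate R is calibrated experimentally, a variance ratio measured in the erupted products can be inverted for the mixing time. A minimal sketch (the rate and variance values are illustrative assumptions, not the paper's data):

```python
import math

def mixing_time(var_initial, var_observed, decay_rate):
    """Invert exponential concentration-variance decay (CVD),
    sigma2(t) = sigma2(0) * exp(-R * t), for the mixing time t."""
    return math.log(var_initial / var_observed) / decay_rate

# Illustrative numbers only: a decay rate R as might be calibrated from
# centrifuge mixing experiments, and a variance ratio as might be
# measured in the erupted products.
R = 0.0015  # per second (assumed)
t = mixing_time(var_initial=1.0, var_observed=0.2, decay_rate=R)
print(f"mixing-to-eruption time ~ {t / 60:.0f} minutes")
```

With these assumed values the inferred time falls in the tens-of-minutes range reported for the Campi Flegrei eruptions.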

  17. Numerical Simulation of the Dynamic Performance of the Ceramic Material Affected by Different Strain Rate and Porosity

    International Nuclear Information System (INIS)

    Wang Zhen; Mei, H; Lai, X; Liu, L S; Zhai, P C; Cao, D F

    2013-01-01

    Ceramic materials are frequently used in protective armor applications for their low density, high elastic modulus, and high strength. They may be subjected to ballistic impact in many situations, and many studies have therefore explored ways to improve the mechanical properties of ceramic materials. Real manufactured materials, however, are full of defects, which give rise to variable fracture and damage behavior; the defects should therefore be taken into account when simulations are performed. In this paper, the dynamic properties of a ceramic material (Al2O3) as affected by different strain rates (500–5000) and porosities (below 5%) are investigated. First, the effect of strain rate was studied using different loading velocities. Then, compression simulations were performed with different porosities and random distributions of pore size and location in the ceramic material. Crack extension and failure modes are observed to describe the dynamic mechanical behavior.
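    The porosity set-up described above, random pore sizes and locations at a prescribed volume fraction, can be sketched as a toy 2-D grid generator (a stand-in for the paper's actual mesh preparation; all parameters are assumptions):

```python
import random

def porous_grid(nx, ny, porosity, max_pore_radius=3, seed=0):
    """Return an nx-by-ny boolean grid where True marks pore cells,
    placing circular pores of random size and location until the
    target porosity (pore-cell fraction) is reached."""
    rng = random.Random(seed)
    grid = [[False] * ny for _ in range(nx)]
    target = int(porosity * nx * ny)
    pores = 0
    while pores < target:
        cx, cy = rng.randrange(nx), rng.randrange(ny)
        r = rng.randint(1, max_pore_radius)
        for i in range(max(0, cx - r), min(nx, cx + r + 1)):
            for j in range(max(0, cy - r), min(ny, cy + r + 1)):
                if not grid[i][j] and (i - cx) ** 2 + (j - cy) ** 2 <= r * r:
                    grid[i][j] = True
                    pores += 1
    return grid

g = porous_grid(100, 100, porosity=0.05)
frac = sum(map(sum, g)) / (100 * 100)
print(f"achieved porosity: {frac:.3f}")
```

The achieved fraction slightly overshoots the target because whole pores are added; changing the seed gives a different random microstructure at the same porosity, which is how porosity studies of this kind sample defect variability.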

  18. High Fidelity In Situ Shoulder Dystocia Simulation

    Directory of Open Access Journals (Sweden)

    Andrew Pelikan, MD

    2018-04-01

    Audience: Resident physicians, emergency department (ED) staff. Introduction: Precipitous deliveries are high-acuity, low-occurrence events in most emergency departments. Shoulder dystocia is a rare but potentially fatal complication of labor that can be relieved by specific maneuvers, which must be implemented in a timely manner. This simulation is designed to educate resident learners on the critical management steps for a shoulder dystocia presenting to the emergency department. A special aspect of this simulation is its unique use of the "Noelle" model with an instructing physician at the bedside, maneuvering the fetus through the stations of labor and providing subtle adjustments to fetal positioning that are not possible with a mechanized model. A literature search for "shoulder dystocia simulation" yields primarily obstetrics and midwifery journals, many of which use various mannequin models. None of the reviewed articles used a bedside provider maneuvering the fetus with the Noelle model, making this method unique. Although the Noelle model is equipped with a remote-controlled motor that automatically rotates and delivers the baby either to the head or to the shoulders, can produce a turtle sign, and will prevent delivery of the baby until signaled to do so by the instructor, using the bedside-instructor method allows this simulation to be reproduced with less mechanically advanced and lower-cost models.1-5 Objectives: At the end of this simulation, learners will: (1) recognize impending delivery and mobilize appropriate resources (ie, both obstetrics [OB] and NICU/pediatrics); (2) identify risk factors for shoulder dystocia based on history and physical; (3) recognize shoulder dystocia during delivery; (4) demonstrate maneuvers to relieve shoulder dystocia; (5) communicate with team members and nursing staff during resuscitation of a critically ill patient. Method: High-fidelity simulation. Topics: High fidelity, in situ, Noelle model.

  19. Psychophysiological Assessment in Pilots Performing Challenging Simulated and Real Flight Maneuvers.

    Science.gov (United States)

    Johannes, Bernd; Rothe, Stefanie; Gens, André; Westphal, Soeren; Birkenfeld, Katja; Mulder, Edwin; Rittweger, Jörn; Ledderhos, Carla

    2017-09-01

    The objective assessment of psychophysiological arousal during challenging flight maneuvers is of great interest to aerospace medicine but remains a challenging task. In the study presented here, a vector-methodological approach was used that integrates different psychophysiological variables into an integral arousal index, the Psychophysiological Arousal Value (PAV). The arousal levels of 15 male pilots were assessed during predetermined, well-defined flight maneuvers performed under simulated and real flight conditions. The physiological data, as expected, revealed inter- and intra-individual differences across the measurement conditions. As indicated by the PAV, air-to-air refueling (AAR) turned out to be the most challenging task. In general, arousal levels were comparable between simulator and real flight conditions. However, a distinct difference was observed when the pilots were divided by instructors into two groups based on their proficiency in AAR with AWACS (AAR-Novices vs AAR-Professionals). AAR-Novices had on average more than 2000 flight hours on other aircraft. They showed higher arousal reactions to AAR in real flight (contact: PAV score 8.4 ± 0.37) than under simulator conditions (7.1 ± 0.30), whereas AAR-Professionals did not (8.5 ± 0.46 vs 8.8 ± 0.80). The PAV assessment was tested in field measurements, yielding quantifiable arousal differences between proficiency groups of pilots under simulated and real flight conditions. The method used in this study allows an evaluation of the psychophysiological cost of a given flying performance and is thus potentially a valuable tool for objectively evaluating the actual skill status of pilots. Johannes B, Rothe S, Gens A, Westphal S, Birkenfeld K, Mulder E, Rittweger J, Ledderhos C. Psychophysiological assessment in pilots performing challenging simulated and real flight maneuvers. Aerosp Med Hum Perform. 2017; 88(9):834-840.
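    The abstract does not give the PAV formula. A common way to build such an integral index from several physiological channels is to z-score each channel against a calibration baseline and take the magnitude of the resulting vector; the sketch below shows that generic vector approach with invented numbers, not the authors' actual algorithm:

```python
import math
import statistics

def arousal_index(baseline: dict, sample: dict) -> float:
    """Combine several physiological channels into one scalar:
    z-score each channel against its baseline recordings, then
    take the Euclidean norm of the z-score vector."""
    z = []
    for channel, values in baseline.items():
        mu = statistics.mean(values)
        sd = statistics.stdev(values)
        z.append((sample[channel] - mu) / sd)
    return math.sqrt(sum(v * v for v in z))

# Hypothetical resting baseline vs. an in-flight sample.
baseline = {
    "heart_rate": [62, 65, 63, 64, 61],
    "resp_rate": [14, 15, 14, 13, 14],
    "skin_conduct": [2.1, 2.3, 2.0, 2.2, 2.1],
}
inflight = {"heart_rate": 95, "resp_rate": 22, "skin_conduct": 4.8}
print(f"arousal index: {arousal_index(baseline, inflight):.1f}")
```

The vector form makes channels with different units commensurable before they are combined, which is the essential idea behind any such integral arousal value.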

  20. High performance visual display for HENP detectors

    CERN Document Server

    McGuigan, M; Spiletic, J; Fine, V; Nevski, P

    2001-01-01

    A high-end visual display for High Energy Nuclear Physics (HENP) detectors is necessary because of the sheer size and complexity of the detectors. For BNL this display will be of special interest because of STAR and ATLAS. To load, rotate, query, and debug simulation code with a modern detector simply takes too long even on a powerful workstation. To visualize HENP detectors with maximal performance we have developed software with the following characteristics. We develop a visual display of HENP detectors on a BNL multiprocessor visualization server at multiple levels of detail. We work with a general and generic detector framework consistent with ROOT, GAUDI, etc., to avoid conflicting with the many graphics development groups associated with specific detectors such as STAR and ATLAS. We develop advanced OpenGL features such as transparency and polarized stereoscopy. We enable collaborative viewing of detectors and events by directly running the analysis in the BNL stereoscopic theatre. We construct enhanced interactiv...