WorldWideScience

Sample records for computer simulation experiments

  1. Quantum chemistry simulation on quantum computers: theories and experiments.

    Science.gov (United States)

    Lu, Dawei; Xu, Boruo; Xu, Nanyang; Li, Zhaokai; Chen, Hongwei; Peng, Xinhua; Xu, Ruixue; Du, Jiangfeng

    2012-07-14

    It has been claimed that quantum computers can mimic quantum systems efficiently, with resource requirements that scale only polynomially in the system size. Traditionally, such simulations are carried out numerically on classical computers, which are inevitably confronted with exponential growth in the required resources as the size of the quantum system increases. Quantum computers avoid this problem, and thus provide a possible solution for large quantum systems. In this paper, we first discuss the ideas of quantum simulation, the background of quantum simulators, their categories, and their development in both theory and experiment. We then present a brief introduction to quantum chemistry as evaluated on classical computers, followed by typical procedures for quantum simulation applied to quantum chemistry. We review not only theoretical proposals but also proof-of-principle experimental implementations on a small quantum computer, including the evaluation of static molecular eigenenergies and the simulation of chemical reaction dynamics. Although experimental development still lags behind theory, we give prospects and suggestions for future experiments. We anticipate that in the near future quantum simulation will become a powerful tool for quantum chemistry, surpassing classical computation.
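
    The phase-estimation approach to eigenenergy evaluation mentioned above can be emulated classically in a few lines: the eigenenergies of H appear as peaks in the Fourier spectrum of the autocorrelation C(t) = <psi|exp(-iHt)|psi>. The sketch below demonstrates the principle on an assumed toy 2x2 Hamiltonian with numpy; it illustrates the idea, not the paper's experimental procedure.

    ```python
    # Classical emulation of the phase-estimation idea behind quantum
    # simulation of chemistry: eigenenergies show up as spectral peaks of
    # C(t) = <psi|exp(-iHt)|psi>. Toy Hamiltonian, illustrative only.
    import numpy as np

    H = np.array([[1.0, 0.5],
                  [0.5, -1.0]])                # assumed toy "molecular" Hamiltonian
    psi = np.array([1.0, 1.0]) / np.sqrt(2.0)  # trial state overlapping both eigenstates

    E, V = np.linalg.eigh(H)                   # exact reference answer
    amps = np.abs(V.T @ psi) ** 2              # |<k|psi>|^2

    dt, n = 0.1, 4096
    t = dt * np.arange(n)
    C = np.exp(-1j * np.outer(t, E)) @ amps    # sum_k |<k|psi>|^2 exp(-i E_k t)

    freqs = np.fft.fftfreq(n, d=dt)            # cycles per unit time; peak at f = -E/(2 pi)
    S = np.abs(np.fft.fft(C))
    E_hi = -2 * np.pi * freqs[freqs < 0][np.argmax(S[freqs < 0])]
    E_lo = -2 * np.pi * freqs[freqs > 0][np.argmax(S[freqs > 0])]

    print("exact eigenvalues:", E)
    print("resolution-limited estimates from spectral peaks:", sorted([E_lo, E_hi]))
    ```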

  2. Reproducible computational biology experiments with SED-ML - the Simulation Experiment Description Markup Language.

    Science.gov (United States)

    Waltemath, Dagmar; Adams, Richard; Bergmann, Frank T; Hucka, Michael; Kolpakov, Fedor; Miller, Andrew K; Moraru, Ion I; Nickerson, David; Sahle, Sven; Snoep, Jacky L; Le Novère, Nicolas

    2011-12-15

    The increasing use of computational simulation experiments to inform modern biological research creates new challenges to annotate, archive, share and reproduce such experiments. The recently published Minimum Information About a Simulation Experiment (MIASE) proposes a minimal set of information that should be provided to allow the reproduction of simulation experiments among users and software tools. In this article, we present the Simulation Experiment Description Markup Language (SED-ML). SED-ML encodes in a computer-readable exchange format the information required by MIASE to enable reproduction of simulation experiments. It has been developed as a community project and is defined in a detailed technical specification, together with an XML schema. The version of SED-ML described in this publication is Level 1 Version 1. It covers the description of the most frequent type of simulation experiments in the area, namely time-course simulations. SED-ML documents specify which models to use in an experiment, modifications to apply to the models before using them, which simulation procedures to run on each model, what analysis results to output, and how the results should be presented. These descriptions are independent of the underlying model implementation. SED-ML is a software-independent format for encoding the description of simulation experiments; it is not specific to particular simulation tools. Here, we demonstrate that with the growing software support for SED-ML we can effectively exchange executable simulation descriptions. With SED-ML, software can exchange simulation experiment descriptions, enabling the validation and reuse of simulation experiments in different tools. Authors of papers reporting simulation experiments can make their simulation protocols available for other scientists to reproduce the results. Because SED-ML is agnostic about the exact modeling language(s) used, experiments covering models from different fields of research
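
    As a rough illustration of what such a description contains, the sketch below assembles a minimal SED-ML-like time-course document with Python's standard library. The element names (model, uniformTimeCourse, task) follow the general Level 1 Version 1 layout, but the referenced model file is hypothetical and the output is a skeleton, not a schema-validated document.

    ```python
    # Skeleton of a SED-ML-like time-course description, built with the
    # standard library. Element names follow the general L1V1 layout;
    # "oscillator.xml" is a hypothetical SBML model file.
    import xml.etree.ElementTree as ET

    root = ET.Element("sedML", level="1", version="1")

    models = ET.SubElement(root, "listOfModels")
    ET.SubElement(models, "model", id="model1",
                  language="urn:sedml:language:sbml",
                  source="oscillator.xml")

    sims = ET.SubElement(root, "listOfSimulations")
    ET.SubElement(sims, "uniformTimeCourse", id="sim1",
                  initialTime="0", outputStartTime="0",
                  outputEndTime="100", numberOfPoints="1000")

    tasks = ET.SubElement(root, "listOfTasks")
    ET.SubElement(tasks, "task", id="task1",
                  modelReference="model1", simulationReference="sim1")

    print(ET.tostring(root, encoding="unicode"))
    ```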

  3. Reproducible computational biology experiments with SED-ML - The Simulation Experiment Description Markup Language

    Science.gov (United States)

    2011-01-01

    Background: The increasing use of computational simulation experiments to inform modern biological research creates new challenges to annotate, archive, share and reproduce such experiments. The recently published Minimum Information About a Simulation Experiment (MIASE) proposes a minimal set of information that should be provided to allow the reproduction of simulation experiments among users and software tools. Results: In this article, we present the Simulation Experiment Description Markup Language (SED-ML). SED-ML encodes in a computer-readable exchange format the information required by MIASE to enable reproduction of simulation experiments. It has been developed as a community project and is defined in a detailed technical specification, together with an XML schema. The version of SED-ML described in this publication is Level 1 Version 1. It covers the description of the most frequent type of simulation experiments in the area, namely time-course simulations. SED-ML documents specify which models to use in an experiment, modifications to apply to the models before using them, which simulation procedures to run on each model, what analysis results to output, and how the results should be presented. These descriptions are independent of the underlying model implementation. SED-ML is a software-independent format for encoding the description of simulation experiments; it is not specific to particular simulation tools. Here, we demonstrate that with the growing software support for SED-ML we can effectively exchange executable simulation descriptions. Conclusions: With SED-ML, software can exchange simulation experiment descriptions, enabling the validation and reuse of simulation experiments in different tools. Authors of papers reporting simulation experiments can make their simulation protocols available for other scientists to reproduce the results. Because SED-ML is agnostic about the exact modeling language(s) used, experiments covering models from

  4. Computer simulation of Wheeler's delayed-choice experiment with photons

    NARCIS (Netherlands)

    Zhao, S.; Yuan, S.; De Raedt, H.; Michielsen, K.

    We present a computer simulation model of Wheeler's delayed-choice experiment that is a one-to-one copy of an experiment reported recently (Jacques V. et al., Science, 315 (2007) 966). The model is solely based on experimental facts, satisfies Einstein's criterion of local causality and does not

  5. Large scale statistics for computational verification of grain growth simulations with experiments

    International Nuclear Information System (INIS)

    Demirel, Melik C.; Kuprat, Andrew P.; George, Denise C.; Straub, G.K.; Misra, Amit; Alexander, Kathleen B.; Rollett, Anthony D.

    2002-01-01

    It is known that by controlling microstructural development, desirable material properties can be achieved. The main objective of our research is to understand and control interface-dominated material properties and, finally, to verify experimental results with computer simulations. We have previously shown a strong similarity between small-scale grain growth experiments and anisotropic three-dimensional simulations obtained from Electron Backscattered Diffraction (EBSD) measurements. Using the same technique, we obtained 5170-grain data from an aluminum film (120 μm thick) with a columnar grain structure. The experimentally obtained starting microstructure and grain boundary properties are input to the three-dimensional grain growth simulation. In the computational model, minimization of the interface energy is the driving force for grain boundary motion. The computed evolved microstructure is compared with the final experimental microstructure after annealing at 550 °C. Characterization of the structures and properties of grain boundary networks (GBN) to produce desirable microstructures is one of the fundamental problems in interface science. Research is ongoing to develop new experimental and analytical techniques for obtaining and synthesizing information related to GBNs. The grain boundary energy and mobility data were characterized by the EBSD technique and Atomic Force Microscopy (AFM) observations (i.e., for ceramic MgO and for the metal Al). Grain boundary energies are extracted from triple junction (TJ) geometry considering the local equilibrium condition at TJs. Relative boundary mobilities were also extracted from TJs through a statistical/multiscale analysis. Additionally, there have been recent theoretical developments on grain boundary evolution in microstructures. In this paper, a new technique for three-dimensional grain growth simulations was used to simulate interface migration
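
    For readers unfamiliar with energy-minimizing grain growth simulation, the sketch below shows the generic isotropic 2-D Monte Carlo Potts model, in which boundary migration emerges from accepting grain-ID flips that do not increase the boundary energy. It is a toy stand-in for illustration, not the anisotropic, EBSD-calibrated three-dimensional method used in this work.

    ```python
    # Generic 2-D Potts-model sketch of curvature-driven grain growth: a
    # site adopts a neighbour's grain ID when that does not increase the
    # total boundary energy. Isotropic toy model, NOT the anisotropic,
    # experimentally calibrated simulation described above.
    import numpy as np

    rng = np.random.default_rng(0)
    N, Q, attempts = 64, 32, 200_000     # lattice size, grain IDs, MC attempts
    grid = rng.integers(Q, size=(N, N))  # random initial microstructure

    def unlike_neighbours(g, i, j, spin):
        """Boundary energy of site (i, j) if it held grain ID `spin`."""
        nbrs = (g[(i - 1) % N, j], g[(i + 1) % N, j],
                g[i, (j - 1) % N], g[i, (j + 1) % N])
        return sum(spin != s for s in nbrs)

    for _ in range(attempts):
        i, j = rng.integers(N, size=2)
        di, dj = rng.choice([(-1, 0), (1, 0), (0, -1), (0, 1)])
        new = grid[(i + di) % N, (j + dj) % N]   # candidate: a neighbour's ID
        if unlike_neighbours(grid, i, j, new) <= unlike_neighbours(grid, i, j, grid[i, j]):
            grid[i, j] = new                     # accept: energy not increased

    print("distinct grain IDs remaining:", len(np.unique(grid)))
    ```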

  6. Computer simulation of ductile fracture

    International Nuclear Information System (INIS)

    Wilkins, M.L.; Streit, R.D.

    1979-01-01

    Finite difference computer simulation programs are capable of very accurate solutions to problems in plasticity with large deformations and rotations. This opens the possibility of developing models of ductile fracture by correlating experiments with equivalent computer simulations. Selected experiments were done to emphasize different aspects of the model. A difficult problem is the establishment of a fracture-size effect. This paper is a study of the strain field around notched tensile specimens of aluminum 6061-T651. A series of geometrically scaled specimens is tested to fracture. The scaled experiments are conducted for different notch radius-to-diameter ratios. The strains at fracture are determined from computer simulations. An estimate is made of the fracture-size effect.

  7. Developments of multibody system dynamics: computer simulations and experiments

    International Nuclear Information System (INIS)

    Yoo, Wan-Suk; Kim, Kee-Nam; Kim, Hyun-Woo; Sohn, Jeong-Hyun

    2007-01-01

    It is an exceptional success that multibody dynamics researchers have made the Multibody System Dynamics journal one of the most highly ranked journals in the field over the last 10 years. In the inaugural issue, Professor Schiehlen wrote an interesting article explaining the roots and perspectives of multibody system dynamics. Professor Shabana also wrote an interesting article reviewing developments in flexible multibody dynamics. The application possibilities of multibody system dynamics have grown wider and deeper, with many application examples using multibody techniques having been introduced in the past 10 years. In this paper, the development of multibody dynamics is briefly reviewed and several applications of multibody dynamics are described based on the author's research results. Simulation examples are compared to physical experiments, which show the reasonableness and accuracy of the multibody formulation applied to real problems. Computer simulations using the absolute nodal coordinate formulation (ANCF) were also compared to physical experiments, demonstrating the validity of ANCF for large-displacement and large-deformation problems. The physical experiments for large-deformation problems include a beam, a plate, a chain, and a strip. Other research topics currently being carried out in the author's laboratory are also briefly explained

  8. Computer-simulated experiments and computer games: a method of design analysis

    Directory of Open Access Journals (Sweden)

    Jerome J. Leary

    1995-12-01

    Through the new modularization of the undergraduate science degree at the University of Brighton, larger numbers of students are choosing to take science modules which include an amount of laboratory practical work. Indeed, within energy studies, the fuels and combustion module, for which the computer simulations were written, has seen a fourfold increase in student numbers, from twelve to around fifty. Fitting out additional laboratories with new equipment to accommodate this increase presented problems: the laboratory space did not exist; fitting out the laboratories with new equipment would involve a relatively large capital spend per student on equipment that would be used infrequently; and, because some of the experiments use inflammable liquids and gases, additional staff would be needed for laboratory supervision.

  9. Amorphous nanoparticles — Experiments and computer simulations

    International Nuclear Information System (INIS)

    Hoang, Vo Van; Ganguli, Dibyendu

    2012-01-01

    Data obtained over recent decades by both experiments and computer simulations on amorphous nanoparticles are reviewed, including methods of synthesis, characterization, structural properties, the atomic mechanism of glass formation in nanoparticles, crystallization of amorphous nanoparticles, physico-chemical properties (i.e. catalytic, optical, thermodynamic, magnetic, bioactivity and other properties) and various applications in science and technology. Amorphous nanoparticles coated with different surfactants are also reviewed as an extension in this direction. Much attention is paid to the pressure-induced polyamorphism of amorphous nanoparticles and to the amorphization of their nanocrystalline counterparts. We also introduce nanocomposites and nanofluids containing amorphous nanoparticles. Overall, amorphous nanoparticles exhibit a disordered structure different from that of the corresponding bulk materials and from that of the nanocrystalline counterparts. Therefore, amorphous nanoparticles can have unique physico-chemical properties, different from those of the crystalline counterparts, leading to potential applications in science and technology.

  10. Simulation in computer forensics teaching: the student experience

    OpenAIRE

    Crellin, Jonathan; Adda, Mo; Duke-Williams, Emma; Chandler, Jane

    2011-01-01

    The use of simulation in teaching computing is well established, with digital forensic investigation being a subject area where the range of simulation required is both wide and varied, demanding a corresponding breadth of fidelity. Each type of simulation can be complex and expensive to set up, resulting in students having only limited opportunities to participate in and learn from the simulation. For example, students' participation in mock trials in the University mock courtroom or in simulation...

  11. Design of Computer Experiments

    DEFF Research Database (Denmark)

    Dehlendorff, Christian

    The main topic of this thesis is the design and analysis of computer and simulation experiments, dealt with in six papers and a summary report. Simulation and computer models have in recent years received increasingly more attention due to their increasing complexity and usability. Software...... packages make the development of rather complicated computer models using predefined building blocks possible. This implies that the range of phenomena that are analyzed by means of a computer model has expanded significantly. As the complexity grows so does the need for efficient experimental designs...... and analysis methods, since the complex computer models often are expensive to use in terms of computer time. The choice of performance parameter is an important part of the analysis of computer and simulation models and Paper A introduces a new statistic for waiting times in health care units. The statistic...
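
    A standard ingredient of such designs for computer experiments is the Latin hypercube, which spreads n runs so that each input dimension is sampled exactly once per stratum. A minimal numpy sketch of the generic technique (not the specific designs developed in the thesis):

    ```python
    # Latin hypercube design: split each of d input dimensions into n
    # strata and sample every stratum exactly once, so all 1-D margins
    # are evenly covered with only n runs.
    import numpy as np

    def latin_hypercube(n, d, seed=0):
        rng = np.random.default_rng(seed)
        strata = rng.permuted(np.tile(np.arange(n), (d, 1)), axis=1).T
        return (strata + rng.random((n, d))) / n   # jitter within each stratum

    design = latin_hypercube(n=10, d=3)            # 10 runs, 3 input factors
    print(design)
    ```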

  12. Fisher information in the design of computer simulation experiments

    Energy Technology Data Exchange (ETDEWEB)

    Stehlík, Milan; Mueller, Werner G [Department of Applied Statistics, Johannes-Kepler-University Linz Freistaedter Strasse 315, A-4040 Linz (Austria)], E-mail: Milan.Stehlik@jku.at, E-mail: Werner.Mueller@jku.at

    2008-11-01

    The concept of Fisher information is conveniently used as a basis for designing efficient experiments. However, if the output stems from computer simulations they are often approximated as realizations of correlated random fields. Consequently, the conditions under which Fisher information may be suitable must be restated. In the paper we intend to give some simple but illuminating examples for these cases. 'Random phenomena have increasing importance in Engineering and Physics, therefore theoretical results are strongly needed. But there is a gap between the probability theory used by mathematicians and practitioners. Two very different languages have been generated in this way...' (Paul Kree, Paris 1995)
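
    A minimal sketch of the computation in question: for a linear trend observed through a correlated Gaussian random field, the Fisher information matrix is M(X) = F'C(X)^{-1}F, where F is the design matrix and C the covariance of the field at the design points X. The exponential covariance and the toy designs below are assumptions for illustration.

    ```python
    # Fisher information for the trend parameters of y(x) = b0 + b1*x + eps,
    # where eps is a Gaussian field with Cov = exp(-|xi - xj|/r) rather than
    # i.i.d. noise. Candidate designs are compared via det(M) (D-optimality).
    import numpy as np

    def fisher_information(x, r=1.0):
        F = np.column_stack([np.ones_like(x), x])        # design matrix
        C = np.exp(-np.abs(x[:, None] - x[None, :]) / r) # correlated errors
        return F.T @ np.linalg.solve(C, F)

    spread = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
    clustered = np.array([0.0, 0.01, 0.02, 1.99, 2.0])
    for x in (spread, clustered):
        print(x, "-> det M =", np.linalg.det(fisher_information(x)))
    ```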

  13. Fisher information in the design of computer simulation experiments

    International Nuclear Information System (INIS)

    Stehlík, Milan; Mueller, Werner G

    2008-01-01

    The concept of Fisher information is conveniently used as a basis for designing efficient experiments. However, if the output stems from computer simulations they are often approximated as realizations of correlated random fields. Consequently, the conditions under which Fisher information may be suitable must be restated. In the paper we intend to give some simple but illuminating examples for these cases. 'Random phenomena have increasing importance in Engineering and Physics, therefore theoretical results are strongly needed. But there is a gap between the probability theory used by mathematicians and practitioners. Two very different languages have been generated in this way...' (Paul Kree, Paris 1995)

  14. Educational computer simulation experiment «Real-time single-molecule imaging of quantum interference»

    Directory of Open Access Journals (Sweden)

    Alexander V. Baranov

    2015-01-01

    Taking part in organized project activities, students of the technical university create virtual physics laboratories. The article gives an example of such a student project: computer modeling and visualization of one of the most remarkable manifestations of reality, the quantum interference of particles. A real experiment with heavy organic fluorescent molecules is used as the prototype for this computer simulation. The student software product can be used in the informational space of the open education system.

  15. Studies on defect evolution in steels: experiments and computer simulations

    International Nuclear Information System (INIS)

    Sundar, C.S.

    2011-01-01

    In this paper, we present the results of our ongoing studies on steels, carried out with a view to developing radiation-resistant steels. The focus is on the use of nano-dispersoids in alloys for the suppression of void formation and eventual swelling under irradiation. Results on the nucleation and growth of TiC precipitates in Ti-modified austenitic steels, and investigations of nano-yttria particles in Fe, a model oxide-dispersion ferritic steel, will be presented. The experimental methods of ion beam irradiation and positron annihilation spectroscopy have been used to elucidate the role of minor alloying elements in swelling behaviour. Computer simulation of defect processes has been carried out using ab initio methods, molecular dynamics and Monte Carlo simulations. Our perspectives on addressing the multi-scale phenomena of defect processes leading to radiation damage, through a judicious combination of experiments and simulations, will be presented. (author)
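
    Of the simulation methods listed, kinetic (residence-time) Monte Carlo is the one that bridges from atomic jump rates to diffusional time scales. The sketch below follows a single vacancy hopping on a 1-D lattice with an Arrhenius rate; the barrier and attempt frequency are illustrative values, not parameters from this study.

    ```python
    # Residence-time (kinetic Monte Carlo) sketch: one vacancy hopping on a
    # 1-D lattice with Arrhenius rates. Illustrative parameter values.
    import math, random

    random.seed(1)
    kB, T = 8.617e-5, 600.0                 # Boltzmann constant (eV/K), temperature (K)
    nu0, Em = 1e13, 0.65                    # attempt frequency (1/s), migration barrier (eV)
    rate = nu0 * math.exp(-Em / (kB * T))   # hop rate per direction

    pos, t = 0, 0.0
    for _ in range(100_000):
        total = 2 * rate                            # hop left + hop right
        t += -math.log(random.random()) / total     # exponential waiting time
        pos += -1 if random.random() < 0.5 else 1   # both directions equally likely

    print(f"displacement {pos} sites after t = {t:.3e} s")
    # a diffusivity estimate D ~ <x^2>/(2t) needs an average over many runs
    ```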

  16. Distributed simulation of large computer systems

    International Nuclear Information System (INIS)

    Marzolla, M.

    2001-01-01

    Sequential simulation of large, complex physical systems is often regarded as a computationally expensive task. In order to speed up complex discrete-event simulations, the paradigm of Parallel and Distributed Discrete Event Simulation (PDES) was introduced in the late 1970s. The authors analyze the applicability of PDES to the modeling and analysis of large computer systems; such systems are increasingly common in the area of High Energy and Nuclear Physics, because many modern experiments make use of large 'compute farms'. Some feasibility tests have been performed on a prototype distributed simulator.
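
    The sequential core that PDES schemes parallelize is an event list ordered by timestamp. A minimal Python sketch of such a discrete-event simulation, here jobs flowing through a single-server 'compute farm' queue with assumed arrival and service rates:

    ```python
    # Minimal sequential discrete-event simulation: a single-server queue
    # standing in for a compute farm. PDES distributes exactly this kind of
    # timestamp-ordered event processing across processors. Toy rates.
    import heapq, random

    random.seed(0)
    events = [(random.expovariate(1.0), "arrival")]   # (time, kind) heap
    queue, busy_until, done, t_end = 0, 0.0, 0, 1000.0

    while events:
        t, kind = heapq.heappop(events)
        if t > t_end:
            break
        if kind == "arrival":
            queue += 1
            heapq.heappush(events, (t + random.expovariate(1.0), "arrival"))
        if queue and busy_until <= t:                 # server free: start next job
            queue -= 1
            busy_until = t + random.expovariate(1.2)  # service time
            heapq.heappush(events, (busy_until, "departure"))
        if kind == "departure":
            done += 1

    print(f"jobs completed by t = {t_end}: {done}")
    ```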

  17. Cluster computing software for GATE simulations

    International Nuclear Information System (INIS)

    Beenhouwer, Jan de; Staelens, Steven; Kruecker, Dirk; Ferrer, Ludovic; D'Asseler, Yves; Lemahieu, Ignace; Rannou, Fernando R.

    2007-01-01

    Geometry and tracking (GEANT4) is a Monte Carlo package designed for high energy physics experiments. It is used as the basis layer for Monte Carlo simulations of nuclear medicine acquisition systems in GEANT4 Application for Tomographic Emission (GATE). GATE allows the user to realistically model experiments using accurate physics models and time synchronization for detector movement through a script language contained in a macro file. The downside of this high accuracy is long computation time. This paper describes a platform independent computing approach for running GATE simulations on a cluster of computers in order to reduce the overall simulation time. Our software automatically creates fully resolved, nonparametrized macros accompanied with an on-the-fly generated cluster specific submit file used to launch the simulations. The scalability of GATE simulations on a cluster is investigated for two imaging modalities, positron emission tomography (PET) and single photon emission computed tomography (SPECT). Due to a higher sensitivity, PET simulations are characterized by relatively high data output rates that create rather large output files. SPECT simulations, on the other hand, have lower data output rates but require a long collimator setup time. Both of these characteristics hamper scalability as a function of the number of CPUs. The scalability of PET simulations is improved here by the development of a fast output merger. The scalability of SPECT simulations is improved by greatly reducing the collimator setup time. Accordingly, these two new developments result in higher scalability for both PET and SPECT simulations and reduce the computation time to more practical values

  18. Creating science simulations through Computational Thinking Patterns

    Science.gov (United States)

    Basawapatna, Ashok Ram

    Computational thinking aims to outline fundamental skills from computer science that everyone should learn. As currently defined, with help from the National Science Foundation (NSF), these skills include problem formulation, logically organizing data, automating solutions through algorithmic thinking, and representing data through abstraction. One aim of the NSF is to integrate these and other computational thinking concepts into the classroom. End-user programming tools offer a unique opportunity to accomplish this goal. An end-user programming tool that allows students with little or no prior experience the ability to create simulations based on phenomena they see in-class could be a first step towards meeting most, if not all, of the above computational thinking goals. This thesis describes the creation, implementation and initial testing of a programming tool, called the Simulation Creation Toolkit, with which users apply high-level agent interactions called Computational Thinking Patterns (CTPs) to create simulations. Employing Computational Thinking Patterns obviates lower behavior-level programming and allows users to directly create agent interactions in a simulation by making an analogy with real world phenomena they are trying to represent. Data collected from 21 sixth grade students with no prior programming experience and 45 seventh grade students with minimal programming experience indicates that this is an effective first step towards enabling students to create simulations in the classroom environment. Furthermore, an analogical reasoning study that looked at how users might apply patterns to create simulations from high-level descriptions with little guidance shows promising results. These initial results indicate that the high level strategy employed by the Simulation Creation Toolkit is a promising strategy towards incorporating Computational Thinking concepts in the classroom environment.

  19. Experiences using DAKOTA stochastic expansion methods in computational simulations.

    Energy Technology Data Exchange (ETDEWEB)

    Templeton, Jeremy Alan; Ruthruff, Joseph R.

    2012-01-01

    Uncertainty quantification (UQ) methods bring rigorous statistical connections to the analysis of computational and experimental data, and provide a basis for probabilistically assessing margins associated with safety and reliability. The DAKOTA toolkit developed at Sandia National Laboratories implements a number of UQ methods, which are being increasingly adopted by modeling and simulation teams to facilitate these analyses. This report presents results on the performance of DAKOTA's stochastic expansion methods for UQ on a representative application. Our results provide a number of insights that may be of interest to future users of these methods, including the behavior of the methods in estimating responses at varying probability levels, and the expansion levels that may be needed to achieve convergence.
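
    Stochastic expansion here means polynomial chaos: the response is expanded in polynomials orthogonal under the input distribution, and statistics follow from the coefficients. The sketch below fits a Hermite expansion to a toy one-input model by regression; it shows the generic method and does not use DAKOTA's interfaces.

    ```python
    # Non-intrusive polynomial chaos sketch: regress model samples onto
    # probabilists' Hermite polynomials of a standard-normal input, then
    # read mean and variance off the coefficients. Toy model, not DAKOTA.
    import math
    import numpy as np
    from numpy.polynomial.hermite_e import hermevander

    rng = np.random.default_rng(0)
    model = lambda x: np.exp(0.3 * x) + 0.1 * x**2   # stand-in "simulation"

    xi = rng.standard_normal(400)                    # sampled input
    Y = model(xi)
    P = 6                                            # expansion order
    Psi = hermevander(xi, P)                         # He_0..He_P evaluated at xi
    coef, *_ = np.linalg.lstsq(Psi, Y, rcond=None)

    norms = np.array([math.factorial(k) for k in range(P + 1)])  # E[He_k^2] = k!
    mean, var = coef[0], np.sum(coef[1:] ** 2 * norms[1:])
    print(f"PCE mean = {mean:.4f}, variance = {var:.4f}")
    print(f"MC  mean = {Y.mean():.4f}, variance = {Y.var():.4f}")
    ```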

  20. Simulating chemistry using quantum computers.

    Science.gov (United States)

    Kassal, Ivan; Whitfield, James D; Perdomo-Ortiz, Alejandro; Yung, Man-Hong; Aspuru-Guzik, Alán

    2011-01-01

    The difficulty of simulating quantum systems, well known to quantum chemists, prompted the idea of quantum computation. One can avoid the steep scaling associated with the exact simulation of increasingly large quantum systems on conventional computers, by mapping the quantum system to another, more controllable one. In this review, we discuss to what extent the ideas in quantum computation, now a well-established field, have been applied to chemical problems. We describe algorithms that achieve significant advantages for the electronic-structure problem, the simulation of chemical dynamics, protein folding, and other tasks. Although theory is still ahead of experiment, we outline recent advances that have led to the first chemical calculations on small quantum information processors.

  1. Pharmacology Experiments on the Computer.

    Science.gov (United States)

    Keller, Daniel

    1990-01-01

    A computer program that replaces a set of pharmacology and physiology laboratory experiments on live animals or isolated organs is described and illustrated. Five experiments are simulated: dose-effect relationships on smooth muscle, blood pressure and catecholamines, neuromuscular signal transmission, acetylcholine and the circulation, and…
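
    The first simulated experiment listed, dose-effect relationships on smooth muscle, reduces computationally to evaluating a Hill-type response curve at each dose. A minimal sketch with assumed pharmacological constants (not values from the described program):

    ```python
    # Dose-effect curve from the Hill equation; EC50, slope and Emax are
    # illustrative assumptions, not parameters of the teaching program.
    import numpy as np

    def hill(dose, emax=100.0, ec50=1e-6, n=1.0):
        """Response (% of maximum) at an agonist concentration (M)."""
        return emax * dose**n / (ec50**n + dose**n)

    for dose in np.logspace(-9, -3, 7):   # 1 nM .. 1 mM
        print(f"{dose:8.1e} M -> {hill(dose):5.1f} % response")
    ```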

  2. Computer Simulation of Einstein-Podolsky-Rosen-Bohm Experiments

    NARCIS (Netherlands)

    De Raedt, H.; Michielsen, K.

    We review an event-based simulation approach which reproduces the statistical distributions of quantum physics experiments by generating detection events one-by-one according to an unknown distribution and without solving a wave equation. Einstein-Podolsky-Rosen-Bohm laboratory experiments are used

  3. Production of proteinase A by Saccharomyces cerevisiae in a cell-recycling fermentation system: Experiments and computer simulations

    DEFF Research Database (Denmark)

    Grøn, S.; Biedermann, K.; Emborg, Claus

    1996-01-01

    experimentally and by computer simulations. Experiments and simulations showed that cell mass and product concentration were enhanced by high ratios of recycling. Additional simulations showed that the proteinase A concentration decreased drastically at high dilution rates and the optimal volumetric...... productivities were at high dilution rates just below washout and at high ratios of recycling. Cell-recycling fermentation gave much higher volumetric productivities and stable product concentrations in contrast to simple continuous fermentation....
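
    The qualitative behaviour reported, higher cell mass with recycling and washout only at high dilution rates, already appears in a simple Monod chemostat model in which the recycle loop retains part of the biomass. A sketch with illustrative parameters (not the paper's fitted model):

    ```python
    # Monod chemostat with cell recycle: a bleed fraction alpha < 1 of the
    # biomass leaves with the effluent (alpha = 1 is the simple chemostat),
    # shifting washout to higher dilution rates. Illustrative parameters.
    import numpy as np
    from scipy.integrate import odeint

    mu_max, Ks, Yxs, q_p = 0.45, 0.2, 0.5, 0.05  # 1/h, g/L, g/g, g/(g h)
    S_in, alpha = 20.0, 0.3                      # feed substrate (g/L), bleed fraction

    def rhs(y, t, D):
        X, S, P = y
        mu = mu_max * S / (Ks + S)
        return [(mu - alpha * D) * X,            # recycle retains cells
                D * (S_in - S) - mu * X / Yxs,   # substrate balance
                q_p * (mu / mu_max) * X - D * P] # growth-linked product

    t = np.linspace(0.0, 200.0, 2001)
    for D in (0.3, 0.6, 1.2):                    # dilution rates (1/h)
        X, S, P = odeint(rhs, [0.1, S_in, 0.0], t, args=(D,)).T
        print(f"D = {D:.1f}/h -> steady X = {X[-1]:.2f} g/L, P = {P[-1]:.3f} g/L")
    ```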

  4. A Computer-Based Simulation of an Acid-Base Titration

    Science.gov (United States)

    Boblick, John M.

    1971-01-01

    Reviews the advantages of computer simulated environments for experiments, referring in particular to acid-base titrations. Includes pre-lab instructions and a sample computer printout of a student's use of an acid-base simulation. Ten references. (PR)
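
    The computation at the heart of such a titration simulation is compact: for a strong acid titrated with a strong base, the charge balance yields a quadratic for [H+] at every titrant volume. A sketch with assumed concentrations:

    ```python
    # Simulated strong acid - strong base titration: pH from the charge
    # balance [H+] - Kw/[H+] = (acid - base)/(total volume). Assumed 0.10 M
    # HCl (50 mL) titrated with 0.10 M NaOH.
    import numpy as np

    Ca, Va, Cb, Kw = 0.10, 50.0, 0.10, 1e-14

    for Vb in np.arange(0.0, 100.1, 10.0):            # mL of base added
        delta = (Ca * Va - Cb * Vb) / (Va + Vb)       # excess acid, mol/L
        h = (delta + np.sqrt(delta**2 + 4 * Kw)) / 2  # positive root for [H+]
        print(f"{Vb:5.1f} mL NaOH -> pH = {-np.log10(h):5.2f}")
    ```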

  5. Monte Carlo simulation of experiments

    International Nuclear Information System (INIS)

    Opat, G.I.

    1977-07-01

    An outline of the technique of computer simulation of particle physics experiments by the Monte Carlo method is presented. Useful special-purpose subprograms are listed and described. At each stage the discussion is made concrete by direct reference to the program SIMUL8 and its variant MONTE-PION, written to assist in the analysis of the radiative decay experiments μ⁺ → e⁺ ν_e ν̄ γ and π⁺ → e⁺ ν_e γ, respectively. These experiments were based on the use of two large sodium iodide crystals, TINA and MINA, as e and γ detectors. Instructions for the use of SIMUL8 and MONTE-PION are given. (author)
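
    The geometric-acceptance step at the core of such simulations can be shown in a few lines: sample isotropic emission directions and count intersections with the detector face. The geometry below is illustrative, not that of TINA or MINA:

    ```python
    # Monte Carlo geometric acceptance of a circular detector face for an
    # isotropic point source on its axis. Illustrative geometry only.
    import numpy as np

    rng = np.random.default_rng(42)
    n = 1_000_000
    z_det, r_det = 0.50, 0.18            # detector distance and radius (m)

    cos_t = rng.uniform(-1.0, 1.0, n)    # isotropic directions
    sin_t = np.sqrt(1.0 - cos_t**2)

    fwd = cos_t > 0                      # only forward-going particles can hit
    rho = z_det * sin_t[fwd] / cos_t[fwd]          # radial offset at detector plane
    acceptance = np.count_nonzero(rho < r_det) / n
    print(f"geometric acceptance = {acceptance:.4f}")
    # analytic check: (1 - z/sqrt(z^2 + r^2)) / 2 = 0.0296 for these values
    ```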

  6. Computer simulations of laser hot spots and implosion symmetry in kiniform phase plate experiments on Nova

    International Nuclear Information System (INIS)

    Peterson, R. R.; Lindman, E. L.; Delamater, N. D.; Magelssen, G. R.

    2000-01-01

    LASNEX computer code simulations have been performed for radiation symmetry experiments on the Nova laser with vacuum and gas-filled hohlraum targets [R. L. Kauffman et al., Phys. Plasmas 5, 1927 (1998)]. In previous experiments with unsmoothed laser beams, the symmetry was substantially shifted by deflection of the laser beams. In these experiments, laser beams have been smoothed with Kiniform Phase Plates in an attempt to remove deflection of the beams. The experiments have shown that this smoothing significantly improves the agreement with LASNEX calculations of implosion symmetry. The images of laser produced hot spots on the inside of the hohlraum case have been found to differ from LASNEX calculations, suggesting that some beam deflection or self-focusing may still be present or that emission from interpenetrating plasmas is an important component of the images. The measured neutron yields are in good agreement with simulations for vacuum hohlraums but are far different for gas-filled hohlraums. (c) 2000 American Institute of Physics

  7. Computer Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Pronskikh, V. S. [Fermilab]

    2014-05-09

    Verification and validation of computer codes and models used in simulation are two aspects of scientific practice of high importance that have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to the model's relation to the real world and its intended use. It has been argued that because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model's general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction needs to be made between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimentation with them). Holding on to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement, I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviating the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes.

  8. Simulations and Experiments in Astronomy and Physics

    Science.gov (United States)

    Maloney, F. P.; Maurone, P. A.; Dewarf, L. E.

    1998-12-01

    There are new approaches to teaching astronomy and physics in the laboratory setting, involving the use of computers as tools to simulate events and concepts which can be illuminated in no other reasonable way. With the computer, it is possible to travel back in time to replicate the sky as Galileo saw it. Astronomical phenomena which reveal themselves only after centuries of real time may be compressed in the computer to a simulation of several minutes. Observations simulated on the computer do not suffer from the vagaries of weather, fixed time or geographic position, or non-repeatability. In physics, the computer allows us to secure data for experiments which, by their nature, may not be amenable to human interaction. These could include experiments with very fast or very slow timescales, large number of data samples, complex or tedious manipulation of the data which hides the fundamental nature of the experiment, or data sampling which would need a specialized probe, such as for acid rain. This innovation has become possible only recently, due to the availability and affordability of sophisticated computer hardware and software. We have developed a laboratory experience for non-scientists who need an introductory course in astronomy or physics. Our approach makes extensive use of computers in this laboratory. Using commercially available software, the students use the computer as a time machine and a space craft to explore and rediscover fundamental science. The physics experiments are classical in nature, and the computer acts as a data collector and presenter, freeing the student from the tedium of repetitive data gathering and replotting. In this way, the student is encouraged to explore, to try new things, to refine the measurements, and to discover the principles underlying the observed phenomena.

  9. The role of computer simulation in nuclear technology development

    International Nuclear Information System (INIS)

    Tikhonchev, M.Yu.; Shimansky, G.A.; Lebedeva, E.E.; Lichadeev, V. V.; Ryazanov, D.K.; Tellin, A.I.

    2001-01-01

    In this report the role and purposes of computer simulation in nuclear technology development are discussed. The authors consider such applications of computer simulation as nuclear safety research, optimization of technical and economic parameters of operating nuclear plants, planning and support of reactor experiments, research and design of new devices and technologies, and the design and development of simulators for operating-personnel training. Among these applications, the following aspects of computer simulation are discussed in the report: neutron-physical, thermal and hydrodynamic models; simulation of isotope structure change and damage-dose accumulation in materials under irradiation; and simulation of reactor control structures. (authors)

  10. Software Engineering for Scientific Computer Simulations

    Science.gov (United States)

    Post, Douglass E.; Henderson, Dale B.; Kendall, Richard P.; Whitney, Earl M.

    2004-11-01

    Computer simulation is becoming a very powerful tool for analyzing and predicting the performance of fusion experiments. Simulation efforts are evolving from including only a few effects to many effects, from small teams with a few people to large teams, and from workstations and small processor count parallel computers to massively parallel platforms. Successfully making this transition requires attention to software engineering issues. We report on the conclusions drawn from a number of case studies of large scale scientific computing projects within DOE, academia and the DoD. The major lessons learned include attention to sound project management including setting reasonable and achievable requirements, building a good code team, enforcing customer focus, carrying out verification and validation and selecting the optimum computational mathematics approaches.

  11. Computer Simulation in Information and Communication Engineering

    CERN Multimedia

    Anton Topurov

    2005-01-01

    CSICE'05 Sofia, Bulgaria 20th - 22nd October, 2005 On behalf of the International Scientific Committee, we would like to invite you all to Sofia, the capital city of Bulgaria, to the International Conference on Computer Simulation in Information and Communication Engineering CSICE'05. The Conference is aimed at facilitating the exchange of experience in the field of computer simulation gained not only in traditional fields (communications, electronics, physics, ...) but also in the areas of biomedical engineering, the environment, industrial design, etc. The objective of the Conference is to bring together lecturers, researchers and practitioners from different countries, working in the fields of computer simulation in information engineering, in order to exchange information and bring new contributions to this important field of engineering design and education. The Conference will bring you the latest ideas and developments of the tools for computer simulation directly from their inventors. Contribution describ...

  12. The role of computer simulation in nuclear technology development

    International Nuclear Information System (INIS)

    Tikhonchev, M.Yu.; Shimansky, G.A.; Lebedeva, E.E.; Lichadeev, V.V.; Ryazanov, D.K.; Tellin, A.I.

    2000-01-01

    In this report, the role and purpose of computer simulation in nuclear technology development are discussed. The authors consider such applications of computer simulation as: (a) nuclear safety research; (b) optimization of technical and economic parameters of operating nuclear plants; (c) planning and support of reactor experiments; (d) research and design of new devices and technologies; (e) design and development of simulators for operating-personnel training. Among these applications, the following aspects of computer simulation are discussed in the report: (f) neutron-physical, thermal and hydrodynamic models; (g) simulation of isotope structure change and damage-dose accumulation for materials under irradiation; (h) simulation of reactor control structures. (authors)

  13. Computer Simulation of Diffraction Patterns.

    Science.gov (United States)

    Dodd, N. A.

    1983-01-01

    Describes an Apple computer program (listing available from author) which simulates Fraunhofer and Fresnel diffraction using vector addition techniques (vector chaining) and allows the user to experiment with differently shaped multiple apertures. Graphics output includes vector resultants, phase difference, diffraction patterns, and the Cornu spiral…
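
    The vector-addition idea is easy to reproduce numerically: each strip of the aperture contributes an elementary phasor, and the diffracted amplitude is the modulus of their resultant. A numpy sketch for a single slit in the Fraunhofer limit, with assumed wavelength and geometry:

    ```python
    # "Vector chaining" for single-slit Fraunhofer diffraction: sum one
    # phasor per aperture strip. Wavelength and slit width are assumptions.
    import numpy as np

    lam, a, N = 633e-9, 50e-6, 200          # wavelength (m), slit width (m), strips
    y = np.linspace(-a / 2, a / 2, N)       # strip positions across the slit

    for theta_mrad in (0.0, 6.33, 12.66, 19.0):
        theta = theta_mrad * 1e-3
        phases = 2 * np.pi / lam * y * np.sin(theta)  # path difference per strip
        amp = np.abs(np.exp(1j * phases).sum()) / N   # resultant, normalized
        print(f"theta = {theta_mrad:5.2f} mrad -> relative amplitude {amp:.3f}")
    # zeros fall where a*sin(theta) = m*lambda; here lambda/a = 12.66 mrad
    ```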

  14. Ontological and Epistemological Issues Regarding Climate Models and Computer Experiments

    Science.gov (United States)

    Vezer, M. A.

    2010-12-01

    Recent philosophical discussions (Parker 2009; Frigg and Reiss 2009; Winsberg 2009; Morgan 2002, 2003, 2005; Guala 2002) about the ontology of computer simulation experiments and the epistemology of inferences drawn from them are of particular relevance to climate science, as computer modeling and analysis are instrumental in understanding climatic systems. How do computer simulation experiments compare with traditional experiments? Is there an ontological difference between these two methods of inquiry? Are there epistemological considerations that result in one type of inference being more reliable than the other? What are the implications of these questions with respect to climate studies that rely on computer simulation analysis? In this paper, I examine these philosophical questions within the context of climate science, instantiating concerns in the philosophical literature with examples found in the analysis of global climate change. I concentrate on Wendy Parker's (2009) account of computer simulation studies, which offers a treatment of these and other questions relevant to investigations of climate change involving such modeling. Two theses at the center of Parker's account will be the focus of this paper. The first is that computer simulation experiments ought to be regarded as straightforward material experiments; which is to say, there is no significant ontological difference between computer and traditional experimentation. Parker's second thesis is that some of the emphasis on the epistemological importance of materiality has been misplaced. I examine both of these claims. First, I inquire as to whether viewing computer and traditional experiments as ontologically similar in the way she does implies that there is no proper distinction between abstract experiments (such as 'thought experiments' as well as computer experiments) and traditional 'concrete' ones. Second, I examine the notion of materiality (i.e., the material commonality between

  15. Analysis of material flow in metal forming processes by using computer simulation and experiment with model material

    International Nuclear Information System (INIS)

    Kim, Heon Young; Kim, Dong Won

    1993-01-01

    The objective of the present study is to analyze material flow in metal forming processes by using computer simulation and experiments with a model material, plasticine. A UBET program is developed to analyze the bulk flow behaviour of various metal forming problems. The elemental strain-hardening effect is considered in an incremental manner, and the element system is automatically regenerated at every deformation step in the program. The material flow behaviour in a closed-die forging process with a rib-web type cavity is analyzed by UBET and the elastic-plastic finite element method, and verified by experiments with plasticine. There was good agreement between simulation and experiment. The effect of corner rounding on material flow behaviour is investigated in the analysis of backward extrusion with a square die. A flat-punch indentation process is simulated by UBET, and the results are compared with those of the elastic-plastic finite element method. (Author)

  16. ATLAS Distributed Computing: Experience and Evolution

    CERN Document Server

    Nairz, A; The ATLAS collaboration

    2013-01-01

    The ATLAS experiment has just concluded its first running period which commenced in 2010. After two years of remarkable performance from the LHC and ATLAS, the experiment has accumulated more than 25 fb-1 of data. The total volume of beam and simulated data products exceeds 100 PB distributed across more than 150 computing centers around the world, managed by the experiment's distributed data management system. These sites have provided up to 150,000 computing cores to ATLAS's global production and analysis processing system, enabling a rich physics program including the discovery of the Higgs-like boson in 2012. The wealth of accumulated experience in global data-intensive computing at this massive scale, and the considerably more challenging requirements of LHC computing from 2014 when the LHC resumes operation, are driving a comprehensive design and development cycle to prepare a revised computing model together with data processing and management systems able to meet the demands of higher trigger rates, e...

  17. ATLAS distributed computing: experience and evolution

    CERN Document Server

    Nairz, A; The ATLAS collaboration

    2014-01-01

    The ATLAS experiment has just concluded its first running period which commenced in 2010. After two years of remarkable performance from the LHC and ATLAS, the experiment has accumulated more than 25/fb of data. The total volume of beam and simulated data products exceeds 100~PB distributed across more than 150 computing centres around the world, managed by the experiment's distributed data management system. These sites have provided up to 150,000 computing cores to ATLAS's global production and analysis processing system, enabling a rich physics programme including the discovery of the Higgs-like boson in 2012. The wealth of accumulated experience in global data-intensive computing at this massive scale, and the considerably more challenging requirements of LHC computing from 2015 when the LHC resumes operation, are driving a comprehensive design and development cycle to prepare a revised computing model together with data processing and management systems able to meet the demands of higher trigger rates, e...

  18. Simulation of chamber experiments

    International Nuclear Information System (INIS)

    Ivanov, V.G.

    1981-01-01

    A system for computer simulation of experiments conducted by means of track detectors with film data output is described. The principle of organization of the computer model of a chamber experiment is considered, comprising the following stages: generation of events, generation of measurements, generation of scanning results, generation of distortions, calibration of the generated data, filtration, event reconstruction, kinematic identification, formation of the final results tape, and analysis of the results. The generation programs are implemented as special RAM-files, where a RAM-file is a program text written in FORTRAN and divided into structural elements. All the programs are part of the 'Hydra' system. The system's possibilities are considered on the basis of the CDC-6500 computer: generation of a five-beam event, creation of the data structure for identification, and calculation by the kinematic program take about 1 s of CDC-6500 computer time [ru]

  19. Self-propagating exothermic reaction analysis in Ti/Al reactive films using experiments and computational fluid dynamics simulation

    Energy Technology Data Exchange (ETDEWEB)

    Sen, Seema, E-mail: seema.sen@tu-ilmenau.de [Technical University of Ilmenau, Department of Materials for Electronics, Gustav-Kirchhoff-Str. 5, 98693 Ilmenau (Germany); Niederrhein University of Applied Science, Department of Mechanical and Process Engineering, Reinarzstraße 49, 47805 Krefeld (Germany); Lake, Markus; Kroppen, Norman; Farber, Peter; Wilden, Johannes [Niederrhein University of Applied Science, Department of Mechanical and Process Engineering, Reinarzstraße 49, 47805 Krefeld (Germany); Schaaf, Peter [Technical University of Ilmenau, Department of Materials for Electronics, Gustav-Kirchhoff-Str. 5, 98693 Ilmenau (Germany)

    2017-02-28

    Highlights:
    • Development of nanoscale Ti/Al multilayer films with 1:1, 1:2 and 1:3 molar ratios.
    • Characterization of exothermic reaction propagation by experiments and simulation.
    • The reaction velocity depends on the ignition potential and the molar ratio of the films.
    • Only 1Ti/3Al films exhibit unsteady reaction propagation with ripple formation.
    • CFD simulation shows the time-dependent atomic mixing and temperature flow during the exothermic reaction.

    Abstract: This study describes the self-propagating exothermic reaction in Ti/Al reactive multilayer foils, investigated by experiments and computational fluid dynamics (CFD) simulation. Ti/Al foils with molar ratios of 1Ti/1Al, 1Ti/2Al and 1Ti/3Al were fabricated by magnetron sputtering. Microstructural characteristics of the unreacted and reacted foils were analyzed using electron and atomic force microscopy. After electrical ignition, the influence of the ignition potential on reaction propagation was investigated experimentally. The reaction front propagates with a velocity between 0.68 ± 0.4 m/s and 2.57 ± 0.6 m/s, depending on the input ignition potential and the chemical composition. The 1Ti/3Al reactive foil exhibits both steady-state and unsteady, wave-like reaction propagation. Moreover, the numerical CFD simulation shows the time-dependent temperature flow and atomic mixing in the nanoscale reaction zone, indicating its potential for simulating the exothermic reaction in nanoscale Ti/Al foils.

  20. ATLAS distributed computing: experience and evolution

    International Nuclear Information System (INIS)

    Nairz, A

    2014-01-01

    The ATLAS experiment has just concluded its first running period which commenced in 2010. After two years of remarkable performance from the LHC and ATLAS, the experiment has accumulated more than 25 fb⁻¹ of data. The total volume of beam and simulated data products exceeds 100 PB distributed across more than 150 computing centres around the world, managed by the experiment's distributed data management system. These sites have provided up to 150,000 computing cores to ATLAS's global production and analysis processing system, enabling a rich physics programme including the discovery of the Higgs-like boson in 2012. The wealth of accumulated experience in global data-intensive computing at this massive scale, and the considerably more challenging requirements of LHC computing from 2015 when the LHC resumes operation, are driving a comprehensive design and development cycle to prepare a revised computing model together with data processing and management systems able to meet the demands of higher trigger rates, energies and event complexities. An essential requirement will be the efficient utilisation of current and future processor technologies as well as a broad range of computing platforms, including supercomputing and cloud resources. We will report on experience gained thus far and our progress in preparing ATLAS computing for the future

  1. A benchmark on computational simulation of a CT fracture experiment

    International Nuclear Information System (INIS)

    Franco, C.; Brochard, J.; Ignaccolo, S.; Eripret, C.

    1992-01-01

    For a better understanding of the fracture behavior of cracked welds in piping, FRAMATOME, EDF and CEA have launched an important analytical research program. This program is mainly based on the analysis of the effects of the geometrical parameters (the crack size and the welded joint dimensions) and the yield strength ratio on the fracture behavior of several cracked configurations. Two approaches have been selected for the fracture analyses: on the one hand, the global approach based on the concept of the crack driving force J, and on the other hand, a local approach to ductile fracture. In the local approach, crack initiation and growth are modeled by the nucleation, growth and coalescence of cavities in front of the crack tip. The model selected in this study estimates only the growth of the cavities, using the Rice and Tracey relationship. The present study deals with a benchmark on computational simulation of CT fracture experiments using three computer codes: ALIBABA, developed by EDF; the CEA code CASTEM 2000; and the FRAMATOME code SYSTUS. The paper is split into three parts. First, the authors present the experimental procedure for high temperature toughness testing of two CT specimens taken from a welded pipe, characteristic of pressurized water reactor primary piping. Secondly, considerations are outlined about the finite element analysis and the application procedure. A detailed description is given of the boundary and loading conditions, the mesh characteristics, the numerical scheme involved, and the void growth computation. Finally, comparisons between numerical and experimental results are presented up to crack initiation, the tearing process not being taken into account in the present study. The variations of J and of the local variables used to estimate the damage around the crack tip (triaxiality and hydrostatic stresses, plastic deformations, void growth, ...) are computed as a function of the increasing load
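
    The Rice and Tracey relationship mentioned above gives the cavity growth rate as dR/R = 0.283 exp(1.5 σm/σeq) dεeq; for a constant stress triaxiality it integrates in closed form. A sketch with illustrative strain and triaxiality values:

    ```python
    # Rice-Tracey void growth, dR/R = 0.283*exp(1.5*T)*d(eps), integrated at
    # constant triaxiality T = sigma_m/sigma_eq. Illustrative numbers.
    import math

    def void_growth_ratio(T, eps_eq):
        """R/R0 after equivalent plastic strain eps_eq at fixed triaxiality T."""
        return math.exp(0.283 * math.exp(1.5 * T) * eps_eq)

    for T in (0.33, 1.0, 2.0):   # uniaxial bar .. sharply notched / crack tip
        print(f"triaxiality {T:4.2f}: R/R0 = {void_growth_ratio(T, 0.2):.2f}")
    ```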

  2. Computational fluid dynamics simulations and validations of results

    CSIR Research Space (South Africa)

    Sitek, MA

    2013-09-01

    Wind flow influence on a high-rise building is analyzed. The research covers full-scale tests, wind-tunnel experiments and numerical simulations. In the present paper the computational model used in the simulations is described and the results, which were...

  3. Simulator experiments: effects of NPP operator experience on performance

    International Nuclear Information System (INIS)

    Beare, A.N.; Gray, L.H.

    1984-01-01

    During the FY83 research, a simulator experiment was conducted at the control room simulator for a GE boiling water reactor (BWR) NPP. The research subjects were licensed operators undergoing requalification training and shift technical advisors (STAs). This experiment was designed to investigate the effects of senior reactor operator (SRO) experience, operating-crew augmentation with an STA, and practice as a crew upon crew and individual operator performance in response to anticipated plant transients. Sixteen two-man crews of licensed operators were employed in a 2 × 2 factorial design. The SROs leading the crews were split into high- and low-experience groups on the basis of their years of experience as an SRO. One half of each of the high- and low-SRO-experience groups was assisted by an STA. The crews responded to four simulated plant casualties. A five-variable set of content-referenced performance measures was derived from task analyses of the procedurally correct responses to the four casualties. System parameters and control manipulations were recorded by the computer controlling the simulator. Data on communications and procedure use were obtained from analysis of videotapes of the exercises. Questionnaires were used to collect subject biographical information and data on subjective workload during each simulated casualty. For four of the five performance measures, no significant differences were found between groups led by high-experience (25 to 114 months) and low-experience (1 to 17 months as an SRO) SROs. However, crews led by low-experience SROs tended to have significantly shorter task performance times than crews led by high-experience SROs. The presence of the STA had no significant effect on overall team performance in responding to the four simulated casualties. The FY84 experiments are a partial replication and extension of the FY83 experiment, but with PWR operators and simulator

  4. Predictive modeling of liquid-sodium thermal–hydraulics experiments and computations

    International Nuclear Information System (INIS)

    Arslan, Erkan; Cacuci, Dan G.

    2014-01-01

    Highlights:
    • We applied the predictive modeling method of Cacuci and Ionescu-Bujor (2010).
    • We assimilated data from sodium flow experiments.
    • We used computational fluid dynamics simulations of the sodium experiments.
    • The predictive modeling method greatly reduced uncertainties in the predicted results.

    Abstract: This work applies the predictive modeling procedure formulated by Cacuci and Ionescu-Bujor (2010) to assimilate data from liquid-sodium thermal-hydraulics experiments in order to systematically reduce the uncertainties in the predictions of computational fluid dynamics (CFD) simulations. The predicted CFD results for the best-estimate model parameters and responses describing sodium-flow velocities and temperature distributions are shown to be significantly more precise than the original computations and experiments, in that the predicted uncertainties for the best-estimate results and model parameters are significantly smaller than both the originally computed and the experimental uncertainties.
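
    The uncertainty-reduction effect stated in the highlights can be seen already in a scalar linear-Gaussian update, where the best-estimate variance falls below both the computed and the measured variance. A toy sketch, not the full Cacuci and Ionescu-Bujor formalism:

    ```python
    # Scalar "best estimate" data assimilation: combine a computed value and
    # a measurement, each with its variance. Toy numbers for illustration.
    import math

    y_calc, var_calc = 510.0, 15.0**2   # e.g. CFD-predicted temperature (K)
    y_meas, var_meas = 498.0, 8.0**2    # e.g. measured temperature (K)

    gain = var_calc / (var_calc + var_meas)
    y_best = y_calc + gain * (y_meas - y_calc)
    var_best = (1.0 - gain) * var_calc  # equals 1/(1/var_calc + 1/var_meas)

    print(f"best estimate {y_best:.1f} K, std {math.sqrt(var_best):.1f} K")
    # std is below both 15 K and 8 K: assimilation reduced the uncertainty
    ```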

  5. Understanding Islamist political violence through computational social simulation

    Energy Technology Data Exchange (ETDEWEB)

    Watkins, Jennifer H [Los Alamos National Laboratory; Mackerrow, Edward P [Los Alamos National Laboratory; Patelli, Paolo G [Los Alamos National Laboratory; Eberhardt, Ariane [Los Alamos National Laboratory; Stradling, Seth G [Los Alamos National Laboratory

    2008-01-01

    Understanding the process that enables political violence is of great value in reducing the future demand for and support of violent opposition groups. Methods are needed that allow alternative scenarios and counterfactuals to be scientifically researched. Computational social simulation shows promise in developing 'computer experiments' that would be unfeasible or unethical in the real world. Additionally, the process of modeling and simulation reveals and challenges assumptions that may not be noted in theories, exposes areas where data is not available, and provides a rigorous, repeatable, and transparent framework for analyzing the complex dynamics of political violence. This paper demonstrates the computational modeling process using two simulation techniques: system dynamics and agent-based modeling. The benefits and drawbacks of both techniques are discussed. In developing these social simulations, we discovered that the social science concepts and theories needed to accurately simulate the associated psychological and social phenomena were lacking.

  6. Impact of detector simulation in particle physics collider experiments

    Science.gov (United States)

    Daniel Elvira, V.

    2017-06-01

    Over the last three decades, accurate simulation of the interactions of particles with matter and modeling of detector geometries has proven to be of critical importance to the success of the international high-energy physics (HEP) experimental programs. For example, the detailed detector modeling and accurate physics of the Geant4-based simulation software of the CMS and ATLAS particle physics experiments at the European Organization for Nuclear Research (CERN) Large Hadron Collider (LHC) was a determining factor for these collaborations to deliver physics results of outstanding quality faster than any hadron collider experiment ever before. This review article highlights the impact of detector simulation on particle physics collider experiments. It presents numerous examples of the use of simulation, from detector design and optimization, through software and computing development and testing, to cases where the use of simulation samples made a difference in the precision of the physics results and publication turnaround, from data-taking to submission. It also presents estimates of the cost and economic impact of simulation in the CMS experiment. Future experiments will collect orders of magnitude more data with increasingly complex detectors, heavily taxing the performance of simulation and reconstruction software. Consequently, exploring solutions to speed up simulation and reconstruction software to satisfy the growing demand of computing resources in a time of flat budgets is a matter that deserves immediate attention. The article ends with a short discussion on the potential solutions that are being considered, based on leveraging core count growth in multicore machines, using new generation coprocessors, and re-engineering HEP code for concurrency and parallel computing.

  7. Simulation - modeling - experiment; Simulation - modelisation - experience

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2004-07-01

    After two workshops held in 2001 on the same topics, and in order to take stock of the advances in the domain of simulation and measurements, the main goals proposed for this workshop are: the presentation of the state of the art of tools, methods and experiments in the domains of interest of the Gedepeon research group, and the exchange of information about the possible uses of computer codes and facilities, the understanding of physical and chemical phenomena, and development and experiment needs. This document gathers 18 presentations (slides) among the 19 given at this workshop, dealing with: deterministic and stochastic codes in reactor physics (Rimpault G.); MURE: an evolution code coupled with MCNP (Meplan O.); neutronic calculation of future reactors at EdF (Lecarpentier D.); progress status of the MCNP/TRIO-U neutronics/thermal-hydraulics coupling (Nuttin A.); the FLICA4/TRIPOLI4 thermal-hydraulics/neutronics coupling (Aniel S.); perturbation methods and sensitivity analysis of nuclear data in reactor physics, with application to the VENUS-2 experimental reactor (Bidaud A.); modeling for the reliability improvement of an ADS accelerator (Biarotte J.L.); residual gas compensation of the space charge of intense beams (Ben Ismail A.); experimental determination and numerical modeling of phase equilibrium diagrams of interest in nuclear applications (Gachon J.C.); modeling of irradiation effects (Barbu A.); elastic limit and irradiation damage in Fe-Cr alloys: simulation and experiment (Pontikis V.); experimental measurements of spallation residues, comparison with Monte-Carlo simulation codes (Fallot M.); the spallation target-reactor coupling (Rimpault G.); tools and data (Grouiller J.P.); models in high energy transport codes: status and perspective (Leray S.); other ways of investigation for spallation (Audoin L.); neutrons and light particles production at intermediate energies (20-200 MeV) with iron, lead and uranium targets (Le Colley F

  8. Guiding Simulations and Experiments using Continuation

    DEFF Research Database (Denmark)

    When applying continuation of periodic solutions to high-dimensional finite element models one might face a dilemma. The mesh resolution and thus the dimension N of the model are typically chosen such that a given computer system can store the information necessary to perform one integration step...... for dimension N, but not for larger dimensions. In other words, a model is usually implemented as a carefully derived implicit integration scheme tailored for numerically stable simulations with the highest spatial resolution admitted by the computational power available. On the other hand, stable numerical...... developed method of control-based continuation allows the continuation of periodic solutions without a reduction of the model resolution, and even directly in physical experiments. Moreover, both a simulation as well as an experiment can run asynchronously from the actual continuation method, which...
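
    The core mechanic of numerical continuation can be shown on a scalar toy problem: each converged solution seeds the solver at the next parameter value, so a whole solution branch is traced cheaply. This is only natural-parameter continuation of an equilibrium, not the control-based continuation of periodic solutions discussed above.

        import numpy as np
        from scipy.optimize import fsolve

        # Toy natural-parameter continuation: follow the branch x = sqrt(mu)
        # of mu*x - x^3 = 0 by reusing each solution as the next initial guess.
        def residual(x, mu):
            return mu * x - x**3

        x = 1.0  # on the branch at mu = 1
        for mu in np.linspace(1.0, 3.0, 9):
            x = fsolve(residual, x, args=(mu,))[0]
            print(f"mu = {mu:.2f}  ->  x = {x:.4f}")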

  9. Computational steering of GEM based detector simulations

    Science.gov (United States)

    Sheharyar, Ali; Bouhali, Othmane

    2017-10-01

    Gas-based detector R&D relies heavily on full simulation of detectors and their optimization before final prototypes can be built and tested. These simulations, in particular those with complex scenarios such as high detector voltages or gases with larger gains, are computationally intensive and may take several days or weeks to complete. These long-running simulations usually run on high-performance computers in batch mode. If the results lead to unexpected behavior, the simulation might be rerun with different parameters. However, the simulations (or jobs) may have to wait in a queue until they get a chance to run again, because the supercomputer is a shared resource that maintains a queue of other user programs as well and executes them as time and priorities permit. This can result in inefficient resource utilization and an increase in the turnaround time for the scientific experiment. To overcome this issue, monitoring the behavior of a simulation while it is running (live) is essential. In this work, we employ the computational steering technique by coupling the detector simulations with a visualization package named VisIt to enable the exploration of the live data as it is produced by the simulation.
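
    In outline, steering couples a long-running loop to two lightweight channels: one that publishes the simulation state for live visualization and one that re-reads user parameters. The skeleton below is a generic file-based sketch of that pattern; the actual VisIt coupling uses its in-situ library, whose API is not reproduced here, and all names and values are hypothetical.

        import json, pathlib

        # Generic computational-steering skeleton (illustrative; not the
        # VisIt in-situ API). The loop periodically publishes its state for
        # a live viewer and re-reads a parameter file so the user can adjust
        # the run without resubmitting the batch job.
        PARAMS, STATE = pathlib.Path("steer.json"), pathlib.Path("state.json")
        voltage, gain = 3000.0, 1.0

        for step in range(10_000):
            if PARAMS.exists():                 # user dropped new settings
                voltage = json.loads(PARAMS.read_text()).get("voltage", voltage)
            gain = 1.0 + 1e-4 * voltage * step  # stand-in for avalanche physics
            if step % 1000 == 0:                # publish state for the viewer
                STATE.write_text(json.dumps({"step": step, "gain": gain}))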

  10. Interferences and events on epistemic shifts in physics through computer simulations

    CERN Document Server

    Warnke, Martin

    2017-01-01

    Computer simulations are omnipresent media in today's knowledge production. For scientific endeavors such as the detection of gravitational waves and the exploration of subatomic worlds, simulations are essential; however, the epistemic status of computer simulations is rather controversial as they are neither just theory nor just experiment. Therefore, computer simulations have challenged well-established insights and common scientific practices as well as our very understanding of knowledge. This volume contributes to the ongoing discussion on the epistemic position of computer simulations in a variety of physical disciplines, such as quantum optics, quantum mechanics, and computational physics. Originating from an interdisciplinary event, it shows that accounts of contemporary physics can constructively interfere with media theory, philosophy, and the history of science.

  11. Scientific computer simulation review

    International Nuclear Information System (INIS)

    Kaizer, Joshua S.; Heller, A. Kevin; Oberkampf, William L.

    2015-01-01

    Before the results of a scientific computer simulation are used for any purpose, it should be determined if those results can be trusted. Answering that question of trust is the domain of scientific computer simulation review. There is limited literature that focuses on simulation review, and most is specific to the review of a particular type of simulation. This work is intended to provide a foundation for a common understanding of simulation review. This is accomplished through three contributions. First, scientific computer simulation review is formally defined. This definition identifies the scope of simulation review and provides the boundaries of the review process. Second, maturity assessment theory is developed. This development clarifies the concepts of maturity criteria, maturity assessment sets, and maturity assessment frameworks, which are essential for performing simulation review. Finally, simulation review is described as the application of a maturity assessment framework. This is illustrated through evaluating a simulation review performed by the U.S. Nuclear Regulatory Commission. In making these contributions, this work provides a means for a more objective assessment of a simulation’s trustworthiness and takes the next step in establishing scientific computer simulation review as its own field. - Highlights: • We define scientific computer simulation review. • We develop maturity assessment theory. • We formally define a maturity assessment framework. • We describe simulation review as the application of a maturity framework. • We provide an example of a simulation review using a maturity framework
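
    The vocabulary the paper defines lends itself to a compact data model: a framework is a set of criteria, each assessed on an ordinal maturity scale and compared against the level required for the intended use. The sketch below is a hedged illustration of that reading; the criterion names and thresholds are invented, not the NRC's.

        from dataclasses import dataclass

        # Hypothetical maturity-assessment framework: a review reduces to
        # checking each criterion's assessed level against its requirement.
        @dataclass
        class Criterion:
            name: str
            score: int      # assessed maturity level, e.g. 0-3
            required: int   # minimum level for the intended use

        def review(framework):
            gaps = [c.name for c in framework if c.score < c.required]
            return "adequate for this use" if not gaps else "gaps: " + ", ".join(gaps)

        framework = [
            Criterion("code verification", 2, 2),
            Criterion("solution verification", 1, 2),
            Criterion("validation evidence", 2, 2),
        ]
        print(review(framework))  # gaps: solution verification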

  12. Computational Experiments for Science and Engineering Education

    Science.gov (United States)

    Xie, Charles

    2011-01-01

    How to integrate simulation-based engineering and science (SBES) into the science curriculum smoothly is a challenging question. For the importance of SBES to be appreciated, the core value of simulations (that they help people understand natural phenomena and solve engineering problems) must be taught. A strategy to achieve this goal is to introduce computational experiments to the science curriculum to replace or supplement textbook illustrations and exercises and to complement or frame hands-on or wet lab experiments. In this way, students will have an opportunity to learn about SBES without compromising other learning goals required by the standards, and teachers will welcome these tools as they strengthen what they are already teaching. This paper demonstrates this idea using a number of examples in physics, chemistry, and engineering. These exemplary computational experiments show that it is possible to create a curriculum that is both deeper and wider.

  13. Computer simulation of charged fusion-product trajectories and detection efficiency expected for future experiments within the COMPASS tokamak

    International Nuclear Information System (INIS)

    Kwiatkowski, Roch; Malinowski, Karol; Sadowski, Marek J

    2014-01-01

    This paper presents results of computer simulations of charged particle motions and detection efficiencies for an ion-pinhole camera of a new diagnostic system to be used in future COMPASS tokamak experiments. A probe equipped with a nuclear track detector can deliver information about charged products of fusion reactions. The calculations were performed with the so-called Gourdon code, based on a single-particle model and toroidal symmetry. Trajectories of fast ions (> 500 keV) in medium-dense plasma (n_e < 10¹⁴ cm⁻³) were computed, together with the expected detection efficiency (the ratio of the number of detected particles to the number of particles emitted from the plasma). The simulations showed that charged fusion products can reach the new diagnostic probe, and that the expected detection efficiency can reach 2 × 10⁻⁸. Based on such calculations, one can determine the optimal position and orientation of the probe. The obtained results are of importance for the interpretation of fusion-product images to be recorded in future COMPASS experiments. (paper)
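
    Since the efficiency is a simple detected-to-emitted ratio, its reported magnitude fixes the statistics needed for an interpretable track image. A back-of-the-envelope check (the target count of 100 tracks is an assumption, not a figure from the paper):

        # With efficiency eff = detected/emitted ~ 2e-8, the emission needed
        # to register a given number of tracks follows directly.
        eff = 2e-8            # detection efficiency reported above
        wanted_tracks = 100   # assumed count for a usable image
        print(f"{wanted_tracks / eff:.1e} fusion products must be emitted")  # 5.0e+09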

  14. Computer simulation of FT-NMR multiple pulse experiment

    Science.gov (United States)

    Allouche, A.; Pouzard, G.

    1989-04-01

    Using the product operator formalism in its real form, SIMULDENS expands the density matrix of a scalar-coupled nuclear spin system and simulates analytically a large variety of FT-NMR multiple pulse experiments. The observable transverse magnetizations are stored and can be combined to represent signal accumulation. The programming language is VAX PASCAL, but a Macintosh Turbo Pascal version is also available.
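
    The core recipe such a program automates is density-matrix evolution, rho -> U rho U†, followed by reading out transverse magnetization. A minimal one-spin sketch of that mechanics (SIMULDENS itself handles scalar-coupled multi-spin systems via product operators; this is only the underlying idea, with made-up pulse parameters):

        import numpy as np

        # One spin-1/2, one-pulse FT-NMR sketch: 90-degree pulse about x,
        # then free precession about z, then detection of Mx and My.
        Ix = 0.5 * np.array([[0, 1], [1, 0]], dtype=complex)
        Iy = 0.5 * np.array([[0, -1j], [1j, 0]], dtype=complex)
        Iz = 0.5 * np.array([[1, 0], [0, -1]], dtype=complex)

        def rot(op, angle):
            # exact propagator exp(-i*angle*op) for spin-1/2 operators
            return np.cos(angle / 2) * np.eye(2) - 2j * np.sin(angle / 2) * op

        rho = Iz                                   # equilibrium (deviation) state
        U90 = rot(Ix, np.pi / 2)                   # 90-degree pulse about x
        rho = U90 @ rho @ U90.conj().T
        Ufree = rot(Iz, 2 * np.pi * 100.0 * 1e-3)  # 100 Hz offset, 1 ms evolution
        rho = Ufree @ rho @ Ufree.conj().T

        Mx = np.trace(rho @ Ix).real
        My = np.trace(rho @ Iy).real
        print(Mx, My)                              # precessing transverse signal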

  15. Integration of genetic algorithm, computer simulation and design of experiments for forecasting electrical energy consumption

    International Nuclear Information System (INIS)

    Azadeh, A.; Tarverdian, S.

    2007-01-01

    This study presents an integrated algorithm for forecasting monthly electrical energy consumption based on a genetic algorithm (GA), computer simulation and design of experiments using stochastic procedures. First, a time-series model is developed as a benchmark for the GA and simulation. Computer simulation is developed to generate random variables for monthly electricity consumption, in order to foresee the effects of probabilistic distributions on monthly electricity consumption. The GA and simulation-based GA models are then developed from the selected time-series model. There are therefore four treatments to be considered in the analysis of variance (ANOVA): actual data, time series, GA and simulation-based GA. ANOVA is used to test the null hypothesis that the above four alternatives are equal. If the null hypothesis is accepted, then the lowest mean absolute percentage error (MAPE) value is used to select the best model; otherwise the Duncan Multiple Range Test (DMRT) method of paired comparison is used to select the optimum model, which could be time series, GA or simulation-based GA. In case of ties the lowest MAPE value is used as the benchmark. The integrated algorithm has several unique features. First, it is flexible and identifies the best model based on the results of ANOVA and MAPE, whereas previous studies consider the best-fit GA model based on MAPE or relative-error results alone. Second, the proposed algorithm may identify a conventional time-series model as the best model for future electricity consumption forecasting because of its dynamic structure, whereas previous studies assume that GA always provides the best solutions and estimates. To show the applicability and superiority of the proposed algorithm, the monthly electricity consumption in Iran from March 1994 to February 2005 (131 months) is used and applied to the proposed algorithm
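
    The final selection step reduces to computing MAPE for each candidate forecaster and taking the smallest. A sketch of just that step (the ANOVA/DMRT screening that precedes it is omitted, and all numbers are hypothetical):

        import numpy as np

        # MAPE-based model selection among the treatments named above.
        def mape(actual, forecast):
            actual, forecast = np.asarray(actual, float), np.asarray(forecast, float)
            return 100.0 * np.mean(np.abs((actual - forecast) / actual))

        actual = [120, 135, 128, 142]            # hypothetical monthly GWh
        candidates = {
            "time series":         [118, 140, 125, 150],
            "GA":                  [122, 133, 130, 141],
            "simulation-based GA": [121, 136, 126, 144],
        }
        scores = {name: mape(actual, f) for name, f in candidates.items()}
        print(scores, "-> best:", min(scores, key=scores.get))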

  16. Structure and dynamics of amorphous polymers: computer simulations compared to experiment and theory

    International Nuclear Information System (INIS)

    Paul, Wolfgang; Smith, Grant D

    2004-01-01

    This contribution considers recent developments in the computer modelling of amorphous polymeric materials. Progress in our capabilities to build models for the computer simulation of polymers from the detailed atomistic scale up to coarse-grained mesoscopic models, together with the ever-improving performance of computers, have led to important insights from computer simulations into the structural and dynamic properties of amorphous polymers. Structurally, chain connectivity introduces a range of length scales from that of the chemical bond to the radius of gyration of the polymer chain covering 2-4 orders of magnitude. Dynamically, this range of length scales translates into an even larger range of time scales observable in relaxation processes in amorphous polymers ranging from about 10⁻¹³ to 10⁻³ s, or even to 10³ s when glass dynamics is concerned. There is currently no single simulation technique that is able to describe all these length and time scales efficiently. On large length and time scales basic topology and entropy become the governing properties and this fact can be exploited using computer simulations of coarse-grained polymer models to study universal aspects of the structure and dynamics of amorphous polymers. On the largest length and time scales chain connectivity is the dominating factor leading to the strong increase in longest relaxation times described within the reptation theory of polymer melt dynamics. Recently, many of the universal aspects of this behaviour have been further elucidated by computer simulations of coarse-grained polymer models. On short length scales the detailed chemistry and energetics of the polymer are important, and one has to be able to capture them correctly using chemically realistic modelling of specific polymers, even when the aim is to extract generic physical behaviour exhibited by the specific chemistry. Detailed studies of chemically realistic models highlight the central importance of torsional dynamics

  17. Computer simulation of FT-NMR multiple pulse experiment

    International Nuclear Information System (INIS)

    Allouche, A.; Pouzard, G.

    1989-01-01

    Using the product operator formalism in its real form, SIMULDENS expands the density matrix of a scalar-coupled nuclear spin system and simulates analytically a large variety of FT-NMR multiple pulse experiments. The observable transverse magnetizations are stored and can be combined to represent signal accumulation. The programming language is VAX PASCAL, but a Macintosh Turbo Pascal version is also available. (orig.)

  18. Volunteer computing experience with ATLAS@Home

    CERN Document Server

    The ATLAS collaboration; Bianchi, Riccardo-Maria; Cameron, David; Filipčič, Andrej; Lançon, Eric; Wu, Wenjing

    2016-01-01

    ATLAS@Home is a volunteer computing project which allows the public to contribute to computing for the ATLAS experiment through their home or office computers. The project has grown continuously since its creation in mid-2014 and now counts almost 100,000 volunteers. The combined volunteers’ resources make up a sizeable fraction of overall resources for ATLAS simulation. This paper takes stock of the experience gained so far and describes the next steps in the evolution of the project. These improvements include running natively on Linux to ease the deployment on for example university clusters, using multiple cores inside one task to reduce the memory requirements and running different types of workload such as event generation. In addition to technical details the success of ATLAS@Home as an outreach tool is evaluated.

  19. Volunteer Computing Experience with ATLAS@Home

    CERN Document Server

    Cameron, David; The ATLAS collaboration; Bourdarios, Claire; Lançon, Eric

    2016-01-01

    ATLAS@Home is a volunteer computing project which allows the public to contribute to computing for the ATLAS experiment through their home or office computers. The project has grown continuously since its creation in mid-2014 and now counts almost 100,000 volunteers. The combined volunteers' resources make up a sizable fraction of overall resources for ATLAS simulation. This paper takes stock of the experience gained so far and describes the next steps in the evolution of the project. These improvements include running natively on Linux to ease the deployment on for example university clusters, using multiple cores inside one job to reduce the memory requirements and running different types of workload such as event generation. In addition to technical details the success of ATLAS@Home as an outreach tool is evaluated.

  20. Volunteer Computing Experience with ATLAS@Home

    Science.gov (United States)

    Adam-Bourdarios, C.; Bianchi, R.; Cameron, D.; Filipčič, A.; Isacchini, G.; Lançon, E.; Wu, W.; ATLAS Collaboration

    2017-10-01

    ATLAS@Home is a volunteer computing project which allows the public to contribute to computing for the ATLAS experiment through their home or office computers. The project has grown continuously since its creation in mid-2014 and now counts almost 100,000 volunteers. The combined volunteers’ resources make up a sizeable fraction of overall resources for ATLAS simulation. This paper takes stock of the experience gained so far and describes the next steps in the evolution of the project. These improvements include running natively on Linux to ease the deployment on for example university clusters, using multiple cores inside one task to reduce the memory requirements and running different types of workload such as event generation. In addition to technical details the success of ATLAS@Home as an outreach tool is evaluated.

  1. Computer based training simulator for Hunterston Nuclear Power Station

    International Nuclear Information System (INIS)

    Bowden, R.S.M.; Hacking, D.

    1978-01-01

    For reasons which are stated, the Hunterston-B nuclear power station automatic control system includes a manual over-ride facility. It is therefore essential for the station engineers to be trained to recognise and control all feasible modes of plant and logic malfunction. A training simulator has been built which consists of a replica of the shutdown monitoring panel in the Central Control Room and is controlled by a mini-computer. This paper highlights the computer aspects of the simulator and relevant derived experience, under the following headings: engineering background; shutdown sequence equipment; simulator equipment; features; software; testing; maintenance. (U.K.)

  2. Teaching Computer Organization and Architecture Using Simulation and FPGA Applications

    OpenAIRE

    D. K.M. Al-Aubidy

    2007-01-01

    This paper presents the design concepts and realization of incorporating micro-operation simulation and FPGA implementation into a teaching tool for computer organization and architecture. This teaching tool helps computer engineering and computer science students to be familiarized practically with computer organization and architecture through the development of their own instruction set, computer programming and interfacing experiments. A two-pass assembler has been designed and implemente...

  3. Simulation Experiment Description Markup Language (SED-ML) Level 1 Version 2.

    Science.gov (United States)

    Bergmann, Frank T; Cooper, Jonathan; Le Novère, Nicolas; Nickerson, David; Waltemath, Dagmar

    2015-09-04

    The number, size and complexity of computational models of biological systems are growing at an ever increasing pace. It is imperative to build on existing studies by reusing and adapting existing models and parts thereof. The description of the structure of models is not sufficient to enable the reproduction of simulation results. One also needs to describe the procedures the models are subjected to, as recommended by the Minimum Information About a Simulation Experiment (MIASE) guidelines. This document presents Level 1 Version 2 of the Simulation Experiment Description Markup Language (SED-ML), a computer-readable format for encoding simulation and analysis experiments to apply to computational models. SED-ML files are encoded in the Extensible Markup Language (XML) and can be used in conjunction with any XML-based model encoding format, such as CellML or SBML. A SED-ML file includes details of which models to use, how to modify them prior to executing a simulation, which simulation and analysis procedures to apply, which results to extract and how to present them. Level 1 Version 2 extends the format by allowing the encoding of repeated and chained procedures.
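
    A minimal SED-ML document of this kind can be assembled with nothing more than the Python standard library. The element and attribute names below follow the Level 1 Version 2 specification as best recalled here, so the schema should be consulted before relying on this sketch for real interchange; the model file name and KiSAO identifier are placeholders.

        import xml.etree.ElementTree as ET

        # Skeleton of a SED-ML L1V2 file: one model, one uniform time course,
        # and one task binding them together (data generators and outputs
        # would follow the same pattern and are omitted for brevity).
        NS = "http://sed-ml.org/sed-ml/level1/version2"
        root = ET.Element("sedML", {"xmlns": NS, "level": "1", "version": "2"})

        models = ET.SubElement(root, "listOfModels")
        ET.SubElement(models, "model", id="m1",
                      language="urn:sedml:language:sbml", source="oscillator.xml")

        sims = ET.SubElement(root, "listOfSimulations")
        tc = ET.SubElement(sims, "uniformTimeCourse", id="sim1",
                           initialTime="0", outputStartTime="0",
                           outputEndTime="100", numberOfPoints="1000")
        ET.SubElement(tc, "algorithm", kisaoID="KISAO:0000019")  # e.g. CVODE

        tasks = ET.SubElement(root, "listOfTasks")
        ET.SubElement(tasks, "task", id="t1", modelReference="m1",
                      simulationReference="sim1")

        print(ET.tostring(root, encoding="unicode"))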

  4. Numerical simulation of NQR/NMR: Applications in quantum computing.

    Science.gov (United States)

    Possa, Denimar; Gaudio, Anderson C; Freitas, Jair C C

    2011-04-01

    A numerical simulation program able to simulate nuclear quadrupole resonance (NQR) as well as nuclear magnetic resonance (NMR) experiments is presented, written using the Mathematica package and aimed especially at applications in quantum computing. The program makes use of the interaction picture to compute the effect of the relevant nuclear spin interactions, without any assumption about the relative size of each interaction. This makes the program flexible and versatile, being useful in a wide range of experimental situations, from NQR (at zero or under small applied magnetic field) to high-field NMR experiments. Some conditions specifically required for quantum computing applications are implemented in the program, such as the possibility of using elliptically polarized radiofrequency and the inclusion of first- and second-order terms in the average Hamiltonian expansion. A number of examples dealing with simple NQR and quadrupole-perturbed NMR experiments are presented, along with proposals of experiments to create quantum pseudopure states and logic gates using NQR. The program and the various application examples are freely available through the link http://www.profanderson.net/files/nmr_nqr.php.

  5. Experiences using multigrid for geothermal simulation

    Energy Technology Data Exchange (ETDEWEB)

    Bullivant, D.P.; O'Sullivan, M.J. [Univ. of Auckland (New Zealand)]; Yang, Z. [Univ. of New South Wales (Australia)]

    1995-03-01

    Experiences of applying multigrid to the calculation of natural states for geothermal simulations are discussed. The modelling of natural states was chosen for this study because they can take a long time to compute and the computation is often dominated by the development of phase change boundaries that take up a small region in the simulation. For the first part of this work a modified version of TOUGH was used for 2-D vertical problems. A "test-bed" program is now being used to investigate some of the problems encountered with implementing multigrid. This is ongoing work. To date, there have been some encouraging but not startling results.
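
    The payoff of multigrid comes from correcting smooth error components on coarser grids, where relaxation is cheap. A compact 1-D Poisson V-cycle shows the mechanics (a teaching sketch only; a geothermal code such as TOUGH faces far harder nonlinear, multiphase systems):

        import numpy as np

        # Recursive V-cycle for u'' = f on [0,1] with u(0) = u(1) = 0.
        def smooth(u, f, h, sweeps=3):
            for _ in range(sweeps):                    # Gauss-Seidel relaxation
                for i in range(1, len(u) - 1):
                    u[i] = 0.5 * (u[i - 1] + u[i + 1] - h * h * f[i])
            return u

        def v_cycle(u, f, h):
            u = smooth(u, f, h)
            if len(u) <= 3:
                return u
            r = np.zeros_like(u)                       # residual r = f - Au
            r[1:-1] = f[1:-1] - (u[:-2] - 2 * u[1:-1] + u[2:]) / (h * h)
            ec = v_cycle(np.zeros_like(r[::2]), r[::2].copy(), 2 * h)
            e = np.interp(np.arange(len(u)), np.arange(0, len(u), 2), ec)
            return smooth(u + e, f, h)                 # prolongate and re-smooth

        n, h = 65, 1.0 / 64
        f, u = np.full(n, -1.0), np.zeros(n)
        for _ in range(10):
            u = v_cycle(u, f, h)
        print(u[n // 2])                               # ~0.125; exact: x(1-x)/2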

  6. Atomic-level computer simulation

    International Nuclear Information System (INIS)

    Adams, J.B.; Rockett, Angus; Kieffer, John; Xu Wei; Nomura, Miki; Kilian, K.A.; Richards, D.F.; Ramprasad, R.

    1994-01-01

    This paper provides a broad overview of the methods of atomic-level computer simulation. It discusses methods of modelling atomic bonding, and computer simulation methods such as energy minimization, molecular dynamics, Monte Carlo, and lattice Monte Carlo. ((orig.))
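
    As a concrete taste of one method from this list, here is a two-particle Lennard-Jones system integrated with velocity Verlet in reduced units (a teaching sketch, not taken from the paper):

        import numpy as np

        # Two Lennard-Jones particles, velocity-Verlet molecular dynamics.
        # V(d) = 4(d^-12 - d^-6); the pair starts stretched and oscillates
        # about the potential minimum at d = 2^(1/6).
        def force(x):
            r = x[0] - x[1]
            d = np.linalg.norm(r)
            f0 = (48.0 / d**14 - 24.0 / d**8) * r   # -grad V on particle 0
            return np.array([f0, -f0])

        dt, steps = 0.002, 500
        x = np.array([[0.0, 0.0, 0.0], [1.3, 0.0, 0.0]])
        v = np.zeros((2, 3))
        f = force(x)
        for _ in range(steps):
            x += v * dt + 0.5 * f * dt * dt         # update positions
            f_new = force(x)
            v += 0.5 * (f + f_new) * dt             # update velocities
            f = f_new
        print(np.linalg.norm(x[0] - x[1]))          # oscillates near 2**(1/6)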

  7. Computational fluid dynamics (CFD) simulation of hot air flow ...

    African Journals Online (AJOL)

    Computational Fluid Dynamics simulation of air flow distribution, air velocity and pressure field pattern as it will affect moisture transient in a cabinet tray dryer is performed using SolidWorks Flow Simulation (SWFS) 2014 SP 4.0 program. The model used for the drying process in this experiment was designed with Solid ...

  8. Reduction of community alcohol problems: computer simulation experiments in three counties.

    Science.gov (United States)

    Holder, H D; Blose, J O

    1987-03-01

    A series of alcohol abuse prevention strategies was evaluated using computer simulation for three counties in the United States: Wake County, North Carolina, Washington County, Vermont and Alameda County, California. A system dynamics model composed of a network of interacting variables was developed for the pattern of alcoholic beverage consumption in a community. The relationship of community drinking patterns to various stimulus factors was specified in the model based on available empirical research. Stimulus factors included disposable income, alcoholic beverage prices, advertising exposure, minimum drinking age and changes in cultural norms. After a generic model was developed and validated on the national level, a computer-based system dynamics model was developed for each county, and a series of experiments was conducted to project the potential impact of specific prevention strategies. The project concluded that prevention efforts can both lower current levels of alcohol abuse and reduce projected increases in alcohol-related problems. Without such efforts, already high levels of alcohol-related family disruptions in the three counties could be expected to rise an additional 6% and drinking-related work problems 1-5%, over the next 10 years after controlling for population growth. Of the strategies tested, indexing the price of alcoholic beverages to the consumer price index in conjunction with the implementation of a community educational program with well-defined target audiences has the best potential for significant problem reduction in all three counties.
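
    The underlying structure is a stock-and-flow system: consumption is a stock pushed up by income and advertising and damped by price. The toy below mimics only that structure; every coefficient is invented, and it is not the Holder-Blose model.

        # Toy system-dynamics sketch of per-capita consumption (the stock).
        years, dt = 10, 0.1
        consumption = 10.0                  # litres of ethanol per capita/year
        income_growth, ad_pressure = 0.02, 0.01
        real_price_drift = 0.0              # 0 = prices not indexed to inflation

        t, price = 0.0, 1.0
        while t < years:
            price *= 1.0 + dt * real_price_drift
            inflow = consumption * (income_growth + ad_pressure)
            outflow = consumption * 0.05 * (price - 1.0)  # demand response
            consumption += dt * (inflow - outflow)
            t += dt
        print(f"{consumption:.2f}")  # rerun with real_price_drift = 0.03
                                     # to see the indexing policy damp growth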

  9. Computer simulations of a 1/5-scale experiment of a Mark I boiling water reactor pressure-suppression system under hypothetical LOCA conditions

    International Nuclear Information System (INIS)

    Edwards, L.L.

    1978-01-01

    The CHAMP computer code was employed to simulate a plane-geometry cross section of a Mark I boiling water reactor toroidal pressure suppression system air discharge experiment under hypothetical loss-of-coolant accident conditions. The experiments were performed at the Lawrence Livermore Laboratory on a 1/5-scale model of the Peach Bottom Nuclear Power Plant

  10. Research on integrated simulation of fluid-structure system by computation science techniques

    International Nuclear Information System (INIS)

    Yamaguchi, Akira

    1996-01-01

    At the Power Reactor and Nuclear Fuel Development Corporation, research on the integrated simulation of fluid-structure systems by computational science techniques has been carried out. Its aim is to substitute computational science techniques for the verification of plant systems, which has depended on large-scale experiments, thereby reducing development costs and optimizing FBR systems. For this purpose, it is necessary to establish the technology for integrally and accurately analyzing complicated phenomena (simulation technology), the technology for applying it to large-scale problems (speed-up technology), and the technology for assuring the reliability of analysis results when simulation technology is used for the licensing and approval of FBRs (verification technology). The simulation of fluid-structure interaction, heat-flow simulation in spaces with complicated geometry, and the related technologies are explained. As applications of computational science techniques, the elucidation of phenomena by numerical experiments and the use of numerical simulation as a substitute for tests are discussed. (K.I.)

  11. A Computer Simulation of Community Pharmacy Practice for Educational Use.

    Science.gov (United States)

    Bindoff, Ivan; Ling, Tristan; Bereznicki, Luke; Westbury, Juanita; Chalmers, Leanne; Peterson, Gregory; Ollington, Robert

    2014-11-15

    To provide a computer-based learning method for pharmacy practice that is as effective as paper-based scenarios, but more engaging and less labor-intensive. We developed a flexible and customizable computer simulation of community pharmacy. Using it, the students would be able to work through scenarios which encapsulate the entirety of a patient presentation. We compared the traditional paper-based teaching method to our computer-based approach using equivalent scenarios. The paper-based group had 2 tutors while the computer group had none. Both groups were given a prescenario and postscenario clinical knowledge quiz and survey. Students in the computer-based group had generally greater improvements in their clinical knowledge score, and third-year students using the computer-based method also showed more improvements in history taking and counseling competencies. Third-year students also found the simulation fun and engaging. Our simulation of community pharmacy provided an educational experience as effective as the paper-based alternative, despite the lack of a human tutor.

  12. Development and application of a computer model for large-scale flame acceleration experiments

    International Nuclear Information System (INIS)

    Marx, K.D.

    1987-07-01

    A new computational model for large-scale premixed flames is developed and applied to the simulation of flame acceleration experiments. The primary objective is to circumvent the necessity for resolving turbulent flame fronts; this is imperative because of the relatively coarse computational grids which must be used in engineering calculations. The essence of the model is to artificially thicken the flame by increasing the appropriate diffusivities and decreasing the combustion rate, but to do this in such a way that the burn velocity varies with pressure, temperature, and turbulence intensity according to prespecified phenomenological characteristics. The model is particularly aimed at implementation in computer codes which simulate compressible flows. To this end, it is applied to the two-dimensional simulation of hydrogen-air flame acceleration experiments in which the flame speeds and gas flow velocities attain or exceed the speed of sound in the gas. It is shown that many of the features of the flame trajectories and pressure histories in the experiments are simulated quite well by the model. Using the comparison of experimental and computational results as a guide, some insight is developed into the processes which occur in such experiments. 34 refs., 25 figs., 4 tabs
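
    The scaling logic behind such a model can be made explicit with the standard laminar-flame estimates (a textbook argument consistent with, but not quoted from, the report): the burn velocity and flame thickness scale as

        % s_L: laminar burn velocity, delta_L: flame thickness,
        % D: diffusivity, \dot{\omega}: combustion rate, F: thickening factor
        s_L \propto \sqrt{D\,\dot{\omega}}, \qquad
        \delta_L \propto \sqrt{D/\dot{\omega}}

    so replacing D by F D and \dot{\omega} by \dot{\omega}/F leaves s_L unchanged while thickening the front by the factor F, which is exactly what lets a coarse engineering grid resolve it.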

  13. Simulated experiments

    International Nuclear Information System (INIS)

    Bjerknes, R.

    1977-01-01

    A cybernetic model has been developed to elucidate some of the main principles of the growth regulation system in the epidermis of the hairless mouse. A number of actual and theoretical biological experiments have been simulated on the model. These included simulating the cell kinetics as measured by pulse labelling with tritiated thymidine and by continuous labelling with tritiated thymidine. Other simulated experiments included steady state, wear and tear, painting with a carcinogen, heredity and heredity and tumour. Numerous diagrams illustrate the results of these simulated experiments. (JIW)

  14. Using Computer Simulations in Chemistry Problem Solving

    Science.gov (United States)

    Avramiotis, Spyridon; Tsaparlis, Georgios

    2013-01-01

    This study is concerned with the effects of computer simulations of two novel chemistry problems on the problem-solving ability of students. A control-experimental group research design, equalized by paired groups (n_Exp = n_Ctrl = 78), was used. The students had no previous experience of chemical practical work. Student…

  15. Virtual geotechnical laboratory experiments using a simulator

    Science.gov (United States)

    Penumadu, Dayakar; Zhao, Rongda; Frost, David

    2000-04-01

    The details of a test simulator that provides a realistic environment for performing virtual laboratory experiments in soil mechanics are presented. A computer program, Geo-Sim, that can be used to perform virtual experiments and allows for real-time observation of material response is presented. The results of experiments, for a given set of input parameters, are obtained with the test simulator using well-trained artificial-neural-network-based soil models for different soil types and stress paths. Multimedia capabilities are integrated in Geo-Sim, using software that links and controls a laser disc player with real-time parallel processing ability. During the simulation of a virtual experiment, relevant portions of the video image of a previously recorded test on an actual soil specimen are displayed along with the graphical presentation of the response predicted by the feedforward ANN models. The pilot simulator developed to date includes all aspects related to performing a triaxial test on cohesionless soil under undrained and drained conditions. The benefits of the test simulator are also presented.

  16. Cloud Computing in Science and Engineering and the “SciShop.ru” Computer Simulation Center

    Directory of Open Access Journals (Sweden)

    E. V. Vorozhtsov

    2011-12-01

    Various aspects of cloud computing applications for scientific research, applied design, and remote education are described in this paper. An analysis of the different aspects is performed based on the experience from the “SciShop.ru” Computer Simulation Center. This analysis shows that cloud computing technology has wide prospects in scientific research applications, applied developments and also remote education of specialists, postgraduates, and students.

  17. Optically stimulated luminescence sensitivity changes in quartz due to repeated use in single aliquot readout: Experiments and computer simulations

    DEFF Research Database (Denmark)

    McKeever, S.W.S.; Bøtter-Jensen, L.; Agersnap Larsen, N.

    1996-01-01

    As part of a study to examine sensitivity changes in single aliquot techniques using optically stimulated luminescence (OSL) a series of experiments has been conducted with single aliquots of natural quartz, and the data compared with the results of computer simulations of the type of processes believed to be occurring. The computer model used includes both shallow and deep ('hard-to-bleach') traps, OSL ('easy-to-bleach') traps, and radiative and non-radiative recombination centres. The model has previously been used successfully to account for sensitivity changes in quartz due to thermal annealing. The simulations are able to reproduce qualitatively the main features of the experimental results including sensitivity changes as a function of reuse, and their dependence upon bleaching time and laboratory dose. The sensitivity changes are believed to be the result of a combination of shallow trap and deep trap effects.

  18. Effect of computer game playing on baseline laparoscopic simulator skills.

    Science.gov (United States)

    Halvorsen, Fredrik H; Cvancarova, Milada; Fosse, Erik; Mjåland, Odd

    2013-08-01

    Studies examining the possible association between computer game playing and laparoscopic performance in general have yielded conflicting results, and no relationship between computer game playing and baseline performance on laparoscopic simulators has been established. The aim of this study was to examine the possible association between previous and present computer game playing and baseline performance in a virtual reality laparoscopic task in a sample of potential future medical students. The participating students completed a questionnaire covering the weekly amount and type of computer game playing activity during the previous year and 3 years ago. They then performed 2 repetitions of 2 tasks ("gallbladder dissection" and "traverse tube") on a virtual reality laparoscopic simulator. Performance on the simulator was then analyzed for associations with computer game experience. Local high school, Norway. Forty-eight students from 2 high school classes volunteered to participate in the study. No association between prior and present computer game playing and baseline performance was found. The results were similar both for prior and present action game playing and for prior and present computer game playing in general. Our results indicate that prior and present computer game playing may not affect baseline performance in a virtual reality simulator.

  19. Simulation of quantum computers

    NARCIS (Netherlands)

    De Raedt, H; Michielsen, K; Hams, AH; Miyashita, S; Saito, K; Landau, DP; Lewis, SP; Schuttler, HB

    2001-01-01

    We describe a simulation approach to study the functioning of Quantum Computer hardware. The latter is modeled by a collection of interacting spin-1/2 objects. The time evolution of this spin system maps one-to-one to a quantum program carried out by the Quantum Computer. Our simulation software
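
    The essence of such simulation software is linear algebra on a state vector of 2^n amplitudes. A minimal sketch of that general technique (illustrative only; the cited software models physical spin-1/2 hardware, including its Hamiltonian time evolution):

        import numpy as np

        # Prepare a Bell state on a 2-qubit register by direct state-vector
        # manipulation: Hadamard on qubit 0, then CNOT(0 -> 1).
        def apply_gate(state, gate, target, n):
            # make the target qubit its own tensor axis, contract, restore
            state = state.reshape([2] * n)
            state = np.tensordot(gate, state, axes=([1], [target]))
            state = np.moveaxis(state, 0, target)
            return state.reshape(-1)

        n = 2
        state = np.zeros(2**n, dtype=complex)
        state[0] = 1.0                                   # |00>
        H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
        state = apply_gate(state, H, 0, n)

        CNOT = np.eye(4, dtype=complex)[[0, 1, 3, 2]]    # control 0, target 1
        state = CNOT @ state
        print(np.round(state, 3))                        # (|00> + |11>)/sqrt(2)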

  20. Simulation of quantum computers

    NARCIS (Netherlands)

    Raedt, H. De; Michielsen, K.; Hams, A.H.; Miyashita, S.; Saito, K.

    2000-01-01

    We describe a simulation approach to study the functioning of Quantum Computer hardware. The latter is modeled by a collection of interacting spin-1/2 objects. The time evolution of this spin system maps one-to-one to a quantum program carried out by the Quantum Computer. Our simulation software

  1. Computer simulation games in population and education.

    Science.gov (United States)

    Moreland, R S

    1988-01-01

    Computer-based simulation games are effective training tools that have several advantages. They enable players to learn in a nonthreatening manner and develop strategies to achieve goals in a dynamic environment. They also provide visual feedback on the effects of players' decisions, encourage players to explore and experiment with options before making final decisions, and develop players' skills in analysis, decision making, and cooperation. 2 games have been developed by the Research Triangle Institute for public-sector planning agencies interested in or dealing with developing countries. The UN Population and Development Game teaches players about the interaction between population variables and the national economy and how population policies complement other national policies, such as education. The BRIDGES Education Planning Game focuses on the effects education has on national policies. In both games, the computer simulates the reactions of a fictional country's socioeconomic system to players' decisions. Players can change decisions after seeing their effects on a computer screen and thus can improve their performance in achieving goals.

  2. On the computer simulation of the EPR-Bohm experiment

    International Nuclear Information System (INIS)

    McGoveran, D.O.; Noyes, H.P.; Manthey, M.J.

    1988-12-01

    We argue that supraluminal correlation without supraluminal signaling is a necessary consequence of any finite and discrete model for physics. Every day, the commercial and military practice of using encrypted communication based on correlated, pseudo-random signals illustrates this possibility. All that is needed are two levels of computational complexity which preclude using a smaller system to detect departures from 'randomness' in the larger system. Hence the experimental realizations of the EPR-Bohm experiment leave open the question of whether the world of experience is 'random' or pseudo-random. The latter possibility could be demonstrated experimentally if a complexity parameter related to the arm length and switching time in an Aspect-type realization of the EPR-Bohm experiment is sufficiently small compared to the number of reliable total counts which can be obtained in practice. 6 refs

  3. Advanced computational simulations of water waves interacting with wave energy converters

    Science.gov (United States)

    Pathak, Ashish; Freniere, Cole; Raessi, Mehdi

    2017-03-01

    Wave energy converter (WEC) devices harness the renewable ocean wave energy and convert it into useful forms of energy, e.g. mechanical or electrical. This paper presents an advanced 3D computational framework to study the interaction between water waves and WEC devices. The computational tool solves the full Navier-Stokes equations and considers all important effects impacting the device performance. To enable large-scale simulations in fast turnaround times, the computational solver was developed in an MPI parallel framework. A fast multigrid preconditioned solver is introduced to solve the computationally expensive pressure Poisson equation. The computational solver was applied to two surface-piercing WEC geometries: bottom-hinged cylinder and flap. Their numerically simulated response was validated against experimental data. Additional simulations were conducted to investigate the applicability of Froude scaling in predicting full-scale WEC response from the model experiments.
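
    The Froude-scaling relations being tested are the standard ones (given here for context, not quoted from the paper): with geometric scale factor \lambda = L_full / L_model and the Froude number held fixed across scales,

        \mathrm{Fr} = \frac{U}{\sqrt{gL}}, \qquad
        U_{\mathrm{full}} = \sqrt{\lambda}\,U_{\mathrm{model}}, \qquad
        T_{\mathrm{full}} = \sqrt{\lambda}\,T_{\mathrm{model}}, \qquad
        F_{\mathrm{full}} = \lambda^{3}\,F_{\mathrm{model}}

    where the force ratio assumes the same fluid density at both scales.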

  4. FPGA-accelerated simulation of computer systems

    CERN Document Server

    Angepat, Hari; Chung, Eric S; Hoe, James C; Chung, Eric S

    2014-01-01

    To date, the most common form of simulator of computer systems is software-based, running on standard computers. One promising approach to improving simulation performance is to apply hardware, specifically reconfigurable hardware in the form of field programmable gate arrays (FPGAs). This manuscript describes various approaches to using FPGAs to accelerate software-implemented simulation of computer systems and selected simulators that incorporate those techniques. More precisely, we describe a simulation architecture taxonomy that incorporates a simulation architecture specifically designed f

  5. The effects of computer assisted physics experiment simulations on students' learning

    Directory of Open Access Journals (Sweden)

    Turhan Civelek

    2013-11-01

    The main goal of this study is to identify significant differences between lectures supported by physics experiment simulations and traditional physics lectures. Two groups of 115 students each were selected for the purpose of the study. The same subjects were taught to both groups: for a month, one group of 115 had their science and technology lectures supported by physics experiment simulations, while the other group of 115 had their lectures in a traditional way. The research was conducted at Izzet Unver high school in Gungoren, Istanbul. The main data source of this research is surveys. The survey was based on the literature and the suggestions of experts on the topic. Thirty questions were prepared under ten topics. Two different surveys were conducted during the data collection: the first survey's questions focused on the effects of traditional lecturing on students, while the second survey's questions targeted the effects of lecturing supported by physics experiment simulations. The collected data were coded into SPSS software and statistical analyses were conducted. A t-test was used to test for significant differences between means, with 0.05 chosen as the significance level. The analyses found significant differences in students' satisfaction with class materials, motivation, learning speed, interest in the class, and contribution to the class. In findings such as the effect on students' learning, information availability, organization of information, students' integration into the class and gaining different points of view, “lectures supported by physics experiment simulations” differ significantly from traditional lecturing. As the result of the literature review and the statistical analyses, “lectures supported via physics experiment simulations” seem to

  6. Computer simulation as representation of knowledge in education

    International Nuclear Information System (INIS)

    Krekic, Valerija Pinter; Namestovski, Zolt

    2009-01-01

    According to Aebli's operative method (1963) and Bruner's (1974) theory of representation, the development of the process of thinking in teaching has the following phases, or levels of abstraction: manipulation of specific things (specific phase), iconic representation (figural phase), and symbolic representation (symbolic phase). Modern information technology has contributed to the enrichment of teaching and learning processes, especially in the fields of natural sciences and mathematics and those of production and technology. Simulation appears as a new possibility for the representation of knowledge. According to Guetzkow (1972), simulation is an operative representation of reality from a relevant aspect. It is a model of an objective system, which is dynamic in itself. If that model is material, it is a simple simulation; if it is abstract, it is a reflective experiment, that is, a computer simulation. The present work deals with the systematization and classification of simulation methods in the teaching of natural sciences and mathematics and of production and technology, with a special retrospective view of computer simulations and an exemplary presentation of the place and role of this modern method of cognition. Key words: Representation of knowledge, modeling, simulation, education

  7. Using computer simulations to probe the structure and dynamics of biopolymers

    International Nuclear Information System (INIS)

    Levy, R.M.; Hirata, F.; Kim, K.; Zhang, P.

    1987-01-01

    The use of computer simulations to study internal motions and thermodynamic properties is receiving increased attention. One important use of the method is to provide a more fundamental understanding of the molecular information contained in various kinds of experiments on these complex systems. In the first part of this paper the authors review recent work in their laboratory concerned with the use of computer simulations for the interpretation of experimental probes of molecular structure and dynamics of proteins and nucleic acids. The interplay between computer simulations and three experimental techniques is emphasized: (1) nuclear magnetic resonance relaxation spectroscopy, (2) refinement of macro-molecular x-ray structures, and (3) vibrational spectroscopy. The treatment of solvent effects in biopolymer simulations is a difficult problem. It is not possible to study systematically the effect of solvent conditions, e.g. added salt concentration, on biopolymer properties by means of simulations alone. In the last part of the paper the authors review a more analytical approach they developed to study polyelectrolyte properties of solvated biopolymers. The results are compared with computer simulations

  8. Massively parallel quantum computer simulator

    NARCIS (Netherlands)

    De Raedt, K.; Michielsen, K.; De Raedt, H.; Trieu, B.; Arnold, G.; Richter, M.; Lippert, Th.; Watanabe, H.; Ito, N.

    2007-01-01

    We describe portable software to simulate universal quantum computers on massively parallel computers. We illustrate the use of the simulation software by running various quantum algorithms on different computer architectures, such as an IBM BlueGene/L, an IBM Regatta p690+, a Hitachi SR11000/J1, a Cray

  9. Optically stimulated luminescence sensitivity changes in quartz due to repeated use in single aliquot readout: experiments and computer simulations

    International Nuclear Information System (INIS)

    McKeever, S.W.S.; Oklahoma State Univ., Stillwater, OK; Boetter-Jensen, L.; Agersnap Larsen, N.; Mejdahl, V.; Poolton, N.R.J.

    1996-01-01

    As part of a study to examine sensitivity changes in single aliquot techniques using optically stimulated luminescence (OSL) a series of experiments has been conducted with single aliquots of natural quartz, and the data compared with the results of computer simulations of the type of processes believed to be occurring. The computer model used includes both shallow and deep ('hard-to-bleach') traps, OSL ('easy-to-bleach') traps, and radiative and non-radiative recombination centres. The model has previously been used successfully to account for sensitivity changes in quartz due to thermal annealing. The simulations are able to reproduce qualitatively the main features of the experimental results including sensitivity changes as a function of re-use, and their dependence upon bleaching time and laboratory dose. The sensitivity changes are believed to be the result of a combination of shallow trap and deep trap effects. (author)

  10. Experiments in computing: a survey.

    Science.gov (United States)

    Tedre, Matti; Moisseinen, Nella

    2014-01-01

    Experiments play a central role in science. The role of experiments in computing is, however, unclear. Questions about the relevance of experiments in computing attracted little attention until the 1980s. As the discipline then saw a push towards experimental computer science, a variety of technically, theoretically, and empirically oriented views on experiments emerged. As a consequence of those debates, today's computing fields use experiments and experiment terminology in a variety of ways. This paper analyzes experimentation debates in computing. It presents five ways in which debaters have conceptualized experiments in computing: feasibility experiment, trial experiment, field experiment, comparison experiment, and controlled experiment. This paper has three aims: to clarify experiment terminology in computing; to contribute to disciplinary self-understanding of computing; and, due to computing's centrality in other fields, to promote understanding of experiments in modern science in general.

  11. Could running experience on SPMD computers contribute to the architectural choices for future dedicated computers for high energy physics simulation

    International Nuclear Information System (INIS)

    Jejcic, A.; Maillard, J.; Silva, J.; Auguin, M.; Boeri, F.

    1989-01-01

    Results obtained on a strongly coupled parallel computer are reported. They concern Monte-Carlo simulation and pattern recognition. Though the calculations were made on an experimental computer of rather low processing power, it is believed that the quoted figures could give useful indications on architectural choices for dedicated computers

  12. Benefits of computer screen-based simulation in learning cardiac arrest procedures.

    Science.gov (United States)

    Bonnetain, Elodie; Boucheix, Jean-Michel; Hamet, Maël; Freysz, Marc

    2010-07-01

    What is the best way to train medical students early so that they acquire basic skills in cardiopulmonary resuscitation as effectively as possible? Studies have shown the benefits of high-fidelity patient simulators, but have also demonstrated their limits. New computer screen-based multimedia simulators have fewer constraints than high-fidelity patient simulators. In this area, as yet, there has been no research on the effectiveness of transfer of learning from a computer screen-based simulator to more realistic situations such as those encountered with high-fidelity patient simulators. We tested the benefits of learning cardiac arrest procedures using a multimedia computer screen-based simulator in 28 Year 2 medical students. Just before the end of the traditional resuscitation course, we compared two groups. An experiment group (EG) was first asked to learn to perform the appropriate procedures in a cardiac arrest scenario (CA1) in the computer screen-based learning environment and was then tested on a high-fidelity patient simulator in another cardiac arrest simulation (CA2). While the EG was learning to perform CA1 procedures in the computer screen-based learning environment, a control group (CG) actively continued to learn cardiac arrest procedures using practical exercises in a traditional class environment. Both groups were given the same amount of practice, exercises and trials. The CG was then also tested on the high-fidelity patient simulator for CA2, after which it was asked to perform CA1 using the computer screen-based simulator. Performances with both simulators were scored on a precise 23-point scale. On the test on a high-fidelity patient simulator, the EG trained with a multimedia computer screen-based simulator performed significantly better than the CG trained with traditional exercises and practice (16.21 versus 11.13 of 23 possible points, respectively; p<0.001). Computer screen-based simulation appears to be effective in preparing learners to

  13. Use of computer simulations for the early introduction of nuclear engineering concepts

    International Nuclear Information System (INIS)

    Ougouag, A.M.; Zerguini, T.H.

    1985-01-01

    A sophomore level nuclear engineering (NE) course is being introduced at the University of Illinois. Via computer simulations, this course presents materials covering the most important aspects of the field. It is noted that computer simulations in nuclear engineering are cheaper and safer than experiments yet they provide an effective teaching tool for the early introduction of advanced concepts. The new course material can be used as a tutorial and for remedial learning. The use of computer simulation motivates learning since students associate computer activities with games. Such a course can help in the dissemination of the proper information to students from different fields, including liberal arts, and eventually increase undergraduate student enrollment in nuclear engineering

  14. Could running experience on SPMD computers contribute to the architectural choices for future dedicated computers for high energy physics simulation?

    International Nuclear Information System (INIS)

    Jejcic, A.; Maillard, J.; Silva, J.; Auguin, M.; Boeri, F.

    1989-01-01

    Results obtained on a strongly coupled parallel computer are reported. They concern Monte-Carlo simulation and pattern recognition. Though the calculations were made on an experimental computer of rather low processing power, it is believed that the quoted figures could give useful indications on architectural choices for dedicated computers. (orig.)

  15. Event-by-event simulation of Einstein-Podolsky-Rosen-Bohm experiments

    NARCIS (Netherlands)

    Zhao, Shuang; De Raedt, Hans; Michielsen, Kristel

    We construct an event-based computer simulation model of the Einstein-Podolsky-Rosen-Bohm experiments with photons. The algorithm is a one-to-one copy of the data gathering and analysis procedures used in real laboratory experiments. We consider two types of experiments, those with a source emitting

  16. Computer simulation of high energy displacement cascades

    International Nuclear Information System (INIS)

    Heinisch, H.L.

    1990-01-01

    A methodology developed for modeling many aspects of high energy displacement cascades with molecular level computer simulations is reviewed. The initial damage state is modeled in the binary collision approximation (using the MARLOWE computer code), and the subsequent disposition of the defects within a cascade is modeled with a Monte Carlo annealing simulation (the ALSOME code). There are few adjustable parameters, and none are set to physically unreasonable values. The basic configurations of the simulated high energy cascades in copper, i.e., the number, size and shape of damage regions, compare well with observations, as do the measured numbers of residual defects and the fractions of freely migrating defects. The success of these simulations is somewhat remarkable, given the relatively simple models of defects and their interactions that are employed. The reason for this success is that the behavior of the defects is very strongly influenced by their initial spatial distributions, which the binary collision approximation adequately models. The MARLOWE/ALSOME system, with input from molecular dynamics and experiments, provides a framework for investigating the influence of high energy cascades on microstructure evolution. (author)
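
    To illustrate the annealing stage described above, the following Python sketch implements a toy version of a post-cascade Monte Carlo anneal: point defects random-walk, and vacancy-interstitial pairs recombine when they come within a capture radius. All numbers (defect counts, jump size, capture radius) are invented for demonstration and are not MARLOWE/ALSOME parameters.

      # Toy cascade anneal: random defect jumps plus recombination.
      # All parameters are invented, for illustration only.
      import numpy as np

      rng = np.random.default_rng(0)

      n_pairs = 50
      vacancies = rng.normal(0.0, 5.0, size=(n_pairs, 3))       # clustered damage core
      interstitials = rng.normal(0.0, 15.0, size=(n_pairs, 3))  # more spread out
      capture_radius = 1.5

      def anneal_step(vac, inter, step=1.0):
          """One MC sweep: every surviving defect makes a random jump, then any
          vacancy-interstitial pair closer than the capture radius recombines."""
          vac = vac + rng.normal(0.0, step, size=vac.shape)
          inter = inter + rng.normal(0.0, step, size=inter.shape)
          keep_v = np.ones(len(vac), dtype=bool)
          keep_i = np.ones(len(inter), dtype=bool)
          for i, v in enumerate(vac):
              if not keep_i.any():
                  break
              d = np.linalg.norm(inter - v, axis=1)
              d[~keep_i] = np.inf          # ignore already-recombined interstitials
              j = int(np.argmin(d))
              if d[j] < capture_radius:
                  keep_v[i] = False
                  keep_i[j] = False
          return vac[keep_v], inter[keep_i]

      for sweep in range(200):
          vacancies, interstitials = anneal_step(vacancies, interstitials)

      print(f"residual defects: {len(vacancies)} of {n_pairs} Frenkel pairs survive")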

  17. Biomass Gasifier for Computer Simulation; Biomassa foergasare foer Computer Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Hansson, Jens; Leveau, Andreas; Hulteberg, Christian [Nordlight AB, Limhamn (Sweden)

    2011-08-15

    This report is an effort to summarize the existing data on biomass gasifiers, as the authors have taken part in various projects aiming at computer simulations of systems that include biomass gasification. Reliable input data are paramount for any computer simulation, but so far there is no easily accessible biomass gasifier database available for this purpose. This study aims at benchmarking current and past gasifier systems in order to create a comprehensive database for computer simulation purposes. The result of the investigation is presented in a Microsoft Excel sheet, so that users can easily implement the data in their specific models. In addition to providing simulation data, the technology is described briefly for every studied gasifier system. The primary pieces of information sought are temperatures, pressures, stream compositions and energy consumption. At present the resulting database contains 17 gasifiers, with one or more gasifiers within each of the gasification technology types normally discussed in this context: 1. Fixed bed 2. Fluidised bed 3. Entrained flow. It also contains gasifiers ranging from 100 kW to 120 MW, with several gasifiers in between these two values. Finally, there are gasifiers representing both direct and indirect heating. This allows for a more qualified and more readily available choice of starting data sets for simulations. In addition, with multiple data sets available for several of the operating modes, sensitivity analysis of various inputs will improve the simulations performed. There were fewer answers to the survey than hoped for, which limited the database; however, the use of online sources and other public information has to some extent counterbalanced the low response frequency of the survey. The database is intended to be a living document, continuously updated with new gasifiers and improved information on existing gasifiers.

  18. Computer simulation of gain fluctuations in proportional counters

    International Nuclear Information System (INIS)

    Demir, Nelgun; Tapan, Ilhan

    2004-01-01

    A computer simulation code has been developed in order to examine the fluctuations in gas amplification in wire proportional counters, which are common in detector applications in particle physics experiments. The magnitude of the variance in the gain dominates the statistical portion of the energy resolution. In order to compare simulation and experimental results, the gain and its variation have been calculated numerically for the well-known ALEPH Inner Tracking Detector geometry. The results show that the bias voltage has a strong influence on the variance in the gain. The simulation calculations are in good agreement with experimental results. (authors)
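
    As a hedged illustration of how such gain fluctuations are commonly modeled (a textbook Polya-distribution approach, not necessarily the code used in the paper), the Python sketch below draws single-electron avalanche gains from a gamma distribution and recovers the expected relative variance. The mean gain and the Polya parameter are assumed values.

      # Monte Carlo of proportional-counter gain fluctuations, Polya model.
      # mean_gain, theta and n_primaries are assumed, not from the paper.
      import numpy as np

      rng = np.random.default_rng(1)

      mean_gain = 2.0e4   # assumed mean gas amplification
      theta = 0.5         # assumed Polya shape parameter
      n_primaries = 100   # primary electrons per event
      n_events = 20000

      # Polya-distributed single-electron gain = gamma(k = 1 + theta)
      k = 1.0 + theta
      single = rng.gamma(k, mean_gain / k, size=(n_events, n_primaries))
      event_gain = single.sum(axis=1)

      rel_var_single = single.var() / single.mean() ** 2
      rel_var_event = event_gain.var() / event_gain.mean() ** 2
      print(f"single-electron relative variance f = {rel_var_single:.3f} "
            f"(theory 1/(1+theta) = {1 / (1 + theta):.3f})")
      print(f"event relative variance ~ f/N = {rel_var_event:.5f}")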

  19. CMS distributed computing workflow experience

    Science.gov (United States)

    Adelman-McCarthy, Jennifer; Gutsche, Oliver; Haas, Jeffrey D.; Prosper, Harrison B.; Dutta, Valentina; Gomez-Ceballos, Guillelmo; Hahn, Kristian; Klute, Markus; Mohapatra, Ajit; Spinoso, Vincenzo; Kcira, Dorian; Caudron, Julien; Liao, Junhui; Pin, Arnaud; Schul, Nicolas; De Lentdecker, Gilles; McCartin, Joseph; Vanelderen, Lukas; Janssen, Xavier; Tsyganov, Andrey; Barge, Derek; Lahiff, Andrew

    2011-12-01

    The vast majority of the CMS Computing capacity, which is organized in a tiered hierarchy, is located away from CERN. The 7 Tier-1 sites archive the LHC proton-proton collision data that is initially processed at CERN. These sites provide access to all recorded and simulated data for the Tier-2 sites, via wide-area network (WAN) transfers. All central data processing workflows are executed at the Tier-1 level, which include re-reconstruction and skimming workflows of collision data as well as reprocessing of simulated data to adapt to changing detector conditions. This paper describes the operation of the CMS processing infrastructure at the Tier-1 level. The Tier-1 workflows are described in detail. The operational optimization of resource usage is described. In particular, the variation of different workflows during the data taking period of 2010, their efficiencies and latencies as well as their impact on the delivery of physics results is discussed and lessons are drawn from this experience. The simulation of proton-proton collisions for the CMS experiment is primarily carried out at the second tier of the CMS computing infrastructure. Half of the Tier-2 sites of CMS are reserved for central Monte Carlo (MC) production while the other half is available for user analysis. This paper summarizes the large throughput of the MC production operation during the data taking period of 2010 and discusses the latencies and efficiencies of the various types of MC production workflows. We present the operational procedures to optimize the usage of available resources, and we present the operational model of CMS for including opportunistic resources, such as the larger Tier-3 sites, into the central production operation.

  20. CMS distributed computing workflow experience

    International Nuclear Information System (INIS)

    Adelman-McCarthy, Jennifer; Gutsche, Oliver; Haas, Jeffrey D; Prosper, Harrison B; Dutta, Valentina; Gomez-Ceballos, Guillelmo; Hahn, Kristian; Klute, Markus; Mohapatra, Ajit; Spinoso, Vincenzo; Kcira, Dorian; Caudron, Julien; Liao Junhui; Pin, Arnaud; Schul, Nicolas; Lentdecker, Gilles De; McCartin, Joseph; Vanelderen, Lukas; Janssen, Xavier; Tsyganov, Andrey

    2011-01-01

    The vast majority of the CMS Computing capacity, which is organized in a tiered hierarchy, is located away from CERN. The 7 Tier-1 sites archive the LHC proton-proton collision data that is initially processed at CERN. These sites provide access to all recorded and simulated data for the Tier-2 sites, via wide-area network (WAN) transfers. All central data processing workflows are executed at the Tier-1 level, which include re-reconstruction and skimming workflows of collision data as well as reprocessing of simulated data to adapt to changing detector conditions. This paper describes the operation of the CMS processing infrastructure at the Tier-1 level. The Tier-1 workflows are described in detail. The operational optimization of resource usage is described. In particular, the variation of different workflows during the data taking period of 2010, their efficiencies and latencies as well as their impact on the delivery of physics results is discussed and lessons are drawn from this experience. The simulation of proton-proton collisions for the CMS experiment is primarily carried out at the second tier of the CMS computing infrastructure. Half of the Tier-2 sites of CMS are reserved for central Monte Carlo (MC) production while the other half is available for user analysis. This paper summarizes the large throughput of the MC production operation during the data taking period of 2010 and discusses the latencies and efficiencies of the various types of MC production workflows. We present the operational procedures to optimize the usage of available resources, and we present the operational model of CMS for including opportunistic resources, such as the larger Tier-3 sites, into the central production operation.

  1. Computational Physics Simulation of Classical and Quantum Systems

    CERN Document Server

    Scherer, Philipp O. J

    2010-01-01

    This book encapsulates the coverage for a two-semester course in computational physics. The first part introduces the basic numerical methods while omitting mathematical proofs but demonstrating the algorithms by way of numerous computer experiments. The second part specializes in simulation of classical and quantum systems with instructive examples spanning many fields in physics, from a classical rotor to a quantum bit. All program examples are realized as Java applets ready to run in your browser and do not require any programming skills.

  2. Environments for online maritime simulators with cloud computing capabilities

    Science.gov (United States)

    Raicu, Gabriel; Raicu, Alexandra

    2016-12-01

    This paper presents cloud computing environments, network principles and methods for graphical development in realistic naval simulation, naval robotics and virtual interactions. The aim of this approach is to achieve good simulation quality in large networked environments using open-source solutions designed for educational purposes. Realistic rendering of maritime environments requires near real-time frameworks with enhanced computing capabilities during distance interactions. E-Navigation concepts coupled with the latest achievements in virtual and augmented reality will enhance the overall experience, leading to new developments and innovations. We have to deal with a multiprocessing situation using advanced technologies and distributed applications, using remote ship scenarios and the automation of ship operations.

  3. Computer Assisted Fluid Power Instruction: A Comparison of Hands-On and Computer-Simulated Laboratory Experiences for Post-Secondary Students

    Science.gov (United States)

    Wilson, Scott B.

    2005-01-01

    The primary purpose of this study was to examine the effectiveness of utilizing a combination of lecture and computer resources to train personnel to assume roles as hydraulic system technicians and specialists in the fluid power industry. This study compared computer simulated laboratory instruction to traditional hands-on laboratory instruction,…

  4. National Fusion Collaboratory: Grid Computing for Simulations and Experiments

    Science.gov (United States)

    Greenwald, Martin

    2004-05-01

    The National Fusion Collaboratory Project is creating a computational grid designed to advance scientific understanding and innovation in magnetic fusion research by facilitating collaborations, enabling more effective integration of experiments, theory and modeling and allowing more efficient use of experimental facilities. The philosophy of FusionGrid is that data, codes, analysis routines, visualization tools, and communication tools should be thought of as network available services, easily used by the fusion scientist. In such an environment, access to services is stressed rather than portability. By building on a foundation of established computer science toolkits, deployment time can be minimized. These services all share the same basic infrastructure that allows for secure authentication and resource authorization which allows stakeholders to control their own resources such as computers, data and experiments. Code developers can control intellectual property, and fair use of shared resources can be demonstrated and controlled. A key goal is to shield scientific users from the implementation details such that transparency and ease-of-use are maximized. The first FusionGrid service deployed was the TRANSP code, a widely used tool for transport analysis. Tools for run preparation, submission, monitoring and management have been developed and shared among a wide user base. This approach saves user sites from the laborious effort of maintaining such a large and complex code while at the same time reducing the burden on the development team by avoiding the need to support a large number of heterogeneous installations. Shared visualization and A/V tools are being developed and deployed to enhance long-distance collaborations. These include desktop versions of the Access Grid, a highly capable multi-point remote conferencing tool and capabilities for sharing displays and analysis tools over local and wide-area networks.

  5. The adaptation method in the Monte Carlo simulation for computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Hyoung Gun; Yoon, Chang Yeon; Lee, Won Ho [Dept. of Bio-convergence Engineering, Korea University, Seoul (Korea, Republic of); Cho, Seung Ryong [Dept. of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Park, Sung Ho [Dept. of Neurosurgery, Ulsan University Hospital, Ulsan (Korea, Republic of)

    2015-06-15

    The patient dose incurred from diagnostic procedures during advanced radiotherapy has become an important issue. Many researchers in medical physics are using computational simulations to calculate complex parameters in experiments. However, extended computation times make it difficult for personal computers to run the conventional Monte Carlo method to simulate radiological images with high-flux photons such as images produced by computed tomography (CT). To minimize the computation time without degrading imaging quality, we applied a deterministic adaptation to the Monte Carlo calculation and verified its effectiveness by simulating CT image reconstruction for an image evaluation phantom (Catphan; Phantom Laboratory, New York NY, USA) and a human-like voxel phantom (KTMAN-2) (Los Alamos National Laboratory, Los Alamos, NM, USA). For the deterministic adaptation, the relationship between iteration numbers and the simulations was estimated and the option to simulate scattered radiation was evaluated. The processing times of simulations using the adaptive method were at least 500 times faster than those using a conventional statistical process. In addition, compared with the conventional statistical method, the adaptive method provided images that were more similar to the experimental images, which proved that the adaptive method was highly effective for a simulation that requires a large number of iterations. Assuming no radiation scattering in the vicinity of the detectors minimized artifacts in the reconstructed image.
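
    A minimal sketch of the deterministic ingredient such hybrid schemes rely on: when scatter near the detector is neglected, the primary CT signal follows from a Beer-Lambert line integral rather than from millions of Monte Carlo photon histories per detector bin. The phantom, geometry and photon count below are invented placeholders, not the paper's setup.

      # Deterministic primary signal for one CT view via Beer-Lambert.
      # Toy phantom and geometry, for illustration only.
      import numpy as np

      mu = np.zeros((128, 128))           # attenuation map [1/pixel]
      yy, xx = np.mgrid[0:128, 0:128]
      mu[(xx - 64) ** 2 + (yy - 64) ** 2 < 40 ** 2] = 0.02  # water-like disc

      def primary_intensity(mu, row, n0=1.0e6):
          """Primary photons surviving along one horizontal ray."""
          path = mu[row, :].sum()          # line integral of mu along the ray
          return n0 * np.exp(-path)

      detector = np.array([primary_intensity(mu, r) for r in range(128)])
      sinogram_line = -np.log(detector / 1.0e6)  # projection values, one view
      print(sinogram_line[60:68])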

  6. Simulation of complete neutron scattering experiments: from model systems to liquid germanium; Simulation complete d'une experience de diffusion de neutrons: des systemes modeles au germanium liquide

    Energy Technology Data Exchange (ETDEWEB)

    Hugouvieux, V

    2004-11-15

    In this thesis, both theoretical and experimental studies of liquids are done. Neutron scattering enables structural and dynamical properties of liquids to be investigated. On the theoretical side, molecular dynamics simulations are of great interest since they give positions and velocities of the atoms and the forces acting on each of them. They also enable spatial and temporal correlations to be computed and these quantities are also available from neutron scattering experiments. Consequently, the comparison can be made between results from molecular dynamics simulations and from neutron scattering experiments, in order to improve our understanding of the structure and dynamics of liquids. However, since extracting reliable data from a neutron scattering experiment is difficult, we propose to simulate the experiment as a whole, including both instrument and sample, in order to gain understanding and to evaluate the impact of the different parasitic contributions (absorption, multiple scattering associated with elastic and inelastic scattering, instrument resolution). This approach, in which the sample is described by its structure and dynamics as computed from molecular dynamics simulations, is presented and tested on isotropic model systems. Then liquid germanium is investigated by inelastic neutron scattering and both classical and ab initio molecular dynamics simulations. This enables us to simulate the experiment we performed and to evaluate the influence of the contributions from the instrument and from the sample on the detected signal. (author)

  7. Computational physics. Simulation of classical and quantum systems

    Energy Technology Data Exchange (ETDEWEB)

    Scherer, Philipp O.J. [TU Muenchen (Germany). Physikdepartment T38

    2010-07-01

    This book encapsulates the coverage for a two-semester course in computational physics. The first part introduces the basic numerical methods while omitting mathematical proofs but demonstrating the algorithms by way of numerous computer experiments. The second part specializes in simulation of classical and quantum systems with instructive examples spanning many fields in physics, from a classical rotor to a quantum bit. All program examples are realized as Java applets ready to run in your browser and do not require any programming skills. (orig.)

  8. Integration of adaptive process control with computational simulation for spin-forming

    International Nuclear Information System (INIS)

    Raboin, P. J. LLNL

    1998-01-01

    Improvements in spin-forming capabilities through upgrades to a metrology and machine control system and advances in numerical simulation techniques were studied in a two year project funded by Laboratory Directed Research and Development (LDRD) at Lawrence Livermore National Laboratory. Numerical analyses were benchmarked with spin-forming experiments and computational speeds increased sufficiently to now permit actual part forming simulations. Extensive modeling activities examined the simulation speeds and capabilities of several metal forming computer codes for modeling flat plate and cylindrical spin-forming geometries. Shape memory research created the first numerical model to describe this highly unusual deformation behavior in Uranium alloys. A spin-forming metrology assessment led to sensor and data acquisition improvements that will facilitate future process accuracy enhancements, such as a metrology frame. Finally, software improvements (SmartCAM) to the manufacturing process numerically integrate the part models to the spin-forming process and to computational simulations

  9. Simulation Experiment Description Markup Language (SED-ML) Level 1 Version 3 (L1V3)

    Directory of Open Access Journals (Sweden)

    Bergmann, Frank T.

    2018-03-01

    The creation of computational simulation experiments to inform modern biological research poses challenges to reproduce, annotate, archive, and share such experiments. Efforts such as SBML or CellML standardize the formal representation of computational models in various areas of biology. The Simulation Experiment Description Markup Language (SED-ML) describes what procedures the models are subjected to, and the details of those procedures. These standards, together with further COMBINE standards, describe models sufficiently well for the reproduction of simulation studies among users and software tools. The Simulation Experiment Description Markup Language (SED-ML) is an XML-based format that encodes, for a given simulation experiment, (i) which models to use; (ii) which modifications to apply to models before simulation; (iii) which simulation procedures to run on each model; (iv) how to post-process the data; and (v) how these results should be plotted and reported. SED-ML Level 1 Version 1 (L1V1) implemented support for the encoding of basic time course simulations. SED-ML L1V2 added support for more complex types of simulations, specifically repeated tasks and chained simulation procedures. SED-ML L1V3 extends L1V2 by means to describe which datasets and subsets thereof to use within a simulation experiment.

  10. Simulation Experiment Description Markup Language (SED-ML) Level 1 Version 3 (L1V3).

    Science.gov (United States)

    Bergmann, Frank T; Cooper, Jonathan; König, Matthias; Moraru, Ion; Nickerson, David; Le Novère, Nicolas; Olivier, Brett G; Sahle, Sven; Smith, Lucian; Waltemath, Dagmar

    2018-03-19

    The creation of computational simulation experiments to inform modern biological research poses challenges to reproduce, annotate, archive, and share such experiments. Efforts such as SBML or CellML standardize the formal representation of computational models in various areas of biology. The Simulation Experiment Description Markup Language (SED-ML) describes what procedures the models are subjected to, and the details of those procedures. These standards, together with further COMBINE standards, describe models sufficiently well for the reproduction of simulation studies among users and software tools. The Simulation Experiment Description Markup Language (SED-ML) is an XML-based format that encodes, for a given simulation experiment, (i) which models to use; (ii) which modifications to apply to models before simulation; (iii) which simulation procedures to run on each model; (iv) how to post-process the data; and (v) how these results should be plotted and reported. SED-ML Level 1 Version 1 (L1V1) implemented support for the encoding of basic time course simulations. SED-ML L1V2 added support for more complex types of simulations, specifically repeated tasks and chained simulation procedures. SED-ML L1V3 extends L1V2 by means to describe which datasets and subsets thereof to use within a simulation experiment.
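
    For orientation, the sketch below emits a skeleton SED-ML document containing the first three of the five ingredients listed above (model, time-course simulation, task), using only the Python standard library. Element, attribute and namespace spellings follow our reading of the L1V3 specification and should be validated against the official schema; the model file name is hypothetical.

      # Skeleton SED-ML L1V3 document built with ElementTree.
      # Validate against the official SED-ML schema before relying on it.
      import xml.etree.ElementTree as ET

      NS = "http://sed-ml.org/sed-ml/level1/version3"   # L1V3 namespace as we read the spec
      ET.register_namespace("", NS)
      root = ET.Element(f"{{{NS}}}sedML", {"level": "1", "version": "3"})

      models = ET.SubElement(root, f"{{{NS}}}listOfModels")
      ET.SubElement(models, f"{{{NS}}}model", {
          "id": "model1", "language": "urn:sedml:language:sbml",
          "source": "oscillator.xml"})                  # (i) which model to use (hypothetical file)

      sims = ET.SubElement(root, f"{{{NS}}}listOfSimulations")
      utc = ET.SubElement(sims, f"{{{NS}}}uniformTimeCourse", {
          "id": "sim1", "initialTime": "0", "outputStartTime": "0",
          "outputEndTime": "100", "numberOfPoints": "1000"})  # (iii) simulation procedure
      ET.SubElement(utc, f"{{{NS}}}algorithm", {"kisaoID": "KISAO:0000019"})  # CVODE

      tasks = ET.SubElement(root, f"{{{NS}}}listOfTasks")
      ET.SubElement(tasks, f"{{{NS}}}task", {
          "id": "task1", "modelReference": "model1", "simulationReference": "sim1"})

      ET.ElementTree(root).write("experiment.sedml", xml_declaration=True,
                                 encoding="utf-8")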

  11. CMS Distributed Computing Workflow Experience

    CERN Document Server

    Haas, Jeffrey David

    2010-01-01

    The vast majority of the CMS Computing capacity, which is organized in a tiered hierarchy, is located away from CERN. The 7 Tier-1 sites archive the LHC proton-proton collision data that is initially processed at CERN. These sites provide access to all recorded and simulated data for the Tier-2 sites, via wide-area network (WAN) transfers. All central data processing workflows are executed at the Tier-1 level, which contain re-reconstruction and skimming workflows of collision data as well as reprocessing of simulated data to adapt to changing detector conditions. This paper describes the operation of the CMS processing infrastructure at the Tier-1 level. The Tier-1 workflows are described in detail. The operational optimization of resource usage is described. In particular, the variation of different workflows during the data taking period of 2010, their efficiencies and latencies as well as their impact on the delivery of physics results is discussed and lessons are drawn from this experience. The simul...

  12. Topographic evolution of sandbars: Flume experiment and computational modeling

    Science.gov (United States)

    Kinzel, Paul J.; Nelson, Jonathan M.; McDonald, Richard R.; Logan, Brandy L.

    2010-01-01

    Measurements of sandbar formation and evolution were carried out in a laboratory flume, and the topographic characteristics of these barforms were compared to predictions from a computational flow and sediment transport model with bed evolution. The flume experiment produced sandbars of approximately mode 2, whereas numerical simulations produced a bed morphology better approximated as alternate bars of mode 1. In addition, bar formation occurred more rapidly in the laboratory channel than in the model channel. This paper focuses on a steady-flow laboratory experiment without upstream sediment supply. Future experiments will examine the effects of unsteady flow and sediment supply and the use of numerical models to simulate the response of barform topography to these influences.

  13. SPINET: A Parallel Computing Approach to Spine Simulations

    Directory of Open Access Journals (Sweden)

    Peter G. Kropf

    1996-01-01

    Research in scientific programming enables us to realize more and more complex applications, and, on the other hand, application-driven demands on computing methods and power are continuously growing. Therefore, interdisciplinary approaches become more widely used. The interdisciplinary SPINET project presented in this article applies modern scientific computing tools to biomechanical simulations: parallel computing and symbolic and modern functional programming. The target application is the human spine. Simulations of the spine help us to investigate and better understand the mechanisms of back pain and spinal injury. Two approaches have been used: the first uses the finite element method for high-performance simulations of static biomechanical models, and the second generates a simulation development tool for experimenting with different dynamic models. A finite element program for static analysis has been parallelized for the MUSIC machine. To solve the sparse system of linear equations, a conjugate gradient solver (iterative method) and a frontal solver (direct method) have been implemented. The preprocessor required for the frontal solver is written in the modern functional programming language SML, the solver itself in C, thus exploiting the characteristic advantages of both functional and imperative programming. The speedup analysis of both solvers shows very satisfactory results for this irregular problem. A mixed symbolic-numeric environment for rigid body system simulations is presented. It automatically generates C code from a problem specification expressed by the Lagrange formalism using Maple.
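
    The iterative solver mentioned above is the classical conjugate gradient method; a minimal serial NumPy version (no preconditioning, unlike a production finite element code) is sketched below on a small symmetric positive-definite test system.

      # Plain conjugate gradient for A x = b, A symmetric positive definite.
      import numpy as np

      def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
          x = np.zeros_like(b)
          r = b - A @ x          # residual
          p = r.copy()           # search direction
          rs = r @ r
          for _ in range(max_iter):
              Ap = A @ p
              alpha = rs / (p @ Ap)
              x += alpha * p
              r -= alpha * Ap
              rs_new = r @ r
              if np.sqrt(rs_new) < tol:
                  break
              p = r + (rs_new / rs) * p
              rs = rs_new
          return x

      # small SPD test system
      A = np.array([[4.0, 1.0], [1.0, 3.0]])
      b = np.array([1.0, 2.0])
      print(conjugate_gradient(A, b))   # approx [0.0909, 0.6364]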

  14. Simulation of complete neutron scattering experiments: from model systems to liquid germanium

    International Nuclear Information System (INIS)

    Hugouvieux, V.

    2004-11-01

    In this thesis, both theoretical and experimental studies of liquids are done. Neutron scattering enables structural and dynamical properties of liquids to be investigated. On the theoretical side, molecular dynamics simulations are of great interest since they give positions and velocities of the atoms and the forces acting on each of them. They also enable spatial and temporal correlations to be computed and these quantities are also available from neutron scattering experiments. Consequently, the comparison can be made between results from molecular dynamics simulations and from neutron scattering experiments, in order to improve our understanding of the structure and dynamics of liquids. However, since extracting reliable data from a neutron scattering experiment is difficult, we propose to simulate the experiment as a whole, including both instrument and sample, in order to gain understanding and to evaluate the impact of the different parasitic contributions (absorption, multiple scattering associated with elastic and inelastic scattering, instrument resolution). This approach, in which the sample is described by its structure and dynamics as computed from molecular dynamics simulations, is presented and tested on isotropic model systems. Then liquid germanium is investigated by inelastic neutron scattering and both classical and ab initio molecular dynamics simulations. This enables us to simulate the experiment we performed and to evaluate the influence of the contributions from the instrument and from the sample on the detected signal. (author)
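
    The central quantity compared between such simulations and neutron scattering data is the intermediate scattering function F(k,t). The sketch below estimates it from a trajectory array of the shape molecular dynamics would produce; the random-walk "trajectory" is a stand-in so the snippet runs on its own, and the wave vector is arbitrary.

      # Coherent intermediate scattering function F(k,t) from MD positions.
      # The trajectory here is synthetic placeholder data.
      import numpy as np

      rng = np.random.default_rng(2)
      n_frames, n_atoms = 200, 64
      traj = np.cumsum(rng.normal(0, 0.05, (n_frames, n_atoms, 3)), axis=0)

      def intermediate_scattering(traj, k_vec, max_lag=100):
          """F(k,t) = <rho_k(t0+t) rho_-k(t0)> / N, averaged over time origins."""
          phases = np.exp(1j * traj @ k_vec)   # per-atom phase factors (frames, atoms)
          rho_k = phases.sum(axis=1)           # collective density mode
          n = traj.shape[1]
          f = np.empty(max_lag)
          for lag in range(max_lag):
              f[lag] = np.mean(rho_k[lag:] *
                               np.conj(rho_k[:len(rho_k) - lag])).real / n
          return f

      k_vec = np.array([2.0, 0.0, 0.0])        # wave vector, arbitrary units
      F = intermediate_scattering(traj, k_vec)
      print(F[:5] / F[0])                      # normalized decay F(k,t)/F(k,0)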

  15. Thermal Hydraulic Computational Fluid Dynamics Simulations and Experimental Investigation of Deformed Fuel Assemblies

    Energy Technology Data Exchange (ETDEWEB)

    Mays, Brian [AREVA Federal Services, Lynchburg, VA (United States); Jackson, R. Brian [TerraPower, Bellevue, WA (United States)

    2017-03-08

    The project, Toward a Longer Life Core: Thermal Hydraulic CFD Simulations and Experimental Investigation of Deformed Fuel Assemblies, DOE Project code DE-NE0008321, was a verification and validation project for flow and heat transfer through wire-wrapped simulated liquid metal fuel assemblies that included both experiments and computational fluid dynamics simulations of those experiments. This project was a two-year collaboration between AREVA, TerraPower, Argonne National Laboratory and Texas A&M University. Experiments were performed by AREVA and Texas A&M University. Numerical simulations of these experiments were performed by TerraPower and Argonne National Laboratory. Project management was performed by AREVA Federal Services. This first-of-a-kind project resulted in the production of both local point temperature measurements and local flow mixing experiment data, paired with numerical simulation benchmarking of the experiments. The project experiments included the largest wire-wrapped pin assembly Mass Index of Refraction (MIR) experiment in the world, the first known wire-wrapped assembly experiment with deformed duct geometries, and the largest numerical simulations ever produced for wire-wrapped bundles.

  16. Computer simulation for sodium-concrete reactions

    International Nuclear Information System (INIS)

    Zhang Bin; Zhu Jizhou

    2006-01-01

    In liquid metal cooled fast breeder reactors (LMFBRs), direct contact between sodium and concrete is unavoidable. Due to sodium's high chemical reactivity, sodium would react violently with concrete, releasing large amounts of hydrogen gas and heat and thereby threatening the integrity of the containment. This paper developed a program to simulate sodium-concrete reactions comprehensively. It can give the reaction zone temperature, pool temperature, penetration depth, penetration rate, hydrogen flux, reaction heat and so on. Concrete was considered to be composed of silica and water only in this paper. A variable, the quotient of sodium hydroxide, was introduced in the continuity equation to simulate the chemical reactions more realistically. The product of the net gas flux and the boundary depth was suitably transformed into that of the penetration rate and the boundary depth. The complex chemical kinetics equations were simplified under some hypotheses. All the techniques applied above simplified the computer simulation considerably; in other words, they made the computer simulation feasible. The theoretical models applied in the program and the calculation procedure are expounded in detail. Good agreement in overall transient behavior was obtained in the analysis of a series of sodium-concrete reaction experiments. The comparison between the analytical and experimental results showed that the program presented in this paper is credible and reasonable for simulating sodium-concrete reactions. This program can be used for nuclear safety judgement. (authors)

  17. Reproducibility in Computational Neuroscience Models and Simulations

    Science.gov (United States)

    McDougal, Robert A.; Bulanova, Anna S.; Lytton, William W.

    2016-01-01

    Objective Like all scientific research, computational neuroscience research must be reproducible. Big data science, including simulation research, cannot depend exclusively on journal articles as the method to provide the sharing and transparency required for reproducibility. Methods Ensuring model reproducibility requires the use of multiple standard software practices and tools, including version control, strong commenting and documentation, and code modularity. Results Building on these standard practices, model sharing sites and tools have been developed that fit into several categories: (1) standardized neural simulators; (2) shared computational resources; (3) declarative model descriptors, ontologies and standardized annotations; (4) model sharing repositories and sharing standards. Conclusion A number of complementary innovations have been proposed to enhance sharing, transparency and reproducibility. The individual user can be encouraged to make use of version control, commenting, documentation and modularity in development of models. The community can help by requiring model sharing as a condition of publication and funding. Significance Model management will become increasingly important as multiscale models become larger, more detailed and correspondingly more difficult to manage by any single investigator or single laboratory. Additional big data management complexity will come as the models become more useful in interpreting experiments, thus increasing the need to ensure clear alignment between modeling data, both parameters and results, and experiment. PMID:27046845

  18. Validation of thermohydraulic codes by comparison of experimental results with computer simulations

    International Nuclear Information System (INIS)

    Madeira, A.A.; Galetti, M.R.S.; Pontedeiro, A.C.

    1989-01-01

    The results obtained by simulating three cases from the CANON depressurization experiment, using the TRAC-PF1 computer code, version 7.6, implemented on the VAX-11/750 computer of the Brazilian CNEN, are presented. The CANON experiment was chosen as the first thermal-hydraulics standard problem to be discussed at ENFIR for comparing results from different computer codes with results obtained experimentally. The ability of the TRAC-PF1 code to predict the depressurization phase of a loss of primary coolant accident in pressurized water reactors is evaluated. (M.C.K.)

  19. Parallel reservoir simulator computations

    International Nuclear Information System (INIS)

    Hemanth-Kumar, K.; Young, L.C.

    1995-01-01

    The adaptation of a reservoir simulator for parallel computations is described. The simulator was originally designed for vector processors. It performs approximately 99% of its calculations in vector/parallel mode, and relative to scalar calculations it achieves speedups of 65 and 81 for black oil and EOS simulations, respectively, on the CRAY C-90.

  20. Computer simulations of a rough sphere fluid

    International Nuclear Information System (INIS)

    Lyklema, J.W.

    1978-01-01

    A computer simulation of rough hard spheres with a continuously variable roughness parameter, including the limits of smooth and completely rough spheres, is described. A system of 500 particles with a homogeneous mass distribution is simulated at 8 different densities and for 5 different values of the roughness parameter. For these 40 physically different situations the intermediate scattering function for 6 values of the wave number, the orientational correlation functions and the velocity autocorrelation functions have been calculated. A comparison has been made with a neutron scattering experiment on neopentane, and agreement was good for an intermediate value of the roughness parameter. Some often-made approximations in neutron scattering experiments are also checked. The influence of the variable roughness parameter on the correlation functions has been investigated, and three simple stochastic models have been studied to describe the orientational correlation function, which shows the most pronounced dependence on the roughness. (Auth.)
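
    One of the quantities computed in such simulations, the velocity autocorrelation function, can be estimated from stored velocities as in the sketch below; the random velocities are placeholders for real simulation output.

      # Velocity autocorrelation function, averaged over particles and time
      # origins. The velocity array is synthetic placeholder data.
      import numpy as np

      rng = np.random.default_rng(3)
      n_frames, n_particles = 500, 100
      vel = rng.normal(size=(n_frames, n_particles, 3))

      def vacf(vel, max_lag=100):
          """C(t) = <v(t0) . v(t0+t)>, normalized so that C(0) = 1."""
          c = np.empty(max_lag)
          for lag in range(max_lag):
              dots = np.sum(vel[:len(vel) - lag] * vel[lag:], axis=-1)
              c[lag] = dots.mean()
          return c / c[0]

      print(vacf(vel)[:5])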

  1. Computer simulation of void formation in residual gas atom free metals by dual beam irradiation experiments

    International Nuclear Information System (INIS)

    Shimomura, Y.; Nishiguchi, R.; La Rubia, T.D. de; Guinan, M.W.

    1992-01-01

    In our recent experiments (1), we found that voids nucleate at vacancy clusters which trap gas atoms such as hydrogen and helium in ion- and neutron-irradiated copper. A molecular dynamics computer simulation, which implements an empirical embedded atom method to calculate the forces that act on atoms in metals, suggests that void nucleation occurs in pure copper at six- and seven-vacancy clusters. The structure of six- and seven-vacancy clusters in copper fluctuates between a stacking fault tetrahedron and a void. When a hydrogen atom is trapped at a six- or seven-vacancy void, the void can keep its structure for an appreciably long time; that is, the void does not relax to a stacking fault tetrahedron and grows into a large void. In order to explore the detailed atomistics of void formation, it is emphasized that dual-beam irradiation experiments that utilize beams of gas atoms and self-ions should be carried out with residual-gas-atom-free metal specimens. (author)

  2. Computing activities for the P-bar ANDA experiment at FAIR

    International Nuclear Information System (INIS)

    Messchendorp, Johan

    2010-01-01

    The P-bar ANDA experiment at the future facility FAIR will provide valuable data for our present understanding of the strong interaction. In preparation for the experiments, large-scale simulations for design and feasibility studies are performed exploiting a new software framework, P-bar ANDAROOT, which is based on FairROOT and the Virtual Monte Carlo interface, and which runs in a large-scale GRID computing environment exploiting the AliEn 2 middleware. In this paper, an overview is given of the P-bar ANDA experiment with the emphasis on the various developments which are pursued to provide a user- and developer-friendly computing environment for the P-bar ANDA collaboration.

  3. HTTR plant dynamic simulation using a hybrid computer

    International Nuclear Information System (INIS)

    Shimazaki, Junya; Suzuki, Katsuo; Nabeshima, Kunihiko; Watanabe, Koichi; Shinohara, Yoshikuni; Nakagawa, Shigeaki.

    1990-01-01

    A plant dynamic simulation of the High-Temperature Engineering Test Reactor has been made using a new-type hybrid computer. This report describes a dynamic simulation model of the HTTR, a hybrid simulation method for SIMSTAR and some results obtained from the dynamics analysis of the HTTR simulation. It concludes that the hybrid plant simulation is useful for on-line simulation on account of its capability of computation at high speed, compared with that of an all-digital computer simulation. With sufficient accuracy, computation 40 times faster than real time was reached simply by changing the analog time scale for the HTTR simulation. (author)

  4. GPU-accelerated micromagnetic simulations using cloud computing

    Energy Technology Data Exchange (ETDEWEB)

    Jermain, C.L., E-mail: clj72@cornell.edu [Cornell University, Ithaca, NY 14853 (United States); Rowlands, G.E.; Buhrman, R.A. [Cornell University, Ithaca, NY 14853 (United States); Ralph, D.C. [Cornell University, Ithaca, NY 14853 (United States); Kavli Institute at Cornell, Ithaca, NY 14853 (United States)

    2016-03-01

    Highly parallel graphics processing units (GPUs) can improve the speed of micromagnetic simulations significantly as compared to conventional computing using central processing units (CPUs). We present a strategy for performing GPU-accelerated micromagnetic simulations by utilizing cost-effective GPU access offered by cloud computing services with an open-source Python-based program for running the MuMax3 micromagnetics code remotely. We analyze the scaling and cost benefits of using cloud computing for micromagnetics. - Highlights: • The benefits of cloud computing for GPU-accelerated micromagnetics are examined. • We present the MuCloud software for running simulations on cloud computing. • Simulation run times are measured to benchmark cloud computing performance. • Comparison benchmarks are analyzed between CPU and GPU based solvers.

  5. GPU-accelerated micromagnetic simulations using cloud computing

    International Nuclear Information System (INIS)

    Jermain, C.L.; Rowlands, G.E.; Buhrman, R.A.; Ralph, D.C.

    2016-01-01

    Highly parallel graphics processing units (GPUs) can improve the speed of micromagnetic simulations significantly as compared to conventional computing using central processing units (CPUs). We present a strategy for performing GPU-accelerated micromagnetic simulations by utilizing cost-effective GPU access offered by cloud computing services with an open-source Python-based program for running the MuMax3 micromagnetics code remotely. We analyze the scaling and cost benefits of using cloud computing for micromagnetics. - Highlights: • The benefits of cloud computing for GPU-accelerated micromagnetics are examined. • We present the MuCloud software for running simulations on cloud computing. • Simulation run times are measured to benchmark cloud computing performance. • Comparison benchmarks are analyzed between CPU and GPU based solvers.

  6. Simulation - modeling - experiment

    International Nuclear Information System (INIS)

    2004-01-01

    After two workshops held in 2001 on the same topics, and in order to take stock of the advances in the domain of simulation and measurements, the main goals proposed for this workshop are: the presentation of the state-of-the-art of tools, methods and experiments in the domains of interest of the Gedepeon research group, and the exchange of information about the possibilities of using computer codes and facilities, about the understanding of physical and chemical phenomena, and about development and experiment needs. This document gathers 18 of the 19 presentations (slides) given at this workshop, dealing with: the deterministic and stochastic codes in reactor physics (Rimpault G.); MURE: an evolution code coupled with MCNP (Meplan O.); neutronic calculation of future reactors at EdF (Lecarpentier D.); progress status of the MCNP/TRIO-U neutronics/thermal-hydraulics coupling (Nuttin A.); the FLICA4/TRIPOLI4 thermal-hydraulics/neutronics coupling (Aniel S.); perturbation methods and sensitivity analysis of nuclear data in reactor physics, application to the VENUS-2 experimental reactor (Bidaud A.); modeling for the reliability improvement of an ADS accelerator (Biarotte J.L.); residual gas compensation of the space charge of intense beams (Ben Ismail A.); experimental determination and numerical modeling of phase equilibrium diagrams of interest in nuclear applications (Gachon J.C.); modeling of irradiation effects (Barbu A.); elastic limit and irradiation damage in Fe-Cr alloys: simulation and experiment (Pontikis V.); experimental measurements of spallation residues, comparison with Monte-Carlo simulation codes (Fallot M.); the spallation target-reactor coupling (Rimpault G.); tools and data (Grouiller J.P.); models in high energy transport codes: status and perspective (Leray S.); other ways of investigation for spallation (Audoin L.); neutrons and light particles production at intermediate energies (20-200 MeV) with iron, lead and uranium targets (Le Colley F

  7. Computer Simulation Western

    International Nuclear Information System (INIS)

    Rasmussen, H.

    1992-01-01

    Computer Simulation Western is a unit within the Department of Applied Mathematics at the University of Western Ontario. Its purpose is the development of computational and mathematical methods for practical problems in industry and engineering and the application and marketing of such methods. We describe the unit and our efforts at obtaining research and development grants. Some representative projects will be presented and future plans discussed. (author)

  8. Simultaneous epicardial and noncontact endocardial mapping of the canine right atrium: simulation and experiment.

    Science.gov (United States)

    Sabouri, Sepideh; Matene, Elhacene; Vinet, Alain; Richer, Louis-Philippe; Cardinal, René; Armour, J Andrew; Pagé, Pierre; Kus, Teresa; Jacquemet, Vincent

    2014-01-01

    Epicardial high-density electrical mapping is a well-established experimental instrument to monitor in vivo the activity of the atria in response to modulations of the autonomic nervous system in sinus rhythm. In regions that are not accessible by epicardial mapping, noncontact endocardial mapping performed through a balloon catheter may provide a more comprehensive description of atrial activity. We developed a computer model of the canine right atrium to compare epicardial and noncontact endocardial mapping. The model was derived from an experiment in which electroanatomical reconstruction, epicardial mapping (103 electrodes), noncontact endocardial mapping (2048 virtual electrodes computed from a 64-channel balloon catheter), and direct-contact endocardial catheter recordings were simultaneously performed in a dog. The recording system was simulated in the computer model. For simulations and experiments (after atrio-ventricular node suppression), activation maps were computed during sinus rhythm. Repolarization was assessed by measuring the area under the atrial T wave (ATa), a marker of repolarization gradients. Results showed epicardial-endocardial correlation coefficients of 0.80 and 0.63 (two dog experiments) and 0.96 (simulation) between activation times, and correlation coefficients of 0.57 and 0.46 (two dog experiments) and 0.92 (simulation) between ATa values. Despite distance (balloon-atrial wall) and dimension reduction (64 electrodes), some information about atrial repolarization remained present in noncontact signals.
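
    As a small illustration of the repolarization marker used in these studies, the sketch below computes an area under a synthetic atrial T wave by trapezoidal integration over an assumed Ta-wave window; in a real analysis the signal would come from an electrogram and the window would be detected from the recording.

      # Area under the atrial T wave (ATa) by trapezoidal integration.
      # Signal and integration window are synthetic placeholders.
      import numpy as np

      fs = 1000.0                           # sampling rate [Hz]
      t = np.arange(0, 0.4, 1 / fs)         # 400 ms trace
      signal = -0.2 * np.exp(-((t - 0.25) / 0.04) ** 2)  # synthetic Ta wave [mV]

      win = (t >= 0.18) & (t <= 0.32)       # assumed Ta-wave window
      ata = np.trapz(signal[win], t[win])   # area under the curve [mV*s]
      print(f"ATa = {ata * 1000:.2f} mV*ms")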

  9. Simultaneous epicardial and noncontact endocardial mapping of the canine right atrium: simulation and experiment.

    Directory of Open Access Journals (Sweden)

    Sepideh Sabouri

    Epicardial high-density electrical mapping is a well-established experimental instrument to monitor in vivo the activity of the atria in response to modulations of the autonomic nervous system in sinus rhythm. In regions that are not accessible by epicardial mapping, noncontact endocardial mapping performed through a balloon catheter may provide a more comprehensive description of atrial activity. We developed a computer model of the canine right atrium to compare epicardial and noncontact endocardial mapping. The model was derived from an experiment in which electroanatomical reconstruction, epicardial mapping (103 electrodes), noncontact endocardial mapping (2048 virtual electrodes computed from a 64-channel balloon catheter), and direct-contact endocardial catheter recordings were simultaneously performed in a dog. The recording system was simulated in the computer model. For simulations and experiments (after atrio-ventricular node suppression), activation maps were computed during sinus rhythm. Repolarization was assessed by measuring the area under the atrial T wave (ATa), a marker of repolarization gradients. Results showed epicardial-endocardial correlation coefficients of 0.80 and 0.63 (two dog experiments) and 0.96 (simulation) between activation times, and correlation coefficients of 0.57 and 0.46 (two dog experiments) and 0.92 (simulation) between ATa values. Despite distance (balloon-atrial wall) and dimension reduction (64 electrodes), some information about atrial repolarization remained present in noncontact signals.

  10. Simultaneous Epicardial and Noncontact Endocardial Mapping of the Canine Right Atrium: Simulation and Experiment

    Science.gov (United States)

    Sabouri, Sepideh; Matene, Elhacene; Vinet, Alain; Richer, Louis-Philippe; Cardinal, René; Armour, J. Andrew; Pagé, Pierre; Kus, Teresa; Jacquemet, Vincent

    2014-01-01

    Epicardial high-density electrical mapping is a well-established experimental instrument to monitor in vivo the activity of the atria in response to modulations of the autonomic nervous system in sinus rhythm. In regions that are not accessible by epicardial mapping, noncontact endocardial mapping performed through a balloon catheter may provide a more comprehensive description of atrial activity. We developed a computer model of the canine right atrium to compare epicardial and noncontact endocardial mapping. The model was derived from an experiment in which electroanatomical reconstruction, epicardial mapping (103 electrodes), noncontact endocardial mapping (2048 virtual electrodes computed from a 64-channel balloon catheter), and direct-contact endocardial catheter recordings were simultaneously performed in a dog. The recording system was simulated in the computer model. For simulations and experiments (after atrio-ventricular node suppression), activation maps were computed during sinus rhythm. Repolarization was assessed by measuring the area under the atrial T wave (ATa), a marker of repolarization gradients. Results showed epicardial-endocardial correlation coefficients of 0.80 and 0.63 (two dog experiments) and 0.96 (simulation) between activation times, and correlation coefficients of 0.57 and 0.46 (two dog experiments) and 0.92 (simulation) between ATa values. Despite distance (balloon-atrial wall) and dimension reduction (64 electrodes), some information about atrial repolarization remained present in noncontact signals. PMID:24598778

  11. General-purpose parallel simulator for quantum computing

    International Nuclear Information System (INIS)

    Niwa, Jumpei; Matsumoto, Keiji; Imai, Hiroshi

    2002-01-01

    With current technologies, it seems to be very difficult to implement quantum computers with many qubits. It is therefore of importance to simulate quantum algorithms and circuits on the existing computers. However, for a large-size problem, the simulation often requires more computational power than is available from sequential processing. Therefore, simulation methods for parallel processors are required. We have developed a general-purpose simulator for quantum algorithms/circuits on the parallel computer (Sun Enterprise4500). It can simulate algorithms/circuits with up to 30 qubits. In order to test the efficiency of our proposed methods, we have simulated Shor's factorization algorithm and Grover's database search, and we have analyzed the robustness of the corresponding quantum circuits in the presence of both decoherence and operational errors. The corresponding results, statistics, and analyses are presented in this paper.
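
    A minimal state-vector simulator of the kind described above can be written in a few lines: the 2^n amplitudes are reshaped so that a single-qubit gate becomes a tensor contraction, which is also the natural unit of work to distribute across parallel processors. The sketch below (plain NumPy, no parallelism) prepares a uniform superposition on three qubits.

      # Minimal state-vector quantum circuit simulator.
      import numpy as np

      H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
      X = np.array([[0, 1], [1, 0]])                  # NOT gate

      def apply_gate(state, gate, target, n):
          """Apply a single-qubit gate to qubit 'target' of an n-qubit state."""
          psi = state.reshape([2] * n)
          psi = np.tensordot(gate, psi, axes=([1], [target]))
          psi = np.moveaxis(psi, 0, target)   # restore qubit ordering
          return psi.reshape(-1)

      n = 3
      state = np.zeros(2 ** n, dtype=complex)
      state[0] = 1.0                          # |000>
      for q in range(n):
          state = apply_gate(state, H, q, n)  # uniform superposition
      print(np.round(np.abs(state) ** 2, 3))  # all 8 outcomes with prob 1/8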

  12. Comparison of GPU-Based Numerous Particles Simulation and Experiment

    International Nuclear Information System (INIS)

    Park, Sang Wook; Jun, Chul Woong; Sohn, Jeong Hyun; Lee, Jae Wook

    2014-01-01

    The dynamic behavior of numerous grains interacting with each other can be easily observed. In this study, this dynamic behavior was analyzed based on the contact between numerous grains. The discrete element method was used for analyzing the dynamic behavior of each particle and the neighboring-cell algorithm was employed for detecting their contact. The Hertzian and tangential sliding friction contact models were used for calculating the contact force acting between the particles. A GPU-based parallel program was developed for conducting the computer simulation and calculating the numerous contacts. The dam break experiment was performed to verify the simulation results. The reliability of the program was verified by comparing the results of the simulation with those of the experiment
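
    The neighboring-cell contact search described above can be sketched as follows: particles are bucketed into cells no smaller than the contact distance, so each particle only tests candidates in its own and adjacent cells. This serial Python version mirrors the logic (the GPU code in the paper performs the same bucketing in parallel); all sizes below are arbitrary.

      # Cell-list (neighboring-cell) contact detection, 2-D serial sketch.
      import numpy as np
      from collections import defaultdict

      rng = np.random.default_rng(4)
      pos = rng.uniform(0, 10.0, size=(500, 2))   # particle centers
      radius = 0.15                               # particle radius
      cell = 2 * radius                           # cell edge >= contact distance

      # bucket particles by integer cell coordinates
      buckets = defaultdict(list)
      for i, p in enumerate(pos):
          buckets[tuple((p // cell).astype(int))].append(i)

      contacts = []
      for (cx, cy), members in buckets.items():
          # candidates: own cell plus the 8 surrounding cells
          cand = [j for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                  for j in buckets.get((cx + dx, cy + dy), [])]
          for i in members:
              for j in cand:
                  if j > i and np.linalg.norm(pos[i] - pos[j]) < 2 * radius:
                      contacts.append((i, j))

      print(f"{len(contacts)} contacts found among {len(pos)} particles")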

  13. Pore-scale and Continuum Simulations of Solute Transport Micromodel Benchmark Experiments

    Energy Technology Data Exchange (ETDEWEB)

    Oostrom, Martinus; Mehmani, Yashar; Romero Gomez, Pedro DJ; Tang, Y.; Liu, H.; Yoon, Hongkyu; Kang, Qinjun; Joekar Niasar, Vahid; Balhoff, Matthew; Dewers, T.; Tartakovsky, Guzel D.; Leist, Emily AE; Hess, Nancy J.; Perkins, William A.; Rakowski, Cynthia L.; Richmond, Marshall C.; Serkowski, John A.; Werth, Charles J.; Valocchi, Albert J.; Wietsma, Thomas W.; Zhang, Changyong

    2016-08-01

    Four sets of micromodel nonreactive solute transport experiments were conducted with flow velocity, grain diameter, pore-aspect ratio, and flow-focusing heterogeneity as the variables. The data sets were offered to pore-scale modeling groups to test their simulators. Each set consisted of two learning experiments, for which all results were made available, and a challenge experiment, for which only the experimental description and base input parameters were provided. The experimental results showed a nonlinear dependence of the dispersion coefficient on the Peclet number, a negligible effect of the pore-aspect ratio on transverse mixing, and considerably enhanced mixing due to flow focusing. Five pore-scale models and one continuum-scale model were used to simulate the experiments. Of the pore-scale models, two used a pore-network (PN) method, two others were based on a lattice-Boltzmann (LB) approach, and one employed a computational fluid dynamics (CFD) technique. The learning experiments were used by the PN models to modify the standard perfect-mixing approach in pore bodies into approaches that simulate the observed incomplete mixing. The LB and CFD models used these experiments to appropriately discretize the grid representations. The continuum model used published non-linear relations between transverse dispersion coefficients and Peclet numbers to compute the required dispersivity input values. Comparisons between experimental and numerical results for the four challenge experiments show that all pore-scale models were able to satisfactorily simulate the experiments. The continuum model underestimated the required dispersivity values, resulting in less dispersion. The PN models were able to complete the simulations in a few minutes, whereas the direct models needed up to several days on supercomputers to resolve the more complex problems.

  14. Separation of electron ion ring components (computational simulation and experimental results)

    International Nuclear Information System (INIS)

    Aleksandrov, V.S.; Dolbilov, G.V.; Kazarinov, N.Yu.; Mironov, V.I.; Novikov, V.G.; Perel'shtejn, Eh.A.; Sarantsev, V.P.; Shevtsov, V.F.

    1978-01-01

    The problems of the achievable polarization value of electron-ion rings in the regime of acceleration and of the separation of the ring components at the final stage of acceleration are studied. The results of computational simulation using the macroparticle method and of experiments on ring acceleration and separation are given. A comparison of the calculation results with experiment is presented.

  15. Advanced computers and simulation

    International Nuclear Information System (INIS)

    Ryne, R.D.

    1993-01-01

    Accelerator physicists today have access to computers that are far more powerful than those available just 10 years ago. In the early 1980s, desktop workstations performed less than one million floating point operations per second (Mflops), and the realized performance of vector supercomputers was at best a few hundred Mflops. Today vector processing is available on the desktop, providing researchers with performance approaching 100 Mflops at a price that is measured in thousands of dollars. Furthermore, advances in Massively Parallel Processors (MPP) have made performance of over 10 gigaflops a reality, and around mid-decade MPPs are expected to be capable of teraflops performance. Along with advances in MPP hardware, researchers have also made significant progress in developing algorithms and software for MPPs. These changes have had, and will continue to have, a significant impact on the work of computational accelerator physicists. Now, instead of running particle simulations with just a few thousand particles, we can perform desktop simulations with tens of thousands of simulation particles, and calculations with well over 1 million particles are being performed on MPPs. In the area of computational electromagnetics, simulations that used to be performed only on vector supercomputers now run in several hours on desktop workstations, and researchers are hoping to perform simulations with over one billion mesh points on future MPPs. In this paper we will discuss the latest advances, and what can be expected in the near future, in hardware, software and applications codes for advanced simulation of particle accelerators.

  16. Ideas in Practice (3): A Simulated Laboratory Experience in Digital Design.

    Science.gov (United States)

    Cleaver, Thomas G.

    1988-01-01

    Gives an example of the use of a simplified logic simulator in a logic design course. Discusses some problems in logic design classes, commercially available software, and software problems. Describes computer-aided engineering (CAE) software. Lists 14 experiments in the simulated laboratory and presents students' evaluation of the course. (YP)

  17. Numerical simulation of hypersonic flight experiment vehicle

    OpenAIRE

    Yamamoto, Yukimitsu; Yoshioka, Minako; 山本 行光; 吉岡 美菜子

    1994-01-01

    Hypersonic aerodynamic characteristics of the Hypersonic FLight EXperiment (HYFLEX) vehicle were investigated by numerical simulations using the Navier-Stokes CFD (Computational Fluid Dynamics) code of NAL. Numerical results were compared with experimental data obtained at the Hypersonic Wind Tunnel at NAL. In order to investigate real-flight aerodynamic characteristics, numerical calculations corresponding to the flight conditions with maximum aerothermodynamic heating were also made and the d...

  18. New method of processing heat treatment experiments with numerical simulation support

    Science.gov (United States)

    Kik, T.; Moravec, J.; Novakova, I.

    2017-08-01

    In this work, the benefits of combining modern software for numerical simulation of welding processes with laboratory research are described. A new method of processing heat treatment experiments is proposed, leading to relevant input data for numerical simulations of the heat treatment of large parts. It is now possible, by using experiments on small test samples, to simulate cooling conditions comparable with the cooling of bigger parts. Results from this method of testing make the boundary conditions of the real cooling process more accurate, and can also be used to improve software databases and optimize computational models. The aim is to make the computation of temperature fields for large hardened parts more precise, based on a new method for determining the temperature dependence of the heat transfer coefficient into the hardening medium for the particular material, a defined maximum thickness of the processed part, and the cooling conditions. In the paper we also present a comparison of standard and modified (according to the newly suggested methodology) heat transfer coefficient data and their influence on the simulation results. It shows how even small changes influence mainly the distributions of temperature, metallurgical phases, hardness, and stresses. With this experiment it is also possible to obtain not only input data and data enabling optimization of the computational model, but at the same time verification data. The greatest advantage of the described method is its independence of the type of cooling medium used.
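    A minimal sketch of the inverse step such a method rests on, assuming a lumped-capacitance test sample (uniform internal temperature) so that m*c_p*dT/dt = -h(T)*A*(T - T_quench); the cooling curve and material values below are illustrative assumptions, not the paper's data.

        import numpy as np

        t = np.linspace(0.0, 60.0, 61)                  # time [s]
        T_q = 40.0                                      # quenchant temperature [C]
        T = T_q + (850.0 - T_q) * np.exp(-t / 12.0)     # synthetic measured cooling curve [C]

        m, c_p, A = 0.05, 460.0, 0.003                  # sample mass [kg], c_p [J/kg.K], surface [m^2]

        dTdt = np.gradient(T, t)                        # numerical time derivative
        h = -m * c_p * dTdt / (A * (T - T_q))           # recovered h(T) [W/m^2.K]
        for Ti, hi in zip(T[::15], h[::15]):
            print(f"T = {Ti:7.1f} C  ->  h = {hi:6.1f} W/m^2K")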

  19. Large Scale Beam-beam Simulations for the CERN LHC using Distributed Computing

    CERN Document Server

    Herr, Werner; McIntosh, E; Schmidt, F

    2006-01-01

    We report on a large-scale simulation of beam-beam effects for the CERN Large Hadron Collider (LHC). The stability of particles which experience head-on and long-range beam-beam effects was investigated for different optical configurations and machine imperfections. Covering the interesting parameter space required computing resources not available at CERN. The necessary resources were available in the LHC@home project, based on the BOINC platform. At present, this project makes more than 60000 hosts available for distributed computing. We shall discuss our experience using this system during a simulation campaign of more than six months and describe the tools and procedures necessary to ensure consistent results. The results from this extended study are presented and future plans are discussed.

  20. A model ecosystem experiment and its computational simulation studies

    International Nuclear Information System (INIS)

    Doi, M.

    2002-01-01

    A simplified microbial model ecosystem and its computer simulation model are introduced as eco-toxicity tests for the assessment of environmental responses to the effects of environmental impacts. To take the effects on the interactions between species and environment into account, one option is to select a keystone species on the basis of ecological knowledge and to put it in a single-species toxicity test. Another option proposed is to frame the eco-toxicity tests as an experimental micro-ecosystem study and a theoretical model ecosystem analysis. With these tests, the stressors which are more harmful to ecosystems should be replaced with less harmful ones on the basis of unified measures. Management of radioactive materials, chemicals, hyper-eutrophication, and other artificial disturbances of the ecosystem should be discussed consistently from the unified viewpoint of environmental protection. (N.C.)

  1. Mathematical and computational modeling and simulation fundamentals and case studies

    CERN Document Server

    Moeller, Dietmar P F

    2004-01-01

    Mathematical and Computational Modeling and Simulation - a highly multi-disciplinary field with ubiquitous applications in science and engineering - is one of the key enabling technologies of the 21st century. This book introduces the use of Mathematical and Computational Modeling and Simulation in order to develop an understanding of the solution characteristics of a broad class of real-world problems. The relevant basic and advanced methodologies are explained in detail, with special emphasis on ill-defined problems. Some 15 simulation systems are presented on the language and the logical level. Moreover, the reader can accumulate experience by studying a wide variety of case studies. The latter are briefly described within the book but their full versions as well as some simulation software demos are available on the Web. The book can be used for university courses at different levels as well as for self-study. Advanced sections are marked and can be skipped in a first reading or in undergraduate courses...

  2. Computer simulations of collisionless shock waves

    International Nuclear Information System (INIS)

    Leroy, M.M.

    1984-01-01

    A review of the contributions of particle computer simulations to the understanding of the physics of magnetic shock waves in collisionless plasmas is presented. The emphasis is on the relation between the computer simulation results, spacecraft observations of shocks in space, and related theories, rather than on technical aspects of the numerics. It is shown that much has been learned from the comparison of ISEE spacecraft observations of the terrestrial bow shock and particle computer simulations concerning the quasi-perpendicular, supercritical shock (ion scale structure, ion reflection mechanism and ultimate dissipation processes). Particle computer simulations have also had an appreciable prospective role in the investigation of the physics of quasi-parallel shocks, about which still little is known observationally. Moreover, these numerical techniques have helped to clarify the process of suprathermal ion rejection by the shock into the foreshock, and the subsequent evolution of the ions in the foreshock. 95 references

  3. The TESS [Tandem Experiment Simulation Studies] computer code user's manual

    International Nuclear Information System (INIS)

    Procassini, R.J.

    1990-01-01

    TESS (Tandem Experiment Simulation Studies) is a one-dimensional, bounded particle-in-cell (PIC) simulation code designed to investigate the confinement and transport of plasma in a magnetic mirror device, including tandem mirror configurations. Mirror plasmas may be modeled in a system which includes an applied magnetic field and/or a self-consistent or applied electrostatic potential. The PIC code TESS is similar to the PIC code DIPSI (Direct Implicit Plasma Surface Interactions) which is designed to study plasma transport to and interaction with a solid surface. The codes TESS and DIPSI are direct descendants of the PIC code ES1 that was created by A. B. Langdon. This document provides the user with a brief description of the methods used in the code and a tutorial on the use of the code. 10 refs., 2 tabs

  4. Interactive simulation of nuclear power systems using a dedicated minicomputer - computer graphics facility

    International Nuclear Information System (INIS)

    Tye, C.; Sezgen, A.O.

    1980-01-01

    The design of control systems and operational procedures for large-scale nuclear power plant poses a difficult optimization problem requiring a great deal of computational effort. Plant dynamic simulation using digital minicomputers offers the prospect of relatively low-cost computing and, when combined with graphical input/output, provides a powerful tool for studying such problems. The paper discusses the results obtained from a simulation study carried out at the Computer Graphics Unit of the University of Manchester using a typical station control model for an Advanced Gas-Cooled reactor. Particular emphasis is placed on the use of computer graphics for information display, parameter and control system optimization, and techniques for using graphical input for defining and/or modifying the control system topology. Experience gained from this study has shown that a relatively modest minicomputer system can be used for simulating large-scale dynamic systems and that highly interactive computer graphics can be used to advantage to relieve the designer of many of the tedious aspects of simulation, leaving him free to concentrate on the more creative aspects of his work. (author)

  5. Computer algebra simulation - what can it do?; Was leistet Computer-Algebra-Simulation?

    Energy Technology Data Exchange (ETDEWEB)

    Braun, S. [Visual Analysis AG, Muenchen (Germany)

    2001-07-01

    Shortened development times require new and improved calculation methods. Numeric methods have long become state of the art. However, although numeric simulations provide a better understanding of process parameters, they do not give a fast overview of the interdependences between parameters. Numeric simulations are effective only if all physical parameters are sufficiently known; otherwise, the efficiency will decrease due to the large number of variant calculations required. Computer algebra simulation closes this gap and provides a deeper understanding of the physical fundamentals of technical processes. [Translated from German] New and improved calculation methods are necessary to enable the continual shortening of development cycles. Conventional methods based on a purely numerical approach have long since become the standard in many fields of application. However, not only the ever-shorter development cycles but also the continually growing complexity make it necessary to gain a better understanding of the process parameters involved. Numerical simulation excels at detailed solutions, even for complex structures and processes, but it provides no quick estimate of the interdependences between the individual parameters. Numerical simulation is effective only if all physical parameters are sufficiently known; otherwise its efficiency drops sharply owing to the large number of variant calculations required. Computer algebra simulation closes this gap by making it possible to gain deeper insight into the physical workings of technical processes. (orig.)
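    As a sketch of what a computer algebra system offers here (our example, not taken from the paper): solving the characteristic equation of a damped oscillator symbolically exposes how the roots depend jointly on the damping ratio and natural frequency, which a single numeric run cannot show.

        import sympy as sp

        # Characteristic equation of a damped oscillator: r^2 + 2*zeta*omega*r + omega^2 = 0.
        r, zeta, omega = sp.symbols("r zeta omega")
        roots = sp.solve(sp.Eq(r**2 + 2*zeta*omega*r + omega**2, 0), r)
        print(roots)   # closed-form roots in terms of zeta and omega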

  6. Framework for utilizing computational devices within simulation

    Directory of Open Access Journals (Sweden)

    Miroslav Mintál

    2013-12-01

    Full Text Available Nowadays there exist several frameworks for utilizing the computational power of graphics cards and other computational devices such as FPGAs, ARM and multi-core processors. The best-known ones are either low-level and need a lot of controlling code, or are bound to specific graphics cards only. Furthermore, more specialized frameworks exist, mainly aimed at the mathematical field. The framework described here is tailored to use in multi-agent simulations, where it provides an option to accelerate computations when preparing a simulation and, mainly, to accelerate the computation of the simulation itself.

  7. MOLNs: A CLOUD PLATFORM FOR INTERACTIVE, REPRODUCIBLE, AND SCALABLE SPATIAL STOCHASTIC COMPUTATIONAL EXPERIMENTS IN SYSTEMS BIOLOGY USING PyURDME.

    Science.gov (United States)

    Drawert, Brian; Trogdon, Michael; Toor, Salman; Petzold, Linda; Hellander, Andreas

    2016-01-01

    Computational experiments using spatial stochastic simulations have led to important new biological insights, but they require specialized tools and a complex software stack, as well as large and scalable compute and data analysis resources due to the large computational cost associated with Monte Carlo computational workflows. The complexity of setting up and managing a large-scale distributed computation environment to support productive and reproducible modeling can be prohibitive for practitioners in systems biology. This results in a barrier to the adoption of spatial stochastic simulation tools, effectively limiting the type of biological questions addressed by quantitative modeling. In this paper, we present PyURDME, a new, user-friendly spatial modeling and simulation package, and MOLNs, a cloud computing appliance for distributed simulation of stochastic reaction-diffusion models. MOLNs is based on IPython and provides an interactive programming platform for development of sharable and reproducible distributed parallel computational experiments.
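    The Monte Carlo workflows such a platform parallelizes are built from many independent realizations of a stochastic simulation algorithm. The sketch below is a generic, well-mixed (non-spatial) Gillespie simulation of a decay reaction A -> B, purely as an illustration; it does not use the PyURDME or MOLNs APIs.

        import math, random

        def gillespie(n_a=100, k=0.1, t_end=50.0, seed=1):
            rng = random.Random(seed)
            t, trace = 0.0, [(0.0, n_a)]
            while n_a > 0:
                a0 = k * n_a                        # total propensity of A -> B
                t += -math.log(rng.random()) / a0   # exponential waiting time
                if t > t_end:
                    break
                n_a -= 1                            # the single reaction fires
                trace.append((t, n_a))
            return trace

        print(gillespie()[:5])   # first few (time, copies of A) points of one realization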

  8. Evolution and experience with the ATLAS Simulation at Point1 Project

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00389536; The ATLAS collaboration; Brasolin, Franco; Kouba, Tomas; Schovancova, Jaroslava; Fazio, Daniel; Di Girolamo, Alessandro; Scannicchio, Diana; Twomey, Matthew Shaun; Wang, Fuquan; Zaytsev, Alexander; Lee, Christopher

    2017-01-01

    The Simulation at Point1 project is successfully running standard ATLAS simulation jobs on the TDAQ HLT resources. The pool of available resources changes dynamically, therefore we need to be very effective in exploiting the available computing cycles. We present our experience with using the Event Service that provides the event-level granularity of computations. We show the design decisions and overhead time related to the usage of the Event Service. The improved utilization of the resources is also presented with the recent development in monitoring, automatic alerting, deployment and GUI.

  9. Evolution and experience with the ATLAS simulation at Point1 project

    CERN Document Server

    Ballestrero, Sergio; The ATLAS collaboration; Fazio, Daniel; Di Girolamo, Alessandro; Kouba, Tomas; Lee, Christopher; Scannicchio, Diana; Schovancova, Jaroslava; Twomey, Matthew Shaun; Wang, Fuquan; Zaytsev, Alexander

    2016-01-01

    The Simulation at Point1 project is successfully running traditional ATLAS simulation jobs on the TDAQ HLT resources. The pool of available resources changes dynamically, therefore we need to be very effective in exploiting the available computing cycles. We will present our experience with using the Event Service that provides the event-level granularity of computations. We will show the design decisions and overhead time related to the usage of the Event Service. The improved utilization of the resources will also be presented with the recent development in monitoring, automatic alerting, deployment and GUI.

  10. Ion bombardment induced smoothing of amorphous metallic surfaces: Experiments versus computer simulations

    International Nuclear Information System (INIS)

    Vauth, Sebastian; Mayr, S. G.

    2008-01-01

    Smoothing of rough amorphous metallic surfaces by bombardment with heavy ions in the low keV regime is investigated by a combined experimental-simulational study. Vapor deposited rough amorphous Zr65Al7.5Cu27.5 films are the basis for systematic in situ scanning tunneling microscopy measurements on the smoothing reaction due to 3 keV Kr+ ion bombardment. The experimental results are directly compared to the predictions of a multiscale simulation approach, which incorporates stochastic rate equations of the Langevin type in combination with previously reported classical molecular dynamics simulations [Phys. Rev. B 75, 224107 (2007)] to model surface smoothing across length and time scales. The combined approach of experiments and simulations clearly corroborates a key role of ion induced viscous flow and ballistic effects in low keV heavy ion induced smoothing of amorphous metallic surfaces at ambient temperatures.
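    A minimal sketch of a Langevin-type stochastic rate equation for surface height relaxation (an Edwards-Wilkinson-style smoothing term plus noise), in the spirit of the multiscale approach described above; the 1D periodic geometry and all parameter values are our assumptions.

        import numpy as np

        rng = np.random.default_rng(0)
        L, dx, dt, nu, D = 256, 1.0, 0.05, 1.0, 0.1
        h = rng.normal(0.0, 1.0, L)                  # initially rough surface profile

        for step in range(20000):
            # Periodic discrete Laplacian drives smoothing; noise keeps a finite roughness.
            lap = (np.roll(h, 1) - 2*h + np.roll(h, -1)) / dx**2
            h += dt * nu * lap + np.sqrt(2*D*dt/dx) * rng.normal(0.0, 1.0, L)

        print("rms roughness:", h.std())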

  11. SIMULATED ANIMAL EXPERIMENTS IN TEACHING AND RESEARCH

    Directory of Open Access Journals (Sweden)

    Chirag B. Mistry, Shreya M. Shah, Jagatkumar D. Bhatt

    2015-07-01

    Full Text Available Animal experiments are of paramount importance in the pre-clinical screening of new chemical entities. On the other hand, various regulatory guidelines for animal experiments are becoming more stringent in the face of worldwide protests by animal rights activists. Moreover, software for simulated animal experiments is being developed, and it can be implemented in the postgraduate and graduate students' curriculum for demonstrating standard physiological and pharmacological principles in place of real-time animal experiments. The implementation of virtual experiments will admittedly decrease students' hands-on experience with animal experiments, but after medical graduation animal experiments are rarely used in day-to-day clinical practice. Similarly, in the postgraduate pharmacology curriculum, computer-based virtual animal experiments can facilitate teaching and learning in a short span of time with various protocols, without sacrificing any animal for already-established experimental outcomes.

  12. What do we want from computer simulation of SIMS using clusters?

    International Nuclear Information System (INIS)

    Webb, R.P.

    2008-01-01

    Computer simulation of energetic cluster interactions with surfaces has provided much-needed insight into some of the complex processes that occur and are responsible for the desirable as well as undesirable effects that make the use of clusters in SIMS both useful and challenging. Simulations have shown how cluster impacts can cause meso-scale motion of the target material, which can result in the relatively gentle uplift of large intact molecules adsorbed on the surface; in contrast, single-atom impacts tend to create discrete motion in the surface, often ejecting fragments of adsorbed molecules instead. With the insight provided by simulations, experimentalists can then improve their equipment to best maximise the desired effects. The past 40 years have seen great progress in simulation techniques and computer equipment. Forty years ago, simulations were performed on simple atomic systems of around 300 atoms, employing only simple pair-wise interaction potentials, to times of several hundred femtoseconds. Currently, simulations can be performed on large organic materials, employing many-body potentials for millions of atoms for times of many picoseconds. These simulations, however, can take several months of computation time. Even with the degree of realism introduced by these long-time simulations, they are still not perfect and often cannot be used in a completely predictive way. Computer simulation is reaching a position whereby any further effort to increase its realism will make it completely intractable to solution in a reasonable time frame, and yet there is an increasing demand from experimentalists for something that can help in a predictive way in experiment design and interpretation. This paper will discuss the problems of computer simulation, what might be possible to achieve in the short term, what is unlikely ever to be possible without a major new breakthrough, and how we might exploit the meso-scale effects in

  13. Understanding Emergency Care Delivery Through Computer Simulation Modeling.

    Science.gov (United States)

    Laker, Lauren F; Torabi, Elham; France, Daniel J; Froehle, Craig M; Goldlust, Eric J; Hoot, Nathan R; Kasaie, Parastu; Lyons, Michael S; Barg-Walkow, Laura H; Ward, Michael J; Wears, Robert L

    2018-02-01

    In 2017, Academic Emergency Medicine convened a consensus conference entitled, "Catalyzing System Change through Health Care Simulation: Systems, Competency, and Outcomes." This article, a product of the breakout session on "understanding complex interactions through systems modeling," explores the role that computer simulation modeling can and should play in research and development of emergency care delivery systems. This article discusses areas central to the use of computer simulation modeling in emergency care research. The four central approaches to computer simulation modeling are described (Monte Carlo simulation, system dynamics modeling, discrete-event simulation, and agent-based simulation), along with problems amenable to their use and relevant examples to emergency care. Also discussed is an introduction to available software modeling platforms and how to explore their use for research, along with a research agenda for computer simulation modeling. Through this article, our goal is to enhance adoption of computer simulation, a set of methods that hold great promise in addressing emergency care organization and design challenges. © 2017 by the Society for Academic Emergency Medicine.
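    As a small illustration of one of the four approaches named above, the sketch below is a discrete-event simulation of a single-server treatment process modeled as an M/M/1 queue; the arrival and service rates are invented for demonstration.

        import heapq, random

        def simulate(lam=0.8, mu=1.0, horizon=10_000.0, seed=42):
            rng = random.Random(seed)
            events = [(rng.expovariate(lam), "arrival")]   # event queue ordered by time
            busy_until, waits = 0.0, []
            while events:
                t, kind = heapq.heappop(events)
                if t > horizon:
                    break
                # A patient starts treatment when both they have arrived
                # and the single server is free (Lindley recursion).
                start = max(t, busy_until)
                waits.append(start - t)
                busy_until = start + rng.expovariate(mu)
                heapq.heappush(events, (t + rng.expovariate(lam), "arrival"))
            return sum(waits) / len(waits)

        print("mean wait:", simulate())   # theory for M/M/1: rho/(mu-lam) = 4.0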

  14. When Feedback Harms and Collaboration Helps in Computer Simulation Environments: An Expertise Reversal Effect

    Science.gov (United States)

    Nihalani, Priya K.; Mayrath, Michael; Robinson, Daniel H.

    2011-01-01

    We investigated the effects of feedback and collaboration on undergraduates' transfer performance when using a computer networking training simulation. In Experiment 1, 65 computer science "novices" worked through an instructional protocol individually (control), individually with feedback, or collaboratively with feedback. Unexpectedly,…

  15. Computer simulations of compact toroid formation and acceleration

    International Nuclear Information System (INIS)

    Peterkin, R.E. Jr.; Sovinec, C.R.

    1990-01-01

    Experiments to form, accelerate, and focus compact toroid plasmas will be performed on the 9.4 MJ SHIVA STAR fast capacitor bank at the Air Force Weapons Laboratory during 1990. The MARAUDER (magnetically accelerated rings to achieve ultrahigh directed energy and radiation) program is a research effort to accelerate magnetized plasma rings with masses between 0.1 and 1.0 mg to velocities above 10^8 cm/s and energies above 1 MJ. Research on these high-velocity compact toroids may lead to the development of very fast opening switches, high-power microwave sources, and an alternative path to inertial confinement fusion. Design of a compact toroid accelerator experiment on the SHIVA STAR capacitor bank is underway, and computer simulations with the 2 1/2-dimensional magnetohydrodynamics code MACH2 have been performed to guide this endeavor. The compact toroids are produced in a magnetized coaxial plasma gun, and the acceleration will occur in a configuration similar to a coaxial railgun. Detailed calculations of the formation and equilibration of a low-beta magnetic force-free configuration (curl B = kB) have been performed with MACH2. In this paper, the authors discuss computer simulations of the focusing and acceleration of the toroid

  16. State of Theory and Computer Simulations of Radiation Effects in Ceramics

    International Nuclear Information System (INIS)

    Corrales, Louis R.; Weber, William J.

    2003-01-01

    This article presents opinions based on the presentations and discussions at a Workshop on Theory and Computer Simulations of Radiation Effects in Ceramics held in August 2002 at Pacific Northwest National Laboratory in Richland, WA, USA. The workshop was focused on the current state-of-the-art of theory, modeling and simulation of radiation effects in oxide ceramics, directions for future breakthroughs, and creating a close integration with experiment

  17. Pain Assessment and Management in Nursing Education Using Computer-based Simulations.

    Science.gov (United States)

    Romero-Hall, Enilda

    2015-08-01

    It is very important for nurses to have a clear understanding of the patient's pain experience and of management strategies. However, a review of the nursing literature shows that one of the main barriers to proper pain management practice is lack of knowledge. Nursing schools are in a unique position to address the gap in pain management knowledge by facilitating the acquisition and use of knowledge by the next generation of nurses. The purpose of this article is to discuss the role of computer-based simulations as a reliable educational technology strategy that can enhance the learning experience of nursing students acquiring pain management knowledge and practice. Computer-based simulations provide a significant number of learning affordances that can help change nursing students' attitudes and behaviors toward and practice of pain assessment and management. Copyright © 2015 American Society for Pain Management Nursing. Published by Elsevier Inc. All rights reserved.

  18. Teaching emergency medical services management skills using a computer simulation exercise.

    Science.gov (United States)

    Hubble, Michael W; Richards, Michael E; Wilfong, Denise

    2011-02-01

    Simulation exercises have long been used to teach management skills in business schools. However, this pedagogical approach has not been reported in emergency medical services (EMS) management education. We sought to develop, deploy, and evaluate a computerized simulation exercise for teaching EMS management skills. Using historical data, a computer simulation model of a regional EMS system was developed. After validation, the simulation was used in an EMS management course. Using historical operational and financial data of the EMS system under study, students designed an EMS system and prepared a budget based on their design. The design of each group was entered into the model that simulated the performance of the EMS system. Students were evaluated on operational and financial performance of their system design and budget accuracy and then surveyed about their experiences with the exercise. The model accurately simulated the performance of the real-world EMS system on which it was based. The exercise helped students identify operational inefficiencies in their system designs and highlighted budget inaccuracies. Most students rated the exercise as moderately or very realistic in ambulance deployment scheduling, budgeting, personnel cost calculations, demand forecasting, system design, and revenue projections. All students indicated the exercise was helpful in gaining a top management perspective, and 89% stated the exercise was helpful in bridging the gap between theory and reality. Preliminary experience with a computer simulator to teach EMS management skills was well received by students in a baccalaureate paramedic program and seems to be a valuable teaching tool. Copyright © 2011 Society for Simulation in Healthcare

  19. Numerical characteristics of quantum computer simulation

    Science.gov (United States)

    Chernyavskiy, A.; Khamitov, K.; Teplov, A.; Voevodin, V.; Voevodin, Vl.

    2016-12-01

    The simulation of quantum circuits is significantly important for the implementation of quantum information technologies. The main difficulty of such modeling is the exponential growth of dimensionality; thus the use of modern high-performance parallel computation is relevant. As is well known, arbitrary quantum computation in the circuit model can be done using only single- and two-qubit gates, and we analyze the computational structure and properties of the simulation of such gates. We investigate how the unique properties of quantum mechanics lead to the computational properties of the considered algorithms: quantum parallelism makes the simulation of quantum gates highly parallel, while on the other hand quantum entanglement leads to the problem of computational locality during simulation. We use the methodology of the AlgoWiki project (algowiki-project.org) to analyze the algorithm. This methodology consists of theoretical (sequential and parallel complexity, macro structure, and visual information graph) and experimental (locality and memory access, scalability, and more specific dynamic characteristics) parts. The experimental part was carried out using the petascale Lomonosov supercomputer (Moscow State University, Russia). We show that the simulation of quantum gates is a good basis for the research and testing of development methods for data-intensive parallel software, and the considered analysis methodology can be successfully used for the improvement of algorithms in quantum information science.
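    A sketch of the kernel in question: applying a single-qubit gate to an n-qubit state vector. The exponential 2^n storage and the strided memory access that drive the locality problem mentioned above are both visible here; the code is our own illustration, not the AlgoWiki implementation.

        import numpy as np

        def apply_1q_gate(state, gate, target, n):
            """Apply a 2x2 unitary to qubit `target` of an n-qubit state vector."""
            psi = state.reshape([2] * n)               # view the 2**n vector as an n-index tensor
            psi = np.tensordot(gate, psi, axes=([1], [target]))   # contract the target axis
            psi = np.moveaxis(psi, 0, target)          # restore the original axis order
            return psi.reshape(2**n)

        n = 10
        state = np.zeros(2**n, dtype=complex)
        state[0] = 1.0                                 # |00...0>
        H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
        state = apply_1q_gate(state, H, target=3, n=n)
        print(abs(np.vdot(state, state)))              # unitarity: norm stays 1.0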

  20. Computer simulations in the high school: students' cognitive stages, science process skills and academic achievement in microbiology

    Science.gov (United States)

    Huppert, J.; Michal Lomask, S.; Lazarowitz, R.

    2002-08-01

    Computer-assisted learning, including simulated experiments, has great potential to address the problem solving process which is a complex activity. It requires a highly structured approach in order to understand the use of simulations as an instructional device. This study is based on a computer simulation program, 'The Growth Curve of Microorganisms', which required tenth grade biology students to use problem solving skills whilst simultaneously manipulating three independent variables in one simulated experiment. The aims were to investigate the computer simulation's impact on students' academic achievement and on their mastery of science process skills in relation to their cognitive stages. The results indicate that the concrete and transition operational students in the experimental group achieved significantly higher academic achievement than their counterparts in the control group. The higher the cognitive operational stage, the higher students' achievement was, except in the control group where students in the concrete and transition operational stages did not differ. Girls achieved equally with the boys in the experimental group. Students' academic achievement may indicate the potential impact a computer simulation program can have, enabling students with low reasoning abilities to cope successfully with learning concepts and principles in science which require high cognitive skills.

  1. Analyzing Robotic Kinematics Via Computed Simulations

    Science.gov (United States)

    Carnahan, Timothy M.

    1992-01-01

    Computing system assists in evaluation of kinematics of conceptual robot. Displays positions and motions of robotic manipulator within work cell. Also displays interactions between robotic manipulator and other objects. Results of simulation displayed on graphical computer workstation. System includes both off-the-shelf software originally developed for automotive industry and specially developed software. Simulation system also used to design human-equivalent hand, to model optical train in infrared system, and to develop graphical interface for teleoperator simulation system.

  2. Computer Simulations, Disclosure and Duty of Care

    Directory of Open Access Journals (Sweden)

    John Barlow

    2006-05-01

    Full Text Available Computer simulations provide cost-effective methods for manipulating and modeling 'reality'. However, they are not real. They are imitations of a system or event, real or fabricated, and as such mimic, duplicate or represent that system or event. The degree to which a computer simulation aligns with and reproduces the 'reality' of the system or event it attempts to mimic or duplicate depends upon many factors, including the efficiency of the simulation algorithm, the processing power of the computer hardware used to run the simulation model, and the expertise, assumptions and prejudices of those concerned with designing, implementing and interpreting the simulation output. Computer simulations in particular are increasingly replacing physical experimentation in many disciplines and, as a consequence, are used to underpin quite significant decision-making which may impact on 'innocent' third parties. In this context, this paper examines two interrelated issues: firstly, how much and what kind of information should a simulation builder be required to disclose to potential users of the simulation? Secondly, what are the implications for a decision-maker who acts on the basis of their interpretation of a simulation output without any reference to its veracity, which may in turn compromise the safety of other parties?

  3. Computational simulation of the biomass gasification process in a fluidized bed reactor

    International Nuclear Information System (INIS)

    Rojas Mazaira, Leorlen Y.; Gamez Rodriguez, Abel; Andrade Gregori, Maria Dolores; Armas Cardona, Raul

    2009-01-01

    In an agro-industrial country such as Cuba, large amounts of crop residues, such as those from rice and sugar cane, are produced, in addition to forest residues from wooded areas. Gasification technology is an interesting application for all this biomass, owing to its high efficiency and positive environmental impact. Computer simulation is a useful tool in research on the operating parameters of a gasifier, because it reduces the number of experiments to be performed and the cost of the research. The work emphasizes the importance of computer simulation for predicting the hydrodynamic behavior of the fluidized bed and the combustion process of the biomass for different residues and different operating conditions. A CFD model is proposed for the simulation of the combustion process in a fluidized-bed biomass gasifier; the hydrodynamic parameters of the multiphase flow are characterized by means of a computer simulator that allows the reactor geometry to be set up and varied, as well as the influence of variations in quantities such as velocity, sand particle diameter, and equivalence ratio. Experimental results in cylindrical channels are presented to complement the 2D computer simulation study. (author)

  4. Radiotracer experiments and CFD simulation for industrial hydrocyclone performance

    International Nuclear Information System (INIS)

    Stegowski, Z.; Nowak, E.

    2007-01-01

    A hydrocyclone is a device for concentrating or selecting solid particles from a liquid-solid mixture. It is widely used in the mineral industry for the selection of solid particles from a few to a few hundred micrometers. This paper presents a radiotracer experiment and a computational simulation of the selection of solid particles in a Φ-500 mm hydrocyclone, which is used in the industrial copper ore concentration process. The simulation, based on computational fluid dynamics (CFD) techniques, allowed the velocity and concentration distributions to be obtained for a real mixture flowing in the hydrocyclone. The mixture was composed of water and nine solid phases of different grain sizes. Finally, the selection curve of the solid grains was obtained and compared with the experimental radiotracer results. (author)

  5. Virtual Reality Simulation of the International Space Welding Experiment

    Science.gov (United States)

    Phillips, James A.

    1996-01-01

    Virtual Reality (VR) is a set of breakthrough technologies that allow a human being to enter and fully experience a 3-dimensional, computer simulated environment. A true virtual reality experience meets three criteria: (1) It involves 3-dimensional computer graphics; (2) It includes real-time feedback and response to user actions; and (3) It must provide a sense of immersion. Good examples of a virtual reality simulator are the flight simulators used by all branches of the military to train pilots for combat in high performance jet fighters. The fidelity of such simulators is extremely high -- but so is the price tag, typically millions of dollars. Virtual reality teaching and training methods are manifestly effective, and we have therefore implemented a VR trainer for the International Space Welding Experiment. My role in the development of the ISWE trainer consisted of the following: (1) created texture-mapped models of the ISWE's rotating sample drum, technology block, tool stowage assembly, sliding foot restraint, and control panel; (2) developed C code for control panel button selection and rotation of the sample drum; (3) In collaboration with Tim Clark (Antares Virtual Reality Systems), developed a serial interface box for the PC and the SGI Indigo so that external control devices, similar to ones actually used on the ISWE, could be used to control virtual objects in the ISWE simulation; (4) In collaboration with Peter Wang (SFFP) and Mark Blasingame (Boeing), established the interference characteristics of the VIM 1000 head-mounted-display and tested software filters to correct the problem; (5) In collaboration with Peter Wang and Mark Blasingame, established software and procedures for interfacing the VPL DataGlove and the Polhemus 6DOF position sensors to the SGI Indigo serial ports. The majority of the ISWE modeling effort was conducted on a PC-based VR Workstation, described below.

  6. Simulation of a small computer of the TRA-1001 type on the BESM computer

    International Nuclear Information System (INIS)

    Galaktionov, V.V.

    1975-01-01

    The purpose of, and possible approaches to, the simulation of one computer by another are considered. An emulator (simulation program) of a small TRA-1001-type computer on the BESM-6 computer is presented. The basic elements of the simulated computer are the memory (8 K words), the central processor, the program-controlled input-output channel, the interrupt circuit, and the computer panel. Operation of the input-output devices (ASP-33 teletype, FS-1500) is also simulated. In actual operation the emulator has been used for translating programs prepared on punched cards with the aid of the SLANG-1 translator on the BESM-6 computer. The debugging of the translator from the COPLAN language was carried out with the aid of the emulator
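    For illustration, a toy fetch-decode-execute loop showing the general shape of such an emulator; the instruction set here is invented and is not that of the TRA-1001 or the BESM-6.

        def run(memory):
            acc, pc = 0, 0                        # accumulator and program counter
            while True:
                op, arg = memory[pc]; pc += 1     # fetch
                if   op == "LOAD":  acc = memory[arg][1]        # decode + execute
                elif op == "ADD":   acc += memory[arg][1]
                elif op == "STORE": memory[arg] = ("DATA", acc)
                elif op == "HALT":  return acc

        prog = [("LOAD", 4), ("ADD", 5), ("STORE", 6), ("HALT", 0),
                ("DATA", 2), ("DATA", 3), ("DATA", 0)]
        print(run(prog))   # 2 + 3 = 5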

  7. Nonlinear simulations with and computational issues for NIMROD

    International Nuclear Information System (INIS)

    Sovinec, C.R.

    1998-01-01

    The NIMROD (Non-Ideal Magnetohydrodynamics with Rotation, Open Discussion) code development project was commissioned by the US Department of Energy in February, 1996 to provide the fusion research community with a computational tool for studying low-frequency behavior in experiments. Specific problems of interest include the neoclassical evolution of magnetic islands and the nonlinear behavior of tearing modes in the presence of rotation and nonideal walls in tokamaks; they also include topics relevant to innovative confinement concepts such as magnetic turbulence. Besides having physics models appropriate for these phenomena, an additional requirement is the ability to perform the computations in realistic geometries. The NIMROD Team is using contemporary management and computational methods to develop a computational tool for investigating low-frequency behavior in plasma fusion experiments. The authors intend to make the code freely available, and are taking steps to make it as easy to learn and use as possible. An example application for NIMROD is the nonlinear toroidal RFP simulation--the first in a series to investigate how toroidal geometry affects MHD activity in RFPs. Finally, the most important issue facing the project is execution time, and they are exploring better matrix solvers and a better parallel decomposition to address this

  8. Nonlinear simulations with and computational issues for NIMROD

    Energy Technology Data Exchange (ETDEWEB)

    Sovinec, C.R. [Los Alamos National Lab., NM (United States)

    1998-12-31

    The NIMROD (Non-Ideal Magnetohydrodynamics with Rotation, Open Discussion) code development project was commissioned by the US Department of Energy in February, 1996 to provide the fusion research community with a computational tool for studying low-frequency behavior in experiments. Specific problems of interest include the neoclassical evolution of magnetic islands and the nonlinear behavior of tearing modes in the presence of rotation and nonideal walls in tokamaks; they also include topics relevant to innovative confinement concepts such as magnetic turbulence. Besides having physics models appropriate for these phenomena, an additional requirement is the ability to perform the computations in realistic geometries. The NIMROD Team is using contemporary management and computational methods to develop a computational tool for investigating low-frequency behavior in plasma fusion experiments. The authors intend to make the code freely available, and are taking steps to make it as easy to learn and use as possible. An example application for NIMROD is the nonlinear toroidal RFP simulation--the first in a series to investigate how toroidal geometry affects MHD activity in RFPs. Finally, the most important issue facing the project is execution time, and they are exploring better matrix solvers and a better parallel decomposition to address this.

  9. COMPUTATIONAL SIMULATION OF FIRE DEVELOPMENT INSIDE A TRADE CENTRE

    Directory of Open Access Journals (Sweden)

    Constantin LUPU

    2015-07-01

    Full Text Available Real-scale fire experiments involve considerable costs compared to computational mathematical modelling. This paper is the result of such a virtual simulation of a fire occurring in a hypothetical wholesale warehouse comprising a large number of trade stands. The analysis follows the fire from the ignition source located inside a trade stand through its spread over three groups of compartments, highlighting the heat transfer both in small spaces and over large distances. In order to confirm the accuracy of the simulation, the values obtained are compared with those from the specialized literature.

  10. A computer-simulated liver phantom (virtual liver phantom) for multidetector computed tomography evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Funama, Yoshinori [Kumamoto University, Department of Radiological Sciences, School of Health Sciences, Kumamoto (Japan); Awai, Kazuo; Nakayama, Yoshiharu; Liu, Da; Yamashita, Yasuyuki [Kumamoto University, Department of Diagnostic Radiology, Graduate School of Medical Sciences, Kumamoto (Japan); Miyazaki, Osamu; Goto, Taiga [Hitachi Medical Corporation, Tokyo (Japan); Hori, Shinichi [Gate Tower Institute of Image Guided Therapy, Osaka (Japan)

    2006-04-15

    The purpose of this study was to develop a computer-simulated liver phantom for hepatic CT studies. A computer-simulated liver phantom was mathematically constructed on a computer workstation. The computer-simulated phantom was calibrated using real CT images acquired by an actual four-detector CT. We added an inhomogeneous texture to the simulated liver by referring to CT images of chronically damaged human livers. The mean CT number of the simulated liver was 60 HU and we added numerous 5- to 10-mm structures with 60±10 HU/mm. To mimic liver tumors we added nodules measuring 8, 10, and 12 mm in diameter with CT numbers of 60±10, 60±15, and 60±20 HU. Five radiologists visually evaluated the similarity of the texture of the computer-simulated liver phantom and a real human liver, to confirm the appropriateness of the virtual liver images, using a five-point scale. The total score was 44 in two radiologists, and 42, 41, and 39 in one radiologist each. They evaluated that the textures of the virtual liver were comparable to those of the human liver. Our computer-simulated liver phantom is a promising tool for the evaluation of the image quality and diagnostic performance of hepatic CT imaging. (orig.)

  11. Development of a research simulator for the study of human factors and experiments

    International Nuclear Information System (INIS)

    Kawano, R.; Shibuya, S.

    1999-01-01

    A research simulator of a nuclear power plant for human factors studies was developed. It simulates the behavior of an 1100 MWe BWR nuclear power plant and has almost the same functions and scope of simulation as a full-scope training simulator. Physical models installed in the system enable experiments with multi-malfunction scenarios to be executed. A severe accident simulation package replaces the running simulation code when the maximum core temperature exceeds 1200 deg C and the core approaches meltdown conditions. The central control panel is simulated by soft panels, with the indicators and operational switches on the panels rendered by computer graphics and displayed on 22 console boxes containing CRTs. The introduction of soft panels and EWSs connected by a LAN provided flexibility and extensibility. Several experiments using the simulator were executed, and the system has been improved based on the experience from these experiments. It is important to evaluate the effectiveness of any new system by using an actual-plant-size research simulator before its practical application, to maintain the steady and safe operation of nuclear power plants. (author)

  12. Computer Simulations of Lipid Bilayers and Proteins

    DEFF Research Database (Denmark)

    Sonne, Jacob

    2006-01-01

    The importance of computer simulations in lipid bilayer research has become more prominent for the last couple of decades and as computers get even faster, simulations will play an increasingly important part in understanding the processes that take place in and across cell membranes. This thesis...... entitled Computer simulations of lipid bilayers and proteins describes two molecular dynamics (MD) simulation studies of pure lipid bilayers as well as a study of a transmembrane protein embedded in a lipid bilayer matrix. Below follows a brief overview of the thesis. Chapter 1. This chapter is a short...... in the succeeding chapters is presented. Details on system setups, simulation parameters and other technicalities can be found in the relevant chapters. Chapter 3, DPPC lipid parameters: The quality of MD simulations is intimately dependent on the empirical potential energy function and its parameters, i...

  13. Reference Computational Meshing Strategy for Computational Fluid Dynamics Simulation of Departure from Nucleate Boiling

    Energy Technology Data Exchange (ETDEWEB)

    Pointer, William David [ORNL

    2017-08-01

    The objective of this effort is to establish a strategy and process for the generation of suitable computational mesh for computational fluid dynamics simulations of departure from nucleate boiling (DNB) in a 5 by 5 fuel rod assembly held in place by PWR mixing vane spacer grids. This mesh generation process will support ongoing efforts to develop, demonstrate and validate advanced multi-phase computational fluid dynamics methods that enable more robust identification of dryout conditions and DNB occurrence. Building upon prior efforts and experience, multiple computational meshes were developed using the native mesh generation capabilities of the commercial CFD code STAR-CCM+. These meshes were used to simulate two test cases from the Westinghouse 5 by 5 rod bundle facility. The sensitivity of predicted quantities of interest to the mesh resolution was then established using two evaluation methods, the Grid Convergence Index method and the Least Squares method. This evaluation suggests that the Least Squares method can reliably establish the uncertainty associated with local parameters such as vector velocity components at a point in the domain or surface-averaged quantities such as outlet velocity magnitude. However, neither method is suitable for characterization of uncertainty in global extrema such as peak fuel surface temperature, primarily because such parameters are not necessarily associated with a fixed point in space. This shortcoming is significant because the current-generation algorithm for identification of DNB event conditions relies on identification of such global extrema. Ongoing efforts to identify DNB based on local surface conditions will address this challenge.
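    A sketch of the Grid Convergence Index evaluation mentioned above (Roache's formulation), with invented sample values for three systematically refined meshes:

        import math

        f3, f2, f1 = 2.20, 2.05, 2.00   # coarse, medium, fine solutions (assumed values)
        r = 2.0                          # grid refinement ratio (assumed)
        Fs = 1.25                        # safety factor commonly used for three-grid studies

        p = math.log(abs(f3 - f2) / abs(f2 - f1)) / math.log(r)   # observed order of accuracy
        gci_fine = Fs * abs((f2 - f1) / f1) / (r**p - 1)          # relative uncertainty on the fine grid
        print(f"observed order p = {p:.2f}, GCI_fine = {100*gci_fine:.2f}%")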

  14. Computational simulation of the creep-rupture process in filamentary composite materials

    Science.gov (United States)

    Slattery, Kerry T.; Hackett, Robert M.

    1991-01-01

    A computational simulation of the internal damage accumulation which causes the creep-rupture phenomenon in filamentary composite materials is developed. The creep-rupture process involves complex interactions between several damage mechanisms. A statistically-based computational simulation using a time-differencing approach is employed to model these progressive interactions. The finite element method is used to calculate the internal stresses. The fibers are modeled as a series of bar elements which are connected transversely by matrix elements. Flaws are distributed randomly throughout the elements in the model. Load is applied, and the properties of the individual elements are updated at the end of each time step as a function of the stress history. The simulation is continued until failure occurs. Several cases, with different initial flaw dispersions, are run to establish a statistical distribution of the time-to-failure. The calculations are performed on a supercomputer. The simulation results compare favorably with the results of creep-rupture experiments conducted at the Lawrence Livermore National Laboratory.
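    A compact sketch of the statistically-based idea described above, reduced to an equal-load-sharing fiber bundle with randomly distributed fiber strengths; the Weibull parameters and fiber count are illustrative assumptions, not the study's model.

        import numpy as np

        rng = np.random.default_rng(0)
        n_fibers, n_runs = 1000, 50
        bundle_strengths = []

        for _ in range(n_runs):
            s = np.sort(rng.weibull(5.0, n_fibers))   # flaw-controlled fiber strengths
            # Under equal load sharing, fibers fail weakest-first; just before the
            # (k+1)-th failure the bundle carries a total load of s[k]*(n_fibers-k).
            total_load = s * (n_fibers - np.arange(n_fibers))
            bundle_strengths.append(total_load.max() / n_fibers)

        print("mean bundle strength per fiber:", np.mean(bundle_strengths))
        print("run-to-run scatter:", np.std(bundle_strengths))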

  15. Skin hydration analysis by experiment and computer simulations and its implications for diapered skin.

    Science.gov (United States)

    Saadatmand, M; Stone, K J; Vega, V N; Felter, S; Ventura, S; Kasting, G; Jaworska, J

    2017-11-01

    Experimental work on skin hydration is technologically challenging and mostly limited to observations where environmental conditions are constant. In some cases, such as diapered baby skin, such work is practically unfeasible, yet it is important to understand the potential effects of diapering on skin condition. To overcome this challenge, in part, we developed a computer simulation model of reversible transient skin hydration effects. The skin hydration model of Li et al. (Chem Eng Sci, 138, 2015, 164) was further developed to simulate transient exposure conditions where relative humidity (RH), wind velocity, and air and skin temperatures can be arbitrary functions of time. Computer simulations of evaporative water loss (EWL) decay after different occlusion times were compared with experimental data to calibrate the model. Next, we used the model to investigate EWL and SC thickness in different diapering scenarios. Key results from the experimental work were: (1) for occlusions by RH=100% and by free water longer than 30 minutes, the absorbed amount of water is almost the same; (2) longer occlusion times result in higher water absorption by the SC. The EWL decay and skin water content predictions were in agreement with experimental data. Simulations also revealed that skin under occlusion hydrates mainly because the outflux is blocked, not because it absorbs water from the environment. Further, simulations demonstrated that the hydration level is sensitive to time, RH, and/or free water on skin. In simulated diapering scenarios, skin maintained a hydration content very close to the baseline conditions without a diaper for the entire duration of a 24-hour period. Different diapers/diaper technologies are known to have different profiles in terms of their ability to provide wetness protection, which can result in consumer-noticeable differences in wetness. Simulation results based on published literature using data from a number of different diapers suggest that diapered skin hydrates within

  16. Scientific and Computational Challenges of the Fusion Simulation Program (FSP)

    International Nuclear Information System (INIS)

    Tang, William M.

    2011-01-01

    This paper highlights the scientific and computational challenges facing the Fusion Simulation Program (FSP) - a major national initiative in the United States with the primary objective being to enable scientific discovery of important new plasma phenomena with associated understanding that emerges only upon integration. This requires developing a predictive integrated simulation capability for magnetically-confined fusion plasmas that are properly validated against experiments in regimes relevant for producing practical fusion energy. It is expected to provide a suite of advanced modeling tools for reliably predicting fusion device behavior with comprehensive and targeted science-based simulations of nonlinearly-coupled phenomena in the core plasma, edge plasma, and wall region on time and space scales required for fusion energy production. As such, it will strive to embody the most current theoretical and experimental understanding of magnetic fusion plasmas and to provide a living framework for the simulation of such plasmas as the associated physics understanding continues to advance over the next several decades. Substantive progress on answering the outstanding scientific questions in the field will drive the FSP toward its ultimate goal of developing the ability to predict the behavior of plasma discharges in toroidal magnetic fusion devices with high physics fidelity on all relevant time and space scales. From a computational perspective, this will demand computing resources in the petascale range and beyond together with the associated multi-core algorithmic formulation needed to address burning plasma issues relevant to ITER - a multibillion dollar collaborative experiment involving seven international partners representing over half the world's population. Even more powerful exascale platforms will be needed to meet the future challenges of designing a demonstration fusion reactor (DEMO). Analogous to other major applied physics modeling projects (e

  17. Scientific and computational challenges of the fusion simulation program (FSP)

    International Nuclear Information System (INIS)

    Tang, William M.

    2011-01-01

    This paper highlights the scientific and computational challenges facing the Fusion Simulation Program (FSP) - a major national initiative in the United States with the primary objective being to enable scientific discovery of important new plasma phenomena with associated understanding that emerges only upon integration. This requires developing a predictive integrated simulation capability for magnetically-confined fusion plasmas that are properly validated against experiments in regimes relevant for producing practical fusion energy. It is expected to provide a suite of advanced modeling tools for reliably predicting fusion device behavior with comprehensive and targeted science-based simulations of nonlinearly-coupled phenomena in the core plasma, edge plasma, and wall region on time and space scales required for fusion energy production. As such, it will strive to embody the most current theoretical and experimental understanding of magnetic fusion plasmas and to provide a living framework for the simulation of such plasmas as the associated physics understanding continues to advance over the next several decades. Substantive progress on answering the outstanding scientific questions in the field will drive the FSP toward its ultimate goal of developing the ability to predict the behavior of plasma discharges in toroidal magnetic fusion devices with high physics fidelity on all relevant time and space scales. From a computational perspective, this will demand computing resources in the petascale range and beyond together with the associated multi-core algorithmic formulation needed to address burning plasma issues relevant to ITER - a multibillion dollar collaborative experiment involving seven international partners representing over half the world's population. Even more powerful exascale platforms will be needed to meet the future challenges of designing a demonstration fusion reactor (DEMO). Analogous to other major applied physics modeling projects (e

  18. Computational physics simulation of classical and quantum systems

    CERN Document Server

    Scherer, Philipp O J

    2013-01-01

    This textbook presents basic and advanced computational physics in a very didactic style. It contains well-presented and simple mathematical descriptions of many of the most important algorithms used in computational physics. The first part of the book discusses the basic numerical methods. A large number of exercises and computer experiments allows the reader to study the properties of these methods. The second part concentrates on the simulation of classical and quantum systems. It uses a rather general concept for the equation of motion which can be applied to ordinary and partial differential equations. Several classes of integration methods are discussed, including not only the standard Euler and Runge-Kutta methods but also multistep methods and the class of Verlet methods, which is introduced by studying the motion in Liouville space. Besides the classical methods, inverse interpolation is discussed, together with the p...

  19. Event-based computer simulation model of aspect-type experiments strictly satisfying Einstein's locality conditions

    NARCIS (Netherlands)

    De Raedt, Hans; De Raedt, Koen; Michielsen, Kristel; Keimpema, Koenraad; Miyashita, Seiji

    2007-01-01

    Inspired by Einstein-Podolsky-Rosen-Bohm experiments with photons, we construct an event-based simulation model in which every essential element in the ideal experiment has a counterpart. The model satisfies Einstein's criterion of local causality and does not rely on concepts of quantum and

  20. Radiotherapy Monte Carlo simulation using cloud computing technology.

    Science.gov (United States)

    Poole, C M; Cornelius, I; Trapp, J V; Langton, C M

    2012-12-01

    Cloud computing allows for vast computational resources to be leveraged quickly and easily in bursts as and when required. Here we describe a technique that allows for Monte Carlo radiotherapy dose calculations to be performed using GEANT4 and executed in the cloud, with relative simulation cost and completion time evaluated as a function of machine count. As expected, simulation completion time decreases as 1/n for n parallel machines, and relative simulation cost is found to be optimal where n is a factor of the total simulation time in hours. Using the technique, we demonstrate the potential usefulness of cloud computing as a solution for rapid Monte Carlo simulation for radiotherapy dose calculation without the need for dedicated local computer hardware, as a proof of principle.
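
    The scaling claims in this record are easy to reproduce with a toy cost model. The sketch below (Python; an illustration, not the authors' code) assumes a fixed total workload split evenly over n machines and per-machine billing rounded up to whole hours - the billing granularity is an assumption made here to show why cost is optimal when n divides the total simulation time.

        import math

        def completion_time(total_hours, n):
            """Wall-clock time when the workload is split evenly over n machines."""
            return total_hours / n

        def relative_cost(total_hours, n):
            """Billed machine-hours relative to one machine, assuming each
            machine is billed in whole hours (an illustrative assumption)."""
            return n * math.ceil(total_hours / n) / total_hours

        total = 12  # hypothetical total simulation time in hours
        for n in (1, 2, 3, 4, 5, 6, 8, 12):
            print(f"n={n:2d}  time={completion_time(total, n):5.2f} h  "
                  f"relative cost={relative_cost(total, n):.2f}")

    For n in {1, 2, 3, 4, 6, 12} the relative cost stays at 1.00, while n = 5 or n = 8 wastes part of the last billed hour on every machine, matching the observation that cost is optimal where n is a factor of the total simulation time in hours.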

  1. Radiotherapy Monte Carlo simulation using cloud computing technology

    International Nuclear Information System (INIS)

    Poole, C.M.; Cornelius, I.; Trapp, J.V.; Langton, C.M.

    2012-01-01

    Cloud computing allows for vast computational resources to be leveraged quickly and easily in bursts as and when required. Here we describe a technique that allows for Monte Carlo radiotherapy dose calculations to be performed using GEANT4 and executed in the cloud, with relative simulation cost and completion time evaluated as a function of machine count. As expected, simulation completion time decreases as 1/n for n parallel machines, and relative simulation cost is found to be optimal where n is a factor of the total simulation time in hours. Using the technique, we demonstrate the potential usefulness of cloud computing as a solution for rapid Monte Carlo simulation for radiotherapy dose calculation without the need for dedicated local computer hardware, as a proof of principle.

  2. Computational simulation of natural circulation and rewetting experiments using the TRAC/PF1 code

    International Nuclear Information System (INIS)

    Silva, J.D. da.

    1994-05-01

    In this work the TRAC code was used to simulate natural circulation experiments performed in the first Brazilian integral test facility at COPESP, Sao Paulo, and a rewetting experiment in a single-tube test section carried out at CDTN, Belo Horizonte, Brazil. In the first simulation the loop behavior was verified in two transient conditions with different thermal power, namely 20 kW and 120 kW; in the second, the quench front propagation, the liquid mass collected in the carry-over measuring tube, and the wall temperature at different elevations during the flooding experiment were measured. A comparative analysis, for code consistency, shows a good agreement between the code results and experimental data, except for the quench front velocity. (author). 15 refs, 19 figs, 12 tabs

  3. Development of a mechanistically based computer simulation of nitrogen oxide absorption in packed towers

    International Nuclear Information System (INIS)

    Counce, R.M.

    1981-01-01

    A computer simulation for nitrogen oxide (NOx) scrubbing in packed towers was developed for use in process design and process control. This simulation implements a mechanistically based mathematical model, which was formulated from (1) an exhaustive literature review; (2) previous NOx scrubbing experience with sieve-plate towers; and (3) comparisons of sequential sets of experiments. Nitrogen oxide scrubbing is characterized by simultaneous absorption and desorption phenomena: the model development is based on experiments designed to feature these two phenomena. The model was then successfully tested in experiments designed to put it in jeopardy

  4. Atomistic computer simulations a practical guide

    CERN Document Server

    Brazdova, Veronika

    2013-01-01

    Many books explain the theory of atomistic computer simulations; this book teaches you how to run them. This introductory "how to" title enables readers to understand, plan, run, and analyze their own independent atomistic simulations, and decide which method to use and which questions to ask in their research project. It is written in a clear and precise language, focusing on a thorough understanding of the concepts behind the equations and how these are used in the simulations. As a result, readers will learn how to design the computational model and which parameters o

  5. Generalized Bell-inequality experiments and computation

    Energy Technology Data Exchange (ETDEWEB)

    Hoban, Matty J. [Department of Physics and Astronomy, University College London, Gower Street, London WC1E 6BT (United Kingdom); Department of Computer Science, University of Oxford, Wolfson Building, Parks Road, Oxford OX1 3QD (United Kingdom); Wallman, Joel J. [School of Physics, The University of Sydney, Sydney, New South Wales 2006 (Australia); Browne, Dan E. [Department of Physics and Astronomy, University College London, Gower Street, London WC1E 6BT (United Kingdom)

    2011-12-15

    We consider general settings of Bell inequality experiments with many parties, where each party chooses from a finite number of measurement settings each with a finite number of outcomes. We investigate the constraints that Bell inequalities place upon the correlations possible in local hidden variable theories using a geometrical picture of correlations. We show that local hidden variable theories can be characterized in terms of limited computational expressiveness, which allows us to characterize families of Bell inequalities. The limited computational expressiveness for many settings (each with many outcomes) generalizes previous results about the many-party situation each with a choice of two possible measurements (each with two outcomes). Using this computational picture we present generalizations of the Popescu-Rohrlich nonlocal box for many parties and nonbinary inputs and outputs at each site. Finally, we comment on the effect of preprocessing on measurement data in our generalized setting and show that it becomes problematic outside of the binary setting, in that it allows local hidden variable theories to simulate maximally nonlocal correlations such as those of these generalized Popescu-Rohrlich nonlocal boxes.
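
    The binary two-setting, two-outcome case that this work generalizes can be checked directly by brute force. The following sketch (an illustration, not the authors' code) enumerates all deterministic local hidden variable strategies for the CHSH inequality and evaluates the Popescu-Rohrlich box mentioned above:

        import itertools

        def chsh(E):
            # CHSH combination S = E(0,0) + E(0,1) + E(1,0) - E(1,1)
            return E[(0, 0)] + E[(0, 1)] + E[(1, 0)] - E[(1, 1)]

        # Deterministic local strategies: Alice's +/-1 outcome depends only on
        # her setting x, Bob's only on his setting y.
        best = max(
            chsh({(x, y): a[x] * b[y] for x in (0, 1) for y in (0, 1)})
            for a in itertools.product((-1, 1), repeat=2)
            for b in itertools.product((-1, 1), repeat=2)
        )
        print("local hidden variable bound:", best)  # prints 2

        # Popescu-Rohrlich box: outcomes always satisfy a XOR b = x AND y,
        # so correlators are +1 unless both settings are 1.
        E_pr = {(x, y): -1 if x == y == 1 else 1 for x in (0, 1) for y in (0, 1)}
        print("PR box value:", chsh(E_pr))  # prints 4, the algebraic maximum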

  6. Generalized Bell-inequality experiments and computation

    International Nuclear Information System (INIS)

    Hoban, Matty J.; Wallman, Joel J.; Browne, Dan E.

    2011-01-01

    We consider general settings of Bell inequality experiments with many parties, where each party chooses from a finite number of measurement settings each with a finite number of outcomes. We investigate the constraints that Bell inequalities place upon the correlations possible in local hidden variable theories using a geometrical picture of correlations. We show that local hidden variable theories can be characterized in terms of limited computational expressiveness, which allows us to characterize families of Bell inequalities. The limited computational expressiveness for many settings (each with many outcomes) generalizes previous results about the many-party situation each with a choice of two possible measurements (each with two outcomes). Using this computational picture we present generalizations of the Popescu-Rohrlich nonlocal box for many parties and nonbinary inputs and outputs at each site. Finally, we comment on the effect of preprocessing on measurement data in our generalized setting and show that it becomes problematic outside of the binary setting, in that it allows local hidden variable theories to simulate maximally nonlocal correlations such as those of these generalized Popescu-Rohrlich nonlocal boxes.

  7. Two-dimensional computer simulation of high intensity proton beams

    CERN Document Server

    Lapostolle, Pierre M

    1972-01-01

    A computer program has been developed which simulates the two- dimensional transverse behaviour of a proton beam in a focusing channel. The model is represented by an assembly of a few thousand 'superparticles' acted upon by their own self-consistent electric field and an external focusing force. The evolution of the system is computed stepwise in time by successively solving Poisson's equation and Newton's law of motion. Fast Fourier transform techniques are used for speed in the solution of Poisson's equation, while extensive area weighting is utilized for the accurate evaluation of electric field components. A computer experiment has been performed on the CERN CDC 6600 computer to study the nonlinear behaviour of an intense beam in phase space, showing under certain circumstances a filamentation due to space charge and an apparent emittance growth. (14 refs).
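
    The simulation cycle described here - deposit charge with area weighting, solve Poisson's equation with fast Fourier transforms, then advance the superparticles with Newton's law - can be illustrated in one dimension. Below is a minimal Python/NumPy analogue (not the CERN program; all parameters are illustrative):

        import numpy as np

        # Minimal 1D electrostatic particle-in-cell sketch: charge deposition
        # with linear area weighting, spectral Poisson solve, leapfrog push.
        ng, n_p, L, dt, steps = 64, 20000, 2 * np.pi, 0.05, 200
        dx = L / ng
        rng = np.random.default_rng(0)
        x = rng.uniform(0.0, L, n_p)          # superparticle positions
        v = 0.05 * np.sin(x)                  # small velocity perturbation
        wq = L / n_p                          # weight giving mean density 1

        k = 2 * np.pi * np.fft.rfftfreq(ng, d=dx)

        for _ in range(steps):
            # 1) Charge deposition with linear (cloud-in-cell) area weighting.
            g = x / dx
            i0 = np.floor(g).astype(int) % ng
            frac = g - np.floor(g)
            ne = (np.bincount(i0, weights=(1 - frac) * wq, minlength=ng)
                  + np.bincount((i0 + 1) % ng, weights=frac * wq, minlength=ng)) / dx
            rho = 1.0 - ne                    # fixed ion background minus electrons
            # 2) Spectral Poisson solve: dE/dx = rho in Fourier space.
            rho_k = np.fft.rfft(rho)
            E_k = np.zeros_like(rho_k)
            E_k[1:] = rho_k[1:] / (1j * k[1:])
            E = np.fft.irfft(E_k, ng)
            # 3) Interpolate the field to the particles and apply Newton's law.
            Ep = (1 - frac) * E[i0] + frac * E[(i0 + 1) % ng]
            v -= Ep * dt                      # electron charge/mass = -1
            x = (x + v * dt) % L

        print(f"rms velocity after {steps} steps: {v.std():.4f}")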

  8. FEL simulations using distributed computing

    NARCIS (Netherlands)

    Einstein, J.; Biedron, S.G.; Freund, H.P.; Milton, S.V.; Van Der Slot, P. J M; Bernabeu, G.

    2016-01-01

    While simulation tools are available and have been used regularly for simulating light sources, including Free-Electron Lasers, the increasing availability and lower cost of accelerated computing opens up new opportunities. This paper highlights a method of how accelerating and parallelizing code

  9. CUBESIM, Hypercube and Denelcor Hep Parallel Computer Simulation

    International Nuclear Information System (INIS)

    Dunigan, T.H.

    1988-01-01

    1 - Description of program or function: CUBESIM is a set of subroutine libraries and programs for the simulation of message-passing parallel computers and shared-memory parallel computers. Subroutines are supplied to simulate the Intel hypercube and the Denelcor HEP parallel computers. The system permits a user to develop and test parallel programs written in C or FORTRAN on a single processor. The user may alter such hypercube parameters as message startup times, packet size, and the computation-to-communication ratio. The simulation generates a trace file that can be used for debugging, performance analysis, or graphical display. 2 - Method of solution: The CUBESIM simulator is linked with the user's parallel application routines to run as a single UNIX process. The simulator library provides a small operating system to perform process and message management. 3 - Restrictions on the complexity of the problem: Up to 128 processors can be simulated with a virtual memory limit of 6 million bytes. Up to 1000 processes can be simulated

  10. Computer modeling of active experiments in space plasmas

    International Nuclear Information System (INIS)

    Bollens, R.J.

    1993-01-01

    The understanding of space plasmas is expanding rapidly. This is, in large part, due to the ambitious efforts of scientists from around the world who are performing large scale active experiments in the space plasma surrounding the earth. One such effort was designated the Active Magnetospheric Particle Tracer Explorers (AMPTE) and consisted of a series of plasma releases that were completed during 1984 and 1985. What makes the AMPTE experiments particularly interesting was the occurrence of a dramatic anomaly that was completely unpredicted. During the AMPTE experiment, three satellites traced the solar-wind flow into the earth's magnetosphere. One satellite, built by West Germany, released a series of barium and lithium canisters that were detonated and subsequently photo-ionized via solar radiation, thereby creating an artificial comet. Another satellite, built by Great Britain and in the vicinity during detonation, carried, as did the first satellite, a comprehensive set of magnetic field, particle and wave instruments. Upon detonation, what was observed by the satellites, as well as by aircraft and ground-based observers, was quite unexpected. The initial deflection of the ion clouds was not in the ambient solar wind's flow direction (V) but rather in the direction transverse to the solar wind and the background magnetic field (V x B). This result was not predicted by any existing theories or simulation models; it is the main subject discussed in this dissertation. A large three dimensional computer simulation was produced to demonstrate that this transverse motion can be explained in terms of a rocket effect. Due to the extreme computer resources utilized in producing this work, the computer methods used to complete the calculation and the visualization techniques used to view the results are also discussed

  11. Accelerator simulation using computers

    International Nuclear Information System (INIS)

    Lee, M.; Zambre, Y.; Corbett, W.

    1992-01-01

    Every accelerator or storage ring system consists of a charged particle beam propagating through a beam line. Although a number of computer programs exist that simulate the propagation of a beam in a given beam line, only a few provide the capabilities for designing, commissioning and operating the beam line. This paper shows how a 'multi-track' simulation and analysis code can be used for these applications

  12. Microcrack propagation under multiaxial loading - experiment and simulation

    International Nuclear Information System (INIS)

    Poetter, K.; Suhartono, A.; Yousefi, F.; Zenner, H.; Duewel, V.; Schram, A.

    2000-01-01

    The accuracy of lifetime prediction for technical components subjected to cyclic loading is still not satisfactory. One essential reason for the deviation between the results of lifetime calculations and experimental results is that it is not yet possible to generate a model capable of describing the microstructural damage process which occurs in the tested material and to integrate this model in the calculation. Present research results agree that the growth of microcracks is significantly influenced by the microstructure of the material. In order to take into account the influence of the microstructure on the damage process, a simulation model is suggested in this paper which considers the local stress state in addition to the random nature of the material structure in the form of grain boundaries and slip systems. The results generated by means of the simulation model are compared and verified with the experimental results obtained from multiaxial fatigue testing of the investigated aluminum material. For this purpose the surfaces of the tested specimens are carefully observed to discover and analyze microcracks, which are classified according to their number, length, and orientation. Moreover, the mechanisms of crack initiation and propagation are major points of interest for the comparison of theoretical and experimental results. The developed computer software is suitable to simulate microcrack initiation, the propagation and coalescence of microcracks, as well as the transition of stage I cracks to stage II cracks for uniaxial and multiaxial loading. Results obtained from the simulation model could be verified with the experiment. The future aim to be emphasized is the utilization of the parameter investigations carried out with the computer simulation model in order to improve lifetime prediction. (orig.)

  13. Cyclic deformation-induced solute transport in tissue scaffolds with computer designed, interconnected, pore networks: experiments and simulations.

    Science.gov (United States)

    Den Buijs, Jorn Op; Dragomir-Daescu, Dan; Ritman, Erik L

    2009-08-01

    Nutrient supply and waste removal in porous tissue engineering scaffolds decrease from the periphery to the center, leading to limited depth of ingrowth of new tissue into the scaffold. However, as many tissues experience cyclic physiological strains, these may provide a mechanism to enhance solute transport in vivo before vascularization of the scaffold. The hypothesis of this study was that pore cross-sectional geometry and interconnectivity are of major importance for the effectiveness of cyclic deformation-induced solute transport. Transparent elastic polyurethane scaffolds, with computer-programmed design of pore networks in the form of interconnected channels, were fabricated using a 3D printing and injection molding technique. The scaffold pores were loaded with a colored tracer for optical contrast and cyclically compressed with deformations of 10 and 15% of the original undeformed height at 1.0 Hz. Digital imaging was used to quantify the spatial distribution of the tracer concentration within the pores. Numerical simulations of a fluid-structure interaction model of deformation-induced solute transport were compared to the experimental data. The results of experiments and modeling agreed well and showed that pore interconnectivity heavily influences deformation-induced solute transport. Pore cross-sectional geometry appears to be of less relative importance in interconnected pore networks. Validated computer models of solute transport can be used to design optimal scaffold pore geometries that will enhance the convective transport of nutrients inside the scaffold and the removal of waste, thus improving cell survivability deep inside the scaffold.

  14. An Educational Software for Simulating the Sample Size of Molecular Marker Experiments

    Science.gov (United States)

    Helms, T. C.; Doetkott, C.

    2007-01-01

    We developed educational software to show graduate students how to plan molecular marker experiments. These computer simulations give the students feedback on the precision of their experiments. The objective of the software was to show students using a hands-on approach how: (1) environmental variation influences the range of the estimates of the…
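
    The kind of feedback loop the record describes can be sketched with a few lines of Monte Carlo: repeat a virtual experiment many times and report how the spread of the estimate shrinks with replication. This is an illustrative stand-in (the educational software itself is not reproduced here), and all trait values are invented:

        import random
        import statistics

        def simulate_precision(n_reps, env_sd, n_experiments=2000, true_value=10.0):
            """Spread of the estimated trait mean as a function of replication
            and environmental variation (illustrative parameter values)."""
            estimates = []
            for _ in range(n_experiments):
                obs = [random.gauss(true_value, env_sd) for _ in range(n_reps)]
                estimates.append(statistics.fmean(obs))
            return statistics.stdev(estimates)

        random.seed(0)
        for n in (2, 4, 8, 16):
            print(f"replicates={n:2d}  empirical SE={simulate_precision(n, 2.0):.3f}")

    Doubling the number of replicates shrinks the empirical standard error by roughly a factor of 1/sqrt(2), which is exactly the kind of precision feedback the software gives students.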

  15. Computer control of behavioral experiments.

    Science.gov (United States)

    SIEGEL, LOUIS

    The LINC computer provides a particular schedule of reinforcement for behavioral experiments by executing a sequence of computer operations in conjunction with a specially designed interface. The interface is the means of communication between the experimental chamber and the computer. The program and interface of an experiment involving a pigeon…

  16. Computational simulation of concurrent engineering for aerospace propulsion systems

    Science.gov (United States)

    Chamis, C. C.; Singhal, S. N.

    1992-01-01

    Results are summarized of an investigation to assess the infrastructure available and the technology readiness in order to develop computational simulation methods/software for concurrent engineering. These results demonstrate that development of computational simulations methods for concurrent engineering is timely. Extensive infrastructure, in terms of multi-discipline simulation, component-specific simulation, system simulators, fabrication process simulation, and simulation of uncertainties - fundamental in developing such methods, is available. An approach is recommended which can be used to develop computational simulation methods for concurrent engineering for propulsion systems and systems in general. Benefits and facets needing early attention in the development are outlined.

  17. Computational simulation for concurrent engineering of aerospace propulsion systems

    Science.gov (United States)

    Chamis, C. C.; Singhal, S. N.

    1993-01-01

    Results are summarized for an investigation to assess the infrastructure available and the technology readiness in order to develop computational simulation methods/software for concurrent engineering. These results demonstrate that development of computational simulation methods for concurrent engineering is timely. Extensive infrastructure, in terms of multi-discipline simulation, component-specific simulation, system simulators, fabrication process simulation, and simulation of uncertainties--fundamental to develop such methods, is available. An approach is recommended which can be used to develop computational simulation methods for concurrent engineering of propulsion systems and systems in general. Benefits and issues needing early attention in the development are outlined.

  18. Simulation of radioactive waste transmutation on the t.node parallel computer

    International Nuclear Information System (INIS)

    Bacha, F.; Maillard, J.; Silva, J.

    1995-01-01

    Before any experiment on a reactor driven by an accelerator, computer simulation supplies tools for optimization. Some of the key parameters are neutron production on a heavy target and the neutron flux distribution in the core. During two code benchmarks organized by the NEA-OECD, simulations of energetic incident proton collisions on a thin lead target (for the first benchmark) and on a thick lead target (for the second) are described. One validation of the numeric codes is based on these results. A preliminary design of a burning waste system using benchmark result analysis and fission focused simulations is proposed

  19. Simulation of radioactive waste transmutation on the T. Node parallel computer

    International Nuclear Information System (INIS)

    Bacha, F.; Maillard, J.; Silva, J.

    1995-01-01

    Before any experiment on a reactor driven by an accelerator, computer simulation supplies tools for optimization. Some of the key parameters are neutron production on a heavy target and the neutron flux distribution in the core. During two code benchmarks organized by the NEA-OECD, simulations of energetic incident proton collisions on a thin lead target (for the first benchmark) and on a thick lead target (for the second) are described. One validation of our numeric codes is based on these results. A preliminary design of a burning waste system using benchmark result analysis and fission focused simulations is proposed

  20. Simulation of radioactive waste transmutation on the t.node parallel computer

    Energy Technology Data Exchange (ETDEWEB)

    Bacha, F.; Maillard, J.; Silva, J. [LPC College de France, Paris (France)

    1995-10-01

    Before any experiment on a reactor driven by an accelerator, computer simulation supplies tools for optimization. Some of the key parameters are neutron production on a heavy target and the neutron flux distribution in the core. During two code benchmarks organized by the NEA-OECD, simulations of energetic incident proton collisions on a thin lead target (for the first benchmark) and on a thick lead target (for the second) are described. One validation of the numeric codes is based on these results. A preliminary design of a burning waste system using benchmark result analysis and fission focused simulations is proposed.

  1. Computer-generated ovaries to assist follicle counting experiments.

    Directory of Open Access Journals (Sweden)

    Angelos Skodras

    Full Text Available Precise estimation of the number of follicles in ovaries is of key importance in the field of reproductive biology, both from a developmental point of view, where follicle numbers are determined at specific time points, as well as from a therapeutic perspective, determining the adverse effects of environmental toxins and cancer chemotherapeutics on the reproductive system. The two main factors affecting follicle number estimates are the sampling method and the variation in follicle numbers within animals of the same strain, due to biological variability. This study aims at assessing the effect of these two factors when estimating ovarian follicle numbers of neonatal mice. We developed computer algorithms, which generate models of neonatal mouse ovaries (simulated ovaries), with characteristics derived from experimental measurements already available in the published literature. The simulated ovaries are used to reproduce in-silico counting experiments based on unbiased stereological techniques; the proposed approach provides the necessary number of ovaries and sampling frequency to be used in the experiments given a specific biological variability and a desirable degree of accuracy. The simulated ovary is a novel, versatile tool which can be used in the planning phase of experiments to estimate the expected number of animals and workload, ensuring appropriate statistical power of the resulting measurements. Moreover, the idea of the simulated ovary can be applied to other organs made up of large numbers of individual functional units.
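
    A toy version of such an in-silico counting experiment conveys the idea: build a synthetic ovary with a biologically variable follicle total spread over serial sections, then apply fractionator-style subsampling and compare the scaled-up estimates with the known truth. Everything below is an illustrative sketch, not the authors' algorithm, and all parameter values are invented:

        import random

        def simulated_ovary(n_sections=100, mean_follicles=2000, bio_cv=0.2):
            """Toy stand-in for a simulated ovary: distribute a biologically
            variable follicle total across serial sections."""
            total = max(0, int(random.gauss(mean_follicles, bio_cv * mean_follicles)))
            counts = [0] * n_sections
            for _ in range(total):
                counts[random.randrange(n_sections)] += 1
            return counts

        def fractionator_estimate(counts, period, start):
            """Count every `period`-th section and scale up the sampled total."""
            return period * sum(counts[start::period])

        random.seed(1)
        ovary = simulated_ovary()
        truth = sum(ovary)
        estimates = [fractionator_estimate(ovary, 5, s) for s in range(5)]
        print("true count:", truth, " estimates:", estimates)

    Repeating this over many virtual animals shows how sampling frequency and biological variability together set the accuracy of the count, which is the planning question the paper addresses.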

  2. Computer-Based Simulation Games in Public Administration Education

    OpenAIRE

    Kutergina Evgeniia

    2017-01-01

    Computer simulation, an active learning technique, is now one of the advanced pedagogical technologies. The use of simulation games in the educational process allows students to gain a firsthand understanding of the processes of real life. Public-administration, public-policy and political-science courses increasingly adopt simulation games in universities worldwide. Besides person-to-person simulation games, there are computer-based simulations in public-administration education. Currently...

  3. Inversion based on computational simulations

    International Nuclear Information System (INIS)

    Hanson, K.M.; Cunningham, G.S.; Saquib, S.S.

    1998-01-01

    A standard approach to solving inversion problems that involve many parameters uses gradient-based optimization to find the parameters that best match the data. The authors discuss enabling techniques that facilitate application of this approach to large-scale computational simulations, which are the only way to investigate many complex physical phenomena. Such simulations may not seem to lend themselves to calculation of the gradient with respect to numerous parameters. However, adjoint differentiation allows one to efficiently compute the gradient of an objective function with respect to all the variables of a simulation. When combined with advanced gradient-based optimization algorithms, adjoint differentiation permits one to solve very large problems of optimization or parameter estimation. These techniques will be illustrated through the simulation of the time-dependent diffusion of infrared light through tissue, which has been used to perform optical tomography. The techniques discussed have a wide range of applicability to modeling including the optimization of models to achieve a desired design goal
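
    The key point - adjoint differentiation returns the gradient with respect to all simulation parameters for roughly the cost of one extra backward sweep - can be demonstrated on a toy time-stepping simulation. The sketch below (an illustration, not the authors' diffusion code) fits a distributed source term p in a linear update u_{t+1} = A u_t + p and checks the adjoint gradient against a finite difference:

        import numpy as np

        rng = np.random.default_rng(0)
        n, T = 8, 20
        # Simple diffusion-like update matrix on a periodic 1D grid.
        A = 0.8 * np.eye(n) + 0.1 * np.roll(np.eye(n), 1, 0) + 0.1 * np.roll(np.eye(n), -1, 0)
        d = rng.normal(size=n)                  # synthetic "measured" data

        def forward(p):
            u = np.zeros(n)
            for _ in range(T):
                u = A @ u + p                   # p acts as a distributed source term
            return u

        def objective(p):
            r = forward(p) - d
            return 0.5 * r @ r

        def adjoint_gradient(p):
            """One forward plus one backward sweep gives dJ/dp for all n
            parameters at once, regardless of how large n is."""
            lam = forward(p) - d                # adjoint seed at the final time
            grad = np.zeros(n)
            for _ in range(T):
                grad += lam                     # du_{t+1}/dp is the identity
                lam = A.T @ lam                 # propagate the adjoint backward
            return grad

        p = rng.normal(size=n)
        g = adjoint_gradient(p)
        eps, e0 = 1e-6, np.eye(n)[0]            # finite-difference check, one component
        fd = (objective(p + eps * e0) - objective(p - eps * e0)) / (2 * eps)
        print("adjoint:", g[0], " finite difference:", fd)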

  4. Computer-Based Simulation Games in Public Administration Education

    Directory of Open Access Journals (Sweden)

    Kutergina Evgeniia

    2017-12-01

    Full Text Available Computer simulation, an active learning technique, is now one of the advanced pedagogical technologies. The use of simulation games in the educational process allows students to gain a firsthand understanding of the processes of real life. Public-administration, public-policy and political-science courses increasingly adopt simulation games in universities worldwide. Besides person-to-person simulation games, there are computer-based simulations in public-administration education. Currently in Russia the use of computer-based simulation games in Master of Public Administration (MPA) curricula is quite limited. This paper focuses on computer-based simulation games for students of MPA programmes. Our aim was to analyze outcomes of implementing such games in MPA curricula. We have done so by (1) developing three computer-based simulation games about allocating public finances, (2) testing the games in the learning process, and (3) conducting a posttest examination to evaluate the effect of simulation games on students' knowledge of municipal finances. This study was conducted in the National Research University Higher School of Economics (HSE) and in the Russian Presidential Academy of National Economy and Public Administration (RANEPA) during the period of September to December 2015, in Saint Petersburg, Russia. Two groups of students were randomly selected in each university and then randomly allocated either to the experimental or the control group. In control groups (n=12 in HSE, n=13 in RANEPA) students had traditional lectures. In experimental groups (n=12 in HSE, n=13 in RANEPA) students played three simulation games apart from traditional lectures. This exploratory research shows that the use of computer-based simulation games in MPA curricula can improve students' outcomes by 38%. In general, the experimental groups had better performances on the post-test examination (Figure 2). Students in the HSE experimental group had 27.5% better

  5. COMPUTER SIMULATION OF THE MECHANICAL MOVEMENT OF A BODY BY MEANS OF MATHCAD

    Directory of Open Access Journals (Sweden)

    Leonid Flehantov

    2017-03-01

    Full Text Available This paper considers the technique of using the computer mathematics system MathCAD for the computer implementation of a mathematical model of the mechanical motion of a physical body thrown at an angle to the horizon, and its use for educational computer simulation experiments in teaching the fundamentals of mathematical modeling. The advantages of MathCAD as an environment for implementing mathematical models in the second stage of higher education are noted. The paper describes the creation of a computer simulation model that allows a comprehensive analysis of the mechanical movement of the body as the input parameters of the model are changed: the acceleration of gravity, the initial and final position of the body, the initial velocity and angle, and the geometric dimensions of the body and target. The technique is aimed at the effective assimilation of basic knowledge and skills of students on the basics of mathematical modeling; it provides an opportunity to better master the basic theoretical principles of mathematical modeling and related disciplines, promotes the development of students' logical thinking, motivates them to learn the discipline, improves cognitive interest, and forms research skills, thus creating conditions for the effective formation of the professional competence of future specialists.
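
    The model described here is the standard point-mass projectile problem, so a compact analogue is easy to give outside MathCAD. The following Python sketch (illustrative; drag and body geometry are omitted) steps Newton's equations for a body thrown at an angle to the horizon and reports the range, which can be checked against the analytic value v0^2 sin(2a)/g:

        import math

        def trajectory(v0, angle_deg, g=9.81, y0=0.0, dt=0.01):
            """Point-mass flight of a body thrown at an angle to the horizon
            (no air resistance; a Python analogue of the model described)."""
            angle = math.radians(angle_deg)
            vx, vy = v0 * math.cos(angle), v0 * math.sin(angle)
            x, y, t, path = 0.0, y0, 0.0, []
            while y >= 0.0:
                path.append((t, x, y))
                x, y, vy, t = x + vx * dt, y + vy * dt, vy - g * dt, t + dt
            return path

        path = trajectory(v0=20.0, angle_deg=45.0)
        t, x, y = path[-1]
        print(f"range ~ {x:.1f} m after {t:.2f} s")  # analytic: ~40.8 m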

  6. The Australian Computational Earth Systems Simulator

    Science.gov (United States)

    Mora, P.; Muhlhaus, H.; Lister, G.; Dyskin, A.; Place, D.; Appelbe, B.; Nimmervoll, N.; Abramson, D.

    2001-12-01

    Numerical simulation of the physics and dynamics of the entire earth system offers an outstanding opportunity for advancing earth system science and technology but represents a major challenge due to the range of scales and physical processes involved, as well as the magnitude of the software engineering effort required. However, new simulation and computer technologies are bringing this objective within reach. Under a special competitive national funding scheme to establish new Major National Research Facilities (MNRF), the Australian government together with a consortium of Universities and research institutions have funded construction of the Australian Computational Earth Systems Simulator (ACcESS). The Simulator or computational virtual earth will provide the research infrastructure to the Australian earth systems science community required for simulations of dynamical earth processes at scales ranging from microscopic to global. It will consist of thematic supercomputer infrastructure and an earth systems simulation software system. The Simulator models and software will be constructed over a five year period by a multi-disciplinary team of computational scientists, mathematicians, earth scientists, civil engineers and software engineers. The construction team will integrate numerical simulation models (3D discrete elements/lattice solid model, particle-in-cell large deformation finite-element method, stress reconstruction models, multi-scale continuum models etc) with geophysical, geological and tectonic models, through advanced software engineering and visualization technologies. When fully constructed, the Simulator aims to provide the software and hardware infrastructure needed to model solid earth phenomena including global scale dynamics and mineralisation processes, crustal scale processes including plate tectonics, mountain building, interacting fault system dynamics, and micro-scale processes that control the geological, physical and dynamic

  7. The rheology of concentrated dispersions: structure changes and shear thickening in experiments and computer simulations

    NARCIS (Netherlands)

    Boersma, W.H.; Laven, J.; Stein, H.N.; Moldenaers, P.; Keunings, R.

    1992-01-01

    The flow-induced changes in the microstructure and rheology of very concentrated, shear-thickening dispersions are studied. Results obtained for polystyrene sphere dispersions are compared with previous data and computer simulations to give better insight into the processes occurring in the dispersions.

  8. Methods and computing challenges of the realistic simulation of physics events in the presence of pile-up in the ATLAS experiment

    CERN Document Server

    Chapman, J D; The ATLAS collaboration

    2014-01-01

    We are now in a regime where we observe substantial multiple proton-proton collisions within each filled LHC bunch-crossing and also multiple filled bunch-crossings within the sensitive time window of the ATLAS detector. This will increase with increased luminosity in the near future. Including these effects in Monte Carlo simulation poses significant computing challenges. We present a description of the standard approach used by the ATLAS experiment and details of how we manage the conflicting demands of keeping the background dataset size as small as possible while minimizing the effect of background event re-use. We also present details of the methods used to minimize the memory footprint of these digitization jobs, to keep them within the grid limit, despite combining the information from thousands of simulated events at once. We also describe an alternative approach, known as Overlay. Here, the actual detector conditions are sampled from raw data using a special zero-bias trigger, and the simulated physi...

  9. Rationalization of foundry processes on the basis of simulation experiment

    Directory of Open Access Journals (Sweden)

    S. Kukla

    2008-10-01

    Full Text Available The paper presents results of research obtained on the basis of a simulation experiment whose aim was to analyze the performance of a cast iron foundry. A simulation model of an automobile industry foundry was made. The course of the following processes was analyzed in a computer model: preparation of liquid cast iron, forming and filling the moulds, cooling and stamping the castings, cleaning and finishing treatment. Sheets of multi-criterion evaluation were prepared, where criteria and variants were assessed by means of subjective point evaluation and fuzzy evaluation. The paper presents an example analysis of casting finishing activities realized in the foundry on traditional machines, on efficient presses, and in cooperation. On the basis of reports from the simulation experiment, information was obtained on activity durations, the load of available resources, storage and transport problems, bottlenecks in the system, and queues appearing in front of workplaces. The research used a universal modelling and simulation package for production systems - ARENA.

  10. Extracting Synthetic Multi-Cluster Platform Configurations from Grid'5000 for Driving Simulation Experiments

    OpenAIRE

    Suter , Frédéric; Casanova , Henri

    2007-01-01

    This report presents a collection of synthetic but realistic distributed computing platform configurations. These configurations are intended for simulation experiments in the study of parallel applications on multi-cluster platforms.

  11. Automatic temperature computation for realistic IR simulation

    Science.gov (United States)

    Le Goff, Alain; Kersaudy, Philippe; Latger, Jean; Cathala, Thierry; Stolte, Nilo; Barillot, Philippe

    2000-07-01

    Polygon temperature computation in 3D virtual scenes is fundamental for IR image simulation. This article describes in detail the temperature calculation software and its current extensions, briefly presented in [1]. This software, called MURET, is used by the simulation workshop CHORALE of the French DGA. MURET is a one-dimensional thermal software, which accurately takes into account the material thermal attributes of the three-dimensional scene and the variation of the environment characteristics (atmosphere) as a function of time. Concerning the environment, absorbed incident fluxes are computed wavelength by wavelength, every half hour, during the 24 hours preceding the time of the simulation. For each polygon, incident fluxes are composed of direct solar fluxes and sky illumination (including diffuse solar fluxes). Concerning the materials, classical thermal attributes are associated with several layers, such as conductivity, absorption, spectral emissivity, density, specific heat, thickness, and convection coefficients. In the future, MURET will be able to simulate permeable natural materials (water influence) and vegetation (woods). This model of thermal attributes yields a very accurate polygon temperature computation for the complex 3D databases often found in CHORALE simulations. The kernel of MURET consists of an efficient ray tracer, which computes the history (over 24 hours) of the shadowed parts of the 3D scene, and a library responsible for the thermal computations. The great originality concerns the way the heating fluxes are computed. Using ray tracing, the flux received at each 3D point of the scene accurately takes into account the masking (hidden surfaces) between objects. In this way, the library also supplies other thermal modules, such as a thermal shadow computation tool.
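
    The core of such a one-dimensional thermal model can be sketched compactly: a single material layer discretized in depth, driven by a diurnal absorbed solar flux and surface convection, marched explicitly over a 24-hour history. All property values below are illustrative placeholders, not taken from MURET:

        import math

        k_c, rho, cp = 1.0, 2000.0, 800.0      # conductivity, density, specific heat
        h, absorb = 10.0, 0.7                  # convection coeff., solar absorptivity
        T_air, thick, nz = 293.0, 0.2, 20      # air temp (K), layer thickness (m), nodes
        dz = thick / nz
        alpha = k_c / (rho * cp)               # thermal diffusivity
        dt = 0.4 * dz * dz / alpha             # stable explicit time step
        T = [T_air] * nz                       # initial temperature profile

        def solar_flux(t_s):
            """Crude day/night cycle: positive flux for 12 h, zero at night."""
            return 800.0 * max(math.sin(2 * math.pi * t_s / 86400.0), 0.0)

        t = 0.0
        while t < 86400.0:                     # march over the 24 h history
            flux = absorb * solar_flux(t) + h * (T_air - T[0])
            Tn = T[:]
            Tn[0] = T[0] + dt * (flux / (rho * cp * dz) + alpha * (T[1] - T[0]) / dz**2)
            for i in range(1, nz - 1):
                Tn[i] = T[i] + dt * alpha * (T[i - 1] - 2 * T[i] + T[i + 1]) / dz**2
            Tn[-1] = Tn[-2]                    # insulated back face
            T, t = Tn, t + dt
        print(f"surface temperature after 24 h: {T[0]:.1f} K")

    MURET layers several such materials and feeds the resulting surface temperatures to the IR rendering; the sketch only shows the single-layer flux balance at the heart of that computation.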

  12. Computer simulation studies in fluid and calcium regulation and orthostatic intolerance

    Science.gov (United States)

    1985-01-01

    The systems analysis approach to physiological research uses mathematical models and computer simulation. Major areas of concern during prolonged space flight discussed include fluid and blood volume regulation; cardiovascular response during shuttle reentry; countermeasures for orthostatic intolerance; and calcium regulation and bone atrophy. Potential contributions of physiologic math models to future flight experiments are examined.

  13. Discrete Event Simulation Computers can be used to simulate the ...

    Indian Academy of Sciences (India)

    IAS Admin

    people who use computers every moment of their waking lives, others even ... How is discrete event simulation different from other kinds of simulation? ... time, energy consumption .... Schedule the CustomerDeparture event for this customer.
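
    The fragment "Schedule the CustomerDeparture event for this customer" is the heart of any discrete event simulation: a time-ordered event queue from which the earliest event is repeatedly popped and processed, possibly scheduling further events. A minimal single-server queue in Python (illustrative rates, not taken from the article):

        import heapq
        import random

        random.seed(0)
        ARRIVAL, DEPARTURE = 0, 1
        events = [(random.expovariate(1.0), ARRIVAL)]   # schedule the first arrival
        queue_len, busy, served, now = 0, False, 0, 0.0

        while served < 10000:
            now, kind = heapq.heappop(events)           # advance to the next event
            if kind == ARRIVAL:
                # Schedule the next arrival, then serve or enqueue this customer.
                heapq.heappush(events, (now + random.expovariate(1.0), ARRIVAL))
                if busy:
                    queue_len += 1
                else:
                    busy = True
                    heapq.heappush(events, (now + random.expovariate(1.25), DEPARTURE))
            else:                                       # a CustomerDeparture event
                served += 1
                if queue_len:                           # start serving the next in line
                    queue_len -= 1
                    heapq.heappush(events, (now + random.expovariate(1.25), DEPARTURE))
                else:
                    busy = False

        print(f"served {served} customers in simulated time {now:.1f}")

    Because time only advances from event to event, long idle stretches cost nothing to simulate - the property that distinguishes discrete event simulation from time-stepped simulation.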

  14. CET exSim: mineral exploration experience via simulation

    Science.gov (United States)

    Wong, Jason C.; Holden, Eun-Jung; Kovesi, Peter; McCuaig, T. Campbell; Hronsky, Jon

    2013-08-01

    Undercover mineral exploration is a challenging task as it requires understanding of subsurface geology by relying heavily on remotely sensed (i.e. geophysical) data. Cost-effective exploration is essential in order to increase the chance of success using finite budgets. This requires effective decision-making in both the process of selecting the optimum data collection methods and in the process of achieving accuracy during subsequent interpretation. Traditionally, developing the skills, behaviour and practices of exploration decision-making requires many years of experience through working on exploration projects under various geological settings, commodities and levels of available resources. This implies long periods of sub-optimal exploration decision-making, before the necessary experience has been successfully obtained. To address this critical industry issue, our ongoing research focuses on the development of the unique and novel e-learning environment, exSim, which simulates exploration scenarios where users can test their strategies and learn the consequences of their choices. This simulator provides an engaging platform for self-learning and experimentation in exploration decision strategies, providing a means to build experience more effectively. The exSim environment also provides a unique platform on which numerous scenarios and situations (e.g. deposit styles) can be simulated, potentially allowing the user to become virtually familiarised with a broader scope of exploration practices. Harnessing the power of computer simulation, visualisation and an intuitive graphical user interface, the simulator provides a way to assess the user's exploration decisions and subsequent interpretations. In this paper, we present the prototype functionalities in exSim including: simulation of geophysical surveys, follow-up drill testing and interpretation assistive tools.

  15. Simulation of sodium boiling experiments with THERMIT sodium version

    International Nuclear Information System (INIS)

    Huh, K.Y.

    1982-05-01

    Natural and forced convection experiments (SBTF and French) are simulated with the sodium version of the thermal-hydraulic computer code THERMIT. Simulation is done for the test section with the pressure-velocity boundary condition and subsequently extended to the whole loop. For the test section simulation, steady-state and transient calculations are performed and compared with experimental data. For the loop simulation, two methods are used: a simulated 1-D loop and an actual 1-D loop. In the simulated 1-D loop analysis, the vapor density is increased by factors of one hundred and two hundred to avoid code failure, and the results still show some of the important characteristics of two-phase flow oscillation in a loop. A mathematical model is suggested for the two-phase flow oscillation. In the actual 1-D loop, only the single-phase calculation was performed; it turned out to be nearly the same as the simulated 1-D loop single-phase results

  16. Overview of Computational Fluid Dynamics (CFD) simulation of stirred vessel

    International Nuclear Information System (INIS)

    Mohd Rizal Mamat; Azraf Azman; Anwar Abdul Rahman; Noraishah Othman

    2010-01-01

    The stirred vessel is one of the most widely used pieces of equipment in industrial processes and the chemical industry. The design of stirred vessels typically follows standard chemical engineering practice and may also involve empirical data acquired from experiments. However, the design may take a different route, namely computational engineering simulation and analysis. CFD has been identified as one of the possible tools for such purposes. CFD enables the flow field variables such as velocity, temperature and pressure to be obtained in the whole computational domain, and as such it presents an advantage over an experimental setup. (author)

  17. Cerebral methodology based computing to estimate real phenomena from large-scale nuclear simulation

    International Nuclear Information System (INIS)

    Suzuki, Yoshio

    2011-01-01

    Our final goal is to estimate real phenomena from large-scale nuclear simulations by using computing processes. Large-scale simulations here means simulations with such scale variety and physical complexity that corresponding experiments and/or theories do not exist. In the nuclear field, it is indispensable to estimate real phenomena from simulations in order to improve the safety and security of nuclear power plants. Here, the analysis of uncertainty included in simulations is needed to reveal the sensitivity of uncertainty due to randomness, to reduce the uncertainty due to lack of knowledge, and to arrive at a degree of certainty through verification and validation (V and V) and uncertainty quantification (UQ) processes. To realize this, we propose 'Cerebral Methodology based Computing (CMC)' as a set of computing processes with deductive and inductive approaches, by reference to human reasoning processes. Our idea is to execute deductive and inductive simulations corresponding to the deductive and inductive approaches. We have established a prototype system and applied it to a thermal displacement analysis of a nuclear power plant. The result shows that our idea is effective in reducing the uncertainty and in obtaining a degree of certainty. (author)

  18. Launch Site Computer Simulation and its Application to Processes

    Science.gov (United States)

    Sham, Michael D.

    1995-01-01

    This paper provides an overview of computer simulation, the Lockheed-developed STS Processing Model, and the application of computer simulation to a wide range of processes. The STS Processing Model is an icon-driven model that uses commercial off-the-shelf software and a Macintosh personal computer. While it usually takes one year to process and launch 8 space shuttles, with the STS Processing Model this process is computer simulated in about 5 minutes. Facilities, orbiters, or ground support equipment can be added or deleted and the impact on launch rate, facility utilization, or other factors measured as desired. This same computer simulation technology can be used to simulate manufacturing, engineering, commercial, or business processes. The technology does not require an 'army' of software engineers to develop and operate, but instead can be used by the layman with only a minimal amount of training. Instead of making changes to a process and realizing the results after the fact, with computer simulation changes can be made and processes perfected before they are implemented.

  19. ASAS: Computational code for Analysis and Simulation of Atomic Spectra

    Directory of Open Access Journals (Sweden)

    Jhonatha R. dos Santos

    2017-01-01

    Full Text Available The laser isotopic separation process is based on the selective photoionization principle and, because of this, it is necessary to know the absorption spectrum of the desired atom. Computational resources have become indispensable for the planning of experiments and the analysis of the acquired data. The ASAS (Analysis and Simulation of Atomic Spectra) software presented here is a helpful tool to be used in studies involving atomic spectroscopy. The input for the simulations is user-friendly and essentially needs a database containing the energy levels and spectral lines of the atoms to be studied.

  20. High performance simulation for the Silva project using the tera computer

    International Nuclear Information System (INIS)

    Bergeaud, V.; La Hargue, J.P.; Mougery, F.; Boulet, M.; Scheurer, B.; Le Fur, J.F.; Comte, M.; Benisti, D.; Lamare, J. de; Petit, A.

    2003-01-01

    In the context of the SILVA Project (Atomic Vapor Laser Isotope Separation), numerical simulation of the plant-scale propagation of laser beams through uranium vapour was a great challenge. The PRODIGE code has been developed to achieve this goal. Here we focus on the task of achieving high performance simulation on the TERA computer. We describe the main issues in optimizing the parallelization of the PRODIGE code on TERA, and discuss advantages and drawbacks of the implemented diagonal parallelization scheme. As a consequence, it has been found fruitful to tune the code in three respects: memory allocation, MPI communications, and interconnection network bandwidth usage. We stress the value of MPI/IO in this context and the benefit obtained for production computations on TERA. Finally, we illustrate our developments and indicate some performance measurements reflecting the good parallelization properties of PRODIGE on the TERA computer. The code is currently used for demonstrating the feasibility of laser propagation at a plant enrichment level and for preparing the 2003 Menphis experiment. We conclude by emphasizing the contribution of high performance TERA simulation to the project. (authors)

  1. High performance simulation for the Silva project using the tera computer

    Energy Technology Data Exchange (ETDEWEB)

    Bergeaud, V.; La Hargue, J.P.; Mougery, F. [CS Communication and Systemes, 92 - Clamart (France); Boulet, M.; Scheurer, B. [CEA Bruyeres-le-Chatel, 91 - Bruyeres-le-Chatel (France); Le Fur, J.F.; Comte, M.; Benisti, D.; Lamare, J. de; Petit, A. [CEA Saclay, 91 - Gif sur Yvette (France)

    2003-07-01

    In the context of the SILVA Project (Atomic Vapor Laser Isotope Separation), numerical simulation of the plant-scale propagation of laser beams through uranium vapour was a great challenge. The PRODIGE code has been developed to achieve this goal. Here we focus on the task of achieving high performance simulation on the TERA computer. We describe the main issues in optimizing the parallelization of the PRODIGE code on TERA, and discuss advantages and drawbacks of the implemented diagonal parallelization scheme. As a consequence, it has been found fruitful to tune the code in three respects: memory allocation, MPI communications, and interconnection network bandwidth usage. We stress the value of MPI/IO in this context and the benefit obtained for production computations on TERA. Finally, we illustrate our developments and indicate some performance measurements reflecting the good parallelization properties of PRODIGE on the TERA computer. The code is currently used for demonstrating the feasibility of laser propagation at a plant enrichment level and for preparing the 2003 Menphis experiment. We conclude by emphasizing the contribution of high performance TERA simulation to the project. (authors)

  2. Adaptive scapula bone remodeling computational simulation: Relevance to regenerative medicine

    International Nuclear Information System (INIS)

    Sharma, Gulshan B.; Robertson, Douglas D.

    2013-01-01

    Shoulder arthroplasty success has been attributed to many factors including, bone quality, soft tissue balancing, surgeon experience, and implant design. Improved long-term success is primarily limited by glenoid implant loosening. Prosthesis design examines materials and shape and determines whether the design should withstand a lifetime of use. Finite element (FE) analyses have been extensively used to study stresses and strains produced in implants and bone. However, these static analyses only measure a moment in time and not the adaptive response to the altered environment produced by the therapeutic intervention. Computational analyses that integrate remodeling rules predict how bone will respond over time. Recent work has shown that subject-specific two- and three dimensional adaptive bone remodeling models are feasible and valid. Feasibility and validation were achieved computationally, simulating bone remodeling using an intact human scapula, initially resetting the scapular bone material properties to be uniform, numerically simulating sequential loading, and comparing the bone remodeling simulation results to the actual scapula’s material properties. Three-dimensional scapula FE bone model was created using volumetric computed tomography images. Muscle and joint load and boundary conditions were applied based on values reported in the literature. Internal bone remodeling was based on element strain-energy density. Initially, all bone elements were assigned a homogeneous density. All loads were applied for 10 iterations. After every iteration, each bone element’s remodeling stimulus was compared to its corresponding reference stimulus and its material properties modified. The simulation achieved convergence. At the end of the simulation the predicted and actual specimen bone apparent density were plotted and compared. Location of high and low predicted bone density was comparable to the actual specimen. High predicted bone density was greater than
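
    The remodeling rule sketched in this abstract - compare each element's strain-energy-density stimulus with a reference value and adjust density accordingly - reduces to a few lines for a single element. The sketch below uses a fixed element stress and a generic density-modulus power law; all constants are illustrative, not those of the study:

        B, k_ref, dt = 1.0, 0.004, 1.0          # remodeling rate and reference stimulus
        rho_min, rho_max = 0.05, 1.8            # apparent density bounds

        def youngs_modulus(rho):
            # Generic density-modulus power law (illustrative constants).
            return 3790.0 * rho**3

        def remodel(rho, stress, iterations=500):
            """Drive element density toward the reference strain-energy stimulus."""
            for _ in range(iterations):
                U = stress**2 / (2.0 * youngs_modulus(rho))  # strain-energy density
                stimulus = U / rho                           # stimulus per unit mass
                rho += B * (stimulus - k_ref) * dt           # adapt toward reference
                rho = min(max(rho, rho_min), rho_max)        # physical bounds
            return rho

        for stress in (2.0, 5.0, 10.0):
            print(f"stress={stress:4.1f} -> converged density {remodel(1.0, stress):.3f}")

    More heavily loaded elements converge to higher density, which is the qualitative behavior the full finite element simulation reproduces across the whole scapula.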

  3. Estimating the Diffusion Coefficients of Sugars Using Diffusion Experiments in Agar-Gel and Computer Simulations.

    Science.gov (United States)

    Miyamoto, Shuichi; Atsuyama, Kenji; Ekino, Keisuke; Shin, Takashi

    2018-01-01

    The isolation of useful microbes is one of the traditional approaches for lead generation in drug discovery. As an effective technique for microbe isolation, we recently developed a multidimensional diffusion-based gradient culture system of microbes. In order to enhance the utility of the system, it is favorable to have the diffusion coefficients of nutrients such as sugars in the culture medium beforehand. We have, therefore, built a simple and convenient experimental system that uses agar-gel to observe diffusion. Next, we performed computer simulations, based on random-walk concepts, of the experimental diffusion system and derived correlation formulas that relate observable diffusion data to diffusion coefficients. Finally, we applied these correlation formulas to our experimentally determined diffusion data to estimate the diffusion coefficients of sugars. Our values for these coefficients agree reasonably well with values published in the literature. The effectiveness of our simple technique, which has elucidated the diffusion coefficients of some molecules that are rarely reported (e.g., galactose, trehalose, and glycerol), is demonstrated by the strong correspondence between the literature values and those obtained in our experiments.
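
    The estimation principle relied on here can be shown in a few lines: for one-dimensional Brownian motion the mean squared displacement grows as 2Dt, so simulating many random walkers and reading off the slope recovers the diffusion coefficient. The sketch below is a generic illustration (step length and time step are invented, not taken from the paper):

        import random

        def estimate_D(n_walkers=2000, n_steps=1000, step=1e-6, dt=1e-3):
            """Estimate D from the mean squared displacement of 1D walkers;
            for this walk the MSD grows as 2*D*t."""
            msd = 0.0
            for _ in range(n_walkers):
                pos = 0.0
                for _ in range(n_steps):
                    pos += step if random.random() < 0.5 else -step
                msd += pos * pos
            msd /= n_walkers
            return msd / (2.0 * n_steps * dt)

        random.seed(0)
        expected = (1e-6) ** 2 / (2 * 1e-3)   # analytic value: D = step**2 / (2*dt)
        print(f"estimated D = {estimate_D():.3e} m^2/s (expected {expected:.3e})")

    The paper's correlation formulas play the same role in reverse: given an observed spreading profile in the gel, they return the D that a walk with that profile must have had.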

  4. Highway traffic simulation on multi-processor computers

    Energy Technology Data Exchange (ETDEWEB)

    Hanebutte, U.R.; Doss, E.; Tentner, A.M.

    1997-04-01

    A computer model has been developed to simulate highway traffic for various degrees of automation with a high level of fidelity in regard to driver control and vehicle characteristics. The model simulates vehicle maneuvering in a multi-lane highway traffic system and allows for the use of Intelligent Transportation System (ITS) technologies such as an Automated Intelligent Cruise Control (AICC). The structure of the computer model facilitates the use of parallel computers for the highway traffic simulation, since domain decomposition techniques can be applied in a straightforward fashion. In this model, the highway system (i.e. a network of road links) is divided into multiple regions; each region is controlled by a separate link manager residing on an individual processor. A graphical user interface augments the computer model by allowing for real-time interactive simulation control and interaction with each individual vehicle and road side infrastructure element on each link. Average speed and traffic volume data are collected at user-specified loop detector locations. Further, as a measure of safety, the so-called Time To Collision (TTC) parameter is recorded.
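
    The vehicle maneuvering layer of such a model boils down to a car-following update: each vehicle adapts its speed to the headway in front of it, subject to acceleration limits. The sketch below is a single-lane, circular-road illustration of that idea (not the Argonne code; no lane changes, and all parameters are invented):

        import random

        random.seed(0)
        n, road, dt, steps = 30, 1500.0, 0.5, 2000   # vehicles, road length (m), s
        v_max, t_gap, a_max = 30.0, 1.5, 2.0         # speed cap, time gap, accel limit
        pos = sorted(random.uniform(0, road) for _ in range(n))
        vel = [10.0] * n

        for _ in range(steps):
            new_v = []
            for i in range(n):
                gap = (pos[(i + 1) % n] - pos[i]) % road      # headway to leader
                v_des = min(v_max, max(0.0, gap / t_gap))     # speed allowed by gap
                dv = max(-a_max * dt, min(a_max * dt, v_des - vel[i]))
                new_v.append(vel[i] + dv)                     # bounded acceleration
            vel = new_v
            pos = [(p + v * dt) % road for p, v in zip(pos, vel)]

        mean_speed = sum(vel) / n
        print(f"mean speed: {mean_speed:.1f} m/s, "
              f"flow ~ {mean_speed * n / road * 3600:.0f} veh/h")

    Domain decomposition then amounts to assigning contiguous road sections (links) to separate processors and exchanging the vehicles that cross section boundaries, which is the parallelization route the abstract describes.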

  5. Haptic feedback improves surgeons' user experience and fracture reduction in facial trauma simulation.

    Science.gov (United States)

    Girod, Sabine; Schvartzman, Sara C; Gaudilliere, Dyani; Salisbury, Kenneth; Silva, Rebeka

    2016-01-01

    Computer-assisted surgical (CAS) planning tools are available for craniofacial surgery, but are usually based on computer-aided design (CAD) tools that lack the ability to detect the collision of virtual objects (i.e., fractured bone segments). We developed a CAS system featuring a sense of touch (haptics) that enables surgeons to physically interact with individual, patient-specific anatomy and immerse themselves in a three-dimensional virtual environment. In this study, we evaluated initial user experience with our novel system compared to an existing CAD system. Ten surgery resident trainees received a brief verbal introduction to both the haptic and CAD systems. Users simulated mandibular fracture reduction in three clinical cases within a 15 min time limit for each system and completed a questionnaire to assess their subjective experience. We compared standard landmarks and linear and angular measurements between the simulated results and the actual surgical outcome and found that haptic simulation results were not significantly different from actual postoperative outcomes. In contrast, CAD results differed significantly from both the haptic simulation and actual postoperative results. In addition to enabling a more accurate fracture repair, the haptic system provided a better user experience than the CAD system in terms of intuitiveness and self-reported quality of repair.

  6. Computational physics simulation of classical and quantum systems

    CERN Document Server

    Scherer, Philipp O J

    2017-01-01

    This textbook presents basic numerical methods and applies them to a large variety of physical models in multiple computer experiments. Classical algorithms and more recent methods are explained. Partial differential equations are treated generally comparing important methods, and equations of motion are solved by a large number of simple as well as more sophisticated methods. Several modern algorithms for quantum wavepacket motion are compared. The first part of the book discusses the basic numerical methods, while the second part simulates classical and quantum systems. Simple but non-trivial examples from a broad range of physical topics offer readers insights into the numerical treatment but also the simulated problems. Rotational motion is studied in detail, as are simple quantum systems. A two-level system in an external field demonstrates elementary principles from quantum optics and simulation of a quantum bit. Principles of molecular dynamics are shown. Modern boundary element methods are presented ...

  7. Alternative energy technologies an introduction with computer simulations

    CERN Document Server

    Buxton, Gavin

    2014-01-01

    Introduction to Alternative Energy Sources; Global Warming; Pollution; Solar Cells; Wind Power; Biofuels; Hydrogen Production and Fuel Cells; Introduction to Computer Modeling; Brief History of Computer Simulations; Motivation and Applications of Computer Models; Using Spreadsheets for Simulations; Typing Equations into Spreadsheets; Functions Available in Spreadsheets; Random Numbers; Plotting Data; Macros and Scripts; Interpolation and Extrapolation; Numerical Integration and Diffe...

  8. Large-scale computing techniques for complex system simulations

    CERN Document Server

    Dubitzky, Werner; Schott, Bernard

    2012-01-01

    Complex systems modeling and simulation approaches are being adopted in a growing number of sectors, including finance, economics, biology, astronomy, and many more. Technologies ranging from distributed computing to specialized hardware are explored and developed to address the computational requirements arising in complex systems simulations. The aim of this book is to present a representative overview of contemporary large-scale computing technologies in the context of complex systems simulation applications. The intention is to identify new research directions in this field and

  9. Comparison of meaningful learning characteristics in simulated nursing practice after traditional versus computer-based simulation method: a qualitative videography study.

    Science.gov (United States)

    Poikela, Paula; Ruokamo, Heli; Teräs, Marianne

    2015-02-01

    Nursing educators must ensure that nursing students acquire the necessary competencies; finding the most purposeful teaching methods and encouraging learning through meaningful learning opportunities is necessary to meet this goal. We investigated student learning in a simulated nursing practice using videography. The purpose of this paper is to examine how two different teaching methods supported students' meaningful learning in a simulated nursing experience. The 6-hour study was divided into three parts: part I, general information; part II, training; and part III, simulated nursing practice. Part II was delivered by two different methods: a computer-based simulation and a lecture. The study was carried out in the simulated nursing practice of two universities of applied sciences in Northern Finland. The participants in parts I and II were 40 first-year nursing students; 12 student volunteers continued to part III. A qualitative analysis method was used. The data were collected using video recordings and analyzed by videography. The students who used the computer-based simulation program were more likely to exhibit meaningful learning themes than those who were first exposed to the lecture method. Educators should be encouraged to use computer-based simulation teaching in conjunction with other teaching methods to ensure that nursing students are able to receive the greatest educational benefits. Copyright © 2014 Elsevier Ltd. All rights reserved.

  10. Computer simulation of leadership, consensus decision making and collective behaviour in humans.

    Directory of Open Access Journals (Sweden)

    Song Wu

    Full Text Available The aim of this study is to evaluate the reliability of a crowd simulation model developed by the authors by reproducing Dyer et al.'s experiments (published in Philosophical Transactions in 2009) on human leadership and consensus decision making in a computer-based environment. The theoretical crowd model of the simulation environment is presented, and its results are compared and analysed against Dyer et al.'s original experiments. It is concluded that the simulation results are largely consistent with the experiments, which demonstrates the reliability of the crowd model. Furthermore, the simulation data also reveal several additional new findings, namely: (1) the phenomenon of sacrificing accuracy to reach a quicker consensus decision, found in ant colonies, was also observed in the simulation; (2) the ability to reach consensus in groups has a direct impact on the time and accuracy of arriving at the target position; (3) the positions of the informed individuals or leaders in the crowd can have a significant impact on the overall crowd movement; and (4) the simulation also confirmed Dyer et al.'s anecdotal evidence on the proportion of leadership in large crowds and its effect on crowd movement. The potential applications of these findings are highlighted in the final discussion of this paper.

  11. Simulator experiments: effects of NPP operator experience on performance

    International Nuclear Information System (INIS)

    Beare, A.N.; Gray, L.H.

    1985-01-01

    Experiments are being conducted on nuclear power plant (NPP) control room training simulators by the Oak Ridge National Laboratory, its subcontractor, General Physics Corporation, and participating utilities. The experiments are sponsored by the Nuclear Regulatory Commission's (NRC) Human Factors and Safeguards Branch, Division of Risk Analysis and Operations, and are a continuation of prior research using simulators, supported by field data collection, to provide a technical basis for NRC human factors regulatory issues concerned with the operational safety of nuclear power plants. During the FY83 research, a simulator experiment was conducted at the control room simulator for a GE boiling water reactor (BWR) NPP. The research subjects were licensed operators undergoing requalification training and shift technical advisors (STAs). This experiment was designed to investigate the effects of (a) senior reactor operator (SRO) experience, (b) operating crew augmentation with an STA and (c) practice, as a crew, upon crew and individual operator performance, in response to anticipated plant transients. The FY84 experiments are a partial replication and extension of the FY83 experiment, but with PWR operators and simulator. Methodology and results to date are reported

  12. Development of computational science in JAEA. R and D of simulation

    International Nuclear Information System (INIS)

    Nakajima, Norihiro; Araya, Fumimasa; Hirayama, Toshio

    2006-01-01

    R and D of computational science at JAEA (Japan Atomic Energy Agency) is described. The computing environment, the R and D system in CCSE (Center for Computational Science and e-Systems), joint computational science research in Japan and worldwide, the development of computer technologies, some examples of simulation research, a 3-dimensional image vibrational platform system, simulation research on FBR cycle techniques, simulation of large-scale thermal stress for the development of a steam generator, simulation research on fusion energy techniques, the development of grid computing technology, simulation research on quantum beam techniques, and biological molecule simulation research are explained. The organization of JAEA, the development of computational science in JAEA, the JAEA network, international collaboration in computational science, and the environment of the ITBL (Information-Technology Based Laboratory) project are illustrated. (S.Y.)

  13. Computer simulations of liquid crystals: Defects, deformations and dynamics

    Science.gov (United States)

    Billeter, Jeffrey Lee

    1999-11-01

    Computer simulations play an increasingly important role in investigating fundamental issues in the physics of liquid crystals. Presented here are the results of three projects which utilize the unique power of simulations to probe questions which neither theory nor experiment can adequately answer. Throughout, we use the (generalized) Gay-Berne model, a widely-used phenomenological potential which captures the essential features of the anisotropic mesogen shapes and interactions. First, we used a Molecular Dynamics simulation with 65536 Gay-Berne particles to study the behaviors of topological defects in a quench from the isotropic to the nematic phase. Twist disclination loops were the dominant defects, and we saw evidence for dynamical scaling. We observed the loops separating, combining and collapsing, and we also observed numerous non-singular type-1 lines which appeared to be intimately involved with many of the loop processes. Second, we used a Molecular Dynamics simulation of a sphere embedded in a system of 2048 Gay-Berne particles to study the effects of radial anchoring of the molecules at the sphere's surface. A Saturn ring defect configuration was observed, and the ring caused a driven sphere (modelling the falling ball experiment) to experience an increased resistance as it moved through the nematic. Deviations from a linear relationship between the driving force and the terminal speed are attributed to the observed distortions of the Saturn ring. The existence of the Saturn ring confirms theoretical predictions for small spheres. Finally, we constructed a model for wedge-shaped molecules and used a linear response approach in a Monte Carlo simulation to investigate the flexoelectric behavior of a system of 256 such wedges. Novel potential models as well as novel analytical and visualization techniques were developed for these projects. Once again, the emphasis throughout was to investigate questions which simulations alone can adequately answer.

  14. Three Dimensional Computer Graphics Federates for the 2012 Smackdown Simulation

    Science.gov (United States)

    Fordyce, Crystal; Govindaiah, Swetha; Muratet, Sean; O'Neil, Daniel A.; Schricker, Bradley C.

    2012-01-01

    The Simulation Interoperability Standards Organization (SISO) Smackdown is a two-year-old annual event held at the 2012 Spring Simulation Interoperability Workshop (SIW). A primary objective of the Smackdown event is to provide college students with hands-on experience in developing distributed simulations using the High Level Architecture (HLA). Participating for the second time, the University of Alabama in Huntsville (UAHuntsville) deployed four federates: two simulated a communications server and a lunar communications satellite with a radio, and the other two generated 3D computer graphics displays for the communication satellite constellation and for the surface-based lunar resupply mission. Using the Lightweight Java Game Library (LWJGL), the satellite display federate presented a lunar-texture-mapped sphere of the Moon and four Telemetry Data Relay Satellites (TDRS), which received object attributes from the lunar communications satellite federate to drive their motion. The surface mission display federate was an enhanced version of the federate developed by ForwardSim, Inc. for the 2011 Smackdown simulation. Enhancements included a dead-reckoning algorithm and a visual indication of which communication satellite was in line of sight of Hadley Rille. This paper concentrates on these two federates by describing their functions, algorithms, the HLA object attributes received from other federates, development experiences, and recommendations for future participating Smackdown teams.
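
    The paper does not spell out the dead-reckoning variant used; the standard first-order form in distributed simulation extrapolates a remote entity's position from its last reported state between attribute updates. A minimal sketch (illustrative values):

    ```python
    def dead_reckon(p0, v0, t0, t):
        """First-order dead reckoning: extrapolate position from the last
        reported position p0 and velocity v0 (per axis) at report time t0."""
        dt = t - t0
        return tuple(p + v * dt for p, v in zip(p0, v0))

    # Between HLA attribute updates, a display federate draws the extrapolated state:
    print(dead_reckon((100.0, 0.0, 0.0), (1.5, 0.0, -0.2), t0=10.0, t=12.5))
    ```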

  15. Polymer Composites Corrosive Degradation: A Computational Simulation

    Science.gov (United States)

    Chamis, Christos C.; Minnetyan, Levon

    2007-01-01

    A computational simulation of the corrosive durability of polymer composites is presented. The corrosive environment is assumed to drive polymer composite degradation on a ply-by-ply basis. The degradation is correlated with a measured pH factor and is represented by voids, temperature, and moisture, which vary parabolically (voids) and linearly (temperature and moisture) through the laminate thickness. The simulation is performed by a computational composite mechanics computer code which includes micromechanics, macromechanics, combined-stress failure, and laminate theories. This makes it possible to start the simulation from constitutive material properties and proceed up to the laminate scale, at which the laminate is exposed to the corrosive environment. Results obtained for one laminate indicate that the ply-by-ply degradation reduces the laminate to the last ply or the last several plies. Results also demonstrate that the simulation is applicable to other polymer composite systems as well.
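
    A toy rendering of the stated through-thickness parametrization (voids parabolic, temperature and moisture linear; all numbers invented, not the authors' code; moisture would follow the same linear form as temperature):

    ```python
    def ply_profiles(n_plies, void_surface, void_mid, t_top, t_bottom):
        """Per-ply degradation drivers on a normalized depth z in [-1, 1]:
        voids parabolic (void_mid at z=0, void_surface at z=+/-1),
        temperature linear from t_bottom (z=-1) to t_top (z=+1)."""
        out = []
        for k in range(n_plies):
            z = -1.0 + 2.0 * (k + 0.5) / n_plies        # ply mid-plane depth
            voids = void_mid + (void_surface - void_mid) * z * z
            temp = 0.5 * (t_top + t_bottom) + 0.5 * (t_top - t_bottom) * z
            out.append((k, voids, temp))
        return out

    for ply in ply_profiles(8, void_surface=0.05, void_mid=0.01, t_top=70.0, t_bottom=20.0):
        print("ply %d: voids=%.3f, T=%.1f" % ply)
    ```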

  16. Design of preconcentration flow-sheet for processing Bhimunipatnam beach sands using pilot plant experiments and computer simulation

    International Nuclear Information System (INIS)

    Padmanabhan, N.P.H.; Sridhar, U.

    1993-01-01

    Simulation was carried out using SANDBEN, a beach sand beneficiation plant simulator currently being developed at the Indian School of Mines, Dhanbad, and the results were compared and analyzed against those obtained in actual pilot plant experiments on a beach sand sample from the Bhimunipatnam deposit. The software is discussed and its capabilities and limitations are highlighted. An optimal preconcentration flow-sheet for processing Bhimunipatnam beach sand was developed by simulation together with the results of the pilot plant experiments. (author). 13 refs., 2 tabs., 3 figs

  17. Simulation Experiment Description Markup Language (SED-ML) Level 1 Version 2

    Directory of Open Access Journals (Sweden)

    Bergmann Frank T.

    2015-06-01

    Full Text Available The number, size and complexity of computational models of biological systems are growing at an ever increasing pace. It is imperative to build on existing studies by reusing and adapting existing models and parts thereof. The description of the structure of models is not sufficient to enable the reproduction of simulation results. One also needs to describe the procedures the models are subjected to, as recommended by the Minimum Information About a Simulation Experiment (MIASE) guidelines.

  18. Simulation of COMEDIE Fission Product Plateout Experiment Using GAMMA-FP

    International Nuclear Information System (INIS)

    Tak, Nam-il; Yoon, Churl

    2014-01-01

    This phenomenon is particularly important for a VHTR design with vented low-pressure confinement (VLPC), because the vent allows the prompt release of fission products accumulated within the primary circuit to the environment during the initial blow-down phase after pipe break accidents. In order to analyze fission product plateout, a numerical model was developed by Yoo et al. and incorporated into the GAMMA-FP code. The GAMMA-FP model was validated against two sets of experimental data, i.e., VAMPYR-1 and OGL, during the development phase. One of the well-known experiments on fission product plateout is the COMEDIE experiment. In this work, the COMEDIE experiment was simulated using the GAMMA-FP code to investigate the reliability and applicability of the plateout model of GAMMA-FP. A good agreement was achieved between the measured and predicted plateout activities. The existing solution scheme was modified to allow a larger time step size for fission product analysis in order to reduce the computational time. Nevertheless, modification of the existing numerical model of GAMMA-FP is necessary when simulation of a long plateout period (e.g., 60 years) is targeted.

  19. An integrated computational tool for precipitation simulation

    Science.gov (United States)

    Cao, W.; Zhang, F.; Chen, S.-L.; Zhang, C.; Chang, Y. A.

    2011-07-01

    Computer aided materials design is of increasing interest because the conventional approach solely relying on experimentation is no longer viable within the constraint of available resources. Modeling of microstructure and mechanical properties during precipitation plays a critical role in understanding the behavior of materials and thus accelerating the development of materials. Nevertheless, an integrated computational tool coupling reliable thermodynamic calculation, kinetic simulation, and property prediction of multi-component systems for industrial applications is rarely available. In this regard, we are developing a software package, PanPrecipitation, under the framework of integrated computational materials engineering to simulate precipitation kinetics. It is seamlessly integrated with the thermodynamic calculation engine, PanEngine, to obtain accurate thermodynamic properties and atomic mobility data necessary for precipitation simulation.

  20. Simulation of the Phebus FPT1 experiment

    International Nuclear Information System (INIS)

    Amador G, R.; Nunez C, A.; Angel M, E. Del

    2003-01-01

    The present work describes the model of the Phebus installation developed and used by the National Commission of Nuclear Security and Safeguards (CNSNS) for its participation in the International Standard Problem ISP-46, organized by the Nuclear Energy Agency (NEA). The exercise consisted of simulating the Phebus FPT1 experiment, carried out in the Phebus experimental installation at the Institut de Protection et de Surete Nucleaire in France. The Phebus FPT1 experiment was intended to evaluate the capacity of different computer codes to model, in integral form, the physical processes that take place during a severe accident in a pressurized water reactor (PWR): from the degradation of the core to the late phase with the formation of a pool of molten material, hydrogen production, release and transport of fission products, phenomena in the containment, and iodine chemistry. The CNSNS used version bi of the SCDAPSIM code, developed by Innovative Software Systems, to simulate International Standard Problem 46. The results obtained showed that the code is able to predict the thermohydraulic part of the experiment; however, the same does not hold for the parameters related to fuel melting. (Author)

  1. Enabling the ATLAS Experiment at the LHC for High Performance Computing

    CERN Document Server

    AUTHOR|(CDS)2091107; Ereditato, Antonio

    In this thesis, I studied the feasibility of running computer data analysis programs from the Worldwide LHC Computing Grid, in particular large-scale simulations of the ATLAS experiment at the CERN LHC, on current general purpose High Performance Computing (HPC) systems. An approach for integrating HPC systems into the Grid is proposed, which has been implemented and tested on the "Todi" HPC machine at the Swiss National Supercomputing Centre (CSCS). Over the course of the test, more than 500,000 CPU-hours of processing time have been provided to ATLAS, which is roughly equivalent to the combined computing power of the two ATLAS clusters at the University of Bern. This showed that current HPC systems can be used to efficiently run large-scale simulations of the ATLAS detector and of the detected physics processes. As a first conclusion of my work, one can argue that, in perspective, running large-scale tasks on a few large machines might be more cost-effective than running on relatively small dedicated com...

  2. Fluid simulation for computer graphics

    CERN Document Server

    Bridson, Robert

    2008-01-01

    Animating fluids like water, smoke, and fire using physics-based simulation is increasingly important in visual effects, in particular in movies, like The Day After Tomorrow, and in computer games. This book provides a practical introduction to fluid simulation for graphics. The focus is on animating fully three-dimensional incompressible flow, from understanding the math and the algorithms to the actual implementation.

  3. Large-scale simulations of error-prone quantum computation devices

    International Nuclear Information System (INIS)

    Trieu, Doan Binh

    2009-01-01

    The theoretical concepts of quantum computation in the idealized and undisturbed case are well understood. However, in practice, all quantum computation devices do suffer from decoherence effects as well as from operational imprecisions. This work assesses the power of error-prone quantum computation devices using large-scale numerical simulations on parallel supercomputers. We present the Juelich Massively Parallel Ideal Quantum Computer Simulator (JUMPIQCS), that simulates a generic quantum computer on gate level. It comprises an error model for decoherence and operational errors. The robustness of various algorithms in the presence of noise has been analyzed. The simulation results show that for large system sizes and long computations it is imperative to actively correct errors by means of quantum error correction. We implemented the 5-, 7-, and 9-qubit quantum error correction codes. Our simulations confirm that using error-prone correction circuits with non-fault-tolerant quantum error correction will always fail, because more errors are introduced than being corrected. Fault-tolerant methods can overcome this problem, provided that the single qubit error rate is below a certain threshold. We incorporated fault-tolerant quantum error correction techniques into JUMPIQCS using Steane's 7-qubit code and determined this threshold numerically. Using the depolarizing channel as the source of decoherence, we find a threshold error rate of (5.2±0.2) × 10^-6. For Gaussian distributed operational over-rotations the threshold lies at a standard deviation of 0.0431±0.0002. We can conclude that quantum error correction is especially well suited for the correction of operational imprecisions and systematic over-rotations. For realistic simulations of specific quantum computation devices we need to extend the generic model to dynamic simulations, i.e. time-dependent Hamiltonian simulations of realistic hardware models. We focus on today's most advanced technology, i
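
    The depolarizing channel named here has a standard Monte Carlo form: with probability p a qubit suffers a uniformly random Pauli error. A minimal sketch (not JUMPIQCS itself) that estimates how often at least one physical error hits a 7-qubit Steane-code block per step:

    ```python
    import random

    def depolarize(frame, qubit, p):
        """Depolarizing channel: with probability p, apply a uniformly random
        Pauli error (X, Y or Z, each with probability p/3) to the qubit."""
        if random.random() < p:
            frame[qubit] = random.choice("XYZ")

    p, trials, hits = 1e-3, 100_000, 0           # error rate chosen arbitrarily
    for _ in range(trials):
        frame = ["I"] * 7                        # error frame of one code block
        for q in range(7):
            depolarize(frame, q, p)
        hits += any(e != "I" for e in frame)
    print(hits / trials)                         # close to 1 - (1-p)**7 ~ 7e-3
    ```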

  4. Computer simulation for synchrotron radiation based X-ray fluorescent microtomography

    International Nuclear Information System (INIS)

    Deng Biao; Yu Xiaohan; Xu Hongjie

    2007-01-01

    Synchrotron radiation based fluorescent microtomography (SR-XFMT) is a nondestructive technique for detecting elemental composition and distribution inside a specimen with high spatial resolution and sensitivity, and will be an optional experimental technique at the SSRF hard X-ray micro-focusing beamline now under construction. In this paper, the principles and development of SR-XFMT are briefly introduced. A computer simulation of an SR-XFMT experiment is performed. The image of the simulated sample is reconstructed using Filtered Back Projection (FBP), Algebraic Reconstruction Techniques (ART), and modified FBP with absorption correction. The qualities of the reconstructed images are analyzed and compared, and the validity of these reconstruction techniques is discussed. (authors)
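
    Of the reconstruction techniques named, ART has a particularly compact core: the Kaczmarz update projects the current estimate onto each ray-sum hyperplane in turn. A minimal sketch (illustrative, not the authors' implementation):

    ```python
    import numpy as np

    def art_reconstruct(A, b, n_sweeps=200, relax=1.0):
        """Algebraic Reconstruction Technique (Kaczmarz): cycle over the ray
        equations a_i . x = b_i, projecting x onto each hyperplane in turn."""
        x = np.zeros(A.shape[1])
        for _ in range(n_sweeps):
            for a_i, b_i in zip(A, b):
                norm2 = a_i @ a_i
                if norm2 > 0.0:
                    x += relax * (b_i - a_i @ x) / norm2 * a_i
        return x

    # Toy 2x2 image probed by two row sums, two column sums and one diagonal:
    A = np.array([[1, 1, 0, 0],
                  [0, 0, 1, 1],
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [1, 0, 0, 1]], dtype=float)
    true_image = np.array([1.0, 0.0, 0.0, 2.0])
    print(art_reconstruct(A, A @ true_image).round(2))   # recovers the image
    ```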

  5. Parallelized computation for computer simulation of electrocardiograms using personal computers with multi-core CPU and general-purpose GPU.

    Science.gov (United States)

    Shen, Wenfeng; Wei, Daming; Xu, Weimin; Zhu, Xin; Yuan, Shizhong

    2010-10-01

    Biological computations like electrocardiological modelling and simulation usually require high-performance computing environments. This paper introduces an implementation of parallel computation for computer simulation of electrocardiograms (ECGs) in a personal computer environment with an Intel Core(TM) 2 Quad Q6600 CPU and a GeForce 8800GT GPU, with software support by OpenMP and CUDA. It was tested in three parallelization device setups: (a) a four-core CPU without a general-purpose GPU, (b) a general-purpose GPU plus 1 core of CPU, and (c) a four-core CPU plus a general-purpose GPU. To effectively take advantage of a multi-core CPU and a general-purpose GPU, an algorithm based on load-prediction dynamic scheduling was developed and applied to setup (c). In the simulation with 1600 time steps, the speedup of the parallel computation as compared to the serial computation was 3.9 in setup (a), 16.8 in setup (b), and 20.0 in setup (c). This study demonstrates that a current PC with a multi-core CPU and a general-purpose GPU provides a good environment for parallel computations in biological modelling and simulation studies. Copyright 2010 Elsevier Ireland Ltd. All rights reserved.
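
    The load-prediction idea is easy to sketch: split each time step's work so that devices with different measured throughputs finish at the same time. A toy version (not the paper's scheduler; the rates below just reuse the reported speedup figures as stand-in throughputs):

    ```python
    def predict_split(total_units, cpu_rate, gpu_rate):
        """Give each device a share of the step's work proportional to its
        measured throughput, so both are predicted to finish together."""
        gpu_units = round(total_units * gpu_rate / (cpu_rate + gpu_rate))
        return total_units - gpu_units, gpu_units

    # Rates would be re-estimated each step from the previous step's timings:
    cpu_units, gpu_units = predict_split(100_000, cpu_rate=3.9, gpu_rate=16.8)
    print(cpu_units, gpu_units)   # ~18841 for the CPU, ~81159 for the GPU
    ```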

  6. Sophistication of computational science and fundamental physics simulations

    International Nuclear Information System (INIS)

    Ishiguro, Seiji; Ito, Atsushi; Usami, Shunsuke; Ohtani, Hiroaki; Sakagami, Hitoshi; Toida, Mieko; Hasegawa, Hiroki; Horiuchi, Ritoku; Miura, Hideaki

    2016-01-01

    The numerical experimental reactor research project is composed of the following studies: (1) nuclear fusion simulation research focusing on specific physical phenomena in specific devices, (2) research on advanced simulation methods to increase predictability or expand the application range of simulation, (3) visualization as the foundation of simulation research, (4) research on advanced computational science such as parallel computing technology, and (5) research aiming at the elucidation of fundamental physical phenomena not limited to specific devices. Specifically, a wide range of research with medium- to long-term perspectives is being developed: (1) virtual reality visualization, (2) upgrading of computational science such as multilayer simulation methods, (3) kinetic behavior of plasma blobs, (4) extended MHD theory and simulation, (5) basic plasma processes such as particle acceleration due to wave-particle interaction, and (6) research related to laser plasma fusion. This paper reviews the following items: (1) simultaneous visualization in virtual reality space, (2) multilayer simulation of collisionless magnetic reconnection, (3) simulation of the microscopic dynamics of plasma coherent structures, (4) Hall MHD simulation of the LHD, (5) numerical analysis for the extension of MHD equilibrium and stability theory, (6) extended MHD simulation of 2D RT instability, (7) simulation of laser plasma, (8) simulation of shock waves and particle acceleration, and (9) a study on the simulation of homogeneous isotropic MHD turbulent flow. (A.O.)

  7. Computer Simulation Performed for Columbia Project Cooling System

    Science.gov (United States)

    Ahmad, Jasim

    2005-01-01

    This demo shows a high-fidelity simulation of the air flow in the main computer room housing the Columbia (10,024 Intel Itanium processors) system. The simulation assesses the performance of the cooling system, identifies deficiencies, and recommends modifications to eliminate them. It used two in-house software packages on NAS supercomputers: Chimera Grid Tools to generate a geometric model of the computer room, and the OVERFLOW-2 code for fluid and thermal simulation. This state-of-the-art technology can easily be extended to provide a general capability for air flow analyses in any modern computer room.

  8. Simulations of the silicon vertex tracker for the STAR experiment at RHIC

    Energy Technology Data Exchange (ETDEWEB)

    Odyniec, G.; Cebra, D.; Christie, W.; Naudet, C.; Schroeder, L.; Wilson, W. [Lawrence Berkeley Lab., CA (United States); Liko, D. [Institut fuer Hochenergiephysik, Vienna (Austria); Cramer, J.; Prindle, D.; Trainor, T. [Univ. of Washington, Seattle (United States); Braithwaite, W. [Univ. of Arkansas, Little Rock (United States)

    1991-12-31

    The first computer simulations to optimize the Silicon Vertex Tracker (SVT) designed for the STAR experiment at RHIC are presented. The physics goals and the expected complexity of the events at RHIC dictate the design of a tracking system for the STAR experiment. The proposed tracking system will consist of a silicon vertex tracker (SVT) to locate the primary interaction and secondary decay vertices and to improve the momentum resolution, and a time projection chamber (TPC), positioned inside a solenoidal magnet, for continuous tracking.

  9. A Review of Freely Available Quantum Computer Simulation Software

    OpenAIRE

    Brandhorst-Satzkorn, Johan

    2012-01-01

    A study has been made of a few different freely available Quantum Computer simulators. All the simulators tested are available online on their respective websites. A number of tests have been performed to compare the different simulators against each other. Some untested simulators of various programming languages are included to show the diversity of the quantum computer simulator applications. The conclusion of the review is that LibQuantum is the best of the simulators tested because of ea...

  10. Computer simulation of liquid crystals

    International Nuclear Information System (INIS)

    McBride, C.

    1999-01-01

    Molecular dynamics simulation performed on modern computer workstations provides a powerful tool for the investigation of the static and dynamic characteristics of liquid crystal phases. In this thesis molecular dynamics computer simulations have been performed for two model systems. Simulations of 4,4'-di-n-pentyl-bibicyclo[2.2.2]octane demonstrate the growth of a structurally ordered phase directly from an isotropic fluid. This is the first time that this has been achieved for an atomistic model. The results demonstrate a strong coupling between orientational ordering and molecular shape, but indicate that the coupling between molecular conformational changes and molecular reorientation is relatively weak. Simulations have also been performed for a hybrid Gay-Berne/Lennard-Jones model resulting in thermodynamically stable nematic and smectic phases. Frank elastic constants have been calculated for the nematic phase formed by the hybrid model through analysis of the fluctuations of the nematic director, giving results comparable with those found experimentally. Work presented in this thesis also describes the parameterization of the torsional potential of a fragment of a dimethyl siloxane polymer chain, disiloxane diol (HOMe2Si)2O, using ab initio quantum mechanical calculations. (author)

  11. Microdefects in an as-grown Czochralski silicon crystal studied by synchrotron radiation section topography with aid of computer simulation

    International Nuclear Information System (INIS)

    Iida, Satoshi; Aoki, Yoshirou; Okitsu, Kouhei; Sugita, Yoshimitsu; Kawata, Hiroshi; Abe, Takao

    1998-01-01

    Grown-in microdefects of a Czochralski (CZ) silicon crystal grown at a slow growth rate were studied by section topography using high energy synchrotron radiation. Images of the microdefects in the section topographs were analyzed quantitatively using computer simulation based on the Takagi-Taupin type dynamical diffraction theory of X-rays, and were reproduced successfully by the simulation when the microdefects were assumed to be spherical strain centers. Sizes and positions of the microdefects could be determined by detailed comparison between the experiments and the computer simulations. The validity of the computer simulation in an analysis of the section topographs is discussed. (author)

  12. Simulations of Laboratory Astrophysics Experiments using the CRASH code

    Science.gov (United States)

    Trantham, Matthew; Kuranz, Carolyn; Fein, Jeff; Wan, Willow; Young, Rachel; Keiter, Paul; Drake, R. Paul

    2015-11-01

    Computer simulations can assist in the design and analysis of laboratory astrophysics experiments. The Center for Radiative Shock Hydrodynamics (CRASH) at the University of Michigan developed a code that has been used to design and analyze high-energy-density experiments on OMEGA, NIF, and other large laser facilities. This Eulerian code uses block-adaptive mesh refinement (AMR) with implicit multigroup radiation transport, electron heat conduction and laser ray tracing. This poster will demonstrate some of the experiments the CRASH code has helped design or analyze, including: Kelvin-Helmholtz, Rayleigh-Taylor, magnetized flows, jets, and laser-produced plasmas. This work is funded by the following grants: DE-FC52-08NA28616, DE-NA0001840, and DE-NA0002032.

  13. Simulation tools for two-dimensional experiments in x-ray computed tomography using the FORBILD head phantom.

    Science.gov (United States)

    Yu, Zhicong; Noo, Frédéric; Dennerlein, Frank; Wunderlich, Adam; Lauritsch, Günter; Hornegger, Joachim

    2012-07-07

    Mathematical phantoms are essential for the development and early stage evaluation of image reconstruction algorithms in x-ray computed tomography (CT). This note offers tools for computer simulations using a two-dimensional (2D) phantom that models the central axial slice through the FORBILD head phantom. Introduced in 1999, in response to a need for a more robust test, the FORBILD head phantom is now seen by many as the gold standard. However, the simple Shepp-Logan phantom is still heavily used by researchers working on 2D image reconstruction. Universal acceptance of the FORBILD head phantom may have been prevented by its significantly higher complexity: software that allows computer simulations with the Shepp-Logan phantom is not readily applicable to the FORBILD head phantom. The tools offered here address this problem. They are designed for use with Matlab®, as well as open-source variants, such as FreeMat and Octave, which are all widely used in both academia and industry. To get started, the interested user can simply copy and paste the codes from this PDF document into Matlab® M-files.
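
    The distributed tools are Matlab M-files; purely to illustrate how such analytic phantoms work (superposed ellipses whose densities add where they overlap), a toy rasterizer in Python, with invented ellipse values far simpler than the FORBILD definition, might look like this:

    ```python
    import numpy as np

    def rasterize_ellipses(ellipses, n=256, fov=26.0):
        """Rasterize a 2D analytic phantom given as ellipses
        (cx, cy, a, b, angle_deg, density); densities are additive."""
        xs = np.linspace(-fov / 2, fov / 2, n)
        x, y = np.meshgrid(xs, xs)
        img = np.zeros((n, n))
        for cx, cy, a, b, ang, rho in ellipses:
            t = np.deg2rad(ang)
            xr = (x - cx) * np.cos(t) + (y - cy) * np.sin(t)
            yr = -(x - cx) * np.sin(t) + (y - cy) * np.cos(t)
            img[(xr / a) ** 2 + (yr / b) ** 2 <= 1.0] += rho
        return img

    # Two-ellipse toy "head": outer skull plus lower-density interior:
    img = rasterize_ellipses([(0, 0, 9.6, 12.0, 0, 1.8),
                              (0, 0, 9.0, 11.4, 0, -0.75)])
    print(img.shape, img.max())
    ```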

  14. Simulation tools for two-dimensional experiments in x-ray computed tomography using the FORBILD head phantom

    International Nuclear Information System (INIS)

    Yu Zhicong; Noo, Frédéric; Wunderlich, Adam; Dennerlein, Frank; Lauritsch, Günter; Hornegger, Joachim

    2012-01-01

    Mathematical phantoms are essential for the development and early stage evaluation of image reconstruction algorithms in x-ray computed tomography (CT). This note offers tools for computer simulations using a two-dimensional (2D) phantom that models the central axial slice through the FORBILD head phantom. Introduced in 1999, in response to a need for a more robust test, the FORBILD head phantom is now seen by many as the gold standard. However, the simple Shepp–Logan phantom is still heavily used by researchers working on 2D image reconstruction. Universal acceptance of the FORBILD head phantom may have been prevented by its significantly higher complexity: software that allows computer simulations with the Shepp–Logan phantom is not readily applicable to the FORBILD head phantom. The tools offered here address this problem. They are designed for use with Matlab®, as well as open-source variants, such as FreeMat and Octave, which are all widely used in both academia and industry. To get started, the interested user can simply copy and paste the codes from this PDF document into Matlab® M-files. (note)

  15. Comparisons of physical experiment and discrete element simulations of sheared granular materials in an annular shear cell

    Science.gov (United States)

    Ji, S.; Hanes, D.M.; Shen, H.H.

    2009-01-01

    In this study, we report a direct comparison between a physical test and a computer simulation of rapidly sheared granular materials. An annular shear cell experiment was conducted. All parameters were kept the same between the physical and the computational systems to the extent possible. Artificially softened particles were used in the simulation to reduce the computational time to a manageable level. A sensitivity study on the particle stiffness ensured that this artificial modification was acceptable. In the experiment, a range of normal stresses was applied to a given amount of particles sheared in an annular trough at a range of controlled shear speeds. Two types of particles, glass and Delrin, were used in the experiment. Qualitatively, the torque required to shear the materials at different rotational speeds compared well between the simulations and the physical experiments for both the glass and the Delrin particles. However, the quantitative discrepancies between the measured and simulated shear stresses were nearly a factor of two. Boundary conditions, particle size distribution, particle damping, and friction, including a sliding and rolling contact force model, were examined to determine their effects on the computational results. It was found that, of the above, the rolling friction between particles had the most significant effect on the macro stress level. This study shows that discrete element simulation is a viable method for engineering design for granular material systems. Particle-level information is needed to properly conduct these simulations. However, not all particle-level information is equally important in the studied regime. Rolling friction, which is not commonly considered in many discrete element models, appears to play an important role. © 2009 Elsevier Ltd.
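
    Rolling friction typically enters a discrete element contact model as a resistance torque of magnitude mu_r * F_n * R_eff opposing the relative rolling velocity. A minimal form (an assumption for illustration, not necessarily the authors' exact model):

    ```python
    def rolling_resistance_torque(mu_r, f_n, r_eff, omega_rel):
        """Constant-directional rolling resistance for a 1D sketch: torque of
        magnitude mu_r * |F_n| * R_eff opposing the relative angular velocity."""
        if abs(omega_rel) < 1e-12:
            return 0.0                 # no rolling, no resistance
        sign = 1.0 if omega_rel > 0 else -1.0
        return -mu_r * f_n * r_eff * sign

    print(rolling_resistance_torque(mu_r=0.05, f_n=2.0, r_eff=0.004, omega_rel=12.0))
    ```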

  16. The Longitudinal Study of Computer Simulation in Learning Statistics for Hospitality College Students

    Science.gov (United States)

    Huang, Ching-Hsu

    2014-01-01

    The class quasi-experiment was conducted to determine whether using computer simulation teaching strategy enhanced student understanding of statistics concepts for students enrolled in an introductory course. One hundred and ninety-three sophomores in hospitality management department were invited as participants in this two-year longitudinal…

  17. Biomes computed from simulated climatologies

    Energy Technology Data Exchange (ETDEWEB)

    Claussen, M.; Esch, M. [Max-Planck-Institut fuer Meteorologie, Hamburg (Germany)

    1994-01-01

    The biome model of Prentice et al. is used to predict global patterns of potential natural plant formations, or biomes, from climatologies simulated by ECHAM, a model used for climate simulations at the Max-Planck-Institut fuer Meteorologie. This study was undertaken in order to show the advantage of this biome model in diagnosing the performance of a climate model and in assessing the effects of past and future climate changes predicted by a climate model. Good overall agreement is found between global patterns of biomes computed from observed and simulated data of the present climate. But there are also major discrepancies, indicated by a difference in biomes in Australia, in the Kalahari Desert, and in the Midwest of North America. These discrepancies can be traced back to deficiencies in simulated rainfall as well as summer or winter temperatures. Global patterns of biomes computed from an ice age simulation reveal that North America, Europe, and Siberia should have been covered largely by tundra and taiga, whereas only small differences are seen for the tropical rain forests. A potential northeast shift of biomes is expected from a simulation with enhanced CO2 concentration according to the IPCC Scenario A. Little change is seen in the tropical rain forest and the Sahara. Since the biome model used is not capable of predicting changes in vegetation patterns due to a rapid climate change, the latter simulation is to be taken as a prediction of changes in conditions favourable for the existence of certain biomes, not as a prediction of a future distribution of biomes. 15 refs., 8 figs., 2 tabs.
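
    Biome models of this kind map a few bioclimatic variables (e.g., coldest-month temperature, growing degree-days, a moisture index) onto vegetation classes through thresholds. A toy classifier in that spirit (thresholds invented for illustration; not the Prentice et al. values):

    ```python
    def toy_biome(t_cold, gdd, moisture):
        """Classify a grid cell from coldest-month temperature [deg C],
        growing degree-days and a 0-1 moisture index (illustrative thresholds)."""
        if gdd < 350:
            return "tundra"
        if moisture < 0.28:
            return "desert"
        if t_cold < -15.0:
            return "taiga"
        if t_cold > 15.5 and moisture > 0.8:
            return "tropical rain forest"
        return "temperate forest/grassland"

    print(toy_biome(t_cold=-25.0, gdd=900, moisture=0.6))   # -> taiga
    ```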

  18. Computer security simulation

    International Nuclear Information System (INIS)

    Schelonka, E.P.

    1979-01-01

    Development and application of a series of simulation codes used for computer security analysis and design are described. Boolean relationships for arrays of barriers within functional modules are used to generate composite effectiveness indices. The general case of multiple layers of protection with any specified barrier survival criteria is given. Generalized reduction algorithms provide numerical security indices in selected subcategories and for the system as a whole. 9 figures, 11 tables
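
    The report's codes are not public; as a hedged sketch of how Boolean relationships over barrier arrays could yield a composite effectiveness index, assume independent per-barrier defeat probabilities, with a layer defeated only if all of its parallel barriers are defeated and the system defeated only if all layers in series are defeated:

    ```python
    def layer_effectiveness(defeat_probs):
        """A layer stops the adversary unless ALL of its parallel barriers
        are defeated (Boolean AND over the layer, independence assumed)."""
        p_all = 1.0
        for q in defeat_probs:
            p_all *= q
        return 1.0 - p_all

    def system_effectiveness(layers):
        """Layers in series: the system fails only if EVERY layer is defeated."""
        p_defeat_all = 1.0
        for layer in layers:
            p_defeat_all *= 1.0 - layer_effectiveness(layer)
        return 1.0 - p_defeat_all

    # Three layers with assumed per-barrier defeat probabilities:
    print(system_effectiveness([[0.4, 0.5], [0.3], [0.6, 0.7, 0.8]]))  # ~0.98
    ```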

  19. Scientific and computational challenges of the fusion simulation project (FSP)

    International Nuclear Information System (INIS)

    Tang, W M

    2008-01-01

    This paper highlights the scientific and computational challenges facing the Fusion Simulation Project (FSP). The primary objective is to develop advanced software designed to use leadership-class computers for carrying out multiscale physics simulations to provide information vital to delivering a realistic integrated fusion simulation model with unprecedented physics fidelity. This multiphysics capability will be unprecedented in that in the current FES applications domain, the largest-scale codes are used to carry out first-principles simulations of mostly individual phenomena in realistic 3D geometry while the integrated models are much smaller-scale, lower-dimensionality codes with significant empirical elements used for modeling and designing experiments. The FSP is expected to be the most up-to-date embodiment of the theoretical and experimental understanding of magnetically confined thermonuclear plasmas and to provide a living framework for the simulation of such plasmas as the associated physics understanding continues to advance over the next several decades. Substantive progress on answering the outstanding scientific questions in the field will drive the FSP toward its ultimate goal of developing a reliable ability to predict the behavior of plasma discharges in toroidal magnetic fusion devices on all relevant time and space scales. From a computational perspective, the fusion energy science application goal to produce high-fidelity, whole-device modeling capabilities will demand computing resources in the petascale range and beyond, together with the associated multicore algorithmic formulation needed to address burning plasma issues relevant to ITER - a multibillion dollar collaborative device involving seven international partners representing over half the world's population. Even more powerful exascale platforms will be needed to meet the future challenges of designing a demonstration fusion reactor (DEMO). Analogous to other major applied physics

  20. Overview of Computer Simulation Modeling Approaches and Methods

    Science.gov (United States)

    Robert E. Manning; Robert M. Itami; David N. Cole; Randy Gimblett

    2005-01-01

    The field of simulation modeling has grown greatly with recent advances in computer hardware and software. Much of this work has involved large scientific and industrial applications for which substantial financial resources are available. However, advances in object-oriented programming and simulation methodology, concurrent with dramatic increases in computer...

  1. SED-ED, a workflow editor for computational biology experiments written in SED-ML.

    Science.gov (United States)

    Adams, Richard R

    2012-04-15

    The simulation experiment description markup language (SED-ML) is a new community data standard to encode computational biology experiments in a computer-readable XML format. Its widespread adoption will require the development of software support to work with SED-ML files. Here, we describe a software tool, SED-ED, to view, edit, validate and annotate SED-ML documents while shielding end-users from the underlying XML representation. SED-ED supports modellers who wish to create, understand and further develop a simulation description provided in SED-ML format. SED-ED is available as a standalone Java application, as an Eclipse plug-in and as an SBSI (www.sbsi.ed.ac.uk) plug-in, all under an MIT open-source license. Source code is at https://sed-ed-sedmleditor.googlecode.com/svn. The application itself is available from https://sourceforge.net/projects/jlibsedml/files/SED-ED/.

  2. Advanced computational simulation for design and manufacturing of lightweight material components for automotive applications

    Energy Technology Data Exchange (ETDEWEB)

    Simunovic, S.; Aramayo, G.A.; Zacharia, T. [Oak Ridge National Lab., TN (United States); Toridis, T.G. [George Washington Univ., Washington, DC (United States); Bandak, F.; Ragland, C.L. [Dept. of Transportation, Washington, DC (United States)

    1997-04-01

    Computational vehicle models for the analysis of lightweight material performance in automobiles have been developed through collaboration between Oak Ridge National Laboratory, the National Highway Transportation Safety Administration, and George Washington University. The vehicle models have been verified against experimental data obtained from vehicle collisions. The crashed vehicles were analyzed, and the main impact energy dissipation mechanisms were identified and characterized. Important structural parts were extracted and digitized and directly compared with simulation results. High-performance computing played a key role in the model development because it allowed for rapid computational simulations and model modifications. The deformation of the computational model shows a very good agreement with the experiments. This report documents the modifications made to the computational model and relates them to the observations and findings on the test vehicle. Procedural guidelines are also provided that the authors believe need to be followed to create realistic models of passenger vehicles that could be used to evaluate the performance of lightweight materials in automotive structural components.

  3. Voxel-based Monte Carlo simulation of X-ray imaging and spectroscopy experiments

    International Nuclear Information System (INIS)

    Bottigli, U.; Brunetti, A.; Golosio, B.; Oliva, P.; Stumbo, S.; Vincze, L.; Randaccio, P.; Bleuet, P.; Simionovici, A.; Somogyi, A.

    2004-01-01

    A Monte Carlo code for the simulation of X-ray imaging and spectroscopy experiments in heterogeneous samples is presented. The energy spectrum, polarization and profile of the incident beam can be defined so that X-ray tube systems as well as synchrotron sources can be simulated. The sample is modeled as a 3D regular grid. The chemical composition and density is given at each point of the grid. Photoelectric absorption, fluorescent emission, elastic and inelastic scattering are included in the simulation. The core of the simulation is a fast routine for the calculation of the path lengths of the photon trajectory intersections with the grid voxels. The voxel representation is particularly useful for samples that cannot be well described by a small set of polyhedra. This is the case of most naturally occurring samples. In such cases, voxel-based simulations are much less expensive in terms of computational cost than simulations on a polygonal representation. The efficient scheme used for calculating the path lengths in the voxels and the use of variance reduction techniques make the code suitable for the detailed simulation of complex experiments on generic samples in a relatively short time. Examples of applications to X-ray imaging and spectroscopy experiments are discussed
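
    The fast path-length routine is not listed in the abstract; a standard Siddon-style traversal, which merges the sorted parameter values at which the ray crosses the grid planes, yields the per-voxel path lengths. An illustrative Python sketch:

    ```python
    import math

    def voxel_path_lengths(p0, p1, grid_min, voxel, shape):
        """Path length of segment p0->p1 inside each voxel it crosses
        (Siddon-style: sort all plane-crossing parameters along the ray)."""
        d = [b - a for a, b in zip(p0, p1)]
        length = math.sqrt(sum(c * c for c in d))
        alphas = {0.0, 1.0}
        for ax in range(3):
            if d[ax] != 0.0:
                for i in range(shape[ax] + 1):
                    a = (grid_min[ax] + i * voxel[ax] - p0[ax]) / d[ax]
                    if 0.0 < a < 1.0:
                        alphas.add(a)
        out = {}
        alphas = sorted(alphas)
        for a0, a1 in zip(alphas, alphas[1:]):
            mid = 0.5 * (a0 + a1)    # midpoint identifies the voxel crossed
            idx = tuple(int((p0[ax] + mid * d[ax] - grid_min[ax]) // voxel[ax])
                        for ax in range(3))
            if all(0 <= idx[ax] < shape[ax] for ax in range(3)):
                out[idx] = out.get(idx, 0.0) + (a1 - a0) * length
        return out

    # A diagonal segment through a 4x4x4 grid of unit voxels:
    print(voxel_path_lengths((0, 0, 0.5), (4, 4, 0.5), (0, 0, 0), (1, 1, 1), (4, 4, 4)))
    ```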

  4. Voxel-based Monte Carlo simulation of X-ray imaging and spectroscopy experiments

    Energy Technology Data Exchange (ETDEWEB)

    Bottigli, U. [Istituto di Matematica e Fisica dell' Universita di Sassari, via Vienna 2, 07100, Sassari (Italy); Sezione INFN di Cagliari (Italy); Brunetti, A. [Istituto di Matematica e Fisica dell' Universita di Sassari, via Vienna 2, 07100, Sassari (Italy); Golosio, B. [Istituto di Matematica e Fisica dell' Universita di Sassari, via Vienna 2, 07100, Sassari (Italy) and Sezione INFN di Cagliari (Italy)]. E-mail: golosio@uniss.it; Oliva, P. [Istituto di Matematica e Fisica dell' Universita di Sassari, via Vienna 2, 07100, Sassari (Italy); Stumbo, S. [Istituto di Matematica e Fisica dell' Universita di Sassari, via Vienna 2, 07100, Sassari (Italy); Vincze, L. [Department of Chemistry, University of Antwerp (Belgium); Randaccio, P. [Dipartimento di Fisica dell' Universita di Cagliari and Sezione INFN di Cagliari (Italy); Bleuet, P. [European Synchrotron Radiation Facility, Grenoble (France); Simionovici, A. [European Synchrotron Radiation Facility, Grenoble (France); Somogyi, A. [European Synchrotron Radiation Facility, Grenoble (France)

    2004-10-08

    A Monte Carlo code for the simulation of X-ray imaging and spectroscopy experiments in heterogeneous samples is presented. The energy spectrum, polarization and profile of the incident beam can be defined so that X-ray tube systems as well as synchrotron sources can be simulated. The sample is modeled as a 3D regular grid. The chemical composition and density is given at each point of the grid. Photoelectric absorption, fluorescent emission, elastic and inelastic scattering are included in the simulation. The core of the simulation is a fast routine for the calculation of the path lengths of the photon trajectory intersections with the grid voxels. The voxel representation is particularly useful for samples that cannot be well described by a small set of polyhedra. This is the case of most naturally occurring samples. In such cases, voxel-based simulations are much less expensive in terms of computational cost than simulations on a polygonal representation. The efficient scheme used for calculating the path lengths in the voxels and the use of variance reduction techniques make the code suitable for the detailed simulation of complex experiments on generic samples in a relatively short time. Examples of applications to X-ray imaging and spectroscopy experiments are discussed.

  5. REACTOR: a computer simulation for schools

    International Nuclear Information System (INIS)

    Squires, D.

    1985-01-01

    The paper concerns computer simulation of the operation of a nuclear reactor, for use in schools. The project was commissioned by UKAEA, and carried out by the Computers in the Curriculum Project, Chelsea College. The program, for an advanced gas cooled reactor, is briefly described. (U.K.)

  6. Learning and instruction with computer simulations

    NARCIS (Netherlands)

    de Jong, Anthonius J.M.

    1991-01-01

    The present volume presents the results of an inventory of the elements of a computer-based simulation learning environment. This inventory was conducted within a DELTA project called SIMULATE. In the project, a learning environment that provides intelligent support to learners and that has a simulation as its

  7. Bringing history to life: simulating landmark experiments in psychology.

    Science.gov (United States)

    Boynton, David M; Smith, Laurence D

    2006-05-01

    The course in history of psychology can be challenging for students, many of whom enter it with little background in history and who are faced with unfamiliar names and concepts. The sheer volume of material can encourage passive memorization unless efforts are made to increase student involvement. As part of a trend toward experiential history, historians of science have begun to supplement their lectures with demonstrations of classic physics experiments as a way to bring the history of science to life. Here, the authors report on computer simulations of five landmark experiments from early experimental psychology in the areas of reaction time, span of attention, and apparent motion. The simulations are designed not only to permit hands-on replication of historically important results but also to reproduce the experimental procedures closely enough that students can gain a feel for the nature of early research and the psychological processes being studied.

  8. Monte Carlo simulation of fast neutron scattering experiments including DD-breakup neutrons

    Energy Technology Data Exchange (ETDEWEB)

    Schmidt, D.; Siebert, B.R.L.

    1993-06-01

    The computational simulation of the deuteron breakup in a scattering experiment has been investigated. Experimental breakup spectra measured at 16 deuteron energies and at 7 angles for each energy served as the data base. Analysis of these input data and of the conditions of the scattering experiment made it possible to reduce the input data. The use of one weighted breakup spectrum is sufficient to simulate the scattering spectra at one incident neutron energy. A number of tests were carried out to prove the validity of this result. The simulation of neutron scattering on carbon, including the breakup, was compared with measured spectra. Differences between calculated and measured spectra were for the most part within the experimental uncertainties. Certain significant deviations can be attributed to erroneous scattering cross sections taken from an evaluation and used in the simulation. Scattering on higher-lying states in ¹²C can be analyzed by subtracting the simulated breakup-scattering from the experimental spectra. (orig.)

  9. Monte Carlo simulation of fast neutron scattering experiments including DD-breakup neutrons

    International Nuclear Information System (INIS)

    Schmidt, D.; Siebert, B.R.L.

    1993-06-01

    The computational simulation of the deuteron breakup in a scattering experiment has been investigated. Experimental breakup spectra measured at 16 deuteron energies and at 7 angles for each energy served as the data base. Analysis of these input data and of the conditions of the scattering experiment made it possible to reduce the input data. The use of one weighted breakup spectrum is sufficient to simulate the scattering spectra at one incident neutron energy. A number of tests were carried out to prove the validity of this result. The simulation of neutron scattering on carbon, including the breakup, was compared with measured spectra. Differences between calculated and measured spectra were for the most part within the experimental uncertainties. Certain significant deviations can be attributed to erroneous scattering cross sections taken from an evaluation and used in the simulation. Scattering on higher-lying states in ¹²C can be analyzed by subtracting the simulated breakup-scattering from the experimental spectra. (orig.)

  10. Computer simulation on molten ionic salts

    International Nuclear Information System (INIS)

    Kawamura, K.; Okada, I.

    1978-01-01

    The extensive advances in computer technology have made it possible to apply computer simulation to the evaluation of the macroscopic and microscopic properties of molten salts. The evaluation of the potential energy in molten salt systems is complicated by the presence of long-range energy, i.e., Coulomb energy, in contrast to simple liquids where the potential energy is easily evaluated. It has been shown, however, that no difficulties are encountered when the Ewald method is applied to the evaluation of the Coulomb energy. After a number of attempts had been made to approximate the pair potential, the Huggins-Mayer potential based on ionic crystals became the most often employed. Since it is thought that the only appreciable contribution to the many-body potential not included in the Huggins-Mayer potential arises from the internal electrostatic polarization of ions in molten ionic salts, computer simulation with a provision for ion polarization has been tried recently. The computations, which are employed mainly for molten alkali halides, can provide: (1) thermodynamic data such as internal energy, internal pressure and isothermal compressibility; (2) microscopic configurational data such as radial distribution functions; (3) transport data such as the diffusion coefficient and electrical conductivity; and (4) spectroscopic data such as the intensity of inelastic scattering and the stretching frequency of simple molecules. The computed results seem to agree well with the measured results. Computer simulation can also be used to test the effectiveness of a proposed pair potential and the adequacy of postulated models of molten salts, and to obtain experimentally inaccessible data. A further application of MD computation employing the pair potential based on an ionic model to BeF2, ZnCl2 and SiO2 shows the possibility of quantitative interpretation of structures and glass transformation phenomena.
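
    For reference, the Huggins-Mayer (Born-Mayer-Huggins) form combines a Coulomb term, an exponential repulsion, and r^-6/r^-8 dispersion terms. A sketch with made-up parameters (units left abstract); in a molten-salt MD code the Coulomb part would be evaluated with the Ewald sum rather than this bare 1/r term:

    ```python
    import math

    def huggins_mayer(r, z_i, z_j, b, rho, c, d):
        """Huggins-Mayer (Born-Mayer-Huggins) pair potential."""
        return (z_i * z_j / r            # Coulomb (charges in units of e)
                + b * math.exp(-r / rho) # short-range repulsion
                - c / r**6               # dipole-dipole dispersion
                - d / r**8)              # dipole-quadrupole dispersion

    # Illustrative numbers only, not fitted parameters:
    print(huggins_mayer(r=2.8, z_i=+1, z_j=-1, b=300.0, rho=0.33, c=10.0, d=5.0))
    ```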

  11. MoCog1: A computer simulation of recognition-primed human decision making

    Science.gov (United States)

    Gevarter, William B.

    1991-01-01

    The results of the first stage of a research effort to develop a 'sophisticated' computer model of human cognitive behavior are described. Most human decision making is an experience-based, relatively straightforward, largely automatic response to internal goals and drives, utilizing cues and opportunities perceived from the current environment. The development of the architecture and computer program (MoCog1) associated with such 'recognition-primed' decision making is discussed. The resultant computer program was successfully utilized as a vehicle to simulate earlier findings that relate how an individual's implicit theories orient the individual toward particular goals, with resultant cognitions, affects, and behavior in response to their environment.

  12. A Facility for Long-Term Mars Simulation Experiments: The Mars Environmental Simulation Chamber (MESCH)

    Science.gov (United States)

    Jensen, Lars Liengaard; Merrison, Jonathan; Hansen, Aviaja Anna; Mikkelsen, Karina Aarup; Kristoffersen, Tommy; Nørnberg, Per; Lomstein, Bente Aagaard; Finster, Kai

    2008-06-01

    We describe the design, construction, and pilot operation of a Mars simulation facility composed of a cryogenic environmental chamber, an atmospheric gas analyzer, and a xenon/mercury discharge source for UV generation. The Mars Environmental Simulation Chamber (MESCH) consists of a double-walled cylindrical chamber. The double wall provides a cooling mantle through which liquid N2 can be circulated. A load-lock system that consists of a small pressure-exchange chamber, which can be evacuated, allows for the exchange of samples without changing the chamber environment. Fitted within the MESCH is a carousel, which holds up to 10 steel sample tubes. Rotation of the carousel is controlled by an external motor. Each sample in the carousel can be placed at any desired position. Environmental data, such as temperature, pressure, and UV exposure time, are computer logged and used in automated feedback mechanisms, enabling a wide variety of experiments that include time series. Tests of the simulation facility have successfully demonstrated its ability to produce temperature cycles and maintain low temperature (down to -140°C), low atmospheric pressure (5-10 mbar), and a gas composition like that of Mars during long-term experiments.

  13. New Pedagogies on Teaching Science with Computer Simulations

    Science.gov (United States)

    Khan, Samia

    2011-01-01

    Teaching science with computer simulations is a complex undertaking. This case study examines how an experienced science teacher taught chemistry using computer simulations and the impact of his teaching on his students. Classroom observations over 3 semesters, teacher interviews, and student surveys were collected. The data were analyzed for (1)…

  14. Computer simulation of plasma behavior in open-ended linear theta machines. Scientific report 81-5

    International Nuclear Information System (INIS)

    Stover, E.K.

    1981-04-01

    Zero-dimensional and one-dimensional fluid plasma computer models have been developed to study the behavior of linear theta pinch plasmas. Computer simulation results generated from these codes are compared with data obtained from two theta pinch experiments so that significant machine plasma behavior can be identified. The experiments examined are a collisional experiment, T_i ≈ 50 eV, n_e ≈ 10¹⁷ cm⁻³, where the plasma mean-free-path was significantly less than the plasma column length, and a hot ion species experiment, T_i ≈ 3 keV, n_e ≈ 10¹⁶ cm⁻³, where the ion mean-free-path was on the order of the plasma column length

  15. Large-scale simulations of error-prone quantum computation devices

    Energy Technology Data Exchange (ETDEWEB)

    Trieu, Doan Binh

    2009-07-01

    The theoretical concepts of quantum computation in the idealized and undisturbed case are well understood. However, in practice, all quantum computation devices do suffer from decoherence effects as well as from operational imprecisions. This work assesses the power of error-prone quantum computation devices using large-scale numerical simulations on parallel supercomputers. We present the Juelich Massively Parallel Ideal Quantum Computer Simulator (JUMPIQCS), which simulates a generic quantum computer on gate level. It comprises an error model for decoherence and operational errors. The robustness of various algorithms in the presence of noise has been analyzed. The simulation results show that for large system sizes and long computations it is imperative to actively correct errors by means of quantum error correction. We implemented the 5-, 7-, and 9-qubit quantum error correction codes. Our simulations confirm that using error-prone correction circuits with non-fault-tolerant quantum error correction will always fail, because more errors are introduced than corrected. Fault-tolerant methods can overcome this problem, provided that the single qubit error rate is below a certain threshold. We incorporated fault-tolerant quantum error correction techniques into JUMPIQCS using Steane's 7-qubit code and determined this threshold numerically. Using the depolarizing channel as the source of decoherence, we find a threshold error rate of (5.2 ± 0.2) × 10⁻⁶. For Gaussian distributed operational over-rotations the threshold lies at a standard deviation of 0.0431 ± 0.0002. We can conclude that quantum error correction is especially well suited for the correction of operational imprecisions and systematic over-rotations. For realistic simulations of specific quantum computation devices we need to extend the generic model to dynamic simulations, i.e. time-dependent Hamiltonian simulations of realistic hardware models. We focus on today's most advanced
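
    The qualitative point — encoding only pays off below a threshold error rate — can be illustrated with a far simpler toy than the Steane code used in JUMPIQCS: a 3-qubit repetition code against independent bit flips, sampled by Monte Carlo. This is our own illustration, not the thesis's error model.

```python
import numpy as np

def logical_error_rate(p, shots=200_000, seed=1):
    """Toy 3-qubit repetition code under independent bit flips: majority
    vote fails when two or more of the three qubits flip."""
    rng = np.random.default_rng(seed)
    flips = rng.random((shots, 3)) < p
    return (flips.sum(axis=1) >= 2).mean()

for p in (0.01, 0.10, 0.40):
    print(f"p={p:.2f}  encoded={logical_error_rate(p):.4f}  bare={p:.2f}")
# Analytically the encoded rate is 3p^2 - 2p^3, which beats the bare
# qubit for p < 1/2 -- the 'threshold' of this toy code.
```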

  16. Interoceanic canal excavation scheduling via computer simulation

    Energy Technology Data Exchange (ETDEWEB)

    Baldonado, Orlino C [Holmes and Narver, Inc., Los Angeles, CA (United States)

    1970-05-15

    The computer simulation language GPSS/360 was used to simulate the schedule of several nuclear detonation programs for the interoceanic canal project. The effects of using different weather restriction categories due to air blast and fallout were investigated. The effect of increasing the number of emplacement and stemming crews and the effect of varying the reentry period after detonating a row charge or salvo were also studied. Detonation programs were simulated for the proposed Routes 17A and 25E. The study demonstrates the method of using computer simulation so that a schedule and its associated constraints can be assessed for feasibility. Since many simulation runs can be made for a given set of detonation program constraints, one readily obtains an average schedule for a range of conditions. This provides a method for analyzing time-sensitive operations so that time and cost-effective operational schedules can be established. A comparison of the simulated schedules with those that were published shows them to be similar. (author)
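
    The original study used GPSS/360; the underlying pattern — replicating a randomized schedule many times under weather and reentry constraints and averaging the outcomes — can be sketched in a few lines. All rates and durations below are invented placeholders, not values from the study.

```python
import random

def simulate_schedule(n_detonations=30, p_weather_ok=0.6, reentry_days=3,
                      rng=None):
    """One randomized schedule: each detonation waits for an acceptable
    weather day, then imposes a fixed reentry period before the next."""
    rng = rng or random.Random()
    day = 0
    for _ in range(n_detonations):
        while rng.random() > p_weather_ok:      # wait out restricted days
            day += 1
        day += reentry_days                     # reentry after the salvo
    return day

runs = [simulate_schedule(rng=random.Random(seed)) for seed in range(1000)]
print("mean schedule length (days):", sum(runs) / len(runs))
```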

  17. Interoceanic canal excavation scheduling via computer simulation

    International Nuclear Information System (INIS)

    Baldonado, Orlino C.

    1970-01-01

    The computer simulation language GPSS/360 was used to simulate the schedule of several nuclear detonation programs for the interoceanic canal project. The effects of using different weather restriction categories due to air blast and fallout were investigated. The effect of increasing the number of emplacement and stemming crews and the effect of varying the reentry period after detonating a row charge or salvo were also studied. Detonation programs were simulated for the proposed Routes 17A and 25E. The study demonstrates the method of using computer simulation so that a schedule and its associated constraints can be assessed for feasibility. Since many simulation runs can be made for a given set of detonation program constraints, one readily obtains an average schedule for a range of conditions. This provides a method for analyzing time-sensitive operations so that time and cost-effective operational schedules can be established. A comparison of the simulated schedules with those that were published shows them to be similar. (author)

  18. Monte Carlo Simulations for a LEP Experiment with Unix Workstation Clusters

    Science.gov (United States)

    Bonesini, M.; Calegari, A.; Rossi, P.; Rossi, V.

    Modular systems of RISC-CPU-based computers have been implemented for large production runs of Monte Carlo simulated events for the DELPHI experiment at CERN. From a pilot system based on DEC 5000 CPUs, a full-size system based on a CONVEX C3820 UNIX supercomputer and a cluster of HP 735 workstations has been put into operation as a joint effort between INFN Milano and CILEA.

  19. Biomes computed from simulated climatologies

    Energy Technology Data Exchange (ETDEWEB)

    Claussen, W.; Esch, M.

    1992-09-01

    The biome model of Prentice et al. is used to predict global patterns of potential natural plant formations, or biomes, from climatologies simulated by ECHAM, a model used for climate simulations at the Max-Planck-Institut fuer Meteorologie. This study is undertaken in order to show the advantage of this biome model in comprehensively diagnosing the performance of a climate model and in assessing the effects of past and future climate changes predicted by such a model. Good overall agreement is found between global patterns of biomes computed from observed and simulated data of the present climate. But there are also major discrepancies, indicated by a difference in biomes in Australia, in the Kalahari Desert, and in the Middle West of North America. These discrepancies can be traced back to failures in simulated rainfall as well as summer or winter temperatures. Global patterns of biomes computed from an ice age simulation reveal that North America, Europe, and Siberia should have been covered largely by tundra and taiga, whereas only small differences are seen for the tropical rain forests. A potential north-east shift of biomes is expected from a simulation with enhanced CO₂ concentration according to the IPCC Scenario A. Little change is seen in the tropical rain forest and the Sahara. Since the biome model used is not capable of predicting changes in vegetation patterns due to a rapid climate change, the latter simulation has to be taken as a prediction of changes in conditions favorable for the existence of certain biomes, not as a prediction of a future distribution of biomes. (orig.)
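
    Biome models of this kind map bioclimatic variables to vegetation classes through threshold rules. The toy classifier below conveys only the mechanics; the thresholds and classes are invented and are far cruder than the Prentice et al. scheme.

```python
def classify_biome(mean_annual_temp_c, annual_precip_mm):
    """Deliberately crude illustrative rules -- not the Prentice et al. model."""
    if mean_annual_temp_c < -5.0:
        return "tundra"
    if mean_annual_temp_c < 3.0:
        return "taiga"
    if annual_precip_mm < 250.0:
        return "desert"
    if mean_annual_temp_c > 20.0 and annual_precip_mm > 1500.0:
        return "tropical rain forest"
    return "temperate forest/grassland"

# Applied cell by cell to an observed and a simulated climatology, the two
# resulting biome maps can then be compared directly, as in the study above.
print(classify_biome(25.0, 2000.0))   # -> tropical rain forest
```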

  20. Discrete Element Simulations and Experiments on the Deformation of Cohesive Powders in a Bi-Axial Box

    NARCIS (Netherlands)

    Imole, Olukayode Isaiah; Kumar, Nishant; Magnanimo, Vanessa; Luding, Stefan

    2012-01-01

    We compare element test experiments and simulations on the deformation of frictional, cohesive particles in a bi-axial box. We show that computer simulations with the Discrete Element Method qualitatively reproduce a uniaxial compression element test in the true bi-axial tester. We highlight the

  1. Computer graphics in heat-transfer simulations

    International Nuclear Information System (INIS)

    Hamlin, G.A. Jr.

    1980-01-01

    Computer graphics can be very useful in the setup of heat transfer simulations and in the display of the results of such simulations. The potential use of recently available low-cost graphics devices in the setup of such simulations has not been fully exploited. Several types of graphics devices and their potential usefulness are discussed, and some configurations of graphics equipment are presented in the low-, medium-, and high-price ranges

  2. Parallel Computing for Brain Simulation.

    Science.gov (United States)

    Pastur-Romay, L A; Porto-Pazos, A B; Cedron, F; Pazos, A

    2017-01-01

    The human brain is the most complex system in the known universe and therefore one of its greatest mysteries. It provides human beings with extraordinary abilities, yet how and why most of these abilities are produced is still not understood. For decades, researchers have been trying to make computers reproduce these abilities, focusing both on understanding the nervous system and on processing data more efficiently than before. Their aim is to make computers process information similarly to the brain. Important technological developments and vast multidisciplinary projects have made it possible to create the first simulations with a number of neurons similar to that of a human brain. This paper presents an up-to-date review of the main research projects that are trying to simulate and/or emulate the human brain. They employ different types of computational models using parallel computing: digital, analog and hybrid models. This review covers the current applications of these works as well as future trends. It focuses both on works that pursue progress in neuroscience and on others that seek new discoveries in computer science (neuromorphic hardware, machine learning techniques). Their most outstanding characteristics are summarized, and the latest advances and future plans are presented. In addition, this review points out the importance of considering not only neurons: computational models of the brain should also include glial cells, given the proven importance of astrocytes in information processing.

  3. Simulation of nuclear fuel rods by using process computer-controlled power for indirect electrically heated rods

    International Nuclear Information System (INIS)

    Malang, S.

    1975-11-01

    An investigation was carried out to determine how the simulation of nuclear fuel rods with indirect electrically heated rods could be improved by use of a computer to control the electrical power during a loss-of-coolant accident (LOCA). To aid in the experiment, a new version of the HETRAP code was developed which simulates a LOCA with heater rod power controlled by a computer that adjusts rod power during a blowdown to minimize the difference in heat flux between the fuel and heater rods. Results show that without computer control of heater rod power, only the part of a blowdown up to the time when the heat transfer mode changes from nucleate boiling to transition or film boiling can be simulated well, and then only for short times. With computer control, the surface heat flux and temperature of an electrically heated rod can be made nearly identical to those of a reactor fuel rod with the same cooling conditions during much of the LOCA. A small process control computer can be used to achieve close simulation of a nuclear fuel rod with an indirect electrically heated rod
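
    The control idea — adjust the electrical power each time step so that the heater rod's surface heat flux tracks the computed fuel-rod flux — is an ordinary feedback loop. The sketch below uses a simple integral controller and a made-up one-pole plant response; it is not the HETRAP model.

```python
import math

def track_heat_flux(target, dt=0.05, gain=2.0, tau=0.5):
    """Integral controller driving a lagged 'heater rod' toward a target flux."""
    power, flux, history = 0.0, 0.0, []
    for q_target in target:
        error = q_target - flux
        power += gain * error * dt            # controller update
        flux += (power - flux) * dt / tau     # stand-in plant response, not HETRAP
        history.append(flux)
    return history

# Target: a decaying flux transient, loosely evoking a blowdown (arbitrary units).
target = [math.exp(-0.02 * k) for k in range(200)]
flux_history = track_heat_flux(target)
```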

  4. Designing simulation experiments with controllable and uncontrollable factors for applications in healthcare

    DEFF Research Database (Denmark)

    Dehlendorff, Christian; Kulahci, Murat; Andersen, Klaus Kaae

    2011-01-01

    We propose a new methodology for designing computer experiments that was inspired by the split-plot designs that are often used in physical experimentation. The methodology has been developed for a simulation model of a surgical unit in a Danish hospital. We classify the factors as controllable and

  5. Steam condensation induced water hammer in a vertical up-fill configuration within an integral test facility. Experiments and computational simulations

    Energy Technology Data Exchange (ETDEWEB)

    Dirndorfer, Stefan

    2017-01-17

    Condensation induced water hammer is a source of danger and unpredictable loads in pipe systems. Studies concerning condensation induced water hammer were predominantly made for horizontal pipes; studies concerning vertical pipe geometries are quite rare. This work presents a new integral test facility and an analysis of condensation induced water hammer in a vertical up-fill configuration. Thanks to state-of-the-art technology, the phenomenology of vertical condensation induced water hammer can be analysed by means of sufficiently high-sampled experimental data. The system code ATHLET is used to simulate UniBw condensation induced water hammer experiments. A newly developed and implemented direct contact condensation model enables ATHLET to calculate condensation induced water hammer. Selected experiments are validated by the modified ATHLET system code. A sensitivity analysis in ATHLET, together with the experimental data, makes it possible to assess the performance of ATHLET in computing condensation induced water hammer in a vertical up-fill configuration.

  6. Steam condensation induced water hammer in a vertical up-fill configuration within an integral test facility. Experiments and computational simulations

    International Nuclear Information System (INIS)

    Dirndorfer, Stefan

    2017-01-01

    Condensation induced water hammer is a source of danger and unpredictable loads in pipe systems. Studies concerning condensation induced water hammer were predominantly made for horizontal pipes; studies concerning vertical pipe geometries are quite rare. This work presents a new integral test facility and an analysis of condensation induced water hammer in a vertical up-fill configuration. Thanks to state-of-the-art technology, the phenomenology of vertical condensation induced water hammer can be analysed by means of sufficiently high-sampled experimental data. The system code ATHLET is used to simulate UniBw condensation induced water hammer experiments. A newly developed and implemented direct contact condensation model enables ATHLET to calculate condensation induced water hammer. Selected experiments are validated by the modified ATHLET system code. A sensitivity analysis in ATHLET, together with the experimental data, makes it possible to assess the performance of ATHLET in computing condensation induced water hammer in a vertical up-fill configuration.

  7. Computational Dehydration of Crystalline Hydrates Using Molecular Dynamics Simulations

    DEFF Research Database (Denmark)

    Larsen, Anders Støttrup; Rantanen, Jukka; Johansson, Kristoffer E

    2017-01-01

    Molecular dynamics (MD) simulations have evolved into an increasingly reliable and accessible technique and are today implemented in many areas of biomedical sciences. We present a generally applicable method to study dehydration of hydrates based on MD simulations and apply this approach … to the dehydration of ampicillin trihydrate. The crystallographic unit cell of the trihydrate is used to construct the simulation cell containing 216 ampicillin and 648 water molecules. This system is dehydrated by removing water molecules during a 2200 ps simulation, and depending on the computational dehydration … The structural changes could be followed in real time, and in addition, an intermediate amorphous phase was identified. The computationally identified dehydrated structure (anhydrate) was slightly different from the experimentally known anhydrate structure, suggesting that the simulated computational structure …

  8. The effects of nutrition labeling on consumer food choice: a psychological experiment and computational model.

    Science.gov (United States)

    Helfer, Peter; Shultz, Thomas R

    2014-12-01

    The widespread availability of calorie-dense food is believed to be a contributing cause of an epidemic of obesity and associated diseases throughout the world. One possible countermeasure is to empower consumers to make healthier food choices with useful nutrition labeling. An important part of this endeavor is to determine the usability of existing and proposed labeling schemes. Here, we report an experiment on how four different labeling schemes affect the speed and nutritional value of food choices. We then apply decision field theory, a leading computational model of human decision making, to simulate the experimental results. The psychology experiment shows that quantitative, single-attribute labeling schemes have greater usability than multiattribute and binary ones, and that they remain effective under moderate time pressure. The computational model simulates these psychological results and provides explanatory insights into them. This work shows how experimental psychology and computational modeling can contribute to the evaluation and improvement of nutrition-labeling schemes.
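
    Decision field theory itself is compact enough to sketch: a preference state accumulates attention-weighted, contrast-coded attribute valuations until one option crosses a threshold. The matrices and parameter values below are invented for illustration and are not those fitted in the study.

```python
import numpy as np

def dft_choice(M, s=0.95, theta=5.0, max_steps=10_000, seed=0):
    """One simplified decision-field-theory trial.

    M: options x attributes matrix of subjective values (hypothetical)."""
    rng = np.random.default_rng(seed)
    n_options, n_attrs = M.shape
    C = np.full((n_options, n_options), -1.0 / (n_options - 1))
    np.fill_diagonal(C, 1.0)                   # contrast matrix
    P = np.zeros(n_options)
    for t in range(1, max_steps + 1):
        w = np.zeros(n_attrs)
        w[rng.integers(n_attrs)] = 1.0         # attend to one attribute at a time
        P = s * P + C @ M @ w                  # decayed preference accumulation
        if P.max() >= theta:
            break
    return int(P.argmax()), t                  # chosen option, decision time

# Two foods scored on (taste, healthiness); all numbers are made up.
M = np.array([[0.9, 0.2],
              [0.5, 0.8]])
choice, steps = dft_choice(M)
```

    In such a model, a labeling scheme's usability shows up as changes in the attribute values and attention weights it makes available to the decision maker.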

  9. Using a computer simulation for teaching communication skills: A blinded multisite mixed methods randomized controlled trial.

    Science.gov (United States)

    Kron, Frederick W; Fetters, Michael D; Scerbo, Mark W; White, Casey B; Lypson, Monica L; Padilla, Miguel A; Gliva-McConvey, Gayle A; Belfore, Lee A; West, Temple; Wallace, Amelia M; Guetterman, Timothy C; Schleicher, Lauren S; Kennedy, Rebecca A; Mangrulkar, Rajesh S; Cleary, James F; Marsella, Stacy C; Becker, Daniel M

    2017-04-01

    To assess advanced communication skills among second-year medical students exposed either to a computer simulation (MPathic-VR) featuring virtual humans, or to a multimedia computer-based learning module, and to understand each group's experiences and learning preferences. A single-blinded, mixed methods, randomized, multisite trial compared MPathic-VR (N=210) to computer-based learning (N=211). Primary outcomes: communication scores during repeat interactions with MPathic-VR's intercultural and interprofessional communication scenarios and scores on a subsequent advanced communication skills objective structured clinical examination (OSCE). Multivariate analysis of variance was used to compare outcomes. Secondary outcomes: student attitude surveys and qualitative assessments of their experiences with MPathic-VR or computer-based learning. MPathic-VR-trained students improved their intercultural and interprofessional communication performance between their first and second interactions with each scenario. They also achieved significantly higher composite scores on the OSCE than computer-based learning-trained students. Attitudes and experiences were more positive among students trained with MPathic-VR, who valued its providing immediate feedback, teaching nonverbal communication skills, and preparing them for emotion-charged patient encounters. MPathic-VR was effective in training advanced communication skills and in enabling knowledge transfer into a more realistic clinical situation. MPathic-VR's virtual human simulation offers an effective and engaging means of advanced communication training.

  10. Using a computer simulation for teaching communication skills: A blinded multisite mixed methods randomized controlled trial

    Science.gov (United States)

    Kron, Frederick W.; Fetters, Michael D.; Scerbo, Mark W.; White, Casey B.; Lypson, Monica L.; Padilla, Miguel A.; Gliva-McConvey, Gayle A.; Belfore, Lee A.; West, Temple; Wallace, Amelia M.; Guetterman, Timothy C.; Schleicher, Lauren S.; Kennedy, Rebecca A.; Mangrulkar, Rajesh S.; Cleary, James F.; Marsella, Stacy C.; Becker, Daniel M.

    2016-01-01

    Objectives To assess advanced communication skills among second-year medical students exposed either to a computer simulation (MPathic-VR) featuring virtual humans, or to a multimedia computer-based learning module, and to understand each group’s experiences and learning preferences. Methods A single-blinded, mixed methods, randomized, multisite trial compared MPathic-VR (N=210) to computer-based learning (N=211). Primary outcomes: communication scores during repeat interactions with MPathic-VR’s intercultural and interprofessional communication scenarios and scores on a subsequent advanced communication skills objective structured clinical examination (OSCE). Multivariate analysis of variance was used to compare outcomes. Secondary outcomes: student attitude surveys and qualitative assessments of their experiences with MPathic-VR or computer-based learning. Results MPathic-VR-trained students improved their intercultural and interprofessional communication performance between their first and second interactions with each scenario. They also achieved significantly higher composite scores on the OSCE than computer-based learning-trained students. Attitudes and experiences were more positive among students trained with MPathic-VR, who valued its providing immediate feedback, teaching nonverbal communication skills, and preparing them for emotion-charged patient encounters. Conclusions MPathic-VR was effective in training advanced communication skills and in enabling knowledge transfer into a more realistic clinical situation. Practice Implications MPathic-VR’s virtual human simulation offers an effective and engaging means of advanced communication training. PMID:27939846

  11. Computer simulation of thermal plant operations

    CERN Document Server

    O'Kelly, Peter

    2012-01-01

    This book describes thermal plant simulation, that is, dynamic simulation of plants which produce, exchange and otherwise utilize heat as their working medium. Directed at chemical, mechanical and control engineers involved with operations, control and optimization and operator training, the book gives the mathematical formulation and use of simulation models of the equipment and systems typically found in these industries. The author has adopted a fundamental approach to the subject. The initial chapters provide an overview of simulation concepts and describe a suitable computer environment.

  12. Acceleration of Radiance for Lighting Simulation by Using Parallel Computing with OpenCL

    Energy Technology Data Exchange (ETDEWEB)

    Zuo, Wangda; McNeil, Andrew; Wetter, Michael; Lee, Eleanor

    2011-09-06

    We report on the acceleration of annual daylighting simulations for fenestration systems in the Radiance ray-tracing program. The algorithm was optimized to reduce both the redundant data input/output operations and the floating-point operations. To further accelerate the simulation speed, the calculation for matrix multiplications was implemented using parallel computing on a graphics processing unit. We used OpenCL, which is a cross-platform parallel programming language. Numerical experiments show that the combination of the above measures can speed up the annual daylighting simulations 101.7 times or 28.6 times when the sky vector has 146 or 2306 elements, respectively.
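
    The kernel being accelerated is, at heart, the multiplication of a daylight-coefficient matrix by sky vectors (146 or 2306 elements for the two sky discretizations mentioned). A numpy stand-in shows the shape of the computation; in the reported work the multiplication is executed by an OpenCL kernel on the GPU, and all sizes other than the sky resolution are invented here.

```python
import numpy as np

rng = np.random.default_rng(0)
n_sensors, n_sky, n_hours = 1000, 146, 8760   # 146-element sky vector, one year

# Hypothetical daylight-coefficient matrix relating sky patches to sensors.
dc = rng.random((n_sensors, n_sky))
sky = rng.random((n_sky, n_hours))            # sky vector for every hour

# Annual illuminance at every sensor: one large matrix-matrix product.
illuminance = dc @ sky                        # shape: (n_sensors, n_hours)
```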

  13. Quantum simulations with noisy quantum computers

    Science.gov (United States)

    Gambetta, Jay

    Quantum computing is a new computational paradigm that is expected to lie beyond the standard model of computation. This implies a quantum computer can solve problems that can't be solved by a conventional computer with tractable overhead. To fully harness this power we need a universal fault-tolerant quantum computer. However the overhead in building such a machine is high and a full solution appears to be many years away. Nevertheless, we believe that we can build machines in the near term that cannot be emulated by a conventional computer. It is then interesting to ask what these can be used for. In this talk we will present our advances in simulating complex quantum systems with noisy quantum computers. We will show experimental implementations of this on some small quantum computers.

  14. Salesperson Ethics: An Interactive Computer Simulation

    Science.gov (United States)

    Castleberry, Stephen

    2014-01-01

    A new interactive computer simulation designed to teach sales ethics is described. Simulation learner objectives include gaining a better understanding of legal issues in selling; realizing that ethical dilemmas do arise in selling; realizing the need to be honest when selling; seeing that there are conflicting demands from a salesperson's…

  15. Simulations of Probabilities for Quantum Computing

    Science.gov (United States)

    Zak, M.

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.

  16. Pyro shock simulation: Experience with the MIPS simulator

    Science.gov (United States)

    Dwyer, Thomas J.; Moul, David S.

    1988-01-01

    The Mechanical Impulse Pyro Shock (MIPS) Simulator at GE Astro Space Division is one version of a design that is in limited use throughout the aerospace industry, and is typically used for component shock testing at levels up to 10,000 response g's. Modifications to the force input, table and component boundary conditions have allowed a range of test conditions to be achieved. Twelve different designs of components with weights up to 23 kg are undergoing or have completed qualification testing in the Dynamic Simulation Lab at GE in Valley Forge, Pa. A summary of the experience gained through the use of this simulator is presented as well as examples of shock experiments that can be readily simulated at the GE Astro MIPS facility.

  17. The photon identification loophole in EPRB experiments: computer models with single-wing selection

    Directory of Open Access Journals (Sweden)

    De Raedt Hans

    2017-11-01

    Recent Einstein-Podolsky-Rosen-Bohm experiments [M. Giustina et al., Phys. Rev. Lett. 115, 250401 (2015); L. K. Shalm et al., Phys. Rev. Lett. 115, 250402 (2015)] that claim to be loophole free are scrutinized. The combination of a digital computer and discrete-event simulation is used to construct a minimal but faithful model of the most perfected realization of these laboratory experiments. In contrast to prior simulations, all photon selections are strictly made, as they are in the actual experiments, at the local station and no other “post-selection” is involved. The simulation results demonstrate that a manifestly non-quantum model that identifies photons in the same local manner as in these experiments can produce correlations that are in excellent agreement with those of the quantum theoretical description of the corresponding thought experiment, in conflict with Bell’s theorem which states that this is impossible. The failure of Bell’s theorem is possible because of our recognition of the photon identification loophole. Such identification measurement-procedures are necessarily included in all actual experiments but are not included in the theory of Bell and his followers.

  18. The photon identification loophole in EPRB experiments: computer models with single-wing selection

    Science.gov (United States)

    De Raedt, Hans; Michielsen, Kristel; Hess, Karl

    2017-11-01

    Recent Einstein-Podolsky-Rosen-Bohm experiments [M. Giustina et al. Phys. Rev. Lett. 115, 250401 (2015); L. K. Shalm et al. Phys. Rev. Lett. 115, 250402 (2015)] that claim to be loophole free are scrutinized. The combination of a digital computer and discrete-event simulation is used to construct a minimal but faithful model of the most perfected realization of these laboratory experiments. In contrast to prior simulations, all photon selections are strictly made, as they are in the actual experiments, at the local station and no other "post-selection" is involved. The simulation results demonstrate that a manifestly non-quantum model that identifies photons in the same local manner as in these experiments can produce correlations that are in excellent agreement with those of the quantum theoretical description of the corresponding thought experiment, in conflict with Bell's theorem which states that this is impossible. The failure of Bell's theorem is possible because of our recognition of the photon identification loophole. Such identification measurement-procedures are necessarily included in all actual experiments but are not included in the theory of Bell and his followers.

  19. Computer Simulation of Reading.

    Science.gov (United States)

    Leton, Donald A.

    In recent years, coding and decoding have been claimed to be the processes for converting one language form to another. But there has been little effort to locate these processes in the human learner or to identify the nature of the internal codes. Computer simulation of reading is useful because the similarities in the human reception and…

  20. Evaluation of Computer Simulations for Teaching Apparel Merchandising Concepts.

    Science.gov (United States)

    Jolly, Laura D.; Sisler, Grovalynn

    1988-01-01

    The study developed and evaluated computer simulations for teaching apparel merchandising concepts. Evaluation results indicated that teaching method (computer simulation versus case study) does not significantly affect cognitive learning. Student attitudes varied, however, according to topic (profitable merchandising analysis versus retailing…

  1. A review of computer-aided oral and maxillofacial surgery: planning, simulation and navigation.

    Science.gov (United States)

    Chen, Xiaojun; Xu, Lu; Sun, Yi; Politis, Constantinus

    2016-11-01

    Currently, oral and maxillofacial surgery (OMFS) still poses a significant challenge for surgeons due to the anatomic complexity and limited field of view of the oral cavity. With the great development of computer technologies, computer-aided surgery has been widely used for minimizing the risks and improving the precision of surgery. Areas covered: The major goal of this paper is to provide a comprehensive reference source of current and future development of computer-aided OMFS, including surgical planning, simulation and navigation, for relevant researchers. Expert commentary: Compared with traditional OMFS, computer-aided OMFS overcomes the disadvantage that treatment of the anatomically complex maxillofacial region depends almost exclusively on the experience of the surgeon.

  2. Computer simulation of plasma behavior in open-ended linear theta machines. Scientific report 81-5

    Energy Technology Data Exchange (ETDEWEB)

    Stover, E. K.

    1981-04-01

    Zero-dimensional and one-dimensional fluid plasma computer models have been developed to study the behavior of linear theta pinch plasmas. Computer simulation results generated from these codes are compared with data obtained from two theta pinch experiments so that significant machine plasma behavior can be identified. The experiments examined are a collisional experiment, T_i ≈ 50 eV, n_e ≈ 10¹⁷ cm⁻³, where the plasma mean-free-path was significantly less than the plasma column length, and a hot ion species experiment, T_i ≈ 3 keV, n_e ≈ 10¹⁶ cm⁻³, where the ion mean-free-path was on the order of the plasma column length.

  3. Fabrication Improvement of Cold Forging Hexagonal Nuts by Computational Analysis and Experiment Verification

    Directory of Open Access Journals (Sweden)

    Shao-Yi Hsia

    2015-01-01

    Cold forging has played a critical role in fasteners and has been applied to the automobile industry, construction industry, aerospace industry, and living products, so that cold forging presents opportunities for manufacturing ever more products. By using computer simulation, this study attempts to analyze the process of creating machine parts, such as hexagonal nuts. The DEFORM-3D forming software is applied to analyze the process at various stages in the computer simulation, and a compression test is also used to obtain the flow stress equation, in order to compare the differences between the experimental results and the equation built into the computer simulation software. At the same time, metallography and hardness experiments are utilized to understand the cold forging characteristics of hexagonal nuts. The research results can help machinery businesses understand the forging load and forming conditions at various stages before fastener formation. In addition to aiding proper die design and production planning, the quality of the produced hexagonal nuts would be more stable, promoting industrial competitiveness.

  4. Methodology of modeling and measuring computer architectures for plasma simulations

    Science.gov (United States)

    Wang, L. P. T.

    1977-01-01

    A brief introduction is given to plasma simulation on computers and to the difficulties encountered on currently available machines. Through the use of an analyzing and measuring methodology - SARA, the control flow and data flow of a particle simulation model REM2-1/2D are exemplified. After recursive refinements the total execution time may be greatly shortened and a fully parallel data flow can be obtained. From this data flow, a matched computer architecture or organization can be configured to achieve the computation bound of an application problem. A sequential-type simulation model, an array/pipeline-type simulation model, and a fully parallel simulation model of the code REM2-1/2D are proposed and analyzed. This methodology can be applied to other application problems which have an implicitly parallel nature.

  5. Computer Simulation of a Hardwood Processing Plant

    Science.gov (United States)

    D. Earl Kline; Philip A. Araman

    1990-01-01

    The overall purpose of this paper is to introduce computer simulation as a decision support tool that can be used to provide managers with timely information. A simulation/animation modeling procedure is demonstrated for wood products manufacturing systems. Simulation modeling techniques are used to assist in identifying and solving problems. Animation is used for...

  6. Benchmarking computational fluid dynamics models of lava flow simulation for hazard assessment, forecasting, and risk management

    Science.gov (United States)

    Dietterich, Hannah; Lev, Einat; Chen, Jiangzhi; Richardson, Jacob A.; Cashman, Katharine V.

    2017-01-01

    Numerical simulations of lava flow emplacement are valuable for assessing lava flow hazards, forecasting active flows, designing flow mitigation measures, interpreting past eruptions, and understanding the controls on lava flow behavior. Existing lava flow models vary in simplifying assumptions, physics, dimensionality, and the degree to which they have been validated against analytical solutions, experiments, and natural observations. In order to assess existing models and guide the development of new codes, we conduct a benchmarking study of computational fluid dynamics (CFD) models for lava flow emplacement, including VolcFlow, OpenFOAM, FLOW-3D, COMSOL, and MOLASSES. We model viscous, cooling, and solidifying flows over horizontal planes, sloping surfaces, and into topographic obstacles. We compare model results to physical observations made during well-controlled analogue and molten basalt experiments, and to analytical theory when available. Overall, the models accurately simulate viscous flow with some variability in flow thickness where flows intersect obstacles. OpenFOAM, COMSOL, and FLOW-3D can each reproduce experimental measurements of cooling viscous flows, and OpenFOAM and FLOW-3D simulations with temperature-dependent rheology match results from molten basalt experiments. We assess the goodness-of-fit of the simulation results and the computational cost. Our results guide the selection of numerical simulation codes for different applications, including inferring emplacement conditions of past lava flows, modeling the temporal evolution of ongoing flows during eruption, and probabilistic assessment of lava flow hazard prior to eruption. Finally, we outline potential experiments and desired key observational data from future flows that would extend existing benchmarking data sets.

  7. Computational Enhancements for Direct Numerical Simulations of Statistically Stationary Turbulent Premixed Flames

    KAUST Repository

    Mukhadiyev, Nurzhan

    2017-05-01

    Combustion at extreme conditions, such as a turbulent flame at high Karlovitz and Reynolds numbers, is still a vast and uncertain field for researchers. Direct numerical simulation of a turbulent flame is a superior tool to unravel detailed information that is not accessible to the most sophisticated state-of-the-art experiments. However, the computational cost of such simulations remains a challenge even for modern supercomputers, as the physical size, the level of turbulence intensity, and the chemical complexity of the problems continue to increase. As a result, there is a strong demand for computational cost reduction methods as well as for acceleration of existing methods. The main scope of this work was the development of computational and numerical tools for high-fidelity direct numerical simulations of premixed planar flames interacting with turbulence. The first part of this work was the development of the KAUST Adaptive Reacting Flow Solver (KARFS). KARFS is a high-order compressible reacting flow solver using detailed chemical kinetics mechanisms; it is capable of running on various types of heterogeneous computational architectures. In this work, it was shown that KARFS is capable of running efficiently on both CPU and GPU. The second part of this work was numerical tools for direct numerical simulations of planar premixed flames, such as linear turbulence forcing and dynamic inlet control. Previous DNS of premixed turbulent flames injected velocity fluctuations at an inlet. Turbulence injected at the inlet decayed significantly before reaching the flame, which made it necessary to inject stronger fluctuations than needed. A solution for this issue was to maintain turbulence strength on the way to the flame using turbulence forcing. Therefore, linear turbulence forcing was implemented into KARFS to enhance turbulence intensity. The linear turbulence forcing developed previously by other groups was corrected with a net added momentum removal mechanism to prevent mean

  8. Magnetic insulation of high voltages in vacuum: comparison of experiment with simulations

    International Nuclear Information System (INIS)

    Bergeron, K.D.; Poukey, J.W.; Di Capua, M.S.; Pellinen, D.G.

    1978-01-01

    Experiments on long magnetically insulated vacuum transmission lines at the 700 kV/cm level have been analyzed by comparing with computer simulations. The particle-in-cell code used is 2-D, time-dependent and, like the experiments, coaxial cylindrical. Comparison could be made with current monitors at three intermediate longitudinal positions at both the outer electrode (for total current) and the inner electrode (for boundary current). The overall agreement was quite good, though the measured boundary current was consistently about 22 percent lower than the simulation values. In addition, a detailed comparison of the radial variation of several time-averaged quantities from the simulation was made with the predictions of the parapotential theory. It was found that the electric potential was very similar in the two cases, but the charge and current densities were not

  9. Using Computer Simulation Method to Improve Throughput of Production Systems by Buffers and Workers Allocation

    Directory of Open Access Journals (Sweden)

    Kłos Sławomir

    2015-12-01

    This paper proposes the application of computer simulation methods to support decision making regarding intermediate buffer allocations in a series-parallel production line. The simulation model of the production system is based on a real example of a manufacturing company working in the automotive industry. Simulation experiments were conducted for different allocations of buffer capacities and different numbers of employees. The production system consists of three technological operations with intermediate buffers between each operation. The technological operations are carried out using machines, and every machine can be operated by one worker. Multi-work in the production system is available (one operator operates several machines). On the basis of the simulation experiments, the relationship between system throughput, buffer allocation and the number of employees is analyzed. Increasing the buffer capacity results in an increase in the average product life span. Therefore, in the article a new index is proposed that includes the throughput of the manufacturing system and the product life span. Simulation experiments were performed for different configurations of technological operations.
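
    A minimal discrete-event sketch of such a line — three operations separated by finite buffers — is shown below using the third-party SimPy package; the processing times, buffer capacities and shift length are invented placeholders for the kinds of parameters varied in the experiments.

```python
import simpy

PROC_TIMES = (4.0, 5.0, 3.0)      # minutes per part at each operation (invented)
BUFFER_CAP = (2, 2)               # intermediate buffer capacities to vary
finished = 0

def machine(env, inbuf, outbuf, proc_time):
    global finished
    while True:
        if inbuf is not None:
            yield inbuf.get()             # wait for a part from upstream
        yield env.timeout(proc_time)      # process the part
        if outbuf is not None:
            yield outbuf.put(1)           # blocks while the buffer is full
        else:
            finished += 1

env = simpy.Environment()
b1 = simpy.Store(env, capacity=BUFFER_CAP[0])
b2 = simpy.Store(env, capacity=BUFFER_CAP[1])
env.process(machine(env, None, b1, PROC_TIMES[0]))
env.process(machine(env, b1, b2, PROC_TIMES[1]))
env.process(machine(env, b2, None, PROC_TIMES[2]))
env.run(until=8 * 60.0)                   # one 8-hour shift
print("parts finished per shift:", finished)
```

    Rerunning such a model over a grid of buffer capacities and worker counts yields the throughput relationships the study analyzes.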

  10. Computed radiography simulation using the Monte Carlo code MCNPX

    International Nuclear Information System (INIS)

    Correa, S.C.A.; Souza, E.M.; Silva, A.X.; Lopes, R.T.

    2009-01-01

    Simulating x-ray images has been of great interest in recent years as it makes possible an analysis of how x-ray images are affected owing to relevant operating parameters. In this paper, a procedure for simulating computed radiographic images using the Monte Carlo code MCNPX is proposed. The sensitivity curve of the BaFBr image plate detector as well as the characteristic noise of a 16-bit computed radiography system were considered during the methodology's development. The results obtained confirm that the proposed procedure for simulating computed radiographic images is satisfactory, as it allows obtaining results comparable with experimental data. (author)

  11. Computed radiography simulation using the Monte Carlo code MCNPX

    Energy Technology Data Exchange (ETDEWEB)

    Correa, S.C.A. [Programa de Engenharia Nuclear/COPPE, Universidade Federal do Rio de Janeiro, Ilha do Fundao, Caixa Postal 68509, 21945-970, Rio de Janeiro, RJ (Brazil); Centro Universitario Estadual da Zona Oeste (CCMAT)/UEZO, Av. Manuel Caldeira de Alvarenga, 1203, Campo Grande, 23070-200, Rio de Janeiro, RJ (Brazil); Souza, E.M. [Programa de Engenharia Nuclear/COPPE, Universidade Federal do Rio de Janeiro, Ilha do Fundao, Caixa Postal 68509, 21945-970, Rio de Janeiro, RJ (Brazil); Silva, A.X., E-mail: ademir@con.ufrj.b [PEN/COPPE-DNC/Poli CT, Universidade Federal do Rio de Janeiro, Ilha do Fundao, Caixa Postal 68509, 21945-970, Rio de Janeiro, RJ (Brazil); Cassiano, D.H. [Instituto de Radioprotecao e Dosimetria/CNEN Av. Salvador Allende, s/n, Recreio, 22780-160, Rio de Janeiro, RJ (Brazil); Lopes, R.T. [Programa de Engenharia Nuclear/COPPE, Universidade Federal do Rio de Janeiro, Ilha do Fundao, Caixa Postal 68509, 21945-970, Rio de Janeiro, RJ (Brazil)

    2010-09-15

    Simulating X-ray images has been of great interest in recent years as it makes possible an analysis of how X-ray images are affected owing to relevant operating parameters. In this paper, a procedure for simulating computed radiographic images using the Monte Carlo code MCNPX is proposed. The sensitivity curve of the BaFBr image plate detector as well as the characteristic noise of a 16-bit computed radiography system were considered during the methodology's development. The results obtained confirm that the proposed procedure for simulating computed radiographic images is satisfactory, as it allows obtaining results comparable with experimental data.

  12. Proceedings of the meeting on large scale computer simulation research

    International Nuclear Information System (INIS)

    2004-04-01

    The meeting to summarize the collaboration activities for FY2003 on the Large Scale Computer Simulation Research was held January 15-16, 2004 at Theory and Computer Simulation Research Center, National Institute for Fusion Science. Recent simulation results, methodologies and other related topics were presented. (author)

  13. Event-by-Event Simulation of the Hanbury Brown-Twiss Experiment with Coherent Light

    NARCIS (Netherlands)

    Jin, F.; De Raedt, H.; Michielsen, K.

    We present a computer simulation model for the Hanbury Brown-Twiss experiment that is entirely particle-based and reproduces the results of wave theory. The model is solely based on experimental facts, satisfies Einstein's criterion of local causality and does not require knowledge of the solution

  14. Computational simulation in architectural and environmental acoustics methods and applications of wave-based computation

    CERN Document Server

    Sakamoto, Shinichi; Otsuru, Toru

    2014-01-01

    This book reviews a variety of methods for wave-based acoustic simulation and recent applications to architectural and environmental acoustic problems. Following an introduction providing an overview of computational simulation of sound environment, the book is in two parts: four chapters on methods and four chapters on applications. The first part explains the fundamentals and advanced techniques for three popular methods, namely, the finite-difference time-domain method, the finite element method, and the boundary element method, as well as alternative time-domain methods. The second part demonstrates various applications to room acoustics simulation, noise propagation simulation, acoustic property simulation for building components, and auralization. This book is a valuable reference that covers the state of the art in computational simulation for architectural and environmental acoustics.  

  15. A computer code to simulate X-ray imaging techniques

    International Nuclear Information System (INIS)

    Duvauchelle, Philippe; Freud, Nicolas; Kaftandjian, Valerie; Babot, Daniel

    2000-01-01

    A computer code was developed to simulate the operation of radiographic, radioscopic or tomographic devices. The simulation is based on ray-tracing techniques and on the X-ray attenuation law. The use of computer-aided drawing (CAD) models enables simulations to be carried out with complex three-dimensional (3D) objects and the geometry of every component of the imaging chain, from the source to the detector, can be defined. Geometric unsharpness, for example, can be easily taken into account, even in complex configurations. Automatic translations or rotations of the object can be performed to simulate radioscopic or tomographic image acquisition. Simulations can be carried out with monochromatic or polychromatic beam spectra. This feature enables, for example, the beam hardening phenomenon to be dealt with or dual energy imaging techniques to be studied. The simulation principle is completely deterministic and consequently the computed images present no photon noise. Nevertheless, the variance of the signal associated with each pixel of the detector can be determined, which enables contrast-to-noise ratio (CNR) maps to be computed, in order to predict quantitatively the detectability of defects in the inspected object. The CNR is a relevant indicator for optimizing the experimental parameters. This paper provides several examples of simulated images that illustrate some of the rich possibilities offered by our software. Depending on the simulation type, the computation time order of magnitude can vary from 0.1 s (simple radiographic projection) up to several hours (3D tomography) on a PC, with a 400 MHz microprocessor. Our simulation tool proves to be useful in developing new specific applications, in choosing the most suitable components when designing a new testing chain, and in saving time by reducing the number of experimental tests
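
    The deterministic core of such a code is the attenuation law I = I0·exp(−Σ μ·d) evaluated along each ray, with a noise model applied afterwards to estimate the CNR. Below is a toy parallel-beam projection through a voxelized object; all coefficients and the geometry are invented, and this is not the authors' code.

```python
import numpy as np

# Voxelized object: a uniform block with a denser inclusion ("defect").
# Attenuation coefficients are illustrative values at a fixed energy.
mu = np.full((64, 64, 64), 0.2)        # 1/cm, background material
mu[28:36, 28:36, 28:36] = 0.5          # hypothetical inclusion
voxel_cm = 0.1
I0 = 1.0e6                             # incident photons per detector pixel

# Parallel-beam projection along axis 0: Beer-Lambert on the line integrals.
path = mu.sum(axis=0) * voxel_cm
image = I0 * np.exp(-path)             # deterministic, photon-noise-free image

# Contrast-to-noise ratio assuming Poisson statistics (variance = mean).
signal = image[28:36, 28:36].mean()    # pixels behind the inclusion
background = image[:16, :16].mean()
cnr = abs(signal - background) / np.sqrt(background)
```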

  16. A computer code to simulate X-ray imaging techniques

    Energy Technology Data Exchange (ETDEWEB)

    Duvauchelle, Philippe E-mail: philippe.duvauchelle@insa-lyon.fr; Freud, Nicolas; Kaftandjian, Valerie; Babot, Daniel

    2000-09-01

    A computer code was developed to simulate the operation of radiographic, radioscopic or tomographic devices. The simulation is based on ray-tracing techniques and on the X-ray attenuation law. The use of computer-aided drawing (CAD) models enables simulations to be carried out with complex three-dimensional (3D) objects and the geometry of every component of the imaging chain, from the source to the detector, can be defined. Geometric unsharpness, for example, can be easily taken into account, even in complex configurations. Automatic translations or rotations of the object can be performed to simulate radioscopic or tomographic image acquisition. Simulations can be carried out with monochromatic or polychromatic beam spectra. This feature enables, for example, the beam hardening phenomenon to be dealt with or dual energy imaging techniques to be studied. The simulation principle is completely deterministic and consequently the computed images present no photon noise. Nevertheless, the variance of the signal associated with each pixel of the detector can be determined, which enables contrast-to-noise ratio (CNR) maps to be computed, in order to predict quantitatively the detectability of defects in the inspected object. The CNR is a relevant indicator for optimizing the experimental parameters. This paper provides several examples of simulated images that illustrate some of the rich possibilities offered by our software. Depending on the simulation type, the computation time order of magnitude can vary from 0.1 s (simple radiographic projection) up to several hours (3D tomography) on a PC, with a 400 MHz microprocessor. Our simulation tool proves to be useful in developing new specific applications, in choosing the most suitable components when designing a new testing chain, and in saving time by reducing the number of experimental tests.

  17. Computational Fluid Dynamics (CFD) simulations of a Heisenberg Vortex Tube

    Science.gov (United States)

    Bunge, Carl; Sitaraman, Hariswaran; Leachman, Jake

    2017-11-01

    A 3D Computational Fluid Dynamics (CFD) simulation of a Heisenberg Vortex Tube (HVT) is performed to estimate cooling potential with cryogenic hydrogen. The main mechanism driving operation of the vortex tube is the use of fluid power for enthalpy streaming in a highly turbulent swirl in a dual-outlet tube. This enthalpy streaming creates a temperature separation between the outer and inner regions of the flow. Use of a catalyst on the peripheral wall of the centrifuge enables endothermic conversion of para- to ortho-hydrogen to aid primary cooling. A κ-ε turbulence model is used with a cryogenic, non-ideal equation of state, and para-orthohydrogen species evolution. The simulations are validated with experiments, and strategies for parametric optimization of this device are presented.

  18. Innovation of the computer system for the WWER-440 simulator

    International Nuclear Information System (INIS)

    Schrumpf, L.

    1988-01-01

    The configuration of the WWER-440 simulator computer system consists of four SMEP computers. The basic data processing unit consists of two interlinked SM 52/11.M1 computers with 1 MB of main memory. This part of the computer system of the simulator controls the operation of the entire simulator, processes the programs of technology behavior simulation, of the unit information system and of other special systems, guarantees program support and the operation of the instructor's console. An SM 52/11 computer with 256 kB of main memory is connected to each unit. It is used as a communication unit for data transmission using the DASIO 600 interface. Semigraphic color displays are based on the microprocessor modules of the SM 50/40 and SM 53/10 kit supplemented with a modified TESLA COLOR 110 ST TV receiver. (J.B.). 1 fig

  19. PhET simulator and multiplication table teaching in basic education: experience report

    Directory of Open Access Journals (Sweden)

    Lilian de Fatima Oliveira Falchi

    2018-06-01

    This research was developed with the purpose of encouraging elementary school teachers to use the PhET simulator with their students as a means to enhance learning of the multiplication table. The simulator is a learning facilitator that provides simulations for teaching science, mathematics, physics and chemistry. This paper reports on the use of the simulator with students of the third year of elementary school, applied under the joint supervision of the researchers and the schoolteachers. The use of the simulator with the students occurred in September and October of 2016 with weekly classes. These were practical classes in the school’s computer lab, where students were able to explore the simulator and advance through its phases according to their performance. This experience helped to understand the need to transform education and incorporate digital technology into teaching/learning to motivate and facilitate learning.

  20. Computer simulations of the activity of RND efflux pumps.

    Science.gov (United States)

    Vargiu, Attilio Vittorio; Ramaswamy, Venkata Krishnan; Malloci, Giuliano; Malvacio, Ivana; Atzori, Alessio; Ruggerone, Paolo

    2018-01-31

    The putative mechanism by which bacterial RND-type multidrug efflux pumps recognize and transport their substrates is a complex and fascinating enigma of structural biology. How a single protein can recognize a huge number of unrelated compounds and transport them through one or just a few mechanisms is an amazing feature not yet completely unveiled. The appearance of cooperativity further complicates the understanding of structure-dynamics-activity relationships in these complex machineries. Experimental techniques may have limited access to the molecular determinants and to the energetics of key processes regulating the activity of these pumps. Computer simulations are a complementary approach that can help unveil these features and inspire new experiments. Here we review recent computational studies that addressed the various molecular processes regulating the activity of RND efflux pumps.

  1. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    Computer Based Modelling and Simulation - Modelling Deterministic Systems. N K Srinivasan. General Article, Resonance – Journal of Science Education, Volume 6, Issue 3, March 2001, pp 46-54.

  2. Computer Simulation (Microcultures): An Effective Model for Multicultural Education.

    Science.gov (United States)

    Nelson, Jorge O.

    This paper presents a rationale for using high-fidelity computer simulation in planning for and implementing effective multicultural education strategies. Using computer simulation, educators can begin to understand and plan for the concept of cultural sensitivity in delivering instruction. The model promises to emphasize teachers' understanding…

  3. Computer simulation in cell radiobiology

    International Nuclear Information System (INIS)

    Yakovlev, A.Y.; Zorin, A.V.

    1988-01-01

    This research monograph demonstrates possible ways of using stochastic simulation to explore cell kinetics, with emphasis on radiobiological effects. The in vitro kinetics of normal and irradiated cells is the main subject, but some approaches to the simulation of controlled cell systems are considered as well, with the epithelium of the small intestine in mice taken as a case in point. Of particular interest is the evaluation of simulation modelling as a tool for gaining insight into biological processes and hence for drawing new inferences from concrete experimental data concerning regularities in the response of cell populations to irradiation. The book is intended to stimulate interest among computer science specialists in developing new, more efficient means for the simulation of cell systems and to help radiobiologists in interpreting the experimental data.

  4. Tutorial: Parallel Computing of Simulation Models for Risk Analysis.

    Science.gov (United States)

    Reilly, Allison C; Staid, Andrea; Gao, Michael; Guikema, Seth D

    2016-10-01

    Simulation models are widely used in risk analysis to study the effects of uncertainties on outcomes of interest in complex problems. Often, these models are computationally complex and time consuming to run. This latter point may be at odds with time-sensitive evaluations or may limit the number of parameters that are considered. In this article, we give an introductory tutorial focused on parallelizing simulation code to better leverage modern computing hardware, enabling risk analysts to better utilize simulation-based methods for quantifying uncertainty in practice. This article is aimed primarily at risk analysts who use simulation methods but do not yet utilize parallelization to decrease the computational burden of these models. The discussion is focused on conceptual aspects of embarrassingly parallel computer code and software considerations. Two complementary examples are shown using the languages MATLAB and R. A brief discussion of hardware considerations is located in the Appendix.
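
    The tutorial's own examples are in MATLAB and R; the same embarrassingly parallel pattern can be sketched in Python (an illustrative stand-in with a placeholder model, not the article's code), where independent replications share no state and are simply mapped across worker processes.

        import numpy as np
        from multiprocessing import Pool

        def one_run(seed):
            # One independent replication of a (placeholder) simulation model,
            # seeded explicitly so each run is reproducible.
            rng = np.random.default_rng(seed)
            x = rng.normal(size=100_000)
            return np.exp(0.1 * x).mean()

        if __name__ == "__main__":
            with Pool() as pool:              # one worker per core by default
                results = pool.map(one_run, range(1000))
            print(np.mean(results), np.std(results, ddof=1))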

  5. Computer simulation of human motion in sports biomechanics.

    Science.gov (United States)

    Vaughan, C L

    1984-01-01

    This chapter has covered some important aspects of the computer simulation of human motion in sports biomechanics. First, the definition and the advantages and limitations of computer simulation were discussed; second, research on various sporting activities was reviewed. These activities included basic movements, aquatic sports, track and field athletics, winter sports, gymnastics, and striking sports. This list was not exhaustive and certain material has, of necessity, been omitted. However, it was felt that a sufficiently broad and interesting range of activities was chosen to illustrate both the advantages and the pitfalls of simulation. It is almost a decade since Miller [53] wrote a review chapter similar to this one. One might be tempted to say that things have changed radically since then--that computer simulation is now a widely accepted and readily applied research tool in sports biomechanics. This is simply not true, however. Biomechanics researchers still tend to emphasize the descriptive type of study, often unfortunately, when a little theoretical explanation would have been more helpful [29]. What will the next decade bring? Of one thing we can be certain: the power of computers, particularly the readily accessible and portable microcomputer, will expand beyond all recognition. The memory and storage capacities will increase dramatically on the hardware side, and on the software side the trend will be toward "user-friendliness." It is likely that a number of software simulation packages designed specifically for studying human motion [31, 96] will be extensively tested and could gain wide acceptance in the biomechanics research community. Nevertheless, a familiarity with Newtonian and Lagrangian mechanics, optimization theory, and computers in general, as well as practical biomechanical insight, will still be a prerequisite for successful simulation models of human motion. Above all, the biomechanics researcher will still have to bear in mind that

  6. Simulation of biological ion channels with technology computer-aided design.

    Science.gov (United States)

    Pandey, Santosh; Bortei-Doku, Akwete; White, Marvin H

    2007-01-01

    Computer simulations of realistic ion channel structures have always been challenging and a subject of rigorous study. Simulations based on continuum electrostatics have proven to be computationally cheap and reasonably accurate in predicting a channel's behavior. In this paper we discuss the use of a device simulator, SILVACO, to build a solid-state model for the KcsA channel and study its steady-state response. SILVACO is a well-established program, typically used by electrical engineers to simulate the process flow and electrical characteristics of solid-state devices. By employing this simulation program, we present an alternative computing platform for performing ion channel simulations, besides the known methods of writing code in programming languages. With the ease of varying the different parameters in the channel's vestibule and the ability to incorporate surface charges, we show the wide-ranging possibilities of using a device simulator for ion channel simulations. Our simulated results closely agree with the experimental data, validating our model.

  7. Computational algorithms for simulations in atmospheric optics.

    Science.gov (United States)

    Konyaev, P A; Lukin, V P

    2016-04-20

    A computer simulation technique for atmospheric and adaptive optics based on parallel programming is discussed. A parallel propagation algorithm is designed and a modified spectral-phase method for computer generation of 2D time-variant random fields is developed. Temporal power spectra of Laguerre-Gaussian beam fluctuations are considered as an example to illustrate the applications discussed. Implementation of the proposed algorithms using Intel MKL and IPP libraries and NVIDIA CUDA technology is shown to be very fast and accurate. The hardware system for the computer simulation is an off-the-shelf desktop with an Intel Core i7-4790K CPU operating at a turbo-speed frequency up to 5 GHz and an NVIDIA GeForce GTX-960 graphics accelerator with 1024 1.5 GHz processors.
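
    The spectral-phase family of generators can be sketched generically (simplified Python under assumed conventions; normalization constants and the paper's time-variant machinery are omitted): white complex Gaussian noise is shaped by the square root of the target power spectral density and transformed back to the spatial domain.

        import numpy as np

        def random_field_2d(n, dx, psd_func, seed=None):
            # Fourier-filtering recipe for a 2D random field with a given PSD.
            rng = np.random.default_rng(seed)
            f = np.fft.fftfreq(n, d=dx)                  # spatial frequencies
            kx, ky = np.meshgrid(f, f, indexing="ij")
            k = np.hypot(kx, ky)
            psd = psd_func(k)
            psd[0, 0] = 0.0                              # remove the DC term
            noise = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
            return np.real(np.fft.ifft2(noise * np.sqrt(psd)))

        # Kolmogorov-like k^(-11/3) spectrum, as used for turbulent phase screens:
        screen = random_field_2d(256, 0.01,
                                 lambda k: np.where(k > 0, k, np.inf) ** (-11 / 3))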

  8. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    Computer-based modelling and simulation can now be carried out on personal computers (PCs) with low-cost software packages and tools, and can serve as a useful learning experience through student projects. A numerical example considered in the article is calculating the velocity of a trainer aircraft.

  9. SiMon: Simulation Monitor for Computational Astrophysics

    Science.gov (United States)

    Xuran Qian, Penny; Cai, Maxwell Xu; Portegies Zwart, Simon; Zhu, Ming

    2017-09-01

    Scientific discovery via numerical simulations is important in modern astrophysics. This relatively new branch of astrophysics has become possible due to the development of reliable numerical algorithms and the high performance of modern computing technologies. These enable the analysis of large collections of observational data and the acquisition of new data via simulations at unprecedented accuracy and resolution. Ideally, simulations run until they reach some pre-determined termination condition, but often other factors cause extensive numerical approaches to break down at an earlier stage, with processes interrupted by unexpected events in the software or the hardware. In such cases, the scientist handles the interrupt manually, which is time-consuming and prone to errors. We present the Simulation Monitor (SiMon) to automate the farming of large and extensive simulation processes. Our method is lightweight, fully automates the entire workflow management, operates concurrently across multiple platforms and can be installed in user space. Inspired by the process of crop farming, we perceive each simulation as a crop in the field, and running a simulation becomes analogous to growing a crop. With the development of SiMon we take care of the technical aspects of simulation management. The initial package was developed for extensive parameter searches in numerical simulations, but it turns out to work equally well for automating the computational processing and reduction of observational data.

  10. Optimising electron microscopy experiment through electron optics simulation

    Energy Technology Data Exchange (ETDEWEB)

    Kubo, Y. [CEMES-CNRS, 29 Rue Jeanne Marvig, 31055 Toulouse (France); Hitachi High-Technologies Corporation, 882, Ichige, Hitachinaka, Ibaraki 312-8504 (Japan); Gatel, C.; Snoeck, E. [CEMES-CNRS, 29 Rue Jeanne Marvig, 31055 Toulouse (France); Houdellier, F., E-mail: florent.houdellier@cemes.fr [CEMES-CNRS, 29 Rue Jeanne Marvig, 31055 Toulouse (France)

    2017-04-15

    We developed a new type of electron trajectory simulation inside a complete model of a modern transmission electron microscope (TEM). Our model incorporates the precise, real design of each element constituting a TEM, i.e. the field emission (FE) cathode, the extraction optics and acceleration stages of a 300 kV cold field emission gun, the illumination lenses, the objective lens, and the intermediate and projection lenses. Full trajectories can be computed using magnetically saturated or non-saturated round lenses, magnetic deflectors and even non-cylindrically-symmetric elements like the electrostatic biprism. This multi-scale model gathers nanometer-size components (the FE tip) with parts a meter in length (the illumination and projection systems). We demonstrate that non-trivial TEM experiments requiring specific and complex optical configurations can be simulated and optimized prior to any experiment using such a model. We show that all the currents set in all optical elements of the simulated column can be implemented in the real column (I2TEM in CEMES) and used as the starting alignment for the requested experiment. We argue that combining such complete electron trajectory simulations in the whole TEM column with automatic optimization of the microscope parameters for optimal experimental data (images, diffraction, spectra) drastically simplifies the implementation of complex experiments in TEM and will facilitate advanced use of the electron microscope in the near future. - Highlights: • Using dedicated electron optics software, we calculate full electron trajectories inside a modern transmission electron microscope. • We have determined how to deal with multi-scale electron optics elements like the high-voltage cold field emission source. • We have succeeded in modeling both weak and strong magnetic lenses, whether in saturated or unsaturated conditions, as well as the electrostatic biprism and magnetic deflectors. • We have applied this model

  11. Optimising electron microscopy experiment through electron optics simulation

    International Nuclear Information System (INIS)

    Kubo, Y.; Gatel, C.; Snoeck, E.; Houdellier, F.

    2017-01-01

    We developed a new type of electron trajectory simulation inside a complete model of a modern transmission electron microscope (TEM). Our model incorporates the precise, real design of each element constituting a TEM, i.e. the field emission (FE) cathode, the extraction optics and acceleration stages of a 300 kV cold field emission gun, the illumination lenses, the objective lens, and the intermediate and projection lenses. Full trajectories can be computed using magnetically saturated or non-saturated round lenses, magnetic deflectors and even non-cylindrically-symmetric elements like the electrostatic biprism. This multi-scale model gathers nanometer-size components (the FE tip) with parts a meter in length (the illumination and projection systems). We demonstrate that non-trivial TEM experiments requiring specific and complex optical configurations can be simulated and optimized prior to any experiment using such a model. We show that all the currents set in all optical elements of the simulated column can be implemented in the real column (I2TEM in CEMES) and used as the starting alignment for the requested experiment. We argue that combining such complete electron trajectory simulations in the whole TEM column with automatic optimization of the microscope parameters for optimal experimental data (images, diffraction, spectra) drastically simplifies the implementation of complex experiments in TEM and will facilitate advanced use of the electron microscope in the near future. - Highlights: • Using dedicated electron optics software, we calculate full electron trajectories inside a modern transmission electron microscope. • We have determined how to deal with multi-scale electron optics elements like the high-voltage cold field emission source. • We have succeeded in modeling both weak and strong magnetic lenses, whether in saturated or unsaturated conditions, as well as the electrostatic biprism and magnetic deflectors. • We have applied this model

  12. Simulated and Virtual Science Laboratory Experiments: Improving Critical Thinking and Higher-Order Learning Skills

    Science.gov (United States)

    Simon, Nicole A.

    Virtual laboratory experiments using interactive computer simulations are not yet employed as viable alternatives to the laboratory science curriculum at extensive enough rates within higher education. Rote traditional lab experiments are currently the norm and are not addressing inquiry, Critical Thinking, and cognition throughout the laboratory experience, linking with educational technologies (Pyatt & Sims, 2007; 2011; Trundle & Bell, 2010). A causal-comparative quantitative study was conducted with 150 learners enrolled at a two-year community college, to determine the effects of simulation laboratory experiments on Higher-Order Learning, Critical Thinking Skills, and Cognitive Load. The treatment population used simulated experiments, while the non-treatment sections performed traditional expository experiments. A comparison was made using the Revised Two-Factor Study Process survey, the Motivated Strategies for Learning Questionnaire, and the Scientific Attitude Inventory survey, using a Repeated Measures ANOVA test for treatment or non-treatment. A main effect of simulated laboratory experiments was found for both Higher-Order Learning [F(1, 148) = 30.32, p = 0.00, η² = 0.12] and Critical Thinking Skills [F(1, 148) = 14.64, p = 0.00, η² = 0.17], such that simulations showed greater increases than traditional experiments. Post-lab treatment group self-reports indicated increased marginal means (+4.86) in Higher-Order Learning and Critical Thinking Skills, compared to the non-treatment group (+4.71). Simulations also improved the scientific skills and mastery of basic scientific subject matter. It is recommended that additional research recognize that learners' Critical Thinking Skills change due to different instructional methodologies that occur throughout a semester.

  13. Simulation of hydrogen deflagration experiment – Benchmark exercise with lumped-parameter codes

    Energy Technology Data Exchange (ETDEWEB)

    Kljenak, Ivo, E-mail: ivo.kljenak@ijs.si [Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Kuznetsov, Mikhail, E-mail: mike.kuznetsov@kit.edu [Karlsruhe Institute of Technology, Kaiserstraße 12, 76131 Karlsruhe (Germany); Kostka, Pal, E-mail: kostka@nubiki.hu [NUBIKI Nuclear Safety Research Institute, Konkoly-Thege Miklós út 29-33, 1121 Budapest (Hungary); Kubišova, Lubica, E-mail: lubica.kubisova@ujd.gov.sk [Nuclear Regulatory Authority of the Slovak Republic, Bajkalská 27, 82007 Bratislava (Slovakia); Maltsev, Mikhail, E-mail: maltsev_MB@aep.ru [JSC Atomenergoproekt, 1, st. Podolskykh Kursantov, Moscow (Russian Federation); Manzini, Giovanni, E-mail: giovanni.manzini@rse-web.it [Ricerca sul Sistema Energetico, Via Rubattino 54, 20134 Milano (Italy); Povilaitis, Mantas, E-mail: mantas.p@mail.lei.lt [Lithuania Energy Institute, Breslaujos g.3, 44403 Kaunas (Lithuania)

    2015-03-15

    Highlights: • Blind and open simulations of a hydrogen combustion experiment in a large-scale containment-like facility with different lumped-parameter codes. • Simulation of axial as well as radial flame propagation. • Confirmation of the adequacy of lumped-parameter codes for safety analyses of actual nuclear power plants. - Abstract: An experiment on hydrogen deflagration (Upward Flame Propagation Experiment – UFPE) was proposed by the Jozef Stefan Institute (Slovenia) and performed in the HYKA A2 facility at the Karlsruhe Institute of Technology (Germany). The experimental results were used to organize a benchmark exercise for lumped-parameter codes. Six organizations (JSI, AEP, LEI, NUBIKI, RSE and UJD SR) participated in the benchmark exercise, using altogether four different computer codes: ANGAR, ASTEC, COCOSYS and ECART. Both blind and open simulations were performed. In general, all the codes provided satisfactory results for the pressure increase, whereas the temperature results show a wider dispersion. Concerning the flame axial and radial velocities, the results may be considered satisfactory, given the inherent simplification of the lumped-parameter description compared to the local instantaneous description.

  14. Simulation of hydrogen deflagration experiment – Benchmark exercise with lumped-parameter codes

    International Nuclear Information System (INIS)

    Kljenak, Ivo; Kuznetsov, Mikhail; Kostka, Pal; Kubišova, Lubica; Maltsev, Mikhail; Manzini, Giovanni; Povilaitis, Mantas

    2015-01-01

    Highlights: • Blind and open simulations of a hydrogen combustion experiment in a large-scale containment-like facility with different lumped-parameter codes. • Simulation of axial as well as radial flame propagation. • Confirmation of the adequacy of lumped-parameter codes for safety analyses of actual nuclear power plants. - Abstract: An experiment on hydrogen deflagration (Upward Flame Propagation Experiment – UFPE) was proposed by the Jozef Stefan Institute (Slovenia) and performed in the HYKA A2 facility at the Karlsruhe Institute of Technology (Germany). The experimental results were used to organize a benchmark exercise for lumped-parameter codes. Six organizations (JSI, AEP, LEI, NUBIKI, RSE and UJD SR) participated in the benchmark exercise, using altogether four different computer codes: ANGAR, ASTEC, COCOSYS and ECART. Both blind and open simulations were performed. In general, all the codes provided satisfactory results for the pressure increase, whereas the temperature results show a wider dispersion. Concerning the flame axial and radial velocities, the results may be considered satisfactory, given the inherent simplification of the lumped-parameter description compared to the local instantaneous description.

  15. [Animal experimentation, computer simulation and surgical research].

    Science.gov (United States)

    Carpentier, Alain

    2009-11-01

    We live in a digital world. In medicine, computers are providing new tools for data collection, imaging, and treatment. During research and development of complex technologies and devices such as artificial hearts, computer simulation can provide more reliable information than experimentation on large animals. In these specific settings, animal experimentation should serve more to validate computer models of complex devices than to demonstrate their reliability.

  16. CPU SIM: A Computer Simulator for Use in an Introductory Computer Organization-Architecture Class.

    Science.gov (United States)

    Skrein, Dale

    1994-01-01

    CPU SIM, an interactive low-level computer simulation package that runs on the Macintosh computer, is described. The program is designed for instructional use in the first or second year of undergraduate computer science, to teach various features of typical computer organization through hands-on exercises. (MSE)

  17. A Computational Framework for Efficient Low Temperature Plasma Simulations

    Science.gov (United States)

    Verma, Abhishek Kumar; Venkattraman, Ayyaswamy

    2016-10-01

    Over the past years, scientific computing has emerged as an essential tool for the investigation and prediction of low temperature plasma (LTP) applications, which include electronics, nanomaterial synthesis, metamaterials, etc. To further explore LTP behavior with greater fidelity, we present a computational toolbox developed to perform LTP simulations. This framework allows us to enhance our understanding of multiscale plasma phenomena using high performance computing tools, mainly based on the OpenFOAM FVM distribution. Although aimed at microplasma simulations, the modular framework is able to perform multiscale, multiphysics simulations of physical systems comprising LTPs. Salient introductory features include the capability to perform parallel, 3D simulations of LTP applications on unstructured meshes. Performance of the solver is tested with numerical results assessing the accuracy and efficiency of benchmark problems in microdischarge devices. Numerical simulation of a microplasma reactor at atmospheric pressure with hemispherical dielectric-coated electrodes will be discussed, providing an overview of the applicability and future scope of this framework.

  18. MPPhys—A many-particle simulation package for computational physics education

    Science.gov (United States)

    Müller, Thomas

    2014-03-01

    In a first course on classical mechanics, elementary physical processes like elastic two-body collisions, the mass-spring model, or the gravitational two-body problem are discussed in detail. The continuation to many-body systems, however, is deferred to graduate courses, although the underlying equations of motion are essentially the same and although there is a strong motivation, for high-school students in particular, because of the use of particle systems in computer games. The missing link between the simple and the more complex problem is a basic introduction to solving the equations of motion numerically, which can be illustrated by means of the Euler method. The many-particle physics simulation package MPPhys offers a platform to experiment with simple particle simulations. The aim is to give a basic idea of how to implement many-particle simulations and how simulation and visualization can be combined for interactive visual explorations. Catalogue identifier: AERR_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AERR_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 111327 No. of bytes in distributed program, including test data, etc.: 608411 Distribution format: tar.gz Programming language: C++, OpenGL, GLSL, OpenCL. Computer: Linux and Windows platforms with OpenGL support. Operating system: Linux and Windows. RAM: Source code 4.5 MB; complete package 242 MB. Classification: 14, 16.9. External routines: OpenGL, OpenCL. Nature of problem: Integrate N-body simulations, mass-spring models. Solution method: Numerical integration of N-body simulations; 3D rendering via OpenGL. Running time: Problem dependent
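
    The Euler method named above as the missing link fits in a few lines. A minimal sketch of one explicit Euler step for a softened gravitational N-body system (in Python for brevity; MPPhys itself is C++/OpenGL/OpenCL):

        import numpy as np

        def euler_step(pos, vel, mass, dt, G=1.0, eps=1e-3):
            # pos, vel: (N, 3) arrays; mass: (N,). eps softens close encounters.
            d = pos[None, :, :] - pos[:, None, :]      # d[i, j] = r_j - r_i
            r2 = np.sum(d * d, axis=-1) + eps**2
            inv_r3 = r2 ** -1.5
            np.fill_diagonal(inv_r3, 0.0)              # no self-interaction
            acc = G * np.sum(mass[None, :, None] * d * inv_r3[:, :, None], axis=1)
            return pos + dt * vel, vel + dt * acc      # first-order update

    Explicit Euler is only first-order and drifts in energy, which is exactly what makes it a good teaching stepping stone toward leapfrog and higher-order integrators.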

  19. Use of computer graphics simulation for teaching of flexible sigmoidoscopy.

    Science.gov (United States)

    Baillie, J; Jowell, P; Evangelou, H; Bickel, W; Cotton, P

    1991-05-01

    The concept of simulation training in endoscopy is now well established. The systems currently under development employ either computer graphics simulation or interactive video technology; each has its strengths and weaknesses. A flexible sigmoidoscopy training device has been designed which uses graphics routines--such as object-oriented programming and double buffering--in entirely new ways. These programming techniques compensate for the limitations of currently available desktop microcomputers. By boosting existing computer 'horsepower' with next-generation coprocessors and sophisticated graphics tools such as intensity interpolation (Gouraud shading), the realism of computer simulation of flexible sigmoidoscopy is being greatly enhanced. The computer program has teaching and scoring capabilities, making it a truly interactive system. Use has been made of this ability to record, grade and store each trainee encounter in computer memory as part of a multi-center, prospective trial of simulation training being conducted currently in the USA. A new input device, a dummy endoscope, has been designed that allows application of variable resistance to the insertion tube. This greatly enhances tactile feedback, such as resistance during looping. If carefully designed trials show that computer simulation is an attractive and effective training tool, it is expected that this technology will evolve rapidly and be made widely available to trainee endoscopists.

  20. Noise simulation in cone beam CT imaging with parallel computing

    International Nuclear Information System (INIS)

    Tu, S.-J.; Shaw, Chris C; Chen, Lingyun

    2006-01-01

    We developed a computer noise simulation model for cone beam computed tomography imaging using a general purpose PC cluster. This model uses a mono-energetic x-ray approximation and allows us to investigate three primary performance components, specifically quantum noise, detector blurring and additive system noise. A parallel random number generator based on the Weyl sequence was implemented in the noise simulation and a visualization technique was accordingly developed to validate the quality of the parallel random number generator. In our computer simulation model, three-dimensional (3D) phantoms were mathematically modelled and used to create 450 analytical projections, which were then sampled into digital image data. Quantum noise was simulated and added to the analytical projection image data, which were then filtered to incorporate flat panel detector blurring. Additive system noise was generated and added to form the final projection images. The Feldkamp algorithm was implemented and used to reconstruct the 3D images of the phantoms. A 24 dual-Xeon PC cluster was used to compute the projections and reconstructed images in parallel with each CPU processing 10 projection views for a total of 450 views. Based on this computer simulation system, simulated cone beam CT images were generated for various phantoms and technique settings. Noise power spectra for the flat panel x-ray detector and reconstructed images were then computed to characterize the noise properties. As an example among the potential applications of our noise simulation model, we showed that images of low contrast objects can be produced and used for image quality evaluation
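
    Under the mono-energetic approximation described above, the quantum-noise step reduces to Poisson sampling of expected photon counts. A hedged Python sketch (the incident fluence I0 and all names are illustrative assumptions, not the authors' code):

        import numpy as np

        def add_quantum_noise(line_integrals, I0=1e5, rng=None):
            # line_integrals: analytical projections (integrals of mu along rays)
            rng = rng or np.random.default_rng()
            expected = I0 * np.exp(-line_integrals)        # mean photon counts
            counts = np.maximum(rng.poisson(expected), 1)  # guard against log(0)
            return -np.log(counts / I0)                    # noisy line integrals

    Detector blurring and additive system noise would then be applied to the noisy projections before Feldkamp reconstruction, mirroring the pipeline described above.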

  1. The Simulation and Analysis of the Closed Die Hot Forging Process by A Computer Simulation Method

    Directory of Open Access Journals (Sweden)

    Dipakkumar Gohil

    2012-06-01

    The objective of this research work is to study the variation of various parameters such as stress, strain, temperature, and force during the closed die hot forging process. A computer simulation modelling approach has been adopted to transform the theoretical aspects into a computer algorithm which is used to simulate and analyze the closed die hot forging process. For the purpose of process study, the entire deformation process has been divided into a finite number of steps, and the output values have been computed at each deformation step. The results of simulation have been graphically represented, and suitable corrective measures are also recommended if the simulation results do not agree with the theoretical values. This computer simulation approach would significantly improve productivity and reduce the energy consumption of the overall process for components manufactured by closed die forging, contributing to efforts to reduce global warming.

  2. Magnetohydrodynamic simulation of solid-deuterium-initiated Z-pinch experiments

    International Nuclear Information System (INIS)

    Sheehey, P.T.

    1994-02-01

    Solid-deuterium-initiated Z-pinch experiments are numerically simulated using a two-dimensional resistive magnetohydrodynamic model, which includes many important experimental details, such as ''cold-start'' initial conditions, thermal conduction, radiative energy loss, actual discharge current vs. time, and grids of sufficient size and resolution to allow realistic development of the plasma. The alternating-direction-implicit numerical technique used meets the substantial demands presented by such a computational task. Simulations of fiber-initiated experiments show that when the fiber becomes fully ionized, rapidly developing m=0 instabilities, which originate in the coronal plasma generated from the ablating fiber, drive intense non-uniform heating and rapid expansion of the plasma column. The possibility that inclusion of additional physical effects would improve stability is explored. Finite-Larmor-radius-ordered Hall and diamagnetic pressure terms in the magnetic field evolution equation, corresponding energy equation terms, and separate ion and electron energy equations are included; these do not change the basic results. Model diagnostics, such as shadowgrams and interferograms, generated from simulation results, are in good agreement with experiment. Two alternative experimental approaches are explored: high-current magnetic implosion of hollow cylindrical deuterium shells, and ''plasma-on-wire'' (POW) implosion of low-density plasma onto a central deuterium fiber. By minimizing instability problems, these techniques may allow attainment of higher temperatures and densities than possible with bare fiber-initiated Z-pinches. Conditions for significant D-D or D-T fusion neutron production may be realizable with these implosion-based approaches.

  3. Thermoeconomic optimization of a solar-assisted heat pump based on transient simulations and computer Design of Experiments

    International Nuclear Information System (INIS)

    Calise, Francesco; Dentice d’Accadia, Massimo; Figaj, Rafal Damian; Vanoli, Laura

    2016-01-01

    Highlights: • A polygeneration system for a residential house is presented. • Hybrid photovoltaic/thermal collectors are used, coupled with a solar-assisted heat pump. • An optimization has been performed. • The system is profitable even in the absence of incentives. • A simple pay-back period of about 5 years is achieved. - Abstract: In the paper, a model for the simulation and the optimization of a novel solar trigeneration system is presented. The plant simulation model is designed to supply electricity, space heating or cooling and domestic hot water for a small residential building. The system is based on a solar field equipped with flat-plate photovoltaic/thermal collectors, coupled with a water-to-water electric heat pump/chiller. The electrical energy produced by the hybrid collectors is entirely supplied to the building. During the winter, the thermal energy available from the solar field is used as a heat source for the evaporator of the heat pump and/or to produce domestic hot water. During the summer, the heat pump operates in cooling mode, coupled with a closed circuit cooling tower, providing space cooling for the building, and the hot water produced by the collectors is only used to produce domestic hot water. For such a system, a dynamic simulation model was developed in the TRNSYS environment, paying special attention to the dynamic simulation of the building as well. The system was analyzed from an energy and economic point of view, considering different time bases. In order to minimize the pay-back period, an optimum set of the main design/control parameters was obtained by means of a sensitivity analysis. Simultaneously, a computer-based Design of Experiments procedure was implemented, aiming at calculating the optimal set of design parameters, using both energy and economic objective functions. The results showed that thermal and electrical efficiencies are above 40% and 10%, respectively. The coefficient of performance of the reversible heat

  4. Prototyping and Simulating Parallel, Distributed Computations with VISA

    National Research Council Canada - National Science Library

    Demeure, Isabelle M; Nutt, Gary J

    1989-01-01

    ...] to support the design, prototyping, and simulation of parallel, distributed computations. In particular, VISA is meant to guide the choice of partitioning and communication strategies for such computations, based on their performance...

  5. Slab cooling system design using computer simulation

    NARCIS (Netherlands)

    Lain, M.; Zmrhal, V.; Drkal, F.; Hensen, J.L.M.

    2007-01-01

    For a new technical library building in Prague, computer simulations were carried out to help design the slab cooling system and optimize the capacity of the chillers. The paper presents the concept of the new technical library's HVAC system, the model of the building, and the results of the energy simulations for

  6. Biocellion: accelerating computer simulation of multicellular biological system models.

    Science.gov (United States)

    Kang, Seunghwa; Kahan, Simon; McDermott, Jason; Flann, Nicholas; Shmulevich, Ilya

    2014-11-01

    Biological system behaviors are often the outcome of complex interactions among a large number of cells and their biotic and abiotic environment. Computational biologists attempt to understand, predict and manipulate biological system behavior through mathematical modeling and computer simulation. Discrete agent-based modeling (in combination with high-resolution grids to model the extracellular environment) is a popular approach for building biological system models. However, the computational complexity of this approach forces computational biologists to resort to coarser resolution approaches to simulate large biological systems. High-performance parallel computers have the potential to address the computing challenge, but writing efficient software for parallel computers is difficult and time-consuming. We have developed Biocellion, a high-performance software framework, to solve this computing challenge using parallel computers. To support a wide range of multicellular biological system models, Biocellion asks users to provide their model specifics by filling in the function bodies of pre-defined model routines. Using Biocellion, modelers without parallel computing expertise can efficiently exploit parallel computers with less effort than writing sequential programs from scratch. We simulate cell sorting, microbial patterning and a bacterial system in a soil aggregate as case studies. Biocellion runs on x86-compatible systems with the 64-bit Linux operating system and is freely available for academic use. Visit http://biocellion.com for additional information.

  7. How Many Times Should One Run a Computational Simulation?

    DEFF Research Database (Denmark)

    Seri, Raffaello; Secchi, Davide

    2017-01-01

    This chapter is an attempt to answer the question “how many runs of a computational simulation should one do,” and it gives an answer by means of statistical analysis. After defining the nature of the problem and which types of simulation are mostly affected by it, the article introduces statistical…
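
    One common statistical answer, sketched here as a generic illustration rather than the chapter's exact procedure, is to run a pilot set of simulations and then choose the total number of runs so that a confidence interval for the output mean reaches a target half-width.

        import numpy as np

        def runs_needed(pilot_outputs, half_width, z=1.96):
            # Runs required for a z-based CI of the mean with the given
            # half-width: n >= (z * s / half_width)^2, with s from pilot runs.
            s = np.std(pilot_outputs, ddof=1)
            return int(np.ceil((z * s / half_width) ** 2))

        pilot = np.random.default_rng(1).normal(10.0, 2.0, size=30)  # 30 pilot runs
        print(runs_needed(pilot, half_width=0.1))  # roughly (1.96*2/0.1)^2 ≈ 1540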

  8. Virtual machines & volunteer computing: Experience from LHC@Home: Test4Theory project

    CERN Document Server

    Lombraña González, Daniel; Blomer, Jakob; Buncic, Predrag; Harutyunyan, Artem; Marquina, Miguel; Segal, Ben; Skands, Peter; Karneyeu, Anton

    2012-01-01

    Volunteer desktop grids are nowadays becoming more and more powerful thanks to improved high-end components: multi-core CPUs, larger RAM memories and hard disks, better network connectivity and bandwidth, etc. As a result, desktop grid systems can run more complex experiments or simulations, but some problems remain: the heterogeneity of hardware architectures and software (library dependencies, code length, big repositories, etc.) makes it very difficult for researchers and developers to deploy and maintain a software stack for all the available platforms. In this paper, the employment of virtualization is shown to be the key to solving these problems. It provides a homogeneous layer allowing researchers to focus their efforts on running their experiments. Inside virtual custom execution environments, researchers can control and deploy very complex experiments or simulations running on heterogeneous grids of high-end computers. The following work presents the latest results from CERN’s LHC@home Test4Theory p...

  9. Computer simulation of gear tooth manufacturing processes

    Science.gov (United States)

    Mavriplis, Dimitri; Huston, Ronald L.

    1990-01-01

    The use of computer graphics to simulate gear tooth manufacturing procedures is discussed. An analytical basis for the simulation is established for spur gears. The simulation itself, however, is developed not only for spur gears, but for straight bevel gears as well. The applications of the developed procedure extend from the development of finite element models of heretofore intractable geometrical forms, to exploring the fabrication of nonstandard tooth forms.

  10. A Fast Synthetic Aperture Radar Raw Data Simulation Using Cloud Computing.

    Science.gov (United States)

    Li, Zhixin; Su, Dandan; Zhu, Haijiang; Li, Wei; Zhang, Fan; Li, Ruirui

    2017-01-08

    Synthetic Aperture Radar (SAR) raw data simulation is a fundamental problem in radar system design and imaging algorithm research. The growth of surveying swath and resolution results in a significant increase in data volume and simulation period, making this a comprehensively data-intensive and computing-intensive issue. Although several high performance computing (HPC) methods have demonstrated their potential for accelerating simulation, the input/output (I/O) bottleneck of huge raw data has not been eased. In this paper, we propose a cloud computing based SAR raw data simulation algorithm, which employs the MapReduce model to accelerate the raw data computing and the Hadoop distributed file system (HDFS) for fast I/O access. The MapReduce model is designed for the irregular parallel accumulation of raw data simulation, which greatly reduces the parallel efficiency of graphics processing unit (GPU) based simulation methods. In addition, three kinds of optimization strategies are put forward from the aspects of programming model, HDFS configuration and scheduling. The experimental results show that the cloud computing based algorithm achieves a 4× speedup over the baseline serial approach in an 8-node cloud environment, and each optimization strategy improves performance by about 20%. This work proves that the proposed cloud algorithm is capable of solving the computing-intensive and data-intensive issues in SAR raw data simulation, and is easily extended to large scale computing to achieve higher acceleration.
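
    The MapReduce decomposition described above can be caricatured in a few lines of Python (a toy stand-in with placeholder echo physics and assumed array sizes; the paper's implementation runs on Hadoop with HDFS): each map task accumulates the echoes of one block of point targets into a partial raw-data matrix, and the reduce step sums the partial matrices.

        import numpy as np
        from functools import reduce
        from multiprocessing import Pool

        N_AZ, N_RG = 1024, 2048                  # raw-data matrix size (assumed)

        def simulate_block(targets):
            # Map: accumulate each target's (placeholder) echo contribution.
            raw = np.zeros((N_AZ, N_RG), dtype=complex)
            for az, rg, amp in targets:
                raw[az, rg] += amp               # stand-in for the SAR echo model
            return raw

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            targets = [(int(rng.integers(N_AZ)), int(rng.integers(N_RG)), 1.0)
                       for _ in range(10_000)]
            blocks = [targets[i::8] for i in range(8)]        # split the work
            with Pool(8) as pool:
                partials = pool.map(simulate_block, blocks)   # map
            raw_data = reduce(np.add, partials)               # reduce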

  11. Training simulators in nuclear power plants: Experience, programme design and assessment methodology. Proceedings of a specialists' meeting

    International Nuclear Information System (INIS)

    1997-11-01

    Simulators have become an indispensable part of training worldwide. International exchange of information is therefore important to share the experience gained in different countries and to assure high international standards. A second aspect is the tremendous evolution in the computing capacities of simulator hardware and the increasing functionality of simulator software. This background led the IAEA to invite simulator experts to exchange experience. The German Simulator Centre in Essen, which is operated by the companies KSG and GfS, was asked to host this Specialists' Meeting. The Specialists' Meeting on ''Training Simulators in Nuclear Power Plants: Experience, Programme Design and Assessment Methodology'' was organized by the IAEA in cooperation with the German Simulator Centre operated by KSG Kraftwerks-Simulator-Gesellschaft mbH and GfS Gesellschaft fuer Simulatorschulung mbH, and was held from 17 to 19 November 1997 in Essen, Germany. The meeting focused on developments in simulation technology, experiences with simulator upgrades, utilization of computerized tools as support and complement of simulator training, and use of simulators for other purposes. The meeting was attended by 50 participants from 16 countries. In the course of four sessions, 21 technical presentations were made. The present volume contains the papers presented by national delegates at the Specialists' Meeting.

  12. Training simulators in nuclear power plants: Experience, programme design and assessment methodology. Proceedings of a specialists' meeting

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-11-01

    Simulators have become an indispensable part of training worldwide. International exchange of information is therefore important to share the experience gained in different countries and to assure high international standards. A second aspect is the tremendous evolution in the computing capacities of simulator hardware and the increasing functionality of simulator software. This background led the IAEA to invite simulator experts to exchange experience. The German Simulator Centre in Essen, which is operated by the companies KSG and GfS, was asked to host this Specialists' Meeting. The Specialists' Meeting on ''Training Simulators in Nuclear Power Plants: Experience, Programme Design and Assessment Methodology'' was organized by the IAEA in cooperation with the German Simulator Centre operated by KSG Kraftwerks-Simulator-Gesellschaft mbH and GfS Gesellschaft fuer Simulatorschulung mbH, and was held from 17 to 19 November 1997 in Essen, Germany. The meeting focused on developments in simulation technology, experiences with simulator upgrades, utilization of computerized tools as support and complement of simulator training, and use of simulators for other purposes. The meeting was attended by 50 participants from 16 countries. In the course of four sessions, 21 technical presentations were made. The present volume contains the papers presented by national delegates at the Specialists' Meeting.

  13. Event-by-event simulation of quantum phenomena: Application to Einstein-Podolsky-Rosen-Bohm experiments

    NARCIS (Netherlands)

    De Raedt, H.; De Raedt, K.; Michielsen, K.; Keimpema, K.; Miyashita, S.

    We review the data gathering and analysis procedure used in real Einstein-Podolsky-Rosen-Bohm experiments with photons, and we illustrate the procedure by analyzing experimental data. Based on this analysis, we construct event-based computer simulation models in which every essential element in the

  14. Hopper Flow: Experiments and Simulation

    Science.gov (United States)

    Li, Zhusong; Shattuck, Mark

    2013-03-01

    Jamming and intermittent granular flow are important problems in industry, and the vertical hopper is a canonical example. Clogging of granular hoppers accounts for significant losses across many industries. We use realistic DEM simulations of gravity-driven flow in a hopper to examine flow and jamming of 2D disks and compare with identical companion experiments. We use experimental data to validate simulation parameters and the form of the inter-particle force law. We measure and compare flow rate, emptying times, jamming statistics, and flow fields as a function of opening angle and opening size in both experiment and simulations. Supported by: NSF-CBET-0968013
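
    For context, a common DEM choice for 2D disks is the linear spring-dashpot contact law; the sketch below (Python with illustrative parameter values; the abstract validates its own force-law form against experiment) computes the normal force between two possibly overlapping disks.

        import numpy as np

        def normal_force(pos_i, pos_j, vel_i, vel_j, r_i, r_j, k=1.0e4, gamma=5.0):
            # Linear spring-dashpot normal contact force acting on disk i.
            d = pos_j - pos_i
            dist = np.linalg.norm(d)
            overlap = (r_i + r_j) - dist
            if overlap <= 0.0:
                return np.zeros(2)                  # disks not in contact
            n = d / dist                            # unit normal from i toward j
            vn = np.dot(vel_j - vel_i, n)           # normal relative velocity
            return (-k * overlap + gamma * vn) * n  # repulsion plus dissipation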

  15. Locative media and data-driven computing experiments

    Directory of Open Access Journals (Sweden)

    Sung-Yueh Perng

    2016-06-01

    Over the past two decades urban social life has undergone a rapid and pervasive geocoding, becoming mediated, augmented and anticipated by location-sensitive technologies and services that generate and utilise big, personal, locative data. The production of these data has prompted the development of exploratory data-driven computing experiments that seek to find ways to extract value and insight from them. These projects often start from the data, rather than from a question or theory, and try to imagine and identify their potential utility. In this paper, we explore the desires and mechanics of data-driven computing experiments. We demonstrate how both locative media data and computing experiments are ‘staged’ to create new values and computing techniques, which in turn are used to try and derive possible futures that are ridden with unintended consequences. We argue that using computing experiments to imagine potential urban futures produces effects that often have little to do with creating new urban practices. Instead, these experiments promote Big Data science and the prospect that data produced for one purpose can be recast for another and act as alternative mechanisms of envisioning urban futures.

  16. MoCog1: A computer simulation of recognition-primed human decision making, considering emotions

    Science.gov (United States)

    Gevarter, William B.

    1992-01-01

    The successful results of the first stage of a research effort to develop a versatile computer model of motivated human cognitive behavior are reported. Most human decision making appears to be an experience-based, relatively straightforward, largely automatic response to situations, utilizing cues and opportunities perceived from the current environment. The development of the architecture and computer program associated with such 'recognition-primed' decision making, taking emotions into consideration, is described. The resultant computer program (MoCog1) was successfully utilized as a vehicle to simulate earlier findings on how an individual's implicit theories orient the individual toward particular goals, with resultant cognitions, affects, and behavior in response to the environment.

  17. The visual simulators for architecture and computer organization learning

    OpenAIRE

    Nikolić Boško; Grbanović Nenad; Đorđević Jovan

    2009-01-01

    The paper proposes a method for effective distance learning of architecture and computer organization. The proposed method is based on a software system that can be applied in any course in this field. Within this system, students can observe simulations of already created computer systems. The system also provides for the creation and simulation of switch systems.

  18. Programme for the simulation of the TPA-i 1001 computer on the CDC-1604-A computer

    International Nuclear Information System (INIS)

    Belyaev, A.V.

    1976-01-01

    The basic features and capabilities of the program simulating the TPA-i 1001 computer on the CDC-1604-A are described. The program is essentially aimed at translating programs written in the SLAHG language for TPA-type computers. The basic part of the program simulates the work of the central TPA processor; this subprogram successively performs the actions that change, in the necessary manner, the registers and memory states of the TPA computer. The simulated TPA computer has subprograms analogous to its external devices, i.e. the ASR-33 teletype, the FS 1501 tape reader, and the FACIT perforator. Running a program under the simulator takes 1.65-2 times less time than running it on a TPA with the minimum set of external equipment.

  19. Large scale particle simulations in a virtual memory computer

    International Nuclear Information System (INIS)

    Gray, P.C.; Million, R.; Wagner, J.S.; Tajima, T.

    1983-01-01

    Virtual memory computers are capable of executing large-scale particle simulations even when the memory requirements exceed the computer core size. The required address space is automatically mapped onto slow disc memory by the operating system. When the simulation size is very large, frequent random accesses to slow memory occur during the charge accumulation and particle pushing processes. Accesses to slow memory significantly reduce the execution rate of the simulation. We demonstrate in this paper that with the proper choice of sorting algorithm, a nominal amount of sorting to keep physically adjacent particles near particles with neighboring array indices can reduce random access to slow memory, increase the efficiency of the I/O system, and hence reduce the required computing time.

  20. Large-scale particle simulations in a virtual-memory computer

    International Nuclear Information System (INIS)

    Gray, P.C.; Wagner, J.S.; Tajima, T.; Million, R.

    1982-08-01

    Virtual memory computers are capable of executing large-scale particle simulations even when the memory requirements exceed the computer core size. The required address space is automatically mapped onto slow disc memory by the operating system. When the simulation size is very large, frequent random accesses to slow memory occur during the charge accumulation and particle pushing processes. Accesses to slow memory significantly reduce the execution rate of the simulation. We demonstrate in this paper that with the proper choice of sorting algorithm, a nominal amount of sorting to keep physically adjacent particles near particles with neighboring array indices can reduce random access to slow memory, increase the efficiency of the I/O system, and hence, reduce the required computing time
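
    The sorting idea shared by these two abstracts can be sketched directly (illustrative Python, not the original code): particles are periodically reordered by grid-cell index so that the charge accumulation and particle pushing loops sweep memory nearly sequentially instead of randomly.

        import numpy as np

        def sort_particles_by_cell(x, y, vx, vy, cell_size, nx):
            # Flattened cell index of each particle on a grid nx cells wide.
            cell = (y // cell_size).astype(int) * nx + (x // cell_size).astype(int)
            order = np.argsort(cell, kind="stable")  # neighbors become adjacent
            return x[order], y[order], vx[order], vy[order]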

  1. Computer simulations of nanoindentation in Mg-Cu and Cu-Zr metallic glasses

    DEFF Research Database (Denmark)

    Paduraru, Anca; Andersen, Ulrik Grønbjerg; Thyssen, Anders

    2010-01-01

    The formation of shear bands during plastic deformation of Cu0.50Zr0.50 and Mg0.85Cu0.15 metallic glasses is studied using atomic-scale computer simulations. The atomic interactions are described using realistic many-body potentials within the effective medium theory, and are compared with similar simulations using a Lennard-Jones description of the material. The metallic glasses are deformed both in simple shear and in a simulated nanoindentation experiment. Plastic shear localizes into shear bands with a width of approximately 5 nm in CuZr and 8 nm in MgCu. In simple shear, the shear band formation is very clear, whereas only incipient shear bands are seen in nanoindentation. The shear band formation during nanoindentation is sensitive to the indentation velocity, indenter radius and the cooling rate during the formation of the metallic glass. For comparison, a similar nanoindentation simulation

  2. New tools and technology for the study of human performance in simulator experiments

    Energy Technology Data Exchange (ETDEWEB)

    Droeivoldsmo, Asgeir

    2003-07-01

    This thesis suggests that new tools and technology can be used for production of relevant data and insights from the study of human performance in simulator and field experiments. It examines some of the theoretical perspectives behind data collection and human performance assessment, and argues for a high resemblance of the real world and use of subject matter expertise in simulator studies. A model is proposed, suggesting that human performance measurement should be tightly coupled to the topic of study and have a close connection to the time line. This coupling requires new techniques for continuous data collection, and eye movement tracking has been identified as a promising basis for this type of measure. One way of improving realism is to create virtual environments allowing for controlling more of the environment surrounding the test subjects. New application areas for virtual environments are discussed for use in control room and field studies. The combination of wearable computing and virtual and augmented reality (the use of computers to overlay virtual information onto the real world) provides many new possibilities to present information to operators. In two experiments, virtual and augmented reality techniques were used to visualise radiation fields for operators in a contaminated nuclear environment. This way the operators could train for and execute their tasks in a way that minimised radiation exposure to the individual operator. Both experiments were successful in proving the concept of radiation visualisation. Virtual environments allow for early end-user feedback in the design and refurbishment of control room man-machine interfaces. The practical usability of VR in the control room setting was tested in two control room design experiments. The results show that with the right tools for solving the tasks under test, even desktop presentations of the virtual environment can provide sufficient resemblance of the real world. Computerised data

  3. New tools and technology for the study of human performance in simulator experiments

    Energy Technology Data Exchange (ETDEWEB)

    Droeivoldsmo, Asgeir

    2003-07-01

    This thesis suggests that new tools and technology can be used for production of relevant data and insights from the study of human performance in simulator and field experiments. It examines some of the theoretical perspectives behind data collection and human performance assessment, and argues for a high resemblance of the real world and use of subject matter expertise in simulator studies. A model is proposed, suggesting that human performance measurement should be tightly coupled to the topic of study and have a close connection to the time line. This coupling requires new techniques for continuous data collection, and eye movement tracking has been identified as a promising basis for this type of measure. One way of improving realism is to create virtual environments allowing for controlling more of the environment surrounding the test subjects. New application areas for virtual environments are discussed for use in control room and field studies. The combination of wearable computing and virtual and augmented reality (the use of computers to overlay virtual information onto the real world) provides many new possibilities to present information to operators. In two experiments, virtual and augmented reality techniques were used to visualise radiation fields for operators in a contaminated nuclear environment. This way the operators could train for and execute their tasks in a way that minimised radiation exposure to the individual operator. Both experiments were successful in proving the concept of radiation visualisation. Virtual environments allow for early end-user feedback in the design and refurbishment of control room man-machine interfaces. The practical usability of VR in the control room setting was tested in two control room design experiments. The results show that with the right tools for solving the tasks under test, even desktop presentations of the virtual environment can provide sufficient resemblance of the real world. Computerised data

  4. Uses of Computer Simulation Models in Ag-Research and Everyday Life

    Science.gov (United States)

    When the news media talk about models, they could be talking about role models, fashion models, conceptual models like those the auto industry uses, or computer simulation models. A computer simulation model is a computer code that attempts to imitate the processes and functions of certain systems. There ...

  5. Advanced Simulation and Computing FY17 Implementation Plan, Version 0

    Energy Technology Data Exchange (ETDEWEB)

    McCoy, Michel [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Archer, Bill [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Hendrickson, Bruce [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wade, Doug [National Nuclear Security Administration (NNSA), Washington, DC (United States). Office of Advanced Simulation and Computing and Institutional Research and Development; Hoang, Thuc [National Nuclear Security Administration (NNSA), Washington, DC (United States). Computational Systems and Software Environment

    2016-08-29

    The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing the simulation capabilities and computational resources that support annual stockpile assessment and certification, the study of advanced nuclear weapons design and manufacturing processes, the analysis of accident scenarios and weapons aging, and the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. ASC is now focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), and quantifying critical margins and uncertainties. Resolving each issue requires increasingly difficult analyses because the aging process has progressively moved the stockpile further away from the original test base. Where possible, the program also enables the use of high performance computing (HPC) and simulation tools to address broader national security needs, such as foreign nuclear weapon assessments and counter nuclear terrorism.

  6. Using computer simulations to facilitate conceptual understanding of electromagnetic induction

    Science.gov (United States)

    Lee, Yu-Fen

    This study investigated the use of computer simulations to facilitate conceptual understanding in physics. The use of computer simulations in the present study was grounded in a conceptual framework drawn from findings related to the use of computer simulations in physics education. To achieve the goal of effective utilization of computers for physics education, I first reviewed studies pertaining to computer simulations in physics education, categorized by three different learning frameworks, and studies comparing the effects of different simulation environments. My intent was to identify the learning context and factors for successful use of computer simulations in past studies and to learn from the studies which did not obtain a significant result. Based on the analysis of the reviewed literature, I proposed effective approaches to integrating computer simulations in physics education. These approaches are consistent with well-established education principles such as those suggested by How People Learn (Bransford, Brown, Cocking, Donovan, & Pellegrino, 2000). The research-based approaches to integrating computer simulations in physics education form a learning framework called Concept Learning with Computer Simulations (CLCS) in the current study. The second component of this study was to examine the CLCS learning framework empirically. The participants were recruited from a public high school in Beijing, China. All participating students were randomly assigned to two groups, the experimental (CLCS) group and the control (TRAD) group. Research-based computer simulations developed by the physics education research group at the University of Colorado at Boulder were used to tackle common conceptual difficulties in learning electromagnetic induction. While interacting with computer simulations, CLCS students were asked to answer reflective questions designed to stimulate qualitative reasoning and explanation. After receiving model reasoning online, students were asked to submit

  7. Seventeenth Workshop on Computer Simulation Studies in Condensed-Matter Physics

    CERN Document Server

    Landau, David P; Schüttler, Heinz-Bernd

    2006-01-01

    This status report features the most recent developments in the field, spanning a wide range of topical areas in the computer simulation of condensed matter/materials physics. Both established and new topics are included, ranging from the statistical mechanics of classical magnetic spin models to electronic structure calculations, quantum simulations, and simulations of soft condensed matter. The book presents new physical results as well as novel methods of simulation and data analysis. Highlights of this volume include various aspects of non-equilibrium statistical mechanics, studies of properties of real materials using both classical model simulations and electronic structure calculations, and the use of computer simulations in teaching.

  8. Enhancements to the Image Analysis Tool for Core Punch Experiments and Simulations (vs. 2014)

    Energy Technology Data Exchange (ETDEWEB)

    Hogden, John Edward [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Unal, Cetin [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-11-06

    A previous paper (Hogden & Unal, 2012, Image Analysis Tool for Core Punch Experiments and Simulations) described an image processing computer program developed at Los Alamos National Laboratory. This program has proven useful, so development has been continued. In this paper we describe enhancements to the program as of 2014.

  9. Simulation of a complete inelastic neutron scattering experiment

    DEFF Research Database (Denmark)

    Edwards, H.; Lefmann, K.; Lake, B.

    2002-01-01

    A simulation of an inelastic neutron scattering experiment on the high-temperature superconductor La2-xSrxCuO4 is presented. The complete experiment, including sample, is simulated using an interface between the experiment control program and the simulation software package (McStas), and is compared with the experimental data. Simulating the entire experiment is an attractive alternative to the usual method of convoluting the model cross section with the resolution function, especially if the resolution function is nontrivial.

  10. Parallel Monte Carlo simulations on an ARC-enabled computing grid

    International Nuclear Information System (INIS)

    Nilsen, Jon K; Samset, Bjørn H

    2011-01-01

    Grid computing opens new possibilities for running heavy Monte Carlo simulations of physical systems in parallel. The presentation gives an overview of GaMPI, a system for running an MPI-based random walker simulation on grid resources. Integrating the ARC middleware and the new storage system Chelonia with the Ganga grid job submission and control system, we show that MPI jobs can be run on a world-wide computing grid with good performance and promising scaling properties. Results for relatively communication-heavy Monte Carlo simulations run on multiple heterogeneous, ARC-enabled computing clusters in several countries are presented.
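
    The record does not include GaMPI's code or interfaces, so the following is only a minimal Python sketch, using mpi4py, of the master/worker pattern behind an MPI random-walker simulation of this kind; the walker count, step count, and seeding scheme are illustrative assumptions.

        # Minimal MPI random-walker Monte Carlo sketch (illustrative only; the
        # record does not describe GaMPI's actual code or interfaces).
        import random

        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rank = comm.Get_rank()
        size = comm.Get_size()

        N_WALKERS_PER_RANK = 10_000   # assumed per-rank workload
        N_STEPS = 1_000

        def walk(n_steps):
            """Simulate one 1-D random walker; return its final displacement."""
            x = 0
            for _ in range(n_steps):
                x += 1 if random.random() < 0.5 else -1
            return x

        random.seed(rank)  # crude per-rank stream decorrelation
        local_sum = sum(walk(N_STEPS) ** 2 for _ in range(N_WALKERS_PER_RANK))

        # Truly communication-heavy runs would exchange walker state every
        # step; here a single reduction aggregates the final statistic.
        total = comm.reduce(local_sum, op=MPI.SUM, root=0)
        if rank == 0:
            msd = total / (N_WALKERS_PER_RANK * size)
            print(f"mean squared displacement after {N_STEPS} steps: {msd:.1f}")

    Launched with, e.g., mpiexec -n 8 python walkers.py, each rank simulates its own share of walkers and only the final reduction crosses the network; the communication-heavy runs described above would exchange state far more frequently.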

  11. Computer simulation in nuclear science and engineering

    International Nuclear Information System (INIS)

    Akiyama, Mamoru; Miya, Kenzo; Iwata, Shuichi; Yagawa, Genki; Kondo, Shusuke; Hoshino, Tsutomu; Shimizu, Akinao; Takahashi, Hiroshi; Nakagawa, Masatoshi.

    1992-01-01

    The numerical simulation technology used for the design of nuclear reactors spans a wide range of scientific fields, and is a technology cultivated through steady efforts toward high calculational accuracy in safety examinations, reliability verification tests, the assessment of operation results and so on. Taking the opportunity of the practical use of numerical simulation in wide fields, the numerical simulation of five basic equations which describe the natural world and the progress of its related technologies are reviewed. It is expected that numerical simulation technology contributes not only to design studies but also to the progress of science and technology, such as the construction of innovative new concepts and the exploration of new mechanisms and substances for which no models exist in the natural world. The development of atomic energy and the progress of computers, Boltzmann's transport equation and its periphery, the Navier-Stokes equation and its periphery, Maxwell's electromagnetic field equations and their periphery, the Schroedinger wave equation and its periphery, computational solid mechanics and its periphery, and probabilistic risk assessment and its periphery are described. (K.I.)
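
    For reference, the equation families the record names can be written in standard textbook form (solid mechanics is represented here by the static equilibrium equation):

        \begin{align*}
          &\text{Boltzmann:} && \frac{\partial f}{\partial t} + \mathbf{v}\cdot\nabla_{\mathbf{r}} f + \frac{\mathbf{F}}{m}\cdot\nabla_{\mathbf{v}} f = \left(\frac{\partial f}{\partial t}\right)_{\mathrm{coll}} \\
          &\text{Navier--Stokes:} && \rho\left(\frac{\partial \mathbf{u}}{\partial t} + \mathbf{u}\cdot\nabla\mathbf{u}\right) = -\nabla p + \mu\nabla^2\mathbf{u} + \mathbf{f} \\
          &\text{Maxwell:} && \nabla\cdot\mathbf{E} = \rho_q/\varepsilon_0,\quad \nabla\cdot\mathbf{B} = 0,\quad \nabla\times\mathbf{E} = -\frac{\partial\mathbf{B}}{\partial t},\quad \nabla\times\mathbf{B} = \mu_0\mathbf{J} + \mu_0\varepsilon_0\frac{\partial\mathbf{E}}{\partial t} \\
          &\text{Schr\"odinger:} && i\hbar\,\frac{\partial\psi}{\partial t} = \hat{H}\psi \\
          &\text{Solid mechanics (equilibrium):} && \nabla\cdot\boldsymbol{\sigma} + \mathbf{b} = \mathbf{0}
        \end{align*}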

  12. Computational fluid dynamics simulations of light water reactor flows

    International Nuclear Information System (INIS)

    Tzanos, C.P.; Weber, D.P.

    1999-01-01

    Advances in computational fluid dynamics (CFD), turbulence simulation, and parallel computing have made feasible the development of three-dimensional (3-D) single-phase and two-phase flow CFD codes that can simulate fluid flow and heat transfer in realistic reactor geometries with significantly reduced reliance, especially in single phase, on empirical correlations. The objective of this work was to assess the predictive power and computational efficiency of a CFD code in the analysis of a challenging single-phase light water reactor problem, as well as to identify areas where further improvements are needed

  13. COMPUTER LEARNING SIMULATOR WITH VIRTUAL REALITY FOR OPHTHALMOLOGY

    Directory of Open Access Journals (Sweden)

    Valeria V. Gribova

    2013-01-01

    A toolset for a medical computer learning simulator for ophthalmology with virtual reality, and its implementation, are considered in the paper. The simulator is intended for professional skills training for students of medical universities.

  14. Molecular dynamics simulations and applications in computational toxicology and nanotoxicology.

    Science.gov (United States)

    Selvaraj, Chandrabose; Sakkiah, Sugunadevi; Tong, Weida; Hong, Huixiao

    2018-02-01

    Nanotoxicology studies the toxicity of nanomaterials and has been widely applied in biomedical research to explore toxicity in various biological systems. Investigating biological systems through in vivo and in vitro methods is expensive and time-consuming. Therefore, computational toxicology, a multi-discipline field that utilizes computational power and algorithms to examine toxicology of biological systems, has gained attention from scientists. Molecular dynamics (MD) simulations of biomolecules such as proteins and DNA are popular for understanding interactions between biological systems and chemicals in computational toxicology. In this paper, we review MD simulation methods, protocols for running MD simulations, and their applications in studies of toxicity and nanotechnology. We also briefly summarize some popular software tools for the execution of MD simulations. Published by Elsevier Ltd.

  15. Use of Simulation Learning Experiences in Physical Therapy Entry-to-Practice Curricula: A Systematic Review

    Science.gov (United States)

    Carnahan, Heather; Herold, Jodi

    2015-01-01

    Purpose: To review the literature on simulation-based learning experiences and to examine their potential to have a positive impact on physiotherapy (PT) learners' knowledge, skills, and attitudes in entry-to-practice curricula. Method: A systematic literature search was conducted in the MEDLINE, CINAHL, Embase Classic+Embase, Scopus, and Web of Science databases, using keywords such as physical therapy, simulation, education, and students. Results: A total of 820 abstracts were screened, and 23 articles were included in the systematic review. While there were few randomized controlled trials with validated outcome measures, the review nevertheless yielded some discoveries about simulation that can positively affect the design of PT entry-to-practice curricula. Using simulators to provide specific output feedback can help students learn specific skills. Computer simulations can also augment students' learning experience. Human simulation experiences in managing the acute patient in the ICU are well received by students, positively influence their confidence, and decrease their anxiety. There is evidence that simulated learning environments can replace a portion of a full-time 4-week clinical rotation without impairing learning. Conclusions: Simulation-based learning activities are being effectively incorporated into PT curricula. More rigorously designed experimental studies that include a cost–benefit analysis are necessary to help curriculum developers make informed choices in curriculum design. PMID:25931672

  16. Possibilities and importance of using computer games and simulations in educational process

    Directory of Open Access Journals (Sweden)

    Danilović Mirčeta S.

    2003-01-01

    The paper discusses whether it is possible and appropriate to use simulations (simulation games) and traditional games in the process of education. It is stressed that the terms "game" and "simulation" can and should be taken in a broader sense, although they are chiefly investigated herein as video-computer games and simulations. Any activity combining the properties of a game (competition, rules, players) and the properties of a simulation (i.e. operational presentation of reality) should be understood as a simulation game, where role-play constitutes their essence and basis. In those games the student assumes a new identity, identifies himself with another personality and responds similarly. Game rules are the basic and most important conditions for a game's existence, accomplishment and goal achievement. Games and simulations make it possible for a student to acquire experience and practice, i.e. to do exercises in nearly similar or identical life situations, to develop cognitive and psycho-motor abilities and skills, to acquire knowledge, to develop, create and change attitudes and value criteria, and to develop perception of other people's feelings and attitudes. The teacher must prepare thoroughly in order to use and apply simulation games in the process of teaching.

  17. Computer Simulation Model to Train Medical Personnel on Glucose Clamp Procedures.

    Science.gov (United States)

    Maghoul, Pooya; Boulet, Benoit; Tardif, Annie; Haidar, Ahmad

    2017-10-01

    A glucose clamp procedure is the most reliable way to quantify insulin pharmacokinetics and pharmacodynamics, but skilled and trained research personnel are required to frequently adjust the glucose infusion rate. A computer environment that simulates glucose clamp experiments can be used for efficient personnel training and for development and testing of algorithms for automated glucose clamps. We built 17 virtual healthy subjects (mean age, 25±6 years; mean body mass index, 22.2±3 kg/m²), each comprising a mathematical model of glucose regulation and a unique set of parameters. Each virtual subject simulates plasma glucose and insulin concentrations in response to intravenous insulin and glucose infusions. Each virtual subject provides a unique response, and its parameters were estimated from combined intravenous glucose tolerance test-hyperinsulinemic-euglycemic clamp data using the Bayesian approach. The virtual subjects were validated by comparing their simulated predictions against data from 12 healthy individuals who underwent a hyperglycemic glucose clamp procedure. Plasma glucose and insulin concentrations were predicted by the virtual subjects in response to glucose infusions determined by a trained research staff performing a simulated hyperglycemic clamp experiment. The total amount of glucose infused did not differ between the simulated and the real subjects (85±18 g vs. 83±23 g; p=NS), nor did plasma insulin levels (63±20 mU/L vs. 58±16 mU/L; p=NS). The virtual subjects can reliably predict glucose needs and plasma insulin profiles during hyperglycemic glucose clamp conditions. These virtual subjects can be used to train personnel to make glucose infusion adjustments during clamp experiments. Copyright © 2017 Diabetes Canada. Published by Elsevier Inc. All rights reserved.
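
    The record does not give the study's model equations, so the sketch below stands in with a Bergman-style minimal model of glucose-insulin dynamics, integrated by forward Euler; all parameter values and the infusion profile are illustrative placeholders, not values from the paper.

        # Toy "virtual subject" sketch: forward-Euler integration of a
        # Bergman-style minimal model. The cited study's actual model and
        # parameters are not in the record; all values are placeholders.
        # (Endogenous insulin secretion is omitted in this toy model.)
        import numpy as np

        p1, p2, p3 = 0.03, 0.02, 1.0e-5   # glucose effectiveness / insulin action (assumed)
        n = 0.1                            # insulin clearance rate, 1/min (assumed)
        Gb, Ib = 5.0, 10.0                 # basal glucose (mmol/L) and insulin (mU/L)
        V_G = 12.0                         # glucose distribution volume, L (assumed)

        def simulate(glucose_infusion, insulin_infusion, dt=1.0, t_end=180):
            """Plasma glucose/insulin for given infusion profiles (callables of
            time in minutes, rates in mmol/min and mU/min)."""
            t = np.arange(0, t_end, dt)
            G, X, I = Gb, 0.0, Ib
            G_out, I_out = [], []
            for tk in t:
                dG = -(p1 + X) * G + p1 * Gb + glucose_infusion(tk) / V_G
                dX = -p2 * X + p3 * (I - Ib)
                dI = -n * (I - Ib) + insulin_infusion(tk)
                G, X, I = G + dt * dG, X + dt * dX, I + dt * dI
                G_out.append(G); I_out.append(I)
            return t, np.array(G_out), np.array(I_out)

        # Hyperglycemic-clamp-style test: constant glucose infusion, no insulin.
        t, G, I = simulate(lambda tk: 1.5, lambda tk: 0.0)
        print(f"plasma glucose after 3 h: {G[-1]:.2f} mmol/L, insulin: {I[-1]:.1f} mU/L")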

  18. Computer simulations of shear thickening of concentrated dispersions

    NARCIS (Netherlands)

    Boersma, W.H.; Laven, J.; Stein, H.N.

    1995-01-01

    Stokesian dynamics computer simulations were performed on monolayers of equally sized spheres. The influence of repulsive and attractive forces on the rheological behavior and on the microstructure were studied. Under specific conditions shear thickening could be observed in the simulations, usually

  19. Augmented Reality Simulations on Handheld Computers

    Science.gov (United States)

    Squire, Kurt; Klopfer, Eric

    2007-01-01

    Advancements in handheld computing, particularly its portability, social interactivity, context sensitivity, connectivity, and individuality, open new opportunities for immersive learning environments. This article articulates the pedagogical potential of augmented reality simulations in environmental engineering education by immersing students in…

  20. Lessons Learned From the Development and Parameterization of a Computer Simulation Model to Evaluate Task Modification for Health Care Providers.

    Science.gov (United States)

    Kasaie, Parastu; David Kelton, W; Ancona, Rachel M; Ward, Michael J; Froehle, Craig M; Lyons, Michael S

    2018-02-01

    Computer simulation is a highly advantageous method for understanding and improving health care operations with a wide variety of possible applications. Most computer simulation studies in emergency medicine have sought to improve allocation of resources to meet demand or to assess the impact of hospital and other system policies on emergency department (ED) throughput. These models have enabled essential discoveries that can be used to improve the general structure and functioning of EDs. Theoretically, computer simulation could also be used to examine the impact of adding or modifying specific provider tasks. Doing so involves a number of unique considerations, particularly in the complex environment of acute care settings. In this paper, we describe conceptual advances and lessons learned during the design, parameterization, and validation of a computer simulation model constructed to evaluate changes in ED provider activity. We illustrate these concepts using examples from a study focused on the operational effects of HIV screening implementation in the ED. Presentation of our experience should emphasize the potential for application of computer simulation to study changes in health care provider activity and facilitate the progress of future investigators in this field. © 2017 by the Society for Academic Emergency Medicine.

  1. Computer Simulation of the Circulation Subsystem of a Library

    Science.gov (United States)

    Shaw, W. M., Jr.

    1975-01-01

    When circulation data are used as input parameters for a computer simulation of a library's circulation subsystem, the results of the simulation provide information on book availability and delays. The model may be used to simulate alternative loan policies. (Author/LS)
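
    A minimal sketch of the idea, assuming Poisson request arrivals and a fixed loan period (neither is stated in the record): each request for a title either finds a copy on the shelf or does not, and availability can then be compared across loan policies.

        # Sketch of a circulation-subsystem simulation: requests arrive at
        # random, each loan occupies a copy for the loan period, and we
        # estimate availability under a given loan policy. All parameters
        # are illustrative, not from the paper.
        import heapq
        import random

        def availability(n_copies=2, loan_days=28, requests_per_day=0.1,
                         horizon_days=5 * 365, seed=1):
            rng = random.Random(seed)
            returns = []          # min-heap of return times for copies on loan
            satisfied = total = 0
            t = 0.0
            while t < horizon_days:
                t += rng.expovariate(requests_per_day)   # next request (Poisson)
                while returns and returns[0] <= t:       # copies coming back
                    heapq.heappop(returns)
                total += 1
                if len(returns) < n_copies:              # a copy is on the shelf
                    satisfied += 1
                    heapq.heappush(returns, t + loan_days)
            return satisfied / total

        # Compare two loan policies under the same demand.
        for policy in (28, 14):
            print(f"{policy}-day loans: availability = {availability(loan_days=policy):.2%}")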

  2. I - Detector Simulation for the LHC and beyond: how to match computing resources and physics requirements

    CERN Multimedia

    CERN. Geneva

    2016-01-01

    Detector simulation at the LHC is one of the most computing intensive activities. In these lectures we will show how physics requirements were met for the LHC experiments and extrapolate to future experiments (the FCC-hh case). At the LHC, detectors are complex, very precise and ambitious: this implies modern modelling tools for geometry and response. Events are busy and characterised by an unprecedented energy scale, with hundreds of particles to be traced and high energy showers to be accurately simulated. Furthermore, high luminosities imply many events in a bunch crossing and many bunch crossings to be considered at the same time. In addition, backgrounds not directly correlated to bunch crossings also have to be taken into account. Solutions chosen for ATLAS (a mixture of detailed simulation and fast simulation/parameterisation) will be described and CPU and memory figures will be given. An extrapolation to the FCC-hh case will be attempted, taking the calorimeter simulation as an example.

  3. II - Detector simulation for the LHC and beyond : how to match computing resources and physics requirements

    CERN Multimedia

    CERN. Geneva

    2016-01-01

    Detector simulation at the LHC is one of the most computing intensive activities. In these lectures we will show how physics requirements were met for the LHC experiments and extrapolate to future experiments (the FCC-hh case). At the LHC, detectors are complex, very precise and ambitious: this implies modern modelling tools for geometry and response. Events are busy and characterised by an unprecedented energy scale, with hundreds of particles to be traced and high energy showers to be accurately simulated. Furthermore, high luminosities imply many events in a bunch crossing and many bunch crossings to be considered at the same time. In addition, backgrounds not directly correlated to bunch crossings also have to be taken into account. Solutions chosen for ATLAS (a mixture of detailed simulation and fast simulation/parameterisation) will be described and CPU and memory figures will be given. An extrapolation to the FCC-hh case will be attempted, taking the calorimeter simulation as an example.

  4. Using EDUCache Simulator for the Computer Architecture and Organization Course

    Directory of Open Access Journals (Sweden)

    Sasko Ristov

    2013-07-01

    The computer architecture and organization course is essential in all computer science and engineering programs, and among the most frequently selected and best liked elective courses in related engineering disciplines. However, this attractiveness brings a new challenge: it requires a lot of effort by the instructor to explain rather complicated concepts to beginners or to those who study related disciplines. The usage of visual simulators can improve both the teaching and learning processes. The overall goal is twofold: (1) to enable a visual environment to explain the basic concepts and (2) to increase the student's willingness and ability to learn the material. A lot of visual simulators have been used for the computer architecture and organization course. However, due to the lack of visual simulators for simulation of cache memory concepts, we have developed a new visual simulator, EDUCache. In this paper we present that it can be effectively and efficiently used as a supporting tool in the learning process of modern multi-layer, multi-cache and multi-core multi-processors. EDUCache's features enable an environment for performance evaluation and engineering of software systems, i.e. the students will also understand the importance of computer architecture building parts and, hopefully, will increase their curiosity for hardware courses in general.
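
    EDUCache itself is a visual teaching tool; the sketch below is only a minimal, non-visual illustration of the concept it teaches, a direct-mapped cache replaying a memory-address trace, with the cache sizes and the trace chosen purely for illustration.

        # Direct-mapped cache fed with an address trace, counting hits and
        # misses. Sizes and trace are illustrative; EDUCache's own
        # implementation is not described in the record.

        def simulate_cache(trace, cache_lines=8, line_bytes=16):
            tags = [None] * cache_lines          # one tag per cache line
            hits = 0
            for addr in trace:
                block = addr // line_bytes       # memory block number
                index = block % cache_lines      # direct-mapped placement
                tag = block // cache_lines
                if tags[index] == tag:
                    hits += 1                    # hit: block already cached
                else:
                    tags[index] = tag            # miss: fetch and replace
            return hits, len(trace) - hits

        # Sequential sweep reused twice: the second pass hits only if the
        # working set fits in the cache (here it does not).
        trace = list(range(0, 256, 4)) * 2
        hits, misses = simulate_cache(trace)
        print(f"hits={hits} misses={misses} hit rate={hits / (hits + misses):.2%}")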

  5. A counterpoint between computer simulations and biological experiments to train new members of a laboratory of physiological sciences.

    Science.gov (United States)

    Ozu, Marcelo; Dorr, Ricardo A; Gutiérrez, Facundo; Politi, M Teresa; Toriano, Roxana

    2012-12-01

    When new members join a working group dedicated to scientific research, several changes occur in the group's dynamics. From a teaching point of view, a subsequent challenge is to develop innovative strategies to train new staff members in creative thinking, which is the most complex and abstract skill in the cognitive domain according to Bloom's revised taxonomy. In this sense, current technological and digital advances offer new possibilities in the field of education. Computer simulation and biological experiments can be used together as a combined tool for teaching and learning sometimes complex physiological and biophysical concepts. Moreover, creativity can be thought of as a social process that relies on interactions among staff members. In this regard, the acquisition of cognitive abilities coexists with the attainment of other skills from psychomotor and affective domains. Such dynamism in teaching and learning stimulates teamwork and encourages the integration of members of the working group. A practical example, based on the teaching of biophysical subjects such as osmosis, solute transport, and membrane permeability, which are crucial in understanding the physiological concept of homeostasis, is presented.

  6. NeuroManager: A workflow analysis based simulation management engine for computational neuroscience

    Directory of Open Access Journals (Sweden)

    David Bruce Stockton

    2015-10-01

    We developed NeuroManager, an object-oriented simulation management software engine for computational neuroscience. NeuroManager automates the workflow of simulation job submissions when using heterogeneous computational resources, simulators, and simulation tasks. The object-oriented approach (1) provides flexibility to adapt to a variety of neuroscience simulators, (2) simplifies the use of heterogeneous computational resources, from desktops to supercomputer clusters, and (3) improves tracking of simulator/simulation evolution. We implemented NeuroManager in Matlab, a widely used engineering and scientific language, for its signal and image processing tools, prevalence in electrophysiology analysis, and increasing use in college biology education. To design and develop NeuroManager we analyzed the workflow of simulation submission for a variety of simulators, operating systems, and computational resources, including the handling of input parameters, data, models, results, and analyses. This resulted in twenty-two stages of simulation submission workflow. The software incorporates progress notification; automatic organization, labeling, and time-stamping of data and results; and integrated access to Matlab's analysis and visualization tools. NeuroManager provides users with the tools to automate daily tasks, and assists principal investigators in tracking and recreating the evolution of research projects performed by multiple people. Overall, NeuroManager provides the infrastructure needed to improve workflow, manage multiple simultaneous simulations, and maintain provenance of the potentially large amounts of data produced during the course of a research project.
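
    NeuroManager is implemented in Matlab and its twenty-two stages are not listed in the record; the following Python sketch only illustrates the general idea of a simulation-submission workflow with automatically logged, time-stamped stages, and every stage name in it is hypothetical.

        # Schematic workflow engine: a submission expressed as an ordered
        # list of stages, each logged with a timestamp as it runs.
        import datetime

        class Workflow:
            def __init__(self, stages):
                self.stages = stages             # list of (name, callable) pairs
                self.log = []

            def run(self, job):
                for name, stage in self.stages:
                    stamp = datetime.datetime.now().isoformat(timespec="seconds")
                    self.log.append(f"{stamp}  {name}")
                    job = stage(job)             # each stage transforms the job dict
                return job

        wf = Workflow([
            ("stage-in parameters", lambda j: {**j, "params_ok": True}),
            ("submit to resource",  lambda j: {**j, "host": j["resource"]}),
            ("collect results",     lambda j: {**j, "results": "results/run-001"}),
        ])
        job = wf.run({"model": "hh_neuron", "resource": "cluster-a"})
        print(job)
        print(*wf.log, sep="\n")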

  7. Prospective randomized study of contrast reaction management curricula: Computer-based interactive simulation versus high-fidelity hands-on simulation

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Carolyn L., E-mail: wangcl@uw.edu [Department of Radiology, University of Washington, Box 357115, 1959 NE Pacific Street, Seattle, WA 98195-7115 (United States); Schopp, Jennifer G.; Kani, Kimia [Department of Radiology, University of Washington, Box 357115, 1959 NE Pacific Street, Seattle, WA 98195-7115 (United States); Petscavage-Thomas, Jonelle M. [Penn State Hershey Medical Center, Department of Radiology, 500 University Drive, Hershey, PA 17033 (United States); Zaidi, Sadaf; Hippe, Dan S.; Paladin, Angelisa M.; Bush, William H. [Department of Radiology, University of Washington, Box 357115, 1959 NE Pacific Street, Seattle, WA 98195-7115 (United States)

    2013-12-01

    Purpose: We developed a computer-based interactive simulation program for teaching contrast reaction management to radiology trainees and compared its effectiveness to high-fidelity hands-on simulation training. Materials and methods: IRB approved HIPAA compliant prospective study of 44 radiology residents, fellows and faculty who were randomized into either the high-fidelity hands-on simulation group or computer-based simulation group. All participants took separate written tests prior to and immediately after their intervention. Four months later participants took a delayed written test and a hands-on high-fidelity severe contrast reaction scenario performance test graded on predefined critical actions. Results: There was no statistically significant difference between the computer and hands-on groups’ written pretest, immediate post-test, or delayed post-test scores (p > 0.6 for all). Both groups’ scores improved immediately following the intervention (p < 0.001). The delayed test scores 4 months later were still significantly higher than the pre-test scores (p ≤ 0.02). The computer group's performance was similar to the hands-on group on the severe contrast reaction simulation scenario test (p = 0.7). There were also no significant differences between the computer and hands-on groups in performance on the individual core competencies of contrast reaction management during the contrast reaction scenario. Conclusion: It is feasible to develop a computer-based interactive simulation program to teach contrast reaction management. Trainees that underwent computer-based simulation training scored similarly on written tests and on a hands-on high-fidelity severe contrast reaction scenario performance test as those trained with hands-on high-fidelity simulation.

  8. Prospective randomized study of contrast reaction management curricula: Computer-based interactive simulation versus high-fidelity hands-on simulation

    International Nuclear Information System (INIS)

    Wang, Carolyn L.; Schopp, Jennifer G.; Kani, Kimia; Petscavage-Thomas, Jonelle M.; Zaidi, Sadaf; Hippe, Dan S.; Paladin, Angelisa M.; Bush, William H.

    2013-01-01

    Purpose: We developed a computer-based interactive simulation program for teaching contrast reaction management to radiology trainees and compared its effectiveness to high-fidelity hands-on simulation training. Materials and methods: IRB approved HIPAA compliant prospective study of 44 radiology residents, fellows and faculty who were randomized into either the high-fidelity hands-on simulation group or computer-based simulation group. All participants took separate written tests prior to and immediately after their intervention. Four months later participants took a delayed written test and a hands-on high-fidelity severe contrast reaction scenario performance test graded on predefined critical actions. Results: There was no statistically significant difference between the computer and hands-on groups’ written pretest, immediate post-test, or delayed post-test scores (p > 0.6 for all). Both groups’ scores improved immediately following the intervention (p < 0.001). The delayed test scores 4 months later were still significantly higher than the pre-test scores (p ≤ 0.02). The computer group's performance was similar to the hands-on group on the severe contrast reaction simulation scenario test (p = 0.7). There were also no significant differences between the computer and hands-on groups in performance on the individual core competencies of contrast reaction management during the contrast reaction scenario. Conclusion: It is feasible to develop a computer-based interactive simulation program to teach contrast reaction management. Trainees that underwent computer-based simulation training scored similarly on written tests and on a hands-on high-fidelity severe contrast reaction scenario performance test as those trained with hands-on high-fidelity simulation

  9. Systematic simulation of a tubular recycle reactor on the basis of pilot plant experiments

    Energy Technology Data Exchange (ETDEWEB)

    Paar, H; Narodoslawsky, M; Moser, A [Technische Univ., Graz (Austria). Inst. fuer Biotechnologie, Mikrobiologie und Abfalltechnologie

    1990-10-10

    Systematic simulation may decisively help in the development and optimization of bioprocesses. By applying simulation techniques, optimal use can be made of experimental data, decreasing development costs and increasing the accuracy of predictions of the behavior of an industrial scale plant. The procedure of the dialogue between simulation and experimental efforts is exemplified in a case study. Alcoholic fermentation of glucose by Zymomonas mobilis bacteria in a gasified tubular recycle reactor was studied first by systematic simulation, using a computer model based solely on literature data. On the basis of the results of this simulation, a 0.013 m³ pilot plant reactor was constructed. The pilot plant experiments, too, were based on the results of the systematic simulation. Simulated and experimental data were in good agreement. The pilot plant experiments reiterated the trends and limits of the process as shown by the simulation results. Data from the pilot plant runs were then used to improve the simulation model. This improved model was subsequently used to simulate the performance of an industrial scale plant. The results of this simulation are presented. They show that alcohol fermentation in a tubular recycle reactor is potentially advantageous compared with other reactor configurations, especially continuous stirred tanks. (orig.).
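
    The record does not reproduce the computer model, so the sketch below only illustrates the kind of kinetics such a fermentation simulation is typically built on, Monod growth with growth-coupled ethanol formation; all parameter values are assumed.

        # Illustrative Monod-kinetics sketch for glucose fermentation by
        # Zymomonas mobilis, integrated by forward Euler. Parameter values
        # are placeholders, not values from the paper.
        mu_max, Ks = 0.4, 0.5        # 1/h, g/L (assumed)
        Yxs, Yps = 0.05, 0.48        # biomass and ethanol yields, g/g (assumed)

        X, S, P = 0.1, 100.0, 0.0    # biomass, glucose, ethanol (g/L)
        dt, t = 0.05, 0.0
        while S > 0.1:
            mu = mu_max * S / (Ks + S)          # Monod specific growth rate
            dX = mu * X
            dS = -dX / Yxs                      # substrate consumed for growth
            dP = (Yps / Yxs) * dX               # ethanol coupled to consumption
            X, S, P = X + dt * dX, max(S + dt * dS, 0.0), P + dt * dP
            t += dt
        print(f"glucose exhausted after {t:.1f} h: X={X:.1f}, P={P:.1f} g/L")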

  10. A web-based, collaborative modeling, simulation, and parallel computing environment for electromechanical systems

    Directory of Open Access Journals (Sweden)

    Xiaoliang Yin

    2015-03-01

    A complex electromechanical system is usually composed of multiple components from different domains, including mechanical, electronic, hydraulic, control, and so on. Modeling and simulation of electromechanical systems on a unified platform is one of the research hotspots in systems engineering at present, and it is also the development trend of design for complex electromechanical systems. Unified modeling techniques and tools based on the Modelica language provide a satisfactory solution. To meet the requirements of collaborative modeling, simulation, and parallel computing for complex electromechanical systems based on Modelica, a general web-based modeling and simulation prototype environment, namely WebMWorks, is designed and implemented. Based on rich Internet application technologies, an interactive graphical user interface for modeling and post-processing in the web browser was implemented; with the collaborative design module, the environment supports top-down, concurrent modeling and team cooperation; additionally, a service-oriented architecture was applied to supply compiling and solving services which run on cloud-like servers, so the environment can manage and dispatch large-scale simulation tasks in parallel on multiple computing servers simultaneously. An engineering application concerning a pure electric vehicle was tested on WebMWorks. The results of the simulation and a parametric experiment demonstrate that the tested web-based environment can effectively shorten the design cycle of a complex electromechanical system.

  11. Computational Particle Dynamic Simulations on Multicore Processors (CPDMu) Final Report Phase I

    Energy Technology Data Exchange (ETDEWEB)

    Schmalz, Mark S

    2011-07-24

    Statement of Problem - The Department of Energy has many legacy codes for simulation of computational particle dynamics and computational fluid dynamics applications that are designed to run on sequential processors and are not easily parallelized. Emerging high-performance computing architectures employ massively parallel multicore architectures (e.g., graphics processing units) to increase throughput. Parallelization of legacy simulation codes is a high priority, to achieve compatibility, efficiency, accuracy, and extensibility. General Statement of Solution - A legacy simulation application designed for implementation on mainly-sequential processors has been represented as a graph G. Mathematical transformations, applied to G, produce a graph representation G′ for a high-performance architecture. Key computational and data movement kernels of the application were analyzed/optimized for parallel execution using the mapping G → G′, which can be performed semi-automatically. This approach is widely applicable to many types of high-performance computing systems, such as graphics processing units or clusters comprised of nodes that contain one or more such units. Phase I Accomplishments - Phase I research decomposed/profiled computational particle dynamics simulation code for rocket fuel combustion into low and high computational cost regions (respectively, mainly sequential and mainly parallel kernels), with analysis of space and time complexity. Using the research team's expertise in algorithm-to-architecture mappings, the high-cost kernels were transformed, parallelized, and implemented on Nvidia Fermi GPUs. Measured speedups (GPU with respect to single-core CPU) were approximately 20-32X for realistic model parameters, without final optimization. Error analysis showed no loss of computational accuracy. Commercial Applications and Other Benefits - The proposed research will constitute a breakthrough in solution of problems related to efficient

  12. Simulation experiments concerning functioning tests of technical safety devices with process computers. Pt. 2

    International Nuclear Information System (INIS)

    Hawickhorst, W.

    1976-12-01

    Computerized inspection techniques of engineered safety systems improve the diagnosis capability, relative to the presently used techniques, even if anticipated system disturbances can only be qualitatively predicted. To achieve this, the system to be inspected must be partitioned into small subsystems, which can be treated independently from each other. This report contains the formulation of a standardized inspection concept based on system decomposition. Its performance is discussed by means of simulation experiments. (orig.)

  13. Computational performance of a smoothed particle hydrodynamics simulation for shared-memory parallel computing

    Science.gov (United States)

    Nishiura, Daisuke; Furuichi, Mikito; Sakaguchi, Hide

    2015-09-01

    The computational performance of a smoothed particle hydrodynamics (SPH) simulation is investigated for three types of current shared-memory parallel computer devices: many integrated core (MIC) processors, graphics processing units (GPUs), and multi-core CPUs. We are especially interested in efficient shared-memory allocation methods for each chipset, because the efficient data access patterns differ between compute unified device architecture (CUDA) programming for GPUs and OpenMP programming for MIC processors and multi-core CPUs. We first introduce several parallel implementation techniques for the SPH code, and then examine these on our target computer architectures to determine the most effective algorithms for each processor unit. In addition, we evaluate the effective computing performance and power efficiency of the SPH simulation on each architecture, as these are critical metrics for overall performance in a multi-device environment. In our benchmark test, the GPU is found to produce the best arithmetic performance as a standalone device unit, and gives the most efficient power consumption. The multi-core CPU obtains the most effective computing performance. The computational speed of the MIC processor on Xeon Phi approached that of two Xeon CPUs. This indicates that using MICs is an attractive choice for existing SPH codes on multi-core CPUs parallelized by OpenMP, as it gains computational acceleration without the need for significant changes to the source code.
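
    As a point of reference for what is being benchmarked, the following NumPy sketch computes the core SPH density summation rho_i = sum_j m_j W(|r_i - r_j|, h) with a Gaussian kernel and brute-force neighbour search; production codes replace this with cell lists and OpenMP or CUDA kernels, and all sizes here are illustrative.

        # Core SPH operation whose memory-access pattern the paper tunes per
        # architecture: the density summation, vectorized in NumPy with an
        # O(N^2) brute-force neighbour search (illustrative only).
        import numpy as np

        def sph_density(positions, masses, h):
            # pairwise separations, shape (N, N, 3)
            diff = positions[:, None, :] - positions[None, :, :]
            r2 = np.sum(diff * diff, axis=-1)
            w = np.exp(-r2 / h**2) / (np.pi ** 1.5 * h**3)   # 3-D Gaussian kernel
            return w @ masses                                 # rho_i for all i

        rng = np.random.default_rng(0)
        pos = rng.random((1000, 3))                           # particles in a unit cube
        rho = sph_density(pos, masses=np.full(1000, 1e-3), h=0.1)
        print(f"mean density: {rho.mean():.3f}")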

  14. Digital control computer upgrade at the Cernavoda NPP simulator

    International Nuclear Information System (INIS)

    Ionescu, T.

    2006-01-01

    The Plant Process Computer equips some nuclear power plants, like the CANDU-600, with centralized control performed by an assembly of two computers, known as the Digital Control Computers (DCC), working in parallel to drive the plant safely at steady state and during normal manoeuvres, and also during abnormal transients, when the plant is automatically steered to a safe state. Centralized control comprises both hardware and software, which must be present in the Full Scope Simulator; its configuration is subject to change, following specific requirements, during the life of the plant and of the simulator, as covered by this subsection.

  15. Estimating social carrying capacity through computer simulation modeling: an application to Arches National Park, Utah

    Science.gov (United States)

    Benjamin Wang; Robert E. Manning; Steven R. Lawson; William A. Valliere

    2001-01-01

    Recent research and management experience has led to several frameworks for defining and managing carrying capacity of national parks and related areas. These frameworks rely on monitoring indicator variables to ensure that standards of quality are maintained. The objective of this study was to develop a computer simulation model to estimate the relationships between...

  16. Soft-error tolerance and energy consumption evaluation of embedded computer with magnetic random access memory in practical systems using computer simulations

    Science.gov (United States)

    Nebashi, Ryusuke; Sakimura, Noboru; Sugibayashi, Tadahiko

    2017-08-01

    We evaluated the soft-error tolerance and energy consumption of an embedded computer with magnetic random access memory (MRAM) using two computer simulators. One is a central processing unit (CPU) simulator of a typical embedded computer system. We simulated the radiation-induced single-event-upset (SEU) probability in a spin-transfer-torque MRAM cell and also the failure rate of a typical embedded computer due to its main memory SEU error. The other is a delay tolerant network (DTN) system simulator. It simulates the power dissipation of wireless sensor network nodes of the system using a revised CPU simulator and a network simulator. We demonstrated that the SEU effect on the embedded computer with 1 Gbit MRAM-based working memory is less than 1 failure in time (FIT). We also demonstrated that the energy consumption of the DTN sensor node with MRAM-based working memory can be reduced to 1/11. These results indicate that MRAM-based working memory enhances the disaster tolerance of embedded computers.
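
    The record's headline figure (less than 1 FIT for a 1 Gbit MRAM working memory) is an instance of simple rate arithmetic, sketched below with an assumed per-bit upset rate; the actual rate simulated in the paper is not given in the record.

        # Back-of-envelope FIT arithmetic: FIT counts failures per 10^9
        # device-hours, so the system rate is (bits) x (per-bit upset rate).
        # The per-bit rate below is an assumed placeholder, not a value
        # from the paper.
        bits = 1e9                      # 1 Gbit MRAM working memory
        seu_per_bit_hour = 5e-19        # assumed per-bit upset rate, 1/h

        failures_per_hour = bits * seu_per_bit_hour
        fit = failures_per_hour * 1e9   # failures per 10^9 hours
        print(f"system rate: {fit:.2f} FIT")   # < 1 FIT for this assumed rate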

  17. A note on simulated annealing to computer laboratory scheduling ...

    African Journals Online (AJOL)

    The concepts, principles and implementation of simulated annealing as a modern heuristic technique are presented. The simulated annealing algorithm is used to solve a real-life problem of computer laboratory scheduling in order to maximize the use of scarce and insufficient resources. KEY WORDS: Simulated Annealing ...
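
    The note does not include an implementation, so the following is a generic simulated-annealing sketch applied to a toy laboratory-scheduling objective; the cost function, neighbourhood move, and cooling schedule are all illustrative assumptions.

        # Generic simulated annealing for a toy lab-scheduling problem:
        # assign course sessions to (lab, slot) pairs, minimising clashes.
        import math
        import random

        rng = random.Random(42)
        SESSIONS, LABS, SLOTS = 30, 3, 10

        def cost(assign):
            # count sessions that collide on the same (lab, slot)
            seen, clashes = {}, 0
            for s in assign:
                clashes += seen.get(s, 0)
                seen[s] = seen.get(s, 0) + 1
            return clashes

        state = [(rng.randrange(LABS), rng.randrange(SLOTS)) for _ in range(SESSIONS)]
        T, alpha = 5.0, 0.995
        best, best_cost = state[:], cost(state)
        for _ in range(20_000):
            i = rng.randrange(SESSIONS)                      # move one session
            cand = state[:]
            cand[i] = (rng.randrange(LABS), rng.randrange(SLOTS))
            delta = cost(cand) - cost(state)
            if delta <= 0 or rng.random() < math.exp(-delta / T):
                state = cand                                 # Metropolis acceptance
                if cost(state) < best_cost:
                    best, best_cost = state[:], cost(state)
            T *= alpha                                       # geometric cooling
        print(f"final clashes: {best_cost}")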

  18. Computer simulation study of the displacement threshold-energy surface in Cu

    International Nuclear Information System (INIS)

    King, W.E.; Benedek, R.

    1981-01-01

    Computer simulations were performed using the molecular-dynamics technique to determine the directional dependence of the threshold energy for production of stable Frenkel pairs in copper. Sharp peaks were observed in the simulated threshold-energy surface in between the low-index directions. Threshold energies ranged from approximately 25 eV for directions near the low-index axes to 180 eV at the position of the peak between them. The general topographical features of the simulated threshold-energy surface are in good agreement with those determined from an analysis of recent experiments by King et al. on the basis of a Frenkel-pair resistivity ρ_F = 2.85 × 10⁻⁴ Ω cm. Evidence is presented in favor of this number as opposed to the usually assumed value, ρ_F = 2.00 × 10⁻⁴ Ω cm. The energy dependence of defect production in a number of directions was investigated to determine the importance of nonproductive events above threshold

  19. Computer Simulations of Resonant Coherent Excitation of Heavy Hydrogen-Like Ions Under Planar Channeling

    Science.gov (United States)

    Babaev, A. A.; Pivovarov, Yu L.

    2010-04-01

    Resonant coherent excitation (RCE) of relativistic hydrogen-like ions is investigated by computer simulation methods. The suggested theoretical model is applied to simulations of recent experiments on RCE of 390 MeV/u Ar17+ ions under (220) planar channeling in a Si crystal performed by T. Azuma et al. at HIMAC (Tokyo). Theoretical results are in good agreement with these experimental data and clearly show the appearance of the doublet structure of RCE peaks. The simulations are also extended to greater ion energies in order to predict new RCE features at the future accelerator facility FAIR (GSI); as an example, RCE of 11 GeV/u U91+ ions is considered in detail.

  20. Sharing experience and knowledge with wearable computers

    OpenAIRE

    Nilsson, Marcus; Drugge, Mikael; Parnes, Peter

    2004-01-01

    Wearable computers have mostly been looked at when used in isolation. But a wearable computer with an Internet connection is a good tool for communication and for sharing knowledge and experience with other people. The unobtrusiveness of this type of equipment makes it easy to communicate in most types of locations and contexts. The wearable computer makes it easy to be a mediator of other people's knowledge and to become a knowledgeable user. This paper describes the experience gained from testing...

  1. Time reversibility, computer simulation, algorithms, chaos

    CERN Document Server

    Hoover, William Graham

    2012-01-01

    A small army of physicists, chemists, mathematicians, and engineers has joined forces to attack a classic problem, the "reversibility paradox", with modern tools. This book describes their work from the perspective of computer simulation, emphasizing the author's approach to the problem of understanding the compatibility, and even inevitability, of the irreversible second law of thermodynamics with an underlying time-reversible mechanics. Computer simulation has made it possible to probe reversibility from a variety of directions and "chaos theory" or "nonlinear dynamics" has supplied a useful vocabulary and a set of concepts, which allow a fuller explanation of irreversibility than that available to Boltzmann or to Green, Kubo and Onsager. Clear illustration of concepts is emphasized throughout, and reinforced with a glossary of technical terms from the specialized fields which have been combined here to focus on a common theme. The book begins with a discussion, contrasting the idealized reversibility of ba...

  2. Simulation of Robot Kinematics Using Interactive Computer Graphics.

    Science.gov (United States)

    Leu, M. C.; Mahajan, R.

    1984-01-01

    Development of a robot simulation program based on geometric transformation softwares available in most computer graphics systems and program features are described. The program can be extended to simulate robots coordinating with external devices (such as tools, fixtures, conveyors) using geometric transformations to describe the…

  3. Computer simulations of long-time tails: what's new?

    NARCIS (Netherlands)

    Hoef, van der M.A.; Frenkel, D.

    1995-01-01

    Twenty-five years ago Alder and Wainwright discovered, by simulation, the 'long-time tails' in the velocity autocorrelation function of a single particle in a fluid [1]. Since then, few qualitatively new results on long-time tails have been obtained by computer simulations. However, within the

  4. Computer-assisted comparison of analysis and test results in transportation experiments

    International Nuclear Information System (INIS)

    Knight, R.D.; Ammerman, D.J.; Koski, J.A.

    1998-01-01

    As a part of its ongoing research efforts, Sandia National Laboratories' Transportation Surety Center investigates the integrity of various containment methods for hazardous materials transport, subject to anomalous structural and thermal events such as free-fall impacts, collisions, and fires in both open and confined areas. Since it is not possible to conduct field experiments for every set of possible conditions under which an actual transportation accident might occur, accurate modeling methods must be developed which will yield reliable simulations of the effects of accident events under various scenarios. This requires computer software which is capable of assimilating and processing data from experiments performed as benchmarks, as well as data obtained from numerical models that simulate the experiment. Software tools which can present all of these results in a meaningful and useful way to the analyst are a critical aspect of this process. The purpose of this work is to provide software resources on a long term basis, and to ensure that the data visualization capabilities of the Center keep pace with advancing technology. This will provide leverage for its modeling and analysis abilities in a rapidly evolving hardware/software environment

  5. Faster quantum chemistry simulation on fault-tolerant quantum computers

    International Nuclear Information System (INIS)

    Cody Jones, N; McMahon, Peter L; Yamamoto, Yoshihisa; Whitfield, James D; Yung, Man-Hong; Aspuru-Guzik, Alán; Van Meter, Rodney

    2012-01-01

    Quantum computers can in principle simulate quantum physics exponentially faster than their classical counterparts, but some technical hurdles remain. We propose methods which substantially improve the performance of a particular form of simulation, ab initio quantum chemistry, on fault-tolerant quantum computers; these methods generalize readily to other quantum simulation problems. Quantum teleportation plays a key role in these improvements and is used extensively as a computing resource. To improve execution time, we examine techniques for constructing arbitrary gates which perform substantially faster than circuits based on the conventional Solovay–Kitaev algorithm (Dawson and Nielsen 2006 Quantum Inform. Comput. 6 81). For a given approximation error ϵ, arbitrary single-qubit gates can be produced fault-tolerantly and using a restricted set of gates in time which is O(log(1/ϵ)) or O(log log(1/ϵ)); with sufficient parallel preparation of ancillas, constant average depth is possible using a method we call programmable ancilla rotations. Moreover, we construct and analyze efficient implementations of first- and second-quantized simulation algorithms using the fault-tolerant arbitrary gates and other techniques, such as implementing various subroutines in constant time. A specific example we analyze is the ground-state energy calculation for lithium hydride. (paper)

  6. Brownfield Action: An education through an environmental science simulation experience for undergraduates

    Science.gov (United States)

    Kelsey, Ryan Daniel

    Brownfield Action is a computer simulation experience used by undergraduates in an Introduction to Environmental Science course for non-science majors at Barnard College. Students play the role of environmental consultants given the semester-long task of investigating a potentially contaminated landsite in a simulated town. The simulation serves as the integration mechanism for the entire course. The project is a collaboration between Professor Bower and the Columbia University Center for New Media Teaching and Learning (CCNMTL). This study chronicles the discovery, design, development, implementation, and evaluation of this project over its four-year history from prototype to full-fledged semester-long integrated lecture and lab experience. The complete project history serves as a model for the development of best practices in contributing to the field of educational technology in higher education through the study of fully designed and implemented projects in real classrooms. Recommendations from the project focus on linking the laboratory and lecture portions of a course, the use of simulations (especially for novice students), instructor adaptation to the use of technology, general educational technology project development, and design research, among others. Findings from the study also emphasize the uniqueness of individual student's growth through the experience, and the depth of understanding that can be gained from embracing the complexity of studying sophisticated learning environments in real classrooms.

  7. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    Science.gov (United States)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, and enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
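
    A minimal sketch of the modelling idea described here, assuming exponentially distributed arrivals and service times and a fixed server pool (the paper's actual demand distributions and resource constraints are not given in the record):

        # Discrete-event sketch: requests arrive at random and are served by
        # a capped pool of cloud servers; we measure mean time in system.
        # Rates and pool sizes are illustrative.
        import heapq
        import random

        def simulate(n_servers=4, arrival_rate=3.0, service_rate=1.0,
                     n_requests=50_000, seed=7):
            rng = random.Random(seed)
            t_arrive = 0.0
            free_at = [0.0] * n_servers              # next-free time per server
            heapq.heapify(free_at)
            total_time = 0.0
            for _ in range(n_requests):
                t_arrive += rng.expovariate(arrival_rate)
                earliest = heapq.heappop(free_at)    # soonest-available server
                start = max(t_arrive, earliest)      # wait if all servers busy
                finish = start + rng.expovariate(service_rate)
                heapq.heappush(free_at, finish)
                total_time += finish - t_arrive      # time in system
            return total_time / n_requests

        for servers in (4, 6, 8):
            print(f"{servers} servers: mean time in system = {simulate(servers):.2f}")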

  8. Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure.

    Science.gov (United States)

    Wang, Henry; Ma, Yunzhi; Pratx, Guillem; Xing, Lei

    2011-09-07

    Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach for high performance computing and is implemented to perform ultra-fast MC calculation in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of cloud-based MC simulation is identical to that produced by single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47× speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed.
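
    The figures quoted above (2.58 h on one computer, 3.3 min on 100 nodes, runtime scaling inversely with node count) are consistent with a runtime model T(N) = T1/N + c for a fixed overhead c; the sketch below recovers the implied overhead and extrapolates it, purely as an illustration.

        # Scaling arithmetic implied by the record's reported figures.
        T1 = 2.58 * 60          # single-computer runtime, minutes
        T100 = 3.3              # 100-node cloud runtime, minutes

        overhead = T100 - T1 / 100                 # minutes not amortised by N
        speedup = T1 / T100
        print(f"speed-up: {speedup:.0f}x, implied fixed overhead: {overhead:.2f} min")

        # Same model extrapolated to other cluster sizes (illustrative only).
        for n in (10, 50, 200):
            print(f"N={n:3d}: predicted runtime {T1 / n + overhead:.1f} min")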

  9. Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure

    International Nuclear Information System (INIS)

    Wang, Henry; Ma Yunzhi; Pratx, Guillem; Xing Lei

    2011-01-01

    Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach for high performance computing and is implemented to perform ultra-fast MC calculation in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of cloud-based MC simulation is identical to that produced by single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47x speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed. (note)

  10. Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Henry [Department of Electrical Engineering, Stanford University, Stanford, CA 94305 (United States); Ma Yunzhi; Pratx, Guillem; Xing Lei, E-mail: hwang41@stanford.edu [Department of Radiation Oncology, Stanford University School of Medicine, Stanford, CA 94305-5847 (United States)

    2011-09-07

    Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach for high performance computing and is implemented to perform ultra-fast MC calculation in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of cloud-based MC simulation is identical to that produced by single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47x speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed. (note)

  11. The Information Science Experiment System - The computer for science experiments in space

    Science.gov (United States)

    Foudriat, Edwin C.; Husson, Charles

    1989-01-01

    The concept of the Information Science Experiment System (ISES), potential experiments, and system requirements are reviewed. The ISES is conceived as a computer resource in space whose aim is to assist computer, earth, and space science experiments, to develop and demonstrate new information processing concepts, and to provide an experiment base for developing new information technology for use in space systems. The discussion covers system hardware and architecture, operating system software, the user interface, and the ground communication link.

  12. Computer Simulations for Lab Experiences in Secondary Physics

    Science.gov (United States)

    Murphy, David Shannon

    2012-01-01

    Physical science instruction often involves modeling natural systems, such as electricity, that possess particles invisible to the unaided eye. The effect of the particles' motion is observable, but the particles themselves are not directly observable to humans. Simulations have been developed in physics, chemistry and biology that, under certain…

  13. The advanced computational testing and simulation toolkit (ACTS)

    International Nuclear Information System (INIS)

    Drummond, L.A.; Marques, O.

    2002-01-01

    During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Distinctively, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips; but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, reduction of duplicate efforts

  14. The advanced computational testing and simulation toolkit (ACTS)

    Energy Technology Data Exchange (ETDEWEB)

    Drummond, L.A.; Marques, O.

    2002-05-21

    During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Distinctively, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips; but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, reduction of duplicate efforts

  15. The Use of Computer Simulation Gaming in Teaching Broadcast Economics.

    Science.gov (United States)

    Mancuso, Louis C.

    The purpose of this study was to develop a broadcast economic computer simulation and to ascertain how a lecture-computer simulation game compared as a teaching method with a more traditional lecture and case study instructional methods. In each of three sections of a broadcast economics course, a different teaching methodology was employed: (1)…

  16. Simulating and assessing boson sampling experiments with phase-space representations

    Science.gov (United States)

    Opanchuk, Bogdan; Rosales-Zárate, Laura; Reid, Margaret D.; Drummond, Peter D.

    2018-04-01

    The search for new, application-specific quantum computers designed to outperform any classical computer is driven by the ending of Moore's law and the quantum advantages potentially obtainable. Photonic networks are promising examples, with experimental demonstrations and potential for obtaining a quantum computer to solve problems believed classically impossible. This introduces a challenge: how does one design or understand such photonic networks? One must be able to calculate observables using general methods capable of treating arbitrary inputs, dissipation, and noise. We develop complex phase-space software for simulating these photonic networks, and apply this to boson sampling experiments. Our techniques give sampling errors orders of magnitude lower than experimental correlation measurements for the same number of samples. We show that these techniques remove systematic errors in previous algorithms for estimating correlations, with large improvements in errors in some cases. In addition, we obtain a scalable channel-combination strategy for assessment of boson sampling devices.

  17. Aerodynamics of ski jumping: experiments and CFD simulations

    Energy Technology Data Exchange (ETDEWEB)

    Meile, W.; Reisenberger, E.; Brenn, G. [Graz University of Technology, Institute of Fluid Mechanics and Heat Transfer, Graz (Austria); Mayer, M. [VRVis GmbH, Vienna (Austria); Schmoelzer, B.; Mueller, W. [Medical University of Graz, Department for Biophysics, Graz (Austria)

    2006-12-15

    The aerodynamic behaviour of a model ski jumper is investigated experimentally at full-scale Reynolds numbers and computationally applying a standard RANS code. In particular we focus on the influence of different postures on aerodynamic forces in a wide range of angles of attack. The experimental results proved to be in good agreement with full-scale measurements with athletes in much larger wind tunnels, and form a reliable basis for further predictions of the effects of position changes on the performance. The comparison of CFD results with the experiments shows poor agreement, but enables a clear outline of simulation potentials and limits when accurate predictions of effects from small variations are required. (orig.)

  18. Aerodynamics of ski jumping: experiments and CFD simulations

    Science.gov (United States)

    Meile, W.; Reisenberger, E.; Mayer, M.; Schmölzer, B.; Müller, W.; Brenn, G.

    2006-12-01

    The aerodynamic behaviour of a model ski jumper is investigated experimentally at full-scale Reynolds numbers and computationally applying a standard RANS code. In particular we focus on the influence of different postures on aerodynamic forces in a wide range of angles of attack. The experimental results proved to be in good agreement with full-scale measurements with athletes in much larger wind tunnels, and form a reliable basis for further predictions of the effects of position changes on the performance. The comparison of CFD results with the experiments shows poor agreement, but enables a clear outline of simulation potentials and limits when accurate predictions of effects from small variations are required.

  19. The effect of metacognitive monitoring feedback on performance in a computer-based training simulation.

    Science.gov (United States)

    Kim, Jung Hyup

    2018-02-01

    This laboratory experiment was designed to study the effect of metacognitive monitoring feedback on performance in a computer-based training simulation. According to prior research on metacognition, the accurate checking of learning is a critical part of improving the quality of human performance. However, only rarely have researchers studied the learning effects of the accurate checking of retrospective confidence judgments (RCJs) during a computer-based military training simulation. In this study, we provided participants feedback screens after they had completed a warning task and identification task in a radar monitoring simulation. There were two groups in this experiment. One group (group A) viewed the feedback screens with the flight path of all target aircraft and the triangular graphs of both RCJ scores and human performance together. The other group (group B) only watched the feedback screens with the flight path of all target aircraft. There was no significant difference in performance improvement between groups A and B for the warning task (Day 1: group A = 0.347, group B = 0.305; Day 2: group A = 0.488, group B = 0.413). However, the identification task yielded a significant difference in performance improvement between these groups (Day 1: group A = 0.174, group B = 0.1555; Day 2: group A = 0.324, group B = 0.199). The results show that debiasing self-judgment of the identification task produces a positive training effect on learners. The findings of this study will be beneficial for designing an advanced instructional strategy in a simulation-based training environment. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Computer simulation of two-phase flow in nuclear reactors

    International Nuclear Information System (INIS)

    Wulff, W.

    1993-01-01

    Two-phase flow models dominate the requirements of economic resources for the development and use of computer codes which serve to analyze thermohydraulic transients in nuclear power plants. An attempt is made to reduce the effort of analyzing reactor transients by combining purpose-oriented modelling with advanced computing techniques. Six principles are presented on mathematical modeling and the selection of numerical methods, along with suggestions on programming and machine selection, all aimed at reducing the cost of analysis. Computer simulation is contrasted with traditional computer calculation. The advantages of run-time interactive access operation in a simulation environment are demonstrated. It is explained that the drift-flux model is better suited than the two-fluid model for the analysis of two-phase flow in nuclear reactors, because of the latter's closure problems. The advantage of analytical over numerical integration is demonstrated. Modeling and programming techniques are presented which minimize the number of needed arithmetical and logical operations and thereby increase the simulation speed, while decreasing the cost. (orig.)
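
    For context, the drift-flux model favoured above closes the two-phase problem algebraically. In the classic Zuber-Findlay form (quoted here for illustration; the specific closure used in any given reactor code may differ), the void fraction follows directly from the phase superficial velocities:

    def void_fraction(j_g, j_l, c0=1.13, v_gj=0.24):
        """Zuber-Findlay drift-flux relation:
        alpha = j_g / (C0*(j_g + j_l) + v_gj),
        with superficial velocities j in m/s. The distribution parameter
        C0 and drift velocity v_gj are flow-regime-dependent closures;
        the default values here are illustrative only."""
        return j_g / (c0 * (j_g + j_l) + v_gj)

    print(f"alpha = {void_fraction(j_g=0.5, j_l=1.0):.3f}")   # ~0.26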

  1. Computer simulation of molecular sorption in zeolites

    International Nuclear Information System (INIS)

    Calmiano, Mark Daniel

    2001-01-01

    The work presented in this thesis encompasses the computer simulation of molecular sorption. In Chapter 1 we outline the aims and objectives of this work. Chapter 2 follows in which an introduction to sorption in zeolites is presented, with discussion of structure and properties of the main zeolites studied. Chapter 2 concludes with a description of the principles and theories of adsorption. In Chapter 3 we describe the methodology behind the work carried out in this thesis. In Chapter 4 we present our first computational study, that of the sorption of krypton in silicalite. We describe work carried out to investigate low energy sorption sites of krypton in silicalite where we observe krypton to preferentially sorb into straight and sinusoidal channels over channel intersections. We simulate single step type I adsorption isotherms and use molecular dynamics to study the diffusion of krypton and obtain diffusion coefficients and the activation energy. We compare our results to previous experimental and computational studies where we show our work to be in good agreement. In Chapter 5 we present a systematic study of the sorption of oxygen and nitrogen in five lithium substituted zeolites using a transferable interatomic potential that we have developed from ab initio calculations. We show increased loading of nitrogen compared to oxygen in all five zeolites studied as expected and simulate adsorption isotherms, which we compare to experimental and simulated data in the literature. In Chapter 6 we present work on the sorption of ferrocene in the zeolite NaY. We show that a simulated, low energy sorption site for ferrocene is correctly located by comparing to X-ray powder diffraction results for this same system. The thesis concludes with some overall conclusions and discussion of opportunities for future work. (author)

  2. Factors cost effectively improved using computer simulations of ...

    African Journals Online (AJOL)

    LPhidza

    effectively managed using computer simulations in semi-arid conditions pertinent to much of sub-Saharan Africa. ... small scale farmers to obtain optimal crop yields thus ensuring their food security and livelihood is ... those that simultaneously incorporate and simulate processes involved throughout the course of crop ...

  3. CloudMC: a cloud computing application for Monte Carlo simulation

    International Nuclear Information System (INIS)

    Miras, H; Jiménez, R; Miras, C; Gomà, C

    2013-01-01

    This work presents CloudMC, a cloud computing application—developed in Windows Azure®, the platform of the Microsoft® cloud—for the parallelization of Monte Carlo simulations in a dynamic virtual cluster. CloudMC is a web application designed to be independent of the Monte Carlo code in which the simulations are based—the simulations just need to be of the form: input files → executable → output files. To study the performance of CloudMC in Windows Azure®, Monte Carlo simulations with penelope were performed on different instance (virtual machine) sizes, and for different number of instances. The instance size was found to have no effect on the simulation runtime. It was also found that the decrease in time with the number of instances followed Amdahl's law, with a slight deviation due to the increase in the fraction of non-parallelizable time with increasing number of instances. A simulation that would have required 30 h of CPU on a single instance was completed in 48.6 min when executed on 64 instances in parallel (speedup of 37 ×). Furthermore, the use of cloud computing for parallel computing offers some advantages over conventional clusters: high accessibility, scalability and pay per usage. Therefore, it is strongly believed that cloud computing will play an important role in making Monte Carlo dose calculation a reality in future clinical practice. (note)
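
    The Amdahl's-law behaviour reported above is easy to check. Treating the observed 37x speedup on 64 instances as exact, the implied parallelizable fraction, and the predicted speedup at other cluster sizes, follow from S(N) = 1 / ((1 - p) + p/N) (illustrative arithmetic only, not figures from the paper):

    N, S = 64, 37.0
    p = (1 - 1 / S) / (1 - 1 / N)                  # Amdahl's law solved for p
    print(f"implied parallel fraction: {p:.4f}")   # ~0.988

    def speedup(n, p=p):
        return 1.0 / ((1 - p) + p / n)

    for n in (16, 64, 256):
        print(f"{n:4d} instances -> {speedup(n):5.1f}x")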

  4. CloudMC: a cloud computing application for Monte Carlo simulation.

    Science.gov (United States)

    Miras, H; Jiménez, R; Miras, C; Gomà, C

    2013-04-21

    This work presents CloudMC, a cloud computing application, developed in Windows Azure®, the platform of the Microsoft® cloud, for the parallelization of Monte Carlo simulations in a dynamic virtual cluster. CloudMC is a web application designed to be independent of the Monte Carlo code in which the simulations are based; the simulations just need to be of the form: input files → executable → output files. To study the performance of CloudMC in Windows Azure®, Monte Carlo simulations with penelope were performed on different instance (virtual machine) sizes, and for different number of instances. The instance size was found to have no effect on the simulation runtime. It was also found that the decrease in time with the number of instances followed Amdahl's law, with a slight deviation due to the increase in the fraction of non-parallelizable time with increasing number of instances. A simulation that would have required 30 h of CPU on a single instance was completed in 48.6 min when executed on 64 instances in parallel (speedup of 37 ×). Furthermore, the use of cloud computing for parallel computing offers some advantages over conventional clusters: high accessibility, scalability and pay per usage. Therefore, it is strongly believed that cloud computing will play an important role in making Monte Carlo dose calculation a reality in future clinical practice.

  5. Flocking and self-defense: experiments and simulations of avian mobbing

    Science.gov (United States)

    Kane, Suzanne Amador

    2011-03-01

    We have performed motion capture studies in the field of avian mobbing, in which flocks of prey birds harass predatory birds. Our empirical studies cover both field observations of mobbing occurring in mid-air, where both predator and prey are in flight, and an experimental system using actual prey birds and simulated predator ``perch and wait'' strategies. To model our results and establish the effectiveness of mobbing flight paths at minimizing risk of capture while optimizing predator harassment, we have performed computer simulations using the actual measured trajectories of mobbing prey birds combined with model predator trajectories. To accurately simulate predator motion, we also measured raptor acceleration and flight dynamics, as well as prey-pursuit strategies. These experiments and theoretical studies were all performed with undergraduate research assistants in a liberal arts college setting. This work illustrates how biological physics provides undergraduate research projects well-suited to the abilities of physics majors with interdisciplinary science interests and diverse backgrounds.

  6. Computer simulation of grain growth in HAZ

    Science.gov (United States)

    Gao, Jinhua

    Two different models for Monte Carlo simulation of normal grain growth in metals and alloys were developed. Each simulation model was based on a different approach to couple the Monte Carlo simulation time to real time-temperature. These models demonstrated the applicability of Monte Carlo simulation to grain growth in materials processing. A grain boundary migration (GBM) model coupled the Monte Carlo simulation to a first principle grain boundary migration model. The simulation results, by applying this model to isothermal grain growth in zone-refined tin, showed good agreement with experimental results. An experimental data based (EDB) model coupled the Monte Carlo simulation with grain growth kinetics obtained from the experiment. The results of the application of the EDB model to the grain growth during continuous heating of a beta titanium alloy correlated well with experimental data. In order to acquire the grain growth kinetics from the experiment, a new mathematical method was developed and utilized to analyze the experimental data on isothermal grain growth. Grain growth in the HAZ of 0.2% Cu-Al alloy was successfully simulated using the EDB model combined with grain growth kinetics obtained from the experiment and measured thermal cycles from the welding process. The simulated grain size distribution in the HAZ was in good agreement with experimental results. The pinning effect of second phase particles on grain growth was also simulated in this work. The simulation results confirmed that by introducing the variable R, the degree of contact between grain boundaries and second phase particles, the Zener pinning model can be modified as $D/r = K/(Rf)$, where D is the pinned grain size, r the mean size of second phase particles, K a constant, and f the area fraction (or the volume fraction in 3-D) of second phase.
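
    A minimal sketch of the underlying machinery, a 2-D Potts-model Monte Carlo step of the kind used for grain growth, is given below (the paper's GBM and EDB time-coupling schemes and the particle-pinning term are not reproduced; lattice size, Q and kT are arbitrary):

    import numpy as np

    rng = np.random.default_rng(0)
    L, Q = 64, 32                      # lattice size, number of grain orientations
    spins = rng.integers(0, Q, size=(L, L))

    def neighbors(i, j):
        return [((i + 1) % L, j), ((i - 1) % L, j), (i, (j + 1) % L), (i, (j - 1) % L)]

    def site_energy(s, i, j, value):
        # Each unlike neighbour pair costs one unit of boundary energy.
        return sum(value != s[n] for n in neighbors(i, j))

    def mc_step(s, kT=0.1):
        for _ in range(L * L):         # one Monte Carlo step = L*L attempts
            i, j = rng.integers(0, L, size=2)
            new = s[neighbors(i, j)[rng.integers(0, 4)]]   # copy a neighbour's orientation
            dE = site_energy(s, i, j, new) - site_energy(s, i, j, s[i, j])
            if dE <= 0 or rng.random() < np.exp(-dE / kT):
                s[i, j] = new

    for _ in range(50):
        mc_step(spins)
    print("grain orientations remaining:", len(np.unique(spins)))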

  7. A real-time computer simulation of nuclear simulator software using standard PC hardware and linux environments

    International Nuclear Information System (INIS)

    Cha, K. H.; Kweon, K. C.

    2001-01-01

    A feasibility study, in which standard PC hardware and Real-Time Linux are applied to the real-time computer simulation of nuclear simulator software, is presented in this paper. The feasibility prototype was established with the existing software in the Compact Nuclear Simulator (CNS). Through the real-time implementation in the feasibility prototype, we have identified that this approach can enable computer-based predictive simulation, owing to both the remarkable improvement in real-time performance and the reduced effort of real-time implementation under standard PC hardware and Real-Time Linux environments

  8. Formal Analysis of Dynamics Within Philosophy of Mind by Computer Simulation

    NARCIS (Netherlands)

    Bosse, T.; Schut, M.C.; Treur, J.

    2009-01-01

    Computer simulations can be useful tools to support philosophers in validating their theories, especially when these theories concern phenomena showing nontrivial dynamics. Such theories are usually informal, whilst for computer simulation a formally described model is needed. In this paper, a

  9. Fast Simulation of Large-Scale Floods Based on GPU Parallel Computing

    OpenAIRE

    Qiang Liu; Yi Qin; Guodong Li

    2018-01-01

    Computing speed is a significant issue of large-scale flood simulations for real-time response to disaster prevention and mitigation. Even today, most of the large-scale flood simulations are generally run on supercomputers due to the massive amounts of data and computations necessary. In this work, a two-dimensional shallow water model based on an unstructured Godunov-type finite volume scheme was proposed for flood simulation. To realize a fast simulation of large-scale floods on a personal...

  10. Computer simulation studies in condensed-matter physics 5. Proceedings

    International Nuclear Information System (INIS)

    Landau, D.P.; Mon, K.K.; Schuettler, H.B.

    1993-01-01

    As the role of computer simulations began to increase in importance, we sensed a need for a ''meeting place'' for both experienced simulators and neophytes to discuss new techniques and results in an environment which promotes extended discussion. As a consequence of these concerns, The Center for Simulational Physics established an annual workshop on Recent Developments in Computer Simulation Studies in Condensed-Matter Physics. This year's workshop was the fifth in this series and the interest which the scientific community has shown demonstrates quite clearly the useful purpose which the series has served. The workshop was held at the University of Georgia, February 17-21, 1992, and these proceedings form a record of the workshop which is published with the goal of timely dissemination of the papers to a wider audience. The proceedings are divided into four parts. The first part contains invited papers which deal with simulational studies of classical systems and includes an introduction to some new simulation techniques and special purpose computers as well. A separate section of the proceedings is devoted to invited papers on quantum systems including new results for strongly correlated electron and quantum spin models. The third section is comprised of a single, invited description of a newly developed software shell designed for running parallel programs. The contributed presentations comprise the final chapter. (orig.). 79 figs

  11. A compositional reservoir simulator on distributed memory parallel computers

    International Nuclear Information System (INIS)

    Rame, M.; Delshad, M.

    1995-01-01

    This paper presents the application of distributed memory parallel computers to field scale reservoir simulations using a parallel version of UTCHEM, The University of Texas Chemical Flooding Simulator. The model is a general purpose highly vectorized chemical compositional simulator that can simulate a wide range of displacement processes at both field and laboratory scales. The original simulator was modified to run on both distributed memory parallel machines (Intel iPSC/860 and Delta, Connection Machine 5, Kendall Square 1 and 2, and CRAY T3D) and a cluster of workstations. A domain decomposition approach has been taken towards parallelization of the code. A portion of the discrete reservoir model is assigned to each processor by a set-up routine that attempts a data layout as even as possible from the load-balance standpoint. Each of these subdomains is extended so that data can be shared between adjacent processors for stencil computation. The added routines that make parallel execution possible are written in a modular fashion that makes the porting to new parallel platforms straightforward. Results of the distributed memory computing performance of the parallel simulator are presented for field scale applications such as tracer floods and polymer floods. A comparison of the wall-clock times for the same problems on a vector supercomputer is also presented
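
    The ghost-cell data sharing described above can be illustrated with a one-dimensional toy, with the "processors" emulated serially by numpy slabs (a sketch of the idea only, not UTCHEM's actual decomposition):

    import numpy as np

    nx, nprocs = 16, 4
    field = np.linspace(0.0, 1.0, nx)            # global 1-D reservoir field
    slabs = np.array_split(field, nprocs)        # each "processor" owns one slab

    def with_halo(p):
        """Extend subdomain p with one ghost cell from each neighbour."""
        left = slabs[p - 1][-1] if p > 0 else slabs[p][0]
        right = slabs[p + 1][0] if p < nprocs - 1 else slabs[p][-1]
        return np.concatenate(([left], slabs[p], [right]))

    def smooth(p):
        """Three-point stencil applied to the owned cells only."""
        g = with_halo(p)
        return 0.25 * g[:-2] + 0.5 * g[1:-1] + 0.25 * g[2:]

    slabs = [smooth(p) for p in range(nprocs)]   # one parallel-style update
    print(np.concatenate(slabs))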

  12. Computer simulation of ultrasonic waves in solids

    International Nuclear Information System (INIS)

    Thibault, G.A.; Chaplin, K.

    1992-01-01

    A computer model that simulates the propagation of ultrasonic waves has been developed at AECL Research, Chalk River Laboratories. This program is called EWE, short for Elastic Wave Equations, the mathematics governing the propagation of ultrasonic waves. This report contains a brief summary of the use of ultrasonic waves in non-destructive testing techniques, a discussion of the EWE simulation code explaining the implementation of the equations and the types of output received from the model, and an example simulation showing the abilities of the model. (author). 2 refs., 2 figs
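
    As an illustration of the kind of computation such a code performs, here is a hedged one-dimensional velocity-stress finite-difference scheme for elastic waves (EWE's actual formulation, dimensionality and material model are not given in the abstract; the material constants below are merely aluminium-like):

    import numpy as np

    nx, nt = 400, 800
    dx, dt = 1.0, 1e-4               # grid spacing (m), time step (s)
    rho, E = 2700.0, 70e9            # density (kg/m^3), stiffness (Pa), assumed
    c = np.sqrt(E / rho)             # wave speed ~ 5092 m/s
    assert c * dt / dx < 1.0         # CFL stability condition

    v = np.zeros(nx)                 # particle velocity
    s = np.zeros(nx + 1)             # stress on a staggered grid

    for n in range(nt):
        s[1:-1] += dt * E * (v[1:] - v[:-1]) / dx        # Hooke's law update
        v += dt * (s[1:] - s[:-1]) / (rho * dx)          # Newton's law update
        v[nx // 4] += np.exp(-((n * dt - 0.02) / 5e-3) ** 2)   # source pulse

    print("peak particle velocity:", v.max())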

  13. COMPUTER MODEL AND SIMULATION OF A GLOVE BOX PROCESS

    International Nuclear Information System (INIS)

    Foster, C.

    2001-01-01

    The development of facilities to deal with the disposition of nuclear materials at an acceptable level of Occupational Radiation Exposure (ORE) is a significant issue facing the nuclear community. One solution is to minimize the worker's exposure through the use of automated systems. However, the adoption of automated systems for these tasks is hampered by the challenging requirements that these systems must meet in order to be cost effective solutions in the hazardous nuclear materials processing environment. Retrofitting current glove box technologies with automation systems represents potential near-term technology that can be applied to reduce worker ORE associated with work in nuclear materials processing facilities. Successful deployment of automation systems for these applications requires the development of testing and deployment strategies to ensure the highest level of safety and effectiveness. Historically, safety tests are conducted with glove box mock-ups around the finished design. This late detection of problems leads to expensive redesigns and costly deployment delays. With widespread availability of computers and cost effective simulation software it is possible to discover and fix problems early in the design stages. Computer simulators can easily create a complete model of the system allowing a safe medium for testing potential failures and design shortcomings. The majority of design specification is now done on computer and moving that information to a model is relatively straightforward. With a complete model and results from a Failure Mode Effect Analysis (FMEA), redesigns can be undertaken early. Additional issues such as user accessibility, component replacement, and alignment problems can be tackled early in the virtual environment provided by computer simulation. In this case, a commercial simulation package is used to simulate a lathe process operation at the Los Alamos National Laboratory (LANL). The lathe process operation is indicative of

  14. Thermodynamic and transport properties of nitrogen fluid: Molecular theory and computer simulations

    Science.gov (United States)

    Eskandari Nasrabad, A.; Laghaei, R.

    2018-04-01

    Computer simulations and various theories are applied to compute the thermodynamic and transport properties of nitrogen fluid. To model the nitrogen interaction, an existing potential in the literature is modified to obtain a close agreement between the simulation results and experimental data for the orthobaric densities. We use the Generic van der Waals theory to calculate the mean free volume and apply the results within the modified Cohen-Turnbull relation to obtain the self-diffusion coefficient. Compared to experimental data, excellent results are obtained via computer simulations for the orthobaric densities, the vapor pressure, the equation of state, and the shear viscosity. We analyze the results of the theory and computer simulations for the various thermophysical properties.
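
    The modified Cohen-Turnbull step mentioned above rests on a free-volume argument. In the classic Cohen-Turnbull form (quoted here for orientation; the paper's specific modification is not given in the abstract), the self-diffusion coefficient depends exponentially on the mean free volume $v_f$, which the Generic van der Waals theory supplies:

    $D = g\,a\,\bar{u}\,\exp\!\left(-\gamma v^{*}/v_{f}\right)$

    where $g$ is a geometric factor, $a$ is of the order of the molecular diameter, $\bar{u}$ is the mean thermal speed, $v^{*}$ is the critical free volume required for a diffusive jump, and $\gamma$ is an overlap factor between 0.5 and 1.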

  15. X-ray strain tensor imaging: FEM simulation and experiments with a micro-CT.

    Science.gov (United States)

    Kim, Jae G; Park, So E; Lee, Soo Y

    2014-01-01

    In tissue elasticity imaging, measuring the strain tensor components is necessary to solve the inverse problem. However, it is impractical to measure all the tensor components in ultrasound or MRI elastography because of their anisotropic spatial resolution. The objective of this study is to compute 3D strain tensor maps from the 3D CT images of a tissue-mimicking phantom. We took 3D micro-CT images of the phantom twice with applying two different mechanical compressions to it. Applying the 3D image correlation technique to the CT images under different compression, we computed 3D displacement vectors and strain tensors at every pixel. To evaluate the accuracy of the strain tensor maps, we made a 3D FEM model of the phantom, and we computed strain tensor maps through FEM simulation. Experimentally obtained strain tensor maps showed similar patterns to the FEM-simulated ones in visual inspection. The correlation between the strain tensor maps obtained from the experiment and the FEM simulation ranges from 0.03 to 0.93. Even though the strain tensor maps suffer from high level noise, we expect the x-ray strain tensor imaging may find some biomedical applications such as malignant tissue characterization and stress analysis inside the tissues.
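
    The tensor-from-displacement step is compact to write down. The sketch below computes the small-strain tensor eps_ij = (du_i/dx_j + du_j/dx_i)/2 on a synthetic displacement field (the 3D image correlation that would produce the displacements from a CT pair is not shown, and the field here is invented):

    import numpy as np

    shape = (32, 32, 32)
    z, y, x = np.meshgrid(*[np.linspace(0, 1, s) for s in shape], indexing="ij")
    u = np.stack([0.01 * z, 0.0 * y, -0.005 * x])    # synthetic displacement field

    # grad[i, j] = du_i / dx_j on the voxel grid (axis order z, y, x)
    grad = np.stack([np.stack(np.gradient(u[i], *[1.0 / (s - 1) for s in shape]))
                     for i in range(3)])
    eps = 0.5 * (grad + grad.transpose(1, 0, 2, 3, 4))   # symmetrize

    print("eps_zz at centre voxel:", eps[0, 0, 16, 16, 16])   # ~0.01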

  16. Simulation Exploration Experience 2018 Overview

    Science.gov (United States)

    Paglialonga, Stephen; Elfrey, Priscilla; Crues, Edwin Z.

    2018-01-01

    The Simulation Exploration Experience (SEE) joins students, industry, professional associations, and faculty together for an annual modeling and simulation (M&S) challenge. SEE champions collaborative collegiate-level modeling and simulation by providing a venue for students to work in highly dispersed inter-university teams to design, develop, test, and execute simulated missions associated with space exploration. Participating teams gain valuable knowledge, skills, and increased employability by working closely with industry professionals, NASA, and faculty advisors. This presentation gives an overview of the SEE and the upcoming 2018 SEE event.

  17. A review of computer-based simulators for ultrasound training.

    Science.gov (United States)

    Blum, Tobias; Rieger, Andreas; Navab, Nassir; Friess, Helmut; Martignoni, Marc

    2013-04-01

    Computer-based simulators for ultrasound training are a topic of recent interest. During the last 15 years, many different systems and methods have been proposed. This article provides an overview and classification of systems in this domain and a discussion of their advantages. Systems are classified and discussed according to the image simulation method, user interactions and medical applications. Computer simulation of ultrasound has one key advantage over traditional training. It enables novel training concepts, for example, through advanced visualization, case databases, and automatically generated feedback. Qualitative evaluations have mainly shown positive learning effects. However, few quantitative evaluations have been performed and long-term effects have to be examined.

  18. Computer Graphics Simulations of Sampling Distributions.

    Science.gov (United States)

    Gordon, Florence S.; Gordon, Sheldon P.

    1989-01-01

    Describes the use of computer graphics simulations to enhance student understanding of sampling distributions that arise in introductory statistics. Highlights include the distribution of sample proportions, the distribution of the difference of sample means, the distribution of the difference of sample proportions, and the distribution of sample…
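
    As a concrete example of one such simulation, the snippet below builds the sampling distribution of a sample proportion empirically and compares it with theory (the parameters are arbitrary):

    import numpy as np

    rng = np.random.default_rng(42)
    p_true, n, reps = 0.3, 50, 10_000

    # Each row is one sample of size n; the row mean is a sample proportion.
    samples = rng.random((reps, n)) < p_true
    p_hats = samples.mean(axis=1)

    print(f"mean of p-hats: {p_hats.mean():.4f} (theory: {p_true})")
    print(f"sd of p-hats:   {p_hats.std():.4f} "
          f"(theory: {np.sqrt(p_true * (1 - p_true) / n):.4f})")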

  19. Experience with simulator training for emergency conditions

    International Nuclear Information System (INIS)

    1987-12-01

    The training of operators by the use of simulators is common to most countries with nuclear power plants. Simulator training programmes are generally well developed, but their value can be limited by the age, type, size and capability of the simulator. Within these limits, most full scope simulators have a capability of training operators for a range of design basis accidents. It is recognized that human performance under accident conditions is difficult to predict or analyse, particularly in the area of severe accidents. These are rare events and by their very nature, unpredictable. It is therefore important to investigate the training of operators for severe accident conditions, and to examine ways in which simulators may be used in this task. The International Nuclear Safety Advisory Group (INSAG) has reviewed this field and the associated elements of human behaviour. It has recommended that activities be concentrated on this area. Initially it is encouraging the following objectives: i) To train operators for accident conditions including severe accidents and to strongly encourage the development and use of simulators for this purpose; ii) To improve the man-machine interface by the use of computer aids to the operator; iii) To develop human performance requirements for plant operating staff. As part of this work, the IAEA convened a technical committee on 15-19 September 1986 to review the experience with simulator training for emergency conditions, to review simulator modelling for severe accident training, to examine the role of human cognitive behaviour modelling, and to review guidance on accident scenarios. A substantial deviation may be a major fuel failure, a Loss of Coolant Accident (LOCA), etc.; examples of engineered safety features are an Emergency Core Cooling System (ECCS) and containment systems. This report was prepared by the participants during the meeting and reviewed further in a Consultants' Meeting. It also includes papers which were

  20. Computer simulation of nonequilibrium processes

    International Nuclear Information System (INIS)

    Wallace, D.C.

    1985-07-01

    The underlying concepts of nonequilibrium statistical mechanics, and of irreversible thermodynamics, will be described. The question at hand is then: how are these concepts to be realized in computer simulations of many-particle systems? The answer will be given for dissipative deformation processes in solids, on three hierarchical levels: heterogeneous plastic flow, dislocation dynamics, and molecular dynamics. Application to the shock process will be discussed

  1. Building an adiabatic quantum computer simulation in the classroom

    Science.gov (United States)

    Rodríguez-Laguna, Javier; Santalla, Silvia N.

    2018-05-01

    We present a didactic introduction to adiabatic quantum computation (AQC) via the explicit construction of a classical simulator of quantum computers. This constitutes a suitable route to introduce several important concepts for advanced undergraduates in physics: quantum many-body systems, quantum phase transitions, disordered systems, spin-glasses, and computational complexity theory.
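
    A classical simulator of this kind can be remarkably small. The sketch below (an assumed construction in the spirit of the article, not the authors' code) follows the spectral gap of H(s) = (1-s)H0 + s*H1 for a four-qubit transverse-field driver and Ising-ring problem Hamiltonian; the closing gap near s = 1 is what limits the adiabatic runtime:

    import numpy as np

    sx = np.array([[0., 1.], [1., 0.]])
    sz = np.array([[1., 0.], [0., -1.]])
    I2 = np.eye(2)

    def op(single, site, n):
        """Embed a single-qubit operator at `site` in an n-qubit space."""
        out = np.array([[1.0]])
        for k in range(n):
            out = np.kron(out, single if k == site else I2)
        return out

    n = 4
    H0 = -sum(op(sx, k, n) for k in range(n))          # transverse-field driver
    H1 = -sum(op(sz, k, n) @ op(sz, (k + 1) % n, n)    # Ising ring couplings
              for k in range(n))

    for s in np.linspace(0.0, 1.0, 6):
        evals = np.linalg.eigvalsh((1 - s) * H0 + s * H1)
        print(f"s = {s:.1f}   gap E1 - E0 = {evals[1] - evals[0]:.4f}")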

  2. Numerical Simulation Applications in the Design of EGS Collab Experiment 1

    Energy Technology Data Exchange (ETDEWEB)

    Johnston, Henry [National Renewable Energy Laboratory (NREL), Golden, CO (United States); White, Mark D. [Pacific Northwest National Laboratory; Fu, Pengcheng [Lawrence Livermore National Laboratory; Ghassemi, Ahmad [University of Oklahoma; Huang, Hai [Idaho National Laboratory; Rutqvist, Jonny [Lawrence Berkeley National Laboratory

    2018-02-14

    The United States Department of Energy, Geothermal Technologies Office (GTO) is funding a collaborative investigation of enhanced geothermal systems (EGS) processes at the meso-scale. This study, referred to as the EGS Collab project, is a unique opportunity for scientists and engineers to investigate the creation of fracture networks and circulation of fluids across those networks under in-situ stress conditions. The EGS Collab project is envisioned to comprise three experiments and the site for the first experiment is on the 4850 Level (4,850 feet below ground surface) in phyllite of the Precambrian Poorman formation, at the Sanford Underground Research Facility, located at the former Homestake Gold Mine, in Lead, South Dakota. Principal objectives of the project are to develop a number of intermediate-scale field sites and to conduct well-controlled in situ experiments focused on rock fracture behavior and permeability enhancement. Data generated during these experiments will be compared against predictions of a suite of computer codes specifically designed to solve problems involving coupled thermal, hydrological, geomechanical, and geochemical processes. Comparisons between experimental and numerical simulation results will provide code developers with direction for improvements and verification of process models, build confidence in the suite of available numerical tools, and ultimately identify critical future development needs for the geothermal modeling community. Moreover, conducting thorough comparisons of models, modelling approaches, measurement approaches and measured data, via the EGS Collab project, will serve to identify techniques that are most likely to succeed at the Frontier Observatory for Research in Geothermal Energy (FORGE), the GTO's flagship EGS research effort. As noted, outcomes from the EGS Collab project experiments will serve as benchmarks for computer code verification, but numerical simulation additionally plays an essential

  3. Computing for an SSC experiment

    International Nuclear Information System (INIS)

    Gaines, I.

    1993-01-01

    The hardware and software problems for SSC experiments are similar to those faced by present day experiments but larger in scale. In particular, the Solenoidal Detector Collaboration (SDC) anticipates the need for close to 10**6 MIPS of off-line computing and will produce several Petabytes (10**15 bytes) of data per year. Software contributions will be made from large numbers of highly geographically dispersed physicists. Hardware and software architectures to meet these needs have been designed. Providing the requisite amount of computing power and providing tools to allow cooperative software development using extensions of existing techniques both look achievable. The major challenges will be to provide efficient methods of accessing and manipulating the enormous quantities of data that will be produced at the SSC, and to enforce the use of software engineering tools that will ensure the "correctness" of experiment-critical software

  4. Quantum computer gate simulations | Dada | Journal of the Nigerian ...

    African Journals Online (AJOL)

    A new interactive simulator for Quantum Computation has been developed for simulation of the universal set of quantum gates and for construction of new gates of up to 3 qubits. The simulator also automatically generates an equivalent quantum circuit for any arbitrary unitary transformation on a qubit. Available quantum ...
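
    Internally, a gate simulator of this kind has to expand each gate to the full register size. Below is a minimal sketch of that bookkeeping with numpy (an assumed construction for illustration, not the simulator described in the article):

    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
    X = np.array([[0, 1], [1, 0]])                 # NOT gate
    I2 = np.eye(2)

    def gate_on(qubit, gate, n):
        """Full 2^n matrix applying `gate` to one qubit (qubit 0 = leftmost)."""
        out = np.array([[1.0]])
        for k in range(n):
            out = np.kron(out, gate if k == qubit else I2)
        return out

    n = 3
    state = np.zeros(2 ** n); state[0] = 1.0       # |000>
    state = gate_on(0, H, n) @ state               # -> (|000> + |100>)/sqrt(2)
    state = gate_on(2, X, n) @ state               # -> (|001> + |101>)/sqrt(2)
    print(np.round(state, 3))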

  5. A Gaussian mixture model based adaptive classifier for fNIRS brain-computer interfaces and its testing via simulation

    Science.gov (United States)

    Li, Zheng; Jiang, Yi-han; Duan, Lian; Zhu, Chao-zhe

    2017-08-01

    Objective. Functional near infra-red spectroscopy (fNIRS) is a promising brain imaging technology for brain-computer interfaces (BCI). Future clinical uses of fNIRS will likely require operation over long time spans, during which neural activation patterns may change. However, current decoders for fNIRS signals are not designed to handle changing activation patterns. The objective of this study is to test via simulations a new adaptive decoder for fNIRS signals, the Gaussian mixture model adaptive classifier (GMMAC). Approach. GMMAC can simultaneously classify and track activation pattern changes without the need for ground-truth labels. This adaptive classifier uses computationally efficient variational Bayesian inference to label new data points and update mixture model parameters, using the previous model parameters as priors. We test GMMAC in simulations in which neural activation patterns change over time and compare to static decoders and unsupervised adaptive linear discriminant analysis classifiers. Main results. Our simulation experiments show GMMAC can accurately decode under time-varying activation patterns: shifts of activation region, expansions of activation region, and combined contractions and shifts of activation region. Furthermore, the experiments show the proposed method can track the changing shape of the activation region. Compared to prior work, GMMAC performed significantly better than the other unsupervised adaptive classifiers on a difficult activation pattern change simulation: 99% accuracy, versus substantially lower accuracy for the other classifiers. Significance. Such adaptive decoding is relevant for future fNIRS brain-computer interfaces, including neurofeedback training systems, where operation over long time spans is required.
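
    The ingredients named above, a Gaussian mixture updated by variational Bayesian inference as batches arrive, can be sketched with scikit-learn (a hedged illustration of the general idea only; GMMAC's exact priors-from-previous-model update and tracking scheme are not reproduced, and the data are synthetic):

    import numpy as np
    from sklearn.mixture import BayesianGaussianMixture

    rng = np.random.default_rng(0)
    gmm = BayesianGaussianMixture(n_components=2, warm_start=True, max_iter=200)

    mu1 = 3.0                              # class-1 activation level (synthetic)
    for batch in range(5):
        # Two classes of fNIRS-like features; class 1 drifts between batches.
        x0 = rng.normal(0.0, 0.5, size=(50, 1))
        x1 = rng.normal(mu1, 0.5, size=(50, 1))
        data = np.vstack([x0, x1])
        gmm.fit(data)                      # unsupervised update, no labels needed
        labels = gmm.predict(data)         # classify the current batch
        print(batch, "component means:", np.sort(gmm.means_.ravel()).round(2))
        mu1 += 0.5                         # simulate a shifting activation pattern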

  6. Application of parallel computing techniques to a large-scale reservoir simulation

    International Nuclear Information System (INIS)

    Zhang, Keni; Wu, Yu-Shu; Ding, Chris; Pruess, Karsten

    2001-01-01

    Even with the continual advances made in both computational algorithms and computer hardware used in reservoir modeling studies, large-scale simulation of fluid and heat flow in heterogeneous reservoirs remains a challenge. The problem commonly arises from intensive computational requirements for detailed modeling investigations of real-world reservoirs. This paper presents the application of a massively parallel computing version of the TOUGH2 code developed for performing large-scale field simulations. As an application example, the parallelized TOUGH2 code is applied to develop a three-dimensional unsaturated-zone numerical model simulating flow of moisture, gas, and heat in the unsaturated zone of Yucca Mountain, Nevada, a potential repository for high-level radioactive waste. The modeling approach employs refined spatial discretization to represent the heterogeneous fractured tuffs of the system, using more than a million 3-D gridblocks. The problem of two-phase flow and heat transfer within the model domain leads to a total of 3,226,566 linear equations to be solved per Newton iteration. The simulation is conducted on a Cray T3E-900, a distributed-memory massively parallel computer. Simulation results indicate that the parallel computing technique, as implemented in the TOUGH2 code, is very efficient. The reliability and accuracy of the model results have been demonstrated by comparing them to those of small-scale (coarse-grid) models. These comparisons show that simulation results obtained with the refined grid provide more detailed predictions of the future flow conditions at the site, aiding in the assessment of proposed repository performance

  7. Comparison of electron cloud simulation and experiments in the high-current experiment

    International Nuclear Information System (INIS)

    Cohen, R.H.; Friedman, A.; Covo, M. Kireeff; Lund, S.M.; Molvik, A.W.; Bieniosek, F.M.; Seidl, P.A.; Vay, J.-L.; Verboncoeur, J.; Stoltz, P.; Veitzer, S.

    2004-01-01

    A set of experiments has been performed on the High-Current Experiment (HCX) facility at LBNL, in which the ion beam is allowed to collide with an end plate and thereby induce a copious supply of desorbed electrons. Through the use of combinations of biased and grounded electrodes positioned in between and downstream of the quadrupole magnets, the flow of electrons upstream into the magnets can be turned on or off. Properties of the resultant ion beam are measured under each condition. The experiment is modeled via a full three-dimensional, two species (electron and ion) particle simulation, as well as via reduced simulations (ions with appropriately chosen model electron cloud distributions, and a high-resolution simulation of the region adjacent to the end plate). The three-dimensional simulations are the first of their kind and the first to make use of a timestep-acceleration scheme that allows the electrons to be advanced with a timestep that is not small compared to the highest electron cyclotron period. The simulations reproduce qualitative aspects of the experiments, illustrate some unanticipated physical effects, and serve as an important demonstration of a developing simulation capability

  8. Axial power deviation control strategy and computer simulation for Daya Bay Nuclear Power Station

    International Nuclear Information System (INIS)

    Liao Yehong; Zhou Xiaoling; Xiao Min

    2004-01-01

    Daya Bay Nuclear Power Station has a very tight operating diagram, especially at its right side. The successful control of axial power deviation in a PWR is therefore crucial to nuclear safety. After analyzing the effects of various core characteristics on axial power distribution, several axial power deviation control strategies have been proposed to cover different power-varying operation scenarios. Application and computer simulation of the strategies have shown that our predictions of axial power deviation evolution are comparable to the measured values, and that our control strategies are effective. Engineering experience shows that the application of our methodology can accurately predict the transient of axial power deviation, and it has therefore become a useful tool for reactor operation and safety control. This paper presents the axial power control characteristics, reactor operation strategy research, computer simulation, and comparison to measurement results in Daya Bay Nuclear Power Station. (author)

  9. Optimizing clinical trial supply requirements: simulation of computer-controlled supply chain management.

    Science.gov (United States)

    Peterson, Magnus; Byrom, Bill; Dowlman, Nikki; McEntegart, Damian

    2004-01-01

    Computer-controlled systems are commonly used in clinical trials to control dispensing and manage site inventories of trial supplies. Typically such systems are used with an interactive telephone or web system that provides an interface with the study site. Realizing the maximum savings in medication associated with this approach has, in the past, been problematic, as it has been difficult to fully estimate medication requirements due to the complexities of these algorithms and the inherent variation in the clinical trial recruitment process. We describe the traditional and automated methods of supplying sites. We detail a simulation approach that models the automated system. We design a number of simulation experiments using this model to investigate the supply strategy properties that influence medication overage and other strategy performance metrics. The computer-controlled medication system gave superior performance to the traditional method. In one example, a 75% overage of wasted medication in the traditional system was associated with higher supply failure than an automated system strategy with an overage of 47%. In a further example, we demonstrate that using a country-stratified as opposed to a site-stratified scheme affects the number of deliveries and the probability of supply failures more than the amount of drug wasted, with respective increases of 20, 2300 and 4%. Medication savings with automated systems are particularly significant in repeat dispensing designs. We show that the number of packs required can fall by as much as 50% if one uses a predictive medication algorithm. We conclude that a computer-controlled supply chain enables medication savings to be realized and that it is possible to quantify the distribution of these savings using a simulation model. The simulation model can be used to optimize the prestudy medication supply strategy and for midstudy monitoring using real-time data contained in the study database.
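
    A toy version of such a simulation experiment, a single site's inventory under a trigger/resupply policy with all parameters invented, makes the overage/failure trade-off concrete:

    import random

    def simulate_site(days=180, trigger=4, resupply=10, lead_time=5,
                      p_visit=0.3, seed=0):
        """Toy inventory model: dispense one pack per subject visit,
        reorder `resupply` packs whenever on-hand + in-transit <= trigger."""
        rng = random.Random(seed)
        stock, pipeline = resupply, []          # pipeline: (days_left, qty)
        shipped, dispensed, failures = resupply, 0, 0
        for _ in range(days):
            pipeline = [(t - 1, q) for t, q in pipeline]
            stock += sum(q for t, q in pipeline if t == 0)   # deliveries arrive
            pipeline = [(t, q) for t, q in pipeline if t > 0]
            if rng.random() < p_visit:          # a subject visits today
                if stock > 0:
                    stock, dispensed = stock - 1, dispensed + 1
                else:
                    failures += 1               # supply failure
            if stock + sum(q for _, q in pipeline) <= trigger:
                pipeline.append((lead_time, resupply))
                shipped += resupply
        overage = 100.0 * (shipped - dispensed) / max(dispensed, 1)
        return failures, overage

    failures, overage = simulate_site()
    print(f"supply failures: {failures}, medication overage: {overage:.0f}%")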

  10. Cluster computing for lattice QCD simulations

    International Nuclear Information System (INIS)

    Coddington, P.D.; Williams, A.G.

    2000-01-01

    Full text: Simulations of lattice quantum chromodynamics (QCD) require enormous amounts of compute power. In the past, this has usually involved sharing time on large, expensive machines at supercomputing centres. Over the past few years, clusters of networked computers have become very popular as a low-cost alternative to traditional supercomputers. The dramatic improvements in performance (and more importantly, the ratio of price/performance) of commodity PCs, workstations, and networks have made clusters of off-the-shelf computers an attractive option for low-cost, high-performance computing. A major advantage of clusters is that since they can have any number of processors, they can be purchased using any sized budget, allowing research groups to install a cluster for their own dedicated use, and to scale up to more processors if additional funds become available. Clusters are now being built for high-energy physics simulations. Wuppertal has recently installed ALiCE, a cluster of 128 Alpha workstations running Linux, with a peak performance of 158 G flops. The Jefferson Laboratory in the US has a 16 node Alpha cluster and plans to upgrade to a 256 processor machine. In Australia, several large clusters have recently been installed. Swinburne University of Technology has a cluster of 64 Compaq Alpha workstations used for astrophysics simulations. Early this year our DHPC group constructed a cluster of 116 dual Pentium PCs (i.e. 232 processors) connected by a Fast Ethernet network, which is used by chemists at Adelaide University and Flinders University to run computational chemistry codes. The Australian National University has recently installed a similar PC cluster with 192 processors. The Centre for the Subatomic Structure of Matter (CSSM) undertakes large-scale high-energy physics calculations, mainly lattice QCD simulations. The choice of the computer and network hardware for a cluster depends on the particular applications to be run on the machine. Our

  11. Computer Networks E-learning Based on Interactive Simulations and SCORM

    Directory of Open Access Journals (Sweden)

    Francisco Andrés Candelas

    2011-05-01

    Full Text Available This paper introduces a new set of compact interactive simulations developed for the constructive learning of computer networks concepts. These simulations, which compose a virtual laboratory implemented as portable Java applets, have been created by combining EJS (Easy Java Simulations) with the KivaNS API. Furthermore, in this work, the skills and motivation level acquired by the students are evaluated and measured when these simulations are combined with Moodle and SCORM (Sharable Content Object Reference Model) documents. This study has been developed to improve and stimulate autonomous constructive learning, in addition to providing timetable flexibility for a Computer Networks subject.

  12. Computer Simulation of Angle-measuring System of Photoelectric Theodolite

    International Nuclear Information System (INIS)

    Zeng, L; Zhao, Z W; Song, S L; Wang, L T

    2006-01-01

    In this paper, a virtual test platform based on malfunction phenomena is designed using the methods of computer simulation and numerical mask. It is used in the simulation training of the angle-measuring system of a photoelectric theodolite. Practical application shows that this platform provides good conditions for in-depth simulation training of technicians and offers a useful approach for establishing simulation platforms for other large equipment.

  13. Applications of Atomic Systems in Quantum Simulation, Quantum Computation and Topological Phases of Matter

    Science.gov (United States)

    Wang, Shengtao

    The ability to precisely and coherently control atomic systems has improved dramatically in the last two decades, driving remarkable advancements in quantum computation and simulation. In recent years, atomic and atom-like systems have also served as a platform to study topological phases of matter and non-equilibrium many-body physics. Integrated with rapid theoretical progress, the employment of these systems is expanding the realm of our understanding of a range of physical phenomena. In this dissertation, I draw on state-of-the-art experimental technology to develop several new ideas for controlling and applying atomic systems. In the first part of this dissertation, we propose several novel schemes to realize, detect, and probe topological phases in atomic and atom-like systems. We first theoretically study the intriguing properties of Hopf insulators, a peculiar type of topological insulator beyond the standard classification paradigm of topological phases. Using a solid-state quantum simulator, we report the first experimental observation of Hopf insulators. We demonstrate the Hopf fibration with its fascinating topological links in the experiment, showing clear signals of topological phase transitions for the underlying Hamiltonian. Next, we propose a feasible experimental scheme to realize the chiral topological insulator in three dimensions. They are a type of topological insulator protected by chiral symmetry and have thus far remained unobserved in experiment. We then introduce a method to directly measure topological invariants in cold-atom experiments. This detection scheme is general and applicable to probing different topological insulators in any spatial dimension. In another study, we theoretically discover a new type of topological gapless ring, dubbed the Weyl exceptional ring, in three-dimensional dissipative cold atomic systems. In the second part of this dissertation, we focus on the application of atomic systems in quantum computation.

  14. Combustion-Powered Actuation for Dynamic Stall Suppression - Simulations and Low-Mach Experiments

    Science.gov (United States)

    Matalanis, Claude G.; Min, Byung-Young; Bowles, Patrick O.; Jee, Solkeun; Wake, Brian E.; Crittenden, Tom; Woo, George; Glezer, Ari

    2014-01-01

    An investigation on dynamic-stall suppression capabilities of combustion-powered actuation (COMPACT) applied to a tabbed VR-12 airfoil is presented. In the first section, results from computational fluid dynamics (CFD) simulations carried out at Mach numbers from 0.3 to 0.5 are presented. Several geometric parameters are varied including the slot chordwise location and angle. Actuation pulse amplitude, frequency, and timing are also varied. The simulations suggest that cycle-averaged lift increases of approximately 4% and 8% with respect to the baseline airfoil are possible at Mach numbers of 0.4 and 0.3 for deep and near-deep dynamic-stall conditions. In the second section, static-stall results from low-speed wind-tunnel experiments are presented. Low-speed experiments and high-speed CFD suggest that slots oriented tangential to the airfoil surface produce stronger benefits than slots oriented normal to the chordline. Low-speed experiments confirm that chordwise slot locations suitable for Mach 0.3-0.4 stall suppression (based on CFD) will also be effective at lower Mach numbers.

  15. Experience building and operating the CMS Tier-1 computing centres

    Science.gov (United States)

    Albert, M.; Bakken, J.; Bonacorsi, D.; Brew, C.; Charlot, C.; Huang, Chih-Hao; Colling, D.; Dumitrescu, C.; Fagan, D.; Fassi, F.; Fisk, I.; Flix, J.; Giacchetti, L.; Gomez-Ceballos, G.; Gowdy, S.; Grandi, C.; Gutsche, O.; Hahn, K.; Holzman, B.; Jackson, J.; Kreuzer, P.; Kuo, C. M.; Mason, D.; Pukhaeva, N.; Qin, G.; Quast, G.; Rossman, P.; Sartirana, A.; Scheurer, A.; Schott, G.; Shih, J.; Tader, P.; Thompson, R.; Tiradani, A.; Trunov, A.

    2010-04-01

    The CMS Collaboration relies on 7 globally distributed Tier-1 computing centres located at large universities and national laboratories for a second custodial copy of the CMS RAW data and the primary copy of the simulated data, for data serving capacity to Tier-2 centres for analysis, and for the bulk of the reprocessing and event selection capacity in the experiment. The Tier-1 sites have a challenging role in CMS because they are expected to ingest and archive data from both CERN and regional Tier-2 centres, while they export data to a global mesh of Tier-2s at rates comparable to the raw export data rate from CERN. The combined capacity of the Tier-1 centres is more than twice the resources located at CERN, and efficiently utilizing these large distributed resources represents a challenge. In this article we discuss the experience of building, operating, and utilizing the CMS Tier-1 computing centres. We summarize the facility challenges at the Tier-1s, including the stable operation of CMS services, the ability to scale to large numbers of processing requests and large volumes of data, and the ability to provide custodial storage and high-performance data serving. We also present the operational experience of utilizing the distributed Tier-1 centres from a distance: transferring data, submitting data serving requests, and submitting batch processing requests.

  16. Experience building and operating the CMS Tier-1 computing centres

    International Nuclear Information System (INIS)

    Albert, M; Bakken, J; Huang, Chih-Hao; Dumitrescu, C; Fagan, D; Fisk, I; Giacchetti, L; Gutsche, O; Holzman, B; Bonacorsi, D; Grandi, C; Brew, C; Jackson, J; Charlot, C; Colling, D; Fassi, F; Flix, J; Gomez-Ceballos, G; Hahn, K; Gowdy, S

    2010-01-01

    The CMS Collaboration relies on 7 globally distributed Tier-1 computing centres located at large universities and national laboratories for a second custodial copy of the CMS RAW data and the primary copy of the simulated data, for data serving capacity to Tier-2 centres for analysis, and for the bulk of the reprocessing and event selection capacity in the experiment. The Tier-1 sites have a challenging role in CMS because they are expected to ingest and archive data from both CERN and regional Tier-2 centres, while they export data to a global mesh of Tier-2s at rates comparable to the raw export data rate from CERN. The combined capacity of the Tier-1 centres is more than twice the resources located at CERN, and efficiently utilizing these large distributed resources represents a challenge. In this article we discuss the experience of building, operating, and utilizing the CMS Tier-1 computing centres. We summarize the facility challenges at the Tier-1s, including the stable operation of CMS services, the ability to scale to large numbers of processing requests and large volumes of data, and the ability to provide custodial storage and high-performance data serving. We also present the operational experience of utilizing the distributed Tier-1 centres from a distance: transferring data, submitting data serving requests, and submitting batch processing requests.

  17. Modelling and simulation in nuclear safety and the role of experiment

    International Nuclear Information System (INIS)

    Baek, W-P.

    2015-01-01

    'Full text:' Modeling and simulation (M&S) technology is a key element in assuring and enhancing the safety of nuclear installations. M&S technology has progressed continuously with the introduction of new designs, improved understanding of the relevant physical processes, and improvements in the computing environment. This presentation covers the role, progress, and prospects of M&S technology relevant to nuclear safety. Special attention is given to the effective interaction between M&S and experiment. The expected role of experiment in motivating the advancement of M&S technology is emphasized with some typical examples. Finally, relevant R&D activities of Korea are introduced for thermal-hydraulics and severe accident safety. (author)

  18. Integrated computer control system CORBA-based simulator FY98 LDRD project final summary report

    International Nuclear Information System (INIS)

    Bryant, R M; Holloway, F W; Van Arsdall, P J.

    1999-01-01

    The CORBA-based Simulator was a Laboratory Directed Research and Development (LDRD) project that applied simulation techniques to explore critical questions about distributed control architecture. The simulator project used a three-prong approach comprised of a study of object-oriented distribution tools, computer network modeling, and simulation of key control system scenarios. This summary report highlights the findings of the team and provides the architectural context of the study. For the last several years LLNL has been developing the Integrated Computer Control System (ICCS), which is an abstract object-oriented software framework for constructing distributed systems. The framework is capable of implementing large event-driven control systems for mission-critical facilities such as the National Ignition Facility (NIF). Tools developed in this project were applied to the NIF example architecture in order to gain experience with a complex system and derive immediate benefits from this LDRD. The ICCS integrates data acquisition and control hardware with a supervisory system, and reduces the amount of new coding and testing necessary by providing prebuilt components that can be reused and extended to accommodate specific additional requirements. The framework integrates control point hardware with a supervisory system by providing the services needed for distributed control such as database persistence, system start-up and configuration, graphical user interface, status monitoring, event logging, scripting language, alert management, and access control. The design is interoperable among computers of different kinds and provides plug-in software connections by leveraging a common object request brokering architecture (CORBA) to transparently distribute software objects across the network of computers. Because object broker distribution applied to control systems is relatively new and its inherent performance is roughly threefold less than traditional point

  19. Validation and computing and performance studies for the ATLAS simulation

    CERN Document Server

    Marshall, Z; The ATLAS collaboration

    2009-01-01

    We present the validation of the ATLAS simulation software project. Software development is controlled by nightly builds and several levels of automatic tests to ensure stability. Computing performance, including CPU time, memory, and disk space required per event, is benchmarked for all software releases. Several different physics processes and event types are checked to thoroughly test all aspects of the detector simulation. The robustness of the simulation software is demonstrated by the production of 500 million events on the Worldwide LHC Computing Grid in the last year.

  20. On efficiency of fire simulation realization: parallelization with greater number of computational meshes

    Science.gov (United States)

    Valasek, Lukas; Glasa, Jan

    2017-12-01

    Current fire simulation systems are capable of utilizing the advantages of available high-performance computing (HPC) platforms and of modelling fires efficiently in parallel. In this paper, the efficiency of a corridor fire simulation on an HPC cluster is discussed. The parallel MPI version of the Fire Dynamics Simulator is used to test the efficiency of selected strategies for allocating the cluster's computational resources when using a greater number of computational cores. Simulation results indicate that if the number of cores used is not a multiple of the total number of cores per cluster node, there are allocation strategies that provide more efficient calculations.

  1. A pedagogical walkthrough of computational modeling and simulation of Wnt signaling pathway using static causal models in MATLAB

    OpenAIRE

    Sinha, Shriprakash

    2016-01-01

    Simulation studies in systems biology involving computational experiments on Wnt signaling pathways abound in the literature, but they often lack a pedagogical perspective that might ease the understanding of beginner students and researchers in transition who intend to work on modeling of the pathway. This paucity might be due to restrictive business policies which enforce an unwanted embargo on the sharing of important scientific knowledge. A tutorial introduction to computational mo...

  2. Computational simulator of robotic manipulators

    International Nuclear Information System (INIS)

    Leal, Alexandre S.; Campos, Tarcisio P.R.

    1995-01-01

    Robotic applications for industrial plants are discussed and a computational model for a mechanical manipulator with three links is presented. A feed-forward neural network has been used to model the dynamic control of the manipulator. A graphic interface was developed in the C programming language as a virtual world in order to visualize and simulate the arm movements in a radioactive waste handling environment. (author). 7 refs, 5 figs

  3. The Evolution of Computer Based Learning Software Design: Computer Assisted Teaching Unit Experience.

    Science.gov (United States)

    Blandford, A. E.; Smith, P. R.

    1986-01-01

    Describes the style of design of computer simulations developed by the Computer Assisted Teaching Unit at Queen Mary College with reference to the user interface, input and initialization, input data vetting, effective use of the display screen, graphical presentation of results, and the need for hard copy. Procedures and problems relating to academic involvement are…

  4. Macromod: Computer Simulation For Introductory Economics

    Science.gov (United States)

    Ross, Thomas

    1977-01-01

    The Macroeconomic model (Macromod) is a computer assisted instruction simulation model designed for introductory economics courses. An evaluation of its utilization at a community college indicates that it yielded a 10 percent to 13 percent greater economic comprehension than lecture classes and that it met with high student approval. (DC)

  5. An Investigation of Computer-based Simulations for School Crises Management.

    Science.gov (United States)

    Degnan, Edward; Bozeman, William

    2001-01-01

    Describes development of a computer-based simulation program for training school personnel in crisis management. Addresses the data collection and analysis involved in developing a simulated event, the systems requirements for simulation, and a case study of application and use of the completed simulation. (Contains 21 references.) (Authors/PKP)

  6. The use of micro-computers in the simulation of ion beam optics

    International Nuclear Information System (INIS)

    Spaedtke, P.; Ivens, D.

    1989-01-01

    With computer simulation codes, specific problems of ion beam optics can be studied, which is useful both in design and in the optimization of existing systems. Several such codes have been developed, unfortunately requiring substantial computer resources. Recent advances in mini- and micro-computers have now made it possible to develop simulation codes that can also be run on these small computers. In this paper, some of these codes are presented and their computing time is discussed. (author)

  7. Monte Carlo in radiotherapy: experience in a distributed computational environment

    Science.gov (United States)

    Caccia, B.; Mattia, M.; Amati, G.; Andenna, C.; Benassi, M.; D'Angelo, A.; Frustagli, G.; Iaccarino, G.; Occhigrossi, A.; Valentini, S.

    2007-06-01

    New technologies in cancer radiotherapy require a more accurate computation of the dose delivered by the radiotherapeutic treatment plan, and it is important to integrate sophisticated mathematical models and advanced computing knowledge into the treatment planning (TP) process. We present some results on using Monte Carlo (MC) codes in dose calculation for treatment planning. A distributed computing resource located in the Technologies and Health Department of the Italian National Institute of Health (ISS), along with other computer facilities (CASPUR - Inter-University Consortium for the Application of Super-Computing for Universities and Research), has been used to perform a complete MC simulation to compute the dose distribution on phantoms irradiated with a radiotherapy accelerator. Using the BEAMnrc and GEANT4 MC-based codes, we calculated dose distributions on a plain water phantom and an air/water phantom. Experimental and calculated dose values agreed to within ±2% (for depths between 5 mm and 130 mm), both in PDD (percentage depth dose) curves and in transverse sections of the phantom. We consider these results a first step towards a system suitable for medical physics departments to simulate a complete treatment plan using remote computing facilities for MC simulations.
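
    The comparison described above can be reduced to a few lines once depth-dose curves are in hand. The following sketch normalizes two depth-dose arrays to PDD curves and reports their maximum pointwise difference; the data are invented stand-ins, not the study's measurements.

        import numpy as np

        def pdd(dose):
            """Percentage depth dose: a depth-dose curve normalized to its maximum."""
            return 100.0 * dose / dose.max()

        # Hypothetical depth-dose samples (arbitrary units) standing in for chamber
        # measurements and Monte Carlo output; the real study used BEAMnrc/GEANT4.
        depth = np.linspace(5.0, 130.0, 26)                         # depth in mm
        measured = np.exp(-depth / 150.0)                           # stand-in data
        simulated = measured * (1.0 + 0.01 * np.sin(depth / 20.0))  # stand-in MC dose

        difference = pdd(simulated) - pdd(measured)                 # pointwise, in %
        print("max |PDD difference| = %.2f%%" % np.abs(difference).max())
        # Agreement within the +/-2% band quoted above would show up here directly.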

  8. Computer simulation of driven Alfven waves

    International Nuclear Information System (INIS)

    Geary, J.L. Jr.

    1986-01-01

    The first particle simulation study of shear Alfven wave resonance heating is presented. Particle simulation codes self-consistently follow the time evolution of the individual and collective aspects of particle dynamics, as well as wave dynamics, in a fully nonlinear fashion. Alfven wave heating is a possible means of increasing the temperature of magnetized plasmas. A new particle simulation model was developed for this application that incorporates Darwin's formulation of the electromagnetic fields with a guiding center approximation for electron motion perpendicular to the ambient magnetic field. The implementation of this model and the examination of its theoretical and computational properties are presented. With this model, several cases of Alfven wave heating are examined in both uniform and nonuniform simulation systems in a two-dimensional slab. For the inhomogeneous case studies, the kinetic Alfven wave develops in the vicinity of the shear Alfven resonance region.

  9. Man-machine interfaces analysis system based on computer simulation

    International Nuclear Information System (INIS)

    Chen Xiaoming; Gao Zuying; Zhou Zhiwei; Zhao Bingquan

    2004-01-01

    The paper describes a software assessment system, Dynamic Interaction Analysis Support (DIAS), based on computer simulation technology for the man-machine interfaces (MMI) of a control room. It employs a computer to simulate operating procedures on the man-machine interfaces of a control room, provides quantified assessment, and at the same time analyses operators' operational error rates by means of human error rate prediction techniques. Problems with the placement of man-machine interfaces in a control room and with the arrangement of instruments can be detected from the simulation results. The DIAS system can provide good technical support for the design and improvement of the man-machine interfaces of the main control room of a nuclear power plant.

  10. Simulated experiments in modern physics

    International Nuclear Information System (INIS)

    Tirnini, Mahmud Hasan

    1981-01-01

    In this thesis a number of the basic experiments of atomic and nuclear physics are simulated on a microcomputer interfaced to a chart recorder and a CRT. These simulations induce the student to imagine that he is actually performing the experiments and to collect data to be worked out. The thesis covers the material needed to set up such experiments in the modern physics laboratory.

  11. Computational methods for coupling microstructural and micromechanical materials response simulations

    Energy Technology Data Exchange (ETDEWEB)

    HOLM,ELIZABETH A.; BATTAILE,CORBETT C.; BUCHHEIT,THOMAS E.; FANG,HUEI ELIOT; RINTOUL,MARK DANIEL; VEDULA,VENKATA R.; GLASS,S. JILL; KNOROVSKY,GERALD A.; NEILSEN,MICHAEL K.; WELLMAN,GERALD W.; SULSKY,DEBORAH; SHEN,YU-LIN; SCHREYER,H. BUCK

    2000-04-01

    Computational materials simulations have traditionally focused on individual phenomena: grain growth, crack propagation, plastic flow, etc. However, real materials behavior results from a complex interplay between phenomena. In this project, the authors explored methods for coupling mesoscale simulations of microstructural evolution and micromechanical response. In one case, massively parallel (MP) simulations for grain evolution and microcracking in alumina stronglink materials were dynamically coupled. In the other, codes for domain coarsening and plastic deformation in CuSi braze alloys were iteratively linked. This program provided the first comparison of two promising ways to integrate mesoscale computer codes. Coupled microstructural/micromechanical codes were applied to experimentally observed microstructures for the first time. In addition to the coupled codes, this project developed a suite of new computational capabilities (PARGRAIN, GLAD, OOF, MPM, polycrystal plasticity, front tracking). The problem of plasticity length scale in continuum calculations was recognized and a solution strategy was developed. The simulations were experimentally validated on stockpile materials.

  12. Positive Wigner functions render classical simulation of quantum computation efficient.

    Science.gov (United States)

    Mari, A; Eisert, J

    2012-12-07

    We show that quantum circuits where the initial state and all the following quantum operations can be represented by positive Wigner functions can be classically efficiently simulated. This is true both for continuous-variable and for discrete-variable systems in odd prime dimensions, two cases that are treated on entirely the same footing. Noting that Clifford and Gaussian operations preserve the positivity of the Wigner function, our result generalizes the Gottesman-Knill theorem. Our algorithm provides a way of sampling from the output distribution of a computation or a simulation, including the efficient sampling from an approximate output distribution in the case of sampling imperfections for initial states, gates, or measurements. In this sense, this work highlights the role of the positive Wigner function as separating classically efficiently simulable systems from those that are potentially universal for quantum computing and simulation, and it emphasizes the role of negativity of the Wigner function as a computational resource.
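
    The positivity criterion can be made concrete with the discrete Wigner function for odd prime dimension. The sketch below uses one common convention (Gross-style phase-space points; sign conventions vary in the literature) to show that a qutrit stabilizer state has a nonnegative Wigner function while a "strange" state does not.

        import numpy as np

        d = 3                                # odd prime dimension (qutrit)
        omega = np.exp(2j * np.pi / d)
        inv2 = pow(2, -1, d)                 # multiplicative inverse of 2 mod d

        def discrete_wigner(rho):
            """Discrete Wigner function W(q, p) of a qutrit density matrix, in one
            common convention for odd prime d; conventions differ by signs."""
            W = np.zeros((d, d))
            for q in range(d):
                for p in range(d):
                    s = sum(omega**(p * u) *
                            rho[(q + inv2 * u) % d, (q - inv2 * u) % d]
                            for u in range(d))
                    W[q, p] = (s / d).real
            return W

        # A stabilizer state (|0>) has a nonnegative Wigner function ...
        rho0 = np.zeros((d, d), complex); rho0[0, 0] = 1.0
        print(discrete_wigner(rho0).min())   # >= 0: classically simulable regime

        # ... while the "strange" state (|1> - |2>)/sqrt(2) shows negativity.
        psi = np.array([0, 1, -1], complex) / np.sqrt(2)
        rho = np.outer(psi, psi.conj())
        print(discrete_wigner(rho).min())    # < 0: a potential quantum resource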

  13. Assessing Practical Skills in Physics Using Computer Simulations

    Science.gov (United States)

    Walsh, Kevin

    2018-01-01

    Computer simulations have been used very effectively for many years in the teaching of science but the focus has been on cognitive development. This study, however, is an investigation into the possibility that a student's experimental skills in the real-world environment can be judged via the undertaking of a suitably chosen computer simulation…

  14. Computer simulation of stair falls to investigate scenarios in child abuse.

    Science.gov (United States)

    Bertocci, G E; Pierce, M C; Deemer, E; Aguel, F

    2001-09-01

    Our objective was to demonstrate the usefulness of computer simulation techniques in the investigation of pediatric stair falls. Since stair falls are a common falsely reported injury scenario in child abuse, our specific aim was to investigate the influence of stair characteristics on the injury biomechanics of pediatric stair falls by using a computer simulation model. Our long-term goal is to use knowledge of biomechanics to aid in distinguishing between accidents and abuse. A computer simulation model of a 3-year-old child falling down stairs was developed using commercially available simulation software. This model was used to investigate the influence that stair characteristics have on biomechanical measures associated with injury risk. Since femur fractures occur in both unintentional and abuse scenarios, the biomechanical measures focused on the lower extremities. The number and slope of steps and the stair surface friction and elasticity were found to affect biomechanical measures associated with injury risk. Computer simulation techniques are useful for investigating the biomechanics of stair falls. Using our simulation model, we determined that stair characteristics have an effect on the potential for lower extremity injuries. Although absolute values of biomechanical measures should not be relied on in an unvalidated model such as this, relationships between accident-environment factors and biomechanical measures can be studied through simulation. Future efforts will focus on model validation.

  15. Evaluation of Rankine cycle air conditioning system hardware by computer simulation

    Science.gov (United States)

    Healey, H. M.; Clark, D.

    1978-01-01

    A computer program for simulating the performance of a variety of solar-powered Rankine cycle air conditioning system (RCACS) components has been developed. The computer program models actual equipment by developing performance maps from manufacturers' data and is capable of simulating off-design operation of the RCACS components. The program, designed to be a subroutine of the Marshall Space Flight Center (MSFC) Solar Energy System Analysis Computer Program 'SOLRAD', is a complete package suitable for use by an occasional computer user in developing performance maps of heating, ventilation, and air conditioning components.
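
    A performance map of the kind described can be built by interpolating tabulated manufacturer data. The following sketch is an assumed illustration, not the SOLRAD implementation; the component, variable names, and values are invented.

        import numpy as np
        from scipy.interpolate import RegularGridInterpolator

        # Hypothetical manufacturer data for an absorption chiller: condenser water
        # inlet temperature (C) and generator hot-water temperature (C) mapped to
        # cooling capacity (kW).  All values are invented for illustration.
        t_cond = np.array([25.0, 30.0, 35.0])
        t_gen = np.array([75.0, 85.0, 95.0])
        capacity = np.array([[8.5, 10.2, 11.6],
                             [7.1,  9.0, 10.4],
                             [5.8,  7.7,  9.1]])   # rows: t_cond, cols: t_gen

        # Performance map: interpolate (and mildly extrapolate) off-design points.
        perf_map = RegularGridInterpolator((t_cond, t_gen), capacity,
                                           bounds_error=False, fill_value=None)

        print(perf_map([[28.0, 82.0]]))   # capacity at an off-design operating point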

  16. Simulation of Specular Surface Imaging Based on Computer Graphics: Application on a Vision Inspection System

    Directory of Open Access Journals (Sweden)

    Seulin Ralph

    2002-01-01

    Full Text Available This work aims at detecting surface defects on reflective industrial parts. A machine vision system, performing the detection of geometric-aspect surface defects, is completely described. Defects are revealed by a particular lighting device, which has been carefully designed to ensure the imaging of defects. The lighting system greatly simplifies the image processing for defect segmentation, so that real-time inspection of reflective products is possible. To assist in the design of the imaging conditions, a complete simulation is proposed. The simulation, based on computer graphics, enables the rendering of realistic images. Simulation provides a very efficient way to perform tests compared with the numerous attempts required by manual experiments.

  17. The TeraShake Computational Platform for Large-Scale Earthquake Simulations

    Science.gov (United States)

    Cui, Yifeng; Olsen, Kim; Chourasia, Amit; Moore, Reagan; Maechling, Philip; Jordan, Thomas

    Geoscientific and computer science researchers with the Southern California Earthquake Center (SCEC) are conducting a large-scale, physics-based, computationally demanding earthquake system science research program with the goal of developing predictive models of earthquake processes. The computational demands of this program continue to increase rapidly as these researchers seek to perform physics-based numerical simulations of earthquake processes for ever larger problems. To meet the needs of this research program, a multiple-institution team coordinated by SCEC has integrated several scientific codes into a numerical modeling-based research tool we call the TeraShake computational platform (TSCP). A central component of the TSCP is a highly scalable earthquake wave propagation simulation program called the TeraShake anelastic wave propagation (TS-AWP) code. In this chapter, we describe how we extended an existing, stand-alone, well-validated, finite-difference, anelastic wave propagation modeling code into the highly scalable and widely used TS-AWP and then integrated this code into the TeraShake computational platform that provides end-to-end (initialization to analysis) research capabilities. We also describe the techniques used to enhance the TS-AWP parallel performance on TeraGrid supercomputers, as well as the TeraShake simulation phases, including input preparation, run time, data archive management, and visualization. As a result of our efforts to improve its parallel efficiency, the TS-AWP has now shown highly efficient strong scaling on over 40K processors on IBM’s BlueGene/L Watson computer. In addition, the TSCP has developed into a computational system that is useful to many members of the SCEC community for performing large-scale earthquake simulations.
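
    The numerical core of such anelastic wave propagation codes is a finite-difference time-stepping scheme. The following toy sketch reduces the idea to a second-order explicit scheme for the 1-D scalar wave equation; it is a stand-in for illustration, not the TS-AWP code.

        import numpy as np

        # Minimal 1-D scalar-wave finite-difference sketch (second order in space
        # and time), with fixed (zero-displacement) boundaries.
        nx, nt = 400, 800
        dx, c = 100.0, 3000.0            # grid spacing (m), wave speed (m/s)
        dt = 0.5 * dx / c                # time step satisfying the CFL condition
        u_prev = np.zeros(nx); u = np.zeros(nx)
        u[nx // 2] = 1.0                 # impulsive source in the middle of the grid

        r2 = (c * dt / dx) ** 2
        for _ in range(nt):
            lap = np.zeros(nx)
            lap[1:-1] = u[:-2] - 2 * u[1:-1] + u[2:]   # discrete Laplacian
            u_next = 2 * u - u_prev + r2 * lap         # explicit leapfrog update
            u_prev, u = u, u_next                      # advance one time step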

  18. Doppler measurements of the ionosphere on the occasion of the Apollo-Soyuz test project. Part 1: Computer simulation of ionospheric-induced Doppler shifts

    Science.gov (United States)

    Grossi, M. D.; Gay, R. H.

    1975-01-01

    A computer simulation of the ionospheric experiment of the Apollo-Soyuz Test Project (ASTP) was performed. ASTP is the first example of USA/USSR cooperation in space and is scheduled for summer 1975. The experiment consists of performing dual-frequency Doppler measurements (at 162 and 324 MHz) between the Apollo Command Service Module (CSM) and the ASTP Docking Module (DM), both orbiting at 221-km height and at a relative distance of 300 km. The computer simulation showed that, with the Doppler measurement resolution of approximately 3 mHz provided by the instrumentation (in 10-sec integration time), ionospheric-induced Doppler shifts will be measurable accurately at all times, with some rare exceptions occurring when the radio path crosses regions of minimum ionospheric density. The computer simulation evaluated the ability of the experiment to measure changes of columnar electron content between CSM and DM (from which horizontal gradients of electron density at 221-km height can be obtained) and to measure variations in DM-to-ground columnar content (from which an averaged columnar content and the electron density at the DM can be deduced, under some simplifying assumptions).
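
    The dual-frequency technique works because, to first order, the ionospheric Doppler term scales as 1/f while the geometric term scales as f, so coherent tones at 162 and 324 MHz can be combined to cancel the geometry. The sketch below implements this under the standard first-order ionospheric model; the function and input values are illustrative, not the ASTP processing.

        import numpy as np

        C = 2.998e8              # speed of light, m/s
        K = 40.3                 # first-order ionospheric constant, m^3 s^-2
        f1, f2 = 162e6, 324e6    # coherent tones (f2 = 2 * f1)

        def tec_rate(doppler_f1, doppler_f2):
            """Rate of change of columnar electron content from differential Doppler.

            Model: f_D(f) = -(f/c) dR/dt + (K/(c f)) dTEC/dt.  Scaling the f2
            Doppler by f1/f2 cancels the geometric term, leaving the dispersive
            one.  Returns dTEC/dt in electrons m^-2 s^-1."""
            diff = doppler_f1 - (f1 / f2) * doppler_f2   # geometric term cancels
            return diff * C * f1 / (K * (1.0 - f1**2 / f2**2))

        # A 3 mHz differential residual (the stated measurement resolution) maps to:
        print("%.3e el m^-2 s^-1" % tec_rate(3e-3, 0.0))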

  19. Bibliography for Verification and Validation in Computational Simulation

    International Nuclear Information System (INIS)

    Oberkampf, W.L.

    1998-01-01

    A bibliography has been compiled dealing with the verification and validation of computational simulations. The references listed in this bibliography are concentrated in the field of computational fluid dynamics (CFD). However, references from the following fields are also included: operations research, heat transfer, solid dynamics, software quality assurance, software accreditation, military systems, and nuclear reactor safety. This bibliography, containing 221 references, is not meant to be comprehensive. It was compiled during the last ten years in response to the author's interest and research in the methodology for verification and validation. The emphasis in the bibliography is in the following areas: philosophy of science underpinnings, development of terminology and methodology, high accuracy solutions for CFD verification, experimental datasets for CFD validation, and the statistical quantification of model validation. This bibliography should provide a starting point for individual researchers in many fields of computational simulation in science and engineering

  20. Bibliography for Verification and Validation in Computational Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, W.L.

    1998-10-01

    A bibliography has been compiled dealing with the verification and validation of computational simulations. The references listed in this bibliography are concentrated in the field of computational fluid dynamics (CFD). However, references from the following fields are also included: operations research, heat transfer, solid dynamics, software quality assurance, software accreditation, military systems, and nuclear reactor safety. This bibliography, containing 221 references, is not meant to be comprehensive. It was compiled during the last ten years in response to the author's interest and research in the methodology for verification and validation. The emphasis in the bibliography is in the following areas: philosophy of science underpinnings, development of terminology and methodology, high accuracy solutions for CFD verification, experimental datasets for CFD validation, and the statistical quantification of model validation. This bibliography should provide a starting point for individual researchers in many fields of computational simulation in science and engineering.

  1. Computer Simulation Surgery for Mandibular Reconstruction Using a Fibular Osteotomy Guide

    Directory of Open Access Journals (Sweden)

    Woo Shik Jeong

    2014-09-01

    Full Text Available In the present study, a fibular osteotomy guide based on a computer simulation was applied to a patient who had undergone mandibular segmental ostectomy due to oncological complications. This patient was a 68-year-old woman who presented to our department with a biopsy-proven squamous cell carcinoma in her left gingival area. This lesion had destroyed the cortical bony structure, and the patient showed attenuation of her soft tissue along the inferior alveolar nerve, indicating perineural spread of the tumor. Prior to surgery, a three-dimensional computed tomography scan of the facial and fibular bones was performed. We then created a virtual computer simulation of the mandibular segmental defect, through which we segmented the fibula to reconstruct the proper angulation of the original mandible. Approximately 2-cm segments were created on the basis of this simulation and applied to the virtually simulated mandibular segmental defect. Thus, we obtained a virtual model of the ideal mandibular reconstruction for this patient with a fibular free flap. We could then use this computer simulation for the subsequent surgery and minimize the bony gaps between the multiple fibular bony segments.

  2. SNOW: a digital computer program for the simulation of ion beam devices

    International Nuclear Information System (INIS)

    Boers, J.E.

    1980-08-01

    A digital computer program, SNOW, has been developed for the simulation of dense ion beams. The program simulates the plasma expansion cup (but not the plasma source itself), the acceleration region, and a drift space with neutralization if desired. The ion beam is simulated by computing representative trajectories through the device. The potentials are simulated on a large rectangular matrix array, which is solved by iterative techniques. Poisson's equation is solved at each point within the configuration using space-charge densities computed from the ion trajectories, combined with background electron and/or ion distributions. The simulation methods are described in some detail, along with examples of both axially symmetric and rectangular beams. A detailed description of the input data is presented.
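
    The iterative potential solve described above can be sketched with a Jacobi relaxation of Poisson's equation on a rectangular mesh. This is a minimal stand-in for illustration, not the SNOW solver itself.

        import numpy as np

        def solve_poisson(source, h, n_iter=5000):
            """Jacobi iteration for  -laplace(phi) = source  on a rectangular mesh
            with grounded (phi = 0) boundaries.  `source` plays the role of
            rho/eps0 built from the beam's space-charge density."""
            phi = np.zeros_like(source)
            for _ in range(n_iter):
                phi[1:-1, 1:-1] = 0.25 * (phi[:-2, 1:-1] + phi[2:, 1:-1] +
                                          phi[1:-1, :-2] + phi[1:-1, 2:] +
                                          h * h * source[1:-1, 1:-1])
            return phi

        # Beam-like space charge in the middle of a 2-D mesh (illustrative values).
        src = np.zeros((64, 64)); src[28:36, 28:36] = 1.0
        phi = solve_poisson(src, h=1e-3)

    In a trajectory code, the converged potential would then be differenced to obtain fields, trajectories would be re-traced through those fields, and the space-charge deposit and potential solve repeated until self-consistency.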

  3. Large Atmospheric Computation on the Earth Simulator: The LACES Project

    Directory of Open Access Journals (Sweden)

    Michel Desgagné

    2006-01-01

    Full Text Available The Large Atmospheric Computation on the Earth Simulator (LACES) project is a joint initiative between Canadian and Japanese meteorological services and academic institutions that focuses on the high-resolution simulation of Hurricane Earl (1998). The unique aspect of this effort is the extent of the computational domain, which covers all of North America and Europe with a grid spacing of 1 km. The Canadian Mesoscale Compressible Community (MC2) model is shown to parallelize effectively on the Japanese Earth Simulator (ES) supercomputer; however, even using the extensive computing resources of the ES Center (ESC), the full simulation for the majority of Hurricane Earl's lifecycle takes over eight days to perform and produces over 5.2 TB of raw data. Preliminary diagnostics show that the results of the LACES simulation for the tropical stage of Hurricane Earl's lifecycle compare well with the available observations for the storm. Further studies involving advanced diagnostics have commenced, taking advantage of the uniquely large spatial extent of the high-resolution LACES simulation to investigate multiscale interactions in the hurricane and its environment. It is hoped that these studies will enhance our understanding of processes occurring within the hurricane and between the hurricane and its planetary-scale environment.

  4. Extremely Scalable Spiking Neuronal Network Simulation Code: From Laptops to Exascale Computers

    Science.gov (United States)

    Jordan, Jakob; Ippen, Tammo; Helias, Moritz; Kitayama, Itaru; Sato, Mitsuhisa; Igarashi, Jun; Diesmann, Markus; Kunkel, Susanne

    2018-01-01

    State-of-the-art software tools for neuronal network simulations scale to the largest computing systems available today and enable investigations of large-scale networks of up to 10 % of the human cortex at a resolution of individual neurons and synapses. Due to an upper limit on the number of incoming connections of a single neuron, network connectivity becomes extremely sparse at this scale. To manage computational costs, simulation software ultimately targeting the brain scale needs to fully exploit this sparsity. Here we present a two-tier connection infrastructure and a framework for directed communication among compute nodes accounting for the sparsity of brain-scale networks. We demonstrate the feasibility of this approach by implementing the technology in the NEST simulation code and we investigate its performance in different scaling scenarios of typical network simulations. Our results show that the new data structures and communication scheme prepare the simulation kernel for post-petascale high-performance computing facilities without sacrificing performance in smaller systems. PMID:29503613

  5. Extremely Scalable Spiking Neuronal Network Simulation Code: From Laptops to Exascale Computers.

    Science.gov (United States)

    Jordan, Jakob; Ippen, Tammo; Helias, Moritz; Kitayama, Itaru; Sato, Mitsuhisa; Igarashi, Jun; Diesmann, Markus; Kunkel, Susanne

    2018-01-01

    State-of-the-art software tools for neuronal network simulations scale to the largest computing systems available today and enable investigations of large-scale networks of up to 10 % of the human cortex at a resolution of individual neurons and synapses. Due to an upper limit on the number of incoming connections of a single neuron, network connectivity becomes extremely sparse at this scale. To manage computational costs, simulation software ultimately targeting the brain scale needs to fully exploit this sparsity. Here we present a two-tier connection infrastructure and a framework for directed communication among compute nodes accounting for the sparsity of brain-scale networks. We demonstrate the feasibility of this approach by implementing the technology in the NEST simulation code and we investigate its performance in different scaling scenarios of typical network simulations. Our results show that the new data structures and communication scheme prepare the simulation kernel for post-petascale high-performance computing facilities without sacrificing performance in smaller systems.

  6. Extremely Scalable Spiking Neuronal Network Simulation Code: From Laptops to Exascale Computers

    Directory of Open Access Journals (Sweden)

    Jakob Jordan

    2018-02-01

    Full Text Available State-of-the-art software tools for neuronal network simulations scale to the largest computing systems available today and enable investigations of large-scale networks of up to 10 % of the human cortex at a resolution of individual neurons and synapses. Due to an upper limit on the number of incoming connections of a single neuron, network connectivity becomes extremely sparse at this scale. To manage computational costs, simulation software ultimately targeting the brain scale needs to fully exploit this sparsity. Here we present a two-tier connection infrastructure and a framework for directed communication among compute nodes accounting for the sparsity of brain-scale networks. We demonstrate the feasibility of this approach by implementing the technology in the NEST simulation code and we investigate its performance in different scaling scenarios of typical network simulations. Our results show that the new data structures and communication scheme prepare the simulation kernel for post-petascale high-performance computing facilities without sacrificing performance in smaller systems.

  7. Computing challenges of the CMS experiment

    International Nuclear Information System (INIS)

    Krammer, N.; Liko, D.

    2017-01-01

    The success of the LHC experiments is due to the magnificent performance of the detector systems and the excellent operation of the computing systems. The CMS offline software and computing system is successfully fulfilling the LHC Run 2 requirements. For the increased data rate of future LHC operation, together with high-pileup interactions, improvements in the usage of the current computing facilities and new technologies have become necessary. Especially for the challenge of the future HL-LHC, a more flexible and sophisticated computing model is needed. In this presentation, I discuss the current computing system used in LHC Run 2 and future computing facilities for the HL-LHC runs, using flexible computing technologies such as commercial and academic computing clouds. The cloud resources are highly virtualized and can be deployed for a variety of computing tasks, providing the capacity for the increasing needs of large-scale scientific computing.

  8. Chaos in reversed-field-pinch plasma simulation and experiment

    International Nuclear Information System (INIS)

    Watts, C.; Newman, D.E.; Sprott, J.C.

    1994-01-01

    We investigate the possibility that chaos and simple determinism govern the dynamics of reversed-field-pinch (RFP) plasmas, using data from both numerical simulations and experiment. A large repertoire of nonlinear-analysis techniques is used to identify low-dimensional chaos. These tools include phase portraits and Poincare sections, correlation dimension, the spectrum of Lyapunov exponents, and short-term predictability. In addition, nonlinear-noise-reduction techniques are applied to the experimental data in an attempt to extract any underlying deterministic dynamics. Two model systems are used to simulate the plasma dynamics. These are the DEBS computer code, which models global RFP dynamics, and the dissipative trapped-electron-mode model, which models drift-wave turbulence. Data from both simulations show strong indications of low-dimensional chaos and simple determinism. Experimental data were obtained from the Madison Symmetric Torus RFP and consist of a wide array of both global and local diagnostic signals. None of the signals shows any indication of low-dimensional chaos or other simple determinism. Moreover, most of the analysis tools indicate that the experimental system is very high dimensional, with properties similar to noise. Nonlinear noise reduction is unsuccessful at extracting an underlying deterministic system.

  9. Computer simulation of fatigue under diametrical compression

    OpenAIRE

    Carmona, H. A.; Kun, F.; Andrade Jr., J. S.; Herrmann, H. J.

    2006-01-01

    We study the fatigue fracture of disordered materials by means of computer simulations of a discrete element model. We extend a two-dimensional fracture model to capture the microscopic mechanisms relevant for fatigue, and we simulate the diametric compression of a disc-shaped specimen under a constant external force. The model allows us to follow the development of the fracture process at the macro- and micro-levels while varying the relative influence of the mechanisms of damage accumulation over the ...

  10. Computer simulation of different designs of pseudo-random time-of-flight velocity analysers for molecular beam scattering experiments

    International Nuclear Information System (INIS)

    Rotzoll, G.

    1982-01-01

    After a brief summary of the pseudo-random time-of-flight (TOF) method, the design criteria for construction of a pseudo-random TOF disc are considered and complemented by computer simulations. The question of resolution and the choice of the sequence length and number of time channels per element are discussed. Moreover, the stability requirements of the chopper motor frequency are investigated. (author)
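
    The decoding step that makes the pseudo-random method work can be shown in a few lines: because a maximal-length (m-)sequence has a two-valued circular autocorrelation, cross-correlating the detector signal with the bipolar version of the sequence returns the TOF spectrum exactly, up to a known factor. The sequence length and spectrum below are illustrative choices, not the paper's design values.

        import numpy as np

        # A length-7 maximal-length (m-)sequence; the slit pattern of a
        # pseudo-random chopper disc follows such a sequence.
        seq = np.array([1, 1, 1, 0, 1, 0, 0])
        N = len(seq)

        # Hypothetical true TOF spectrum (counts per time channel).
        spectrum = np.array([0.0, 5.0, 20.0, 10.0, 2.0, 0.0, 0.0])

        # Detector signal: circular convolution of the spectrum with the chopper
        # pattern, i.e. every open slit launches a delayed copy of the spectrum.
        measured = np.array([sum(seq[(t - j) % N] * spectrum[j] for j in range(N))
                             for t in range(N)])

        # Decoding: circular cross-correlation with the bipolar (+/-1) sequence.
        # For an m-sequence this correlation is a delta function, so the true
        # spectrum is recovered up to the factor (N + 1) / 2.
        decoded = np.array([sum((2 * seq[(t - j) % N] - 1) * measured[t]
                                for t in range(N)) for j in range(N)])
        recovered = 2.0 * decoded / (N + 1)
        print(np.allclose(recovered, spectrum))   # True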

  11. Computer simulations and the changing face of scientific experimentation

    CERN Document Server

    Duran, Juan M

    2013-01-01

    Computer simulations have become a central tool for scientific practice. Their use has replaced, in many cases, standard experimental procedures. This is not to mention cases where the target system is empirical but there are no techniques for direct manipulation of the system, such as astronomical observation. In such cases, computer simulations have proved to be of central importance. The question of their use and implementation, therefore, is not only a technical one but represents a challenge for the humanities as well. In this volume, scientists, historians, and philosophers joi

  12. Computer Simulations and Theoretical Studies of Complex Systems: from complex fluids to frustrated magnets

    Science.gov (United States)

    Choi, Eunsong

    Computer simulations are an integral part of research in modern condensed matter physics; they serve as a direct bridge between theory and experiment by systematically applying a microscopic model to a collection of particles that effectively imitates a macroscopic system. In this thesis, we study two very different condensed-matter systems, namely complex fluids and frustrated magnets, primarily by simulating the classical dynamics of each system. In the first part of the thesis, we focus on ionic liquids (ILs) and polymers--the two complementary classes of materials that can be combined to provide various unique properties. The properties of polymer/IL systems, such as conductivity, viscosity, and miscibility, can be fine-tuned by choosing an appropriate combination of cations, anions, and polymers. However, designing a system that meets a specific need requires a concrete understanding of the physics and chemistry that dictate the complex interplay between polymers and ionic liquids. In this regard, molecular dynamics (MD) simulation is an efficient tool that provides a molecular-level picture of such complex systems. We study the behavior of poly(ethylene oxide) (PEO) and imidazolium-based ionic liquids using MD simulations and statistical mechanics. We also discuss our efforts to develop reliable and efficient classical force fields for PEO and the ionic liquids. The second part is devoted to studies of geometrically frustrated magnets. In particular, a microscopic model which gives rise to an incommensurate spiral magnetic ordering observed in a pyrochlore antiferromagnet is investigated. The model is validated via a comparison of the spin-wave spectra with the neutron scattering data. Since the standard Holstein-Primakoff method is difficult to employ in such a complex ground-state structure with a large unit cell, we carry out classical spin dynamics simulations to compute spin-wave spectra directly from the Fourier transform of spin trajectories. We
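
    The final step mentioned above, extracting spin-wave spectra from the Fourier transform of spin trajectories, can be sketched as a space-time FFT of a recorded spin component. The 1-D chain geometry and toy trajectory below are illustrative assumptions, not the thesis's pyrochlore model.

        import numpy as np

        def spin_wave_spectrum(sz, dt, a=1.0):
            """Dynamical spectrum from classical spin trajectories.

            sz: array of shape (n_steps, n_sites) holding one spin component along
            a chain recorded during a spin-dynamics run.  Returns |S(k, w)|^2 from
            the space-time Fourier transform, the quantity compared against
            inelastic neutron scattering data."""
            skw = np.fft.fftshift(np.fft.fft2(sz))          # FFT over time and space
            k = np.fft.fftshift(np.fft.fftfreq(sz.shape[1], d=a)) * 2 * np.pi
            w = np.fft.fftshift(np.fft.fftfreq(sz.shape[0], d=dt)) * 2 * np.pi
            return k, w, np.abs(skw) ** 2

        # Toy trajectory: a single spin wave with wavevector k0 and frequency w0.
        n_t, n_x, dt = 256, 64, 0.05
        t, x = np.meshgrid(np.arange(n_t) * dt, np.arange(n_x), indexing="ij")
        k0, w0 = 2 * np.pi * 5 / n_x, 2 * np.pi * 1.0
        k, w, power = spin_wave_spectrum(np.cos(k0 * x - w0 * t), dt)
        # `power` peaks at (k, w) = (+/-k0, +/-w0), tracing out the dispersion.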

  13. A Computational Framework for Bioimaging Simulation

    Science.gov (United States)

    Watabe, Masaki; Arjunan, Satya N. V.; Fukushima, Seiya; Iwamoto, Kazunari; Kozuka, Jun; Matsuoka, Satomi; Shindo, Yuki; Ueda, Masahiro; Takahashi, Koichi

    2015-01-01

    Using bioimaging technology, biologists have attempted to identify and document analytical interpretations that underlie biological phenomena in biological cells. Theoretical biology aims at distilling those interpretations into knowledge in the mathematical form of biochemical reaction networks and understanding how higher level functions emerge from the combined action of biomolecules. However, there still remain formidable challenges in bridging the gap between bioimaging and mathematical modeling. Generally, measurements using fluorescence microscopy systems are influenced by systematic effects that arise from the stochastic nature of biological cells, the imaging apparatus, and optical physics. Such systematic effects are always present in all bioimaging systems and hinder quantitative comparison between the cell model and bioimages. Computational tools for such a comparison are still unavailable. Thus, in this work, we present a computational framework for handling the parameters of the cell models and the optical physics governing bioimaging systems. Simulation using this framework can generate digital images of cell simulation results after accounting for the systematic effects. We then demonstrate that such a framework enables comparison at the level of photon-counting units. PMID:26147508

  14. A Computational Framework for Bioimaging Simulation.

    Science.gov (United States)

    Watabe, Masaki; Arjunan, Satya N V; Fukushima, Seiya; Iwamoto, Kazunari; Kozuka, Jun; Matsuoka, Satomi; Shindo, Yuki; Ueda, Masahiro; Takahashi, Koichi

    2015-01-01

    Using bioimaging technology, biologists have attempted to identify and document analytical interpretations that underlie biological phenomena in biological cells. Theoretical biology aims at distilling those interpretations into knowledge in the mathematical form of biochemical reaction networks and understanding how higher level functions emerge from the combined action of biomolecules. However, there still remain formidable challenges in bridging the gap between bioimaging and mathematical modeling. Generally, measurements using fluorescence microscopy systems are influenced by systematic effects that arise from the stochastic nature of biological cells, the imaging apparatus, and optical physics. Such systematic effects are always present in all bioimaging systems and hinder quantitative comparison between the cell model and bioimages. Computational tools for such a comparison are still unavailable. Thus, in this work, we present a computational framework for handling the parameters of the cell models and the optical physics governing bioimaging systems. Simulation using this framework can generate digital images of cell simulation results after accounting for the systematic effects. We then demonstrate that such a framework enables comparison at the level of photon-counting units.

  15. A Computational Framework for Bioimaging Simulation.

    Directory of Open Access Journals (Sweden)

    Masaki Watabe

    Full Text Available Using bioimaging technology, biologists have attempted to identify and document analytical interpretations that underlie biological phenomena in biological cells. Theoretical biology aims at distilling those interpretations into knowledge in the mathematical form of biochemical reaction networks and understanding how higher level functions emerge from the combined action of biomolecules. However, there still remain formidable challenges in bridging the gap between bioimaging and mathematical modeling. Generally, measurements using fluorescence microscopy systems are influenced by systematic effects that arise from the stochastic nature of biological cells, the imaging apparatus, and optical physics. Such systematic effects are always present in all bioimaging systems and hinder quantitative comparison between the cell model and bioimages. Computational tools for such a comparison are still unavailable. Thus, in this work, we present a computational framework for handling the parameters of the cell models and the optical physics governing bioimaging systems. Simulation using this framework can generate digital images of cell simulation results after accounting for the systematic effects. We then demonstrate that such a framework enables comparison at the level of photon-counting units.
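
    The forward model at the heart of such a framework can be sketched in a few lines: each simulated emitter contributes a point-spread function scaled by its expected photon count, and shot noise is applied in photon-counting units. The sketch below is a minimal stand-in, not the authors' framework; all names and values are assumptions.

        import numpy as np

        def render_image(positions, photons, shape=(64, 64), sigma_px=1.5,
                         background=2.0, seed=0):
            """Render a synthetic fluorescence image from simulated emitters.

            Each emitter at (y, x) contributes a Gaussian point-spread function
            scaled by its expected photon count; Poisson shot noise is applied
            so the output is in photon-counting units."""
            rng = np.random.default_rng(seed)
            yy, xx = np.mgrid[0:shape[0], 0:shape[1]]
            expected = np.full(shape, background, float)      # camera background
            for (y, x), n in zip(positions, photons):
                psf = np.exp(-((yy - y)**2 + (xx - x)**2) / (2 * sigma_px**2))
                expected += n * psf / (2 * np.pi * sigma_px**2)  # normalized PSF
            return rng.poisson(expected)                      # Poisson shot noise

        img = render_image(positions=[(20.3, 31.7), (40.0, 12.5)],
                           photons=[500, 800])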

  16. The use of computer simulations in whole-class versus small-group settings

    Science.gov (United States)

    Smetana, Lara Kathleen

    This study explored the use of computer simulations in a whole-class as compared to a small-group setting. Specific consideration was given to the nature and impact of classroom conversations and interactions when computer simulations were incorporated into a high school chemistry course. This investigation fills a need for qualitative research that focuses on the social dimensions of actual classrooms. Participants included a novice chemistry teacher experienced in the use of educational technologies and two honors chemistry classes. The study was conducted in a rural school in the south-Atlantic United States at the end of the fall 2007 semester. The study took place during one instructional unit on atomic structure. Data collection allowed for triangulation of evidence from a variety of sources: approximately 24 hours of video- and audio-taped classroom observations, supplemented with the researcher's field notes and analytic journal; miscellaneous classroom artifacts such as class notes, worksheets, and assignments; open-ended pre- and post-assessments; student exit interviews; and teacher entrance, exit and informal interviews. Four web-based simulations were used, three of which were from the ExploreLearning collection. Assessments were analyzed using descriptive statistics, and classroom observations, artifacts and interviews were analyzed using Erickson's (1986) guidelines for analytic induction. Conversational analysis was guided by methods outlined by Erickson (1982). Findings indicated (a) the teacher effectively incorporated simulations in both settings; (b) students in both groups significantly improved their understanding of the chemistry concepts; (c) there was no statistically significant difference between groups' achievement; (d) there was more frequent exploratory talk in the whole-class group; (e) there were more frequent and meaningful teacher-student interactions in the whole-class group; (f) additional learning experiences not measured on the assessment

  17. simulate_CAT: A Computer Program for Post-Hoc Simulation for Computerized Adaptive Testing

    Directory of Open Access Journals (Sweden)

    İlker Kalender

    2015-06-01

    Full Text Available This paper presents computer software developed by the author. The software conducts post-hoc simulations for computerized adaptive testing based on real responses of examinees to paper-and-pencil tests, under different parameters that can be defined by the user. In this paper, brief information about post-hoc simulations is given. After that, the working principle of the software is explained and a sample simulation with the required input files is shown. Finally, the output files are described.
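
    A post-hoc CAT simulation of the kind the software performs can be sketched as follows: items are selected by maximum Fisher information at the current ability estimate, and the examinee's recorded paper-and-pencil responses are reused instead of administering new items. The 2PL item bank and input format below are hypothetical, not the program's actual interface.

        import numpy as np

        def posthoc_cat(responses, a, b, max_items=20):
            """Post-hoc CAT simulation from recorded paper-and-pencil responses.

            responses: 0/1 answers of one examinee to the full item bank;
            a, b: 2PL discrimination and difficulty parameters.  Items are picked
            by maximum Fisher information at the current EAP ability estimate."""
            grid = np.linspace(-4, 4, 81)
            posterior = np.exp(-0.5 * grid**2)             # standard-normal prior
            used = np.zeros(len(a), bool)
            for _ in range(max_items):
                theta = np.sum(grid * posterior) / np.sum(posterior)  # EAP
                p_at_theta = 1 / (1 + np.exp(-a * (theta - b)))
                info = a**2 * p_at_theta * (1 - p_at_theta)  # 2PL information
                info[used] = -np.inf
                j = int(np.argmax(info))                     # most informative item
                used[j] = True
                p = 1 / (1 + np.exp(-a[j] * (grid - b[j])))  # likelihood on grid
                posterior *= p if responses[j] == 1 else (1 - p)
            theta = np.sum(grid * posterior) / np.sum(posterior)
            return theta, np.flatnonzero(used)

        # Hypothetical 60-item bank and one examinee's recorded responses.
        rng = np.random.default_rng(0)
        a = rng.uniform(0.8, 2.0, 60); b = rng.normal(0, 1, 60)
        responses = (rng.random(60) < 1 / (1 + np.exp(-a * (0.5 - b)))).astype(int)
        print(posthoc_cat(responses, a, b))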

  18. The challenge of quantum computer simulations of physical phenomena

    International Nuclear Information System (INIS)

    Ortiz, G.; Knill, E.; Gubernatis, J.E.

    2002-01-01

    The goal of physics simulation using controllable quantum systems ('physics imitation') is to exploit quantum laws to advantage, and thus accomplish efficient simulation of physical phenomena. In this Note, we discuss the fundamental concepts behind this paradigm of information processing, such as the connection between models of computation and physical systems. The experimental simulation of a toy quantum many-body problem is described.

  19. High performance stream computing for particle beam transport simulations

    International Nuclear Information System (INIS)

    Appleby, R; Bailey, D; Higham, J; Salt, M

    2008-01-01

    Understanding modern particle accelerators requires simulating charged-particle transport through the machine elements. These simulations can be very time-consuming due to the large number of particles and the need to consider many turns of a circular machine. Stream computing offers an attractive way to dramatically improve the performance of such simulations by calculating the simultaneous transport of many particles using dedicated hardware. Modern Graphics Processing Units (GPUs) are powerful and affordable stream computing devices. The results of simulations of particle transport through the booster-to-storage-ring transfer line of the DIAMOND synchrotron light source using an NVidia GeForce 7900 GPU are compared to the standard transport code MAD. It is found that particle transport calculations are suitable for stream processing and large performance increases are possible. The accuracy and potential speed gains are compared and the prospects for future work in the area are discussed.
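
    The data-parallel structure that makes such simulations suitable for stream processing is easy to see in a vectorized sketch: the same small transfer map is applied to a large array of particles at once, which is exactly the pattern a GPU executes with one particle per thread. The lattice below is an invented FODO-like line, not the DIAMOND transfer line.

        import numpy as np

        def drift(L):
            """Transfer matrix of a field-free drift of length L (x, x' plane)."""
            return np.array([[1.0, L], [0.0, 1.0]])

        def thin_quad(f):
            """Thin-lens quadrupole of focal length f (horizontal plane)."""
            return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

        # One pass through an illustrative FODO-like line as a matrix product.
        line = drift(1.0) @ thin_quad(2.0) @ drift(1.0) @ thin_quad(-2.0)

        # Stream-style transport: one 2x2 map applied to many particles at once.
        n = 1_000_000
        particles = np.random.default_rng(1).normal(0, 1e-3, size=(2, n))  # (x, x')
        particles = line @ particles     # data-parallel matrix-vector transport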

  20. Computer simulation of variform fuel assemblies using Dragon code

    International Nuclear Information System (INIS)

    Ju Haitao; Wu Hongchun; Yao Dong

    2005-01-01

    DRAGON is a cell code developed for the CANDU reactor by the Ecole Polytechnique de Montreal in Canada. Although DRAGON is mainly used to simulate the CANDU super-cell fuel assembly, it is able to simulate other fuel assembly geometries as well. However, apart from the CANDU reactor, only the NEACRP benchmark problem of the BWR lattice cell has been analyzed until now. We also need to develop the code to simulate variform fuel assemblies, especially for the design of advanced reactors. We validated that the cell code DRAGON is useful for simulating various kinds of fuel assembly by analyzing the rod-shaped fuel assembly of the PWR and the plate-shaped fuel assembly of the MTR. Some other kinds of geometry were computed as well. Computational results show that DRAGON is able to analyze variform fuel assembly problems with high precision. (authors)