Shuttle orbit IMU alignment. Single-precision computation error
McClain, C. R.
1980-01-01
The source of computational error in the inertial measurement unit (IMU) on-orbit alignment software was investigated. Simulation runs were made on the IBM 360/70 computer with the IMU orbit alignment software coded in HAL/S. The results indicate that for small IMU misalignment angles (less than 600 arc seconds), single-precision computation in combination with the arc cosine method of eigen-rotation-angle extraction introduces an additional misalignment error of up to 230 arc seconds per axis. Use of the arc sine method, however, produced negligible misalignment error. As a result of this study, the arc sine method was recommended for use in the IMU on-orbit alignment software.
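The precision effect reported above is easy to reproduce outside the original HAL/S code. The following NumPy sketch is a hypothetical stand-in (float32 for the single-precision arithmetic, a 300 arc second rotation chosen inside the sub-600 arcsec regime), showing why a small eigen rotation angle recovered via arc cosine loses accuracy while arc sine does not:

```python
import numpy as np

# Small rotation: 300 arc seconds, inside the <600 arcsec regime studied.
ARCSEC = np.pi / (180.0 * 3600.0)
theta = 300.0 * ARCSEC

# Single-precision cosine and sine of the eigen rotation angle, as they
# would emerge from a float32 direction-cosine computation.
c32 = np.float32(np.cos(theta))
s32 = np.float32(np.sin(theta))

# Recover the angle both ways; errors expressed in arc seconds.
err_acos = abs(float(np.arccos(c32)) - theta) / ARCSEC
err_asin = abs(float(np.arcsin(s32)) - theta) / ARCSEC

print(f"arc cosine error: {err_acos:.3f} arcsec")   # on the order of arcsec
print(f"arc sine error:   {err_asin:.6f} arcsec")   # essentially zero
```

The mechanism: near zero rotation the cosine curve is flat, so a half-ulp rounding of a float32 cosine stored near 1.0 maps into a large angle error, whereas the sine is locally linear in the angle and preserves its relative precision.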
Scientific computer simulation review
International Nuclear Information System (INIS)
Kaizer, Joshua S.; Heller, A. Kevin; Oberkampf, William L.
2015-01-01
Before the results of a scientific computer simulation are used for any purpose, it should be determined if those results can be trusted. Answering that question of trust is the domain of scientific computer simulation review. There is limited literature that focuses on simulation review, and most is specific to the review of a particular type of simulation. This work is intended to provide a foundation for a common understanding of simulation review. This is accomplished through three contributions. First, scientific computer simulation review is formally defined. This definition identifies the scope of simulation review and provides the boundaries of the review process. Second, maturity assessment theory is developed. This development clarifies the concepts of maturity criteria, maturity assessment sets, and maturity assessment frameworks, which are essential for performing simulation review. Finally, simulation review is described as the application of a maturity assessment framework. This is illustrated through evaluating a simulation review performed by the U.S. Nuclear Regulatory Commission. In making these contributions, this work provides a means for a more objective assessment of a simulation’s trustworthiness and takes the next step in establishing scientific computer simulation review as its own field. - Highlights: • We define scientific computer simulation review. • We develop maturity assessment theory. • We formally define a maturity assessment framework. • We describe simulation review as the application of a maturity framework. • We provide an example of a simulation review using a maturity framework
Simulation of quantum computers
De Raedt, H; Michielsen, K; Hams, AH; Miyashita, S; Saito, K; Landau, DP; Lewis, SP; Schuttler, HB
2001-01-01
We describe a simulation approach to study the functioning of Quantum Computer hardware. The latter is modeled by a collection of interacting spin-1/2 objects. The time evolution of this spin system maps one-to-one to a quantum program carried out by the Quantum Computer. Our simulation software
Simulation of quantum computers
Raedt, H. De; Michielsen, K.; Hams, A.H.; Miyashita, S.; Saito, K.
2000-01-01
We describe a simulation approach to study the functioning of Quantum Computer hardware. The latter is modeled by a collection of interacting spin-1/2 objects. The time evolution of this spin system maps one-to-one to a quantum program carried out by the Quantum Computer. Our simulation software
Parallel reservoir simulator computations
International Nuclear Information System (INIS)
Hemanth-Kumar, K.; Young, L.C.
1995-01-01
The adaptation of a reservoir simulator for parallel computations is described. The simulator was originally designed for vector processors. It performs approximately 99% of its calculations in vector/parallel mode and, relative to scalar calculations, achieves speedups of 65 and 81 for black oil and EOS simulations, respectively, on the CRAY C-90.
Computer Modeling and Simulation
Energy Technology Data Exchange (ETDEWEB)
Pronskikh, V. S. [Fermilab
2014-05-09
Verification and validation of computer codes and models used in simulation are two aspects of scientific practice of high importance that have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to the model's relation to the real world and its intended use. It has been argued that because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model's general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them) needs to be made. Holding on to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement, I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviate the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes.
International Nuclear Information System (INIS)
Rasmussen, H.
1992-01-01
Computer Simulation Western is a unit within the Department of Applied Mathematics at the University of Western Ontario. Its purpose is the development of computational and mathematical methods for practical problems in industry and engineering and the application and marketing of such methods. We describe the unit and our efforts at obtaining research and development grants. Some representative projects will be presented and future plans discussed. (author)
Accelerator simulation using computers
International Nuclear Information System (INIS)
Lee, M.; Zambre, Y.; Corbett, W.
1992-01-01
Every accelerator or storage ring system consists of a charged particle beam propagating through a beam line. Although a number of computer programs exist that simulate the propagation of a beam in a given beam line, only a few provide the capabilities for designing, commissioning and operating the beam line. This paper shows how a ''multi-track'' simulation and analysis code can be used for these applications.
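Beam-line propagation of the kind such codes perform is conventionally built from transfer matrices. The sketch below is a generic textbook illustration, not the "multi-track" code itself; the element lengths and focal length are hypothetical:

```python
import numpy as np

def drift(L):
    """Transfer matrix of a field-free drift of length L (metres)."""
    return np.array([[1.0, L], [0.0, 1.0]])

def thin_quad(f):
    """Thin-lens quadrupole of focal length f (metres); f > 0 focuses."""
    return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

# Toy beam line: 1 m drift, focusing quad, 2 m drift. Matrices compose
# in reverse order of traversal, so the first element sits rightmost.
M = drift(2.0) @ thin_quad(1.0) @ drift(1.0)

# Propagate a particle with 1 mm offset and zero slope: state (x, x').
x0 = np.array([1e-3, 0.0])
x1 = M @ x0
print(x1)
```

Multiplying the per-element matrices once and reusing the product is what makes this representation cheap enough for design and operations studies.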
Advanced computers and simulation
International Nuclear Information System (INIS)
Ryne, R.D.
1993-01-01
Accelerator physicists today have access to computers that are far more powerful than those available just 10 years ago. In the early 1980s, desktop workstations performed fewer than one million floating point operations per second (Mflops), and the realized performance of vector supercomputers was at best a few hundred Mflops. Today vector processing is available on the desktop, providing researchers with performance approaching 100 Mflops at a price that is measured in thousands of dollars. Furthermore, advances in Massively Parallel Processors (MPP) have made performance of over 10 gigaflops a reality, and around mid-decade MPPs are expected to be capable of teraflops performance. Along with advances in MPP hardware, researchers have also made significant progress in developing algorithms and software for MPPs. These changes have had, and will continue to have, a significant impact on the work of computational accelerator physicists. Now, instead of running particle simulations with just a few thousand particles, we can perform desktop simulations with tens of thousands of simulation particles, and calculations with well over 1 million particles are being performed on MPPs. In the area of computational electromagnetics, simulations that used to be performed only on vector supercomputers now run in several hours on desktop workstations, and researchers are hoping to perform simulations with over one billion mesh points on future MPPs. In this paper we will discuss the latest advances, and what can be expected in the near future, in hardware, software and applications codes for advanced simulation of particle accelerators.
International Nuclear Information System (INIS)
Schelonka, E.P.
1979-01-01
Development and application of a series of simulation codes used for computer security analysis and design are described. Boolean relationships for arrays of barriers within functional modules are used to generate composite effectiveness indices. The general case of multiple layers of protection with any specified barrier survival criteria is given. Generalized reduction algorithms provide numerical security indices in selected subcategories and for the system as a whole. 9 figures, 11 tables
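The Boolean composition of barrier effectiveness can be sketched with a toy probabilistic model. This is an illustration under assumptions the abstract does not state: barriers within a module are treated as independent alternative paths (defeating any one defeats the layer), layers are in series (every layer must be defeated), and the per-barrier defeat probabilities are hypothetical:

```python
from math import prod

def layer_defeat_prob(barrier_probs):
    # Parallel barriers are alternative paths: the layer is defeated if
    # ANY single barrier is defeated (independent attempts assumed).
    return 1.0 - prod(1.0 - p for p in barrier_probs)

def system_effectiveness(layers):
    # Layers in series: the intruder must defeat every layer in turn.
    p_defeat = prod(layer_defeat_prob(layer) for layer in layers)
    return 1.0 - p_defeat

# Hypothetical per-barrier defeat probabilities for two layers.
print(round(system_effectiveness([[0.2, 0.3], [0.1]]), 4))  # 0.956
```

The actual codes described use generalized reduction over arbitrary survival criteria; this sketch only shows the simplest series/parallel composition.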
Massively parallel quantum computer simulator
De Raedt, K.; Michielsen, K.; De Raedt, H.; Trieu, B.; Arnold, G.; Richter, M.; Lippert, Th.; Watanabe, H.; Ito, N.
2007-01-01
We describe portable software to simulate universal quantum computers on massively parallel computers. We illustrate the use of the simulation software by running various quantum algorithms on different computer architectures, such as an IBM BlueGene/L, an IBM Regatta p690+, a Hitachi SR11000/J1, a Cray
Grid computing and biomolecular simulation.
Woods, Christopher J; Ng, Muan Hong; Johnston, Steven; Murdock, Stuart E; Wu, Bing; Tai, Kaihsu; Fangohr, Hans; Jeffreys, Paul; Cox, Simon; Frey, Jeremy G; Sansom, Mark S P; Essex, Jonathan W
2005-08-15
Biomolecular computer simulations are now widely used not only in an academic setting to understand the fundamental role of molecular dynamics on biological function, but also in the industrial context to assist in drug design. In this paper, two applications of Grid computing to this area will be outlined. The first, involving the coupling of distributed computing resources to dedicated Beowulf clusters, is targeted at simulating protein conformational change using the Replica Exchange methodology. In the second, the rationale and design of a database of biomolecular simulation trajectories is described. Both applications illustrate the increasingly important role modern computational methods are playing in the life sciences.
FEL simulations using distributed computing
Einstein, J.; Biedron, S.G.; Freund, H.P.; Milton, S.V.; Van Der Slot, P. J M; Bernabeu, G.
2016-01-01
While simulation tools are available and have been used regularly for simulating light sources, including Free-Electron Lasers, the increasing availability and lower cost of accelerated computing opens up new opportunities. This paper highlights a method of how accelerating and parallelizing code
Fluid simulation for computer graphics
Bridson, Robert
2008-01-01
Animating fluids like water, smoke, and fire using physics-based simulation is increasingly important in visual effects, in particular in movies, like The Day After Tomorrow, and in computer games. This book provides a practical introduction to fluid simulation for graphics. The focus is on animating fully three-dimensional incompressible flow, from understanding the math and the algorithms to the actual implementation.
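Central to this style of graphics fluid simulation is unconditionally stable semi-Lagrangian advection: trace each grid point backwards along the velocity field and interpolate. A minimal 1D periodic sketch of the idea (a generic illustration, not code from the book):

```python
import numpy as np

def advect(q, u, dt, dx):
    """Semi-Lagrangian advection of quantity q on a periodic 1D grid:
    trace each grid point back along velocity u, then linearly
    interpolate q at the departure point."""
    n = len(q)
    x = np.arange(n) - u * dt / dx        # departure points, in grid units
    i0 = np.floor(x).astype(int)
    frac = x - i0
    return (1 - frac) * q[i0 % n] + frac * q[(i0 + 1) % n]

q = np.zeros(8); q[2] = 1.0
q1 = advect(q, u=1.0, dt=1.0, dx=1.0)    # shifts right by exactly one cell
print(q1)
```

Because the lookup interpolates existing values rather than extrapolating, the step never blows up regardless of the time step, which is why the method dominates in visual effects.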
International Nuclear Information System (INIS)
Rowley, A.
1998-01-01
An ionic interaction model is developed which accounts for the effects of the ionic environment upon the electron densities of both cations and anions through changes in their size and shape and is transferable between materials. These variations are represented by additional dynamical variables which are handled within the model using the techniques of the Car-Parrinello method. The model parameters are determined as far as possible by input from external ab initio electronic structure calculations directed at examining the individual effects of the ionic environment upon the ions, particularly the oxide ion. Techniques for the evaluation of dipolar and quadrupolar Ewald sums in non-cubic simulation cells and the calculation of the pressure due to the terms in the potential are presented. This model is applied to the description of the perfect crystal properties and phonon dispersion curves of MgO. Consideration of the high symmetry phonon modes allows parameterization of the remaining model parameters in an unambiguous fashion. The same procedure is used to obtain parameters for CaO. These two parameter sets are examined to determine how they may be used to generate the parameters for SrO and simple scaling relationships based on ionic radii and polarizabilities are formulated. The transferability of the model to Cr2O3 is investigated using parameters generated from the alkaline earth oxides. The importance of lower symmetry model terms, particularly quadrupolar interactions, at the low symmetry ion sites in the crystal structure is demonstrated. The correct ground-state crystal structure is predicted and the calculated surface energies and relaxation phenomena are found to agree well with previous ab initio studies. The model is applied to GeO2 as a strong test of its applicability to ion environments far different from those encountered in MgO. A good description of the crystal structures is obtained and the interplay of dipolar and quadrupolar effects is
Simulating chemistry using quantum computers.
Kassal, Ivan; Whitfield, James D; Perdomo-Ortiz, Alejandro; Yung, Man-Hong; Aspuru-Guzik, Alán
2011-01-01
The difficulty of simulating quantum systems, well known to quantum chemists, prompted the idea of quantum computation. One can avoid the steep scaling associated with the exact simulation of increasingly large quantum systems on conventional computers, by mapping the quantum system to another, more controllable one. In this review, we discuss to what extent the ideas in quantum computation, now a well-established field, have been applied to chemical problems. We describe algorithms that achieve significant advantages for the electronic-structure problem, the simulation of chemical dynamics, protein folding, and other tasks. Although theory is still ahead of experiment, we outline recent advances that have led to the first chemical calculations on small quantum information processors.
Computer simulation of ductile fracture
International Nuclear Information System (INIS)
Wilkins, M.L.; Streit, R.D.
1979-01-01
Finite difference computer simulation programs are capable of very accurate solutions to problems in plasticity with large deformations and rotation. This opens the possibility of developing models of ductile fracture by correlating experiments with equivalent computer simulations. Selected experiments were done to emphasize different aspects of the model. A difficult problem is the establishment of a fracture-size effect. This paper is a study of the strain field around notched tensile specimens of aluminum 6061-T651. A series of geometrically scaled specimens are tested to fracture. The scaled experiments are conducted for different notch radius-to-diameter ratios. The strains at fracture are determined from computer simulations. An estimate is made of the fracture-size effect
Biomass Gasifier for Computer Simulation; Biomassa foergasare foer Computer Simulation
Energy Technology Data Exchange (ETDEWEB)
Hansson, Jens; Leveau, Andreas; Hulteberg, Christian [Nordlight AB, Limhamn (Sweden)
2011-08-15
This report is an effort to summarize the existing data on biomass gasifiers, as the authors have taken part in various projects aiming at computer simulations of systems that include biomass gasification. Reliable input data is paramount for any computer simulation, but so far there is no easily accessible biomass gasifier database available for this purpose. This study aims at benchmarking current and past gasifier systems in order to create a comprehensive database for computer simulation purposes. The result of the investigation is presented in a Microsoft Excel sheet, so that the user can easily implement the data in their specific model. In addition to providing simulation data, the technology is described briefly for every studied gasifier system. The primary pieces of information sought are temperatures, pressures, stream compositions and energy consumption. At present the resulting database contains 17 gasifiers, with one or more gasifiers within each of the gasification technology types normally discussed in this context: 1. Fixed bed 2. Fluidised bed 3. Entrained flow. It also contains gasifiers in the range from 100 kW to 120 MW, with several gasifiers in between these two values. Finally, there are gasifiers representing both direct and indirect heating. This allows for a more qualified and better available choice of starting data sets for simulations. In addition, with multiple data sets available for several of the operating modes, sensitivity analysis of various inputs will improve simulations performed. There have been fewer answers to the survey than expected or hoped for, which could have improved the database further; however, the use of online sources and other public information has to some extent counterbalanced the low response frequency of the survey. The database is intended to be a living document, continuously updated with new gasifiers and improved information on existing gasifiers.
Computer simulation of martensitic transformations
Energy Technology Data Exchange (ETDEWEB)
Xu, Ping [Univ. of California, Berkeley, CA (United States)
1993-11-01
The characteristics of martensitic transformations in solids are largely determined by the elastic strain that develops as martensite particles grow and interact. To study the development of microstructure, a finite-element computer simulation model was constructed to mimic the transformation process. The transformation is athermal and simulated at each incremental step by transforming the cell which maximizes the decrease in the free energy. To determine the free energy change, the elastic energy developed during martensite growth is calculated from the theory of linear elasticity for elastically homogeneous media, and updated as the transformation proceeds.
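The athermal, greedy cell-selection loop described above can be sketched as follows. The free-energy function here is a toy stand-in, not the paper's linear-elasticity calculation; it only mimics a driving force opposed by strain energy that accumulates as the transformation proceeds:

```python
def simulate(n_cells, delta_f, max_steps):
    """Greedy athermal transformation: each step transforms the
    untransformed cell giving the largest free-energy decrease, and stops
    when no cell lowers the free energy. delta_f(cell, transformed) is
    the free-energy change of transforming `cell` given the set already
    transformed (a stand-in for the paper's elastic-energy model)."""
    transformed = set()
    for _ in range(max_steps):
        candidates = [c for c in range(n_cells) if c not in transformed]
        if not candidates:
            break
        best = min(candidates, key=lambda c: delta_f(c, transformed))
        if delta_f(best, transformed) >= 0.0:
            break   # no cell lowers the free energy any further
        transformed.add(best)
    return transformed

# Toy energetics: fixed chemical driving force plus a penalty growing
# with the amount already transformed (mimicking elastic interaction).
toy = lambda cell, done: -1.0 + 0.4 * len(done)
print(len(simulate(10, toy, 100)))  # transformation arrests after 3 cells
```

The arrest behaviour in the toy run is the qualitative point: growth stops once the elastic penalty of further transformation outweighs the driving force.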
Computer simulation of electron beams
Energy Technology Data Exchange (ETDEWEB)
Sabchevski, S.P.; Mladenov, G.M. (Bylgarska Akademiya na Naukite, Sofia (Bulgaria). Inst. po Elektronika)
1994-04-14
Self-fields and forces as well as the local degree of space-charge neutralization in overcompensated electron beams are considered. The radial variation of the local degree of space-charge neutralization is analysed. A novel model which describes the equilibrium potential distribution in overcompensated beams is proposed and a method for computer simulation of the beam propagation is described. Results from numerical experiments which illustrate the propagation of finite emittance overneutralized beams are presented. (Author).
Computer simulation of nonequilibrium processes
Energy Technology Data Exchange (ETDEWEB)
Wallace, D.C.
1985-07-01
The underlying concepts of nonequilibrium statistical mechanics, and of irreversible thermodynamics, will be described. The question at hand is then: how are these concepts to be realized in computer simulations of many-particle systems? The answer will be given for dissipative deformation processes in solids, on three hierarchical levels: heterogeneous plastic flow, dislocation dynamics, and molecular dynamics. Application to the shock process will be discussed.
Computational simulator of robotic manipulators
International Nuclear Information System (INIS)
Leal, Alexandre S.; Campos, Tarcisio P.R.
1995-01-01
Robotic applications for industrial plants are discussed and a computational model of a three-link mechanical manipulator is presented. A feed-forward neural network has been used to model the dynamic control of the manipulator. A graphic interface was developed in the C programming language as a virtual world, in order to visualize and simulate the arm movements in a radioactive waste handling environment. (author). 7 refs, 5 figs
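The forward kinematics of a three-link planar arm, which any dynamic model of such a manipulator must reproduce, can be written in a few lines. This is a generic textbook sketch, not the authors' neural-network controller:

```python
import math

def forward_kinematics(lengths, angles):
    """End-effector (x, y) of a planar serial arm; each joint angle is
    measured relative to the previous link (radians)."""
    x = y = total = 0.0
    for L, a in zip(lengths, angles):
        total += a                  # accumulate absolute link orientation
        x += L * math.cos(total)
        y += L * math.sin(total)
    return x, y

# Three 1 m links, all joints at zero: the arm lies along the x axis.
print(forward_kinematics([1.0, 1.0, 1.0], [0.0, 0.0, 0.0]))  # (3.0, 0.0)
```

A learned controller is typically trained against exactly this kind of kinematic/dynamic model, which is what makes a simulated virtual world useful before touching real hardware.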
Computer simulation of liquid crystals
International Nuclear Information System (INIS)
McBride, C.
1999-01-01
Molecular dynamics simulation performed on modern computer workstations provides a powerful tool for the investigation of the static and dynamic characteristics of liquid crystal phases. In this thesis molecular dynamics computer simulations have been performed for two model systems. Simulations of 4,4'-di-n-pentyl-bibicyclo[2.2.2]octane demonstrate the growth of a structurally ordered phase directly from an isotropic fluid. This is the first time that this has been achieved for an atomistic model. The results demonstrate a strong coupling between orientational ordering and molecular shape, but indicate that the coupling between molecular conformational changes and molecular reorientation is relatively weak. Simulations have also been performed for a hybrid Gay-Berne/Lennard-Jones model resulting in thermodynamically stable nematic and smectic phases. Frank elastic constants have been calculated for the nematic phase formed by the hybrid model through analysis of the fluctuations of the nematic director, giving results comparable with those found experimentally. Work presented in this thesis also describes the parameterization of the torsional potential of a fragment of a dimethyl siloxane polymer chain, disiloxane diol (HOMe2Si)2O, using ab initio quantum mechanical calculations. (author)
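The orientational ordering discussed here is conventionally quantified by the nematic order parameter S, the largest eigenvalue of the Q tensor built from the molecular long axes. A minimal NumPy sketch of that measurement (generic, not the thesis code):

```python
import numpy as np

def nematic_order(axes):
    """Scalar order parameter S: largest eigenvalue of the tensor
    Q = <(3 u u^T - I) / 2> over unit molecular axes u (N x 3 array)."""
    u = np.array(axes, dtype=float)
    u /= np.linalg.norm(u, axis=1, keepdims=True)   # normalize each axis
    Q = 1.5 * (u.T @ u) / len(u) - 0.5 * np.eye(3)
    return float(np.linalg.eigvalsh(Q)[-1])

# Perfect alignment along z gives S = 1; an isotropic (here: mutually
# perpendicular) set of axes gives S = 0.
aligned = np.tile([0.0, 0.0, 1.0], (100, 1))
print(round(nematic_order(aligned), 6))  # 1.0
```

The director itself is the eigenvector belonging to that largest eigenvalue; its fluctuations are what the Frank elastic constant analysis mentioned above is built on.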
Computer simulation of superionic fluorides
Castiglione, M
2000-01-01
In this thesis the nature of ion mobility in cryolite and lead fluoride based compounds is investigated by computer simulation. The phase transition of cryolite is characterised in terms of rotation of AlF6 octahedra, and the conductive properties are shown to result from diffusion of the sodium ions. The two processes appear to be unrelated. Very good agreement with NMR experimental results is found. The Pb2+ ion has a very high polarisability, yet treatment of this property in previous simulations has been problematic. In this thesis a mor... experimentally gives an indication of the correlations between nearby defects is well-reproduced. The most stringent test of simulation model transferability is presented in the studies of lead tin fluoride, in which significant 'covalent' effects are apparent. Other similarly-structured compounds are also investigated, and the reasons behind the adoption of such an unusual layered structure, and the mobility and site occupation of the anions, are quantified.
FPGA-accelerated simulation of computer systems
Angepat, Hari; Chung, Eric S; Hoe, James C; Chung, Eric S
2014-01-01
To date, the most common simulators of computer systems are software-based, running on standard computers. One promising approach to improving simulation performance is to apply hardware, specifically reconfigurable hardware in the form of field-programmable gate arrays (FPGAs). This manuscript describes various approaches to using FPGAs to accelerate software-implemented simulation of computer systems, and selected simulators that incorporate those techniques. More precisely, we describe a simulation architecture taxonomy that incorporates a simulation architecture specifically designed f
Parallel Computing for Brain Simulation.
Pastur-Romay, L A; Porto-Pazos, A B; Cedron, F; Pazos, A
2017-01-01
The human brain is the most complex system in the known universe, and therefore one of its greatest mysteries. It provides human beings with extraordinary abilities. However, it is not yet understood how and why most of these abilities are produced. For decades, researchers have been trying to make computers reproduce these abilities, focusing both on understanding the nervous system and on processing data more efficiently than before. Their aim is to make computers process information similarly to the brain. Important technological developments and vast multidisciplinary projects have made it possible to create the first simulation with a number of neurons similar to that of a human brain. This paper presents an up-to-date review of the main research projects that are trying to simulate and/or emulate the human brain. They employ different types of computational models using parallel computing: digital models, analog models and hybrid models. This review includes the current applications of these works, as well as future trends. It focuses on works that seek advanced progress in neuroscience, and on others which seek new discoveries in computer science (neuromorphic hardware, machine learning techniques). Their most outstanding characteristics are summarized and the latest advances and future plans are presented. In addition, this review points out the importance of considering not only neurons: computational models of the brain should also include glial cells, given the proven importance of astrocytes in information processing.
20170312 - Computer Simulation of Developmental ...
Rationale: Recent progress in systems toxicology and synthetic biology has paved the way to new thinking about in vitro/in silico modeling of developmental processes and toxicities, both for embryological and reproductive impacts. Novel in vitro platforms such as 3D organotypic culture models, engineered microscale tissues and complex microphysiological systems (MPS), together with computational models and computer simulation of tissue dynamics, lend themselves to integrated testing strategies for predictive toxicology. As these emergent methodologies continue to evolve, they must be integrally tied to maternal/fetal physiology and toxicity of the developing individual across early lifestage transitions, from fertilization to birth, through puberty and beyond. Scope: This symposium will focus on how the novel technology platforms can help, now and in the future, with in vitro/in silico modeling of complex biological systems for developmental and reproductive toxicity issues, and with translating systems models into integrative testing strategies. The symposium is based on three main organizing principles: (1) that novel in vitro platforms with human cells configured in nascent tissue architectures with native microphysiological environments yield mechanistic understanding of developmental and reproductive impacts of drug/chemical exposures; (2) that novel in silico platforms with high-throughput screening (HTS) data, biologically-inspired computational models of
Purex optimization by computer simulation
International Nuclear Information System (INIS)
Campbell, T.G.; McKibben, J.M.
1980-08-01
For the past 2 years computer simulation has been used to study the performance of several solvent extraction banks in the Purex facility at the Savannah River Plant in Aiken, South Carolina. Individual process parameters were varied about their normal base case values to determine their individual effects on concentration profiles and end-stream compositions. The data are presented in graphical form to show the extent to which product losses, decontamination factors, solvent extraction bank inventories of fissile materials, and other key properties are affected by process changes. Presented in this way, the data are useful for adapting flowsheet conditions to a particular feed material or product specification, and for evaluating nuclear safety as related to bank inventories
Computer-Graphical Simulation Of Robotic Welding
Fernandez, Ken; Cook, George
1988-01-01
Computer program ROBOSIM, developed to simulate operations of robots, applied to preliminary design of robotic arc-welding operation. Limitations on equipment investigated in advance to prevent expensive mistakes. Computer makes drawing of robotic welder and workpiece on positioning table. Such numerical simulation used to perform rapid, safe experiments in computer-aided design or manufacturing.
QCE : A Simulator for Quantum Computer Hardware
Michielsen, Kristel; Raedt, Hans De
2003-01-01
The Quantum Computer Emulator (QCE) described in this paper consists of a simulator of a generic, general purpose quantum computer and a graphical user interface. The latter is used to control the simulator, to define the hardware of the quantum computer and to debug and execute quantum algorithms.
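A state-vector simulator of the kind QCE implements ultimately boils down to applying 2x2 gate matrices to one tensor index of a 2^n amplitude vector. A minimal sketch (a generic illustration, not QCE's actual code; big-endian qubit ordering assumed):

```python
import numpy as np

def apply_gate(state, gate, target, n):
    """Apply a single-qubit gate (2x2) to qubit `target` of an n-qubit
    state vector of length 2**n."""
    psi = state.reshape([2] * n)            # one axis per qubit
    psi = np.moveaxis(psi, target, 0)       # bring target qubit to front
    psi = np.tensordot(gate, psi, axes=([1], [0]))
    psi = np.moveaxis(psi, 0, target)       # restore qubit ordering
    return psi.reshape(2 ** n)

H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)

state = np.zeros(4); state[0] = 1.0         # |00>
state = apply_gate(state, H, 0, 2)          # (|00> + |10>)/sqrt(2)
print(np.round(state, 3))
```

Memory is the binding constraint of this approach: the amplitude vector doubles with every added qubit, which is why large instances need the massively parallel machines described in the related records above.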
Computer simulation of language competition by physicists
Schulze, Christian; Stauffer, Dietrich
2006-01-01
Computer simulation of languages is an old subject, but since the paper of Abrams and Strogatz (2003) several physics groups have independently taken up this field. We briefly review their work and give more details on our own simulations.
Framework for utilizing computational devices within simulation
Directory of Open Access Journals (Sweden)
Miroslav Mintál
2013-12-01
Nowadays several frameworks exist to utilize the computational power of graphics cards and other computational devices such as FPGAs, ARM and multi-core processors. The best known are either low-level and need a lot of controlling code, or are bound to specific graphics cards. Furthermore, more specialized frameworks exist, mainly aimed at the mathematical field. The framework described here is adapted for use in multi-agent simulations. It provides an option to accelerate computations when preparing a simulation and, mainly, to accelerate the computation of the simulation itself.
Accounting Principles are Simulated on Quantum Computers
Diep, Do Ngoc; Giang, Do Hoang
2005-01-01
The paper is devoted to a new idea: the simulation of accounting by quantum computing. We express the actual accounting principles in purely mathematical language and then simulate them on quantum computers. We show that all arbitrary accounting actions are exhausted by the described basic actions. The main problems of accounting are reduced to a system of linear equations in the economic model of Leontief. In this simulation we use our constructed quantum Gauß-Jor...
Efficient SDH Computation In Molecular Simulations Data.
Tu, Yi-Cheng; Chen, Shaoping; Pandit, Sagar; Kumar, Anand; Grupcev, Vladimir
2012-10-01
Analysis of large particle or molecular simulation data is an integral part of basic-science research. It often involves computing functions such as point-to-point interactions of particles. The spatial distance histogram (SDH) is one such vital computation in scientific discovery. SDH is frequently used to compute the radial distribution function (RDF), and it takes quadratic time to compute using the naive approach. Naive SDH computation is even more expensive as it is computed continuously over a certain period of time to analyze simulation systems. Tree-based SDH computation is a popular approach. In this paper we look at different tree-based SDH computation techniques and briefly discuss their performance. We present different strategies to improve the performance of these techniques. Specifically, we study density map (DM) based SDH computation techniques. A DM is essentially a grid dividing the simulated space into cells (3D cubes) of equal size (volume), which can be easily implemented by augmenting a quad-tree (or oct-tree) index. DMs are used in various configurations to compute SDH continuously over snapshots of the simulation system. The performance improvements using some of these configurations are presented in this paper. We also present the effect of utilizing the computation power of graphics processing units (GPUs) in computing SDH.
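The baseline that the tree and density-map methods improve on is the naive quadratic SDH, which can be stated in a few lines. This is a generic sketch with hypothetical bucket settings, not code from the paper:

```python
import numpy as np

def sdh(points, bucket_width, num_buckets):
    """Naive O(N^2) spatial distance histogram: count every unordered
    pairwise distance into equal-width buckets. (Distances beyond the
    last bucket edge are dropped, matching np.histogram's range rule.)
    The tree/density-map methods avoid enumerating every pair."""
    pts = np.asarray(points, dtype=float)
    diff = pts[:, None, :] - pts[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    iu = np.triu_indices(len(pts), k=1)       # each unordered pair once
    hist, _ = np.histogram(dist[iu], bins=num_buckets,
                           range=(0.0, bucket_width * num_buckets))
    return hist

pts = [[0, 0, 0], [1, 0, 0], [0, 2, 0]]
# Pairwise distances: 1, 2, sqrt(5) ≈ 2.236.
print(sdh(pts, 1.0, 3))
```

Even vectorized, both time and memory here grow as N², which is why repeated SDH computation over long trajectories motivates the tree-based and GPU approaches studied in the paper.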
Computer Simulations of Molecular Propellers
National Research Council Canada - National Science Library
Vacek, Jaroslav
1999-01-01
...). The new program will be used to explore computationally a variety of possible structures for the synthesis of new materials capable of acquiring significant internal mechanical angular momentum along...
Computer simulation of nonequilibrium processes
International Nuclear Information System (INIS)
Hoover, W.G.; Moran, B.; Holian, B.L.; Posch, H.A.; Bestiale, S.
1987-01-01
Recent atomistic simulations of irreversible macroscopic hydrodynamic flows are illustrated. An extension of Nose's reversible atomistic mechanics makes it possible to simulate such non-equilibrium systems with completely reversible equations of motion. The new techniques show that macroscopic irreversibility is a natural, inevitable consequence of time-reversible, Lyapunov-unstable microscopic equations of motion.
Micro-computer simulation software: A review
Directory of Open Access Journals (Sweden)
P.S. Kruger
2003-12-01
Simulation modelling has proved to be one of the most powerful tools available to the Operations Research analyst. The development of micro-computer technology has reached a state of maturity where the micro-computer can provide the necessary computing power, and consequently various powerful and inexpensive simulation languages for micro-computers have become available. This paper attempts to provide an introduction to the general philosophy and characteristics of some of the available micro-computer simulation languages. The emphasis is on the characteristics of the specific micro-computer implementation rather than on a comparison of the modelling features of the various languages; such comparisons may be found elsewhere.
Computer simulation in physics and engineering
Steinhauser, Martin Oliver
2013-01-01
This work is a needed reference for widely used techniques and methods of computer simulation in physics and other disciplines such as materials science. It conveys both the theoretical foundations of computer simulation and its applications and "tricks of the trade", which are often scattered across various papers. Thus it fills a gap for every scientist who needs computer simulations for the task at hand. In addition to serving as a reference, it includes case studies and exercises for use as course reading.
Computer simulation of bounded plasmas
International Nuclear Information System (INIS)
Lawson, W.S.
1987-01-01
The problems of simulating a one-dimensional bounded plasma system using particles in a gridded space are systematically explored and solutions to them are given. Such problems include the injection of particles at the boundaries, the solution of Poisson's equation, and the inclusion of an external circuit between the confining boundaries. A recently discovered artificial cooling effect is explained as being a side-effect of quiet injection, and its potential for causing serious but subtle errors in bounded simulation is noted. The methods described in the first part of the thesis are then applied to the simulation of an extension of the Pierce diode problem, specifically a Pierce diode modified by an external circuit between the electrodes. The results of these simulations agree to high accuracy with theory when a theory exists, and also show some interesting chaotic behavior in certain parameter regimes. The chaotic behavior is described in detail.
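One ingredient listed above, the solution of Poisson's equation on a one-dimensional grid between fixed-potential boundaries, can be sketched with a standard central-difference discretization and the O(n) Thomas algorithm. This is a generic illustration under those assumptions, not the thesis's code:

```python
def solve_poisson_1d(rho, dx, phi_left, phi_right, eps0=8.854e-12):
    """Solve phi'' = -rho/eps0 by central differences on a uniform grid,
    with fixed (Dirichlet) potentials at the two confining boundaries.
    The tridiagonal system phi[i-1] - 2*phi[i] + phi[i+1] = -rho[i]/eps0 * dx**2
    is solved with the Thomas algorithm (forward sweep, back substitution)."""
    n = len(rho)
    rhs = [-r / eps0 * dx * dx for r in rho]
    rhs[0] -= phi_left            # fold boundary potentials into the RHS
    rhs[-1] -= phi_right
    c = [0.0] * n                 # modified upper-diagonal coefficients
    d = [0.0] * n                 # modified right-hand side
    c[0] = 1.0 / -2.0
    d[0] = rhs[0] / -2.0
    for i in range(1, n):
        m = -2.0 - c[i - 1]
        c[i] = 1.0 / m
        d[i] = (rhs[i] - d[i - 1]) / m
    phi = [0.0] * n
    phi[-1] = d[-1]
    for i in range(n - 2, -1, -1):
        phi[i] = d[i] - c[i] * phi[i + 1]
    return phi
```

With zero charge density the solver reproduces the expected linear potential between the two electrodes.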
Computer Based Modelling and Simulation
Indian Academy of Sciences (India)
leaving students. It is a probabilistic model. In the next part of this article, two more models, an 'input/output model' used for production systems or economic studies and a 'discrete event simulation model', are introduced. Aircraft Performance Model.
Evaluation of Visual Computer Simulator for Computer Architecture Education
Imai, Yoshiro; Imai, Masatoshi; Moritoh, Yoshio
2013-01-01
This paper presents a trial evaluation of a visual computer simulator during 2009-2011, which was developed to play the roles of both an instruction facility and a learning tool simultaneously. It illustrates an example of computer architecture education for university students and the usage of an e-Learning tool for assembly programming in order to…
Augmented Reality Simulations on Handheld Computers
Squire, Kurt; Klopfer, Eric
2007-01-01
Advancements in handheld computing, particularly its portability, social interactivity, context sensitivity, connectivity, and individuality, open new opportunities for immersive learning environments. This article articulates the pedagogical potential of augmented reality simulations in environmental engineering education by immersing students in…
Computer Simulation in Information and Communication Engineering
Anton Topurov
2005-01-01
CSICE'05, Sofia, Bulgaria, 20th-22nd October 2005. On behalf of the International Scientific Committee, we would like to invite you all to Sofia, the capital city of Bulgaria, for the International Conference on Computer Simulation in Information and Communication Engineering CSICE'05. The Conference is aimed at facilitating the exchange of experience in the field of computer simulation gained not only in traditional fields (Communications, Electronics, Physics...) but also in the areas of biomedical engineering, environment, industrial design, etc. The objective of the Conference is to bring together lecturers, researchers and practitioners from different countries working in the fields of computer simulation in information engineering, in order to exchange information and make new contributions to this important field of engineering design and education. The Conference will bring you the latest ideas and developments in the tools for computer simulation directly from their inventors. Contribution describ...
Computer Simulations, Disclosure and Duty of Care
Directory of Open Access Journals (Sweden)
John Barlow
2006-05-01
Computer simulations provide cost-effective methods for manipulating and modeling 'reality'. However, they are not real. They are imitations of a system or event, real or fabricated, and as such mimic, duplicate or represent that system or event. The degree to which a computer simulation aligns with and reproduces the 'reality' of the system or event it attempts to mimic or duplicate depends upon many factors, including the efficiency of the simulation algorithm, the processing power of the computer hardware used to run the simulation model, and the expertise, assumptions and prejudices of those concerned with designing, implementing and interpreting the simulation output. Computer simulations in particular are increasingly replacing physical experimentation in many disciplines, and as a consequence are used to underpin quite significant decision-making which may impact on 'innocent' third parties. In this context, this paper examines two interrelated issues: firstly, how much and what kind of information should a simulation builder be required to disclose to potential users of the simulation? Secondly, what are the implications for a decision-maker who acts on the basis of their interpretation of a simulation output without any reference to its veracity, which may in turn compromise the safety of other parties?
Computer Based Modelling and Simulation
Indian Academy of Sciences (India)
Most systems involve parameters and variables, which are random variables due to uncertainties. Probabilistic methods are powerful in modelling such systems. In this second part, we describe probabilistic models and Monte Carlo simulation along with 'classical' matrix methods and differential equations as most real ...
Computer Systems/Database Simulation.
1978-10-15
simulator, he runs the risk of running a simulation he does not understand. Technical documentation of CASE internals is virtually non-existent; the...few users outside the domain of the team of researchers who worked to make the IPSS design methodology a reality. It was demonstrated effectively in
Discrete Event Simulation Computers can be used to simulate the ...
Indian Academy of Sciences (India)
IAS Admin
Computers can be used to simulate the operation of complex systems and thereby study their performance. This article introduces you to the technique of discrete event simulation through a simple example.
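The technique this article introduces can be condensed into a few lines: a discrete event simulation keeps a time-ordered queue of pending events and processes one event at a time, scheduling future events as it goes. A minimal single-server-queue sketch (a generic illustration, not taken from the article):

```python
import heapq
import random

def simulate_mm1(arrival_rate, service_rate, num_customers, seed=1):
    """Minimal discrete event simulation of a single-server (M/M/1) queue.
    A min-heap orders pending events by time; each iteration pops the
    earliest event and may schedule new ones. Returns the mean sojourn
    time (waiting + service) over the first num_customers served."""
    random.seed(seed)
    events = [(random.expovariate(arrival_rate), "arrival")]  # (time, kind)
    arrivals = []          # arrival times, in FIFO service order
    waiting = 0            # customers queued behind the one in service
    busy = False
    served = 0
    total_sojourn = 0.0
    while served < num_customers:
        time, kind = heapq.heappop(events)
        if kind == "arrival":
            arrivals.append(time)
            if busy:
                waiting += 1
            else:
                busy = True
                heapq.heappush(events, (time + random.expovariate(service_rate), "departure"))
            # schedule the next arrival
            heapq.heappush(events, (time + random.expovariate(arrival_rate), "arrival"))
        else:  # departure of the customer who arrived 'served'-th (FIFO)
            total_sojourn += time - arrivals[served]
            served += 1
            if waiting > 0:
                waiting -= 1
                heapq.heappush(events, (time + random.expovariate(service_rate), "departure"))
            else:
                busy = False
    return total_sojourn / num_customers

mean_sojourn = simulate_mm1(arrival_rate=1.0, service_rate=2.0, num_customers=500)
```

The heap-of-events pattern is the core of any discrete event simulator; everything else is bookkeeping for the particular system being modelled.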
REACTOR: a computer simulation for schools
International Nuclear Information System (INIS)
Squires, D.
1985-01-01
The paper concerns computer simulation of the operation of a nuclear reactor, for use in schools. The project was commissioned by UKAEA, and carried out by the Computers in the Curriculum Project, Chelsea College. The program, for an advanced gas cooled reactor, is briefly described. (U.K.)
Simulations of Probabilities for Quantum Computing
Zak, M.
1996-01-01
It has been demonstrated that classical probabilities, and in particular a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities, and their link to quantum computations, are discussed.
Salesperson Ethics: An Interactive Computer Simulation
Castleberry, Stephen
2014-01-01
A new interactive computer simulation designed to teach sales ethics is described. Simulation learner objectives include gaining a better understanding of legal issues in selling; realizing that ethical dilemmas do arise in selling; realizing the need to be honest when selling; seeing that there are conflicting demands from a salesperson's…
Learning and instruction with computer simulations
de Jong, Anthonius J.M.
1991-01-01
The present volume presents the results of an inventory of elements of such a computer learning environment. This inventory was conducted within a DELTA project called SIMULATE. In the project a learning environment that provides intelligent support to learners and that has a simulation as its
[Animal experimentation, computer simulation and surgical research].
Carpentier, Alain
2009-11-01
We live in a digital world. In medicine, computers are providing new tools for data collection, imaging, and treatment. During research and development of complex technologies and devices such as artificial hearts, computer simulation can provide more reliable information than experimentation on large animals. In these specific settings, animal experimentation should serve more to validate computer models of complex devices than to demonstrate their reliability.
Computer simulations applied in materials
Energy Technology Data Exchange (ETDEWEB)
NONE
2003-07-01
This workshop takes stock of the simulation methods applied to nuclear materials and discusses the conditions in which these methods can predict physical results when no experimental data are available. The main topic concerns the radiation effects in oxides and includes also the behaviour of fission products in ceramics, the diffusion and segregation phenomena and the thermodynamical properties under irradiation. This document brings together a report of the previous 2002 workshop and the transparencies of 12 presentations among the 15 given at the workshop: accommodation of uranium and plutonium in pyrochlores; radiation effects in La{sub 2}Zr{sub 2}O{sub 7} pyrochlores; first principle calculations of defects formation energies in the Y{sub 2}(Ti,Sn,Zr){sub 2}O{sub 7} pyrochlore system; an approximate approach to predicting radiation tolerant materials; molecular dynamics study of the structural effects of displacement cascades in UO{sub 2}; composition defect maps for A{sup 3+}B{sup 3+}O{sub 3} perovskites; NMR characterization of radiation damaged materials: using simulation to interpret the data; local structure in damaged zircon: a first principle study; simulation studies on SiC; insertion and diffusion of He in 3C-SiC; a review of helium in silica; self-trapped holes in amorphous silicon dioxide: their short-range structure revealed from electron spin resonance and optical measurements and opportunities for inferring intermediate range structure by theoretical modelling. (J.S.)
Computer simulation of gear tooth manufacturing processes
Mavriplis, Dimitri; Huston, Ronald L.
1990-01-01
The use of computer graphics to simulate gear tooth manufacturing procedures is discussed. An analytical basis for the simulation is established for spur gears. The simulation itself, however, is developed not only for spur gears, but for straight bevel gears as well. The applications of the developed procedure extend from the development of finite element models of heretofore intractable geometrical forms, to exploring the fabrication of nonstandard tooth forms.
Computer graphics in heat-transfer simulations
International Nuclear Information System (INIS)
Hamlin, G.A. Jr.
1980-01-01
Computer graphics can be very useful in the setup of heat transfer simulations and in the display of the results of such simulations. The potential use of recently available low-cost graphics devices in the setup of such simulations has not been fully exploited. Several types of graphics devices and their potential usefulness are discussed, and some configurations of graphics equipment are presented in the low-, medium-, and high-price ranges.
Cluster computing software for GATE simulations
International Nuclear Information System (INIS)
Beenhouwer, Jan de; Staelens, Steven; Kruecker, Dirk; Ferrer, Ludovic; D'Asseler, Yves; Lemahieu, Ignace; Rannou, Fernando R.
2007-01-01
Geometry and tracking (GEANT4) is a Monte Carlo package designed for high energy physics experiments. It is used as the basis layer for Monte Carlo simulations of nuclear medicine acquisition systems in GEANT4 Application for Tomographic Emission (GATE). GATE allows the user to realistically model experiments using accurate physics models and time synchronization for detector movement through a script language contained in a macro file. The downside of this high accuracy is long computation time. This paper describes a platform-independent computing approach for running GATE simulations on a cluster of computers in order to reduce the overall simulation time. Our software automatically creates fully resolved, nonparametrized macros accompanied by an on-the-fly generated, cluster-specific submit file used to launch the simulations. The scalability of GATE simulations on a cluster is investigated for two imaging modalities, positron emission tomography (PET) and single photon emission computed tomography (SPECT). Due to a higher sensitivity, PET simulations are characterized by relatively high data output rates that create rather large output files. SPECT simulations, on the other hand, have lower data output rates but require a long collimator setup time. Both of these characteristics hamper scalability as a function of the number of CPUs. The scalability of PET simulations is improved here by the development of a fast output merger. The scalability of SPECT simulations is improved by greatly reducing the collimator setup time. Accordingly, these two new developments result in higher scalability for both PET and SPECT simulations and reduce the computation time to more practical values.
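The macro-splitting approach described above can be sketched as follows. This is a simplified illustration, not the authors' software: the `{start}`/`{stop}` placeholders and the plain shell launch script are assumptions of this sketch (a real cluster would emit a batch submit file), although `/gate/application/setTimeStart` and `setTimeStop` are standard GATE macro commands.

```python
import os
import tempfile

def split_gate_macro(template, total_time, n_jobs, out_dir):
    """Write n_jobs fully resolved GATE macros, each covering an equal slice
    of the virtual acquisition time, plus a shell script that launches one
    Gate process per macro. Merging the per-job outputs is a separate step."""
    os.makedirs(out_dir, exist_ok=True)
    slice_len = total_time / n_jobs
    launch_lines = ["#!/bin/sh"]
    for i in range(n_jobs):
        macro = template.format(start=i * slice_len, stop=(i + 1) * slice_len)
        path = os.path.join(out_dir, f"job{i}.mac")
        with open(path, "w") as f:
            f.write(macro)
        log = os.path.join(out_dir, f"job{i}.log")
        launch_lines.append(f"Gate {path} > {log} 2>&1 &")
    submit = os.path.join(out_dir, "submit.sh")
    with open(submit, "w") as f:
        f.write("\n".join(launch_lines) + "\n")
    return submit

# usage: slice a 60 s virtual acquisition across 4 jobs
out = tempfile.mkdtemp()
tpl = ("/gate/application/setTimeStart {start} s\n"
       "/gate/application/setTimeStop {stop} s\n")
submit = split_gate_macro(tpl, total_time=60.0, n_jobs=4, out_dir=out)
```

Splitting along virtual time (rather than along events) is what lets each job run the same deterministic macro independently, which is the property the paper exploits.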
Atomistic computer simulations a practical guide
Brazdova, Veronika
2013-01-01
Many books explain the theory of atomistic computer simulations; this book teaches you how to run them. This introductory "how to" title enables readers to understand, plan, run, and analyze their own independent atomistic simulations, and to decide which method to use and which questions to ask in their research project. It is written in clear and precise language, focusing on a thorough understanding of the concepts behind the equations and how these are used in the simulations. As a result, readers will learn how to design the computational model and which parameters o
Polymer Composites Corrosive Degradation: A Computational Simulation
Chamis, Christos C.; Minnetyan, Levon
2007-01-01
A computational simulation of polymer composite corrosive durability is presented. The corrosive environment is assumed to manage the polymer composite degradation on a ply-by-ply basis. The degradation is correlated with a measured pH factor and is represented by voids, temperature and moisture, which vary parabolically for voids and linearly for temperature and moisture through the laminate thickness. The simulation is performed by a computational composite mechanics computer code which includes micro, macro, combined stress failure and laminate theories. This accounts for starting the simulation from constitutive material properties and up to the laminate scale, which exposes the laminate to the corrosive environment. Results obtained for one laminate indicate that the ply-by-ply degradation degrades the laminate down to the last ply or the last several plies. Results also demonstrate that the simulation is applicable to other polymer composite systems as well.
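The assumed through-thickness variation (parabolic for voids, linear for temperature and moisture) can be sketched per ply. The exact profile shapes and parameter names below are assumptions of this sketch, not the paper's model:

```python
def ply_degradation_profiles(n_plies, void_surface, temp_surface, temp_interior,
                             moist_surface, moist_interior):
    """Per-ply (void, temperature, moisture) values through the thickness.
    z = 0 is the exposed surface, z = 1 the far face: voids are assumed to
    fall off parabolically from the exposed surface, while temperature and
    moisture interpolate linearly, as the abstract describes."""
    profiles = []
    for i in range(n_plies):
        z = (i + 0.5) / n_plies               # normalized mid-plane of ply i
        void = void_surface * (1.0 - z) ** 2  # parabolic decay (assumed form)
        temp = temp_surface + (temp_interior - temp_surface) * z
        moist = moist_surface + (moist_interior - moist_surface) * z
        profiles.append((void, temp, moist))
    return profiles

# 8-ply laminate, exposed face hotter, wetter and more voided than the far face
plies = ply_degradation_profiles(8, void_surface=0.05,
                                 temp_surface=350.0, temp_interior=300.0,
                                 moist_surface=0.02, moist_interior=0.0)
```

Feeding such per-ply environments into a ply-level failure criterion is what produces the ply-by-ply degradation the abstract reports.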
Creating science simulations through Computational Thinking Patterns
Basawapatna, Ashok Ram
Computational thinking aims to outline fundamental skills from computer science that everyone should learn. As currently defined, with help from the National Science Foundation (NSF), these skills include problem formulation, logically organizing data, automating solutions through algorithmic thinking, and representing data through abstraction. One aim of the NSF is to integrate these and other computational thinking concepts into the classroom. End-user programming tools offer a unique opportunity to accomplish this goal. An end-user programming tool that allows students with little or no prior experience the ability to create simulations based on phenomena they see in-class could be a first step towards meeting most, if not all, of the above computational thinking goals. This thesis describes the creation, implementation and initial testing of a programming tool, called the Simulation Creation Toolkit, with which users apply high-level agent interactions called Computational Thinking Patterns (CTPs) to create simulations. Employing Computational Thinking Patterns obviates lower behavior-level programming and allows users to directly create agent interactions in a simulation by making an analogy with real world phenomena they are trying to represent. Data collected from 21 sixth grade students with no prior programming experience and 45 seventh grade students with minimal programming experience indicates that this is an effective first step towards enabling students to create simulations in the classroom environment. Furthermore, an analogical reasoning study that looked at how users might apply patterns to create simulations from high-level descriptions with little guidance shows promising results. These initial results indicate that the high level strategy employed by the Simulation Creation Toolkit is a promising strategy towards incorporating Computational Thinking concepts in the classroom environment.
Computer Code for Nanostructure Simulation
Filikhin, Igor; Vlahovic, Branislav
2009-01-01
Due to their small size, nanostructures can have stress and thermal gradients that are larger than any macroscopic analogue. These gradients can lead to specific regions that are susceptible to failure via processes such as plastic deformation by dislocation emission, chemical debonding, and interfacial alloying. A program has been developed that rigorously simulates and predicts optoelectronic properties of nanostructures of virtually any geometrical complexity and material composition. It can be used in simulations of energy level structure, wave functions, density of states of spatially configured phonon-coupled electrons, excitons in quantum dots, quantum rings, quantum ring complexes, and more. The code can be used to calculate stress distributions and thermal transport properties for a variety of nanostructures and interfaces, transport and scattering at nanoscale interfaces and surfaces under various stress states, and alloy compositional gradients. The code allows users to perform modeling of charge transport processes through quantum-dot (QD) arrays as functions of inter-dot distance, array order versus disorder, QD orientation, shape, size, and chemical composition for applications in photovoltaics and physical properties of QD-based biochemical sensors. The code can be used to study the hot exciton formation/relaxation dynamics in arrays of QDs of different shapes and sizes at different temperatures. It also can be used to understand the relation among the deposition parameters and inherent stresses, strain deformation, heat flow, and failure of nanostructures.
Computer simulation of bubble formation
International Nuclear Information System (INIS)
Insepov, Z.; Bazhirov, T.; Norman, G.; Stegailov, V.
2007-01-01
Properties of liquid metals (Li, Pb, Na) containing nano-scale cavities were studied by atomistic Molecular Dynamics (MD). Two atomistic models of cavity simulation were developed that cover a wide area in the phase diagram with negative pressure. In the first model, the thermodynamics of cavity formation, stability and the dynamics of cavity evolution in bulk liquid metals have been studied. Radial densities, pressures, surface tensions, and work functions of nano-scale cavities of various radii were calculated for liquid Li, Na, and Pb at various temperatures and densities, and at small negative pressures near the liquid-gas spinodal; the work functions for cavity formation in liquid Li were compared with the available experimental data. The cavitation rate can further be obtained by using classical nucleation theory (CNT). The second model is based on the stability study and on the kinetics of cavitation of the stretched liquid metals. An MD method was used to simulate cavitation in metastable Pb and Li melts and determine the stability limits. States at temperatures below critical (T < 0.5 Tc) and large negative pressures were considered. The kinetic boundary of liquid phase stability was shown to be different from the spinodal. The kinetics and dynamics of cavitation were studied. The pressure dependences of cavitation frequencies were obtained for several temperatures. The results of MD calculations were compared with estimates based on classical nucleation theory. (authors)
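The classical nucleation theory estimate the authors compare against can be sketched directly from the standard CNT formulas: the work to form a critical cavity under tension |Δp| is ΔG* = 16πσ³/(3Δp²), and the rate per unit volume is J = J₀·exp(−ΔG*/kT). The prefactor value below is an assumed order of magnitude, not a number from the paper:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def cnt_cavitation_rate(surface_tension, tension, temperature, prefactor=1e39):
    """Classical nucleation theory estimate of the homogeneous cavitation
    rate in a liquid stretched to negative pressure. surface_tension in N/m,
    tension |dp| in Pa, temperature in K; prefactor J0 (m^-3 s^-1) is an
    assumed order-of-magnitude value."""
    barrier = 16.0 * math.pi * surface_tension ** 3 / (3.0 * tension ** 2)
    return prefactor * math.exp(-barrier / (K_B * temperature))
```

The strong (exponential) dependence on tension is why the MD-derived kinetic stability boundary can sit well away from the spinodal.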
Computer simulation of thermal plant operations
O'Kelly, Peter
2012-01-01
This book describes thermal plant simulation, that is, dynamic simulation of plants which produce, exchange and otherwise utilize heat as their working medium. Directed at chemical, mechanical and control engineers involved with operations, control and optimization and operator training, the book gives the mathematical formulation and use of simulation models of the equipment and systems typically found in these industries. The author has adopted a fundamental approach to the subject. The initial chapters provide an overview of simulation concepts and describe a suitable computer environment.
Computer Simulations of Lipid Nanoparticles
Directory of Open Access Journals (Sweden)
Xavier F. Fernandez-Luengo
2017-12-01
Lipid nanoparticles (LNP) are promising soft matter nanomaterials for drug delivery applications. Despite their interest, little is known about the supramolecular organization of the components of these self-assembled nanoparticles. Here, we present a molecular dynamics simulation study, employing the Martini coarse-grain forcefield, of self-assembled LNPs made of tripalmitin lipid in water. We also study the adsorption of Tween 20 surfactant as a protective layer on top of the LNP. We show that, at 310 K (the temperature of interest in biological applications), the structure of the lipid nanoparticles is similar to that of a liquid droplet, in which the lipids show no nanostructuration and have high mobility. We show that, for large enough nanoparticles, the hydrophilic headgroups develop an interior surface in the NP core that stores liquid water. The surfactant is shown to organize in an inhomogeneous way at the LNP surface, with patches of high surfactant concentration and surface patches not covered by surfactant.
Structural Composites Corrosive Management by Computational Simulation
Chamis, Christos C.; Minnetyan, Levon
2006-01-01
A simulation of corrosive management of polymer composite durability is presented. The corrosive environment is assumed to manage the polymer composite degradation on a ply-by-ply basis. The degradation is correlated with a measured pH factor and is represented by voids, temperature, and moisture, which vary parabolically for voids and linearly for temperature and moisture through the laminate thickness. The simulation is performed by a computational composite mechanics computer code which includes micro, macro, combined stress failure, and laminate theories. This accounts for starting the simulation from constitutive material properties and up to the laminate scale, which exposes the laminate to the corrosive environment. Results obtained for one laminate indicate that the ply-by-ply managed degradation degrades the laminate down to the last ply or the last several plies. Results also demonstrate that the simulation is applicable to other polymer composite systems as well.
Time reversibility, computer simulation, and chaos
Hoover, William Graham
1999-01-01
A small army of physicists, chemists, mathematicians, and engineers has joined forces to attack a classic problem, the "reversibility paradox", with modern tools. This book describes their work from the perspective of computer simulation, emphasizing the author's approach to the problem of understanding the compatibility, and even inevitability, of the irreversible second law of thermodynamics with an underlying time-reversible mechanics. Computer simulation has made it possible to probe reversibility from a variety of directions and "chaos theory" or "nonlinear dynamics" has supplied a useful
Perspective: Computer simulations of long time dynamics
Energy Technology Data Exchange (ETDEWEB)
Elber, Ron [Department of Chemistry, The Institute for Computational Engineering and Sciences, University of Texas at Austin, Austin, Texas 78712 (United States)
2016-02-14
Atomically detailed computer simulations of complex molecular events have attracted the imagination of many researchers in the field as providing comprehensive information on chemical, biological, and physical processes. However, one of the greatest limitations of these simulations is that of time scales. The physical time scales accessible to straightforward simulations are too short to address many interesting and important molecular events. In the last decade significant advances were made in different directions (theory, software, and hardware) that significantly expand the capabilities and accuracies of these techniques. This perspective describes and critically examines some of these advances.
Uncertainty and error in computational simulations
Energy Technology Data Exchange (ETDEWEB)
Oberkampf, W.L.; Diegert, K.V.; Alvin, K.F.; Rutherford, B.M.
1997-10-01
The present paper addresses the question: "What are the general classes of uncertainty and error sources in complex, computational simulations?" This is the first step of a two-step process to develop a general methodology for quantitatively estimating the global modeling and simulation uncertainty in computational modeling and simulation. The second step is to develop a general mathematical procedure for representing, combining and propagating all of the individual sources through the simulation. The authors develop a comprehensive view of the general phases of modeling and simulation. The phases proposed are: conceptual modeling of the physical system, mathematical modeling of the system, discretization of the mathematical model, computer programming of the discrete model, numerical solution of the model, and interpretation of the results. This new view is built upon combining phases recognized in the disciplines of operations research and numerical solution methods for partial differential equations. The characteristics and activities of each of these phases are discussed in general, but examples are given for the fields of computational fluid dynamics and heat transfer. They argue that a clear distinction should be made between uncertainty and error that can arise in each of these phases. The present definitions for uncertainty and error are inadequate and, therefore, they propose comprehensive definitions for these terms. Specific classes of uncertainty and error sources are then defined that can occur in each phase of modeling and simulation. The numerical sources of error considered apply regardless of whether the discretization procedure is based on finite elements, finite volumes, or finite differences. To better explain the broad types of sources of uncertainty and error, and the utility of their categorization, they discuss a coupled-physics example simulation.
Brady, Corey; Orton, Kai; Weintrop, David; Anton, Gabriella; Rodriguez, Sebastian; Wilensky, Uri
2017-01-01
Computer science (CS) is becoming an increasingly diverse domain. This paper reports on an initiative designed to introduce underrepresented populations to computing using an eclectic, multifaceted approach. As part of a yearlong computing course, students engage in Maker activities, participatory simulations, and computing projects that…
Discrete Event Simulation Computers can be used to simulate the ...
Indian Academy of Sciences (India)
IAS Admin
systems and thereby study their performance. This article introduces you to the technique of discrete event simulation through a simple example. Computers are playing an increasingly integral part in our daily lives. Communication, entertainment, finance, education, governance, health care … the list of areas ...
Progress in Computational Simulation of Earthquakes
Donnellan, Andrea; Parker, Jay; Lyzenga, Gregory; Judd, Michele; Li, P. Peggy; Norton, Charles; Tisdale, Edwin; Granat, Robert
2006-01-01
GeoFEST(P) is a computer program written for use in the QuakeSim project, which is devoted to development and improvement of means of computational simulation of earthquakes. GeoFEST(P) models interacting earthquake fault systems from the fault-nucleation to the tectonic scale. The development of GeoFEST(P) has involved coupling of two programs: GeoFEST and the Pyramid Adaptive Mesh Refinement Library. GeoFEST is a message-passing-interface-parallel code that utilizes a finite-element technique to simulate evolution of stress, fault slip, and plastic/elastic deformation in realistic materials like those of faulted regions of the crust of the Earth. The products of such simulations are synthetic observable time-dependent surface deformations on time scales from days to decades. Pyramid Adaptive Mesh Refinement Library is a software library that facilitates the generation of computational meshes for solving physical problems. In an application of GeoFEST(P), a computational grid can be dynamically adapted as stress grows on a fault. Simulations on workstations using a few tens of thousands of stress and displacement finite elements can now be expanded to multiple millions of elements with greater than 98-percent scaled efficiency on many hundreds of parallel processors (see figure).
Designing Online Scaffolds for Interactive Computer Simulation
Chen, Ching-Huei; Wu, I-Chia; Jen, Fen-Lan
2013-01-01
The purpose of this study was to examine the effectiveness of online scaffolds in computer simulation to facilitate students' science learning. We first introduced online scaffolds to assist and model students' science learning and to demonstrate how a system embedded with online scaffolds can be designed and implemented to help high school…
Numerical Implementation and Computer Simulation of Tracer ...
African Journals Online (AJOL)
Numerical Implementation and Computer Simulation of Tracer Experiments in a Physical Aquifer Model. ... African Research Review ... A sensitivity analysis showed that the time required for complete source depletion, was most dependent on the source definition and the hydraulic conductivity K of the porous medium.
Computer simulations of phospholipid - membrane thermodynamic fluctuations
DEFF Research Database (Denmark)
Pedersen, U.R.; Peters, Günther H.j.; Schröder, T.B.
2008-01-01
This paper reports all-atom computer simulations of five phospholipid membranes, DMPC, DPPC, DMPG, DMPS, and DMPSH, with a focus on the thermal equilibrium fluctuations of volume, energy, area, thickness, and order parameter. For the slow fluctuations at constant temperature and pressure (defined...
A parallel computational model for GATE simulations.
Rannou, F R; Vega-Acevedo, N; El Bitar, Z
2013-12-01
GATE/Geant4 Monte Carlo simulations are computationally demanding applications, requiring thousands of processor hours to produce realistic results. The classical strategy of distributing the simulation of individual events does not apply efficiently for Positron Emission Tomography (PET) experiments, because it requires a centralized coincidence processing and large communication overheads. We propose a parallel computational model for GATE that handles event generation and coincidence processing in a simple and efficient way by decentralizing event generation and processing but maintaining a centralized event and time coordinator. The model is implemented with the inclusion of a new set of factory classes that can run the same executable in sequential or parallel mode. A Mann-Whitney test shows that the output produced by this parallel model in terms of number of tallies is equivalent (but not equal) to its sequential counterpart. Computational performance evaluation shows that the software is scalable and well balanced. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
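The equivalence check mentioned above relies on the Mann-Whitney U test. A pure-Python sketch of the statistic with the normal approximation (no tie correction; this is our illustration, not the authors' code) might look like:

```python
import math

def mann_whitney_u(x, y):
    """Two-sided Mann-Whitney U test via the normal approximation.
    Assumes all pooled values are distinct (no tie correction) -- the
    simplest form of the test used to compare two samples of tallies."""
    n1, n2 = len(x), len(y)
    pooled = sorted(x + y)
    # Rank sum of sample x (1-based ranks in the pooled ordering).
    r1 = sum(pooled.index(v) + 1 for v in x)
    u1 = r1 - n1 * (n1 + 1) / 2           # U statistic for sample x
    u = min(u1, n1 * n2 - u1)             # smaller of the two U values
    mean = n1 * n2 / 2
    sd = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (u - mean) / sd
    p = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
    return u, p
```

A large p-value means the two samples are statistically indistinguishable, which is the sense in which the parallel and sequential outputs are "equivalent (but not equal)".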
Interoceanic canal excavation scheduling via computer simulation
International Nuclear Information System (INIS)
Baldonado, Orlino C.
1970-01-01
The computer simulation language GPSS/360 was used to simulate the schedule of several nuclear detonation programs for the interoceanic canal project. The effects of using different weather restriction categories due to air blast and fallout were investigated. The effect of increasing the number of emplacement and stemming crews and the effect of varying the reentry period after detonating a row charge or salvo were also studied. Detonation programs were simulated for the proposed Routes 17A and 25E. The study demonstrates the method of using computer simulation so that a schedule and its associated constraints can be assessed for feasibility. Since many simulation runs can be made for a given set of detonation program constraints, one readily obtains an average schedule for a range of conditions. This provides a method for analyzing time-sensitive operations so that time and cost-effective operational schedules can be established. A comparison of the simulated schedules with those that were published shows them to be similar. (author)
Computational Intelligence for Medical Imaging Simulations.
Chang, Victor
2017-11-25
This paper describes how to simulate medical imaging by computational intelligence to explore areas that cannot be easily achieved by traditional ways, including genes and proteins simulations related to cancer development and immunity. This paper has presented simulations and virtual inspections of BIRC3, BIRC6, CCL4, KLKB1 and CYP2A6 with their outputs and explanations, as well as brain segment intensity due to dancing. Our proposed MapReduce framework with the fusion algorithm can simulate medical imaging. The concept is very similar to the digital surface theories to simulate how biological units can get together to form bigger units, until the formation of the entire unit of biological subject. The M-Fusion and M-Update function by the fusion algorithm can achieve a good performance evaluation which can process and visualize up to 40 GB of data within 600 s. We conclude that computational intelligence can provide effective and efficient healthcare research offered by simulations and visualization.
Computer simulation on molten ionic salts
International Nuclear Information System (INIS)
Kawamura, K.; Okada, I.
1978-01-01
The extensive advances in computer technology have made it possible to apply computer simulation to the evaluation of the macroscopic and microscopic properties of molten salts. The evaluation of the potential energy in molten salt systems is complicated by the presence of long-range energy, i.e. Coulomb energy, in contrast to simple liquids where the potential energy is easily evaluated. It has been shown, however, that no difficulties are encountered when the Ewald method is applied to the evaluation of Coulomb energy. After a number of attempts had been made to approximate the pair potential, the Huggins-Mayer potential based on ionic crystals became the most often employed. Since it is thought that the only appreciable contribution to the many-body potential not included in the Huggins-Mayer potential arises from the internal electrostatic polarization of ions in molten ionic salts, computer simulation with a provision for ion polarization has been tried recently. The computations, which are employed mainly for molten alkali halides, can provide: (1) thermodynamic data such as internal energy, internal pressure and isothermal compressibility; (2) microscopic configurational data such as radial distribution functions; (3) transport data such as the diffusion coefficient and electrical conductivity; and (4) spectroscopic data such as the intensity of inelastic scattering and the stretching frequency of simple molecules. The computed results seem to agree well with the measured results. Computer simulation can also be used to test the effectiveness of a proposed pair potential and the adequacy of postulated models of molten salts, and to obtain experimentally inaccessible data. A further application of MD computation employing the pair potential based on an ionic model to BeF2, ZnCl2 and SiO2 shows the possibility of quantitative interpretation of structures and glass transformation phenomena.
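The Huggins-Mayer (Born-Mayer-Huggins) pair potential referred to above combines a Coulomb term, an exponential short-range repulsion, and dispersion terms. A sketch in reduced units, with purely illustrative parameter values rather than fitted ones:

```python
import math

# Reduced units: the Coulomb prefactor e^2/(4*pi*eps0) is set to 1.
COULOMB_K = 1.0

def huggins_mayer(r, zi, zj, A, rho, sigma, C, D):
    """Huggins-Mayer (Born-Mayer-Huggins) pair potential:
    Coulomb term + exponential short-range repulsion - r^-6 and r^-8
    dispersion terms. All parameter values used here are illustrative,
    not fitted to any real molten salt."""
    coulomb = COULOMB_K * zi * zj / r          # long-range electrostatics
    repulsion = A * math.exp((sigma - r) / rho)  # Born-Mayer repulsive wall
    dispersion = C / r**6 + D / r**8           # van der Waals attraction
    return coulomb + repulsion - dispersion
```

For an unlike-charge pair the repulsive wall dominates at short separations, while the Coulomb and dispersion terms take over at larger r; in a full simulation the Coulomb part would be evaluated with the Ewald method rather than a bare 1/r sum.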
Fluid dynamics theory, computation, and numerical simulation
Pozrikidis, C
2001-01-01
Fluid Dynamics Theory, Computation, and Numerical Simulation is the only available book that extends the classical field of fluid dynamics into the realm of scientific computing in a way that is both comprehensive and accessible to the beginner. The theory of fluid dynamics, and the implementation of solution procedures into numerical algorithms, are discussed hand-in-hand and with reference to computer programming. This book is an accessible introduction to theoretical and computational fluid dynamics (CFD), written from a modern perspective that unifies theory and numerical practice. There are several additions and subject expansions in the second edition, including new Matlab and FORTRAN codes. Two distinguishing features of the discourse are that solution procedures and algorithms are developed immediately after problem formulations are presented, and that numerical methods are introduced on a need-to-know basis and in increasing order of difficulty. Matlab codes are presented and discussed for a broad...
Computer simulations improve university instructional laboratories.
Gibbons, Nicola J; Evans, Chris; Payne, Annette; Shah, Kavita; Griffin, Darren K
2004-01-01
Laboratory classes are commonplace and essential in biology departments but can sometimes be cumbersome, unreliable, and a drain on time and resources. As university intakes increase, pressure on budgets and staff time can often lead to reduction in practical class provision. Frequently, the ability to use laboratory equipment, mix solutions, and manipulate test animals are essential learning outcomes, and "wet" laboratory classes are thus appropriate. In others, however, interpretation and manipulation of the data are the primary learning outcomes, and here, computer-based simulations can provide a cheaper, easier, and less time- and labor-intensive alternative. We report the evaluation of two computer-based simulations of practical exercises: the first in chromosome analysis, the second in bioinformatics. Simulations can provide significant time savings to students (by a factor of four in our first case study) without affecting learning, as measured by performance in assessment. Moreover, under certain circumstances, performance can be improved by the use of simulations (by 7% in our second case study). We concluded that the introduction of these simulations can significantly enhance student learning where consideration of the learning outcomes indicates that it might be appropriate. In addition, they can offer significant benefits to teaching staff.
Computational fluid dynamics for sport simulation
2009-01-01
All over the world sport plays a prominent role in society: as a leisure activity for many, as an ingredient of culture, as a business and as a matter of national prestige in such major events as the World Cup in soccer or the Olympic Games. Hence, it is not surprising that science has entered the realm of sports, and, in particular, that computer simulation has become highly relevant in recent years. This is explored in this book by choosing five different sports as examples, demonstrating that computational science and engineering (CSE) can make essential contributions to research on sports topics on both the fundamental level and, eventually, by supporting athletes’ performance.
Memory interface simulator: A computer design aid
Taylor, D. S.; Williams, T.; Weatherbee, J. E.
1972-01-01
Results are presented of a study conducted with a digital simulation model being used in the design of the Automatically Reconfigurable Modular Multiprocessor System (ARMMS), a candidate computer system for future manned and unmanned space missions. The model simulates the activity involved as instructions are fetched from random access memory for execution in one of the system's central processing units. A series of model runs measured instruction execution time under various assumptions pertaining to the CPUs and the interface between the CPUs and RAM. Design tradeoffs are presented in the following areas: bus widths, CPU microprogram read-only memory cycle time, multiple instruction fetch, and instruction mix.
Time reversibility, computer simulation, algorithms, chaos
Hoover, William Graham
2012-01-01
A small army of physicists, chemists, mathematicians, and engineers has joined forces to attack a classic problem, the "reversibility paradox", with modern tools. This book describes their work from the perspective of computer simulation, emphasizing the author's approach to the problem of understanding the compatibility, and even inevitability, of the irreversible second law of thermodynamics with an underlying time-reversible mechanics. Computer simulation has made it possible to probe reversibility from a variety of directions and "chaos theory" or "nonlinear dynamics" has supplied a useful vocabulary and a set of concepts, which allow a fuller explanation of irreversibility than that available to Boltzmann or to Green, Kubo and Onsager. Clear illustration of concepts is emphasized throughout, and reinforced with a glossary of technical terms from the specialized fields which have been combined here to focus on a common theme. The book begins with a discussion, contrasting the idealized reversibility of ba...
Computational plasticity algorithm for particle dynamics simulations
Krabbenhoft, K.; Lyamin, A. V.; Vignes, C.
2018-01-01
The problem of particle dynamics simulation is interpreted in the framework of computational plasticity leading to an algorithm which is mathematically indistinguishable from the common implicit scheme widely used in the finite element analysis of elastoplastic boundary value problems. This algorithm provides somewhat of a unification of two particle methods, the discrete element method and the contact dynamics method, which usually are thought of as being quite disparate. In particular, it is shown that the former appears as the special case where the time stepping is explicit while the use of implicit time stepping leads to the kind of schemes usually labelled contact dynamics methods. The framing of particle dynamics simulation within computational plasticity paves the way for new approaches similar (or identical) to those frequently employed in nonlinear finite element analysis. These include mixed implicit-explicit time stepping, dynamic relaxation and domain decomposition schemes.
Computer simulation of complexity in plasmas
International Nuclear Information System (INIS)
Hayashi, Takaya; Sato, Tetsuya
1998-01-01
By making a comprehensive comparative study of many self-organizing phenomena occurring in magnetohydrodynamics and kinetic plasmas, we came up with a hypothetical grand view of self-organization. This assertion is confirmed by a recent computer simulation for a broader science field, specifically, the structure formation of short polymer chains, where the nature of the interaction is completely different from that of plasmas. It is found that the formation of the global orientation order proceeds stepwise. (author)
Computer simulation of the aluminium extrusion process
Directory of Open Access Journals (Sweden)
A. Śliwa
2017-01-01
The purpose of the work is computer simulation of the aluminium extrusion process using the finite element method (FEM). The impact of the speed of a punch falling on the material in the aluminium extrusion process was investigated. It was found that high stresses are created, leading to material destruction, if the punch falls too fast. Applying FEM in industrial applications significantly reduces the design cycle, which enhances productivity and profits.
Computer simulation of displacement cascades in copper
International Nuclear Information System (INIS)
Heinisch, H.L.
1983-06-01
More than 500 displacement cascades in copper have been generated with the computer simulation code MARLOWE over an energy range pertinent to both fission and fusion neutron spectra. Three-dimensional graphical depictions of selected cascades, as well as quantitative analysis of cascade shapes and sizes and defect densities, illustrate cascade behavior as a function of energy. With increasing energy, the transition from production of single compact damage regions to widely spaced multiple damage regions is clearly demonstrated
Accelerating Climate Simulations Through Hybrid Computing
Zhou, Shujia; Sinno, Scott; Cruz, Carlos; Purcell, Mark
2009-01-01
Unconventional multi-core processors (e.g., IBM Cell B/E and NVIDIA GPU) have emerged as accelerators in climate simulation. However, climate models typically run on parallel computers with conventional processors (e.g., Intel and AMD) using MPI. Connecting accelerators to this architecture efficiently and easily becomes a critical issue. When using MPI for connection, we identified two challenges: (1) identical MPI implementation is required in both systems, and (2) existing MPI code must be modified to accommodate the accelerators. In response, we have extended and deployed IBM Dynamic Application Virtualization (DAV) in a hybrid computing prototype system (one blade with two Intel quad-core processors, two IBM QS22 Cell blades, connected with InfiniBand), allowing for seamlessly offloading compute-intensive functions to remote, heterogeneous accelerators in a scalable, load-balanced manner. Currently, a climate solar radiation model running with multiple MPI processes has been offloaded to multiple Cell blades with approximately 10% network overhead.
Computer simulation in nuclear science and engineering
International Nuclear Information System (INIS)
Akiyama, Mamoru; Miya, Kenzo; Iwata, Shuichi; Yagawa, Genki; Kondo, Shusuke; Hoshino, Tsutomu; Shimizu, Akinao; Takahashi, Hiroshi; Nakagawa, Masatoshi.
1992-01-01
The numerical simulation technology used for the design of nuclear reactors spans a wide range of scientific fields and has matured through steady efforts to achieve high calculation accuracy in safety examinations, reliability verification tests, the assessment of operation results, and so on. Taking the opportunity of numerical simulation being put to practical use in wide fields, this article reviews the numerical simulation of five basic equations that describe the natural world, together with the progress of related technologies. It is expected that numerical simulation technology will contribute not only as a means of design study but also to the progress of science and technology, for example through the construction of new innovative concepts and the exploration of new mechanisms and substances for which no models exist in the natural world. The development of atomic energy and the progress of computers, Boltzmann's transport equation and its periphery, Navier-Stokes' equation and its periphery, Maxwell's electromagnetic field equation and its periphery, Schroedinger's wave equation and its periphery, computational solid mechanics and its periphery, and probabilistic risk assessment and its periphery are described. (K.I.)
HTTR plant dynamic simulation using a hybrid computer
International Nuclear Information System (INIS)
Shimazaki, Junya; Suzuki, Katsuo; Nabeshima, Kunihiko; Watanabe, Koichi; Shinohara, Yoshikuni; Nakagawa, Shigeaki.
1990-01-01
A plant dynamic simulation of the High-Temperature Engineering Test Reactor (HTTR) has been made using a new-type hybrid computer. This report describes a dynamic simulation model of the HTTR, a hybrid simulation method for SIMSTAR, and some results obtained from dynamics analysis of the HTTR simulation. It concludes that hybrid plant simulation is useful for on-line simulation because of its high computation speed compared with all-digital computer simulation. With sufficient accuracy, computation 40 times faster than real time was reached simply by changing the analog time scale of the HTTR simulation. (author)
MDGRAPE-4: a special-purpose computer system for molecular dynamics simulations.
Ohmura, Itta; Morimoto, Gentaro; Ohno, Yousuke; Hasegawa, Aki; Taiji, Makoto
2014-08-06
We are developing the MDGRAPE-4, a special-purpose computer system for molecular dynamics (MD) simulations. MDGRAPE-4 is designed to achieve strong scalability for protein MD simulations through the integration of general-purpose cores, dedicated pipelines, memory banks and network interfaces (NIFs) to create a system on chip (SoC). Each SoC has 64 dedicated pipelines that are used for non-bonded force calculations and run at 0.8 GHz. Additionally, it has 65 Tensilica Xtensa LX cores with single-precision floating-point units that are used for other calculations and run at 0.6 GHz. At peak performance levels, each SoC can evaluate 51.2 G interactions per second. It also has 1.8 MB of embedded shared memory banks and six network units with a peak bandwidth of 7.2 GB/s for the three-dimensional torus network. The system consists of 512 (8×8×8) SoCs in total, mounted on 64 node modules with eight SoCs each. Optical transmitters/receivers are used for internode communication. The expected maximum power consumption is 50 kW. While the MDGRAPE-4 software is still being improved, we plan to run MD simulations on MDGRAPE-4 in 2014. The MDGRAPE-4 system will enable long-time molecular dynamics simulations of small systems. It is also useful for multiscale molecular simulations, where the particle simulation parts often become bottlenecks.
Energy Technology Data Exchange (ETDEWEB)
Jin, Zheming [Argonne National Lab. (ANL), Argonne, IL (United States); Yoshii, Kazutomo [Argonne National Lab. (ANL), Argonne, IL (United States); Finkel, Hal [Argonne National Lab. (ANL), Argonne, IL (United States); Cappello, Franck [Argonne National Lab. (ANL), Argonne, IL (United States)
2017-04-20
Open Computing Language (OpenCL) is a high-level language that enables software programmers to explore Field Programmable Gate Arrays (FPGAs) for application acceleration. The Intel FPGA software development kit (SDK) for OpenCL allows a user to specify applications at a high level and explore the performance of low-level hardware acceleration. In this report, we present the FPGA performance and power consumption results of the single-precision floating-point vector add OpenCL kernel using the Intel FPGA SDK for OpenCL on the Nallatech 385A FPGA board. The board features an Arria 10 FPGA. We evaluate the FPGA implementations using the compute unit duplication and kernel vectorization optimization techniques. On the Nallatech 385A FPGA board, the maximum compute kernel bandwidth we achieve is 25.8 GB/s, approximately 76% of the peak memory bandwidth. The power consumption of the FPGA device when running the kernels ranges from 29W to 42W.
New Computer Simulations of Macular Neural Functioning
Ross, Muriel D.; Doshay, D.; Linton, S.; Parnas, B.; Montgomery, K.; Chimento, T.
1994-01-01
We use high performance graphics workstations and supercomputers to study the functional significance of the three-dimensional (3-D) organization of gravity sensors. These sensors have a prototypic architecture foreshadowing more complex systems. Scaled-down simulations run on a Silicon Graphics workstation and scaled-up, 3-D versions run on a Cray Y-MP supercomputer. A semi-automated method of reconstruction of neural tissue from serial sections studied in a transmission electron microscope has been developed to eliminate tedious conventional photography. The reconstructions use a mesh as a step in generating a neural surface for visualization. Two meshes are required to model calyx surfaces. The meshes are connected and the resulting prisms represent the cytoplasm and the bounding membranes. A finite volume analysis method is employed to simulate voltage changes along the calyx in response to synapse activation on the calyx or on calyceal processes. The finite volume method insures that charge is conserved at the calyx-process junction. These and other models indicate that efferent processes act as voltage followers, and that the morphology of some afferent processes affects their functioning. In a final application, morphological information is symbolically represented in three dimensions in a computer. The possible functioning of the connectivities is tested using mathematical interpretations of physiological parameters taken from the literature. Symbolic, 3-D simulations are in progress to probe the functional significance of the connectivities. This research is expected to advance computer-based studies of macular functioning and of synaptic plasticity.
Simulation and computation in health physics training
International Nuclear Information System (INIS)
Lakey, S.R.A.; Gibbs, D.C.C.; Marchant, C.P.
1980-01-01
The Royal Naval College has devised a number of computer-aided learning programmes applicable to health physics, which include radiation shield design and optimisation, the environmental impact of a reactor accident, exposure levels produced by an inert radioactive gas cloud, and the prediction of radiation detector response in various radiation field conditions. Analogue computers are used on reduced or fast time scales because time-dependent phenomena are not always easily assimilated in real time. The build-up and decay of fission products, the dynamics of intake of radioactive material and reactor accident dynamics can be effectively simulated. It is essential to relate these simulations to real time, and the College applies a research reactor and an analytical phantom to this end. A special feature of the reactor is a chamber which can be supplied with argon-41 from reactor exhaust gases to create a realistic gaseous contamination environment. Reactor accident situations are also taught by using role-playing sequences carried out in real time in the emergency facilities associated with the research reactor. These facilities are outlined and the training technique is illustrated with examples of the calculations and simulations. The training needs of the future are discussed, with emphasis on optimisation and cost-benefit analysis. (H.K.)
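The build-up and decay of radioactive material mentioned above is governed, for a two-member chain, by the Bateman equations; a minimal sketch (the decay constants and initial population below are illustrative, not drawn from the abstract):

```python
import math

def parent_daughter(n0, lam1, lam2, t):
    """Bateman solution for a two-member decay chain:
    parent (decay constant lam1) -> daughter (decay constant lam2) -> ...
    Returns (N_parent, N_daughter) at time t, starting from a pure
    parent population of n0 atoms. Requires lam1 != lam2."""
    n1 = n0 * math.exp(-lam1 * t)
    n2 = n0 * lam1 / (lam2 - lam1) * (math.exp(-lam1 * t) - math.exp(-lam2 * t))
    return n1, n2
```

The daughter population starts at zero, builds up while the parent feeds it, and eventually decays away, which is exactly the behaviour that is awkward to follow in real time and hence worth simulating on a compressed time scale.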
Parallel Proximity Detection for Computer Simulation
Steinman, Jeffrey S. (Inventor); Wieland, Frederick P. (Inventor)
1997-01-01
The present invention discloses a system for performing proximity detection in computer simulations on parallel processing architectures utilizing a distribution list which includes movers and sensor coverages which check in and out of grids. Each mover maintains a list of sensors that detect the mover's motion as the mover and sensor coverages check in and out of the grids. Fuzzy grids are included by fuzzy resolution parameters to allow movers and sensor coverages to check in and out of grids without computing exact grid crossings. The movers check in and out of grids while moving sensors periodically inform the grids of their coverage. In addition, a lookahead function is also included for providing a generalized capability without making any limiting assumptions about the particular application to which it is applied. The lookahead function is initiated so that risk-free synchronization strategies never roll back grid events. The lookahead function adds fixed delays as events are scheduled for objects on other nodes.
Parallel Proximity Detection for Computer Simulations
Steinman, Jeffrey S. (Inventor); Wieland, Frederick P. (Inventor)
1998-01-01
The present invention discloses a system for performing proximity detection in computer simulations on parallel processing architectures utilizing a distribution list which includes movers and sensor coverages which check in and out of grids. Each mover maintains a list of sensors that detect the mover's motion as the mover and sensor coverages check in and out of the grids. Fuzzy grids are included by fuzzy resolution parameters to allow movers and sensor coverages to check in and out of grids without computing exact grid crossings. The movers check in and out of grids while moving sensors periodically inform the grids of their coverage. In addition, a lookahead function is also included for providing a generalized capability without making any limiting assumptions about the particular application to which it is applied. The lookahead function is initiated so that risk-free synchronization strategies never roll back grid events. The lookahead function adds fixed delays as events are scheduled for objects on other nodes.
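The check-in/check-out grid scheme of the two patents above can be illustrated with a uniform grid acting as a spatial hash: movers register in cells, and a sensor coverage only examines movers in the cells it overlaps. This simplified serial sketch is ours and omits the patents' fuzzy grids, parallelism, and lookahead function:

```python
from collections import defaultdict

CELL = 10.0  # grid cell size

def cell_of(x, y):
    """Map a position to its (integer) grid cell."""
    return (int(x // CELL), int(y // CELL))

class Grid:
    def __init__(self):
        self.cells = defaultdict(set)   # cell -> ids of movers checked in

    def check_in(self, mover_id, x, y):
        self.cells[cell_of(x, y)].add(mover_id)

    def check_out(self, mover_id, x, y):
        self.cells[cell_of(x, y)].discard(mover_id)

    def candidates(self, x, y, radius):
        """Movers in every cell a circular sensor coverage overlaps."""
        cx0, cy0 = cell_of(x - radius, y - radius)
        cx1, cy1 = cell_of(x + radius, y + radius)
        found = set()
        for cx in range(cx0, cx1 + 1):
            for cy in range(cy0, cy1 + 1):
                found |= self.cells.get((cx, cy), set())
        return found
```

The payoff is that a sensor never tests every mover, only those in nearby cells; an exact range test can then be applied to the small candidate set.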
Computer simulation of spacecraft/environment interaction
Krupnikov, K K; Mileev, V N; Novikov, L S; Sinolits, V V
1999-01-01
This report presents some examples of a computer simulation of spacecraft interaction with the space environment. We analysed a set of data on electron and ion fluxes measured in 1991-1994 on the geostationary satellite GORIZONT-35. The influence of spacecraft eclipse and device eclipse by solar-cell panel on spacecraft charging was investigated. A simple method was developed for an estimation of spacecraft potentials in LEO. Effects of various particle flux impacts and spacecraft orientation are discussed. A computer engineering model for a calculation of space radiation is presented. This model is used as a client/server model with WWW interface, including spacecraft model description and results representation based on the virtual reality markup language.
Computer simulation of spacecraft/environment interaction
International Nuclear Information System (INIS)
Krupnikov, K.K.; Makletsov, A.A.; Mileev, V.N.; Novikov, L.S.; Sinolits, V.V.
1999-01-01
This report presents some examples of a computer simulation of spacecraft interaction with the space environment. We analysed a set of data on electron and ion fluxes measured in 1991-1994 on the geostationary satellite GORIZONT-35. The influence of spacecraft eclipse and device eclipse by solar-cell panel on spacecraft charging was investigated. A simple method was developed for an estimation of spacecraft potentials in LEO. Effects of various particle flux impacts and spacecraft orientation are discussed. A computer engineering model for a calculation of space radiation is presented. This model is used as a client/server model with WWW interface, including spacecraft model description and results representation based on the virtual reality markup language.
Investigation of Carbohydrate Recognition via Computer Simulation
Directory of Open Access Journals (Sweden)
Quentin R. Johnson
2015-04-01
Carbohydrate recognition by proteins, such as lectins and other (bio)molecules, can be essential for many biological functions. Interest has grown recently owing to potential applications in protein and drug design and future bioengineering. A quantitative measurement of carbohydrate-protein interaction is thus important for the full characterization of sugar recognition. In this review, we focus on the use of computer simulations and biophysical models to evaluate the strength and specificity of carbohydrate recognition. With increasing computational resources, better algorithms and refined modeling parameters, using state-of-the-art supercomputers to calculate the strength of the interaction between molecules has become increasingly mainstream. We review the current state of this technique and its successful applications for studying protein-sugar interactions in recent years.
Computer Simulations of Lipid Bilayers and Proteins
DEFF Research Database (Denmark)
Sonne, Jacob
2006-01-01
This thesis, entitled Computer simulations of lipid bilayers and proteins, describes two molecular dynamics (MD) simulation studies of pure lipid bilayers as well as a study of a transmembrane protein embedded in a lipid bilayer matrix. A brief overview of the thesis follows. Chapter 1 is a short introduction that briefly describes, in a non-technical way, the basic biological background for the systems studied in Chapters 3, 4 and 5, to allow the generally interested reader to get an impression of the work. Chapter 2 (Methods) presents the background for the methods used... Pressure profile calculations in lipid bilayers: a lipid bilayer is merely ~5 nm thick, but the lateral pressure (parallel to the bilayer plane) varies by several hundred bar over this short distance (normal to the bilayer). These variations in the lateral pressure are commonly referred to as the pressure...
Computer Simulations of Intrinsically Disordered Proteins
Chong, Song-Ho; Chatterjee, Prathit; Ham, Sihyun
2017-05-01
The investigation of intrinsically disordered proteins (IDPs) is a new frontier in structural and molecular biology that requires a new paradigm to connect structural disorder to function. Molecular dynamics simulations and statistical thermodynamics potentially offer ideal tools for atomic-level characterizations and thermodynamic descriptions of this fascinating class of proteins that will complement experimental studies. However, IDPs display sensitivity to inaccuracies in the underlying molecular mechanics force fields. Thus, achieving an accurate structural characterization of IDPs via simulations is a challenge. It is also daunting to perform a configuration-space integration over heterogeneous structural ensembles sampled by IDPs to extract, in particular, protein configurational entropy. In this review, we summarize recent efforts devoted to the development of force fields and the critical evaluations of their performance when applied to IDPs. We also survey recent advances in computational methods for protein configurational entropy that aim to provide a thermodynamic link between structural disorder and protein activity.
The learning effects of computer simulations in science education
Rutten, N.P.G.; van Joolingen, Wouter; van der Veen, Jan T.
2012-01-01
This article reviews the (quasi)experimental research of the past decade on the learning effects of computer simulations in science education. The focus is on two questions: how use of computer simulations can enhance traditional education, and how computer simulations are best used in order to
Computer generated timing diagrams to supplement simulation
Booth, A W
1981-01-01
The ISPS computer description language has been used in a simulation study to specify the components of a high speed data acquisition system and its protocols. A facility has been developed for automatically generating timing diagrams from the specification of the data acquisition system written in the ISPS description language. Diagrams can be generated for both normal and abnormal working modes of the system. They are particularly useful for design and debugging in the prototyping stage of a project and can be later used for reference by maintenance engineers. (11 refs).
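The idea of automatically rendering timing diagrams from a system specification can be illustrated in miniature. The sketch below is a hypothetical toy (not the ISPS facility): it turns binary signal traces into textual timing diagrams, one row per signal.

```python
def timing_diagram(traces):
    """Render binary signal traces as ASCII timing diagrams.

    `traces` maps a signal name to a list of 0/1 samples; each sample
    becomes one column: '_' for low, '#' for high.
    """
    lines = []
    for name, samples in traces.items():
        wave = "".join("_" if s == 0 else "#" for s in samples)
        lines.append(f"{name:>8} {wave}")
    return "\n".join(lines)

# A clock and a request line over six sample ticks.
diagram = timing_diagram({"clk": [0, 1, 0, 1, 0, 1],
                          "req": [0, 0, 1, 1, 0, 0]})
```

A real generator would of course derive the traces from the simulated protocol rather than take them as literals, and could render both normal and faulty runs the same way.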
Computer simulation of replacement sequences in copper
International Nuclear Information System (INIS)
Schiffgens, J.O.; Schwartz, D.W.; Ariyasu, R.G.; Cascadden, S.E.
1978-01-01
Results of computer simulations of […], […], and […] replacement sequences in copper are presented, including displacement thresholds, focusing energies, energy losses per replacement, and replacement sequence lengths. These parameters are tabulated for six interatomic potentials and shown to vary in a systematic way with potential stiffness and range. Comparisons of results from calculations made with ADDES, a quasi-dynamical code, and COMENT, a dynamical code, show excellent agreement, demonstrating that the former can be calibrated and used satisfactorily in the analysis of low energy displacement cascades. Upper limits on […], […], and […] replacement sequences were found to be approximately 10, approximately 30, and approximately 14 replacements, respectively. (author)
Reproducibility in Computational Neuroscience Models and Simulations
McDougal, Robert A.; Bulanova, Anna S.; Lytton, William W.
2016-01-01
Objective Like all scientific research, computational neuroscience research must be reproducible. Big data science, including simulation research, cannot depend exclusively on journal articles as the method to provide the sharing and transparency required for reproducibility. Methods Ensuring model reproducibility requires the use of multiple standard software practices and tools, including version control, strong commenting and documentation, and code modularity. Results Building on these standard practices, model sharing sites and tools have been developed that fit into several categories: 1. standardized neural simulators, 2. shared computational resources, 3. declarative model descriptors, ontologies and standardized annotations; 4. model sharing repositories and sharing standards. Conclusion A number of complementary innovations have been proposed to enhance sharing, transparency and reproducibility. The individual user can be encouraged to make use of version control, commenting, documentation and modularity in development of models. The community can help by requiring model sharing as a condition of publication and funding. Significance Model management will become increasingly important as multiscale models become larger, more detailed and correspondingly more difficult to manage by any single investigator or single laboratory. Additional big data management complexity will come as the models become more useful in interpreting experiments, thus increasing the need to ensure clear alignment between modeling data, both parameters and results, and experiment. PMID:27046845
A Computational Framework for Bioimaging Simulation
Watabe, Masaki; Arjunan, Satya N. V.; Fukushima, Seiya; Iwamoto, Kazunari; Kozuka, Jun; Matsuoka, Satomi; Shindo, Yuki; Ueda, Masahiro; Takahashi, Koichi
2015-01-01
Using bioimaging technology, biologists have attempted to identify and document analytical interpretations that underlie biological phenomena in biological cells. Theoretical biology aims at distilling those interpretations into knowledge in the mathematical form of biochemical reaction networks and understanding how higher-level functions emerge from the combined action of biomolecules. However, there still remain formidable challenges in bridging the gap between bioimaging and mathematical modeling. Generally, measurements using fluorescence microscopy systems are influenced by systematic effects that arise from the stochastic nature of biological cells, the imaging apparatus, and optical physics. Such systematic effects are always present in all bioimaging systems and hinder quantitative comparison between the cell model and bioimages. Computational tools for such a comparison are still unavailable. Thus, in this work, we present a computational framework for handling the parameters of the cell models and the optical physics governing bioimaging systems. Simulation using this framework can generate digital images of cell simulation results after accounting for the systematic effects. We then demonstrate that such a framework enables comparison at the level of photon-counting units. PMID:26147508
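One of the systematic effects such a framework must account for is photon shot noise. The toy sketch below (not the authors' framework; parameters are illustrative) converts an ideal fluorophore-density map into Poisson-distributed photon counts, which is what puts simulation and bioimage on the same photon-counting footing.

```python
import math
import random

def photon_image(ideal, exposure=100, seed=0):
    """Map an ideal fluorophore-density image to detected photon counts.

    Each pixel's expected photon number is density * exposure; the
    detected count is Poisson-distributed around that mean (shot
    noise), the dominant stochastic effect in low-light imaging.
    """
    rng = random.Random(seed)

    def poisson(lam):
        # Knuth's multiplicative method; adequate for the small means here.
        limit, k, p = math.exp(-lam), 0, 1.0
        while True:
            p *= rng.random()
            if p <= limit:
                return k
            k += 1

    return [[poisson(d * exposure) for d in row] for row in ideal]

# A 2x2 'image': empty, dim, bright, and faint pixels.
counts = photon_image([[0.0, 0.5], [1.0, 0.1]])
```

Comparing such synthetic counts against measured bioimages, pixel by pixel, is the kind of quantitative comparison the abstract describes.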
A Computational Framework for Bioimaging Simulation.
Directory of Open Access Journals (Sweden)
Masaki Watabe
Full Text Available Using bioimaging technology, biologists have attempted to identify and document analytical interpretations that underlie biological phenomena in biological cells. Theoretical biology aims at distilling those interpretations into knowledge in the mathematical form of biochemical reaction networks and understanding how higher-level functions emerge from the combined action of biomolecules. However, there still remain formidable challenges in bridging the gap between bioimaging and mathematical modeling. Generally, measurements using fluorescence microscopy systems are influenced by systematic effects that arise from the stochastic nature of biological cells, the imaging apparatus, and optical physics. Such systematic effects are always present in all bioimaging systems and hinder quantitative comparison between the cell model and bioimages. Computational tools for such a comparison are still unavailable. Thus, in this work, we present a computational framework for handling the parameters of the cell models and the optical physics governing bioimaging systems. Simulation using this framework can generate digital images of cell simulation results after accounting for the systematic effects. We then demonstrate that such a framework enables comparison at the level of photon-counting units.
Computer simulations for the nano-scale
International Nuclear Information System (INIS)
Stich, I.
2007-01-01
A review of methods for computations at the nano-scale is presented. The paper should provide a convenient starting point for nano-scale computations as well as a more in-depth presentation for those already working in the field of atomic/molecular-scale modeling. The discussion is divided into chapters covering methods for the description of (i) electrons and (ii) ions, and (iii) techniques for efficiently solving the underlying equations. A fairly broad view is taken, covering the Hartree-Fock approximation, density functional techniques and quantum Monte-Carlo techniques for electrons. The customary quantum chemistry methods, such as post-Hartree-Fock techniques, are only briefly mentioned. Descriptions of both classical and quantum ions are presented. The techniques cover Ehrenfest, Born-Oppenheimer, and Car-Parrinello dynamics. The strong and weak points, of both principal and technical nature, are analyzed. In the second part we introduce a number of applications to demonstrate the different approximations and techniques introduced in the first part. They cover a wide range of applications such as non-simple liquids, surfaces, molecule-surface interactions, applications in nanotechnology, etc. These more in-depth presentations, while certainly not exhaustive, should provide information on technical aspects of the simulations, typical parameters used, and ways of analyzing the huge amounts of data generated in these large-scale supercomputer simulations. (author)
Computational Modeling and Simulation of Developmental ...
Standard practice for assessing developmental toxicity is the observation of apical endpoints (intrauterine death, fetal growth retardation, structural malformations) in pregnant rats/rabbits following exposure during organogenesis. EPA's computational toxicology research program (ToxCast) generated vast in vitro cellular and molecular effects data on >1858 chemicals in >600 high-throughput screening (HTS) assays. The diversity of assays has been increased for developmental toxicity with several HTS platforms, including the devTOX-quickPredict assay from Stemina Biomarker Discovery utilizing the human embryonic stem cell line (H9). Translating these HTS data into higher-order predictions of developmental toxicity is a significant challenge. Here, we address the application of computational systems models that recapitulate the kinematics of dynamical cell signaling networks (e.g., SHH, FGF, BMP, retinoids) in a CompuCell3D.org modeling environment. Examples include angiogenesis (angiodysplasia) and dysmorphogenesis. Being numerically responsive to perturbation, these models are amenable to data integration for systems toxicology and Adverse Outcome Pathways (AOPs). The AOP simulation outputs predict potential phenotypes based on the in vitro ToxCast HTS data. A heuristic computational intelligence framework that recapitulates the kinematics of dynamical cell signaling networks in the embryo, together with the in vitro profiling data, produces quantitative predic…
Computer Simulation of Developmental Processes and ...
Rationale: Recent progress in systems toxicology and synthetic biology has paved the way to new thinking about in vitro/in silico modeling of developmental processes and toxicities, both for embryological and reproductive impacts. Novel in vitro platforms such as 3D organotypic culture models, engineered microscale tissues and complex microphysiological systems (MPS), together with computational models and computer simulation of tissue dynamics, lend themselves to integrated testing strategies for predictive toxicology. As these emergent methodologies continue to evolve, they must be integrally tied to maternal/fetal physiology and toxicity of the developing individual across early lifestage transitions, from fertilization to birth, through puberty and beyond. Scope: This symposium will focus on how novel technology platforms can help, now and in the future, with in vitro/in silico modeling of complex biological systems for developmental and reproductive toxicity issues, and with translating systems models into integrative testing strategies. The symposium is based on three main organizing principles: (1) that novel in vitro platforms with human cells configured in nascent tissue architectures with native microphysiological environments yield mechanistic understanding of developmental and reproductive impacts of drug/chemical exposures; (2) that novel in silico platforms with high-throughput screening (HTS) data, biologically-inspired computational models of
Computer simulation of surface and film processes
Tiller, W. A.; Halicioglu, M. T.
1984-01-01
All of the investigations performed employed, in one way or another, a computer simulation technique based on atomistic-level considerations. In general, three types of simulation methods were used for modeling systems of discrete particles that interact via well defined potential functions: molecular dynamics (a general method for solving the classical equations of motion of a model system); Monte Carlo (the use of a Markov chain ensemble averaging technique to model equilibrium properties of a system); and molecular statics (which provides properties of a system at T = 0 K). The effects of three-body forces on the vibrational frequencies of triatomic clusters were investigated. The multilayer relaxation phenomena for low-index planes of an fcc crystal were also analyzed as a function of the three-body interactions. Various surface properties for the Si and SiC systems were calculated. Results obtained from static simulation calculations of slip formation were presented. The more elaborate molecular dynamics calculations on the propagation of cracks in two-dimensional systems were outlined.
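The Monte Carlo method named above can be illustrated by a minimal Metropolis sampler. This is a generic sketch in reduced Lennard-Jones units (not the codes used in the study): propose a random single-particle move and accept it with probability min(1, exp(-beta*dE)), which samples the canonical ensemble.

```python
import math
import random

def lj(r):
    """Lennard-Jones pair potential, epsilon = sigma = 1 (reduced units)."""
    return 4.0 * (r ** -12 - r ** -6)

def metropolis(positions, beta=5.0, steps=500, delta=0.1, seed=1):
    """Metropolis Monte Carlo for a 1-D chain of Lennard-Jones particles."""
    rng = random.Random(seed)
    pos = list(positions)

    def total_energy(p):
        return sum(lj(abs(p[i] - p[j]))
                   for i in range(len(p)) for j in range(i + 1, len(p)))

    energy = total_energy(pos)
    accepted = 0
    for _ in range(steps):
        i = rng.randrange(len(pos))
        trial = pos[:]
        trial[i] += rng.uniform(-delta, delta)  # single-particle move
        e_trial = total_energy(trial)
        # Metropolis criterion: always accept downhill, sometimes uphill.
        if e_trial <= energy or rng.random() < math.exp(-beta * (e_trial - energy)):
            pos, energy = trial, e_trial
            accepted += 1
    return pos, energy, accepted / steps

# Four particles near their equilibrium spacing, sampled at low temperature.
final, e_final, acc_rate = metropolis([0.0, 1.2, 2.4, 3.6])
```

Molecular statics would instead minimize `total_energy` directly (T = 0 K), and molecular dynamics would integrate forces; the pair potential plays the same role in all three methods.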
Computer Simulation of the UMER Gridded Gun
Haber, Irving; Friedman, Alex; Grote, D P; Kishek, Rami A; Reiser, Martin; Vay, Jean-Luc; Zou, Yun
2005-01-01
The electron source in the University of Maryland Electron Ring (UMER) injector employs a grid 0.15 mm from the cathode to control the current waveform. Under nominal operating conditions, the grid voltage during the current pulse is sufficiently positive relative to the cathode potential to form a virtual cathode downstream of the grid. Three-dimensional computer simulations have been performed that use the mesh refinement capability of the WARP particle-in-cell code to examine a small region near the beam center in order to illustrate some of the complexity that can result from such a gridded structure. These simulations have been found to reproduce the hollowed velocity space that is observed experimentally. The simulations also predict a complicated time-dependent response to the waveform applied to the grid during the current turn-on. This complex temporal behavior appears to result directly from the dynamics of the virtual cathode formation and may therefore be representative of the expected behavior in...
Evaluation of Marine Corps Manpower Computer Simulation Model
2016-12-01
overall end strength are maintained. To assist their mission, an agent-based computer simulation model was developed in the Java computer language. This thesis investigates that simulation software, which models business practices to assist a business in its "ability to analyze and make decisions on how to improve (their
Computer Simulation of Electron Positron Annihilation Processes
Energy Technology Data Exchange (ETDEWEB)
Chen, Y.
2003-10-02
With the launch of the Next Linear Collider drawing closer, there is a pressing need for physicists to develop a fully-integrated computer simulation of the e⁺e⁻ annihilation process at a center-of-mass energy of 1 TeV. A simulation program acts as the template for future experiments. Either new physics will be discovered, or current theoretical uncertainties will shrink due to more accurate higher-order radiative correction calculations. The existence of an efficient and accurate simulation will help us understand the new data and validate (or veto) some of the theoretical models developed to explain new physics. It should handle well the interfaces between different sectors of physics, e.g., interactions happening at the parton level well above the QCD scale, which are described by perturbative QCD, and interactions happening at much lower energy scales, which combine partons into hadrons. It should also achieve competitive speed in real time as the complexity of the simulation increases. This thesis contributes some tools that will be useful for the development of such simulation programs. We begin our study with the development of a new Monte Carlo algorithm intended to perform efficiently in selecting weight-1 events when multiple parameter dimensions are strongly correlated. The algorithm first seeks to model the peaks of the distribution by features, adapting these features to the function using the EM algorithm. The representation of the distribution provided by these features is then improved using the VEGAS algorithm for the Monte Carlo integration. The two strategies mesh neatly into an effective multi-channel adaptive representation. We then present a new algorithm for the simulation of parton shower processes in high energy QCD. We want to find an algorithm which is free of negative weights, produces its output as a set of exclusive events, and whose total rate exactly matches the full Feynman amplitude calculation. Our strategy is to create
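The "weight-1 event" selection that the thesis optimizes builds on plain acceptance-rejection unweighting, which can be sketched in a few lines. This is a generic illustration with a toy distribution, not the thesis's adaptive feature/VEGAS algorithm: each weighted candidate is kept with probability weight/wmax, leaving a sample in which every event carries unit weight.

```python
import random

def unweight(events, weight, wmax, seed=7):
    """Turn weighted candidate events into unit-weight events.

    Each candidate is kept with probability weight(ev)/wmax, so the
    survivors follow the target distribution with weight 1 each.
    wmax must bound the weight over the sampled region.
    """
    rng = random.Random(seed)
    return [ev for ev in events if rng.random() < weight(ev) / wmax]

# Toy target density f(x) = 2x on [0, 1): draw candidates flat, then unweight.
rng = random.Random(3)
candidates = [rng.random() for _ in range(10000)]
sample = unweight(candidates, weight=lambda x: 2.0 * x, wmax=2.0)
```

The efficiency of this step collapses when the weight is strongly peaked in correlated dimensions, which is exactly the problem the adaptive multi-channel representation in the thesis addresses.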
Computational simulation methods for composite fracture mechanics
Murthy, Pappu L. N.
1988-01-01
Structural integrity, durability, and damage tolerance of advanced composites are assessed by studying damage initiation at various scales (micro, macro, and global) and accumulation and growth leading to global failure, quantitatively and qualitatively. In addition, various fracture toughness parameters associated with a typical damage and its growth must be determined. Computational structural analysis codes to aid the composite design engineer in performing these tasks were developed. CODSTRAN (COmposite Durability STRuctural ANalysis) is used to qualitatively and quantitatively assess the progressive damage occurring in composite structures due to mechanical and environmental loads. Next, methods are covered that are currently being developed and used at Lewis to predict interlaminar fracture toughness and related parameters of fiber composites given a prescribed damage. The general purpose finite element code MSC/NASTRAN was used to simulate the interlaminar fracture and the associated individual as well as mixed-mode strain energy release rates in fiber composites.
Amorphous nanoparticles — Experiments and computer simulations
International Nuclear Information System (INIS)
Hoang, Vo Van; Ganguli, Dibyendu
2012-01-01
Data obtained over decades by both experiments and computer simulations concerning amorphous nanoparticles, including methods of synthesis, characterization, structural properties, the atomic mechanism of glass formation in nanoparticles, crystallization of amorphous nanoparticles, physico-chemical properties (i.e. catalytic, optical, thermodynamic, magnetic, bioactivity and other properties) and various applications in science and technology, are reviewed. Amorphous nanoparticles coated with different surfactants are also reviewed as an extension in this direction. Much attention is paid to the pressure-induced polyamorphism of amorphous nanoparticles and the amorphization of their nanocrystalline counterparts. We also introduce nanocomposites and nanofluids containing amorphous nanoparticles. Overall, amorphous nanoparticles exhibit a disordered structure different from that of the corresponding bulk materials and of the nanocrystalline counterparts. Therefore, amorphous nanoparticles can have unique physico-chemical properties that differ from those of their crystalline counterparts, leading to potential applications in science and technology.
Experiential Learning through Computer-Based Simulations.
Maynes, Bill; And Others
1992-01-01
Describes experiential learning instructional model and simulation for student principals. Describes interactive laser videodisc simulation. Reports preliminary findings about student principal learning from simulation. Examines learning approaches by unsuccessful and successful students and learning levels of model learners. Simulation's success…
Computer simulation, nuclear techniques and surface analysis
Directory of Open Access Journals (Sweden)
Reis, A. D.
2010-02-01
Full Text Available This article is about computer simulation and surface analysis by nuclear techniques, which are non-destructive. The "energy method of analysis" for nuclear reactions is used. Energy spectra are computer simulated and compared with experimental data, giving target composition and concentration profile information. Details of the prediction stages are given for thick flat target yields. Predictions are made for non-flat targets having asymmetric triangular surface contours. The method is successfully applied to depth profiling of ¹²C and ¹⁸O nuclei in thick targets, by deuteron-induced (d,p) and proton-induced (p,α) reactions, respectively.
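The "energy method" described above can be caricatured in a few lines: slice the target into depth bins, let the beam lose a fixed energy per slice, and histogram the per-slice reaction yield by the corresponding energy. The detected spectrum then encodes the depth profile. All names and constants below are illustrative, not physical values.

```python
def simulate_spectrum(concentration, e_beam=1000.0, de_dx=2.0, dx=1.0, sigma=1.0):
    """Toy thick-target yield simulation for the 'energy method'.

    `concentration` holds one entry per depth slice. Reactions in a
    slice contribute yield sigma * c * dx at the beam energy reached
    at that depth, so each spectrum bin maps back to a depth.
    """
    spectrum = {}
    e = e_beam
    for c in concentration:
        yield_slice = sigma * c * dx
        key = round(e, 6)
        spectrum[key] = spectrum.get(key, 0.0) + yield_slice
        e -= de_dx * dx  # beam slows before reaching the next slice
    return spectrum

# A buried layer: the profiled nuclide sits only in the third and fourth slices.
spec = simulate_spectrum([0.0, 0.0, 5.0, 5.0, 0.0, 0.0])
```

Fitting such a simulated spectrum to a measured one, while varying the concentration profile, is the inverse problem the article's method solves (with real stopping powers and cross-sections).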
Computer-Based Simulation Games in Public Administration Education
Kutergina Evgeniia
2017-01-01
Computer simulation, an active learning technique, is now one of the advanced pedagogical technologies. The use of simulation games in the educational process allows students to gain a firsthand understanding of the processes of real life. Public-administration, public-policy and political-science courses increasingly adopt simulation games in universities worldwide. Besides person-to-person simulation games, there are computer-based simulations in public-administration education. Currently...
Introducing Computational Physics in Introductory Physics using Intentionally Incorrect Simulations
Cox, Anne
2011-03-01
Students in physics courses routinely use and trust computer simulations. Finding errors in intentionally incorrect simulations can help students learn physics, be more skeptical of simulations, and provide an initial introduction to computational physics. This talk will provide examples of electrostatics simulations that students can correct using Easy Java Simulations and are housed in the Open Source Physics Collection on ComPADRE (http://www.compadre.org/osp). Partial support through the Open Source Physics Project, NSF DUE-0442581.
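In the spirit of the talk, though not taken from the actual Easy Java Simulations materials, a deliberately planted error students might hunt for can look like the sketch below. The `buggy` flag and the planted mistake are hypothetical: it drops the sign of the charge product, so opposite charges wrongly repel.

```python
def coulomb_force(q1, q2, r, buggy=False, k=8.99e9):
    """Coulomb force between two point charges separated by r (meters).

    A positive return value means repulsion. With buggy=True the code
    contains the kind of intentional error students are asked to find:
    it uses |q1*q2|, so attraction can never occur.
    """
    if buggy:
        return k * abs(q1 * q2) / r**2  # planted error: always repulsive
    return k * q1 * q2 / r**2

# Opposite charges 0.1 m apart: correct code attracts, buggy code repels.
good = coulomb_force(1e-6, -1e-6, 0.1)
bad = coulomb_force(1e-6, -1e-6, 0.1, buggy=True)
```

Spotting that two opposite charges fly apart in an animation driven by the buggy branch is exactly the kind of physical-reasoning check the talk advocates.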
Associative Memory Computing Power and Its Simulation
Volpi, G; The ATLAS collaboration
2014-01-01
The associative memory (AM) system is a computing device made of hundreds of AM ASIC chips designed to perform "pattern matching" at very high speed. Since each AM chip stores a database of 130,000 pre-calculated patterns and large numbers of chips can easily be assembled together, it is possible to produce huge AM banks. Speed and size of the system are crucial for real-time High Energy Physics applications, such as the ATLAS Fast TracKer (FTK) Processor. Using 80 million channels of the ATLAS tracker, FTK finds tracks within 100 microseconds. The simulation of such a parallelized system is an extremely complex task if executed on commercial computers based on normal CPUs. The algorithm performance is limited due to the lack of parallelism, and in addition the memory requirement is very large. In fact, the AM chip uses a content-addressable memory (CAM) architecture. Any data inquiry is broadcast to all memory elements simultaneously, so data retrieval time is independent of the database size. The gr...
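The CAM lookup described above, a query broadcast to every stored pattern at once, can be modeled in a few lines. This hypothetical sketch only mimics the matching logic: in hardware all comparisons fire simultaneously, so lookup time does not grow with bank size, whereas this software loop is inherently sequential, which is exactly why simulating the AM system on normal CPUs is costly.

```python
def cam_match(bank, query, wildcard=None):
    """Simulate an associative-memory (CAM) lookup.

    Returns the addresses of all stored patterns matching the query.
    A pattern position equal to `wildcard` matches any query value
    (a common CAM feature, "don't care" bits).
    """
    hits = []
    for addr, pattern in enumerate(bank):
        if all(p == wildcard or p == q for p, q in zip(pattern, query)):
            hits.append(addr)
    return hits

# Three stored patterns; the second has a "don't care" in the middle slot.
bank = [(1, 2, 3), (1, None, 3), (4, 5, 6)]
hits = cam_match(bank, (1, 9, 3))
```

A pre-computed pattern bank with millions of entries makes this loop the simulation bottleneck, motivating the specialized simulation techniques the abstract alludes to.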
Associative Memory computing power and its simulation
Ancu, L S; The ATLAS collaboration; Britzger, D; Giannetti, P; Howarth, J W; Luongo, C; Pandini, C; Schmitt, S; Volpi, G
2014-01-01
The associative memory (AM) system is a computing device made of hundreds of AM ASIC chips designed to perform "pattern matching" at very high speed. Since each AM chip stores a database of 130,000 pre-calculated patterns and large numbers of chips can easily be assembled together, it is possible to produce huge AM banks. Speed and size of the system are crucial for real-time High Energy Physics applications, such as the ATLAS Fast TracKer (FTK) Processor. Using 80 million channels of the ATLAS tracker, FTK finds tracks within 100 microseconds. The simulation of such a parallelized system is an extremely complex task if executed on commercial computers based on normal CPUs. The algorithm performance is limited due to the lack of parallelism, and in addition the memory requirement is very large. In fact, the AM chip uses a content-addressable memory (CAM) architecture. Any data inquiry is broadcast to all memory elements simultaneously, so data retrieval time is independent of the database size. The gr...
Computer simulation of sputtering: A review
International Nuclear Information System (INIS)
Robinson, M.T.; Hou, M.
1992-08-01
In 1986, H. H. Andersen reviewed attempts to understand sputtering by computer simulation and identified several areas where further research was needed: potential energy functions for molecular dynamics (MD) modelling; the role of inelastic effects on sputtering, especially near the target surface; the modelling of surface binding in models based on the binary collision approximation (BCA); aspects of cluster emission in MD models; and angular distributions of sputtered particles. To these may be added kinetic energy distributions of sputtered particles and the relationships between MD and BCA models, as well as the development of intermediate models. Many of these topics are discussed. Recent advances in BCA modelling include the explicit evaluation of the time in strict BCA codes and the development of intermediate codes able to simulate certain many-particle problems realistically. Developments in MD modelling include the widespread use of many-body potentials in sputtering calculations, the inclusion of realistic electron excitation and electron-phonon interactions, and several studies of cluster ion impacts on solid surfaces
Computer simulations of the mouse spermatogenic cycle
Directory of Open Access Journals (Sweden)
Debjit Ray
2014-12-01
Full Text Available The spermatogenic cycle describes the periodic development of germ cells in the testicular tissue. The temporal–spatial dynamics of the cycle highlight the unique, complex, and interdependent interaction between germ and somatic cells, and are the key to continual sperm production. Although understanding the spermatogenic cycle has important clinical relevance for male fertility and contraception, there are a number of experimental obstacles. For example, the lengthy process cannot be visualized through dynamic imaging, and the precise action of germ cells that leads to the emergence of testicular morphology remains uncharacterized. Here, we report an agent-based model that simulates the mouse spermatogenic cycle on a cross-section of the seminiferous tubule over a time scale of hours to years, while considering feedback regulation, mitotic and meiotic division, differentiation, apoptosis, and movement. The computer model is able to elaborate the germ cell dynamics in a time-lapse movie format, allowing us to trace individual cells as they change state and location. More importantly, the model provides mechanistic understanding of the fundamentals of male fertility, namely how testicular morphology and sperm production are achieved. By manipulating cellular behaviors either individually or collectively in silico, the model predicts causal events for the altered arrangement of germ cells upon genetic or environmental perturbations. This in silico platform can serve as an interactive tool to perform long-term simulation and to identify optimal approaches for infertility treatment and contraceptive development.
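A drastically simplified, hypothetical sketch of the agent-based bookkeeping such a model performs is shown below: each cell agent carries a state and, per time step, may die, divide, or mature. The state names, rules and probabilities are illustrative only, not the published model's.

```python
import random

# Simplified germ-cell states; the real model tracks many more, plus location.
STATES = ("spermatogonium", "spermatocyte", "spermatid")

def step(cells, p_divide=0.3, p_die=0.05, rng=None):
    """Advance every cell agent by one time step.

    Each agent either dies (apoptosis), divides (spermatogonia only,
    keeping a self-renewing copy plus one differentiating daughter),
    or matures to the next state; terminal spermatids persist.
    """
    rng = rng or random.Random()
    next_cells = []
    for state in cells:
        if rng.random() < p_die:
            continue  # apoptosis: agent removed from the population
        idx = STATES.index(state)
        if state == "spermatogonium" and rng.random() < p_divide:
            next_cells.append("spermatogonium")  # self-renewal
            next_cells.append("spermatocyte")    # differentiating daughter
        elif idx + 1 < len(STATES):
            next_cells.append(STATES[idx + 1])   # maturation
        else:
            next_cells.append(state)             # terminal state persists
    return next_cells

rng = random.Random(42)
population = ["spermatogonium"] * 20
for _ in range(10):
    population = step(population, rng=rng)
```

The published model adds spatial movement on a seminiferous-tubule cross-section, meiosis, and feedback regulation on top of exactly this kind of per-agent state update, which is what lets it replay the cycle as a time-lapse movie.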
Sonification of simulations in computational physics
International Nuclear Information System (INIS)
Vogt, K.
2010-01-01
Sonification is the translation of information for auditory perception, excluding speech itself. The cognitive performance of pattern recognition is striking for sound, and has too long been disregarded by the scientific mainstream. Examples of 'spontaneous sonification' and systematic research over about 20 years have proven that sonification provides a valuable tool for the exploration of scientific data. The data in this thesis stem from computational physics, where numerical simulations are applied to problems in physics. Prominent examples are spin models and lattice quantum field theories. The corresponding data lend themselves very well to innovative display methods: they are structured on discrete lattices, often stochastic, high-dimensional and abstract, and they provide huge amounts of data. Furthermore, they have no inherently perceptual dimension. When designing the sonification of simulation data, one has to make decisions on three levels, both for the data and the sound model: the level of meaning (phenomenological; metaphoric); of structure (in time and space); and of elements ('display units' vs. 'gestalt units'). The design usually proceeds as a bottom-up or top-down process. This thesis provides a 'toolbox' to help with these decisions. It describes tools that have proven particularly useful in the context of simulation data. An explicit method of top-down sonification design is the metaphoric sonification method, which is based on expert interviews. Furthermore, qualitative and quantitative evaluation methods are presented, on the basis of which a set of evaluation criteria is proposed. The translation between a scientific domain and the sound synthesis domain is elucidated by a sonification operator. For this formalization, a collection of notation modules is provided. Showcases are discussed in detail that have been developed in the interdisciplinary research projects SonEnvir and QCD-audio, during the second Science By Ear workshop and during a
[Thoughts on and probes into computer simulation of acupuncture manipulation].
Hu, Yin'e; Liu, Tangyi; Tang, Wenchao; Xu, Gang; Gao, Ming; Yang, Huayuan
2011-08-01
Studies of the simulation of acupuncture manipulation mainly focus on mechanical simulation and virtual simulation. In mechanical simulation, the aim of the research is to develop instruments that simulate acupuncture manipulation and to apply them to supplement or replace manual manipulation; virtual simulation applies virtual-reality technology to present the manipulation in 3D real time on a computer screen. This paper summarizes recent research progress on computer simulation of acupuncture manipulation at home and abroad, and concludes with the significance of, and open problems in, such simulation. We propose that research on manipulation simulation should pay particular attention to simulating expert manipulation, as well as to verification studies of conformity and clinical effects.
Application of computer simulated persons in indoor environmental modeling
DEFF Research Database (Denmark)
Topp, C.; Nielsen, P. V.; Sørensen, Dan Nørtoft
2002-01-01
Computer simulated persons are often applied when the indoor environment is modeled by computational fluid dynamics. The computer simulated persons differ in size, shape, and level of geometrical complexity, ranging from simple box or cylinder shaped heat sources to more humanlike models. Little...
Using Computational Simulations to Confront Students' Mental Models
Rodrigues, R.; Carvalho, P. Simeão
2014-01-01
In this paper we show an example of how to use a computational simulation to obtain visual feedback for students' mental models, and compare their predictions with the simulated system's behaviour. Additionally, we use the computational simulation to incrementally modify the students' mental models in order to accommodate new data,…
Proceedings of the meeting on large scale computer simulation research
International Nuclear Information System (INIS)
2004-04-01
The meeting to summarize the collaboration activities for FY2003 on the Large Scale Computer Simulation Research was held January 15-16, 2004 at Theory and Computer Simulation Research Center, National Institute for Fusion Science. Recent simulation results, methodologies and other related topics were presented. (author)
The role of computer simulation in nuclear technologies development
International Nuclear Information System (INIS)
Tikhonchev, M.Yu.; Shimansky, G.A.; Lebedeva, E.E.; Lichadeev, V. V.; Ryazanov, D.K.; Tellin, A.I.
2001-01-01
In the report, the role and purposes of computer simulation in the development of nuclear technologies are discussed. The authors consider such applications of computer simulation as nuclear safety research, optimization of the technical and economic parameters of operating nuclear plants, planning and support of reactor experiments, research and design of new devices and technologies, and the design and development of simulators for training operating personnel. Among these applications, the following aspects of computer simulation are discussed in the report: neutron-physical, thermal and hydrodynamic models; simulation of isotope composition change and damage dose accumulation in materials under irradiation; and simulation of reactor control structures. (authors)
Supporting hypothesis generation by learners exploring an interactive computer simulation
van Joolingen, Wouter; de Jong, Anthonius J.M.
1992-01-01
Computer simulations provide environments enabling exploratory learning. Research has shown that these types of learning environments are promising applications of computer assisted learning but also that they introduce complex learning settings, involving a large number of learning processes. This
Simulation of quantum many-body systems and quantum computer
International Nuclear Information System (INIS)
Long Lugui
2010-01-01
Benioff and Feynman independently proposed the quantum computer, motivated by the need for reversible computing and for the simulation of many-body systems. In this talk, I will briefly review the development of the quantum computer, and report the study of many-body interactions in simple quantum computers and related developments. (authors)
Factors promoting engaged exploration with computer simulations
Directory of Open Access Journals (Sweden)
Noah S. Podolefsky
2010-10-01
This paper extends prior research on student use of computer simulations (sims) to engage with and explore science topics, in this case wave interference. We describe engaged exploration: a process that involves students actively interacting with educational materials, sense making, and exploring primarily via their own questioning. We analyze interviews with college students using PhET sims in order to demonstrate engaged exploration, and to identify factors that can promote this type of inquiry. With minimal explicit guidance, students explore the topic of wave interference in ways that bear similarity to how scientists explore phenomena. PhET sims are flexible tools which allow students to choose their own learning path, but also provide constraints such that students' choices are generally productive. This type of inquiry is supported by sim features such as concrete connections to the real world, representations that are not available in the real world, analogies to help students make meaning of and connect across multiple representations and phenomena, and a high level of interactivity with real-time, dynamic feedback from the sim. These features of PhET sims enable students to pose questions and answer them in ways that may not be supported by more traditional educational materials.
Teaching Computer Organization and Architecture Using Simulation and FPGA Applications
D. K.M. Al-Aubidy
2007-01-01
This paper presents the design concepts and realization of incorporating micro-operation simulation and FPGA implementation into a teaching tool for computer organization and architecture. This teaching tool helps computer engineering and computer science students to become practically familiar with computer organization and architecture through the development of their own instruction sets, computer programming and interfacing experiments. A two-pass assembler has been designed and implemented...
Evaluation of Computer Simulations for Teaching Apparel Merchandising Concepts.
Jolly, Laura D.; Sisler, Grovalynn
1988-01-01
The study developed and evaluated computer simulations for teaching apparel merchandising concepts. Evaluation results indicated that teaching method (computer simulation versus case study) does not significantly affect cognitive learning. Student attitudes varied, however, according to topic (profitable merchandising analysis versus retailing…
The visual simulators for architecture and computer organization learning
Nikolić Boško; Grbanović Nenad; Đorđević Jovan
2009-01-01
The paper proposes an effective method of distance learning for computer architecture and organization. The proposed method is based on a software system that can be applied in any course in this field. Within this system, students can observe simulations of already created computer systems. The system also provides for the creation and simulation of switch systems.
Computer Simulation (Microcultures): An Effective Model for Multicultural Education.
Nelson, Jorge O.
This paper presents a rationale for using high-fidelity computer simulation in planning for and implementing effective multicultural education strategies. Using computer simulation, educators can begin to understand and plan for the concept of cultural sensitivity in delivering instruction. The model promises to emphasize teachers' understanding…
Overview of Computer Simulation Modeling Approaches and Methods
Robert E. Manning; Robert M. Itami; David N. Cole; Randy Gimblett
2005-01-01
The field of simulation modeling has grown greatly with recent advances in computer hardware and software. Much of this work has involved large scientific and industrial applications for which substantial financial resources are available. However, advances in object-oriented programming and simulation methodology, concurrent with dramatic increases in computer...
A note on simulated annealing to computer laboratory scheduling ...
African Journals Online (AJOL)
The Simulated Annealing algorithm is used to solve a real-life problem of computer laboratory scheduling in order to maximize the use of scarce and insufficient resources. KEY WORDS: Simulated Annealing (SA), Computer Laboratory Scheduling, Statistical Thermodynamics, Energy Function, Heuristic. Global Jnl of ...
Simulation in computer forensics teaching: the student experience
Crellin, Jonathan; Adda, Mo; Duke-Williams, Emma; Chandler, Jane
2011-01-01
The use of simulation in teaching computing is well established, with digital forensic investigation being a subject area where the range of simulation required is both wide and varied demanding a corresponding breadth of fidelity. Each type of simulation can be complex and expensive to set up resulting in students having only limited opportunities to participate and learn from the simulation. For example students' participation in mock trials in the University mock courtroom or in simulation...
Simulation of the stress computation in shells
Salama, M.; Utku, S.
1978-01-01
A self-teaching computer program is described, whereby the stresses in thin shells can be computed with good accuracy using the best fit approach. The program is designed for use in interactive game mode to allow the structural engineer to learn about (1) the major sources of difficulties and associated errors in the computation of stresses in thin shells, (2) possible ways to reduce the errors, and (3) trade-off between computational cost and accuracy. Included are derivation of the computational approach, program description, and several examples illustrating the program usage.
Challenges & Roadmap for Beyond CMOS Computing Simulation.
Energy Technology Data Exchange (ETDEWEB)
Rodrigues, Arun F. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Frank, Michael P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)
2017-12-01
Simulating HPC systems is a difficult task and the emergence of “Beyond CMOS” architectures and execution models will increase that difficulty. This document presents a “tutorial” on some of the simulation challenges faced by conventional and non-conventional architectures (Section 1) and goals and requirements for simulating Beyond CMOS systems (Section 2). These provide background for proposed short- and long-term roadmaps for simulation efforts at Sandia (Sections 3 and 4). Additionally, a brief explanation of a proof-of-concept integration of a Beyond CMOS architectural simulator is presented (Section 2.3).
GPU-accelerated micromagnetic simulations using cloud computing
International Nuclear Information System (INIS)
Jermain, C.L.; Rowlands, G.E.; Buhrman, R.A.; Ralph, D.C.
2016-01-01
Highly parallel graphics processing units (GPUs) can improve the speed of micromagnetic simulations significantly as compared to conventional computing using central processing units (CPUs). We present a strategy for performing GPU-accelerated micromagnetic simulations by utilizing cost-effective GPU access offered by cloud computing services with an open-source Python-based program for running the MuMax3 micromagnetics code remotely. We analyze the scaling and cost benefits of using cloud computing for micromagnetics. - Highlights: • The benefits of cloud computing for GPU-accelerated micromagnetics are examined. • We present the MuCloud software for running simulations on cloud computing. • Simulation run times are measured to benchmark cloud computing performance. • Comparison benchmarks are analyzed between CPU and GPU based solvers.
General-purpose parallel simulator for quantum computing
International Nuclear Information System (INIS)
Niwa, Jumpei; Matsumoto, Keiji; Imai, Hiroshi
2002-01-01
With current technologies, it seems very difficult to implement quantum computers with many qubits. It is therefore important to simulate quantum algorithms and circuits on existing computers. However, for a large-size problem, the simulation often requires more computational power than is available from sequential processing. Therefore, simulation methods for parallel processors are required. We have developed a general-purpose simulator for quantum algorithms/circuits on a parallel computer (Sun Enterprise 4500). It can simulate algorithms/circuits with up to 30 qubits. In order to test the efficiency of our proposed methods, we have simulated Shor's factorization algorithm and Grover's database search, and we have analyzed the robustness of the corresponding quantum circuits in the presence of both decoherence and operational errors. The corresponding results, statistics, and analyses are presented in this paper
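The core mechanism of a state-vector simulator like the one described, and the reason roughly 30 qubits is the practical ceiling (the state vector doubles with each added qubit), can be sketched in a few lines. This is a generic illustration, not the authors' parallel implementation; the function and variable names are ours.

```python
import numpy as np

def apply_gate(state, gate, target, n_qubits):
    """Apply a 2x2 gate to one qubit of an n-qubit state vector.

    Reshaping the 2**n amplitudes into an n-axis tensor turns the gate
    application into one small contraction; memory, not arithmetic, then
    becomes the limit (2**30 complex amplitudes already occupy ~16 GB).
    """
    state = state.reshape([2] * n_qubits)
    state = np.tensordot(gate, state, axes=([1], [target]))
    state = np.moveaxis(state, 0, target)  # put the target axis back in place
    return state.reshape(-1)

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

n = 3
psi = np.zeros(2 ** n)
psi[0] = 1.0                       # start in |000>
for q in range(n):                 # H on every qubit -> uniform superposition
    psi = apply_gate(psi, H, q, n)
```

Parallelizing such a simulator amounts to distributing slices of this single large array across processors, which is why the authors target a shared-memory machine.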
Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation
Stocker, John C.; Golomb, Andrew M.
2011-01-01
Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two part model framework characterizes both the demand using a probability distribution for each type of service request as well as enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
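The abstract does not give the model's internals; as a minimal illustration of the queueing building block that such discrete-event performance models rest on, here is the departure-time recurrence for a single-server FIFO queue (names and values are illustrative only):

```python
def single_server_queue(arrival_times, service_times):
    """Departure times of a single-server FIFO queue.

    Service of each request starts at max(its arrival time, the previous
    departure time) -- the elementary resource-contention rule that a full
    discrete-event simulation generalizes to many servers and request types.
    """
    departures, last_departure = [], 0.0
    for arrive, service in zip(arrival_times, service_times):
        last_departure = max(arrive, last_departure) + service
        departures.append(last_departure)
    return departures

# three requests arriving faster than the server can finish them
deps = single_server_queue([0, 1, 2], [2, 2, 2])  # queueing delay builds up
```

In a cloud-architecture model, the arrival times would be drawn from the per-request-type probability distributions the abstract mentions, and the service capacity would encode the resource constraints.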
Alternative energy technologies an introduction with computer simulations
Buxton, Gavin
2014-01-01
Contents: Introduction to Alternative Energy Sources; Global Warming; Pollution; Solar Cells; Wind Power; Biofuels; Hydrogen Production and Fuel Cells; Introduction to Computer Modeling; Brief History of Computer Simulations; Motivation and Applications of Computer Models; Using Spreadsheets for Simulations; Typing Equations into Spreadsheets; Functions Available in Spreadsheets; Random Numbers; Plotting Data; Macros and Scripts; Interpolation and Extrapolation; Numerical Integration and Diffe...
Quantum computer gate simulations | Dada | Journal of the Nigerian ...
African Journals Online (AJOL)
A new interactive simulator for Quantum Computation has been developed for simulation of the universal set of quantum gates and for construction of new gates of up to 3 qubits. The simulator also automatically generates an equivalent quantum circuit for any arbitrary unitary transformation on a qubit. Available quantum ...
CPU SIM: A Computer Simulator for Use in an Introductory Computer Organization-Architecture Class.
Skrein, Dale
1994-01-01
CPU SIM, an interactive low-level computer simulation package that runs on the Macintosh computer, is described. The program is designed for instructional use in the first or second year of undergraduate computer science, to teach various features of typical computer organization through hands-on exercises. (MSE)
Computer simulation of grain growth in HAZ
Gao, Jinhua
Two different models for Monte Carlo simulation of normal grain growth in metals and alloys were developed. Each simulation model was based on a different approach to couple the Monte Carlo simulation time to real time-temperature. These models demonstrated the applicability of Monte Carlo simulation to grain growth in materials processing. A grain boundary migration (GBM) model coupled the Monte Carlo simulation to a first principle grain boundary migration model. The simulation results, by applying this model to isothermal grain growth in zone-refined tin, showed good agreement with experimental results. An experimental data based (EDB) model coupled the Monte Carlo simulation with grain growth kinetics obtained from the experiment. The results of the application of the EDB model to the grain growth during continuous heating of a beta titanium alloy correlated well with experimental data. In order to acquire the grain growth kinetics from the experiment, a new mathematical method was developed and utilized to analyze the experimental data on isothermal grain growth. Grain growth in the HAZ of 0.2% Cu-Al alloy was successfully simulated using the EDB model combined with grain growth kinetics obtained from the experiment and measured thermal cycles from the welding process. The simulated grain size distribution in the HAZ was in good agreement with experimental results. The pinning effect of second phase particles on grain growth was also simulated in this work. The simulation results confirmed that by introducing the variable R, degree of contact between grain boundaries and second phase particles, the Zener pinning model can be modified as $D/r = K/(Rf)$, where $D$ is the pinned grain size, $r$ the mean size of second-phase particles, $K$ a constant, and $f$ the area fraction (or the volume fraction in 3-D) of second phase.
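The abstract does not spell out the authors' Monte Carlo scheme; simulations of this kind are commonly built on the q-state Potts model, so the following zero-temperature sweep is a generic sketch of that standard approach, not the GBM or EDB model itself (grid size, orientation count, and seed are illustrative):

```python
import random

def potts_mc_step(grid, rng):
    """One Monte Carlo sweep of a q-state Potts grain-growth model.

    Each attempt reorients one site to a neighbour's grain; in the
    zero-temperature limit used here, flips that would increase the
    grain-boundary energy are rejected, so boundaries shrink over time.
    """
    n = len(grid)
    for _ in range(n * n):
        i, j = rng.randrange(n), rng.randrange(n)
        nbrs = [grid[(i + di) % n][(j + dj) % n]
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))]
        new = rng.choice(nbrs)                        # candidate orientation
        before = sum(grid[i][j] != s for s in nbrs)   # unlike-neighbour bonds now
        after = sum(new != s for s in nbrs)           # bonds after the flip
        if after <= before:                           # never raise the energy
            grid[i][j] = new

def boundary_energy(grid):
    """Count unlike-neighbour bonds (periodic boundaries)."""
    n = len(grid)
    return sum(grid[i][j] != grid[(i + 1) % n][j] for i in range(n) for j in range(n)) \
         + sum(grid[i][j] != grid[i][(j + 1) % n] for i in range(n) for j in range(n))

rng = random.Random(0)
grid = [[rng.randrange(8) for _ in range(16)] for _ in range(16)]  # 8 orientations
e0 = boundary_energy(grid)
for _ in range(5):
    potts_mc_step(grid, rng)
e1 = boundary_energy(grid)  # boundary energy can only stay or drop at T = 0
```

Coupling the number of such sweeps to real time-temperature is exactly the open problem the two models in the abstract address in different ways.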
Computer simulation of proton channelling in silicon
Indian Academy of Sciences (India)
2000-06-12
This paper reports an indigenously developed computer code for channelling of fast ions in crystals using the Vineyard model and a screened binary Coulombic potential. For the system in figure 7, the analysis yields a value of 325 Å, in fair agreement with the computed value.
Sophistication of computational science and fundamental physics simulations
International Nuclear Information System (INIS)
Ishiguro, Seiji; Ito, Atsushi; Usami, Shunsuke; Ohtani, Hiroaki; Sakagami, Hitoshi; Toida, Mieko; Hasegawa, Hiroki; Horiuchi, Ritoku; Miura, Hideaki
2016-01-01
Numerical experimental reactor research project is composed of the following studies: (1) nuclear fusion simulation research with a focus on specific physical phenomena of specific equipment, (2) research on advanced simulation method to increase predictability or expand its application range based on simulation, (3) visualization as the foundation of simulation research, (4) research for advanced computational science such as parallel computing technology, and (5) research aiming at elucidation of fundamental physical phenomena not limited to specific devices. Specifically, a wide range of researches with medium- to long-term perspectives are being developed: (1) virtual reality visualization, (2) upgrading of computational science such as multilayer simulation method, (3) kinetic behavior of plasma blob, (4) extended MHD theory and simulation, (5) basic plasma process such as particle acceleration due to interaction of wave and particle, and (6) research related to laser plasma fusion. This paper reviews the following items: (1) simultaneous visualization in virtual reality space, (2) multilayer simulation of collisionless magnetic reconnection, (3) simulation of microscopic dynamics of plasma coherent structure, (4) Hall MHD simulation of LHD, (5) numerical analysis for extension of MHD equilibrium and stability theory, (6) extended MHD simulation of 2D RT instability, (7) simulation of laser plasma, (8) simulation of shock wave and particle acceleration, and (9) study on simulation of homogeneous isotropic MHD turbulent flow. (A.O.)
Fluid dynamics theory, computation, and numerical simulation
Pozrikidis, C
2017-01-01
This book provides an accessible introduction to the basic theory of fluid mechanics and computational fluid dynamics (CFD) from a modern perspective that unifies theory and numerical computation. Methods of scientific computing are introduced alongside with theoretical analysis and MATLAB® codes are presented and discussed for a broad range of topics: from interfacial shapes in hydrostatics, to vortex dynamics, to viscous flow, to turbulent flow, to panel methods for flow past airfoils. The third edition includes new topics, additional examples, solved and unsolved problems, and revised images. It adds more computational algorithms and MATLAB programs. It also incorporates discussion of the latest version of the fluid dynamics software library FDLIB, which is freely available online. FDLIB offers an extensive range of computer codes that demonstrate the implementation of elementary and advanced algorithms and provide an invaluable resource for research, teaching, classroom instruction, and self-study. This ...
Innovation of the computer system for the WWER-440 simulator
International Nuclear Information System (INIS)
Schrumpf, L.
1988-01-01
The configuration of the WWER-440 simulator computer system consists of four SMEP computers. The basic data processing unit consists of two interlinked SM 52/11.M1 computers with 1 MB of main memory. This part of the computer system of the simulator controls the operation of the entire simulator, processes the programs of technology behavior simulation, of the unit information system and of other special systems, guarantees program support and the operation of the instructor's console. An SM 52/11 computer with 256 kB of main memory is connected to each unit. It is used as a communication unit for data transmission using the DASIO 600 interface. Semigraphic color displays are based on the microprocessor modules of the SM 50/40 and SM 53/10 kit supplemented with a modified TESLA COLOR 110 ST tv receiver. (J.B.). 1 fig
Methodology of modeling and measuring computer architectures for plasma simulations
Wang, L. P. T.
1977-01-01
A brief introduction to plasma simulation using computers and the difficulties on currently available computers is given. Through the use of an analyzing and measuring methodology - SARA, the control flow and data flow of a particle simulation model REM2-1/2D are exemplified. After recursive refinements the total execution time may be greatly shortened and a fully parallel data flow can be obtained. From this data flow, a matched computer architecture or organization could be configured to achieve the computation bound of an application problem. A sequential type simulation model, an array/pipeline type simulation model, and a fully parallel simulation model of a code REM2-1/2D are proposed and analyzed. This methodology can be applied to other application problems which have implicitly parallel nature.
Radiotherapy Monte Carlo simulation using cloud computing technology
International Nuclear Information System (INIS)
Poole, C.M.; Cornelius, I.; Trapp, J.V.; Langton, C.M.
2012-01-01
Cloud computing allows for vast computational resources to be leveraged quickly and easily in bursts as and when required. Here we describe a technique that allows Monte Carlo radiotherapy dose calculations to be performed using GEANT4 and executed in the cloud, with relative simulation cost and completion time evaluated as a function of machine count. As expected, simulation completion time decreases as 1/n for n parallel machines, and relative simulation cost is found to be optimal where n is a factor of the total simulation time in hours. Using the technique, we demonstrate the potential usefulness of cloud computing as a solution for rapid Monte Carlo simulation for radiotherapy dose calculation without the need for dedicated local computer hardware, as a proof of principle.
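The two scaling observations, completion time falling as 1/n and cost being optimal when n divides the total simulation time in hours, follow directly from per-started-hour billing, as a small sketch shows. The billing rule here is our assumption, consistent with the abstract's conclusion rather than taken from the paper:

```python
import math

def completion_time(total_hours, n):
    """Embarrassingly parallel workload: wall-clock time scales as 1/n."""
    return total_hours / n

def relative_cost(total_hours, n):
    """Billed machine-hours relative to serial cost, assuming each of the
    n machines is charged per *started* hour (an assumed billing rule)."""
    return n * math.ceil(total_hours / n) / total_hours

T = 12  # a hypothetical 12-hour serial simulation
costs = {n: relative_cost(T, n) for n in range(1, 9)}
# cost stays at 1.0 exactly when n divides T (n = 1, 2, 3, 4, 6) and rises
# otherwise, because the last partial hour on every machine is billed whole
```

This is why choosing the machine count as a factor of the serial run time minimizes cost while still giving the full 1/n speedup.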
Computer Simulation of Angle-measuring System of Photoelectric Theodolite
International Nuclear Information System (INIS)
Zeng, L; Zhao, Z W; Song, S L; Wang, L T
2006-01-01
In this paper, a virtual test platform based on malfunction phenomena is designed, using the methods of computer simulation and numerical mask. It is used in the simulation training of the angle-measuring system of a photoelectric theodolite. Practical application proves that this platform provides good conditions for in-depth simulation training of technicians and presents a useful approach for establishing simulation platforms for other large equipment.
Computer Simulation Performed for Columbia Project Cooling System
Ahmad, Jasim
2005-01-01
This demo shows a high-fidelity simulation of the air flow in the main computer room housing the Columbia (10,024 Intel Itanium processors) system. The simulation assesses the performance of the cooling system, identifies deficiencies, and recommends modifications to eliminate them. It used two in-house software packages on NAS supercomputers: Chimera Grid Tools to generate a geometric model of the computer room, and the OVERFLOW-2 code for fluid and thermal simulation. This state-of-the-art technology can be easily extended to provide a general capability for air flow analyses in any modern computer room.
Computed radiography simulation using the Monte Carlo code MCNPX
International Nuclear Information System (INIS)
Correa, S.C.A.; Souza, E.M.; Silva, A.X.; Lopes, R.T.
2009-01-01
Simulating x-ray images has been of great interest in recent years as it makes possible an analysis of how x-ray images are affected owing to relevant operating parameters. In this paper, a procedure for simulating computed radiographic images using the Monte Carlo code MCNPX is proposed. The sensitivity curve of the BaFBr image plate detector as well as the characteristic noise of a 16-bit computed radiography system were considered during the methodology's development. The results obtained confirm that the proposed procedure for simulating computed radiographic images is satisfactory, as it allows obtaining results comparable with experimental data. (author)
Large-scale computing techniques for complex system simulations
Dubitzky, Werner; Schott, Bernard
2012-01-01
Complex systems modeling and simulation approaches are being adopted in a growing number of sectors, including finance, economics, biology, astronomy, and many more. Technologies ranging from distributed computing to specialized hardware are explored and developed to address the computational requirements arising in complex systems simulations. The aim of this book is to present a representative overview of contemporary large-scale computing technologies in the context of complex systems simulations applications. The intention is to identify new research directions in this field and
The Forward Observer Personal Computer Simulator (FOPCSIM)
2002-09-01
Humans, computers and wizards human (simulated) computer interaction
Fraser, Norman; McGlashan, Scott; Wooffitt, Robin
2013-01-01
Using data taken from a major European Union funded project on speech understanding, the SunDial project, this book considers current perspectives on human-computer interaction and argues for the value of an approach taken from sociology, based on conversation analysis.
A portable high-quality random number generator for lattice field theory simulations
International Nuclear Information System (INIS)
Luescher, M.
1993-09-01
The theory underlying a proposed random number generator for numerical simulations in elementary particle physics and statistical mechanics is discussed. The generator is based on an algorithm introduced by Marsaglia and Zaman, with an important added feature leading to demonstrably good statistical properties. It can be implemented exactly on any computer complying with the IEEE-754 standard for single precision floating point arithmetic. (orig.)
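A sketch of the Marsaglia-Zaman subtract-with-borrow recurrence the generator builds on, using base 2^24 (the IEEE-754 single-precision mantissa width that makes an exact floating-point implementation possible) and lags 24 and 10. This illustration omits the added feature (decimation of outputs) that gives the proposed generator its demonstrably good statistical properties, and the seeding scheme here is our own:

```python
class SubtractWithBorrow:
    """Marsaglia-Zaman subtract-with-borrow recurrence (illustrative only).

    x_n = (x_{n-S} - x_{n-R} - carry) mod B, with carry in {0, 1}.
    B = 2**24 fits a single-precision mantissa exactly.
    """
    B, R, S = 2 ** 24, 24, 10

    def __init__(self, seed=42):
        # Fill the lag table with a simple LCG; any non-degenerate fill works.
        state, self.xs = seed, []
        for _ in range(self.R):
            state = (state * 69069 + 1) % 2 ** 32
            self.xs.append(state % self.B)
        self.carry = 0

    def next_int(self):
        d = self.xs[-self.S] - self.xs[-self.R] - self.carry
        self.carry = 1 if d < 0 else 0   # borrow propagates to the next step
        x = d % self.B                   # wrap negative differences into [0, B)
        self.xs.append(x)
        del self.xs[0]                   # keep exactly R lagged values
        return x

    def next_float(self):
        return self.next_int() / self.B  # uniform in [0, 1)

gen = SubtractWithBorrow(seed=1)
sample = [gen.next_float() for _ in range(5)]
```

Because every operation stays below 2^24, the same sequence is reproduced bit-for-bit on any machine with compliant single-precision arithmetic, which is the portability property the abstract emphasizes.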
Biocellion: accelerating computer simulation of multicellular biological system models.
Kang, Seunghwa; Kahan, Simon; McDermott, Jason; Flann, Nicholas; Shmulevich, Ilya
2014-11-01
Biological system behaviors are often the outcome of complex interactions among a large number of cells and their biotic and abiotic environment. Computational biologists attempt to understand, predict and manipulate biological system behavior through mathematical modeling and computer simulation. Discrete agent-based modeling (in combination with high-resolution grids to model the extracellular environment) is a popular approach for building biological system models. However, the computational complexity of this approach forces computational biologists to resort to coarser resolution approaches to simulate large biological systems. High-performance parallel computers have the potential to address the computing challenge, but writing efficient software for parallel computers is difficult and time-consuming. We have developed Biocellion, a high-performance software framework, to solve this computing challenge using parallel computers. To support a wide range of multicellular biological system models, Biocellion asks users to provide their model specifics by filling the function body of pre-defined model routines. Using Biocellion, modelers without parallel computing expertise can efficiently exploit parallel computers with less effort than writing sequential programs from scratch. We simulate cell sorting, microbial patterning and a bacterial system in soil aggregate as case studies. Biocellion runs on x86 compatible systems with the 64 bit Linux operating system and is freely available for academic use. Visit http://biocellion.com for additional information. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Modelling of dusty plasma properties by computer simulation methods
Energy Technology Data Exchange (ETDEWEB)
Baimbetov, F B [IETP, Al Farabi Kazakh National University, 96a, Tole bi St, Almaty 050012 (Kazakhstan); Ramazanov, T S [IETP, Al Farabi Kazakh National University, 96a, Tole bi St, Almaty 050012 (Kazakhstan); Dzhumagulova, K N [IETP, Al Farabi Kazakh National University, 96a, Tole bi St, Almaty 050012 (Kazakhstan); Kadyrsizov, E R [Institute for High Energy Densities of RAS, Izhorskaya 13/19, Moscow 125412 (Russian Federation); Petrov, O F [IETP, Al Farabi Kazakh National University, 96a, Tole bi St, Almaty 050012 (Kazakhstan); Gavrikov, A V [IETP, Al Farabi Kazakh National University, 96a, Tole bi St, Almaty 050012 (Kazakhstan)
2006-04-28
Computer simulation of dusty plasma properties is performed. The radial distribution functions, the diffusion coefficient are calculated on the basis of the Langevin dynamics. A comparison with the experimental data is made.
Tutorial: Parallel Computing of Simulation Models for Risk Analysis.
Reilly, Allison C; Staid, Andrea; Gao, Michael; Guikema, Seth D
2016-10-01
Simulation models are widely used in risk analysis to study the effects of uncertainties on outcomes of interest in complex problems. Often, these models are computationally complex and time consuming to run. This latter point may be at odds with time-sensitive evaluations or may limit the number of parameters that are considered. In this article, we give an introductory tutorial focused on parallelizing simulation code to better leverage modern computing hardware, enabling risk analysts to better utilize simulation-based methods for quantifying uncertainty in practice. This article is aimed primarily at risk analysts who use simulation methods but do not yet utilize parallelization to decrease the computational burden of these models. The discussion is focused on conceptual aspects of embarrassingly parallel computer code and software considerations. Two complementary examples are shown using the languages MATLAB and R. A brief discussion of hardware considerations is located in the Appendix. © 2016 Society for Risk Analysis.
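A minimal example of the embarrassingly parallel pattern the tutorial describes, shown here in Python (the article's worked examples use MATLAB and R); the function names and replication counts are illustrative. Each replication is independent, so replications map directly onto worker processes:

```python
import random
from multiprocessing import Pool

def one_replication(seed):
    """One independent simulation replication: a Monte Carlo estimate of pi."""
    rng = random.Random(seed)  # per-replication RNG so workers don't share state
    hits = sum(rng.random() ** 2 + rng.random() ** 2 < 1.0 for _ in range(10_000))
    return 4.0 * hits / 10_000

def run_parallel(n_reps=8, workers=2):
    """Map independent replications onto worker processes and average them."""
    with Pool(workers) as pool:
        estimates = pool.map(one_replication, range(n_reps))
    return sum(estimates) / len(estimates)

if __name__ == "__main__":
    print(run_parallel())
```

Giving each replication its own seeded generator is the key design choice: it keeps the parallel run reproducible and avoids the correlated streams that arise when workers inherit one shared RNG state.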
Validation of Computational Fluid Dynamics Simulations for Realistic Flows (Preprint)
National Research Council Canada - National Science Library
Davoudzadeh, Farhad
2007-01-01
Strategies used to verify and validate computational fluid dynamics (CFD) calculations are described via case studies of realistic flow simulations, each representing a complex flow physics and complex geometry...
On architectural acoustic design using computer simulation
DEFF Research Database (Denmark)
Schmidt, Anne Marie Due; Kirkegaard, Poul Henning
2004-01-01
acoustic design process. The emphasis is put on the first three out of five phases in the working process of the architect, and a case study is carried out in which each phase is represented by typical results, as exemplified with reference to the design of Bagsværd Church by Jørn Utzon. The paper ... discusses the advantages and disadvantages of the programme in each phase compared to the works of architects not using acoustic simulation programmes. The conclusion of the paper is that the application of acoustic simulation programs is most beneficial in the last of three phases but an application...
Understanding Islamist political violence through computational social simulation
Energy Technology Data Exchange (ETDEWEB)
Watkins, Jennifer H [Los Alamos National Laboratory; Mackerrow, Edward P [Los Alamos National Laboratory; Patelli, Paolo G [Los Alamos National Laboratory; Eberhardt, Ariane [Los Alamos National Laboratory; Stradling, Seth G [Los Alamos National Laboratory
2008-01-01
Understanding the process that enables political violence is of great value in reducing the future demand for and support of violent opposition groups. Methods are needed that allow alternative scenarios and counterfactuals to be scientifically researched. Computational social simulation shows promise in developing 'computer experiments' that would be unfeasible or unethical in the real world. Additionally, the process of modeling and simulation reveals and challenges assumptions that may not be noted in theories, exposes areas where data is not available, and provides a rigorous, repeatable, and transparent framework for analyzing the complex dynamics of political violence. This paper demonstrates the computational modeling process using two simulation techniques: system dynamics and agent-based modeling. The benefits and drawbacks of both techniques are discussed. In developing these social simulations, we discovered that the social science concepts and theories needed to accurately simulate the associated psychological and social phenomena were lacking.
A simulator for quantum computer hardware
Michielsen, K.F.L.; de Raedt, H.A.; De Raedt, K.
We present new examples of the use of the quantum computer (QC) emulator. For educational purposes we describe the implementation of the CNOT and Toffoli gate, two basic building blocks of a QC, on a three qubit NMR-like QC.
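Outside any NMR context, the two gates named above act on computational-basis states as simple bit-level permutations. A minimal sketch (an illustration, not the emulator described in the abstract), encoding a basis state as an integer whose bit k is qubit k:

```python
def apply_cnot(index: int, control: int, target: int) -> int:
    """CNOT on a computational-basis state encoded as a bit string:
    flip the target bit iff the control bit is 1."""
    if (index >> control) & 1:
        index ^= 1 << target
    return index

def apply_toffoli(index: int, c1: int, c2: int, target: int) -> int:
    """Toffoli: flip the target bit iff both control bits are 1."""
    if ((index >> c1) & 1) and ((index >> c2) & 1):
        index ^= 1 << target
    return index

# CNOT maps |10> to |11> (qubit 1 is the control, qubit 0 the target):
assert apply_cnot(0b10, control=1, target=0) == 0b11
# Toffoli maps |110> to |111> (qubits 2 and 1 are the controls):
assert apply_toffoli(0b110, c1=2, c2=1, target=0) == 0b111
```

A full state-vector emulator would apply the same permutation to the amplitudes of a superposition; on basis states the permutation alone captures the gate's truth table.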
Comparison of radiographic technique by computer simulation
International Nuclear Information System (INIS)
Brochi, M.A.C.; Ghilardi Neto, T.
1989-01-01
A computational algorithm to compare radiographic techniques (kVp, mAs and filters) is developed, based on fixing the parameters that define the image, such as optical density and contrast. Before the experiment, the results were applied to a radiograph of the thorax. (author) [pt
Advanced Simulation and Computing Business Plan
Energy Technology Data Exchange (ETDEWEB)
Rummel, E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)
2015-07-09
To maintain a credible nuclear weapons program, the National Nuclear Security Administration’s (NNSA’s) Office of Defense Programs (DP) needs to make certain that the capabilities, tools, and expert staff are in place and are able to deliver validated assessments. This requires a complete and robust simulation environment backed by an experimental program to test ASC Program models. This ASC Business Plan document encapsulates a complex set of elements, each of which is essential to the success of the simulation component of the Nuclear Security Enterprise. The ASC Business Plan addresses the hiring, mentoring, and retaining of programmatic technical staff responsible for building the simulation tools of the nuclear security complex. The ASC Business Plan describes how the ASC Program engages with industry partners, upon whom the ASC Program relies for today’s and tomorrow’s high-performance architectures. Each piece in this chain is essential to assure policymakers, who must make decisions based on the results of simulations, that they are receiving all the actionable information they need.
Role of computational efficiency in process simulation
Directory of Open Access Journals (Sweden)
Kurt Strand
1989-07-01
It is demonstrated how efficient numerical algorithms may be combined to yield a powerful environment for analysing and simulating dynamic systems. The importance of using efficient numerical algorithms is emphasized and demonstrated through examples from the petrochemical industry.
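One generic example of the kind of efficient numerical building block such an environment combines (a sketch, not the algorithms of the cited work) is a fourth-order Runge-Kutta step for a dynamic system x' = f(t, x):

```python
import math

def rk4_step(f, t, x, h):
    """One classical fourth-order Runge-Kutta step for x' = f(t, x)."""
    k1 = f(t, x)
    k2 = f(t + h / 2, x + h / 2 * k1)
    k3 = f(t + h / 2, x + h / 2 * k2)
    k4 = f(t + h, x + h * k3)
    return x + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

# Exponential decay x' = -x with x(0) = 1; the exact solution is exp(-t).
x, t, h = 1.0, 0.0, 0.1
for _ in range(10):
    x = rk4_step(lambda t, x: -x, t, x, h)
    t += h
assert abs(x - math.exp(-1.0)) < 1e-5
```

The fourth-order accuracy is what buys efficiency: large steps at small error, compared with first-order Euler integration.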
Computer Simulation Studies of Trishomocubane Heptapeptide of ...
African Journals Online (AJOL)
As part of an extension on the cage peptide chemistry, the present work involves an assessment of the conformational profile of trishomocubane heptapeptide of the type Ac-Ala3-Tris-Ala3-NHMe using molecular dynamics (MD) simulations. All MD protocols were explored within the framework of a molecular mechanics ...
Quantum chemistry simulation on quantum computers: theories and experiments.
Lu, Dawei; Xu, Boruo; Xu, Nanyang; Li, Zhaokai; Chen, Hongwei; Peng, Xinhua; Xu, Ruixue; Du, Jiangfeng
2012-07-14
It has been claimed that quantum computers can mimic quantum systems efficiently, with only polynomial scaling of resources. Traditionally, such simulations are carried out numerically on classical computers, which are inevitably confronted with an exponential growth of required resources as the size of the quantum system increases. Quantum computers avoid this problem and thus provide a possible solution for large quantum systems. In this paper, we first discuss the ideas of quantum simulation, the background of quantum simulators, their categories, and the development in both theory and experiment. We then present a brief introduction to quantum chemistry evaluated via classical computers, followed by typical procedures of quantum simulation towards quantum chemistry. Reviewed are not only theoretical proposals but also proof-of-principle experimental implementations on a small quantum computer, including the evaluation of the static molecular eigenenergy and the simulation of chemical reaction dynamics. Although experimental development still lags behind theory, we give prospects and suggestions for future experiments. We anticipate that in the near future quantum simulation will become a powerful tool for quantum chemistry, surpassing classical computations.
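To give a minimal sense of what "evaluation of the static molecular eigenenergy" involves, a two-level model Hamiltonian can be diagonalised in closed form (a toy illustration only; the systems and methods in the paper are far more elaborate):

```python
import math

def ground_energy(h11: float, h22: float, h12: float) -> float:
    """Lower eigenvalue of the 2 x 2 real symmetric Hamiltonian
    [[h11, h12], [h12, h22]], by closed-form diagonalisation."""
    mean = (h11 + h22) / 2.0
    return mean - math.sqrt(((h11 - h22) / 2.0) ** 2 + h12 ** 2)

# With degenerate diagonal entries, the coupling splits the levels by 2*|h12|:
assert abs(ground_energy(-1.0, -1.0, 0.5) - (-1.5)) < 1e-12
```

A quantum simulation of the same quantity would instead encode the Hamiltonian into gate operations and read the eigenenergy out via phase estimation; the classical 2 x 2 case merely fixes what is being computed.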
A computer simulator for development of engineering system design methodologies
Padula, S. L.; Sobieszczanski-Sobieski, J.
1987-01-01
A computer program designed to simulate and improve engineering system design methodology is described. The simulator mimics the qualitative behavior and data couplings occurring among the subsystems of a complex engineering system. It eliminates the engineering analyses in the subsystems by replacing them with judiciously chosen analytical functions. With the cost of analysis eliminated, the simulator is used for experimentation with a large variety of candidate algorithms for multilevel design optimization to choose the best ones for the actual application. Thus, the simulator serves as a development tool for multilevel design optimization strategy. The simulator concept, implementation, and status are described and illustrated with examples.
Probability: Actual Trials, Computer Simulations, and Mathematical Solutions.
Walton, Karen Doyle; Walton, J. Doyle
The purpose of this teaching unit is to approach elementary probability problems in three ways. First, actual trials are performed and results are recorded. Second, a simple computer simulation of the problem provided on diskette and written for Apple IIe and IIc computers, is run several times. Finally, the mathematical solution of the problem is…
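The unit's "simulate, then solve" approach can be sketched for a hypothetical example problem (not necessarily one from the original diskette): the probability of at least one head in three coin tosses, estimated by simulation and compared against the exact value.

```python
import random

def trial(rng) -> bool:
    """One experiment: toss three fair coins; success if any comes up heads."""
    return any(rng.random() < 0.5 for _ in range(3))

def estimate(n: int, seed: int = 0) -> float:
    """Relative frequency of success over n simulated experiments."""
    rng = random.Random(seed)
    return sum(trial(rng) for _ in range(n)) / n

exact = 1 - 0.5 ** 3          # mathematical solution: 7/8
approx = estimate(100_000)    # computer simulation
assert abs(approx - exact) < 0.01
```

The gap between `approx` and `exact` shrinks roughly as 1/sqrt(n), which is itself a useful classroom observation.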
Teaching Macroeconomics with a Computer Simulation. Final Report.
Dolbear, F. Trenery, Jr.
The study of macroeconomics--the determination and control of aggregative variables such as gross national product, unemployment and inflation--may be facilitated by the use of a computer simulation policy game. An aggregative model of the economy was constructed and programed for a computer and (hypothetical) historical data were generated. The…
Assessing Practical Skills in Physics Using Computer Simulations
Walsh, Kevin
2018-01-01
Computer simulations have been used very effectively for many years in the teaching of science but the focus has been on cognitive development. This study, however, is an investigation into the possibility that a student's experimental skills in the real-world environment can be judged via the undertaking of a suitably chosen computer simulation…
Effect of computer game playing on baseline laparoscopic simulator skills.
Halvorsen, Fredrik H; Cvancarova, Milada; Fosse, Erik; Mjåland, Odd
2013-08-01
Studies examining the possible association between computer game playing and laparoscopic performance in general have yielded conflicting results, and no relationship between computer game playing and baseline performance on laparoscopic simulators has been established. The aim of this study was to examine the possible association between previous and present computer game playing and baseline performance on a virtual reality laparoscopic simulator in a sample of potential future medical students. The participating students completed a questionnaire covering the weekly amount and type of computer game playing activity during the previous year and 3 years ago. They then performed 2 repetitions of 2 tasks ("gallbladder dissection" and "traverse tube") on a virtual reality laparoscopic simulator. Performance on the simulator was then analyzed for association with computer game experience. The study was conducted at a local high school in Norway. Forty-eight students from 2 high school classes volunteered to participate in the study. No association between prior and present computer game playing and baseline performance was found. The results were similar both for action game playing and for computer game playing in general, whether prior or present. Our results indicate that prior and present computer game playing may not affect baseline performance in a virtual reality simulator.
Factors cost effectively improved using computer simulations of ...
African Journals Online (AJOL)
Factors cost effectively improved using computer simulations of maize yields in semi-arid Sub-Saharan Africa. ... Abstract. Achieving food security is a challenge for the developed and developing world. ... Most African farmers do not have the computer resources or expertise to implement these types of technology.
Computer simulation program is adaptable to industrial processes
Schultz, F. E.
1966-01-01
The reaction kinetics ablation program (REKAP), developed to simulate the ablation of various materials, provides mathematical formulations for computer programs which can simulate certain industrial processes. The programs are based on the use of nonsymmetrical difference equations that are employed to solve complex partial differential equation systems.
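The use of difference equations to solve partial differential equation systems can be illustrated with a minimal explicit scheme for the 1-D heat equation (a generic sketch; the REKAP formulations themselves are not reproduced here):

```python
def heat_step(u, alpha, dx, dt):
    """One explicit time step of du/dt = alpha * d2u/dx2 with fixed ends."""
    r = alpha * dt / dx ** 2
    assert r <= 0.5, "stability limit of the explicit scheme"
    return ([u[0]]
            + [u[i] + r * (u[i + 1] - 2 * u[i] + u[i - 1])
               for i in range(1, len(u) - 1)]
            + [u[-1]])

u = [0.0] * 5 + [1.0] + [0.0] * 5   # an initial hot spot in the middle
for _ in range(50):
    u = heat_step(u, alpha=1.0, dx=1.0, dt=0.4)
```

Nonsymmetrical (one-sided) stencils of the kind the abstract mentions replace the centered second difference near boundaries or moving ablation fronts, where a symmetric stencil would reach outside the domain.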
Theoretical and computational foundations of management class simulation
Denie Gerold
1978-01-01
Investigations of complicated, complex, and poorly ordered systems are possible only with the aid of mathematical methods and electronic data processing. Simulation, as a method of operations research, is particularly suitable for this purpose. Theoretical and computational foundations of management class simulation must be integrated into the planning systems of...
Computational fluid dynamics simulations and validations of results
CSIR Research Space (South Africa)
Sitek, MA
2013-09-01
Wind flow influence on a high-rise building is analyzed. The research covers full-scale tests, wind-tunnel experiments and numerical simulations. In the present paper the computational model used in the simulations is described and the results, which were...
Computer simulation system of neural PID control on nuclear reactor
International Nuclear Information System (INIS)
Chen Yuzhong; Yang Kaijun; Shen Yongping
2001-01-01
A neural network proportional integral differential (PID) controller for a nuclear reactor is designed, and the control process is simulated by computer. The simulation results show that the neural network PID controller can automatically adjust its parameters to the ideal state, and good control results are obtained in the reactor control process.
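A conventional fixed-gain discrete PID loop, standing in here for the neural PID controller described above (whose gains would be adjusted online by the network), can be sketched against a simple first-order plant. The plant model and gains are illustrative assumptions, not the reactor model of the paper:

```python
def simulate_pid(kp, ki, kd, setpoint=1.0, steps=600, dt=0.1):
    """Drive a first-order plant x' = -x + u toward the setpoint
    using a discrete PID control law."""
    x, integral, prev_err = 0.0, 0.0, setpoint
    for _ in range(steps):
        err = setpoint - x
        integral += err * dt
        derivative = (err - prev_err) / dt
        u = kp * err + ki * integral + kd * derivative
        x += (-x + u) * dt          # explicit Euler step of the plant
        prev_err = err
    return x

final = simulate_pid(kp=2.0, ki=0.5, kd=0.1)
```

A neural variant replaces the fixed `(kp, ki, kd)` with outputs of a network trained on the tracking error, which is what lets the controller adapt its parameters during operation.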
Computer simulations of the mechanical properties of metals
DEFF Research Database (Denmark)
Schiøtz, Jakob; Vegge, Tejs
1999-01-01
Atomic-scale computer simulations can be used to gain a better understanding of the mechanical properties of materials. In this paper we demonstrate how this can be done in the case of nanocrystalline copper, and give a brief overview of how simulations may be extended to larger length scales...
Simulation of Robot Kinematics Using Interactive Computer Graphics.
Leu, M. C.; Mahajan, R.
1984-01-01
Development of a robot simulation program based on geometric transformation softwares available in most computer graphics systems and program features are described. The program can be extended to simulate robots coordinating with external devices (such as tools, fixtures, conveyors) using geometric transformations to describe the…
Investigating the Effectiveness of Computer Simulations for Chemistry Learning
Plass, Jan L.; Milne, Catherine; Homer, Bruce D.; Schwartz, Ruth N.; Hayward, Elizabeth O.; Jordan, Trace; Verkuilen, Jay; Ng, Florrie; Wang, Yan; Barrientos, Juan
2012-01-01
Are well-designed computer simulations an effective tool to support student understanding of complex concepts in chemistry when integrated into high school science classrooms? We investigated scaling up the use of a sequence of simulations of kinetic molecular theory and associated topics of diffusion, gas laws, and phase change, which we designed…
Computational fluid dynamics (CFD) simulation of hot air flow ...
African Journals Online (AJOL)
Computational Fluid Dynamics simulation of air flow distribution, air velocity and pressure field pattern as it will affect moisture transient in a cabinet tray dryer is performed using SolidWorks Flow Simulation (SWFS) 2014 SP 4.0 program. The model used for the drying process in this experiment was designed with Solid ...
Development of a Computer Simulation for a Car Deceleration ...
African Journals Online (AJOL)
This is very practical, technical, and it happens every day. In this paper, we studied the factors responsible for this event. Using a computer simulation that is based on a mathematical model, we implemented the simulation of a car braking model and showed how long it takes a car to come to rest while considering certain ...
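A minimal closed-form braking sketch (assuming constant deceleration a = mu * g; the paper's actual model and parameters are not given in this excerpt) shows the quantities such a simulation tracks:

```python
def time_to_rest(v0: float, mu: float, g: float = 9.81) -> float:
    """Time (s) for a car at speed v0 (m/s) to stop under friction coefficient mu."""
    return v0 / (mu * g)

def stopping_distance(v0: float, mu: float, g: float = 9.81) -> float:
    """Distance (m) covered before coming to rest, v0**2 / (2*mu*g)."""
    return v0 ** 2 / (2 * mu * g)

# 100 km/h (27.78 m/s) on dry asphalt, assuming mu = 0.7:
t = time_to_rest(27.78, 0.7)
d = stopping_distance(27.78, 0.7)
assert abs(d - 0.5 * 27.78 * t) < 1e-9   # d = v0*t/2 under constant deceleration
```

A time-stepping simulation of the same model would integrate v' = -mu*g until v reaches zero, which generalises to speed-dependent or road-dependent friction.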
Computer-Based Simulation Games in Public Administration Education
Directory of Open Access Journals (Sweden)
Kutergina Evgeniia
2017-12-01
Computer simulation, an active learning technique, is now one of the advanced pedagogical technologies. The use of simulation games in the educational process allows students to gain a firsthand understanding of the processes of real life. Public-administration, public-policy and political-science courses increasingly adopt simulation games in universities worldwide. Besides person-to-person simulation games, there are computer-based simulations in public-administration education. Currently in Russia the use of computer-based simulation games in Master of Public Administration (MPA) curricula is quite limited. This paper focuses on computer-based simulation games for students of MPA programmes. Our aim was to analyze outcomes of implementing such games in MPA curricula. We have done so by (1) developing three computer-based simulation games about allocating public finances, (2) testing the games in the learning process, and (3) conducting a post-test examination to evaluate the effect of simulation games on students’ knowledge of municipal finances. This study was conducted in the National Research University Higher School of Economics (HSE) and in the Russian Presidential Academy of National Economy and Public Administration (RANEPA) during the period of September to December 2015, in Saint Petersburg, Russia. Two groups of students were randomly selected in each university and then randomly allocated either to the experimental or the control group. In control groups (n=12 in HSE, n=13 in RANEPA) students had traditional lectures. In experimental groups (n=12 in HSE, n=13 in RANEPA) students played three simulation games apart from traditional lectures. This exploratory research shows that the use of computer-based simulation games in MPA curricula can improve students’ outcomes by 38 %. In general, the experimental groups had better performances on the post-test examination (Figure 2). Students in the HSE experimental group had 27.5 % better
Monte Carlo simulations on SIMD computer architectures
International Nuclear Information System (INIS)
Burmester, C.P.; Gronsky, R.; Wille, L.T.
1992-01-01
In this paper algorithmic considerations regarding the implementation of various materials science applications of the Monte Carlo technique to single instruction multiple data (SIMD) computer architectures are presented. In particular, implementation of the Ising model with nearest, next nearest, and long range screened Coulomb interactions on the SIMD architecture MasPar MP-1 (DEC mpp-12000) series of massively parallel computers is demonstrated. Methods of code development which optimize processor array use and minimize inter-processor communication are presented, including lattice partitioning and the use of processor array spanning tree structures for data reduction. Both geometric and algorithmic parallel approaches are utilized. Benchmarks in terms of Monte Carlo updates per second for the MasPar architecture are presented and compared to values reported in the literature from comparable studies on other architectures
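The lattice-partitioning idea can be illustrated with a checkerboard decomposition of a nearest-neighbour Ising model: no two sites of the same colour are neighbours, so all sites of one colour can be updated simultaneously on a processor array. This serial Python version (an illustration of the decomposition only, not the MasPar code) keeps the two-colour sweep structure explicit:

```python
import math
import random

def sweep(spins, n, beta, rng):
    """One Metropolis sweep of an n x n nearest-neighbour Ising lattice,
    organised as two independent checkerboard half-sweeps."""
    for colour in (0, 1):              # each sublattice could update in parallel
        for i in range(n):
            for j in range(n):
                if (i + j) % 2 != colour:
                    continue
                s = spins[i][j]
                nb = (spins[(i + 1) % n][j] + spins[(i - 1) % n][j]
                      + spins[i][(j + 1) % n] + spins[i][(j - 1) % n])
                dE = 2.0 * s * nb      # energy cost of flipping (J = 1, in kT units)
                if dE <= 0 or rng.random() < math.exp(-beta * dE):
                    spins[i][j] = -s

rng = random.Random(1)
n = 16
spins = [[rng.choice((-1, 1)) for _ in range(n)] for _ in range(n)]
for _ in range(20):
    sweep(spins, n, beta=0.6, rng=rng)
```

Because same-colour sites only ever read opposite-colour neighbours, the inner loops contain no data races, which is exactly what makes the sweep mappable onto a SIMD array.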
Computer Simulation of Turbulent Reactive Gas Dynamics
Directory of Open Access Journals (Sweden)
Bjørn H. Hjertager
1984-10-01
A simulation procedure capable of handling transient compressible flows involving combustion is presented. The method uses the velocity components and pressure as primary flow variables. The differential equations governing the flow are discretized by integration over control volumes. The integration is performed by application of up-wind differencing in a staggered grid system. The solution procedure is an extension of the SIMPLE-algorithm accounting for compressibility effects.
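Up-wind differencing itself can be shown in miniature with 1-D linear advection on a periodic grid (an illustrative sketch, far simpler than the compressible SIMPLE-based procedure of the paper):

```python
def advect_upwind(q, c, dx, dt):
    """One first-order upwind step of dq/dt + c*dq/dx = 0 (c > 0), periodic grid:
    each cell takes its flux from the upstream (left) neighbour."""
    r = c * dt / dx
    assert 0.0 < r <= 1.0, "CFL condition for the explicit upwind scheme"
    return [q[i] - r * (q[i] - q[i - 1]) for i in range(len(q))]

q = [1.0 if 2 <= i < 5 else 0.0 for i in range(20)]   # a square pulse
total = sum(q)
for _ in range(10):
    q = advect_upwind(q, c=1.0, dx=1.0, dt=0.5)
assert abs(sum(q) - total) < 1e-9   # the conservative stencil preserves mass
```

Taking the difference from the upstream side is what keeps the scheme stable under the CFL condition; a centered difference here would be unconditionally unstable for explicit time stepping.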
Computer simulation of proton channelling in silicon
Indian Academy of Sciences (India)
2000-06-12
The channelling of 3 MeV protons in the <110> direction of silicon has been simulated ... Due to divergence, the azimuthal angle for an ion can also differ from the mean azimuthal angle. The angles take on random values, with the former following a ...
Computer Simulation of Shipboard Electrical Distribution Systems
1989-06-01
programs are optimized for analyzing shipboard systems. Here is a brief summary of several existing programs: 1.3.2.1 EMTP The Electromagnetic Transients Program (EMTP) [22] is a large-scale network simulation program originally developed by the Bonneville Power Administration in the 1960's. It is capable ... synchronous machines as well as other elements of a power network. EMTP handles stiff systems by using the Euler backward method for integration. In general
Computational fluid dynamics simulations of light water reactor flows
International Nuclear Information System (INIS)
Tzanos, C.P.; Weber, D.P.
1999-01-01
Advances in computational fluid dynamics (CFD), turbulence simulation, and parallel computing have made feasible the development of three-dimensional (3-D) single-phase and two-phase flow CFD codes that can simulate fluid flow and heat transfer in realistic reactor geometries with significantly reduced reliance, especially in single phase, on empirical correlations. The objective of this work was to assess the predictive power and computational efficiency of a CFD code in the analysis of a challenging single-phase light water reactor problem, as well as to identify areas where further improvements are needed
Digital control computer upgrade at the Cernavoda NPP simulator
International Nuclear Information System (INIS)
Ionescu, T.
2006-01-01
The Plant Process Computer equips some Nuclear Power Plants, like CANDU-600, with Centralized Control performed by an assembly of two computers known as Digital Control Computers (DCC) and working in parallel for safely driving of the plan at steady state and during normal maneuvers but also during abnormal transients when the plant is automatically steered to a safe state. The Centralized Control means both hardware and software with obligatory presence in the frame of the Full Scope Simulator and subject to changing its configuration with specific requirements during the plant and simulator life and covered by this subsection
Computer based training simulator for Hunterston Nuclear Power Station
International Nuclear Information System (INIS)
Bowden, R.S.M.; Hacking, D.
1978-01-01
For reasons which are stated, the Hunterston-B nuclear power station automatic control system includes a manual over-ride facility. It is therefore essential for the station engineers to be trained to recognise and control all feasible modes of plant and logic malfunction. A training simulator has been built which consists of a replica of the shutdown monitoring panel in the Central Control Room and is controlled by a mini-computer. This paper highlights the computer aspects of the simulator and relevant derived experience, under the following headings: engineering background; shutdown sequence equipment; simulator equipment; features; software; testing; maintenance. (U.K.)
Use of computer graphics simulation for teaching of flexible sigmoidoscopy.
Baillie, J; Jowell, P; Evangelou, H; Bickel, W; Cotton, P
1991-05-01
The concept of simulation training in endoscopy is now well-established. The systems currently under development employ either computer graphics simulation or interactive video technology; each has its strengths and weaknesses. A flexible sigmoidoscopy training device has been designed which uses graphic routines--such as object oriented programming and double buffering--in entirely new ways. These programming techniques compensate for the limitations of currently available desk-top microcomputers. By boosting existing computer 'horsepower' with next generation coprocessors and sophisticated graphics tools such as intensity interpolation (Gouraud shading), the realism of computer simulation of flexible sigmoidoscopy is being greatly enhanced. The computer program has teaching and scoring capabilities, making it a truly interactive system. Use has been made of this ability to record, grade and store each trainee encounter in computer memory as part of a multi-center, prospective trial of simulation training being conducted currently in the USA. A new input device, a dummy endoscope, has been designed that allows application of variable resistance to the insertion tube. This greatly enhances tactile feedback, such as resistance during looping. If carefully designed trials show that computer simulation is an attractive and effective training tool, it is expected that this technology will evolve rapidly and be made widely available to trainee endoscopists.
Computer Simulation of Breast Cancer Screening
2001-07-01
The signals at A and B may be, respectively, written as ESF_A = P + S (1) and ESF_B = P + S/2 (2), where P is the primary signal and S the scatter signal at point A. The scatter-to-primary ratio at point A (SPR) may be computed from the digital signal values (among other ways) as S = 2(ESF_A - ESF_B) (3), P = ESF_A - S (4), SPR = S/P (5). FIG. 4. (a) Matched primary-only and primary-plus-scatter ESFs and (b) the resulting SPR.
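The scatter estimate in Eqs. (1)-(5) can be checked numerically. The symbols follow the abstract (P = primary, S = scatter at point A); the numeric values are invented for illustration:

```python
def spr_from_esf(esf_a: float, esf_b: float) -> float:
    """Scatter-to-primary ratio at A from the two ESF signals,
    per Eqs. (3)-(5): ESF_A = P + S and ESF_B = P + S/2."""
    s = 2 * (esf_a - esf_b)   # Eq. (3)
    p = esf_a - s             # Eq. (4)
    return s / p              # Eq. (5)

# If P = 100 and S = 40, then ESF_A = 140 and ESF_B = 120, so SPR = 0.4:
assert abs(spr_from_esf(140.0, 120.0) - 0.4) < 1e-12
```

The subtraction in Eq. (3) works because point B, by construction, receives only half the scatter of point A while sharing the same primary signal.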
Computer Models Simulate Fine Particle Dispersion
2010-01-01
Through a NASA Seed Fund partnership with DEM Solutions Inc., of Lebanon, New Hampshire, scientists at Kennedy Space Center refined existing software to study the electrostatic phenomena of granular and bulk materials as they apply to planetary surfaces. The software, EDEM, allows users to import particles and obtain accurate representations of their shapes for modeling purposes, such as simulating bulk solids behavior, and was enhanced to be able to more accurately model fine, abrasive, cohesive particles. These new EDEM capabilities can be applied in many industries unrelated to space exploration and have been adopted by several prominent U.S. companies, including John Deere, Pfizer, and Procter & Gamble.
Computational physics of plasma turbulence: CUTIE simulations
Energy Technology Data Exchange (ETDEWEB)
Thyagaraja, A.
1995-11-01
In this work, direct numerical simulations of two-fluid plasma turbulence using the CUTIE code developed at Culham are briefly described. It presents the formulation of the model, an outline of the solution methods employed and a set of results obtained for COMPASS-D-like conditions with the code. The calculations show the formation of self-organized coherent structures and the existence of 'meso-scale' current and vorticity fluctuations in the presence of imposed toroidal flow and self-generated poloidal electric drifts. (author).
Energy Technology Data Exchange (ETDEWEB)
HOLM,ELIZABETH A.; BATTAILE,CORBETT C.; BUCHHEIT,THOMAS E.; FANG,HUEI ELIOT; RINTOUL,MARK DANIEL; VEDULA,VENKATA R.; GLASS,S. JILL; KNOROVSKY,GERALD A.; NEILSEN,MICHAEL K.; WELLMAN,GERALD W.; SULSKY,DEBORAH; SHEN,YU-LIN; SCHREYER,H. BUCK
2000-04-01
Computational materials simulations have traditionally focused on individual phenomena: grain growth, crack propagation, plastic flow, etc. However, real materials behavior results from a complex interplay between phenomena. In this project, the authors explored methods for coupling mesoscale simulations of microstructural evolution and micromechanical response. In one case, massively parallel (MP) simulations for grain evolution and microcracking in alumina stronglink materials were dynamically coupled. In the other, codes for domain coarsening and plastic deformation in CuSi braze alloys were iteratively linked. This program provided the first comparison of two promising ways to integrate mesoscale computer codes. Coupled microstructural/micromechanical codes were applied to experimentally observed microstructures for the first time. In addition to the coupled codes, this project developed a suite of new computational capabilities (PARGRAIN, GLAD, OOF, MPM, polycrystal plasticity, front tracking). The problem of plasticity length scale in continuum calculations was recognized and a solution strategy was developed. The simulations were experimentally validated on stockpile materials.
A computer code to simulate X-ray imaging techniques
Energy Technology Data Exchange (ETDEWEB)
Duvauchelle, Philippe E-mail: philippe.duvauchelle@insa-lyon.fr; Freud, Nicolas; Kaftandjian, Valerie; Babot, Daniel
2000-09-01
A computer code was developed to simulate the operation of radiographic, radioscopic or tomographic devices. The simulation is based on ray-tracing techniques and on the X-ray attenuation law. The use of computer-aided drawing (CAD) models enables simulations to be carried out with complex three-dimensional (3D) objects and the geometry of every component of the imaging chain, from the source to the detector, can be defined. Geometric unsharpness, for example, can be easily taken into account, even in complex configurations. Automatic translations or rotations of the object can be performed to simulate radioscopic or tomographic image acquisition. Simulations can be carried out with monochromatic or polychromatic beam spectra. This feature enables, for example, the beam hardening phenomenon to be dealt with or dual energy imaging techniques to be studied. The simulation principle is completely deterministic and consequently the computed images present no photon noise. Nevertheless, the variance of the signal associated with each pixel of the detector can be determined, which enables contrast-to-noise ratio (CNR) maps to be computed, in order to predict quantitatively the detectability of defects in the inspected object. The CNR is a relevant indicator for optimizing the experimental parameters. This paper provides several examples of simulated images that illustrate some of the rich possibilities offered by our software. Depending on the simulation type, the computation time order of magnitude can vary from 0.1 s (simple radiographic projection) up to several hours (3D tomography) on a PC, with a 400 MHz microprocessor. Our simulation tool proves to be useful in developing new specific applications, in choosing the most suitable components when designing a new testing chain, and in saving time by reducing the number of experimental tests.
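Two ingredients the abstract builds on, the X-ray attenuation law evaluated along a ray and a contrast-to-noise ratio from signal statistics, can be sketched in a few lines (illustrative only; the actual code's interfaces are not shown in the abstract, and the material values below are invented):

```python
import math

def transmitted_intensity(i0: float, segments) -> float:
    """Beer-Lambert attenuation law along a ray: I = I0 * exp(-sum(mu_i * t_i)),
    where each segment is a (mu [1/cm], thickness [cm]) pair."""
    return i0 * math.exp(-sum(mu * t for mu, t in segments))

def cnr(signal_a: float, signal_b: float, noise: float) -> float:
    """Contrast-to-noise ratio between two pixel signals."""
    return abs(signal_a - signal_b) / noise

# A ray crossing 2 cm of base material (mu = 0.5/cm) plus a 1 cm defect
# (mu = 0.1/cm), versus a ray through 3 cm of sound material:
i_defect = transmitted_intensity(1000.0, [(0.5, 2.0), (0.1, 1.0)])
i_sound = transmitted_intensity(1000.0, [(0.5, 3.0)])
assert i_defect > i_sound   # the less-attenuating defect transmits more photons
```

Ray-tracing a CAD model amounts to computing the (mu, thickness) segment list for every ray from source to detector pixel; a CNR map then summarizes, pixel by pixel, whether a defect's contrast rises above the modeled noise.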
An introduction to computer simulation methods applications to physical systems
Gould, Harvey; Christian, Wolfgang
2007-01-01
Now in its third edition, this book teaches physical concepts using computer simulations. The text incorporates object-oriented programming techniques and encourages readers to develop good programming habits in the context of doing physics. Designed for readers at all levels, An Introduction to Computer Simulation Methods uses Java, currently the most popular programming language. Introduction, Tools for Doing Simulations, Simulating Particle Motion, Oscillatory Systems, Few-Body Problems: The Motion of the Planets, The Chaotic Motion of Dynamical Systems, Random Processes, The Dynamics of Many Particle Systems, Normal Modes and Waves, Electrodynamics, Numerical and Monte Carlo Methods, Percolation, Fractals and Kinetic Growth Models, Complex Systems, Monte Carlo Simulations of Thermal Systems, Quantum Systems, Visualization and Rigid Body Dynamics, Seeing in Special and General Relativity, Epilogue: The Unity of Physics For all readers interested in developing programming habits in the context of doing phy...
Computational Simulations and the Scientific Method
Kleb, Bil; Wood, Bill
2005-01-01
As scientific simulation software becomes more complicated, the scientific-software implementor's need for component tests from new model developers becomes more crucial. The community's ability to follow the basic premise of the Scientific Method requires independently repeatable experiments, and model innovators are in the best position to create these test fixtures. Scientific software developers also need to quickly judge the value of the new model, i.e., its cost-to-benefit ratio in terms of gains provided by the new model and implementation risks such as cost, time, and quality. This paper asks two questions. The first is whether other scientific software developers would find published component tests useful, and the second is whether model innovators think publishing test fixtures is a feasible approach.
Computer simulations of adsorbed liquid crystal films
Wall, Greg D.; Cleaver, Douglas J.
2003-01-01
The structures adopted by adsorbed thin films of Gay-Berne particles in the presence of a coexisting vapour phase are investigated by molecular dynamics simulation. The films are adsorbed at a flat substrate which favours planar anchoring, whereas the nematic-vapour interface favours normal alignment. On cooling, a system with a high molecule-substrate interaction strength exhibits substrate-induced planar orientational ordering and considerable stratification is observed in the density profiles. In contrast, a system with weak molecule-substrate coupling adopts a director orientation orthogonal to the substrate plane, owing to the increased influence of the nematic-vapour interface. There are significant differences between the structures adopted at the two interfaces, in contrast with the predictions of density functional treatments of such systems.
Osmosis : a molecular dynamics computer simulation study
Lion, Thomas
Osmosis is a phenomenon of critical importance in a variety of processes, ranging from the transport of ions across cell membranes and the regulation of blood salt levels by the kidneys to the desalination of water and the production of clean energy using potential osmotic power plants. However, despite its importance and over one hundred years of study, there is ongoing confusion concerning the nature of the microscopic dynamics of the solvent particles in their transfer across the membrane. In this thesis the microscopic dynamical processes underlying osmotic pressure and concentration gradients are investigated using molecular dynamics (MD) simulations. I first present a new derivation for the local pressure that can be used for determining osmotic pressure gradients. Using this result, the steady-state osmotic pressure is studied in a minimal model for an osmotic system, and the steady-state density gradients are explained using a simple mechanistic hopping model for the solvent particles. The simulation setup is then modified, allowing us to explore the timescales involved in the relaxation dynamics of the system in the period preceding the steady state. Further consideration is also given to the relative roles of diffusive and non-diffusive solvent transport in this period. Finally, in a novel modification to the classic osmosis experiment, the solute particles are driven out of equilibrium by the input of energy. The effect of this modification on the osmotic pressure and the osmotic flow is studied, and we find that active solute particles can cause reverse osmosis to occur. The possibility of defining a new "osmotic effective temperature" is also considered and compared to the results of diffusive and kinetic temperatures.
Computer simulation as an operational and training aid
International Nuclear Information System (INIS)
Lee, D.J.; Tottman-Trayner, E.
1995-01-01
The paper describes how the rapid development of desktop computing power, the associated fall in prices, and the advancement of computer graphics technology driven by the entertainment industry has enabled the nuclear industry to achieve improvements in operation and training through the use of computer simulation. Applications are focused on the fuel handling operations at Torness Power Station where visualization through computer modelling is being used to enhance operator awareness and to assist in a number of operational scenarios. It is concluded that there are significant benefits to be gained from the introduction of the facility at Torness as well as other locations. (author)
Caliper simulation using computer for vocational and technical education
Genc Garip; Sezen Sakir; Akkus Nihat; Toptas Ersin
2016-01-01
In this study, a caliper simulation was developed as a computer application for use in Vocational Education and Training (VET) and distance-based training. The simulation was developed for teaching the caliper as a measurement tool based on the metric and Whitworth measurement systems. The developed caliper simulation is introduced in detail, step by step, as a user guide. The vernier scales are simulated at 1/10, 1/20 and 1/50 for the metric measurement system and at 1/64, 1/128 and 1/1000 for the Whitworth measur...
Noise simulation in cone beam CT imaging with parallel computing
International Nuclear Information System (INIS)
Tu, S.-J.; Shaw, Chris C; Chen, Lingyun
2006-01-01
We developed a computer noise simulation model for cone beam computed tomography imaging using a general purpose PC cluster. This model uses a mono-energetic x-ray approximation and allows us to investigate three primary performance components, specifically quantum noise, detector blurring and additive system noise. A parallel random number generator based on the Weyl sequence was implemented in the noise simulation and a visualization technique was accordingly developed to validate the quality of the parallel random number generator. In our computer simulation model, three-dimensional (3D) phantoms were mathematically modelled and used to create 450 analytical projections, which were then sampled into digital image data. Quantum noise was simulated and added to the analytical projection image data, which were then filtered to incorporate flat panel detector blurring. Additive system noise was generated and added to form the final projection images. The Feldkamp algorithm was implemented and used to reconstruct the 3D images of the phantoms. A 24 dual-Xeon PC cluster was used to compute the projections and reconstructed images in parallel with each CPU processing 10 projection views for a total of 450 views. Based on this computer simulation system, simulated cone beam CT images were generated for various phantoms and technique settings. Noise power spectra for the flat panel x-ray detector and reconstructed images were then computed to characterize the noise properties. As an example among the potential applications of our noise simulation model, we showed that images of low contrast objects can be produced and used for image quality evaluation
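The Weyl-sequence generator the authors rely on for parallel noise simulation can be sketched as follows. This is a hedged illustration of the general idea, not the paper's implementation: a "nested Weyl" sequence frac(n²·α) is used here, the constant α is arbitrary, and parallelism comes from giving each worker a disjoint index range so streams need no shared state:

```python
import math

def nested_weyl(alpha, start, count):
    """Yield `count` uniforms from the nested Weyl sequence frac(n^2 * alpha)."""
    for n in range(start, start + count):
        yield math.modf(n * n * alpha)[0]  # fractional part, in [0, 1)

alpha = math.sqrt(2)   # any irrational constant serves in exact arithmetic
# Disjoint index ranges stand in for two cluster workers generating
# independent sub-streams of the same global sequence in parallel.
stream0 = list(nested_weyl(alpha, 1, 1000))
stream1 = list(nested_weyl(alpha, 1001, 1000))
mean0 = sum(stream0) / len(stream0)
print(round(mean0, 3))  # roughly 0.5 for an equidistributed sequence
```

Quantum noise would then be drawn per detector pixel from such uniforms, which is why validating the generator's quality (as the authors do with a visualization technique) matters before trusting the noise power spectra.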
Using EDUCache Simulator for the Computer Architecture and Organization Course
Directory of Open Access Journals (Sweden)
Sasko Ristov
2013-07-01
The computer architecture and organization course is essential in all computer science and engineering programs, and among the most frequently chosen and well-liked elective courses in related engineering disciplines. However, this attractiveness brings a new challenge: it requires considerable effort by the instructor to explain rather complicated concepts to beginners or to those who study related disciplines. The use of visual simulators can improve both the teaching and learning processes. The overall goal is twofold: (1) to enable a visual environment for explaining the basic concepts, and (2) to increase students' willingness and ability to learn the material. Many visual simulators have been used for the computer architecture and organization course. However, owing to the lack of visual simulators for cache memory concepts, we have developed a new visual simulator, EDUCache. In this paper we show that it can be used effectively and efficiently as a supporting tool in the learning process for modern multi-layer, multi-cache, multi-core processors. EDUCache's features enable an environment for performance evaluation and engineering of software systems; the students will also come to understand the importance of the building blocks of computer architecture and, hopefully, grow more curious about hardware courses in general.
Computer simulation of sensitization in stainless steels
Energy Technology Data Exchange (ETDEWEB)
Logan, R W
1983-12-20
Stainless steel containers are prime candidates for the containment of nuclear waste in tuff rock. The thermal history of a container involves exposure to temperatures of 500 to 600°C when it is welded and possibly filled with molten waste glass, followed by hundreds of years of exposure in the 100 to 300°C range. The problems of short- and long-term sensitization in stainless steels have been addressed by two computer programs. The TTS program uses classical nucleation and growth theory plus experimental input to predict the onset of precipitation or sensitization under complex thermal histories. The FEMGB program uses quadratic finite-element methods to analyze diffusion processes and chromium depletion during precipitate growth. The results of studies using both programs indicate that sensitization should not be a problem in any of the austenitic stainless steels considered. However, more precise information on the process thermal cycles, especially during welding of the container, is needed. Contributions from dislocation pipe diffusion could promote long-term low-temperature sensitization.
Computer simulation of LMFBR piping systems
International Nuclear Information System (INIS)
A-Moneim, M.T.; Chang, Y.W.; Fistedis, S.H.
1977-01-01
Integrity of piping systems is one of the main concerns of the safety issues of Liquid Metal Fast Breeder Reactors (LMFBR). Hypothetical core disruptive accidents (HCDA) and water-sodium interaction are two examples of sources of high pressure pulses that endanger the integrity of the heat transport piping systems of LMFBRs. Although plastic wall deformation attenuates pressure peaks so that only pressures slightly higher than the pipe yield pressure propagate along the system, the interaction of these pulses with the different components of the system, such as elbows, valves, heat exchangers, etc.; and with one another produce a complex system of pressure pulses that cause more plastic deformation and perhaps damage to components. A generalized piping component and a tee branching model are described. An optional tube bundle and interior rigid wall simulation model makes such a generalized component model suited for modelling of valves, reducers, expansions, and heat exchangers. The generalized component and the tee branching junction models are combined with the pipe-elbow loop model so that a more general piping system can be analyzed both hydrodynamically and structurally under the effect of simultaneous pressure pulses
Predictive Toxicology and Computer Simulation of Male ...
The reproductive tract is a complex, integrated organ system with diverse embryology and unique sensitivity to prenatal environmental exposures that disrupt morphoregulatory processes and endocrine signaling. U.S. EPA’s in vitro high-throughput screening (HTS) database (ToxCastDB) was used to profile the bioactivity of 54 chemicals with male developmental consequences across ~800 molecular and cellular features. The in vitro bioactivity on molecular targets could be condensed into 156 gene annotations in a bipartite network. These results highlighted the role of estrogen and androgen signaling pathways in male reproductive tract development, and importantly, broadened the list of molecular targets to include GPCRs, cytochrome-P450s, vascular remodeling proteins, and retinoic acid signaling. A multicellular agent-based model was used to simulate the complex interactions between morphoregulatory, endocrine, and environmental influences during genital tubercle (GT) development. Spatially dynamic signals (e.g., SHH, FGF10, and androgen) were implemented in the model to address differential adhesion, cell motility, proliferation, and apoptosis. Under control of androgen signaling, urethral tube closure was an emergent feature of the model that was linked to gender-specific rates of ventral mesenchymal proliferation and urethral plate endodermal apoptosis. A systematic parameter sweep was used to examine the sensitivity of crosstalk between genetic deficiency and envi
Coupling Computer-Aided Process Simulation and ...
A methodology is described for developing a gate-to-gate life cycle inventory (LCI) of a chemical manufacturing process to support the application of life cycle assessment in the design and regulation of sustainable chemicals. The inventories were derived by first applying process design and simulation to develop a process flow diagram describing the energy and basic material flows of the system. Additional techniques developed by the U.S. Environmental Protection Agency for estimating uncontrolled emissions from chemical processing equipment were then applied to obtain a detailed emission profile for the process. Finally, land use for the process was estimated using a simple sizing model. The methodology was applied to a case study of acetic acid production based on the Cativa™ process. The results reveal improvements in the qualitative LCI for acetic acid production compared to commonly used databases and top-down methodologies. The modeling techniques improve the quantitative LCI results for inputs and uncontrolled emissions. With provisions for applying appropriate emission controls, the proposed method can provide an estimate of the LCI that can be used for subsequent life cycle assessments. As part of its mission, the Agency is tasked with overseeing the use of chemicals in commerce. This can include consideration of a chemical's potential impact on health and safety, resource conservation, clean air and climate change, clean water, and sustainable
Associative Memory computing power and its simulation.
Volpi, G; The ATLAS collaboration
2014-01-01
The associative memory (AM) chip is an ASIC device specifically designed to perform "pattern matching" at very high speed and with parallel access to memory locations. The most extensive use for such a device will be the ATLAS Fast Tracker (FTK) processor, where more than 8000 chips will be installed in 128 VME boards specifically designed for high throughput in order to exploit the chip's features. Each AM chip will store a database of about 130000 pre-calculated patterns, allowing FTK to use about 1 billion patterns for the whole system, with any data inquiry broadcast to all memory elements simultaneously within the same clock cycle (10 ns); data retrieval time is thus independent of the database size. Speed and size of the system are crucial for real-time High Energy Physics applications, such as the ATLAS FTK processor. Using 80 million channels of the ATLAS tracker, FTK finds tracks within 100 μs. The simulation of such a parallelized system is an extremely complex task when executed in comm...
SiMon: Simulation Monitor for Computational Astrophysics
Qian, Penny Xuran; Cai, Maxwell Xu; Portegies Zwart, Simon; Zhu, Ming
2017-09-01
Scientific discovery via numerical simulations is important in modern astrophysics. This relatively new branch of astrophysics has become possible due to the development of reliable numerical algorithms and the high performance of modern computing technologies. These enable the analysis of large collections of observational data and the acquisition of new data via simulations at unprecedented accuracy and resolution. Ideally, simulations run until they reach some pre-determined termination condition, but often other factors cause extensive numerical approaches to break down at an earlier stage, with processes interrupted by unexpected events in the software or the hardware. In those cases, the scientist handles the interrupt manually, which is time-consuming and prone to errors. We present the Simulation Monitor (SiMon) to automate the farming of large and extensive sets of simulation processes. Our method is lightweight, fully automates the entire workflow management, operates concurrently across multiple platforms, and can be installed in user space. Inspired by the process of crop farming, we perceive each simulation as a crop in the field, and running a simulation becomes analogous to growing crops. With the development of SiMon we ease the technical aspects of simulation management. The initial package was developed for extensive parameter searches in numerical simulations, but it turns out to work equally well for automating the computational processing and reduction of observational data.
Computer simulation of a 3-phase induction motor
International Nuclear Information System (INIS)
Memon, N.A.; Unsworth, P.J.
2004-01-01
A computer simulation of a three-phase squirrel-cage induction motor, written in Microsoft QBASIC, is presented for understanding trends and the various operational modes of an induction motor. A thyristor-fed, phase-controlled (three-wire) induction motor model has been simulated, in which voltage is applied to the motor stator winding through back-to-back connected thyristors acting as controlled switches in series with the stator. The simulated induction motor system opens up a wide range of investigation and analysis options for research and development work in the field. Key features of the simulation are highlighted to develop a better understanding of the work done. A complete study of an induction motor is presented, covering starting modes in terms of the voltage/current and torque/speed characteristics, together with their graphical representation. The close agreement of the simulation results with the expected outcome encourages users to proceed with hardware development projects based on study through simulation. (author)
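The torque/speed trend such a simulation exhibits can be illustrated with the standard per-phase equivalent circuit of a squirrel-cage machine. This is a sketch, not the paper's QBASIC model, and the circuit parameters below are invented example values:

```python
import math

def torque(slip, V=230.0, R1=0.5, R2=0.4, X=2.0, poles=4, f=50.0):
    """Steady-state electromagnetic torque (N*m) at a given slip.

    Per-phase equivalent circuit: stator resistance R1, rotor resistance
    R2 (referred), total leakage reactance X, phase voltage V.
    """
    ws = 2 * math.pi * f * 2 / poles              # synchronous speed, rad/s
    if slip == 0:
        return 0.0                                # no rotor current at synchronism
    Z2 = (R1 + R2 / slip) ** 2 + X ** 2           # squared series impedance magnitude
    return 3 * V ** 2 * (R2 / slip) / (ws * Z2)

# Scan the torque-slip curve and locate the breakdown (pull-out) slip.
slips = [s / 100 for s in range(1, 101)]
torques = [torque(s) for s in slips]
s_peak = slips[torques.index(max(torques))]
print(round(s_peak, 2))  # near R2 / sqrt(R1**2 + X**2) for these values
```

The same scan, repeated for reduced thyristor firing voltages, reproduces the family of starting characteristics the abstract describes.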
Advanced Simulation and Computing FY17 Implementation Plan, Version 0
Energy Technology Data Exchange (ETDEWEB)
McCoy, Michel [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Archer, Bill [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Hendrickson, Bruce [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wade, Doug [National Nuclear Security Administration (NNSA), Washington, DC (United States). Office of Advanced Simulation and Computing and Institutional Research and Development; Hoang, Thuc [National Nuclear Security Administration (NNSA), Washington, DC (United States). Computational Systems and Software Environment
2016-08-29
The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. ASC is now focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), and quantifying critical margins and uncertainties. Resolving each issue requires increasingly difficult analyses because the aging process has progressively moved the stockpile further away from the original test base. Where possible, the program also enables the use of high performance computing (HPC) and simulation tools to address broader national security needs, such as foreign nuclear weapon assessments and counter nuclear terrorism.
Computer simulation for horizontal axis wind turbine rotor optimization
International Nuclear Information System (INIS)
Mehfooz, O.; Ullah, I.
2011-01-01
Wind turbine design is a complex process that includes multiple and conflicting criteria like maximizing energy production and minimizing the cost incurred. Often, such problems are solved using optimization techniques. A computer simulation is essential in analyzing the performance of a wind turbine rotor and determining suitable values of various design variables. The simulation will work with a design optimizer to optimize the design. In this paper, the problem of optimal rotor design is formulated and a computer simulation presented to analyze the performance of a horizontal axis wind turbine rotor of a given airfoil over a range of rotor tip speed ratios. The MATLAB simulation takes inputs of blade twist angle and chord solidity along the rotor radius as well as wind speed distribution; it provides output in the form of plots of coefficient of performance against tip speed ratio, variation of induction factors, angle of attack and coefficients of lift and drag with blade radial positions. (author)
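A drastically simplified stand-in for the kind of performance analysis described above is the actuator-disc model, in which only the axial induction factor enters and the power coefficient is Cp = 4a(1 − a)², bounded by the Betz limit 16/27. The paper's MATLAB simulation goes much further (twist, chord solidity, lift and drag along the blade); this sketch only shows the optimization pattern:

```python
def power_coefficient(a):
    """Ideal actuator-disc power coefficient for axial induction factor a."""
    return 4 * a * (1 - a) ** 2

# Crude design scan: sweep the induction factor and keep the best value,
# the same search pattern a rotor optimizer applies to many variables.
best_a = max((a / 1000 for a in range(0, 501)), key=power_coefficient)
print(round(best_a, 3), round(power_coefficient(best_a), 4))  # 0.333 0.5926
```

The optimum lands at a = 1/3, recovering the Betz limit; a full blade element momentum code performs the analogous sweep over twist and chord distributions against tip speed ratio.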
National Aeronautics and Space Administration — There are significant logistical barriers to entry-level high performance computing (HPC) modeling and simulation (M&S). IllinoisRocstar sets up the infrastructure for...
Strategic Implications of Cloud Computing for Modeling and Simulation (Briefing)
2016-04-01
Strategic Implications of Cloud Computing for Modeling and Simulation (Briefing). Amy E. Henninger, Institute for Defense Analyses, under contract HQ0034-14-D-0001, Project AI-2-3077, “Cloud Computing for Modeling and Simulation,” for the Office of the Deputy Assistant Director of... Promises of cloud include cost efficiency, unlimited storage, backup and recovery, automatic software integration, and easy access to information...
Study on Computer Numerical Simulation of Driving Static Pressure Pile
Hong, Ji; Xueyi, Yu
A method to study the soil compaction effect caused by driving a static pressure pile is proposed, based on analysis of the hole (cavity) expansion principle. Hole expansion is commonly studied by FEM (finite element method) computer numerical simulation. The hole radius expands from a0 to 2a0, corresponding to an original radius growing from zero to R. Comparison with conclusions obtained from other theories shows that FEM computer numerical simulation is valid for the analysis of hole expansion. Compared with the traditional hole expansion principle, it broadens the scope of application and can be extended to analyze holes with other cross-section forms.
Method for simulating paint mixing on computer monitors
Carabott, Ferdinand; Lewis, Garth; Piehl, Simon
2002-06-01
Computer programs like Adobe Photoshop can generate a mixture of two 'computer' colors by using the Gradient control. However, the resulting colors diverge from the equivalent paint mixtures in both hue and value. This study examines why programs like Photoshop are unable to simulate paint or pigment mixtures, and offers a solution using Photoshop's existing tools. The article discusses how a library of colors simulating paint mixtures is created from 13 artists' colors. The mixtures can be imported into Photoshop as a color swatch palette of 1248 colors and as 78 continuous or stepped gradient files, all accessed in a new software package, Chromafile.
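One plausible contributor to the divergence described above is that a naive gradient interpolates gamma-encoded sRGB values, whereas light mixes linearly in intensity (and paint, being subtractive, differs yet again). This sketch illustrates only the gamma issue for gray values; it is an assumption for illustration, not the authors' Chromafile method:

```python
def srgb_to_linear(c):
    """Undo the sRGB transfer function (c in [0, 1])."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    """Apply the sRGB transfer function."""
    return 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

def mix(c1, c2, t=0.5, linear=True):
    """Mix two gray values, either in linear light or naively in sRGB space."""
    if linear:
        return linear_to_srgb((1 - t) * srgb_to_linear(c1) + t * srgb_to_linear(c2))
    return (1 - t) * c1 + t * c2

naive = mix(0.0, 1.0, linear=False)    # gamma-space midpoint of black and white
linlight = mix(0.0, 1.0, linear=True)  # linear-light midpoint, re-encoded to sRGB
print(round(naive, 3), round(linlight, 3))  # 0.5 0.735
```

The two "midpoints" differ substantially, which is why mixtures computed channel-by-channel in the display encoding drift in value even before pigment behavior is considered.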
Computer simulations for thorium doped tungsten crystals
Energy Technology Data Exchange (ETDEWEB)
Eberhard, Bernd
2009-07-17
set of Langevin equations, i.e. stochastic differential equations including properly chosen "noise" terms. A new integration scheme is derived for integrating the equations of motion, which closely resembles the well-known Velocity Verlet algorithm. As a first application of the EAM potentials, we calculate the phonon dispersion for tungsten and thorium. Furthermore, the potentials are used to derive the excess volumes of point defects, i.e. for vacancies and Th-impurities in tungsten, grain boundary structures and energies. Additionally, we take a closer look at various stacking fault energies and link the results to the potential splitting of screw dislocations in tungsten into partials. We also compare the energetic stability of screw, edge and mixed-type dislocations. Besides this, we are interested in free enthalpy differences, for which we make use of the Overlapping Distribution Method (ODM), an efficient, albeit computationally demanding, method to calculate free enthalpy differences, with which we address the question of lattice formation, vacancy formation and impurity formation at varying temperatures. (orig.)
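A Langevin integrator "resembling Velocity Verlet" of the general kind mentioned above can be sketched with the BAOAB splitting, used here as a stand-in (the thesis' own scheme is not reproduced). Tested on a harmonic oscillator in reduced units, where equipartition gives ⟨x²⟩ = kT/k:

```python
import math
import random

random.seed(42)
k, m, kT, gamma, dt = 1.0, 1.0, 1.0, 1.0, 0.05   # illustrative reduced units
c1 = math.exp(-gamma * dt)                       # velocity damping per step
c2 = math.sqrt((1.0 - c1 * c1) * kT / m)         # matching noise amplitude

def force(x):
    return -k * x          # harmonic spring

x, v = 0.0, 0.0
samples = []
for step in range(100000):
    v += 0.5 * dt * force(x) / m               # B: half kick
    x += 0.5 * dt * v                          # A: half drift
    v = c1 * v + c2 * random.gauss(0.0, 1.0)   # O: friction + thermal noise
    x += 0.5 * dt * v                          # A: half drift
    v += 0.5 * dt * force(x) / m               # B: half kick (new force)
    if step >= 10000:                          # discard equilibration
        samples.append(x * x)
mean_x2 = sum(samples) / len(samples)
print(round(mean_x2, 2))  # close to kT/k = 1.0
```

Without the O step this reduces exactly to Velocity Verlet, which is the sense in which such stochastic schemes "closely resemble" it.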
Towards accurate quantum simulations of large systems with small computers.
Yang, Yonggang
2017-01-24
Numerical simulations are important for many systems. In particular, various standard computer programs have been developed for solving the quantum Schrödinger equations. However, the accuracy of these calculations is limited by computer capabilities. In this work, an iterative method is introduced to enhance the accuracy of these numerical calculations beyond what is attainable by conventional methods. The method is easily implementable and general for many systems.
Improved Pyrolysis Micro-Reactor Design via Computational Fluid Dynamics Simulations
2017-05-23
Improved Pyrolysis Micro-Reactor Design via Computational Fluid Dynamics Simulations. Ghanshyam L. Vaghjiani. Briefing charts, 25 April 2017 to 23 May 2017. DISTRIBUTION A: Approved for public release; distribution unlimited (PA Clearance 17247). Chen-Source (>240 references from SciFinder as of 5/1/17): flash pyrolysis...
Moment aberrations in magneto-electrostatic plasma lenses (computer simulation)
Butenko, V I
2001-01-01
In this work, moment aberrations in plasma magneto-electrostatic lenses are considered in more detail using computer modelling. To solve the problem, we have developed a special computer code, a model of a plasma-optical focusing device, which allows us to display the main parameters and operation of an experimental sample of a lens, to simulate the moment and geometrical aberrations, and to give recommendations on their elimination.
Sakamoto, Shinichi; Otsuru, Toru
2014-01-01
This book reviews a variety of methods for wave-based acoustic simulation and recent applications to architectural and environmental acoustic problems. Following an introduction providing an overview of computational simulation of sound environment, the book is in two parts: four chapters on methods and four chapters on applications. The first part explains the fundamentals and advanced techniques for three popular methods, namely, the finite-difference time-domain method, the finite element method, and the boundary element method, as well as alternative time-domain methods. The second part demonstrates various applications to room acoustics simulation, noise propagation simulation, acoustic property simulation for building components, and auralization. This book is a valuable reference that covers the state of the art in computational simulation for architectural and environmental acoustics.
Blast Load Simulator Experiments for Computational Model Validation Report 3
2017-07-01
establish confidence in the simulation results specific to their intended use. One method for providing experimental data for computational model... walls, to higher blast pressures required to evaluate the performance of protective construction methods. Figure 1. ERDC Blast Load Simulator (BLS)... Instrumentation included 3 pressure gauges mounted on the steel calibration plate, 2 pressure gauges mounted in the wall of the BLS, and 25 pressure gauges
OSL sensitivity changes during single aliquot procedures: Computer simulations
DEFF Research Database (Denmark)
McKeever, S.W.S.; Agersnap Larsen, N.; Bøtter-Jensen, L.
1997-01-01
We present computer simulations of sensitivity changes obtained during single-aliquot regeneration procedures. The simulations indicate that the sensitivity changes are the combined result of shallow-trap and deep-trap effects. Four separate processes have been identified. Although procedures can... dose used and the natural dose. However, the sensitivity changes appear only weakly dependent upon added dose, suggesting that the SARA single-aliquot technique may be a suitable method to overcome the sensitivity changes. (C) 1997 Elsevier Science Ltd.
Computer Simulation of Sexual Selection on Age-Structured Populations
Martins, S. G. F.; Penna, T. J. P.
Using computer simulations of a bit-string model for age-structured populations, we found that sexual selection of older males is advantageous, from an evolutionary point of view. These results are in opposition to a recent proposal of females choosing younger males. Our simulations are based on findings from recent studies of polygynous bird species. Since secondary sex characters are found mostly in males, we could make use of asexual populations that can be implemented in a fast and efficient way.
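The bit-string model referred to above is the Penna aging model. A minimal asexual variant is sketched below to show the mechanics (genome as a bit mask of age-activated deleterious mutations); the authors' sexual-selection rules and parameter choices are not reproduced, and all constants here are illustrative:

```python
import random

random.seed(3)
BITS, T, BIRTH_AGE, MUT = 32, 3, 8, 1  # genome length, death threshold,
                                       # reproduction age, mutations per birth

def new_genome(parent):
    """Copy the parent genome with MUT harmful mutations at random loci."""
    g = parent
    for _ in range(MUT):
        g |= 1 << random.randrange(BITS)   # may hit an already-set locus
    return g

population = [[0, 0]]  # [age, genome]; one mutation-free founder
for step in range(200):
    survivors = []
    for age, genome in population:
        age += 1
        # Mutations at loci 0..age-1 are "switched on"; T of them kill.
        active = bin(genome & ((1 << age) - 1)).count("1")
        if age < BITS and active < T:
            survivors.append([age, genome])
            if age >= BIRTH_AGE:
                survivors.append([0, new_genome(genome)])
    population = survivors[:2000]          # crude carrying-capacity limit
print(len(population), max(a for a, _ in population))
```

Sexual versions of the model add a second bit string, recombination, and a female choice rule; the age bias of that choice is what the simulations above evaluate.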
“CHRIS”: A Computer Simulation of Schizophrenia*
Santo, Yoav; Finkel, Andy
1982-01-01
“CHRIS” is an experimental computer simulation of a patient with a schizophrenic disorder responding to an initial diagnostic interview with a clinician. The program is designed as a teaching aid in psychiatric interviewing and diagnosis. The user of the simulation assumes the role of the “clinician” conducting a diagnostic interview. Upon completion of the interview, the program checks the user's diagnosis for accuracy; it reports a corrected diagnosis if necessary; and finally, it lists all...
Cloud Computing in Science and Engineering and the “SciShop.ru” Computer Simulation Center
Directory of Open Access Journals (Sweden)
E. V. Vorozhtsov
2011-12-01
Various aspects of cloud computing applications for scientific research, applied design, and remote education are described in this paper. An analysis of the different aspects is performed based on the experience from the “SciShop.ru” Computer Simulation Center. This analysis shows that cloud computing technology has wide prospects in scientific research applications, applied developments and also remote education of specialists, postgraduates, and students.
A Computer Simulation of Community Pharmacy Practice for Educational Use.
Bindoff, Ivan; Ling, Tristan; Bereznicki, Luke; Westbury, Juanita; Chalmers, Leanne; Peterson, Gregory; Ollington, Robert
2014-11-15
To provide a computer-based learning method for pharmacy practice that is as effective as paper-based scenarios, but more engaging and less labor-intensive. We developed a flexible and customizable computer simulation of community pharmacy. Using it, the students would be able to work through scenarios which encapsulate the entirety of a patient presentation. We compared the traditional paper-based teaching method to our computer-based approach using equivalent scenarios. The paper-based group had 2 tutors while the computer group had none. Both groups were given a prescenario and postscenario clinical knowledge quiz and survey. Students in the computer-based group had generally greater improvements in their clinical knowledge score, and third-year students using the computer-based method also showed more improvements in history taking and counseling competencies. Third-year students also found the simulation fun and engaging. Our simulation of community pharmacy provided an educational experience as effective as the paper-based alternative, despite the lack of a human tutor.
A compositional reservoir simulator on distributed memory parallel computers
International Nuclear Information System (INIS)
Rame, M.; Delshad, M.
1995-01-01
This paper presents the application of distributed memory parallel computers to field-scale reservoir simulations using a parallel version of UTCHEM, The University of Texas Chemical Flooding Simulator. The model is a general-purpose, highly vectorized chemical compositional simulator that can simulate a wide range of displacement processes at both field and laboratory scales. The original simulator was modified to run on both distributed memory parallel machines (Intel iPSC/860 and Delta, Connection Machine 5, Kendall Square 1 and 2, and CRAY T3D) and a cluster of workstations. A domain decomposition approach has been taken towards parallelization of the code. A portion of the discrete reservoir model is assigned to each processor by a set-up routine that attempts a data layout as even as possible from the load-balance standpoint. Each of these subdomains is extended so that data can be shared between adjacent processors for stencil computation. The added routines that make parallel execution possible are written in a modular fashion that makes porting to new parallel platforms straightforward. Results of the distributed memory computing performance of the parallel simulator are presented for field-scale applications such as tracer floods and polymer floods. A comparison of the wall-clock times for the same problems on a vector supercomputer is also presented.
Computer simulation studies in condensed-matter physics 5. Proceedings
International Nuclear Information System (INIS)
Landau, D.P.; Mon, K.K.; Schuettler, H.B.
1993-01-01
As the role of computer simulations began to increase in importance, we sensed a need for a "meeting place" for both experienced simulators and neophytes to discuss new techniques and results in an environment which promotes extended discussion. As a consequence of these concerns, The Center for Simulational Physics established an annual workshop on Recent Developments in Computer Simulation Studies in Condensed-Matter Physics. This year's workshop was the fifth in this series, and the interest which the scientific community has shown demonstrates quite clearly the useful purpose which the series has served. The workshop was held at the University of Georgia, February 17-21, 1992, and these proceedings form a record of the workshop which is published with the goal of timely dissemination of the papers to a wider audience. The proceedings are divided into four parts. The first part contains invited papers which deal with simulational studies of classical systems and includes an introduction to some new simulation techniques and special purpose computers as well. A separate section of the proceedings is devoted to invited papers on quantum systems, including new results for strongly correlated electron and quantum spin models. The third section is comprised of a single, invited description of a newly developed software shell designed for running parallel programs. The contributed presentations comprise the final chapter. (orig.). 79 figs
The advanced computational testing and simulation toolkit (ACTS)
International Nuclear Information System (INIS)
Drummond, L.A.; Marques, O.
2002-01-01
During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Distinctively, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips; but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, reduction of duplicate efforts
An introduction to statistical computing a simulation-based approach
Voss, Jochen
2014-01-01
A comprehensive introduction to sampling-based methods in statistical computing The use of computers in mathematics and statistics has opened up a wide range of techniques for studying otherwise intractable problems. Sampling-based simulation techniques are now an invaluable tool for exploring statistical models. This book gives a comprehensive introduction to the exciting area of sampling-based methods. An Introduction to Statistical Computing introduces the classical topics of random number generation and Monte Carlo methods. It also includes some advanced met
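As a flavor of the sampling-based techniques such a text introduces, here is a minimal Monte Carlo example: the classic hit-or-miss estimate of π from uniform points in the unit square. The function name and fixed seed are our own choices.

```python
import random

def monte_carlo_pi(n_samples, seed=0):
    """Estimate pi as 4 times the fraction of uniform points in the
    unit square that fall inside the quarter circle of radius 1."""
    rng = random.Random(seed)   # fixed seed for reproducibility
    inside = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    return 4.0 * inside / n_samples
```

The standard error of the estimate shrinks as 1/sqrt(n), so each extra decimal digit of accuracy costs roughly 100 times more samples.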
Dynamic computer simulation of the Fort St. Vrain steam turbines
International Nuclear Information System (INIS)
Conklin, J.C.
1983-01-01
A computer simulation is described for the dynamic response of the Fort St. Vrain nuclear reactor regenerative intermediate- and low-pressure steam turbines. The fundamental computer-modeling assumptions for the turbines and feedwater heaters are developed. A turbine heat balance specifying steam and feedwater conditions at a given generator load and the volumes of the feedwater heaters are all that are necessary as descriptive input parameters. Actual plant data for a generator load reduction from 100 to 50% power (which occurred as part of a plant transient on November 9, 1981) are compared with computer-generated predictions, with reasonably good agreement
Technology computer aided design simulation for VLSI MOSFET
Sarkar, Chandan Kumar
2013-01-01
Responding to recent developments and a growing VLSI circuit manufacturing market, Technology Computer Aided Design: Simulation for VLSI MOSFET examines advanced MOSFET processes and devices through TCAD numerical simulations. The book provides a balanced summary of TCAD and MOSFET basic concepts, equations, physics, and new technologies related to TCAD and MOSFET. A firm grasp of these concepts allows for the design of better models, thus streamlining the design process, saving time and money. This book places emphasis on the importance of modeling and simulations of VLSI MOS transistors and
Shen, Wenfeng; Wei, Daming; Xu, Weimin; Zhu, Xin; Yuan, Shizhong
2010-10-01
Biological computations like electrocardiological modelling and simulation usually require high-performance computing environments. This paper introduces an implementation of parallel computation for computer simulation of electrocardiograms (ECGs) in a personal computer environment with an Intel Core (TM) 2 Quad Q6600 CPU and a GeForce 8800GT GPU, with software support from OpenMP and CUDA. It was tested in three parallelization setups: (a) a four-core CPU without a general-purpose GPU, (b) a general-purpose GPU plus one core of the CPU, and (c) a four-core CPU plus a general-purpose GPU. To effectively take advantage of a multi-core CPU and a general-purpose GPU, an algorithm based on load-prediction dynamic scheduling was developed and applied to setup (c). In the simulation with 1600 time steps, the speedup of the parallel computation as compared to the serial computation was 3.9 in setup (a), 16.8 in setup (b), and 20.0 in setup (c). This study demonstrates that a current PC with a multi-core CPU and a general-purpose GPU provides a good environment for parallel computations in biological modelling and simulation studies. Copyright 2010 Elsevier Ireland Ltd. All rights reserved.
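A minimal sketch of the load-prediction idea described above: work is split between devices in proportion to their predicted processing rates, which are re-estimated after every time step. The names and the exponential-smoothing update are illustrative assumptions, not the paper's actual scheduler.

```python
def split_load(n_units, rate_cpu, rate_gpu):
    """Split n_units of work between CPU and GPU in proportion to
    their predicted processing rates (units per second)."""
    gpu_share = rate_gpu / (rate_cpu + rate_gpu)
    n_gpu = round(n_units * gpu_share)
    return n_units - n_gpu, n_gpu

class LoadPredictor:
    """Exponentially smoothed rate estimate, updated after each step
    from the observed throughput (assumed smoothing rule)."""
    def __init__(self, initial_rate, alpha=0.5):
        self.rate = initial_rate
        self.alpha = alpha

    def update(self, units_done, elapsed_seconds):
        observed = units_done / elapsed_seconds
        self.rate = self.alpha * observed + (1 - self.alpha) * self.rate
        return self.rate
```

With a GPU predicted to be four times faster than the CPU, 80% of the work goes to the GPU; as measured rates drift, the split follows.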
Computation of induced dipoles in molecular mechanics simulations using graphics processors.
Pratas, Frederico; Sousa, Leonel; Dieterich, Johannes M; Mata, Ricardo A
2012-05-25
In this work, we present a tentative step toward the efficient implementation of polarizable molecular mechanics force fields with GPU acceleration. The computational bottleneck of such applications is found in the treatment of electrostatics, where higher-order multipoles and a self-consistent treatment of polarization effects are needed. We have implemented a GPU accelerated code, based on the Tinker program suite, for the computation of induced dipoles. The largest test system used shows a speedup factor of over 20 for a single precision GPU implementation, when comparing to the serial CPU version. A discussion of the optimization and parametrization steps is included. Comparison between different graphic cards and CPU-GPU embedding is also given. The current work demonstrates the potential usefulness of GPU programming in accelerating this field of applications.
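The self-consistent polarization problem the abstract refers to can be illustrated with a scalar toy model: induced dipoles satisfy mu_i = alpha_i * (E_i + sum_j T_ij * mu_j) and are found by fixed-point iteration. This serial 1-D sketch is our own simplification, not the Tinker-based GPU code.

```python
def induced_dipoles(alpha, e_field, coupling, tol=1e-10, max_iter=200):
    """Solve mu_i = alpha_i * (E_i + sum_{j != i} T_ij * mu_j) by
    Jacobi-style fixed-point iteration (scalar toy model; real codes
    use 3x3 dipole interaction tensors and many sites)."""
    n = len(alpha)
    mu = [a * e for a, e in zip(alpha, e_field)]   # zeroth-order guess
    for _ in range(max_iter):
        new = []
        for i in range(n):
            field = e_field[i] + sum(coupling[i][j] * mu[j]
                                     for j in range(n) if j != i)
            new.append(alpha[i] * field)
        if max(abs(a - b) for a, b in zip(new, mu)) < tol:
            return new
        mu = new
    return mu
```

For two identical sites with coupling 0.1 the exact answer is mu = 1/0.9, which the iteration reaches in a handful of sweeps; it is exactly this embarrassingly parallel per-site update that maps well onto a GPU.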
simulate_CAT: A Computer Program for Post-Hoc Simulation for Computerized Adaptive Testing
Directory of Open Access Journals (Sweden)
İlker Kalender
2015-06-01
This paper presents computer software developed by the author. The software conducts post-hoc simulations for computerized adaptive testing based on real responses of examinees to paper-and-pencil tests, under different parameters that can be defined by the user. In this paper, brief information is given about post-hoc simulations. After that, the working principle of the software is provided and a sample simulation with the required input files is shown. Finally, the output files are described.
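The core loop of a post-hoc CAT simulation is item selection at the current ability estimate. A common rule, sketched below under the two-parameter logistic (2PL) IRT model, picks the unadministered item with maximum Fisher information; this is a generic illustration, not the software's documented algorithm.

```python
import math

def p_2pl(theta, a, b):
    """Probability of a correct response under the 2PL IRT model
    with discrimination a and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def item_information(theta, a, b):
    """Fisher information of one 2PL item at ability theta."""
    p = p_2pl(theta, a, b)
    return a * a * p * (1.0 - p)

def next_item(theta, items, administered):
    """Maximum-information selection: pick the unadministered item
    (index into `items`, each an (a, b) pair) most informative at theta."""
    best, best_info = None, -1.0
    for idx, (a, b) in enumerate(items):
        if idx in administered:
            continue
        info = item_information(theta, a, b)
        if info > best_info:
            best, best_info = idx, info
    return best
```

In a post-hoc run, the recorded paper-and-pencil response to the selected item is fed back to update theta, and the loop repeats until a stopping rule is met.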
Statistical properties of dynamical systems – Simulation and abstract computation
International Nuclear Information System (INIS)
Galatolo, Stefano; Hoyrup, Mathieu; Rojas, Cristóbal
2012-01-01
Highlights: ► A survey on results about computation and computability on the statistical properties of dynamical systems. ► Computability and non-computability results for invariant measures. ► A short proof for the computability of the convergence speed of ergodic averages. ► A kind of “constructive” version of the pointwise ergodic theorem. - Abstract: We survey an area of recent development, relating dynamics to theoretical computer science. We discuss some aspects of the theoretical simulation and computation of the long term behavior of dynamical systems. We will focus on the statistical limiting behavior and invariant measures. We present a general method allowing the algorithmic approximation at any given accuracy of invariant measures. The method can be applied in many interesting cases, as we shall explain. On the other hand, we exhibit some examples where the algorithmic approximation of invariant measures is not possible. We also explain how it is possible to compute the speed of convergence of ergodic averages (when the system is known exactly) and how this entails the computation of arbitrarily good approximations of points of the space having typical statistical behaviour (a sort of constructive version of the pointwise ergodic theorem).
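The ergodic averages discussed above are simple to compute numerically. The sketch below forms a Birkhoff average along an orbit of the logistic map at parameter 4, whose invariant (arcsine) measure has known mean 1/2; the map and function names are our illustrative choices.

```python
def birkhoff_average(f, step, x0, n):
    """Birkhoff average (1/n) * sum_{k<n} f(x_k) along the orbit
    x_{k+1} = step(x_k); for ergodic systems this converges to the
    space average of f with respect to the invariant measure."""
    x, total = x0, 0.0
    for _ in range(n):
        total += f(x)
        x = step(x)
    return total / n

def logistic(x):
    """Chaotic logistic map x -> 4x(1-x), with invariant density
    1 / (pi * sqrt(x(1-x))) on [0, 1] and mean 1/2."""
    return 4.0 * x * (1.0 - x)
```

The survey's point is precisely that for well-behaved systems one can bound how fast such finite-orbit averages approach the true integral, turning this numerical recipe into a rigorous computation.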
Role of Computer Graphics in Simulations for Teaching Physiology.
Modell, H. I.; And Others
1983-01-01
Discusses a revision of existing respiratory physiology simulations to promote active learning experiences for individual students. Computer graphics were added to aid student's conceptualization of the physiological system. Specific examples are provided, including those dealing with alveolar gas equations and effects of anatomic shunt flow on…
Robotics, Artificial Intelligence, Computer Simulation: Future Applications in Special Education.
Moore, Gwendolyn B.; And Others
1986-01-01
Describes possible applications of new technologies to special education. Discusses results of a study designed to explore the use of robotics, artificial intelligence, and computer simulations to aid people with handicapping conditions. Presents several scenarios in which specific technological advances may contribute to special education…
Computer Simulation and Laboratory Work in the Teaching of Mechanics.
Borghi, L.; And Others
1987-01-01
Describes a teaching strategy designed to help high school students learn mechanics by involving them in simple experimental work, observing didactic films, running computer simulations, and executing more complex laboratory experiments. Provides an example of the strategy as it is applied to the topic of projectile motion. (TW)
Calculation of liquid-crystal Frank constants by computer simulation
Allen, M.P.; Frenkel, D.
1988-01-01
We present the first calculations, by computer simulation, of the Frank elastic constants of a liquid crystal composed of freely rotating and translating molecules. Extensive calculations are performed for hard prolate ellipsoids at a single density, and for hard spherocylinders at three densities.
Computational Fluid Dynamics and Building Energy Performance Simulation
DEFF Research Database (Denmark)
Nielsen, Peter V.; Tryggvason, Tryggvi
An interconnection between a building energy performance simulation program and a Computational Fluid Dynamics program (CFD) for room air distribution will be introduced for improvement of the predictions of both the energy consumption and the indoor environment. The building energy performance...
Computational Fluid Dynamics and Building Energy Performance Simulation
DEFF Research Database (Denmark)
Nielsen, Peter Vilhelm; Tryggvason, T.
1998-01-01
An interconnection between a building energy performance simulation program and a Computational Fluid Dynamics program (CFD) for room air distribution will be introduced for improvement of the predictions of both the energy consumption and the indoor environment. The building energy performance...
Pedagogical Approaches to Teaching with Computer Simulations in Science Education
Rutten, N.P.G.; van der Veen, Johan (CTIT); van Joolingen, Wouter; McBride, Ron; Searson, Michael
2013-01-01
For this study we interviewed 24 physics teachers about their opinions on teaching with computer simulations. The purpose of this study is to investigate whether it is possible to distinguish different types of teaching approaches. Our results indicate the existence of two types. The first type is
Using computer simulations to improve concept formation in chemistry
African Journals Online (AJOL)
The goal of this research project was to investigate whether computer simulations used as a visually-supporting teaching strategy, can improve concept formation with regard to molecules and chemical bonding, as found in water. Both the qualitative and quantitative evaluation of responses supported the positive outcome ...
Computer simulation for integrated pest management of spruce budworms
Carroll B. Williams; Patrick J. Shea
1982-01-01
Some field studies of the effects of various insecticides on the spruce budworm (Choristoneura sp.) and their parasites have shown severe suppression of host (budworm) populations and increased parasitism after treatment. Computer simulation using hypothetical models of spruce budworm-parasite systems based on these field data revealed that (1)...
Faster quantum chemistry simulation on fault-tolerant quantum computers
International Nuclear Information System (INIS)
Cody Jones, N; McMahon, Peter L; Yamamoto, Yoshihisa; Whitfield, James D; Yung, Man-Hong; Aspuru-Guzik, Alán; Van Meter, Rodney
2012-01-01
Quantum computers can in principle simulate quantum physics exponentially faster than their classical counterparts, but some technical hurdles remain. We propose methods which substantially improve the performance of a particular form of simulation, ab initio quantum chemistry, on fault-tolerant quantum computers; these methods generalize readily to other quantum simulation problems. Quantum teleportation plays a key role in these improvements and is used extensively as a computing resource. To improve execution time, we examine techniques for constructing arbitrary gates which perform substantially faster than circuits based on the conventional Solovay–Kitaev algorithm (Dawson and Nielsen 2006 Quantum Inform. Comput. 6 81). For a given approximation error ϵ, arbitrary single-qubit gates can be produced fault-tolerantly and using a restricted set of gates in time which is O(log ϵ) or O(log log ϵ); with sufficient parallel preparation of ancillas, constant average depth is possible using a method we call programmable ancilla rotations. Moreover, we construct and analyze efficient implementations of first- and second-quantized simulation algorithms using the fault-tolerant arbitrary gates and other techniques, such as implementing various subroutines in constant time. A specific example we analyze is the ground-state energy calculation for lithium hydride. (paper)
Solving wood chip transport problems with computer simulation.
Dennis P. Bradley; Sharon A. Winsauer
1976-01-01
Efficient chip transport operations are difficult to achieve due to frequent and often unpredictable changes in distance to market, chipping rate, time spent at the mill, and equipment costs. This paper describes a computer simulation model that allows a logger to design an efficient transport system in response to these changing factors.
Computer Simulation of the Impact of Cigarette Smoking On Humans
African Journals Online (AJOL)
2012-12-01
In this edition, emphasis has been laid on computer simulation of the impact of cigarette smoking on the population between now and the next 50 years, if no government intervention is exercised to control the behaviour of smokers. The statistical indices derived from the previous article (WAJIAR ...
Biology Students Building Computer Simulations Using StarLogo TNG
Smith, V. Anne; Duncan, Ishbel
2011-01-01
Confidence is an important issue for biology students in handling computational concepts. This paper describes a practical in which honours-level bioscience students simulate complex animal behaviour using StarLogo TNG, a freely-available graphical programming environment. The practical consists of two sessions, the first of which guides students…
Flow Through a Laboratory Sediment Sample by Computer Simulation Modeling
2006-09-07
Keywords: sands; interacting lattice gas; computer simulation; driven flow.
Robotics, Artificial Intelligence, Computer Simulation: Future Applications in Special Education.
Moore, Gwendolyn B.; And Others
The report describes three advanced technologies--robotics, artificial intelligence, and computer simulation--and identifies the ways in which they might contribute to special education. A hybrid methodology was employed to identify existing technology and forecast future needs. Following this framework, each of the technologies is defined,…
Computer Simulation of the Impact of Cigarette Smoking On Humans
African Journals Online (AJOL)
In this edition, emphasis has been laid on computer simulation of the impact of cigarette smoking on the population between now and the next 50 years, if no government intervention is exercised to control the behaviour of smokers. The statistical indices derived from the previous article (WAJIAR Volume 4) in the series ...
Learner Perceptions of Realism and Magic in Computer Simulations.
Hennessy, Sara; O'Shea, Tim
1993-01-01
Discusses the possible lack of credibility in educational interactive computer simulations. Topics addressed include "Shopping on Mars," a collaborative adventure game for arithmetic calculation that uses direct manipulation in the microworld; the Alternative Reality Kit, a graphical animated environment for creating interactive…
Computer simulation of polymer-induced clustering of colloids
Meijer, E.J.; Frenkel, D.
1991-01-01
We have developed a novel computational scheme that allows direct numerical simulation of polymer-colloid mixtures at constant osmotic pressure. Using this technique, we have studied the entropic attraction that is caused by ideal polymers dissolved in a simple (hard-sphere) colloidal dispersion. In
The acoustical history of Hagia Sophia revived through computer simulations
DEFF Research Database (Denmark)
Rindel, Jens Holger; Weitze, C.A.; Christensen, Claus Lynge
2002-01-01
The present paper deals with acoustic computer simulations of Hagia Sophia, which is characterized not only by being one of the largest worship buildings in the world, but also by – in its 1500 year history – having served three purposes: as a church, as a mosque and today as a museum...
Computer simulation study of water using a fluctuating charge model
Indian Academy of Sciences (India)
Unknown
study of water through computer simulation methods has attracted considerable attention. ... water. In particular, the single particle and collective relaxation times obtained using this model are in rough agreement with experiment. Yet, in all these quantities, the ..... The fictitious mass of the charge has to be chosen with care.
The tension of framed membranes from computer simulations
DEFF Research Database (Denmark)
Hamkens, Daniel; Jeppesen, Claus; Ipsen, John H.
2018-01-01
Abstract.: We have analyzed the behavior of a randomly triangulated, self-avoiding surface model of a flexible, fluid membrane subject to a circular boundary by Wang-Landau Monte Carlo computer simulation techniques. The dependence of the canonical free energy and frame tension on the frame area...
Interactive Electronic Circuit Simulation on Small Computer Systems
1979-11-01
Highway traffic simulation on multi-processor computers
Energy Technology Data Exchange (ETDEWEB)
Hanebutte, U.R.; Doss, E.; Tentner, A.M.
1997-04-01
A computer model has been developed to simulate highway traffic for various degrees of automation with a high level of fidelity in regard to driver control and vehicle characteristics. The model simulates vehicle maneuvering in a multi-lane highway traffic system and allows for the use of Intelligent Transportation System (ITS) technologies such as an Automated Intelligent Cruise Control (AICC). The structure of the computer model facilitates the use of parallel computers for the highway traffic simulation, since domain decomposition techniques can be applied in a straightforward fashion. In this model, the highway system (i.e. a network of road links) is divided into multiple regions; each region is controlled by a separate link manager residing on an individual processor. A graphical user interface augments the computer model by allowing for real-time interactive simulation control and interaction with each individual vehicle and road side infrastructure element on each link. Average speed and traffic volume data are collected at user-specified loop detector locations. Further, as a measure of safety the so-called Time To Collision (TTC) parameter is being recorded.
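The Time To Collision measure mentioned above has a standard constant-speed definition, sketched here; units and argument names are our own assumptions.

```python
def time_to_collision(gap_m, v_follower, v_leader):
    """Time To Collision (TTC), in seconds: time until the following
    vehicle closes the gap (metres) to the leader, assuming both hold
    their current speeds (m/s). Infinite when the follower is not closing."""
    closing_speed = v_follower - v_leader
    if closing_speed <= 0.0:
        return float("inf")
    return gap_m / closing_speed
```

Low TTC values flag imminent rear-end conflicts, which is why the simulation records them as a safety measure at each time step.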
Simulations Using a Computer/Videodisc System: Instructional Design Considerations.
Ehrlich, Lisa R.
Instructional design considerations involved in using level four videodisc systems when designing simulations are explored. Discussion of the hardware and software system characteristics notes that computer based training offers the features of text, graphics, color, animation, and highlighting techniques, while a videodisc player offers all of…
Atomic Force Microscopy and Real Atomic Resolution. Simple Computer Simulations
Koutsos, V.; Manias, E.; Brinke, G. ten; Hadziioannou, G.
1994-01-01
Using a simple computer simulation for AFM imaging in the contact mode, pictures with true and false atomic resolution are demonstrated. The surface probed consists of two f.c.c. (111) planes and an atomic vacancy is introduced in the upper layer. Changing the size of the effective tip and its
Graphical Visualization on Computational Simulation Using Shared Memory
International Nuclear Information System (INIS)
Lima, A B; Correa, Eberth
2014-01-01
The Shared Memory technique is a powerful tool for parallelizing computer codes. In particular it can be used to visualize the results "on the fly" without stopping the running simulation. In this presentation we discuss and show how to use the technique in conjunction with a visualization code using OpenGL.
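In Python (3.8+), the same idea of exposing simulation state to a viewer "on the fly" can be sketched with the standard multiprocessing.shared_memory module; the segment name and helper names below are illustrative.

```python
import struct
from multiprocessing import shared_memory

def publish_state(name, values):
    """Simulation side: write the current state vector (doubles) into a
    named shared-memory segment a viewer process can attach to. The
    caller keeps the returned handle alive and unlinks it when done."""
    payload = struct.pack(f"{len(values)}d", *values)
    shm = shared_memory.SharedMemory(name=name, create=True, size=len(payload))
    shm.buf[:len(payload)] = payload
    return shm

def read_state(name, count):
    """Viewer side: attach to the segment by name and decode `count`
    doubles without pausing the producer."""
    shm = shared_memory.SharedMemory(name=name)
    values = struct.unpack(f"{count}d", bytes(shm.buf[:count * 8]))
    shm.close()
    return list(values)
```

A real visualizer would poll read_state each frame and hand the values to OpenGL; the simulation only ever touches the buffer, so neither side blocks the other.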
Computer simulation study of water using a fluctuating charge model
Indian Academy of Sciences (India)
Unknown
Abstract. Hydrogen bonding in small water clusters is studied through computer simulation methods using a sophisticated, empirical model of interaction developed by Rick et al (S W Rick, S J Stuart and B J Berne 1994 J. Chem. Phys. 101 6141) and others. The model allows for the charges on the interacting sites to ...
Computer simulation and metallography of locally non-homogeneous materials
Czech Academy of Sciences Publication Activity Database
Ilucová, Lucia; Ponížil, P.; Svoboda, Milan; Saxl, Ivan
2007-01-01
Roč. 13, č. 1 (2007), s. 84-90 ISSN 1335-1532 R&D Projects: GA ČR GA201/06/0302 Institutional research plan: CEZ:AV0Z10190503; CEZ:AV0Z20410507 Keywords : non-homogeneous materials * computer simulation and metallography Subject RIV: JP - Industrial Processing
Student generated assignments about electrical circuits in a computer simulation
Vreman-de Olde, Cornelise; de Jong, Anthonius J.M.
2004-01-01
In this study we investigated the design of assignments by students as a knowledge-generating activity. Students were required to design assignments for 'other students' in a computer simulation environment about electrical circuits. Assignments consisted of a question, alternatives, and feedback on
Advanced Simulation and Computing Co-Design Strategy
Energy Technology Data Exchange (ETDEWEB)
Ang, James A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hoang, Thuc T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kelly, Suzanne M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); McPherson, Allen [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Neely, Rob [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)
2015-11-01
This ASC Co-design Strategy lays out the full continuum and components of the co-design process, based on what we have experienced thus far and what we wish to do more in the future to meet the program’s mission of providing high performance computing (HPC) and simulation capabilities for NNSA to carry out its stockpile stewardship responsibility.
Computer simulation and image guidance for individualised dynamic spinal stabilization.
Kantelhardt, S R; Hausen, U; Kosterhon, M; Amr, A N; Gruber, K; Giese, A
2015-08-01
Dynamic implants for the human spine are used to re-establish regular segmental motion. However, the results have often been unsatisfactory and complications such as screw loosening are common. Individualisation of appliances and precision implantation are needed to improve the outcome of this procedure. Computer simulation, virtual implant optimisation and image guidance were used to improve the technique. A human lumbar spine computer model was developed using multi-body simulation software. The model simulates spinal motion under load and degenerative changes. After virtual degeneration of a L4/5 segment, virtual pedicle screw-based implants were introduced. The implants' positions and properties were iteratively optimised. The resulting implant positions were used as operative plan for image guidance and finally implemented in a physical spine model. In the simulation, the introduction and optimisation of virtually designed dynamic implants could partly compensate for the effects of virtual lumbar segment degeneration. The optimised operative plan was exported to two different image-guidance systems for transfer to a physical spine model. Three-dimensional computer graphic simulation is a feasible means to develop operative plans for dynamic spinal stabilization. These operative plans can be transferred to commercially available image-guidance systems for use in implantation of physical implants in a spine model. This concept has important potential in the design of operative plans and implants for individualised dynamic spine stabilization surgery.
SPINET: A Parallel Computing Approach to Spine Simulations
Directory of Open Access Journals (Sweden)
Peter G. Kropf
1996-01-01
Research in scientific programming enables us to realize more and more complex applications, and on the other hand, application-driven demands on computing methods and power are continuously growing. Therefore, interdisciplinary approaches become more widely used. The interdisciplinary SPINET project presented in this article applies modern scientific computing tools to biomechanical simulations: parallel computing and symbolic and modern functional programming. The target application is the human spine. Simulations of the spine help us to investigate and better understand the mechanisms of back pain and spinal injury. Two approaches have been used: the first uses the finite element method for high-performance simulations of static biomechanical models, and the second generates a simulation development tool for experimenting with different dynamic models. A finite element program for static analysis has been parallelized for the MUSIC machine. To solve the sparse system of linear equations, a conjugate gradient solver (iterative method) and a frontal solver (direct method) have been implemented. The preprocessor required for the frontal solver is written in the modern functional programming language SML, the solver itself in C, thus exploiting the characteristic advantages of both functional and imperative programming. The speedup analysis of both solvers shows very satisfactory results for this irregular problem. A mixed symbolic-numeric environment for rigid body system simulations is presented. It automatically generates C code from a problem specification expressed by the Lagrange formalism using Maple.
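The conjugate gradient (iterative) solver mentioned above can be sketched in a dense toy version. The SPINET implementation works on sparse finite element systems and is written in C, so this Python sketch is only illustrative of the algorithm.

```python
def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Solve A x = b for a symmetric positive-definite matrix A by the
    conjugate gradient method (dense lists here; production codes use
    sparse storage and matrix-free products)."""
    n = len(b)
    x = [0.0] * n
    r = b[:]                 # residual b - A x for the zero initial guess
    p = r[:]                 # first search direction
    rs_old = sum(ri * ri for ri in r)
    for _ in range(max_iter):
        Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
        alpha = rs_old / sum(p[i] * Ap[i] for i in range(n))
        x = [x[i] + alpha * p[i] for i in range(n)]
        r = [r[i] - alpha * Ap[i] for i in range(n)]
        rs_new = sum(ri * ri for ri in r)
        if rs_new < tol * tol:   # converged: residual norm below tol
            break
        p = [r[i] + (rs_new / rs_old) * p[i] for i in range(n)]
        rs_old = rs_new
    return x
```

In exact arithmetic CG terminates in at most n steps; in practice it is stopped once the residual is small, and only matrix-vector products are needed, which is what makes it attractive for large sparse finite element systems.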
Bibliography for Verification and Validation in Computational Simulation
International Nuclear Information System (INIS)
Oberkampf, W.L.
1998-01-01
A bibliography has been compiled dealing with the verification and validation of computational simulations. The references listed in this bibliography are concentrated in the field of computational fluid dynamics (CFD). However, references from the following fields are also included: operations research, heat transfer, solid dynamics, software quality assurance, software accreditation, military systems, and nuclear reactor safety. This bibliography, containing 221 references, is not meant to be comprehensive. It was compiled during the last ten years in response to the author's interest and research in the methodology for verification and validation. The emphasis in the bibliography is in the following areas: philosophy of science underpinnings, development of terminology and methodology, high accuracy solutions for CFD verification, experimental datasets for CFD validation, and the statistical quantification of model validation. This bibliography should provide a starting point for individual researchers in many fields of computational simulation in science and engineering
Bibliography for Verification and Validation in Computational Simulations
Energy Technology Data Exchange (ETDEWEB)
Oberkampf, W.L.
1998-10-01
A bibliography has been compiled dealing with the verification and validation of computational simulations. The references listed in this bibliography are concentrated in the field of computational fluid dynamics (CFD). However, references from the following fields are also included: operations research, heat transfer, solid dynamics, software quality assurance, software accreditation, military systems, and nuclear reactor safety. This bibliography, containing 221 references, is not meant to be comprehensive. It was compiled during the last ten years in response to the author's interest and research in the methodology for verification and validation. The emphasis in the bibliography is in the following areas: philosophy of science underpinnings, development of terminology and methodology, high accuracy solutions for CFD verification, experimental datasets for CFD validation, and the statistical quantification of model validation. This bibliography should provide a starting point for individual researchers in many fields of computational simulation in science and engineering.
Computational Dehydration of Crystalline Hydrates Using Molecular Dynamics Simulations
DEFF Research Database (Denmark)
Larsen, Anders Støttrup; Rantanen, Jukka; Johansson, Kristoffer E
2017-01-01
Molecular dynamics (MD) simulations have evolved into an increasingly reliable and accessible technique and are today implemented in many areas of biomedical sciences. We present a generally applicable method to study dehydration of hydrates based on MD simulations and apply this approach to the dehydration of ampicillin trihydrate. The crystallographic unit cell of the trihydrate is used to construct the simulation cell containing 216 ampicillin and 648 water molecules. This system is dehydrated by removing water molecules during a 2200 ps simulation, and depending on the computational dehydration rate, different dehydrated structures were observed. Removing all water molecules immediately and removing water relatively fast (10 water molecules/10 ps) resulted in an amorphous system, whereas relatively slow computational dehydration (3 water molecules/10 ps) resulted in a crystalline anhydrate.
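The stepwise dehydration protocol can be illustrated with a toy bookkeeping script. No molecular dynamics is performed here; the counts and rates mirror the abstract, while picking the removed waters at random is an assumption of this sketch.

```python
import random

# Toy bookkeeping of stepwise dehydration: start from the 648 waters in the
# simulation cell and remove a fixed number every 10 ps interval.

def dehydrate(n_waters=648, waters_per_interval=3, interval_ps=10, seed=0):
    rng = random.Random(seed)
    waters = list(range(n_waters))
    schedule = []                  # (elapsed ps, waters remaining) per step
    t = 0
    while waters:
        for _ in range(min(waters_per_interval, len(waters))):
            waters.pop(rng.randrange(len(waters)))  # remove a random water
        t += interval_ps
        schedule.append((t, len(waters)))
    return schedule

slow = dehydrate(waters_per_interval=3)    # "slow" rate: 3 waters / 10 ps
fast = dehydrate(waters_per_interval=10)   # "fast" rate: 10 waters / 10 ps
```

At the slow rate, full dehydration takes 648/3 = 216 intervals, i.e. 2160 ps, consistent with the 2200 ps trajectory length quoted in the abstract.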
Computer simulation of human motion in sports biomechanics.
Vaughan, C L
1984-01-01
This chapter has covered some important aspects of the computer simulation of human motion in sports biomechanics. First the definition and the advantages and limitations of computer simulation were discussed; second, research on various sporting activities was reviewed. These activities included basic movements, aquatic sports, track and field athletics, winter sports, gymnastics, and striking sports. This list was not exhaustive and certain material has, of necessity, been omitted. However, it was felt that a sufficiently broad and interesting range of activities was chosen to illustrate both the advantages and the pitfalls of simulation. It is almost a decade since Miller [53] wrote a review chapter similar to this one. One might be tempted to say that things have changed radically since then--that computer simulation is now a widely accepted and readily applied research tool in sports biomechanics. This is simply not true, however. Biomechanics researchers still tend to emphasize the descriptive type of study, often, unfortunately, when a little theoretical explanation would have been more helpful [29]. What will the next decade bring? Of one thing we can be certain: The power of computers, particularly the readily accessible and portable microcomputer, will expand beyond all recognition. The memory and storage capacities will increase dramatically on the hardware side, and on the software side the trend will be toward "user-friendliness." It is likely that a number of software simulation packages designed specifically for studying human motion [31, 96] will be extensively tested and could gain wide acceptance in the biomechanics research community. Nevertheless, a familiarity with Newtonian and Lagrangian mechanics, optimization theory, and computers in general, as well as practical biomechanical insight, will still be a prerequisite for successful simulation models of human motion. Above all, the biomechanics researcher will still have to bear in mind that
Adaptive scapula bone remodeling computational simulation: Relevance to regenerative medicine
Sharma, Gulshan B.; Robertson, Douglas D.
2013-07-01
Shoulder arthroplasty success has been attributed to many factors including bone quality, soft tissue balancing, surgeon experience, and implant design. Improved long-term success is primarily limited by glenoid implant loosening. Prosthesis design examines materials and shape and determines whether the design should withstand a lifetime of use. Finite element (FE) analyses have been extensively used to study stresses and strains produced in implants and bone. However, these static analyses only measure a moment in time and not the adaptive response to the altered environment produced by the therapeutic intervention. Computational analyses that integrate remodeling rules predict how bone will respond over time. Recent work has shown that subject-specific two- and three-dimensional adaptive bone remodeling models are feasible and valid. Feasibility and validation were achieved computationally, simulating bone remodeling using an intact human scapula, initially resetting the scapular bone material properties to be uniform, numerically simulating sequential loading, and comparing the bone remodeling simulation results to the actual scapula's material properties. A three-dimensional scapula FE bone model was created using volumetric computed tomography images. Muscle and joint load and boundary conditions were applied based on values reported in the literature. Internal bone remodeling was based on element strain-energy density. Initially, all bone elements were assigned a homogeneous density. All loads were applied for 10 iterations. After every iteration, each bone element's remodeling stimulus was compared to its corresponding reference stimulus and its material properties modified. The simulation achieved convergence. At the end of the simulation the predicted and actual specimen bone apparent density were plotted and compared. Location of high and low predicted bone density was comparable to the actual specimen. High predicted bone density was greater than actual
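The remodeling loop described above can be caricatured in a few lines: each element's stimulus (strain energy density per unit mass) is compared with a reference value and its density is nudged toward equilibrium, subject to physiological bounds. The multiplicative update rule and every constant below are illustrative assumptions, not the authors' model.

```python
# Toy strain-energy-density (SED) driven bone remodeling loop.
# All constants and the update rule are illustrative assumptions.

def remodel(densities, sed, reference=0.004, rate=0.5, n_iter=10,
            rho_min=0.01, rho_max=1.8):
    rho = list(densities)
    for _ in range(n_iter):
        for i in range(len(rho)):
            stimulus = sed[i] / rho[i]                # SED per unit mass
            rho[i] *= (stimulus / reference) ** rate  # grow where overloaded
            rho[i] = min(max(rho[i], rho_min), rho_max)  # physiological bounds
    return rho

# homogeneous start; low- and high-load elements diverge over iterations
final = remodel([0.8, 0.8], sed=[0.002, 0.008])
```

With this update, a lightly loaded element resorbs toward its equilibrium density (stimulus equals reference) while a heavily loaded one densifies until it hits the upper bound, mirroring the qualitative behavior the abstract reports.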
Adaptive scapula bone remodeling computational simulation: Relevance to regenerative medicine
Energy Technology Data Exchange (ETDEWEB)
Sharma, Gulshan B., E-mail: gbsharma@ucalgary.ca [Emory University, Department of Radiology and Imaging Sciences, Spine and Orthopaedic Center, Atlanta, Georgia 30329 (United States); University of Pittsburgh, Swanson School of Engineering, Department of Bioengineering, Pittsburgh, Pennsylvania 15213 (United States); University of Calgary, Schulich School of Engineering, Department of Mechanical and Manufacturing Engineering, Calgary, Alberta T2N 1N4 (Canada); Robertson, Douglas D., E-mail: douglas.d.robertson@emory.edu [Emory University, Department of Radiology and Imaging Sciences, Spine and Orthopaedic Center, Atlanta, Georgia 30329 (United States); University of Pittsburgh, Swanson School of Engineering, Department of Bioengineering, Pittsburgh, Pennsylvania 15213 (United States)
2013-07-01
Shoulder arthroplasty success has been attributed to many factors including bone quality, soft tissue balancing, surgeon experience, and implant design. Improved long-term success is primarily limited by glenoid implant loosening. Prosthesis design examines materials and shape and determines whether the design should withstand a lifetime of use. Finite element (FE) analyses have been extensively used to study stresses and strains produced in implants and bone. However, these static analyses only measure a moment in time and not the adaptive response to the altered environment produced by the therapeutic intervention. Computational analyses that integrate remodeling rules predict how bone will respond over time. Recent work has shown that subject-specific two- and three-dimensional adaptive bone remodeling models are feasible and valid. Feasibility and validation were achieved computationally, simulating bone remodeling using an intact human scapula, initially resetting the scapular bone material properties to be uniform, numerically simulating sequential loading, and comparing the bone remodeling simulation results to the actual scapula’s material properties. A three-dimensional scapula FE bone model was created using volumetric computed tomography images. Muscle and joint load and boundary conditions were applied based on values reported in the literature. Internal bone remodeling was based on element strain-energy density. Initially, all bone elements were assigned a homogeneous density. All loads were applied for 10 iterations. After every iteration, each bone element’s remodeling stimulus was compared to its corresponding reference stimulus and its material properties modified. The simulation achieved convergence. At the end of the simulation the predicted and actual specimen bone apparent density were plotted and compared. Location of high and low predicted bone density was comparable to the actual specimen. High predicted bone density was greater than
Computational physics simulation of classical and quantum systems
Scherer, Philipp O J
2013-01-01
This textbook presents basic and advanced computational physics in a very didactic style. It contains clear, well-presented mathematical descriptions of many of the most important algorithms and techniques used in computational physics. The first part of the book discusses the basic numerical methods. A large number of exercises and computer experiments allows the reader to study the properties of these methods. The second part concentrates on the simulation of classical and quantum systems. It uses a rather general concept for the equation of motion which can be applied to ordinary and partial differential equations. Several classes of integration methods are discussed, including not only the standard Euler and Runge-Kutta methods but also multistep methods and the class of Verlet methods, which is introduced by studying the motion in Liouville space. Besides the classical methods, inverse interpolation is discussed, together with the p...
Ravenscar Computational Model compliant AADL Simulation on LEON2
Directory of Open Access Journals (Sweden)
Roberto Varona-Gómez
2013-02-01
AADL has been proposed for designing and analyzing SW and HW architectures for real-time mission-critical embedded systems. Although the Behavioral Annex improves its simulation semantics, AADL is a language for analyzing architectures, not for simulating them. AADS-T is an AADL simulation tool that supports performance analysis of the AADL specification throughout the refinement process, from the initial system architecture until the complete, detailed application and execution platform are developed. In this way, AADS-T enables the verification of the initial timing constraints during the complete design process. In this paper we focus on the compatibility of AADS-T with the Ravenscar Computational Model (RCM) as part of the TASTE toolset. Its flexibility enables AADS-T to support different processors. In this work we have focused on performing the simulation on a LEON2 processor.
Computer simulation of two-phase flow in nuclear reactors
International Nuclear Information System (INIS)
Wulff, W.
1993-01-01
Two-phase flow models dominate the requirements of economic resources for the development and use of computer codes which serve to analyze thermohydraulic transients in nuclear power plants. An attempt is made to reduce the effort of analyzing reactor transients by combining purpose-oriented modelling with advanced computing techniques. Six principles are presented on mathematical modeling and the selection of numerical methods, along with suggestions on programming and machine selection, all aimed at reducing the cost of analysis. Computer simulation is contrasted with traditional computer calculation. The advantages of run-time interactive access operation in a simulation environment are demonstrated. It is explained that the drift-flux model is better suited than the two-fluid model for the analysis of two-phase flow in nuclear reactors, because of the latter's closure problems. The advantage of analytical over numerical integration is demonstrated. Modeling and programming techniques are presented which minimize the number of needed arithmetical and logical operations and thereby increase the simulation speed, while decreasing the cost. (orig.)
Monte Carlo simulation with the Gate software using grid computing
International Nuclear Information System (INIS)
Reuillon, R.; Hill, D.R.C.; Gouinaud, C.; El Bitar, Z.; Breton, V.; Buvat, I.
2009-03-01
Monte Carlo simulations are widely used in emission tomography, for protocol optimization, design of processing or data analysis methods, tomographic reconstruction, or tomograph design optimization. Monte Carlo simulations needing many replicates to obtain good statistical results can be easily executed in parallel using the 'Multiple Replications In Parallel' approach. However, several precautions have to be taken in the generation of the parallel streams of pseudo-random numbers. In this paper, we present the distribution of Monte Carlo simulations performed with the GATE software using local clusters and grid computing. We obtained very convincing results with this large medical application, thanks to the EGEE Grid (Enabling Grid for E-science), achieving in one week computations that could have taken more than 3 years of processing on a single computer. This work has been achieved thanks to a generic object-oriented toolbox called DistMe which we designed to automate this kind of parallelization for Monte Carlo simulations. This toolbox, written in Java, is freely available on SourceForge and helped to ensure a rigorous distribution of pseudo-random number streams. It is based on the use of a documented XML format for random number generator statuses. (authors)
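The 'Multiple Replications In Parallel' idea reduces to giving every replicate its own independent random stream and pooling the results. The sketch below runs the replicates sequentially for clarity; on a grid each call would land on its own node, and the per-replicate seeding here merely stands in for the rigorous stream partitioning that the DistMe toolbox automates.

```python
import random

# MRIP sketch: independent replicates with private random streams,
# pooled into one Monte Carlo estimate (here, a trivial estimate of pi).

def one_replicate(seed, n_samples=100_000):
    rng = random.Random(seed)              # private stream for this replicate
    hits = sum(rng.random() ** 2 + rng.random() ** 2 <= 1.0
               for _ in range(n_samples))
    return hits, n_samples

def pool(results):
    hits = sum(h for h, _ in results)
    total = sum(n for _, n in results)
    return 4.0 * hits / total              # pooled Monte Carlo estimate of pi

pi_hat = pool([one_replicate(seed) for seed in range(8)])   # 8 "nodes"
```

The caution in the abstract applies exactly here: naive seeding can produce overlapping or correlated streams, which is why documented generator states and principled stream splitting matter for large grid runs.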
COMPUTER MODEL AND SIMULATION OF A GLOVE BOX PROCESS
International Nuclear Information System (INIS)
Foster, C.
2001-01-01
The development of facilities to deal with the disposition of nuclear materials at an acceptable level of Occupational Radiation Exposure (ORE) is a significant issue facing the nuclear community. One solution is to minimize the worker's exposure through the use of automated systems. However, the adoption of automated systems for these tasks is hampered by the challenging requirements that these systems must meet in order to be cost-effective solutions in the hazardous nuclear materials processing environment. Retrofitting current glove box technologies with automation systems represents potential near-term technology that can be applied to reduce worker ORE associated with work in nuclear materials processing facilities. Successful deployment of automation systems for these applications requires the development of testing and deployment strategies to ensure the highest level of safety and effectiveness. Historically, safety tests are conducted with glove box mock-ups around the finished design. This late detection of problems leads to expensive redesigns and costly deployment delays. With the widespread availability of computers and cost-effective simulation software, it is possible to discover and fix problems early in the design stages. Computer simulators can easily create a complete model of the system, allowing a safe medium for testing potential failures and design shortcomings. The majority of design specification is now done on computer, and moving that information to a model is relatively straightforward. With a complete model and results from a Failure Mode Effect Analysis (FMEA), redesigns can be worked early. Additional issues such as user accessibility, component replacement, and alignment problems can be tackled early in the virtual environment provided by computer simulation. In this case, a commercial simulation package is used to simulate a lathe process operation at the Los Alamos National Laboratory (LANL). The lathe process operation is indicative of
Performance predictions for solar-chemical convertors by computer simulation
Energy Technology Data Exchange (ETDEWEB)
Luttmer, J.D.; Trachtenberg, I.
1985-08-01
A computer model which simulates the operation of the Texas Instruments solar-chemical convertor (SCC) was developed. The model allows optimization of SCC processes, materials, and configuration by facilitating decisions on tradeoffs among ease of manufacturing, power conversion efficiency, and cost effectiveness. The model includes various algorithms which define the electrical, electrochemical, and resistance parameters and which describe the operation of the discrete components of the SCC. Results of the model which depict the effect of material and geometric changes on various parameters are presented. The computer-calculated operation is compared with experimentally observed hydrobromic acid electrolysis rates.
Modeling and simulation the computer science of illusion
Raczynski, Stanislaw
2006-01-01
Simulation is the art of using tools - physical or conceptual models, or computer hardware and software, to attempt to create the illusion of reality. The discipline has in recent years expanded to include the modelling of systems that rely on human factors and therefore possess a large proportion of uncertainty, such as social, economic or commercial systems. These new applications make the discipline of modelling and simulation a field of dynamic growth and new research. Stanislaw Raczynski outlines the considerable and promising research that is being conducted to counter the problems of
A computer simulation approach to measurement of human control strategy
Green, J.; Davenport, E. L.; Engler, H. F.; Sears, W. E., III
1982-01-01
Human control strategy is measured through use of a psychologically-based computer simulation which reflects a broader theory of control behavior. The simulation is called the human operator performance emulator, or HOPE. HOPE was designed to emulate control learning in a one-dimensional preview tracking task and to measure control strategy in that setting. When given a numerical representation of a track and information about current position in relation to that track, HOPE generates positions for a stick controlling the cursor to be moved along the track. In other words, HOPE generates control stick behavior corresponding to that which might be used by a person learning preview tracking.
How Many Times Should One Run a Computational Simulation?
DEFF Research Database (Denmark)
Seri, Raffaello; Secchi, Davide
2017-01-01
This chapter is an attempt to answer the question “how many runs of a computational simulation should one do,” and it gives an answer by means of statistical analysis. After defining the nature of the problem and which types of simulation are mostly affected by it, the article introduces statistical power analysis as a way to determine the appropriate number of runs. Two examples are then produced using results from an agent-based model. The reader is then guided through the application of this statistical technique and exposed to its limits and potentials.
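The power-analysis logic can be made concrete with the standard normal-approximation sample-size formula for a two-sample comparison of means: given the smallest effect size of interest (Cohen's d), a significance level, and a target power, it returns the number of runs needed per experimental condition. The function and numbers below are a generic sketch of the approach, not code or values from the chapter.

```python
import math
from statistics import NormalDist

# Normal-approximation sample size for a two-sample comparison of means:
# n per group = 2 * ((z_{1-alpha/2} + z_{power}) / d)^2

def runs_per_condition(effect_size, alpha=0.05, power=0.80):
    z = NormalDist().inv_cdf
    n = 2.0 * ((z(1.0 - alpha / 2.0) + z(power)) / effect_size) ** 2
    return math.ceil(n)

# e.g. detecting a medium effect (d = 0.5) at alpha = 0.05 with 80% power
n_runs = runs_per_condition(0.5)
```

For a medium effect this gives 63 runs per condition; halving the effect size of interest roughly quadruples the requirement, which is why "how many runs" has no single answer without stating the effect one wants to detect.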
COMPUTATIONAL SIMULATION OF FIRE DEVELOPMENT INSIDE A TRADE CENTRE
Directory of Open Access Journals (Sweden)
Constantin LUPU
2015-07-01
Real-scale fire experiments involve considerable costs compared to computational mathematical modelling. This paper is the result of such a virtual simulation of a fire occurring in a hypothetical wholesale warehouse comprising a large number of trade stands. The analysis starts from the ignition source located inside a trade stand and follows the fire's expansion over three groups of compartments, highlighting the heat transfer both in small spaces and over large distances. In order to confirm the accuracy of the simulation, the values obtained are compared to those from the specialized literature.
Computational electronics semiclassical and quantum device modeling and simulation
Vasileska, Dragica; Klimeck, Gerhard
2010-01-01
Starting with the simplest semiclassical approaches and ending with the description of complex fully quantum-mechanical methods for quantum transport analysis of state-of-the-art devices, Computational Electronics: Semiclassical and Quantum Device Modeling and Simulation provides a comprehensive overview of the essential techniques and methods for effectively analyzing transport in semiconductor devices. With the transistor reaching its limits and new device designs and paradigms of operation being explored, this timely resource delivers the simulation methods needed to properly model state-of-the-art devices.
Methodology for characterizing modeling and discretization uncertainties in computational simulation
Energy Technology Data Exchange (ETDEWEB)
ALVIN,KENNETH F.; OBERKAMPF,WILLIAM L.; RUTHERFORD,BRIAN M.; DIEGERT,KATHLEEN V.
2000-03-01
This research effort focuses on methodology for quantifying the effects of model uncertainty and discretization error on computational modeling and simulation. The work is directed towards developing methodologies which treat model form assumptions within an overall framework for uncertainty quantification, for the purpose of developing estimates of total prediction uncertainty. The present effort consists of work in three areas: framework development for sources of uncertainty and error in the modeling and simulation process which impact model structure; model uncertainty assessment and propagation through Bayesian inference methods; and discretization error estimation within the context of non-deterministic analysis.
The null-event method in computer simulation
International Nuclear Information System (INIS)
Lin, S.L.
1978-01-01
The simulation of collisions of ions moving under the influence of an external field through a neutral gas at non-zero temperature is discussed as an example of computer models of processes in which a probe particle undergoes a series of interactions with an ensemble of other particles, such that the frequency and outcome of the events depend on internal properties of those other particles. The introduction of null events removes the need for much complicated algebra, leads to a more efficient simulation, and reduces the likelihood of logical error. (Auth.)
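A minimal sketch of the null-event idea: candidate collisions are drawn at a constant majorant rate, which is trivial to sample, and each candidate is accepted with probability ν/ν_max; rejected candidates are "null" events that change nothing. The rate function below is a made-up example, and the probe state is held fixed so the result can be checked against the exact rate; a real ion-transport simulation would update the ion velocity at each accepted collision.

```python
import math
import random

NU_MAX = 4.0                                  # constant majorant rate

def nu(speed):
    # made-up state-dependent collision frequency, bounded above by NU_MAX
    return 2.0 + math.sin(speed) ** 2

def count_real_collisions(speed, t_end, seed=42):
    rng = random.Random(seed)
    t, real = 0.0, 0
    while True:
        t += rng.expovariate(NU_MAX)          # candidate event, easy to sample
        if t >= t_end:
            break
        if rng.random() < nu(speed) / NU_MAX:
            real += 1      # real collision: the state would be updated here
        # otherwise it is a null event: nothing happens, no extra algebra
    return real

n_real = count_real_collisions(speed=0.3, t_end=2000.0)
```

The accepted events occur at exactly the state-dependent rate ν, yet the time-sampling never needs ν's inverse or integral, which is the bookkeeping simplification the abstract credits to the method.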
ASAS: Computational code for Analysis and Simulation of Atomic Spectra
Directory of Open Access Journals (Sweden)
Jhonatha R. dos Santos
2017-01-01
The laser isotopic separation process is based on the selective photoionization principle and, because of this, it is necessary to know the absorption spectrum of the desired atom. Computational resources have become indispensable for the planning of experiments and analysis of the acquired data. The ASAS (Analysis and Simulation of Atomic Spectra) software presented here is a helpful tool for studies involving atomic spectroscopy. The input for the simulations is user-friendly and essentially needs a database containing the energy levels and spectral lines of the atoms to be studied.
Analytical simulation platform describing projections in computed tomography systems
International Nuclear Information System (INIS)
Youn, Hanbean; Kim, Ho Kyung
2013-01-01
To reduce the patient dose, several approaches, such as spectral imaging using photon counting detectors and statistical image reconstruction, are being considered. Although image-reconstruction algorithms may significantly enhance image quality in reconstructed images at low dose, true signal-to-noise properties are mainly determined by image quality in the projections. We are developing an analytical simulation platform describing projections in order to investigate how the quantum-interaction physics of each component of a CT system affects image quality in the projections. This simulator will be very useful for economical design and optimization of CT systems as well as for the development of novel image-reconstruction algorithms. In this study, we present the progress of development of the simulation platform, with an emphasis on the theoretical framework describing the generation of projection data. We have prepared the analytical simulation platform describing projections in computed tomography systems. The work remaining before the meeting includes the following: each stage in the cascaded signal-transfer model for obtaining projections will be validated by Monte Carlo simulations; we will build up energy-dependent scatter and pixel-crosstalk kernels and show their effects on image quality in projections and reconstructed images; and we will investigate the effects of projections obtained under various imaging conditions and system (or detector) operation parameters on reconstructed images. It is challenging to include the interaction physics of photon-counting detectors in the simulation platform. Detailed descriptions of the simulator will be presented, with discussions of its performance and limitations as well as Monte Carlo validations. Computational cost will also be addressed in detail. The proposed method in this study is simple and can be used conveniently in a lab environment.
A computer program for scanning transmission ion microscopy simulation
International Nuclear Information System (INIS)
Wu, R.; Shen, H.; Mi, Y.; Sun, M.D.; Yang, M.J.
2005-01-01
With the installation of the Scanning Proton Microprobe system at Fudan University, we are in the process of developing a three-dimensional reconstruction technique based on scanning transmission ion microscopy-computed tomography (STIM-CT). As the first step, a computer program for STIM simulation has been established. This program is written in Visual C++®, using the technique of OOP (Object Oriented Programming), and it is a standard multiple-document Windows® program. It can be run under all MS Windows® operating systems. The operating mode is the menu mode, using a multiple-process technique. The stopping power theory is based on the Bethe-Bloch formula. In order to simplify the calculation, an improved cylindrical coordinate model was introduced in the program in place of the usual spherical or cylindrical coordinate model. The simulated results of a sample at several rotation angles are presented
A Computational Approach for Probabilistic Analysis of Water Impact Simulations
Horta, Lucas G.; Mason, Brian H.; Lyle, Karen H.
2009-01-01
NASA's development of new concepts for the Crew Exploration Vehicle Orion presents many similar challenges to those worked in the sixties during the Apollo program. However, with improved modeling capabilities, new challenges arise. For example, the use of the commercial code LS-DYNA, although widely used and accepted in the technical community, often involves high-dimensional, time consuming, and computationally intensive simulations. The challenge is to capture what is learned from a limited number of LS-DYNA simulations to develop models that allow users to conduct interpolation of solutions at a fraction of the computational time. This paper presents a description of the LS-DYNA model, a brief summary of the response surface techniques, the analysis of variance approach used in the sensitivity studies, equations used to estimate impact parameters, results showing conditions that might cause injuries, and concluding remarks.
Development of computer simulations for landfill methane recovery
Energy Technology Data Exchange (ETDEWEB)
Massmann, J.W.; Moore, C.A.; Sykes, R.M.
1981-12-01
Two- and three-dimensional finite-difference computer programs simulating methane recovery systems in landfills have been developed. These computer programs model multicomponent combined pressure and diffusional flow in porous media. Each program and the processes it models are described in this report. Examples of the capabilities of each program are also presented. The two-dimensional program was used to simulate methane recovery systems in a cylindrically shaped landfill. The effects of various pump locations, geometries, and extraction rates were determined. The three-dimensional program was used to model the Puente Hills landfill, a field test site in southern California. The biochemical and microbiological details of methane generation in landfills are also given. Effects of environmental factors, such as moisture, oxygen, temperature, and nutrients on methane generation are discussed and an analytical representation of the gas generation rate is developed.
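The flavor of such finite-difference gas-transport programs can be conveyed by a one-dimensional explicit sketch: gas is generated uniformly in the refuse, migrates by a diffusion-like pressure law, and is removed at an extraction well held at ambient pressure. The geometry, coefficients, and boundary treatment below are illustrative placeholders, far simpler than the multicomponent combined pressure and diffusional flow the report's programs model.

```python
# 1-D explicit finite-difference sketch of landfill gas transport with
# uniform generation q and one extraction well. Illustrative values only.

def step(p, D, q, dx, dt, well_index):
    new = p[:]
    for i in range(1, len(p) - 1):
        curvature = (p[i - 1] - 2.0 * p[i] + p[i + 1]) / dx ** 2
        new[i] = p[i] + dt * (D * curvature + q)   # transport + generation
    new[well_index] = 0.0        # extraction well held at ambient pressure
    new[0], new[-1] = new[1], new[-2]              # zero-flux landfill edges
    return new

p = [0.0] * 51                   # gauge-pressure profile across the landfill
for _ in range(20000):           # march to (near) steady state
    p = step(p, D=1.0, q=0.01, dx=0.1, dt=0.004, well_index=25)
```

At steady state the profile is parabolic on each side of the well, and moving the well or changing its set pressure reshapes the recovery zone, which is the kind of pump-placement question the two-dimensional program was used to answer. Note the explicit scheme demands dt <= dx²/(2D) for stability.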
Adding computationally efficient realism to Monte Carlo turbulence simulation
Campbell, C. W.
1985-01-01
Frequently in aerospace vehicle flight simulation, random turbulence is generated using the assumption that the craft is small compared to the length scales of turbulence. The turbulence is presumed to vary only along the flight path of the vehicle but not across the vehicle span. The addition of the realism of three-dimensionality is a worthy goal, but any such attempt will not gain acceptance in the simulator community unless it is computationally efficient. A concept for adding three-dimensional realism with a minimum of computational complexity is presented. The concept involves the use of close rational approximations to irrational spectra and cross-spectra so that systems of stable, explicit difference equations can be used to generate the turbulence.
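The difference-equation approach mentioned above, in its simplest one-dimensional form, is a "shaping filter": a stable explicit recursion driven by white noise whose output has a prescribed spectrum. Below is the first-order Gauss-Markov filter commonly used for a single turbulence component; the scale length, airspeed, and intensity are illustrative values, not the paper's, and the paper's contribution (rational approximations to cross-spectra for spanwise variation) would add coupled filters of the same form.

```python
import math
import random

# First-order Gauss-Markov shaping filter: x[n+1] = a*x[n] + b*w[n],
# a cheap, stable difference equation producing colored "gust" noise.

def turbulence_series(n, sigma=1.0, scale_length=200.0, airspeed=50.0,
                      dt=0.05, seed=1):
    rng = random.Random(seed)
    a = math.exp(-airspeed * dt / scale_length)   # filter pole (|a| < 1)
    b = sigma * math.sqrt(1.0 - a * a)            # keeps variance at sigma^2
    x, out = 0.0, []
    for _ in range(n):
        x = a * x + b * rng.gauss(0.0, 1.0)       # one explicit update
        out.append(x)
    return out

gust = turbulence_series(200_000)
```

Each output sample costs one multiply-add plus one Gaussian draw, which is the kind of per-frame cost a real-time flight simulator can afford.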
Designing intelligent computer-based simulations: a pragmatic approach
Directory of Open Access Journals (Sweden)
Bernard M. Garrett
2001-12-01
There has been great interest in the potential use of multimedia computer-based learning (CBL) packages within higher education. The effectiveness of such systems, however, remains controversial. There are suggestions that such multimedia applications may hold no advantage over traditional formats (Barron and Atkins, 1994; Ellis, 1994; Laurillard, 1995; Simms, 1997; Leibowitz, 1999). One area where multimedia CBL may still prove its value is in the simulation of activities where experiential learning is expensive, undesirable or even dangerous.
Application of computer simulation in the stereology of materials
Czech Academy of Sciences Publication Activity Database
Saxl, Ivan; Ponížil, P.; Löflerová, M.
2009-01-01
Roč. 4, č. 2 (2009), s. 231-249 ISSN 1741-8410 R&D Projects: GA ČR GA201/06/0302 Grant - others:GA ČR(CZ) GA106/05/0550 Institutional research plan: CEZ:AV0Z10190503 Keywords : 3D computer simulation * fibre anisotropy * fracture surface * grain size estimation * random tessellation * rough surface analysis * fibre processes Subject RIV: BA - General Mathematics
Few-Body Problem: Theory and Computer Simulations
Flynn, Chris
A conference held in honour of the 60th birthday of Professor Mauri Valtonen in Turku, Finland, 4th-9th July 2005. The conference's major themes were the few-body problem in celestial mechanics and its numerical study; the theory of few-body escape; dynamics of multiple stars; computer simulations versus observations; planetary systems and few-body dynamics, and chaos in the few-body problem.
Simulating soil melting with CFD [computational fluid dynamics]
International Nuclear Information System (INIS)
Hawkes, G.L.
1997-01-01
Computational fluid dynamics (CFD) is being used to validate the use of thermal plasma arc vitrification for treatment of contaminated soil. Soil melting is modelled by a CFD calculation code which links electrical fields, heat transport, and natural convection. The developers believe it is the first successful CFD analysis to incorporate a simulated PID (proportional-integral-derivative) controller, which plays a vital role by following the specified electrical power curve. (Author)
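The PID controller itself is a standard control element; as a minimal sketch of how such a controller follows a specified power curve (the gains and the first-order plant below are hypothetical, not taken from the soil-melting code):

```python
def pid_step(setpoint, measured, state, kp=2.0, ki=0.5, kd=0.1, dt=1.0):
    """One update of a textbook PID controller.
    state = (integral, previous_error); returns (output, new_state)."""
    integral, prev_err = state
    err = setpoint - measured
    integral += err * dt
    deriv = (err - prev_err) / dt
    out = kp * err + ki * integral + kd * deriv
    return out, (integral, err)

# drive a hypothetical first-order plant toward a 100 kW power setpoint
power, state = 0.0, (0.0, 0.0)
for _ in range(200):
    u, state = pid_step(100.0, power, state)
    power += 0.1 * (u - power)   # plant lag: power relaxes toward the drive
```

In the CFD code the "plant" is the coupled electrical/thermal model rather than this one-line lag, but the controller update is the same three-term form.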
Simulation of Profiles Data For Computed Tomography Using Object Images
International Nuclear Information System (INIS)
Srisatit, Somyot
2007-08-01
Full text: A scanning system is normally required to obtain the profile data for computed tomographic images, and good profile data yield good contrast and resolution. Such a scanning system, however, requires efficient and therefore expensive radiation equipment. For demonstration purposes, simulated profile data generated from object images can be used instead, producing CT image quality comparable to that obtained from real measurements.
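As a toy illustration of simulating profile data from an object image (a hypothetical 4x4 phantom and only two view angles; a real scanner sweeps many angles to build a full sinogram):

```python
def projections(image):
    """Parallel-beam profiles of a 2-D phantom at 0 and 90 degrees:
    each detector bin records the ray-sum through the object."""
    rows = [sum(r) for r in image]          # 0-degree view: sum along rows
    cols = [sum(c) for c in zip(*image)]    # 90-degree view: sum along columns
    return rows, cols

phantom = [[0, 0, 0, 0],
           [0, 2, 1, 0],
           [0, 1, 2, 0],
           [0, 0, 0, 0]]
p0, p90 = projections(phantom)
```

Both views necessarily sum to the same total attenuation, a useful sanity check on any simulated profile set.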
Computer simulation of RBS spectra from samples with surface roughness
Czech Academy of Sciences Publication Activity Database
Malinský, Petr; Hnatowicz, Vladimír; Macková, Anna
2016-01-01
Roč. 371, MAR (2016), s. 101-105 ISSN 0168-583X. [22nd International conference on Ion Beam Analysis (IBA). Opatija, 14.06.2015-19.06.2015] R&D Projects: GA MŠk(CZ) LM2011019; GA ČR GA15-01602S Institutional support: RVO:61389005 Keywords : computer simulation * Rutherford backscattering * surface roughness Subject RIV: BG - Nuclear, Atomic and Molecular Physics, Colliders Impact factor: 1.109, year: 2016
A Computational Cluster for Advanced Plasma Physics Simulations
2010-02-08
were made amongst several cluster manufacturers, including Cray, IBM, Dell, Silicon Mechanics, Rackable Systems, and SiCortex before deciding on the...simulated. The algorithm implements the discontinuous Galerkin method to achieve high-order accuracy and will use body-fitted computational meshes to...APS Poster's work and ICC 2010 made use of the ICE cluster: 2009 APS: ”Plasma Solution Quality in Distorted, Body-Fitted Meshes in SEL/HiFi”, W
IMPROVING TACONITE PROCESSING PLANT EFFICIENCY BY COMPUTER SIMULATION, Final Report
Energy Technology Data Exchange (ETDEWEB)
William M. Bond; Salih Ersayin
2007-03-30
This project involved industrial scale testing of a mineral processing simulator to improve the efficiency of a taconite processing plant, namely the Minorca mine. The Concentrator Modeling Center at the Coleraine Minerals Research Laboratory, University of Minnesota Duluth, enhanced the capabilities of available software, Usim Pac, by developing mathematical models needed for accurate simulation of taconite plants. This project provided funding for this technology to prove itself in the industrial environment. As the first step, data representing existing plant conditions were collected by sampling and sample analysis. Data were then balanced and provided a basis for assessing the efficiency of individual devices and the plant, and also for performing simulations aimed at improving plant efficiency. Performance evaluation served as a guide in developing alternative process strategies for more efficient production. A large number of computer simulations were then performed to quantify the benefits and effects of implementing these alternative schemes. Modification of makeup ball size was selected as the most feasible option for the target performance improvement. This was combined with replacement of existing hydrocyclones with more efficient ones. After plant implementation of these modifications, plant sampling surveys were carried out to validate findings of the simulation-based study. Plant data showed very good agreement with the simulated data, confirming results of simulation. After the implementation of modifications in the plant, several upstream bottlenecks became visible. Despite these bottlenecks limiting full capacity, concentrator energy improvement of 7% was obtained. Further improvements in energy efficiency are expected in the near future. The success of this project demonstrated the feasibility of a simulation-based approach. Currently, the Center provides simulation-based service to all the iron ore mining companies operating in northern
High performance computer code for molecular dynamics simulations
International Nuclear Information System (INIS)
Levay, I.; Toekesi, K.
2007-01-01
Complete text of publication follows. Molecular Dynamics (MD) simulation is a widely used technique for modeling complicated physical phenomena. Since 2005 we have been developing an MD simulation code for PC computers. The computer code is written in the C++ object-oriented programming language. The aim of our work is twofold: a) to develop a fast computer code for the study of the random walk of guest atoms in a Be crystal, and b) three-dimensional (3D) visualization of the particles' motion. We mimic both the motion of the guest atoms in the crystal (diffusion-type motion) and the motion of atoms in the crystal lattice (crystal deformation). Nowadays it is common to use graphics devices for intensive computational problems. There are several ways to tap this processing performance, and programming these devices has never been easier. The CUDA (Compute Unified Device Architecture) platform, introduced by the NVIDIA Corporation in 2007, is very useful for any processor-hungry application. A unified-architecture GPU includes 96-128 or more stream processors, giving a raw calculation performance of 576 GFLOPS, roughly ten times faster than the fastest dual-core CPU [Fig. 1]. Our improved MD simulation software uses this new technology, which speeds it up considerably: the code runs 10 times faster in the critical calculation segment. Although the GPU is a very powerful tool, it has a highly parallel structure, which means we have to create an algorithm that works on many processors without deadlock. Our code currently uses 256 threads together with shared and constant on-chip memory instead of global memory, which is about 100 times slower. The entire algorithm can be implemented on the GPU, so we do not need to upload and download the data in every iteration. For maximal throughput, every thread runs the same instructions.
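The diffusion-type motion of guest atoms mentioned above can be sketched (in plain Python rather than the authors' C++/CUDA, and with hypothetical parameters) as a nearest-neighbour random walk on a cubic lattice:

```python
import random

def guest_walk(steps=1000, seed=42):
    """Nearest-neighbour random walk of one guest atom on a simple cubic
    lattice, a minimal stand-in for diffusion-type motion in a crystal."""
    rng = random.Random(seed)
    moves = [(1, 0, 0), (-1, 0, 0), (0, 1, 0),
             (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    x = y = z = 0
    path = [(0, 0, 0)]
    for _ in range(steps):
        dx, dy, dz = rng.choice(moves)
        x, y, z = x + dx, y + dy, z + dz
        path.append((x, y, z))
    return path

path = guest_walk()
```

On a GPU, many such walkers would run as independent threads; the serial version here shows only the per-walker logic.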
Neurosurgical simulation by interactive computer graphics on iPad.
Maruyama, Keisuke; Kin, Taichi; Saito, Toki; Suematsu, Shinya; Gomyo, Miho; Noguchi, Akio; Nagane, Motoo; Shiokawa, Yoshiaki
2014-11-01
Presurgical simulation before complicated neurosurgery is a state-of-the-art technique, and its usefulness has recently become well known. However, simulation requires complex image processing, which hinders its widespread application. We explored handling the results of interactive computer graphics on the iPad tablet, which can easily be controlled anywhere. Data from preneurosurgical simulations of 12 patients (4 men, 8 women) who underwent complex brain surgery were loaded onto an iPad. First, DICOM data were loaded using Amira visualization software to create interactive computer graphics, and ParaView, another free visualization software package, was used to convert the results of the simulation to be loaded using the free iPad software KiwiViewer. The interactive computer graphics created prior to neurosurgery were successfully displayed and smoothly controlled on the iPad in all patients. The number of elements ranged from 3 to 13 (mean 7). The mean original data size was 233 MB, which was reduced to 10.4 MB (4.4% of the original size) after image processing with ParaView, and increased to 46.6 MB (19.9%) after decompression in KiwiViewer. Magnification, translation, rotation, and selection among 10 levels of translucency for each element were performed smoothly and easily using one or two fingers. The skill required to control the iPad software smoothly was acquired after an average of 1.8 trials by 12 medical students and 6 neurosurgical residents. Using an iPad to handle the results of preneurosurgical simulation was extremely useful because it could easily be handled anywhere.
Improving computational efficiency of Monte Carlo simulations with variance reduction
International Nuclear Information System (INIS)
Turner, A.; Davis, A.
2013-01-01
CCFE perform Monte-Carlo transport simulations on large and complex tokamak models such as ITER. Such simulations are challenging since streaming and deep penetration effects are equally important. In order to make such simulations tractable, both variance reduction (VR) techniques and parallel computing are used. It has been found that the application of VR techniques in such models significantly reduces the efficiency of parallel computation due to 'long histories'. VR in MCNP can be accomplished using energy-dependent weight windows. The weight window represents an 'average behaviour' of particles, and large deviations in the arriving weight of a particle give rise to extreme amounts of splitting being performed and a long history. When running on parallel clusters, a long history can have a detrimental effect on the parallel efficiency - if one process is computing the long history, the other CPUs complete their batch of histories and wait idle. Furthermore some long histories have been found to be effectively intractable. To combat this effect, CCFE has developed an adaptation of MCNP which dynamically adjusts the WW where a large weight deviation is encountered. The method effectively 'de-optimises' the WW, reducing the VR performance but this is offset by a significant increase in parallel efficiency. Testing with a simple geometry has shown the method does not bias the result. This 'long history method' has enabled CCFE to significantly improve the performance of MCNP calculations for ITER on parallel clusters, and will be beneficial for any geometry combining streaming and deep penetration effects. (authors)
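A minimal sketch of the weight-window mechanics being described (the window bounds and the splitting clamp below are illustrative assumptions; MCNP's actual windows are energy- and space-dependent):

```python
import random

def weight_window(weight, rng, w_low=0.5, w_high=2.0, max_split=8):
    """Apply a weight window to one particle weight; returns the list of
    continuing particle weights.  Splitting is clamped at max_split,
    mimicking a 'de-optimised' window that cuts off the extreme splitting
    responsible for intractably long histories."""
    if weight > w_high:                          # heavy particle: split
        n = min(int(weight / w_high) + 1, max_split)
        return [weight / n] * n
    if weight < w_low:                           # light particle: Russian roulette
        survival = 0.5 * (w_low + w_high)
        return [survival] if rng.random() < weight / survival else []
    return [weight]                              # inside the window: unchanged

heavy = weight_window(10.0, random.Random(0))
light = weight_window(0.1, random.Random(0))
```

Splitting conserves total weight exactly, and roulette conserves it in expectation, which is why a clamped window trades variance-reduction efficiency for parallel efficiency without biasing the answer.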
Some computer simulations based on the linear relative risk model
International Nuclear Information System (INIS)
Gilbert, E.S.
1991-10-01
This report presents the results of computer simulations designed to evaluate and compare the performance of the likelihood ratio statistic and the score statistic for making inferences about the linear relative risk model. The work was motivated by data on workers exposed to low doses of radiation, and the report includes illustration of several procedures for obtaining confidence limits for the excess relative risk coefficient based on data from three studies of nuclear workers. The computer simulations indicate that with small sample sizes and highly skewed dose distributions, asymptotic approximations to the score statistic or to the likelihood ratio statistic may not be adequate. For testing the null hypothesis that the excess relative risk is equal to zero, the asymptotic approximation to the likelihood ratio statistic was adequate, but use of the asymptotic approximation to the score statistic rejected the null hypothesis too often. Frequently the likelihood was maximized at the lower constraint, and when this occurred, the asymptotic approximations for the likelihood ratio and score statistics did not perform well in obtaining upper confidence limits. The score and likelihood ratio statistics were found to perform comparably in terms of power and width of the confidence limits. It is recommended that with modest sample sizes, confidence limits be obtained using computer simulations based on the score statistic. Although nuclear worker studies are emphasized in this report, its results are relevant for any study investigating linear dose-response functions with highly skewed exposure distributions. 22 refs., 14 tabs
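A sketch of the kind of simulation involved (a Poisson formulation with the baseline rate taken as known, which simplifies the report's setting; the sample size, dose distribution and rates below are all hypothetical):

```python
import math
import random

def poisson(rng, mean):
    """Knuth's method; adequate for small means."""
    L, k, p = math.exp(-mean), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def score_statistic(doses, cases, py, lam0):
    """Score test of H0: beta = 0 in the linear relative risk model
    mu_i = lam0 * py_i * (1 + beta * d_i), with lam0 treated as known.
    U is the score at beta = 0, info the expected Fisher information."""
    U = sum(d * (x - lam0 * t) for d, x, t in zip(doses, cases, py))
    info = sum(lam0 * t * d * d for d, t in zip(doses, py))
    return U / math.sqrt(info)

rng = random.Random(7)
doses = [rng.expovariate(1.0) for _ in range(50)]   # highly skewed dose distribution
py, lam0 = [100.0] * 50, 0.02                       # person-years, baseline rate
cases = [poisson(rng, lam0 * t) for t in py]        # outcomes generated under H0
z = score_statistic(doses, cases, py, lam0)
```

Repeating this draw many times and tabulating how often |z| exceeds the nominal normal cutoff is exactly the kind of check the report uses to show that the asymptotic approximation can reject too often with skewed doses.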
Computer modeling of road bridge for simulation moving load
Directory of Open Access Journals (Sweden)
Miličić Ilija M.
2016-01-01
Full Text Available This paper presents the computational modelling of a single-span road truss bridge with the roadway on the upper chord. The calculation models were treated as planar and spatial girders composed of 1D finite elements, using the CAA applications Tower and Bridge Designer 2016 (2nd Edition). Computer simulations were carried out to compare the effects of moving load according to the recommendations of two standards, SRPS and AASHTO. The bridge structure modelled in Bridge Designer 2016 (2nd Edition) was therefore modelled identically in the Tower environment. An important consideration for selecting a computer application is that Bridge Designer 2016 (2nd Edition) cannot treat the moving-load model prescribed by the national standard V600.
COMPUTER SIMULATION OF THE MECHANICAL MOVEMENT OF A BODY BY MEANS OF MATHCAD
Directory of Open Access Journals (Sweden)
Leonid Flehantov
2017-03-01
Full Text Available This paper considers a technique for using the computer mathematics system MathCAD to implement a mathematical model of the mechanical motion of a body thrown at an angle to the horizon, and its use for educational computer simulation experiments in teaching the fundamentals of mathematical modeling. The advantages of MathCAD as an environment for implementing mathematical models in the second stage of higher education are noted. The paper describes the creation of a computer simulation model that allows comprehensive analysis of the mechanical motion of the body by varying the input parameters of the model: the acceleration of gravity, the initial and final position of the body, the initial velocity and angle, and the geometric dimensions of the body and the target. The technique is aimed at effective assimilation of basic knowledge and skills in the fundamentals of mathematical modeling; it helps students better master the basic theoretical principles of mathematical modeling and related disciplines, promotes the development of logical thinking, improves motivation and cognitive interest, and builds research skills, thus creating conditions for the effective formation of the professional competence of future specialists.
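The underlying model is elementary projectile motion; a sketch in Python of the stepping a MathCAD worksheet of this kind would perform (drag neglected, and the launch parameters hypothetical):

```python
import math

def trajectory(v0=20.0, angle_deg=45.0, g=9.81, dt=0.001):
    """Semi-implicit Euler integration of x'' = 0, y'' = -g from the
    origin until the body returns to the ground.
    Returns (horizontal range, flight time)."""
    vx = v0 * math.cos(math.radians(angle_deg))
    vy = v0 * math.sin(math.radians(angle_deg))
    x = y = t = 0.0
    while True:
        vy -= g * dt          # update velocity first (semi-implicit)
        x += vx * dt
        y += vy * dt
        t += dt
        if y <= 0.0:
            return x, t

x_range, t_flight = trajectory()
```

The numerical range can be checked against the closed form v0^2*sin(2*theta)/g, which is the kind of model-versus-theory comparison the teaching technique is built around.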
An FPGA computing demo core for space charge simulation
Energy Technology Data Exchange (ETDEWEB)
Wu, Jinyuan; Huang, Yifei; /Fermilab
2009-01-01
In accelerator physics, space charge simulation requires a large amount of computing power. In a particle system, each calculation requires time- and resource-consuming operations such as multiplications, divisions, and square roots. Because of the flexibility of field programmable gate arrays (FPGAs), we implemented this task with efficient use of the available computing resources and completely eliminated non-calculating operations that are indispensable in regular micro-processors (e.g. instruction fetch, instruction decoding, etc.). We designed and tested a 16-bit demo core for computing Coulomb's force in an Altera Cyclone II FPGA device. To save resources, the inverse square-root cube operation in our design is computed using a memory look-up table addressed with the nine to ten most significant non-zero bits. At a 200 MHz internal clock, our demo core reaches a throughput of 200 M pairs/s/core, faster than a typical 2 GHz micro-processor by about a factor of 10. Temperature and power consumption of FPGAs were also lower than those of micro-processors. Fast and convenient, FPGAs can serve as alternatives to time-consuming micro-processors for space charge simulation.
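The table-lookup trick can be sketched in software (a behavioural model only; the real core works in 16-bit fixed point, and the bucket midpoints and power-of-two rescaling below are our assumptions, not the paper's exact scheme):

```python
import math

TABLE_BITS = 9   # nine leading mantissa bits address the table
TABLE = [(0.5 * (1.0 + (i + 0.5) / 2 ** TABLE_BITS)) ** -1.5
         for i in range(2 ** TABLE_BITS)]       # midpoint of each mantissa bucket

def inv_sqrt_cube(r2):
    """Approximate r2**-1.5 (the 1/r^3 kernel of Coulomb's force) with a
    table addressed by the most significant mantissa bits, plus a
    power-of-two rescale that is a cheap shift/multiply in hardware."""
    m, e = math.frexp(r2)                        # r2 = m * 2**e, m in [0.5, 1)
    idx = int((m - 0.5) * 2 ** (TABLE_BITS + 1)) # leading bits of the mantissa
    return TABLE[idx] * 2.0 ** (-1.5 * e)
```

With nine address bits the relative error stays well under a percent, which is consistent with a 16-bit datapath; the force on a pair is then the separation vector scaled by this value.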
Mathematical and computational modeling and simulation fundamentals and case studies
Moeller, Dietmar P F
2004-01-01
Mathematical and Computational Modeling and Simulation - a highly multi-disciplinary field with ubiquitous applications in science and engineering - is one of the key enabling technologies of the 21st century. This book introduces the use of Mathematical and Computational Modeling and Simulation to develop an understanding of the solution characteristics of a broad class of real-world problems. The relevant basic and advanced methodologies are explained in detail, with special emphasis on ill-defined problems. Some 15 simulation systems are presented at the language and the logical level. Moreover, the reader can accumulate experience by studying a wide variety of case studies. The latter are briefly described within the book but their full versions as well as some simulation software demos are available on the Web. The book can be used for university courses at different levels as well as for self-study. Advanced sections are marked and can be skipped in a first reading or in undergraduate courses...
Real time simulation of large systems on mini-computer
International Nuclear Information System (INIS)
Nakhle, Michel; Roux, Pierre.
1979-01-01
Most simulation languages will only accept an explicit formulation of differential equations, and logical variables hold no special status therein. The integration step of the usual methods is limited by the smallest time constant of the model submitted. The NEPTUNIX 2 simulation software has a language that accepts implicit equations and an integration method whose variable step is not limited by the time constants of the model. This, together with strong optimization of the generated code for execution time and memory, makes NEPTUNIX 2 a basic tool for simulation on mini-computers. Since the logical variables are specific entities under centralized control, correct processing of discontinuities and synchronization with a real process are feasible. NEPTUNIX 2 is the industrial version of NEPTUNIX 1 [fr
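Taking steps larger than the smallest time constant is the hallmark of implicit integration; as a sketch (ordinary backward Euler with a Newton solve, standing in for NEPTUNIX 2's actual method, applied to a hypothetical stiff test problem):

```python
def backward_euler(f, dfdy, y0, t0, t1, h):
    """Backward Euler: solve y[n+1] = y[n] + h*f(t[n+1], y[n+1]) by Newton
    iteration at each step.  Unlike explicit schemes, the step size is
    not limited by the fastest time constant of the model."""
    t, y = t0, y0
    while t < t1 - 1e-12:
        t_next = t + h
        z = y                                   # Newton iterate, seeded at y[n]
        for _ in range(20):
            g = z - y - h * f(t_next, z)        # residual of the implicit equation
            dg = 1.0 - h * dfdy(t_next, z)
            step = g / dg
            z -= step
            if abs(step) < 1e-12:
                break
        t, y = t_next, z
    return y

# stiff test problem: y' = -1000*(y - 1), time constant of 1 ms
f = lambda t, y: -1000.0 * (y - 1.0)
dfdy = lambda t, y: -1000.0
y_end = backward_euler(f, dfdy, y0=0.0, t0=0.0, t1=1.0, h=0.1)  # h is 100x the time constant
```

An explicit method would need h below about 2 ms here to remain stable; the implicit step of 0.1 s still lands on the correct steady state, which is the property the abstract claims for NEPTUNIX 2's variable-step method.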
Simulation of Tailrace Hydrodynamics Using Computational Fluid Dynamics Models; FINAL
International Nuclear Information System (INIS)
Cook, Chris B; Richmond, Marshall C
2001-01-01
This report investigates the feasibility of using computational fluid dynamics (CFD) tools to investigate hydrodynamic flow fields surrounding the tailrace zone below large hydraulic structures. Previous and ongoing studies using CFD tools to simulate gradually varied flow with multiple constituents and forebay/intake hydrodynamics have shown that CFD tools can provide valuable information for hydraulic and biological evaluation of fish passage near hydraulic structures. These studies, however, are incapable of simulating the rapidly varying flow fields that involve breakup of the free surface, such as those through and below high-flow outfalls and spillways. Although the use of CFD tools for these types of flow is still an active area of research, initial applications discussed in this report show that these tools are capable of simulating the primary features of these highly transient flow fields
Simulation of Tailrace Hydrodynamics Using Computational Fluid Dynamics Models
Energy Technology Data Exchange (ETDEWEB)
Cook, Christopher B.; Richmond, Marshall C.
2001-05-01
This report investigates the feasibility of using computational fluid dynamics (CFD) tools to investigate hydrodynamic flow fields surrounding the tailrace zone below large hydraulic structures. Previous and ongoing studies using CFD tools to simulate gradually varied flow with multiple constituents and forebay/intake hydrodynamics have shown that CFD tools can provide valuable information for hydraulic and biological evaluation of fish passage near hydraulic structures. These studies, however, are incapable of simulating the rapidly varying flow fields that involve breakup of the free surface, such as those through and below high-flow outfalls and spillways. Although the use of CFD tools for these types of flow is still an active area of research, initial applications discussed in this report show that these tools are capable of simulating the primary features of these highly transient flow fields.
Computational physics simulation of classical and quantum systems
Scherer, Philipp O J
2017-01-01
This textbook presents basic numerical methods and applies them to a large variety of physical models in multiple computer experiments. Classical algorithms and more recent methods are explained. Partial differential equations are treated generally comparing important methods, and equations of motion are solved by a large number of simple as well as more sophisticated methods. Several modern algorithms for quantum wavepacket motion are compared. The first part of the book discusses the basic numerical methods, while the second part simulates classical and quantum systems. Simple but non-trivial examples from a broad range of physical topics offer readers insights into the numerical treatment but also the simulated problems. Rotational motion is studied in detail, as are simple quantum systems. A two-level system in an external field demonstrates elementary principles from quantum optics and simulation of a quantum bit. Principles of molecular dynamics are shown. Modern boundary element methods are presented ...
Lightweight computational steering of very large scale molecular dynamics simulations
International Nuclear Information System (INIS)
Beazley, D.M.
1996-01-01
We present a computational steering approach for controlling, analyzing, and visualizing very large scale molecular dynamics simulations involving tens to hundreds of millions of atoms. Our approach relies on extensible scripting languages and an easy to use tool for building extensions and modules. The system is extremely easy to modify, works with existing C code, is memory efficient, and can be used from inexpensive workstations and networks. We demonstrate how we have used this system to manipulate data from production MD simulations involving as many as 104 million atoms running on the CM-5 and Cray T3D. We also show how this approach can be used to build systems that integrate common scripting languages (including Tcl/Tk, Perl, and Python), simulation code, user extensions, and commercial data analysis packages
Cane Toad or Computer Mouse? Real and Computer-Simulated Laboratory Exercises in Physiology Classes
West, Jan; Veenstra, Anneke
2012-01-01
Traditional practical classes in many countries are being rationalised to reduce costs. The challenge for university educators is to provide students with the opportunity to reinforce theoretical concepts by running something other than a traditional practical program. One alternative is to replace wet labs with comparable computer simulations.…
Computational simulation of materials notes for lectures given at UCSB, May 1996--June 1996
Energy Technology Data Exchange (ETDEWEB)
LeSar, R.
1997-01-01
This report presents information from a lecture given on the computational simulation of materials. The purpose is to introduce modern computerized simulation methods for materials properties and response.
Accelerating Climate and Weather Simulations through Hybrid Computing
Zhou, Shujia; Cruz, Carlos; Duffy, Daniel; Tucker, Robert; Purcell, Mark
2011-01-01
Unconventional multi- and many-core processors (e.g. IBM (R) Cell B.E.(TM) and NVIDIA (R) GPU) have emerged as effective accelerators in trial climate and weather simulations. Yet these climate and weather models typically run on parallel computers with conventional processors (e.g. Intel, AMD, and IBM) using Message Passing Interface. To address challenges involved in efficiently and easily connecting accelerators to parallel computers, we investigated using IBM's Dynamic Application Virtualization (TM) (IBM DAV) software in a prototype hybrid computing system with representative climate and weather model components. The hybrid system comprises two Intel blades and two IBM QS22 Cell B.E. blades, connected with both InfiniBand(R) (IB) and 1-Gigabit Ethernet. The system significantly accelerates a solar radiation model component by offloading compute-intensive calculations to the Cell blades. Systematic tests show that IBM DAV can seamlessly offload compute-intensive calculations from Intel blades to Cell B.E. blades in a scalable, load-balanced manner. However, noticeable communication overhead was observed, mainly due to IP over the IB protocol. Full utilization of IB Sockets Direct Protocol and the lower latency production version of IBM DAV will reduce this overhead.
Three Dimensional Computer Graphics Federates for the 2012 Smackdown Simulation
Fordyce, Crystal; Govindaiah, Swetha; Muratet, Sean; O'Neil, Daniel A.; Schricker, Bradley C.
2012-01-01
The Simulation Interoperability Standards Organization (SISO) Smackdown is a two-year old annual event held at the 2012 Spring Simulation Interoperability Workshop (SIW). A primary objective of the Smackdown event is to provide college students with hands-on experience in developing distributed simulations using High Level Architecture (HLA). Participating for the second time, the University of Alabama in Huntsville (UAHuntsville) deployed four federates, two federates simulated a communications server and a lunar communications satellite with a radio. The other two federates generated 3D computer graphics displays for the communication satellite constellation and for the surface based lunar resupply mission. Using the Light-Weight Java Graphics Library, the satellite display federate presented a lunar-texture mapped sphere of the moon and four Telemetry Data Relay Satellites (TDRS), which received object attributes from the lunar communications satellite federate to drive their motion. The surface mission display federate was an enhanced version of the federate developed by ForwardSim, Inc. for the 2011 Smackdown simulation. Enhancements included a dead-reckoning algorithm and a visual indication of which communication satellite was in line of sight of Hadley Rille. This paper concentrates on these two federates by describing the functions, algorithms, HLA object attributes received from other federates, development experiences and recommendations for future, participating Smackdown teams.
Lu, Jing; Yu, Jie; Shi, Heshui
2017-01-01
Adding functional features to morphological features offers a new method for non-invasive assessment of myocardial perfusion. This study aimed to explore technical routes for assessing the pressure gradient, wall shear stress distribution and blood flow velocity distribution in the left coronary artery, combining a three-dimensional coronary model based on high-resolution dual-source computed tomography (CT) with computational fluid dynamics (CFD) simulation. Three cases with no obvious stenosis, mild stenosis and severe stenosis of the left anterior descending (LAD) artery were enrolled. Images acquired on dual-source CT were input into the software packages Mimics, ICEM CFD and FLUENT to simulate the pressure gradient, wall shear stress distribution and blood flow velocity distribution. The coronary enhancement ratio was measured for comparison with the pressure gradient. Results conformed to theoretical values and showed differences between normal and abnormal samples. The study preliminarily verified the essential parameters and basic techniques of blood-flow numerical simulation and proved the approach feasible.
Scientific and computational challenges of the fusion simulation project (FSP)
Tang, W. M.
2008-07-01
This paper highlights the scientific and computational challenges facing the Fusion Simulation Project (FSP). The primary objective is to develop advanced software designed to use leadership-class computers for carrying out multiscale physics simulations to provide information vital to delivering a realistic integrated fusion simulation model with unprecedented physics fidelity. This multiphysics capability will be unprecedented in that in the current FES applications domain, the largest-scale codes are used to carry out first-principles simulations of mostly individual phenomena in realistic 3D geometry while the integrated models are much smaller-scale, lower-dimensionality codes with significant empirical elements used for modeling and designing experiments. The FSP is expected to be the most up-to-date embodiment of the theoretical and experimental understanding of magnetically confined thermonuclear plasmas and to provide a living framework for the simulation of such plasmas as the associated physics understanding continues to advance over the next several decades. Substantive progress on answering the outstanding scientific questions in the field will drive the FSP toward its ultimate goal of developing a reliable ability to predict the behavior of plasma discharges in toroidal magnetic fusion devices on all relevant time and space scales. From a computational perspective, the fusion energy science application goal to produce high-fidelity, whole-device modeling capabilities will demand computing resources in the petascale range and beyond, together with the associated multicore algorithmic formulation needed to address burning plasma issues relevant to ITER — a multibillion dollar collaborative device involving seven international partners representing over half the world's population. Even more powerful exascale platforms will be needed to meet the future challenges of designing a demonstration fusion reactor (DEMO). Analogous to other major applied
Scientific and computational challenges of the fusion simulation project (FSP)
International Nuclear Information System (INIS)
Tang, W M
2008-01-01
This paper highlights the scientific and computational challenges facing the Fusion Simulation Project (FSP). The primary objective is to develop advanced software designed to use leadership-class computers for carrying out multiscale physics simulations to provide information vital to delivering a realistic integrated fusion simulation model with unprecedented physics fidelity. This multiphysics capability will be unprecedented in that in the current FES applications domain, the largest-scale codes are used to carry out first-principles simulations of mostly individual phenomena in realistic 3D geometry while the integrated models are much smaller-scale, lower-dimensionality codes with significant empirical elements used for modeling and designing experiments. The FSP is expected to be the most up-to-date embodiment of the theoretical and experimental understanding of magnetically confined thermonuclear plasmas and to provide a living framework for the simulation of such plasmas as the associated physics understanding continues to advance over the next several decades. Substantive progress on answering the outstanding scientific questions in the field will drive the FSP toward its ultimate goal of developing a reliable ability to predict the behavior of plasma discharges in toroidal magnetic fusion devices on all relevant time and space scales. From a computational perspective, the fusion energy science application goal to produce high-fidelity, whole-device modeling capabilities will demand computing resources in the petascale range and beyond, together with the associated multicore algorithmic formulation needed to address burning plasma issues relevant to ITER - a multibillion dollar collaborative device involving seven international partners representing over half the world's population. Even more powerful exascale platforms will be needed to meet the future challenges of designing a demonstration fusion reactor (DEMO). Analogous to other major applied physics
Quantum computation and simulation with trapped ions using dissipation
International Nuclear Information System (INIS)
Schindler, P.
2013-01-01
current quantum systems do not allow for the required level of control. Nevertheless it seems promising to adapt the techniques developed for quantum information processing to build a quantum simulator. Such a device is able to efficiently reproduce the dynamics of any quantum system - a task that is only possible for small systems on existing classical computers. However, the quantum system of interest may be coupled to a classical environment; many examples of such systems can be found in quantum biology and quantum chemistry. These systems are often embedded in a thermal environment and, analogous to classical physics, show non-reversible, or dissipative, dynamics. Thus, the quantum simulator should also be able to reproduce dissipative dynamics, which requires an extension of the usual quantum computing toolbox. In the context of quantum computing, such a coupling is usually treated as a noise process that defeats the possible gain from using such a device. Interestingly, it has been shown that an environment can be engineered that drives the system towards a state that features entanglement and can serve as a resource for quantum information processing. In this thesis, an extended toolbox that goes beyond coherent operations is introduced in our small-scale ion-trap quantum information processor. This is then used to create an entangled state through dissipative dynamics. In the next step a quantum simulation of a dissipative many-body system is performed, demonstrating the hallmark feature of a novel type of quantum phase transition. (author)
A Computer-Based Simulation of an Acid-Base Titration
Boblick, John M.
1971-01-01
Reviews the advantages of computer simulated environments for experiments, referring in particular to acid-base titrations. Includes pre-lab instructions and a sample computer printout of a student's use of an acid-base simulation. Ten references. (PR)
Numerical simulation of NQR/NMR: Applications in quantum computing.
Possa, Denimar; Gaudio, Anderson C; Freitas, Jair C C
2011-04-01
A numerical simulation program able to simulate nuclear quadrupole resonance (NQR) as well as nuclear magnetic resonance (NMR) experiments is presented, written using the Mathematica package and aimed especially at applications in quantum computing. The program makes use of the interaction picture to compute the effect of the relevant nuclear spin interactions, without any assumption about the relative size of each interaction. This makes the program flexible and versatile, useful in a wide range of experimental situations, from NQR (at zero or under a small applied magnetic field) to high-field NMR experiments. Some conditions specifically required for quantum computing applications are implemented in the program, such as the possibility of using elliptically polarized radiofrequency fields and the inclusion of first- and second-order terms in the average Hamiltonian expansion. A number of examples dealing with simple NQR and quadrupole-perturbed NMR experiments are presented, along with proposals for experiments to create quantum pseudopure states and logic gates using NQR. The program and the various application examples are freely available through the link http://www.profanderson.net/files/nmr_nqr.php. Copyright © 2011 Elsevier Inc. All rights reserved.
A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION
Energy Technology Data Exchange (ETDEWEB)
Mike Bockelie; Dave Swensen; Martin Denison; Zumao Chen; Temi Linjewile; Mike Maguire; Adel Sarofim; Connie Senior; Changguan Yang; Hong-Shig Shim
2004-04-28
This is the fourteenth Quarterly Technical Report for DOE Cooperative Agreement No: DE-FC26-00NT41047. The goal of the project is to develop and demonstrate a Virtual Engineering-based framework for simulating the performance of Advanced Power Systems. Within the last quarter, good progress has been made on all aspects of the project. Software development efforts have focused primarily on completing a prototype detachable user interface for the framework and on integrating Carnegie Mellon University's IECM model core with the computational engine. In addition to this work, progress has been made on several other development and modeling tasks for the program. These include: (1) improvements to the infrastructure code of the computational engine, (2) enhancements to the model interfacing specifications, (3) additional development to increase the robustness of all framework components, (4) enhanced coupling of the computational and visualization engine components, (5) a series of detailed simulations studying the effects of gasifier inlet conditions on the heat flux to the gasifier injector, and (6) detailed plans for implementing models for mercury capture for both warm and cold gas cleanup.
A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION
Energy Technology Data Exchange (ETDEWEB)
Mike Bockelie; Dave Swensen; Martin Denison; Connie Senior; Zumao Chen; Temi Linjewile; Adel Sarofim; Bene Risio
2003-04-25
This is the tenth Quarterly Technical Report for DOE Cooperative Agreement No: DE-FC26-00NT41047. The goal of the project is to develop and demonstrate a computational workbench for simulating the performance of Vision 21 Power Plant Systems. Within the last quarter, good progress has been made on all aspects of the project. Calculations for a full Vision 21 plant configuration have been performed for two gasifier types. An improved process model for simulating entrained flow gasifiers has been implemented into the workbench. Model development has focused on: a pre-processor module to compute global gasification parameters from standard fuel properties and intrinsic rate information; a membrane based water gas shift; and reactors to oxidize fuel cell exhaust gas. The data visualization capabilities of the workbench have been extended by implementing the VTK visualization software that supports advanced visualization methods, including inexpensive Virtual Reality techniques. The ease-of-use, functionality and plug-and-play features of the workbench were highlighted through demonstrations of the workbench at a DOE sponsored coal utilization conference. A white paper has been completed that contains recommendations on the use of component architectures, model interface protocols and software frameworks for developing a Vision 21 plant simulator.
COMPARATIVE STUDY OF TERTIARY WASTEWATER TREATMENT BY COMPUTER SIMULATION
Directory of Open Access Journals (Sweden)
Stefania Iordache
2010-01-01
Full Text Available The aim of this work is to assess conditions for implementation of a Biological Nutrient Removal (BNR) process in the Wastewater Treatment Plant (WWTP) of Moreni city (Romania). In order to meet increasingly strict environmental regulations, the wastewater treatment plant that was studied must update and modernize its treatment process. A comparative study was undertaken of the quality of effluents that could be obtained by implementation of biological nutrient removal processes such as A2/O (Anaerobic/Anoxic/Oxic) and VIP (Virginia Initiative Plant) as wastewater tertiary treatments. In order to assess the efficiency of the proposed treatment schemes based on the data monitored at the studied WWTP, computer models of biological nutrient removal configurations based on the A2/O and VIP processes were developed. Computer simulation was carried out using a well-known simulator, BioWin by EnviroSim Associates Ltd. The simulation yielded data that can be used in the design of a tertiary treatment stage at the Moreni WWTP, in order to increase its operating efficiency.
Trace contaminant control simulation computer program, version 8.1
Perry, J. L.
1994-01-01
The Trace Contaminant Control Simulation computer program is a tool for assessing the performance of various process technologies for removing trace chemical contamination from a spacecraft cabin atmosphere. Included in the simulation are chemical and physical adsorption by activated charcoal, chemical adsorption by lithium hydroxide, absorption by humidity condensate, and low- and high-temperature catalytic oxidation. Means are provided for simulating regenerable as well as nonregenerable systems. The program provides an overall mass balance of chemical contaminants in a spacecraft cabin given specified generation rates. Removal rates are based on device flow rates specified by the user and calculated removal efficiencies based on cabin concentration and removal technology experimental data. Versions 1.0 through 8.0 are documented in NASA TM-108409. TM-108409 also contains a source file listing for version 8.0. Changes to version 8.0 are documented in this technical memorandum and a source file listing for the modified version, version 8.1, is provided. Detailed descriptions for the computer program subprograms are extracted from TM-108409 and modified as necessary to reflect version 8.1. Version 8.1 supersedes version 8.0. Information on a separate user's guide is available from the author.
Simulation of computed tomography dose based on voxel phantom
Liu, Chunyu; Lv, Xiangbo; Li, Zhaojun
2017-01-01
Computed Tomography (CT) is one of the preferred and most valuable imaging tools used in diagnostic radiology, providing high-quality cross-sectional images of the body. It still delivers higher radiation doses to patients compared to other radiological procedures. The Monte Carlo method is appropriate for estimating the radiation dose during CT examinations. A simulation of the Computed Tomography Dose Index (CTDI) phantom was developed in this paper. Under conditions similar to those used in physical measurements, dose profiles were calculated and compared against the reported measured values. The results demonstrate good agreement between the calculated and measured doses. From different CT exam simulations using the voxel phantom, the highest absorbed doses were recorded for the lung, the brain, and the bone surface. A comparison between the different scan types shows that the effective dose for a chest scan is the highest, whereas the effective dose values for abdomen and pelvis scans are very close. The lowest effective dose resulted from the head scan. Although the dose in CT depends on various parameters, such as tube current, exposure time, beam energy, slice thickness, and patient size, this study demonstrates that MC simulation is a useful tool to accurately estimate the dose delivered to specific organs of patients undergoing CT exams, and can also be a valuable technique for the design and optimization of the CT x-ray source.
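The CTDI phantom measurements discussed in the abstract above are conventionally combined into a single weighted index. A minimal sketch using the standard IEC definitions (the exact procedure and values in this study are not given, so the numbers below are illustrative):

```python
def ctdi_w(ctdi_center, ctdi_periphery):
    """Weighted CTDI (mGy): 1/3 center + 2/3 periphery (IEC definition)."""
    return ctdi_center / 3.0 + 2.0 * ctdi_periphery / 3.0

def ctdi_vol(ctdi_w_value, pitch):
    """Volume CTDI: weighted CTDI corrected for helical pitch."""
    return ctdi_w_value / pitch

# Hypothetical measured values (mGy) for a single axial scan
cw = ctdi_w(10.0, 13.0)        # -> 12.0 mGy
cv = ctdi_vol(cw, pitch=1.5)   # -> 8.0 mGy
```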
Optical computing for image bandwidth compression: analysis and simulation.
Hunt, B R
1978-09-15
Image bandwidth compression is dominated by digital methods for carrying out the required computations. This paper discusses the general problem of using optics to realize the computations in bandwidth compression. A common method of digital bandwidth compression, feedback differential pulse code modulation (DPCM), is reviewed, and the obstacles to making a direct optical analogy to feedback DPCM are discussed. Instead of a direct optical analogy to DPCM, an optical system which captures the essential features of DPCM without optical feedback is introduced. The essential features of this incoherent optical system are encoding of low-frequency information and generation of difference samples which can be coded with a small number of bits. A simulation of this optical system by means of digital image processing is presented, and performance data are also included.
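Feedback DPCM, reviewed in the abstract above, predicts each sample from previously reconstructed samples and codes only the quantized prediction error, which needs far fewer bits than the raw samples. A minimal 1-D sketch with a previous-sample predictor and a uniform quantizer (an illustration of the DPCM principle, not the paper's optical system):

```python
def dpcm_encode(samples, step):
    """Encode: predict from the previous *reconstructed* sample (feedback loop)."""
    codes, recon_prev = [], 0.0
    for x in samples:
        diff = x - recon_prev          # prediction error
        q = round(diff / step)         # small integer index, cheap to code
        codes.append(q)
        recon_prev += q * step         # track what the decoder will reconstruct
    return codes

def dpcm_decode(codes, step):
    """Decode: accumulate the dequantized differences."""
    out, recon = [], 0.0
    for q in codes:
        recon += q * step
        out.append(recon)
    return out

signal = [0.0, 1.0, 2.5, 2.6, 2.0]
codes = dpcm_encode(signal, step=0.5)   # -> [0, 2, 3, 0, -1]
recon = dpcm_decode(codes, step=0.5)
# reconstruction error stays within half a quantizer step per sample
```

Because the encoder predicts from the reconstructed (not original) signal, quantization errors do not accumulate, which is the essential feedback feature the paper seeks to retain without optical feedback.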
Computer simulations of the activity of RND efflux pumps.
Vargiu, Attilio Vittorio; Ramaswamy, Venkata Krishnan; Malloci, Giuliano; Malvacio, Ivana; Atzori, Alessio; Ruggerone, Paolo
2018-01-31
The putative mechanism by which bacterial RND-type multidrug efflux pumps recognize and transport their substrates is a complex and fascinating enigma of structural biology. How a single protein can recognize a huge number of unrelated compounds and transport them through one or just a few mechanisms is an amazing feature not yet completely unveiled. The appearance of cooperativity further complicates the understanding of structure-dynamics-activity relationships in these complex machineries. Experimental techniques may have limited access to the molecular determinants and to the energetics of key processes regulating the activity of these pumps. Computer simulations are a complementary approach that can help unveil these features and inspire new experiments. Here we review recent computational studies that addressed the various molecular processes regulating the activity of RND efflux pumps. Copyright © 2018 The Authors. Published by Elsevier Masson SAS. All rights reserved.
Development of a Computer Application to Simulate Porous Structures
Directory of Open Access Journals (Sweden)
S.C. Reis
2002-09-01
Full Text Available Geometric modeling is an important tool for evaluating structural parameters and for applying stereological relationships. The acquisition, visualization and analysis of volumetric images of the structure of materials, using computational geometric modeling, facilitates the determination of structural parameters that are difficult to access experimentally, such as topological and morphological parameters. In this work, we developed a geometrical model, implemented in computer software, that simulates random pore structures. The number of nodes, the number of branches (connections between nodes) and the number of isolated parts are obtained, and the connectivity (C) is derived from them. Using a list of elements, nodes and branches generated by the software in AutoCAD® command-line format, the resulting structure can be viewed and analyzed.
Software Development Processes Applied to Computational Icing Simulation
Levinson, Laurie H.; Potapczuk, Mark G.; Mellor, Pamela A.
1999-01-01
The development of computational icing simulation methods is making the transition from research to commonplace use in design and certification efforts. As such, standards of code management, design validation, and documentation must be adjusted to accommodate the increased expectations of the user community with respect to accuracy, reliability, capability, and usability. This paper discusses these concepts with regard to current and future icing simulation code development efforts as implemented by the Icing Branch of the NASA Lewis Research Center in collaboration with the NASA Lewis Engineering Design and Analysis Division. With the application of the techniques outlined in this paper, the LEWICE ice accretion code has become a more stable and reliable software product.
Computational Fluid Dynamics (CFD) simulations of a Heisenberg Vortex Tube
Bunge, Carl; Sitaraman, Hariswaran; Leachman, Jake
2017-11-01
A 3D Computational Fluid Dynamics (CFD) simulation of a Heisenberg Vortex Tube (HVT) is performed to estimate cooling potential with cryogenic hydrogen. The main mechanism driving operation of the vortex tube is the use of fluid power for enthalpy streaming in a highly turbulent swirl in a dual-outlet tube. This enthalpy streaming creates a temperature separation between the outer and inner regions of the flow. Use of a catalyst on the peripheral wall of the centrifuge enables endothermic conversion of para-ortho hydrogen to aid primary cooling. A κ- ɛ turbulence model is used with a cryogenic, non-ideal equation of state, and para-orthohydrogen species evolution. The simulations are validated with experiments and strategies for parametric optimization of this device are presented.
Computer code for simulating pressurized water reactor core
International Nuclear Information System (INIS)
Serrano, A.M.B.
1978-01-01
A computer code was developed for the simulation of the steady-state and transient behaviour of the average channel of a Pressurized Water Reactor core. Point kinetics equations were used, with the reactivity calculated for average temperatures in the channel including fuel and moderator temperature feedbacks. The radial heat conduction equation in the fuel was solved numerically. For calculating the thermodynamic properties of the coolant, the fundamental conservation equations (mass, energy and momentum) were solved. The gap and clad were treated as a resistance added to the film coefficient. The fuel system equations were decoupled from the coolant equations. The program permits changes in the heat transfer correlations and the flow patterns along the coolant channel. Various tests were performed to determine the steady-state and transient response of the PWR core simulator, obtaining results with adequate precision. (author)
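The point kinetics formulation mentioned in the abstract above reduces the neutron population to a pair of coupled ODEs for neutron density and delayed-neutron precursors. A one-delayed-group sketch with explicit Euler integration (the parameter values and feedback-free form are illustrative assumptions, not taken from the report):

```python
def point_kinetics(rho, n0=1.0, beta=0.0065, lam=0.08, Lambda=1e-3,
                   dt=1e-4, steps=1000):
    """Integrate dn/dt = ((rho - beta)/Lambda)*n + lam*C
             and dC/dt = (beta/Lambda)*n - lam*C   (one delayed group)."""
    n = n0
    C = beta * n0 / (Lambda * lam)   # start at precursor equilibrium
    for _ in range(steps):
        dn = ((rho - beta) / Lambda) * n + lam * C
        dC = (beta / Lambda) * n - lam * C
        n += dt * dn
        C += dt * dC
    return n

# rho = 0 (critical): n remains at 1.0
# small positive rho (below beta): power rises on a stable period
```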
Simulating the impacts of fire: A computer program
Ffolliott, Peter F.; Guertin, D. Phillip; Rasmussen, William D.
1988-11-01
Recurrent fire has played a dominant role in the ecology of southwestern ponderosa pine forests. To assess the benefits or losses of fire in these forests, a computer simulation model, called BURN, considers vegetation (mortality, regeneration, and production of herbaceous vegetation), wildlife (populations and habitats), and hydrology (streamflow and water quality). In the formulation of the model, graphical representations (time-trend response curves) of increases or losses (compared to an unburned control) after the occurrence of fire are converted to fixed-term annual ratios, and then annuities for the simulation components. Annuity values higher than 1.0 indicate benefits, while annuity values lower than 1.0 indicate losses. Studies in southwestern ponderosa pine forests utilized in the development of BURN are described briefly.
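The conversion described above, from a time series of burned/unburned response ratios to a single annuity, can be sketched as a standard present-value calculation. The discount rate and the exact BURN procedure are assumptions here; only the interpretation (annuity > 1.0 means benefit) comes from the abstract:

```python
def annuity(annual_ratios, rate=0.04):
    """Equivalent annual value of a stream of fixed-term annual ratios."""
    n = len(annual_ratios)
    # present value of the ratio stream
    pv = sum(r / (1.0 + rate) ** (t + 1)
             for t, r in enumerate(annual_ratios))
    # capital recovery factor spreads the PV back over n equal annual payments
    crf = rate * (1.0 + rate) ** n / ((1.0 + rate) ** n - 1.0)
    return pv * crf

# A constant ratio of 1.0 (no change from the unburned control) yields
# an annuity of exactly 1.0; values above 1.0 indicate a net benefit.
```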
A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION
Energy Technology Data Exchange (ETDEWEB)
Mike Bockelie; Dave Swensen; Martin Denison; Connie Senior; Adel Sarofim; Bene Risio
2002-07-28
This is the seventh Quarterly Technical Report for DOE Cooperative Agreement No.: DE-FC26-00NT41047. The goal of the project is to develop and demonstrate a computational workbench for simulating the performance of Vision 21 Power Plant Systems. Within the last quarter, good progress has been made on the development of the IGCC workbench. A series of parametric CFD simulations for single stage and two stage generic gasifier configurations have been performed. An advanced flowing slag model has been implemented into the CFD based gasifier model. A literature review has been performed on published gasification kinetics. Reactor models have been developed and implemented into the workbench for the majority of the heat exchangers, gas clean up system and power generation system for the Vision 21 reference configuration. Modifications to the software infrastructure of the workbench have been commenced to allow interfacing to the workbench reactor models that utilize the CAPE-Open software interface protocol.
[Possibilities of computer graphics simulation in orthopedic surgery].
Kessler, P; Wiltfang, J; Teschner, M; Girod, B; Neukam, F W
2000-11-01
In addition to standard X-rays, photographic documentation, cephalometric and model analysis, a computer-aided, three-dimensional (3D) simulation system has been developed in close cooperation with the Institute of Communications of the Friedrich-Alexander-Universität Erlangen-Nürnberg. With this simulation system a photorealistic prediction of the expected soft tissue changes can be made. Prerequisites are a 3D reconstruction of the facial skeleton and a 3D laser scan of the face. After data reduction, the two data sets can be matched. Cutting planes enable the transposition of bony segments. The laser scan of the facial surface is combined with the underlying bone via a five-layered soft tissue model to convert bone movements on the soft tissue cover realistically. Further research is necessary to replace the virtual subcutaneous soft tissue model by correct, topographic tissue anatomy.
Simulation models for computational plasma physics: Concluding report
International Nuclear Information System (INIS)
Hewett, D.W.
1994-01-01
In this project, the authors enhanced their ability to numerically simulate bounded plasmas that are dominated by low-frequency electric and magnetic fields. They moved towards this goal in several ways; they are now in a position to play significant roles in the modeling of low-frequency electromagnetic plasmas in several new industrial applications. They have significantly increased their facility with the computational methods invented to solve the low-frequency limit of Maxwell's equations (DiPeso, Hewett, accepted, J. Comp. Phys., 1993). This low-frequency model, called the Streamlined Darwin Field model (SDF; Hewett, Larson, and Doss, J. Comp. Phys., 1992), has now been implemented in a fully non-neutral SDF code, BEAGLE (Larson, Ph.D. dissertation, 1993), and has been further extended to the quasi-neutral limit (DiPeso, Hewett, Comp. Phys. Comm., 1993). In addition, they have resurrected the quasi-neutral, zero-electron-inertia model (ZMR) and begun the task of incorporating internal boundary conditions into this model with the flexibility of those in GYMNOS, a magnetostatic code now used in ion source work (Hewett, Chen, ICF Quarterly Report, July--September, 1993). Finally, near the end of this project, they invented a new type of banded matrix solver that can be implemented on a massively parallel computer -- thus opening the door for the use of all their ADI schemes on these new computer architectures (Mattor, Williams, Hewett, submitted to Parallel Computing, 1993)
Computer simulation of displacement energies for several ceramic materials
Williford, R. E.; Devanathan, R.; Weber, W. J.
1998-05-01
Displacement energies (Ed) are fundamental parameters controlling the production of radiation damage in materials, and as such are useful for understanding and modeling the effects of radiation on materials. These energies are not easily determined experimentally for many ceramic materials. However, advances in computational methodologies and their application to ceramic materials provide a means to determine these energies in a number of materials of interest. Although computationally intensive molecular dynamics methods can be used to determine Ed for the various cations and anions, energy minimization methods provide a more expedient means to obtain reasonable estimates of these energies. In this paper, the energy minimization code General Utility Lattice Program (GULP), which uses a Mott-Littleton approximation to simulate isolated defects in extended solids, is used to calculate displacement energies. The validity of using this code for these computations is established by calculating Ed for several ceramics for which these energies are known. Computational results are in good agreement with the experimental values for alumina, MgO, and ZnO. Results are also presented for two ceramic materials, zircon and spinel, for which few or no experimental values are yet available.
Nonlinear simulations with and computational issues for NIMROD
International Nuclear Information System (INIS)
Sovinec, C.R.
1998-01-01
The NIMROD (Non-Ideal Magnetohydrodynamics with Rotation, Open Discussion) code development project was commissioned by the US Department of Energy in February, 1996 to provide the fusion research community with a computational tool for studying low-frequency behavior in experiments. Specific problems of interest include the neoclassical evolution of magnetic islands and the nonlinear behavior of tearing modes in the presence of rotation and nonideal walls in tokamaks; they also include topics relevant to innovative confinement concepts such as magnetic turbulence. Besides having physics models appropriate for these phenomena, an additional requirement is the ability to perform the computations in realistic geometries. The NIMROD Team is using contemporary management and computational methods to develop a computational tool for investigating low-frequency behavior in plasma fusion experiments. The authors intend to make the code freely available, and are taking steps to make it as easy to learn and use as possible. An example application for NIMROD is the nonlinear toroidal RFP simulation--the first in a series to investigate how toroidal geometry affects MHD activity in RFPs. Finally, the most important issue facing the project is execution time, and they are exploring better matrix solvers and a better parallel decomposition to address this.
Nonlinear simulations with and computational issues for NIMROD
Energy Technology Data Exchange (ETDEWEB)
Sovinec, C.R. [Los Alamos National Lab., NM (United States)
1998-12-31
The NIMROD (Non-Ideal Magnetohydrodynamics with Rotation, Open Discussion) code development project was commissioned by the US Department of Energy in February, 1996 to provide the fusion research community with a computational tool for studying low-frequency behavior in experiments. Specific problems of interest include the neoclassical evolution of magnetic islands and the nonlinear behavior of tearing modes in the presence of rotation and nonideal walls in tokamaks; they also include topics relevant to innovative confinement concepts such as magnetic turbulence. Besides having physics models appropriate for these phenomena, an additional requirement is the ability to perform the computations in realistic geometries. The NIMROD Team is using contemporary management and computational methods to develop a computational tool for investigating low-frequency behavior in plasma fusion experiments. The authors intend to make the code freely available, and are taking steps to make it as easy to learn and use as possible. An example application for NIMROD is the nonlinear toroidal RFP simulation--the first in a series to investigate how toroidal geometry affects MHD activity in RFPs. Finally, the most important issue facing the project is execution time, and they are exploring better matrix solvers and a better parallel decomposition to address this.
Scientific and Computational Challenges of the Fusion Simulation Program (FSP)
International Nuclear Information System (INIS)
Tang, William M.
2011-01-01
This paper highlights the scientific and computational challenges facing the Fusion Simulation Program (FSP) a major national initiative in the United States with the primary objective being to enable scientific discovery of important new plasma phenomena with associated understanding that emerges only upon integration. This requires developing a predictive integrated simulation capability for magnetically-confined fusion plasmas that are properly validated against experiments in regimes relevant for producing practical fusion energy. It is expected to provide a suite of advanced modeling tools for reliably predicting fusion device behavior with comprehensive and targeted science-based simulations of nonlinearly-coupled phenomena in the core plasma, edge plasma, and wall region on time and space scales required for fusion energy production. As such, it will strive to embody the most current theoretical and experimental understanding of magnetic fusion plasmas and to provide a living framework for the simulation of such plasmas as the associated physics understanding continues to advance over the next several decades. Substantive progress on answering the outstanding scientific questions in the field will drive the FSP toward its ultimate goal of developing the ability to predict the behavior of plasma discharges in toroidal magnetic fusion devices with high physics fidelity on all relevant time and space scales. From a computational perspective, this will demand computing resources in the petascale range and beyond together with the associated multi-core algorithmic formulation needed to address burning plasma issues relevant to ITER - a multibillion dollar collaborative experiment involving seven international partners representing over half the world's population. Even more powerful exascale platforms will be needed to meet the future challenges of designing a demonstration fusion reactor (DEMO). Analogous to other major applied physics modeling projects (e
Light & Skin Interactions Simulations for Computer Graphics Applications
Baranoski, Gladimir V G
2010-01-01
Light and Skin Interactions immerses you in one of the most fascinating application areas of computer graphics: appearance simulation. The book first illuminates the fundamental biophysical processes that affect skin appearance, and reviews seminal related works aimed at applications in life and health sciences. It then examines four exemplary modeling approaches as well as definitive algorithms that can be used to generate realistic images depicting skin appearance. An accompanying companion site also includes complete code and data sources for the BioSpec model, which is considered to be the
Scientific and Computational Challenges of the Fusion Simulation Program (FSP)
Energy Technology Data Exchange (ETDEWEB)
William M. Tang
2011-02-09
This paper highlights the scientific and computational challenges facing the Fusion Simulation Program (FSP) a major national initiative in the United States with the primary objective being to enable scientific discovery of important new plasma phenomena with associated understanding that emerges only upon integration. This requires developing a predictive integrated simulation capability for magnetically-confined fusion plasmas that are properly validated against experiments in regimes relevant for producing practical fusion energy. It is expected to provide a suite of advanced modeling tools for reliably predicting fusion device behavior with comprehensive and targeted science-based simulations of nonlinearly-coupled phenomena in the core plasma, edge plasma, and wall region on time and space scales required for fusion energy production. As such, it will strive to embody the most current theoretical and experimental understanding of magnetic fusion plasmas and to provide a living framework for the simulation of such plasmas as the associated physics understanding continues to advance over the next several decades. Substantive progress on answering the outstanding scientific questions in the field will drive the FSP toward its ultimate goal of developing the ability to predict the behavior of plasma discharges in toroidal magnetic fusion devices with high physics fidelity on all relevant time and space scales. From a computational perspective, this will demand computing resources in the petascale range and beyond together with the associated multi-core algorithmic formulation needed to address burning plasma issues relevant to ITER - a multibillion dollar collaborative experiment involving seven international partners representing over half the world's population. Even more powerful exascale platforms will be needed to meet the future challenges of designing a demonstration fusion reactor (DEMO). Analogous to other major applied physics modeling projects (e
Computational strategies for three-dimensional flow simulations on distributed computer systems
Sankar, Lakshmi N.; Weed, Richard A.
1995-01-01
This research effort is directed towards an examination of issues involved in porting large computational fluid dynamics codes in use within the industry to a distributed computing environment. This effort addresses strategies for implementing the distributed computing in a device independent fashion and load balancing. A flow solver called TEAM presently in use at Lockheed Aeronautical Systems Company was acquired to start this effort. The following tasks were completed: (1) The TEAM code was ported to a number of distributed computing platforms including a cluster of HP workstations located in the School of Aerospace Engineering at Georgia Tech; a cluster of DEC Alpha Workstations in the Graphics visualization lab located at Georgia Tech; a cluster of SGI workstations located at NASA Ames Research Center; and an IBM SP-2 system located at NASA ARC. (2) A number of communication strategies were implemented. Specifically, the manager-worker strategy and the worker-worker strategy were tested. (3) A variety of load balancing strategies were investigated. Specifically, the static load balancing, task queue balancing and the Crutchfield algorithm were coded and evaluated. (4) The classical explicit Runge-Kutta scheme in the TEAM solver was replaced with an LU implicit scheme. And (5) the implicit TEAM-PVM solver was extensively validated through studies of unsteady transonic flow over an F-5 wing, undergoing combined bending and torsional motion. These investigations are documented in extensive detail in the dissertation, 'Computational Strategies for Three-Dimensional Flow Simulations on Distributed Computing Systems', enclosed as an appendix.
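The manager-worker and task-queue strategies evaluated in the study above share one idea: idle workers pull the next block of work from a shared queue, so load balances dynamically when block costs vary. A generic thread-based sketch of that pattern (not the TEAM/PVM implementation; the per-block work function here is a stand-in):

```python
import queue
import threading

def run_manager_worker(tasks, work_fn, n_workers=4):
    """Dynamic load balancing: each idle worker pulls the next task."""
    q = queue.Queue()
    results = {}
    lock = threading.Lock()
    for tid, payload in enumerate(tasks):
        q.put((tid, payload))

    def worker():
        while True:
            try:
                tid, payload = q.get_nowait()
            except queue.Empty:
                return                      # no work left; worker exits
            out = work_fn(payload)          # e.g. one grid block's flow solve
            with lock:
                results[tid] = out
            q.task_done()

    threads = [threading.Thread(target=worker) for _ in range(n_workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return [results[i] for i in range(len(tasks))]

# Example: squaring stands in for a per-block computation
# run_manager_worker([1, 2, 3, 4], lambda x: x * x)  -> [1, 4, 9, 16]
```

Static load balancing, by contrast, fixes the block-to-process assignment up front, which is cheaper to coordinate but suffers when block costs are uneven.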
A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION
Energy Technology Data Exchange (ETDEWEB)
Mike Bockelie; Dave Swensen; Martin Denison; Adel Sarofim; Connie Senior
2004-12-22
, immersive environment. The Virtual Engineering Framework (VEF), in effect a prototype framework, was developed through close collaboration with NETL supported research teams from Iowa State University Virtual Reality Applications Center (ISU-VRAC) and Carnegie Mellon University (CMU). The VEF is open source, compatible across systems ranging from inexpensive desktop PCs to large-scale, immersive facilities and provides support for heterogeneous distributed computing of plant simulations. The ability to compute plant economics through an interface that coupled the CMU IECM tool to the VEF was demonstrated, and the ability to couple the VEF to Aspen Plus, a commercial flowsheet modeling tool, was demonstrated. Models were interfaced to the framework using VES-Open. Tests were performed for interfacing CAPE-Open-compliant models to the framework. Where available, the developed models and plant simulations have been benchmarked against data from the open literature. The VEF has been installed at NETL. The VEF provides simulation capabilities not available in commercial simulation tools. It provides DOE engineers, scientists, and decision makers with a flexible and extensible simulation system that can be used to reduce the time, technical risk, and cost to develop the next generation of advanced, coal-fired power systems that will have low emissions and high efficiency. Furthermore, the VEF provides a common simulation system that NETL can use to help manage Advanced Power Systems Research projects, including both combustion- and gasification-based technologies.
Optimization of suspension smelting technology by computer simulation
Energy Technology Data Exchange (ETDEWEB)
Lilius, K.; Jokilaakso, A.; Ahokainen, T.; Teppo, O.; Yang Yongxiang [Helsinki Univ. of Technology, Otaniemi (Finland). Lab. of Materials Processing and Powder Metallurgy
1996-12-31
An industrial-scale flash smelting furnace and waste-heat boilers have been modelled using commercial Computational Fluid Dynamics software. The work has proceeded from cold gas flow to heat transfer, combustion, and two-phase flow simulations. In the present study, the modelling task has been divided into three sub-models: (1) the concentrate burner, (2) the flash smelting furnace (reaction shaft and uptake shaft), and (3) the waste-heat boiler. For the concentrate burner, the flow of the process gas and distribution air, together with the concentrate or a feed mixture, was simulated. An Eulerian-Eulerian approach was used for the carrier gas phase and the dispersed particle phase. A large parametric study was carried out by simulating a laboratory-scale burner with varying turbulence intensities and then extending the simulations to the industrial-scale model. For the flash smelting furnace, the simulation work concentrated on gas and gas-particle two-phase flows, as well as the development of a combustion model for sulphide concentrate particles. Both Eulerian and Lagrangian approaches have been utilised in describing the particle phase and the spreading of the concentrate in the reaction shaft, and the particle tracks have been obtained. Combustion of sulphides was first approximated with gaseous combustion by using a built-in combustion model of the software. The real oxidation reactions of the concentrate particles were then coded as a user-defined subroutine, which was tested with industrial flash smelting cases. For the waste-heat boiler, both flow and heat transfer calculations have been carried out for an old boiler and a modified boiler. SULA 2 Research Programme; 23 refs.
CISBAT 2007 - Software and new information technologies (computer simulation)
Energy Technology Data Exchange (ETDEWEB)
NONE
2007-07-01
This is the eleventh and final part of the proceedings of the 2007 CISBAT conference on Renewables in a changing climate, held in Lausanne, Switzerland. On the subject Information technologies and software the following oral contributions are summarised: 'A comparison study of the likely performance of an advanced naturally ventilated building: the relationship between computer simulation analysis and findings from a monitoring exercise', 'PhotonSim: Developing and testing a Monte Carlo ray tracing software for the simulation of planar solar concentrators' and 'Calibration of multiple energy simulations: case study on a Norwegian school building'. Posters summarised include 'Implementing energy building simulation into design studio: lessons learned in Brazil', 'MaterialsDB.org: a Tool for facilitating information exchange between building material providers and building physics softwares', 'Urban built environment climate modification: a modelling approach' and 'An assessment of the Simple Building Energy Model'. Further, the following Software that was presented on stands is summarised: 'Polysun 4: Simulation of solar thermal systems with complex hydraulics', 'Solangles: an internet online service to draw sun rays in plans and sections', 'Daylight 1-2-3: A text guide and a software as integrated tools for initial daylight/energy design', 'Meteonorm Version 5.0' and 'Transol 2.0 - Software for the design of solar thermal systems'. Further groups of presentations at the conference are reported on in separate database records. An index of authors completes the proceedings.
Pohorille, Andrew; New, Michael H.; Schweighofer, Karl; Wilson, Michael A.; DeVincenzi, Donald L. (Technical Monitor)
2000-01-01
Two of Ernest Overton's lasting contributions to biology are the Meyer-Overton relationship between the potency of an anesthetic and its solubility in oil, and the Overton rule which relates the permeability of a membrane to the oil-water partition coefficient of the permeating molecule. A growing body of experimental evidence, however, cannot be reconciled with these theories. In particular, the molecular nature of membranes, unknown to Overton, needs to be included in any description of these phenomena. Computer simulations are ideally suited for providing atomic-level information about the behavior of small molecules in membranes. The authors discuss simulation studies relevant to Overton's ideas. Through simulations it was found that anesthetics tend to concentrate at interfaces and their anesthetic potency correlates better with solubility at the water-membrane interface than with solubility in oil. Simulation studies of membrane permeation revealed the anisotropic nature of the membranes, as evidenced, for example, by the highly nonuniform distribution of free volume in the bilayer. This, in turn, influences the diffusion rates of solutes, which increase with the depth in the membrane. Small solutes tend to move by hopping between voids in the bilayer, and this hopping motion may be responsible for the deviation from the Overton rule of the permeation rates of these molecules.
Moon, Hongsik
What is the impact of multicore and associated advanced technologies on computational software for science? Most researchers and students have multicore laptops or desktops for their research, and they need computing power to run computational software packages. Computing power was initially derived from Central Processing Unit (CPU) clock speed. That changed when increases in clock speed became constrained by power requirements. Chip manufacturers turned to multicore CPU architectures and associated technological advancements to create the CPUs for the future. Most software applications benefited from the increased computing power the same way that increases in clock speed helped applications run faster. However, for Computational ElectroMagnetics (CEM) software developers, this change was not an obvious benefit - it appeared to be a detriment. Developers were challenged to find a way to correctly utilize the advancements in hardware so that their codes could benefit. The solution was parallelization, and this dissertation details the investigation to address these challenges. Prior to multicore CPUs, advanced computer technologies were compared on performance using benchmark software, and the metric was FLoating-point Operations Per Second (FLOPS), which indicates system performance for scientific applications that make heavy use of floating-point calculations. Is FLOPS an effective metric for parallelized CEM simulation tools on new multicore systems? Parallel CEM software needs to be benchmarked not only by FLOPS but also by the performance of other parameters related to the type and utilization of the hardware, such as the CPU, Random Access Memory (RAM), hard disk, network, etc. The codes need to be optimized for more than just FLOPS, and new parameters must be included in benchmarking. In this dissertation, the parallel CEM software named High Order Basis Based Integral Equation Solver (HOBBIES) is introduced. This code was developed to address the needs of the
Computer Simulation of Hydraulic Systems with Typical Nonlinear Characteristics
Directory of Open Access Journals (Sweden)
D. N. Popov
2017-01-01
Full Text Available The task was to synthesise an adjustable hydraulic system structure, the mathematical model of which takes into account its inherent nonlinearity. Its solution suggests using successive computer simulations, starting with a structure of the linearized stable hydraulic system, which is then complicated by including the essentially non-linear elements. The hydraulic system thus obtained may be unable to meet the Lyapunov stability criterion and be unstable. This can be eliminated through correcting elements. Control of the correction results is provided according to the form of the transition processes due to stepwise variation of the control signal. Computer simulation of a throttle-controlled electrohydraulic servo drive with a rotary output element illustrates the proposed method's application. A constant-pressure power source provides fluid feed for the drive under pressure. For the drive simulation, the following models were involved: the linear model; the model taking into consideration the non-linearity of the flow-dynamic characteristics of a spool-type valve; and the non-linear models that take into account the dry friction in the spool-type valve and the backlash in the steering-angle sensor of the motor shaft. The paper shows the possibility of damping oscillations caused by variable hydrodynamic forces by introducing a correction device. The attached list of references contains 16 sources, which were used to justify and explain certain aspects of automatic control theory and the fluid mechanics of unsteady flows. The article presents 6 block diagrams of the electrohydraulic servo drive and the corresponding transition processes, which have been studied.
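The progression the abstract describes, from a linear model to one with an essential nonlinearity such as dry friction, can be illustrated with a minimal sketch. The second-order drive model, all parameter values, and the Coulomb friction law below are invented for illustration and are not the paper's model.

```python
def simulate(T=5.0, dt=1e-3, k=40.0, c=2.0, m=1.0, Fc=0.5, u=1.0):
    """Step response of a linearized drive m*x'' + c*x' + k*(x - u) = 0
    with an added dry-friction force Fc*sign(x'), the kind of essential
    nonlinearity added after the linear stage. Semi-implicit Euler steps."""
    x, v = 0.0, 0.0
    for _ in range(int(T / dt)):
        friction = Fc * (1 if v > 0 else -1 if v < 0 else 0)
        a = (k * (u - x) - c * v - friction) / m
        v += a * dt   # update velocity first (symplectic Euler, stable here)
        x += v * dt
    return x

final = simulate()  # settles near the commanded position u = 1
```

Dry friction leaves a small dead band |u - x| <= Fc/k around the setpoint, which is exactly the kind of behavior one checks in the transition processes after a stepwise control-signal change.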
Computer Simulation of Embryonic Systems: What can a ...
Standard practice for assessing developmental toxicity is the observation of apical endpoints (intrauterine death, fetal growth retardation, structural malformations) in pregnant rats/rabbits following exposure during organogenesis. EPA's computational toxicology research program (ToxCast) generated vast in vitro cellular and molecular effects data on >1858 chemicals in >600 high-throughput screening (HTS) assays. The diversity of assays has been increased for developmental toxicity with several HTS platforms, including the devTOX-quickPredict assay from Stemina Biomarker Discovery utilizing the human embryonic stem cell line (H9). Translating these HTS data into higher-order predictions of developmental toxicity is a significant challenge. Here, we address the application of computational systems models that recapitulate the kinematics of dynamical cell signaling networks (e.g., SHH, FGF, BMP, retinoids) in a CompuCell3D.org modeling environment. Examples include angiogenesis (angiodysplasia) and dysmorphogenesis. Being numerically responsive to perturbation, these models are amenable to data integration for systems toxicology and Adverse Outcome Pathways (AOPs). The AOP simulation outputs predict potential phenotypes based on the in vitro ToxCast HTS data. A heuristic computational intelligence framework that recapitulates the kinematics of dynamical cell signaling networks in the embryo, together with the in vitro profiling data, produce quantitative pr
Computer simulation of plastic deformation in irradiated metals
International Nuclear Information System (INIS)
Colak, U.
1989-01-01
A computer-based model is developed for the localized plastic deformation in irradiated metals by dislocation channeling, and it is applied to irradiated single crystals of niobium. In the model, the concentrated plastic deformation in the dislocation channels is postulated to occur by virtue of the motion of dislocations in a series of pile-ups on closely spaced parallel slip planes. The dynamics of this dislocation motion is governed by an experimentally determined dependence of dislocation velocity on shear stress. This leads to a set of coupled differential equations for the positions of the individual dislocations in the pile-up as a function of time. Shear displacement in the channel region is calculated from the total distance traveled by the dislocations. The macroscopic shape change in single-crystal metal sheet samples is determined by the axial displacement produced by the shear displacements in the dislocation channels. Computer simulations are performed for the plastic deformation up to 20% engineering strain at a constant strain rate. Results of the computer calculations are compared with experimental observations of the shear stress-engineering strain curve obtained in tensile tests described in the literature. Agreement between the calculated and experimental stress-strain curves is obtained for a shear displacement of 1.20-1.25 μm and 1000 active slip planes per channel, which is reasonable in view of the experimental observations
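The coupled equations of motion for the dislocations in a pile-up can be sketched as follows. This assumes a power-law velocity-stress relation v = v0*(tau/tau0)^m and a 1/r interaction between like-signed dislocations on one slip plane; the exponent, coefficients, and initial spacing are placeholders, not the experimentally determined values used in the study.

```python
def simulate_pileup(n=5, steps=2000, dt=1e-3, tau_app=1.0, A=0.05,
                    v0=1.0, tau0=1.0, m=3.0):
    """Positions of n like-signed dislocations on one slip plane, driven by
    an applied shear stress tau_app and mutually repelling (stress ~ A/r),
    with velocity law v = v0*(|tau|/tau0)^m * sign(tau). Explicit Euler in
    time; all parameter values are illustrative placeholders."""
    xs = [0.1 * (i + 1) for i in range(n)]  # initial spacing ahead of a source
    for _ in range(steps):
        new = []
        for i, xi in enumerate(xs):
            # net resolved shear stress: applied + interaction with the others
            tau = tau_app + sum(A / (xi - xj) for j, xj in enumerate(xs) if j != i)
            v = v0 * (abs(tau) / tau0) ** m * (1 if tau > 0 else -1)
            new.append(xi + v * dt)
        xs = new
    return xs

positions = simulate_pileup()
```

The leading dislocation, pushed forward by all the others, races ahead while the trailing ones are held back, so the pile-up spreads; summing the distances traveled would give the channel shear displacement the abstract describes.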
A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION
International Nuclear Information System (INIS)
Mike Bockelie; Dave Swensen; Martin Denison
2002-01-01
This is the fifth Quarterly Technical Report for DOE Cooperative Agreement No. DE-FC26-00NT41047. The goal of the project is to develop and demonstrate a computational workbench for simulating the performance of Vision 21 Power Plant Systems. Within the last quarter, our efforts have become focused on developing an improved workbench for simulating a gasifier-based Vision 21 energyplex. To provide for interoperability of models developed under Vision 21 and other DOE programs, discussions have been held with DOE and other organizations developing plant simulator tools to review the possibility of establishing a common software interface or protocol to use when developing component models. A component model that employs the CCA protocol has successfully been interfaced to our CCA-enabled workbench. To investigate the software protocol issue, DOE has selected a gasifier-based Vision 21 energyplex configuration for use in testing and evaluating the impacts of different software interface methods. A Memorandum of Understanding with the Cooperative Research Centre for Coal in Sustainable Development (CCSD) in Australia has been completed that will enable collaborative research efforts on gasification issues. Preliminary results have been obtained for a CFD model of a pilot-scale, entrained flow gasifier. A paper was presented at the Vision 21 Program Review Meeting at NETL (Morgantown) that summarized our accomplishments for Year One and plans for Year Two and Year Three
Protein adsorption on nanoparticles: model development using computer simulation.
Shao, Qing; Hall, Carol K
2016-10-19
The adsorption of proteins on nanoparticles results in the formation of the protein corona, the composition of which determines how nanoparticles influence their biological surroundings. We seek to better understand corona formation by developing models that describe protein adsorption on nanoparticles using computer simulation results as data. Using a coarse-grained protein model, discontinuous molecular dynamics simulations are conducted to investigate the adsorption of two small proteins (Trp-cage and WW domain) on a model nanoparticle of diameter 10.0 nm at protein concentrations ranging from 0.5 to 5 mM. The resulting adsorption isotherms are well described by the Langmuir, Freundlich, Temkin and Kiselev models, but not by the Elovich, Fowler-Guggenheim and Hill-de Boer models. We also try to develop a generalized model that can describe protein adsorption equilibrium on nanoparticles of different diameters in terms of dimensionless size parameters. The simulation results for three proteins (Trp-cage, WW domain, and GB3) on four nanoparticles (diameter = 5.0, 10.0, 15.0, and 20.0 nm) illustrate both the promise and the challenge associated with developing generalized models of protein adsorption on nanoparticles.
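The Langmuir model named above can be fitted to an adsorption isotherm via its standard linearized form, 1/q = 1/q_max + (1/(q_max*K))*(1/c). The sketch below uses noise-free synthetic data rather than the paper's simulation results; the parameter values are invented.

```python
def langmuir(c, qmax, K):
    """Langmuir isotherm: adsorbed amount q at free concentration c."""
    return qmax * K * c / (1.0 + K * c)

def fit_langmuir(cs, qs):
    """Recover (qmax, K) from the linearized form
    1/q = 1/qmax + (1/(qmax*K)) * (1/c) by ordinary least squares.
    A sketch of the model-fitting step only; the paper fits isotherms
    sampled from molecular dynamics, not synthetic data."""
    xs = [1.0 / c for c in cs]
    ys = [1.0 / q for q in qs]
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
    intercept = ybar - slope * xbar
    qmax = 1.0 / intercept        # intercept = 1/qmax
    K = intercept / slope         # slope = 1/(qmax*K)
    return qmax, K

# Noise-free synthetic isotherm: the fit should recover the parameters exactly.
cs = [0.5, 1.0, 2.0, 3.0, 5.0]          # concentrations, e.g. mM
qs = [langmuir(c, 12.0, 0.8) for c in cs]
qmax, K = fit_langmuir(cs, qs)
```

The same least-squares machinery, with different linearizations, underlies the Freundlich and Temkin fits the abstract compares.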
COMPUTER GRAPHICS IN SIMULATION OF CARDIOVASCULAR TRANSPORT PHENOMENA*
Sidell, P. M.; Anderson, D. U.; Knopp, T. J.; Bassingthwaighte, J. B.
2010-01-01
Simulation is a necessary tool if we are to understand better the complexities involved in cardiovascular transport. While some of the phenomena modeled can be described analytically, perusal of the equations alone often doesn’t result in full appreciation of the model system. It therefore becomes pertinent to utilize computer graphics in order to enhance simulation of physiologic transport processes. Graphic representation not only facilitates interaction between the investigator and the simulation, it provides a juxtaposition of the model to the real system, as well as a simplification of relationships between various features of the model. Increased mathematical sophistication required in the investigation of cardiovascular transport phenomena often makes traditional graphic representation cumbersome. Therefore several different types of graphics have been utilized, including 2-, 3-, and 4-dimensional displays. The methods and algorithms for these displays have been generalized to make them easy to use over a broad spectrum of applications. In some cases we have generated motion pictures of sequential model solutions which have increased and accelerated model comprehension, as well as been valuable for teaching purposes. PMID:21743760
Value stream mapping in a computational simulation model
Directory of Open Access Journals (Sweden)
Ricardo Becker Mendes de Oliveira
2014-08-01
Full Text Available The decision-making process has been extensively studied by researchers and executives. This paper aims to use the methodology of Value Stream Mapping (VSM) in an integrated manner with a computer simulation model, in order to expand managers' decision-making vision. The object of study is a production system that involves a process of automatic packaging of products, where it became necessary to implement changes in order to accommodate new products, so that the detection of bottlenecks and the visualization of the impacts generated by future modifications became necessary. The simulation aims to support managers' decisions, considering that the system involves several variables whose behaviors define the complexity of the process. The main results were a significant reduction in project costs by anticipating system behavior, together with the use of Value Stream Mapping to identify activities that do or do not add value to the process. The validation of the simulation model will occur with the current map of the system and with the inclusion of Kaizen events, so that waste in future maps is found in a practical and reliable way, which can support decision-making.
Development of computational science in JAEA. R and D of simulation
International Nuclear Information System (INIS)
Nakajima, Norihiro; Araya, Fumimasa; Hirayama, Toshio
2006-01-01
R and D of computational science in JAEA (Japan Atomic Energy Agency) is described. The computing environment, the R and D system in CCSE (Center for Computational Science and e-Systems), joint computational science research in Japan and worldwide, the development of computer technologies, some examples of simulation research, a 3-dimensional image vibrational platform system, simulation research on FBR cycle techniques, simulation of large-scale thermal stress for the development of a steam generator, simulation research on fusion energy techniques, the development of grid computing technology, simulation research on quantum beam techniques, and biological molecule simulation research are explained. The organization of JAEA, the development of computational science in JAEA, the JAEA network, international collaboration in computational science, and the environment of the ITBL (Information-Technology Based Laboratory) project are illustrated. (S.Y.)
A benchmark on computational simulation of a CT fracture experiment
International Nuclear Information System (INIS)
Franco, C.; Brochard, J.; Ignaccolo, S.; Eripret, C.
1992-01-01
For a better understanding of the fracture behavior of cracked welds in piping, FRAMATOME, EDF and CEA have launched an important analytical research program. This program is mainly based on the analysis of the effects of the geometrical parameters (the crack size and the welded joint dimensions) and the yield strength ratio on the fracture behavior of several cracked configurations. Two approaches have been selected for the fracture analyses: on one hand, the global approach based on the concept of the crack driving force J, and on the other hand, a local approach to ductile fracture. In this approach the crack initiation and growth are modelled by the nucleation, growth and coalescence of cavities in front of the crack tip. The model selected in this study estimates only the growth of the cavities, using the Rice and Tracey relationship. The present study deals with a benchmark on computational simulation of CT fracture experiments using three computer codes: ALIBABA, developed by EDF; the CEA code CASTEM 2000; and the FRAMATOME code SYSTUS. The paper is split into three parts. First, the authors present the experimental procedure for high-temperature toughness testing of two CT specimens taken from a welded pipe characteristic of pressurized water reactor primary piping. Secondly, considerations are outlined about the finite element analysis and the application procedure. A detailed description is given of the boundary and loading conditions, the mesh characteristics, the numerical scheme involved, and the void growth computation. Finally, the comparisons between numerical and experimental results are presented up to crack initiation, the tearing process not being taken into account in the present study. The variations of J and of the local variables used to estimate the damage around the crack tip (triaxiality and hydrostatic stresses, plastic deformations, void growth ...) are computed as a function of the increasing load
International Nuclear Information System (INIS)
Chernyshenko, Dmitri; Fangohr, Hans
2015-01-01
In the finite difference method which is commonly used in computational micromagnetics, the demagnetizing field is usually computed as a convolution of the magnetization vector field with the demagnetizing tensor that describes the magnetostatic field of a cuboidal cell with constant magnetization. An analytical expression for the demagnetizing tensor is available; however, at distances far from the cuboidal cell, the numerical evaluation of the analytical expression can be very inaccurate. Due to this large-distance inaccuracy, numerical packages such as OOMMF compute the demagnetizing tensor using the explicit formula at distances close to the originating cell, but at distances far from the originating cell a formula based on an asymptotic expansion has to be used. In this work, we describe a method to calculate the demagnetizing field by numerical evaluation of the multidimensional integral in the demagnetizing tensor terms using a sparse grid integration scheme. This method improves the accuracy of computation at intermediate distances from the origin. We compute and report the accuracy of (i) the numerical evaluation of the exact tensor expression, which is best for short distances, (ii) the asymptotic expansion, best suited for large distances, and (iii) the new method based on numerical integration, which is superior to methods (i) and (ii) for intermediate distances. For all three methods, we show the measurements of accuracy and execution time as a function of distance, for calculations using single precision (4-byte) and double precision (8-byte) floating point arithmetic. We make recommendations for the choice of scheme order and integrating coefficients for the numerical integration method (iii). - Highlights: • We study the accuracy of demagnetization in finite difference micromagnetics. • We introduce a new sparse integration method to compute the tensor more accurately. • Newell, sparse integration and asymptotic method are compared for all ranges
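A toy version of the numerical-integration idea: evaluate the cell-averaged 1/|r| kernel (a stand-in for one demagnetizing-tensor term) by tensor-product Gauss-Legendre quadrature. At large distance R the result should approach 1/R. The cell size, kernel, and quadrature orders are chosen for illustration; the paper's scheme uses sparse grids on the actual tensor terms.

```python
import math

# Gauss-Legendre nodes/weights on [-1, 1], orders 2 and 4 (standard values).
GL = {
    2: ([-0.5773502691896257, 0.5773502691896257], [1.0, 1.0]),
    4: ([-0.8611363115940526, -0.3399810435848563,
          0.3399810435848563,  0.8611363115940526],
        [0.3478548451374538, 0.6521451548625461,
         0.6521451548625461, 0.3478548451374538]),
}

def cell_potential(R, order):
    """Average of the 1/|r| kernel over a unit cube centred at the origin,
    seen from the observation point (R, 0, 0), by tensor-product
    Gauss-Legendre quadrature. A toy stand-in for one tensor integral."""
    xs, ws = GL[order]
    total = 0.0
    for xi, wi in zip(xs, ws):
        for yj, wj in zip(xs, ws):
            for zk, wk in zip(xs, ws):
                # map [-1,1]^3 onto the unit cube [-0.5,0.5]^3
                x, y, z = 0.5 * xi, 0.5 * yj, 0.5 * zk
                r = math.sqrt((R - x) ** 2 + y ** 2 + z ** 2)
                total += wi * wj * wk / r
    return total / 8.0  # Jacobian (1/2)^3; cube volume is 1
```

For a cube the quadrupole correction vanishes, so the average is 1/R + O(1/R^5); at R = 10 both orders are already very close to 0.1, while near the cell the higher order pays off, mirroring the short/intermediate/far-range trade-off the abstract describes.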
International Nuclear Information System (INIS)
Jejcic, A.; Maillard, J.; Silva, J.; Auguin, M.; Boeri, F.
1989-01-01
Results obtained on a strongly coupled parallel computer are reported. They concern Monte-Carlo simulation and pattern recognition. Though the calculations were made on an experimental computer of rather low processing power, it is believed that the quoted figures could give useful indications on architectural choices for dedicated computers. (orig.)
Computer simulation of the Blumlein pulse forming network
International Nuclear Information System (INIS)
Edwards, C.B.
1981-03-01
A computer simulation of the Blumlein pulse-forming network is described. The model is able to treat the case of time varying loads, non-zero conductor resistance, and switch closure effects as exhibited by real systems employing non-ohmic loads such as field-emission vacuum diodes in which the impedance is strongly time and voltage dependent. The application of the code to various experimental arrangements is discussed, with particular reference to the prediction of the behaviour of the output circuit of 'ELF', the electron beam generator in operation at the Rutherford Laboratory. The output from the code is compared directly with experimentally obtained voltage waveforms applied to the 'ELF' diode. (author)
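A drastically reduced sketch of the strongly voltage-dependent load described above: a charged capacitance discharging into a field-emission diode obeying a Child-Langmuir-like I = k*V**1.5 law, integrated with a simple explicit step. The real code models the Blumlein line sections and switch closure; every parameter value here is invented for illustration.

```python
def discharge(V0=1.0e6, C=1.0e-9, k=2.0e-6, T=100e-9, dt=1e-11):
    """Lumped sketch: capacitance C charged to V0 discharging into a diode
    with current I = k*V**1.5 (space-charge-limited, so the impedance is
    strongly voltage dependent). Explicit Euler; illustrative values only."""
    V = V0
    trace = []
    for _ in range(int(T / dt)):
        I = k * V ** 1.5        # diode current at the present voltage
        V -= (I / C) * dt       # capacitor voltage droop over one step
        trace.append(V)
    return trace

trace = discharge()
```

This ODE has the closed form V(t) = (V0**-0.5 + (k/(2*C))*t)**-2, which makes it a convenient check that the time-stepping scheme tracks a nonlinear load correctly before moving to the full time-varying-load circuit model.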
Simulating Smoke Filling in Big Halls by Computational Fluid Dynamics
Directory of Open Access Journals (Sweden)
W. K. Chow
2011-01-01
Full Text Available Many tall halls of large volume have been built, and more are to be built, in construction projects in the Far East, particularly Mainland China, Hong Kong, and Taiwan. Smoke is identified as the key hazard to handle. Consequently, smoke exhaust systems are specified in the fire codes in those areas. An update on applying Computational Fluid Dynamics (CFD) in smoke exhaust design is presented in this paper. Key points to note in CFD simulations of smoke filling due to a fire in a big hall are discussed. Mathematical aspects concerning the discretization of the partial differential equations and the algorithms for solving the velocity-pressure linked equations are briefly outlined. Results predicted by CFD with different free boundary conditions are compared with those from room fire tests. Standards on grid size, relaxation factors, convergence criteria, and false diffusion should be set up for numerical experiments with CFD.
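The role of relaxation factors and convergence criteria mentioned above can be illustrated on the simplest elliptic model problem: successive over-relaxation (SOR) for the Laplace equation on a small grid, with the convergence test applied to the largest nodal update. This is a toy stand-in, not a CFD smoke-filling solver; grid size, relaxation factor, and tolerance are illustrative.

```python
def solve_laplace(n=20, tol=1e-6, relax=1.5, max_iter=20000):
    """Gauss-Seidel with over-relaxation (SOR) for Laplace's equation on an
    n x n grid with one 'hot' wall held at 1 and the others at 0."""
    T = [[0.0] * n for _ in range(n)]
    for j in range(n):
        T[0][j] = 1.0  # fixed hot boundary (top wall)
    for it in range(max_iter):
        worst = 0.0
        for i in range(1, n - 1):
            for j in range(1, n - 1):
                new = 0.25 * (T[i + 1][j] + T[i - 1][j]
                              + T[i][j + 1] + T[i][j - 1])
                change = relax * (new - T[i][j])  # over-relaxed update
                T[i][j] += change
                worst = max(worst, abs(change))
        if worst < tol:            # convergence criterion: largest update
            return T, it + 1
    return T, max_iter

field, iters = solve_laplace()
```

Raising the relaxation factor toward its optimum cuts the iteration count sharply, while too loose a tolerance stops the sweep before the field is truly converged, the same trade-offs that the standards called for above would pin down for CFD runs.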
Application of Computer Simulation Modeling to Medication Administration Process Redesign
Directory of Open Access Journals (Sweden)
Nathan Huynh
2012-01-01
Full Text Available The medication administration process (MAP) is one of the most high-risk processes in health care. MAP workflow redesign can precipitate both unanticipated and unintended consequences that can lead to new medication safety risks and workflow inefficiencies. Thus, it is necessary to have a tool to evaluate the impact of redesign approaches in advance of their clinical implementation. This paper discusses the development of an agent-based MAP computer simulation model that can be used to assess the impact of MAP workflow redesign on MAP performance. The agent-based approach is adopted in order to capture Registered Nurse medication administration performance. The process of designing, developing, validating, and testing such a model is explained. Work is underway to collect MAP data in a hospital setting to provide more complex MAP observations to extend development of the model to better represent the complexity of MAP.
A model ecosystem experiment and its computational simulation studies
International Nuclear Information System (INIS)
Doi, M.
2002-01-01
A simplified microbial model ecosystem and its computer simulation model are introduced as an eco-toxicity test for the assessment of environmental responses to environmental impacts. To take the effects of the interactions between species and the environment into account, one option is to select a keystone species on the basis of ecological knowledge and to put it in the single-species toxicity test. Another proposed option is to conduct the eco-toxicity tests as an experimental micro-ecosystem study combined with a theoretical model-ecosystem analysis. With these tests, the stressors that are more harmful to ecosystems should be replaced with less harmful ones on the basis of unified measures. Management of radioactive materials, chemicals, hyper-eutrophication, and other artificial disturbances of the ecosystem should be discussed consistently from the unified viewpoint of environmental protection. (N.C.)
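A minimal theoretical model-ecosystem analysis of the kind referred to above can be sketched with a two-species Lotka-Volterra system carrying an added stressor-induced mortality term. The species, rate constants, and stressor form are invented for illustration, not the paper's model.

```python
def run_ecosystem(stress=0.0, T=50.0, dt=0.001):
    """Prey/predator Lotka-Volterra densities after time T, with an extra
    mortality term 'stress' applied to the prey. Explicit Euler steps;
    all rates are illustrative placeholders."""
    x, y = 1.0, 0.5                      # prey, predator densities
    for _ in range(int(T / dt)):
        dx = (1.0 - stress) * x - 0.5 * x * y   # prey growth minus predation
        dy = 0.25 * x * y - 0.3 * y             # predator gain minus mortality
        x += dx * dt
        y += dy * dt
    return x, y

baseline = run_ecosystem(0.0)   # unstressed: both species persist
stressed = run_ecosystem(1.5)   # stressor exceeds prey growth: collapse
```

Comparing the stressed run against the baseline is the model-ecosystem analogue of ranking stressors by harm: here a stress above the prey's growth rate drives the prey, and then the predator, to extinction, an interaction effect a single-species test would miss.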
Experiences using DAKOTA stochastic expansion methods in computational simulations.
Energy Technology Data Exchange (ETDEWEB)
Templeton, Jeremy Alan; Ruthruff, Joseph R.
2012-01-01
Uncertainty quantification (UQ) methods bring rigorous statistical connections to the analysis of computational and experimental data, and provide a basis for probabilistically assessing margins associated with safety and reliability. The DAKOTA toolkit developed at Sandia National Laboratories implements a number of UQ methods, which are being increasingly adopted by modeling and simulation teams to facilitate these analyses. This report disseminates results on the performance of DAKOTA's stochastic expansion methods for UQ on a representative application. Our results provide a number of insights that may be of interest to future users of these methods, including the behavior of the methods in estimating responses at varying probability levels, and the expansion levels for the methodologies that may be needed to achieve convergence.
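The stochastic (polynomial chaos) expansions that DAKOTA implements can be illustrated in one variable: project a response f of a standard normal input onto probabilists' Hermite polynomials by Gauss-Hermite quadrature, then read the mean and variance off the coefficients. This is a textbook toy, not DAKOTA's interface.

```python
import math

# 3-point probabilists' Gauss-Hermite rule (exact for polynomials to degree 5)
NODES = [-math.sqrt(3.0), 0.0, math.sqrt(3.0)]
WEIGHTS = [1.0 / 6.0, 2.0 / 3.0, 1.0 / 6.0]

def he(n, x):
    """Probabilists' Hermite polynomials He_0, He_1, He_2."""
    return [1.0, x, x * x - 1.0][n]

def pce_coefficients(f):
    """Project f(xi), xi ~ N(0,1), onto He_0..He_2 by quadrature:
    c_n = E[f * He_n] / E[He_n**2]."""
    norms = [1.0, 1.0, 2.0]  # E[He_n^2] for n = 0, 1, 2
    coeffs = []
    for n in range(3):
        num = sum(w * f(x) * he(n, x) for x, w in zip(NODES, WEIGHTS))
        coeffs.append(num / norms[n])
    return coeffs

# f(xi) = xi^2 has the exact expansion He_0 + He_2, so c = [1, 0, 1];
# the mean is c[0] = 1 and the variance is c[1]**2 + 2*c[2]**2 = 2.
c = pce_coefficients(lambda x: x * x)
```

Once the coefficients are in hand, response statistics at any probability level come from sampling the cheap polynomial surrogate instead of the expensive simulation, which is the payoff of the expansion-level convergence studies the report describes.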
COMPUTER EMULATORS AND SIMULATORS OF MEASURING INSTRUMENTS IN PHYSICS LESSONS
Directory of Open Access Journals (Sweden)
Yaroslav Yu. Dyma
2010-10-01
Full Text Available A prominent feature of educational physics experiments at the present stage is the use of computer equipment and special software, namely virtual measuring instruments. The purpose of this article is to explain when virtual instruments can be used to conduct a real experiment (they are emulators) and when a virtual one (they are simulators). For better understanding, the implementation of one laboratory experiment using software of both types is given. Since, in learning physics, preference should be given to carrying out natural experiments that study real phenomena and measure real physical quantities, the most promising direction appears to be the examination of programs emulating measuring instruments for their further introduction into the educational process.
Mixed-Language High-Performance Computing for Plasma Simulations
Directory of Open Access Journals (Sweden)
Quanming Lu
2003-01-01
Full Text Available Java is receiving increasing attention as the most popular platform for distributed computing. However, programmers are still reluctant to embrace Java as a tool for writing scientific and engineering applications due to its still noticeable performance drawbacks compared with other programming languages such as Fortran or C. In this paper, we present a hybrid Java/Fortran implementation of a parallel particle-in-cell (PIC) algorithm for plasma simulations. In our approach, the time-consuming components of this application are designed and implemented as Fortran subroutines, while less calculation-intensive components usually involved in building the user interface are written in Java. The two types of software modules have been glued together using the Java native interface (JNI). Our mixed-language PIC code was tested and its performance compared with pure Java and Fortran versions of the same algorithm on a Sun E6500 SMP system and a Linux cluster of Pentium III machines.
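The compute-heavy kernels that the authors move into Fortran are inner loops such as charge deposition. A minimal cloud-in-cell deposition on a 1D periodic grid, sketched here in plain Python purely to show the structure of such a kernel (the paper's actual code is Fortran called through JNI; all names here are illustrative):

```python
def deposit_charge(positions, q, nx, dx):
    """1D cloud-in-cell (linear-weighting) charge deposition on a periodic grid."""
    rho = [0.0] * nx
    for xp in positions:
        cell = int(xp // dx)
        frac = xp / dx - cell                 # fractional position within the cell
        rho[cell % nx] += q * (1.0 - frac) / dx
        rho[(cell + 1) % nx] += q * frac / dx  # share with the neighboring node
    return rho

rho = deposit_charge([0.5, 1.25, 3.9], q=1.0, nx=4, dx=1.0)
total = sum(r * 1.0 for r in rho)  # deposited charge equals 3 particles * q
```

Because this loop touches every particle every time step, it dominates run time, which is exactly why the hybrid design keeps it in compiled Fortran.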
Kim, Sangroh; Song, Haijun; Movsas, Benjamin; Chetty, Indrin J
2012-01-01
As multidetector computed tomography (MDCT) scanning is routinely performed for treatment planning in radiation oncology, understanding the characteristics of the MDCT x-ray beam is essential to accurately estimate patient dose. The purpose of this study is to characterize the x-ray beams of two commercial MDCT simulators widely used in radiation oncology by Monte Carlo (MC) simulations. X-ray tube systems of two wide bore MDCT scanners (GE LightSpeed RT 4 and Philips Brilliance Big Bore) were modeled in the BEAMNRC/EGSNRC MC system. All the tube components were modeled, from targets to bowtie filters. To validate our MC models, the authors measured half-value layers (HVLs) using aluminum sheets and multifunctional radiation detectors and compared them to those obtained from MC simulations for 120 kVp beams. The authors also compared x-ray spectra obtained from MC simulation to the data provided by the manufacturers. Additionally, lateral/axial beam profiles were measured in-air using radiochromic films and compared to the MC results. To understand the scatter effect, the authors also derived the scatter-to-primary energy fluence ratio (SPR) profiles and calculated the total SPR for each CT system with the CT dose index (CTDI) head and body phantoms using the BEAMNRC system. The authors found that the HVL, x-ray spectrum and beam profiles of the MC simulations agreed well with the manufacturer-specified data, within 1%-10% on average for both scanners. The total SPR ranged from 7.8% to 13.7% for the head phantom and from 10.7% to 18.9% for the body phantom. The authors demonstrate full MC simulations of two commercial MDCT simulators to characterize their x-ray beams. This study may be useful for establishing patient-specific dosimetry for MDCT systems.
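The half-value layer used above to validate the models follows from exponential attenuation, I(t) = I0 exp(-mu t), so HVL = ln 2 / mu. A small illustrative calculation (the attenuation coefficient below is a made-up number, not data from the study):

```python
import math

def hvl_from_mu(mu_per_mm):
    """Half-value layer: the absorber thickness that halves beam intensity."""
    return math.log(2.0) / mu_per_mm

mu = 0.1                          # hypothetical linear attenuation coefficient, 1/mm
t = hvl_from_mu(mu)               # ~6.93 mm for this mu
transmitted = math.exp(-mu * t)   # fraction transmitted through one HVL
```

Measuring HVLs with aluminum sheets and matching them against the simulated beam is a standard consistency check on the modeled spectrum.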
Uncertainty analysis of NDA waste measurements using computer simulations
International Nuclear Information System (INIS)
Blackwood, L.G.; Harker, Y.D.; Yoon, W.Y.; Meachum, T.R.
2000-01-01
plutonium in a variety of waste types contained in 208-L drums measured by the passive active neutron (PAN) radioassay system at the Idaho National Engineering and Environmental Laboratory (INEEL). Computer simulation of the PAN system performance uses the Monte Carlo N-Particle (MCNP) code to produce a neutron transport calculation for a simulated waste drum. A follow-up program was written to combine the MCNP output with other parameters generated by the modeling process to yield simulated measured plutonium mass values. The accuracy of the simulations is verified using surrogate waste drums with known contents.
Computer simulation of chemical reactions in porous materials
Turner, Christoffer Heath
Understanding reactions in nanoporous materials from a purely experimental perspective is a difficult task. Measuring the chemical composition of a reacting system within a catalytic material is usually only accomplished through indirect methods, and it is usually impossible to distinguish between true chemical equilibrium and metastable states. In addition, measuring molecular orientation or distribution profiles within porous systems is not easily accomplished. However, molecular simulation techniques are well-suited to these challenges. With appropriate simulation techniques and realistic molecular models, it is possible to validate the dominant physical and chemical forces controlling nanoscale reactivity. Novel nanostructured catalysts and supports can be designed, optimized, and tested using high-performance computing and advanced modeling techniques in order to guide the search for next-generation catalysts---setting new targets for the materials synthesis community. We have simulated the conversion of several different equilibrium-limited reactions within microporous carbons and we find that the pore size, pore geometry, and surface chemistry are important factors for determining the reaction yield. The equilibrium-limited reactions that we have modeled include nitric oxide dimerization, ammonia synthesis, and the esterification of acetic acid, all of which show yield enhancements within microporous carbons. In conjunction with a yield enhancement of the esterification reaction, selective adsorption of ethyl acetate within carbon micropores demonstrates an efficient method for product recovery. Additionally, a new method has been developed for simulating reaction kinetics within porous materials and other heterogeneous environments. The validity of this technique is first demonstrated by reproducing the kinetics of hydrogen iodide decomposition in the gas phase, and then predictions are made within slit-shaped carbon pores and carbon nanotubes. The rate
Petascale computation of multi-physics seismic simulations
Gabriel, Alice-Agnes; Madden, Elizabeth H.; Ulrich, Thomas; Wollherr, Stephanie; Duru, Kenneth C.
2017-04-01
Capturing the observed complexity of earthquake sources in concurrence with seismic wave propagation simulations is an inherently multi-scale, multi-physics problem. In this presentation, we present simulations of earthquake scenarios resolving high-detail dynamic rupture evolution and high frequency ground motion. The simulations combine a multitude of representations of model complexity, such as non-linear fault friction, thermal and fluid effects, heterogeneous fault stress and fault strength initial conditions, fault curvature and roughness, and on- and off-fault non-elastic failure to capture dynamic rupture behavior at the source; and seismic wave attenuation, 3D subsurface structure and bathymetry impacting seismic wave propagation. Performing such scenarios at the necessary spatio-temporal resolution requires highly optimized and massively parallel simulation tools which can efficiently exploit HPC facilities. Our up to multi-PetaFLOP simulations are performed with SeisSol (www.seissol.org), an open-source software package based on an ADER-Discontinuous Galerkin (DG) scheme solving the seismic wave equations in velocity-stress formulation in elastic, viscoelastic, and viscoplastic media with high-order accuracy in time and space. Our flux-based implementation of frictional failure remains free of spurious oscillations. Tetrahedral unstructured meshes allow for complicated model geometry. SeisSol has been optimized on all software levels, including: assembler-level DG kernels which obtain 50% peak performance on some of the largest supercomputers worldwide; an overlapping MPI-OpenMP parallelization shadowing the multiphysics computations; usage of local time stepping; parallel input and output schemes; and direct interfaces to community standard data formats. All these factors help to minimise the time-to-solution. The results presented highlight the fact that modern numerical methods and hardware-aware optimization for modern supercomputers are essential.
In Silico Dynamics: computer simulation in a Virtual Embryo ...
Abstract: Utilizing cell biological information to predict higher order biological processes is a significant challenge in predictive toxicology. This is especially true for highly dynamical systems such as the embryo, where morphogenesis, growth and differentiation require precisely orchestrated interactions between diverse cell populations. In patterning the embryo, genetic signals set up spatial information that cells then translate into a coordinated biological response. This can be modeled as 'biowiring diagrams' representing genetic signals and responses. Because the hallmark of multicellular organization resides in the ability of cells to interact with one another via well-conserved signaling pathways, multiscale computational (in silico) models that enable these interactions provide a platform to translate cellular-molecular perturbations into higher order predictions. Just as 'the Cell' is the fundamental unit of biology, so too should it be the computational unit ('Agent') for modeling embryogenesis. As such, we constructed multicellular agent-based models (ABM) with 'CompuCell3D' (www.compucell3d.org) to simulate kinematics of complex cell signaling networks and enable critical tissue events for use in predictive toxicology. Seeding the ABMs with HTS/HCS data from ToxCast demonstrated the potential to predict, quantitatively, the higher order impacts of chemical disruption at the cellular or biochemical level. This is demonstrate
System for simulating fluctuation diagnostics for application to turbulence computations
International Nuclear Information System (INIS)
Bravenec, R.V.; Nevins, W.M.
2006-01-01
Present-day nonlinear microstability codes are able to compute the saturated fluctuations of a turbulent fluid versus space and time, whether the fluid be liquid, gas, or plasma. They are therefore able to determine turbulence-induced fluid (or particle) and energy fluxes. These codes, however, must be tested against experimental data not only with respect to transport but also characteristics of the fluctuations. The latter is challenging because of limitations in the diagnostics (e.g., finite spatial resolution) and the fact that the diagnostics typically do not measure exactly the quantities that the codes compute. In this work, we present a system based on IDL® analysis and visualization software in which user-supplied 'diagnostic filters' are applied to the code outputs to generate simulated diagnostic signals. The same analysis techniques as applied to the measurements, e.g., digital time-series analysis, may then be applied to the synthesized signals. Their statistical properties, such as rms fluctuation level, mean wave numbers, phase and group velocities, correlation lengths and times, and in some cases full S(k,ω) spectra, can then be compared directly to those of the measurements
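A 'diagnostic filter' in the sense described takes the code's raw fluctuation field and degrades it to what an instrument would actually record, e.g. by smoothing over a finite spatial resolution before computing statistics. A toy version of this idea (plain Python, not the IDL-based system of the paper; the Gaussian kernel and all names are illustrative):

```python
import math

def gaussian_filter_1d(signal, dx, resolution):
    """Convolve a periodic sampled signal with a normalized Gaussian of width `resolution`."""
    n = len(signal)
    half = int(3 * resolution / dx) + 1
    kernel = [math.exp(-0.5 * (i * dx / resolution) ** 2) for i in range(-half, half + 1)]
    norm = sum(kernel)
    out = []
    for i in range(n):
        acc = 0.0
        for j, k in enumerate(kernel):
            acc += k * signal[(i + j - half) % n]  # periodic boundary
        out.append(acc / norm)
    return out

def rms(sig):
    mean = sum(sig) / len(sig)
    return (sum((s - mean) ** 2 for s in sig) / len(sig)) ** 0.5

raw = [math.sin(2 * math.pi * 8 * i / 64) for i in range(64)]   # short-wavelength mode
seen = gaussian_filter_1d(raw, dx=1.0, resolution=2.0)          # what the "diagnostic" sees
```

The filtered signal has a lower rms fluctuation level than the raw field, which is exactly why code-measurement comparisons must filter the simulation rather than compare raw outputs directly.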
Optomechanical verification of COSTAR using computer graphics simulation
Hancock, Dennis M.
1993-10-01
One of NASA's planned tasks during the first servicing mission to the Hubble Space Telescope in December 1993 is to correct the well known vision problem of the telescope due to an incorrect fabrication of the primary mirror. An exhaustive study of solutions to this problem resulted in a recommendation to place dime-sized pairs of mirrors into the beam paths of five instrument channels to correct the spherical aberration attendant to the primary mirror. The name of the mechanism designed to carry these correction optics into the focal plane region of the telescope is COSTAR (Corrective Optics Space Telescope Axial Replacement). For COSTAR to successfully deploy, four articulating arms carrying the correction optics into the crowded focal plane volume of the telescope must physically clear another opto-mechanical device sharing this space, the Wide-Field Planetary Camera (WF/PC). This paper describes the application of 3-dimensional computer graphics in a through-the-window virtual reality environment to simulate and visualize the planned deployment of COSTAR. Several computer generated animation sequences are shown that verify mechanical clearance of COSTAR's arms with respect to WF/PC.
A hybrid computer simulation of reactor spatial dynamics
International Nuclear Information System (INIS)
Hinds, H.W.
1977-08-01
The partial differential equations describing the one-speed spatial dynamics of thermal neutron reactors were converted to a set of ordinary differential equations, using finite-difference approximations for the spatial derivatives. The variables were then normalized to a steady-state reference condition in a novel manner, to yield an equation set particularly suitable for implementation on a hybrid computer. One Applied Dynamics AD/FIVE analog-computer console is capable of solving, all in parallel, up to 30 simultaneous differential equations. This corresponds roughly to eight reactor nodes, each with two active delayed-neutron groups. To improve accuracy, an increase in the number of nodes is usually required. Using the Hsu-Howe multiplexing technique, an 8-node, one-dimensional module was switched back and forth between the left and right halves of the reactor, to simulate a 16-node model, also in one dimension. These two versions (8 or 16 nodes) of the model were tested on benchmark problems of the loss-of-coolant type, which were also solved using the digital code FORSIM, with two energy groups and 26 nodes. Good agreement was obtained between the two solution techniques. (author)
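The conversion described above, replacing spatial derivatives with finite differences so that the PDE becomes a set of coupled ODEs, is the standard method of lines. A minimal sketch for a one-speed, one-dimensional diffusion-absorption equation (the node count, coefficients, and zero-gradient boundaries are illustrative, not the paper's reactor model or its normalization):

```python
def rhs(phi, D, sigma_a, dx):
    """d(phi)/dt at each node: central-difference diffusion plus absorption."""
    n = len(phi)
    dphi = [0.0] * n
    for i in range(n):
        left = phi[i - 1] if i > 0 else phi[i]        # zero-gradient boundary
        right = phi[i + 1] if i < n - 1 else phi[i]
        dphi[i] = D * (left - 2 * phi[i] + right) / dx**2 - sigma_a * phi[i]
    return dphi

# explicit Euler time stepping of the resulting ODE set
phi = [1.0] * 8            # eight nodes, as in the analog module described
dt, dx = 0.001, 0.1
for _ in range(1000):
    d = rhs(phi, D=0.01, sigma_a=1.0, dx=dx)
    phi = [p + dt * dp for p, dp in zip(phi, d)]
# a flat profile stays flat and decays roughly as exp(-sigma_a * t)
```

On the hybrid machine each such ODE was patched onto analog integrators and solved in parallel, which is why the node count was limited by the number of available amplifiers.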
Feedback controlled electrical nerve stimulation: a computer simulation.
Doruk, R Ozgur
2010-07-01
The role of repetitive firing in neurophysiologic or neuropsychiatric disorders, such as Parkinson's disease, epilepsy and bipolar-type disorders, has always been a topic of medical research, as therapies target either the cessation of firing or a decrease in its frequency. In electrotherapy, one of the mechanisms to achieve this purpose is to apply a low density electric current to the nervous system. In this study, a computer simulation of a treatment is provided in which the stimulation current is computed from nerve fiber cell membrane potential feedback, so that the level of the current is adjusted automatically instead of manually. The behavior of the nerve cell is represented by the Hodgkin-Huxley (HH) model, which is slightly modified into a linear model with state dependent coefficients. Due to this modification, the algebraic and differential Riccati equations can be applied, which allows an optimal controller minimizing a quadratic performance index given by the user. Using a controlled current injection can decrease unnecessarily long current injection times that may be harmful to the neuronal network. This study introduces a prototype for a possible future application to a network of neurons as it is more realistic than a single neuron.
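The principle at work, feeding the membrane potential back to shape the injected current, can be shown with a far simpler neuron than the paper's modified HH model. Below, a leaky integrate-and-fire cell with a hypothetical proportional feedback gain (not the Riccati-based optimal controller of the study; all parameters are invented) suppresses repetitive firing:

```python
def simulate_lif(i_ext, k_fb, t_ms=200.0, dt=0.05):
    """Leaky integrate-and-fire neuron with feedback current u = -k_fb * (V - V_rest)."""
    v, v_rest, v_th, v_reset, tau = 0.0, 0.0, 15.0, 0.0, 10.0
    spikes = 0
    for _ in range(int(t_ms / dt)):
        u = -k_fb * (v - v_rest)                    # state-feedback stimulation current
        v += dt * (-(v - v_rest) + i_ext + u) / tau
        if v >= v_th:                               # threshold crossing = spike
            spikes += 1
            v = v_reset
    return spikes

open_loop = simulate_lif(i_ext=20.0, k_fb=0.0)      # tonic repetitive firing
closed_loop = simulate_lif(i_ext=20.0, k_fb=2.0)    # feedback holds V below threshold
```

With the gain at 2, the effective steady-state potential drops below threshold and firing ceases, illustrating (in caricature) why closed-loop current adjustment can end firing with less injected charge than open-loop stimulation.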
Neurosurgical simulation and navigation with three-dimensional computer graphics.
Hayashi, N; Endo, S; Shibata, T; Ikeda, H; Takaku, A
1999-01-01
We developed a pre-operative simulation and intra-operative navigation system with three-dimensional computer graphics (3D-CG). Because the 3D-CG created by the present system enables visualization of lesions via semitransparent imaging of the scalp surface and brain, the expected operative field could be visualized on the computer display pre-operatively. We used two different configurative navigators. One is assembled by an arciform arm and a laser pointer. The arciform arm consists of 3 joints mounted with rotary encoders forming an iso-center system. The distal end of the arm has a laser pointer, which has a CCD for measurement of the distance between the outlet of the laser beam, and the position illuminated by the laser pointer. Using this navigator, surgeons could accurately estimate the trajectory to the target lesion, and the boundaries of the lesion. Because the other navigator has six degrees of freedom and an interchangeable probe shaped like a bayonet on its tip, it can be used in deep structures through narrow openings. Our system proved efficient and yielded an unobstructed view of deep structures during microscopic neurosurgical procedures.
Event Based Simulator for Parallel Computing over the Wide Area Network for Real Time Visualization
Sundararajan, Elankovan; Harwood, Aaron; Kotagiri, Ramamohanarao; Satria Prabuwono, Anton
As the computational requirements of applications in computational science continue to grow tremendously, the use of computational resources distributed across the Wide Area Network (WAN) becomes advantageous. However, not all applications can be executed over the WAN due to communication overhead that can drastically slow down the computation. In this paper, we introduce an event based simulator to investigate the performance of parallel algorithms executed over the WAN. The event based simulator, known as SIMPAR (SIMulator for PARallel computation), simulates the actual computations and communications involved in parallel computation over the WAN using time stamps. Real time visualization applications require a steady stream of processed data. Hence, SIMPAR may prove to be a valuable tool for investigating the types of applications and computing resource requirements that provide an uninterrupted flow of processed data for real time visualization. The results obtained from the simulation agree with the expected performance under the L-BSP model.
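The core of any such event based simulator is a time-stamped event queue processed in order. A toy discrete-event kernel (this is not SIMPAR's code; all names and timings are illustrative) modeling a local compute step, a WAN message delay, and a remote compute step:

```python
import heapq

def run(events):
    """Process (time, name, duration, follow_on) events in timestamp order."""
    queue = list(events)
    heapq.heapify(queue)
    clock = 0.0
    log = []
    while queue:
        t, name, dur, follow_on = heapq.heappop(queue)
        clock = max(clock, t) + dur           # advance the simulated clock
        log.append((name, clock))
        if follow_on is not None:             # schedule the dependent event
            heapq.heappush(queue, (clock, *follow_on))
    return clock, log

# local compute (5 s) -> WAN transfer (2 s) -> remote compute (3 s)
makespan, trace = run([(0.0, "compute", 5.0, ("send", 2.0, ("remote", 3.0, None)))])
```

Because communication appears as an explicit duration, the simulated makespan directly exposes how WAN latency stretches an otherwise fast parallel computation.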
Computer simulation studies of the rheology of soft condensed matter
International Nuclear Information System (INIS)
Daivis, P.J.; Snook, I.K.; Matin, M.L.; Kairn, T.; McPhie, M.
2004-01-01
Full text: The rheology of soft condensed matter systems, such as polymer melts, polymer solutions and colloidal dispersions, is a subject of enduring interest - not only because of its importance in materials processing technology, but also because of the fascinating theoretical challenges it presents. Many of the rheological features possessed by these systems, such as normal stress differences, non-Newtonian viscosity and elasticity, are spectacularly evident on the macroscopic scale, but these properties are also crucial to emerging modern technologies such as micro- and nano-fluidics. Over the last seven years, we have studied many different aspects of the rheology of soft condensed matter systems using non-equilibrium molecular dynamics computer simulation techniques. Of particular importance has been our development of a new algorithm for studying elongational flow, a comparison of the planar elongational and shear flow rheology of molecular fluids, our examination of the approach to the Brownian limit in colloidal fluids, and our detailed investigation of the concentration dependence of the viscosity and normal stress differences in short-chain polymer solutions. In this paper, we review the results of these investigations, discuss the current capabilities and limitations of non-equilibrium molecular dynamics simulations, and discuss our current work and future directions
[Simulation of lung lobe resection with personal computer].
Onuki, T; Murasugi, M; Mae, M; Koyama, K; Ikeda, T; Shimizu, T
2005-09-01
Various patterns of branching are seen for pulmonary arteries and veins in the lung hilum. However, thoracic surgeons usually cannot expect to discern much anatomical detail preoperatively. If the surgeon can gain an understanding of individual patterns preoperatively, the risks inherent in exposing the pulmonary vessels in the hilum can be avoided, reducing invasiveness. This software meets the growing needs of surgeons performing video-assisted thoracoscopic surgery (VATS), which favors more limited dissection of the hilar vessels and bronchus. We have produced free application software with which we can mark the pulmonary arteries, veins, bronchus and tumor on successive computed tomography (CT) images. After receiving a compact disk containing 60 images of 2 mm CT slices, from tumor to hilum, in DICOM format, we required only 1 hour to obtain 3-dimensional images for a patient with other free software (Metasequoia LE). Furthermore, with Metasequoia LE, we can simulate cutting the vessels and change their shape 3-dimensionally. Although the picture image leaves much room for improvement, we believe the tool is very attractive for residents because they can simulate operations.
Adolescent girls' energy expenditure during dance simulation active computer gaming.
Fawkner, Samantha G; Niven, Alisa; Thin, Alasdair G; Macdonald, Mhairi J; Oakes, Jemma R
2010-01-01
The objective of this study was to determine the energy expended and intensity of physical activity achieved by adolescent girls while playing on a dance simulation game. Twenty adolescent girls were recruited from a local secondary school. Resting oxygen uptake (VO2) and heart rate were analysed while sitting quietly and subsequently during approximately 30 min of game play, with 10 min at each of three increasing levels of difficulty. Energy expenditure was predicted from VO2 at rest and during game play at three levels of play, from which the metabolic equivalents (METs) of game playing were derived. Mean ± standard deviation energy expenditure for levels 1, 2, and 3 was 3.63 ± 0.58, 3.65 ± 0.54, and 4.14 ± 0.71 kcal·min⁻¹ respectively, while mean activity for each level of play was at least of moderate intensity (>3 METs). Dance simulation active computer games provide an opportunity for most adolescent girls to exercise at moderate intensity. Therefore, regular playing might contribute to daily physical activity recommendations for good health in this at-risk population.
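The MET values above follow the usual convention that 1 MET corresponds to a VO2 of 3.5 ml O2 per kg per min, and energy expenditure is derived from oxygen uptake at roughly 5 kcal per litre of O2 consumed. A small calculation using those standard conversions with made-up numbers (not the study's participant data):

```python
def mets(vo2_ml_kg_min):
    """Metabolic equivalents from relative VO2 (1 MET = 3.5 ml O2/kg/min)."""
    return vo2_ml_kg_min / 3.5

def kcal_per_min(vo2_l_min, kcal_per_litre_o2=5.0):
    """Approximate energy expenditure from absolute oxygen uptake."""
    return vo2_l_min * kcal_per_litre_o2

m = mets(12.25)         # hypothetical game-play relative VO2 -> 3.5 METs
ee = kcal_per_min(0.8)  # hypothetical absolute VO2 of 0.8 L/min -> 4 kcal/min
```

Any relative VO2 above about 10.5 ml/kg/min therefore clears the >3 MET moderate-intensity threshold used in the study.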
A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION
Energy Technology Data Exchange (ETDEWEB)
Mike Bockelie; Dave Swensen; Martin Denison; Zumao Chen; Mike Maguire; Adel Sarofim; Changguan Yang; Hong-Shig Shim
2004-01-28
This is the thirteenth Quarterly Technical Report for DOE Cooperative Agreement No: DE-FC26-00NT41047. The goal of the project is to develop and demonstrate a Virtual Engineering-based framework for simulating the performance of Advanced Power Systems. Within the last quarter, good progress has been made on all aspects of the project. Software development efforts have focused on a preliminary detailed software design for the enhanced framework. Given the complexity of the individual software tools from each team (i.e., Reaction Engineering International, Carnegie Mellon University, Iowa State University), a robust, extensible design is required for the success of the project. In addition to achieving a preliminary software design, significant progress has been made on several development tasks for the program. These include: (1) the enhancement of the controller user interface to support detachment from the Computational Engine and support for multiple computer platforms, (2) modification of the Iowa State University interface-to-kernel communication mechanisms to meet the requirements of the new software design, (3) decoupling of the Carnegie Mellon University computational models from their parent IECM (Integrated Environmental Control Model) user interface for integration with the new framework and (4) development of a new CORBA-based model interfacing specification. A benchmarking exercise to compare process and CFD based models for entrained flow gasifiers was completed. A summary of our work on intrinsic kinetics for modeling coal gasification has been completed. Plans for implementing soot and tar models into our entrained flow gasifier models are outlined. Plans for implementing a model for mercury capture based on conventional capture technology, but applied to an IGCC system, are outlined.
Computer simulation of cascade damage in iron: PKA mass effects
International Nuclear Information System (INIS)
Calder, A.; Bacon, D.J.; Barashev, A.; Osetsky, Y.
2007-01-01
Full text of publication follows: Results are presented from an extensive series of computer simulations of the damage created by displacement cascades in alpha-iron. The objective has been to determine for the first time the effect of the mass of the primary knock-on atom (PKA) on defect number, defect clustering and cluster morphology. Cascades with PKA energy in the range 5 to 20 keV have been simulated by molecular dynamics for temperatures up to 600 K using an interatomic potential for iron for which the energy difference between the dumbbell interstitial and the crowdion is close to the value from ab initio calculation (Ackland et al., J. Phys.: Condens. Matter 2004). At least 30 cascades have been simulated for each condition in order to generate reasonable statistics. The influence of PKA species on damage has been investigated in two ways. In one, the PKA atom was treated as an Fe atom as far as its interaction with other atoms was concerned, but its atomic weight (in amu) was either 12 (C), 56 (Fe) or 209 (Bi). Pairs of Bi PKAs have also been used to mimic heavy molecular ion irradiation. In the other approach, the short-range pair part of the interatomic potential was changed from Fe-Fe to that for Bi-Fe, either with or without a change of PKA mass, in order to study the influence of high-energy collisions on the cascade outcome. It is found that PKA mass is more influential than the interatomic potential between the PKA and Fe atoms. At low cascade energy (5-10 keV), increasing PKA mass leads to a decrease in the number of interstitials and vacancies. At high energy (20 keV), the main effect of increasing mass is to increase the probability of creation of interstitial and vacancy clusters in the form of 1/2⟨111⟩ and ⟨100⟩ dislocation loops. The simulation results are consistent with experimental TEM observations of damage in irradiated iron. (authors)
2016-04-01
ARL-TR-7660 ● APR 2016 ● US Army Research Laboratory. Computational Fluid Dynamics (CFD) Simulations of a Finned Projectile with Microflaps for Flow Control, by Jubaraj Sahu (Weapons and Materials Research).
Computational Fluid Dynamics Simulation of Dual Bell Nozzle Film Cooling
Braman, Kalen; Garcia, Christian; Ruf, Joseph; Bui, Trong
2015-01-01
Marshall Space Flight Center (MSFC) and Armstrong Flight Research Center (AFRC) are working together to advance the technology readiness level (TRL) of the dual bell nozzle concept. Dual bell nozzles are a form of altitude compensating nozzle that consists of two connecting bell contours. At low altitude the nozzle flows fully in the first, relatively lower area ratio, nozzle. The nozzle flow separates from the wall at the inflection point which joins the two bell contours. This relatively low expansion results in higher nozzle efficiency during the low altitude portion of the launch. As ambient pressure decreases with increasing altitude, the nozzle flow will expand to fill the relatively large area ratio second nozzle. The larger area ratio of the second bell enables higher Isp during the high altitude and vacuum portions of the launch. Despite a long history of theoretical consideration and promise towards improving rocket performance, dual bell nozzles have yet to be developed for practical use and have seen only limited testing. One barrier to use of dual bell nozzles is the lack of control over the nozzle flow transition from the first bell to the second bell during operation. A method that this team is pursuing to enhance the controllability of the nozzle flow transition is manipulation of the film coolant that is injected near the inflection between the two bell contours. Computational fluid dynamics (CFD) analysis is being run to assess the degree of control over nozzle flow transition generated via manipulation of the film injection. A cold flow dual bell nozzle, without film coolant, was tested over a range of simulated altitudes in 2004 in MSFC's nozzle test facility. Both NASA centers have performed a series of simulations of that dual bell to validate their computational models. Those CFD results are compared to the experimental results within this paper. MSFC then proceeded to add film injection to the CFD grid of the dual bell nozzle. A series of
Computational model for simulation small testing launcher, technical solution
Chelaru, Teodor-Viorel; Cristian, Barbu; Chelaru, Adrian
2014-12-01
The purpose of this paper is to present some aspects of the computational model and technical solutions for a multistage suborbital launcher for testing (SLT) used to test spatial equipment and scientific measurements. The computational model consists of a numerical simulation of SLT evolution for different start conditions. The launcher model presented has six degrees of freedom (6DOF) and variable mass. The results analysed are the flight parameters and ballistic performance. The discussion focuses on the technical possibility of realizing a small multi-stage launcher by recycling military rocket motors. From a technical point of view, the paper is focused on the national project "Suborbital Launcher for Testing" (SLT), which is based on hybrid propulsion and control systems obtained through an original design. Therefore, while classical suborbital sounding rockets are unguided, use solid-fuel motors for propulsion and follow an uncontrolled ballistic flight, the SLT project introduces a different approach by proposing the creation of a guided suborbital launcher, which is basically a satellite launcher at a smaller scale, containing its main subsystems. This is why the project itself can be considered an intermediary step in the development of a wider range of launching systems based on hybrid propulsion technology, which may have a major impact on future European launcher programs. The SLT project, as shown in the title, has two major objectives: first, a short term objective, which consists in obtaining a suborbital launching system able to go into service in a predictable period of time, and a long term objective that consists in the development and testing of some unconventional sub-systems which will be integrated later in the satellite launcher as a part of the European space program. This is why the technical content of the project must be carried out beyond the range of the existing suborbital vehicle
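While the paper's model is a full 6DOF, variable-mass simulation, the flavour of the trajectory computation can be conveyed by a drastically simplified vertical point-mass sketch (constant thrust and mass, no drag or staging; every number below is invented, not taken from the SLT project):

```python
def trajectory(thrust=30.0, mass=1.0, burn_time=10.0, g=9.81, dt=0.01):
    """Vertical point-mass flight: powered ascent, then ballistic coast to impact."""
    h, v, t, apogee = 0.0, 0.0, 0.0, 0.0
    while t == 0.0 or h > 0.0:
        a = (thrust / mass if t < burn_time else 0.0) - g   # thrust phase vs coast
        v += a * dt
        h += v * dt
        apogee = max(apogee, h)
        t += dt
    return apogee, t

apogee, flight_time = trajectory()   # apogee and total time to impact
```

The real model adds rotational dynamics, mass depletion, aerodynamics and guidance, but the time-stepped integration of the state from launch conditions is the same skeleton.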
Simulation-Based Inquiry Learning and Computer Modeling: Pitfalls and Potentials
Mulder, Y.G.; Lazonder, Adrianus W.; de Jong, Anthonius J.M.
2015-01-01
Background. Inquiry learning environments increasingly incorporate simulation and modeling facilities. Students acquire knowledge through systematic experimentation with the simulations and express that knowledge in runnable computer models. Aim. As inquiry and modeling activities are new and
Computer simulation of randomly cross-linked polymer networks
International Nuclear Information System (INIS)
Williams, Timothy Philip
2002-01-01
In this work, Monte Carlo and Stochastic Dynamics computer simulations of mesoscale model randomly cross-linked networks were undertaken. Task parallel implementations of the lattice Monte Carlo Bond Fluctuation model and the Kremer-Grest Stochastic Dynamics bead-spring continuum model were designed and used for this purpose. Lattice and continuum precursor melt systems were prepared and then cross-linked to varying degrees. The resultant networks were used to study structural changes during deformation and relaxation dynamics. The effects of a random network topology featuring a polydisperse distribution of strand lengths and an abundance of pendant chain ends were qualitatively compared to recent published work. A preliminary investigation into the effects of temperature on the structural and dynamical properties was also undertaken. Structural changes during isotropic swelling and uniaxial deformation revealed a pronounced non-affine deformation dependent on the degree of cross-linking. Fractal heterogeneities were observed in the swollen model networks and were analysed by considering constituent substructures of varying size. The network connectivity determined the length scales at which the majority of the substructure unfolding process occurred. Simulated stress-strain curves and diffraction patterns for uniaxially deformed swollen networks were found to be consistent with experimental findings. Analysis of the relaxation dynamics of various network components revealed a dramatic slowdown due to the network connectivity. The cross-link junction spatial fluctuations for networks close to the sol-gel threshold were observed to be at least comparable with the phantom network prediction. The dangling chain ends were found to display the largest characteristic relaxation time. (author)
International Nuclear Information System (INIS)
2003-03-01
A joint meeting of the 6th Simulation Science Symposium and the NIFS collaborative research project 'Large Scale Computer Simulation' was held on December 12-13, 2002 at the National Institute for Fusion Science, with the aim of promoting interdisciplinary collaboration in various fields of computer simulation. The meeting, attended by more than 40 people, consisted of 11 invited and 22 contributed papers, whose topics extended not only to fusion science but also to related fields such as astrophysics, earth science, fluid dynamics, molecular dynamics, computer science, etc. (author)
Valasek, Lukas; Glasa, Jan
2017-12-01
Current fire simulation systems are capable of exploiting the advantages of available high-performance computing (HPC) platforms and of modelling fires efficiently in parallel. In this paper, the efficiency of a corridor fire simulation on an HPC computer cluster is discussed. The parallel MPI version of the Fire Dynamics Simulator is used to test the efficiency of selected strategies for allocating the cluster's computational resources when a larger number of computational cores is used. Simulation results indicate that if the number of cores used is not a multiple of the total number of cores per cluster node, there are allocation strategies that provide more efficient calculations.
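The allocation observation above (core counts that are not a multiple of a node's core count may need an explicit placement strategy) can be sketched as a small heuristic. The function name, messages, and the "explicit placement" recommendation are illustrative assumptions, not taken from the paper:

```python
def allocation_efficiency_hint(requested_cores: int, cores_per_node: int) -> str:
    """Heuristic inspired by the abstract: allocations that are not a
    multiple of the node core count tended to benefit from an explicit
    placement strategy. All wording here is illustrative."""
    nodes_needed = -(-requested_cores // cores_per_node)  # ceiling division
    remainder = requested_cores % cores_per_node
    if remainder == 0:
        return f"{nodes_needed} full nodes; default block allocation is fine"
    return (f"{nodes_needed} nodes with {remainder} cores on the last node; "
            f"consider an explicit MPI rank placement strategy")
```

For example, 64 cores on 16-core nodes fills 4 nodes exactly, while 70 cores leaves 6 cores on a fifth node, the situation the paper's allocation strategies address.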
International Nuclear Information System (INIS)
Cha, K. H.; Kweon, K. C.
2001-01-01
A feasibility study in which standard PC hardware and Real-Time Linux are applied to the real-time computer simulation of software for a nuclear simulator is presented in this paper. The feasibility prototype was established with the existing software in the Compact Nuclear Simulator (CNS). Through the real-time implementation in the feasibility prototype, we have identified that the approach can enable computer-based predictive simulation, owing both to the remarkable improvement in real-time performance and to the reduced effort required for real-time implementation under standard PC hardware and Real-Time Linux environments
Computer simulations for biological aging and sexual reproduction
Directory of Open Access Journals (Sweden)
DIETRICH STAUFFER
2001-03-01
Full Text Available The sexual version of the Penna model of biological aging, simulated since 1996, is compared here with alternative forms of reproduction as well as with models not involving aging. In particular, we want to check how sexual forms of life could have evolved and won over earlier asexual forms hundreds of millions of years ago. This computer model is based on the mutation-accumulation theory of aging, using bit-strings to represent the genome. Its population dynamics is studied by Monte Carlo methods.
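The asexual core of the Penna bit-string model described above can be sketched in a few lines; the sexual variant studied in the paper adds diploid genomes and recombination, which are not shown. All parameters (bit-string length, mutation threshold, reproduction age, Verhulst capacity) are illustrative, not the paper's values:

```python
import random

# Minimal asexual Penna-model sketch. Each genome is a bit-string (int);
# bit a set means a deleterious mutation expressed from age a onward.
GENOME_BITS, T_LIMIT, REPRO_AGE, MAX_POP = 32, 3, 8, 1000

def step(population):
    """One Monte Carlo time step: aging, selection, Verhulst crowding, birth."""
    survivors, newborns = [], []
    verhulst = 1.0 - len(population) / MAX_POP  # logistic crowding factor
    for genome, age in population:
        age += 1
        if age >= GENOME_BITS:
            continue  # died of old age
        # count deleterious mutations expressed up to the current age
        active = bin(genome & ((1 << age) - 1)).count("1")
        if active >= T_LIMIT or random.random() > verhulst:
            continue  # death by mutation load or crowding
        survivors.append((genome, age))
        if age >= REPRO_AGE:  # one offspring carrying one new random mutation
            newborns.append((genome | (1 << random.randrange(GENOME_BITS)), 0))
    return survivors + newborns

population = [(0, 0) for _ in range(200)]
for _ in range(50):
    population = step(population)
```

Population-level quantities such as survival rates per age group are then averaged over many such Monte Carlo steps.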
Predictive Capability Maturity Model for computational modeling and simulation.
Energy Technology Data Exchange (ETDEWEB)
Oberkampf, William Louis; Trucano, Timothy Guy; Pilch, Martin M.
2007-10-01
The Predictive Capability Maturity Model (PCMM) is a new model that can be used to assess the level of maturity of computational modeling and simulation (M&S) efforts. The development of the model is based both on the authors' experience and on their analysis of similar investigations in the past. The perspective taken in this report is one of judging the usefulness of a predictive capability that relies on the numerical solution of partial differential equations to better inform and improve decision making. The review of past investigations, such as the Software Engineering Institute's Capability Maturity Model Integration and the National Aeronautics and Space Administration and Department of Defense Technology Readiness Levels, indicates that a more restricted, more interpretable method is needed to assess the maturity of an M&S effort. The PCMM addresses six contributing elements of M&S: (1) representation and geometric fidelity, (2) physics and material model fidelity, (3) code verification, (4) solution verification, (5) model validation, and (6) uncertainty quantification and sensitivity analysis. For each of these elements, attributes are identified that characterize four increasing levels of maturity. Importantly, the PCMM is a structured method for assessing the maturity of an M&S effort that is directed toward an engineering application of interest. The PCMM does not assess whether the M&S effort, the accuracy of the predictions, or the performance of the engineering system satisfies specified application requirements.
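A minimal scorecard over the six PCMM elements might look as follows. The element names come from the report, but the aggregation rule (reporting the weakest element as the limiting factor) is an illustrative assumption for this sketch, not part of the PCMM itself:

```python
# Illustrative PCMM-style scorecard: six elements, each assessed at a
# maturity level 0-3. Only the element names are from the report.
PCMM_ELEMENTS = [
    "representation and geometric fidelity",
    "physics and material model fidelity",
    "code verification",
    "solution verification",
    "model validation",
    "uncertainty quantification and sensitivity analysis",
]

def pcmm_summary(levels: dict) -> tuple:
    """Return (lowest maturity level, name of the weakest element)."""
    scores = [levels.get(e, 0) for e in PCMM_ELEMENTS]
    weakest = PCMM_ELEMENTS[scores.index(min(scores))]
    return min(scores), weakest

assessment = {e: 2 for e in PCMM_ELEMENTS}
assessment["model validation"] = 1
# pcmm_summary(assessment) → (1, 'model validation')
```

The point of such a summary is that a single weak element (here, model validation) limits confidence in the predictions regardless of how mature the other elements are.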
Experimental validation of a computer simulation of radiographic film
Energy Technology Data Exchange (ETDEWEB)
Goncalves, Elicardo A. de S., E-mail: elicardo.goncalves@ifrj.edu.br [Instituto Federal do Rio de Janeiro (IFRJ), Paracambi, RJ (Brazil). Laboratorio de Instrumentacao e Simulacao Computacional Cientificas Aplicadas; Azeredo, Raphaela, E-mail: raphaelaazeredo@yahoo.com.br [Universidade do Estado do Rio de Janeiro (UERJ), Rio de Janeiro, RJ (Brazil). Instituto de Fisica Armando Dias Tavares. Programa de Pos-Graduacao em Fisica; Assis, Joaquim T., E-mail: joaquim@iprj.uerj.br [Universidade do Estado do Rio de Janeiro (UERJ), Nova Friburgo, RJ (Brazil). Instituto Politecnico; Anjos, Marcelino J. dos; Oliveira, Davi F.; Oliveira, Luis F. de, E-mail: marcelin@uerj.br, E-mail: davi.oliveira@uerj.br, E-mail: lfolive@uerj.br [Universidade do Estado do Rio de Janeiro (UERJ), Rio de Janeiro, RJ (Brazil). Instituto de Fisica Armando Dias Tavares. Departamento de Fisica Aplicada e Termodinamica
2015-07-01
In radiographic films, the behavior of the characteristic curve is very important for image quality. Digitization and visualization are always performed by light transmission, and the characteristic curve is known as the behavior of optical density as a function of exposure. In a first approach, a Monte Carlo computer simulation attempting to build a Hurter-Driffield (H and D) curve from a stochastic model reproduced the known shape, but some behaviors, like the influence of silver grain size, were not as expected. A real H and D curve was built by exposing films, developing them and measuring the optical density. When comparing the model results with the real curve, trying to fit them and estimate some parameters, a difference in the high-exposure region shows a divergence between the models and the experimental data. Since the optical density is a function of the metallic silver generated by chemical development, direct proportionality was assumed, but the results suggest a limitation in this proportionality. In fact, when the optical density was replaced by another way to measure silver concentration, such as X-ray fluorescence, the new results agree with the models. Therefore, overexposed films can contain areas with different silver concentrations that cannot be distinguished because the optical density measurement is limited. Mapping the silver concentration over the film area can be a solution to reveal these dark images, and X-ray fluorescence has been shown to be the best way to perform this new form of film digitization. (author)
Stepping out: a computer simulation of hominid dispersal from Africa.
Mithen, Steven; Reed, Melissa
2002-10-01
A succession of new discoveries and the recent application of new dating methods provide strong evidence that Eurasia was colonized soon after 2.0 m.y.a. In light of this new evidence, many scenarios have been proposed regarding the influence of glacial/interglacial cycles on hominid dispersal, the role of mountain chains and deserts as barriers, the significance of land bridges and the possibility of sea-crossings. Such factors have been proposed to explain the apparent early arrival of hominids in East Asia and relatively late arrival in Europe, although the evidence in both regions remains open to various interpretations. While it is relatively easy to propose environmental factors that may have influenced dispersal patterns, it is more difficult to evaluate such proposals and to establish what the combined impact of several factors might have been. Moreover, the role of historical contingency in creating the observed pattern of dispersal has yet to be considered. This paper describes a computer simulation model of hominid dispersal which seeks to provide a means to evaluate environmental and ecological factors in hominid dispersal. It creates probability distributions for arrival dates at six key localities and compares these with current estimates from the archaeological and fossil records. It uses these to support some of the current arguments about dispersal and to challenge others.
Airflow Patterns In Nuclear Workplace - Computer Simulation And Qualitative Tests
International Nuclear Information System (INIS)
Haim, M.; Szanto, M.; Weiss, Y.; Kravchick, T.; Levinson, S.; German, U.
1999-01-01
Concentration of airborne radioactive materials inside a room can vary widely from one location to another, sometimes by orders of magnitude even for locations that are relatively close. Inappropriately placed samplers can give misleading results and, therefore, the location of air samplers is important. Proper placement of samplers cannot be determined simply by observing the position of room air supply and exhaust vents; airflow studies, such as the release of smoke aerosols, should be used. The significance of airflow pattern studies depends on the purpose of sampling - estimating worker intakes, warning of high concentrations, defining airborne radioactive areas, testing for confinement of sealed radioactive materials, etc. When sampling air in rooms with complex airflow patterns, it may be useful to conduct qualitative airflow studies with smoke tubes, smoke candles or isostatic bubbles. The U.S. Nuclear Regulatory Commission Regulatory Guide 8.25 [1] suggests that an airflow study should be conducted after any change in the work area, including changes in the setup of work areas, ventilation system changes, etc. The present work presents a study of airflow patterns conducted in a typical room using two methods: a computer simulation and a qualitative test using a smoke tube
Molecular mechanism of myoglobin autoxidation: insights from computer simulations.
Arcon, J P; Rosi, P; Petruk, A A; Marti, M A; Estrin, D A
2015-02-05
Myoglobin (Mb) and hemoglobin have the biological ability to carry/store oxygen (O2), a property which requires the heme iron atom to be in the ferrous, Fe(II), state. However, the thermodynamically stable state in the presence of O2 is Fe(III), and thus the oxidation rate of a globin is a critical parameter related to its function. Mb has been extensively studied and many mutants have been characterized regarding their oxygen-mediated oxidation (i.e., autoxidation) rates. Site-directed mutants in residues 29 (B10), which shapes the distal cavity, and 64 (E7), the well-known histidine gate, have been shown to display a wide range of autoxidation rate constants. In this work, we have thoroughly studied the mechanism underlying the autoxidation process by means of state-of-the-art computer simulation methodologies, using Mb and site-directed mutants as benchmark cases. Our results explain the observed autoxidation rate tendencies in different variants of Mb; in L29F, bonds protect the oxy complex from autoxidation.
Computational Fluid Dynamics Simulation of Fluidized Bed Polymerization Reactors
Energy Technology Data Exchange (ETDEWEB)
Fan, Rong [Iowa State Univ., Ames, IA (United States)
2006-01-01
Fluidized bed (FB) reactors are widely used in the polymerization industry due to their superior heat- and mass-transfer characteristics. Nevertheless, problems associated with local overheating of polymer particles and excessive agglomeration leading to defluidization of FB reactors still persist and limit the range of operating temperatures that can be safely achieved in plant-scale reactors. Many researchers have worked on the modeling of FB polymerization reactors, and quite a few models are available in the open literature, such as the well-mixed model developed by McAuley, Talbot, and Harris (1994), the constant bubble size model (Choi and Ray, 1985) and the heterogeneous three-phase model (Fernandes and Lona, 2002). Most of these works focus on kinetic aspects, but from an industrial viewpoint, the behavior of FB reactors should be modeled by considering the particle and fluid dynamics in the reactor. Computational fluid dynamics (CFD) is a powerful tool for understanding the effect of fluid dynamics on chemical reactor performance. For single-phase flows, CFD models for turbulent reacting flows are now well understood and routinely applied to investigate complex flows with detailed chemistry. For multiphase flows, the state of the art in CFD models is changing rapidly and it is now possible to predict reasonably well the flow characteristics of gas-solid FB reactors with mono-dispersed, non-cohesive solids. This thesis is organized into seven chapters. In Chapter 2, an overview of fluidized bed polymerization reactors is given, and a simplified two-site kinetic mechanism is discussed. Some basic theories used in our work are given in detail in Chapter 3. First, the governing equations and other constitutive equations for the multi-fluid model are summarized, and the kinetic theory for describing the solid stress tensor is discussed. The detailed derivation of DQMOM for the population balance equation is given as the second section. In this section
Seventeenth Workshop on Computer Simulation Studies in Condensed-Matter Physics
Landau, David P; Schütler, Heinz-Bernd; Computer Simulation Studies in Condensed-Matter Physics XVI
2006-01-01
This status report features the most recent developments in the field, spanning a wide range of topical areas in the computer simulation of condensed matter/materials physics. Both established and new topics are included, ranging from the statistical mechanics of classical magnetic spin models to electronic structure calculations, quantum simulations, and simulations of soft condensed matter. The book presents new physical results as well as novel methods of simulation and data analysis. Highlights of this volume include various aspects of non-equilibrium statistical mechanics, studies of properties of real materials using both classical model simulations and electronic structure calculations, and the use of computer simulations in teaching.
Computer simulation of confined and flexoelectric liquid crystalline systems
International Nuclear Information System (INIS)
Barmes, F.
2003-01-01
In this Thesis, confined and flexoelectric liquid crystal systems have been studied using molecular computer simulations. The aim of this work was to provide a molecular model of a bistable display cell in which switching is induced through the application of directional electric field pulses. In the first part of this Thesis, the study of confined systems of liquid crystalline particles is addressed. Computation of the anchoring phase diagrams for three different surface interaction models showed that the hard needle wall and rod-surface potentials induce both planar and homeotropic alignment separated by a bistability region, this being stronger and wider for the rod-surface variant. The results obtained using the rod-sphere surface model, in contrast, showed that tilted surface arrangements can be induced by surface absorption mechanisms. Equivalent studies of hybrid anchored systems showed that a bend director structure can be obtained in a slab with monostable homeotropic anchoring at the top surface and bistable anchoring at the bottom, provided that the slab height is sufficiently large and the top homeotropic anchoring is not too strong. In the second part of the Thesis, the development of models for tapered (pear-shaped) mesogens is addressed. The first model considered, the truncated Stone expansion model, proved to be unsuccessful in that it did not display liquid crystalline phases. This drawback was then overcome using the alternative parametric hard Gaussian overlap model, which was found to display a much richer phase behaviour. With a molecular elongation k = 5, both nematic and interdigitated smectic A2 phases were obtained. In the final part of this Thesis, the knowledge acquired from the two previous studies was united in an attempt to model a bistable display cell. Switching between the hybrid aligned nematic and vertical states of the cell was successfully performed using pear-shaped particles with both dielectric and
Monte Carlo computer simulation of sedimentation of charged hard spherocylinders
Energy Technology Data Exchange (ETDEWEB)
Viveros-Méndez, P. X., E-mail: xviveros@fisica.uaz.edu.mx; Aranda-Espinoza, S. [Unidad Académica de Física, Universidad Autónoma de Zacatecas, Calzada Solidaridad esq. Paseo, La Bufa s/n, 98060 Zacatecas, Zacatecas, México (Mexico); Gil-Villegas, Alejandro [Departamento de Ingeniería Física, División de Ciencias e Ingenierías, Campus León, Universidad de Guanajuato, Loma del Bosque 103, Lomas del Campestre, 37150 León, Guanajuato, México (Mexico)
2014-07-28
In this article we present an NVT Monte Carlo computer simulation study of the sedimentation of an electroneutral mixture of oppositely charged hard spherocylinders (CHSC) with aspect ratio L/σ = 5, where L and σ are the length and diameter of the cylinder and hemispherical caps, respectively, for each particle. This system is an extension of the restricted primitive model for spherical particles, where L/σ = 0, and it is assumed that the ions are immersed in a structureless solvent, i.e., a continuum with dielectric constant D. The system consisted of N = 2000 particles and the Wolf method was implemented to handle the coulombic interactions of the inhomogeneous system. Results are presented for different values of the strength ratio between the gravitational and electrostatic interactions, Γ = (mgσ)/(e²/Dσ), where m is the mass per particle, e is the electron's charge and g is the gravitational acceleration. A semi-infinite simulation cell was used with dimensions Lx ≈ Ly and Lz = 5Lx, where Lx, Ly, and Lz are the box dimensions in Cartesian coordinates, and the gravitational force acts along the z-direction. Sedimentation effects were studied by looking at every layer formed by the CHSC along the gravitational field. With increasing Γ, particles tend to become more packed in each layer and to arrange in local domains with orientational ordering along two perpendicular axes, a feature not observed in the uncharged system with the same hard-body geometry. This type of arrangement, known as a tetratic phase, has been observed in two-dimensional systems of hard rectangles and rounded hard squares. In this way, the coupling of gravitational and electric interactions in the CHSC system induces the arrangement of particles in layers, with the formation of quasi-two-dimensional tetratic phases near the surface.
Duality quantum computer and the efficient quantum simulations
Wei, Shi-Jie; Long, Gui-Lu
2015-01-01
In this paper, we first briefly review the duality quantum computer. Distinctly, the generalized quantum gates, the basic evolution operators in a duality quantum computer, are no longer unitary; they can be expressed as linear combinations of unitary operators. All linear bounded operators can be realized in a duality quantum computer, and unitary operators are just the extreme points of the set of generalized quantum gates. A d-slits duality quantum computer can be realized in...
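The central idea, that a generalized gate is a linear combination of unitaries and in general not itself unitary, can be illustrated with a one-qubit toy example in plain Python (no quantum library assumed; the 2x2 matrix helpers below are written out for self-containment):

```python
# Sketch of a "generalized quantum gate" as a linear combination of
# unitaries. I and X (Pauli-X) are both unitary; their average is not.
I = [[1.0, 0.0], [0.0, 1.0]]
X = [[0.0, 1.0], [1.0, 0.0]]

def combine(c0, U0, c1, U1):
    """c0*U0 + c1*U1: a bounded operator, but in general no longer unitary."""
    return [[c0 * U0[i][j] + c1 * U1[i][j] for j in range(2)] for i in range(2)]

def apply(M, state):
    """Matrix-vector product of a 2x2 operator with a 1-qubit amplitude vector."""
    return [sum(M[i][j] * state[j] for j in range(2)) for i in range(2)]

G = combine(0.5, I, 0.5, X)   # 0.5*I + 0.5*X
out = apply(G, [1.0, 0.0])    # [0.5, 0.5]: the norm is not preserved
```

Because the output norm shrinks, such operators are realized only probabilistically on a physical device, which is what the duality (multi-slit) construction provides.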
Computer Simulations: Inelegant Mathematics and Worse Social Science?
Alker, Hayward R., Jr.
1974-01-01
Achievements, limitations, and difficulties of social science simulation efforts are discussed with particular reference to three examples. The pedagogical use of complementary developmental, philosophical, mathematical, and scientific approaches is advocated to minimize potential abuses of social simulation research. (LS)
Computing Optimal Stochastic Portfolio Execution Strategies: A Parametric Approach Using Simulations
Moazeni, Somayeh; Coleman, Thomas F.; Li, Yuying
2010-09-01
Computing optimal stochastic portfolio execution strategies under appropriate risk considerations presents a great computational challenge. We investigate a parametric approach for computing optimal stochastic strategies using Monte Carlo simulations. This approach reduces computational complexity by computing the coefficients of a parametric representation of a stochastic dynamic strategy based on static optimization. Using this technique, constraints can similarly be handled using appropriate penalty functions. We illustrate the proposed approach by minimizing the expected execution cost and Conditional Value-at-Risk (CVaR).
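The CVaR risk measure mentioned above can be sketched with the standard tail-average estimator over simulated execution costs. The cost distribution and parameters below are hypothetical, and the parametric strategy optimization itself is not shown:

```python
import random
import statistics

def cvar(samples, alpha=0.95):
    """Conditional Value-at-Risk: the mean of the worst (1 - alpha) tail
    of a cost distribution. A standard estimator, shown only to illustrate
    the risk measure named in the abstract."""
    tail_start = int(alpha * len(samples))
    worst = sorted(samples)[tail_start:]
    return statistics.mean(worst)

random.seed(0)
# hypothetical simulated execution costs for one candidate strategy
costs = [random.gauss(100.0, 15.0) for _ in range(10_000)]
expected_cost = statistics.mean(costs)
tail_risk = cvar(costs, alpha=0.95)
```

A strategy search then trades off `expected_cost` against `tail_risk`, rather than minimizing the mean alone.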
Analysis of pellet cladding mechanical interaction using computational simulation
International Nuclear Information System (INIS)
Berretta, José R.; Suman, Ricardo B.; Faria, Danilo P.; Rodi, Paulo A.; Giovedi, Claudia
2017-01-01
During the operation of Pressurized Water Reactors (PWR), specifically under power transients, the fuel pellet experiences many phenomena, such as swelling and thermal expansion. These dimensional changes in the fuel pellet can enable contact between it and the cladding along the fuel rod. The pellet cladding mechanical interaction (PCMI) due to this contact induces a stress increase at the contact points over a period of time, until the cladding accommodates the increased stress. This accommodation occurs by means of cladding strain, which can produce failure if the fuel rod deformation is permanent or the burst limit of the cladding is reached. Therefore, the mechanical behavior of the cladding during the occurrence of PCMI under power transients must be investigated during fuel rod design. Considering the Accident Tolerant Fuel program, which aims to develop new materials to be used as cladding in PWR, one important design condition to be evaluated is the cladding behavior under PCMI. The purpose of this paper is to analyze the effects of PCMI on a typical PWR fuel rod geometry with stainless steel cladding under normal power transients using computational simulation (the ANSYS code). The PCMI was analyzed considering four geometric situations at the region of interaction between pellet and cladding. The first case, called the “perfect fuel model”, was used as the reference for comparison. In the second case, the occurrence of a pellet crack with the loss of a chip was considered. In the next two cases, a pellet chip was positioned in the pellet-cladding gap, in the situations described in the first two cases. (author)
Single seed precise sowing of maize using computer simulation.
Zhao, Longgang; Han, Zhongzhi; Yang, Jinzhong; Qi, Hua
2018-01-01
In order to test the feasibility of computer simulation in field maize planting, the selection of the method of single seed precise sowing in maize is studied based on the quadratic function model Y = A×(D − Dm)² + Ym, which depicts the relationship between maize yield and planting density. The advantages and disadvantages of two planting methods under the condition of single seed sowing are also compared: Method 1 is optimum-density planting, while Method 2 is planting the ideal seedling emergence number. It is found that the yield reduction rate and yield fluctuation of Method 2 are both lower than those of Method 1. The yield of Method 2 increased by at least 0.043 t/hm², and showed more advantages over Method 1 at higher yield levels. A further study of the influence of seedling emergence rate on maize yield finds that the yields of the two methods are both highly positively correlated with the seedling emergence rate, and the standard deviations of their yields are both highly negatively correlated with it. For the study of the sparse break-up problem caused by single seed precise sowing, the definition of seedling-missing spots is put forward. The study found that the relationship between the number of hundred-dot spots and the field seedling emergence rate follows the parabolic function y = −189.32x² + 309.55x − 118.95, and the relationship between the number of seedling-missing spots and the field seedling emergence rate follows the negative exponential function y = 395.69e^(−6.144x). The results may help to guide maize seed production and single seed precise sowing to some extent.
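The quadratic yield-density model Y = A×(D − Dm)² + Ym can be written out directly. The parameter values below (A, Dm, Ym) are illustrative assumptions for the sketch, not the published estimates:

```python
# Quadratic yield-density model from the abstract, with illustrative
# parameters: A < 0 (concavity), Dm = optimum density, Ym = maximum yield.
def yield_at_density(D, A=-0.0004, Dm=75_000, Ym=12.0):
    """Yield (t/hm^2) at planting density D (plants/hm^2)."""
    return A * ((D - Dm) / 1000.0) ** 2 + Ym

# Under single seed sowing, the realized density scales with the
# seedling emergence rate e, which is the quantity the two planting
# methods handle differently:
def realized_yield(sown_seeds, emergence_rate, **params):
    return yield_at_density(sown_seeds * emergence_rate, **params)
```

Yield peaks at Ym when the realized density equals Dm and falls off quadratically on either side, which is why both methods' yields track the emergence rate so closely.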
Analysis of pellet cladding mechanical interaction using computational simulation
Energy Technology Data Exchange (ETDEWEB)
Berretta, José R.; Suman, Ricardo B.; Faria, Danilo P.; Rodi, Paulo A., E-mail: jose.berretta@marinha.mil.br [Centro Tecnológico da Marinha em São Paulo (CTMSP), São Paulo, SP (Brazil); Giovedi, Claudia, E-mail: claudia.giovedi@labrisco.usp.br [Universidade de Sao Paulo (LabRisco/USP), São Paulo, SP (Brazil). Laboratório de Análise, Avaliação e Gerenciamento de Riscos
2017-07-01
During the operation of Pressurized Water Reactors (PWR), specifically under power transients, the fuel pellet experiences many phenomena, such as swelling and thermal expansion. These dimensional changes in the fuel pellet can enable contact between it and the cladding along the fuel rod. The pellet cladding mechanical interaction (PCMI) due to this contact induces a stress increase at the contact points over a period of time, until the cladding accommodates the increased stress. This accommodation occurs by means of cladding strain, which can produce failure if the fuel rod deformation is permanent or the burst limit of the cladding is reached. Therefore, the mechanical behavior of the cladding during the occurrence of PCMI under power transients must be investigated during fuel rod design. Considering the Accident Tolerant Fuel program, which aims to develop new materials to be used as cladding in PWR, one important design condition to be evaluated is the cladding behavior under PCMI. The purpose of this paper is to analyze the effects of PCMI on a typical PWR fuel rod geometry with stainless steel cladding under normal power transients using computational simulation (the ANSYS code). The PCMI was analyzed considering four geometric situations at the region of interaction between pellet and cladding. The first case, called the “perfect fuel model”, was used as the reference for comparison. In the second case, the occurrence of a pellet crack with the loss of a chip was considered. In the next two cases, a pellet chip was positioned in the pellet-cladding gap, in the situations described in the first two cases. (author)
Professors' and students' perceptions and experiences of computational simulations as learning tools
Magana de Leon, Alejandra De Jesus
Computational simulations are becoming a critical component of scientific and engineering research, and now are becoming an important component for learning. This dissertation provides findings from a multifaceted research study exploring the ways computational simulations have been perceived and experienced as learning tools by instructors and students. Three studies were designed with an increasing focus on the aspects of learning and instructing with computational simulation tools. Study One used a student survey with undergraduate and graduate students whose instructors enhanced their teaching using online computational tools. Results of this survey were used to identify students' perceptions and experiences with these simulations as learning tools. The results provided both an evaluation of the instructional design and an indicator of which instructors were selected in Study Two. Study Two used a phenomenographic research design resulting in a two dimensional outcome space with six qualitatively different ways instructors perceived their learning outcomes associated with using simulation tools as part of students' learning experiences. Results from this work provide a framework for identifying major learning objectives to promote learning with computational simulation tools. Study Three used a grounded theory methodology to expand on instructors' learning objectives to include their perceptions of formative assessment and pedagogy. These perceptions were compared and contrasted with students' perceptions associated with learning with computational tools. The study is organized around three phases and analyzed as a collection of case studies focused on the instructors and their students' perceptions and experiences of computational simulations as learning tools. This third study resulted in a model for using computational simulations as learning tools. This model indicates the potential of integrating the computational simulation tools into formal learning
Computational Approaches to Simulation and Optimization of Global Aircraft Trajectories
Ng, Hok Kwan; Sridhar, Banavar
2016-01-01
This study examines three possible approaches to improving the speed of generating wind-optimal routes for air traffic at the national or global level: (a) using the resources of a supercomputer, (b) running the computations on multiple commercially available computers and (c) implementing the same algorithms in NASA's Future ATM Concepts Evaluation Tool (FACET); it compares these to a standard implementation run on a single CPU. Wind-optimal aircraft trajectories are computed using global air traffic schedules. The run time and wait time on the supercomputer for trajectory optimization using various numbers of CPUs, ranging from 80 to 10,240 units, are compared with the total computational time for running the same computation on a single desktop computer and on multiple commercially available computers, to assess the potential computational enhancement from parallel processing on computer clusters. This study also re-implements the trajectory optimization algorithm to further reduce computational time through algorithm modifications, and integrates it with FACET to facilitate the use of the new features, which calculate time-optimal routes between worldwide airport pairs in a wind field for use with existing FACET applications. The implementations of the trajectory optimization algorithms use the MATLAB, Python, and Java programming languages. The performance evaluations are done by comparing computational efficiencies and are based on the potential application of optimized trajectories. The paper shows that in the absence of special privileges on a supercomputer, a cluster of commercially available computers provides a feasible approach for national and global air traffic system studies.
Computing equation of state parameters of gases from Monte Carlo simulations
Ramdin, M.; Becker, T.M.; Jamali, S.H.; Wang, M.; Vlugt, T.J.H.
2016-01-01
Monte Carlo (MC) simulations in ensembles with a fixed chemical potential or fugacity, for example the grand-canonical or the osmotic ensemble, are often used to compute phase equilibria. Chemical potentials can be computed either with an equation of state (EoS) or from molecular simulations. The
Formal Analysis of Dynamics Within Philosophy of Mind by Computer Simulation
Bosse, T.; Schut, M.C.; Treur, J.
2009-01-01
Computer simulations can be useful tools to support philosophers in validating their theories, especially when these theories concern phenomena showing nontrivial dynamics. Such theories are usually informal, whilst for computer simulation a formally described model is needed. In this paper, a
Rieber, Lloyd P.; Tzeng, Shyh-Chii; Tribble, Kelly
2004-01-01
The purpose of this research was to explore how adult users interact and learn during an interactive computer-based simulation supplemented with brief multimedia explanations of the content. A total of 52 college students interacted with a computer-based simulation of Newton's laws of motion in which they had control over the motion of a simple…
Quantifying Uncertainty from Computational Factors in Simulations of a Model Ballistic System
2017-08-01
Quantifying Uncertainty from Computational Factors in Simulations of a Model Ballistic System, by Daniel J. Hornbaker, Weapons and Materials Research. Approved for public release; distribution is unlimited.
Possibilities and importance of using computer games and simulations in educational process
Danilović Mirčeta S.
2003-01-01
The paper discusses if it is possible and appropriate to use simulations (simulation games) and traditional games in the process of education. It is stressed that the terms "game" and "simulation" can and should be taken in a broader sense, although they are chiefly investigated herein as video-computer games and simulations. Any activity combining the properties of game (competition, rules, players) and the properties of simulation (i.e. operational presentation of reality) should be underst...
Interferences and events on epistemic shifts in physics through computer simulations
Warnke, Martin
2017-01-01
Computer simulations are omnipresent media in today's knowledge production. For scientific endeavors such as the detection of gravitational waves and the exploration of subatomic worlds, simulations are essential; however, the epistemic status of computer simulations is rather controversial as they are neither just theory nor just experiment. Therefore, computer simulations have challenged well-established insights and common scientific practices as well as our very understanding of knowledge. This volume contributes to the ongoing discussion on the epistemic position of computer simulations in a variety of physical disciplines, such as quantum optics, quantum mechanics, and computational physics. Originating from an interdisciplinary event, it shows that accounts of contemporary physics can constructively interfere with media theory, philosophy, and the history of science.
NeuroManager: A workflow analysis based simulation management engine for computational neuroscience
Directory of Open Access Journals (Sweden)
David Bruce Stockton
2015-10-01
We developed NeuroManager, an object-oriented simulation management software engine for computational neuroscience. NeuroManager automates the workflow of simulation job submissions when using heterogeneous computational resources, simulators, and simulation tasks. The object-oriented approach (1) provides flexibility to adapt to a variety of neuroscience simulators, (2) simplifies the use of heterogeneous computational resources, from desktops to supercomputer clusters, and (3) improves tracking of simulator/simulation evolution. We implemented NeuroManager in MATLAB, a widely used engineering and scientific language, for its signal and image processing tools, its prevalence in electrophysiology analysis, and its increasing use in college biology education. To design and develop NeuroManager, we analyzed the workflow of simulation submission for a variety of simulators, operating systems, and computational resources, including the handling of input parameters, data, models, results, and analyses; this yielded twenty-two stages of simulation submission workflow. The software incorporates progress notification; automatic organization, labeling, and time-stamping of data and results; and integrated access to MATLAB's analysis and visualization tools. NeuroManager provides users with the tools to automate daily tasks and assists principal investigators in tracking and recreating the evolution of research projects performed by multiple people. Overall, NeuroManager provides the infrastructure needed to improve workflow, manage multiple simultaneous simulations, and maintain provenance of the potentially large amounts of data produced during the course of a research project.
Watts, G.
1992-01-01
A programming technique to eliminate computational instability in multibody simulations that use the Lagrange multiplier is presented. The computational instability occurs when the attached bodies drift apart and violate the constraints. The programming technique uses the constraint equation, instead of integration, to determine the coordinates that are not independent. Although the equations of motion are unchanged, a complete derivation of the incorporation of the Lagrange multiplier into the equation of motion for two bodies is presented. A listing of a digital computer program which uses the programming technique to eliminate computational instability is also presented. The computer program simulates a solid rocket booster and parachute connected by a frictionless swivel.
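The stabilization idea described in this abstract can be illustrated with a minimal sketch (not the reported simulation program, whose bodies are a rocket booster and parachute): after each integration step, the dependent coordinates are recomputed from the constraint equation rather than taken from integration, so drift between the attached bodies cannot accumulate. The two point masses, the rigid link, and all numerical values below are invented for illustration.

```python
import numpy as np

# Hypothetical two-body system: body 2 swings about a fixed body 1 on a
# rigid massless link of length L (a stand-in for the frictionless swivel).
L = 1.0
dt = 0.01

def project_onto_constraint(r1, r2):
    """Enforce |r2 - r1| = L by rescaling the link vector."""
    d = r2 - r1
    return r1 + L * d / np.linalg.norm(d)

r1 = np.array([0.0, 0.0])        # body 1 held fixed
r2 = np.array([0.0, -L])         # body 2 initial position
v2 = np.array([0.5, 0.0])

for _ in range(1000):
    # naive explicit Euler step for body 2 under gravity; on its own this
    # would let the bodies drift apart and violate the constraint
    v2 = v2 + dt * np.array([0.0, -9.81])
    r2 = r2 + dt * v2
    # constraint step: determine the dependent coordinates from the
    # constraint equation instead of trusting the integrated values
    r2 = project_onto_constraint(r1, r2)

# the constraint is satisfied to machine precision, with no drift
assert abs(np.linalg.norm(r2 - r1) - L) < 1e-12
```

The same pattern generalizes to Lagrange-multiplier formulations: the multiplier supplies the constraint force in the equations of motion, while the constraint equation itself fixes the dependent coordinates.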
Math modeling and computer mechanization for real time simulation of rotary-wing aircraft
Howe, R. M.
1979-01-01
Mathematical modeling and computer mechanization for real time simulation of rotary wing aircraft is discussed. Error analysis in the digital simulation of dynamic systems, such as rotary wing aircraft is described. The method for digital simulation of nonlinearities with discontinuities, such as exist in typical flight control systems and rotor blade hinges, is discussed.
The Effects of 3D Computer Simulation on Biology Students' Achievement and Memory Retention
Elangovan, Tavasuria; Ismail, Zurida
2014-01-01
A quasi experimental study was conducted for six weeks to determine the effectiveness of two different 3D computer simulation based teaching methods, that is, realistic simulation and non-realistic simulation on Form Four Biology students' achievement and memory retention in Perak, Malaysia. A sample of 136 Form Four Biology students in Perak,…
The Effects of Computer-Simulation Game Training on Participants' Opinions on Leadership Styles
Siewiorek, Anna; Gegenfurtner, Andreas; Lainema, Timo; Saarinen, Eeli; Lehtinen, Erno
2013-01-01
The objective of this study is to elucidate new information on the possibility of leadership training through business computer-simulation gaming in a virtual working context. In the study, a business-simulation gaming session was organised for graduate students (n = 26). The participants played the simulation game in virtual teams…
Optimizing Cognitive Load for Learning from Computer-Based Science Simulations
Lee, Hyunjeong; Plass, Jan L.; Homer, Bruce D.
2006-01-01
How can cognitive load in visual displays of computer simulations be optimized? Middle-school chemistry students (N = 257) learned with a simulation of the ideal gas law. Visual complexity was manipulated by separating the display of the simulations in two screens (low complexity) or presenting all information on one screen (high complexity). The…
Simulation model of load balancing in distributed computing systems
Botygin, I. A.; Popov, V. N.; Frolov, S. G.
2017-02-01
The availability of high-performance computing, high-speed data transfer over the network, and the widespread use of software for design and pre-production in mechanical engineering have led large industrial enterprises and small engineering companies alike to implement complex computer systems for the efficient solution of production and management tasks. Such computer systems are generally built on the basis of distributed heterogeneous computer systems. The analytical problems solved by such systems are the key subjects of research, but the system-wide problems of efficiently distributing (balancing) the computational load and of accommodating the input, intermediate, and output databases are no less important. The main tasks of this balancing system are monitoring the load and condition of compute nodes and selecting a node to receive a user's request in accordance with a predetermined algorithm. Load balancing is one of the most widely used methods of increasing the productivity of distributed computing systems through the optimal allocation of tasks between the computer system nodes. Therefore, the development of methods and algorithms for computing an optimal schedule in a distributed system whose infrastructure changes dynamically is an important task.
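The node-selection step described above (monitor per-node load, dispatch each request to a node by a predetermined algorithm) can be sketched with a simple least-loaded policy; the class, node names, and unit-cost request model are illustrative, not taken from the paper.

```python
import heapq
from collections import Counter

class Balancer:
    """Dispatch each request to the currently least-loaded node."""

    def __init__(self, nodes):
        # heap of (current_load, node_name); heapq keeps the minimum on top
        self.heap = [(0, n) for n in nodes]
        heapq.heapify(self.heap)

    def dispatch(self, cost):
        """Assign a request of the given cost and return the chosen node."""
        load, node = heapq.heappop(self.heap)
        heapq.heappush(self.heap, (load + cost, node))
        return node

b = Balancer(["n1", "n2", "n3"])
assignments = [b.dispatch(1) for _ in range(6)]
# six unit-cost requests spread evenly: two per node
assert Counter(assignments) == {"n1": 2, "n2": 2, "n3": 2}
```

Real balancers replace the stored load with live monitoring data, but the selection step stays the same O(log n) heap operation.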
Simulating quantum systems on classical computers with matrix product states
Energy Technology Data Exchange (ETDEWEB)
Kleine, Adrian
2010-11-08
In this thesis, the numerical simulation of strongly interacting many-body quantum-mechanical systems using matrix product states (MPS) is considered. Matrix product states are a novel representation of arbitrary quantum many-body states. Using quantum information theory, it is possible to show that matrix product states provide a polynomial-sized representation of one-dimensional quantum systems, thus allowing an efficient simulation of one-dimensional quantum systems on classical computers. Matrix product states form the conceptual framework of the density-matrix renormalization group (DMRG). After a general introduction in the first chapter of this thesis, the second chapter deals with matrix product states, focusing on the development of fast and stable algorithms. To obtain algorithms that efficiently calculate ground states, the density-matrix renormalization group is reformulated in the matrix-product-state framework. Further, time-dependent problems are considered. Two different algorithms are presented, one based on a Trotter decomposition of the time-evolution operator, the other on Krylov subspaces. Finally, the evaluation of dynamical spectral functions is discussed, and a correction-vector-based method is presented. In the following chapters, the methods presented in the second chapter are applied to a number of different physical problems. The third chapter deals with the existence of chiral phases in isotropic one-dimensional quantum spin systems. A preceding analytical study based on a mean-field approach indicated the possible existence of those phases in an isotropic Heisenberg model with a frustrating zig-zag interaction and a magnetic field. In this thesis, the existence of the chiral phases is shown numerically by using MPS-based algorithms. In the fourth chapter, we propose an experiment using ultracold atomic gases in optical lattices, which allows a well controlled observation of the spin-charge separation (of
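The core MPS construction mentioned above can be illustrated numerically: any state of n spins can be factored into a chain of tensors by repeated singular value decomposition, and contracting the chain recovers the state. This is only a generic illustration of the representation, not the thesis's DMRG or time-evolution algorithms.

```python
import numpy as np

def to_mps(psi, n):
    """Decompose a 2**n state vector into a chain of MPS tensors via SVD."""
    tensors, chi = [], 1
    rest = psi
    for _ in range(n - 1):
        # split off one physical (dimension-2) index at a time
        rest = rest.reshape(chi * 2, -1)
        u, s, vh = np.linalg.svd(rest, full_matrices=False)
        tensors.append(u.reshape(chi, 2, -1))   # (left bond, physical, right bond)
        chi = s.size                             # new bond dimension
        rest = np.diag(s) @ vh
    tensors.append(rest.reshape(chi, 2, 1))
    return tensors

def from_mps(tensors):
    """Contract the tensor chain back into a full state vector."""
    out = tensors[0]
    for t in tensors[1:]:
        out = np.tensordot(out, t, axes=([out.ndim - 1], [0]))
    return out.reshape(-1)

n = 6
psi = np.random.randn(2 ** n)
psi /= np.linalg.norm(psi)
assert np.allclose(from_mps(to_mps(psi, n)), psi)
```

For a generic random state the bond dimension chi grows exponentially toward the chain's middle; the point of the thesis's setting is that for one-dimensional ground states chi stays small, which is what makes the representation polynomial-sized.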
Computer simulation of electronic excitation in atomic collision cascades
Energy Technology Data Exchange (ETDEWEB)
Duvenbeck, A.
2007-04-05
The impact of a keV atomic particle onto a solid surface initiates a complex sequence of collisions among target atoms in a near-surface region. The temporal and spatial evolution of this atomic collision cascade leads to the emission of particles from the surface, a process usually called sputtering. In modern surface analysis, the so-called SIMS technology uses the flux of sputtered particles as a source of information on the microscopic stoichiometric structure in the proximity of the bombarded surface spots. By laterally varying the bombarding spot on the surface, the entire target can be scanned and chemically analyzed. However, particle detection, which is based upon deflection in electric fields, is limited to those species that leave the surface in an ionized state. Because the ionized fraction of the total flux of sputtered atoms often amounts to only a few percent or even less, detection is often hampered by rather low signals. Moreover, it is well known that the ionization probability of emitted particles depends not only on the elemental species but also on the local environment from which a particle leaves the surface. Therefore, the measured signals for different sputtered species do not necessarily represent the stoichiometric composition of the sample; in the literature, this phenomenon is known as the matrix effect in SIMS. In order to circumvent this principal shortcoming of SIMS, the present thesis develops an alternative computer simulation concept, which treats the electronic energy losses of all moving atoms as excitation sources feeding energy into the electronic sub-system of the solid. The particle kinetics determining the excitation sources are delivered by classical molecular dynamics. The excitation energy calculations are combined with a diffusive transport model to describe the spread of excitation energy from the initial point of generation. Calculation results yield a space- and time-resolved excitation
Computer simulation of molecular absorption spectra for asymmetric top molecules
International Nuclear Information System (INIS)
Bende, A.; Tosa, V.; Cosma, V.
2001-01-01
The effective Hamiltonian formalism has been used to develop a model for the infrared multiple-photon absorption (IRMPA) process in asymmetric top molecules. Assuming a collisionless regime, the interaction between the molecule and the laser field can be described by the time-dependent Schroedinger equation. By using the rotating wave approximation and a Laplace transformation, the time-dependent problem reduces to a time-independent eigenproblem for an effective Hamiltonian, which can be solved only numerically for a real vibrational-rotational structure of a polyatomic molecule. The vibrational-rotational structure is assumed to be an anharmonic oscillator coupled to an asymmetric rigid rotor. The main assumptions of this model are the following: (1) the excitation is coherent, i.e. collisions (if present during the laser pulse) do not influence the excitation; (2) the excitation starts from the ground state and is near-resonant with a normal mode, so the rotating wave approximation can be applied; (3) after absorbing N photons, the vibrational energy of the excited mode leaks into a quasicontinuum; (4) the thermal population of the ground state is given by the Maxwell-Boltzmann distribution law. According to quantum mechanics, the energy levels of asymmetric top molecules cannot be represented by an explicit formula analogous to that for the symmetric top, but they can be treated as a deviation from the prolate or oblate limiting case of the symmetric top, and the selection rules for the asymmetric case can be found in the same manner from the selection rules for the symmetric case. The infrared bands of asymmetric top molecules are not resolved, but if the dispersion used is not too small, so that the envelopes of the bands can be distinguished from simple maxima, it is possible to draw conclusions as to the type of the bands. In this case, the simulation of the absorption spectra can give us some important information about the types of these bands. In
Computer Simulation Studies of Ion Channels at High Temperatures
Song, Hyun Deok
The gramicidin channel is the smallest known biological ion channel, and it exhibits cation selectivity. Recently, Dr. John Cuppoletti's group at the University of Cincinnati showed that the gramicidin channel can function at high temperatures (360-380 K) with significant currents. This finding may have significant implications for fuel cell technology. In this thesis, we have examined the gramicidin channel at 300 K, 330 K, and 360 K by computer simulation. We have investigated how the temperature affects the current and the difference in free energy between the two gramicidin forms, the helical dimer (HD) and the double helix (DH). A slight decrease of the free energy barrier inside the gramicidin channel, together with increased diffusion at high temperatures, results in an increase of current. An applied external field of 0.2 V/nm along the membrane normal results in directly observable ion transport across the channels at high temperatures for both the HD and DH forms. We found that higher temperatures also affect the probability distribution of hydrogen bonds, the bending angle, the distance between dimers, and the size of the pore radius for the helical dimer structure. These findings may be related to the gating of the gramicidin channel. Methanococcus jannaschii (MJ) is a methane-producing thermophile, which was discovered at a depth of 2600 m in a Pacific Ocean vent in 1983. It has the ability to thrive at high temperatures and high pressures, which are unfavorable for most life forms. There have been some experiments to study its stability under extreme conditions, but the origin of the stability of MJ is still not exactly known. MJ0305 is a chloride channel protein from the thermophile MJ. After generating a structure of MJ0305 by homology modeling based on the E. coli ClC templates, we examined the thermal stability and the network stability from the change of network entropy calculated from the adjacency matrices of the protein. High temperatures increase the
A Fast Synthetic Aperture Radar Raw Data Simulation Using Cloud Computing.
Li, Zhixin; Su, Dandan; Zhu, Haijiang; Li, Wei; Zhang, Fan; Li, Ruirui
2017-01-08
Synthetic Aperture Radar (SAR) raw data simulation is a fundamental problem in radar system design and imaging algorithm research. The growth of surveying swath and resolution results in a significant increase in data volume and simulation period, making the problem both data-intensive and computing-intensive. Although several high performance computing (HPC) methods have demonstrated their potential for accelerating simulation, the input/output (I/O) bottleneck of huge raw data has not been eased. In this paper, we propose a cloud computing based SAR raw data simulation algorithm, which employs the MapReduce model to accelerate the raw data computation and the Hadoop distributed file system (HDFS) for fast I/O access. The MapReduce model is designed for the irregular parallel accumulation of raw data simulation, the very step that greatly reduces the parallel efficiency of graphics processing unit (GPU) based simulation methods. In addition, three kinds of optimization strategies are put forward, addressing the programming model, the HDFS configuration, and scheduling. The experimental results show that the cloud computing based algorithm achieves a 4× speedup over the baseline serial approach in an 8-node cloud environment, and each optimization strategy improves performance by about 20%. This work proves that the proposed cloud algorithm is capable of solving the computing-intensive and data-intensive issues in SAR raw data simulation, and is easily extended to large-scale computing to achieve higher acceleration.
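The irregular accumulation that the MapReduce model targets can be sketched as a toy map/reduce skeleton in plain Python (not the paper's Hadoop code): mappers emit echo contributions keyed by range line, and the reducer sums them, so contributions from any number of targets fold into each line independently. The target tuples and keys below are invented for illustration.

```python
from collections import defaultdict
from functools import reduce

def mapper(target):
    """Emit (range_line, contribution) pairs for one point target."""
    line, amplitude = target
    yield (line, amplitude)

def reducer(accum, pair):
    """Accumulate contributions per range line (the irregular summation)."""
    key, value = pair
    accum[key] += value
    return accum

# hypothetical point targets: (range_line, echo amplitude)
targets = [(0, 1.0), (1, 0.5), (0, 2.0), (2, 0.25), (1, 1.5)]
pairs = (p for t in targets for p in mapper(t))
raw_data = reduce(reducer, pairs, defaultdict(float))

assert raw_data[0] == 3.0 and raw_data[1] == 2.0 and raw_data[2] == 0.25
```

In the real system the mapper would evaluate the full echo model for each target and HDFS would shuttle the intermediate pairs; the shape of the computation, many-to-few keyed accumulation, is the same.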
Fukuda, Ikuo; Osanai, Satoshi; Shirota, Minori; Inamura, Takao; Yanaoka, Hideki; Minakawa, Masahito; Fukui, Kozo
2009-06-01
Atheroembolism due to aortic manipulation remains an unsolved problem in surgery for thoracic aortic aneurysm. The goal of the present study is to create a computer simulation (CS) model with which to analyze blood flow in the diseased aorta. A three-dimensional glass model of the aortic arch was constructed from CT images of a normal, healthy person and a patient with transverse aortic arch aneurysm. Separately, a CS model of the curved end-hole cannula was created, and flow from the aortic cannula was recreated using a numerical simulation. Comparison of the data obtained by the glass model analyses revealed that the flow velocity and the vector of the flow around the exit of the cannula were similar to that in the CS model. A high-velocity area was observed around the cannula exit in both the glass model and the CS model. The maximum flow velocity was as large as 1.0 m/s at 20 mm from the cannula exit and remained as large as 0.5 to 0.6 m/s within 50 mm of the exit. In the aortic arch aneurysm models, the rapid jet flow from the cannula moved straight toward the lesser curvature of the transverse aortic arch. The locations and intensities of the calculated vortices were slightly different from those obtained for the glass model. The proposed CS method for the analysis of blood flow from the aortic cannulae during extracorporeal circulation can reproduce the flow velocity and flow pattern in the proximal and transverse aortic arches.
Li, Pengcheng; Liu, Celong; Li, Xianpeng; He, Honghui; Ma, Hui
2016-09-20
In earlier studies, we developed scattering models and the corresponding CPU-based Monte Carlo simulation programs to study the behavior of polarized photons as they propagate through complex biological tissues. The high degrees of freedom in these studies created a demand for massive simulation tasks. In this paper, we report a parallel implementation of the simulation program based on the compute unified device architecture (CUDA) running on a graphics processing unit (GPU). Different schemes for sphere-only simulations and sphere-cylinder mixture simulations were developed. Diverse optimization methods were employed to achieve the best acceleration. The final-version GPU program is hundreds of times faster than the CPU version. The dependence of performance on input parameters and precision was also studied. It is shown that using single precision in the GPU simulations results in very limited losses in accuracy. Consumer-level graphics cards, even those in laptop computers, are more cost-effective than scientific graphics cards for single-precision computation.
Snowden, Jonathan M; Rose, Sherri; Mortimer, Kathleen M
2011-04-01
The growing body of work in the epidemiology literature focused on G-computation includes theoretical explanations of the method but very few simulations or examples of application. The small number of G-computation analyses in the epidemiology literature relative to other causal inference approaches may be partially due to a lack of didactic explanations of the method targeted toward an epidemiology audience. The authors provide a step-by-step demonstration of G-computation that is intended to familiarize the reader with this procedure. The authors simulate a data set and then demonstrate both G-computation and traditional regression to draw connections and illustrate contrasts between their implementation and interpretation relative to the truth of the simulation protocol. A marginal structural model is used for effect estimation in the G-computation example. The authors conclude by answering a series of questions to emphasize the key characteristics of causal inference techniques and the G-computation procedure in particular.
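The three steps of the procedure sketched in this abstract (fit the outcome regression, predict each subject's outcome under both interventions, contrast the mean predictions) can be demonstrated on simulated data with a known truth. The data-generating values below are invented for illustration; with a correctly specified linear model, the G-computation estimate recovers the true effect.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Simulated data with a known truth: confounder W affects both exposure A
# and outcome Y; the true causal effect of A on Y is 2.0.
W = rng.normal(size=n)
A = (rng.uniform(size=n) < 1 / (1 + np.exp(-W))).astype(float)
Y = 2.0 * A + 3.0 * W + rng.normal(size=n)

# Step 1: fit the outcome regression E[Y | A, W] (here by least squares).
X = np.column_stack([np.ones(n), A, W])
beta, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Step 2: predict every subject's outcome under interventions A=1 and A=0,
# keeping each subject's observed confounder value.
X1 = np.column_stack([np.ones(n), np.ones(n), W])
X0 = np.column_stack([np.ones(n), np.zeros(n), W])

# Step 3: the G-computation estimate is the contrast of mean predictions.
effect = (X1 @ beta).mean() - (X0 @ beta).mean()
assert abs(effect - 2.0) < 0.1
```

A naive comparison of observed group means, by contrast, would be biased here because W drives both A and Y; the standardization over the confounder distribution is what G-computation adds.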
Enabling Breakthrough Kinetic Simulations of the Magnetosphere Using Petascale Computing
Vu, H. X.; Karimabadi, H.; Omelchenko, Y.; Tatineni, M.; Majumdar, A.; Krauss-Varban, D.; Dorelli, J.
2009-12-01
Currently, global magnetospheric simulations are predominantly based on single-fluid magnetohydrodynamics (MHD). MHD simulations have proven useful in studies of the global dynamics of the magnetosphere, with the goal of predicting the salient features of substorms and other global events. But it is well known that the magnetosphere is dominated by ion kinetic effects, which are ignored in MHD simulations, and many key aspects of the magnetosphere relating to transport and the structure of boundaries await global kinetic simulations. We are using our recent innovations in hybrid (electron fluid, kinetic ion) simulations, as developed in our Hybrid3D (H3D) code, and the power of massively parallel machines to make breakthrough 3D global kinetic simulations of the magnetosphere. The innovations include (i) a multi-zone (asynchronous) algorithm, (ii) dynamic load balancing, and (iii) code adaptation and optimization for large numbers of processors. In this presentation we show preliminary results of our progress to date using from 512 to over 8192 cores. In particular, we focus on what we believe to be the first demonstration of the formation of a flux rope in 3D global hybrid simulations. As in the MHD simulations, the resulting flux rope has a very complex structure, wrapping up field lines from different regions, and appears to be connected on at least one end to Earth. The magnetic topology of the FTE is examined to reveal the existence of several separators (3D X-lines). The formation and growth of this structure are discussed, and spatial profiles of the magnetic and plasma variables are compared with those from MHD simulations.
A scalable parallel black oil simulator on distributed memory parallel computers
Wang, Kun; Liu, Hui; Chen, Zhangxin
2015-11-01
This paper presents our work on developing a parallel black oil simulator for distributed memory computers based on our in-house parallel platform. The parallel simulator is designed to overcome the performance issues of common simulators that are implemented for personal computers and workstations. The finite difference method is applied to discretize the black oil model. In addition, some advanced techniques are employed to strengthen the robustness and parallel scalability of the simulator, including an inexact Newton method, matrix decoupling methods, and algebraic multigrid methods. A new multi-stage preconditioner is proposed to accelerate the solution of linear systems from the Newton methods. Numerical experiments show that our simulator is scalable and efficient, and is capable of simulating extremely large-scale black oil problems with tens of millions of grid blocks using thousands of MPI processes on parallel computers.
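The inexact Newton idea named in this abstract can be sketched on a small nonlinear system; this is a hedged illustration, not the simulator's code. Each Newton step solves J dx = -F only approximately, to a forcing tolerance eta, which is where a preconditioned iterative solver (algebraic multigrid in the paper) saves most of its work. Here a damped Jacobi iteration stands in for that solver, and the test system is invented.

```python
import numpy as np

def F(x):
    """Toy nonlinear residual; the exact root is (1, 2)."""
    return np.array([x[0] ** 2 + x[1] - 3.0, x[0] + x[1] ** 2 - 5.0])

def J(x):
    """Jacobian of F."""
    return np.array([[2.0 * x[0], 1.0], [1.0, 2.0 * x[1]]])

def approx_solve(A, b, eta):
    """Damped Jacobi, stopped once the linear residual is below eta*|b|;
    the loose tolerance is the 'inexact' part of inexact Newton."""
    x = np.zeros_like(b)
    D = np.diag(A)
    while np.linalg.norm(b - A @ x) > eta * np.linalg.norm(b):
        x = x + 0.6 * (b - A @ x) / D
    return x

x = np.array([1.0, 1.0])
for _ in range(50):
    dx = approx_solve(J(x), -F(x), eta=0.1)
    x = x + dx
    if np.linalg.norm(F(x)) < 1e-10:
        break

assert np.linalg.norm(F(x)) < 1e-10
```

Despite the crude inner solves, the outer Newton iteration still converges; tightening eta as the residual shrinks recovers fast local convergence, and that trade-off is what makes the approach economical at scale.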
Parallel Monte Carlo simulations on an ARC-enabled computing grid
International Nuclear Information System (INIS)
Nilsen, Jon K; Samset, Bjørn H
2011-01-01
Grid computing opens new possibilities for running heavy Monte Carlo simulations of physical systems in parallel. The presentation gives an overview of GaMPI, a system for running an MPI-based random walker simulation on grid resources. Integrating the ARC middleware and the new storage system Chelonia with the Ganga grid job submission and control system, we show that MPI jobs can be run on a world-wide computing grid with good performance and promising scaling properties. Results for relatively communication-heavy Monte Carlo simulations run on multiple heterogeneous, ARC-enabled computing clusters in several countries are presented.
Computer simulation of forest fire and its possible usage
International Nuclear Information System (INIS)
Halada, L.; Weisenpacher, P.; Glasa, J.
2005-01-01
In this presentation the authors deal with computer modelling of forest fires and discuss its possible usage. Results of modelling are compared with a real forest fire in the National Park Slovensky Raj (Slovak Paradise) in the year 2000.
Computational Methods for Predictive Simulation of Stochastic Turbulence Systems
2015-11-05
A single realization can demand substantial computing time (even weeks), while generating a full PDF can require thousands of realizations; this is the fundamental difficulty. Program managers: Catalin Trenchea; Fariba Fahroo; Jean-Luc Cambier, Program Officer, Computational Mathematics, AFOSR/RTA.
Molecular Dynamic Simulations of Nanostructured Ceramic Materials on Parallel Computers
International Nuclear Information System (INIS)
Vashishta, Priya; Kalia, Rajiv
2005-01-01
Large-scale molecular-dynamics (MD) simulations have been performed to gain insight into: (1) sintering, structure, and mechanical behavior of nanophase SiC and SiO2; (2) effects of dynamic charge transfers on the sintering of nanophase TiO2; (3) high-pressure structural transformation in bulk SiC and GaAs nanocrystals; (4) nanoindentation in Si3N4; and (5) lattice mismatched InAs/GaAs nanomesas. In addition, we have designed a multiscale simulation approach that seamlessly embeds MD and quantum-mechanical (QM) simulations in a continuum simulation. The above research activities have involved strong interactions with researchers at various universities, government laboratories, and industries. 33 papers have been published and 22 talks have been given based on the work described in this report
Computer simulation of radiation damage in gallium arsenide
Stith, John J.; Davenport, James C.; Copeland, Randolph L.
1989-01-01
A version of the binary-collision simulation code MARLOWE was used to study the spatial characteristics of radiation damage in proton- and electron-irradiated gallium arsenide. Comparisons with experimental results proved encouraging.
Computer simulation of the fire-tube boiler hydrodynamics
Directory of Open Access Journals (Sweden)
Khaustov Sergei A.
2015-01-01
The finite element method was used to simulate the hydrodynamics of a fire-tube boiler with the ANSYS Fluent 12.1.4 engineering simulation software. The hydrodynamic structure and volumetric temperature distribution were calculated, and the results are presented in graphical form. A complete geometric model of the fire-tube boiler, based on the boiler drawings, was considered. The results obtained are suitable for qualitative analysis of the hydrodynamics and for the identification of singularities in the fire-tube boiler water shell.
Computer simulation of mixed classical-quantum systems
International Nuclear Information System (INIS)
Kalia, R.K.; Vashishta, P.
1988-11-01
We briefly review three important methods that are currently used in the simulation of mixed systems. Two of these techniques, path-integral Monte Carlo (or molecular dynamics) and dynamical simulated annealing, have the limitation that they can only describe the structural properties in the ground state. The third, the so-called quantum molecular dynamics (QMD) method, can provide not only the static properties but also the real-time dynamics of a quantum particle at finite temperatures. 10 refs
Smetana, Lara Kathleen; Bell, Randy L.
2012-06-01
Researchers have explored the effectiveness of computer simulations for supporting science teaching and learning during the past four decades. The purpose of this paper is to provide a comprehensive, critical review of the literature on the impact of computer simulations on science teaching and learning, with the goal of summarizing what is currently known and providing guidance for future research. We report on the outcomes of 61 empirical studies dealing with the efficacy of, and implications for, computer simulations in science instruction. The overall findings suggest that simulations can be as effective, and in many ways more effective, than traditional (i.e. lecture-based, textbook-based and/or physical hands-on) instructional practices in promoting science content knowledge, developing process skills, and facilitating conceptual change. As with any other educational tool, the effectiveness of computer simulations is dependent upon the ways in which they are used. Thus, we outline specific research-based guidelines for best practice. Computer simulations are most effective when they (a) are used as supplements; (b) incorporate high-quality support structures; (c) encourage student reflection; and (d) promote cognitive dissonance. Used appropriately, computer simulations involve students in inquiry-based, authentic science explorations. Additionally, as educational technologies continue to evolve, advantages such as flexibility, safety, and efficiency deserve attention.
Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure.
Wang, Henry; Ma, Yunzhi; Pratx, Guillem; Xing, Lei
2011-09-07
Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach for high performance computing and is implemented to perform ultra-fast MC calculation in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of cloud-based MC simulation is identical to that produced by single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47× speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed.
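The master/worker pattern described above can be sketched in miniature: the total history count is split into independent, independently seeded batches whose partial tallies are summed on the master. Here the batches run sequentially (the cloud deployment ships them to worker nodes via MPI), and pi estimation stands in for particle transport; all names and counts are illustrative. As a sanity check on the reported figures, 2.58 h is 154.8 min, and 154.8 / 3.3 ≈ 47, matching the quoted speed-up.

```python
import random

def simulate_batch(n_histories, seed):
    """Score one independent batch of Monte Carlo histories.
    (Pi estimation by rejection stands in for photon/electron transport.)"""
    rng = random.Random(seed)
    return sum(rng.random() ** 2 + rng.random() ** 2 < 1.0
               for _ in range(n_histories))

total, n_workers = 400_000, 100
per_worker = total // n_workers

# each (batch size, seed) pair would be shipped to a separate worker node;
# distinct seeds keep the batches statistically independent
hits = sum(simulate_batch(per_worker, seed) for seed in range(n_workers))

# the master aggregates the tallies exactly as if one node had run them all
pi_estimate = 4.0 * hits / total
assert abs(pi_estimate - 3.14159) < 0.05
```

Because the histories are independent, the aggregated result is identical in distribution to a single-threaded run, which is why the paper can report bit-compatible output alongside the near-linear scaling in node count.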
Computer simulations of the atmospheric composition climate of Bulgaria
Energy Technology Data Exchange (ETDEWEB)
Gadzhev, G.; Ganev, K.; Syrakov, D.; Prodanova, M.; Georgieva, I.; Georgiev, G.
2015-07-01
Some extensive numerical simulations of the atmospheric composition fields in Bulgaria have been recently performed. The US EPA Model-3 system was chosen as a modelling tool. As the NCEP Global Analysis Data with 1 degree resolution was used as meteorological background, the MM5 and CMAQ nesting capabilities were applied for downscaling the simulations to a 3 km resolution over Bulgaria. The TNO emission inventory was used as emission input. Special pre-processing procedures were created for introducing temporal profiles and speciation of the emissions. The biogenic emissions of VOC are estimated by the model SMOKE. The simulations were carried out for the years 2000-2007. The numerical experiments have been carried out for different emission scenarios, which makes it possible to evaluate the contribution of emissions from different source categories. The Models-3 Integrated Process Rate Analysis option is applied to discriminate the role of different dynamic and chemical processes in air pollution formation. The obtained ensemble of numerical simulation results is extensive enough to allow statistical treatment, calculating not only the mean concentrations and different source categories' contribution mean fields, but also standard deviations, skewness, etc., with their dominant temporal modes (seasonal and/or diurnal variations). Thus some basic facts about the atmospheric composition climate of Bulgaria can be retrieved from the simulation ensemble. (Author)
National Research Council Canada - National Science Library
Litvin, F
1999-01-01
An integrated tooth contact analysis (TCA) computer program for the simulation of meshing and contact of gear drives that calculates transmission errors and shift of bearing contact for misaligned gear drives has been developed...
Thermodynamic and transport properties of nitrogen fluid: Molecular theory and computer simulations
Eskandari Nasrabad, A.; Laghaei, R.
2018-04-01
Computer simulations and various theories are applied to compute the thermodynamic and transport properties of nitrogen fluid. To model the nitrogen interaction, an existing potential in the literature is modified to obtain a close agreement between the simulation results and experimental data for the orthobaric densities. We use the Generic van der Waals theory to calculate the mean free volume and apply the results within the modified Cohen-Turnbull relation to obtain the self-diffusion coefficient. Compared to experimental data, excellent results are obtained via computer simulations for the orthobaric densities, the vapor pressure, the equation of state, and the shear viscosity. We analyze the results of the theory and computer simulations for the various thermophysical properties.
The Tortoise Versus the Hare: Computer Simulation of Euendolithic Microbial Alteration Textures
Banerjee, N. R.; Lee, J.; Izawa, M. R. M.; Tiampo, K.
2010-04-01
A computer model has been developed to simulate the production of euendolithic microbial alteration textures in basalt glass; and produces textures that are qualitatively and quantitatively similar to those observed in natural glass bioalteration.
International Nuclear Information System (INIS)
Littmark, U.
1994-01-01
The 'philosophy' behind, and the 'psychology' of, the development from analytic theory to computer simulations in the field of atomic collisions in solids are discussed, and a few examples of achievements and perspectives are given. (orig.)
Criteria for Appraising Computer-Based Simulations for Teaching Arabic as a Foreign Language
National Research Council Canada - National Science Library
Dabrowski, Richard
2005-01-01
This was an exploratory study aimed at defining more sharply the pedagogical and practical challenges entailed in designing and creating computer-based game-type simulations for learning Arabic as a foreign language...
Simulation of partially coherent light propagation using parallel computing devices
Magalhães, Tiago C.; Rebordão, José M.
2017-08-01
Light acquires or loses coherence, and coherence is one of the few optical observables. Spectra can be derived from coherence functions, and the understanding of any interferometric experiment also relies upon coherence functions. Beyond the two limiting cases (full coherence or incoherence), the coherence of light is always partial and changes with propagation. We have implemented a code to compute the propagation of partially coherent light from the source plane to the observation plane using parallel computing devices (PCDs). In this paper, we restrict propagation to free space only. To this end, we used the Open Computing Language (OpenCL) and the open-source toolkit PyOpenCL, which gives access to OpenCL parallel computation through Python. To test our code, we chose two coherence source models: an incoherent source and a Gaussian Schell-model source. In the former case, we considered two different source shapes: circular and rectangular. The results were compared to the theoretical values. Our implemented code allows one to choose between the PyOpenCL implementation and a standard one, i.e. using the CPU only. To test the computation time of each implementation (PyOpenCL and standard), we used several computer systems with different CPUs and GPUs. We used powers of two for the dimensions of the cross-spectral density matrix (e.g. 32⁴, 64⁴), and a significant speed increase is observed in the PyOpenCL implementation when compared to the standard one. This can be an important tool for studying new source models.
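A Gaussian Schell-model source, one of the two source models tested above, has a Gaussian spectral density and a Gaussian degree of coherence. A minimal pure-Python sketch of its cross-spectral density matrix follows (parameter values and names are illustrative assumptions, not the paper's):

```python
import math

def gsm_csd(xs, sigma_s=1.0, delta=0.5):
    """Cross-spectral density matrix W(x1, x2) of a 1-D Gaussian Schell-model
    source: W = sqrt(S(x1) S(x2)) * mu(x1 - x2), with Gaussian spectral
    density S and Gaussian degree of coherence mu (coherence width delta)."""
    def S(x):
        return math.exp(-x * x / (2.0 * sigma_s ** 2))
    def mu(dx):
        return math.exp(-dx * dx / (2.0 * delta ** 2))
    return [[math.sqrt(S(x1) * S(x2)) * mu(x1 - x2) for x2 in xs] for x1 in xs]

xs = [i * 0.1 - 1.6 for i in range(33)]   # 33-point source-plane grid
W = gsm_csd(xs)
```

Each matrix entry is independent of the others, which is why the construction (and the subsequent propagation integrals) maps naturally onto OpenCL work-items.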
Emergence of Anisotropy in Flock Simulations and Its Computational Analysis
Makiguchi, Motohiro; Inoue, Jun-Ichi
In real flocks, very recent extensive field studies revealed that the angular density of nearest neighbors shows a strong anisotropic structure of individuals [Ballerini et al., Proceedings of the National Academy of Sciences USA, 105, pp. 1232-1237 (2008)]. In this paper, we show that this anisotropic structure also emerges in an artificial flock simulation, namely, a Boid simulation. To quantify the anisotropy, we evaluate a useful statistic, the so-called γ-value, which is defined as the inner product between the vector in the direction of the lowest angular density of the flock and the vector in the direction of motion of the flock. Our results concerning the emergence of anisotropy through the γ-value might enable us to judge whether a flock simulation is realistic or not.
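The γ-value described above can be sketched as follows (my own minimal implementation, assuming neighbor positions are summarized as bearing angles; the histogram binning is an illustrative choice, not the authors'):

```python
import math

def gamma_value(neighbor_bearings, heading, n_bins=36):
    """Inner product between the unit vector toward the lowest angular
    density of nearest neighbors and the flock's (unit) heading vector.
    neighbor_bearings: bearings in radians; heading: unit 2-vector."""
    # Histogram the bearings to locate the emptiest angular sector.
    counts = [0] * n_bins
    for b in neighbor_bearings:
        counts[int((b % (2 * math.pi)) / (2 * math.pi) * n_bins) % n_bins] += 1
    k = min(range(n_bins), key=counts.__getitem__)
    theta = (k + 0.5) * 2 * math.pi / n_bins   # centre of the emptiest bin
    lowest = (math.cos(theta), math.sin(theta))
    return lowest[0] * heading[0] + lowest[1] * heading[1]
```

With neighbors concentrated to the sides of each individual, the lowest angular density lies along the direction of motion, so |γ| approaches 1; an isotropic flock gives γ-values scattered around 0.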
Some recent trends in computer simulations of aqueous double layers
International Nuclear Information System (INIS)
Spohr, E.
2003-01-01
Recent molecular simulations of the electric double layer between an aqueous and a metallic phase are reviewed. Several trends in the field can be identified: (i) the increasing use of ab initio simulation methods, most notably the Car-Parrinello method, makes it possible to combine a statistical mechanical description of the double layer with a description of elementary chemical processes on the electronic structure level; (ii) the application of free-energy methods in one and (recently) two dimensions to describe chemical reactivity within and beyond the framework of the Marcus theory of electron transfer; and (iii) at high concentrations, direct simulations of two-phase systems with an aqueous solution and a charged or uncharged solid phase or surface can model the entire double layer region.
From Architectural Acoustics to Acoustical Architecture Using Computer Simulation
DEFF Research Database (Denmark)
Schmidt, Anne Marie Due; Kirkegaard, Poul Henning
2005-01-01
acoustic design process and to set up a strategy to develop future programmes. The emphasis is put on the first three out of four phases in the working process of the architect, and a case study is carried out in which each phase is represented by typical results, as exemplified with reference to the design of Bagsvaerd Church by Jørn Utzon. The paper discusses the advantages and disadvantages of the programme in each phase compared to the works of architects not using acoustic simulation programmes. The conclusion of the paper points towards the need to apply the acoustic simulation programmes...
Computational Particle Dynamic Simulations on Multicore Processors (CPDMu) Final Report Phase I
Energy Technology Data Exchange (ETDEWEB)
Schmalz, Mark S
2011-07-24
Statement of Problem - Department of Energy has many legacy codes for simulation of computational particle dynamics and computational fluid dynamics applications that are designed to run on sequential processors and are not easily parallelized. Emerging high-performance computing architectures employ massively parallel multicore architectures (e.g., graphics processing units) to increase throughput. Parallelization of legacy simulation codes is a high priority, to achieve compatibility, efficiency, accuracy, and extensibility. General Statement of Solution - A legacy simulation application designed for implementation on mainly-sequential processors has been represented as a graph G. Mathematical transformations, applied to G, produce a graph representation G′ for a high-performance architecture. Key computational and data movement kernels of the application were analyzed/optimized for parallel execution using the mapping G → G′, which can be performed semi-automatically. This approach is widely applicable to many types of high-performance computing systems, such as graphics processing units or clusters comprised of nodes that contain one or more such units. Phase I Accomplishments - Phase I research decomposed/profiled computational particle dynamics simulation code for rocket fuel combustion into low and high computational cost regions (respectively, mainly sequential and mainly parallel kernels), with analysis of space and time complexity. Using the research team's expertise in algorithm-to-architecture mappings, the high-cost kernels were transformed, parallelized, and implemented on Nvidia Fermi GPUs. Measured speedups (GPU with respect to single-core CPU) were approximately 20-32× for realistic model parameters, without final optimization. Error analysis showed no loss of computational accuracy. Commercial Applications and Other Benefits - The proposed research will constitute a breakthrough in solution of problems related to efficient
Stochastic Simulation Service: Bridging the Gap between the Computational Expert and the Biologist.
Directory of Open Access Journals (Sweden)
Brian Drawert
2016-12-01
We present StochSS: Stochastic Simulation as a Service, an integrated development environment for modeling and simulation of both deterministic and discrete stochastic biochemical systems in up to three dimensions. An easy to use graphical user interface enables researchers to quickly develop and simulate a biological model on a desktop or laptop, which can then be expanded to incorporate increasing levels of complexity. StochSS features state-of-the-art simulation engines. As the demand for computational power increases, StochSS can seamlessly scale computing resources in the cloud. In addition, StochSS can be deployed as a multi-user software environment where collaborators share computational resources and exchange models via a public model repository. We demonstrate the capabilities and ease of use of StochSS with an example of model development and simulation at increasing levels of complexity.
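Discrete stochastic simulation of biochemical systems of the kind StochSS targets is classically done with Gillespie's direct method. A minimal sketch for a birth-death model follows (my own illustration of the algorithm, not StochSS code; rate values are arbitrary):

```python
import random

def gillespie_birth_death(k_birth, k_death, x0, t_end, seed=0):
    """Gillespie direct method for the birth-death system
       0 --k_birth--> X,   X --k_death--> 0.
    Returns the sampled trajectory as (time, copy number) pairs."""
    rng = random.Random(seed)
    t, x = 0.0, x0
    traj = [(t, x)]
    while t < t_end:
        a_birth = k_birth              # propensity of 0 -> X
        a_death = k_death * x          # propensity of X -> 0
        a_total = a_birth + a_death
        if a_total == 0.0:
            break
        t += rng.expovariate(a_total)  # exponentially distributed waiting time
        if rng.random() * a_total < a_birth:
            x += 1                     # birth event fires
        else:
            x -= 1                     # death event fires
        traj.append((t, x))
    return traj

traj = gillespie_birth_death(k_birth=10.0, k_death=1.0, x0=0, t_end=50.0)
```

For this model the stationary mean copy number is k_birth/k_death; long trajectories fluctuate around that value.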
International Nuclear Information System (INIS)
Moll, Nikolaj; Fuhrer, Andreas; Staar, Peter; Tavernelli, Ivano
2016-01-01
Quantum chemistry simulations on a quantum computer suffer from the overhead needed for encoding the Fermionic problem in a system of qubits. By exploiting the block diagonality of a Fermionic Hamiltonian, we show that the number of required qubits can be reduced while the number of terms in the Hamiltonian will increase. All operations for this reduction can be performed in operator space. The scheme is conceived as a pre-computational step that would be performed prior to the actual quantum simulation. We apply this scheme to reduce the number of qubits necessary to simulate both the Hamiltonian of the two-site Fermi–Hubbard model and the hydrogen molecule. Both quantum systems can then be simulated with a two-qubit quantum computer. Despite the increase in the number of Hamiltonian terms, the scheme still remains a useful tool to reduce the dimensionality of specific quantum systems for quantum simulators with a limited number of resources. (paper)
SNOW: a digital computer program for the simulation of ion beam devices
International Nuclear Information System (INIS)
Boers, J.E.
1980-08-01
A digital computer program, SNOW, has been developed for the simulation of dense ion beams. The program simulates the plasma expansion cup (but not the plasma source itself), the acceleration region, and a drift space with neutralization if desired. The ion beam is simulated by computing representative trajectories through the device. The potentials are simulated on a large rectangular matrix array which is solved by iterative techniques. Poisson's equation is solved at each point within the configuration using space-charge densities computed from the ion trajectories combined with background electron and/or ion distributions. The simulation methods are described in some detail along with examples of both axially-symmetric and rectangular beams. A detailed description of the input data is presented
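The iterative potential solve described above can be illustrated with a minimal Jacobi relaxation for Poisson's equation on a rectangular mesh (a textbook sketch, not SNOW's actual solver; grid size and charge values are made up):

```python
def jacobi_poisson(rho, v_boundary=0.0, h=1.0, n_iter=500):
    """Jacobi iteration for Poisson's equation (laplacian V = -rho) on a
    rectangular grid with a fixed (Dirichlet) boundary potential."""
    ny, nx = len(rho), len(rho[0])
    v = [[v_boundary] * nx for _ in range(ny)]
    for _ in range(n_iter):
        v_new = [row[:] for row in v]
        for i in range(1, ny - 1):
            for j in range(1, nx - 1):
                # five-point stencil average plus the local charge term
                v_new[i][j] = 0.25 * (v[i-1][j] + v[i+1][j] +
                                      v[i][j-1] + v[i][j+1] +
                                      h * h * rho[i][j])
        v = v_new
    return v

# Illustrative case: a unit charge at the centre of a small grounded box.
rho = [[0.0] * 9 for _ in range(9)]
rho[4][4] = 1.0
v = jacobi_poisson(rho)
```

In a code like SNOW this solve alternates with trajectory tracing: space-charge densities computed from the trajectories feed back into rho, and the coupled system is iterated to self-consistency.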
The design and calibration of a simulation model of a star computer network
Gomaa, H
1982-01-01
A simulation model of the CERN (European Organization for Nuclear Research) SPS star computer network is described. The model concentrates on simulating the message handling computer, through which all messages in the network pass. The paper describes the main features of the model, the transfer time parameters in the model, and how performance measurements were used to assist in the calibration of the model.
Computed tomography of von Meyenburg complex simulating micro-abscesses
International Nuclear Information System (INIS)
Sada, P.N.; Ramakrishna, B.
1994-01-01
A case is presented of a bile duct hamartoma in a 44-year-old man being evaluated for abdominal pain. The computed tomography (CT) findings suggested micro-abscesses in the liver, and a CT-guided tru-cut biopsy showed von Meyenburg complex. 9 refs., 3 figs
Simulation of Hooke's Joint on the Analog Computer.
Mitchell, Eugene E., Ed.
1980-01-01
A problem is presented that teaches the engineering student or practicing engineer the behavior of Hooke's joint, a widely used mechanism for transmitting rotary power in mechanical equipment. Also provided by this problem is an exercise in analog programing which utilizes nonlinear computer elements. (Author/CS)
Computer simulation of remote operations at nuclear power stations
International Nuclear Information System (INIS)
Lee, D.J.; Beaumont, F.R.
1993-01-01
A study incorporating an animated 3-D computer model of a remote recovery operation in a complex nuclear fuel handling environment has highlighted the following benefits: a significant reduction in time and cost, greatly improved recovery route evaluation consequently reducing the probability of collision, and greater operator awareness and confidence. (author)
Computer Simulation of Water-Ice Transition in Hydrophobic Nanopores
Czech Academy of Sciences Publication Activity Database
Slovák, Jan; Tanaka, H.; Koga, K.; Zeng, X. C.
2001-01-01
Roč. 292 (2001), s. 87-101. ISSN 0378-4371. Institutional research plan: CEZ:AV0Z4072921. Keywords: computer; water-ice transition; hydrophobic nanopores. Subject RIV: CF - Physical; Theoretical Chemistry. Impact factor: 1.295, year: 2001
Simulating elastic light scattering using high performance computing methods
Hoekstra, A.G.; Sloot, P.M.A.; Verbraeck, A.; Kerckhoffs, E.J.H.
1993-01-01
The Coupled Dipole method, as originally formulated by Purcell and Pennypacker, is a very powerful method to simulate the Elastic Light Scattering from arbitrary particles. This method, which is a particle simulation model for Computational Electromagnetics, has one major drawback: if the size of the
A Generalized Computer Simulation Language for Naval Systems Modeling.
1981-06-30
A. B., "GASP", Encyclopedia of Computer Science and Technology, J. Belzer, A. G. Holzman, and A. Kent, Editors, Vol. 8, Marcel Dekker, Inc., 1977.
Low-complexity computer simulation of multichannel room impulse responses
Martínez Castañeda, J.A.
2013-01-01
The "telephone" model has been, for the last one hundred and thirty years, the basis of modern telecommunications, with virtually no changes in its fundamental concept. The rise of smaller and more powerful computing devices has opened new possibilities. For example, to build systems able to give to
Computer Simulations for Lab Experiences in Secondary Physics
Murphy, David Shannon
2012-01-01
Physical science instruction often involves modeling natural systems, such as electricity that possess particles which are invisible to the unaided eye. The effect of these particles' motion is observable, but the particles are not directly observable to humans. Simulations have been developed in physics, chemistry and biology that, under certain…
Computer simulation of quantum phenomena in nano-scale devices
Raedt, Hans De
1996-01-01
This paper reviews the general concepts for building algorithms to solve the time-dependent Schrödinger equation and discusses ways of turning these concepts into unconditionally stable, accurate and efficient simulation algorithms. Applications to focussed electron emission from nano-scale
Computer Simulations Imply Forelimb-Dominated Underwater Flight in Plesiosaurs.
Directory of Open Access Journals (Sweden)
Shiqiu Liu
2015-12-01
Plesiosaurians are an extinct group of highly derived Mesozoic marine reptiles with a global distribution that spans 135 million years from the Early Jurassic to the Late Cretaceous. During their long evolutionary history they maintained a unique body plan with two pairs of large wing-like flippers, but their locomotion has been a topic of debate for almost 200 years. Key areas of controversy have concerned the most efficient biologically possible limb stroke, e.g. whether it consisted of rowing, underwater flight, or modified underwater flight, and how the four limbs moved in relation to each other: did they move in or out of phase? Previous studies have investigated plesiosaur swimming using a variety of methods, including skeletal analysis, human swimmers, and robotics. We adopt a novel approach using a digital, three-dimensional, articulated, free-swimming plesiosaur in a simulated fluid. We generated a large number of simulations under various joint degrees of freedom to investigate how the locomotory repertoire changes under different parameters. Within the biologically possible range of limb motion, the simulated plesiosaur swims primarily with its forelimbs using an unmodified underwater flight stroke, essentially the same as turtles and penguins. In contrast, the hindlimbs provide relatively weak thrust in all simulations. We conclude that plesiosaurs were forelimb-dominated swimmers that used their hind limbs mainly for maneuverability and stability.
Computer simulation of superionic conductors: II. Cationic conductors. Review
International Nuclear Information System (INIS)
Ivanov-Shitz, A. K.
2007-01-01
The state of the art of the molecular-dynamics simulation of superionic conductors is reviewed. The main studies devoted to the structural, dynamic, and transport properties of the basic classes of solid electrolytes with conductivity via silver, copper, lithium, sodium, and hydrogen cations are considered. The premelting effect in ionic crystals is discussed
Computer simulations of small semiconductor and metal clusters
International Nuclear Information System (INIS)
Andreoni, W.
1991-01-01
A brief survey is presented of recent simulations of small clusters, made with both ab-initio and classical approaches, with particular emphasis on the application of the Car-Parrinello method. The discussion mainly focusses on the structural properties of a variety of materials and on the effects of temperature. (orig.)
Computer Simulation in Manufacturing Technology: A Case Study.
Young, La Verne H.; And Others
1991-01-01
A sample of 72 students tested manufacturing simulation software about production planning. The package enriched learning, although compared to lecture only, no clear-cut advantages were perceived. Amount of time between lecture and lab experience did not affect learning or retention. (SK)
Computer simulations of 3d Lorentzian quantum gravity
Ambjørn, J.; Jurkiewicz, J.; Loll, R.
2000-01-01
We investigate the phase diagram of non-perturbative three-dimensional Lorentzian quantum gravity with the help of Monte Carlo simulations. The system has a first-order phase transition at a critical value k0^c of the bare inverse gravitational coupling constant k0. For k0 > k0^c the system
Defect Detection in Composite Coatings by Computational Simulation Aided Thermography
Almeida, R. M.; Souza, M. P. V.; Rebello, J. M. A.
2010-02-01
Thermography is based on the measurement of the surface temperature distribution of an inspected object subjected to excitation, normally thermal heating. This measurement is performed with a thermographic camera that detects the infrared radiation emitted by every object. In this work, thermography was simulated with the COMSOL software in order to optimize the experimental parameters for the inspection of composite material coatings.
Computation and Simulation of Circuit Topology Describing Secular ...
African Journals Online (AJOL)
A circuit topology for the simulation of an artificial coupled differential equation of the secular equilibrium decay of strontium was designed. The equilibrium decay considered was;. An integrating time constant of 33 milliseconds was chosen so as to minimize integrating error, and a maximum input voltage level of 10 V was chosen for ...
Computer Simulation in the Teaching of Translation and International Studies.
Brecht, Richard D.; And Others
1984-01-01
Describes the National Simulation in International Studies and Translation Program which links international studies and foreign languages programs at a number of universities. This program provides a natural context for the exercise of translation for the language student and an authenticity of experience for students of international politics.…
Computer-aided simulation and design of nanofiltration processes.
Noronha, Mohan; Mavrov, Valko; Chmiel, Horst
2003-03-01
The modelling of membrane filtration processes is often performed by applying black-box models or short-cut methods, because of the complexity of the molecular interactions on and inside the membrane. The assumptions made for short-cut methods can be applied with accuracy to reverse osmosis processes, whereas the simulation of nanofiltration can lead to unreliable results that sometimes deviate from real conditions to a great extent. A steady-state process simulation, NF-PROJECT, based on input information from membrane characterization, was developed (isothermal operation). The individual separation characteristics of each membrane element are calculated in an iterative sequence, illustrating the successive reduction in permeability and rejection between the elements arranged inside the pressure vessel. The simulation provides information on the increasing feed concentration and osmotic pressure, the hydraulic pressure loss, the deterioration of the flow conditions in the vessel, and the joint performance of the membrane elements to be analyzed. Taking an example from a practical application, a two-stage nanofiltration pilot plant was simulated, the results of which are presented in this article. Examples of optimization potentials are illustrated for the target criteria of economic efficiency (specific energy costs), permeate quality, and flow.
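The element-by-element iterative sequence described above can be sketched as a simple series march in which each element removes low-concentration permeate, so the retentate flow drops and its concentration rises along the pressure vessel (per-element recovery and rejection held constant here for illustration; these are assumed parameters, not NF-PROJECT's membrane characterization):

```python
def nf_cascade(feed_flow, feed_conc, elements, recovery=0.10, rejection=0.95):
    """March through membrane elements in series, returning the
    (retentate flow, retentate concentration) profile along the vessel."""
    profile = []
    flow, conc = feed_flow, feed_conc
    for _ in range(elements):
        perm_flow = recovery * flow                 # permeate drawn off
        perm_conc = (1.0 - rejection) * conc        # permeate quality
        ret_flow = flow - perm_flow
        # solute mass balance across the element
        ret_conc = (flow * conc - perm_flow * perm_conc) / ret_flow
        profile.append((ret_flow, ret_conc))
        flow, conc = ret_flow, ret_conc
    return profile

profile = nf_cascade(feed_flow=10.0, feed_conc=1.0, elements=6)
```

The monotonically rising retentate concentration is what drives the increasing osmotic pressure and deteriorating flow conditions that the full simulation tracks, together with the hydraulic pressure loss between elements.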
Jia, Qing-Shan
2012-01-01
The dynamics of many systems nowadays follow not only physical laws but also man-made rules. These systems are known as discrete event dynamic systems and their performances can be accurately evaluated only through simulations. Existing studies on simulation-based optimization (SBO) usually assume deterministic simulation time for each replication. However, in many applications such as evacuation, smoke detection, and territory exploration, the simulation time is stochastic due to the randomn...
The adaptation method in the Monte Carlo simulation for computed tomography
Energy Technology Data Exchange (ETDEWEB)
Lee, Hyoung Gun; Yoon, Chang Yeon; Lee, Won Ho [Dept. of Bio-convergence Engineering, Korea University, Seoul (Korea, Republic of); Cho, Seung Ryong [Dept. of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Park, Sung Ho [Dept. of Neurosurgery, Ulsan University Hospital, Ulsan (Korea, Republic of)
2015-06-15
The patient dose incurred from diagnostic procedures during advanced radiotherapy has become an important issue. Many researchers in medical physics are using computational simulations to calculate complex parameters in experiments. However, extended computation times make it difficult for personal computers to run the conventional Monte Carlo method to simulate radiological images with high-flux photons such as images produced by computed tomography (CT). To minimize the computation time without degrading imaging quality, we applied a deterministic adaptation to the Monte Carlo calculation and verified its effectiveness by simulating CT image reconstruction for an image evaluation phantom (Catphan; Phantom Laboratory, New York NY, USA) and a human-like voxel phantom (KTMAN-2) (Los Alamos National Laboratory, Los Alamos, NM, USA). For the deterministic adaptation, the relationship between iteration numbers and the simulations was estimated, and the option to simulate scattered radiation was evaluated. The processing times of simulations using the adaptive method were at least 500 times faster than those using a conventional statistical process. In addition, compared with the conventional statistical method, the adaptive method provided images that were more similar to the experimental images, which proved that the adaptive method was highly effective for a simulation that requires a large number of iterations. Assuming no radiation scattering in the vicinity of the detectors minimized artifacts in the reconstructed image.
What do we want from computer simulation of SIMS using clusters?
International Nuclear Information System (INIS)
Webb, R.P.
2008-01-01
Computer simulation of energetic cluster interactions with surfaces has provided much needed insight into some of the complex processes which occur and are responsible for the desirable as well as undesirable effects which make the use of clusters in SIMS both useful and challenging. Simulations have shown how cluster impacts can cause meso-scale motion of the target material which can result in the relatively gentle up-lift of large intact molecules adsorbed on the surface, in contrast to the behaviour of single atom impacts, which tend to create discrete motion in the surface, often ejecting fragments of adsorbed molecules instead. With the insight provided by simulations, experimentalists can then improve their equipment to best maximise the desired effects. The past 40 years have seen great progress in simulation techniques and computer equipment. 40 years ago, simulations were performed on simple atomic systems of around 300 atoms, employing only simple pair-wise interaction potentials, to times of several hundred femtoseconds. Currently, simulations can be performed on large organic materials employing many-body potentials for millions of atoms for times of many picoseconds. These simulations, however, can take several months of computation time. Even with the degree of realism introduced with these long time simulations, they are still not perfect and are often not capable of being used in a completely predictive way. Computer simulation is reaching a position whereby any more effort to increase its realism will make it completely intractable to solve in a reasonable time frame, and yet there is an increasing demand from experimentalists for something that can help in a predictive way in experiment design and interpretation. This paper will discuss the problems of computer simulation, what might be possible to achieve in the short term, what is unlikely ever to be possible without a major new breakthrough, and how we might exploit the meso-scale effects in
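The pair-wise potentials mentioned for the earliest ~300-atom simulations can be sketched in a few lines. This is an illustrative example only, in reduced units with arbitrary parameters, not any SIMS code: a velocity Verlet integration of the relative separation of two atoms interacting through a Lennard-Jones pair potential.

```python
# Illustrative sketch (reduced units, arbitrary parameters): the simplest
# pair-wise-potential molecular dynamics, of the kind early SIMS simulations
# used before many-body potentials became standard.
import math

EPS, SIG = 1.0, 1.0  # Lennard-Jones well depth and length scale

def lj_force(r):
    """Force on the relative coordinate at separation r (positive = repulsive)."""
    sr6 = (SIG / r) ** 6
    return 24.0 * EPS * (2.0 * sr6 ** 2 - sr6) / r

def lj_energy(r):
    """Lennard-Jones pair potential energy."""
    sr6 = (SIG / r) ** 6
    return 4.0 * EPS * (sr6 ** 2 - sr6)

def velocity_verlet(r, v, dt, steps, m=0.5):
    """Integrate the two-atom relative separation (reduced mass m)."""
    a = lj_force(r) / m
    for _ in range(steps):
        r += v * dt + 0.5 * a * dt * dt   # position update
        a_new = lj_force(r) / m           # force at the new position
        v += 0.5 * (a + a_new) * dt       # velocity update (averaged force)
        a = a_new
    return r, v

# Start slightly outside the potential minimum at r0 = 2**(1/6) * SIG,
# so the pair oscillates about its equilibrium separation.
r0 = 2.0 ** (1.0 / 6.0) * SIG
r_final, v_final = velocity_verlet(r0 + 0.05, 0.0, dt=1e-3, steps=1000)
print(r_final, v_final)
```

Scaling this scheme to millions of atoms with many-body potentials is what drives the multi-month computation times the abstract describes.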
Computer simulation of laboratory leaching and washing of tank waste sludges
International Nuclear Information System (INIS)
Meng, C.D.; MacLean, G.T.; Landeene, B.C.
1994-01-01
The process simulator ESP (Environmental Simulation Program) was used to simulate laboratory caustic leaching and washing of core samples from Tanks B-110, C-109, and C-112. The results of the laboratory tests and the computer simulations are compared. The results from both agreed reasonably well for elements contained in solid phases included in the ESP Public data bank. The use of the GEOCHEM data bank and/or a custom Hanford data bank should improve the agreement, making ESP a useful process simulator for aqueous-based processing
Grgurina, Natasa; van Veen, Klaas; Barendsen, Erik; Zwaneveld, Bert; Suhre, Cor; Gal-Ezer, Judith; Sentance, Sue; Vahrenhold, Jan
2015-01-01
Computational Thinking (CT) is gaining a lot of attention in education. We explored how to discern the occurrences of CT in the projects of 12th grade high school students in the computer science (CS) course. Within the projects, they constructed models and ran simulations of phenomena from other
Conversational Simulation in Computer-Assisted Language Learning: Potential and Reality.
Coleman, D. Wells
1988-01-01
Addresses the potential of conversational simulations for computer-assisted language learning (CALL) and reasons why this potential is largely untapped. Topics discussed include artificial intelligence; microworlds; parsing; realism versus reality in computer software; intelligent tutoring systems; and criteria to clarify what kinds of CALL…
Simulation of Hierarchical Resource Management for Meta-computing Systems
Santoso, J.; van Albada, G.D.; Sloot, P.M.A.; Nazief, B.A.A.
2000-01-01
Optimal scheduling in meta-computing environments is still an open research question. Various resource management (RM) architectures have been proposed in the literature. In the present paper we explore, through simulation, various multi-level scheduling strategies for compound computing environments
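The kind of multi-level scheduling experiment the abstract describes can be sketched with a toy model. This is an assumption-laden illustration, not the paper's simulator: a meta-level scheduler places each incoming job on one of several cluster queues, and we compare a least-loaded placement policy against random placement by the resulting makespan.

```python
# Toy sketch (hypothetical model, not the paper's simulator): meta-level
# job placement across cluster queues under two policies.
import random

def simulate(job_times, n_clusters, policy, rng):
    """Place each job on a cluster queue; return the makespan."""
    loads = [0.0] * n_clusters  # total queued work per cluster
    for t in job_times:
        if policy == "least-loaded":
            i = loads.index(min(loads))  # meta-scheduler picks idlest cluster
        else:
            i = rng.randrange(n_clusters)  # blind random placement
        loads[i] += t
    return max(loads)  # finish time of the busiest cluster

rng = random.Random(7)
jobs = [rng.uniform(1.0, 10.0) for _ in range(200)]  # arbitrary job lengths
mk_ll = simulate(jobs, 4, "least-loaded", random.Random(7))
mk_rnd = simulate(jobs, 4, "random", random.Random(7))
print(f"least-loaded makespan: {mk_ll:.1f}  random makespan: {mk_rnd:.1f}")
```

Even this toy shows why policy choice matters: least-loaded placement is provably within one maximum job length of the ideal balanced load, while random placement carries no such guarantee.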