Computer simulations applied in materials
Energy Technology Data Exchange (ETDEWEB)
NONE
2003-07-01
This workshop takes stock of the simulation methods applied to nuclear materials and discusses the conditions under which these methods can predict physical results when no experimental data are available. The main topic concerns radiation effects in oxides and also covers the behaviour of fission products in ceramics, diffusion and segregation phenomena, and thermodynamic properties under irradiation. This document brings together a report of the previous 2002 workshop and the transparencies of 12 of the 15 presentations given at the workshop: accommodation of uranium and plutonium in pyrochlores; radiation effects in La{sub 2}Zr{sub 2}O{sub 7} pyrochlores; first principle calculations of defects formation energies in the Y{sub 2}(Ti,Sn,Zr){sub 2}O{sub 7} pyrochlore system; an approximate approach to predicting radiation tolerant materials; molecular dynamics study of the structural effects of displacement cascades in UO{sub 2}; composition defect maps for A{sup 3+}B{sup 3+}O{sub 3} perovskites; NMR characterization of radiation damaged materials: using simulation to interpret the data; local structure in damaged zircon: a first principle study; simulation studies on SiC; insertion and diffusion of He in 3C-SiC; a review of helium in silica; self-trapped holes in amorphous silicon dioxide: their short-range structure revealed from electron spin resonance and optical measurements and opportunities for inferring intermediate range structure by theoretical modelling. (J.S.)
Advanced line heating system applying FEM computer simulation
Energy Technology Data Exchange (ETDEWEB)
Ishiyama, M.; Tango, Y. [Ishikawajima-Harima Heavy Industries Co. Ltd., Tokyo (Japan)
2000-01-01
Line heating is a key technique to form the curvature of steel hull plates in shipbuilding, and only experts can skillfully perform line heating. However, the accuracy of conventional line heating is not always well controlled. IHI has established an accurate database for the heating and forming relationship based on parametric experiments and FEM analyses on simple heating lines. It has been confirmed that distribution of inherent strains induced in a plate by flattening the objective curvature, which is defined based on elastic FEM simulation, can be assimilated using the database. This is used for heating process planning for the NC line heating machine with a high frequency induction heater, and facilitates automated thermal forming. (author)
Building Energy Assessment and Computer Simulation Applied to Social Housing in Spain
Directory of Open Access Journals (Sweden)
Juan Aranda
2018-01-01
Full Text Available The actual energy consumption and simulated energy performance of a building usually differ. This gap widens in social housing, owing to the characteristics of these buildings and the consumption patterns of economically vulnerable households affected by energy poverty. The aim of this work is to characterise the energy poverty of households representative of those residing in social housing, specifically in blocks of apartments in Southern Europe. The main variables that affect energy consumption and costs are analysed, and the models developed for software energy-performance simulations (which are applied to predict energy consumption in social housing) are validated against actual energy-consumption values. The results demonstrate that this type of household usually lives in surroundings at a temperature below the average thermal comfort level. We have taken into account that assuming a standard thermal comfort level may lead to significant differences between computer-aided building energy simulation and actual consumption data (which are 40–140% lower than simulated consumption). This fact is of integral importance, as we use computer simulation to predict building energy performance in social housing.
Boudreau, Joseph F; Bianchi, Riccardo Maria
2018-01-01
Applied Computational Physics is a graduate-level text stressing three essential elements: advanced programming techniques, numerical analysis, and physics. The goal of the text is to provide students with essential computational skills that they will need in their careers, and to increase the confidence with which they write computer programs designed for their problem domain. The physics problems give them an opportunity to reinforce their programming skills, while the acquired programming skills augment their ability to solve physics problems. The C++ language is used throughout the text. Physics problems include Hamiltonian systems, chaotic systems, percolation, critical phenomena, few-body and multi-body quantum systems, quantum field theory, simulation of radiation transport, and data modeling. The book, the fruit of a collaboration between a theoretical physicist and an experimental physicist, covers a broad range of topics from both viewpoints. Examples, program libraries, and additional documentatio...
Lugovsky, A. Yu.; Popov, Yu. P.
2015-08-01
The Roe-Einfeldt-Osher scheme, which is third-order accurate, is considered. Its advantages over the first-order accurate Roe scheme are demonstrated, and its choice for the simulation of accretion disk flows is justified. The Roe-Einfeldt-Osher scheme is shown to be efficient as applied to the simulation of real-world problems on parallel computers. Results of simulations of two- and three-dimensional flows in accretion disks are presented. The limited capabilities of two-dimensional disk models are noted.
Simulated annealing applied to IMRT beam angle optimization: A computational study.
Dias, Joana; Rocha, Humberto; Ferreira, Brígida; Lopes, Maria do Carmo
2015-11-01
Selecting irradiation directions to use in IMRT treatments is one of the first decisions to make in treatment planning. Beam angle optimization (BAO) is a difficult problem to tackle from the mathematical optimization point of view. It is highly non-convex, and optimization approaches based on gradient descent will probably get trapped in one of the many local minima. Simulated Annealing (SA) is a probabilistic local search procedure that is known to be able to deal with multimodal problems. SA for BAO was retrospectively applied to ten clinical examples of treated head-and-neck tumors, signalled as complex cases where proper target coverage and organ sparing proved difficult to achieve. The number of directions to use was considered fixed and equal to 5 or 7. It is shown that SA can lead to solutions that significantly improve organ sparing, even with a reduced number of angles, without jeopardizing tumor coverage. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
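The search described in the abstract above can be illustrated with a minimal simulated-annealing sketch. The objective function here is only a stand-in (it rewards evenly spread beam angles), not a clinical dose metric; the function names, cooling schedule, and perturbation move are all illustrative assumptions, not taken from the paper.

```python
import math
import random

def objective(angles):
    # Stand-in cost: zero when beams are evenly spread around 360 degrees.
    s = sorted(angles)
    n = len(s)
    gaps = [(s[(i + 1) % n] - s[i]) % 360 for i in range(n)]
    return sum((g - 360 / n) ** 2 for g in gaps)

def simulated_annealing(n_beams=5, steps=2000, t0=100.0, seed=1):
    rng = random.Random(seed)
    current = [rng.uniform(0, 360) for _ in range(n_beams)]
    cur_val = objective(current)
    best, best_val = list(current), cur_val
    for k in range(steps):
        t = t0 * (1 - k / steps) + 1e-9                     # linear cooling
        cand = list(current)
        cand[rng.randrange(n_beams)] = rng.uniform(0, 360)  # perturb one angle
        cand_val = objective(cand)
        # Metropolis rule: keep improvements; occasionally accept worse moves
        if cand_val < cur_val or rng.random() < math.exp((cur_val - cand_val) / t):
            current, cur_val = cand, cand_val
            if cur_val < best_val:
                best, best_val = list(current), cur_val
    return best, best_val
```

Because acceptance of worse moves decays with the temperature, the search can escape the local minima that a pure gradient descent would get stuck in.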
Applied Parallel Computing Industrial Computation and Optimization
DEFF Research Database (Denmark)
Madsen, Kaj; Olesen, Dorte
Proceedings of the Third International Workshop on Applied Parallel Computing in Industrial Problems and Optimization (PARA96).
Directory of Open Access Journals (Sweden)
Takashi Watanabe
2010-01-01
Full Text Available A feedback error-learning (FEL) controller, which consists of a proportional-integral-derivative (PID) controller and an artificial neural network (ANN), is applicable to functional electrical stimulation (FES). Because of integral (reset) windup, however, delay or overshoot sometimes occurred in feedback FES control, which was considered to cause inappropriate ANN learning and to limit the feasibility of the FEL controller for FES to controlling 1-DOF movements by stimulating 2 muscles. In this paper, an FEL-FES controller was developed by applying an anti-reset windup (ARW) scheme that works on the total controller output. The FEL-FES controller with the ARW was examined in controlling 2-DOF movements of the wrist joint by stimulating 4 muscles in computer simulation. The developed FEL-FES controller was found to appropriately realize an inverse dynamics model and has the potential to be used as an open-loop controller. The developed controller is expected to be effective in controlling multiple-DOF movements by stimulating several muscles.
Applied Computational Transonic Aerodynamics,
1982-08-01
Viviand, Henri, Formes conservatives des équations de la dynamique des gaz, La Recherche Aérospatiale, 1974-1, p. 65-68. 12. Aris, Rutherford, Vectors...Alto, Ca., June 1981. 50. Roach, R.L., and Sankar, N.L., The strongly implicit procedure applied to the flow field of transonic turbine cascades, AIAA...NASA CR-2729, July 1977. 135. Camarero, R., and Younis, M., Generation of body-fitted coordinates for turbine cascades using multigrid, AIAA Paper No. 79
Simulation of quantum computers
De Raedt, H; Michielsen, K; Hams, AH; Miyashita, S; Saito, K; Landau, DP; Lewis, SP; Schuttler, HB
2001-01-01
We describe a simulation approach to study the functioning of Quantum Computer hardware. The latter is modeled by a collection of interacting spin-1/2 objects. The time evolution of this spin system maps one-to-one to a quantum program carried out by the Quantum Computer. Our simulation software
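The one-to-one mapping described above amounts to state-vector simulation: the register is an array of 2**n complex amplitudes, and each gate updates pairs of amplitudes. A minimal sketch with illustrative names (not the portable software from the paper):

```python
import math

def apply_single_qubit_gate(state, gate, target):
    """Apply a 2x2 `gate` to qubit `target` (0 = least significant bit)."""
    step = 1 << target
    new_state = list(state)
    for i in range(len(state)):
        if (i & step) == 0:              # i has target bit 0; j is its partner
            j = i | step
            a, b = state[i], state[j]
            new_state[i] = gate[0][0] * a + gate[0][1] * b
            new_state[j] = gate[1][0] * a + gate[1][1] * b
    return new_state

h = 1 / math.sqrt(2)
H = [[h, h], [h, -h]]                    # Hadamard gate

state = [1 + 0j, 0j, 0j, 0j]             # two-qubit register in |00>
state = apply_single_qubit_gate(state, H, 0)
state = apply_single_qubit_gate(state, H, 1)
# state is now the uniform superposition: every amplitude equals 1/2
```

The memory cost of the full amplitude vector (2**n complex numbers) is exactly why such simulators are run on large parallel machines.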
Applied Computational Mathematics in Social Sciences
Damaceanu, Romulus-Catalin
2010-01-01
Applied Computational Mathematics in Social Sciences adopts a modern scientific approach that combines knowledge from mathematical modeling with various aspects of social science. Special algorithms can be created to simulate an artificial society, and a detailed analysis can subsequently be used to project social realities. This e-book specifically deals with computations using the NetLogo platform, and is intended for researchers interested in advanced human geography and mathematical modeling studies.
FPGA-accelerated simulation of computer systems
Angepat, Hari; Chung, Eric S; Hoe, James C; Chung, Eric S
2014-01-01
To date, the most common simulators of computer systems are software-based, running on standard computers. One promising approach to improve simulation performance is to apply hardware, specifically reconfigurable hardware in the form of field-programmable gate arrays (FPGAs). This manuscript describes various approaches to using FPGAs to accelerate software-implemented simulation of computer systems, along with selected simulators that incorporate those techniques. More precisely, we describe a simulation architecture taxonomy that incorporates a simulation architecture specifically designed f
Glowinski, R; Kuznetsov, Y A; Periaux, Jacques; Neittaanmaki, Pekka; Pironneau, Olivier
2010-01-01
Standing at the intersection of mathematics and scientific computing, this collection of state-of-the-art papers in nonlinear PDEs examines their applications to subjects as diverse as dynamical systems, computational mechanics, and the mathematics of finance.
Applied Mathematics, Modelling and Computational Science
Kotsireas, Ilias; Makarov, Roman; Melnik, Roderick; Shodiev, Hasan
2015-01-01
The Applied Mathematics, Modelling, and Computational Science (AMMCS) conference aims to promote interdisciplinary research and collaboration. The contributions in this volume cover the latest research in mathematical and computational sciences, modeling, and simulation as well as their applications in natural and social sciences, engineering and technology, industry, and finance. The 2013 conference, the second in a series of AMMCS meetings, was held August 26–30 and organized in cooperation with AIMS and SIAM, with support from the Fields Institute in Toronto and Wilfrid Laurier University. There were many young scientists at AMMCS-2013, both as presenters and as organizers. This proceedings volume contains refereed papers contributed by participants of AMMCS-2013 after the conference. This volume is suitable for researchers and graduate students, mathematicians and engineers, industrialists, and anyone who would like to delve into the interdisciplinary research of applied and computational mathematics ...
1984-09-01
appearance of readiness to act in reduction of these needs, and they can PROVIDE AN ADEQUATE SETTING, as well as the means, for an immediate translation...first was 'acquisition and comprehension of knowledge.' Here, he stated that simulation-games were probably too expensive and time-consuming compared...questions were then translated into four views: anti-union, pro-employer, pro-union, and anti-employer (32:383). Out of the 16 paired comparisons, the
Computer Modeling and Simulation
Energy Technology Data Exchange (ETDEWEB)
Pronskikh, V. S. [Fermilab
2014-05-09
Verification and validation of computer codes and models used in simulation are two aspects of scientific practice of high importance that have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to a model's relation to the real world and its intended use. It has been argued that, because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model's general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction needs to be made between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them). Holding to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement, I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviating the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes.
Simulating chemistry using quantum computers.
Kassal, Ivan; Whitfield, James D; Perdomo-Ortiz, Alejandro; Yung, Man-Hong; Aspuru-Guzik, Alán
2011-01-01
The difficulty of simulating quantum systems, well known to quantum chemists, prompted the idea of quantum computation. One can avoid the steep scaling associated with the exact simulation of increasingly large quantum systems on conventional computers, by mapping the quantum system to another, more controllable one. In this review, we discuss to what extent the ideas in quantum computation, now a well-established field, have been applied to chemical problems. We describe algorithms that achieve significant advantages for the electronic-structure problem, the simulation of chemical dynamics, protein folding, and other tasks. Although theory is still ahead of experiment, we outline recent advances that have led to the first chemical calculations on small quantum information processors.
Applied computation and security systems
Saeed, Khalid; Choudhury, Sankhayan; Chaki, Nabendu
2015-01-01
This book contains the extended versions of the works that were presented and discussed in the First International Doctoral Symposium on Applied Computation and Security Systems (ACSS 2014), held during April 18-20, 2014 in Kolkata, India. The symposium was jointly organized by the AGH University of Science & Technology, Cracow, Poland and the University of Calcutta, India. Volume I of this double-volume book contains fourteen high-quality book chapters in three different parts. Part 1, on Pattern Recognition, presents four chapters. Part 2, on Imaging and Healthcare Applications, contains four more chapters. Part 3 of this volume, on Wireless Sensor Networking, includes as many as six chapters. Volume II of the book has three parts presenting a total of eleven chapters. Part 4 consists of five excellent chapters on Software Engineering, ranging from cloud service design to transactional memory. Part 5 in Volume II is on Cryptography with two book...
Energy Technology Data Exchange (ETDEWEB)
Cozin, Cristiane; Lueders, Ricardo; Morales, Rigoberto E.M. [Universidade Tecnologica Federal do Parana (UTFPR), Curitiba, PR (Brazil). Dept. de Engenharia Mecanica
2008-07-01
In recent years, computer clusters have emerged as a real alternative for solving problems that require high-performance computing. Consequently, the development of new applications has been driven. Among them, flow simulation represents a real computational burden, especially for large systems. This work presents a study of the use of parallel computing for numerical fluid flow simulation in pipelines. A mathematical flow model is solved numerically. In general, this procedure leads to a tridiagonal system of equations suitable to be solved by a parallel algorithm. In this work, this is accomplished by a parallel odd-even reduction method found in the literature, implemented in the Fortran programming language. A computational platform composed of twelve processors was used. Measurements of CPU time for different tridiagonal system sizes and numbers of processors were obtained, highlighting the communication time between processors as an important issue to be considered when evaluating the performance of parallel applications. (author)
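For context, tridiagonal systems like those mentioned above are classically solved by the sequential Thomas algorithm, which is the baseline that parallel odd-even reduction is designed to replace. A minimal sketch of that sequential baseline (not the parallel Fortran implementation from the paper):

```python
def thomas_solve(a, b, c, d):
    """Solve a tridiagonal system Ax = d, where a is the sub-diagonal
    (a[0] unused), b the main diagonal, and c the super-diagonal
    (c[-1] unused)."""
    n = len(b)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):                      # forward elimination
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):             # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x
```

Each step of the forward sweep depends on the previous one, which is exactly the serial dependency that odd-even (cyclic) reduction breaks up so the work can be distributed across processors.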
Massively parallel quantum computer simulator
De Raedt, K.; Michielsen, K.; De Raedt, H.; Trieu, B.; Arnold, G.; Richter, M.; Lippert, Th.; Watanabe, H.; Ito, N.
2007-01-01
We describe portable software to simulate universal quantum computers on massively parallel computers. We illustrate the use of the simulation software by running various quantum algorithms on different computer architectures, such as an IBM BlueGene/L, an IBM Regatta p690+, a Hitachi SR11000/J1, a Cray
Priority Queues for Computer Simulations
Steinman, Jeffrey S. (Inventor)
1998-01-01
The present invention is embodied in new priority queue data structures for event list management of computer simulations, and includes a new priority queue data structure and an improved event horizon applied to priority queue data structures. The new priority queue data structure is a Qheap and is made out of linked lists for robust, fast, reliable, and stable event list management; it uses a temporary unsorted list to store all items until one of the items is needed. Then the list is sorted, the highest priority item is removed, and the rest of the list is inserted in the Qheap. Also, an event horizon is applied to binary tree and splay tree priority queue data structures to form the improved event horizon for event management.
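The deferred-sorting idea at the heart of the Qheap can be sketched as follows. This is a simplified illustration that uses a binary heap for the sorted part rather than the patent's linked lists, and it omits the event-horizon mechanism; the class and method names are invented for the sketch.

```python
import heapq

class LazyEventQueue:
    def __init__(self):
        self._sorted = []      # heap of (time, event) already organized
        self._pending = []     # unsorted buffer of newly scheduled events

    def schedule(self, time, event):
        self._pending.append((time, event))   # O(1): no ordering work yet

    def next_event(self):
        if self._pending:                     # organize the buffer on demand
            for item in self._pending:
                heapq.heappush(self._sorted, item)
            self._pending.clear()
        return heapq.heappop(self._sorted)    # earliest event = highest priority

q = LazyEventQueue()
for t, e in [(3.0, "c"), (1.0, "a"), (2.0, "b")]:
    q.schedule(t, e)
# next_event() now sorts the pending buffer once and returns (1.0, "a")
```

Batching many O(1) insertions and paying the sorting cost once, only when an event is actually requested, is what makes this kind of structure attractive for event-driven simulation.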
Numerical simulation in applied geophysics
Santos, Juan Enrique
2016-01-01
This book presents the theory of wave propagation in a fluid-saturated porous medium (a Biot medium) and its application in Applied Geophysics. In particular, a derivation of absorbing boundary conditions in viscoelastic and poroelastic media is presented, which is later employed in the applications. The partial differential equations describing the propagation of waves in Biot media are solved using the Finite Element Method (FEM). Waves propagating in a Biot medium suffer attenuation and dispersion effects. In particular, the fast compressional and shear waves are converted to slow diffusion-type waves at mesoscopic-scale heterogeneities (on the order of centimeters), an effect usually occurring in the seismic range of frequencies. In some cases, a Biot medium presents a dense set of fractures oriented in preferred directions. When the average distance between fractures is much smaller than the wavelengths of the travelling fast compressional and shear waves, the medium behaves as an effective viscoelastic an...
Reversible simulation of irreversible computation
Li, Ming; Tromp, John; Vitányi, Paul
1998-09-01
Computer computations are generally irreversible while the laws of physics are reversible. This mismatch is penalized by, among other things, the generation of excess thermal entropy in the computation. Computing performance has improved to the extent that efficiency degrades unless all algorithms are executed reversibly, for example by a universal reversible simulation of irreversible computations. All known reversible simulations are either space hungry or time hungry. The leanest method was proposed by Bennett and can be analyzed using a simple ‘reversible’ pebble game. The reachable reversible-simulation instantaneous descriptions (pebble configurations) of such pebble games are characterized completely. As a corollary we obtain the reversible simulation by Bennett and, moreover, show that it is a space-optimal pebble game. We also introduce irreversible steps and give a theorem on the tradeoff between the number of allowed irreversible steps and the memory gain in the pebble game. In this resource-bounded setting the limited erasing needs to be performed at precise instants during the simulation. The reversible simulation can be modified so that it is applicable also when the simulated computation time is unknown.
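Bennett's strategy, viewed through the reversible pebble game described above, can be sketched directly: a pebble may be placed on or removed from position p only if position p-1 carries a pebble, and the recursion below reaches step 2**k while holding only about k pebbles (memory checkpoints) at once instead of 2**k. The function names are illustrative.

```python
def pebble(k, s, moves):
    """Advance: place a pebble on s + 2**k, given a pebble at s."""
    if k == 0:
        moves.append(("place", s + 1))
        return
    half = 2 ** (k - 1)
    pebble(k - 1, s, moves)           # reach the midpoint
    pebble(k - 1, s + half, moves)    # reach the endpoint
    unpebble(k - 1, s, moves)         # reversibly erase the midpoint

def unpebble(k, s, moves):
    """Retreat: remove the pebble on s + 2**k (exact reverse of pebble)."""
    if k == 0:
        moves.append(("remove", s + 1))
        return
    half = 2 ** (k - 1)
    pebble(k - 1, s, moves)           # re-derive the midpoint
    unpebble(k - 1, s + half, moves)  # erase the endpoint
    unpebble(k - 1, s, moves)         # erase the midpoint again

def run(k):
    """Play the game to step 2**k; return final pebbles and peak pebble count."""
    moves = []
    pebble(k, 0, moves)
    pebbles, peak = {0}, 1
    for op, p in moves:
        assert p - 1 in pebbles       # game rule: predecessor must be pebbled
        if op == "place":
            pebbles.add(p)
        else:
            pebbles.remove(p)
        peak = max(peak, len(pebbles))
    return pebbles, peak
```

The recursion trades time for space: re-deriving intermediate configurations costs extra moves, which is exactly the space-time tradeoff the paper analyzes.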
Applied modelling and computing in social science
Povh, Janez
2015-01-01
In social science outstanding results are yielded by advanced simulation methods, based on state of the art software technologies and an appropriate combination of qualitative and quantitative methods. This book presents examples of successful applications of modelling and computing in social science: business and logistic process simulation and optimization, deeper knowledge extractions from big data, better understanding and predicting of social behaviour and modelling health and environment changes.
Fel simulations using distributed computing
Einstein, J.; Biedron, S.G.; Freund, H.P.; Milton, S.V.; Van Der Slot, P. J M; Bernabeu, G.
2016-01-01
While simulation tools are available and have been used regularly for simulating light sources, including Free-Electron Lasers, the increasing availability and lower cost of accelerated computing opens up new opportunities. This paper highlights a method for accelerating and parallelizing code
Fluid simulation for computer graphics
Bridson, Robert
2008-01-01
Animating fluids like water, smoke, and fire using physics-based simulation is increasingly important in visual effects, in particular in movies, like The Day After Tomorrow, and in computer games. This book provides a practical introduction to fluid simulation for graphics. The focus is on animating fully three-dimensional incompressible flow, from understanding the math and the algorithms to the actual implementation.
Flow simulation and high performance computing
Tezduyar, T.; Aliabadi, S.; Behr, M.; Johnson, A.; Kalro, V.; Litke, M.
1996-10-01
Flow simulation is a computational tool for exploring science and technology involving flow applications. It can provide cost-effective alternatives or complements to laboratory experiments, field tests and prototyping. Flow simulation relies heavily on high performance computing (HPC). We view HPC as having two major components. One is advanced algorithms capable of accurately simulating complex, real-world problems. The other is advanced computer hardware and networking with sufficient power, memory and bandwidth to execute those simulations. While HPC enables flow simulation, flow simulation motivates development of novel HPC techniques. This paper focuses on demonstrating that flow simulation has come a long way and is being applied to many complex, real-world problems in different fields of engineering and applied sciences, particularly in aerospace engineering and applied fluid mechanics. Flow simulation has come a long way because HPC has come a long way. This paper also provides a brief review of some of the recently-developed HPC methods and tools that have played a major role in bringing flow simulation where it is today. A number of 3D flow simulations are presented in this paper as examples of the level of computational capability reached with recent HPC methods and hardware. These examples are: flow around a fighter aircraft, flow around two trains passing in a tunnel, large ram-air parachutes, flow over hydraulic structures, contaminant dispersion in a model subway station, airflow past an automobile, multiple spheres falling in a liquid-filled tube, and dynamics of a paratrooper jumping from a cargo aircraft.
Ritsch, Elmar; Froidevaux, Daniel; Salzburger, Andreas
One of the cornerstones for the success of the ATLAS experiment at the Large Hadron Collider (LHC) is a very accurate Monte Carlo detector simulation. However, a limit is being reached regarding the amount of simulated data which can be produced and stored with the computing resources available through the worldwide LHC computing grid (WLCG). The Integrated Simulation Framework (ISF) is a novel approach to detector simulation which enables a more efficient use of these computing resources and thus allows for the generation of more simulated data. Various simulation technologies are combined to allow for faster simulation approaches which are targeted at the specific needs of individual physics studies. Costly full simulation technologies are only used where high accuracy is required by physics analyses and fast simulation technologies are applied everywhere else. As one of the first applications of the ISF, a new combined simulation approach is developed for the generation of detector calibration samples ...
Computer Based Modelling and Simulation
Indian Academy of Sciences (India)
Computer Based Modelling and Simulation - Modelling Deterministic Systems. N K Srinivasan. General Article, Resonance – Journal of Science Education, Volume 6, Issue 3, March 2001, pp. 46-54.
Biomass Gasifier for Computer Simulation; Biomassa foergasare foer Computer Simulation
Energy Technology Data Exchange (ETDEWEB)
Hansson, Jens; Leveau, Andreas; Hulteberg, Christian [Nordlight AB, Limhamn (Sweden)
2011-08-15
This report is an effort to summarize the existing data on biomass gasifiers, as the authors have taken part in various projects aiming at computer simulations of systems that include biomass gasification. Reliable input data is paramount for any computer simulation, but so far there is no easily accessible biomass gasifier database available for this purpose. This study aims at benchmarking current and past gasifier systems in order to create a comprehensive database for computer simulation purposes. The result of the investigation is presented in a Microsoft Excel sheet, so that the user can easily implement the data in their specific model. In addition to providing simulation data, the technology is described briefly for every studied gasifier system. The primary pieces of information sought are temperatures, pressures, stream compositions and energy consumption. At present the resulting database contains 17 gasifiers, with one or more gasifiers within the different gasification technology types normally discussed in this context: 1. Fixed bed 2. Fluidised bed 3. Entrained flow. It also contains gasifiers in the range from 100 kW to 120 MW, with several gasifiers in between these two values. Finally, there are gasifiers representing both direct and indirect heating. This allows for a more qualified and better-informed choice of starting data sets for simulations. In addition, with multiple data sets available for several of the operating modes, sensitivity analysis of various inputs will improve the simulations performed. There have been fewer answers to the survey than expected, however, which could otherwise have improved the database further; the use of online sources and other public information has to some extent counterbalanced the low response frequency of the survey. Finally, the database is intended to be a living document, continuously updated with new gasifiers and improved information on existing gasifiers.
[Thoughts on and probes into computer simulation of acupuncture manipulation].
Hu, Yin'e; Liu, Tangyi; Tang, Wenchao; Xu, Gang; Gao, Ming; Yang, Huayuan
2011-08-01
Studies of the simulation of acupuncture manipulation mainly focus on mechanical simulation and virtual simulation. In mechanical simulation, the aim of the research is to develop instruments that simulate acupuncture manipulation and to apply them as a simulation of, or replacement for, manual acupuncture manipulation; virtual simulation applies virtual reality technology to present the manipulation in 3D in real time on the computer screen. This paper summarizes recent research progress on computer simulation of acupuncture manipulation at home and abroad, and concludes with the significance of, and open problems in, computer simulation of acupuncture manipulation. We therefore propose that research on manipulation simulation should pay close attention to the simulation of experts' manipulation, as well as to verification studies of conformity and clinical effects.
Evolutionary Games and Computer Simulations
Huberman, B A; Huberman, Bernardo A.; Glance, Natalie S.
1993-01-01
Abstract: The prisoner's dilemma has long been considered the paradigm for studying the emergence of cooperation among selfish individuals. Because of its importance, it has been studied through computer experiments as well as in the laboratory and by analytical means. However, there are important differences between the way a system composed of many interacting elements is simulated by a digital machine and the manner in which it behaves when studied in real experiments. In some instances, these disparities can be marked enough so as to cast doubt on the implications of cellular automata type simulations for the study of cooperation in social systems. In particular, if such a simulation imposes space-time granularity, then its ability to describe the real world may be compromised. Indeed, we show that the results of digital simulations regarding territoriality and cooperation differ greatly when time is discrete as opposed to continuous.
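The discrete-time cellular simulations at issue above can be sketched with a one-dimensional imitation dynamic: all sites play the prisoner's dilemma with their neighbors and then update synchronously, in lockstep. The payoff values and ring topology below are illustrative, not taken from the paper.

```python
# Standard PD payoffs to the row player: T=5 > R=3 > P=1 > S=0 (illustrative).
PAYOFF = {("C", "C"): 3, ("C", "D"): 0, ("D", "C"): 5, ("D", "D"): 1}

def step(strategies):
    """One synchronous (discrete-time) update on a ring of players."""
    n = len(strategies)
    # Each site's score is its total payoff against both neighbors.
    scores = [PAYOFF[(strategies[i], strategies[(i - 1) % n])] +
              PAYOFF[(strategies[i], strategies[(i + 1) % n])]
              for i in range(n)]
    new = []
    for i in range(n):
        # Imitation rule: copy the best-scoring member of the neighborhood
        # (ties broken by neighborhood order, keeping the update deterministic).
        nbhd = [(i - 1) % n, i, (i + 1) % n]
        best = max(nbhd, key=lambda j: scores[j])
        new.append(strategies[best])
    return new
```

Because every site updates at the same tick, outcomes can differ qualitatively from an asynchronous (effectively continuous-time) version in which sites update one at a time, which is the granularity effect the paper examines.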
Computer simulation of martensitic transformations
Energy Technology Data Exchange (ETDEWEB)
Xu, Ping [Univ. of California, Berkeley, CA (United States)
1993-11-01
The characteristics of martensitic transformations in solids are largely determined by the elastic strain that develops as martensite particles grow and interact. To study the development of microstructure, a finite-element computer simulation model was constructed to mimic the transformation process. The transformation is athermal and simulated at each incremental step by transforming the cell which maximizes the decrease in the free energy. To determine the free energy change, the elastic energy developed during martensite growth is calculated from the theory of linear elasticity for elastically homogeneous media, and updated as the transformation proceeds.
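The athermal greedy rule described above (transform whichever cell most lowers the free energy, stopping when no cell does) can be sketched with a stand-in energy model. The real simulation computes the elastic energy from linear elasticity via finite elements; the simple pairwise interaction function here is only an illustrative placeholder.

```python
def greedy_transform(chemical_gain, interaction, n):
    """Greedy athermal transformation over n cells.

    chemical_gain: free energy released by transforming one cell;
    interaction(i, j): elastic penalty between transformed cells i and j.
    """
    transformed = set()
    while True:
        best_cell, best_delta = None, 0.0
        for cell in range(n):
            if cell in transformed:
                continue
            # Net free-energy change of transforming this cell now.
            delta = -chemical_gain + sum(interaction(cell, j) for j in transformed)
            if delta < best_delta:            # only strict decreases qualify
                best_cell, best_delta = cell, delta
        if best_cell is None:                 # no cell lowers the energy: stop
            return transformed
        transformed.add(best_cell)
```

With a nearest-neighbor penalty larger than the chemical gain, the greedy rule naturally produces a spaced-out arrangement of transformed cells, a toy analogue of elastic interactions shaping the microstructure.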
A parallel computational model for GATE simulations.
Rannou, F R; Vega-Acevedo, N; El Bitar, Z
2013-12-01
GATE/Geant4 Monte Carlo simulations are computationally demanding applications, requiring thousands of processor hours to produce realistic results. The classical strategy of distributing the simulation of individual events does not apply efficiently for Positron Emission Tomography (PET) experiments, because it requires a centralized coincidence processing and large communication overheads. We propose a parallel computational model for GATE that handles event generation and coincidence processing in a simple and efficient way by decentralizing event generation and processing but maintaining a centralized event and time coordinator. The model is implemented with the inclusion of a new set of factory classes that can run the same executable in sequential or parallel mode. A Mann-Whitney test shows that the output produced by this parallel model in terms of number of tallies is equivalent (but not equal) to its sequential counterpart. Computational performance evaluation shows that the software is scalable and well balanced. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
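The Mann-Whitney comparison mentioned above reduces to the U statistic. A minimal pure-Python sketch of that statistic (half-credit for ties, no normal approximation or p-value; illustrative only, not the test implementation used in the paper):

```python
def mann_whitney_u(x, y):
    """U statistic: count of pairs (xi, yj) with xi > yj, ties count 0.5."""
    u = 0.0
    for xi in x:
        for yj in y:
            if xi > yj:
                u += 1.0
            elif xi == yj:
                u += 0.5
    return u
```

When the two samples come from the same distribution, U is close to len(x) * len(y) / 2, which is the sense in which the parallel and sequential tally outputs can be "equivalent but not equal".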
Computer Simulations of Space Plasmas
Goertz, C. K.
Even a superficial scanning of the latest issues of the Journal of Geophysical Research reveals that numerical simulation of space plasma processes is an active and growing field. The complexity and sophistication of numerically produced “data” rivals that of the real stuff. Sometimes numerical results need interpretation in terms of a simple “theory,” very much as the results of real experiments and observations do. Numerical simulation has indeed become a third independent tool of space physics, somewhere between observations and analytic theory. There is thus a strong need for textbooks and monographs that report the latest techniques and results in an easily accessible form. This book is an attempt to satisfy this need. The editors want it not only to be “proceedings of selected lectures (given) at the first ISSS (International School of Space Simulations in Kyoto, Japan, November 1-2, 1982) but rather…a form of textbook of computer simulations of space plasmas.” This is, of course, a difficult task when many authors are involved. Unavoidable redundancies and differences in notation may confuse the beginner. Some important questions, like numerical stability, are not discussed in sufficient detail. The recent book by C.K. Birdsall and A.B. Langdon (Plasma Physics via Computer Simulations, McGraw-Hill, New York, 1985) is more complete and detailed and seems more suitable as a textbook for simulations. Nevertheless, this book is useful to the beginner and the specialist because it contains not only descriptions of various numerical techniques but also many applications of simulations to space physics phenomena.
Computer simulation of electron beams
Energy Technology Data Exchange (ETDEWEB)
Sabchevski, S.P.; Mladenov, G.M. (Bylgarska Akademiya na Naukite, Sofia (Bulgaria). Inst. po Elektronika)
1994-04-14
Self-fields and forces as well as the local degree of space-charge neutralization in overcompensated electron beams are considered. The radial variation of the local degree of space-charge neutralization is analysed. A novel model which describes the equilibrium potential distribution in overcompensated beams is proposed and a method for computer simulation of the beam propagation is described. Results from numerical experiments which illustrate the propagation of finite emittance overneutralized beams are presented. (Author).
Computer simulation of nonequilibrium processes
Energy Technology Data Exchange (ETDEWEB)
Wallace, D.C.
1985-07-01
The underlying concepts of nonequilibrium statistical mechanics, and of irreversible thermodynamics, will be described. The question at hand is then: how are these concepts to be realized in computer simulations of many-particle systems? The answer will be given for dissipative deformation processes in solids, on three hierarchical levels: heterogeneous plastic flow, dislocation dynamics, and molecular dynamics. Application to the shock process will be discussed.
Computer simulation of superionic fluorides
Castiglione, M
2000-01-01
In this thesis the nature of ion mobility in cryolite and lead fluoride based compounds is investigated by computer simulation. The phase transition of cryolite is characterised in terms of rotation of AlF{sub 6} octahedra, and the conductive properties are shown to result from diffusion of the sodium ions. The two processes appear to be unrelated. Very good agreement with NMR experimental results is found. The Pb{sup 2+} ion has a very high polarisability, yet treatment of this property in previous simulations has been problematic. In this thesis a mor... The feature which experimentally gives an indication of the correlations between nearby defects is well-reproduced. The most stringent test of simulation model transferability is presented in the studies of lead tin fluoride, in which significant 'covalent' effects are apparent. Other similarly-structured compounds are also investigated, and the reasons behind the adoption of such an unusual layered structure, and the mobility and site occupation of the anions, are quantified.
Computational Challenges in Nuclear Weapons Simulation
Energy Technology Data Exchange (ETDEWEB)
McMillain, C F; Adams, T F; McCoy, M G; Christensen, R B; Pudliner, B S; Zika, M R; Brantley, P S; Vetter, J S; May, J M
2003-08-29
After a decade of experience, the Stockpile Stewardship Program continues to ensure the safety, security and reliability of the nation's nuclear weapons. The Advanced Simulation and Computing (ASCI) program was established to provide leading edge, high-end simulation capabilities needed to meet the program's assessment and certification requirements. The great challenge of this program lies in developing the tools and resources necessary for the complex, highly coupled, multi-physics calculations required to simulate nuclear weapons. This paper describes the hardware and software environment we have applied to fulfill our nuclear weapons responsibilities. It also presents the characteristics of our algorithms and codes, especially as they relate to supercomputing resource capabilities and requirements. It then addresses impediments to the development and application of nuclear weapon simulation software and hardware and concludes with a summary of observations and recommendations on an approach for working with industry and government agencies to address these impediments.
Computational methods applied to wind tunnel optimization
Lindsay, David
This report describes computational methods developed for optimizing the nozzle of a three-dimensional subsonic wind tunnel. This requires determination of a shape that delivers flow to the test section, typically with a speed increase by a factor of 7 or more and a velocity uniformity of 0.25% or better, in a compact length without introducing boundary layer separation. The need for high precision, smooth solutions, and three-dimensional modeling required the development of special computational techniques. These include: (1) alternative formulations to Neumann and Dirichlet boundary conditions, to deal with overspecified, ill-posed, or cyclic problems, and to reduce the discrepancy between numerical solutions and boundary conditions; (2) modification of the Finite Element Method to obtain solutions with numerically exact conservation properties; (3) a Matlab implementation of general degree Finite Element solvers for various element designs in two and three dimensions, exploiting vector indexing to obtain optimal efficiency; (4) derivation of optimal quadrature formulas for integration over simplexes in two and three dimensions, and development of a program for semi-automated generation of formulas for any degree and dimension; (5) a modification of a two-dimensional boundary layer formulation to provide accurate flow conservation in three dimensions, and modification of the algorithm to improve stability; (6) development of multi-dimensional spline functions to achieve smoother solutions in three dimensions by post-processing, new three-dimensional elements for C1 basis functions, and a program to assist in the design of elements with higher continuity; and (7) a development of ellipsoidal harmonics and Lamé's equation, with generalization to any dimension and a demonstration that Cartesian, cylindrical, spherical, spheroidal, and sphero-conical harmonics are all limiting cases. The report includes a description of the Finite Difference, Finite Volume, and domain remapping
Application of computer simulated persons in indoor environmental modeling
DEFF Research Database (Denmark)
Topp, C.; Nielsen, P. V.; Sørensen, Dan Nørtoft
2002-01-01
Computer simulated persons are often applied when the indoor environment is modeled by computational fluid dynamics. The computer simulated persons differ in size, shape, and level of geometrical complexity, ranging from simple box or cylinder shaped heat sources to more humanlike models. Little effort, however, has been focused on the influence of the geometry. This work provides an investigation of geometrically different computer simulated persons with respect to both local and global airflow distribution. The results show that a simple geometry is sufficient when the global airflow of a ventilated enclosure is considered, as little or no influence of geometry was observed at some distance from the computer simulated person. For local flow conditions, though, a more detailed geometry should be applied in order to assess thermal and atmospheric comfort.
QCE : A Simulator for Quantum Computer Hardware
Michielsen, Kristel; Raedt, Hans De
2003-01-01
The Quantum Computer Emulator (QCE) described in this paper consists of a simulator of a generic, general purpose quantum computer and a graphical user interface. The latter is used to control the simulator, to define the hardware of the quantum computer and to debug and execute quantum algorithms.
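The core of such a simulator is repeated application of small unitary matrices to a 2^n-component state vector. The following toy sketch (not the QCE code or its API) applies a single-qubit gate to an n-qubit state and checks the measurement probabilities for a Hadamard acting on |00⟩:

```python
import math

def apply_gate(state, gate, target):
    """Apply a 2x2 single-qubit gate to the given target qubit."""
    new = [0j] * len(state)
    for idx, amp in enumerate(state):
        bit = (idx >> target) & 1
        for out_bit in (0, 1):
            # Index with the target bit replaced by out_bit.
            new_idx = idx ^ ((bit ^ out_bit) << target)
            new[new_idx] += gate[out_bit][bit] * amp
    return new

# Hadamard gate.
H = [[1 / math.sqrt(2), 1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

# Start in |00> (two qubits); H on qubit 0 gives an equal
# superposition of |00> and |01>.
state = [1 + 0j, 0j, 0j, 0j]
state = apply_gate(state, H, target=0)
probs = [abs(a) ** 2 for a in state]
print([round(p, 3) for p in probs])  # -> [0.5, 0.5, 0.0, 0.0]
```

Quantum algorithms are then debugged by composing such gate applications and inspecting the amplitudes, which is exactly what a graphical front end like the one described above automates.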
Computer Simulations of Intrinsically Disordered Proteins
Chong, Song-Ho; Chatterjee, Prathit; Ham, Sihyun
2017-05-01
The investigation of intrinsically disordered proteins (IDPs) is a new frontier in structural and molecular biology that requires a new paradigm to connect structural disorder to function. Molecular dynamics simulations and statistical thermodynamics potentially offer ideal tools for atomic-level characterizations and thermodynamic descriptions of this fascinating class of proteins that will complement experimental studies. However, IDPs display sensitivity to inaccuracies in the underlying molecular mechanics force fields. Thus, achieving an accurate structural characterization of IDPs via simulations is a challenge. It is also daunting to perform a configuration-space integration over heterogeneous structural ensembles sampled by IDPs to extract, in particular, protein configurational entropy. In this review, we summarize recent efforts devoted to the development of force fields and the critical evaluations of their performance when applied to IDPs. We also survey recent advances in computational methods for protein configurational entropy that aim to provide a thermodynamic link between structural disorder and protein activity.
Applied time series analysis and innovative computing
Ao, Sio-Iong
2010-01-01
This text is a systematic, state-of-the-art introduction to the use of innovative computing paradigms as an investigative tool for applications in time series analysis. It includes frontier case studies based on recent research.
Simulation Theory Applied to Direct Systematic Observation.
Manolov, Rumen; Losada, José L
2017-01-01
Observational studies entail making several decisions before data collection, such as the observational design to use, the sampling of sessions within the observational period, the need for time sampling within the observation sessions, as well as the observation recording procedures to use. The focus of the present article is on observational recording procedures different from continuous recording (i.e., momentary time sampling, partial and whole interval recording). The main aim is to develop an online software application, constructed using R and the Shiny package, on the basis of simulations using the alternating renewal process (a model implemented in the ARPobservation package). The application offers graphical representations that can be useful both to university students constructing knowledge on Observational Methodology and to applied researchers planning to use discontinuous recording in their studies, because it helps identify the conditions (e.g., interval length, average duration of the behavior of interest) in which the prevalence of the target behavior is expected to be estimated with less bias or no bias and with more efficiency. The estimation of frequency is another topic covered.
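The simulation logic behind such an application can be sketched directly: generate behavior streams with an alternating renewal process, then apply a discontinuous recording rule to them. Exponential on/off durations and all parameter values below are illustrative assumptions; the ARPobservation package supports other duration distributions.

```python
import random

# Alternating renewal process: the behavior alternates between "on"
# and "off" states with random (here exponential, by assumption)
# durations until the session ends.
def simulate_stream(mean_on, mean_off, session_len, rng):
    events, t, on = [], 0.0, False
    while t < session_len:
        dur = rng.expovariate(1 / (mean_on if on else mean_off))
        events.append((t, min(t + dur, session_len), on))
        t += dur
        on = not on
    return events

# Momentary time sampling: record only whether the behavior is
# occurring at each interval mark.
def momentary_time_sampling(events, interval, session_len):
    marks = [k * interval for k in range(1, int(session_len / interval) + 1)]
    hits = 0
    for m in marks:
        hits += any(s <= m < e and on for s, e, on in events)
    return hits / len(marks)

rng = random.Random(42)
ev = simulate_stream(mean_on=10, mean_off=30, session_len=3600, rng=rng)
true_prev = sum(e - s for s, e, on in ev if on) / 3600
est = momentary_time_sampling(ev, interval=30, session_len=3600)
print(round(true_prev, 2), round(est, 2))
```

Comparing `est` with `true_prev` over many replications and parameter settings is what reveals the bias and efficiency of each recording procedure.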
Deterministic event-based simulation of universal quantum computation
Michielsen, K.; Raedt, H. De; Raedt, K. De; Landau, DP; Lewis, SP; Schuttler, HB
2006-01-01
We demonstrate that locally connected networks of classical processing units that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of universal quantum computation. The new simulation method is applied to implement Shor's factoring algorithm.
Computer Simulation in Tomorrow's Schools.
Foster, David
1984-01-01
Suggests use of simulation as an educational strategy has promise for the school of the future; discusses specific advantages of simulations over alternative educational methods, role of microcomputers in educational simulation, and past obstacles and future promise of microcomputer simulations; and presents a literature review on effectiveness of…
Computer simulation of working stress of heat treated steel specimen
B. Smoljan; D. Iljkić; S. Smokvina Hanza
2009-01-01
Purpose: In this paper, the prediction of working stress of quenched and tempered steel has been carried out. The working stress was characterized by yield strength and fracture toughness. The method of computer simulation of working stress was applied to a workpiece of complex form. Design/methodology/approach: Hardness distribution of a quenched and tempered workpiece of complex form was predicted by computer simulation of steel quenching using a finite volume method. The algorithm of estimation of yie...
Research in Applied Mathematics, Fluid Mechanics and Computer Science
1999-01-01
This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period October 1, 1998 through March 31, 1999.
[Research activities in applied mathematics, fluid mechanics, and computer science
1995-01-01
This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period April 1, 1995 through September 30, 1995.
Applying and evaluating computer-animated tutors
Massaro, Dominic W.; Bosseler, Alexis; Stone, Patrick S.; Connors, Pamela
2002-05-01
We have developed computer-assisted speech and language tutors for deaf, hard of hearing, and autistic children. Our language-training program utilizes our computer-animated talking head, Baldi, as the conversational agent, who guides students through a variety of exercises designed to teach vocabulary and grammar, to improve speech articulation, and to develop linguistic and phonological awareness. Baldi is an accurate three-dimensional animated talking head appropriately aligned with either synthesized or natural speech. Baldi has a tongue and palate, which can be displayed by making his skin transparent. Two specific language-training programs have been evaluated to determine if they improve word learning and speech articulation. The results indicate that the programs are effective in teaching receptive and productive language. Advantages of utilizing a computer-animated agent as a language tutor are the popularity of computers and embodied conversational agents with autistic children, the perpetual availability of the program, and individualized instruction. Students enjoy working with Baldi because he offers extreme patience, he doesn't become angry, tired, or bored, and he is in effect a perpetual teaching machine. The results indicate that the psychology and technology of Baldi hold great promise in language learning and speech therapy. [Work supported by NSF Grant Nos. CDA-9726363 and BCS-9905176 and Public Health Service Grant No. PHS R01 DC00236.]
Applying Human Computation Methods to Information Science
Harris, Christopher Glenn
2013-01-01
Human Computation methods such as crowdsourcing and games with a purpose (GWAP) have each recently drawn considerable attention for their ability to synergize the strengths of people and technology to accomplish tasks that are challenging for either to do well alone. Despite this increased attention, much of this transformation has been focused on…
Discrete Event Simulation
Indian Academy of Sciences (India)
Department of Computer Science and Automation, Indian Institute of Science, Bangalore 560 012. Email: mjt@csa.iisc.ernet.in. Computers can be used to simulate the operation of complex systems and thereby study their performance. This article introduces you to the technique of discrete event simulation through a simple ...
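The technique the article introduces rests on a future event list ordered by event time: pop the earliest event, update system state, and schedule new events. A minimal single-server queue (an assumed M/M/1 example, not taken from the article itself) shows the pattern:

```python
import heapq
import random

def simulate_mm1(arrival_rate, service_rate, n_customers, seed=7):
    """Event-driven simulation of a single-server queue; returns the
    observed server utilization."""
    rng = random.Random(seed)
    fel = [(rng.expovariate(arrival_rate), "arrival")]  # future event list
    now, last, queue, busy = 0.0, 0.0, 0, False
    busy_time, done = 0.0, 0
    while fel and done < n_customers:
        now, kind = heapq.heappop(fel)      # advance to next event
        busy_time += (now - last) if busy else 0.0
        last = now
        if kind == "arrival":
            # Schedule the next arrival, then seize or queue.
            heapq.heappush(fel, (now + rng.expovariate(arrival_rate), "arrival"))
            if busy:
                queue += 1
            else:
                busy = True
                heapq.heappush(fel, (now + rng.expovariate(service_rate), "departure"))
        else:  # departure: start the next service if anyone is waiting
            done += 1
            if queue:
                queue -= 1
                heapq.heappush(fel, (now + rng.expovariate(service_rate), "departure"))
            else:
                busy = False
    return busy_time / now

print(round(simulate_mm1(arrival_rate=0.5, service_rate=1.0, n_customers=10000), 2))
```

With arrival rate 0.5 and service rate 1.0, the observed utilization settles close to the theoretical value λ/μ = 0.5, which is a quick sanity check on the event logic.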
Framework for utilizing computational devices within simulation
Directory of Open Access Journals (Sweden)
Miroslav Mintál
2013-12-01
Nowadays several frameworks exist to utilize the computational power of graphics cards and other computational devices such as FPGAs, ARM and multi-core processors. The best-known ones are either low-level and need a lot of controlling code, or are bound only to specific graphics cards. Furthermore, more specialized frameworks exist, mainly aimed at the mathematical field. The framework described here is adapted for use in multi-agent simulations. It provides an option to accelerate computations when preparing a simulation and, mainly, to accelerate the computation of the simulation itself.
Efficient SDH Computation In Molecular Simulations Data.
Tu, Yi-Cheng; Chen, Shaoping; Pandit, Sagar; Kumar, Anand; Grupcev, Vladimir
2012-10-01
Analysis of large particle or molecular simulation data is an integral part of basic-science research. It often involves computing functions such as point-to-point interactions of particles. The spatial distance histogram (SDH) is one such vital computation in scientific discovery. SDH is frequently used to compute the Radial Distribution Function (RDF), and it takes quadratic time to compute using the naive approach. Naive SDH computation is even more expensive as it is computed continuously over a certain period of time to analyze simulation systems. Tree-based SDH computation is a popular approach. In this paper we look at different tree-based SDH computation techniques and briefly discuss their performance. We present different strategies to improve the performance of these techniques. Specifically, we study the density map (DM) based SDH computation techniques. A DM is essentially a grid dividing the simulated space into cells (3D cubes) of equal size (volume), which can be easily implemented by augmenting a Quad-tree (or Oct-tree) index. DMs are used in various configurations to compute SDH continuously over snapshots of the simulation system. The performance improvements using some of these configurations are presented in this paper. We also present the effect of utilizing the computation power of Graphics Processing Units (GPUs) in computing SDH.
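For reference, the naive quadratic SDH that the tree-based and density-map methods improve upon can be written directly (the bucket width and point count below are arbitrary illustrative choices):

```python
import math
import random

# Brute-force SDH: O(n^2) pairwise distances, bucketed by width w.
def sdh(points, bucket_width, n_buckets):
    hist = [0] * n_buckets
    for i in range(len(points)):
        xi, yi, zi = points[i]
        for j in range(i + 1, len(points)):
            xj, yj, zj = points[j]
            d = math.sqrt((xi - xj) ** 2 + (yi - yj) ** 2 + (zi - zj) ** 2)
            b = min(int(d / bucket_width), n_buckets - 1)  # clamp last bucket
            hist[b] += 1
    return hist

rng = random.Random(0)
pts = [(rng.random(), rng.random(), rng.random()) for _ in range(500)]
hist = sdh(pts, bucket_width=0.25, n_buckets=8)
print(sum(hist))  # total pairs: 500 * 499 / 2 = 124750
```

A density map avoids most of these pairwise distance computations: whenever the minimum and maximum possible distances between two cells fall inside the same bucket, the whole cell pair is resolved into that bucket at once.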
Computer Simulation of the Neuronal Action Potential.
Solomon, Paul R.; And Others
1988-01-01
A series of computer simulations of the neuronal resting and action potentials are described. Discusses the use of simulations to overcome the difficulties of traditional instruction, such as blackboard illustration, which can only illustrate these events at one point in time. Describes the system requirements necessary to run the simulations.
Computer Simulation of a Hardwood Processing Plant
D. Earl Kline; Philip A. Araman
1990-01-01
The overall purpose of this paper is to introduce computer simulation as a decision support tool that can be used to provide managers with timely information. A simulation/animation modeling procedure is demonstrated for wood products manufacturing systems. Simulation modeling techniques are used to assist in identifying and solving problems. Animation is used for...
Computational optimization techniques applied to microgrids planning
DEFF Research Database (Denmark)
Gamarra, Carlos; Guerrero, Josep M.
2015-01-01
Microgrids are expected to become part of the next electric power system evolution, not only in rural and remote areas but also in urban communities. Since microgrids are expected to coexist with traditional power grids (as district heating does with traditional heating systems), ... appear along the planning process. In this context, the technical literature about optimization techniques applied to microgrid planning has been reviewed, and guidelines for innovative planning methodologies focused on economic feasibility can be defined. Finally, some trending techniques and new...
Computer simulation of fatigue under diametrical compression.
Carmona, H A; Kun, F; Andrade, J S; Herrmann, H J
2007-04-01
We study the fatigue fracture of disordered materials by means of computer simulations of a discrete element model. We extend a two-dimensional fracture model to capture the microscopic mechanisms relevant for fatigue, and we simulate the diametric compression of a disc-shaped specimen under a constant external force. The model allows us to follow the development of the fracture process on the macro- and microlevel while varying the relative influence of the mechanisms of damage accumulation over the load history and healing of microcracks. As a specific example we consider recent experimental results on the fatigue fracture of asphalt. Our numerical simulations show that for intermediate applied loads the lifetime of the specimen exhibits a power-law behavior. Under the effect of healing, more prominent for loads small compared to the tensile strength of the material, the lifetime of the sample increases and a fatigue limit emerges below which no macroscopic failure occurs. The numerical results are in good qualitative agreement with the experimental findings.
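The competition between damage accumulation and healing that produces a fatigue limit can be illustrated with a deliberately simple zero-dimensional sketch (the rates and exponent below are assumptions for illustration, not the paper's discrete element model):

```python
# Per cycle, damage grows as a power of the applied load while healing
# removes a fixed amount; failure occurs when damage reaches 1.
def cycles_to_failure(load, damage_rate=1.0, exponent=2.0, healing=0.1,
                      max_cycles=10**6):
    damage = 0.0
    for n in range(1, max_cycles + 1):
        damage += damage_rate * load ** exponent - healing
        if damage < 0:
            return None  # healing dominates: below the fatigue limit
        if damage >= 1.0:
            return n     # macroscopic failure after n cycles
    return None

for load in (0.2, 0.5, 0.8):
    print(load, cycles_to_failure(load))
```

Above the limit load the lifetime shortens rapidly with increasing load; below it, healing outpaces damage accumulation and no failure ever occurs, which is the fatigue-limit behavior described above.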
Computer Simulation of the UMER Gridded Gun
Haber, Irving; Friedman, Alex; Grote, D P; Kishek, Rami A; Reiser, Martin; Vay, Jean-Luc; Zou, Yun
2005-01-01
The electron source in the University of Maryland Electron Ring (UMER) injector employs a grid 0.15 mm from the cathode to control the current waveform. Under nominal operating conditions, the grid voltage during the current pulse is sufficiently positive relative to the cathode potential to form a virtual cathode downstream of the grid. Three-dimensional computer simulations have been performed that use the mesh refinement capability of the WARP particle-in-cell code to examine a small region near the beam center in order to illustrate some of the complexity that can result from such a gridded structure. These simulations have been found to reproduce the hollowed velocity space that is observed experimentally. The simulations also predict a complicated time-dependent response to the waveform applied to the grid during the current turn-on. This complex temporal behavior appears to result directly from the dynamics of the virtual cathode formation and may therefore be representative of the expected behavior in...
FEL Simulation Using Distributed Computing
Energy Technology Data Exchange (ETDEWEB)
Einstein, Joshua [Fermilab; Bernabeu Altayo, Gerard [Fermilab; Biedron, Sandra [Ljubljana U.; Freund, Henry [Colorado State U., Fort Collins; Milton, Stephen [Colorado State U., Fort Collins; van der Slot, Peter [Colorado State U., Fort Collins
2016-06-01
While simulation tools are available and have been used regularly for simulating light sources, the increasing availability and lower cost of GPU-based processing open up new opportunities. This poster highlights a method of accelerating and parallelizing code processing through the use of COTS software interfaces.
Micro-computer simulation software: A review
Directory of Open Access Journals (Sweden)
P.S. Kruger
2003-12-01
Simulation modelling has proved to be one of the most powerful tools available to the Operations Research analyst. The development of micro-computer technology has reached a state of maturity where the micro-computer can provide the necessary computing power, and consequently various powerful and inexpensive simulation languages for micro-computers have become available. This paper attempts to provide an introduction to the general philosophy and characteristics of some of the available micro-computer simulation languages. The emphasis is on the characteristics of the specific micro-computer implementation rather than on a comparison of the modelling features of the various languages. Such comparisons may be found elsewhere.
Computer simulation in physics and engineering
Steinhauser, Martin Oliver
2013-01-01
This work is a needed reference for widely used techniques and methods of computer simulation in physics and other disciplines, such as materials science. It conveys both the theoretical foundations of computer simulation and applications and "tricks of the trade" that are often scattered across various papers. Thus it will meet a need and fill a gap for every scientist who needs computer simulations for the task at hand. In addition to being a reference, case studies and exercises for use as course reading are included.
Filtration theory using computer simulations
Energy Technology Data Exchange (ETDEWEB)
Bergman, W.; Corey, I.
1997-01-01
We have used commercially available fluid dynamics codes based on Navier-Stokes theory and the Langevin particle equation of motion to compute the particle capture efficiency and pressure drop through selected two- and three- dimensional fiber arrays. The approach we used was to first compute the air velocity vector field throughout a defined region containing the fiber matrix. The particle capture in the fiber matrix is then computed by superimposing the Langevin particle equation of motion over the flow velocity field. Using the Langevin equation combines the particle Brownian motion, inertia and interception mechanisms in a single equation. In contrast, most previous investigations treat the different capture mechanisms separately. We have computed the particle capture efficiency and the pressure drop through one, 2-D and two, 3-D fiber matrix elements.
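The two-stage approach (precomputed velocity field, then Langevin particle tracking) can be sketched in two dimensions. Everything here is a toy assumption: a uniform flow standing in for the computed velocity field, a single fiber, and overdamped Brownian steps rather than the full Langevin equation with inertia.

```python
import math
import random

def capture_fraction(n_particles=400, diffusivity=1e-3, dt=2e-3,
                     fiber_radius=0.1, u_flow=1.0, seed=3):
    """Fraction of particles captured by a single fiber at the origin."""
    rng = random.Random(seed)
    sigma = math.sqrt(2 * diffusivity * dt)  # Brownian step size
    captured = 0
    for _ in range(n_particles):
        x, y = -1.0, rng.uniform(-0.5, 0.5)  # released upstream
        while x < 1.0:
            # Drift with the flow plus superimposed Brownian motion.
            x += u_flow * dt + rng.gauss(0, sigma)
            y += rng.gauss(0, sigma)
            if math.hypot(x, y) < fiber_radius:
                captured += 1  # interception/diffusion capture
                break
    return captured / n_particles

print(capture_fraction())
```

Capture by interception and diffusion emerges from a single stepping rule, which is the point the abstract makes about combining the capture mechanisms in one equation of motion rather than treating them separately.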
Filtration theory using computer simulations
Energy Technology Data Exchange (ETDEWEB)
Bergman, W.; Corey, I. [Lawrence Livermore National Lab., CA (United States)
1997-08-01
We have used commercially available fluid dynamics codes based on Navier-Stokes theory and the Langevin particle equation of motion to compute the particle capture efficiency and pressure drop through selected two- and three-dimensional fiber arrays. The approach we used was to first compute the air velocity vector field throughout a defined region containing the fiber matrix. The particle capture in the fiber matrix is then computed by superimposing the Langevin particle equation of motion over the flow velocity field. Using the Langevin equation combines the particle Brownian motion, inertia and interception mechanisms in a single equation. In contrast, most previous investigations treat the different capture mechanisms separately. We have computed the particle capture efficiency and the pressure drop through one, 2-D and two, 3-D fiber matrix elements. 5 refs., 11 figs.
Augmented Reality Simulations on Handheld Computers
Squire, Kurt; Klopfer, Eric
2007-01-01
Advancements in handheld computing, particularly its portability, social interactivity, context sensitivity, connectivity, and individuality, open new opportunities for immersive learning environments. This article articulates the pedagogical potential of augmented reality simulations in environmental engineering education by immersing students in…
Computer Simulation in Information and Communication Engineering
Anton Topurov
2005-01-01
CSICE'05, Sofia, Bulgaria, 20th-22nd October 2005. On behalf of the International Scientific Committee, we would like to invite you all to Sofia, the capital city of Bulgaria, to the International Conference on Computer Simulation in Information and Communication Engineering CSICE'05. The Conference is aimed at facilitating the exchange of experience in the field of computer simulation gained not only in traditional fields (Communications, Electronics, Physics...) but also in the areas of biomedical engineering, environment, industrial design, etc. The objective of the Conference is to bring together lecturers, researchers and practitioners from different countries working in the fields of computer simulation in information engineering, in order to exchange information and bring new contributions to this important field of engineering design and education. The Conference will bring you the latest ideas and developments of the tools for computer simulation directly from their inventors. Contribution describ...
Computer Based Modelling and Simulation
Indian Academy of Sciences (India)
This is supposed to recall gambling, and hence the name Monte Carlo simulation. The procedure was developed by Stanislaw Ulam and John von Neumann. They used the simulation method to solve partial differential equations for diffusion of neutrons! (Box 2). We can illustrate the MC method by a simple example.
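The simple example alluded to above is commonly the estimation of π: sample points uniformly in the unit square and count the fraction that falls inside the quarter circle.

```python
import random

# Monte Carlo estimate of pi: the probability that a uniform random
# point in the unit square lands inside the quarter circle is pi/4.
def estimate_pi(n_samples, seed=2024):
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n_samples)
               if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
    return 4.0 * hits / n_samples

print(estimate_pi(100000))
```

The error of the estimate shrinks as 1/√N, which is characteristic of Monte Carlo methods regardless of the problem's dimensionality.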
Computer simulation, nuclear techniques and surface analysis
Directory of Open Access Journals (Sweden)
Reis, A. D.
2010-02-01
This article is about computer simulation and surface analysis by nuclear techniques, which are non-destructive. The “energy method of analysis” for nuclear reactions is used. Energy spectra are computer simulated and compared with experimental data, giving target composition and concentration profile information. Details of the prediction stages are given for thick flat target yields. Predictions are made for non-flat targets having asymmetric triangular surface contours. The method is successfully applied to depth profiling of ^{12}C and ^{18}O nuclei in thick targets, by deuteron (d,p) and proton (p,α) induced reactions, respectively.
Simulations of Probabilities for Quantum Computing
Zak, M.
1996-01-01
It has been demonstrated that classical probabilities, and in particular the probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities, and their link to quantum computations, are discussed.
Salesperson Ethics: An Interactive Computer Simulation
Castleberry, Stephen
2014-01-01
A new interactive computer simulation designed to teach sales ethics is described. Simulation learner objectives include gaining a better understanding of legal issues in selling; realizing that ethical dilemmas do arise in selling; realizing the need to be honest when selling; seeing that there are conflicting demands from a salesperson's…
[Animal experimentation, computer simulation and surgical research].
Carpentier, Alain
2009-11-01
We live in a digital world. In medicine, computers are providing new tools for data collection, imaging, and treatment. During research and development of complex technologies and devices such as artificial hearts, computer simulation can provide more reliable information than experimentation on large animals. In these specific settings, animal experimentation should serve more to validate computer models of complex devices than to demonstrate their reliability.
Computer Systems/Database Simulation.
1978-10-15
…defined distribution of inter-arrival times. Hence the process of model building and execution is considerably eased with the help of simulation languages… the hands of only the data creator need not be forwarded to the data user. This removes both JCL and format difficulties from the user's domain… emulators available on any machine for most source languages.) Lower-level languages, such as Assembler or Macro-like code, will always be machine…
Research in applied mathematics, numerical analysis, and computer science
1984-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.
Atomistic computer simulations a practical guide
Brazdova, Veronika
2013-01-01
Many books explain the theory of atomistic computer simulations; this book teaches you how to run them. This introductory "how to" title enables readers to understand, plan, run, and analyze their own independent atomistic simulations, and decide which method to use and which questions to ask in their research project. It is written in a clear and precise language, focusing on a thorough understanding of the concepts behind the equations and how these are used in the simulations. As a result, readers will learn how to design the computational model and which parameters o
Energy Technology Data Exchange (ETDEWEB)
HOLM,ELIZABETH A.; BATTAILE,CORBETT C.; BUCHHEIT,THOMAS E.; FANG,HUEI ELIOT; RINTOUL,MARK DANIEL; VEDULA,VENKATA R.; GLASS,S. JILL; KNOROVSKY,GERALD A.; NEILSEN,MICHAEL K.; WELLMAN,GERALD W.; SULSKY,DEBORAH; SHEN,YU-LIN; SCHREYER,H. BUCK
2000-04-01
Computational materials simulations have traditionally focused on individual phenomena: grain growth, crack propagation, plastic flow, etc. However, real materials behavior results from a complex interplay between phenomena. In this project, the authors explored methods for coupling mesoscale simulations of microstructural evolution and micromechanical response. In one case, massively parallel (MP) simulations for grain evolution and microcracking in alumina stronglink materials were dynamically coupled. In the other, codes for domain coarsening and plastic deformation in CuSi braze alloys were iteratively linked. This program provided the first comparison of two promising ways to integrate mesoscale computer codes. Coupled microstructural/micromechanical codes were applied to experimentally observed microstructures for the first time. In addition to the coupled codes, this project developed a suite of new computational capabilities (PARGRAIN, GLAD, OOF, MPM, polycrystal plasticity, front tracking). The problem of plasticity length scale in continuum calculations was recognized and a solution strategy was developed. The simulations were experimentally validated on stockpile materials.
Computational Modeling of Simulation Tests.
1980-06-01
…cavity was simulated with a nonrigid, partially reflecting heavy gas (the rigid wall of 905.0021 was replaced with additional cells of ideal gas which… the shock tunnel at the 4.14-MPa range found in calculation 906.1081. The driver consisted of 25 cells of burned ammonium nitrate and fuel oil (ANFO)… [figure residue: reflected wave geometry for calculation 906.1091 — driver region, reaction-region boundary, burned ANFO, real air, reflecting boundary]
Computer Code for Nanostructure Simulation
Filikhin, Igor; Vlahovic, Branislav
2009-01-01
Due to their small size, nanostructures can have stress and thermal gradients that are larger than any macroscopic analogue. These gradients can lead to specific regions that are susceptible to failure via processes such as plastic deformation by dislocation emission, chemical debonding, and interfacial alloying. A program has been developed that rigorously simulates and predicts optoelectronic properties of nanostructures of virtually any geometrical complexity and material composition. It can be used in simulations of energy level structure, wave functions, density of states of spatially configured phonon-coupled electrons, excitons in quantum dots, quantum rings, quantum ring complexes, and more. The code can be used to calculate stress distributions and thermal transport properties for a variety of nanostructures and interfaces, transport and scattering at nanoscale interfaces and surfaces under various stress states, and alloy compositional gradients. The code allows users to perform modeling of charge transport processes through quantum-dot (QD) arrays as functions of inter-dot distance, array order versus disorder, QD orientation, shape, size, and chemical composition for applications in photovoltaics and physical properties of QD-based biochemical sensors. The code can be used to study the hot exciton formation/relaxation dynamics in arrays of QDs of different shapes and sizes at different temperatures. It also can be used to understand the relation among the deposition parameters and inherent stresses, strain deformation, heat flow, and failure of nanostructures.
Editorial: Special Issue on Computational Problems in Applied Mathematics
Walailak Journal of Science and Technology
2014-01-01
Computational Fluid Dynamics (CFD) is a highly interdisciplinary research area which lies at the interface of physics, applied mathematics, and computer science. CFD is the science of predicting fluid flow, heat transfer, mass transfer, chemical reactions, and related phenomena by solving the mathematical equations which govern these processes using a numerical process. Theoretical and Computational Fluid Dynamics provides a forum for the cross-fertilization of notions, tools and techniques a...
Monte Carlo simulations applied to conjunctival lymphoma radiotherapy treatment
Energy Technology Data Exchange (ETDEWEB)
Brualla, Lorenzo; Sauerwein, Wolfgang [Universitaetsklinikum Essen (Germany). NCTeam, Strahlenklinik; Palanco-Zamora, Ricardo [Karolinska University Hospital, Stockholm (Sweden); Steuhl, Klaus-Peter [Universitaetsklinikum Essen (Germany). Klinik fuer Erkrankungen des vorderen Augenabschnittes; Bornfeld, Norbert [Universitaetsklinikum Essen (Germany). Klinik fuer Erkrankungen des hinteren Augenabschnittes
2011-08-15
Small radiation fields are increasingly applied in clinical routine. In particular, they are necessary for the treatment of eye tumors. However, available treatment planning systems do not calculate the absorbed dose with the desired accuracy in the presence of small fields. Absorbed dose estimations obtained with Monte Carlo methods have the required accuracy for clinical applications, but the exceedingly long computation times associated with them hinder their routine use. In this article, a code for automatic Monte Carlo simulation of linacs and an application in the treatment of conjunctival lymphoma are presented. Simulations of clinical linear accelerators were performed with the general-purpose radiation transport Monte Carlo code penelope. Accelerator geometry files, in electron mode, were generated with the program AutolinaC. The Monte Carlo simulation of an annular electron 6 MeV field used for the treatment of the conjunctival lymphoma yielded absorbed dose results statistically compatible with experimental measurements. In this simulation, 2% standard statistical uncertainty was reached in the same time employed by a hybrid Monte Carlo commercial code (eMC); however, eMC showed discrepancies of up to 7% on the absorbed dose with respect to experimental data. Results obtained with the analytic algorithm Pencil Beam Convolution differed from experimental data by 10% for this case. Owing to the systematic application of variance-reduction techniques, it is possible to accurately estimate the absorbed dose in patient images, using Monte Carlo methods, in times within clinical routine requirements. The program AutolinaC allows systematic use of these variance-reduction techniques within the code penelope. (orig.)
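As a side note on the "2% standard statistical uncertainty" quoted above, here is a minimal sketch (ours, not part of the penelope code) of how such a figure is typically computed: it is the standard error of the mean over independent Monte Carlo histories.

```python
# Hedged sketch: standard statistical uncertainty of a Monte Carlo estimate,
# taken as the one-sigma standard error of the mean over independent samples.
def mc_mean_and_uncertainty(samples):
    n = len(samples)
    mean = sum(samples) / n
    var = sum((s - mean) ** 2 for s in samples) / (n - 1)  # sample variance
    std_err = (var / n) ** 0.5       # one-sigma uncertainty of the mean
    return mean, std_err
```

The relative uncertainty quoted for a dose estimate would then be `100 * std_err / mean` percent.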
Computer simulation of thermal plant operations
O'Kelly, Peter
2012-01-01
This book describes thermal plant simulation, that is, dynamic simulation of plants which produce, exchange and otherwise utilize heat as their working medium. Directed at chemical, mechanical and control engineers involved with operations, control and optimization and operator training, the book gives the mathematical formulation and use of simulation models of the equipment and systems typically found in these industries. The author has adopted a fundamental approach to the subject. The initial chapters provide an overview of simulation concepts and describe a suitable computer environment.
Computer Simulations of Lipid Nanoparticles
Directory of Open Access Journals (Sweden)
Xavier F. Fernandez-Luengo
2017-12-01
Lipid nanoparticles (LNP) are promising soft matter nanomaterials for drug delivery applications. In spite of their interest, little is known about the supramolecular organization of the components of these self-assembled nanoparticles. Here, we present a molecular dynamics simulation study, employing the Martini coarse-grain forcefield, of self-assembled LNPs made by tripalmitin lipid in water. We also study the adsorption of Tween 20 surfactant as a protective layer on top of the LNP. We show that, at 310 K (the temperature of interest in biological applications), the structure of the lipid nanoparticles is similar to that of a liquid droplet, in which the lipids show no nanostructuration and have high mobility. We show that, for large enough nanoparticles, the hydrophilic headgroups develop an interior surface in the NP core that stores liquid water. The surfactant is shown to organize in an inhomogeneous way at the LNP surface, with patches with high surfactant concentrations and surface patches not covered by surfactant.
Enabling Computational Technologies for Terascale Scientific Simulations
Energy Technology Data Exchange (ETDEWEB)
Ashby, S.F.
2000-08-24
We develop scalable algorithms and object-oriented code frameworks for terascale scientific simulations on massively parallel processors (MPPs). Our research in multigrid-based linear solvers and adaptive mesh refinement enables Laboratory programs to use MPPs to explore important physical phenomena. For example, our research aids stockpile stewardship by making practical detailed 3D simulations of radiation transport. The need to solve large linear systems arises in many applications, including radiation transport, structural dynamics, combustion, and flow in porous media. These systems result from discretizations of partial differential equations on computational meshes. Our first research objective is to develop multigrid preconditioned iterative methods for such problems and to demonstrate their scalability on MPPs. Scalability describes how total computational work grows with problem size; it measures how effectively additional resources can help solve increasingly larger problems. Many factors contribute to scalability: computer architecture, parallel implementation, and choice of algorithm. Scalable algorithms have been shown to decrease simulation times by several orders of magnitude.
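To make the linear-solver step concrete, below is a purely illustrative preconditioned conjugate gradient iteration on a 1-D Poisson system A u = f. A simple Jacobi preconditioner stands in for the far more effective multigrid preconditioners the abstract describes; problem size and tolerances are arbitrary.

```python
# Illustrative sketch (not the Laboratory's solver): preconditioned
# conjugate gradients for the 1-D Poisson matrix with Dirichlet boundaries.
def apply_A(u):
    # Tridiagonal 1-D Poisson operator: 2 on the diagonal, -1 off-diagonal.
    n = len(u)
    return [2.0 * u[i]
            - (u[i - 1] if i > 0 else 0.0)
            - (u[i + 1] if i < n - 1 else 0.0)
            for i in range(n)]

def pcg(f, tol=1e-10, maxit=500):
    n = len(f)
    u = [0.0] * n
    r = list(f)                        # residual for the zero initial guess
    z = [ri / 2.0 for ri in r]         # Jacobi preconditioner: diag(A) = 2
    p = list(z)
    rz = sum(ri * zi for ri, zi in zip(r, z))
    for _ in range(maxit):
        Ap = apply_A(p)
        alpha = rz / sum(pi * api for pi, api in zip(p, Ap))
        u = [ui + alpha * pi for ui, pi in zip(u, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        if sum(ri * ri for ri in r) ** 0.5 < tol:
            break                      # residual norm below tolerance
        z = [ri / 2.0 for ri in r]
        rz_new = sum(ri * zi for ri, zi in zip(r, z))
        p = [zi + (rz_new / rz) * pi for zi, pi in zip(z, p)]
        rz = rz_new
    return u
```

In the scalable multigrid variant, only the preconditioner changes: the Jacobi diagonal scaling is replaced by a V-cycle, which keeps the iteration count roughly constant as the mesh is refined.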
Electric Propulsion Plume Simulations Using Parallel Computer
Directory of Open Access Journals (Sweden)
Joseph Wang
2007-01-01
A parallel, three-dimensional electrostatic PIC code is developed for large-scale electric propulsion simulations using parallel supercomputers. This code uses a newly developed immersed-finite-element particle-in-cell (IFE-PIC) algorithm designed to handle complex boundary conditions accurately while maintaining the computational speed of the standard PIC code. Domain decomposition is used in both field solve and particle push to divide the computation among processors. Two simulation studies are presented to demonstrate the capability of the code. The first is a full particle simulation of near-thruster plume using real ion to electron mass ratio. The second is a high-resolution simulation of multiple ion thruster plume interactions for a realistic spacecraft using a domain enclosing the entire solar array panel. Performance benchmarks show that the IFE-PIC achieves a high parallel efficiency of ≥ 90%.
Time reversibility, computer simulation, and chaos
Hoover, William Graham
1999-01-01
A small army of physicists, chemists, mathematicians, and engineers has joined forces to attack a classic problem, the "reversibility paradox", with modern tools. This book describes their work from the perspective of computer simulation, emphasizing the author's approach to the problem of understanding the compatibility, and even inevitability, of the irreversible second law of thermodynamics with an underlying time-reversible mechanics. Computer simulation has made it possible to probe reversibility from a variety of directions and "chaos theory" or "nonlinear dynamics" has supplied a useful
Perspective: Computer simulations of long time dynamics
Energy Technology Data Exchange (ETDEWEB)
Elber, Ron [Department of Chemistry, The Institute for Computational Engineering and Sciences, University of Texas at Austin, Austin, Texas 78712 (United States)
2016-02-14
Atomically detailed computer simulations of complex molecular events have attracted the imagination of many researchers in the field as providing comprehensive information on chemical, biological, and physical processes. However, one of the greatest limitations of these simulations is that of time scales. The physical time scales accessible to straightforward simulations are too short to address many interesting and important molecular events. In the last decade significant advances were made in different directions (theory, software, and hardware) that significantly expand the capabilities and accuracies of these techniques. This perspective describes and critically examines some of these advances.
Brady, Corey; Orton, Kai; Weintrop, David; Anton, Gabriella; Rodriguez, Sebastian; Wilensky, Uri
2017-01-01
Computer science (CS) is becoming an increasingly diverse domain. This paper reports on an initiative designed to introduce underrepresented populations to computing using an eclectic, multifaceted approach. As part of a yearlong computing course, students engage in Maker activities, participatory simulations, and computing projects that…
Quantitative computer simulations of extraterrestrial processing operations
Vincent, T. L.; Nikravesh, P. E.
1989-01-01
The automation of a small, solid propellant mixer was studied. Temperature control is under investigation. A numerical simulation of the system is under development and will be tested using different control options. Control system hardware is currently being put into place. The construction of mathematical models and simulation techniques for understanding various engineering processes is also studied. Computer graphics packages were utilized for better visualization of the simulation results. The mechanical mixing of propellants is examined. Simulation of the mixing process is being done to study how one can control for chaotic behavior to meet specified mixing requirements. An experimental mixing chamber is also being built. It will allow visual tracking of particles under mixing. The experimental unit will be used to test ideas from chaos theory, as well as to verify simulation results. This project has applications to extraterrestrial propellant quality and reliability.
Computer simulation of proton channelling in silicon
Indian Academy of Sciences (India)
2000-06-12
Computer simulation of proton channelling in silicon. N K DEEPAK, K RAJASEKHARAN* and K NEELAKANDAN. Department of Physics, University of Calicut, Malappuram 673 635, India; *Department of Physics, Malabar Christian College, Kozhikode 673 001, India. MS received 11 October 1999; revised …
Computer simulations of phospholipid - membrane thermodynamic fluctuations
DEFF Research Database (Denmark)
Pedersen, U.R.; Peters, Günther H.j.; Schröder, T.B.
2008-01-01
This paper reports all-atom computer simulations of five phospholipid membranes, DMPC, DPPC, DMPG, DMPS, and DMPSH, with a focus on the thermal equilibrium fluctuations of volume, energy, area, thickness, and order parameter. For the slow fluctuations at constant temperature and pressure (defined...
Spiking network simulation code for petascale computers
Kunkel, Susanne; Schmidt, Maximilian; Eppler, Jochen M.; Plesser, Hans E.; Masumoto, Gen; Igarashi, Jun; Ishii, Shin; Fukai, Tomoki; Morrison, Abigail; Diesmann, Markus; Helias, Moritz
2014-01-01
Brain-scale networks exhibit a breathtaking heterogeneity in the dynamical properties and parameters of their constituents. At cellular resolution, the entities of theory are neurons and synapses and over the past decade researchers have learned to manage the heterogeneity of neurons and synapses with efficient data structures. Already early parallel simulation codes stored synapses in a distributed fashion such that a synapse solely consumes memory on the compute node harboring the target neuron. As petaflop computers with some 100,000 nodes become increasingly available for neuroscience, new challenges arise for neuronal network simulation software: Each neuron contacts on the order of 10,000 other neurons and thus has targets only on a fraction of all compute nodes; furthermore, for any given source neuron, at most a single synapse is typically created on any compute node. From the viewpoint of an individual compute node, the heterogeneity in the synaptic target lists thus collapses along two dimensions: the dimension of the types of synapses and the dimension of the number of synapses of a given type. Here we present a data structure taking advantage of this double collapse using metaprogramming techniques. After introducing the relevant scaling scenario for brain-scale simulations, we quantitatively discuss the performance on two supercomputers. We show that the novel architecture scales to the largest petascale supercomputers available today. PMID:25346682
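The storage principle the abstract builds on can be sketched in a few lines. The following is our own toy illustration, not the paper's petascale data structure: each compute node keeps only the synapses whose target neuron lives on it, so every synapse consumes memory on exactly one node.

```python
from collections import defaultdict

# Toy sketch of target-local synapse storage (illustrative; the real code
# uses metaprogramming to compress the per-node synaptic target lists).
class Node:
    def __init__(self):
        # source neuron id -> list of target neurons local to this node
        self.synapses = defaultdict(list)

def build_network(n_nodes, connections):
    nodes = [Node() for _ in range(n_nodes)]
    for src, tgt in connections:
        home = tgt % n_nodes           # round-robin placement of neurons
        nodes[home].synapses[src].append(tgt)
    return nodes

# Each connection consumes memory on exactly one node:
conns = [(0, 1), (0, 2), (3, 1), (2, 7)]
nodes = build_network(n_nodes=4, connections=conns)
total = sum(len(ts) for nd in nodes for ts in nd.synapses.values())
```

With ~10,000 targets per neuron spread over ~100,000 nodes, most per-node lists hold at most one synapse per source, which is exactly the "double collapse" the paper's data structure exploits.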
Hybrid annealing: Coupling a quantum simulator to a classical computer
Graß, Tobias; Lewenstein, Maciej
2017-05-01
Finding the global minimum in a rugged potential landscape is a computationally hard task, often equivalent to relevant optimization problems. Annealing strategies, either classical or quantum, explore the configuration space by evolving the system under the influence of thermal or quantum fluctuations. The thermal annealing dynamics can rapidly freeze the system into a low-energy configuration, and it can be simulated well on a classical computer, but it easily gets stuck in local minima. Quantum annealing, on the other hand, can be guaranteed to find the true ground state and can be implemented in modern quantum simulators; however, quantum adiabatic schemes become prohibitively slow in the presence of quasidegeneracies. Here, we propose a strategy which combines ideas from simulated annealing and quantum annealing. In such a hybrid algorithm, the outcome of a quantum simulator is processed on a classical device. While the quantum simulator explores the configuration space by repeatedly applying quantum fluctuations and performing projective measurements, the classical computer evaluates each configuration and enforces a lowering of the energy. We have simulated this algorithm for small instances of the random energy model, showing that it potentially outperforms both simulated thermal annealing and adiabatic quantum annealing. It becomes most efficient for problems involving many quasidegenerate ground states.
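The hybrid loop can be caricatured in a few lines. In this sketch (our own illustration, under stated assumptions, not the authors' implementation) the "quantum simulator" is stood in for by a routine that proposes fluctuation-perturbed spin configurations, and the classical computer evaluates each proposal, accepting it only if the energy does not increase.

```python
import random

# Toy hybrid-annealing sketch on a lazily sampled random energy model.
def hybrid_anneal(n_spins=8, n_measurements=200, seed=1):
    rng = random.Random(seed)
    energies = {}                      # configuration -> cached random energy

    def E(cfg):
        # Random energy model: each configuration gets an independent
        # random energy, sampled on first sight and cached.
        if cfg not in energies:
            energies[cfg] = rng.uniform(-1.0, 1.0)
        return energies[cfg]

    best = tuple(rng.choice((-1, 1)) for _ in range(n_spins))
    best_E = E(best)
    for _ in range(n_measurements):
        # "Projective measurement" after quantum fluctuations, modelled
        # here as flipping each spin with 30% probability.
        proposal = tuple(s if rng.random() < 0.7 else -s for s in best)
        # Classical post-processing enforces a lowering of the energy.
        if E(proposal) <= best_E:
            best, best_E = proposal, E(proposal)
    return best, best_E, energies
```

Because acceptance never raises the energy, the returned configuration is the lowest-energy one among all configurations the "simulator" ever measured.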
Shen, Wenfeng; Wei, Daming; Xu, Weimin; Zhu, Xin; Yuan, Shizhong
2010-10-01
Biological computations like electrocardiological modelling and simulation usually require high-performance computing environments. This paper introduces an implementation of parallel computation for computer simulation of electrocardiograms (ECGs) in a personal computer environment with an Intel CPU of Core (TM) 2 Quad Q6600 and a GPU of Geforce 8800GT, with software support by OpenMP and CUDA. It was tested in three parallelization device setups: (a) a four-core CPU without a general-purpose GPU, (b) a general-purpose GPU plus 1 core of CPU, and (c) a four-core CPU plus a general-purpose GPU. To effectively take advantage of a multi-core CPU and a general-purpose GPU, an algorithm based on load-prediction dynamic scheduling was developed and applied to setting (c). In the simulation with 1600 time steps, the speedup of the parallel computation as compared to the serial computation was 3.9 in setting (a), 16.8 in setting (b), and 20.0 in setting (c). This study demonstrates that a current PC with a multi-core CPU and a general-purpose GPU provides a good environment for parallel computations in biological modelling and simulation studies. Copyright 2010 Elsevier Ireland Ltd. All rights reserved.
High-performance computing MRI simulations.
Stöcker, Tony; Vahedipour, Kaveh; Pflugfelder, Daniel; Shah, N Jon
2010-07-01
A new open-source software project is presented, JEMRIS, the Jülich Extensible MRI Simulator, which provides an MRI sequence development and simulation environment for the MRI community. The development was driven by the desire to achieve generality of simulated three-dimensional MRI experiments reflecting modern MRI systems hardware. The accompanying computational burden is overcome by means of parallel computing. Many aspects are covered that have not hitherto been simultaneously investigated in general MRI simulations such as parallel transmit and receive, important off-resonance effects, nonlinear gradients, and arbitrary spatiotemporal parameter variations at different levels. The latter can be used to simulate various types of motion, for instance. The JEMRIS user interface is very simple to use, but nevertheless it presents few limitations. MRI sequences with arbitrary waveforms and complex interdependent modules are modeled in a graphical user interface-based environment requiring no further programming. This manuscript describes the concepts, methods, and performance of the software. Examples of novel simulation results in active fields of MRI research are given. (c) 2010 Wiley-Liss, Inc.
Computer Game-based Learning: Applied Game Development Made Simpler
Nyamsuren, Enkhbold
2018-01-01
The RAGE project (Realising an Applied Gaming Ecosystem, http://rageproject.eu/) is an ongoing initiative that aims to offer an ecosystem to support serious games’ development and use. Its two main objectives are to provide technologies for computer game-based pedagogy and learning and to establish
Fluid Dynamics Theory, Computation, and Numerical Simulation
Pozrikidis, Constantine
2009-01-01
Fluid Dynamics: Theory, Computation, and Numerical Simulation is the only available book that extends the classical field of fluid dynamics into the realm of scientific computing in a way that is both comprehensive and accessible to the beginner. The theory of fluid dynamics, and the implementation of solution procedures into numerical algorithms, are discussed hand-in-hand and with reference to computer programming. This book is an accessible introduction to theoretical and computational fluid dynamics (CFD), written from a modern perspective that unifies theory and numerical practice. There are several additions and subject expansions in the Second Edition of Fluid Dynamics, including new Matlab and FORTRAN codes. Two distinguishing features of the discourse are: solution procedures and algorithms are developed immediately after problem formulations are presented, and numerical methods are introduced on a need-to-know basis and in increasing order of difficulty. Matlab codes are presented and discussed for ...
Fluid dynamics theory, computation, and numerical simulation
Pozrikidis, C
2001-01-01
Fluid Dynamics: Theory, Computation, and Numerical Simulation is the only available book that extends the classical field of fluid dynamics into the realm of scientific computing in a way that is both comprehensive and accessible to the beginner. The theory of fluid dynamics, and the implementation of solution procedures into numerical algorithms, are discussed hand-in-hand and with reference to computer programming. This book is an accessible introduction to theoretical and computational fluid dynamics (CFD), written from a modern perspective that unifies theory and numerical practice. There are several additions and subject expansions in the Second Edition of Fluid Dynamics, including new Matlab and FORTRAN codes. Two distinguishing features of the discourse are: solution procedures and algorithms are developed immediately after problem formulations are presented, and numerical methods are introduced on a need-to-know basis and in increasing order of difficulty. Matlab codes are presented and discussed for a broad...
Neumann, David L.
2010-01-01
Interactive computer-based simulations have been applied in several contexts to teach statistical concepts in university level courses. In this report, the use of interactive simulations as part of summative assessment in a statistics course is described. Students accessed the simulations via the web and completed questions relating to the…
1991-07-01
…d'Electronique Industrielle, Unité de Recherche Associée au CNRS n°847, …, 31071 Toulouse, France. Abstract: The authors present a modelling… The different… …Électronique de Nancy… ENSEM, B.P. 8118, Oasis, Casablanca, MAROC; ENSEM, INPL, 2 Av. de la Forêt de Haye, 54516 Vandoeuvre-lès-Nancy, France. Abstract: A position controller for a DC drive is presented. The applied method uses the C representation of the system. The robustness is assured by the use…
4th International Conference on Applied Computing and Information Technology
2017-01-01
This edited book presents scientific results of the 4th International Conference on Applied Computing and Information Technology (ACIT 2016) which was held on December 12–14, 2016 in Las Vegas, USA. The aim of this conference was to bring together researchers and scientists, businessmen and entrepreneurs, teachers, engineers, computer users, and students to discuss the numerous fields of computer science and to share their experiences and exchange new ideas and information in a meaningful way. The aim of this conference was also to bring out the research results about all aspects (theory, applications and tools) of computer and information science, and to discuss the practical challenges encountered along the way and the solutions adopted to solve them. The conference organizers selected the best papers from those papers accepted for presentation at the conference. The papers were chosen based on review scores submitted by members of the Program Committee, and underwent further rigorous rounds of review. Th...
Computer Simulation for Emergency Incident Management
Energy Technology Data Exchange (ETDEWEB)
Brown, D L
2004-12-03
This report describes the findings and recommendations resulting from the Department of Homeland Security (DHS) Incident Management Simulation Workshop held by the DHS Advanced Scientific Computing Program in May 2004. This workshop brought senior representatives of the emergency response and incident-management communities together with modeling and simulation technologists from Department of Energy laboratories. The workshop provided an opportunity for incident responders to describe the nature and substance of the primary personnel roles in an incident response, to identify current and anticipated roles of modeling and simulation in support of incident response, and to begin a dialog between the incident response and simulation technology communities that will guide and inform planned modeling and simulation development for incident response. This report provides a summary of the discussions at the workshop as well as a summary of simulation capabilities that are relevant to incident-management training, and recommendations for the use of simulation in both incident management and in incident management training, based on the discussions at the workshop. In addition, the report discusses areas where further research and development will be required to support future needs in this area.
Computational fluid dynamics for sport simulation
2009-01-01
All over the world sport plays a prominent role in society: as a leisure activity for many, as an ingredient of culture, as a business and as a matter of national prestige in such major events as the World Cup in soccer or the Olympic Games. Hence, it is not surprising that science has entered the realm of sports, and, in particular, that computer simulation has become highly relevant in recent years. This is explored in this book by choosing five different sports as examples, demonstrating that computational science and engineering (CSE) can make essential contributions to research on sports topics on both the fundamental level and, eventually, by supporting athletes’ performance.
Computer simulation of multiple dynamic photorefractive gratings
DEFF Research Database (Denmark)
Buchhave, Preben
1998-01-01
The benefits of a direct visualization of space-charge grating buildup are described. The visualization is carried out by a simple repetitive computer program, which simulates the basic processes in the band-transport model and displays the result graphically or in the form of numerical data. The simulation sheds light on issues that are not amenable to analytical solutions, such as the spectral content of the wave forms, cross talk in three-beam interaction, and the range of applications of the band-transport model. (C) 1998 Optical Society of America.
Computational cell biology: spatiotemporal simulation of cellular events.
Slepchenko, Boris M; Schaff, James C; Carson, John H; Loew, Leslie M
2002-01-01
The field of computational cell biology has emerged within the past 5 years because of the need to apply disciplined computational approaches to build and test complex hypotheses on the interacting structural, physical, and chemical features that underlie intracellular processes. To meet this need, newly developed software tools allow cell biologists and biophysicists to build models and generate simulations from them. The construction of general-purpose computational approaches is especially challenging if the spatial complexity of cellular systems is to be explicitly treated. This review surveys some of the existing efforts in this field with special emphasis on a system being developed in the authors' laboratory, Virtual Cell. The theories behind both stochastic and deterministic simulations are discussed. Examples of respective applications to cell biological problems in RNA trafficking and neuronal calcium dynamics are provided to illustrate these ideas.
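As a minimal illustration of the stochastic side of such simulations, the sketch below implements Gillespie's exact stochastic simulation algorithm for a single hypothetical decay reaction A → ∅; the rate constant and initial count are arbitrary and unrelated to the Virtual Cell examples.

```python
import math
import random

def gillespie_decay(n0, k, t_end, rng):
    """Exact stochastic simulation of A -> 0 with rate constant k."""
    t, n = 0.0, n0
    trajectory = [(t, n)]
    while n > 0:
        propensity = k * n
        dt = -math.log(rng.random()) / propensity  # exponential waiting time
        t += dt
        if t > t_end:
            break
        n -= 1                                     # one decay event fires
        trajectory.append((t, n))
    return trajectory

rng = random.Random(42)
traj = gillespie_decay(n0=100, k=0.5, t_end=20.0, rng=rng)
```

Each run of the algorithm produces a different, statistically exact realization; a deterministic ODE solver would instead return the single mean trajectory n(t) = n0·exp(-kt).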
Time reversibility, computer simulation, algorithms, chaos
Hoover, William Graham
2012-01-01
A small army of physicists, chemists, mathematicians, and engineers has joined forces to attack a classic problem, the "reversibility paradox", with modern tools. This book describes their work from the perspective of computer simulation, emphasizing the author's approach to the problem of understanding the compatibility, and even inevitability, of the irreversible second law of thermodynamics with an underlying time-reversible mechanics. Computer simulation has made it possible to probe reversibility from a variety of directions and "chaos theory" or "nonlinear dynamics" has supplied a useful vocabulary and a set of concepts, which allow a fuller explanation of irreversibility than that available to Boltzmann or to Green, Kubo and Onsager. Clear illustration of concepts is emphasized throughout, and reinforced with a glossary of technical terms from the specialized fields which have been combined here to focus on a common theme. The book begins with a discussion, contrasting the idealized reversibility of ba...
Computer simulation of molecular sorption in zeolites
Calmiano, M D
2001-01-01
The work presented in this thesis encompasses the computer simulation of molecular sorption. In Chapter 1 we outline the aims and objectives of this work. Chapter 2 follows, in which an introduction to sorption in zeolites is presented, with discussion of the structure and properties of the main zeolites studied. Chapter 2 concludes with a description of the principles and theories of adsorption. In Chapter 3 we describe the methodology behind the work carried out in this thesis. In Chapter 4 we present our first computational study, that of the sorption of krypton in silicalite. We describe work carried out to investigate low-energy sorption sites of krypton in silicalite, where we observe krypton to preferentially sorb into straight and sinusoidal channels over channel intersections. We simulate single-step type I adsorption isotherms and use molecular dynamics to study the diffusion of krypton and obtain diffusion coefficients and the activation energy. We compare our results to previous experimental and computat...
Understanding membrane fouling mechanisms through computational simulations
Xiang, Yuan
This dissertation presents a computational simulation study of the organic fouling mechanisms of reverse osmosis and nanofiltration (RO/NF) membranes, which are widely used in industry for water purification. The research shows that, by establishing a realistic computational model based on available experimental data, we are able to develop a deep understanding of the membrane fouling mechanism. This knowledge is critical for providing a strategic plan to the membrane experimental community and the RO/NF industry for further improvements in membrane technology for water treatment. The dissertation comprises three major research components: (1) development of realistic molecular models that represent the membrane surface properties well; (2) investigation of the interactions between the membrane surface and foulants by steered molecular dynamics simulations, in order to determine the major factors that contribute to surface fouling; and (3) studies of the interactions between surface-modified membranes (polyethylene glycol) and foulants, to provide strategies for antifouling.
Computational plasticity algorithm for particle dynamics simulations
Krabbenhoft, K.; Lyamin, A. V.; Vignes, C.
2018-01-01
The problem of particle dynamics simulation is interpreted in the framework of computational plasticity leading to an algorithm which is mathematically indistinguishable from the common implicit scheme widely used in the finite element analysis of elastoplastic boundary value problems. This algorithm provides somewhat of a unification of two particle methods, the discrete element method and the contact dynamics method, which usually are thought of as being quite disparate. In particular, it is shown that the former appears as the special case where the time stepping is explicit while the use of implicit time stepping leads to the kind of schemes usually labelled contact dynamics methods. The framing of particle dynamics simulation within computational plasticity paves the way for new approaches similar (or identical) to those frequently employed in nonlinear finite element analysis. These include mixed implicit-explicit time stepping, dynamic relaxation and domain decomposition schemes.
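The explicit (discrete element) branch of the time-stepping dichotomy described above can be sketched with a toy one-dimensional two-particle contact, integrated with semi-implicit Euler; the linear spring-dashpot contact law and all parameter values are illustrative, not taken from the paper.

```python
def dem_step(x, v, m, radius, k, c, dt):
    """One explicit (semi-implicit Euler) step for two 1D particles with a
    linear spring-dashpot contact force acting on overlap."""
    gap = (x[1] - x[0]) - 2 * radius
    f = 0.0
    if gap < 0:                      # particles overlap: repel with damping
        rel_v = v[1] - v[0]
        f = -k * gap - c * rel_v     # force on the right-hand particle
    a = [-f / m, f / m]
    v = [v[i] + a[i] * dt for i in range(2)]   # update velocity first ...
    x = [x[i] + v[i] * dt for i in range(2)]   # ... then position (symplectic)
    return x, v

# head-on approach: touch, bounce, separate with slight energy loss
x, v = [0.0, 1.0], [1.0, -1.0]
for _ in range(2000):
    x, v = dem_step(x, v, m=1.0, radius=0.3, k=1000.0, c=1.0, dt=1e-3)
```

Replacing the explicit velocity update with an implicit solve over the contact constraints would yield the contact-dynamics flavour of the scheme, as the abstract explains.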
Computer simulations and theory of protein translocation.
Makarov, Dmitrii E
2009-02-17
The translocation of proteins through pores is central to many biological phenomena, such as mitochondrial protein import, protein degradation, and delivery of protein toxins to their cytosolic targets. Because proteins typically have to pass through constrictions that are too narrow to accommodate folded structures, translocation must be coupled to protein unfolding. The simplest model that accounts for such co-translocational unfolding assumes that both translocation and unfolding are accomplished by pulling on the end of the polypeptide chain mechanically. In this Account, we describe theoretical studies and computer simulations of this model and discuss how the time scales of translocation depend on the pulling force and on the protein structure. Computationally, this is a difficult problem because biologically or experimentally relevant time scales of translocation are typically orders of magnitude slower than those accessible by fully atomistic simulations. For this reason, we explore one-dimensional free energy landscapes along suitably defined translocation coordinates and discuss various approaches to their computation. We argue that the free energy landscape of translocation is often bumpy because confinement partitions the protein's configuration space into distinct basins of attraction separated by large entropic barriers. Favorable protein-pore interactions and nonnative interactions within the protein further contribute to the complexity. Computer simulations and simple scaling estimates show that forces of just 2-6 pN are often sufficient to ensure transport of unstructured polypeptides, whereas much higher forces are typically needed to translocate folded protein domains. The unfolding mechanisms found from simulations of translocation are different from those observed in the much better understood case of atomic force microscopy (AFM) pulling studies, in which proteins are unraveled by stretching them between their N- and C-termini. In contrast to
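The claim that a pulling force lowers the effective translocation barrier can be illustrated on a one-dimensional model landscape; the sinusoidal G(x) below is a made-up stand-in for the bumpy free-energy profiles discussed above.

```python
import math

def barrier_height(f, xs):
    """Largest forward barrier of the force-tilted landscape G(x) - f*x,
    i.e. the biggest climb out of any basin encountered left to right."""
    g = [math.sin(2 * math.pi * x) - f * x for x in xs]  # model landscape
    best = 0.0
    running_min = g[0]
    for gi in g:
        running_min = min(running_min, gi)
        best = max(best, gi - running_min)
    return best

xs = [i / 1000 for i in range(2001)]     # translocation coordinate in [0, 2]
b0 = barrier_height(0.0, xs)             # no pulling force
b1 = barrier_height(2.0, xs)             # with a pulling force f = 2
```

Tilting the landscape by -f·x strictly reduces every forward barrier, which is the qualitative reason even piconewton-scale forces speed up transport of unstructured chains.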
Computer Simulation of Multidimensional Archaeological Artefacts
Directory of Open Access Journals (Sweden)
Vera Moitinho de Almeida
2012-11-01
Our project focuses on the Neolithic lakeside site of La Draga (Banyoles, Catalonia). In this presentation we will begin by providing a clear overview of the major guidelines used to capture and process 3D digital data of several wooden artefacts. Then, we shall present the use of semi-automated extraction of relevant features. Finally, we intend to share preliminary computer simulation issues.
Computational intelligence applied to the growth of quantum dots
Singulani, Anderson P.; Vilela Neto, Omar P.; Aurélio Pacheco, Marco C.; Vellasco, Marley B. R.; Pires, Maurício P.; Souza, Patrícia L.
2008-11-01
We apply two computational intelligence techniques, namely an artificial neural network and a genetic algorithm, to the growth of self-assembled quantum dots. The method relies on an existing database relating growth parameters to a resulting quantum dot characteristic, from which the growth parameters needed to reach a specific value of that characteristic can later be obtained. The computational techniques were used to associate the growth input parameters with the mean height of the deposited quantum dots. Trends in the quantum dot mean height as a function of the growth parameters were correctly predicted, and the growth parameters required to minimize the quantum dot mean height were provided.
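A toy version of the workflow just described, with a hand-written convex surrogate standing in for the trained neural network and a simple genetic algorithm minimizing the predicted mean dot height; the parameter names, the surrogate, and all constants are hypothetical.

```python
import random

def surrogate_height(params):
    """Toy stand-in for the trained neural network: maps normalized growth
    parameters (temperature, rate) to a predicted mean dot height."""
    t, r = params
    return (t - 0.3) ** 2 + (r - 0.7) ** 2 + 1.0  # minimum 1.0 at (0.3, 0.7)

def genetic_minimize(fitness, rng, pop_size=30, generations=60):
    """Truncation selection + mean crossover + Gaussian mutation."""
    pop = [(rng.random(), rng.random()) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]            # keep the better half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            child = tuple(
                min(1.0, max(0.0, (ai + bi) / 2 + rng.gauss(0, 0.05)))
                for ai, bi in zip(a, b)           # crossover + mutation
            )
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

best = genetic_minimize(surrogate_height, random.Random(0))
```

In the actual method the surrogate would be a network fitted to the experimental growth database rather than a closed-form function.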
Adaptive scapula bone remodeling computational simulation: Relevance to regenerative medicine
Energy Technology Data Exchange (ETDEWEB)
Sharma, Gulshan B., E-mail: gbsharma@ucalgary.ca [Emory University, Department of Radiology and Imaging Sciences, Spine and Orthopaedic Center, Atlanta, Georgia 30329 (United States); University of Pittsburgh, Swanson School of Engineering, Department of Bioengineering, Pittsburgh, Pennsylvania 15213 (United States); University of Calgary, Schulich School of Engineering, Department of Mechanical and Manufacturing Engineering, Calgary, Alberta T2N 1N4 (Canada); Robertson, Douglas D., E-mail: douglas.d.robertson@emory.edu [Emory University, Department of Radiology and Imaging Sciences, Spine and Orthopaedic Center, Atlanta, Georgia 30329 (United States); University of Pittsburgh, Swanson School of Engineering, Department of Bioengineering, Pittsburgh, Pennsylvania 15213 (United States)
2013-07-01
Shoulder arthroplasty success has been attributed to many factors, including bone quality, soft tissue balancing, surgeon experience, and implant design. Improved long-term success is primarily limited by glenoid implant loosening. Prosthesis design examines materials and shape to determine whether the implant will withstand a lifetime of use. Finite element (FE) analyses have been used extensively to study the stresses and strains produced in implants and bone. However, these static analyses capture only a moment in time, not the adaptive response to the altered environment produced by the therapeutic intervention. Computational analyses that integrate remodeling rules predict how bone will respond over time. Recent work has shown that subject-specific two- and three-dimensional adaptive bone remodeling models are feasible and valid. Feasibility and validation were achieved computationally by simulating bone remodeling using an intact human scapula: the scapular bone material properties were initially reset to be uniform, sequential loading was simulated numerically, and the bone remodeling simulation results were compared to the actual scapula's material properties. A three-dimensional scapula FE bone model was created using volumetric computed tomography images. Muscle and joint loads and boundary conditions were applied based on values reported in the literature. Internal bone remodeling was based on element strain-energy density. Initially, all bone elements were assigned a homogeneous density. All loads were applied for 10 iterations. After every iteration, each bone element's remodeling stimulus was compared to its corresponding reference stimulus and its material properties were modified. The simulation achieved convergence. At the end of the simulation the predicted and actual specimen bone apparent densities were plotted and compared. Locations of high and low predicted bone density were comparable to the actual specimen. High predicted bone density was greater than
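The iterate-compare-adapt loop described above can be sketched with a scalar toy model in which each element's density is driven toward the state where its strain-energy stimulus matches a reference value; the stimulus law, loads, and constants are illustrative, not those of the scapula model.

```python
def remodel(densities, loads, ref_stimulus, rate=0.2, iterations=100):
    """Adaptive update: each element's density chases the state where its
    strain-energy stimulus equals the reference stimulus."""
    rho = list(densities)
    for _ in range(iterations):
        for i, load in enumerate(loads):
            # toy stimulus: strain energy density per unit bone mass.
            # with stiffness ~ rho**2, U ~ load**2 / rho**2, so
            # stimulus U/rho ~ load**2 / rho**3
            stimulus = load ** 2 / rho[i] ** 3
            rho[i] += rate * (stimulus - ref_stimulus)
            rho[i] = min(2.0, max(0.05, rho[i]))  # physiological bounds
    return rho

# start from a homogeneous density, as in the simulation described above
rho = remodel([1.0, 1.0, 1.0], loads=[0.5, 1.0, 1.5], ref_stimulus=1.0)
```

At convergence the density pattern mirrors the load pattern (here rho → load^(2/3) element by element), which is the mechanism by which the full FE simulation recovers the specimen's heterogeneous apparent density from uniform initial properties.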
Accelerating Climate Simulations Through Hybrid Computing
Zhou, Shujia; Sinno, Scott; Cruz, Carlos; Purcell, Mark
2009-01-01
Unconventional multi-core processors (e.g., IBM Cell B/E and NVIDIA GPUs) have emerged as accelerators in climate simulation. However, climate models typically run on parallel computers with conventional processors (e.g., Intel and AMD) using MPI. Connecting accelerators to this architecture efficiently and easily becomes a critical issue. When using MPI for the connection, we identified two challenges: (1) an identical MPI implementation is required on both systems, and (2) existing MPI code must be modified to accommodate the accelerators. In response, we have extended and deployed IBM Dynamic Application Virtualization (DAV) in a hybrid computing prototype system (one blade with two Intel quad-core processors and two IBM QS22 Cell blades, connected with InfiniBand), allowing compute-intensive functions to be seamlessly offloaded to remote, heterogeneous accelerators in a scalable, load-balanced manner. Currently, a climate solar radiation model running with multiple MPI processes has been offloaded to multiple Cell blades with approximately 10% network overhead.
Global Conference on Applied Computing in Science and Engineering
2016-01-01
The Global Conference on Applied Computing in Science and Engineering is organized by academics and researchers belonging to different scientific areas of the C3i/Polytechnic Institute of Portalegre (Portugal) and the University of Extremadura (Spain) with the technical support of ScienceKnow Conferences. The event has the objective of creating an international forum for academics, researchers and scientists from worldwide to discuss worldwide results and proposals regarding to the soundest issues related to Applied Computing in Science and Engineering. This event will include the participation of renowned keynote speakers, oral presentations, posters sessions and technical conferences related to the topics dealt with in the Scientific Program as well as an attractive social and cultural program. The papers will be published in the Proceedings e-books. The proceedings of the conference will be sent to possible indexing on Thomson Reuters (selective by Thomson Reuters, not all-inclusive) and Google Scholar...
Computational physics simulation of classical and quantum systems
Scherer, Philipp O J
2013-01-01
This textbook presents basic and advanced computational physics in a very didactic style, with clear and simple mathematical descriptions of many of the most important algorithms used in computational physics. The first part of the book discusses the basic numerical methods; a large number of exercises and computer experiments allows the reader to study the properties of these methods. The second part concentrates on the simulation of classical and quantum systems. It uses a rather general concept for the equation of motion which can be applied to ordinary and partial differential equations. Several classes of integration methods are discussed, including not only the standard Euler and Runge-Kutta methods but also multistep methods and the class of Verlet methods, which is introduced by studying the motion in Liouville space. Besides the classical methods, inverse interpolation is discussed, together with the p...
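The Verlet class mentioned above can be illustrated with the velocity-Verlet scheme applied to a harmonic oscillator; this is a generic textbook sketch, not code from the book.

```python
def velocity_verlet(x, v, force, m, dt, steps):
    """Velocity-Verlet integration: time-reversible and symplectic, which
    keeps the energy error bounded over long runs instead of drifting."""
    a = force(x) / m
    traj = [(x, v)]
    for _ in range(steps):
        x = x + v * dt + 0.5 * a * dt * dt   # position update
        a_new = force(x) / m
        v = v + 0.5 * (a + a_new) * dt       # average old and new forces
        a = a_new
        traj.append((x, v))
    return traj

# harmonic oscillator with k = m = 1: exact energy is 0.5 at all times
traj = velocity_verlet(x=1.0, v=0.0, force=lambda x: -x, m=1.0, dt=0.01, steps=10000)
```

A forward-Euler integrator run on the same problem would show the energy growing steadily, which is why the Verlet family is the default choice in molecular dynamics.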
SPINET: A Parallel Computing Approach to Spine Simulations
Directory of Open Access Journals (Sweden)
Peter G. Kropf
1996-01-01
Research in scientific programming enables us to realize more and more complex applications, while application-driven demands on computing methods and power are continuously growing. Therefore, interdisciplinary approaches are becoming more widely used. The interdisciplinary SPINET project presented in this article applies modern scientific computing tools to biomechanical simulations: parallel computing and symbolic and modern functional programming. The target application is the human spine. Simulations of the spine help us to investigate and better understand the mechanisms of back pain and spinal injury. Two approaches have been used: the first uses the finite element method for high-performance simulations of static biomechanical models, and the second generates a simulation development tool for experimenting with different dynamic models. A finite element program for static analysis has been parallelized for the MUSIC machine. To solve the sparse system of linear equations, a conjugate gradient solver (iterative method) and a frontal solver (direct method) have been implemented. The preprocessor required for the frontal solver is written in the modern functional programming language SML, the solver itself in C, thus exploiting the characteristic advantages of both functional and imperative programming. The speedup analysis of both solvers shows very satisfactory results for this irregular problem. A mixed symbolic-numeric environment for rigid body system simulations is also presented; it automatically generates C code from a problem specification expressed in the Lagrange formalism using Maple.
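The iterative solver named above, the conjugate gradient method, is sketched below for a small symmetric positive-definite system; this is a generic dense-matrix version, not the SPINET implementation, which operates on sparse FE matrices.

```python
def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Iterative CG solver for a symmetric positive-definite system A x = b.
    A is a dense list-of-lists here; a real code would use a sparse format."""
    n = len(b)
    x = [0.0] * n
    r = b[:]                        # residual b - A x for the zero start
    p = r[:]                        # first search direction
    rs_old = sum(ri * ri for ri in r)
    for _ in range(max_iter):
        Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
        alpha = rs_old / sum(p[i] * Ap[i] for i in range(n))
        x = [x[i] + alpha * p[i] for i in range(n)]
        r = [r[i] - alpha * Ap[i] for i in range(n)]
        rs_new = sum(ri * ri for ri in r)
        if rs_new < tol:
            break
        p = [r[i] + (rs_new / rs_old) * p[i] for i in range(n)]
        rs_old = rs_new
    return x

# small SPD test system (1D Laplacian-like stiffness matrix)
A = [[4.0, -1.0, 0.0], [-1.0, 4.0, -1.0], [0.0, -1.0, 4.0]]
b = [1.0, 2.0, 3.0]
x = conjugate_gradient(A, b)
```

In exact arithmetic CG converges in at most n iterations, and each iteration needs only matrix-vector products, which is what makes it attractive for the large sparse systems arising from finite element discretizations.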
Computer Game-based Learning: Applied Game Development Made Simpler
Nyamsuren, Enkhbold
2018-01-01
The RAGE project (Realising an Applied Gaming Ecosystem, http://rageproject.eu/) is an ongoing initiative that aims to offer an ecosystem to support serious games’ development and use. Its two main objectives are to provide technologies for computer game-based pedagogy and learning and to establish an online platform for managing and sharing these technologies. Example technologies include, but are not limited to, real-time player performance assessment, game difficulty adaptation, and emotion recognit...
Computer Models Simulate Fine Particle Dispersion
2010-01-01
Through a NASA Seed Fund partnership with DEM Solutions Inc., of Lebanon, New Hampshire, scientists at Kennedy Space Center refined existing software to study the electrostatic phenomena of granular and bulk materials as they apply to planetary surfaces. The software, EDEM, allows users to import particles and obtain accurate representations of their shapes for modeling purposes, such as simulating bulk solids behavior, and was enhanced to be able to more accurately model fine, abrasive, cohesive particles. These new EDEM capabilities can be applied in many industries unrelated to space exploration and have been adopted by several prominent U.S. companies, including John Deere, Pfizer, and Procter & Gamble.
Morgan, Philip E.
2004-01-01
This final report contains reports of research related to the tasks "Scalable High Performance Computing: Direct and Large-Eddy Turbulent Flow Simulations Using Massively Parallel Computers" and "Develop High-Performance Time-Domain Computational Electromagnetics Capability for RCS Prediction, Wave Propagation in Dispersive Media, and Dual-Use Applications." The discussion of Scalable High Performance Computing reports on three objectives: validate, assess the scalability of, and apply two parallel flow solvers for three-dimensional Navier-Stokes flows; develop and validate a high-order parallel solver for Direct Numerical Simulation (DNS) and Large Eddy Simulation (LES) problems; and investigate and develop a high-order Reynolds-averaged Navier-Stokes turbulence model. The discussion of High-Performance Time-Domain Computational Electromagnetics reports on five objectives: enhance an electromagnetics code (CHARGE) to effectively model antenna problems; apply lessons learned from the high-order/spectral solution of swirling 3D jets to the electromagnetics project; transition a high-order fluids code, FDL3DI, to solve Maxwell's equations using compact differencing; develop and demonstrate improved radiation-absorbing boundary conditions for high-order CEM; and extend the high-order CEM solver to address variable material properties. The report also contains a review of work done by the systems engineer.
On architectural acoustic design using computer simulation
DEFF Research Database (Denmark)
Schmidt, Anne Marie Due; Kirkegaard, Poul Henning
2004-01-01
With modern room acoustic simulation programmes it is possible to evaluate acoustic properties prior to the actual construction of a building. With the right tools applied, acoustic design can become an integral part of the architectural design process. The aim of this paper is to investigate the field of application that an acoustic simulation programme can have during an architectural acoustic design process. The emphasis is put on the first three of the five phases in the working process of the architect, and a case study is carried out in which each phase is represented by typical results, as exemplified with reference to the design of Bagsværd Church by Jørn Utzon. The paper discusses the advantages and disadvantages of the programme in each phase compared to the works of architects not using acoustic simulation programmes. The conclusion of the paper is that the application of acoustic simulation programs is most beneficial in the last of the three phases.
Methods for Computationally Efficient Structured CFD Simulations of Complex Turbomachinery Flows
Herrick, Gregory P.; Chen, Jen-Ping
2012-01-01
This research presents more efficient computational methods by which to perform multi-block structured Computational Fluid Dynamics (CFD) simulations of turbomachinery, thus facilitating higher-fidelity solutions of complicated geometries and their associated flows. This computational framework offers flexibility in allocating resources to balance process count and wall-clock computation time, while facilitating research interests of simulating axial compressor stall inception with more complete gridding of the flow passages and rotor tip clearance regions than is typically practiced with structured codes. The paradigm presented herein facilitates CFD simulation of previously impractical geometries and flows. These methods are validated and demonstrate improved computational efficiency when applied to complicated geometries and flows.
Computer simulation of electrokinetics in colloidal systems
Schmitz, R.; Starchenko, V.; Dünweg, B.
2013-11-01
The contribution gives a brief overview outlining how our theoretical understanding of the phenomenon of colloidal electrophoresis has improved over the decades. Particular emphasis is put on numerical calculations and computer simulation models, which have become more and more important as the level of description became more detailed and refined. Due to computational limitations, it has so far not been possible to study "perfect" models. Different complementary models have hence been developed, and their various strengths and deficiencies are briefly discussed. This is contrasted with the experimental situation, where there are still observations waiting for theoretical explanation. The contribution then outlines our recent development of a numerical method to solve the electrokinetic equations for a finite volume in three dimensions, and describes some new results that could be obtained by the approach.
Computer simulation of spacecraft/environment interaction
Krupnikov, K K; Mileev, V N; Novikov, L S; Sinolits, V V
1999-01-01
This report presents some examples of computer simulation of spacecraft interaction with the space environment. We analysed a set of data on electron and ion fluxes measured in 1991-1994 on the geostationary satellite GORIZONT-35. The influence of spacecraft eclipse, and of device eclipse by the solar-cell panel, on spacecraft charging was investigated. A simple method was developed for estimating spacecraft potentials in LEO. Effects of various particle flux impacts and of spacecraft orientation are discussed. A computer engineering model for the calculation of space radiation is presented. This model is deployed as a client/server application with a WWW interface, including spacecraft model description and results representation based on the Virtual Reality Modeling Language (VRML).
How to simulate a universal quantum computer using negative probabilities
Hofmann, Holger F.
2009-07-01
The concept of negative probabilities can be used to decompose the interaction of two qubits mediated by a quantum controlled-NOT into three operations that require only classical interactions (that is, local operations and classical communication) between the qubits. For a single gate, the probabilities of the three operations are 1, 1 and -1. This decomposition can be applied in a probabilistic simulation of quantum computation by randomly choosing one of the three operations for each gate and assigning a negative statistical weight to the outcomes of sequences with an odd number of negative probability operations. The maximal exponential speed-up of a quantum computer can then be evaluated in terms of the increase in the number of sequences needed to simulate a single operation of the quantum circuit.
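The bookkeeping behind the signed-weight simulation can be checked by brute force: enumerating every sequence of the three operations for n gates shows that the signed weights always sum to 1 (a proper total probability) while the total absolute weight grows as 3^n, which is the source of the exponential sampling cost the abstract describes.

```python
from itertools import product

def signed_weight_census(n_gates):
    """Each CNOT decomposes into three local operations with quasi-
    probabilities (1, 1, -1). Enumerate all operation sequences for
    n_gates gates and tally the signed and absolute total weights."""
    weights = (1, 1, -1)
    signed = 0
    absolute = 0
    for seq in product(weights, repeat=n_gates):
        w = 1
        for wi in seq:
            w *= wi                 # odd number of -1 ops => negative weight
        signed += w
        absolute += abs(w)
    return signed, absolute

signed_1, abs_1 = signed_weight_census(1)   # (1, 3)
signed_4, abs_4 = signed_weight_census(4)   # (1, 81): cost grows as 3**n
```

A Monte Carlo simulator samples one sequence at a time and weights outcomes by the sign, so the 3^n growth of the absolute weight translates directly into the number of samples needed for a fixed precision.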
Multiscale Computer Simulation of Failure in Aerogels
Good, Brian S.
2008-01-01
Aerogels have been of interest to the aerospace community primarily for their thermal properties, notably their low thermal conductivities. While such gels are typically fragile, recent advances in the application of conformal polymer layers to these gels have made them potentially useful as lightweight structural materials as well. We have previously performed computer simulations of aerogel thermal conductivity and tensile and compressive failure, with results that are in qualitative, and sometimes quantitative, agreement with experiment. However, recent experiments in our laboratory suggest that gels having similar densities may exhibit substantially different properties. In this work, we extend our original diffusion limited cluster aggregation (DLCA) model for gel structure to incorporate additional variation in DLCA simulation parameters, with the aim of producing DLCA clusters of similar densities that nevertheless have different fractal dimension and secondary particle coordination. We perform particle statics simulations of gel strain on these clusters, and consider the effects of differing DLCA simulation conditions, and the resultant differences in fractal dimension and coordination, on gel strain properties.
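A minimal on-lattice sketch of diffusion-limited aggregation (single seed, so strictly DLA rather than the cluster-cluster DLCA variant used in the paper): random walkers released far from the cluster stick on first contact, producing the tenuous fractal geometry characteristic of gel backbones.

```python
import random

def grow_dla_cluster(n_particles, lattice_radius, rng):
    """Random walkers launched far away stick when they touch the cluster."""
    cluster = {(0, 0)}                         # seed particle at the origin
    moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    while len(cluster) < n_particles:
        # launch a walker on an axis point well outside the current cluster
        x, y = rng.choice(moves)
        x, y = x * lattice_radius, y * lattice_radius
        while True:
            dx, dy = rng.choice(moves)
            x, y = x + dx, y + dy
            if abs(x) > 2 * lattice_radius or abs(y) > 2 * lattice_radius:
                break                          # wandered off: relaunch
            if any((x + mx, y + my) in cluster for mx, my in moves):
                cluster.add((x, y))            # touched the cluster: stick
                break
    return cluster

cluster = grow_dla_cluster(n_particles=60, lattice_radius=20, rng=random.Random(7))
```

The DLCA variant instead lets many clusters diffuse and merge simultaneously, and varying the sticking and diffusion rules is what changes the fractal dimension and coordination at fixed density.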
Computed neutron coincidence counting applied to passive waste assay
Energy Technology Data Exchange (ETDEWEB)
Bruggeman, M.; Baeten, P.; De Boeck, W.; Carchon, R. [Nuclear Research Centre, Mol (Belgium)
1997-11-01
Neutron coincidence counting applied for the passive assay of fissile material is generally realised with dedicated electronic circuits. This paper presents a software based neutron coincidence counting method with data acquisition via a commercial PC-based Time Interval Analyser (TIA). The TIA is used to measure and record all time intervals between successive pulses in the pulse train up to count-rates of 2 Mpulses/s. Software modules are then used to compute the coincidence count-rates and multiplicity related data. This computed neutron coincidence counting (CNCC) offers full access to all the time information contained in the pulse train. This paper will mainly concentrate on the application and advantages of CNCC for the non-destructive assay of waste. An advanced multiplicity selective Rossi-alpha method is presented and its implementation via CNCC demonstrated. 13 refs., 4 figs., 2 tabs.
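The software side of CNCC can be sketched as a scan over recorded pulse timestamps, counting for each trigger pulse how many later pulses fall inside a coincidence gate; the pulse train and gate width below are invented for illustration.

```python
from bisect import bisect_right

def coincidence_counts(timestamps, gate):
    """For each trigger pulse, count the later pulses that fall inside
    the coincidence gate (trigger, trigger + gate]. Timestamps must be
    sorted, as they are when recorded by a time interval analyser."""
    total = 0
    for i, t in enumerate(timestamps):
        j = bisect_right(timestamps, t + gate)
        total += j - i - 1    # pulses after the trigger but within the gate
    return total

# toy pulse train (microseconds): two tight bursts and two isolated pulses
pulses = [0.0, 0.3, 0.5, 10.0, 25.0, 25.2, 40.0]
pairs = coincidence_counts(pulses, gate=1.0)
```

Because the full list of time intervals is retained, the same recorded pulse train can be reprocessed with different gate widths or with multiplicity-sensitive (Rossi-alpha-style) analyses, which is the flexibility the abstract highlights over hard-wired coincidence circuits.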
Computer simulation of arcuate keratotomy for astigmatism.
Hanna, K D; Jouve, F E; Waring, G O; Ciarlet, P G
1992-01-01
The development of refractive corneal surgery involves numerous attempts to isolate the effect of individual factors on surgical outcome. Computer simulation of refractive keratotomy allows the surgeon to alter variables of the technique and to isolate the effect of specific factors independent of other factors, something that cannot easily be done in any of the currently available experimental models. We used the finite element numerical method to construct a mathematical model of the eye. The model analyzed stress-strain relationships in the normal corneoscleral shell and after astigmatic surgery. The model made the following assumptions: an axisymmetric eye, an idealized aspheric anterior corneal surface, transversal isotropy of the cornea, nonlinear strain tensor for large displacements, and near incompressibility of the corneoscleral shell. The eye was assumed to be fixed at the level of the optic nerve. The model described the acute elastic response of the eye to corneal surgery. We analyzed the effect of paired transverse arcuate corneal incisions for the correction of astigmatism. We evaluated the following incision variables and their effect on change in curvature of the incised and unincised meridians: length (longer, more steepening of unincised meridian), distance from the center of the cornea (farther, less flattening of incised meridian), depth (deeper, more effect), and the initial amount of astigmatism (small effect). Our finite element computer model gives reasonably accurate information about the relative effects of different surgical variables, and demonstrates the feasibility of using nonlinear, anisotropic assumptions in the construction of such a computer model. Comparison of these computer-generated results to clinically achieved results may help refine the computer model.
On Architectural Acoustics Design using Computer Simulation
DEFF Research Database (Denmark)
Schmidt, Anne Marie Due; Kirkegaard, Poul Henning
2004-01-01
With modern room acoustic simulation programs it is now possible to subjectively analyze and evaluate acoustic properties prior to the actual construction of a facility. With the right tools applied, the acoustic design can become an integrated part of the architectural design process. The aim of the present paper is to investigate the field of application an acoustic simulation program can have during an architectural acoustics design process. A case study is carried out in order to represent the iterative working process of an architect. The working process is divided into five phases, each represented by typical results, exemplified by Bagsværd Church by Jørn Utzon, together with a description of which information would be beneficial to progress in the work. Among other things, the applicability of the program as a tool giving inspiration for finding forms of structures and rooms is considered, comparing an architect using it with an architect without...
A Computational Framework for Bioimaging Simulation.
Watabe, Masaki; Arjunan, Satya N V; Fukushima, Seiya; Iwamoto, Kazunari; Kozuka, Jun; Matsuoka, Satomi; Shindo, Yuki; Ueda, Masahiro; Takahashi, Koichi
2015-01-01
Using bioimaging technology, biologists have attempted to identify and document analytical interpretations that underlie biological phenomena in biological cells. Theoretical biology aims at distilling those interpretations into knowledge in the mathematical form of biochemical reaction networks and understanding how higher-level functions emerge from the combined action of biomolecules. However, there still remain formidable challenges in bridging the gap between bioimaging and mathematical modeling. Generally, measurements using fluorescence microscopy systems are influenced by systematic effects that arise from the stochastic nature of biological cells, the imaging apparatus, and optical physics. Such systematic effects are always present in all bioimaging systems and hinder quantitative comparison between the cell model and bioimages. Computational tools for such a comparison are still unavailable. Thus, in this work, we present a computational framework for handling the parameters of the cell models and the optical physics governing bioimaging systems. Simulation using this framework can generate digital images of cell simulation results after accounting for the systematic effects. We then demonstrate that such a framework enables comparison at the level of photon-counting units.
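One of the "systematic effects" above is photon-counting noise, which can be sketched by converting a noiseless model intensity map into Poisson-distributed pixel counts; the image, exposure value, and the Knuth-style sampler below are illustrative, not part of the framework described in the paper.

```python
import math
import random

def poisson_sample(mean, rng):
    """Knuth's algorithm: draw a Poisson-distributed photon count."""
    limit = math.exp(-mean)
    k, p = 0, 1.0
    while True:
        k += 1
        p *= rng.random()
        if p <= limit:
            return k - 1

def simulate_image(true_intensity, exposure, rng):
    """Turn a noiseless model intensity map into photon-count pixels,
    one independent Poisson draw per pixel."""
    return [[poisson_sample(pixel * exposure, rng) for pixel in row]
            for row in true_intensity]

truth = [[0.0, 0.0, 0.0],
         [0.0, 5.0, 0.0],
         [0.0, 0.0, 0.0]]    # a single point emitter at the centre
rng = random.Random(1)
image = simulate_image(truth, exposure=10.0, rng=rng)
```

A fuller simulator would first blur the intensity map with the microscope's point spread function and add camera read noise before the Poisson step; this sketch isolates only the photon-counting stage at which the framework's comparisons are made.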
Computational physics simulation of classical and quantum systems
Scherer, Philipp O J
2017-01-01
This textbook presents basic numerical methods and applies them to a large variety of physical models in multiple computer experiments. Classical algorithms and more recent methods are explained. Partial differential equations are treated generally comparing important methods, and equations of motion are solved by a large number of simple as well as more sophisticated methods. Several modern algorithms for quantum wavepacket motion are compared. The first part of the book discusses the basic numerical methods, while the second part simulates classical and quantum systems. Simple but non-trivial examples from a broad range of physical topics offer readers insights into the numerical treatment but also the simulated problems. Rotational motion is studied in detail, as are simple quantum systems. A two-level system in an external field demonstrates elementary principles from quantum optics and simulation of a quantum bit. Principles of molecular dynamics are shown. Modern boundary element methods are presented ...
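The two-level system in an external field mentioned above is the canonical qubit experiment: driven on resonance (in the rotating-wave approximation), the population oscillates between the two states at the Rabi frequency. A generic textbook sketch, not code from the book itself:

```python
import numpy as np

# Two-level system driven on resonance: in the rotating frame
# H = (Omega/2) * sigma_x, so a state starting in |0> reaches |1>
# with probability sin^2(Omega * t / 2).
Omega = 2.0 * np.pi                        # Rabi frequency (illustrative)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
H = 0.5 * Omega * sx

def evolve(psi0, t, steps=2000):
    """Propagate with the exact 2x2 propagator per step (eigendecomposition)."""
    dt = t / steps
    w, v = np.linalg.eigh(H)
    U = v @ np.diag(np.exp(-1j * w * dt)) @ v.conj().T
    psi = psi0.copy()
    for _ in range(steps):
        psi = U @ psi
    return psi

psi = evolve(np.array([1.0, 0.0], dtype=complex), t=0.5)  # half a Rabi period
print(abs(psi[1]) ** 2)  # ~1.0: complete population transfer
```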
Computer simulation of human motion in sports biomechanics.
Vaughan, C L
1984-01-01
This chapter has covered some important aspects of the computer simulation of human motion in sports biomechanics. First the definition and the advantages and limitations of computer simulation were discussed; second, research on various sporting activities was reviewed. These activities included basic movements, aquatic sports, track and field athletics, winter sports, gymnastics, and striking sports. This list was not exhaustive and certain material has, of necessity, been omitted. However, it was felt that a sufficiently broad and interesting range of activities was chosen to illustrate both the advantages and the pitfalls of simulation. It is almost a decade since Miller [53] wrote a review chapter similar to this one. One might be tempted to say that things have changed radically since then--that computer simulation is now a widely accepted and readily applied research tool in sports biomechanics. This is simply not true, however. Biomechanics researchers still tend to emphasize the descriptive type of study, often, unfortunately, when a little theoretical explanation would have been more helpful [29]. What will the next decade bring? Of one thing we can be certain: The power of computers, particularly the readily accessible and portable microcomputer, will expand beyond all recognition. The memory and storage capacities will increase dramatically on the hardware side, and on the software side the trend will be toward "user-friendliness." It is likely that a number of software simulation packages designed specifically for studying human motion [31, 96] will be extensively tested and could gain wide acceptance in the biomechanics research community. Nevertheless, a familiarity with Newtonian and Lagrangian mechanics, optimization theory, and computers in general, as well as practical biomechanical insight, will still be a prerequisite for successful simulation models of human motion. Above all, the biomechanics researcher will still have to bear in mind that …
Coating-substrate-simulations applied to HFQ® forming tools
Directory of Open Access Journals (Sweden)
Leopold Jürgen
2015-01-01
Full Text Available In this paper a comparative analysis of coating-substrate simulations applied to HFQ™ forming tools is presented. When using the solution heat treatment, cold die forming and quenching process, known as HFQ™, for forming hardened aluminium alloy automotive panel parts, coating-substrate systems have to satisfy unique requirements. Numerical experiments, based on the Advanced Adaptive FE method, will finally be presented.
Computer Simulation of Developmental Processes and ...
Rationale: Recent progress in systems toxicology and synthetic biology has paved the way to new thinking about in vitro/in silico modeling of developmental processes and toxicities, both for embryological and reproductive impacts. Novel in vitro platforms such as 3D organotypic culture models, engineered microscale tissues and complex microphysiological systems (MPS), together with computational models and computer simulation of tissue dynamics, lend themselves to integrated testing strategies for predictive toxicology. As these emergent methodologies continue to evolve, they must be integrally tied to maternal/fetal physiology and toxicity of the developing individual across early lifestage transitions, from fertilization to birth, through puberty and beyond. Scope: This symposium will focus on how the novel technology platforms can help now and in the future, with in vitro/in silico modeling of complex biological systems for developmental and reproductive toxicity issues, and translating systems models into integrative testing strategies. The symposium is based on three main organizing principles: (1) that novel in vitro platforms with human cells configured in nascent tissue architectures with native microphysiological environments yield mechanistic understanding of developmental and reproductive impacts of drug/chemical exposures; (2) that novel in silico platforms with high-throughput screening (HTS) data, biologically-inspired computational models of …
Computational Modeling and Simulation of Developmental ...
Standard practice for assessing developmental toxicity is the observation of apical endpoints (intrauterine death, fetal growth retardation, structural malformations) in pregnant rats/rabbits following exposure during organogenesis. EPA’s computational toxicology research program (ToxCast) generated vast in vitro cellular and molecular effects data on >1858 chemicals in >600 high-throughput screening (HTS) assays. The diversity of assays has been increased for developmental toxicity with several HTS platforms, including the devTOX-quickPredict assay from Stemina Biomarker Discovery utilizing the human embryonic stem cell line (H9). Translating these HTS data into higher-order predictions of developmental toxicity is a significant challenge. Here, we address the application of computational systems models that recapitulate the kinematics of dynamical cell signaling networks (e.g., SHH, FGF, BMP, retinoids) in a CompuCell3D.org modeling environment. Examples include angiogenesis (angiodysplasia) and dysmorphogenesis. Being numerically responsive to perturbation, these models are amenable to data integration for systems toxicology and Adverse Outcome Pathways (AOPs). The AOP simulation outputs predict potential phenotypes based on the ToxCast in vitro HTS data. A heuristic computational intelligence framework that recapitulates the kinematics of dynamical cell signaling networks in the embryo, together with the in vitro profiling data, produce quantitative predic…
Hygrothermal Numerical Simulation Tools Applied to Building Physics
Delgado, João M P Q; Ramos, Nuno M M; Freitas, Vasco Peixoto
2013-01-01
This book presents a critical review on the development and application of hygrothermal analysis methods to simulate the coupled transport processes of Heat, Air, and Moisture (HAM) transfer for one or multidimensional cases. During the past few decades there has been relevant development in this field of study and an increase in the professional use of tools that simulate some of the physical phenomena that are involved in Heat, Air and Moisture conditions in building components or elements. Although a significant number of hygrothermal models are referred to in the literature, the vast majority of them are not easily available to the public outside the institutions where they were developed, which restricts the analysis of this book to only 14 hygrothermal modelling tools. The special features of this book are (a) a state-of-the-art of numerical simulation tools applied to building physics, (b) the boundary conditions importance, (c) the material properties, namely, experimental methods for the measuremen...
Discrete simulation applied to the production process of electronic components
Directory of Open Access Journals (Sweden)
Willians dos Santos Lúcio
2017-07-01
Full Text Available The objective of this research is to demonstrate, through simulation techniques and analyses of the production systems of a company located in the city of Guarulhos that produces an electronic component made of plastic, acrylic and steel, the improvements that can be achieved with the use of specialist software to assist in troubleshooting and help the manager in decision making. For the study, concepts of simulation, the Monte Carlo method, queueing theory and the Arena software are used. By simulating processes and evaluating performance, the software produces reports that help the manager see potential bottlenecks and points of improvement in the process more clearly, thus effectively contributing to the company's competitiveness in the market. The study presented in this article makes it possible to verify the importance of the use of this tool.
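The queueing-theoretic core of such a discrete-event study can be sketched without any commercial package: a Monte Carlo simulation of a single M/M/1 workstation using Lindley's recursion for waiting times. The rates below are illustrative, not data from the Guarulhos plant.

```python
import random

def mm1_wait_times(arrival_rate, service_rate, n_jobs=50000, seed=1):
    """Monte Carlo estimate of mean queueing delay at an M/M/1 station.

    A minimal stand-in for what a package like Arena automates: sample
    random inter-arrival and service times, then track each job's wait
    to expose bottlenecks.
    """
    rng = random.Random(seed)
    wait, total = 0.0, 0.0
    for _ in range(n_jobs):
        interarrival = rng.expovariate(arrival_rate)
        service = rng.expovariate(service_rate)
        # Lindley's recursion: W_{n+1} = max(0, W_n + S_n - A_{n+1})
        wait = max(0.0, wait + service - interarrival)
        total += wait
    return total / n_jobs

# Utilization rho = 0.8 -> theory predicts mean wait rho/(mu - lambda) = 4.0
print(mm1_wait_times(arrival_rate=0.8, service_rate=1.0))
```

Raising the arrival rate toward the service rate makes the estimated wait blow up, which is exactly the bottleneck signature the reports from the software expose.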
Investigating European genetic history through computer simulations.
Currat, Mathias; Silva, Nuno M
2013-01-01
The genetic diversity of Europeans has been shaped by various evolutionary forces including their demographic history. Genetic data can thus be used to draw inferences on the population history of Europe using appropriate statistical methods such as computer simulation, which constitutes a powerful tool to study complex models. Here, we focus on spatially explicit simulation, a method which takes population movements over space and time into account. We present its main principles and then describe a series of studies using this approach that we consider particularly significant in the context of European prehistory. All simulation studies agree that ancient demographic events played a significant role in the establishment of the European gene pool; but while earlier works support a major genetic input from the Near East during the Neolithic transition, the most recent ones positively reevaluate the contribution of pre-Neolithic hunter-gatherers and suggest a possible impact of very ancient demographic events. This result of a substantial genetic continuity from pre-Neolithic times to the present challenges some recent studies analyzing ancient DNA. We discuss the possible reasons for this discrepancy and identify future lines of investigation in order to get a better understanding of European evolution.
Coupling Computer-Aided Process Simulation and ...
A methodology is described for developing a gate-to-gate life cycle inventory (LCI) of a chemical manufacturing process to support the application of life cycle assessment in the design and regulation of sustainable chemicals. The inventories were derived by first applying process design and simulation to develop a process flow diagram describing the energy and basic material flows of the system. Additional techniques developed by the U.S. Environmental Protection Agency for estimating uncontrolled emissions from chemical processing equipment were then applied to obtain a detailed emission profile for the process. Finally, land use for the process was estimated using a simple sizing model. The methodology was applied to a case study of acetic acid production based on the Cativa™ process. The results reveal improvements in the qualitative LCI for acetic acid production compared to commonly used databases and top-down methodologies. The modeling techniques improve the quantitative LCI results for inputs and uncontrolled emissions. With provisions for applying appropriate emission controls, the proposed method can provide an estimate of the LCI that can be used for subsequent life cycle assessments. As part of its mission, the Agency is tasked with overseeing the use of chemicals in commerce. This can include consideration of a chemical's potential impact on health and safety, resource conservation, clean air and climate change, clean water, and sustainable
Symplectic molecular dynamics simulations on specially designed parallel computers.
Borstnik, Urban; Janezic, Dusanka
2005-01-01
We have developed a computer program for molecular dynamics (MD) simulation that implements the Split Integration Symplectic Method (SISM) and is designed to run on specialized parallel computers. The MD integration is performed by the SISM, which analytically treats high-frequency vibrational motion and thus enables the use of longer simulation time steps. The low-frequency motion is treated numerically on specially designed parallel computers, which decreases the computational time of each simulation time step. The combination of these approaches means that fewer, longer time steps are required, enabling fast MD simulations. We study the computational performance of MD simulation of molecular systems on specialized computers and provide a comparison to standard personal computers. The combination of the SISM with two specialized parallel computers is an effective way to increase the speed of MD simulations up to 16-fold over a single PC processor.
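The idea of treating the stiff vibrational part analytically while kicking with the soft forces numerically can be shown on a one-dimensional toy problem. This is a generic split-integrator sketch in the spirit of the SISM, not the authors' implementation:

```python
import math

def sism_step(x, v, dt, omega_fast, slow_force, mass=1.0):
    """One Strang-split step: analytic fast oscillator + numeric slow kicks.

    The stiff harmonic part (frequency omega_fast) is propagated exactly
    by rotating in phase space, so the step size is limited only by the
    slow force, not by the fastest vibration.
    """
    v += 0.5 * dt * slow_force(x) / mass          # half kick (slow force)
    c, s = math.cos(omega_fast * dt), math.sin(omega_fast * dt)
    x, v = c * x + (s / omega_fast) * v, -omega_fast * s * x + c * v
    v += 0.5 * dt * slow_force(x) / mass          # second half kick
    return x, v

# A pure fast oscillator (no slow force) is integrated exactly, even with
# time steps that would make a plain Verlet integrator unstable.
x, v = 1.0, 0.0
for _ in range(1000):
    x, v = sism_step(x, v, dt=0.5, omega_fast=10.0, slow_force=lambda q: 0.0)
print(x * x + (v / 10.0) ** 2)  # energy-like invariant stays ~1.0
```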
Applying DNA computation to intractable problems in social network analysis.
Chen, Rick C S; Yang, Stephen J H
2010-09-01
From ancient times to the present day, social networks have played an important role in the formation of various organizations for a range of social behaviors. As such, social networks inherently describe the complicated relationships between elements around the world. Based on mathematical graph theory, social network analysis (SNA) has been developed in and applied to various fields such as Web 2.0 for Web applications and product developments in industries, etc. However, some definitions of SNA, such as finding a clique, N-clique, N-clan, N-club and K-plex, are NP-complete problems, which are not easily solved via traditional computer architecture. These challenges have restricted the uses of SNA. This paper provides DNA-computing-based approaches with inherently high information density and massive parallelism. Using these approaches, we aim to solve the three primary problems of social networks: N-clique, N-clan, and N-club. Their accuracy and feasible time complexities discussed in the paper will demonstrate that DNA computing can be used to facilitate the development of SNA. Copyright 2010 Elsevier Ireland Ltd. All rights reserved.
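The clique problems named above are NP-complete precisely because a conventional computer must examine subsets one at a time; the DNA approach screens all candidate subsets in parallel. A brute-force silicon version, shown only to make the exponential cost concrete (the graph is illustrative):

```python
from itertools import combinations

def max_clique(vertices, edges):
    """Exhaustive maximum-clique search: every subset is a candidate,
    checked independently. On a CPU this is O(2^n) subsets; the DNA
    computing approach encodes all subsets as strands and screens them
    simultaneously, which is the parallelism the paper exploits."""
    edge_set = {frozenset(e) for e in edges}
    for k in range(len(vertices), 0, -1):           # largest subsets first
        for subset in combinations(vertices, k):
            if all(frozenset(p) in edge_set
                   for p in combinations(subset, 2)):
                return list(subset)
    return []

g_edges = [(1, 2), (1, 3), (2, 3), (3, 4)]
print(max_clique([1, 2, 3, 4], g_edges))  # [1, 2, 3] is the largest clique
```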
Are We Sims? How Computer Simulations Represent and What This Means for the Simulation Argument
Beisbart, Claus
2017-01-01
N. Bostrom's simulation argument and two additional assumptions imply that we likely live in a computer simulation. The argument is based upon the following assumption about the workings of realistic brain simulations: The hardware of a computer on which a brain simulation is run bears a close analogy to the brain itself. To inquire whether this is so, I analyze how computer simulations trace processes in their targets. I describe simulations as fictional, mathematical, pictorial, and material...
Atomistic Method Applied to Computational Modeling of Surface Alloys
Bozzolo, Guillermo H.; Abel, Phillip B.
2000-01-01
the BFS (Bozzolo, Ferrante, and Smith) method for the calculation of the energetics, consists of a small number of simple PC-based computer codes that deal with the different aspects of surface alloy formation. Two analysis modes are available within this package. The first mode provides an atom-by-atom description of real and virtual stages during the process of surface alloying, based on the construction of catalogues of configurations where each configuration describes one possible atomic distribution. BFS analysis of this catalogue provides information on accessible states, possible ordering patterns, and details of island formation or film growth. More importantly, it provides insight into the evolution of the system. Software developed by the Computational Materials Group allows for the study of an arbitrary number of elements forming surface alloys, including an arbitrary number of surface atomic layers. The second mode involves large-scale temperature-dependent computer simulations that use the BFS method for the energetics and provide information on the dynamic processes during surface alloying. These simulations require the implementation of Monte-Carlo-based codes with high efficiency within current workstation environments. This methodology capitalizes on the advantages of the BFS method: there are no restrictions on the number or type of elements or on the type of crystallographic structure considered. This removes any restrictions in the definition of the configuration catalogues used in the analytical calculations, thus allowing for the study of arbitrary ordering patterns, ultimately leading to the actual surface alloy structure. Moreover, the Monte Carlo numerical technique used for the large-scale simulations allows for a detailed visualization of the simulated process, the main advantage of this type of analysis being the ability to understand the underlying features that drive these processes.
Because of the simplicity of the BFS method for e
Computer Simulation of Electron Positron Annihilation Processes
Energy Technology Data Exchange (ETDEWEB)
Chen, Y.
2003-10-02
With the launching of the Next Linear Collider coming closer and closer, there is a pressing need for physicists to develop a fully-integrated computer simulation of e{sup +}e{sup -} annihilation process at center-of-mass energy of 1TeV. A simulation program acts as the template for future experiments. Either new physics will be discovered, or current theoretical uncertainties will shrink due to more accurate higher-order radiative correction calculations. The existence of an efficient and accurate simulation will help us understand the new data and validate (or veto) some of the theoretical models developed to explain new physics. It should handle well interfaces between different sectors of physics, e.g., interactions happening at parton levels well above the QCD scale which are described by perturbative QCD, and interactions happening at much lower energy scale, which combine partons into hadrons. Also it should achieve competitive speed in real time when the complexity of the simulation increases. This thesis contributes some tools that will be useful for the development of such simulation programs. We begin our study by the development of a new Monte Carlo algorithm intended to perform efficiently in selecting weight-1 events when multiple parameter dimensions are strongly correlated. The algorithm first seeks to model the peaks of the distribution by features, adapting these features to the function using the EM algorithm. The representation of the distribution provided by these features is then improved using the VEGAS algorithm for the Monte Carlo integration. The two strategies mesh neatly into an effective multi-channel adaptive representation. We then present a new algorithm for the simulation of parton shower processes in high energy QCD. We want to find an algorithm which is free of negative weights, produces its output as a set of exclusive events, and whose total rate exactly matches the full Feynman amplitude calculation. Our strategy is to create
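The "weight-1 event" selection the thesis targets can be illustrated with plain accept-reject unweighting: weighted Monte Carlo draws from a proposal density are kept with probability proportional to their weight, so the accepted events all carry weight 1. The proposal here is a fixed illustrative density standing in for the adaptive EM/VEGAS machinery of the thesis.

```python
import random

def unweighted_events(target, proposal_sample, proposal_pdf, bound, n, seed=7):
    """Accept-reject unweighting: turn weighted draws into weight-1 events,
    as event generators must. `bound` must satisfy target <= bound * proposal
    everywhere, so each weight w is <= 1."""
    rng = random.Random(seed)
    events = []
    while len(events) < n:
        x = proposal_sample(rng)
        w = target(x) / (bound * proposal_pdf(x))
        if rng.random() < w:
            events.append(x)          # accepted events all carry weight 1
    return events

# Target: a peaked density ~ 1/(x + 0.1) on [0, 1]; proposal: uniform.
target = lambda x: 1.0 / (x + 0.1)
events = unweighted_events(target,
                           proposal_sample=lambda r: r.random(),
                           proposal_pdf=lambda x: 1.0,
                           bound=10.0, n=5000)
print(sum(1 for x in events if x < 0.1) / len(events))  # mass piles up near 0
```

The efficiency of this scheme collapses when the proposal misses the peaks, which is why the thesis adapts the proposal to the peak structure before unweighting.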
Computer Simulation of Intergranular Stress Corrosion Cracking via Hydrogen Embrittlement
Energy Technology Data Exchange (ETDEWEB)
Smith, R.W.
2000-04-01
Computer simulation has been applied to the investigation of intergranular stress corrosion cracking in Ni-based alloys based on a hydrogen embrittlement mechanism. The simulation employs computational modules that address (a) transport and reactions of aqueous species giving rise to hydrogen generation at the liquid-metal interface, (b) solid state transport of hydrogen via intergranular and transgranular diffusion pathways, and (c) fracture due to the embrittlement of metallic bonds by hydrogen. A key focus of the computational model development has been the role of materials microstructure (precipitate particles and grain boundaries) on hydrogen transport and embrittlement. Simulation results reveal that intergranular fracture is enhanced as grain boundaries are weakened and that microstructures with grains elongated perpendicular to the stress axis are more susceptible to cracking. The presence of intergranular precipitates may be expected to either enhance or impede cracking depending on the relative distribution of hydrogen between the grain boundaries and the precipitate-matrix interfaces. Calculations of hydrogen outgassing and in gassing demonstrate a strong effect of charging method on the fracture behavior.
Computer simulation of amorphous MIS solar cells
Energy Technology Data Exchange (ETDEWEB)
Shousha, A.H.M.; El-Kosheiry, M.A. [Cairo University (Egypt). Electronics and Communications Engineering Dept.
1997-10-01
A computer model to simulate amorphous MIS solar cells is developed. The model is based on the self-consistent solution of the electron and hole continuity equations, together with the Poisson equation under proper boundary conditions. The program developed is used to investigate the cell performance characteristics in terms of its physical and structural parameters. The current-voltage characteristics of the solar cell are obtained under AM1 solar illumination. The dependences of the short-circuit current, open-circuit voltage, fill factor and cell conversion efficiency on localized gap state density, carrier lifetime, cell thickness and surface recombination velocity are obtained and discussed. The results presented show how cell parameters can be varied to improve the cell performance characteristics. (Author)
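The figures of merit listed in the abstract (short-circuit current, open-circuit voltage, fill factor) can be illustrated with the ideal single-diode equation, I(V) = I_ph - I_0(exp(qV/nkT) - 1). The parameter values below are illustrative; the paper's model instead solves the coupled continuity and Poisson equations for the amorphous MIS structure.

```python
import math

def iv_curve_metrics(i_ph=30.0, i_0=1e-9, n=1.5, t=300.0, points=20000):
    """Open-circuit voltage and fill factor for an ideal single-diode cell.

    V_oc solves I(V_oc) = 0 analytically; the fill factor is the ratio of
    the maximum power point to V_oc * I_sc, found by scanning the curve.
    """
    kT_q = 8.617e-5 * t                       # thermal voltage in volts
    v_oc = n * kT_q * math.log(i_ph / i_0 + 1.0)
    p_max = 0.0
    for k in range(points):
        v = v_oc * k / points
        i = i_ph - i_0 * math.expm1(v / (n * kT_q))
        p_max = max(p_max, v * i)
    ff = p_max / (v_oc * i_ph)                # fill factor, dimensionless
    return v_oc, ff

v_oc, ff = iv_curve_metrics()
print(round(v_oc, 3), round(ff, 3))
```

Lowering the saturation current `i_0` (e.g. by reducing gap-state recombination) raises both V_oc and the fill factor, the same qualitative trend the paper explores.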
Towards A Novel Environment For Simulation Of Quantum Computing
Directory of Open Access Journals (Sweden)
Joanna Patrzyk
2015-01-01
Full Text Available In this paper we analyze existing quantum computer simulation techniques and their realizations to minimize the impact of the exponential complexity of simulated quantum computations. As a result of this investigation, we propose a quantum computer simulator with an integrated development environment - QuIDE - supporting development of algorithms for future quantum computers. The simulator simplifies building and testing quantum circuits and understanding quantum algorithms in an efficient way. The development environment provides flexibility of source code edition and ease of graphical building of circuit diagrams. We also describe and analyze the complexity of algorithms used for simulation and present performance results of the simulator as well as results of its deployment during university classes.
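The exponential complexity the paper sets out to tame comes from the state vector itself: n qubits need 2^n amplitudes, and every gate touches all of them. A minimal state-vector sketch (not QuIDE's actual implementation) makes this concrete:

```python
import numpy as np

def apply_gate(state, gate, target, n_qubits):
    """Apply a single-qubit gate to a 2^n state vector by reshaping it into
    an n-dimensional tensor, the core trick of state-vector simulators.
    Memory and time grow as 2^n: the exponential cost discussed above."""
    state = state.reshape([2] * n_qubits)
    state = np.tensordot(gate, state, axes=([1], [target]))
    state = np.moveaxis(state, 0, target)
    return state.reshape(-1)

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2.0)   # Hadamard gate
n = 3
state = np.zeros(2 ** n); state[0] = 1.0          # start in |000>
for q in range(n):
    state = apply_gate(state, H, q, n)            # H on every qubit
print(np.allclose(state, np.full(2 ** n, 1 / np.sqrt(2 ** n))))  # True
```

Doubling `n` squares the vector length, which is why simulator design (and tricks to minimize this blow-up) is the paper's central concern.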
Associative Memory Computing Power and Its Simulation
Volpi, G; The ATLAS collaboration
2014-01-01
The associative memory (AM) system is a computing device made of hundreds of AM ASIC chips designed to perform “pattern matching” at very high speed. Since each AM chip stores a database of 130000 pre-calculated patterns and large numbers of chips can be easily assembled together, it is possible to produce huge AM banks. Speed and size of the system are crucial for real-time High Energy Physics applications, such as the ATLAS Fast TracKer (FTK) Processor. Using 80 million channels of the ATLAS tracker, FTK finds tracks within 100 microseconds. The simulation of such a parallelized system is an extremely complex task if executed on commercial computers based on normal CPUs. The algorithm performance is limited due to the lack of parallelism, and in addition the memory requirement is very large. In fact the AM chip uses a content-addressable memory (CAM) architecture. Any data inquiry is broadcast to all memory elements simultaneously, thus data retrieval time is independent of the database size. The gr...
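A content-addressable memory inverts the usual lookup: instead of asking "what is stored at this address?", it asks "at which addresses is this value stored?". A toy software model (patterns are illustrative bit strings) shows both the semantics and why a CPU simulation is slow:

```python
def cam_lookup(bank, query):
    """Software model of a CAM: the query is 'broadcast' to every stored
    pattern and, in hardware, all comparisons happen simultaneously, so
    retrieval time is independent of the database size. In Python the
    loop is sequential, which is exactly the serialization bottleneck
    the text describes for simulating AM chips on normal CPUs."""
    return [addr for addr, pattern in enumerate(bank) if pattern == query]

bank = ["1010", "0111", "1010", "0001"]
print(cam_lookup(bank, "1010"))  # all matching addresses: [0, 2]
```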
Associative Memory computing power and its simulation
Ancu, L S; The ATLAS collaboration; Britzger, D; Giannetti, P; Howarth, J W; Luongo, C; Pandini, C; Schmitt, S; Volpi, G
2014-01-01
The associative memory (AM) system is a computing device made of hundreds of AM ASIC chips designed to perform “pattern matching” at very high speed. Since each AM chip stores a database of 130000 pre-calculated patterns and large numbers of chips can be easily assembled together, it is possible to produce huge AM banks. Speed and size of the system are crucial for real-time High Energy Physics applications, such as the ATLAS Fast TracKer (FTK) Processor. Using 80 million channels of the ATLAS tracker, FTK finds tracks within 100 microseconds. The simulation of such a parallelized system is an extremely complex task if executed on commercial computers based on normal CPUs. The algorithm performance is limited due to the lack of parallelism, and in addition the memory requirement is very large. In fact the AM chip uses a content-addressable memory (CAM) architecture. Any data inquiry is broadcast to all memory elements simultaneously, thus data retrieval time is independent of the database size. The gr...
Computer simulations of the mouse spermatogenic cycle
Directory of Open Access Journals (Sweden)
Debjit Ray
2014-12-01
Full Text Available The spermatogenic cycle describes the periodic development of germ cells in the testicular tissue. The temporal–spatial dynamics of the cycle highlight the unique, complex, and interdependent interaction between germ and somatic cells, and are the key to continual sperm production. Although understanding the spermatogenic cycle has important clinical relevance for male fertility and contraception, there are a number of experimental obstacles. For example, the lengthy process cannot be visualized through dynamic imaging, and the precise action of germ cells that leads to the emergence of testicular morphology remains uncharacterized. Here, we report an agent-based model that simulates the mouse spermatogenic cycle on a cross-section of the seminiferous tubule over a time scale of hours to years, while considering feedback regulation, mitotic and meiotic division, differentiation, apoptosis, and movement. The computer model is able to elaborate the germ cell dynamics in a time-lapse movie format, allowing us to trace individual cells as they change state and location. More importantly, the model provides mechanistic understanding of the fundamentals of male fertility, namely how testicular morphology and sperm production are achieved. By manipulating cellular behaviors either individually or collectively in silico, the model predicts causal events for the altered arrangement of germ cells upon genetic or environmental perturbations. This in silico platform can serve as an interactive tool to perform long-term simulation and to identify optimal approaches for infertility treatment and contraceptive development.
Computer-aided Instructional System for Transmission Line Simulation.
Reinhard, Erwin A.; Roth, Charles H., Jr.
A computer-aided instructional system has been developed which utilizes dynamic computer-controlled graphic displays and which requires student interaction with a computer simulation in an instructional mode. A numerical scheme has been developed for digital simulation of a uniform, distortionless transmission line with resistive terminations and…
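For a lossless line with resistive terminations, the classic digital scheme is the lattice (bounce) diagram: a launched step wave reflects back and forth with fixed reflection coefficients at each end. The sketch below is a generic textbook calculation, not the authors' instructional simulation scheme; impedance values are illustrative.

```python
def step_response(z0, r_src, r_load, v_src=1.0, bounces=40):
    """Steady-state load voltage of a step into a lossless line, summed
    over successive reflections (lattice diagram)."""
    gamma_s = (r_src - z0) / (r_src + z0)    # source reflection coefficient
    gamma_l = (r_load - z0) / (r_load + z0)  # load reflection coefficient
    v_launch = v_src * z0 / (r_src + z0)     # initial wave entering the line
    v_load, wave = 0.0, v_launch
    for _ in range(bounces):
        v_load += wave * (1.0 + gamma_l)     # wave arrives, partially reflects
        wave *= gamma_l * gamma_s            # one round trip back to the load
    return v_load

# Matched load: no reflections, load sees the divider voltage immediately.
print(step_response(z0=50.0, r_src=50.0, r_load=50.0))   # 0.5
# Mismatched: the bounce sum converges to the DC divider 75/(75+25) = 0.75
print(step_response(z0=50.0, r_src=25.0, r_load=75.0))
```

Plotting the partial sums against round-trip time reproduces the staircase waveform such an instructional display would animate.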
Using Computational Simulations to Confront Students' Mental Models
Rodrigues, R.; Carvalho, P. Simeão
2014-01-01
In this paper we show an example of how to use a computational simulation to obtain visual feedback for students' mental models, and compare their predictions with the simulated system's behaviour. Additionally, we use the computational simulation to incrementally modify the students' mental models in order to accommodate new data,…
Subglacial sediment mechanics investigated by computer simulation of granular material
DEFF Research Database (Denmark)
Damsgaard, Anders; Egholm, David Lundbek; Tulaczyk, Slawek
to the mechanical nonlinearity of the sediment, internal porosity changes during deformation, and associated structural and kinematic phase transitions. In this presentation, we introduce the Discrete Element Method (DEM) for particle-scale granular simulation. The DEM is fully coupled with fluid dynamics. … The numerical method is applied to better understand the mechanical properties of the subglacial sediment and its interaction with meltwater. The computational approach allows full experimental control and offers insights into the internal kinematics, stress distribution, and mechanical stability. During … by linear-viscous sediment movement. We demonstrate how channel flanks are stabilized by the sediment frictional strength. Additionally, sediment liquefaction proves to be a possible mechanism for causing large and episodic sediment transport by water flow. Though computationally intense, our coupled …
Directory of Open Access Journals (Sweden)
Francesca Spyrakis
2016-10-01
Thus, key computational medicinal chemistry methods like molecular dynamics can be used to decipher protein flexibility and to obtain stable models for docking and scoring in food-related studies, and virtual screening is increasingly being applied to identify molecules with potential to act as endocrine disruptors, food mycotoxins, and new nutraceuticals [3,4,5]. All of these methods and simulations are based on protein-ligand interaction phenomena, and represent the basis for any subsequent modification of the targeted receptor's or enzyme's physiological activity. We describe here the energetics of binding of biological complexes, providing a survey of the most common and successful algorithms used in evaluating these energetics, and we report case studies in which computational techniques have been applied to food science issues. In particular, we explore a handful of studies involving the estrogen receptors, in which we have a long-term interest.
Traffic Simulations on Parallel Computers Using Domain Decomposition Techniques
1995-01-01
Large scale simulations of Intelligent Transportation Systems (ITS) can only be achieved by using the computing resources offered by parallel computing architectures. Domain decomposition techniques are proposed which allow the performance of traffic...
Supporting hypothesis generation by learners exploring an interactive computer simulation
van Joolingen, Wouter; de Jong, Anthonius J.M.
1992-01-01
Computer simulations provide environments enabling exploratory learning. Research has shown that these types of learning environments are promising applications of computer assisted learning but also that they introduce complex learning settings, involving a large number of learning processes. This
Numerical simulation of NQR/NMR: Applications in quantum computing.
Possa, Denimar; Gaudio, Anderson C; Freitas, Jair C C
2011-04-01
A numerical simulation program able to simulate nuclear quadrupole resonance (NQR) as well as nuclear magnetic resonance (NMR) experiments is presented, written using the Mathematica package and aimed especially at applications in quantum computing. The program makes use of the interaction picture to compute the effect of the relevant nuclear spin interactions, without any assumption about the relative size of each interaction. This makes the program flexible and versatile, useful in a wide range of experimental situations, from NQR (at zero or under a small applied magnetic field) to high-field NMR experiments. Some conditions specifically required for quantum computing applications are implemented in the program, such as the possibility of using elliptically polarized radiofrequency fields and the inclusion of first- and second-order terms in the average Hamiltonian expansion. A number of examples dealing with simple NQR and quadrupole-perturbed NMR experiments are presented, along with proposals of experiments to create quantum pseudopure states and logic gates using NQR. The program and the various application examples are freely available through the link http://www.profanderson.net/files/nmr_nqr.php. Copyright © 2011 Elsevier Inc. All rights reserved.
Computer simulation of rapid crystal growth under microgravity
Hisada, Yasuhiro; Saito, Osami; Mitachi, Koshi; Nishinaga, Tatau
We are planning to grow a Ge single crystal under microgravity by the TR-IA rocket in 1992. The furnace temperature should be controlled so as to finish the crystal growth in a quite short time interval (about 6 min). This study deals with the computer simulation of rapid crystal growth in space to find the proper conditions for the experiment. The crystal growth process is influenced by various physical phenomena such as heat conduction, natural and Marangoni convection, phase change, and radiation from the furnace. In this study, a 2D simulation with axial symmetry is carried out, taking into account the radiation field with a specific temperature distribution of the furnace wall. The simulation program consists of four modules. The first module calculates the parabolic partial differential equation using the control volume method. The second evaluates the phase change implicitly by the enthalpy method. The third computes the heat flux from the surface by radiation. The last calculates, with the Monte Carlo method, the view factors necessary to obtain the heat flux.
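The first of the four modules described above rests on a control-volume discretization of the heat conduction equation. The following is a minimal 1D explicit sketch of that idea, not the authors' 2D axisymmetric code; the grid, material values, and boundary temperatures are illustrative assumptions.

```python
import numpy as np

def step_heat_1d(T, alpha, dx, dt):
    """One explicit control-volume update of the 1D heat equation.

    Boundary cells are held at fixed temperatures; stability requires
    dt <= dx**2 / (2 * alpha).
    """
    T_new = T.copy()
    # Flux balance over each interior control volume.
    T_new[1:-1] = T[1:-1] + alpha * dt / dx**2 * (T[2:] - 2.0 * T[1:-1] + T[:-2])
    return T_new

# Rod with a hot left end: heat diffuses toward the cold right end,
# approaching a linear steady-state profile.
T = np.zeros(11)
T[0] = 100.0
for _ in range(500):
    T = step_heat_1d(T, alpha=1e-4, dx=0.01, dt=0.4)
```

The same flux-balance structure carries over to 2D axisymmetric grids, with the enthalpy method handling the latent heat of the phase change at each control volume.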
Artificial Neural Network Metamodels of Stochastic Computer Simulations
1994-08-10
Haddock, J. and O’Keefe, R., "Using Artificial Intelligence to Facilitate Manufacturing Systems Simulation," Computers & Industrial Engineering, Vol...
"Feedforward Neural Networks," Computers & Industrial Engineering, Vol. 21, No. 1-4, (1991), pp. 247-251.
Proceedings of the 1992 Summer Computer...
"Using Simulation Experiments," Computers & Industrial Engineering, Vol. 22, No. 2 (1992), pp. 195-209.
Kuei, C. and Madu, C., "Polynomial
QDENSITY—A Mathematica quantum computer simulation
Juliá-Díaz, Bruno; Burdis, Joseph M.; Tabakin, Frank
2009-03-01
This Mathematica 6.0 package is a simulation of a Quantum Computer. The program provides a modular, instructive approach for generating the basic elements that make up a quantum circuit. The main emphasis is on using the density matrix, although an approach using state vectors is also implemented in the package. The package commands are defined in Qdensity.m which contains the tools needed in quantum circuits, e.g., multiqubit kets, projectors, gates, etc.
New version program summary
Program title: QDENSITY 2.0
Catalogue identifier: ADXH_v2_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXH_v2_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 26 055
No. of bytes in distributed program, including test data, etc.: 227 540
Distribution format: tar.gz
Programming language: Mathematica 6.0
Operating system: Any which supports Mathematica; tested under Microsoft Windows XP, Macintosh OS X, and Linux FC4
Catalogue identifier of previous version: ADXH_v1_0
Journal reference of previous version: Comput. Phys. Comm. 174 (2006) 914
Classification: 4.15
Does the new version supersede the previous version?: Offers an alternative, more up to date, implementation
Nature of problem: Analysis and design of quantum circuits, quantum algorithms and quantum clusters.
Solution method: A Mathematica package is provided which contains commands to create and analyze quantum circuits. Several Mathematica notebooks containing relevant examples (Teleportation, Shor's Algorithm and Grover's search) are explained in detail. A tutorial, Tutorial.nb, is also enclosed.
Reasons for new version: The package has been updated to make it fully compatible with Mathematica 6.0
Summary of revisions: The package has been updated to make it fully compatible with Mathematica 6.0
Running time: Most examples
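The density-matrix approach that QDENSITY emphasizes can be illustrated outside Mathematica. The sketch below is not part of the QDENSITY package and uses generic gate names; it applies a Hadamard gate to the first qubit of a two-qubit density matrix and reads off a measurement probability.

```python
import numpy as np

H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)  # Hadamard gate
I2 = np.eye(2)

def apply_gate(rho, U):
    """Evolve a density matrix under a unitary: rho -> U rho U^dagger."""
    return U @ rho @ U.conj().T

# Start in |00><00| on two qubits, then apply H to the first qubit only.
rho = np.zeros((4, 4))
rho[0, 0] = 1.0
rho = apply_gate(rho, np.kron(H, I2))

# Probability that the first qubit is measured as |0>: Tr((P0 x I) rho).
P0 = np.array([[1.0, 0.0], [0.0, 0.0]])
p0 = np.trace(np.kron(P0, I2) @ rho).real
```

The same pattern (tensor products of single-qubit operators, projectors, and traces) is what the package's multiqubit kets, projectors, and gates automate.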
Computer simulation boosts automation in the stockyard
Energy Technology Data Exchange (ETDEWEB)
NONE
2001-04-01
Today's desktop computers and advanced software keep pace with handling equipment, reaching new heights of sophistication with graphic simulation able to show precisely what is happening and what could happen in the coal terminal's stockyard. The article describes an innovative coal terminal nearing completion on the Pacific coast at Lazaro Cardenas in Mexico, called the Petracalco terminal. Here coal is unloaded, stored and fed to the nearby power plant of Pdte Plutarco Elias Calles. The R&D department of the Italian company Techint, Italimpianti has developed MHATIS, a sophisticated software system for marine terminal management here, allowing analysis of performance with the use of graphical animation. Strategies can be tested before being put into practice and likely power station demand can be predicted. The design and operation of the MHATIS system are explained. Other integrated coal handling plants described in the article are the one developed by the then PWH (renamed Krupp Foerdertechnik) of Germany for the Israel Electric Corporation and the installation by the same company of a further bucketwheel for a redesigned coal stockyard at the Port of Hamburg operated by Hansaport. 1 fig., 4 photos.
Computational simulation of liquid rocket injector anomalies
Przekwas, A. J.; Singhal, A. K.; Tam, L. T.; Davidian, K.
1986-01-01
A computer model has been developed to analyze the three-dimensional two-phase reactive flows in liquid-fueled rocket combustors. The model is designed to study the influence of liquid propellant injection nonuniformities on the flow pattern, combustion, and heat transfer within the combustor. The Eulerian-Lagrangian approach for simulating polydisperse spray flow, evaporation, and combustion has been used. Full coupling between the phases is accounted for. A nonorthogonal, body-fitted coordinate system along with a conservative control volume formulation is employed. The physical models built into the model include a kappa-epsilon turbulence model, a two-step chemical reaction, and the six-flux radiation model. Semiempirical models are used to describe all interphase coupling terms as well as chemical reaction rates. The purpose of this study was to demonstrate an analytical capability to predict the effects of reactant injection nonuniformities (injection anomalies) on combustion and heat transfer within the rocket combustion chamber. The results show promising application of the model to comprehensive modeling of liquid propellant rocket engines.
Factors promoting engaged exploration with computer simulations
Directory of Open Access Journals (Sweden)
Noah S. Podolefsky
2010-10-01
This paper extends prior research on student use of computer simulations (sims) to engage with and explore science topics, in this case wave interference. We describe engaged exploration: a process that involves students actively interacting with educational materials, making sense of them, and exploring primarily via their own questioning. We analyze interviews with college students using PhET sims in order to demonstrate engaged exploration and to identify factors that can promote this type of inquiry. With minimal explicit guidance, students explore the topic of wave interference in ways that bear similarity to how scientists explore phenomena. PhET sims are flexible tools that allow students to choose their own learning path, but also provide constraints such that students’ choices are generally productive. This type of inquiry is supported by sim features such as concrete connections to the real world, representations that are not available in the real world, analogies to help students make meaning of and connect across multiple representations and phenomena, and a high level of interactivity with real-time, dynamic feedback from the sim. These features of PhET sims enable students to pose questions and answer them in ways that may not be supported by more traditional educational materials.
Nature preservation acceptance model applied to tanker oil spill simulations
DEFF Research Database (Denmark)
Friis-Hansen, Peter; Ditlevsen, Ove Dalager
2003-01-01
is exemplified by a study of oil spills due to simulated tanker collisions in the Danish straits. It is found that the distribution of the oil spill volume per spill is well represented by an exponential distribution both in Oeresund and in Great Belt. When applied in the Poisson model, a risk profile reasonably...... close to the standard lognormal profile is obtained. Moreover, based on data pairs (volume, cost) for world wide oil spills it is inferred that the conditional distribution of the costs given the spill volume is well modeled by a lognormal distribution. By unconditioning by the exponential distribution...... of the single oil spill, a risk profile for the costs is obtained that is indistinguishable from the standard lognormal risk profile.Finally the question of formulating a public risk acceptance criterion is addressed following Ditlevsen, and it is argued that a Nature Preservation Willingness Index can...
Simulation Assisted Risk Assessment Applied to Launch Vehicle Conceptual Design
Mathias, Donovan L.; Go, Susie; Gee, Ken; Lawrence, Scott
2008-01-01
A simulation-based risk assessment approach is presented and is applied to the analysis of abort during the ascent phase of a space exploration mission. The approach utilizes groupings of launch vehicle failures, referred to as failure bins, which are mapped to corresponding failure environments. Physical models are used to characterize the failure environments in terms of the risk due to blast overpressure, resulting debris field, and the thermal radiation due to a fireball. The resulting risk to the crew is dynamically modeled by combining the likelihood of each failure, the severity of the failure environments as a function of initiator and time of the failure, the robustness of the crew module, and the warning time available due to early detection. The approach is shown to support the launch vehicle design process by characterizing the risk drivers and identifying regions where failure detection would significantly reduce the risk to the crew.
Explore Effective Use of Computer Simulations for Physics Education
Lee, Yu-Fen; Guo, Yuying
2008-01-01
The dual purpose of this article is to provide a synthesis of the findings related to the use of computer simulations in physics education and to present implications for teachers and researchers in science education. We try to establish a conceptual framework for the utilization of computer simulations as a tool for learning and instruction in…
Overview of Computer Simulation Modeling Approaches and Methods
Robert E. Manning; Robert M. Itami; David N. Cole; Randy Gimblett
2005-01-01
The field of simulation modeling has grown greatly with recent advances in computer hardware and software. Much of this work has involved large scientific and industrial applications for which substantial financial resources are available. However, advances in object-oriented programming and simulation methodology, concurrent with dramatic increases in computer...
How Effective Is Instructional Support for Learning with Computer Simulations?
Eckhardt, Marc; Urhahne, Detlef; Conrad, Olaf; Harms, Ute
2013-01-01
The study examined the effects of two different instructional interventions as support for scientific discovery learning using computer simulations. In two well-known categories of difficulty, data interpretation and self-regulation, instructional interventions for learning with computer simulations on the topic "ecosystem water" were developed…
Computers for real time flight simulation: A market survey
Bekey, G. A.; Karplus, W. J.
1977-01-01
An extensive computer market survey was made to determine those available systems suitable for current and future flight simulation studies at Ames Research Center. The primary requirement is for the computation of relatively high frequency content (5 Hz) math models representing powered lift flight vehicles. The Rotor Systems Research Aircraft (RSRA) was used as a benchmark vehicle for computation comparison studies. The general nature of helicopter simulations and a description of the benchmark model are presented, and some of the sources of simulation difficulties are examined. A description of various applicable computer architectures is presented, along with detailed discussions of leading candidate systems and comparisons between them.
Challenges & Roadmap for Beyond CMOS Computing Simulation.
Energy Technology Data Exchange (ETDEWEB)
Rodrigues, Arun F. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Frank, Michael P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)
2017-12-01
Simulating HPC systems is a difficult task and the emergence of “Beyond CMOS” architectures and execution models will increase that difficulty. This document presents a “tutorial” on some of the simulation challenges faced by conventional and non-conventional architectures (Section 1) and goals and requirements for simulating Beyond CMOS systems (Section 2). These provide background for proposed short- and long-term roadmaps for simulation efforts at Sandia (Sections 3 and 4). Additionally, a brief explanation of a proof-of-concept integration of a Beyond CMOS architectural simulator is presented (Section 2.3).
Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation
Stocker, John C.; Golomb, Andrew M.
2011-01-01
Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two part model framework characterizes both the demand using a probability distribution for each type of service request as well as enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
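A single-server queue is the smallest building block of the kind of discrete-event model described above. This sketch assumes Poisson arrivals and exponential service (an M/M/1 queue), which is an illustrative stand-in for the authors' workload distributions, and advances time from event to event rather than in fixed steps.

```python
import random

def mm1_mean_sojourn(arrival_rate, service_rate, n_requests=20000, seed=5):
    """Discrete-event sketch of a single-server queue: requests arrive,
    wait if the server is busy, get served, and leave.  Returns the
    mean time a request spends in the system."""
    random.seed(seed)
    t = 0.0                 # time of the current arrival event
    server_free_at = 0.0    # when the server next becomes idle
    total = 0.0
    for _ in range(n_requests):
        t += random.expovariate(arrival_rate)   # next arrival event
        start = max(t, server_free_at)          # wait while server is busy
        server_free_at = start + random.expovariate(service_rate)
        total += server_free_at - t             # sojourn time
    return total / n_requests

# M/M/1 theory predicts a mean sojourn of 1 / (mu - lambda) = 2.0 here.
w = mm1_mean_sojourn(arrival_rate=0.5, service_rate=1.0)
```

Chaining such servers and drawing request types from the demand distributions gives the two-part structure (demand model plus resource constraints) that the framework describes.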
1988-01-01
This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period April 1, 1988 through September 30, 1988.
A scalable parallel black oil simulator on distributed memory parallel computers
Wang, Kun; Liu, Hui; Chen, Zhangxin
2015-11-01
This paper presents our work on developing a parallel black oil simulator for distributed memory computers based on our in-house parallel platform. The parallel simulator is designed to overcome the performance issues of common simulators that are implemented for personal computers and workstations. The finite difference method is applied to discretize the black oil model. In addition, some advanced techniques are employed to strengthen the robustness and parallel scalability of the simulator, including an inexact Newton method, matrix decoupling methods, and algebraic multigrid methods. A new multi-stage preconditioner is proposed to accelerate the solution of linear systems from the Newton methods. Numerical experiments show that our simulator is scalable and efficient, and is capable of simulating extremely large-scale black oil problems with tens of millions of grid blocks using thousands of MPI processes on parallel computers.
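The inexact Newton idea mentioned above, solving the Jacobian system only approximately at each outer iteration, can be sketched on a toy two-unknown residual. The residual and the Jacobi inner solver here are illustrative stand-ins for the simulator's discretized black oil equations and preconditioned multigrid/Krylov solves.

```python
import numpy as np

def inexact_newton(F, J, x, tol=1e-8, outer=50, inner=20):
    """Inexact Newton: each outer step solves J(x) s = -F(x) only
    approximately, here with a few Jacobi sweeps."""
    for _ in range(outer):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            break
        A, b = J(x), -Fx
        D = np.diag(A)                      # diagonal entries (vector)
        R = A - np.diag(D)                  # off-diagonal part
        s = np.zeros_like(b)
        for _ in range(inner):              # approximate inner solve
            s = (b - R @ s) / D
        x = x + s
    return x

# Toy coupled residual with a root at (1, 2).
F = lambda x: np.array([x[0]**2 + x[1] - 3.0, x[0] + x[1]**2 - 5.0])
J = lambda x: np.array([[2.0 * x[0], 1.0], [1.0, 2.0 * x[1]]])
root = inexact_newton(F, J, np.array([3.0, 3.0]))
```

The design point is that the inner solve need only be accurate enough for the outer iteration to converge, which is what makes cheap preconditioned iterations attractive at reservoir scale.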
Alternative energy technologies an introduction with computer simulations
Buxton, Gavin
2014-01-01
Contents: Introduction to Alternative Energy Sources; Global Warming; Pollution; Solar Cells; Wind Power; Biofuels; Hydrogen Production and Fuel Cells; Introduction to Computer Modeling; Brief History of Computer Simulations; Motivation and Applications of Computer Models; Using Spreadsheets for Simulations; Typing Equations into Spreadsheets; Functions Available in Spreadsheets; Random Numbers; Plotting Data; Macros and Scripts; Interpolation and Extrapolation; Numerical Integration and Diffe
High performance computing system for flight simulation at NASA Langley
Cleveland, Jeff I., II; Sudik, Steven J.; Grove, Randall D.
1991-01-01
The computer architecture and components used in the NASA Langley Advanced Real-Time Simulation System (ARTSS) are briefly described and illustrated with diagrams and graphs. Particular attention is given to the advanced Convex C220 processing units, the UNIX-based operating system, the software interface to the fiber-optic-linked Computer Automated Measurement and Control system, configuration-management and real-time supervisor software, ARTSS hardware modifications, and the current implementation status. Simulation applications considered include the Transport Systems Research Vehicle, the Differential Maneuvering Simulator, the General Aviation Simulator, and the Visual Motion Simulator.
Quantum computer gate simulations | Dada | Journal of the Nigerian ...
African Journals Online (AJOL)
A new interactive simulator for Quantum Computation has been developed for simulation of the universal set of quantum gates and for construction of new gates of up to 3 qubits. The simulator also automatically generates an equivalent quantum circuit for any arbitrary unitary transformation on a qubit. Available quantum ...
A note on simulated annealing to computer laboratory scheduling ...
African Journals Online (AJOL)
The concepts, principles and implementation of simulated annealing as a modern heuristic technique are presented. The simulated annealing algorithm is used in solving the real-life problem of computer laboratory scheduling in order to maximize the use of scarce and insufficient resources. KEY WORDS: Simulated Annealing ...
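For readers unfamiliar with the technique, a generic simulated annealing loop looks like the following. The toy objective is illustrative only; the paper's actual scheduling objective and neighbourhood moves are not specified in the abstract.

```python
import math
import random

def anneal(cost, neighbour, x0, t0=10.0, cooling=0.995, steps=5000, seed=0):
    """Generic simulated annealing: always accept improving moves, accept
    worsening moves with probability exp(-delta/T), cool T geometrically."""
    random.seed(seed)
    x, cx, t = x0, cost(x0), t0
    best, cbest = x, cx
    for _ in range(steps):
        y = neighbour(x)
        cy = cost(y)
        if cy < cx or random.random() < math.exp((cx - cy) / t):
            x, cx = y, cy
            if cx < cbest:
                best, cbest = x, cx
        t *= cooling                        # cooling schedule
    return best, cbest

# Toy objective standing in for a timetable cost: minimise (x - 3)^2.
best, cbest = anneal(cost=lambda x: (x - 3) ** 2,
                     neighbour=lambda x: x + random.choice([-1, 1]),
                     x0=20)
```

In a scheduling setting, x would be a timetable, neighbour would swap two sessions, and cost would count clashes and unused capacity; the acceptance rule and cooling schedule stay the same.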
CPU SIM: A Computer Simulator for Use in an Introductory Computer Organization-Architecture Class.
Skrein, Dale
1994-01-01
CPU SIM, an interactive low-level computer simulation package that runs on the Macintosh computer, is described. The program is designed for instructional use in the first or second year of undergraduate computer science, to teach various features of typical computer organization through hands-on exercises. (MSE)
Simulation methods to estimate design power: an overview for applied research
2011-01-01
Background: Estimating the required sample size and statistical power for a study is an integral part of study design. For standard designs, power equations provide an efficient solution to the problem, but they are unavailable for many complex study designs that arise in practice. For such complex study designs, computer simulation is a useful alternative for estimating study power. Although this approach is well known among statisticians, in our experience many epidemiologists and social scientists are unfamiliar with the technique. This article aims to address this knowledge gap. Methods: We review an approach to estimate study power for individual- or cluster-randomized designs using computer simulation. This flexible approach arises naturally from the model used to derive conventional power equations, but extends those methods to accommodate arbitrarily complex designs. The method is universally applicable to a broad range of designs and outcomes, and we present the material in a way that is approachable for quantitative, applied researchers. We illustrate the method using two examples (one simple, one complex) based on sanitation and nutritional interventions to improve child growth. Results: We first show how simulation reproduces conventional power estimates for simple randomized designs over a broad range of sample scenarios to familiarize the reader with the approach. We then demonstrate how to extend the simulation approach to more complex designs. Finally, we discuss extensions to the examples in the article, and provide computer code to efficiently run the example simulations in both R and Stata. Conclusions: Simulation methods offer a flexible option to estimate statistical power for standard and non-traditional study designs and parameters of interest. The approach we have described is universally applicable for evaluating study designs used in epidemiologic and social science research. PMID:21689447
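The simulate-then-test recipe the article reviews can be reproduced in a few lines. The sketch below uses a two-arm z-test with known variance as an assumed example; it does not reproduce the article's own R and Stata code or its cluster-randomized extension.

```python
import math
import random

def simulated_power(n_per_arm, effect, sd, n_sims=2000, z_crit=1.96, seed=1):
    """Estimate power by simulation: generate trial data under the assumed
    effect size, apply a two-sided z-test to each replicate, and report
    the fraction of replicates that reject the null."""
    random.seed(seed)
    se = sd * math.sqrt(2.0 / n_per_arm)    # SE of the difference in means
    rejections = 0
    for _ in range(n_sims):
        mean_t = sum(random.gauss(effect, sd) for _ in range(n_per_arm)) / n_per_arm
        mean_c = sum(random.gauss(0.0, sd) for _ in range(n_per_arm)) / n_per_arm
        if abs((mean_t - mean_c) / se) > z_crit:
            rejections += 1
    return rejections / n_sims

# A 0.5-SD effect with 64 subjects per arm has roughly 0.80 analytical power.
power = simulated_power(n_per_arm=64, effect=0.5, sd=1.0)
```

Replacing the data-generating step with a hierarchical (cluster-level plus individual-level) draw, and the z-test with the planned analysis, extends the same loop to designs that have no closed-form power equation.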
Computational Electromagnetics (CEM) Laboratory: Simulation Planning Guide
Khayat, Michael A.
2011-01-01
The simulation process, milestones and inputs are unknowns to first-time users of the CEM Laboratory. The Simulation Planning Guide aids in establishing expectations for both NASA and non-NASA facility customers. The potential audience for this guide includes both internal and commercial spaceflight hardware/software developers. It is intended to assist their engineering personnel in simulation planning and execution. Material covered includes a roadmap of the simulation process, roles and responsibilities of facility and user, major milestones, facility capabilities, and inputs required by the facility. Samples of deliverables, facility interfaces, and inputs necessary to define scope, cost, and schedule are included as an appendix to the guide.
Computational sieving applied to some classical number-theoretic problems
H.J.J. te Riele (Herman)
1998-01-01
textabstractMany problems in computational number theory require the application of some sieve. Efficient implementation of these sieves on modern computers has extended our knowledge of these problems considerably. This is illustrated by three classical problems: the Goldbach conjecture, factoring
Applying Minimalist Design Principles to the Problem of Computer Anxiety.
Reznich, Christopher B.
1996-01-01
Minimalist design principles were used to test whether instructional intervention could decrease computer anxiety of subjects learning basic word-processing skills. Subjects were pre- and posttested on anxiety during each session. Findings indicated that the method as well as increased computer use decreased anxiety. (Author/AEF)
Applying natural evolution for solving computational problems - Lecture 2
CERN. Geneva
2017-01-01
Darwin’s natural evolution theory has inspired computer scientists in solving computational problems. In a similar way to how humans and animals have evolved over millions of years, computational problems can be solved by evolving a population of solutions through generations until a good solution is found. In the first lecture, the fundamentals of evolutionary computing (EC) will be described, covering the different phases that the evolutionary process implies. ECJ, a framework for research in this field, will also be explained. In the second lecture, genetic programming (GP) will be covered. GP is a sub-field of EC where solutions are actual computer programs represented by trees. Bloat control and distributed evaluation will be introduced.
Applying natural evolution for solving computational problems - Lecture 1
CERN. Geneva
2017-01-01
Darwin’s natural evolution theory has inspired computer scientists in solving computational problems. In a similar way to how humans and animals have evolved over millions of years, computational problems can be solved by evolving a population of solutions through generations until a good solution is found. In the first lecture, the fundamentals of evolutionary computing (EC) will be described, covering the different phases that the evolutionary process implies. ECJ, a framework for research in this field, will also be explained. In the second lecture, genetic programming (GP) will be covered. GP is a sub-field of EC where solutions are actual computer programs represented by trees. Bloat control and distributed evaluation will be introduced.
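A bare-bones evolutionary algorithm of the kind introduced in the first lecture can be sketched as follows. The bit-string genome and the classic OneMax objective are assumed examples (ECJ itself is a Java framework and is not reproduced here).

```python
import random

def evolve(fitness, genome_len=20, pop_size=40, generations=60, seed=2):
    """Minimal genetic algorithm on bit strings: tournament selection,
    one-point crossover, and occasional single-bit mutation."""
    random.seed(seed)
    pop = [[random.randint(0, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        def tournament():
            a, b = random.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = tournament(), tournament()
            cut = random.randrange(1, genome_len)
            child = p1[:cut] + p2[cut:]         # one-point crossover
            if random.random() < 0.1:           # bit-flip mutation
                i = random.randrange(genome_len)
                child[i] = 1 - child[i]
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

# OneMax: fitness is the number of 1-bits; the optimum is the all-ones string.
best = evolve(fitness=sum)
```

Genetic programming, covered in the second lecture, replaces the fixed-length bit string with a program tree and crossover with subtree exchange, which is what makes bloat control necessary.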
Agent-based computer simulations of language choice dynamics.
Hadzibeganovic, Tarik; Stauffer, Dietrich; Schulze, Christian
2009-06-01
We use agent-based Monte Carlo simulations to address the problem of language choice dynamics in a tripartite community that is linguistically homogeneous but politically divided. We observe the process of nonlocal pattern formation that causes populations to self-organize into stable antagonistic groups as a result of the local dynamics of attraction and influence between individual computational agents. Our findings uncover some of the unique properties of opinion formation in social groups when the process is affected by asymmetric noise distribution, unstable intergroup boundaries, and different migratory behaviors. Although we focus on one particular study, the proposed stochastic dynamic models can be easily generalized and applied to investigate the evolution of other complex and nonlinear features of human collective behavior.
Simulating Smoke Filling in Big Halls by Computational Fluid Dynamics
Directory of Open Access Journals (Sweden)
W. K. Chow
2011-01-01
Many tall halls of large volume have been built, and more are to be built, in construction projects in the Far East, particularly Mainland China, Hong Kong, and Taiwan. Smoke is identified as the key hazard to handle. Consequently, smoke exhaust systems are specified in the fire codes in those areas. An update on applying Computational Fluid Dynamics (CFD) in smoke exhaust design will be presented in this paper. Key points to note in CFD simulations of smoke filling due to a fire in a big hall will be discussed. Mathematical aspects concerning the discretization of the partial differential equations and the algorithms for solving the velocity-pressure linked equations are briefly outlined. Results predicted by CFD with different free boundary conditions are compared with those from room fire tests. Standards on grid size, relaxation factors, convergence criteria, and false diffusion should be set up for numerical experiments with CFD.
Symbolic Computations in Simulations of Hydromagnetic Dynamo
Directory of Open Access Journals (Sweden)
Vodinchar Gleb
2017-01-01
The compilation of spectral models of geophysical fluid dynamics and hydromagnetic dynamo involves the calculation of a large number of volume integrals of complex combinations of basis fields. In this paper we describe the automation of this computation with the help of symbolic computation systems.
Fluid dynamics theory, computation, and numerical simulation
Pozrikidis, C
2017-01-01
This book provides an accessible introduction to the basic theory of fluid mechanics and computational fluid dynamics (CFD) from a modern perspective that unifies theory and numerical computation. Methods of scientific computing are introduced alongside with theoretical analysis and MATLAB® codes are presented and discussed for a broad range of topics: from interfacial shapes in hydrostatics, to vortex dynamics, to viscous flow, to turbulent flow, to panel methods for flow past airfoils. The third edition includes new topics, additional examples, solved and unsolved problems, and revised images. It adds more computational algorithms and MATLAB programs. It also incorporates discussion of the latest version of the fluid dynamics software library FDLIB, which is freely available online. FDLIB offers an extensive range of computer codes that demonstrate the implementation of elementary and advanced algorithms and provide an invaluable resource for research, teaching, classroom instruction, and self-study. This ...
Computer simulations of equilibrium magnetization and microstructure in magnetic fluids
Rosa, A. P.; Abade, G. C.; Cunha, F. R.
2017-09-01
In this work, Monte Carlo and Brownian Dynamics simulations are developed to compute the equilibrium magnetization of a magnetic fluid under the action of a homogeneous applied magnetic field. The particles are free of inertia and modeled as hard spheres of equal diameter. Two different periodic boundary conditions are implemented: the minimum image method and the Ewald summation technique, replicating a finite number of particles throughout the suspension volume. A comparison of the equilibrium magnetization resulting from the minimum image approach and Ewald sums is performed using Monte Carlo simulations. The Monte Carlo simulations with minimum image and lattice sums are used to investigate suspension microstructure by computing the important radial pair-distribution function g(r), which measures the probability density of finding a second particle at a distance r from a reference particle. This function provides relevant information on structure formation and its anisotropy through the suspension. The numerical results for g(r) are compared with theoretical predictions based on a quite different approach in the absence of the field and dipole-dipole interactions. Very good quantitative agreement is found for a particle volume fraction of 0.15, providing a validation of the present simulations. In general, the investigated suspensions are dominated by structures like dimer and trimer chains, with trimers having a formation probability an order of magnitude lower than dimers. Using Monte Carlo with lattice sums, the density distribution function g2(r) is also examined. Whenever this function is different from zero, it indicates structure anisotropy in the suspension. The dependence of the equilibrium magnetization on the applied field, the magnetic particle volume fraction, and the magnitude of the dipole-dipole magnetic interactions for both boundary conditions is explored in this work. Results show that at dilute regimes and with moderate dipole
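With dipole-dipole interactions switched off, the equilibrium magnetization of such a suspension reduces to the Langevin function, which a single-dipole Metropolis sampler reproduces. The sketch below is a hedged illustration of the sampling idea only, not the authors' interacting, Ewald-summed code; alpha is the field energy in units of kT.

```python
import math
import random

def langevin(alpha):
    """Equilibrium magnetization of non-interacting dipoles in a field."""
    return 1.0 / math.tanh(alpha) - 1.0 / alpha

def mc_magnetization(alpha, n_steps=200000, seed=3):
    """Metropolis sampling of one dipole's orientation.  The energy of a
    dipole at angle theta to the field is -alpha * cos(theta) (in kT);
    proposals are uniform in cos(theta), the correct solid-angle measure."""
    random.seed(seed)
    cos_t = 1.0
    total, count = 0.0, 0
    for step in range(n_steps):
        trial = random.uniform(-1.0, 1.0)
        d_energy = -alpha * (trial - cos_t)
        if d_energy <= 0.0 or random.random() < math.exp(-d_energy):
            cos_t = trial                   # Metropolis acceptance
        if step >= n_steps // 4:            # discard burn-in
            total += cos_t
            count += 1
    return total / count

m = mc_magnetization(alpha=2.0)
```

Adding pairwise dipole-dipole energies, and evaluating them under minimum image or Ewald sums, turns this single-particle sampler into the full simulation whose deviations from the Langevin curve the paper quantifies.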
Understanding Emergency Care Delivery Through Computer Simulation Modeling.
Laker, Lauren F; Torabi, Elham; France, Daniel J; Froehle, Craig M; Goldlust, Eric J; Hoot, Nathan R; Kasaie, Parastu; Lyons, Michael S; Barg-Walkow, Laura H; Ward, Michael J; Wears, Robert L
2017-08-10
In 2017, Academic Emergency Medicine convened a consensus conference entitled "Catalyzing System Change through Health Care Simulation: Systems, Competency, and Outcomes." This article, a product of the breakout session on "understanding complex interactions through systems modeling," explores the role that computer simulation modeling can and should play in research and development of emergency care delivery systems. It discusses areas central to the use of computer simulation modeling in emergency care research. The four central approaches to computer simulation modeling are described (Monte Carlo simulation, system dynamics modeling, discrete-event simulation, and agent-based simulation), along with the problems amenable to each and examples relevant to emergency care. Also provided are an introduction to available software modeling platforms, guidance on exploring their use for research, and a research agenda for computer simulation modeling. Through this article, our goal is to enhance adoption of computer simulation, a set of methods that hold great promise in addressing emergency care organization and design challenges. © 2017 by the Society for Academic Emergency Medicine.
High-resolution computer simulations of EKC.
Breadmore, Michael C; Quirino, Joselito P; Thormann, Wolfgang
2009-02-01
The electrophoresis simulation software, GENTRANS, has been modified to include the interaction of analytes with an electrolyte additive to allow the simulation of liquid-phase EKC separations. The modifications account for interaction of weak and strong acid and base analytes with a single weak or strong acid or base background electrolyte additive and can be used to simulate a range of EKC separations with both charged and neutral additives. Simulations of separations of alkylphenyl ketones under real experimental conditions were performed using mobility and interaction constant data obtained from the literature and agreed well with experimental separations. Migration times in fused-silica capillaries and linear polyacrylamide-coated capillaries were within 7% of the experimental values, while peak widths were always narrower than the experimental values, but were still within 50% of those obtained by experiment. Simulations of sweeping were also performed; although migration time agreement was not as good as for simple EKC separations, peak widths were in good agreement, being within 1-50% of the experimental values. All simulations for comparison with experimental data were performed under real experimental conditions using a 47 cm capillary and a voltage of 20 kV and represent the first quantitative attempt at simulating EKC separations with and without sweeping.
Discrete calculus: applied analysis on graphs for computational science
Grady, Leo J
2010-01-01
This unique text brings together into a single framework current research in the three areas of discrete calculus, complex networks, and algorithmic content extraction. Many example applications from several fields of computational science are provided.
Some theoretical issues on computer simulations
Energy Technology Data Exchange (ETDEWEB)
Barrett, C.L.; Reidys, C.M.
1998-02-01
The subject of this paper is the development of mathematical foundations for a theory of simulation. Sequentially updated cellular automata (sCA) over arbitrary graphs are employed as a paradigmatic framework. In developing the theory, the authors focus on the properties of causal dependencies among local mappings in a simulation. The main object of study is the mapping between a graph representing the dependencies among the entities of a simulation and a graph representing the equivalence classes of systems obtained by all possible updates.
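A toy version of such a sequentially updated cellular automaton over a graph can be sketched as follows. The graph, local rule (a XOR/parity rule), and names are my own choices for illustration; the point is that, unlike a parallel CA, the same initial state can map to different global states under different update orders:

```python
def sca_step(adj, states, update_order, local_rule):
    """One sweep of a sequentially updated cellular automaton (sCA):
    vertices are updated one at a time in update_order, so later
    vertices see the NEW values of earlier ones."""
    s = dict(states)
    for v in update_order:
        s[v] = local_rule(s[v], [s[u] for u in adj[v]])
    return s

# Path graph 0 - 1 - 2 with a XOR (parity) local rule.
adj = {0: [1], 1: [0, 2], 2: [1]}
xor_rule = lambda x, neigh: (x + sum(neigh)) % 2
init = {0: 1, 1: 0, 2: 0}

a = sca_step(adj, init, [0, 1, 2], xor_rule)  # forward order
b = sca_step(adj, init, [2, 1, 0], xor_rule)  # reverse order
# The two update orders yield different global states (a != b),
# which is the order dependence the paper's theory is built around.
```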
Large-scale computing techniques for complex system simulations
Dubitzky, Werner; Schott, Bernard
2012-01-01
Complex systems modeling and simulation approaches are being adopted in a growing number of sectors, including finance, economics, biology, astronomy, and many more. Technologies ranging from distributed computing to specialized hardware are explored and developed to address the computational requirements arising in complex systems simulations. The aim of this book is to present a representative overview of contemporary large-scale computing technologies in the context of complex systems simulation applications. The intention is to identify new research directions in this field and
Chiu, Michelle; Posner, Glenn; Humphrey-Murto, Susan
2017-01-27
Simulation-based education has gained popularity, yet many faculty members feel inadequately prepared to teach using this technique. Fellowship training in medical education exists, but there is little information regarding simulation or formal educational programs therein. In our institution, simulation fellowships were offered by individual clinical departments. We recognized the need for a formal curriculum in educational theory. Kern's approach to curriculum development was used to develop, implement, and evaluate the Foundational Elements of Applied Simulation Theory (FEAST) curriculum. Needs assessments resulted in a 26-topic curriculum; each biweekly session built upon the previous. Components essential to success included setting goals and objectives for each interactive session and having dedicated faculty, collaborative leadership and administrative support for the curriculum. Evaluation data was collated and analyzed annually via anonymous feedback surveys, focus groups, and retrospective pre-post self-assessment questionnaires. Data collected from 32 fellows over five years of implementation showed that the curriculum improved knowledge, challenged thinking, and was excellent preparation for a career in simulation-based medical education. Themes arising from focus groups demonstrated that participants valued faculty expertise and the structure, practicality, and content of the curriculum. We present a longitudinal simulation educator curriculum that adheres to a well-described framework of curriculum development. Program evaluation shows that FEAST has increased participant knowledge in key areas relevant to simulation-based education and that the curriculum has been successful in meeting the needs of novice simulation educators. Insights and practice points are offered for educators wishing to implement a similar curriculum in their institution.
Cooperative Control Simulation Validation Using Applied Probability Theory
National Research Council Canada - National Science Library
Schulz, Christopher
2003-01-01
...; however, these simulations lack a method to validate their output. This research presents a method to validate the performance of a decentralized cooperative control simulation environment for an autonomous Wide Area Search Munition (WASM...
Humans, computers and wizards: human (simulated) computer interaction
Fraser, Norman; McGlashan, Scott; Wooffitt, Robin
2013-01-01
Using data taken from a major European Union funded project on speech understanding, the SunDial project, this book considers current perspectives on human computer interaction and argues for the value of an approach taken from sociology which is based on conversation analysis.
Numerical Implementation and Computer Simulation of Tracer ...
African Journals Online (AJOL)
, was most dependent on the source definition and the hydraulic conductivity K of the porous medium. The 12000mg/l chloride tracer source was almost completely dispersed within 34 hours. Keywords: Replication, Numerical simulation, ...
Computational Simulation of Droplet Collision Dynamics
National Research Council Canada - National Science Library
Law, Chung
2000-01-01
..., and the energy partition among the various modes was identified. By using the molecular dynamics method, bouncing and coalescence were successfully simulated for the first time without the artificial manipulation of the inter-droplet gaseous film...
Computational snow avalanche simulation in forested terrain
Teich, M.; Fischer, J.-T.; Feistl, T.; Bebi, P.; Christen, M.; Grêt-Regamey, A.
2014-08-01
Two-dimensional avalanche simulation software operating in three-dimensional terrain is widely used for hazard zoning and engineering to predict runout distances and impact pressures of snow avalanche events. Mountain forests are an effective biological protection measure against avalanches; however, the protective capacity of forests to decelerate or even stop avalanches that start within forested areas or directly above the treeline is seldom considered in this context. In particular, runout distances of small- to medium-scale avalanches are strongly influenced by the structural conditions of forests in the avalanche path. We present an evaluation and operationalization of a novel detrainment function implemented in the avalanche simulation software RAMMS for avalanche simulation in forested terrain. The new approach accounts for the effect of forests in the avalanche path by detraining mass, which leads to deceleration and runout shortening of avalanches. The relationship is parameterized by the detrainment coefficient K [kg m-1 s-2], which accounts for differing forest characteristics. We varied K when simulating 40 well-documented small- to medium-scale avalanches that were released in and ran through forests of the Swiss Alps. Analyzing and statistically comparing observed and simulated runout distances revealed values of K suitable to simulate the combined influence of four forest characteristics on avalanche runout: forest type, crown closure, vertical structure and surface cover. For example, values of K were higher for dense spruce and mixed spruce-beech forests compared to open larch forests at the upper treeline. Considering forest structural conditions within avalanche simulations will improve current applications of avalanche simulation tools in mountain forest and natural hazard management.
REEFER: a digital computer program for the simulation of high energy electron tubes
Energy Technology Data Exchange (ETDEWEB)
Boers, J.E.
1976-11-01
A digital computer program for the simulation of very high-energy electron and ion beams is described. The program includes space-charge effects through the solution of Poisson's equation and magnetic effects (both induced and applied) through the relativistic trajectory equations. Relaxation techniques are employed while alternately computing electric fields and trajectories. Execution time is generally less than 15 minutes on a CDC 6600 digital computer. Either space-charge-limited or field-emission sources may be simulated. The input data is described in detail and an example data set is included.
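The relaxation solution of Poisson's equation that the program alternates with trajectory integration can be illustrated by a minimal Jacobi sweep. This is my own sketch in normalized units (epsilon_0 = 1, grounded boundaries), not the REEFER code:

```python
import numpy as np

def relax_poisson(rho, h, n_iter=500):
    """Jacobi relaxation for the 2-D Poisson equation  lap(phi) = -rho
    (normalized units) on a uniform grid of spacing h, with phi = 0 held
    on the boundary.  NumPy evaluates the right-hand side before the
    assignment, so every interior point is replaced by the average of
    its four OLD neighbours plus the local source term (true Jacobi)."""
    phi = np.zeros_like(rho, dtype=float)
    for _ in range(n_iter):
        phi[1:-1, 1:-1] = 0.25 * (phi[2:, 1:-1] + phi[:-2, 1:-1]
                                  + phi[1:-1, 2:] + phi[1:-1, :-2]
                                  + h * h * rho[1:-1, 1:-1])
    return phi

# Unit point source at the centre of a 17 x 17 grounded box.
rho = np.zeros((17, 17))
rho[8, 8] = 1.0
phi = relax_poisson(rho, h=1.0)
```

The resulting potential is positive around the source and symmetric about both grid axes, as expected for a centred charge in a grounded box.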
Biocellion: accelerating computer simulation of multicellular biological system models.
Kang, Seunghwa; Kahan, Simon; McDermott, Jason; Flann, Nicholas; Shmulevich, Ilya
2014-11-01
Biological system behaviors are often the outcome of complex interactions among a large number of cells and their biotic and abiotic environment. Computational biologists attempt to understand, predict and manipulate biological system behavior through mathematical modeling and computer simulation. Discrete agent-based modeling (in combination with high-resolution grids to model the extracellular environment) is a popular approach for building biological system models. However, the computational complexity of this approach forces computational biologists to resort to coarser resolution approaches to simulate large biological systems. High-performance parallel computers have the potential to address the computing challenge, but writing efficient software for parallel computers is difficult and time-consuming. We have developed Biocellion, a high-performance software framework, to solve this computing challenge using parallel computers. To support a wide range of multicellular biological system models, Biocellion asks users to provide their model specifics by filling the function body of pre-defined model routines. Using Biocellion, modelers without parallel computing expertise can efficiently exploit parallel computers with less effort than writing sequential programs from scratch. We simulate cell sorting, microbial patterning and a bacterial system in soil aggregate as case studies. Biocellion runs on x86 compatible systems with the 64 bit Linux operating system and is freely available for academic use. Visit http://biocellion.com for additional information. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Computer simulation of on-orbit manned maneuvering unit operations
Stuart, G. M.; Garcia, K. D.
1986-01-01
Simulation of spacecraft on-orbit operations is discussed in reference to Martin Marietta's Space Operations Simulation laboratory's use of computer software models to drive a six-degree-of-freedom moving base carriage and two target gimbal systems. In particular, key simulation issues and related computer software models associated with providing real-time, man-in-the-loop simulations of the Manned Maneuvering Unit (MMU) are addressed, with special attention given to how effectively these models and motion systems simulate the MMU's actual on-orbit operations. The weightless effects of the space environment require the development of entirely new devices for locomotion. Since access to space is very limited, it is necessary to design, build, and test these new devices within the physical constraints of earth using simulators. The simulation method discussed here is the technique of using computer software models to drive a Moving Base Carriage (MBC) capable of providing simultaneous six-degree-of-freedom motions. This method, utilized at Martin Marietta's Space Operations Simulation (SOS) laboratory, provides the ability to simulate the operation of manned spacecraft, provides the pilot with proper three-dimensional visual cues, and allows training of on-orbit operations. The purpose here is to discuss significant MMU simulation issues, the related models that were developed in response to these issues, and how effectively these models simulate the MMU's actual on-orbit operations.
Modelling of dusty plasma properties by computer simulation methods
Energy Technology Data Exchange (ETDEWEB)
Baimbetov, F B [IETP, Al Farabi Kazakh National University, 96a, Tole bi St, Almaty 050012 (Kazakhstan); Ramazanov, T S [IETP, Al Farabi Kazakh National University, 96a, Tole bi St, Almaty 050012 (Kazakhstan); Dzhumagulova, K N [IETP, Al Farabi Kazakh National University, 96a, Tole bi St, Almaty 050012 (Kazakhstan); Kadyrsizov, E R [Institute for High Energy Densities of RAS, Izhorskaya 13/19, Moscow 125412 (Russian Federation); Petrov, O F [IETP, Al Farabi Kazakh National University, 96a, Tole bi St, Almaty 050012 (Kazakhstan); Gavrikov, A V [IETP, Al Farabi Kazakh National University, 96a, Tole bi St, Almaty 050012 (Kazakhstan)
2006-04-28
Computer simulation of dusty plasma properties is performed. The radial distribution functions and the diffusion coefficient are calculated on the basis of Langevin dynamics. A comparison with the experimental data is made.
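A minimal illustration of extracting a diffusion coefficient from overdamped Langevin dynamics: a one-dimensional free particle in reduced units, not the authors' dusty-plasma model, with names and parameter values of my choosing:

```python
import math
import random

def diffusion_estimate(n_walkers=500, n_steps=1000, dt=1e-3,
                       gamma=1.0, kT=1.0, seed=0):
    """Overdamped Langevin dynamics of free 1-D particles in reduced
    units.  The step size follows the fluctuation-dissipation relation,
    and the diffusion coefficient is recovered from the Einstein
    relation <x^2> = 2 D t; it should approach D = kT / gamma."""
    rng = random.Random(seed)
    sigma = math.sqrt(2.0 * kT / gamma * dt)   # RMS displacement per step
    t_total = n_steps * dt
    msd = 0.0
    for _ in range(n_walkers):
        x = 0.0
        for _ in range(n_steps):
            x += sigma * rng.gauss(0.0, 1.0)
        msd += x * x
    msd /= n_walkers                           # mean-squared displacement
    return msd / (2.0 * t_total)

D = diffusion_estimate()   # expect roughly kT / gamma = 1
```

Averaging the squared displacement over many independent walkers is what makes the estimate usable; a single trajectory's endpoint is far too noisy.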
Computer Simulation of the Impact of Cigarette Smoking On Humans
African Journals Online (AJOL)
2012-12-01
In this edition, emphasis has been laid on computer simulation of the impact of cigarette smoking on the population between now and the ... Secondary School curriculum in Nigeria. 3. Workshops and seminars should be ...
System Identification Applied to Dynamic CFD Simulation and Wind Tunnel Data
Murphy, Patrick C.; Klein, Vladislav; Frink, Neal T.; Vicroy, Dan D.
2011-01-01
Demanding aerodynamic modeling requirements for military and civilian aircraft have provided impetus for researchers to improve computational and experimental techniques. Model validation is a key component of these research endeavors, so this study is an initial effort to extend conventional time-history comparisons by comparing model parameter estimates and their standard errors using system identification methods. An aerodynamic model of an aircraft performing one-degree-of-freedom roll oscillatory motion about its body axes is developed. The model includes linear aerodynamics and deficiency-function parameters characterizing an unsteady effect. For estimation of the unknown parameters, two techniques, harmonic analysis and two-step linear regression, were applied to roll-oscillatory wind tunnel data and to computational fluid dynamics (CFD) simulated data. The model used for this study is a highly swept wing unmanned aerial combat vehicle. Differences in response prediction, parameter estimates, and standard errors are compared and discussed.
Applied Nonlinear Dynamics Analytical, Computational, and Experimental Methods
Nayfeh, Ali H
1995-01-01
A unified and coherent treatment of analytical, computational and experimental techniques of nonlinear dynamics with numerous illustrative applications. Features a discourse on geometric concepts such as Poincaré maps. Discusses chaos, stability and bifurcation analysis for systems of differential and algebraic equations. Includes scores of examples to facilitate understanding.
Quantum computing applied to calculations of molecular energies
Czech Academy of Sciences Publication Activity Database
Pittner, Jiří; Veis, L.
2011-01-01
Roč. 241 (2011), 151-phys. ISSN 0065-7727. [National Meeting and Exposition of the American Chemical Society (ACS) /241./, 27.03.2011-31.03.2011, Anaheim]. Institutional research plan: CEZ:AV0Z40400503. Keywords: molecular energies; quantum computers. Subject RIV: CF - Physical; Theoretical Chemistry
APPLYING ARTIFICIAL INTELLIGENCE TECHNIQUES TO HUMAN-COMPUTER INTERFACES
DEFF Research Database (Denmark)
Sonnenwald, Diane H.
1988-01-01
A description is given of UIMS (User Interface Management System), a system using a variety of artificial intelligence techniques to build knowledge-based user interfaces combining functionality and information from a variety of computer systems that maintain, test, and configure customer telephone...
Understanding Islamist political violence through computational social simulation
Energy Technology Data Exchange (ETDEWEB)
Watkins, Jennifer H [Los Alamos National Laboratory; Mackerrow, Edward P [Los Alamos National Laboratory; Patelli, Paolo G [Los Alamos National Laboratory; Eberhardt, Ariane [Los Alamos National Laboratory; Stradling, Seth G [Los Alamos National Laboratory
2008-01-01
Understanding the process that enables political violence is of great value in reducing the future demand for and support of violent opposition groups. Methods are needed that allow alternative scenarios and counterfactuals to be scientifically researched. Computational social simulation shows promise in developing 'computer experiments' that would be unfeasible or unethical in the real world. Additionally, the process of modeling and simulation reveals and challenges assumptions that may not be noted in theories, exposes areas where data is not available, and provides a rigorous, repeatable, and transparent framework for analyzing the complex dynamics of political violence. This paper demonstrates the computational modeling process using two simulation techniques: system dynamics and agent-based modeling. The benefits and drawbacks of both techniques are discussed. In developing these social simulations, we discovered that the social science concepts and theories needed to accurately simulate the associated psychological and social phenomena were lacking.
GATE Monte Carlo simulation in a cloud computing environment
Rowedder, Blake Austin
The GEANT4-based GATE is a unique and powerful Monte Carlo (MC) platform, which provides a single code library allowing the simulation of specific medical physics applications, e.g. PET, SPECT, CT, radiotherapy, and hadron therapy. However, this rigorous yet flexible platform is used only sparingly in the clinic due to its lengthy calculation time. By accessing the powerful computational resources of a cloud computing environment, GATE's runtime can be significantly reduced to clinically feasible levels without the sizable investment of a local high performance cluster. This study investigated a reliable and efficient execution of GATE MC simulations using a commercial cloud computing service. Amazon's Elastic Compute Cloud was used to launch several nodes equipped with GATE. Job data was initially broken up on the local computer, then uploaded to the worker nodes on the cloud. The results were automatically downloaded and aggregated on the local computer for display and analysis. Five simulations were repeated for every cluster size between 1 and 20 nodes. Ultimately, increasing cluster size resulted in a decrease in calculation time that could be expressed with an inverse power model. Comparing the benchmark results to the published values and error margins indicated that the simulation results were not affected by the cluster size and thus that the integrity of a calculation is preserved in a cloud computing environment. The runtime of a 53 minute long simulation was decreased to 3.11 minutes when run on a 20-node cluster. The ability to improve the speed of simulation suggests that fast MC simulations are viable for imaging and radiotherapy applications. With high performance computing continuing to fall in price and grow in accessibility, implementing Monte Carlo techniques with cloud computing for clinical applications will continue to become more attractive.
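The inverse power runtime model mentioned above can be fitted with a simple log-log regression. The data below are synthetic ideal-scaling numbers seeded from the abstract's 53-minute baseline, not the study's measurements:

```python
import numpy as np

def fit_inverse_power(nodes, runtimes):
    """Fit runtime = a * nodes**(-b) by ordinary least squares in
    log-log space, where the power law becomes a straight line."""
    slope, intercept = np.polyfit(np.log(nodes), np.log(runtimes), 1)
    return np.exp(intercept), -slope

# Synthetic ideal scaling: a 53-minute job that parallelizes perfectly.
nodes = np.array([1, 2, 4, 8, 16, 20], dtype=float)
runtimes = 53.0 / nodes
a, b = fit_inverse_power(nodes, runtimes)   # expect a = 53, b = 1
```

On real benchmark data, b comes out somewhat below 1 (the study's 53-minute job dropped to 3.11 minutes on 20 nodes rather than the ideal 2.65), reflecting the fixed per-job overhead of splitting, uploading, and aggregating.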
A simulator for quantum computer hardware
Michielsen, K.F.L.; de Raedt, H.A.; De Raedt, K.
We present new examples of the use of the quantum computer (QC) emulator. For educational purposes we describe the implementation of the CNOT and Toffoli gate, two basic building blocks of a QC, on a three qubit NMR-like QC.
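A minimal state-vector sketch of the two gates named above, using my own indexing convention (qubit 0 as the most significant bit); this is an illustration of the gates' action, not the emulator's implementation:

```python
import numpy as np

def cnot(state, control, target, n):
    """CNOT on an n-qubit state vector: for every basis index whose
    control bit is 1, the amplitude moves to the index with the
    target bit flipped.  Qubit 0 is the most significant bit."""
    new = state.copy()
    for i in range(2**n):
        if (i >> (n - 1 - control)) & 1:
            new[i ^ (1 << (n - 1 - target))] = state[i]
    return new

def toffoli(state, c1, c2, target, n):
    """Toffoli (CCNOT): flip the target bit only when both controls are 1."""
    new = state.copy()
    for i in range(2**n):
        if (i >> (n - 1 - c1)) & 1 and (i >> (n - 1 - c2)) & 1:
            new[i ^ (1 << (n - 1 - target))] = state[i]
    return new

# |10> --CNOT(0 -> 1)--> |11>,  and  |110> --Toffoli--> |111>
psi2 = np.zeros(4); psi2[0b10] = 1.0
psi3 = np.zeros(8); psi3[0b110] = 1.0
out2 = cnot(psi2, 0, 1, 2)
out3 = toffoli(psi3, 0, 1, 2, 3)
```

Both gates are permutations of the computational basis, which is why a classical emulator can apply them exactly by moving amplitudes between indices.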
Computer Simulations in the Science Classroom.
Richards, John; And Others
1992-01-01
Explorer is an interactive environment based on a constructivist epistemology of learning that integrates animated computer models with analytic capabilities for learning science. The system includes graphs, a spreadsheet, scripting, and interactive tools. Two examples involving the dynamics of colliding objects and electric circuits illustrate…
Combat Simulation Using Breach Computer Language
1979-09-01
Keywords: combat modeling, computer language, BREACH, urban warfare, MOBA, MOUT. References: "... MOBA Environment," Technical Memorandum 20-78, US Army Human Engineering Laboratory, Aberdeen Proving Ground, MD, July 1978; "Symposium on
Modular Modelling and Simulation Approach - Applied to Refrigeration Systems
DEFF Research Database (Denmark)
Sørensen, Kresten Kjær; Stoustrup, Jakob
2008-01-01
This paper presents an approach to modelling and simulation of the thermal dynamics of a refrigeration system, specifically a reefer container. A modular approach is used and the objective is to increase the speed and flexibility of the developed simulation environment. The refrigeration system is divided into components where the inputs and outputs are described by a set of XML files that can be combined into a composite system model that may be loaded into MATLAB. A set of tools that allows the user to easily load the model and run a simulation is provided. The results show a simulation...
Advanced Simulation and Computing Business Plan
Energy Technology Data Exchange (ETDEWEB)
Rummel, E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)
2015-07-09
To maintain a credible nuclear weapons program, the National Nuclear Security Administration’s (NNSA’s) Office of Defense Programs (DP) needs to make certain that the capabilities, tools, and expert staff are in place and are able to deliver validated assessments. This requires a complete and robust simulation environment backed by an experimental program to test ASC Program models. This ASC Business Plan document encapsulates a complex set of elements, each of which is essential to the success of the simulation component of the Nuclear Security Enterprise. The ASC Business Plan addresses the hiring, mentoring, and retaining of programmatic technical staff responsible for building the simulation tools of the nuclear security complex. The ASC Business Plan describes how the ASC Program engages with industry partners—partners upon whom the ASC Program relies for today’s and tomorrow’s high performance architectures. Each piece in this chain is essential to assure policymakers, who must make decisions based on the results of simulations, that they are receiving all the actionable information they need.
Studying Scientific Discovery by Computer Simulation.
1983-03-30
scientific laws that were induced from data before any theory was available to discover the regularities. To the previous examples, we could add Gregor ...discoveries (excluding those of Mendel and Mendeleev, which we have not simulated) could have been made. The Role of Theory in Law Induction BACON’s
Role of computational efficiency in process simulation
Directory of Open Access Journals (Sweden)
Kurt Strand
1989-07-01
Full Text Available It is demonstrated how efficient numerical algorithms may be combined to yield a powerful environment for analysing and simulating dynamic systems. The importance of using efficient numerical algorithms is emphasized and demonstrated through examples from the petrochemical industry.
Computer Simulation Studies of Trishomocubane Heptapeptide of ...
African Journals Online (AJOL)
As part of an extension on the cage peptide chemistry, the present work involves an assessment of the conformational profile of trishomocubane heptapeptide of the type Ac-Ala3-Tris-Ala3-NHMe using molecular dynamics (MD) simulations. All MD protocols were explored within the framework of a molecular mechanics ...
Bodies Falling with Air Resistance: Computer Simulation.
Vest, Floyd
1982-01-01
Two models are presented. The first assumes that air resistance is proportional to the velocity of the falling body. The second assumes that air resistance is proportional to the square of the velocity. A program written in BASIC that simulates the second model is presented. (MP)
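The second model from the abstract (air resistance proportional to the square of the velocity), translated from its BASIC setting into a short Python sketch with illustrative parameter values of my choosing:

```python
import math

def fall_quadratic_drag(mass=80.0, k=0.25, g=9.8, dt=0.01, t_end=30.0):
    """Euler integration of  m dv/dt = m g - k v**2  for a body falling
    from rest; the velocity approaches the terminal value sqrt(m g / k)."""
    v, t, samples = 0.0, 0.0, []
    while t < t_end:
        v += (g - (k / mass) * v * v) * dt
        t += dt
        samples.append((t, v))
    return samples

v_terminal = math.sqrt(80.0 * 9.8 / 0.25)   # = 56 m/s for these values
trajectory = fall_quadratic_drag()
```

After a few characteristic times (v_t / g, about 5.7 s here) the numerical velocity sits essentially on the terminal value, which gives a quick check of the integration.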
Quantum chemistry simulation on quantum computers: theories and experiments.
Lu, Dawei; Xu, Boruo; Xu, Nanyang; Li, Zhaokai; Chen, Hongwei; Peng, Xinhua; Xu, Ruixue; Du, Jiangfeng
2012-07-14
It has been claimed that quantum computers can mimic quantum systems efficiently, with polynomially scaling resources. Traditionally, such simulations are carried out numerically on classical computers, which are inevitably confronted with exponential growth of the required resources as the size of the quantum system increases. Quantum computers avoid this problem and thus provide a possible solution for large quantum systems. In this paper, we first discuss the ideas of quantum simulation, the background of quantum simulators, their categories, and the development in both theory and experiment. We then present a brief introduction to quantum chemistry evaluated via classical computers, followed by typical procedures of quantum simulation towards quantum chemistry. Reviewed are not only theoretical proposals but also proof-of-principle experimental implementations, via a small quantum computer, which include the evaluation of the static molecular eigenenergy and the simulation of chemical reaction dynamics. Although the experimental development still lags behind the theory, we give prospects and suggestions for future experiments. We anticipate that in the near future quantum simulation will become a powerful tool for quantum chemistry beyond classical computations.
Launch Site Computer Simulation and its Application to Processes
Sham, Michael D.
1995-01-01
This paper provides an overview of computer simulation, the Lockheed developed STS Processing Model, and the application of computer simulation to a wide range of processes. The STS Processing Model is an icon driven model that uses commercial off the shelf software and a Macintosh personal computer. While it usually takes one year to process and launch 8 space shuttles, with the STS Processing Model this process is computer simulated in about 5 minutes. Facilities, orbiters, or ground support equipment can be added or deleted and the impact on launch rate, facility utilization, or other factors measured as desired. This same computer simulation technology can be used to simulate manufacturing, engineering, commercial, or business processes. The technology does not require an 'army' of software engineers to develop and operate, but instead can be used by the layman with only a minimal amount of training. Instead of making changes to a process and realizing the results after the fact, with computer simulation, changes can be made and processes perfected before they are implemented.
Power-feedwater temperature operating domain for Sbwr applying Monte Carlo simulation
Energy Technology Data Exchange (ETDEWEB)
Aguilar M, L. A.; Quezada G, S.; Espinosa M, E. G.; Vazquez R, A.; Varela H, J. R.; Cazares R, R. I.; Espinosa P, G., E-mail: sequega@gmail.com [Universidad Autonoma Metropolitana, Unidad Iztapalapa, San Rafael Atlixco No. 186, Col. Vicentina, 09340 Mexico D. F. (Mexico)
2014-10-15
In this work, an analysis of the effects of feedwater temperature on reactor power in a simplified boiling water reactor (Sbwr), applying a methodology based on Monte Carlo simulation, is presented. The Monte Carlo methodology was applied systematically to establish the operating domain; because the Sbwr is not yet in operation, the analysis of the nuclear and thermal-hydraulic processes must rely on numerical modeling, with the purpose of developing or confirming the design basis and qualifying existing or new computer codes to enable reliable analyses. The results show that reactor power is inversely proportional to feedwater temperature: the reactor power changes by 8% when the feedwater temperature changes by 8%. (Author)
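The reported inverse, one-to-one relationship lends itself to a simple Monte Carlo sketch. The surrogate below is hypothetical (the linear gain, the 215-degree reference temperature, and the sampling range are illustrative assumptions, not the Sbwr model); it only shows how sampling an uncertain input maps out an operating domain:

```python
import random

def power_response(t_feed, t_ref=215.0, p_ref=100.0, gain=-1.0):
    # Hypothetical linear surrogate consistent with the reported trend:
    # an 8% rise in feedwater temperature gives an 8% drop in power.
    return p_ref * (1.0 + gain * (t_feed - t_ref) / t_ref)

rng = random.Random(42)
samples = [power_response(rng.uniform(195.0, 235.0)) for _ in range(10_000)]
lo, hi = min(samples), max(samples)
print(round(lo, 1), round(hi, 1))  # Monte Carlo estimate of the power domain
```

The same sampling loop extends directly to several correlated uncertain inputs, which is where a Monte Carlo treatment pays off over scanning each input separately.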
The Research of Granular Computing Applied in Image Mosaic
Directory of Open Access Journals (Sweden)
Xiuping Ping
2013-11-01
Full Text Available Based on existing image mosaic technology, this paper introduces granular computing and obtains a simplified new algorithm. The image mosaic executed by this algorithm first establishes a correlation model on the basis of granular computing theory and obtains the edge map of each image to be mosaicked. The new calculation method is used to compute the gradient in different columns of the edge map, obtaining the feature-point coordinates with the maximum gradient; meanwhile, all feature points of the two images are matched with each other to acquire the best matching point. In addition, an error-correcting mechanism is introduced into the matching process to delete feature points with matching errors. The correlation calculation is carried out on the matching pixels acquired by the above processing to get the feature transformation matrix between the two images.
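The column-wise gradient step can be sketched roughly as follows (our simplified reading of the abstract; the function name and array shapes are assumptions, not the paper's code): for each column of a binary edge map, take the row where the vertical gradient is largest as that column's candidate feature point.

```python
import numpy as np

def column_feature_points(edge_map):
    # Vertical gradient per column; the row of the maximum |gradient|
    # serves as that column's candidate feature point.
    grad = np.abs(np.diff(edge_map.astype(float), axis=0))
    return grad.argmax(axis=0)

# Toy edge map: one edge pixel per column at known rows 2, 3, 1, 4.
edge_map = np.zeros((6, 4), dtype=int)
for col, row in enumerate([2, 3, 1, 4]):
    edge_map[row, col] = 1
print(column_feature_points(edge_map))
```

Matching such points between two overlapping images, plus an outlier-rejection pass, is then what yields the transformation matrix described above.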
A computer simulator for development of engineering system design methodologies
Padula, S. L.; Sobieszczanski-Sobieski, J.
1987-01-01
A computer program designed to simulate and improve engineering system design methodology is described. The simulator mimics the qualitative behavior and data couplings occurring among the subsystems of a complex engineering system. It eliminates the engineering analyses in the subsystems by replacing them with judiciously chosen analytical functions. With the cost of analysis eliminated, the simulator is used for experimentation with a large variety of candidate algorithms for multilevel design optimization to choose the best ones for the actual application. Thus, the simulator serves as a development tool for multilevel design optimization strategy. The simulator concept, implementation, and status are described and illustrated with examples.
Computational fluid dynamics simulations and validations of results
CSIR Research Space (South Africa)
Sitek, MA
2013-09-01
Full Text Available Fifth International Conference on Structural Engineering, Mechanics and Computation, Cape Town, South Africa, 2-4 September 2013. Computational fluid dynamics simulations and validation of results. M.A. Sitek, M. Cwik, M.A. Gizejowski, Warsaw...
Probability: Actual Trials, Computer Simulations, and Mathematical Solutions.
Walton, Karen Doyle; Walton, J. Doyle
The purpose of this teaching unit is to approach elementary probability problems in three ways. First, actual trials are performed and results are recorded. Second, a simple computer simulation of the problem provided on diskette and written for Apple IIe and IIc computers, is run several times. Finally, the mathematical solution of the problem is…
Frontiers in Applied and Computational Mathematics '05
2005-03-01
interactions involved in the regulated secretion of the hormone prolactin from pituitary lactotrophs. During the first ten days of pregnancy in rats, this...hormone is secreted in a rhythmic fashion, consisting of two pulses per day. These pulses have functional significance, since pregnancy is aborted if the...materials and the locomotion of nematodes, I will discuss different models and simulations of dynamic flexible bodies interacting with fluids. In two very
Simulations of an Optical Tactile Sensor Based on Computer Tomography
Ohka, Masahiro; Sawamoto, Yasuhiro; Zhu, Ning
In order to create a robotic tactile sensor of thin shape, a new optical tactile sensor is developed by applying a CT (Computer Tomography) algorithm. The present tactile sensor comprises infrared emitting diode arrays, receiving phototransistor arrays, a transparent acrylic plate, and a black rubber sheet with projections. Infrared rays emitted from the diode array are directed into one end of the plate and their intensity distribution is measured by the phototransistor array mounted on the other end. If the CT algorithm is directly applied to the tactile sensor, there are two shortcomings: the shape of the sensing area is limited to a circular region, and the calculation time is long. Thus, a new CT algorithm oriented to tactile sensing is proposed to overcome these problems. In the present algorithm, a square sensing area is divided into an N-by-N array and algebraic equations are derived from the relationship between the input and output light intensities on the assumed light projections. Several reconstruction methods are considered for obtaining the pressure values arising in the squares. In the present study, the ART (Algebraic Reconstruction Technique) and LU decomposition methods were employed, and these methods were compared to select the best reconstruction method. In a series of simulations, it was found that the LU decomposition method held an advantage for the present type of tactile sensor because of its robustness against disturbance and short calculation time.
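The ART step can be sketched with a generic Kaczmarz iteration (our notation and toy geometry, not the authors' implementation): each ray-sum equation over the square pressures is enforced in turn.

```python
import numpy as np

def art_reconstruct(A, b, sweeps=200, relax=1.0):
    # Kaczmarz/ART: repeatedly project the estimate onto each ray-sum
    # constraint A[i] @ x = b[i], with an optional relaxation factor.
    x = np.zeros(A.shape[1])
    for _ in range(sweeps):
        for i in range(A.shape[0]):
            a = A[i]
            x += relax * (b[i] - a @ x) / (a @ a) * a
    return x

# Toy 2x2 sensing area (4 unknown pressures) probed by row and column sums.
A = np.array([[1.0, 1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0, 0.0],
              [0.0, 1.0, 0.0, 1.0]])
p_true = np.array([0.0, 2.0, 1.0, 3.0])
p = art_reconstruct(A, A @ p_true)
print(np.round(p, 2))
```

The LU-decomposition alternative mentioned in the abstract instead solves the same algebraic system directly, which is why it is faster when the system is square and well-conditioned.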
Applying virtual environments to training and simulation (abstract)
Jense, G.J.; Kuijper, F.
1993-01-01
Virtual environment (VE) technology is expected to make a big impact on future training and simulation systems. Direct stimulation of human-senses (eyesight, auditory, tactile) and new paradigms for user input will improve the realism of simulations and thereby the effectiveness of training systems.
Quantum computer gate simulations | Dada | Journal of the Nigerian ...
African Journals Online (AJOL)
As a result of this, beginners are often at a loss when trying to interact with them. The simulator here proposed therefore is aimed at bridging the gap somewhat, making quantum computer simulation more accessible to novices in the field. Journal of the Nigerian Association of Mathematical Physics Vol. 10 2006: pp. 433- ...
Computer Simulation of the Population Growth (Schizosaccharomyces Pombe) Experiment.
Daley, Michael; Hillier, Douglas
1981-01-01
Describes a computer program (available from authors) developed to simulate "Growth of a Population (Yeast) Experiment." Students actively revise the counting techniques with realistically simulated haemocytometer or eye-piece grid and are reminded of the necessary dilution technique. Program can be modified to introduce such variables…
Computational fluid dynamics (CFD) simulation of hot air flow ...
African Journals Online (AJOL)
Computational Fluid Dynamics simulation of air flow distribution, air velocity and pressure field pattern as it will affect moisture transient in a cabinet tray dryer is performed using SolidWorks Flow Simulation (SWFS) 2014 SP 4.0 program. The model used for the drying process in this experiment was designed with Solid ...
Simulation of quantum computation : A deterministic event-based approach
Michielsen, K; De Raedt, K; De Raedt, H
We demonstrate that locally connected networks of machines that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of quantum computation. We present simulation results for basic quantum operations such as the Hadamard and the controlled-NOT gate, and
Simulation of Quantum Computation : A Deterministic Event-Based Approach
Michielsen, K.; Raedt, K. De; Raedt, H. De
2005-01-01
We demonstrate that locally connected networks of machines that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of quantum computation. We present simulation results for basic quantum operations such as the Hadamard and the controlled-NOT gate, and
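For comparison, a conventional state-vector simulation of the same two gates (plain matrix algebra, not the deterministic event-based networks of the paper) looks like this:

```python
import numpy as np

H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)   # Hadamard
I2 = np.eye(2)
CNOT = np.array([[1.0, 0.0, 0.0, 0.0],    # control = first qubit
                 [0.0, 1.0, 0.0, 0.0],
                 [0.0, 0.0, 0.0, 1.0],
                 [0.0, 0.0, 1.0, 0.0]])

psi = np.zeros(4); psi[0] = 1.0            # start in |00>
psi = CNOT @ (np.kron(H, I2) @ psi)        # H on qubit 0, then CNOT
print(np.round(psi, 3))                    # Bell-state amplitudes
```

The event-based approach replaces these global matrix-vector products with local, particle-by-particle update rules, which is precisely what makes it interesting as a simulation model.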
Development of a Computer Simulation for a Car Deceleration ...
African Journals Online (AJOL)
This is very practical, technical, and it happens every day. In this paper, we studied the factors responsible for this event. Using a computer simulation that is based on a mathematical model, we implemented the simulation of a car braking model and showed how long it takes a car to come to rest while considering certain ...
Computer Based Modelling and Simulation-Modelling and ...
Indian Academy of Sciences (India)
Home; Journals; Resonance – Journal of Science Education; Volume 6; Issue 4. Computer Based Modelling and Simulation-Modelling and Simulation with Probability and Throwing Dice. N K Srinivasan. General Article Volume 6 Issue 4 April 2001 pp 69-77 ...
Computer-Based Simulation Games in Public Administration Education
Directory of Open Access Journals (Sweden)
Kutergina Evgeniia
2017-12-01
Full Text Available Computer simulation, an active learning technique, is now one of the advanced pedagogical technologies. The use of simulation games in the educational process allows students to gain a firsthand understanding of the processes of real life. Public-administration, public-policy and political-science courses increasingly adopt simulation games in universities worldwide. Besides person-to-person simulation games, there are computer-based simulations in public-administration education. Currently in Russia the use of computer-based simulation games in Master of Public Administration (MPA) curricula is quite limited. This paper focuses on computer-based simulation games for students of MPA programmes. Our aim was to analyze outcomes of implementing such games in MPA curricula. We have done so by (1) developing three computer-based simulation games about allocating public finances, (2) testing the games in the learning process, and (3) conducting a post-test examination to evaluate the effect of simulation games on students' knowledge of municipal finances. This study was conducted in the National Research University Higher School of Economics (HSE) and in the Russian Presidential Academy of National Economy and Public Administration (RANEPA) during the period of September to December 2015, in Saint Petersburg, Russia. Two groups of students were randomly selected in each university and then randomly allocated either to the experimental or the control group. In control groups (n=12 in HSE, n=13 in RANEPA) students had traditional lectures. In experimental groups (n=12 in HSE, n=13 in RANEPA) students played three simulation games apart from traditional lectures. This exploratory research shows that the use of computer-based simulation games in MPA curricula can improve students' outcomes by 38%. In general, the experimental groups had better performances on the post-test examination (Figure 2). Students in the HSE experimental group had 27.5% better
Computational Dehydration of Crystalline Hydrates Using Molecular Dynamics Simulations
DEFF Research Database (Denmark)
Larsen, Anders Støttrup; Rantanen, Jukka; Johansson, Kristoffer E
2017-01-01
Molecular dynamics (MD) simulations have evolved into an increasingly reliable and accessible technique and are today implemented in many areas of the biomedical sciences. We present a generally applicable method to study the dehydration of hydrates based on MD simulations and apply this approach to the de...
COMPUTER SIMULATION OF A STIRLING REFRIGERATING MACHINE
Directory of Open Access Journals (Sweden)
V.V. Trandafilov
2015-10-01
Full Text Available In the present numerical research, a mathematical model for precise performance simulation and detailed behavior of a Stirling refrigerating machine is considered. The mathematical model for an alpha Stirling refrigerating machine with helium as the working fluid will be useful in optimizing the mechanical design of these machines. A complete non-linear mathematical model of the machine, including the thermodynamics of helium, heat transfer from the walls, and heat transfer and gas resistance in the regenerator, is developed. Non-dimensional groups are derived, and the mathematical model is numerically solved. Important design parameters are varied and their effect on Stirling refrigerating machine performance determined. The simulation results of the Stirling refrigerating machine, which include heat transfer and coefficient of performance, are presented.
Computer Simulations of Lipid Bilayers and Proteins
DEFF Research Database (Denmark)
Sonne, Jacob
2006-01-01
profile. The pressure profile changes when small molecules partition into the bilayer and it has previously been suggested that such changes may be related to general anesthesia. MD simulations play an important role when studying the possible coupling between general anesthesia and changes...... in the pressure profile since the pressure profile cannot be measured in traditional experiments. Even so, pressure profile calculations from MD simulations are not trivial due to both fundamental and technical issues. We addressed two such issues namely the uniqueness of the pressure profile and the effect...... BtuCD belongs to the adenosine triphosphate (ATP) binding cassette (ABC) transporter family that use ATP to drive active transport of a wide variety of compounds across cell membranes. BtuCD accounts for vitamin B12 import into Escherichia coli and is one of the only ABC transporters for which a reliable...
Computer Simulation of Turbulent Reactive Gas Dynamics
Directory of Open Access Journals (Sweden)
Bjørn H. Hjertager
1984-10-01
Full Text Available A simulation procedure capable of handling transient compressible flows involving combustion is presented. The method uses the velocity components and pressure as primary flow variables. The differential equations governing the flow are discretized by integration over control volumes. The integration is performed by application of up-wind differencing in a staggered grid system. The solution procedure is an extension of the SIMPLE-algorithm accounting for compressibility effects.
Computer simulation of functioning of elements of security systems
Godovykh, A. V.; Stepanov, B. P.; Sheveleva, A. A.
2017-01-01
The article is devoted to the development of an information complex for simulating the functioning of security system elements. The complex is described from the point of view of its main objectives, design concept, and the interrelation of its main elements. The proposed concept of computer simulation provides an opportunity to simulate the operation of the security system for training security staff during normal and emergency operation.
Simulation of scanning transmission electron microscope images on desktop computers
Energy Technology Data Exchange (ETDEWEB)
Dwyer, C., E-mail: christian.dwyer@mcem.monash.edu.au [Monash Centre for Electron Microscopy, Department of Materials Engineering, Monash University, Victoria 3800 (Australia)
2010-02-15
Two independent strategies are presented for reducing the computation time of multislice simulations of scanning transmission electron microscope (STEM) images: (1) optimal probe sampling, and (2) the use of desktop graphics processing units. The first strategy is applicable to STEM images generated by elastic and/or inelastic scattering, and requires minimal effort for its implementation. Used together, these two strategies can reduce typical computation times from days to hours, allowing practical simulation of STEM images of general atomic structures on a desktop computer.
A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION
Energy Technology Data Exchange (ETDEWEB)
Mike Bockelie; Dave Swensen; Martin Denison; Zumao Chen; Mike Maguire; Adel Sarofim; Changguan Yang; Hong-Shig Shim
2004-01-28
This is the thirteenth Quarterly Technical Report for DOE Cooperative Agreement No: DE-FC26-00NT41047. The goal of the project is to develop and demonstrate a Virtual Engineering-based framework for simulating the performance of Advanced Power Systems. Within the last quarter, good progress has been made on all aspects of the project. Software development efforts have focused on a preliminary detailed software design for the enhanced framework. Given the complexity of the individual software tools from each team (i.e., Reaction Engineering International, Carnegie Mellon University, Iowa State University), a robust, extensible design is required for the success of the project. In addition to achieving a preliminary software design, significant progress has been made on several development tasks for the program. These include: (1) the enhancement of the controller user interface to support detachment from the Computational Engine and support for multiple computer platforms, (2) modification of the Iowa State University interface-to-kernel communication mechanisms to meet the requirements of the new software design, (3) decoupling of the Carnegie Mellon University computational models from their parent IECM (Integrated Environmental Control Model) user interface for integration with the new framework and (4) development of a new CORBA-based model interfacing specification. A benchmarking exercise to compare process and CFD based models for entrained flow gasifiers was completed. A summary of our work on intrinsic kinetics for modeling coal gasification has been completed. Plans for implementing soot and tar models into our entrained flow gasifier models are outlined. Plans for implementing a model for mercury capture based on conventional capture technology, but applied to an IGCC system, are outlined.
Applied Computational Electromagnetics Society Journal, Volume 9, Number 2
1994-07-01
Directory of Open Access Journals (Sweden)
Supat Faarungsang
2017-04-01
Full Text Available The Reverse Threshold Model Theory (RTMT) model was introduced based on limiting factor concepts, but its efficiency compared to the Conventional Model (CM) has not been published. This investigation assessed the efficiency of RTMT compared to CM using computer simulation on the "One Laptop Per Child" computer and a desktop computer. Based on probability values, it was found that RTMT was more efficient than CM among eight treatment combinations, and an earlier study verified that RTMT gives complete elimination of random error. Furthermore, RTMT has several advantages over CM and is therefore proposed to be applied to most research data.
Moraes, R.; Fonseca, R.M.; Helici, M.; Heemink, A.W.; Jansen, J.D.
2017-01-01
We present an efficient workflow that combines multiscale (MS) forward simulation and stochastic gradient computation - MS-StoSAG - for the optimization of well controls applied to waterflooding under geological uncertainty. The Iterative Multiscale Finite Volume (i-MSFV), a mass conservative
Micromechanics-Based Computational Simulation of Ceramic Matrix Composites
Murthy, Pappu L. N.; Mutal, Subodh K.; Duff, Dennis L. (Technical Monitor)
2003-01-01
Advanced high-temperature Ceramic Matrix Composites (CMC) hold an enormous potential for use in aerospace propulsion system components and certain land-based applications. However, being relatively new materials, a reliable design properties database of sufficient fidelity does not yet exist. To characterize these materials solely by testing is cost- and time-prohibitive. Computational simulation then becomes very useful to limit the experimental effort and reduce the design cycle time. The authors have been involved for over a decade in developing micromechanics-based computational simulation techniques (computer codes) to simulate all aspects of CMC behavior, including quantification of the scatter that these materials exhibit. A brief summary of the capabilities of these computer codes, with typical examples along with their use in the design/analysis of certain structural components, is the subject matter of this presentation.
Computer Simulation of Breast Cancer Screening
2001-07-01
The signals at A and B may be written, respectively, as ESF_A = P + S (1) and ESF_B = P + S/2 (2), where P is the primary signal. The scatter-to-primary ratio at point A (SPR) may be computed from the digital signal values (among other ways) as: S = 2 x (ESF_A - ESF_B) (3), P = ESF_A - S (4), SPR = S/P (5). [FIG. 4: (a) Matched primary-only and primary-plus-scatter ESFs, plotted against pixel position, and (b) the resulting SPR.]
Applied simulation and optimization : in logistics, industrial and aeronautical practice
Mujica Mota, Miguel; De la Mota, Idalia Flores; Guimarans Serrano, Daniel
2015-01-01
Presenting techniques, case-studies and methodologies that combine the use of simulation approaches with optimization techniques for facing problems in manufacturing, logistics, or aeronautical problems, this book provides solutions to common industrial problems in several fields, which range from
Parallel computing in cluster of GPU applied to a problem of nuclear engineering
Energy Technology Data Exchange (ETDEWEB)
Moraes, Sergio Ricardo S.; Heimlich, Adino, E-mail: sergio.moraes@ifrj.edu.br [Instituto Federal de Educacao, Ciencia e Tecnologia do Rio de Janeiro (IFRJ), Nilopolis, RJ (Brazil); Resende, Pedro [Universidade Gama Filho (UGF), Rio de Janeiro, RJ (Brazil). Departamento de Ciencia da Computacao; Mol, Antonio C.A.; Pereira, Claudio M.N.A., E-mail: cmnap@ien.gov.br [Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil)
2013-07-01
Cluster computing has been widely used as a low-cost alternative for parallel processing in scientific applications. With the use of the Message-Passing Interface (MPI) protocol, development became even more accessible and widespread in the scientific community. A more recent trend is the use of the Graphics Processing Unit (GPU), a powerful co-processor able to perform hundreds of instructions in parallel, reaching a capacity of hundreds of times the processing of a CPU. However, a standard PC does not allow, in general, more than two GPUs. Hence, this work proposes the development and evaluation of a hybrid, low-cost parallel approach to the solution of a typical nuclear engineering problem. The idea is to use cluster parallelism technology (MPI) together with GPU programming techniques (CUDA - Compute Unified Device Architecture) to simulate neutron transport through a slab using the Monte Carlo method. Using a cluster comprising four quad-core computers with 2 GPUs each, programs were developed with the MPI and CUDA technologies. Experiments applying different configurations, from 1 to 8 GPUs, have been performed, and results were compared with the sequential (non-parallel) version. A speed-up of about 2,000 times has been observed when comparing the 8-GPU with the sequential version. Results presented here are discussed and analyzed with the objective of outlining gains and possible limitations of the proposed approach. (author)
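The physics kernel of such a study can be sketched serially (the paper distributes this loop with MPI across nodes and CUDA within each GPU; the cross-section, slab width, and scattering probability below are illustrative, not the paper's values):

```python
import math
import random

def slab_transmission(width, sigma_t, scatter_prob, n=100_000, seed=1):
    # Toy 1-D Monte Carlo: neutrons enter a slab normally; free-flight
    # lengths are sampled from the exponential law, and each collision
    # either scatters the neutron isotropically or absorbs it.
    rng = random.Random(seed)
    transmitted = 0
    for _ in range(n):
        x, mu = 0.0, 1.0                       # position, direction cosine
        while True:
            x += mu * (-math.log(1.0 - rng.random()) / sigma_t)
            if x >= width:
                transmitted += 1               # leaked through the far face
                break
            if x <= 0.0:
                break                          # reflected back out the front
            if rng.random() < scatter_prob:
                mu = 2.0 * rng.random() - 1.0  # isotropic scatter
            else:
                break                          # absorbed
    return transmitted / n

frac = slab_transmission(width=2.0, sigma_t=1.0, scatter_prob=0.5)
print(frac)
```

Because the histories are independent, the loop parallelizes trivially: each MPI rank or GPU thread block runs its own batch with its own random stream, and only the transmitted counts are reduced at the end.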
Applied & Computational MathematicsChallenges for the Design and Control of Dynamic Energy Systems
Energy Technology Data Exchange (ETDEWEB)
Brown, D L; Burns, J A; Collis, S; Grosh, J; Jacobson, C A; Johansen, H; Mezic, I; Narayanan, S; Wetter, M
2011-03-10
The Energy Independence and Security Act of 2007 (EISA) was passed with the goal 'to move the United States toward greater energy independence and security.' Energy security and independence cannot be achieved unless the United States addresses the issue of energy consumption in the building sector and significantly reduces energy consumption in buildings. Commercial and residential buildings account for approximately 40% of the U.S. energy consumption and emit 50% of CO{sub 2} emissions in the U.S. which is more than twice the total energy consumption of the entire U.S. automobile and light truck fleet. A 50%-80% improvement in building energy efficiency in both new construction and in retrofitting existing buildings could significantly reduce U.S. energy consumption and mitigate climate change. Reaching these aggressive building efficiency goals will not happen without significant Federal investments in areas of computational and mathematical sciences. Applied and computational mathematics are required to enable the development of algorithms and tools to design, control and optimize energy efficient buildings. The challenge has been issued by the U.S. Secretary of Energy, Dr. Steven Chu (emphasis added): 'We need to do more transformational research at DOE including computer design tools for commercial and residential buildings that enable reductions in energy consumption of up to 80 percent with investments that will pay for themselves in less than 10 years.' On July 8-9, 2010 a team of technical experts from industry, government and academia were assembled in Arlington, Virginia to identify the challenges associated with developing and deploying new computational methodologies and tools that will address building energy efficiency. These experts concluded that investments in fundamental applied and computational mathematics will be required to build enabling technology that can be used to realize the target of 80% reductions in energy
Associative Memory computing power and its simulation.
Ancu, L S; Britzger, D; Giannetti, P; Howarth, J W; Luongo, C; Pandini, C; Schmitt, S; Volpi, G
2015-01-01
An important step in the ATLAS upgrade program is the installation of a tracking processor, the Fast Tracker (FTK), with the goal of identifying the tracks generated by charged particles originating from the LHC 14 TeV proton-proton collisions. The collisions will generate thousands of hits in each layer of the silicon tracker detector, and track identification is a very challenging computational problem. At the core of the FTK there is an associative memory (AM) system, made with hundreds of AM ASIC chips, specifically designed to allow pattern identification in high-density environments at very high speed. This component is able to organize the following steps of the track identification, providing huge computing power for a specific application. The AM system will in fact be able to reconstruct tracks in tens of microseconds. Within the FTK team there has also been a constant effort to maintain a detailed emulation of the system, to predict the impact of single component features on the final performance and on the ATLAS da...
Computer simulation of vasectomy for wolf control
Haight, R.G.; Mech, L.D.
1997-01-01
Recovering gray wolf (Canis lupus) populations in the Lake Superior region of the United States are prompting state management agencies to consider strategies to control population growth. In addition to wolf removal, vasectomy has been proposed. To predict the population effects of different sterilization and removal strategies, we developed a simulation model of wolf dynamics using simple rules for demography and dispersal. Simulations suggested that the effects of vasectomy and removal in a disjunct population depend largely on the degree of annual immigration. With low immigration, periodic sterilization reduced pup production and resulted in lower rates of territory recolonization. Consequently, average pack size, number of packs, and population size were significantly less than those for an untreated population. Periodically removing a proportion of the population produced roughly the same trends as did sterilization; however, more than twice as many wolves had to be removed than sterilized. With high immigration, periodic sterilization reduced pup production but not territory recolonization and produced only moderate reductions in population size relative to an untreated population. Similar reductions in population size were obtained by periodically removing large numbers of wolves. Our analysis does not address the possible effects of vasectomy on larger wolf populations, but it suggests that the subject should be considered through modeling or field testing.
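The qualitative contrast between sterilization and removal can be caricatured with a far simpler projection than the authors' territory-based model (the growth rate and all numbers below are made up; the real model tracks packs, dispersal, and immigration):

```python
def project_population(n0, growth, years, sterile_frac=0.0, removal=0):
    # Toy annual projection: growth is scaled by the fertile fraction,
    # then a fixed number of removals is taken each year.
    n = float(n0)
    for _ in range(years):
        n += n * growth * (1.0 - sterile_frac)
        n = max(n - removal, 0.0)
    return n

base = project_population(100, 0.25, 10)
sterilized = project_population(100, 0.25, 10, sterile_frac=0.5)
removed = project_population(100, 0.25, 10, removal=20)
print(round(base), round(sterilized), round(removed))
```

Even this caricature shows the lever the abstract describes: sterilization suppresses the multiplicative growth term itself, while removal only subtracts animals after growth has occurred, so matching sterilization's effect by removal requires taking out many more individuals.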
A Coupled Earthquake-Tsunami Simulation Framework Applied to the Sumatra 2004 Event
Vater, Stefan; Bader, Michael; Behrens, Jörn; van Dinther, Ylona; Gabriel, Alice-Agnes; Madden, Elizabeth H.; Ulrich, Thomas; Uphoff, Carsten; Wollherr, Stephanie; van Zelst, Iris
2017-04-01
Large earthquakes along subduction zone interfaces have generated destructive tsunamis near Chile in 1960, Sumatra in 2004, and northeast Japan in 2011. In order to better understand these extreme events, we have developed tools for physics-based, coupled earthquake-tsunami simulations. This simulation framework is applied to the 2004 Indian Ocean M 9.1-9.3 earthquake and tsunami, a devastating event that resulted in the loss of more than 230,000 lives. The earthquake rupture simulation is performed using an ADER discontinuous Galerkin discretization on an unstructured tetrahedral mesh with the software SeisSol. Advantages of this approach include accurate representation of complex fault and sea floor geometries and a parallelized and efficient workflow in high-performance computing environments. Accurate and efficient representation of the tsunami evolution and inundation at the coast is achieved with an adaptive mesh discretizing the shallow water equations with a second-order Runge-Kutta discontinuous Galerkin (RKDG) scheme. With the application of the framework to this historic event, we aim to better understand the involved mechanisms between the dynamic earthquake within the earth's crust, the resulting tsunami wave within the ocean, and the final coastal inundation process. Earthquake model results are constrained by GPS surface displacements and tsunami model results are compared with buoy and inundation data. This research is part of the ASCETE Project, "Advanced Simulation of Coupled Earthquake and Tsunami Events", funded by the Volkswagen Foundation.
Computer simulation of chemical reactions in porous materials
Turner, Christoffer Heath
Understanding reactions in nanoporous materials from a purely experimental perspective is a difficult task. Measuring the chemical composition of a reacting system within a catalytic material is usually only accomplished through indirect methods, and it is usually impossible to distinguish between true chemical equilibrium and metastable states. In addition, measuring molecular orientation or distribution profiles within porous systems is not easily accomplished. However, molecular simulation techniques are well-suited to these challenges. With appropriate simulation techniques and realistic molecular models, it is possible to validate the dominant physical and chemical forces controlling nanoscale reactivity. Novel nanostructured catalysts and supports can be designed, optimized, and tested using high-performance computing and advanced modeling techniques in order to guide the search for next-generation catalysts---setting new targets for the materials synthesis community. We have simulated the conversion of several different equilibrium-limited reactions within microporous carbons and we find that the pore size, pore geometry, and surface chemistry are important factors for determining the reaction yield. The equilibrium-limited reactions that we have modeled include nitric oxide dimerization, ammonia synthesis, and the esterification of acetic acid, all of which show yield enhancements within microporous carbons. In conjunction with a yield enhancement of the esterification reaction, selective adsorption of ethyl acetate within carbon micropores demonstrates an efficient method for product recovery. Additionally, a new method has been developed for simulating reaction kinetics within porous materials and other heterogeneous environments. The validity of this technique is first demonstrated by reproducing the kinetics of hydrogen iodide decomposition in the gas phase, and then predictions are made within slit-shaped carbon pores and carbon nanotubes. The rate
Personal Computer (PC) based image processing applied to fluid mechanics
Cho, Y.-C.; Mclachlan, B. G.
1987-01-01
A PC based image processing system was employed to determine the instantaneous velocity field of a two-dimensional unsteady flow. The flow was visualized using a suspension of seeding particles in water, and a laser sheet for illumination. With a finite time exposure, the particle motion was captured on a photograph as a pattern of streaks. The streak pattern was digitized and processed using various imaging operations, including contrast manipulation, noise cleaning, filtering, statistical differencing, and thresholding. Information concerning the velocity was extracted from the enhanced image by measuring the length and orientation of the individual streaks. The fluid velocities deduced from the randomly distributed particle streaks were interpolated to obtain velocities at uniform grid points. For the interpolation a simple convolution technique with an adaptive Gaussian window was used. The results are compared with a numerical prediction by a Navier-Stokes computation.
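The convolution interpolation with a Gaussian window described above can be sketched as a normalized weighted average of the scattered streak velocities. This is an illustrative stand-in, not the paper's code: the function name is invented, and the window width `h` is fixed here, whereas an adaptive scheme would vary it with the local seeding density.

```python
import numpy as np

def gaussian_window_interp(xp, yp, up, grid_x, grid_y, h):
    """Interpolate scattered velocity samples up at points (xp, yp) onto a
    regular grid via a normalized Gaussian-window convolution.
    h is the window width (fixed here; adaptive in the original method)."""
    gx, gy = np.meshgrid(grid_x, grid_y, indexing="ij")
    # Squared distances from every grid node to every particle
    d2 = (gx[..., None] - xp) ** 2 + (gy[..., None] - yp) ** 2
    w = np.exp(-d2 / (2.0 * h * h))
    # Normalized weighted average of the particle velocities
    return (w * up).sum(axis=-1) / w.sum(axis=-1)

# Scattered samples of a uniform shear flow u(x, y) = x
rng = np.random.default_rng(0)
xp = rng.uniform(0, 1, 200)
yp = rng.uniform(0, 1, 200)
up = xp.copy()
u_grid = gaussian_window_interp(xp, yp, up, np.linspace(0, 1, 8),
                                np.linspace(0, 1, 8), h=0.1)
```

For a shear flow the interpolated field should increase along x, which gives a quick sanity check of the weighting.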
High-performance Computing Applied to Semantic Databases
Energy Technology Data Exchange (ETDEWEB)
Goodman, Eric L.; Jimenez, Edward; Mizell, David W.; al-Saffar, Sinan; Adolf, Robert D.; Haglin, David J.
2011-06-02
To date, the application of high-performance computing resources to Semantic Web data has largely focused on commodity hardware and distributed memory platforms. In this paper we make the case that more specialized hardware can offer superior scaling and close to an order of magnitude improvement in performance. In particular we examine the Cray XMT. Its key characteristics, a large, global shared memory and processors with a memory-latency-tolerant design, offer an environment conducive to programming for the Semantic Web and have engendered results that far surpass the current state of the art. We examine three fundamental pieces requisite for a fully functioning semantic database: dictionary encoding, RDFS inference, and query processing. We show scaling up to 512 processors (the largest configuration we had available), and the ability to process 20 billion triples completely in-memory.
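Dictionary encoding, the first of the three pieces named above, maps each distinct RDF term to an integer so triples become compact integer tuples. The sketch below is a sequential, in-memory stand-in for what the Cray XMT work performs in parallel over a shared hash table; the function name and the sample triples are invented.

```python
def encode_triples(triples):
    """Assign each distinct RDF term (subject, predicate, or object) an
    integer id, turning string triples into compact integer tuples."""
    dictionary, encoded = {}, []
    for triple in triples:
        ids = []
        for term in triple:
            if term not in dictionary:
                dictionary[term] = len(dictionary)  # next unused id
            ids.append(dictionary[term])
        encoded.append(tuple(ids))
    return dictionary, encoded

triples = [
    ("ex:alice", "rdf:type", "ex:Person"),
    ("ex:bob", "rdf:type", "ex:Person"),
]
d, enc = encode_triples(triples)
# enc == [(0, 1, 2), (3, 1, 2)] -- shared terms reuse the same id
```

The shared-term reuse is what makes downstream inference and query processing operate on small fixed-width integers rather than strings.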
A computer code to simulate X-ray imaging techniques
Energy Technology Data Exchange (ETDEWEB)
Duvauchelle, Philippe E-mail: philippe.duvauchelle@insa-lyon.fr; Freud, Nicolas; Kaftandjian, Valerie; Babot, Daniel
2000-09-01
A computer code was developed to simulate the operation of radiographic, radioscopic or tomographic devices. The simulation is based on ray-tracing techniques and on the X-ray attenuation law. The use of computer-aided drawing (CAD) models enables simulations to be carried out with complex three-dimensional (3D) objects and the geometry of every component of the imaging chain, from the source to the detector, can be defined. Geometric unsharpness, for example, can be easily taken into account, even in complex configurations. Automatic translations or rotations of the object can be performed to simulate radioscopic or tomographic image acquisition. Simulations can be carried out with monochromatic or polychromatic beam spectra. This feature enables, for example, the beam hardening phenomenon to be dealt with or dual energy imaging techniques to be studied. The simulation principle is completely deterministic and consequently the computed images present no photon noise. Nevertheless, the variance of the signal associated with each pixel of the detector can be determined, which enables contrast-to-noise ratio (CNR) maps to be computed, in order to predict quantitatively the detectability of defects in the inspected object. The CNR is a relevant indicator for optimizing the experimental parameters. This paper provides several examples of simulated images that illustrate some of the rich possibilities offered by our software. Depending on the simulation type, the computation time order of magnitude can vary from 0.1 s (simple radiographic projection) up to several hours (3D tomography) on a PC, with a 400 MHz microprocessor. Our simulation tool proves to be useful in developing new specific applications, in choosing the most suitable components when designing a new testing chain, and in saving time by reducing the number of experimental tests.
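The polychromatic attenuation at the heart of the simulation can be sketched by applying the X-ray attenuation (Beer-Lambert) law per energy bin and summing the transmitted photons. The 3-bin spectrum and attenuation coefficients below are invented illustrative values, not real material data.

```python
import numpy as np

def transmitted_intensity(spectrum, mu, thickness):
    """Transmitted intensity of a polychromatic beam through a slab of
    given thickness: I = sum over energy bins of N(E) * exp(-mu(E) * t)."""
    return np.sum(spectrum * np.exp(-mu * thickness))

# Illustrative 3-bin spectrum (photon counts) and attenuation
# coefficients (1/cm, decreasing with energy) -- invented values
spectrum = np.array([1000.0, 2000.0, 1500.0])
mu = np.array([0.6, 0.3, 0.2])

i0 = transmitted_intensity(spectrum, mu, 0.0)   # no object in the beam
i2 = transmitted_intensity(spectrum, mu, 2.0)   # 2 cm of material

# Beam hardening: after 2 cm the highest-energy bin carries a larger
# share of the transmitted photons than it did of the incident beam
share_before = spectrum[2] / i0
share_after = spectrum[2] * np.exp(-mu[2] * 2.0) / i2
```

Because mu falls with energy, attenuation preferentially removes low-energy photons, which is exactly the beam-hardening phenomenon the code is designed to handle.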
Computational simulation of the nonlinear response of suspension bridges
Energy Technology Data Exchange (ETDEWEB)
McCallen, D.B.; Astaneh-Asl, A.
1997-10-01
Accurate computational simulation of the dynamic response of long-span bridges presents one of the greatest challenges facing the earthquake engineering community. The size of these structures, in terms of physical dimensions and number of main load-bearing members, makes computational simulation of transient response an arduous task. Discretization of a large bridge with general-purpose finite element software often results in a computational model of such size that excessive computational effort is required for three-dimensional nonlinear analyses. The aim of the current study was the development of efficient, computationally based methodologies for the nonlinear analysis of cable-supported bridge systems which would allow accurate characterization of a bridge with a relatively small number of degrees of freedom. This work has led to the development of a special-purpose software program for the nonlinear analysis of cable-supported bridges; the methodologies and software are described and illustrated in this paper.
DEFF Research Database (Denmark)
Wang, Weizhi; Wu, Minghao; Palm, Johannes
2018-01-01
The wave loads and the resulting motions of floating wave energy converters are traditionally computed using linear radiation–diffraction methods. Yet for certain cases, such as survival conditions, phase control and wave energy converters operating in the resonance region, more complete mathematical models such as computational fluid dynamics are preferred, and over the last 5 years computational fluid dynamics has become more frequently used in the wave energy field. However, rigorous estimation of numerical errors, convergence rates and uncertainties associated with computational fluid dynamics simulations has largely been overlooked in the wave energy sector. In this article, we apply formal verification and validation techniques to computational fluid dynamics simulations of a passively controlled point absorber. The phase control causes the motion response to be highly nonlinear even...
An introduction to computer simulation methods applications to physical systems
Gould, Harvey; Christian, Wolfgang
2007-01-01
Now in its third edition, this book teaches physical concepts using computer simulations. The text incorporates object-oriented programming techniques and encourages readers to develop good programming habits in the context of doing physics. Designed for readers at all levels, An Introduction to Computer Simulation Methods uses Java, currently the most popular programming language. Introduction, Tools for Doing Simulations, Simulating Particle Motion, Oscillatory Systems, Few-Body Problems: The Motion of the Planets, The Chaotic Motion of Dynamical Systems, Random Processes, The Dynamics of Many Particle Systems, Normal Modes and Waves, Electrodynamics, Numerical and Monte Carlo Methods, Percolation, Fractals and Kinetic Growth Models, Complex Systems, Monte Carlo Simulations of Thermal Systems, Quantum Systems, Visualization and Rigid Body Dynamics, Seeing in Special and General Relativity, Epilogue: The Unity of Physics For all readers interested in developing programming habits in the context of doing phy...
Traffic simulations on parallel computers using domain decomposition techniques
Energy Technology Data Exchange (ETDEWEB)
Hanebutte, U.R.; Tentner, A.M.
1995-12-31
Large-scale simulations of Intelligent Transportation Systems (ITS) can only be achieved by using the computing resources offered by parallel computing architectures. Domain decomposition techniques are proposed which allow the performance of traffic simulations with the standard simulation package TRAF-NETSIM on a 128-node IBM SPx parallel supercomputer as well as on a cluster of SUN workstations. Whilst this particular parallel implementation is based on NETSIM, a microscopic traffic simulation model, the presented strategy is applicable to a broad class of traffic simulations. An outer iteration loop must be introduced in order to converge to a global solution. A performance study that utilizes a scalable test network consisting of square grids is presented, which addresses the performance penalty introduced by the additional iteration loop.
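The outer iteration loop can be illustrated with a toy one-dimensional alternating-Schwarz sketch: two overlapping subdomains are solved independently with frozen interface values, and the interface values are exchanged until the global state converges. The simple relaxation here is an invented stand-in for running the traffic simulator on one subnetwork; all names and values are illustrative assumptions.

```python
import numpy as np

def solve_subdomain(left, right, n, iters=200):
    """Relax the interior cells of a 1-D 'road' of n cells toward the
    average of their neighbours, with fixed boundary values left/right
    (a toy stand-in for simulating one subnetwork to convergence)."""
    u = np.zeros(n)
    u[0], u[-1] = left, right
    for _ in range(iters):
        u[1:-1] = 0.5 * (u[:-2] + u[2:])
    return u

# Global domain: cells 0..9 with fixed end densities 1.0 and 0.0.
# Subdomain a covers cells 0..5, subdomain b covers cells 4..9 (overlap 2).
left_bc, right_bc = 1.0, 0.0
iface = 0.5                                    # initial guess at cell 5
for _ in range(50):                            # outer iteration loop
    a = solve_subdomain(left_bc, iface, 6)     # solve cells 0..5
    b = solve_subdomain(a[-2], right_bc, 6)    # solve cells 4..9
    new_iface = b[1]                           # b's view of global cell 5
    if abs(new_iface - iface) < 1e-10:         # global convergence test
        break
    iface = new_iface
# Converged profile is linear, so cell 5 should hold 1 - 5/9 = 4/9
```

Without the outer loop, each subdomain would converge only to a solution consistent with a stale interface value, which is precisely the performance penalty versus accuracy trade-off the abstract discusses.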
Supercoiled DNA energetics and dynamics by computer simulation.
Schlick, T; Olson, W K
1992-02-20
A new formulation is presented for investigating supercoiled DNA configurations by deterministic techniques. Thus far, the computational difficulties involved in applying deterministic methods to supercoiled DNA studies have generally limited computer simulations to stochastic approaches. While stochastic methods, such as simulated annealing and Metropolis Monte Carlo sampling, are successful at generating a large number of configurations and estimating thermodynamic properties of topoisomer ensembles, deterministic methods offer an accurate characterization of the minima and a systematic following of their dynamics. To make this feasible, we model circular duplex DNA compactly by a B-spline ribbon-like model in terms of a small number of control vertices. We associate an elastic deformation energy composed of bending and twisting integrals and represent intrachain contact by a 6-12 Lennard-Jones potential. The latter is parameterized to yield an energy minimum at the observed DNA-helix diameter inclusive of a hydration shell. A penalty term to ensure fixed contour length is also included. First and second partial derivatives of the energy function have been derived by using various mathematical simplifications. First derivatives are essential for Newton-type minimization as well as molecular dynamics, and partial second-derivative information can significantly accelerate minimization convergence through preconditioning. Here we apply a new large-scale truncated-Newton algorithm for minimization and a Langevin/implicit-Euler scheme for molecular dynamics. Our truncated-Newton method exploits the separability of potential energy functions into terms of differing complexity. It relies on a preconditioned conjugate gradient method that is efficient for large-scale problems to solve approximately for the search direction at every step. Our dynamics algorithm is numerically stable over large time steps. It also introduces a frequency-discriminating mechanism so that...
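The parameterization of the 6-12 Lennard-Jones contact term can be made concrete: choosing sigma = r_min / 2^(1/6) places the energy minimum exactly at the target separation r_min. The numbers below (well depth 0.5, diameter 25) are illustrative assumptions, not the paper's fitted values.

```python
def lj_6_12(r, epsilon, r_min):
    """6-12 Lennard-Jones energy parameterized so the minimum sits at
    r = r_min (e.g. the hydrated DNA-helix diameter), with depth -epsilon.
    sigma = r_min / 2**(1/6) places the well bottom at r_min."""
    sigma = r_min / 2.0 ** (1.0 / 6.0)
    x = (sigma / r) ** 6
    return 4.0 * epsilon * (x * x - x)

# At r = r_min the energy equals -epsilon exactly:
# x = (sigma/r_min)**6 = 1/2, so 4*eps*(1/4 - 1/2) = -eps
e_min = lj_6_12(25.0, epsilon=0.5, r_min=25.0)
# e_min == -0.5 (up to floating-point rounding)
```

Separations slightly inside or outside r_min give higher energy, confirming that the well bottom sits at the chosen contact diameter.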
The Simulation and Analysis of the Closed Die Hot Forging Process by A Computer Simulation Method
Directory of Open Access Journals (Sweden)
Dipakkumar Gohil
2012-06-01
The objective of this research work is to study the variation of various parameters such as stress, strain, temperature, force, etc. during the closed die hot forging process. A computer simulation modeling approach has been adopted to transform the theoretical aspects into a computer algorithm which would be used to simulate and analyze the closed die hot forging process. For the purpose of process study, the entire deformation process has been divided into a finite number of steps appropriately and then the output values have been computed at each deformation step. The results of simulation have been graphically represented and suitable corrective measures are also recommended, if the simulation results do not agree with the theoretical values. This computer simulation approach would significantly improve the productivity and reduce the energy consumption of the overall process for the components which are manufactured by the closed die forging process and contribute towards the efforts in reducing the global warming.
Computer simulations of adsorbed liquid crystal films
Wall, Greg D.; Cleaver, Douglas J.
2003-01-01
The structures adopted by adsorbed thin films of Gay-Berne particles in the presence of a coexisting vapour phase are investigated by molecular dynamics simulation. The films are adsorbed at a flat substrate which favours planar anchoring, whereas the nematic-vapour interface favours normal alignment. On cooling, a system with a high molecule-substrate interaction strength exhibits substrate-induced planar orientational ordering and considerable stratification is observed in the density profiles. In contrast, a system with weak molecule-substrate coupling adopts a director orientation orthogonal to the substrate plane, owing to the increased influence of the nematic-vapour interface. There are significant differences between the structures adopted at the two interfaces, in contrast with the predictions of density functional treatments of such systems.
Osmosis : a molecular dynamics computer simulation study
Lion, Thomas
Osmosis is a phenomenon of critical importance in a variety of processes, ranging from the transport of ions across cell membranes and the regulation of blood salt levels by the kidneys to the desalination of water and the production of clean energy using potential osmotic power plants. However, despite its importance and over one hundred years of study, there is ongoing confusion concerning the nature of the microscopic dynamics of the solvent particles in their transfer across the membrane. In this thesis the microscopic dynamical processes underlying osmotic pressure and concentration gradients are investigated using molecular dynamics (MD) simulations. I first present a new derivation for the local pressure that can be used for determining osmotic pressure gradients. Using this result, the steady-state osmotic pressure is studied in a minimal model for an osmotic system, and the steady-state density gradients are explained using a simple mechanistic hopping model for the solvent particles. The simulation setup is then modified, allowing us to explore the timescales involved in the relaxation dynamics of the system in the period preceding the steady state. Further consideration is also given to the relative roles of diffusive and non-diffusive solvent transport in this period. Finally, in a novel modification to the classic osmosis experiment, the solute particles are driven out of equilibrium by the input of energy. The effect of this modification on the osmotic pressure and the osmotic flow is studied, and we find that active solute particles can cause reverse osmosis to occur. The possibility of defining a new "osmotic effective temperature" is also considered and compared to the results of diffusive and kinetic temperatures.
Computational Particle Dynamic Simulations on Multicore Processors (CPDMu) Final Report Phase I
Energy Technology Data Exchange (ETDEWEB)
Schmalz, Mark S
2011-07-24
Statement of Problem - The Department of Energy has many legacy codes for simulation of computational particle dynamics and computational fluid dynamics applications that are designed to run on sequential processors and are not easily parallelized. Emerging high-performance computing architectures employ massively parallel multicore architectures (e.g., graphics processing units) to increase throughput. Parallelization of legacy simulation codes is a high priority, to achieve compatibility, efficiency, accuracy, and extensibility. General Statement of Solution - A legacy simulation application designed for implementation on mainly-sequential processors has been represented as a graph G. Mathematical transformations, applied to G, produce a graph representation G′ for a high-performance architecture. Key computational and data movement kernels of the application were analyzed/optimized for parallel execution using the mapping G → G′, which can be performed semi-automatically. This approach is widely applicable to many types of high-performance computing systems, such as graphics processing units or clusters comprised of nodes that contain one or more such units. Phase I Accomplishments - Phase I research decomposed/profiled computational particle dynamics simulation code for rocket fuel combustion into low and high computational cost regions (respectively, mainly sequential and mainly parallel kernels), with analysis of space and time complexity. Using the research team's expertise in algorithm-to-architecture mappings, the high-cost kernels were transformed, parallelized, and implemented on Nvidia Fermi GPUs. Measured speedups (GPU with respect to single-core CPU) were approximately 20-32X for realistic model parameters, without final optimization. Error analysis showed no loss of computational accuracy. Commercial Applications and Other Benefits - The proposed research will constitute a breakthrough in the solution of problems related to efficient...
Simulation program of nonlinearities applied to telecommunication systems
Thomas, C.
1979-01-01
In any satellite communication system, the problems of distortion created by nonlinear devices or systems must be considered. The subject of this paper is the use of the Fast Fourier Transform (F.F.T.) in the prediction of the intermodulation performance of amplifiers, mixers, and filters. A nonlinear memoryless model is chosen to simulate amplitude and phase nonlinearities of the device in the simulation program, written in FORTRAN IV. The experimentally observed nonlinearity parameters of a low-noise 3.7-4.2 GHz amplifier are related to the gain and phase coefficients of a Fourier series. The measured results are compared with those calculated from the simulation in the cases where the input signal is composed of two or three carriers plus noise power density.
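The FFT-based intermodulation prediction can be sketched for a two-carrier input through a memoryless AM/AM nonlinearity: the cubic term creates third-order products at 2f1−f2 and 2f2−f1, visible directly in the spectrum. The compression coefficient below is an invented illustration, not the measured amplifier data.

```python
import numpy as np

N = 1024
t = np.arange(N)
f1, f2 = 100, 110                     # carrier frequencies, in FFT bins
x = np.cos(2 * np.pi * f1 * t / N) + np.cos(2 * np.pi * f2 * t / N)

# Memoryless AM/AM model: a mild cubic compression
# (illustrative coefficient, not a fitted amplifier characteristic)
y = x - 0.1 * x ** 3

spec = np.abs(np.fft.rfft(y)) / N     # one-sided amplitude spectrum
# Third-order intermodulation products land at 2*f1 - f2 and 2*f2 - f1
imd_low, imd_high = spec[2 * f1 - f2], spec[2 * f2 - f1]
```

Expanding the cubic of two unit cosines predicts an intermodulation amplitude of 0.1 × 3/4 per product, well below the fundamentals but nonzero, which is the distortion mechanism the simulation program quantifies.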
Teaching Physics (and Some Computation) Using Intentionally Incorrect Simulations
Cox, Anne J.; Junkin, William F.; Christian, Wolfgang; Belloni, Mario; Esquembre, Francisco
2011-05-01
Computer simulations are widely used in physics instruction because they can aid student visualization of abstract concepts, they can provide multiple representations of concepts (graphical, trajectories, charts), they can approximate real-world examples, and they can engage students interactively, all of which can enhance student understanding of physics concepts. For these reasons, we create and use simulations to teach physics,1,2 but we also want students to recognize that the simulations are only as good as the physics behind them, so we have developed a series of simulations that are intentionally incorrect, where the task is for students to find and correct the errors.3
Computer simulation tests of optimized neutron powder diffractometer configurations
Energy Technology Data Exchange (ETDEWEB)
Cussen, L.D., E-mail: Leo@CussenConsulting.com [Cussen Consulting, 23 Burgundy Drive, Doncaster 3108 (Australia); Lieutenant, K., E-mail: Klaus.Lieutenant@helmholtz-berlin.de [Helmholtz Zentrum Berlin, Hahn-Meitner Platz 1, 14109 Berlin (Germany)
2016-06-21
Recent work has developed a new mathematical approach to optimally choose beam elements for constant wavelength neutron powder diffractometers. This article compares Monte Carlo computer simulations of existing instruments with simulations of instruments using configurations chosen using the new approach. The simulations show that large performance improvements over current best practice are possible. The tests here are limited to instruments optimized for samples with a cubic structure which differs from the optimization for triclinic structure samples. A novel primary spectrometer design is discussed and simulation tests show that it performs as expected and allows a single instrument to operate flexibly over a wide range of measurement resolution.
Computational algorithms to simulate the steel continuous casting
Ramírez-López, A.; Soto-Cortés, G.; Palomar-Pardavé, M.; Romero-Romo, M. A.; Aguilar-López, R.
2010-10-01
Computational simulation is a very powerful tool to analyze industrial processes, to reduce operating risks and improve profits from equipment. The present work describes the development of some computational algorithms, based on numerical methods, to create a simulator for the continuous casting process, which is the most popular method to produce steel products for metallurgical industries. The kinematics of industrial processing was computationally reproduced using logically programmed subroutines. The steel cast by each strand was calculated using an iterative method nested in the main loop. The process was repeated at each time step (Δt) to calculate the casting time; simultaneously, the steel billets produced were counted and stored. The subroutines were used for creating a computational representation of a continuous casting plant (CCP) and displaying the simulation of the steel displacement through the CCP. These algorithms have been developed to create a simulator using the programming language C++. Algorithms for computer animation of the continuous casting process were created using a graphical user interface (GUI). Finally, the simulator functionality was shown and validated by comparison with industrial information on the steel production of three casters.
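The time-stepped billet counting nested in the main loop can be sketched in a few lines (in Python rather than the simulator's C++). The kinematics below are a toy: each step advances the cast length by speed × Δt, and a billet is counted whenever the accumulated length reaches one billet length. All parameter values are illustrative assumptions.

```python
def simulate_casting(casting_speed, billet_length, total_time, dt):
    """Count billets produced by one strand over total_time: each time
    step the cast length grows by casting_speed*dt; a billet is cut and
    counted whenever the accumulated length reaches billet_length.
    (Toy kinematics; units are arbitrary but consistent.)"""
    length, billets, t = 0.0, 0, 0.0
    while t < total_time:
        length += casting_speed * dt
        while length >= billet_length:   # cut as many billets as fit
            length -= billet_length
            billets += 1
        t += dt
    return billets

# 2 length-units/min for 60 min with 6-unit billets -> 120/6 = 20 billets
n = simulate_casting(casting_speed=2.0, billet_length=6.0,
                     total_time=60.0, dt=0.25)
```

Running one such loop per strand, with a shared clock, gives the plant-level production count the abstract describes.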
Simulation Applied to the Storage Capacity and Stockpiles
Directory of Open Access Journals (Sweden)
Andrea Alejandra Giubergia
2016-05-01
This investigation is focused on process-based simulations. The simulation is carried out (using the FlexSim 7.3.0 software) on a mining process including storage hoppers and haulage equipment in order to estimate the desirable truck fleet size and the capacity of the trucks and the hoppers, as well as assessing whether the design of the access roads is acceptable for the success of the operations. It is concluded that the dimensions of the loading system have been overestimated compared to the existing equipment fleet size. Therefore, it is required to increase the number of trucks or the truck haulage capacity to improve the mine productivity.
34 CFR 464.42 - What limit applies to purchasing computer hardware and software?
2010-07-01
Not more than ten percent of funds received under any grant under this part may be used to purchase computer hardware or software. (Authority: 20 U.S.C. 1208aa(f))
Applied and computational harmonic analysis on graphs and networks
Irion, Jeff; Saito, Naoki
2015-09-01
In recent years, the advent of new sensor technologies and social network infrastructure has provided huge opportunities and challenges for analyzing data recorded on such networks. In the case of data on regular lattices, computational harmonic analysis tools such as the Fourier and wavelet transforms have well-developed theories and proven track records of success. It is therefore quite important to extend such tools from the classical setting of regular lattices to the more general setting of graphs and networks. In this article, we first review basics of graph Laplacian matrices, whose eigenpairs are often interpreted as the frequencies and the Fourier basis vectors on a given graph. We point out, however, that such an interpretation is misleading unless the underlying graph is either an unweighted path or cycle. We then discuss our recent effort of constructing multiscale basis dictionaries on a graph, including the Hierarchical Graph Laplacian Eigenbasis Dictionary and the Generalized Haar-Walsh Wavelet Packet Dictionary, which are viewed as generalizations of the classical hierarchical block DCTs and the Haar-Walsh wavelet packets, respectively, to the graph setting. Finally, we demonstrate the usefulness of our dictionaries by using them to simultaneously segment and denoise 1-D noisy signals sampled on regular lattices, a problem where classical tools have difficulty.
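The graph Laplacian and its "graph Fourier" eigenpairs reviewed above can be computed directly. The sketch below uses the unweighted path graph, the one case (along with the cycle) where, as the article notes, the frequency interpretation is exact: the path-graph Laplacian eigenvalues are 2 − 2cos(πk/n) and the eigenvectors are DCT-like cosines.

```python
import numpy as np

def graph_laplacian(adj):
    """Unnormalized graph Laplacian L = D - W for a (weighted) adjacency
    matrix W; its eigenvectors generalize the Fourier basis to graphs."""
    return np.diag(adj.sum(axis=1)) - adj

# Unweighted path graph on 5 vertices: 0 - 1 - 2 - 3 - 4
n = 5
adj = np.zeros((n, n))
for i in range(n - 1):
    adj[i, i + 1] = adj[i + 1, i] = 1.0

L = graph_laplacian(adj)
evals, evecs = np.linalg.eigh(L)   # ascending eigenvalues
# Smallest eigenvalue is 0 with the constant eigenvector; the full
# spectrum for the path is 2 - 2*cos(pi*k/n), k = 0..n-1
```

On a general weighted graph the same two lines produce the eigenpairs, but, as the article stresses, reading them as frequencies is then only heuristic.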
X-ray computed tomography applied to investigate ancient manuscripts
Bettuzzi, Matteo; Albertin, Fauzia; Brancaccio, Rosa; Casali, Franco; Pia Morigi, Maria; Peccenini, Eva
2017-03-01
I will describe in this paper the first results of a series of X-ray tomography applications, with different system setups, running on some ancient manuscripts containing iron-gall ink. The purpose is to verify the optimum measurement conditions with laboratory instrumentation (that is also in fact portable) in order to recognize the text from the inside of the documents, without opening them. This becomes possible by exploiting the X-ray absorption contrast of iron-based ink and the three-dimensional reconstruction potential provided by computed tomography, which overcomes problems that appear in simple radiographic practice. This work is part of a larger project of EPFL (Ecole Polytechnique Fédérale de Lausanne, Switzerland), the "Venice Time Machine" project (EPFL, Digital Heritage Venice, http://dhvenice.eu/, 2015), aimed at digitizing, transcribing and sharing in an open database all the information of the State Archives of Venice, exploiting traditional digitization technologies and innovative methods of acquisition. In this first measurement campaign I investigated a manuscript of the seventeenth century made of a folded sheet; a couple of unopened ancient wills kept in the State Archives in Venice; and a handwritten book of several hundred pages of notes on physics from the nineteenth century.
Computer Simulation for Pain Management Education: A Pilot Study.
Allred, Kelly; Gerardi, Nicole
2017-10-01
Effective pain management is an elusive concept in acute care. Inadequate knowledge has been identified as a barrier to providing optimal pain management. This study aimed to determine student perceptions of an interactive computer simulation as a potential method for learning pain management, as a motivator to read and learn more about pain management, as a preference over traditional lecture, and in its potential to change nursing practice. A post-computer-simulation survey with a mixed-methods descriptive design was used. The study took place in a college of nursing at a large metropolitan university in the Southeast United States, with a convenience sample of 30 nursing students in a Bachelor of Science nursing program. An interactive computer simulation was developed as a potential alternative method of teaching pain management to nursing students; increases in educational gain as well as its potential to change practice were explored. Each participant was asked to complete a survey consisting of 10 standard 5-point Likert-scale items and 5 open-ended questions. The survey was used to evaluate the students' perception of the simulation, specifically related to educational benefit, preference compared with traditional teaching methods, and perceived potential to change nursing practice. Data provided descriptive statistics for initial evaluation of the computer simulation. The responses on the survey suggest nursing students perceive the computer simulation to be entertaining, fun, educational, occasionally preferred over regular lecture, and to have potential to change practice. Preliminary data support the use of computer simulation in educating nursing students about pain management. Copyright © 2017 American Society for Pain Management Nursing. Published by Elsevier Inc. All rights reserved.
Bélair, Jacques; Kunze, Herb; Makarov, Roman; Melnik, Roderick; Spiteri, Raymond J
2016-01-01
Focusing on five main groups of interdisciplinary problems, this book covers a wide range of topics in mathematical modeling, computational science and applied mathematics. It presents a wealth of new results in the development of modeling theories and methods, advancing diverse areas of applications and promoting interdisciplinary interactions between mathematicians, scientists, engineers and representatives from other disciplines. The book offers a valuable source of methods, ideas, and tools developed for a variety of disciplines, including the natural and social sciences, medicine, engineering, and technology. Original results are presented on both the fundamental and applied level, accompanied by an ample number of real-world problems and examples emphasizing the interdisciplinary nature and universality of mathematical modeling, and providing an excellent outline of today’s challenges. Mathematical modeling, with applied and computational methods and tools, plays a fundamental role in modern science a...
Using EDUCache Simulator for the Computer Architecture and Organization Course
Directory of Open Access Journals (Sweden)
Sasko Ristov
2013-07-01
The computer architecture and organization course is essential in all computer science and engineering programs, and is among the most selected and liked elective courses for related engineering disciplines. However, this attractiveness brings a new challenge: it requires a lot of effort by the instructor to explain rather complicated concepts to beginners or to those who study related disciplines. The use of visual simulators can improve both the teaching and learning processes. The overall goal is twofold: (1) to enable a visual environment to explain the basic concepts and (2) to increase the students' willingness and ability to learn the material. Many visual simulators have been used for the computer architecture and organization course. However, due to the lack of visual simulators for simulating cache memory concepts, we have developed a new visual simulator, EDUCache. In this paper we show that it can be used effectively and efficiently as a supporting tool in the learning process of modern multi-layer, multi-cache and multi-core multi-processors. EDUCache's features enable an environment for performance evaluation and engineering of software systems; the students will also come to understand the importance of computer architecture building parts and, hopefully, will increase their curiosity for hardware courses in general.
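The cache concepts such a simulator visualizes can be sketched with a minimal direct-mapped cache model: split each address into block, index, and tag, and record hits and misses. This is an invented teaching-style sketch, not EDUCache's actual implementation.

```python
def simulate_cache(addresses, num_lines, block_size):
    """Direct-mapped cache hit/miss simulation:
    index = (addr // block_size) % num_lines, tag = remaining high bits."""
    lines = [None] * num_lines          # tag currently stored in each line
    hits = misses = 0
    for addr in addresses:
        block = addr // block_size
        index = block % num_lines
        tag = block // num_lines
        if lines[index] == tag:
            hits += 1
        else:
            misses += 1
            lines[index] = tag          # fill the line on a miss
    return hits, misses

# Sequential sweep over 64 bytes, repeated twice: the first pass misses
# once per block (compulsory misses), the second pass hits everywhere
# because the 8-block working set fits in the 8-line cache
trace = list(range(64)) * 2
h, m = simulate_cache(trace, num_lines=8, block_size=8)
# 8 misses, 120 hits
```

Changing `num_lines` to 4 makes half the blocks evict each other, which is exactly the conflict-miss behaviour a visual simulator is built to demonstrate.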
Macroevolution simulated with autonomously replicating computer programs.
Yedid, Gabriel; Bell, Graham
The process of adaptation occurs on two timescales. In the short term, natural selection merely sorts the variation already present in a population, whereas in the longer term genotypes quite different from any that were initially present evolve through the cumulation of new mutations. The first process is described by the mathematical theory of population genetics. However, this theory begins by defining a fixed set of genotypes and cannot provide a satisfactory analysis of the second process because it does not permit any genuinely new type to arise. The evolutionary outcome of selection acting on novel variation arising over long periods is therefore difficult to predict. The classical problem of this kind is whether 'replaying the tape of life' would invariably lead to the familiar organisms of the modern biota. Here we study the long-term behaviour of populations of autonomously replicating computer programs and find that the same type, introduced into the same simple environment, evolves on any given occasion along a unique trajectory towards one of many well-adapted end points.
Business Simulations Applied in Support of ERP Training
Conroy, George
2012-01-01
This quantitative, quasi-experimental study examined the application of a business simulation against training in support of an Enterprise Resource Planning (ERP) system. Defining more effective training strategies is a critical concern for organizational leaders and stakeholders concerned by today's economic challenges. The scope of this…
Applying a behavioural simulation for the collection of data
DEFF Research Database (Denmark)
Jespersen, Kristina Risom
2005-01-01
To collect real-time data as opposed to retrospective data requires new methodological traits. One possibility is the use of behavioral simulations that synthesize the self-administered questionnaire, experimental designs, role-playing and scenarios. Supported by Web technology this new data...
Associative Memory computing power and its simulation.
Volpi, G; The ATLAS collaboration
2014-01-01
The associative memory (AM) chip is an ASIC device specifically designed to perform "pattern matching" at very high speed and with parallel access to memory locations. The most extensive use for such a device will be the ATLAS Fast Tracker (FTK) processor, where more than 8000 chips will be installed in 128 VME boards specifically designed for high throughput in order to exploit the chip's features. Each AM chip will store a database of about 130000 pre-calculated patterns, allowing FTK to use about 1 billion patterns for the whole system, with any data inquiry broadcast to all memory elements simultaneously within the same clock cycle (10 ns), so that data retrieval time is independent of the database size. Speed and size of the system are crucial for real-time High Energy Physics applications, such as the ATLAS FTK processor. Using 80 million channels of the ATLAS tracker, FTK finds tracks within 100 $\mathrm{\mu s}$. The simulation of such a parallelized system is an extremely complex task when executed in comm...
SiMon: Simulation Monitor for Computational Astrophysics
Qian, Penny Xuran; Cai, Maxwell Xu; Portegies Zwart, Simon; Zhu, Ming
2017-09-01
Scientific discovery via numerical simulations is important in modern astrophysics. This relatively new branch of astrophysics has become possible due to the development of reliable numerical algorithms and the high performance of modern computing technologies. These enable the analysis of large collections of observational data and the acquisition of new data via simulations at unprecedented accuracy and resolution. Ideally, simulations run until they reach some pre-determined termination condition, but often other factors cause extensive numerical approaches to break down at an earlier stage, when processes tend to be interrupted due to unexpected events in the software or the hardware. In those cases, the scientist handles the interrupt manually, which is time-consuming and prone to errors. We present the Simulation Monitor (SiMon) to automate the farming of large and extensive simulation processes. Our method is light-weight, fully automates the entire workflow management, operates concurrently across multiple platforms, and can be installed in user space. Inspired by the process of crop farming, we perceive each simulation as a crop in the field, and running a simulation becomes analogous to growing a crop. With the development of SiMon we relax the technical aspects of simulation management. The initial package was developed for extensive parameter searches in numerical simulations, but it turns out to work equally well for automating the computational processing and reduction of observational data.
DEFF Research Database (Denmark)
Skjøth-Rasmussen, Martin Skov; Glarborg, Peter; Jensen, Anker
2003-01-01
It is desired to make detailed chemical kinetic mechanisms applicable to the complex geometries of practical combustion devices simulated with computational fluid dynamics tools. This work presents a novel general approach to combining computational fluid dynamics and a detailed chemical kinetic mechanism. It involves post-processing of data extracted from computational fluid dynamics simulations. Application of this approach successfully describes combustion chemistry in a standard swirl burner, the so-called Harwell furnace. Nevertheless, it needs validation against more complex combustion models...
Subglacial sediment mechanics investigated by computer simulation of granular material
Damsgaard, A.; Egholm, D. L.; Tulaczyk, S. M.; Piotrowski, J. A.; Larsen, N. K.; Siegfried, M. R.; Beem, L.; Suckale, J.
2016-12-01
The mechanical properties of subglacial sediments are known to directly influence the stability of ice streams and fast-moving glaciers, but existing models of granular sediment deformation are poorly constrained. In addition, upscaling to generalized mathematical models is difficult due to the mechanical nonlinearity of the sediment, internal porosity changes during deformation, and associated structural and kinematic phase transitions. In this presentation, we introduce the Discrete Element Method (DEM) for particle-scale granular simulation. The DEM is fully coupled with fluid dynamics. The numerical method is applied to better understand the mechanical properties of the subglacial sediment and its interaction with meltwater. The computational approach allows full experimental control and offers insights into the internal kinematics, stress distribution, and mechanical stability. During confined shear with variable pore-water pressure, the sediment changes mechanical behavior from stick to non-linear creep and to unconstrained failure during slip. These results are contrary to more conventional models of plastic or (non-)linear viscous subglacial soft-bed sliding. Advection of sediment downstream is pressure dependent, which is consistent with theories of unstable bed bump growth. Granular mechanics proves to significantly influence the geometry and hydraulic properties of meltwater channels incised into the subglacial bed. Current models assume that channel bed erosion is balanced by linear-viscous sediment movement. We demonstrate how channel flanks are stabilized by the sediment frictional strength. Additionally, sediment liquefaction proves to be a possible mechanism for causing large and episodic sediment transport by water flow. Though computationally intense, our coupled numerical method provides a framework for quantifying a wide range of subglacial sediment-water processes, which are a key unknown in our ability to model the future evolution of ice
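The particle-scale contact mechanics at the heart of the DEM can be illustrated with a minimal one-dimensional sketch: two grains interacting through a linear elastic contact force proportional to their overlap. This is only a generic illustration of the method's force law and time stepping; the study's actual model, with pore-water coupling and many particles, is far richer, and all names below are illustrative.

```python
def dem_step(x, v, r, k, dt, m=1.0):
    """One semi-implicit Euler step for two particles on a line with a
    linear spring contact (force = stiffness * overlap), the simplest
    DEM normal-contact law. Repulsion acts only while grains touch."""
    overlap = (r[0] + r[1]) - abs(x[1] - x[0])
    f = k * max(overlap, 0.0)
    direction = 1.0 if x[1] >= x[0] else -1.0
    a = [-f * direction / m, f * direction / m]   # push grains apart
    v = [v[i] + a[i] * dt for i in range(2)]
    x = [x[i] + v[i] * dt for i in range(2)]
    return x, v

# Two equal grains (radius 1) approach head-on and collide elastically.
x, v = [0.0, 3.0], [1.0, -1.0]
for _ in range(2000):                              # integrate to t = 2
    x, v = dem_step(x, v, r=[1.0, 1.0], k=100.0, dt=0.001)
print(v)  # equal masses, elastic contact: velocities are exchanged
```

With no damping term the collision is elastic, so the equal-mass grains exchange velocities; real DEM contact laws add normal damping and tangential friction on top of this skeleton.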
Applied simulation and optimization in logistics, industrial and aeronautical practice
Mota, Idalia; Serrano, Daniel
2015-01-01
Presenting techniques, case studies, and methodologies that combine simulation approaches with optimization techniques for problems in manufacturing, logistics, and aeronautics, this book provides solutions to common industrial problems in several fields, where the common denominator is the combination of simulation's flexibility with optimization's robustness. Providing readers with a comprehensive guide to tackling similar issues in industrial environments, this text explores novel ways to address industrial problems through hybrid simulation-optimization approaches that benefit from the advantages of both paradigms, in order to give solutions to important problems in the service industry, production processes, and supply chains, such as scheduling, routing, and resource allocation, among others.
Advanced Simulation and Computing FY17 Implementation Plan, Version 0
Energy Technology Data Exchange (ETDEWEB)
McCoy, Michel [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Archer, Bill [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Hendrickson, Bruce [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wade, Doug [National Nuclear Security Administration (NNSA), Washington, DC (United States). Office of Advanced Simulation and Computing and Institutional Research and Development; Hoang, Thuc [National Nuclear Security Administration (NNSA), Washington, DC (United States). Computational Systems and Software Environment
2016-08-29
The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, the study of advanced nuclear weapon design and manufacturing processes, the analysis of accident scenarios and weapon aging, and the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. ASC is now focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), and for quantifying critical margins and uncertainties. Resolving each issue requires increasingly difficult analyses because the aging process has progressively moved the stockpile further away from the original test base. Where possible, the program also enables the use of high performance computing (HPC) and simulation tools to address broader national security needs, such as foreign nuclear weapon assessments and counter nuclear terrorism.
Neural network stochastic simulation applied for quantifying uncertainties
Directory of Open Access Journals (Sweden)
N Foudil-Bey
2016-09-01
Full Text Available. Generally, geostatistical simulation methods are used to generate several realizations of physical properties in the sub-surface; these methods are based on variogram analysis and limited to measuring correlation between variables at two locations only. In this paper, we propose a simulation of properties based on a supervised neural network trained on an existing drilling data set. The major advantage is that this method does not require a preliminary geostatistical study and takes several points into account, so geological information and diverse geophysical data can be combined easily. To do this, we used a feed-forward multi-layer perceptron, trained with the back-propagation algorithm using the conjugate-gradient technique to minimize the error of the network output. The learning process creates links between the different variables; this relationship can be used to interpolate the properties on the one hand, or to generate several possible distributions of the physical properties on the other, by changing each time a random value of the input neurons that is kept constant during learning. This method was tested on real data to simulate multiple realizations of the density and the magnetic susceptibility in three dimensions at the mining camp of Val d'Or, Québec (Canada).
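The interpolation mechanism behind such a network can be sketched with a tiny one-hidden-layer perceptron. This is a deliberately minimal illustration: the paper trains with conjugate-gradient backpropagation on drilling data, whereas the sketch below uses plain stochastic gradient descent on a toy target (y = x²), and every name and hyperparameter is an assumption made for the example.

```python
import random
from math import tanh

# Toy MLP: 1 input -> 4 tanh hidden units -> 1 linear output.
random.seed(0)
W1 = [random.uniform(-1, 1) for _ in range(4)]   # input -> hidden weights
b1 = [0.0] * 4
W2 = [random.uniform(-1, 1) for _ in range(4)]   # hidden -> output weights
b2 = 0.0

def forward(x):
    h = [tanh(W1[j] * x + b1[j]) for j in range(4)]
    return sum(W2[j] * h[j] for j in range(4)) + b2, h

data = [(i / 10, (i / 10) ** 2) for i in range(11)]   # samples of y = x^2

def loss():
    return sum((forward(x)[0] - y) ** 2 for x, y in data) / len(data)

lr = 0.1
initial = loss()
for _ in range(2000):                 # plain SGD backpropagation
    for x, y in data:
        out, h = forward(x)
        err = out - y
        for j in range(4):
            grad_h = err * W2[j] * (1 - h[j] ** 2)   # backprop through tanh
            W2[j] -= lr * err * h[j]
            b1[j] -= lr * grad_h
            W1[j] -= lr * grad_h * x
        b2 -= lr * err
print(loss() < initial)   # the fit error decreases with training
```

Once trained, `forward(x)` interpolates the property at unsampled locations; repeating the training with perturbed inputs, as the abstract describes, would yield multiple plausible realizations.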
Simulation of Road Traffic Applying Model-Driven Engineering
Directory of Open Access Journals (Sweden)
Alberto FERNÁNDEZ-ISABEL
2016-05-01
Road traffic is an important phenomenon in modern societies. The study of its different aspects in the multiple scenarios where it happens is relevant for a huge number of problems. At the same time, its scale and complexity make it hard to study. Traffic simulations can alleviate these difficulties, simplifying the scenarios to consider and controlling their variables. However, their development also presents difficulties. The main ones come from the need to integrate the ways of working of researchers and developers from multiple fields. Model-Driven Engineering (MDE) addresses these problems using Modelling Languages (MLs) and semi-automatic transformations to organise and describe the development, from requirements to code. This paper presents a domain-specific MDE framework for simulations of road traffic. It comprises an extensible ML, support tools, and development guidelines. The ML adopts an agent-based approach, which is focused on the roles of individuals in road traffic and their decision-making. A case study shows the process to model a traffic theory with the ML, and how to specialise that specification for an existing target platform and its simulations. The results are the basis for comparison with related work.
Noise Reduction in Abdominal Computed Tomography Applying Iterative Reconstruction (ADMIRE).
Schaller, Frank; Sedlmair, Martin; Raupach, Rainer; Uder, Michael; Lell, Michael
2016-10-01
The study aimed to compare image quality of filtered back projection (FBP) and iterative reconstruction (advanced modeled iterative reconstruction, ADMIRE) in contrast-enhanced computed tomography (CT) of the abdomen, and to assess the differences of reconstructions according to these methods. It also aimed to investigate the potential for noise reduction of ADMIRE for different reconstructed slice thicknesses. CT data of the abdomen and pelvis were acquired with a 128-slice single-source CT system using automated kV selection and tube current adaptation based on the patient's anatomy. Raw data sets from patients scanned at 100 kV were selected, and images were reconstructed with slice thicknesses of 1 mm, 3 mm, and 5 mm, both with FBP and ADMIRE. Filter strengths F1, F3, and F5 of the ADMIRE algorithm and the corresponding reconstruction kernels were used. In total, 58 raw data sets from 17 patients were used to reconstruct FBP and ADMIRE images from the same raw data, representing identical body regions. Identical regions of interest were placed at the same position of up to four images and image noise was measured. Differences of reconstructed images and detail preservation were tested using an image subtraction technique, and subjective image quality was assessed using a 5-point Likert scale. On average, for 1-mm slice thickness, noise reduction was 9.15% ± 2.4% with filter strength level F1, 30.2% ± 3.4% with F3, and 54.4% ± 7.0% with F5 as compared to FBP. For a slice thickness of 3 mm, noise reduction was 8.5% ± 3.7% with F1, 28.6% ± 3.9% with F3, and 52.2% ± 9.1% with F5. For 5 mm, the corresponding values were 8.9% ± 2.7%, 31.4% ± 2.8%, and 52.7% ± 7.7%. On subtraction images, edge information of tissue classes with a high attenuation gradient was found, but structures with small differences in attenuation were not detectable on subtraction images, confirming that no relevant details were lost in the
DEFF Research Database (Denmark)
Markert, Frank; Kozine, Igor
2012-01-01
Risk management of complex environments needs the supportive tools provided by computer models and simulation. Over time, various tools have been developed and applied with different degrees of success. The still-lasting increase in computer power and the associated development potentials stimulate and promote their application within risk management. Today, computer-supported models such as fault trees, event trees, and Bayesian networks are commonly regarded and applied as standard tools for reliability and risk practitioners. There are, though, some important features that can hardly be captured by the conventional reliability analysis models and systems analysis methods. An improvement on and alternative to the conventional approach is seen in using Discrete Event Simulation (DES) models that can better account for the dynamic dimensions of the systems. The paper will describe the authors' experience...
How Many Times Should One Run a Computational Simulation?
DEFF Research Database (Denmark)
Seri, Raffaello; Secchi, Davide
2017-01-01
This chapter is an attempt to answer the question “how many runs of a computational simulation should one do,” and it gives an answer by means of statistical analysis. After defining the nature of the problem and which types of simulation are mostly affected by it, the article introduces statistical power analysis as a way to determine the appropriate number of runs. Two examples are then produced using results from an agent-based model. The reader is then guided through the application of this statistical technique and exposed to its limits and potentials.
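The power-analysis idea can be sketched numerically: given a standardized effect size one wants to detect, a significance level, and a desired power, the required number of runs follows from the normal quantiles. This is a generic textbook formulation (two-sided z-test on the mean of the simulation output), not necessarily the chapter's exact procedure, and the helper names are illustrative.

```python
from math import ceil, erf, sqrt

def norm_ppf(p, lo=-10.0, hi=10.0):
    """Inverse standard-normal CDF via bisection on Phi(x) = (1 + erf(x/sqrt(2)))/2."""
    for _ in range(200):
        mid = (lo + hi) / 2
        if (1 + erf(mid / sqrt(2))) / 2 < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def runs_needed(effect_size, alpha=0.05, power=0.8):
    """Runs needed so a two-sided z-test on the mean simulation output
    detects a standardized effect of `effect_size` with the given power:
    n = ((z_{1-alpha/2} + z_{power}) / effect_size)^2, rounded up."""
    z_alpha = norm_ppf(1 - alpha / 2)
    z_beta = norm_ppf(power)
    return ceil(((z_alpha + z_beta) / effect_size) ** 2)

print(runs_needed(0.5))   # medium effect: 32 runs
print(runs_needed(0.2))   # small effect: 197 runs
```

The point the chapter makes falls out immediately: detecting small effects among simulation configurations requires far more runs than is commonly budgeted.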
Environments for online maritime simulators with cloud computing capabilities
Raicu, Gabriel; Raicu, Alexandra
2016-12-01
This paper presents cloud computing environments, network principles, and methods for graphical development in realistic naval simulation, naval robotics, and virtual interactions. The aim of this approach is to achieve good simulation quality in large networked environments using open-source solutions designed for educational purposes. Realistic rendering of maritime environments requires near-real-time frameworks with enhanced computing capabilities during distance interactions. E-Navigation concepts, coupled with the latest achievements in virtual and augmented reality, will enhance the overall experience, leading to new developments and innovations. We have to deal with a multiprocessing situation, using advanced technologies and distributed applications for remote ship scenarios and the automation of ship operations.
National Aeronautics and Space Administration — There are significant logistical barriers facing entry-level high performance computing (HPC) modeling and simulation (M&S) users. Performing large-scale, massively...
Sakamoto, Shinichi; Otsuru, Toru
2014-01-01
This book reviews a variety of methods for wave-based acoustic simulation and recent applications to architectural and environmental acoustic problems. Following an introduction providing an overview of computational simulation of sound environment, the book is in two parts: four chapters on methods and four chapters on applications. The first part explains the fundamentals and advanced techniques for three popular methods, namely, the finite-difference time-domain method, the finite element method, and the boundary element method, as well as alternative time-domain methods. The second part demonstrates various applications to room acoustics simulation, noise propagation simulation, acoustic property simulation for building components, and auralization. This book is a valuable reference that covers the state of the art in computational simulation for architectural and environmental acoustics.
Towards accurate quantum simulations of large systems with small computers.
Yang, Yonggang
2017-01-24
Numerical simulations are important for many systems. In particular, various standard computer programs have been developed for solving the quantum Schrödinger equation. However, the accuracy of these calculations is limited by computer capabilities. In this work, an iterative method is introduced to enhance the accuracy of these numerical calculations beyond what is otherwise achievable with conventional methods. The method is easily implementable and general for many systems.
Improved Pyrolysis Micro reactor Design via Computational Fluid Dynamics Simulations
2017-05-23
Briefing charts, 25 April 2017 - 23 May 2017: Improved Pyrolysis Micro-Reactor Design via Computational Fluid Dynamics Simulations, Ghanshyam L. Vaghjiani. DISTRIBUTION A: Approved for public release. Covers the history of the micro-reactor (Chen source), T <= 1800 K; see S.D. Chambreau et al., International Journal of Mass Spectrometry 2000, 199, 17-27.
Computer simulations for thorium doped tungsten crystals
Energy Technology Data Exchange (ETDEWEB)
Eberhard, Bernd
2009-07-17
set of Langevin equations, i.e. stochastic differential equations including properly chosen 'noise' terms. A new integration scheme is derived for integrating the equations of motion, which closely resembles the well-known Velocity Verlet algorithm. As a first application of the EAM potentials, we calculate the phonon dispersion for tungsten and thorium. Furthermore, the potentials are used to derive the excess volumes of point defects, i.e. of vacancies and Th impurities in tungsten, as well as grain boundary structures and energies. Additionally, we take a closer look at various stacking fault energies and link the results to the potential splitting of screw dislocations in tungsten into partials. We also compare the energetic stability of screw, edge, and mixed-type dislocations. Besides this, we are interested in free enthalpy differences, for which we make use of the Overlapping Distribution Method (ODM), an efficient, albeit computationally demanding, method, with which we address the question of lattice formation, vacancy formation, and impurity formation at varying temperatures. (orig.)
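A Langevin integrator of the kind described, i.e. a velocity-Verlet-like update augmented with a friction/noise part, can be sketched on a harmonic test potential. The sketch below uses the standard BAOAB splitting, not the thesis's own scheme (which is not reproduced in the abstract); all names and parameters are illustrative.

```python
import random
from math import exp, sqrt

def baoab_step(x, v, force, dt, gamma, kT, m=1.0, rng=random):
    """One BAOAB Langevin step: velocity-Verlet half-kicks and half-drifts
    around an exact Ornstein-Uhlenbeck update for friction and noise."""
    v += 0.5 * dt * force(x) / m          # B: half kick
    x += 0.5 * dt * v                     # A: half drift
    c1 = exp(-gamma * dt)                 # O: exact friction + noise
    v = c1 * v + sqrt(kT / m * (1 - c1 * c1)) * rng.gauss(0, 1)
    x += 0.5 * dt * v                     # A: half drift
    v += 0.5 * dt * force(x) / m          # B: half kick
    return x, v

random.seed(1)
force = lambda x: -x                      # harmonic well, k = 1
x, v, samples = 0.0, 0.0, []
for step in range(100000):
    x, v = baoab_step(x, v, force, dt=0.05, gamma=1.0, kT=1.0)
    if step > 5000:                       # discard equilibration
        samples.append(x * x)
mean_x2 = sum(samples) / len(samples)
print(round(mean_x2, 2))  # ~ kT/k = 1, as equipartition requires
```

A useful sanity check for any such scheme is exactly this: the sampled configurational averages must reproduce the target temperature.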
Computer simulation of viral-assembly and translocation
Mahalik, Jyoti Prakash
reported in the recent experimental work. We also investigated two methods for slowing down the translocation process: pore mutation and the use of alternating voltage. Langevin dynamics simulation and a Poisson-Nernst-Planck solver were used for the investigation. We demonstrated that mutation of the protein pore or applying alternating voltage is not a perfect solution for increasing translocation time deterministically. Both strategies resulted in an enhanced average translocation time as well as a wider translocation time distribution; the increase in the width of the distribution is undesired. In the last part of the project, we investigated the applicability of polyelectrolyte theory in computer simulations of polyelectrolyte translocation through nanopores. We determined that the Debye-Hückel approximation is acceptable for most translocation simulations as long as the coarse-grained polymer bead size is comparable to or larger than the Debye length. We also determined that the equilibrium translocation theory is applicable to polyelectrolyte translocation through a nanopore under biasing conditions. The unbiased translocation behavior of a polyelectrolyte chain is qualitatively different from the Rouse model predictions, except for the case where the polyelectrolyte is very small compared to the nanopore.
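The Debye-Hückel approximation mentioned here replaces the bare Coulomb pair energy between charged beads with an exponentially screened form, which is what makes it cheap enough for coarse-grained translocation simulations. A minimal sketch, in reduced (illustrative) units:

```python
from math import exp

def debye_huckel(r, q1q2, debye_length, eps=1.0):
    """Screened Coulomb pair energy U(r) = (q1*q2 / (eps*r)) * exp(-r / lambda_D).
    For r << lambda_D it approaches the bare Coulomb energy; beyond the
    Debye length the interaction is exponentially suppressed."""
    return q1q2 / (eps * r) * exp(-r / debye_length)

# Screening suppresses the interaction beyond the Debye length:
print(debye_huckel(1.0, 1.0, 1.0))   # ~0.37 (one Debye length)
print(debye_huckel(5.0, 1.0, 1.0))   # ~0.0013 (five Debye lengths)
```

This is also why the bead-size criterion in the abstract matters: once beads are comparable to or larger than the Debye length, the sub-screening-length detail the approximation discards is not resolved anyway.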
Quantum computer simulation using the CUDA programming model
Gutiérrez, Eladio; Romero, Sergio; Trenas, María A.; Zapata, Emilio L.
2010-02-01
Quantum computing emerges as a field that captures great theoretical interest. Its simulation represents a problem with high memory and computational requirements, which makes the use of parallel platforms advisable. In this work we deal with the simulation of an ideal quantum computer on the Compute Unified Device Architecture (CUDA), as such a problem can benefit from the high computational capacities of Graphics Processing Units (GPUs). After all, modern GPUs are becoming very powerful computational architectures, which is causing growing interest in their application for general-purpose computing. CUDA provides an execution model oriented towards a more general exploitation of the GPU, allowing it to be used as a massively parallel SIMT (Single-Instruction Multiple-Thread) multiprocessor. A simulator that takes memory reference locality issues into account is proposed, showing that the challenge of achieving high performance depends strongly on the explicit exploitation of the memory hierarchy. Several strategies have been experimentally evaluated, obtaining good performance results in comparison with conventional platforms.
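The core operation of such a state-vector simulator, and the strided memory access pattern that makes locality critical on a GPU, can be shown in a few lines. This is a plain sequential sketch of the technique, not the authors' CUDA code; the function name is illustrative.

```python
from math import sqrt

def apply_single_qubit_gate(state, gate, target, n_qubits):
    """Apply a 2x2 `gate` to qubit `target` of an n-qubit state vector.
    Amplitudes whose indices differ only in bit `target` are combined in
    pairs with stride 2**target — the access pattern a GPU simulator
    must organise carefully for memory locality."""
    stride = 1 << target
    new = state[:]
    for i in range(1 << n_qubits):
        if i & stride == 0:
            a, b = state[i], state[i | stride]
            new[i] = gate[0][0] * a + gate[0][1] * b
            new[i | stride] = gate[1][0] * a + gate[1][1] * b
    return new

H = [[1 / sqrt(2), 1 / sqrt(2)],
     [1 / sqrt(2), -1 / sqrt(2)]]        # Hadamard gate

state = [1.0, 0.0, 0.0, 0.0]             # 2-qubit state |00>
state = apply_single_qubit_gate(state, H, 0, 2)
state = apply_single_qubit_gate(state, H, 1, 2)
print(state)  # uniform superposition: all four amplitudes equal 0.5
```

The memory cost is the crux: an n-qubit state needs 2**n complex amplitudes, and every gate touches all of them, which is why the paper's locality-aware strategies dominate performance.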
The advanced computational testing and simulation toolkit (ACTS)
Energy Technology Data Exchange (ETDEWEB)
Drummond, L.A.; Marques, O.
2002-05-21
During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Distinctively, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips; but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, reduction of duplicate efforts
Using computer simulations to facilitate conceptual understanding of electromagnetic induction
Lee, Yu-Fen
This study investigated the use of computer simulations to facilitate conceptual understanding in physics. The use of computer simulations in the present study was grounded in a conceptual framework drawn from findings related to the use of computer simulations in physics education. To achieve the goal of effective utilization of computers for physics education, I first reviewed studies pertaining to computer simulations in physics education categorized by three different learning frameworks and studies comparing the effects of different simulation environments. My intent was to identify the learning context and factors for successful use of computer simulations in past studies and to learn from the studies which did not obtain a significant result. Based on the analysis of reviewed literature, I proposed effective approaches to integrate computer simulations in physics education. These approaches are consistent with well established education principles such as those suggested by How People Learn (Bransford, Brown, Cocking, Donovan, & Pellegrino, 2000). The research based approaches to integrated computer simulations in physics education form a learning framework called Concept Learning with Computer Simulations (CLCS) in the current study. The second component of this study was to examine the CLCS learning framework empirically. The participants were recruited from a public high school in Beijing, China. All participating students were randomly assigned to two groups, the experimental (CLCS) group and the control (TRAD) group. Research based computer simulations developed by the physics education research group at University of Colorado at Boulder were used to tackle common conceptual difficulties in learning electromagnetic induction. While interacting with computer simulations, CLCS students were asked to answer reflective questions designed to stimulate qualitative reasoning and explanation. After receiving model reasoning online, students were asked to submit
Directory of Open Access Journals (Sweden)
Waltemath Dagmar
2011-12-01
Abstract. Background: The increasing use of computational simulation experiments to inform modern biological research creates new challenges to annotate, archive, share and reproduce such experiments. The recently published Minimum Information About a Simulation Experiment (MIASE) proposes a minimal set of information that should be provided to allow the reproduction of simulation experiments among users and software tools. Results: In this article, we present the Simulation Experiment Description Markup Language (SED-ML). SED-ML encodes in a computer-readable exchange format the information required by MIASE to enable reproduction of simulation experiments. It has been developed as a community project; it is defined in a detailed technical specification and additionally provides an XML schema. The version of SED-ML described in this publication is Level 1 Version 1. It covers the description of the most frequent type of simulation experiments in the area, namely time course simulations. SED-ML documents specify which models to use in an experiment, modifications to apply to the models before using them, which simulation procedures to run on each model, what analysis results to output, and how the results should be presented. These descriptions are independent of the underlying model implementation. SED-ML is a software-independent format for encoding the description of simulation experiments; it is not specific to particular simulation tools. Here, we demonstrate that with the growing software support for SED-ML we can effectively exchange executable simulation descriptions. Conclusions: With SED-ML, software can exchange simulation experiment descriptions, enabling the validation and reuse of simulation experiments in different tools. Authors of papers reporting simulation experiments can make their simulation protocols available for other scientists to reproduce the results. Because SED-ML is agnostic about the exact modeling language(s) used
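The elements the abstract enumerates (a model reference, a simulation procedure, and a task binding the two) can be assembled into a simplified SED-ML-flavoured skeleton with standard XML tooling. The element names below follow SED-ML Level 1 Version 1 concepts, but namespaces and many required attributes are deliberately omitted; this is an illustrative sketch, not a schema-valid document.

```python
import xml.etree.ElementTree as ET

# Simplified SED-ML-like skeleton: which model, which simulation, and a
# task tying them together — the minimum MIASE-style reproduction recipe.
root = ET.Element("sedML", {"level": "1", "version": "1"})

models = ET.SubElement(root, "listOfModels")
ET.SubElement(models, "model", {"id": "m1", "source": "model.xml"})

sims = ET.SubElement(root, "listOfSimulations")
ET.SubElement(sims, "uniformTimeCourse",
              {"id": "s1", "initialTime": "0",
               "outputEndTime": "10", "numberOfPoints": "100"})

tasks = ET.SubElement(root, "listOfTasks")
ET.SubElement(tasks, "task", {"id": "t1", "modelReference": "m1",
                              "simulationReference": "s1"})

doc = ET.tostring(root, encoding="unicode")
print(doc)
```

Because the task references the model and simulation by id rather than embedding them, the same experiment description can point at models written in different modeling languages, which is the software-independence the abstract emphasises.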
Waltemath, Dagmar; Adams, Richard; Bergmann, Frank T; Hucka, Michael; Kolpakov, Fedor; Miller, Andrew K; Moraru, Ion I; Nickerson, David; Sahle, Sven; Snoep, Jacky L; Le Novère, Nicolas
2011-12-15
The increasing use of computational simulation experiments to inform modern biological research creates new challenges to annotate, archive, share and reproduce such experiments. The recently published Minimum Information About a Simulation Experiment (MIASE) proposes a minimal set of information that should be provided to allow the reproduction of simulation experiments among users and software tools. In this article, we present the Simulation Experiment Description Markup Language (SED-ML). SED-ML encodes in a computer-readable exchange format the information required by MIASE to enable reproduction of simulation experiments. It has been developed as a community project and it is defined in a detailed technical specification and additionally provides an XML schema. The version of SED-ML described in this publication is Level 1 Version 1. It covers the description of the most frequent type of simulation experiments in the area, namely time course simulations. SED-ML documents specify which models to use in an experiment, modifications to apply on the models before using them, which simulation procedures to run on each model, what analysis results to output, and how the results should be presented. These descriptions are independent of the underlying model implementation. SED-ML is a software-independent format for encoding the description of simulation experiments; it is not specific to particular simulation tools. Here, we demonstrate that with the growing software support for SED-ML we can effectively exchange executable simulation descriptions. With SED-ML, software can exchange simulation experiment descriptions, enabling the validation and reuse of simulation experiments in different tools. Authors of papers reporting simulation experiments can make their simulation protocols available for other scientists to reproduce the results. Because SED-ML is agnostic about exact modeling language(s) used, experiments covering models from different fields of research
Energy Technology Data Exchange (ETDEWEB)
Dombroski, M; Melius, C; Edmunds, T; Banks, L E; Bates, T; Wheeler, R
2008-09-24
This study uses the Multi-scale Epidemiologic Simulation and Analysis (MESA) system developed for foreign animal diseases to assess the consequences of nationwide human infectious disease outbreaks. A literature review identified the state of the art in both small-scale regional models and large-scale nationwide models and characterized key aspects of a nationwide epidemiological model. The MESA system offers computational advantages over existing epidemiological models and, because of those advantages, enables a broader array of stochastic analyses of model runs to be conducted. However, it had only been demonstrated on foreign animal diseases. This paper applied the MESA modeling methodology to human epidemiology. The methodology divided 2000 US Census data at the census-tract level into school-bound children, work-bound workers, the elderly, and stay-at-home individuals. The model simulated mixing among these groups by incorporating schools, workplaces, households, and long-distance travel via airports. A baseline scenario with fixed input parameters was run for a nationwide influenza outbreak using relatively simple social-distancing countermeasures. Analysis of the baseline scenario showed one of three possible results: (1) the outbreak burned itself out before it had a chance to spread regionally; (2) the outbreak spread regionally and lasted a relatively long time, although constrained geography enabled it to eventually be contained without affecting a disproportionately large number of people; or (3) the outbreak spread through air travel and lasted a long time with unconstrained geography, becoming a nationwide pandemic. These results are consistent with empirical influenza outbreak data. The results showed that simply scaling up a regional small-scale model is unlikely to account for all the complex variables and their interactions involved in a nationwide outbreak. There are several limitations of the methodology that should be explored in future
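The three baseline outcomes (burnout, contained regional spread, nationwide pandemic) can be illustrated with a toy stochastic SIR model: chance events early in the run decide whether the outbreak dies out or takes off. This is an illustrative sketch only, not the MESA methodology, and all parameter values are assumptions.

```python
# Toy stochastic SIR model: an outbreak either burns out quickly or
# infects a large fraction of the population, depending on chance early
# transmission. Illustrative only; not the MESA methodology.
import random

def run_sir(n=1000, beta=0.3, gamma=0.1, seed=0):
    rng = random.Random(seed)
    s, i, r = n - 1, 1, 0
    while i > 0:
        # each infectious individual transmits with probability ~ beta * s/n
        new_inf = min(s, sum(1 for _ in range(i) if rng.random() < beta * s / n))
        new_rec = sum(1 for _ in range(i) if rng.random() < gamma)
        s -= new_inf
        i += new_inf - new_rec
        r += new_rec
    return r  # final epidemic size (total ever infected)

sizes = [run_sir(seed=k) for k in range(20)]
```

Repeating the run across seeds typically produces the bimodal picture the abstract describes: some runs end after a handful of cases, others sweep most of the population.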
An introduction to statistical computing a simulation-based approach
Voss, Jochen
2014-01-01
A comprehensive introduction to sampling-based methods in statistical computing. The use of computers in mathematics and statistics has opened up a wide range of techniques for studying otherwise intractable problems. Sampling-based simulation techniques are now an invaluable tool for exploring statistical models. This book gives a comprehensive introduction to the exciting area of sampling-based methods. An Introduction to Statistical Computing introduces the classical topics of random number generation and Monte Carlo methods. It also includes some advanced met
Simulation of Turing Machine with uEAC-Computable Functions
Directory of Open Access Journals (Sweden)
Yilin Zhu
2015-01-01
The micro-Extended Analog Computer (uEAC) is an electronic implementation inspired by Rubel's EAC model. In this study, a fully connected uEAC array is proposed to overcome the limitations of a single uEAC; within it, each uEAC unit is connected to all the other units by weighted connections. Its computational capabilities are then investigated by proving that a Turing machine M can be simulated with uEAC-computable functions, even in the presence of bounded noise.
1994-01-01
This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period October 1, 1993 through March 31, 1994. The major categories of the current ICASE research program are: (1) applied and numerical mathematics, including numerical analysis and algorithm development; (2) theoretical and computational research in fluid mechanics in selected areas of interest to LaRC, including acoustics and combustion; (3) experimental research in transition and turbulence and aerodynamics involving LaRC facilities and scientists; and (4) computer science.
Technology computer aided design simulation for VLSI MOSFET
Sarkar, Chandan Kumar
2013-01-01
Responding to recent developments and a growing VLSI circuit manufacturing market, Technology Computer Aided Design: Simulation for VLSI MOSFET examines advanced MOSFET processes and devices through TCAD numerical simulations. The book provides a balanced summary of TCAD and MOSFET basic concepts, equations, physics, and new technologies related to TCAD and MOSFET. A firm grasp of these concepts allows for the design of better models, thus streamlining the design process, saving time and money. This book places emphasis on the importance of modeling and simulations of VLSI MOS transistors and
Multi-threaded, discrete event simulation of distributed computing systems
Legrand, Iosif; MONARC Collaboration
2001-10-01
The LHC experiments have envisaged computing systems of unprecedented complexity, for which it is necessary to provide a realistic description and modeling of data access patterns, and of many jobs running concurrently on large-scale distributed systems and exchanging very large amounts of data. A process-oriented approach to discrete event simulation is well suited to describing various activities running concurrently, as well as the stochastic arrival patterns specific to this type of simulation. Threaded objects or "Active Objects" can provide a natural way to map the specific behaviour of distributed data processing into the simulation program. The simulation tool developed within MONARC is based on Java (TM) technology, which provides adequate tools for developing a flexible and distributed process-oriented simulation. Proper graphics tools, and ways to analyze data interactively, are essential in any simulation project. The design elements, status and features of the MONARC simulation tool are presented. The program allows realistic modeling of complex data access patterns by multiple concurrent users in large-scale computing systems in a wide range of possible architectures, from centralized to highly distributed. A comparison between queuing theory and realistic client-server measurements is also presented.
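The cross-check against queuing theory mentioned at the end can be sketched with the simplest case, an M/M/1 queue, where theory predicts a mean time in system of W = 1/(mu - lambda). The event-loop below is a minimal illustration of that kind of validation, not the MONARC tool itself.

```python
# Minimal discrete-event simulation of an M/M/1 queue, of the kind used
# to cross-check a simulator against queuing theory.
# Theory: mean time in system W = 1/(mu - lambda).
import random

def mm1_mean_sojourn(lam=0.5, mu=1.0, n_jobs=200_000, seed=1):
    rng = random.Random(seed)
    t_arrival = 0.0
    server_free = 0.0       # time at which the server next becomes idle
    total = 0.0
    for _ in range(n_jobs):
        t_arrival += rng.expovariate(lam)      # Poisson arrivals
        start = max(t_arrival, server_free)    # FIFO: wait if server busy
        server_free = start + rng.expovariate(mu)
        total += server_free - t_arrival       # this job's sojourn time
    return total / n_jobs

w_sim = mm1_mean_sojourn()
w_theory = 1.0 / (1.0 - 0.5)   # 1/(mu - lambda) = 2.0
```

Agreement between `w_sim` and `w_theory` is a quick sanity check before trusting the simulator on workloads that queuing theory cannot handle.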
Cosmological constraints from applying SHAM to rescaled cosmological simulations
Simha, Vimal; Cole, Shaun
2013-12-01
We place constraints on the matter density of the Universe and the amplitude of clustering using measurements of the galaxy two-point correlation function from the Sloan Digital Sky Survey (SDSS). We generate model predictions for different cosmologies by populating rescaled N-body simulations with galaxies using the subhalo abundance matching (SHAM) technique. We find ΩM = 0.29 ± 0.03 and σ8 = 0.86 ± 0.04 at 68 per cent confidence by fitting the observed two-point correlation function of galaxies brighter than Mr = -18 in a volume-limited sample obtained from the SDSS. We discuss and quantify potential sources of systematic error and conclude that, while there is scope for improving its robustness, the technique presented in this paper provides a powerful low-redshift constraint on the cosmological parameters that is complementary to other commonly used methods.
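The core of SHAM is a rank-order matching: assuming a monotonic relation between subhalo mass and galaxy luminosity, the i-th most massive subhalo hosts the i-th most luminous galaxy. A minimal sketch of that step (with toy data, not the paper's pipeline):

```python
# Subhalo abundance matching in sketch form: rank-order halos by mass and
# galaxies by luminosity, then pair them monotonically.
def abundance_match(halo_masses, luminosities):
    """Return {halo_index: luminosity} by rank-order matching."""
    order = sorted(range(len(halo_masses)),
                   key=lambda i: halo_masses[i], reverse=True)
    lums = sorted(luminosities, reverse=True)
    return {i: lum for i, lum in zip(order, lums)}

# Toy example: halo 0 is most massive, so it receives the brightest galaxy.
match = abundance_match([3.0, 1.0, 2.0], [10.0, 30.0, 20.0])
```

In practice the method also involves scatter in the mass-luminosity relation and the choice of halo property to match on, which are among the systematics the paper quantifies.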
Applying Simulation Method in Formulation of Gluten-Free Cookies
Directory of Open Access Journals (Sweden)
Nikitina Marina
2017-01-01
At present, a priority direction in the development of new food products is technology for special-purpose products. Among these are gluten-free confectionery products intended for people with celiac disease. Gluten-free products are in demand among consumers, and there is a need to expand the assortment and improve quality indicators. This article presents the results of studies on the development of pastry products based on amaranth flour, which does not contain gluten. The study is based on a simulation method for formulating gluten-free confectionery recipes with a functional orientation, optimizing their chemical composition. The resulting products will diversify and supplement with necessary nutrients the diet of people with gluten intolerance, as well as of those who follow a gluten-free diet.
Mingo, Wendye Dianne
2013-01-01
This study attempts to determine if authentic learning strategies can be used to acquire knowledge of and increase motivation for computational thinking. Over 600 students enrolled in a computer literacy course participated in this study which involved completing a pretest, posttest and motivation survey. The students were divided into an…
Improving a Computer Networks Course Using the Partov Simulation Engine
Momeni, B.; Kharrazi, M.
2012-01-01
Computer networks courses are hard to teach as there are many details in the protocols and techniques involved that are difficult to grasp. Employing programming assignments as part of the course helps students to obtain a better understanding and gain further insight into the theoretical lectures. In this paper, the Partov simulation engine and…
Time Advice and Learning Questions in Computer Simulations
Rey, Gunter Daniel
2011-01-01
Students (N = 101) used an introductory text and a computer simulation to learn fundamental concepts about statistical analyses (e.g., analysis of variance, regression analysis and General Linear Model). Each learner was randomly assigned to one cell of a 2 (with or without time advice) x 3 (with learning questions and corrective feedback, with…
Atomic Force Microscopy and Real Atomic Resolution. Simple Computer Simulations
Koutsos, V.; Manias, E.; Brinke, G. ten; Hadziioannou, G.
1994-01-01
Using a simple computer simulation for AFM imaging in the contact mode, pictures with true and false atomic resolution are demonstrated. The surface probed consists of two f.c.c. (111) planes and an atomic vacancy is introduced in the upper layer. Changing the size of the effective tip and its
Using computer simulations to improve concept formation in chemistry
African Journals Online (AJOL)
By incorporating more visual material into a chemistry lecture, the lecturer may succeed in limiting the overloading of students' short-term memory, often the major factor leading to misconceptions. The goal of this research project was to investigate whether computer simulations used as a visually-supporting ...
Computer Simulation of the Impact of Cigarette Smoking On Humans
African Journals Online (AJOL)
In this edition, emphasis has been laid on computer simulation of the impact of cigarette smoking on the population between now and the next 50 years, if no government intervention is exercised to control the behaviour of smokers. The statistical indices derived from the previous article (WAJIAR Volume 4) in the series ...
Solving wood chip transport problems with computer simulation.
Dennis P. Bradley; Sharon A. Winsauer
1976-01-01
Efficient chip transport operations are difficult to achieve due to frequent and often unpredictable changes in distance to market, chipping rate, time spent at the mill, and equipment costs. This paper describes a computer simulation model that allows a logger to design an efficient transport system in response to these changing factors.
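The sensitivity the abstract points to (distance to market, time at the mill, loading rate) can be seen in a back-of-envelope cycle-time calculation: how many loads can one truck deliver per day? All parameter names and values below are illustrative assumptions, not figures from the paper.

```python
# Back-of-envelope chip transport model: deliveries per truck per day as
# a function of haul distance and turnaround times. Values are illustrative.
def loads_per_day(distance_km, speed_kmh=60.0, mill_hours=0.5,
                  load_hours=0.75, workday_hours=10.0):
    # round trip travel + unloading at the mill + loading at the chipper
    cycle = 2 * distance_km / speed_kmh + mill_hours + load_hours
    return int(workday_hours // cycle)

# Quadrupling the haul distance cuts deliveries sharply:
near, far = loads_per_day(30), loads_per_day(120)
```

A full simulation, as in the paper, would additionally randomize these times and let the logger test fleet sizes against the changing factors.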
Advanced Simulation and Computing Co-Design Strategy
Energy Technology Data Exchange (ETDEWEB)
Ang, James A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hoang, Thuc T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kelly, Suzanne M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); McPherson, Allen [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Neely, Rob [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)
2015-11-01
This ASC Co-design Strategy lays out the full continuum and components of the co-design process, based on what we have experienced thus far and what we wish to do more in the future to meet the program’s mission of providing high performance computing (HPC) and simulation capabilities for NNSA to carry out its stockpile stewardship responsibility.
Development of computer simulation models for pedestrian subsystem impact tests
Kant, R.; Konosu, A.; Ishikawa, H.
2000-01-01
The European Enhanced Vehicle-safety Committee (EEVC/WG10 and WG17) proposed three component subsystem tests for cars to assess pedestrian protection. The objective of this study is to develop computer simulation models of the EEVC pedestrian subsystem tests. These models are available to develop a
Computer simulation of cytoskeleton-induced blebbing in lipid membranes
DEFF Research Database (Denmark)
Spangler, E. J.; Harvey, C. W.; Revalee, J. D.
2011-01-01
Blebs are balloon-shaped membrane protrusions that form during many physiological processes. Using computer simulation of a particle-based model for self-assembled lipid bilayers coupled to an elastic meshwork, we investigated the phase behavior and kinetics of blebbing. We found that blebs form...
Learner Perceptions of Realism and Magic in Computer Simulations.
Hennessy, Sara; O'Shea, Tim
1993-01-01
Discusses the possible lack of credibility in educational interactive computer simulations. Topics addressed include "Shopping on Mars," a collaborative adventure game for arithmetic calculation that uses direct manipulation in the microworld; the Alternative Reality Kit, a graphical animated environment for creating interactive…
Scaffolding learners in designing investigation assignments for a computer simulation
Vreman-de Olde, Cornelise; de Jong, Anthonius J.M.
2006-01-01
This study examined the effect of scaffolding students who learned by designing assignments for a computer simulation on the physics topic of alternating circuits. We compared the students' assignments and the knowledge acquired in a scaffolded group (N=23) and a non-scaffolded group (N=19). The
Biology Students Building Computer Simulations Using StarLogo TNG
Smith, V. Anne; Duncan, Ishbel
2011-01-01
Confidence is an important issue for biology students in handling computational concepts. This paper describes a practical in which honours-level bioscience students simulate complex animal behaviour using StarLogo TNG, a freely-available graphical programming environment. The practical consists of two sessions, the first of which guides students…
Pedagogical Approaches to Teaching with Computer Simulations in Science Education
Rutten, N.P.G.; van der Veen, Johan (CTIT); van Joolingen, Wouter; McBride, Ron; Searson, Michael
2013-01-01
For this study we interviewed 24 physics teachers about their opinions on teaching with computer simulations. The purpose of this study is to investigate whether it is possible to distinguish different types of teaching approaches. Our results indicate the existence of two types. The first type is
The acoustical history of Hagia Sophia revived through computer simulations
DEFF Research Database (Denmark)
Rindel, Jens Holger; Weitze, C.A.; Christensen, Claus Lynge
2002-01-01
The present paper deals with acoustic computer simulations of Hagia Sophia, which is characterized not only by being one of the largest worship buildings in the world, but also by – in its 1500 year history – having served three purposes: as a church, as a mosque and today as a museum...
Computer simulation study of water using a fluctuating charge model
Indian Academy of Sciences (India)
Unknown
Abstract. Hydrogen bonding in small water clusters is studied through computer simulation methods using a sophisticated, empirical model of interaction developed by Rick et al (S W Rick, S J Stuart and B J Berne 1994 J. Chem. Phys. 101 6141) and others. The model allows for the charges on the interacting sites to ...
Computer Simulation of Dispersed Materials Motion in Rotary Tilting Furnaces
Directory of Open Access Journals (Sweden)
S. L. Rovin
2016-01-01
The article presents the results of computer simulation of the motion of dispersed materials in rotary furnaces with an inclined axis of rotation. New data on the behaviour of the dynamic layer have been obtained, enhancing understanding of the heat and mass transfer processes occurring in the layer.
Monte Carlo simulation by computer for life-cycle costing
Gralow, F. H.; Larson, W. J.
1969-01-01
Prediction of behavior and support requirements during the entire life cycle of a system enables accurate cost estimates by using the Monte Carlo simulation by computer. The system reduces the ultimate cost to the procuring agency because it takes into consideration the costs of initial procurement, operation, and maintenance.
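The approach described, drawing uncertain procurement, operation, and maintenance costs many times and summarizing the total, is easy to sketch. The distributions and figures below are illustrative assumptions only.

```python
# Monte Carlo life-cycle costing sketch: sample uncertain procurement,
# operation and maintenance costs over a 15-year life and summarize the
# total. All distributions and values are illustrative assumptions.
import random
import statistics

def life_cycle_cost(rng):
    procurement = rng.gauss(100.0, 5.0)                          # one-time buy
    operation = sum(rng.gauss(10.0, 2.0) for _ in range(15))     # yearly ops
    maintenance = sum(rng.expovariate(1 / 3.0) for _ in range(15))  # repairs
    return procurement + operation + maintenance

rng = random.Random(42)
samples = [life_cycle_cost(rng) for _ in range(10_000)]
mean_cost = statistics.mean(samples)   # expected around 100 + 150 + 45 = 295
spread = statistics.stdev(samples)
```

Beyond the mean, the sampled distribution gives the procuring agency percentiles and worst-case exposure, which a single point estimate cannot.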
Computational Simulation of a Water-Cooled Heat Pump
Bozarth, Duane
2008-01-01
A Fortran-language computer program for simulating the operation of a water-cooled vapor-compression heat pump in any orientation with respect to gravity has been developed by modifying a prior general-purpose heat-pump design code used at Oak Ridge National Laboratory (ORNL).
Simulating the immune response on a distributed parallel computer
Energy Technology Data Exchange (ETDEWEB)
Castiglione, F. [Univ. of Catania (Italy); Bernaschi, M. [Via Shanghai, Rome (Italy); Succi, S. [IAC/CNR, Rome (Italy)
1997-06-01
The application of ideas and methods of statistical mechanics to problems of biological relevance is one of the most promising frontiers of theoretical and computational mathematical physics. Among others, the computer simulation of immune system dynamics stands out as one of the prominent candidates for this type of investigation. In recent years, immunological research has been drawing increasing benefits from advanced mathematical modeling on modern computers. Cellular Automata (CA), i.e., fully discrete dynamical systems evolving according to boolean laws, appear to be extremely well suited to computer simulation of biological systems. A prominent example of an immunological CA is the Celada-Seiden automaton, which has proven capable of providing several new insights into the dynamics of the immune system response. Until now, however, the Celada-Seiden automaton has not been able to exploit the impressive advances of computer technology, notably parallel processing, simply because no parallel version of this automaton had been developed. In this paper we fill this gap and describe a parallel version of the Celada-Seiden cellular automaton aimed at simulating the dynamic response of the immune system. Details of the parallel implementation, as well as performance data on the IBM SP2 parallel platform, are presented and commented on.
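"Fully discrete dynamical systems evolving according to boolean laws" can be made concrete with the simplest possible example, a one-dimensional two-state cellular automaton. Rule 110 is used here purely as an illustration; the Celada-Seiden automaton is far richer (many cell types and interaction rules), but the update pattern is the same in spirit.

```python
# Minimal 1-D boolean cellular automaton: each cell's next state is a
# boolean function (here, elementary rule 110) of its 3-cell neighborhood
# on a periodic lattice. Illustrative only; not the Celada-Seiden model.
def step(cells, rule=110):
    n = len(cells)
    return [
        (rule >> ((cells[(i - 1) % n] << 2)
                  | (cells[i] << 1)
                  | cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

state = [0] * 31
state[15] = 1          # single active cell in the middle
for _ in range(10):
    state = step(state)
```

The parallelization opportunity is visible even here: every cell's update depends only on its local neighborhood, so the lattice can be partitioned across processors with only boundary exchange, which is the essence of the paper's approach.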
Computer-simulated development process of Chinese characters font cognition
Chen, Jing; Mu, Zhichun; Sun, Dehui; Hu, Dunli
2008-10-01
The research of Chinese characters cognition is an important research aspect of cognitive science and computer science, especially artificial intelligence. In this paper, according as the traits of Chinese characters the database of Chinese characters font representations and the model of computer simulation of Chinese characters font cognition are constructed from the aspect of cognitive science. The font cognition of Chinese characters is actual a gradual process and there is the accumulation of knowledge. Through using the method of computer simulation, the development model of Chinese characters cognition was constructed. And this is the important research content of Chinese characters cognition. This model is based on self-organizing neural network and adaptive resonance theory (ART) neural network. By Combining the SOFM and ART2 network, two sets of input were trained. Through training and testing methods, the development process of Chinese characters cognition based on Chinese characters cognition was simulated. Then the results from this model and could be compared with the results that were obtained only using SOFM. By analyzing the results, this simulation suggests that the model is able to account for some empirical results. So, the model can simulate the development process of Chinese characters cognition in a way.
Computer simulation and image guidance for individualised dynamic spinal stabilization.
Kantelhardt, S R; Hausen, U; Kosterhon, M; Amr, A N; Gruber, K; Giese, A
2015-08-01
Dynamic implants for the human spine are used to re-establish regular segmental motion. However, the results have often been unsatisfactory and complications such as screw loosening are common. Individualisation of appliances and precision implantation are needed to improve the outcome of this procedure. Computer simulation, virtual implant optimisation and image guidance were used to improve the technique. A human lumbar spine computer model was developed using multi-body simulation software. The model simulates spinal motion under load and degenerative changes. After virtual degeneration of a L4/5 segment, virtual pedicle screw-based implants were introduced. The implants' positions and properties were iteratively optimised. The resulting implant positions were used as operative plan for image guidance and finally implemented in a physical spine model. In the simulation, the introduction and optimisation of virtually designed dynamic implants could partly compensate for the effects of virtual lumbar segment degeneration. The optimised operative plan was exported to two different image-guidance systems for transfer to a physical spine model. Three-dimensional computer graphic simulation is a feasible means to develop operative plans for dynamic spinal stabilization. These operative plans can be transferred to commercially available image-guidance systems for use in implantation of physical implants in a spine model. This concept has important potential in the design of operative plans and implants for individualised dynamic spine stabilization surgery.
The DYNAMO Simulation Language--An Alternate Approach to Computer Science Education.
Bronson, Richard
1986-01-01
Suggests the use of computer simulation of continuous systems as a problem solving approach to computer languages. Outlines the procedures that the system dynamics approach employs in computer simulations. Explains the advantages of the special purpose language, DYNAMO. (ML)
Ross, Sheldon
2006-01-01
Ross's Simulation, Fourth Edition introduces aspiring and practicing actuaries, engineers, computer scientists and others to the practical aspects of constructing computerized simulation studies to analyze and interpret real phenomena. Readers learn to apply results of these analyses to problems in a wide variety of fields to obtain effective, accurate solutions and make predictions about future outcomes. This text explains how a computer can be used to generate random numbers, and how to use these random numbers to generate the behavior of a stochastic model over time. It presents the statist
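The pipeline the book teaches, uniform random numbers transformed into draws from a target distribution, which then drive a stochastic model over time, can be shown in a few lines. Here inverse-transform sampling of an exponential drives a tiny Poisson arrival process; this is a generic textbook construction, not an excerpt from the book.

```python
# From uniform random numbers to a stochastic model over time:
# inverse-transform sampling of exponential interarrival times drives a
# Poisson arrival process.
import math
import random

def exponential(u, rate):
    """Inverse transform: if U ~ Uniform(0,1), -ln(1-U)/rate ~ Exp(rate)."""
    return -math.log(1.0 - u) / rate

def arrivals_before(t_end, rate, rng):
    """Count Poisson(rate) arrivals in [0, t_end)."""
    t, count = 0.0, 0
    while True:
        t += exponential(rng.random(), rate)
        if t >= t_end:
            return count
        count += 1

rng = random.Random(7)
counts = [arrivals_before(100.0, 0.5, rng) for _ in range(1000)]
avg = sum(counts) / len(counts)   # should be close to rate * t_end = 50
```

Comparing `avg` against the known mean is exactly the kind of statistical validation of simulation output the text presents.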
Bibliography for Verification and Validation in Computational Simulations
Energy Technology Data Exchange (ETDEWEB)
Oberkampf, W.L.
1998-10-01
A bibliography has been compiled dealing with the verification and validation of computational simulations. The references listed in this bibliography are concentrated in the field of computational fluid dynamics (CFD). However, references from the following fields are also included: operations research, heat transfer, solid dynamics, software quality assurance, software accreditation, military systems, and nuclear reactor safety. This bibliography, containing 221 references, is not meant to be comprehensive. It was compiled during the last ten years in response to the author's interest and research in the methodology for verification and validation. The emphasis in the bibliography is in the following areas: philosophy of science underpinnings, development of terminology and methodology, high accuracy solutions for CFD verification, experimental datasets for CFD validation, and the statistical quantification of model validation. This bibliography should provide a starting point for individual researchers in many fields of computational simulation in science and engineering.
Affective Computing and Augmented Reality for Car Driving Simulators
Directory of Open Access Journals (Sweden)
Dragoș Datcu
2017-12-01
Car simulators are essential for training and for analyzing the behavior, responses and performance of the driver. Augmented Reality (AR) is the technology that enables virtual images to be overlaid on views of the real world. Affective Computing (AC) is the technology that helps read emotions by means of computer systems, by analyzing body gestures, facial expressions, speech and physiological signals. The key aspect of the research lies in investigating novel interfaces that help build situational awareness and emotional awareness, to enable affect-driven remote collaboration in AR for car driving simulators. The problem addressed is how to build situational awareness (using AR technology) and emotional awareness (by AC technology), and how to integrate these two distinct technologies [4] into a single affective framework for training in a car driving simulator.
Genomics, proteomics & metabolomics can provide useful weight-of-evidence data along the source-to-outcome continuum, when appropriate bioinformatic and computational methods are applied towards integrating molecular, chemical and toxicological information.
Optical high-performance computing: introduction to the JOSA A and Applied Optics feature.
Caulfield, H John; Dolev, Shlomi; Green, William M J
2009-08-01
The feature issues in both Applied Optics and the Journal of the Optical Society of America A focus on topics of immediate relevance to the community working in the area of optical high-performance computing.
National Research Council Canada - National Science Library
Vladimir Ivančević; Marko Knežević; Ivan Luković
2017-01-01
In this paper, we lay the foundation for an adaptation of the teaching process to the personality traits and academic performance of the university students enrolled in applied computer science and informatics (ACSI...
A Strategic Initiative in Applied Biological Simulations 01-SI-012 Final Report for FY01 - FY03
Energy Technology Data Exchange (ETDEWEB)
Lau, E Y; Venclovas, C; Schwegler, E; Gygi, F; Colvin, M E; Bennion, B J; Barsky, D; Mundy, C; Lightstone, F C; Galli, G; Sawicka, D
2004-02-16
The goal of this Strategic Initiative in Applied Computational Biology has been to apply LLNL's expertise in computational simulation to forge a new laboratory core competency in biological simulation. By every measure, this SI has been very successful in this goal. Based on a strong publication record and a large number of conference presentations and invited talks, we have built a recognized niche for LLNL in the burgeoning field of computational biology. Further, many of the projects that were previously part of this LDRD are now externally funded based on the research results and expertise developed under this SI. We have created successful collaborations with a number of outside research groups, including several joint projects with the new UC Davis/LLNL Comprehensive Cancer Center. In addition to these scientific collaborations, the staff developed under this SI are involved in computational biology program development and advisory roles with other DOE laboratories and DOE Headquarters. Moreover, a number of capabilities and expertise created by this SI are finding use in LLNL programmatic applications. Finally, and most importantly, this SI project has brought to LLNL the human talent who will ensure the further success of computational biology at this laboratory.
Teaching and Learning Methodologies Supported by ICT Applied in Computer Science
Capacho, Jose
2016-01-01
The main objective of this paper is to show a set of new methodologies applied in the teaching of Computer Science using ICT. The methodologies are framed in the conceptual basis of the following sciences: Psychology, Education and Computer Science. The theoretical framework of the research is supported by Behavioral Theory, Gestalt Theory.…
Rudisill, Marianne; Mckay, Timothy D.
1990-01-01
The applied human factors research program performed at the NASA Johnson Space Center's Human-Computer Interaction Laboratory is discussed. Research is conducted to advance knowledge in human interaction with computer systems during space crew tasks. In addition, the Laboratory is directly involved in the specification of the human-computer interface (HCI) for space systems in development (e.g., Space Station Freedom) and is providing guidelines and support for HCI design to current and future space missions.
Welch, M C; Kwan, P W; Sajeev, A S M
2014-10-01
Agent-based modelling has proven to be a promising approach for developing rich simulations of complex phenomena that provide decision-support functions across a broad range of areas, including the biological, social and agricultural sciences. This paper demonstrates how high-performance computing technologies, namely General-Purpose Computing on Graphics Processing Units (GPGPU), and commercial Geographic Information Systems (GIS) can be applied to develop a national-scale, agent-based simulation of an incursion of Old World Screwworm fly (OWS fly) into the Australian mainland. The development of this simulation model leverages the combination of massively data-parallel processing capabilities supported by NVidia's Compute Unified Device Architecture (CUDA) and the advanced spatial visualisation capabilities of GIS. These technologies have enabled the implementation of an individual-based, stochastic lifecycle and dispersal algorithm for the OWS fly invasion. The simulation model draws upon a wide range of biological data as input to stochastically determine the reproduction and survival of the OWS fly through the different stages of its lifecycle and the dispersal of gravid females. Through this model, a highly efficient computational platform has been developed for studying the effectiveness of control and mitigation strategies and their associated economic impact on livestock industries. Copyright © 2014 International Atomic Energy Agency 2014. Published by Elsevier B.V. All rights reserved.
Two-dimensional computer simulation of high intensity proton beams
Lapostolle, Pierre M
1972-01-01
A computer program has been developed which simulates the two- dimensional transverse behaviour of a proton beam in a focusing channel. The model is represented by an assembly of a few thousand 'superparticles' acted upon by their own self-consistent electric field and an external focusing force. The evolution of the system is computed stepwise in time by successively solving Poisson's equation and Newton's law of motion. Fast Fourier transform techniques are used for speed in the solution of Poisson's equation, while extensive area weighting is utilized for the accurate evaluation of electric field components. A computer experiment has been performed on the CERN CDC 6600 computer to study the nonlinear behaviour of an intense beam in phase space, showing under certain circumstances a filamentation due to space charge and an apparent emittance growth. (14 refs).
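The stepwise Poisson/Newton scheme the abstract describes can be sketched in Python. This is a minimal NumPy sketch, not the CERN program: the grid size, units, and the nearest-grid-point charge deposit are illustrative assumptions (the original used area weighting for accuracy):

```python
import numpy as np

def solve_poisson_fft(rho, L):
    """Solve the periodic Poisson equation lap(phi) = -rho on an n x n grid via FFT."""
    n = rho.shape[0]
    k = 2.0 * np.pi * np.fft.fftfreq(n, d=L / n)
    kx, ky = np.meshgrid(k, k, indexing="ij")
    k2 = kx**2 + ky**2
    rho_hat = np.fft.fft2(rho)
    phi_hat = np.zeros_like(rho_hat)
    mask = k2 > 0                      # drop the k=0 (neutralizing background) mode
    phi_hat[mask] = rho_hat[mask] / k2[mask]
    return np.fft.ifft2(phi_hat).real

def push_particles(pos, vel, q_over_m, L, n, dt):
    """One time step: deposit charge (nearest grid point), solve Poisson,
    interpolate E back to the particles, and apply Newton's law."""
    h = L / n
    cells = (pos / h).astype(int) % n
    rho = np.zeros((n, n))
    np.add.at(rho, (cells[:, 0], cells[:, 1]), 1.0 / h**2)
    phi = solve_poisson_fft(rho, L)
    ex = -(np.roll(phi, -1, axis=0) - np.roll(phi, 1, axis=0)) / (2 * h)
    ey = -(np.roll(phi, -1, axis=1) - np.roll(phi, 1, axis=1)) / (2 * h)
    e_at_p = np.stack((ex[cells[:, 0], cells[:, 1]],
                       ey[cells[:, 0], cells[:, 1]]), axis=1)
    vel = vel + q_over_m * e_at_p * dt     # Newton's law of motion
    pos = (pos + vel * dt) % L             # periodic transverse box, a simplification
    return pos, vel
```

In a production code the nearest-grid-point deposit above would be replaced by cloud-in-cell area weighting, which is what the abstract credits for the accurate field evaluation.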
Computational Fluid Dynamics and Building Energy Performance Simulation
DEFF Research Database (Denmark)
Nielsen, Peter V.; Tryggvason, Tryggvi
An interconnection between a building energy performance simulation program and a Computational Fluid Dynamics program (CFD) for room air distribution will be introduced for improvement of the predictions of both the energy consumption and the indoor environment. The building energy performance...... simulation program requires a detailed description of the energy flow in the air movement which can be obtained by a CFD program. The paper describes an energy consumption calculation in a large building, where the building energy simulation program is modified by CFD predictions of the flow between three...... program and a building energy performance simulation program will improve both the energy consumption data and the prediction of thermal comfort and air quality in a selected area of the building....
1987-10-01
include Security Classification) Instrumentation for scientific computing in neural networks, information science, artificial intelligence, and...instrumentation grant to purchase equipment for support of research in neural networks, information science, artificial intelligence, and applied mathematics...in Neural Networks, Information Science, Artificial Intelligence, and Applied Mathematics Contract AFOSR 86-0282 Principal Investigator: Stephen
Research in progress in applied mathematics, numerical analysis, and computer science
1990-01-01
Research conducted at the Institute in Science and Engineering in applied mathematics, numerical analysis, and computer science is summarized. The Institute conducts unclassified basic research in applied mathematics in order to extend and improve problem solving capabilities in science and engineering, particularly in aeronautics and space.
Nishiura, Daisuke; Furuichi, Mikito; Sakaguchi, Hide
2015-09-01
The computational performance of a smoothed particle hydrodynamics (SPH) simulation is investigated for three types of current shared-memory parallel computer devices: many integrated core (MIC) processors, graphics processing units (GPUs), and multi-core CPUs. We are especially interested in efficient shared-memory allocation methods for each chipset, because the efficient data access patterns differ between compute unified device architecture (CUDA) programming for GPUs and OpenMP programming for MIC processors and multi-core CPUs. We first introduce several parallel implementation techniques for the SPH code, and then examine these on our target computer architectures to determine the most effective algorithms for each processor unit. In addition, we evaluate the effective computing performance and power efficiency of the SPH simulation on each architecture, as these are critical metrics for overall performance in a multi-device environment. In our benchmark test, the GPU is found to produce the best arithmetic performance as a standalone device unit, and gives the most efficient power consumption. The multi-core CPU obtains the most effective computing performance. The computational speed of the MIC processor on Xeon Phi approached that of two Xeon CPUs. This indicates that using MICs is an attractive choice for existing SPH codes on multi-core CPUs parallelized by OpenMP, as it gains computational acceleration without the need for significant changes to the source code.
From Architectural Acoustics to Acoustical Architecture Using Computer Simulation
DEFF Research Database (Denmark)
Schmidt, Anne Marie Due; Kirkegaard, Poul Henning
2005-01-01
to the design of Bagsvaerd Church by Jørn Utzon. The paper discusses the advantages and disadvantages of the programme in each phase compared to the works of architects not using acoustic simulation programmes. The conclusion of the paper points towards the need to apply the acoustic simulation programmes...... properties prior to the actual construction of a building. With the right tools applied, acoustic design can become an integral part of the architectural design process. The aim of this paper is to investigate the field of application that an acoustic simulation programme can have during an architectural...... acoustic design process and to set up a strategy to develop future programmes. The emphasis is put on the first three out of four phases in the working process of the architect and a case study is carried out in which each phase is represented by typical results – as exemplified with reference...
Precup, Radu-Emil; Preitl, Stefan
2012-01-01
This book highlights the benefits to be gained from various applications of computational intelligence techniques. The book is structured to include a set of selected and extended papers from the 6th IEEE International Symposium on Applied Computational Intelligence and Informatics SACI 2011, held in Timisoara, Romania, from 19 to 21 May 2011. After a rigorous review performed by the Technical Program Committee, only 116 submissions were accepted, leading to a paper acceptance ratio of 65%. A further refinement was made after the symposium, based also on the assessment of presentation quality. In conclusion, this book includes the extended and revised versions of the very best papers of SACI 2011 and a few invited papers authored by prominent specialists. The readers will benefit from gaining knowledge of computational intelligence and of what problems can be solved in several areas; they will learn what kinds of approaches are advised for solving these problems. A...
Automatic Model Generation Framework for Computational Simulation of Cochlear Implantation
DEFF Research Database (Denmark)
Mangado Lopez, Nerea; Ceresa, Mario; Duchateau, Nicolas
2016-01-01
Recent developments in computational modeling of cochlear implantation are promising to study in silico the performance of the implant before surgery. However, creating a complete computational model of the patient's anatomy while including an external device geometry remains challenging...... constitutive parameters to all components of the finite element model. This model can then be used to study in silico the effects of the electrical stimulation of the cochlear implant. Results are shown on a total of 25 models of patients. In all cases, a final mesh suitable for finite element simulations...
Computational Physics Simulation of Classical and Quantum Systems
Scherer, Philipp O. J
2010-01-01
This book encapsulates the coverage for a two-semester course in computational physics. The first part introduces the basic numerical methods while omitting mathematical proofs but demonstrating the algorithms by way of numerous computer experiments. The second part specializes in simulation of classical and quantum systems with instructive examples spanning many fields in physics, from a classical rotor to a quantum bit. All program examples are realized as Java applets ready to run in your browser and do not require any programming skills.
OSL sensitivity changes during single aliquot procedures: Computer simulations
DEFF Research Database (Denmark)
McKeever, S.W.S.; Agersnap Larsen, N.; Bøtter-Jensen, L.
1997-01-01
We present computer simulations of sensitivity changes obtained during single aliquot, regeneration procedures. The simulations indicate that the sensitivity changes are the combined result of shallow trap and deep trap effects. Four separate processes have been identified. Although procedures can...... be suggested to eliminate the shallow trap effects, it appears that the deep trap effects cannot be removed. The character of the sensitivity changes which result from these effects is seen to be dependent upon several external parameters, including the extent of bleaching of the OSL signal, the laboratory...
Modeling and simulation the computer science of illusion
Raczynski, Stanislaw
2006-01-01
Simulation is the art of using tools - physical or conceptual models, or computer hardware and software, to attempt to create the illusion of reality. The discipline has in recent years expanded to include the modelling of systems that rely on human factors and therefore possess a large proportion of uncertainty, such as social, economic or commercial systems. These new applications make the discipline of modelling and simulation a field of dynamic growth and new research. Stanislaw Raczynski outlines the considerable and promising research that is being conducted to counter the problems of
COMPUTATIONAL SIMULATION OF FIRE DEVELOPMENT INSIDE A TRADE CENTRE
Directory of Open Access Journals (Sweden)
Constantin LUPU
2015-07-01
Real-scale fire experiments involve considerable costs compared to computational mathematical modelling. This paper presents the results of such a virtual simulation of a fire occurring in a hypothetical wholesale warehouse comprising a large number of trade stands. The analysis starts from the ignition source located inside a trade stand and follows the fire's expansion over three groups of compartments, highlighting the heat transfer both in small spaces and over large distances. To confirm the accuracy of the simulation, the obtained values are compared to those from the specialized literature.
A computer simulation approach to measurement of human control strategy
Green, J.; Davenport, E. L.; Engler, H. F.; Sears, W. E., III
1982-01-01
Human control strategy is measured through use of a psychologically-based computer simulation which reflects a broader theory of control behavior. The simulation is called the human operator performance emulator, or HOPE. HOPE was designed to emulate control learning in a one-dimensional preview tracking task and to measure control strategy in that setting. When given a numerical representation of a track and information about current position in relation to that track, HOPE generates positions for a stick controlling the cursor to be moved along the track. In other words, HOPE generates control stick behavior corresponding to that which might be used by a person learning preview tracking.
Computational electronics semiclassical and quantum device modeling and simulation
Vasileska, Dragica; Klimeck, Gerhard
2010-01-01
Starting with the simplest semiclassical approaches and ending with the description of complex fully quantum-mechanical methods for quantum transport analysis of state-of-the-art devices, Computational Electronics: Semiclassical and Quantum Device Modeling and Simulation provides a comprehensive overview of the essential techniques and methods for effectively analyzing transport in semiconductor devices. With the transistor reaching its limits and new device designs and paradigms of operation being explored, this timely resource delivers the simulation methods needed to properly model state-of
Methodology for characterizing modeling and discretization uncertainties in computational simulation
Energy Technology Data Exchange (ETDEWEB)
ALVIN,KENNETH F.; OBERKAMPF,WILLIAM L.; RUTHERFORD,BRIAN M.; DIEGERT,KATHLEEN V.
2000-03-01
This research effort focuses on methodology for quantifying the effects of model uncertainty and discretization error on computational modeling and simulation. The work is directed towards developing methodologies which treat model form assumptions within an overall framework for uncertainty quantification, for the purpose of developing estimates of total prediction uncertainty. The present effort consists of work in three areas: framework development for sources of uncertainty and error in the modeling and simulation process which impact model structure; model uncertainty assessment and propagation through Bayesian inference methods; and discretization error estimation within the context of non-deterministic analysis.
Computational Fluid Dynamics Simulation of Fluidized Bed Polymerization Reactors
Energy Technology Data Exchange (ETDEWEB)
Fan, Rong [Iowa State Univ., Ames, IA (United States)
2006-01-01
Fluidized bed (FB) reactors are widely used in the polymerization industry due to their superior heat- and mass-transfer characteristics. Nevertheless, problems associated with local overheating of polymer particles and excessive agglomeration leading to defluidization of FB reactors still persist and limit the range of operating temperatures that can be safely achieved in plant-scale reactors. Many researchers have worked on the modeling of FB polymerization reactors, and quite a few models are available in the open literature, such as the well-mixed model developed by McAuley, Talbot, and Harris (1994), the constant bubble size model (Choi and Ray, 1985) and the heterogeneous three-phase model (Fernandes and Lona, 2002). Most of these works focus on the kinetic aspects, but from an industrial viewpoint, the behavior of FB reactors should be modeled by considering the particle and fluid dynamics in the reactor. Computational fluid dynamics (CFD) is a powerful tool for understanding the effect of fluid dynamics on chemical reactor performance. For single-phase flows, CFD models for turbulent reacting flows are now well understood and routinely applied to investigate complex flows with detailed chemistry. For multiphase flows, the state of the art in CFD models is changing rapidly and it is now possible to predict reasonably well the flow characteristics of gas-solid FB reactors with mono-dispersed, non-cohesive solids. This thesis is organized into seven chapters. In Chapter 2, an overview of fluidized bed polymerization reactors is given, and a simplified two-site kinetic mechanism is discussed. Some basic theories used in our work are given in detail in Chapter 3. First, the governing equations and other constitutive equations for the multi-fluid model are summarized, and the kinetic theory for describing the solid stress tensor is discussed. The detailed derivation of DQMOM for the population balance equation is given as the second section. In this section
Preface (to: Brain-Computer Interfaces. Applying our Minds to Human-Computer Interaction)
Tan, Desney; Tan, Desney S.; Nijholt, Antinus
2010-01-01
The advances in cognitive neuroscience and brain imaging technologies provide us with the increasing ability to interface directly with activity in the brain. Researchers have begun to use these technologies to build brain-computer interfaces. Originally, these interfaces were meant to allow
Microeconomic theory and computation applying the Maxima open-source computer algebra system
Hammock, Michael R
2014-01-01
This book provides a step-by-step tutorial for using Maxima, an open-source multi-platform computer algebra system, to examine the economic relationships that form the core of microeconomics in a way that complements traditional modeling techniques.
3rd ACIS International Conference on Applied Computing and Information Technology
2016-01-01
This edited book presents scientific results of the 3rd International Conference on Applied Computing and Information Technology (ACIT 2015), which was held on July 12-16, 2015 in Okayama, Japan. The aim of this conference was to bring together researchers and scientists, businessmen and entrepreneurs, teachers, engineers, computer users, and students to discuss the numerous fields of computer science, to share their experiences, and to exchange new ideas and information in a meaningful way: to present research results on all aspects (theory, applications and tools) of computer and information science, and to discuss the practical challenges encountered along the way and the solutions adopted to solve them.
Active adaptive sound control in a duct - A computer simulation
Burgess, J. C.
1981-09-01
A digital computer simulation of adaptive closed-loop control for a specific application (sound cancellation in a duct) is discussed. The principal element is an extension of Sondhi's adaptive echo canceler and Widrow's adaptive noise canceler from signal processing to control. Thus, the adaptive algorithm is based on the LMS gradient search method. The simulation demonstrates that one or more pure tones can be canceled down to the computer bit noise level (-120 dB). When additive white noise is present, pure tones can be canceled to at least 10 dB below the noise spectrum level for SNRs down to at least 0 dB. The underlying theory suggests that the algorithm allows tracking tones with amplitudes and frequencies that change more slowly with time than the adaptive filter adaptation rate. It also implies that the method can cancel narrow-band sound in the presence of spectrally overlapping broadband sound.
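The LMS gradient-search scheme underlying the canceler can be sketched as follows. This is a minimal NumPy sketch of the generic LMS algorithm, not the paper's control simulation; the filter length, step size, and test tone are illustrative assumptions:

```python
import numpy as np

def lms_cancel(reference, primary, taps=32, mu=0.005):
    """LMS adaptive canceler: learn a FIR filter that maps the reference
    signal onto the primary signal and return the residual (error) signal."""
    w = np.zeros(taps)
    residual = np.zeros_like(primary)
    for i in range(taps, len(primary)):
        x = reference[i - taps:i][::-1]     # most recent sample first
        e = primary[i] - w @ x              # cancellation error
        w += 2.0 * mu * e * x               # LMS gradient-descent step
        residual[i] = e
    return residual

# cancel a pure tone picked up with a different amplitude and phase
t = np.arange(8000)
ref = np.sin(2 * np.pi * 0.05 * t)
pri = 0.8 * np.sin(2 * np.pi * 0.05 * t + 0.7)
res = lms_cancel(ref, pri)
```

After convergence the residual of the pure tone drops essentially to numerical noise, consistent with the abstract's observation that tones cancel down to the bit-noise level.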
Molecular Dynamics Computer Simulations of Multidrug RND Efflux Pumps
Ruggerone, Paolo; Vargiu, Attilio V.; Collu, Francesca; Fischer, Nadine; Kandt, Christian
2013-01-01
Over-expression of multidrug efflux pumps of the Resistance Nodulation Division (RND) protein super family counts among the main causes for microbial resistance against pharmaceuticals. Understanding the molecular basis of this process is one of the major challenges of modern biomedical research, involving a broad range of experimental and computational techniques. Here we review the current state of RND transporter investigation employing molecular dynamics simulations providing conformation...
Carburizer particle dissolution in liquid cast iron – computer simulation
Directory of Open Access Journals (Sweden)
D. Bartocha
2010-01-01
In this paper, the dissolution of carburizer particles (anthracite, petroleum coke and graphite) in liquid metal and its computer simulation are presented. The relative velocity of the particles and the liquid metal and the thermophysical properties of the carburizing materials (thermal conductivity coefficient, specific heat, thermal diffusivity, density) are taken into consideration in the calculations. The calculations have been carried out in the context of metal bath carburization in metallurgical furnaces.
Computer simulation of carburizers particles heating in liquid metal
Directory of Open Access Journals (Sweden)
K. Janerka
2010-01-01
This article introduces the computer simulation of the heating of carburizer particles (anthracite, graphite and petroleum coke) present in liquid metal. The diameter of the particles, their quantity, the relative velocity of the particles and the liquid metal, and the thermophysical properties of the materials (thermal conductivity, specific heat and thermal diffusivity) have been taken into account in the calculations. The analysis has been carried out in the context of liquid metal carburization in metallurgical furnaces.
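The kind of heating calculation described can be sketched with a lumped-capacitance model integrated by explicit Euler steps. All property values below are illustrative assumptions, not the paper's data:

```python
import math

def particle_heating(t_metal, t0, radius, rho, cp, h, dt, t_end):
    """Temperature history of a spherical particle immersed in liquid metal,
    assuming a uniform internal temperature (lumped-capacitance model):
    dT/dt = h*A*(T_metal - T) / (rho * cp * V)."""
    area = 4.0 * math.pi * radius**2
    volume = (4.0 / 3.0) * math.pi * radius**3
    c = h * area / (rho * cp * volume)      # inverse time constant, 1/s
    temps, t = [t0], t0
    for _ in range(int(t_end / dt)):
        t += c * (t_metal - t) * dt         # explicit Euler step
        temps.append(t)
    return temps

# graphite-like particle of 1 mm radius dropped into iron melt at 1450 °C
history = particle_heating(t_metal=1450.0, t0=25.0, radius=1e-3,
                           rho=1800.0, cp=1500.0, h=2000.0, dt=0.01, t_end=5.0)
```

A full treatment such as the paper's would resolve the temperature gradient inside the particle (the thermal diffusivity matters there); the lumped model is only the zeroth-order picture.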
A computer-simulated Stern-Gerlach laboratory
Schroeder, Daniel V
2015-01-01
We describe an interactive computer program that simulates Stern-Gerlach measurements on spin-1/2 and spin-1 particles. The user can design and run experiments involving successive spin measurements, illustrating incompatible observables, interference, and time evolution. The program can be used by students at a variety of levels, from non-science majors in a general interest course to physics majors in an upper-level quantum mechanics course. We give suggested homework exercises using the program at various levels.
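The successive-measurement rules such a program simulates can be sketched for spin-1/2. This is a minimal NumPy sketch of the standard quantum mechanics, not the program itself; the analyzer is restricted to axes in the x-z plane for simplicity:

```python
import numpy as np

def analyzer_states(theta):
    """Spin-up/-down eigenstates along an axis tilted by theta from z in the x-z plane."""
    up = np.array([np.cos(theta / 2.0), np.sin(theta / 2.0)])
    down = np.array([-np.sin(theta / 2.0), np.cos(theta / 2.0)])
    return up, down

def measure(state, theta):
    """Outcome probabilities of a Stern-Gerlach measurement along the tilted axis
    (Born rule: |<outcome|state>|^2)."""
    up, down = analyzer_states(theta)
    return abs(up @ state) ** 2, abs(down @ state) ** 2

# prepare spin-up along z, then measure along an axis tilted by 60 degrees
spin_up_z = np.array([1.0, 0.0])
p_plus, p_minus = measure(spin_up_z, np.pi / 3.0)
```

Chaining `measure` calls, with the state collapsing to the corresponding analyzer eigenstate after each outcome, reproduces the incompatible-observable behaviour of successive Stern-Gerlach experiments.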
IMPROVING TACONITE PROCESSING PLANT EFFICIENCY BY COMPUTER SIMULATION, Final Report
Energy Technology Data Exchange (ETDEWEB)
William M. Bond; Salih Ersayin
2007-03-30
This project involved industrial scale testing of a mineral processing simulator to improve the efficiency of a taconite processing plant, namely the Minorca mine. The Concentrator Modeling Center at the Coleraine Minerals Research Laboratory, University of Minnesota Duluth, enhanced the capabilities of available software, Usim Pac, by developing mathematical models needed for accurate simulation of taconite plants. This project provided funding for this technology to prove itself in the industrial environment. As the first step, data representing existing plant conditions were collected by sampling and sample analysis. Data were then balanced and provided a basis for assessing the efficiency of individual devices and the plant, and also for performing simulations aimed at improving plant efficiency. Performance evaluation served as a guide in developing alternative process strategies for more efficient production. A large number of computer simulations were then performed to quantify the benefits and effects of implementing these alternative schemes. Modification of makeup ball size was selected as the most feasible option for the target performance improvement. This was combined with replacement of existing hydrocyclones with more efficient ones. After plant implementation of these modifications, plant sampling surveys were carried out to validate findings of the simulation-based study. Plant data showed very good agreement with the simulated data, confirming results of simulation. After the implementation of modifications in the plant, several upstream bottlenecks became visible. Despite these bottlenecks limiting full capacity, concentrator energy improvement of 7% was obtained. Further improvements in energy efficiency are expected in the near future. The success of this project demonstrated the feasibility of a simulation-based approach. Currently, the Center provides simulation-based service to all the iron ore mining companies operating in northern
Application of artificial neural networks to identify equilibration in computer simulations
Leibowitz, Mitchell H.; Miller, Evan D.; Henry, Michael M.; Jankowski, Eric
2017-11-01
Determining which microstates generated by a thermodynamic simulation are representative of the ensemble for which sampling is desired is a ubiquitous, underspecified problem. Artificial neural networks are one type of machine learning algorithm that can provide a reproducible way to apply pattern recognition heuristics to underspecified problems. Here we use the open-source TensorFlow machine learning library and apply it to the problem of identifying which hypothetical observation sequences from a computer simulation are “equilibrated” and which are not. We generate training populations and test populations of observation sequences with embedded linear and exponential correlations. We train a two-neuron artificial network to distinguish the correlated and uncorrelated sequences. We find that this simple network is good enough for > 98% accuracy in identifying exponentially-decaying energy trajectories from molecular simulations.
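The approach can be illustrated with a single-neuron NumPy stand-in for the paper's two-neuron TensorFlow network. The drift feature, the synthetic "equilibrated" (white noise) and "unequilibrated" (exponential decay plus noise) populations, and the training parameters are all illustrative assumptions:

```python
import numpy as np

def drift_feature(seq):
    """Normalized difference between second- and first-half means:
    near zero for stationary (equilibrated) sequences, large otherwise."""
    half = len(seq) // 2
    return (np.mean(seq[half:]) - np.mean(seq[:half])) / (np.std(seq) + 1e-12)

def train_classifier(x, y, lr=0.5, epochs=500):
    """Logistic-regression neuron trained by batch gradient descent."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(w * x + b)))
        w -= lr * np.mean((p - y) * x)
        b -= lr * np.mean(p - y)
    return w, b

def predict(w, b, x):
    return (1.0 / (1.0 + np.exp(-(w * x + b))) > 0.5).astype(int)

rng = np.random.default_rng(1)
n, length = 200, 100
equil = rng.normal(0.0, 1.0, (n, length))                 # stationary, label 0
decay = (np.exp(-np.arange(length) / 20.0) * 5.0
         + rng.normal(0.0, 1.0, (n, length)))             # decaying, label 1
x = np.array([drift_feature(s) for s in np.vstack([equil, decay])])
y = np.array([0] * n + [1] * n)
w, b = train_classifier(x, y)
accuracy = np.mean(predict(w, b, x) == y)
```

The point, as in the paper, is that a very small learned model suffices once the observation sequences are reduced to features that expose the embedded correlations.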
Neurosurgical simulation by interactive computer graphics on iPad.
Maruyama, Keisuke; Kin, Taichi; Saito, Toki; Suematsu, Shinya; Gomyo, Miho; Noguchi, Akio; Nagane, Motoo; Shiokawa, Yoshiaki
2014-11-01
Presurgical simulation before complicated neurosurgery is a state-of-the-art technique, and its usefulness has recently become well known. However, simulation requires complex image processing, which hinders its widespread application. We explored handling the results of interactive computer graphics on the iPad tablet, which can easily be controlled anywhere. Data from preneurosurgical simulations from 12 patients (4 men, 8 women) who underwent complex brain surgery were loaded onto an iPad. First, DICOM data were loaded using Amira visualization software to create interactive computer graphics, and ParaView, another free visualization software package, was used to convert the results of the simulation for loading into the free iPad software KiwiViewer. The interactive computer graphics created prior to neurosurgery were successfully displayed and smoothly controlled on the iPad in all patients. The number of elements ranged from 3 to 13 (mean 7). The mean original data size was 233 MB, which was reduced to 10.4 MB (4.4% of the original size) after image processing by ParaView. This increased to 46.6 MB (19.9%) after decompression in KiwiViewer. Controlling the magnification, transfer, rotation, and selection of translucence in 10 levels of each element was smoothly and easily performed using one or two fingers. The requisite skill to smoothly control the iPad software was acquired within 1.8 trials on average by 12 medical students and 6 neurosurgical residents. Using an iPad to handle the results of preneurosurgical simulation was extremely useful because it could easily be handled anywhere.
Computer simulations of the atmospheric composition climate of Bulgaria
Energy Technology Data Exchange (ETDEWEB)
Gadzhev, G.; Ganev, K.; Syrakov, D.; Prodanova, M.; Georgieva, I.; Georgiev, G.
2015-07-01
Some extensive numerical simulations of the atmospheric composition fields in Bulgaria have recently been performed. The US EPA Models-3 system was chosen as the modelling tool. The NCEP Global Analysis Data with 1 degree resolution was used as the meteorological background, and the MM5 and CMAQ nesting capabilities were applied to downscale the simulations to a 3 km resolution over Bulgaria. The TNO emission inventory was used as emission input. Special pre-processing procedures were created for introducing temporal profiles and speciation of the emissions. The biogenic emissions of VOC are estimated by the model SMOKE. The simulations were carried out for the years 2000-2007. The numerical experiments were carried out for different emission scenarios, which makes it possible to evaluate the contribution of emissions from different source categories. The Models-3 “Integrated Process Rate Analysis” option is applied to discriminate the roles of different dynamic and chemical processes in air pollution formation. The obtained ensemble of numerical simulation results is extensive enough to allow statistical treatment – calculating not only the mean concentrations and the mean contribution fields of different source categories, but also standard deviations, skewness, etc., with their dominant temporal modes (seasonal and/or diurnal variations). Thus some basic facts about the atmospheric composition climate of Bulgaria can be retrieved from the simulation ensemble. (Author)
Computer modeling of road bridge for simulation moving load
Directory of Open Access Journals (Sweden)
Miličić Ilija M.
2016-01-01
This paper presents the computational modelling of a single-span road truss bridge with the roadway on the upper chord. The calculation models were treated as planar and spatial girders made up of 1D finite elements, using the computer-aided analysis applications Tower and Bridge Designer 2016 (2nd Edition). The conducted computer simulations compare the effects of moving loads according to the recommendations of two standards, SRPS and AASHTO. The bridge structure modelled in Bridge Designer 2016 (2nd Edition) was therefore modelled identically in the Tower environment. As important information for the selection of a computer application, we point out that Bridge Designer 2016 (2nd Edition) is unable to treat the moving-load model prescribed by the national standard V600.
COMPUTER SIMULATION THE MECHANICAL MOVEMENT BODY BY MEANS OF MATHCAD
Directory of Open Access Journals (Sweden)
Leonid Flehantov
2017-03-01
This paper considers the technique of using the computer mathematics system MathCAD for the computer implementation of a mathematical model of the mechanical motion of a body thrown at an angle to the horizon, and its use in educational computer simulation experiments when teaching the fundamentals of mathematical modeling. The advantages of MathCAD as an environment for implementing mathematical models in the second stage of higher education are noted. The paper describes the creation of a computer simulation model that allows comprehensive analysis of the mechanical movement of the body as the input parameters of the model are varied: the acceleration of gravity, the initial and final position of the body, the initial velocity and angle, and the geometric dimensions of the body and target. The technique is aimed at effective assimilation of basic knowledge and skills in mathematical modeling; it helps students better master the basic theoretical principles of mathematical modeling and related disciplines, promotes the development of logical thinking, improves motivation and cognitive interest, and forms research skills, thereby creating conditions for the effective formation of the professional competence of future specialists.
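The underlying model is elementary projectile motion; the key quantities can be computed in a few lines (here in Python rather than MathCAD, with drag neglected and a ground-level launch assumed):

```python
import math

def projectile(v0, angle_deg, g=9.81):
    """Flight time, horizontal range and maximum height of a body thrown
    from ground level at speed v0 (m/s) and the given angle (degrees)."""
    a = math.radians(angle_deg)
    flight_time = 2.0 * v0 * math.sin(a) / g        # t = 2 v0 sin(a) / g
    x_range = v0**2 * math.sin(2.0 * a) / g         # R = v0^2 sin(2a) / g
    h_max = (v0 * math.sin(a))**2 / (2.0 * g)       # H = (v0 sin a)^2 / (2g)
    return flight_time, x_range, h_max

t, r, h = projectile(20.0, 45.0)
```

Varying `v0`, `angle_deg` and `g` mirrors the parameter study the abstract describes for the MathCAD model.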
An FPGA computing demo core for space charge simulation
Energy Technology Data Exchange (ETDEWEB)
Wu, Jinyuan; Huang, Yifei; /Fermilab
2009-01-01
In accelerator physics, space charge simulation requires a large amount of computing power. In a particle system, each calculation requires time- and resource-consuming operations such as multiplications, divisions, and square roots. Because of the flexibility of field programmable gate arrays (FPGAs), we implemented this task with efficient use of the available computing resources and completely eliminated non-calculating operations that are indispensable in regular micro-processors (e.g. instruction fetch, instruction decoding, etc.). We designed and tested a 16-bit demo core for computing Coulomb's force in an Altera Cyclone II FPGA device. To save resources, the inverse square-root cube operation in our design is computed using a memory look-up table addressed with the nine to ten most significant non-zero bits. At a 200 MHz internal clock, our demo core reaches a throughput of 200 M pairs/s/core, faster than a typical 2 GHz micro-processor by about a factor of 10. Temperature and power consumption of FPGAs were also lower than those of micro-processors. Fast and convenient, FPGAs can serve as alternatives to time-consuming micro-processors for space charge simulation.
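The look-up-table idea can be sketched in software. This sketch uses a table linear in r² rather than the FPGA's most-significant-bits addressing, and the table size and range are illustrative assumptions:

```python
import numpy as np

N_BINS = 1024
R2_MAX = 4.0
# precompute (r^2)^(-3/2) = 1/r^3 at each bin centre of the quantized r^2 axis
_centres = (np.arange(N_BINS) + 0.5) * (R2_MAX / N_BINS)
INV_R3 = _centres ** -1.5

def coulomb_pair_force(dx, dy):
    """Force components on a unit charge from another unit charge at
    displacement (dx, dy): F = (dx, dy) / r^3, with 1/r^3 read from the
    table instead of computed with a square root and division."""
    r2 = dx * dx + dy * dy
    idx = min(int(r2 / R2_MAX * N_BINS), N_BINS - 1)
    return dx * INV_R3[idx], dy * INV_R3[idx]
```

With ~1000 entries the table-lookup force is accurate to a fraction of a percent over most of the range, which is the trade-off the 16-bit demo core exploits to avoid per-pair square roots and divisions.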
Mathematical and computational modeling and simulation fundamentals and case studies
Moeller, Dietmar P F
2004-01-01
Mathematical and Computational Modeling and Simulation - a highly multi-disciplinary field with ubiquitous applications in science and engineering - is one of the key enabling technologies of the 21st century. This book introduces the use of Mathematical and Computational Modeling and Simulation to develop an understanding of the solution characteristics of a broad class of real-world problems. The relevant basic and advanced methodologies are explained in detail, with special emphasis on ill-defined problems. Some 15 simulation systems are presented at the language and the logical level. Moreover, the reader can accumulate experience by studying a wide variety of case studies. The latter are briefly described within the book, but their full versions as well as some simulation software demos are available on the Web. The book can be used for university courses of different levels as well as for self-study. Advanced sections are marked and can be skipped in a first reading or in undergraduate courses...
Simulation of Tailrace Hydrodynamics Using Computational Fluid Dynamics Models
Energy Technology Data Exchange (ETDEWEB)
Cook, Christopher B.; Richmond, Marshall C.
2001-05-01
This report investigates the feasibility of using computational fluid dynamics (CFD) tools to investigate hydrodynamic flow fields surrounding the tailrace zone below large hydraulic structures. Previous and ongoing studies using CFD tools to simulate gradually varied flow with multiple constituents and forebay/intake hydrodynamics have shown that CFD tools can provide valuable information for hydraulic and biological evaluation of fish passage near hydraulic structures. These studies, however, are incapable of simulating the rapidly varying flow fields that involve breakup of the free surface, such as those through and below high-flow outfalls and spillways. Although the use of CFD tools for these types of flow is still an active area of research, initial applications discussed in this report show that these tools are capable of simulating the primary features of these highly transient flow fields.
Computational Fluid Dynamics and Building Energy Performance Simulation
DEFF Research Database (Denmark)
Nielsen, Peter Vilhelm; Tryggvason, T.
1998-01-01
An interconnection between a building energy performance simulation program and a Computational Fluid Dynamics program (CFD) for room air distribution will be introduced for improvement of the predictions of both the energy consumption and the indoor environment. The building energy performance...... simulation program requires a detailed description of the energy flow in the air movement which can be obtained by a CFD program. The paper describes an energy consumption calculation in a large building, where the building energy simulation program is modified by CFD predictions of the flow between three...... zones connected by open areas with pressure and buoyancy driven air flow. The two programs are interconnected in an iterative procedure. The paper shows also an evaluation of the air quality in the main area of the buildings based on CFD predictions. It is shown that an interconnection between a CFD...
Computer simulation of microgravity long-term effects and risk evaluation
Perez-Poch, Antoni
The objective of this work is to analyse and simulate possible long-term effects of microgravity on human pulmonary function. The efficacy of long-term regular exercise on relevant cardiovascular parameters when the human body is exposed to microgravity is also studied. Little is known today about the long-term effects microgravity may have on pulmonary function. No complete explanation of the possible risks exists, although some experiments are under way on the ISS to evaluate them. Computer simulations are an important tool that may be used to predict and analyse these possible effects and to compare them with in-flight experiments. We based our study on a previous computer model (NELME: Numerical Evaluation of Long-term Microgravity Effects), which was developed in our laboratory and validated with the available data, focusing on the cardiovascular parameters affected by changes in gravity exposure. In that previous work we simulated part of the cardiovascular system and applied the model to evaluate risks of blood-forming organ malfunction. NELME is based on an electrical-like control system model of the physiological changes that may occur when gravity changes are applied. The computer implementation has a modular architecture; hence, different output parameters, potential effects, organs and countermeasures can easily be implemented and evaluated. In this work we added a module to the system to analyse pulmonary function, with gravity and exposure time as input parameters. We then conducted a battery of simulations applying different values of g for long-term exposures. We found no significant evidence of changes, and no risks were foreseen. We also carried out an EVA simulation as a perturbation of the system (intense exercise, changes in breathed air) and studied the acute response. This is of great importance, as current mission requirements do not allow data collection immediately following real EVAs
Computational physics and applied mathematics capability review June 8-10, 2010
Energy Technology Data Exchange (ETDEWEB)
Lee, Stephen R [Los Alamos National Laboratory
2010-01-01
Los Alamos National Laboratory will review its Computational Physics and Applied Mathematics (CPAM) capabilities in 2010. The goals of capability reviews are to assess the quality of science, technology, and engineering (STE) performed by the capability, evaluate the integration of this capability across the Laboratory and within the scientific community, examine the relevance of this capability to the Laboratory's programs, and provide advice on the current and future directions of this capability. This is the first such review for CPAM, which has a long and unique history at the Laboratory, starting from the inception of the Laboratory in 1943. The CPAM capability covers an extremely broad technical area at Los Alamos, encompassing a wide array of disciplines, research topics, and organizations. A vast array of technical disciplines and activities are included in this capability, from general numerical modeling, to coupled multi-physics simulations, to detailed domain science activities in mathematics, methods, and algorithms. The CPAM capability involves over 12 different technical divisions and a majority of our programmatic and scientific activities. To make this large scope tractable, the CPAM capability is broken into the following six technical 'themes.' These themes represent technical slices through the CPAM capability and collect critical core competencies of the Laboratory, each of which contributes to the capability (and each of which is divided into multiple additional elements in the detailed descriptions of the themes in subsequent sections), as follows. Theme 1: Computational Fluid Dynamics - This theme speaks to the vast array of scientific capabilities for the simulation of fluids under shocks, low-speed flow, and turbulent conditions - which are key, historical, and fundamental strengths of the Laboratory. Theme 2: Partial Differential Equations - The technical scope of this theme is the applied mathematics and numerical solution
Schüller, Anton; Schweitzer, Marc
2017-01-01
The contributions gathered here provide an overview of current research projects and selected software products of the Fraunhofer Institute for Algorithms and Scientific Computing SCAI. They show the wide range of challenges that scientific computing currently faces, the solutions it offers, and its important role in developing applications for industry. Given the exciting field of applied collaborative research and development it discusses, the book will appeal to scientists, practitioners, and students alike. The Fraunhofer Institute for Algorithms and Scientific Computing SCAI combines excellent research and application-oriented development to provide added value for our partners. SCAI develops numerical techniques, parallel algorithms and specialized software tools to support and optimize industrial simulations. Moreover, it implements custom software solutions for production and logistics, and offers calculations on high-performance computers. Its services and products are based on state-of-the-art metho...
Troccaz, Jocelyne; Baumann, Michael; Berkelman, Peter; Cinquin, Philippe; Daanen, Vincent; LEROY, Antoine; Marchal, Maud; Payan, Yohan; Promayon, Emmanuel; Voros, Sandrine; Bart, Stéphane; Bolla, Michel; Chartier-Kastler, Emmanuel; Descotes, Jean-Luc; Dusserre, Andrée
2006-01-01
Until recently, Computer-Aided Medical Interventions (CAMI) and Medical Robotics have focused on rigid and non-deformable anatomical structures. Nowadays, special attention is paid to soft tissues, raising complex issues due to their mobility and deformation. Mini-invasive digestive surgery was probably one of the first fields where soft tissues were handled through the development of simulators, tracking of anatomical structures and specific assistance robots. However...
Elgohary, Tarek Adel Abdelsalam
In this Dissertation, computational and analytic methods are presented to address nonlinear systems with applications in structural and celestial mechanics. Scalar Homotopy Methods (SHM) are first introduced for the solution of general systems of nonlinear algebraic equations. The methods are applied to the solution of postbuckling and limit load problems of solids and structures as exemplified by simple plane elastic frames, considering only geometrical nonlinearities. In many problems, instead of simply adopting a root solving method, it is useful to study the particular problem in more detail in order to establish an especially efficient and robust method. Such a problem arises in satellite geodesy coordinate transformation where a new highly efficient solution, providing global accuracy with a non-iterative sequence of calculations, is developed. Simulation results are presented to compare the solution accuracy and algorithm performance for applications spanning the LEO-to-GEO range of missions. Analytic methods are introduced to address problems in structural mechanics and astrodynamics. Analytic transfer functions are developed to address the frequency domain control problem of flexible rotating aerospace structures. The transfer functions are used to design a Lyapunov stable controller that drives the spacecraft to a target position while suppressing vibrations in the flexible appendages. In astrodynamics, a Taylor series based analytic continuation technique is developed to address the classical two-body problem. A key algorithmic innovation for the trajectory propagation is that the classical averaged approximation strategy is replaced with a rigorous series based solution for exactly computing the acceleration derivatives. Evidence is provided to demonstrate that high precision solutions are easily obtained with the analytic continuation approach. For general nonlinear initial value problems (IVPs), the method of Radial Basis Functions time domain
Recent progress and modern challenges in applied mathematics, modeling and computational science
Makarov, Roman; Belair, Jacques
2017-01-01
This volume is an excellent resource for professionals in various areas of applications of mathematics, modeling, and computational science. It focuses on recent progress and modern challenges in these areas. The volume provides a balance between fundamental theoretical and applied developments, emphasizing the interdisciplinary nature of modern trends and detailing state-of-the-art achievements in Applied Mathematics, Modeling, and Computational Science. The chapters have been authored by international experts in their respective fields, making this book ideal for researchers in academia, practitioners, and graduate students. It can also serve as a reference in the diverse selected areas of applied mathematics, modelling, and computational sciences, and is ideal for interdisciplinary collaborations.
4th International Conference on Computer Science, Applied Mathematics and Applications
Do, Tien; Thi, Hoai; Nguyen, Ngoc
2016-01-01
These proceedings consist of 20 papers which have been selected and invited from the submissions to the 4th International Conference on Computer Science, Applied Mathematics and Applications (ICCSAMA 2016) held on 2-3 May, 2016 in Laxenburg, Austria. The conference is organized into 5 sessions: Advanced Optimization Methods and Their Applications, Models for ICT Applications, Topics on Discrete Mathematics, Data Analytic Methods and Applications, and Feature Extraction. All chapters in the book discuss theoretical and practical issues connected with computational methods and optimization methods for knowledge engineering. The editors hope that this volume can be useful for graduate and Ph.D. students and researchers in Applied Sciences, Computer Science and Applied Mathematics.
Cloud computing technologies applied in the virtual education of civil servants
Directory of Open Access Journals (Sweden)
Teodora GHERMAN
2016-03-01
Full Text Available From the perspective of education, e-learning through the use of Cloud Computing technologies represents one of the most important directions of educational software development, because Cloud Computing is developing rapidly and applies to all areas of the Information Society, including education. Virtual education systems on web platforms (e-learning) require numerous hardware and software resources. The convenience of Internet learning and the creation of web-based learning environments have become strengths of virtual education research, including the application of Cloud Computing technologies to the virtual education of civil servants. The article presents Cloud Computing technologies as a platform for virtual education on web platforms, and their advantages and disadvantages compared with other technologies.
Explicit contact modeling for surgical computer guidance and simulation
Johnsen, S. F.; Taylor, Z. A.; Clarkson, M.; Thompson, S.; Hu, M.; Gurusamy, K.; Davidson, B.; Hawkes, D. J.; Ourselin, S.
2012-02-01
Realistic modelling of mechanical interactions between tissues is an important part of surgical simulation, and may become a valuable asset in surgical computer guidance. Unfortunately, it is also computationally very demanding. Explicit matrix-free FEM solvers have been shown to be a good choice for fast tissue simulation; however, little work has been done on contact algorithms for such FEM solvers. This work introduces such an algorithm that is capable of handling both deformable-deformable (soft tissue interacting with soft tissue) and deformable-rigid (e.g. soft tissue interacting with surgical instruments) contacts. The proposed algorithm employs responses computed with a fully matrix-free, virtual node-based version of the model first used by Taylor and Flanagan in PRONTO3D. For contact detection, a bounding-volume hierarchy (BVH) capable of identifying self-collisions is introduced. The proposed BVH generation and update strategies comprise novel heuristics to minimise the number of bounding volumes visited in hierarchy update and collision detection. Aside from speed, stability was a major objective in the development of the algorithm; hence a novel method for computing response forces from C0-continuous normals, together with a gradual application of response forces from rate constraints, has been devised and incorporated in the scheme. The continuity of the surface normals has advantages particularly in applications such as sliding over irregular surfaces, which occurs, e.g., in simulated breathing. The effectiveness of the scheme is demonstrated on a number of meshes derived from medical image data and artificial test cases.
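As a rough illustration of the broad-phase stage that a BVH accelerates, the following sketch tests axis-aligned bounding boxes of triangle pairs for overlap. This is a hypothetical quadratic baseline, not the paper's hierarchy or its update heuristics:

```python
class AABB:
    """Axis-aligned bounding box used as a simple bounding volume."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def overlaps(self, other):
        # Boxes overlap iff they overlap on every axis.
        return all(a <= d and c <= b
                   for a, b, c, d in zip(self.lo, self.hi, other.lo, other.hi))

def aabb_of(points):
    """Tight AABB of a set of 3D points (e.g. a triangle)."""
    return AABB([min(c) for c in zip(*points)], [max(c) for c in zip(*points)])

def broad_phase(tris_a, tris_b):
    """Candidate triangle pairs whose boxes overlap; a BVH replaces this
    all-pairs loop with a logarithmic hierarchy traversal."""
    boxes_a = [aabb_of(t) for t in tris_a]
    boxes_b = [aabb_of(t) for t in tris_b]
    return [(i, j) for i, ba in enumerate(boxes_a)
                   for j, bb in enumerate(boxes_b) if ba.overlaps(bb)]
```

Pairs surviving the broad phase would then go to an exact narrow-phase test and to the response computation described in the abstract.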
Above the cloud computing: applying cloud computing principles to create an orbital services model
Straub, Jeremy; Mohammad, Atif; Berk, Josh; Nervold, Anders K.
2013-05-01
Large satellites and exquisite planetary missions are generally self-contained. They have, onboard, all of the computational, communications and other capabilities required to perform their designated functions. Because of this, the satellite or spacecraft carries hardware that may be utilized only a fraction of the time; however, the full cost of development and launch is still borne by the program. Small satellites do not have this luxury. Due to mass and volume constraints, they cannot afford to carry numerous pieces of barely utilized equipment or large antennas. This paper proposes a cloud-computing model for exposing satellite services in an orbital environment. Under this approach, each satellite with available capabilities broadcasts a service description for each service that it can provide (e.g., general computing capacity, DSP capabilities, specialized sensing capabilities, transmission capabilities, etc.) and its orbital elements. Consumer spacecraft retain a cache of service providers and select one utilizing decision-making heuristics (e.g., suitability of performance, opportunity to transmit instructions and receive results - based on the orbits of the two craft). The two craft negotiate service provisioning (e.g., when the service can be available and for how long) based on the operating rules prioritizing use of (and allowing access to) the service on the service provider craft, based on the credentials of the consumer. Service description, negotiation and sample service performance protocols are presented. The required components of each consumer or provider spacecraft are reviewed. These include fully autonomous control capabilities (for provider craft), a lightweight orbit determination routine (to determine when consumer and provider craft can see each other and, possibly, pointing requirements for craft with directional antennas) and an authentication and resource utilization priority-based access decision making subsystem (for provider craft
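A toy version of the consumer-side selection step might look like the following. The field names, scoring weights, and latency penalty are invented for illustration; the paper's heuristics also weigh credentials, negotiation rules, and pointing constraints:

```python
def select_provider(providers, need, now):
    """Pick a service provider from the consumer's cache that advertises
    the needed capability and is currently visible.
    `providers`: list of dicts with 'services' (set of capability names),
    'visible_until' (end of the contact window), and 'latency' (seconds).
    All field names are hypothetical, not from the paper."""
    candidates = [p for p in providers
                  if need in p["services"] and p["visible_until"] > now]
    # Heuristic: prefer a long remaining contact window, penalize link latency.
    return max(candidates,
               key=lambda p: (p["visible_until"] - now) - 10.0 * p["latency"],
               default=None)
```

In the proposed model the contact window itself would come from the broadcast orbital elements via the lightweight orbit determination routine.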
Energy Technology Data Exchange (ETDEWEB)
Bonek, Mirosław, E-mail: miroslaw.bonek@polsl.pl; Śliwa, Agata; Mikuła, Jarosław
2016-12-01
Highlights: • Prediction of the properties of a laser-remelted surface layer with the use of FEM analysis. • The simulation was applied to determine the shape of the molten pool of the remelted surface. • Applying the numerical FEM model to the simulation of surface laser treatment meaningfully shortens the time needed to select optimum parameters. • An FEM model was established for the purpose of building a computer simulation. - Abstract: Investigations include a Finite Element Method simulation model of remelting of the PMHSS6-5-3 high-speed steel surface layer using a high power diode laser (HPDL). The Finite Element Method computations were performed using ANSYS software. The scope of the FEM simulation was determination of the temperature distribution during the laser alloying process at various process configurations regarding the laser beam power and the method of powder deposition, as a pre-coated paste or a surface with machined grooves. The Finite Element Method simulation was performed on five different 3-dimensional models. The model assumed nonlinear changes of thermal conductivity, specific heat and density that depended on temperature. The heating process was realized as a heat flux corresponding to a laser beam power of 1.4, 1.7 and 2.1 kW. Latent heat effects are considered during solidification. The molten pool is composed of the same material as the substrate and there is no chemical reaction. The absorptivity of laser energy was dependent on the simulated materials' properties and their surface condition. The Finite Element Method simulation allows specifying the heat-affected zone and the temperature distribution in the sample as a function of time, and thus allows the estimation of the structural changes taking place during the laser remelting process. The simulation was applied to determine the shape of molten pool and the
Modelling and simulation of information systems on computer: methodological advantages.
Huet, B; Martin, J
1980-01-01
Modelling and simulation of information systems by means of miniatures on computer aim at two general objectives: (a) an aid to the design and realization of information systems; and (b) a tool to improve the dialogue between the designer and the users. An operational information system has two components bound by a dynamic relationship: an information system and a behavioural system. Thanks to the behavioural system, modelling and simulation allow the designer to integrate into the project a large proportion of the system's implicit specification. The advantages of modelling for the information system relate to: (a) The conceptual phase: initial objectives are compared with the results of simulation and sometimes modified. (b) The external specifications: simulation is particularly useful for personalising man-machine relationships in each application. (c) The internal specifications: if the miniatures are built on the concept of process, the global design and the software are tested; the simulation also refines the configuration and directs the choice of hardware. (d) The implementation: simulation reduces costs and time and allows testing. Progress in modelling techniques will undoubtedly lead to better information systems.
Three Dimensional Computer Graphics Federates for the 2012 Smackdown Simulation
Fordyce, Crystal; Govindaiah, Swetha; Muratet, Sean; O'Neil, Daniel A.; Schricker, Bradley C.
2012-01-01
The Simulation Interoperability Standards Organization (SISO) Smackdown is a two-year-old annual event held at the 2012 Spring Simulation Interoperability Workshop (SIW). A primary objective of the Smackdown event is to provide college students with hands-on experience in developing distributed simulations using the High Level Architecture (HLA). Participating for the second time, the University of Alabama in Huntsville (UAHuntsville) deployed four federates: two simulated a communications server and a lunar communications satellite with a radio, and the other two generated 3D computer graphics displays for the communication satellite constellation and for the surface-based lunar resupply mission. Using the Light-Weight Java Graphics Library, the satellite display federate presented a lunar-texture-mapped sphere of the moon and four Telemetry Data Relay Satellites (TDRS), which received object attributes from the lunar communications satellite federate to drive their motion. The surface mission display federate was an enhanced version of the federate developed by ForwardSim, Inc. for the 2011 Smackdown simulation. Enhancements included a dead-reckoning algorithm and a visual indication of which communication satellite was in line of sight of Hadley Rille. This paper concentrates on these two federates by describing their functions, algorithms, the HLA object attributes received from other federates, development experiences, and recommendations for future participating Smackdown teams.
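First-order dead reckoning of the kind added to the surface display federate extrapolates the last reported state between HLA attribute updates; a minimal sketch (the federate's actual algorithm and state representation may differ):

```python
def dead_reckon(pos, vel, t_last, t_now):
    """Extrapolate the last reported position along the last reported
    velocity: p(t_now) = p(t_last) + v * (t_now - t_last).
    Higher-order variants also carry acceleration."""
    dt = t_now - t_last
    return tuple(p + v * dt for p, v in zip(pos, vel))
```

Between updates the display federate draws the extrapolated position; when a new attribute update arrives, the reference state is simply replaced.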
First International Symposium on Applied Computing and Information Technology (ACIT 2013)
Applied Computing and Information Technology
2014-01-01
This book presents the selected results of the 1st International Symposium on Applied Computers and Information Technology (ACIT 2013) held on August 31 – September 4, 2013 in Matsue City, Japan, which brought together researchers, scientists, engineers, industry practitioners, and students to discuss all aspects of Applied Computers & Information Technology, and its practical challenges. This book includes the best 12 papers presented at the conference, which were chosen based on review scores submitted by members of the program committee and underwent further rigorous rounds of review.
Computer simulation of methanol exchange dynamics around cations and anions
Energy Technology Data Exchange (ETDEWEB)
Roy, Santanu; Dang, Liem X.
2016-03-03
In this paper, we present the first computer simulation of methanol exchange dynamics between the first and second solvation shells around different cations and anions. After water, methanol is the most frequently used solvent for ions. Methanol has different structural and dynamical properties than water, so its ion solvation process is different. To this end, we performed molecular dynamics simulations using polarizable potential models to describe methanol-methanol and ion-methanol interactions. In particular, we computed methanol exchange rates by employing the transition state theory, the Impey-Madden-McDonald method, the reactive flux approach, and the Grote-Hynes theory. We observed that methanol exchange occurs at a nanosecond time scale for Na+ and at a picosecond time scale for other ions. We also observed a trend in which, for like charges, the exchange rate is slower for smaller ions because they are more strongly bound to methanol. This work was supported by the US Department of Energy, Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences, and Biosciences. The calculations were carried out using computer resources provided by the Office of Basic Energy Sciences.
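One ingredient of such rate calculations, in the spirit of the Impey-Madden-McDonald analysis the authors employ, is counting confirmed departures from the first solvation shell while ignoring brief re-crossings. A schematic sketch (the threshold t* and the boolean time-series representation are illustrative, not the paper's implementation):

```python
def exchange_events(in_shell, t_star=2):
    """Count confirmed solvent-exchange (departure) events from a boolean
    time series in_shell[t] (True if the methanol is in the first shell
    at step t). Excursions shorter than t_star steps are treated as
    transient re-crossings and ignored, IMM-style."""
    events, t, n = 0, 0, len(in_shell)
    while t < n - 1:
        if in_shell[t] and not in_shell[t + 1]:
            out = in_shell[t + 1:t + 1 + t_star]
            if len(out) == t_star and not any(out):  # stayed out long enough
                events += 1
                t += t_star
        t += 1
    return events
```

Dividing the event count by the total simulation time (and by the number of tracked molecules) gives the exchange rate; the nanosecond-scale rate for Na+ versus picosecond scales for other ions reflects how rarely such events occur.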
A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION
Energy Technology Data Exchange (ETDEWEB)
Mike Bockelie; Dave Swensen; Martin Denison; Zumao Chen; Temi Linjewile; Mike Maguire; Adel Sarofim; Connie Senior; Changguan Yang; Hong-Shig Shim
2004-04-28
This is the fourteenth Quarterly Technical Report for DOE Cooperative Agreement No. DE-FC26-00NT41047. The goal of the project is to develop and demonstrate a Virtual Engineering-based framework for simulating the performance of Advanced Power Systems. Within the last quarter, good progress has been made on all aspects of the project. Software development efforts have focused primarily on completing a prototype detachable user interface for the framework and on integrating Carnegie Mellon University's IECM model core with the computational engine. In addition to this work, progress has been made on several other development and modeling tasks for the program. These include: (1) improvements to the infrastructure code of the computational engine, (2) enhancements to the model interfacing specifications, (3) additional development to increase the robustness of all framework components, (4) enhanced coupling of the computational and visualization engine components, (5) a series of detailed simulations studying the effects of gasifier inlet conditions on the heat flux to the gasifier injector, and (6) detailed plans for implementing models for mercury capture for both warm and cold gas cleanup.
A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION
Energy Technology Data Exchange (ETDEWEB)
Mike Bockelie; Dave Swensen; Martin Denison; Connie Senior; Zumao Chen; Temi Linjewile; Adel Sarofim; Bene Risio
2003-04-25
This is the tenth Quarterly Technical Report for DOE Cooperative Agreement No: DE-FC26-00NT41047. The goal of the project is to develop and demonstrate a computational workbench for simulating the performance of Vision 21 Power Plant Systems. Within the last quarter, good progress has been made on all aspects of the project. Calculations for a full Vision 21 plant configuration have been performed for two gasifier types. An improved process model for simulating entrained flow gasifiers has been implemented into the workbench. Model development has focused on: a pre-processor module to compute global gasification parameters from standard fuel properties and intrinsic rate information; a membrane based water gas shift; and reactors to oxidize fuel cell exhaust gas. The data visualization capabilities of the workbench have been extended by implementing the VTK visualization software that supports advanced visualization methods, including inexpensive Virtual Reality techniques. The ease-of-use, functionality and plug-and-play features of the workbench were highlighted through demonstrations of the workbench at a DOE sponsored coal utilization conference. A white paper has been completed that contains recommendations on the use of component architectures, model interface protocols and software frameworks for developing a Vision 21 plant simulator.
An Embedded System for applying High Performance Computing in Educational Learning Activity
Directory of Open Access Journals (Sweden)
Irene Erlyn Wina Rachmawan
2016-08-01
Full Text Available HPC (High Performance Computing) has become more popular in the last few years. With its high computational power, HPC has an impact on industry, scientific research and educational activities. Implementing HPC as a curriculum in universities can consume a lot of resources, because well-known HPC systems are built from personal computers or servers, and using PCs as practical modules requires considerable resources and space. This paper presents an innovative high performance computing cluster system to support educational learning activities in an HPC course that is small, low cost, and yet powerful enough. High performance computing is usually implemented as cluster computing and requires high-specification, expensive computers, which makes it inefficient to apply in educational activities such as classroom learning. Our proposed system is therefore built from inexpensive embedded components, making high performance computing applicable to learning in class. Students are involved in the construction of the embedded system: they build clusters from basic embedded and network components, benchmark performance, and implement simple parallel cases on the cluster. In our evaluation comparing the embedded system with i5 PCs, the NAS benchmark performance of the embedded system was similar to that of the i5 PCs. We also conducted surveys of student learning satisfaction, which showed that with the embedded system students are able to learn about HPC from building the system through to writing an application that uses it.
Possibilities and importance of using computer games and simulations in educational process
Directory of Open Access Journals (Sweden)
Danilović Mirčeta S.
2003-01-01
Full Text Available The paper discusses whether it is possible and appropriate to use simulations (simulation games) and traditional games in the process of education. It is stressed that the terms "game" and "simulation" can and should be taken in a broader sense, although they are chiefly investigated herein as video/computer games and simulations. Any activity combining the properties of a game (competition, rules, players) and the properties of a simulation (i.e. an operational presentation of reality) should be understood as a simulation game, with role-play constituting their essence and basis. In these games the student assumes a new identity, identifies with another personality and responds similarly. Game rules are the basic and most important conditions for a game's existence, accomplishment and goal achievement. Games and simulations enable a student to acquire experience and practice, i.e. to do exercises in nearly similar or identical life situations, to develop cognitive and psycho-motor abilities and skills, to acquire knowledge, to develop, create and change attitudes and value criteria, and to develop perception of other people's feelings and attitudes. It is obligatory for the teacher to prepare carefully before using and applying simulation games in teaching.
Simultaneous computation within a sequential process simulation tool
Directory of Open Access Journals (Sweden)
G. Endrestøl
1989-10-01
Full Text Available The paper describes an equation solver superstructure developed for a sequential modular dynamic process simulation system as part of a Eureka project with Norwegian and British participation. The purpose of the development was to combine some of the advantages of equation-based and purely sequential systems, enabling implicit treatment of key variables independent of module boundaries, and the use of numerical integration techniques suitable for each individual type of variable. For training simulator applications the main advantages are gains in speed, due to increased stability limits on time steps, and improved consistency of simulation results. The system is split into an off-line analysis phase and an on-line equation solver. The off-line processing consists of automatic determination of the topological structure of the system connectivity from standard process description files, and derivation of an optimized sparse matrix solution procedure for the resulting set of equations. The on-line routine collects equation coefficients from the modules involved, solves the combined sets of structured equations, and stores the results appropriately. This method minimizes the processing cost during the actual simulation. The solver has been applied in the Veslefrikk training simulator project.
Simulating quantum systems on classical computers with matrix product states
Energy Technology Data Exchange (ETDEWEB)
Kleine, Adrian
2010-11-08
In this thesis, the numerical simulation of strongly interacting many-body quantum-mechanical systems using matrix product states (MPS) is considered. Matrix product states are a novel representation of arbitrary quantum many-body states. Using quantum information theory, it is possible to show that matrix product states provide a polynomial-sized representation of one-dimensional quantum systems, thus allowing an efficient simulation of one-dimensional quantum systems on classical computers. Matrix product states form the conceptual framework of the density-matrix renormalization group (DMRG). After a general introduction in the first chapter of this thesis, the second chapter deals with matrix product states, focusing on the development of fast and stable algorithms. To obtain algorithms that efficiently calculate ground states, the density-matrix renormalization group is reformulated in the matrix product state framework. Further, time-dependent problems are considered. Two different algorithms are presented, one based on a Trotter decomposition of the time-evolution operator, the other on Krylov subspaces. Finally, the evaluation of dynamical spectral functions is discussed, and a correction-vector-based method is presented. In the following chapters, the methods presented in the second chapter are applied to a number of different physical problems. The third chapter deals with the existence of chiral phases in isotropic one-dimensional quantum spin systems. A preceding analytical study based on a mean-field approach indicated the possible existence of those phases in an isotropic Heisenberg model with a frustrating zig-zag interaction and a magnetic field. In this thesis, the existence of the chiral phases is shown numerically by using MPS-based algorithms. In the fourth chapter, we propose an experiment using ultracold atomic gases in optical lattices, which allows a well controlled observation of the spin-charge separation (of
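The polynomial-sized MPS representation mentioned in the abstract can be illustrated with a minimal sketch (an illustrative decomposition, not the thesis's DMRG implementation; the function and parameter names are our own): any state vector over L sites is split into a chain of site tensors by successive singular value decompositions, with the bond dimension capped at `chi_max`.

```python
import numpy as np

def to_mps(psi, d, L, chi_max=16):
    """Decompose a state vector of L sites with local dimension d into
    matrix product state tensors via successive SVDs.

    Returns a list of L tensors with shapes (chi_left, d, chi_right);
    singular values beyond chi_max are discarded (truncation).
    """
    tensors = []
    chi = 1
    rest = np.asarray(psi).reshape(1, -1)
    for _ in range(L):
        rest = rest.reshape(chi * d, -1)
        u, s, vh = np.linalg.svd(rest, full_matrices=False)
        keep = min(len(s), chi_max)              # truncate the bond dimension
        u, s, vh = u[:, :keep], s[:keep], vh[:keep]
        tensors.append(u.reshape(chi, d, keep))
        rest = np.diag(s) @ vh                   # push the remainder to the right
        chi = keep
    # absorb the final 1x1 remainder (overall norm/phase) into the last tensor
    tensors[-1] = np.tensordot(tensors[-1], rest, axes=(2, 0))
    return tensors
```

Contracting the tensors back together reproduces the original state exactly whenever `chi_max` is large enough; for a generic state the required bond dimension grows exponentially with L, but for one-dimensional ground states it stays small, which is the efficiency argument the abstract refers to.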
Computer simulation of confined and flexoelectric liquid crystalline systems
Barmes, F
2003-01-01
In this Thesis, confined and flexoelectric liquid crystalline systems have been studied using molecular computer simulations. The aim of this work was to provide a molecular model of a bistable display cell in which switching is induced through the application of directional electric field pulses. In the first part of this Thesis, the study of confined systems of liquid crystalline particles is addressed. Computation of the anchoring phase diagrams for three different surface interaction models showed that the hard needle wall and rod-surface potentials induce both planar and homeotropic alignment separated by a bistability region, this being stronger and wider for the rod-surface variant. The results obtained using the rod-sphere surface model, in contrast, showed that tilted surface arrangements can be induced by surface adsorption mechanisms. Equivalent studies of hybrid anchored systems showed that a bend director structure can be obtained in a slab with monostable homeotropic anchoring at the...
Computational simulation of structural fracture in fiber composites
Chamis, C. C.; Murthy, P. L. N.
1990-01-01
A methodology was developed for the computational simulation of structural fracture in fiber composites. This methodology consists of step-by-step procedures for mixed mode fracture in generic components and of an integrated computer code, Composite Durability Structural Analysis (CODSTRAN). The generic types of composite structural fracture include single and combined mode fracture in beams, laminate free-edge delamination fracture, and laminate center flaw progressive fracture. Structural fracture is assessed in one or all of the following: (1) the displacements increase very rapidly; (2) the frequencies decrease very rapidly; (3) the buckling loads decrease very rapidly; or (4) the strain energy release rate increases very rapidly. These rapid changes are herein assumed to denote imminent structural fracture. Based on these rapid changes, parameters/guidelines are identified which can be used as criteria for structural fracture, inspection intervals, and retirement for cause.
Computational strategies in the dynamic simulation of constrained flexible MBS
Amirouche, F. M. L.; Xie, M.
1993-01-01
This research focuses on the computational dynamics of flexible constrained multibody systems. First, a recursive mapping formulation of the kinematical expressions in a minimum dimension, together with the matrix representation of the equations of motion, is presented. The method employs Kane's equation, FEM, and concepts of continuum mechanics. The generalized active forces are extended to include the effects of high-temperature conditions, such as creep, thermal stress, and elastic-plastic deformation. The time-variant constraint relations for rolling/contact conditions between two flexible bodies are also studied. The constraints for validation of MBS simulation of gear meshing contact, using a modified Timoshenko beam theory, are also presented. The last part deals with minimization of vibration/deformation of the elastic beam in multibody systems, making use of time-variant boundary conditions. The methodologies and computational procedures developed are being implemented in a program called DYAMUS.
Olson, Branden; Kleiber, William
2017-04-01
Stochastic precipitation generators (SPGs) produce synthetic precipitation data and are frequently used to generate inputs for physical models throughout many scientific disciplines. Especially for large data sets, statistical parameter estimation is difficult due to the high dimensionality of the likelihood function. We propose techniques to estimate SPG parameters for spatiotemporal precipitation occurrence based on an emerging set of methods called Approximate Bayesian computation (ABC), which bypass the evaluation of a likelihood function. Our statistical model employs a thresholded Gaussian process that reduces to a probit regression at single sites. We identify appropriate ABC penalization metrics for our model parameters to produce simulations whose statistical characteristics closely resemble those of the observations. Spell length metrics are appropriate for single sites, while a variogram-based metric is proposed for spatial simulations. We present numerical case studies at sites in Colorado and Iowa where the estimated statistical model adequately reproduces local and domain statistics.
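The likelihood-free estimation idea at the heart of ABC can be shown with a generic rejection sampler (a single-site toy with a wet-day-fraction summary statistic, not the authors' thresholded-Gaussian-process model with spell-length or variogram metrics; all names here are illustrative):

```python
import numpy as np

def abc_rejection(s_obs, simulate, prior_sample, distance, n_draws=5000, eps=0.05):
    """Plain rejection ABC: draw parameters from the prior, simulate data,
    and keep only draws whose summary statistic lands within eps of the
    observed one -- no likelihood evaluation is ever needed."""
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample()
        if distance(simulate(theta), s_obs) < eps:
            accepted.append(theta)
    return np.array(accepted)
```

For a Bernoulli occurrence process with a uniform prior on the wet-day probability, the accepted draws approximate the posterior; tightening `eps` trades acceptance rate for accuracy, which is why metric choice matters for the spatial case described above.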
Computer simulation of aqueous Na-Cl electrolytes
Energy Technology Data Exchange (ETDEWEB)
Hummer, G. [Los Alamos National Lab., NM (United States); Soumpasis, D.M. [Max-Planck-Institut fuer Biophysikalische Chemie (Karl-Friedrich-Bonhoeffer-Institut), Goettingen (Germany); Neumann, M. [Vienna Univ. (Austria). Inst. fuer Experimentalphysik
1993-11-01
The equilibrium structure of aqueous Na-Cl electrolytes at concentrations between 1 and 5 mol/l is studied by means of molecular dynamics computer simulation using interaction site descriptions of water and the ionic components. Electrostatic interactions are treated both with the newly developed charged-clouds scheme and with Ewald summation. In the case of a 5 mol/l electrolyte, the results for pair correlations obtained by the two methods are in excellent agreement. However, the charged-clouds technique is much faster than Ewald summation and makes simulations at lower salt concentrations feasible. It is found that both ion-water and ion-ion correlation functions depend only weakly on the ionic concentration. Sodium and chloride ions exhibit only a negligible tendency to form contact pairs. In particular, no chloride ion pairs in contact are observed.
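Pair correlations such as those compared in this study are typically accumulated from particle configurations as a radial distribution function g(r). A minimal sketch for a cubic periodic box using the minimum-image convention (our own illustrative code, unrelated to the charged-clouds or Ewald machinery) is:

```python
import numpy as np

def rdf(positions, box, n_bins=50, r_max=None):
    """Radial distribution function g(r) for N particles in a cubic box
    of side `box`, using the minimum-image convention (valid for r < box/2)."""
    n = len(positions)
    if r_max is None:
        r_max = box / 2
    edges = np.linspace(0.0, r_max, n_bins + 1)
    counts = np.zeros(n_bins)
    for i in range(n - 1):
        d = positions[i + 1:] - positions[i]
        d -= box * np.round(d / box)          # minimum-image displacement
        r = np.linalg.norm(d, axis=1)
        counts += np.histogram(r[r < r_max], bins=edges)[0]
    rho = n / box**3
    shell = 4.0 / 3.0 * np.pi * (edges[1:]**3 - edges[:-1]**3)
    ideal = rho * shell * n / 2               # expected pair counts, ideal gas
    return 0.5 * (edges[1:] + edges[:-1]), counts / ideal
```

Normalizing by the ideal-gas expectation makes g(r) approach 1 at large separations; peaks below that baseline's first shell reveal the ion-water and ion-ion structure discussed in the abstract.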
A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION
Energy Technology Data Exchange (ETDEWEB)
Mike Bockelie; Dave Swensen; Martin Denison; Connie Senior; Adel Sarofim; Bene Risio
2002-07-28
This is the seventh Quarterly Technical Report for DOE Cooperative Agreement No.: DE-FC26-00NT41047. The goal of the project is to develop and demonstrate a computational workbench for simulating the performance of Vision 21 Power Plant Systems. Within the last quarter, good progress has been made on the development of the IGCC workbench. A series of parametric CFD simulations for single stage and two stage generic gasifier configurations have been performed. An advanced flowing slag model has been implemented into the CFD based gasifier model. A literature review has been performed on published gasification kinetics. Reactor models have been developed and implemented into the workbench for the majority of the heat exchangers, gas clean up system and power generation system for the Vision 21 reference configuration. Modifications to the software infrastructure of the workbench have been commenced to allow interfacing to the workbench reactor models that utilize the CAPE-Open software interface protocol.
Simulation of computed radiography with imaging plate detectors
Tisseur, D.; Costin, M.; Mathy, F.; Schumm, A.
2014-02-01
Computed radiography (CR) using phosphor imaging plate detectors is playing an increasing role in radiographic testing. CR uses equipment similar to that of conventional radiography, except that the classical X-ray film is replaced by a digital detector, called an image plate (IP), which is made of a photostimulable layer and is read by a scanning device through photostimulated luminescence. Such digital radiography has already demonstrated important benefits in terms of exposure time, reduction of source energies and thus reduction of the radioprotection area, besides being a solution without effluents. This paper presents a model for the simulation of radiography with image plate detectors in CIVA, together with examples of validation of the model. The study consists of a cross-comparison between experimental and simulation results obtained on a step wedge with a classical X-ray tube. Results are presented in particular with wire image quality indicators (IQI) and duplex IQIs.
Insights from molecular dynamics simulations for computational protein design.
Childers, Matthew Carter; Daggett, Valerie
2017-02-01
A grand challenge in the field of structural biology is to design and engineer proteins that exhibit targeted functions. Although much success on this front has been achieved, design success rates remain low, an ever-present reminder of our limited understanding of the relationship between amino acid sequences and the structures they adopt. In addition to experimental techniques and rational design strategies, computational methods have been employed to aid in the design and engineering of proteins. Molecular dynamics (MD) is one such method that simulates the motions of proteins according to classical dynamics. Here, we review how insights into protein dynamics derived from MD simulations have influenced the design of proteins. One of the greatest strengths of MD is its capacity to reveal information beyond what is available in the static structures deposited in the Protein Data Bank. In this regard, simulations can be used to directly guide protein design by providing atomistic details of the dynamic molecular interactions contributing to protein stability and function. MD simulations can also be used as a virtual screening tool to rank, select, identify, and assess potential designs. MD is uniquely poised to inform protein design efforts where the application requires realistic models of protein dynamics and atomic level descriptions of the relationship between dynamics and function. Here, we review cases where MD simulations were used to modulate protein stability and protein function by providing information regarding the conformation(s), conformational transitions, interactions, and dynamics that govern stability and function. In addition, we discuss cases where conformations from protein folding/unfolding simulations have been exploited for protein design, yielding novel outcomes that could not be obtained from static structures.
Nonlinear simulations with and computational issues for NIMROD
Energy Technology Data Exchange (ETDEWEB)
Sovinec, C.R. [Los Alamos National Lab., NM (United States)
1998-12-31
The NIMROD (Non-Ideal Magnetohydrodynamics with Rotation, Open Discussion) code development project was commissioned by the US Department of Energy in February, 1996 to provide the fusion research community with a computational tool for studying low-frequency behavior in experiments. Specific problems of interest include the neoclassical evolution of magnetic islands and the nonlinear behavior of tearing modes in the presence of rotation and nonideal walls in tokamaks; they also include topics relevant to innovative confinement concepts such as magnetic turbulence. Besides having physics models appropriate for these phenomena, an additional requirement is the ability to perform the computations in realistic geometries. The NIMROD Team is using contemporary management and computational methods to develop a computational tool for investigating low-frequency behavior in plasma fusion experiments. The authors intend to make the code freely available, and are taking steps to make it as easy to learn and use as possible. An example application for NIMROD is the nonlinear toroidal RFP simulation--the first in a series to investigate how toroidal geometry affects MHD activity in RFPs. Finally, the most important issue facing the project is execution time, and they are exploring better matrix solvers and a better parallel decomposition to address this.
Simulated Sustainable Societies: Students' Reflections on Creating Future Cities in Computer Games
Nilsson, Elisabet M.; Jakobsson, Anders
2011-02-01
The empirical study in this article involved 42 students (ages 14-15), who used the urban simulation computer game SimCity 4 to create models of sustainable future cities. The aim was to explore in what ways the simulated "real" worlds provided by this game could be a potential facilitator for science learning contexts. The topic investigated is how interactions in this gaming environment, and reflections on these interactions, can form a context where the students deal with real-world problems, and where they can contextualise and apply their scientific knowledge. Focus group interviews and video recordings were used to gather data on students' reflections on their cities and on sustainable development. The findings indicate that SimCity 4 actually contributes to creating meaningful educational situations in science classrooms, and that computer games can constitute an important artefact that may facilitate contextualisation and make students' use of science concepts and theories more explicit.
Time-partitioning simulation models for calculation on parallel computers
Milner, Edward J.; Blech, Richard A.; Chima, Rodrick V.
1987-01-01
A technique allowing time-staggered solution of partial differential equations is presented in this report. Using this technique, called time-partitioning, simulation execution speedup is proportional to the number of processors used because all processors operate simultaneously, with each updating of the solution grid at a different time point. The technique is limited by neither the number of processors available nor by the dimension of the solution grid. Time-partitioning was used to obtain the flow pattern through a cascade of airfoils, modeled by the Euler partial differential equations. An execution speedup factor of 1.77 was achieved using a two processor Cray X-MP/24 computer.
Dilbert-Peter model of organization effectiveness: computer simulations
Sobkowicz, Pawel
2010-01-01
We provide a technical report on a computer simulation of the overall effectiveness of a hierarchical organization, which depends on two main factors: the effects of promotion to managerial levels, and individual employees' efforts at self-promotion, which reduce their actual productivity. The combination of judgement by appearance in promotion to higher levels of the hierarchy and the Peter Principle (which states that people are promoted to their level of incompetence) results in a fast decline in the effectiveness of the organization. The model uses a few synthetic parameters aimed at reproducing realistic conditions in typical multilayer organizations.
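A toy agent-based sketch conveys the flavor of such promotion models (a generic illustration of the Peter-Principle mechanism, not Sobkowicz's actual model or parameters; all names are our own): competence either carries over on promotion ("common sense") or is redrawn at random (the Peter hypothesis that past performance does not predict the new job), and organizational efficiency is a level-weighted mean competence.

```python
import numpy as np

def org_efficiency(n_levels=4, n_per_level=20, steps=200, common_sense=True, seed=0):
    """Toy hierarchical promotion model. Each step, a vacancy opens at a
    random managerial level and is filled by the best performer from the
    level below; with common_sense=False the promotee's competence at the
    new job is redrawn uniformly at random (Peter-Principle assumption)."""
    rng = np.random.default_rng(seed)
    comp = [rng.random(n_per_level) for _ in range(n_levels)]
    weights = np.arange(1, n_levels + 1)           # higher levels weigh more
    for _ in range(steps):
        lvl = rng.integers(1, n_levels)             # vacancy at this level
        best = np.argmax(comp[lvl - 1])             # promote the best performer below
        new_comp = comp[lvl - 1][best] if common_sense else rng.random()
        comp[lvl][np.argmin(comp[lvl])] = new_comp  # replace the weakest incumbent
        comp[lvl - 1][best] = rng.random()          # vacancy below filled by a new hire
    return sum(w * c.mean() for w, c in zip(weights, comp)) / weights.sum()
```

Running both variants with the same seed shows the qualitative effect reported in the abstract: redrawing competence on promotion leaves the upper levels mediocre, while competence carry-over steadily enriches them.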
Computer simulations for biological aging and sexual reproduction
Directory of Open Access Journals (Sweden)
STAUFFER DIETRICH
2001-01-01
Full Text Available The sexual version of the Penna model of biological aging, simulated since 1996, is compared here with alternative forms of reproduction as well as with models not involving aging. In particular we want to check how sexual forms of life could have evolved and won over earlier asexual forms hundreds of millions of years ago. This computer model is based on the mutation-accumulation theory of aging, using bit-strings to represent the genome. Its population dynamics is studied by Monte Carlo methods.
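The asexual variant of the Penna bit-string model can be sketched in a few lines (a simplified illustration under assumed parameters T, R and a 32-bit genome, not the paper's code): bit i of the genome is a deleterious mutation that switches on at age i, death occurs once T mutations are active or via the Verhulst crowding factor, and mature individuals reproduce with one fresh mutation.

```python
import numpy as np

def penna_step(genomes, ages, T=3, R=8, n_max=10000, rng=None):
    """One Monte Carlo step of a minimal asexual Penna model.

    genomes: int64 array of 32-bit genomes; ages: int array of ages.
    An individual dies when T of the bits up to its current age are set,
    at random with probability N/n_max (Verhulst factor), or at age 32.
    """
    rng = rng or np.random.default_rng()
    ages = ages + 1
    # count deleterious mutations already switched on (bits below the age)
    active = np.array([bin(g & ((1 << a) - 1)).count("1")
                       for g, a in zip(genomes, ages)])
    alive = (active < T) & (rng.random(len(genomes)) > len(genomes) / n_max) \
            & (ages < 32)
    genomes, ages = genomes[alive], ages[alive]
    parents = genomes[ages >= R]                   # mature individuals reproduce
    offspring = parents | (1 << rng.integers(0, 32, len(parents)))  # one new mutation
    genomes = np.concatenate([genomes, offspring])
    ages = np.concatenate([ages, np.zeros(len(offspring), dtype=int)])
    return genomes, ages
```

Iterating this step from a perfect (all-zero) founding population lets mutation accumulation and selection play out, reproducing the characteristic Penna survival curves that the sexual version of the model extends.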
Directory of Open Access Journals (Sweden)
В В Гриншкун
2016-12-01
Full Text Available The article describes the experience of running design-based ICT training courses in schools of the International Baccalaureate system. This approach is compared with the Russian experience of teaching informatics at school. Approaches to incorporating students' research activity into the teaching of computer science are also discussed in the article.
The High-Tech Humanist: Multimedia Computing in a Senior Applied Ethics Seminar.
McKinney, William J.
Improved computer technology presents philosophers with the means to enhance their applied ethics classes by providing the opportunity to explore myriad practical and conceptual problems heretofore impossible in the traditional classroom. This paper examines some of the potential and problems inherent in using computerized techniques in Southeast…
Applied Linguistics Project: Student-Led Computer Assisted Research in High School EAL/EAP
Bohát, Róbert; Rödlingová, Beata; Horáková, Nina
2015-01-01
The Applied Linguistics Project (ALP) started at the International School of Prague (ISP) in 2013. Every year, Grade 9 English as an Additional Language (EAL) students identify an area of learning in need of improvement and design a research method followed by data collection and analysis using basic computer software tools or online corpora.…
A Delphi Study on Technology Enhanced Learning (TEL) Applied on Computer Science (CS) Skills
Porta, Marcela; Mas-Machuca, Marta; Martinez-Costa, Carme; Maillet, Katherine
2012-01-01
Technology Enhanced Learning (TEL) is a new pedagogical domain aiming to study the usage of information and communication technologies to support teaching and learning. The following study investigated how this domain is used to increase technical skills in Computer Science (CS). A Delphi method was applied, using three-rounds of online survey…
SHIPBUILDING PRODUCTION PROCESS DESIGN METHODOLOGY USING COMPUTER SIMULATION
Directory of Open Access Journals (Sweden)
Marko Hadjina
2015-06-01
Full Text Available In this research, a shipbuilding production process design methodology using computer simulation is proposed. The proposed methodology is expected to provide a better and more efficient tool for the design of complex shipbuilding production processes. The first part of this research discusses existing practice for production process design in shipbuilding and emphasizes its shortcomings and problems. Next, discrete event simulation modelling, as the basis of the proposed methodology, is investigated and described with regard to its special characteristics, advantages and reasons for application, especially to shipbuilding production processes. Furthermore, the basics of simulation modelling are described, as well as the proposed methodology for the production process design procedure. A case study applying the proposed methodology to the design of a robotized profile fabrication production line is demonstrated. The selected design solution, obtained with the proposed methodology, was evaluated through comparison with the installation of a robotized profile cutting line in a specific shipyard's production process. Based on data obtained from real production, the simulation model was further enhanced. Finally, on the grounds of this research, its results and the conclusions drawn, directions for further research are suggested.
GENOA-PFA: Progressive Fracture in Composites Simulated Computationally
Murthy, Pappu L. N.
2000-01-01
GENOA-PFA is a commercial version of the Composite Durability Structural Analysis (CODSTRAN) computer program that simulates the progression of damage ultimately leading to fracture in polymer-matrix-composite (PMC) material structures under various loading and environmental conditions. GENOA-PFA offers several capabilities not available in other programs developed for this purpose, making it preferable for use in analyzing the durability and damage tolerance of complex PMC structures in which the fiber reinforcements occur in two- and three-dimensional weaves and braids. GENOA-PFA implements a progressive-fracture methodology based on the idea that a structure fails when flaws that may initially be small (even microscopic) grow and/or coalesce to a critical dimension where the structure no longer has an adequate safety margin to avoid catastrophic global fracture. Damage is considered to progress through five stages: (1) initiation, (2) growth, (3) accumulation (coalescence of propagating flaws), (4) stable propagation (up to the critical dimension), and (5) unstable or very rapid propagation (beyond the critical dimension) to catastrophic failure. The computational simulation of progressive failure involves formal procedures for identifying the five different stages of damage and for relating the amount of damage at each stage to the overall behavior of the deteriorating structure. In GENOA-PFA, mathematical modeling of the composite physical behavior involves an integration of simulations at multiple, hierarchical scales ranging from the macroscopic (lamina, laminate, and structure) to the microscopic (fiber, matrix, and fiber/matrix interface), as shown in the figure. The code includes algorithms to simulate the progression of damage from various source defects, including (1) through-the-thickness cracks and (2) voids with edge, pocket, internal, or mixed-mode delaminations.
Trends in Social Science: The Impact of Computational and Simulative Models
Conte, Rosaria; Paolucci, Mario; Cecconi, Federico
This paper discusses current progress in the computational social sciences. Specifically, it examines the following questions: Are the computational social sciences exhibiting positive or negative developments? What are the roles of agent-based models and simulation (ABM), network analysis, and other "computational" methods within this dynamic? (Conte, The necessity of intelligent agents in social simulation, Advances in Complex Systems, 3(01n04), 19-38, 2000; Conte 2010; Macy, Annual Review of Sociology, 143-166, 2002). Are there objective indicators of scientific growth that can be applied to different scientific areas, allowing for comparison among them? In this paper, some answers to these questions are presented and discussed. In particular, comparisons among different disciplines in the social and computational sciences are shown, taking into account their respective growth trends in the number of publication citations over the last few decades (culled from Google Scholar). After a short discussion of the methodology adopted, results of keyword-based queries are presented, unveiling some unexpected local impacts of simulation on the takeoff of traditionally poorly productive disciplines.
Plastic deformation of crystals: analytical and computer simulation studies of dislocation glide
Energy Technology Data Exchange (ETDEWEB)
Altintas, S.
1978-05-01
The plastic deformation of crystals is usually accomplished through the motion of dislocations. The glide of a dislocation is impelled by the applied stress and opposed by microstructural defects such as point defects, voids, precipitates and other dislocations. The planar glide of a dislocation through randomly distributed obstacles is considered. The objective of the present research work is to calculate the critical resolved shear stress (CRSS) for athermal glide and the velocity of the dislocation at finite temperature as a function of the applied stress and the nature and strength of the obstacles. Dislocation glide through mixtures of obstacles has been studied analytically and by computer simulation. Arrays containing two kinds of obstacles as well as square distributions of obstacle strengths are considered. The critical resolved shear stress for an array containing obstacles with a given distribution of strengths is calculated using the sum of the quadratic mean of the stresses for the individual obstacles, and is found to be in good agreement with the computer simulation data. Computer simulations of dislocation glide through arrays of up to 10^6 randomly distributed obstacles show that the CRSS decreases as the size of the array increases and approaches a limiting value. Histograms of forces and of segment lengths are obtained and compared with theoretical predictions. Effects of array shape and boundary conditions on the dislocation glide are also studied. Analytical and computer simulation results are compared with experimental results obtained on precipitation-, irradiation-, forest-, and impurity-cluster-hardening systems and are found to be in good agreement.
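The quadratic superposition rule mentioned above can be sketched as follows (the function name and the weighting by obstacle number fractions are our illustrative assumptions, not the thesis's exact formulation): the CRSS of a mixture is taken as the square root of the fraction-weighted sum of squared single-species stresses.

```python
import numpy as np

def mixture_crss(taus, fractions):
    """Quadratic (Pythagorean) superposition estimate of the CRSS for a
    dislocation gliding through a mixture of obstacle species:
        tau_mix = sqrt(sum_i f_i * tau_i**2)
    taus: single-species critical stresses; fractions: number fractions
    of each species (must sum to 1)."""
    taus = np.asarray(taus, dtype=float)
    fractions = np.asarray(fractions, dtype=float)
    assert np.isclose(fractions.sum(), 1.0), "fractions must sum to 1"
    return float(np.sqrt(np.sum(fractions * taus**2)))
```

The rule interpolates sensibly between the pure-species limits: a single species recovers its own CRSS, and adding a small fraction of strong obstacles raises the mixture stress quadratically rather than linearly.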
Visualization of computer architecture simulation data for system-level design space exploration
Taghavi, T.; Thompson, M.; Pimentel, A.D.
2009-01-01
System-level computer architecture simulations create large volumes of simulation data to explore alternative architectural solutions. Interpreting and drawing conclusions from this amount of simulation results can be extremely cumbersome. In other domains that also struggle with interpreting large
Directory of Open Access Journals (Sweden)
Kalisz D.
2016-03-01
Full Text Available The authors' own computer software, based on the Ueshima mathematical model and taking into account back diffusion determined from the Wołczyński equation, was developed for simulation calculations. The applied calculation procedure allowed the chemical composition of the non-metallic phase to be determined in steel deoxidised by means of Mn, Si and Al, at a given cooling rate. The calculation results were confirmed by the analysis of samples taken from specified areas of the cast ingot. This indicates that the developed computer software can be applied to designing a steel casting process with a strictly specified chemical composition and to obtaining the required non-metallic precipitates.
Directory of Open Access Journals (Sweden)
Nicolae MARGINEAN
2009-01-01
Full Text Available Choosing a proper computer system is not an easy task for a decision maker. One reason is the current development of the market for computer systems applied in business: the large number of players on the Romanian market results in a large number of computerized products with a multitude of varying properties. Our proposal aims to optimize and facilitate this decision process within an e-shop selling IT packages applied in business, by building an online decision assistant, a special component conceived to facilitate the decision making needed to select the IT package that fits the requirements of a particular business, as described by the decision maker. The user interacts with the system as an online buyer visiting an e-shop where IT packages applied in economy are sold.
Finite element simulation of the mechanical impact of computer work on the carpal tunnel syndrome.
Mouzakis, Dionysios E; Rachiotis, George; Zaoutsos, Stefanos; Eleftheriou, Andreas; Malizos, Konstantinos N
2014-09-22
Carpal tunnel syndrome (CTS) is a clinical disorder resulting from compression of the median nerve. The available evidence regarding the association between computer use and CTS is controversial. There is some evidence that computer mouse or keyboard work, or both, are associated with the development of CTS. Despite the availability of pressure measurements in the carpal tunnel during computer work (exposure to keyboard or mouse), there are no available data to support a direct effect of the increased intracarpal canal pressure on the median nerve. This study presents an attempt to simulate the direct effects of computer work on the whole carpal area section using finite element analysis. A finite element mesh was produced from computerized tomography scans of the carpal area, involving all tissues present in the carpal tunnel. Two loading scenarios were applied to these models based on biomechanical data measured during computer work. It was found that mouse work can produce large deformation fields in the median nerve region. The high stressing effect of the carpal ligament was also verified. Keyboard work produced considerable and heterogeneous elongations along the longitudinal axis of the median nerve. Our study provides evidence that the increased intracarpal canal pressures caused by awkward wrist postures imposed during computer work are associated directly with deformation of the median nerve. Despite the limitations of the present study, the findings can be considered a contribution to the understanding of the development of CTS due to exposure to computer work. Copyright © 2014 Elsevier Ltd. All rights reserved.
A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION
Energy Technology Data Exchange (ETDEWEB)
Mike Bockelie; Dave Swensen; Martin Denison; Adel Sarofim; Connie Senior
2004-12-22
, immersive environment. The Virtual Engineering Framework (VEF), in effect a prototype framework, was developed through close collaboration with NETL supported research teams from Iowa State University Virtual Reality Applications Center (ISU-VRAC) and Carnegie Mellon University (CMU). The VEF is open source, compatible across systems ranging from inexpensive desktop PCs to large-scale, immersive facilities and provides support for heterogeneous distributed computing of plant simulations. The ability to compute plant economics through an interface that coupled the CMU IECM tool to the VEF was demonstrated, and the ability to couple the VEF to Aspen Plus, a commercial flowsheet modeling tool, was demonstrated. Models were interfaced to the framework using VES-Open. Tests were performed for interfacing CAPE-Open-compliant models to the framework. Where available, the developed models and plant simulations have been benchmarked against data from the open literature. The VEF has been installed at NETL. The VEF provides simulation capabilities not available in commercial simulation tools. It provides DOE engineers, scientists, and decision makers with a flexible and extensible simulation system that can be used to reduce the time, technical risk, and cost to develop the next generation of advanced, coal-fired power systems that will have low emissions and high efficiency. Furthermore, the VEF provides a common simulation system that NETL can use to help manage Advanced Power Systems Research projects, including both combustion- and gasification-based technologies.
Zimoń, Małgorzata; Sawko, Robert; Emerson, David; Thompson, Christopher
2017-11-01
Uncertainty quantification (UQ) is increasingly becoming an indispensable tool for assessing the reliability of computational modelling. Efficient handling of stochastic inputs, such as boundary conditions, physical properties or geometry, increases the utility of model results significantly. We discuss the application of non-intrusive generalised polynomial chaos techniques in the context of fluid engineering simulations. Deterministic and Monte Carlo integration rules are applied to a set of problems, including ordinary differential equations and the computation of aerodynamic parameters subject to random perturbations. In particular, we analyse acoustic wave propagation in a heterogeneous medium to study the effects of mesh resolution, transients, number and variability of stochastic inputs. We consider variants of multi-level Monte Carlo and perform a novel comparison of the methods with respect to numerical and parametric errors, as well as computational cost. The results provide a comprehensive view of the necessary steps in UQ analysis and demonstrate some key features of stochastic fluid flow systems.
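The workflow the abstract describes, propagating a random input through a deterministic solver and integrating over the stochastic space, can be illustrated with a deliberately tiny example: plain Monte Carlo on a decay ODE with an uncertain rate constant. The solver, input distribution and sample count below are illustrative choices, not the study's.

```python
import math
import random

def solve_decay(k, t=1.0, n_steps=200):
    """Forward-Euler integration of dy/dt = -k*y with y(0) = 1."""
    y, dt = 1.0, t / n_steps
    for _ in range(n_steps):
        y -= k * y * dt
    return y

def mc_mean(n_samples=5000, seed=0):
    """Plain Monte Carlo estimate of E[y(1)] for k ~ Uniform(0.9, 1.1)."""
    rng = random.Random(seed)
    return sum(solve_decay(rng.uniform(0.9, 1.1))
               for _ in range(n_samples)) / n_samples

# Analytic reference for this toy problem: E[exp(-k)] for k ~ U(0.9, 1.1).
exact = (math.exp(-0.9) - math.exp(-1.1)) / 0.2
est = mc_mean()
```

Replacing `mc_mean` with a quadrature rule over k, or nesting coarse and fine `n_steps` levels, gives the deterministic and multi-level variants the abstract compares.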
Matched experimental and computational simulations of paintball eye impacts.
Kennedy, Eric A; Stitzel, Joel D; Duma, Stefan M
2008-01-01
Over 1200 paintball-related eye injuries are treated every year in US emergency departments. These injuries range from irritation due to paint splatter in the eye to catastrophic rupture of the globe. Using the Virginia Tech - Wake Forest University Eye Model, experimental paintball impacts were replicated and the experimental and computational results compared. A total of 10 paintball impacts were conducted at velocities ranging from 71.1 m/s to 112.5 m/s. All experimental tests resulted in rupture of the globe. The matched computational simulations also predicted near-failure or failure in each of the simulations, with a maximum principal stress greater than 22.8 MPa in all scenarios and over 23 MPa for velocities above 73 m/s. The failure stress for the VT-WFU Eye Model is defined as 23 MPa. The current regulation velocity for paintballs of 91 m/s exceeds the tolerance of the eye to globe rupture and underscores the importance of eyewear in this sport.
Zradziński, Patryk
2015-01-01
Due to the various physical mechanisms of interaction between a worker's body and the electromagnetic field at various frequencies, the principles of numerical simulations have been discussed for three areas of worker exposure: to low frequency magnetic field, to low and intermediate frequency electric field and to radiofrequency electromagnetic field. This paper presents the identified difficulties in applying numerical simulations to evaluate physical estimators of direct and indirect effects of exposure to electromagnetic fields at various frequencies. Exposure of workers operating a plastic sealer have been taken as an example scenario of electromagnetic field exposure at the workplace for discussion of those difficulties in applying numerical simulations. The following difficulties in reliable numerical simulations of workers' exposure to the electromagnetic field have been considered: workers' body models (posture, dimensions, shape and grounding conditions), working environment models (objects most influencing electromagnetic field distribution) and an analysis of parameters for which exposure limitations are specified in international guidelines and standards.
Optimization of suspension smelting technology by computer simulation
Energy Technology Data Exchange (ETDEWEB)
Lilius, K.; Jokilaakso, A.; Ahokainen, T.; Teppo, O.; Yang Yongxiang [Helsinki Univ. of Technology, Otaniemi (Finland). Lab. of Materials Processing and Powder Metallurgy
1996-12-31
An industrial-scale flash smelting furnace and waste-heat boilers have been modelled using commercial Computational Fluid Dynamics software. The work has proceeded from cold gas flow to heat transfer, combustion, and two-phase flow simulations. In the present study, the modelling task has been divided into three sub-models: (1) the concentrate burner, (2) the flash smelting furnace (reaction shaft and uptake shaft), and (3) the waste-heat boiler. For the concentrate burner, the flow of the process gas and distribution air together with the concentrate or a feed mixture was simulated. An Eulerian-Eulerian approach was used for the carrier gas phase and the dispersed particle phase. A large parametric study was carried out by simulating a laboratory-scale burner with varying turbulence intensities and then extending the simulations to the industrial-scale model. For the flash smelting furnace, the simulation work concentrated on gas and gas-particle two-phase flows, as well as the development of a combustion model for sulphide concentrate particles. Both Eulerian and Lagrangian approaches have been utilised in describing the particle phase, and the spreading of the concentrate in the reaction shaft as well as the particle tracks have been obtained. Combustion of sulphides was first approximated as gaseous combustion using a built-in combustion model of the software. The real oxidation reactions of the concentrate particles were then coded as a user-defined sub-routine and tested with industrial flash smelting cases. For the waste-heat boiler, both flow and heat transfer calculations have been carried out for an old boiler and a modified boiler. (SULA 2 Research Programme; 23 refs.)
DROpS: an object of learning in computer simulation of discrete events
Directory of Open Access Journals (Sweden)
Hugo Alves Silva Ribeiro
2015-09-01
Full Text Available This work presents the "Realistic Dynamics Of Simulated Operations" (DROpS), the name given to the dynamics using the "dropper" device as an object of teaching and learning. The objective is to present alternatives for professors teaching content related to the simulation of discrete events to graduate students in production engineering. The aim is to enable students to develop skills related to data collection, modeling, statistical analysis, and interpretation of results. This dynamic was developed and applied by placing the students in a situation analogous to a real industry, where various concepts related to computer simulation were discussed, allowing the students to put these concepts into practice in an interactive manner, thus facilitating learning.
Moon, Hongsik
What is the impact of multicore and associated advanced technologies on computational software for science? Most researchers and students have multicore laptops or desktops for their research, and they need computing power to run computational software packages. Computing power was initially derived from Central Processing Unit (CPU) clock speed. That changed when increases in clock speed became constrained by power requirements. Chip manufacturers turned to multicore CPU architectures and associated technological advancements to create the CPUs for the future. Most software applications benefited from the increased computing power the same way that increases in clock speed helped applications run faster. However, for Computational ElectroMagnetics (CEM) software developers, this change was not an obvious benefit - it appeared to be a detriment. Developers were challenged to find a way to correctly utilize the advancements in hardware so that their codes could benefit. The solution was parallelization, and this dissertation details the investigation to address these challenges. Prior to multicore CPUs, advanced computer technologies were compared on performance using benchmark software, and the metric was FLoating-point Operations Per Second (FLOPS), which indicates system performance for scientific applications that make heavy use of floating-point calculations. Is FLOPS an effective metric for parallelized CEM simulation tools on new multicore systems? Parallel CEM software needs to be benchmarked not only by FLOPS but also by the performance of other parameters related to the type and utilization of the hardware, such as CPU, Random Access Memory (RAM), hard disk, network, etc. The codes need to be optimized for more than just FLOPS, and new parameters must be included in benchmarking. In this dissertation, the parallel CEM software named High Order Basis Based Integral Equation Solver (HOBBIES) is introduced. This code was developed to address the needs of the
1989-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period October 1, 1988 through March 31, 1989 is summarized.
1984-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science during the period October 1, 1983 through March 31, 1984 is summarized.
Computer Simulation of Hydraulic Systems with Typical Nonlinear Characteristics
Directory of Open Access Journals (Sweden)
D. N. Popov
2017-01-01
Full Text Available The task was to synthesise an adjustable hydraulic system structure, the mathematical model of which takes into account its inherent nonlinearity. Its solution suggests using successive computer simulations, starting with a structure of the linearized stable hydraulic system, which is then complicated by including the essentially non-linear elements. The hydraulic system thus obtained may fail to meet the Lyapunov stability criterion and be unstable. This can be eliminated through correcting elements. Control of the correction results is provided according to the form of the transition processes due to stepwise variation of the control signal. Computer simulation of a throttle-controlled electrohydraulic servo drive with a rotary output element illustrates the application of the proposed method. A constant-pressure power source provides fluid feed for the drive. For drive simulation the following models were involved: the linear model, the model taking into consideration the non-linearity of the flow-dynamic characteristics of a spool-type valve, and the non-linear models that take into account the dry friction in the spool-type valve and the backlash in the steering angle sensor of the motor shaft. The paper shows the possibility of damping oscillations caused by variable hydrodynamic forces through introducing a correction device. The attached list of references contains 16 sources, which were used to justify and explain certain aspects of automatic control theory and the fluid mechanics of unsteady flows. The article presents 6 block diagrams of the electrohydraulic servo drive and their corresponding transition processes, which have been studied.
Computer Simulation of Embryonic Systems: What can a ...
(1) Standard practice for assessing developmental toxicity is the observation of apical endpoints (intrauterine death, fetal growth retardation, structural malformations) in pregnant rats/rabbits following exposure during organogenesis. EPA’s computational toxicology research program (ToxCast) generated vast in vitro cellular and molecular effects data on >1858 chemicals in >600 high-throughput screening (HTS) assays. The diversity of assays has been increased for developmental toxicity with several HTS platforms, including the devTOX-quickPredict assay from Stemina Biomarker Discovery utilizing the human embryonic stem cell line (H9). Translating these HTS data into higher-order predictions of developmental toxicity is a significant challenge. Here, we address the application of computational systems models that recapitulate the kinematics of dynamical cell signaling networks (e.g., SHH, FGF, BMP, retinoids) in a CompuCell3D.org modeling environment. Examples include angiogenesis (angiodysplasia) and dysmorphogenesis. Being numerically responsive to perturbation, these models are amenable to data integration for systems Toxicology and Adverse Outcome Pathways (AOPs). The AOP simulation outputs predict potential phenotypes based on the in vitro HTS data from ToxCast. A heuristic computational intelligence framework that recapitulates the kinematics of dynamical cell signaling networks in the embryo, together with the in vitro profiling data, produce quantitative pr
Workbench for the computer simulation of underwater gated viewing systems
Braesicke, K.; Wegner, D.; Repasi, E.
2017-05-01
In this paper we introduce a software tool for image-based computer simulation of an underwater gated viewing system. This development is helpful as a tool for discussing a possible engagement of a gated viewing camera for underwater imagery. We show the modular structure of the implemented input parameter sets for camera, laser and environment description, and give application examples of the software tool. The whole simulation includes the illumination of the scene by a laser pulse, with its energy, pulse form and length, as well as the propagation of the light through open water, taking into account the complex optical properties of the environment. The scene is modeled as a geometric shape with diverse reflective areas and optical surface properties, submerged in open water. The software is based on a camera model that includes image degradation due to diffraction, lens transmission and detector efficiency, and image enhancement by digital signal processing. We show simulation results for some example configurations. Finally, we discuss the limits of our method and give an outlook on future development.
Value stream mapping in a computational simulation model
Directory of Open Access Journals (Sweden)
Ricardo Becker Mendes de Oliveira
2014-08-01
Full Text Available The decision-making process has been extensively studied by researchers and executives. This paper aims to use the methodology of Value Stream Mapping (VSM) in an integrated manner with a computer simulation model, in order to expand managers' decision-making vision. The object of study is a production system involving a process of automatic packaging of products, where it became necessary to implement changes in order to accommodate new products, so that the detection of bottlenecks and the visualization of the impacts generated by future modifications became necessary. The simulation aims to support managers' decisions, considering that the system involves several variables whose behaviors define the complexity of the process. The main results were a significant reduction in project costs by anticipating the system's behavior, together with the Value Stream Mapping identifying activities that do or do not add value to the process. The validation of the simulation model will occur with the current map of the system and with the inclusion of Kaizen events, so that waste in future maps is found in a practical and reliable way that can support decision-making.
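A minimal sketch of the bottleneck detection the abstract pairs with VSM: a serial flow line where the station with the highest utilisation is the bottleneck. The processing times below are made-up example values, not data from the studied packaging system.

```python
def line_throughput(process_times, n_items=1000):
    """Serial flow line with unlimited raw material and infinite buffers.
    Returns the makespan and per-station utilisation; the station with the
    highest utilisation is the bottleneck a VSM study would flag."""
    n_st = len(process_times)
    free_at = [0.0] * n_st   # time at which each station next becomes idle
    busy = [0.0] * n_st      # accumulated processing time per station
    for _ in range(n_items):
        t = 0.0              # each item is available at the line head at t = 0
        for s, p in enumerate(process_times):
            start = max(t, free_at[s])   # wait for both item and station
            free_at[s] = start + p
            busy[s] += p
            t = start + p
    makespan = free_at[-1]
    return makespan, [b / makespan for b in busy]

# Hypothetical packaging line: the 3-minute middle station is the bottleneck.
makespan, util = line_throughput([1.0, 3.0, 2.0])
```

A full study would add stochastic processing times and finite buffers; even this deterministic version shows how simulation quantifies the impact of a proposed change before it is made.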
Protein adsorption on nanoparticles: model development using computer simulation.
Shao, Qing; Hall, Carol K
2016-10-19
The adsorption of proteins on nanoparticles results in the formation of the protein corona, the composition of which determines how nanoparticles influence their biological surroundings. We seek to better understand corona formation by developing models that describe protein adsorption on nanoparticles using computer simulation results as data. Using a coarse-grained protein model, discontinuous molecular dynamics simulations are conducted to investigate the adsorption of two small proteins (Trp-cage and WW domain) on a model nanoparticle of diameter 10.0 nm at protein concentrations ranging from 0.5 to 5 mM. The resulting adsorption isotherms are well described by the Langmuir, Freundlich, Temkin and Kiselev models, but not by the Elovich, Fowler-Guggenheim and Hill-de Boer models. We also try to develop a generalized model that can describe protein adsorption equilibrium on nanoparticles of different diameters in terms of dimensionless size parameters. The simulation results for three proteins (Trp-cage, WW domain, and GB3) on four nanoparticles (diameter = 5.0, 10.0, 15.0, and 20.0 nm) illustrate both the promise and the challenge associated with developing generalized models of protein adsorption on nanoparticles.
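The isotherm fitting described above can be sketched with the Langmuir model and a linearised least-squares fit. The synthetic "simulation data" below (q_max = 12, K = 0.8 mM⁻¹) are invented for illustration and are not the paper's fitted values.

```python
def langmuir(c, q_max, K):
    """Langmuir isotherm: adsorbed amount q at free-protein concentration c."""
    return q_max * K * c / (1.0 + K * c)

def fit_langmuir(conc, q):
    """Estimate (q_max, K) from the linearised form
    1/q = 1/q_max + (1/(q_max*K)) * (1/c), via ordinary least squares."""
    xs = [1.0 / c for c in conc]
    ys = [1.0 / v for v in q]
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    q_max = 1.0 / intercept
    K = intercept / slope
    return q_max, K

# Synthetic adsorption data over the study's concentration range (0.5-5 mM).
conc = [0.5, 1.0, 2.0, 3.0, 5.0]
data = [langmuir(c, 12.0, 0.8) for c in conc]
q_max_fit, K_fit = fit_langmuir(conc, data)
```

The Freundlich and Temkin models mentioned in the abstract can be fitted the same way after their own linearisations (log-log and semi-log, respectively).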
Computational simulation of bone fracture healing under inverse dynamisation.
Wilson, Cameron J; Schütz, Michael A; Epari, Devakara R
2017-02-01
Adaptive finite element models have allowed researchers to test hypothetical relationships between the local mechanical environment and the healing of bone fractures. However, their predictive power has not yet been demonstrated by testing hypotheses ahead of experimental testing. In this study, an established mechano-biological scheme was used in an iterative finite element simulation of sheep tibial osteotomy healing under a hypothetical fixation regime, "inverse dynamisation". Tissue distributions, interfragmentary movement and stiffness across the fracture site were compared between stiff and flexible fixation conditions and scenarios in which fixation stiffness was increased at a discrete time-point. The modelling work was conducted blind to the experimental study to be published subsequently. The simulations predicted the fastest and most direct healing under constant stiff fixation, and the slowest healing under flexible fixation. Although low fixation stiffness promoted more callus formation prior to bridging, this conferred little additional stiffness to the fracture in the first 5 weeks. Thus, while switching to stiffer fixation facilitated rapid subsequent bridging of the fracture, no advantage of inverse dynamisation could be demonstrated. In vivo data remains necessary to conclusively test this treatment protocol and this will, in turn, provide an evaluation of the model's performance. The publication of both hypotheses and their computational simulation, prior to experimental testing, offers an appealing means to test the predictive power of mechano-biological models.
A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION
Energy Technology Data Exchange (ETDEWEB)
Mike Bockelie; Dave Swensen; Martin Denison
2002-04-30
This is the sixth Quarterly Technical Report for DOE Cooperative Agreement No: DE-FC26-00NT41047. The goal of the project is to develop and demonstrate a computational workbench for simulating the performance of Vision 21 Power Plant Systems. Within the last quarter, good progress has been made on the development of our IGCC workbench. Preliminary CFD simulations for single stage and two stage "generic" gasifiers using firing conditions based on the Vision 21 reference configuration have been performed. Work is continuing on implementing an advanced slagging model into the CFD based gasifier model. An investigation into published gasification kinetics has highlighted a wide variance in predicted performance due to the choice of kinetic parameters. A plan has been outlined for developing the reactor models required to simulate the heat transfer and gas clean up equipment downstream of the gasifier. Three models that utilize the CCA software protocol have been integrated into a version of the IGCC workbench. Tests of a CCA implementation of our CFD code into the workbench demonstrated that the CCA CFD module can execute on a geographically remote PC (linked via the Internet) in a manner that is transparent to the user. Software tools to create "walk-through" visualizations of the flow field within a gasifier have been demonstrated.
A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION
Energy Technology Data Exchange (ETDEWEB)
Mike Bockelie; Dave Swensen; Martin Denison
2002-01-31
This is the fifth Quarterly Technical Report for DOE Cooperative Agreement No: DE-FC26-00NT41047. The goal of the project is to develop and demonstrate a computational workbench for simulating the performance of Vision 21 Power Plant Systems. Within the last quarter, our efforts have become focused on developing an improved workbench for simulating a gasifier based Vision 21 energyplex. To provide for interoperability of models developed under Vision 21 and other DOE programs, discussions have been held with DOE and other organizations developing plant simulator tools to review the possibility of establishing a common software interface or protocol to use when developing component models. A component model that employs the CCA protocol has successfully been interfaced to our CCA enabled workbench. To investigate the software protocol issue, DOE has selected a gasifier based Vision 21 energyplex configuration for use in testing and evaluating the impacts of different software interface methods. A Memo of Understanding with the Cooperative Research Centre for Coal in Sustainable Development (CCSD) in Australia has been completed that will enable collaborative research efforts on gasification issues. Preliminary results have been obtained for a CFD model of a pilot scale, entrained flow gasifier. A paper was presented at the Vision 21 Program Review Meeting at NETL (Morgantown) that summarized our accomplishments for Year One and plans for Year Two and Year Three.
Directory of Open Access Journals (Sweden)
2007-01-01
Full Text Available Based on mean field theory and the corporate entrepreneurship (CE) concept, a mathematical model of a complex organization has been derived. The model was applied to computer simulations of a corporation's reaction to a hostile environment, for corporations similar to British Petroleum.
Herrera, I.; Herrera, G. S.
2015-12-01
Most geophysical systems are macroscopic physical systems. The behavior prediction of such systems is carried out by means of computational models whose basic building blocks are partial differential equations (PDEs) [1]. Due to the enormous size of the discretized versions of such PDEs, it is necessary to apply highly parallelized supercomputers. For them, at present, the most efficient software is based on non-overlapping domain decomposition methods (DDM). However, a limiting feature of the present state-of-the-art techniques is the kind of discretizations used in them. Recently, I. Herrera and co-workers, using 'non-overlapping discretizations', have produced the DVS software, which overcomes this limitation [2]. The DVS software can be applied to a great variety of geophysical problems and achieves very high parallel efficiencies (90% or so [3]). It is therefore very suitable for effectively applying the most advanced parallel supercomputers available at present. In a parallel talk at this AGU Fall Meeting, Graciela Herrera Z. will present how this software is being applied to advance MODFLOW. Key Words: Parallel Software for Geophysics, High Performance Computing, HPC, Parallel Computing, Domain Decomposition Methods (DDM). REFERENCES [1] Herrera, Ismael and George F. Pinder, "Mathematical Modelling in Science and Engineering: An Axiomatic Approach", John Wiley, 243p., 2012. [2] Herrera, I., de la Cruz, L.M. and Rosas-Medina, A., "Non-Overlapping Discretization Methods for Partial Differential Equations", NUMER METH PART D E, 30: 1427-1454, 2014, DOI 10.1002/num.21852. (Open source) [3] Herrera, I., & Contreras, Iván, "An Innovative Tool for Effectively Applying Highly Parallelized Software to Problems of Elasticity", Geofísica Internacional, 2015 (In press).
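The DVS framework itself is not reproduced in the record, but the basic idea of domain decomposition can be illustrated with the classical alternating Schwarz method (an overlapping DDM, much simpler than the non-overlapping DVS discretizations) on a 1D Poisson problem whose exact solution is u(x) = x(1-x):

```python
def subdomain_solution(a, b, ua, ub):
    """Exact solution of -u'' = 2 on [a, b] with u(a) = ua, u(b) = ub.
    General solution: u(x) = -x^2 + A*x + B."""
    A = ((ub + b * b) - (ua + a * a)) / (b - a)
    B = ua + a * a - A * a
    return lambda x: -x * x + A * x + B

def schwarz(n_iter=30):
    """Alternating Schwarz on (0,1) with overlapping subdomains (0, 0.6)
    and (0.4, 1): each sweep solves one subdomain using the other's latest
    trace as interface data. Exact global solution: u(x) = x*(1 - x)."""
    g_right = 0.0   # initial guess for u at the interface x = 0.6
    for _ in range(n_iter):
        u1 = subdomain_solution(0.0, 0.6, 0.0, g_right)
        g_left = u1(0.4)
        u2 = subdomain_solution(0.4, 1.0, g_left, 0.0)
        g_right = u2(0.6)
    return u1, u2

u1, u2 = schwarz()
```

The interface error contracts by a fixed factor each sweep; in a parallel code each subdomain solve runs on its own processor, which is the source of the high parallel efficiencies the abstract cites.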
2nd International Doctoral Symposium on Applied Computation and Security Systems
Cortesi, Agostino; Saeed, Khalid; Chaki, Nabendu
2016-01-01
The book contains the extended version of the works that have been presented and discussed in the Second International Doctoral Symposium on Applied Computation and Security Systems (ACSS 2015) held during May 23-25, 2015 in Kolkata, India. The symposium has been jointly organized by the AGH University of Science & Technology, Cracow, Poland; Ca’ Foscari University, Venice, Italy and University of Calcutta, India. The book is divided into volumes and presents dissertation works in the areas of Image Processing, Biometrics-based Authentication, Soft Computing, Data Mining, Next Generation Networking and Network Security, Remote Healthcare, Communications, Embedded Systems, Software Engineering and Service Engineering.
Knapczyk, J.; Tora, G.
2014-08-01
A novel parallel manipulator with 3 legs (2 actuated by linear actuators and one supporting pillar),which is applied in a wheel loader driving simulator, is proposed in this paper. The roll angle and the pitch angle of the platform are derived in closed-form of functions of the variable lengths of two actuators. The linear velocity and acceleration of the selected point and angular velocity of the moving platform are determined and compared with measurement results obtained in the respective point and in the body of the wheel loader. The differences between the desired and actual actuator displacements are used as feedback to compute how much force to send to the actuators as some function of the servo error. A numerical example with a proposed mechanism as a driving simulator is presented
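The servo law quoted in the abstract ("force as some function of the servo error") can be sketched as a PD controller acting on a single actuated mass. The gains, mass and time step below are illustrative guesses, not the simulator's parameters.

```python
def simulate_actuator(target, k_p=400.0, k_d=120.0, mass=10.0,
                      dt=0.001, t_end=2.0):
    """Mass-type linear actuator driven by force = k_p*error - k_d*velocity,
    one plausible 'function of the servo error' (PD control)."""
    x, v = 0.0, 0.0
    for _ in range(int(t_end / dt)):
        error = target - x
        force = k_p * error - k_d * v   # derivative term: d(error)/dt = -v
        v += (force / mass) * dt        # semi-implicit Euler step
        x += v * dt
    return x

final = simulate_actuator(0.25)   # command a 0.25 m actuator extension
```

With these gains the response is well damped and the actuator settles on the commanded displacement; in the real platform the commanded lengths come from the closed-form inverse kinematics of the roll and pitch angles.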
1992-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, fluid mechanics including fluid dynamics, acoustics, and combustion, aerodynamics, and computer science during the period 1 Apr. 1992 - 30 Sep. 1992 is summarized.
Energy Technology Data Exchange (ETDEWEB)
Zuo, Wangda [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); McNeil, Andrew [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Wetter, Michael [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Lee, Eleanor S. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)
2013-05-23
Building designers are increasingly relying on complex fenestration systems to reduce energy consumed for lighting and HVAC in low energy buildings. Radiance, a lighting simulation program, has been used to conduct daylighting simulations for complex fenestration systems. Depending on the configuration, the simulation can take hours or even days using a personal computer. This paper describes how to accelerate the matrix multiplication portion of a Radiance three-phase daylight simulation by conducting parallel computing on the heterogeneous hardware of a personal computer. The algorithm was optimized and the computational part was implemented in parallel using OpenCL. The speed of the new approach was evaluated using various daylighting simulation cases on a multicore central processing unit and a graphics processing unit. Based on the measurements and analysis of the time usage for the Radiance daylighting simulation, further speedups can be achieved by using fast I/O devices and storing the data in a binary format.
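The matrix multiplication being accelerated is the three-phase chain, conceptually i = V·T·D·s (view, transmission and daylight matrices applied to a sky vector). A pure-Python sketch (standing in for the paper's OpenCL kernels, which are not reproduced here) shows the chain evaluated right-to-left, so each stage stays a cheap matrix-vector product; the toy matrices are illustrative only.

```python
def matmul(A, B):
    """Dense matrix product on lists of rows -- the hot spot the paper
    parallelises on CPU/GPU hardware via OpenCL."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def three_phase(V, T, D, s):
    """Three-phase daylight chain i = V*T*D*s, evaluated right-to-left so
    every stage is a matrix-vector product instead of a matrix-matrix one."""
    return matmul(V, matmul(T, matmul(D, s)))

# Toy 2x2 check: identity 'view' matrix, scalar transmission, uniform coupling.
V = [[1.0, 0.0], [0.0, 1.0]]
T = [[2.0, 0.0], [0.0, 2.0]]
D = [[1.0, 1.0], [1.0, 1.0]]
s = [[1.0], [1.0]]
illum = three_phase(V, T, D, s)
```

For a single sky vector the right-to-left order keeps every stage O(n²); when many sky conditions are evaluated at once, s becomes a matrix and the products become the large dense multiplications that benefit from GPU parallelism.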
Applying analytic hierarchy process to assess healthcare-oriented cloud computing service systems.
Liao, Wen-Hwa; Qiu, Wan-Li
2016-01-01
Numerous differences exist between the healthcare industry and other industries. Difficulties in the business operation of the healthcare industry have continually increased because of the volatility and importance of health care, changes to and requirements of health insurance policies, and the statuses of healthcare providers, which are typically considered not-for-profit organizations. Moreover, because of the financial risks associated with constant changes in healthcare payment methods and constantly evolving information technology, healthcare organizations must continually adjust their business operation objectives; therefore, cloud computing presents both a challenge and an opportunity. As a response to aging populations and the prevalence of the Internet in fast-paced contemporary societies, cloud computing can be used to facilitate the task of balancing the quality and costs of health care. To evaluate cloud computing service systems for use in health care, providing decision makers with a comprehensive assessment method for prioritizing decision-making factors is highly beneficial. Hence, this study applied the analytic hierarchy process, compared items related to cloud computing and health care, executed a questionnaire survey, and then classified the critical factors influencing healthcare cloud computing service systems on the basis of statistical analyses of the questionnaire results. The results indicate that the primary factor affecting the design or implementation of optimal cloud computing healthcare service systems is cost effectiveness, with the secondary factors being practical considerations such as software design and system architecture.
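The core AHP computation the study applied, turning a pairwise comparison matrix into priority weights via its principal eigenvector, can be sketched as follows. The 3x3 judgement matrix below is a hypothetical example consistent with the stated ranking (cost effectiveness first), not the survey's actual data.

```python
def ahp_priorities(M, n_iter=100):
    """Priority weights = principal eigenvector of the pairwise comparison
    matrix, computed by power iteration and normalised to sum to 1."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(n_iter):
        w = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [x / s for x in w]
    return w

# Hypothetical judgements: cost effectiveness vs software design vs system
# architecture (cost judged 3x and 5x more important, respectively).
M = [[1.0,     3.0,     5.0],
     [1.0/3.0, 1.0,     2.0],
     [1.0/5.0, 1.0/2.0, 1.0]]
weights = ahp_priorities(M)
```

A full AHP analysis would also compute the consistency ratio from the principal eigenvalue to check that the questionnaire judgements are not self-contradictory.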
Fundamental problems in porous materials: Experiments & computer simulation
Xu, Zhanping
Porous materials have attracted massive scientific and technological interest because of their extremely high surface-to-volume ratio, molecular tunability in construction, and surface-based applications. Through my PhD work, porous materials were engineered to meet design goals in selective binding, self-healing, and energy damping. For example, crystalline MOFs with pore sizes spanning from a few angstroms to a couple of nanometers were chemically engineered to show 120 times higher efficiency in binding large molecules. In addition, we found that building blocks released from those crystals can be patched back through a healing process at ambient and low temperatures down to -56 °C. When the building blocks are replaced with graphene sheets, ultra-flyweight aerogels with pore sizes larger than 100 nm were made to delay shock waves. A more stable rigid porous metal with larger pores (~μm) was also fabricated, and its performance and survivability are under investigation. Aside from experimental studies, we also successfully applied numerical simulations to study the mutual interaction between the nonplanar liquid-solid interface and colloidal particles during the freezing of colloidal suspensions. Colloidal particles can be either rejected or engulfed by the evolving interface, depending on the freezing speed and the strength of the interface-particle interaction. Our interactive simulation was achieved by programming both the simulation and visualization modules on high-performance GPU devices.
Hambli, Ridha; Katerchi, Houda; Benhamou, Claude-Laurent
2011-02-01
The aim of this paper is to develop a multiscale hierarchical hybrid model based on finite element analysis and neural network computation to link the mesoscopic scale (trabecular network level) and the macroscopic scale (whole bone level) in order to simulate the process of bone remodelling. Since whole bone simulation, including the 3D reconstruction of trabecular-level bone, is time-consuming, finite element calculation is only performed at the macroscopic level, whilst trained neural networks are employed as numerical substitutes for the finite element code needed for the mesoscale prediction. The bone mechanical properties are updated at the macroscopic scale depending on the morphological and mechanical adaptation at the mesoscopic scale computed by the trained neural network. A digital image-based modelling technique using μ-CT and voxel finite element analysis is applied to capture representative volume elements of 2 mm³ at the mesoscale of the femoral head. The input data for the artificial neural network are a set of bone material parameters, boundary conditions, and the applied stress. The output data are the updated bone properties and some trabecular bone factors. The current approach is, to our knowledge, the first model that incorporates both finite element analysis and neural network computation to rapidly simulate multilevel bone adaptation.
Features of development and analysis of the simulation model of a multiprocessor computer system
Directory of Open Access Journals (Sweden)
O. M. Brekhov
2017-01-01
Full Text Available Over the past decade, multiprocessor systems have become pervasive in computer technology. At present, multi-core processors are found not only in supercomputers but also in the vast majority of mobile devices, which creates the need for students to learn the basic principles of their construction and functioning. One of the possible methods for analyzing the operation of multiprocessor systems is simulation modeling. Its use contributes to a better understanding of the effect of workload and structure parameters on performance. The article considers the features of the development of a simulation model for estimating the time characteristics of a multiprocessor computer system, as well as the use of the regenerative method of model analysis. The workload is a software implementation of the inverse kinematics solution for a robot: determining the rotations in the joints of the manipulator from the known angular and linear position of its grasp. An analytical algorithm for solving the problem was chosen, namely the method of simple kinematic relations. The program is characterized by parallel calculations, during which resource conflicts arise between the processor cores involved in simultaneous access to the memory via a common bus. Because of the high information connectivity between the parallel running programs, it is assumed that all processing cores use shared memory. The simulation model takes into account probabilistic memory accesses and tracks the queues that emerge at shared resources. The collected statistics reveal the productive and overhead time costs of the program implementation for each processor core involved. The simulation results show the unevenness of kernel utilization, downtime in queues to shared resources, and time lost waiting for other cores due to information dependencies. The results of the simulation are estimated by the
Mendoza, Patricia; d'Anjou, Marc-André; Carmel, Eric N; Fournier, Eric; Mai, Wilfried; Alexander, Kate; Winter, Matthew D; Zwingenberger, Allison L; Thrall, Donald E; Theoret, Christine
2014-01-01
Understanding radiographic anatomy and the effects of varying patient and radiographic tube positioning on image quality can be a challenge for students. The purposes of this study were to develop and validate a novel technique for creating simulated radiographs using computed tomography (CT) datasets. A DICOM viewer (ORS Visual) plug-in was developed with the ability to move and deform cuboidal volumetric CT datasets, and to produce images simulating the effects of tube-patient-detector distance and angulation. Computed tomographic datasets were acquired from two dogs, one cat, and one horse. Simulated radiographs of different body parts (n = 9) were produced using different angles to mimic conventional projections, before actual digital radiographs were obtained using the same projections. These studies (n = 18) were then submitted to 10 board-certified radiologists who were asked to score visualization of anatomical landmarks, depiction of patient positioning, realism of distortion/magnification, and image quality. No significant differences between simulated and actual radiographs were found for anatomic structure visualization and patient positioning in the majority of body parts. For the assessment of radiographic realism, no significant differences were found between simulated and digital radiographs for canine pelvis, equine tarsus, and feline abdomen body parts. Overall, image quality and contrast resolution of simulated radiographs were considered satisfactory. Findings from the current study indicated that radiographs simulated using this new technique are comparable to actual digital radiographs. Further studies are needed to apply this technique in developing interactive tools for teaching radiographic anatomy and the effects of varying patient and tube positioning. © 2013 American College of Veterinary Radiology.
Simulation of branching blood flows on parallel computers.
Yue, Xue; Hwang, Feng-Nan; Shandas, Robin; Cai, Xiao-Chuan
2004-01-01
We present a fully parallel nonlinearly implicit algorithm for the numerical simulation of some branching blood flow problems, which require efficient and robust solver technologies in order to handle the high nonlinearity and the complex geometry. Parallel processing is necessary because of the large number of mesh points needed to accurately discretize the system of differential equations. In this paper we introduce a parallel Newton-Krylov-Schwarz based implicit method, and software for distributed memory parallel computers, for solving the nonlinear algebraic systems arising from a Q2-Q1 finite element discretization of the incompressible Navier-Stokes equations that we use to model the blood flow in the left anterior descending coronary artery.
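The outer loop of a Newton-Krylov-Schwarz solver is an inexact Newton iteration on the nonlinear system F(u) = 0 arising from the discretization. The sketch below shows only that outer loop on a tiny 2×2 nonlinear system with a direct inner solve; in the paper's method the inner linear solve is a Krylov iteration preconditioned by Schwarz domain decomposition, and F comes from the Navier-Stokes discretization. System, initial guess, and tolerances here are illustrative:

```python
def F(u):
    """Toy nonlinear residual: a circle intersected with a line."""
    x, y = u
    return [x * x + y * y - 4.0, x - y]

def J(u):
    """Analytic Jacobian of F."""
    x, y = u
    return [[2 * x, 2 * y], [1.0, -1.0]]

def solve2(a, b):
    """Direct 2x2 solve a.d = b (Cramer's rule); stands in for the Krylov solve."""
    det = a[0][0] * a[1][1] - a[0][1] * a[1][0]
    return [(b[0] * a[1][1] - b[1] * a[0][1]) / det,
            (a[0][0] * b[1] - a[1][0] * b[0]) / det]

u = [1.0, 0.5]                                  # initial guess
for _ in range(20):                             # Newton outer loop
    r = F(u)
    if max(abs(v) for v in r) < 1e-12:
        break
    d = solve2(J(u), [-v for v in r])           # linearized correction
    u = [ui + di for ui, di in zip(u, d)]

print(u)   # converges quadratically to (sqrt(2), sqrt(2))
```

The "fully implicit" character of the paper's algorithm corresponds to solving the whole coupled residual this way each time step, rather than splitting velocity and pressure updates.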
Application of Computer Simulation Modeling to Medication Administration Process Redesign
Directory of Open Access Journals (Sweden)
Nathan Huynh
2012-01-01
Full Text Available The medication administration process (MAP is one of the most high-risk processes in health care. MAP workflow redesign can precipitate both unanticipated and unintended consequences that can lead to new medication safety risks and workflow inefficiencies. Thus, it is necessary to have a tool to evaluate the impact of redesign approaches in advance of their clinical implementation. This paper discusses the development of an agent-based MAP computer simulation model that can be used to assess the impact of MAP workflow redesign on MAP performance. The agent-based approach is adopted in order to capture Registered Nurse medication administration performance. The process of designing, developing, validating, and testing such a model is explained. Work is underway to collect MAP data in a hospital setting to provide more complex MAP observations to extend development of the model to better represent the complexity of MAP.
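A much-reduced sketch of the agent-based idea can be written in a few lines: a nurse agent works through a queue of medication tasks, and each workflow interruption adds a resume-task delay and a small error probability. All rates below are invented for illustration and are not the paper's hospital-collected MAP data; the point is only how a redesign scenario (fewer interruptions) is compared against a baseline:

```python
import random

random.seed(1)

def simulate_round(n_meds, p_interrupt, base_minutes=3.0):
    """One nurse agent administering n_meds tasks; returns (time, errors)."""
    total_time, errors = 0.0, 0
    for _ in range(n_meds):
        t = base_minutes
        if random.random() < p_interrupt:      # workflow interruption occurs
            t += random.uniform(1.0, 5.0)      # resume-task delay
            if random.random() < 0.05:         # interruption-induced error
                errors += 1
        total_time += t
    return total_time, errors

# Compare the current workflow with a redesign that halves interruptions.
for label, p in [("current", 0.4), ("redesigned", 0.2)]:
    t, e = simulate_round(n_meds=1000, p_interrupt=p)
    print(f"{label}: {t:.0f} min, {e} errors")
```

A validated model of this shape lets a redesign be stress-tested in silico before any clinical implementation, which is the tool the paper argues for.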
Computer simulation of an industrial wastewater treatment process
Energy Technology Data Exchange (ETDEWEB)
Jenke, D.R.; Diebold, F.E.
1985-01-01
The computer program REDEQL.EPAK has been modified to allow for the prediction and simulation of the chemical effects of mixing two or more aqueous solutions and one or more solid phases. In this form the program is capable of modelling the lime neutralisation treatment process for acid mine waters. The program calculates the speciation of all influent solutions, evaluates the equilibrium composition of any mixed solution and provides the stoichiometry of the liquid and solid phases produced as a result of the mixing. The program is used to predict the optimum treatment effluent composition, to determine the amount of neutralising agent (lime) required to produce this optimum composition and to provide information which defines the mechanism controlling the treatment process.
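The dose-optimization idea can be illustrated with a bare-bones charge balance: find the lime dose that brings an acidic water to a target pH. This sketch uses only strong-acid/strong-base chemistry with the water equilibrium (no metal speciation or solid phases, which REDEQL.EPAK handles); all concentrations are invented:

```python
import math

KW = 1e-14   # water ion product at 25 C

def hydrogen_ion(acid_molar, lime_molar):
    """[H+] from the charge balance [H+] - [OH-] = C_acid - 2*C_lime
    (each mole of Ca(OH)2 supplies two moles of OH-)."""
    delta = acid_molar - 2.0 * lime_molar
    return (delta + math.sqrt(delta * delta + 4.0 * KW)) / 2.0

def lime_dose_for_ph(acid_molar, target_ph, lo=0.0, hi=1.0):
    """Bisection on the monotone dose -> pH curve."""
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        ph = -math.log10(hydrogen_ion(acid_molar, mid))
        if ph < target_ph:
            lo = mid          # still too acidic: add more lime
        else:
            hi = mid
    return 0.5 * (lo + hi)

dose = lime_dose_for_ph(acid_molar=0.01, target_ph=7.0)
print(f"lime dose: {dose:.6f} mol/L")   # half the acid molarity, as expected
```

Real acid mine waters add Fe, Al, and sulfate equilibria plus precipitating solids, which is why a full speciation code is needed in practice.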
Computer simulation of randomly cross-linked polymer networks
Williams, T P
2002-01-01
In this work, Monte Carlo and Stochastic Dynamics computer simulations of mesoscale model randomly cross-linked networks were undertaken. Task-parallel implementations of the lattice Monte Carlo Bond Fluctuation model and the Kremer-Grest Stochastic Dynamics bead-spring continuum model were designed and used for this purpose. Lattice and continuum precursor melt systems were prepared and then cross-linked to varying degrees. The resultant networks were used to study structural changes during deformation and relaxation dynamics. The effects of a random network topology, featuring a polydisperse distribution of strand lengths and an abundance of pendant chain ends, were qualitatively compared to recently published work. A preliminary investigation into the effects of temperature on the structural and dynamical properties was also undertaken. Structural changes during isotropic swelling and uniaxial deformation revealed a pronounced non-affine deformation dependent on the degree of cross-linking. Fractal heterogeneiti...
COMPUTER EMULATORS AND SIMULATORS OF MEASURING INSTRUMENTS IN PHYSICS LESSONS
Directory of Open Access Journals (Sweden)
Yaroslav Yu. Dyma
2010-10-01
Full Text Available A prominent feature of the educational physics experiment at the present stage is the use of computer equipment and special software, namely virtual measuring instruments. The purpose of this article is to explain when virtual instruments can be used to conduct a real experiment (in which case they are emulators) and when a virtual one (in which case they are simulators). For better understanding, the implementation of one laboratory experiment using software of both types is given. Since, in learning physics, preference should be given to conducting natural experiments, with the study of real phenomena and the measurement of real physical quantities, the most promising direction appears to be the examination of emulator programs of measuring instruments for their further introduction into the educational process.
Computer simulation of cluster impact induced electronic excitation of solids
Energy Technology Data Exchange (ETDEWEB)
Weidtmann, B.; Hanke, S.; Duvenbeck, A. [Fakultät für Physik, Universität Duisburg-Essen, 47048 Duisburg (Germany); Wucher, A., E-mail: andreas.wucher@uni-deu.de [Fakultät für Physik, Universität Duisburg-Essen, 47048 Duisburg (Germany)
2013-05-15
We present a computational study of electronic excitation upon bombardment of a metal surface with cluster projectiles. Our model employs a molecular dynamics (MD) simulation to calculate the particle dynamics following the projectile impact. Kinetic excitation is implemented via two mechanisms describing the electronic energy loss of moving particles: autoionization in close binary collisions and a velocity proportional friction force resulting from direct atom–electron collisions. Two different friction models are compared with respect to the predicted sputter yields after single atom and cluster bombardment. We find that a density dependent friction coefficient leads to a significant reduction of the total energy transferred to the electronic sub-system as compared to the Lindhard friction model, thereby strongly enhancing the predicted sputter yield under cluster bombardment conditions. In contrast, the yield predicted for monoatomic projectile bombardment remains practically unchanged.
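The velocity-proportional friction mechanism can be caricatured in one dimension: a moving particle loses kinetic energy at a rate set by a friction coefficient, which is either constant (Lindhard-like) or scaled by the local electron density along the path. The integrator, numbers, and density profile below are invented for illustration; the qualitative outcome matches the abstract's finding that a density-dependent coefficient transfers less energy to the electronic subsystem:

```python
import math

def energy_after_path(e0, mass, gamma_of_x, dt=1e-4, t_max=1.0):
    """Kinetic energy left after moving through friction dE/dt = -gamma(x) v^2."""
    x, v = 0.0, math.sqrt(2 * e0 / mass)
    t = 0.0
    while t < t_max and v > 1e-6:
        a = -gamma_of_x(x) * v / mass     # velocity-proportional friction
        v += a * dt
        x += v * dt
        t += dt
    return 0.5 * mass * v * v

# Toy local electron density oscillating between 0 and 1 along the path.
density = lambda x: 0.5 + 0.5 * math.cos(2 * math.pi * x)

e_const = energy_after_path(100.0, 1.0, lambda x: 0.5)            # Lindhard-like
e_dens = energy_after_path(100.0, 1.0, lambda x: 0.5 * density(x))  # density-scaled

print(f"constant friction:          E = {e_const:.2f}")
print(f"density-dependent friction: E = {e_dens:.2f}")
```

Because the density-scaled coefficient averages below the constant one, the particle retains more kinetic energy, i.e. less energy enters the electronic sub-system, which in the paper's model enhances the predicted sputter yield.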
Mixed-Language High-Performance Computing for Plasma Simulations
Directory of Open Access Journals (Sweden)
Quanming Lu
2003-01-01
Full Text Available Java is receiving increasing attention as the most popular platform for distributed computing. However, programmers are still reluctant to embrace Java as a tool for writing scientific and engineering applications due to its still noticeable performance drawbacks compared with other programming languages such as Fortran or C. In this paper, we present a hybrid Java/Fortran implementation of a parallel particle-in-cell (PIC) algorithm for plasma simulations. In our approach, the time-consuming components of this application are designed and implemented as Fortran subroutines, while less calculation-intensive components usually involved in building the user interface are written in Java. The two types of software modules have been glued together using the Java native interface (JNI). Our mixed-language PIC code was tested and its performance compared with pure Java and Fortran versions of the same algorithm on a Sun E6500 SMP system and a Linux cluster of Pentium III machines.
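The time-consuming kernel that the paper pushes into Fortran is the particle advance. A minimal version of that kernel is sketched below in Python with invented field and particle data: a nearest-grid-point 1D electrostatic push on a periodic domain. In the paper's design this inner loop lives in a compiled Fortran subroutine reached through JNI, while the surrounding driver and user interface stay in Java:

```python
def push_particles(x, v, e_field, qm, dt, length):
    """Advance positions/velocities one step on a periodic 1D domain
    (nearest-grid-point field gather, explicit update)."""
    nx = len(e_field)
    for i in range(len(x)):
        cell = int(x[i] / length * nx) % nx   # which grid cell holds particle i
        v[i] += qm * e_field[cell] * dt       # accelerate in the local field
        x[i] = (x[i] + v[i] * dt) % length    # move and wrap around
    return x, v

x = [0.1, 0.5, 0.9]                 # toy particle positions
v = [0.0, 0.0, 0.0]                 # starting at rest
E = [1.0, -1.0, 1.0, -1.0]          # fixed toy field on 4 cells
x, v = push_particles(x, v, E, qm=1.0, dt=0.1, length=1.0)
print(x, v)
```

Because this loop touches every particle every step, it dominates the runtime, which is exactly why the mixed-language split pays off despite the JNI call overhead.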
HIGH-FIDELITY SIMULATION-DRIVEN MODEL DEVELOPMENT FOR COARSE-GRAINED COMPUTATIONAL FLUID DYNAMICS
Energy Technology Data Exchange (ETDEWEB)
Hanna, Botros N.; Dinh, Nam T.; Bolotnov, Igor A.
2016-06-01
Nuclear reactor safety analysis requires identifying various credible accident scenarios and determining their consequences. For a full-scale nuclear power plant system behavior, it is impossible to obtain sufficient experimental data for a broad range of risk-significant accident scenarios. In single-phase flow convective problems, Direct Numerical Simulation (DNS) and Large Eddy Simulation (LES) can provide us with high fidelity results when physical data are unavailable. However, these methods are computationally expensive and cannot be afforded for simulation of long transient scenarios in nuclear accidents despite extraordinary advances in high performance scientific computing over the past decades. The major issue is the inability to make the transient computation parallel, which leaves the number of time steps required by high-fidelity methods unaffordable for long transients. In this work, we propose to apply a high fidelity simulation-driven approach to model the sub-grid scale (SGS) effect in Coarse-Grained Computational Fluid Dynamics (CG-CFD). This approach aims to develop a statistical surrogate model instead of the deterministic SGS model. We chose to start with a turbulent natural convection case with volumetric heating in a horizontal fluid layer with a rigid, insulated lower boundary and an isothermal (cold) upper boundary. This scenario of unstable stratification is relevant to turbulent natural convection in a molten corium pool during a severe nuclear reactor accident, as well as in containment mixing and passive cooling. The presented approach demonstrates how to create a correction for the CG-CFD solution by modifying the energy balance equation. A global correction for the temperature equation is shown to achieve a significant improvement in the prediction of the steady state temperature distribution through the fluid layer.
Key issues in the computational simulation of GPCR function: representation of loop domains
Mehler, E. L.; Periole, X.; Hassan, S. A.; Weinstein, H.
2002-11-01
Some key concerns raised by molecular modeling and computational simulation of functional mechanisms for membrane proteins are discussed and illustrated for members of the family of G protein coupled receptors (GPCRs). Of particular importance are issues related to the modeling and computational treatment of loop regions. These are demonstrated here with results from different levels of computational simulations applied to the structures of rhodopsin and a model of the 5-HT2A serotonin receptor, 5-HT2AR. First, comparative Molecular Dynamics (MD) simulations are reported for rhodopsin in vacuum and embedded in an explicit representation of the membrane and water environment. It is shown that in spite of a partial accounting of solvent screening effects by neutralization of charged side chains, vacuum MD simulations can lead to severe distortions of the loop structures. The primary source of the distortion appears to be formation of artifactual H-bonds, as has been repeatedly observed in vacuum simulations. To address such shortcomings, a recently proposed approach that has been developed for calculating the structure of segments that connect elements of secondary structure with known coordinates, is applied to 5-HT2AR to obtain an initial representation of the loops connecting the transmembrane (TM) helices. The approach consists of a simulated annealing combined with biased scaled collective variables Monte Carlo technique, and is applied to loops connecting the TM segments on both the extra-cellular and the cytoplasmic sides of the receptor. Although this initial calculation treats the loops as independent structural entities, the final structure exhibits a number of interloop interactions that may have functional significance. Finally, it is shown here that in the case where a given loop from two different GPCRs (here rhodopsin and 5-HT2AR) has approximately the same length and some degree of sequence identity, the fold adopted by the loops can be similar. Thus
Petascale computation of multi-physics seismic simulations
Gabriel, Alice-Agnes; Madden, Elizabeth H.; Ulrich, Thomas; Wollherr, Stephanie; Duru, Kenneth C.
2017-04-01
Capturing the observed complexity of earthquake sources in concurrence with seismic wave propagation simulations is an inherently multi-scale, multi-physics problem. In this presentation, we present simulations of earthquake scenarios resolving high-detail dynamic rupture evolution and high frequency ground motion. The simulations combine a multitude of representations of model complexity; such as non-linear fault friction, thermal and fluid effects, heterogeneous fault stress and fault strength initial conditions, fault curvature and roughness, on- and off-fault non-elastic failure to capture dynamic rupture behavior at the source; and seismic wave attenuation, 3D subsurface structure and bathymetry impacting seismic wave propagation. Performing such scenarios at the necessary spatio-temporal resolution requires highly optimized and massively parallel simulation tools which can efficiently exploit HPC facilities. Our up to multi-PetaFLOP simulations are performed with SeisSol (www.seissol.org), an open-source software package based on an ADER-Discontinuous Galerkin (DG) scheme solving the seismic wave equations in velocity-stress formulation in elastic, viscoelastic, and viscoplastic media with high-order accuracy in time and space. Our flux-based implementation of frictional failure remains free of spurious oscillations. Tetrahedral unstructured meshes allow for complicated model geometry. SeisSol has been optimized on all software levels, including: assembler-level DG kernels which obtain 50% peak performance on some of the largest supercomputers worldwide; an overlapping MPI-OpenMP parallelization shadowing the multiphysics computations; usage of local time stepping; parallel input and output schemes and direct interfaces to community standard data formats. All these factors combine to minimise the time-to-solution. The results presented highlight the fact that modern numerical methods and hardware-aware optimization for modern supercomputers are essential.
Computer simulations for minds-on learning with "Project Spectra!"
Wood, E. L.; Renfrow, S.; Marks, N.; Christofferson, R.
2010-12-01
How do we gain information about the Sun? How do we know Mars has CO2 or that Titan has a nitrogen-rich atmosphere? How do we use light in astronomy? These concepts are something education professionals generally struggle with because they are abstract. Making use of visualizations and presenting material so it can be manipulated is the easiest way to conquer abstractions to bring them home to students. Using simulations and computer interactives (games) where students experience and manipulate the information makes concepts accessible. “Project Spectra!” is a science and engineering program that uses computer-based Flash interactives to expose students to astronomical spectroscopy and actual data in a way that is not possible with traditional in-class activities. Visualizing lessons with multi-media is a way to solidify understanding and retention of knowledge and is completely unlike its paper-and-pencil counterpart. To engage students in “Project Spectra!”, students are given a mission, which connects them with the research at hand. Missions range from exploring remote planetary atmospheres and surfaces, experimenting with the Sun using different filters, and comparing spectroscopic atmospheric features between different bodies. Additionally, students have an opportunity to learn about NASA missions, view movies, and see images connected with their mission. In the end, students are asked critical thinking questions and conduct web-based research. These interactives complement the in-class activities where students engineer spectrographs and explore the electromagnetic spectrum.
Filter wheel equalization for chest radiography: a computer simulation.
Boone, J M; Duryea, J; Steiner, R M
1995-07-01
A chest radiographic equalization system using lung-shaped templates mounted on filter wheels is under development. Using this technique, 25 lung templates for each lung are available on two computer-controlled wheels which are located in close proximity to the x-ray tube. The large magnification factor (> 10X) of the templates assures low-frequency equalization due to the blurring of the focal spot. A low-dose image is acquired without templates using a (generic) digital receptor, the image is analyzed, and the left and right lung fields are automatically identified using software developed for this purpose. The most appropriate left and right lung templates are independently selected and are positioned into the field of view at the proper location under computer control. Once the templates are positioned, acquisition of the equalized radiographic image onto film commences at clinical exposure levels. The templates reduce the exposure to the lung fields by attenuating a fraction of the incident x-ray fluence so that the exposure to the mediastinum and diaphragm areas can be increased without overexposing the lungs. A database of 824 digitized chest radiographs was used to determine the shape of the specific lung templates, for both left and right lung fields. A second independent database of 208 images was used to test the performance of the templates using computer simulations. The template shape characteristics derived from the clinical image database are demonstrated. The detected exposure in the lung fields on conventional chest radiographs was found to be, on average, three times the detected exposure behind the diaphragm and mediastinum.(ABSTRACT TRUNCATED AT 250 WORDS)
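The back-of-envelope arithmetic behind equalization follows directly from the abstract's last sentence: if lung fields receive about three times the detected exposure of the mediastinum, a template transmitting one third of the fluence over the lungs flattens the detected image. The attenuation coefficient below is an assumed illustrative value, not from the paper, used only to convert the required transmission into a filter thickness via the Beer-Lambert law t = exp(-mu*d):

```python
import math

lung_over_mediastinum = 3.0                       # exposure ratio from the abstract
transmission = 1.0 / lung_over_mediastinum        # required template transmission

mu = 1.0                                          # assumed 1/mm at chest kVp (illustrative)
thickness_mm = math.log(lung_over_mediastinum) / mu   # solve t = exp(-mu*d) for d

equalized_ratio = lung_over_mediastinum * transmission
print(f"template transmission: {transmission:.3f}")
print(f"filter thickness: {thickness_mm:.2f} mm")
print(f"equalized lung/mediastinum exposure ratio: {equalized_ratio:.1f}")
```

In the actual system the 25 discrete templates per lung quantize this transmission on a patient-shaped footprint, and the large magnification blurs the template edges into a smooth low-frequency correction.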
Poikela, Paula; Ruokamo, Heli; Teräs, Marianne
2015-02-01
Nursing educators must ensure that nursing students acquire the necessary competencies; finding the most purposeful teaching methods and encouraging learning through meaningful learning opportunities is necessary to meet this goal. We investigated student learning in a simulated nursing practice using videography. The purpose of this paper is to examine how two different teaching methods presented students' meaningful learning in a simulated nursing experience. The 6-hour study was divided into three parts: part I, general information; part II, training; and part III, simulated nursing practice. Part II was delivered by two different methods: a computer-based simulation and a lecture. The study was carried out in the simulated nursing practice in two universities of applied sciences, in Northern Finland. The participants in parts I and II were 40 first-year nursing students; 12 student volunteers continued to part III. A qualitative analysis method was used. The data were collected using video recordings and analyzed by videography. The students who used a computer-based simulation program were more likely to report meaningful learning themes than those who were first exposed to the lecture method. Educators should be encouraged to use computer-based simulation teaching in conjunction with other teaching methods to ensure that nursing students are able to receive the greatest educational benefits. Copyright © 2014 Elsevier Ltd. All rights reserved.
Electron-Anode Interactions in Particle-in-Cell Simulations of Applied-B Ion Diodes
Energy Technology Data Exchange (ETDEWEB)
Bailey, J.E.; Cuneo, M.D.; Johnson, D.J.; Mehlhorn, T.A.; Pointon, T.D.; Renk, T.J.; Stygar, W.A.; Vesey, R.A.
1998-11-12
Particle-in-cell simulations of applied-B ion diodes using the QUICKSILVER code have been augmented with Monte Carlo calculations of electron-anode interactions (reflection and energy deposition). Extraction diode simulations demonstrate a link between the instability evolution and increased electron loss and anode heating. Simulations of radial and extraction ion diodes show spatial non-uniformity in the predicted electron loss profile leading to hot spots on the anode that rapidly exceed the 350-450 °C range, known to be sufficient for plasma formation on electron-bombarded surfaces. Thermal resorption calculations indicate complete resorption of contaminants with 15-20 kcal/mole binding energies in high-dose regions of the anode during the power pulse. Comparisons of parasitic ion emission simulations and experiment show agreement in some aspects, but also highlight the need for better ion source, plasma, and neutral gas models.
Theophanides, Mike; Anastassopoulou, Jane
2009-07-01
This study presents an improved methodology for analysing atmospheric pollution around airports using Gaussian-plume numerical simulation integrated with Geographical Information Systems (GIS). The new methodology focuses on streamlining the lengthy analysis process for Airport Environmental Impact Assessments by integrating the definition of emission sources, simulating and displaying the results in a GIS environment. One of the objectives of the research is to validate the methodology applied to the Athens International Airport, "Eleftherios Venizelos", to produce a realistic estimate of emission inventories, dispersion simulations and comparison to measured data. The methodology used a combination of the Emission Dispersion and Modelling System (EDMS) and the Atmospheric Dispersion and Modelling System (ADMS) to improve the analysis process. The second objective is to conduct numerical simulations under various adverse conditions (e.g. scenarios) and assess the dispersion in the surrounding areas. The study concludes that the use of GIS in environmental assessments provides a valuable advantage for organizing data and entering accurate geographical/topological information for the simulation engine. Emissions simulation produced estimates within 10% of published values. Dispersion simulations indicate that airport pollution will affect neighbouring cities such as Rafina and Loutsa. Presently, there are no measured controls in these areas. In some cases, airport pollution can contribute as much as 40% of the permissible EU levels of VOCs.
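The dispersion engine underlying tools like ADMS is, at its simplest, the Gaussian plume equation for a continuous point source with ground reflection. The sketch below uses simple linear sigma fits standing in for the stability-class curves that operational models apply; emission rate, wind speed, and receptor geometry are illustrative values, not the Athens study's inputs:

```python
import math

def gaussian_plume(q, u, x, y, z, h, a=0.08, b=0.06):
    """Concentration (kg/m^3) at (x downwind, y crosswind, z height) from a
    point source of strength q (kg/s) at stack height h in wind speed u (m/s).
    sigma_y = a*x and sigma_z = b*x are simple stand-ins for stability curves."""
    sy, sz = a * x, b * x
    lateral = math.exp(-y * y / (2 * sy * sy))
    vertical = (math.exp(-(z - h) ** 2 / (2 * sz * sz)) +
                math.exp(-(z + h) ** 2 / (2 * sz * sz)))   # image source = ground reflection
    return q / (2 * math.pi * u * sy * sz) * lateral * vertical

# 1 kg/s source at 10 m stack height, 5 m/s wind, receptors 500 m downwind.
c_centerline = gaussian_plume(q=1.0, u=5.0, x=500.0, y=0.0, z=0.0, h=10.0)
c_offaxis = gaussian_plume(q=1.0, u=5.0, x=500.0, y=100.0, z=0.0, h=10.0)
print(f"{c_centerline:.2e} kg/m^3 on axis, {c_offaxis:.2e} off axis")
```

The GIS integration in the paper amounts to evaluating such a field over georeferenced receptor grids so that contours can be overlaid on the towns around the airport.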
Energy Technology Data Exchange (ETDEWEB)
Sun Qi; Groth, Alexandra; Bertram, Matthias; Waechter, Irina; Bruijns, Tom; Hermans, Roel; Aach, Til [Philips Research Europe, Weisshausstrasse 2, 52066 Aachen (Germany) and Institute of Imaging and Computer Vision, RWTH Aachen University, Sommerfeldstrasse 24, 52074 Aachen (Germany); Philips Research Europe, Weisshausstrasse 2, 52066 Aachen (Germany); Philips Healthcare, X-Ray Pre-Development, Veenpluis 4-6, 5684PC Best (Netherlands); Institute of Imaging and Computer Vision, RWTH Aachen University, Sommerfeldstrasse 24, 52074 Aachen (Germany)
2010-09-15
Purpose: Recently, image-based computational fluid dynamics (CFD) simulation has been applied to investigate the hemodynamics inside human cerebral aneurysms. The knowledge of the computed three-dimensional flow fields is used for clinical risk assessment and treatment decision making. However, the reliability of the application specific CFD results has not been thoroughly validated yet. Methods: In this work, by exploiting a phantom aneurysm model, the authors therefore aim to prove the reliability of the CFD results obtained from simulations with sufficiently accurate input boundary conditions. To confirm the correlation between the CFD results and the reality, virtual angiograms are generated by the simulation pipeline and are quantitatively compared to the experimentally acquired angiograms. In addition, a parametric study has been carried out to systematically investigate the influence of the input parameters associated with the current measuring techniques on the flow patterns. Results: Qualitative and quantitative evaluations demonstrate good agreement between the simulated and the real flow dynamics. Discrepancies of less than 15% are found for the relative root mean square errors of time intensity curve comparisons from each selected characteristic position. The investigated input parameters show different influences on the simulation results, indicating the desired accuracy in the measurements. Conclusions: This study provides a comprehensive validation method of CFD simulation for reproducing the real flow field in the cerebral aneurysm phantom under well controlled conditions. The reliability of the CFD is well confirmed. Through the parametric study, it is possible to assess the degree of validity of the associated CFD model based on the parameter values and their estimated accuracy range.
DEFF Research Database (Denmark)
Loubet, Bastien; Lomholt, Michael Andersen; Khandelia, Himanshu
2013-01-01
We investigate the effect of an applied electric potential on the mechanics of a coarse grained POPC bilayer under tension. The size and duration of our simulations allow for a detailed and accurate study of the fluctuations. Effects on the fluctuation spectrum, tension, bending rigidity, and bilayer thickness are investigated in detail. In particular, the least square fitting technique is used to calculate the fluctuation spectra. The simulations confirm a recently proposed theory that the effect of an applied electric potential on the membrane will be moderated by the elastic properties... fluctuations. The effect of the applied electric potential on the bending rigidity is non-existent within error bars. However, when the membrane is stretched there is a point where the bending rigidity is lowered due to a decrease of the thickness of the membrane. All these effects should prove important...
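The least-squares extraction mentioned in the abstract fits the Helfrich fluctuation spectrum, ⟨|h_q|²⟩ = kT / (A (σq² + κq⁴)), for the tension σ and bending rigidity κ. Inverting the spectrum makes the model linear in q² and q⁴, so an ordinary two-parameter least-squares fit suffices. The sketch below recovers known σ and κ from synthetic, noise-free data standing in for the simulation's spectra (all values invented):

```python
KT, AREA = 1.0, 1.0                      # thermal energy and projected area (toy units)
sigma_true, kappa_true = 0.5, 20.0       # "unknown" tension and bending rigidity

qs = [0.1 * (i + 1) for i in range(20)]  # wavevector magnitudes
inv_s = [AREA * (sigma_true * q**2 + kappa_true * q**4) / KT for q in qs]

# Normal equations for the linear model 1/<|h_q|^2> ~ c1*q^2 + c2*q^4.
s22 = sum(q**4 for q in qs)
s24 = sum(q**6 for q in qs)
s44 = sum(q**8 for q in qs)
b2 = sum(y * q**2 for q, y in zip(qs, inv_s))
b4 = sum(y * q**4 for q, y in zip(qs, inv_s))
det = s22 * s44 - s24 * s24
c1 = (b2 * s44 - b4 * s24) / det
c2 = (s22 * b4 - s24 * b2) / det

sigma_fit, kappa_fit = c1 * KT / AREA, c2 * KT / AREA
print(f"sigma = {sigma_fit:.3f}, kappa = {kappa_fit:.3f}")
```

On real simulation spectra the low-q (tension-dominated) and high-q (rigidity-dominated) regimes constrain σ and κ separately, which is why long, large simulations like those in the paper are needed for accurate fits.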
Computer simulation of disordered compounds and solid solutions
Pongsai, B
2001-01-01
simulations have also been carried out for comparison. Atomistic simulations (QLD and MCX) with full relaxation predict {Delta}H{sub mix} values that are less negative than those from the HF calculations in the static limit. The phase diagram is also calculated from the MCX calculations, indicating complete miscibility at temperatures as low as 200 K. In Chapter 5, both QLD and MCX methods are used to investigate the disordered metallic alloy Pd-Rh. At 300 K, vibrational contributions contribute significantly to the entropy, but not to the enthalpy of mixing. The calculated consolute temperature is just above 1200 K, in good agreement with experiment. In Chapter 6, we apply our new technique to the order-disorder phase transition of the complex oxygen-deficient perovskite Sr{sub 2}Fe{sub 2}O{sub 5}. The calculated order-disorder phase transition occurs between 700 and 930 K, in good agreement with experiment. Investigation of individual configurations shows the coexistence of four-, five- and six-coordinated...
Computer Simulation and Data Analysis in Molecular Biology and Biophysics An Introduction Using R
Bloomfield, Victor
2009-01-01
This book provides an introduction, suitable for advanced undergraduates and beginning graduate students, to two important aspects of molecular biology and biophysics: computer simulation and data analysis. It introduces tools to enable readers to learn and use fundamental methods for constructing quantitative models of biological mechanisms, both deterministic and with some elements of randomness, including complex reaction equilibria and kinetics, population models, and regulation of metabolism and development; to understand how concepts of probability can help in explaining important features of DNA sequences; and to apply a useful set of statistical methods to analysis of experimental data from spectroscopic, genomic, and proteomic sources. These quantitative tools are implemented using the free, open source software program R. R provides an excellent environment for general numerical and statistical computing and graphics, with capabilities similar to Matlab®. Since R is increasingly used in bioinformat...
Adaptive finite element simulation of flow and transport applications on parallel computers
Kirk, Benjamin Shelton
The subject of this work is the adaptive finite element simulation of problems arising in flow and transport applications on parallel computers. Of particular interest are new contributions to adaptive mesh refinement (AMR) in this parallel high-performance context, including novel work on data structures, treatment of constraints in a parallel setting, generality and extensibility via object-oriented programming, and the design/implementation of a flexible software framework. This technology and software capability then enables more robust, reliable treatment of multiscale--multiphysics problems and specific studies of fine scale interaction such as those in biological chemotaxis (Chapter 4) and high-speed shock physics for compressible flows (Chapter 5). The work begins by presenting an overview of key concepts and data structures employed in AMR simulations. Of particular interest is how these concepts are applied in the physics-independent software framework which is developed here and is the basis for all the numerical simulations performed in this work. This open-source software framework has been adopted by a number of researchers in the U.S. and abroad for use in a wide range of applications. The dynamic nature of adaptive simulations poses particular issues for efficient implementation on distributed-memory parallel architectures. Communication cost, computational load balance, and memory requirements must all be considered when developing adaptive software for this class of machines. Specific extensions to the adaptive data structures to enable implementation on parallel computers are therefore considered in detail. The libMesh framework for performing adaptive finite element simulations on parallel computers is developed to provide a concrete implementation of the above ideas. This physics-independent framework is applied to two distinct flow and transport applications classes in the subsequent application studies to illustrate the flexibility of the
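The refine-where-the-indicator-is-large idea at the heart of AMR can be sketched in one dimension; the midpoint-interpolation error indicator, tolerance, and test function below are illustrative and much simpler than libMesh's actual error estimators:

```python
import math

# 1D adaptive refinement sketch: split cells where a simple error indicator
# (deviation of the midpoint value from linear interpolation) exceeds a tolerance
def refine(cells, f, tol=0.1, max_level=6):
    out = []
    for a, b, lvl in cells:
        mid = 0.5 * (a + b)
        err = abs(f(mid) - 0.5 * (f(a) + f(b)))
        if err > tol and lvl < max_level:
            out += refine([(a, mid, lvl + 1), (mid, b, lvl + 1)], f, tol, max_level)
        else:
            out.append((a, b, lvl))
    return out

# a steep front near x = 0.3 attracts the deepest refinement
cells = refine([(0.0, 1.0, 0)], lambda x: math.tanh(40 * (x - 0.3)))
levels = sorted(lvl for _, _, lvl in cells)
print(len(cells), levels[0], levels[-1])
```

The resulting cell list is strongly graded: coarse where the solution is smooth, fine near the front, which is exactly the non-uniform load that makes parallel load balancing of AMR non-trivial.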
Macular translocation surgery: computer simulation of visual perception.
Wong, D; Liazos, S; Mehta, J; Farnell, D J J
2008-06-01
Macular translocation can be associated with visual improvement, but patients often experience symptoms of confusion or diplopia. There is a high incidence of suppression of the operated or the fellow eye. The aim of this study is to use computer software to examine the pre- and post-operative fundal images, in order to better understand how patients see after macular translocation surgery. We created a graphical user interface that allowed a user to identify and record common landmark points in pre- and post-operative fundal images. We used these points to carry out interpolations using two algorithms, namely bilinear and thin-plate spline transformations. The transformations were applied to the Mona Lisa in order to appreciate how patients might see. Given two sets of corresponding points, both algorithms were able to approximate the effect of the surgery. Bilinear transformation was able to account for changes to the retina as a whole, including rotation, stretches, compression and shear. The thin-plate spline algorithm additionally accounted for the considerable regional and uneven local effects. Applying the latter algorithm to the Mona Lisa produced inconsistent and warped images. Our results confirmed that neurosensory redistribution was associated with most cases of MT360. We infer from these results that corresponding retinal elements between two eyes would no longer correspond after surgery. The distortion of images from the operated eye could not be completely corrected by squint surgery, and this may account for the high incidence of suppression of the fellow or the operated eye after surgery.
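A minimal sketch of the landmark-based bilinear transformation described above; the landmark coordinates and the rotation-plus-stretch deformation are synthetic placeholders, not fundus data:

```python
import numpy as np

def fit_bilinear(src, dst):
    """Fit x' = a0 + a1*x + a2*y + a3*x*y (and likewise for y') by least squares."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    x, y = src[:, 0], src[:, 1]
    A = np.column_stack([np.ones_like(x), x, y, x * y])
    ax, *_ = np.linalg.lstsq(A, dst[:, 0], rcond=None)
    ay, *_ = np.linalg.lstsq(A, dst[:, 1], rcond=None)
    return ax, ay

def apply_bilinear(ax, ay, pts):
    pts = np.asarray(pts, float)
    x, y = pts[:, 0], pts[:, 1]
    A = np.column_stack([np.ones_like(x), x, y, x * y])
    return np.column_stack([A @ ax, A @ ay])

# landmarks "before" and "after": a small rotation plus a uniform stretch
src = np.array([[0, 0], [1, 0], [0, 1], [1, 1], [0.5, 0.5]])
theta = np.deg2rad(10)
R = np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])
dst = src @ R.T * 1.1  # rotate 10 degrees and stretch 10%
ax, ay = fit_bilinear(src, dst)
mapped = apply_bilinear(ax, ay, src)
print(np.allclose(mapped, dst, atol=1e-8))
```

Because rotation, stretch, compression, and shear are all affine, this model reproduces them exactly; the regional, uneven effects mentioned in the abstract are what require the thin-plate spline instead.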
Lebcir, Reda M; Choudrie, Jyoti; Atun, Rifat A; Coker, Richard J
2009-01-01
The aim of this paper is to describe the development and use of a computer simulation model that can be used as a Decision Support System (DSS) to tackle the critical public health issues of HIV and HIV-related tuberculosis in the Russian Federation. This country has recently witnessed an explosion of HIV infections and a worrying spread of the Multi-Drug Resistant form of Tuberculosis (MDRTB). The conclusions drawn are that a high population coverage with Highly Active Anti-Retroviral Treatment (HAART) (75% or higher), allied with high MDRTB cure rates, reduces cumulative deaths by 60%, with limited impact below this level. This research offers a simulation model that can be applied as a DSS by public health officials to inform policy making. By doing so, ways of controlling the spread of HIV and MDRTB and reducing mortality from these serious public health threats are provided.
Chard, T
1989-05-01
A computer simulation is described which generates 'cases' of vaginal discharge. This simulation was used to evaluate the potential effects of dependence between clinical features on the diagnostic performance of Bayes' theorem. The following observations were made: (1) dependence between some but not all pairs of features reduced the overall diagnostic efficiency (judged by the number of true positive diagnoses); (2) the overall reduction in efficiency was never substantial; (3) the largest effects on the diagnosis of individual conditions were observed with rarer diseases, and with combinations of features which were inherently unlikely to be dependent. It is concluded that the diagnostic efficiency of Bayes' theorem will not be greatly influenced by dependence if a reasonable amount of common sense is applied to the selection of the knowledge base.
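The Bayes' theorem diagnosis under the conditional-independence assumption can be sketched on simulated cases; the disease names, priors, and feature likelihoods below are hypothetical placeholders, not the study's actual knowledge base:

```python
import numpy as np

rng = np.random.default_rng(1)
priors = {"candidiasis": 0.5, "trichomoniasis": 0.3, "bacterial": 0.2}
# P(feature present | disease) for three hypothetical clinical features
likelihood = {
    "candidiasis":    [0.8, 0.2, 0.1],
    "trichomoniasis": [0.3, 0.7, 0.4],
    "bacterial":      [0.2, 0.3, 0.8],
}

def generate_case(disease):
    """Generate features independently given the disease (the 'naive' assumption)."""
    return [int(rng.random() < p) for p in likelihood[disease]]

def diagnose(features):
    """Posterior over diseases via Bayes' theorem with independent features."""
    post = {}
    for d, prior in priors.items():
        p = prior
        for f, pf in zip(features, likelihood[d]):
            p *= pf if f else (1 - pf)
        post[d] = p
    z = sum(post.values())
    return {d: p / z for d, p in post.items()}

# diagnostic efficiency judged by the number of true positive diagnoses
n, correct = 2000, 0
for _ in range(n):
    d = rng.choice(list(priors), p=list(priors.values()))
    post = diagnose(generate_case(d))
    correct += max(post, key=post.get) == d
print(correct / n)
```

Introducing correlations into `generate_case` while leaving `diagnose` unchanged is the kind of mismatch whose effect the study measured.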
[New methods in training of paediatric emergencies: medical simulation applied to paediatrics].
González Gómez, J M; Chaves Vinagre, J; Ocete Hita, E; Calvo Macías, C
2008-06-01
Patient safety constitutes one of the main objectives in health care. Among other recommendations, such as the creation of training centres and the development of patient safety programmes, of great importance is the creation of training programmes for work teams using medical simulation. Medical simulation is defined as "a situation or environment created to allow persons to experience a representation of a real event for the purpose of practice, learning, evaluation or to understand systems or human actions". In this way, abilities can be acquired for serious and uncommon situations with no risk of harm to the patient. This study reviews the origins of medical simulation and classifies the different types of simulation. The main simulators currently used in paediatrics are presented, and the design of a simulation course applied to the training of paediatric emergencies is described, detailing all its different phases. In the first (non-face-to-face) stage, a new concept in medical training known as e-learning is applied. In the second phase, clinical cases are carried out using robotic simulation; this is followed by a debriefing session, which is a key element for acquiring abilities and skills. Lastly, the follow-up phase allows the student to connect with the teachers to consolidate the concepts acquired during the in-person phase. In this model, the aim is to improve scientific-technical abilities in addition to a series of related abilities such as controlling crisis situations, correct leadership of work teams, distribution of tasks, communication among team members, etc., all within the present concept of excellence in care and medical professionalism.
Computer-Aided Design, Modeling and Simulation of a New Solar Still Design
Directory of Open Access Journals (Sweden)
Jeremy (Zheng) Li
2011-01-01
Full Text Available Clean, pure drinking water is essential, but available water sources are often brackish or contaminated with bacteria and cannot be used for drinking. About 78% of the available water is salty seawater, 21% is brackish, and only 1% is fresh. Distillation is one of the feasible processes applied to water purification, and it requires an energy input, such as solar radiation. Water is evaporated in this distillation process, and the water vapor can then be separated and condensed to pure water. With the change from conventional fuels to renewable and environmentally friendly energy sources, modern technology allows the abundant energy of the sun to be used. It is better to use solar energy for water desalination since it is more economical than the use of conventional energies. The main focus of this paper is applying computer-aided modeling and simulation to design a less complex solar water distillation system. A prototype of this solar still system is also built to verify its feasibility, functionality, and reliability. The computational simulation and prototype testing show the reliability and proper functionality of this solar water distillation system.
Accurate simulation of MPPT methods performance when applied to commercial photovoltaic panels.
Cubas, Javier; Pindado, Santiago; Sanz-Andrés, Ángel
2015-01-01
A new, simple, and quick-calculation methodology to obtain a solar panel model, based on the manufacturers' datasheet, to perform MPPT simulations is described. The method takes into account variations in the ambient conditions (sun irradiation and solar cell temperature) and allows fast comparison of MPPT methods, or prediction of their performance, when applied to a particular solar panel. The feasibility of the described methodology is checked with four different MPPT methods applied to a commercial solar panel, within a day, and under realistic ambient conditions.
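A much-reduced sketch of the idea, assuming an ideal single-diode model with series and shunt resistances neglected and hypothetical datasheet values; the paper's actual methodology extracts a more complete model from the manufacturer's datasheet:

```python
import numpy as np

# Hypothetical datasheet values at standard test conditions (placeholders)
Isc, Voc = 8.21, 32.9        # short-circuit current (A), open-circuit voltage (V)
n, Ns, T = 1.3, 54, 298.15   # ideality factor, cells in series, temperature (K)
Vt = 1.380649e-23 * T / 1.602176634e-19  # thermal voltage per cell (V)

# Ideal single-diode model, series/shunt resistances neglected:
# I(V) = Isc - I0*(exp(V/(n*Ns*Vt)) - 1), with I0 chosen so that I(Voc) = 0
I0 = Isc / (np.exp(Voc / (n * Ns * Vt)) - 1)
V = np.linspace(0, Voc, 1000)
I = Isc - I0 * (np.exp(V / (n * Ns * Vt)) - 1)
P = V * I

# the maximum power point an MPPT algorithm must track
k = P.argmax()
print(f"MPP: {V[k]:.1f} V, {I[k]:.2f} A, {P[k]:.1f} W")
```

Recomputing the curve for different irradiance and temperature values gives the moving maximum power point that the compared MPPT methods chase.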
Accurate Simulation of MPPT Methods Performance When Applied to Commercial Photovoltaic Panels
Directory of Open Access Journals (Sweden)
Javier Cubas
2015-01-01
Full Text Available A new, simple, and quick-calculation methodology to obtain a solar panel model, based on the manufacturers’ datasheet, to perform MPPT simulations is described. The method takes into account variations in the ambient conditions (sun irradiation and solar cell temperature) and allows fast comparison of MPPT methods, or prediction of their performance, when applied to a particular solar panel. The feasibility of the described methodology is checked with four different MPPT methods applied to a commercial solar panel, within a day, and under realistic ambient conditions.
"Simulated molecular evolution" or computer-generated artifacts?
Darius, F; Rojas, R
1994-11-01
1. The authors define a function with value 1 for the positive examples and 0 for the negative ones. They fit a continuous function but do not deal at all with the error margin of the fit, which is almost as large as the function values they compute. 2. The term "quality" for the value of the fitted function gives the impression that some biological significance is associated with values of the fitted function strictly between 0 and 1, but there is no justification for this kind of interpretation, and finding the point where the fit achieves its maximum does not make sense. 3. By neglecting the error margin, the authors try to optimize the fitted function using differences in the second, third, fourth, and even fifth decimal place, which have no statistical significance. 4. Even if such a fit could profit from more data points, the authors should first prove that the region of interest has some kind of smoothness, that is, that a continuous fit makes any sense at all. 5. "Simulated molecular evolution" is a misnomer. We are dealing here with random search. Since the margin of error is so large, the fitted function does not provide statistically significant information about the points in search space where strings with cleavage sites could be found. This implies that the method is a highly unreliable stochastic search in the space of strings, even if the neural network is capable of learning some simple correlations. 6. Classical statistical methods are, for this kind of problem with so few data points, clearly superior to the neural networks used as a "black box" by the authors, which in the way they are structured provide a model with an error margin as large as the numbers being computed. 7. And finally, even if someone provided us with a function which perfectly separates strings with cleavage sites from strings without them, so-called simulated molecular evolution would not be better than random selection. Since a perfect fit would only produce exactly ones or
Fast acceleration of 2D wave propagation simulations using modern computational accelerators.
Directory of Open Access Journals (Sweden)
Wei Wang
Full Text Available Recent developments in modern computational accelerators like Graphics Processing Units (GPUs) and coprocessors provide great opportunities for making scientific applications run faster than ever before. However, efficient parallelization of scientific code using new programming tools like CUDA requires a high level of expertise that is not available to many scientists. This, plus the fact that parallelized code is usually not portable to different architectures, creates major challenges for exploiting the full capabilities of modern computational accelerators. In this work, we sought to overcome these challenges by studying how to achieve both automated parallelization using OpenACC and enhanced portability using OpenCL. We applied our parallelization schemes using GPUs as well as the Intel Many Integrated Core (MIC) coprocessor to reduce the run time of wave propagation simulations. We used a well-established 2D cardiac action potential model as a specific case study. To the best of our knowledge, we are the first to study auto-parallelization of 2D cardiac wave propagation simulations using OpenACC. Our results identify several approaches that provide substantial speedups. The OpenACC-generated GPU code achieved more than a 150x speedup over the sequential implementation and required the addition of only a few OpenACC pragmas to the code. An OpenCL implementation ran on GPUs at least 200x faster than the sequential implementation and 30x faster than a parallelized OpenMP implementation. An implementation of OpenMP on the Intel MIC coprocessor provided speedups of 120x with only a few code changes to the sequential implementation. We highlight that OpenACC provides an automatic, efficient, and portable approach to achieve parallelization of 2D cardiac wave simulations on GPUs. Our approach of using OpenACC, OpenCL, and OpenMP to parallelize this particular model on modern computational accelerators should be applicable to other
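The kind of explicit stencil kernel that such pragmas parallelize can be sketched in NumPy; this generic 2D wave-equation update is a stand-in, not the cardiac action potential model used in the study:

```python
import numpy as np

def wave_step(u, u_prev, c=1.0, dt=0.1, dx=1.0):
    """One explicit finite-difference step of the 2D wave equation.
    Each grid point depends only on its neighbors at the previous steps,
    which is why this loop nest parallelizes so well under OpenACC,
    OpenCL, or OpenMP."""
    lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
           np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4 * u) / dx**2
    return 2 * u - u_prev + (c * dt)**2 * lap

# a point disturbance spreading over a periodic grid
u = np.zeros((64, 64))
u[32, 32] = 1.0
u_prev = u.copy()
for _ in range(50):
    u, u_prev = wave_step(u, u_prev), u
print(np.isfinite(u).all())  # stable here: c*dt/dx = 0.1 is well below the CFL limit
```

In the accelerated versions described in the abstract, the same per-point update would be annotated with a directive (e.g., an OpenACC `parallel loop` pragma in the C source) rather than vectorized with NumPy.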
National Research Council Canada - National Science Library
Waltemath, Dagmar; Adams, Richard; Bergmann, Frank T; Hucka, Michael; Kolpakov, Fedor; Miller, Andrew K; Moraru, Ion I; Nickerson, David; Sahle, Sven; Snoep, Jacky L; Le Novère, Nicolas
2011-01-01
.... In this article, we present the Simulation Experiment Description Markup Language (SED-ML). SED-ML encodes in a computer-readable exchange format the information required by MIASE to enable reproduction of simulation experiments...
A system simulation model applied to the production schedule of a fish processing facility
Directory of Open Access Journals (Sweden)
Carla Roberta Pereira
2012-11-01
Full Text Available Simulation seeks to bring reality into a controlled environment, where its behavior can be studied under several conditions without involving physical risks and/or high costs. System simulation thus becomes a useful and powerful technique in emerging markets, such as the tilapiculture sector, which needs to expand its business. The main purpose of this study was the development of a simulation model to assist decision making in the production scheduling of a fish processing facility. The research methods applied were the case study and modeling/simulation, including the SimuCAD methodology and the development phases of a simulation model. The model works with several alternative scenarios, testing different working shifts, types of flows, and production capacities, besides variations of the ending inventory and sales. The result of this research was a useful and differentiated simulation model to assist decision making in the production scheduling of the fish processing facility studied.
Computational model for simulation small testing launcher, technical solution
Energy Technology Data Exchange (ETDEWEB)
Chelaru, Teodor-Viorel, E-mail: teodor.chelaru@upb.ro [University POLITEHNICA of Bucharest - Research Center for Aeronautics and Space, Str. Ghe Polizu, nr. 1, Bucharest, Sector 1 (Romania); Cristian, Barbu, E-mail: barbucr@mta.ro [Military Technical Academy, Romania, B-dul. George Coşbuc, nr. 81-83, Bucharest, Sector 5 (Romania); Chelaru, Adrian, E-mail: achelaru@incas.ro [INCAS -National Institute for Aerospace Research Elie Carafoli, B-dul Iuliu Maniu 220, 061126, Bucharest, Sector 6 (Romania)
2014-12-10
The purpose of this paper is to present some aspects regarding the computational model and technical solutions for a multistage suborbital launcher for testing (SLT) used to test spatial equipment and scientific measurements. The computational model consists in the numerical simulation of the SLT evolution for different start conditions. The launcher model presented will have six degrees of freedom (6DOF) and variable mass. The results analysed will be the flight parameters and ballistic performances. The discussion will focus on the technical possibility of realizing a small multi-stage launcher by recycling military rocket motors. From a technical point of view, the paper is focused on the national project 'Suborbital Launcher for Testing' (SLT), which is based on hybrid propulsion and control systems obtained through an original design. Therefore, while classical suborbital sounding rockets are unguided and use a solid-fuel motor for propulsion in an uncontrolled ballistic flight, the SLT project introduces a different approach by proposing the creation of a guided suborbital launcher, which is basically a satellite launcher at a smaller scale, containing its main subsystems. This is why the project itself can be considered an intermediary step in the development of a wider range of launching systems based on hybrid propulsion technology, which may have a major impact on the future European launcher programs. The SLT project, as shown in the title, has two major objectives: first, a short-term objective, which consists in obtaining a suborbital launching system which will be able to go into service in a predictable period of time, and a long-term objective that consists in the development and testing of some unconventional sub-systems which will be integrated later in the satellite launcher as a part of the European space program. This is why the technical content of the project must be carried out beyond the range of the existing suborbital
3rd International Doctoral Symposium on Applied Computation and Security Systems
Saeed, Khalid; Cortesi, Agostino; Chaki, Nabendu
2017-01-01
This book presents extended versions of papers originally presented and discussed at the 3rd International Doctoral Symposium on Applied Computation and Security Systems (ACSS 2016) held from August 12 to 14, 2016 in Kolkata, India. The symposium was jointly organized by the AGH University of Science & Technology, Cracow, Poland; Ca’ Foscari University, Venice, Italy; and the University of Calcutta, India. The book is divided into two volumes, Volumes 3 and 4, and presents dissertation works in the areas of Image Processing, Biometrics-based Authentication, Soft Computing, Data Mining, Next-Generation Networking and Network Security, Remote Healthcare, Communications, Embedded Systems, Software Engineering and Service Engineering. The first two volumes of the book published the works presented at ACSS 2015, which was held from May 23 to 25, 2015 in Kolkata, India.
Special Issue on Entropy-Based Applied Cryptography and Enhanced Security for Ubiquitous Computing
Directory of Open Access Journals (Sweden)
James (Jong Hyuk) Park
2016-09-01
Full Text Available Entropy is a basic and important concept in information theory. It is also often used as a measure of the unpredictability of a cryptographic key in cryptography research. Ubiquitous computing (Ubi-comp) has emerged rapidly as an exciting new paradigm. In this special issue, we mainly selected and discussed papers on core theories based on graph theory for solving computational problems in cryptography and security, and on practical technologies, applications, and services for Ubi-comp, including secure encryption techniques; identity and authentication; credential cloning attacks and countermeasures; switching generators with resistance against algebraic and side-channel attacks; entropy-based network anomaly detection; applied cryptography using chaos functions; information hiding and watermarking; secret sharing; message authentication; detection and modeling of cyber attacks with Petri nets; and quantum flows for secret key distribution, etc.
Quantifying Uncertainty from Computational Factors in Simulations of a Model Ballistic System
2017-08-01
formulation of automated computing. Modern computational science involves the use of digital computers to solve mathematical models of various... Scientific investigation can be broadly grouped into 3 domains: experimental, theoretical, and computational. Experimental science involves direct...
Application of Model-Based Signal Processing Methods to Computational Electromagnetics Simulators
National Research Council Canada - National Science Library
Ling, Hao
2000-01-01
This report summarizes the scientific progress on the research grant "Application of Model-Based Signal Processing Methods to Computational Electromagnetics Simulators" during the period 1 December...