WorldWideScience

Sample records for simulation tool fast

  1. A Deep Learning tool for fast simulation

    CERN. Geneva

    2018-01-01

Machine Learning techniques have been used in different applications by the HEP community: in this talk, we discuss the case of detector simulation. The need for simulated events, expected in the future for LHC experiments and their High Luminosity upgrades, is increasing dramatically and requires new fast simulation solutions. We will describe an R&D activity aimed at providing a configurable tool capable of training a neural network to reproduce the detector response and replace standard Monte Carlo simulation. This is a generic approach in the sense that such a network could be designed and trained to simulate any kind of detector and, eventually, the whole data processing chain, in order to obtain, directly and in one step, the final reconstructed quantities in a small fraction of the time. We will present the first application of three-dimensional convolutional Generative Adversarial Networks to the simulation of high-granularity electromagnetic calorimeters. We will describe detailed validation s...

  2. Nuclear Fuel Cycle Analysis and Simulation Tool (FAST)

    Ko, Won Il; Kwon, Eun Ha; Kim, Ho Dong

    2005-06-15

This paper describes the Nuclear Fuel Cycle Analysis and Simulation Tool (FAST), which has been developed by the Korea Atomic Energy Research Institute (KAERI). Categorizing various mixes of nuclear reactors and fuel cycles into 11 scenario groups, FAST calculates all the required quantities for each nuclear fuel cycle component, such as mining, conversion, enrichment and fuel fabrication, for each scenario. A major advantage of FAST is that the code employs an MS Excel spreadsheet with Visual Basic for Applications, allowing users to manipulate it with ease. The calculation is also quick enough to compare different options in a considerably short time. This user-friendly simulation code is expected to be beneficial to further studies on the nuclear fuel cycle to find the best options for the future, with proliferation risk, environmental impact and economic costs all considered.
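
The mass-flow bookkeeping such a scenario code performs can be illustrated with a small sketch. The function below back-calculates front-end quantities (natural-uranium feed, separative work, conversion and fabrication inputs) from a required fuel loading using the standard separative-work value function; the assay values and loss factors are illustrative assumptions, not FAST's actual scenario data.

```python
from math import log

def value_fn(x):
    """Separation potential V(x) = (2x - 1) * ln(x / (1 - x))."""
    return (2 * x - 1) * log(x / (1 - x))

def fuel_cycle_front_end(product_t, x_p=0.045, x_f=0.00711, x_w=0.0025,
                         fab_loss=0.01, conv_loss=0.005):
    """Back-calculate front-end quantities for a given fuel loading.

    product_t : enriched uranium leaving fabrication [tU]
    x_p, x_f, x_w : product, feed (natural U) and tails assays (assumed values)
    Returns a dict of mining/conversion/enrichment/fabrication quantities.
    """
    fab_input = product_t / (1 - fab_loss)            # U entering fabrication
    feed = fab_input * (x_p - x_w) / (x_f - x_w)      # natural-U feed to enrichment
    tails = feed - fab_input
    swu = (fab_input * value_fn(x_p) + tails * value_fn(x_w)
           - feed * value_fn(x_f))                    # separative work [tSWU]
    conv_input = feed / (1 - conv_loss)               # U entering conversion
    return {"mining_tU": conv_input, "conversion_tU": conv_input,
            "enrichment_feed_tU": feed, "swu_tSWU": swu,
            "fabrication_tU": fab_input}

req = fuel_cycle_front_end(20.0)   # e.g. 20 tU of 4.5%-enriched fuel per year
```

A scenario code chains many such component calculations per reactor group and year; the point here is only the direction of the calculation, working backwards from the fuel demand.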

  3. Fast simulation tool for ultraviolet radiation at the earth's surface

    Engelsen, Ola; Kylling, Arve

    2005-04-01

    FastRT is a fast, yet accurate, UV simulation tool that computes downward surface UV doses, UV indices, and irradiances in the spectral range 290 to 400 nm with a resolution as small as 0.05 nm. It computes a full UV spectrum within a few milliseconds on a standard PC, and enables the user to convolve the spectrum with user-defined and built-in spectral response functions including the International Commission on Illumination (CIE) erythemal response function used for UV index calculations. The program accounts for the main radiative input parameters, i.e., instrumental characteristics, solar zenith angle, ozone column, aerosol loading, clouds, surface albedo, and surface altitude. FastRT is based on look-up tables of carefully selected entries of atmospheric transmittances and spherical albedos, and exploits the smoothness of these quantities with respect to atmospheric, surface, geometrical, and spectral parameters. An interactive site, http://nadir.nilu.no/~olaeng/fastrt/fastrt.html, enables the public to run the FastRT program with most input options. This page also contains updated information about FastRT and links to freely downloadable source codes and binaries.
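
The look-up-table approach described above can be sketched as follows: tabulate a smooth quantity on a coarse grid once, then interpolate at query time. The transmittance model below is a toy placeholder (FastRT's actual tables come from full radiative-transfer computations over more parameters), but the interpolation mechanics are the same.

```python
import numpy as np

# Toy look-up table: a hypothetical transmittance as a smooth function of
# solar zenith angle (deg) and ozone column (DU) -- not FastRT's real tables.
sza_grid = np.linspace(0, 80, 9)
o3_grid = np.linspace(200, 500, 7)

def toy_transmittance(sza, o3):
    # smooth placeholder standing in for a radiative-transfer computation
    return np.exp(-o3 / 300.0 / np.cos(np.radians(sza)))

table = toy_transmittance(sza_grid[:, None], o3_grid[None, :])

def lut_lookup(sza, o3):
    """Bilinear interpolation in the precomputed table."""
    i = np.clip(np.searchsorted(sza_grid, sza) - 1, 0, len(sza_grid) - 2)
    j = np.clip(np.searchsorted(o3_grid, o3) - 1, 0, len(o3_grid) - 2)
    ts = (sza - sza_grid[i]) / (sza_grid[i + 1] - sza_grid[i])
    to = (o3 - o3_grid[j]) / (o3_grid[j + 1] - o3_grid[j])
    return ((1 - ts) * (1 - to) * table[i, j]
            + ts * (1 - to) * table[i + 1, j]
            + (1 - ts) * to * table[i, j + 1]
            + ts * to * table[i + 1, j + 1])
```

Because the tabulated quantity varies smoothly with the inputs, a coarse grid suffices and each query costs only a handful of arithmetic operations, which is what makes millisecond-scale full-spectrum computation possible.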

  4. Simulation of Logging-while-drilling Tool Response Using Integral Equation Fast Fourier Transform

    Sun Xiang-Yang

    2017-01-01

We rely on the volume integral equation (VIE) method for the simulation of logging-while-drilling (LWD) tool response, using the integral equation fast Fourier transform (IE-FFT) algorithm to accelerate the computation of the matrix-vector product in the iterative solver. Exploiting the Toeplitz structure of the interpolated Green's function on uniform Cartesian grids, the method uses the FFT to calculate the matrix-vector multiplication, and at the same time reduces the memory requirement and CPU time. In this paper, the IE-FFT method is used in the simulation of LWD for the first time. Numerical results are presented to demonstrate the accuracy and efficiency of the method. Compared with the Method of Moments (MoM) and other fast algorithms, IE-FFT has distinct advantages in terms of memory requirement and CPU time. In addition, this paper studies the truncation, the mesh elements, the size of the IE-FFT interpolation grids, and dipping formations, and draws some conclusions of wide applicability.
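
The core trick behind IE-FFT, multiplying by a Toeplitz matrix in O(n log n) by embedding it in a circulant matrix whose action the FFT diagonalizes, can be sketched in a few lines. This is a 1-D illustration; the actual algorithm applies the same idea to 3-D Toeplitz blocks arising from the Green's function interpolated on the Cartesian grid.

```python
import numpy as np

def toeplitz_matvec(c, r, x):
    """Multiply an n-by-n Toeplitz matrix by x in O(n log n).

    c : first column, r : first row (with c[0] == r[0]), x : vector.
    The Toeplitz matrix is embedded in a 2n circulant; a circulant
    matvec is an element-wise product in Fourier space.
    """
    n = len(x)
    # first column of the embedding circulant: [c, 0, reversed r[1:]]
    col = np.concatenate([c, [0.0], r[1:][::-1]])
    y = np.fft.ifft(np.fft.fft(col) *
                    np.fft.fft(np.concatenate([x, np.zeros(n)])))
    return y[:n].real

# example: 4x4 Toeplitz with first column c and first row r
c = np.array([1.0, 2.0, 3.0, 4.0])
r = np.array([1.0, 5.0, 6.0, 7.0])
x = np.array([1.0, 1.0, 1.0, 1.0])
y = toeplitz_matvec(c, r, x)   # same result as the dense multiply
```

Only the first column of the circulant is stored (O(n) memory instead of O(n^2) for the dense matrix), which is the source of both the memory and CPU savings quoted in the abstract.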

  5. Simulation tools and new developments of the molten salt fast reactor

    Merle-Lucotte, E.; Doligez, X.; Heuer, D.; Allibert, M.; Ghetta, V.

    2010-01-01

Starting from the Molten Salt Breeder Reactor project at Oak Ridge, we have performed parametric studies in terms of safety coefficients, reprocessing requirements and breeding capabilities. In the frame of this major re-evaluation of the molten salt reactor (MSR), we have developed a new concept called the Molten Salt Fast Reactor (MSFR), based on the thorium fuel cycle and a fast neutron spectrum. This concept was selected for further studies by the MSR steering committee of the Generation IV International Forum in 2009. Our studies of the MSFR concept rely on numerical simulations using the MCNP neutron transport code coupled with a materials evolution code that solves the Bateman equations, giving the population of each nuclide in each part of the reactor at each moment. Because of the fundamental characteristics of MSRs compared to classical solid-fuelled reactors, the classical Bateman equations have to be modified by adding two terms representing the reprocessing capacities and the fertile or fissile feed. We have thus coupled neutronic and reprocessing simulation codes in a numerical tool used to calculate the extraction efficiencies of fission products, their location in the whole system (reactor and reprocessing unit) and radioprotection issues. (authors)
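
Schematically, the modified evolution equation for the inventory $N_i$ of nuclide $i$ can be written as below. This is a generic sketch of the structure described in the abstract (classical Bateman terms, a reprocessing extraction with time constant $\tau_i$, and a feed rate $F_i$), not necessarily the exact form used in the authors' code:

```latex
\frac{\mathrm{d}N_i}{\mathrm{d}t}
  = \sum_{j}\big(\lambda_{j\to i} + \sigma_{j\to i}\,\phi\big)\,N_j
  \;-\; \big(\lambda_i + \sigma_i\,\phi\big)\,N_i
  \;-\; \frac{N_i}{\tau_i}
  \;+\; F_i
```

The first two terms are the classical Bateman production and destruction terms (decay and neutron-induced reactions in the flux $\phi$); the last two are the added reprocessing-extraction and fertile/fissile-feed terms.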

  6. Simulation tools

    Jenni, F

    2006-01-01

In the last two decades, simulation tools have made a significant contribution to the great progress in the development of power electronics. Time to market was shortened and development costs were reduced drastically. Falling costs, as well as improved speed and precision, opened new fields of application. Today, continuous and switched circuits can be mixed. A number of powerful simulation tools are available, and users have to choose the one best suited to their application. Here a simple rule applies: the best available simulation tool is the tool the user is already used to (provided it can solve the task). Abilities, speed, user friendliness and other features are continuously being improved, even though they are already powerful and comfortable. This paper aims at giving the reader an insight into the simulation of power electronics. Starting with a short description of the fundamentals of a simulation tool as well as properties of tools, several tools are presented. Starting with simplified models ...

  7. Simulation tools and new developments of the molten salt fast reactor

    Heuer, D.; Merle-Lucotte, E.; Allibert, M.; Doligez, X.; Ghetta, V.

    2010-01-01

In the MSFR (Molten Salt Fast Reactor), the liquid fuel processing is part of the reactor: a small side stream of the molten salt is processed for fission product removal and then returned to the reactor. Because of this design characteristic, the MSFR can operate with a widely varying fuel composition. Our studies of the MSFR concept rely on numerical simulations using the MCNP neutron transport code coupled with a code for the Bateman equations, computing the population of any nuclide in any part of the reactor at any moment. The classical Bateman equations have been modified by adding two terms representing the reprocessing capacities and the online addition of fertile or fissile material. We have thus coupled neutronic and reprocessing simulation codes in a numerical tool used to calculate the extraction efficiencies of fission products, their location in the whole system and radioprotection issues. The very preliminary results show the potential of the neutronic-reprocessing coupling we have developed. We also show that these studies are limited by the uncertainties on the design and on the knowledge of the chemical reprocessing processes. (A.C.)

  8. IslandFAST: A Semi-numerical Tool for Simulating the Late Epoch of Reionization

    Xu, Yidong; Chen, Xuelei [Key Laboratory for Computational Astrophysics, National Astronomical Observatories, Chinese Academy of Sciences, Beijing 100012 (China); Yue, Bin [National Astronomical Observatories, Chinese Academy of Sciences, Beijing 100012 (China)

    2017-08-01

    We present the algorithm and main results of our semi-numerical simulation, islandFAST, which was developed from 21cmFAST and designed for the late stage of reionization. The islandFAST simulation predicts the evolution and size distribution of the large-scale underdense neutral regions (neutral islands), and we find that the late Epoch of Reionization proceeds very fast, showing a characteristic scale of the neutral islands at each redshift. Using islandFAST, we compare the impact of two types of absorption systems, i.e., the large-scale underdense neutral islands versus small-scale overdense absorbers, in regulating the reionization process. The neutral islands dominate the morphology of the ionization field, while the small-scale absorbers dominate the mean-free path of ionizing photons, and also delay and prolong the reionization process. With our semi-numerical simulation, the evolution of the ionizing background can be derived self-consistently given a model for the small absorbers. The hydrogen ionization rate of the ionizing background is reduced by an order of magnitude in the presence of dense absorbers.

  9. Fast scattering simulation tool for multi-energy x-ray imaging

    Sossin, A., E-mail: artur.sossin@cea.fr [CEA-LETI MINATEC Grenoble, F-38054 Grenoble (France); Tabary, J.; Rebuffel, V. [CEA-LETI MINATEC Grenoble, F-38054 Grenoble (France); Létang, J.M.; Freud, N. [Université de Lyon, CREATIS, CNRS UMR5220, Inserm U1044, INSA-Lyon, Université Claude Bernard Lyon 1, Centre Léon Bérard (France); Verger, L. [CEA-LETI MINATEC Grenoble, F-38054 Grenoble (France)

    2015-12-01

    A combination of Monte Carlo (MC) and deterministic approaches was employed as a means of creating a simulation tool capable of providing energy resolved x-ray primary and scatter images within a reasonable time interval. Libraries of Sindbad, a previously developed x-ray simulation software, were used in the development. The scatter simulation capabilities of the tool were validated through simulation with the aid of GATE and through experimentation by using a spectrometric CdTe detector. A simple cylindrical phantom with cavities and an aluminum insert was used. Cross-validation with GATE showed good agreement with a global spatial error of 1.5% and a maximum scatter spectrum error of around 6%. Experimental validation also supported the accuracy of the simulations obtained from the developed software with a global spatial error of 1.8% and a maximum error of around 8.5% in the scatter spectra.

  10. SU-F-J-204: Carbon Digitally Reconstructed Radiography (CDRR): A GPU Based Tool for Fast and Versatile Carbonimaging Simulation

    Dias, M F; Seco, J; Baroni, G; Riboldi, M

    2016-01-01

Purpose: Research in carbon imaging has been growing over the past years as a way to increase treatment accuracy and patient positioning in carbon therapy. The purpose of this tool is to allow a fast and flexible way to generate CDRR data without the need for Monte Carlo (MC) simulations. It can also be used to predict future clinically measured data. Methods: A python interface has been developed which uses information from CT or 4DCT and the treatment calibration curve to compute the Water Equivalent Path Length (WEPL) of carbon ions. A GPU based ray tracing algorithm computes the WEPL of each individual carbon traveling through the CT voxels. A multiple peak detection method to estimate high contrast margin positioning has been implemented (described elsewhere). MC simulations have been used to simulate carbon depth dose curves in order to simulate the response of a range detector. Results: The tool allows the upload of CT or 4DCT images. The user has the possibility to select the phase/slice of interest as well as position, angle, etc. The WEPL is represented as a range detector, which can be used to assess range dilution and multiple peak detection effects. The tool also provides knowledge of the minimum energy that should be considered for imaging purposes. The multiple peak detection method has been used in a lung tumor case, showing an accuracy of 1 mm in determining the exact interface position. Conclusion: The tool offers an easy and fast way to simulate carbon imaging data. It can be used for educational and clinical purposes, allowing the user to test beam energies and angles before real acquisition. An analysis add-on is being developed, where the user will have the opportunity to select different reconstruction methods and detector types (range or energy). Fundação para a Ciência e a Tecnologia (FCT), PhD Grant number SFRH/BD/85749/2012

  11. SU-F-J-204: Carbon Digitally Reconstructed Radiography (CDRR): A GPU Based Tool for Fast and Versatile Carbonimaging Simulation

    Dias, M F [Dipartamento di Elettronica, Informazione e Bioingegneria - DEIB, Politecnico di Milano (Italy); Department of Radiation Oncology, Francis H. Burr Proton Therapy Center Massachusetts General Hospital (MGH), Boston, Massachusetts (United States); Seco, J [Department of Radiation Oncology, Francis H. Burr Proton Therapy Center Massachusetts General Hospital (MGH), Boston, Massachusetts (United States); Baroni, G; Riboldi, M [Dipartamento di Elettronica, Informazione e Bioingegneria - DEIB, Politecnico di Milano (Italy); Bioengineering Unit, Centro Nazionale di Adroterapia Oncologica, Pavia (Italy)

    2016-06-15

Purpose: Research in carbon imaging has been growing over the past years as a way to increase treatment accuracy and patient positioning in carbon therapy. The purpose of this tool is to allow a fast and flexible way to generate CDRR data without the need for Monte Carlo (MC) simulations. It can also be used to predict future clinically measured data. Methods: A python interface has been developed which uses information from CT or 4DCT and the treatment calibration curve to compute the Water Equivalent Path Length (WEPL) of carbon ions. A GPU based ray tracing algorithm computes the WEPL of each individual carbon traveling through the CT voxels. A multiple peak detection method to estimate high contrast margin positioning has been implemented (described elsewhere). MC simulations have been used to simulate carbon depth dose curves in order to simulate the response of a range detector. Results: The tool allows the upload of CT or 4DCT images. The user has the possibility to select the phase/slice of interest as well as position, angle, etc. The WEPL is represented as a range detector, which can be used to assess range dilution and multiple peak detection effects. The tool also provides knowledge of the minimum energy that should be considered for imaging purposes. The multiple peak detection method has been used in a lung tumor case, showing an accuracy of 1 mm in determining the exact interface position. Conclusion: The tool offers an easy and fast way to simulate carbon imaging data. It can be used for educational and clinical purposes, allowing the user to test beam energies and angles before real acquisition. An analysis add-on is being developed, where the user will have the opportunity to select different reconstruction methods and detector types (range or energy). Fundação para a Ciência e a Tecnologia (FCT), PhD Grant number SFRH/BD/85749/2012.
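
The WEPL computation at the heart of such a tool can be illustrated with a minimal sketch: each carbon's WEPL is the sum over traversed voxels of path length times relative stopping power, the latter obtained from the calibration curve. The calibration points below are hypothetical, and the ray is simply axis-aligned; the real tool traces arbitrary rays through the CT volume on the GPU.

```python
import numpy as np

# Hypothetical piecewise-linear calibration curve mapping Hounsfield units
# to relative stopping power (RSP); real curves are scanner/beam specific.
hu_points = np.array([-1000.0, 0.0, 1000.0, 2000.0])
rsp_points = np.array([0.0, 1.0, 1.5, 2.0])

def wepl_along_ray(ct_slice, row, voxel_mm):
    """WEPL of a carbon traversing one CT row (axis-aligned ray, in mm).

    Each voxel contributes (path length) x (RSP from the calibration
    curve); a GPU version evaluates many such rays in parallel.
    """
    rsp = np.interp(ct_slice[row, :], hu_points, rsp_points)
    return float(np.sum(rsp) * voxel_mm)

ct = np.full((4, 10), 0.0)   # toy 4x10 slice of water-equivalent tissue (0 HU)
ct[2, 3:6] = 1000.0          # a dense 3-voxel insert (bone-like)
```

A row through pure water-equivalent tissue gives a WEPL equal to its geometric length, while the row crossing the insert gives a longer WEPL; comparing the two is exactly the kind of contrast the range detector image encodes.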

  12. Further development of the fast beam dynamics simulation tool V-code

    Franke, Sylvain; Ackermann, Wolfgang; Weiland, Thomas [Institut fuer Theorie Elektromagnetischer Felder, TU Darmstadt (Germany)

    2010-07-01

The Vlasov equation describes the evolution of a particle density under the effects of electromagnetic fields. It is derived from the fact that the volume occupied by a given number of particles in the six-dimensional phase space remains constant when only long-range interactions, such as Coulomb forces, are relevant and other particle collisions can be neglected. Because this is the case for typical charged particle beams in accelerators, the Vlasov equation can be used to describe their evolution along the whole beam line. The equation is a partial differential equation in six dimensions, and solving it with classical numerical methods is very expensive. A more efficient approach consists in representing the particle distribution function by a discrete set of characteristic moments. For each moment a time evolution equation can be stated. These ordinary differential equations can then be evaluated efficiently by means of time integration methods if all considered forces and a proper initial condition are known. The beam dynamics simulation tool V-Code implemented at TEMF utilizes this approach.
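
A minimal illustration of the moment approach (a generic form; V-Code's actual moment set and expansion are more elaborate): defining moments of the distribution $f$ and differentiating in time under the Vlasov equation turns the 6-D PDE into ordinary differential equations, e.g. for the first-order moments of a beam of particles with charge $q$ and mass $m$:

```latex
\langle g\rangle(t)
  = \int g(\mathbf r,\mathbf p)\, f(\mathbf r,\mathbf p,t)\,
    \mathrm d^3r\,\mathrm d^3p,
\qquad
\frac{\mathrm d\langle\mathbf r\rangle}{\mathrm dt}
  = \left\langle\frac{\mathbf p}{\gamma m}\right\rangle,
\qquad
\frac{\mathrm d\langle\mathbf p\rangle}{\mathrm dt}
  = q\,\big\langle \mathbf E + \mathbf v\times\mathbf B\big\rangle .
```

Higher-order moments (beam widths, correlations, emittances) obey analogous ODEs, and integrating this finite set in time is far cheaper than solving the full 6-D PDE.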

  13. Molecular dynamics simulation of subnanometric tool-workpiece contact on a force sensor-integrated fast tool servo for ultra-precision microcutting

    Cai, Yindi; Chen, Yuan-Liu; Shimizu, Yuki; Ito, So; Gao, Wei; Zhang, Liangchi

    2016-01-01

    Highlights: • Subnanometric contact between a diamond tool and a copper workpiece surface is investigated by MD simulation. • A multi-relaxation time technique is proposed to eliminate the influence of the atom vibrations. • The accuracy of the elastic-plastic transition contact depth estimation is improved by observing the residual defects. • The simulation results are beneficial for optimization of the next-generation microcutting instruments. - Abstract: This paper investigates the contact characteristics between a copper workpiece and a diamond tool in a force sensor-integrated fast tool servo (FS-FTS) for single point diamond microcutting and in-process measurement of ultra-precision surface forms of the workpiece. Molecular dynamics (MD) simulations are carried out to identify the subnanometric elastic-plastic transition contact depth, at which the plastic deformation in the workpiece is initiated. This critical depth can be used to optimize the FS-FTS as well as the cutting/measurement process. It is clarified that the vibrations of the copper atoms in the MD model have a great influence on the subnanometric MD simulation results. A multi-relaxation time method is then proposed to reduce the influence of the atom vibrations based on the fact that the dominant vibration component has a certain period determined by the size of the MD model. It is also identified that for a subnanometric contact depth, the position of the tool tip for the contact force to be zero during the retracting operation of the tool does not correspond to the final depth of the permanent contact impression on the workpiece surface. The accuracy for identification of the transition contact depth is then improved by observing the residual defects on the workpiece surface after the tool retracting.

  14. Fast Calorimeter Simulation in ATLAS

    Schaarschmidt, Jana; The ATLAS collaboration

    2017-01-01

    Producing the very large samples of simulated events required by many physics and performance studies with the ATLAS detector using the full GEANT4 detector simulation is highly CPU intensive. Fast simulation tools are a useful way of reducing CPU requirements when detailed detector simulations are not needed. During the LHC Run-1, a fast calorimeter simulation (FastCaloSim) was successfully used in ATLAS. FastCaloSim provides a simulation of the particle energy response at the calorimeter read-out cell level, taking into account the detailed particle shower shapes and the correlations between the energy depositions in the various calorimeter layers. It is interfaced to the standard ATLAS digitization and reconstruction software, and it can be tuned to data more easily than GEANT4. It is 500 times faster than full simulation in the calorimeter system. Now an improved version of FastCaloSim is in development, incorporating the experience with the version used during Run-1. The new FastCaloSim makes use of mach...

  15. Upgrading ATLAS Fast Calorimeter Simulation

    Heath, Matthew Peter; The ATLAS collaboration

    2017-01-01

    Producing the very large samples of simulated events required by many physics and performance studies with the ATLAS detector using the full GEANT4 detector simulation is highly CPU intensive. Fast simulation tools are a useful way of reducing CPU requirements when detailed detector simulations are not needed. During the LHC Run-1, a fast calorimeter simulation (FastCaloSim) was successfully used in ATLAS. FastCaloSim provides a simulation of the particle energy response at the calorimeter read-out cell level, taking into account the detailed particle shower shapes and the correlations between the energy depositions in the various calorimeter layers. It is interfaced to the standard ATLAS digitization and reconstruction software, and it can be tuned to data more easily than Geant4. Now an improved version of FastCaloSim is in development, incorporating the experience with the version used during Run-1. The new FastCaloSim aims to overcome some limitations of the first version by improving the description of s...

  16. Upgrading the Fast Calorimeter Simulation in ATLAS

    Schaarschmidt, Jana; The ATLAS collaboration

    2017-01-01

The tremendous need for simulated samples, now and even more so in the future, encourages the development of fast simulation techniques. The Fast Calorimeter Simulation is a faster, though less accurate, alternative to the full calorimeter simulation with Geant4. It is based on parametrizing the longitudinal and lateral energy deposits of single particles in the ATLAS calorimeter. Principal component analysis and machine learning techniques are used to improve the performance and decrease the memory needs compared to the current version of the ATLAS Fast Calorimeter Simulation. The parametrizations are expanded to cover very high energies and very forward detector regions, to increase the applicability of the tool. A prototype of this upgraded Fast Calorimeter Simulation has been developed and first validations with single particles show substantial improvements over the previous version.
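
The principal-component-analysis step mentioned above can be illustrated with a small numpy sketch: compress correlated per-layer energy deposits into a few leading components and reconstruct them. The toy shower model below is a stand-in assumption, not the ATLAS parametrization.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "showers": energies in 8 calorimeter layers, with a common mean
# longitudinal profile, a correlated amplitude fluctuation, and noise.
n_showers, n_layers = 500, 8
base = np.exp(-0.5 * (np.arange(n_layers) - 3.0) ** 2)
amp = 0.3 * rng.standard_normal((n_showers, 1))
showers = (1 + amp) * base + 0.05 * rng.standard_normal((n_showers, n_layers))

# PCA via SVD of the centred data matrix
mean = showers.mean(axis=0)
u, s, vt = np.linalg.svd(showers - mean, full_matrices=False)
k = 3                                    # keep the leading components
coeffs = (showers - mean) @ vt[:k].T     # compressed representation
reconstructed = mean + coeffs @ vt[:k]   # decompressed showers
```

Storing only the mean, a few basis vectors, and per-shower coefficients is what reduces the memory footprint: the correlated amplitude mode is captured by the first component, and only uncorrelated noise is discarded.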

  17. The new ATLAS Fast Calorimeter Simulation

    Dias, Flavia; The ATLAS collaboration

    2016-01-01

    A very large number of simulated events is required for physics and performance studies with the ATLAS detector at the Large Hadron Collider. Producing these with the full GEANT4 detector simulation is highly CPU intensive. As a very detailed detector simulation is not always required, fast simulation tools have been developed to reduce the calorimeter simulation time by a few orders of magnitude. The fast simulation of ATLAS for the calorimeter systems used in Run 1, called Fast Calorimeter Simulation (FastCaloSim), provides a parameterized simulation of the particle energy response at the calorimeter read-out cell level. It is then interfaced to the ATLAS digitization and reconstruction software. In Run 1, about 13 billion events were simulated in ATLAS, out of which 50% were produced using fast simulation. For Run 2, a new parameterisation is being developed to improve the original version: It incorporates developments in geometry and physics lists of the last five years and benefits from knowledge acquire...

  18. Upgrading the ATLAS Fast Calorimeter Simulation

    Hubacek, Zdenek; The ATLAS collaboration

    2016-01-01

    Many physics and performance studies with the ATLAS detector at the Large Hadron Collider require very large samples of simulated events, and producing these using the full GEANT4 detector simulation is highly CPU intensive. Often, a very detailed detector simulation is not needed, and in these cases fast simulation tools can be used to reduce the calorimeter simulation time by a few orders of magnitude. In ATLAS, a fast simulation of the calorimeter systems was developed, called Fast Calorimeter Simulation (FastCaloSim). It provides a parametrized simulation of the particle energy response at the calorimeter read-out cell level. It is interfaced to the standard ATLAS digitization and reconstruction software, and can be tuned to data more easily than with GEANT4. The original version of FastCaloSim has been very important in the LHC Run-1, with several billion events simulated. An improved parametrisation is being developed, to eventually address shortcomings of the original version. It incorporates developme...

  19. The new ATLAS Fast Calorimeter Simulation

    Hasib, Ahmed; The ATLAS collaboration

    2017-01-01

    Producing the very large samples of simulated events required by many physics and performance studies with the ATLAS detector using the full GEANT4 detector simulation is highly CPU intensive. Fast simulation tools are a useful way of reducing CPU requirements when detailed detector simulations are not needed. During the LHC Run-1, a fast calorimeter simulation (FastCaloSim) was successfully used in ATLAS. FastCaloSim provides a simulation of the particle energy response at the calorimeter read-out cell level, taking into account the detailed particle shower shapes and the correlations between the energy depositions in the various calorimeter layers. It is interfaced to the standard ATLAS digitization and reconstruction software, and it can be tuned to data more easily than GEANT4. Now an improved version of FastCaloSim is in development, incorporating the experience with the version used during Run-1. The new FastCaloSim makes use of statistical techniques such as principal component analysis, and a neural n...

  20. The New ATLAS Fast Calorimeter Simulation

    Heath, Matthew Peter; The ATLAS collaboration

    2017-01-01

    Producing the large samples of simulated events required by many physics and performance studies with the ATLAS detector using the full GEANT4 detector simulation is highly CPU intensive. Fast simulation tools are a useful way of reducing the CPU requirements when detailed detector simulations are not needed. During Run-1 of the LHC, a fast calorimeter simulation (FastCaloSim) was successfully used in ATLAS. FastCaloSim provides a simulation of the particle energy response at the calorimeter read-out cell level, taking into account the detailed particle shower shapes and the correlations between the energy depositions in the various calorimeter layers. It is interfaced to the standard ATLAS digitisation and reconstruction software, and it can be tuned to data more easily than Geant4. Now an improved version of FastCaloSim is in development, incorporating the experience with the version used during Run-1. The new FastCaloSim aims to overcome some limitations of the first version by improving the description of...

  1. Fast simulation of ultrasound images

    Jensen, Jørgen Arendt; Nikolov, Svetoslav

    2000-01-01

    , and a whole image can take a full day. Simulating 3D images and 3D flow takes even more time. A 3D image of 64 by 64 lines can take 21 days, which is not practical for iterative work. This paper presents a new fast simulation method based on the Field II program. In imaging the same spatial impulse response...

  2. The new ATLAS Fast Calorimeter Simulation

    AUTHOR|(INSPIRE)INSPIRE-00176100; The ATLAS collaboration

    2016-01-01

The physics and performance studies of the ATLAS detector at the Large Hadron Collider require a large number of simulated events. A GEANT4 based detailed simulation of the ATLAS calorimeter systems is highly CPU intensive, and such resolution is often unnecessary. To reduce the calorimeter simulation time by a few orders of magnitude, fast simulation tools have been developed. The Fast Calorimeter Simulation (FastCaloSim) provides a parameterised simulation of the particle energy response at the calorimeter read-out cell level. In Run 1, about 13 billion events were simulated in ATLAS, out of which 50% were produced using fast simulation. For Run 2, a new parameterisation is being developed to improve the original version: it incorporates developments in geometry and physics lists during the last five years and benefits from the knowledge acquired from the Run 1 data. The algorithm uses machine learning techniques to improve the parameterisations and to optimise the amount of information to be stored in the...

  3. Fast "coalescent" simulation

    Wall Jeff D

    2006-03-01

Background: The amount of genome-wide molecular data is increasing rapidly, as is interest in developing methods appropriate for such data. There is a consequent increasing need for methods that can efficiently simulate such data. In this paper we implement the sequentially Markovian coalescent algorithm described by McVean and Cardin and present a further modification of that algorithm which slightly improves the closeness of the approximation to the full coalescent model. The algorithm ignores a class of recombination events known to affect the behavior of the genealogy of the sample, but which do not appear to affect the behavior of generated samples to any substantial degree. Results: We show that our software is able to simulate large chromosomal regions, such as those appropriate in a consideration of genome-wide data, several orders of magnitude faster than existing coalescent algorithms. Conclusion: This algorithm provides a useful resource for those needing to simulate large quantities of data for chromosome-length regions, using an approach that is much more efficient than traditional coalescent models.

  4. The new ATLAS Fast Calorimeter Simulation

    AUTHOR|(INSPIRE)INSPIRE-00223142; The ATLAS collaboration

    2016-01-01

    Many physics and performance studies with the ATLAS detector at the Large Hadron Collider require very large samples of simulated events, and producing these using the full GEANT4 detector simulation is highly CPU intensive. Often, a very detailed detector simulation is not needed, and in these cases fast simulation tools can be used to reduce the calorimeter simulation time by a few orders of magnitude. The new ATLAS Fast Calorimeter Simulation (FastCaloSim) is an improved parametrisation compared to the one used in the LHC Run-1. It provides a simulation of the particle energy response at the calorimeter read-out cell level, taking into account the detailed particle shower shapes and the correlations between the energy depositions in the various calorimeter layers. It is interfaced to the standard ATLAS digitization and reconstruction software, and can be tuned to data more easily than with GEANT4. The new FastCaloSim incorporates developments in geometry and physics lists of the last five years and benefit...

  5. On a fast numerical tool for nuclear accidental dynamic phenomenology and application to the real time simulation of Lady Godiva

    Bindel, Laurent; Gamess, Andre; Jasserand, Frederic; Laporte, Sebastien

    2003-01-01

This paper presents a modern numerical method, implemented in a TUI code named MacDSP, for solving any set of differential equations and, in particular, for phenomenological accidental dynamics calculations. The speed of such an approach, thanks to the hybrid-level power offered by C++ and an ad hoc design, makes it possible to construct the first of a family of real-time simulators employing the video-game technology DirectX TM : the Lady Godiva Real Time Simulator. (author)

  6. Development of simulation tools for improvement of measurement accuracy and efficiency in ultrasonic testing. Part 2. Development of fast simulator based on analytical approach

    Yamada, Hisao; Fukutomi, Hiroyuki; Lin, Shan; Ogata, Takashi

    2008-01-01

    CRIEPI developed a high-speed simulation method to predict B-scope images for crack-like defects under ultrasonic testing. The method is based on the geometrical theory of diffraction (GTD) to follow ultrasonic waves transmitted from the angle probe, and uses reciprocity relations to find analytical equations expressing the echoes received by the probe. The tip and mirror echoes from a slit with an arbitrary angle in the through-thickness direction of the test article and an arbitrary depth can be calculated by this method. The main objective of the study was to develop a high-speed simulation tool to generate B-scope displays for crack-like defects. This was achieved for simple slits in geometry-change regions by prototype software based on the method. Fairly complete B-scope images for slits could be obtained in about a minute on a current personal computer. The numerical predictions for surface-opening slits were in excellent agreement with the corresponding experimental measurements. (author)
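
    The core geometric ingredient of such a B-scope prediction, the pulse-echo time of flight of a tip-diffracted ray, is easy to sketch. The code below is a hedged toy, not CRIEPI's GTD model: it only computes arrival times for a monostatic probe over a slit tip, and the wave speed (5900 m/s, a typical longitudinal speed in steel) is an assumption.

```python
import math

def tip_echo_time(probe_x, tip_x, tip_depth, c=5900.0):
    """Pulse-echo arrival time (s) of the tip-diffracted signal for a probe at
    probe_x (m) and a slit tip at (tip_x, tip_depth) (m); c is the assumed
    longitudinal wave speed (m/s). Ray path: probe -> tip -> probe."""
    path = math.hypot(probe_x - tip_x, tip_depth)
    return 2.0 * path / c

# A toy B-scan: arrival time versus probe position for a tip at x = 0, 10 mm deep.
positions = [i * 1e-3 for i in range(-20, 21)]   # -20 mm .. +20 mm scan
bscan = [tip_echo_time(x, 0.0, 10e-3) for x in positions]
earliest = min(bscan)
print(round(earliest * 1e6, 3), "us at x =", positions[bscan.index(earliest)])
```

    Plotting arrival time against probe position gives the characteristic hyperbolic echo locus seen in B-scope displays, with its apex directly above the tip.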

  7. Fast calorimeter simulation in LHCb

    CERN. Geneva

    2018-01-01

    In HEP experiments the CPU resources required by MC simulations are constantly growing and have become a very large fraction of the total computing power (greater than 75%). At the same time the pace of performance improvements from technology is slowing down, so the only solution is a more efficient use of resources. Efforts are ongoing in the LHC experiments to provide multiple options for simulating events in a faster way when higher statistics are needed. A key to the success of this strategy is the possibility of enabling fast simulation options in a common framework with minimal action by the final user. In this talk we will describe the solution adopted in Gauss, the LHCb simulation software framework, to selectively exclude particles from being simulated by the Geant4 toolkit and to insert the corresponding hits generated in a faster way. The approach, integrated within the Geant4 toolkit, has been applied to the LHCb calorimeter but it could also be used for other subdetec...

  8. Verification of Simulation Tools

    Richard, Thierry

    2015-01-01

    Before qualifying a simulation tool, the requirements shall first be clearly identified, i.e.:
    - What type of study needs to be carried out?
    - What phenomena need to be modeled?
    This phase involves writing a precise technical specification. Once the requirements are defined, the most adapted product shall be selected from the various software options available on the market. Before using a particular version of a simulation tool to support the demonstration of nuclear safety studies, the following requirements shall be met:
    - An auditable quality assurance process complying with international development standards shall be developed and maintained.
    - A process of verification and validation (V and V) shall be implemented. This approach requires writing a report and/or executive summary of the V and V activities, and defining a validated domain (a domain in which the difference between the results of the tool and those of another qualified reference is considered satisfactory for its intended use).
    - Sufficient documentation shall be available.
    - A detailed and formal description of the product (software version number, user configuration, other settings and parameters) in the targeted computing environment shall be available.
    - Source codes corresponding to the software shall be archived appropriately.
    When these requirements are fulfilled, the version of the simulation tool shall be considered qualified for a defined domain of validity, in a given computing environment. The functional verification shall ensure that:
    - the computer architecture of the tool does not include errors,
    - the numerical solver correctly represents the physical mathematical model,
    - equations are solved correctly.
    The functional verification can be demonstrated through certification or a report of Quality Assurance. The functional validation shall allow the user to ensure that the equations correctly represent the physical phenomena in the perimeter of intended use. The functional validation can
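
    One concrete functional-verification exercise of the kind described, checking that a numerical solver converges at its theoretical order against an analytic solution, can be sketched as follows. This is a generic example, not taken from the report; forward Euler applied to dy/dt = -y is used purely for illustration.

```python
import math

def euler_solve(f, y0, t_end, n_steps):
    """Integrate dy/dt = f(t, y) from y(0) = y0 with forward Euler."""
    dt = t_end / n_steps
    t, y = 0.0, y0
    for _ in range(n_steps):
        y += dt * f(t, y)
        t += dt
    return y

def decay(t, y):
    return -y

exact = math.exp(-1.0)                    # y(1) for y' = -y, y(0) = 1
err_coarse = abs(euler_solve(decay, 1.0, 1.0, 100) - exact)
err_fine = abs(euler_solve(decay, 1.0, 1.0, 200) - exact)
order = math.log2(err_coarse / err_fine)  # should be close to 1 for Euler
print(round(order, 2))
```

    An observed order matching the scheme's theoretical order is evidence that the equations are being solved correctly; a mismatch points to a coding error in the solver.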

  9. FASTBUS simulation tools

    Dean, T.D.; Haney, M.J.

    1991-10-01

    A generalized model of a FASTBUS master is presented. The model is used with simulation tools to aid in the specification, design, and production of FASTBUS slave modules. The model provides a mechanism to interact with the electrical schematics and software models to predict performance. The model is written in the IEEE std 1076-1987 hardware description language VHDL. A model of the ATC logic is also presented. VHDL was chosen to provide portability to various platforms and simulation tools. The models, in conjunction with most commercially available simulators, will perform all of the transactions specified in IEEE std 960-1989. The models may be used to study the behavior of electrical schematics and other software models and detect violations of the FASTBUS protocol. For example, a hardware design of a slave module could be studied, protocol violations detected and corrected before committing money to prototype development. The master model accepts a stream of high level commands from an ASCII file to initiate FASTBUS transactions. The high level command language is based on the FASTBUS standard routines listed in IEEE std 1177-1989. Using this standard-based command language to direct the model of the master, hardware engineers can simulate FASTBUS transactions in the language used by physicists and programmers to operate FASTBUS systems. 15 refs., 6 figs

  10. Range Process Simulation Tool

    Phillips, Dave; Haas, William; Barth, Tim; Benjamin, Perakath; Graul, Michael; Bagatourova, Olga

    2005-01-01

    Range Process Simulation Tool (RPST) is a computer program that assists managers in rapidly predicting and quantitatively assessing the operational effects of proposed technological additions to, and/or upgrades of, complex facilities and engineering systems such as the Eastern Test Range. Originally designed for application to space transportation systems, RPST is also suitable for assessing effects of proposed changes in industrial facilities and large organizations. RPST follows a model-based approach that includes finite-capacity schedule analysis and discrete-event process simulation. A component-based, scalable, open architecture makes RPST easily and rapidly tailorable for diverse applications. Specific RPST functions include: (1) definition of analysis objectives and performance metrics; (2) selection of process templates from a process-template library; (3) configuration of process models for detailed simulation and schedule analysis; (4) design of operations-analysis experiments; (5) schedule and simulation-based process analysis; and (6) optimization of performance by use of genetic algorithms and simulated annealing. The main benefits afforded by RPST are provision of information that can be used to reduce costs of operation and maintenance, and the capability for affordable, accurate, and reliable prediction and exploration of the consequences of many alternative proposed decisions.
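
    The discrete-event core that schedule-analysis tools of this kind rely on can be illustrated with a minimal single-server queue. This is a generic sketch, not RPST code, and the job arrival and service times are invented.

```python
import heapq

def simulate_queue(arrivals, service_time):
    """Minimal discrete-event simulation of a single-server process step.

    arrivals: arrival times of jobs; service_time: fixed processing time.
    Events are drawn in time order from a heap, the core mechanism of
    discrete-event process simulators; returns each job's completion time.
    """
    events = [(t, i) for i, t in enumerate(arrivals)]  # (time, job id)
    heapq.heapify(events)
    server_free_at = 0.0
    completions = {}
    while events:
        t, job = heapq.heappop(events)
        start = max(t, server_free_at)   # wait if the server is busy
        server_free_at = start + service_time
        completions[job] = server_free_at
    return [completions[i] for i in range(len(arrivals))]

print(simulate_queue([0.0, 1.0, 1.5], 2.0))
```

    Real tools layer finite-capacity resources, routing and stochastic durations on top of exactly this event-heap pattern.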

  11. The Fast Simulation Chain for ATLAS

    Basalaev, Artem; The ATLAS collaboration

    2016-01-01

    In order to generate the huge number of Monte Carlo events that will be required by the ATLAS experiment over the next several runs, a very fast simulation is critical. Fast detector simulation alone, however, is insufficient: with very high numbers of simultaneous proton-proton collisions expected in Run 3 and beyond, the digitization (detector response emulation) and event reconstruction time quickly become comparable to the time required for detector simulation. The ATLAS Fast Chain simulation has been developed to solve this problem. Modules are implemented for fast simulation, fast digitization, and fast track reconstruction. The application is sufficiently fast -- several orders of magnitude faster than the standard simulation -- that the simultaneous proton-proton collisions can be generated during the simulation job, so Pythia8 also runs concurrently with the rest of the algorithms. The Fast Chain has been built to be extremely modular and flexible, so that each sample can be custom-tailored to match ...

  12. The new ATLAS Fast Calorimeter Simulation

    AUTHOR|(INSPIRE)INSPIRE-00223142; The ATLAS collaboration

    2017-01-01

    Current and future need for large scale simulated samples motivate the development of reliable fast simulation techniques. The new Fast Calorimeter Simulation is an improved parameterized response of single particles in the ATLAS calorimeter that aims to accurately emulate the key features of the detailed calorimeter response as simulated with Geant4, yet approximately ten times faster. Principal component analysis and machine learning techniques are used to improve the performance and decrease the memory need compared to the current version of the ATLAS Fast Calorimeter Simulation. A prototype of this new Fast Calorimeter Simulation is in development and its integration into the ATLAS simulation infrastructure is ongoing.

  13. The new ATLAS Fast Calorimeter Simulation

    Schaarschmidt, J.; ATLAS Collaboration

    2017-10-01

    Current and future need for large scale simulated samples motivate the development of reliable fast simulation techniques. The new Fast Calorimeter Simulation is an improved parameterized response of single particles in the ATLAS calorimeter that aims to accurately emulate the key features of the detailed calorimeter response as simulated with Geant4, yet approximately ten times faster. Principal component analysis and machine learning techniques are used to improve the performance and decrease the memory need compared to the current version of the ATLAS Fast Calorimeter Simulation. A prototype of this new Fast Calorimeter Simulation is in development and its integration into the ATLAS simulation infrastructure is ongoing.

  14. The Fast Simulation Chain for ATLAS

    AUTHOR|(INSPIRE)INSPIRE-00399337; The ATLAS collaboration; Marshall, Zach

    2017-01-01

    In order to generate the huge number of Monte Carlo events that will be required by the ATLAS experiment over the next several runs, a very fast simulation is critical. Fast detector simulation alone, however, is insufficient: with very high numbers of simultaneous proton-proton collisions expected in Run 3 and beyond, the digitization (detector response emulation) and event reconstruction time quickly become comparable to the time required for detector simulation. The ATLAS Fast Chain simulation has been developed to solve this problem. Modules are implemented for fast simulation, fast digitization, and fast track reconstruction. The application is sufficiently fast—several orders of magnitude faster than the standard simulation—that the simultaneous proton-proton collisions can be generated during the simulation job, so Pythia8 also runs concurrently with the rest of the algorithms. The Fast Chain has been built to be extremely modular and flexible, so that each sample can be custom-tailored to match th...

  15. Design of tool monitor simulator

    Yao Yonggang; Deng Changming; Zhang Jia; Meng Dan; Zhang Lu; Wang Zhi'ai; Shen Yang

    2011-01-01

    A tool monitor at the Qinshan Nuclear Power Plant was taken as the object of study, and a tool monitor simulator was manufactured. The device is designed to automatically emulate the monitoring of the contamination level of objects, for training students: once the tool monitor reports contamination, the students can handle it properly. A brief introduction to the main functions and system design of the simulator is presented in the paper. (authors)

  16. FATRAS - the ATLAS Fast Track Simulation project

    Mechnich, J.

    2011-01-01

    The Monte Carlo simulation of the detector response is an integral component of any analysis performed with data from the LHC experiments. As these simulated data sets must be both large and precise, their production is a CPU-intensive task. ATLAS has developed full and fast detector simulation

  17. Development of Simulator Configuration Tool

    Nedrelid, Olav; Pettersen, Geir

    1996-01-01

    The main objective of the development of a Simulator Configuration Tool (SCT) is to achieve faster and more efficient production of dynamic simulators. Through application of versatile graphical interfaces, the simulator builder should be able to configure different types of simulators, including full-scope process simulators. The SCT should be able to serve different simulator environments. The configuration tool communicates with simulator execution environments through a TCP/IP-based interface. Communication with a Model Server System developed at Institutt for energiteknikk has been established and used as a test case. The system consists of OSF/Motif dialogues for operations requiring textual input, list selections etc., and uses the Picasso-3 User Interface Management System to handle presentation of static and dynamic graphical information. (author)

  18. Intelligent Tools and Instructional Simulations

    Murray, William R; Sams, Michelle; Belleville, Michael

    2001-01-01

    This intelligent tools and instructional simulations project was an investigation into the utility of a knowledge-based performance support system to support learning and on-task performance for using...

  19. Surgery simulation using fast finite elements

    Bro-Nielsen, Morten

    1996-01-01

    This paper describes our recent work on real-time surgery simulation using fast finite element models of linear elasticity. In addition, we discuss various improvements in terms of speed and realism.
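
    The trick that makes linear-elastic finite elements fast enough for real time, factorizing or inverting the stiffness matrix once offline so that each frame costs only a matrix-vector product, can be sketched in one dimension. This is a toy illustration (a fixed-fixed chain of unit springs), not the cited surgery simulator; for realistic mesh sizes one would store a sparse factorization rather than a dense inverse.

```python
def invert(mat):
    """Invert a small dense matrix by Gauss-Jordan elimination (toy sizes only)."""
    n = len(mat)
    a = [row[:] + [float(i == j) for j in range(n)] for i, row in enumerate(mat)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(a[r][col]))
        a[col], a[pivot] = a[pivot], a[col]
        p = a[col][col]
        a[col] = [v / p for v in a[col]]
        for r in range(n):
            if r != col and a[r][col] != 0.0:
                factor = a[r][col]
                a[r] = [v - factor * w for v, w in zip(a[r], a[col])]
    return [row[n:] for row in a]

def stiffness_chain(n, k=1.0):
    """Stiffness matrix of n free nodes in a fixed-fixed chain of springs."""
    K = [[0.0] * n for _ in range(n)]
    for i in range(n):
        K[i][i] = 2.0 * k
        if i > 0:
            K[i][i - 1] = -k
        if i < n - 1:
            K[i][i + 1] = -k
    return K

# Precompute K^-1 once; every subsequent "frame" is only a matrix-vector
# product, which is what makes the run-time loop fast.
n = 5
Kinv = invert(stiffness_chain(n))
force = [0.0, 0.0, 1.0, 0.0, 0.0]   # poke the middle node
u = [sum(Kinv[i][j] * force[j] for j in range(n)) for i in range(n)]
print([round(x, 3) for x in u])
```

    The displacement field is symmetric and peaks at the loaded node, as expected for a fixed-fixed chain.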

  20. Stochastic airspace simulation tool development

    2009-10-01

    Modeling and simulation is often used to study the physical world when observation may not be practical. The overall goal of a recent and ongoing simulation tool project has been to provide a documented, lifecycle-managed, multi-processor c...

  1. Modelling Machine Tools using Structure Integrated Sensors for Fast Calibration

    Benjamin Montavon

    2018-02-01

    Monitoring of the relative deviation between commanded and actual tool tip position, which limits the volumetric performance of the machine tool, enables the use of contemporary methods of compensation to reduce tolerance mismatch and the uncertainties of on-machine measurements. The development of a primarily optical sensor setup capable of being integrated into the machine structure without limiting its operating range is presented. The use of a frequency-modulating interferometer and photosensitive arrays in combination with a Gaussian laser beam allows for fast and automated online measurements of the axes’ motion errors and thermal conditions with comparable accuracy, lower cost, and smaller dimensions as compared to state-of-the-art optical measuring instruments for offline machine tool calibration. The development is tested through simulation of the sensor setup based on raytracing and Monte-Carlo techniques.
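
    One piece of such a sensor simulation, estimating by Monte Carlo how pixel noise propagates into the centroid of a Gaussian spot on a photosensitive array, can be sketched as below. All parameter values (array size, beam width, noise level) are illustrative assumptions, not values from the paper.

```python
import math
import random

def centroid_with_noise(true_pos, rng, pixels=64, pitch=1.0,
                        sigma_beam=3.0, noise=0.01):
    """Centroid of a Gaussian spot on a 1-D photodiode array with additive
    pixel noise. Positions are in pixel-pitch units; pixel values are the
    beam intensity plus zero-mean Gaussian read noise."""
    xs = [(i - pixels / 2) * pitch for i in range(pixels)]
    vals = [math.exp(-0.5 * ((x - true_pos) / sigma_beam) ** 2)
            + rng.gauss(0.0, noise) for x in xs]
    total = sum(vals)
    return sum(x * v for x, v in zip(xs, vals)) / total

# Monte-Carlo repetition of the measurement gives the estimator's spread,
# i.e. the position uncertainty contributed by the read noise.
rng = random.Random(1)
estimates = [centroid_with_noise(2.5, rng) for _ in range(2000)]
mean_est = sum(estimates) / len(estimates)
spread = (sum((e - mean_est) ** 2 for e in estimates) / len(estimates)) ** 0.5
print(round(mean_est, 2), round(spread, 3))
```

    In a full simulation the same Monte-Carlo loop would sit on top of a raytracing model of the beam path rather than an analytic Gaussian.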

  2. The new ATLAS Fast Calorimeter Simulation

    Jacka, Petr; The ATLAS collaboration

    2018-01-01

    With the huge amount of data collected with ATLAS, there is a need to produce a large number of simulated events. These productions are very CPU- and time-consuming when using the full GEANT4 simulation. FastCaloSim is a program to quickly simulate the ATLAS calorimeter response, based on a parameterization of the GEANT4 energy deposits of several kinds of particles in a grid of energy and eta. A new version of FastCaloSim is under development and its integration into the ATLAS simulation infrastructure is ongoing. The use of machine learning techniques improves the performance and decreases the memory usage. Dedicated parameterizations for the forward calorimeters are being studied. First results of the new FastCaloSim show substantial improvements in the description of energy and shower shape variables, including the variables for jet substructure.

  3. Terascale Simulation Tools and Technologies

    Li, Xiaolin

    2007-03-09

    We report the development of front tracking method as a simulation tool and technology for the computation on several important SciDAC and SciDAC associated applications. The progress includes the extraction of an independent software library from the front tracking code, conservative front tracking, applications of front tracking to the simulation of fusion pellet injection in a magnetically confined plasma, the study of a fuel injection jet, and the study of fluid chaotic mixing, among other problems.

  4. The Xygra gun simulation tool.

    Garasi, Christopher Joseph; Lamppa, Derek C.; Aubuchon, Matthew S.; Shirley, David Noyes; Robinson, Allen Conrad; Russo, Thomas V.

    2008-12-01

    Inductive electromagnetic launchers, or coilguns, use discrete solenoidal coils to accelerate a coaxial conductive armature. To date, Sandia has been using an internally developed code, SLINGSHOT, as a point-mass lumped circuit element simulation tool for modeling coilgun behavior for design and verification purposes. This code has shortcomings in terms of accurately modeling gun performance under stressful electromagnetic propulsion environments. To correct for these limitations, it was decided to attempt to closely couple two Sandia simulation codes, Xyce and ALEGRA, to develop a more rigorous simulation capability for demanding launch applications. This report summarizes the modifications made to each respective code and the path forward to completing interfacing between them.

  5. A fast and flexible reactor physics model for simulating neutron spectra and depletion in fast reactors - 202

    Recktenwald, G.D.; Bronk, L.A.; Deinert, M.R.

    2010-01-01

    Determining the time dependent concentration of isotopes within a nuclear reactor core is central to the analysis of nuclear fuel cycles. We present a fast, flexible tool for determining the time dependent neutron spectrum within fast reactors. The code (VBUDS: visualization, burnup, depletion and spectra) uses a two region, multigroup collision probability model to simulate the energy dependent neutron flux and tracks the buildup and burnout of 24 actinides, as well as fission products. While originally developed for LWR simulations, the model is shown to produce fast reactor spectra that show a high degree of fidelity to available fast reactor benchmarks. (authors)
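
    The depletion step at the heart of such tools is a Bateman system; for a two-member chain the analytic solution can be checked directly against the simplest numerical integrator. This is a generic illustration, not VBUDS code, and the decay constants are arbitrary.

```python
import math

def bateman_pair(n1_0, lam1, lam2, t):
    """Analytic Bateman solution for a two-member chain 1 -> 2 -> (removed):
    returns (N1(t), N2(t)) for an initial inventory (n1_0, 0)."""
    n1 = n1_0 * math.exp(-lam1 * t)
    n2 = n1_0 * lam1 / (lam2 - lam1) * (math.exp(-lam1 * t) - math.exp(-lam2 * t))
    return n1, n2

def euler_pair(n1_0, lam1, lam2, t, steps=20000):
    """The same chain integrated with forward Euler, the simplest depletion
    solver one could write; real codes use stiffer-friendly schemes."""
    dt = t / steps
    n1, n2 = n1_0, 0.0
    for _ in range(steps):
        d1 = -lam1 * n1
        d2 = lam1 * n1 - lam2 * n2
        n1 += dt * d1
        n2 += dt * d2
    return n1, n2

exact = bateman_pair(1.0, 0.5, 0.1, 4.0)
approx = euler_pair(1.0, 0.5, 0.1, 4.0)
print([round(v, 4) for v in exact], [round(v, 4) for v in approx])
```

    In a real burnup code the "decay constants" become flux-dependent transmutation rates, which is why the spectrum and the depletion must be solved together.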

  6. Integration between a sales support system and a simulation tool

    Wahlström, Ola

    2005-01-01

    InstantPlanner is a sales support system for the material handling industry, visualizing and calculating designs faster and more correctly than other tools on the market. AutoMod is a world-leading simulation tool used in the material handling industry to optimize and calculate appropriate configuration designs. Both applications are favorable in their own areas and provide a great platform for integration with the properties of fast designing, correct product calculations, great simulation capabi...

  7. Fast shower simulation in the ATLAS calorimeter

    Barberio, E; Butler, B; Cheung, S L; Dell'Acqua, A; Di Simone, A; Ehrenfeld, W; Gallas, M V; Glazov, A; Marshall, Z; Müller, J; Placakyte, R; Rimoldi, A; Savard, P; Tsulaia, V; Waugh, A; Young, C C

    2008-01-01

    The time to simulate pp collisions in the ATLAS detector is largely dominated by the showering of electromagnetic particles in the heavy parts of the detector, especially the electromagnetic barrel and endcap calorimeters. Two procedures have been developed to accelerate the processing time of electromagnetic particles in these regions: (1) a fast shower parameterisation and (2) a frozen shower library. Both work by generating the response of the calorimeter to electrons and positrons with Geant 4, and then reintroduce the response into the simulation at runtime.

  8. Fast shower simulation in the ATLAS calorimeter

    Barberio, E; Boudreau, J; Mueller, J; Tsulaia, V; Butler, B; Young, C C; Cheung, S L; Savard, P; Dell'Acqua, A; Simone, A D; Gallas, M V; Ehrenfeld, W; Glazov, A; Placakyte, R; Marshall, Z; Rimoldi, A; Waugh, A

    2008-01-01

    The time to simulate pp collisions in the ATLAS detector is largely dominated by the showering of electromagnetic particles in the heavy parts of the detector, especially the electromagnetic barrel and endcap calorimeters. Two procedures have been developed to accelerate the processing time of electromagnetic particles in these regions: (1) a fast shower parameterisation and (2) a frozen shower library. Both work by generating the response of the calorimeter to electrons and positrons with Geant 4, and then reintroduce the response into the simulation at runtime. In the fast shower parameterisation technique, a parameterisation is tuned to single electrons and used later by simulation. In the frozen shower technique, actual showers from low-energy particles are used in the simulation. Full Geant 4 simulation is used to develop showers down to ∼1 GeV, at which point the shower is terminated by substituting a frozen shower. Judicious use of both techniques over the entire electromagnetic portion of the ATLAS calorimeter produces an important improvement in CPU time. We discuss the algorithms and their performance in this paper
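
    The frozen-shower idea, develop a shower in full detail only down to a threshold and paste in a pre-generated shower below it, can be caricatured with a toy recursive energy-splitting model. Nothing here is Geant 4: the splitting rule, the threshold and the energies are invented purely to show the substitution logic and the saving in simulation steps.

```python
import random

E_CUT = 1.0   # toy threshold: below this, a stored ("frozen") shower is pasted in

def develop(e, rng, deposits, library=None):
    """Toy shower development. Splits the energy at a random fraction until a
    fine deposit scale is reached; returns the number of simulation steps.
    With a library, development stops already at E_CUT and a pre-generated
    shower (a list of energy fractions) is pasted in, scaled to e."""
    if library is not None and e < E_CUT:
        deposits.extend(f * e for f in library)   # frozen shower, scaled
        return 1
    if e < 0.05:                                  # fine-grained full simulation
        deposits.append(e)
        return 1
    f = rng.uniform(0.3, 0.7)
    return (1 + develop(e * f, rng, deposits, library)
              + develop(e * (1 - f), rng, deposits, library))

# Pre-generate one frozen shower shape from a full development of a unit particle.
shape = []
develop(1.0, random.Random(7), shape)
shape = [d / sum(shape) for d in shape]   # fractions summing to 1

full_dep, fast_dep = [], []
full_steps = develop(50.0, random.Random(1), full_dep)
fast_steps = develop(50.0, random.Random(1), fast_dep, library=shape)
print(full_steps, fast_steps, round(sum(full_dep), 6), round(sum(fast_dep), 6))
```

    Both paths deposit the full incident energy, but the frozen-library path takes far fewer simulation steps, which is the point of the technique.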

  9. CityZoom UP (Urban Pollution): a computational tool for the fast generation and setup of urban scenarios for CFD and dispersion modelling simulation

    Grazziotin, Pablo Colossi

    2016-01-01

    This research presents the development of CityZoom UP, the first attempt to extend existing urban planning software in order to assist in modelling urban scenarios and setting up simulation parameters for Gaussian dispersion and CFD models. Based on the previous capabilities and graphic user interfaces of CityZoom to model and validate urban scenarios based on Master Plan regulations, new graphic user interfaces, automatic mesh generation and data conversion algorithms have been created to se...

  10. High performance electromagnetic simulation tools

    Gedney, Stephen D.; Whites, Keith W.

    1994-10-01

    Army Research Office Grant #DAAH04-93-G-0453 has supported the purchase of 24 additional compute nodes that were installed in the Intel iPSC/860 hypercube at the University of Kentucky (UK), rendering a 32-node multiprocessor. This facility has allowed the investigators to explore and extend the boundaries of electromagnetic simulation for important areas of defense concerns including microwave monolithic integrated circuit (MMIC) design/analysis and electromagnetic materials research and development. The iPSC/860 has also provided an ideal platform for MMIC circuit simulations. A number of parallel methods based on direct time-domain solutions of Maxwell's equations have been developed on the iPSC/860, including a parallel finite-difference time-domain (FDTD) algorithm, and a parallel planar generalized Yee-algorithm (PGY). The iPSC/860 has also provided an ideal platform on which to develop a 'virtual laboratory' to numerically analyze, scientifically study and develop new types of materials with beneficial electromagnetic properties. These materials simulations are capable of assembling hundreds of microscopic inclusions from which an electromagnetic full-wave solution will be obtained in toto. This powerful simulation tool has enabled research of the full-wave analysis of complex multicomponent MMIC devices and the electromagnetic properties of many types of materials to be performed numerically rather than strictly in the laboratory.
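
    The FDTD kernel that such parallel codes distribute across nodes is compact; a serial 1-D version in normalized units, with the "magic" Courant number c·dt/dx = 1 for which the 1-D scheme is exact, looks like this. A sketch for illustration, not the parallel iPSC/860 code; the grid size and source shape are arbitrary.

```python
import math

def fdtd_1d(steps=400, n=200):
    """Minimal 1-D FDTD (Yee) update for vacuum in normalized units.

    E and H live on interleaved half-grids and are updated in leapfrog
    fashion; a Gaussian pulse is injected as a soft source in the middle.
    The untouched end cells act as reflecting boundaries."""
    ez = [0.0] * n
    hy = [0.0] * n
    for t in range(steps):
        for i in range(1, n):
            ez[i] += hy[i] - hy[i - 1]                         # update E from curl H
        ez[n // 2] += math.exp(-0.5 * ((t - 30) / 8.0) ** 2)   # soft source
        for i in range(n - 1):
            hy[i] += ez[i + 1] - ez[i]                         # update H from curl E
    return ez

ez = fdtd_1d()
peak = max(abs(v) for v in ez)
print(round(peak, 3))
```

    Parallelizing this amounts to splitting the grid across processors and exchanging one cell of E and H at each boundary per time step, which is why FDTD scales well on distributed-memory machines.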

  11. Bioinformatics tools for development of fast and cost effective simple ...

    Bioinformatics tools for development of fast and cost effective simple sequence repeat ... comparative mapping and exploration of functional genetic diversity in the ... Already, a number of computer programs have been implemented that aim at ...

  12. 10 CFR 434.606 - Simulation tool.

    2010-01-01

    ... 10 Energy 3 2010-01-01 2010-01-01 false Simulation tool. 434.606 Section 434.606 Energy DEPARTMENT... RESIDENTIAL BUILDINGS Building Energy Compliance Alternative § 434.606 Simulation tool. 606.1 The criteria established in subsection 521 for the selection of a simulation tool shall be followed when using the...

  13. Hydrogen Financial Analysis Scenario Tool (H2FAST); NREL (National Renewable Energy Laboratory)

    Melaina, Marc

    2015-04-21

    This presentation describes the Hydrogen Financial Analysis Scenario Tool, H2FAST, and provides an overview of each of the three H2FAST formats: the H2FAST web tool, the H2FAST Excel spreadsheet, and the H2FAST Business Case Scenario (BCS) tool. Examples are presented to illustrate the types of questions that H2FAST can help answer.

  14. CPN Tools for Editing, Simulating, and Analysing Coloured Petri Nets

    Ratzer, Anne Vinter; Wells, Lisa Marie; Lassen, Henry Michael

    2003-01-01

    elements. The tool features incremental syntax checking and code generation which take place while a net is being constructed. A fast simulator efficiently handles both untimed and timed nets. Full and partial state spaces can be generated and analysed, and a standard state space report contains...

  15. Simulating Biomass Fast Pyrolysis at the Single Particle Scale

    Ciesielski, Peter [National Renewable Energy Laboratory (NREL); Wiggins, Gavin [ORNL; Daw, C Stuart [ORNL; Jakes, Joseph E. [U.S. Forest Service, Forest Products Laboratory, Madison, Wisconsin, USA

    2017-07-01

    Simulating fast pyrolysis at the scale of single particles allows for the investigation of the impacts of feedstock-specific parameters such as particle size, shape, and species of origin. For this reason particle-scale modeling has emerged as an important tool for understanding how variations in feedstock properties affect the outcomes of pyrolysis processes. The origins of feedstock properties are largely dictated by the composition and hierarchical structure of biomass, from the microstructural porosity to the external morphology of milled particles. These properties may be accounted for in simulations of fast pyrolysis by several different computational approaches depending on the level of structural and chemical complexity included in the model. The predictive utility of particle-scale simulations of fast pyrolysis can still be enhanced substantially by advancements in several areas. Most notably, considerable progress would be facilitated by the development of pyrolysis kinetic schemes that are decoupled from transport phenomena, predict product evolution from whole-biomass with increased chemical speciation, and are still tractable with present-day computational resources.
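
    A minimal particle-scale sketch of the kind of coupled model described, lumped particle heating plus a single first-order Arrhenius mass-loss reaction, is below. All rate and heat-transfer parameters are invented for illustration and are not fitted to any feedstock; real particle models resolve internal gradients and far richer kinetics.

```python
import math

def pyrolyze(T_reactor=773.0, t_end=10.0, dt=1e-3):
    """Lumped single-particle fast-pyrolysis sketch: the particle temperature
    relaxes toward the reactor temperature (first-order lag) while solid mass
    is lost through one first-order Arrhenius reaction. All parameter values
    are illustrative assumptions."""
    A, Ea, R = 1.0e6, 1.0e5, 8.314   # 1/s, J/mol, J/(mol K): assumed kinetics
    tau_heat = 1.0                   # assumed particle heating time constant (s)
    T, m = 300.0, 1.0                # initial temperature (K), normalised mass
    t = 0.0
    while t < t_end:
        k = A * math.exp(-Ea / (R * T))     # Arrhenius rate at current T
        m += dt * (-k * m)                  # first-order solid mass loss
        T += dt * (T_reactor - T) / tau_heat
        t += dt
    return T, m

T, m = pyrolyze()
print(round(T, 1), round(m, 4))
```

    Even this toy shows the coupling the abstract describes: the conversion reached depends on how fast the particle heats, i.e. on its size and shape, not on the kinetics alone.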

  16. Fast, Accurate Memory Architecture Simulation Technique Using Memory Access Characteristics

    小野, 貴継; 井上, 弘士; 村上, 和彰

    2007-01-01

    This paper proposes a fast and accurate memory architecture simulation technique. To design memory architecture, the first steps commonly involve using trace-driven simulation. However, expanding the design space makes the evaluation time increase. A fast simulation is achieved by a trace size reduction, but this reduces the simulation accuracy. Our approach can reduce the simulation time while maintaining the accuracy of the simulation results. In order to evaluate the validity of the proposed techniq...
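
    Trace-driven simulation itself is simple to sketch: replay a recorded address trace against a memory model and count hits and misses. The toy below is a generic direct-mapped cache, not the paper's proposed technique, whose point is shrinking such traces without losing this accuracy.

```python
def simulate_cache(trace, n_sets=4, block=16):
    """Trace-driven simulation of a tiny direct-mapped cache: replay a list of
    byte addresses and count hits and misses. The set index is the block
    number modulo the set count; a set hits when it holds the matching tag."""
    tags = [None] * n_sets
    hits = misses = 0
    for addr in trace:
        block_no = addr // block
        idx = block_no % n_sets
        tag = block_no // n_sets
        if tags[idx] == tag:
            hits += 1
        else:
            misses += 1
            tags[idx] = tag      # replace on miss
    return hits, misses

# Two passes over a small array that fits in the cache: the first pass takes
# the cold misses, the second pass hits everywhere.
trace = [i * 4 for i in range(16)] * 2   # 16 word accesses, replayed twice
print(simulate_cache(trace))
```

    Evaluation time in such studies scales with trace length, which is exactly why trace-reduction techniques like the one proposed are attractive.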

  17. The Geant4-Based ATLAS Fast Electromagnetic Shower Simulation

    Barberio, E; Butler, B; Cheung, S L; Dell'Acqua, A; Di Simone, A; Ehrenfeld, W; Gallas, M V; Glasow, A; Hughes, E; Marshall, Z; Müller, J; Placakyte, R; Rimoldi, A; Savard, P; Tsulaia, V; Waugh, A; Young, C C; 10th ICATPP Conference on Astroparticle, Particle, Space Physics, Detectors and Medical Physics Applications

    2008-01-01

    We present a three-pronged approach to fast electromagnetic shower simulation in ATLAS. Parameterisation is used for high-energy, shower libraries for medium-energy, and an averaged energy deposition for very low-energy particles. We present a comparison between the fast simulation and full simulation in an ATLAS Monte Carlo production.

  18. A multi-group neutron noise simulator for fast reactors

    Tran, Hoai Nam; Zylbersztejn, Florian; Demazière, Christophe; Jammes, Christian; Filliatre, Philippe

    2013-01-01

    Highlights:
    • The development of a neutron noise simulator for fast reactors.
    • The noise equation is solved fully in the frequency domain.
    • A good agreement with ERANOS on the static calculations.
    • Noise calculations induced by a localized perturbation of the absorption cross section.
    Abstract: A neutron noise simulator has been developed for fast reactors based on diffusion theory with multiple energy groups and several groups of delayed neutron precursors. The tool is expected to be applicable for core monitoring of fast reactors and also for other reactor types with hexagonal fuel assemblies. The noise sources are modeled through small stationary fluctuations of macroscopic cross sections, and the induced first order noise is solved fully in the frequency domain. Numerical algorithms are implemented for solving both the static and noise equations using finite differences for spatial discretization, where a hexagonal assembly is radially divided into finer triangular meshes. A coarse mesh finite difference (CMFD) acceleration has been used for accelerating the convergence of both the static and noise calculations. Numerical calculations have been performed for the ESFR core with 33 energy groups and 8 groups of delayed neutron precursors using the cross section data generated by the ERANOS code. The results of the static state have been compared with those obtained using ERANOS. The results show an adequate agreement between the two calculations. Noise calculations for the ESFR core have also been performed and demonstrated with an assumption of the perturbation of the absorption cross section located at the central fuel ring.
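
    The spatial discretization mentioned, finite differences leading to banded linear systems, can be illustrated in slab geometry, where the one-group fixed-source diffusion equation reduces to a single tridiagonal solve. A generic sketch, not the multi-group hexagonal solver of the paper; the cross-section values are arbitrary.

```python
def thomas(a, b, c, d):
    """Solve a tridiagonal system (a: sub-, b: main, c: super-diagonal)."""
    n = len(b)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def diffusion_slab(n=101, width=10.0, D=1.0, sigma_a=0.1, source=1.0):
    """One-group fixed-source diffusion in a slab with zero-flux edges:
    -D phi'' + Sigma_a phi = S, discretized with central differences."""
    h = width / (n - 1)
    a = [-D / h ** 2] * n
    b = [2 * D / h ** 2 + sigma_a] * n
    c = [-D / h ** 2] * n
    d = [source] * n
    b[0] = b[-1] = 1.0
    c[0] = a[-1] = 0.0
    d[0] = d[-1] = 0.0      # enforce phi = 0 at both edges
    return thomas(a, b, c, d)

phi = diffusion_slab()
print(round(max(phi), 3))
```

    The frequency-domain noise equation has the same banded structure, just with complex-valued coefficients, so the same solver machinery applies.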

  19. Overview of Simulation Tools for Smart Grids

    The aim of this report “D2.1 – Overview of Simulation Tools for Smart Grids” is to provide an overview of the different simulation tools available, i.e. developed and in use, at the different research centres. Required new tool capabilities are identified and extensions to the existing packages are indicated. An analysis of the emerging power systems challenges together with a review of the main topics regarding smart grids is provided in Chapter 1. The requirements for the simulation tools and the list of available tools in the different research centres and their main characteristics are reported in Chapter 2. The main aspects of the different tools and their purpose of analysis are listed in Chapter 3, along with the main topics concerning the new requirements for tools in order to allow a proper study in the smart grid context. Gaps, capabilities and model consolidation of the analysed tools...

  20. A fast Monte Carlo program for pulsed-neutron capture-gamma tools

    Hovgaard, J.

    1992-02-01

    A fast model for the pulsed-neutron capture-gamma tool has been developed. It is believed that the program produces valid results even though some approximations have been introduced; a correct γ-photon transport simulation, which is under preparation, has for instance not yet been included. Simulations performed so far have shown that the model fully lives up to expectations with respect to computing time and accuracy. (au)
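
    The observable such a tool builds up, the exponential die-away of the neutron population between pulses, can be mocked up with a few lines of Monte Carlo. This is a toy with an invented mean lifetime, not the cited program: it samples capture times directly instead of simulating transport.

```python
import random

def die_away_histogram(n_neutrons, tau, t_max, n_bins, rng):
    """Toy pulsed-neutron sketch: each neutron from the pulse survives an
    exponentially distributed time with mean lifetime tau before capture;
    histogramming the capture times gives the die-away curve a
    capture-gamma tool measures."""
    counts = [0] * n_bins
    width = t_max / n_bins
    for _ in range(n_neutrons):
        t = rng.expovariate(1.0 / tau)
        if t < t_max:
            counts[int(t / width)] += 1
    return counts

rng = random.Random(3)
counts = die_away_histogram(100000, tau=200.0, t_max=1000.0, n_bins=10, rng=rng)
print(counts[:3], counts[-1])
```

    Fitting the slope of this histogram recovers the capture lifetime, which is the quantity the logging measurement exploits.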

  1. Simulation of Oscillatory Working Tool

    Carmen Debeleac

    2010-01-01

    Full Text Available The paper presents a study of the resistance forces in soil cutting, with emphasis on their dependence on working tool motion during the loading process and dynamic regimes. The periodic process of cutting of soil by a tool (blade) is described. Different intervals in the cycle of steady-state motion of the tool, and several interaction regimes, are considered. The analysis is based on a non-linear approximation of the dependence of the soil resistance force on tool motion. Finally, the influence of frequency on the laws governing the interaction in the cyclic process was established.

  2. Front panel engineering with CAD simulation tool

    Delacour, Jacques; Ungar, Serge; Mathieu, Gilles; Hasna, Guenther; Martinez, Pascal; Roche, Jean-Christophe

    1999-04-01

    The progress made recently in display technology covers many fields of application. The specification of radiance, colorimetry and lighting efficiency creates some new challenges for designers. Photometric design is limited by the capability of correctly predicting the result of a lighting system, to save on the costs and time taken to build multiple prototypes or breadboard benches. The second step of the research carried out by the company OPTIS is to propose an optimization method to be applied to the lighting system, developed in the software SPEOS. The main features required of the tool include the CAD interface, to enable fast and efficient transfer between mechanical and light design software, the source modeling, the light transfer model and an optimization tool. The CAD interface is mainly a prototype of transfer, which is not the subject here. Photometric simulation is efficiently achieved by using the measured source encoding and a simulation by the Monte Carlo method. Today, the advantages and the limitations of the Monte Carlo method are well known. The noise reduction requires a long calculation time, which increases with the complexity of the display panel. A successful optimization is difficult to achieve, due to the long calculation time required for each optimization pass including a Monte Carlo simulation. The problem was initially defined as an engineering method of study. Experience shows that good understanding and mastering of the phenomenon of light transfer is limited by the complexity of non-sequential propagation. The engineer must call for the help of a simulation and optimization tool. The main point needed to be able to perform an efficient optimization is a quick method for simulating light transfer. Much work has been done in this area and some interesting results can be observed. It must be said that the Monte Carlo method wastes time calculating some results and information which are not required for the needs of the simulation.

  3. Modular numerical tool for gas turbine simulation

    Sampedro Casis, Rodrigo

    2015-01-01

    In this work a free tool for the simulation of turboprops was implemented, capable of simulating the various components of a jet engine, separately or in conjunction, with different degrees of thermodynamic modelling or complexity, in order to simulate an entire jet engine. The main characteristics of this software include its compatibility, open code and GNU license, non-existent in today's market. Furthermore, the tool was designed with greater flexibility and a more adapted work environment...

  4. Hydrogen Financial Analysis Scenario Tool (H2FAST). Web Tool User's Manual

    Bush, B. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Penev, M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Melaina, M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Zuboy, J. [Independent Consultant, Golden, CO (United States)

    2015-05-11

    The Hydrogen Financial Analysis Scenario Tool (H2FAST) provides a quick and convenient in-depth financial analysis for hydrogen fueling stations. This manual describes how to use the H2FAST web tool, which is one of three H2FAST formats developed by the National Renewable Energy Laboratory (NREL). Although all of the formats are based on the same financial computations and conform to generally accepted accounting principles (FASAB 2014, Investopedia 2014), each format provides a different level of complexity and user interactivity.

  5. Fast Simulation of Dynamic Ultrasound Images Using the GPU.

    Storve, Sigurd; Torp, Hans

    2017-10-01

    Simulated ultrasound data is a valuable tool for development and validation of quantitative image analysis methods in echocardiography. Unfortunately, simulation time can become prohibitive for phantoms consisting of a large number of point scatterers. The COLE algorithm by Gao et al. is a fast convolution-based simulator that trades simulation accuracy for improved speed. We present highly efficient parallelized CPU and GPU implementations of the COLE algorithm with an emphasis on dynamic simulations involving moving point scatterers. We argue that it is crucial to minimize the amount of data transfers from the CPU to achieve good performance on the GPU. We achieve this by storing the complete trajectories of the dynamic point scatterers as spline curves in the GPU memory. This leads to good efficiency when simulating sequences consisting of a large number of frames, such as B-mode and tissue Doppler data for a full cardiac cycle. In addition, we propose a phase-based subsample delay technique that efficiently eliminates flickering artifacts seen in B-mode sequences when COLE is used without enough temporal oversampling. To assess the performance, we used a laptop computer and a desktop computer, each equipped with a multicore Intel CPU and an NVIDIA GPU. Running the simulator on a high-end TITAN X GPU, we observed two orders of magnitude speedup compared to the parallel CPU version, three orders of magnitude speedup compared to simulation times reported by Gao et al. in their paper on COLE, and a speedup of 27000 times compared to the multithreaded version of Field II, using numbers reported in a paper by Jensen. We hope that by releasing the simulator as an open-source project we will encourage its use and further development.
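
    The record above describes a convolution-based simulator with a phase-based subsample delay. The idea can be sketched in a deliberately simplified scalar form (the function name, pulse shape and all parameter values are illustrative assumptions, not the COLE implementation): each point scatterer adds a Gaussian-windowed cosine pulse at its round-trip delay, with the fractional part of the delay carried by the pulse phase.

    ```python
    import math

    def simulate_rf_line(scatterers, fs=50e6, c=1540.0, f0=5e6, n_samples=2000):
        """COLE-style convolution sketch: each point scatterer adds a
        Gaussian-windowed cosine pulse at its round-trip delay. The fractional
        part of the delay enters through the pulse phase (subsample delay)."""
        line = [0.0] * n_samples
        sigma = 2.0 / f0                        # pulse envelope width (s)
        for depth, amplitude in scatterers:
            centre = (2.0 * depth / c) * fs     # fractional sample index of echo
            lo = max(0, int(centre - 4 * sigma * fs))
            hi = min(n_samples, int(centre + 4 * sigma * fs) + 1)
            for n in range(lo, hi):
                t = (n - centre) / fs           # time relative to echo centre (s)
                line[n] += (amplitude * math.exp(-(t / sigma) ** 2)
                            * math.cos(2 * math.pi * f0 * t))
        return line

    # Two scatterers, at 10 mm and 20 mm depth:
    rf = simulate_rf_line([(0.01, 1.0), (0.02, 0.5)])
    peak = max(range(len(rf)), key=lambda n: abs(rf[n]))  # strongest echo sample
    ```

    Because the echo centre is kept as a fractional sample index, two scatterers a fraction of a sample apart produce smoothly shifted pulses rather than the flickering that integer-sample rounding causes between frames.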

  6. SIMULATION TOOLS FOR ELECTRICAL MACHINES MODELLING ...

    Dr Obe

    ABSTRACT. Simulation tools are used both for research and teaching to allow a good ... The solution provides an easy way of determining the dynamic .... incorporate an in-built numerical algorithm, ... to learn, versatile in application, enhanced.

  7. PyFly: A fast, portable aerodynamics simulator

    Garcia, D.; Ghommem, M.; Collier, N.; Varga, B.O.N.; Calo, V.M.

    2018-01-01

    We present a fast, user-friendly implementation of a potential flow solver based on the unsteady vortex lattice method (UVLM), namely PyFly. UVLM computes the aerodynamic loads applied on lifting surfaces while capturing the unsteady effects such as the added mass forces, the growth of bound circulation, and the wake while assuming that the flow separation location is known a priori. This method is based on discretizing the body surface into a lattice of vortex rings and relies on the Biot–Savart law to construct the velocity field at every point in the simulated domain. We introduce the pointwise approximation approach to simulate the interactions of the far-field vortices to overcome the computational burden associated with the classical implementation of UVLM. The computational framework uses the Python programming language to provide an easy to handle user interface while the computational kernels are written in Fortran. The mixed language approach enables high performance regarding solution time and great flexibility concerning easiness of code adaptation to different system configurations and applications. The computational tool predicts the unsteady aerodynamic behavior of multiple moving bodies (e.g., flapping wings, rotating blades, suspension bridges) subject to incoming air. The aerodynamic simulator can also deal with enclosure effects, multi-body interactions, and B-spline representation of body shapes. We simulate different aerodynamic problems to illustrate the usefulness and effectiveness of PyFly.
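
    The Biot–Savart building block that UVLM relies on can be sketched for a single straight vortex segment (this is the textbook vortex-segment formula, independent of PyFly's actual API; the function name and test geometry are illustrative):

    ```python
    import math

    def biot_savart_segment(p, a, b, gamma):
        """Velocity induced at point p by a straight vortex segment a->b
        carrying circulation gamma (standard vortex-lattice building block)."""
        r1 = [p[i] - a[i] for i in range(3)]
        r2 = [p[i] - b[i] for i in range(3)]
        cross = [r1[1] * r2[2] - r1[2] * r2[1],
                 r1[2] * r2[0] - r1[0] * r2[2],
                 r1[0] * r2[1] - r1[1] * r2[0]]
        cross_sq = sum(c * c for c in cross)
        if cross_sq < 1e-12:                 # point lies on the segment axis
            return [0.0, 0.0, 0.0]
        r0 = [b[i] - a[i] for i in range(3)]
        n1 = math.sqrt(sum(x * x for x in r1))
        n2 = math.sqrt(sum(x * x for x in r2))
        dot = sum(r0[i] * (r1[i] / n1 - r2[i] / n2) for i in range(3))
        k = gamma / (4.0 * math.pi * cross_sq) * dot
        return [k * c for c in cross]

    # Sanity check: a very long segment approaches the 2-D result gamma/(2*pi*h)
    v = biot_savart_segment((0.0, 1.0, 0.0), (-1e4, 0.0, 0.0), (1e4, 0.0, 0.0), 1.0)
    ```

    Summing this kernel over every vortex ring of the lattice is exactly the O(N²) cost that the record's pointwise far-field approximation is meant to reduce.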

  8. PyFly: A fast, portable aerodynamics simulator

    Garcia, D.

    2018-03-18

    We present a fast, user-friendly implementation of a potential flow solver based on the unsteady vortex lattice method (UVLM), namely PyFly. UVLM computes the aerodynamic loads applied on lifting surfaces while capturing the unsteady effects such as the added mass forces, the growth of bound circulation, and the wake while assuming that the flow separation location is known a priori. This method is based on discretizing the body surface into a lattice of vortex rings and relies on the Biot–Savart law to construct the velocity field at every point in the simulated domain. We introduce the pointwise approximation approach to simulate the interactions of the far-field vortices to overcome the computational burden associated with the classical implementation of UVLM. The computational framework uses the Python programming language to provide an easy to handle user interface while the computational kernels are written in Fortran. The mixed language approach enables high performance regarding solution time and great flexibility concerning easiness of code adaptation to different system configurations and applications. The computational tool predicts the unsteady aerodynamic behavior of multiple moving bodies (e.g., flapping wings, rotating blades, suspension bridges) subject to incoming air. The aerodynamic simulator can also deal with enclosure effects, multi-body interactions, and B-spline representation of body shapes. We simulate different aerodynamic problems to illustrate the usefulness and effectiveness of PyFly.

  9. New method of fast simulation for a hadron calorimeter response

    Kul'chitskij, Yu.; Sutiak, J.; Tokar, S.; Zenis, T.

    2003-01-01

    In this work we present a new method for fast Monte Carlo simulation of a hadron calorimeter response. It is based on the three-dimensional parameterization of the hadronic shower obtained from the ATLAS TILECAL test beam data and GEANT simulations. A new approach to including the longitudinal fluctuations of the hadronic shower is described. The results of the fast simulation are in good agreement with the TILECAL experimental data.

  10. An adaptive simulation tool for evacuation scenarios

    Formolo, Daniel; van der Wal, C. Natalie

    2017-01-01

    Building useful and efficient models and tools for a varied audience, such as evacuation simulators for scientists, engineers and crisis managers, can be tricky. Even good models can fail in providing information when the user's tools for the model are short on resources. The aim of this work is to

  11. FDTD simulation tools for UWB antenna analysis.

    Brocato, Robert Wesley

    2005-02-01

    This paper describes the development of a set of software tools useful for analyzing ultra-wideband (UWB) antennas and structures. These tools are used to perform finite difference time domain (FDTD) simulation of a conical antenna with continuous wave (CW) and UWB pulsed excitations. The antenna is analyzed using spherical coordinate-based FDTD equations that are derived from first principles. The simulation results for CW excitation are compared to simulation and measured results from published sources; the results for UWB excitation are new.
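
    As a generic illustration of the FDTD update pattern this record refers to (a Cartesian 1-D leapfrog sketch, not the paper's spherical-coordinate formulation; the grid size, Courant number and source are illustrative assumptions):

    ```python
    import math

    # Minimal 1-D FDTD (Yee leapfrog) in free space with a soft Gaussian source.
    # Courant number 0.5; fields in normalized units. Illustrative only.
    nz, nsteps = 200, 120
    ez = [0.0] * nz          # electric field samples
    hy = [0.0] * nz          # magnetic field samples, staggered half a cell
    for n in range(nsteps):
        for k in range(1, nz):
            ez[k] += 0.5 * (hy[k - 1] - hy[k])        # update E from curl of H
        ez[nz // 2] += math.exp(-((n - 30) / 8.0) ** 2)  # soft source at centre
        for k in range(nz - 1):
            hy[k] += 0.5 * (ez[k] - ez[k + 1])        # update H from curl of E
    peak = max(abs(x) for x in ez)
    ```

    The two interleaved loops are the whole method: E is advanced from the spatial difference of H, then H from the spatial difference of E, half a time step apart. A CW excitation simply replaces the Gaussian with a sinusoid.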

  12. FDTD simulation tools for UWB antenna analysis.

    Brocato, Robert Wesley

    2004-12-01

    This paper describes the development of a set of software tools useful for analyzing ultra-wideband (UWB) antennas and structures. These tools are used to perform finite difference time domain (FDTD) simulation of a conical antenna with continuous wave (CW) and UWB pulsed excitations. The antenna is analyzed using spherical coordinate-based FDTD equations that are derived from first principles. The simulation results for CW excitation are compared to simulation and measured results from published sources; the results for UWB excitation are new.

  13. Dynamic fault simulation of wind turbines using commercial simulation tools

    Lund, Torsten; Eek, Jarle; Uski, Sanna

    2005-01-01

    This paper compares the commercial simulation tools PSCAD/EMTDC, PowerFactory, SIMPOW and PSS/E for analysing fault sequences defined in the Danish grid code requirements for wind turbines connected to a voltage level below 100 kV. Both symmetrical and unsymmetrical faults are analysed. The deviations and the reasons for the deviations between the tools are stated. The simulation models are implemented using the built-in library components of the simulation tools, with the exception of the mechanical drive-train model, which had to be user-modeled in PowerFactory and PSS/E.

  14. Methods and tools to detect thermal noise in fast reactors

    Motta, M.; Giovannini, R.

    1985-07-01

    The Specialists' Meeting on ''Methods and Tools to Detect Thermal Noise in Fast Reactors'' was held in Bologna on 8-10 October 1984. The meeting was hosted by the ENEA and was sponsored by the IAEA on the recommendation of the International Working Group on Fast Reactors. 17 participants attended the meeting from France, the Federal Republic of Germany, Italy, Japan, the United Kingdom, Joint Research Centre of CEC and from IAEA. The meeting was presided over by Prof. Mario Motta of Italy. The purpose of the meeting was to review and discuss methods and tools for temperature noise detection and related analysis as a potential means for detecting local blockages in fuel and blanket subassemblies and other faults in LMFBR. The meeting was divided into four technical sessions as follows: 1. National review presentations on application purposes and research activities for thermal noise detection. (5 papers); 2. Detection instruments and electronic equipment for temperature measurements in fast reactors. (5 papers); 3. Physical models. (2 papers); 4. Signal processing techniques. (3 papers). A separate abstract was prepared for each of these papers

  15. Westinghouse waste simulation and optimization software tool

    Mennicken, Kim; Aign, Jorg

    2013-01-01

    Applications for dynamic simulation can be found in virtually all areas of process engineering. The tangible benefits of using dynamic simulation can be seen in tighter design, smoother start-ups and optimized operation. Thus, proper implementation of dynamic simulation can deliver substantial benefits. These benefits are typically derived from improved process understanding. Simulation gives confidence in evidence-based decisions and enables users to try out lots of 'what if' scenarios until one is sure that a decision is the right one. In radioactive waste treatment tasks, different kinds of waste with different volumes and properties have to be treated, e.g. from NPP operation or D and D activities. Finding a commercially and technically optimized waste treatment concept is a time-consuming and difficult task. The Westinghouse Waste Simulation and Optimization Software Tool will enable the user to quickly generate reliable simulation models of various process applications based on equipment modules. These modules can be built with ease and be integrated into the simulation model. This capability ensures that this tool is applicable to typical waste treatment tasks. The identified waste streams and the selected treatment methods are the basis of the simulation and optimization software. After implementing suitable equipment data into the model, process requirements and waste treatment data are fed into the simulation to finally generate primary simulation results. A sensitivity analysis of the automated optimization features of the software generates the lowest possible lifecycle cost for the simulated waste stream. In combination with proven waste management equipment and integrated waste management solutions, this tool provides reliable qualitative results that lead to effective planning and minimize the total project planning risk of any waste management activity. It is thus the ideal tool for designing a waste treatment facility in an optimum manner.

  16. Westinghouse waste simulation and optimization software tool

    Mennicken, Kim; Aign, Jorg [Westinghouse Electric Germany GmbH, Hamburg (Germany)

    2013-07-01

    Applications for dynamic simulation can be found in virtually all areas of process engineering. The tangible benefits of using dynamic simulation can be seen in tighter design, smoother start-ups and optimized operation. Thus, proper implementation of dynamic simulation can deliver substantial benefits. These benefits are typically derived from improved process understanding. Simulation gives confidence in evidence-based decisions and enables users to try out lots of 'what if' scenarios until one is sure that a decision is the right one. In radioactive waste treatment tasks, different kinds of waste with different volumes and properties have to be treated, e.g. from NPP operation or D and D activities. Finding a commercially and technically optimized waste treatment concept is a time-consuming and difficult task. The Westinghouse Waste Simulation and Optimization Software Tool will enable the user to quickly generate reliable simulation models of various process applications based on equipment modules. These modules can be built with ease and be integrated into the simulation model. This capability ensures that this tool is applicable to typical waste treatment tasks. The identified waste streams and the selected treatment methods are the basis of the simulation and optimization software. After implementing suitable equipment data into the model, process requirements and waste treatment data are fed into the simulation to finally generate primary simulation results. A sensitivity analysis of the automated optimization features of the software generates the lowest possible lifecycle cost for the simulated waste stream. In combination with proven waste management equipment and integrated waste management solutions, this tool provides reliable qualitative results that lead to effective planning and minimize the total project planning risk of any waste management activity. It is thus the ideal tool for designing a waste treatment facility in an optimum manner.

  17. Analytical Aerodynamic Simulation Tools for Vertical Axis Wind Turbines

    Deglaire, Paul

    2010-01-01

    Wind power is a renewable energy source that is today the fastest growing solution to reduce CO2 emissions in the electric energy mix. The upwind horizontal axis wind turbine with three blades has been the preferred technical choice for more than two decades, and this horizontal axis concept is today widely leading the market. The current PhD thesis covers an alternative type of wind turbine, with straight blades rotating along the vertical axis. A brief overview of the main differences between the horizontal and vertical axis concepts has been made. However, the main focus of this thesis is the aerodynamics of the wind turbine blades. Making aerodynamically efficient turbines starts with efficient blades. Making efficient blades requires a good understanding of the physical phenomena and effective simulation tools to model them. The specific aerodynamics of straight-bladed vertical axis turbine flow is reviewed together with the standard aerodynamic simulation tools that have been used in the past by blade and rotor designers. A reasonably fast (regarding computer power) and accurate (regarding comparison with experimental results) simulation method was still lacking in the field prior to the current work. This thesis aims at designing such a method. Analytical methods can be used to model complex flow if the geometry is simple. Therefore, a conformal mapping method is derived to transform any set of sections into a set of standard circles. Then analytical procedures are generalized to simulate moving multibody sections in the complex vertical-axis flows, and the forces experienced by the blades. Finally, the fast semi-analytical aerodynamic algorithm, boosted by fast multipole methods to handle a high number of vortices, is coupled with a simple structural model of the rotor to investigate potential aeroelastic instabilities. Together with these advanced simulation tools, a standard double multiple streamtube model has been developed and used to design several straight bladed
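
    The role of conformal mapping in such analytical methods can be illustrated with the classical Joukowski transform, which maps circles to flat plates and airfoil-like sections (the direction opposite to the thesis's sections-to-circles map; purely a generic textbook sketch):

    ```python
    import cmath
    import math

    def joukowski(z):
        """Classical Joukowski transform w = z + 1/z: maps circles in the
        z-plane to airfoil-like sections in the w-plane."""
        return z + 1.0 / z

    # The unit circle maps onto the real segment [-2, 2] (a flat plate):
    pts = [joukowski(cmath.exp(1j * 2 * math.pi * k / 360)) for k in range(1, 360)]
    max_imag = max(abs(p.imag) for p in pts)   # should vanish (flat plate)
    max_real = max(abs(p.real) for p in pts)   # should stay within [-2, 2]
    ```

    Once a section has been mapped to a circle, potential-flow singularities (sources, vortices) have simple closed-form solutions, which is what makes the semi-analytical approach fast.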

  18. Overview of the Hydrogen Financial Analysis Scenario Tool (H2FAST); NREL (National Renewable Energy Laboratory)

    Melaina, Marc; Bush, Brian; Penev, Michael

    2015-05-12

    This presentation provides an introduction to the Hydrogen Financial Analysis Scenario Tool (H2FAST) and includes an overview of each of the three versions of H2FAST: the Web tool, the Excel spreadsheet version, and the beta version of the H2FAST Business Case Scenario tool.

  19. Simulation Tool For Energy Consumption and Production

    Nysteen, Michael; Mynderup, Henrik; Poulsen, Bjarne

    2013-01-01

    In order to promote adoption of the smart grid with the general public, it is necessary to be able to visualize the benefits of a smart home. Software tools that model the effects can help significantly with this. However, only little work has been done in the area of simulating and visualizing the energy consumption in smart homes. This paper presents a prototype simulation tool that allows graphical modeling of a home. Based on the modeled homes, the user is able to simulate the energy consumption and compare scenarios. The simulations are based on dynamic weather and energy price data, as well as appliances and other electrical components used in the modeled homes.

  20. The development of fast simulation program for marine reactor parameters

    Chen Zhiyun; Hao Jianli; Chen Wenzhen

    2012-01-01

    Highlights: ► The simplified physical and mathematical models are proposed for a marine reactor system. ► A program is developed with Simulink module and Matlab file. ► The program developed has the merit of easy input preparation, output processing and fast running. ► The program can be used for the fast simulation of marine reactor parameters on the operating field. - Abstract: The fast simulation program for marine reactor parameters is developed based on the Simulink simulating software according to the characteristics of marine reactor with requirement of maneuverability and acute and fast response. The simplified core physical and thermal model, pressurizer model, steam generator model, control rod model, reactivity model and the corresponding Simulink modules are established. The whole program is developed by coupling all the Simulink modules. Two typical transient processes of marine reactor with fast load increase at low power level and load rejection at high power level are adopted to verify the program. The results are compared with those of Relap5/Mod3.2 with good consistency, and the program runs very fast. It is shown that the program is correct and suitable for the fast and accurate simulation of marine reactor parameters on the operating field, which is significant to the marine reactor safe operation.
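
    The simplified core physics such fast-running simulators use is typically a point-kinetics approximation; a minimal one-delayed-group sketch (explicit Euler; all constants are illustrative assumptions, not the paper's values) shows the structure of such a module:

    ```python
    # One-delayed-group point kinetics, explicit Euler: a minimal stand-in for
    # the "core physical model" block of a fast plant simulator. beta, lam
    # (decay constant), Lam (generation time) and rho are illustrative values.
    beta, lam, Lam = 0.0065, 0.08, 1.0e-4
    rho = 0.001                       # small positive reactivity step (< beta)
    n = 1.0                           # relative neutron density
    c = beta * n / (Lam * lam)        # precursors, initialized at equilibrium
    dt = 1.0e-5
    for _ in range(100000):           # integrate 1 s of transient
        dn = ((rho - beta) / Lam) * n + lam * c
        dc = (beta / Lam) * n - lam * c
        n += dn * dt
        c += dc * dt
    # After the prompt jump, n sits near beta / (beta - rho) and rises slowly.
    ```

    Wrapping each such block (core, pressurizer, steam generator, control rods) as a module and coupling their inputs and outputs is essentially what the Simulink-based program described above does.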

  1. Advances in Intelligent Modelling and Simulation Simulation Tools and Applications

    Oplatková, Zuzana; Carvalho, Marco; Kisiel-Dorohinicki, Marek

    2012-01-01

    The human capacity to abstract complex systems and phenomena into simplified models has played a critical role in the rapid evolution of our modern industrial processes and scientific research. As a science and an art, Modelling and Simulation have been one of the core enablers of this remarkable human trait, and have become a topic of great importance for researchers and practitioners. This book was created to compile some of the most recent concepts, advances, challenges and ideas associated with Intelligent Modelling and Simulation frameworks, tools and applications. The first chapter discusses the important aspects of human interaction and the correct interpretation of results during simulations. The second chapter gets to the heart of the analysis of entrepreneurship by means of agent-based modelling and simulations. The following three chapters bring together the central theme of simulation frameworks, first describing an agent-based simulation framework, then a simulator for electrical machines, and...

  2. ProbFAST: Probabilistic Functional Analysis System Tool

    Oliveira Thiago YK

    2010-03-01

    Full Text Available Abstract. Background: The post-genomic era has brought new challenges regarding the understanding of the organization and function of the human genome. Many of these challenges are centered on the meaning of differential gene regulation under distinct biological conditions and can be performed by analyzing the Multiple Differential Expression (MDE) of genes associated with normal and abnormal biological processes. Currently MDE analyses are limited to usual methods of differential expression initially designed for paired analysis. Results: We proposed a web platform named ProbFAST for MDE analysis which uses Bayesian inference to identify key genes that are intuitively prioritized by means of probabilities. A simulated study revealed that our method gives a better performance when compared to other approaches and, when applied to public expression data, we demonstrated its flexibility to obtain relevant genes biologically associated with normal and abnormal biological processes. Conclusions: ProbFAST is a freely accessible web-based application that enables MDE analysis on a global scale. It offers an efficient methodological approach for MDE analysis of a set of genes that are turned on and off related to functional information during the evolution of a tumor or tissue differentiation. The ProbFAST server can be accessed at http://gdm.fmrp.usp.br/probfast.

  3. ProbFAST: Probabilistic functional analysis system tool.

    Silva, Israel T; Vêncio, Ricardo Z N; Oliveira, Thiago Y K; Molfetta, Greice A; Silva, Wilson A

    2010-03-30

    The post-genomic era has brought new challenges regarding the understanding of the organization and function of the human genome. Many of these challenges are centered on the meaning of differential gene regulation under distinct biological conditions and can be performed by analyzing the Multiple Differential Expression (MDE) of genes associated with normal and abnormal biological processes. Currently MDE analyses are limited to usual methods of differential expression initially designed for paired analysis. We proposed a web platform named ProbFAST for MDE analysis which uses Bayesian inference to identify key genes that are intuitively prioritized by means of probabilities. A simulated study revealed that our method gives a better performance when compared to other approaches and, when applied to public expression data, we demonstrated its flexibility to obtain relevant genes biologically associated with normal and abnormal biological processes. ProbFAST is a freely accessible web-based application that enables MDE analysis on a global scale. It offers an efficient methodological approach for MDE analysis of a set of genes that are turned on and off related to functional information during the evolution of a tumor or tissue differentiation. The ProbFAST server can be accessed at http://gdm.fmrp.usp.br/probfast.

  4. Very fast simulated re-annealing

    L. Ingber

    1989-01-01

    An algorithm is developed to statistically find the best global fit of a nonlinear non-convex cost-function over a D-dimensional space. It is argued that this algorithm permits an annealing schedule for "temperature" T decreasing exponentially in annealing-time k, T = T0 exp(−c k^(1/D)). The introduction of re-annealing also permits adaptation to changing sensitivities in the multidimensional parameter-space. This annealing schedule is faster than fast Cauchy annealing, ...
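
    The schedule named in the abstract is easy to state in code (a sketch of the temperature law only, not of the full re-annealing algorithm; the constants below are illustrative). For the same dimension D, the exponential VFSR schedule cools faster than the fast Cauchy schedule T0 / k^(1/D):

    ```python
    import math

    def vfsr_temperature(t0, c, k, d):
        """VFSR annealing schedule: T_k = T0 * exp(-c * k**(1/D))."""
        return t0 * math.exp(-c * k ** (1.0 / d))

    def cauchy_temperature(t0, k, d):
        """Fast Cauchy annealing schedule: T_k = T0 / k**(1/D)."""
        return t0 / k ** (1.0 / d)

    # In D = 4 dimensions, sample the exponential schedule at k = 1, 16, 81, 256
    # (i.e. k**(1/4) = 1, 2, 3, 4), and compare against Cauchy at k = 16:
    temps = [vfsr_temperature(100.0, 1.0, k, 4) for k in (1, 16, 81, 256)]
    slower = cauchy_temperature(100.0, 16, 4)   # 100 / 2 = 50
    ```

    The k^(1/D) exponent is what keeps the schedule statistically admissible in D dimensions while still decaying exponentially overall.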

  5. Liquid metal cooled experimental fast reactor simulator

    Guimaraes, Lamartine; Braz Filho, Francisco; Borges, Eduardo M.; Rosa, Mauricio A.P.; Rocamora, Francisco; Hirdes, Viviane R.

    1997-01-01

    This paper is a continuation of the work that has been done in the area of fast reactor component dynamic analysis, as part of the REARA project at the IEAv/CTA-Brazil. A couple of preceding papers, presented in other meetings, introduced major concept design components of the REARA reactor. The components are set together in order to represent a full model of the power plant. Full model transient results will be presented, together with several parameters to help us to better establish the REARA experimental plant concept. (author). 8 refs., 6 figs., 3 tabs

  6. Simulation tools for detector and instrument design

    Kanaki, Kalliopi; Kittelmann, Thomas; Cai, Xiao Xiao

    2018-01-01

    The high performance requirements at the European Spallation Source have been driving the technological advances on the neutron detector front. Now more than ever it is important to optimize the design of detectors and instruments, to fully exploit the ESS source brilliance. Most of the simulation... a powerful set of tools to tailor the detector and instrument design to the instrument application.

  7. Fast simulation and topological vertex finding in JAVA

    Walkowiak, Wolfgang

    2001-01-01

    An overview of the fast Monte Carlo simulation for NLC detector studies as currently provided in the Java Analysis Studio environment is presented. Special emphasis is given to the simulation of tracks. In addition, the SLD collaboration's topological vertex finding algorithm (ZVTOP) has been implemented in the Java Analysis Studio framework

  8. Calculation of the MT25 microtron dynamics and its fast simulation

    Krist, Pavel; Chvatil, David; Bila, Jiri

    2011-01-01

    This paper presents the design of a mathematical model and its fast simulation developed for the setup of the control system of the MT25 microtron, which is a cyclic electron accelerator. This type of accelerator has been controlled manually until now. The mathematical model is based on calculations of the electron motion in the accelerating cavity and vacuum chamber. The simulation diagram was created using the Matlab-Simulink tools. (author)

  9. Advanced feeder control using fast simulation models

    Verheijen, O.S.; Op den Camp, O.M.G.C.; Beerkens, R.G.C.; Backx, A.C.P.M.; Huisman, L.; Drummond, C.H.

    2005-01-01

    For the automatic control of glass quality in glass production, the relation between process variable and product or glass quality and process conditions/process input parameters must be known in detail. So far, detailed 3-D glass melting simulation models were used to predict the effect of process

  10. Fast finite elements for surgery simulation

    Bro-Nielsen, Morten

    1997-01-01

    This paper discusses volumetric deformable models for modeling human body parts and organs in surgery simulation systems. These models are built using finite element models for linear elastic materials. To achieve real-time response condensation has been applied to the system stiffness matrix...

  11. New tools for the simulation and design of calorimeters

    Womersley, W.J.

    1989-01-01

    Two new approaches to the simulation and design of large hermetic calorimeters are presented. Firstly, the Shower Library scheme used in the fast generation of showers in the Monte Carlo of the calorimeter for the D-Zero experiment at the Fermilab Tevatron is described. Secondly, a tool for the design of future calorimeters is described, which can be integrated with a computer-aided design system to give engineering designers an immediate idea of the relative physics capabilities of different geometries. 9 refs., 6 figs., 1 tab

  12. Simulating the Behaviour of the Fast Reactor Joyo (Draft)

    Juutilainen, Pauli

    2008-01-01

    Motivated by the development of fast reactors, the behaviour of the Japanese experimental fast reactor Joyo is simulated with two Monte Carlo codes: Monte Carlo N-Particle (MCNP) and Probabilistic Scattering Game (PSG). The simulations are based on the benchmark study 'Japan's Experimental Fast Reactor Joyo MKI core: Sodium-Cooled Uranium-Plutonium Mixed Oxide Fueled Fast Core Surrounded by UO2 Blanket'. The study is focused on the criticality of the reactor, control rod worth, sodium void reactivity and the isothermal temperature coefficient of the reactor. These features are calculated by applying both homogeneous and heterogeneous reactor core models that are built according to the benchmark instructions. The results of the two models obtained by the two codes are compared with each other and especially with the experimental results presented in the benchmark. (author)

  13. Fast simulation techniques for switching converters

    King, Roger J.

    1987-01-01

    Techniques for simulating a switching converter are examined. The state equations for the equivalent circuits, which represent the switching converter, are presented and explained. The uses of the Newton-Raphson iteration, low ripple approximation, half-cycle symmetry, and discrete time equations to compute the interval durations are described. An example is presented in which these methods are illustrated by applying them to a parallel-loaded resonant inverter with three equivalent circuits for its continuous mode of operation.
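The interval-duration computation described above can be illustrated with a generic Newton-Raphson root find on an analytic state trajectory; the waveform below is a hypothetical damped sinusoid chosen for the example, not the paper's actual converter equations:

```python
import math

def newton(f, dfdt, t0, tol=1e-12, max_iter=50):
    """Newton-Raphson root find for the switching instant f(t) = 0."""
    t = t0
    for _ in range(max_iter):
        step = f(t) / dfdt(t)
        t -= step
        if abs(step) < tol:
            break
    return t

# Hypothetical resonant-interval current: i(t) = I0 * exp(-a t) * sin(w t)
I0, a, w = 10.0, 500.0, 2 * math.pi * 25e3
i  = lambda t: I0 * math.exp(-a * t) * math.sin(w * t)
di = lambda t: I0 * math.exp(-a * t) * (w * math.cos(w * t) - a * math.sin(w * t))

# The interval ends when the current rings back through zero (t = pi/w);
# start the iteration from an estimate inside the half-cycle.
t_switch = newton(i, di, t0=0.9 * math.pi / w)
assert abs(t_switch - math.pi / w) < 1e-9
```

Because each equivalent circuit is linear between switching instants, the trajectory is known analytically and Newton-Raphson converges on the interval boundary in a handful of iterations, which is what makes this class of simulation fast.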

  14. SIMONE: Tool for Data Analysis and Simulation

    Chudoba, V.; Hnatio, B.; Sharov, P.; Papka, Paul

    2013-06-01

    SIMONE is a software tool based on the ROOT Data Analysis Framework, developed in collaboration between FLNR JINR and iThemba LABS. It is intended for physicists planning experiments and analysing experimental data. The goal of the SIMONE framework is to provide a flexible, user-friendly, efficient and well-documented system for the simulation of a wide range of nuclear physics experiments. The most significant conditions and physical processes can be taken into account during simulation of the experiment. The user can create his own experimental setup through access to predefined detector geometries. Simulated data are made available in the same format as for the real experiment, allowing identical analysis of both experimental and simulated data. A significant time reduction is expected during experiment planning and data analysis. (authors)

  15. Fast simulation of electromagnetic showers in the ZEUS calorimeter

    Peso, J. del; Ros, E.

    1991-02-01

    We present a fast Monte Carlo algorithm for the generation of electromagnetic showers in the uranium-scintillator sampling calorimeter of the ZEUS experiment. This algorithm includes a simulation of longitudinal and transverse profiles, their fluctuations, and the correlations between these fluctuations. The tuning of this fast Monte Carlo against data generated with EGS is described, and its performance, together with some applications, is discussed. (orig.)
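Fast shower generators of this kind typically sample a parameterized longitudinal profile rather than tracking every secondary. A common choice (assumed here purely for illustration; the ZEUS parameterization and its fluctuation model differ in detail) is the gamma-function profile dE/dt ∝ (bt)^(a-1) e^(-bt), with t the depth in radiation lengths:

```python
import math

def longitudinal_profile(t, E0, a, b):
    """Average longitudinal shower profile dE/dt (gamma-function form),
    with t the depth in radiation lengths, E0 the incident energy, and
    a, b hypothetical shape/scale parameters."""
    return E0 * b * (b * t) ** (a - 1) * math.exp(-b * t) / math.gamma(a)

# Hypothetical parameters for an electron shower of E0 = 10 (arbitrary units)
E0, a, b = 10.0, 4.0, 0.5
depths = [i * 0.5 for i in range(1, 61)]               # 0.5 .. 30 X0
dE = [longitudinal_profile(t, E0, a, b) * 0.5 for t in depths]
total = sum(dE)
# The profile integrates (approximately) back to the incident energy
assert abs(total - E0) / E0 < 0.05
```

Sampling a and b event by event from fitted (correlated) distributions is what reproduces the profile fluctuations mentioned in the abstract, at a tiny fraction of the cost of a full EGS shower.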

  16. Fast and Automatic Ultrasound Simulation from CT Images

    Weijian Cong

    2013-01-01

    Ultrasound is currently widely used in clinical diagnosis because of its fast and safe imaging principles. As the anatomical structures present in an ultrasound image are not as clear as in CT or MRI, physicians usually need advanced clinical knowledge and experience to distinguish diseased tissues. Fast simulation of ultrasound provides a cost-effective way to train physicians to correlate ultrasound images with the underlying anatomic structures. In this paper, a novel method is proposed for fast simulation of ultrasound from a CT image. A multiscale method is developed to enhance tubular structures so as to simulate the blood flow. The acoustic response of common tissues is generated by weighted integration of adjacent regions on the ultrasound propagation path in the CT image, from which parameters including attenuation, reflection, scattering, and noise are estimated simultaneously. The thin-plate spline interpolation method is employed to transform the simulated image between polar and rectangular coordinate systems. The Kaiser window function is utilized to produce the integration and radial blurring effects of multiple transducer elements. Experimental results show that the developed method is fast and effective, allowing realistic ultrasound images to be generated quickly. Given that the developed method is fully automatic, it can be utilized for ultrasound-guided navigation in clinical practice and for training purposes.

  17. Transport Simulations for Fast Ignition on NIF

    Strozzi, D J; Tabak, M; Grote, D P; Cohen, B I; Shay, H D; Town, R J; Kemp, A J; Key, M

    2009-10-26

    We are designing a full hydro-scale cone-guided, indirect-drive FI coupling experiment for NIF with the ARC-FIDO short-pulse laser. Current rad-hydro designs with limited fuel jetting into the cone tip are not yet adequate for ignition; designs are improving. Electron beam transport simulations (implicit-PIC LSP) show: (1) magnetic fields and smaller angular spreads increase coupling to the ignition-relevant 'hot spot' (20 um radius); (2) plastic CD (for a warm target) produces somewhat better coupling than pure D (cryogenic target) due to enhanced resistive B fields; and (3) the optimal T_hot for this target is ~1 MeV; coupling falls by 3x as T_hot rises to 4 MeV.

  18. An integrated computational tool for precipitation simulation

    Cao, W.; Zhang, F.; Chen, S.-L.; Zhang, C.; Chang, Y. A.

    2011-07-01

    Computer aided materials design is of increasing interest because the conventional approach solely relying on experimentation is no longer viable within the constraint of available resources. Modeling of microstructure and mechanical properties during precipitation plays a critical role in understanding the behavior of materials and thus accelerating the development of materials. Nevertheless, an integrated computational tool coupling reliable thermodynamic calculation, kinetic simulation, and property prediction of multi-component systems for industrial applications is rarely available. In this regard, we are developing a software package, PanPrecipitation, under the framework of integrated computational materials engineering to simulate precipitation kinetics. It is seamlessly integrated with the thermodynamic calculation engine, PanEngine, to obtain accurate thermodynamic properties and atomic mobility data necessary for precipitation simulation.

  19. Real time simulation method for fast breeder reactors dynamics

    Miki, Tetsushi; Mineo, Yoshiyuki; Ogino, Takamichi; Kishida, Koji; Furuichi, Kenji.

    1985-01-01

    The development of multi-purpose real time simulator models with suitable plant dynamics was made; these models can be used not only in training operators but also in designing control systems, operation sequences and many other items which must be studied for the development of new type reactors. The prototype fast breeder reactor ''Monju'' is taken as an example. Analysis is made on various factors affecting the accuracy and computer load of its dynamic simulation. A method is presented which determines the optimum number of nodes in distributed systems and time steps. The oscillations due to the numerical instability are observed in the dynamic simulation of evaporators with a small number of nodes, and a method to cancel these oscillations is proposed. It has been verified through the development of plant dynamics simulation codes that these methods can provide efficient real time dynamics models of fast breeder reactors. (author)

  20. FastMag: Fast micromagnetic simulator for complex magnetic structures (invited)

    Chang, R.; Li, S.; Lubarda, M. V.; Livshitz, B.; Lomakin, V.

    2011-04-01

    A fast micromagnetic simulator (FastMag) for general problems is presented. FastMag solves the Landau-Lifshitz-Gilbert equation and can handle multiscale problems with a high computational efficiency. The simulator derives its high performance from efficient methods for evaluating the effective field and from implementations on massively parallel graphics processing unit (GPU) architectures. FastMag discretizes the computational domain into tetrahedral elements and therefore is highly flexible for general problems. The magnetostatic field is computed via the superposition principle for both volume and surface parts of the computational domain. This is accomplished by implementing efficient quadrature rules and analytical integration for overlapping elements in which the integral kernel is singular. The discretized superposition integrals are then computed using a nonuniform grid interpolation method, which evaluates the field from N sources at N collocated observers in O(N) operations. This approach allows handling objects of arbitrary shape, allows easy calculation of the field outside the magnetized domains, does not require solving a linear system of equations, and requires little memory. FastMag is implemented on GPUs, with GPU versus central processing unit speed-ups of 2 orders of magnitude. Simulations are shown of a large array of magnetic dots and of a recording head fully discretized down to the exchange length, with over a hundred million tetrahedral elements, on an inexpensive desktop computer.
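For context, the brute-force superposition that such fast methods accelerate costs O(N·M) for N sources and M observers. A naive point-dipole version in NumPy (illustrative only, with unit prefactors and invented geometry; FastMag's actual kernels integrate over tetrahedral elements):

```python
import numpy as np

def dipole_field(sources, moments, observers):
    """Brute-force superposition of point-dipole fields:
    B(r) = (3 (m.rhat) rhat - m) / |r|^3   (unit prefactor for illustration).
    Costs O(N*M) for N sources and M observers."""
    B = np.zeros_like(observers)
    for s, m in zip(sources, moments):
        r = observers - s                      # (M, 3) separation vectors
        d = np.linalg.norm(r, axis=1)          # (M,) distances
        rhat = r / d[:, None]
        mdotr = rhat @ m
        B += (3 * mdotr[:, None] * rhat - m) / d[:, None] ** 3
    return B

rng = np.random.default_rng(0)
src = rng.normal(size=(50, 3))
mom = rng.normal(size=(50, 3))
obs = rng.normal(size=(20, 3)) + 10.0          # observers outside the sources
B = dipole_field(src, mom, obs)

# The field is linear in the moments, as superposition requires
assert np.allclose(dipole_field(src, 2 * mom, obs), 2 * B)
```

Methods like the nonuniform-grid interpolation cited above approximate exactly this sum in O(N) operations, which is where the two-orders-of-magnitude speed-ups come from at large N.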

  1. A methodology of neutronic-thermodynamics simulation for fast reactor

    Waintraub, M.

    1986-01-01

    Aiming at a general optimization of the design, and at controlled fuel depletion and management, this paper develops a neutronic-thermodynamics simulator, SIRZ, which besides being sufficiently precise is also economical: it achieves a 75% reduction in CPU time for a startup calculation compared with the same calculation in the CITATION code. The perturbation-based simulation system, applied to fast reactors, was tested and produced errors smaller than 1% in all components of the reference state given by the CITATION code. (author)
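The perturbation approach can be illustrated with standard first-order eigenvalue perturbation theory: for a small symmetric change δA to a symmetric operator, an eigenvalue shifts by approximately φᵀδAφ/(φᵀφ), where φ is the unperturbed eigenvector. A generic NumPy sketch (random matrices, not the SIRZ or CITATION models):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(6, 6)); A = (A + A.T) / 2     # symmetric "reference" operator
dA = 1e-4 * rng.normal(size=(6, 6)); dA = (dA + dA.T) / 2   # small perturbation

w, V = np.linalg.eigh(A)
lam0, phi = w[-1], V[:, -1]                        # fundamental mode of reference state

# First-order perturbation estimate: dlam ~ phi^T dA phi / (phi^T phi)
dlam_pert = phi @ dA @ phi / (phi @ phi)
lam_exact = np.linalg.eigh(A + dA)[0][-1]

# The estimate agrees with the exact eigenvalue to second order in dA
assert abs(lam0 + dlam_pert - lam_exact) < 1e-5
```

This is the generic reason perturbation methods are cheap: the perturbed state is never recomputed from scratch, only a few inner products with the reference solution are needed.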

  2. A Tool for Simulating Rotating Coil Magnetometers

    Bottura, L; Schnizer, P; Smirnov, N

    2002-01-01

    When investigating the quality of a magnetic measurement system, it is often difficult to identify the "trouble maker" in the system, as different effects can have similar influences on the measurement results. We describe a tool in this paper that allows numerical investigation of the effects produced by different imperfections of the components of such a system, including, but not limited to, vibration and movement of the rotating coil, the influence of electrical noise on the system, and angular encoder imperfections. The tool can simulate both the deterministic and stochastic parts of these imperfections. We outline the physical models used, which are generally based on experience or first principles. Comparisons to analytical results are shown. The modular structure of the general design of this tool permits the inclusion of new modules for new devices and effects.

  3. Fast emulation of track reconstruction in the CMS simulation

    Komm, Matthias

    2017-01-01

    Simulated samples of various physics processes are a key ingredient within analyses to unlock the physics behind LHC collision data. Samples with more and more statistics are required to keep up with the increasing amounts of recorded data. During sample generation, significant computing time is spent on the reconstruction of charged particle tracks from energy deposits which additionally scales with the pileup conditions. In CMS, the FastSimulation package is developed for providing a fast alternative to the standard simulation and reconstruction workflow. It employs various techniques to emulate track reconstruction effects in particle collision events. Several analysis groups in CMS are utilizing the package, in particular those requiring many samples to scan the parameter space of physics models (e.g. SUSY) or for the purpose of estimating systematic uncertainties. The strategies for and recent developments in this emulation are presented, including a novel, flexible implementation of tracking emulation w...

  4. Building energy demand aggregation and simulation tools

    Gianniou, Panagiota; Heller, Alfred; Rode, Carsten

    2015-01-01

    ...to neighbourhoods and cities. Buildings occupy a key place in the development of smart cities as they represent an important potential to integrate smart energy solutions. Building energy consumption significantly affects the performance of the entire energy network. Therefore, a realistic estimation of the aggregated building energy use will not only ensure security of supply but also enhance the stabilization of national energy balances. In this study, the aggregation of building energy demand was investigated for a real case in Sønderborg, Denmark. Sixteen single-family houses, mainly built in the 1960s and all connected to the regional district heating network, were examined. The aggregation of building energy demands was carried out according to typologies, represented by archetype buildings. These houses were modelled with dynamic energy simulation software and with a simplified simulation tool...

  5. A powerful tool for plant specific simulations

    Boucau, J.; Ofstun, R.

    1988-01-01

    Following the Three Mile Island (TMI) accident, special emphasis has been placed by the utilities and safety authorities on the training of plant operators and engineers to make them more familiar with the dynamic behavior of the plant in post-accident conditions. With a view to providing transient response under realistic operating conditions, concerns were raised over: the use of conservative safety analysis models for analyzing operational and accident transients; the ability to dynamically simulate the expected operator actions; and the need to simulate the plant during long-term cooling recovery. To meet these requirements, an interactive, user-friendly, cost-efficient dynamic model capable of simulating two-phase conditions is needed. In response to these post-TMI concerns, Westinghouse has developed the Transient Real-Time Engineering Analysis Tool (TREAT), a general-purpose interactive thermal-hydraulic network code for thermal-hydraulic transient analysis. The applications of TREAT are numerous and range from engineering studies to plant operator training using the newly developed TREAT SIMULATOR. The purpose of this paper is to describe the TREAT program and potential applications. Special emphasis will be placed on a plant-specific training application performed for a French 4-loop plant

  6. FY14 Milestone: Simulated Impacts of Life-Like Fast Charging on BEV Batteries

    Neubauer, Jeremy [National Renewable Energy Lab. (NREL), Golden, CO (United States). Transportation and Hydrogen Systems Center; Wood, Eric [National Renewable Energy Lab. (NREL), Golden, CO (United States). Transportation and Hydrogen Systems Center; Burton, Evan [National Renewable Energy Lab. (NREL), Golden, CO (United States). Transportation and Hydrogen Systems Center; Smith, Kandler [National Renewable Energy Lab. (NREL), Golden, CO (United States). Transportation and Hydrogen Systems Center; Pesaran, Ahmad [National Renewable Energy Lab. (NREL), Golden, CO (United States). Transportation and Hydrogen Systems Center

    2014-09-01

    Fast charging is attractive to battery electric vehicle (BEV) drivers for its ability to enable long-distance travel and quickly recharge depleted batteries on short notice. However, such aggressive charging and the sustained vehicle operation that results could lead to excessive battery temperatures and degradation. Properly assessing the consequences of fast charging requires accounting for the disparate cycling, heating, and aging of individual cells in large BEV packs when subjected to realistic travel patterns, usage of fast chargers, and climates over long durations (i.e., years). The U.S. Department of Energy's Vehicle Technologies Office has supported NREL's development of BLAST-V, the Battery Lifetime Analysis and Simulation Tool for Vehicles, to create a tool capable of accounting for all of these factors. The authors present the findings of applying this tool to realistic fast-charge scenarios. The effects of different travel patterns, climates, battery sizes, battery thermal management systems, and other factors on battery performance and degradation are presented. The primary challenge for BEV batteries operated in the presence of fast charging is controlling maximum battery temperature, which can be achieved with active battery cooling systems.

  7. Development and application of modeling tools for sodium fast reactor inspection

    Le Bourdais, Florian; Marchand, Benoît; Baronian, Vahan [CEA LIST, Centre de Saclay F-91191 Gif-sur-Yvette (France)

    2014-02-18

    To support the development of in-service inspection methods for the Advanced Sodium Test Reactor for Industrial Demonstration (ASTRID) project led by the French Atomic Energy Commission (CEA), several tools that allow situations specific to Sodium cooled Fast Reactors (SFR) to be modeled have been implemented in the CIVA software and exploited. This paper details specific applications and results obtained. For instance, a new specular reflection model allows the calculation of complex echoes from scattering structures inside the reactor vessel. EMAT transducer simulation models have been implemented to develop new transducers for sodium visualization and imaging. Guided wave analysis tools have been developed to permit defect detection in the vessel shell. Application examples and comparisons with experimental data are presented.

  8. Simulator platform for fast reactor operation and safety technology demonstration

    Vilim, R.B.; Park, Y.S.; Grandy, C.; Belch, H.; Dworzanski, P.; Misterka, J.

    2012-01-01

    A simulator platform for visualization and demonstration of innovative concepts in fast reactor technology is described. The objective is to make more accessible the workings of fast reactor technology innovations and to do so in a human factors environment that uses state-of-the-art visualization technologies. In this work the computer codes in use at Argonne National Laboratory (ANL) for the design of fast reactor systems are being integrated to run on this platform. This includes linking reactor systems codes with mechanical structures codes and using advanced graphics to depict the thermo-hydraulic-structure interactions that give rise to an inherently safe response to upsets. It also includes visualization of mechanical systems operation including advanced concepts that make use of robotics for operations, in-service inspection, and maintenance.

  9. Simulator platform for fast reactor operation and safety technology demonstration

    Vilim, R. B.; Park, Y. S.; Grandy, C.; Belch, H.; Dworzanski, P.; Misterka, J. (Nuclear Engineering Division)

    2012-07-30

    A simulator platform for visualization and demonstration of innovative concepts in fast reactor technology is described. The objective is to make more accessible the workings of fast reactor technology innovations and to do so in a human factors environment that uses state-of-the-art visualization technologies. In this work the computer codes in use at Argonne National Laboratory (ANL) for the design of fast reactor systems are being integrated to run on this platform. This includes linking reactor systems codes with mechanical structures codes and using advanced graphics to depict the thermo-hydraulic-structure interactions that give rise to an inherently safe response to upsets. It also includes visualization of mechanical systems operation including advanced concepts that make use of robotics for operations, in-service inspection, and maintenance.

  10. Non-Maxwellian fast particle effects in gyrokinetic GENE simulations

    Di Siena, A.; Görler, T.; Doerk, H.; Bilato, R.; Citrin, J.; Johnson, T.; Schneider, M.; Poli, E.; JET Contributors

    2018-04-01

    Fast ions have recently been found to significantly impact and partially suppress plasma turbulence in a number of scenarios, both in experimental and numerical studies. Understanding the underlying physics and identifying the range of their beneficial effect is an essential task for future fusion reactors, where highly energetic ions are generated through fusion reactions and external heating schemes. However, in many gyrokinetic codes fast ions are, for simplicity, treated as equivalent-Maxwellian-distributed particle species, although it is well known that rigorously modelling highly non-thermalised particles requires a non-Maxwellian background distribution function. To study the impact of this assumption, the gyrokinetic code GENE has recently been extended to support arbitrary background distribution functions, which might be either analytical, e.g., slowing-down and bi-Maxwellian, or obtained from numerical fast ion models. A particular JET plasma with strong fast-ion-related turbulence suppression is revisited with these new code capabilities, with both linear and nonlinear gyrokinetic simulations. It appears that the fast ion stabilization tends to be less strong but still substantial with more realistic distributions, and this improves the quantitative power balance agreement with experiments.
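One of the analytical backgrounds mentioned above is the classical slowing-down distribution. A minimal sketch comparing its tail to a Maxwellian at the same bulk temperature (unnormalised, in invented thermal-speed units; GENE's actual implementation is far more general):

```python
import math

def slowing_down(v, v_birth, v_c):
    """Classical isotropic slowing-down distribution, unnormalised:
        f(v) ~ 1 / (v^3 + v_c^3)   for v <= v_birth, else 0,
    where v_birth is the fast-ion birth speed and v_c the critical speed."""
    return 1.0 / (v ** 3 + v_c ** 3) if v <= v_birth else 0.0

def maxwellian(v, v_th):
    """Isotropic Maxwellian of thermal speed v_th, unnormalised."""
    return math.exp(-(v / v_th) ** 2)

# Hypothetical values: birth speed 10 and critical speed 3 thermal speeds
v_birth, v_c, v_th = 10.0, 3.0, 1.0

# The slowing-down tail decays algebraically, so it dwarfs the Gaussian
# Maxwellian tail at suprathermal speeds; the distribution cuts off at v_birth.
ratio_at_5 = slowing_down(5.0, v_birth, v_c) / maxwellian(5.0, v_th)
assert ratio_at_5 > 1.0
assert slowing_down(12.0, v_birth, v_c) == 0.0
```

The algebraic (rather than Gaussian) tail is precisely why an equivalent-Maxwellian treatment misrepresents the fast-ion velocity-space structure that drives the stabilization.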

  11. Simulations of ICRF-fast wave current drive on DIIID

    Ehst, D.A.

    1990-06-01

    Self-consistent calculations of MHD equilibria, generated by fast wave current drive and including the bootstrap effect, were done to guide and anticipate the results of upcoming experiments on the DIIID tokamak. The simulations predict that 2 MW of ICRF power is more than adequate to create several hundred kiloamperes in steady state; the total current increases with the temperature and density of the target plasma. 12 refs., 12 figs., 1 tab

  12. FAST modularization framework for wind turbine simulation: full-system linearization

    Jonkman, J. M.; Jonkman, B. J.

    2016-09-01

    The wind engineering community relies on multiphysics engineering software to run nonlinear time-domain simulations, e.g., for design-standards-based loads analysis. Although most physics involved in wind energy are nonlinear, linearization of the underlying nonlinear system equations is often advantageous to understand the system response and to exploit well-established methods and tools for analyzing linear systems. This paper presents the development and verification of the new linearization functionality of the open-source engineering tool FAST v8 for land-based wind turbines, as well as the concepts and mathematical background needed to understand and apply it correctly.
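The idea of linearizing a nonlinear state-space model dx/dt = f(x, u) about an operating point can be sketched generically with central finite differences, yielding the Jacobians A = ∂f/∂x and B = ∂f/∂u. The toy model below is invented for the example and is not FAST's actual (modular, partly analytical) linearization:

```python
import numpy as np

def linearize(f, x0, u0, eps=1e-6):
    """Numerically linearize dx/dt = f(x, u) about operating point (x0, u0),
    returning A = df/dx and B = df/du via central differences."""
    n, m = len(x0), len(u0)
    A = np.zeros((n, n))
    B = np.zeros((n, m))
    for j in range(n):
        dx = np.zeros(n); dx[j] = eps
        A[:, j] = (f(x0 + dx, u0) - f(x0 - dx, u0)) / (2 * eps)
    for j in range(m):
        du = np.zeros(m); du[j] = eps
        B[:, j] = (f(x0, u0 + du) - f(x0, u0 - du)) / (2 * eps)
    return A, B

# Toy nonlinear "turbine" model: quadratic rotor drag plus a first-order
# actuator lag (hypothetical coefficients, for illustration only)
def f(x, u):
    omega, tau = x
    return np.array([0.5 * u[0] - 0.1 * omega ** 2 - tau,
                     2.0 * (u[0] - tau)])

A, B = linearize(f, x0=np.array([1.0, 0.5]), u0=np.array([1.0]))
assert np.allclose(A, [[-0.2, -1.0], [0.0, -2.0]], atol=1e-5)
assert np.allclose(B, [[0.5], [2.0]], atol=1e-5)
```

Once A and B are in hand, the full toolbox of linear systems analysis (eigenvalues for modal damping, transfer functions, controller design) applies, which is the advantage the abstract describes.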

  13. FAST Modularization Framework for Wind Turbine Simulation: Full-System Linearization: Preprint

    Jonkman, Jason; Jonkman, Bonnie

    2016-11-01

    The wind engineering community relies on multiphysics engineering software to run nonlinear time-domain simulations e.g. for design-standards-based loads analysis. Although most physics involved in wind energy are nonlinear, linearization of the underlying nonlinear system equations is often advantageous to understand the system response and exploit well-established methods and tools for analyzing linear systems. This paper presents the development and verification of the new linearization functionality of the open-source engineering tool FAST v8 for land-based wind turbines, as well as the concepts and mathematical background needed to understand and apply it correctly.

  14. HAMMER: Reweighting tool for simulated data samples

    Duell, Stephan; Ligeti, Zoltan; Papucci, Michele; Robinson, Dean

    2016-01-01

    Modern flavour physics experiments, such as Belle II or LHCb, require large samples of generated Monte Carlo events. Monte Carlo events are often processed in a sophisticated chain that includes a simulation of the detector response. The generation and reconstruction of large samples is resource-intensive and in principle would need to be repeated if, e.g., parameters of the underlying models change due to new measurements or new insights. To avoid having to regenerate large samples, we work on a tool, the Helicity Amplitude Module for Matrix Element Reweighting (HAMMER), which allows one to easily reweight existing events in the context of semileptonic b → q ℓ ν̄ analyses to new model parameters or new physics scenarios.
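Matrix-element reweighting in general assigns each existing event a weight equal to the ratio of squared amplitudes under the new and old models evaluated at the event's kinematics, so the expensive detector simulation need not be rerun. A schematic sketch with an invented one-dimensional "kinematic" variable (not HAMMER's actual API or amplitudes):

```python
def reweight(events, me2_old, me2_new):
    """Per-event reweighting from an old model to a new one:
    w_i = |M_new(x_i)|^2 / |M_old(x_i)|^2."""
    return [me2_new(x) / me2_old(x) for x in events]

# Hypothetical 1-D kinematics: a flat old model vs. a linearly tilted new one
events = [0.1 * i for i in range(1, 10)]
me2_old = lambda x: 1.0
me2_new = lambda x: 1.0 + 0.5 * x
weights = reweight(events, me2_old, me2_new)

# A weighted average over the old sample estimates the new-model expectation:
# the tilt toward large x pulls the weighted mean above the unweighted one.
mean_x_new = sum(w * x for w, x in zip(weights, events)) / sum(weights)
assert mean_x_new > sum(events) / len(events)
```

Because the weights depend only on the generated (truth-level) kinematics, the same reconstructed sample can serve arbitrarily many model hypotheses.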

  15. A fast method for optical simulation of flood maps of light-sharing detector modules

    Shi, Han; Du, Dong; Xu, JianFeng; Moses, William W.; Peng, Qiyu

    2015-01-01

    Optical simulation at the detector module level is highly desired for Positron Emission Tomography (PET) system design. Commonly used simulation toolkits such as GATE are not efficient in the optical simulation of detector modules with complicated light-sharing configurations, where a vast number of photons need to be tracked. We present a fast approach based on a simplified specular reflectance model and a structured light-tracking algorithm to speed up the photon tracking in detector modules constructed with a polished finish and specular reflector materials. We simulated conventional block detector designs with different slotted light guide patterns using the new approach and compared the outcomes with those from GATE simulations. While the two approaches generated comparable flood maps, the new approach was 200-600 times faster. The new approach has also been validated by constructing a prototype detector and comparing the simulated flood map with the experimental flood map. The experimental flood map has nearly uniformly distributed spots similar to those in the simulated flood map. In conclusion, the new approach provides a fast and reliable simulation tool for assisting in the development of light-sharing-based detector modules with a polished surface finish and specular reflector materials.
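At the core of a specular light-tracking model is the mirror reflection of a ray direction at a surface normal, r = d - 2(d·n)n. A minimal sketch (generic geometry, not the paper's detector model):

```python
import numpy as np

def reflect(d, n):
    """Specular reflection of direction d at a surface with normal n:
    r = d - 2 (d.n) n, with n normalised first."""
    n = n / np.linalg.norm(n)
    return d - 2.0 * np.dot(d, n) * n

# A ray travelling down-and-right hits a horizontal surface (normal +y)
d = np.array([1.0, -1.0, 0.0]) / np.sqrt(2)
r = reflect(d, np.array([0.0, 1.0, 0.0]))

assert np.allclose(r, [1.0 / np.sqrt(2), 1.0 / np.sqrt(2), 0.0])
assert np.isclose(np.linalg.norm(r), 1.0)   # direction stays unit length
```

Because a polished, specular surface maps each incoming direction to exactly one outgoing direction, bundles of photons sharing a direction can be propagated as one structured ray, which is the source of the speed-up over photon-by-photon Monte Carlo tracking.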

  16. Coarse-graining stochastic biochemical networks: adiabaticity and fast simulations

    Nemenman, Ilya [Los Alamos National Laboratory; Sinitsyn, Nikolai [Los Alamos National Laboratory; Hengartner, Nick [Los Alamos National Laboratory

    2008-01-01

    We propose a universal approach for analysis and fast simulations of stiff stochastic biochemical kinetics networks, which rests on elimination of fast chemical species without a loss of information about mesoscopic, non-Poissonian fluctuations of the slow ones. Our approach, which is similar to the Born-Oppenheimer approximation in quantum mechanics, follows from the stochastic path integral representation of the cumulant generating function of reaction events. In applications with a small number of chemical reactions, it produces analytical expressions for cumulants of chemical fluxes between the slow variables. This allows for a low-dimensional, interpretable representation and can be used for coarse-grained numerical simulation schemes with a small computational complexity and yet high accuracy. As an example, we derive the coarse-grained description for a chain of biochemical reactions, and show that the coarse-grained and the microscopic simulations are in agreement, but the coarse-grained simulations are three orders of magnitude faster.

  17. Fast Learning for Immersive Engagement in Energy Simulations

    Bush, Brian W [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Bugbee, Bruce [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Gruchalla, Kenny M [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Krishnan, Venkat K [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Potter, Kristin C [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2018-04-25

    Fast computation is critical for immersive engagement with and learning from energy simulations, and it would be furthered by a general method for creating rapidly computed, simplified versions of NREL's computation-intensive energy simulations. Created using machine learning techniques, these 'reduced-form' simulations can provide statistically sound estimates of the results of the full simulations at a fraction of the computational cost, with response times - typically less than one minute of wall-clock time - suitable for real-time human-in-the-loop design and analysis. Additionally, uncertainty quantification techniques can document the accuracy of the approximate models and their domain of validity. The approximation methods are applicable to a wide range of computational models, including supply-chain models, electric power grid simulations, and building models. These reduced-form representations cannot replace or re-implement existing simulations; instead, they supplement them by enabling rapid scenario design and quality assurance for large sets of simulations. We present an overview of the framework and methods we have implemented for developing these reduced-form representations.
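The reduced-form idea can be illustrated by fitting a cheap surrogate to samples of an expensive model. Here a simple polynomial fit to an invented one-dimensional response surface stands in for the machine-learning methods described above:

```python
import numpy as np

# Stand-in for an expensive "full simulation" (hypothetical response surface)
def full_simulation(x):
    return np.sin(3 * x) + 0.5 * x ** 2

# Fit a cheap reduced-form model (polynomial) on a handful of design points
x_train = np.linspace(0.0, 1.0, 20)
y_train = full_simulation(x_train)
coeffs = np.polyfit(x_train, y_train, deg=5)
surrogate = np.poly1d(coeffs)

# Document the accuracy over the surrogate's stated domain of validity [0, 1]
x_test = np.linspace(0.0, 1.0, 200)
max_err = np.max(np.abs(surrogate(x_test) - full_simulation(x_test)))
assert max_err < 1e-2
```

The surrogate evaluates in microseconds regardless of how costly the full model is, which is what makes real-time, human-in-the-loop scenario exploration possible; the held-out error check plays the role of the uncertainty quantification step.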

  18. Fast and Accurate Simulation of the Cray XMT Multithreaded Supercomputer

    Villa, Oreste; Tumeo, Antonino; Secchi, Simone; Manzano Franco, Joseph B.

    2012-12-31

    Irregular applications, such as data mining and analysis or graph-based computations, show unpredictable memory/network access patterns and control structures. Highly multithreaded architectures with large processor counts, like the Cray MTA-1, MTA-2 and XMT, appear to address their requirements better than commodity clusters. However, research on highly multithreaded systems is currently limited by the lack of adequate architectural simulation infrastructures due to issues such as the size of the machines, memory footprint, simulation speed, accuracy and customization. At the same time, Shared-memory MultiProcessors (SMPs) with multi-core processors have become an attractive platform to simulate large-scale machines. In this paper, we introduce a cycle-level simulator of the highly multithreaded Cray XMT supercomputer. The simulator runs unmodified XMT applications. We discuss how we tackled the challenges posed by its development, detailing the techniques introduced to make the simulation as fast as possible while maintaining a high accuracy. By mapping XMT processors (ThreadStorm with 128 hardware threads) to host computing cores, the simulation speed remains constant as the number of simulated processors increases, up to the number of available host cores. The simulator supports zero-overhead switching among different accuracy levels at run-time and includes a network model that takes into account contention. On a modern 48-core SMP host, our infrastructure simulates a large set of irregular applications 500 to 2000 times slower than real time when compared to a 128-processor XMT, while remaining within 10% accuracy. Emulation is only 25 to 200 times slower than real time.

  19. Simulation as a hospital management support tool

    Nemesio Rodrigues Capocci

    2017-07-01

    This study aims to demonstrate the use of the discrete-event simulation technique as a hospital management support tool, covering the complex processes existing in a health unit. The system must be analysed as a whole from the perspective of the service level provided to patients with regard to waiting times; the role of this technique is to show the behavior of a given system. Data were collected from employees of a public polyclinic located in a city in greater São Paulo, by means of interviews whose questions were designed to determine the time spent in the processes of the service system. These data were inserted into the Arena software in flowchart format for analysis and identification of the problem. Since the person responsible for the screening process was overloaded, causing longer waiting times for patients submitted for screening, some changes were made in the model in order to propose an improvement: to balance the occupancy levels of the health unit's staff and, at the same time, to shorten the time patients spend in the system. Results showed a significant improvement in the performance of the polyclinic's system, as well as a subsequent improvement in the level of service provided to patients. Based on this study, one can note that simulation allows for evaluating scenarios and projecting changes that will impact the behavior of a given system without physical changes, thus providing a scientific basis for management decisions and allowing for improvements.
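The screening bottleneck can be illustrated with a minimal single-server queue simulation (a generic M/M/1 sketch in Python, not the Arena model; the rates are invented): mean waiting time grows sharply as the screening desk's utilisation approaches 1.

```python
import random

def simulate_screening(n_patients, arrival_rate, service_rate, seed=42):
    """Minimal discrete-event simulation of a single screening desk
    (M/M/1 FIFO queue); returns the mean patient waiting time."""
    rng = random.Random(seed)
    t, free_at, waits = 0.0, 0.0, []
    for _ in range(n_patients):
        t += rng.expovariate(arrival_rate)   # next patient arrival
        start = max(t, free_at)              # wait if the desk is busy
        waits.append(start - t)
        free_at = start + rng.expovariate(service_rate)
    return sum(waits) / len(waits)

# Overloaded desk (utilisation 0.9) vs. a relieved one (utilisation 0.6)
w_busy = simulate_screening(50_000, arrival_rate=0.9, service_rate=1.0)
w_calm = simulate_screening(50_000, arrival_rate=0.6, service_rate=1.0)
assert w_busy > w_calm   # rebalancing the screening load shortens waits
```

This is the kind of what-if comparison the study performs in Arena: the model predicts the effect of rebalancing staff occupancy on waiting times before any physical change is made.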

  20. PhySortR: a fast, flexible tool for sorting phylogenetic trees in R.

    Stephens, Timothy G; Bhattacharya, Debashish; Ragan, Mark A; Chan, Cheong Xin

    2016-01-01

    A frequent bottleneck in interpreting phylogenomic output is the need to screen often thousands of trees for features of interest, particularly robust clades of specific taxa, as evidence of monophyletic relationship and/or reticulated evolution. Here we present PhySortR, a fast, flexible R package for classifying phylogenetic trees. Unlike existing utilities, PhySortR allows for identification of both exclusive and non-exclusive clades uniting the target taxa based on tip labels (i.e., leaves) on a tree, with customisable options to assess clades within the context of the whole tree. Using simulated and empirical datasets, we demonstrate the potential and scalability of PhySortR in analysis of thousands of phylogenetic trees without a priori assumption of tree-rooting, and in yielding readily interpretable trees that unambiguously satisfy the query. PhySortR is a command-line tool that is freely available and easily automatable.
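The core operation — deciding from tip labels whether the target taxa form an exclusive clade — can be sketched in a few lines. PhySortR itself is an R package operating on Newick trees, with support for non-exclusive clades and no a priori rooting; the rooted, nested-tuple representation here is a simplification for illustration:

```python
def tips(tree):
    """Collect tip labels from a tree given as nested tuples (internal
    nodes) with strings as leaves."""
    if isinstance(tree, str):
        return {tree}
    out = set()
    for child in tree:
        out |= tips(child)
    return out

def has_exclusive_clade(tree, targets):
    """True if some subtree contains all target taxa and nothing else
    (an 'exclusive' clade in PhySortR's terminology). This checks rooted
    subtrees only; handling unrooted trees would need re-rooting logic."""
    targets = set(targets)
    if isinstance(tree, str):
        return {tree} == targets
    if tips(tree) == targets:
        return True
    return any(has_exclusive_clade(child, targets) for child in tree)
```

For example, in the tree `(("A","B"), (("C","D"),"E"))` the taxa `{"C","D"}` form an exclusive clade, while `{"A","C"}` do not.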

  1. A Simulation Tool for tccp Programs

    María-del-Mar Gallardo

    2017-01-01

    Full Text Available The Timed Concurrent Constraint Language tccp is a declarative synchronous concurrent language, particularly suitable for modelling reactive systems. In tccp, agents communicate and synchronise through a global constraint store. It supports a notion of discrete time that allows all non-blocked agents to proceed with their execution simultaneously. In this paper, we present a modular architecture for the simulation of tccp programs. The tool comprises three main components. The first is a set of basic abstract instructions able to model the tccp agent behaviour, together with the memory model needed to manage the active agents and the state of the store during execution. The second is the agent interpreter, which executes the instructions of the current agent iteratively and calculates the new agents to be executed at the next time instant. The third comprises the constraint solver components, the modules that deal with constraints. In this paper, we describe the implementation of these components and present an example of a real system modelled in tccp.

  2. Simulating the temperature noise in fast reactor fuel assemblies

    Kebadze, B.V.; Pykhtina, T.V.; Tarasko, M.Z.

    1987-01-01

    Characteristics of temperature noise for various modes of coolant flow in fast reactor fuel assemblies (FA) and for different sensor installation points are investigated. A stationary mode of coolant flow and a mode with partial blockage of the FA cross section, resulting in a local temperature increase and sodium boiling, are considered. Numerical simulation makes it possible to evaluate the time characteristics of the temperature noise and to formulate requirements for the dynamic characteristics of the sensors, and also to clarify the dependence of the coolant distribution parameters on the sensor location and on the peculiarities of the stationary temperature profile.

  3. Are Live Ultrasound Models Replaceable? Traditional vs. Simulated Education Module for FAST

    Suzanne Bentley

    2015-10-01

    Full Text Available Introduction: The focused assessment with sonography for trauma (FAST) is a commonly used and life-saving tool in the initial assessment of trauma patients. The recommended emergency medicine (EM) curriculum includes ultrasound, and studies show the additional utility of ultrasound training for medical students. EM clerkships vary and often do not contain formal ultrasound instruction. Time constraints for facilitating lectures and hands-on learning of ultrasound are challenging. Limitations on didactics call for the development and inclusion of novel educational strategies, such as simulation. The objective of this study was to compare the test and survey results and the ultrasound performance of medical students trained on an ultrasound simulator versus those trained via the traditional, hands-on patient format. Methods: This was a prospective, blinded, controlled educational study focused on EM clerkship medical students. After all received a standardized lecture with pictorial demonstration of image acquisition, students were randomized into two groups: a control group receiving the traditional training method via practice on a human model and an intervention group training via practice on an ultrasound simulator. Participants were tested and surveyed on indications and interpretation of FAST and on training and confidence with image interpretation and acquisition before and after this educational activity. Evaluation of FAST skills was performed on a human model to emulate patient care, and practical skills were scored via an objective structured clinical examination (OSCE) with a critical action checklist. Results: There was no significant difference between the control group (N=54) and the intervention group (N=39) on pretest scores, prior ultrasound training/education, or ultrasound comfort level in general or on FAST. All students (N=93) showed significant improvement from pre- to post-test scores and significant improvement in comfort level using ultrasound in general and on FAST.

  4. A Fast Monte Carlo Simulation for the International Linear Collider Detector

    Furse, D.

    2005-01-01

    The following paper contains details concerning the motivation for, implementation and performance of a Java-based fast Monte Carlo simulation for a detector designed to be used in the International Linear Collider. This simulation, presently included in the SLAC ILC group's org.lcsim package, reads in standard model or SUSY events in STDHEP file format, stochastically simulates the blurring in physics measurements caused by intrinsic detector error, and writes out an LCIO format file containing a set of final particles statistically similar to those that would have been found by a full Monte Carlo simulation. In addition to the reconstructed particles themselves, descriptions of the calorimeter hit clusters and tracks that these particles would have produced are also included in the LCIO output. These output files can then be put through various analysis codes in order to characterize the effectiveness of a hypothetical detector at extracting relevant physical information about an event. Such a tool is extremely useful in preliminary detector research and development, as full simulations are extremely cumbersome and taxing on processor resources; a fast, efficient Monte Carlo can facilitate, and even make possible, detector physics studies that would be very impractical with the full simulation, by sacrificing what is in many cases inappropriate attention to detail for valuable gains in the time required for results.

  5. CALIBRATED ULTRA FAST IMAGE SIMULATIONS FOR THE DARK ENERGY SURVEY

    Bruderer, Claudio; Chang, Chihway; Refregier, Alexandre; Amara, Adam; Bergé, Joel; Gamper, Lukas, E-mail: claudio.bruderer@phys.ethz.ch [Institute for Astronomy, Department of Physics, ETH Zurich, Wolfgang-Pauli-Strasse 27, 8093 Zürich (Switzerland)

    2016-01-20

    Image simulations are becoming increasingly important in understanding the measurement process of the shapes of galaxies for weak lensing and the associated systematic effects. For this purpose we present the first implementation of the Monte Carlo Control Loops (MCCL), a coherent framework for studying systematic effects in weak lensing. It allows us to model and calibrate the shear measurement process using image simulations from the Ultra Fast Image Generator (UFig) and the image analysis software SExtractor. We apply this framework to a subset of the data taken during the Science Verification period (SV) of the Dark Energy Survey (DES). We calibrate the UFig simulations to be statistically consistent with one of the SV images, which covers ∼0.5 square degrees. We then perform tolerance analyses by perturbing six simulation parameters and study their impact on the shear measurement at the one-point level. This allows us to determine the relative importance of different parameters. For spatially constant systematic errors and point-spread function, the calibration of the simulation reaches the weak lensing precision needed for the DES SV survey area. Furthermore, we find a sensitivity of the shear measurement to the intrinsic ellipticity distribution, and an interplay between the magnitude-size and the pixel value diagnostics in constraining the noise model. This work is the first application of the MCCL framework to data and shows how it can be used to methodically study the impact of systematics on the cosmic shear measurement.

  6. An Interactive Simulation Tool for Production Planning in Bacon Factories

    Nielsen, Jens Frederik Dalsgaard; Nielsen, Kirsten Mølgaard

    1994-01-01

    The paper describes an interactive simulation tool for production planning in bacon factories. The main aim of the tool is to make it possible to combine the production plans of all parts of the factory...

  7. SOAP. A tool for the fast computation of photometry and radial velocity induced by stellar spots

    Boisse, I.; Bonfils, X.; Santos, N. C.

    2012-09-01

    We define and put at the disposal of the community SOAP, Spot Oscillation And Planet, a software tool that simulates the effect of stellar spots and plages on radial velocimetry and photometry. This paper describes the tool release and provides instructions for its use. We present detailed tests with previous computations and real data to assess the code's performance and to validate its suitability. We characterize the variations of the radial velocity, line bisector, and photometric amplitude as a function of the main variables: projected stellar rotational velocity, filling factor of the spot, resolution of the spectrograph, linear limb-darkening coefficient, latitude of the spot, and inclination of the star. Finally, we model the spot distributions on the active stars HD 166435, TW Hya and HD 189733, which reproduce the observations. We show that the software is remarkably fast, allowing several extensions of its capabilities to address the next challenges in the exoplanetary field connected with stellar variability. The tool is available at http://www.astro.up.pt/soap

  8. Formula student suspension setup and laptime simulation tool

    van den Heuvel, E.; Besselink, I.J.M.; Nijmeijer, H.

    2013-01-01

    In motorsports, time is usually limited. With the use of dedicated tools for measuring wheel alignment, camber, ride heights, etc., setting up the car can be done quickly and consistently. With the setup sequence and tools described in this report, progress has been made in reducing the time it takes to set up the car.

  9. Fast stochastic algorithm for simulating evolutionary population dynamics

    Tsimring, Lev; Hasty, Jeff; Mather, William

    2012-02-01

    Evolution and co-evolution of ecological communities are stochastic processes often characterized by vastly different rates of reproduction and mutation and a coexistence of very large and very small sub-populations of co-evolving species. This creates serious difficulties for accurate statistical modeling of evolutionary dynamics. In this talk, we introduce a new exact algorithm for fast fully stochastic simulations of birth/death/mutation processes. It produces a significant speedup compared to the direct stochastic simulation algorithm in a typical case when the total population size is large and the mutation rates are much smaller than birth/death rates. We illustrate the performance of the algorithm on several representative examples: evolution on a smooth fitness landscape, NK model, and stochastic predator-prey system.
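For reference, the direct stochastic simulation algorithm that the talk's method accelerates can be sketched for a simple birth/death process (the function and parameter names below are illustrative; the talk's accelerated variant additionally handles mutations and disparate rate scales):

```python
import random

def gillespie_birth_death(n0, birth, death, t_max, seed=1):
    """Direct (Gillespie) stochastic simulation of a birth/death process.
    Each individual gives birth at rate `birth` and dies at rate `death`;
    returns the trajectory as a list of (time, population) tuples."""
    rng = random.Random(seed)
    t, n = 0.0, n0
    traj = [(0.0, n0)]
    while t < t_max and n > 0:
        total = n * (birth + death)       # total event rate in the whole population
        t += rng.expovariate(total)       # exponential waiting time to the next event
        if rng.random() < birth / (birth + death):
            n += 1                        # a birth occurred
        else:
            n -= 1                        # a death occurred
        traj.append((t, n))
    return traj
```

The cost per event of this direct method is what makes large populations expensive: every birth or death requires one random-number draw pair, which motivates the accelerated algorithm described in the abstract.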

  10. Benchmark exercise for fluid flow simulations in a liquid metal fast reactor fuel assembly

    Merzari, E., E-mail: emerzari@anl.gov [Mathematics and Computer Science Division, Argonne National Laboratory, 9700 S. Cass Avenue, Lemont, IL 60439 (United States); Fischer, P. [Mathematics and Computer Science Division, Argonne National Laboratory, 9700 S. Cass Avenue, Lemont, IL 60439 (United States); Yuan, H. [Nuclear Engineering Division, Argonne National Laboratory, Lemont, IL (United States); Van Tichelen, K.; Keijers, S. [SCK-CEN, Boeretang 200, Mol (Belgium); De Ridder, J.; Degroote, J.; Vierendeels, J. [Ghent University, Ghent (Belgium); Doolaard, H.; Gopala, V.R.; Roelofs, F. [NRG, Petten (Netherlands)

    2016-03-15

    Highlights: • A EURATOM-US I-NERI consortium has performed a benchmark exercise related to fast reactor assembly simulations. • LES calculations for a wire-wrapped rod bundle are compared with RANS calculations. • Results show good agreement for velocity and cross flows. - Abstract: As part of a U.S. Department of Energy International Nuclear Energy Research Initiative (I-NERI), Argonne National Laboratory (Argonne) is collaborating with the Dutch Nuclear Research and consultancy Group (NRG), the Belgian Nuclear Research Centre (SCK·CEN), and Ghent University (UGent) in Belgium to perform and compare a series of fuel-pin-bundle calculations representative of a fast reactor core. A wire-wrapped fuel bundle is a complex configuration for which little data is available for verification and validation of new simulation tools. UGent and NRG performed their simulations with commercially available computational fluid dynamics (CFD) codes. The high-fidelity Argonne large-eddy simulations were performed with Nek5000, used for CFD in the Simulation-based High-efficiency Advanced Reactor Prototyping (SHARP) suite. SHARP is a versatile tool that is being developed to model the core of a wide variety of reactor types under various scenarios. It is intended both to serve as a surrogate for physical experiments and to provide insight into experimental results. Comparison of the results obtained by the different participants with the reference Nek5000 results shows good agreement, especially for the cross-flow data. The comparison also helps highlight issues with current modeling approaches. The results of the study will be valuable in the design and licensing process of MYRRHA, a flexible fast research reactor under design at SCK·CEN that features wire-wrapped fuel bundles cooled by lead-bismuth eutectic.

  11. FastSim: A Fast Simulation for the SuperB Detector

    Andreassen, R; Sokoloff, M; Arnaud, N; Burmistrov, L; Brown, D N; Carlson, J; Gaponenko, I; Suzuki, A; Cheng, C-h; Simone, A Di; Manoni, E; Perez, A; Walsh, J; Rama, M; Roberts, D; Rotondo, M; Simi, G

    2011-01-01

    We have developed a parameterized (fast) simulation for detector optimization and physics reach studies of the proposed SuperB Flavor Factory in Italy. Detector components are modeled as thin sections of planes, cylinders, disks or cones. Particle-material interactions are modeled using simplified cross-sections and formulas. Active detectors are modeled using parameterized response functions. Geometry and response parameters are configured using xml files with a custom-designed schema. Reconstruction algorithms adapted from BaBar are used to build tracks and clusters. Multiple sources of background signals can be merged with primary signals. Pattern recognition errors are modeled statistically by randomly misassigning nearby tracking hits. Standard BaBar analysis tuples are used as an event output. Hadronic B meson pair events can be simulated at roughly 10Hz.

  13. Development of a fast running accident analysis computer program for use in a simulator

    Cacciabue, P.C.

    1985-01-01

    This paper describes how a reactor safety nuclear computer program can be modified and improved with the aim of obtaining a very fast running tool to be used as a physical model in a plant simulator, without penalizing the accuracy of the results. It also discusses some ideas on how the physical theoretical model can be combined with a driving statistical tool in the build-up of the entire software package to be implemented in the simulator for risk and reliability analysis. The approach to the problem, although applied to a specific computer program, can be considered quite general if an already existing and well-tested code is used for the purpose. The computer program considered is ALMOD, originally developed for the analysis of the thermohydraulic and neutronic behaviour of the reactor core, primary circuit and steam generator during operational and special transients. (author)

  14. Simplified simulation of an experimental fast reactor plant

    Fujii, Masaaki; Fujita, Minoru.

    1978-01-01

    The purposes of the simulation are to study the dynamic behavior of a liquid metal-cooled experimental fast breeder reactor plant and to design the control system of the reactor plant with the modified-RAPID (Reactor and Plant Integrated Dynamics) computer program. The plant model approximately follows the Japan Experimental Fast Reactor ''Joyo''. This computer program is designed for the calculation of steady-state and transient temperatures in an FBR plant, which is described by a model consisting of the core, upper and lower plenums, an intermediate heat exchanger, an air dump heat exchanger, primary, secondary and tertiary coolant systems and connecting pipes. The basic equations are solved numerically by finite difference approximation. The mathematical model for an experimental FBR plant is useful for the design of the control system of FBR plants. The results of numerical simulation showed that a proportional change in the flow rates of the primary and secondary coolant loops provides good performance in response to a stepped change in the power level. (J.P.N.)

  15. Fast multipurpose Monte Carlo simulation for proton therapy using multi- and many-core CPU architectures

    Souris, Kevin, E-mail: kevin.souris@uclouvain.be; Lee, John Aldo [Center for Molecular Imaging and Experimental Radiotherapy, Institut de Recherche Expérimentale et Clinique, Université catholique de Louvain, Avenue Hippocrate 54, 1200 Brussels, Belgium and ICTEAM Institute, Université catholique de Louvain, Louvain-la-Neuve 1348 (Belgium); Sterpin, Edmond [Center for Molecular Imaging and Experimental Radiotherapy, Institut de Recherche Expérimentale et Clinique, Université catholique de Louvain, Avenue Hippocrate 54, 1200 Brussels, Belgium and Department of Oncology, Katholieke Universiteit Leuven, O& N I Herestraat 49, 3000 Leuven (Belgium)

    2016-04-15

    Purpose: Accuracy in proton therapy treatment planning can be improved using Monte Carlo (MC) simulations. However, the long computation time of such methods hinders their use in clinical routine. This work aims to develop a fast multipurpose Monte Carlo simulation tool for proton therapy using massively parallel central processing unit (CPU) architectures. Methods: A new Monte Carlo, called MCsquare (many-core Monte Carlo), has been designed and optimized for the last generation of Intel Xeon processors and Intel Xeon Phi coprocessors. These massively parallel architectures offer the flexibility and the computational power suitable to MC methods. The class-II condensed history algorithm of MCsquare provides a fast and yet accurate method of simulating heavy charged particles such as protons, deuterons, and alphas inside voxelized geometries. Hard ionizations, with energy losses above a user-specified threshold, are simulated individually, while soft events are regrouped in a multiple scattering theory. Elastic and inelastic nuclear interactions are sampled from ICRU 63 differential cross sections, thereby allowing for the computation of prompt gamma emission profiles. MCsquare has been benchmarked with the GATE/GEANT4 Monte Carlo application for homogeneous and heterogeneous geometries. Results: Comparisons with GATE/GEANT4 for various geometries show deviations within 2%–1 mm. In spite of the limited memory bandwidth of the coprocessor, simulation time is below 25 s for 10⁷ primary 200 MeV protons in average soft tissues using all Xeon Phi and CPU resources embedded in a single desktop unit. Conclusions: MCsquare exploits the flexibility of CPU architectures to provide a multipurpose MC simulation tool. Optimized code enables the use of accurate MC calculation within a reasonable computation time, adequate for clinical practice. MCsquare also simulates prompt gamma emission and can thus also be used for in vivo range verification.

  18. Fast prototyping H.264 deblocking filter using ESL tools

    Damak, T.; Werda, I.; Masmoud, N.; Bilavarn, S.

    2011-01-01

    This paper presents a design methodology for hardware/software (HW/SW) architecture design using ESL (Electronic System Level) tools. From C++ descriptions, our design flow is able to generate hardware blocks running with a software part and all the code necessary to prototype the HW/SW system on Xilinx FPGAs. To this end, we use high-level synthesis tools (Catapult C Synthesis), logic synthesis and the Xilinx tools. As an application, we developed an optimized deblocking filter C code, designed to be used as part of a complete H.264 video coding system [1]. Based on this code, we explored many configurations of Catapult Synthesis to analyze different area/time tradeoffs. Results show execution speedups of 95.5% compared to pure software execution.

  19. Taxi Time Prediction at Charlotte Airport Using Fast-Time Simulation and Machine Learning Techniques

    Lee, Hanbong

    2016-01-01

    Accurate taxi time prediction is required for enabling efficient runway scheduling that can increase runway throughput and reduce taxi times and fuel consumption on the airport surface. Currently NASA and American Airlines are jointly developing a decision-support tool called Spot and Runway Departure Advisor (SARDA) that assists airport ramp controllers to make gate pushback decisions and improve the overall efficiency of airport surface traffic. In this presentation, we propose to use Linear Optimized Sequencing (LINOS), a discrete-event fast-time simulation tool, to predict taxi times and provide the estimates to the runway scheduler in real-time airport operations. To assess its prediction accuracy, we also introduce a data-driven analytical method using machine learning techniques. These two taxi time prediction methods are evaluated with actual taxi time data obtained from the SARDA human-in-the-loop (HITL) simulation for Charlotte Douglas International Airport (CLT) using various performance measurement metrics. Based on the taxi time prediction results, we also discuss how the prediction accuracy can be affected by the operational complexity at this airport and how we can improve the fast-time simulation model before implementing it with an airport scheduling algorithm in a real-time environment.
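A data-driven taxi-time predictor of the kind mentioned in this record can be as simple as a least-squares fit of taxi time against surface-traffic features. The feature set, coefficients, and data below are synthetic illustrations, not SARDA's or the study's actual model:

```python
import numpy as np

def fit_taxi_time(features, taxi_times):
    """Fit a linear model taxi_time ~ X.w + b by ordinary least squares."""
    X = np.column_stack([features, np.ones(len(features))])  # append intercept column
    coef, *_ = np.linalg.lstsq(X, taxi_times, rcond=None)
    return coef  # [w_1, ..., w_d, b]

def predict_taxi_time(coef, features):
    X = np.column_stack([features, np.ones(len(features))])
    return X @ coef

# Synthetic example: taxi time grows with taxi distance and departure queue length.
rng = np.random.default_rng(0)
dist = rng.uniform(1.0, 5.0, 200)        # taxi distance (km), hypothetical
queue = rng.integers(0, 10, 200)         # aircraft ahead in queue, hypothetical
times = 2.0 * dist + 1.5 * queue + 3.0   # minutes, noise-free for clarity
coef = fit_taxi_time(np.column_stack([dist, queue]), times)
```

On noise-free data the fit recovers the generating coefficients exactly; real surface data would of course be noisy, which is why the study compares such regressors against fast-time simulation using accuracy metrics.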

  20. Vehicle Technology Simulation and Analysis Tools | Transportation Research

    Analysis Tools NREL developed the following modeling, simulation, and analysis tools to investigate novel design goals (e.g., fuel economy versus performance) to find cost-competitive solutions. ADOPT Vehicle Simulator to analyze the performance and fuel economy of conventional and advanced light- and

  1. Dynamic wind turbine models in power system simulation tool

    Hansen, A.; Jauch, Clemens; Soerensen, P.

    The present report describes the dynamic wind turbine models implemented in the power system simulation tool DIgSILENT. The developed models are a part of the results of a national research project, whose overall objective is to create a model database in different simulation tools. The report...

  2. Hybrid fast Hankel transform implementation for optics simulation

    Davis, Paul K.

    2013-09-01

    The most compute intensive part of a full optics simulation, especially including diffraction effects, is the Fourier transform between pupil and image spaces. This is typically performed as a two dimensional fast discrete transform. For a nearly radially symmetric system there are advantages to using polar coordinates, in which case the radial transform becomes a Hankel transform, using Bessel functions instead of circular functions. However, there are special difficulties in calculating and handling Bessel functions. Several solutions have been proposed. We present a hybrid Hankel transform which divides the domain, calculating a portion using Bessel function approximations but converting most of the domain into a one dimensional Fourier transform which can be handled by standard methods.
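As a baseline for what such schemes accelerate, the order-zero Hankel transform can be computed by brute-force quadrature and checked against the known Gaussian self-transform pair. This direct method (not the paper's hybrid scheme; grid sizes and function names are illustrative) shows where the Bessel-function cost comes from:

```python
import numpy as np

def trapezoid(y, x):
    """Trapezoid rule along the last axis (avoids version-specific numpy names)."""
    dx = np.diff(x)
    return np.sum((y[..., 1:] + y[..., :-1]) * dx / 2.0, axis=-1)

def bessel_j0(x, n_theta=400):
    """J0 from its integral representation J0(x) = (1/pi) * int_0^pi cos(x sin t) dt.
    The integrand is smooth and periodic, so the trapezoid rule converges rapidly."""
    t = np.linspace(0.0, np.pi, n_theta)
    return trapezoid(np.cos(x[..., None] * np.sin(t)), t) / np.pi

def hankel0(f_vals, r, k):
    """Order-0 Hankel transform F(k) = int_0^inf f(r) J0(kr) r dr,
    truncated to the sampled r grid (valid when f decays fast enough)."""
    kernel = bessel_j0(k[:, None] * r[None, :])   # shape (len(k), len(r))
    return trapezoid(f_vals * r * kernel, r)

# Check against the analytic pair: exp(-r^2/2) transforms to exp(-k^2/2).
r = np.linspace(0.0, 12.0, 3000)
k = np.array([0.0, 0.5, 1.0, 2.0])
F = hankel0(np.exp(-r**2 / 2.0), r, k)
```

Every (k, r) pair needs a Bessel evaluation, which is exactly the cost the hybrid method avoids on most of the domain by converting it into a one-dimensional Fourier transform.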

  3. SIMULATION AS A TOOL FOR PROCESS OPTIMIZATION OF LOGISTIC SYSTEMS

    Radko Popovič

    2015-09-01

    Full Text Available The paper deals with the simulation of production processes using the Siemens Tecnomatix software, in particular its Process Simulate module. Tecnomatix Process Simulate is designed for building new or modifying existing production processes. A simulation created in this software allows fast testing of planned changes or improvements to the production processes. On the basis of the simulation, one can form a picture of the future behaviour of the real production system. A 3D simulation can reflect the actual status and conditions of the running system and, after some improvements, can show the possible future configuration of the production system.

  4. Fast Simulation of Large-Scale Floods Based on GPU Parallel Computing

    Qiang Liu

    2018-05-01

    Full Text Available Computing speed is a significant issue in large-scale flood simulations for real-time response to disaster prevention and mitigation. Even today, most large-scale flood simulations are generally run on supercomputers due to the massive amounts of data and computations necessary. In this work, a two-dimensional shallow water model based on an unstructured Godunov-type finite volume scheme was proposed for flood simulation. To realize a fast simulation of large-scale floods on a personal computer, a Graphics Processing Unit (GPU)-based high-performance computing method using the OpenACC application was adopted to parallelize the shallow water model. An unstructured data management method was presented to control the data transfer between the GPU and CPU (Central Processing Unit) with minimum overhead, and then both computation and data were offloaded from the CPU to the GPU, which exploited the computational capability of the GPU as much as possible. The parallel model was validated using various benchmarks and real-world case studies. The results demonstrate that speed-ups of up to one order of magnitude can be achieved in comparison with the serial model. The proposed parallel model provides a fast and reliable tool with which to quickly assess flood hazards in large-scale areas and, thus, has a bright application prospect for dynamic inundation risk identification and disaster assessment.

  5. ASSIST: a fast versatile local structural comparison tool.

    Caprari, Silvia; Toti, Daniele; Viet Hung, Le; Di Stefano, Maurizio; Polticelli, Fabio

    2014-04-01

    Structural genomics initiatives are increasingly leading to the determination of the 3D structure of target proteins whose catalytic function is not known. The aim of this work was to develop a novel, versatile tool for searching for structural similarity, which makes it possible to predict the catalytic function, if any, of these proteins. The algorithm implemented by the tool is based on local structural comparison to find the largest subset of similar residues between an input protein and known functional sites. The method uses a geometric hashing approach in which information related to residue pairs from the input structures is stored in a hash table and then quickly retrieved during the comparison step. Tests on proteins belonging to different functional classes, done using the Catalytic Site Atlas entries as targets, indicate that the algorithm is able to identify the correct functional class of the input protein in the vast majority of cases. The application was developed in Java SE 6, with a Java Swing Graphical User Interface (GUI). The system can be run locally on any operating system (OS) equipped with a suitable Java Virtual Machine, and is available at the following URL: http://www.computationalbiology.it/software/ASSISTv1.zip.
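
The geometric hashing idea can be sketched in a few lines. This toy version is hypothetical: it keys on quantized inter-residue distances only, whereas the actual ASSIST keys encode richer residue-pair information; what it does show is the preprocess/query split that makes the comparison step fast.

```python
from collections import defaultdict
from itertools import combinations

def pair_key(p, q, bin_size=1.0):
    """Quantized inter-residue distance used as the hash key (sketch only)."""
    d = sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5
    return round(d / bin_size)

def build_table(site):
    """Preprocessing: hash every residue pair of a known functional site."""
    table = defaultdict(list)
    for (i, p), (j, q) in combinations(enumerate(site), 2):
        table[pair_key(p, q)].append((i, j))
    return table

def score(table, query):
    """Comparison step: count query pairs that hit stored site pairs."""
    hits = 0
    for p, q in combinations(query, 2):
        if table[pair_key(p, q)]:
            hits += 1
    return hits

site = [(0, 0, 0), (3, 0, 0), (0, 4, 0)]      # hypothetical residue coordinates
table = build_table(site)
print(score(table, site))                     # → 3 (the site matches itself)
```

The expensive pairwise geometry is computed once per known site at table-build time; each query then costs only hash lookups.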

  6. Hygrothermal Simulation: A Tool for Building Envelope Design Analysis

    Samuel V. Glass; Anton TenWolde; Samuel L. Zelinka

    2013-01-01

    Is it possible to gauge the risk of moisture problems while designing the building envelope? This article provides a brief introduction to computer-based hygrothermal (heat and moisture) simulation, shows how simulation can be useful as a design tool, and points out a number of important considerations regarding model inputs and limitations. Hygrothermal simulation...

  7. INDOOR AIR QUALITY AND INHALATION EXPOSURE - SIMULATION TOOL KIT

    A Microsoft Windows-based indoor air quality (IAQ) simulation software package is presented. Named Simulation Tool Kit for Indoor Air Quality and Inhalation Exposure, or IAQX for short, this package complements and supplements existing IAQ simulation programs and is desi...

  8. Induction generator models in dynamic simulation tools

    Knudsen, Hans; Akhmatov, Vladislav

    1999-01-01

    For AC networks with a large amount of induction generators (windmills), the paper demonstrates a significant discrepancy in the simulated voltage recovery after faults in weak networks when comparing dynamic and transient stability descriptions, and the reasons for the discrepancies are explained...

  9. HAM-Tools – a whole building simulation tool in Annex 41

    Kalagasidis, Angela Sasic; Rode, Carsten; Woloszyn, Monika

    2008-01-01

    HAM-Tools is a building simulation software. The main task of this tool is to simulate transfer processes related to building physics, i.e. heat, air and moisture transport in buildings and building components in operating conditions. The scope of the ECBCS Annex 41 “Whole Building Heat, Air...

  10. Implementation of interconnect simulation tools in spice

    Satsangi, H.; Schutt-Aine, J. E.

    1993-01-01

    Accurate computer simulation of high speed digital computer circuits and communication circuits requires a multimode approach to simulate both the devices and the interconnects between devices. Classical circuit analysis algorithms (lumped parameter) are needed for circuit devices and the network formed by the interconnected devices. The interconnects, however, have to be modeled as transmission lines which incorporate electromagnetic field analysis. An approach to writing a multimode simulator is to take an existing software package which performs either lumped parameter analysis or field analysis and add the missing type of analysis routines to the package. In this work a traditionally lumped parameter simulator, SPICE, is modified so that it will perform lossy transmission line analysis using a different model approach. Modifying SPICE3E2 or any other large software package is not a trivial task. An understanding of the programming conventions used, simulation software, and simulation algorithms is required. This thesis was written to clarify the procedure for installing a device into SPICE3E2. The installation of three devices is documented and the installations of the first two provide a foundation for installation of the lossy line which is the third device. The details of discussions are specific to SPICE, but the concepts will be helpful when performing installations into other circuit analysis packages.

  11. Fast simulation of the forward tracking detector of HPLUS

    Zhang Yapeng; Fan Ruirui; Fu Fen; Yue Ke; Yuan Xiaohua; Xu Huagen; Chinese Academy of Sciences, Beijing; Yao Nan; Xu Hushan; Jin Genming; Liang Jinjie; Chen Ruofu; Sun Zhiyu; Duan Limin; Xiao Zhigang; Tsinghua Univ., Beijing

    2008-01-01

    The necessity of installing a forward tracking detector (FTD) stack in the Hadron Physics Lanzhou Spectrometer (HPLUS) is discussed. A local tracker is developed to solve the multi-track finding problem: track candidates are searched for iteratively via the Hough Transform, and fake tracks are removed by a least-squares fitting process. With this tracker we have studied the feasibility of pp→pp+φ(→K+K-), a typical physics channel proposed for HPLUS. The single-track momentum resolution due to the positioning uncertainty in the FTD is 1.3%. The multiple scattering effect contributes about 20% to the momentum resolution in the FTD coverage. The width and the signal-to-background ratio of the reconstructed φ are 1.51 MeV and 4.36, respectively, taking into account the direct kaon channel pp→pp+K+K- as background. The geometric coverage of the FTD for φ events is about 85.4%. Based on the current fast simulation and estimation, the geometrical configuration of the FTD meets the physics requirements of HPLUS under the current luminosity and multiplicity conditions. The tracker is applicable in the full simulation coming next and is extendable to other tracking components of HPLUS. (authors)
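
A minimal version of Hough-transform track finding accumulates one sinusoid per hit and reads off the accumulator peak. This is illustrative only: the binning, parameters and the least-squares refit stage of the HPLUS tracker are not reproduced here.

```python
import numpy as np

def hough_lines(points, n_theta=180, n_rho=100, rho_max=50.0):
    """Minimal Hough transform for straight-track finding (sketch only)."""
    thetas = np.linspace(0, np.pi, n_theta, endpoint=False)
    acc = np.zeros((n_theta, n_rho), dtype=int)
    for x, y in points:
        # each hit votes along its sinusoid rho(theta) = x cos(t) + y sin(t)
        rho = x * np.cos(thetas) + y * np.sin(thetas)
        idx = np.round((rho + rho_max) / (2 * rho_max) * (n_rho - 1)).astype(int)
        ok = (idx >= 0) & (idx < n_rho)
        acc[np.arange(n_theta)[ok], idx[ok]] += 1
    t, r = np.unravel_index(acc.argmax(), acc.shape)   # best (theta, rho) cell
    return thetas[t], -rho_max + 2 * rho_max * r / (n_rho - 1), acc.max()

# ten hits on the line y = x, plus one noise hit
pts = [(float(i), float(i)) for i in range(10)] + [(2.0, 9.0)]
theta, rho, votes = hough_lines(pts)        # peak collects all ten track hits
```

Iterating (find peak, remove its hits, repeat) yields the multi-track finding described in the abstract; the fake-track rejection then refits each candidate by least squares.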

  12. Simulation tools for robotics research and assessment

    Fields, MaryAnne; Brewer, Ralph; Edge, Harris L.; Pusey, Jason L.; Weller, Ed; Patel, Dilip G.; DiBerardino, Charles A.

    2016-05-01

    The Robotics Collaborative Technology Alliance (RCTA) program focuses on four overlapping technology areas: Perception, Intelligence, Human-Robot Interaction (HRI), and Dexterous Manipulation and Unique Mobility (DMUM). In addition, the RCTA program has a requirement to assess progress of this research in standalone as well as integrated form. Since the research is evolving and the robotic platforms with unique mobility and dexterous manipulation are in the early development stage and very expensive, an alternate approach is needed for efficient assessment. Simulation of robotic systems, platforms, sensors, and algorithms, is an attractive alternative to expensive field-based testing. Simulation can provide insight during development and debugging unavailable by many other means. This paper explores the maturity of robotic simulation systems for applications to real-world problems in robotic systems research. Open source (such as Gazebo and Moby), commercial (Simulink, Actin, LMS), government (ANVEL/VANE), and the RCTA-developed RIVET simulation environments are examined with respect to their application in the robotic research domains of Perception, Intelligence, HRI, and DMUM. Tradeoffs for applications to representative problems from each domain are presented, along with known deficiencies and disadvantages. In particular, no single robotic simulation environment adequately covers the needs of the robotic researcher in all of the domains. Simulation for DMUM poses unique constraints on the development of physics-based computational models of the robot, the environment and objects within the environment, and the interactions between them. Most current robot simulations focus on quasi-static systems, but dynamic robotic motion places an increased emphasis on the accuracy of the computational models. In order to understand the interaction of dynamic multi-body systems, such as limbed robots, with the environment, it may be necessary to build component

  13. Fast in-situ tool inspection based on inverse fringe projection and compact sensor heads

    Matthias, Steffen; Kästner, Markus; Reithmeier, Eduard

    2016-11-01

    Inspection of machine elements is an important task in production processes in order to ensure the quality of produced parts and to gather feedback for the continuous improvement process. A new measuring system is presented, which is capable of performing the inspection of critical tool geometries, such as gearing elements, inside the forming machine. To meet the constraints on sensor head size and inspection time imposed by the limited space inside the machine and the cycle time of the process, the measuring device employs a combination of endoscopy techniques with the fringe projection principle. Compact gradient index lenses enable a compact design of the sensor head, which is connected to a CMOS camera and a flexible micro-mirror based projector via flexible fiber bundles. Using common fringe projection patterns, the system achieves measuring times of less than five seconds. To further reduce the time required for inspection, the generation of inverse fringe projection patterns has been implemented for the system. Inverse fringe projection speeds up the inspection process by employing object-adapted patterns, which enable the detection of geometry deviations in a single image. Two different approaches to generate object adapted patterns are presented. The first approach uses a reference measurement of a manufactured tool master to generate the inverse pattern. The second approach is based on a virtual master geometry in the form of a CAD file and a ray-tracing model of the measuring system. Virtual modeling of the measuring device and inspection setup allows for geometric tolerancing for free-form surfaces by the tool designer in the CAD-file. A new approach is presented, which uses virtual tolerance specifications and additional simulation steps to enable fast checking of metric tolerances. Following the description of the pattern generation process, the image processing steps required for inspection are demonstrated on captures of gearing geometries.

  14. Fast Multiscale Reservoir Simulations using POD-DEIM Model Reduction

    Ghasemi, Mohammadreza

    2015-02-23

    In this paper, we present a global-local model reduction for fast multiscale reservoir simulations in highly heterogeneous porous media, with applications to optimization and history matching. Our proposed approach identifies a low-dimensional structure of the solution space. We introduce an auxiliary variable (the velocity field) in our model reduction that allows us to achieve a high degree of model reduction. The latter is due to the fact that the velocity field is conservative for any low-order reduced model in our framework, whereas a typical global model reduction based on POD is a Galerkin finite element method and thus cannot guarantee local mass conservation; this can be observed in numerical simulations that use finite volume based approaches. The Discrete Empirical Interpolation Method (DEIM) is used to approximate the nonlinear functions of fine-grid quantities in Newton iterations, which makes the computational cost independent of the fine-grid dimension. POD snapshots are inexpensively computed using local model reduction techniques based on the Generalized Multiscale Finite Element Method (GMsFEM), which provides (1) a hierarchical approximation of snapshot vectors, (2) adaptive computations using coarse grids, and (3) inexpensive global POD operations in a small-dimensional space on a coarse grid. By balancing the errors of the global and local reduced-order models, our new methodology can provide an error bound in simulations. Our numerical results, utilizing a two-phase immiscible flow, show a substantial speed-up, and we compare our results to the standard POD-DEIM in a finite volume setup.
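
The global POD step described above can be sketched as a generic snapshot-SVD reduction (this is not the paper's full GMsFEM/DEIM pipeline; the snapshot matrix below is synthetic):

```python
import numpy as np

def pod_basis(snapshots, energy=0.999):
    """POD basis from a snapshot matrix via the SVD, keeping enough modes
    to capture `energy` of the snapshot variance (global-reduction sketch)."""
    U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
    cum = np.cumsum(s**2) / np.sum(s**2)
    r = int(np.searchsorted(cum, energy)) + 1   # smallest r reaching the target
    return U[:, :r]                             # n x r reduced basis

# synthetic rank-3 snapshot matrix: three "modes" with amplitudes 10, 5, 2
j = np.arange(40)
S = np.zeros((200, 40))
for k, amp in enumerate((10.0, 5.0, 2.0)):
    S[k] = amp * np.cos((k + 1) * j)

Phi = pod_basis(S)                              # recovers a 3-mode basis
# relative error of projecting the snapshots onto the reduced basis
err = np.linalg.norm(S - Phi @ (Phi.T @ S)) / np.linalg.norm(S)
```

Reduced-order solves then work with `r` unknowns instead of the fine-grid dimension `n`; DEIM plays the analogous role for the nonlinear terms, sampling them at a few interpolation points.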

  15. Artist - analytical RT inspection simulation tool

    Bellon, C.; Jaenisch, G.R.

    2007-01-01

    The computer simulation of radiography is applicable for different purposes in NDT such as for the qualification of NDT systems, the prediction of its reliability, the optimization of system parameters, feasibility analysis, model-based data interpretation, education and training of NDT/NDE personnel, and others. Within the framework of the integrated project FilmFree the radiographic testing (RT) simulation software developed by BAM is being further developed to meet practical requirements for inspection planning in digital industrial radiology. It combines analytical modelling of the RT inspection process with the CAD-orientated object description applicable to various industrial sectors such as power generation, railways and others. (authors)

  16. Web Exploration Tools for a Fast Federated Optical Survey Database

    Humphreys, Roberta M.

    2000-01-01

    We implemented several new web-based tools to improve the efficiency and versatility of access to the APS Catalog of POSS I (the Palomar Observatory-National Geographic Sky Survey) and its associated image database. The most important addition was a federated database system (FDBS) to link the APS Catalog and the image database into one Internet-accessible database. With the FDBS, queries and transactions on the integrated database are performed as if it were a single database. We installed Myriad, the FDBS developed by Professor Jaideep Srivastava and members of his group in the University of Minnesota Computer Science Department. It is the first system to provide schema integration, query processing and optimization, and transaction management capabilities in a single framework. The attached figure illustrates the Myriad architecture. The FDBS permits horizontal access to the data, not just vertical: for example, APS queries can be made not only by sky position but also by any parameter present in either of the databases. APS users will be able to produce an image of all the blue galaxies and stellar sources for comparison with X-ray source error ellipses from AXAF (the Advanced X-ray Astrophysics Facility, now Chandra), for example. The FDBS is now available as a beta release with the appropriate query forms at our web site. While much of our time was occupied with adapting Myriad to the APS environment, we also made major changes to StarBase, our DBMS for the Catalog, at the web interface to improve its efficiency in issuing and processing queries. StarBase is now three times faster for large queries. Improvements were also made at the web end of the image database for faster access, although work still needs to be done on the image database itself for more efficient return with the FDBS. During the past few years, we made several improvements to the database pipeline that creates the individual plate databases queried by StarBase. The changes include improved positions

  17. Fast 2D hybrid fluid-analytical simulation of inductive/capacitive discharges

    Kawamura, E; Lieberman, M A; Graves, D B

    2011-01-01

    A fast two-dimensional (2D) hybrid fluid-analytical transformer coupled plasma reactor model was developed using the finite element simulation tool COMSOL. Both inductive and capacitive coupling of the source coils to the plasma are included in the model, as well as a capacitive bias option for the wafer electrode. A bulk fluid plasma model, which solves the time-dependent plasma fluid equations for the ion continuity and electron energy balance, is coupled with an analytical sheath model. The vacuum sheath of variable thickness is modeled with a fixed-width sheath of variable dielectric constant. The sheath heating is treated as an incoming heat flux at the plasma-sheath boundary, and a dissipative term is added to the sheath dielectric constant. A gas flow model solves for the steady-state pressure, temperature and velocity of the neutrals. The simulation results, over a range of input powers, are in good agreement with a chlorine reactor experimental study.

  18. A Simulation Tool for Ultrasonic Inspection

    Krishnamurthy, Adarsh; Mohan, K. V.; Karthikeyan, Soumya; Krishnamurthy, C. V.; Balasubramaniam, Krishnan [Indian Institute of Technology, Tamil Nadu (India)

    2006-06-15

    A simulation program, SIMULTSONIC, is under development at CNDE to help determine and/or optimize ultrasonic probe locations for the inspection of complex components. SIMULTSONIC provides a ray-trace based assessment for immersion and contact modes of inspection. The code, written in Visual C++ for the Microsoft Windows environment, provides an interactive user interface. In this paper, a description of the various features of SIMULTSONIC is given, followed by examples illustrating the capability of SIMULTSONIC to deal with the inspection of canonical objects such as pipes. In particular, the use of SIMULTSONIC in the inspection of very thin-walled pipes (with 450 μm wall thickness) is described. A ray-trace based assessment was done using SIMULTSONIC to determine the standoff distance and the angle of oblique incidence for an immersion-mode focused transducer. A 3-cycle Hanning window pulse was chosen for the simulations. Experiments were carried out to validate the simulations. The A-scans and the associated B-scan images obtained through simulations show good correlation with experimental results, both in the arrival time of the signal and in the signal amplitudes.

  19. Fast Dynamic Simulation-Based Small Signal Stability Assessment and Control

    Acharya, Naresh [General Electric Company, Fairfield, CT (United States); Baone, Chaitanya [General Electric Company, Fairfield, CT (United States); Veda, Santosh [General Electric Company, Fairfield, CT (United States); Dai, Jing [General Electric Company, Fairfield, CT (United States); Chaudhuri, Nilanjan [General Electric Company, Fairfield, CT (United States); Leonardi, Bruno [General Electric Company, Fairfield, CT (United States); Sanches-Gasca, Juan [General Electric Company, Fairfield, CT (United States); Diao, Ruisheng [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wu, Di [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Huang, Zhenyu [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Zhang, Yu [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Jin, Shuangshuang [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Zheng, Bin [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Chen, Yousu [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-12-31

    Power grid planning and operation decisions are made based on simulation of the dynamic behavior of the system. Enabling substantial energy savings while increasing the reliability of the aging North American power grid through improved utilization of existing transmission assets hinges on the adoption of wide-area measurement systems (WAMS) for power system stabilization. However, adoption of WAMS alone will not suffice if the power system is to reach its full entitlement in stability and reliability. It is necessary to enhance predictability with "faster than real-time" dynamic simulations that will enable computation of dynamic stability margins, proactive real-time control, and improved grid resiliency to fast time-scale phenomena such as cascading network failures. Present-day dynamic simulations are performed only during offline planning studies, considering only worst-case conditions such as summer peak and winter peak days. With widespread deployment of renewable generation, controllable loads, energy storage devices and plug-in hybrid electric vehicles expected in the near future, and greater integration of cyber infrastructure (communications, computation and control), monitoring and controlling the dynamic performance of the grid in real time will become increasingly important. State-of-the-art dynamic simulation tools have limited computational speed and are not suitable for real-time applications, given the large set of contingency conditions to be evaluated. These tools are optimized for best performance on single-processor computers, but the simulation is still several times slower than real time due to its computational complexity. With recent significant advances in numerical methods and computational hardware, expectations have been rising for more efficient and faster techniques to be implemented in power system simulators. This is a natural expectation, given that the core solution algorithms of most commercial simulators were developed

  20. A Coupling Tool for Parallel Molecular Dynamics-Continuum Simulations

    Neumann, Philipp; Tchipev, Nikola

    2012-01-01

    We present a tool for coupling Molecular Dynamics and continuum solvers. It is written in C++ and is meant to support the developers of hybrid molecular - continuum simulations in terms of both realisation of the respective coupling algorithm

  1. High Fidelity Regolith Simulation Tool for ISRU Applications, Phase I

    National Aeronautics and Space Administration — NASA has serious unmet needs for simulation tools capable of predicting the behavior of lunar regolith in proposed excavation, transport and handling systems....

  2. Unified Nonlinear Flight Dynamics and Aeroelastic Simulator Tool, Phase I

    National Aeronautics and Space Administration — ZONA Technology, Inc. (ZONA) proposes a R&D effort to develop a Unified Nonlinear Flight Dynamics and Aeroelastic Simulator (UNFDAS) Tool that will combine...

  3. CPN Tools for Editing, Simulating, and Analysing Coloured Petri Nets

    Ratzer, Anne Vinter; Wells, Lisa Marie; Lassen, Henry Michael

    2003-01-01

    CPN Tools is a tool for editing, simulating and analysing Coloured Petri Nets. The GUI is based on advanced interaction techniques, such as toolglasses, marking menus, and bi-manual interaction. Feedback facilities provide contextual error messages and indicate dependency relationships between net... information such as boundedness properties and liveness properties. The functionality of the simulation engine and state space facilities are similar to the corresponding components in Design/CPN, which is a widespread tool for Coloured Petri Nets.

  4. Claire, a simulation and testing tool for critical softwares

    Gassino, J.; Henry, J.Y.

    1996-01-01

    The needs of the CEA and IPSN (Institute for Nuclear Protection and Safety) concerning the testing of critical software have led to the development of the CLAIRE tool, which is able to test software without modifying it. This tool allows users to graphically model the system and its environment and to include components in the model which observe, but do not modify, the behaviour of the system under test. The executable codes are integrated in the model. The tool uses simulators of the target machines (microprocessors). The technique used (event simulation) allows actions to be associated with events such as the execution of an instruction, access to a variable, etc. The simulation results are analysed using graphical tools, state-search tools and test-coverage measurement tools. In particular, this tool can assist in the evaluation of critical software built from pre-existing components. (J.S.)

  5. FastGCN: a GPU accelerated tool for fast gene co-expression networks.

    Meimei Liang

    Full Text Available Gene co-expression networks are one type of valuable biological network. Many methods and tools have been published for constructing gene co-expression networks; however, most of these tools and methods are inconvenient and time-consuming for large datasets. We have developed a user-friendly, accelerated and optimized tool for constructing gene co-expression networks that can fully harness the parallel nature of GPU (Graphics Processing Unit) architectures. Genetic entropies were exploited to filter out genes with no or small expression changes in the raw data preprocessing step. Pearson correlation coefficients were then calculated. After that, we normalized these coefficients and employed the False Discovery Rate to control for multiple testing. Finally, module identification was conducted to construct the co-expression networks. All of these calculations were implemented on a GPU. We also compressed the coefficient matrix to save space. We compared the performance of the GPU implementation with those of multi-core CPU implementations with 16 CPU threads, a single-thread C/C++ implementation and a single-thread R implementation. Our results show that the GPU implementation largely outperforms the single-thread C/C++ and R implementations, and that the GPU implementation outperforms the multi-core CPU implementation as the number of genes increases. With a test dataset containing 16,000 genes and 590 individuals, we achieved greater than 63 times the speed using a GPU implementation compared with a single-thread R implementation when 50 percent of the genes were filtered out, and about 80 times the speed when no genes were filtered out.
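
The CPU-side core of such a pipeline, correlation followed by thresholding, might look like the sketch below. The entropy filter, FDR control and GPU offload of FastGCN are omitted, and the threshold and toy data are invented for illustration.

```python
import numpy as np

def coexpression_edges(expr, threshold=0.8):
    """Build a gene co-expression network from an expression matrix
    (genes x samples) by thresholding pairwise Pearson correlations."""
    corr = np.corrcoef(expr)                  # genes x genes correlation matrix
    iu = np.triu_indices_from(corr, k=1)      # consider each gene pair once
    mask = np.abs(corr[iu]) >= threshold
    return [(int(i), int(j))
            for i, j, keep in zip(iu[0], iu[1], mask) if keep]

rng = np.random.default_rng(1)
base = rng.standard_normal(50)
expr = np.vstack([
    base,                                     # gene 0
    base + 0.1 * rng.standard_normal(50),     # gene 1: co-expressed with gene 0
    rng.standard_normal(50),                  # gene 2: independent
])
print(coexpression_edges(expr))               # → [(0, 1)]
```

The `corrcoef` call is exactly the all-pairs computation that dominates the cost for large gene counts, which is why FastGCN moves it (and the subsequent filtering) onto the GPU.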

  6. Software Tools for Stochastic Simulations of Turbulence

    2015-08-28

    Keywords: pure sciences, applied sciences, front tracking, large eddy simulations, mesh convergence, stochastic convergence, weak...

  7. A Simulation Tool for Hurricane Evacuation Planning

    Daniel J. Fonseca

    2009-01-01

    Full Text Available Atlantic hurricanes and severe tropical storms are a serious threat to the communities in the Gulf of Mexico region. Such storms are violent and destructive. In response to these dangers, coastal evacuation may be ordered. This paper describes the development of a simulation model to analyze the movement of vehicles along I-65, a major US Interstate highway that runs north from the coastal city of Mobile, Alabama, towards the State of Tennessee, during a massive evacuation triggered by a disastrous event such as a hurricane. The constructed simulation platform consists of a primary and two secondary models. The primary model is based on the entry of vehicles from the 20 on-ramps to I-65. The two secondary models assist the primary model with related traffic events such as car breakdowns and accidents, traffic control measures, interarrival signaling, and unforeseen emergency incidents, among others. Statistical testing was performed on the data generated by the simulation model to identify variation in relevant traffic variables affecting the timely flow of vehicles traveling north. The statistical analysis focused on the closing of alternative on-ramps throughout the Interstate.
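
As a toy stand-in for one congested merge point in such a model, a single-bottleneck FIFO queue with exponential headways can be simulated in a few lines. All names and rates below are invented for illustration, not taken from the paper's I-65 model.

```python
import random

def ramp_queue_wait(n=5000, arrival_rate=0.5, service_rate=0.6, seed=42):
    """Average waiting time at a single-bottleneck FIFO queue (M/M/1):
    vehicles arrive with exponential headways and pass the bottleneck
    at an exponentially distributed service time."""
    rng = random.Random(seed)
    t_arrive = 0.0
    t_free = 0.0                    # time the bottleneck next becomes free
    total_wait = 0.0
    for _ in range(n):
        t_arrive += rng.expovariate(arrival_rate)   # Poisson arrival stream
        start = max(t_arrive, t_free)               # wait if bottleneck busy
        total_wait += start - t_arrive
        t_free = start + rng.expovariate(service_rate)
    return total_wait / n

# M/M/1 theory predicts Wq = lambda / (mu * (mu - lambda)) ≈ 8.3 time units here
avg_wait = ramp_queue_wait()
```

A full evacuation model chains many such entry points and adds the breakdown, signaling and ramp-closure events described in the abstract, but the arrive/wait/serve bookkeeping is the same.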

  8. A typology of educationally focused medical simulation tools.

    Alinier, Guillaume

    2007-10-01

    The concept of simulation as an educational tool in healthcare is not a new idea but its use has really blossomed over the last few years. This enthusiasm is partly driven by an attempt to increase patient safety and also because the technology is becoming more affordable and advanced. Simulation is becoming more commonly used for initial training purposes as well as for continuing professional development, but people often have very different perceptions of the definition of the term simulation, especially in an educational context. This highlights the need for a clear classification of the technology available but also about the method and teaching approach employed. The aims of this paper are to discuss the current range of simulation approaches and propose a clear typology of simulation teaching aids. Commonly used simulation techniques have been identified and discussed in order to create a classification that reports simulation techniques, their usual mode of delivery, the skills they can address, the facilities required, their typical use, and their pros and cons. This paper presents a clear classification scheme of educational simulation tools and techniques with six different technological levels. They are respectively: written simulations, three-dimensional models, screen-based simulators, standardized patients, intermediate fidelity patient simulators, and interactive patient simulators. This typology allows the accurate description of the simulation technology and the teaching methods applied. Thus valid comparison of educational tools can be made as to their potential effectiveness and verisimilitude at different training stages. The proposed typology of simulation methodologies available for educational purposes provides a helpful guide for educators and participants which should help them to realise the potential learning outcomes at different technological simulation levels in relation to the training approach employed. It should also be a useful

  9. Multi-objective optimum design of fast tool servo based on improved differential evolution algorithm

    Zhu, Zhiwei; Zhou, Xiaoqin; Liu, Qiang; Zhao, Shaoxin

    2011-01-01

    The flexure-based mechanism is a promising realization of the fast tool servo (FTS), and the optimum determination of flexure hinge parameters is one of the most important elements of FTS design. This paper presents a multi-objective optimization approach to optimizing the dimension and position parameters of the flexure-based mechanism, based on an improved differential evolution algorithm embedding chaos and a nonlinear simulated annealing algorithm. The results of the optimum design show that the proposed algorithm has excellent performance, and a well-balanced compromise is made between two conflicting objectives: the stroke and the natural frequency of the FTS mechanism. Validation tests based on finite element analysis (FEA) show good agreement with the results obtained using the proposed theoretical algorithm. Finally, a series of experimental tests were conducted to validate the design process and assess the performance of the FTS mechanism. The designed FTS reaches a stroke of 10.25 μm with at least 2 kHz bandwidth. Both the FEA and experimental results demonstrate that the parameters of the flexure-based mechanism determined by the proposed approach can achieve the specified performance, and that the approach is suitable for the optimum design of FTS mechanisms.
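
A minimal single-objective DE/rand/1/bin loop illustrates the baseline algorithm; the paper's variant is multi-objective and adds chaos-based initialisation and simulated-annealing acceptance, which are not reproduced in this sketch.

```python
import numpy as np

def differential_evolution(f, bounds, pop_size=20, F=0.8, CR=0.9,
                           generations=200, seed=0):
    """Minimal DE/rand/1/bin minimiser (single-objective sketch only)."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    pop = rng.uniform(lo, hi, (pop_size, len(lo)))
    fit = np.array([f(x) for x in pop])
    for _ in range(generations):
        for i in range(pop_size):
            # mutation: combine three distinct individuals other than i
            a, b, c = pop[rng.choice([j for j in range(pop_size) if j != i],
                                     3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)
            # binomial crossover, forcing at least one mutant gene through
            cross = rng.random(len(lo)) < CR
            cross[rng.integers(len(lo))] = True
            trial = np.where(cross, mutant, pop[i])
            if (tf := f(trial)) <= fit[i]:          # greedy selection
                pop[i], fit[i] = trial, tf
    best = fit.argmin()
    return pop[best], fit[best]

# minimise the 2D sphere function; the optimum is at (0, 0)
x_best, f_best = differential_evolution(lambda x: float(np.sum(x**2)),
                                        [(-5, 5), (-5, 5)])
```

In the multi-objective setting of the paper, the scalar fitness comparison is replaced by Pareto-dominance bookkeeping over the stroke and natural-frequency objectives, but the mutate/crossover/select skeleton is unchanged.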

  10. Simulation Tools for Power Electronics Courses Based on Java Technologies

    Canesin, Carlos A.; Goncalves, Flavio A. S.; Sampaio, Leonardo P.

    2010-01-01

    This paper presents interactive power electronics educational tools. These interactive tools make use of the benefits of Java language to provide a dynamic and interactive approach to simulating steady-state ideal rectifiers (uncontrolled and controlled; single-phase and three-phase). Additionally, this paper discusses the development and use of…

  11. 10 CFR 434.521 - The simulation tool.

    2010-01-01

    ... model shall account for: 521.1.1 The dynamic heat transfer of the building envelope such as solar and... system controls and distribution systems by simulating the whole building; 521.1.4 The operating schedule... RESIDENTIAL BUILDINGS Building Energy Cost Compliance Alternative § 434.521 The simulation tool. 521.1 Annual...

  12. Methodology for Developing a Diesel Exhaust After Treatment Simulation Tool

    Christiansen, Tine; Jensen, Johanne; Åberg, Andreas

    2018-01-01

    A methodology for the development of catalyst models is presented, together with a methodology for implementing such models in a modular simulation tool that simulates the units in succession. A case study illustrates how suitable models can be found and used for s...

  13. Expanded Capabilities for the Hydrogen Financial Analysis Scenario Tool (H2FAST)

    Bush, Brian; Melaina, Marc; Penev, Michael

    2016-06-08

    This presentation describes how NREL expanded the capabilities for the Hydrogen Financial Analysis Scenario Tool (H2FAST) in FY16. It was presented at the U.S. Department of Energy Hydrogen and Fuel Cells Program 2016 Annual Merit Review and Peer Evaluation Meeting on June 8, 2016, in Washington, D.C.

  14. Lightweight object oriented structure analysis: tools for building tools to analyze molecular dynamics simulations.

    Romo, Tod D; Leioatts, Nicholas; Grossfield, Alan

    2014-12-15

    LOOS (Lightweight Object Oriented Structure-analysis) is a C++ library designed to facilitate making novel tools for analyzing molecular dynamics simulations by abstracting out the repetitive tasks, allowing developers to focus on the scientifically relevant part of the problem. LOOS supports input using the native file formats of most common biomolecular simulation packages, including CHARMM, NAMD, Amber, Tinker, and Gromacs. A dynamic atom selection language based on the C expression syntax is included and is easily accessible to the tool-writer. In addition, LOOS is bundled with over 140 prebuilt tools, including suites of tools for analyzing simulation convergence, three-dimensional histograms, and elastic network models. Through modern C++ design, LOOS is both simple to develop with (requiring knowledge of only four core classes and a few utility functions) and is easily extensible. A python interface to the core classes is also provided, further facilitating tool development. © 2014 Wiley Periodicals, Inc.

  15. Software simulation: a tool for enhancing control system design

    Sze, B.; Ridgway, G.H.

    2008-01-01

    The creation, implementation and management of engineering design tools are important to the quality and efficiency of any large engineering project. Some of the most complicated tools to develop are system simulators. The development and implementation of system simulators to support replacement fuel handling control systems is of particular interest to the Canadian nuclear industry given the current age of installations and the risk of obsolescence to many utilities. The use of such simulator tools has been known to significantly improve successful deployment of new software packages and maintenance-related software changes while reducing the time required for their overall development. Moreover, these simulation systems can also serve as operator training stations and provide a virtual environment for site engineers to test operational changes before they are uploaded to the actual system. (author)

  16. Draught risk index tool for building energy simulations

    Vorre, Mette Havgaard; Jensen, Rasmus Lund; Nielsen, Peter V.

    2014-01-01

    Flow elements combined with a building energy simulation tool can be used to indicate areas and periods when there is a risk of draught in a room. The study tests this concept by making a tool for post-processing of data from building energy simulations. The objective is to show indications of draught risk during a whole year, giving building designers a tool for the design stage of a building. The tool uses simple one-at-a-time calculations of flow elements and assesses the uncertainty of the result by counting the number of overlapping flow elements. The calculation time is low, making it usable in the early design stage to optimise the building layout. The tool provides an overview of the general draught pattern over a period, e.g. a whole year, and of how often there is a draught risk.

  17. 42: An Open-Source Simulation Tool for Study and Design of Spacecraft Attitude Control Systems

    Stoneking, Eric

    2018-01-01

    Simulation is an important tool in the analysis and design of spacecraft attitude control systems. The speaker will discuss the simulation tool, called simply 42, that he has developed over the years to support his own work as an engineer in the Attitude Control Systems Engineering Branch at NASA Goddard Space Flight Center. 42 was intended from the outset to be high-fidelity and powerful, but also fast and easy to use. 42 has been publicly available as open source since 2014. The speaker will describe some of 42's models and features, and discuss its applicability to studies ranging from early concept studies through the design cycle, integration, and operations. He will outline 42's architecture and share some thoughts on simulation development as a long-term project.

  18. Thermo-hydraulic simulations of the experimental fast reactor core

    Silveira Luz, M. da; Braz Filho, F.A.; Borges, E.M.

    1985-01-01

    A study of the core and of the performance of the metallic fuel of the experimental fast reactor, from the thermal-hydraulic point of view, was carried out employing the COBRA IV-I code. The good safety characteristics of this reactor and the feasibility of using metallic fuel in an experimental fast reactor were demonstrated. (Author) [pt

  19. [Tools for laparoscopic skill development - available trainers and simulators].

    Jaksa, László; Haidegger, Tamás; Galambos, Péter; Kiss, Rita

    2017-10-01

    The laparoscopic minimally invasive surgical technique is widely employed on a global scale. However, the efficient and ethical teaching of this technique requires equipment for surgical simulation. These educational devices are present on the market in the form of box trainers and virtual reality simulators, or some combination of those. In this article, we present a systematic overview of commercially available surgical simulators describing the most important features of each product. Our overview elaborates on box trainers and virtual reality simulators, and also touches on surgical robotics simulators, together with operating room workflow simulators, for the sake of completeness. Apart from presenting educational tools, we evaluated the literature of laparoscopic surgical education and simulation, to provide a complete picture of the unfolding trends in this field. Orv Hetil. 2017; 158(40): 1570-1576.

  20. Development of in-situ visualization tool for PIC simulation

    Ohno, Nobuaki; Ohtani, Hiroaki

    2014-01-01

    As the capability of supercomputers improves, the sizes of simulations and their output data become larger and larger. Visualization is usually carried out on a researcher's PC with interactive visualization software after the computer simulation has finished; however, data sizes are becoming too large for this approach. A promising answer is in-situ visualization: the simulation code is coupled with the visualization code, and visualization is performed alongside the simulation on the same supercomputer. We developed an in-situ visualization tool for particle-in-cell (PIC) simulation, provided as a Fortran module. We coupled it with a PIC simulation code, tested the coupled code on the Plasma Simulator supercomputer, and verified that it works. (author)

  1. Shower library technique for fast simulation of showers in calorimeters of the H1 experiment

    Raičević, N.; Glazov, A.; Zhokin, A.

    2013-01-01

    Fast simulation of showers in calorimeters is very important for particle physics analysis, since shower simulation typically takes a significant fraction of the total simulation time. At the same time, a simulation must reproduce experimental data as closely as possible. In this paper, a fast simulation of showers in two calorimeters of the H1 experiment is presented. High speed and good quality of shower simulation are achieved by using a shower library technique, in which the detector response is simulated using a collection of stored showers for different particle types and topologies. The library is created using the GEANT programme. The fast simulation based on the shower library is compared to the data collected by the H1 experiment.
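    The lookup at the heart of a shower library can be sketched in a few lines: index pre-simulated showers by particle type and energy bin, then draw one at random instead of re-running the full simulation. The bin edges, particle types, and three-cell "showers" below are made-up stand-ins for GEANT-produced entries:

```python
import bisect
import random

random.seed(1)

# Illustrative energy bin edges (GeV) for the library index.
ENERGY_BINS = [1.0, 2.0, 5.0, 10.0, 20.0]

def make_library():
    lib = {}
    for particle in ("e-", "gamma"):
        for b, e in enumerate(ENERGY_BINS):
            # Stand-in for GEANT-simulated deposits in 3 calorimeter
            # cells: a fixed longitudinal profile summing to energy e.
            lib[(particle, b)] = [[0.7 * e, 0.2 * e, 0.1 * e] for _ in range(10)]
    return lib

def fast_simulate(lib, particle, energy):
    """Pick a random stored shower from the matching energy bin and
    scale it to the requested incident energy."""
    b = min(bisect.bisect_left(ENERGY_BINS, energy), len(ENERGY_BINS) - 1)
    shower = random.choice(lib[(particle, b)])
    scale = energy / ENERGY_BINS[b]
    return [d * scale for d in shower]

lib = make_library()
deposits = fast_simulate(lib, "e-", 4.0)   # deposits sum to 4.0 GeV
```

A real library would store many showers per bin with fluctuating shapes, and the binning would also cover incident angle and impact position; the dictionary lookup and scaling step are the part that makes the technique fast.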

  2. Hygrothermal Numerical Simulation Tools Applied to Building Physics

    Delgado, João M P Q; Ramos, Nuno M M; Freitas, Vasco Peixoto

    2013-01-01

    This book presents a critical review of the development and application of hygrothermal analysis methods to simulate the coupled transport processes of Heat, Air, and Moisture (HAM) transfer for one- or multidimensional cases. During the past few decades there has been substantial development in this field of study and an increase in the professional use of tools that simulate some of the physical phenomena involved in Heat, Air and Moisture conditions in building components or elements. Although a significant number of hygrothermal models are referred to in the literature, the vast majority of them are not easily available to the public outside the institutions where they were developed, which restricts the analysis in this book to only 14 hygrothermal modelling tools. The special features of this book are (a) a state-of-the-art review of numerical simulation tools applied to building physics, (b) the importance of boundary conditions, and (c) the material properties, namely, experimental methods for the measuremen...

  3. Echo simulator with novel training and competency testing tools.

    Sheehan, Florence H; Otto, Catherine M; Freeman, Rosario V

    2013-01-01

    We developed and validated an echo simulator with three novel tools that facilitate training and enable quantitative, objective measurement of psychomotor as well as cognitive skill. First, the trainee can see original patient images - not synthetic or simulated images - that morph in real time as the mock transducer is manipulated on the mannequin. Second, augmented reality is used for Visual Guidance, a tool that assists the trainee in scanning by displaying the target organ in three dimensions (3D) together with the location of the current view plane and the plane of the anatomically correct view. Third, we introduce Image Matching, a tool that leverages the aptitude of the human brain for recognizing similarities and differences to help trainees learn to perform visual assessment of ultrasound images. Psychomotor competence is measured in terms of the view plane angle error. The construct validity of the simulator for competency testing was established by demonstrating its ability to discriminate novices vs. experts.

  4. Distribution view: a tool to write and simulate distributions

    Coelho, José; Branco, Fernando; Oliveira, Teresa

    2006-01-01

    In our work we present a tool to write and simulate distributions. This tool allows the user to write mathematical expressions which can contain not only functions and variables, but also statistical distributions, including mixtures. Each time the expression is evaluated, a value is generated for each inner distribution according to that distribution and used to determine the expression's value. The inversion method can be used in this language, allowing generation of all distributions...
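    The inversion method mentioned here can be illustrated directly: apply the inverse CDF to a uniform variate. The exponential distribution and the mixture helper below are illustrative choices, not the tool's actual expression language:

```python
import math
import random

random.seed(0)

# Inversion method: if U ~ Uniform(0,1) and F is a CDF, then F^{-1}(U)
# follows the distribution with CDF F.  For Exponential(rate), the
# inverse CDF is -ln(1 - u) / rate.
def exponential_inverse_cdf(u, rate):
    return -math.log(1.0 - u) / rate

def sample(rate, n):
    return [exponential_inverse_cdf(random.random(), rate) for _ in range(n)]

def sample_mixture(weights, rates):
    # Mixture: pick a component by weight, then invert that component.
    r = random.choices(rates, weights=weights)[0]
    return exponential_inverse_cdf(random.random(), r)

xs = sample(rate=2.0, n=100_000)
mean = sum(xs) / len(xs)   # should be close to 1/rate = 0.5
```

Any distribution with a computable inverse CDF can be plugged in the same way, which is why the method fits naturally into an expression-evaluation language.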

  5. A Coupling Tool for Parallel Molecular Dynamics-Continuum Simulations

    Neumann, Philipp

    2012-06-01

    We present a tool for coupling Molecular Dynamics and continuum solvers. It is written in C++ and is meant to support developers of hybrid molecular-continuum simulations in both the realisation of the respective coupling algorithm and the parallel execution of the hybrid simulation. We describe the implementational concept of the tool and its parallel extensions. We particularly focus on the parallel execution of particle insertions into dense molecular systems and propose a respective parallel algorithm. Our implementations are validated for serial and parallel setups in two and three dimensions. © 2012 IEEE.

  6. Beam simulation tools for GEANT4 (and neutrino source applications)

    Elvira, V. Daniel; Lebrun, Paul; Spentzouris, Panagiotis

    2002-01-01

    Geant4 is a toolkit developed by a collaboration of physicists and computer professionals in High Energy Physics for the simulation of the passage of particles through matter. The motivation for the development of the Beam Tools is to extend Geant4 applications to accelerator physics. Although there are many computer programs for beam physics simulations, Geant4 is ideal for modelling a beam going through material, or a system with a beam line integrated with a complex detector. There are many examples in current international High Energy Physics programs, such as studies related to a future Neutrino Factory, a Linear Collider, and a Very Large Hadron Collider.

  7. Physics validation of detector simulation tools for LHC

    Beringer, J.

    2004-01-01

    Extensive studies aimed at validating the physics processes built into the detector simulation tools Geant4 and Fluka are in progress within all Large Hadron Collider (LHC) experiments, within the collaborations developing these tools, and within the LHC Computing Grid (LCG) Simulation Physics Validation Project, which has become the primary forum for these activities. This work includes detailed comparisons with test beam data, as well as benchmark studies of simple geometries and materials with single incident particles of various energies for which experimental data are available. We give an overview of these validation activities with emphasis on the latest results.

  8. Model for Simulating Fasting Glucose in Type 2 Diabetes and the Effect of Adherence to Treatment

    Aradóttir, Tinna Björk; Boiroux, Dimitri; Bengtsson, Henrik

    2017-01-01

    The primary goal of this paper is to predict fasting glucose levels in type 2 diabetes (T2D) in long-acting insulin treatment. The paper presents a model for simulating insulin-glucose dynamics in T2D patients. The model combines a physiological model of type 1 diabetes (T1D) and an endogenous insulin production model in T2D. We include a review of sources of variance in fasting glucose values in long-acting insulin treatment, with respect to dose guidance algorithms. We use the model to simulate fasting glucose levels in T2D long-acting insulin treatment and compare the results with clinical trial results where a dose guidance algorithm was used. We investigate sources of variance and through simulations evaluate the contribution of adherence to variance and dose guidance quality. The results suggest that the model for simulation of T2D patients is sufficient for simulating fasting glucose...

  9. Process simulation support in BPM tools: The case of BPMN

    Freitas, António Paulo; Pereira, José Luís Mota

    2015-01-01

    Due to the increasing acceptance of BPM, BPM tools are nowadays extensively used in organizations. Core to BPM are process modeling languages, of which BPMN is the one receiving the most attention these days. Once a business process is described using BPMN, a process simulation approach can be used to find its optimized form. In this context, the simulation of business processes, such as those defined in BPMN, appears as an obvious way of improving pro...

  10. Use of Torness simulator as a design and commissioning tool

    Hamilton, J.; Hacking, D.

    1989-01-01

    Real-time replica control room simulators are now in general use for the training of nuclear power plant operators, and the almost universal trend towards accurate physics-based plant models and detailed operator interface has opened up interesting and valuable engineering applications. This paper discusses the use of the Torness AGR Simulator in its dual role of commissioning support tool and operator training facility. (author)

  11. Solid-state-drives (SSDs) modeling simulation tools & strategies

    2017-01-01

    This book introduces simulation tools and strategies for complex systems of solid-state drives (SSDs), which consist of a flash multi-core microcontroller plus NAND flash memories. It provides a broad overview of the most popular simulation tools, with special focus on open source solutions. VSSIM, NANDFlashSim and DiskSim are benchmarked against the performance of real SSDs under different traffic workloads. The pros and cons of each simulator are analyzed, and it is clearly indicated which kind of answers each of them can give and at what price. It is explained that speed and precision do not go hand in hand, and that it is important to understand when to simulate what, and with which tool. Being able to simulate SSD performance is mandatory to meet time-to-market targets, together with product cost and quality. Over the last few years the authors developed an advanced simulator named "SSDExplorer", which has been used to evaluate multiple phenomena with great accuracy, from QoS (Quality of Service) to Read Retry, fr...

  12. Pediatric FAST and elevated liver transaminases: An effective screening tool in blunt abdominal trauma.

    Sola, Juan E; Cheung, Michael C; Yang, Relin; Koslow, Starr; Lanuti, Emma; Seaver, Chris; Neville, Holly L; Schulman, Carl I

    2009-11-01

    The current standard for the evaluation of children with blunt abdominal trauma (BAT) consists of physical examination, screening lab values, and computed tomography (CT) scan. We sought to determine if the focused assessment with sonography for trauma (FAST) combined with elevated liver transaminases (AST/ALT) could be used as a screening tool for intra-abdominal injury (IAI) in pediatric patients with BAT. Registry data at a level 1 trauma center was retrospectively reviewed from 1991-2007. Data collected on BAT patients under the age of 16 y included demographics, injury mechanism, ISS, GCS, imaging studies, serum ALT and AST levels, and disposition. AST and ALT were considered positive if either one was >100 IU/L. Overall, 3171 cases were identified. A total of 1008 (31.8%) patients received CT scan, 1148 (36.2%) had FAST, and 497 (15.7%) patients received both. Of the 497 patients, 400 (87.1%) also had AST and ALT measured. FAST was 50% sensitive, 91% specific, with a positive predictive value (PPV) of 68%, negative predictive value (NPV) of 83%, and accuracy of 80%. Combining FAST with elevated AST or ALT resulted in a statistically significant increase in all measures (sensitivity 88%, specificity 98%, PPV 94%, NPV 96%, accuracy 96%). FAST combined with AST or ALT > 100 IU/L is an effective screening tool for IAI in children following BAT. Pediatric patients with a negative FAST and liver transaminases < 100 IU/L should be observed rather than subjected to the radiation risk of CT.
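    The screening statistics quoted above all follow from a standard 2x2 confusion matrix. A minimal sketch (the counts below are invented for illustration, not the study's data):

```python
# Screening-test metrics from a 2x2 confusion matrix:
#   tp = injury present, test positive      fn = injury present, test negative
#   fp = injury absent,  test positive      tn = injury absent,  test negative
def screening_metrics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),            # true positive rate
        "specificity": tn / (tn + fp),            # true negative rate
        "ppv": tp / (tp + fp),                    # positive predictive value
        "npv": tn / (tn + fn),                    # negative predictive value
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }

# Hypothetical counts for a combined FAST + transaminase screen.
m = screening_metrics(tp=44, fp=3, fn=6, tn=347)
# m["sensitivity"] == 0.88, m["accuracy"] == 0.9775
```

Note that PPV and NPV, unlike sensitivity and specificity, depend on the prevalence of injury in the studied population, which is why they shift when the screen is applied to a different cohort.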

  13. RANS-based CFD simulations of sodium fast reactor wire-wrapped pin bundles

    Pointer, W. D.; Thomas, J.; Fanning, T.; Fischer, P.; Siegel, A.; Smith, J.; Tokuhiro, A.

    2009-01-01

    In response to recent renewed interest in the development of advanced fast reactors, an effort is underway to develop a high-performance computational multi-physics simulation suite for the design and safety analysis of sodium-cooled fast reactors. Within the multi-resolution thermal-hydraulics simulation component of this framework, high-resolution spectral large eddy simulation methods are used to improve turbulence models from coarser-resolution Reynolds-averaged Navier-Stokes (RANS) methods, and in turn, that data is used to improve or extend correlations used in traditional sub-channel tools. These ongoing studies provide the foundation for the development of the intermediate RANS-based resolution level. Prior work has focused on the benchmarking of flow field predictions in 7-pin, 19-pin, and 37-pin fuel assemblies. The present work extends these studies to 217-pin assemblies in support of initial efforts to benchmark heat transfer predictions using the RANS models against conventional sub-channel models. In an effort to reduce the number of computational cells required to describe a 217-pin geometry, the effects of simplifying the geometric description of the contact point between the wire and the pin are investigated. The advantages of using polyhedral-based meshing methods rather than trimmed-cell meshing methods have been demonstrated, and the effects of changes in axial mesh resolution in these meshes have been investigated. Results show that the geometric simplification has little impact on predicted flow fields, as does the use of a polyhedral mesh of comparable mesh density in place of the original trimmed-cell mesh. While reducing axial mesh density has a notable impact on the velocity field, reducing predicted exchange velocities between adjacent subchannels by as much as 25%, the impact on predicted temperature fields is negligible. (authors)

  14. Multivariate data analysis as a fast tool in evaluation of solid state phenomena

    Jørgensen, Anna Cecilia; Miroshnyk, Inna; Karjalainen, Milja

    2006-01-01

    ... of information generated can be overwhelming and the need for more effective data analysis tools is well recognized. The aim of this study was to investigate the use of multivariate data analysis, in particular principal component analysis (PCA), for fast analysis of solid state information. The data sets ... the molecular level interpretation of the structural changes related to the loss of water, as well as interpretation of the phenomena related to the crystallization. The critical temperatures or critical time points were identified easily using the principal component analysis. The variables (diffraction angles or wavenumbers) that changed could be identified by careful interpretation of the loadings plots. The PCA approach provides an effective tool for fast screening of solid state information.

  15. Comparison of discrete event simulation tools in an academic environment

    Mario Jadrić

    2014-12-01

    A new research model for simulation software evaluation is proposed, consisting of three main categories of criteria: the modeling and simulation capabilities of the explored tools, and the tools' input/output analysis possibilities, all with respective sub-criteria. Using the presented model, two discrete event simulation tools are evaluated in detail using a task-centred scenario. Both tools (Arena and ExtendSim) were used for teaching discrete event simulation in preceding academic years. With the aim of inspecting their effectiveness and determining which tool is more suitable for students, i.e. academic purposes, we used a simple simulation model of entities competing for limited resources. The main goal was to measure subjective (primarily attitude) and objective indicators while the tools were used on the same simulation scenario. The subjects were first-year students of the Master studies in Information Management at the Faculty of Economics in Split, taking a course in Business Process Simulations (BPS). In a controlled environment - a computer lab - two groups of students were given detailed, step-by-step instructions for building models using both tools, first ExtendSim then Arena or vice versa. Subjective indicators (students' attitudes) were collected using an online survey completed immediately upon building each model; they primarily include students' personal estimations of Arena and ExtendSim capabilities/features for model building, model simulation and result analysis. Objective indicators were measured using specialised software that logs information on the user's behavior while performing a particular task on the computer, such as the distance crossed by the mouse during model building, the number of mouse clicks, usage of the mouse wheel and the speed achieved. The results indicate that ExtendSim is clearly preferred over Arena with regard to the subjective indicators, while the objective indicators are

  16. Simulation training tools for nonlethal weapons using gaming environments

    Donne, Alexsana; Eagan, Justin; Tse, Gabriel; Vanderslice, Tom; Woods, Jerry

    2006-05-01

    Modern simulation techniques have a growing role in evaluating new technologies and in developing cost-effective training programs. A mission simulator facilitates the productive exchange of ideas by demonstrating concepts through compellingly realistic computer simulation. Revolutionary advances in 3D simulation technology have made it possible for desktop computers to process strikingly realistic and complex interactions with results depicted in real time. Computer games now allow multiple real human players and "artificially intelligent" (AI) simulated robots to play together. Advances in computer processing power have compensated for the inherently intensive calculations required for complex simulation scenarios. The main components of the leading game engines have been released for user modification, enabling game enthusiasts and amateur programmers to advance the state of the art in AI and computer simulation technologies. It is now possible to simulate sophisticated and realistic conflict situations in order to evaluate the impact of non-lethal devices as well as conflict resolution procedures using such devices. Simulations can reduce training costs: end users learn what a device does and doesn't do prior to use, understand responses to the device prior to deployment, determine whether the device is appropriate for their situational responses, and train with new devices and techniques before purchasing hardware. This paper will present the status of SARA's mission simulation development activities, based on the Half-Life game engine, for the purpose of evaluating the latest non-lethal weapon devices, and for developing training tools for such devices.

  17. The Development of a 3D LADAR Simulator Based on a Fast Target Impulse Response Generation Approach

    Al-Temeemy, Ali Adnan

    2017-09-01

    A new laser detection and ranging (LADAR) simulator has been developed, using MATLAB and its graphical user interface, to simulate direct-detection time-of-flight LADAR systems and to produce 3D simulated scanning images under a wide variety of conditions. This simulator models each stage from the laser source to data generation and can be considered an efficient simulation tool for developing LADAR systems and their data processing algorithms. The novel approach proposed for this simulator is to generate the actual target impulse response. This approach is fast and able to meet high scanning requirements without losing the fidelity that usually accompanies increases in speed, leading to a more efficient LADAR simulator and opening up the possibility of simulating LADAR beam propagation more accurately by using a large number of laser footprint samples. The approach is to select only the parts of the target that lie in the laser beam's angular field, by mathematically deriving the required equations and calculating the target angular ranges. The performance of the new simulator has been evaluated under different scanning conditions; the results show significant increases in processing speed in comparison to conventional approaches, which are used in this study as the point of comparison. The results also show the simulator's ability to simulate phenomena related to the scanning process, for example the type of noise, the scanning resolution and the laser beam width.
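    The angular-field selection at the core of this fast approach can be sketched as a simple culling step: keep only the target samples whose direction from the sensor falls inside the beam's divergence cone. The geometry helper, its small-angle comparison, and the sample points below are illustrative assumptions, not the paper's equations:

```python
import math

def in_beam(point, beam_az, beam_el, half_angle):
    """True if a target sample lies inside the beam's angular field.
    Uses a small-angle treatment: azimuth and elevation offsets from
    the beam axis are compared directly against the half-angle."""
    x, y, z = point
    az = math.atan2(y, x)                     # azimuth of the sample
    el = math.atan2(z, math.hypot(x, y))      # elevation of the sample
    return abs(az - beam_az) <= half_angle and abs(el - beam_el) <= half_angle

# Three target samples 10 m downrange; only those within the 1-degree
# divergence half-angle of the boresighted beam contribute to the
# target impulse response.
points = [(10.0, 0.0, 0.0), (10.0, 0.1, 0.0), (10.0, 2.0, 0.0)]
visible = [p for p in points if in_beam(p, beam_az=0.0, beam_el=0.0,
                                        half_angle=math.radians(1.0))]
```

Culling the target to its in-beam angular range before computing per-sample time-of-flight contributions is what keeps the cost proportional to the footprint rather than the whole scene.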

  18. CLAIRE, an event-driven simulation tool for testing software

    Raguideau, J.; Schoen, D.; Henry, J.Y.; Boulc'h, J.

    1994-06-01

    CLAIRE is a software tool created to perform validations on executable codes or on specifications of distributed real-time applications for nuclear safety. CLAIRE can be used both to verify the safety properties by modelling the specifications, and also to validate the final code by simulating the behaviour of its equipment and software interfaces. It can be used to observe and provide dynamic control of the simulation process, and also to record changes to the simulated data for off-line analysis. (R.P.)

  19. Simulation Tools for Electrical Machines Modelling: Teaching and ...

    Simulation tools are used both for research and teaching to allow a good comprehension of the systems under study before practical implementation. This paper illustrates the way MATLAB is used to model non-linearities in a synchronous machine. The machine is modeled in the rotor reference frame with currents as state ...

  20. Uni- and omnidirectional simulation tools for integrated optics

    Stoffer, Remco

    2001-01-01

    This thesis presents several improvements on simulation methods in integrated optics, as well as some new methods. Both uni- and omnidirectional tools are presented; for the unidirectional methods, the emphasis is on higher-order accuracy; for the omnidirectional methods, the boundary conditions are

  1. Mindfully implementing simulation tools for supporting pragmatic design inquiries

    Hartmann, Timo; olde Scholtenhuis, Léon Luc; Zerjav, Vedran; Champlin, Carissa J

    2015-01-01

    Based upon a conceptualization of the engineering design process as pragmatic inquiry, this paper introduces a framework for supporting designers and design managers with a better understanding of the trade-offs required for a successful implementation of simulation tools. This framework contributes

  2. Classification and optimization of training tools for NPP simulator

    Billoen, G. van

    1994-01-01

    The training cycle of nuclear power plant (NPP) operators has evolved during the last decade in parallel with the evolution of the training tools. The phases of the training cycle can be summarized as follows: (1) basic principle learning, (2) specific functional training, (3) full operating range training, and (4) detailed accident analyses. Progress in simulation technology and the man/machine interface (MMI) gives training centers new opportunities to improve their training methods and their effectiveness in the transfer of knowledge. To take advantage of these new opportunities, a significant investment in simulation tools may be required. It is therefore important to propose an optimized approach when dealing with the overall equipment program for these training centers. An overall look at the tools offered on the international simulation market shows that there is a need for a systematic approach in this field. Classification of the different training tools needed for each phase of the training cycle is the basis for an optimized approach in terms of the hardware configuration and software specifications of the equipment to be installed in training centers. The 'Multi-Function Simulator' is one such approach. (orig.) (3 tabs.)

  3. Simulator technology as a tool for education in cardiac care.

    Hravnak, Marilyn; Beach, Michael; Tuite, Patricia

    2007-01-01

    Assisting nurses in gaining the cognitive and psychomotor skills necessary to safely and effectively care for patients with cardiovascular disease can be challenging for educators. Ideally, nurses would have the opportunity to synthesize and practice these skills in a protected training environment before application in the dynamic clinical setting. Recently, a technology known as high fidelity human simulation was introduced, which permits learners to interact with a simulated patient. The dynamic physiologic parameters and physical assessment capabilities of the simulated patient provide for a realistic learning environment. This article describes the High Fidelity Human Simulation Laboratory at the University of Pittsburgh School of Nursing and presents strategies for using this technology as a tool in teaching complex cardiac nursing care at the basic and advanced practice nursing levels. The advantages and disadvantages of high fidelity human simulation in learning are discussed.

  4. Serious Games and Simulation as Tools for Education

    Luca Mori

    2012-06-01

    Full Text Available The increasing adoption of computer-based “serious games” as digital tools for education requires addressing the question of the role of simulation in the teaching and learning process. Whereas many recent studies have stressed the benefits of digital games in a variety of learning contexts, this paper approaches the problem of the misuse and limitations of computer-based simulations, and argues that we still need to understand when a digital serious game is actually better than other, non-computer-based simulation experiences. Considering that the distinction between the two types of simulation does not mean that they are incompatible, the final question that I address concerns the best ways to combine computer-based and non-computer-based simulation techniques.

  5. Fast Simulation of Mechanical Heterogeneity in the Electrically Asynchronous Heart Using the MultiPatch Module.

    John Walmsley

    2015-07-01

    Full Text Available Cardiac electrical asynchrony occurs as a result of cardiac pacing or conduction disorders such as left bundle-branch block (LBBB. Electrically asynchronous activation causes myocardial contraction heterogeneity that can be detrimental for cardiac function. Computational models provide a tool for understanding pathological consequences of dyssynchronous contraction. Simulations of mechanical dyssynchrony within the heart are typically performed using the finite element method, whose computational intensity may present an obstacle to clinical deployment of patient-specific models. We present an alternative based on the CircAdapt lumped-parameter model of the heart and circulatory system, called the MultiPatch module. Cardiac walls are subdivided into an arbitrary number of patches of homogeneous tissue. Tissue properties and activation time can differ between patches. All patches within a wall share a common wall tension and curvature. Consequently, spatial location within the wall is not required to calculate deformation in a patch. We test the hypothesis that activation time is more important than tissue location for determining mechanical deformation in asynchronous hearts. We perform simulations representing an experimental study of myocardial deformation induced by ventricular pacing, and a patient with LBBB and heart failure using endocardial recordings of electrical activation, wall volumes, and end-diastolic volumes. Direct comparison between simulated and experimental strain patterns shows both qualitative and quantitative agreement between model fibre strain and experimental circumferential strain in terms of shortening and rebound stretch during ejection. Local myofibre strain in the patient simulation shows qualitative agreement with circumferential strain patterns observed in the patient using tagged MRI. We conclude that the MultiPatch module produces realistic regional deformation patterns in the asynchronous heart and that

  6. Monte Carlo Simulations of Neutron Oil well Logging Tools

    Azcurra, Mario

    2002-01-01

    Monte Carlo simulations of simple neutron oil well logging tools in typical geological formations are presented. The simulated tools consist of both 14 MeV pulsed and continuous Am-Be neutron sources, with time-gated and continuous gamma ray detectors respectively. The geological formation consists of pure limestone with 15% absolute porosity over a wide range of oil saturation. The particle transport was performed with the Monte Carlo N-Particle Transport Code System, MCNP-4B. Several gamma ray spectra were obtained at the detector position that allow composition analysis of the formation to be performed. In particular, the ratio C/O was analyzed as an indicator of oil saturation. Further calculations are proposed to simulate actual detector responses in order to help understand the relation between the detector response and the formation composition

  7. BioNSi: A Discrete Biological Network Simulator Tool.

    Rubinstein, Amir; Bracha, Noga; Rudner, Liat; Zucker, Noga; Sloin, Hadas E; Chor, Benny

    2016-08-05

    Modeling and simulation of biological networks is an effective and widely used research methodology. The Biological Network Simulator (BioNSi) is a tool for modeling biological networks and simulating their discrete-time dynamics, implemented as a Cytoscape App. BioNSi includes a visual representation of the network that enables researchers to construct, set the parameters, and observe network behavior under various conditions. To construct a network instance in BioNSi, only partial, qualitative biological data suffices. The tool is aimed for use by experimental biologists and requires no prior computational or mathematical expertise. BioNSi is freely available at http://bionsi.wix.com/bionsi , where a complete user guide and a step-by-step manual can also be found.
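The discrete-time dynamics such a simulator iterates can be sketched as a synchronous update rule in which every node moves one activity level toward the net influence of its regulators. This is an illustrative toy, not BioNSi's actual update semantics; the network, activity range, and update rule below are invented for the example.

```python
# Toy synchronous discrete-time network update in the spirit of BioNSi.
# The topology, state range, and rule are illustrative assumptions.

def step(states, edges, levels=3):
    """Synchronously update every node from its regulators.

    states: dict node -> int activity level in [0, levels-1]
    edges:  list of (source, target, sign) with sign +1 (activation)
            or -1 (inhibition)
    """
    nxt = {}
    for node in states:
        influence = sum(sign * states[src]
                        for src, tgt, sign in edges if tgt == node)
        # move one level toward the net influence, clamped to the range
        if influence > 0:
            nxt[node] = min(states[node] + 1, levels - 1)
        elif influence < 0:
            nxt[node] = max(states[node] - 1, 0)
        else:
            nxt[node] = states[node]
    return nxt

states = {"A": 2, "B": 0, "C": 1}
edges = [("A", "B", +1), ("B", "C", -1)]   # A activates B, B inhibits C
for _ in range(3):
    states = step(states, edges)
print(states)
```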

  8. Monte Carlo simulations of neutron oil well logging tools

    Azcurra, Mario O.; Zamonsky, Oscar M.

    2003-01-01

    Monte Carlo simulations of simple neutron oil well logging tools in typical geological formations are presented. The simulated tools consist of both 14 MeV pulsed and continuous Am-Be neutron sources, with time-gated and continuous gamma ray detectors respectively. The geological formation consists of pure limestone with 15% absolute porosity over a wide range of oil saturation. The particle transport was performed with the Monte Carlo N-Particle Transport Code System, MCNP-4B. Several gamma ray spectra were obtained at the detector position that allow composition analysis of the formation to be performed. In particular, the ratio C/O was analyzed as an indicator of oil saturation. Further calculations are proposed to simulate actual detector responses in order to help understand the relation between the detector response and the formation composition. (author)
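The C/O indicator described in both records above amounts, at its simplest, to integrating counts in energy windows around the characteristic carbon and oxygen gamma lines. The sketch below illustrates only that bookkeeping; the window limits and the toy spectrum are invented numbers, not MCNP output.

```python
# Illustrative C/O ratio extraction: integrate gamma-ray counts in energy
# windows around the carbon (~4.44 MeV) and oxygen (~6.13 MeV) lines.
# Window limits and the spectrum below are made-up numbers.

def window_sum(energies, counts, lo, hi):
    """Sum counts whose bin energy falls inside [lo, hi] (MeV)."""
    return sum(c for e, c in zip(energies, counts) if lo <= e <= hi)

def c_to_o_ratio(energies, counts):
    carbon = window_sum(energies, counts, 4.2, 4.7)
    oxygen = window_sum(energies, counts, 5.9, 6.4)
    return carbon / oxygen

# toy spectrum: bin energies (MeV) and counts
energies = [4.0, 4.4, 4.5, 5.0, 6.0, 6.1, 6.3]
counts = [10, 120, 80, 15, 60, 90, 40]
ratio = c_to_o_ratio(energies, counts)
print(round(ratio, 3))
```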

  9. Disaster response team FAST skills training with a portable ultrasound simulator compared to traditional training: pilot study.

    Paddock, Michael T; Bailitz, John; Horowitz, Russ; Khishfe, Basem; Cosby, Karen; Sergel, Michelle J

    2015-03-01

    Pre-hospital focused assessment with sonography in trauma (FAST) has been effectively used to improve patient care in multiple mass casualty events throughout the world. Although requisite FAST knowledge may now be learned remotely by disaster response team members, traditional live instructor and model hands-on FAST skills training remains logistically challenging. The objective of this pilot study was to compare the effectiveness of a novel portable ultrasound (US) simulator with traditional FAST skills training for a deployed mixed provider disaster response team. We randomized participants into one of three training groups stratified by provider role: Group A. Traditional Skills Training, Group B. US Simulator Skills Training, and Group C. Traditional Skills Training Plus US Simulator Skills Training. After skills training, we measured participants' FAST image acquisition and interpretation skills using a standardized direct observation tool (SDOT) with healthy models and review of FAST patient images. Pre- and post-course US and FAST knowledge were also assessed using a previously validated multiple-choice evaluation. We used the ANOVA procedure to determine the statistical significance of differences between the means of each group's skills scores. Paired sample t-tests were used to determine the statistical significance of pre- and post-course mean knowledge scores within groups. We enrolled 36 participants, 12 randomized to each training group. Randomization resulted in similar distribution of participants between training groups with respect to provider role, age, sex, and prior US training. For the FAST SDOT image acquisition and interpretation mean skills scores, there was no statistically significant difference between training groups. For US and FAST mean knowledge scores, there was a statistically significant improvement between pre- and post-course scores within each group, but again there was not a statistically significant difference between

  10. Fast Multilevel Panel Method for Wind Turbine Rotor Flow Simulations

    van Garrel, Arne; Venner, Cornelis H.; Hoeijmakers, Hendrik Willem Marie

    2017-01-01

    A fast multilevel integral transform method has been developed that enables the rapid analysis of unsteady inviscid flows around wind turbines rotors. A low order panel method is used and the new multi-level multi-integration cluster (MLMIC) method reduces the computational complexity for

  11. MUMAX: A new high-performance micromagnetic simulation tool

    Vansteenkiste, A.; Van de Wiele, B.

    2011-01-01

    We present MUMAX, a general-purpose micromagnetic simulation tool running on graphical processing units (GPUs). MUMAX is designed for high-performance computations and specifically targets large simulations. In that case speedups of over a factor of 100 can be obtained compared to the CPU-based OOMMF program developed at NIST. MUMAX aims to be general and broadly applicable. It solves the classical Landau-Lifshitz equation, taking into account the magnetostatic, exchange and anisotropy interactions, thermal effects and spin-transfer torque. Periodic boundary conditions can optionally be imposed. A spatial discretization using finite differences in two or three dimensions can be employed. MUMAX is publicly available as open-source software. It can thus be freely used and extended by the community. Due to its high computational performance, MUMAX should open up the possibility of running extensive simulations that would be nearly inaccessible with typical CPU-based simulators. - Highlights: → Novel, open-source micromagnetic simulator on GPU hardware. → Speedup of ≈100x compared to other widely used tools. → Extensively validated against standard problems. → Makes previously infeasible simulations accessible.
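The dynamical equation the abstract refers to can be written explicitly. In Landau-Lifshitz form with Gilbert-type damping (standard textbook notation, up to a conventional 1/(1+α²) rescaling of γ; this is not a formula quoted from the MUMAX paper):

```latex
\frac{\partial \vec{m}}{\partial t}
  = -\gamma\, \vec{m} \times \vec{H}_{\mathrm{eff}}
    - \alpha\gamma\, \vec{m} \times \left( \vec{m} \times \vec{H}_{\mathrm{eff}} \right)
```

where \(\vec{m} = \vec{M}/M_s\) is the unit magnetization, \(\gamma\) the gyromagnetic ratio, \(\alpha\) the damping constant, and \(\vec{H}_{\mathrm{eff}}\) the effective field collecting the magnetostatic, exchange, anisotropy, thermal, and spin-transfer-torque contributions listed in the abstract.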

  12. Tool Support for Parametric Analysis of Large Software Simulation Systems

    Schumann, Johann; Gundy-Burlet, Karen; Pasareanu, Corina; Menzies, Tim; Barrett, Tony

    2008-01-01

    The analysis of large and complex parameterized software systems, e.g., systems simulation in aerospace, is very complicated and time-consuming due to the large parameter space, and the complex, highly coupled nonlinear nature of the different system components. Thus, such systems are generally validated only in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. We have addressed the factors deterring such an analysis with a tool to support envelope assessment: we utilize a combination of advanced Monte Carlo generation with n-factor combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. Additional test-cases, automatically generated from models (e.g., UML, Simulink, Stateflow) improve the coverage. The distributed test runs of the software system produce vast amounts of data, making manual analysis impossible. Our tool automatically analyzes the generated data through a combination of unsupervised Bayesian clustering techniques (AutoBayes) and supervised learning of critical parameter ranges using the treatment learner TAR3. The tool has been developed around the Trick simulation environment, which is widely used within NASA. We will present this tool with a GN&C (Guidance, Navigation and Control) simulation of a small satellite system.
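The n-factor combinatorial variation mentioned above can be illustrated with a small pairwise (n = 2) generator: a test case is kept only if it covers some parameter-value pair not yet seen, so every pair is exercised with far fewer cases than the full factorial. The parameter names and values below are invented placeholders, and this greedy scheme is only a sketch of the idea, not the tool's actual generator.

```python
# Greedy pairwise (2-factor) covering-suite sketch. Parameters and values
# are invented placeholders, not from the NASA tool.
from itertools import combinations, product

def pairwise_suite(params):
    """Return test cases so that every value pair of every parameter pair
    appears in at least one selected case."""
    names = sorted(params)
    uncovered = {((n1, v1), (n2, v2))
                 for n1, n2 in combinations(names, 2)
                 for v1 in params[n1] for v2 in params[n2]}
    suite = []
    for values in product(*(params[n] for n in names)):
        case = dict(zip(names, values))
        pairs = {((n1, case[n1]), (n2, case[n2]))
                 for n1, n2 in combinations(names, 2)}
        if pairs & uncovered:        # keep only cases that add coverage
            suite.append(case)
            uncovered -= pairs
    return suite

params = {"gain": [0.1, 0.5, 0.9], "mode": ["auto", "manual"],
          "sensor": ["imu", "gps", "star"]}
suite = pairwise_suite(params)
print(len(suite), "cases instead of",
      len(params["gain"]) * len(params["mode"]) * len(params["sensor"]))
```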

  13. A Fast Tool for Assessing the Power Performance of Large WEC arrays

    Ruiz, Pau Mercadé

    In the present work, a tool for computing wave energy converter array hydrodynamic forces and power performance is developed. The tool leads to a significant reduction in computation time compared with standard boundary element method based codes while keeping similar levels of accuracy. This makes it suitable for array layout optimization, where large numbers of simulations are required. Furthermore, the tool is developed within an open-source environment, Python 2.7, so that it is fully accessible to anyone willing to make use of it.

  14. Truth Seeded Reconstruction for Fast Simulation in the ATLAS Experiment

    Jansky, Roland; Salzburger, Andreas

    The huge success of the ATLAS experiment for particle physics during Run 1 of the LHC would not have been possible without the production of vast amounts of simulated Monte Carlo data. However, the very detailed detector simulation is a highly CPU-intensive task, and thus resource shortages occurred. Motivated by this, great effort has been put into speeding up the simulation. As a result, other time-consuming parts became visible, one of which is track reconstruction. This thesis describes one potential solution to the CPU-intensive reconstruction of simulated data: a newly designed truth-seeded reconstruction. At its basis is the idea to skip pattern recognition altogether, instead utilizing the available (truth) information from simulation to directly fit particle trajectories without searching for them. At the same time, tracking effects of the standard reconstruction need to be emulated. This approach is validated thoroughly and no critical deviations of the results compared to the standard reconst...

  15. Planning organization and productivity simulation tool for maritime container terminals

    B. Beškovnik

    2010-09-01

    Full Text Available The article describes a proposed planning organization and productivity simulation tool, with a special emphasis on the optimization of operations in a maritime container terminal. With the application of an adequate model frame for traffic and technical-technological forecasting, infrastructure and manpower planning, and productivity simulation, it is possible to measure and increase productivity in the whole subsystem of the maritime container terminal. The emphasis is mainly put on setting up the planning organization in order to collect important information and consequently to raise productivity. This is the main task and goal of terminal management, which must develop elements and strategies for optimal operational and financial production. An adequate planning structure must use simplified but efficient simulation tools enabling owners and management to make a vast number of sound financial and operational decisions. Considering all the important and very dynamic facts of the container and shipping industry, the proposed simulation tool provides a helpful instrument for checking productivity and its variation over time, and for monitoring the competitive position of a given maritime terminal against the terminals in the same group. Therefore, the management of every maritime container terminal must establish an appropriate internal planning system as a mechanism for strategic decision support, relating basically to the assessment of the best development and optimization solutions for the infrastructure and suprastructure of the entire system.

  16. Soft computing simulation tools for nuclear energy systems

    Kannan Balasubramanian, S.

    2012-01-01

    This chapter deals with simulation, a very powerful tool in designing, constructing and operating nuclear power generating facilities. There are very different types of power plants, and the examples mentioned in this chapter originate from experience with water-cooled and water-moderated thermal reactors based on fission of uranium-235. Nevertheless, the methodological achievements in simulation mentioned below can definitely be used beyond this particular type of nuclear power generating reactor. Simulation means investigation of processes in the time domain. We can calculate the characteristics and properties of different systems (e.g., we can design a bridge over a river), but whether its movement in a thunderstorm with high winds would, after a certain time, evolve into destructive oscillation can only be determined by following the process in time; this type of calculation is called simulation

  17. State-of-the-Art for Hygrothermal Simulation Tools

    Boudreaux, Philip R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); New, Joshua Ryan [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Shrestha, Som S. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Adams, Mark B. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Pallin, Simon B. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-02-01

    The hygrothermal (heat and moisture) performance of buildings can be assessed by utilizing simulation tools. There are currently a number of hygrothermal calculation tools available, which vary in their degree of sophistication and runtime requirements. This report investigates three of the most commonly used models (WUFI, HAMT, and EMPD) to assess their limitations and their potential to generate physically realistic results, in order to prioritize improvements for EnergyPlus (which uses HAMT and EMPD). The outcome of the study shows that, of these three tools, WUFI has the greatest hygrothermal capabilities. Limitations of these tools were also assessed, including: WUFI’s inability to properly account for air leakage and transfer at surface boundaries; HAMT’s inability to handle air leakage, precipitation-related moisture problems, or condensation problems from high relative humidity; and multiple limitations for EMPD, a simplified method to estimate indoor temperature and humidity levels that is generally not used to estimate the hygrothermal performance of building envelope materials. In conclusion, of the three investigated simulation tools, HAMT has the greatest modeling potential and is open source, and we have prioritized specific features that can enable EnergyPlus to model all relevant heat and moisture transfer mechanisms that impact the performance of building envelope components.

  18. Kinetic Simulation of Fast Electron Transport with Ionization Effects and Ion Acceleration

    Robinson, A. P. L.; Bell, A. R.; Kingham, R. J.

    2005-01-01

    The generation of relativistic electrons and multi-MeV ions is central to ultra-intense (>10^18 W cm^-2) laser-solid interactions. The production of energetic particles by lasers has a number of potential applications ranging from Fast Ignition ICF to medicine. In terms of the relativistic (fast) electrons the areas of interest can be divided into three. Firstly there is the absorption of laser energy into fast electrons and MeV ions. Secondly there is the transport of fast electrons through the solid target. Finally there is a transduction stage, where the fast electron energy is imparted. This may range from the electrostatic acceleration of ions at a plasma-vacuum interface to the heating of a compressed core (as in Fast Ignitor ICF). We have used kinetic simulation codes to study the transport stage and electrostatic ion acceleration. (Author)

  19. Fast core prediction simulator for load follow control

    Yim, Man Sung; Lee, Sang Hoon; Lee, Un Chul

    1990-01-01

    An operator-assisting system for reactor core control under power-changing operating conditions was developed. The system consists of a core simulator routine and a xenon and iodine initial-condition generation routine. The initial-condition generation routine, without exact knowledge of the core status, is capable of providing accurate number densities and axial offset conditions of xenon and iodine after several hours of predictor-corrector calculations using the plant instrumentation signals of power level and power axial offset. The core simulator routine, even with its two-node core model, gives results as accurate as a one-dimensional model for core behaviour simulation under power-changing conditions, and can provide proper control strategies for load follow operation. The core simulator can also be used by the operator to develop remedial actions to restore a distorted power distribution by using its prediction capability
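The xenon and iodine bookkeeping such an initial-condition routine performs rests on the standard I-135/Xe-135 balance equations. The sketch below integrates them with explicit Euler under an assumed constant flux; the flux level, fission rate density, and step size are illustrative values, not numbers from the paper.

```python
# Standard point-model I-135/Xe-135 poison balance, integrated with
# explicit Euler. Flux, fission rate, and step size are illustrative.
import math

LAM_I = math.log(2) / (6.57 * 3600)    # I-135 decay constant [1/s]
LAM_X = math.log(2) / (9.14 * 3600)    # Xe-135 decay constant [1/s]
GAMMA_I, GAMMA_X = 0.0639, 0.00237     # fission yields for U-235
SIGMA_X = 2.6e-18                      # Xe-135 absorption cross section [cm^2]
SIGMA_F_PHI = 1e13                     # fission rate density Sigma_f*phi [1/(cm^3 s)]

def step(iodine, xenon, phi, dt):
    """One explicit-Euler step of the I-135/Xe-135 balance equations."""
    di = GAMMA_I * SIGMA_F_PHI - LAM_I * iodine
    dx = (GAMMA_X * SIGMA_F_PHI + LAM_I * iodine
          - LAM_X * xenon - SIGMA_X * phi * xenon)
    return iodine + di * dt, xenon + dx * dt

iodine = xenon = 0.0
phi = 3e13                              # neutron flux [n/(cm^2 s)]
for _ in range(200_000):                # ~55 h of operation, dt = 1 s
    iodine, xenon = step(iodine, xenon, phi, 1.0)

# after long steady operation, iodine approaches its equilibrium level
i_eq = GAMMA_I * SIGMA_F_PHI / LAM_I
print(iodine / i_eq)  # close to 1
```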

  20. Developing a fast simulator for irradiated silicon detectors

    Diez Gonzalez-Pardo, Alvaro

    2015-01-01

    Simulation software for irradiated silicon detectors has been developed on the basis of an already existing C++ simulation program called TRACS [1]. This software has already proven useful in understanding non-irradiated silicon diodes and microstrips. In addition, a wide variety of user-focused features has been implemented to improve TRACS's flexibility. Such features include an interface that allows any program to leverage TRACS functionality, a configuration file and improved documentation.

  1. Scalable and fast heterogeneous molecular simulation with predictive parallelization schemes

    Guzman, Horacio V.; Junghans, Christoph; Kremer, Kurt; Stuehn, Torsten

    2017-01-01

    Multiscale and inhomogeneous molecular systems are challenging topics in the field of molecular simulation. In particular, modeling biological systems in the context of multiscale simulations and exploring material properties are driving a permanent development of new simulation methods and optimization algorithms. In computational terms, those methods require parallelization schemes that make a productive use of computational resources for each simulation and from its genesis. Here, we introduce the heterogeneous domain decomposition approach, which is a combination of a heterogeneity-sensitive spatial domain decomposition with an a priori rearrangement of subdomain walls. Within this approach and paper, the theoretical modeling and scaling laws for the force computation time are proposed and studied as a function of the number of particles and the spatial resolution ratio. We also show the new approach capabilities, by comparing it to both static domain decomposition algorithms and dynamic load-balancing schemes. Specifically, two representative molecular systems have been simulated and compared to the heterogeneous domain decomposition proposed in this work. Finally, these two systems comprise an adaptive resolution simulation of a biomolecule solvated in water and a phase-separated binary Lennard-Jones fluid.

  2. Fast spot-based multiscale simulations of granular drainage

    Rycroft, Chris H.; Wong, Yee Lok; Bazant, Martin Z.

    2009-05-22

    We develop a multiscale simulation method for dense granular drainage, based on the recently proposed spot model, where the particle packing flows by local collective displacements in response to diffusing "spots" of interstitial free volume. By comparing with discrete-element method (DEM) simulations of 55,000 spheres in a rectangular silo, we show that the spot simulation is able to approximately capture many features of drainage, such as packing statistics, particle mixing, and flow profiles. The spot simulation runs two to three orders of magnitude faster than DEM, making it an appropriate method for real-time control or optimization. We demonstrate extensions for modeling particle heaping and avalanching at the free surface, and for simulating the boundary layers of slower flow near walls. We show that the spot simulations are robust and flexible, by demonstrating that they can be used in both event-driven and fixed-timestep approaches, and by showing that the elastic relaxation step used in the model can be applied much less frequently and still produce good results.
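The spot mechanism can be caricatured in a few lines: a spot of free volume random-walks upward through the packing, and every particle within its radius receives a small collective downward displacement. The geometry, radius, and step sizes below are invented for illustration and carry none of the calibrated model's physics.

```python
# Toy 2-D caricature of the spot mechanism: an upward-diffusing spot of
# free volume drags nearby particles slightly downward. All numbers are
# illustrative assumptions, not the calibrated spot model.
import random

def drain_step(particles, spot, radius=1.5, dz=0.1):
    """Advance the spot one unit upward (with lateral diffusion) and move
    every particle inside its radius down by dz."""
    x, z = spot
    spot = (x + random.choice([-1, 0, 1]), z + 1)   # upward-biased random walk
    moved = []
    for px, pz in particles:
        if (px - spot[0]) ** 2 + (pz - spot[1]) ** 2 <= radius ** 2:
            moved.append((px, pz - dz))             # collective downward displacement
        else:
            moved.append((px, pz))
    return moved, spot

random.seed(0)
initial = [(x * 0.5, z * 0.5) for x in range(8) for z in range(20)]
particles, spot = initial, (2.0, 0.0)
for _ in range(12):                                 # spot rises through the packing
    particles, spot = drain_step(particles, spot)

total_drop = sum(z0 - z1 for (_, z0), (_, z1) in zip(initial, particles))
print(total_drop > 0)
```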

  3. Scalable and fast heterogeneous molecular simulation with predictive parallelization schemes

    Guzman, Horacio V.; Junghans, Christoph; Kremer, Kurt; Stuehn, Torsten

    2017-11-01

    Multiscale and inhomogeneous molecular systems are challenging topics in the field of molecular simulation. In particular, modeling biological systems in the context of multiscale simulations and exploring material properties are driving a permanent development of new simulation methods and optimization algorithms. In computational terms, those methods require parallelization schemes that make a productive use of computational resources for each simulation and from its genesis. Here, we introduce the heterogeneous domain decomposition approach, which is a combination of a heterogeneity-sensitive spatial domain decomposition with an a priori rearrangement of subdomain walls. Within this approach, the theoretical modeling and scaling laws for the force computation time are proposed and studied as a function of the number of particles and the spatial resolution ratio. We also show the new approach capabilities, by comparing it to both static domain decomposition algorithms and dynamic load-balancing schemes. Specifically, two representative molecular systems have been simulated and compared to the heterogeneous domain decomposition proposed in this work. These two systems comprise an adaptive resolution simulation of a biomolecule solvated in water and a phase-separated binary Lennard-Jones fluid.

  4. A Quasiphysics Intelligent Model for a Long Range Fast Tool Servo

    Liu, Qiang; Zhou, Xiaoqin; Lin, Jieqiong; Xu, Pengzi; Zhu, Zhiwei

    2013-01-01

    Accurately modeling the dynamic behaviors of a fast tool servo (FTS) is one of the key issues in the ultraprecision positioning of the cutting tool. Herein, a quasiphysics intelligent model (QPIM) integrating a linear physics model (LPM) and a radial basis function (RBF) based neural model (NM) is developed to accurately describe the dynamic behaviors of a voice coil motor (VCM) actuated long range fast tool servo (LFTS). To identify the parameters of the LPM, a novel Opposition-based Self-adaptive Replacement Differential Evolution (OSaRDE) algorithm is proposed, which has been shown to have a faster convergence mechanism without compromising the quality of solution, and to outperform similar evolutionary algorithms considered for comparison. The modeling errors of the LPM and the QPIM are investigated by experiments. The modeling error of the LPM presents an obvious trend component, about ±1.15% of the full span range, verifying the efficiency of the proposed OSaRDE algorithm for system identification. As for the QPIM, the trend component in the residual error of the LPM can be well suppressed, and the error of the QPIM remains at the noise level. All the results verify the efficiency and superiority of the proposed modeling and identification approaches. PMID:24163627

  5. QUICKSILVER - A general tool for electromagnetic PIC simulation

    Seidel, David B.; Coats, Rebecca S.; Johnson, William A.; Kiefer, Mark L.; Mix, L. Paul; Pasik, Michael F.; Pointon, Timothy D.; Quintenz, Jeffrey P.; Riley, Douglas J.; Turner, C. David

    1997-01-01

    The dramatic increase in computational capability that has occurred over the last ten years has allowed fully electromagnetic simulations of large, complex, three-dimensional systems to move progressively from impractical, to expensive, and recently, to routine and widespread. This is particularly true for systems that require the motion of free charge to be self-consistently treated. The QUICKSILVER electromagnetic Particle-In-Cell (EM-PIC) code has been developed at Sandia National Laboratories to provide a general tool to simulate a wide variety of such systems. This tool has found widespread use for many diverse applications, including high-current electron and ion diodes, magnetically insulated power transmission systems, high-power microwave oscillators, high-frequency digital and analog integrated circuit packages, microwave integrated circuit components, antenna systems, radar cross-section applications, and electromagnetic interaction with biological material. This paper will give a brief overview of QUICKSILVER and provide some thoughts on its future development

  6. DNA – A General Energy System Simulation Tool

    Elmegaard, Brian; Houbak, Niels

    2005-01-01

    The paper reviews the development of the energy system simulation tool DNA (Dynamic Network Analysis). DNA has been developed since 1989 to be able to handle models of any kind of energy system based on the control volume approach, usually systems of lumped-parameter components. DNA has proven to be a useful tool in the analysis and optimization of several types of thermal systems: steam turbines, gas turbines, fuel cells, gasification, refrigeration and heat pumps for both conventional fossil fuels and different types of biomass. DNA is applicable for models of both steady state and dynamic operation. The program decides at runtime to apply the DAE solver if the system contains differential equations. This makes it easy to extend an existing steady state model to simulate dynamic operation of the plant. The use of the program is illustrated by examples of gas turbine models. The paper also...

  7. In Silico PCR Tools for a Fast Primer, Probe, and Advanced Searching.

    Kalendar, Ruslan; Muterko, Alexandr; Shamekova, Malika; Zhambakin, Kabyl

    2017-01-01

    The polymerase chain reaction (PCR) is fundamental to molecular biology and is the most important practical molecular technique for the research laboratory. The principle of this technique has been further used and applied in many other simple or complex nucleic acid amplification technologies (NAAT). In parallel to laboratory "wet bench" experiments for nucleic acid amplification technologies, in silico or virtual (bioinformatics) approaches have been developed, among them in silico PCR analysis. In silico NAAT analysis is a useful and efficient complementary method to ensure the specificity of primers or probes for an extensive range of PCR applications, from homology gene discovery and molecular diagnosis to DNA fingerprinting and repeat searching. Predicting the sensitivity and specificity of primers and probes requires a search to determine whether they match a database with an optimal number of mismatches, similarity, and stability. In the development of in silico bioinformatics tools for nucleic acid amplification technologies, the prospects for the development of new NAAT or similar approaches should be taken into account, including forward-looking and comprehensive analysis that is not limited to only one PCR technique variant. The software FastPCR and the online Java web tool are integrated tools for in silico PCR of linear and circular DNA, multiple primer or probe searches in large or small databases, and advanced searches. These tools are suitable for processing batch files, which is essential for automation when working with large amounts of data. The FastPCR software is available for download at http://primerdigital.com/fastpcr.html and the online Java version at http://primerdigital.com/tools/pcr.html .
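The primer-specificity search described above reduces, in its simplest form, to sliding a primer along both strands of a template and counting mismatches against a budget. The sketch below shows only that core scan; the sequences are made up, and real tools such as FastPCR additionally weigh 3'-end stability, melting temperature, and secondary structure.

```python
# Minimal in silico primer search: slide a primer along a template and its
# reverse complement, reporting binding sites within a mismatch budget.
# The template and primer are invented example sequences.

COMP = str.maketrans("ACGT", "TGCA")

def revcomp(seq):
    """Reverse complement of a DNA sequence."""
    return seq.translate(COMP)[::-1]

def binding_sites(template, primer, max_mismatches=1):
    """Return (position, strand, mismatches) for every acceptable match."""
    hits = []
    for strand, seq in (("+", template), ("-", revcomp(template))):
        for i in range(len(seq) - len(primer) + 1):
            mm = sum(a != b for a, b in zip(seq[i:i + len(primer)], primer))
            if mm <= max_mismatches:
                hits.append((i, strand, mm))
    return hits

template = "ATGCGTACCGGATTACGTACGCAT"
print(binding_sites(template, "ACCGGATT", max_mismatches=1))
```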

  8. Monte Carlo simulations to advance characterisation of landmines by pulsed fast/thermal neutron analysis

    Maucec, M.; Rigollet, C.

    The performance of a detection system based on the pulsed fast/thermal neutron analysis technique was assessed using Monte Carlo simulations. The aim was to develop and implement simulation methods, to support and advance the data analysis techniques of the characteristic gamma-ray spectra,

  9. Coating-substrate-simulations applied to HFQ® forming tools

    Leopold Jürgen

    2015-01-01

    Full Text Available In this paper a comparative analysis of coating-substrate simulations applied to HFQTM forming tools is presented. When using the solution heat treatment, cold die forming and quenching process, known as HFQTM, for forming hardened aluminium alloy automotive panel parts, coating-substrate systems have to satisfy unique requirements. Numerical experiments, based on the Advanced Adaptive FE method, are finally presented.

  10. Scalable space-time adaptive simulation tools for computational electrocardiology

    Krause, Dorian; Krause, Rolf

    2013-01-01

    This work is concerned with the development of computational tools for the solution of reaction-diffusion equations from the field of computational electrocardiology. We designed lightweight spatially and space-time adaptive schemes for large-scale parallel simulations. We propose two different adaptive schemes based on locally structured meshes, managed either via a conforming coarse tessellation or a forest of shallow trees. A crucial ingredient of our approach is a non-conforming morta...

  11. Nucleonica. Web-based software tools for simulation and analysis

    Magill, J.; Dreher, R.; Soti, Z.

    2014-01-01

    The authors present a description of the Nucleonica web-based portal for simulation and analysis for a wide range of commonly encountered nuclear science applications. Advantages of a web-based approach include availability wherever there is internet access, an intuitive user-friendly interface, remote access to high-power computing resources, and continual maintenance, improvement, and addition of tools and techniques common to the nuclear science industry. A description of the nuclear data resources and some applications is given.

  12. Virtual tool for simulation and wind turbine design

    Monteiro Farias, Gustavo; Barros Galhardo, Marcos André; Tavares Pinho, João

    2015-01-01

    This paper presents an educational tool to simulate wind turbines in a virtual environment. This tool can be used for research applications as well as to evaluate the operation conditions of a wind turbine by reproducing its behaviour. The first step is to apply the Blade Element Theory in order to obtain the induction factors when the tip-speed ratio and the airfoil characteristics are configured. With these values as a starting point, the geometric shape of the wind blade is created and visualized. In order to evaluate the performance of the turbine, an integration method is applied, and then the power coefficient curve is plotted versus the tip-speed ratio. The power coefficient curve reaches the maximum value at the rated operation, which is essential to the transient behaviour of the wind turbine. The transient model described in this work shows the influence of all efforts acting on the rotor, which disturb the rotation. The inertial mass of the components and the air density are set up during the simulation. Using the virtual instrumentation applied to the transient model together with a 3D computer animation, the variables of the program can be controlled and visualized in graphics, and the animation of the wind turbine shows when it accelerates or decelerates the shaft rotation due to the variation of the wind speed. The tool provides the power supplied by the wind rotor to the electric generator, which can be evaluated at the end of the simulation. (full text)
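    The link between the induction factor and the power coefficient can be illustrated with the ideal actuator-disk relation that underlies Blade Element Theory; this is a sketch of the physics, not the tool's full BEM implementation:

    ```python
    def power_coefficient(a):
        """Ideal actuator-disk power coefficient Cp = 4a(1-a)^2
        as a function of the axial induction factor a."""
        return 4.0 * a * (1.0 - a) ** 2

    # Scan induction factors from 0 to 0.5; the maximum occurs near
    # a = 1/3, the Betz limit Cp = 16/27 ≈ 0.593.
    best = max((power_coefficient(a / 1000.0), a / 1000.0) for a in range(0, 501))
    print(best)
    ```

    A full BEM code iterates axial and tangential induction factors per blade element against the airfoil lift/drag polars; the actuator-disk curve above is the upper envelope any such design approaches.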

  13. A Data Management System for International Space Station Simulation Tools

    Betts, Bradley J.; DelMundo, Rommel; Elcott, Sharif; McIntosh, Dawn; Niehaus, Brian; Papasin, Richard; Mah, Robert W.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Groups associated with the design, operational, and training aspects of the International Space Station make extensive use of modeling and simulation tools. Users of these tools often need to access and manipulate large quantities of data associated with the station, ranging from design documents to wiring diagrams. Retrieving and manipulating this data directly within the simulation and modeling environment can provide substantial benefit to users. An approach for providing these kinds of data management services, including a database schema and class structure, is presented. Implementation details are also provided as a data management system is integrated into the Intelligent Virtual Station, a modeling and simulation tool developed by the NASA Ames Smart Systems Research Laboratory. One use of the Intelligent Virtual Station is generating station-related training procedures in a virtual environment. The data management component allows users to quickly and easily retrieve information related to objects on the station, enhancing their ability to generate accurate procedures. Users can associate new information with objects and have that information stored in a database.

  14. Tools for advanced simulations to nuclear propulsion systems in rockets

    Torres Sepulveda, A.; Perez Vara, R.

    2004-01-01

    While chemical propulsion rockets have dominated space exploration, other forms of rocket propulsion based on nuclear power, electrostatic and magnetic drive, and other principles besides chemical reactions have been considered from the earliest days of the field. The goal of most of these advanced rocket propulsion schemes is improved efficiency through higher exhaust velocities, in order to reduce the amount of fuel the rocket vehicle needs to carry, though generally at the expense of high thrust. Nuclear propulsion seems to be the most promising short-term technology to plan realistic interplanetary missions. The development of a nuclear electric propulsion spacecraft will require the development of models to analyse the mission and to understand the interaction between the related subsystems (nuclear reactor, electrical converter, power management and distribution, and electric propulsion) during the different phases of the mission. This paper explores the modelling of a nuclear electric propulsion (NEP) spacecraft type using EcosimPro simulation software. This software is a multi-disciplinary simulation tool with a powerful object-oriented simulation language and state-of-the-art solvers. EcosimPro is the recommended ESA simulation tool for Environmental Control and Life Support Systems (ECLSS) and has been used successfully within the framework of the European activities of the International Space Station programme. Furthermore, propulsion libraries for chemical and electrical propulsion are currently being developed under ESA contracts to establish this tool as a standard in the propulsion community. At present, there is not any workable NEP spacecraft, but a standardized-modular, multi-purpose interplanetary spacecraft for post-2000 missions, called ISC-2000, has been proposed in reference. The simulation model presented in this paper is based on the preliminary designs for this spacecraft. (Author)

  15. Computational tool for simulation of power and refrigeration cycles

    Córdoba Tuta, E.; Reyes Orozco, M.

    2016-07-01

    Small improvements in the thermal efficiency of power cycles bring huge cost savings in the production of electricity; for that reason, a tool for the simulation of power cycles makes it possible to model the optimal changes for best performance. There is also growing research interest in the Organic Rankine Cycle (ORC), which aims to generate electricity at low power through cogeneration and in which the working fluid is usually a refrigerant. A tool to design the elements of an ORC cycle and select the working fluid would be helpful, because heat sources from cogeneration are very different and each case requires a custom design. This work presents the development of multiplatform software for the simulation of power and refrigeration cycles, implemented in C++ with a graphical interface built on the multiplatform Qt environment; it runs on Windows and Linux. The tool allows the design of custom power cycles and the selection of the working fluid (thermodynamic properties are calculated through the CoolProp library), calculates the plant efficiency, identifies the flow fractions in each branch, and finally generates an educational report in PDF format via LaTeX.
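    The plant-efficiency calculation such a tool performs reduces to a first-law balance over the cycle's state points. A minimal sketch follows; the enthalpy values are illustrative placeholders, where a real implementation would obtain them from a property library such as CoolProp:

    ```python
    def thermal_efficiency(h_turbine_in, h_turbine_out, h_pump_in, h_pump_out):
        """First-law thermal efficiency of a simple Rankine cycle,
        eta = w_net / q_in, from the four state-point enthalpies (kJ/kg)."""
        w_turbine = h_turbine_in - h_turbine_out   # specific turbine work
        w_pump = h_pump_out - h_pump_in            # specific pump work
        q_in = h_turbine_in - h_pump_out           # heat added in the boiler
        return (w_turbine - w_pump) / q_in

    # Illustrative enthalpies (kJ/kg), not computed from a property library
    eta = thermal_efficiency(3450.0, 2250.0, 190.0, 200.0)
    print(f"cycle efficiency: {eta:.3f}")
    ```

    Swapping the hard-coded enthalpies for property-library lookups at given pressures and temperatures is exactly where a fluid-selection feature plugs in.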

  16. PWMScan: a fast tool for scanning entire genomes with a position-specific weight matrix.

    Ambrosini, Giovanna; Groux, Romain; Bucher, Philipp

    2018-03-05

    Transcription factors (TFs) regulate gene expression by binding to specific short DNA sequences of 5 to 20 bp, modulating the rate of transcription of genetic information from DNA to messenger RNA. We present PWMScan, a fast web-based tool to scan server-resident genomes for matches to a user-supplied PWM or TF binding site model from a public database. The web server and source code are available at http://ccg.vital-it.ch/pwmscan and https://sourceforge.net/projects/pwmscan, respectively. giovanna.ambrosini@epfl.ch. Supplementary data are available at Bioinformatics online.
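    The core scanning operation can be sketched as a sliding-window log-odds score; the matrix values and threshold below are illustrative, not PWMScan's internals, which operate on indexed server-resident genomes for speed:

    ```python
    # Illustrative position weight matrix (log-odds scores) for a 4-bp motif;
    # one dict per motif position, keyed by nucleotide.
    PWM = [
        {'A': 1.2, 'C': -0.5, 'G': -0.8, 'T': -0.5},
        {'A': -0.7, 'C': 1.0, 'G': -0.6, 'T': -0.4},
        {'A': -0.9, 'C': -0.3, 'G': 1.1, 'T': -0.6},
        {'A': -0.5, 'C': -0.8, 'G': -0.4, 'T': 1.3},
    ]

    def scan(sequence, pwm, threshold):
        """Slide the PWM over the sequence and return (position, score)
        for every window scoring at or above the threshold."""
        width = len(pwm)
        matches = []
        for i in range(len(sequence) - width + 1):
            score = sum(pwm[j][sequence[i + j]] for j in range(width))
            if score >= threshold:
                matches.append((i, score))
        return matches

    print(scan("TTACGTACGTAA", PWM, threshold=3.0))  # two ACGT hits
    ```

    Genome-scale scanners avoid this naive O(genome × motif) loop with lookahead score bounds and precomputed indexes, but the scoring model is the same.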

  17. Rare event simulation for highly dependable systems with fast repairs

    Reijsbergen, D.P.; de Boer, Pieter-Tjerk; Scheinhardt, Willem R.W.; Haverkort, Boudewijn R.H.M.

    2010-01-01

    Stochastic model checking has been used recently to assess, among others, dependability measures for a variety of systems. However, the numerical methods employed, e.g. those supported by model checking tools such as PRISM and MRMC, suffer from the state-space explosion problem. The main alternative

  18. Rare event simulation for highly dependable systems with fast repairs

    Reijsbergen, D.P.; de Boer, Pieter-Tjerk; Scheinhardt, Willem R.W.; Haverkort, Boudewijn R.H.M.

    Probabilistic model checking has been used recently to assess, among others, dependability measures for a variety of systems. However, the numerical methods employed, such as those supported by model checking tools such as PRISM and MRMC, suffer from the state-space explosion problem. The main

  19. A Simple Evacuation Modeling and Simulation Tool for First Responders

    Koch, Daniel B [ORNL; Payne, Patricia W [ORNL

    2015-01-01

    Although modeling and simulation of mass evacuations during a natural or man-made disaster is an on-going and vigorous area of study, tool adoption by front-line first responders is uneven. Some of the factors that account for this situation include cost and complexity of the software. For several years, Oak Ridge National Laboratory has been actively developing the free Incident Management Preparedness and Coordination Toolkit (IMPACT) to address these issues. One of the components of IMPACT is a multi-agent simulation module for area-based and path-based evacuations. The user interface is designed so that anyone familiar with typical computer drawing tools can quickly author a geospatially-correct evacuation visualization suitable for table-top exercises. Since IMPACT is designed for use in the field where network communications may not be available, quick on-site evacuation alternatives can be evaluated to keep pace with a fluid threat situation. Realism is enhanced by incorporating collision avoidance into the simulation. Statistics are gathered as the simulation unfolds, including most importantly time-to-evacuate, to help first responders choose the best course of action.

  20. Fast simulation of the trigger system of the ATLAS detector at LHC

    Epp, B.; Ghete, V.M.; Kuhn, D.; Zhang, Y.J.

    2004-01-01

    The trigger system of the ATLAS detector aims to maximize the physics coverage and to be open to new and possibly unforeseen physics signatures. It is a multi-level system, composed of a hardware trigger at level-1, followed by the high-level trigger (level-2 and event filter). In order to understand its performance, to optimize it and to reduce its total cost, the trigger system requires a detailed simulation which is time- and resource-consuming. An alternative to the full detector simulation is a so-called 'fast simulation', which starts the analysis from particle level and replaces the full detector simulation and the detailed particle tracking with parametrized distributions obtained from the full simulation and/or a simplified detector geometry. The fast simulation offers a less precise description of trigger performance, but it is faster and less resource-consuming. (author)
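    The parametrization idea can be sketched in a few lines: instead of tracking a shower through the detector, the reconstructed quantity is drawn from a distribution whose parameters were fitted to full simulation. The resolution model and its parameters below are illustrative, not ATLAS trigger values:

    ```python
    import random

    def smear_energy(true_energy, rng, stochastic=0.10, constant=0.007):
        """Fast-simulation-style detector response: draw the reconstructed
        energy from a Gaussian whose width follows a simple calorimeter
        resolution model, sigma/E = sqrt((stochastic/sqrt(E))^2 + constant^2),
        with E in GeV.  Parameters are illustrative, not fitted values."""
        rel_sigma = (stochastic ** 2 / true_energy + constant ** 2) ** 0.5
        return rng.gauss(true_energy, rel_sigma * true_energy)

    rng = random.Random(42)
    sample = [smear_energy(50.0, rng) for _ in range(10000)]
    mean = sum(sample) / len(sample)
    print(f"mean reconstructed energy: {mean:.2f} GeV")
    ```

    One random draw replaces millions of tracking steps, which is the source of the speed-up the abstract describes, at the cost of a less precise response description.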

  1. A tool for simulating parallel branch-and-bound methods

    Golubeva, Yana; Orlov, Yury; Posypkin, Mikhail

    2016-01-01

    The Branch-and-Bound method is known as one of the most powerful but very resource-consuming global optimization methods. Parallel and distributed computing can efficiently cope with this issue. The major difficulty in the parallel B&B method is the need for dynamic load redistribution. Therefore, the design and study of load balancing algorithms is a separate and very important research topic. This paper presents a tool for simulating the parallel Branch-and-Bound method. The simulator allows one to run load balancing algorithms with various numbers of processors, sizes of the search tree, and characteristics of the supercomputer's interconnect, thereby fostering deep study of load distribution strategies. The process of resolution of the optimization problem by the B&B method is replaced by a stochastic branching process. Data exchanges are modeled using the concept of logical time. The user-friendly graphical interface to the simulator provides efficient visualization and convenient performance analysis.
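    The stochastic branching process that stands in for the actual B&B search can be sketched as a toy model; the branching probability and bookkeeping here are illustrative, and the real simulator additionally models logical time and inter-processor data exchange:

    ```python
    import random

    def simulate_branching(rng, branch_prob=0.45, max_nodes=100000):
        """Toy stand-in for a B&B search tree: each processed node either
        branches into two children (with probability branch_prob, modeling
        subdivision) or is pruned (modeling a bound cutoff).  Returns the
        total number of nodes processed."""
        frontier = [0]          # node depths; a real simulator tracks more state
        processed = 0
        while frontier and processed < max_nodes:
            depth = frontier.pop()
            processed += 1
            if rng.random() < branch_prob:
                frontier.extend([depth + 1, depth + 1])
        return processed

    rng = random.Random(7)
    sizes = [simulate_branching(rng) for _ in range(5)]
    print(sizes)  # tree sizes vary run to run: the workload a load balancer must handle
    ```

    Keeping branch_prob below 0.5 makes the process subcritical, so trees stay finite; the run-to-run variance in tree size is precisely what makes dynamic load redistribution hard.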

  2. Monte Carlo Simulation Tool Installation and Operation Guide

    Aguayo Navarrete, Estanislao; Ankney, Austin S.; Berguson, Timothy J.; Kouzes, Richard T.; Orrell, John L.; Troy, Meredith D.; Wiseman, Clinton G.

    2013-09-02

    This document provides information on software and procedures for Monte Carlo simulations based on the Geant4 toolkit, the ROOT data analysis software and the CRY cosmic ray library. These tools have been chosen for their application to shield design and activation studies as part of the simulation task for the Majorana Collaboration. This document includes instructions for installation, operation and modification of the simulation code in a high cyber-security computing environment, such as the Pacific Northwest National Laboratory network. It is intended as a living document, and will be periodically updated. It is a starting point for information collection by an experimenter, and is not the definitive source. Users should consult with one of the authors for guidance on how to find the most current information for their needs.

  3. A tool for simulating parallel branch-and-bound methods

    Golubeva Yana

    2016-01-01

    Full Text Available The Branch-and-Bound method is known as one of the most powerful but very resource-consuming global optimization methods. Parallel and distributed computing can efficiently cope with this issue. The major difficulty in the parallel B&B method is the need for dynamic load redistribution. Therefore, the design and study of load balancing algorithms is a separate and very important research topic. This paper presents a tool for simulating the parallel Branch-and-Bound method. The simulator allows one to run load balancing algorithms with various numbers of processors, sizes of the search tree, and characteristics of the supercomputer's interconnect, thereby fostering deep study of load distribution strategies. The process of resolution of the optimization problem by the B&B method is replaced by a stochastic branching process. Data exchanges are modeled using the concept of logical time. The user-friendly graphical interface to the simulator provides efficient visualization and convenient performance analysis.

  4. Assessment of COTS IR image simulation tools for ATR development

    Seidel, Heiko; Stahl, Christoph; Bjerkeli, Frode; Skaaren-Fystro, Paal

    2005-05-01

    Following the tendency of increased use of imaging sensors in military aircraft, future fighter pilots will need onboard artificial intelligence, e.g. ATR, for aiding them in image interpretation and target designation. The European Aeronautic Defence and Space Company (EADS) in Germany has developed an advanced method for automatic target recognition (ATR) which is based on adaptive neural networks. This ATR method can assist the crew of military aircraft like the Eurofighter in sensor image monitoring and thereby reduce the workload in the cockpit and increase mission efficiency. The EADS ATR approach can be adapted for imagery of visual, infrared and SAR sensors because of the training-based classifiers of the ATR method. For the optimal adaptation of these classifiers, they have to be trained with appropriate and sufficient image data. The training images must show the target objects from different aspect angles, ranges, environmental conditions, etc. Incomplete training sets lead to a degradation of classifier performance. Additionally, ground truth information, i.e. scenario conditions like class type and position of targets, is necessary for the optimal adaptation of the ATR method. In summer 2003, EADS started a cooperation with Kongsberg Defence & Aerospace (KDA) of Norway. The EADS/KDA approach is to provide additional image data sets for training-based ATR through IR image simulation. The joint study aims to investigate the benefits of enhancing incomplete training sets for classifier adaptation with simulated synthetic imagery. EADS/KDA identified the requirements of a commercial-off-the-shelf IR simulation tool capable of delivering appropriate synthetic imagery for ATR development. A market study of available IR simulation tools and suppliers was performed. After that, the most promising tool was benchmarked according to several criteria, e.g. thermal emission model, sensor model, target model, non-radiometric image features, etc., resulting in a

  5. Fast 2D Simulation of Superconductors: a Multiscale Approach

    Rodriguez Zermeno, Victor Manuel; Sørensen, Mads Peter; Pedersen, Niels Falsig

    2009-01-01

    This work presents a method to calculate AC losses in thin conductors such as the commercially available second generation superconducting wires through a multiscale meshing technique. The main idea is to use large aspect ratio elements to accurately simulate thin material layers. For a single thin...

  6. Efficiency optimization of a fast Poisson solver in beam dynamics simulation

    Zheng, Dawei; Pöplau, Gisela; van Rienen, Ursula

    2016-01-01

    Calculating the solution of Poisson's equation for the space charge force is still the dominant time cost in beam dynamics simulations and calls for further improvement. In this paper, we summarize a classical fast Poisson solver in beam dynamics simulations: the integrated Green's function method. We introduce three optimization steps for the classical Poisson solver routine: using the reduced integrated Green's function instead of the integrated Green's function; using the discrete cosine transform instead of the discrete Fourier transform for the Green's function; and using a novel fast convolution routine instead of an explicitly zero-padded convolution. The new Poisson solver routine preserves the advantages of fast computation and high accuracy. This provides a fast routine for high-performance calculation of the space charge effect in accelerators.
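    The baseline the third optimization improves on, i.e. the explicitly zero-padded FFT convolution of a charge density with a Green's function, can be sketched in NumPy. The arrays below are toys, not an actual integrated Green's function; the point is that zero-padding to length len(f)+len(g)-1 makes the circular (FFT) convolution equal the linear one:

    ```python
    import numpy as np

    def fft_convolve(f, g):
        """Linear convolution of two 1-D arrays via FFT: zero-pad both to
        len(f)+len(g)-1 so the circular convolution computed by the FFT
        equals the desired open-boundary linear convolution."""
        n = len(f) + len(g) - 1
        F = np.fft.rfft(f, n)   # rfft's second argument zero-pads to length n
        G = np.fft.rfft(g, n)
        return np.fft.irfft(F * G, n)

    rho = np.array([0.0, 1.0, 2.0, 1.0, 0.0])      # toy charge density
    green = np.array([0.25, 0.5, 1.0, 0.5, 0.25])  # toy Green's function samples
    phi = fft_convolve(rho, green)
    print(np.allclose(phi, np.convolve(rho, green)))  # True
    ```

    The same idea extends to 2-D and 3-D grids; the O(N log N) FFT is what makes Green's-function space-charge solvers fast compared with O(N^2) direct summation.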

  7. Fast sampling algorithm for the simulation of photon Compton scattering

    Brusa, D.; Salvat, F.

    1996-01-01

    A simple algorithm for the simulation of Compton interactions of unpolarized photons is described. The energy and direction of the scattered photon, as well as the active atomic electron shell, are sampled from the double-differential cross section obtained by Ribberfors from the relativistic impulse approximation. The algorithm consistently accounts for Doppler broadening and electron binding effects. Simplifications of Ribberfors' formula, required for efficient random sampling, are discussed. The algorithm involves a combination of inverse transform, composition and rejection methods. A parameterization of the Compton profile is proposed from which the simulation of Compton events can be performed analytically in terms of a few parameters that characterize the target atom, namely shell ionization energies, occupation numbers and maximum values of the one-electron Compton profiles. (orig.)
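    The rejection component of such a sampling scheme can be illustrated generically; the target density below is illustrative, not Ribberfors' double-differential cross section:

    ```python
    import random

    def rejection_sample(pdf, pdf_max, lo, hi, rng):
        """Sample from an (unnormalized) density on [lo, hi] by drawing
        uniform candidates and accepting each with probability pdf(x)/pdf_max."""
        while True:
            x = rng.uniform(lo, hi)
            if rng.random() * pdf_max <= pdf(x):
                return x

    # Illustrative target: p(x) proportional to x**2 on [0, 1]
    # (normalized density 3x^2, mean 3/4).
    rng = random.Random(1)
    samples = [rejection_sample(lambda x: 1.5 * x * x, 1.5, 0.0, 1.0, rng)
               for _ in range(20000)]
    print(sum(samples) / len(samples))  # close to 0.75
    ```

    In practice one combines this with inverse-transform and composition steps, as the abstract describes, so that the rejection loop only handles the part of the density with no analytic inverse.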

  8. Development and validation of a simulation tool dedicated to eddy current non destructive testing of tubes

    Reboud, Ch.

    2006-09-01

    The eddy current testing (ECT) technique is widely used in industrial fields such as the iron and steel industry. Dedicated simulation tools provide great assistance for the optimisation of ECT processes. CEA and the Vallourec Research Center have collaborated in order to develop a simulation tool for ECT of tubes. The volume integral method has been chosen for the resolution of Maxwell's equations in a stratified medium, in order to get accurate results with a computation time short enough to carry out optimisation or inversion procedures. A fast model has been developed for the simulation of ECT of non-magnetic tubes using specific external probes. New flaw geometries have been modelled: holes and notches with flat bottoms. Validations of the developments, which have been integrated into the CIVA platform, have been carried out using experimental data recorded in laboratory conditions and in industrial conditions, successively. The integral equations derived are solved using the Galerkin variant of the method of moments with pulse functions as projection functions. In order to overcome some memory limitations, other projection functions have been considered. A new discretization scheme based on non-uniform B-splines of degree 1 or 2 has been implemented, which constitutes an original contribution to the existing literature. The decrease of the mesh size needed to get a given accuracy on the result may lead to the simulation of more complex ECT configurations. (author)

  9. A fast mollified impulse method for biomolecular atomistic simulations

    Fath, L., E-mail: lukas.fath@kit.edu [Institute for App. and Num. Mathematics, Karlsruhe Institute of Technology (Germany); Hochbruck, M., E-mail: marlis.hochbruck@kit.edu [Institute for App. and Num. Mathematics, Karlsruhe Institute of Technology (Germany); Singh, C.V., E-mail: chandraveer.singh@utoronto.ca [Department of Materials Science & Engineering, University of Toronto (Canada)

    2017-03-15

    Classical integration methods for molecular dynamics are inherently limited due to resonance phenomena occurring at certain time-step sizes. The mollified impulse method can partially avoid this problem by using appropriate filters based on averaging or projection techniques. However, existing filters are computationally expensive and tedious to implement, since they require either analytical Hessians or they need to solve nonlinear systems from constraints. In this work we follow a different approach based on corotation for the construction of a new filter for (flexible) biomolecular simulations. The main advantages of the proposed filter are its excellent stability properties and ease of implementation in standard software without Hessians or solving constraint systems. By simulating multiple realistic examples such as peptide, protein, ice equilibrium and ice–ice friction, the new filter is shown to speed up the computations of long-range interactions by approximately 20%. The proposed filtered integrators allow step sizes as large as 10 fs while keeping the energy drift less than 1% on a 50 ps simulation.

  10. Fast Monte Carlo for ion beam analysis simulations

    Schiettekatte, Francois

    2008-01-01

    A Monte Carlo program for the simulation of ion beam analysis data is presented. It combines mainly four features: (i) ion slowdown is computed separately from the main scattering/recoil event, which is directed towards the detector. (ii) A virtual detector, that is, a detector larger than the actual one, can be used, followed by trajectory correction. (iii) For each collision during ion slowdown, scattering angle components are extracted from tables. (iv) Tables of scattering angle components, stopping power and energy straggling are indexed using the binary representation of floating point numbers, which allows logarithmic distribution of these tables without the computation of logarithms to access them. Tables are sufficiently fine-grained that interpolation is not necessary. Ion slowdown computation thus avoids trigonometric, inverse and transcendental function calls and, as much as possible, divisions. All these improvements make possible the computation of 10^7 collisions/s on current PCs. Results for transmitted ions of several masses in various substrates compare well with those obtained using SRIM-2006 in terms of both angular and energy distributions, as long as a sufficiently large number of collisions is considered for each ion. Examples of simulated spectra show good agreement with experimental data, although a large detector rather than the virtual detector has to be used to properly simulate background signals that are due to plural collisions. The program, written in standard C, is open-source and distributed under the terms of the GNU General Public License
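    The table-indexing trick in item (iv) can be sketched as follows. The bit layout is standard IEEE-754 single precision; the number of mantissa bits kept per index is an illustrative choice:

    ```python
    import struct

    def float_bits(x):
        """Raw IEEE-754 single-precision bit pattern of x."""
        return struct.unpack('<I', struct.pack('<f', x))[0]

    def log_index(x, mantissa_bits_kept=3):
        """Index into a logarithmically spaced table without calling log():
        keep the 8 exponent bits plus the top few mantissa bits, so each
        power-of-two interval is split into 2**mantissa_bits_kept bins."""
        return float_bits(x) >> (23 - mantissa_bits_kept)

    # Doubling the argument always advances the index by exactly one
    # exponent step, i.e. 2**mantissa_bits_kept = 8 bins:
    print(log_index(2.0) - log_index(1.0))      # 8
    print(log_index(200.0) - log_index(100.0))  # 8
    ```

    Because the exponent field of a float already encodes floor(log2(x)), a shift and mask replace a transcendental call, which is exactly the saving the abstract describes.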

  11. A Fast Visible-Infrared Imaging Radiometer Suite Simulator for Cloudy Atmospheres

    Liu, Chao; Yang, Ping; Nasiri, Shaima L.; Platnick, Steven; Meyer, Kerry G.; Wang, Chen Xi; Ding, Shouguo

    2015-01-01

    A fast instrument simulator is developed to simulate the observations made in cloudy atmospheres by the Visible Infrared Imaging Radiometer Suite (VIIRS). The correlated k-distribution (CKD) technique is used to compute the transmissivity of absorbing atmospheric gases. The bulk scattering properties of ice clouds used in this study are based on the ice model used for the MODIS Collection 6 ice cloud products. Two fast radiative transfer models based on pre-computed ice cloud look-up tables are used for the VIIRS solar and infrared channels. The accuracy and efficiency of the fast simulator are quantified in comparison with a combination of the rigorous line-by-line (LBLRTM) and discrete ordinate radiative transfer (DISORT) models. Relative errors are less than 2% for simulated TOA reflectances for the solar channels, and the brightness temperature differences for the infrared channels are less than 0.2 K. The simulator is over three orders of magnitude faster than the benchmark LBLRTM+DISORT model. Furthermore, the cloudy atmosphere reflectances and brightness temperatures from the fast VIIRS simulator compare favorably with those from VIIRS observations.

  12. Integrated fast ignition simulation of cone-guided target with three codes

    Sakagami, H. [Hyogo Univ., Computer Engineering, Himeji, Hyogo (Japan); Johzaki, T.; Nagatomo, H.; Mima, K. [Osaka Univ., Institute of Laser Engineering, Suita, Osaka (Japan)

    2004-07-01

    It was reported that the fuel core was heated up to ~0.8 keV in the fast ignition experiments with cone-guided targets, but the heating mechanisms and the achievement of such high temperature could not be explained theoretically. Thus simulations should play an important role in estimating the scheme performance, and we must simulate each phenomenon with individual codes and integrate them under the Fast Ignition Integrated Interconnecting code project. In the previous integrated simulations, fast electrons generated by the laser-plasma interaction were too hot to efficiently heat the core and we got only a 0.096 keV temperature rise. Including the density gap at the contact surface between the cone tip and the imploded plasma, the period of core heating became longer and the core was heated by 0.162 keV, an increment about 69% higher than when the density gap effect is ignored. (authors)

  13. Kinetic-magnetohydrodynamic simulation study of fast ions and toroidal Alfven eigenmodes

    Todo, Y.; Sato, T.

    2001-01-01

    Particle-magnetohydrodynamic and Fokker-Planck-magnetohydrodynamic simulations of fast ions and toroidicity-induced Alfven eigenmodes (TAE modes) have been carried out. Alpha particle losses induced by TAE mode are investigated with particle-magnetohydrodynamic simulations. Trapped particles near the passing-trapped boundary in the phase space are also lost appreciably in addition to the counter-passing particles. In Fokker-Planck-magnetohydrodynamic simulation source and slowing-down of fast ions are considered. A coherent pulsating behavior of multiple TAE modes, which occurs in neutral beam injection experiments, is observed when the slowing-down time is much longer than the damping time of the TAE modes and the fast-ion pressure is sufficiently high. For a slowing-down time comparable to the damping time, the TAE modes reach steady saturation levels. (author)

  14. Kinetic-magnetohydrodynamic simulation study of fast ions and toroidal Alfven eigenmodes

    Todo, Y.; Sato, T.

    1999-01-01

    Particle-magnetohydrodynamic and Fokker-Planck-magnetohydrodynamic simulations of fast ions and toroidicity-induced Alfven eigenmodes (TAE modes) have been carried out. Alpha particle losses induced by TAE mode are investigated with particle-magnetohydrodynamic simulations. Trapped particles near the passing-trapped boundary in the phase space are also lost appreciably in addition to the counter-passing particles. In Fokker-Planck-magnetohydrodynamic simulation source and slowing-down of fast ions are considered. A coherent pulsating behavior of multiple TAE modes, which occurs in neutral beam injection experiments, is observed when the slowing-down time is much longer than the damping time of the TAE modes and the fast-ion pressure is sufficiently high. For a slowing-down time comparable to the damping time, the TAE modes reach steady saturation levels. (author)

  15. Quantifying the Effect of Fast Charger Deployments on Electric Vehicle Utility and Travel Patterns via Advanced Simulation: Preprint

    Wood, E.; Neubauer, J.; Burton, E.

    2015-02-01

    The disparate characteristics between conventional (CVs) and battery electric vehicles (BEVs) in terms of driving range, refill/recharge time, and availability of refuel/recharge infrastructure inherently limit the relative utility of BEVs when benchmarked against traditional driver travel patterns. However, given a high penetration of high-power public charging combined with driver tolerance for rerouting travel to facilitate charging on long-distance trips, the difference in utility between CVs and BEVs could be marginalized. We quantify the relationships between BEV utility, the deployment of fast chargers, and driver tolerance for rerouting travel and extending travel durations by simulating BEVs operated over real-world travel patterns using the National Renewable Energy Laboratory's Battery Lifetime Analysis and Simulation Tool for Vehicles (BLAST-V). With support from the U.S. Department of Energy's Vehicle Technologies Office, BLAST-V has been developed to include algorithms for estimating the available range of BEVs prior to the start of trips, for rerouting baseline travel to utilize public charging infrastructure when necessary, and for making driver travel decisions for those trips in the presence of available public charging infrastructure, all while conducting advanced vehicle simulations that account for battery electrical, thermal, and degradation response. Results from BLAST-V simulations on vehicle utility, frequency of inserted stops, duration of charging events, and additional time and distance necessary for rerouting travel are presented to illustrate how BEV utility and travel patterns can be affected by various fast charge deployments.

  16. Visualization in simulation tools: requirements and a tool specification to support the teaching of dynamic biological processes.

    Jørgensen, Katarina M; Haddow, Pauline C

    2011-08-01

    Simulation tools are playing an increasingly important role behind advances in the field of systems biology. However, the current generation of biological science students has either little or no experience with such tools. As such, this educational glitch is limiting both the potential use of such tools as well as the potential for tighter cooperation between the designers and users. Although some simulation tool producers encourage their use in teaching, little attempt has hitherto been made to analyze and discuss their suitability as an educational tool for noncomputing science students. In general, today's simulation tools assume that the user has a stronger mathematical and computing background than that which is found in most biological science curricula, thus making the introduction of such tools a considerable pedagogical challenge. This paper provides an evaluation of the pedagogical attributes of existing simulation tools for cell signal transduction based on Cognitive Load theory. Further, design recommendations for an improved educational simulation tool are provided. The study is based on simulation tools for cell signal transduction. However, the discussions are relevant to a broader biological simulation tool set.

  17. The denitration of simulated fast reactor highly active liquor waste

    Saum, C.J.; Ford, L.H.; Blatts, N.

    1981-01-01

    A short series of tests has been made with simulated HAL containing representative concentrations of palladium and phosphate ion. The information obtained has been confirmed in a small-scale continuous denitration plant, which consisted of four stirred-pot reactors arranged in cascade. One possible advantage of this plant would be the low mean acidity in the first stage compared to the feed material, which would limit to some extent the violence of the reaction. This would lead to a lower rate of gas evolution and may permit operation even with liquors where foaming is a problem. (DG)

  18. PeneloPET, a Monte Carlo PET simulation tool based on PENELOPE: features and validation

    Espana, S; Herraiz, J L; Vicente, E; Udias, J M [Grupo de Fisica Nuclear, Departamento de Fisica Atomica, Molecular y Nuclear, Universidad Complutense de Madrid, Madrid (Spain); Vaquero, J J; Desco, M [Unidad de Medicina y Cirugia Experimental, Hospital General Universitario Gregorio Maranon, Madrid (Spain)], E-mail: jose@nuc2.fis.ucm.es

    2009-03-21

    Monte Carlo simulations play an important role in positron emission tomography (PET) imaging, as an essential tool for the research and development of new scanners and for advanced image reconstruction. PeneloPET, a PET-dedicated Monte Carlo tool, is presented and validated in this work. PeneloPET is based on PENELOPE, a Monte Carlo code for the simulation of the transport in matter of electrons, positrons and photons, with energies from a few hundred eV to 1 GeV. PENELOPE is robust, fast and very accurate, but it may be unfriendly to people not acquainted with the FORTRAN programming language. PeneloPET is an easy-to-use application which allows comprehensive simulations of PET systems within PENELOPE. Complex and realistic simulations can be set by modifying a few simple input text files. Different levels of output data are available for analysis, from sinogram and lines-of-response (LORs) histogramming to fully detailed list mode. These data can be further exploited with the preferred programming language, including ROOT. PeneloPET simulates PET systems based on crystal array blocks coupled to photodetectors and allows the user to define radioactive sources, detectors, shielding and other parts of the scanner. The acquisition chain is simulated in high level detail; for instance, the electronic processing can include pile-up rejection mechanisms and time stamping of events, if desired. This paper describes PeneloPET and shows the results of extensive validations and comparisons of simulations against real measurements from commercial acquisition systems. PeneloPET is being extensively employed to improve the image quality of commercial PET systems and for the development of new ones.

  19. PeneloPET, a Monte Carlo PET simulation tool based on PENELOPE: features and validation

    Espana, S; Herraiz, J L; Vicente, E; Udias, J M; Vaquero, J J; Desco, M

    2009-01-01

    Monte Carlo simulations play an important role in positron emission tomography (PET) imaging, as an essential tool for the research and development of new scanners and for advanced image reconstruction. PeneloPET, a PET-dedicated Monte Carlo tool, is presented and validated in this work. PeneloPET is based on PENELOPE, a Monte Carlo code for the simulation of the transport in matter of electrons, positrons and photons, with energies from a few hundred eV to 1 GeV. PENELOPE is robust, fast and very accurate, but it may be unfriendly to people not acquainted with the FORTRAN programming language. PeneloPET is an easy-to-use application which allows comprehensive simulations of PET systems within PENELOPE. Complex and realistic simulations can be set by modifying a few simple input text files. Different levels of output data are available for analysis, from sinogram and lines-of-response (LORs) histogramming to fully detailed list mode. These data can be further exploited with the preferred programming language, including ROOT. PeneloPET simulates PET systems based on crystal array blocks coupled to photodetectors and allows the user to define radioactive sources, detectors, shielding and other parts of the scanner. The acquisition chain is simulated in high level detail; for instance, the electronic processing can include pile-up rejection mechanisms and time stamping of events, if desired. This paper describes PeneloPET and shows the results of extensive validations and comparisons of simulations against real measurements from commercial acquisition systems. PeneloPET is being extensively employed to improve the image quality of commercial PET systems and for the development of new ones.

  20. Fish habitat simulation models and integrated assessment tools

    Harby, A.; Alfredsen, K.

    1999-01-01

    Because of human development, water use is increasing in importance, and this worldwide trend is leading to a growing number of user conflicts and a strong need for assessment tools to measure the impacts both on the ecosystem and on the different users and user groups. Quantitative tools must allow a comparison of alternatives, user groups, etc., and they must be integrated, since impact assessments involve different disciplines. Fish species, especially young fish, are indicators of the environmental state of a riverine system, and monitoring them is a way to follow environmental changes. The direct and indirect impacts on the ecosystem itself are measured; impacts on user groups are not included. The paper concentrates on fish habitat simulation models, with methods and examples drawn from Norway, and includes some ideas on integrated modelling tools for impact assessment studies. One-dimensional hydraulic models are rapidly calibrated and do not require any expert knowledge in hydraulics. Two- and three-dimensional models require somewhat more skilled users, especially if the topography is very heterogeneous. The advantages of two- and three-dimensional models include: they do not need any calibration, just validation; they are predictive; and they can be more cost effective than traditional habitat hydraulic models when combined with modern data acquisition systems and tailored to a multi-disciplinary study. The choice of a suitable model should be based on the available data and possible data acquisition; the available manpower, computer, and software resources; and the required output and its accuracy. 58 refs

  1. Colossal Tooling Design: 3D Simulation for Ergonomic Analysis

    Hunter, Steve L.; Dischinger, Charles; Thomas, Robert E.; Babai, Majid

    2003-01-01

    The application of high-level 3D simulation software to the design phase of colossal mandrel tooling for composite aerospace fuel tanks was accomplished to discover and resolve safety and human engineering problems. The analyses were conducted to determine safety, ergonomic and human engineering aspects of the disassembly process of the fuel tank composite shell mandrel. Three-dimensional graphics high-level software, incorporating various ergonomic analysis algorithms, was utilized to determine if the process was within safety and health boundaries for the workers carrying out these tasks. In addition, the graphical software was extremely helpful in the identification of material handling equipment and devices for the mandrel tooling assembly/disassembly process.

  2. Hypersonic Control Modeling and Simulation Tool for Lifting Towed Ballutes, Phase I

    National Aeronautics and Space Administration — Global Aerospace Corporation proposes to develop a hypersonic control modeling and simulation tool for hypersonic aeroassist vehicles. Our control and simulation...

  3. FAST

    Zuidmeer-Jongejan, Laurian; Fernandez-Rivas, Montserrat; Poulsen, Lars K.

    2012-01-01

    ABSTRACT: The FAST project (Food Allergy Specific Immunotherapy) aims at the development of safe and effective treatment of food allergies, targeting prevalent, persistent and severe allergy to fish and peach. Classical allergen-specific immunotherapy (SIT), using subcutaneous injections with aqu...

  4. Development and Integration of the CT-PPS Fast Simulation in the CMS Software

    Fonseca De Souza, Sandro

    2017-01-01

    CT-PPS (CMS-TOTEM Precision Proton Spectrometer) is a joint project of the CMS and TOTEM collaborations with the goal of studying central exclusive production (CEP) in proton-proton collisions. A simplified simulation and reconstruction code for CT-PPS has been implemented in the CMS fast simulation package FastSim. Protons scattered at very low polar angles are propagated along the LHC beamlines from the generated vertex to the detectors by means of the beam transport package Hector. The rec...

  5. Patient Simulation: A Literary Synthesis of Assessment Tools in Anesthesiology

    Alice A. Edler

    2009-12-01

    High-fidelity patient simulation (HFPS) has been hypothesized as a modality for assessing competency in knowledge and skill, but uniform methods for HFPS performance assessment (PA) have not yet been fully achieved. Anesthesiology founded the HFPS discipline and also leads in its PA. This project reviews the types, quality, and designated purpose of HFPS PA tools in anesthesiology. Using the systematic review method, we reviewed anesthesiology literature referenced in PubMed to assess the quality and reliability of available PA tools in HFPS. Of 412 articles identified, 50 met our inclusion criteria. Seventy-seven percent of the studies have been published since 2000, and more recent studies demonstrated higher quality. Investigators reported a variety of test construction and validation methods. The most commonly reported test construction methods included "modified Delphi techniques" for item selection, reliability measurement using inter-rater agreement, and intra-class correlations between test items or subtests. Modern test theory, in particular generalizability theory, was used in nine (18%) of the studies. Test score validity has been addressed in multiple investigations, which have shown a significant improvement in reporting accuracy; however, assessment of predictive validity has been low across the majority of studies. The usability and practicality of testing occasions and tools was only anecdotally reported. To comply more completely with the gold standards for PA design, both the shared experience of experts and recognition of test construction standards are required, including reliability and validity measurements, instrument piloting, rater training, and explicit identification of the purpose and proposed use of the assessment tool.

  6. Eddy current NDE performance demonstrations using simulation tools

    Maurice, L.; Costan, V.; Guillot, E.; Thomas, P.

    2013-01-01

    To carry out performance demonstrations of the eddy-current NDE processes applied on French nuclear power plants, EDF is studying the possibility of using simulation tools as an alternative to measurements on steam generator tube mock-ups. This paper focuses on the strategy followed by EDF to assess and use the codes Carmel3D and Civa on the case of eddy-current NDE of wear problems that may appear in the U-shape region of steam generator tubes due to the rubbing of anti-vibration bars.

  7. Integrated simulation tools for collimation cleaning in HL-LHC

    Bruce, R; Cerutti, F; Ferrari, A; Lechner, A; Marsili, A; Mirarchi, D; Ortega, P G; Redaelli, S; Rossi, A; Salvachua, B; Sinuela, D P; Tambasco, C; Vlachoudis, V; Mereghetti, A; Assmann, R; Lari, L; Gibson, S M; Nevay, LJ; Appleby, R B; Molson, J; Serluca, M; Barlow, R J; Rafique, H; Toader, A

    2014-01-01

    The Large Hadron Collider is designed to accommodate an unprecedented stored beam energy of 362 MJ in the nominal configuration, and about double that in the high-luminosity upgrade HL-LHC presently under study. This requires an efficient collimation system to protect the superconducting magnets from quenches. During the design, it is therefore very important to accurately predict the expected beam loss distributions and cleaning efficiency. For this purpose, there are several ongoing efforts to improve the existing simulation tools or to develop new ones. This paper gives a brief overview and status of the different available codes.

  8. Predicting SPE Fluxes: Coupled Simulations and Analysis Tools

    Gorby, M.; Schwadron, N.; Linker, J.; Caplan, R. M.; Wijaya, J.; Downs, C.; Lionello, R.

    2017-12-01

    Presented here is a nuts-and-bolts look at the coupled framework of Predictive Science Inc.'s Magnetohydrodynamics Around a Sphere (MAS) code and the Energetic Particle Radiation Environment Module (EPREM). MAS-simulated coronal mass ejection output from a variety of events can be selected as the MHD input to EPREM, and a variety of parameters can be set for each run: background seed particle spectra, mean free path, perpendicular diffusion efficiency, etc. A standard set of visualizations is produced, as well as a library of analysis tools for deeper inquiries. All steps will be covered end-to-end, as well as the framework's user interface and availability.

  9. Building Performance Simulation tools for planning of energy efficiency retrofits

    Mondrup, Thomas Fænø; Karlshøj, Jan; Vestergaard, Flemming

    2014-01-01

    Designing energy efficiency retrofits for existing buildings will bring environmental, economic, social, and health benefits. However, selecting specific retrofit strategies is complex and requires careful planning. In this study, we describe a methodology for adopting Building Performance Simulation (BPS) tools as energy and environmentally conscious decision-making aids. The methodology has been developed to screen buildings for potential improvements and to support the development of retrofit strategies. We present a case study of a Danish renovation project, implementing BPS approaches to energy efficiency retrofits in social housing. To generate energy savings, we focus on optimizing the building envelope. We evaluate alternative building envelope actions using procedural solar radiation and daylight simulations. In addition, we identify the digital information flow and the information...

  10. Dynamic wind turbine models in power system simulation tool

    Hansen, Anca D.; Iov, Florin; Sørensen, Poul

    This report presents a collection of models and control strategies developed and implemented in the power system simulation tool PowerFactory DIgSILENT for different wind turbine concepts. It is the second edition of Risø-R-1400(EN), and it gathers and describes a whole wind turbine model database..., connection of the wind turbine to different types of grid and storage systems. Different control strategies have been developed and implemented for these wind turbine concepts, their performance in normal or fault operation being assessed and discussed by means of simulations. The described control... of the interaction between the mechanical structure of the wind turbine and the electrical grid during different operational modes. The report thus provides a description of the wind turbine modelling, both at a component level and at a system level. The report contains both the description of the DIgSILENT built...

  11. Fast Monte Carlo-assisted simulation of cloudy Earth backgrounds

    Adler-Golden, Steven; Richtsmeier, Steven C.; Berk, Alexander; Duff, James W.

    2012-11-01

    A calculation method has been developed for rapidly synthesizing radiometrically accurate ultraviolet through long-wavelength-infrared spectral imagery of the Earth for arbitrary locations and cloud fields. The method combines cloud-free surface reflectance imagery with cloud radiance images calculated from a first-principles 3-D radiation transport model. The MCScene Monte Carlo code [1-4] is used to build a cloud image library; a data fusion method is incorporated to speed convergence. The surface and cloud images are combined with an upper-atmospheric description with the aid of solar and thermal radiation transport equations that account for atmospheric inhomogeneity. The method enables a wide variety of sensor and sun locations, cloud fields, and surfaces to be combined on-the-fly, and provides hyperspectral wavelength resolution with minimal computational effort. The simulations agree very well with much more time-consuming direct Monte Carlo calculations of the same scene.
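The fusion step described above can be caricatured per pixel. This is a schematic sketch only, not MCScene's actual radiometry: a cloud-radiance layer from the precomputed library is composited over a cloud-free surface radiance image, with the surface term attenuated by an assumed per-pixel cloud transmittance.

```python
# Schematic per-pixel fusion (NOT MCScene's full radiometry): the surface
# radiance is attenuated by a cloud transmittance and the cloud's own
# radiance is added on top. All values below are synthetic.

def composite(surface_rad, cloud_rad, cloud_trans):
    """Per-pixel combine: L = T_cloud * L_surface + L_cloud (equal-length lists)."""
    return [t * s + c for s, c, t in zip(surface_rad, cloud_rad, cloud_trans)]

# One spectral band of a 4-pixel scene, synthetic radiances in W/(m^2 sr um).
surface = [10.0, 10.0, 12.0, 12.0]
cloud = [0.0, 4.0, 4.0, 0.0]       # zero where the pixel is cloud-free
trans = [1.0, 0.5, 0.5, 1.0]       # unity transmittance where cloud-free
print(composite(surface, cloud, trans))  # -> [10.0, 9.0, 10.0, 12.0]
```

Because the surface, cloud, and atmosphere terms factor this way, new sensor/sun/cloud combinations can be assembled on-the-fly without rerunning the expensive Monte Carlo step.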

  12. The FTS atomic spectrum tool (FAST) for rapid analysis of line spectra

    Ruffoni, M. P.

    2013-07-01

    The FTS Atomic Spectrum Tool (FAST) is an interactive graphical program designed to simplify the analysis of atomic emission line spectra obtained from Fourier transform spectrometers. Calculated, predicted, and/or known experimental line parameters are loaded alongside experimentally observed spectral line profiles for easy comparison between new experimental data and existing results. Many such line profiles, which may span numerous spectra, can be viewed simultaneously to help the user detect problems from line blending or self-absorption. Once the user has determined that their experimental line profile fits are good, a key feature of FAST is the ability to calculate atomic branching fractions, transition probabilities, and oscillator strengths, and their uncertainties, which is not provided by existing analysis packages. Program summary: Program title: FAST: The FTS Atomic Spectrum Tool. Catalogue identifier: AEOW_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEOW_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: GNU General Public License version 3. No. of lines in distributed program, including test data, etc.: 293058. No. of bytes in distributed program, including test data, etc.: 13809509. Distribution format: tar.gz. Programming language: C++. Computer: Intel x86-based systems. Operating system: Linux/Unix/Windows. RAM: 8 MB minimum; about 50-200 MB for a typical analysis. Classification: 2.2, 2.3, 21.2. Nature of problem: visualisation of atomic line spectra, including the comparison of theoretical line parameters with experimental atomic line profiles; accurate intensity calibration of experimental spectra; and the determination of observed relative line intensities needed for calculating atomic branching fractions and oscillator strengths. Solution method: FAST is centred around a graphical interface, where a user may view sets of experimental line profiles and compare
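The branching-fraction arithmetic that FAST automates is, at its core, simple; a minimal sketch with synthetic intensity values follows (the hard part FAST actually handles is the intensity calibration and uncertainty propagation behind these numbers):

```python
# Core branching-fraction arithmetic (synthetic example intensities; FAST's
# real work is calibration and uncertainty propagation, not this division).

def branching_fractions(intensities):
    """BF_i = I_i / sum(I) over all lines sharing one upper level."""
    total = sum(intensities)
    return [i / total for i in intensities]

def transition_probabilities(intensities, upper_lifetime_s):
    """A_i = BF_i / tau_upper, given the upper level's radiative lifetime."""
    return [bf / upper_lifetime_s for bf in branching_fractions(intensities)]

intensities = [120.0, 60.0, 20.0]      # calibrated relative line intensities
print(branching_fractions(intensities))            # -> [0.6, 0.3, 0.1]
A = transition_probabilities(intensities, 8.0e-9)  # A-values in s^-1
```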

  13. Bronchoscopy Simulation Training as a Tool in Medical School Education.

    Gopal, Mallika; Skobodzinski, Alexus A; Sterbling, Helene M; Rao, Sowmya R; LaChapelle, Christopher; Suzuki, Kei; Litle, Virginia R

    2018-07-01

    Procedural simulation training is rare at the medical school level, and little is known about its usefulness in improving anatomic understanding and procedural confidence in students. Our aim is to assess the impact of bronchoscopy simulation training on bronchial anatomy knowledge and technical skills in medical students. Medical students were recruited by email, consented, and asked to fill out a survey regarding their baseline experience. Two thoracic surgeons measured their knowledge of bronchoscopy on a virtual reality bronchoscopy simulator using the Bronchoscopy Skills and Tasks Assessment Tool (BSTAT), a validated 65-point checklist (46 points for anatomy, 19 for simulation). Students performed four self-directed training sessions of 15 minutes per week. A posttraining survey and BSTAT were completed afterward. Differences between pretraining and posttraining scores were analyzed with paired Student's t tests and random intercept linear regression models accounting for baseline BSTAT score, total training time, and training year. The study was completed by 47 medical students with a mean training time of 81.5 ± 26.8 minutes. Mean total BSTAT score increased significantly, from 12.3 ± 5.9 to 48.0 ± 12.9, whereas training time and frequency of training did not have a significant impact on the level of improvement. Self-driven bronchoscopy simulation training in medical students led to improvements in bronchial anatomy knowledge and bronchoscopy skills. Further investigation is under way to determine the impact of bronchoscopy simulation training on future specialty interest and long-term skills retention.

  14. A new simulation tool for testing smart home recognition algorithms

    Bouchard, K.; Ajroud, A.; Bouchard, B.; Bouzouane, A. [Chicoutimi Univ. du Quebec, Saguenay, PQ (Canada)

    2010-08-13

    Smart home technologies are being actively researched. However, many scientists studying this topic of research do not own smart home infrastructure, and therefore, are unable to conduct proper experiments in a concrete environment with real data. In order to address this problem and to help researchers working in the field of activity recognition, this paper presented a novel and flexible three-dimensional smart home infrastructure simulator called SIMACT developed in Java. The paper discussed the proposed simulator including the software architecture, implementation, and scripting. It also discussed the actual initiative to gather real data that would be incorporated in the simulator as real case pre-recorded scenarios. An overview of related work and future developments were also presented. It was concluded that SIMACT is a user-friendly three-dimensional animated simulator that offers a simple interface that could be easily adapted to individual needs and could become a very useful tool for the community with its XML scripting language, many options, customizable three-dimensional frame and its set of pre-defined real case scenarios. 18 refs., 4 figs.

  15. Virtual environment simulation as a tool to support evacuation planning

    Mol, Antonio C.; Grecco, Claudio H.S.; Santos, Isaac J.A.L.; Carvalho, Paulo V.R.; Jorge, Carlos A.F.; Sales, Douglas S.; Couto, Pedro M.; Botelho, Felipe M.; Bastos, Felipe R.

    2007-01-01

    This work is a preliminary study of the use of a free game engine as a tool to build and navigate virtual environments, with a good degree of realism, for virtual simulations of evacuation from buildings and risk zones. To achieve this goal, some adjustments to the game engine have been implemented. A real building with four floors, consisting of rooms with furniture and people, has been implemented virtually. Simulations of several simple evacuation scenarios have been performed, measuring the total time spent in each case. The measured times have been compared with the corresponding real evacuation times, measured in the real building. The first results demonstrate that the virtual environment built with the free game engine is capable of reproducing the real situation at a satisfactory level. However, it is important to emphasize that such virtual simulations serve only as an aid in the planning of real evacuation exercises, and as such must never substitute for the latter. (author)

  16. Fast 3D Net Expeditions: Tools for Effective Scientific Collaboration on the World Wide Web

    Watson, Val; Chancellor, Marisa K. (Technical Monitor)

    1996-01-01

    Two new technologies, the FASTexpedition and Remote FAST, have been developed that provide remote, 3D (three-dimensional), high-resolution, dynamic, interactive viewing of scientific data. The FASTexpedition permits one to access scientific data from the World Wide Web, take guided expeditions through the data, and continue with self-controlled expeditions through the data. Remote FAST permits collaborators at remote sites to simultaneously view an analysis of scientific data being controlled by one of the collaborators. Control can be transferred between sites. These technologies are now being used for remote collaboration in joint university, industry, and NASA projects. Also, NASA Ames Research Center has initiated a project to make scientific data and guided expeditions through the data available as FASTexpeditions on the World Wide Web for educational purposes. Previously, remote visualization of dynamic data was done in video format (transmitting pixel information), such as video conferencing or MPEG (Motion Picture Expert Group) movies on the Internet. The concept of this new technology is to send the raw data (e.g., grids, vectors, and scalars) along with viewing scripts over the Internet and have the pixels generated by a visualization tool running on the viewer's local workstation. The visualization tool currently used is FAST (Flow Analysis Software Toolkit). The advantages of this new technology over video format are: (1) the visualization is much higher in resolution (1280x1024 pixels with 24 bits of color) than typical video transmitted over the network; (2) the form of the visualization can be controlled interactively (because the viewer is interactively controlling the visualization tool running on his workstation); (3) a rich variety of guided expeditions through the data can be included easily; (4) a capability is provided for other sites to see a visual analysis of one site as the analysis is interactively performed. Control of

  17. Fast Simulation of Large-Scale Floods Based on GPU Parallel Computing

    Qiang Liu; Yi Qin; Guodong Li

    2018-01-01

    Computing speed is a significant issue of large-scale flood simulations for real-time response to disaster prevention and mitigation. Even today, most of the large-scale flood simulations are generally run on supercomputers due to the massive amounts of data and computations necessary. In this work, a two-dimensional shallow water model based on an unstructured Godunov-type finite volume scheme was proposed for flood simulation. To realize a fast simulation of large-scale floods on a personal...
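The scheme family named above (Godunov-type finite volume for the shallow water equations) can be illustrated with a deliberately minimal serial sketch: 1-D, flat bed, first-order Lax-Friedrichs flux, fixed boundary cells. The paper's unstructured 2-D scheme and its GPU parallelization are, of course, far more elaborate; this only shows the shape of one explicit update.

```python
# Minimal 1-D shallow-water finite-volume step (Lax-Friedrichs flux), a serial
# toy standing in for the paper's unstructured 2-D Godunov/GPU scheme.
G = 9.81  # gravity, m/s^2

def lf_step(h, hu, dx, dt):
    """One explicit step on depth h and discharge hu; boundaries held fixed."""
    n = len(h)
    fh = list(hu)                                    # mass flux
    fq = [hu[i] ** 2 / h[i] + 0.5 * G * h[i] ** 2    # momentum flux
          for i in range(n)]
    nh, nq = h[:], hu[:]
    for i in range(1, n - 1):
        nh[i] = 0.5 * (h[i-1] + h[i+1]) - dt / (2 * dx) * (fh[i+1] - fh[i-1])
        nq[i] = 0.5 * (hu[i-1] + hu[i+1]) - dt / (2 * dx) * (fq[i+1] - fq[i-1])
    return nh, nq

# Dam-break initial condition: 2 m of water upstream, 1 m downstream, at rest.
h = [2.0] * 10 + [1.0] * 10
hu = [0.0] * 20
for _ in range(5):           # dt chosen to satisfy the CFL condition (~0.44)
    h, hu = lf_step(h, hu, dx=1.0, dt=0.1)
```

Each cell update reads only its two neighbors, which is exactly the data-locality property that makes this class of scheme map well onto GPU threads.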

  18. Development of modeling tools for pin-by-pin precise reactor simulation

    Ma Yan; Li Shu; Li Gang; Zhang Baoyin; Deng Li; Fu Yuanguang

    2013-01-01

    In order to develop large-scale transport simulation and calculation methods (such as simulation of whole-reactor-core pin-by-pin problems), the Institute of Applied Physics and Computational Mathematics developed the neutron-photon coupled transport code JMCT and the toolkit JCOGIN. Creating physical calculation models easily and efficiently can substantially reduce problem-solving time. Currently, many visual modeling programs have been developed based on different CAD systems. In this article, the design of a visual modeling tool based on field-oriented development is introduced. Considering the features of physical modeling, fast and convenient operation modules were developed. In order to solve the storage and conversion problems of large-scale models, a data structure and conversion algorithm based on a hierarchical geometry tree were designed. The automatic conversion and generation of the physical-model input file for JMCT was realized. Using this modeling tool, the Dayawan reactor whole-core physical model was created, and the transformed file was delivered to JMCT for transport calculation. The results validate the correctness of the visual modeling tool. (authors)
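The hierarchical-geometry-tree idea mentioned above can be sketched briefly. The node and field names here are illustrative inventions, not JMCT's or JCOGIN's actual data structures: a core contains assemblies, assemblies contain pins, and a depth-first traversal "converts" the tree into the flat records an input-file writer would emit.

```python
# Hedged sketch of a hierarchical geometry tree of the kind the abstract
# describes; names and the shape-string format are hypothetical, not JMCT's.
class GeomNode:
    def __init__(self, name, shape=None):
        self.name, self.shape, self.children = name, shape, []

    def add(self, child):
        self.children.append(child)
        return child

def flatten(node, path=""):
    """Depth-first conversion of the tree into flat (path, shape) records,
    the kind of list an input-file generator for a transport code needs."""
    full = f"{path}/{node.name}"
    records = [(full, node.shape)] if node.shape else []
    for c in node.children:
        records.extend(flatten(c, full))
    return records

core = GeomNode("core")
asm = core.add(GeomNode("assembly_01"))
asm.add(GeomNode("pin_001", shape="cylinder r=0.475 h=366"))
print(flatten(core))
# -> [('/core/assembly_01/pin_001', 'cylinder r=0.475 h=366')]
```

Because a whole-core model repeats the same assembly and pin subtrees many times, a tree with shared substructure stays compact even when the flattened pin-by-pin description would be enormous.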

  19. Fast robot kinematics modeling by using a parallel simulator (PSIM)

    El-Gazzar, H.M.; Ayad, N.M.A.

    2002-01-01

    High-speed computers are strongly needed not only for solving scientific and engineering problems, but also for numerous industrial applications. Such applications include computer-aided design, oil exploration, weather prediction, space applications, and the safety of nuclear reactors. The rapid development in VLSI technology makes it possible to implement time-consuming algorithms in real-time situations. Parallel processing approaches can now be used to reduce the processing time for models of very complex mathematical structure, such as the kinematics modeling of a robot manipulator. This system is used to construct and evaluate the performance and cost-effectiveness of several proposed methods to solve the Jacobian algorithm. Parallelism is introduced to the algorithms by using different task allocations and dividing the whole job into subtasks. Detailed analysis is performed and results are obtained for the case of a six-DOF (degree-of-freedom) robot arm (the Stanford Arm). Execution-time comparisons between von Neumann (uniprocessor) and parallel processor architectures, obtained by using the parallel simulator package (PSIM), are presented. The results gained are much in favour of the parallel techniques, with at least fifty-percent improvements. Further studies are needed to determine the most convenient and optimum number of processors.
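The task-splitting idea described above can be sketched with a toy example. This is illustrative only: each column of a manipulator Jacobian (one per joint) is an independent finite-difference task, so the columns can be farmed out concurrently. The kinematics below are a hypothetical 2-joint planar arm, not the six-DOF Stanford Arm studied in the paper.

```python
import math
from concurrent.futures import ThreadPoolExecutor

# Toy illustration of per-column Jacobian parallelism (NOT the paper's PSIM
# setup): forward kinematics of a 2-link planar arm with 1 m links.
def fk(q):
    """End-effector (x, y) for joint angles q = [q0, q1]."""
    return (math.cos(q[0]) + math.cos(q[0] + q[1]),
            math.sin(q[0]) + math.sin(q[0] + q[1]))

def jac_column(q, j, h=1e-6):
    """Finite-difference column d(fk)/d(q_j) -- one independent subtask."""
    qp = list(q)
    qp[j] += h
    x0, y0 = fk(q)
    x1, y1 = fk(qp)
    return [(x1 - x0) / h, (y1 - y0) / h]

q = [0.3, 0.7]
with ThreadPoolExecutor() as pool:           # one task per Jacobian column
    cols = list(pool.map(lambda j: jac_column(q, j), range(len(q))))
J = [[cols[j][i] for j in range(len(q))] for i in range(2)]  # assemble 2x2
```

For a six-DOF arm the same split yields six independent column tasks, which is the granularity at which task-allocation strategies like those evaluated with PSIM come into play.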

  20. Fast robot kinematics modeling by using a parallel simulator (PSIM)

    El-Gazzar, H M; Ayad, N M.A. [Atomic Energy Authority, Reactor Dept., Computer and Control Lab., P.O. Box no 13759 (Egypt)

    2002-09-15

    High-speed computers are strongly needed not only for solving scientific and engineering problems, but also for numerous industrial applications. Such applications include computer-aided design, oil exploration, weather prediction, space applications, and the safety of nuclear reactors. The rapid development in VLSI technology makes it possible to implement time-consuming algorithms in real-time situations. Parallel processing approaches can now be used to reduce the processing time for models of very complex mathematical structure, such as the kinematics modeling of a robot manipulator. This system is used to construct and evaluate the performance and cost-effectiveness of several proposed methods to solve the Jacobian algorithm. Parallelism is introduced to the algorithms by using different task allocations and dividing the whole job into subtasks. Detailed analysis is performed and results are obtained for the case of a six-DOF (degree-of-freedom) robot arm (the Stanford Arm). Execution-time comparisons between von Neumann (uniprocessor) and parallel processor architectures, obtained by using the parallel simulator package (PSIM), are presented. The results gained are much in favour of the parallel techniques, with at least fifty-percent improvements. Further studies are needed to determine the most convenient and optimum number of processors.

  1. A Fixed-Wing Aircraft Simulation Tool for Improving the efficiency of DoD Acquisition

    2015-10-05

    Scott A. Morton and David R...; Oct 2008-Sep 2015. ...simulation tool, CREATE(TM)-AV Helios [12-14], a high-fidelity rotary-wing vehicle simulation tool, and CREATE(TM)-AV DaVinci [15-16], a conceptual through... ...multi-disciplinary fixed-wing virtual aircraft simulation tool incorporating aerodynamics, structural dynamics, kinematics, and kinetics. Kestrel allows

  2. Interactive Media and Simulation Tools for Technical Training

    Gramoll, Kurt

    1997-01-01

    Over the last several years, integration of multiple media sources into a single information system has developed rapidly. It has been found that when sound, graphics, text, animations, and simulations are skillfully integrated, the sum of the parts exceeds the individual parts for effective learning. In addition, simulations can be used to design and understand complex engineering processes. With the recent introduction of many high-level authoring, animation, modeling, and rendering programs for personal computers, significant multimedia programs can be developed by practicing engineers, scientists, and even managers for both training and education. However, even with these new tools, a considerable amount of time is required to produce an interactive multimedia program. The development of both CD-ROM and Web-based programs is discussed, in addition to the use of technically oriented animations. Also examined are various multimedia development tools and how they are used to develop effective engineering education courseware. Demonstrations of actual programs in engineering mechanics are shown.

  3. Simulation of spin dynamics: a tool in MRI system development

    Stoecker, Tony; Vahedipour, Kaveh; Shah, N Jon

    2011-01-01

    Magnetic Resonance Imaging (MRI) is a routine diagnostic tool in the clinics and the method of choice in soft-tissue contrast medical imaging. It is an important tool in neuroscience to investigate structure and function of the living brain on a systemic level. The latter is one of the driving forces to further develop MRI technology, as neuroscience especially demands higher spatiotemporal resolution which is to be achieved through increasing the static main magnetic field, B0. Although standard MRI is a mature technology, ultra-high-field (UHF) systems, at B0 ≥ 7 T, offer space for new technical inventions as the physical conditions dramatically change. This work shows that the development strongly benefits from computer simulations of the measurement process on the basis of a semi-classical, nuclear spin-1/2 treatment given by the Bloch equations. Possible applications of such simulations are outlined, suggesting new solutions to the UHF-specific inhomogeneity problems of the static main field as well as the high-frequency transmit field.
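    The Bloch-equation treatment the record refers to can be sketched numerically with a simple operator-splitting step (exact precession about the local field, then T1/T2 relaxation). This is a minimal Python illustration, not the authors' simulator; the field strength and relaxation times below are arbitrary values chosen for the demo:

```python
import numpy as np

GAMMA = 2 * np.pi * 42.58e6  # proton gyromagnetic ratio, rad/s/T

def bloch_step(M, B, dt, T1, T2, M0=1.0):
    """One operator-splitting step of the Bloch equations:
    exact rotation of M about B, followed by T1/T2 relaxation."""
    b = np.linalg.norm(B)
    if b > 0:
        axis = B / b
        theta = GAMMA * b * dt
        # Rodrigues' rotation formula (exact precession)
        M = (M * np.cos(theta)
             + np.cross(axis, M) * np.sin(theta)
             + axis * np.dot(axis, M) * (1 - np.cos(theta)))
    # Relaxation toward equilibrium magnetization M0 along z
    E1, E2 = np.exp(-dt / T1), np.exp(-dt / T2)
    return np.array([M[0] * E2, M[1] * E2, M0 + (M[2] - M0) * E1])

# Free precession of transverse magnetization in a weak z-field
M = np.array([1.0, 0.0, 0.0])
for _ in range(1000):
    M = bloch_step(M, np.array([0.0, 0.0, 1e-6]), 1e-6, T1=1.0, T2=0.1)
```

Repeating such steps over a map of the static field would reproduce the kind of B0-inhomogeneity study the abstract suggests.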

  4. Online Analytical Processing (OLAP): A Fast and Effective Data Mining Tool for Gene Expression Databases

    Alkharouf Nadim W.

    2005-01-01

    Gene expression databases contain a wealth of information, but current data mining tools are limited in their speed and effectiveness in extracting meaningful biological knowledge from them. Online analytical processing (OLAP) can be used as a supplement to cluster analysis for fast and effective data mining of gene expression databases. We used Analysis Services 2000, a product that ships with SQLServer2000, to construct an OLAP cube that was used to mine a time series experiment designed to identify genes associated with resistance of soybean to the soybean cyst nematode, a devastating pest of soybean. The data for these experiments is stored in the soybean genomics and microarray database (SGMD). A number of candidate resistance genes and pathways were found. Compared to traditional cluster analysis of gene expression data, OLAP was more effective and faster in finding biologically meaningful information. OLAP is available from a number of vendors and can work with any relational database management system through OLE DB.

  5. Nearest-cell: a fast and easy tool for locating crystal matches in the PDB

    Ramraj, V.; Evans, G.; Diprose, J. M.; Esnouf, R. M.

    2012-01-01

    A fast and easy tool to locate unit-cell matches in the PDB is described. When embarking upon X-ray diffraction data collection from a potentially novel macromolecular crystal form, it can be useful to ascertain whether the measured data reflect a crystal form that is already recorded in the Protein Data Bank and, if so, whether it is part of a large family of related structures. Providing such information to crystallographers conveniently and quickly, as soon as the first images have been recorded and the unit cell characterized at an X-ray beamline, has the potential to save time and effort as well as pointing to possible search models for molecular replacement. Given an input unit cell, and optionally a space group, Nearest-cell rapidly scans the Protein Data Bank and retrieves near-matches

  6. Online analytical processing (OLAP): a fast and effective data mining tool for gene expression databases.

    Alkharouf, Nadim W; Jamison, D Curtis; Matthews, Benjamin F

    2005-06-30

    Gene expression databases contain a wealth of information, but current data mining tools are limited in their speed and effectiveness in extracting meaningful biological knowledge from them. Online analytical processing (OLAP) can be used as a supplement to cluster analysis for fast and effective data mining of gene expression databases. We used Analysis Services 2000, a product that ships with SQLServer2000, to construct an OLAP cube that was used to mine a time series experiment designed to identify genes associated with resistance of soybean to the soybean cyst nematode, a devastating pest of soybean. The data for these experiments is stored in the soybean genomics and microarray database (SGMD). A number of candidate resistance genes and pathways were found. Compared to traditional cluster analysis of gene expression data, OLAP was more effective and faster in finding biologically meaningful information. OLAP is available from a number of vendors and can work with any relational database management system through OLE DB.

  7. SunFast: A sun workstation based, fuel analysis scoping tool for pressurized water reactors

    Bohnhoff, W.J.

    1991-05-01

    The objective of this research was to develop a fuel cycle scoping program for light water reactors and implement the program on a workstation class computer. Nuclear fuel management problems are quite formidable due to the many fuel arrangement options available. Therefore, an engineer must perform multigroup diffusion calculations for a variety of different strategies in order to determine an optimum core reload. Standard fine mesh finite difference codes result in a considerable computational cost. A better approach is to build upon the proven reliability of currently available mainframe computer programs, and improve the engineering efficiency by taking advantage of the most useful characteristic of workstations: enhanced man/machine interaction. This dissertation contains a description of the methods and a user's guide for the interactive fuel cycle scoping program, SunFast. SunFast provides computational speed and accuracy of solution along with a synergetic coupling between the user and the machine. It should prove to be a valuable tool when extensive sets of similar calculations must be done at a low cost as is the case for assessing fuel management strategies. 40 refs

  8. On Fast Post-Processing of Global Positioning System Simulator Truth Data and Receiver Measurements and Solutions Data

    Kizhner, Semion; Day, John H. (Technical Monitor)

    2000-01-01

    Post-processing of data related to a Global Positioning System (GPS) simulation is an important activity in the qualification of a GPS receiver for space flight. Because a GPS simulator is a critical resource, it is desirable to move the pertinent simulation data off the simulator as soon as a test is completed. The simulator data files are usually moved to a Personal Computer (PC), where the post-processing of the receiver-logged measurements and solutions data and the simulated data is performed. Typically, post-processing is accomplished using PC-based commercial software languages and tools. Because of the generality of commercial software systems, their general-purpose functions are notoriously slow and are more often than not the bottleneck, even for short-duration experiments. For example, it may take 8 hours to post-process data from a 6-hour simulation. There is a need to do post-processing faster, especially in order to use the previous test results as feedback for the next simulation setup. This paper demonstrates that a fast software linear interpolation algorithm is applicable to a large class of engineering problems, like GPS simulation data post-processing, where computational time is a critical resource and is one of the most important considerations. An approach is developed that speeds up post-processing by an order of magnitude. It is based on improving the bottleneck interpolation algorithm using a priori information specific to the GPS simulation application. The presented post-processing scheme was used in support of several successful space flight missions carrying GPS receivers. A future approach to solving the post-processing performance problem using Field Programmable Gate Array (FPGA) technology is described.
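    The record does not spell out the a priori trick, but one common instance is that simulator truth data are logged on a uniform time grid, so the bracketing sample for linear interpolation can be found with one division instead of a per-point search. A hedged Python sketch of that idea (the grid and values are made up):

```python
import numpy as np

def interp_uniform(t0, dt, y, t_query):
    """Linear interpolation when the source grid is known a priori to be
    uniform: the bracketing index is (t - t0)/dt, no searching required."""
    idx = np.clip(((t_query - t0) / dt).astype(int), 0, len(y) - 2)
    frac = (t_query - (t0 + idx * dt)) / dt
    return y[idx] * (1 - frac) + y[idx + 1] * frac

# Truth data on a uniform 1 Hz grid, receiver epochs falling in between
t0, dt = 0.0, 1.0
y = np.array([0.0, 2.0, 4.0, 6.0])
tq = np.array([0.5, 1.25, 2.75])
print(interp_uniform(t0, dt, y, tq))  # [1.  2.5 5.5]
```

Because `idx` is computed arithmetically for the whole query array at once, the cost is O(1) per query point regardless of grid length, which is where the order-of-magnitude speed-up over generic search-based interpolation can come from.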

  9. Simulation tools for guided wave based structural health monitoring

    Mesnil, Olivier; Imperiale, Alexandre; Demaldent, Edouard; Baronian, Vahan; Chapuis, Bastien

    2018-04-01

    Structural Health Monitoring (SHM) is a thematic derived from Non Destructive Evaluation (NDE) based on the integration of sensors onto or into a structure in order to monitor its health without disturbing its regular operating cycle. Guided wave based SHM relies on the propagation of guided waves in plate-like or extruded structures. Using piezoelectric transducers to generate and receive guided waves is one of the most widely accepted paradigms due to the low cost and low weight of those sensors. A wide range of techniques for flaw detection based on the aforementioned setup is available in the literature but very few of these techniques have found industrial applications yet. A major difficulty comes from the sensitivity of guided waves to a substantial number of parameters such as the temperature or geometrical singularities, making guided wave measurements difficult to analyze. In order to apply guided wave based SHM techniques to a wider spectrum of applications and to transfer those techniques to the industry, the CEA LIST develops novel numerical methods. These methods facilitate the evaluation of the robustness of SHM techniques for multiple applicative cases and ease the analysis of the influence of various parameters, such as sensor positioning or environmental conditions. The first numerical tool is the guided wave module integrated into the commercial software CIVA, relying on a hybrid modal-finite element formulation to compute the guided wave response of perturbations (cavities, flaws…) in extruded structures of arbitrary cross section such as rails or pipes. The second numerical tool is based on the spectral element method [2] and simulates guided waves in both isotropic (metals) and orthotropic (composites) plate-like structures. This tool is designed to match the widely accepted sparse piezoelectric transducer array SHM configuration in which each embedded sensor acts as both emitter and receiver of guided waves. This tool is under development and

  10. BASIMO - Borehole Heat Exchanger Array Simulation and Optimization Tool

    Schulte, Daniel O.; Welsch, Bastian; Rühaak, Wolfram; Bär, Kristian; Sass, Ingo

    2017-04-01

    Arrays of borehole heat exchangers are an increasingly popular source for renewable energy. Furthermore, they can serve as borehole thermal energy storage (BTES) systems for seasonally fluctuating heat sources like solar thermal energy or district heating grids. The high temperature level of these heat sources prohibits the use of the shallow subsurface for environmental reasons. Therefore, deeper reservoirs have to be accessed instead. The increased depth of the systems results in high investment costs and has hindered the implementation of this technology until now. Therefore, research of medium deep BTES systems relies on numerical simulation models. Current simulation tools cannot - or only to some extent - describe key features like partly insulated boreholes unless they run fully discretized models of the borehole heat exchangers. However, fully discretized models often come at a high computational cost, especially for large arrays of borehole heat exchangers. We give an update on the development of BASIMO: a tool, which uses one dimensional thermal resistance and capacity models for the borehole heat exchangers coupled with a numerical finite element model for the subsurface heat transport in a dual-continuum approach. An unstructured tetrahedral mesh bypasses the limitations of structured grids for borehole path geometries, while the thermal resistance and capacity model is improved to account for borehole heat exchanger properties changing with depth. Thereby, partly insulated boreholes can be considered in the model. Furthermore, BASIMO can be used to improve the design of BTES systems: the tool allows for automated parameter variations and is readily coupled to other code like mathematical optimization algorithms. Optimization can be used to determine the required minimum system size or to increase the system performance.

  11. Gas cooled fast reactor benchmarks for JNC and Cea neutronic tools assessment

    Rimpault, G.; Sugino, K.; Hayashi, H.

    2005-01-01

    In order to verify the adequacy of JNC and Cea computational tools for the definition of GCFR (gas cooled fast reactor) core characteristics, GCFR neutronic benchmarks have been performed. The benchmarks have been carried out on two different cores: 1) a conventional Gas-Cooled fast Reactor (EGCR) core with pin-type fuel, and 2) an innovative He-cooled Coated-Particle Fuel (CPF) core. Core characteristics being studied include: -) Criticality (Effective multiplication factor or K-effective), -) Instantaneous breeding gain (BG), -) Core Doppler effect, and -) Coolant depressurization reactivity. K-effective and coolant depressurization reactivity at EOEC (End Of Equilibrium Cycle) state were calculated since these values are the most critical characteristics in the core design. In order to check the influence due to the difference of depletion calculation systems, a simple depletion calculation benchmark was performed. Values such as: -) burnup reactivity loss, -) mass balance of heavy metals and fission products (FP) were calculated. Results of the core design characteristics calculated by both JNC and Cea sides agree quite satisfactorily in terms of core conceptual design study. Potential features for improving the GCFR computational tools have been discovered during the course of this benchmark such as the way to calculate accurately the breeding gain. Different ways to improve the accuracy of the calculations have also been identified. In particular, investigation on nuclear data for steel is important for EGCR and for lumped fission products in both cores. The outcome of this benchmark is already satisfactory and will help to design more precisely GCFR cores. (authors)

  12. Simulating a topological transition in a superconducting phase qubit by fast adiabatic trajectories

    Wang, Tenghui; Zhang, Zhenxing; Xiang, Liang; Gong, Zhihao; Wu, Jianlan; Yin, Yi

    2018-04-01

    The significance of topological phases has been widely recognized in the community of condensed matter physics. The well controllable quantum systems provide an artificial platform to probe and engineer various topological phases. The adiabatic trajectory of a quantum state describes the change of the bulk Bloch eigenstates with the momentum, and this adiabatic simulation method is however practically limited due to quantum dissipation. Here we apply the "shortcut to adiabaticity" (STA) protocol to realize fast adiabatic evolutions in the system of a superconducting phase qubit. The resulting fast adiabatic trajectories illustrate the change of the bulk Bloch eigenstates in the Su-Schrieffer-Heeger (SSH) model. A sharp transition is experimentally determined for the topological invariant of a winding number. Our experiment helps identify the topological Chern number of a two-dimensional toy model, suggesting the applicability of the fast adiabatic simulation method for topological systems.
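    For reference, the topological invariant the experiment determines, the winding number of the SSH model, can be computed directly from the bulk Hamiltonian h(k) = v + w·e^{ik}. This Python sketch (the hopping amplitudes v and w are illustrative, not the experimental values) counts how often h(k) encircles the origin as k traverses the Brillouin zone:

```python
import numpy as np

def ssh_winding(v, w, n_k=2001):
    """Winding number of h(k) = v + w*exp(ik) around the origin:
    0 in the trivial phase (v > w), 1 in the topological phase (v < w)."""
    k = np.linspace(-np.pi, np.pi, n_k)
    h = v + w * np.exp(1j * k)
    phase = np.unwrap(np.angle(h))          # continuous phase along the loop
    return round((phase[-1] - phase[0]) / (2 * np.pi))

print(ssh_winding(1.5, 1.0))  # trivial phase: 0
print(ssh_winding(0.5, 1.0))  # topological phase: 1
```

The sharp jump of this integer as v crosses w is the transition the fast adiabatic trajectories map out experimentally.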

  13. A Network Traffic Generator Model for Fast Network-on-Chip Simulation

    Mahadevan, Shankar; Angiolini, Frederico; Storgaard, Michael

    2005-01-01

    For Systems-on-Chip (SoCs) development, a predominant part of the design time is the simulation time. Performance evaluation and design space exploration of such systems in bit- and cycle-true fashion is becoming prohibitive. We propose a traffic generation (TG) model that provides a fast...

  14. Comparison of fast ion collective Thomson scattering measurements at ASDEX Upgrade with numerical simulations

    Salewski, Mirko; Meo, Fernando; Stejner Pedersen, Morten

    2010-01-01

    Collective Thomson scattering (CTS) experiments were carried out at ASDEX Upgrade to measure the one-dimensional velocity distribution functions of fast ion populations. These measurements are compared with simulations using the codes TRANSP/NUBEAM and ASCOT for two different neutral beam injecti...

  15. Interactive simulations as teaching tools for engineering mechanics courses

    Carbonell, Victoria; Romero, Carlos; Martínez, Elvira; Flórez, Mercedes

    2013-07-01

    This study aimed to gauge the effect of interactive simulations in class as an active teaching strategy for a mechanics course. Engineering analysis and design often use the properties of planar sections in calculations. In the stress analysis of a beam under bending and torsional loads, cross-sectional properties are used to determine stress and displacement distributions in the beam cross section. The centroid, moments and products of inertia of an area made up of several common shapes (rectangles usually) may thus be obtained by adding the moments of inertia of the component areas (U-shape, L-shape, C-shape, etc). This procedure is used to calculate the second moments of structural shapes in engineering practice because the determination of their moments of inertia is necessary for the design of structural components. This paper presents examples of interactive simulations developed for teaching the ‘Mechanics and mechanisms’ course at the Universidad Politecnica de Madrid, Spain. The simulations focus on fundamental topics such as centroids, the properties of the moment of inertia, second moments of inertia with respect to two axes, principal moments of inertia and Mohr's Circle for plane stress, and were composed using Geogebra software. These learning tools feature animations, graphics and interactivity and were designed to encourage student participation and engagement in active learning activities, to effectively explain and illustrate course topics, and to build student problem-solving skills.
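    The composite-area procedure the abstract describes (locate the centroid, then apply the parallel-axis theorem to each rectangle) can be sketched in a few lines of Python; the U-shape dimensions below are illustrative, not taken from the course materials:

```python
# Second moment of area of a composite section built from rectangles,
# using the parallel-axis theorem: I = sum(I_c + A*d^2).

def rect(b, h, yc):
    """Rectangle of width b, height h, with centroid at height yc.
    Returns (area, centroid height, own second moment about its centroid)."""
    return b * h, yc, b * h**3 / 12.0

def composite_I(parts):
    A_tot = sum(A for A, _, _ in parts)
    y_bar = sum(A * yc for A, yc, _ in parts) / A_tot   # section centroid
    return sum(Ic + A * (yc - y_bar) ** 2 for A, yc, Ic in parts)

# U-shape: two vertical legs (20 x 100 mm) plus a bottom flange (60 x 20 mm)
parts = [rect(20, 100, 50), rect(20, 100, 50), rect(60, 20, 10)]
I = composite_I(parts)  # ~4.85e6 mm^4 about the centroidal axis
```

The same two functions handle L- and C-shapes by changing only the `parts` list, which is the point the interactive simulations make visually.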

  16. Interactive simulations as teaching tools for engineering mechanics courses

    Carbonell, Victoria; Martínez, Elvira; Flórez, Mercedes; Romero, Carlos

    2013-01-01

    This study aimed to gauge the effect of interactive simulations in class as an active teaching strategy for a mechanics course. Engineering analysis and design often use the properties of planar sections in calculations. In the stress analysis of a beam under bending and torsional loads, cross-sectional properties are used to determine stress and displacement distributions in the beam cross section. The centroid, moments and products of inertia of an area made up of several common shapes (rectangles usually) may thus be obtained by adding the moments of inertia of the component areas (U-shape, L-shape, C-shape, etc). This procedure is used to calculate the second moments of structural shapes in engineering practice because the determination of their moments of inertia is necessary for the design of structural components. This paper presents examples of interactive simulations developed for teaching the ‘Mechanics and mechanisms’ course at the Universidad Politecnica de Madrid, Spain. The simulations focus on fundamental topics such as centroids, the properties of the moment of inertia, second moments of inertia with respect to two axes, principal moments of inertia and Mohr's Circle for plane stress, and were composed using Geogebra software. These learning tools feature animations, graphics and interactivity and were designed to encourage student participation and engagement in active learning activities, to effectively explain and illustrate course topics, and to build student problem-solving skills. (paper)

  17. WINS. Market Simulation Tool for Facilitating Wind Energy Integration

    Shahidehpour, Mohammad [Illinois Inst. of Technology, Chicago, IL (United States)

    2012-10-30

    Integrating 20% or more wind energy into the system and transmitting large amounts of wind energy over long distances will require a decision making capability that can handle very large scale power systems with tens of thousands of buses and lines. There is a need to explore innovative analytical and implementation solutions for continuing reliable operations with the most economical integration of additional wind energy in power systems. A number of wind integration solution paths involve the adoption of new operating policies, dynamic scheduling of wind power across interties, pooling integration services, and adopting new transmission scheduling practices. Such practices can be examined by the decision tool developed by this project. This project developed a very efficient decision tool called Wind INtegration Simulator (WINS) and applied WINS to facilitate wind energy integration studies. WINS focused on augmenting the existing power utility capabilities to support collaborative planning, analysis, and wind integration project implementations. WINS also had the capability of simulating energy storage facilities so that feasibility studies of integrated wind energy system applications can be performed for systems with high wind energy penetrations. The development of WINS represents a major expansion of a very efficient decision tool called POwer Market Simulator (POMS), which was developed by IIT and has been used extensively for power system studies for decades. Specifically, WINS provides the following advantages: (1) An integrated framework is included in WINS for the comprehensive modeling of DC transmission configurations, including mono-pole, bi-pole, tri-pole, back-to-back, and multi-terminal connections, as well as AC/DC converter models including current source converters (CSC) and voltage source converters (VSC); (2) An existing shortcoming of traditional decision tools for wind integration is the limited availability of user interface, i.e., decision

  18. SimTeacher.com: An Online Simulation Tool for Teacher Education

    Fischler, Robert

    2007-01-01

    Simulation-based learning (SBL) is a recent yet promising pedagogical approach. There is a plethora of medical and business simulations. However, there is a lack of simulation tools for other areas of study, like teacher education. Furthermore, since the content of many educational simulations is "hard-coded" into the tool, educators typically…

  19. Temporal Gillespie Algorithm: Fast Simulation of Contagion Processes on Time-Varying Networks.

    Vestergaard, Christian L; Génois, Mathieu

    2015-10-01

    Stochastic simulations are one of the cornerstones of the analysis of dynamical processes on complex networks, and are often the only accessible way to explore their behavior. The development of fast algorithms is paramount to allow large-scale simulations. The Gillespie algorithm can be used for fast simulation of stochastic processes, and variants of it have been applied to simulate dynamical processes on static networks. However, its adaptation to temporal networks remains non-trivial. We here present a temporal Gillespie algorithm that solves this problem. Our method is applicable to general Poisson (constant-rate) processes on temporal networks, stochastically exact, and up to multiple orders of magnitude faster than traditional simulation schemes based on rejection sampling. We also show how it can be extended to simulate non-Markovian processes. The algorithm is easily applicable in practice, and as an illustration we detail how to simulate both Poissonian and non-Markovian models of epidemic spreading. Namely, we provide pseudocode and its implementation in C++ for simulating the paradigmatic Susceptible-Infected-Susceptible and Susceptible-Infected-Recovered models and a Susceptible-Infected-Recovered model with non-constant recovery rates. For empirical networks, the temporal Gillespie algorithm is here typically from 10 to 100 times faster than rejection sampling.
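    The published implementation is in C++; as an illustration of the core idea, the following Python sketch runs an SIS process over a sequence of network snapshots, carrying a unit-exponential waiting time across snapshot boundaries. It is a simplification of the paper's algorithm, and the rates and toy network used below are arbitrary:

```python
import random

def temporal_gillespie_sis(snapshots, dt, beta, mu, infected):
    """Minimal temporal Gillespie sketch for an SIS process on a sequence
    of network snapshots, each active for a duration dt (constant rates)."""
    infected = set(infected)
    tau = random.expovariate(1.0)          # unit-exponential "clock"
    for edges in snapshots:
        t_left = dt
        while True:
            # transition rates in the current state and snapshot
            si_edges = [(u, v) for u, v in edges
                        if (u in infected) != (v in infected)]
            rate = beta * len(si_edges) + mu * len(infected)
            if rate * t_left < tau:        # no event before snapshot ends
                tau -= rate * t_left
                break
            t_left -= tau / rate           # advance to the event time
            # choose the event proportionally to its rate
            if random.random() < beta * len(si_edges) / rate:
                u, v = random.choice(si_edges)
                infected.add(u if v in infected else v)   # infect S endpoint
            else:
                infected.discard(random.choice(sorted(infected)))  # recovery
            tau = random.expovariate(1.0)  # redraw the clock
            if not infected:
                return infected            # epidemic died out
    return infected

random.seed(42)
snapshots = [[(0, 1), (1, 2), (2, 3)], [(0, 3)], [(1, 3), (0, 2)]] * 20
final = temporal_gillespie_sis(snapshots, 0.5, 0.8, 0.2, {0})
```

Unlike rejection sampling, no time is wasted on candidate events that do not fire; the exponential clock is simply decremented by each snapshot's integrated rate until an event falls inside a snapshot.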

  20. Enhanced operator-training simulator for the Fast Flux Test Facility

    Schrader, F.D.; Swanson, C.D.

    1983-01-01

    The FFTF Plant Operator Training Simulator Facility has proven to be a valuable asset throughout the testing, startup and early operational phases of the Fast Flux Test Facility. However, limitations inherent in the existing simulation facility, increased emphasis on the required quality of operator training, and an expanded scope of applications (e.g., MNI development) justify an enhanced facility. Direct use of plant operators in the development of improved reactor control room displays and other man/machine interface equipment and procedures increases the credibility of proposed techniques and reported results. The FFTF Plant Operator Training Simulator provides a key element in this development program.

  1. Fast dose planning Monte Carlo simulations in inhomogeneous phantoms submerged in uniform, static magnetic fields

    Yanez, R.; Dempsey, J. F.

    2007-01-01

    We present studies in support of the development of a magnetic resonance imaging (MRI) guided intensity modulated radiation therapy (IMRT) device for the treatment of cancer patients. Fast and accurate computation of the absorbed ionizing radiation dose delivered in the presence of the MRI magnetic field is required for clinical implementation. The fast Monte Carlo simulation code DPM, optimized for radiotherapy treatment planning, is modified to simulate absorbed doses in uniform, static magnetic fields, and benchmarked against PENELOPE. Simulations of dose deposition in inhomogeneous phantoms, in which a low-density material is sandwiched in water, show that a lower MRI field strength (0.3 T) is preferable in order to avoid dose build-up near material boundaries. (authors)

  2. Development and Integration of the CT-PPS Fast Simulation in the CMS Software

    Fonseca De Souza, Sandro

    2017-01-01

    CT-PPS (CMS-TOTEM Precision Proton Spectrometer) is a joint project of the CMS and TOTEM collaborations with the goal of studying central exclusive production (CEP) in proton-proton collisions. A simplified simulation and reconstruction code for CT-PPS has been implemented in the CMS fast simulation package FastSim. Protons scattered at very low polar angles are propagated along the LHC beamlines from the generated vertex to the detectors by means of the beam transport package Hector. The reconstructed proton tracks are obtained from the simulated hits in the tracking detectors and are used to determine the proton kinematics at the vertex. The timing information is added to the tracks.

  3. APRON: A Cellular Processor Array Simulation and Hardware Design Tool

    Barr, David R. W.; Dudek, Piotr

    2009-12-01

    We present a software environment for the efficient simulation of cellular processor arrays (CPAs). This software (APRON) is used to explore algorithms that are designed for massively parallel fine-grained processor arrays, topographic multilayer neural networks, vision chips with SIMD processor arrays, and related architectures. The software uses a highly optimised core combined with a flexible compiler to provide the user with tools for the design of new processor array hardware architectures and the emulation of existing devices. We present performance benchmarks for the software processor array implemented on standard commodity microprocessors. APRON can be configured to use additional processing hardware if necessary and can be used as a complete graphical user interface and development environment for new or existing CPA systems, allowing more users to develop algorithms for CPA systems.

  4. Software tool for horizontal-axis wind turbine simulation

    Vitale, A.J. [Instituto Argentino de Oceanografia, Camino La Carrindanga Km. 7, 5 CC 804, B8000FWB Bahia Blanca (Argentina); Universidad Tecnologica Nacional Facultad Regional Bahia Blanca, GESE, 11 de Abril 461, B8000LMI Bahia Blanca (Argentina); Dpto. de Ing. Electrica y de Computadoras, Universidad Nacional del Sur, Av. Alem 1253, 8000 Bahia Blanca (Argentina); Rossi, A.P. [Universidad Tecnologica Nacional Facultad Regional Bahia Blanca, GESE, 11 de Abril 461, B8000LMI Bahia Blanca (Argentina); Dpto. de Ing. Electrica y de Computadoras, Universidad Nacional del Sur, Av. Alem 1253, 8000 Bahia Blanca (Argentina)

    2008-07-15

    The main problem of a wind turbine generator design project is the design of the right blades capable of satisfying the specific energy requirement of an electric system with optimum performance. Once the blade has been designed for optimum operation at a particular rotor angular speed, it is necessary to determine the overall performance of the rotor under the range of wind speed that it will encounter. A software tool that simulates low-power, horizontal-axis wind turbines was developed for this purpose. With this program, the user can calculate the rotor power output for any combination of wind and rotor speeds, with definite blade shape and airfoil characteristics. The software also provides information about distribution of forces along the blade span, for different operational conditions. (author)
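    The record does not give the program's equations; a minimal sketch of computing rotor power for a combination of wind and rotor speeds uses the standard relation P = 0.5·ρ·A·Cp(λ)·v³, where λ is the tip-speed ratio. The power-coefficient curve below is purely illustrative, not the blade model the software actually uses:

```python
import math

RHO = 1.225  # air density, kg/m^3

def rotor_power(v_wind, omega, radius, cp_curve):
    """Rotor power output for a given wind speed v_wind (m/s) and rotor
    angular speed omega (rad/s), via a power-coefficient curve Cp(lambda)."""
    lam = omega * radius / v_wind          # tip-speed ratio
    area = math.pi * radius ** 2           # swept area
    return 0.5 * RHO * area * cp_curve(lam) * v_wind ** 3

def cp_demo(lam, lam_opt=7.0, cp_max=0.45):
    """Illustrative Cp curve peaking at lambda = lam_opt (assumed shape)."""
    return max(0.0, cp_max * (1 - ((lam - lam_opt) / lam_opt) ** 2))

# 5 m rotor in 8 m/s wind, spun so that lambda = 7 exactly
P = rotor_power(8.0, 7.0 * 8.0 / 5.0, 5.0, cp_demo)  # ~1.11e4 W
```

Sweeping `v_wind` and `omega` over a grid reproduces the kind of power map over wind/rotor speed combinations the abstract describes; the real tool derives Cp from the blade shape and airfoil data rather than an assumed curve.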

  5. Interoperable mesh and geometry tools for advanced petascale simulations

    Diachin, L; Bauer, A; Fix, B; Kraftcheck, J; Jansen, K; Luo, X; Miller, M; Ollivier-Gooch, C; Shephard, M S; Tautges, T; Trease, H

    2007-01-01

    SciDAC applications have a demonstrated need for advanced software tools to manage the complexities associated with sophisticated geometry, mesh, and field manipulation tasks, particularly as computer architectures move toward the petascale. The Center for Interoperable Technologies for Advanced Petascale Simulations (ITAPS) will deliver interoperable and interchangeable mesh, geometry, and field manipulation services that are of direct use to SciDAC applications. The premise of our technology development goal is to provide such services as libraries that can be used with minimal intrusion into application codes. To develop these technologies, we focus on defining a common data model and data-structure neutral interfaces that unify a number of different services such as mesh generation and improvement, front tracking, adaptive mesh refinement, shape optimization, and solution transfer operations. We highlight the use of several ITAPS services in SciDAC applications

  6. APRON: A Cellular Processor Array Simulation and Hardware Design Tool

    David R. W. Barr

    2009-01-01

    We present a software environment for the efficient simulation of cellular processor arrays (CPAs). This software (APRON) is used to explore algorithms that are designed for massively parallel fine-grained processor arrays, topographic multilayer neural networks, vision chips with SIMD processor arrays, and related architectures. The software uses a highly optimised core combined with a flexible compiler to provide the user with tools for the design of new processor array hardware architectures and the emulation of existing devices. We present performance benchmarks for the software processor array implemented on standard commodity microprocessors. APRON can be configured to use additional processing hardware if necessary and can be used as a complete graphical user interface and development environment for new or existing CPA systems, allowing more users to develop algorithms for CPA systems.

  7. Tools for simulation of high beam intensity ion accelerators; Simulationswerkzeuge für die Berechnung hochintensiver Ionenbeschleuniger

    Tiede, Rudolf

    2009-07-09

    A new particle-in-cell space charge routine based on a fast Fourier transform was developed and implemented in the LORASR code. It provides the ability to perform up to several hundred batch-run simulations, with up to 1 million macroparticles each, within reasonable computation time. The new space charge routine was successfully validated in the framework of the European "High Intensity Pulsed Proton Injectors" (HIPPI) collaboration: several static Poisson solver benchmarking comparisons were performed, as well as particle tracking comparisons along the GSI UNILAC Alvarez section. Moreover, machine error setting routines and data analysis tools were developed and applied to error studies for the "Heidelberg Cancer Therapy" (HICAT) IH-type drift tube linear accelerator (linac), the FAIR Facility Proton Linac and a proposed linac for the "International Fusion Materials Irradiation Facility" (IFMIF) based on superconducting CH-type structures. (orig.)

  8. The mesoscale dispersion modeling system a simulation tool for development of an emergency response system

    Uliasz, M.

    1990-01-01

    The mesoscale dispersion modeling system is under continuous development. The included numerical models require further improvements and evaluation against data from meteorological and tracer field experiments. The system cannot be directly applied to real-time predictions. However, it seems to be a useful simulation tool for solving several problems related to planning the monitoring network and developing the emergency response system for a nuclear power plant located in a coastal area. The modeling system can also be applied to other environmental problems connected with air pollution dispersion in complex terrain. The presented numerical models are designed for use on personal computers and are relatively fast in comparison with similar mesoscale models developed on mainframe computers.

  9. Simulating polar bear energetics during a seasonal fast using a mechanistic model.

    Paul D Mathewson

    Full Text Available In this study we tested the ability of a mechanistic model (Niche Mapper™ to accurately model adult, non-denning polar bear (Ursus maritimus energetics while fasting during the ice-free season in the western Hudson Bay. The model uses a steady state heat balance approach, which calculates the metabolic rate that will allow an animal to maintain its core temperature in its particular microclimate conditions. Predicted weight loss for a 120 day fast typical of the 1990s was comparable to empirical studies of the population, and the model was able to reach a heat balance at the target metabolic rate for the entire fast, supporting use of the model to explore the impacts of climate change on polar bears. Niche Mapper predicted that all but the poorest condition bears would survive a 120 day fast under current climate conditions. When the fast extended to 180 days, Niche Mapper predicted mortality of up to 18% for males. Our results illustrate how environmental conditions, variation in animal properties, and thermoregulation processes may impact survival during extended fasts because polar bears were predicted to require additional energetic expenditure for thermoregulation during a 180 day fast. A uniform 3°C temperature increase reduced male mortality during a 180 day fast from 18% to 15%. Niche Mapper explicitly links an animal's energetics to environmental conditions and thus can be a valuable tool to help inform predictions of climate-related population changes. Since Niche Mapper is a generic model, it can make energetic predictions for other species threatened by climate change.
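The steady-state heat-balance idea behind such a mechanistic model can be illustrated with a toy one-node calculation: bisect for the skin temperature at which heat conducted from the core equals convective plus radiative loss at the surface, then read off the matching metabolic rate. All parameter values here are illustrative assumptions, not Niche Mapper's animal or microclimate properties:

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def metabolic_rate(t_core, t_air, area, r_body, h_conv, emissivity=0.98):
    """Return the metabolic heat production (W) that balances the heat budget
    of a one-node animal: conduction from core to skin through resistance
    r_body (K/W) must equal convective + radiative loss from area (m^2).
    Temperatures in kelvin. Toy model, not Niche Mapper's property set."""
    def imbalance(t_skin):
        conducted = (t_core - t_skin) / r_body
        lost = (h_conv * area * (t_skin - t_air)
                + emissivity * SIGMA * area * (t_skin**4 - t_air**4))
        return conducted - lost
    lo, hi = t_air, t_core              # skin temperature lies between these
    for _ in range(60):                 # bisection on the monotone imbalance
        mid = 0.5 * (lo + hi)
        if imbalance(mid) > 0:          # conduction exceeds loss: skin warmer
            lo = mid
        else:
            hi = mid
    t_skin = 0.5 * (lo + hi)
    return (t_core - t_skin) / r_body
```

Colder air forces a lower equilibrium skin temperature and hence a higher required metabolic rate, which is the mechanism linking microclimate to energetic cost in the abstract.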

  10. Simulating polar bear energetics during a seasonal fast using a mechanistic model.

    Mathewson, Paul D; Porter, Warren P

    2013-01-01

    In this study we tested the ability of a mechanistic model (Niche Mapper™) to accurately model adult, non-denning polar bear (Ursus maritimus) energetics while fasting during the ice-free season in the western Hudson Bay. The model uses a steady state heat balance approach, which calculates the metabolic rate that will allow an animal to maintain its core temperature in its particular microclimate conditions. Predicted weight loss for a 120 day fast typical of the 1990s was comparable to empirical studies of the population, and the model was able to reach a heat balance at the target metabolic rate for the entire fast, supporting use of the model to explore the impacts of climate change on polar bears. Niche Mapper predicted that all but the poorest condition bears would survive a 120 day fast under current climate conditions. When the fast extended to 180 days, Niche Mapper predicted mortality of up to 18% for males. Our results illustrate how environmental conditions, variation in animal properties, and thermoregulation processes may impact survival during extended fasts because polar bears were predicted to require additional energetic expenditure for thermoregulation during a 180 day fast. A uniform 3°C temperature increase reduced male mortality during a 180 day fast from 18% to 15%. Niche Mapper explicitly links an animal's energetics to environmental conditions and thus can be a valuable tool to help inform predictions of climate-related population changes. Since Niche Mapper is a generic model, it can make energetic predictions for other species threatened by climate change.

  11. SOAP: A Tool for the Fast Computation of Photometry and Radial Velocity Induced by Stellar Spots

    Boisse, I.; Bonfils, X.; Santos, N. C.; Figueira, P.

    2013-04-01

    Dark spots and bright plages are present on the surface of dwarf stars from spectral types F to M, even in their low-activity phase (like the Sun). Their appearance and disappearance on the stellar photosphere, combined with the stellar rotation, may lead to errors and uncertainties in the characterization of planets in both radial velocity (RV) and photometry. Spot Oscillation and Planet (SOAP) is a tool offered to the community that simulates spots and plages on rotating stars and computes their impact on RV and photometric measurements. This tool will help address the challenges related to the knowledge of stellar activity for the next decade: detecting telluric planets in the habitable zones of their stars (from G to M dwarfs), understanding activity at the low-mass end of the M dwarfs (on which future projects, like SPIRou or CARMENES, will focus), limitations to the characterization of exoplanetary atmospheres (from the ground or with Spitzer, JWST), and the search for planets around young stars. These can be simulated with SOAP in order to search for indices and corrections for the effect of activity.
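The photometric effect of a single dark spot on a rotating star can be sketched with simple geometry: the flux deficit is proportional to the spot's projected area when the spot faces the observer. This toy model ignores limb darkening and plages, and is only in the spirit of SOAP, not its implementation:

```python
import numpy as np

def spot_lightcurve(phase, lon, lat, f_spot, incl=np.pi / 2):
    """Relative flux of a rigidly rotating star with one small dark spot.
    phase: rotation phase in cycles; lon, lat, incl in radians;
    f_spot: spot area as a fraction of the visible disc at disc centre.
    mu is the cosine of the angle between the spot normal and the line of
    sight; the spot only removes flux while it is on the visible hemisphere."""
    mu = (np.sin(incl) * np.cos(lat) * np.cos(2.0 * np.pi * phase + lon)
          + np.cos(incl) * np.sin(lat))
    return 1.0 - f_spot * np.clip(mu, 0.0, None)
```

An equatorial spot seen equator-on gives a dip of depth f_spot when it crosses the disc centre and no signal on the far hemisphere.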

  12. Storm-rhine -simulation Tool For River Management

    Heun, J. C.; Schotanus, T. D.; de Groen, M. M.; Werner, M.

    The Simulation Tool for River Management (STORM), based on the River Rhine case, aims to provide insight into river and floodplain management, by (1) raising awareness of river functions, (2) exploring alternative strategies, (3) showing the links between natural processes, spatial planning, engineering interventions, river functions and stakeholder interests, (4) facilitating the debate between different policy makers and stakeholders from across the basin and (5) enhancing co-operation and mutual understanding. The simulation game is built around the new concepts of 'Room for the River', Flood Retention Areas, Resurrection of former River Channels and 'Living with the Floods'. The Game focuses on the Lower and Middle Rhine from the Dutch Delta to Maxau in Germany. Influences from outside the area are included as scenarios for boundary conditions. The heart of the tool is the hydraulic module, which calculates representative high- and low-water levels for different hydrological scenarios, influenced by river engineering measures and physical planning in the floodplains. The water levels are translated into flood risks, navigation potential, nature development and land use opportunities in the floodplain. Players of the Game represent the institutions: National, Regional and Municipal Government and Interest Organisations, with interests in flood protection, navigation, agriculture, urban expansion, mining and nature. Players take typical river and floodplain engineering, physical planning and administrative measures to pursue their interests in specific river functions. The players are linked by institutional arrangements and budgetary constraints. The game particularly aims at middle and higher level staff of local and regional government, water boards and members of interest groups from across the basin, who deal with particular stretches or functions of the river but who need (1) to be better aware of the integrated whole, (2) to

  13. A dedicated tool for PET scanner simulations using FLUKA

    Ortega, P.G.; Boehlen, T.T.; Cerutti, F.; Chin, M.P.W.; Ferrari, A.; Mancini, C.; Vlachoudis, V.; Mairani, A.; Sala, Paola R.

    2013-06-01

    Positron emission tomography (PET) is a well-established medical imaging technique. It is based on the detection of pairs of annihilation gamma rays from a beta+-emitting radionuclide, usually inoculated into the body via a biologically active molecule. Apart from its widespread use for clinical diagnosis, new applications are proposed. Notably, this includes the use of PET for treatment monitoring of radiation therapy with protons and ions. PET is currently the only available technique for non-invasive monitoring of ion beam dose delivery, and it has been tested in several clinical pilot studies. For hadrontherapy, the distribution of positron emitters produced by the ion beam can be analyzed to verify correct treatment delivery. The adaptation of previous PET scanners to new environments and the necessity of more precise diagnostics through better image quality have triggered the development of new PET scanner designs. The use of Monte Carlo (MC) codes is essential in the early stages of scanner design to simulate the transport of particles and nuclear interactions from therapeutic ion beams or radioisotopes and to predict radiation fields in tissues and radiation emerging from the patient. In particular, range verification using PET is based on the comparison of detected and simulated activity distributions. The accuracy of the MC code for the relevant physics processes is obviously essential for such applications. In this work we present new developments of the physics models of importance for PET monitoring and integrated tools for PET scanner simulations for FLUKA, a fully-integrated MC particle-transport code, which is widely used for an extended range of applications (accelerator shielding, detector and target design, calorimetry, activation, dosimetry, medical physics, radiobiology, ...). The developed tools include a PET scanner geometry builder and a dedicated scoring routine for coincident event determination. The geometry builder allows the efficient

  14. Simulation for Wind Turbine Generators -- With FAST and MATLAB-Simulink Modules

    Singh, M.; Muljadi, E.; Jonkman, J.; Gevorgian, V.; Girsang, I.; Dhupia, J.

    2014-04-01

    This report presents the work done to develop generator and gearbox models in the Matrix Laboratory (MATLAB) environment and couple them to the National Renewable Energy Laboratory's Fatigue, Aerodynamics, Structures, and Turbulence (FAST) program. The goal of this project was to interface the superior aerodynamic and mechanical models of FAST to the excellent electrical generator models found in various Simulink libraries and applications. The scope was limited to Type 1, Type 2, and Type 3 generators and fairly basic gear-train models. Future work will include models of Type 4 generators and more-advanced gear-train models with increased degrees of freedom. As described in this study, implementation of the developed drivetrain model enables the software tool to be used in many ways. Several case studies are presented as examples of the many types of studies that can be performed using this tool.

  15. FAST: a three-dimensional time-dependent FEL simulation code

    Saldin, E.L.; Schneidmiller, E.A.; Yurkov, M.V.

    1999-01-01

    In this report we briefly describe the three-dimensional, time-dependent FEL simulation code FAST. The equations of motion of the particles and Maxwell's equations are solved simultaneously, taking into account the slippage effect. Radiation fields are calculated using an integral solution of Maxwell's equations. A special technique has been developed for fast calculation of the radiation field, drastically reducing the required CPU time. As a result, the developed code allows one to use a personal computer for time-dependent simulations. The code allows one to simulate the radiation from an electron bunch of any transverse and longitudinal shape; to simulate an external seed simultaneously with superimposed noise in the electron beam; to take into account energy spread in the electron beam and the space charge fields; and to simulate a high-gain, high-efficiency FEL amplifier with a tapered undulator. It is important to note that there are no significant memory limitations in the developed code, and an electron bunch of any length can be simulated.

  16. Fast Monte Carlo-simulator with full collimator and detector response modelling for SPECT

    Sohlberg, A.O.; Kajaste, M.T.

    2012-01-01

    Monte Carlo (MC)-simulations have proved to be a valuable tool in studying single photon emission computed tomography (SPECT)-reconstruction algorithms. Despite their popularity, the use of Monte Carlo-simulations is still often limited by their large computation demand. This is especially true in situations where full collimator and detector modelling with septal penetration, scatter and X-ray fluorescence needs to be included. This paper presents a rapid and simple MC-simulator, which can effectively reduce the computation times. The simulator was built on the convolution-based forced detection principle, which can markedly lower the number of simulated photons. Full collimator and detector response look-up tables are pre-simulated and then later used in the actual MC-simulations to model the system response. The developed simulator was validated by comparing it against 123I point source measurements made with a clinical gamma camera system and against 99mTc software phantom simulations made with the SIMIND MC package. The results showed good agreement between the new simulator, the measurements and the SIMIND package. The new simulator provided near noise-free projection data in approximately 1.5 min per projection with 99mTc, which was less than one-tenth of SIMIND's time. The developed MC-simulator can markedly decrease the simulation time without sacrificing image quality. (author)
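The look-up-table idea behind convolution-based forced detection can be sketched as follows: each object plane is convolved with a pre-tabulated, depth-dependent collimator-detector response, and the blurred planes are summed into a projection. The Gaussian response below is a hypothetical stand-in for the pre-simulated tables, shown in a 2D-object/1D-projection toy geometry:

```python
import numpy as np

def gaussian_kernel(sigma, radius):
    """Normalized discrete Gaussian, half-width `radius` bins."""
    x = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-0.5 * (x / sigma) ** 2)
    return k / k.sum()

def project_with_cdr(activity, dx, sigma0, slope):
    """Forward-project a 2D activity map (axis 0 = distance from collimator)
    into a 1D projection: each plane is blurred with a depth-dependent
    response whose width grows linearly with distance, then planes are summed.
    Attenuation is ignored in this sketch."""
    nz, nx = activity.shape
    proj = np.zeros(nx)
    for iz in range(nz):
        sigma = sigma0 + slope * iz * dx
        kern = gaussian_kernel(sigma, radius=4 * int(np.ceil(sigma)))
        proj += np.convolve(activity[iz], kern, mode="same")
    return proj
```

A point source far from the collimator produces a broader, lower-peaked profile than the same source close to it, which is the distance-dependent resolution loss the look-up tables encode.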

  17. Simulation of micromechanical behavior of polycrystals: finite elements versus fast Fourier transforms

    Prakash, A; Lebensohn, R A

    2009-01-01

    In this work, we compare finite element and fast Fourier transform approaches for the prediction of the micromechanical behavior of polycrystals. Both approaches are full-field approaches and use the same visco-plastic single crystal constitutive law. We investigate the texture and the heterogeneity of the inter- and intragranular stress and strain fields obtained from the two models. Additionally, we also look into their computational performance. Two cases—rolling of aluminum and wire drawing of tungsten—are used to evaluate the predictions of the two models. Results from both models are similar when large grain distortions do not occur in the polycrystal. The finite element simulations were found to be highly computationally intensive in comparison with the fast Fourier transform simulations. Figure 9 was corrected in this article on 25 August 2009. The corrected electronic version is identical to the print version.

  18. A simulator tool set for evaluating HEVC/SHVC streaming

    Al Hadhrami, Tawfik; Nightingale, James; Wang, Qi; Grecos, Christos; Kehtarnavaz, Nasser

    2015-02-01

    Video streaming and other multimedia applications account for an ever increasing proportion of all network traffic. The recent adoption of High Efficiency Video Coding (HEVC) as the H.265 standard provides many opportunities for new and improved multimedia services and applications in the consumer domain. Since the delivery of version one of H.265, the Joint Collaborative Team on Video Coding has been working towards standardisation of a scalable extension (SHVC) to the H.265 standard and a series of range extensions and new profiles. As these enhancements are added to the standard, the range of potential applications and research opportunities will expand. For example, the use of video is also growing rapidly in other sectors such as safety, security, defence and health, with real-time high quality video transmission playing an important role in areas like critical infrastructure monitoring and disaster management, each of which may benefit from the application of enhanced HEVC/H.265 and SHVC capabilities. The majority of existing research into HEVC/H.265 transmission has focussed on the consumer domain, addressing issues such as broadcast transmission and delivery to mobile devices, with the lack of freely available tools widely cited as an obstacle to conducting this type of research. In this paper we present a toolset which facilitates the transmission and evaluation of HEVC/H.265 and SHVC encoded video on the popular open source NCTUns simulator. Our toolset provides researchers with a modular, easy to use platform for evaluating video transmission and adaptation proposals on large scale wired, wireless and hybrid architectures. The toolset consists of pre-processing, transmission, SHVC adaptation and post-processing tools to gather and analyse statistics. It has been implemented using HM15 and SHM5, the latest versions of the HEVC and SHVC reference software implementations, to ensure that currently adopted proposals for scalable and range extensions to

  19. Performance Assessment Tools for Distance Learning and Simulation: Knowledge, Models and Tools to Improve the Effectiveness of Naval Distance Learning

    Baker, Eva L; Munro, Allen; Pizzini, Quentin A; Brill, David G; Michiuye, Joanne K

    2006-01-01

    ... by (a) extending CRESST's knowledge mapping tool's authoring and scoring functionality and (b) providing the capability to embed a knowledge mapping assessment in simulation-based training developed by BTL...

  20. A two-dimensional simulator of the neutronic behaviour of low power fast reactors

    Penha, M.A.V.R. da.

    1984-01-01

    A model to simulate the temporal neutronic behaviour of fast breeder reactors was developed. The effective cross-sections are corrected whenever the reactor state changes, by using linear correlations and interpolation schemes with data contained in a previously compiled library. This methodology was coupled with a simplified spatial neutronic calculation to investigate the temporal behaviour of neutronic parameters such as breeding gain, flux and power. (Author) [pt
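The library-plus-interpolation scheme for effective cross-sections can be illustrated in one state variable. The grid and cross-section values below are invented for illustration; a real library would tabulate several state variables and reaction types:

```python
import numpy as np

def effective_xs(temp, temp_grid, xs_grid):
    """Look up an effective cross-section from a precompiled library by
    linear interpolation in one state variable (fuel temperature here).
    temp_grid must be ascending; values outside the grid are clamped to
    the end points, as np.interp does by default."""
    return np.interp(temp, temp_grid, xs_grid)
```

Updating cross-sections this way at every state change avoids re-running a full spectrum calculation inside the transient loop.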

  1. Neutronic/Thermal-hydraulic Coupling Techniques for Sodium Cooled Fast Reactor Simulations

    Ragusa, Jean; Siegel, Andrew; Ruggieri, Jean-Michel

    2010-01-01

    The objective of this project was to test new coupling algorithms and enable efficient and scalable multi-physics simulations of advanced nuclear reactors, with considerations regarding the implementation of such algorithms in massively parallel environments. Numerical tests were carried out to verify the proposed approach, and the examples included some reactor transients. The project was directly related to the Sodium Fast Reactor program element of the Generation IV Nuclear Energy Systems Initiative and the Advanced Fuel Cycle Initiative, and supported the requirement of high-fidelity simulation as a means of achieving the goals of the presidential Global Nuclear Energy Partnership (GNEP) vision.

  2. Neutronic/Thermalhydraulic Coupling Techniques for Sodium Cooled Fast Reactor Simulations

    Jean Ragusa; Andrew Siegel; Jean-Michel Ruggieri

    2010-09-28

    The objective of this project was to test new coupling algorithms and enable efficient and scalable multi-physics simulations of advanced nuclear reactors, with considerations regarding the implementation of such algorithms in massively parallel environments. Numerical tests were carried out to verify the proposed approach, and the examples included some reactor transients. The project was directly related to the Sodium Fast Reactor program element of the Generation IV Nuclear Energy Systems Initiative and the Advanced Fuel Cycle Initiative, and supported the requirement of high-fidelity simulation as a means of achieving the goals of the presidential Global Nuclear Energy Partnership (GNEP) vision.

  3. Fast and accurate calculation of the properties of water and steam for simulation

    Szegi, Zs.; Gacs, A.

    1990-01-01

    A basic principle simulator was developed at the CRIP, Budapest, for real-time simulation of the transients of WWER-440 type nuclear power plants. An integral part of it is the fast and accurate calculation of the thermodynamic properties of water and steam. To eliminate successive approximations, the model system of the secondary coolant circuit requires binary forms, known as inverse functions, that are continuous when crossing the saturation line, and accurate and coherent for all argument combinations. A solution which reduces the computer memory and execution time demand is reported. (author) 36 refs.; 5 figs.; 3 tabs

  4. Analyzing asteroid reflectance spectra with numerical tools based on scattering simulations

    Penttilä, Antti; Väisänen, Timo; Markkanen, Johannes; Martikainen, Julia; Gritsevich, Maria; Muinonen, Karri

    2017-04-01

    We are developing a set of numerical tools that can be used in analyzing the reflectance spectra of granular materials such as the regolith surface of atmosphereless Solar system objects. Our goal is to be able to explain, with realistic numerical scattering models, the spectral features arising when materials are intimately mixed together. We include space-weathering-type effects in our simulations, i.e., locally mixing the host mineral with small inclusions of another material in small proportions. Our motivation for this study comes from the present lack of such tools. The current common practice is to apply a semi-physical approximate model such as some variation of the Hapke models [e.g., 1] or the Shkuratov model [2]. These models are expressed in closed form, so they are relatively fast to apply. They are based on simplifications of radiative transfer theory. The problem is that the validity of the model is not always guaranteed, and the derived physical properties related to particle scattering can be unrealistic [3]. We base our numerical tool on a chain of scattering simulations. Scattering properties of small inclusions inside an absorbing host matrix can be derived using exact methods solving the Maxwell equations of the system. The next step, scattering by a single regolith grain, is solved using a geometrical optics method accounting for surface reflections, internal absorption, and possibly internal diffuse scattering. The third step involves radiative transfer simulations of these regolith grains in a macroscopic planar element. The chain can be continued with a shadowing simulation over the target surface elements, and finally by integrating the bidirectional reflectance distribution function over the object's shape. Most of the tools in the proposed chain already exist, and one practical task for us is to tie these together into an easy-to-use toolchain that can be publicly distributed. We plan to open the

  5. Weightfield2: A fast simulator for silicon and diamond solid state detector

    Cenna, Francesca, E-mail: cenna@to.infn.it [INFN Torino, Via Pietro Giuria 1, Torino (Italy); Cartiglia, N. [INFN Torino, Via Pietro Giuria 1, Torino (Italy); Friedl, M.; Kolbinger, B. [HEPHY Vienna (Austria); Sadrozinski, H.F.-W.; Seiden, A.; Zatserklyaniy, Andriy; Zatserklyaniy, Anton [University of California, Santa Cruz (United States)

    2015-10-01

    We have developed a fast simulation program to study the performance of silicon and diamond detectors, Weightfield2. The program uses GEANT4 libraries to simulate the energy released by an incoming particle in silicon (or diamond), and Ramo's theorem to generate the induced signal current. A graphical interface allows the user to configure many input parameters such as the incident particle, sensor geometry, presence and value of internal gain, doping of silicon sensor and its operating conditions, the values of an external magnetic field, ambient temperature and thermal diffusion. A simplified electronics simulator is also implemented to include the response of an oscilloscope and front-end electronics. The program has been validated by comparing its predictions for minimum ionizing and α particles with measured signals and TCAD simulations, finding very good agreement in both cases.
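Ramo's theorem states that the induced current is i = q v · E_w, where E_w is the weighting field of the read-out electrode. For a simple parallel-plate sensor of thickness d the weighting field is 1/d, which gives the minimal sketch below; Weightfield2 itself computes weighting fields and drift dynamics for realistic geometries, so this is only the underlying principle:

```python
def ramo_current(q, v_drift, d, t):
    """Induced current (A) of a single carrier of charge q (C) drifting at
    constant speed v_drift (m/s) across a parallel-plate sensor of thickness
    d (m). Weighting field is 1/d, so i = q * v / d until the carrier is
    collected at t = d / v_drift, after which the current is zero."""
    return q * v_drift / d if t < d / v_drift else 0.0
```

A quick consistency check: integrating the constant current over the collection time must return the full charge q.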

  6. A fast sorting algorithm for a hypersonic rarefied flow particle simulation on the connection machine

    Dagum, Leonardo

    1989-01-01

    The data parallel implementation of a particle simulation for hypersonic rarefied flow described by Dagum associates a single parallel data element with each particle in the simulation. The simulated space is divided into discrete regions called cells containing a variable and constantly changing number of particles. The implementation requires a global sort of the parallel data elements so as to arrange them in an order that allows immediate access to the information associated with cells in the simulation. Described here is a very fast algorithm for performing the necessary ranking of the parallel data elements. The performance of the new algorithm is compared with that of the microcoded instruction for ranking on the Connection Machine.
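The ranking the abstract describes amounts to a counting sort keyed on cell index: count particles per cell, prefix-sum the counts into cell offsets, then scatter each particle to its slot. A serial sketch of that idea (the original algorithm performs the ranking in data-parallel fashion on the Connection Machine) might look like:

```python
import numpy as np

def rank_by_cell(cell_ids, n_cells):
    """Counting-sort ranking in O(N + n_cells): returns an ordering that
    groups particle indices by cell, plus the start offset of each cell,
    so all particles of cell c sit at order[starts[c]:starts[c] + count]."""
    counts = np.bincount(cell_ids, minlength=n_cells)
    starts = np.concatenate(([0], np.cumsum(counts)[:-1]))  # prefix sum
    order = np.empty(len(cell_ids), dtype=int)
    cursor = starts.copy()
    for i, c in enumerate(cell_ids):    # scatter each particle to its slot
        order[cursor[c]] = i
        cursor[c] += 1
    return order, starts
```

Because particles move only a few cells per step, the cell occupancy changes constantly, but the ranking cost stays linear in the particle count.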

  7. Numerical simulations of fast ion loss measurements induced by magnetic islands in the ASDEX Upgrade tokamak

    Gobbin, M.; Marrelli, L.; Martin, P.; Fahrbach, H.U.; Garcia-Munoz, M.; Guenter, S.; White, R.B.

    2009-01-01

    A test particle approach, implemented with the Hamiltonian code ORBIT, is used to simulate measurements of fast ion losses induced by magnetic islands in the ASDEX Upgrade tokamak. In particular, the numerical simulations reproduce the toroidal localization of losses and the lost-ion pitch angle and energy distribution experimentally measured with the fast ion loss detector (FILD) in the presence of a neoclassical tearing mode (NTM). The simulated NTM-induced losses occurring on time scales longer than 100 μs are composed of mainly trapped or barely passing particles, consistent with the slow decay of the experimental signal from one FILD channel after the beam switch-off. The numerical simulations have been performed by taking into account the D-shaped plasma geometry, the collision mechanisms, the losses due to ripple effects and the rotation of the mode. The radial profile of the magnetic perturbation is adjusted in order to match ECE measurements. While the statistical properties of FILD measurements are rather well reproduced, the simulated total amount of losses is found to be significantly affected by edge details of the magnetic perturbation, as these determine the loss mechanism.

  8. Petascale molecular dynamics simulation using the fast multipole method on K computer

    Ohno, Yousuke; Yokota, Rio; Koyama, Hiroshi; Morimoto, Gentaro; Hasegawa, Aki; Masumoto, Gen; Okimoto, Noriaki; Hirano, Yoshinori; Ibeid, Huda; Narumi, Tetsu; Taiji, Makoto

    2014-01-01

    In this paper, we report all-atom simulations of molecular crowding - a result from the full node simulation on the "K computer", which is a 10-PFLOPS supercomputer in Japan. The capability of this machine enables us to perform simulation of crowded cellular environments, which are more realistic compared to conventional MD simulations where proteins are simulated in isolation. Living cells are "crowded" because macromolecules comprise ∼30% of their molecular weight. Recently, the effects of crowded cellular environments on protein stability have been revealed through in-cell NMR spectroscopy. To measure the performance of the "K computer", we performed all-atom classical molecular dynamics simulations of two systems: target proteins in a solvent, and target proteins in an environment of molecular crowders that mimic the conditions of a living cell. Using the full system, we achieved 4.4 PFLOPS during a 520 million-atom simulation with cutoff of 28 Å. Furthermore, we discuss the performance and scaling of fast multipole methods for molecular dynamics simulations on the "K computer", as well as comparisons with Ewald summation methods. © 2014 Elsevier B.V. All rights reserved.

  9. Petascale molecular dynamics simulation using the fast multipole method on K computer

    Ohno, Yousuke

    2014-10-01

    In this paper, we report all-atom simulations of molecular crowding - a result from the full node simulation on the "K computer", which is a 10-PFLOPS supercomputer in Japan. The capability of this machine enables us to perform simulation of crowded cellular environments, which are more realistic compared to conventional MD simulations where proteins are simulated in isolation. Living cells are "crowded" because macromolecules comprise ∼30% of their molecular weight. Recently, the effects of crowded cellular environments on protein stability have been revealed through in-cell NMR spectroscopy. To measure the performance of the "K computer", we performed all-atom classical molecular dynamics simulations of two systems: target proteins in a solvent, and target proteins in an environment of molecular crowders that mimic the conditions of a living cell. Using the full system, we achieved 4.4 PFLOPS during a 520 million-atom simulation with cutoff of 28 Å. Furthermore, we discuss the performance and scaling of fast multipole methods for molecular dynamics simulations on the "K computer", as well as comparisons with Ewald summation methods. © 2014 Elsevier B.V. All rights reserved.

  10. Simulation research about China Experimental Fast Reactor steam turboset based on Flowmaster platform

    Yan Hao; Tian Zhaofei

    2014-01-01

    In the third loop of the China Experimental Fast Reactor (CEFR), the steam turbosets play an important role in converting heat into electric energy. However, the turbosets have not been operated above 40%P_0 (where P_0 is full power) since they were installed, so it is necessary to study them by simulation. Based on the real models of the turbosets in CEFR, simulation models were created on the Flowmaster platform. Using these models, a steady-state result at full power was obtained, in accordance with the design parameters. Meanwhile, a transient simulation with the operating condition ranging from full power to 40%P_0 was carried out, and the result verifies part of the performance and running conditions of the turbosets. The simulation results show that, on the Flowmaster platform, the running condition of the simulation models complies with the design requirements and offers reference values for actual operation. These simulation models can also serve as a reference for other simulation models of the third loop of CEFR. (authors)

  11. Comparison of Experiment and Simulation of the triple GEM-Based Fast Neutron Detector

    Wang Xiao-Dong; Luo Wen; Zhang Jun-Wei; Yang He-Run; Duan Li-Min; Lu Chen-Gui; Hu Rong-Jiang; Hu Bi-Tao; Zhang Chun-Hui; Yang Lei; Zhou Jian-Rong; An Lv-Xing

    2015-01-01

    A detector for fast neutrons based on a 10 × 10 cm^2 triple gas electron multiplier (GEM) device is developed and tested. A neutron converter, a high density polyethylene (HDPE) layer, is combined with the triple GEM detector cathode and placed inside the detector, in the path of the incident neutrons. The detector is tested by obtaining the energy deposition spectrum with an Am-Be neutron source at the Institute of Modern Physics (IMP) in Lanzhou. In the present work we report the results of the tests and compare them with those of simulations. The transport of fast neutrons and their interactions with the different materials in the detector are simulated with the GEANT4 code to understand the experimental results. The detector displays a clear response to the incident fast neutrons. However, an unexpected disagreement in the energy dependence of the response between the simulated and measured spectra is observed. The neutron sources used in our simulation include deuterium-tritium (DT, 14 MeV), deuterium-deuterium (DD, 2.45 MeV), and Am-Be sources. The simulation results also show that among the secondary particles generated by the incident neutrons, the main contributions to the total energy deposition are from recoil protons induced in the hydrogen-rich HDPE or Kapton (a GEM material), and from activation photons induced by neutron interactions with Ar atoms. Their contributions account for 90% of the total energy deposition. In addition, the dependence of the neutron deposited energy spectrum on the composition of the gas mixture is presented. (paper)

  12. Supply chain simulation tools and techniques: a survey

    Kleijnen, J.P.C.

    2005-01-01

    The main contribution of this paper is twofold: it surveys different types of simulation for supply chain management, and it discusses several methodological issues. These types of simulation are spreadsheet simulation, system dynamics, discrete-event simulation, and business games.

  13. An approach to value-based simulator selection: The creation and evaluation of the simulator value index tool.

    Rooney, Deborah M; Hananel, David M; Covington, Benjamin J; Dionise, Patrick L; Nykamp, Michael T; Pederson, Melvin; Sahloul, Jamal M; Vasquez, Rachael; Seagull, F Jacob; Pinsky, Harold M; Sweier, Domenica G; Cooke, James M

    2018-04-01

    Currently there is no reliable, standardized mechanism to support health care professionals during the evaluation and procurement of simulators. A tool founded on best practices could facilitate the simulator purchase process. In a 3-phase process, we identified the top factors considered during the simulator purchase process through expert consensus (n = 127), created the Simulator Value Index (SVI) tool, evaluated targeted validity evidence, and evaluated the practical value of the SVI. A web-based survey was sent to simulation professionals. Participants (n = 79) used the SVI and provided feedback. We evaluated the practical value of 4 tool variations by calculating their sensitivity to predict a preferred simulator. Seventeen top factors were identified and ranked. The top 2 were technical stability/reliability of the simulator and customer service, with no practical differences in rank across institution or stakeholder role. Full SVI variations successfully predicted the preferred simulator with good (87%) sensitivity, whereas the sensitivity of variations limited to cost and customer service or cost and technical stability decreased (≤54%). The majority (73%) of participants agreed that the SVI was helpful at guiding simulator purchase decisions, and 88% agreed the SVI tool would help facilitate discussion with peers and leadership. Our findings indicate the SVI supports the simulator purchase process using a standardized framework. Sensitivity of the tool improved when factors extended beyond traditionally targeted ones. We propose the tool will facilitate discussion among simulation professionals, provide essential information for finance and procurement professionals, and improve the long-term value of simulation solutions. Limitations and application of the tool are discussed. Copyright © 2017 Elsevier Inc. All rights reserved.

  14. Development Concept of Guaranteed Verification Electric Power System Simulation Tools and Its Realization

    Gusev Alexander

    2015-01-01

    This article analyses the existing problems of reliability and verification of widespread electric power system (EPS) simulation tools, all of which are based on numerical methods for ordinary differential equations. The concept of guaranteed verification of EPS simulation tools is described, together with the structure of its realization, based on a Simulator capable of continuous, decomposition-free, three-phase EPS simulation in real time over an unlimited range with guaranteed accuracy. The information from the Simulator can be verified using only quasi-steady-state regime data received from SCADA, and such a Simulator can be applied as a standard model for verifying any EPS simulation tool.

  15. Initial Study of An Effective Fast-Time Simulation Platform for Unmanned Aircraft System Traffic Management

    Xue, Min; Rios, Joseph

    2017-01-01

    Small Unmanned Aerial Vehicles (sUAVs), typically 55 lbs and below, are envisioned to play a major role in surveilling critical assets, collecting important information, and delivering goods. Large-scale small UAV operations are expected to happen in low-altitude airspace in the near future. Many static and dynamic constraints exist in low-altitude airspace because of manned aircraft or helicopter activities, various wind conditions, restricted airspace, terrain and man-made buildings, and conflict avoidance among sUAVs. High sensitivity and high maneuverability are unique characteristics of sUAVs that bring challenges to effective system evaluations and call for a simulation platform different from existing simulations built for the manned air traffic system and large fixed-wing unmanned aircraft. NASA's Unmanned aircraft system Traffic Management (UTM) research initiative focuses on enabling safe and efficient sUAV operations in the future. In order to help define requirements and policies for a safe and efficient UTM system that can accommodate a large number of sUAV operations, it is necessary to develop a fast-time simulation platform that can effectively evaluate requirements, policies, and concepts in a close-to-reality environment. This work analyzed the impacts of some key factors, including the aforementioned sUAV characteristics, and demonstrated the importance of these factors in a successful UTM fast-time simulation platform.

  16. A Fast Synthetic Aperture Radar Raw Data Simulation Using Cloud Computing.

    Li, Zhixin; Su, Dandan; Zhu, Haijiang; Li, Wei; Zhang, Fan; Li, Ruirui

    2017-01-08

    Synthetic Aperture Radar (SAR) raw data simulation is a fundamental problem in radar system design and imaging algorithm research. The growth of surveying swath and resolution results in a significant increase in data volume and simulation time, making this a comprehensively data-intensive and computing-intensive issue. Although several high performance computing (HPC) methods have demonstrated their potential for accelerating simulation, the input/output (I/O) bottleneck of huge raw data has not been eased. In this paper, we propose a cloud-computing-based SAR raw data simulation algorithm, which employs the MapReduce model to accelerate the raw data computation and the Hadoop distributed file system (HDFS) for fast I/O access. The MapReduce model is designed for the irregular parallel accumulation of raw data simulation, which greatly reduces the parallel efficiency of graphics processing unit (GPU) based simulation methods. In addition, three kinds of optimization strategies are put forward, covering the programming model, HDFS configuration, and scheduling. The experimental results show that the cloud-computing-based algorithm achieves a 4× speedup over the baseline serial approach in an 8-node cloud environment, and each optimization strategy improves performance by about 20%. This work proves that the proposed cloud algorithm is capable of solving the computing-intensive and data-intensive issues in SAR raw data simulation, and is easily extended to large-scale computing to achieve higher acceleration.
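
The MapReduce formulation can be illustrated with a toy point-target echo accumulation: each map task computes one target's contribution to the raw data array, and the reduce step sums the partial arrays. This is a hedged plain-Python sketch, not the authors' Hadoop implementation; the simplified echo model, constants, and function names are all invented for illustration.

```python
from functools import reduce
import cmath

# Toy point-target SAR echo accumulation in a map/reduce style.
# The echo model below is a bare phase-history placeholder, not a real SAR model.

N_SAMPLES = 64
WAVELENGTH = 0.03  # radar wavelength in metres (assumed)

def echo_contribution(target):
    """Map step: one target's contribution to every range sample."""
    amplitude, slant_range = target
    return [amplitude * cmath.exp(-4j * cmath.pi * (slant_range + n * 0.5) / WAVELENGTH)
            for n in range(N_SAMPLES)]

def accumulate(echo_a, echo_b):
    """Reduce step: sum two partial echo arrays sample by sample."""
    return [a + b for a, b in zip(echo_a, echo_b)]

targets = [(1.0, 850.0), (0.5, 900.0), (0.8, 875.0)]  # (reflectivity, slant range)

# The map phase is embarrassingly parallel; the reduce phase is the
# irregular accumulation that MapReduce handles naturally.
raw_data = reduce(accumulate, map(echo_contribution, targets))

# Serial reference for comparison
serial = [0j] * N_SAMPLES
for t in targets:
    serial = accumulate(serial, echo_contribution(t))
```

In a real Hadoop job the map outputs would be partial raw-data blocks written to HDFS and the reduce would merge them, but the algebra is the same associative summation shown here.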

  17. Computer simulation of fuel behavior during loss-of-flow accidents in a gas-cooled fast reactor

    Wehner, T.R.

    1980-01-01

    The sequence of events in a loss-of-flow accident without reactor shutdown in a gas-cooled fast breeder reactor is strongly influenced by the manner in which the fuel deforms. In order to predict the mode of initial gross fuel deformation (swelling, melting, or cracking), a thermomechanical computer simulation program was developed. The methods and techniques used make the simulation an economical, efficient, and flexible engineering tool. An innovative application of the enthalpy model within a finite difference scheme is used to calculate temperatures in the fuel rod. The method of successive elastic solutions is used to calculate the thermoelastic-creep response. Calculated stresses are compared with a brittle-fracture stress criterion. An independent computer code is used to calculate fission-gas-induced fuel swelling. Results obtained with the computer simulation indicate that swelling is not a mode of initial fuel deformation. Faster transients result in fuel melting, while slower transients result in fuel cracking. For the investigated faster coolant flow coastdowns, with time constants of 1 second and 10 seconds, compressive stresses in the outer radial portion of the fuel limit fuel swelling and inhibit fuel cracking. For a slower coolant flow coastdown with a 300 second time constant, tensile stresses in the outer radial portion of the fuel induce early fuel cracking before any melting or significant fuel swelling has occurred. Suggestions for further research are discussed. A derived noniterative solution for the mechanics calculations may offer an order-of-magnitude decrease in computational effort
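
The enthalpy model mentioned above can be sketched as an explicit finite-difference update in which the conserved variable is volumetric enthalpy and temperature is recovered through a piecewise relation with a latent-heat plateau. The following minimal 1D sketch uses invented round-number material properties, not GCFR fuel data.

```python
# Minimal 1D explicit enthalpy-method conduction sketch with a melting plateau.
# All material numbers are illustrative; node 0 acts as a heated adiabatic
# boundary and the outer node is held at its initial temperature.

NX = 20          # nodes
DX = 1e-3        # node spacing (m)
DT = 1e-3        # time step (s), stable for the diffusivity below
RHO_C = 3.0e6    # volumetric heat capacity (J/m^3/K)
K = 4.0          # conductivity (W/m/K)
T_MELT = 3000.0  # melting temperature (K)
LATENT = 1.0e9   # volumetric latent heat (J/m^3)

def temperature(h):
    """Recover temperature from volumetric enthalpy: the enthalpy model."""
    h_solidus = RHO_C * T_MELT
    if h < h_solidus:
        return h / RHO_C                                   # solid
    if h < h_solidus + LATENT:
        return T_MELT                                      # melting plateau
    return T_MELT + (h - h_solidus - LATENT) / RHO_C       # liquid

H = [RHO_C * 1500.0] * NX            # uniform 1500 K start
for step in range(2000):
    T = [temperature(h) for h in H]
    H_new = H[:]
    for i in range(1, NX - 1):
        # Explicit update of the conserved variable: dH/dt = k * d2T/dx2
        H_new[i] = H[i] + DT * K * (T[i + 1] - 2 * T[i] + T[i - 1]) / DX ** 2
    H_new[0] += DT * 5.0e7           # crude constant heat source at node 0
    H = H_new

T_final = [temperature(h) for h in H]
```

Because the update is written in enthalpy rather than temperature, a melting node needs no interface tracking: it simply dwells at T_MELT while absorbing latent heat, which is what makes the scheme economical for transient fuel calculations.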

  18. Real-Time and Real-Fast Performance of General-Purpose and Real-Time Operating Systems in Multithreaded Physical Simulation of Complex Mechanical Systems

    Carlos Garre

    2014-01-01

    Physical simulation is a valuable tool in many fields of engineering for the tasks of design, prototyping, and testing. General-purpose operating systems (GPOS) are designed for real-fast tasks, such as offline simulation of complex physical models that should finish as soon as possible. Interfacing hardware at a given rate (as in a hardware-in-the-loop test) requires instead maximizing time determinism, for which real-time operating systems (RTOS) are designed. In this paper, the real-fast and real-time performance of an RTOS and a GPOS are compared when simulating models of high complexity with large time steps. This type of application is common in the automotive industry and requires a good trade-off between real-fast and real-time performance. The performance of an RTOS and a GPOS is compared by running a tire model scalable in the number of degrees of freedom and parallel threads. The benchmark shows that the GPOS presents better performance in real-fast runs but worse in real-time runs, due to nonexplicit task switches and to the latency associated with interprocess communication (IPC) and task switching.

  19. 3D-Printed Craniosynostosis Model: New Simulation Surgical Tool.

    Ghizoni, Enrico; de Souza, João Paulo Sant Ana Santos; Raposo-Amaral, Cassio Eduardo; Denadai, Rafael; de Aquino, Humberto Belém; Raposo-Amaral, Cesar Augusto; Joaquim, Andrei Fernandes; Tedeschi, Helder; Bernardes, Luís Fernando; Jardini, André Luiz

    2018-01-01

    Craniosynostosis is a complex disease, since its treatment demands deep anatomic perception and a minor mistake during surgery can be fatal. The objective of this report is to present novel 3-dimensional-printed polyamide craniosynostosis models that can improve the understanding and treatment of complex pathologies. The software InVesalius was used for segmentation of the anatomy images (from 3 patients between 6 and 9 months old). Afterward, the files were transferred to a 3-dimensional printing system and, with the use of an infrared laser, slices of PA 2200 powder were consecutively added to build a polyamide model of the cranial bone. The 3 craniosynostosis models allowed fronto-orbital advancement, the Pi procedure, and posterior distraction in an operating room environment. All aspects of the craniofacial anatomy could be shown on the models, as well as the most common craniosynostosis pathologic variations (sphenoid wing elevation, shallow orbits, jugular foramen stenosis). Another advantage of the model is its low cost, about 100 U.S. dollars or even less when several models are produced. Simulation is becoming an essential part of medical education, both for surgical training and for improving surgical safety through adequate planning. This new polyamide craniosynostosis model allowed the surgeons to have realistic tactile feedback while manipulating a child's bone and permitted execution of the main procedures for anatomic correction. Because it is a low-cost model, it is an excellent option for training purposes and potentially a new important tool to improve the quality of the management of patients with craniosynostosis. Copyright © 2017 Elsevier Inc. All rights reserved.

  20. Simulation and Comparison Between Slow and Fast FH/BPSK Spread Spectrum Using Matlab

    Sanaa Said Kadhim

    2018-02-01

    This paper investigates the properties and applications of Frequency Hopping Spread Spectrum (FHSS). FHSS is a radio communication technique in which the sender transmits the data on a radio channel whose transmission frequency changes according to a predetermined code sequence. FHSS has many advantages over traditional modulation methods: it can overcome fading, multipath channels, and interference, and interception becomes difficult. This security feature makes FHSS especially attractive for military applications. At the receiver side, the signal is demodulated by a carrier whose frequency changes according to the same code sequence used by the sender. This paper presents two types of FHSS, slow and fast. The simulation procedures for both types were implemented and applied to a Frequency Hopping/Binary Phase Shift Keying (FH/BPSK) spread spectrum system using MATLAB. The fast and slow frequency hopping simulations use the same number and frequencies of spreading carriers, and both use traditional BPSK modulation. The comparison results, based on the power spectral density, show that fast frequency hopping is more resistant to noise than slow frequency hopping.
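
The slow/fast distinction can be sketched in a few lines: slow FH keeps one carrier for several bits, while fast FH changes carrier several times within a single bit. The following plain-Python baseband sketch is illustrative only; the carrier set, sample rate, and PN sequence are hypothetical and this is not the paper's MATLAB implementation.

```python
import math, random

# Toy FH/BPSK spreader. Real FHSS would use a cryptographic PN generator
# shared with the receiver; random.randrange is only a stand-in here.

random.seed(1)
CARRIERS = [1000.0, 1500.0, 2000.0, 2500.0]   # hop frequency set (Hz, assumed)
FS = 20000.0                                   # sample rate (Hz)
SAMPLES_PER_BIT = 200

def fh_bpsk(bits, samples_per_hop, pn):
    """Spread BPSK bits over hopping carriers; hop length sets slow vs fast FH."""
    signal = []
    for i, bit in enumerate(bits):
        phase = 0.0 if bit else math.pi          # BPSK mapping: 1 -> 0 rad, 0 -> pi
        for n in range(SAMPLES_PER_BIT):
            s = i * SAMPLES_PER_BIT + n          # absolute sample index
            carrier = CARRIERS[pn[(s // samples_per_hop) % len(pn)]]
            signal.append(math.cos(2 * math.pi * carrier * s / FS + phase))
    return signal

bits = [1, 0, 1, 1]
pn = [random.randrange(len(CARRIERS)) for _ in range(32)]   # stand-in PN sequence

slow = fh_bpsk(bits, samples_per_hop=2 * SAMPLES_PER_BIT, pn=pn)   # one hop spans 2 bits
fast = fh_bpsk(bits, samples_per_hop=SAMPLES_PER_BIT // 4, pn=pn)  # 4 hops per bit
```

Over these four bits the slow signal visits only 2 carriers while the fast signal visits 16 hop slots, which is why fast FH spreads narrowband interference across many hops and tolerates noise better.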

  1. Monte Carlo simulations of the particle transport in semiconductor detectors of fast neutrons

    Sedlačková, Katarína; Zaťko, Bohumír; Šagátová, Andrea; Nečas, Vladimír

    2013-01-01

    Several Monte Carlo all-particle transport codes are under active development around the world. In this paper we focus on the capabilities of the MCNPX (Monte Carlo N-Particle eXtended) code to follow the particle transport in a semiconductor detector of fast neutrons. A semiconductor detector based on semi-insulating GaAs was the object of our investigation. As the converter material capable of producing charged particles from the (n, p) interaction, high-density polyethylene (HDPE) was employed. As the source of fast neutrons, a ²³⁹Pu–Be neutron source was used in the model. The simulations were performed using the MCNPX code, which makes it possible to track not only neutrons but also recoil protons at all energies of interest. Hence, the MCNPX code enables seamless particle transport and no other computer program is needed to process the particle transport. The determination of the optimal thickness of the conversion layer and the minimum thickness of the active region of the semiconductor detector, as well as the simulation of energy spectra, were the principal goals of the computer modeling. Theoretical detector responses showed that the best detection efficiency is achieved with a 500 μm thick HDPE converter layer. The minimum detector active region thickness has been estimated to be about 400 μm. -- Highlights: ► Application of the MCNPX code for fast neutron detector design is demonstrated. ► Simulations of the particle transport through the HDPE conversion film are presented. ► Simulations of the particle transport through the detector active region are presented. ► The optimal thickness of the HDPE conversion film has been calculated. ► A detection efficiency of 0.135% was reached for a 500 μm thick HDPE conversion film
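
The converter-thickness trade-off behind the quoted optimum can be illustrated with a toy Monte Carlo: a thicker converter converts more neutrons, but a recoil proton born too far from the back face cannot escape into the active region. The mean free path and proton range below are invented round numbers, not HDPE/GaAs data, so in this sketch the optimum falls at the assumed proton range rather than at 500 μm.

```python
import math, random

# Toy Monte Carlo of a neutron converter foil: a neutron interacts at an
# exponentially distributed depth, and the recoil proton reaches the detector
# only if it is born within its range of the back surface.
# LAMBDA and P_RANGE are illustrative assumptions, not measured data.

random.seed(42)
LAMBDA = 800.0   # neutron mean free path in the converter (micron, assumed)
P_RANGE = 300.0  # recoil proton range (micron, assumed)

def efficiency(thickness_um, n_neutrons=20000):
    detected = 0
    for _ in range(n_neutrons):
        depth = -LAMBDA * math.log(random.random())   # exponential interaction depth
        if depth < thickness_um and thickness_um - depth < P_RANGE:
            detected += 1   # proton born close enough to the back face to escape
    return detected / n_neutrons

effs = {t: efficiency(t) for t in (100, 300, 500, 900, 1500)}
best = max(effs, key=effs.get)
```

Analytically the detected fraction is exp(-(t - R)/λ) - exp(-t/λ) for t > R, which rises up to the proton range R and then falls, reproducing the peaked efficiency curve that the MCNPX study maps out with full physics.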

  2. Fast simulation of non-linear pulsed ultrasound fields using an angular spectrum approach

    Du, Yigang; Jensen, Jørgen Arendt

    2013-01-01

    A fast non-linear pulsed ultrasound field simulation is presented. It is implemented based on an angular spectrum approach (ASA), which analytically solves the non-linear wave equation. The ASA solution to the Westervelt equation is derived in detail. The calculation speed is significantly increased compared to a numerical solution using an operator splitting method (OSM). The ASA has been modified and extended to pulsed non-linear ultrasound fields in combination with Field II, where any array transducer with arbitrary geometry, excitation, focusing and apodization can be simulated with a center frequency of 5 MHz. The speed is increased approximately by a factor of 140 and the calculation time is 12 min with a standard PC, when simulating the second harmonic pulse at the focal point. For the second harmonic point spread function the full width error is 1.5% at 6 dB and 6.4% at 12 dB.
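
The linear kernel of an angular spectrum approach is a multiplication of the field's spatial spectrum by exp(i·k_z·z); the non-linear Westervelt terms treated in the paper are not modelled here. Below is a minimal 1D monochromatic sketch with an invented grid, using a naive DFT so it stays self-contained.

```python
import cmath, math

# Minimal linear angular-spectrum propagation in 1D (monochromatic, lossless).
# Grid, aperture, and wavelength are arbitrary illustrative values.

N = 32
DX = 0.5e-3                    # lateral sample spacing (m)
K0 = 2 * math.pi / 0.3e-3      # wavenumber for a 0.3 mm wavelength (~5 MHz in water)

def dft(x, sign):
    """Naive DFT; sign=-1 forward, sign=+1 for the (unnormalised) inverse."""
    n = len(x)
    return [sum(x[m] * cmath.exp(sign * 2j * math.pi * m * k / n) for m in range(n))
            for k in range(n)]

def propagate(field, z):
    """Advance the field a distance z via the angular spectrum transfer function."""
    spectrum = dft(field, -1)
    out = []
    for idx, a in enumerate(spectrum):
        fx = (idx if idx <= N // 2 else idx - N) / (N * DX)   # spatial frequency
        kx = 2 * math.pi * fx
        kz = cmath.sqrt(K0 ** 2 - kx ** 2)   # evanescent components get imaginary kz
        out.append(a * cmath.exp(1j * kz * z))
    return [v / N for v in dft(out, +1)]     # normalised inverse DFT

# Gaussian source aperture, propagated forward then back
src = [cmath.exp(-((i - N / 2) * DX) ** 2 / (2e-3) ** 2) for i in range(N)]
there = propagate(src, 5e-3)
back = propagate(there, -5e-3)
err = max(abs(a - b) for a, b in zip(src, back))
```

Because the whole propagation is a single spectral multiplication per step, the ASA avoids the fine axial stepping an operator-splitting solver needs, which is the source of the speed-up reported above.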

  3. FBG_SiMul V1.0: Fibre Bragg grating signal simulation tool for finite element method models

    G. Pereira

    2016-01-01

    FBG_SiMul V1.0 is a tool to study and design the implementation of fibre Bragg grating (FBG) sensor solutions in any arbitrarily loaded structure or application. The software removes the need for a fibre optics expert user and makes simulating the sensor response of a structural health monitoring solution using FBG sensors simpler and faster. The software uses a modified T-Matrix method to simulate the FBG reflected spectrum based on the stress and strain from a finite element method model. The article describes the theory and algorithm implementation, followed by an empirical validation.
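
The T-Matrix idea can be sketched with the standard coupled-mode 2×2 transfer matrices of piecewise-uniform grating sections; FBG_SiMul additionally feeds FEM stress/strain into each section, which is omitted in this uniform, unstrained illustration. All grating parameters below are assumptions.

```python
import cmath, math

# Piecewise-uniform transfer-matrix sketch of an FBG reflection spectrum,
# following the standard coupled-mode 2x2 formulation. Illustrative values only.

N_EFF = 1.447          # effective refractive index (assumed)
LAMBDA_D = 1550e-9     # design (Bragg) wavelength (m)
KAPPA = 400.0          # AC coupling coefficient (1/m, assumed)
LENGTH = 5e-3          # grating length (m)
SECTIONS = 20          # uniform sections (strain would vary these in FBG_SiMul)

def reflectivity(lam):
    dz = LENGTH / SECTIONS
    delta = 2 * math.pi * N_EFF * (1.0 / lam - 1.0 / LAMBDA_D)  # detuning
    gamma = cmath.sqrt(complex(KAPPA ** 2 - delta ** 2))
    m11, m12, m21, m22 = 1.0, 0.0, 0.0, 1.0    # identity accumulator
    for _ in range(SECTIONS):
        c, s = cmath.cosh(gamma * dz), cmath.sinh(gamma * dz)
        f11 = c - 1j * (delta / gamma) * s
        f12 = -1j * (KAPPA / gamma) * s
        f21 = 1j * (KAPPA / gamma) * s
        f22 = c + 1j * (delta / gamma) * s
        m11, m12, m21, m22 = (f11 * m11 + f12 * m21, f11 * m12 + f12 * m22,
                              f21 * m11 + f22 * m21, f21 * m12 + f22 * m22)
    rho = m21 / m11                 # amplitude reflection coefficient
    return abs(rho) ** 2

peak = reflectivity(LAMBDA_D)       # on-resonance: tanh^2(kappa * L)
off = reflectivity(LAMBDA_D + 5e-9)
```

In the full tool each section's detuning is shifted by the local FEM strain, so the product of section matrices captures chirped and non-uniformly loaded gratings with the same 2×2 machinery.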

  4. FEM-DEM coupling simulations of the tool wear characteristics in prestressed machining superalloy

    Ruitao Peng

    2016-01-01

    Due to the complicated contact loading at the tool-chip interface, ceramic tool wear in prestressed machining of superalloys is rather difficult to evaluate by experimental approaches alone. This study aims to develop a methodology to predict the tool wear evolution by using combined FEM and DEM numerical simulations. Firstly, a finite element model for prestressed cutting is established; subsequently, a discrete element model describing the tool-chip behaviour is established based on the boundary conditions obtained from the FEM simulations; finally, the simulated results are experimentally validated. The predicted tool wear results show good agreement with experiments. The simulation indicates that, within a certain range, a higher cutting speed effectively results in less wear of Sialon ceramic tools, while a deeper depth of cut leads to more serious tool wear.

  5. ZEST: A Fast Code for Simulating Zeeman-Stark Line-Shape Functions

    Franck Gilleron

    2018-03-01

    We present the ZEST code, dedicated to the calculation of line shapes broadened by Zeeman and Stark effects. Regarding the Stark effect, the model is based on the Standard Lineshape Theory, in which ions are treated in the quasi-static approximation, whereas the effects of electrons are represented by weak collisions in the framework of a binary collision relaxation theory. A static magnetic field may be taken into account in the radiator Hamiltonian in the dipole approximation, which leads to additional Zeeman splitting patterns. Ion dynamics effects are implemented using the fast Frequency-Fluctuation Model. For fast calculations, the static ion microfield distribution in the plasma is evaluated using analytic fits of Monte Carlo simulations, which depend only on the ion-ion coupling parameter and the electron-ion screening factor.

  6. Fast Bound Methods for Large Scale Simulation with Application for Engineering Optimization

    Patera, Anthony T.; Peraire, Jaime; Zang, Thomas A. (Technical Monitor)

    2002-01-01

    In this work, we have focused on fast bound methods for large-scale simulation with application to engineering optimization. The emphasis is on the development of techniques that provide both very fast turnaround and a certificate of fidelity; these attributes ensure that the results are indeed relevant to - and trustworthy within - the engineering context. The bound methodology which underlies this work has many different instantiations: finite element approximation, iterative solution techniques, and reduced-basis (parameter) approximation. In this grant we have, in fact, treated all three, but most of our effort has been concentrated on the first and third. We describe these below briefly - but with a pointer to an Appendix which describes, in some detail, the current "state of the art."

  7. Scoring the home falls and accidents screening tool for health professionals (HOME FAST-HP): Evidence from one epidemiological study.

    Mackenzie, Lynette; Byles, Julie

    2018-03-30

    Falls in older people are a major public health concern. To target falls prevention interventions, screening tools need to be able to identify older people at greater risk of falling. This study aimed to investigate the screening capacity of the Home Falls and Accidents Screening Tool for health professionals (HOME FAST-HP) and to identify the best cut-off score for identifying older people at higher risk of falls. The study used cross-sectional data from a random sample of 650 women from the 1921-1926 cohort of the Australian Longitudinal Study on Women's Health (ALSWH). Selected women were sent a postal survey including the HOME FAST-HP, falls history, and other health factors. HOME FAST-HP scores were calculated, and the cut-point giving optimal sensitivity and specificity of the HOME FAST-HP in relation to falls was assessed using a receiver operating characteristic (ROC) curve. A total of 567 older women participated (response rate 87%). The mean age of participants was 77.5 years (95% CI 77.31-77.70). A total of 153 participants (27%) reported a fall in the previous six months. The mean number of hazards identified by the HOME FAST-HP was 9.74 (95% CI 9.48-10.01), range 2-22. Non-fallers had a mean of 9.6 hazards (95% CI 9.32-9.91) and fallers had a mean of 10.63 hazards (95% CI 10.08-11.19), a significant difference (t = 3.41, P = 0.001). The area under the receiver operating curve (AUC) was 0.58 (95% CI 0.53-0.64). A HOME FAST-HP cut-off score of 9 was associated with the optimal sensitivity for falls (73.9%) and specificity (37.9%); the positive predictive value was 30.6% and the negative predictive value was 79.7%. The HOME FAST-HP can be used as a screening tool to identify fallers, with a cut-off score of nine indicating a higher risk of falling. © 2018 Occupational Therapy Australia.
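
The cut-off selection described above can be sketched by sweeping candidate cut-offs and maximizing Youden's J (sensitivity + specificity − 1), one common way of reading the optimum off an ROC curve. The hazard-score lists below are synthetic illustrations, not ALSWH data, so the resulting cut-off differs from the study's.

```python
# Sketch of choosing a screening cut-off from sensitivity/specificity.
# Scores are invented: higher hazard counts loosely associated with falling.

fallers = [11, 9, 12, 10, 8, 13, 10, 11, 9, 14]      # hazard scores, fell
non_fallers = [8, 9, 7, 10, 6, 9, 8, 11, 7, 9]       # hazard scores, did not fall

def sens_spec(cutoff):
    """Screen positive means score >= cutoff."""
    tp = sum(s >= cutoff for s in fallers)
    tn = sum(s < cutoff for s in non_fallers)
    return tp / len(fallers), tn / len(non_fallers)

# Youden's J = sensitivity + specificity - 1; the ROC-optimal cut-off maximises it.
cutoffs = range(min(non_fallers), max(fallers) + 1)
best_cut = max(cutoffs, key=lambda c: sum(sens_spec(c)) - 1)
sens, spec = sens_spec(best_cut)
```

In practice a screening tool may deliberately trade specificity for sensitivity, as the HOME FAST-HP does (73.9% vs 37.9%), since missing a future faller is costlier than a false alarm.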

  8. Fast 2D fluid-analytical simulation of ion energy distributions and electromagnetic effects in multi-frequency capacitive discharges

    Kawamura, E.; Lieberman, M. A.; Graves, D. B.

    2014-12-01

    A fast 2D axisymmetric fluid-analytical plasma reactor model using the finite-element simulation tool COMSOL is interfaced with a 1D particle-in-cell (PIC) code to study ion energy distributions (IEDs) in multi-frequency capacitive argon discharges. A bulk fluid plasma model, which solves the time-dependent plasma fluid equations for the ion continuity and electron energy balance, is coupled with an analytical sheath model, which solves for the sheath parameters. The time-independent Helmholtz equation is used to solve for the fields, and a gas flow model solves for the steady-state pressure, temperature and velocity of the neutrals. The results of the fluid-analytical model are used as inputs to a PIC simulation of the sheath region of the discharge to obtain the IEDs at the target electrode. Each 2D fluid-analytical-PIC simulation on a moderate 2.2 GHz CPU workstation with 8 GB of memory took about 15-20 min. The multi-frequency 2D fluid-analytical model was compared to 1D PIC simulations of a symmetric parallel-plate discharge, showing good agreement. We also conducted fluid-analytical simulations of a multi-frequency argon capacitively coupled plasma (CCP) with a typical asymmetric reactor geometry at 2/60/162 MHz. The low-frequency 2 MHz power controlled the sheath width and sheath voltage, while the high frequencies controlled the plasma production. A standing wave was observable at the highest frequency of 162 MHz. We noticed that adding 2 MHz power to a 60 MHz discharge, or 162 MHz to a dual-frequency 2 MHz/60 MHz discharge, can enhance the plasma uniformity. We found that multiple frequencies are useful for controlling not only IEDs but also plasma uniformity in CCP reactors.

  9. Plutonium Worlds. Fast Breeders, Systems Analysis and Computer Simulation in the Age of Hypotheticality

    Sebastian Vehlken

    2014-09-01

    This article examines the media history of one of the hallmark civil nuclear energy programs in Western Germany – the development of Liquid Metal Fast Breeder Reactor (LMFBR) technology. Promoted as a kind of perpetuum mobile of the Atomic Age, the "German Manhattan Project" not only imported big-science thinking. In its context, nuclear technology was also put forth as an avant-garde of scientific inquiry, dealing with the most complex and critical technological endeavors. In the face of the risks of nuclear technology, the German physicist Wolf Häfele thus announced a novel epistemology of "hypotheticality". In a context where traditional experimental engineering strategies became inappropriate, he called for the application of advanced media technologies: computer simulations (CS) and systems analysis (SA) generated computerized spaces for the production of knowledge. In the course of the German Fast Breeder program, such methods had a twofold impact. On the one hand, Häfele emphasized – as the "father of the German Fast Breeder" – the utilization of CS for the actual planning and construction of the novel reactor type. On the other, namely as the director of the department of Energy Systems at the International Institute for Applied Systems Analysis (IIASA), he advised SA-based projections of energy consumption. These computerized scenarios provided the rationale for conceiving of Fast Breeder programs as viable and necessary alternative energy sources in the first place. By focusing on the role of the involved CS techniques, the paper thus investigates the intertwined systems thinking of the planning and construction of nuclear facilities and the design of large-scale energy consumption and production scenarios in the 1970s and 1980s, as well as their conceptual afterlives in our contemporary era of computer simulation.

  10. Development of Fast-Running Simulation Methodology Using Neural Networks for Load Follow Operation

    Seong, Seung-Hwan; Park, Heui-Youn; Kim, Dong-Hoon; Suh, Yong-Suk; Hur, Seop; Koo, In-Soo; Lee, Un-Chul; Jang, Jin-Wook; Shin, Yong-Chul

    2002-01-01

    A new fast-running analytic model has been developed for analyzing load follow operation. The new model is based on neural network theory, which has the capability of modeling the input/output relationships of a nonlinear system. The model is made up of two error back-propagation neural networks and procedures to calculate core parameters, such as the distribution and density of xenon, in a quasi-steady-state core as encountered in load follow operation. One neural network is designed to retrieve the axial offset of the power distribution, and the other the reactivity corresponding to a given core condition. The training data sets for the neural networks in the new model are generated with a three-dimensional nodal code together with the measured data of the first-day load follow operation test. Using the new model, the simulation results of a 5-day load follow test in a pressurized water reactor show good agreement between the simulation data and the actual measured data. The computing time required to simulate a load follow operation is comparable to that of a fast-running lumped model. Moreover, the new model does not require additional engineering factors to compensate for the difference between the actual measurements and analysis results, because of the inherent capability of neural networks to learn new situations.
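
An error back-propagation network of the kind used in the model can be sketched in plain Python: one hidden tanh layer trained by stochastic gradient descent on a synthetic smooth mapping, standing in for the core-condition-to-axial-offset regression. The architecture, training data, and hyperparameters are all invented for illustration.

```python
import math, random

# Minimal 1-input / H-hidden / 1-output back-propagation network trained on a
# synthetic smooth target (sin), not reactor data.

random.seed(0)
H = 8
w1 = [random.uniform(-1, 1) for _ in range(H)]   # input -> hidden weights
b1 = [0.0] * H
w2 = [random.uniform(-1, 1) for _ in range(H)]   # hidden -> output weights
b2 = 0.0
LR = 0.05

def forward(x):
    hidden = [math.tanh(w1[j] * x + b1[j]) for j in range(H)]
    return hidden, sum(w2[j] * hidden[j] for j in range(H)) + b2

data = [(x / 10.0, math.sin(x / 10.0 * math.pi)) for x in range(-10, 11)]

def epoch_loss():
    return sum((forward(x)[1] - y) ** 2 for x, y in data) / len(data)

loss_before = epoch_loss()
for _ in range(2000):
    for x, y in data:
        hidden, out = forward(x)
        err = out - y                                    # dLoss/dOut (squared error)
        for j in range(H):
            grad_h = err * w2[j] * (1 - hidden[j] ** 2)  # back-propagate through tanh
            w2[j] -= LR * err * hidden[j]
            w1[j] -= LR * grad_h * x
            b1[j] -= LR * grad_h
        b2 -= LR * err
loss_after = epoch_loss()
```

Once trained, evaluating the network is a handful of multiply-adds, which is why such a surrogate can match the speed of a fast-running lumped model while retaining the nodal code's input/output behaviour.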

  11. COUPLED SIMULATION OF GAS COOLED FAST REACTOR FUEL ASSEMBLY WITH NESTLE CODE SYSTEM

    Filip Osusky

    2018-05-01

    The paper is focused on a coupled calculation of the Gas Cooled Fast Reactor. Proper modelling of coupled neutronics and thermal-hydraulics is the cornerstone for future safety assessment of the control and emergency systems. Nowadays, system and channel thermal-hydraulic codes are accepted by the national regulatory authorities in the European Union for licensing purposes; therefore, the NESTLE code was used for the simulation. The NESTLE code is a coupled multigroup neutron diffusion code with a thermal-hydraulic sub-channel code. In the paper, the validation of the NESTLE code 5.2.1 installation is presented. The homogeneous parametric cross-section library for the fuel assembly used in the NESTLE simulation is processed with the TRITON sequence of the SCALE code package. The case simulated in the NESTLE code is one fuel assembly of the GFR2400 concept, with a reflective boundary condition in the radial direction and a zero-flux boundary condition in the axial direction. The results of the coupled calculation are presented and are consistent with the GFR2400 study of the GoFastR project.

  12. RTSTEP regional transportation simulation tool for emergency planning - final report.

    Ley, H.; Sokolov, V.; Hope, M.; Auld, J.; Zhang, K.; Park, Y.; Kang, X. (Energy Systems)

    2012-01-20

    such materials over a large area, with responders trying to mitigate the immediate danger to the population in a variety of ways that may change over time (e.g., in-place evacuation, staged evacuations, and declarations of growing evacuation zones over time). In addition, available resources will be marshaled in unusual ways, such as the repurposing of transit vehicles to support mass evacuations. Thus, any simulation strategy will need to be able to address highly dynamic effects and will need to be able to handle any mode of ground transportation. Depending on the urgency and timeline of the event, emergency responders may also direct evacuees to leave largely on foot, keeping roadways as clear as possible for emergency responders, logistics, mass transport, and law enforcement. This RTSTEP project developed a regional emergency evacuation modeling tool for the Chicago Metropolitan Area that emergency responders can use to pre-plan evacuation strategies and compare different response strategies on the basis of a rather realistic model of the underlying complex transportation system. This approach is a significant improvement over existing response strategies that are largely based on experience gained from small-scale events, anecdotal evidence, and extrapolation to the scale of the assumed emergency. The new tool will thus add to the toolbox available to emergency response planners to help them design appropriate generalized procedures and strategies that lead to an improved outcome when used during an actual event.

  13. Selection criteria for building performance simulation tools : contrasting architects' and engineers' needs

    Attia, S.G.; Hensen, J.L.M.; Beltran, L.; De Herde, A.

    2012-01-01

    This article summarises a study undertaken to reveal potential challenges and opportunities for using building performance simulation (BPS) tools. The article reviews current trends in building simulation and outlines major criteria for BPS tool selection and evaluation based on analysing users'

  14. D-VASim: A Software Tool to Simulate and Analyze Genetic Logic Circuits

    Baig, Hasan; Madsen, Jan

    2016-01-01

    -stage researchers with limited experience in the field of biology. The Solution: Using LabVIEW to develop a user-friendly simulation tool named Dynamic Virtual Analyzer and Simulator (D-VASim), which is the first software tool in the domain of synthetic biology that provides a virtual laboratory environment...

  15. Tools and applications for core design and shielding in fast reactors

    Rachamin, Reuven

    2013-01-01

    Outline: • Modeling of SFR cores using the Serpent-DYN3D code sequence; • Core shielding assessment for the design of FASTEF-MYRRHA; • Neutron shielding studies on an advanced Molten Salt Fast Reactor (MSFR) design

  16. X-ray EM simulation tool for ptychography dataset construction

    Stoevelaar, L.P.; Gerini, Giampiero

    2018-01-01

    In this paper, we present an electromagnetic full-wave modeling framework as a supporting EM tool providing datasets for X-ray ptychographic imaging. Modeling the entire scattering problem with Finite Element Method (FEM) tools is, in fact, a prohibitive task, because of the large area illuminated by

  17. Simulation of Cascaded Longitudinal-Space-Charge Amplifier at the Fermilab Accelerator Science & Technology (FAST) Facility

    Halavanau, A. [Northern Illinois U.; Piot, P. [Northern Illinois U.

    2015-12-01

    Cascaded Longitudinal Space Charge Amplifiers (LSCAs) have been proposed as a mechanism to generate density modulation over a broad spectral range. The scheme has recently been demonstrated in the optical regime and has confirmed the production of broadband optical radiation. In this paper we investigate, via numerical simulations, the performance of a cascaded LSCA beamline at the Fermilab Accelerator Science & Technology (FAST) facility to produce broadband ultraviolet radiation. Our studies are carried out using elegant with an included tree-based gridless space-charge algorithm.

  18. Coupled neutronics and thermal-hydraulics numerical simulations of a Molten Salt Fast Reactor (MSFR)

    Laureau, A.; Rubiolo, P.R.; Heuer, D.; Merle-Lucotte, E.; Brovchenko, M.

    2013-01-01

    Coupled neutronics and thermal-hydraulics numerical analyses of a molten salt fast reactor (MSFR) are presented. These preliminary numerical simulations are carried out using the Monte Carlo code MCNP and the Computational Fluid Dynamics code OpenFOAM. The main objectives of this analysis, performed at steady-state reactor conditions, are to confirm the acceptability of the current neutronic and thermal-hydraulic designs of the reactor and to study the effects of the reactor operating conditions on some of the key MSFR design parameters, such as the temperature peaking factor. The effects of the precursors' motion on the reactor safety parameters, such as the effective fraction of delayed neutrons, have also been evaluated. (authors)

  19. SED-ML web tools: generate, modify and export standard-compliant simulation studies.

    Bergmann, Frank T; Nickerson, David; Waltemath, Dagmar; Scharm, Martin

    2017-04-15

    The Simulation Experiment Description Markup Language (SED-ML) is a standardized format for exchanging simulation studies independently of software tools. We present the SED-ML Web Tools, an online application for creating, editing, simulating and validating SED-ML documents. The Web Tools implement all current SED-ML specifications and, thus, support complex modifications and co-simulation of models in SBML and CellML formats. Ultimately, the Web Tools lower the bar on working with SED-ML documents and help users create valid simulation descriptions. Available at http://sysbioapps.dyndns.org/SED-ML_Web_Tools/ (contact: fbergman@caltech.edu).

  20. Development of a two-level modular simulation tool for dysim

    Kofoed, J.E.

    1987-07-01

    A simulation tool to assist the user in constructing continuous simulation models is described. The simulation tool can be used for constructing simulation programmes that are executed with the runtime executive DYSIM86, which applies a modular approach. This approach makes it possible to split a model into several modules. The simulation tool introduces one more level of modularity, at which a module is constructed from submodules taken from a library. A submodule consists of a submodel for a component in the complete model. The simulation tool consists of two precompilers working on the two different levels of modularity. The library is completely open to the user, so it is possible to extend it; this is done by a routine which is also part of the simulation tool. The simulation tool is demonstrated by simulating part of a power plant and part of a sugar factory, illustrating that the precompilers can be used for simulating different types of process plants. 69 ill., 13 tabs., 41 refs. (author)

  1. Environmental risk assessment using the Persian version of the Home Falls and Accidents Screening Tool (HOME FAST) in Iranian elderly

    Bahareh Maghfouri

    2013-05-01

    Full Text Available Introduction: One of the most common problems among older people is falling. Falls inside homes and on streets account for a large share of injuries among Iranian elderly, so identifying environmental hazards at home and carrying out home modification can reduce falls and injury in the elderly. The aim of this study was to identify elderly at risk of falling using the HOME FAST screening tool and to determine the reliability of this tool. Material and Methods: Through the health housing of the town councils in five geographical regions of Tehran, 60 older persons aged 60 to 65 years were selected, and the HOME FAST tool was administered in two stages (inter-rater and test-retest). Results: Test-retest reliability showed item-level agreement above 0.8, indicating very good reliability; the relative agreement of each item within its domain ranged from 0.65 to 1, indicating moderate to high reliability. Inter-rater reliability also showed item-level agreement above 0.8, indicating very good reliability, while the relative agreement of each item within its domain ranged from 0.01 to 1, indicating poor to high reliability. Conclusion: This study shows that the reliability of the HOME FAST is high. The test objectives were appropriate for fall prevention and the tool showed acceptable reliability, so it can be used as a tool by professionals.

  2. Evaluation of JRC source term methodology using MAAP5 as a fast-running crisis tool for a BWR4 Mark I reactor

    Vela-García, M.; Simola, K.

    2016-01-01

    JRC participated in the OECD/NEA FASTRUN benchmark reviewing fast-running software tools to model fission product releases during accidents at nuclear power plants. The main goal of fast-running software tools is to foresee the accident progression, so that mitigating actions can be taken and the population can be adequately protected. Within the FASTRUN, JRC used the MAAP 4.0.8 code and developed a methodology to obtain the source term (as activity released per radioisotope) of PWR and BWR station black-out accident scenarios. The modifications made in the MAAP models were limited to a minimum number of important parameters. This aims at reproducing a crisis situation with a limited time to adapt a generic input deck. This paper presents further studies, where JRC analysed the FASTRUN BWR scenario using MAAP 5.0.2 that has the capability of calculating doses. A sensitivity study was performed with the MAAP 5.0.2 DOSE package deactivated, using the same methodology as in the case of MAAP 4.0.8 for source term calculation. The results were close to the reference LTSBO SOARCA case, independently of the methodology used. One of the benefits of using the MAAP code is the short runtime of the simulations.

  3. Brain activation during fast driving in a driving simulator: the role of the lateral prefrontal cortex.

    Jäncke, Lutz; Brunner, Béatrice; Esslen, Michaela

    2008-07-16

    Little is currently known about the neural underpinnings of the cognitive control of driving behavior in realistic situations and of the driver's speeding behavior in particular. In this study, participants drove in realistic scenarios presented in a high-end driving simulator. Scalp-recorded EEG oscillations in the alpha-band (8-13 Hz) with a 30-electrode montage were recorded while the participants drove under different conditions: (i) excessively fast (Fast), (ii) in a controlled manner at a safe speed (Correct), and (iii) impatiently in the context of testing traffic conditions (Impatient). Intracerebral sources of alpha-band activation were estimated using low resolution electrical tomography. Given that previous studies have shown a strong negative correlation between the BOLD response in the frontal cortex and the alpha-band power, we used alpha-band-related activity as an estimation of frontal activation. Statistical analysis revealed more alpha-band-related activity (i.e. less neuronal activation) in the right lateral prefrontal cortex, including the dorsolateral prefrontal cortex, during fast driving. Those participants who speeded most and exhibited greater risk-taking behavior demonstrated stronger alpha-related activity (i.e. less neuronal activation) in the left anterior lateral prefrontal cortex. These findings are discussed in the context of current theories about the role of the lateral prefrontal cortex in controlling risk-taking behavior, task switching, and multitasking.
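The alpha-band power estimate the study relies on can be illustrated in a few lines. The sketch below is illustrative only (the study estimated intracerebral sources from a 30-electrode montage, which is far more involved); the sampling rate, duration and noise level are assumed values, and the channel is synthetic.

```python
# Sketch: alpha-band (8-13 Hz) power from one synthetic EEG channel.
# Sampling rate, duration and noise level are assumptions, not study values.
import numpy as np
from scipy.signal import welch

fs = 250.0                                   # sampling rate, Hz (assumed)
t = np.arange(0, 10, 1 / fs)                 # 10 s of signal
rng = np.random.default_rng(0)
# Synthetic channel: a 10 Hz alpha rhythm buried in white noise.
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)

freqs, psd = welch(eeg, fs=fs, nperseg=512)  # Welch power spectral density
alpha = (freqs >= 8) & (freqs <= 13)
alpha_power = psd[alpha].sum()               # band power (arbitrary units)
print(alpha_power)
```

Because the synthetic rhythm sits at 10 Hz, the summed PSD in the alpha band dominates neighbouring bands by an order of magnitude.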

  4. Randomized quasi-Monte Carlo simulation of fast-ion thermalization

    Höök, L J; Johnson, T; Hellsten, T

    2012-01-01

    This work investigates the applicability of the randomized quasi-Monte Carlo method for simulation of fast-ion thermalization processes in fusion plasmas, e.g. for simulation of neutral beam injection and radio frequency heating. In contrast to the standard Monte Carlo method, the quasi-Monte Carlo method uses deterministic numbers instead of pseudo-random numbers and has a statistical weak convergence close to O(N⁻¹), where N is the number of markers. We have compared different quasi-Monte Carlo methods for a neutral beam injection scenario, which is solved by many realizations of the associated stochastic differential equation, discretized with the Euler-Maruyama scheme. The statistical convergence of the methods is measured for time steps up to 2¹⁴. (paper)
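The pseudo-random versus quasi-random contrast the abstract draws can be sketched on a toy expectation. This is not the paper's thermalization solver; scrambled Sobol points from SciPy stand in for the randomized quasi-Monte Carlo sequence, and the estimand E[X²] = 1 for X ~ N(0, 1) is purely illustrative.

```python
# Sketch: randomized quasi-Monte Carlo (scrambled Sobol) vs. pseudo-random
# Monte Carlo for the toy expectation E[X^2] = 1, X ~ N(0, 1).
import numpy as np
from scipy.stats import norm, qmc

N = 2**10  # number of markers (a power of 2 suits Sobol points)

# Standard Monte Carlo: pseudo-random uniforms mapped to normal samples.
rng = np.random.default_rng(1)
x_mc = norm.ppf(rng.random(N))
err_mc = abs(np.mean(x_mc**2) - 1.0)

# Randomized QMC: scrambled Sobol points mapped through the same inverse CDF.
u = qmc.Sobol(d=1, scramble=True, seed=1).random(N).ravel()
u = np.clip(u, 1e-12, 1 - 1e-12)  # guard the inverse-CDF endpoints
x_qmc = norm.ppf(u)
err_qmc = abs(np.mean(x_qmc**2) - 1.0)

print(err_mc, err_qmc)
```

With growing N the QMC error typically decays close to O(N⁻¹) versus O(N⁻¹ᐟ²) for plain Monte Carlo, which is the convergence gap the paper exploits.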

  6. Design, simulation and analysis of piezoresistive MEMS pressure sensor for fast reactor applications

    Patankar, Mahesh Kumar; Murali, N.; Satya Murty, S.A.V.; Kalyana Rao, K.; Sridhar, S.

    2013-01-01

    To exploit the extraordinary benefits of MEMS technology in the fast reactor domain, a piezoresistive MEMS pressure sensor was designed, simulated and analyzed using Intellisuite software to measure the RCB air pressure in the 0 - 1.25 bar (a) range. For sensing the pressure, a thin square silicon diaphragm of size 800 × 800 μm² and thickness 20 μm was optimized using FEM analysis. To transfer the mechanical stress induced in the diaphragm by pressure into an electrical output voltage signal, a set of piezoresistors was arranged on the top side of the diaphragm in a fully active Wheatstone bridge configuration to obtain higher sensitivity. The simulation results were compared with the analytical results and found in line with expectations, and an electrical sensitivity of 15 mV/V·bar was obtained. (author)
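For a linear full-bridge sensor, the quoted sensitivity of 15 mV/V·bar translates directly into an output voltage. A minimal sanity check follows; the 5 V excitation below is an assumed value, not from the paper.

```python
# Full-bridge output of a linear piezoresistive pressure sensor.
# The sensitivity is from the abstract; the excitation voltage is assumed.
def bridge_output_mV(sensitivity_mV_per_V_bar, excitation_V, pressure_bar):
    """Output in millivolts for a linear sensor: S * Vexc * P."""
    return sensitivity_mV_per_V_bar * excitation_V * pressure_bar

full_scale = bridge_output_mV(15.0, 5.0, 1.25)
print(full_scale)  # 93.75 mV at the 1.25 bar full-scale pressure
```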

  7. BRENDA: a dynamic simulator for a sodium-cooled fast reactor power plant

    Hetrick, D.L.; Sowers, G.W.

    1978-06-01

    This report is a users' manual for one version of BRENDA (Breeder Reactor Nuclear Dynamic Analysis), which is a digital program for simulating the dynamic behavior of a sodium-cooled fast reactor power plant. This version, which contains 57 differential equations, represents a simplified model of the Clinch River Breeder Reactor Project (CRBRP). BRENDA is an input deck for DARE P (Differential Analyzer Replacement, Portable), which is a continuous-system simulation language developed at the University of Arizona. This report contains brief descriptions of DARE P and BRENDA, instructions for using BRENDA in conjunction with DARE P, and some sample output. A list of variable names and a listing for BRENDA are included as appendices

  8. Fast and accurate protein substructure searching with simulated annealing and GPUs

    Stivala Alex D

    2010-09-01

    Full Text Available Abstract Background Searching a database of protein structures for matches to a query structure, or occurrences of a structural motif, is an important task in structural biology and bioinformatics. While there are many existing methods for structural similarity searching, faster and more accurate approaches are still required, and few current methods are capable of substructure (motif) searching. Results We developed an improved heuristic for tableau-based protein structure and substructure searching using simulated annealing that is as fast as or faster than, and comparable in accuracy with, some widely used existing methods. Furthermore, we created a parallel implementation on a modern graphics processing unit (GPU). Conclusions The GPU implementation achieves up to 34 times speedup over the CPU implementation of tableau-based structure search with simulated annealing, making it one of the fastest available methods. To the best of our knowledge, this is the first application of a GPU to the protein structural search problem.
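The simulated-annealing core of such a search follows a standard accept/reject loop. Below is a generic skeleton exercised on a toy one-dimensional cost; it is not the authors' tableau-matching heuristic, and the function names and cooling schedule are illustrative.

```python
# Generic simulated-annealing skeleton with geometric cooling (sketch).
import math
import random

def simulated_annealing(cost, neighbour, state, t0=1.0, cooling=0.995,
                        steps=5000, seed=0):
    rng = random.Random(seed)
    cur = best = state
    t = t0
    for _ in range(steps):
        cand = neighbour(cur, rng)
        delta = cost(cand) - cost(cur)
        # Always accept downhill moves; accept uphill with Boltzmann probability.
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            cur = cand
            if cost(cur) < cost(best):
                best = cur
        t *= cooling  # geometric cooling schedule
    return best

cost = lambda x: (x - 7) ** 2                  # toy cost, minimum at x = 7
neighbour = lambda x, rng: x + rng.choice([-1, 1])
best_state = simulated_annealing(cost, neighbour, state=50)
print(best_state)
```

In the paper the cost is a tableau-matching score and the neighbour move perturbs the alignment; the GPU version runs many such chains in parallel.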

  9. Development of Fast-Time Stochastic Airport Ground and Runway Simulation Model and Its Traffic Analysis

    Ryota Mori

    2015-01-01

    Full Text Available Airport congestion, in particular congestion of departure aircraft, has already been discussed by other researchers. Most solutions, though, fail to account for uncertainties. Since it is difficult to remove uncertainties from real-world operations, a strategy should be developed assuming such uncertainties exist. Therefore, this research develops a fast-time stochastic simulation model used to validate various methods for decreasing the airport congestion level under existing uncertainties. The surface movement data is analyzed first, and the uncertainty level is obtained. Next, based on the result of the data analysis, the stochastic simulation model is developed. The model is validated statistically, and the characteristics of airport operation under existing uncertainties are investigated.

  10. Simulators of tray distillation columns as tools for interpreting ...

    ... at 0.05 m intervals were determined from top to bottom of simulators of tray distillation columns exposed to 20 mCi of ¹³⁷Cs. Signals generated from the simulators were identical to the experimental signals obtained from the Stabilizer Column of the crude oil distillation unit at the Tema Oil Refinery Ghana Limited.

  11. Business Simulation Games: Effective Teaching Tools or Window Dressing?

    Tanner, John R.; Stewart, Geoffrey; Totaro, Michael W.; Hargrave, Melissa

    2012-01-01

    Business simulations serve as learning platforms that stimulate the "gaming" interest of students, that provide a structured learning environment, and that should help manage the time resources of faculty. Simulations appear to provide a context where students feel learning can take place. However, faculty perception of simulation…

  12. PCISIM - A Simulation Tool for PCI Bus Based Systems

    Sharp, Robin

    1999-01-01

    This document describes a PCI bus simulator for use in evaluating the feasibility of system designs based on this bus.

  13. Innovative tools for real-time simulation of dynamic systems

    Palli, Gianluca; Carloni, Raffaella; Melchiorri, Claudio

    2008-01-01

    In this paper, we present a software architecture, based on RTAI-Linux, for the real-time simulation of dynamic systems and for the rapid prototyping of digital controllers. Our aim is to simplify the testing phase of digital controllers by providing the real-time simulation of the plant with the

  14. Fast 2D Fluid-Analytical Simulation of IEDs and Plasma Uniformity in Multi-frequency CCPs

    Kawamura, E.; Lieberman, M. A.; Graves, D. B.

    2014-10-01

    A fast 2D axisymmetric fluid-analytical model using the finite element tool COMSOL is interfaced with a 1D particle-in-cell (PIC) code to study ion energy distributions (IEDs) in multi-frequency argon capacitively coupled plasmas (CCPs). A bulk fluid plasma model which solves the time-dependent plasma fluid equations is coupled with an analytical sheath model which solves for the sheath parameters. The fluid-analytical results are used as input to a PIC simulation of the sheath region of the discharge to obtain the IEDs at the wafer electrode. Each fluid-analytical-PIC simulation on a moderate 2.2 GHz CPU workstation with 8 GB of memory took about 15-20 minutes. The 2D multi-frequency fluid-analytical model was compared to 1D PIC simulations of a symmetric parallel plate discharge, showing good agreement. Fluid-analytical simulations of a 2/60/162 MHz argon CCP with a typical asymmetric reactor geometry were also conducted. The low 2 MHz frequency controlled the sheath width and voltage while the higher frequencies controlled the plasma production. A standing wave was observable at the highest frequency of 162 MHz. Adding 2 MHz power to a 60 MHz discharge or 162 MHz to a dual frequency 2 MHz/60 MHz discharge enhanced the plasma uniformity. This work was supported by the Department of Energy Office of Fusion Energy Science Contract DE-SC000193, and in part by gifts from Lam Research Corporation and Micron Corporation.

  15. Aeroelastic Simulation Tool for Inflatable Ballute Aerocapture, Phase II

    National Aeronautics and Space Administration — This project will develop a much-needed multidisciplinary analysis tool for predicting the impact of aeroelastic effects on the functionality of inflatable...

  16. Simulation-Based Tool for Traffic Management Training, Phase II

    National Aeronautics and Space Administration — Both the current NAS, as well as NextGen, need successful use of advanced tools. Successful training is required today because more information gathering and...

  17. Simulation-Based Tool for Traffic Management Training, Phase I

    National Aeronautics and Space Administration — Both the current NAS, as well as NextGen, need successful use of advanced tools. Successful training is required today because more information gathering and...

  18. A Simulation Tool for the Conceptual Design of Thermonuclear Pulsors

    Ramos, R.; Gonzalez, J.; Clausse, A.

    2003-01-01

    We worked with an effective model that calculates the neutron production of Plasma Focus devices. From experimental data we obtained different fitting functions for the model's lumped parameters. In this way, we obtained a simple tool for neutron yield calculation. This tool is very useful at the conceptual design stage, because it can easily predict whether a given PF device would be suitable for a certain application.

  19. A Fast Algorithm to Simulate Droplet Motions in Oil/Water Two Phase Flow

    Zhang, Tao

    2017-06-09

    To improve research methods in the petroleum industry, we develop a fast algorithm to simulate droplet motions in oil and water two-phase flow, using a phase field model to describe the phase distribution in the flow process. An efficient partial differential equation solver, the Shift-Matrix method, is applied here to speed up the calculation coding in high-level languages, i.e. Matlab and R. An analytical solution of the order parameter is derived to define the initial condition of the phase distribution. The upwind scheme is applied in our algorithm to make it energy-decay stable, which results in fast calculation. For clarity, we provide the specific code for forming the coefficient matrix used in the Shift-Matrix method. Our algorithm is compared with other methods at different scales, including the Front Tracking and VOSET methods at the macroscopic scale and the LBM method using the RK model at the mesoscopic scale. In addition, we compare the result of droplet motion under gravity using our algorithm with the empirical formula commonly used in industry. The results prove the high efficiency and robustness of our algorithm, and it is then used to simulate the motions of multiple droplets under gravity and cross-direction forces, which is more practical in industry and can be extended to wider applications.
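The stabilizing role of the upwind discretization can be seen already in the simplest setting: a first-order upwind step for 1-D advection of an order-parameter-like field. This is a sketch only, not the paper's two-phase solver; the grid size and CFL number are illustrative.

```python
# 1-D first-order upwind advection: d(phi)/dt + u d(phi)/dx = 0, u > 0.
# The scheme is monotone (creates no new extrema) for CFL numbers c <= 1.
import numpy as np

def upwind_step(phi, u, dx, dt):
    c = u * dt / dx                           # CFL number, must satisfy c <= 1
    return phi - c * (phi - np.roll(phi, 1))  # backward difference, periodic

nx, dx, u = 100, 1.0, 1.0
dt = 0.5 * dx / u                             # c = 0.5, stable
phi = np.zeros(nx)
phi[40:60] = 1.0                              # square pulse as the "order parameter"
for _ in range(100):
    phi = upwind_step(phi, u, dx, dt)
print(phi.min(), phi.max())                   # stays within [0, 1]
```

After 100 steps the pulse has been transported and diffused, but the field never overshoots its initial bounds and the total mass is conserved on the periodic grid, which is the discrete counterpart of the energy-decay stability claimed in the abstract.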

  20. Simulation tools for scattering corrections in spectrally resolved x-ray computed tomography using McXtrace

    Busi, Matteo; Olsen, Ulrik L.; Knudsen, Erik B.; Frisvad, Jeppe R.; Kehres, Jan; Dreier, Erik S.; Khalil, Mohamad; Haldrup, Kristoffer

    2018-03-01

    Spectral computed tomography is an emerging imaging method that involves using recently developed energy discriminating photon-counting detectors (PCDs). This technique enables measurements at isolated high-energy ranges, in which the dominating undergoing interaction between the x-ray and the sample is the incoherent scattering. The scattered radiation causes a loss of contrast in the results, and its correction has proven to be a complex problem, due to its dependence on energy, material composition, and geometry. Monte Carlo simulations can utilize a physical model to estimate the scattering contribution to the signal, at the cost of high computational time. We present a fast Monte Carlo simulation tool, based on McXtrace, to predict the energy resolved radiation being scattered and absorbed by objects of complex shapes. We validate the tool through measurements using a CdTe single PCD (Multix ME-100) and use it for scattering correction in a simulation of a spectral CT. We found the correction to account for up to 7% relative amplification in the reconstructed linear attenuation. It is a useful tool for x-ray CT to obtain a more accurate material discrimination, especially in the high-energy range, where the incoherent scattering interactions become prevailing (>50 keV).

  1. Simulating the Camp David Negotiations: A Problem-Solving Tool in Critical Pedagogy

    McMahon, Sean F.; Miller, Chris

    2013-01-01

    This article reflects critically on simulations. Building on the authors' experience simulating the Palestinian-Israeli-American Camp David negotiations of 2000, they argue that simulations are useful pedagogical tools that encourage creative--but not critical--thinking and constructivist learning. However, they can also have the deleterious…

  2. Simulation as an educational tool in acute nursing care

    Larsen, Mona

    2017-01-01

    satisfying. The purpose of the study was to investigate whether theory-based lessons in combination with simulation-based lectures (FAM Camp) were superior to theory-based lessons alone on the above-mentioned variables. Method: This was a controlled intervention study among nursing students at University College...... lessons. Conclusion: Different skills are required to practice as an acute nurse. Belief in one's own abilities (self-efficacy) is one of them, and this skill was best trained in simulation-based lessons in an authentic setting. In contrast, it is uncertain whether simulation training improves fundamental

  3. Simulator as a tool of training to modern equipment management

    Ahmedyanova Gulnara

    2017-01-01

    Full Text Available In this work, the learning process using a simulator was studied. Both the design and algorithmic content of the simulator and the trainee must each complete their part of the path; only then is the learning result maximized. Theoretically, it is shown that the effectiveness of simulator training is primarily a function of the cognitive-operational and professional-personal aspects of the trainee's competence. The experiment confirmed that, despite differences in the qualities indicated above, the result can be estimated as the sum of their estimates.

  4. An Integrated Simulation Tool for Modeling the Human Circulatory System

    Asami, Ken'ichi; Kitamura, Tadashi

    This paper presents an integrated simulation of the circulatory system in physiological movement. The large circulatory system model includes principal organs and functional units in modules in which comprehensive physiological changes such as nerve reflexes, temperature regulation, acid/base balance, O2/CO2 balance, and exercise are simulated. A beat-by-beat heart model, in which the corresponding electrical circuit problems are solved by a numerical analytic method, enables calculation of pulsatile blood flow to the major organs. The integration of different perspectives on physiological changes makes this simulation model applicable for the microscopic evaluation of blood flow under various conditions in the human body.

  5. siMacro: A Fast and Easy Data Processing Tool for Cell-Based Genomewide siRNA Screens

    Nitin Kumar Singh

    2013-03-01

    Full Text Available Growing numbers of studies employ cell line-based systematic short interfering RNA (siRNA screens to study gene functions and to identify drug targets. As multiple sources of variations that are unique to siRNA screens exist, there is a growing demand for a computational tool that generates normalized values and standardized scores. However, only a few tools have been available so far with limited usability. Here, we present siMacro, a fast and easy-to-use Microsoft Office Excel-based tool with a graphic user interface, designed to process single-condition or two-condition synthetic screen datasets. siMacro normalizes position and batch effects, censors outlier samples, and calculates Z-scores and robust Z-scores, with a spreadsheet output of >120,000 samples in under 1 minute.
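The Z-score and robust Z-score computations named in the abstract are standard screen-normalization statistics. A minimal sketch in Python follows; siMacro itself is an Excel macro and its exact formulas are not reproduced here, and the example readings are invented.

```python
# Z-scores and robust Z-scores for screen readouts (sketch).
import numpy as np

def z_scores(x):
    """Classical Z-score: center by the mean, scale by the sample SD."""
    return (x - np.mean(x)) / np.std(x, ddof=1)

def robust_z_scores(x):
    """Robust Z-score: center by the median, scale by the scaled MAD."""
    med = np.median(x)
    mad = 1.4826 * np.median(np.abs(x - med))  # MAD scaled to ~sigma
    return (x - med) / mad

readings = np.array([1.0, 1.1, 0.9, 1.05, 0.95, 5.0])  # last well is a "hit"
print(robust_z_scores(readings))
```

The robust variant is preferred in screens because a single strong hit (like the last well above) inflates the mean and SD and masks itself in the classical Z-score.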

  6. The DSNP simulation language and its application to liquid-metal fast breeder reactor transient analyses

    Saphier, D.; Madell, J.T.

    1982-01-01

    A new, special-purpose, block-oriented simulation language, the Dynamic Simulator for Nuclear Power Plants (DSNP), was used to perform a dynamic analysis of several conceptual design studies of liquid metal fast breeder reactors. DSNP, being a high-level language, enables the user to transform a power plant flow chart directly into a simulation program using a small number of DSNP statements. In addition to the language statements, the DSNP system has its own precompiler and an extensive library containing models of power plant components, algorithms of physical processes, material property functions, and various auxiliary functions. The comparative analysis covered oxide-fueled versus metal-fueled core designs and loop- versus pool-type reactors. The question of interest was the rate of change of the temperatures in the components in the upper plenum and the primary loop, in particular the reactor outlet nozzle and the intermediate heat exchanger inlet nozzle, during different types of transients. From the simulations performed it can be concluded that metal-fueled cores will have much faster temperature transients than oxide-fueled cores, due mainly to the much higher thermal diffusivity of the metal fuel. The transients in the pool-type design (with either oxide or metal fuel) will be much slower than in the loop-type design, due to the large heat capacity of the sodium pool. The DSNP language was demonstrated to be well suited to performing many types of transient analysis in nuclear power plants.

  7. SIMIFR: A code to simulate material movement in the Integral Fast Reactor

    White, A.M.; Orechwa, Yuri.

    1991-01-01

    The SIMIFR code has been written to simulate the movement of material through a process. This code can be used to investigate inventory differences in material balances, assist in process design, and to produce operational scheduling. The particular process that is of concern to the authors is that centered around Argonne National Laboratory's Integral Fast Reactor. This is a process which involves the irradiation of fissile material for power production, and the recycling of the irradiated reactor fuel pins into fresh fuel elements. To adequately simulate this process it is necessary to allow for locations which can contain either discrete items or homogeneous mixtures. It is also necessary to allow for a very flexible process control algorithm. Further, the code must have the capability of transmuting isotopic compositions and computing internally the fraction of material from a process ending up in a given location. The SIMIFR code has been developed to perform all of these tasks. In addition to simulating the process, the code is capable of generating random measurement values and sampling errors for all locations, and of producing a restart deck so that terminated problems may be continued. In this paper the authors first familiarize the reader with the IFR fuel cycle. The different capabilities of the SIMIFR code are described. Finally, the simulation of the IFR fuel cycle using the SIMIFR code is discussed. 4 figs

  8. Fast simulation of reconstructed phylogenies under global time-dependent birth-death processes.

    Höhna, Sebastian

    2013-06-01

    Diversification rates and patterns may be inferred from reconstructed phylogenies. Both the time-dependent and the diversity-dependent birth-death process can produce the same observed patterns of diversity over time. To develop and test new models describing the macro-evolutionary process of diversification, generic and fast algorithms to simulate under these models are necessary. Simulations are not only important for testing and developing models but play an influential role in the assessment of model fit. In the present article, I consider as the model a global time-dependent birth-death process where each species has the same rates but rates may vary over time. For this model, I derive the likelihood of the speciation times from a reconstructed phylogenetic tree and show that each speciation event is independent and identically distributed. This fact can be used to simulate efficiently reconstructed phylogenetic trees when conditioning on the number of species, the time of the process or both. I show the usability of the simulation by approximating the posterior predictive distribution of a birth-death process with decreasing diversification rates applied on a published bird phylogeny (family Cettiidae). The methods described in this manuscript are implemented in the R package TESS, available from the repository CRAN (http://cran.r-project.org/web/packages/TESS/). Supplementary data are available at Bioinformatics online.
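For contrast with the i.i.d. speciation-time result the paper exploits, here is the naive forward (Gillespie-style) simulation of lineage counts under a constant-rate birth-death process, the kind of simulation the paper's conditioned approach avoids. This is a sketch, not TESS code; the rates and time horizon are illustrative.

```python
# Forward Gillespie simulation of the number of lineages under a
# constant-rate birth-death process, started from one lineage.
import random

def simulate_lineages(birth, death, t_max, seed=0):
    rng = random.Random(seed)
    n, t = 1, 0.0
    while n > 0:
        t += rng.expovariate(n * (birth + death))  # waiting time to next event
        if t >= t_max:
            return n                               # lineage count at t_max
        # The next event is a speciation with probability birth/(birth+death).
        n += 1 if rng.random() < birth / (birth + death) else -1
    return 0                                       # clade went extinct

# Pure-birth (Yule) check: E[N(t)] = exp(birth * t).
runs = [simulate_lineages(1.0, 0.0, 2.0, seed=s) for s in range(300)]
print(sum(runs) / len(runs))  # close to exp(2) ~ 7.39
```

Forward simulation must discard replicates that go extinct or miss the conditioning target, which is exactly the inefficiency that drawing i.i.d. speciation times for the reconstructed tree sidesteps.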

  9. Process simulation of heavy water plants - a powerful analytical tool

    Miller, A.I.

    1978-10-01

    The commercially conscious design of Canadian GS (Girdler-Sulphide) plants has proved sensitive to process conditions. That, combined with the large scale of our units, has meant that computer simulation of their behaviour has been a natural and profitable development. Atomic Energy of Canada Limited has developed a family of steady-state simulations describing all of the Canadian plants. Modelling of plant conditions has demonstrated that the simulation description is very precise, and it has become an integral part of the industry's assessments of both plant operation and decisions on capital expenditures. The simulation technique has also found extensive use in the detailed design of both the rehabilitated Glace Bay plant and the new La Prade plant. It has opened new insights into plant design and uncovered a radical and significant flowsheet change for future designs, as well as many less dramatic but valuable lesser changes. (author)

  10. Wargaming and Simulation as Tools for CONOPS Development

    Rhoads, Russell

    2004-01-01

    The purpose of this thesis is to use wargaming and simulation to gain insight into the effective employment of a new Command Control Communications Computers Intelligence Surveillance and Reconnaissance (C4ISR...

  11. Fast modal simulation of paraxial optical systems: the MIST open source toolbox

    Vajente, Gabriele

    2013-01-01

    This paper presents a new approach to the simulation of optical laser systems in the paraxial approximation, with particular applications to interferometric gravitational wave detectors. The method presented here is based on a standard decomposition of the laser field in terms of Hermite–Gauss transverse modes. The innovative feature consists of a symbolic manipulation of the equations describing the field propagation. This approach allows a huge reduction in the computational time, especially when a large number of higher order modes is needed to properly simulate the system. The new algorithm has been implemented in an open source toolbox, called the MIST, based on the MATLAB® environment. The MIST has been developed and is being used in the framework of the design of advanced gravitational wave detectors. Examples from this field of application will be discussed to illustrate the capabilities and performance of the simulation tool. (paper)

  12. Development of FAST.Farm: A New Multiphysics Engineering Tool for Wind Farm Design and Analysis: Preprint

    Jonkman, Jason; Annoni, Jennifer; Hayman, Greg; Jonkman, Bonnie; Purkayastha, Avi

    2017-01-01

    This paper presents the development of FAST.Farm, a new multiphysics tool applicable to engineering problems in research and industry involving wind farm performance and cost optimization, needed to address the current underperformance, failures, and expenses plaguing the wind industry. Achieving wind cost-of-energy targets - which requires improvements in wind farm performance and reliability, together with reduced uncertainty and expenditures - has been hindered by the complicated nature of the wind farm design problem, especially the sophisticated interaction among atmospheric phenomena, wake dynamics, and array effects. FAST.Farm aims to balance accurate modeling of the relevant physics for predicting power performance and loads with the low computational cost needed to support a highly iterative and probabilistic design process and system-wide optimization. FAST.Farm makes use of FAST to model the aero-hydro-servo-elastics of distinct turbines in the wind farm, and it is based on some of the principles of the Dynamic Wake Meandering (DWM) model, but avoids many of the limitations of existing DWM implementations.

  13. Development of FAST.Farm: A New Multiphysics Engineering Tool for Wind-Farm Design and Analysis

    Jonkman, Jason; Annoni, Jennifer; Hayman, Greg; Jonkman, Bonnie; Purkayastha, Avi

    2017-01-09

    This paper presents the development of FAST.Farm, a new multiphysics tool applicable to engineering problems in research and industry involving wind farm performance and cost optimization, needed to address the current underperformance, failures, and expenses plaguing the wind industry. Achieving wind cost-of-energy targets - which requires improvements in wind farm performance and reliability, together with reduced uncertainty and expenditures - has been hindered by the complicated nature of the wind farm design problem, especially the sophisticated interaction among atmospheric phenomena, wake dynamics, and array effects. FAST.Farm aims to balance accurate modeling of the relevant physics for predicting power performance and loads with the low computational cost needed to support a highly iterative and probabilistic design process and system-wide optimization. FAST.Farm makes use of FAST to model the aero-hydro-servo-elastics of distinct turbines in the wind farm, and it is based on some of the principles of the Dynamic Wake Meandering (DWM) model, but avoids many of the limitations of existing DWM implementations.

  14. Numerical Propulsion System Simulation (NPSS): An Award Winning Propulsion System Simulation Tool

    Stauber, Laurel J.; Naiman, Cynthia G.

    2002-01-01

    The Numerical Propulsion System Simulation (NPSS) is a full propulsion system simulation tool used by aerospace engineers to predict and analyze the aerothermodynamic behavior of commercial jet aircraft, military applications, and space transportation. The NPSS framework was developed to support aerospace, but other applications are already leveraging the initial capabilities, such as aviation safety, ground-based power, and alternative energy conversion devices such as fuel cells. By using the framework and developing the necessary components, future applications that NPSS could support include nuclear power, water treatment, biomedicine, chemical processing, and marine propulsion. NPSS will dramatically reduce the time, effort, and expense necessary to design and test jet engines. It accomplishes that by generating sophisticated computer simulations of an aerospace object or system, thus enabling engineers to "test" various design options without having to conduct costly, time-consuming real-life tests. The ultimate goal of NPSS is to create a numerical "test cell" that enables engineers to create complete engine simulations overnight on cost-effective computing platforms. Using NPSS, engine designers will be able to analyze different parts of the engine simultaneously, perform different types of analysis simultaneously (e.g., aerodynamic and structural), and perform analysis in a more efficient and less costly manner. NPSS will cut the development time of a new engine in half, from 10 years to 5 years. And NPSS will have a similar effect on the cost of development: new jet engines will cost about a billion dollars to develop rather than two billion. NPSS is also being applied to the development of space transportation technologies, and it is expected that similar efficiencies and cost savings will result. 
Advancements of NPSS in fiscal year 2001 included enhancing the NPSS Developer's Kit to easily integrate external components of varying fidelities, providing

  15. Linearly scaling and almost Hamiltonian dielectric continuum molecular dynamics simulations through fast multipole expansions

    Lorenzen, Konstantin; Mathias, Gerald; Tavan, Paul, E-mail: tavan@physik.uni-muenchen.de [Lehrstuhl für BioMolekulare Optik, Ludwig-Maximilians-Universität München, Oettingenstr. 67, 80538 München (Germany)

    2015-11-14

    Hamiltonian Dielectric Solvent (HADES) is a recent method [S. Bauer et al., J. Chem. Phys. 140, 104103 (2014)] which enables atomistic Hamiltonian molecular dynamics (MD) simulations of peptides and proteins in dielectric solvent continua. Such simulations become rapidly impractical for large proteins, because the computational effort of HADES scales quadratically with the number N of atoms. If one tries to achieve linear scaling by applying a fast multipole method (FMM) to the computation of the HADES electrostatics, the Hamiltonian character (conservation of total energy, linear, and angular momenta) may get lost. Here, we show that the Hamiltonian character of HADES can be almost completely preserved, if the structure-adapted fast multipole method (SAMM) as recently redesigned by Lorenzen et al. [J. Chem. Theory Comput. 10, 3244-3259 (2014)] is suitably extended and is chosen as the FMM module. By this extension, the HADES/SAMM forces become exact gradients of the HADES/SAMM energy. Their translational and rotational invariance then guarantees (within the limits of numerical accuracy) the exact conservation of the linear and angular momenta. Also, the total energy is essentially conserved—up to residual algorithmic noise, which is caused by the periodically repeated SAMM interaction list updates. These updates entail very small temporal discontinuities of the force description, because the employed SAMM approximations represent deliberately balanced compromises between accuracy and efficiency. The energy-gradient corrected version of SAMM can also be applied, of course, to MD simulations of all-atom solvent-solute systems enclosed by periodic boundary conditions. However, as we demonstrate in passing, this choice does not offer any serious advantages.

  16. Fault attacks, injection techniques and tools for simulation

    Piscitelli, R.; Bhasin, S.; Regazzoni, F.

    2015-01-01

    Fault attacks are a serious threat to secure devices, because they are powerful and they can be performed with extremely cheap equipment. Resistance against fault attacks is often evaluated directly on the manufactured devices, as commercial tools supporting fault evaluation do not usually provide

  17. Simulation Evaluation of Controller-Managed Spacing Tools under Realistic Operational Conditions

    Callantine, Todd J.; Hunt, Sarah M.; Prevot, Thomas

    2014-01-01

    Controller-Managed Spacing (CMS) tools have been developed to aid air traffic controllers in managing high volumes of arriving aircraft according to a schedule while enabling them to fly efficient descent profiles. The CMS tools are undergoing refinement in preparation for field demonstration as part of NASA's Air Traffic Management (ATM) Technology Demonstration-1 (ATD-1). System-level ATD-1 simulations have been conducted to quantify expected efficiency and capacity gains under realistic operational conditions. This paper presents simulation results with a focus on CMS-tool human factors. The results suggest experienced controllers new to the tools find them acceptable and can use them effectively in ATD-1 operations.

  18. Simulation of wireline sonic logging measurements acquired with Borehole-Eccentered tools using a high-order adaptive finite-element method

    Pardo, David

    2011-07-01

    The paper introduces a high-order, adaptive finite-element method for simulation of sonic measurements acquired with borehole-eccentered logging instruments. The resulting frequency-domain based algorithm combines a Fourier series expansion in one spatial dimension with a two-dimensional high-order adaptive finite-element method (FEM), and incorporates a perfectly matched layer (PML) for truncation of the computational domain. The simulation method was verified for various model problems, including a comparison to a semi-analytical solution developed specifically for this purpose. Numerical results indicate that for a wireline sonic tool operating in a fast formation, the main propagation modes are insensitive to the distance from the center of the tool to the center of the borehole (eccentricity distance). However, new flexural modes arise with an increase in eccentricity distance. In soft formations, we identify a new dipole tool mode which arises as a result of tool eccentricity. © 2011 Elsevier Inc.

  19. Simulation of wireline sonic logging measurements acquired with Borehole-Eccentered tools using a high-order adaptive finite-element method

    Pardo, David; Matuszyk, Paweł Jerzy; Muga, Ignacio; Torres-Verdín, Carlos; Mora Cordova, Angel; Calo, Victor M.

    2011-01-01

    The paper introduces a high-order, adaptive finite-element method for simulation of sonic measurements acquired with borehole-eccentered logging instruments. The resulting frequency-domain based algorithm combines a Fourier series expansion in one spatial dimension with a two-dimensional high-order adaptive finite-element method (FEM), and incorporates a perfectly matched layer (PML) for truncation of the computational domain. The simulation method was verified for various model problems, including a comparison to a semi-analytical solution developed specifically for this purpose. Numerical results indicate that for a wireline sonic tool operating in a fast formation, the main propagation modes are insensitive to the distance from the center of the tool to the center of the borehole (eccentricity distance). However, new flexural modes arise with an increase in eccentricity distance. In soft formations, we identify a new dipole tool mode which arises as a result of tool eccentricity. © 2011 Elsevier Inc.

  20. Battery Simulation Tool for Worst Case Analysis and Mission Evaluations

    Lefeuvre Stéphane

    2017-01-01

    The first part of this paper presents the PSpice models, including their respective variable parameters at SBS and cell level. The second part introduces the model parameters that were chosen and identified to perform Monte Carlo Analysis simulations. The third part presents some MCA results for a VES16 battery module. Finally, the reader will see some other simulations that were performed by re-using the battery model for another Saft battery cell type (MP XTD) for a specific space application, at high temperature.

  1. GPU-FS-kNN: a software tool for fast and scalable kNN computation using GPUs.

    Ahmed Shamsul Arefin

    BACKGROUND: The analysis of biological networks has become a major challenge due to the recent development of high-throughput techniques that are rapidly producing very large data sets. The exploding volumes of biological data demand extreme computational power and special computing facilities (i.e., supercomputers). An inexpensive solution, such as General Purpose computation based on Graphics Processing Units (GPGPU), can be adapted to tackle this challenge, but the limited internal memory of the device can pose a new problem of scalability. Efficient data and computational parallelism with partitioning is required to provide a fast and scalable solution to this problem. RESULTS: We propose an efficient parallel formulation of the k-Nearest Neighbour (kNN) search problem, which is a popular method for classifying objects in several fields of research, such as pattern recognition, machine learning and bioinformatics. Although very simple and straightforward, the performance of the kNN search degrades dramatically for large data sets, since the task is computationally intensive. The proposed approach is not only fast but also scalable to large-scale instances. Based on our approach, we implemented a software tool GPU-FS-kNN (GPU-based Fast and Scalable k-Nearest Neighbour) for CUDA-enabled GPUs. The basic approach is simple and adaptable to other available GPU architectures. We observed speed-ups of 50-60 times compared with a CPU implementation on a well-known breast microarray study and its associated data sets. CONCLUSION: Our GPU-based Fast and Scalable k-Nearest Neighbour search technique (GPU-FS-kNN) provides a significant performance improvement for nearest neighbour computation in large-scale networks. Source code and the software tool are available under the GNU Public License (GPL) at https://sourceforge.net/p/gpufsknn/.
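The abstract's core idea — partitioning the reference set so each chunk fits in limited (device) memory while each query keeps a running k-best heap — can be sketched in plain Python. The function name and chunking scheme below are illustrative, not GPU-FS-kNN's actual CUDA implementation:

```python
import heapq
import math

def knn_chunked(points, queries, k, chunk_size=1024):
    """Brute-force k-nearest-neighbour search with the reference set
    processed in fixed-size chunks, mimicking how a GPU kernel would
    partition data that exceeds device memory (here chunks are slices)."""
    results = []
    for q in queries:
        heap = []  # max-heap via negated distances; holds at most k entries
        for start in range(0, len(points), chunk_size):
            for i, p in enumerate(points[start:start + chunk_size], start):
                d = math.dist(q, p)
                if len(heap) < k:
                    heapq.heappush(heap, (-d, i))
                elif -heap[0][0] > d:          # better than current worst
                    heapq.heapreplace(heap, (-d, i))
        results.append(sorted((-nd, i) for nd, i in heap))
    return results  # per query: k (distance, index) pairs, nearest first
```

The chunk loop is where a real implementation would copy one partition to the GPU and launch a distance kernel; the per-query heap is what keeps the partial results mergeable across partitions.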

  2. Data Collection Methods for Validation of Advanced Multi-Resolution Fast Reactor Simulations

    2015-01-01

    In pool-type Sodium Fast Reactors (SFR) the regions most susceptible to thermal striping are the upper instrumentation structure (UIS) and the intermediate heat exchanger (IHX). This project experimentally and computationally (CFD) investigated the thermal mixing in the region exiting the reactor core to the UIS. The thermal mixing phenomenon was simulated using two vertical jets at different velocities and temperatures as prototypic of two adjacent channels out of the core. Thermal jet mixing of anticipated flows at different temperatures and velocities was investigated. Velocity profiles were measured throughout the flow region using Ultrasonic Doppler Velocimetry (UDV), and temperatures along the geometric centerline between the jets were recorded using a thermocouple array. CFD simulations, using COMSOL, were used initially to understand the flow, then to design the experimental apparatus, and finally to compare simulation results and measurements characterizing the flows. The experimental results and CFD simulations show that the flow field is characterized by three regions with respective transitions, namely, convective mixing, (flow direction) transitional, and post-mixing. Both experiments and CFD simulations support this observation. For the anticipated SFR conditions the flow is momentum dominated and thus thermal mixing is limited due to the short flow length from the exit of the core to the bottom of the UIS. This means that there will be thermal striping at any surface where poorly mixed streams impinge; unless lateral mixing is actively promoted out of the core, thermal striping will prevail. Furthermore, we note that CFD can be considered a separate-effects (computational) test and is recommended as part of any integral analysis. To this effect, poorly mixed streams have potential impact on the rest of the SFR design and scaling, especially placement of internal components, such as the IHX, that may see poorly mixed streams.

  3. Data Collection Methods for Validation of Advanced Multi-Resolution Fast Reactor Simulations

    Tokuhiro, Akiro [Univ. of Idaho, Moscow, ID (United States); Ruggles, Art [Univ. of Tennessee, Knoxville, TN (United States); Pointer, David [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-01-22

    In pool-type Sodium Fast Reactors (SFR) the regions most susceptible to thermal striping are the upper instrumentation structure (UIS) and the intermediate heat exchanger (IHX). This project experimentally and computationally (CFD) investigated the thermal mixing in the region exiting the reactor core to the UIS. The thermal mixing phenomenon was simulated using two vertical jets at different velocities and temperatures as prototypic of two adjacent channels out of the core. Thermal jet mixing of anticipated flows at different temperatures and velocities was investigated. Velocity profiles were measured throughout the flow region using Ultrasonic Doppler Velocimetry (UDV), and temperatures along the geometric centerline between the jets were recorded using a thermocouple array. CFD simulations, using COMSOL, were used initially to understand the flow, then to design the experimental apparatus, and finally to compare simulation results and measurements characterizing the flows. The experimental results and CFD simulations show that the flow field is characterized by three regions with respective transitions, namely, convective mixing, (flow direction) transitional, and post-mixing. Both experiments and CFD simulations support this observation. For the anticipated SFR conditions the flow is momentum dominated and thus thermal mixing is limited due to the short flow length from the exit of the core to the bottom of the UIS. This means that there will be thermal striping at any surface where poorly mixed streams impinge; unless lateral mixing is actively promoted out of the core, thermal striping will prevail. Furthermore, we note that CFD can be considered a separate-effects (computational) test and is recommended as part of any integral analysis. To this effect, poorly mixed streams have potential impact on the rest of the SFR design and scaling, especially placement of internal components, such as the IHX, that may see poorly mixed streams.

  4. An applied artificial intelligence approach towards assessing building performance simulation tools

    Yezioro, Abraham [Faculty of Architecture and Town Planning, Technion IIT (Israel); Dong, Bing [Center for Building Performance and Diagnostics, School of Architecture, Carnegie Mellon University (United States); Leite, Fernanda [Department of Civil and Environmental Engineering, Carnegie Mellon University (United States)

    2008-07-01

    With the development of modern computer technology, a large number of building energy simulation tools are available in the market. When choosing which simulation tool to use in a project, the user must consider the tool's accuracy and reliability, considering the building information they have at hand, which will serve as input for the tool. This paper presents an approach towards comparing building performance simulation results with actual measurements, using artificial neural networks (ANN) for predicting building energy performance. Training and testing of the ANN were carried out with energy consumption data acquired for 1 week in the case building, called the Solar House. The predicted results show a good fit with the mathematical model, with a mean absolute error of 0.9%. Moreover, four building simulation tools were selected in this study in order to compare their results with the ANN-predicted energy consumption: Energy-10, Green Building Studio web tool, eQuest and EnergyPlus. The results showed that the more detailed simulation tools have the best simulation performance in terms of heating and cooling electricity consumption, within 3% of mean absolute error. (author)
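The accuracy figures quoted in the abstract are percentage errors; one plausible reading is the mean absolute error normalized by the measured values, which could be computed as below. The consumption numbers are invented purely for illustration:

```python
def mae_percent(measured, predicted):
    """Mean absolute error expressed as a percentage of the measured
    values (one plausible reading of the errors quoted in the abstract)."""
    errs = [abs(m - p) / abs(m) for m, p in zip(measured, predicted)]
    return 100.0 * sum(errs) / len(errs)

# Hypothetical hourly consumption (kWh): measurements vs. two tools.
measured = [10.0, 12.0, 11.0, 9.0]
tool_a   = [10.1, 11.9, 11.2, 8.9]   # close predictions
tool_b   = [10.8, 12.9, 10.1, 9.8]   # coarser predictions

print(mae_percent(measured, tool_a))  # lower value -> better fit
print(mae_percent(measured, tool_b))
```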

  5. Forest vegetation simulation tools and forest health assessment

    Richard M. Teck; Melody Steele

    1995-01-01

    A Stand Hazard Rating System for Central Idaho forests has been incorporated into the Central Idaho Prognosis variant of the Forest Vegetation Simulator to evaluate how insect, disease and fire hazards within the Deadwood River Drainage change over time. A custom interface, BOISE.COMPUTE.PR, has been developed so hazard ratings can be electronically downloaded...

  6. PhET Interactive Simulations: Transformative Tools for Teaching Chemistry

    Moore, Emily B.; Chamberlain, Julia M.; Parson, Robert; Perkins, Katherine K.

    2014-01-01

    Developing fluency across symbolic-, macroscopic-, and particulate-level representations is central to learning chemistry. Within the chemistry education community, animations and simulations that support multi-representational fluency are considered critical. With advances in the accessibility and sophistication of technology,…

  7. Soil Erosion Study through Simulation: An Educational Tool.

    Huber, Thomas P.; Falkenmayer, Karen

    1987-01-01

    Discusses the need for education about soil erosion and advocates the use of the Universal Soil Loss Equation (USLE) to show the impacts of human and natural action on the land. Describes the use of a computer simulated version of the USLE in several environmental and farming situations. (TW)
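The USLE itself is a simple product of empirical factors, A = R × K × LS × C × P, which is what makes it easy to wrap in a small program for classroom "what-if" runs. The factor values below are illustrative, not calibrated to any real field:

```python
def usle_soil_loss(R, K, LS, C, P):
    """Universal Soil Loss Equation: A = R * K * LS * C * P, where
    A is predicted average annual soil loss (tons/acre/yr),
    R is rainfall erosivity, K soil erodibility, LS the slope
    length-steepness factor, C cover-management and P the
    support-practice factor."""
    return R * K * LS * C * P

# Illustrative comparison: conventional tillage vs. no-till, differing
# only in the cover-management factor C (values are made up).
conventional = usle_soil_loss(R=170, K=0.32, LS=1.2, C=0.40, P=1.0)
no_till      = usle_soil_loss(R=170, K=0.32, LS=1.2, C=0.05, P=1.0)
```

Changing one factor at a time, as above, is exactly the kind of human-versus-natural-action comparison the simulation exercise described in the abstract is built around.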

  8. Preliminary investigations of Monte Carlo Simulations of neutron energy and LET spectra for fast neutron therapy facilities

    Kroc, T.K.

    2009-01-01

    No fast neutron therapy facility has been built with optimized beam quality based on a thorough understanding of the neutron spectrum and its resulting biological effectiveness. A study has been initiated to provide the information necessary for such an optimization. Monte Carlo studies will be used to simulate neutron energy spectra and LET spectra. These studies will be benchmarked with data taken at existing fast neutron therapy facilities. Results will also be compared with radiobiological studies to further support beam quality optimization. These simulations, anchored by this data, will then be used to determine what parameters might be optimized to take full advantage of the unique LET properties of fast neutron beams. This paper will present preliminary work in generating energy and LET spectra for the Fermilab fast neutron therapy facility.

  9. Simulations of fast component and slow component of SMBI on HL-2A tokamak

    Shi Yong-Fu; Wang Zhan-Hui; Sun Ai-Ping; Yu De-Liang; Xu Min; Ren Qi-Long; Guo Wen-Feng

    2017-01-01

    It is very important to improve the penetration depth and fueling efficiency of supersonic molecular beam injection (SMBI), especially for next-generation fusion devices such as ITER. Two components, a fast component (FC) and a slow component (SC), have been observed in HL-2A SMBI experiments for several years; the FC penetrates much more deeply than common SMBIs, drawing considerable attention as a potentially better fueling method. This paper presents the first theoretical and simulation study of the FC and SC of SMBI, carried out with the trans-neut module of the BOUT++ code. The simulated FC and SC are clear and distinguishable, in the same way as observed in experiment. As for the major mechanism behind the FC and SC, it is found that although the difference in injection velocity has some effect on the difference in penetration depth between the FC and SC, it is mainly caused by the self-blocking effect of the first ionized SMB. We also discuss the influence of the initial plasma density on the FC and SC, and the variation of the SC penetration depth with its injection velocity. (paper)

  10. Simulating irradiation hardening in tungsten under fast neutron irradiation including Re production by transmutation

    Huang, Chen-Hsi; Gilbert, Mark R.; Marian, Jaime

    2018-02-01

    Simulations of neutron damage under fusion energy conditions must capture the effects of transmutation, both in terms of accurate chemical inventory buildup and in terms of the physics of the interactions between transmutation elements and irradiation defect clusters. In this work, we integrate neutronics, primary damage calculations, molecular dynamics results, Re transmutation calculations, and stochastic cluster dynamics simulations to study neutron damage in single-crystal tungsten to mimic divertor materials. To gauge the accuracy and validity of the simulations, we first study the material response under experimental conditions at the JOYO fast reactor in Japan and the High Flux Isotope Reactor at Oak Ridge National Laboratory, for which measurements of cluster densities and hardening levels up to 2 dpa exist. We then provide calculations under expected DEMO fusion conditions. Several key mechanisms involving Re atoms and defect clusters are found to govern the accumulation of irradiation damage in each case. We use established correlations to translate damage accumulation into hardening increases and compare our results to the experimental measurements. We find hardening increases in excess of 5000 MPa in all cases, which casts doubt on the integrity of W-based materials under long-term fusion exposure.
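Translating simulated cluster densities into hardening increases is commonly done with a dispersed-barrier correlation of the form Δσ = M·α·μ·b·√(N·d). The sketch below uses that standard form with parameter values chosen as plausible defaults for tungsten, not the authors' fitted values:

```python
import math

def dispersed_barrier_hardening(N, d, alpha=0.4, M=3.06,
                                mu=161e9, b=0.274e-9):
    """Dispersed-barrier hardening estimate:
        delta_sigma = M * alpha * mu * b * sqrt(N * d)
    with N the cluster number density (m^-3), d the mean cluster
    diameter (m), mu the shear modulus (Pa), b the Burgers vector (m),
    M the Taylor factor and alpha the barrier strength. All default
    values are illustrative, not taken from the paper."""
    return M * alpha * mu * b * math.sqrt(N * d)

# Illustrative input: 1e23 m^-3 of ~5 nm clusters.
delta_sigma = dispersed_barrier_hardening(N=1e23, d=5e-9)
```

The √(N·d) dependence is why cluster density and size, the two quantities the cluster dynamics simulations track, are exactly what the hardening comparison needs.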

  11. Computer simulations of the damage due to the passage of a heavy fast ion through diamond

    Sorkin, Anastasia; Adler, Joan; Kalish, Rafi

    2004-01-01

    We present tight-binding molecular dynamics simulations of the structural modifications that result from the "thermal spike" occurring during the passage of a heavy fast ion through a thin diamond or amorphous carbon layer, and of the subsequent regrowth upon cooling. The thermal spike and cooling are simulated by locally heating and then quenching a small region of carbon surrounded either by diamond or by a mostly sp3-bonded amorphous carbon network. For a thermal spike in diamond, we find that if the "temperature" (kinetic energy of the atoms) at the center of the spike is high enough, an amorphous carbon region containing a large fraction of threefold-coordinated (sp2-bonded) C atoms remains within the diamond network after cooling. The structure of this amorphous layer depends very strongly on the heating "temperature" and on the dimensions of the thermal spike. Scaling is found between curves of the percentage of sp2-bonded atoms in the spike region as a function of heating "temperature" for different volumes. Justification of the validity of the tight-binding approximation for these simulations will also be given.

  12. Simulating soil phosphorus dynamics for a phosphorus loss quantification tool.

    Vadas, Peter A; Joern, Brad C; Moore, Philip A

    2012-01-01

    Pollution of fresh waters by agricultural phosphorus (P) is a water quality concern. Because soils can contribute significantly to P loss in runoff, it is important to assess how management affects soil P status over time, which is often done with models. Our objective was to describe and validate soil P dynamics in the Annual P Loss Estimator (APLE) model. APLE is a user-friendly spreadsheet model that simulates P loss in runoff and soil P dynamics over 10 yr for a given set of runoff, erosion, and management conditions. For soil P dynamics, APLE simulates two layers in the topsoil, each with three inorganic P pools and one organic P pool. It simulates P additions to soil from manure and fertilizer, distribution among pools, mixing between layers due to tillage and bioturbation, leaching between and out of layers, crop P removal, and loss by surface runoff and erosion. We used soil P data from 25 published studies to validate APLE's soil P processes. Our results show that APLE reliably simulated soil P dynamics for a wide range of soil properties, soil depths, P application sources and rates, durations, soil P contents, and management practices. We validated APLE specifically for situations where soil P was increasing from excessive P inputs, where soil P was decreasing due to greater outputs than inputs, and where soil P stratification occurred in no-till and pasture soils. Successful simulations demonstrate APLE's potential to be applied to major management scenarios related to soil P loss in runoff and erosion. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
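The pool-and-flux bookkeeping described above can be illustrated with a drastically simplified annual mass balance. This is a sketch in the spirit of APLE, not its actual equations; the pool structure is reduced to two pools and every rate constant below is invented:

```python
def update_soil_p(labile, stable, inputs, crop_removal, runoff_frac,
                  transfer_rate=0.1):
    """One annual time step of a toy two-pool soil P mass balance:
    fertilizer/manure P enters the labile pool, losses come from crop
    removal and runoff, and a fraction of the labile pool slowly
    equilibrates into the stable pool. Units: kg P/ha (illustrative)."""
    labile += inputs
    runoff_loss = labile * runoff_frac          # loss in surface runoff
    labile -= runoff_loss + crop_removal
    moved = transfer_rate * labile              # slow sorption to stable pool
    labile -= moved
    stable += moved
    return labile, stable, runoff_loss
```

Iterating such an update over 10 years with inputs exceeding outputs drives soil P up, and with outputs exceeding inputs draws it down — the two regimes the APLE validation specifically targets.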

  13. Rapidly Re-Configurable Flight Simulator Tools for Crew Vehicle Integration Research and Design

    Pritchett, Amy R.

    2002-01-01

    While simulation is a valuable research and design tool, the time and difficulty required to create new simulations (or re-use existing simulations) often limits their application. This report describes the design of the software architecture for the Reconfigurable Flight Simulator (RFS), which provides a robust simulation framework that allows the simulator to fulfill multiple research and development goals. The core of the architecture provides the interface standards for simulation components, registers and initializes components, and handles the communication between simulation components. The simulation components are each a pre-compiled library 'plugin' module. This modularity allows independent development and sharing of individual simulation components. Additional interfaces can be provided through the use of Object Data/Method Extensions (OD/ME). RFS provides a programmable run-time environment for real-time access and manipulation, and has networking capabilities using the High Level Architecture (HLA).
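The core idea — a framework that registers components against an interface standard and then routes updates to them — can be sketched as follows. Class and method names are invented for illustration, not RFS's actual API:

```python
class Simulator:
    """Minimal sketch of a plugin-style simulation core: components are
    registered against a simple interface standard and the core drives
    their updates each frame (names invented, not the RFS API)."""

    def __init__(self):
        self.components = []

    def register(self, component):
        # The "interface standard" here is simply: has an update(dt) method.
        if not callable(getattr(component, "update", None)):
            raise TypeError("component must provide update(dt)")
        self.components.append(component)

    def step(self, dt):
        for component in self.components:
            component.update(dt)

class Aircraft:
    """Example plugin component: just integrates simulated time."""
    def __init__(self):
        self.t = 0.0
    def update(self, dt):
        self.t += dt

sim = Simulator()
ac = Aircraft()
sim.register(ac)
sim.step(0.05)   # one 50 ms frame
```

Because the core only depends on the interface, components can be developed and compiled independently and swapped per experiment — the reconfigurability the report is about.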

  14. An innovative simulation tool for waste to energy generation opportunities

    Bilal Abderezzak

    2017-03-01

    New world energy policies encourage the use of renewable energy sources with clean technologies and a progressive move away from fossil-fuel dependence. Another energy generation trend, commonly called the "waste-to-energy" solution, uses organic waste as a response to two major problems: energy generation and waste management. Thanks to anaerobic digestion, organic waste can provide a biogas composed essentially of carbon dioxide (CO2) and methane (CH4). This work aims to help students, researchers and even decision makers appreciate the importance of biogas generation. The proposed tool is the latest version of our previous tool, enhanced and completed. It presents the biogas production potential of any shortlisted kind of waste, including some energy valorization pathways. Technical and economic data are included for eventual feasibility studies.

  15. Fast electron and X-ray scattering as a tool to study target's structure

    Amusia, M.Ya.

    2007-01-01

    We concentrate on several relatively new aspects of the study of fast electron and X-ray scattering by atoms and atom-like objects, namely endohedral atoms and fullerenes; the main attention is given to fast charged-particle scattering. We show that the corresponding cross-sections, being expressed via so-called generalized oscillator strengths (GOS), give information on the electronic structure of the target and on the role of electron correlations in it. We consider what sort of information becomes available when analyzing the dependence of the GOS upon their multipolarity, transferred momentum q and energy ω. To obtain theoretical results, we employ both the one-electron Hartree-Fock approximation and, to account for multi-electron correlations in the target, the random phase approximation with exchange. We demonstrate the role of non-dipole corrections in small-angle fast-electron inelastic scattering: there the dipole contribution dominates, while non-dipole corrections can be considerably and controllably enhanced as compared to the case of low- and medium-energy photoionization. We show also that analyses of the GOS for discrete level excitations make it possible to clarify their multipolarity. The results of calculations of Compton excitation and ionization cross-sections are presented. Attention is given to cooperative effects in inelastic fast electron-atom scattering that result in directed motion of the secondary electrons, a phenomenon similar to 'drag currents' in photoionization. We demonstrate how one should derive the GOS for endohedral atoms, e.g. A@C60, and what additional information can be obtained from the corresponding GOS. Most of the discussion is illustrated by the results of concrete calculations.

  16. The dynamic current-voltage characteristic as a powerful tool to analyze fast phenomena in plasma

    Ivan, L. M.; Mihai-Plugaru, M.; Amarandei, G.; Aflori, M.; Dimitriu, D. G.

    2006-01-01

    The static current-voltage characteristic of an electrode immersed in plasma is obtained by slowly increasing and subsequently decreasing the potential on the electrode with respect to the plasma potential or the ground. This characteristic can give important information about the phenomena that take place in front of the electrode: current jumps, often associated with a hysteresis effect, regions with S-type or N-type negative differential resistance, etc. The method is always used when investigating the appearance of complex space charge configurations (CSCC) in front of an electrode immersed in plasma. However, to investigate the dynamics of such structures, or other fast phenomena (like instabilities) which take place in plasma devices at frequencies of tens or hundreds of kHz or more, more complex investigation techniques must be used. One of the most efficient methods for investigating fast phenomena in plasma devices is the dynamic current-voltage characteristic. This is obtained by recording the time series of the current collected by the electrode while the voltage applied to it is varied very rapidly (typically increased) using a signal generator. In this way, very fast oscillations of the current can be recorded and new phenomena can be evidenced. We used this technique to study the phenomena which take place at the onset of electrostatic instabilities in Q-machine plasma, namely the potential relaxation instability (PRI) and the electrostatic ion-cyclotron instability (EICI). The experimental results prove that the negative differential resistance region in the static current-voltage characteristic is the result of the nonlinear dynamics of a CSCC in the form of a double layer (DL), which develops just before the onset of the instabilities. In the case of the PRI we evidenced current jumps related to the DL appearance, which are not present in the static current-voltage characteristic at high plasma density. (authors)

  17. Simulation of laser radar tooling ball measurements: focus dependence

    Smith, Daniel G.; Slotwinski, Anthony; Hedges, Thomas

    2015-10-01

    The Nikon Metrology Laser Radar system focuses a beam from a fiber to a target object and receives the light scattered from the target through the same fiber. The system can, among other things, make highly accurate measurements of the position of a tooling ball by locating the angular position of peak signal quality, which is related to the fiber coupling efficiency. This article explores the relationship between fiber coupling efficiency and focus condition.
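
    The peak-signal principle can be illustrated with the textbook overlap formula for two coaxial Gaussian modes: coupling efficiency falls off with lateral offset between the focused beam and the fiber mode, so signal quality peaks when the beam is centred. This is an illustrative sketch under the matched-wavefront assumption, not the Nikon system's actual model; the mode radii are assumed values.

```python
import numpy as np

def coupling_efficiency(w1, w2, d):
    """Power coupling between two coaxial Gaussian modes with matched wavefronts.

    w1, w2: 1/e^2 mode-field radii; d: lateral offset (same units).
    Standard overlap-integral result:
    eta = (2*w1*w2 / (w1^2 + w2^2))^2 * exp(-2*d^2 / (w1^2 + w2^2)).
    """
    s = w1**2 + w2**2
    return (2.0 * w1 * w2 / s) ** 2 * np.exp(-2.0 * d**2 / s)

# Scanning the beam across the fiber mode: the coupling (and hence the
# received signal) peaks at zero offset, which is the position the
# angular-position measurement locates.
offsets = np.linspace(-5e-6, 5e-6, 101)        # metres (assumed scan range)
eta = coupling_efficiency(5e-6, 5e-6, offsets)
print(offsets[eta.argmax()], eta.max())        # peak at zero offset
```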

  18. PULSim: User-Based Adaptable Simulation Tool for Railway Planning and Operations

    Yong Cui

    2018-01-01

    Full Text Available Simulation methods are widely used in the field of railway planning and operations. Currently, several commercial software tools are available that not only provide functionality for railway simulation but also enable further evaluation and optimisation of the network for scheduling, dispatching, and capacity research. However, the various tools all fall short with respect to the standards they utilise as well as their published interfaces. For an end-user, the basic mechanisms and assumptions built into a simulation tool are unknown, which means that the true potential of these software tools is limited. One of the most critical issues is the inability of users to define a sophisticated workflow integrating several rounds of simulation with adjustable parameters and settings. This paper develops and describes a user-based, customisable platform. As preconditions of the platform, the design aspects for modelling the components of a railway system and building the workflow of railway simulation are elaborated in detail. Based on the model and the workflow, an integrated simulation platform with open interfaces is developed. Users and researchers gain the ability to rapidly develop their own algorithms, supported by the tailored simulation process in a flexible manner. The productivity of using simulation tools for further evaluation and optimisation will be significantly improved through the user-adaptable open interfaces.

  19. A dynamic simulation tool for the battery-hybrid hydrogen fuel cell vehicle

    Moore, R.M. [Hawaii Natural Energy Institute, University of Hawaii, Manoa (United States); Ramaswamy, S.; Cunningham, J.M. [California Univ., Berkeley, CA (United States); Hauer, K.H. [xcellvision, Major-Hirst-Strasse 11, 38422 Wolfsburg (Germany)

    2006-10-15

    This paper describes a dynamic fuel cell vehicle simulation tool for the battery-hybrid direct-hydrogen fuel cell vehicle. The emphasis is on simulation of the hybridized hydrogen fuel cell system within an existing fuel cell vehicle simulation tool. The discussion is focused on the simulation of the sub-systems that are unique to the hybridized direct-hydrogen vehicle, and builds on a previous paper that described a simulation tool for the load-following direct-hydrogen vehicle. The configuration of the general fuel cell vehicle simulation tool has been previously presented in detail, and is only briefly reviewed in the introduction to this paper. Strictly speaking, the results provided in this paper only serve as an example that is valid for the specific fuel cell vehicle design configuration analyzed. Different design choices may lead to different results, depending strongly on the parameters used and choices taken during the detailed design process required for this highly non-linear and n-dimensional system. The primary purpose of this paper is not to provide a dynamic simulation tool that is the ''final word'' for the ''optimal'' hybrid fuel cell vehicle design. The primary purpose is to provide an explanation of a simulation method for analyzing the energetic aspects of a hybrid fuel cell vehicle. (Abstract Copyright [2006], Wiley Periodicals, Inc.)

  20. Dynamic wind turbine models in power system simulation tool DIgSILENT

    Hansen, A.D.; Iov, F.; Soerensen, Poul.; Cutululis, N.; Jauch, C.; Blaabjerg, F.

    2007-08-15

    This report presents a collection of models and control strategies developed and implemented in the power system simulation tool PowerFactory DIgSILENT for different wind turbine concepts. It is the second edition of Risoe-R-1400(EN), and it gathers and describes a whole wind turbine model database built up and developed during several national research projects carried out at Risoe DTU National Laboratory for Sustainable Energy and Aalborg University in the period 2001-2007. The overall objective of these projects was to create a wind turbine model database able to support the analysis of the interaction between the mechanical structure of the wind turbine and the electrical grid during different operational modes. The report thus provides a description of the wind turbine modelling, both at a component level and at a system level. The report contains both the description of DIgSILENT built-in models for the electrical components of a grid connected wind turbine (e.g. induction generators, power converters, transformers) and the models developed by the user, in the dynamic simulation language DSL of DIgSILENT, for the non-electrical components of the wind turbine (wind model, aerodynamic model, mechanical model). The initialisation of the wind turbine models in the power system simulation is also presented. The main attention in the report is drawn to the modelling at the system level of the following wind turbine concepts: (1) Fixed speed active stall wind turbine concept (2) Variable speed doubly-fed induction generator wind turbine concept (3) Variable speed multi-pole permanent magnet synchronous generator wind turbine concept These wind turbine concept models can be used and even extended for the study of different aspects, e.g. the assessment of power quality, control strategies, connection of the wind turbine to different types of grid and storage systems. Different control strategies have been developed and implemented for these wind turbine concepts.

  1. CSR as a Marketing Tool : a Case Study of Fast Food Restaurant in Malaysia

    Abdul Razak, Armiza

    2014-01-01

    The use of corporate social responsibility (CSR) as an efficient marketing tool is the matter being researched here, especially given the increasing awareness of consumers about the importance of commercial organizations doing business in ways that do not harm consumers or the environment. The aim of the research is to obtain qualitative data in order to determine whether using CSR as a marketing tool brings a higher level of efficiency in the marketing...

  2. LMFBR source term experiments in the Fuel Aerosol Simulant Test (FAST) facility

    Petrykowski, J.C.; Longest, A.W.

    1985-01-01

    The transport of uranium dioxide (UO2) aerosol through liquid sodium was studied in a series of ten experiments in the Fuel Aerosol Simulant Test (FAST) facility at Oak Ridge National Laboratory (ORNL). The experiments were designed to provide a mechanistic basis for evaluating the radiological source term associated with a postulated, energetic core disruptive accident (CDA) in a liquid metal fast breeder reactor (LMFBR). Aerosol was generated by capacitor discharge vaporization of UO2 pellets which were submerged in a sodium pool under an argon cover gas. Measurements of the pool and cover gas pressures were used to study the transport of aerosol contained by vapor bubbles within the pool. Samples of cover gas were filtered to determine the quantity of aerosol released from the pool. The depth at which the aerosol was generated was found to be the most critical parameter affecting release. The largest release was observed in the baseline experiment, where the sample was vaporized above the sodium pool. In the nine ''undersodium'' experiments aerosol was generated beneath the surface of the pool at depths varying from 30 to 1060 mm. The mass of aerosol released from the pool was found to be a very small fraction of the original specimen. It appears that the bulk of the aerosol was contained by bubbles which collapsed within the pool. 18 refs., 11 figs., 4 tabs

  3. Fast Ion Effects During Test Blanket Module Simulation Experiments in DIII-D

    Kramer, G.J.; Budny, R.V.; Ellis, R.; Gorelenkova, M.; Heidbrink, W.W.; Kurki-Suonio, T.; Nazikian, R.; Salmi, A.; Schaffer, M.J.; Shinohara, K.; Snipes, J.A.; Spong, D.A.; Koskela, T.; Van Zeeland, M.A.

    2011-01-01

    Fast beam-ion losses were studied in DIII-D in the presence of a scaled mockup of two Test Blanket Modules (TBM) for ITER. Heating of the protective tiles on the front of the TBM surface was found when neutral beams were injected and the TBM fields were engaged. The fast-ion core confinement was not significantly affected. Different orbit-following codes predict the formation of a hot spot on the TBM surface arising from beam-ions deposited near the edge of the plasma. The codes are in good agreement with each other on the total power deposited at the hot spot predicting an increase in power with decreasing separation between the plasma edge and the TBM surface. A thermal analysis of the heat flow through the tiles shows that the simulated power can account for the measured tile temperature rise. The thermal analysis, however, is very sensitive to the details of the localization of the hot spot which is predicted to be different among the various codes.

  4. Full-wave simulations of current profiles for fast magnetosonic wave current drive

    Dmitrieva, M.V.; Eriksson, L.-G.; Gambier, D.J.

    1992-12-01

    Numerical simulations of current drive in tokamaks by fast waves (FWCD) have been performed in the ion cyclotron range of frequencies and at lower frequencies with the 3-dimensional numerical code ICTOR. Trapped particle effects were taken into account in the calculation of the fast wave current drive efficiency and the bootstrap current generation. The global efficiency of FWCD is found to be γ ∼ 0.1 × 10²⁰ AW⁻¹m⁻² for Joint European Torus (JET) parameters at a central electron temperature of ∼ 10 keV. The efficiency of FWCD for reactor-like plasmas is found to be γ ∼ 0.3 × 10²⁰ AW⁻¹m⁻² for ∼ 100% FWCD, and γ ∼ 1 × 10²⁰ AW⁻¹m⁻² for FWCD with a ∼ 65% bootstrap fraction in a total current of ∼ 25 MA, at a 25 keV central temperature, a density of ∼ 10²⁰ m⁻³ and a major radius R ∼ 8 m. Non-inductive current density profiles are studied. Broad FWCD current profiles are obtained for flat reactor temperature and density profiles, with the bootstrap current concentrated at the plasma edge. The possibility of a steady-state reactor based on fast waves (FW) with a large fraction of bootstrap current is discussed. It appears to be impractical to rely on such an external current drive (CD) scheme for a reactor as long as γ is less than 2 × 10²⁰ AW⁻¹m⁻². (Author)
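
    The quoted efficiencies use the standard current-drive figure of merit γ = n̄e20·R·I/P. A quick sketch (illustrative only; cd_power is a hypothetical helper, and the 65% bootstrap split is taken from the abstract) shows what these γ values imply for the coupled power:

```python
# Current-drive figure of merit: gamma = n20 * R * I / P, in 10^20 A W^-1 m^-2
# (n20: line-averaged density in 10^20 m^-3, R: major radius in m,
#  I: driven current in A, P: coupled power in W).
def cd_power(gamma, n20, R, I):
    """Coupled power (W) needed to drive current I (hypothetical helper)."""
    return n20 * R * I / gamma

# Reactor-like case from the abstract: n ~ 1e20 m^-3, R ~ 8 m, I ~ 25 MA.
p_full = cd_power(0.3, 1.0, 8.0, 25e6)         # gamma = 0.3, all 25 MA driven
p_boot = cd_power(1.0, 1.0, 8.0, 0.35 * 25e6)  # gamma = 1, only the non-bootstrap 35%
print(p_full / 1e6, p_boot / 1e6)              # ~667 MW vs 70 MW of coupled power
```

The order-of-magnitude gap is why the abstract argues a large bootstrap fraction is needed for a practical steady-state scheme.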

  5. Nonuniform fast Fourier transform method for numerical diffraction simulation on tilted planes.

    Xiao, Yu; Tang, Xiahui; Qin, Yingxiong; Peng, Hao; Wang, Wei; Zhong, Lijing

    2016-10-01

    The method, based on the rotation of the angular spectrum in the frequency domain, is generally used for the diffraction simulation between the tilted planes. Due to the rotation of the angular spectrum, the interval between the sampling points in the Fourier domain is not even. For the conventional fast Fourier transform (FFT)-based methods, a spectrum interpolation is needed to get the approximate sampling value on the equidistant sampling points. However, due to the numerical error caused by the spectrum interpolation, the calculation accuracy degrades very quickly as the rotation angle increases. Here, the diffraction propagation between the tilted planes is transformed into a problem about the discrete Fourier transform on the uneven sampling points, which can be evaluated effectively and precisely through the nonuniform fast Fourier transform method (NUFFT). The most important advantage of this method is that the conventional spectrum interpolation is avoided and the high calculation accuracy can be guaranteed for different rotation angles, even when the rotation angle is close to π/2. Also, its calculation efficiency is comparable with that of the conventional FFT-based methods. Numerical examples as well as a discussion about the calculation accuracy and the sampling method are presented.
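
    The quantity being approximated can be sketched with a direct O(N·M) nonuniform discrete Fourier transform; NUFFT libraries such as FINUFFT approximate this sum to near machine precision in O(N log N) by spreading onto a fine uniform grid and applying an FFT, which is how the spectrum interpolation of conventional methods is avoided. The sanity check below is illustrative:

```python
import numpy as np

def nudft(f, x, k):
    """Reference nonuniform DFT: F(k_j) = sum_n f_n * exp(-i * k_j * x_n).

    Direct O(N*M) evaluation at arbitrary sample points x_n and
    frequencies k_j -- the sum a NUFFT evaluates fast and accurately.
    """
    return np.exp(-1j * np.outer(k, x)) @ f

# Sanity check: with uniform nodes x_n = 2*pi*n/N and integer frequencies k,
# the nonuniform transform must reproduce numpy.fft.fft to machine precision.
N = 64
rng = np.random.default_rng(0)
f = rng.standard_normal(N) + 1j * rng.standard_normal(N)
x = 2 * np.pi * np.arange(N) / N
k = np.arange(N)
err = np.max(np.abs(nudft(f, x, k) - np.fft.fft(f)))
print(err)   # ~1e-13
```

For tilted-plane diffraction, the rotated angular-spectrum points play the role of the nonuniform k, so no interpolation onto an equidistant grid is required.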

  6. A Fast Inspection of Tool Electrode and Drilling Depth in EDM Drilling by Detection Line Algorithm.

    Huang, Kuo-Yi

    2008-08-21

    The purpose of this study was to develop a novel measurement method using a machine vision system. Besides image processing techniques, the proposed system employs a detection line algorithm that detects the tool electrode length and the drilling depth of a workpiece accurately and effectively. Different boundaries of areas on the tool electrode are defined: a baseline between the base and normal areas, an ND-line between the normal and drilling areas (accumulating carbon area), and a DD-line between the drilling area and the dielectric fluid droplet on the electrode tip. Accordingly, image processing techniques are employed to extract the tool electrode image, and the centroid, eigenvector, and principal axis of the tool electrode are determined. The developed detection line algorithm (DLA) is then used to detect the baseline, ND-line, and DD-line along the direction of the principal axis. Finally, the tool electrode length and drilling depth of the workpiece are estimated via the detected baseline, ND-line, and DD-line. Experimental results show good accuracy and efficiency in estimating the tool electrode length and drilling depth under different conditions. Hence, this research may provide a reference for industrial applications of EDM drilling measurement.
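
    The centroid/eigenvector/principal-axis step can be sketched with second-order image moments: the principal axis is the leading eigenvector of the covariance of the foreground pixel coordinates. A minimal sketch (the synthetic "electrode" mask is an assumption, not the paper's data):

```python
import numpy as np

def centroid_and_axis(mask):
    """Centroid and principal axis of a binary image via second-order moments."""
    ys, xs = np.nonzero(mask)
    cy, cx = ys.mean(), xs.mean()
    # Covariance of the pixel coordinates; its leading eigenvector
    # (largest eigenvalue) points along the principal axis.
    cov = np.cov(np.vstack([xs - cx, ys - cy]))
    evals, evecs = np.linalg.eigh(cov)
    axis = evecs[:, np.argmax(evals)]   # unit vector (dx, dy)
    return (cx, cy), axis

# Synthetic "electrode": a horizontal bar, so the axis should be ~(+-1, 0).
img = np.zeros((40, 100), dtype=bool)
img[18:22, 10:90] = True
(cx, cy), axis = centroid_and_axis(img)
print(cx, cy, axis)
```

Detection lines for the baseline, ND-line and DD-line would then be scanned along this axis direction.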

  7. GTZ: a fast compression and cloud transmission tool optimized for FASTQ files.

    Xing, Yuting; Li, Gen; Wang, Zhenguo; Feng, Bolun; Song, Zhuo; Wu, Chengkun

    2017-12-28

    The dramatic development of DNA sequencing technology is generating truly big data, demanding ever more storage and bandwidth. To speed up data sharing and bring data to computing resources faster and cheaper, it is necessary to develop a compression tool that can support efficient compression and transmission of sequencing data onto cloud storage. This paper presents GTZ, a compression and transmission tool optimized for FASTQ files. As a reference-free lossless FASTQ compressor, GTZ treats the different lines of FASTQ records separately, utilizes adaptive context modelling to estimate their characteristic probabilities, and compresses data blocks with arithmetic coding. GTZ can also be used to compress multiple files or directories at once. Furthermore, as a tool for the cloud computing era, it is capable of saving compressed data locally or transmitting it directly into the cloud by choice. We evaluated the performance of GTZ on several diverse FASTQ benchmarks. Results show that in most cases it outperforms many other tools in terms of compression ratio, speed and stability. GTZ is a tool that enables efficient lossless FASTQ data compression and simultaneous data transmission onto the cloud. It emerges as a useful tool for NGS data storage and transmission in the cloud environment. GTZ is freely available online at: https://github.com/Genetalks/gtz .
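
    The "treat different lines separately" idea can be illustrated as follows: identifier, sequence, separator and quality lines have very different statistics, so grouping them into homogeneous streams before coding helps the model. This sketch uses zlib as a stand-in for GTZ's context-modelled arithmetic coder, and the reads are invented:

```python
import zlib

# Tiny in-memory FASTQ (hypothetical reads; 4 lines per record).
fastq = (
    "@read1\nACGTACGTACGT\n+\nIIIIIIIHHGGF\n"
    "@read2\nACGTACGAACGT\n+\nIIIIHHIHHGGF\n"
) * 500

# Stream separation: group the i-th line of every record together,
# then compress each homogeneous stream independently.
lines = fastq.splitlines()
streams = ["\n".join(lines[i::4]) for i in range(4)]
blobs = [zlib.compress(s.encode(), 9) for s in streams]

# Lossless round trip: decompress and re-interleave the four streams.
parts = [zlib.decompress(b).decode().split("\n") for b in blobs]
rebuilt = "\n".join("\n".join(rec) for rec in zip(*parts)) + "\n"
print(rebuilt == fastq, sum(map(len, blobs)), len(fastq))
```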

  8. RPYFMM: Parallel adaptive fast multipole method for Rotne-Prager-Yamakawa tensor in biomolecular hydrodynamics simulations

    Guan, W.; Cheng, X.; Huang, J.; Huber, G.; Li, W.; McCammon, J. A.; Zhang, B.

    2018-06-01

    RPYFMM is a software package for the efficient evaluation of the potential field governed by the Rotne-Prager-Yamakawa (RPY) tensor interactions in biomolecular hydrodynamics simulations. In our algorithm, the RPY tensor is decomposed as a linear combination of four Laplace interactions, each of which is evaluated using the adaptive fast multipole method (FMM) (Greengard and Rokhlin, 1997) where the exponential expansions are applied to diagonalize the multipole-to-local translation operators. RPYFMM offers a unified execution on both shared and distributed memory computers by leveraging the DASHMM library (DeBuhr et al., 2016, 2018). Preliminary numerical results show that the interactions for a molecular system of 15 million particles (beads) can be computed within one second on a Cray XC30 cluster using 12,288 cores, while achieving approximately 54% strong-scaling efficiency.
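
    For reference, the direct O(N²) sum that RPYFMM accelerates can be sketched with the standard Rotne-Prager form of the tensor (far-field branch only, valid when all bead separations exceed 2a; parameters are illustrative):

```python
import numpy as np

def rpy_mobility(x, a=1.0, eta=1.0):
    """Dense Rotne-Prager mobility matrix, far-field branch (all r >= 2a).

    x: (N, 3) bead centres; a: bead radius; eta: solvent viscosity.
    This O(N^2) direct sum is what FMM-based codes reduce to near-linear cost.
    """
    N = len(x)
    M = np.zeros((3 * N, 3 * N))
    I3 = np.eye(3)
    for i in range(N):
        M[3*i:3*i+3, 3*i:3*i+3] = I3 / (6 * np.pi * eta * a)   # self mobility
        for j in range(i + 1, N):
            rv = x[i] - x[j]
            r = np.linalg.norm(rv)
            rr = np.outer(rv, rv) / r**2
            blk = ((1 + 2*a**2 / (3*r**2)) * I3
                   + (1 - 2*a**2 / r**2) * rr) / (8 * np.pi * eta * r)
            M[3*i:3*i+3, 3*j:3*j+3] = blk   # the tensor is symmetric in (i, j)
            M[3*j:3*j+3, 3*i:3*i+3] = blk
    return M

# 27 beads on a grid with spacing 5 (> 2a), so the far-field branch applies.
pts = np.array([[i, j, k] for i in range(3) for j in range(3) for k in range(3)],
               dtype=float) * 5.0
M = rpy_mobility(pts)
print(M.shape)   # (81, 81); symmetric positive definite, as RPY guarantees
```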

  9. Light ions cyclotron bombardment to simulate fast neutron radiation damage in nuclear materials

    Segura, E.; Lucki, G.; Aguiar, D.

    1984-01-01

    The applicability and limitations of cyclotron light-ion bombardment to simulate the effects of neutron irradiation are presented. Light ions with energies of about 10 MeV are capable of producing homogeneous damage in specimens suitable for measuring bulk mechanical properties, although their low damage rate of about 10⁻⁵ dpa·s⁻¹ limits the dose range to a few dpa. On the other hand, cyclotron alpha-particle implantation provides a fast and convenient way of introducing helium with a minimum of side effects, so this technique can be exploited to gain a better understanding of the mechanism by which this insoluble gas produces high-temperature embrittlement. Some experimental details, such as specimen dimensions and cooling techniques, are described. Finally, the infrastructure for cyclotron alpha-particle implantation and a creep-test facility of the Division of Radiation Damage at IPEN-CNEN/SP are presented. (Author) [pt

  10. Adaptation and performance of the Cartesian coordinates fast multipole method for nanomagnetic simulations

    Zhang Wen; Haas, Stephan

    2009-01-01

    An implementation of the fast multipole method (FMM) is performed for magnetic systems with long-ranged dipolar interactions. Expansion in spherical harmonics of the original FMM is replaced by expansion of polynomials in Cartesian coordinates, which is considerably simpler. Under open boundary conditions, an expression for the multipole moments of point dipoles in a cell is derived. These make the program appropriate for nanomagnetic simulations, including magnetic nanoparticles and ferrofluids. The performance is optimized in terms of cell size and parameter set (expansion order and opening angle), and the trade-off between computing time and accuracy is quantitatively studied. A rule of thumb is proposed for deciding the appropriate average number of dipoles in the smallest cells, and an optimal choice of parameter set is suggested. Finally, the superiority of the Cartesian-coordinate FMM is demonstrated by comparison to the spherical harmonics FMM and FFT.

  11. CYLFUX, Fast Reactor Reactivity Transients Simulation in LWR by 2-D 2 Group Diffusion

    Schmidt, A.

    1973-01-01

    1 - Nature of physical problem solved: A 2-dimensional calculation of the 2-group, space-dependent neutron diffusion equations is performed in r-z geometry using an arbitrary number of groups of delayed neutron precursors. The program is designed to simulate fast reactivity excursions in light water reactors, taking into account Doppler feedback via adiabatic heat-up of the fuel. Axial motion of control rods may be considered, optionally including scram action. 2 - Method of solution: The differential equations are solved at each time step by an explicit finite difference method using two time levels. The stationary distributions are obtained using the same algorithm. 3 - Restrictions on the complexity of the problem: No restrictions on the number of space points or delayed neutron groups other than the computer size
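
    The explicit two-time-level update can be sketched in a reduced setting: one energy group, 1-D slab, no precursors (CYLFUX itself is 2-group r-z with delayed neutrons). All parameters are illustrative, chosen to satisfy the explicit stability limit v·D·Δt/Δx² ≤ 1/2:

```python
import numpy as np

# Reduced analogue of the explicit two-time-level diffusion scheme:
# one-group neutron diffusion in a 1-D slab (illustrative parameters).
D, sig_a, nusig_f = 1.0, 0.05, 0.04       # cm, 1/cm: absorption exceeds production
v, dx, dt, nx = 2.2e5, 1.0, 1.0e-6, 50    # cm/s, cm, s; v*D*dt/dx^2 = 0.22 < 0.5
phi = np.sin(np.pi * np.arange(1, nx + 1) / (nx + 1))   # initial flux shape

for _ in range(200):
    lap = np.empty(nx)
    lap[1:-1] = (phi[2:] - 2 * phi[1:-1] + phi[:-2]) / dx**2
    lap[0] = (phi[1] - 2 * phi[0]) / dx**2      # flux forced to zero outside slab
    lap[-1] = (phi[-2] - 2 * phi[-1]) / dx**2
    # Two time levels: new flux from old flux only (explicit Euler step).
    phi = phi + v * dt * (D * lap + (nusig_f - sig_a) * phi)

print(phi.max())   # subcritical slab: the amplitude decays but stays positive
```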

  12. Hybrid parallel strategy for the simulation of fast transient accidental situations at reactor scale

    Faucher, V.; Galon, P.; Beccantini, A.; Crouzet, F.; Debaud, F.; Gautier, T.

    2013-01-01

    This contribution is dedicated to the latest methodological developments implemented in the fast transient dynamics software EUROPLEXUS (EPX) to simulate the mechanical response of fully coupled fluid-structure systems to accidental situations to be considered at reactor scale, among which the Loss of Coolant Accident, the Core Disruptive Accident and the Hydrogen Explosion. Time integration is explicit and the search for reference solutions within the safety framework prevents any simplification and approximations in the coupled algorithm: for instance, all kinematic constraints are dealt with using Lagrange Multipliers, yielding a complex flow chart when non-permanent constraints such as unilateral contact or immersed fluid-structure boundaries are considered. The parallel acceleration of the solution process is then achieved through a hybrid approach, based on a weighted domain decomposition for distributed memory computing and the use of the KAAPI library for self-balanced shared memory processing inside sub-domains. (authors)

  13. Simulations of the TESLA Linear Collider with a Fast Feedback System

    Schulte, Daniel; White, G

    2003-01-01

    The tolerances on the beams as they collide at the interaction point of the TESLA linear collider are very tight due to the nanometre-scale vertical bunch spot sizes. Ground motion causes the beams to grow in emittance and drift out of collision, leading to a dramatic degradation of luminosity performance. To combat this, both slow orbit and fast intra-train feedback systems will be used. The design of these feedback systems depends critically on how component misalignment affects the beam throughout the whole accelerator. A simulation has been set up to study the accelerator performance under such conditions in detail by merging the codes PLACET, MERLIN and GUINEA-PIG together with Simulink code to model the feedback systems, all under a Matlab environment.

  14. Introduction of the computer-based operation training tools in classrooms to support simulator training

    Noji, K.; Suzuki, K.; Kobayashi, A.

    1997-01-01

    Operation training with full-scope simulators is effective in improving trainees' operating competency. To obtain more effective results from simulator training, the role of ''classroom operation training'' closely coordinated with the simulator training is important. ''Classroom operation training'' is aimed at pre- and post-study of the operation knowledge related to training on full-scope simulators. We have been developing computer-based operation training tools to be used in classroom training sessions. As the first step, we developed the Simulator Training Replay System, an aiding tool used in the classroom to enhance trainees' operation performance. This system can replay plant behavior on a CRT display synchronized with the operators' actions on a video monitor recorded during simulator training sessions. The system is used to review plant behavior and trainee responses after simulator training sessions, and to understand plant behavior and operation procedures before operation training. (author)

  15. Transit Boardings Estimation and Simulation Tool (TBEST) calibration for guideway and BRT modes.

    2013-06-01

    This research initiative was motivated by a desire of the Florida Department of Transportation and the Transit Boardings Estimation and Simulation Tool (TBEST) project team to enhance the value of TBEST to the planning community by improving its ...

  16. High Fidelity Multi-Scale Regolith Simulation Tool for ISRU, Phase II

    National Aeronautics and Space Administration — NASA has serious unmet needs for simulation tools capable of predicting the behavior of lunar regolith in proposed excavation, transport and handling systems....

  17. Using Simulation to Teach About Poverty in Nursing Education: A Review of Available Tools.

    Reid, Carol A; Evanson, Tracy A

    2016-01-01

    Poverty is one of the most significant social determinants of health, and as such, it is imperative that nurses have an understanding of the impact that living in poverty has upon one's life and health. A lack of such understanding will impede nurses from providing care that is patient centered, treats all patients fairly, and advocates for social justice. It is essential that nursing educators assure that poverty-related content and effective teaching strategies are used in nursing curricula in order to help students develop this understanding. Several poverty-simulation tools are available and may be able to assist with development of accurate knowledge, skills, and attitudes. Unfortunately, little evidence exists to evaluate most poverty simulation tools. This article will provide an introduction to several poverty-related simulation tools, discuss any related research that evaluates their effectiveness, and make recommendations for integration of such simulation tools into nursing curricula. Copyright © 2016 Elsevier Inc. All rights reserved.

  18. DualSPHysics: A numerical tool to simulate real breakwaters

    Zhang, Feng; Crespo, Alejandro; Altomare, Corrado; Domínguez, José; Marzeddu, Andrea; Shang, Shao-ping; Gómez-Gesteira, Moncho

    2018-02-01

    The open-source code DualSPHysics is used in this work to compute the wave run-up on an existing dike on the Chinese coast using realistic dimensions, bathymetry and wave conditions. The GPU computing power of DualSPHysics allows simulating real engineering problems that involve complex geometries with high resolution in a reasonable computational time. The code is first validated by comparing the numerical free-surface elevation, the wave orbital velocities and the time series of the run-up with physical data from a wave flume. Those experiments include a smooth dike and an armored dike with two layers of cubic blocks. After validation, the code is applied to a real case to obtain the wave run-up under different incident wave conditions. In order to simulate the real open sea, spurious reflections from the wavemaker are removed using an active wave absorption technique.

  19. Dynamic wind turbine models in power system simulation tool DIgSILENT

    Hansen, A.D.; Jauch, C.; Sørensen, Poul Ejnar; Iov, F.; Blaabjerg, F.

    2004-01-01

    The present report describes the dynamic wind turbine models implemented in the power system simulation tool DIgSILENT (Version 12.0). The developed models are a part of the results of a national research project, whose overall objective is to create a model database in different simulation tools. This model database should be able to support the analysis of the interaction between the mechanical structure of the wind turbine and the electrical grid during different operational modes. The repo...

  20. Numerical simulation of Vlasov equation with parallel tools

    Peyroux, J.

    2005-11-01

    This project aims to make the resolution of Vlasov codes even more powerful through various parallelization tools (MPI, OpenMP...). A simplified test case served as a base for constructing the parallel codes, yielding a data-processing skeleton which can thereafter be re-used for increasingly complex models (more than four phase-space variables). This will make it possible to treat more realistic situations linked, for example, to the injection of ultra-short and ultra-intense pulses in inertial fusion plasmas, or to the study of the trapped-ion instability, now considered responsible for the generation of turbulence in tokamak plasmas. (author)

  1. Evaluation of a simulation tool in ophthalmology: application in teaching funduscopy

    Joice Elise Androwiki

    2015-02-01

    Full Text Available Purpose: The aim of this study was to evaluate the Eye Retinopathy Trainer® as a teaching tool for direct ophthalmoscopy by comparing it with the traditional method using volunteers. Methods: Fourth-year medical students received training in direct ophthalmoscopy using a simulation tool and human volunteers. Ninety students were randomized into a Simulation Group or a Control Group according to the inclusion or absence of the simulation model in classroom practice. Differences between the groups were analyzed using the unpaired Student's t-test. Results: The Simulation Group was superior to the Control Group: 51.06% performed the fundus examination successfully in both the anatomical simulation model and the human model, compared with 21.15% in the Control Group. Conclusion: The Eye Retinopathy Trainer® appears to be an effective teaching tool for the practice and improvement of the ophthalmologic examination among fourth-year medical students.
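
    The reported comparison can be reproduced schematically with an unpaired (pooled-variance) t statistic on 0/1 outcomes. The group sizes below (47 and 52) are chosen so that 24/47 = 51.06% and 11/52 = 21.15% match the reported rates, but they are a reconstruction, not data taken from the paper:

```python
import numpy as np

def unpaired_t(a, b):
    """Student's unpaired t statistic with pooled variance (as in the study)."""
    na, nb = len(a), len(b)
    sp2 = ((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2)
    return (a.mean() - b.mean()) / np.sqrt(sp2 * (1 / na + 1 / nb))

# Hypothetical 0/1 success outcomes mirroring the reported rates.
sim = np.array([1] * 24 + [0] * 23)    # 24/47 ~ 51.06% successful examinations
ctrl = np.array([1] * 11 + [0] * 41)   # 11/52 ~ 21.15% successful examinations
t = unpaired_t(sim, ctrl)
print(t)   # |t| ~ 3.2, well beyond ~1.98 (5% two-sided critical value, df = 97)
```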

  2. Simulation of W dust transport in the KSTAR tokamak, comparison with fast camera data

    A. Autricque

    2017-08-01

    Full Text Available In this paper, dust transport in tokamak plasmas is studied from both the experimental and the modeling side. Image processing routines allowing dust tracking in CCD camera videos are presented: the DUMPRO (DUst Movie PROcessing) code features a dust detection method and a trajectory reconstruction algorithm. In addition, a dust transport code named DUMBO (DUst Migration in a plasma BOundary) is briefly described. It has been developed at CEA in order to simulate dust grain transport in tokamaks and to evaluate the contribution of dust to the impurity inventory of the plasma. Like other dust transport codes, DUMBO adopts the Orbital Motion Limited (OML) approach for modeling dust/plasma interactions. OML gives direct expressions for the plasma ion and electron currents, forces and heat fluxes on a dust grain. The equation of motion is solved, giving access to the dust trajectory. An attempt at model validation is made by comparing simulated and measured trajectories from the 2015 KSTAR dust injection experiment, where W dust grains were successfully injected into the plasma using a gun-type injector. The trajectories of the injected particles, estimated by applying the DUMPRO routines to videos from the fast CCD camera in KSTAR, show two distinct general dust behaviors, due to different dust sizes. Simulations were made with DUMBO to match the measurements, with plasma parameters estimated from different diagnostics during the dust injection discharge. The experimental trajectories show longer lifetimes than the simulated ones. This may be due to the substitution of a boiling/sublimation point for the usual vaporization/sublimation cooling, to limitations of OML (potential barriers in the vicinity of a dust grain are neglected), and/or to the lack of a vapor shielding model in DUMBO.

  3. The realistic consideration of human factors in model based simulation tools for the air traffic control domain.

    Duca, Gabriella; Attaianese, Erminia

    2012-01-01

    Advanced Air Traffic Management (ATM) concepts related to automation, airspace organization and operational procedures are driven by the overall goal of increasing ATM system performance. Independently of the nature and/or impact of the envisaged changes (e.g. from a short-term procedure adjustment to a very long-term operational concept or the completion of aid tools), the preliminary assessment of possible gains in airspace/airport capacity, safety and cost-effectiveness is done by running Model Based Simulations (MBSs, also known as Fast Time Simulations - FTS). Since this is not a human-in-the-loop technique, the reliability of MBS results depends on the accuracy and significance of the modeled human factors. Despite that, in practice modeling tools commonly assume a generalized standardization of human behaviors and tasks and consider only a narrow range of work-environment factors that, in reality, affect actual human-system performance. The present paper aims to open a discussion about the possibility of keeping task descriptions and their related weights at a high, general level, suitable for efficient use of MBSs, while at the same time increasing simulation reliability by adopting adjustments derived from further variables related to the human aspects of controller workload.

  4. Fast simulation of wind generation for frequency stability analysis in island power systems

    Conroy, James [EirGrid, Dublin (Ireland)

    2010-07-01

    Frequency stability is a major issue for power system planning and operation in an island power system such as Ireland's. As increasing amounts of variable-speed wind generation are added to the system, this issue becomes more prominent, as variable-speed wind generation does not provide an inherent inertial response. This lack of an inertial response means that simplified models of variable-speed wind farms can be used for investigating frequency stability. EirGrid uses DIgSILENT PowerFactory (as well as other software tools) to investigate frequency stability. In PowerFactory, an automation program has been created to convert the detailed wind farm representation (needed for other types of analysis) into negative-load models for frequency stability analysis. The advantage of this approach is much-improved simulation speed without loss of accuracy. This approach can also be used to study future wind energy targets and the long-term simulation of voltage stability. (orig.)

  5. Response simulation and theoretical calibration of a dual-induction resistivity LWD tool

    Xu, Wei; Ke, Shi-Zhen; Li, An-Zong; Chen, Peng; Zhu, Jun; Zhang, Wei

    2014-03-01

    In this paper, responses of a new dual-induction resistivity logging-while-drilling (LWD) tool in 3D inhomogeneous formation models are simulated by the vector finite element method (VFEM), the influences of the borehole, invaded zone, surrounding strata, and tool eccentricity are analyzed, and calibration loop parameters and calibration coefficients of the LWD tool are discussed. The results show that the tool has a greater depth of investigation than that of the existing electromagnetic propagation LWD tools and is more sensitive to azimuthal conductivity. Both deep and medium induction responses have linear relationships with the formation conductivity, considering optimal calibration loop parameters and calibration coefficients. Due to the different depths of investigation and resolution, deep induction and medium induction are affected differently by the formation model parameters, thereby having different correction factors. The simulation results can provide theoretical references for the research and interpretation of the dual-induction resistivity LWD tools.

  6. Current and capacitance measurements as a fast diagnostic tool for evaluation of semiconductor parameters

    Kemmer, J; Krause, N; Krieglmeyer, C; Yang Yi

    2000-01-01

    A fast qualitative method is described for evaluation of semiconductor parameters by analyzing both the capacitance/voltage (C/V) and current/voltage (I/V) characteristics of pn- or Schottky-diodes, which are fabricated on the material under investigation. The method is applied for measurement of recombination and generation lifetimes of minority charge carriers and for determination of doping profiles and distribution of active generation/recombination (G/R) centers after irradiation with Am-alpha particles and deep phosphorus implantation. Measurements on epitaxial silicon result in doping profiles and distributions of active impurities within the epi-layer.
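The doping-profile determination mentioned above follows the standard C/V analysis of a one-sided abrupt junction, which can be sketched as below. The data arrays and device area are placeholders; the relations are textbook formulas rather than the authors' exact procedure.

```python
import numpy as np

Q = 1.602e-19               # elementary charge [C]
EPS_SI = 11.7 * 8.854e-12   # permittivity of silicon [F/m]

def doping_profile_from_cv(v_rev, cap, area, eps=EPS_SI):
    """Apparent doping profile N(w) from C-V data of a one-sided
    abrupt (Schottky or p+n) junction:

        N(w) = 2 / (q * eps * A^2 * d(1/C^2)/dV),   w = eps * A / C
    """
    inv_c2 = 1.0 / cap**2
    slope = np.gradient(inv_c2, v_rev)     # d(1/C^2)/dV at each bias point
    n = 2.0 / (Q * eps * area**2 * slope)  # carrier/doping concentration
    w = eps * area / cap                   # depletion depth per bias point
    return w, n
```

For a uniformly doped sample, 1/C² is linear in the reverse bias, so the extracted profile should be flat at the true concentration.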

  7. Spatial recurrence analysis: A sensitive and fast detection tool in digital mammography

    Prado, T. L.; Galuzio, P. P.; Lopes, S. R.; Viana, R. L.

    2014-01-01

    Efficient diagnosis of breast cancer requires fast digital mammographic image processing. Many breast lesions, both benign and malignant, are barely visible to the untrained eye and require accurate and reliable methods of image processing. We propose a new method of digital mammographic image analysis that meets both needs. It uses the concept of spatial recurrence as the basis of a spatial recurrence quantification analysis, the spatial extension of the well-known time recurrence analysis. The recurrence-based quantifiers are able to evidence breast lesions as effectively as the best standard image-processing methods available, but with better control over spurious fragments in the image.
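A toy version of the spatial recurrence idea can be sketched by treating small image patches as state vectors and thresholding their pairwise distances. The window size, threshold heuristic and quantifier below are illustrative assumptions, not the authors' exact definitions.

```python
import numpy as np

def spatial_recurrence_rate(image, window=3, eps=None):
    """Fraction of patch pairs that 'recur', i.e. whose Euclidean
    distance falls below the threshold eps (spatial analogue of the
    recurrence rate in time-series recurrence quantification analysis).
    """
    h, w = image.shape
    # every (window x window) patch, flattened into a state vector
    patches = np.array([
        image[i:i + window, j:j + window].ravel()
        for i in range(h - window + 1)
        for j in range(w - window + 1)
    ], dtype=float)
    # pairwise distances between all patch vectors
    d = np.linalg.norm(patches[:, None, :] - patches[None, :, :], axis=-1)
    if eps is None:
        eps = 0.2 * d.max()        # heuristic threshold (assumption)
    return (d < eps).mean()        # recurrence rate in [0, 1]
```

A homogeneous region recurs everywhere (rate 1), while textured or lesion-like regions lower the rate, which is what makes such quantifiers usable as detectors.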

  8. Rigorous simulation: a tool to enhance decision making

    Neiva, Raquel; Larson, Mel; Baks, Arjan [KBC Advanced Technologies plc, Surrey (United Kingdom)

    2012-07-01

    The world refining industries continue to be challenged by population growth (increased demand), regional market changes and the pressure of regulatory requirements to operate a 'green' refinery. Environmental regulations are reducing the value and use of heavy fuel oils, and are driving the conversion of more of the heavier products, or even heavier crudes, into lighter products while meeting increasingly stringent transportation fuel specifications. As a result, actions are required to establish a sustainable advantage for future success. Rigorous simulation provides a key advantage, improving the timing and efficient use of capital investment and maximizing profitability. Sustainably maximizing profit through rigorous modeling is achieved through enhanced performance monitoring and improved Linear Programme (LP) model accuracy. This paper contains examples of these two items; the combination of both increases overall rates of return. As refiners consider optimizing existing assets and expanding projects, the process agreed upon to achieve these goals is key to successful profit improvement. The benefit of rigorous kinetic simulation with detailed fractionation is that it allows existing asset utilization to be optimized while focusing the capital investment on the new unit(s), thereby optimizing the overall strategic plan and return on investment. Monitoring of individual process units works as a mechanism for validating and optimizing plant performance. Unit monitoring is important to rectify poor performance and increase profitability. The key to a good LP is the accuracy of the data used to generate the LP sub-model data. The value of rigorous unit monitoring is that the results are heat- and mass-balanced consistently, and are unique for a refiner's unit/refinery. With the improved match of the refinery operation, the rigorous simulation models will allow capturing more accurately the non-linearity of those process units and therefore provide correct

  9. Model and tool requirements for co-simulation of building performance

    Trcka, M.; Hensen, J.L.M.

    2006-01-01

    The use of building performance simulation (BPS) can substantially help in improving building design towards higher occupant comfort and lower fuel consumption, while reducing the emission of greenhouse gases. Unfortunately, current BPS tools do not allow inter-tool communication and thus limit a

  10. Advances in Chimera Grid Tools for Multi-Body Dynamics Simulations and Script Creation

    Chan, William M.

    2004-01-01

    This viewgraph presentation contains information about (1) a framework for multi-body dynamics, the Geometry Manipulation Protocol (GMP), (2) the simulation procedure using Chimera Grid Tools (CGT) and OVERFLOW-2, (3) further recent developments in Chimera Grid Tools (OVERGRID, grid modules, script library) and (4) future work.

  11. Discussing Virtual Tools that Simulate Probabilities: What Are the Middle School Teachers' Concerns?

    Savard, Annie; Freiman, Viktor; Theis, Laurent; Larose, Fançois

    2013-01-01

    Mathematics teachers, researchers and specialists in educational technology from Quebec, Canada developed virtual tools that make interactive simulations of games of chance. These tools were presented to a group of teachers from New Brunswick through workshops and they then got to test and validate them with their students. Semi-structured…

  12. Simulation of erosion in drilling tools for oil and gas industry

    Arefi, B.; Settari, A. [Calgary Univ., AB (Canada); Angman, P. [Tesco Corp., Calgary, AB (Canada)

    2004-07-01

    Erosion in oil well drilling tools is a form of wear which occurs when fluid containing solid particles impacts a solid surface. The intensity of erosion is generally measured as the rate of material removal from the surface, and is expressed as E_r, the weight of material removed per unit weight of impacting particles. Erosion can be reduced by tool improvement and modification, thereby extending the life of drilling tools. To date, no attempt had been made to model the erosion phenomenon in drilling tools. This paper presents a newly developed erosion simulator, the first such design tool for the drilling industry. This work demonstrates that erosion can be simulated. A model was developed to calibrate the erosion coefficients for drilling tool conditions. The mechanism of erosion is controlled by the impact velocity and angle. Algorithms were developed for transient simulation of the erosion of any surface in 2-dimensional geometry. The Erosion Simulator has been validated and calibrated against data provided by TESCO Corporation's casing drilling tools. The model has been shown to successfully predict and minimize erosion by modifying the tool geometry and metallurgy. 21 refs., 1 tab., 15 figs.
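Calibrated erosion models of this kind are often written in the empirical form E_r = K v^n f(α), with the constants fitted to measured wear data. The sketch below uses a simple angle function and placeholder coefficients purely for illustration; it is not the calibrated model from the paper.

```python
import numpy as np

def erosion_rate(v_impact, angle_deg, k=1.0e-9, n=2.3, a=1.0, b=2.0):
    """Illustrative empirical erosion law: E_r = K * v^n * f(alpha).

    f(alpha) = sin(alpha)^a * cos(alpha)^b peaks at an intermediate
    impact angle (about 35 degrees with the default exponents), as is
    typical for ductile metals. All coefficients here are hypothetical
    placeholders; a real simulator calibrates them against wear data.
    """
    alpha = np.radians(angle_deg)
    f = np.sin(alpha)**a * np.cos(alpha)**b   # angle dependence
    return k * v_impact**n * f                # velocity power law
```

The velocity exponent n and the angle exponents a, b are exactly the kind of "erosion coefficients" a calibration step would fit for a given tool material and particle type.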

  13. CPN/Tools: A Tool for Editing and Simulating Coloured Petri Nets

    Jensen, Kurt; Christensen, Søren; Ravn, Katrine

    2001-01-01

    the workspace. It challenges traditional ideas about user interfaces, getting rid of pull-down menus, scrollbars, and even selection, while providing the same or greater functionality. CPN/Tools requires an OpenGL graphics accelerator and will run on all major platforms (Windows, Unix/Linux, MacOS)....

  14. Fast simulation of transport and adaptive permeability estimation in porous media

    Berre, Inga

    2005-07-01

    The focus of the thesis is twofold: Both fast simulation of transport in porous media and adaptive estimation of permeability are considered. A short introduction that motivates the work on these topics is given in Chapter 1. In Chapter 2, the governing equations for one- and two-phase flow in porous media are presented. Overall numerical solution strategies for the two-phase flow model are also discussed briefly. The concepts of streamlines and time-of-flight are introduced in Chapter 3. Methods for computing streamlines and time-of-flight are also presented in this chapter. Subsequently, in Chapters 4 and 5, the focus is on simulation of transport in a time-of-flight perspective. In Chapter 4, transport of fluids along streamlines is considered. Chapter 5 introduces a different viewpoint based on the evolution of isocontours of the fluid saturation. While the first chapters focus on the forward problem, which consists in solving a mathematical model given the reservoir parameters, Chapters 6, 7 and 8 are devoted to the inverse problem of permeability estimation. An introduction to the problem of identifying spatial variability in reservoir permeability by inversion of dynamic production data is given in Chapter 6. In Chapter 7, adaptive multiscale strategies for permeability estimation are discussed. Subsequently, Chapter 8 presents a level-set approach for improving piecewise constant permeability representations. Finally, Chapter 9 summarizes the results obtained in the thesis; in addition, the chapter gives some recommendations and suggests directions for future work. Part II In Part II, the following papers are included in the order they were completed: Paper A: A Streamline Front Tracking Method for Two- and Three-Phase Flow Including Capillary Forces. I. Berre, H. K. Dahle, K. H. Karlsen, and H. F. Nordhaug. In Fluid flow and transport in porous media: mathematical and numerical treatment (South Hadley, MA, 2001), volume 295 of Contemp. Math., pages 49
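Along a single streamline, the time-of-flight concept introduced in Chapter 3 reduces to the integral τ(s) = ∫ φ/|v| ds'. A minimal numerical sketch, assuming the porosity and interstitial speed have been sampled along the streamline:

```python
import numpy as np

def time_of_flight(arc_length, speed, porosity):
    """Cumulative time-of-flight along a streamline,

        tau(s) = integral_0^s phi / |v| ds',

    computed with the trapezoidal rule from sampled data."""
    integrand = np.asarray(porosity) / np.asarray(speed)
    ds = np.diff(arc_length)
    # cumulative trapezoidal integration, with tau(0) = 0
    tau = np.concatenate(
        ([0.0], np.cumsum(0.5 * (integrand[1:] + integrand[:-1]) * ds)))
    return tau
```

In streamline simulation, transport is then solved in the τ coordinate, where the advection equation becomes one-dimensional, which is the source of the method's speed.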

  15. Simulation Tools for Forest Health Analysis: An Application in the Red River Watershed, Idaho

    Andrew J. McMahan; Eric L. Smith

    2006-01-01

    Software tools for landscape analyses--including FVS model extensions, and a number of FVS-related pre- and post-processing “tools”--are presented, using an analysis in the Red River Watershed, Nez Perce National Forest as an example. We present (1) a discussion of pre-simulation data analysis; (2) the Physiographic Information Extraction System (PIES), a tool that can...

  16. An advanced modelling tool for simulating complex river systems.

    Trancoso, Ana Rosa; Braunschweig, Frank; Chambel Leitão, Pedro; Obermann, Matthias; Neves, Ramiro

    2009-04-01

    The present paper describes MOHID River Network (MRN), a 1D hydrodynamic model for river networks that is part of the MOHID Water Modelling System, a modular system for the simulation of water bodies (hydrodynamics and water constituents). MRN is capable of simulating water quality in the aquatic and benthic phases, and its development was especially focused on reproducing processes occurring in temporary river networks (flush events, pool formation, and transmission losses). Further, unlike many other models, it allows the quantification of settled materials at the channel bed, including over periods when the river falls dry. These features are very important to secure mass conservation in the highly varying flows of temporary rivers. The water quality models existing in MOHID are based on well-known ecological models, such as WASP and ERSEM, the latter allowing explicit parameterization of the C, N, P, Si, and O cycles. MRN can be coupled to the basin model, MOHID Land, which computes runoff and porous-media transport, allowing for the dynamic exchange of water and materials between the river and its surroundings, or it can be used as a standalone model, receiving discharges at any specified nodes (ASCII files of time series with arbitrary time step). These features account for spatial gradients in precipitation, which can be significant in Mediterranean-like basins. An interface has already been developed for the SWAT basin model.

  17. Can the FAST and ROSIER adult stroke recognition tools be applied to confirmed childhood arterial ischemic stroke?

    Babl Franz E

    2011-10-01

    Background: Stroke recognition tools have been shown to improve diagnostic accuracy in adults. Development of a similar tool in children is needed to reduce the lag time to diagnosis. A critical first step is to determine whether adult stroke scales can be applied in childhood stroke. Our objective was to assess the applicability of adult stroke scales in childhood arterial ischemic stroke (AIS). Methods: Children aged 1 month to Results: 47 children with AIS were identified. 34 had anterior, 12 had posterior and 1 child had both anterior and posterior circulation infarcts. Median age was 9 years and 51% were male. Median time from symptom onset to ED presentation was 21 hours, but one third of children presented within 6 hours. The most common presenting stroke symptoms were arm (63%), face (62%) and leg weakness (57%), speech disturbance (46%) and headache (46%). The most common signs were arm (61%), face (70%) or leg weakness (57%) and dysarthria (34%). 36 (78%) of children had at least one positive variable on FAST and 38 (81%) had a positive score of ≥1 on the ROSIER scale. Positive scores were less likely in children with posterior circulation stroke. Conclusion: The presenting features of pediatric stroke appear similar to adult strokes. Two adult stroke recognition tools have fair to good sensitivity in radiologically confirmed childhood AIS but require further development and modification. The specificity of the tools also needs to be determined in a prospective cohort of children with stroke and non-stroke brain attacks.

  18. Fast neutrons: Inexpensive and reliable tool to investigate high-LET particle radiobiology

    Gueulette, J.; Slabbert, J.P.; Bischoff, P.; Denis, J.M.; Wambersie, A.; Jones, D.

    2010-01-01

    Radiation therapy with carbon ions as well as missions into outer space have boosted interest in high-LET particle radiobiology. Optimization of treatments in accordance with technical developments, as well as the radioprotection of cosmonauts during long missions, requires that research in these domains continue. Suitable radiation fields are therefore needed. Fast neutrons and carbon ions exhibit comparable LET values and similar radiobiological properties. Consequently, the findings obtained with each radiation quality can be shared to benefit knowledge in all concerned domains. The p(66)+Be neutron therapy facilities of iThemba LABS (South Africa) and the p(65)+Be neutron facility of Louvain-la-Neuve (Belgium) are in constant use for radiobiological research supporting clinical applications of fast neutrons. These beams, which comply with all physical and technical requirements for clinical applications, are now fully reliable, easy to use and frequently accessible for radiobiological investigations. These facilities thus provide unique opportunities for radiobiological experimentation, especially for investigations that require long irradiation times and/or fractionated treatments.

  19. WATSFAR: numerical simulation of soil WATer and Solute fluxes using a FAst and Robust method

    Crevoisier, David; Voltz, Marc

    2013-04-01

    To simulate the evolution of hydro- and agro-systems, numerous spatialised models are based on a multi-local approach, and improvement of simulation accuracy by data-assimilation techniques is now used in many application fields. The latest acquisition techniques provide a large amount of experimental data, which increases the efficiency of parameter estimation and inverse modelling approaches. In turn, simulations are often run on large temporal and spatial domains, which requires a large number of model runs. Ultimately, despite the regular increase in computing capacity, the development of fast and robust methods describing the evolution of saturated-unsaturated soil water and solute fluxes is still a challenge. Ross (2003, Agron J; 95:1352-1361) proposed a method, solving the 1D Richards and convection-diffusion equations, that fulfils these requirements. The method is based on a non-iterative approach which reduces the risk of numerical divergence and allows the use of coarser spatial and temporal discretisations, while assuring a satisfying accuracy of the results. Crevoisier et al. (2009, Adv Wat Res; 32:936-947) proposed some technical improvements and validated this method over a wider range of agro-pedo-climatic situations. In this poster, we present the simulation code WATSFAR, which generalises the Ross method to other mathematical representations of the soil water retention curve (i.e. the standard and modified van Genuchten models) and includes a dual-permeability context (preferential fluxes) for both water and solute transfer. The situations tested are those known to be least favourable for standard numerical methods: fine-textured and extremely dry soils, intense rainfall and solute fluxes, soils near saturation, ... The results of WATSFAR have been compared with the standard finite-element model Hydrus. The analysis of these comparisons highlights two main advantages of WATSFAR: i) robustness: even on fine-textured soil or high water and solute
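The soil-water retention representations that WATSFAR generalises can be illustrated with the standard van Genuchten (1980) curve and the associated Mualem conductivity model. The sketch below implements these textbook formulas; it is not WATSFAR's own code, and the parameter values in the test are arbitrary.

```python
import numpy as np

def van_genuchten_theta(h, theta_r, theta_s, alpha, n):
    """Standard van Genuchten water retention curve.
    h is the pressure head (negative in the unsaturated zone)."""
    m = 1.0 - 1.0 / n
    se = np.where(h < 0, (1.0 + (alpha * np.abs(h))**n)**(-m), 1.0)
    return theta_r + (theta_s - theta_r) * se

def mualem_k(h, k_s, alpha, n):
    """Mualem-van Genuchten unsaturated hydraulic conductivity."""
    m = 1.0 - 1.0 / n
    se = np.where(h < 0, (1.0 + (alpha * np.abs(h))**n)**(-m), 1.0)
    return k_s * np.sqrt(se) * (1.0 - (1.0 - se**(1.0 / m))**m)**2
```

At saturation (h = 0) the curves recover θs and Ks; as the soil dries, both the water content and the conductivity drop sharply, which is what makes dry fine-textured soils numerically demanding.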

  20. Conception of a PWR simulator as a tool for safety analysis

    Lanore, J.M.; Bernard, P.; Romeyer Dherbey, J.; Bonnet, C.; Quilchini, P.

    1982-09-01

    A simulator can be a very useful tool for safety analysis, allowing the study of accident sequences involving malfunctions of the systems and operator interventions. The main characteristics of the simulator SALAMANDRE (description of the systems, physical models, programming organization, control desk) were therefore selected according to the objectives of safety analysis.

  1. Achieving informed decision-making for net zero energy buildings design using building performance simulation tools

    Attia, S.G.; Gratia, E.; De Herde, A.; Hensen, J.L.M.

    2013-01-01

    Building performance simulation (BPS) is the basis for informed decision-making of Net Zero Energy Buildings (NZEBs) design. This paper aims to investigate the use of building performance simulation tools as a method of informing the design decision of NZEBs. The aim of this study is to evaluate the

  2. Integration of distributed system simulation tools for a holistic approach to integrated building and system design

    Radosevic, M.; Hensen, J.L.M.; Wijsman, A.J.T.M.; Hensen, J.L.M.; Lain, M.

    2004-01-01

    Advanced architectural developments require an integrated approach to design, where simulation tools available today deal only with a small subset of the overall problem. The aim of this study is to enable run-time exchange of the necessary data at a suitable frequency between different simulation

  3. An Exploration of the Effectiveness of an Audit Simulation Tool in a Classroom Setting

    Zelin, Robert C., II

    2010-01-01

    The purpose of this study was to examine the effectiveness of using an audit simulation product in a classroom setting. Many students and professionals feel that a disconnect exists between learning auditing in the classroom and practicing auditing in the workplace. It was hoped that the introduction of an audit simulation tool would help to…

  4. LBM-EP: Lattice-Boltzmann method for fast cardiac electrophysiology simulation from 3D images.

    Rapaka, S; Mansi, T; Georgescu, B; Pop, M; Wright, G A; Kamen, A; Comaniciu, Dorin

    2012-01-01

    Current treatments of heart rhythm disorders require careful planning and guidance for optimal outcomes. Computational models of cardiac electrophysiology are being proposed for therapy planning, but current approaches are either too simplified or too computationally intensive for patient-specific simulations in clinical practice. This paper presents a novel approach, LBM-EP, to solve any type of mono-domain cardiac electrophysiology model in near real time, especially tailored for patient-specific simulations. The domain is discretized on a Cartesian grid with a level-set representation of the patient's heart geometry, previously estimated from images automatically. The cell model is calculated node-wise, while the transmembrane potential is diffused using the Lattice-Boltzmann method within the domain defined by the level set. Experiments on synthetic cases, on a data set from CESC'10 and on one patient with myocardial scar showed that LBM-EP provides results comparable to an FEM implementation, while being 10-45 times faster. Fast, accurate, scalable and requiring no specific meshing, LBM-EP paves the way to efficient and detailed models of cardiac electrophysiology for therapy planning.
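The diffusion step at the heart of such a Lattice-Boltzmann solver can be illustrated with a minimal D2Q5 BGK scheme for pure diffusion. This is a sketch under simplifying assumptions: periodic boundaries instead of a level-set domain, no cell model, and an array layout chosen here for illustration.

```python
import numpy as np

def lbm_diffusion_step(f, tau):
    """One collide-and-stream step of a D2Q5 lattice-Boltzmann scheme
    for pure diffusion.

    f has shape (5, ny, nx); directions are (rest, +x, -x, +y, -y) with
    weights w = (1/3, 1/6, 1/6, 1/6, 1/6). The diffusion coefficient in
    lattice units is D = (tau - 0.5) / 3.
    """
    w = np.array([1/3, 1/6, 1/6, 1/6, 1/6])
    rho = f.sum(axis=0)                  # macroscopic field (e.g. potential)
    feq = w[:, None, None] * rho         # equilibrium distributions
    f = f - (f - feq) / tau              # BGK collision (relaxation to feq)
    # streaming along the five lattice directions (periodic boundaries)
    f[1] = np.roll(f[1],  1, axis=1)     # +x
    f[2] = np.roll(f[2], -1, axis=1)     # -x
    f[3] = np.roll(f[3],  1, axis=0)     # +y
    f[4] = np.roll(f[4], -1, axis=0)     # -y
    return f
```

Because collision and streaming are purely local array operations, the scheme parallelizes trivially, which is one reason LBM implementations reach the reported speed-ups over FEM.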

  5. Simulation technologies in networking and communications selecting the best tool for the test

    Pathan, Al-Sakib Khan; Khan, Shafiullah

    2014-01-01

    Simulation is a widely used mechanism for validating the theoretical models of networking and communication systems. Although the claims made based on simulations are considered to be reliable, how reliable they really are is best determined with real-world implementation trials. Simulation Technologies in Networking and Communications: Selecting the Best Tool for the Test addresses the spectrum of issues regarding the different mechanisms related to simulation technologies in networking and communications fields. Focusing on the practice of simulation testing instead of the theory, it presents

  6. Development Of Dynamic Probabilistic Safety Assessment: The Accident Dynamic Simulator (ADS) Tool

    Chang, Y.H.; Mosleh, A.; Dang, V.N.

    2003-01-01

    The development of a dynamic methodology for Probabilistic Safety Assessment (PSA) addresses the complex interactions between the behaviour of technical systems and personnel response in the evolution of accident scenarios. This paper introduces the discrete dynamic event tree, a framework for dynamic PSA, and its implementation in the Accident Dynamic Simulator (ADS) tool. Dynamic event tree tools generate and quantify accident scenarios through coupled simulation models of the plant physical processes, its automatic systems, the equipment reliability, and the human response. The current research on the framework, the ADS tool, and on Human Reliability Analysis issues within dynamic PSA, is discussed. (author)

  8. Tensit - a simulation tool for migration, risk and dose calculations

    Jones, J.; Kautsky, U.; Vahlund, C.F.

    2004-01-01

    During the coming years, the Swedish Nuclear Fuel and Waste Management Co (SKB) will perform site investigations for a future repository of spent nuclear fuel. The repository will be situated in crystalline rock at a depth of approximately 500 m. Novel methods based on systems and landscape ecology are developed to understand the interacting mechanisms and, finally, to model radionuclide migration in the biosphere using site-specific data. These models and methods are later used as part of the overall safety assessment for the repository, where migration in the near field and in the bedrock is also considered. In the present paper, a newly developed probabilistic simulation package, TENSIT, is presented. The package is based on pre-existing codes (Matlab, Simulink and the probabilistic engine @Risk) and is capable of performing radionuclide migration calculations both for the repository and for the biosphere. Hence, a platform-independent, transparent (well documented and intuitive at the model scale), thoroughly supported, efficient and user-friendly (graphical interface for the modeller) code can be developed at fairly low cost. Comparisons with other codes used for compartment-based biosphere modelling and with the PSACOIN Level 1B exercise show good agreement at the application scale. Moreover, by basing the package on continuously maintained, pre-existing codes, the potential risks associated with less widely used software are avoided. In addition to the compartment models based on transfer factors, TENSIT is also able to handle more complex ecosystem models (based on flows of carbon and nutrients), either separately or in combination with the compartment models. Within the project, biosphere migration calculations have been performed with TENSIT for a watershed in the Forsmark area (one of the studied sites). In this simulation, data from the ongoing site investigation programme were used to define the base model. (author)
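A compartment model with transfer factors, of the kind TENSIT solves, amounts to a linear ODE system dC/dt = KC (plus radioactive decay). The sketch below integrates a hypothetical two-compartment example with explicit Euler; the compartments and rate values are invented for illustration and have nothing to do with the Forsmark data.

```python
import numpy as np

def simulate_compartments(k_matrix, c0, dt, n_steps, decay=0.0):
    """Integrate a linear compartment model dC/dt = K C - lambda C
    with explicit Euler.

    K[i, j] is the transfer rate from compartment j to compartment i;
    its columns sum to zero, so transfers alone conserve total inventory.
    """
    c = np.array(c0, float)
    out = [c.copy()]
    for _ in range(n_steps):
        c = c + dt * (k_matrix @ c - decay * c)
        out.append(c.copy())
    return np.array(out)

# hypothetical water <-> sediment exchange, rates in 1/yr
K = np.array([[-0.5,  0.01],
              [ 0.5, -0.01]])
hist = simulate_compartments(K, c0=[1.0, 0.0], dt=0.01, n_steps=2000)
```

With no decay the total inventory is conserved and the system relaxes to the steady state K C = 0; a probabilistic engine would then sample the rate coefficients to propagate parameter uncertainty.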

  9. The GlycanBuilder: a fast, intuitive and flexible software tool for building and displaying glycan structures.

    Ceroni, Alessio; Dell, Anne; Haslam, Stuart M

    2007-08-07

    Carbohydrates play a critical role in human diseases and their potential utility as biomarkers for pathological conditions is a major driver for characterization of the glycome. However, the additional complexity of glycans compared to proteins and nucleic acids has slowed the advancement of glycomics in comparison to genomics and proteomics. The branched nature of carbohydrates, the great diversity of their constituents and the numerous alternative symbolic notations make the input and display of glycans less straightforward than, for example, the amino-acid sequence of a protein. Every glycoinformatic tool providing a user interface would benefit from a fast, intuitive, appealing mechanism for input and output of glycan structures in a computer-readable format. A software tool for building and displaying glycan structures using a chosen symbolic notation is described here. The "GlycanBuilder" uses an automatic rendering algorithm to draw the saccharide symbols and to place them on the drawing board. The information about the symbolic notation is derived from a configurable graphical model as a set of rules governing the aspect and placement of residues and linkages. The algorithm is able to represent a structure using only a few traversals of the tree and is inherently fast. The tool uses an XML format for import and export of encoded structures. The rendering algorithm described here is able to produce high-quality representations of glycan structures in a chosen symbolic notation. The automated rendering process enables the "GlycanBuilder" to be used both as a user-independent component for displaying glycans and as an easy-to-use drawing tool. The "GlycanBuilder" can be integrated in web pages as a Java applet for the visual editing of glycans. The same component is available as a web service to render an encoded structure into a graphical format. Finally, the "GlycanBuilder" can be integrated into other applications to create intuitive and appealing user
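The few-traversal layout idea can be sketched with a classic tidy-tree rule: leaves take consecutive slots and each parent is centred over its children, all in a single recursive pass. The nested-dict representation below is an assumption for illustration, not the GlycanBuilder's actual data model or rule set.

```python
def layout_tree(tree):
    """Assign drawing coordinates to a glycan-like tree in one traversal.

    tree is a nested dict: {"label": str, "children": [subtrees]}.
    x is the depth of the node; leaves get consecutive y slots and each
    internal node is centred on its children. Returns {label: (x, y)},
    assuming labels are unique.
    """
    pos = {}
    next_leaf = [0]                        # next free leaf slot

    def place(node, depth):
        children = node.get("children", [])
        if not children:
            y = next_leaf[0]               # leaf: take the next slot
            next_leaf[0] += 1
        else:
            ys = [place(c, depth + 1) for c in children]
            y = sum(ys) / len(ys)          # centre parent over children
        pos[node["label"]] = (depth, y)
        return y

    place(tree, 0)
    return pos
```

A renderer would then map these abstract coordinates to pixels and apply notation-specific rules for symbols and linkage placement.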

  10. The GlycanBuilder: a fast, intuitive and flexible software tool for building and displaying glycan structures

    Dell Anne

    2007-08-01

    Full Text Available Abstract Background Carbohydrates play a critical role in human diseases and their potential utility as biomarkers for pathological conditions is a major driver for characterization of the glycome. However, the additional complexity of glycans compared to proteins and nucleic acids has slowed the advancement of glycomics in comparison to genomics and proteomics. The branched nature of carbohydrates, the great diversity of their constituents and the numerous alternative symbolic notations, make the input and display of glycans not as straightforward as for example the amino-acid sequence of a protein. Every glycoinformatic tool providing a user interface would benefit from a fast, intuitive, appealing mechanism for input and output of glycan structures in a computer readable format. Results A software tool for building and displaying glycan structures using a chosen symbolic notation is described here. The "GlycanBuilder" uses an automatic rendering algorithm to draw the saccharide symbols and to place them on the drawing board. The information about the symbolic notation is derived from a configurable graphical model as a set of rules governing the aspect and placement of residues and linkages. The algorithm is able to represent a structure using only few traversals of the tree and is inherently fast. The tool uses an XML format for import and export of encoded structures. Conclusion The rendering algorithm described here is able to produce high-quality representations of glycan structures in a chosen symbolic notation. The automated rendering process enables the "GlycanBuilder" to be used both as a user-independent component for displaying glycans and as an easy-to-use drawing tool. The "GlycanBuilder" can be integrated in web pages as a Java applet for the visual editing of glycans. The same component is available as a web service to render an encoded structure into a graphical format. 
Finally, the "GlycanBuilder" can be integrated into other
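
The "few traversals" layout idea behind the rendering algorithm can be sketched in a few lines: one post-order pass assigns each residue a position, with branch points centred on their children. The Node class, spacing rules and example tree below are illustrative assumptions, not the actual GlycanBuilder data model.

```python
class Node:
    """A residue in a glycan tree (hypothetical, simplified model)."""
    def __init__(self, label, children=None):
        self.label = label
        self.children = children or []
        self.x = self.y = 0.0

def layout(root, x_gap=1.0, y_gap=1.0):
    """Single post-order traversal: leaves get successive y slots,
    an inner residue is centred on its children, depth maps to x."""
    next_y = [0.0]
    def place(node, depth):
        if not node.children:
            node.y = next_y[0]
            next_y[0] += y_gap
        else:
            for child in node.children:
                place(child, depth + 1)
            node.y = sum(c.y for c in node.children) / len(node.children)
        node.x = depth * x_gap
    place(root, 0)
    return root

# Example: a tiny branched glycan-like tree
root = layout(Node("GlcNAc", [Node("Man", [Node("Man"), Node("Man")])]))
```

A real renderer would then map each (x, y) to a saccharide symbol and linkage glyph according to the configured notation.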

  11. Method and Tool for Design Process Navigation and Automatic Generation of Simulation Models for Manufacturing Systems

    Nakano, Masaru; Kubota, Fumiko; Inamori, Yutaka; Mitsuyuki, Keiji

    Manufacturing system designers should concentrate on designing and planning manufacturing systems instead of spending their efforts on creating the simulation models to verify the design. This paper proposes a method and its tool to navigate the designers through the engineering process and generate the simulation model automatically from the design results. The design agent also supports collaborative design projects among different companies or divisions with distributed engineering and distributed simulation techniques. The idea was implemented and applied to a factory planning process.

  12. Intelligent Electric Power Systems with Active-Adaptive Electric Networks: Challenges for Simulation Tools

    Ufa Ruslan A.

    2015-01-01

    Full Text Available The motivation of the presented research is based on the need to develop new methods and tools for adequate simulation of intelligent electric power systems with active-adaptive electric networks (IES), including Flexible Alternating Current Transmission System (FACTS) devices. The key requirements for the simulation were formulated. The presented analysis of simulation results of IES confirms the need to use a hybrid modelling approach.

  13. Simulation of enhanced tokamak performance on DIII-D using fast wave current drive

    Grassie, J.S. de; Lin-Liu, Y.R.; Petty, C.C.; Pinsker, R.I.; Chan, V.S.; Prater, R.; John, H. St.; Baity, F.W.; Goulding, R.H.; Hoffman, D.H.

    1993-01-01

    The fast magnetosonic wave is now recognized to be a leading candidate for noninductive current drive for the tokamak reactor due to the ability of the wave to penetrate to the hot dense core region. Fast wave current drive (FWCD) experiments on DIII-D have realized up to 120 kA of rf current drive, with up to 40% of the plasma current driven noninductively. The success of these experiments at 60 MHz with a 2 MW transmitter source capability has led to a major upgrade of the FWCD system. Two additional transmitters, 30 to 120 MHz, with a 2 MW source capability each, will be added together with two new four-strap antennas in early 1994. Another major thrust of the DIII-D program is to develop advanced tokamak modes of operation, simultaneously demonstrating improvements in confinement and stability in quasi-steady-state operation. In some of the initial advanced tokamak experiments on DIII-D with neutral beam heated (NBI) discharges it has been demonstrated that energy confinement time can be improved by rapidly elongating the plasma to force the current density profile to be more centrally peaked. However, this high-l_i phase of the discharge with the commensurate improvement in confinement is transient as the current density profile relaxes. By applying FWCD to the core of such a κ-ramped discharge it may be possible to sustain the high internal inductance and elevated confinement. Using computational tools validated on the initial DIII-D FWCD experiments we find that such a high-l_i advanced tokamak discharge should be capable of sustainment at the 1 MA level with the upgraded capability of the FWCD system. (author) 16 refs., 3 figs., 1 tab

  14. Simulation of operational processes in hospital emergency units as lean healthcare tool

    Andreia Macedo Gomes

    2017-07-01

    Full Text Available Recently, the Lean philosophy has been gaining importance due to a competitive environment, which increases the need to reduce costs. Lean practices and tools have been applied to manufacturing, services, supply chains and startups; the next frontier is healthcare. Most lean techniques can be easily adapted to health organizations. Therefore, this paper summarizes Lean practices and tools that are already being applied in health organizations. Among the numerous techniques and lean tools used, this research highlights Simulation. In order to understand the use of Simulation as a Lean Healthcare tool, this research analyzes, through the simulation technique, the operational dynamics of the service process of a fictitious hospital emergency unit. Initially, a systematic review of the literature on the practices and tools of Lean Healthcare was carried out in order to identify the main techniques practiced. The review highlighted Simulation as the sixth most cited tool in the literature. Subsequently, a service model of an emergency unit was simulated with the Arena software. As a main result, the attendants in the built model presented a degree of idleness and are therefore able to meet a greater demand. Finally, it was verified that the emergency room is the process with the longest service time and the greatest overload.
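
The queueing behaviour described above, idle attendants implying spare capacity, can be reproduced with a minimal discrete-event sketch in plain Python. The arrival and service parameters are invented for illustration and are not taken from the paper's Arena model.

```python
import random

def simulate_unit(n_patients=1000, n_attendants=3,
                  mean_interarrival=5.0, mean_service=12.0, seed=42):
    """Toy discrete-event model of an emergency unit: patients arrive with
    exponential interarrival times and are served by the first attendant
    to become free. Returns attendant utilisation and mean patient wait."""
    rng = random.Random(seed)
    free_at = [0.0] * n_attendants   # time each attendant next becomes free
    busy = [0.0] * n_attendants      # accumulated busy time per attendant
    t = 0.0
    total_wait = 0.0
    for _ in range(n_patients):
        t += rng.expovariate(1.0 / mean_interarrival)   # next arrival
        k = min(range(n_attendants), key=lambda i: free_at[i])
        start = max(t, free_at[k])
        service = rng.expovariate(1.0 / mean_service)
        total_wait += start - t
        busy[k] += service
        free_at[k] = start + service
    horizon = max(free_at)
    return {"utilisation": sum(busy) / (n_attendants * horizon),
            "mean_wait": total_wait / n_patients}

stats = simulate_unit()
```

With these illustrative rates the offered load per attendant is 12/5/3 = 0.8, so the model shows some idleness, i.e. capacity for additional demand, which is the kind of result the study reports.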

  15. Simulations

    Ngada, Narcisse

    2015-06-15

    The complexity and cost of building and running high-power electrical systems make the use of simulations unavoidable. The simulations available today provide deep insight into how systems really operate. This paper helps the reader to gain an insight into simulation in the field of power converters for particle accelerators. Starting with the definition and basic principles of simulation, two simulation types, as well as their leading tools, are presented: analog and numerical simulations. Some practical applications of each simulation type are also considered. The final conclusion then summarizes the most important items to keep in mind before opting for a simulation tool or before performing a simulation.

  16. A fast platform for simulating semi-flexible fiber suspensions applied to cell mechanics

    Nazockdast, Ehssan, E-mail: ehssan@cims.nyu.edu [Courant Institute of Mathematical Sciences, New York University, New York, NY 10012 (United States); Center for Computational Biology, Simons Foundation, New York, NY 10010 (United States); Rahimian, Abtin, E-mail: arahimian@acm.org [Courant Institute of Mathematical Sciences, New York University, New York, NY 10012 (United States); Zorin, Denis, E-mail: dzorin@cs.nyu.edu [Courant Institute of Mathematical Sciences, New York University, New York, NY 10012 (United States); Shelley, Michael, E-mail: shelley@cims.nyu.edu [Courant Institute of Mathematical Sciences, New York University, New York, NY 10012 (United States); Center for Computational Biology, Simons Foundation, New York, NY 10010 (United States)

    2017-01-15

    We present a novel platform for the large-scale simulation of three-dimensional fibrous structures immersed in a Stokesian fluid and evolving under confinement or in free-space in three dimensions. One of the main motivations for this work is to study the dynamics of fiber assemblies within biological cells. For this, we also incorporate the key biophysical elements that determine the dynamics of these assemblies, which include the polymerization and depolymerization kinetics of fibers, their interactions with molecular motors and other objects, their flexibility, and hydrodynamic coupling. This work, to our knowledge, is the first technique to include many-body hydrodynamic interactions (HIs), and the resulting fluid flows, in cellular assemblies of flexible fibers. We use non-local slender body theory to compute the fluid–structure interactions of the fibers and a second-kind boundary integral formulation for other rigid bodies and the confining boundary. A kernel-independent implementation of the fast multipole method is utilized for efficient evaluation of HIs. The deformation of the fibers is described by nonlinear Euler–Bernoulli beam theory and their polymerization is modeled by the reparametrization of the dynamic equations in the appropriate non-Lagrangian frame. We use a pseudo-spectral representation of fiber positions and implicit time-stepping to resolve large fiber deformations, and to allow time-steps not excessively constrained by temporal stiffness or fiber–fiber interactions. The entire computational scheme is parallelized, which enables simulating assemblies of thousands of fibers. We use our method to investigate two important questions in the mechanics of cell division: (i) the effect of confinement on the hydrodynamic mobility of microtubule asters; and (ii) the dynamics of the positioning of mitotic spindle in complex cell geometries. Finally to demonstrate the general applicability of the method, we simulate the sedimentation of a
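
The Euler-Bernoulli description the platform uses for fiber deformation reduces, for small transverse displacements, to a bending force proportional to the fourth arc-length derivative of position. A finite-difference sketch of that relation is shown below; the paper's actual solver is pseudo-spectral and implicit, which this does not attempt to reproduce.

```python
def bending_force(points, EI=1.0, ds=1.0):
    """Discrete Euler-Bernoulli bending force f = -EI * d4X/ds4 on the
    interior nodes of a fiber discretised as equally spaced points
    (here 1-D transverse displacements; end nodes are omitted)."""
    n = len(points)
    forces = []
    for i in range(2, n - 2):
        # standard 5-point fourth-derivative stencil
        d4 = (points[i - 2] - 4 * points[i - 1] + 6 * points[i]
              - 4 * points[i + 1] + points[i + 2]) / ds ** 4
        forces.append(-EI * d4)
    return forces
```

A straight or uniformly tilted fiber carries no bending force, which is the sanity check one would run before coupling such forces to the hydrodynamics.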

  17. A fast platform for simulating semi-flexible fiber suspensions applied to cell mechanics

    Nazockdast, Ehssan; Rahimian, Abtin; Zorin, Denis; Shelley, Michael

    2017-01-01

    We present a novel platform for the large-scale simulation of three-dimensional fibrous structures immersed in a Stokesian fluid and evolving under confinement or in free-space in three dimensions. One of the main motivations for this work is to study the dynamics of fiber assemblies within biological cells. For this, we also incorporate the key biophysical elements that determine the dynamics of these assemblies, which include the polymerization and depolymerization kinetics of fibers, their interactions with molecular motors and other objects, their flexibility, and hydrodynamic coupling. This work, to our knowledge, is the first technique to include many-body hydrodynamic interactions (HIs), and the resulting fluid flows, in cellular assemblies of flexible fibers. We use non-local slender body theory to compute the fluid–structure interactions of the fibers and a second-kind boundary integral formulation for other rigid bodies and the confining boundary. A kernel-independent implementation of the fast multipole method is utilized for efficient evaluation of HIs. The deformation of the fibers is described by nonlinear Euler–Bernoulli beam theory and their polymerization is modeled by the reparametrization of the dynamic equations in the appropriate non-Lagrangian frame. We use a pseudo-spectral representation of fiber positions and implicit time-stepping to resolve large fiber deformations, and to allow time-steps not excessively constrained by temporal stiffness or fiber–fiber interactions. The entire computational scheme is parallelized, which enables simulating assemblies of thousands of fibers. We use our method to investigate two important questions in the mechanics of cell division: (i) the effect of confinement on the hydrodynamic mobility of microtubule asters; and (ii) the dynamics of the positioning of mitotic spindle in complex cell geometries. Finally to demonstrate the general applicability of the method, we simulate the sedimentation of a

  18. Thermal-hydraulic numerical simulation of fuel sub-assembly for Sodium-cooled Fast Reactor

    Saxena, Aakanksha

    2014-01-01

    The thesis focuses on the numerical simulation of sodium flow in the wire-wrapped sub-assembly of a Sodium-cooled Fast Reactor (SFR). First calculations were carried out with a time-averaging approach, RANS (Reynolds-Averaged Navier-Stokes equations), using the industrial code STAR-CCM+. This study gives a clear understanding of heat transfer between the fuel pin and sodium. The main variables of the macroscopic flow are in agreement with the correlations used hitherto. However, to obtain a detailed description of temperature fluctuations around the spacer wire, more accurate approaches like LES (Large Eddy Simulation) and DNS (Direct Numerical Simulation) are clearly needed. For the LES approach, the code TRIO U was used, and for the DNS approach, a research code was used. These approaches require considerably long calculation times, which leads to the need for a representative but simplified geometry. The DNS approach enables us to study the thermal hydraulics of sodium, whose very low Prandtl number induces a thermal field that behaves very differently from the hydraulic field. The LES approach is used to study the local region of the sub-assembly. This study shows that the spacer wire generates local hot spots (∼20 °C) on its wake side with respect to the sodium flow, in the region of contact with the fuel pin. Temperature fluctuations around the spacer wire are low (∼1-2 °C). Under nominal operation, the spectral analysis shows the absence of any dominant peak for temperature oscillations at low frequency (2-10 Hz). The obtained spectra of temperature oscillations can be used as an input for further mechanical studies to determine their impact on the solid structures. (author) [fr

  19. A fast platform for simulating semi-flexible fiber suspensions applied to cell mechanics

    Nazockdast, Ehssan; Rahimian, Abtin; Zorin, Denis; Shelley, Michael

    2017-01-01

    We present a novel platform for the large-scale simulation of three-dimensional fibrous structures immersed in a Stokesian fluid and evolving under confinement or in free-space in three dimensions. One of the main motivations for this work is to study the dynamics of fiber assemblies within biological cells. For this, we also incorporate the key biophysical elements that determine the dynamics of these assemblies, which include the polymerization and depolymerization kinetics of fibers, their interactions with molecular motors and other objects, their flexibility, and hydrodynamic coupling. This work, to our knowledge, is the first technique to include many-body hydrodynamic interactions (HIs), and the resulting fluid flows, in cellular assemblies of flexible fibers. We use non-local slender body theory to compute the fluid-structure interactions of the fibers and a second-kind boundary integral formulation for other rigid bodies and the confining boundary. A kernel-independent implementation of the fast multipole method is utilized for efficient evaluation of HIs. The deformation of the fibers is described by nonlinear Euler-Bernoulli beam theory and their polymerization is modeled by the reparametrization of the dynamic equations in the appropriate non-Lagrangian frame. We use a pseudo-spectral representation of fiber positions and implicit time-stepping to resolve large fiber deformations, and to allow time-steps not excessively constrained by temporal stiffness or fiber-fiber interactions. The entire computational scheme is parallelized, which enables simulating assemblies of thousands of fibers. We use our method to investigate two important questions in the mechanics of cell division: (i) the effect of confinement on the hydrodynamic mobility of microtubule asters; and (ii) the dynamics of the positioning of mitotic spindle in complex cell geometries. 
Finally to demonstrate the general applicability of the method, we simulate the sedimentation of a cloud of

  20. Final report on LDRD project: Simulation/optimization tools for system variability analysis

    R. L. Bierbaum; R. F. Billau; J. E. Campbell; K. D. Marx; R. J. Sikorski; B. M. Thompson; S. D. Wix

    1999-10-01

    This work was conducted during FY98 (Proposal Number 98-0036) and FY99 (Proposal Number 99-0818) under the auspices of the Sandia National Laboratories Laboratory-Directed Research and Development (LDRD) program. Electrical simulation typically treats a single data point in the very large input space of component properties. For electrical simulation to reach its full potential as a design tool, it must be able to address the unavoidable variability and uncertainty in component properties. Component variability is strongly related to the design margin (and reliability) of the end product. During the course of this project, both tools and methodologies were developed to enable analysis of variability in the context of electrical simulation tools. Two avenues to link relevant tools were also developed, and the resultant toolset was applied to a major component.
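
The kind of variability analysis the project couples to electrical simulation can be illustrated by Monte Carlo sampling of component tolerances for a trivial circuit. The RC low-pass filter, nominal values and tolerance bands below are hypothetical stand-ins for a sweep driven through a real simulator.

```python
import math
import random

def cutoff_spread(r_nom=1e3, c_nom=100e-9, tol_r=0.05, tol_c=0.10,
                  n_samples=5000, seed=1):
    """Monte Carlo sweep over component tolerances for an RC low-pass
    filter: sample R and C within their tolerance bands, compute the
    cutoff frequency f_c = 1/(2*pi*R*C), and return its mean and spread."""
    rng = random.Random(seed)
    fc = []
    for _ in range(n_samples):
        r = r_nom * (1 + rng.gauss(0.0, tol_r / 3))   # 3-sigma tolerance band
        c = c_nom * (1 + rng.gauss(0.0, tol_c / 3))
        fc.append(1.0 / (2 * math.pi * r * c))
    mean = sum(fc) / len(fc)
    var = sum((f - mean) ** 2 for f in fc) / len(fc)
    return mean, var ** 0.5

mean_fc, std_fc = cutoff_spread()
```

The spread of the output, rather than a single nominal point, is what connects component variability to design margin and reliability.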

  1. A simulator-independent optimization tool based on genetic algorithm applied to nuclear reactor design

    Abreu Pereira, Claudio Marcio Nascimento do; Schirru, Roberto; Martinez, Aquilino Senra

    1999-01-01

    Here is presented an engineering optimization tool based on a genetic algorithm, implemented according to the method proposed in recent work that demonstrated the feasibility of this technique in nuclear reactor core design. The tool is simulator-independent in the sense that it can be customized to use most simulators whose input parameters are read from formatted text files and whose outputs are also written to a text file. As nuclear reactor simulators generally use such an interface, the proposed tool plays an important role in nuclear reactor design. Research reactors often use non-conventional design approaches, leading the nuclear engineer to face new optimization problems. In such cases, a good optimization technique, together with its customizing facility and a friendly man-machine interface, can be very useful. Here, the tool is described and some advantages are outlined. (author)
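
A minimal sketch of the simulator-independent coupling described above: the genetic algorithm only ever talks to the "simulator" through formatted text files, so swapping in a real reactor code means replacing a single function. The objective function, file names and GA settings are invented for illustration.

```python
import os
import random
import tempfile

def run_simulator(param_file, out_file):
    """Stand-in for an external simulator: reads one parameter per line
    from a text file and writes a single figure of merit (hypothetical
    objective peaking at x_i = 0.5; only the file coupling matters)."""
    with open(param_file) as f:
        x = [float(line) for line in f]
    merit = -sum((xi - 0.5) ** 2 for xi in x)
    with open(out_file, "w") as f:
        f.write(f"{merit}\n")

def evaluate(ind, workdir):
    """Write parameters, invoke the 'simulator', read the merit back."""
    pfile = os.path.join(workdir, "params.txt")
    ofile = os.path.join(workdir, "out.txt")
    with open(pfile, "w") as f:
        f.writelines(f"{x}\n" for x in ind)
    run_simulator(pfile, ofile)   # the only call a real code would replace
    with open(ofile) as f:
        return float(f.read())

def genetic_search(n_genes=4, pop_size=20, generations=30, seed=7):
    rng = random.Random(seed)
    workdir = tempfile.mkdtemp()
    pop = [[rng.random() for _ in range(n_genes)] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=lambda ind: evaluate(ind, workdir),
                        reverse=True)
        elite = scored[: pop_size // 2]          # keep the best half
        pop = elite + [[(a + b) / 2 + rng.gauss(0, 0.05)  # blend + mutate
                        for a, b in zip(rng.choice(elite), rng.choice(elite))]
                       for _ in range(pop_size - len(elite))]
    best = max(pop, key=lambda ind: evaluate(ind, workdir))
    return best, evaluate(best, workdir)

best, merit = genetic_search()
```

Because the elite is carried forward unchanged, the best merit found never degrades from generation to generation.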

  2. Finite Element Simulation of Sheet Metal Forming Process Using Local Interpolation for Tool Surfaces

    Hama, Takayuki; Takuda, Hirohiko; Takamura, Masato; Makinouchi, Akitake; Teodosiu, Cristian

    2005-01-01

    Treatment of contact between a sheet and tools is one of the most difficult problems to deal with in finite-element simulations of sheet forming processes. In order to obtain more accurate tool models without increasing the number of elements, this paper describes a new formulation for contact problems using the interpolation proposed by Nagata for tool surfaces. A contact search algorithm between sheet nodes and the interpolated tool surfaces was developed and was introduced into the static-explicit elastoplastic finite-element method code STAMP3D. Simulations of a square cup deep drawing process with a very coarsely discretized punch model were carried out. The simulated results showed that the proposed algorithm gave the proper drawn shape, demonstrating the validity of the proposed algorithm.

  3. DSC: software tool for simulation-based design of control strategies applied to wastewater treatment plants.

    Ruano, M V; Ribes, J; Seco, A; Ferrer, J

    2011-01-01

    This paper presents a computer tool called DSC (Simulation based Controllers Design) that enables an easy design of control systems and strategies applied to wastewater treatment plants. Although the control systems are developed and evaluated by simulation, this tool aims to facilitate the direct implementation of the designed control system on the PC of the full-scale WWTP (wastewater treatment plant). The designed control system can be programmed in a dedicated control application and can be connected to either the simulation software or the SCADA of the plant. To this end, the developed DSC incorporates an OPC (OLE for Process Control) server, which provides an open-standard communication protocol for different industrial process applications. The potential capabilities of the DSC tool are illustrated through the example of a full-scale application. An aeration control system applied to a nutrient-removing WWTP was designed, tuned and evaluated with the DSC tool before its implementation in the full-scale plant. The control parameters obtained by simulation were suitable for the full-scale plant with only a few modifications to improve the control performance. With the DSC tool, the control systems' performance can be easily evaluated by simulation. Once developed and tuned by simulation, the control systems can be directly applied to the full-scale WWTP.

  4. Genetic Simulation Tools for Post-Genome Wide Association Studies of Complex Diseases

    Amos, Christopher I.; Bafna, Vineet; Hauser, Elizabeth R.; Hernandez, Ryan D.; Li, Chun; Liberles, David A.; McAllister, Kimberly; Moore, Jason H.; Paltoo, Dina N.; Papanicolaou, George J.; Peng, Bo; Ritchie, Marylyn D.; Rosenfeld, Gabriel; Witte, John S.

    2014-01-01

    Genetic simulation programs are used to model data under specified assumptions to facilitate the understanding and study of complex genetic systems. Standardized data sets generated using genetic simulation are essential for the development and application of novel analytical tools in genetic epidemiology studies. With continuing advances in high-throughput genomic technologies and generation and analysis of larger, more complex data sets, there is a need for updating current approaches in genetic simulation modeling. To provide a forum to address current and emerging challenges in this area, the National Cancer Institute (NCI) sponsored a workshop, entitled “Genetic Simulation Tools for Post-Genome Wide Association Studies of Complex Diseases” at the National Institutes of Health (NIH) in Bethesda, Maryland on March 11-12, 2014. The goals of the workshop were to: (i) identify opportunities, challenges and resource needs for the development and application of genetic simulation models; (ii) improve the integration of tools for modeling and analysis of simulated data; and (iii) foster collaborations to facilitate development and applications of genetic simulation. During the course of the meeting the group identified challenges and opportunities for the science of simulation, software and methods development, and collaboration. This paper summarizes key discussions at the meeting, and highlights important challenges and opportunities to advance the field of genetic simulation. PMID:25371374

  5. Dielectric properties of proteins from simulations: tools and techniques

    Simonson, Thomas; Perahia, David

    1995-09-01

    Tools and techniques to analyze the dielectric properties of proteins are described. Microscopic dielectric properties are determined by a susceptibility tensor of order 3n, where n is the number of protein atoms. For perturbing charges not too close to the protein, the dielectric relaxation free energy is directly related to the dipole-dipole correlation matrix of the unperturbed protein, or equivalently to the covariance matrix of its atomic displacements. These are straightforward to obtain from existing molecular dynamics packages such as CHARMM or X-PLOR. Macroscopic dielectric properties can be derived from the dipolar fluctuations of the protein, by idealizing the protein as one or more spherical media. The dipolar fluctuations are again directly related to the covariance matrix of the atomic displacements. An interesting consequence is that the quasiharmonic approximation, which by definition exactly reproduces this covariance matrix, gives the protein dielectric constant exactly. Finally a technique is reviewed to obtain normal or quasinormal modes of vibration of symmetric protein assemblies. Using elementary group theory, and eliminating the high-frequency modes of vibration of each monomer, the limiting step in terms of memory and computation is finding the normal modes of a single monomer, with the other monomers held fixed. This technique was used to study the dielectric properties of the Tobacco Mosaic Virus protein disk.
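
The central quantity above, the dipole fluctuation (covariance) matrix from which both relaxation free energies and the macroscopic dielectric constant are derived, is straightforward to compute from trajectory snapshots. A plain-Python sketch follows; a real analysis would read a CHARMM or X-PLOR trajectory rather than the toy two-atom frames shown here.

```python
def dipole_covariance(charges, snapshots):
    """Dipole fluctuation matrix <dM dM> over configurational snapshots.
    snapshots: list of frames, each a list of (x, y, z) per atom.
    The total dipole of a frame is M_k = sum_i q_i * r_i[k]."""
    dipoles = []
    for frame in snapshots:
        m = [sum(q * r[k] for q, r in zip(charges, frame)) for k in range(3)]
        dipoles.append(m)
    n = len(dipoles)
    mean = [sum(m[k] for m in dipoles) / n for k in range(3)]
    return [[sum((m[a] - mean[a]) * (m[b] - mean[b]) for m in dipoles) / n
             for b in range(3)] for a in range(3)]

# Toy example: a +1/-1 pair whose separation fluctuates along x
charges = [1.0, -1.0]
frames = [[(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)],
          [(0.0, 0.0, 0.0), (3.0, 0.0, 0.0)]]
cov = dipole_covariance(charges, frames)
```

The resulting 3x3 matrix is symmetric by construction, and only the fluctuating direction carries a nonzero entry in this toy case.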

  6. Benchmarking of fast-running software tools used to model releases during nuclear accidents

    Devitt, P.; Viktorov, A., E-mail: Peter.Devitt@cnsc-ccsn.gc.ca, E-mail: Alex.Viktorov@cnsc-ccsn.gc.ca [Canadian Nuclear Safety Commission, Ottawa, ON (Canada)

    2015-07-01

    Fukushima highlighted the importance of effective nuclear accident response. However, its complexity greatly impacted the ability to provide timely and accurate information to national and international stakeholders. Safety recommendations provided by different national and international organizations varied notably. Such differences can partially be attributed to different methods used in the initial assessment of accident progression and the amount of radioactivity released. Therefore, a comparison of methodologies was undertaken by the NEA/CSNI and its highlights are presented here. For this project, the prediction tools used by various emergency response organizations for estimating the source terms and public doses were examined. Those organizations that have a capability to use such tools responded to a questionnaire describing each code's capabilities and main algorithms. Then the project's participants analyzed five accident scenarios to predict the source term, dispersion of releases and public doses. (author)

  7. A Simulation Tool for the Study of Symmetric Inversions in Bacterial Genomes

    Dias, Ulisses; Dias, Zanoni; Setubal, João C.

    We present the tool SIB that simulates genomic inversions in bacterial chromosomes. The tool simulates symmetric inversions but allows the appearance of nonsymmetric inversions by simulating small syntenic blocks frequently observed in bacterial genome comparisons. We develop measures that allow quantitative comparisons between real pairwise alignments (in terms of dotplots) and simulated ones; these measures also allow an evaluation of SIB in terms of dendrograms. We evaluate SIB by comparing its results to whole-chromosome alignments and maximum likelihood trees for three bacterial groups (the Pseudomonadaceae family and the Xanthomonas and Shewanella genera). We demonstrate an application of SIB by using it to evaluate the ancestral genome reconstruction tool MGR.
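
A symmetric inversion in the sense used here, one whose endpoints are equidistant from the genome's axis of symmetry, is easy to state on the usual signed-permutation encoding of gene order. The encoding and midpoint convention below are assumptions for illustration, not SIB's internal representation.

```python
def symmetric_inversion(genome, i):
    """Apply one inversion symmetric about the genome's midpoint: the
    segment genome[i : n-i] is reversed and every gene's sign flipped,
    mimicking replication-axis-symmetric rearrangement events.
    genome: list of signed integers (standard rearrangement convention)."""
    n = len(genome)
    assert 0 <= i < n - i, "endpoints must be equidistant from the midpoint"
    segment = [-g for g in reversed(genome[i:n - i])]
    return genome[:i] + segment + genome[n - i:]

# Example on a six-gene genome
genome = [1, 2, 3, 4, 5, 6]
inverted = symmetric_inversion(genome, 1)   # -> [1, -5, -4, -3, -2, 6]
```

Symmetric inversions preserve the diagonal-like dotplot structure the paper compares against, and each one is its own inverse.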

  8. Simulation of silicon microdosimetry spectra in fast neutron therapy using the GEANT4 Monte Carlo toolkit

    Cornelius, I.M.; Rosenfeld, A.B.

    2003-01-01

    Microdosimetry is used to predict the biological effects of the densely ionizing radiation environments of hadron therapy and space. The creation of a solid state microdosimeter to replace the conventional Tissue Equivalent Proportional Counter (TEPC) is a topic of ongoing research. The Centre for Medical Radiation Physics has been investigating a technique using microscopic arrays of reverse-biased PN junctions. A prototype silicon-on-insulator (SOI) microdosimeter was developed and preliminary measurements have been conducted at several hadron therapy facilities. Several factors impede the application of silicon microdosimeters to hadron therapy. One of the major limitations is that of tissue equivalence: ideally, the silicon microdosimeter should provide a microdosimetry distribution identical to that of a microscopic volume of tissue. For microdosimetry in neutron fields, such as Fast Neutron Therapy, it is important that products resulting from neutron interactions in the non-tissue-equivalent sensitive volume do not contribute significantly to the spectrum. Experimental measurements have been conducted at the Gershenson Radiation Oncology Center, Harper Hospital, Detroit by Bradley et al. The aim was to provide a comparison with measurements performed with a TEPC under identical experimental conditions. Monte Carlo based calculations of these measurements were made using the GEANT4 Monte Carlo toolkit. Agreement between experimental and theoretical results was observed. The model illustrated the importance of neutron interactions in the non-tissue-equivalent sensitive volume and showed this effect to decrease with sensitive volume size, as expected. Simulations were also performed for 1 micron cubic silicon sensitive volumes embedded in tissue equivalent material to predict the best-case scenario for silicon microdosimetry in Fast Neutron Therapy.

  9. Inventory simulation tools: Separating nuclide contributions to radiological quantities

    Gilbert, Mark R.; Fleming, Michael; Sublet, Jean-Christophe

    2017-09-01

    The activation response of a material is a primary factor considered when evaluating its suitability for a nuclear application. Various radiological quantities, such as total (becquerel) activity, decay heat, and γ dose, can be readily predicted via inventory simulations, which numerically evolve in time the composition of a material under exposure to neutron irradiation. However, the resulting data sets can be very complex, often necessarily resulting in an over-simplification of the results - most commonly by just considering total response metrics. A number of different techniques for disseminating more completely the vast amount of data output from, in particular, the FISPACT-II inventory code system, including importance diagrams, nuclide maps, and primary knock-on atom (PKA) spectra, have been developed and used in scoping studies to produce database reports for the periodic table of elements. This paper introduces the latest addition to this arsenal - standardised and automated plotting of the time evolution in a radiological quantity for a given material separated by contributions from dominant radionuclides. Examples for relevant materials under predicted fusion reactor conditions, and for benchmarking studies against decay-heat measurements, demonstrate the usefulness and power of these radionuclide-separated activation plots. Note to the reader: the pdf file has been changed on September 22, 2017.
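
The nuclide-separated view of a radiological quantity can be illustrated with the analytic Bateman solution for a two-member decay chain, splitting total activity into per-radionuclide contributions exactly as the described plots do for FISPACT-II output. The initial inventory and decay constants below are illustrative, not data from the paper.

```python
import math

def activity_contributions(n0, lam_a, lam_b, times):
    """Activity of a two-member decay chain A -> B -> (stable), separated
    by nuclide. Analytic Bateman solution; requires lam_a != lam_b.
    Returns, for each time, the per-nuclide activities and their total."""
    out = []
    for t in times:
        a = n0 * lam_a * math.exp(-lam_a * t)            # activity of A
        b = (n0 * lam_a * lam_b / (lam_b - lam_a)         # activity of B
             * (math.exp(-lam_a * t) - math.exp(-lam_b * t)))
        out.append({"A": a, "B": b, "total": a + b})
    return out

curve = activity_contributions(n0=1e6, lam_a=0.1, lam_b=0.5,
                               times=[0.0, 1.0, 5.0, 20.0])
```

Stacking the per-nuclide curves recovers the total-activity curve, which is the presentation the standardised plots automate for arbitrary irradiated materials.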

  10. Inventory simulation tools: Separating nuclide contributions to radiological quantities

    Gilbert Mark R.

    2017-01-01

    Full Text Available The activation response of a material is a primary factor considered when evaluating its suitability for a nuclear application. Various radiological quantities, such as total (becquerel) activity, decay heat, and γ dose, can be readily predicted via inventory simulations, which numerically evolve in time the composition of a material under exposure to neutron irradiation. However, the resulting data sets can be very complex, often necessarily resulting in an over-simplification of the results – most commonly by just considering total response metrics. A number of different techniques for disseminating more completely the vast amount of data output from, in particular, the FISPACT-II inventory code system, including importance diagrams, nuclide maps, and primary knock-on atom (PKA) spectra, have been developed and used in scoping studies to produce database reports for the periodic table of elements. This paper introduces the latest addition to this arsenal – standardised and automated plotting of the time evolution in a radiological quantity for a given material separated by contributions from dominant radionuclides. Examples for relevant materials under predicted fusion reactor conditions, and for benchmarking studies against decay-heat measurements, demonstrate the usefulness and power of these radionuclide-separated activation plots.

  11. Solar assisted heat pump on air collectors: A simulation tool

    Karagiorgas, Michalis; Galatis, Kostas; Tsagouri, Manolis [Department of Mechanical Engineering Educators, ASPETE, N. Iraklio, GR 14121 (Greece); Tsoutsos, Theocharis [Environmental Engineering Dept., Technical University of Crete, Technical University Campus, GR 73100, Chania (Greece); Botzios-Valaskakis, Aristotelis [Centre for Renewable Energy Sources (CRES), 19th km Marathon Ave., GR 19001, Pikermi (Greece)

    2010-01-15

    The heating system of the bioclimatic building of the Greek National Centre for Renewable Energy Sources (CRES) comprises two heating plants: the first one includes an air source heat pump, Solar Air Collectors (SACs) and a heat distribution system (comprising a fan coil unit network); the second one is, mainly, a geothermal heat pump unit to cover the ground floor thermal needs. The SAC configuration as well as the fraction of the building heating load covered by the heating plant are assessed in two operation modes: the direct (hot air from the collectors is supplied directly to the heated space) and the indirect mode (warm air from the SAC or its mixture with ambient air is not supplied directly to the heated space but indirectly into the evaporator of the air source heat pump). The technique of the indirect mode of heating aims at maximizing the efficiency of the SAC, saving electrical power consumed by the compressor of the heat pump, and therefore, at optimizing the coefficient of performance (COP) of the heat pump due to the increased intake of ambient thermal energy by means of the SAC. Results are given for three research objectives: assessment of the heat pump efficiency, whether in direct or indirect heating mode; assessment of the overall heating plant efficiency on a daily or hourly basis; and assessment of the credibility of the suggested simulation model TSAGAIR by comparing its results with the TRNSYS ones. (author)
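
Why the indirect mode pays off can be seen from a Carnot-scaled COP estimate: preheating the evaporator intake with the SAC narrows the temperature lift the compressor must cover. The 0.45 exergetic efficiency and the temperatures below are typical assumed values, not figures from the TSAGAIR model.

```python
def carnot_cop(t_cond_c, t_evap_c, exergetic_eff=0.45):
    """Idealised heat-pump COP scaled from the Carnot limit:
    COP = eff * T_cond / (T_cond - T_evap), temperatures in kelvin."""
    t_cond = t_cond_c + 273.15
    t_evap = t_evap_c + 273.15
    return exergetic_eff * t_cond / (t_cond - t_evap)

cop_direct = carnot_cop(45.0, 5.0)     # ambient air at 5 degC into evaporator
cop_indirect = carnot_cop(45.0, 20.0)  # SAC-warmed air at 20 degC
```

Raising the evaporator inlet from 5 to 20 degC increases the estimated COP substantially, which is the mechanism by which the indirect mode saves compressor power.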

  12. Simulating high frequency water quality monitoring data using a catchment runoff attenuation flux tool (CRAFT).

    Adams, Russell; Quinn, Paul F; Perks, Matthew; Barber, Nicholas J; Jonczyk, Jennine; Owen, Gareth J

    2016-12-01

    High resolution water quality data have recently become widely available from numerous catchment based monitoring schemes. However, the models that can reproduce time series of concentrations or fluxes have not kept pace with the advances in monitoring data. Model performance at predicting phosphorus (P) and sediment concentrations has frequently been poor, with models not fit for purpose except for predicting annual losses. Here, the data from the Eden Demonstration Test Catchments (DTC) project have been used to calibrate the Catchment Runoff Attenuation Flux Tool (CRAFT), a new, parsimonious model developed with the aim of modelling both the generation and attenuation of nutrients and sediments in small to medium sized catchments. The CRAFT runs on an hourly timestep and can calculate the mass of sediments and nutrients transported by three flow pathways representing rapid surface runoff, fast subsurface drainage and slow groundwater flow (baseflow). The attenuation feature of the model, introduced here, enables surface runoff and the contaminants it transports to be delayed in reaching the catchment outlet; it was used to investigate some hypotheses of nutrient and sediment transport in the Newby Beck Catchment (NBC). Model performance was assessed using a suite of metrics, including visual best fit and the Nash-Sutcliffe efficiency; such a suite, rather than any single metric, may be the best assessment method for water quality models. Furthermore, it was found that attenuation was required when the aim of the simulations was to reproduce the time series of total P (TP) or total reactive P (TRP) with the best visual fit. The model will be used in the future to explore the impacts on water quality of different mitigation options in the catchment, including attenuation of surface runoff.
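    One metric in such a suite, the Nash-Sutcliffe efficiency, is straightforward to compute. This is a generic sketch of the standard formula, not CRAFT code:

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 for a perfect fit, 0 when the model is
    no better than the mean of the observations, negative when worse."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum(
        (observed - observed.mean()) ** 2)

obs = np.array([1.0, 2.0, 3.0, 4.0])
nse_perfect = nash_sutcliffe(obs, obs)              # perfect fit
nse_mean = nash_sutcliffe(obs, np.full(4, obs.mean()))  # mean-only "model"
```

    Because a constant mean prediction already scores 0, a useful hydrological or water quality model is generally expected to score well above that baseline.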

  13. Framatome's ''SAF'' engineering simulator: a first step toward defining the engineer's simulation tool of the year 2000

    Constantieux, T.

    1986-01-01

    Among the techniques available to engineers today, computerized simulation is taking on an ever-growing importance. The ''SAF'' simulator, designed by Framatome for the use of its own engineers, has been in service since 1985. The SAF simulator provides continuous assistance to the engineer, from the preliminary design stage to the precise definition of operating procedures, including safety analysis and sizing computations. For the engineer of the year 2000, who will be used to dialoguing with the computer from a very young age, the SAF represents a first step toward a comprehensive simulation tool. Interactive and thus ''alive'', the SAF combines both extensive programming and data processing capabilities. Its simulation domain can still be considerably extended. Highly modular and equipped with easy-to-use compilers, the SAF can be readily modified and reconfigured by the user, to enable testing new models or new systems, in the complex and detailed environment of the nuclear unit being analysed. Employing the advanced computer programs used in project design, the SAF simulator is a particularly high-performance tool for simulating and analysing complex accident scenarios, including multiple equipment failures and possible operator errors, which may extend to complete draining of the reactor vessel and the release of radioactive fission products within the containment structure.

  14. Fast Poisson Solvers for Self-Consistent Beam-Beam and Space-Charge Field Computation in Multiparticle Tracking Simulations

    Florio, Adrien; Pieloni, Tatiana; CERN. Geneva. ATS Department

    2015-01-01

    We present two different approaches to solving the 2-dimensional electrostatic problem with open boundary conditions, to be used in fast tracking codes for beam-beam and space charge simulations in high energy accelerators. We compare a fast multipole method with a hybrid Poisson solver based on the fast Fourier transform and finite differences in polar coordinates. We show that the latter outperforms the former in terms of execution time and precision, allowing for a reduction of the noise in the tracking simulation. Furthermore, the new algorithm is shown to scale linearly on parallel architectures with shared memory. We conclude by effectively replacing the HFMM by the new Poisson solver in the COMBI code.
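    The spectral division at the heart of an FFT-based Poisson solver can be sketched for the simpler periodic case. The paper's hybrid solver handles open boundaries and polar coordinates; this illustrative version assumes a periodic Cartesian box:

```python
import numpy as np

def poisson_fft_periodic(rho, lx=1.0, ly=1.0):
    """Solve laplacian(phi) = -rho on a periodic box via FFT: in Fourier
    space -k^2 * phi_hat = -rho_hat, hence phi_hat = rho_hat / k^2."""
    ny, nx = rho.shape
    kx = 2 * np.pi * np.fft.fftfreq(nx, d=lx / nx)  # angular wavenumbers
    ky = 2 * np.pi * np.fft.fftfreq(ny, d=ly / ny)
    k2 = kx[None, :] ** 2 + ky[:, None] ** 2
    rho_hat = np.fft.fft2(rho)
    phi_hat = np.zeros_like(rho_hat)
    nonzero = k2 > 0
    phi_hat[nonzero] = rho_hat[nonzero] / k2[nonzero]  # zero mode left at 0
    return np.real(np.fft.ifft2(phi_hat))
```

    For a single Fourier mode this solver is exact to machine precision, which makes it easy to verify against an analytic solution.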

  15. The simulation of the process of sodium freezing in the tubes for the optimization of fast breeder reactor units maintenance

    Tashlykov, O.L.; Shcheklein, S.E.; Annikov, S.V.

    2013-01-01

    The peculiarities of repair work on the sodium systems of fast breeder reactors are considered. Requirements for excluding molten sodium inside the equipment and piping during their opening and repair are given. The results of simulating the sodium cooling process with SolidWorks software are also described.

  16. A numerical tool for reproducing driver behaviour: experiments and predictive simulations.

    Casucci, M; Marchitto, M; Cacciabue, P C

    2010-03-01

    This paper presents the simulation tool SDDRIVE (Simple Simulation of Driver performance), the numerical computerised implementation of the theoretical architecture describing Driver-Vehicle-Environment (DVE) interactions contained in Cacciabue and Carsten [Cacciabue, P.C., Carsten, O. A simple model of driver behaviour to sustain design and safety assessment of automated systems in automotive environments, 2010]. Following a brief description of the basic algorithms that simulate the performance of drivers, the paper presents and discusses a set of experiments carried out in a full-scale virtual reality simulator to validate the simulation. The predictive potential of the tool is then shown by discussing two case studies of DVE interactions, performed in the presence of different driver attitudes under similar traffic conditions.

  17. Evaluation and comparison of models and modelling tools simulating nitrogen processes in treatment wetlands

    Edelfeldt, Stina; Fritzson, Peter

    2008-01-01

    In this paper, two ecological models of nitrogen processes in treatment wetlands have been evaluated and compared. These models were implemented, simulated, and visualized using the Modelica modelling and simulation language [P. Fritzson, Principles of Object-Oriented Modelling and Simulation with Modelica 2.1 (Wiley-IEEE Press, USA, 2004).] and an associated tool. The differences and similarities between the MathModelica Model Editor and three other ecological modelling tools have also been evaluated. The results show that the models can well be modelled and simulated in the MathModelica Model Editor, and that nitrogen decrease in a constructed treatment wetland should be described and simulated using the Nitrification/Denitrification model, as this model has the highest overall quality score and provides a more variable environment.

  18. Virtual reality and live simulation: a comparison between two simulation tools for assessing mass casualty triage skills.

    Luigi Ingrassia, Pier; Ragazzoni, Luca; Carenzo, Luca; Colombo, Davide; Ripoll Gallardo, Alba; Della Corte, Francesco

    2015-04-01

    This study tested the hypothesis that virtual reality simulation is equivalent to live simulation for testing naive medical students' abilities to perform mass casualty triage using the Simple Triage and Rapid Treatment (START) algorithm in a simulated disaster scenario, and for detecting the improvement in these skills after a teaching session. Fifty-six students in their last year of medical school were randomized into two groups (A and B). The same scenario, a car accident, was developed identically on the two simulation methodologies: virtual reality and live simulation. On day 1, group A was exposed to the live scenario and group B was exposed to the virtual reality scenario, aiming to triage 10 victims. On day 2, all students attended a 2-h lecture on mass casualty triage, specifically the START triage method. On day 3, groups A and B were crossed over. The groups' abilities to perform mass casualty triage in terms of triage accuracy, intervention correctness, and speed in the scenarios were assessed. Triage and lifesaving treatment scores were assessed equally by virtual reality and live simulation on day 1 and on day 3. Both simulation methodologies detected an improvement in triage accuracy and treatment correctness from day 1 to day 3 (P …). Virtual reality simulation proved to be a valuable tool, equivalent to live simulation, for testing medical students' abilities to perform mass casualty triage and for detecting improvement in such skills.

  19. A Comparison of Robotic Simulation Performance on Basic Virtual Reality Skills: Simulator Subjective Versus Objective Assessment Tools.

    Dubin, Ariel K; Smith, Roger; Julian, Danielle; Tanaka, Alyssa; Mattingly, Patricia

    To answer the question of whether there is a difference between robotic virtual reality simulator performance assessments and those of validated human reviewers. Current surgical education relies heavily on simulation. Several assessment tools are available to the trainee, including the actual robotic simulator assessment metrics and the Global Evaluative Assessment of Robotic Skills (GEARS) metrics, both of which have been independently validated. GEARS is a rating scale through which human evaluators can score trainees' performances on 6 domains: depth perception, bimanual dexterity, efficiency, force sensitivity, autonomy, and robotic control. Each domain is scored on a 5-point Likert scale with anchors. We used 2 common robotic simulators, the dV-Trainer (dVT; Mimic Technologies Inc., Seattle, WA) and the da Vinci Skills Simulator (dVSS; Intuitive Surgical, Sunnyvale, CA), to compare the performance metrics of robotic surgical simulators with the GEARS for a basic robotic task on each simulator. A prospective single-blinded randomized study. A surgical education and training center. Surgeons and surgeons in training. Demographic information was collected, including sex, age, level of training, specialty, and previous surgical and simulator experience. Subjects performed 2 trials of ring and rail 1 (RR1) on each of the 2 simulators (dVSS and dVT) after undergoing randomization and warm-up exercises. The second RR1 trial simulator performance was recorded, and the deidentified videos were sent to human reviewers using GEARS. Eight different simulator assessment metrics were identified and paired with a similar performance metric in the GEARS tool. The GEARS evaluation scores and simulator assessment scores were paired and a Spearman rho calculated for their level of correlation. Seventy-four subjects were enrolled in this randomized study, with 9 subjects excluded for missing or incomplete data. There was a strong correlation between the GEARS score and the simulator metric
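    Pairing scores and computing a Spearman rho, as done for the GEARS/simulator comparison, can be sketched as follows; the score values are hypothetical:

```python
import numpy as np

def spearman_rho(a, b):
    """Spearman rank correlation: the Pearson correlation of the rank
    vectors. (No tie correction; adequate for untied scores.)"""
    def ranks(x):
        order = np.argsort(x)
        r = np.empty(len(x))
        r[order] = np.arange(1, len(x) + 1)
        return r
    return np.corrcoef(ranks(np.asarray(a)), ranks(np.asarray(b)))[0, 1]

# Hypothetical paired scores for five subjects: a GEARS total and the
# matching simulator metric.
gears = [18, 22, 25, 15, 30]
simulator = [61, 70, 78, 55, 90]
rho = spearman_rho(gears, simulator)
```

    Because the two hypothetical score lists rank the subjects identically, the rho here is exactly 1; real paired assessments would of course fall below that.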

  20. A SOFTWARE TOOL TO COMPARE MEASURED AND SIMULATED BUILDING ENERGY PERFORMANCE DATA

    Maile, Tobias; Bazjanac, Vladimir; O'Donnell, James; Garr, Matthew

    2011-11-01

    Building energy performance is often inadequate when compared to design goals. To link design goals to actual operation, one can compare measured with simulated energy performance data. Our previously developed comparison approach, the Energy Performance Comparison Methodology (EPCM), enables the identification of performance problems based on a comparison of measured and simulated performance data. In the context of this method, we developed a software tool that provides graphing and data processing capabilities for the two performance data sets. The software tool, called SEE IT (Stanford Energy Efficiency Information Tool), eliminates the need for manual generation of data plots and manual data reformatting. SEE IT makes the generation of time series, scatter and carpet plots independent of the source of data (measured or simulated) and provides a valuable tool for comparing measurements with simulation results. SEE IT also allows assigning data points to a predefined building object hierarchy and supports different versions of simulated performance data. This paper briefly introduces the EPCM, describes the SEE IT tool and illustrates its use in the context of a building case study.
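    Summary statistics that a measured-versus-simulated comparison might report alongside its time-series and scatter plots can be sketched as follows; the hourly profiles and statistic names are illustrative, not SEE IT output:

```python
import numpy as np

# Hypothetical hourly whole-building power (kW): a measured profile and a
# simulated one with a slight bias and a shifted afternoon peak.
hours = np.arange(24)
measured = 50 + 30 * np.exp(-((hours - 14) ** 2) / 18.0)
simulated = 48 + 33 * np.exp(-((hours - 13) ** 2) / 20.0)

def comparison_stats(meas, sim):
    """Residual statistics for a measured-vs-simulated comparison:
    mean bias, RMSE, and the coefficient of variation of the RMSE."""
    resid = sim - meas
    rmse = float(np.sqrt(np.mean(resid ** 2)))
    return {
        "mean_bias": float(resid.mean()),
        "rmse": rmse,
        "cvrmse_pct": 100.0 * rmse / float(meas.mean()),
    }

stats = comparison_stats(measured, simulated)
```

    Plotting both series on one axis and reporting these residual statistics together gives a quick first screen for the kinds of performance problems the EPCM is designed to surface.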