WorldWideScience

Sample records for earth simulator-class computer

  1. The Australian Computational Earth Systems Simulator

    Science.gov (United States)

    Mora, P.; Muhlhaus, H.; Lister, G.; Dyskin, A.; Place, D.; Appelbe, B.; Nimmervoll, N.; Abramson, D.

    2001-12-01

    Numerical simulation of the physics and dynamics of the entire earth system offers an outstanding opportunity for advancing earth system science and technology but represents a major challenge due to the range of scales and physical processes involved, as well as the magnitude of the software engineering effort required. However, new simulation and computer technologies are bringing this objective within reach. Under a special competitive national funding scheme to establish new Major National Research Facilities (MNRF), the Australian government, together with a consortium of universities and research institutions, has funded construction of the Australian Computational Earth Systems Simulator (ACcESS). The Simulator, or computational virtual earth, will provide the Australian earth systems science community with the research infrastructure required for simulations of dynamical earth processes at scales ranging from microscopic to global. It will consist of thematic supercomputer infrastructure and an earth systems simulation software system. The Simulator models and software will be constructed over a five-year period by a multi-disciplinary team of computational scientists, mathematicians, earth scientists, civil engineers and software engineers. The construction team will integrate numerical simulation models (3D discrete elements/lattice solid model, particle-in-cell large deformation finite-element method, stress reconstruction models, multi-scale continuum models, etc.) with geophysical, geological and tectonic models, through advanced software engineering and visualization technologies. When fully constructed, the Simulator aims to provide the software and hardware infrastructure needed to model solid earth phenomena including global scale dynamics and mineralisation processes, crustal scale processes including plate tectonics, mountain building, interacting fault system dynamics, and micro-scale processes that control the geological, physical and dynamic

  2. Inquiry-Based Whole-Class Teaching with Computer Simulations in Physics

    NARCIS (Netherlands)

    Rutten, N.P.G.; van der Veen, Jan T.; van Joolingen, Wouter

    2015-01-01

    In this study we investigated the pedagogical context of whole-class teaching with computer simulations. We examined relations between the attitudes and learning goals of teachers and their students regarding the use of simulations in whole-class teaching, and how teachers implement these

  3. Large Atmospheric Computation on the Earth Simulator: The LACES Project

    Directory of Open Access Journals (Sweden)

    Michel Desgagné

    2006-01-01

    The Large Atmospheric Computation on the Earth Simulator (LACES) project is a joint initiative between Canadian and Japanese meteorological services and academic institutions that focuses on the high resolution simulation of Hurricane Earl (1998). The unique aspect of this effort is the extent of the computational domain, which covers all of North America and Europe with a grid spacing of 1 km. The Canadian Mesoscale Compressible Community (MC2) model is shown to parallelize effectively on the Japanese Earth Simulator (ES) supercomputer; however, even using the extensive computing resources of the ES Center (ESC), the full simulation for the majority of Hurricane Earl's lifecycle takes over eight days to perform and produces over 5.2 TB of raw data. Preliminary diagnostics show that the results of the LACES simulation for the tropical stage of Hurricane Earl's lifecycle compare well with available observations for the storm. Further studies involving advanced diagnostics have commenced, taking advantage of the uniquely large spatial extent of the high resolution LACES simulation to investigate multiscale interactions in the hurricane and its environment. It is hoped that these studies will enhance our understanding of processes occurring within the hurricane and between the hurricane and its planetary-scale environment.

  4. The use of computer simulations in whole-class versus small-group settings

    Science.gov (United States)

    Smetana, Lara Kathleen

    This study explored the use of computer simulations in a whole-class as compared to small-group setting. Specific consideration was given to the nature and impact of classroom conversations and interactions when computer simulations were incorporated into a high school chemistry course. This investigation fills a need for qualitative research that focuses on the social dimensions of actual classrooms. Participants included a novice chemistry teacher experienced in the use of educational technologies and two honors chemistry classes. The study was conducted in a rural school in the south-Atlantic United States at the end of the fall 2007 semester. The study took place during one instructional unit on atomic structure. Data collection allowed for triangulation of evidence from a variety of sources: approximately 24 hours of video- and audio-taped classroom observations, supplemented with the researcher's field notes and analytic journal; miscellaneous classroom artifacts such as class notes, worksheets, and assignments; open-ended pre- and post-assessments; student exit interviews; and teacher entrance, exit and informal interviews. Four web-based simulations were used, three of which were from the ExploreLearning collection. Assessments were analyzed using descriptive statistics; classroom observations, artifacts and interviews were analyzed using Erickson's (1986) guidelines for analytic induction. Conversational analysis was guided by methods outlined by Erickson (1982). Findings indicated (a) the teacher effectively incorporated simulations in both settings, (b) students in both groups significantly improved their understanding of the chemistry concepts, (c) there was no statistically significant difference between groups' achievement, (d) there was more frequent exploratory talk in the whole-class group, (e) there were more frequent and meaningful teacher-student interactions in the whole-class group, and (f) additional learning experiences not measured on the assessment

  5. Cane Toad or Computer Mouse? Real and Computer-Simulated Laboratory Exercises in Physiology Classes

    Science.gov (United States)

    West, Jan; Veenstra, Anneke

    2012-01-01

    Traditional practical classes in many countries are being rationalised to reduce costs. The challenge for university educators is to provide students with the opportunity to reinforce theoretical concepts by running something other than a traditional practical program. One alternative is to replace wet labs with comparable computer simulations.…

  6. Investigating an intervention to support computer simulation use in whole-class teaching

    NARCIS (Netherlands)

    Rutten, N.P.G.; van Joolingen, W.R.; van der Veen, J.T.

    2016-01-01

    Going beyond simply measuring the effectiveness of a teaching approach with computer simulations during whole-class science instruction, we investigated the interaction between teachers and their students as well as searched for mechanisms in the pedagogical context related to teachers’

  7. Super computer displays future of the earth

    International Nuclear Information System (INIS)

    Yokokawa, Mitsuo; Tani, Keiji

    2000-01-01

    The Science and Technology Agency has promoted a project on the estimation of fluctuations in the earth's environment since fiscal 1997. As one part of this project, it is developing a very high speed parallel computer, the 'Earth Simulator', with 5 TFLOPS of effective performance (40 TFLOPS of peak performance). An outline of the hardware, basic software and application software is given. The hardware is a distributed-memory parallel computer with a single-stage crossbar network. Main storage capacity is 10 TB. The basic software has a hierarchical structure consisting of the operating system, compilers, and operation and management software. In the Earth Simulator, the 640 nodes are connected to magnetic disk units so that input/output of calculation data can be processed in parallel, which is the most important development item. The Earth Simulator project is also developing the NJR (NASDA-JAMSTEC-RIST) program, a library system of coupled atmosphere and ocean large-scale circulation models. An example analysis showed the global distribution of daily rainfall over the earth. (S.Y.)

  8. Computer simulations of rare earth sites in glass: experimental tests and applications to laser materials

    International Nuclear Information System (INIS)

    Weber, M.J.

    1984-11-01

    Computer simulations of the microscopic structure of BeF2 glasses using molecular dynamics are reviewed and compared with x-ray and neutron diffraction, EXAFS, NMR, and optical measurements. Unique information about the site-to-site variations in the local environments of rare earth ions is obtained using optical selective excitation and laser-induced fluorescence line-narrowing techniques. Applications and limitations of computer simulations to the development of laser glasses and to predictions of other static and dynamic properties of glasses are discussed. 35 references, 2 figures, 2 tables.
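    The record above compares molecular dynamics structures with diffraction data; the quantity that links the two is the pair (radial) distribution function. As a rough, generic illustration only, not code from the cited work, here is a minimal Python sketch of g(r) in a periodic cubic box; the random configuration stands in for a simulated glass and is invented for the example.

        import numpy as np

        def radial_distribution(positions, box_length, n_bins=50, r_max=None):
            """Pair distribution function g(r) for particles in a periodic cubic box."""
            n = len(positions)
            if r_max is None:
                r_max = box_length / 2.0
            dr = r_max / n_bins
            hist = np.zeros(n_bins)
            for i in range(n):
                for j in range(i + 1, n):
                    d = positions[i] - positions[j]
                    d -= box_length * np.round(d / box_length)   # minimum-image convention
                    r = np.linalg.norm(d)
                    if r < r_max:
                        hist[int(r / dr)] += 2.0                 # count (i, j) and (j, i)
            rho = n / box_length ** 3
            r_mid = (np.arange(n_bins) + 0.5) * dr
            shell_volumes = 4.0 * np.pi * r_mid ** 2 * dr
            return r_mid, hist / (n * rho * shell_volumes)

        # A random (ideal-gas-like) configuration: g(r) should fluctuate around 1
        rng = np.random.default_rng(0)
        positions = rng.uniform(0.0, 10.0, size=(200, 3))
        r, g = radial_distribution(positions, box_length=10.0)
        print(g[:5])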

  9. Outline of the earth simulator project

    International Nuclear Information System (INIS)

    Tani, Keiji

    2000-01-01

    The Science and Technology Agency of Japan has proposed a project to promote studies for global change prediction by an integrated three-in-one research and development approach: earth observation, basic research, and computer simulation. As part of the project, we are developing an ultra-fast computer, the 'Earth Simulator', with a sustained speed of more than 5 TFLOPS for an atmospheric circulation code. The 'Earth Simulator' is a MIMD-type distributed-memory parallel system in which 640 processor nodes are connected via a fast single-stage crossbar network. Each node consists of 8 vector-type arithmetic processors which are tightly connected via shared memory. The peak performance of the total system is 40 TFLOPS. As part of the development of the basic software system, we are developing an operation-support software system called the 'center routine'. We are going to use an archival system as the main storage for user files. Therefore, the most important function of the center routine is the optimal scheduling not only of submitted batch jobs but also of the user files necessary for them. All the design and R and D work for both the hardware and basic software systems was completed during the last three fiscal years, FY97, 98 and 99. The manufacture of the hardware system and the development of the center routine are underway. Facilities necessary for the Earth Simulator, including buildings, are also under construction. The total system will be completed in the spring of 2002. (author)
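    As a rough cross-check of the hardware figures quoted above (not part of the record itself), the per-processor peak rate implied by 640 nodes of 8 arithmetic processors and a 40 TFLOPS peak can be worked out in a few lines of Python:

        # Back-of-envelope check of the Earth Simulator figures quoted in the record
        nodes = 640
        aps_per_node = 8
        peak_tflops = 40.0
        sustained_target_tflops = 5.0

        total_aps = nodes * aps_per_node                            # 5120 arithmetic processors
        per_ap_gflops = peak_tflops * 1e3 / total_aps               # about 7.8 GFLOPS per processor
        sustained_fraction = sustained_target_tflops / peak_tflops  # 12.5% of peak for the target code

        print(total_aps, round(per_ap_gflops, 2), sustained_fraction)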

  10. The computational challenges of Earth-system science.

    Science.gov (United States)

    O'Neill, Alan; Steenman-Clark, Lois

    2002-06-15

    The Earth system--comprising atmosphere, ocean, land, cryosphere and biosphere--is an immensely complex system, involving processes and interactions on a wide range of space- and time-scales. To understand and predict the evolution of the Earth system is one of the greatest challenges of modern science, with success likely to bring enormous societal benefits. High-performance computing, along with the wealth of new observational data, is revolutionizing our ability to simulate the Earth system with computer models that link the different components of the system together. There are, however, considerable scientific and technical challenges to be overcome. This paper will consider four of them: complexity, spatial resolution, inherent uncertainty and time-scales. Meeting these challenges requires a significant increase in the power of high-performance computers. The benefits of being able to make reliable predictions about the evolution of the Earth system should, on their own, amply repay this investment.

  11. Modeling subsurface reactive flows using leadership-class computing

    Energy Technology Data Exchange (ETDEWEB)

    Mills, Richard Tran [Computational Earth Sciences Group, Computer Science and Mathematics Division, Oak Ridge National Laboratory, Oak Ridge, TN 37831-6015 (United States); Hammond, Glenn E [Hydrology Group, Environmental Technology Division, Pacific Northwest National Laboratory, Richland, WA 99352 (United States); Lichtner, Peter C [Hydrology, Geochemistry, and Geology Group, Earth and Environmental Sciences Division, Los Alamos National Laboratory, Los Alamos, NM 87545 (United States); Sripathi, Vamsi [Department of Computer Science, North Carolina State University, Raleigh, NC 27695-8206 (United States); Mahinthakumar, G [Department of Civil, Construction, and Environmental Engineering, North Carolina State University, Raleigh, NC 27695-7908 (United States); Smith, Barry F, E-mail: rmills@ornl.gov, E-mail: glenn.hammond@pnl.gov, E-mail: lichtner@lanl.gov, E-mail: vamsi_s@ncsu.edu, E-mail: gmkumar@ncsu.edu, E-mail: bsmith@mcs.anl.gov [Mathematics and Computer Science Division, Argonne National Laboratory, Argonne, IL 60439-4844 (United States)

    2009-07-01

    We describe our experiences running PFLOTRAN, a code for simulation of coupled hydro-thermal-chemical processes in variably saturated, non-isothermal, porous media, on leadership-class supercomputers, including initial experiences running on the petaflop incarnation of Jaguar, the Cray XT5 at the National Center for Computational Sciences at Oak Ridge National Laboratory. PFLOTRAN utilizes fully implicit time-stepping and is built on top of the Portable, Extensible Toolkit for Scientific Computation (PETSc). We discuss some of the hurdles to 'at scale' performance with PFLOTRAN and the progress we have made in overcoming them on leadership-class computer architectures.
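    To make the phrase "fully implicit time-stepping" concrete, here is a generic backward-Euler step solved with Newton's method. This is a minimal Python sketch of the general technique only, not PFLOTRAN code (which delegates the nonlinear and linear solves to PETSc); the stiff decay test problem is invented for illustration.

        import numpy as np

        def backward_euler_step(f, jac, u_old, dt, tol=1e-10, max_newton=20):
            """One fully implicit (backward Euler) step: solve u - u_old - dt*f(u) = 0 by Newton iteration."""
            u = u_old.copy()
            for _ in range(max_newton):
                residual = u - u_old - dt * f(u)
                if np.linalg.norm(residual) < tol:
                    break
                jacobian = np.eye(len(u)) - dt * jac(u)   # Jacobian of the residual
                u = u - np.linalg.solve(jacobian, residual)
            return u

        # Stiff linear decay du/dt = -1000*u as a toy test problem
        f = lambda u: -1000.0 * u
        jac = lambda u: np.array([[-1000.0]])
        u_new = backward_euler_step(f, jac, np.array([1.0]), dt=0.1)
        print(u_new)   # about 0.0099, the exact backward-Euler value 1/(1 + 1000*0.1)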

  12. Modeling subsurface reactive flows using leadership-class computing

    International Nuclear Information System (INIS)

    Mills, Richard Tran; Hammond, Glenn E; Lichtner, Peter C; Sripathi, Vamsi; Mahinthakumar, G; Smith, Barry F

    2009-01-01

    We describe our experiences running PFLOTRAN, a code for simulation of coupled hydro-thermal-chemical processes in variably saturated, non-isothermal, porous media, on leadership-class supercomputers, including initial experiences running on the petaflop incarnation of Jaguar, the Cray XT5 at the National Center for Computational Sciences at Oak Ridge National Laboratory. PFLOTRAN utilizes fully implicit time-stepping and is built on top of the Portable, Extensible Toolkit for Scientific Computation (PETSc). We discuss some of the hurdles to 'at scale' performance with PFLOTRAN and the progress we have made in overcoming them on leadership-class computer architectures.

  13. Visualization system on the earth simulator user's guide

    International Nuclear Information System (INIS)

    Muramatsu, Kazuhiro; Sai, Kazunori

    2002-08-01

    A visualization system on the Earth Simulator has been developed. The system enables users to see a graphic representation of simulation results on a client terminal while they are being computed on the Earth Simulator. Moreover, the system makes it possible to change parameters of the calculation and of its visualization in the middle of the calculation. The graphical user interface (GUI) of the system is implemented as a Java applet; consequently, the client only needs a web browser and is independent of the operating system. The system consists of a server function, a post-processing function and a client function. The server and post-processing functions run on the Earth Simulator, and the client function runs on the client terminal. The server function is provided as a library so that users can easily invoke real-time visualization functions from their own code. The post-processing function is also provided as a library and, in addition, as a load module. This report mainly describes the usage of the server and post-processing functions. (author)

  14. SEISMIC SIMULATIONS USING PARALLEL COMPUTING AND THREE-DIMENSIONAL EARTH MODELS TO IMPROVE NUCLEAR EXPLOSION PHENOMENOLOGY AND MONITORING

    Energy Technology Data Exchange (ETDEWEB)

    Rodgers, A; Matzel, E; Pasyanos, M; Petersson, A; Sjogreen, B; Bono, C; Vorobiev, O; Antoun, T; Walter, W; Myers, S; Lomov, I

    2008-07-07

    The development of accurate numerical methods to simulate wave propagation in three-dimensional (3D) earth models and advances in computational power offer exciting possibilities for modeling the motions excited by underground nuclear explosions. This presentation will describe recent work to use new numerical techniques and parallel computing to model earthquakes and underground explosions to improve understanding of the wave excitation at the source and path-propagation effects. Firstly, we are using the spectral element method (SEM, SPECFEM3D code of Komatitsch and Tromp, 2002) to model earthquakes and explosions at regional distances using available 3D models. SPECFEM3D simulates anelastic wave propagation in fully 3D earth models in spherical geometry with the ability to account for free surface topography, anisotropy, ellipticity, rotation and gravity. Results show in many cases that 3D models are able to reproduce features of the observed seismograms that arise from path-propagation effects (e.g. enhanced surface wave dispersion, refraction, amplitude variations from focusing and defocusing, tangential component energy from isotropic sources). We are currently investigating the ability of different 3D models to predict path-specific seismograms as a function of frequency. A number of models developed using a variety of methodologies are available for testing. These include the WENA/Unified model of Eurasia (e.g. Pasyanos et al 2004), the global CUB 2.0 model (Shapiro and Ritzwoller, 2002), the partitioned waveform model for the Mediterranean (van der Lee et al., 2007) and stochastic models of the Yellow Sea Korean Peninsula region (Pasyanos et al., 2006). Secondly, we are extending our Cartesian anelastic finite difference code (WPP of Nilsson et al., 2007) to model the effects of free-surface topography. WPP models anelastic wave propagation in fully 3D earth models using mesh refinement to increase computational speed and improve memory efficiency. Thirdly

  15. CPU SIM: A Computer Simulator for Use in an Introductory Computer Organization-Architecture Class.

    Science.gov (United States)

    Skrein, Dale

    1994-01-01

    CPU SIM, an interactive low-level computer simulation package that runs on the Macintosh computer, is described. The program is designed for instructional use in the first or second year of undergraduate computer science, to teach various features of typical computer organization through hands-on exercises. (MSE)

  16. A high-orbit collimating infrared earth simulator

    International Nuclear Information System (INIS)

    Zhang Guoyu; Jiang Huilin; Fang Yang; Yu Huadong; Xu Xiping; Wang Lingyun; Liu Xuli; Huang Lan; Yue Shixin; Peng Hui

    2007-01-01

    The earth simulator is the most important ground-based test equipment for the infrared earth sensor, and it is also a key component in the satellite control system. For three orbit heights, 18000 km, 35786 km and 42000 km, we adopt a design based on collimation and a replaceable earth diaphragm and develop a high-orbit collimating earth simulator. This simulator can provide three field angles, 15.19°, 17.46° and 30.42°, thereby simulating on the ground the earth disc that the satellite sees from space. In this paper we introduce the components, overall structure, and the method of testing the earth field angles of the earth simulator in detail. The germanium collimation lens is the most important component in the earth simulator. According to the optical configuration parameters of the germanium collimation lens, we find the location and size of the earth diaphragm and of the hot earth by theoretical analysis and optical calculation, which provides a foundation for the design of the earth simulator. The earth field angle is the index used to characterize the precision of the earth simulator. We tested the three angles by experiment and the results indicate that the errors of all three angles are less than ±0.05°.
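    The three field angles quoted above are consistent with the apparent angular diameter of the Earth disc seen from each orbit height. A minimal Python sketch of that geometric check follows; the effective Earth radius of 6400 km is an assumed value, since the paper's exact figure is not given in the record.

        import math

        R_EARTH_KM = 6400.0   # assumed effective Earth radius; the paper's exact value is not stated

        def earth_angular_diameter_deg(orbit_height_km):
            """Full apparent angle of the Earth disc seen from a given orbital altitude."""
            return 2.0 * math.degrees(math.asin(R_EARTH_KM / (R_EARTH_KM + orbit_height_km)))

        for h_km in (42000.0, 35786.0, 18000.0):
            print(h_km, round(earth_angular_diameter_deg(h_km), 2))
        # roughly 15.2, 17.5 and 30.4 degrees, close to the three field angles quoted above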

  17. NCI's High Performance Computing (HPC) and High Performance Data (HPD) Computing Platform for Environmental and Earth System Data Science

    Science.gov (United States)

    Evans, Ben; Allen, Chris; Antony, Joseph; Bastrakova, Irina; Gohar, Kashif; Porter, David; Pugh, Tim; Santana, Fabiana; Smillie, Jon; Trenham, Claire; Wang, Jingbo; Wyborn, Lesley

    2015-04-01

    The National Computational Infrastructure (NCI) has established a powerful and flexible in-situ petascale computational environment to enable both high performance computing and Data-intensive Science across a wide spectrum of national environmental and earth science data collections - in particular climate, observational data and geoscientific assets. This paper examines 1) the computational environments that support the modelling and data processing pipelines, 2) the analysis environments and methods to support data analysis, and 3) the progress so far to harmonise the underlying data collections for future interdisciplinary research across these large volume data collections. NCI has established 10+ PBytes of major national and international data collections from both the government and research sectors based on six themes: 1) weather, climate, and earth system science model simulations, 2) marine and earth observations, 3) geosciences, 4) terrestrial ecosystems, 5) water and hydrology, and 6) astronomy, social and biosciences. Collectively they span the lithosphere, crust, biosphere, hydrosphere, troposphere, and stratosphere. The data is largely sourced from NCI's partners (which include the custodians of many of the major Australian national-scale scientific collections), leading research communities, and collaborating overseas organisations. New infrastructures created at NCI mean the data collections are now accessible within an integrated High Performance Computing and Data (HPC-HPD) environment - a 1.2 PFlop supercomputer (Raijin), an HPC-class 3000-core OpenStack cloud system and several highly connected large-scale high-bandwidth Lustre filesystems. The hardware was designed at inception to ensure that it would allow the layered software environment to flexibly accommodate the advancement of future data science. New approaches to software technology and data models have also had to be developed to enable access to these large and exponentially

  18. Computational Simulation on Electrowinning for Used LiCl-KCl salts

    Energy Technology Data Exchange (ETDEWEB)

    Sohn, Sung June; Kim, Pyeong Hwa; Hwang, Il Soon [KAERI, Daejeon (Korea, Republic of); Park, Jae Yeong [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2016-05-15

    The purification consists of electrowinning with a liquid metal cathode and selective oxidation at chemical equilibrium using a metal chloride as the oxidizing agent. Actinides and rare earth elements are deposited into the liquid cathode during electrowinning and rare earth elements are selectively extracted into the molten salt; however, co-deposited Li reacts with the oxidizing agent before the rare earth elements that are intended to react in the selective oxidation. Also, if the termination point of actinide deposition in electrowinning were clearly known, the amount of reacting rare earth elements, as well as Li, could be decreased and throughput could be enhanced. For pyroprocessing research, computational simulation is important to save limited resources and the research environment. This study presents computational modeling of electrowinning with a Bi cathode using the electrochemical simulation code REFIN. It shows that it is possible to simulate the electrochemical behavior of at least seven elements (excluding the electrode and electrolyte materials) in real time. In order to enhance the accuracy of the simulation results, a combination of REFIN with CFD modeling of the two immiscible liquids is suggested to calculate the diffusion boundary layer thickness as well.

  19. R.E.S.E.X. A computer simulation program for rare earth separation processes

    International Nuclear Information System (INIS)

    Casarci, M.; Gasparini, G.M.; Sanfilippo, L.; Pozio, A.

    1996-01-01

    Lanthanides are most commonly separated using complex solvent extraction circuits. A simulation code called R.E.S.E.X. (Rare Earth Solvent Extraction) has been developed by E.N.E.A.; it is able to simulate a solvent extraction battery of up to 200 stages in different configurations. The combined use of an equilibrium data bank and of a simulation code allows the theoretical study of new rare earth separation processes or the optimisation of existing ones. As an example of this strategy, the results of the Pr/Nd separation in 50% TBP in an aromatic solvent are reported.
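    R.E.S.E.X. itself performs stage-by-stage equilibrium calculations against an equilibrium data bank; as a much simpler illustration of the kind of quantity such a counter-current cascade calculation produces, here is a constant-distribution-ratio (Kremser-type) estimate in Python. The distribution ratios, solvent-to-feed ratio and stage count below are invented for illustration and are not Pr/Nd data from the record.

        def raffinate_fraction(distribution_ratio, solvent_to_feed, n_stages):
            """Fraction of a solute left in the aqueous raffinate after n ideal counter-current
            stages, assuming a constant distribution ratio (Kremser relation)."""
            E = distribution_ratio * solvent_to_feed   # extraction factor
            if abs(E - 1.0) < 1e-12:
                return 1.0 / (n_stages + 1.0)
            return (E - 1.0) / (E ** (n_stages + 1) - 1.0)

        # Toy comparison of two solutes with slightly different distribution ratios
        # (illustrative values only, not measured Pr/Nd data)
        for name, D in (("solute 1", 1.5), ("solute 2", 2.0)):
            print(name, raffinate_fraction(D, solvent_to_feed=1.0, n_stages=8))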

  20. A Geostationary Earth Orbit Satellite Model Using Easy Java Simulation

    Science.gov (United States)

    Wee, Loo Kang; Goh, Giam Hwee

    2013-01-01

    We develop an Easy Java Simulation (EJS) model for students to visualize geostationary orbits near Earth, modelled using a Java 3D implementation of the EJS 3D library. The simplified physics model is described and simulated using a simple constant angular velocity equation. We discuss four computer model design ideas: (1) a simple and realistic…
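    The "simple constant angular velocity equation" mentioned above can be made concrete with a short sketch. This is a generic geostationary-orbit calculation in Python, not code from the EJS model; the gravitational parameter and sidereal day are standard assumed values.

        import math

        GM_EARTH = 3.986004418e14   # m^3/s^2, standard gravitational parameter (assumed value)
        SIDEREAL_DAY = 86164.1      # s, Earth's rotation period (assumed value)

        omega = 2.0 * math.pi / SIDEREAL_DAY            # constant angular velocity, rad/s
        r_geo = (GM_EARTH / omega ** 2) ** (1.0 / 3.0)  # orbital radius from GM = omega^2 * r^3

        def position(t_seconds):
            """Equatorial position of a geostationary satellite at time t, Earth-centred frame (m)."""
            return (r_geo * math.cos(omega * t_seconds),
                    r_geo * math.sin(omega * t_seconds),
                    0.0)

        print(round(r_geo / 1e3))   # about 42164 km from the Earth's centre
        print(position(0.0))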

  1. Interactive visualization of Earth and Space Science computations

    Science.gov (United States)

    Hibbard, William L.; Paul, Brian E.; Santek, David A.; Dyer, Charles R.; Battaiola, Andre L.; Voidrot-Martinez, Marie-Francoise

    1994-01-01

    Computers have become essential tools for scientists simulating and observing nature. Simulations are formulated as mathematical models but are implemented as computer algorithms to simulate complex events. Observations are also analyzed and understood in terms of mathematical models, but the number of these observations usually dictates that we automate analyses with computer algorithms. In spite of their essential role, computers are also barriers to scientific understanding. Unlike hand calculations, automated computations are invisible and, because of the enormous numbers of individual operations in automated computations, the relation between an algorithm's input and output is often not intuitive. This problem is illustrated by the behavior of meteorologists responsible for forecasting weather. Even in this age of computers, many meteorologists manually plot weather observations on maps, then draw isolines of temperature, pressure, and other fields by hand (special pads of maps are printed for just this purpose). Similarly, radiologists use computers to collect medical data but are notoriously reluctant to apply image-processing algorithms to that data. To these scientists with life-and-death responsibilities, computer algorithms are black boxes that increase rather than reduce risk. The barrier between scientists and their computations can be bridged by techniques that make the internal workings of algorithms visible and that allow scientists to experiment with their computations. Here we describe two interactive systems developed at the University of Wisconsin-Madison Space Science and Engineering Center (SSEC) that provide these capabilities to Earth and space scientists.

  2. Games and Simulations for Climate, Weather and Earth Science Education

    Science.gov (United States)

    Russell, R. M.; Clark, S.

    2015-12-01

    We will demonstrate several interactive, computer-based simulations, games, and other interactive multimedia. These resources were developed for weather, climate, atmospheric science, and related Earth system science education. The materials were created by the UCAR Center for Science Education. These materials have been disseminated via our web site (SciEd.ucar.edu), webinars, online courses, teacher workshops, and large touchscreen displays in weather and Sun-Earth connections exhibits in NCAR's Mesa Lab facility in Boulder, Colorado. Our group has also assembled a web-based list of similar resources, especially simulations and games, from other sources that touch upon weather, climate, and atmospheric science topics. We'll briefly demonstrate this directory.

  3. Virtual Earth System Laboratory (VESL): A Virtual Research Environment for The Visualization of Earth System Data and Process Simulations

    Science.gov (United States)

    Cheng, D. L. C.; Quinn, J. D.; Larour, E. Y.; Halkides, D. J.

    2017-12-01

    The Virtual Earth System Laboratory (VESL) is a Web application, under continued development at the Jet Propulsion Laboratory and UC Irvine, for the visualization of Earth System data and process simulations. As with any project of its size, we have encountered both successes and challenges during the course of development. Our principal point of success is the fact that VESL users can interact seamlessly with our earth science simulations within their own Web browser. Some of the challenges we have faced include retrofitting the VESL Web application to respond to touch gestures, reducing page load time (especially as the application has grown), and accounting for the differences between the various Web browsers and computing platforms.

  4. Earth and Space Science Ph.D. Class of 2003 Report released

    Science.gov (United States)

    Keelor, Brad

    AGU and the American Geological Institute (AGI) released on 26 July an employment study of 180 Earth and space science Ph.D. recipients who received degrees from U.S. universities in 2003. The AGU/AGI survey asked graduates about their education and employment, efforts to find their first job after graduation, and experiences in graduate school. Key results from the study include: The vast majority (87%) of 2003 graduates found work in the Earth and space sciences, earning salaries commensurate with or slightly higher than 2001 and 2002 salary averages. Most (64%) graduates were employed within academia (including postdoctoral appointments), with the remainder in government (19%), industry (10%), and other (7%) sectors. Most graduates were positive about their employment situation and found that their work was challenging, relevant, and appropriate for someone with a Ph.D. The percentage of Ph.D. recipients accepting postdoctoral positions (58%) increased slightly from 2002. In contrast, the fields of physics and chemistry showed significant increases in postdoctoral appointments for Ph.D.s during the same time period. As in previous years, recipients of Ph.D.s in the Earth, atmospheric, and ocean sciences (median age of 32.7 years) are slightly older than Ph.D. recipients in most other natural sciences (except computer sciences), which is attributed to time taken off between undergraduate and graduate studies. Women in the Earth, atmospheric, and ocean sciences earned 33% of Ph.D.s in the class of 2003, surpassing the percentage of Ph.D.s earned by women in chemistry (32%) and well ahead of the percentage in computer sciences (20%), physics (19%), and engineering (17%). Participation of other underrepresented groups in the Earth, atmospheric, and ocean sciences remained extremely low.

  5. Physical modeling and high-performance GPU computing for characterization, interception, and disruption of hazardous near-Earth objects

    Science.gov (United States)

    Kaplinger, Brian Douglas

    For the past few decades, both the scientific community and the general public have been becoming more aware that the Earth lives in a shooting gallery of small objects. We classify all of these asteroids and comets, known or unknown, that cross Earth's orbit as near-Earth objects (NEOs). A look at our geologic history tells us that NEOs have collided with Earth in the past, and we expect that they will continue to do so. With thousands of known NEOs crossing the orbit of Earth, there has been significant scientific interest in developing the capability to deflect an NEO from an impacting trajectory. This thesis applies the ideas of Smoothed Particle Hydrodynamics (SPH) theory to the NEO disruption problem. A simulation package was designed that allows efficacy simulation to be integrated into the mission planning and design process. This is done by applying ideas in high-performance computing (HPC) on the computer graphics processing unit (GPU). Rather than prove a concept through large standalone simulations on a supercomputer, a highly parallel structure allows for flexible, target dependent questions to be resolved. Built around nonclassified data and analysis, this computer package will allow academic institutions to better tackle the issue of NEO mitigation effectiveness.
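    As a rough illustration of the Smoothed Particle Hydrodynamics idea named above (density as a kernel-weighted sum over neighbouring particles), here is a minimal CPU-side Python sketch with a standard cubic spline kernel. It is not the thesis's GPU code; a production code would replace the double loop with neighbour lists and a parallel reduction.

        import numpy as np

        def cubic_spline_kernel(r, h):
            """Standard 3D cubic spline SPH kernel W(r, h)."""
            q = r / h
            sigma = 1.0 / (np.pi * h ** 3)
            if q < 1.0:
                return sigma * (1.0 - 1.5 * q ** 2 + 0.75 * q ** 3)
            if q < 2.0:
                return sigma * 0.25 * (2.0 - q) ** 3
            return 0.0

        def sph_density(positions, masses, h):
            """SPH density estimate: rho_i = sum_j m_j * W(|x_i - x_j|, h)."""
            n = len(positions)
            rho = np.zeros(n)
            for i in range(n):
                for j in range(n):
                    r = np.linalg.norm(positions[i] - positions[j])
                    rho[i] += masses[j] * cubic_spline_kernel(r, h)
            return rho

        # Tiny example with a few equal-mass particles
        positions = np.array([[0.0, 0.0, 0.0], [0.5, 0.0, 0.0], [0.0, 0.5, 0.0]])
        print(sph_density(positions, np.ones(3), h=1.0))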

  6. The earth simulator: Roles and impacts

    International Nuclear Information System (INIS)

    Sato, Tetsuya

    2004-01-01

    In the first place, the architecture and performance of the Earth Simulator and the management system of the Earth Simulator Center are briefly described. Secondly, some examples of products obtained so far by ES are presented to demonstrate its actual power. Then, a holistic simulator concept that should be the next generation simulator is described, and lastly the paradigm shift in science, manufacturing and human thought that could be brought about by the holistic simulator is mentioned.

  7. Macromod: Computer Simulation For Introductory Economics

    Science.gov (United States)

    Ross, Thomas

    1977-01-01

    The Macroeconomic model (Macromod) is a computer assisted instruction simulation model designed for introductory economics courses. An evaluation of its utilization at a community college indicates that it yielded a 10 percent to 13 percent greater economic comprehension than lecture classes and that it met with high student approval. (DC)

  8. Simulating Snow in Canadian Boreal Environments with CLASS for ESM-SnowMIP

    Science.gov (United States)

    Wang, L.; Bartlett, P. A.; Derksen, C.; Ireson, A. M.; Essery, R.

    2017-12-01

    The ability of land surface schemes to provide realistic simulations of snow cover is necessary for accurate representation of energy and water balances in climate models. Historically, this has been particularly challenging in boreal forests, where poor treatment of both snow masking by forests and vegetation-snow interaction has resulted in biases in simulated albedo and snowpack properties, with subsequent effects on both regional temperatures and the snow albedo feedback in coupled simulations. The SnowMIP (Snow Model Intercomparison Project) series of experiments or 'MIPs' was initiated in order to provide assessments of the performance of various snow- and land-surface models at selected locations, in order to understand the primary factors affecting model performance. Here we present preliminary results of simulations conducted for the third such MIP, ESM-SnowMIP (Earth System Model - Snow Model Intercomparison Project), using the Canadian Land Surface Scheme (CLASS) at boreal forest sites in central Saskatchewan. We assess the ability of our latest model version (CLASS 3.6.2) to simulate observed snowpack properties (snow water equivalent, density and depth) and above-canopy albedo over 13 winters. We also examine the sensitivity of these simulations to climate forcing at local and regional scales.

  9. Earth2Class: Bringing the Earth to the Classroom-Innovative Connections between Research Scientists, Teachers, and Students

    Science.gov (United States)

    Passow, M. J.

    2017-12-01

    "Earth2Class" (E2C) is a unique program offered through the Lamont-Doherty Earth Observatory of Columbia University. It connects research scientists, classroom teachers, middle and high school students, and others in ways that foster broader outreach of cutting-edge discoveries. One key component are Saturday workshops offered during the school year. These provide investigators with a tested format for sharing research methods and results. Teachers and students learn more about "real"science than what is found in textbooks. They discover that Science is exciting, uncertain, and done by people not very different from themselves. Since 1998, we have offered more than 170 workshops, partnering with more than 90 LDEO scientists. E2C teachers establishe links with scientists that have led to participation in research projects, the LDEO Open House, and other programs. Connections developed between high school students and scientists resulted in authentic science research experiences. A second key component of the project is the E2C website, https://earth2class.org/site/. We provide archived versions of monthly workshops. The website hosts a vast array of resources geared to support learning Earth Science and other subjects. Resources created through an NSF grant to explore strategies which enhance Spatial Thinking in the NYS Regents Earth Science curriculum are found at https://earth2class.org/site/?page_id=2957. The site is well-used by K-12 Earth Science educators, averaging nearly 70k hits per month. A third component of the E2C program are week-long summer institutes offering opportunities to enhance content knowledge in weather and climate; minerals, rocks, and resources; and astronomy. These include exploration of strategies to implement NGSS-based approaches within the school curriculum. Participants can visit LDEO lab facilities and interact with scientists to learn about their research. In the past year, we have begun to create a "satellite" E2C program at UFVJM

  10. Earth simulator project. Seeking a guide line for the symbiosis between the earth (Gaia) and human beings

    International Nuclear Information System (INIS)

    Tani, Keiji; Yokokawa, Mitsuo

    2000-01-01

    In 1997, a project was started by the Science and Technology Agency to promote foresight studies on changes in the global environment, and three lines of study have been launched under the project: basic process studies, observations and computer simulation. This report outlines the development of the 'Earth Simulator', an ultra-high-speed computer, as part of the project. Global warming due to human activities has been progressing on the earth. It is estimated that the mean temperature increase and the sea level rise would reach 2 deg C and 50 cm, respectively, before 2100 if the present environmental trends continue. There are two unavoidable problems in global environment simulation: the mesh size of the simulation and errors in the observation data used as initial values. To accurately simulate the behavior of thunderclouds several kilometers in size, which are closely related to weather disasters, it is necessary to raise the resolution by decreasing the mesh size from 20-30 km to about 1 km, resulting in a several-hundred-fold increase in the number of mesh points. Compared with the CRAY C90, the representative computer in the climate and weather field, it is thought necessary to increase the memory capacity and calculation speed by more than one thousand times. Therefore, it seems necessary to construct a new computer including a parallel-type calculator, because a computer with an effective speed of 5 TFLOPS cannot be constructed with a single processor. (M.N.)
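    The "several hundred times" increase in mesh count quoted above follows directly from the ratio of mesh sizes; a short arithmetic sketch in Python (the 20, 25 and 30 km values simply span the range stated in the record):

        # Refining the horizontal mesh to about 1 km multiplies the number of horizontal
        # grid points by (coarse/fine)^2, i.e. several hundred, as stated in the record.
        for coarse_km in (20.0, 25.0, 30.0):
            print(coarse_km, (coarse_km / 1.0) ** 2)   # 400, 625, 900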

  11. Creating science simulations through Computational Thinking Patterns

    Science.gov (United States)

    Basawapatna, Ashok Ram

    Computational thinking aims to outline fundamental skills from computer science that everyone should learn. As currently defined, with help from the National Science Foundation (NSF), these skills include problem formulation, logically organizing data, automating solutions through algorithmic thinking, and representing data through abstraction. One aim of the NSF is to integrate these and other computational thinking concepts into the classroom. End-user programming tools offer a unique opportunity to accomplish this goal. An end-user programming tool that gives students with little or no prior experience the ability to create simulations based on phenomena they see in class could be a first step towards meeting most, if not all, of the above computational thinking goals. This thesis describes the creation, implementation and initial testing of a programming tool, called the Simulation Creation Toolkit, with which users apply high-level agent interactions called Computational Thinking Patterns (CTPs) to create simulations. Employing Computational Thinking Patterns obviates lower behavior-level programming and allows users to directly create agent interactions in a simulation by making an analogy with real world phenomena they are trying to represent. Data collected from 21 sixth grade students with no prior programming experience and 45 seventh grade students with minimal programming experience indicates that this is an effective first step towards enabling students to create simulations in the classroom environment. Furthermore, an analogical reasoning study that looked at how users might apply patterns to create simulations from high-level descriptions with little guidance shows promising results. These initial results indicate that the high level strategy employed by the Simulation Creation Toolkit is a promising strategy towards incorporating Computational Thinking concepts in the classroom environment.

  12. Proceedings of joint meeting of the 6th simulation science symposium and the NIFS collaboration research 'large scale computer simulation'

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2003-03-01

    The joint meeting of the 6th Simulation Science Symposium and the NIFS Collaboration Research 'Large Scale Computer Simulation' was held on December 12-13, 2002 at the National Institute for Fusion Science, with the aim of promoting interdisciplinary collaborations in various fields of computer simulation. The meeting, attended by more than 40 people, consisted of 11 invited and 22 contributed papers, whose topics extended not only to fusion science but also to related fields such as astrophysics, earth science, fluid dynamics, molecular dynamics, computer science, etc. (author)

  13. Problem of simulating the Earth's induction effects in modeling polar magnetic substorms

    International Nuclear Information System (INIS)

    Mareschal, M.

    1976-01-01

    A major problem encountered in trying to model the current system associated with a polar magnetic substorm from ground-based magnetic observations is the difficulty of adequately evaluating the earth's induction effects. Two methods for simulating these effects are reviewed here. Method 1 simply reduces the earth to a perfect conductor and leads to very simple field equations. Method 2 considers the earth as a 'horizontally' layered body of finite conductivity but requires a large amount of computational time. The performances of both methods are compared when the substorm current system can be approximated by an infinitely long electrojet flowing over a flat earth. In this case it appears that for most substorm modeling problems it is sufficient to treat the earth as a perfect conductor. The depth of this perfect conductor below the earth's surface should be selected as a function of the source frequency content.
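    Method 1 above (earth as a perfect conductor) is commonly evaluated with an image current: the ground-level field of the infinite electrojet plus an antiparallel image mirrored about the conductor plane, which enhances the horizontal component and suppresses the vertical one. A minimal Python sketch under assumed geometry and current follows; the 1 MA electrojet at 110 km altitude and the 400 km conductor depth are illustrative values, not taken from the record.

        import math

        MU0 = 4.0e-7 * math.pi   # vacuum permeability, T*m/A

        def line_field(x_m, z_src_m, current_a):
            """Bx, Bz at the ground point (x, 0) from an infinite line current at height z_src."""
            r2 = x_m ** 2 + z_src_m ** 2
            bx = -MU0 * current_a * z_src_m / (2.0 * math.pi * r2)
            bz = -MU0 * current_a * x_m / (2.0 * math.pi * r2)
            return bx, bz

        def surface_perturbation(x_km, current_a=1.0e6, h_km=110.0, d_km=400.0):
            """Ground field (nT) from the electrojet plus its image in a perfect conductor at depth d."""
            bx1, bz1 = line_field(x_km * 1e3, h_km * 1e3, current_a)                   # real electrojet
            bx2, bz2 = line_field(x_km * 1e3, -(h_km + 2.0 * d_km) * 1e3, -current_a)  # image current
            return (bx1 + bx2) * 1e9, (bz1 + bz2) * 1e9

        for x in (0.0, 100.0, 300.0):
            print(x, surface_perturbation(x))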

  14. High performance thermal stress analysis on the earth simulator

    International Nuclear Information System (INIS)

    Kushida, Noriyuki; Okuda, Hiroshi; Yagawa, Genki

    2003-01-01

    In this study, a thermal stress finite element analysis code optimized for the Earth Simulator was developed. A processor node of the Earth Simulator is an 8-way vector processor, and the processors can communicate using the message passing interface. Thus, there are two ways to parallelize the finite element method on the Earth Simulator. The first method is to assign one processor to one sub-domain, and the second method is to assign one node (=8 processors) to one sub-domain, considering shared-memory-type parallelization. Considering that the preconditioned conjugate gradient (PCG) method, which is one of the suitable linear equation solvers for large-scale parallel finite element methods, shows better convergence behavior when the number of domains is smaller, we decided to employ PCG and a hybrid parallelization based on shared and distributed memory type parallelization. It has been said that it is hard to obtain good parallel or vector performance, since the finite element method is based on unstructured grids. In such a situation, reordering is inevitable to improve the computational performance [2]. In this study, we used three reordering methods, i.e. Reverse Cuthill-McKee (RCM), cyclic multicolor (CM) and diagonal jagged descending storage (DJDS) [3]. RCM provides good convergence of the incomplete lower-upper (ILU) PCG, but causes load imbalance. On the other hand, CM provides good load balance, but worsens the convergence of ILU PCG if the vector length is long. Therefore, we used a combined method of RCM and CM. DJDS is a method of storing sparse matrices such that a longer vector length can be obtained. For attaining efficient inter-node parallelization, such partitioning methods as recursive coordinate bisection (RCB) or MeTIS have been used. Computational performance on practical large-scale engineering problems will be shown at the meeting. (author)
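    The PCG solver named above, in its textbook form with a simple Jacobi (diagonal) preconditioner, fits in a few lines of Python. This sketch is only a generic illustration; the Earth Simulator code described in the record uses ILU preconditioning together with the RCM/CM and DJDS reorderings for convergence, load balance and vector length.

        import numpy as np

        def pcg(A, b, M_inv_diag, tol=1e-10, max_iter=1000):
            """Preconditioned conjugate gradient for a symmetric positive definite A
            with a diagonal (Jacobi) preconditioner given as 1/diag(A)."""
            x = np.zeros_like(b)
            r = b - A @ x
            z = M_inv_diag * r
            p = z.copy()
            rz = r @ z
            for _ in range(max_iter):
                Ap = A @ p
                alpha = rz / (p @ Ap)
                x += alpha * p
                r -= alpha * Ap
                if np.linalg.norm(r) < tol:
                    break
                z = M_inv_diag * r
                rz_new = r @ z
                p = z + (rz_new / rz) * p
                rz = rz_new
            return x

        # Small SPD test problem
        A = np.array([[4.0, 1.0], [1.0, 3.0]])
        b = np.array([1.0, 2.0])
        print(pcg(A, b, 1.0 / np.diag(A)))   # about [0.0909, 0.6364]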

  15. Computational physics simulation of classical and quantum systems

    CERN Document Server

    Scherer, Philipp O J

    2013-01-01

    This textbook presents basic and advanced computational physics in a very didactic style, with clear and simple mathematical descriptions of many of the most important algorithms and techniques used in computational physics. The first part of the book discusses the basic numerical methods. A large number of exercises and computer experiments allows the reader to study the properties of these methods. The second part concentrates on simulation of classical and quantum systems. It uses a rather general concept for the equation of motion which can be applied to ordinary and partial differential equations. Several classes of integration methods are discussed, including not only the standard Euler and Runge-Kutta methods but also multistep methods and the class of Verlet methods, which is introduced by studying the motion in Liouville space. Besides the classical methods, inverse interpolation is discussed, together with the p...
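    As a concrete instance of the Verlet class of integrators mentioned above, here is a minimal velocity Verlet step in Python; the harmonic oscillator test problem is my own choice, not an example taken from the book.

        import numpy as np

        def velocity_verlet(force, x, v, dt, n_steps, mass=1.0):
            """Velocity Verlet: a symplectic, second-order integrator widely used in MD."""
            a = force(x) / mass
            for _ in range(n_steps):
                x = x + v * dt + 0.5 * a * dt ** 2
                a_new = force(x) / mass
                v = v + 0.5 * (a + a_new) * dt
                a = a_new
            return x, v

        # Harmonic oscillator test: the total energy should stay close to its initial value
        k = 1.0
        force = lambda x: -k * x
        x, v = velocity_verlet(force, np.array([1.0]), np.array([0.0]), dt=0.01, n_steps=1000)
        print(x, v, 0.5 * v ** 2 + 0.5 * k * x ** 2)   # energy remains near 0.5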

  17. Standardization of transportation classes for object-oriented deployment simulations.

    Energy Technology Data Exchange (ETDEWEB)

    Burke, J. F., Jr.; Howard, D. L.; Jackson, J.; Macal, C. M.; Nevins, M. R.; Van Groningen, C. N.

    1999-07-30

    Many recent efforts to integrate transportation and deployment simulations, although beneficial, have lacked a feature vital for seamless integration: a common data class representation. It is an objective of the Department of Defense (DoD) to standardize all classes used in object-oriented deployment simulations by developing a standard class attribute representation and behavior for all deployment simulations that rely on an underlying class representation. The Extensive Hierarchy and Object Representation for Transportation Simulations (EXHORT) is a collection of three hierarchies that together will constitute a standard and consistent class attribute representation and behavior that could be used directly by a large set of deployment simulations. The first hierarchy is the Transportation Class Hierarchy (TCH), which describes a significant portion of the defense transportation system; the other two deal with infrastructure and resource classes. EXHORT will allow deployment simulations to use the same set of underlying class data, ensure transparent exchanges, reduce the effort needed to integrate simulations, and permit a detailed analysis of the defense transportation system. This paper describes EXHORT's first hierarchy, the TCH, and provides a rationale for why it is a helpful tool for modeling major portions of the defense transportation system.

  18. Computer simulations of electromagnetic cool ion beam instabilities. [in near earth space

    Science.gov (United States)

    Gary, S. P.; Madland, C. D.; Schriver, D.; Winske, D.

    1986-01-01

    Electromagnetic ion beam instabilities driven by cool ion beams propagating parallel or antiparallel to a uniform magnetic field are studied using computer simulations. The elements of linear theory applicable to electromagnetic ion beam instabilities and the simulations derived from a one-dimensional hybrid computer code are described. The quasi-linear regime of the right-hand resonant ion beam instability, and the gyrophase bunching of the nonlinear regime of the right-hand resonant and nonresonant instabilities are examined. It is found that in the quasi-linear regime the instability saturation is due to a reduction in the beam-core relative drift speed and an increase in the perpendicular-to-parallel beam temperature; in the nonlinear regime the instabilities saturate when half the initial beam drift kinetic energy density is converted to fluctuating magnetic field energy density.

  19. CLASS: Core Library for Advanced Scenario Simulations

    International Nuclear Information System (INIS)

    Mouginot, B.; Thiolliere, N.

    2015-01-01

    The nuclear reactor simulation community has to perform complex electronuclear scenario simulations. To avoid constraints coming from the existing powerful scenario software such as COSI, VISION or FAMILY, the open source Core Library for Advanced Scenario Simulation (CLASS) has been developed. The main asset of CLASS is its ability to include any type of reactor, whether the system is innovative or standard. A reactor is fully described by its evolution database, which should contain a set of different validated fuel compositions in order to simulate transitional scenarios. CLASS aims to be a useful tool to study scenarios involving Generation-IV reactors as well as innovative fuel cycles, like the thorium cycle. In addition to all standard key objects required by an electronuclear scenario simulation (the isotopic vector, the reactor, the fuel storage and the fabrication units), CLASS also integrates two new specific modules: fresh fuel evolution and recycled fuel fabrication. The first module, dealing with fresh fuel evolution, is implemented in CLASS by solving Bateman equations built from a database of induced cross-sections. The second module, which incorporates the fabrication of recycled fuel into CLASS, can be defined by user priorities and/or algorithms. By default, it uses a linear Pu-equivalent method, which allows predicting, from the isotopic composition, the maximum burn-up accessible for a given type of fuel. This paper presents the basis of the CLASS scenario, the fuel method applied to a MOX fuel and an evolution module benchmark based on the French electronuclear fleet from 1977 to 2012. Results of the CLASS calculation were compared with the inventory made and published by the ANDRA organisation in 2012. For used UOX fuels, ANDRA reported 12006 tonnes of heavy metal in stock, including cooling, versus 18500 tonnes of heavy metal predicted by CLASS. The large difference is easily explained by the presence of 56 tonnes of plutonium already separated
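    The Bateman equations mentioned above are a linear system dN/dt = A N for the nuclide inventories; here is a minimal, generic Python sketch of how such a system can be solved with a matrix exponential. The three-nuclide chain and its rate constants are invented for illustration and have nothing to do with CLASS's databases.

        import numpy as np
        from scipy.linalg import expm

        # Toy 3-nuclide chain A -> B -> C with made-up decay/transmutation constants (per year)
        lam_ab, lam_bc = 0.30, 0.05
        A = np.array([
            [-lam_ab,  0.0,     0.0],
            [ lam_ab, -lam_bc,  0.0],
            [ 0.0,     lam_bc,  0.0],
        ])

        def evolve(n0, years):
            """Bateman equations dN/dt = A N solved as N(t) = expm(A t) N(0)."""
            return expm(A * years) @ n0

        n0 = np.array([1.0, 0.0, 0.0])
        print(evolve(n0, 10.0))   # inventories after 10 years; the total stays 1.0 (conservation)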

  20. Enabling Extreme Scale Earth Science Applications at the Oak Ridge Leadership Computing Facility

    Science.gov (United States)

    Anantharaj, V. G.; Mozdzynski, G.; Hamrud, M.; Deconinck, W.; Smith, L.; Hack, J.

    2014-12-01

    The Oak Ridge Leadership Computing Facility (OLCF), established at the Oak Ridge National Laboratory (ORNL) under the auspices of the U.S. Department of Energy (DOE), welcomes investigators from universities, government agencies, national laboratories and industry who are prepared to perform breakthrough research across a broad domain of scientific disciplines, including earth and space sciences. Titan, the OLCF flagship system, is currently listed as #2 in the Top500 list of supercomputers in the world, and the largest available for open science. The computational resources are allocated primarily via the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program, sponsored by the U.S. DOE Office of Science. In 2014, over 2.25 billion core hours on Titan were awarded via INCITE projects, including 14% of the allocation toward earth sciences. The INCITE competition is also open to research scientists based outside the USA. In fact, international research projects account for 12% of the INCITE awards in 2014. The INCITE scientific review panel also includes 20% participation from international experts. Recent accomplishments in earth sciences at OLCF include the world's first continuous simulation of 21,000 years of earth's climate history (2009), and an unprecedented simulation of a magnitude 8 earthquake over 125 sq. miles. One of the ongoing international projects involves scaling the ECMWF Integrated Forecasting System (IFS) model to over 200K cores of Titan. ECMWF is a partner in the EU funded Collaborative Research into Exascale Systemware, Tools and Applications (CRESTA) project. The significance of the research carried out within this project is the demonstration of techniques required to scale current generation Petascale capable simulation codes towards the performance levels required for running on future Exascale systems. One of the techniques pursued by ECMWF is to use Fortran2008 coarrays to overlap computations and communications and

  1. Effect of computer game playing on baseline laparoscopic simulator skills.

    Science.gov (United States)

    Halvorsen, Fredrik H; Cvancarova, Milada; Fosse, Erik; Mjåland, Odd

    2013-08-01

    Studies examining the possible association between computer game playing and laparoscopic performance in general have yielded conflicting results, and a relationship between computer game playing and baseline performance on laparoscopic simulators has not been established. The aim of this study was to examine the possible association between previous and present computer game playing and baseline performance on a virtual reality laparoscopic simulator in a sample of potential future medical students. The participating students completed a questionnaire covering the weekly amount and type of computer game playing activity during the previous year and 3 years ago. They then performed 2 repetitions of 2 tasks ("gallbladder dissection" and "traverse tube") on a virtual reality laparoscopic simulator. Their performance on the simulator was then analyzed for association with their computer game experience. Setting: a local high school, Norway. Forty-eight students from 2 high school classes volunteered to participate in the study. No association between prior and present computer game playing and baseline performance was found. The results were similar both for prior and present action game playing and for prior and present computer game playing in general. Our results indicate that prior and present computer game playing may not affect baseline performance in a virtual reality simulator.

  2. THE USE OF LAPTOP COMPUTERS, TABLETS AND GOOGLE EARTH/GOOGLE MAPS APPLICATIONS DURING GEOGRAPHY CLUB SEMINARS

    Directory of Open Access Journals (Sweden)

    FLORIN GALBIN

    2015-01-01

    Full Text Available In the current study, we aim to investigate the use of Google Earth and Google Maps Applications on tablet and laptop computers. The research was carried out during the Geography Club seminars organized at “Radu Petrescu” High School in the 2013-2014 school year. The research involved 13 students in various gymnasium and high school grades. The activities included: navigation with Google Earth/Maps, image capturing techniques, virtual tours, measuring distances or river lengths, identifying relief forms, and locating geographical components of the environment. In order to retrieve students’ opinions regarding the use of tablets and laptop computers with these two applications, they were asked to respond to a questionnaire after the activities took place. Conclusions revealed that students enjoyed using these applications with laptops and tablets and that the learning process during Geography classes became more interesting.

  3. Particle-in-cell simulations of Earth-like magnetosphere during a magnetic field reversal

    Science.gov (United States)

    Barbosa, M. V. G.; Alves, M. V.; Vieira, L. E. A.; Schmitz, R. G.

    2017-12-01

    The geologic record shows that hundreds of pole reversals have occurred throughout Earth's history. The mean interval between pole reversals is roughly 200 to 300 thousand years, and the last reversal occurred around 780 thousand years ago. Pole reversal is a slow process during which the strength of the magnetic field decreases and the field becomes more complex, with more than two poles appearing for some time, before the field strength increases again with reversed polarity. During the process, the magnetic field configuration changes, leaving the Earth-like planet vulnerable to the harmful effects of the Sun. Understanding what happens to the magnetosphere during these pole reversals is an open topic of investigation. Only recently have PIC codes been used to model magnetospheres. Here we use the particle code iPIC3D [Markidis et al, Mathematics and Computers in Simulation, 2010] to simulate an Earth-like magnetosphere at three different times along the pole reversal process. The code was modified so that the Earth-like magnetic field is generated using an expansion in spherical harmonics with the Gauss coefficients given by a MHD simulation of the Earth's core [Glatzmaier et al, Nature, 1995; 1999; private communication to L.E.A.V.]. Simulations show the qualitative behavior of the magnetosphere, such as the current structures. Only the planetary magnetic field was changed between the runs; the solar wind is the same for all runs. Preliminary results show the formation of the Chapman-Ferraro current at the front of the magnetosphere in all cases. In the run for the middle of the reversal process, with its low-intensity magnetic field and asymmetrical configuration, the current structure changes and the presence of multiple poles can be observed. In all simulations, a structure similar to the radiation belts was found. Simulations of more severe solar wind conditions are necessary to determine the real impact of the reversal on the magnetosphere.
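    The spherical-harmonic field construction described above can be sketched compactly for the simplest (dipole-only) case. The sketch below keeps only the degree-1 Gauss coefficient; the coefficient value and planetary radius are assumed Earth-like placeholders, not the geodynamo-model coefficients used in the study.

        # Internal field from a single degree-1 (axial dipole) Gauss coefficient.
        # The study expands to higher degrees with coefficients from an MHD core model;
        # g10 and R_P below are assumed, Earth-like placeholder values.
        import numpy as np

        R_P = 6.371e6         # planetary radius [m]
        g10 = -29404.8e-9     # degree-1 Gauss coefficient [T]

        def dipole_field(r, theta):
            """(B_r, B_theta) of an axial dipole at radius r [m], colatitude theta [rad]."""
            a_over_r3 = (R_P / r) ** 3
            B_r = 2.0 * g10 * a_over_r3 * np.cos(theta)
            B_theta = g10 * a_over_r3 * np.sin(theta)
            return B_r, B_theta

        print(dipole_field(2.0 * R_P, np.pi / 4))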

  4. Integrated Instrument Simulator Suites for Earth Science

    Science.gov (United States)

    Tanelli, Simone; Tao, Wei-Kuo; Matsui, Toshihisa; Hostetler, Chris; Hair, John; Butler, Carolyn; Kuo, Kwo-Sen; Niamsuwan, Noppasin; Johnson, Michael P.; Jacob, Joseph C.

    2012-01-01

    The NASA Earth Observing System Simulators Suite (NEOS3) is a modular framework of forward simulation tools for remote sensing of Earth's atmosphere from space. It was initiated as the Instrument Simulator Suite for Atmospheric Remote Sensing (ISSARS) under the NASA Advanced Information Systems Technology (AIST) program of the Earth Science Technology Office (ESTO) to enable science users to perform simulations based on advanced atmospheric and simple land surface models, and to rapidly integrate into a broad framework any experimental or innovative tools that they may have developed in this context. The name was changed to NEOS3 when the project was expanded to include more advanced modeling tools for the surface contributions, accounting for scattering and emission properties of layered surfaces (e.g., soil moisture, vegetation, snow and ice, subsurface layers). NEOS3 relies on a web-based graphic user interface and a three-stage processing strategy to generate simulated measurements. The user has full control over a wide range of customizations, both in terms of a priori assumptions and in terms of specific solvers or models used to calculate the measured signals. This presentation will demonstrate the general architecture and the configuration procedures, and will illustrate some sample products and the fundamental interface requirements for modules that are candidates for integration.

  5. Spectral determination of individual rare earths in different classes of inorganic compounds

    International Nuclear Information System (INIS)

    Karpenko, L.I.; Fadeeva, L.A.; Shevchenko, L.D.

    1979-01-01

    The conditions are found that allow various inorganic compounds to be analyzed for rare-earth elements without separation from the non-rare-earth components. The influence of the plasma composition on the intensity of spectral lines of rare-earth elements is studied. The relative intensity of homologous spectral lines of various rare-earth elements remains constant regardless of the plasma composition. Conditions are found for the determination of individual rare-earth elements acting both as alloying additives (C_n of n x 10^-1 to n x 10^-3 %) and as basic components (up to tens of percent) in different classes of inorganic compounds of 1-7 elements. A general method is developed for the determination of individual rare-earth elements in mixtures of rare-earth oxides; complex fluorides of rare-earth elements and group 2 elements; gallates, borates, germanates, and vanadates of rare-earth elements and aluminium; lead and barium zirconate-titanates containing modifying additives of rare-earth elements; and complex chalcogenides of rare-earth elements and group 5 elements.

  6. Wavelet subband coding of computer simulation output using the A++ array class library

    Energy Technology Data Exchange (ETDEWEB)

    Bradley, J.N.; Brislawn, C.M.; Quinlan, D.J.; Zhang, H.D. [Los Alamos National Lab., NM (United States); Nuri, V. [Washington State Univ., Pullman, WA (United States). School of EECS

    1995-07-01

    The goal of the project is to produce utility software for off-line compression of existing data and library code that can be called from a simulation program for on-line compression of data dumps as the simulation proceeds. Naturally, we would like the amount of CPU time required by the compression algorithm to be small in comparison to the requirements of typical simulation codes. We also want the algorithm to accommodate a wide variety of smooth, multidimensional data types. For these reasons, the subband vector quantization (VQ) approach employed in earlier work has been replaced by a scalar quantization (SQ) strategy using a bank of almost-uniform scalar subband quantizers in a scheme similar to that used in the FBI fingerprint image compression standard. This eliminates the considerable computational burdens of training VQ codebooks for each new type of data and performing nearest-vector searches to encode the data. The comparison of subband VQ and SQ algorithms indicated that, in practice, there is relatively little additional gain from using vector as opposed to scalar quantization on DWT subbands, even when the source imagery is from a very homogeneous population, and our subjective experience with synthetic computer-generated data supports this stance. It appears that a careful study is needed of the tradeoffs involved in selecting scalar vs. vector subband quantization, but such an analysis is beyond the scope of this paper. Our present work is focused on the problem of generating wavelet transform/scalar quantization (WSQ) implementations that can be ported easily between different hardware environments. This is an extremely important consideration given the great profusion of different high-performance computing architectures available, the high cost associated with learning how to map algorithms effectively onto a new architecture, and the rapid rate of evolution in the world of high-performance computing.
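    The wavelet-transform/scalar-quantization idea described above can be sketched in a few lines. The example below is not the A++-based implementation from the paper: it uses Python with PyWavelets (an assumption), a single biorthogonal wavelet, and one arbitrary step size for all subbands, whereas the scheme above uses a bank of per-subband quantizers.

        # Sketch of a WSQ-style round trip: DWT subbands, uniform scalar quantization,
        # dequantization, inverse DWT. Wavelet choice and step size are arbitrary.
        import numpy as np
        import pywt

        def wsq_roundtrip(data, wavelet="bior4.4", level=2, step=0.05):
            coeffs = pywt.wavedec2(data, wavelet, level=level)      # subband decomposition
            quant = [np.round(coeffs[0] / step)]
            quant += [tuple(np.round(c / step) for c in d) for d in coeffs[1:]]
            dequant = [quant[0] * step]
            dequant += [tuple(c * step for c in d) for d in quant[1:]]
            return pywt.waverec2(dequant, wavelet)                   # reconstruction

        field = np.random.rand(64, 64)          # stand-in for a smooth simulation dump
        recon = wsq_roundtrip(field)
        print("max abs error:", np.abs(field - recon).max())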

  7. Toward an in-situ analytics and diagnostics framework for earth system models

    Science.gov (United States)

    Anantharaj, Valentine; Wolf, Matthew; Rasch, Philip; Klasky, Scott; Williams, Dean; Jacob, Rob; Ma, Po-Lun; Kuo, Kwo-Sen

    2017-04-01

    The development roadmaps for many earth system models (ESMs) aim for a globally cloud-resolving model targeting the pre-exascale and exascale systems of the future. The ESMs will also incorporate more complex physics, chemistry and biology, thereby vastly increasing the fidelity of the information content simulated by the model. We will then be faced with an unprecedented volume of simulation output that would need to be processed and analyzed concurrently in order to derive valuable scientific results. We are already at this threshold with the current generation of ESMs at higher-resolution simulations. Currently, the nominal I/O throughput in the Community Earth System Model (CESM) via the Parallel IO (PIO) library is around 100 MB/s. The high-frequency I/O requirements would add roughly 1 GB per simulated hour, translating to roughly 4 mins wallclock / simulated-day => 24.33 wallclock hours / simulated-model-year => 1,752,000 core-hours of charge per simulated-model-year on the Titan supercomputer at the Oak Ridge Leadership Computing Facility. There is also a pending need for a 3X larger volume of simulation output. Meanwhile, many ESMs use instrument simulators to run forward models to compare model simulations against satellite and ground-based instruments, such as radars and radiometers. The CFMIP Observation Simulator Package (COSP) is used in CESM as well as in the Accelerated Climate Model for Energy (ACME), one of the ESMs specifically targeting current and emerging leadership-class computing platforms. These simulators can be computationally expensive, accounting for as much as 30% of the computational cost. Hence the data are often written to output files that are then used for offline calculations. Again, the I/O bottleneck becomes a limitation. Detection and attribution studies also use large volumes of data for pattern recognition and feature extraction to analyze weather and climate phenomena such as tropical cyclones.
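    The wallclock and core-hour figures quoted above follow from simple arithmetic; the sketch below reproduces them, where the 72,000-core job size is inferred from the quoted numbers and is our assumption, not a figure from the abstract.

        # Back-of-the-envelope check of the quoted I/O cost figures.
        minutes_per_sim_day = 4                                  # quoted wallclock cost of high-frequency I/O
        hours_per_sim_year = minutes_per_sim_day * 365 / 60      # ~24.33 wallclock hours per simulated year
        cores = 72_000                                           # assumed job size (inferred, not quoted)
        core_hours_per_sim_year = hours_per_sim_year * cores     # ~1,752,000 core-hours per simulated year
        print(f"{hours_per_sim_year:.2f} h/sim-year, {core_hours_per_sim_year:,.0f} core-hours/sim-year")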

  8. Discovery of M class objects among the near-earth asteroid population

    Science.gov (United States)

    Tedesco, Edward F.; Gradie, Jonathan

    1987-01-01

    Broadband colorimetry, visual photometry, near-infrared photometry, and 10 and 20 micron radiometry of the near-earth asteroids (NEAs) 1986 DA and 1986 EB are used to show that these objects belong to the M class of asteroids. The similarity of the distribution of taxonomic classes among the 38 NEAs to the abundances found in the inner asteroid belt between the 3:1 and 5:2 resonances suggests that NEAs have their origins among asteroids in the vicinity of these resonances. The implied mineralogy of 1986 DA and 1986 EB is mostly nickel-iron metal; if this is indeed the case, then current models for meteorite production based on strength-related collisional processes on asteroidal surfaces predict that these two objects alone should produce about one percent of all meteorite falls. Iron meteorites derived from these near-earth asteroids should have low cosmic-ray exposure ages.

  9. Google Earth Engine: a new cloud-computing platform for global-scale earth observation data and analysis

    Science.gov (United States)

    Moore, R. T.; Hansen, M. C.

    2011-12-01

    Google Earth Engine is a new technology platform that enables monitoring and measurement of changes in the earth's environment, at planetary scale, on a large catalog of earth observation data. The platform offers intrinsically-parallel computational access to thousands of computers in Google's data centers. Initial efforts have focused primarily on global forest monitoring and measurement, in support of REDD+ activities in the developing world. The intent is to put this platform into the hands of scientists and developing world nations, in order to advance the broader operational deployment of existing scientific methods, and strengthen the ability for public institutions and civil society to better understand, manage and report on the state of their natural resources. Earth Engine currently hosts online nearly the complete historical Landsat archive of L5 and L7 data collected over more than twenty-five years. Newly-collected Landsat imagery is downloaded from USGS EROS Center into Earth Engine on a daily basis. Earth Engine also includes a set of historical and current MODIS data products. The platform supports generation, on-demand, of spatial and temporal mosaics, "best-pixel" composites (for example to remove clouds and gaps in satellite imagery), as well as a variety of spectral indices. Supervised learning methods are available over the Landsat data catalog. The platform also includes a new application programming framework, or "API", that allows scientists access to these computational and data resources, to scale their current algorithms or develop new ones. Under the covers of the Google Earth Engine API is an intrinsically-parallel image-processing system. Several forest monitoring applications powered by this API are currently in development and expected to be operational in 2011. Combining science with massive data and technology resources in a cloud-computing framework can offer advantages of computational speed, ease-of-use and collaboration.
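    To give a sense of the API style described above, the fragment below uses the public Earth Engine Python client to build an on-demand "best-pixel" composite and a spectral index. The collection ID, band names, and authentication step are our assumptions and should be checked against the current Earth Engine data catalog and documentation.

        # Illustrative sketch of the Earth Engine Python API style (not from the paper).
        # Collection ID and band names are assumptions; an authenticated session is required.
        import ee

        ee.Initialize()

        # Cloud-reduced "best-pixel" composite over one year of Landsat 5 TOA data.
        composite = (ee.ImageCollection("LANDSAT/LT05/C02/T1_TOA")
                     .filterDate("2010-01-01", "2010-12-31")
                     .median())

        # A spectral index (NDVI) computed server-side, on demand.
        ndvi = composite.normalizedDifference(["B4", "B3"]).rename("NDVI")
        print(ndvi.getInfo()["bands"][0]["id"])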

  10. Benefits of computer screen-based simulation in learning cardiac arrest procedures.

    Science.gov (United States)

    Bonnetain, Elodie; Boucheix, Jean-Michel; Hamet, Maël; Freysz, Marc

    2010-07-01

    What is the best way to train medical students early so that they acquire basic skills in cardiopulmonary resuscitation as effectively as possible? Studies have shown the benefits of high-fidelity patient simulators, but have also demonstrated their limits. New computer screen-based multimedia simulators have fewer constraints than high-fidelity patient simulators. In this area, as yet, there has been no research on the effectiveness of transfer of learning from a computer screen-based simulator to more realistic situations such as those encountered with high-fidelity patient simulators. We tested the benefits of learning cardiac arrest procedures using a multimedia computer screen-based simulator in 28 Year 2 medical students. Just before the end of the traditional resuscitation course, we compared two groups. An experiment group (EG) was first asked to learn to perform the appropriate procedures in a cardiac arrest scenario (CA1) in the computer screen-based learning environment and was then tested on a high-fidelity patient simulator in another cardiac arrest simulation (CA2). While the EG was learning to perform CA1 procedures in the computer screen-based learning environment, a control group (CG) actively continued to learn cardiac arrest procedures using practical exercises in a traditional class environment. Both groups were given the same amount of practice, exercises and trials. The CG was then also tested on the high-fidelity patient simulator for CA2, after which it was asked to perform CA1 using the computer screen-based simulator. Performances with both simulators were scored on a precise 23-point scale. On the test on a high-fidelity patient simulator, the EG trained with a multimedia computer screen-based simulator performed significantly better than the CG trained with traditional exercises and practice (16.21 versus 11.13 of 23 possible points, respectively; p<0.001). Computer screen-based simulation appears to be effective in preparing learners to perform cardiac arrest procedures on a high-fidelity patient simulator.

  11. Accelerated simulation of near-Earth-orbit polymer degradation

    Science.gov (United States)

    Laue, Eric

    1992-01-01

    There is a need to simulate the near-Earth-orbit environmental conditions, and it is useful to be able to monitor the changes in physical properties of spacecraft materials. Two different methods for simulating the vacuum-ultraviolet (VUV) and soft X-ray near-Earth-orbit flux are presented. Also, methods for monitoring the changes in optical ultraviolet transmission and mass loss are presented. The results of exposures to VUV photons and charged particles on these materials are discussed.

  12. A general class of preconditioners for statistical iterative reconstruction of emission computed tomography

    International Nuclear Information System (INIS)

    Chinn, G.; Huang, S.C.

    1997-01-01

    A major drawback of statistical iterative image reconstruction for emission computed tomography is its high computational cost. The ill-posed nature of tomography leads to slow convergence for standard gradient-based iterative approaches such as the steepest descent or the conjugate gradient algorithm. In this paper, new theory and methods for a class of preconditioners are developed for accelerating the convergence rate of iterative reconstruction. To demonstrate the potential of this class of preconditioners, a preconditioned conjugate gradient (PCG) iterative algorithm for weighted least squares (WLS) reconstruction was formulated for emission tomography. Using simulated positron emission tomography (PET) data of the Hoffman brain phantom, it was shown that the PCG algorithm can reduce the number of iterations of the standard conjugate gradient algorithm by a factor of 2--8, depending on the convergence criterion.
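    A minimal sketch of the PCG iteration for the WLS normal equations is given below; the diagonal (Jacobi) preconditioner used here is only a stand-in for the class of preconditioners developed in the paper, and the data are random placeholders rather than PET sinograms.

        # PCG for weighted least squares, A^T W A x = A^T W y, with a Jacobi
        # preconditioner standing in for the paper's preconditioner class.
        import numpy as np

        def pcg_wls(A, W, y, n_iter=50):
            H = A.T @ W @ A                    # normal-equation matrix
            b = A.T @ W @ y
            M_inv = 1.0 / np.diag(H)           # diagonal (Jacobi) preconditioner
            x = np.zeros(A.shape[1])
            r = b - H @ x
            z = M_inv * r
            p = z.copy()
            for _ in range(n_iter):
                Hp = H @ p
                alpha = (r @ z) / (p @ Hp)
                x += alpha * p
                r_new = r - alpha * Hp
                z_new = M_inv * r_new
                beta = (r_new @ z_new) / (r @ z)
                p = z_new + beta * p
                r, z = r_new, z_new
            return x

        rng = np.random.default_rng(0)
        A = rng.random((200, 50))
        W = np.diag(rng.random(200) + 0.1)     # measurement weights
        x_true = rng.random(50)
        print(np.linalg.norm(pcg_wls(A, W, A @ x_true) - x_true))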

  13. Scientific computer simulation review

    International Nuclear Information System (INIS)

    Kaizer, Joshua S.; Heller, A. Kevin; Oberkampf, William L.

    2015-01-01

    Before the results of a scientific computer simulation are used for any purpose, it should be determined if those results can be trusted. Answering that question of trust is the domain of scientific computer simulation review. There is limited literature that focuses on simulation review, and most is specific to the review of a particular type of simulation. This work is intended to provide a foundation for a common understanding of simulation review. This is accomplished through three contributions. First, scientific computer simulation review is formally defined. This definition identifies the scope of simulation review and provides the boundaries of the review process. Second, maturity assessment theory is developed. This development clarifies the concepts of maturity criteria, maturity assessment sets, and maturity assessment frameworks, which are essential for performing simulation review. Finally, simulation review is described as the application of a maturity assessment framework. This is illustrated through evaluating a simulation review performed by the U.S. Nuclear Regulatory Commission. In making these contributions, this work provides a means for a more objective assessment of a simulation’s trustworthiness and takes the next step in establishing scientific computer simulation review as its own field. - Highlights: • We define scientific computer simulation review. • We develop maturity assessment theory. • We formally define a maturity assessment framework. • We describe simulation review as the application of a maturity framework. • We provide an example of a simulation review using a maturity framework

  14. The ab initio simulation of the Earth's core.

    Science.gov (United States)

    Alfè, D; Gillan, M J; Vocadlo, L; Brodholt, J; Price, G D

    2002-06-15

    The Earth has a liquid outer and solid inner core. It is predominantly composed of Fe, alloyed with small amounts of light elements, such as S, O and Si. The detailed chemical and thermal structure of the core is poorly constrained, and it is difficult to perform experiments to establish the properties of core-forming phases at the pressures (ca. 300 GPa) and temperatures (ca. 5000-6000 K) to be found in the core. Here we present some major advances that have been made in using quantum mechanical methods to simulate the high-P/T properties of Fe alloys, which have been made possible by recent developments in high-performance computing. Specifically, we outline how we have calculated the Gibbs free energies of the crystalline and liquid forms of Fe alloys, and so conclude that the inner core of the Earth is composed of hexagonal close packed Fe containing ca. 8.5% S (or Si) and 0.2% O in equilibrium at 5600 K at the boundary between the inner and outer cores with a liquid Fe containing ca. 10% S (or Si) and 8% O.

  15. Using an In-Class Simulation in the First Accounting Class: Moving from Surface to Deep Learning

    Science.gov (United States)

    Phillips, Mary E.; Graeff, Timothy R.

    2014-01-01

    As students often find the first accounting class to be abstract and difficult to understand, the authors designed an in-class simulation as an intervention to move students toward deep learning and away from surface learning. The simulation consists of buying and selling merchandise and accounting for transactions. The simulation is an effective…

  16. Automatic Computer Mapping of Terrain

    Science.gov (United States)

    Smedes, H. W.

    1971-01-01

    Computer processing of 17 wavelength bands of visible, reflective infrared, and thermal infrared scanner spectrometer data, and of three wavelength bands derived from color aerial film has resulted in successful automatic computer mapping of eight or more terrain classes in a Yellowstone National Park test site. The tests involved: (1) supervised and non-supervised computer programs; (2) special preprocessing of the scanner data to reduce computer processing time and cost, and improve the accuracy; and (3) studies of the effectiveness of the proposed Earth Resources Technology Satellite (ERTS) data channels in the automatic mapping of the same terrain, based on simulations, using the same set of scanner data. The following terrain classes have been mapped with greater than 80 percent accuracy in a 12-square-mile area with 1,800 feet of relief: (1) bedrock exposures, (2) vegetated rock rubble, (3) talus, (4) glacial kame meadow, (5) glacial till meadow, (6) forest, (7) bog, and (8) water. In addition, shadows of clouds and cliffs are depicted, but were greatly reduced by using preprocessing techniques.

  17. Mathematical and computational modeling and simulation fundamentals and case studies

    CERN Document Server

    Moeller, Dietmar P F

    2004-01-01

    Mathematical and Computational Modeling and Simulation - a highly multi-disciplinary field with ubiquitous applications in science and engineering - is one of the key enabling technologies of the 21st century. This book introduces the use of Mathematical and Computational Modeling and Simulation in order to develop an understanding of the solution characteristics of a broad class of real-world problems. The relevant basic and advanced methodologies are explained in detail, with special emphasis on ill-defined problems. Some 15 simulation systems are presented on the language and the logical level. Moreover, the reader can accumulate experience by studying a wide variety of case studies. The latter are briefly described within the book but their full versions as well as some simulation software demos are available on the Web. The book can be used for university courses of different levels as well as for self-study. Advanced sections are marked and can be skipped in a first reading or in undergraduate courses...

  18. Enabling Earth Science Through Cloud Computing

    Science.gov (United States)

    Hardman, Sean; Riofrio, Andres; Shams, Khawaja; Freeborn, Dana; Springer, Paul; Chafin, Brian

    2012-01-01

    Cloud Computing holds tremendous potential for missions across the National Aeronautics and Space Administration. Several flight missions are already benefiting from an investment in cloud computing for mission critical pipelines and services through faster processing time, higher availability, and drastically lower costs available on cloud systems. However, these processes do not currently extend to general scientific algorithms relevant to earth science missions. The members of the Airborne Cloud Computing Environment task at the Jet Propulsion Laboratory have worked closely with the Carbon in Arctic Reservoirs Vulnerability Experiment (CARVE) mission to integrate cloud computing into their science data processing pipeline. This paper details the efforts involved in deploying a science data system for the CARVE mission, evaluating and integrating cloud computing solutions with the system and porting their science algorithms for execution in a cloud environment.

  19. Frontiers of massively parallel scientific computation

    International Nuclear Information System (INIS)

    Fischer, J.R.

    1987-07-01

    Practical applications using massively parallel computer hardware first appeared during the 1980s. Their development was motivated by the need for computing power orders of magnitude beyond that available today for tasks such as numerical simulation of complex physical and biological processes, generation of interactive visual displays, satellite image analysis, and knowledge based systems. Representative of the first generation of this new class of computers is the Massively Parallel Processor (MPP). A team of scientists was provided the opportunity to test and implement their algorithms on the MPP. The first results are presented. The research spans a broad variety of applications including Earth sciences, physics, signal and image processing, computer science, and graphics. The performance of the MPP was very good. Results obtained using the Connection Machine and the Distributed Array Processor (DAP) are presented

  20. Earth Science Technology Office's Computational Technologies Project

    Science.gov (United States)

    Fischer, James (Technical Monitor); Merkey, Phillip

    2005-01-01

    This grant supported the effort to characterize the problem domain of the Earth Science Technology Office's Computational Technologies Project, to engage the Beowulf Cluster Computing Community as well as the High Performance Computing Research Community so that we can predict the applicability of said technologies to the scientific community represented by the CT project and formulate long term strategies to provide the computational resources necessary to attain the anticipated scientific objectives of the CT project. Specifically, the goal of the evaluation effort is to use the information gathered over the course of the Round-3 investigations to quantify the trends in scientific expectations, the algorithmic requirements and capabilities of high-performance computers to satisfy this anticipated need.

  1. Temporal Variability of Observed and Simulated Hyperspectral Earth Reflectance

    Science.gov (United States)

    Roberts, Yolanda; Pilewskie, Peter; Kindel, Bruce; Feldman, Daniel; Collins, William D.

    2012-01-01

    The Climate Absolute Radiance and Refractivity Observatory (CLARREO) is a climate observation system designed to study Earth's climate variability with unprecedented absolute radiometric accuracy and SI traceability. Observation System Simulation Experiments (OSSEs) were developed using GCM output and MODTRAN to simulate CLARREO reflectance measurements during the 21st century as a design tool for the CLARREO hyperspectral shortwave imager. With OSSE simulations of hyperspectral reflectance, Feldman et al. [2011a,b] found that shortwave reflectance is able to detect changes in climate variables during the 21st century and improve time-to-detection compared to broadband measurements. The OSSE has been a powerful tool in the design of the CLARREO imager and for understanding the effect of climate change on the spectral variability of reflectance, but it is important to evaluate how well the OSSE simulates the Earth's present-day spectral variability. For this evaluation we have used hyperspectral reflectance measurements from the Scanning Imaging Absorption Spectrometer for Atmospheric Cartography (SCIAMACHY), a shortwave spectrometer that was operational between March 2002 and April 2012. To study the spectral variability of SCIAMACHY-measured and OSSE-simulated reflectance, we used principal component analysis (PCA), a spectral decomposition technique that identifies dominant modes of variability in a multivariate data set. Using quantitative comparisons of the OSSE and SCIAMACHY PCs, we have quantified how well the OSSE captures the spectral variability of Earth's climate system at the beginning of the 21st century relative to SCIAMACHY measurements. These results showed that the OSSE and SCIAMACHY data sets share over 99% of their total variance in 2004. Using the PCs and the temporally distributed reflectance spectra projected onto the PCs (PC scores), we can study the temporal variability of the observed and simulated reflectance spectra.
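    The PCA step described above can be sketched briefly; the example below applies a singular value decomposition to a synthetic set of reflectance spectra (random placeholders, not SCIAMACHY or OSSE output) to obtain the PCs, their explained variance, and the PC scores used to track temporal variability.

        # Sketch of the PCA decomposition of reflectance spectra (synthetic data).
        import numpy as np

        rng = np.random.default_rng(1)
        n_spectra, n_wavelengths = 500, 300
        spectra = rng.normal(0.3, 0.05, (n_spectra, n_wavelengths))   # placeholder reflectances

        anomalies = spectra - spectra.mean(axis=0)          # remove the mean spectrum
        U, S, Vt = np.linalg.svd(anomalies, full_matrices=False)
        pcs = Vt                                            # principal components (spectral modes)
        explained = S**2 / np.sum(S**2)                     # fraction of variance per mode
        scores = U * S                                      # PC scores for each spectrum

        print("variance explained by first 3 PCs:", explained[:3].sum())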

  2. Quasilinear simulations of interplanetary shocks and Earth's bow shock

    Science.gov (United States)

    Afanasiev, Alexandr; Battarbee, Markus; Ganse, Urs; Vainio, Rami; Palmroth, Minna; Pfau-Kempf, Yann; Hoilijoki, Sanni; von Alfthan, Sebastian

    2016-04-01

    We have developed a new self-consistent Monte Carlo simulation model for particle acceleration in shocks. The model includes a prescribed large-scale magnetic field and plasma density, temperature and velocity profiles and a self-consistently computed incompressible ULF foreshock under the quasilinear approximation. Unlike previous analytical treatments, our model is time dependent and takes full account of the anisotropic particle distributions and scattering in the wave-particle interaction process. We apply the model to the problem of particle acceleration at traveling interplanetary (IP) shocks and Earth's bow shock and compare the results with hybrid-Vlasov simulations and spacecraft observations. A qualitative agreement in terms of spectral shape of the magnetic fluctuations and the polarization of the unstable mode is found between the models and the observations. We will quantify the differences of the models and explore the region of validity of the quasilinear approach in terms of shock parameters. We will also compare the modeled IP shocks and the bow shock, identifying the similarities and differences in the spectrum of accelerated particles and waves in these scenarios. The work has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No 637324 (HESPERIA). The Academy of Finland is thanked for financial support. We acknowledge the computational resources provided by CSC - IT Centre for Science Ltd., Espoo.

  3. Restructuring the CS 1 classroom: Examining the effect of open laboratory-based classes vs. closed laboratory-based classes on Computer Science 1 students' achievement and attitudes toward computers and computer courses

    Science.gov (United States)

    Henderson, Jean Foster

    The purpose of this study was to assess the effect of classroom restructuring involving computer laboratories on student achievement and student attitudes toward computers and computer courses. The effects of the targeted student attributes of gender, previous programming experience, math background, and learning style were also examined. The open lab-based class structure consisted of a traditional lecture class with a separate, unscheduled lab component in which lab assignments were completed outside of class; the closed lab-based class structure integrated a lab component within the lecture class so that half the class was reserved for lecture and half the class was reserved for students to complete lab assignments by working cooperatively with each other and under the supervision and guidance of the instructor. The sample consisted of 71 students enrolled in four intact classes of Computer Science I during the fall and spring semesters of the 2006--2007 school year at two southern universities: two classes were held in the fall (one at each university) and two classes were held in the spring (one at each university). A counterbalanced repeated measures design was used in which all students experienced both class structures for half of each semester. The order of control and treatment was rotated among the four classes. All students received the same amount of class and instructor time. A multivariate analysis of variance (MANOVA) via a multiple regression strategy was used to test the study's hypotheses. Although the overall MANOVA model was statistically significant, independent follow-up univariate analyses relative to each dependent measure found that the only significant research factor was math background: Students whose mathematics background was at the level of Calculus I or higher had significantly higher student achievement than students whose mathematics background was less than Calculus I. The results suggest that classroom structures that

  4. Computer simulation of atomic collision processes in solids

    International Nuclear Information System (INIS)

    Robinson, M.T.

    1992-11-01

    Computer simulation is a major tool for studying the interactions of swift ions with solids which underlie processes such as particle backscattering, ion implantation, radiation damage, and sputtering. Numerical models are classed as molecular dynamics or binary collision models, along with some intermediate types. Binary collision models are divided into those for crystalline targets and those for structureless ones. The foundations of such models are reviewed, including interatomic potentials, electron excitations, and relationships among the various types of codes. Some topics of current interest are summarized

  5. Computation Reduction Oriented Circular Scanning SAR Raw Data Simulation on Multi-GPUs

    Directory of Open Access Journals (Sweden)

    Hu Chen

    2016-08-01

    Full Text Available As a special working mode, circular scanning Synthetic Aperture Radar (SAR) is widely used in earth observation. With the increase of resolution and swath width, the volume of simulated data grows massively, which raises new efficiency requirements. Through analyzing the redundancy in the raw data simulation based on the Graphics Processing Unit (GPU), a fast simulation method that reduces redundant computation is realized using multiple GPUs and the Message Passing Interface (MPI). The results show that the efficiency with 4 GPUs doubles through the reduction of redundancy and the hardware cost decreases by 50%, so that the overall speedup reaches 350 times that of the traditional CPU simulation.

  6. Atomic-level computer simulation

    International Nuclear Information System (INIS)

    Adams, J.B.; Rockett, Angus; Kieffer, John; Xu Wei; Nomura, Miki; Kilian, K.A.; Richards, D.F.; Ramprasad, R.

    1994-01-01

    This paper provides a broad overview of the methods of atomic-level computer simulation. It discusses methods of modelling atomic bonding, and computer simulation methods such as energy minimization, molecular dynamics, Monte Carlo, and lattice Monte Carlo. ((orig.))

  7. Cloud Computing Technologies Facilitate Earth Research

    Science.gov (United States)

    2015-01-01

    Under a Space Act Agreement, NASA partnered with Seattle-based Amazon Web Services to make the agency's climate and Earth science satellite data publicly available on the company's servers. Users can access the data for free, but they can also pay to use Amazon's computing services to analyze and visualize information using the same software available to NASA researchers.

  8. The biowaiver extension for BCS class III drugs: the effect of dissolution rate on the bioequivalence of BCS class III immediate-release drugs predicted by computer simulation.

    Science.gov (United States)

    Tsume, Yasuhiro; Amidon, Gordon L

    2010-08-02

    The Biopharmaceutical Classification System (BCS) guidance issued by the FDA allows waivers for in vivo bioavailability and bioequivalence studies for immediate-release (IR) solid oral dosage forms only for BCS class I drugs. However, a number of drugs within BCS class III have been proposed to be eligible for biowaivers. The World Health Organization (WHO) has shortened the requisite dissolution time of BCS class III drugs on their Essential Medicine List (EML) from 30 to 15 min for extended biowaivers; however, the impact of the shorter dissolution time on AUC(0-inf) and C(max) is unknown. The objectives of this investigation were to assess the ability of gastrointestinal simulation software to predict the oral absorption of the BCS class I drugs propranolol and metoprolol and the BCS class III drugs cimetidine, atenolol, and amoxicillin, and to perform in silico bioequivalence studies to assess the feasibility of extending biowaivers to BCS class III drugs. The drug absorption from the gastrointestinal tract was predicted using physicochemical and pharmacokinetic properties of test drugs provided by GastroPlus (version 6.0). Virtual trials with a 200 mL dose volume at different drug release rates (T(85%) = 15 to 180 min) were performed to predict the oral absorption (C(max) and AUC(0-inf)) of the above drugs. Both BCS class I drugs satisfied bioequivalence with regard to the release rates up to 120 min. The results with BCS class III drugs demonstrated bioequivalence using the prolonged release rate, T(85%) = 45 or 60 min, indicating that the dissolution standard for bioequivalence is dependent on the intestinal membrane permeability and permeability profile throughout the gastrointestinal tract. The results of GastroPlus simulations indicate that the dissolution rate of BCS class III drugs could be prolonged to the point where dissolution, rather than permeability, would control the overall absorption. For BCS class III drugs with intestinal absorption patterns

  9. Micromagnetics of rare-earth efficient permanent magnets

    Science.gov (United States)

    Fischbacher, Johann; Kovacs, Alexander; Gusenbauer, Markus; Oezelt, Harald; Exl, Lukas; Bance, Simon; Schrefl, Thomas

    2018-05-01

    The development of permanent magnets containing less or no rare-earth elements is linked to profound knowledge of the coercivity mechanism. Prerequisites for a promising permanent magnet material are a high spontaneous magnetization and a sufficiently high magnetic anisotropy. In addition to the intrinsic magnetic properties the microstructure of the magnet plays a significant role in establishing coercivity. The influence of the microstructure on coercivity, remanence, and energy density product can be understood by using micromagnetic simulations. With advances in computer hardware and numerical methods, hysteresis curves of magnets can be computed quickly so that the simulations can readily provide guidance for the development of permanent magnets. The potential of rare-earth reduced and rare-earth free permanent magnets is investigated using micromagnetic simulations. The results show excellent hard magnetic properties can be achieved in grain boundary engineered NdFeB, rare-earth magnets with a ThMn12 structure, Co-based nano-wires, and L10-FeNi provided that the magnet’s microstructure is optimized.

  10. Computer Graphics Instruction in VizClass

    Science.gov (United States)

    Grimes, Douglas; Warschauer, Mark; Hutchinson, Tara; Kuester, Falko

    2005-01-01

    "VizClass" is a university classroom environment designed to offer students in computer graphics and engineering courses up-to-date visualization technologies. Three digital whiteboards and a three-dimensional stereoscopic display provide complementary display surfaces. Input devices include touchscreens on the digital whiteboards, remote…

  11. GeoBrain Computational Cyber-laboratory for Earth Science Studies

    Science.gov (United States)

    Deng, M.; di, L.

    2009-12-01

    Computational approaches (e.g., computer-based data visualization, analysis and modeling) are critical for conducting increasingly data-intensive Earth science (ES) studies to understand functions and changes of the Earth system. However, Earth scientists, educators, and students currently face two major barriers that prevent them from effectively using computational approaches in their learning, research and application activities. The two barriers are: 1) difficulties in finding, obtaining, and using multi-source ES data; and 2) lack of analytic functions and computing resources (e.g., analysis software, computing models, and high performance computing systems) to analyze the data. Taking advantage of recent advances in cyberinfrastructure, Web service, and geospatial interoperability technologies, GeoBrain, a project funded by NASA, has developed a prototype computational cyber-laboratory to effectively remove the two barriers. The cyber-laboratory makes ES data and computational resources at large organizations in distributed locations available to and easily usable by the Earth science community through 1) enabling seamless discovery, access and retrieval of distributed data, 2) federating and enhancing data discovery with a catalogue federation service and a semantically-augmented catalogue service, 3) customizing data access and retrieval at user request with interoperable, personalized, and on-demand data access and services, 4) automating or semi-automating multi-source geospatial data integration, 5) developing a large number of analytic functions as value-added, interoperable, and dynamically chainable geospatial Web services and deploying them in high-performance computing facilities, 6) enabling online geospatial process modeling and execution, and 7) building a user-friendly extensible web portal for users to access the cyber-laboratory resources. Users can interactively discover the needed data and perform on-demand data analysis.

  12. A MATLAB based Distributed Real-time Simulation of Lander-Orbiter-Earth Communication for Lunar Missions

    Science.gov (United States)

    Choudhury, Diptyajit; Angeloski, Aleksandar; Ziah, Haseeb; Buchholz, Hilmar; Landsman, Andre; Gupta, Amitava; Mitra, Tiyasa

    Lunar explorations often involve use of a lunar lander, a rover [1],[2], and an orbiter which rotates around the moon with a fixed radius. The orbiters are usually lunar satellites orbiting along a polar orbit to ensure visibility with respect to the rover and the Earth Station, although with varying latency. Communication in such deep space missions is usually done using a specialized protocol like Proximity-1 [3]. MATLAB simulation of Proximity-1 has been attempted by some contemporary researchers [4] to simulate all features like transmission control, delay, etc. In this paper it is attempted to simulate, in real time, the communication between a tracking station on earth (earth station), a lunar orbiter and a lunar rover using concepts of Distributed Real-time Simulation (DRTS). The objective of the simulation is to simulate, in real time, the time-varying communication delays associated with the communicating elements, with a facility to integrate specific simulation modules to study different aspects, e.g. the response due to a specific control command from the earth station to be executed by the rover. The hardware platform comprises four single-board computers operating as stand-alone real-time systems (developed with MATLAB xPC Target and internetworked using the UDP/IP protocol). A time-triggered DRTS approach is adopted. The earth station, the orbiter and the rover are programmed as three standalone real-time processes representing the communicating elements in the system. Communication from one communicating element to another constitutes an event which passes a state message from one element to another, augmenting the state of the latter. These events are handled by an event scheduler which is the fourth real-time process. The event scheduler simulates the delay in space communication taking into consideration the distance between the communicating elements. A unique time synchronization algorithm is developed which takes into account the large latencies in space communication.
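    The event scheduler's delay model described above (delay proportional to the distance between communicating elements) can be sketched in a few lines. The distances, message format, and relay logic below are simplified assumptions, not the MATLAB xPC Target implementation; the record describes four networked real-time targets, whereas this sketch is a single-process illustration.

        # Sketch of the delay model: one-way latency = distance / speed of light.
        import heapq

        C = 299_792_458.0            # speed of light [m/s]
        EARTH_MOON = 384_400e3       # mean Earth-Moon distance [m] (assumed constant here)
        ORBITER_ROVER = 100e3        # assumed orbiter-rover slant range [m]

        event_queue = []             # (arrival_time, destination, message)

        def send(t_now, dst, distance_m, message):
            heapq.heappush(event_queue, (t_now + distance_m / C, dst, message))

        # Earth station commands the rover via the orbiter relay.
        send(0.0, "orbiter", EARTH_MOON, "MOVE 5m")
        t, dst, msg = heapq.heappop(event_queue)      # orbiter receives ~1.28 s later
        send(t, "rover", ORBITER_ROVER, msg)          # orbiter relays to the rover
        print(heapq.heappop(event_queue))             # arrival time at the rover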

  13. Internet messenger based smart virtual class learning using ubiquitous computing

    Science.gov (United States)

    Umam, K.; Mardi, S. N. S.; Hariadi, M.

    2017-06-01

    Internet messenger (IM) has become an important educational technology component in college education; IM makes it possible for students to engage in learning and collaboration in smart virtual class learning (SVCL) using ubiquitous computing. However, models of IM-based SVCL using ubiquitous computing, and empirical evidence that would favor their broad application to improve engagement and behavior, are still limited. In addition, the expectation that IM-based SVCL using ubiquitous computing could improve engagement and behavior in smart classes cannot be confirmed, because the majority of the reviewed studies followed instructional paradigms. This article aims to present a model of IM-based SVCL using ubiquitous computing and to show learners' experiences of improved engagement and behavior in learner-learner and learner-lecturer interactions. The method applied in this paper includes a design process and quantitative analysis techniques, with the purpose of identifying scenarios of ubiquitous computing and capturing the impressions of learners and lecturers about engagement and behavior and their contribution to learning.

  14. The Longitudinal Study of Computer Simulation in Learning Statistics for Hospitality College Students

    Science.gov (United States)

    Huang, Ching-Hsu

    2014-01-01

    The class quasi-experiment was conducted to determine whether using computer simulation teaching strategy enhanced student understanding of statistics concepts for students enrolled in an introductory course. One hundred and ninety-three sophomores in hospitality management department were invited as participants in this two-year longitudinal…

  15. Computational search for rare-earth free hard-magnetic materials

    Science.gov (United States)

    Flores Livas, José A.; Sharma, Sangeeta; Dewhurst, John Kay; Gross, Eberhard; MagMat Team

    2015-03-01

    It is difficult to overstate the importance of hard magnets for human life in modern times; they enter every walk of our life, from medical equipment (NMR) to transport (trains, planes, cars, etc.) to electronic appliances (from household use to computers). All the known hard magnets in use today contain rare-earth elements, extraction of which is expensive and environmentally harmful. Rare-earths are also instrumental in tipping the balance of the world economy, as most of them are mined in limited, specific parts of the world. Hence it would be ideal to have a material with the characteristics of a hard magnet but without, or at least with a reduced amount of, rare-earths. This is the main goal of our work: the search for rare-earth-free magnets. To do so we employ a combination of density functional theory and crystal prediction methods. The quantities which define a hard magnet are the magnetic anisotropy energy (MAE) and the saturation magnetization (Ms), which are the quantities we maximize in the search for an ideal magnet. In my talk I will present details of the computational search algorithm together with some potential newly discovered rare-earth-free hard magnets. J.A.F.L. acknowledges financial support from the EU's 7th Framework Marie-Curie scholarship program within the "ExMaMa" Project (329386).

  16. SURVEY SIMULATIONS OF A NEW NEAR-EARTH ASTEROID DETECTION SYSTEM

    International Nuclear Information System (INIS)

    Mainzer, A.; Bauer, J.; Giorgini, J.; Masiero, J.; Grav, T.; Conrow, T.; Cutri, R. M.; Dailey, J.; Fowler, J.; Jarrett, T.; Spahr, T.; Statler, T.; Wright, E. L.

    2015-01-01

    We have carried out simulations to predict the performance of a new space-based telescopic survey operating at thermal infrared wavelengths that seeks to discover and characterize a large fraction of the potentially hazardous near-Earth asteroid (NEA) population. Two potential architectures for the survey were considered: one located at the Earth–Sun L1 Lagrange point, and one in a Venus-trailing orbit. A sample cadence was formulated and tested, allowing for the self-follow-up necessary for objects discovered in the daytime sky on Earth. Synthetic populations of NEAs with sizes as small as 140 m in effective spherical diameter were simulated using recent determinations of their physical and orbital properties. Estimates of the instrumental sensitivity, integration times, and slew speeds were included for both architectures assuming the properties of newly developed large-format 10 μm HgCdTe detector arrays capable of operating at ∼35 K. Our simulation included the creation of a preliminary version of a moving object processing pipeline suitable for operating on the trial cadence. We tested this pipeline on a simulated sky populated with astrophysical sources such as stars and galaxies extrapolated from Spitzer Space Telescope and Wide-field Infrared Explorer data, the catalog of known minor planets (including Main Belt asteroids, comets, Jovian Trojans, planets, etc.), and the synthetic NEA model. Trial orbits were computed for simulated position-time pairs extracted from the synthetic surveys to verify that the tested cadence would result in orbits suitable for recovering objects at a later time. Our results indicate that the Earth–Sun L1 and Venus-trailing surveys achieve similar levels of integral completeness for potentially hazardous asteroids larger than 140 m; placing the telescope in an interior orbit does not yield an improvement in discovery rates. This work serves as a necessary first step for the detailed planning of a next-generation NEA survey

  17. Near-Earth Object Survey Simulation Software

    Science.gov (United States)

    Naidu, Shantanu P.; Chesley, Steven R.; Farnocchia, Davide

    2017-10-01

    There is a significant interest in Near-Earth objects (NEOs) because they pose an impact threat to Earth, offer valuable scientific information, and are potential targets for robotic and human exploration. The number of NEO discoveries has been rising rapidly over the last two decades with over 1800 being discovered last year, making the total number of known NEOs >16000. Pan-STARRS and the Catalina Sky Survey are currently the most prolific NEO surveys, having discovered >1600 NEOs between them in 2016. As next generation surveys such as Large Synoptic Survey Telescope (LSST) and the proposed Near-Earth Object Camera (NEOCam) become operational in the next decade, the discovery rate is expected to increase tremendously. Coordination between various survey telescopes will be necessary in order to optimize NEO discoveries and create a unified global NEO discovery network. We are collaborating on a community-based, open-source software project to simulate asteroid surveys to facilitate such coordination and develop strategies for improving discovery efficiency. Our effort so far has focused on development of a fast and efficient tool capable of accepting user-defined asteroid population models and telescope parameters such as a list of pointing angles and camera field-of-view, and generating an output list of detectable asteroids. The software takes advantage of the widely used and tested SPICE library and architecture developed by NASA’s Navigation and Ancillary Information Facility (Acton, 1996) for saving and retrieving asteroid trajectories and camera pointing. Orbit propagation is done using OpenOrb (Granvik et al. 2009) but future versions will allow the user to plug in a propagator of their choice. The software allows the simulation of both ground-based and space-based surveys. Performance is being tested using the Grav et al. (2011) asteroid population model and the LSST simulated survey “enigma_1189”.

  18. Simulation of quantum computers

    NARCIS (Netherlands)

    De Raedt, H; Michielsen, K; Hams, AH; Miyashita, S; Saito, K; Landau, DP; Lewis, SP; Schuttler, HB

    2001-01-01

    We describe a simulation approach to study the functioning of Quantum Computer hardware. The latter is modeled by a collection of interacting spin-1/2 objects. The time evolution of this spin system maps one-to-one to a quantum program carried out by the Quantum Computer. Our simulation software
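    The spin-1/2 mapping described above can be illustrated with a toy state-vector simulator; the sketch below (our own minimal example, not the software described in the record) builds the full 2^n-dimensional state of a small register and applies a single-qubit gate as a unitary matrix.

        # Toy state-vector simulation of a register of spin-1/2 objects (qubits).
        import numpy as np

        H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)    # Hadamard gate
        I2 = np.eye(2)

        def apply_single_qubit_gate(state, gate, target, n_qubits):
            """Apply a 2x2 gate to one qubit of an n-qubit state vector."""
            op = np.array([[1.0]])
            for q in range(n_qubits):
                op = np.kron(op, gate if q == target else I2)
            return op @ state

        n = 3
        state = np.zeros(2 ** n)
        state[0] = 1.0                                   # |000>
        state = apply_single_qubit_gate(state, H, target=0, n_qubits=n)
        print(np.round(state, 3))                        # (|000> + |100>) / sqrt(2)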

  19. Simulation of quantum computers

    NARCIS (Netherlands)

    Raedt, H. De; Michielsen, K.; Hams, A.H.; Miyashita, S.; Saito, K.

    2000-01-01

    We describe a simulation approach to study the functioning of Quantum Computer hardware. The latter is modeled by a collection of interacting spin-1/2 objects. The time evolution of this spin system maps one-to-one to a quantum program carried out by the Quantum Computer. Our simulation software

  20. Computer self-efficacy - is there a gender gap in tertiary level introductory computing classes?

    Directory of Open Access Journals (Sweden)

    Shirley Gibbs

    Full Text Available This paper explores the relationship between introductory computing students, self-efficacy, and gender. Since the use of computers has become more common, there has been speculation that the confidence and ability to use them differ between genders. Self-efficacy is an important and useful concept used to describe how a student may perceive their own ability or confidence in using and learning new technology. A survey of students in an introductory computing class has been completed intermittently since the late 1990s. Although some questions have been adapted to meet the changing technology, the aim of the survey has remained unchanged. In this study self-efficacy is measured using two self-rating questions. Students are asked to rate their confidence using a computer and also asked to give their perception of their computing knowledge. This paper examines these two aspects of a person's computer self-efficacy in order to identify any differences that may occur between genders in two introductory computing classes, one in 1999 and the other in 2012. Results from the 1999 survey are compared with those from the survey completed in 2012 and investigated to ascertain whether the perception that males were more likely to display higher computer self-efficacy levels than their female classmates does or did exist in a class of this type. Results indicate that while overall there has been a general increase in self-efficacy levels in 2012 compared with 1999, there is no significant gender gap.

  1. FPGA-accelerated simulation of computer systems

    CERN Document Server

    Angepat, Hari; Chung, Eric S; Hoe, James C; Chung, Eric S

    2014-01-01

    To date, the most common form of simulators of computer systems are software-based running on standard computers. One promising approach to improve simulation performance is to apply hardware, specifically reconfigurable hardware in the form of field programmable gate arrays (FPGAs). This manuscript describes various approaches of using FPGAs to accelerate software-implemented simulation of computer systems and selected simulators that incorporate those techniques. More precisely, we describe a simulation architecture taxonomy that incorporates a simulation architecture specifically designed f

  2. Massively parallel quantum computer simulator

    NARCIS (Netherlands)

    De Raedt, K.; Michielsen, K.; De Raedt, H.; Trieu, B.; Arnold, G.; Richter, M.; Lippert, Th.; Watanabe, H.; Ito, N.

    2007-01-01

    We describe portable software to simulate universal quantum computers on massively parallel computers. We illustrate the use of the simulation software by running various quantum algorithms on different computer architectures, such as an IBM BlueGene/L, an IBM Regatta p690+, a Hitachi SR11000/J1, and a Cray

  3. Using a Virtual Class to Demonstrate Computer-Mediated Group Dynamics Concepts

    Science.gov (United States)

    Franz, Timothy M.; Vicker, Lauren A.

    2010-01-01

    We report about an active learning demonstration designed to use a virtual class to present computer-mediated group communication course concepts to show that students can learn about these concepts in a virtual class. We designated 1 class period as a virtual rather than face-to-face class, when class members "attended" virtually using…

  4. Documenting the NASA Armstrong Flight Research Center Oblate Earth Simulation Equations of Motion and Integration Algorithm

    Science.gov (United States)

    Clarke, R.; Lintereur, L.; Bahm, C.

    2016-01-01

A desire for more complete documentation of the National Aeronautics and Space Administration (NASA) Armstrong Flight Research Center (AFRC), Edwards, California, legacy code used in the core simulation has led to this effort to fully document the oblate Earth six-degree-of-freedom equations of motion and integration algorithm. The authors of this report have taken much of the earlier work of the simulation engineering group and used it as a jumping-off point for this report. The largest addition this report makes is that each element of the equations of motion is traced back to first principles, and at no point is the reader forced to take an equation on faith alone. There are no discoveries of previously unknown principles contained in this report; this report is a collection and presentation of textbook principles. The value of this report is that those textbook principles are herein documented in standard nomenclature that matches the form of the computer code DERIVC. Previous handwritten notes are much of the backbone of this work; however, in almost every area, derivations are explicitly shown to assure the reader that the equations which make up the oblate Earth version of the computer routine, DERIVC, are correct.

  5. Fast Eigensolver for Computing 3D Earth's Normal Modes

    Science.gov (United States)

    Shi, J.; De Hoop, M. V.; Li, R.; Xi, Y.; Saad, Y.

    2017-12-01

We present a novel parallel computational approach to compute Earth's normal modes. We discretize Earth via an unstructured tetrahedral mesh and apply the continuous Galerkin finite element method to the elasto-gravitational system. To resolve the eigenvalue pollution issue, following the analysis separating the seismic point spectrum, we explicitly utilize a representation of the displacement for describing the oscillations of the non-seismic modes in the fluid outer core. Effectively, we separate out the essential spectrum, which is naturally related to the Brunt-Väisälä frequency. We introduce two Lanczos approaches with polynomial and rational filtering for solving this generalized eigenvalue problem in prescribed intervals. The polynomial filtering technique only accesses the matrix pair through matrix-vector products and is an ideal candidate for solving three-dimensional large-scale eigenvalue problems. The matrix-free scheme allows us to deal with fluid separation and self-gravitation in an efficient way, while the standard shift-and-invert method typically needs an explicit shifted matrix and its factorization. The rational filtering method converges much faster than the standard shift-and-invert procedure when computing all the eigenvalues inside an interval. Both Lanczos approaches solve for the interior eigenvalues extremely accurately compared with the standard eigensolver. In our computational experiments, we compare our results with the radial earth model benchmark, and visualize the normal modes using vector plots to illustrate the properties of the displacements in different modes.

  6. A fast exact simulation method for a class of Markov jump processes.

    Science.gov (United States)

    Li, Yao; Hu, Lili

    2015-11-14

    A new method of the stochastic simulation algorithm (SSA), named the Hashing-Leaping method (HLM), for exact simulations of a class of Markov jump processes, is presented in this paper. The HLM has a conditional constant computational cost per event, which is independent of the number of exponential clocks in the Markov process. The main idea of the HLM is to repeatedly implement a hash-table-like bucket sort algorithm for all times of occurrence covered by a time step with length τ. This paper serves as an introduction to this new SSA method. We introduce the method, demonstrate its implementation, analyze its properties, and compare its performance with three other commonly used SSA methods in four examples. Our performance tests and CPU operation statistics show certain advantages of the HLM for large scale problems.
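    As a point of reference for the comparison described above, the sketch below shows a minimal direct-method SSA (Gillespie-style) loop in Python, whose per-event cost grows with the number of exponential clocks — exactly the cost the Hashing-Leaping method is designed to avoid. The rate/update interface and the birth-death example are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def direct_ssa(rates, update, state, t_end, rng=None):
    """Direct-method SSA: selecting the next event costs O(#clocks).

    rates(state)     -> array of clock rates (assumed interface)
    update(state, k) -> new state after clock k fires
    """
    rng = np.random.default_rng() if rng is None else rng
    t = 0.0
    while t < t_end:
        r = rates(state)
        total = r.sum()
        if total <= 0:
            break
        t += rng.exponential(1.0 / total)      # waiting time to the next event
        k = rng.choice(len(r), p=r / total)    # which exponential clock fires
        state = update(state, k)
    return state

# toy birth-death process (assumed example, not from the paper)
birth, death = 2.0, 0.1
final = direct_ssa(
    rates=lambda n: np.array([birth, death * n]),
    update=lambda n, k: n + 1 if k == 0 else n - 1,
    state=10, t_end=100.0)
print(final)
```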

  7. Beat the Bourgeoisie: A Social Class Inequality and Mobility Simulation Game

    Science.gov (United States)

    Norris, Dawn R.

    2013-01-01

    Simulation games can help overcome student resistance to thinking structurally about social class inequality, meritocracy, and mobility. Most inequality simulations focus solely on economic inequality and omit social and cultural capital, both of which contribute to social class reproduction. Using a pretest/posttest design, the current study…

  8. Biomass Gasifier for Computer Simulation; Biomassa foergasare foer Computer Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Hansson, Jens; Leveau, Andreas; Hulteberg, Christian [Nordlight AB, Limhamn (Sweden)

    2011-08-15

This report is an effort to summarize the existing data on biomass gasifiers, as the authors have taken part in various projects aiming at computer simulations of systems that include biomass gasification. Reliable input data is paramount for any computer simulation, but so far there is no easily accessible biomass gasifier database available for this purpose. This study aims at benchmarking current and past gasifier systems in order to create a comprehensive database for computer simulation purposes. The result of the investigation is presented in a Microsoft Excel sheet, so that the user can easily implement the data in their specific model. In addition to providing simulation data, the technology is described briefly for every studied gasifier system. The primary pieces of information sought are temperatures, pressures, stream compositions and energy consumption. At present the resulting database contains 17 gasifiers, with one or more gasifiers within each of the gasification technology types normally discussed in this context: 1. Fixed bed 2. Fluidised bed 3. Entrained flow. It also contains gasifiers in the range from 100 kW to 120 MW, with several gasifiers in between these two values. Finally, there are gasifiers representing both direct and indirect heating. This allows for a more qualified and more readily available choice of starting data sets for simulations. In addition, with multiple data sets available for several of the operating modes, sensitivity analysis of various inputs will improve the simulations performed. There have been fewer answers to the survey than expected, which could have improved the database further; however, the use of online sources and other public information has to some extent counterbalanced the low response frequency of the survey. Moreover, the database is intended to be a living document, continuously updated with new gasifiers and improved information on existing gasifiers.

  9. Fast Monte Carlo-assisted simulation of cloudy Earth backgrounds

    Science.gov (United States)

    Adler-Golden, Steven; Richtsmeier, Steven C.; Berk, Alexander; Duff, James W.

    2012-11-01

A calculation method has been developed for rapidly synthesizing radiometrically accurate ultraviolet through long-wavelength-infrared spectral imagery of the Earth for arbitrary locations and cloud fields. The method combines cloud-free surface reflectance imagery with cloud radiance images calculated from a first-principles 3-D radiation transport model. The MCScene Monte Carlo code [1-4] is used to build a cloud image library; a data fusion method is incorporated to speed convergence. The surface and cloud images are combined with an upper atmospheric description with the aid of solar and thermal radiation transport equations that account for atmospheric inhomogeneity. The method enables a wide variety of sensor and sun locations, cloud fields, and surfaces to be combined on-the-fly, and provides hyperspectral wavelength resolution with minimal computational effort. The simulations agree very well with much more time-consuming direct Monte Carlo calculations of the same scene.

  10. Teaching with simulations

    NARCIS (Netherlands)

    Rutten, N.P.G.

    2014-01-01

    This dissertation focuses on whole-class science teaching with computer simulations. Computer simulations display dynamic, visual representations of natural phenomena and can make a great contribution to the science classroom. Simulations can be used in multiple ways. Teachers who have an

  11. The application of neural network integrated with genetic algorithm and simulated annealing for the simulation of rare earths separation processes by the solvent extraction technique using EHEHPA agent

    International Nuclear Information System (INIS)

    Tran Ngoc Ha; Pham Thi Hong Ha

    2003-01-01

In the present work, a neural network has been used for mathematically modeling the equilibrium data of a mixture of two rare earth elements, namely Nd and Pr, with PC88A agent. A thermo-genetic algorithm, based on the ideas of the genetic algorithm and the simulated annealing algorithm, has been used in the training procedure of the neural networks, giving better results in comparison with the traditional modeling approach. The obtained neural network modeling the experimental data is further used in a computer program to simulate the solvent extraction process of the two elements Nd and Pr. Based on this computer program, various optional schemes for the separation of Nd and Pr have been investigated and proposed. (author)

  12. Computer simulation of probability of detection

    International Nuclear Information System (INIS)

    Fertig, K.W.; Richardson, J.M.

    1983-01-01

This paper describes an integrated model for assessing the performance of a given ultrasonic inspection system for detecting internal flaws, where the performance of such a system is measured by probability of detection. The effects of real part geometries on sound propagation are accounted for, and the noise spectra due to various noise mechanisms are measured. An ultrasonic inspection simulation computer code has been developed to be able to detect flaws with attributes ranging over an extensive class. The detection decision is considered to be a binary decision based on one received waveform obtained in a pulse-echo or pitch-catch setup. This study focuses on the detectability of flaws using an amplitude-thresholding criterion. Some preliminary results on the detectability of radially oriented cracks in IN-100 for bore-like geometries are given
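    The sketch below illustrates how a probability of detection (POD) and a false-call rate can be estimated for an amplitude-thresholding detector from simulated peak amplitudes. The Gaussian amplitude distributions for flawed and unflawed waveforms are assumptions made for illustration only, not the physics-based models of the code described above.

```python
import numpy as np

def pod_amplitude_threshold(signal_peaks, noise_peaks, threshold):
    """Estimate POD and false-call probability for a fixed amplitude threshold."""
    pod = np.mean(signal_peaks > threshold)
    pfa = np.mean(noise_peaks > threshold)
    return pod, pfa

rng = np.random.default_rng(0)
# assumed peak-amplitude distributions for flawed and unflawed waveforms
flaw = rng.normal(loc=1.0, scale=0.3, size=10_000)
noise = rng.normal(loc=0.4, scale=0.2, size=10_000)

for thr in (0.6, 0.8, 1.0):
    pod, pfa = pod_amplitude_threshold(flaw, noise, thr)
    print(f"threshold={thr:.1f}  POD={pod:.3f}  Pfa={pfa:.3f}")
```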

  13. 25th Space Simulation Conference. Environmental Testing: The Earth-Space Connection

    Science.gov (United States)

    Packard, Edward

    2008-01-01

    Topics covered include: Methods of Helium Injection and Removal for Heat Transfer Augmentation; The ESA Large Space Simulator Mechanical Ground Support Equipment for Spacecraft Testing; Temperature Stability and Control Requirements for Thermal Vacuum/Thermal Balance Testing of the Aquarius Radiometer; The Liquid Nitrogen System for Chamber A: A Change from Original Forced Flow Design to a Natural Flow (Thermo Siphon) System; Return to Mercury: A Comparison of Solar Simulation and Flight Data for the MESSENGER Spacecraft; Floating Pressure Conversion and Equipment Upgrades of Two 3.5kw, 20k, Helium Refrigerators; Affect of Air Leakage into a Thermal-Vacuum Chamber on Helium Refrigeration Heat Load; Special ISO Class 6 Cleanroom for the Lunar Reconnaissance Orbiter (LRO) Project; A State-of-the-Art Contamination Effects Research and Test Facility Martian Dust Simulator; Cleanroom Design Practices and Their Influence on Particle Counts; Extra Terrestrial Environmental Chamber Design; Contamination Sources Effects Analysis (CSEA) - A Tool to Balance Cost/Schedule While Managing Facility Availability; SES and Acoustics at GSFC; HST Super Lightweight Interchangeable Carrier (SLIC) Static Test; Virtual Shaker Testing: Simulation Technology Improves Vibration Test Performance; Estimating Shock Spectra: Extensions beyond GEVS; Structural Dynamic Analysis of a Spacecraft Multi-DOF Shaker Table; Direct Field Acoustic Testing; Manufacture of Cryoshroud Surfaces for Space Simulation Chambers; The New LOTIS Test Facility; Thermal Vacuum Control Systems Options for Test Facilities; Extremely High Vacuum Chamber for Low Outgassing Processing at NASA Goddard; Precision Cleaning - Path to Premier; The New Anechoic Shielded Chambers Designed for Space and Commercial Applications at LIT; Extraction of Thermal Performance Values from Samples in the Lunar Dust Adhesion Bell Jar; Thermal (Silicon Diode) Data Acquisition System; Aquarius's Instrument Science Data System (ISDS) Automated

  14. Running climate model on a commercial cloud computing environment: A case study using Community Earth System Model (CESM) on Amazon AWS

    Science.gov (United States)

    Chen, Xiuhong; Huang, Xianglei; Jiao, Chaoyi; Flanner, Mark G.; Raeker, Todd; Palen, Brock

    2017-01-01

    The suites of numerical models used for simulating climate of our planet are usually run on dedicated high-performance computing (HPC) resources. This study investigates an alternative to the usual approach, i.e. carrying out climate model simulations on commercially available cloud computing environment. We test the performance and reliability of running the CESM (Community Earth System Model), a flagship climate model in the United States developed by the National Center for Atmospheric Research (NCAR), on Amazon Web Service (AWS) EC2, the cloud computing environment by Amazon.com, Inc. StarCluster is used to create virtual computing cluster on the AWS EC2 for the CESM simulations. The wall-clock time for one year of CESM simulation on the AWS EC2 virtual cluster is comparable to the time spent for the same simulation on a local dedicated high-performance computing cluster with InfiniBand connections. The CESM simulation can be efficiently scaled with the number of CPU cores on the AWS EC2 virtual cluster environment up to 64 cores. For the standard configuration of the CESM at a spatial resolution of 1.9° latitude by 2.5° longitude, increasing the number of cores from 16 to 64 reduces the wall-clock running time by more than 50% and the scaling is nearly linear. Beyond 64 cores, the communication latency starts to outweigh the benefit of distributed computing and the parallel speedup becomes nearly unchanged.
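    The strong-scaling bookkeeping behind such a comparison is simple enough to sketch; the wall-clock numbers below are placeholders, not measurements from the study, and only illustrate how speedup and parallel efficiency are derived from timings at different core counts.

```python
def strong_scaling(wall_times):
    """Compute speedup and parallel efficiency relative to the smallest core count."""
    base_cores = min(wall_times)
    base_time = wall_times[base_cores]
    rows = []
    for cores in sorted(wall_times):
        speedup = base_time / wall_times[cores]
        efficiency = speedup / (cores / base_cores)
        rows.append((cores, wall_times[cores], speedup, efficiency))
    return rows

# placeholder wall-clock hours per simulated year (not the paper's data)
times = {16: 10.0, 32: 5.6, 64: 4.3, 128: 4.2}
for cores, t, s, e in strong_scaling(times):
    print(f"{cores:4d} cores  {t:5.1f} h  speedup {s:4.2f}  efficiency {e:4.2f}")
```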

  15. Computer simulation of a clinical magnet resonance tomography scanner for training purposes

    International Nuclear Information System (INIS)

    Hacklaender, T.; Mertens, H.; Cramer, B.M.

    2004-01-01

Purpose: The idea for this project was born of the necessity to offer medical students an easy approach to the theoretical basics of magnetic resonance imaging. The aim was to simulate the features and functions of such a scanner on a commercially available computer by means of a computer program. Materials and Methods: The simulation was programmed in pure Java under the GNU General Public License and is freely available for a commercially available computer with a Windows, Macintosh or Linux operating system. The graphic user interface is oriented to a real scanner. In an external program, parameter images for the proton density and the relaxation times T1 and T2 are calculated on the basis of clinical examinations. From these, the image calculation is carried out in the simulation program pixel by pixel on the basis of a pulse sequence chosen and modified by the user. The images can be stored and printed. In addition, it is possible to display and modify k-space images. Results: Seven classes of pulse sequences are implemented and up to 14 relevant sequence parameters, such as repetition time and echo time, can be altered. Aliasing and motion artifacts can be simulated. As the image calculation only takes a few seconds, interactive working is possible. (orig.)
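    The pixel-by-pixel calculation described above can be illustrated with the standard textbook spin-echo signal equation, S = PD · (1 − exp(−TR/T1)) · exp(−TE/T2); the sketch below uses this generic model and tiny synthetic parameter maps, and is not taken from the program's source code.

```python
import numpy as np

def spin_echo_image(pd, t1, t2, tr, te):
    """Pixel-wise spin-echo signal: S = PD * (1 - exp(-TR/T1)) * exp(-TE/T2)."""
    return pd * (1.0 - np.exp(-tr / t1)) * np.exp(-te / t2)

# tiny synthetic parameter maps (times in ms, arbitrary two-tissue phantom)
pd = np.array([[0.8, 1.0], [1.0, 0.9]])
t1 = np.array([[900.0, 4000.0], [1200.0, 600.0]])
t2 = np.array([[100.0, 2000.0], [80.0, 60.0]])

t1_weighted = spin_echo_image(pd, t1, t2, tr=500.0, te=15.0)   # short TR/TE
t2_weighted = spin_echo_image(pd, t1, t2, tr=4000.0, te=100.0) # long TR/TE
print(t1_weighted)
print(t2_weighted)
```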

  16. Distributed simulation of large computer systems

    International Nuclear Information System (INIS)

    Marzolla, M.

    2001-01-01

Sequential simulation of large complex physical systems is often regarded as a computationally expensive task. In order to speed up complex discrete-event simulations, the paradigm of Parallel and Distributed Discrete Event Simulation (PDES) has been introduced since the late 70s. The authors analyze the applicability of PDES to the modeling and analysis of large computer systems; such systems are increasingly common in the area of High Energy and Nuclear Physics, because many modern experiments make use of large 'compute farms'. Some feasibility tests have been performed on a prototype distributed simulator

  17. LEGO - A Class Library for Accelerator Design and Simulation

    International Nuclear Information System (INIS)

    Cai, Yunhai

    1998-01-01

An object-oriented class library for accelerator design and simulation is designed and implemented in a simple and modular fashion. All physics of single-particle dynamics is implemented based on the Hamiltonian in the local frame of the component. Symplectic integrators are used to approximate the integration of the Hamiltonian. A differential algebra class is introduced to extract a Taylor map up to arbitrary order. Analysis of optics is done in the same way for both the linear and non-linear cases. Recently, Monte Carlo simulation of synchrotron radiation has been added to the library. The code is used to design and simulate the lattices of PEP-II and SPEAR3, and it is also used for the commissioning of PEP-II. Some examples of how to use the library will be given
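    To illustrate the symplectic-integration idea mentioned above, here is a minimal leapfrog (velocity-Verlet) sketch for a separable Hamiltonian H = p²/2 + V(q). The one-dimensional oscillator is an illustrative stand-in, not a component Hamiltonian from the LEGO library itself.

```python
def leapfrog(q, p, dV, dt, steps):
    """Second-order symplectic integrator for H = p^2/2 + V(q)."""
    for _ in range(steps):
        p -= 0.5 * dt * dV(q)   # half kick
        q += dt * p             # drift
        p -= 0.5 * dt * dV(q)   # half kick
    return q, p

# simple harmonic oscillator as a stand-in potential: V(q) = q^2 / 2
q, p = leapfrog(q=1.0, p=0.0, dV=lambda q: q, dt=0.1, steps=1000)
energy = 0.5 * p**2 + 0.5 * q**2
print(q, p, energy)   # the energy stays close to 0.5 even over long integrations
```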

  18. Facilitating NASA Earth Science Data Processing Using Nebula Cloud Computing

    Science.gov (United States)

    Chen, A.; Pham, L.; Kempler, S.; Theobald, M.; Esfandiari, A.; Campino, J.; Vollmer, B.; Lynnes, C.

    2011-12-01

    Cloud Computing technology has been used to offer high-performance and low-cost computing and storage resources for both scientific problems and business services. Several cloud computing services have been implemented in the commercial arena, e.g. Amazon's EC2 & S3, Microsoft's Azure, and Google App Engine. There are also some research and application programs being launched in academia and governments to utilize Cloud Computing. NASA launched the Nebula Cloud Computing platform in 2008, which is an Infrastructure as a Service (IaaS) to deliver on-demand distributed virtual computers. Nebula users can receive required computing resources as a fully outsourced service. NASA Goddard Earth Science Data and Information Service Center (GES DISC) migrated several GES DISC's applications to the Nebula as a proof of concept, including: a) The Simple, Scalable, Script-based Science Processor for Measurements (S4PM) for processing scientific data; b) the Atmospheric Infrared Sounder (AIRS) data process workflow for processing AIRS raw data; and c) the GES-DISC Interactive Online Visualization ANd aNalysis Infrastructure (GIOVANNI) for online access to, analysis, and visualization of Earth science data. This work aims to evaluate the practicability and adaptability of the Nebula. The initial work focused on the AIRS data process workflow to evaluate the Nebula. The AIRS data process workflow consists of a series of algorithms being used to process raw AIRS level 0 data and output AIRS level 2 geophysical retrievals. Migrating the entire workflow to the Nebula platform is challenging, but practicable. After installing several supporting libraries and the processing code itself, the workflow is able to process AIRS data in a similar fashion to its current (non-cloud) configuration. We compared the performance of processing 2 days of AIRS level 0 data through level 2 using a Nebula virtual computer and a local Linux computer. The result shows that Nebula has significantly

  19. Computer Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Pronskikh, V. S. [Fermilab

    2014-05-09

Verification and validation of computer codes and models used in simulation are two aspects of scientific practice of high importance that have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to the model's relation to the real world and its intended use. It has been argued that because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model's general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them) needs to be made. Holding on to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement, I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviating the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes

  20. Cluster computing software for GATE simulations

    International Nuclear Information System (INIS)

    Beenhouwer, Jan de; Staelens, Steven; Kruecker, Dirk; Ferrer, Ludovic; D'Asseler, Yves; Lemahieu, Ignace; Rannou, Fernando R.

    2007-01-01

    Geometry and tracking (GEANT4) is a Monte Carlo package designed for high energy physics experiments. It is used as the basis layer for Monte Carlo simulations of nuclear medicine acquisition systems in GEANT4 Application for Tomographic Emission (GATE). GATE allows the user to realistically model experiments using accurate physics models and time synchronization for detector movement through a script language contained in a macro file. The downside of this high accuracy is long computation time. This paper describes a platform independent computing approach for running GATE simulations on a cluster of computers in order to reduce the overall simulation time. Our software automatically creates fully resolved, nonparametrized macros accompanied with an on-the-fly generated cluster specific submit file used to launch the simulations. The scalability of GATE simulations on a cluster is investigated for two imaging modalities, positron emission tomography (PET) and single photon emission computed tomography (SPECT). Due to a higher sensitivity, PET simulations are characterized by relatively high data output rates that create rather large output files. SPECT simulations, on the other hand, have lower data output rates but require a long collimator setup time. Both of these characteristics hamper scalability as a function of the number of CPUs. The scalability of PET simulations is improved here by the development of a fast output merger. The scalability of SPECT simulations is improved by greatly reducing the collimator setup time. Accordingly, these two new developments result in higher scalability for both PET and SPECT simulations and reduce the computation time to more practical values

  1. Parallel reservoir simulator computations

    International Nuclear Information System (INIS)

    Hemanth-Kumar, K.; Young, L.C.

    1995-01-01

    The adaptation of a reservoir simulator for parallel computations is described. The simulator was originally designed for vector processors. It performs approximately 99% of its calculations in vector/parallel mode and relative to scalar calculations it achieves speedups of 65 and 81 for black oil and EOS simulations, respectively on the CRAY C-90

  2. Computer simulation of ductile fracture

    International Nuclear Information System (INIS)

    Wilkins, M.L.; Streit, R.D.

    1979-01-01

    Finite difference computer simulation programs are capable of very accurate solutions to problems in plasticity with large deformations and rotation. This opens the possibility of developing models of ductile fracture by correlating experiments with equivalent computer simulations. Selected experiments were done to emphasize different aspects of the model. A difficult problem is the establishment of a fracture-size effect. This paper is a study of the strain field around notched tensile specimens of aluminum 6061-T651. A series of geometrically scaled specimens are tested to fracture. The scaled experiments are conducted for different notch radius-to-diameter ratios. The strains at fracture are determined from computer simulations. An estimate is made of the fracture-size effect

  3. Simulating chemistry using quantum computers.

    Science.gov (United States)

    Kassal, Ivan; Whitfield, James D; Perdomo-Ortiz, Alejandro; Yung, Man-Hong; Aspuru-Guzik, Alán

    2011-01-01

    The difficulty of simulating quantum systems, well known to quantum chemists, prompted the idea of quantum computation. One can avoid the steep scaling associated with the exact simulation of increasingly large quantum systems on conventional computers, by mapping the quantum system to another, more controllable one. In this review, we discuss to what extent the ideas in quantum computation, now a well-established field, have been applied to chemical problems. We describe algorithms that achieve significant advantages for the electronic-structure problem, the simulation of chemical dynamics, protein folding, and other tasks. Although theory is still ahead of experiment, we outline recent advances that have led to the first chemical calculations on small quantum information processors.

  4. Multi-objective optimization of GENIE Earth system models.

    Science.gov (United States)

    Price, Andrew R; Myerscough, Richard J; Voutchkov, Ivan I; Marsh, Robert; Cox, Simon J

    2009-07-13

    The tuning of parameters in climate models is essential to provide reliable long-term forecasts of Earth system behaviour. We apply a multi-objective optimization algorithm to the problem of parameter estimation in climate models. This optimization process involves the iterative evaluation of response surface models (RSMs), followed by the execution of multiple Earth system simulations. These computations require an infrastructure that provides high-performance computing for building and searching the RSMs and high-throughput computing for the concurrent evaluation of a large number of models. Grid computing technology is therefore essential to make this algorithm practical for members of the GENIE project.
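    The surrogate-assisted loop sketched below illustrates the iterative response-surface idea for a single objective: fit a cheap model to the points evaluated so far, propose its minimizer, and re-evaluate. The quadratic surrogate, the toy misfit function, and the one-dimensional parameter are assumptions for illustration; the multi-objective and grid-computing aspects of the GENIE work are omitted.

```python
import numpy as np

def toy_misfit(x):                      # stand-in for an expensive climate-model misfit
    return (x - 0.3) ** 2 + 0.05 * np.sin(20 * x)

# initial design of experiments
xs = list(np.linspace(0.0, 1.0, 5))
ys = [toy_misfit(x) for x in xs]

for _ in range(10):
    a, b, c = np.polyfit(xs, ys, 2)     # quadratic response surface model (RSM)
    x_new = np.clip(-b / (2 * a), 0.0, 1.0) if a > 0 else np.random.uniform(0.0, 1.0)
    xs.append(float(x_new))
    ys.append(toy_misfit(float(x_new))) # "run the simulation" at the proposed point

best = xs[int(np.argmin(ys))]
print(f"best parameter ~ {best:.3f}, misfit ~ {min(ys):.4f}")
```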

  5. Efficient performance simulation of class D amplifier output stages

    DEFF Research Database (Denmark)

    Nyboe, Flemming; Risbo, Lars; Andreani, Pietro

    2005-01-01

Straightforward simulation of amplifier distortion involves transient simulation of operation on a sine wave input signal, and a subsequent FFT of the output voltage. This approach is very slow on class D amplifiers, since the switching behavior forces simulation time steps that are many orders of magnitude smaller than the duration of one period of an audio sine wave. This work presents a method of simulating the amplifier transfer characteristic using a minimum amount of simulation time, and then deriving THD from the results.
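    The final step of the straightforward approach — extracting THD from an FFT of the output waveform — is sketched below. The distorted test signal is synthetic (a 1 kHz tone with an added 1% third harmonic) and merely stands in for a transient-simulation result; it is not derived from the paper.

```python
import numpy as np

def thd_from_waveform(x, fs, f0, n_harmonics=10):
    """THD from an FFT: ratio of harmonic RMS sum to the fundamental amplitude."""
    n = len(x)
    spectrum = np.abs(np.fft.rfft(x * np.hanning(n))) / n
    bin_of = lambda f: int(round(f * n / fs))
    fundamental = spectrum[bin_of(f0)]
    harmonics = [spectrum[bin_of(k * f0)] for k in range(2, n_harmonics + 1)]
    return np.sqrt(np.sum(np.square(harmonics))) / fundamental

fs, f0 = 48_000, 1_000                    # sample rate and test-tone frequency (assumed)
t = np.arange(fs) / fs                    # one second of signal
x = np.sin(2 * np.pi * f0 * t) + 0.01 * np.sin(2 * np.pi * 3 * f0 * t)
print(f"THD = {100 * thd_from_waveform(x, fs, f0):.2f} %")   # ~1 %
```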

  6. Hybrid Cloud Computing Environment for EarthCube and Geoscience Community

    Science.gov (United States)

    Yang, C. P.; Qin, H.

    2016-12-01

The NSF EarthCube Integration and Test Environment (ECITE) has built a hybrid cloud computing environment that provides cloud resources from private cloud environments using the cloud system software OpenStack and Eucalyptus, and also manages a public cloud - Amazon Web Services (AWS) - allowing resource synchronization and bursting between private and public clouds. On the ECITE hybrid cloud platform, the EarthCube and geoscience communities can deploy and manage applications using base virtual machine images or customized virtual machines, analyze big datasets using virtual clusters, and monitor virtual resource usage on the cloud in real time. Currently, a number of EarthCube projects have deployed or started migrating their projects to this platform, such as CHORDS, BCube, CINERGI, OntoSoft, and some other EarthCube building blocks. To accomplish a deployment or migration, the administrator of the ECITE hybrid cloud platform prepares the specific needs (e.g. images, port numbers, usable cloud capacity) of each project in advance, based on communication between ECITE and the participating projects; the scientists or IT technicians in those projects then launch one or more virtual machines, access the virtual machine(s) to set up the computing environment if need be, and migrate their code, documents or data without worrying about the heterogeneity in structure and operations among different cloud platforms.

  7. Applying sensor web strategies to big data earth observations

    CSIR Research Space (South Africa)

    Van Zyl, TL

    2013-07-01

    Full Text Available Earth observation data and meta-data are a central concern of the earth sciences. These data are generated by a myriad of both in-situ and remote sensors. Other sources of data include computational simulations, various ex-situ sources...

  8. HTTR plant dynamic simulation using a hybrid computer

    International Nuclear Information System (INIS)

    Shimazaki, Junya; Suzuki, Katsuo; Nabeshima, Kunihiko; Watanabe, Koichi; Shinohara, Yoshikuni; Nakagawa, Shigeaki.

    1990-01-01

A plant dynamic simulation of the High-Temperature Engineering Test Reactor (HTTR) has been made using a new-type hybrid computer. This report describes a dynamic simulation model of the HTTR, a hybrid simulation method for SIMSTAR and some results obtained from dynamics analysis of the HTTR simulation. It concludes that the hybrid plant simulation is useful for on-line simulation on account of its capability for high-speed computation compared with an all-digital computer simulation. With sufficient accuracy, computation 40 times faster than real time was reached simply by changing the analog time scale of the HTTR simulation. (author)

  9. GPU-accelerated micromagnetic simulations using cloud computing

    Energy Technology Data Exchange (ETDEWEB)

    Jermain, C.L., E-mail: clj72@cornell.edu [Cornell University, Ithaca, NY 14853 (United States); Rowlands, G.E.; Buhrman, R.A. [Cornell University, Ithaca, NY 14853 (United States); Ralph, D.C. [Cornell University, Ithaca, NY 14853 (United States); Kavli Institute at Cornell, Ithaca, NY 14853 (United States)

    2016-03-01

    Highly parallel graphics processing units (GPUs) can improve the speed of micromagnetic simulations significantly as compared to conventional computing using central processing units (CPUs). We present a strategy for performing GPU-accelerated micromagnetic simulations by utilizing cost-effective GPU access offered by cloud computing services with an open-source Python-based program for running the MuMax3 micromagnetics code remotely. We analyze the scaling and cost benefits of using cloud computing for micromagnetics. - Highlights: • The benefits of cloud computing for GPU-accelerated micromagnetics are examined. • We present the MuCloud software for running simulations on cloud computing. • Simulation run times are measured to benchmark cloud computing performance. • Comparison benchmarks are analyzed between CPU and GPU based solvers.

  10. GPU-accelerated micromagnetic simulations using cloud computing

    International Nuclear Information System (INIS)

    Jermain, C.L.; Rowlands, G.E.; Buhrman, R.A.; Ralph, D.C.

    2016-01-01

    Highly parallel graphics processing units (GPUs) can improve the speed of micromagnetic simulations significantly as compared to conventional computing using central processing units (CPUs). We present a strategy for performing GPU-accelerated micromagnetic simulations by utilizing cost-effective GPU access offered by cloud computing services with an open-source Python-based program for running the MuMax3 micromagnetics code remotely. We analyze the scaling and cost benefits of using cloud computing for micromagnetics. - Highlights: • The benefits of cloud computing for GPU-accelerated micromagnetics are examined. • We present the MuCloud software for running simulations on cloud computing. • Simulation run times are measured to benchmark cloud computing performance. • Comparison benchmarks are analyzed between CPU and GPU based solvers.

  11. Computer Simulation Western

    International Nuclear Information System (INIS)

    Rasmussen, H.

    1992-01-01

    Computer Simulation Western is a unit within the Department of Applied Mathematics at the University of Western Ontario. Its purpose is the development of computational and mathematical methods for practical problems in industry and engineering and the application and marketing of such methods. We describe the unit and our efforts at obtaining research and development grants. Some representative projects will be presented and future plans discussed. (author)

  12. [Computer simulation of a clinical magnet resonance tomography scanner for training purposes].

    Science.gov (United States)

    Hackländer, T; Mertens, H; Cramer, B M

    2004-08-01

The idea for this project was born of the necessity to offer medical students an easy approach to the theoretical basics of magnetic resonance imaging. The aim was to simulate the features and functions of such a scanner on a commercially available computer by means of a computer program. The simulation was programmed in pure Java under the GNU General Public License and is freely available for a commercially available computer with a Windows, Macintosh or Linux operating system. The graphic user interface is oriented to a real scanner. In an external program, parameter images for the proton density and the relaxation times T1 and T2 are calculated on the basis of clinical examinations. From these, the image calculation is carried out in the simulation program pixel by pixel on the basis of a pulse sequence chosen and modified by the user. The images can be stored and printed. In addition, it is possible to display and modify k-space images. Seven classes of pulse sequences are implemented and up to 14 relevant sequence parameters, such as repetition time and echo time, can be altered. Aliasing and motion artifacts can be simulated. As the image calculation only takes a few seconds, interactive working is possible. The simulation has been used in university education for more than 1 year, successfully illustrating the dependence of MR images on the measurement parameters. This should facilitate students' approach to understanding MR imaging in the future.

  13. TEST BED FOR THE SIMULATION OF MAGNETIC FIELD MEASUREMENTS OF LOW EARTH ORBIT SATELLITES

    Directory of Open Access Journals (Sweden)

    Alberto Gallina

    2018-03-01

Full Text Available The paper presents a test bed designed to simulate the magnetic environment experienced by a spacecraft in low Earth orbit. It consists of a spherical air bearing located inside a Helmholtz cage. The spherical air bearing is used for simulating the microgravity conditions of orbiting bodies, while the Helmholtz cage generates a controllable magnetic field resembling the one surrounding a satellite during its motion. Dedicated computer software is used to initially calculate the magnetic field on an established orbit. The magnetic field data is then translated into current values and transmitted to the programmable power supplies energizing the cage. The magnetic field within the cage is finally measured by a test article mounted on the air bearing. The paper provides a description of the test bed and the test article design. An experimental test demonstrates the good performance of the entire system.
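    The field-to-current translation step can be sketched with the standard on-axis Helmholtz-pair formula, B = (4/5)^(3/2) · μ0 · N · I / R per axis. The coil turns, radius, and the sample field vector below are placeholders for illustration, not the actual parameters of the test bed described above.

```python
import numpy as np

MU0 = 4e-7 * np.pi  # vacuum permeability [T*m/A]

def helmholtz_currents(b_target, n_turns, radius):
    """Coil current per axis for a target field vector (uniform-region approximation)."""
    tesla_per_ampere = (4.0 / 5.0) ** 1.5 * MU0 * n_turns / radius
    return np.asarray(b_target) / tesla_per_ampere

# placeholder cage geometry and a representative LEO field vector in tesla
b_leo = np.array([18e-6, -5e-6, 40e-6])     # tens of microtesla per axis
currents = helmholtz_currents(b_leo, n_turns=50, radius=0.5)
print(currents)  # amperes per axis, to be sent to the programmable supplies
```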

  14. General-purpose parallel simulator for quantum computing

    International Nuclear Information System (INIS)

    Niwa, Jumpei; Matsumoto, Keiji; Imai, Hiroshi

    2002-01-01

With current technologies, it seems to be very difficult to implement quantum computers with many qubits. It is therefore of importance to simulate quantum algorithms and circuits on existing computers. However, for a large-size problem, the simulation often requires more computational power than is available from sequential processing. Therefore, simulation methods for parallel processors are required. We have developed a general-purpose simulator for quantum algorithms/circuits on the parallel computer (Sun Enterprise4500). It can simulate algorithms/circuits with up to 30 qubits. In order to test the efficiency of our proposed methods, we have simulated Shor's factorization algorithm and Grover's database search, and we have analyzed the robustness of the corresponding quantum circuits in the presence of both decoherence and operational errors. The corresponding results, statistics, and analyses are presented in this paper
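    The core operation such simulators perform — applying a gate to a 2^n-element state vector — is easy to sketch. The code below shows the generic state-vector technique for a single-qubit gate, not the specific data layout or parallelization scheme of the simulator described above.

```python
import numpy as np

def apply_single_qubit_gate(state, gate, target, n_qubits):
    """Apply a 2x2 gate to the `target` qubit of an n-qubit state vector."""
    psi = state.reshape([2] * n_qubits)        # one tensor axis per qubit
    psi = np.moveaxis(psi, target, 0)          # bring the target axis to the front
    psi = np.tensordot(gate, psi, axes=([1], [0]))
    psi = np.moveaxis(psi, 0, target)          # restore the original axis order
    return psi.reshape(-1)

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate

n = 3
state = np.zeros(2**n, dtype=complex)
state[0] = 1.0                                  # start in |000>
state = apply_single_qubit_gate(state, H, target=0, n_qubits=n)
print(np.round(state, 3))                       # qubit 0 now in an equal superposition
```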

  15. Imaging Near-Earth Electron Densities Using Thomson Scattering

    Science.gov (United States)

    2009-01-15

geocentric solar magnetospheric (GSM) coordinates. TECs were initially computed from a viewing location at the Sun-Earth L1 Lagrange point for both ... further find that an elliptical Earth orbit (apogee ~30 RE) is a suitable lower-cost option for a demonstration mission.

  16. Scientific and computational challenges of the fusion simulation project (FSP)

    International Nuclear Information System (INIS)

    Tang, W M

    2008-01-01

    This paper highlights the scientific and computational challenges facing the Fusion Simulation Project (FSP). The primary objective is to develop advanced software designed to use leadership-class computers for carrying out multiscale physics simulations to provide information vital to delivering a realistic integrated fusion simulation model with unprecedented physics fidelity. This multiphysics capability will be unprecedented in that in the current FES applications domain, the largest-scale codes are used to carry out first-principles simulations of mostly individual phenomena in realistic 3D geometry while the integrated models are much smaller-scale, lower-dimensionality codes with significant empirical elements used for modeling and designing experiments. The FSP is expected to be the most up-to-date embodiment of the theoretical and experimental understanding of magnetically confined thermonuclear plasmas and to provide a living framework for the simulation of such plasmas as the associated physics understanding continues to advance over the next several decades. Substantive progress on answering the outstanding scientific questions in the field will drive the FSP toward its ultimate goal of developing a reliable ability to predict the behavior of plasma discharges in toroidal magnetic fusion devices on all relevant time and space scales. From a computational perspective, the fusion energy science application goal to produce high-fidelity, whole-device modeling capabilities will demand computing resources in the petascale range and beyond, together with the associated multicore algorithmic formulation needed to address burning plasma issues relevant to ITER - a multibillion dollar collaborative device involving seven international partners representing over half the world's population. Even more powerful exascale platforms will be needed to meet the future challenges of designing a demonstration fusion reactor (DEMO). Analogous to other major applied physics

  17. Advanced computers and simulation

    International Nuclear Information System (INIS)

    Ryne, R.D.

    1993-01-01

Accelerator physicists today have access to computers that are far more powerful than those available just 10 years ago. In the early 1980s, desktop workstations performed less than one million floating point operations per second (Mflops), and the realized performance of vector supercomputers was at best a few hundred Mflops. Today vector processing is available on the desktop, providing researchers with performance approaching 100 Mflops at a price that is measured in thousands of dollars. Furthermore, advances in Massively Parallel Processors (MPP) have made performance of over 10 gigaflops a reality, and around mid-decade MPPs are expected to be capable of teraflops performance. Along with advances in MPP hardware, researchers have also made significant progress in developing algorithms and software for MPPs. These changes have had, and will continue to have, a significant impact on the work of computational accelerator physicists. Now, instead of running particle simulations with just a few thousand particles, we can perform desktop simulations with tens of thousands of simulation particles, and calculations with well over 1 million particles are being performed on MPPs. In the area of computational electromagnetics, simulations that used to be performed only on vector supercomputers now run in several hours on desktop workstations, and researchers are hoping to perform simulations with over one billion mesh points on future MPPs. In this paper we will discuss the latest advances, and what can be expected in the near future, in hardware, software and applications codes for advanced simulation of particle accelerators

  18. Protection for the U.S. Automobile Industry: A Joint Class Simulation in Trade Policy.

    Science.gov (United States)

    Hess, Peter N.; Ortmayer, Louis M.

A description of a joint class simulation in trade policy undertaken by an international economics class and a political science class at Davidson College (North Carolina) is presented in three sections. Section I describes the structure of the simulation. Students were divided into groups of United States auto manufacturers, the United Auto…

  19. Computer simulations of collisionless shock waves

    International Nuclear Information System (INIS)

    Leroy, M.M.

    1984-01-01

    A review of the contributions of particle computer simulations to the understanding of the physics of magnetic shock waves in collisionless plasmas is presented. The emphasis is on the relation between the computer simulation results, spacecraft observations of shocks in space, and related theories, rather than on technical aspects of the numerics. It is shown that much has been learned from the comparison of ISEE spacecraft observations of the terrestrial bow shock and particle computer simulations concerning the quasi-perpendicular, supercritical shock (ion scale structure, ion reflection mechanism and ultimate dissipation processes). Particle computer simulations have also had an appreciable prospective role in the investigation of the physics of quasi-parallel shocks, about which still little is known observationally. Moreover, these numerical techniques have helped to clarify the process of suprathermal ion rejection by the shock into the foreshock, and the subsequent evolution of the ions in the foreshock. 95 references

  20. Rare earth germanates

    International Nuclear Information System (INIS)

    Bondar', I.A.; Vinogradova, N.V.; Dem'yanets, L.N.

    1983-01-01

Rare earth germanates attract close attention both as an independent class of compounds and as analogues of a widespread class of natural and synthetic minerals. The methods of rare earth germanate synthesis (solid-phase, hydrothermal) are considered. Systems based on germanium and rare earth oxides, their phase diagrams and phase transformations are studied. Using different chemical analyses, the processes of rare earth germanate formation are investigated. IR spectra of alkali and rare earth metal germanates are presented and their comparative analysis is carried out. Crystal structures of the compounds and their lattice parameters are studied. Fields of possible application of rare earth germanates are shown

  1. Computer algebra simulation - what can it do?; Was leistet Computer-Algebra-Simulation?

    Energy Technology Data Exchange (ETDEWEB)

    Braun, S. [Visual Analysis AG, Muenchen (Germany)

    2001-07-01

Shortened development times require new and improved calculation methods. Numeric methods have long since become state of the art. However, although numeric simulations provide a better understanding of process parameters, they do not give a fast overview of the interdependences between parameters. Numeric simulations are effective only if all physical parameters are sufficiently known; otherwise, the efficiency will decrease due to the large number of variant calculations required. Computer algebra simulation closes this gap and provides a deeper understanding of the physical fundamentals of technical processes. [Translated from the German] New and improved calculation methods are necessary to make the continual shortening of development cycles possible. Conventional methods based on a purely numerical approach have long since become the standard in many application areas. But it is not only the ever shorter development cycles; the ever growing complexity also makes it necessary to gain a better understanding of the process parameters involved. Numerical simulation excels at detailed solutions, even for complex structures and processes, but it does not provide a fast estimate of the relationships between the individual parameters. Numerical simulation is effective only if all physical parameters are sufficiently known; otherwise its efficiency drops sharply because of the large number of necessary variant calculations. Computer algebra simulation closes this gap by allowing a deeper insight into the physical functioning of technical processes. (orig.)
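    The contrast drawn above between numeric and symbolic approaches can be illustrated with a small SymPy sketch: a parameter dependence is exposed in closed form before any numbers are fixed, so no variant calculations are needed. The damped-oscillator example is purely illustrative and not taken from the report.

```python
import sympy as sp

# symbolic model: critical damping threshold of a mass-spring-damper system
m, c, k = sp.symbols('m c k', positive=True)
discriminant = c**2 - 4*m*k          # sign decides oscillatory vs. overdamped behaviour
critical_damping = sp.solve(sp.Eq(discriminant, 0), c)[0]
print(critical_damping)              # 2*sqrt(k*m): the parameter relation in closed form

# the same closed form can then be evaluated numerically for any parameter set
print(critical_damping.subs({m: 2.0, k: 800.0}).evalf())
```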

  2. Computer Simulation of Nonuniform MTLs via Implicit Wendroff and State-Variable Methods

    Directory of Open Access Journals (Sweden)

    L. Brancik

    2011-04-01

Full Text Available The paper deals with techniques for the computer simulation of nonuniform multiconductor transmission lines (MTLs) based on the implicit Wendroff and the state-variable methods. The techniques fall into the class of finite-difference time-domain (FDTD) methods useful for solving various electromagnetic systems. Their basic variants are extended and modified to enable solving both voltage and current distributions along a nonuniform MTL's wires and their sensitivities with respect to lumped and distributed parameters. An experimental error analysis is performed based on the Thomson cable, whose analytical solutions are known, and some examples of the simulation of both uniform and nonuniform MTLs are presented. Based on a Matlab implementation, CPU times are analyzed to compare the efficiency of the methods. Some results for nonlinear MTL simulation are presented as well.
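    For orientation, the sketch below shows an FDTD update for the uniform, lossless telegrapher's equations on a staggered grid. This is the simpler explicit leapfrog variant, not the implicit Wendroff scheme or the sensitivity computation of the paper, and all line parameters are placeholders.

```python
import numpy as np

# placeholder per-unit-length parameters of a single lossless line
L, C = 250e-9, 100e-12          # inductance [H/m] and capacitance [F/m]
nz, dz = 200, 0.005             # spatial cells and step [m]
dt = 0.9 * dz * np.sqrt(L * C)  # time step just below the CFL limit

v = np.zeros(nz + 1)            # voltages at integer nodes
i = np.zeros(nz)                # currents at half nodes (staggered grid)

for step in range(400):
    # leapfrog update of the discretized telegrapher's equations
    i -= dt / (L * dz) * (v[1:] - v[:-1])
    v[1:-1] -= dt / (C * dz) * (i[1:] - i[:-1])
    v[0] = np.exp(-((step * dt - 2e-9) / 0.5e-9) ** 2)  # Gaussian source at the near end
    v[-1] = 0.0                                          # short-circuited far end

print(v.max(), i.max())
```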

  3. Framework for utilizing computational devices within simulation

    Directory of Open Access Journals (Sweden)

    Miroslav Mintál

    2013-12-01

Full Text Available Nowadays there exist several frameworks for utilizing the computational power of graphics cards and other computational devices such as FPGAs, ARM and multi-core processors. The best known are either low-level and need a lot of controlling code, or are bound only to specific graphics cards. Furthermore, more specialized frameworks exist, aimed mainly at the mathematical field. The described framework is tailored for use in multi-agent simulations. It provides an option to accelerate computations when preparing a simulation and, mainly, to accelerate the computation of the simulation itself.

  4. Computer simulation of rare earth solvent extraction circuits

    International Nuclear Information System (INIS)

    Voit, D.O.

    1988-01-01

A BASIC language program has been written that simulates the performance of an integrated solvent extraction circuit consisting of an extractor, a reflux-fed scrubber, and a stripper. The program is designed to simulate the performance of a circuit having an aqueous feed containing each of the lanthanides as well as yttrium. The Kremser equation is used to determine the separation occurring in each section of the circuit. The required input variables are the feed composition, the separation factors, the light key extraction factors and extractor feed zone distribution coefficient, the number of stages, and the reflux ratios. The program calculates the composition of the streams at each node in the circuit, the total loading, and the remaining distribution coefficients. User interaction with the program is essential. The program has no capability to determine whether the calculated values are consistent with various real constraints. Knowledge of the physical, chemical, and equilibrium behavior is essential to successfully utilize the program. The number of iterations required to achieve steady state provides insight into the circuit response times
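    The Kremser relation applied per section can be sketched as follows: the fraction of a solute left unextracted after N countercurrent stages with extraction factor E. The extraction-factor values below are illustrative only and are not the program's data; they merely show how a component with E > 1 is stripped from the aqueous phase much faster than one with E < 1.

```python
def kremser_fraction_unextracted(extraction_factor, n_stages):
    """Kremser equation: fraction of a solute left in the aqueous raffinate."""
    e, n = extraction_factor, n_stages
    if abs(e - 1.0) < 1e-12:
        return 1.0 / (n + 1)
    return (e - 1.0) / (e ** (n + 1) - 1.0)

# illustrative extraction factors for two neighbouring rare earths (assumed values)
for label, e in (("Nd (E=1.5)", 1.5), ("Pr (E=0.7)", 0.7)):
    for n in (5, 10, 20):
        frac = kremser_fraction_unextracted(e, n)
        print(f"{label}  N={n:2d}  unextracted fraction = {frac:.4f}")
```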

  5. Understanding Emergency Care Delivery Through Computer Simulation Modeling.

    Science.gov (United States)

    Laker, Lauren F; Torabi, Elham; France, Daniel J; Froehle, Craig M; Goldlust, Eric J; Hoot, Nathan R; Kasaie, Parastu; Lyons, Michael S; Barg-Walkow, Laura H; Ward, Michael J; Wears, Robert L

    2018-02-01

    In 2017, Academic Emergency Medicine convened a consensus conference entitled, "Catalyzing System Change through Health Care Simulation: Systems, Competency, and Outcomes." This article, a product of the breakout session on "understanding complex interactions through systems modeling," explores the role that computer simulation modeling can and should play in research and development of emergency care delivery systems. This article discusses areas central to the use of computer simulation modeling in emergency care research. The four central approaches to computer simulation modeling are described (Monte Carlo simulation, system dynamics modeling, discrete-event simulation, and agent-based simulation), along with problems amenable to their use and relevant examples to emergency care. Also discussed is an introduction to available software modeling platforms and how to explore their use for research, along with a research agenda for computer simulation modeling. Through this article, our goal is to enhance adoption of computer simulation, a set of methods that hold great promise in addressing emergency care organization and design challenges. © 2017 by the Society for Academic Emergency Medicine.

  6. Geodynamo and mantle convection simulations on the Earth Simulator using the Yin-Yang grid

    International Nuclear Information System (INIS)

    Kageyama, Akira; Yoshida, Masaki

    2005-01-01

We have developed finite difference codes based on the Yin-Yang grid for the geodynamo simulation and the mantle convection simulation. The Yin-Yang grid is a kind of spherical overset grid that is composed of two identical component grids. The intrinsic simplicity of the mesh configuration of the Yin-Yang grid enables us to develop highly optimized simulation codes on massively parallel supercomputers. The Yin-Yang geodynamo code has achieved 15.2 Tflops with 4096 processors on the Earth Simulator. This represents 46% of the theoretical peak performance. The Yin-Yang mantle code has enabled us to carry out mantle convection simulations in realistic regimes with a Rayleigh number of 10^7, including strongly temperature-dependent viscosity with spatial contrast up to 10^6

  7. Numerical characteristics of quantum computer simulation

    Science.gov (United States)

    Chernyavskiy, A.; Khamitov, K.; Teplov, A.; Voevodin, V.; Voevodin, Vl.

    2016-12-01

The simulation of quantum circuits is significantly important for the implementation of quantum information technologies. The main difficulty of such modeling is the exponential growth of dimensionality, thus the usage of modern high-performance parallel computations is relevant. As is well known, arbitrary quantum computation in the circuit model can be done with only single- and two-qubit gates, and we analyze the computational structure and properties of the simulation of such gates. We investigate the fact that the unique properties of quantum nature lead to the computational properties of the considered algorithms: quantum parallelism makes the simulation of quantum gates highly parallel, while quantum entanglement leads to the problem of computational locality during simulation. We use the methodology of the AlgoWiki project (algowiki-project.org) to analyze the algorithm. This methodology consists of theoretical (sequential and parallel complexity, macro structure, and visual informational graph) and experimental (locality and memory access, scalability and more specific dynamic characteristics) parts. The experimental part was carried out using the petascale Lomonosov supercomputer (Moscow State University, Russia). We show that the simulation of quantum gates is a good base for the research and testing of development methods for data-intensive parallel software, and the considered methodology of analysis can be successfully used for the improvement of algorithms in quantum information science.

  8. Simulation Modeling of a Facility Layout in Operations Management Classes

    Science.gov (United States)

    Yazici, Hulya Julie

    2006-01-01

    Teaching quantitative courses can be challenging. Similarly, layout modeling and lean production concepts can be difficult to grasp in an introductory OM (operations management) class. This article describes a simulation model developed in PROMODEL to facilitate the learning of layout modeling and lean manufacturing. Simulation allows for the…

  9. Analyzing Robotic Kinematics Via Computed Simulations

    Science.gov (United States)

    Carnahan, Timothy M.

    1992-01-01

    Computing system assists in evaluation of kinematics of conceptual robot. Displays positions and motions of robotic manipulator within work cell. Also displays interactions between robotic manipulator and other objects. Results of simulation displayed on graphical computer workstation. System includes both off-the-shelf software originally developed for automotive industry and specially developed software. Simulation system also used to design human-equivalent hand, to model optical train in infrared system, and to develop graphical interface for teleoperator simulation system.

  10. Computer Simulations, Disclosure and Duty of Care

    Directory of Open Access Journals (Sweden)

    John Barlow

    2006-05-01

Full Text Available Computer simulations provide cost-effective methods for manipulating and modeling 'reality'. However, they are not real. They are imitations of a system or event, real or fabricated, and as such mimic, duplicate or represent that system or event. The degree to which a computer simulation aligns with and reproduces the 'reality' of the system or event it attempts to mimic or duplicate depends upon many factors, including the efficiency of the simulation algorithm, the processing power of the computer hardware used to run the simulation model, and the expertise, assumptions and prejudices of those concerned with designing, implementing and interpreting the simulation output. Computer simulations in particular are increasingly replacing physical experimentation in many disciplines, and as a consequence are used to underpin quite significant decision-making which may impact on 'innocent' third parties. In this context, this paper examines two interrelated issues: Firstly, how much and what kind of information should a simulation builder be required to disclose to potential users of the simulation? Secondly, what are the implications for a decision-maker who acts on the basis of their interpretation of a simulation output without any reference to its veracity, which may in turn compromise the safety of other parties?

  11. Computation and analysis of cavitating flow in Francis-class hydraulic turbines

    Science.gov (United States)

    Leonard, Daniel J.

    Hydropower is the most proven renewable energy technology, supplying the world with 16% of its electricity. Conventional hydropower generates a vast majority of that percentage. Although a mature technology, hydroelectric generation shows great promise for expansion through new dams and plants in developing hydro countries. Moreover, in developed hydro countries, such as the United States, installing generating units in existing dams and the modern refurbishment of existing plants can greatly expand generating capabilities with little to no further impact on the environment. In addition, modern computational technology and fluid dynamics expertise has led to substantial improvements in modern turbine design and performance. Cavitation has always presented a problem in hydroturbines, causing performance breakdown, erosion, damage, vibration, and noise. While modern turbines are usually designed to be cavitation-free at their best efficiency point, due to the variable demand of the energy market it is fairly common to operate at off-design conditions. Here, cavitation and its deleterious effects are unavoidable, and hence, cavitation is a limiting factor on the design and operation of these turbines. Multiphase Computational Fluid Dynamics (CFD) has been used in recent years to model cavitating flow for a large range of problems, including turbomachinery. However, CFD of cavitating flow in hydroturbines is still in its infancy. This dissertation presents steady-periodic Reynolds-averaged Navier-Stokes simulations of a cavitating Francis-class hydroturbine at model and prototype scales. Computational results of the reduced-scale model and full-scale prototype, undergoing performance breakdown, are compared with empirical model data and prototype performance estimations based on standard industry scalings from the model data. Mesh convergence of the simulations is also displayed. Comparisons are made between the scales to display that cavitation performance breakdown

  12. High End Computing Technologies for Earth Science Applications: Trends, Challenges, and Innovations

    Science.gov (United States)

    Parks, John (Technical Monitor); Biswas, Rupak; Yan, Jerry C.; Brooks, Walter F.; Sterling, Thomas L.

    2003-01-01

    Earth science applications of the future will stress the capabilities of even the highest performance supercomputers in the areas of raw compute power, mass storage management, and software environments. These NASA mission critical problems demand usable multi-petaflops and exabyte-scale systems to fully realize their science goals. With an exciting vision of the technologies needed, NASA has established a comprehensive program of advanced research in computer architecture, software tools, and device technology to ensure that, in partnership with US industry, it can meet these demanding requirements with reliable, cost effective, and usable ultra-scale systems. NASA will exploit, explore, and influence emerging high end computing architectures and technologies to accelerate the next generation of engineering, operations, and discovery processes for NASA Enterprises. This article captures this vision and describes the concepts, accomplishments, and the potential payoff of the key thrusts that will help meet the computational challenges in Earth science applications.

  13. Giant Impacts on Earth-Like Worlds

    Science.gov (United States)

    Kohler, Susanna

    2016-05-01

    Earth has experienced a large number of impacts, from the cratering events that may have caused mass extinctions to the enormous impact believed to have formed the Moon. A new study examines whether our planet's impact history is typical for Earth-like worlds. N-Body Challenges [Figure: timeline placing the authors' simulations in the context of the history of our solar system. Quintana et al. 2016] The final stages of terrestrial planet formation are thought to be dominated by giant impacts of bodies in the protoplanetary disk. During this stage, protoplanets smash into one another and accrete, greatly influencing the growth, composition, and habitability of the final planets. There are two major challenges when simulating this N-body planet formation. The first is fragmentation: since computational time scales as N^2, simulating lots of bodies that split into many more bodies is very computationally intensive. For this reason, fragmentation is usually ignored; simulations instead assume perfect accretion during collisions. [Figure: total number of bodies remaining within the authors' simulations over time, with fragmentation included (grey) and ignored (red); both simulations result in the same final number of bodies, but the ones that include fragmentation take more time to reach that final number. Quintana et al. 2016] The second challenge is that many-body systems are chaotic, which means it's necessary to do a large number of simulations to make statistical statements about outcomes. Adding Fragmentation: A team of scientists led by Elisa Quintana (NASA NPP Senior Fellow at the Ames Research Center) has recently pushed at these challenges by modeling inner-planet formation using a code that does include fragmentation. The team ran 140 simulations with and 140 without the effects of fragmentation, using similar initial conditions, to understand how including fragmentation affects the outcome. Quintana and collaborators then used the fragmentation-inclusive simulations to
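
    The N^2 cost quoted above comes from the pairwise force evaluation at the heart of any direct N-body integrator. A minimal sketch of that step follows (plain Python/NumPy with hypothetical masses and positions; this is a generic direct-summation kernel, not the authors' fragmentation code):

    ```python
    import numpy as np

    G = 6.674e-11  # gravitational constant, SI units

    def accelerations(pos, mass, softening=1e3):
        """Direct-summation gravitational accelerations: O(N^2) pair interactions."""
        n = len(mass)
        acc = np.zeros_like(pos)
        for i in range(n):
            dr = pos - pos[i]                              # vectors from body i to every body
            dist3 = (np.sum(dr**2, axis=1) + softening**2) ** 1.5
            dist3[i] = np.inf                              # exclude self-interaction
            acc[i] = G * np.sum(mass[:, None] * dr / dist3[:, None], axis=0)
        return acc

    # hypothetical protoplanet system: 100 bodies with random positions and masses
    rng = np.random.default_rng(0)
    pos = rng.uniform(-1e11, 1e11, size=(100, 3))          # metres
    mass = rng.uniform(1e22, 1e23, size=100)               # kilograms
    a = accelerations(pos, mass)
    ```

    Doubling the number of bodies quadruples the work in this loop, which is why adding fragmentation (more bodies) makes such runs so expensive.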

  14. Modeling Earth Albedo Currents on Sun Sensors for Improved Vector Observations

    DEFF Research Database (Denmark)

    Bhanderi, Dan

    2006-01-01

    Earth albedo influences vector measurements of the solar line-of-sight vector, due to the induced current in the photovoltaics of Sun sensors. Although advanced digital Sun sensors exist, these are typically expensive and may not be suited for satellites in the nano- or pico-class. Previously...... an Earth albedo model, based on reflectivity data from NASA's Total Ozone Mapping Spectrometer project, has been published. In this paper the proposed model is presented, and validation of the model is sought by comparing simulated data with telemetry from the Danish Ørsted satellite. A novel method...... for modeling Sun sensor output by incorporating the Earth albedo model is presented. This model utilizes the directional information in the Earth albedo model, which is achieved by Earth surface partitioning. This allows accurate simulation of the Sun sensor output and the results are consistent with Ørsted...
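
    A heavily simplified sketch of the underlying idea (not the published model): the photocell current is proportional to the cosine-weighted irradiance reaching the cell, so the direct solar term and the contribution of each sunlit, visible Earth-surface cell can be summed. All names and values below are illustrative assumptions:

    ```python
    import numpy as np

    def cell_current(normal, sun_dir, cells, solar_irr=1366.0, sensitivity=1e-3):
        """Photocell current from direct sunlight plus Earth-albedo contributions.

        normal  : unit normal of the photocell (3,)
        sun_dir : unit vector from satellite towards the Sun (3,)
        cells   : list of (direction, irradiance) pairs, where direction is a unit
                  vector from the satellite to a sunlit, visible surface cell and
                  irradiance is the reflected flux it delivers at the satellite [W/m^2]
        """
        # direct solar term (only if the Sun is in front of the cell)
        current = sensitivity * solar_irr * max(np.dot(normal, sun_dir), 0.0)
        # albedo terms: each visible, sunlit cell adds cosine-weighted irradiance
        for direction, irradiance in cells:
            current += sensitivity * irradiance * max(np.dot(normal, direction), 0.0)
        return current
    ```

    The partitioning step mentioned in the abstract is what produces the list of cells and their delivered irradiance.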

  15. Simulation of a small computer of the TRA-1001 type on the BESM computer

    International Nuclear Information System (INIS)

    Galaktionov, V.V.

    1975-01-01

    The purpose of, and possible approaches to, simulating one computer on another are considered. An emulator (simulation program) for a small computer of the TRA-1001 type is presented for the BESM-6 computer. The basic elements of the simulated computer are the memory (8 K words), the central processor, the programmed input-output channel, the interrupt circuit, and the computer panel. Work with the input-output devices, the ASP-33 and FS-1500 teletypes, is also simulated. In actual operation the emulator has been used to translate programs prepared on punched cards with the aid of the SLANG-1 translator on the BESM-6 computer. The adjustment of the translator from the COPLAN language has been realized with the aid of the emulator

  16. A computer-simulated liver phantom (virtual liver phantom) for multidetector computed tomography evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Funama, Yoshinori [Kumamoto University, Department of Radiological Sciences, School of Health Sciences, Kumamoto (Japan); Awai, Kazuo; Nakayama, Yoshiharu; Liu, Da; Yamashita, Yasuyuki [Kumamoto University, Department of Diagnostic Radiology, Graduate School of Medical Sciences, Kumamoto (Japan); Miyazaki, Osamu; Goto, Taiga [Hitachi Medical Corporation, Tokyo (Japan); Hori, Shinichi [Gate Tower Institute of Image Guided Therapy, Osaka (Japan)

    2006-04-15

    The purpose of this study was to develop a computer-simulated liver phantom for hepatic CT studies. A computer-simulated liver phantom was mathematically constructed on a computer workstation. The computer-simulated phantom was calibrated using real CT images acquired by an actual four-detector CT. We added an inhomogeneous texture to the simulated liver by referring to CT images of chronically damaged human livers. The mean CT number of the simulated liver was 60 HU and we added numerous 5- to 10-mm structures with 60±10 HU/mm. To mimic liver tumors we added nodules measuring 8, 10, and 12 mm in diameter with CT numbers of 60±10, 60±15, and 60±20 HU. Five radiologists visually evaluated the similarity of the texture of the computer-simulated liver phantom to that of a real human liver, using a five-point scale, to confirm the appropriateness of the virtual liver images. The total score was 44 for two radiologists, and 42, 41, and 39 for one radiologist each. They judged the texture of the virtual liver to be comparable to that of a human liver. Our computer-simulated liver phantom is a promising tool for the evaluation of the image quality and diagnostic performance of hepatic CT imaging. (orig.)
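
    The construction described above (a 60 HU background, 5-10 mm texture structures of 60±10 HU, and nodules of fixed diameter and contrast) can be sketched in a few lines. The code below is only an illustrative 2D analogue with assumed pixel size and counts, not the authors' phantom:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    px = 0.5                               # assumed pixel size [mm]
    liver = np.full((512, 512), 60.0)      # homogeneous background, 60 HU

    # inhomogeneous texture: many 5-10 mm blobs with CT numbers of 60 +/- 10 HU
    yy, xx = np.mgrid[0:512, 0:512]
    for _ in range(300):
        cy, cx = rng.uniform(0, 512, 2)
        r = rng.uniform(5, 10) / 2 / px    # radius in pixels
        blob = (yy - cy) ** 2 + (xx - cx) ** 2 < r ** 2
        liver[blob] = 60.0 + rng.uniform(-10, 10)

    # a simulated 10 mm nodule with a CT number of 60 - 15 HU
    cy, cx, r = 256, 256, 10 / 2 / px
    nodule = (yy - cy) ** 2 + (xx - cx) ** 2 < r ** 2
    liver[nodule] = 60.0 - 15.0
    ```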

  17. Computer Simulations of Lipid Bilayers and Proteins

    DEFF Research Database (Denmark)

    Sonne, Jacob

    2006-01-01

    The importance of computer simulations in lipid bilayer research has become more prominent over the last couple of decades, and as computers get even faster, simulations will play an increasingly important part in understanding the processes that take place in and across cell membranes. This thesis...... entitled Computer simulations of lipid bilayers and proteins describes two molecular dynamics (MD) simulation studies of pure lipid bilayers as well as a study of a transmembrane protein embedded in a lipid bilayer matrix. Below follows a brief overview of the thesis. Chapter 1. This chapter is a short...... in the succeeding chapters is presented. Details on system setups, simulation parameters and other technicalities can be found in the relevant chapters. Chapter 3, DPPC lipid parameters: The quality of MD simulations is intimately dependent on the empirical potential energy function and its parameters, i...

  18. Simulating the Earth System Response to Negative Emissions

    Science.gov (United States)

    Jackson, R. B.; Milne, J.; Littleton, E. W.; Jones, C.; Canadell, J.; Peters, G. P.; van Vuuren, D.; Davis, S. J.; Jonas, M.; Smith, P.; Ciais, P.; Rogelj, J.; Torvanger, A.; Shrestha, G.

    2016-12-01

    The natural carbon sinks of the land and oceans absorb approximately half the anthropogenic CO2 emitted every year. The CO2 that is not absorbed accumulates in the Earth's atmosphere and traps the sun's rays, causing an increase in the global mean temperature. Removing this leftover CO2 using negative emissions technologies (NETs) has been proposed as a strategy to lessen the accumulating CO2 and avoid dangerous climate change. Using CMIP5 Earth system model simulations, this study assessed the impact on the global carbon cycle, and how the Earth system might respond, to negative emissions strategies applied to low emissions scenarios, over different time horizons from the year 2000 to 2300. The modeling results suggest that using NETs to remove atmospheric CO2 over five 50-year time horizons has varying effects at different points in time. The effects of anthropogenic and natural sources and sinks can result in positive or negative changes in atmospheric CO2 concentration. Results show that historic emissions and the current state of the Earth system have impacts on the behavior of atmospheric CO2, as do instantaneous anthropogenic emissions. Indeed, varying background scenarios seemed to have a greater effect on atmospheric CO2 than the actual amount and timing of NETs. These results show how NETs interact with the physical climate-carbon cycle system and highlight the need for more research on Earth-system dynamics as they relate to carbon sinks and sources and anthropogenic perturbations.

  19. A 1.8 trillion degrees-of-freedom, 1.24 petaflops global seismic wave simulation on the K computer

    KAUST Repository

    Tsuboi, Seiji

    2016-03-01

    We present high-performance simulations of global seismic wave propagation with an unprecedented accuracy of 1.2 s seismic period for a realistic three-dimensional Earth model using the spectral element method on the K computer. Our seismic simulations use a total of 665.2 billion grid points and resolve 1.8 trillion degrees of freedom. To realize these large-scale computations, we optimize a widely used community software code to efficiently address all hardware parallelization, especially thread-level parallelization to solve the bottleneck of memory usage for coarse-grained parallelization. The new code exhibits excellent strong scaling for the time stepping loop, that is, parallel efficiency on 82,134 nodes relative to 36,504 nodes is 99.54%. Sustained performance of these computations on the K computer is 1.24 petaflops, which is 11.84% of its peak performance. The obtained seismograms with an accuracy of 1.2 s for the entire globe should help us to better understand rupture mechanisms of devastating earthquakes.
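
    The two performance figures quoted can be reproduced with elementary arithmetic; in the sketch below the wall-clock times are placeholders chosen only to illustrate the formulas:

    ```python
    # strong-scaling parallel efficiency: E = (n1 * t1) / (n2 * t2)
    n1, n2 = 36504, 82134
    t1 = 1000.0                     # seconds on n1 nodes (placeholder)
    t2 = t1 * n1 / n2 / 0.9954      # chosen so that E matches the reported 99.54%
    efficiency = (n1 * t1) / (n2 * t2)
    print(f"parallel efficiency: {efficiency:.2%}")

    # sustained vs. peak performance of the nodes used
    sustained_pflops = 1.24
    peak_fraction = 0.1184
    print(f"implied peak of the partition: {sustained_pflops / peak_fraction:.2f} PFlops")
    ```

    Perfect strong scaling gives E = 1; the reported 99.54% means the time-stepping loop loses less than half a percent of efficiency when the node count more than doubles.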

  20. A 1.8 trillion degrees-of-freedom, 1.24 petaflops global seismic wave simulation on the K computer

    KAUST Repository

    Tsuboi, Seiji; Ando, Kazuto; Miyoshi, Takayuki; Peter, Daniel; Komatitsch, Dimitri; Tromp, Jeroen

    2016-01-01

    We present high-performance simulations of global seismic wave propagation with an unprecedented accuracy of 1.2 s seismic period for a realistic three-dimensional Earth model using the spectral element method on the K computer. Our seismic simulations use a total of 665.2 billion grid points and resolve 1.8 trillion degrees of freedom. To realize these large-scale computations, we optimize a widely used community software code to efficiently address all hardware parallelization, especially thread-level parallelization to solve the bottleneck of memory usage for coarse-grained parallelization. The new code exhibits excellent strong scaling for the time stepping loop, that is, parallel efficiency on 82,134 nodes relative to 36,504 nodes is 99.54%. Sustained performance of these computations on the K computer is 1.24 petaflops, which is 11.84% of its peak performance. The obtained seismograms with an accuracy of 1.2 s for the entire globe should help us to better understand rupture mechanisms of devastating earthquakes.

  1. Mapping land cover change over continental Africa using Landsat and Google Earth Engine cloud computing.

    Science.gov (United States)

    Midekisa, Alemayehu; Holl, Felix; Savory, David J; Andrade-Pacheco, Ricardo; Gething, Peter W; Bennett, Adam; Sturrock, Hugh J W

    2017-01-01

    Quantifying and monitoring the spatial and temporal dynamics of global land cover is critical for better understanding many of the Earth's land surface processes. However, the lack of regularly updated, continental-scale, and high spatial resolution (30 m) land cover data limits our ability to better understand the spatial extent and the temporal dynamics of land surface changes. Despite the free availability of high spatial resolution Landsat satellite data, continental-scale land cover mapping using these data was not feasible until now due to the need for high-performance computing to store, process, and analyze this large volume of high resolution satellite data. In this study, we present an approach to quantify continental land cover and impervious surface changes over a long period of time (15 years) using high resolution Landsat satellite observations and the Google Earth Engine cloud computing platform. The approach applied here, which overcomes the computational challenges of handling big earth observation data by using cloud computing, can help scientists and practitioners who lack high-performance computational resources.
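
    A minimal sketch of this kind of cloud-side workflow using the Google Earth Engine Python API; the collection and band identifiers, region, and dates are assumptions for illustration (the study's own classification and change-detection steps are omitted):

    ```python
    import ee

    ee.Initialize()  # requires prior authentication with an Earth Engine account

    # region of interest (illustrative bounding box)
    roi = ee.Geometry.Rectangle([30.0, -5.0, 35.0, 0.0])

    # assumed asset ID for Landsat 8 surface reflectance; substitute the collection you need
    landsat = (ee.ImageCollection('LANDSAT/LC08/C02/T1_L2')
               .filterBounds(roi)
               .filterDate('2013-01-01', '2015-12-31'))

    # cloud-side median composite and an NDVI layer (band names assumed for this collection)
    composite = landsat.median()
    ndvi = composite.normalizedDifference(['SR_B5', 'SR_B4']).rename('NDVI')
    print(ndvi.getInfo()['bands'])   # the heavy computation stays in the cloud
    ```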

  2. Radiotherapy Monte Carlo simulation using cloud computing technology.

    Science.gov (United States)

    Poole, C M; Cornelius, I; Trapp, J V; Langton, C M

    2012-12-01

    Cloud computing allows for vast computational resources to be leveraged quickly and easily in bursts as and when required. Here we describe a technique that allows Monte Carlo radiotherapy dose calculations to be performed using GEANT4 and executed in the cloud, with relative simulation cost and completion time evaluated as a function of machine count. As expected, simulation completion time decreases as 1/n for n parallel machines, and relative simulation cost is found to be optimal where n is a factor of the total simulation time in hours. Using the technique, we demonstrate the potential usefulness of cloud computing as a solution for rapid Monte Carlo simulation for radiotherapy dose calculation without the need for dedicated local computer hardware, as a proof of principle.
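
    The stated scaling can be made concrete with a toy cost model: under per-machine, per-hour billing, n machines finish a T-hour serial workload in T/n hours but are each billed for whole hours, so the bill is minimised when n divides T. A sketch with placeholder billing rate and workload:

    ```python
    import math

    def completion_time(total_hours, n):
        """Wall-clock time when the workload parallelises perfectly over n machines."""
        return total_hours / n

    def relative_cost(total_hours, n, rate_per_machine_hour=1.0):
        """Cost under hourly billing: each of the n machines is billed for whole hours."""
        return n * math.ceil(total_hours / n) * rate_per_machine_hour

    T = 12  # total serial simulation time in hours (placeholder)
    for n in (1, 2, 3, 4, 5, 6, 8, 12):
        print(n, completion_time(T, n), relative_cost(T, n))
    # cost stays at 12 machine-hours whenever n divides 12 (n = 1, 2, 3, 4, 6, 12),
    # but rises (e.g. n = 5 -> 15, n = 8 -> 16) when it does not
    ```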

  3. Radiotherapy Monte Carlo simulation using cloud computing technology

    International Nuclear Information System (INIS)

    Poole, C.M.; Cornelius, I.; Trapp, J.V.; Langton, C.M.

    2012-01-01

    Cloud computing allows for vast computational resources to be leveraged quickly and easily in bursts as and when required. Here we describe a technique that allows Monte Carlo radiotherapy dose calculations to be performed using GEANT4 and executed in the cloud, with relative simulation cost and completion time evaluated as a function of machine count. As expected, simulation completion time decreases as 1/n for n parallel machines, and relative simulation cost is found to be optimal where n is a factor of the total simulation time in hours. Using the technique, we demonstrate the potential usefulness of cloud computing as a solution for rapid Monte Carlo simulation for radiotherapy dose calculation without the need for dedicated local computer hardware, as a proof of principle.

  4. Class D management implementation approach of the first orbital mission of the Earth Venture series

    Science.gov (United States)

    Wells, James E.; Scherrer, John; Law, Richard; Bonniksen, Chris

    2013-09-01

    A key element of the National Research Council's Earth Science and Applications Decadal Survey called for the creation of the Venture Class line of low-cost research and application missions within NASA (National Aeronautics and Space Administration). One key component of the architecture chosen by NASA within the Earth Venture line is a series of self-contained stand-alone spaceflight science missions called "EV-Mission". The first mission chosen for this competitively selected, cost and schedule capped, Principal Investigator-led opportunity is the CYclone Global Navigation Satellite System (CYGNSS). As specified in the defining Announcement of Opportunity, the Principal Investigator is held responsible for successfully achieving the science objectives of the selected mission and the management approach that he/she chooses to obtain those results has a significant amount of freedom as long as it meets the intent of key NASA guidance like NPR 7120.5 and 7123. CYGNSS is classified under NPR 7120.5E guidance as a Category 3 (low priority, low cost) mission and carries a Class D risk classification (low priority, high risk) per NPR 8705.4. As defined in the NPR guidance, Class D risk classification allows for a relatively broad range of implementation strategies. The management approach that will be utilized on CYGNSS is a streamlined implementation that starts with a higher risk tolerance posture at NASA and that philosophy flows all the way down to the individual part level.

  5. Class D Management Implementation Approach of the First Orbital Mission of the Earth Venture Series

    Science.gov (United States)

    Wells, James E.; Scherrer, John; Law, Richard; Bonniksen, Chris

    2013-01-01

    A key element of the National Research Council's Earth Science and Applications Decadal Survey called for the creation of the Venture Class line of low-cost research and application missions within NASA (National Aeronautics and Space Administration). One key component of the architecture chosen by NASA within the Earth Venture line is a series of self-contained stand-alone spaceflight science missions called "EV-Mission". The first mission chosen for this competitively selected, cost and schedule capped, Principal Investigator-led opportunity is the CYclone Global Navigation Satellite System (CYGNSS). As specified in the defining Announcement of Opportunity, the Principal Investigator is held responsible for successfully achieving the science objectives of the selected mission and the management approach that he/she chooses to obtain those results has a significant amount of freedom as long as it meets the intent of key NASA guidance like NPR 7120.5 and 7123. CYGNSS is classified under NPR 7120.5E guidance as a Category 3 (low priority, low cost) mission and carries a Class D risk classification (low priority, high risk) per NPR 8705.4. As defined in the NPR guidance, Class D risk classification allows for a relatively broad range of implementation strategies. The management approach that will be utilized on CYGNSS is a streamlined implementation that starts with a higher risk tolerance posture at NASA and that philosophy flows all the way down to the individual part level.

  6. Earth2Class Overview: An Innovative Program Linking Classroom Educators and Research Scientists

    Science.gov (United States)

    Passow, M.; Iturrino, G. J.; Baggio, F. D.; Assumpcao, C. M.

    2005-12-01

    The Earth2Class (E2C) workshops, held at the Lamont-Doherty Earth Observatory (LDEO), provide an effective model for improving knowledge, teaching, and technology skills of middle and high school science educators through ongoing interactions with research scientists and educational technology. With support from an NSF GeoEd grant, E2C has developed monthly workshops, web-based resources, and summer institutes in which classroom teachers and research scientists have produced exemplar curriculum materials about a wide variety of cutting-edge geoscience investigations suitable for dissemination to teachers and students. Some of the goals of this program focus on questions such as: (1) What aspects of the E2C format and educational technology most effectively connect research discoveries with classroom teachers and their students? (2) What benefits result through interactions among teachers from highly diverse districts and backgrounds with research scientists, and what benefits do the scientists gain from participation? (3) How can the E2C format serve as a model for other research institution-school district partnerships as a mechanism for broader dissemination of scientific discoveries? E2C workshops have linked LDEO scientists from diverse research specialties-seismology, marine geology, paleoclimatology, ocean drilling, dendrochronology, remote sensing, impact craters, and others-with teachers from schools in the New York metropolitan area. Through the workshops, we have trained teachers to enhance content knowledge in the Earth Sciences and develop skills to incorporate new technologies. We have made a special effort to increase the teaching competency of K-12 Earth Sciences educators serving in schools with high numbers of students from underrepresented groups, thereby providing role models to attract students into science and math careers. E2C sponsored Earth Science Teachers Conferences, bringing together educators from New York and New

  7. Parallel computing in enterprise modeling.

    Energy Technology Data Exchange (ETDEWEB)

    Goldsby, Michael E.; Armstrong, Robert C.; Shneider, Max S.; Vanderveen, Keith; Ray, Jaideep; Heath, Zach; Allan, Benjamin A.

    2008-08-01

    This report presents the results of our efforts to apply high-performance computing to entity-based simulations with a multi-use plugin for parallel computing. We use the term 'entity-based simulation' to describe a class of simulation which includes both discrete event simulation and agent-based simulation. What simulations of this class share, and what differs from more traditional models, is that the result sought is emergent from a large number of contributing entities. Logistics, economic and social simulations are members of this class, where things or people are organized or self-organize to produce a solution. Entity-based problems never have an a priori ergodic principle that will greatly simplify calculations. Because the results of entity-based simulations can only be realized at scale, scalable computing is de rigueur for large problems. Having said that, the absence of a spatial organizing principle makes the decomposition of the problem onto processors problematic. In addition, practitioners in this domain commonly use the Java programming language, which presents its own problems in a high-performance setting. The plugin we have developed, called the Parallel Particle Data Model, overcomes both of these obstacles and is now being used by two Sandia frameworks: the Decision Analysis Center, and the Seldon social simulation facility. While the ability to engage U.S.-sized problems is now available to the Decision Analysis Center, this plugin is central to the success of Seldon. Because Seldon relies on computationally intensive cognitive sub-models, this work is necessary to achieve the scale necessary for realistic results. With the recent upheavals in the financial markets, and the inscrutability of terrorist activity, this simulation domain will likely need a capability with ever greater fidelity. High-performance computing will play an important part in enabling that greater fidelity.

  8. Atomistic computer simulations a practical guide

    CERN Document Server

    Brazdova, Veronika

    2013-01-01

    Many books explain the theory of atomistic computer simulations; this book teaches you how to run them. This introductory "how to" title enables readers to understand, plan, run, and analyze their own independent atomistic simulations, and decide which method to use and which questions to ask in their research project. It is written in a clear and precise language, focusing on a thorough understanding of the concepts behind the equations and how these are used in the simulations. As a result, readers will learn how to design the computational model and which parameters o

  9. Fel simulations using distributed computing

    NARCIS (Netherlands)

    Einstein, J.; Biedron, S.G.; Freund, H.P.; Milton, S.V.; Van Der Slot, P. J M; Bernabeu, G.

    2016-01-01

    While simulation tools are available and have been used regularly for simulating light sources, including Free-Electron Lasers, the increasing availability and lower cost of accelerated computing open up new opportunities. This paper highlights a method of how accelerating and parallelizing code

  10. CUBESIM, Hypercube and Denelcor Hep Parallel Computer Simulation

    International Nuclear Information System (INIS)

    Dunigan, T.H.

    1988-01-01

    1 - Description of program or function: CUBESIM is a set of subroutine libraries and programs for the simulation of message-passing parallel computers and shared-memory parallel computers. Subroutines are supplied to simulate the Intel hypercube and the Denelcor HEP parallel computers. The system permits a user to develop and test parallel programs written in C or FORTRAN on a single processor. The user may alter such hypercube parameters as message startup times, packet size, and the computation-to-communication ratio. The simulation generates a trace file that can be used for debugging, performance analysis, or graphical display. 2 - Method of solution: The CUBESIM simulator is linked with the user's parallel application routines to run as a single UNIX process. The simulator library provides a small operating system to perform process and message management. 3 - Restrictions on the complexity of the problem: Up to 128 processors can be simulated with a virtual memory limit of 6 million bytes. Up to 1000 processes can be simulated

  11. Accelerator simulation using computers

    International Nuclear Information System (INIS)

    Lee, M.; Zambre, Y.; Corbett, W.

    1992-01-01

    Every accelerator or storage ring system consists of a charged particle beam propagating through a beam line. Although a number of computer programs exist that simulate the propagation of a beam in a given beam line, only a few provide the capabilities for designing, commissioning and operating the beam line. This paper shows how a "multi-track" simulation and analysis code can be used for these applications

  12. Performance of Cloud Computing Centers with Multiple Priority Classes

    NARCIS (Netherlands)

    Ellens, W.; Zivkovic, Miroslav; Akkerboom, J.; Litjens, R.; van den Berg, Hans Leo

    In this paper we consider the general problem of resource provisioning within cloud computing. We analyze the problem of how to allocate resources to different clients such that the service level agreements (SLAs) for all of these clients are met. A model with multiple service request classes

  13. Computer Simulation in Information and Communication Engineering

    CERN Multimedia

    Anton Topurov

    2005-01-01

    CSICE'05 Sofia, Bulgaria 20th - 22nd October, 2005 On behalf of the International Scientific Committee, we would like to invite you all to Sofia, the capital city of Bulgaria, to the International Conference in Computer Simulation in Information and Communication Engineering CSICE'05. The Conference is aimed at facilitating the exchange of experience in the field of computer simulation gained not only in traditional fields (Communications, Electronics, Physics...) but also in the areas of biomedical engineering, environment, industrial design, etc. The objective of the Conference is to bring together lecturers, researchers and practitioners from different countries, working in the fields of computer simulation in information engineering, in order to exchange information and bring new contributions to this important field of engineering design and education. The Conference will bring you the latest ideas and developments of the tools for computer simulation directly from their inventors. Contribution describ...

  14. Computer simulation of the Charpy V-notch toughness test

    International Nuclear Information System (INIS)

    Norris, D.M. Jr.

    1977-01-01

    The dynamic Charpy V-notch test was simulated on a computer. The calculational models (for A-533 Grade B class 1 steel) used both a rounded and a flat-tipped striker. The notch stress/strain state was found to be independent of the three-point loading type and was most strongly correlated with notch-opening displacement. The dynamic stress/strain state at the time of fracture initiation was obtained by comparing the calculated deformed shape with that obtained in interrupted Charpy V-notch tests where cracking had started. The calculation was also compared with stress/strain states calculated in other geometries at failure. The distribution and partition of specimen energy were calculated, and adiabatic heating and strain rate are discussed

  15. Fully automatic guidance and control for rotorcraft nap-of-the-Earth flight following planned profiles. Volume 1: Real-time piloted simulation

    Science.gov (United States)

    Clement, Warren F.; Gorder, Peter J.; Jewell, Wayne F.

    1991-01-01

    Developing a single-pilot, all-weather nap-of-the-earth (NOE) capability requires fully automatic NOE (ANOE) navigation and flight control. Innovative guidance and control concepts are investigated in a four-fold research effort that: (1) organizes the on-board computer-based storage and real-time updating of NOE terrain profiles and obstacles in course-oriented coordinates indexed to the mission flight plan; (2) defines a class of automatic anticipative pursuit guidance algorithms and necessary data preview requirements to follow the vertical, lateral, and longitudinal guidance commands dictated by the updated flight profiles; (3) automates a decision-making process for unexpected obstacle avoidance; and (4) provides several rapid response maneuvers. Acquired knowledge from the sensed environment is correlated with the prior knowledge of the recorded environment (terrain, cultural features, threats, and targets), which is then used to determine an appropriate evasive maneuver if a nonconformity of the sensed and recorded environments is observed. This four-fold research effort was evaluated in both fixed-base and moving-base real-time piloted simulations, thereby providing a practical demonstration for evaluating pilot acceptance of the automated concepts, supervisory override, manual operation, and re-engagement of the automatic system. Volume one describes the major components of the guidance and control laws as well as the results of the piloted simulations. Volume two describes the complete mathematical model of the fully automatic guidance system for rotorcraft NOE flight following planned flight profiles.

  16. Earth Model with Laser Beam Simulating Seismic Ray Paths.

    Science.gov (United States)

    Ryan, John Arthur; Handzus, Thomas Jay, Jr.

    1988-01-01

    Described is a simple device that uses a laser beam to simulate P waves. It allows students to follow ray paths, reflections and refractions within the earth. Included is a set of exercises that lead students through the steps by which the presence of the outer and inner cores can be recognized. (Author/CW)

  17. Using Laptop Computers in Class: A Student Motivation Perspective

    Science.gov (United States)

    Houle, Philip A.; Reed, Diana; Vaughan, Amy Grace; Clayton, Suzanne R.

    2013-01-01

    This study examined the reasons why students choose to take laptop computers into college classes. The model involved individual student choice in terms of opportunity, ability and motivation. The resulting model demonstrated how some (primary) factors, such as effective learning, directly impact the laptop usage choice, and other factors…

  18. Computational simulation of concurrent engineering for aerospace propulsion systems

    Science.gov (United States)

    Chamis, C. C.; Singhal, S. N.

    1992-01-01

    Results are summarized of an investigation to assess the infrastructure available and the technology readiness in order to develop computational simulation methods/software for concurrent engineering. These results demonstrate that development of computational simulations methods for concurrent engineering is timely. Extensive infrastructure, in terms of multi-discipline simulation, component-specific simulation, system simulators, fabrication process simulation, and simulation of uncertainties - fundamental in developing such methods, is available. An approach is recommended which can be used to develop computational simulation methods for concurrent engineering for propulsion systems and systems in general. Benefits and facets needing early attention in the development are outlined.

  19. Computational simulation for concurrent engineering of aerospace propulsion systems

    Science.gov (United States)

    Chamis, C. C.; Singhal, S. N.

    1993-01-01

    Results are summarized for an investigation to assess the infrastructure available and the technology readiness in order to develop computational simulation methods/software for concurrent engineering. These results demonstrate that development of computational simulation methods for concurrent engineering is timely. Extensive infrastructure, in terms of multi-discipline simulation, component-specific simulation, system simulators, fabrication process simulation, and simulation of uncertainties--fundamental to develop such methods, is available. An approach is recommended which can be used to develop computational simulation methods for concurrent engineering of propulsion systems and systems in general. Benefits and issues needing early attention in the development are outlined.

  20. Numerical simulation of earth fissures caused by overly aquifer exploitation at Guangming Village, China

    Science.gov (United States)

    Ye, S.; Franceschini, A.; Zhang, Y.; Janna, C.; Gong, X.; Yu, J.; Teatini, P.

    2017-12-01

    Earth fissures accompanying anthropogenic land subsidence due to aquifer over-exploitation create significant geohazards in China. In the framework of an efficient and safe management of groundwater, numerical models represent a unique scientific approach to predict the generation and development of earth fissures. However, the common geomechanical simulators fail to reproduce fissure development because, due to compatibility conditions, they cannot be effectively applied in discontinuous mechanics. We present an innovative modelling approach for the simulation of fissure development. Firstly, a regional 3D groundwater model is calibrated on available piezometric records; secondly, the regional outcome is used to define the boundary conditions of a local 3D groundwater model developed at the fissure scale and implementing a refined discretization of the local hydrogeologic setting; finally, the pressure changes are used as the forcing factor in a local 3D geomechanical model, which combines Finite Elements and Interface Elements to simulate the deformation of the continuous aquifer system and the generation and sliding/opening of earth fissures. The approach has been applied to simulate the earth fissure at Guangming Village in Wuxi, China, with land subsidence of more than 1 m caused by the overexploitation of the second confined aquifer. The first earth fissure was observed in 1998. It developed rapidly from 1998 to 2007. The domain addressed by the local simulations is 2 km wide and 5 km long. The thickness of the aquifer system ranges from 0 m, in the proximity of a mountain ridge southward, to 210 m northward and includes a phreatic aquifer, the first and second confined aquifers, and four aquitards. The simulations spanned the period from 1980, i.e. before the inception of large groundwater withdrawals, to 2015. The modelling results highlight that the earth fissures at Guangming Village have been caused by tension and shear, which developed from the land surface

  1. Computer-Based Simulation Games in Public Administration Education

    OpenAIRE

    Kutergina Evgeniia

    2017-01-01

    Computer simulation, an active learning technique, is now one of the advanced pedagogical technologies. The use of simulation games in the educational process allows students to gain a firsthand understanding of the processes of real life. Public-administration, public-policy and political-science courses increasingly adopt simulation games in universities worldwide. Besides person-to-person simulation games, there are computer-based simulations in public-administration education. Currently...

  2. Inversion based on computational simulations

    International Nuclear Information System (INIS)

    Hanson, K.M.; Cunningham, G.S.; Saquib, S.S.

    1998-01-01

    A standard approach to solving inversion problems that involve many parameters uses gradient-based optimization to find the parameters that best match the data. The authors discuss enabling techniques that facilitate application of this approach to large-scale computational simulations, which are the only way to investigate many complex physical phenomena. Such simulations may not seem to lend themselves to calculation of the gradient with respect to numerous parameters. However, adjoint differentiation allows one to efficiently compute the gradient of an objective function with respect to all the variables of a simulation. When combined with advanced gradient-based optimization algorithms, adjoint differentiation permits one to solve very large problems of optimization or parameter estimation. These techniques will be illustrated through the simulation of the time-dependent diffusion of infrared light through tissue, which has been used to perform optical tomography. The techniques discussed have a wide range of applicability to modeling including the optimization of models to achieve a desired design goal
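
    To make the adjoint idea concrete: for a linear time-stepping simulation u_{k+1} = A u_k, the gradient of a data-misfit objective with respect to the initial state is obtained by one backward sweep with A^T, at a cost independent of the number of parameters. A self-contained toy sketch (not the optical-tomography code; all values illustrative):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n, steps = 20, 50
    A = np.eye(n) + 0.01 * rng.standard_normal((n, n))   # toy one-step propagator
    u0 = rng.standard_normal(n)                          # parameters: initial state
    data = rng.standard_normal(n)                        # synthetic observations

    def forward(u0):
        u = u0.copy()
        for _ in range(steps):
            u = A @ u
        return u

    def objective(u0):
        r = forward(u0) - data
        return 0.5 * r @ r

    def gradient_adjoint(u0):
        """Backward (adjoint) sweep: lambda_N = residual, lambda_k = A^T lambda_{k+1}."""
        lam = forward(u0) - data
        for _ in range(steps):
            lam = A.T @ lam
        return lam                                       # dJ/du0

    # check one component against a finite-difference derivative
    g = gradient_adjoint(u0)
    e = np.zeros(n); e[3] = 1.0
    fd = (objective(u0 + 1e-6 * e) - objective(u0 - 1e-6 * e)) / 2e-6
    print(g[3], fd)   # the two values should agree to several digits
    ```

    The same backward-sweep structure is what makes gradient-based optimization feasible when the simulation has thousands or millions of parameters.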

  3. Computational Environments and Analysis methods available on the NCI High Performance Computing (HPC) and High Performance Data (HPD) Platform

    Science.gov (United States)

    Evans, B. J. K.; Foster, C.; Minchin, S. A.; Pugh, T.; Lewis, A.; Wyborn, L. A.; Evans, B. J.; Uhlherr, A.

    2014-12-01

    The National Computational Infrastructure (NCI) has established a powerful in-situ computational environment to enable both high performance computing and data-intensive science across a wide spectrum of national environmental data collections - in particular climate, observational data and geoscientific assets. This paper examines 1) the computational environments that support the modelling and data processing pipelines, 2) the analysis environments and methods to support data analysis, and 3) the progress in addressing harmonisation of the underlying data collections for future transdisciplinary research that enables accurate climate projections. NCI makes available 10+ PB major data collections from both the government and research sectors based on six themes: 1) weather, climate, and earth system science model simulations, 2) marine and earth observations, 3) geosciences, 4) terrestrial ecosystems, 5) water and hydrology, and 6) astronomy, social and biosciences. Collectively they span the lithosphere, crust, biosphere, hydrosphere, troposphere, and stratosphere. The data is largely sourced from NCI's partners (which include the custodians of many of the national scientific records), major research communities, and collaborating overseas organisations. The data is accessible within an integrated HPC-HPD environment - a 1.2 PFlop supercomputer (Raijin), an HPC-class 3000-core OpenStack cloud system and several highly connected large scale and high-bandwidth Lustre filesystems. This computational environment supports a catalogue of integrated reusable software and workflows from earth system and ecosystem modelling, weather research, satellite and other observed data processing and analysis. To enable transdisciplinary research on this scale, data needs to be harmonised so that researchers can readily apply techniques and software across the corpus of data available and not be constrained to work within artificial disciplinary boundaries. Future challenges will

  4. Computer-Based Simulation Games in Public Administration Education

    Directory of Open Access Journals (Sweden)

    Kutergina Evgeniia

    2017-12-01

    Full Text Available Computer simulation, an active learning technique, is now one of the advanced pedagogical technologies. The use of simulation games in the educational process allows students to gain a firsthand understanding of the processes of real life. Public-administration, public-policy and political-science courses increasingly adopt simulation games in universities worldwide. Besides person-to-person simulation games, there are computer-based simulations in public-administration education. Currently in Russia the use of computer-based simulation games in Master of Public Administration (MPA) curricula is quite limited. This paper focuses on computer-based simulation games for students of MPA programmes. Our aim was to analyze outcomes of implementing such games in MPA curricula. We have done so by (1) developing three computer-based simulation games about allocating public finances, (2) testing the games in the learning process, and (3) conducting a post-test examination to evaluate the effect of simulation games on students’ knowledge of municipal finances. This study was conducted in the National Research University Higher School of Economics (HSE) and in the Russian Presidential Academy of National Economy and Public Administration (RANEPA) during the period of September to December 2015, in Saint Petersburg, Russia. Two groups of students were randomly selected in each university and then randomly allocated either to the experimental or the control group. In control groups (n=12 in HSE, n=13 in RANEPA) students had traditional lectures. In experimental groups (n=12 in HSE, n=13 in RANEPA) students played three simulation games apart from traditional lectures. This exploratory research shows that the use of computer-based simulation games in MPA curricula can improve students’ outcomes by 38%. In general, the experimental groups had better performances on the post-test examination (Figure 2). Students in the HSE experimental group had 27.5% better

  5. Journal of Earth System Science | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    Journal of Earth System Science; Volume 121; Issue 3 ... The failure of atmospheric general circulation models (AGCMs) forced by ... Centre for Mathematical Modelling and Computer Simulation, Bangalore 560 037, India.

  6. Software Engineering for Scientific Computer Simulations

    Science.gov (United States)

    Post, Douglass E.; Henderson, Dale B.; Kendall, Richard P.; Whitney, Earl M.

    2004-11-01

    Computer simulation is becoming a very powerful tool for analyzing and predicting the performance of fusion experiments. Simulation efforts are evolving from including only a few effects to many effects, from small teams with a few people to large teams, and from workstations and small processor count parallel computers to massively parallel platforms. Successfully making this transition requires attention to software engineering issues. We report on the conclusions drawn from a number of case studies of large scale scientific computing projects within DOE, academia and the DoD. The major lessons learned include attention to sound project management including setting reasonable and achievable requirements, building a good code team, enforcing customer focus, carrying out verification and validation and selecting the optimum computational mathematics approaches.

  7. Qualitative analysis of the rare earth element by simulation of inductively coupled plasma emission spectra

    International Nuclear Information System (INIS)

    Hashimoto, M.S.; Tobishima, Taeko; Kamitake, Seigo; Yasuda, Kazuo.

    1985-01-01

    Emission lines for the qualitative analysis of rare earth elements by a simulation technique for ICP spectra were proposed. The spectra were simulated by employing a Gaussian profile (or a Lorentzian at high concentrations). The simulated spectra corresponded quite well with the observed ones. The emission lines were selected so that the interference was as small as possible. The present qualitative analysis is based on a pattern recognition method in which observed intensity ratios of the emission lines of each element are compared with those of a single analyte element. The qualitative analysis was performed for twelve standard solutions containing a single rare earth element and for eight standard solutions containing an element other than the rare earth elements. The selection of the emission lines and the algorithm of the present qualitative analysis were justified. (author)
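
    A simulated emission spectrum of the kind described is just a sum of Gaussian line profiles. In the sketch below the wavelengths, intensities and line width are illustrative placeholders, not tabulated rare-earth lines:

    ```python
    import numpy as np

    def gaussian_line(wl, center, intensity, fwhm):
        """Single emission line with a Gaussian profile."""
        sigma = fwhm / (2.0 * np.sqrt(2.0 * np.log(2.0)))
        return intensity * np.exp(-0.5 * ((wl - center) / sigma) ** 2)

    def simulate_spectrum(wl, lines, fwhm=0.02):
        """Spectrum as the sum of Gaussian profiles of all listed lines.

        lines: iterable of (center_wavelength_nm, relative_intensity) pairs;
        at high concentrations a Lorentzian profile would be used instead.
        """
        spectrum = np.zeros_like(wl)
        for center, intensity in lines:
            spectrum += gaussian_line(wl, center, intensity, fwhm)
        return spectrum

    wl = np.linspace(400.0, 402.0, 2000)                       # nm
    spectrum = simulate_spectrum(wl, [(400.80, 1.0), (401.23, 0.4)])
    ```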

  8. Automatic temperature computation for realistic IR simulation

    Science.gov (United States)

    Le Goff, Alain; Kersaudy, Philippe; Latger, Jean; Cathala, Thierry; Stolte, Nilo; Barillot, Philippe

    2000-07-01

    Polygon temperature computation in 3D virtual scenes is fundamental for IR image simulation. This article describes in detail the temperature calculation software and its current extensions, briefly presented in [1]. This software, called MURET, is used by the simulation workshop CHORALE of the French DGA. MURET is a one-dimensional thermal software, which accurately takes into account the material thermal attributes of the three-dimensional scene and the variation of the environment characteristics (atmosphere) as a function of time. Concerning the environment, absorbed incident fluxes are computed wavelength by wavelength, every half hour, during the 24 hours before the time of the simulation. For each polygon, incident fluxes are composed of direct solar fluxes and sky illumination (including diffuse solar fluxes). Concerning the materials, classical thermal attributes such as conductivity, absorption, spectral emissivity, density, specific heat, thickness and convection coefficients are associated with several layers and taken into account. In the future, MURET will be able to simulate permeable natural materials (water influence) and vegetation natural materials (woods). This model of thermal attributes yields a very accurate polygon temperature computation for the complex 3D databases often found in CHORALE simulations. The kernel of MURET consists of an efficient ray tracer, which computes the history (over 24 hours) of the shadowed parts of the 3D scene, and a library responsible for the thermal computations. The main originality concerns the way the heating fluxes are computed. Using ray tracing, the flux received at each 3D point of the scene accurately takes into account the masking (hidden surfaces) between objects. In this way, this library can supply other thermal modules, such as a thermal shadows computation tool.
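
    The one-dimensional thermal model described (layered material, time-varying absorbed flux and convection at the exposed surface) can be illustrated with an explicit finite-difference update. This is a generic sketch, not MURET; the material values, grid and flux are placeholders:

    ```python
    import numpy as np

    def step(T, dt, dx, k, rho_c, absorbed_flux, h_conv, T_air):
        """One explicit time step of 1D conduction in a single-material slab.

        T             : node temperatures [K], T[0] is the exposed surface
        k, rho_c      : conductivity [W/m/K] and volumetric heat capacity [J/m^3/K]
        absorbed_flux : solar + sky flux absorbed at the surface [W/m^2]
        h_conv, T_air : convection coefficient [W/m^2/K] and air temperature [K]
        """
        alpha = k / rho_c
        Tn = T.copy()
        # interior nodes: plain diffusion
        Tn[1:-1] = T[1:-1] + alpha * dt / dx**2 * (T[2:] - 2.0 * T[1:-1] + T[:-2])
        # surface node: energy balance of a half cell (absorbed flux, convection, conduction)
        q = absorbed_flux - h_conv * (T[0] - T_air) - k * (T[0] - T[1]) / dx
        Tn[0] = T[0] + dt * q / (rho_c * dx / 2.0)
        Tn[-1] = Tn[-2]    # adiabatic back face
        return Tn

    # placeholder material (roughly concrete-like); a 24 h warm-up would loop over this step
    T = np.full(50, 293.15)
    T = step(T, dt=1.0, dx=0.01, k=1.5, rho_c=2.0e6, absorbed_flux=600.0,
             h_conv=10.0, T_air=293.15)
    ```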

  9. Discrete Event Simulation Computers can be used to simulate the ...

    Indian Academy of Sciences (India)

    IAS Admin

    people who use computers every moment of their waking lives, others even ... How is discrete event simulation different from other kinds of simulation? ... time, energy consumption .... Schedule the CustomerDeparture event for this customer.
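
    A discrete event simulator is essentially a clock plus a priority queue of pending events, exactly as in the "schedule the CustomerDeparture event" step mentioned above. A minimal single-server queue sketch (arrival and service rates are illustrative):

    ```python
    import heapq
    import random

    random.seed(0)
    events = []                      # priority queue of (time, event_type, customer_id)
    heapq.heappush(events, (random.expovariate(1.0), "arrival", 0))

    clock, next_id, busy, waiting, served = 0.0, 1, False, [], 0
    while events and clock < 100.0:
        clock, kind, cid = heapq.heappop(events)
        if kind == "arrival":
            # schedule the next arrival
            heapq.heappush(events, (clock + random.expovariate(1.0), "arrival", next_id))
            next_id += 1
            if busy:
                waiting.append(cid)
            else:
                busy = True
                # schedule the CustomerDeparture event for this customer
                heapq.heappush(events, (clock + random.expovariate(1.2), "departure", cid))
        else:                        # departure: server becomes free or takes the next customer
            served += 1
            if waiting:
                nxt = waiting.pop(0)
                heapq.heappush(events, (clock + random.expovariate(1.2), "departure", nxt))
            else:
                busy = False

    print(f"simulated {clock:.1f} time units, served {served} customers")
    ```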

  10. A hybrid method for the computation of quasi-3D seismograms.

    Science.gov (United States)

    Masson, Yder; Romanowicz, Barbara

    2013-04-01

    The development of powerful computer clusters and efficient numerical computation methods, such as the Spectral Element Method (SEM), has made possible the computation of seismic wave propagation in a heterogeneous 3D earth. However, the cost of these computations is still problematic for global scale tomography, which requires hundreds of such simulations. Part of the ongoing research effort is dedicated to the development of faster modeling methods based on the spectral element method. Capdeville et al. (2002) proposed to couple SEM simulations with normal modes calculation (C-SEM). Nissen-Meyer et al. (2007) used 2D SEM simulations to compute 3D seismograms in a 1D earth model. Thanks to these developments, and for the first time, Lekic et al. (2011) developed a 3D global model of the upper mantle using SEM simulations. At the local and continental scale, adjoint tomography, which uses a large number of SEM simulations, can be implemented on current computers (Tape, Liu et al. 2009). Due to their smaller size, these models offer higher resolution. They provide us with images of the crust and the upper part of the mantle. In an attempt to teleport such local adjoint tomographic inversions into the deep earth, we are developing a hybrid method where SEM computations are limited to a region of interest within the earth. That region can have an arbitrary shape and size. Outside this region, the seismic wavefield is extrapolated to obtain synthetic data at the Earth's surface. A key feature of the method is the use of a time reversal mirror to inject the wavefield induced by a distant seismic source into the region of interest (Robertsson and Chapman 2000). We compute synthetic seismograms as follows: inside the region of interest, we use the regional spectral element software RegSEM to compute wave propagation in 3D. Outside this region, the wavefield is extrapolated to the surface by convolution with the Green's functions from the mirror to the seismic stations. For now, these

  11. Earth System Modeling 2.0: A Blueprint for Models That Learn From Observations and Targeted High-Resolution Simulations

    Science.gov (United States)

    Schneider, Tapio; Lan, Shiwei; Stuart, Andrew; Teixeira, João.

    2017-12-01

    Climate projections continue to be marred by large uncertainties, which originate in processes that need to be parameterized, such as clouds, convection, and ecosystems. But rapid progress is now within reach. New computational tools and methods from data assimilation and machine learning make it possible to integrate global observations and local high-resolution simulations in an Earth system model (ESM) that systematically learns from both and quantifies uncertainties. Here we propose a blueprint for such an ESM. We outline how parameterization schemes can learn from global observations and targeted high-resolution simulations, for example, of clouds and convection, through matching low-order statistics between ESMs, observations, and high-resolution simulations. We illustrate learning algorithms for ESMs with a simple dynamical system that shares characteristics of the climate system; and we discuss the opportunities the proposed framework presents and the challenges that remain to realize it.

  12. Launch Site Computer Simulation and its Application to Processes

    Science.gov (United States)

    Sham, Michael D.

    1995-01-01

    This paper provides an overview of computer simulation, the Lockheed-developed STS Processing Model, and the application of computer simulation to a wide range of processes. The STS Processing Model is an icon-driven model that uses commercial off-the-shelf software and a Macintosh personal computer. While it usually takes one year to process and launch 8 space shuttles, with the STS Processing Model this process is computer simulated in about 5 minutes. Facilities, orbiters, or ground support equipment can be added or deleted and the impact on launch rate, facility utilization, or other factors measured as desired. This same computer simulation technology can be used to simulate manufacturing, engineering, commercial, or business processes. The technology does not require an 'army' of software engineers to develop and operate, but instead can be used by the layman with only a minimal amount of training. Instead of making changes to a process and realizing the results after the fact, with computer simulation, changes can be made and processes perfected before they are implemented.

  13. Quantum chemistry simulation on quantum computers: theories and experiments.

    Science.gov (United States)

    Lu, Dawei; Xu, Boruo; Xu, Nanyang; Li, Zhaokai; Chen, Hongwei; Peng, Xinhua; Xu, Ruixue; Du, Jiangfeng

    2012-07-14

    It has been claimed that quantum computers can mimic quantum systems efficiently in the polynomial scale. Traditionally, those simulations are carried out numerically on classical computers, which are inevitably confronted with the exponential growth of required resources, with the increasing size of quantum systems. Quantum computers avoid this problem, and thus provide a possible solution for large quantum systems. In this paper, we first discuss the ideas of quantum simulation, the background of quantum simulators, their categories, and the development in both theories and experiments. We then present a brief introduction to quantum chemistry evaluated via classical computers followed by typical procedures of quantum simulation towards quantum chemistry. Reviewed are not only theoretical proposals but also proof-of-principle experimental implementations, via a small quantum computer, which include the evaluation of the static molecular eigenenergy and the simulation of chemical reaction dynamics. Although the experimental development is still behind the theory, we give prospects and suggestions for future experiments. We anticipate that in the near future quantum simulation will become a powerful tool for quantum chemistry over classical computations.

  14. The role of computer simulation in nuclear technologies development

    International Nuclear Information System (INIS)

    Tikhonchev, M.Yu.; Shimansky, G.A.; Lebedeva, E.E.; Lichadeev, V. V.; Ryazanov, D.K.; Tellin, A.I.

    2001-01-01

    In the report the role and purposes of computer simulation in nuclear technologies development are discussed. The authors consider such applications of computer simulation as nuclear safety research, optimization of technical and economic parameters of an operating nuclear plant, planning and support of reactor experiments, research and design of new devices and technologies, and design and development of 'simulators' for operating personnel training. Among the marked applications, the following aspects of computer simulation are discussed in the report: neutron-physical, thermal and hydrodynamic models, simulation of isotope structure change and damage dose accumulation for materials under irradiation, and simulation of reactor control structures. (authors)

  15. Computational steering of GEM based detector simulations

    Science.gov (United States)

    Sheharyar, Ali; Bouhali, Othmane

    2017-10-01

    Gas based detector R&D relies heavily on full simulation of detectors and their optimization before final prototypes can be built and tested. These simulations, in particular those with complex scenarios such as high detector voltages or gases with larger gains, are computationally intensive and may take several days or weeks to complete. These long-running simulations usually run on high-performance computers in batch mode. If the results lead to unexpected behavior, then the simulation might be rerun with different parameters. However, the simulations (or jobs) may have to wait in a queue until they get a chance to run again, because the supercomputer is a shared resource that maintains a queue of other user programs as well and executes them as time and priorities permit. This can result in inefficient resource utilization and an increase in the turnaround time for the scientific experiment. To overcome this issue, monitoring the behavior of a simulation while it is running (live) is essential. In this work, we employ the computational steering technique by coupling the detector simulations with a visualization package named VisIt to enable the exploration of the live data as it is produced by the simulation.

  16. Highway traffic simulation on multi-processor computers

    Energy Technology Data Exchange (ETDEWEB)

    Hanebutte, U.R.; Doss, E.; Tentner, A.M.

    1997-04-01

    A computer model has been developed to simulate highway traffic for various degrees of automation with a high level of fidelity in regard to driver control and vehicle characteristics. The model simulates vehicle maneuvering in a multi-lane highway traffic system and allows for the use of Intelligent Transportation System (ITS) technologies such as an Automated Intelligent Cruise Control (AICC). The structure of the computer model facilitates the use of parallel computers for the highway traffic simulation, since domain decomposition techniques can be applied in a straightforward fashion. In this model, the highway system (i.e. a network of road links) is divided into multiple regions; each region is controlled by a separate link manager residing on an individual processor. A graphical user interface augments the computer model by allowing for real-time interactive simulation control and interaction with each individual vehicle and road side infrastructure element on each link. Average speed and traffic volume data is collected at user-specified loop detector locations. Further, as a measure of safety, the so-called Time To Collision (TTC) parameter is being recorded.
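
    The Time To Collision measure mentioned above is simply the current gap divided by the closing speed, defined only when the follower is faster than the leader. A hypothetical helper illustrating the computation:

    ```python
    def time_to_collision(gap_m, v_follower, v_leader):
        """Seconds until collision if neither vehicle changes speed; None if not closing."""
        closing_speed = v_follower - v_leader          # m/s
        if closing_speed <= 0.0:
            return None                                # the gap is not shrinking
        return gap_m / closing_speed

    print(time_to_collision(30.0, 30.0, 25.0))         # 6.0 s
    ```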

  17. WWC Review of the Report "Conceptualizing Astronomical Scale: Virtual Simulations on Handheld Tablet Computers Reverse Misconceptions." What Works Clearinghouse Single Study Review

    Science.gov (United States)

    What Works Clearinghouse, 2014

    2014-01-01

    The 2014 study, "Conceptualizing Astronomical Scale: Virtual Simulations on Handheld Tablet Computers Reverse Misconceptions," examined the effects of using the true-to-scale (TTS) display mode versus the orrery display mode in the iPad's Solar Walk software application on students' knowledge of the Earth's place in the solar system. The…

  18. The Shortlist Method for fast computation of the Earth Mover's Distance and finding optimal solutions to transportation problems.

    Science.gov (United States)

    Gottschlich, Carsten; Schuhmacher, Dominic

    2014-01-01

    Finding solutions to the classical transportation problem is of great importance, since this optimization problem arises in many engineering and computer science applications. Especially the Earth Mover's Distance is used in a plethora of applications ranging from content-based image retrieval, shape matching, fingerprint recognition, object tracking and phishing web page detection to computing color differences in linguistics and biology. Our starting point is the well-known revised simplex algorithm, which iteratively improves a feasible solution to optimality. The Shortlist Method that we propose substantially reduces the number of candidates inspected for improving the solution, while at the same time balancing the number of pivots required. Tests on simulated benchmarks demonstrate a considerable reduction in computation time for the new method as compared to the usual revised simplex algorithm implemented with state-of-the-art initialization and pivot strategies. As a consequence, the Shortlist Method facilitates the computation of large scale transportation problems in viable time. In addition we describe a novel method for finding an initial feasible solution which we coin Modified Russell's Method.
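To make the transportation-problem view of the Earth Mover's Distance concrete, the sketch below solves a tiny instance with a generic LP solver (scipy.optimize.linprog). It is not the Shortlist Method or the revised simplex implementation of the paper; the weights and ground distances are invented for illustration.

```python
# Small transportation-problem formulation of the Earth Mover's Distance,
# solved with a generic LP solver rather than the Shortlist Method of the paper.
import numpy as np
from scipy.optimize import linprog

a = np.array([0.4, 0.4, 0.2])        # source weights (sum to 1)
b = np.array([0.5, 0.5])             # sink weights (sum to 1)
cost = np.array([[0.0, 1.0],         # ground distances between bins
                 [1.0, 0.0],
                 [2.0, 1.0]])

m, n = cost.shape
c = cost.ravel()                     # objective: sum_ij cost_ij * x_ij

# Row constraints: each source i ships exactly a_i in total.
A_rows = np.zeros((m, m * n))
for i in range(m):
    A_rows[i, i * n:(i + 1) * n] = 1.0
# Column constraints: each sink j receives exactly b_j in total.
A_cols = np.zeros((n, m * n))
for j in range(n):
    A_cols[j, j::n] = 1.0

res = linprog(c,
              A_eq=np.vstack([A_rows, A_cols]),
              b_eq=np.concatenate([a, b]),
              bounds=[(0, None)] * (m * n))
print("EMD =", res.fun)              # optimal transport cost
```

For larger instances, this dense LP formulation quickly becomes the bottleneck, which is exactly the regime the Shortlist Method targets by restricting the candidate columns inspected per pivot.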

  19. Using Office Simulation Software in Teaching Computer Literacy Using Three Sets of Teaching/Learning Activities

    Directory of Open Access Journals (Sweden)

    Azad Ali

    2016-05-01

    Full Text Available The most common course delivery model is based on teacher (knowledge provider - student (knowledge receiver relationship. The most visible symptom of this situation is over-reliance on textbook’s tutorials. This traditional model of delivery reduces teacher flexibility, causes lack of interest among students, and often makes classes boring. Especially this is visible when teaching Computer Literacy courses. Instead, authors of this paper suggest a new active model which is based on MS Office simulation. The proposed model was discussed within the framework of three activities: guided software simulation, instructor-led activities, and self-directed learning activities. The model proposed in the paper of active teaching based on software simulation was proven as more effective than traditional.

  20. Alternative energy technologies an introduction with computer simulations

    CERN Document Server

    Buxton, Gavin

    2014-01-01

    Introduction to Alternative Energy SourcesGlobal WarmingPollutionSolar CellsWind PowerBiofuelsHydrogen Production and Fuel CellsIntroduction to Computer ModelingBrief History of Computer SimulationsMotivation and Applications of Computer ModelsUsing Spreadsheets for SimulationsTyping Equations into SpreadsheetsFunctions Available in SpreadsheetsRandom NumbersPlotting DataMacros and ScriptsInterpolation and ExtrapolationNumerical Integration and Diffe

  1. Large-scale computing techniques for complex system simulations

    CERN Document Server

    Dubitzky, Werner; Schott, Bernard

    2012-01-01

    Complex systems modeling and simulation approaches are being adopted in a growing number of sectors, including finance, economics, biology, astronomy, and many more. Technologies ranging from distributed computing to specialized hardware are explored and developed to address the computational requirements arising in complex systems simulations. The aim of this book is to present a representative overview of contemporary large-scale computing technologies in the context of complex systems simulations applications. The intention is to identify new research directions in this field and

  2. Simulation of classical thermal states on a quantum computer: A transfer-matrix approach

    International Nuclear Information System (INIS)

    Yung, Man-Hong; Nagaj, Daniel; Whitfield, James D.; Aspuru-Guzik, Alan

    2010-01-01

We present a hybrid quantum-classical algorithm to simulate thermal states of classical Hamiltonians on a quantum computer. Our scheme employs a sequence of locally controlled rotations, building up the desired state by adding qubits one at a time. We identify a class of classical models for which our method is efficient and avoids the potential exponential overheads encountered by Grover-like or quantum Metropolis schemes. Our algorithm also gives an exponential advantage for two-dimensional Ising models with a magnetic field on a square lattice, compared with Zalka's previously known algorithm.
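As background for the transfer-matrix idea invoked in the title, the purely classical sketch below evaluates the thermal partition function of a 1D Ising chain with the standard 2x2 transfer matrix. It is not the authors' quantum algorithm; coupling, field, temperature, and chain length are toy values.

```python
# Classical transfer-matrix evaluation of the partition function of a 1D Ising
# chain with a field -- an illustration of the "transfer-matrix" idea in the
# title, not the authors' hybrid quantum-classical algorithm.
import numpy as np

J, h, beta, N = 1.0, 0.5, 0.7, 20    # coupling, field, inverse temperature, sites

def transfer_matrix(J, h, beta):
    """2x2 transfer matrix T[s, s'] for spins s, s' in {+1, -1}."""
    spins = [1, -1]
    return np.array([[np.exp(beta * (J * s * sp + h * (s + sp) / 2.0))
                      for sp in spins] for s in spins])

T = transfer_matrix(J, h, beta)
Z = np.trace(np.linalg.matrix_power(T, N))   # periodic boundary conditions
free_energy = -np.log(Z) / (beta * N)
print(f"Z = {Z:.4g}, free energy per site = {free_energy:.4g}")
```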

  3. Design and simulation of a hybrid ventilation system with earth-air heat exchanger

    Energy Technology Data Exchange (ETDEWEB)

    Athienitis, A.K.; Zhao, M. [Concordia Univ., Centre for Building Studies, Montreal, PQ (Canada). Dept. of Building, Civil and Environmental Engineering; Roy, M. [Martin Roy and Associes Group Conseil Inc., Montreal, PQ (Canada)

    2005-07-01

    A simulation study was conducted during the design phase of a new circus building in Montreal which includes a hybrid ventilation system through which fresh air is supplied from an earth-air heat exchanger (EAHE). The EAHE has the potential to satisfy the cooling needs of the building and can also be used to preheat fresh air, thereby satisfying one-third or more of the building's heating needs. Another feature of the building is that it uses displacement ventilation by which the air is supplied at low velocities through large diffusers behind the top level seats or under the seats. In this study, computational fluid dynamics (CFD) simulations were carried out to help size the supply and return units of the heating, ventilating and air conditioning (HVAC) system, as well as the exhaust chimney. The primary objective of the CFD simulation was to determine the maximum velocity and temperature in the seated area to ensure thermal comfort. CFD simulation predictions were found to be in good agreement with preliminary measurements taken in the building. In order to monitor the operation of the system over the next year, the underground ducts were equipped with temperature sensors at several depths into the soil. The energy efficiency of the hybrid HVAC system will be assessed and the velocity and temperature distribution in the theatre will be examined under various operating and energy load conditions. 8 refs., 6 figs.

  4. Computer Game Design Classes: The Students' and Professionals' Perspectives

    Science.gov (United States)

    Swacha, Jakub; Skrzyszewski, Adam; Syslo, Wojciech A.

    2010-01-01

    There are multiple reasons that justify teaching computer game design. Its multi-aspectual nature creates opportunity to develop, at the same time, creativity, technical skills and ability to work in team. Thinking of game design classes, one needs direction on what to focus on so that the students could benefit the most. In this paper, we present…

  5. Earth Science Computational Architecture for Multi-disciplinary Investigations

    Science.gov (United States)

    Parker, J. W.; Blom, R.; Gurrola, E.; Katz, D.; Lyzenga, G.; Norton, C.

    2005-12-01

    Understanding the processes underlying Earth's deformation and mass transport requires a non-traditional, integrated, interdisciplinary, approach dependent on multiple space and ground based data sets, modeling, and computational tools. Currently, details of geophysical data acquisition, analysis, and modeling largely limit research to discipline domain experts. Interdisciplinary research requires a new computational architecture that is optimized to perform complex data processing of multiple solid Earth science data types in a user-friendly environment. A web-based computational framework is being developed and integrated with applications for automatic interferometric radar processing, and models for high-resolution deformation & gravity, forward models of viscoelastic mass loading over short wavelengths & complex time histories, forward-inverse codes for characterizing surface loading-response over time scales of days to tens of thousands of years, and inversion of combined space magnetic & gravity fields to constrain deep crustal and mantle properties. This framework combines an adaptation of the QuakeSim distributed services methodology with the Pyre framework for multiphysics development. The system uses a three-tier architecture, with a middle tier server that manages user projects, available resources, and security. This ensures scalability to very large networks of collaborators. Users log into a web page and have a personal project area, persistently maintained between connections, for each application. Upon selection of an application and host from a list of available entities, inputs may be uploaded or constructed from web forms and available data archives, including gravity, GPS and imaging radar data. The user is notified of job completion and directed to results posted via URLs. Interdisciplinary work is supported through easy availability of all applications via common browsers, application tutorials and reference guides, and worked examples with

  6. The role of computer simulation in nuclear technology development

    International Nuclear Information System (INIS)

Tikhonchev, M.Yu.; Shimansky, G.A.; Lebedeva, E.E.; Lichadeev, V.V.; Ryazanov, D.K.; Tellin, A.I.

    2000-01-01

In the report, the role and purpose of computer simulation in nuclear technology development are discussed. The authors consider such applications of computer simulation as: (a) Nuclear safety research; (b) Optimization of technical and economic parameters of operating nuclear plants; (c) Planning and support of reactor experiments; (d) Research and design of new devices and technologies; (f) Design and development of 'simulators' for operating personnel training. Among these applications, the following aspects of computer simulation are discussed in the report: (g) Neutron-physical, thermal and hydrodynamic models; (h) Simulation of isotope structure change and damage dose accumulation for materials under irradiation; (i) Simulation of reactor control structures. (authors)

  7. Development of computational science in JAEA. R and D of simulation

    International Nuclear Information System (INIS)

    Nakajima, Norihiro; Araya, Fumimasa; Hirayama, Toshio

    2006-01-01

R and D of computational science in JAEA (Japan Atomic Energy Agency) is described. The computing environment, the R and D system in CCSE (Center for Computational Science and e-Systems), joint computational science research in Japan and worldwide, development of computer technologies, some examples of simulation research, the 3-dimensional image vibrational platform system, simulation research on FBR cycle techniques, simulation of large-scale thermal stress for steam generator development, simulation research on fusion energy techniques, development of grid computing technology, simulation research on quantum beam techniques, and biological molecule simulation research are explained. The organization of JAEA, the development of computational science in JAEA, the JAEA network, international collaboration in computational science, and the environment of the ITBL (Information-Technology Based Laboratory) project are illustrated. (S.Y.)

  8. A computer code package for Monte Carlo photon-electron transport simulation Comparisons with experimental benchmarks

    International Nuclear Information System (INIS)

    Popescu, Lucretiu M.

    2000-01-01

A computer code package (PTSIM) for particle transport Monte Carlo simulation was developed using object-oriented techniques of design and programming. A flexible system for simulation of coupled photon-electron transport, facilitating the development of efficient simulation applications, was obtained. For photons, Compton and photoelectric effects, pair production and Rayleigh interactions are simulated, while for electrons a class II condensed-history scheme was considered, in which catastrophic interactions (Moeller electron-electron interaction, bremsstrahlung, etc.) are treated in detail and all other interactions with reduced individual effect on the electron history are grouped together using continuous slowing down approximation and energy straggling theories. Electron angular straggling is simulated using Moliere theory or a mixed model in which scatters at large angles are treated as distinct events. Comparisons with experimental benchmarks for electron transmission, bremsstrahlung emission energy and angular spectra, and dose calculations are presented.
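The basic photon transport step such codes repeat is: sample a free path from the total attenuation coefficient, then choose the interaction type with probability proportional to its partial cross section. The sketch below is a toy single-step version of that idea, not PTSIM's implementation; the attenuation coefficients are invented placeholder values.

```python
# Toy single-step photon Monte Carlo in the spirit of the package described
# above (not its actual implementation): sample a free path from the total
# attenuation coefficient, then pick the interaction from relative coefficients.
import math
import random

# Illustrative attenuation coefficients (1/cm) at some fixed photon energy.
MU = {"photoelectric": 0.02, "compton": 0.15, "rayleigh": 0.01, "pair": 0.0}

def sample_step(rng=random):
    mu_total = sum(MU.values())
    path = -math.log(rng.random()) / mu_total        # exponential free path
    # Choose the interaction with probability proportional to its coefficient.
    xi, cum = rng.random() * mu_total, 0.0
    for name, mu in MU.items():
        cum += mu
        if xi <= cum:
            return path, name
    return path, name

path_cm, interaction = sample_step()
print(f"photon travelled {path_cm:.2f} cm, then underwent {interaction}")
```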

  9. Polymer Composites Corrosive Degradation: A Computational Simulation

    Science.gov (United States)

    Chamis, Christos C.; Minnetyan, Levon

    2007-01-01

A computational simulation of the corrosive durability of polymer composites is presented. The corrosive environment is assumed to govern the polymer composite degradation on a ply-by-ply basis. The degradation is correlated with a measured pH factor and is represented by voids, temperature and moisture, which vary parabolically for voids and linearly for temperature and moisture through the laminate thickness. The simulation is performed by a computational composite mechanics computer code which includes micro, macro, combined stress failure and laminate theories. This allows the simulation to start from constitutive material properties and proceed up to the laminate scale, at which the laminate is exposed to the corrosive environment. Results obtained for one laminate indicate that the ply-by-ply degradation degrades the laminate down to the last ply or the last several plies. Results also demonstrate that the simulation is applicable to other polymer composite systems as well.

  10. 2012 Community Earth System Model (CESM) Tutorial - Proposal to DOE

    Energy Technology Data Exchange (ETDEWEB)

    Holland, Marika [National Center for Atmospheric Research, Boulder, CO (United States); Bailey, David A [National Center for Atmospheric Research, Boulder, CO (United States)

    2013-03-18

    The Community Earth System Model (CESM) is a fully-coupled, global climate model that provides state-of-the-art computer simulations of the Earth's past, present, and future climate states. This document provides the agenda and list of participants for the conference. Web materials for all lectures and practical sessions available from: http://www.cesm.ucar.edu/events/tutorials/073012/ .

  11. Challenges in computational fluid dynamics simulation for the nineties. Various examples of application

    International Nuclear Information System (INIS)

    Chabard, J.P.; Viollet, P.L.

    1991-01-01

Most of the computational fluid dynamics applications encountered at the Research Branch of EDF (DER) deal with thermal exchanges. The development of numerical tools for the simulation of flows, devoted to this class of application, has been under way for 15 years. At the beginning this work was mainly concerned with a good simulation of the dynamics of the flow; now these tools can be used to compute flows with thermal exchanges. The presentation is limited to incompressible, single-phase flows (the DER developments on two-phase flows are discussed in the paper by Hery, Boivin and Viollet in the present issue). First the software developed at DER is presented. Then some applications of these tools to flows with thermal exchanges are discussed. To conclude, the paper treats the general case of CFD codes. The challenges for the coming years are detailed in order to make these tools available for users involved in complex physical modeling [fr

  12. An integrated computational tool for precipitation simulation

    Science.gov (United States)

    Cao, W.; Zhang, F.; Chen, S.-L.; Zhang, C.; Chang, Y. A.

    2011-07-01

    Computer aided materials design is of increasing interest because the conventional approach solely relying on experimentation is no longer viable within the constraint of available resources. Modeling of microstructure and mechanical properties during precipitation plays a critical role in understanding the behavior of materials and thus accelerating the development of materials. Nevertheless, an integrated computational tool coupling reliable thermodynamic calculation, kinetic simulation, and property prediction of multi-component systems for industrial applications is rarely available. In this regard, we are developing a software package, PanPrecipitation, under the framework of integrated computational materials engineering to simulate precipitation kinetics. It is seamlessly integrated with the thermodynamic calculation engine, PanEngine, to obtain accurate thermodynamic properties and atomic mobility data necessary for precipitation simulation.

  13. Structure, dynamics, and function of the monooxygenase P450 BM-3: insights from computer simulations studies

    International Nuclear Information System (INIS)

    Roccatano, Danilo

    2015-01-01

The monooxygenase P450 BM-3 is a NADPH-dependent fatty acid hydroxylase enzyme isolated from the soil bacterium Bacillus megaterium. As a pivotal member of the cytochrome P450 superfamily, it has been intensely studied to further the comprehension of structure–dynamics–function relationships in this class of enzymes. In addition, due to its peculiar properties, it is also a promising enzyme for biochemical and biomedical applications. However, despite these efforts, a full understanding of the enzyme's structure and dynamics has not yet been achieved. Computational studies, particularly molecular dynamics (MD) simulations, have contributed substantially to this endeavor by providing new insights at an atomic level regarding the correlations between structure, dynamics, and function of the protein. This topical review summarizes computational studies based on MD simulations of cytochrome P450 BM-3 and gives an outlook on future directions. (topical review)

  14. GCM simulations of cold dry Snowball Earth atmospheres

    Science.gov (United States)

    Voigt, A.; Held, I.; Marotzke, J.

    2009-12-01

    We use the full-physics atmospheric general circulation model ECHAM5 to investigate cold and virtually dry Snowball Earth atmospheres. These result from specifying sea ice as the surface boundary condition everywhere, corresponding to a frozen aquaplanet, while keeping total solar irradiance at its present-day value of 1365 Wm-2 and setting atmospheric carbon dioxide to 300 ppmv. Here, we present four simulations corresponding to the four possible combinations of enabled or disabled diurnal and seasonal cycles. The aim of this study is twofold. First, we focus on the zonal-mean circulation of Snowball Earth atmospheres, which, due to missing moisture, might constitute an ideal though yet unexplored testbed for theories of atmospheric dynamics. Second, we investigate tropical surface temperatures with an emphasis on the impact of the diurnal and seasonal cycles. This will indicate whether the presence of the diurnal or seasonal cycle would facilitate or anticipate the escape from Snowball Earth conditions when total solar irradiance or atmospheric CO2 levels were increased. The dynamics of the tropical circulation in Snowball Earth atmospheres differs substantially from that in the modern atmosphere. The analysis of the mean zonal momentum budget reveals that the mean flow meridional advection of absolute vorticity is primarily balanced by vertical diffusion of zonal momentum. The contribution of eddies is found to be even smaller than the contribution of mean flow vertical advection of zonal momentum, the latter being usually neglected in theories for the Hadley circulation, at least in its upper tropospheric branch. Suppressing vertical diffusion of horizontal momentum above 850 hPa leads to a stronger Hadley circulation. This behaviour cannot be understood from axisymmetric models of the atmosphere, nor idealized atmospheric general circulation models, which both predict a weakening of the Hadley circulation when the vertical viscosity is decreased globally. We

  15. Computationally efficient SVM multi-class image recognition with confidence measures

    International Nuclear Information System (INIS)

    Makili, Lazaro; Vega, Jesus; Dormido-Canto, Sebastian; Pastor, Ignacio; Murari, Andrea

    2011-01-01

    Typically, machine learning methods produce non-qualified estimates, i.e. the accuracy and reliability of the predictions are not provided. Transductive predictors are very recent classifiers able to provide, simultaneously with the prediction, a couple of values (confidence and credibility) to reflect the quality of the prediction. Usually, a drawback of the transductive techniques for huge datasets and large dimensionality is the high computational time. To overcome this issue, a more efficient classifier has been used in a multi-class image classification problem in the TJ-II stellarator database. It is based on the creation of a hash function to generate several 'one versus the rest' classifiers for every class. By using Support Vector Machines as the underlying classifier, a comparison between the pure transductive approach and the new method has been performed. In both cases, the success rates are high and the computation time with the new method is up to 0.4 times the old one.
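The "one versus the rest" SVM strategy mentioned above is straightforward to reproduce with standard tools. The sketch below uses scikit-learn on synthetic data; it is only an illustration of the classifier structure, not the hash-function construction of the paper, and the TJ-II image database is of course not reproduced here.

```python
# Sketch of a "one versus the rest" multi-class SVM, on synthetic data.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.multiclass import OneVsRestClassifier
from sklearn.svm import SVC

# Synthetic stand-in for the image feature vectors.
X, y = make_classification(n_samples=600, n_features=20, n_informative=10,
                           n_classes=4, n_clusters_per_class=1, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# One binary SVM per class, each trained "class k versus the rest".
clf = OneVsRestClassifier(SVC(kernel="rbf", probability=True))
clf.fit(X_tr, y_tr)
print("success rate:", clf.score(X_te, y_te))
```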

  16. Fluid simulation for computer graphics

    CERN Document Server

    Bridson, Robert

    2008-01-01

    Animating fluids like water, smoke, and fire using physics-based simulation is increasingly important in visual effects, in particular in movies, like The Day After Tomorrow, and in computer games. This book provides a practical introduction to fluid simulation for graphics. The focus is on animating fully three-dimensional incompressible flow, from understanding the math and the algorithms to the actual implementation.

  17. Large-scale simulations of error-prone quantum computation devices

    International Nuclear Information System (INIS)

    Trieu, Doan Binh

    2009-01-01

The theoretical concepts of quantum computation in the idealized and undisturbed case are well understood. However, in practice, all quantum computation devices do suffer from decoherence effects as well as from operational imprecisions. This work assesses the power of error-prone quantum computation devices using large-scale numerical simulations on parallel supercomputers. We present the Juelich Massively Parallel Ideal Quantum Computer Simulator (JUMPIQCS), which simulates a generic quantum computer on gate level. It comprises an error model for decoherence and operational errors. The robustness of various algorithms in the presence of noise has been analyzed. The simulation results show that for large system sizes and long computations it is imperative to actively correct errors by means of quantum error correction. We implemented the 5-, 7-, and 9-qubit quantum error correction codes. Our simulations confirm that using error-prone correction circuits with non-fault-tolerant quantum error correction will always fail, because more errors are introduced than being corrected. Fault-tolerant methods can overcome this problem, provided that the single qubit error rate is below a certain threshold. We incorporated fault-tolerant quantum error correction techniques into JUMPIQCS using Steane's 7-qubit code and determined this threshold numerically. Using the depolarizing channel as the source of decoherence, we find a threshold error rate of (5.2±0.2) × 10^-6. For Gaussian distributed operational over-rotations the threshold lies at a standard deviation of 0.0431±0.0002. We can conclude that quantum error correction is especially well suited for the correction of operational imprecisions and systematic over-rotations. For realistic simulations of specific quantum computation devices we need to extend the generic model to dynamic simulations, i.e. time-dependent Hamiltonian simulations of realistic hardware models. We focus on today's most advanced technology, i
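The threshold behaviour described above can be illustrated in a much simpler, purely classical setting: a 3-bit repetition code under independent bit flips, where encoding helps only below a certain physical error rate. The Monte Carlo sketch below is only that analogy, not the 5-, 7-, or 9-qubit quantum code simulations of the work.

```python
# Classical Monte Carlo illustration of an error-correction threshold, using a
# 3-bit repetition code with majority-vote decoding under independent bit flips.
import random

def logical_error_rate(p, trials=100_000, rng=random):
    fails = 0
    for _ in range(trials):
        flips = sum(rng.random() < p for _ in range(3))
        if flips >= 2:                 # majority vote decodes wrongly
            fails += 1
    return fails / trials

# Below p = 0.5 the encoded (logical) error rate is lower than the physical one;
# above it, encoding makes things worse -- the qualitative threshold picture.
for p in (0.001, 0.01, 0.1, 0.4, 0.5):
    print(f"physical {p:>5}: logical = {logical_error_rate(p):.4f}")
```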

  18. Parallelized computation for computer simulation of electrocardiograms using personal computers with multi-core CPU and general-purpose GPU.

    Science.gov (United States)

    Shen, Wenfeng; Wei, Daming; Xu, Weimin; Zhu, Xin; Yuan, Shizhong

    2010-10-01

    Biological computations like electrocardiological modelling and simulation usually require high-performance computing environments. This paper introduces an implementation of parallel computation for computer simulation of electrocardiograms (ECGs) in a personal computer environment with an Intel CPU of Core (TM) 2 Quad Q6600 and a GPU of Geforce 8800GT, with software support by OpenMP and CUDA. It was tested in three parallelization device setups: (a) a four-core CPU without a general-purpose GPU, (b) a general-purpose GPU plus 1 core of CPU, and (c) a four-core CPU plus a general-purpose GPU. To effectively take advantage of a multi-core CPU and a general-purpose GPU, an algorithm based on load-prediction dynamic scheduling was developed and applied to setting (c). In the simulation with 1600 time steps, the speedup of the parallel computation as compared to the serial computation was 3.9 in setting (a), 16.8 in setting (b), and 20.0 in setting (c). This study demonstrates that a current PC with a multi-core CPU and a general-purpose GPU provides a good environment for parallel computations in biological modelling and simulation studies. Copyright 2010 Elsevier Ireland Ltd. All rights reserved.
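The load-prediction dynamic scheduling idea above can be illustrated, in a language-agnostic way, by splitting the work into small chunks that are handed out as workers become free, so that faster devices naturally take on more of the load. The Python sketch below conveys only that scheduling pattern; the paper itself uses OpenMP and CUDA, and the per-chunk "work" here is a placeholder.

```python
# Minimal dynamic-scheduling sketch: time steps / nodes are split into small
# work units handed to whichever worker is idle next.
from concurrent.futures import ProcessPoolExecutor
import math

def simulate_chunk(chunk):
    """Stand-in for computing body-surface potentials for a block of nodes."""
    start, stop = chunk
    return sum(math.sin(i) ** 2 for i in range(start, stop))

if __name__ == "__main__":
    n_nodes, chunk_size = 1_000_000, 50_000
    chunks = [(i, min(i + chunk_size, n_nodes)) for i in range(0, n_nodes, chunk_size)]
    with ProcessPoolExecutor(max_workers=4) as pool:
        # map() hands chunks to workers as they become free (dynamic scheduling).
        total = sum(pool.map(simulate_chunk, chunks))
    print("aggregate result:", total)
```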

  19. Large Scale Earth's Bow Shock with Northern IMF as Simulated by PIC Code in Parallel with MHD Model

    Science.gov (United States)

    Baraka, Suleiman

    2016-06-01

In this paper, we propose a 3D kinetic model (particle-in-cell, PIC) for the description of the large scale Earth's bow shock. The proposed version is stable and does not require huge or extensive computer resources. Because PIC simulations work with scaled plasma and field parameters, we also propose to validate our code by comparing its results with the available MHD simulations under the same scaled solar wind (SW) and interplanetary magnetic field (IMF) conditions. We report new results from the two models. In both codes the Earth's bow shock position is found to be ≈14.8 R_E along the Sun-Earth line, and ≈29 R_E on the dusk side. Those findings are consistent with past in situ observations. Both simulations reproduce the theoretical jump conditions at the shock. However, the PIC code density and temperature distributions are inflated and slightly shifted sunward when compared to the MHD results. Kinetic electron motions and reflected ions upstream may cause this sunward shift. Species distributions in the foreshock region are depicted within the transition of the shock (measured ≈2 c/ω_pi for Θ_Bn = 90° and M_MS = 4.7) and in the downstream. The size of the foot jump in the magnetic field at the shock is measured to be (1.7 c/ω_pi). In the foreshock region, the thermal velocity is found equal to 213 km s^-1 at 15 R_E and is equal to 63 km s^-1 at 12 R_E (magnetosheath region). Despite the large cell size of the current version of the PIC code, it is able to retain the macrostructure of planetary magnetospheres in a very short time, thus it can be used for pedagogical test purposes. It is also likely complementary to MHD in deepening our understanding of the large scale magnetosphere.
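One standard ingredient of any PIC code is the particle push. The sketch below shows the widely used Boris scheme for a single particle in uniform E and B fields; it is not taken from the authors' code, and the particle, field values, and time step are toy numbers chosen only so the example runs.

```python
# Boris particle push in uniform E and B fields -- a small illustration of the
# kinetic (particle) side of a PIC code, with illustrative values.
import numpy as np

def boris_push(x, v, E, B, q, m, dt):
    """Advance one particle by one time step with the Boris scheme."""
    qmdt2 = q * dt / (2.0 * m)
    v_minus = v + qmdt2 * E                   # first half electric kick
    t = qmdt2 * B                             # rotation vector
    s = 2.0 * t / (1.0 + np.dot(t, t))
    v_prime = v_minus + np.cross(v_minus, t)  # magnetic rotation
    v_plus = v_minus + np.cross(v_prime, s)
    v_new = v_plus + qmdt2 * E                # second half electric kick
    x_new = x + v_new * dt
    return x_new, v_new

# Example: a proton gyrating in a weak magnetic field with a small electric field.
x = np.zeros(3)
v = np.array([1.0e5, 0.0, 0.0])
E = np.array([0.0, 1.0e-3, 0.0])
B = np.array([0.0, 0.0, 5.0e-9])
q, m, dt = 1.602e-19, 1.673e-27, 0.1
for _ in range(1000):
    x, v = boris_push(x, v, E, B, q, m, dt)
print("position after 1000 steps:", x)
```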

  20. Computer Access and Flowcharting as Variables in Learning Computer Programming.

    Science.gov (United States)

    Ross, Steven M.; McCormick, Deborah

    Manipulation of flowcharting was crossed with in-class computer access to examine flowcharting effects in the traditional lecture/laboratory setting and in a classroom setting where online time was replaced with manual simulation. Seventy-two high school students (24 male and 48 female) enrolled in a computer literacy course served as subjects.…

  1. Sophistication of computational science and fundamental physics simulations

    International Nuclear Information System (INIS)

    Ishiguro, Seiji; Ito, Atsushi; Usami, Shunsuke; Ohtani, Hiroaki; Sakagami, Hitoshi; Toida, Mieko; Hasegawa, Hiroki; Horiuchi, Ritoku; Miura, Hideaki

    2016-01-01

The numerical experimental reactor research project is composed of the following studies: (1) nuclear fusion simulation research with a focus on specific physical phenomena of specific equipment, (2) research on advanced simulation methods to increase predictability or expand the application range of simulation, (3) visualization as the foundation of simulation research, (4) research on advanced computational science such as parallel computing technology, and (5) research aiming at the elucidation of fundamental physical phenomena not limited to specific devices. Specifically, a wide range of research topics with medium- to long-term perspectives is being developed: (1) virtual reality visualization, (2) upgrading of computational science such as the multilayer simulation method, (3) kinetic behavior of plasma blobs, (4) extended MHD theory and simulation, (5) basic plasma processes such as particle acceleration due to wave-particle interaction, and (6) research related to laser plasma fusion. This paper reviews the following items: (1) simultaneous visualization in virtual reality space, (2) multilayer simulation of collisionless magnetic reconnection, (3) simulation of the microscopic dynamics of plasma coherent structures, (4) Hall MHD simulation of LHD, (5) numerical analysis for the extension of MHD equilibrium and stability theory, (6) extended MHD simulation of 2D RT instability, (7) simulation of laser plasma, (8) simulation of shock waves and particle acceleration, and (9) a study on simulation of homogeneous isotropic MHD turbulent flow. (A.O.)

  2. Computer Simulation Performed for Columbia Project Cooling System

    Science.gov (United States)

    Ahmad, Jasim

    2005-01-01

This demo shows a high-fidelity simulation of the air flow in the main computer room housing the Columbia (10,024 Intel Itanium processors) system. The simulation assesses the performance of the cooling system, identifies deficiencies, and recommends modifications to eliminate them. It used two in-house software packages on NAS supercomputers: Chimera Grid Tools to generate a geometric model of the computer room, and the OVERFLOW-2 code for fluid and thermal simulation. This state-of-the-art technology can be easily extended to provide a general capability for air flow analyses in any modern computer room.

  3. A Review of Freely Available Quantum Computer Simulation Software

    OpenAIRE

    Brandhorst-Satzkorn, Johan

    2012-01-01

    A study has been made of a few different freely available Quantum Computer simulators. All the simulators tested are available online on their respective websites. A number of tests have been performed to compare the different simulators against each other. Some untested simulators of various programming languages are included to show the diversity of the quantum computer simulator applications. The conclusion of the review is that LibQuantum is the best of the simulators tested because of ea...

  4. Cooperative Learning with a Computer in a Native Language Class.

    Science.gov (United States)

    Bennett, Ruth

    In a cooperative task, American Indian elementary students produced bilingual natural history dictionaries using a Macintosh computer. Students in grades 3 through 8 attended weekly, multi-graded bilingual classes in Hupa/English or Yurok/English, held at two public school field sites for training elementary teaching-credential candidates. Teams…

  5. Computer simulation of liquid crystals

    International Nuclear Information System (INIS)

    McBride, C.

    1999-01-01

    Molecular dynamics simulation performed on modern computer workstations provides a powerful tool for the investigation of the static and dynamic characteristics of liquid crystal phases. In this thesis molecular dynamics computer simulations have been performed for two model systems. Simulations of 4,4'-di-n-pentyl-bibicyclo[2.2.2]octane demonstrate the growth of a structurally ordered phase directly from an isotropic fluid. This is the first time that this has been achieved for an atomistic model. The results demonstrate a strong coupling between orientational ordering and molecular shape, but indicate that the coupling between molecular conformational changes and molecular reorientation is relatively weak. Simulations have also been performed for a hybrid Gay-Berne/Lennard-Jones model resulting in thermodynamically stable nematic and smectic phases. Frank elastic constants have been calculated for the nematic phase formed by the hybrid model through analysis of the fluctuations of the nematic director, giving results comparable with those found experimentally. Work presented in this thesis also describes the parameterization of the torsional potential of a fragment of a dimethyl siloxane polymer chain, disiloxane diol (HOMe 2 Si) 2 O, using ab initio quantum mechanical calculations. (author)

  6. Simulation of statistical γ-spectra of highly excited rare earth nuclei

    International Nuclear Information System (INIS)

    Schiller, A.; Munos, G.; Guttormsen, M.; Bergholt, L.; Melby, E.; Rekstad, J.; Siem, S.; Tveter, T.S.

    1997-05-01

The statistical γ-spectra of highly excited even-even rare earth nuclei are simulated applying appropriate level density and strength function to a given nucleus. Hindrance effects due to K-conservation are taken into account. Simulations are compared to experimental data from the 163Dy(3He,α)162Dy and 173Yb(3He,α)172Yb reactions. The influence of the K quantum number at higher energies is discussed. 21 refs., 7 figs., 2 tabs

  7. Convergence Analysis of a Class of Computational Intelligence Approaches

    Directory of Open Access Journals (Sweden)

    Junfeng Chen

    2013-01-01

Full Text Available Computational intelligence approaches form a relatively new interdisciplinary field of research with many promising application areas. Although computational intelligence approaches have gained huge popularity, their convergence is difficult to analyze. In this paper, a computational model is built for a class of computational intelligence approaches, represented by the canonical forms of genetic algorithms, ant colony optimization, and particle swarm optimization, in order to describe the common features of these algorithms. Two quantification indices, the variation rate and the progress rate, are then defined to indicate, respectively, the variety and the optimality of the solution sets generated in the search process of the model. Moreover, four types of probabilistic convergence for the solution set updating sequences are given, and their relations are discussed. Finally, sufficient conditions are derived for the almost sure weak convergence and the almost sure strong convergence of the model by introducing martingale theory into the Markov chain analysis.

  8. Biomes computed from simulated climatologies

    Energy Technology Data Exchange (ETDEWEB)

    Claussen, M.; Esch, M. [Max-Planck-Institut fuer Meteorologie, Hamburg (Germany)

    1994-01-01

The biome model of Prentice et al. is used to predict global patterns of potential natural plant formations, or biomes, from climatologies simulated by ECHAM, a model used for climate simulations at the Max-Planck-Institut fuer Meteorologie. This study is undertaken in order to show the advantage of this biome model in diagnosing the performance of a climate model and assessing effects of past and future climate changes predicted by a climate model. Good overall agreement is found between global patterns of biomes computed from observed and simulated data of the present climate. But there are also major discrepancies, indicated by a difference in biomes in Australia, in the Kalahari Desert, and in the Middle West of North America. These discrepancies can be traced back to failures in simulated rainfall as well as summer or winter temperatures. Global patterns of biomes computed from an ice age simulation reveal that North America, Europe, and Siberia should have been covered largely by tundra and taiga, whereas only small differences are seen for the tropical rain forests. A potential northeast shift of biomes is expected from a simulation with enhanced CO2 concentration according to the IPCC Scenario A. Little change is seen in the tropical rain forest and the Sahara. Since the biome model used is not capable of predicting changes in vegetation patterns due to a rapid climate change, the latter simulation has to be taken as a prediction of changes in conditions favourable for the existence of certain biomes, not as a prediction of a future distribution of biomes. 15 refs., 8 figs., 2 tabs.
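Diagnosing biomes from a climatology amounts to mapping climate variables at each grid cell to a vegetation class. The toy sketch below shows that mapping with invented temperature and precipitation thresholds; these are NOT the Prentice et al. rules used in the study, only an illustration of the kind of classification being applied to simulated climate fields.

```python
# Toy illustration of diagnosing biomes from a simulated climatology.
# The thresholds are invented for illustration, not the Prentice et al. rules.
def toy_biome(annual_mean_temp_c, annual_precip_mm):
    if annual_mean_temp_c < -5:
        return "tundra"
    if annual_mean_temp_c < 3:
        return "taiga"
    if annual_precip_mm < 250:
        return "desert"
    if annual_mean_temp_c > 20 and annual_precip_mm > 1500:
        return "tropical rain forest"
    return "temperate forest / grassland"

# Example: a few grid cells from a hypothetical simulated climatology.
for cell in [(-12, 300), (1, 500), (25, 2000), (18, 150)]:
    print(cell, "->", toy_biome(*cell))
```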

  9. Computer security simulation

    International Nuclear Information System (INIS)

    Schelonka, E.P.

    1979-01-01

    Development and application of a series of simulation codes used for computer security analysis and design are described. Boolean relationships for arrays of barriers within functional modules are used to generate composite effectiveness indices. The general case of multiple layers of protection with any specified barrier survival criteria is given. Generalized reduction algorithms provide numerical security indices in selected subcategories and for the system as a whole. 9 figures, 11 tables

  10. Using News Media Databases (LexisNexis) To Identify Relevant Topics For Introductory Earth Science Classes

    Science.gov (United States)

    Cervato, C.; Jach, J. Y.; Ridky, R.

    2003-12-01

Introductory Earth science courses are undergoing pedagogical changes in universities across the country and are focusing more than ever on non-science majors. Increasing enrollment of non-science majors in these introductory Earth science courses demands a new look at what is being taught and how the content can be objectively chosen. Assessing the content and effectiveness of these courses requires a quantitative investigation of introductory Earth science topics and their relevance to current issues and concerns. The relevance of Earth science topics can be linked to improved student attitudes toward science and a deeper understanding of concepts. We have used the Internet-based national news search engine LexisNexis Academic Universe (http://www.lexisnexis.org/) to track the occurrence of Earth science terms over the last 12 months, five years, and ten years, both regionally and nationally. This database of term occurrences is being used to examine how the Earth sciences have evolved in the news over the last 10 years and is also compared with textbook contents and course syllabi from randomly selected introductory Earth science courses across the nation. These data constitute the quantitative foundation for this study and are being used to evaluate the relevance of introductory Earth science course content. The relevance of introductory course content and current real-world issues to student attitudes is a crucial factor when considering changes in course curricula and pedagogy. We have examined students' conception of the nature of science and attitudes towards science and learning science using a Likert-scale assessment instrument in the fall 2002 Geology 100 classes at Iowa State University. A pre-test and post-test were administered to see if the students' attitudes changed during the semester, using as reference a control group comprised of geoscience undergraduate and graduate students and faculty. The results of the attitude survey have been analyzed in terms

  11. Understanding Islamist political violence through computational social simulation

    Energy Technology Data Exchange (ETDEWEB)

    Watkins, Jennifer H [Los Alamos National Laboratory; Mackerrow, Edward P [Los Alamos National Laboratory; Patelli, Paolo G [Los Alamos National Laboratory; Eberhardt, Ariane [Los Alamos National Laboratory; Stradling, Seth G [Los Alamos National Laboratory

    2008-01-01

    Understanding the process that enables political violence is of great value in reducing the future demand for and support of violent opposition groups. Methods are needed that allow alternative scenarios and counterfactuals to be scientifically researched. Computational social simulation shows promise in developing 'computer experiments' that would be unfeasible or unethical in the real world. Additionally, the process of modeling and simulation reveals and challenges assumptions that may not be noted in theories, exposes areas where data is not available, and provides a rigorous, repeatable, and transparent framework for analyzing the complex dynamics of political violence. This paper demonstrates the computational modeling process using two simulation techniques: system dynamics and agent-based modeling. The benefits and drawbacks of both techniques are discussed. In developing these social simulations, we discovered that the social science concepts and theories needed to accurately simulate the associated psychological and social phenomena were lacking.

  12. Overview of Computer Simulation Modeling Approaches and Methods

    Science.gov (United States)

    Robert E. Manning; Robert M. Itami; David N. Cole; Randy Gimblett

    2005-01-01

    The field of simulation modeling has grown greatly with recent advances in computer hardware and software. Much of this work has involved large scientific and industrial applications for which substantial financial resources are available. However, advances in object-oriented programming and simulation methodology, concurrent with dramatic increases in computer...

  13. Cloud Computing Technologies in Writing Class: Factors Influencing Students’ Learning Experience

    Directory of Open Access Journals (Sweden)

    Jenny WANG

    2017-07-01

Full Text Available The interactive online group within cloud computing technologies proposed as the main contribution of this paper provides easy and simple access to a cloud-based Software as a Service (SaaS) system and delivers effective educational tools for students and the teacher for after-class group writing assignment activities. The study therefore addresses the implementation of one of the most commonly used cloud applications, Google Docs, in a higher education course. The learning environment integrating Google Docs, which students use to develop and deploy writing assignments between classes, was subjected to a learning experience assessment. Based on a questionnaire administered to the study participants (n=28), the system provided an effective learning environment for the students and the instructor to stay connected between classes. Factors influencing students’ learning experience with cloud applications include the frequency of online interaction and students’ technology experience. Suggestions for coping with the challenges of using such applications in higher education, including technical issues, are also presented. Educators are therefore encouraged to embrace cloud computing technologies as they design course curricula in the hope of effectively enriching students’ learning.

  14. REACTOR: a computer simulation for schools

    International Nuclear Information System (INIS)

    Squires, D.

    1985-01-01

    The paper concerns computer simulation of the operation of a nuclear reactor, for use in schools. The project was commissioned by UKAEA, and carried out by the Computers in the Curriculum Project, Chelsea College. The program, for an advanced gas cooled reactor, is briefly described. (U.K.)

  15. TUF simulation of Darlington class IV power failure

    Energy Technology Data Exchange (ETDEWEB)

    Liauw, W K; Liu, W S; Leung, R K; Phillips, B S [Ontario Hydro, Toronto, ON (Canada)

    1996-12-31

    Presented here is the TUF simulation of the initial transient of the Class IV power failure event that occurred on November 25, 1993 at Darlington Unit 4. The important physical parameters and models that relate to this event are discussed. The agreements between the code predictions and the plant data on the thermal-hydraulics and controller responses demonstrate the code reliability for plant operational support. (author). 4 refs., 1 tab., 12 figs.

  16. TUF simulation of Darlington class IV power failure

    International Nuclear Information System (INIS)

    Liauw, W.K.; Liu, W.S.; Leung, R.K.; Phillips, B.S.

    1995-01-01

    Presented here is the TUF simulation of the initial transient of the Class IV power failure event that occurred on November 25, 1993 at Darlington Unit 4. The important physical parameters and models that relate to this event are discussed. The agreements between the code predictions and the plant data on the thermal-hydraulics and controller responses demonstrate the code reliability for plant operational support. (author). 4 refs., 1 tab., 12 figs

  17. Learning and instruction with computer simulations

    NARCIS (Netherlands)

    de Jong, Anthonius J.M.

    1991-01-01

    The present volume presents the results of an inventory of elements of such a computer learning environment. This inventory was conducted within a DELTA project called SIMULATE. In the project a learning environment that provides intelligent support to learners and that has a simulation as its

  18. Computer simulation on molten ionic salts

    International Nuclear Information System (INIS)

    Kawamura, K.; Okada, I.

    1978-01-01

The extensive advances in computer technology have since made it possible to apply computer simulation to the evaluation of the macroscopic and microscopic properties of molten salts. The evaluation of the potential energy in molten salts systems is complicated by the presence of long-range energy, i.e. Coulomb energy, in contrast to simple liquids where the potential energy is easily evaluated. It has been shown, however, that no difficulties are encountered when the Ewald method is applied to the evaluation of Coulomb energy. After a number of attempts had been made to approximate the pair potential, the Huggins-Mayer potential based on ionic crystals became the most often employed. Since it is thought that the only appreciable contribution to many-body potential, not included in Huggins-Mayer potential, arises from the internal electrostatic polarization of ions in molten ionic salts, computer simulation with a provision for ion polarization has been tried recently. The computations, which are employed mainly for molten alkali halides, can provide: (1) thermodynamic data such as internal energy, internal pressure and isothermal compressibility; (2) microscopic configurational data such as radial distribution functions; (3) transport data such as the diffusion coefficient and electrical conductivity; and (4) spectroscopic data such as the intensity of inelastic scattering and the stretching frequency of simple molecules. The computed results seem to agree well with the measured results. Computer simulation can also be used to test the effectiveness of a proposed pair potential and the adequacy of postulated models of molten salts, and to obtain experimentally inaccessible data. A further application of MD computation employing the pair potential based on an ionic model to BeF2, ZnCl2 and SiO2 shows the possibility of quantitative interpretation of structures and glass transformation phenomena
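A Huggins-Mayer-type pair potential of the kind referred to above combines a Coulomb term, Born repulsion, and dispersion. The sketch below evaluates such a potential for a single ion pair; the parameter values and reduced units are illustrative assumptions, and in a real molten-salt simulation the Coulomb part would be handled with the Ewald method rather than evaluated pairwise as here.

```python
# Sketch of a Huggins-Mayer-type pair potential (Coulomb + Born repulsion +
# dispersion), with illustrative parameters in reduced units.
import math

def huggins_mayer(r, q_i, q_j, A=1800.0, rho=0.33, C=70.0):
    coulomb = q_i * q_j / r              # long-range electrostatics (Ewald in practice)
    repulsion = A * math.exp(-r / rho)   # Born repulsion
    dispersion = -C / r**6               # van der Waals attraction
    return coulomb + repulsion + dispersion

# Example: a cation-anion pair (q = +1, -1) at a few separations.
for r in (2.0, 2.5, 3.0, 4.0):
    print(f"r = {r:.1f}: U = {huggins_mayer(r, +1, -1):+.3f}")
```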

  19. New Pedagogies on Teaching Science with Computer Simulations

    Science.gov (United States)

    Khan, Samia

    2011-01-01

    Teaching science with computer simulations is a complex undertaking. This case study examines how an experienced science teacher taught chemistry using computer simulations and the impact of his teaching on his students. Classroom observations over 3 semesters, teacher interviews, and student surveys were collected. The data was analyzed for (1)…

  20. Large-scale simulations of error-prone quantum computation devices

    Energy Technology Data Exchange (ETDEWEB)

    Trieu, Doan Binh

    2009-07-01

The theoretical concepts of quantum computation in the idealized and undisturbed case are well understood. However, in practice, all quantum computation devices do suffer from decoherence effects as well as from operational imprecisions. This work assesses the power of error-prone quantum computation devices using large-scale numerical simulations on parallel supercomputers. We present the Juelich Massively Parallel Ideal Quantum Computer Simulator (JUMPIQCS), which simulates a generic quantum computer on gate level. It comprises an error model for decoherence and operational errors. The robustness of various algorithms in the presence of noise has been analyzed. The simulation results show that for large system sizes and long computations it is imperative to actively correct errors by means of quantum error correction. We implemented the 5-, 7-, and 9-qubit quantum error correction codes. Our simulations confirm that using error-prone correction circuits with non-fault-tolerant quantum error correction will always fail, because more errors are introduced than being corrected. Fault-tolerant methods can overcome this problem, provided that the single qubit error rate is below a certain threshold. We incorporated fault-tolerant quantum error correction techniques into JUMPIQCS using Steane's 7-qubit code and determined this threshold numerically. Using the depolarizing channel as the source of decoherence, we find a threshold error rate of (5.2±0.2) × 10^-6. For Gaussian distributed operational over-rotations the threshold lies at a standard deviation of 0.0431±0.0002. We can conclude that quantum error correction is especially well suited for the correction of operational imprecisions and systematic over-rotations. For realistic simulations of specific quantum computation devices we need to extend the generic model to dynamic simulations, i.e. time-dependent Hamiltonian simulations of realistic hardware models. We focus on today's most advanced

  1. Interoceanic canal excavation scheduling via computer simulation

    Energy Technology Data Exchange (ETDEWEB)

    Baldonado, Orlino C [Holmes and Narver, Inc., Los Angeles, CA (United States)

    1970-05-15

    The computer simulation language GPSS/360 was used to simulate the schedule of several nuclear detonation programs for the interoceanic canal project. The effects of using different weather restriction categories due to air blast and fallout were investigated. The effect of increasing the number of emplacement and stemming crews and the effect of varying the reentry period after detonating a row charge or salvo were also studied. Detonation programs were simulated for the proposed Routes 17A and 25E. The study demonstrates the method of using computer simulation so that a schedule and its associated constraints can be assessed for feasibility. Since many simulation runs can be made for a given set of detonation program constraints, one readily obtains an average schedule for a range of conditions. This provides a method for analyzing time-sensitive operations so that time and cost-effective operational schedules can be established. A comparison of the simulated schedules with those that were published shows them to be similar. (author)

  2. Interoceanic canal excavation scheduling via computer simulation

    International Nuclear Information System (INIS)

    Baldonado, Orlino C.

    1970-01-01

    The computer simulation language GPSS/360 was used to simulate the schedule of several nuclear detonation programs for the interoceanic canal project. The effects of using different weather restriction categories due to air blast and fallout were investigated. The effect of increasing the number of emplacement and stemming crews and the effect of varying the reentry period after detonating a row charge or salvo were also studied. Detonation programs were simulated for the proposed Routes 17A and 25E. The study demonstrates the method of using computer simulation so that a schedule and its associated constraints can be assessed for feasibility. Since many simulation runs can be made for a given set of detonation program constraints, one readily obtains an average schedule for a range of conditions. This provides a method for analyzing time-sensitive operations so that time and cost-effective operational schedules can be established. A comparison of the simulated schedules with those that were published shows them to be similar. (author)

  3. Biomes computed from simulated climatologies

    Energy Technology Data Exchange (ETDEWEB)

    Claussen, W.; Esch, M.

    1992-09-01

    The biome model of Prentice et al. is used to predict global patterns of potential natural plant formations, or biomes, from climatologies simulated by ECHAM, a model used for climate simulations at the Max-Planck-Institut fuer Meteorologie. This study is undertaken in order to show the advantage of this biome model in comprehensively diagnosing the performance of a climate model and assessing effects of past and future climate changes predicted by a climate model. Good overall agreement is found between global patterns of biomes computed from observed and simulated data of present climate. But there are also major discrepancies indicated by a difference in biomes in Australia, in the Kalahari Desert, and in the Middle West of North America. These discrepancies can be traced back to failures in simulated rain fall as well as summer or winter temperatures. Global patterns of biomes computed from an ice age simulation reveal that North America, Europe, and Siberia should have been covered largely by tundra and taiga, whereas only small differences are seen for the tropical rain forests. A potential North-East shift of biomes is expected from a simulation with enhanced CO{sub 2} concentration according to the IPCC Scenario A. Little change is seen in the tropical rain forest and the Sahara. Since the biome model used is not capable of predicting changes in vegetation patterns due to a rapid climate change, the latter simulation has to be taken as a prediction of changes in conditions favorable for the existence of certain biomes, not as a prediction of a future distribution of biomes. (orig.).

  4. Computer graphics in heat-transfer simulations

    International Nuclear Information System (INIS)

    Hamlin, G.A. Jr.

    1980-01-01

    Computer graphics can be very useful in the setup of heat transfer simulations and in the display of the results of such simulations. The potential use of recently available low-cost graphics devices in the setup of such simulations has not been fully exploited. Several types of graphics devices and their potential usefulness are discussed, and some configurations of graphics equipment are presented in the low-, medium-, and high-price ranges

  5. Parallel Computing for Brain Simulation.

    Science.gov (United States)

    Pastur-Romay, L A; Porto-Pazos, A B; Cedron, F; Pazos, A

    2017-01-01

    The human brain is the most complex system in the known universe, it is therefore one of the greatest mysteries. It provides human beings with extraordinary abilities. However, until now it has not been understood yet how and why most of these abilities are produced. For decades, researchers have been trying to make computers reproduce these abilities, focusing on both understanding the nervous system and, on processing data in a more efficient way than before. Their aim is to make computers process information similarly to the brain. Important technological developments and vast multidisciplinary projects have allowed creating the first simulation with a number of neurons similar to that of a human brain. This paper presents an up-to-date review about the main research projects that are trying to simulate and/or emulate the human brain. They employ different types of computational models using parallel computing: digital models, analog models and hybrid models. This review includes the current applications of these works, as well as future trends. It is focused on various works that look for advanced progress in Neuroscience and still others which seek new discoveries in Computer Science (neuromorphic hardware, machine learning techniques). Their most outstanding characteristics are summarized and the latest advances and future plans are presented. In addition, this review points out the importance of considering not only neurons: Computational models of the brain should also include glial cells, given the proven importance of astrocytes in information processing. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  6. Influence of savanna fire on Australian monsoon season precipitation and circulation as simulated using a distributed computing environment

    Science.gov (United States)

    Lynch, Amanda H.; Abramson, David; Görgen, Klaus; Beringer, Jason; Uotila, Petteri

    2007-10-01

    Fires in the Australian savanna have been hypothesized to affect monsoon evolution, but the hypothesis is controversial and the effects have not been quantified. A distributed computing approach allows the development of a challenging experimental design that permits simultaneous variation of all fire attributes. The climate model simulations are distributed across multiple independent computer clusters in six countries, an approach that has potential for a range of other large simulation applications in the earth sciences. The experiment clarifies that savanna burning can shape the monsoon through two mechanisms. Boundary-layer circulation and large-scale convergence are intensified monotonically with increasing fire intensity and area burned. However, thresholds of fire timing and area are evident in the consequent influence on monsoon rainfall. In the optimal band of late, high-intensity fires of somewhat limited extent, it is possible for the wet season to be significantly enhanced.

  7. Regional Community Climate Simulations with variable resolution meshes in the Community Earth System Model

    Science.gov (United States)

    Zarzycki, C. M.; Gettelman, A.; Callaghan, P.

    2017-12-01

    Accurately predicting weather extremes such as precipitation (floods and droughts) and temperature (heat waves) requires high resolution to resolve mesoscale dynamics and topography at horizontal scales of 10-30 km. Simulating such resolutions globally for climate scales (years to decades) remains computationally impractical. Simulating only a small region of the planet is more tractable at these scales for climate applications. This work describes global simulations using variable-resolution static meshes with multiple dynamical cores that target the continental United States, using developmental versions of the Community Earth System Model version 2 (CESM2). CESM2 is tested in idealized, aquaplanet, and full-physics configurations to evaluate variable-mesh simulations against uniform high- and uniform low-resolution simulations at resolutions down to 15 km. Different physical parameterization suites are also evaluated to gauge their sensitivity to resolution. Idealized variable-resolution mesh cases compare well to high-resolution tests. More recent versions of the atmospheric physics, including cloud schemes for CESM2, are more stable with respect to changes in horizontal resolution. Most of the sensitivity is due to the timestep and to interactions between deep convection and large-scale condensation, as expected from the closure methods. The resulting full-physics model produces a climate comparable to the global low-resolution mesh and similar high-frequency statistics in the high-resolution region. Some biases are reduced (orographic precipitation in the western United States), but biases do not necessarily go away at high resolution (e.g., summertime (JJA) surface temperature). The simulations are able to reproduce uniform high-resolution results, making them an effective tool for regional climate studies; this capability is available in CESM2.

  8. Computer simulation and experimental self-assembly of irradiated glycine amino acid under magnetic fields: Its possible significance in prebiotic chemistry.

    Science.gov (United States)

    Heredia, Alejandro; Colín-García, María; Puig, Teresa Pi I; Alba-Aldave, Leticia; Meléndez, Adriana; Cruz-Castañeda, Jorge A; Basiuk, Vladimir A; Ramos-Bernal, Sergio; Mendoza, Alicia Negrón

    2017-12-01

    Ionizing radiation may have played a relevant role in chemical reactions for prebiotic biomolecule formation on ancient Earth. Environmental conditions such as the presence of water and magnetic fields were possibly relevant in the formation of organic compounds such as amino acids. ATR-FTIR, Raman, EPR and X-ray spectroscopies provide valuable information about the molecular organization of different glycine polymorphs under static magnetic fields. γ-glycine polymorph formation increases in irradiated samples interacting with static magnetic fields. The increase in the γ-glycine polymorph agrees with the computer simulations. The AM1 semi-empirical simulations show a change in catalytic behavior and in dipole moment values when α- and γ-glycine interact with the static magnetic field. The simulated crystal lattice energy of α-glycine is also affected by the free radicals under the magnetic field, which decreases its stability. Therefore, solid α- and γ-glycine containing free radicals under static magnetic fields might have affected the prebiotic scenario on ancient Earth by promoting the oligomerization of glycine. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. Computational Dehydration of Crystalline Hydrates Using Molecular Dynamics Simulations

    DEFF Research Database (Denmark)

    Larsen, Anders Støttrup; Rantanen, Jukka; Johansson, Kristoffer E

    2017-01-01

    Molecular dynamics (MD) simulations have evolved into an increasingly reliable and accessible technique and are today applied in many areas of the biomedical sciences. We present a generally applicable method for studying the dehydration of hydrates based on MD simulations and apply this approach to the dehydration of ampicillin trihydrate. The crystallographic unit cell of the trihydrate is used to construct the simulation cell containing 216 ampicillin and 648 water molecules. This system is dehydrated by removing water molecules during a 2200 ps simulation, and depending on the computational dehydration [...]. The structural changes could be followed in real time, and in addition, an intermediate amorphous phase was identified. The computationally identified dehydrated structure (anhydrate) was slightly different from the experimentally known anhydrate structure, suggesting that the simulated computational structure [...]

  10. Computer simulation of thermal plant operations

    CERN Document Server

    O'Kelly, Peter

    2012-01-01

    This book describes thermal plant simulation, that is, dynamic simulation of plants which produce, exchange and otherwise utilize heat as their working medium. Directed at chemical, mechanical and control engineers involved with operations, control and optimization and operator training, the book gives the mathematical formulation and use of simulation models of the equipment and systems typically found in these industries. The author has adopted a fundamental approach to the subject. The initial chapters provide an overview of simulation concepts and describe a suitable computer environment.

  11. Reflections on the GUN CONTROL Simulation: Pedagogical Implications for EAP Writing Classes

    Science.gov (United States)

    Salies, Tania Gastao

    2007-01-01

    This article critically reflects on the GUN CONTROL simulation (Salies, 1994a) by retaking ideas advanced by Salies (2002) and applying them to the context of English for Academic Purposes (EAP) writing classes in Brazil. It examines how controlled practice relates to learners' performance on the first draft in a simulation-based content unit…

  12. Energy Exascale Earth System Model (E3SM) Project Strategy

    Energy Technology Data Exchange (ETDEWEB)

    Bader, D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-09-18

    The E3SM project will assert and maintain an international scientific leadership position in the development of Earth system and climate models at the leading edge of scientific knowledge and computational capabilities. With its collaborators, it will demonstrate its leadership by using these models to achieve the goal of designing, executing, and analyzing climate and Earth system simulations that address the most critical scientific questions for the nation and DOE.

  13. HRP's Healthcare Spin-Offs Through Computational Modeling and Simulation Practice Methodologies

    Science.gov (United States)

    Mulugeta, Lealem; Walton, Marlei; Nelson, Emily; Peng, Grace; Morrison, Tina; Erdemir, Ahmet; Myers, Jerry

    2014-01-01

    Spaceflight missions expose astronauts to novel operational and environmental conditions that pose health risks that are currently not well understood, and perhaps unanticipated. Furthermore, given the limited number of humans that have flown on long-duration missions and beyond low Earth orbit, the research and clinical data necessary to predict and mitigate these health and performance risks are limited. Consequently, NASA's Human Research Program (HRP) conducts research and develops advanced methods and tools to predict, assess, and mitigate potential hazards to the health of astronauts. In this light, NASA has explored the possibility of leveraging computational modeling since the 1970s as a means to elucidate the physiologic risks of spaceflight and develop countermeasures. Since that time, substantial progress has been realized in this arena through a number of HRP-funded activities such as the Digital Astronaut Project (DAP) and the Integrated Medical Model (IMM). Much of this success can be attributed to HRP's endeavor to establish rigorous verification, validation, and credibility (VV&C) processes that ensure computational models and simulations (M&S) are sufficiently credible to address issues within their intended scope. This presentation summarizes HRP's activities in credibility of modeling and simulation, in particular through its outreach to the community of modeling and simulation practitioners. METHODS: The HRP requires that all M&S that can have a moderate to high impact on crew health or mission success be vetted in accordance with the NASA Standard for Models and Simulations, NASA-STD-7009 (7009) [5]. As this standard mostly focuses on engineering systems, the IMM and DAP have invested substantial effort to adapt the processes established in this standard for application to biological M&S, which are more prevalent in human health and performance (HHP) and space biomedical research and operations [6,7]. These methods have also generated

  14. A Computer-Based Simulation of an Acid-Base Titration

    Science.gov (United States)

    Boblick, John M.

    1971-01-01

    Reviews the advantages of computer-simulated environments for experiments, referring in particular to acid-base titrations. Includes pre-lab instructions and a sample computer printout of a student's use of an acid-base simulation. Ten references. (PR)
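
    A minimal sketch of the kind of model that can drive such a simulation is given below, assuming a strong monoprotic acid titrated with a strong base and neglecting activity effects; it is an illustrative Python calculation, not the program described in the article.

    import numpy as np

    def titration_ph(c_acid, v_acid_ml, c_base, v_base_ml, kw=1e-14):
        """pH during titration of a strong monoprotic acid with a strong base.
        Concentrations in mol/L, volumes in mL."""
        v_total = (v_acid_ml + v_base_ml) / 1000.0                    # litres
        excess = (c_acid * v_acid_ml - c_base * v_base_ml) / 1000.0   # mol H+ minus mol OH-
        if abs(excess) < 1e-12:                                       # equivalence point
            return -np.log10(np.sqrt(kw))
        conc = abs(excess) / v_total
        return -np.log10(conc) if excess > 0 else 14.0 + np.log10(conc)

    # Titrate 25 mL of 0.1 M HCl with 0.1 M NaOH and tabulate the pH curve.
    volumes = np.linspace(0.0, 50.0, 101)
    curve = [(v, titration_ph(0.1, 25.0, 0.1, v)) for v in volumes]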

  15. Quantum simulations with noisy quantum computers

    Science.gov (United States)

    Gambetta, Jay

    Quantum computing is a new computational paradigm that is expected to lie beyond the standard model of computation. This implies that a quantum computer can solve problems that cannot be solved by a conventional computer with tractable overhead. To fully harness this power we need a universal fault-tolerant quantum computer. However, the overhead in building such a machine is high and a full solution appears to be many years away. Nevertheless, we believe that we can build machines in the near term that cannot be emulated by a conventional computer. It is then interesting to ask what these can be used for. In this talk we will present our advances in simulating complex quantum systems with noisy quantum computers. We will show experimental implementations of this on some small quantum computers.
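
    As a toy illustration of what "simulating with noise" can mean, the density-matrix sketch below passes one qubit through a Hadamard gate followed by a depolarizing error channel; an observable that is ideally 1 is shrunk by the noise. The gate choice, channel, and error rate are assumptions for illustration and are unrelated to the hardware discussed in the talk.

    import numpy as np

    # Pauli matrices and the Hadamard gate.
    X = np.array([[0, 1], [1, 0]], dtype=complex)
    Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
    Z = np.array([[1, 0], [0, -1]], dtype=complex)
    H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

    def depolarize(rho, p):
        """Single-qubit depolarizing channel with error probability p."""
        return (1 - p) * rho + (p / 3) * (X @ rho @ X + Y @ rho @ Y + Z @ rho @ Z)

    rho = np.array([[1, 0], [0, 0]], dtype=complex)   # start in |0><0|
    rho = H @ rho @ H.conj().T                        # ideal Hadamard gate
    rho = depolarize(rho, p=0.05)                     # gate followed by noise
    exp_x = np.real(np.trace(X @ rho))                # ideally 1.0; here 1 - 4p/3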

  16. Specification of the near-Earth space environment with SHIELDS

    International Nuclear Information System (INIS)

    Jordanova, Vania Koleva; Delzanno, Gian Luca; Henderson, Michael Gerard; Godinez, Humberto C.; Jeffery, Christopher Andrew Munn

    2017-01-01

    Here, predicting variations in the near-Earth space environment that can lead to spacecraft damage and failure is one example of “space weather” and a major space physics challenge. A project recently funded through the Los Alamos National Laboratory (LANL) Directed Research and Development (LDRD) program aims at developing a new capability to understand, model, and predict Space Hazards Induced near Earth by Large Dynamic Storms, the SHIELDS framework. The project goals are to understand the dynamics of the surface charging environment (SCE), the hot (keV) electrons representing the source and seed populations for the radiation belts, on both macro- and micro-scales. Important physics questions related to particle injection and acceleration associated with magnetospheric storms and substorms, as well as plasma waves, are investigated. These challenging problems are addressed by a team of world-class experts in the fields of space science and computational plasma physics, using state-of-the-art models and computational facilities. A full two-way coupling of physics-based models across multiple scales is achieved, with a global MHD code (BATS-R-US) embedding a particle-in-cell code (iPIC3D) and an inner magnetosphere code (RAM-SCB). New data assimilation techniques employing in situ satellite data are developed; these provide an order-of-magnitude improvement in the accuracy of the simulated SCE. SHIELDS also includes a post-processing tool designed to calculate the surface charging for specific spacecraft geometry using the Curvilinear Particle-In-Cell (CPIC) code, which can be used for reanalysis of satellite failures or for satellite design.

  17. Salesperson Ethics: An Interactive Computer Simulation

    Science.gov (United States)

    Castleberry, Stephen

    2014-01-01

    A new interactive computer simulation designed to teach sales ethics is described. Simulation learner objectives include gaining a better understanding of legal issues in selling; realizing that ethical dilemmas do arise in selling; realizing the need to be honest when selling; seeing that there are conflicting demands from a salesperson's…

  18. Simulations of Probabilities for Quantum Computing

    Science.gov (United States)

    Zak, M.

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.

  19. High Performance Computing in Science and Engineering '02 : Transactions of the High Performance Computing Center

    CERN Document Server

    Jäger, Willi

    2003-01-01

    This book presents the state of the art in modeling and simulation on supercomputers. Leading German research groups present their results achieved on high-end systems of the High Performance Computing Center Stuttgart (HLRS) for the year 2002. Reports cover all fields of supercomputing simulation, ranging from computational fluid dynamics to computer science. Special emphasis is given to industrially relevant applications. Moreover, by presenting results for both vector systems and microprocessor-based systems, the book allows comparison of the performance levels and usability of a variety of supercomputer architectures. It therefore becomes an indispensable guidebook for assessing the impact of the Japanese Earth Simulator project on supercomputing in the years to come.

  20. Confidence range estimate of extended source imagery acquisition algorithms via computer simulations. [in optical communication systems

    Science.gov (United States)

    Chen, CHIEN-C.; Hui, Elliot; Okamoto, Garret

    1992-01-01

    Spatial acquisition using the sun-lit Earth as a beacon source provides several advantages over active beacon-based systems for deep-space optical communication systems. However, since the angular extent of the Earth image is large compared to the laser beam divergence, the acquisition subsystem must be capable of resolving the image to derive the proper pointing orientation. The algorithms used must be capable of deducing the receiver location given the blurring introduced by the imaging optics and the large Earth albedo fluctuation. Furthermore, because of the complexity of modelling the Earth and the tracking algorithms, an accurate estimate of the algorithm accuracy can only be made via simulation using realistic Earth images. An image simulator was constructed for this purpose, and the results of the simulation runs are reported.

  1. Computer Simulation of Reading.

    Science.gov (United States)

    Leton, Donald A.

    In recent years, coding and decoding have been claimed to be the processes for converting one language form to another. But there has been little effort to locate these processes in the human learner or to identify the nature of the internal codes. Computer simulation of reading is useful because the similarities in the human reception and…

  2. Evaluation of Computer Simulations for Teaching Apparel Merchandising Concepts.

    Science.gov (United States)

    Jolly, Laura D.; Sisler, Grovalynn

    1988-01-01

    The study developed and evaluated computer simulations for teaching apparel merchandising concepts. Evaluation results indicated that teaching method (computer simulation versus case study) does not significantly affect cognitive learning. Student attitudes varied, however, according to topic (profitable merchandising analysis versus retailing…

  3. Dynamics of global vegetation biomass simulated by the integrated Earth System Model

    Science.gov (United States)

    Mao, J.; Shi, X.; Di Vittorio, A. V.; Thornton, P. E.; Piao, S.; Yang, X.; Truesdale, J. E.; Bond-Lamberty, B. P.; Chini, L. P.; Thomson, A. M.; Hurtt, G. C.; Collins, W.; Edmonds, J.

    2014-12-01

    The global vegetation biomass stores huge amounts of carbon and is thus important to the global carbon budget (Pan et al., 2010). For the past few decades, observation-based estimates and model simulations of biomass in the above- and below-ground vegetation compartments have been conducted comprehensively (Saatchi et al., 2011; Baccini et al., 2012). However, uncertainties still exist, in particular in the simulated biomass magnitude and tendency and in the response of biomass to climatic conditions and to natural and human disturbances. The recently successful coupling of the integrated Earth System Model (iESM) (Di Vittorio et al., 2014; Bond-Lamberty et al., 2014), which links the Global Change Assessment Model (GCAM), Global Land-use Model (GLM), and Community Earth System Model (CESM), offers a great opportunity to understand biomass-related dynamics in a fully coupled natural-human modeling system. In this study, we focus on the systematic analysis and evaluation of the iESM-simulated historical (1850-2005) and future (2006-2100) biomass changes and the response of biomass dynamics to various impact factors, in particular human-induced Land Use/Land Cover Change (LULCC). By analyzing the iESM simulations with and without the interactive LULCC feedbacks, we further study how and where climate feedbacks affect socioeconomic decisions and LULCC in ways that alter vegetation carbon storage. References: Pan Y et al.: A large and persistent carbon sink in the World's forests. Science 2011, 333:988-993. Saatchi SS et al.: Benchmark map of forest carbon stocks in tropical regions across three continents. Proc Natl Acad Sci 2011, 108:9899-9904. Baccini A et al.: Estimated carbon dioxide emissions from tropical deforestation improved by carbon-density maps. Nature Clim Change 2012, 2:182-185. Di Vittorio AV et al.: From land use to land cover: restoring the afforestation signal in a coupled integrated assessment-earth system model and the implications for

  4. Methodology of modeling and measuring computer architectures for plasma simulations

    Science.gov (United States)

    Wang, L. P. T.

    1977-01-01

    A brief introduction is given to plasma simulation on computers and to the difficulties encountered on currently available machines. Through the use of an analysis and measurement methodology, SARA, the control flow and data flow of the particle simulation model REM2-1/2D are exemplified. After recursive refinements the total execution time may be greatly shortened and a fully parallel data flow can be obtained. From this data flow, a matched computer architecture or organization could be configured to achieve the computation bound of an application problem. A sequential-type simulation model, an array/pipeline-type simulation model, and a fully parallel simulation model of the code REM2-1/2D are proposed and analyzed. This methodology can be applied to other application problems which have an implicitly parallel nature.

  5. Computer Simulation of a Hardwood Processing Plant

    Science.gov (United States)

    D. Earl Kline; Philip A. Araman

    1990-01-01

    The overall purpose of this paper is to introduce computer simulation as a decision support tool that can be used to provide managers with timely information. A simulation/animation modeling procedure is demonstrated for wood products manufacturing systems. Simulation modeling techniques are used to assist in identifying and solving problems. Animation is used for...

  6. Development of the Transport Class Model (TCM) Aircraft Simulation From a Sub-Scale Generic Transport Model (GTM) Simulation

    Science.gov (United States)

    Hueschen, Richard M.

    2011-01-01

    A six degree-of-freedom, flat-earth dynamics, non-linear, and non-proprietary aircraft simulation was developed that is representative of a generic mid-sized twin-jet transport aircraft. The simulation was developed from a non-proprietary, publicly available, subscale twin-jet transport aircraft simulation using scaling relationships and a modified aerodynamic database. The simulation has an extended aerodynamics database with aero data outside the normal transport-operating envelope (large angle-of-attack and sideslip values). The simulation has representative transport aircraft surface actuator models with variable rate-limits and generally fixed position limits. The simulation contains a generic 40,000 lb sea level thrust engine model. The engine model is a first order dynamic model with a variable time constant that changes according to simulation conditions. The simulation provides a means for interfacing a flight control system to use the simulation sensor variables and to command the surface actuators and throttle position of the engine model.
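
    The engine description above amounts to a first-order lag whose time constant varies with the simulation condition. The Python sketch below shows such a model; the time-constant schedule is a hypothetical stand-in for the condition-dependent behavior mentioned in the abstract, and only the 40,000 lb thrust figure is taken from the text.

    def tau_schedule(thrust_cmd, max_thrust=40000.0):
        """Hypothetical time constant (s) that shortens as commanded thrust grows."""
        return 1.0 + 3.0 * (1.0 - thrust_cmd / max_thrust)

    def engine_step(thrust, thrust_cmd, tau, dt):
        """One Euler step of the first-order lag d(thrust)/dt = (cmd - thrust) / tau."""
        return thrust + dt * (thrust_cmd - thrust) / tau

    dt, t_end = 0.01, 20.0
    thrust, history = 0.0, []
    for step in range(int(t_end / dt)):
        cmd = 40000.0 if step * dt > 1.0 else 0.0     # throttle up at t = 1 s
        thrust = engine_step(thrust, cmd, tau_schedule(cmd), dt)
        history.append(thrust)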

  7. Interferences and events on epistemic shifts in physics through computer simulations

    CERN Document Server

    Warnke, Martin

    2017-01-01

    Computer simulations are omnipresent media in today's knowledge production. For scientific endeavors such as the detection of gravitational waves and the exploration of subatomic worlds, simulations are essential; however, the epistemic status of computer simulations is rather controversial as they are neither just theory nor just experiment. Therefore, computer simulations have challenged well-established insights and common scientific practices as well as our very understanding of knowledge. This volume contributes to the ongoing discussion on the epistemic position of computer simulations in a variety of physical disciplines, such as quantum optics, quantum mechanics, and computational physics. Originating from an interdisciplinary event, it shows that accounts of contemporary physics can constructively interfere with media theory, philosophy, and the history of science.

  8. Computed radiography simulation using the Monte Carlo code MCNPX

    International Nuclear Information System (INIS)

    Correa, S.C.A.; Souza, E.M.; Silva, A.X.; Lopes, R.T.

    2009-01-01

    Simulating x-ray images has been of great interest in recent years as it makes possible an analysis of how x-ray images are affected by relevant operating parameters. In this paper, a procedure for simulating computed radiographic images using the Monte Carlo code MCNPX is proposed. The sensitivity curve of the BaFBr image plate detector as well as the characteristic noise of a 16-bit computed radiography system were considered during the methodology's development. The results obtained confirm that the proposed procedure for simulating computed radiographic images is satisfactory, as it allows obtaining results comparable with experimental data. (author)

  9. Computed radiography simulation using the Monte Carlo code MCNPX

    Energy Technology Data Exchange (ETDEWEB)

    Correa, S.C.A. [Programa de Engenharia Nuclear/COPPE, Universidade Federal do Rio de Janeiro, Ilha do Fundao, Caixa Postal 68509, 21945-970, Rio de Janeiro, RJ (Brazil); Centro Universitario Estadual da Zona Oeste (CCMAT)/UEZO, Av. Manuel Caldeira de Alvarenga, 1203, Campo Grande, 23070-200, Rio de Janeiro, RJ (Brazil); Souza, E.M. [Programa de Engenharia Nuclear/COPPE, Universidade Federal do Rio de Janeiro, Ilha do Fundao, Caixa Postal 68509, 21945-970, Rio de Janeiro, RJ (Brazil); Silva, A.X., E-mail: ademir@con.ufrj.b [PEN/COPPE-DNC/Poli CT, Universidade Federal do Rio de Janeiro, Ilha do Fundao, Caixa Postal 68509, 21945-970, Rio de Janeiro, RJ (Brazil); Cassiano, D.H. [Instituto de Radioprotecao e Dosimetria/CNEN Av. Salvador Allende, s/n, Recreio, 22780-160, Rio de Janeiro, RJ (Brazil); Lopes, R.T. [Programa de Engenharia Nuclear/COPPE, Universidade Federal do Rio de Janeiro, Ilha do Fundao, Caixa Postal 68509, 21945-970, Rio de Janeiro, RJ (Brazil)

    2010-09-15

    Simulating X-ray images has been of great interest in recent years as it makes possible an analysis of how X-ray images are affected by relevant operating parameters. In this paper, a procedure for simulating computed radiographic images using the Monte Carlo code MCNPX is proposed. The sensitivity curve of the BaFBr image plate detector as well as the characteristic noise of a 16-bit computed radiography system were considered during the methodology's development. The results obtained confirm that the proposed procedure for simulating computed radiographic images is satisfactory, as it allows obtaining results comparable with experimental data.
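
    As a rough illustration of the detector-response step described in these two records, the Python sketch below maps a simulated dose image to a 16-bit digital radiograph by applying a response curve and adding detector noise. The logarithmic response, noise level, and random input are placeholders, not the BaFBr image-plate calibration or MCNPX tallies used by the authors.

    import numpy as np

    def cr_digitize(dose, bits=16, noise_sigma=20.0, rng=None):
        """Convert a dose (energy-fluence) map to a digital CR image:
        apply a placeholder response curve, add Gaussian detector noise,
        and clip to the 16-bit digital range."""
        rng = rng or np.random.default_rng()
        signal = 1000.0 * np.log10(1.0 + dose)                 # placeholder response curve
        signal += rng.normal(0.0, noise_sigma, size=dose.shape)
        return np.clip(np.round(signal), 0, 2**bits - 1).astype(np.uint16)

    dose_map = np.random.uniform(0.0, 50.0, size=(128, 128))   # stand-in for transport results
    image = cr_digitize(dose_map)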

  10. Proceedings of the meeting on large scale computer simulation research

    International Nuclear Information System (INIS)

    2004-04-01

    The meeting to summarize the collaboration activities for FY2003 on the Large Scale Computer Simulation Research was held January 15-16, 2004 at Theory and Computer Simulation Research Center, National Institute for Fusion Science. Recent simulation results, methodologies and other related topics were presented. (author)

  11. Computational simulation in architectural and environmental acoustics methods and applications of wave-based computation

    CERN Document Server

    Sakamoto, Shinichi; Otsuru, Toru

    2014-01-01

    This book reviews a variety of methods for wave-based acoustic simulation and recent applications to architectural and environmental acoustic problems. Following an introduction providing an overview of computational simulation of sound environment, the book is in two parts: four chapters on methods and four chapters on applications. The first part explains the fundamentals and advanced techniques for three popular methods, namely, the finite-difference time-domain method, the finite element method, and the boundary element method, as well as alternative time-domain methods. The second part demonstrates various applications to room acoustics simulation, noise propagation simulation, acoustic property simulation for building components, and auralization. This book is a valuable reference that covers the state of the art in computational simulation for architectural and environmental acoustics.  

  12. A computer code to simulate X-ray imaging techniques

    International Nuclear Information System (INIS)

    Duvauchelle, Philippe; Freud, Nicolas; Kaftandjian, Valerie; Babot, Daniel

    2000-01-01

    A computer code was developed to simulate the operation of radiographic, radioscopic or tomographic devices. The simulation is based on ray-tracing techniques and on the X-ray attenuation law. The use of computer-aided drawing (CAD) models enables simulations to be carried out with complex three-dimensional (3D) objects and the geometry of every component of the imaging chain, from the source to the detector, can be defined. Geometric unsharpness, for example, can be easily taken into account, even in complex configurations. Automatic translations or rotations of the object can be performed to simulate radioscopic or tomographic image acquisition. Simulations can be carried out with monochromatic or polychromatic beam spectra. This feature enables, for example, the beam hardening phenomenon to be dealt with or dual energy imaging techniques to be studied. The simulation principle is completely deterministic and consequently the computed images present no photon noise. Nevertheless, the variance of the signal associated with each pixel of the detector can be determined, which enables contrast-to-noise ratio (CNR) maps to be computed, in order to predict quantitatively the detectability of defects in the inspected object. The CNR is a relevant indicator for optimizing the experimental parameters. This paper provides several examples of simulated images that illustrate some of the rich possibilities offered by our software. Depending on the simulation type, the order of magnitude of the computation time can vary from 0.1 s (simple radiographic projection) up to several hours (3D tomography) on a PC with a 400 MHz microprocessor. Our simulation tool proves to be useful in developing new specific applications, in choosing the most suitable components when designing a new testing chain, and in saving time by reducing the number of experimental tests.

  13. A computer code to simulate X-ray imaging techniques

    Energy Technology Data Exchange (ETDEWEB)

    Duvauchelle, Philippe E-mail: philippe.duvauchelle@insa-lyon.fr; Freud, Nicolas; Kaftandjian, Valerie; Babot, Daniel

    2000-09-01

    A computer code was developed to simulate the operation of radiographic, radioscopic or tomographic devices. The simulation is based on ray-tracing techniques and on the X-ray attenuation law. The use of computer-aided drawing (CAD) models enables simulations to be carried out with complex three-dimensional (3D) objects and the geometry of every component of the imaging chain, from the source to the detector, can be defined. Geometric unsharpness, for example, can be easily taken into account, even in complex configurations. Automatic translations or rotations of the object can be performed to simulate radioscopic or tomographic image acquisition. Simulations can be carried out with monochromatic or polychromatic beam spectra. This feature enables, for example, the beam hardening phenomenon to be dealt with or dual energy imaging techniques to be studied. The simulation principle is completely deterministic and consequently the computed images present no photon noise. Nevertheless, the variance of the signal associated with each pixel of the detector can be determined, which enables contrast-to-noise ratio (CNR) maps to be computed, in order to predict quantitatively the detectability of defects in the inspected object. The CNR is a relevant indicator for optimizing the experimental parameters. This paper provides several examples of simulated images that illustrate some of the rich possibilities offered by our software. Depending on the simulation type, the computation time order of magnitude can vary from 0.1 s (simple radiographic projection) up to several hours (3D tomography) on a PC, with a 400 MHz microprocessor. Our simulation tool proves to be useful in developing new specific applications, in choosing the most suitable components when designing a new testing chain, and in saving time by reducing the number of experimental tests.
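
    The core of such ray-tracing simulators is the X-ray attenuation (Beer-Lambert) law evaluated along each source-to-pixel ray. A minimal Python sketch follows, with invented attenuation coefficients and path lengths standing in for what a CAD-based ray tracer would return for one detector pixel.

    import math

    def transmitted_intensity(i0, path_segments):
        """Attenuate a monochromatic ray of incident intensity i0 along one line.
        path_segments is a list of (mu_per_cm, length_cm) pairs, one per material
        crossed between the source and the detector pixel."""
        total_attenuation = sum(mu * length for mu, length in path_segments)
        return i0 * math.exp(-total_attenuation)

    # One ray crossing 0.5 cm of a steel-like material and 2 cm of an
    # aluminium-like material (illustrative coefficients, not tabulated values).
    pixel_value = transmitted_intensity(1.0, [(3.0, 0.5), (0.6, 2.0)])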

  14. The use of the Climate-science Computational End Station (CCES) development and grand challenge team for the next IPCC assessment: an operational plan

    International Nuclear Information System (INIS)

    Washington, W M; Buja, L; Gent, P; Drake, J; Erickson, D; Anderson, D; Bader, D; Dickinson, R; Ghan, S; Jones, P; Jacob, R

    2008-01-01

    The grand challenge of climate change science is to predict future climates based on scenarios of anthropogenic emissions and other changes resulting from options in energy and development policies. Addressing this challenge requires a Climate Science Computational End Station consisting of a sustained climate model research, development, and application program combined with world-class DOE leadership computing resources to enable advanced computational simulation of the Earth system. This project provides the primary computer allocations for the DOE SciDAC and Climate Change Prediction Program. It builds on the successful interagency collaboration of the National Science Foundation and the U.S. Department of Energy in developing and applying the Community Climate System Model (CCSM) for climate change science. It also includes collaboration with the National Aeronautics and Space Administration in carbon data assimilation and with university partners with expertise in high-end computational climate research.

  15. Innovation of the computer system for the WWER-440 simulator

    International Nuclear Information System (INIS)

    Schrumpf, L.

    1988-01-01

    The configuration of the WWER-440 simulator computer system consists of four SMEP computers. The basic data processing unit consists of two interlinked SM 52/11.M1 computers with 1 MB of main memory. This part of the simulator computer system controls the operation of the entire simulator, processes the programs for technology behavior simulation, the unit information system, and other special systems, and guarantees program support and the operation of the instructor's console. An SM 52/11 computer with 256 kB of main memory is connected to each unit. It is used as a communication unit for data transmission using the DASIO 600 interface. Semigraphic color displays are based on the microprocessor modules of the SM 50/40 and SM 53/10 kit supplemented with a modified TESLA COLOR 110 ST TV receiver. (J.B.). 1 fig

  16. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    Computer Based Modelling and Simulation - Modelling Deterministic Systems. N K Srinivasan. General Article, Resonance – Journal of Science Education, Volume 6, Issue 3, March 2001, pp 46-54.

  17. Computer Simulation (Microcultures): An Effective Model for Multicultural Education.

    Science.gov (United States)

    Nelson, Jorge O.

    This paper presents a rationale for using high-fidelity computer simulation in planning for and implementing effective multicultural education strategies. Using computer simulation, educators can begin to understand and plan for the concept of cultural sensitivity in delivering instruction. The model promises to emphasize teachers' understanding…

  18. Computer simulation in cell radiobiology

    International Nuclear Information System (INIS)

    Yakovlev, A.Y.; Zorin, A.V.

    1988-01-01

    This research monograph demonstrates possible ways of using stochastic simulation to explore cell kinetics, emphasizing radiobiological effects on cells. The in vitro kinetics of normal and irradiated cells is the main subject, but some approaches to the simulation of controlled cell systems are considered as well, with the epithelium of the small intestine in mice taken as a case in point. Of particular interest is the evaluation of simulation modelling as a tool for gaining insight into biological processes and hence for drawing new inferences from concrete experimental data concerning regularities in the response of cell populations to irradiation. The book is intended to stimulate interest among computer science specialists in developing new, more efficient means for the simulation of cell systems and to help radiobiologists in interpreting the experimental data.

  19. Quantum Genetic Algorithms for Computer Scientists

    OpenAIRE

    Lahoz Beltrá, Rafael

    2016-01-01

    Genetic algorithms (GAs) are a class of evolutionary algorithms inspired by Darwinian natural selection. They are popular heuristic optimisation methods based on simulated genetic mechanisms, i.e., mutation, crossover, etc., and population dynamical processes such as reproduction, selection, etc. Over the last decade, the possibility to emulate a quantum computer (a computer using quantum-mechanical phenomena to perform operations on data) has led to a new class of GAs known as “Quantum Genetic Algorithms” (QGAs).

  20. Tutorial: Parallel Computing of Simulation Models for Risk Analysis.

    Science.gov (United States)

    Reilly, Allison C; Staid, Andrea; Gao, Michael; Guikema, Seth D

    2016-10-01

    Simulation models are widely used in risk analysis to study the effects of uncertainties on outcomes of interest in complex problems. Often, these models are computationally complex and time consuming to run. This latter point may be at odds with time-sensitive evaluations or may limit the number of parameters that are considered. In this article, we give an introductory tutorial focused on parallelizing simulation code to better leverage modern computing hardware, enabling risk analysts to better utilize simulation-based methods for quantifying uncertainty in practice. This article is aimed primarily at risk analysts who use simulation methods but do not yet utilize parallelization to decrease the computational burden of these models. The discussion is focused on conceptual aspects of embarrassingly parallel computer code and software considerations. Two complementary examples are shown using the languages MATLAB and R. A brief discussion of hardware considerations is located in the Appendix. © 2016 Society for Risk Analysis.
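
    The article's examples are written in MATLAB and R; the sketch below shows the same embarrassingly parallel pattern in Python, replicating a stand-in risk model (a compound Poisson-lognormal annual loss) across worker processes. The model, parameters, and seeds are illustrative only.

    import numpy as np
    from multiprocessing import Pool

    def one_replication(seed):
        """One Monte Carlo replication: annual loss as the sum of a random
        number of random-severity events (an illustrative risk model)."""
        rng = np.random.default_rng(seed)
        n_events = rng.poisson(5)
        return rng.lognormal(mean=10.0, sigma=1.0, size=n_events).sum()

    if __name__ == "__main__":
        seeds = range(10_000)                    # independent replications
        with Pool() as pool:                     # one worker per available core
            losses = pool.map(one_replication, seeds)
        print(np.percentile(losses, 99))         # e.g. a 99th-percentile annual loss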

  1. Preparation, analysis, and release of simulated interplanetary grains into low earth orbit

    International Nuclear Information System (INIS)

    Stephens, J.R.; Strong, I.B.; Kunkle, T.D.

    1985-01-01

    Astronomical observations which reflect the optical and dynamical properties of interstellar and interplanetary grains are the primary means of identifying the shape, size, and chemistry of extraterrestrial grain materials, and they are a major subject of this workshop. Except for recent samplings of extraterrestrial particles in near-Earth orbit and in the stratosphere, observations have been the only method of deducing the properties of extraterrestrial particles. Terrestrial laboratory experiments typically seek not to reproduce astrophysical conditions but to illuminate fundamental dust processes and properties which must be extrapolated to interesting astrophysical conditions. In this report, we discuss the formation and optical characterization of simulated interstellar and interplanetary dust, with particular emphasis on studying the properties of irregularly shaped particles. We also discuss efforts to develop the techniques to allow dust experiments to be carried out in low-Earth orbit, thus extending the conditions under which dust experiments may be performed. The objectives of this study are threefold: (1) Elucidate the optical properties, including scattering and absorption, of simulated interstellar grains including SiC, silicates, and carbon grains produced in the laboratory. (2) Develop the capabilities to release grains and volatile materials into the near-Earth environment and study their dynamics and optical properties. (3) Study the interaction of released materials with the near-Earth environment to elucidate grain behavior in astrophysical environments. Interaction of grains with their environment may, for example, lead to grain alignment or coagulation, which results in observable phenomena such as polarization of light or a change in the scattering properties of the grains.

  2. Tokamak control simulator

    International Nuclear Information System (INIS)

    Edelbaum, T.N.; Serben, S.; Var, R.E.

    1976-01-01

    A computer model of a tokamak experimental power reactor and its control system is being constructed. This simulator will allow the exploration of various open loop and closed loop strategies for reactor control. This paper provides a brief description of the simulator and some of the potential control problems associated with this class of tokamaks

  3. Computer simulation of human motion in sports biomechanics.

    Science.gov (United States)

    Vaughan, C L

    1984-01-01

    This chapter has covered some important aspects of the computer simulation of human motion in sports biomechanics. First the definition and the advantages and limitations of computer simulation were discussed; second, research on various sporting activities were reviewed. These activities included basic movements, aquatic sports, track and field athletics, winter sports, gymnastics, and striking sports. This list was not exhaustive and certain material has, of necessity, been omitted. However, it was felt that a sufficiently broad and interesting range of activities was chosen to illustrate both the advantages and the pitfalls of simulation. It is almost a decade since Miller [53] wrote a review chapter similar to this one. One might be tempted to say that things have changed radically since then--that computer simulation is now a widely accepted and readily applied research tool in sports biomechanics. This is simply not true, however. Biomechanics researchers still tend to emphasize the descriptive type of study, often unfortunately, when a little theoretical explanation would have been more helpful [29]. What will the next decade bring? Of one thing we can be certain: The power of computers, particularly the readily accessible and portable microcomputer, will expand beyond all recognition. The memory and storage capacities will increase dramatically on the hardware side, and on the software side the trend will be toward "user-friendliness." It is likely that a number of software simulation packages designed specifically for studying human motion [31, 96] will be extensively tested and could gain wide acceptance in the biomechanics research community. Nevertheless, a familiarity with Newtonian and Lagrangian mechanics, optimization theory, and computers in general, as well as practical biomechanical insight, will still be a prerequisite for successful simulation models of human motion. Above all, the biomechanics researcher will still have to bear in mind that

  4. The Earth2Class Model for Professional Development to Implement the Next Generation Science Standards

    Science.gov (United States)

    Passow, M. J.; Assumpcao, C. M.; Baggio, F. D.; Hemming, S. R.; Goodwillie, A. M.; Brenner, C.

    2014-12-01

    Professional development for teachers involved in the implementation of the Next Generation Science Standards (NGSS) will require a multifaceted approach combining curriculum development, understanding the nature of science, applications of engineering and technology, integrating reading and writing, and other pedagogical components. The Earth2Class Workshops (E2C) at the Lamont-Doherty Earth Observatory of Columbia University (LDEO) provides one model for creating effective training to meet the NGSS challenges. E2C has provided more than 135 workshops since 1998 that have brought together LDEO research scientists with classroom teachers and students from the New York metropolitan area and elsewhere. Each session provides teachers with the chance to learn first-hand about the wide range of investigations conducted at LDEO. This approach aligns strongly with the NGSS goals: mastery of the disciplinary core ideas, science and engineering practices, understanding the nature of science, and cross-cutting relationships. During workshops, participating teachers interact with scientists to gain understanding of what stimulated research questions, how scientists put together all the components of investigations, and ways in which results are disseminated. Networking among teachers often leads to developing lesson plans based on the science, as well as support for professional growth not always possible within the school setting. Through the E2C website www.earth2class.org, teachers and students not able to attend the live workshops can access archival versions of the sessions. The website also provides a wide variety of educational resources. These have proved to be valuable on a national basis, as evidenced by an average of more than 300,000 hits per month from thousands of site visitors. Participating researchers have found E2C to be an effective approach to provide broader outreach of their results. During the next couple of years, the E2C program will expand to provide

  5. Simulation of biological ion channels with technology computer-aided design.

    Science.gov (United States)

    Pandey, Santosh; Bortei-Doku, Akwete; White, Marvin H

    2007-01-01

    Computer simulations of realistic ion channel structures have always been challenging and a subject of rigorous study. Simulations based on continuum electrostatics have proven to be computationally cheap and reasonably accurate in predicting a channel's behavior. In this paper we discuss the use of a device simulator, SILVACO, to build a solid-state model for KcsA channel and study its steady-state response. SILVACO is a well-established program, typically used by electrical engineers to simulate the process flow and electrical characteristics of solid-state devices. By employing this simulation program, we have presented an alternative computing platform for performing ion channel simulations, besides the known methods of writing codes in programming languages. With the ease of varying the different parameters in the channel's vestibule and the ability of incorporating surface charges, we have shown the wide-ranging possibilities of using a device simulator for ion channel simulations. Our simulated results closely agree with the experimental data, validating our model.

  6. Computational algorithms for simulations in atmospheric optics.

    Science.gov (United States)

    Konyaev, P A; Lukin, V P

    2016-04-20

    A computer simulation technique for atmospheric and adaptive optics based on parallel programming is discussed. A parallel propagation algorithm is designed and a modified spectral-phase method for computer generation of 2D time-variant random fields is developed. Temporal power spectra of Laguerre-Gaussian beam fluctuations are considered as an example to illustrate the applications discussed. Implementation of the proposed algorithms using Intel MKL and IPP libraries and NVIDIA CUDA technology is shown to be very fast and accurate. The hardware system for the computer simulation is an off-the-shelf desktop with an Intel Core i7-4790K CPU operating at a turbo-speed frequency up to 5 GHz and an NVIDIA GeForce GTX-960 graphics accelerator with 1024 1.5 GHz processors.
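
    A basic spectral-phase construction (not the modified method of the paper) generates a 2D random field by filtering white Gaussian noise with a prescribed power spectrum in the spatial-frequency domain. The Python sketch below uses a Kolmogorov-like phase spectrum; the constants and overall scaling are illustrative rather than carefully calibrated.

    import numpy as np

    def random_phase_screen(n, delta, r0, rng=None):
        """2D random field with a Kolmogorov-like spectrum.
        n: grid size, delta: grid spacing [m], r0: Fried parameter [m]."""
        rng = rng or np.random.default_rng()
        df = 1.0 / (n * delta)                                  # frequency spacing [1/m]
        fx = np.fft.fftfreq(n, d=delta)
        fxx, fyy = np.meshgrid(fx, fx)
        f = np.sqrt(fxx**2 + fyy**2)
        f[0, 0] = 1e-9                                          # avoid division by zero
        psd = 0.023 * r0**(-5.0 / 3.0) * f**(-11.0 / 3.0)       # Kolmogorov phase PSD
        psd[0, 0] = 0.0                                         # remove the piston term
        noise = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
        screen = np.fft.ifft2(noise * np.sqrt(psd) * df) * n * n
        return np.real(screen)

    phase = random_phase_screen(n=256, delta=0.01, r0=0.1)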

  7. Computational Fluid Dynamics (CFD) Simulations of Jet Mixing in Tanks of Different Scales

    Science.gov (United States)

    Breisacher, Kevin; Moder, Jeffrey

    2010-01-01

    For long-duration in-space storage of cryogenic propellants, an axial jet mixer is one concept for controlling tank pressure and reducing thermal stratification. Extensive ground-test data from the 1960s to the present exist for tank diameters of 10 ft or less. The design of axial jet mixers for tanks on the order of 30 ft diameter, such as those planned for the Ares V Earth Departure Stage (EDS) LH2 tank, will require scaling of available experimental data from much smaller tanks, as well as designing for microgravity effects. This study will assess the ability of Computational Fluid Dynamics (CFD) to handle a change of scale of this magnitude by performing simulations of existing ground-based axial jet mixing experiments at two tank sizes differing by a factor of ten. Simulations of several axial jet configurations for an Ares V scale EDS LH2 tank during low Earth orbit (LEO) coast are evaluated and selected results are also presented. Data from jet mixing experiments performed in the 1960s by General Dynamics with water at two tank sizes (1 and 10 ft diameter) are used to evaluate CFD accuracy. Jet nozzle diameters ranged from 0.032 to 0.25 in. for the 1 ft diameter tank experiments and from 0.625 to 0.875 in. for the 10 ft diameter tank experiments. Thermally stratified layers were created in both tanks prior to turning on the jet mixer. Jet mixer efficiency was determined by monitoring the temperatures on thermocouple rakes in the tanks to determine when the stratified layer was mixed out. Dye was frequently injected into the stratified tank and its penetration recorded. There were no velocities or turbulence quantities available in the experimental data. A commercially available, time-accurate, multi-dimensional CFD code with free surface tracking (FLOW-3D from Flow Science, Inc.) is used for the simulations presented. Comparisons are made between computed temperatures at various axial locations in the tank at different times and those observed experimentally. The

  8. SiMon: Simulation Monitor for Computational Astrophysics

    Science.gov (United States)

    Xuran Qian, Penny; Cai, Maxwell Xu; Portegies Zwart, Simon; Zhu, Ming

    2017-09-01

    Scientific discovery via numerical simulations is important in modern astrophysics. This relatively new branch of astrophysics has become possible due to the development of reliable numerical algorithms and the high performance of modern computing technologies. These enable the analysis of large collections of observational data and the acquisition of new data via simulations at unprecedented accuracy and resolution. Ideally, simulations run until they reach some pre-determined termination condition, but often other factors cause extensive numerical approaches to break down at an earlier stage. In those cases, processes tend to be interrupted due to unexpected events in the software or the hardware, and the scientist handles the interrupt manually, which is time-consuming and prone to errors. We present the Simulation Monitor (SiMon) to automate the farming of large and extensive simulation processes. Our method is lightweight, it fully automates the entire workflow management, operates concurrently across multiple platforms and can be installed in user space. Inspired by the process of crop farming, we perceive each simulation as a crop in the field, and a running simulation becomes analogous to growing crops. With the development of SiMon we ease the technical burden of simulation management. The initial package was developed for extensive parameter searches in numerical simulations, but it turns out to work equally well for automating the computational processing and reduction of observational data.

  9. Proceedings of the 2011 New York Workshop on Computer, Earth and Space Science

    CERN Document Server

    Naud, Catherine; CESS2011

    2011-01-01

    The purpose of the New York Workshop on Computer, Earth and Space Sciences is to bring together the New York area's finest Astronomers, Statisticians, Computer Scientists, Space and Earth Scientists to explore potential synergies between their respective fields. The 2011 edition (CESS2011) was a great success, and we would like to thank all of the presenters and participants for attending. This year was also special as it included authors from the upcoming book titled "Advances in Machine Learning and Data Mining for Astronomy". Over two days, the latest advanced techniques used to analyze the vast amounts of information now available for the understanding of our universe and our planet were presented. These proceedings attempt to provide a small window into what the current state of research is in this vast interdisciplinary field and we'd like to thank the speakers who spent the time to contribute to this volume.

  10. Computer Series, 98. Electronics for Scientists: A Computer-Intensive Approach.

    Science.gov (United States)

    Scheeline, Alexander; Mork, Brian J.

    1988-01-01

    Reports the design for a principles-before-details presentation of electronics for an instrumental analysis class. Uses computers for data collection and simulations. Requires one semester with two 2.5-hour periods and two lectures per week. Includes lab and lecture syllabi. (MVL)

  11. Computer Simulation of Diffraction Patterns.

    Science.gov (United States)

    Dodd, N. A.

    1983-01-01

    Describes an Apple computer program (listing available from author) which simulates Fraunhofer and Fresnel diffraction using vector addition techniques (vector chaining) and allows user to experiment with different shaped multiple apertures. Graphics output include vector resultants, phase difference, diffraction patterns, and the Cornu spiral…
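
    The "vector chaining" idea amounts to summing unit phasors contributed by points spread across the aperture and squaring the magnitude of the resultant. A minimal single-slit Fraunhofer calculation in Python (not the Apple program itself) is sketched below; it reproduces the familiar sinc-squared pattern.

    import numpy as np

    def slit_pattern(slit_width, wavelength, angles, n_sources=200):
        """Fraunhofer pattern of a single slit by phasor summation."""
        ys = np.linspace(-slit_width / 2, slit_width / 2, n_sources)
        intensity = []
        for theta in angles:
            phases = 2.0 * np.pi * ys * np.sin(theta) / wavelength
            resultant = np.sum(np.exp(1j * phases))        # chain of unit vectors
            intensity.append(abs(resultant) ** 2)
        return np.array(intensity) / n_sources**2          # normalized to 1 at center

    angles = np.linspace(-0.02, 0.02, 801)                  # observation angles [rad]
    pattern = slit_pattern(slit_width=1e-4, wavelength=5e-7, angles=angles)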

  12. Quantum Genetic Algorithms for Computer Scientists

    Directory of Open Access Journals (Sweden)

    Rafael Lahoz-Beltra

    2016-10-01

    Genetic algorithms (GAs) are a class of evolutionary algorithms inspired by Darwinian natural selection. They are popular heuristic optimisation methods based on simulated genetic mechanisms, i.e., mutation, crossover, etc., and population dynamical processes such as reproduction, selection, etc. Over the last decade, the possibility to emulate a quantum computer (a computer using quantum-mechanical phenomena to perform operations on data) has led to a new class of GAs known as “Quantum Genetic Algorithms” (QGAs). In this review, we present a discussion, future potential, pros and cons of this new class of GAs. The review will be oriented towards computer scientists interested in QGAs “avoiding” the possible difficulties of quantum-mechanical phenomena.
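
    To make the qubit-chromosome idea concrete, the following minimal Python sketch (an illustration under simplifying assumptions, not any specific algorithm from the review) encodes each individual as a vector of qubit angles, "measures" them into bit strings, and rotates the angles toward the best string found, here on the toy OneMax problem.

    import numpy as np

    rng = np.random.default_rng(0)
    n_bits, pop_size, generations = 16, 10, 50
    delta = 0.05 * np.pi                       # rotation-gate step size

    # Each individual is a vector of qubit angles; P(bit = 1) = sin(theta)^2.
    theta = np.full((pop_size, n_bits), np.pi / 4)

    def fitness(bits):
        return bits.sum()                      # OneMax: maximize the number of ones

    best_bits, best_fit = None, -1
    for _ in range(generations):
        # "Measure" every qubit to collapse the population into binary strings.
        bits = (rng.random((pop_size, n_bits)) < np.sin(theta) ** 2).astype(int)
        fits = np.array([fitness(b) for b in bits])
        if fits.max() > best_fit:
            best_fit, best_bits = fits.max(), bits[fits.argmax()].copy()
        # Rotation-gate update: nudge each qubit toward the best bit string found.
        direction = np.where(best_bits == 1, 1.0, -1.0)
        theta = np.clip(theta + delta * direction, 0.01, np.pi / 2 - 0.01)

    print(best_fit, best_bits)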

  13. The QuakeSim Project: Numerical Simulations for Active Tectonic Processes

    Science.gov (United States)

    Donnellan, Andrea; Parker, Jay; Lyzenga, Greg; Granat, Robert; Fox, Geoffrey; Pierce, Marlon; Rundle, John; McLeod, Dennis; Grant, Lisa; Tullis, Terry

    2004-01-01

    In order to develop a solid earth science framework for understanding and studying active tectonic and earthquake processes, this task develops simulation and analysis tools to study the physics of earthquakes using state-of-the-art modeling, data manipulation, and pattern recognition technologies. We develop clearly defined, accessible data formats and code protocols as inputs to the simulations. These are adapted to high-performance computers because the solid earth system is extremely complex and nonlinear, resulting in computationally intensive problems with millions of unknowns. With these tools it will be possible to construct the more complex models and simulations necessary to develop hazard assessment systems critical for reducing future losses from major earthquakes.

  14. Cloud Computing Technologies in Writing Class: Factors Influencing Students' Learning Experience

    Science.gov (United States)

    Wang, Jenny

    2017-01-01

    The proposed interactive online group within the cloud computing technologies as a main contribution of this paper provides easy and simple access to the cloud-based Software as a Service (SaaS) system and delivers effective educational tools for students and teacher on after-class group writing assignment activities. Therefore, this study…

  15. [Animal experimentation, computer simulation and surgical research].

    Science.gov (United States)

    Carpentier, Alain

    2009-11-01

    We live in a digital world. In medicine, computers are providing new tools for data collection, imaging, and treatment. During research and development of complex technologies and devices such as artificial hearts, computer simulation can provide more reliable information than experimentation on large animals. In these specific settings, animal experimentation should serve more to validate computer models of complex devices than to demonstrate their reliability.

  16. Computational Approach for Improving Three-Dimensional Sub-Surface Earth Structure for Regional Earthquake Hazard Simulations in the San Francisco Bay Area

    Energy Technology Data Exchange (ETDEWEB)

    Rodgers, A. J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-09-25

    In our Exascale Computing Project (ECP) we seek to simulate earthquake ground motions at much higher frequency than is currently possible. Previous simulations in the SFBA were limited to 0.5-1 Hz or lower (Aagaard et al. 2008, 2010), while we have recently simulated the response to 5 Hz. In order to improve confidence in simulated ground motions, we must accurately represent the three-dimensional (3D) sub-surface material properties that govern seismic wave propagation over a broad region. We are currently focusing on the San Francisco Bay Area (SFBA) with a Cartesian domain of size 120 x 80 x 35 km, but this area will be expanded to cover a larger domain. Currently, the United States Geological Survey (USGS) has a 3D model of the SFBA for seismic simulations. However, this model suffers from two serious shortcomings relative to our application: 1) it does not fit most of the available low frequency (< 1 Hz) seismic waveforms from moderate (magnitude M 3.5-5.0) earthquakes; and 2) it is represented with much lower resolution than necessary for the high frequency simulations (> 5 Hz) we seek to perform. The current model will serve as a starting model for full waveform tomography based on 3D sensitivity kernels. This report serves as the deliverable for our ECP FY2017 Quarter 4 milestone to FY 2018 “Computational approach to developing model updates”. We summarize the current state of 3D seismic simulations in the SFBA and demonstrate the performance of the USGS 3D model for a few selected paths. We show the available open-source waveform data sets for model updates, based on moderate earthquakes recorded in the region. We present a plan for improving the 3D model utilizing the available data and further development of our SW4 application. We project how the model could be improved and present options for further improvements focused on the shallow geotechnical layers using dense passive recordings of ambient and human-induced noise.

  17. Class network routing

    Science.gov (United States)

    Bhanot, Gyan [Princeton, NJ; Blumrich, Matthias A [Ridgefield, CT; Chen, Dong [Croton On Hudson, NY; Coteus, Paul W [Yorktown Heights, NY; Gara, Alan G [Mount Kisco, NY; Giampapa, Mark E [Irvington, NY; Heidelberger, Philip [Cortlandt Manor, NY; Steinmacher-Burow, Burkhard D [Mount Kisco, NY; Takken, Todd E [Mount Kisco, NY; Vranas, Pavlos M [Bedford Hills, NY

    2009-09-08

    Class network routing is implemented in a network such as a computer network comprising a plurality of parallel compute processors at nodes thereof. Class network routing allows a compute processor to broadcast a message to a range (one or more) of other compute processors in the computer network, such as processors in a column or a row. Normally this type of operation requires a separate message to be sent to each processor. With class network routing pursuant to the invention, a single message is sufficient, which generally reduces the total number of messages in the network as well as the latency to do a broadcast. Class network routing is also applied to dense matrix inversion algorithms on distributed memory parallel supercomputers with hardware class function (multicast) capability. This is achieved by exploiting the fact that the communication patterns of dense matrix inversion can be served by hardware class functions, which results in faster execution times.
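
    The patent describes a hardware multicast (class function) inside the machine's network, but the effect it achieves — one send reaching every processor in a row or column instead of a separate message per processor — can be illustrated in software with MPI sub-communicators. A hedged sketch using mpi4py (an analogy only, not the BlueGene hardware mechanism):

    ```python
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()
    size = comm.Get_size()

    # Arrange ranks logically as a grid that is `cols` processors wide
    # (assumes the process count is divisible by cols)
    cols = 4
    row, col = divmod(rank, cols)

    # One sub-communicator per row: a single Bcast then reaches the whole row,
    # instead of a separate point-to-point message to every processor in it.
    row_comm = comm.Split(color=row, key=col)

    data = {"payload": rank} if col == 0 else None
    data = row_comm.bcast(data, root=0)   # root is the first processor of each row
    ```

    Run with, for example, `mpiexec -n 8 python row_bcast.py`; the assumed 4-column layout requires the process count to be a multiple of 4.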

  18. A Computational Framework for Efficient Low Temperature Plasma Simulations

    Science.gov (United States)

    Verma, Abhishek Kumar; Venkattraman, Ayyaswamy

    2016-10-01

    Over the past years, scientific computing has emerged as an essential tool for the investigation and prediction of low temperature plasma (LTP) applications, which include electronics, nanomaterial synthesis, metamaterials, etc. To further explore LTP behavior with greater fidelity, we present a computational toolbox developed to perform LTP simulations. This framework will allow us to enhance our understanding of multiscale plasma phenomena using high performance computing tools, mainly based on the OpenFOAM FVM distribution. Although aimed at microplasma simulations, the modular framework is able to perform multiscale, multiphysics simulations of physical systems comprising LTPs. Salient introductory features include the capability to perform parallel, 3D simulations of LTP applications on unstructured meshes. Performance of the solver is tested based on numerical results assessing the accuracy and efficiency of benchmarks for problems in microdischarge devices. Numerical simulation of a microplasma reactor at atmospheric pressure with hemispherical dielectric coated electrodes will be discussed, providing an overview of the applicability and future scope of this framework.

  19. Describing Earth system simulations with the Metafor CIM

    Directory of Open Access Journals (Sweden)

    B. N. Lawrence

    2012-11-01

    Full Text Available The Metafor project has developed a common information model (CIM using the ISO19100 series formalism to describe numerical experiments carried out by the Earth system modelling community, the models they use, and the simulations that result. Here we describe the mechanism by which the CIM was developed, and its key properties. We introduce the conceptual and application versions and the controlled vocabularies developed in the context of supporting the fifth Coupled Model Intercomparison Project (CMIP5. We describe how the CIM has been used in experiments to describe model coupling properties and describe the near term expected evolution of the CIM.

  20. Use of computer graphics simulation for teaching of flexible sigmoidoscopy.

    Science.gov (United States)

    Baillie, J; Jowell, P; Evangelou, H; Bickel, W; Cotton, P

    1991-05-01

    The concept of simulation training in endoscopy is now well-established. The systems currently under development employ either computer graphics simulation or interactive video technology; each has its strengths and weaknesses. A flexible sigmoidoscopy training device has been designed which uses graphic routines--such as object oriented programming and double buffering--in entirely new ways. These programming techniques compensate for the limitations of currently available desk-top microcomputers. By boosting existing computer 'horsepower' with next generation coprocessors and sophisticated graphics tools such as intensity interpolation (Gouraud shading), the realism of computer simulation of flexible sigmoidoscopy is being greatly enhanced. The computer program has teaching and scoring capabilities, making it a truly interactive system. Use has been made of this ability to record, grade and store each trainee encounter in computer memory as part of a multi-center, prospective trial of simulation training being conducted currently in the USA. A new input device, a dummy endoscope, has been designed that allows application of variable resistance to the insertion tube. This greatly enhances tactile feedback, such as resistance during looping. If carefully designed trials show that computer simulation is an attractive and effective training tool, it is expected that this technology will evolve rapidly and be made widely available to trainee endoscopists.
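
    Of the graphics techniques mentioned, intensity interpolation (Gouraud shading) is the simplest to show in isolation: per-vertex intensities are blended across each triangle using barycentric weights. A small illustrative sketch, unrelated to the original simulator's code:

    ```python
    import numpy as np

    def gouraud_shade(p, tri, vertex_intensity):
        """Interpolate per-vertex intensities at point p inside triangle tri."""
        a, b, c = (np.asarray(v, dtype=float) for v in tri)
        p = np.asarray(p, dtype=float)

        def area(u, v, w):
            # Signed area of triangle (u, v, w)
            return 0.5 * ((v[0] - u[0]) * (w[1] - u[1]) - (w[0] - u[0]) * (v[1] - u[1]))

        total = area(a, b, c)
        w_a = area(p, b, c) / total   # barycentric weight of vertex a
        w_b = area(a, p, c) / total
        w_c = area(a, b, p) / total
        return w_a * vertex_intensity[0] + w_b * vertex_intensity[1] + w_c * vertex_intensity[2]

    # Intensity at the centroid is the mean of the vertex intensities (prints 0.5)
    print(gouraud_shade((1/3, 1/3), [(0, 0), (1, 0), (0, 1)], (0.2, 0.8, 0.5)))
    ```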

  1. On the role of cost-sensitive learning in multi-class brain-computer interfaces.

    Science.gov (United States)

    Devlaminck, Dieter; Waegeman, Willem; Wyns, Bart; Otte, Georges; Santens, Patrick

    2010-06-01

    Brain-computer interfaces (BCIs) present an alternative way of communication for people with severe disabilities. One of the shortcomings in current BCI systems, recently put forward in the fourth BCI competition, is the asynchronous detection of motor imagery versus resting state. We investigated this extension to the three-class case, in which the resting state is considered virtually lying between two motor classes, resulting in a large penalty when one motor task is misclassified into the other motor class. We particularly focus on the behavior of different machine-learning techniques and on the role of multi-class cost-sensitive learning in such a context. To this end, four different kernel methods are empirically compared, namely pairwise multi-class support vector machines (SVMs), two cost-sensitive multi-class SVMs and kernel-based ordinal regression. The experimental results illustrate that ordinal regression performs better than the other three approaches when a cost-sensitive performance measure such as the mean-squared error is considered. By contrast, multi-class cost-sensitive learning enables us to control the number of large errors made between two motor tasks.
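
    The point of the cost-sensitive view is that, with the resting state ordered between the two motor-imagery classes, confusing the two motor classes should cost more than confusing either with rest — which a mean-squared-error style measure captures while plain accuracy does not. A hedged sketch on synthetic data (scikit-learn's one-vs-one SVC stands in for the pairwise multi-class SVM; nothing here reproduces the paper's EEG data, kernels, or ordinal-regression model):

    ```python
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score, mean_squared_error

    rng = np.random.default_rng(0)
    # Synthetic 3-class data; label 1 = rest lies "between" motor classes 0 and 2
    X = np.vstack([rng.normal(m, 1.0, size=(200, 4)) for m in (-2.0, 0.0, 2.0)])
    y = np.repeat([0, 1, 2], 200)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    clf = SVC(kernel="rbf", decision_function_shape="ovo").fit(X_tr, y_tr)
    pred = clf.predict(X_te)

    # Accuracy treats all errors equally; MSE penalises motor<->motor confusions
    # (|0 - 2|^2 = 4) more heavily than motor<->rest confusions (|0 - 1|^2 = 1).
    print("accuracy:", accuracy_score(y_te, pred))
    print("cost-sensitive measure (MSE):", mean_squared_error(y_te, pred))
    ```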

  2. Noise simulation in cone beam CT imaging with parallel computing

    International Nuclear Information System (INIS)

    Tu, S.-J.; Shaw, Chris C; Chen, Lingyun

    2006-01-01

    We developed a computer noise simulation model for cone beam computed tomography imaging using a general purpose PC cluster. This model uses a mono-energetic x-ray approximation and allows us to investigate three primary performance components, specifically quantum noise, detector blurring and additive system noise. A parallel random number generator based on the Weyl sequence was implemented in the noise simulation and a visualization technique was accordingly developed to validate the quality of the parallel random number generator. In our computer simulation model, three-dimensional (3D) phantoms were mathematically modelled and used to create 450 analytical projections, which were then sampled into digital image data. Quantum noise was simulated and added to the analytical projection image data, which were then filtered to incorporate flat panel detector blurring. Additive system noise was generated and added to form the final projection images. The Feldkamp algorithm was implemented and used to reconstruct the 3D images of the phantoms. A 24-node dual-Xeon PC cluster was used to compute the projections and reconstructed images in parallel with each CPU processing 10 projection views for a total of 450 views. Based on this computer simulation system, simulated cone beam CT images were generated for various phantoms and technique settings. Noise power spectra for the flat panel x-ray detector and reconstructed images were then computed to characterize the noise properties. As an example among the potential applications of our noise simulation model, we showed that images of low contrast objects can be produced and used for image quality evaluation
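
    The three noise components described — quantum noise, detector blurring, and additive system noise — can be mocked up for a single projection with standard NumPy/SciPy operations. A minimal mono-energetic sketch; the incident photon count, blur width, and additive noise level are illustrative assumptions, and no Weyl-sequence parallel random number generator is attempted here:

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    rng = np.random.default_rng(42)

    def noisy_projection(line_integrals, i0=5e4, blur_sigma=1.0, sigma_add=2.0):
        """Turn noiseless line integrals into a noisy flat-panel projection."""
        expected = i0 * np.exp(-line_integrals)          # Beer-Lambert, mono-energetic
        quantum = rng.poisson(expected).astype(float)    # quantum (photon-counting) noise
        blurred = gaussian_filter(quantum, blur_sigma)   # detector blurring
        return blurred + rng.normal(0.0, sigma_add, line_integrals.shape)  # additive noise

    # Analytical projection of a centred disc phantom (a crude stand-in)
    xx, yy = np.meshgrid(np.linspace(-1, 1, 256), np.linspace(-1, 1, 256))
    phantom_projection = 0.5 * np.clip(1 - xx**2 - yy**2, 0, None)
    proj = noisy_projection(phantom_projection)
    ```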

  3. Turbulent geodynamo simulations: a leap towards Earth's core

    Science.gov (United States)

    Schaeffer, N.; Jault, D.; Nataf, H.-C.; Fournier, A.

    2017-10-01

    We present an attempt to reach realistic turbulent regime in direct numerical simulations of the geodynamo. We rely on a sequence of three convection-driven simulations in a rapidly rotating spherical shell. The most extreme case reaches towards the Earth's core regime by lowering viscosity (magnetic Prandtl number Pm = 0.1) while maintaining vigorous convection (magnetic Reynolds number Rm > 500) and rapid rotation (Ekman number E = 10^-7) at the limit of what is feasible on today's supercomputers. A detailed and comprehensive analysis highlights several key features matching geomagnetic observations or dynamo theory predictions—all present together in the same simulation—but it also unveils interesting insights relevant for Earth's core dynamics. In this strong-field, dipole-dominated dynamo simulation, the magnetic energy is one order of magnitude larger than the kinetic energy. The spatial distribution of magnetic intensity is highly heterogeneous, and a stark dynamical contrast exists between the interior and the exterior of the tangent cylinder (the cylinder parallel to the axis of rotation that circumscribes the inner core). In the interior, the magnetic field is strongest, and is associated with a vigorous twisted polar vortex, whose dynamics may occasionally lead to the formation of a reverse polar flux patch at the surface of the shell. Furthermore, the strong magnetic field also allows accumulation of light material within the tangent cylinder, leading to stable stratification there. Torsional Alfvén waves are frequently triggered in the vicinity of the tangent cylinder and propagate towards the equator. Outside the tangent cylinder, the magnetic field inhibits the growth of zonal winds and the kinetic energy is mostly non-zonal. Spatio-temporal analysis indicates that the low-frequency, non-zonal flow is quite geostrophic (columnar) and predominantly large-scale: an m = 1 eddy spontaneously emerges in our most extreme simulations, without any

  4. Simulation of Satellite, Airborne and Terrestrial LiDAR with DART (I):Waveform Simulation with Quasi-Monte Carlo Ray Tracing

    Science.gov (United States)

    Gastellu-Etchegorry, Jean-Philippe; Yin, Tiangang; Lauret, Nicolas; Grau, Eloi; Rubio, Jeremy; Cook, Bruce D.; Morton, Douglas C.; Sun, Guoqing

    2016-01-01

    Light Detection And Ranging (LiDAR) provides unique data on the 3-D structure of atmosphere constituents and the Earth's surface. Simulating LiDAR returns for different laser technologies and Earth scenes is fundamental for evaluating and interpreting signal and noise in LiDAR data. Different types of models are capable of simulating LiDAR waveforms of Earth surfaces. Semi-empirical and geometric models can be imprecise because they rely on simplified simulations of Earth surfaces and light interaction mechanisms. On the other hand, Monte Carlo ray tracing (MCRT) models are potentially accurate but require long computational time. Here, we present a new LiDAR waveform simulation tool that is based on the introduction of a quasi-Monte Carlo ray tracing approach in the Discrete Anisotropic Radiative Transfer (DART) model. Two new approaches, the so-called "box method" and "Ray Carlo method", are implemented to provide robust and accurate simulations of LiDAR waveforms for any landscape, atmosphere and LiDAR sensor configuration (view direction, footprint size, pulse characteristics, etc.). The box method accelerates the selection of the scattering direction of a photon in the presence of scatterers with non-invertible phase function. The Ray Carlo method brings traditional ray-tracking into MCRT simulation, which makes computational time independent of LiDAR field of view (FOV) and reception solid angle. Both methods are fast enough for simulating multi-pulse acquisition. Sensitivity studies with various landscapes and atmosphere constituents are presented, and the simulated LiDAR signals compare favorably with their associated reflectance images and Laser Vegetation Imaging Sensor (LVIS) waveforms. The LiDAR module is fully integrated into DART, enabling more detailed simulations of LiDAR sensitivity to specific scene elements (e.g., atmospheric aerosols, leaf area, branches, or topography) and sensor configuration for airborne or satellite LiDAR sensors.

  5. The Simulation and Analysis of the Closed Die Hot Forging Process by A Computer Simulation Method

    Directory of Open Access Journals (Sweden)

    Dipakkumar Gohil

    2012-06-01

    Full Text Available The objective of this research work is to study the variation of various parameters such as stress, strain, temperature, force, etc. during the closed die hot forging process. A computer simulation modeling approach has been adopted to transform the theoretical aspects into a computer algorithm which would be used to simulate and analyze the closed die hot forging process. For the purpose of process study, the entire deformation process has been divided into a finite number of steps appropriately and then the output values have been computed at each deformation step. The results of the simulation have been graphically represented and suitable corrective measures are also recommended, if the simulation results do not agree with the theoretical values. This computer simulation approach would significantly improve the productivity and reduce the energy consumption of the overall process for the components which are manufactured by the closed die forging process and contribute towards the efforts in reducing global warming.
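
    The stepwise scheme described — dividing the deformation into a finite number of increments and computing the output values at each step — can be illustrated with a simple open-die upsetting calculation and a power-law flow stress. A hedged sketch; the material constants, geometry, and frictionless assumption are illustrative, not taken from the paper:

    ```python
    import math

    # Illustrative material model: flow stress sigma = K * strain^n
    K, n = 150.0e6, 0.15           # Pa, strain-hardening exponent
    h0, d0 = 0.060, 0.040          # initial height and diameter of the billet (m)
    h_final, steps = 0.030, 30     # forge down to 30 mm in 30 increments

    volume = math.pi * d0**2 / 4 * h0
    for i in range(1, steps + 1):
        h = h0 - (h0 - h_final) * i / steps
        strain = math.log(h0 / h)              # true (logarithmic) strain
        sigma = K * strain**n                  # flow stress at this step
        area = volume / h                      # incompressibility: A = V / h
        force = sigma * area                   # frictionless upsetting force
        print(f"step {i:2d}: h={h*1e3:5.1f} mm  strain={strain:.3f}  "
              f"stress={sigma/1e6:6.1f} MPa  force={force/1e3:8.1f} kN")
    ```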

  6. TerraFERMA: Harnessing Advanced Computational Libraries in Earth Science

    Science.gov (United States)

    Wilson, C. R.; Spiegelman, M.; van Keken, P.

    2012-12-01

    Many important problems in Earth sciences can be described by non-linear coupled systems of partial differential equations. These "multi-physics" problems include thermo-chemical convection in Earth and planetary interiors, interactions of fluids and magmas with the Earth's mantle and crust and coupled flow of water and ice. These problems are of interest to a large community of researchers but are complicated to model and understand. Much of this complexity stems from the nature of multi-physics where small changes in the coupling between variables or constitutive relations can lead to radical changes in behavior, which in turn affect critical computational choices such as discretizations, solvers and preconditioners. To make progress in understanding such coupled systems requires a computational framework where multi-physics problems can be described at a high-level while maintaining the flexibility to easily modify the solution algorithm. Fortunately, recent advances in computational science provide a basis for implementing such a framework. Here we present the Transparent Finite Element Rapid Model Assembler (TerraFERMA), which leverages several advanced open-source libraries for core functionality. FEniCS (fenicsproject.org) provides a high level language for describing the weak forms of coupled systems of equations, and an automatic code generator that produces finite element assembly code. PETSc (www.mcs.anl.gov/petsc) provides a wide range of scalable linear and non-linear solvers that can be composed into effective multi-physics preconditioners. SPuD (amcg.ese.ic.ac.uk/Spud) is an application neutral options system that provides both human and machine-readable interfaces based on a single xml schema. Our software integrates these libraries and provides the user with a framework for exploring multi-physics problems. A single options file fully describes the problem, including all equations, coefficients and solver options. Custom compiled applications are

  7. Incorporating Earth Science into Other High School Science Classes

    Science.gov (United States)

    Manning, C. L. B.; Holzer, M.; Colson, M.; Courtier, A. M. B.; Jacobs, B. E.

    2016-12-01

    As states begin to review their standards, some adopt or adapt the NGSS and others write their own, many basing these on the Framework for K-12 Science Education. Both the NGSS and the Frameworks have an increased emphasis on Earth Science but many high school teachers are being asked to teach these standards in traditional Biology, Chemistry and Physics courses. At the Earth Educators Rendezvous, teachers, scientists, and science education researchers worked together to find the interconnections between the sciences using the NGSS and identified ways to reference the role of Earth Sciences in the other sciences during lectures, activities and laboratory assignments. Weaving Earth and Space sciences into the other curricular areas, the teams developed relevant problems for students to solve by focusing on using current issues, media stories, and community issues. These and other lessons and units of study will be presented along with other resources used by teachers to ensure students are gaining exposure and a deeper understanding of Earth and Space Science concepts.

  8. Prototyping and Simulating Parallel, Distributed Computations with VISA

    National Research Council Canada - National Science Library

    Demeure, Isabelle M; Nutt, Gary J

    1989-01-01

    ...] to support the design, prototyping, and simulation of parallel, distributed computations. In particular, VISA is meant to guide the choice of partitioning and communication strategies for such computations, based on their performance...

  9. MARSIS data and simulation exploited using array databases: PlanetServer/EarthServer for sounding radars

    Science.gov (United States)

    Cantini, Federico; Pio Rossi, Angelo; Orosei, Roberto; Baumann, Peter; Misev, Dimitar; Oosthoek, Jelmer; Beccati, Alan; Campalani, Piero; Unnithan, Vikram

    2014-05-01

    parallel computing has been developed and tested on a Tier 0 class HPC cluster computer located at CINECA, Bologna, Italy, to produce accurate simulations for the entire MARSIS dataset. Although the necessary computational resources have not yet been secured, through the HPC cluster at Jacobs University in Bremen it was possible to simulate a significant subset of orbits covering the area of the Medusae Fossae Formation (MFF), a seemingly soft, easily eroded deposit that extends for nearly 1,000 km along the equator of Mars (e.g. Watters et al., 2007; Carter et al., 2009). Besides the MARSIS data, simulations of the MARSIS surface clutter signal are included in the database to further improve its scientific value. Simulations will be available through the project portal to end users/scientists and they will eventually be provided in the PSA/PDS archives. References: Baumann, P. On the management of multidimensional discrete data. VLDB J. 4 (3), 401-444, Special Issue on Spatial Database Systems, 1994. Carter, L. M., Campbell, B. A., Watters, T. R., Phillips, R. J., Putzig, N. E., Safaeinili, A., Plaut, J., Okubo, C., Egan, A. F., Biccari, D., Orosei, R. (2009). Shallow radar (SHARAD) sounding observations of the Medusae Fossae Formation, Mars. Icarus, 199(2), 295-302. Nouvel, J.-F., Herique, A., Kofman, W., Safaeinili, A. 2004. Radar signal simulation: Surface modeling with the Facet Method. Radio Science 39, 1013. Oosthoek, J.H.P, Flahaut J., Rossi, A. P., Baumann, P., Misev, D., Campalani, P., Unnithan, V. (2013) PlanetServer: Innovative Approaches for the Online Analysis of Hyperspectral Satellite Data from Mars, Advances in Space Research. DOI: 10.1016/j.asr.2013.07.002 Picardi, G., and 33 colleagues 2005. Radar Soundings of the Subsurface of Mars. Science 310, 1925-1928. Rossi, A. P., Baumann, P., Oosthoek, J., Beccati, A., Cantini, F., Misev, D., Orosei, R., Flahaut, J., Campalani, P., Unnithan, V. (2014), Geophys. Res. Abs., Vol. 16, #EGU2014-5149, this meeting. Watters, T. R

  10. Slab cooling system design using computer simulation

    NARCIS (Netherlands)

    Lain, M.; Zmrhal, V.; Drkal, F.; Hensen, J.L.M.

    2007-01-01

    For a new technical library building in Prague, computer simulations were carried out to support the design of the slab cooling system and to optimize the capacity of the chillers. The paper presents the concept of the new technical library HVAC system, the model of the building, and results of the energy simulations for

  11. Biocellion: accelerating computer simulation of multicellular biological system models.

    Science.gov (United States)

    Kang, Seunghwa; Kahan, Simon; McDermott, Jason; Flann, Nicholas; Shmulevich, Ilya

    2014-11-01

    Biological system behaviors are often the outcome of complex interactions among a large number of cells and their biotic and abiotic environment. Computational biologists attempt to understand, predict and manipulate biological system behavior through mathematical modeling and computer simulation. Discrete agent-based modeling (in combination with high-resolution grids to model the extracellular environment) is a popular approach for building biological system models. However, the computational complexity of this approach forces computational biologists to resort to coarser resolution approaches to simulate large biological systems. High-performance parallel computers have the potential to address the computing challenge, but writing efficient software for parallel computers is difficult and time-consuming. We have developed Biocellion, a high-performance software framework, to solve this computing challenge using parallel computers. To support a wide range of multicellular biological system models, Biocellion asks users to provide their model specifics by filling the function body of pre-defined model routines. Using Biocellion, modelers without parallel computing expertise can efficiently exploit parallel computers with less effort than writing sequential programs from scratch. We simulate cell sorting, microbial patterning and a bacterial system in soil aggregate as case studies. Biocellion runs on x86 compatible systems with the 64 bit Linux operating system and is freely available for academic use. Visit http://biocellion.com for additional information.
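
    The design point described — modelers fill in the bodies of pre-defined model routines while the framework owns the agents and the time loop — can be caricatured in a few lines. A toy single-process sketch of that callback pattern (the routine names and the random-walk "biology" are illustrative assumptions, not Biocellion's actual API):

    ```python
    import random

    class Framework:
        """Owns the agents and the time loop; the model only fills in the hooks."""
        def __init__(self, model, n_agents=100, steps=50):
            self.model, self.steps = model, steps
            self.agents = [model.init_agent(i) for i in range(n_agents)]

        def run(self):
            for step in range(self.steps):
                for agent in self.agents:
                    self.model.update_agent(agent, step)   # user-supplied routine
                self.model.end_of_step(self.agents, step)  # user-supplied routine

    class RandomWalkModel:
        def init_agent(self, i):
            return {"id": i, "x": 0.0, "y": 0.0}

        def update_agent(self, agent, step):
            agent["x"] += random.uniform(-1, 1)
            agent["y"] += random.uniform(-1, 1)

        def end_of_step(self, agents, step):
            if step % 10 == 0:
                mean_x = sum(a["x"] for a in agents) / len(agents)
                print(f"step {step}: mean x = {mean_x:.2f}")

    Framework(RandomWalkModel()).run()
    ```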

  12. How Many Times Should One Run a Computational Simulation?

    DEFF Research Database (Denmark)

    Seri, Raffaello; Secchi, Davide

    2017-01-01

    This chapter is an attempt to answer the question “how many runs of a computational simulation should one do,” and it gives an answer by means of statistical analysis. After defining the nature of the problem and which types of simulation are mostly affected by it, the article introduces statistical…
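
    One standard statistical answer to the title question is to run a pilot batch, estimate the variance of the output of interest, and then choose the number of runs so that the confidence interval on the mean has a chosen half-width. A minimal sketch of that textbook reasoning (offered as illustration; it is not necessarily the chapter's own procedure):

    ```python
    import numpy as np
    from scipy import stats

    def runs_needed(pilot_outputs, half_width, confidence=0.95):
        """Runs required so the CI on the mean has the requested half-width."""
        s = np.std(pilot_outputs, ddof=1)          # pilot standard deviation
        z = stats.norm.ppf(0.5 + confidence / 2)   # e.g. 1.96 for 95% confidence
        return int(np.ceil((z * s / half_width) ** 2))

    rng = np.random.default_rng(1)
    pilot = rng.normal(loc=10.0, scale=3.0, size=30)   # 30 pilot simulation outputs
    print(runs_needed(pilot, half_width=0.5))          # ~ (1.96 * 3 / 0.5)^2 ≈ 139
    ```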

  13. An efficient algorithm for nucleolus and prekernel computation in some classes of TU-games

    NARCIS (Netherlands)

    Faigle, U.; Kern, Walter; Kuipers, J.

    1998-01-01

    We consider classes of TU-games. We show that we can efficiently compute an allocation in the intersection of the prekernel and the least core of the game if we can efficiently compute the minimum excess for any given allocation. In the case where the prekernel of the game contains exactly one core

  14. Simulation of climate change effects on streamflow, groundwater, and stream temperature using GSFLOW and SNTEMP in the Black Earth Creek Watershed, Wisconsin

    Science.gov (United States)

    Hunt, Randall J.; Westenbroek, Stephen M.; Walker, John F.; Selbig, William R.; Regan, R. Steven; Leaf, Andrew T.; Saad, David A.

    2016-08-23

    A groundwater/surface-water model was constructed and calibrated for the Black Earth Creek watershed in south-central Wisconsin. The model was then run to simulate scenarios representing common societal concerns in the basin, focusing on maintaining a cold-water resource in an urbanizing fringe near its upper stream reaches and minimizing downstream flooding. Although groundwater and surface water are considered a single resource, many hydrologic models simplistically simulate feedback loops between the groundwater system and other hydrologic processes. These feedbacks include timing and rates of evapotranspiration, surface runoff, soil-zone flow, and interactions with the groundwater system; however, computer models can now routinely and iteratively couple the surface-water and groundwater systems—albeit with longer model run times. In this study, preliminary calibrations of uncoupled transient surface-water and steady-state groundwater models were used to form the starting point for final calibration of one transient computer simulation that iteratively couples groundwater and surface water. The computer code GSFLOW (Groundwater/Surface-water FLOW) was used to simulate the coupled hydrologic system; a surface-water model represented hydrologic processes in the atmosphere, at land surface, and within the soil zone, and a groundwater-flow model represented the unsaturated zone, saturated zone, and streams. The coupled GSFLOW model was run on a daily time step during water years 1985–2007. Early simulation times (1985–2000) were used for spin-up to make the simulation results less sensitive to initial conditions specified; the spin-up period was not included in the model calibration. Model calibration used observed heads, streamflows, solar radiation, and snowpack measurements from 2000 to 2007 for history matching. Calibration was performed by using the PEST parameter estimation software suite.

  15. Computer simulation of gear tooth manufacturing processes

    Science.gov (United States)

    Mavriplis, Dimitri; Huston, Ronald L.

    1990-01-01

    The use of computer graphics to simulate gear tooth manufacturing procedures is discussed. An analytical basis for the simulation is established for spur gears. The simulation itself, however, is developed not only for spur gears, but for straight bevel gears as well. The applications of the developed procedure extend from the development of finite element models of heretofore intractable geometrical forms, to exploring the fabrication of nonstandard tooth forms.

  16. A Fast Synthetic Aperture Radar Raw Data Simulation Using Cloud Computing.

    Science.gov (United States)

    Li, Zhixin; Su, Dandan; Zhu, Haijiang; Li, Wei; Zhang, Fan; Li, Ruirui

    2017-01-08

    Synthetic Aperture Radar (SAR) raw data simulation is a fundamental problem in radar system design and imaging algorithm research. The growth of surveying swath and resolution results in a significant increase in data volume and simulation period, which can be considered to be a comprehensive data-intensive and computing-intensive issue. Although several high performance computing (HPC) methods have demonstrated their potential for accelerating simulation, the input/output (I/O) bottleneck of huge raw data has not been eased. In this paper, we propose a cloud computing based SAR raw data simulation algorithm, which employs the MapReduce model to accelerate the raw data computing and the Hadoop distributed file system (HDFS) for fast I/O access. The MapReduce model is designed for the irregular parallel accumulation of raw data simulation, which greatly reduces the parallel efficiency of graphics processing unit (GPU) based simulation methods. In addition, three kinds of optimization strategies are put forward from the aspects of programming model, HDFS configuration and scheduling. The experimental results show that the cloud computing based algorithm achieves a 4× speedup over the baseline serial approach in an 8-node cloud environment, and each optimization strategy can improve performance by about 20%. This work proves that the proposed cloud algorithm is capable of solving the computing intensive and data intensive issues in SAR raw data simulation, and is easily extended to large scale computing to achieve higher acceleration.
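
    The MapReduce decomposition the authors describe maps independent portions of the simulation to partial echo contributions and reduces them by summation into the raw data matrix. A toy single-machine sketch of that split using Python built-ins (idealised point targets and phase history, and no Hadoop/HDFS layer; all parameters are illustrative):

    ```python
    import numpy as np
    from functools import reduce

    n_pulses, n_range = 64, 256
    targets = [(20, 1.0), (130, 0.6), (200, 0.3)]   # (range bin, reflectivity)

    def map_pulse(pulse_idx):
        """Map step: one pulse's contribution to the raw data matrix."""
        echo = np.zeros((n_pulses, n_range), dtype=complex)
        for rbin, amp in targets:
            phase = 2 * np.pi * pulse_idx * rbin / n_range     # toy phase history
            echo[pulse_idx, rbin] = amp * np.exp(1j * phase)
        return echo

    def reduce_echoes(a, b):
        """Reduce step: accumulate partial raw-data matrices."""
        return a + b

    raw_data = reduce(reduce_echoes, (map_pulse(p) for p in range(n_pulses)))
    print(raw_data.shape, np.abs(raw_data).max())
    ```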

  17. Nanocrystalline material in toroidal cores for current transformer: analytical study and computational simulations

    Directory of Open Access Journals (Sweden)

    Benedito Antonio Luciano

    2005-12-01

    Full Text Available Based on electrical and magnetic properties, such as saturation magnetization, initial permeability, and coercivity, this work presents some considerations about the possible applications of nanocrystalline alloys in toroidal cores for current transformers. It is discussed how the magnetic characteristics of the core material affect the performance of the current transformer. From the magnetic characterization and the computational simulations, using the finite element method (FEM), it has been verified that, at the typical CT operating value of flux density, the properties of nanocrystalline alloys reinforce the hypothesis that the use of these materials in measurement CT cores can reduce the ratio and phase errors and can also improve its accuracy class.

  18. On a class of O(n²) problems in computational geometry

    NARCIS (Netherlands)

    Gajentaan, A.; Overmars, M.H.

    1993-01-01

    There are many problems in computational geometry for which the best known algorithms take time O(n²) (or more) in the worst case, while only very low lower bounds are known. In this paper we describe a large class of problems for which we prove that they are all at least as difficult as the following
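
    A classical example of a quadratic-time baseline problem in this area is 3SUM: given n numbers, decide whether any three of them sum to zero. A sketch of the standard O(n²) algorithm is given below as background (the truncated abstract above does not name the paper's reference problem, so this is context rather than a claim about the paper):

    ```python
    def has_three_sum(nums):
        """Return True if any three of the numbers sum to zero, in O(n^2) time."""
        nums = sorted(nums)
        n = len(nums)
        for i in range(n - 2):
            lo, hi = i + 1, n - 1
            while lo < hi:                      # two-pointer scan over the sorted tail
                s = nums[i] + nums[lo] + nums[hi]
                if s == 0:
                    return True
                if s < 0:
                    lo += 1
                else:
                    hi -= 1
        return False

    print(has_three_sum([-5, 1, 2, 3, 9]))   # True: -5 + 2 + 3 = 0
    print(has_three_sum([1, 2, 4, 8]))       # False
    ```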

  19. China’s Rare Earths Supply Forecast in 2025: A Dynamic Computable General Equilibrium Analysis

    Directory of Open Access Journals (Sweden)

    Jianping Ge

    2016-09-01

    Full Text Available The supply of rare earths in China has been the focus of significant attention in recent years. Due to changes in regulatory policies and the development of strategic emerging industries, it is critical to investigate the scenario of rare earth supplies in 2025. To address this question, this paper constructed a dynamic computable general equilibrium (DCGE) model to forecast the production, domestic supply, and export of China’s rare earths in 2025. Based on our analysis, production will increase by 10.8%–12.6% and achieve 116,335–118,260 tons of rare-earth oxide (REO) in 2025, based on recent extraction control during 2011–2016. Moreover, domestic supply and export will be 75,081–76,800 tons REO and 38,797–39,400 tons REO, respectively. The technological improvements on substitution and recycling will significantly decrease the supply and mining activities of rare earths. From a policy perspective, we found that the elimination of export regulations, including export quotas and export taxes, does have a negative impact on China’s future domestic supply of rare earths. The policy conflicts between the increase in investment in strategic emerging industries, and the increase in resource and environmental taxes on rare earths will also affect China’s rare earths supply in the future.

  20. Theoretical and Computational Studies of Rare Earth Substitutes: A Test-bed for Accelerated Materials Development

    Energy Technology Data Exchange (ETDEWEB)

    Benedict, Lorin X. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-10-26

    Hard permanent magnets in wide use typically involve expensive Rare Earth elements. In this effort, we investigated candidate permanent magnet materials which contain no Rare Earths, while simultaneously exploring improvements in theoretical methodology which enable the better prediction of magnetic properties relevant for the future design and optimization of permanent magnets. This included a detailed study of magnetocrystalline anisotropy energies, and the use of advanced simulation tools to better describe magnetic properties at elevated temperatures.

  1. The visual simulators for architecture and computer organization learning

    OpenAIRE

    Nikolić Boško; Grbanović Nenad; Đorđević Jovan

    2009-01-01

    The paper proposes a method for effective distance learning of architecture and computer organization. The proposed method is based on a software system that can be applied to any course in this field. Within this system, students can observe simulations of already created computer systems. The system also provides for the creation and simulation of switch systems.

  2. Programme for the simulation of the TPA-i 1001 computer on the CDC-1604-A computer

    International Nuclear Information System (INIS)

    Belyaev, A.V.

    1976-01-01

    The basic features and capabilities of the program simulating the TPA-i 1001 computer on the CDC-1604-A are described. The program is essentially aimed at the translation of programs written in the SLAHG language for TPA-type computers. The main part of the program simulates the operation of the central TPA processor. This subprogram successively performs the actions that change, in the necessary manner, the registers and memory states of the TPA computer. The simulated TPA computer has subprograms acting as analogues of external devices, i.e. the ASR-33 teletype, the FS 1501 tape reader, and the FACIT perforator. Work under the program takes 1.65–2 times less time than work with a TPA with the minimum set of external equipment.

  3. Large scale particle simulations in a virtual memory computer

    International Nuclear Information System (INIS)

    Gray, P.C.; Million, R.; Wagner, J.S.; Tajima, T.

    1983-01-01

    Virtual memory computers are capable of executing large-scale particle simulations even when the memory requirements exceed the computer core size. The required address space is automatically mapped onto slow disc memory by the operating system. When the simulation size is very large, frequent random accesses to slow memory occur during the charge accumulation and particle pushing processes. Accesses to slow memory significantly reduce the execution rate of the simulation. We demonstrate in this paper that with the proper choice of sorting algorithm, a nominal amount of sorting to keep physically adjacent particles near particles with neighboring array indices can reduce random access to slow memory, increase the efficiency of the I/O system, and hence, reduce the required computing time. (orig.)
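
    The sorting idea described — keeping physically adjacent particles near each other in the particle arrays so that charge accumulation and particle pushing walk memory (and hence swap pages) in order rather than at random — is easy to demonstrate. A minimal sketch using a cell-index sort with NumPy (array sizes and the 1D grid are illustrative, not the original code):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_particles, n_cells = 1_000_000, 1024
    x = rng.random(n_particles)                 # particle positions in [0, 1)

    # Cell index of each particle on a uniform 1D grid
    cell = np.minimum((x * n_cells).astype(int), n_cells - 1)

    # An occasional sort by cell index keeps physical neighbours in nearby array
    # slots, so the charge-accumulation pass below touches memory in order.
    order = np.argsort(cell, kind="stable")
    x, cell = x[order], cell[order]

    charge = np.zeros(n_cells)
    np.add.at(charge, cell, 1.0)                # nearest-grid-point charge deposit
    ```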

  4. Large-scale particle simulations in a virtual-memory computer

    International Nuclear Information System (INIS)

    Gray, P.C.; Wagner, J.S.; Tajima, T.; Million, R.

    1982-08-01

    Virtual memory computers are capable of executing large-scale particle simulations even when the memory requirements exceed the computer core size. The required address space is automatically mapped onto slow disc memory by the operating system. When the simulation size is very large, frequent random accesses to slow memory occur during the charge accumulation and particle pushing processes. Accesses to slow memory significantly reduce the execution rate of the simulation. We demonstrate in this paper that with the proper choice of sorting algorithm, a nominal amount of sorting to keep physically adjacent particles near particles with neighboring array indices can reduce random access to slow memory, increase the efficiency of the I/O system, and hence, reduce the required computing time

  5. Uses of Computer Simulation Models in Ag-Research and Everyday Life

    Science.gov (United States)

    When the news media talks about models they could be talking about role models, fashion models, conceptual models like the auto industry uses, or computer simulation models. A computer simulation model is a computer code that attempts to imitate the processes and functions of certain systems. There ...

  6. Advanced Simulation and Computing FY17 Implementation Plan, Version 0

    Energy Technology Data Exchange (ETDEWEB)

    McCoy, Michel [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Archer, Bill [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Hendrickson, Bruce [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wade, Doug [National Nuclear Security Administration (NNSA), Washington, DC (United States). Office of Advanced Simulation and Computing and Institutional Research and Development; Hoang, Thuc [National Nuclear Security Administration (NNSA), Washington, DC (United States). Computational Systems and Software Environment

    2016-08-29

    The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. ASC is now focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), and quantifying critical margins and uncertainties. Resolving each issue requires increasingly difficult analyses because the aging process has progressively moved the stockpile further away from the original test base. Where possible, the program also enables the use of high performance computing (HPC) and simulation tools to address broader national security needs, such as foreign nuclear weapon assessments and counter nuclear terrorism.

  7. Using computer simulations to facilitate conceptual understanding of electromagnetic induction

    Science.gov (United States)

    Lee, Yu-Fen

    This study investigated the use of computer simulations to facilitate conceptual understanding in physics. The use of computer simulations in the present study was grounded in a conceptual framework drawn from findings related to the use of computer simulations in physics education. To achieve the goal of effective utilization of computers for physics education, I first reviewed studies pertaining to computer simulations in physics education categorized by three different learning frameworks and studies comparing the effects of different simulation environments. My intent was to identify the learning context and factors for successful use of computer simulations in past studies and to learn from the studies which did not obtain a significant result. Based on the analysis of reviewed literature, I proposed effective approaches to integrate computer simulations in physics education. These approaches are consistent with well established education principles such as those suggested by How People Learn (Bransford, Brown, Cocking, Donovan, & Pellegrino, 2000). The research based approaches to integrated computer simulations in physics education form a learning framework called Concept Learning with Computer Simulations (CLCS) in the current study. The second component of this study was to examine the CLCS learning framework empirically. The participants were recruited from a public high school in Beijing, China. All participating students were randomly assigned to two groups, the experimental (CLCS) group and the control (TRAD) group. Research based computer simulations developed by the physics education research group at University of Colorado at Boulder were used to tackle common conceptual difficulties in learning electromagnetic induction. While interacting with computer simulations, CLCS students were asked to answer reflective questions designed to stimulate qualitative reasoning and explanation. After receiving model reasoning online, students were asked to submit

  8. A Computer Simulation of Community Pharmacy Practice for Educational Use.

    Science.gov (United States)

    Bindoff, Ivan; Ling, Tristan; Bereznicki, Luke; Westbury, Juanita; Chalmers, Leanne; Peterson, Gregory; Ollington, Robert

    2014-11-15

    To provide a computer-based learning method for pharmacy practice that is as effective as paper-based scenarios, but more engaging and less labor-intensive. We developed a flexible and customizable computer simulation of community pharmacy. Using it, the students would be able to work through scenarios which encapsulate the entirety of a patient presentation. We compared the traditional paper-based teaching method to our computer-based approach using equivalent scenarios. The paper-based group had 2 tutors while the computer group had none. Both groups were given a prescenario and postscenario clinical knowledge quiz and survey. Students in the computer-based group had generally greater improvements in their clinical knowledge score, and third-year students using the computer-based method also showed more improvements in history taking and counseling competencies. Third-year students also found the simulation fun and engaging. Our simulation of community pharmacy provided an educational experience as effective as the paper-based alternative, despite the lack of a human tutor.

  9. Seventeenth Workshop on Computer Simulation Studies in Condensed-Matter Physics

    CERN Document Server

    Landau, David P; Schütler, Heinz-Bernd; Computer Simulation Studies in Condensed-Matter Physics XVI

    2006-01-01

    This status report features the most recent developments in the field, spanning a wide range of topical areas in the computer simulation of condensed matter/materials physics. Both established and new topics are included, ranging from the statistical mechanics of classical magnetic spin models to electronic structure calculations, quantum simulations, and simulations of soft condensed matter. The book presents new physical results as well as novel methods of simulation and data analysis. Highlights of this volume include various aspects of non-equilibrium statistical mechanics, studies of properties of real materials using both classical model simulations and electronic structure calculations, and the use of computer simulations in teaching.

  10. Parallel Monte Carlo simulations on an ARC-enabled computing grid

    International Nuclear Information System (INIS)

    Nilsen, Jon K; Samset, Bjørn H

    2011-01-01

    Grid computing opens new possibilities for running heavy Monte Carlo simulations of physical systems in parallel. The presentation gives an overview of GaMPI, a system for running an MPI-based random walker simulation on grid resources. Integrating the ARC middleware and the new storage system Chelonia with the Ganga grid job submission and control system, we show that MPI jobs can be run on a world-wide computing grid with good performance and promising scaling properties. Results for relatively communication-heavy Monte Carlo simulations run on multiple heterogeneous, ARC-enabled computing clusters in several countries are presented.
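
    An MPI-based random-walker simulation of the kind described splits the walkers across ranks, gives each rank an independent random stream, and reduces the statistics at the end; the grid middleware then only needs to schedule the MPI job. A minimal mpi4py sketch (the walk itself is illustrative and does not depend on GaMPI, ARC, or Chelonia):

    ```python
    import numpy as np
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    walkers_per_rank, steps = 10_000, 1_000
    rng = np.random.default_rng(seed=rank)           # independent stream per rank

    # Each rank advances its own walkers; +/-1 steps on a line
    positions = rng.choice([-1, 1], size=(walkers_per_rank, steps)).sum(axis=1)
    local_msd = np.mean(positions.astype(float) ** 2)

    # Combine the per-rank mean square displacements on rank 0
    global_msd = comm.reduce(local_msd, op=MPI.SUM, root=0)
    if rank == 0:
        print("mean square displacement:", global_msd / size, "expected ~", steps)
    ```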

  11. Computer simulation in nuclear science and engineering

    International Nuclear Information System (INIS)

    Akiyama, Mamoru; Miya, Kenzo; Iwata, Shuichi; Yagawa, Genki; Kondo, Shusuke; Hoshino, Tsutomu; Shimizu, Akinao; Takahashi, Hiroshi; Nakagawa, Masatoshi.

    1992-01-01

    The numerical simulation technology used for the design of nuclear reactors spans a wide range of scientific fields and has matured through steady efforts toward high calculation accuracy in safety examinations, reliability verification tests, the assessment of operating experience, and so on. Taking the opportunity of putting numerical simulation to practical use in wide fields, this paper reviews the numerical simulation of the five basic equations which describe the natural world and the progress of the related technologies. It is expected that numerical simulation technology will contribute not only as a means of design study but also to the progress of science and technology, such as the construction of new innovative concepts and the exploration of new mechanisms and substances for which no models exist in the natural world. The development of atomic energy and the progress of computers, Boltzmann's transport equation and its periphery, the Navier-Stokes equation and its periphery, Maxwell's electromagnetic field equations and their periphery, the Schroedinger wave equation and its periphery, computational solid mechanics and its periphery, and probabilistic risk assessment and its periphery are described. (K.I.)

  12. An Evaluation of Mandibular Dental and Basal Arch Dimensions in Class I and Class II Division 1 Adult Syrian Patients using Cone-beam Computed Tomography.

    Science.gov (United States)

    Al-Hilal, Layal H; Sultan, Kinda; Hajeer, Mohammad Y; Mahmoud, Ghiath; Wanli, Abdulrahman A

    2018-04-01

    Aim: The aim of this study is (1) to inspect any difference in mandibular arch widths between males and females in class I and class II division 1 (class II-1) malocclusions using cone-beam computed tomography (CBCT), (2) to compare the mandibular dental and basal widths between the two groups, and (3) to investigate any possible correlation between dental and basal arch widths in both groups. Materials and methods: The CBCT images of 68 patients aged between 18 and 25 years consisted of 34 class I (17 males and 17 females) and 34 class II-1 (17 males and 17 females) patients who were recruited at the Department of Orthodontics, University of Damascus Dental School (Syria). Using on-demand three-dimensional (3D) software on axial views, facial axis points for dental measurements and basal bone center (BBC) points for basal measurements were identified on the lower canines and first molars. Dental and basal intercanine width (ICW) and intermolar width (IMW) were measured. Results: The independent t-test showed a statistically significant difference between males and females in several variables in both groups, and a statistically significant difference between the class I and class II-1 groups in the basal ICW for both genders and in the dental ICW for females only. In the class I group, Pearson's correlation coefficients between dental and basal measurements showed a strong correlation in the IMW for both genders (r > 0.73); in the class II-1 group, there was a moderate correlation in the females' IMW (r = 0.67). Class I patients had larger ICW than class II-1 patients in all measurements and had narrower IMW than class II-1 in most measurements for both genders. There were moderate-to-strong correlations between dental and basal dimensions. BBC points might be landmarks that accurately represent the basal bone arch. Clinical significance: CBCT-based assessments of dental and basal arch dimensions provide a great opportunity to accurately evaluate these aspects, to enhance clinicians' decisions regarding proper tooth movements, and to achieve

  13. Computational fluid dynamics simulations of light water reactor flows

    International Nuclear Information System (INIS)

    Tzanos, C.P.; Weber, D.P.

    1999-01-01

    Advances in computational fluid dynamics (CFD), turbulence simulation, and parallel computing have made feasible the development of three-dimensional (3-D) single-phase and two-phase flow CFD codes that can simulate fluid flow and heat transfer in realistic reactor geometries with significantly reduced reliance, especially in single phase, on empirical correlations. The objective of this work was to assess the predictive power and computational efficiency of a CFD code in the analysis of a challenging single-phase light water reactor problem, as well as to identify areas where further improvements are needed

  14. COMPUTER LEARNING SIMULATOR WITH VIRTUAL REALITY FOR OPHTHALMOLOGY

    Directory of Open Access Journals (Sweden)

    Valeria V. Gribova

    2013-01-01

    Full Text Available A toolset for a medical computer learning simulator with virtual reality for ophthalmology and its implementation are considered in the paper. The simulator is oriented towards professional skills training for students of medical universities.

  15. Simulation in computer forensics teaching: the student experience

    OpenAIRE

    Crellin, Jonathan; Adda, Mo; Duke-Williams, Emma; Chandler, Jane

    2011-01-01

    The use of simulation in teaching computing is well established, with digital forensic investigation being a subject area where the range of simulation required is both wide and varied, demanding a corresponding breadth of fidelity. Each type of simulation can be complex and expensive to set up, resulting in students having only limited opportunities to participate and learn from the simulation. For example, students' participation in mock trials in the University mock courtroom or in simulation...

  16. Molecular dynamics simulations and applications in computational toxicology and nanotoxicology.

    Science.gov (United States)

    Selvaraj, Chandrabose; Sakkiah, Sugunadevi; Tong, Weida; Hong, Huixiao

    2018-02-01

    Nanotoxicology studies the toxicity of nanomaterials and has been widely applied in biomedical research to explore the toxicity of various biological systems. Investigating biological systems through in vivo and in vitro methods is expensive and time-consuming. Therefore, computational toxicology, a multi-discipline field that utilizes computational power and algorithms to examine the toxicology of biological systems, has gained attention from scientists. Molecular dynamics (MD) simulations of biomolecules such as proteins and DNA are popular for understanding the interactions between biological systems and chemicals in computational toxicology. In this paper, we review MD simulation methods, the protocol for running MD simulations, and their applications in studies of toxicity and nanotechnology. We also briefly summarize some popular software tools for the execution of MD simulations. Published by Elsevier Ltd.
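
    At its core, an MD simulation of the kind reviewed integrates Newton's equations of motion with a time-stepping scheme such as velocity Verlet. A deliberately tiny sketch is shown below — two Lennard-Jones particles in reduced units; real toxicology and nanotoxicology studies rely on established MD packages rather than hand-written integrators:

    ```python
    import numpy as np

    def lj_force(r_vec, epsilon=1.0, sigma=1.0):
        """Lennard-Jones force on particle 0 due to particle 1 (reduced units)."""
        r = np.linalg.norm(r_vec)
        return 24 * epsilon * (2 * (sigma / r) ** 12 - (sigma / r) ** 6) * r_vec / r**2

    dt, mass = 0.001, 1.0
    pos = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0]])
    vel = np.zeros_like(pos)
    f = lj_force(pos[0] - pos[1])
    forces = np.array([f, -f])

    for step in range(2000):                      # velocity Verlet integration
        vel += 0.5 * dt * forces / mass
        pos += dt * vel
        f = lj_force(pos[0] - pos[1])
        forces = np.array([f, -f])
        vel += 0.5 * dt * forces / mass

    print("separation after run:", np.linalg.norm(pos[0] - pos[1]))
    ```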

  17. The effects of computer assisted physics experiment simulations on students' learning

    Directory of Open Access Journals (Sweden)

    Turhan Civelek

    2013-11-01

    Full Text Available The main goal of this study is to present the significant difference between the utilization of physics experiment simulations during lectures and traditional physics lectures. Two groups of 115 students were selected for the purpose of the study. The same subjects were taught to both groups; while one group of 115 had their science and technology lectures supported by physics experiment simulations for a month, the other group of 115 had their lectures in a traditional way. The research was conducted at Izzet Unver High School in Gungoren, Istanbul. The main resource of this research is the data collected through surveys. The survey is a result of the literature and the suggestions of experts on the topic. Thirty questions were prepared under ten topics. Two different surveys were conducted during the data collection. While the first survey questions focused on the effects of traditional lecturing on students, the second survey questions targeted the effects of lecturing supported by physics experiment simulations. The data collected from the surveys were coded into SPSS software and statistical analyses were conducted. In order to test the significance of the differences between the means, the t-test was utilized; 0.05 was chosen as the significance level. As a result of the analyses, significant differences were found in the students' satisfaction with class materials, their motivation, their learning speed, their interest in the class, and their contribution to the class. In findings such as the effect on students' learning, information availability, organization of information, students' integration into the class, and gaining different points of view, "lectures supported by physics experiment simulations" differ significantly from traditional lecturing. As a result of the literature review and the statistical analyses, "lectures supported via physics experiment simulations" seem to
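
    The comparison described — two independent groups and a t-test on mean survey scores at the 0.05 level — maps directly onto a few lines of SciPy. A sketch with made-up scores (illustrative only; the study's actual data are not reproduced):

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)
    # Hypothetical mean survey scores for the two groups of 115 students
    simulation_group = rng.normal(4.1, 0.6, 115)
    traditional_group = rng.normal(3.6, 0.7, 115)

    t_stat, p_value = stats.ttest_ind(simulation_group, traditional_group)
    alpha = 0.05
    print(f"t = {t_stat:.2f}, p = {p_value:.4f}, "
          f"{'significant' if p_value < alpha else 'not significant'} at {alpha}")
    ```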

  18. Space and Earth Sciences, Computer Systems, and Scientific Data Analysis Support, Volume 1

    Science.gov (United States)

    Estes, Ronald H. (Editor)

    1993-01-01

    This Final Progress Report covers the specific technical activities of Hughes STX Corporation for the last contract triannual period of 1 June through 30 Sep. 1993, in support of assigned task activities at Goddard Space Flight Center (GSFC). It also provides a brief summary of work throughout the contract period of performance on each active task. Technical activity is presented in Volume 1, while financial and level-of-effort data is presented in Volume 2. Technical support was provided to all Division and Laboratories of Goddard's Space Sciences and Earth Sciences Directorates. Types of support include: scientific programming, systems programming, computer management, mission planning, scientific investigation, data analysis, data processing, data base creation and maintenance, instrumentation development, and management services. Mission and instruments supported include: ROSAT, Astro-D, BBXRT, XTE, AXAF, GRO, COBE, WIND, UIT, SMM, STIS, HEIDI, DE, URAP, CRRES, Voyagers, ISEE, San Marco, LAGEOS, TOPEX/Poseidon, Pioneer-Venus, Galileo, Cassini, Nimbus-7/TOMS, Meteor-3/TOMS, FIFE, BOREAS, TRMM, AVHRR, and Landsat. Accomplishments include: development of computing programs for mission science and data analysis, supercomputer applications support, computer network support, computational upgrades for data archival and analysis centers, end-to-end management for mission data flow, scientific modeling and results in the fields of space and Earth physics, planning and design of GSFC VO DAAC and VO IMS, fabrication, assembly, and testing of mission instrumentation, and design of mission operations center.

  19. HIGH-PERFORMANCE COMPUTING FOR THE STUDY OF EARTH AND ENVIRONMENTAL SCIENCE MATERIALS USING SYNCHROTRON X-RAY COMPUTED MICROTOMOGRAPHY

    International Nuclear Information System (INIS)

    FENG, H.; JONES, K.W.; MCGUIGAN, M.; SMITH, G.J.; SPILETIC, J.

    2001-01-01

    Synchrotron x-ray computed microtomography (CMT) is a non-destructive method for examination of rock, soil, and other types of samples studied in the earth and environmental sciences. The high x-ray intensities of the synchrotron source make possible the acquisition of tomographic volumes at a high rate that requires the application of high-performance computing techniques for data reconstruction to produce the three-dimensional volumes, for their visualization, and for data analysis. These problems are exacerbated by the need to share information between collaborators at widely separated locations over both local and wide-area networks. A summary of the CMT technique and examples of applications are given here together with a discussion of the applications of high-performance computing methods to improve the experimental techniques and analysis of the data.

  20. HIGH-PERFORMANCE COMPUTING FOR THE STUDY OF EARTH AND ENVIRONMENTAL SCIENCE MATERIALS USING SYNCHROTRON X-RAY COMPUTED MICROTOMOGRAPHY.

    Energy Technology Data Exchange (ETDEWEB)

    FENG,H.; JONES,K.W.; MCGUIGAN,M.; SMITH,G.J.; SPILETIC,J.

    2001-10-12

    Synchrotron x-ray computed microtomography (CMT) is a non-destructive method for examination of rock, soil, and other types of samples studied in the earth and environmental sciences. The high x-ray intensities of the synchrotron source make possible the acquisition of tomographic volumes at a high rate that requires the application of high-performance computing techniques for data reconstruction to produce the three-dimensional volumes, for their visualization, and for data analysis. These problems are exacerbated by the need to share information between collaborators at widely separated locations over both local and wide-area networks. A summary of the CMT technique and examples of applications are given here together with a discussion of the applications of high-performance computing methods to improve the experimental techniques and analysis of the data.

  1. Computer simulation as representation of knowledge in education

    International Nuclear Information System (INIS)

    Krekic, Valerija Pinter; Namestovski, Zolt

    2009-01-01

    According to Aebli's operative method (1963) and Bruner's (1974) theory of representation, the development of the process of thinking in teaching has the following phases, or levels of abstraction: manipulation of specific things (specific phase), iconic representation (figural phase), and symbolic representation (symbolic phase). Modern information technology has contributed to the enrichment of teaching and learning processes, especially in the fields of natural sciences and mathematics and those of production and technology. Simulation appears as a new possibility in the representation of knowledge. According to Guetzkow (1972), simulation is an operative representation of reality from a relevant aspect. It is a model of an objective system, which is dynamic in itself. If that model is material, it is a simple simulation; if it is abstract, it is a reflective experiment, that is, a computer simulation. The present work deals with the systematization and classification of simulation methods in the teaching of natural sciences and mathematics and of production and technology, with a special retrospective view of computer simulations and an exemplary presentation of the place and role of this modern method of cognition. Key words: Representation of knowledge, modeling, simulation, education

  2. Integrating interactive computational modeling in biology curricula.

    Directory of Open Access Journals (Sweden)

    Tomáš Helikar

    2015-03-01

    Full Text Available While the use of computer tools to simulate complex processes such as computer circuits is normal practice in fields like engineering, the majority of life sciences/biological sciences courses continue to rely on the traditional textbook and memorization approach. To address this issue, we explored the use of the Cell Collective platform as a novel, interactive, and evolving pedagogical tool to foster student engagement, creativity, and higher-level thinking. Cell Collective is a Web-based platform used to create and simulate dynamical models of various biological processes. Students can create models of cells, diseases, or pathways themselves or explore existing models. This technology was implemented in both undergraduate and graduate courses as a pilot study to determine the feasibility of such software at the university level. First, a new (In Silico Biology) class was developed to enable students to learn biology by "building and breaking it" via computer models and their simulations. This class and technology also provide a non-intimidating way to incorporate mathematical and computational concepts into a class with students who have a limited mathematical background. Second, we used the technology to mediate the use of simulations and modeling modules as a learning tool for traditional biological concepts, such as T cell differentiation or cell cycle regulation, in existing biology courses. Results of this pilot application suggest that there is promise in the use of computational modeling and software tools such as Cell Collective to provide new teaching methods in biology and contribute to the implementation of the "Vision and Change" call to action in undergraduate biology education by providing a hands-on approach to biology.

  3. Integrating interactive computational modeling in biology curricula.

    Science.gov (United States)

    Helikar, Tomáš; Cutucache, Christine E; Dahlquist, Lauren M; Herek, Tyler A; Larson, Joshua J; Rogers, Jim A

    2015-03-01

    While the use of computer tools to simulate complex processes such as computer circuits is normal practice in fields like engineering, the majority of life sciences/biological sciences courses continue to rely on the traditional textbook and memorization approach. To address this issue, we explored the use of the Cell Collective platform as a novel, interactive, and evolving pedagogical tool to foster student engagement, creativity, and higher-level thinking. Cell Collective is a Web-based platform used to create and simulate dynamical models of various biological processes. Students can create models of cells, diseases, or pathways themselves or explore existing models. This technology was implemented in both undergraduate and graduate courses as a pilot study to determine the feasibility of such software at the university level. First, a new (In Silico Biology) class was developed to enable students to learn biology by "building and breaking it" via computer models and their simulations. This class and technology also provide a non-intimidating way to incorporate mathematical and computational concepts into a class with students who have a limited mathematical background. Second, we used the technology to mediate the use of simulations and modeling modules as a learning tool for traditional biological concepts, such as T cell differentiation or cell cycle regulation, in existing biology courses. Results of this pilot application suggest that there is promise in the use of computational modeling and software tools such as Cell Collective to provide new teaching methods in biology and contribute to the implementation of the "Vision and Change" call to action in undergraduate biology education by providing a hands-on approach to biology.

  4. Entry, Descent and Landing Systems Analysis: Exploration Class Simulation Overview and Results

    Science.gov (United States)

    DwyerCianciolo, Alicia M.; Davis, Jody L.; Shidner, Jeremy D.; Powell, Richard W.

    2010-01-01

    NASA senior management commissioned the Entry, Descent and Landing Systems Analysis (EDL-SA) Study in 2008 to identify and roadmap the Entry, Descent and Landing (EDL) technology investments that the agency needed to make in order to successfully land large payloads at Mars for both robotic and exploration or human-scale missions. The year one exploration class mission activity considered technologies capable of delivering a 40-mt payload. This paper provides an overview of the exploration class mission study, including technologies considered, models developed and initial simulation results from the EDL-SA year one effort.

  5. Computer simulations of shear thickening of concentrated dispersions

    NARCIS (Netherlands)

    Boersma, W.H.; Laven, J.; Stein, H.N.

    1995-01-01

    Stokesian dynamics computer simulations were performed on monolayers of equally sized spheres. The influence of repulsive and attractive forces on the rheological behavior and on the microstructure were studied. Under specific conditions shear thickening could be observed in the simulations, usually

  6. Computational fluid dynamics simulations and validations of results

    CSIR Research Space (South Africa)

    Sitek, MA

    2013-09-01

    Full Text Available Wind flow influence on a high-rise building is analyzed. The research covers full-scale tests, wind-tunnel experiments and numerical simulations. In the present paper the computational model used in the simulations is described and the results, which were...

  7. Augmented Reality Simulations on Handheld Computers

    Science.gov (United States)

    Squire, Kurt; Klopfer, Eric

    2007-01-01

    Advancements in handheld computing, particularly its portability, social interactivity, context sensitivity, connectivity, and individuality, open new opportunities for immersive learning environments. This article articulates the pedagogical potential of augmented reality simulations in environmental engineering education by immersing students in…

  8. Computer Simulation of the Circulation Subsystem of a Library

    Science.gov (United States)

    Shaw, W. M., Jr.

    1975-01-01

    When circulation data are used as input parameters for a computer simulation of a library's circulation subsystem, the results of the simulation provide information on book availability and delays. The model may be used to simulate alternative loan policies. (Author/LS)
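
    A minimal sketch of the kind of loan-policy simulation described above, assuming a single title, a fixed number of copies, and an invented daily request probability; it is not the model from the paper.

    ```python
    # Toy sketch (not the paper's model): estimate availability of one title under a
    # given loan period, with hypothetical demand, copy count, and time horizon.
    import random

    def simulate(loan_days=21, copies=3, daily_request_prob=0.2, days=365, seed=1):
        random.seed(seed)
        due = []                                        # return day of each checked-out copy
        satisfied = delayed = 0
        for day in range(days):
            due = [d for d in due if d > day]           # process returns
            if random.random() < daily_request_prob:    # a patron requests the book
                if len(due) < copies:
                    due.append(day + loan_days)
                    satisfied += 1
                else:
                    delayed += 1
        total = satisfied + delayed
        return satisfied / total if total else 1.0

    for loan in (14, 21, 28):
        print(f"loan period {loan:2d} days -> availability {simulate(loan_days=loan):.2%}")
    ```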

  9. Using EDUCache Simulator for the Computer Architecture and Organization Course

    Directory of Open Access Journals (Sweden)

    Sasko Ristov

    2013-07-01

    Full Text Available The computer architecture and organization course is essential in all computer science and engineering programs, and the most selected and liked elective course for related engineering disciplines. However, this attractiveness brings a new challenge: it requires a lot of effort by the instructor to explain rather complicated concepts to beginners or to those who study related disciplines. The usage of visual simulators can improve both the teaching and learning processes. The overall goal is twofold: (1) to enable a visual environment to explain the basic concepts and (2) to increase the student's willingness and ability to learn the material. A lot of visual simulators have been used for the computer architecture and organization course. However, due to the lack of visual simulators for simulation of cache memory concepts, we have developed a new visual simulator, the EDUCache simulator. In this paper we present that it can be effectively and efficiently used as a supporting tool in the learning process of modern multi-layer, multi-cache and multi-core multi-processors. EDUCache's features enable an environment for performance evaluation and engineering of software systems, i.e. the students will also understand the importance of computer architecture building parts and, hopefully, will increase their curiosity for hardware courses in general.
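
    The sketch below illustrates the cache-mapping concept that a visual tool such as EDUCache animates: a direct-mapped cache with a hypothetical line count, block size, and address trace. It is not part of the EDUCache software.

    ```python
    # Minimal direct-mapped cache hit/miss simulation; sizes and the address trace
    # are invented for illustration.
    def simulate_direct_mapped(addresses, num_lines=4, block_size=16):
        tags = [None] * num_lines
        hits = misses = 0
        for addr in addresses:
            block = addr // block_size       # which memory block the address falls in
            index = block % num_lines        # cache line the block maps to
            tag = block // num_lines         # remaining bits identify the block
            if tags[index] == tag:
                hits += 1
            else:
                misses += 1
                tags[index] = tag            # evict whatever was in that line
        return hits, misses

    trace = [0, 4, 16, 64, 0, 16, 128, 0]    # hypothetical byte addresses
    h, m = simulate_direct_mapped(trace)
    print(f"hits={h} misses={m} hit rate={h / (h + m):.2f}")
    ```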

  10. NeuroManager: A workflow analysis based simulation management engine for computational neuroscience

    Directory of Open Access Journals (Sweden)

    David Bruce Stockton

    2015-10-01

    Full Text Available We developed NeuroManager, an object-oriented simulation management software engine for computational neuroscience. NeuroManager automates the workflow of simulation job submissions when using heterogeneous computational resources, simulators, and simulation tasks. The object-oriented approach (1) provides flexibility to adapt to a variety of neuroscience simulators, (2) simplifies the use of heterogeneous computational resources, from desktops to supercomputer clusters, and (3) improves tracking of simulator/simulation evolution. We implemented NeuroManager in Matlab, a widely used engineering and scientific language, for its signal and image processing tools, prevalence in electrophysiology analysis, and increasing use in college Biology education. To design and develop NeuroManager we analyzed the workflow of simulation submission for a variety of simulators, operating systems, and computational resources, including the handling of input parameters, data, models, results, and analyses. This resulted in twenty-two stages of simulation submission workflow. The software incorporates progress notification, automatic organization, labeling, and time-stamping of data and results, and integrated access to Matlab's analysis and visualization tools. NeuroManager provides users with the tools to automate daily tasks, and assists principal investigators in tracking and recreating the evolution of research projects performed by multiple people. Overall, NeuroManager provides the infrastructure needed to improve workflow, manage multiple simultaneous simulations, and maintain provenance of the potentially large amounts of data produced during the course of a research project.

  11. Effects of Psychology Courseware Use on Computer Anxiety in Students.

    Science.gov (United States)

    Lambert, Matthew E.; Lenthall, Gerard

    1989-01-01

    Describes study that examined the relationship between computer anxiety and the use of psychology courseware in an undergraduate abnormal psychology class using four computerized case simulations. Comparisons of pretest and posttest computer anxiety measures are described, and the relationship between computer anxiety/attitudes and computer use is…

  12. Prospective randomized study of contrast reaction management curricula: Computer-based interactive simulation versus high-fidelity hands-on simulation

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Carolyn L., E-mail: wangcl@uw.edu [Department of Radiology, University of Washington, Box 357115, 1959 NE Pacific Street, Seattle, WA 98195-7115 (United States); Schopp, Jennifer G.; Kani, Kimia [Department of Radiology, University of Washington, Box 357115, 1959 NE Pacific Street, Seattle, WA 98195-7115 (United States); Petscavage-Thomas, Jonelle M. [Penn State Hershey Medical Center, Department of Radiology, 500 University Drive, Hershey, PA 17033 (United States); Zaidi, Sadaf; Hippe, Dan S.; Paladin, Angelisa M.; Bush, William H. [Department of Radiology, University of Washington, Box 357115, 1959 NE Pacific Street, Seattle, WA 98195-7115 (United States)

    2013-12-01

    Purpose: We developed a computer-based interactive simulation program for teaching contrast reaction management to radiology trainees and compared its effectiveness to high-fidelity hands-on simulation training. Materials and methods: IRB approved HIPAA compliant prospective study of 44 radiology residents, fellows and faculty who were randomized into either the high-fidelity hands-on simulation group or computer-based simulation group. All participants took separate written tests prior to and immediately after their intervention. Four months later participants took a delayed written test and a hands-on high-fidelity severe contrast reaction scenario performance test graded on predefined critical actions. Results: There was no statistically significant difference between the computer and hands-on groups’ written pretest, immediate post-test, or delayed post-test scores (p > 0.6 for all). Both groups’ scores improved immediately following the intervention (p < 0.001). The delayed test scores 4 months later were still significantly higher than the pre-test scores (p ≤ 0.02). The computer group's performance was similar to the hands-on group on the severe contrast reaction simulation scenario test (p = 0.7). There were also no significant differences between the computer and hands-on groups in performance on the individual core competencies of contrast reaction management during the contrast reaction scenario. Conclusion: It is feasible to develop a computer-based interactive simulation program to teach contrast reaction management. Trainees that underwent computer-based simulation training scored similarly on written tests and on a hands-on high-fidelity severe contrast reaction scenario performance test as those trained with hands-on high-fidelity simulation.

  13. Prospective randomized study of contrast reaction management curricula: Computer-based interactive simulation versus high-fidelity hands-on simulation

    International Nuclear Information System (INIS)

    Wang, Carolyn L.; Schopp, Jennifer G.; Kani, Kimia; Petscavage-Thomas, Jonelle M.; Zaidi, Sadaf; Hippe, Dan S.; Paladin, Angelisa M.; Bush, William H.

    2013-01-01

    Purpose: We developed a computer-based interactive simulation program for teaching contrast reaction management to radiology trainees and compared its effectiveness to high-fidelity hands-on simulation training. Materials and methods: IRB approved HIPAA compliant prospective study of 44 radiology residents, fellows and faculty who were randomized into either the high-fidelity hands-on simulation group or computer-based simulation group. All participants took separate written tests prior to and immediately after their intervention. Four months later participants took a delayed written test and a hands-on high-fidelity severe contrast reaction scenario performance test graded on predefined critical actions. Results: There was no statistically significant difference between the computer and hands-on groups’ written pretest, immediate post-test, or delayed post-test scores (p > 0.6 for all). Both groups’ scores improved immediately following the intervention (p < 0.001). The delayed test scores 4 months later were still significantly higher than the pre-test scores (p ≤ 0.02). The computer group's performance was similar to the hands-on group on the severe contrast reaction simulation scenario test (p = 0.7). There were also no significant differences between the computer and hands-on groups in performance on the individual core competencies of contrast reaction management during the contrast reaction scenario. Conclusion: It is feasible to develop a computer-based interactive simulation program to teach contrast reaction management. Trainees that underwent computer-based simulation training scored similarly on written tests and on a hands-on high-fidelity severe contrast reaction scenario performance test as those trained with hands-on high-fidelity simulation

  14. Computational Particle Dynamic Simulations on Multicore Processors (CPDMu) Final Report Phase I

    Energy Technology Data Exchange (ETDEWEB)

    Schmalz, Mark S

    2011-07-24

    Statement of Problem - Department of Energy has many legacy codes for simulation of computational particle dynamics and computational fluid dynamics applications that are designed to run on sequential processors and are not easily parallelized. Emerging high-performance computing architectures employ massively parallel multicore architectures (e.g., graphics processing units) to increase throughput. Parallelization of legacy simulation codes is a high priority, to achieve compatibility, efficiency, accuracy, and extensibility. General Statement of Solution - A legacy simulation application designed for implementation on mainly-sequential processors has been represented as a graph G. Mathematical transformations, applied to G, produce a graph representation G' for a high-performance architecture. Key computational and data movement kernels of the application were analyzed/optimized for parallel execution using the mapping G → G', which can be performed semi-automatically. This approach is widely applicable to many types of high-performance computing systems, such as graphics processing units or clusters comprised of nodes that contain one or more such units. Phase I Accomplishments - Phase I research decomposed/profiled computational particle dynamics simulation code for rocket fuel combustion into low and high computational cost regions (respectively, mainly sequential and mainly parallel kernels), with analysis of space and time complexity. Using the research team's expertise in algorithm-to-architecture mappings, the high-cost kernels were transformed, parallelized, and implemented on Nvidia Fermi GPUs. Measured speedups (GPU with respect to single-core CPU) were approximately 20-32X for realistic model parameters, without final optimization. Error analysis showed no loss of computational accuracy. Commercial Applications and Other Benefits - The proposed research will constitute a breakthrough in solution of problems related to efficient

  15. The adaptation method in the Monte Carlo simulation for computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Hyoung Gun; Yoon, Chang Yeon; Lee, Won Ho [Dept. of Bio-convergence Engineering, Korea University, Seoul (Korea, Republic of); Cho, Seung Ryong [Dept. of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Park, Sung Ho [Dept. of Neurosurgery, Ulsan University Hospital, Ulsan (Korea, Republic of)

    2015-06-15

    The patient dose incurred from diagnostic procedures during advanced radiotherapy has become an important issue. Many researchers in medical physics are using computational simulations to calculate complex parameters in experiments. However, extended computation times make it difficult for personal computers to run the conventional Monte Carlo method to simulate radiological images with high-flux photons such as images produced by computed tomography (CT). To minimize the computation time without degrading imaging quality, we applied a deterministic adaptation to the Monte Carlo calculation and verified its effectiveness by simulating CT image reconstruction for an image evaluation phantom (Catphan; Phantom Laboratory, New York NY, USA) and a human-like voxel phantom (KTMAN-2) (Los Alamos National Laboratory, Los Alamos, NM, USA). For the deterministic adaptation, the relationship between iteration numbers and the simulations was estimated and the option to simulate scattered radiation was evaluated. The processing times of simulations using the adaptive method were at least 500 times faster than those using a conventional statistical process. In addition, compared with the conventional statistical method, the adaptive method provided images that were more similar to the experimental images, which proved that the adaptive method was highly effective for a simulation that requires a large number of iterations. Assuming no radiation scattering in the vicinity of the detectors minimized artifacts in the reconstructed image.

  16. Computational performance of a smoothed particle hydrodynamics simulation for shared-memory parallel computing

    Science.gov (United States)

    Nishiura, Daisuke; Furuichi, Mikito; Sakaguchi, Hide

    2015-09-01

    The computational performance of a smoothed particle hydrodynamics (SPH) simulation is investigated for three types of current shared-memory parallel computer devices: many integrated core (MIC) processors, graphics processing units (GPUs), and multi-core CPUs. We are especially interested in efficient shared-memory allocation methods for each chipset, because the efficient data access patterns differ between compute unified device architecture (CUDA) programming for GPUs and OpenMP programming for MIC processors and multi-core CPUs. We first introduce several parallel implementation techniques for the SPH code, and then examine these on our target computer architectures to determine the most effective algorithms for each processor unit. In addition, we evaluate the effective computing performance and power efficiency of the SPH simulation on each architecture, as these are critical metrics for overall performance in a multi-device environment. In our benchmark test, the GPU is found to produce the best arithmetic performance as a standalone device unit, and gives the most efficient power consumption. The multi-core CPU obtains the most effective computing performance. The computational speed of the MIC processor on Xeon Phi approached that of two Xeon CPUs. This indicates that using MICs is an attractive choice for existing SPH codes on multi-core CPUs parallelized by OpenMP, as it gains computational acceleration without the need for significant changes to the source code.

  17. Using computer simulations to probe the structure and dynamics of biopolymers

    International Nuclear Information System (INIS)

    Levy, R.M.; Hirata, F.; Kim, K.; Zhang, P.

    1987-01-01

    The use of computer simulations to study internal motions and thermodynamic properties is receiving increased attention. One important use of the method is to provide a more fundamental understanding of the molecular information contained in various kinds of experiments on these complex systems. In the first part of this paper the authors review recent work in their laboratory concerned with the use of computer simulations for the interpretation of experimental probes of molecular structure and dynamics of proteins and nucleic acids. The interplay between computer simulations and three experimental techniques is emphasized: (1) nuclear magnetic resonance relaxation spectroscopy, (2) refinement of macro-molecular x-ray structures, and (3) vibrational spectroscopy. The treatment of solvent effects in biopolymer simulations is a difficult problem. It is not possible to study systematically the effect of solvent conditions, e.g. added salt concentration, on biopolymer properties by means of simulations alone. In the last part of the paper the authors review a more analytical approach they developed to study polyelectrolyte properties of solvated biopolymers. The results are compared with computer simulations

  18. Digital control computer upgrade at the Cernavoda NPP simulator

    International Nuclear Information System (INIS)

    Ionescu, T.

    2006-01-01

    The Plant Process Computer equips some Nuclear Power Plants, like the CANDU-600, with Centralized Control performed by an assembly of two computers known as Digital Control Computers (DCC), working in parallel to safely drive the plant at steady state and during normal maneuvers, but also during abnormal transients when the plant is automatically steered to a safe state. The Centralized Control comprises both hardware and software that must be present in the Full Scope Simulator and that are subject to configuration changes, with specific requirements, during the plant and simulator life; these aspects are covered by this subsection

  19. Computer based training simulator for Hunterston Nuclear Power Station

    International Nuclear Information System (INIS)

    Bowden, R.S.M.; Hacking, D.

    1978-01-01

    For reasons which are stated, the Hunterston-B nuclear power station automatic control system includes a manual over-ride facility. It is therefore essential for the station engineers to be trained to recognise and control all feasible modes of plant and logic malfunction. A training simulator has been built which consists of a replica of the shutdown monitoring panel in the Central Control Room and is controlled by a mini-computer. This paper highlights the computer aspects of the simulator and relevant derived experience, under the following headings: engineering background; shutdown sequence equipment; simulator equipment; features; software; testing; maintenance. (U.K.)

  20. Exploiting NASA's Cumulus Earth Science Cloud Archive with Services and Computation

    Science.gov (United States)

    Pilone, D.; Quinn, P.; Jazayeri, A.; Schuler, I.; Plofchan, P.; Baynes, K.; Ramachandran, R.

    2017-12-01

    NASA's Earth Observing System Data and Information System (EOSDIS) houses nearly 30PBs of critical Earth Science data and with upcoming missions is expected to balloon to between 200PBs-300PBs over the next seven years. In addition to the massive increase in data collected, researchers and application developers want more and faster access - enabling complex visualizations, long time-series analysis, and cross dataset research without needing to copy and manage massive amounts of data locally. NASA has started prototyping with commercial cloud providers to make this data available in elastic cloud compute environments, allowing application developers direct access to the massive EOSDIS holdings. In this talk we'll explain the principles behind the archive architecture and share our experience of dealing with large amounts of data with serverless architectures including AWS Lambda, the Elastic Container Service (ECS) for long running jobs, and why we dropped thousands of lines of code for AWS Step Functions. We'll discuss best practices and patterns for accessing and using data available in a shared object store (S3) and leveraging events and message passing for sophisticated and highly scalable processing and analysis workflows. Finally we'll share capabilities NASA and cloud services are making available on the archives to enable massively scalable analysis and computation in a variety of formats and tools.

  1. Computer simulation games in population and education.

    Science.gov (United States)

    Moreland, R S

    1988-01-01

    Computer-based simulation games are effective training tools that have several advantages. They enable players to learn in a nonthreatening manner and develop strategies to achieve goals in a dynamic environment. They also provide visual feedback on the effects of players' decisions, encourage players to explore and experiment with options before making final decisions, and develop players' skills in analysis, decision making, and cooperation. 2 games have been developed by the Research Triangle Institute for public-sector planning agencies interested in or dealing with developing countries. The UN Population and Development Game teaches players about the interaction between population variables and the national economy and how population policies complement other national policies, such as education. The BRIDGES Education Planning Game focuses on the effects education has on national policies. In both games, the computer simulates the reactions of a fictional country's socioeconomic system to players' decisions. Players can change decisions after seeing their effects on a computer screen and thus can improve their performance in achieving goals.
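
    A toy sketch of the population-economy feedback loop that such games let players explore, in which education spending influences fertility and productivity. Every coefficient here is invented for illustration and none comes from the RTI games.

    ```python
    # Invented toy model: education spending lowers fertility and raises productivity.
    # All coefficients are placeholders chosen only to show the feedback structure.
    def step(pop, gdp, edu_share):
        literacy_effect = min(edu_share * 4.0, 1.0)         # crude proxy for schooling
        birth_rate = 0.035 - 0.015 * literacy_effect        # education lowers fertility
        death_rate = 0.010
        growth = birth_rate - death_rate
        gdp_per_capita = gdp / pop
        gdp_next = gdp * (1.02 + 0.01 * literacy_effect)    # education raises productivity
        return pop * (1 + growth), gdp_next, gdp_per_capita

    pop, gdp = 10e6, 50e9
    for year in range(1, 31):
        pop, gdp, gpc = step(pop, gdp, edu_share=0.05)
        if year % 5 == 0:
            print(f"year {year:2d}: population {pop/1e6:6.2f} M, GDP per capita {gpc:8.0f}")
    ```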

  2. Soft-error tolerance and energy consumption evaluation of embedded computer with magnetic random access memory in practical systems using computer simulations

    Science.gov (United States)

    Nebashi, Ryusuke; Sakimura, Noboru; Sugibayashi, Tadahiko

    2017-08-01

    We evaluated the soft-error tolerance and energy consumption of an embedded computer with magnetic random access memory (MRAM) using two computer simulators. One is a central processing unit (CPU) simulator of a typical embedded computer system. We simulated the radiation-induced single-event-upset (SEU) probability in a spin-transfer-torque MRAM cell and also the failure rate of a typical embedded computer due to its main memory SEU error. The other is a delay tolerant network (DTN) system simulator. It simulates the power dissipation of wireless sensor network nodes of the system using a revised CPU simulator and a network simulator. We demonstrated that the SEU effect on the embedded computer with 1 Gbit MRAM-based working memory is less than 1 failure in time (FIT). We also demonstrated that the energy consumption of the DTN sensor node with MRAM-based working memory can be reduced to 1/11. These results indicate that MRAM-based working memory enhances the disaster tolerance of embedded computers.
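
    For orientation, the arithmetic behind a FIT estimate is simply bit count times per-bit upset rate, since 1 FIT is one failure per 10^9 device-hours. The per-bit rate below is a hypothetical placeholder, not the value reported in the study.

    ```python
    # Back-of-envelope FIT arithmetic with a hypothetical per-bit SEU rate.
    bits = 1 * 2**30                  # 1 Gbit of working memory
    per_bit_fit = 1e-10               # hypothetical SEU rate per bit, in FIT
    system_fit = bits * per_bit_fit   # system soft-error rate scales with bit count
    mttf_hours = 1e9 / system_fit     # 1 FIT = 1 failure per 1e9 device-hours
    print(f"system soft-error rate ~ {system_fit:.3f} FIT")
    print(f"~ one failure every {mttf_hours / 8760:.0f} years")
    ```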

  3. A note on simulated annealing to computer laboratory scheduling ...

    African Journals Online (AJOL)

    The concepts, principles and implementation of Simulated Annealing as a modern heuristic technique are presented. The Simulated Annealing algorithm is used to solve the real-life problem of computer laboratory scheduling in order to maximize the use of scarce and insufficient resources. KEY WORDS: Simulated Annealing ...

  4. Time reversibility, computer simulation, algorithms, chaos

    CERN Document Server

    Hoover, William Graham

    2012-01-01

    A small army of physicists, chemists, mathematicians, and engineers has joined forces to attack a classic problem, the "reversibility paradox", with modern tools. This book describes their work from the perspective of computer simulation, emphasizing the author's approach to the problem of understanding the compatibility, and even inevitability, of the irreversible second law of thermodynamics with an underlying time-reversible mechanics. Computer simulation has made it possible to probe reversibility from a variety of directions and "chaos theory" or "nonlinear dynamics" has supplied a useful vocabulary and a set of concepts, which allow a fuller explanation of irreversibility than that available to Boltzmann or to Green, Kubo and Onsager. Clear illustration of concepts is emphasized throughout, and reinforced with a glossary of technical terms from the specialized fields which have been combined here to focus on a common theme. The book begins with a discussion, contrasting the idealized reversibility of ba...

  5. Simulation of Robot Kinematics Using Interactive Computer Graphics.

    Science.gov (United States)

    Leu, M. C.; Mahajan, R.

    1984-01-01

    Development of a robot simulation program based on geometric transformation software available in most computer graphics systems is described, along with the program's features. The program can be extended to simulate robots coordinating with external devices (such as tools, fixtures, conveyors) using geometric transformations to describe the…

  6. Computer simulations of long-time tails: what's new?

    NARCIS (Netherlands)

    Hoef, van der M.A.; Frenkel, D.

    1995-01-01

    Twenty five years ago Alder and Wainwright discovered, by simulation, the 'long-time tails' in the velocity autocorrelation function of a single particle in fluid [1]. Since then, few qualitatively new results on long-time tails have been obtained by computer simulations. However, within the

  7. Setting to earth for computer

    International Nuclear Information System (INIS)

    Gallego V, Luis Eduardo; Montana Ch, Johny Hernan; Tovar P, Andres Fernando; Amortegui, Francisco

    2000-01-01

    The GMT program allows the analysis of grounding systems under DC and low-frequency AC voltages for diverse configurations composed of interconnected cylindrical electrodes in homogeneous or two-layer stratified soil. The analysis includes, among other aspects: calculation of the grounding resistance, the ground potential rise (GPR) of the system, calculation of current densities in the conductors, calculation of potentials at any point on the soil surface (profiles and surfaces), and calculation of step and touch voltages. It also carries out the interpretation of resistivity measurements obtained with the Wenner and Schlumberger methods, finding a two-layer soil model
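
    The Wenner interpretation mentioned above starts from the standard apparent-resistivity relation rho_a = 2*pi*a*V/I, where a is the electrode spacing. The sketch below applies it to invented spacings and meter readings; it is not part of the GMT program.

    ```python
    # Apparent resistivity from a Wenner four-electrode sounding: rho_a = 2*pi*a*V/I.
    # Spacings and readings below are made-up values for illustration only.
    import math

    def wenner_apparent_resistivity(spacing_m, voltage_v, current_a):
        return 2.0 * math.pi * spacing_m * voltage_v / current_a

    for a, v, i in [(1.0, 0.80, 0.5), (2.0, 0.35, 0.5), (4.0, 0.15, 0.5)]:
        rho = wenner_apparent_resistivity(a, v, i)
        print(f"a = {a:3.1f} m  ->  rho_a = {rho:6.1f} ohm*m")
    ```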

  8. Faster quantum chemistry simulation on fault-tolerant quantum computers

    International Nuclear Information System (INIS)

    Cody Jones, N; McMahon, Peter L; Yamamoto, Yoshihisa; Whitfield, James D; Yung, Man-Hong; Aspuru-Guzik, Alán; Van Meter, Rodney

    2012-01-01

    Quantum computers can in principle simulate quantum physics exponentially faster than their classical counterparts, but some technical hurdles remain. We propose methods which substantially improve the performance of a particular form of simulation, ab initio quantum chemistry, on fault-tolerant quantum computers; these methods generalize readily to other quantum simulation problems. Quantum teleportation plays a key role in these improvements and is used extensively as a computing resource. To improve execution time, we examine techniques for constructing arbitrary gates which perform substantially faster than circuits based on the conventional Solovay–Kitaev algorithm (Dawson and Nielsen 2006 Quantum Inform. Comput. 6 81). For a given approximation error ϵ, arbitrary single-qubit gates can be produced fault-tolerantly and using a restricted set of gates in time which is O(log ϵ) or O(log log ϵ); with sufficient parallel preparation of ancillas, constant average depth is possible using a method we call programmable ancilla rotations. Moreover, we construct and analyze efficient implementations of first- and second-quantized simulation algorithms using the fault-tolerant arbitrary gates and other techniques, such as implementing various subroutines in constant time. A specific example we analyze is the ground-state energy calculation for lithium hydride. (paper)

  9. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    Science.gov (United States)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two part model framework characterizes both the demand using a probability distribution for each type of service request as well as enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
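
    A stripped-down discrete event simulation in the same spirit: requests arrive at random and queue for a fixed pool of virtual servers. The arrival and service rates are invented, and the model is far simpler than the two-part framework described above.

    ```python
    # Toy discrete-event sketch (not the authors' model): requests arrive at random,
    # queue for a fixed pool of identical servers, and we track the mean wait.
    import heapq, random

    def simulate(servers=4, arrival_rate=2.5, service_rate=1.0, n_requests=10000, seed=0):
        rng = random.Random(seed)
        t = 0.0
        free_at = [0.0] * servers                       # time each server becomes free
        heapq.heapify(free_at)
        total_wait = 0.0
        for _ in range(n_requests):
            t += rng.expovariate(arrival_rate)          # next arrival time
            earliest = heapq.heappop(free_at)           # soonest-available server
            start = max(t, earliest)
            total_wait += start - t
            heapq.heappush(free_at, start + rng.expovariate(service_rate))
        return total_wait / n_requests

    for s in (3, 4, 6):
        print(f"{s} servers: mean wait {simulate(servers=s):.3f} time units")
    ```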

  10. Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure.

    Science.gov (United States)

    Wang, Henry; Ma, Yunzhi; Pratx, Guillem; Xing, Lei

    2011-09-07

    Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach for high performance computing and is implemented to perform ultra-fast MC calculation in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of cloud-based MC simulation is identical to that produced by single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47× speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed.
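
    The farming-out pattern described above can be sketched with ordinary worker processes. The example below is not EGS5; it uses a trivial Monte Carlo estimate of pi purely to show how histories are split across workers and the partial results recombined.

    ```python
    # Sketch of distributing a Monte Carlo workload across worker processes, in the
    # same spirit as farming histories out to cloud nodes (the "physics" here is a
    # trivial pi estimate, used only to show the split/aggregate pattern).
    import random, time
    from multiprocessing import Pool

    def run_histories(args):
        n, seed = args
        rng = random.Random(seed)
        return sum(1 for _ in range(n) if rng.random()**2 + rng.random()**2 < 1.0)

    if __name__ == "__main__":
        total, workers = 4_000_000, 4
        chunks = [(total // workers, seed) for seed in range(workers)]
        start = time.perf_counter()
        with Pool(workers) as pool:
            hits = sum(pool.map(run_histories, chunks))   # aggregate partial tallies
        elapsed = time.perf_counter() - start
        print(f"pi ~ {4 * hits / total:.5f} with {workers} workers in {elapsed:.2f} s")
    ```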

  11. Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure

    International Nuclear Information System (INIS)

    Wang, Henry; Ma Yunzhi; Pratx, Guillem; Xing Lei

    2011-01-01

    Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach for high performance computing and is implemented to perform ultra-fast MC calculation in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of cloud-based MC simulation is identical to that produced by single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47x speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed. (note)

  12. Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Henry [Department of Electrical Engineering, Stanford University, Stanford, CA 94305 (United States); Ma Yunzhi; Pratx, Guillem; Xing Lei, E-mail: hwang41@stanford.edu [Department of Radiation Oncology, Stanford University School of Medicine, Stanford, CA 94305-5847 (United States)

    2011-09-07

    Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach for high performance computing and is implemented to perform ultra-fast MC calculation in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of cloud-based MC simulation is identical to that produced by single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47x speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed. (note)

  13. Computer simulation of high energy displacement cascades

    International Nuclear Information System (INIS)

    Heinisch, H.L.

    1990-01-01

    A methodology developed for modeling many aspects of high energy displacement cascades with molecular level computer simulations is reviewed. The initial damage state is modeled in the binary collision approximation (using the MARLOWE computer code), and the subsequent disposition of the defects within a cascade is modeled with a Monte Carlo annealing simulation (the ALSOME code). There are few adjustable parameters, and none are set to physically unreasonable values. The basic configurations of the simulated high energy cascades in copper, i.e., the number, size and shape of damage regions, compare well with observations, as do the measured numbers of residual defects and the fractions of freely migrating defects. The success of these simulations is somewhat remarkable, given the relatively simple models of defects and their interactions that are employed. The reason for this success is that the behavior of the defects is very strongly influenced by their initial spatial distributions, which the binary collision approximation adequately models. The MARLOWE/ALSOME system, with input from molecular dynamics and experiments, provides a framework for investigating the influence of high energy cascades on microstructure evolution. (author)

  14. Atmospheric dynamics and habitability range in Earth-like aquaplanets obliquity simulations

    Science.gov (United States)

    Nowajewski, Priscilla; Rojas, M.; Rojo, P.; Kimeswenger, S.

    2018-05-01

    We present the evolution of the atmospheric variables that affect planetary climate as obliquity is increased, using a general circulation model (PlaSim) coupled to a slab ocean with mixed-layer flux correction. We increase the obliquity between 30° and 90° in 16 aquaplanets with liquid sea surface and perform the simulation allowing the sea ice cover formation to be a consequence of its atmospheric dynamics. Insolation is maintained constant in each experiment, but changing the obliquity affects the radiation budget and the large-scale circulation. Earth-like atmospheric dynamics is observed for planets with obliquity under 54°. Above this value, the latitudinal temperature gradient is reversed, giving rise to a new regime of jet streams, affecting the shape of Hadley and Ferrel cells and changing the position of the InterTropical Convergence Zone. As humidity and high temperatures determine Earth's habitability, we introduce the wet bulb temperature as an atmospheric index of habitability for Earth-like aquaplanets with above-freezing temperatures. The aquaplanets are habitable all year round at all latitudes for values under 54°; above this value habitability decreases toward the poles due to high temperatures.
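
    How a fixed solar constant gets redistributed by obliquity can be seen from the standard daily-mean top-of-atmosphere insolation formula, Q = (S0/pi)(H0 sin phi sin delta + cos phi cos delta sin H0). The sketch below averages it over a circular orbit; it is unrelated to the PlaSim runs above.

    ```python
    # Annual-mean TOA insolation vs. obliquity, assuming a circular orbit and the
    # standard daily-mean formula; illustrative only, not the PlaSim configuration.
    import math

    S0 = 1361.0  # solar constant, W m^-2

    def daily_mean_insolation(lat_deg, decl_rad):
        phi = math.radians(lat_deg)
        x = -math.tan(phi) * math.tan(decl_rad)
        if x >= 1.0:                     # polar night
            return 0.0
        h0 = math.pi if x <= -1.0 else math.acos(x)   # half-day length (radians)
        return (S0 / math.pi) * (h0 * math.sin(phi) * math.sin(decl_rad)
                                 + math.cos(phi) * math.cos(decl_rad) * math.sin(h0))

    def annual_mean(lat_deg, obliquity_deg, steps=360):
        eps = math.radians(obliquity_deg)
        total = 0.0
        for k in range(steps):
            lam = 2 * math.pi * k / steps                     # solar longitude
            decl = math.asin(math.sin(eps) * math.sin(lam))   # declination
            total += daily_mean_insolation(lat_deg, decl)
        return total / steps

    for obl in (30, 54, 90):
        print(f"obliquity {obl:2d} deg: equator {annual_mean(0, obl):6.1f}"
              f"  pole {annual_mean(90, obl):6.1f} W/m^2")
    ```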

  15. Earth Science Informatics - Overview

    Science.gov (United States)

    Ramapriyan, H. K.

    2017-01-01

    Over the last 10-15 years, significant advances have been made in information management, there are an increasing number of individuals entering the field of information management as it applies to Geoscience and Remote Sensing data, and the field of informatics has come into its own. Informatics is the science and technology of applying computers and computational methods to the systematic analysis, management, interchange, and representation of science data, information, and knowledge. Informatics also includes the use of computers and computational methods to support decision making and applications. Earth Science Informatics (ESI, a.k.a. geoinformatics) is the application of informatics in the Earth science domain. ESI is a rapidly developing discipline integrating computer science, information science, and Earth science. Major national and international research and infrastructure projects in ESI have been carried out or are on-going. Notable among these are: the Global Earth Observation System of Systems (GEOSS), the European Commission's INSPIRE, the U.S. NSDI and Geospatial One-Stop, the NASA EOSDIS, and the NSF DataONE, EarthCube and Cyberinfrastructure for Geoinformatics. More than 18 departments and agencies in the U.S. federal government have been active in Earth science informatics. All major space agencies in the world have been involved in ESI research and application activities. In the United States, the Federation of Earth Science Information Partners (ESIP), whose membership includes over 180 organizations (government, academic and commercial) dedicated to managing, delivering and applying Earth science data, has been working on many ESI topics since 1998. The Committee on Earth Observation Satellites (CEOS) Working Group on Information Systems and Services (WGISS) has been actively coordinating the ESI activities among the space agencies. The talk will present an overview of current efforts in ESI, the role members of IEEE GRSS play, and discuss

  16. The advanced computational testing and simulation toolkit (ACTS)

    International Nuclear Information System (INIS)

    Drummond, L.A.; Marques, O.

    2002-01-01

    During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Distinctively, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips; but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, reduction of duplicate efforts

  17. The advanced computational testing and simulation toolkit (ACTS)

    Energy Technology Data Exchange (ETDEWEB)

    Drummond, L.A.; Marques, O.

    2002-05-21

    During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Distinctively, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips; but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, reduction of duplicate efforts

  18. The Use of Computer Simulation Gaming in Teaching Broadcast Economics.

    Science.gov (United States)

    Mancuso, Louis C.

    The purpose of this study was to develop a broadcast economics computer simulation and to ascertain how a combined lecture and computer simulation game compared as a teaching method with more traditional lecture and case study instructional methods. In each of three sections of a broadcast economics course, a different teaching methodology was employed: (1)…

  19. SPINET: A Parallel Computing Approach to Spine Simulations

    Directory of Open Access Journals (Sweden)

    Peter G. Kropf

    1996-01-01

    Full Text Available Research in scientific programming enables us to realize more and more complex applications, and on the other hand, application-driven demands on computing methods and power are continuously growing. Therefore, interdisciplinary approaches become more widely used. The interdisciplinary SPINET project presented in this article applies modern scientific computing tools to biomechanical simulations: parallel computing and symbolic and modern functional programming. The target application is the human spine. Simulations of the spine help us to investigate and better understand the mechanisms of back pain and spinal injury. Two approaches have been used: the first uses the finite element method for high-performance simulations of static biomechanical models, and the second generates a simulation development tool for experimenting with different dynamic models. A finite element program for static analysis has been parallelized for the MUSIC machine. To solve the sparse system of linear equations, a conjugate gradient solver (iterative method) and a frontal solver (direct method) have been implemented. The preprocessor required for the frontal solver is written in the modern functional programming language SML, the solver itself in C, thus exploiting the characteristic advantages of both functional and imperative programming. The speedup analysis of both solvers shows very satisfactory results for this irregular problem. A mixed symbolic-numeric environment for rigid body system simulations is presented. It automatically generates C code from a problem specification expressed by the Lagrange formalism using Maple.
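
    For reference, the iterative solver mentioned above is the textbook conjugate gradient method for symmetric positive-definite systems. The sketch below is a generic dense-matrix version, not the SPINET implementation for the MUSIC machine.

    ```python
    # Textbook conjugate-gradient iteration for a symmetric positive-definite system
    # (generic sketch; not the parallel SPINET solver).
    import numpy as np

    def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
        x = np.zeros_like(b)
        r = b - A @ x                 # residual
        p = r.copy()                  # search direction
        rs_old = r @ r
        for _ in range(max_iter):
            Ap = A @ p
            alpha = rs_old / (p @ Ap)
            x += alpha * p
            r -= alpha * Ap
            rs_new = r @ r
            if np.sqrt(rs_new) < tol:
                break
            p = r + (rs_new / rs_old) * p
            rs_old = rs_new
        return x

    A = np.array([[4.0, 1.0], [1.0, 3.0]])   # small SPD example
    b = np.array([1.0, 2.0])
    print(conjugate_gradient(A, b))           # ~ [0.0909, 0.6364]
    ```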

  20. Computer simulation of two-phase flow in nuclear reactors

    International Nuclear Information System (INIS)

    Wulff, W.

    1993-01-01

    Two-phase flow models dominate the requirements of economic resources for the development and use of computer codes which serve to analyze thermohydraulic transients in nuclear power plants. An attempt is made to reduce the effort of analyzing reactor transients by combining purpose-oriented modelling with advanced computing techniques. Six principles are presented on mathematical modeling and the selection of numerical methods, along with suggestions on programming and machine selection, all aimed at reducing the cost of analysis. Computer simulation is contrasted with traditional computer calculation. The advantages of run-time interactive access operation in a simulation environment are demonstrated. It is explained that the drift-flux model is better suited than the two-fluid model for the analysis of two-phase flow in nuclear reactors, because of the latter's closure problems. The advantage of analytical over numerical integration is demonstrated. Modeling and programming techniques are presented which minimize the number of needed arithmetical and logical operations and thereby increase the simulation speed, while decreasing the cost. (orig.)

  1. Computer simulation of molecular sorption in zeolites

    International Nuclear Information System (INIS)

    Calmiano, Mark Daniel

    2001-01-01

    The work presented in this thesis encompasses the computer simulation of molecular sorption. In Chapter 1 we outline the aims and objectives of this work. Chapter 2 follows in which an introduction to sorption in zeolites is presented, with discussion of structure and properties of the main zeolites studied. Chapter 2 concludes with a description of the principles and theories of adsorption. In Chapter 3 we describe the methodology behind the work carried out in this thesis. In Chapter 4 we present our first computational study, that of the sorption of krypton in silicalite. We describe work carried out to investigate low energy sorption sites of krypton in silicalite where we observe krypton to preferentially sorb into straight and sinusoidal channels over channel intersections. We simulate single step type I adsorption isotherms and use molecular dynamics to study the diffusion of krypton and obtain diffusion coefficients and the activation energy. We compare our results to previous experimental and computational studies where we show our work to be in good agreement. In Chapter 5 we present a systematic study of the sorption of oxygen and nitrogen in five lithium substituted zeolites using a transferable interatomic potential that we have developed from ab initio calculations. We show increased loading of nitrogen compared to oxygen in all five zeolites studied as expected and simulate adsorption isotherms, which we compare to experimental and simulated data in the literature. In Chapter 6 we present work on the sorption of ferrocene in the zeolite NaY. We show that a simulated, low energy sorption site for ferrocene is correctly located by comparing to X-ray powder diffraction results for this same system. The thesis concludes with some overall conclusions and discussion of opportunities for future work. (author)

  2. Factors cost effectively improved using computer simulations of ...

    African Journals Online (AJOL)

    LPhidza

    effectively managed using computer simulations in semi-arid conditions pertinent to much of sub-Saharan Africa. ... small scale farmers to obtain optimal crop yields thus ensuring their food security and livelihood is ... those that simultaneously incorporate and simulate processes involved throughout the course of crop ...

  3. CloudMC: a cloud computing application for Monte Carlo simulation

    International Nuclear Information System (INIS)

    Miras, H; Jiménez, R; Miras, C; Gomà, C

    2013-01-01

    This work presents CloudMC, a cloud computing application—developed in Windows Azure®, the platform of the Microsoft® cloud—for the parallelization of Monte Carlo simulations in a dynamic virtual cluster. CloudMC is a web application designed to be independent of the Monte Carlo code on which the simulations are based—the simulations just need to be of the form: input files → executable → output files. To study the performance of CloudMC in Windows Azure®, Monte Carlo simulations with penelope were performed on different instance (virtual machine) sizes, and for different numbers of instances. The instance size was found to have no effect on the simulation runtime. It was also found that the decrease in time with the number of instances followed Amdahl's law, with a slight deviation due to the increase in the fraction of non-parallelizable time with an increasing number of instances. A simulation that would have required 30 h of CPU on a single instance was completed in 48.6 min when executed on 64 instances in parallel (speedup of 37×). Furthermore, the use of cloud computing for parallel computing offers some advantages over conventional clusters: high accessibility, scalability and pay per usage. Therefore, it is strongly believed that cloud computing will play an important role in making Monte Carlo dose calculation a reality in future clinical practice. (note)

  4. CloudMC: a cloud computing application for Monte Carlo simulation.

    Science.gov (United States)

    Miras, H; Jiménez, R; Miras, C; Gomà, C

    2013-04-21

    This work presents CloudMC, a cloud computing application-developed in Windows Azure®, the platform of the Microsoft® cloud-for the parallelization of Monte Carlo simulations in a dynamic virtual cluster. CloudMC is a web application designed to be independent of the Monte Carlo code on which the simulations are based-the simulations just need to be of the form: input files → executable → output files. To study the performance of CloudMC in Windows Azure®, Monte Carlo simulations with penelope were performed on different instance (virtual machine) sizes, and for different numbers of instances. The instance size was found to have no effect on the simulation runtime. It was also found that the decrease in time with the number of instances followed Amdahl's law, with a slight deviation due to the increase in the fraction of non-parallelizable time with an increasing number of instances. A simulation that would have required 30 h of CPU on a single instance was completed in 48.6 min when executed on 64 instances in parallel (speedup of 37×). Furthermore, the use of cloud computing for parallel computing offers some advantages over conventional clusters: high accessibility, scalability and pay per usage. Therefore, it is strongly believed that cloud computing will play an important role in making Monte Carlo dose calculation a reality in future clinical practice.
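
    The Amdahl's-law behaviour reported in the two CloudMC records above can be made concrete with a few lines of arithmetic. The sketch below was written for this summary, not taken from the paper; it simply inverts Amdahl's law using the quoted 30 h / 48.6 min / 64-instance figures to estimate the implied non-parallelizable fraction.

```python
def amdahl_speedup(serial_fraction, n_workers):
    """Speedup predicted by Amdahl's law for a given serial fraction."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_workers)

# Figures quoted in the CloudMC abstracts: ~30 h of single-instance CPU time
# completed in 48.6 min on 64 instances (speedup of roughly 37x).
observed_speedup = (30 * 60) / 48.6
n_instances = 64

# Invert Amdahl's law to estimate the implied non-parallelizable fraction.
serial_fraction = (n_instances / observed_speedup - 1.0) / (n_instances - 1.0)

print(f"observed speedup        : {observed_speedup:.1f}x")
print(f"implied serial fraction : {serial_fraction:.2%}")
print(f"predicted at 128 nodes  : {amdahl_speedup(serial_fraction, 128):.1f}x")
```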

  5. A real-time computer simulation of nuclear simulator software using standard PC hardware and linux environments

    International Nuclear Information System (INIS)

    Cha, K. H.; Kweon, K. C.

    2001-01-01

    A feasibility study, in which standard PC hardware and Real-Time Linux are applied to the real-time computer simulation of software for a nuclear simulator, is presented in this paper. The feasibility prototype was established with the existing software in the Compact Nuclear Simulator (CNS). Through the real-time implementation in the feasibility prototype, we have identified that the approach can enable computer-based predictive simulation, owing both to the remarkable improvement in real-time performance and to the reduced effort required for real-time implementation under standard PC hardware and Real-Time Linux environments.

  6. Simulation of Safety and Transient Analysis of a Pressurized Water Reactor using the Personal Computer Transient Analyzer

    Directory of Open Access Journals (Sweden)

    Sunday J. IBRAHIM

    2013-06-01

    Full Text Available Safety and transient analyses of a pressurised water reactor (PWR) using the Personal Computer Transient Analyzer (PCTRAN) simulator were carried out. The analyses used a synergistic integration of numerical models: a full-scope, high-fidelity simulation system that adopts a point reactor neutron kinetics model and movable-boundary two-phase fluid models to simplify the calculations, so that real-time simulation can be achieved on a personal computer. Various scenarios of transients and accidents likely to occur at any nuclear power plant were simulated. The simulations investigated the change of signals and parameters with respect to loss of coolant accident, scram, turbine trip, inadvertent control rod insertion and withdrawal, containment failure, fuel handling accidents in the auxiliary building and containment, and moderator dilution, as well as combinations of these events. Furthermore, statistical analyses of the PCTRAN results were carried out. The PCTRAN results for the loss of coolant accident (LOCA) showed a rapid drop in coolant pressure at a rate of 21.8 kN/m²/s, triggering a shutdown by the reactor protection system (RPS), while the turbine trip accident showed a rapid drop in total plant power at a rate of 14.3 MWe/s, causing plant downtime. Simulated fuel handling accidents showed releases of radioactive materials in unacceptable doses. This work identifies the potential classes of nuclear accidents likely to occur during operation at proposed reactor sites. The simulations are timely in the light of Nigeria's plan to generate nuclear energy in the region of 1000 MWe from reactors by 2017.

  7. Axial focusing of energy from a hypervelocity impact on earth

    Energy Technology Data Exchange (ETDEWEB)

    Boslough, M.B.; Chael, E.P.; Trucano, T.G.; Crawford, D.A.

    1994-12-01

    We have performed computational simulations to determine how energy from a large hypervelocity impact on the Earth's surface would couple to its interior. Because of the first-order axial symmetry of both the impact energy source and the stress-wave velocity structure of the Earth, a disproportionate amount of energy is dissipated along the axis defined by the impact point and its antipode (point opposite the impact). For a symmetric and homogeneous Earth model, all the impact energy that is radiated as seismic waves into the Earth at a given takeoff angle (ray parameter), independent of azimuthal direction, is refocused (minus attenuation) on the axis of symmetry, regardless of the number of reflections and refractions it has experienced. Material on or near the axis of symmetry experiences more strain cycles with much greater amplitude than elsewhere, and therefore experiences more irreversible heating. The focusing is most intense in the upper mantle, within the asthenosphere, where seismic energy is most effectively converted to heat. For a sufficiently energetic impact, this mechanism might generate enough local heating to create an isostatic instability leading to uplift, possibly resulting in rifting, volcanism, or other rearrangement of the interior dynamics of the planet. These simulations demonstrate how hypervelocity impact energy can be transported to the Earth's interior, supporting the possibility of a causal link between large impacts on Earth and major internally-driven geophysical processes.

  8. Formal Analysis of Dynamics Within Philosophy of Mind by Computer Simulation

    NARCIS (Netherlands)

    Bosse, T.; Schut, M.C.; Treur, J.

    2009-01-01

    Computer simulations can be useful tools to support philosophers in validating their theories, especially when these theories concern phenomena showing nontrivial dynamics. Such theories are usually informal, whilst for computer simulation a formally described model is needed. In this paper, a

  9. Rapid Ice-Sheet Changes and Mechanical Coupling to Solid-Earth/Sea-Level and Space Geodetic Observation

    Science.gov (United States)

    Adhikari, S.; Ivins, E. R.; Larour, E. Y.

    2015-12-01

    Perturbations in gravitational and rotational potentials caused by climate driven mass redistribution on the earth's surface, such as ice sheet melting and terrestrial water storage, affect the spatiotemporal variability in global and regional sea level. Here we present a numerically accurate, computationally efficient, high-resolution model for sea level. Unlike contemporary models that are based on spherical-harmonic formulation, the model can operate efficiently in a flexible embedded finite-element mesh system, thus capturing the physics operating at km-scale yet capable of simulating geophysical quantities that are inherently of global scale with minimal computational cost. One obvious application is to compute evolution of sea level fingerprints and associated geodetic and astronomical observables (e.g., geoid height, gravity anomaly, solid-earth deformation, polar motion, and geocentric motion) as a companion to a numerical 3-D thermo-mechanical ice sheet simulation, thus capturing global signatures of climate driven mass redistribution. We evaluate some important time-varying signatures of GRACE inferred ice sheet mass balance and continental hydrological budget; for example, we identify dominant sources of ongoing sea-level change at the selected tide gauge stations, and explain the relative contribution of different sources to the observed polar drift. We also report our progress on ice-sheet/solid-earth/sea-level model coupling efforts toward realistic simulation of Pine Island Glacier over the past several hundred years.

  10. Fast Simulation of Large-Scale Floods Based on GPU Parallel Computing

    OpenAIRE

    Qiang Liu; Yi Qin; Guodong Li

    2018-01-01

    Computing speed is a significant issue of large-scale flood simulations for real-time response to disaster prevention and mitigation. Even today, most of the large-scale flood simulations are generally run on supercomputers due to the massive amounts of data and computations necessary. In this work, a two-dimensional shallow water model based on an unstructured Godunov-type finite volume scheme was proposed for flood simulation. To realize a fast simulation of large-scale floods on a personal...

  11. Computer simulation studies in condensed-matter physics 5. Proceedings

    International Nuclear Information System (INIS)

    Landau, D.P.; Mon, K.K.; Schuettler, H.B.

    1993-01-01

    As the role of computer simulations began to increase in importance, we sensed a need for a "meeting place" for both experienced simulators and neophytes to discuss new techniques and results in an environment which promotes extended discussion. As a consequence of these concerns, the Center for Simulational Physics established an annual workshop on Recent Developments in Computer Simulation Studies in Condensed-Matter Physics. This year's workshop was the fifth in this series, and the interest which the scientific community has shown demonstrates quite clearly the useful purpose which the series has served. The workshop was held at the University of Georgia, February 17-21, 1992, and these proceedings form a record of the workshop, which is published with the goal of timely dissemination of the papers to a wider audience. The proceedings are divided into four parts. The first part contains invited papers which deal with simulational studies of classical systems and includes an introduction to some new simulation techniques and special purpose computers as well. A separate section of the proceedings is devoted to invited papers on quantum systems including new results for strongly correlated electron and quantum spin models. The third section is comprised of a single, invited description of a newly developed software shell designed for running parallel programs. The contributed presentations comprise the final chapter. (orig.). 79 figs

  12. A compositional reservoir simulator on distributed memory parallel computers

    International Nuclear Information System (INIS)

    Rame, M.; Delshad, M.

    1995-01-01

    This paper presents the application of distributed memory parallel computers to field scale reservoir simulations using a parallel version of UTCHEM, The University of Texas Chemical Flooding Simulator. The model is a general purpose, highly vectorized chemical compositional simulator that can simulate a wide range of displacement processes at both field and laboratory scales. The original simulator was modified to run on both distributed memory parallel machines (Intel iPSC/960 and Delta, Connection Machine 5, Kendall Square 1 and 2, and CRAY T3D) and a cluster of workstations. A domain decomposition approach has been taken towards parallelization of the code. A portion of the discrete reservoir model is assigned to each processor by a set-up routine that attempts a data layout as even as possible from the load-balance standpoint. Each of these subdomains is extended so that data can be shared between adjacent processors for stencil computation. The added routines that make parallel execution possible are written in a modular fashion that makes porting to new parallel platforms straightforward. Results of the distributed memory computing performance of the parallel simulator are presented for field scale applications such as tracer flood and polymer flood. A comparison of the wall-clock times for the same problems on a vector supercomputer is also presented.
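
    The domain-decomposition pattern described above (an even data layout plus extended subdomains that share boundary data with neighbours for stencil computation) can be sketched generically with mpi4py. The snippet below is a hypothetical one-dimensional illustration of that pattern, not the UTCHEM parallelization; all names and sizes in it are invented.

```python
# A 1-D block decomposition with ghost-cell exchange; run with e.g.
#   mpirun -n 4 python halo_exchange.py
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

n_global = 1000                       # total number of grid blocks
counts = [n_global // size + (1 if r < n_global % size else 0)
          for r in range(size)]       # as even a data layout as possible
n_local = counts[rank]

# Local field with one ghost cell on each side for stencil computation.
u = np.zeros(n_local + 2)
u[1:-1] = float(rank)                 # dummy initial data

left = rank - 1 if rank > 0 else MPI.PROC_NULL
right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

# Exchange boundary values with neighbours; physical boundaries keep their value.
if right != MPI.PROC_NULL:
    u[-1] = comm.sendrecv(u[-2], dest=right, source=right)
if left != MPI.PROC_NULL:
    u[0] = comm.sendrecv(u[1], dest=left, source=left)

# A stencil update can now use the freshly filled ghost cells.
u[1:-1] = 0.5 * (u[:-2] + u[2:])
```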

  13. Computer simulation of ultrasonic waves in solids

    International Nuclear Information System (INIS)

    Thibault, G.A.; Chaplin, K.

    1992-01-01

    A computer model that simulates the propagation of ultrasonic waves has been developed at AECL Research, Chalk River Laboratories. This program is called EWE, short for Elastic Wave Equations, the mathematics governing the propagation of ultrasonic waves. This report contains a brief summary of the use of ultrasonic waves in non-destructive testing techniques, a discussion of the EWE simulation code explaining the implementation of the equations and the types of output received from the model, and an example simulation showing the abilities of the model. (author). 2 refs., 2 figs

  14. COMPUTER MODEL AND SIMULATION OF A GLOVE BOX PROCESS

    International Nuclear Information System (INIS)

    Foster, C.

    2001-01-01

    The development of facilities to deal with the disposition of nuclear materials at an acceptable level of Occupational Radiation Exposure (ORE) is a significant issue facing the nuclear community. One solution is to minimize the worker's exposure through the use of automated systems. However, the adoption of automated systems for these tasks is hampered by the challenging requirements that these systems must meet in order to be cost effective solutions in the hazardous nuclear materials processing environment. Retrofitting current glove box technologies with automation systems represents potential near-term technology that can be applied to reduce worker ORE associated with work in nuclear materials processing facilities. Successful deployment of automation systems for these applications requires the development of testing and deployment strategies to ensure the highest level of safety and effectiveness. Historically, safety tests are conducted with glove box mock-ups around the finished design. This late detection of problems leads to expensive redesigns and costly deployment delays. With the widespread availability of computers and cost-effective simulation software, it is possible to discover and fix problems early in the design stages. Computer simulators can easily create a complete model of the system, allowing a safe medium for testing potential failures and design shortcomings. The majority of design specification is now done on computers, and moving that information to a model is relatively straightforward. With a complete model and results from a Failure Mode Effect Analysis (FMEA), redesigns can be addressed early. Additional issues such as user accessibility, component replacement, and alignment problems can be tackled early in the virtual environment provided by computer simulation. In this case, a commercial simulation package is used to simulate a lathe process operation at the Los Alamos National Laboratory (LANL). The lathe process operation is indicative of…

  15. Thermodynamic and transport properties of nitrogen fluid: Molecular theory and computer simulations

    Science.gov (United States)

    Eskandari Nasrabad, A.; Laghaei, R.

    2018-04-01

    Computer simulations and various theories are applied to compute the thermodynamic and transport properties of nitrogen fluid. To model the nitrogen interaction, an existing potential in the literature is modified to obtain a close agreement between the simulation results and experimental data for the orthobaric densities. We use the Generic van der Waals theory to calculate the mean free volume and apply the results within the modified Cohen-Turnbull relation to obtain the self-diffusion coefficient. Compared to experimental data, excellent results are obtained via computer simulations for the orthobaric densities, the vapor pressure, the equation of state, and the shear viscosity. We analyze the results of the theory and computer simulations for the various thermophysical properties.
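
    For background, the free-volume route to the self-diffusion coefficient mentioned above is usually traced to the Cohen-Turnbull expression, quoted here in its classic form; the modified relation actually used by the authors may differ in detail, so this is context rather than their exact formula.

```latex
% Classic Cohen-Turnbull free-volume form of the self-diffusion coefficient:
% g is a geometric factor, a* roughly a molecular diameter, u a thermal velocity,
% \gamma an overlap factor of order unity, v* the critical void volume for a
% diffusive jump, and v_f the average free volume per particle.
D = g\, a^{*} u \,\exp\!\left( -\frac{\gamma\, v^{*}}{v_{f}} \right)
```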

  16. Computer simulation for sodium-concrete reactions

    International Nuclear Information System (INIS)

    Zhang Bin; Zhu Jizhou

    2006-01-01

    In liquid metal cooled fast breeder reactors (LMFBRs), direct contact between sodium and concrete is unavoidable. Because of sodium's high chemical reactivity, sodium reacts violently with concrete, releasing large amounts of hydrogen gas and heat, which could threaten the integrity of the containment. This paper developed a program to simulate sodium-concrete reactions comprehensively. It can give the reaction zone temperature, pool temperature, penetration depth, penetration rate, hydrogen flux, reaction heat, and so on. Concrete was considered to be composed of silica and water only in this paper. A variable, the quotient of sodium hydroxide, was introduced into the continuity equation to simulate the chemical reactions more realistically. The product of the net gas flux and the boundary depth was suitably transformed into that of the penetration rate and the boundary depth. The complex chemical kinetics equations were simplified under some hypotheses. All of the techniques applied above simplified the computer simulation considerably; in other words, they made the computer simulation feasible. The theoretical models applied in the program and the calculation procedure are described in detail. Good agreement with the overall transient behavior was obtained in the analysis of a series of sodium-concrete reaction experiments. The comparison between the analytical and experimental results showed that the program presented in this paper is credible and reasonable for simulating sodium-concrete reactions. This program can be used for nuclear safety judgement. (authors)

  17. A review of computer-based simulators for ultrasound training.

    Science.gov (United States)

    Blum, Tobias; Rieger, Andreas; Navab, Nassir; Friess, Helmut; Martignoni, Marc

    2013-04-01

    Computer-based simulators for ultrasound training are a topic of recent interest. During the last 15 years, many different systems and methods have been proposed. This article provides an overview and classification of systems in this domain and a discussion of their advantages. Systems are classified and discussed according to the image simulation method, user interactions and medical applications. Computer simulation of ultrasound has one key advantage over traditional training. It enables novel training concepts, for example, through advanced visualization, case databases, and automatically generated feedback. Qualitative evaluations have mainly shown positive learning effects. However, few quantitative evaluations have been performed and long-term effects have to be examined.

  18. Computer Graphics Simulations of Sampling Distributions.

    Science.gov (United States)

    Gordon, Florence S.; Gordon, Sheldon P.

    1989-01-01

    Describes the use of computer graphics simulations to enhance student understanding of sampling distributions that arise in introductory statistics. Highlights include the distribution of sample proportions, the distribution of the difference of sample means, the distribution of the difference of sample proportions, and the distribution of sample…
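
    In the same spirit as the record above, a sampling distribution can be simulated in a few lines; the sketch below generates the distribution of a sample proportion and compares its spread with the theoretical standard error. It is a generic classroom-style illustration, not code from the cited article.

```python
import numpy as np

rng = np.random.default_rng(0)
p, n, reps = 0.3, 50, 10_000          # true proportion, sample size, repetitions

# Each repetition draws a sample of size n and records the sample proportion.
sample_props = rng.binomial(n, p, size=reps) / n

print(f"mean of sample proportions: {sample_props.mean():.4f} (theory: {p})")
print(f"std  of sample proportions: {sample_props.std(ddof=1):.4f} "
      f"(theory: {np.sqrt(p * (1 - p) / n):.4f})")
```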

  19. Earth Science Data Education through Cooking Up Recipes

    Science.gov (United States)

    Weigel, A. M.; Maskey, M.; Smith, T.; Conover, H.

    2016-12-01

    One of the major challenges in Earth science research and applications is understanding and applying the proper methods, tools, and software for using scientific data. These techniques are often difficult and time consuming to identify, requiring novice users to conduct extensive research, take classes, and reach out for assistance, thus hindering scientific discovery and real-world applications. To address these challenges, the Global Hydrology Resource Center (GHRC) DAAC has developed a series of data recipes that novice users such as students, decision makers, and general Earth scientists can leverage to learn how to use Earth science datasets. Once the data recipe content had been finalized, GHRC computer and Earth scientists collaborated with a web and graphic designer to ensure the content is both attractively presented to data users and clearly communicated to promote the education and use of Earth science data. The completed data recipes include, but are not limited to, tutorials, iPython Notebooks, resources, and tools necessary for addressing key difficulties in data use across a broad user base. These recipes not only enable non-traditional users to learn how to use data, but also curate and communicate common methods and approaches that may be difficult and time consuming for these users to identify.

  20. Computer simulation of nonequilibrium processes

    International Nuclear Information System (INIS)

    Wallace, D.C.

    1985-07-01

    The underlying concepts of nonequilibrium statistical mechanics, and of irreversible thermodynamics, will be described. The question at hand is then: how are these concepts to be realized in computer simulations of many-particle systems? The answer will be given for dissipative deformation processes in solids, on three hierarchical levels: heterogeneous plastic flow, dislocation dynamics, and molecular dynamics. Application to the shock process will be discussed.

  1. Building an adiabatic quantum computer simulation in the classroom

    Science.gov (United States)

    Rodríguez-Laguna, Javier; Santalla, Silvia N.

    2018-05-01

    We present a didactic introduction to adiabatic quantum computation (AQC) via the explicit construction of a classical simulator of quantum computers. This constitutes a suitable route to introduce several important concepts for advanced undergraduates in physics: quantum many-body systems, quantum phase transitions, disordered systems, spin-glasses, and computational complexity theory.
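
    As a rough illustration of the kind of classical AQC simulator the record describes, the sketch below builds the interpolating Hamiltonian H(s) = (1 - s) H0 + s HP for a toy two-qubit problem and tracks the spectral gap along the anneal. The problem instance and all names are invented for the example and are not taken from the article.

```python
import numpy as np

# Pauli X and the identity, used to build the two-qubit driver Hamiltonian.
sx = np.array([[0.0, 1.0], [1.0, 0.0]])
I2 = np.eye(2)

# Driver H0 = -(X1 + X2); its ground state is the uniform superposition.
H0 = -(np.kron(sx, I2) + np.kron(I2, sx))

# Toy problem Hamiltonian: a diagonal cost function over the 4 basis states.
costs = np.array([3.0, 1.0, 0.0, 2.0])      # invented instance; minimum at |10>
HP = np.diag(costs)

# Adiabatic interpolation H(s) = (1 - s) H0 + s HP; track the spectral gap.
gaps = []
for s in np.linspace(0.0, 1.0, 201):
    evals = np.linalg.eigvalsh((1 - s) * H0 + s * HP)   # ascending eigenvalues
    gaps.append(evals[1] - evals[0])

print(f"minimum spectral gap along the anneal: {min(gaps):.4f}")
```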

  2. Quantum computer gate simulations | Dada | Journal of the Nigerian ...

    African Journals Online (AJOL)

    A new interactive simulator for Quantum Computation has been developed for simulation of the universal set of quantum gates and for construction of new gates of up to 3 qubits. The simulator also automatically generates an equivalent quantum circuit for any arbitrary unitary transformation on a qubit. Available quantum ...

  3. Direct simulation Monte Carlo ray tracing model of light scattering by a class of real particles and comparison with PROGRA2 experimental results

    International Nuclear Information System (INIS)

    Mikrenska, M.; Koulev, P.; Renard, J.-B.; Hadamcik, E.; Worms, J.-C.

    2006-01-01

    The Direct Simulation Monte Carlo (DSMC) model is presented for three-dimensional single scattering of natural light by suspended, randomly oriented, optically homogeneous and isotropic, rounded and stochastically rough cubic particles. The modelled particles have a large size parameter, which allows the geometric optics approximation to be used. The proposed computational model is simple and flexible. It is tested by comparison with the known geometric optics solution for a perfect cube and the Lorenz-Mie solution for a sphere, as extreme cases of the class of rounded cubes. Scattering and polarization properties of particles with various geometrical and optical characteristics are examined. An experimental study of real NaCl crystals with the new PROGRA2 instrument under microgravity conditions was conducted. The experimental and computed polarization and brightness phase curves are compared.

  4. Application of parallel computing techniques to a large-scale reservoir simulation

    International Nuclear Information System (INIS)

    Zhang, Keni; Wu, Yu-Shu; Ding, Chris; Pruess, Karsten

    2001-01-01

    Even with the continual advances made in both computational algorithms and computer hardware used in reservoir modeling studies, large-scale simulation of fluid and heat flow in heterogeneous reservoirs remains a challenge. The problem commonly arises from intensive computational requirement for detailed modeling investigations of real-world reservoirs. This paper presents the application of a massive parallel-computing version of the TOUGH2 code developed for performing large-scale field simulations. As an application example, the parallelized TOUGH2 code is applied to develop a three-dimensional unsaturated-zone numerical model simulating flow of moisture, gas, and heat in the unsaturated zone of Yucca Mountain, Nevada, a potential repository for high-level radioactive waste. The modeling approach employs refined spatial discretization to represent the heterogeneous fractured tuffs of the system, using more than a million 3-D gridblocks. The problem of two-phase flow and heat transfer within the model domain leads to a total of 3,226,566 linear equations to be solved per Newton iteration. The simulation is conducted on a Cray T3E-900, a distributed-memory massively parallel computer. Simulation results indicate that the parallel computing technique, as implemented in the TOUGH2 code, is very efficient. The reliability and accuracy of the model results have been demonstrated by comparing them to those of small-scale (coarse-grid) models. These comparisons show that simulation results obtained with the refined grid provide more detailed predictions of the future flow conditions at the site, aiding in the assessment of proposed repository performance

  5. Advanced computational simulations of water waves interacting with wave energy converters

    Science.gov (United States)

    Pathak, Ashish; Freniere, Cole; Raessi, Mehdi

    2017-03-01

    Wave energy converter (WEC) devices harness the renewable ocean wave energy and convert it into useful forms of energy, e.g. mechanical or electrical. This paper presents an advanced 3D computational framework to study the interaction between water waves and WEC devices. The computational tool solves the full Navier-Stokes equations and considers all important effects impacting the device performance. To enable large-scale simulations in fast turnaround times, the computational solver was developed in an MPI parallel framework. A fast multigrid preconditioned solver is introduced to solve the computationally expensive pressure Poisson equation. The computational solver was applied to two surface-piercing WEC geometries: bottom-hinged cylinder and flap. Their numerically simulated response was validated against experimental data. Additional simulations were conducted to investigate the applicability of Froude scaling in predicting full-scale WEC response from the model experiments.

  6. Cluster computing for lattice QCD simulations

    International Nuclear Information System (INIS)

    Coddington, P.D.; Williams, A.G.

    2000-01-01

    Full text: Simulations of lattice quantum chromodynamics (QCD) require enormous amounts of compute power. In the past, this has usually involved sharing time on large, expensive machines at supercomputing centres. Over the past few years, clusters of networked computers have become very popular as a low-cost alternative to traditional supercomputers. The dramatic improvements in performance (and more importantly, the ratio of price/performance) of commodity PCs, workstations, and networks have made clusters of off-the-shelf computers an attractive option for low-cost, high-performance computing. A major advantage of clusters is that since they can have any number of processors, they can be purchased using any sized budget, allowing research groups to install a cluster for their own dedicated use, and to scale up to more processors if additional funds become available. Clusters are now being built for high-energy physics simulations. Wuppertal has recently installed ALiCE, a cluster of 128 Alpha workstations running Linux, with a peak performance of 158 G flops. The Jefferson Laboratory in the US has a 16 node Alpha cluster and plans to upgrade to a 256 processor machine. In Australia, several large clusters have recently been installed. Swinburne University of Technology has a cluster of 64 Compaq Alpha workstations used for astrophysics simulations. Early this year our DHPC group constructed a cluster of 116 dual Pentium PCs (i.e. 232 processors) connected by a Fast Ethernet network, which is used by chemists at Adelaide University and Flinders University to run computational chemistry codes. The Australian National University has recently installed a similar PC cluster with 192 processors. The Centre for the Subatomic Structure of Matter (CSSM) undertakes large-scale high-energy physics calculations, mainly lattice QCD simulations. The choice of the computer and network hardware for a cluster depends on the particular applications to be run on the machine. Our

  7. Computer Networks E-learning Based on Interactive Simulations and SCORM

    Directory of Open Access Journals (Sweden)

    Francisco Andrés Candelas

    2011-05-01

    Full Text Available This paper introduces a new set of compact interactive simulations developed for the constructive learning of computer networks concepts. These simulations, which compose a virtual laboratory implemented as portable Java applets, have been created by combining EJS (Easy Java Simulations) with the KivaNS API. Furthermore, in this work, the skills and motivation level acquired by the students are evaluated and measured when these simulations are combined with Moodle and SCORM (Sharable Content Object Reference Model) documents. This study has been developed to improve and stimulate autonomous constructive learning, in addition to providing timetable flexibility for a Computer Networks subject.

  8. School physics teacher class management, laboratory practice, student engagement, critical thinking, cooperative learning and use of simulations effects on student performance

    Science.gov (United States)

    Riaz, Muhammad

    The purpose of this study was to examine how class management, laboratory practice, student engagement, critical thinking, cooperative learning, and the use of simulations in physics classes predicted the percentage of students achieving a grade point average of B or higher and their academic performance as reported by teachers in secondary school physics classes. The target population consisted of secondary school physics teachers who were members of Science, Technology, Engineering, and Mathematics Teachers of New York City (STEMteachersNYC) and the American Modeling Teachers Association (AMTA). They used simulations in their physics classes in the 2013 and 2014 school years. Subjects for this study were volunteers. A survey was constructed based on a literature review. Eighty-two physics teachers completed the survey about instructional practice in physics. All respondents were anonymous. Classroom management was the only predictor of the percentage of students achieving a grade point average of B or higher in high school physics class. Cooperative learning, use of simulations, and student engagement were predictors of teachers' views of student academic performance in high school physics class. All other variables -- class management, laboratory practice, critical thinking, and teacher self-efficacy -- were not predictors of teachers' views of student academic performance in high school physics class. The implications of these findings were discussed and recommendations for physics teachers to improve student learning were presented.

  9. Multi-Class Motor Imagery EEG Decoding for Brain-Computer Interfaces

    Science.gov (United States)

    Wang, Deng; Miao, Duoqian; Blohm, Gunnar

    2012-01-01

    Recent studies show that scalp electroencephalography (EEG) as a non-invasive interface has great potential for brain-computer interfaces (BCIs). However, one factor that has limited practical applications for EEG-based BCI so far is the difficulty to decode brain signals in a reliable and efficient way. This paper proposes a new robust processing framework for decoding of multi-class motor imagery (MI) that is based on five main processing steps. (i) Raw EEG segmentation without the need of visual artifact inspection. (ii) Considering that EEG recordings are often contaminated not just by electrooculography (EOG) but also other types of artifacts, we propose to first implement an automatic artifact correction method that combines regression analysis with independent component analysis for recovering the original source signals. (iii) The significant difference between frequency components based on event-related (de-) synchronization and sample entropy is then used to find non-contiguous discriminating rhythms. After spectral filtering using the discriminating rhythms, a channel selection algorithm is used to select only relevant channels. (iv) Feature vectors are extracted based on the inter-class diversity and time-varying dynamic characteristics of the signals. (v) Finally, a support vector machine is employed for four-class classification. We tested our proposed algorithm on experimental data that was obtained from dataset 2a of BCI competition IV (2008). The overall four-class kappa values (between 0.41 and 0.80) were comparable to other models but without requiring any artifact-contaminated trial removal. The performance showed that multi-class MI tasks can be reliably discriminated using artifact-contaminated EEG recordings from a few channels. This may be a promising avenue for online robust EEG-based BCI applications. PMID:23087607
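
    The final classification stage of the pipeline described above can be illustrated with scikit-learn. The sketch below trains a four-class support vector machine on synthetic stand-in features and reports the kappa statistic used in the abstract; the earlier steps (artifact correction, rhythm and channel selection, feature extraction) are deliberately omitted, and none of this is the authors' code.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(1)
n_trials, n_features, n_classes = 400, 12, 4

# Synthetic stand-in for extracted motor-imagery features: one cluster per class.
y = rng.integers(0, n_classes, size=n_trials)
centers = rng.normal(scale=2.0, size=(n_classes, n_features))
X = centers[y] + rng.normal(size=(n_trials, n_features))

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

# Four-class SVM (scikit-learn handles multi-class internally via one-vs-one).
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_train, y_train)

y_pred = clf.predict(X_test)
print(f"four-class kappa: {cohen_kappa_score(y_test, y_pred):.2f}")
```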

  10. EarthServer: Cross-Disciplinary Earth Science Through Data Cube Analytics

    Science.gov (United States)

    Baumann, P.; Rossi, A. P.

    2016-12-01

    The unprecedented increase in imagery, in-situ measurements, and simulation data produced by Earth (and Planetary) Science observation missions carries a rich, yet largely untapped, potential for gaining insight by integrating such diverse datasets and transforming scientific questions into actual queries to data, formulated in a standardized way. The intercontinental EarthServer [1] initiative is demonstrating new directions for flexible, scalable Earth Science services based on innovative NoSQL technology. Researchers from Europe, the US and Australia have teamed up to rigorously implement the concept of the datacube. Such a datacube may have spatial and temporal dimensions (such as a satellite image time series) and may unite an unlimited number of scenes. Independently of whatever efficient data structuring a server network may perform internally, users (scientists, planners, decision makers) will always see just a few datacubes they can slice and dice. EarthServer has established client [2] and server technology for such spatio-temporal datacubes. The underlying scalable array engine, rasdaman [3,4], enables direct interaction, including 3-D visualization, common EO data processing, and general analytics. Services exclusively rely on the open OGC "Big Geo Data" standards suite, the Web Coverage Service (WCS). Conversely, EarthServer has shaped and advanced WCS based on the experience gained. The first phase of EarthServer advanced scalable array database technology into 150+ TB services. Currently, petabyte datacubes are being built for ad-hoc and cross-disciplinary querying, e.g. using climate, Earth observation and ocean data. We will present the EarthServer approach, its impact on OGC / ISO / INSPIRE standardization, and its platform technology, rasdaman. References: [1] Baumann, et al. (2015) DOI: 10.1080/17538947.2014.1003106 [2] Hogan, P., (2011) NASA World Wind, Proceedings of the 2nd International Conference on Computing for Geospatial Research

  11. Type Families with Class, Type Classes with Family

    DEFF Research Database (Denmark)

    Serrano, Alejandro; Hage, Jurriaan; Bahr, Patrick

    2015-01-01

    Type classes and type families are key ingredients in Haskell programming. Type classes were introduced to deal with ad-hoc polymorphism, although with the introduction of functional dependencies, their use expanded to type-level programming. Type families also allow encoding type-level functions, now as rewrite rules. This paper looks at the interplay of type classes and type families, and how to deal with shortcomings in both of them. Furthermore, we show how to use families to simulate classes at the type level. However, type families alone are not enough for simulating a central feature of type classes: elaboration, that is, generating code from the derivation of a rewriting. We look at ways to solve this problem in current Haskell, and propose an extension to allow elaboration during the rewriting phase.

  12. Computer Simulation of Angle-measuring System of Photoelectric Theodolite

    International Nuclear Information System (INIS)

    Zeng, L; Zhao, Z W; Song, S L; Wang, L T

    2006-01-01

    In this paper, a virtual test platform based on malfunction phenomena is designed using the methods of computer simulation and numerical mask. It is used in the simulation training of the angle-measuring system of a photoelectric theodolite. Practical application shows that this platform provides good conditions for in-depth simulation training by technicians and presents a useful approach for the establishment of simulation platforms for other large equipment.

  13. iSTEM: Celebrating Earth Day with Sustainability

    Science.gov (United States)

    Sibley, Amanda; Kurz, Terri L.

    2014-01-01

    Earth Day is celebrated annually on April 22. Teachers often commemorate Earth Day with their classes by planting trees, discussing important conservation topics (such as recycling or preventing pollution), and encouraging students to take care of planet Earth. To promote observance of Earth Day in an intermediate elementary school classroom, this…

  14. Axial focusing of energy from a hypervelocity impact on earth

    International Nuclear Information System (INIS)

    Boslough, M.B.; Chael, E.P.; Trucano, T.G.; Crawford, D.A.

    1994-01-01

    We have performed computational simulations to determine how energy from a large hypervelocity impact on the Earth's surface would couple to its interior. Because of the first-order axial symmetry of both the impact energy source and the stress-wave velocity structure of the Earth, a disproportionate amount of energy is dissipated along the axis defined by the impact point and its antipode (point opposite the impact). For a symmetric and homogeneous Earth model, all the impact energy that is radiated as seismic waves into the Earth at a given takeoff angle (ray parameter), independent of azimuthal direction, is refocused (minus attenuation) on the axis of symmetry, regardless of the number of reflections and refractions it has experienced. Material on or near the axis of symmetry experiences more strain cycles with much greater amplitude than elsewhere, and therefore experiences more irreversible heating. The focusing is most intense in the upper mantle, within the asthenosphere, where seismic energy is most effectively converted to heat. For a sufficiently energetic impact, this mechanism might generate enough local heating to create an isostatic instability leading to uplift, possibly resulting in rifting, volcanism, or other rearrangement of the interior dynamics of the planet. These simulations demonstrate how hypervelocity impact energy can be transported to the Earth's interior, supporting the possibility of a causal link between large impacts on Earth and major internally-driven geophysical processes

  15. What do we want from computer simulation of SIMS using clusters?

    International Nuclear Information System (INIS)

    Webb, R.P.

    2008-01-01

    Computer simulation of energetic cluster interactions with surfaces has provided much-needed insight into some of the complex processes which occur and are responsible for the desirable as well as undesirable effects which make the use of clusters in SIMS both useful and challenging. Simulations have shown how cluster impacts can cause meso-scale motion of the target material, which can result in the relatively gentle up-lift of large intact molecules adsorbed on the surface, in contrast to the behaviour of single atom impacts, which tend to create discrete motion in the surface, often ejecting fragments of adsorbed molecules instead. With the insight provided by simulations, experimentalists can then improve their equipment to best maximise the desired effects. The past 40 years have seen great progress in simulation techniques and computer equipment. Forty years ago simulations were performed on simple atomic systems of around 300 atoms, employing only simple pair-wise interaction potentials, for times of several hundred femtoseconds. Currently simulations can be performed on large organic materials, employing many-body potentials for millions of atoms and times of many picoseconds. These simulations, however, can take several months of computation time. Even with the degree of realism introduced by these long-time simulations, they are still not perfect and are often not capable of being used in a completely predictive way. Computer simulation is reaching a position whereby any further effort to increase its realism will make it completely intractable to solve in a reasonable time frame, and yet there is an increasing demand from experimentalists for something that can help in a predictive way with experiment design and interpretation. This paper will discuss the problems of computer simulation, what might be possible to achieve in the short term, what is unlikely ever to be possible without a major new breakthrough, and how we might exploit the meso-scale effects in…

  16. Validation and computing and performance studies for the ATLAS simulation

    CERN Document Server

    Marshall, Z; The ATLAS collaboration

    2009-01-01

    We present the validation of the ATLAS simulation software project. Software development is controlled by nightly builds and several levels of automatic tests to ensure stability. Computing validation, including CPU time, memory, and disk space required per event, is benchmarked for all software releases. Several different physics processes and event types are checked to thoroughly test all aspects of the detector simulation. The robustness of the simulation software is demonstrated by the production of 500 million events on the World-wide LHC Computing Grid in the last year.

  17. On efficiency of fire simulation realization: parallelization with greater number of computational meshes

    Science.gov (United States)

    Valasek, Lukas; Glasa, Jan

    2017-12-01

    Current fire simulation systems are capable of exploiting the advantages of available high-performance computing (HPC) platforms and of modelling fires efficiently in parallel. In this paper, the efficiency of a corridor fire simulation on an HPC cluster is discussed. The parallel MPI version of the Fire Dynamics Simulator is used to test the efficiency of selected strategies for allocating the computational resources of the cluster when a greater number of computational cores is used. Simulation results indicate that if the number of cores used is not a multiple of the total number of cores per cluster node, there are allocation strategies which provide more efficient calculations.
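
    A small helper of the following kind, written for this summary and not part of the cited study, makes the allocation question concrete: given the number of cores requested and the number of cores per cluster node, it reports whether the request is a whole-node multiple and how the cores spread across nodes.

```python
def describe_allocation(cores_requested, cores_per_node):
    """Report whether a core request fills whole nodes and how it is spread."""
    full_nodes, remainder = divmod(cores_requested, cores_per_node)
    nodes_used = full_nodes + (1 if remainder else 0)
    return {
        "cores_requested": cores_requested,
        "nodes_used": nodes_used,
        "whole_node_multiple": remainder == 0,
        "cores_on_last_node": remainder or cores_per_node,
    }

# Example: 36 cores on 16-core nodes leaves the last node only partly filled.
print(describe_allocation(36, 16))
print(describe_allocation(48, 16))
```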

  18. Computational simulator of robotic manipulators

    International Nuclear Information System (INIS)

    Leal, Alexandre S.; Campos, Tarcisio P.R.

    1995-01-01

    Robotic applications for industrial plants are discussed and a computational model for a three-link mechanical manipulator is presented. A feed-forward neural network has been used to model the dynamic control of the manipulator. A graphic interface was developed in the C programming language as a virtual world in order to visualize and simulate the arm movements in a radioactive-waste-handling environment. (author). 7 refs, 5 figs

  19. Final Report Collaborative Project. Improving the Representation of Coastal and Estuarine Processes in Earth System Models

    Energy Technology Data Exchange (ETDEWEB)

    Bryan, Frank [National Center for Atmospheric Research, Boulder, CO (United States); Dennis, John [National Center for Atmospheric Research, Boulder, CO (United States); MacCready, Parker [Univ. of Washington, Seattle, WA (United States); Whitney, Michael [Univ. of Connecticut

    2015-11-20

    This project aimed to improve long term global climate simulations by resolving and enhancing the representation of the processes involved in the cycling of freshwater through estuaries and coastal regions. This was a collaborative multi-institution project consisting of physical oceanographers, climate model developers, and computational scientists. It specifically targeted the DOE objectives of advancing simulation and predictive capability of climate models through improvements in resolution and physical process representation. The main computational objectives were: 1. To develop computationally efficient, but physically based, parameterizations of estuary and continental shelf mixing processes for use in an Earth System Model (CESM). 2. To develop a two-way nested regional modeling framework in order to dynamically downscale the climate response of particular coastal ocean regions and to upscale the impact of the regional coastal processes to the global climate in an Earth System Model (CESM). 3. To develop computational infrastructure to enhance the efficiency of data transfer between specific sources and destinations, i.e., a point-to-point communication capability, (used in objective 1) within POP, the ocean component of CESM.

  20. Modular Approaches to Earth Science Scientific Computing: 3D Electromagnetic Induction Modeling as an Example

    Science.gov (United States)

    Tandon, K.; Egbert, G.; Siripunvaraporn, W.

    2003-12-01

    We are developing a modular system for three-dimensional inversion of electromagnetic (EM) induction data, using an object-oriented programming approach. This approach allows us to modify the individual components of the proposed inversion scheme, and also to reuse the components for a variety of problems in earth science computing, howsoever diverse they might be. In particular, the modularity allows us to (a) change modeling codes independently of inversion algorithm details; (b) experiment with new inversion algorithms; and (c) modify the way prior information is imposed in the inversion to test competing hypotheses and techniques required to solve an earth science problem. Our initial code development is for the EM induction equations on a staggered grid, using iterative solution techniques in 3D. An example illustrated here is an experiment with the sensitivity of 3D magnetotelluric inversion to uncertainties in the boundary conditions required for regional induction problems. These boundary conditions should reflect the large-scale geoelectric structure of the study area, which is usually poorly constrained. In general, for inversion of MT data, one fixes boundary conditions at the edge of the model domain and adjusts the earth's conductivity structure within the modeling domain. Allowing for errors in the specification of the open boundary values is simple in principle, but no existing inversion codes that we are aware of have this feature. Adding a feature such as this is straightforward within the context of the modular approach. More generally, a modular approach provides an efficient methodology for setting up earth science computing problems to test various ideas. As a concrete illustration relevant to EM induction problems, we investigate the sensitivity of MT data near the San Andreas Fault at Parkfield (California) to uncertainties in the regional geoelectric structure.
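
    The kind of modularity described above, in which the forward modeling code can be swapped without touching the inversion driver, can be sketched in a few lines. The class and function names below are invented for illustration and do not correspond to the authors' code, which solves the 3D induction equations on a staggered grid.

```python
from abc import ABC, abstractmethod
import numpy as np

class ForwardModel(ABC):
    """Interface the inversion driver relies on; any modeling code that
    implements it can be plugged in without changing the inversion."""

    @abstractmethod
    def predict(self, model: np.ndarray) -> np.ndarray:
        """Map a model parameter vector to predicted data."""

class ToyLinearForward(ForwardModel):
    """Stand-in forward operator d = G m (a real one would solve the 3D
    induction equations numerically)."""

    def __init__(self, G: np.ndarray):
        self.G = G

    def predict(self, model: np.ndarray) -> np.ndarray:
        return self.G @ model

def misfit(forward: ForwardModel, model, observed):
    """Inversion-side code: depends only on the ForwardModel interface."""
    residual = forward.predict(model) - observed
    return float(residual @ residual)

# Usage: swap in any ForwardModel implementation without touching misfit().
G = np.array([[1.0, 0.5], [0.2, 2.0]])
fwd = ToyLinearForward(G)
print(misfit(fwd, np.array([1.0, 1.0]), np.array([1.4, 2.3])))
```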

  1. An Investigation of Computer-based Simulations for School Crises Management.

    Science.gov (United States)

    Degnan, Edward; Bozeman, William

    2001-01-01

    Describes development of a computer-based simulation program for training school personnel in crisis management. Addresses the data collection and analysis involved in developing a simulated event, the systems requirements for simulation, and a case study of application and use of the completed simulation. (Contains 21 references.) (Authors/PKP)

  2. The use of micro-computers in the simulation of ion beam optics

    International Nuclear Information System (INIS)

    Spaedtke, P.; Ivens, D.

    1989-01-01

    With computer simulation codes, specific problems of ion beam optics can be studied, which is useful both in design and in the optimization of existing systems. Several such codes have been developed, but they unfortunately require substantial computer resources. Recent advances in mini- and micro-computers have now made it possible to develop simulation codes which can also be run on these small computers. In this paper, some of these codes are presented and their computing time is discussed. (author)

  3. Computer simulation of driven Alfven waves

    International Nuclear Information System (INIS)

    Geary, J.L. Jr.

    1986-01-01

    The first particle simulation study of shear Alfven wave resonance heating is presented. Particle simulation codes self-consistently follow the time evolution of the individual and collective aspects of particle dynamics as well as wave dynamics in a fully nonlinear fashion. Alfven wave heating is a possible means of increasing the temperature of magnetized plasmas. A new particle simulation model was developed for this application that incorporates the Darwin formulation of the electromagnetic fields with a guiding center approximation for electron motion perpendicular to the ambient magnetic field. The implementation of this model and the examination of its theoretical and computational properties are presented. With this model, several cases of Alfven wave heating are examined in both uniform and nonuniform simulation systems in a two-dimensional slab. For the inhomogeneous case studies, the kinetic Alfven wave develops in the vicinity of the shear Alfven resonance region.

  4. Man-machine interfaces analysis system based on computer simulation

    International Nuclear Information System (INIS)

    Chen Xiaoming; Gao Zuying; Zhou Zhiwei; Zhao Bingquan

    2004-01-01

    The paper describes a software assessment system, Dynamic Interaction Analysis Support (DIAS), based on computer simulation technology for the man-machine interfaces (MMI) of a control room. It employs a computer to simulate operating procedures carried out on the man-machine interfaces of a control room, provides quantified assessment, and at the same time analyses the operational error rate of operators by means of human error rate prediction techniques. Problems with the placement of man-machine interfaces in a control room and with the arrangement of instruments can be detected from the simulation results. The DIAS system can provide good technical support for the design and improvement of the man-machine interfaces of the main control room of a nuclear power plant.

  5. Computer Simulations and Theoretical Studies of Complex Systems: from complex fluids to frustrated magnets

    Science.gov (United States)

    Choi, Eunsong

    Computer simulations are an integral part of research in modern condensed matter physics; they serve as a direct bridge between theory and experiment by systematically applying a microscopic model to a collection of particles that effectively imitate a macroscopic system. In this thesis, we study two very different condensed-matter systems, namely complex fluids and frustrated magnets, primarily by simulating the classical dynamics of each system. In the first part of the thesis, we focus on ionic liquids (ILs) and polymers--the two complementary classes of materials that can be combined to provide various unique properties. The properties of polymer/IL systems, such as conductivity, viscosity, and miscibility, can be fine-tuned by choosing an appropriate combination of cations, anions, and polymers. However, designing a system that meets a specific need requires a concrete understanding of the physics and chemistry that dictate a complex interplay between polymers and ionic liquids. In this regard, molecular dynamics (MD) simulation is an efficient tool that provides a molecular-level picture of such complex systems. We study the behavior of poly(ethylene oxide) (PEO) and imidazolium-based ionic liquids, using MD simulations and statistical mechanics. We also discuss our efforts to develop reliable and efficient classical force fields for PEO and the ionic liquids. The second part is devoted to studies of geometrically frustrated magnets. In particular, a microscopic model which gives rise to an incommensurate spiral magnetic ordering observed in a pyrochlore antiferromagnet is investigated. The validation of the model is made via a comparison of the spin-wave spectra with the neutron scattering data. Since the standard Holstein-Primakoff method is difficult to employ in such a complex ground-state structure with a large unit cell, we carry out classical spin dynamics simulations to compute spin-wave spectra directly from the Fourier transform of spin trajectories. We…
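
    The last step mentioned above, obtaining spin-wave spectra directly from the Fourier transform of simulated spin trajectories, can be sketched generically. The snippet below assumes a hypothetical trajectory array for a one-dimensional chain of classical spins and only demonstrates the space-time FFT; it is not the pyrochlore calculation itself.

```python
import numpy as np

# Hypothetical classical spin-dynamics output: traj[t, i, c] is component c of
# spin i at time step t (random data stands in for a real trajectory here).
rng = np.random.default_rng(2)
n_steps, n_sites = 1024, 64
dt = 0.05                                 # time step in simulation units
traj = rng.normal(size=(n_steps, n_sites, 3))

# Space-time Fourier transform over time (axis 0) and site index (axis 1);
# summing |.|^2 over spin components gives a dynamical structure factor S(q, w).
ft = np.fft.fftn(traj, axes=(0, 1))
S_qw = (np.abs(ft) ** 2).sum(axis=-1)

omegas = 2 * np.pi * np.fft.fftfreq(n_steps, d=dt)   # frequency grid
qs = 2 * np.pi * np.fft.fftfreq(n_sites, d=1.0)      # wavevector grid
print(S_qw.shape, omegas.shape, qs.shape)
```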

  6. 77 FR 9839 - Amendment of Class D and Class E Airspace, and Establishment of Class E Airspace; Bozeman, MT

    Science.gov (United States)

    2012-02-21

    ..., to accommodate aircraft using Instrument Landing System (ILS) Localizer (LOC) standard instrument... 6005 Class E airspace areas extending upward from 700 feet or more above the surface of the earth...

  7. Computational methods for coupling microstructural and micromechanical materials response simulations

    Energy Technology Data Exchange (ETDEWEB)

    HOLM,ELIZABETH A.; BATTAILE,CORBETT C.; BUCHHEIT,THOMAS E.; FANG,HUEI ELIOT; RINTOUL,MARK DANIEL; VEDULA,VENKATA R.; GLASS,S. JILL; KNOROVSKY,GERALD A.; NEILSEN,MICHAEL K.; WELLMAN,GERALD W.; SULSKY,DEBORAH; SHEN,YU-LIN; SCHREYER,H. BUCK

    2000-04-01

    Computational materials simulations have traditionally focused on individual phenomena: grain growth, crack propagation, plastic flow, etc. However, real materials behavior results from a complex interplay between phenomena. In this project, the authors explored methods for coupling mesoscale simulations of microstructural evolution and micromechanical response. In one case, massively parallel (MP) simulations for grain evolution and microcracking in alumina stronglink materials were dynamically coupled. In the other, codes for domain coarsening and plastic deformation in CuSi braze alloys were iteratively linked. This program provided the first comparison of two promising ways to integrate mesoscale computer codes. Coupled microstructural/micromechanical codes were applied to experimentally observed microstructures for the first time. In addition to the coupled codes, this project developed a suite of new computational capabilities (PARGRAIN, GLAD, OOF, MPM, polycrystal plasticity, front tracking). The problem of plasticity length scale in continuum calculations was recognized and a solution strategy was developed. The simulations were experimentally validated on stockpile materials.

  8. Positive Wigner functions render classical simulation of quantum computation efficient.

    Science.gov (United States)

    Mari, A; Eisert, J

    2012-12-07

    We show that quantum circuits where the initial state and all the following quantum operations can be represented by positive Wigner functions can be classically efficiently simulated. This is true both for continuous-variable and for discrete-variable systems in odd prime dimensions, two cases which will be treated on entirely the same footing. Noting the fact that Clifford and Gaussian operations preserve the positivity of the Wigner function, our result generalizes the Gottesman-Knill theorem. Our algorithm provides a way of sampling from the output distribution of a computation or a simulation, including the efficient sampling from an approximate output distribution in the case of sampling imperfections for initial states, gates, or measurements. In this sense, this work highlights the role of the positive Wigner function as separating classically efficiently simulable systems from those that are potentially universal for quantum computing and simulation, and it emphasizes the role of negativity of the Wigner function as a computational resource.

  9. Assessing Practical Skills in Physics Using Computer Simulations

    Science.gov (United States)

    Walsh, Kevin

    2018-01-01

    Computer simulations have been used very effectively for many years in the teaching of science but the focus has been on cognitive development. This study, however, is an investigation into the possibility that a student's experimental skills in the real-world environment can be judged via the undertaking of a suitably chosen computer simulation…

  10. Computer simulation of stair falls to investigate scenarios in child abuse.

    Science.gov (United States)

    Bertocci, G E; Pierce, M C; Deemer, E; Aguel, F

    2001-09-01

    To demonstrate the usefulness of computer simulation techniques in the investigation of pediatric stair falls. Since stair falls are a common falsely reported injury scenario in child abuse, our specific aim was to investigate the influence of stair characteristics on injury biomechanics of pediatric stair falls by using a computer simulation model. Our long-term goal is to use knowledge of biomechanics to aid in distinguishing between accidents and abuse. A computer simulation model of a 3-year-old child falling down stairs was developed using commercially available simulation software. This model was used to investigate the influence that stair characteristics have on biomechanical measures associated with injury risk. Since femur fractures occur in unintentional and abuse scenarios, biomechanical measures were focused on the lower extremities. The number and slope of steps and stair surface friction and elasticity were found to affect biomechanical measures associated with injury risk. Computer simulation techniques are useful for investigating the biomechanics of stair falls. Using our simulation model, we determined that stair characteristics have an effect on potential for lower extremity injuries. Although absolute values of biomechanical measures should not be relied on in an unvalidated model such as this, relationships between accident-environment factors and biomechanical measures can be studied through simulation. Future efforts will focus on model validation.

  11. Evaluation of Rankine cycle air conditioning system hardware by computer simulation

    Science.gov (United States)

    Healey, H. M.; Clark, D.

    1978-01-01

    A computer program for simulating the performance of a variety of solar powered Rankine cycle air conditioning system (RCACS) components has been developed. The computer program models actual equipment by developing performance maps from manufacturers' data and is capable of simulating off-design operation of the RCACS components. The program, designed to be a subroutine of the Marshall Space Flight Center (MSFC) Solar Energy System Analysis Computer Program 'SOLRAD', is a complete package suitable for use by an occasional computer user in developing performance maps of heating, ventilation and air conditioning components.

  12. Hybrid simulation techniques applied to the earth's bow shock

    Science.gov (United States)

    Winske, D.; Leroy, M. M.

    1985-01-01

    The application of a hybrid simulation model, in which the ions are treated as discrete particles and the electrons as a massless charge-neutralizing fluid, to the study of the earth's bow shock is discussed. The essentials of the numerical methods are described in detail: movement of the ions, solution of the electromagnetic fields and electron fluid equations, and imposition of appropriate boundary and initial conditions. Examples of results of calculations for perpendicular shocks are presented which demonstrate the need for a kinetic treatment of the ions to reproduce the correct ion dynamics and the corresponding shock structure. Results for oblique shocks are also presented to show how the magnetic field and ion motion differ from the perpendicular case.
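    In a hybrid model of this kind, the particle half of the scheme advances the ions in the electromagnetic fields, for which the Boris scheme is the standard choice. The sketch below shows only that ion push, with the fields assumed already interpolated to the particle positions; the electron-fluid and field solves of a full hybrid code are omitted, and all names and values are illustrative.

```python
import numpy as np

def boris_push(x, v, E, B, q_m, dt):
    """One Boris step for ion macro-particles.

    x, v: (N, 3) positions and velocities; E, B: (N, 3) fields at the
    particles; q_m: charge-to-mass ratio. Returns updated (x, v).
    """
    qmdt2 = 0.5 * q_m * dt
    v_minus = v + qmdt2 * E                          # first half electric kick
    t = qmdt2 * B                                    # rotation vector
    s = 2.0 * t / (1.0 + (t * t).sum(axis=1, keepdims=True))
    v_prime = v_minus + np.cross(v_minus, t)
    v_plus = v_minus + np.cross(v_prime, s)          # magnetic rotation
    v_new = v_plus + qmdt2 * E                       # second half electric kick
    return x + v_new * dt, v_new

# Example: 10,000 protons gyrating in a uniform, solar-wind-like field
rng = np.random.default_rng(0)
x = rng.random((10_000, 3))
v = rng.normal(scale=1e4, size=(10_000, 3))
E = np.zeros_like(x)
B = np.tile([0.0, 0.0, 5e-9], (10_000, 1))           # 5 nT along z
x, v = boris_push(x, v, E, B, q_m=9.58e7, dt=0.01)
```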

  13. The TeraShake Computational Platform for Large-Scale Earthquake Simulations

    Science.gov (United States)

    Cui, Yifeng; Olsen, Kim; Chourasia, Amit; Moore, Reagan; Maechling, Philip; Jordan, Thomas

    Geoscientific and computer science researchers with the Southern California Earthquake Center (SCEC) are conducting a large-scale, physics-based, computationally demanding earthquake system science research program with the goal of developing predictive models of earthquake processes. The computational demands of this program continue to increase rapidly as these researchers seek to perform physics-based numerical simulations of earthquake processes for larger and larger problems. To meet the needs of this research program, a multiple-institution team coordinated by SCEC has integrated several scientific codes into a numerical modeling-based research tool we call the TeraShake computational platform (TSCP). A central component in the TSCP is a highly scalable earthquake wave propagation simulation program called the TeraShake anelastic wave propagation (TS-AWP) code. In this chapter, we describe how we extended an existing, stand-alone, well-validated, finite-difference, anelastic wave propagation modeling code into the highly scalable and widely used TS-AWP and then integrated this code into the TeraShake computational platform that provides end-to-end (initialization to analysis) research capabilities. We also describe the techniques used to enhance the TS-AWP parallel performance on TeraGrid supercomputers, as well as the TeraShake simulation phases, including input preparation, run time, data archive management, and visualization. As a result of our efforts to improve its parallel efficiency, the TS-AWP has now shown highly efficient strong scaling on over 40K processors on IBM’s BlueGene/L Watson computer. In addition, the TSCP has developed into a computational system that is useful to many members of the SCEC community for performing large-scale earthquake simulations.

  14. HTMT-class Latency Tolerant Parallel Architecture for Petaflops Scale Computation

    Science.gov (United States)

    Sterling, Thomas; Bergman, Larry

    2000-01-01

    Computational Aero Sciences and other numeric-intensive computation disciplines demand computing throughputs substantially greater than the Teraflops-scale systems only now becoming available. The related fields of fluids, structures, thermal, combustion, and dynamic controls are among the interdisciplinary areas that in combination with sufficient resolution and advanced adaptive techniques may force performance requirements towards Petaflops. This will be especially true for compute-intensive models such as Navier-Stokes, or when such system models are only part of a larger design optimization computation involving many design points. Yet recent experience with conventional MPP configurations comprising commodity processing and memory components has shown that larger scale frequently results in higher programming difficulty and lower system efficiency. While important advances in system software and algorithm techniques have had some impact on efficiency and programmability for certain classes of problems, in general it is unlikely that software alone will resolve the challenges to higher scalability. As in the past, future generations of high-end computers may require a combination of hardware architecture and system software advances to enable efficient operation at a Petaflops level. The NASA-led HTMT project has engaged the talents of a broad interdisciplinary team to develop a new strategy in high-end system architecture to deliver petaflops-scale computing in the 2004/5 timeframe. The Hybrid-Technology, MultiThreaded parallel computer architecture incorporates several advanced technologies in combination with an innovative dynamic adaptive scheduling mechanism to provide unprecedented performance and efficiency within practical constraints of cost, complexity, and power consumption. The emerging superconductor Rapid Single Flux Quantum electronics can operate at 100 GHz (the record is 770 GHz) and one percent of the power required by convention

  15. Bibliography for Verification and Validation in Computational Simulation

    International Nuclear Information System (INIS)

    Oberkampf, W.L.

    1998-01-01

    A bibliography has been compiled dealing with the verification and validation of computational simulations. The references listed in this bibliography are concentrated in the field of computational fluid dynamics (CFD). However, references from the following fields are also included: operations research, heat transfer, solid dynamics, software quality assurance, software accreditation, military systems, and nuclear reactor safety. This bibliography, containing 221 references, is not meant to be comprehensive. It was compiled during the last ten years in response to the author's interest and research in the methodology for verification and validation. The emphasis in the bibliography is in the following areas: philosophy of science underpinnings, development of terminology and methodology, high accuracy solutions for CFD verification, experimental datasets for CFD validation, and the statistical quantification of model validation. This bibliography should provide a starting point for individual researchers in many fields of computational simulation in science and engineering

  16. Bibliography for Verification and Validation in Computational Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, W.L.

    1998-10-01

    A bibliography has been compiled dealing with the verification and validation of computational simulations. The references listed in this bibliography are concentrated in the field of computational fluid dynamics (CFD). However, references from the following fields are also included: operations research, heat transfer, solid dynamics, software quality assurance, software accreditation, military systems, and nuclear reactor safety. This bibliography, containing 221 references, is not meant to be comprehensive. It was compiled during the last ten years in response to the author's interest and research in the methodology for verification and validation. The emphasis in the bibliography is in the following areas: philosophy of science underpinnings, development of terminology and methodology, high accuracy solutions for CFD verification, experimental datasets for CFD validation, and the statistical quantification of model validation. This bibliography should provide a starting point for individual researchers in many fields of computational simulation in science and engineering.

  17. Computer Simulation Surgery for Mandibular Reconstruction Using a Fibular Osteotomy Guide

    Directory of Open Access Journals (Sweden)

    Woo Shik Jeong

    2014-09-01

    Full Text Available In the present study, a fibular osteotomy guide based on a computer simulation was applied to a patient who had undergone mandibular segmental ostectomy due to oncological complications. This patient was a 68-year-old woman who presented to our department with a biopsy-proven squamous cell carcinoma on her left gingival area. This lesion had destroyed the cortical bony structure, and the patient showed attenuation of her soft tissue along the inferior alveolar nerve, indicating perineural spread of the tumor. Prior to surgery, a three-dimensional computed tomography scan of the facial and fibular bones was performed. We then created a virtual computer simulation of the mandibular segmental defect, through which we segmented the fibula to reconstruct the proper angulation of the original mandible. Approximately 2-cm segments were created on the basis of this simulation and applied to the virtually simulated mandibular segmental defect. Thus, we obtained a virtual model of the ideal mandibular reconstruction for this patient with a fibular free flap. We could then use this computer simulation for the subsequent surgery and minimize the bony gaps between the multiple fibular bony segments.

  18. Teaching Topographic Map Skills and Geomorphology Concepts with Google Earth in a One-Computer Classroom

    Science.gov (United States)

    Hsu, Hsiao-Ping; Tsai, Bor-Wen; Chen, Che-Ming

    2018-01-01

    Teaching high-school geomorphological concepts and topographic map reading entails many challenges. This research reports the applicability and effectiveness of Google Earth in teaching topographic map skills and geomorphological concepts, by a single teacher, in a one-computer classroom. Compared to learning via a conventional instructional…

  19. Toward regional-scale adjoint tomography in the deep earth

    Science.gov (United States)

    Masson, Y.; Romanowicz, B. A.

    2013-12-01

    Thanks to the development of efficient numerical computation methods, such as the Spectral Element Method (SEM), and to the increasing power of computer clusters, it is now possible to obtain regional-scale images of the Earth's interior using adjoint tomography (e.g. Tape, C., et al., 2009). For now, these tomographic models are limited to the upper layers of the earth, i.e., they provide us with high-resolution images of the crust and the upper part of the mantle. Given the gigantic amount of calculation it represents, obtaining similar models at the global scale (i.e. images of the entire Earth) seems out of reach at the moment. Furthermore, it is likely that the first generation of such global adjoint tomographic models will have a resolution significantly smaller than the current regional models. In order to image regions of interest in the deep Earth, such as plumes, slabs or large low shear velocity provinces (LLSVPs), while keeping the computation tractable, we are developing new tools that will allow us to perform regional-scale adjoint tomography at arbitrary depths. In a recent study (Masson et al., 2013), we showed that a numerical equivalent of the time-reversal mirrors used in experimental acoustics makes it possible to confine the wave propagation computations (i.e. using SEM simulations) inside the region to be imaged. With this ability to limit wave propagation modeling to the region of interest, obtaining the adjoint sensitivity kernels needed for tomographic imaging is only two steps further. First, the local wavefield modeling needs to be coupled with field extrapolation techniques in order to obtain synthetic seismograms at the surface of the earth. These seismograms will account for the 3D structure inside the region of interest in a quasi-exact manner. We will present preliminary results where the field extrapolation is performed using Green's functions computed in a 1D Earth model thanks to the Direct Solution Method (DSM). Once synthetic seismograms

  20. SNOW: a digital computer program for the simulation of ion beam devices

    International Nuclear Information System (INIS)

    Boers, J.E.

    1980-08-01

    A digital computer program, SNOW, has been developed for the simulation of dense ion beams. The program simulates the plasma expansion cup (but not the plasma source itself), the acceleration region, and a drift space with neutralization if desired. The ion beam is simulated by computing representative trajectories through the device. The potentials are computed on a large rectangular mesh, which is solved by iterative techniques. Poisson's equation is solved at each point within the configuration using space-charge densities computed from the ion trajectories combined with background electron and/or ion distributions. The simulation methods are described in some detail along with examples of both axially-symmetric and rectangular beams. A detailed description of the input data is presented.
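    The potential solve described above, Poisson's equation iterated to convergence on a rectangular mesh with the space-charge density accumulated from the particle trajectories, can be sketched as follows. This is a plain Jacobi iteration with zero-potential boundaries, shown only to illustrate the idea; SNOW itself uses its own mesh, boundary handling, and faster iterative techniques.

```python
import numpy as np

def solve_poisson(rho, h, eps0=8.854e-12, tol=1e-6, max_iter=20_000):
    """Iteratively solve Poisson's equation del^2(phi) = -rho/eps0 (sketch).

    rho: charge density on a 2-D rectangular grid (C/m^3); h: grid spacing (m).
    Boundary potentials are held at zero; plain Jacobi iteration is used.
    """
    phi = np.zeros_like(rho, dtype=float)
    for _ in range(max_iter):
        phi_new = phi.copy()
        phi_new[1:-1, 1:-1] = 0.25 * (
            phi[2:, 1:-1] + phi[:-2, 1:-1] +
            phi[1:-1, 2:] + phi[1:-1, :-2] +
            h * h * rho[1:-1, 1:-1] / eps0)
        if np.max(np.abs(phi_new - phi)) < tol:
            return phi_new
        phi = phi_new
    return phi

# Example: a small patch of positive space charge in the middle of the grid
rho = np.zeros((65, 65))
rho[30:35, 30:35] = 1e-6                       # C/m^3, illustrative value
phi = solve_poisson(rho, h=1e-3)
```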

  1. Extremely Scalable Spiking Neuronal Network Simulation Code: From Laptops to Exascale Computers

    Science.gov (United States)

    Jordan, Jakob; Ippen, Tammo; Helias, Moritz; Kitayama, Itaru; Sato, Mitsuhisa; Igarashi, Jun; Diesmann, Markus; Kunkel, Susanne

    2018-01-01

    State-of-the-art software tools for neuronal network simulations scale to the largest computing systems available today and enable investigations of large-scale networks of up to 10 % of the human cortex at a resolution of individual neurons and synapses. Due to an upper limit on the number of incoming connections of a single neuron, network connectivity becomes extremely sparse at this scale. To manage computational costs, simulation software ultimately targeting the brain scale needs to fully exploit this sparsity. Here we present a two-tier connection infrastructure and a framework for directed communication among compute nodes accounting for the sparsity of brain-scale networks. We demonstrate the feasibility of this approach by implementing the technology in the NEST simulation code and we investigate its performance in different scaling scenarios of typical network simulations. Our results show that the new data structures and communication scheme prepare the simulation kernel for post-petascale high-performance computing facilities without sacrificing performance in smaller systems. PMID:29503613

  2. Extremely Scalable Spiking Neuronal Network Simulation Code: From Laptops to Exascale Computers.

    Science.gov (United States)

    Jordan, Jakob; Ippen, Tammo; Helias, Moritz; Kitayama, Itaru; Sato, Mitsuhisa; Igarashi, Jun; Diesmann, Markus; Kunkel, Susanne

    2018-01-01

    State-of-the-art software tools for neuronal network simulations scale to the largest computing systems available today and enable investigations of large-scale networks of up to 10 % of the human cortex at a resolution of individual neurons and synapses. Due to an upper limit on the number of incoming connections of a single neuron, network connectivity becomes extremely sparse at this scale. To manage computational costs, simulation software ultimately targeting the brain scale needs to fully exploit this sparsity. Here we present a two-tier connection infrastructure and a framework for directed communication among compute nodes accounting for the sparsity of brain-scale networks. We demonstrate the feasibility of this approach by implementing the technology in the NEST simulation code and we investigate its performance in different scaling scenarios of typical network simulations. Our results show that the new data structures and communication scheme prepare the simulation kernel for post-petascale high-performance computing facilities without sacrificing performance in smaller systems.

  3. Extremely Scalable Spiking Neuronal Network Simulation Code: From Laptops to Exascale Computers

    Directory of Open Access Journals (Sweden)

    Jakob Jordan

    2018-02-01

    Full Text Available State-of-the-art software tools for neuronal network simulations scale to the largest computing systems available today and enable investigations of large-scale networks of up to 10 % of the human cortex at a resolution of individual neurons and synapses. Due to an upper limit on the number of incoming connections of a single neuron, network connectivity becomes extremely sparse at this scale. To manage computational costs, simulation software ultimately targeting the brain scale needs to fully exploit this sparsity. Here we present a two-tier connection infrastructure and a framework for directed communication among compute nodes accounting for the sparsity of brain-scale networks. We demonstrate the feasibility of this approach by implementing the technology in the NEST simulation code and we investigate its performance in different scaling scenarios of typical network simulations. Our results show that the new data structures and communication scheme prepare the simulation kernel for post-petascale high-performance computing facilities without sacrificing performance in smaller systems.

  4. Computer simulation of fatigue under diametrical compression

    OpenAIRE

    Carmona, H. A.; Kun, F.; Andrade Jr., J. S.; Herrmann, H. J.

    2006-01-01

    We study the fatigue fracture of disordered materials by means of computer simulations of a discrete element model. We extend a two-dimensional fracture model to capture the microscopic mechanisms relevant for fatigue, and we simulate the diametric compression of a disc-shaped specimen under a constant external force. The model allows us to follow the development of the fracture process on the macro- and micro-level, varying the relative influence of the mechanisms of damage accumulation over the ...

  5. Interactive Heat Transfer Simulations for Everyone

    Science.gov (United States)

    Xie, Charles

    2012-01-01

    Heat transfer is widely taught in secondary Earth science and physics. Researchers have identified many misconceptions related to heat and temperature. These misconceptions primarily stem from hunches developed in everyday life (though the confusions in terminology often worsen them). Interactive computer simulations that visualize thermal energy,…

  6. Computer simulations and the changing face of scientific experimentation

    CERN Document Server

    Duran, Juan M

    2013-01-01

    Computer simulations have become a central tool for scientific practice. Their use has replaced, in many cases, standard experimental procedures. Not to mention cases where the target system is empirical but there are no techniques for direct manipulation of the system, such as astronomical observation. In these cases, computer simulations have proved to be of central importance. The question about their use and implementation, therefore, is not only a technical one but represents a challenge for the humanities as well. In this volume, scientists, historians, and philosophers joi

  7. A Computational Framework for Bioimaging Simulation

    Science.gov (United States)

    Watabe, Masaki; Arjunan, Satya N. V.; Fukushima, Seiya; Iwamoto, Kazunari; Kozuka, Jun; Matsuoka, Satomi; Shindo, Yuki; Ueda, Masahiro; Takahashi, Koichi

    2015-01-01

    Using bioimaging technology, biologists have attempted to identify and document analytical interpretations that underlie biological phenomena in biological cells. Theoretical biology aims at distilling those interpretations into knowledge in the mathematical form of biochemical reaction networks and understanding how higher level functions emerge from the combined action of biomolecules. However, there still remain formidable challenges in bridging the gap between bioimaging and mathematical modeling. Generally, measurements using fluorescence microscopy systems are influenced by systematic effects that arise from the stochastic nature of biological cells, the imaging apparatus, and optical physics. Such systematic effects are always present in all bioimaging systems and hinder quantitative comparison between the cell model and bioimages. Computational tools for such a comparison are still unavailable. Thus, in this work, we present a computational framework for handling the parameters of the cell models and the optical physics governing bioimaging systems. Simulation using this framework can generate digital images of cell simulation results after accounting for the systematic effects. We then demonstrate that such a framework enables comparison at the level of photon-counting units. PMID:26147508

  8. A Computational Framework for Bioimaging Simulation.

    Science.gov (United States)

    Watabe, Masaki; Arjunan, Satya N V; Fukushima, Seiya; Iwamoto, Kazunari; Kozuka, Jun; Matsuoka, Satomi; Shindo, Yuki; Ueda, Masahiro; Takahashi, Koichi

    2015-01-01

    Using bioimaging technology, biologists have attempted to identify and document analytical interpretations that underlie biological phenomena in biological cells. Theoretical biology aims at distilling those interpretations into knowledge in the mathematical form of biochemical reaction networks and understanding how higher level functions emerge from the combined action of biomolecules. However, there still remain formidable challenges in bridging the gap between bioimaging and mathematical modeling. Generally, measurements using fluorescence microscopy systems are influenced by systematic effects that arise from the stochastic nature of biological cells, the imaging apparatus, and optical physics. Such systematic effects are always present in all bioimaging systems and hinder quantitative comparison between the cell model and bioimages. Computational tools for such a comparison are still unavailable. Thus, in this work, we present a computational framework for handling the parameters of the cell models and the optical physics governing bioimaging systems. Simulation using this framework can generate digital images of cell simulation results after accounting for the systematic effects. We then demonstrate that such a framework enables comparison at the level of photon-counting units.

  9. A Computational Framework for Bioimaging Simulation.

    Directory of Open Access Journals (Sweden)

    Masaki Watabe

    Full Text Available Using bioimaging technology, biologists have attempted to identify and document analytical interpretations that underlie biological phenomena in biological cells. Theoretical biology aims at distilling those interpretations into knowledge in the mathematical form of biochemical reaction networks and understanding how higher level functions emerge from the combined action of biomolecules. However, there still remain formidable challenges in bridging the gap between bioimaging and mathematical modeling. Generally, measurements using fluorescence microscopy systems are influenced by systematic effects that arise from the stochastic nature of biological cells, the imaging apparatus, and optical physics. Such systematic effects are always present in all bioimaging systems and hinder quantitative comparison between the cell model and bioimages. Computational tools for such a comparison are still unavailable. Thus, in this work, we present a computational framework for handling the parameters of the cell models and the optical physics governing bioimaging systems. Simulation using this framework can generate digital images of cell simulation results after accounting for the systematic effects. We then demonstrate that such a framework enables comparison at the level of photon-counting units.
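    A minimal sketch of the kind of forward imaging step described above: a fluorophore density map from a cell simulation is blurred by a Gaussian point-spread function and converted to photon counts with Poisson shot noise. The parameter names and values are illustrative assumptions, not the framework's actual interface, and real systems include further effects (camera gain, read noise, detailed optics).

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def render_image(fluorophore_map, psf_sigma_px, photons_per_unit, background=2.0):
    """Turn a simulated fluorophore density map into a synthetic camera image.

    The map is convolved with a Gaussian PSF, scaled to expected photon
    numbers, offset by a background level, and sampled with Poisson noise,
    so the output is in photon-counting units.
    """
    expected = gaussian_filter(fluorophore_map * photons_per_unit, psf_sigma_px)
    expected += background
    return np.random.poisson(expected)

# Example: a single bright spot imaged with a 2-pixel-wide PSF
scene = np.zeros((64, 64))
scene[32, 32] = 50.0
image = render_image(scene, psf_sigma_px=2.0, photons_per_unit=100.0)
```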

  10. simulate_CAT: A Computer Program for Post-Hoc Simulation for Computerized Adaptive Testing

    Directory of Open Access Journals (Sweden)

    İlker Kalender

    2015-06-01

    Full Text Available This paper presents computer software developed by the author. The software conducts post-hoc simulations for computerized adaptive testing based on real responses of examinees to paper-and-pencil tests, under different parameters that can be defined by the user. In this paper, brief information is given about post-hoc simulations. After that, the working principle of the software is provided and a sample simulation with the required input files is shown. Finally, the output files are described.
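    The core loop of a post-hoc CAT simulation is straightforward: items are selected adaptively (here by maximum Fisher information under a 2PL model), but the answers are taken from the examinee's recorded paper-and-pencil responses rather than generated from a response model. The sketch below illustrates that loop; the item parameters, the Newton ability update, and all names are illustrative assumptions, not the simulate_CAT implementation.

```python
import numpy as np

def posthoc_cat(responses, a, b, theta0=0.0, n_items=10):
    """Post-hoc CAT simulation for one examinee under a 2PL model (sketch).

    responses: recorded 0/1 answers on the full paper-and-pencil test;
    a, b: item discrimination and difficulty parameters for the same items.
    Returns the final ability estimate and the administered item indices.
    """
    theta, administered = theta0, []
    for _ in range(n_items):
        p = 1.0 / (1.0 + np.exp(-a * (theta - b)))
        info = a ** 2 * p * (1.0 - p)               # Fisher information per item
        info[administered] = -np.inf                 # never reuse an item
        administered.append(int(np.argmax(info)))
        idx = np.array(administered)
        p_adm = 1.0 / (1.0 + np.exp(-a[idx] * (theta - b[idx])))
        grad = np.sum(a[idx] * (responses[idx] - p_adm))
        hess = -np.sum(a[idx] ** 2 * p_adm * (1.0 - p_adm))
        theta = float(np.clip(theta - grad / hess, -4.0, 4.0))  # Newton step
    return theta, administered

# Example: 30 items with known 2PL parameters and one examinee's recorded answers
rng = np.random.default_rng(0)
a = rng.uniform(0.8, 2.0, size=30)
b = rng.normal(0.0, 1.0, size=30)
responses = rng.integers(0, 2, size=30)
theta_hat, used_items = posthoc_cat(responses, a, b)
```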

  11. The challenge of quantum computer simulations of physical phenomena

    International Nuclear Information System (INIS)

    Ortiz, G.; Knill, E.; Gubernatis, J.E.

    2002-01-01

    The goal of physics simulation using controllable quantum systems ('physics imitation') is to exploit quantum laws to advantage, and thus accomplish efficient simulation of physical phenomena. In this Note, we discuss the fundamental concepts behind this paradigm of information processing, such as the connection between models of computation and physical systems. The experimental simulation of a toy quantum many-body problem is described

  12. High performance stream computing for particle beam transport simulations

    International Nuclear Information System (INIS)

    Appleby, R; Bailey, D; Higham, J; Salt, M

    2008-01-01

    Understanding modern particle accelerators requires simulating charged particle transport through the machine elements. These simulations can be very time consuming due to the large number of particles and the need to consider many turns of a circular machine. Stream computing offers an attractive way to dramatically improve the performance of such simulations by calculating the simultaneous transport of many particles using dedicated hardware. Modern Graphics Processing Units (GPUs) are powerful and affordable stream computing devices. The results of simulations of particle transport through the booster-to-storage-ring transfer line of the DIAMOND synchrotron light source using an NVidia GeForce 7900 GPU are compared to the standard transport code MAD. It is found that particle transport calculations are suitable for stream processing and large performance increases are possible. The accuracy and potential speed gains are compared and the prospects for future work in the area are discussed
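    The data-parallel pattern exploited here is that every particle is pushed through the same sequence of beamline elements independently, so each particle maps naturally to one stream element on the GPU. The sketch below shows that pattern on the CPU with linear 4x4 transfer matrices and NumPy standing in for the stream hardware; the element parameters are illustrative, not the DIAMOND booster-to-storage-ring line.

```python
import numpy as np

def transport(particles, elements):
    """Track an ensemble of particles through a beamline of linear elements.

    particles: (N, 4) phase-space coordinates (x, x', y, y');
    elements:  list of 4x4 transfer matrices (drifts, quadrupoles, ...).
    The whole ensemble is advanced with one matrix product per element.
    """
    for m in elements:
        particles = particles @ m.T
    return particles

# Example: a drift, a thin quadrupole, and another drift (illustrative values)
L, k = 1.0, 0.8
drift = np.array([[1.0, L, 0, 0], [0, 1, 0, 0], [0, 0, 1, L], [0, 0, 0, 1]])
quad = np.array([[1.0, 0, 0, 0], [-k, 1, 0, 0], [0, 0, 1, 0], [0, 0, k, 1]])
beam = np.random.normal(scale=1e-3, size=(100_000, 4))
beam_out = transport(beam, [drift, quad, drift])
```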

  13. Computer simulation of variform fuel assemblies using Dragon code

    International Nuclear Information System (INIS)

    Ju Haitao; Wu Hongchun; Yao Dong

    2005-01-01

    DRAGON is a cell code developed for the CANDU reactor by the Ecole Polytechnique de Montreal in Canada. Although DRAGON is mainly used to simulate the CANDU super-cell fuel assembly, it has the ability to simulate other fuel assembly geometries. However, apart from the CANDU reactor, only the NEACRP benchmark problem for the BWR lattice cell had been analyzed until now. We also need to develop the code to simulate variform fuel assemblies, especially for the design of advanced reactors. We validated that the cell code DRAGON is useful for simulating various kinds of fuel assembly by analyzing the rod-type fuel assembly of the PWR and the plate-type fuel assembly of the MTR. Some other kinds of geometry were also computed. Computational results show that DRAGON is able to analyze variform fuel assembly problems and that the precision is high. (authors)

  14. Teaching Computer Organization and Architecture Using Simulation and FPGA Applications

    OpenAIRE

    D. K.M. Al-Aubidy

    2007-01-01

    This paper presents the design concepts and realization of incorporating micro-operation simulation and FPGA implementation into a teaching tool for computer organization and architecture. This teaching tool helps computer engineering and computer science students to become familiar, in a practical way, with computer organization and architecture through the development of their own instruction sets, computer programming and interfacing experiments. A two-pass assembler has been designed and implemente...

  15. Simulation of the 23 July 2012 Extreme Space Weather Event: What if This Extremely Rare CME Was Earth Directed?

    Science.gov (United States)

    Ngwira, Chigomezyo M.; Pulkkinen, Antti; Mays, M. Leila; Kuznetsova, Maria M.; Galvin, A. B.; Simunac, Kristin; Baker, Daniel N.; Li, Xinlin; Zheng, Yihua; Glocer, Alex

    2013-01-01

    Extreme space weather events are known to cause adverse impacts on critical modern day technological infrastructure such as high-voltage electric power transmission grids. On 23 July 2012, NASA's Solar Terrestrial Relations Observatory-Ahead (STEREO-A) spacecraft observed in situ an extremely fast coronal mass ejection (CME) that traveled 0.96 astronomical units (approx. 1 AU) in about 19 h. Here we use the Space Weather Modeling Framework (SWMF) to perform a simulation of this rare CME. We consider STEREO-A in situ observations to represent the upstream L1 solar wind boundary conditions. The goal of this study is to examine what would have happened if this rare CME had been Earth-bound. Global SWMF-generated ground geomagnetic field perturbations are used to compute the simulated induced geoelectric field at specific ground-based active INTERMAGNET magnetometer sites. Simulation results show that while the modeled global SYM-H index, a high-resolution equivalent of the Dst index, was comparable to previously observed severe geomagnetic storms such as the Halloween 2003 storm, the 23 July CME would have produced some of the largest geomagnetically induced electric fields, making it very geoeffective. These results have important practical applications for risk management of electrical power grids.
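    The step from ground magnetic field perturbations to an induced geoelectric field is commonly done with the plane-wave method, in which a surface impedance Z(w) = sqrt(i*w*mu0/sigma) for an assumed uniform ground conductivity relates the horizontal E and B components in the frequency domain (Ex = Z*By/mu0, Ey = -Z*Bx/mu0). The sketch below shows that generic method; the conductivity value and function names are illustrative assumptions, not necessarily the scheme used in the study above.

```python
import numpy as np

MU0 = 4e-7 * np.pi

def geoelectric_field(bx, by, dt, sigma=1e-3):
    """Plane-wave estimate of the induced horizontal geoelectric field.

    bx, by: ground magnetic perturbations (tesla) sampled every dt seconds;
    sigma:  assumed uniform ground conductivity (S/m). Returns (ex, ey) in V/m.
    """
    n = len(bx)
    omega = 2.0 * np.pi * np.fft.rfftfreq(n, d=dt)
    z = np.sqrt(1j * omega * MU0 / sigma)        # surface impedance of a uniform earth
    ex = np.fft.irfft(z * np.fft.rfft(by) / MU0, n)
    ey = np.fft.irfft(-z * np.fft.rfft(bx) / MU0, n)
    return ex, ey

# Example: a 1 mHz, 500 nT sinusoidal disturbance in the north component
t = np.arange(0.0, 3600.0, 10.0)
bx = 500e-9 * np.sin(2 * np.pi * 1e-3 * t)
by = np.zeros_like(t)
ex, ey = geoelectric_field(bx, by, dt=10.0)      # ey carries the induced response
```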

  16. Simulation and Measurement of Through-the-Earth, Extremely Low-Frequency Signals Using Copper-Clad Steel Ground Rods

    OpenAIRE

    Damiano, Nicholas William; Yan, Lincan; Whisner, Bruce; Zhou, Chenming

    2017-01-01

    The underground mining environment can greatly affect radio signal propagation. Understanding how the earth affects signal propagation is key to evaluating communications systems used during a mine emergency. One such type of communication system is through-the-earth communication, which can utilize extremely low frequencies (ELF). This paper presents the simulation and measurement results of recent National Institute for Occupational Safety and Health (NIOSH) research aimed at investigating current injection...

  17. Impact of an equality constraint on the class-specific residual variances in regression mixtures: A Monte Carlo simulation study.

    Science.gov (United States)

    Kim, Minjung; Lamont, Andrea E; Jaki, Thomas; Feaster, Daniel; Howe, George; Van Horn, M Lee

    2016-06-01

    Regression mixture models are a novel approach to modeling the heterogeneous effects of predictors on an outcome. In the model-building process, residual variances are often disregarded and simplifying assumptions are made without thorough examination of the consequences. In this simulation study, we investigated the impact of an equality constraint on the residual variances across latent classes. We examined the consequences of constraining the residual variances on class enumeration (finding the true number of latent classes) and on the parameter estimates, under a number of different simulation conditions meant to reflect the types of heterogeneity likely to exist in applied analyses. The results showed that bias in class enumeration increased as the difference in residual variances between the classes increased. Also, an inappropriate equality constraint on the residual variances greatly affected the estimated class sizes and showed the potential to greatly affect the parameter estimates in each class. These results suggest that it is important to make assumptions about residual variances with care and to carefully report what assumptions are made.

  18. An introduction to computer simulation methods applications to physical systems

    CERN Document Server

    Gould, Harvey; Christian, Wolfgang

    2007-01-01

    Now in its third edition, this book teaches physical concepts using computer simulations. The text incorporates object-oriented programming techniques and encourages readers to develop good programming habits in the context of doing physics. Designed for readers at all levels, An Introduction to Computer Simulation Methods uses Java, currently the most popular programming language. Introduction, Tools for Doing Simulations, Simulating Particle Motion, Oscillatory Systems, Few-Body Problems: The Motion of the Planets, The Chaotic Motion of Dynamical Systems, Random Processes, The Dynamics of Many Particle Systems, Normal Modes and Waves, Electrodynamics, Numerical and Monte Carlo Methods, Percolation, Fractals and Kinetic Growth Models, Complex Systems, Monte Carlo Simulations of Thermal Systems, Quantum Systems, Visualization and Rigid Body Dynamics, Seeing in Special and General Relativity, Epilogue: The Unity of Physics. For all readers interested in developing programming habits in the context of doing phy...

  19. Computer Simulation of Global Profiles of Carbon Dioxide Using a Pulsed, 2-Micron, Coherent-Detection, Column-Content DIAL System

    Science.gov (United States)

    Kavaya, Michael J.; Singh, Upendra N.; Koch, Grady J.; Yu, Jirong; Frehlich, Rod G.

    2009-01-01

    We present preliminary results of computer simulations of the error in measuring carbon dioxide mixing ratio profiles from earth orbit. The simulated sensor is a pulsed, 2-micron, coherent-detection lidar alternately operating on at least two wavelengths. The simulated geometry is a nadir viewing lidar measuring the column content signal. Atmospheric absorption is modeled using FASCODE3P software with the HITRAN 2004 absorption line data base. Lidar shot accumulation is employed up to the horizontal resolution limit. Horizontal resolutions of 50, 100, and 200 km are shown. Assuming a 400 km spacecraft orbit, the horizontal resolutions correspond to measurement times of about 7, 14, and 28 s. We simulate laser pulse-pair repetition frequencies from 1 Hz to 100 kHz. The range of shot accumulation is 7 to 2.8 million pulse-pairs. The resultant error is shown as a function of horizontal resolution, laser pulse-pair repetition frequency, and laser pulse energy. The effect of different on and off pulse energies is explored. The results are compared to simulation results of others and to demonstrated 2-micron operating points at NASA Langley.

  20. Fast Simulation of Large-Scale Floods Based on GPU Parallel Computing

    Directory of Open Access Journals (Sweden)

    Qiang Liu

    2018-05-01

    Full Text Available Computing speed is a significant issue of large-scale flood simulations for real-time response to disaster prevention and mitigation. Even today, most large-scale flood simulations are generally run on supercomputers due to the massive amounts of data and computations necessary. In this work, a two-dimensional shallow water model based on an unstructured Godunov-type finite volume scheme was proposed for flood simulation. To realize a fast simulation of large-scale floods on a personal computer, a Graphics Processing Unit (GPU)-based, high-performance computing method using the OpenACC application was adopted to parallelize the shallow water model. An unstructured data management method was presented to control the data transfer between the GPU and CPU (Central Processing Unit) with minimum overhead, and then both computation and data were offloaded from the CPU to the GPU, which exploited the computational capability of the GPU as much as possible. The parallel model was validated using various benchmarks and real-world case studies. The results demonstrate that speed-ups of up to one order of magnitude can be achieved in comparison with the serial model. The proposed parallel model provides a fast and reliable tool with which to quickly assess flood hazards in large-scale areas and, thus, has a bright application prospect for dynamic inundation risk identification and disaster assessment.
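    The per-cell arithmetic of a Godunov-type shallow water update is the same for every cell, which is what makes the scheme a natural fit for OpenACC/GPU parallelization. A one-dimensional, first-order sketch of such an update is shown below (Rusanov interface flux, boundary cells simply held fixed, plain NumPy standing in for the accelerated loops; all values are illustrative).

```python
import numpy as np

G = 9.81

def shallow_water_step(h, hu, dx, dt):
    """One first-order finite-volume (Rusanov flux) update of 1-D shallow water.

    h: water depth, hu: unit-width discharge, both on a uniform grid.
    Boundary cells are left unchanged. Returns updated (h, hu).
    """
    u = hu / np.maximum(h, 1e-8)
    state = np.stack([h, hu]).astype(float)
    flux = np.stack([hu, hu * u + 0.5 * G * h ** 2])
    c = np.abs(u) + np.sqrt(G * h)                        # local wave speed
    a = np.maximum(c[:-1], c[1:])                         # interface wave speed
    f_iface = (0.5 * (flux[:, :-1] + flux[:, 1:])
               - 0.5 * a * (state[:, 1:] - state[:, :-1]))
    state[:, 1:-1] -= dt / dx * (f_iface[:, 1:] - f_iface[:, :-1])
    return state[0], state[1]

# Example: a small dam-break problem on 200 cells
h = np.where(np.arange(200) < 100, 2.0, 1.0)
hu = np.zeros_like(h)
for _ in range(100):
    h, hu = shallow_water_step(h, hu, dx=1.0, dt=0.05)
```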

  1. Refining Pragmatically-Appropriate Oral Communication via Computer-Simulated Conversations

    Science.gov (United States)

    Sydorenko, Tetyana; Daurio, Phoebe; Thorne, Steven L.

    2018-01-01

    To address the problem of limited opportunities for practicing second language speaking in interaction, especially delicate interactions requiring pragmatic competence, we describe computer simulations designed for the oral practice of extended pragmatic routines and report on the affordances of such simulations for learning pragmatically…

  2. Simulation of quantum computation : A deterministic event-based approach

    NARCIS (Netherlands)

    Michielsen, K; De Raedt, K; De Raedt, H

    We demonstrate that locally connected networks of machines that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of quantum computation. We present simulation results for basic quantum operations such as the Hadamard and the controlled-NOT gate, and

  3. Simulation of Quantum Computation : A Deterministic Event-Based Approach

    NARCIS (Netherlands)

    Michielsen, K.; Raedt, K. De; Raedt, H. De

    2005-01-01

    We demonstrate that locally connected networks of machines that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of quantum computation. We present simulation results for basic quantum operations such as the Hadamard and the controlled-NOT gate, and

  4. Computer Simulation of the Relationship between Selected Properties of PVD Coatings

    Directory of Open Access Journals (Sweden)

    Śliwa A.

    2016-06-01

    Full Text Available The possibility of applying the Finite Element Method to calculate the internal stresses which occur in Ti+TiN, Ti+Ti(CxN1-x) and Ti+TiC coatings obtained in the magnetron PVD process on the sintered high-speed steel of the PM HS6-5-3-8 type is presented. For the purpose of computer simulation of internal stresses in coatings with the use of FEM, a correct model of the analyzed specimens was worked out and then experimentally verified by comparing the calculation results with the results of the computer simulation. Accurate analysis of the correlations indicated an especially strong dependence between internal stresses and microhardness, and between microhardness and erosion resistance, which created the conditions for establishing the dependence between the internal stresses obtained from computer simulation and erosion resistance as a basic functional quality of the coating. This has essential practical meaning because it allows the expected erosion resistance of a coating to be estimated exclusively on the basis of computer simulation results for the parameters used in the coating manufacturing process.

  5. Comparison of real and computer-simulated outcomes of LASIK refractive surgery

    Science.gov (United States)

    Cano, Daniel; Barbero, Sergio; Marcos, Susana

    2004-06-01

    Computer simulations of alternative LASIK ablation patterns were performed for corneal elevation maps of 13 real myopic corneas (range of myopia, -2.0 to -11.5 D). The computationally simulated ablation patterns were designed with biconic surfaces (standard Munnerlyn pattern, parabolic pattern, and biconic pattern) or with aberrometry measurements (customized pattern). Simulated results were compared with real postoperative outcomes. Standard LASIK refractive surgery for myopia increased corneal asphericity and spherical aberration. Computations with the theoretical Munnerlyn ablation pattern did not increase the corneal asphericity and spherical aberration. The theoretical parabolic pattern induced a slight increase of asphericity and spherical aberration, explaining only 40% of the clinically found increase. The theoretical biconic pattern controlled corneal spherical aberration. Computations showed that the theoretical customized pattern can correct high-order asymmetric aberrations. Simulations of changes in efficiency due to reflection and nonnormal incidence of the laser light showed a further increase in corneal asphericity. Consideration of these effects with a parabolic pattern accounts for 70% of the clinical increase in asphericity.

  6. The problems of cosmic ray particle simulation for the near-Earth orbital and interplanetary flight conditions

    International Nuclear Information System (INIS)

    Nymmik, R.A.

    1999-01-01

    A wide range of galactic cosmic ray and SEP event flux simulation problems for near-Earth satellite and manned spacecraft orbits and for interplanetary mission trajectories is discussed. The models of the galactic cosmic ray and SEP event fluxes in Earth orbit beyond the Earth's magnetosphere are used as a basis. The particle fluxes in near-Earth orbits should be calculated using transmission functions. To calculate these functions, the dependences of the cutoff rigidities on the magnetic disturbance level and on magnetic local time have to be known. In the case of space flights towards the Sun and to the boundary of the solar system, particular attention is paid to the changes in SEP event occurrence frequency and size. The particle flux gradients are applied in this case to galactic cosmic ray fluxes.

  7. Use of computer simulations for the early introduction of nuclear engineering concepts

    International Nuclear Information System (INIS)

    Ougouag, A.M.; Zerguini, T.H.

    1985-01-01

    A sophomore-level nuclear engineering (NE) course is being introduced at the University of Illinois. Via computer simulations, this course presents materials covering the most important aspects of the field. It is noted that computer simulations in nuclear engineering are cheaper and safer than experiments, yet they provide an effective teaching tool for the early introduction of advanced concepts. The new course material can be used as a tutorial and for remedial learning. The use of computer simulation motivates learning since students associate computer activities with games. Such a course can help in the dissemination of the proper information to students from different fields, including the liberal arts, and eventually increase undergraduate student enrollment in nuclear engineering.

  8. UNH Data Cooperative: A Cyber Infrastructure for Earth System Studies

    Science.gov (United States)

    Braswell, B. H.; Fekete, B. M.; Prusevich, A.; Gliden, S.; Magill, A.; Vorosmarty, C. J.

    2007-12-01

    Earth system scientists and managers have a continuously growing demand for a wide array of earth observations derived from various data sources including (a) modern satellite retrievals, (b) "in-situ" records, (c) various simulation outputs, and (d) assimilated data products combining model results with observational records. The sheer quantity of data and the formatting inconsistencies make it difficult for users to take full advantage of this important information resource. Thus the system could benefit from a thorough retooling of our current data processing procedures and infrastructure. Emerging technologies, like OPeNDAP and OGC map services, open standard data formats (NetCDF, HDF), and data cataloging systems (NASA-Echo, Global Change Master Directory, etc.) are providing the basis for a new approach to data management and processing, where web services are increasingly designed to serve computer-to-computer communications without human interaction and complex analysis can be carried out over distributed computer resources interconnected via cyber infrastructure. The UNH Earth System Data Collaborative is designed to utilize the aforementioned emerging web technologies to offer new means of access to earth system data. While the UNH Data Collaborative serves a wide array of data ranging from weather station data (Climate Portal) to ocean buoy records and ship tracks (Portsmouth Harbor Initiative) to land cover characteristics, etc., the underlying data architecture shares common components for data mining and data dissemination via web services. Perhaps the most unique element of the UNH Data Cooperative's IT infrastructure is its prototype modeling environment for regional ecosystem surveillance over the Northeast corridor, which allows the integration of complex earth system model components with the Cooperative's data services. While the complexity of the IT infrastructure to perform complex computations is continuously increasing, scientists are often forced
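    One of the technologies listed above, OPeNDAP, lets a remote dataset be opened and subset as if it were a local file, so only the requested slice crosses the network. A minimal sketch of that access pattern with xarray follows; the URL and variable name are placeholders, not actual UNH Data Cooperative endpoints.

```python
import xarray as xr

# Hypothetical OPeNDAP endpoint; the URL and variable name are placeholders.
URL = "http://example.edu/opendap/climate/station_tavg.nc"

ds = xr.open_dataset(URL)                  # reads metadata only; data stay remote
tavg = ds["tavg"].sel(time=slice("2000-01-01", "2005-12-31"))
climatology = tavg.groupby("time.month").mean("time")
print(climatology.values)                  # only this subset is transferred
```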

  9. Comprehensive Simulation Lifecycle Management for High Performance Computing Modeling and Simulation, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — There are significant logistical barriers to entry-level high performance computing (HPC) modeling and simulation (M&S). IllinoisRocstar sets up the infrastructure for...

  10. Implementation of Grid-computing Framework for Simulation in Multi-scale Structural Analysis

    Directory of Open Access Journals (Sweden)

    Data Iranata

    2010-05-01

    Full Text Available A new grid-computing framework for simulation in multi-scale structural analysis is presented. Two levels of parallel processing will be involved in this framework: multiple local distributed computing environments connected by local network to form a grid-based cluster-to-cluster distributed computing environment. To successfully perform the simulation, a large-scale structural system task is decomposed into the simulations of a simplified global model and several detailed component models using various scales. These correlated multi-scale structural system tasks are distributed among clusters and connected together in a multi-level hierarchy and then coordinated over the internet. The software framework for supporting the multi-scale structural simulation approach is also presented. The program architecture design allows the integration of several multi-scale models as clients and servers under a single platform. To check its feasibility, a prototype software system has been designed and implemented to perform the proposed concept. The simulation results show that the software framework can increase the speedup performance of the structural analysis. Based on this result, the proposed grid-computing framework is suitable to perform the simulation of the multi-scale structural analysis.

  11. Plant Closings and Capital Flight: A Computer-Assisted Simulation.

    Science.gov (United States)

    Warner, Stanley; Breitbart, Myrna M.

    1989-01-01

    A course at Hampshire College was designed to simulate the decision-making environment in which constituencies in a medium-sized city would respond to the closing and relocation of a major corporate plant. The project, constructed as a role simulation with a computer component, is described. (MLW)

  12. SU-E-T-222: Computational Optimization of Monte Carlo Simulation On 4D Treatment Planning Using the Cloud Computing Technology

    International Nuclear Information System (INIS)

    Chow, J

    2015-01-01

    Purpose: This study evaluated the efficiency of 4D lung radiation treatment planning using Monte Carlo simulation on the cloud. The EGSnrc Monte Carlo code was used in dose calculation on the 4D-CT image set. Methods: The 4D lung radiation treatment plan was created by the DOSCTP linked to the cloud, based on the Amazon elastic compute cloud platform. Dose calculation was carried out by Monte Carlo simulation on the 4D-CT image set on the cloud, and results were sent to the FFD4D image deformation program for dose reconstruction. The dependence of the computing time for the treatment plan on the number of compute nodes was optimized with variations of the number of CT image sets in the breathing cycle and the dose reconstruction time of the FFD4D. Results: It is found that the dependence of computing time on the number of compute nodes was affected by the diminishing return of the number of nodes used in Monte Carlo simulation. Moreover, the performance of the 4D treatment planning could be optimized by using fewer than 10 compute nodes on the cloud. The effects of the number of image sets and dose reconstruction time on the dependence of computing time on the number of nodes were not significant when more than 15 compute nodes were used in Monte Carlo simulations. Conclusion: The issue of long computing time in 4D treatment planning, requiring Monte Carlo dose calculations in all CT image sets in the breathing cycle, can be solved using cloud computing technology. It is concluded that the optimized number of compute nodes selected in simulation should be between 5 and 15, as the dependence of computing time on the number of nodes is significant

  13. SU-E-T-222: Computational Optimization of Monte Carlo Simulation On 4D Treatment Planning Using the Cloud Computing Technology

    Energy Technology Data Exchange (ETDEWEB)

    Chow, J [Princess Margaret Cancer Center, Toronto, ON (Canada)

    2015-06-15

    Purpose: This study evaluated the efficiency of 4D lung radiation treatment planning using Monte Carlo simulation on the cloud. The EGSnrc Monte Carlo code was used in dose calculation on the 4D-CT image set. Methods: The 4D lung radiation treatment plan was created by the DOSCTP linked to the cloud, based on the Amazon elastic compute cloud platform. Dose calculation was carried out by Monte Carlo simulation on the 4D-CT image set on the cloud, and results were sent to the FFD4D image deformation program for dose reconstruction. The dependence of the computing time for the treatment plan on the number of compute nodes was optimized with variations of the number of CT image sets in the breathing cycle and the dose reconstruction time of the FFD4D. Results: It is found that the dependence of computing time on the number of compute nodes was affected by the diminishing return of the number of nodes used in Monte Carlo simulation. Moreover, the performance of the 4D treatment planning could be optimized by using fewer than 10 compute nodes on the cloud. The effects of the number of image sets and dose reconstruction time on the dependence of computing time on the number of nodes were not significant when more than 15 compute nodes were used in Monte Carlo simulations. Conclusion: The issue of long computing time in 4D treatment planning, requiring Monte Carlo dose calculations in all CT image sets in the breathing cycle, can be solved using cloud computing technology. It is concluded that the optimized number of compute nodes selected in simulation should be between 5 and 15, as the dependence of computing time on the number of nodes is significant.

  14. Computational plasticity algorithm for particle dynamics simulations

    Science.gov (United States)

    Krabbenhoft, K.; Lyamin, A. V.; Vignes, C.

    2018-01-01

    The problem of particle dynamics simulation is interpreted in the framework of computational plasticity leading to an algorithm which is mathematically indistinguishable from the common implicit scheme widely used in the finite element analysis of elastoplastic boundary value problems. This algorithm provides somewhat of a unification of two particle methods, the discrete element method and the contact dynamics method, which usually are thought of as being quite disparate. In particular, it is shown that the former appears as the special case where the time stepping is explicit while the use of implicit time stepping leads to the kind of schemes usually labelled contact dynamics methods. The framing of particle dynamics simulation within computational plasticity paves the way for new approaches similar (or identical) to those frequently employed in nonlinear finite element analysis. These include mixed implicit-explicit time stepping, dynamic relaxation and domain decomposition schemes.

  15. Functional requirements for design of the Space Ultrareliable Modular Computer (SUMC) system simulator

    Science.gov (United States)

    Curran, R. T.; Hornfeck, W. A.

    1972-01-01

    The functional requirements for the design of an interpretive simulator for the space ultrareliable modular computer (SUMC) are presented. A review of applicable existing computer simulations is included along with constraints on the SUMC simulator functional design. Input requirements, output requirements, and language requirements for the simulator are discussed in terms of a SUMC configuration which may vary according to the application.

  16. Topics in computer simulations of statistical systems

    International Nuclear Information System (INIS)

    Salvador, R.S.

    1987-01-01

    Several computer simulations studying a variety of topics in statistical mechanics and lattice gauge theories are performed. The first study describes a Monte Carlo simulation performed on Ising systems defined on Sierpinski carpets of dimensions between one and four. The critical coupling and the exponent γ are measured as a function of dimension. The Ising gauge theory in d = 4 − ε, for ε → 0⁺, is then studied by performing a Monte Carlo simulation for the theory defined on fractals. A high-statistics Monte Carlo simulation for the three-dimensional Ising model is presented for lattices of sizes 8³ to 44³. All the data obtained agree completely, within statistical errors, with the forms predicted by finite-size scaling. Finally, a method to estimate numerically the partition function of statistical systems is developed.
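
    For readers unfamiliar with the kind of simulation described, the following is a generic Metropolis Monte Carlo sweep for the three-dimensional Ising model on a small periodic lattice. It is not the thesis code; the lattice size, coupling and sweep count are chosen only for a quick demonstration.

```python
import numpy as np

# Generic Metropolis Monte Carlo for the 3D Ising model on a small periodic
# lattice; parameters are illustrative, not those of the cited thesis.
rng = np.random.default_rng(0)
L, beta, sweeps = 8, 0.2217, 200          # beta near the 3D critical coupling
spins = rng.choice([-1, 1], size=(L, L, L))

def sweep(spins, beta):
    n = spins.shape[0]
    for _ in range(spins.size):
        i, j, k = rng.integers(0, n, size=3)
        nn = (spins[(i+1) % n, j, k] + spins[(i-1) % n, j, k] +
              spins[i, (j+1) % n, k] + spins[i, (j-1) % n, k] +
              spins[i, j, (k+1) % n] + spins[i, j, (k-1) % n])
        dE = 2.0 * spins[i, j, k] * nn     # energy change of a single spin flip
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            spins[i, j, k] *= -1

mags = []
for s in range(sweeps):
    sweep(spins, beta)
    if s > 50:                             # crude equilibration cut
        mags.append(abs(spins.mean()))
print("mean |magnetization| per spin:", np.mean(mags))
```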

  17. A computer simulation model to compute the radiation transfer of mountainous regions

    Science.gov (United States)

    Li, Yuguang; Zhao, Feng; Song, Rui

    2011-11-01

    In mountainous regions, the radiometric signal recorded at the sensor depends on a number of factors such as sun angle, atmospheric conditions, surface cover type, and topography. In this paper, a computer simulation model of radiation transfer is designed and evaluated. This model implements Monte Carlo ray-tracing techniques and is specifically dedicated to the study of light propagation in mountainous regions. The radiative processes between sunlight and the objects within the mountainous region are realized by using forward Monte Carlo ray-tracing methods. The performance of the model is evaluated through detailed comparisons with the well-established 3D computer simulation model RGM (Radiosity-Graphics combined Model), based on the same scenes and identical spectral parameters, which show good agreement between the two models' results. Using the newly developed computer model, a series of typical mountainous scenes is generated to analyze the physical mechanism of mountainous radiation transfer. The results show that the effects of adjacent slopes are important for deep valleys and particularly affect shadowed pixels, and that the topographic effect needs to be considered in mountainous terrain before accurate inferences can be made from remotely sensed data.

  18. Comparison of interradicular distances and cortical bone thickness in Thai patients with class I and class II skeletal patterns using cone-beam computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Khumsarn, Nattida [Dental Division of Lamphun Hospital, Lamphun (Thailand); Patanaporn, Virush; Janhom, Apirum; Jotikasthira, Dhirawat [Faculty of Dentistry, Chiang Mai University, Chiang Mai (Thailand)

    2016-06-15

    This study evaluated and compared interradicular distances and cortical bone thickness in Thai patients with Class I and Class II skeletal patterns, using cone-beam computed tomography (CBCT). Pretreatment CBCT images of 24 Thai orthodontic patients with Class I and Class II skeletal patterns were included in the study. Three measurements were chosen for investigation: the mesiodistal distance between the roots, the width of the buccolingual alveolar process, and buccal cortical bone thickness. All distances were recorded at five different levels from the cementoenamel junction (CEJ). Descriptive statistical analysis and t-tests were performed, with the significance level for all tests set at p<0.05. Patients with a Class II skeletal pattern showed significantly greater maxillary mesiodistal distances (between the first and second premolars) and widths of the buccolingual alveolar process (between the first and second molars) than Class I skeletal pattern patients at 10 mm above the CEJ. The maxillary buccal cortical bone thicknesses between the second premolar and first molar at 8 mm above the CEJ in Class II patients were likewise significantly greater than in Class I patients. Patients with a Class I skeletal pattern showed significantly wider mandibular buccolingual alveolar processes than did Class II patients (between the first and second molars) at 4, 6, and 8 mm below the CEJ. In both the maxilla and mandible, the mesiodistal distances, the width of the buccolingual alveolar process, and buccal cortical bone thickness tended to increase from the CEJ to the apex in both Class I and Class II skeletal patterns.

  19. Comparison of interradicular distances and cortical bone thickness in Thai patients with class I and class II skeletal patterns using cone-beam computed tomography

    International Nuclear Information System (INIS)

    Khumsarn, Nattida; Patanaporn, Virush; Janhom, Apirum; Jotikasthira, Dhirawat

    2016-01-01

    This study evaluated and compared interradicular distances and cortical bone thickness in Thai patients with Class I and Class II skeletal patterns, using cone-beam computed tomography (CBCT). Pretreatment CBCT images of 24 Thai orthodontic patients with Class I and Class II skeletal patterns were included in the study. Three measurements were chosen for investigation: the mesiodistal distance between the roots, the width of the buccolingual alveolar process, and buccal cortical bone thickness. All distances were recorded at five different levels from the cementoenamel junction (CEJ). Descriptive statistical analysis and t-tests were performed, with the significance level for all tests set at p<0.05. Patients with a Class II skeletal pattern showed significantly greater maxillary mesiodistal distances (between the first and second premolars) and widths of the buccolingual alveolar process (between the first and second molars) than Class I skeletal pattern patients at 10 mm above the CEJ. The maxillary buccal cortical bone thicknesses between the second premolar and first molar at 8 mm above the CEJ in Class II patients were likewise significantly greater than in Class I patients. Patients with a Class I skeletal pattern showed significantly wider mandibular buccolingual alveolar processes than did Class II patients (between the first and second molars) at 4, 6, and 8 mm below the CEJ. In both the maxilla and mandible, the mesiodistal distances, the width of the buccolingual alveolar process, and buccal cortical bone thickness tended to increase from the CEJ to the apex in both Class I and Class II skeletal patterns

  20. High performance computer code for molecular dynamics simulations

    International Nuclear Information System (INIS)

    Levay, I.; Toekesi, K.

    2007-01-01

    Complete text of publication follows. Molecular dynamics (MD) simulation is a widely used technique for modeling complicated physical phenomena. Since 2005 we have been developing an MD simulation code for PC computers. The computer code is written in the C++ object-oriented programming language. The aim of our work is twofold: a) to develop a fast computer code for the study of the random walk of guest atoms in a Be crystal, and b) three-dimensional (3D) visualization of the particles' motion. In this case we mimic the motion of the guest atoms in the crystal (diffusion-type motion) and the motion of atoms in the crystal lattice (crystal deformation). Nowadays it is common to use graphics devices for computationally intensive problems. There are several ways to exploit this extreme processing performance, and programming these devices has never been easier. The CUDA (Compute Unified Device Architecture) introduced by the nVidia Corporation in 2007 is very useful for processor-hungry applications. A unified-architecture GPU includes 96-128 or more stream processors, giving a raw calculation performance of 576 GFLOPS, about ten times faster than the fastest dual-core CPU [Fig.1]. Our improved MD simulation software uses this new technology, which speeds up the code by a factor of 10 in the critical calculation segment. Although the GPU is a very powerful tool, it has a strongly parallel structure, which means that we have to create an algorithm that works on several processors without deadlock. Our code currently uses 256 threads and shared and constant on-chip memory instead of global memory, which is about 100 times slower. It is possible to implement the whole algorithm on the GPU, so the data do not need to be downloaded and uploaded at every iteration. For maximal throughput, every thread runs with the same instructions

  1. Discovery and dynamical characterization of the Amor-class asteroid 2012 XH16

    Science.gov (United States)

    Wlodarczyk, I.; Cernis, K.; Boyle, R. P.; Laugalys, V.

    2014-03-01

    The near-Earth asteroid belt is continuously replenished with material originally moving in Amor-class orbits. Here, the orbit of the dynamically interesting Amor-class asteroid 2012 XH16 is analysed. This asteroid was discovered with the Vatican Advanced Technology Telescope (VATT) at the Mt Graham International Observatory as part of an ongoing asteroid survey focused on astrometry and photometry. The orbit of the asteroid was computed using 66 observations (57 obtained with VATT and 9 from the Lunar and Planetary Laboratory-Spacewatch II project) to give a = 1.63 au, e = 0.36, i = 3.76°. The absolute magnitude of the asteroid is 22.3 which translates into a diameter in the range 104-231 m, assuming the average albedos of S-type and C-type asteroids, respectively. We have used the current orbit to study the future dynamical evolution of the asteroid under the perturbations of the planets and the Moon, relativistic effects, and the Yarkovsky force. Asteroid 2012 XH16 is locked close to the strong 1:2 mean motion resonance with the Earth. The object shows stable evolution and could survive in near-resonance for a relatively long period of time despite experiencing frequent close encounters with Mars. Moreover, results of our computations show that the asteroid 2012 XH16 can survive in the Amor region at most for about 200-400 Myr. The evolution is highly chaotic with a characteristic Lyapunov time of 245 yr. Jupiter is the main perturber but the effects of Saturn, Mars and the Earth-Moon system are also important. In particular, secular resonances with Saturn are significant.
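
    The quoted diameter range follows from the standard conversion between absolute magnitude and diameter, D(km) = (1329/√p_V)·10^(−H/5). The albedo values in the sketch below are typical assumed values for S- and C-type asteroids, used here only to show how the 104-231 m range arises.

```python
from math import sqrt

def diameter_km(H, albedo):
    """Standard conversion from absolute magnitude H and geometric albedo."""
    return 1329.0 / sqrt(albedo) * 10 ** (-H / 5.0)

H = 22.3
# The albedos below are commonly assumed values for S- and C-type asteroids,
# chosen to illustrate roughly how the quoted diameter range is obtained.
for label, pv in (("S-type", 0.20), ("C-type", 0.04)):
    print(f"{label}: ~{diameter_km(H, pv) * 1000:.0f} m")
```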

  2. An alternative methodology for planning computer class where teaching means are used

    Directory of Open Access Journals (Sweden)

    Maria del Carmen Carrillo Hernández

    2016-06-01

    Full Text Available The subject Informatics II is taught in the fourth year of the Informática-Labor Education teaching degree. One of its objectives is to develop in students the skills needed to plan and structure, independently, originally and creatively, a computer class in which the use of computer-based teaching means is the guiding element through which students acquire knowledge. Limitations in this regard have been identified in professional practice. With the aim of contributing to the development of these skills, the authors of this article propose an alternative methodology to guide teachers of this subject in leading the teaching-learning process so that the goals set by the program are met.

  3. School Physics Teacher Class Management, Laboratory Practice, Student Engagement, Critical Thinking, Cooperative Learning and Use of Simulations Effects on Student Performance

    Science.gov (United States)

    Riaz, Muhammad

    2015-01-01

    The purpose of this study was to examine how simulations in physics class, class management, laboratory practice, student engagement, critical thinking, cooperative learning, and use of simulations predicted the percentage of students achieving a grade point average of B or higher and their academic performance as reported by teachers in secondary…

  4. Batch Simulation of Rare Earths Extractive Separation by Di (2-Ethylhexyl) Phosphoric Acid and Tributylphosphate in Kerosene

    International Nuclear Information System (INIS)

    Kraikaew, Jarunee; Srinuttakul, Wanee

    2004-01-01

    Liquid-liquid extraction is applied to separate individual rare earths. In this research, 6-stage continuous countercurrent solvent extraction was simulated to extract rare earths from a rare earth nitrate solution obtained from monazite processing, in order to estimate the optimum operating conditions for pilot or industrial plants. The solvent (S) to feed (F) ratio (S/F) was varied from 1 to 3. The organic phases were 1.0 and 1.5 molar (M) di(2-ethylhexyl) phosphoric acid (D2EHPA) in kerosene; 50% tributylphosphate (TBP) in kerosene was used for comparison. It was found that D2EHPA was a good extracting agent for heavy rare earths, while TBP extracted both light and heavy rare earths well. For extraction with both TBP and D2EHPA, the extraction efficiencies at S/F = 2 and 3 differed only slightly, so S/F = 2 was selected for commercial operation
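
    A back-of-the-envelope way to see why S/F = 2 and S/F = 3 perform almost equally is the Kremser relation for N ideal countercurrent stages with a constant distribution ratio. The sketch below uses a hypothetical distribution ratio; it is not the study's simulation.

```python
def fraction_extracted(D, S_over_F, n_stages):
    """Kremser estimate of the extracted fraction for N ideal countercurrent stages.

    D         -- distribution ratio (organic/aqueous), hypothetical value
    S_over_F  -- solvent-to-feed flow ratio
    n_stages  -- number of ideal stages
    """
    E = D * S_over_F                      # extraction factor
    if abs(E - 1.0) < 1e-12:
        return n_stages / (n_stages + 1.0)
    return (E ** (n_stages + 1) - E) / (E ** (n_stages + 1) - 1.0)

# Six stages, as in the study; compare S/F = 1, 2, 3 for an assumed D = 1.5.
for sf in (1, 2, 3):
    print(f"S/F = {sf}: {fraction_extracted(1.5, sf, 6):.3f} extracted")
# Beyond S/F = 2 the extracted fraction barely changes, mirroring the
# "slight difference" between S/F = 2 and 3 reported above.
```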

  5. Advanced Simulation & Computing FY15 Implementation Plan Volume 2, Rev. 0.5

    Energy Technology Data Exchange (ETDEWEB)

    McCoy, Michel [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Archer, Bill [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Matzen, M. Keith [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2014-09-16

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. As the program approaches the end of its second decade, ASC is intently focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), quantify critical margins and uncertainties, and resolve increasingly difficult analyses needed for the SSP. Where possible, the program also enables the use of high-performance simulation and computing tools to address broader national security needs, such as foreign nuclear weapon assessments and counternuclear terrorism.

  6. Predicting Flow Reversals in a Computational Fluid Dynamics Simulated Thermosyphon Using Data Assimilation.

    Science.gov (United States)

    Reagan, Andrew J; Dubief, Yves; Dodds, Peter Sheridan; Danforth, Christopher M

    2016-01-01

    A thermal convection loop is an annular chamber filled with water, heated on the bottom half and cooled on the top half. With sufficiently large forcing of heat, the direction of fluid flow in the loop oscillates chaotically, dynamics analogous to the Earth's weather. As is the case for state-of-the-art weather models, we only observe the statistics over a small region of state space, making prediction difficult. To overcome this challenge, data assimilation (DA) methods, and specifically ensemble methods, use the computational model itself to estimate the uncertainty of the model to optimally combine these observations into an initial condition for predicting the future state. Here, we build and verify four distinct DA methods, and then, we perform a twin model experiment with the computational fluid dynamics simulation of the loop using the Ensemble Transform Kalman Filter (ETKF) to assimilate observations and predict flow reversals. We show that using adaptively shaped localized covariance outperforms static localized covariance with the ETKF, and allows for the use of fewer observations in predicting flow reversals. We also show that a Dynamic Mode Decomposition (DMD) of the temperature and velocity fields recovers the low dimensional system underlying reversals, finding specific modes which together are predictive of reversal direction.

  7. Predicting Flow Reversals in a Computational Fluid Dynamics Simulated Thermosyphon Using Data Assimilation.

    Directory of Open Access Journals (Sweden)

    Andrew J Reagan

    Full Text Available A thermal convection loop is an annular chamber filled with water, heated on the bottom half and cooled on the top half. With sufficiently large forcing of heat, the direction of fluid flow in the loop oscillates chaotically, dynamics analogous to the Earth's weather. As is the case for state-of-the-art weather models, we only observe the statistics over a small region of state space, making prediction difficult. To overcome this challenge, data assimilation (DA) methods, and specifically ensemble methods, use the computational model itself to estimate the uncertainty of the model to optimally combine these observations into an initial condition for predicting the future state. Here, we build and verify four distinct DA methods, and then, we perform a twin model experiment with the computational fluid dynamics simulation of the loop using the Ensemble Transform Kalman Filter (ETKF) to assimilate observations and predict flow reversals. We show that using adaptively shaped localized covariance outperforms static localized covariance with the ETKF, and allows for the use of fewer observations in predicting flow reversals. We also show that a Dynamic Mode Decomposition (DMD) of the temperature and velocity fields recovers the low dimensional system underlying reversals, finding specific modes which together are predictive of reversal direction.
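
    For reference, the core of an ETKF analysis step can be written in a few lines of linear algebra. The NumPy sketch below is a generic, unlocalized version (no covariance localization or inflation) applied to a tiny synthetic ensemble; it is not the authors' thermosyphon code.

```python
import numpy as np

def etkf_analysis(X, y, H, R):
    """One ETKF analysis step (no localization or inflation).

    X : (n_state, m)     ensemble of model states
    y : (n_obs,)         observation vector
    H : (n_obs, n_state) linear observation operator
    R : (n_obs, n_obs)   observation-error covariance
    """
    m = X.shape[1]
    x_mean = X.mean(axis=1, keepdims=True)
    Xp = X - x_mean                               # state perturbations
    Y = H @ X
    y_mean = Y.mean(axis=1, keepdims=True)
    Yp = Y - y_mean                               # observation-space perturbations
    Rinv = np.linalg.inv(R)
    C = (m - 1) * np.eye(m) + Yp.T @ Rinv @ Yp    # inverse analysis covariance in ensemble space
    evals, evecs = np.linalg.eigh(C)
    Cinv = evecs @ np.diag(1.0 / evals) @ evecs.T
    w_mean = Cinv @ Yp.T @ Rinv @ (y.reshape(-1, 1) - y_mean)
    W = evecs @ np.diag(np.sqrt((m - 1) / evals)) @ evecs.T   # symmetric square root
    return x_mean + Xp @ (w_mean + W)             # analysis ensemble

# Tiny synthetic check: 3-variable state, 2 observations, 10 members.
rng = np.random.default_rng(1)
X = rng.normal(size=(3, 10))
H = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
Xa = etkf_analysis(X, np.array([0.5, -0.2]), H, 0.1 * np.eye(2))
print(Xa.shape)   # (3, 10) analysis ensemble
```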

  8. Propagation Velocity of Solid Earth Tides

    Science.gov (United States)

    Pathak, S.

    2017-12-01

    One of the significant considerations in most geodetic investigations is to take into account the effect of solid Earth tides on station locations and the consequent impact on coordinate time series. In this research work, the propagation velocity of solid Earth tides between Indian stations is computed. Mean daily coordinates for the stations were computed by applying the static precise point positioning technique over one day. The computed coordinates were used as input for computing the tidal displacements at the stations by the gravity method along three directions at 1-minute intervals for 24 hours, and baseline distances were computed between four Indian stations. The propagation velocity of solid Earth tides can then be obtained by comparing the tidal effect at stations separated by a known baseline distance with the time the tide takes to travel from one station to the other. Knowing the propagation velocity makes it possible to estimate the spatial and temporal effects of solid Earth tides at any station from the effect observed at a known station over a specific time period. Since the tides are generated by the positions of celestial bodies relative to the rotating Earth, the study also examines the correlation of the propagation velocity with the Earth's rotation speed. The propagation velocity of solid Earth tides is found to be in the range of 440-470 m/s, in good agreement with the Earth's rotation speed.

  9. Computer simulation as an operational and training aid

    International Nuclear Information System (INIS)

    Lee, D.J.; Tottman-Trayner, E.

    1995-01-01

    The paper describes how the rapid development of desktop computing power, the associated fall in prices, and the advancement of computer graphics technology driven by the entertainment industry has enabled the nuclear industry to achieve improvements in operation and training through the use of computer simulation. Applications are focused on the fuel handling operations at Torness Power Station where visualization through computer modelling is being used to enhance operator awareness and to assist in a number of operational scenarios. It is concluded that there are significant benefits to be gained from the introduction of the facility at Torness as well as other locations. (author)

  10. Integration of adaptive process control with computational simulation for spin-forming

    International Nuclear Information System (INIS)

    Raboin, P. J. LLNL

    1998-01-01

    Improvements in spin-forming capabilities through upgrades to a metrology and machine control system and advances in numerical simulation techniques were studied in a two year project funded by Laboratory Directed Research and Development (LDRD) at Lawrence Livermore National Laboratory. Numerical analyses were benchmarked with spin-forming experiments and computational speeds increased sufficiently to now permit actual part forming simulations. Extensive modeling activities examined the simulation speeds and capabilities of several metal forming computer codes for modeling flat plate and cylindrical spin-forming geometries. Shape memory research created the first numerical model to describe this highly unusual deformation behavior in Uranium alloys. A spin-forming metrology assessment led to sensor and data acquisition improvements that will facilitate future process accuracy enhancements, such as a metrology frame. Finally, software improvements (SmartCAM) to the manufacturing process numerically integrate the part models to the spin-forming process and to computational simulations

  11. Surgical resource utilization in urban terrorist bombing: a computer simulation.

    Science.gov (United States)

    Hirshberg, A; Stein, M; Walden, R

    1999-09-01

    The objective of this study was to analyze the utilization of surgical staff and facilities during an urban terrorist bombing incident. A discrete-event computer model of the emergency room and related hospital facilities was constructed and implemented, based on cumulated data from 12 urban terrorist bombing incidents in Israel. The simulation predicts that the admitting capacity of the hospital depends primarily on the number of available surgeons and defines an optimal staff profile for surgeons, residents, and trauma nurses. The major bottlenecks in the flow of critical casualties are the shock rooms and the computed tomographic scanner but not the operating rooms. The simulation also defines the number of reinforcement staff needed to treat noncritical casualties and shows that radiology is the major obstacle to the flow of these patients. Computer simulation is an important new tool for the optimization of surgical service elements for a multiple-casualty situation.
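
    The discrete-event approach described above can be illustrated with a toy event queue in which casualties compete for a fixed pool of surgeons. The sketch below is a simplified stand-in with hypothetical arrival and treatment times, not the published hospital model.

```python
import heapq, random

# Toy discrete-event model of casualties competing for a fixed surgeon pool,
# in the spirit of the hospital model described; all rates are hypothetical.
random.seed(42)
N_SURGEONS, N_CASUALTIES = 4, 60
MEAN_GAP, MEAN_TREATMENT = 2.0, 25.0           # minutes

events = []                                    # heap of (time, kind, casualty_id)
t = 0.0
for cid in range(N_CASUALTIES):                # pre-generate the arrival stream
    t += random.expovariate(1.0 / MEAN_GAP)
    heapq.heappush(events, (t, "arrival", cid))

free_surgeons, waiting, waits = N_SURGEONS, [], {}
while events:
    now, kind, cid = heapq.heappop(events)
    if kind == "arrival":
        waiting.append((now, cid))
    else:                                      # "done": a surgeon frees up
        free_surgeons += 1
    while free_surgeons and waiting:           # start treatments when possible
        arrived, wid = waiting.pop(0)
        waits[wid] = now - arrived
        free_surgeons -= 1
        heapq.heappush(events, (now + random.expovariate(1.0 / MEAN_TREATMENT),
                                "done", wid))

print(f"mean wait for a surgeon: {sum(waits.values()) / len(waits):.1f} min")
```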

  12. [The research on bidirectional reflectance computer simulation of forest canopy at pixel scale].

    Science.gov (United States)

    Song, Jin-Ling; Wang, Jin-Di; Shuai, Yan-Min; Xiao, Zhi-Qiang

    2009-08-01

    Computer simulation uses computer graphics to generate a realistic 3D structural scene of vegetation and to simulate the canopy radiation regime using the radiosity method. In the present paper, the authors extend the computer simulation model to simulate forest canopy bidirectional reflectance at the pixel scale. Trees, however, are usually complex structures that are tall and have many branches, so hundreds of thousands or even millions of facets are needed to build a realistic structural scene of a forest, and it is difficult for the radiosity method to handle so many facets. To enable the radiosity method to simulate a forest scene at the pixel scale, the authors proposed simplifying the structure of forest crowns by abstracting the crowns as ellipsoids. Based on the optical characteristics of the tree components and the internal transport of photons in a real crown, they assigned the optical characteristics of the ellipsoid surface facets. In the computer simulation of the forest, following the idea of geometrical optics models, a gap model is used to obtain the forest canopy bidirectional reflectance at the pixel scale. Comparisons of the computer simulation results with the GOMS model and Multi-angle Imaging SpectroRadiometer (MISR) multi-angle remote sensing data show that the simulation results agree with the GOMS results and the MISR BRF, although some problems remain to be solved. The authors conclude that the study has important value for the application of multi-angle remote sensing and the inversion of vegetation canopy structure parameters.

  13. Computer simulations of the random barrier model

    DEFF Research Database (Denmark)

    Schrøder, Thomas; Dyre, Jeppe

    2002-01-01

    A brief review of experimental facts regarding ac electronic and ionic conduction in disordered solids is given followed by a discussion of what is perhaps the simplest realistic model, the random barrier model (symmetric hopping model). Results from large scale computer simulations are presented...

  14. Technology computer aided design simulation for VLSI MOSFET

    CERN Document Server

    Sarkar, Chandan Kumar

    2013-01-01

    Responding to recent developments and a growing VLSI circuit manufacturing market, Technology Computer Aided Design: Simulation for VLSI MOSFET examines advanced MOSFET processes and devices through TCAD numerical simulations. The book provides a balanced summary of TCAD and MOSFET basic concepts, equations, physics, and new technologies related to TCAD and MOSFET. A firm grasp of these concepts allows for the design of better models, thus streamlining the design process, saving time and money. This book places emphasis on the importance of modeling and simulations of VLSI MOS transistors and

  15. Computer simulations of the mechanical properties of metals

    DEFF Research Database (Denmark)

    Schiøtz, Jakob; Vegge, Tejs

    1999-01-01

    Atomic-scale computer simulations can be used to gain a better understanding of the mechanical properties of materials. In this paper we demonstrate how this can be done in the case of nanocrystalline copper, and give a brief overview of how simulations may be extended to larger length scales... Nanocrystalline metals are metals with grain sizes in the nanometre range; they have a number of technologically interesting properties such as much increased hardness and yield strength. Our simulations show that the deformation mechanisms are different in these materials than in coarse-grained materials...

  16. A Comparison of the Educational Effectiveness of Online versus In-Class Computer Literacy Courses

    Science.gov (United States)

    Heithecker, Julia Ann

    2013-01-01

    The purpose of this quantitative study was to compare the educational effectiveness of online versus in-class computer literacy courses, and examine the impact, if any, of student demographics (delimited to gender, age, work status, father and mother education, and enrollment status). Institutions are seeking ways to produce technologically…

  17. Event Based Simulator for Parallel Computing over the Wide Area Network for Real Time Visualization

    Science.gov (United States)

    Sundararajan, Elankovan; Harwood, Aaron; Kotagiri, Ramamohanarao; Satria Prabuwono, Anton

    As the computational requirements of applications in computational science continue to grow tremendously, the use of computational resources distributed across the Wide Area Network (WAN) becomes advantageous. However, not all applications can be executed over the WAN due to communication overhead that can drastically slow down the computation. In this paper, we introduce an event based simulator to investigate the performance of parallel algorithms executed over the WAN. The event based simulator known as SIMPAR (SIMulator for PARallel computation) simulates the actual computations and communications involved in parallel computation over the WAN using time stamps. Visualization of real-time applications requires a steady stream of processed data. Hence, SIMPAR may prove to be a valuable tool to investigate the types of applications and computing resource requirements needed to provide an uninterrupted flow of processed data for real-time visualization purposes. The results obtained from the simulation show concurrence with the expected performance using the L-BSP model.

  18. Optimizing Cognitive Load for Learning from Computer-Based Science Simulations

    Science.gov (United States)

    Lee, Hyunjeong; Plass, Jan L.; Homer, Bruce D.

    2006-01-01

    How can cognitive load in visual displays of computer simulations be optimized? Middle-school chemistry students (N = 257) learned with a simulation of the ideal gas law. Visual complexity was manipulated by separating the display of the simulations in two screens (low complexity) or presenting all information on one screen (high complexity). The…

  19. Definition, modeling and simulation of a grid computing system for high throughput computing

    CERN Document Server

    Caron, E; Tsaregorodtsev, A Yu

    2006-01-01

    In this paper, we study and compare grid and global computing systems and outline the benefits of having a hybrid system called dirac. To evaluate the dirac scheduling for high throughput computing, a new model is presented and a simulator was developed for many clusters of heterogeneous nodes belonging to a local network. These clusters are assumed to be connected to each other through a global network, and each cluster is managed via a local scheduler which is shared by many users. We validate our simulator by comparing the experimental and analytical results of an M/M/4 queuing system. Next, we compare with a real batch system and obtain an average error of 10.5% for the response time and 12% for the makespan. We conclude that the simulator is realistic and describes well the behaviour of a large-scale system. Thus we can study the scheduling of our system called dirac in a high throughput context. We justify our decentralized, adaptive and opportunistic approach in comparison to a centralize...
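
    The analytical side of such a validation is typically the Erlang C formula for an M/M/c queue. The sketch below evaluates it for four servers with hypothetical arrival and service rates; it is not the dirac simulator itself.

```python
from math import factorial

def erlang_c(c, lam, mu):
    """Probability of waiting and mean response time for an M/M/c queue."""
    rho = lam / (c * mu)                      # server utilization (must be < 1)
    a = lam / mu                              # offered load in Erlangs
    p0_inv = sum(a**k / factorial(k) for k in range(c)) \
             + a**c / (factorial(c) * (1 - rho))
    p_wait = (a**c / (factorial(c) * (1 - rho))) / p0_inv
    wq = p_wait / (c * mu - lam)              # mean time spent in the queue
    return p_wait, wq + 1.0 / mu              # response time = wait + service

# Hypothetical rates: 3 jobs/s arriving at 4 servers that each serve 1 job/s.
p_wait, resp = erlang_c(4, 3.0, 1.0)
print(f"P(wait) = {p_wait:.3f}, mean response time = {resp:.2f} s")
```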

  20. Separation of rare earths by means of acid organophosphorous compounds. Structure-activity study by molecular simulation

    International Nuclear Information System (INIS)

    Fourcot, Fabrice

    1991-01-01

    The increasing number of industrial applications of rare earths has resulted in an increased demand for purified rare earths, whereas their separation is difficult due to their high chemical similarity. The search for a better separation leads to the search for more selective extraction agents, and organophosphorous compounds appear to be the most selective. As the search for new extraction agents offering high lanthanide extraction efficiency or better selectivity between rare earths has been mainly empirical, this research thesis aims at developing a molecular simulation method which reduces the number of molecules that need to be synthesized and tested. After briefly recalling general knowledge on liquid-liquid extraction and on rare earths, and describing the calculation methods used (quantum methods, methods based on molecular mechanics, conformational analysis, methods of charge calculation), the author proposes a critical review of the literature on rare earth liquid-liquid extraction by organophosphorous acids, organized by extraction agent. The molecular modelling issue is then addressed by describing how it can be applied to extraction problems, the problems encountered, the solutions developed and the results obtained

  1. Sensitivity Analysis of Personal Exposure Assessment Using a Computer Simulated Person

    DEFF Research Database (Denmark)

    Brohus, Henrik; Jensen, H. K.

    2009-01-01

    The paper considers uncertainties related to personal exposure assessment using a computer simulated person. CFD is used to simulate a uniform flow field around a human being to determine the personal exposure to a contaminant source. For various vertical locations of a point contaminant source...... three additional factors are varied, namely the velocity, details of the computer simulated person, and the CFD model of the wind channel. The personal exposure is found to be highly dependent on the relative source location. Variation in the range of two orders of magnitude is found. The exposure...

  2. A hybrid three-class brain-computer interface system utilizing SSSEPs and transient ERPs

    Science.gov (United States)

    Breitwieser, Christian; Pokorny, Christoph; Müller-Putz, Gernot R.

    2016-12-01

    Objective. This paper investigates the fusion of steady-state somatosensory evoked potentials (SSSEPs) and transient event-related potentials (tERPs), evoked through tactile stimulation on the left and right-hand fingertips, in a three-class EEG-based hybrid brain-computer interface. It was hypothesized that fusing the input signals leads to higher classification rates than classifying tERP and SSSEP individually. Approach. Fourteen subjects participated in the studies, consisting of a screening paradigm to determine person-dependent resonance-like frequencies and a subsequent online paradigm. The whole setup of the BCI system was based on open interfaces, following suggestions for a common implementation platform. During the online experiment, subjects were instructed to focus their attention on the stimulated fingertips as indicated by a visual cue. The recorded data were classified during runtime using a multi-class shrinkage LDA classifier and the outputs were fused together applying a posterior probability based fusion. Data were further analyzed offline, involving a combined classification of SSSEP and tERP features as a second fusion principle. The final results were tested for statistical significance applying a repeated measures ANOVA. Main results. A significant classification increase was achieved when fusing the results with a combined classification compared to performing an individual classification. Furthermore, the SSSEP classifier was significantly better in detecting a non-control state, whereas the tERP classifier was significantly better in detecting control states. Subjects who had a higher relative band power increase during the screening session also achieved significantly higher classification results than subjects with lower relative band power increase. Significance. It could be shown that utilizing SSSEP and tERP for hBCIs increases the classification accuracy and also that tERP and SSSEP are not classifying control- and non
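
    As a rough sketch of the classification-and-fusion idea (a shrinkage LDA model per signal type, with posterior-probability fusion across types), the following uses scikit-learn on synthetic stand-in features; it does not reproduce the paper's EEG processing pipeline.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Sketch of the two-classifier idea: one shrinkage-LDA model for SSSEP-type
# features and one for tERP-type features, fused by multiplying class
# posteriors. The features are synthetic stand-ins, not real EEG.
rng = np.random.default_rng(0)
y = rng.integers(0, 3, size=300)                    # three classes, e.g. left / right / non-control
X_sssep = rng.normal(size=(300, 20)) + y[:, None]   # crude class-dependent shift
X_terp = rng.normal(size=(300, 30)) + 0.5 * y[:, None]

clf_sssep = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")
clf_terp = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")
clf_sssep.fit(X_sssep[:200], y[:200])
clf_terp.fit(X_terp[:200], y[:200])

# Posterior-probability fusion: multiply the two posteriors and renormalize.
post = clf_sssep.predict_proba(X_sssep[200:]) * clf_terp.predict_proba(X_terp[200:])
post /= post.sum(axis=1, keepdims=True)
print("fused accuracy on held-out synthetic data:",
      (post.argmax(axis=1) == y[200:]).mean())
```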

  3. Computer simulation of multiple dynamic photorefractive gratings

    DEFF Research Database (Denmark)

    Buchhave, Preben

    1998-01-01

    The benefits of a direct visualization of space-charge grating buildup are described. The visualization is carried out by a simple repetitive computer program, which simulates the basic processes in the band-transport model and displays the result graphically or in the form of numerical data. The...

  4. Computer Simulations to Support Science Instruction and Learning: A critical review of the literature

    Science.gov (United States)

    Smetana, Lara Kathleen; Bell, Randy L.

    2012-06-01

    Researchers have explored the effectiveness of computer simulations for supporting science teaching and learning during the past four decades. The purpose of this paper is to provide a comprehensive, critical review of the literature on the impact of computer simulations on science teaching and learning, with the goal of summarizing what is currently known and providing guidance for future research. We report on the outcomes of 61 empirical studies dealing with the efficacy of, and implications for, computer simulations in science instruction. The overall findings suggest that simulations can be as effective, and in many ways more effective, than traditional (i.e. lecture-based, textbook-based and/or physical hands-on) instructional practices in promoting science content knowledge, developing process skills, and facilitating conceptual change. As with any other educational tool, the effectiveness of computer simulations is dependent upon the ways in which they are used. Thus, we outline specific research-based guidelines for best practice. Computer simulations are most effective when they (a) are used as supplements; (b) incorporate high-quality support structures; (c) encourage student reflection; and (d) promote cognitive dissonance. Used appropriately, computer simulations involve students in inquiry-based, authentic science explorations. Additionally, as educational technologies continue to evolve, advantages such as flexibility, safety, and efficiency deserve attention.

  5. Development of a space radiation Monte Carlo computer simulation based on the FLUKA and ROOT codes

    CERN Document Server

    Pinsky, L; Ferrari, A; Sala, P; Carminati, F; Brun, R

    2001-01-01

    This NASA funded project is proceeding to develop a Monte Carlo-based computer simulation of the radiation environment in space. With actual funding only initially in place at the end of May 2000, the study is still in the early stage of development. The general tasks have been identified and personnel have been selected. The code to be assembled will be based upon two major existing software packages. The radiation transport simulation will be accomplished by updating the FLUKA Monte Carlo program, and the user interface will employ the ROOT software being developed at CERN. The end-product will be a Monte Carlo-based code which will complement the existing analytic codes such as BRYNTRN/HZETRN presently used by NASA to evaluate the effects of radiation shielding in space. The planned code will possess the ability to evaluate the radiation environment for spacecraft and habitats in Earth orbit, in interplanetary space, on the lunar surface, or on a planetary surface such as Mars. Furthermore, it will be usef...

  6. Adaptive scapula bone remodeling computational simulation: Relevance to regenerative medicine

    International Nuclear Information System (INIS)

    Sharma, Gulshan B.; Robertson, Douglas D.

    2013-01-01

    Shoulder arthroplasty success has been attributed to many factors including, bone quality, soft tissue balancing, surgeon experience, and implant design. Improved long-term success is primarily limited by glenoid implant loosening. Prosthesis design examines materials and shape and determines whether the design should withstand a lifetime of use. Finite element (FE) analyses have been extensively used to study stresses and strains produced in implants and bone. However, these static analyses only measure a moment in time and not the adaptive response to the altered environment produced by the therapeutic intervention. Computational analyses that integrate remodeling rules predict how bone will respond over time. Recent work has shown that subject-specific two- and three dimensional adaptive bone remodeling models are feasible and valid. Feasibility and validation were achieved computationally, simulating bone remodeling using an intact human scapula, initially resetting the scapular bone material properties to be uniform, numerically simulating sequential loading, and comparing the bone remodeling simulation results to the actual scapula’s material properties. Three-dimensional scapula FE bone model was created using volumetric computed tomography images. Muscle and joint load and boundary conditions were applied based on values reported in the literature. Internal bone remodeling was based on element strain-energy density. Initially, all bone elements were assigned a homogeneous density. All loads were applied for 10 iterations. After every iteration, each bone element’s remodeling stimulus was compared to its corresponding reference stimulus and its material properties modified. The simulation achieved convergence. At the end of the simulation the predicted and actual specimen bone apparent density were plotted and compared. Location of high and low predicted bone density was comparable to the actual specimen. High predicted bone density was greater than
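
    The iterative remodeling rule described above can be sketched generically: each element compares its strain-energy-density stimulus with a reference value and adjusts its density. The loop below uses random stand-in stimuli and hypothetical constants, not the authors' finite element model.

```python
import numpy as np

# Generic strain-energy-density remodeling loop in the spirit of the study:
# each "element" compares its stimulus U/rho to a reference value and adapts
# its density. Stimuli here are random stand-ins for FE results.
rng = np.random.default_rng(3)
n_elem, k_ref, rate = 500, 0.02, 0.5           # reference stimulus and rate are hypothetical
rho = np.full(n_elem, 1.0)                     # start from a homogeneous density (g/cm^3)
U = rng.uniform(0.005, 0.06, size=n_elem)      # per-element strain energy density (held fixed here)

for iteration in range(10):                    # ten load iterations, as in the abstract
    stimulus = U / rho
    rho += rate * (stimulus - k_ref)           # densify where overloaded, resorb where underloaded
    np.clip(rho, 0.01, 1.8, out=rho)           # keep densities within physiological bounds

print(f"density range after remodeling: {rho.min():.2f} - {rho.max():.2f} g/cm^3")
```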

  7. Petascale molecular dynamics simulation using the fast multipole method on K computer

    KAUST Repository

    Ohno, Yousuke; Yokota, Rio; Koyama, Hiroshi; Morimoto, Gentaro; Hasegawa, Aki; Masumoto, Gen; Okimoto, Noriaki; Hirano, Yoshinori; Ibeid, Huda; Narumi, Tetsu; Taiji, Makoto

    2014-01-01

    In this paper, we report all-atom simulations of molecular crowding - a result from the full node simulation on the "K computer", which is a 10-PFLOPS supercomputer in Japan. The capability of this machine enables us to perform simulation of crowded cellular environments, which are more realistic compared to conventional MD simulations where proteins are simulated in isolation. Living cells are "crowded" because macromolecules comprise ∼30% of their molecular weight. Recently, the effects of crowded cellular environments on protein stability have been revealed through in-cell NMR spectroscopy. To measure the performance of the "K computer", we performed all-atom classical molecular dynamics simulations of two systems: target proteins in a solvent, and target proteins in an environment of molecular crowders that mimic the conditions of a living cell. Using the full system, we achieved 4.4 PFLOPS during a 520 million-atom simulation with cutoff of 28 Å. Furthermore, we discuss the performance and scaling of fast multipole methods for molecular dynamics simulations on the "K computer", as well as comparisons with Ewald summation methods. © 2014 Elsevier B.V. All rights reserved.

  8. Petascale molecular dynamics simulation using the fast multipole method on K computer

    KAUST Repository

    Ohno, Yousuke

    2014-10-01

    In this paper, we report all-atom simulations of molecular crowding - a result from the full node simulation on the "K computer", which is a 10-PFLOPS supercomputer in Japan. The capability of this machine enables us to perform simulation of crowded cellular environments, which are more realistic compared to conventional MD simulations where proteins are simulated in isolation. Living cells are "crowded" because macromolecules comprise ∼30% of their molecular weight. Recently, the effects of crowded cellular environments on protein stability have been revealed through in-cell NMR spectroscopy. To measure the performance of the "K computer", we performed all-atom classical molecular dynamics simulations of two systems: target proteins in a solvent, and target proteins in an environment of molecular crowders that mimic the conditions of a living cell. Using the full system, we achieved 4.4 PFLOPS during a 520 million-atom simulation with cutoff of 28 Å. Furthermore, we discuss the performance and scaling of fast multipole methods for molecular dynamics simulations on the "K computer", as well as comparisons with Ewald summation methods. © 2014 Elsevier B.V. All rights reserved.

  9. Cloud Computing in Science and Engineering and the “SciShop.ru” Computer Simulation Center

    Directory of Open Access Journals (Sweden)

    E. V. Vorozhtsov

    2011-12-01

    Full Text Available Various aspects of cloud computing applications for scientific research, applied design, and remote education are described in this paper. An analysis of the different aspects is performed based on the experience from the “SciShop.ru” Computer Simulation Center. This analysis shows that cloud computing technology has wide prospects in scientific research applications, applied developments and also remote education of specialists, postgraduates, and students.

  10. Effects of Relativity Lead to 'Warp Speed' Computations

    International Nuclear Information System (INIS)

    Vay, J.-L.

    2007-01-01

    A scientist at Lawrence Berkeley National Laboratory has discovered that a previously unnoticed consequence of Einstein's special theory of relativity can lead to a speedup of computer calculations by orders of magnitude when applied to the computer modeling of a certain class of physical systems. This new finding offers the possibility of tackling some problems in a much shorter time and with far more precision than was possible before, as well as studying some configurations in every detail for the first time. The basis of Einstein's theory is the principle of relativity, which states that the laws of physics are the same for all observers, whether the 'observer' is a turtle 'racing' with a rabbit, or a beam of particles moving at near light speed. From the invariance of the laws of physics, one may be tempted to infer that the complexity of a system is independent of the motion of the observer, and consequently, a computer simulation will require the same number of mathematical operations, independently of the reference frame that is used for the calculation. Length contraction and time dilation are well-known consequences of the special theory of relativity which lead to very counterintuitive effects. An alien observing human activity through a telescope in a spaceship traveling in the vicinity of the earth near the speed of light would see everything flattened in the direction of propagation of its spaceship (for him, the earth would have the shape of a pancake), while all motions on earth would appear extremely slow, slowed almost to a standstill. Conversely, a space scientist observing the alien through a telescope based on earth would see a flattened alien, slowed almost to a standstill, in a flattened spaceship. Meanwhile, an astronaut sitting in a spaceship moving at some lower velocity than the alien spaceship with regard to earth might see both the alien spaceship and the earth flattened in the same proportion and the motion unfolding in each of them at the same

  11. SHIPBUILDING PRODUCTION PROCESS DESIGN METHODOLOGY USING COMPUTER SIMULATION

    OpenAIRE

    Marko Hadjina; Nikša Fafandjel; Tin Matulja

    2015-01-01

    In this research, a shipbuilding production process design methodology using computer simulation is suggested. The suggested methodology is expected to provide a better and more efficient tool for the design of complex shipbuilding production processes. In the first part of this research, existing practice for production process design in shipbuilding is discussed, and its shortcomings and problems are emphasized. Subsequently, the discrete event simulation modelling method, as the basis of the sugge...

  12. AFFECTIVE COMPUTING AND AUGMENTED REALITY FOR CAR DRIVING SIMULATORS

    Directory of Open Access Journals (Sweden)

    Dragoș Datcu

    2017-12-01

    Full Text Available Car simulators are essential for training and for analyzing the behavior, the responses and the performance of the driver. Augmented Reality (AR) is the technology that enables virtual images to be overlaid on views of the real world. Affective Computing (AC) is the technology that helps reading emotions by means of computer systems, by analyzing body gestures, facial expressions, speech and physiological signals. The key aspect of the research relies on investigating novel interfaces that help building situational awareness and emotional awareness, to enable affect-driven remote collaboration in AR for car driving simulators. The problem addressed relates to the question about how to build situational awareness (using AR technology) and emotional awareness (by AC technology), and how to integrate these two distinct technologies [4], into a unique affective framework for training, in a car driving simulator.

  13. A computational model to generate simulated three-dimensional breast masses

    Energy Technology Data Exchange (ETDEWEB)

    Sisternes, Luis de; Brankov, Jovan G.; Zysk, Adam M.; Wernick, Miles N., E-mail: wernick@iit.edu [Medical Imaging Research Center, Department of Electrical and Computer Engineering, Illinois Institute of Technology, Chicago, Illinois 60616 (United States); Schmidt, Robert A. [Kurt Rossmann Laboratories for Radiologic Image Research, Department of Radiology, The University of Chicago, Chicago, Illinois 60637 (United States); Nishikawa, Robert M. [Department of Radiology, University of Pittsburgh, Pittsburgh, Pennsylvania 15213 (United States)

    2015-02-15

    Purpose: To develop algorithms for creating realistic three-dimensional (3D) simulated breast masses and embedding them within actual clinical mammograms. The proposed techniques yield high-resolution simulated breast masses having randomized shapes, with user-defined mass type, size, location, and shape characteristics. Methods: The authors describe a method of producing 3D digital simulations of breast masses and a technique for embedding these simulated masses within actual digitized mammograms. Simulated 3D breast masses were generated by using a modified stochastic Gaussian random sphere model to generate a central tumor mass, and an iterative fractal branching algorithm to add complex spicule structures. The simulated masses were embedded within actual digitized mammograms. The authors evaluated the realism of the resulting hybrid phantoms by generating corresponding left- and right-breast image pairs, consisting of one breast image containing a real mass, and the opposite breast image of the same patient containing a similar simulated mass. The authors then used computer-aided diagnosis (CAD) methods and expert radiologist readers to determine whether significant differences can be observed between the real and hybrid images. Results: The authors found no statistically significant difference between the CAD features obtained from the real and simulated images of masses with either spiculated or nonspiculated margins. Likewise, the authors found that expert human readers performed very poorly in discriminating their hybrid images from real mammograms. Conclusions: The authors’ proposed method permits the realistic simulation of 3D breast masses having user-defined characteristics, enabling the creation of a large set of hybrid breast images containing a well-characterized mass, embedded within real breast background. The computational nature of the model makes it suitable for detectability studies, evaluation of computer aided diagnosis algorithms, and

  14. A computational model to generate simulated three-dimensional breast masses

    International Nuclear Information System (INIS)

    Sisternes, Luis de; Brankov, Jovan G.; Zysk, Adam M.; Wernick, Miles N.; Schmidt, Robert A.; Nishikawa, Robert M.

    2015-01-01

    Purpose: To develop algorithms for creating realistic three-dimensional (3D) simulated breast masses and embedding them within actual clinical mammograms. The proposed techniques yield high-resolution simulated breast masses having randomized shapes, with user-defined mass type, size, location, and shape characteristics. Methods: The authors describe a method of producing 3D digital simulations of breast masses and a technique for embedding these simulated masses within actual digitized mammograms. Simulated 3D breast masses were generated by using a modified stochastic Gaussian random sphere model to generate a central tumor mass, and an iterative fractal branching algorithm to add complex spicule structures. The simulated masses were embedded within actual digitized mammograms. The authors evaluated the realism of the resulting hybrid phantoms by generating corresponding left- and right-breast image pairs, consisting of one breast image containing a real mass, and the opposite breast image of the same patient containing a similar simulated mass. The authors then used computer-aided diagnosis (CAD) methods and expert radiologist readers to determine whether significant differences can be observed between the real and hybrid images. Results: The authors found no statistically significant difference between the CAD features obtained from the real and simulated images of masses with either spiculated or nonspiculated margins. Likewise, the authors found that expert human readers performed very poorly in discriminating their hybrid images from real mammograms. Conclusions: The authors’ proposed method permits the realistic simulation of 3D breast masses having user-defined characteristics, enabling the creation of a large set of hybrid breast images containing a well-characterized mass, embedded within real breast background. The computational nature of the model makes it suitable for detectability studies, evaluation of computer aided diagnosis algorithms, and

  15. Neurosurgical simulation by interactive computer graphics on iPad.

    Science.gov (United States)

    Maruyama, Keisuke; Kin, Taichi; Saito, Toki; Suematsu, Shinya; Gomyo, Miho; Noguchi, Akio; Nagane, Motoo; Shiokawa, Yoshiaki

    2014-11-01

    Presurgical simulation before complicated neurosurgery is a state-of-the-art technique, and its usefulness has recently become well known. However, simulation requires complex image processing, which hinders its widespread application. We explored handling the results of interactive computer graphics on the iPad tablet, which can easily be controlled anywhere. Data from preneurosurgical simulations from 12 patients (4 men, 8 women) who underwent complex brain surgery were loaded onto an iPad. First, DICOM data were loaded using Amira visualization software to create interactive computer graphics, and ParaView, another free visualization software package, was used to convert the results of the simulation to be loaded using the free iPad software KiwiViewer. The interactive computer graphics created prior to neurosurgery were successfully displayed and smoothly controlled on the iPad in all patients. The number of elements ranged from 3 to 13 (mean 7). The mean original data size was 233 MB, which was reduced to 10.4 MB (4.4% of original size) after image processing by ParaView. This was increased to 46.6 MB (19.9%) after decompression in KiwiViewer. Controlling the magnification, transfer, rotation, and selection of translucence in 10 levels of each element were smoothly and easily performed using one or two fingers. The requisite skill to smoothly control the iPad software was acquired within 1.8 trials on average in 12 medical students and 6 neurosurgical residents. Using an iPad to handle the result of preneurosurgical simulation was extremely useful because it could easily be handled anywhere.

  16. The null-event method in computer simulation

    International Nuclear Information System (INIS)

    Lin, S.L.

    1978-01-01

    The simulation of collisions of ions moving under the influence of an external field through a neutral gas at non-zero temperature is discussed as an example of computer models of processes in which a probe particle undergoes a series of interactions with an ensemble of other particles, such that the frequency and outcome of the events depend on internal properties of the second particles. The introduction of null events removes the need for much complicated algebra, leads to a more efficient simulation and reduces the likelihood of logical error. (Auth.)
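
    The null-event (null-collision) idea can be shown in a few lines: candidate collision times are drawn with a constant majorant frequency, and each candidate is accepted as a real collision with probability ν(v)/ν_max, otherwise it is a null event that changes nothing. The collision frequency below is a made-up illustrative function, not one from the paper.

```python
import random
random.seed(7)

# Null-event (null-collision) sampling: draw candidate collision times with a
# constant majorant frequency NU_MAX, then accept each candidate with
# probability nu(v)/NU_MAX; rejected candidates are "null events".
NU_MAX = 5.0                                   # majorant of nu(v) over the velocities of interest

def nu(v):
    return 1.0 + 0.5 * abs(v)                  # hypothetical velocity-dependent collision frequency

def time_to_real_collision(v):
    t = 0.0
    while True:
        t += random.expovariate(NU_MAX)        # candidate event time
        if random.random() < nu(v) / NU_MAX:   # accept -> real collision
            return t
        # otherwise: null event, the particle keeps flying unchanged

samples = [time_to_real_collision(v=2.0) for _ in range(10000)]
print("mean free time at v=2:", sum(samples) / len(samples))  # ~1/nu(2) = 0.5
```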

  17. Computational fluid dynamics for sport simulation

    CERN Document Server

    2009-01-01

    All over the world sport plays a prominent role in society: as a leisure activity for many, as an ingredient of culture, as a business, and as a matter of national prestige in such major events as the World Cup in soccer or the Olympic Games. Hence, it is not surprising that science has entered the realm of sports and, in particular, that computer simulation has become highly relevant in recent years. This book explores that development through five different sports chosen as examples, demonstrating that computational science and engineering (CSE) can make essential contributions to research on sports topics, both at the fundamental level and, eventually, in supporting athletes’ performance.

  18. Computer simulation of spacecraft/environment interaction

    International Nuclear Information System (INIS)

    Krupnikov, K.K.; Makletsov, A.A.; Mileev, V.N.; Novikov, L.S.; Sinolits, V.V.

    1999-01-01

    This report presents some examples of computer simulation of spacecraft interaction with the space environment. We analysed a set of data on electron and ion fluxes measured in 1991-1994 on the geostationary satellite GORIZONT-35. The influence on spacecraft charging of spacecraft eclipse and of device shadowing by the solar-cell panel was investigated. A simple method was developed for estimating spacecraft potentials in LEO. The effects of various particle fluxes and of spacecraft orientation are discussed. A computer engineering model for calculating the space radiation environment is presented. The model is implemented as a client/server application with a WWW interface, in which the spacecraft model description and the presentation of results are based on the Virtual Reality Modeling Language (VRML).
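
    As an illustration of the kind of simple LEO potential estimate mentioned above, the Python sketch below balances the ram ion current against the thermal electron current collected by a spherical body and solves for the floating potential. The plasma density, electron temperature, and orbital speed are typical assumed LEO values, not quantities taken from the GORIZONT-35 measurements (which are geostationary), and the geometry is idealised.

        # Hedged sketch of a current-balance floating-potential estimate in LEO.
        import numpy as np

        E_CHARGE = 1.602e-19     # elementary charge [C]
        M_E      = 9.109e-31     # electron mass [kg]

        def floating_potential(n_e=1e11, t_e_ev=0.15, v_orb=7.5e3, radius=1.0):
            """Potential [V] at which electron and ram ion currents to a sphere balance."""
            kT_e = t_e_ev * E_CHARGE
            area_total = 4.0 * np.pi * radius**2       # electron collection area
            area_ram   = np.pi * radius**2             # ram-facing cross-section

            # Random thermal electron flux to a surface: n * sqrt(kT / (2*pi*m)).
            i_e0 = E_CHARGE * n_e * area_total * np.sqrt(kT_e / (2.0 * np.pi * M_E))
            # Ram ion current: ions swept up at the orbital speed.
            i_i = E_CHARGE * n_e * area_ram * v_orb

            # For a negatively charged body, I_e = I_e0 * exp(e*phi / kT_e); set I_e = I_i.
            return (kT_e / E_CHARGE) * np.log(i_i / i_e0)

        print(f"estimated floating potential: {floating_potential():.2f} V")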

  19. Computer simulation of spacecraft/environment interaction

    CERN Document Server

    Krupnikov, K K; Mileev, V N; Novikov, L S; Sinolits, V V

    1999-01-01

    This report presents some examples of computer simulation of spacecraft interaction with the space environment. We analysed a set of data on electron and ion fluxes measured in 1991-1994 on the geostationary satellite GORIZONT-35. The influence on spacecraft charging of spacecraft eclipse and of device shadowing by the solar-cell panel was investigated. A simple method was developed for estimating spacecraft potentials in LEO. The effects of various particle fluxes and of spacecraft orientation are discussed. A computer engineering model for calculating the space radiation environment is presented. The model is implemented as a client/server application with a WWW interface, in which the spacecraft model description and the presentation of results are based on the Virtual Reality Modeling Language (VRML).

  20. Numerical simulation of the subsolar magnetopause current layer in the sun-earth meridian plane

    Science.gov (United States)

    Okuda, H.

    1993-01-01

    The formation and stability of the magnetopause current layer near the subsolar point in the sun-earth meridian plane are examined using a 2D electromagnetic particle simulation. For the case of zero IMF, the simulation results show that the current layer remains stable and is essentially the same as in the 1D simulation. The width of the current layer is given by the electron-ion hybrid gyroradius, which is much smaller than the ion gyroradius. The current layer is found to remain stable for northward IMF as well. As in the 1D simulation, the jump in the magnetic field across the current layer for northward IMF remains small. For southward IMF, collisionless magnetic reconnection is found to develop, leading to the formation of magnetic islands and density peaking within the current layer.
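
    The scale separation quoted above can be checked with a short calculation: taking the electron-ion hybrid gyroradius as the geometric mean sqrt(rho_e * rho_i) of the thermal gyroradii, it falls well below the ion gyroradius. The Python sketch below uses assumed magnetosheath-like values for the field strength and temperatures, not parameters taken from the simulation itself.

        # Quick comparison of gyroradii at the subsolar magnetopause (assumed values).
        import numpy as np

        E_CHARGE = 1.602e-19     # elementary charge [C]
        M_E      = 9.109e-31     # electron mass [kg]
        M_I      = 1.673e-27     # proton mass [kg]

        def thermal_gyroradius(mass, temp_ev, b_tesla):
            """Gyroradius [m] of a particle moving at its thermal speed."""
            v_th = np.sqrt(temp_ev * E_CHARGE / mass)
            return mass * v_th / (E_CHARGE * b_tesla)

        B   = 40e-9    # magnetic field near the subsolar magnetopause [T] (assumed)
        T_E = 50.0     # electron temperature [eV] (assumed)
        T_I = 200.0    # ion temperature [eV] (assumed)

        rho_e = thermal_gyroradius(M_E, T_E, B)
        rho_i = thermal_gyroradius(M_I, T_I, B)
        rho_hybrid = np.sqrt(rho_e * rho_i)   # geometric mean of the two gyroradii

        print(f"electron gyroradius   : {rho_e / 1e3:6.2f} km")
        print(f"ion gyroradius        : {rho_i / 1e3:6.2f} km")
        print(f"electron-ion hybrid   : {rho_hybrid / 1e3:6.2f} km")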