WorldWideScience

Sample records for argonne simulation framework

  1. MONARC Simulation Framework

    CERN Document Server

    Dobre, Ciprian

    2011-01-01

    This paper discusses the latest generation of the MONARC (MOdels of Networked Analysis at Regional Centers) simulation framework, as a design and modelling tool for large scale distributed systems applied to HEP experiments. A process-oriented approach for discrete event simulation is well-suited for describing concurrent running programs, as well as the stochastic arrival patterns that characterize how such systems are used. The simulation engine is based on Threaded Objects (or Active Objects), which offer great flexibility in simulating the complex behavior of distributed data processing programs. The engine provides an appropriate scheduling mechanism for the Active objects with support for interrupts. This approach offers a natural way of describing complex running programs that are data dependent and which concurrently compete for shared resources as well as large numbers of concurrent data transfers on shared resources. The framework provides a complete set of basic components (processing nodes, data s...
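    The engine style described here can be sketched in a few lines (a toy illustration, not MONARC code): cooperative "active objects" implemented as Python generators yield hold times to a central event scheduler.

```python
import heapq

class Engine:
    """Minimal process-oriented discrete event engine."""
    def __init__(self):
        self.now = 0.0
        self._queue = []   # (time, seq, process); seq breaks ties
        self._seq = 0

    def schedule(self, delay, process):
        heapq.heappush(self._queue, (self.now + delay, self._seq, process))
        self._seq += 1

    def start(self, process):
        # Prime the generator; it yields its first hold time.
        self.schedule(next(process), process)

    def run(self):
        while self._queue:
            self.now, _, proc = heapq.heappop(self._queue)
            try:
                delay = proc.send(self.now)   # resume; process yields next hold
                self.schedule(delay, proc)
            except StopIteration:
                pass

log = []

def job(name, service_time):
    t = yield service_time    # hold for the service time
    log.append((t, name))

eng = Engine()
eng.start(job("transfer", 2.0))
eng.start(job("compute", 1.0))
eng.run()
```

A production engine of this kind adds interrupt support (cancelling or rescheduling a held process), which is the feature the abstract highlights for modelling data-dependent, resource-sharing programs.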

  2. A Simulation Framework for Evaluating Mobile Robots

    Science.gov (United States)

    2002-08-01

    A Simulation Framework for Evaluating Mobile Robots. Stephen Balakirsky and Elena Messina, National Institute of Standards and Technology. Keywords: simulation, architectures, 4D/RCS, mobile robots, algorithm validation. Dates covered: 00-00-2002 to 00-00-2002.

  3. Flexible Residential Smart Grid Simulation Framework

    Science.gov (United States)

    Xiang, Wang

    Different scheduling and coordination algorithms controlling household appliances' operations can potentially lead to energy consumption reduction and/or load balancing in conjunction with different electricity pricing methods used in smart grid programs. In order to easily implement different algorithms and evaluate their efficiency against other ideas, a flexible simulation framework is desirable in both research and business fields. However, such a platform is currently lacking or underdeveloped. In this thesis, we provide a simulation framework to focus on demand side residential energy consumption coordination in response to different pricing methods. This simulation framework, equipped with an appliance consumption library using realistic values, aims to closely represent the average usage of different types of appliances. The simulation results of traditional usage yield close matching values compared to surveyed real life consumption records. Several sample coordination algorithms, pricing schemes, and communication scenarios are also implemented to illustrate the use of the simulation framework.
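    The kind of price-responsive coordination the thesis evaluates can be illustrated with a minimal sketch (all tariff prices, loads, and hours below are invented): compute an appliance cycle's cost under a time-of-use tariff, then shift its start to the cheapest feasible hour.

```python
# Hypothetical time-of-use tariff: cheap overnight, peak during the day.
prices = [0.10] * 7 + [0.25] * 12 + [0.10] * 5    # $/kWh for hours 0..23

def cycle_cost(start_hour, load_kwh):
    """Cost of running an appliance cycle starting at a given hour."""
    return sum(prices[(start_hour + i) % 24] * kwh
               for i, kwh in enumerate(load_kwh))

dishwasher = [1.2, 0.8]                  # kWh drawn in each hour of its cycle
naive = cycle_cost(17, dishwasher)       # early-evening start, peak price
best_start = min(range(24), key=lambda h: cycle_cost(h, dishwasher))
shifted = cycle_cost(best_start, dishwasher)
```

Coordination algorithms in the framework refine exactly this step: they decide `best_start` jointly across many households so the cheap hours do not all fill at once.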

  4. Framework for utilizing computational devices within simulation

    Directory of Open Access Journals (Sweden)

    Miroslav Mintál

    2013-12-01

    Full Text Available Several frameworks now exist for harnessing the computational power of graphics cards and other devices such as FPGAs, ARM and multi-core processors. The best-known ones are either low-level, requiring a great deal of controlling code, or are bound to particular graphics cards. More specialized frameworks also exist, mainly aimed at mathematical applications. The framework described here is tailored for use in multi-agent simulations, where it offers the option to accelerate computations both when preparing a simulation and, chiefly, during the run of the simulation itself.
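    The core idea of such a device-utilization framework (dispatch one computation to whichever backend is available) can be reduced to a miniature sketch; the backend names and the `saxpy` kernel are hypothetical, not the framework's actual API.

```python
# Pure-Python fallback kernel: y' = a*x + y elementwise.
def _python_saxpy(a, x, y):
    return [a * xi + yi for xi, yi in zip(x, y)]

backends = {"python": _python_saxpy}
try:                                   # optional accelerated backend
    import numpy as np
    backends["numpy"] = lambda a, x, y: (a * np.asarray(x) + np.asarray(y)).tolist()
except ImportError:
    pass

def saxpy(a, x, y, prefer=("numpy", "python")):
    """Dispatch to the first available backend in preference order."""
    for name in prefer:
        if name in backends:
            return backends[name](a, x, y)

result = saxpy(2.0, [1.0, 2.0], [10.0, 10.0])
```

Real frameworks of this kind (OpenCL-style) also manage device memory and kernel compilation; the dispatch pattern is the part that carries over.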

  5. Development of the ATLAS Simulation Framework

    Institute of Scientific and Technical Information of China (English)

    A. Dell'Acqua; K. Amako; et al.

    2001-01-01

    The object-oriented (OO) approach is the key technology for developing software in the LHC/ATLAS experiment. We developed an OO simulation framework based on the Geant4 general-purpose simulation toolkit. Because of the complexity of simulation in ATLAS, we paid particular attention to scalability in its design. Although the first application of this framework is the ATLAS full detector simulation program, it contains no experiment-specific code, so it can be used to develop any simulation package, not only for HEP experiments but also for a variety of other research domains. In this paper we discuss our approach to the design and implementation of the framework.

  6. Object-oriented framework for distributed simulation

    Science.gov (United States)

    Hunter, Julia; Carson, John A.; Colley, Martin; Standeven, John; Callaghan, Victor

    1999-06-01

    The benefits of object-oriented technology are widely recognized in software engineering. This paper describes the use of the object-oriented paradigm to create distributed simulations. The University of Essex Robotics and Intelligent Machines group has been carrying out research into distributed vehicle simulation since 1992. Part of this research has focused on the development of simulation systems to assist in the design of robotic vehicles. This paper describes the evolution of these systems, from an early toolkit used for teaching robotics to recent work on using simulation as a design tool in the creation of a new generation of unmanned underwater vehicles. It outlines experiences gained in using PVM, and ongoing research into the use of the emerging High Level Architecture as the basis for these frameworks. The paper concludes with the perceived benefits of adopting object-oriented methodologies as the basis for simulation frameworks.

  7. Chemical research at Argonne National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-04-01

    Argonne National Laboratory is a research and development laboratory located 25 miles southwest of Chicago, Illinois. It has more than 200 programs in basic and applied sciences and an Industrial Technology Development Center to help move its technologies to the industrial sector. At Argonne, basic energy research is supported by applied research in diverse areas such as biology and biomedicine, energy conservation, fossil and nuclear fuels, environmental science, and parallel computer architectures. These capabilities translate into technological expertise in energy production and use, advanced materials and manufacturing processes, and waste minimization and environmental remediation, which can be shared with the industrial sector. The Laboratory's technologies can be applied to help companies design products, substitute materials, devise innovative industrial processes, develop advanced quality control systems and instrumentation, and address environmental concerns. The latest techniques and facilities, including those involving modeling, simulation, and high-performance computing, are available to industry and academia. At Argonne, there are opportunities for industry to carry out cooperative research, license inventions, exchange technical personnel, use unique research facilities, and attend conferences and workshops. Technology transfer is one of the Laboratory's major missions. High priority is given to strengthening U.S. technological competitiveness through research and development partnerships with industry that capitalize on Argonne's expertise and facilities. The Laboratory is one of three DOE superconductivity technology centers, focusing on manufacturing technology for high-temperature superconducting wires, motors, bearings, and connecting leads. Argonne National Laboratory is operated by the University of Chicago for the U.S. Department of Energy.

  8. MCdevelop - the universal framework for Stochastic Simulations

    CERN Document Server

    Slawinska, M

    2011-01-01

    We present MCdevelop, a universal computer framework for developing and exploiting the wide class of Stochastic Simulations (SS) software. This powerful universal SS software development tool has been derived from a series of scientific projects for precision calculations in high energy physics (HEP), which feature a wide range of functionality in the SS software needed for advanced precision Quantum Field Theory calculations for the past LEP experiments and for the ongoing LHC experiments at CERN, Geneva. MCdevelop is a "spin-off" product of HEP to be exploited in other areas, while it will still serve to develop new SS software for HEP experiments. Typically SS involve independent generation of large sets of random "events", often requiring considerable CPU power. Since SS jobs usually do not share memory it makes them easy to parallelize. The efficient development, testing and running in parallel SS software requires a convenient framework to develop software source code, deploy and monitor batch jobs, mer...

  9. Simulation of the clock framework of Gaia

    CERN Document Server

    Castaneda, Javier; Gordo, Jose P.; Portell, Jordi; García-Berro, Enrique; Luri, Xavier

    2005-01-01

    Gaia will perform astrometric measurements with an unprecedented resolution. Consequently, the electronics of the Astro instrument must time tag every measurement with a precision of a few nanoseconds. Hence, it requires a high stability clock signal, for which a Rb-type spacecraft master clock has been baselined. The distribution of its signal and the generation of clock subproducts must maintain these high accuracy requirements. We have developed a software application to simulate generic clock frameworks. The most critical clock structures for Gaia have also been identified, and its master clock has been parameterised.

  10. FACET: A simulation software framework for modeling complex societal processes and interactions

    Energy Technology Data Exchange (ETDEWEB)

    Christiansen, J. H.

    2000-06-02

    FACET, the Framework for Addressing Cooperative Extended Transactions, was developed at Argonne National Laboratory to address the need for a simulation software architecture in the style of an agent-based approach, but with sufficient robustness, expressiveness, and flexibility to be able to deal with the levels of complexity seen in real-world social situations. FACET is an object-oriented software framework for building models of complex, cooperative behaviors of agents. It can be used to implement simulation models of societal processes such as the complex interplay of participating individuals and organizations engaged in multiple concurrent transactions in pursuit of their various goals. These transactions can be patterned on, for example, clinical guidelines and procedures, business practices, government and corporate policies, etc. FACET can also address other complex behaviors such as biological life cycles or manufacturing processes. To date, for example, FACET has been applied to such areas as land management, health care delivery, avian social behavior, and interactions between natural and social processes in ancient Mesopotamia.

  11. MCdevelop - a universal framework for Stochastic Simulations

    Science.gov (United States)

    Slawinska, M.; Jadach, S.

    2011-03-01

    We present MCdevelop, a universal computer framework for developing and exploiting the wide class of Stochastic Simulations (SS) software. This powerful universal SS software development tool has been derived from a series of scientific projects for precision calculations in high energy physics (HEP), which feature a wide range of functionality in the SS software needed for advanced precision Quantum Field Theory calculations for the past LEP experiments and for the ongoing LHC experiments at CERN, Geneva. MCdevelop is a "spin-off" product of HEP to be exploited in other areas, while it will still serve to develop new SS software for HEP experiments. Typically SS involve independent generation of large sets of random "events", often requiring considerable CPU power. Since SS jobs usually do not share memory, they are easy to parallelize. The efficient development, testing and parallel running of SS software requires a convenient framework to develop software source code, deploy and monitor batch jobs, and merge and analyse results from multiple parallel jobs, even before the production runs are terminated. Throughout the years of development of stochastic simulations for HEP, a sophisticated framework featuring all of the above-mentioned functionality has been implemented. MCdevelop represents its latest version, written mostly in C++ (GNU compiler gcc). It uses Autotools to build binaries (optionally managed within the KDevelop 3.5.3 Integrated Development Environment (IDE)). It uses the open-source ROOT package for histogramming, graphics and the mechanism of persistency for the C++ objects. MCdevelop helps to run multiple parallel jobs on any computer cluster with an NQS-type batch system. Program summary: Program title: MCdevelop. Catalogue identifier: AEHW_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEHW_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http
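    The seed-per-job and merge workflow that MCdevelop automates can be illustrated with a toy Monte Carlo (a hypothetical stand-in, not MCdevelop's C++ machinery): each "batch job" gets its own seed, runs independently, and the partial counts are merged afterwards.

```python
import random

def run_job(seed, n_events):
    # Each job is fully independent given its seed, so the jobs could run
    # as separate batch processes and be merged afterwards.
    rng = random.Random(seed)
    hits = sum(rng.random() ** 2 + rng.random() ** 2 <= 1.0
               for _ in range(n_events))
    return hits, n_events

# Stand-in for eight parallel batch jobs.
partials = [run_job(seed, 10_000) for seed in range(8)]

# The merge step: combine partial counts into one estimate (of pi here).
hits = sum(h for h, _ in partials)
total = sum(n for _, n in partials)
pi_estimate = 4.0 * hits / total
```

Because no memory is shared, merging is pure bookkeeping; that is why the abstract calls SS jobs easy to parallelize.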

  12. MAIA: a framework for developing agent-based social simulations

    NARCIS (Netherlands)

    Ghorbani, Amineh; Dignum, Virginia; Bots, Pieter; Dijkema, Gerhard

    2013-01-01

    In this paper we introduce and motivate a conceptualization framework for agent-based social simulation, MAIA: Modelling Agent systems based on Institutional Analysis. The MAIA framework is based on Ostrom's Institutional Analysis and Development framework, and provides an extensive set of modelling

  13. A framework for response surface methodology for simulation optimization

    NARCIS (Netherlands)

    H.G. Neddermeijer; G.J. van Oortmarssen (Gerrit); N. Piersma (Nanda); R. Dekker (Rommert)

    2000-01-01

    We develop a framework for automated optimization of stochastic simulation models using Response Surface Methodology. The framework is especially intended for simulation models where the calculation of the corresponding stochastic response function is very expensive or time-consuming. Re...
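    One iteration of the Response Surface Methodology loop referred to above can be sketched as follows (the quadratic test function and all constants are invented): evaluate the expensive simulation at a small local design, fit a second-order surface, and move the design center to the fitted optimum.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(x):
    # Stand-in for an expensive stochastic simulation; true optimum at x = 3.
    return (x - 3.0) ** 2 + rng.normal(scale=0.05)

center = 0.0
for _ in range(4):                       # a few RSM iterations
    design = center + np.array([-1.0, -0.5, 0.0, 0.5, 1.0])
    response = np.array([simulate(x) for x in design])
    c2, c1, c0 = np.polyfit(design, response, 2)   # fit second-order surface
    if c2 > 0:                           # convex fit: jump to its minimum
        center = -c1 / (2.0 * c2)
```

A full RSM framework adds the pieces this sketch omits: first-order phases with steepest-descent line searches, lack-of-fit tests, and stopping rules.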

  14. Argonne Tandem Linac Accelerator System (ATLAS)

    Data.gov (United States)

    Federal Laboratory Consortium — ATLAS is a national user facility at Argonne National Laboratory in Argonne, Illinois. The ATLAS facility is a leading facility for nuclear structure research in the...

  15. Simulation framework for spatio-spectral anomalous change detection

    Energy Technology Data Exchange (ETDEWEB)

    Theiler, James P [Los Alamos National Laboratory; Harvey, Neal R [Los Alamos National Laboratory; Porter, Reid B [Los Alamos National Laboratory; Wohlberg, Brendt E [Los Alamos National Laboratory

    2009-01-01

    The authors describe the development of a simulation framework for anomalous change detection that considers both the spatial and spectral aspects of the imagery. A purely spectral framework has previously been introduced, but the extension to spatio-spectral requires attention to a variety of new issues, and requires more careful modeling of the anomalous changes. Using this extended framework, they evaluate the utility of spatial image processing operators to enhance change detection sensitivity in (simulated) remote sensing imagery.
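    A purely spectral version of the idea can be sketched in a few lines (synthetic data, not the authors' framework): model the pervasive change between two co-registered images as a linear map and flag pixels whose residual is anomalously large.

```python
import numpy as np

rng = np.random.default_rng(42)

# Two co-registered single-band "images" (synthetic): y is a pervasive
# linear change of x plus noise, except for one anomalously changed pixel.
x = rng.normal(size=100)
y = 2.0 * x + 1.0 + rng.normal(scale=0.1, size=100)
y[17] += 5.0                              # the planted anomalous change

slope, intercept = np.polyfit(x, y, 1)    # fit the pervasive change
residual = y - (slope * x + intercept)
anomalies = np.where(np.abs(residual) > 5.0 * residual.std())[0]
```

The spatio-spectral extension described in the record replaces the pointwise linear predictor with one that also uses neighbouring pixels, which is what makes the anomaly modelling more delicate.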

  16. IDEF method-based simulation model design and development framework

    Directory of Open Access Journals (Sweden)

    Ki-Young Jeong

    2009-09-01

    Full Text Available The purpose of this study is to provide an IDEF method-based integrated framework for business process simulation modeling that reduces model development time by improving communication and knowledge reusability during a simulation project. In this framework, simulation requirements are collected with a function modeling method (IDEF0) and a process modeling method (IDEF3). Based on these requirements, a common data model is constructed using the IDEF1X method. From this reusable data model, multiple simulation models are automatically generated using a database-driven simulation model development approach. The framework is claimed to help both the requirement collection and experimentation phases of a simulation project by improving system knowledge, model reusability, and maintainability through the systematic use of the three descriptive IDEF methods and the features of relational database technologies. A complex semiconductor fabrication case study was used as a testbed to evaluate and illustrate the concepts and the framework. Two different simulation software products were used to develop and control the semiconductor model from the same knowledge base. The case study showed empirically that this framework can help improve simulation project processes through IDEF-based descriptive models and relational database technology. The authors also concluded that this framework could easily be applied to other analytical model generation by separating the logic from the data.

  17. The Astrophysics Simulation Collaboratory portal: A framework foreffective distributed research

    Energy Technology Data Exchange (ETDEWEB)

    Bondarescu, Ruxandra; Allen, Gabrielle; Daues, Gregory; Kelly,Ian; Russell, Michael; Seidel, Edward; Shalf, John; Tobias, Malcolm

    2003-03-03

    We describe the motivation, architecture, and implementation of the Astrophysics Simulation Collaboratory (ASC) portal. The ASC project provides a web-based problem solving framework for the astrophysics community that harnesses the capabilities of emerging computational grids.

  18. Service-Oriented Simulation Framework: An Overview and Unifying Methodology

    CERN Document Server

    Wang, Wenguang; Zhu, Yifan; Li, Qun; 10.1177/0037549710391838

    2010-01-01

    The prevailing net-centric environment demands and enables modeling and simulation to combine efforts from numerous disciplines. Software techniques and methodology, in particular service-oriented architecture, provide such an opportunity. Service-oriented simulation has been an emerging paradigm following on from object- and process-oriented methods. However, the ad-hoc frameworks proposed so far generally focus on specific domains or systems and each has its pros and cons. They are capable of addressing different issues within service-oriented simulation from different viewpoints. It is increasingly important to describe and evaluate the progress of numerous frameworks. In this paper, we propose a novel three-dimensional reference model for a service-oriented simulation paradigm. The model can be used as a guideline or an analytic means to find the potential and possible future directions of the current simulation frameworks. In particular, the model inspects the crossover between the disciplines of modelin...

  19. Environmental Survey preliminary report, Argonne National Laboratory, Argonne, Illinois

    Energy Technology Data Exchange (ETDEWEB)

    1988-11-01

    This report presents the preliminary findings of the first phase of the Environmental Survey of the United States Department of Energy's (DOE) Argonne National Laboratory (ANL), conducted June 15 through 26, 1987. The Survey is being conducted by an interdisciplinary team of environmental specialists, led and managed by the Office of Environment, Safety and Health's Office of Environmental Audit. The team includes outside experts supplied by a private contractor. The objective of the Survey is to identify environmental problems and areas of environmental risk associated with ANL. The Survey covers all environmental media and all areas of environmental regulation. It is being performed in accordance with the DOE Environmental Survey Manual. The on-site phase of the Survey involves the review of existing site environmental data, observations of the operations carried on at ANL, and interviews with site personnel. The Survey team developed a Sampling and Analysis (S&A) Plan to assist in further assessing certain of the environmental problems identified during its on-site activities. The S&A Plan will be executed by the Oak Ridge National Laboratory (ORNL). When completed, the S&A results will be incorporated into the Argonne National Laboratory Environmental Survey findings for inclusion in the Environmental Survey Summary Report. 75 refs., 24 figs., 60 tabs.

  20. Proposed environmental remediation at Argonne National Laboratory, Argonne, Illinois

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-05-01

    The Department of Energy (DOE) has prepared an Environmental Assessment evaluating proposed environmental remediation activity at Argonne National Laboratory-East (ANL-E), Argonne, Illinois. The environmental remediation work would (1) reduce, eliminate, or prevent the release of contaminants from a number of Resource Conservation and Recovery Act (RCRA) Solid Waste Management Units (SWMUs) and two radiologically contaminated sites located in areas contiguous with SWMUs, and (2) decrease the potential for exposure of the public, ANL-E employees, and wildlife to such contaminants. The actions proposed for SWMUs are required to comply with the RCRA corrective action process and corrective action requirements of the Illinois Environmental Protection Agency; the actions proposed are also required to reduce the potential for continued contaminant release. Based on the analysis in the EA, the DOE has determined that the proposed action does not constitute a major federal action significantly affecting the quality of the human environment within the meaning of the National Environmental Policy Act of 1969 (NEPA). Therefore, the preparation of an Environmental Impact Statement is not required.

  1. Argonne National Laboratory 1985 publications

    Energy Technology Data Exchange (ETDEWEB)

    Kopta, J.A. (ED.); Hale, M.R. (comp.)

    1987-08-01

    This report is a bibliography of scientific and technical 1985 publications of Argonne National Laboratory. Some are ANL contributions to outside organizations' reports published in 1985. This compilation, prepared by the Technical Information Services Technical Publications Section (TPS), lists all nonrestricted 1985 publications submitted to TPS by the Laboratory's Divisions. The report is divided into seven parts: Journal Articles - listed by first author; ANL Reports - listed by report number; ANL and non-ANL Unnumbered Reports - listed by report number; Non-ANL Numbered Reports - listed by report number; Books and Book Chapters - listed by first author; Conference Papers - listed by first author; and a Complete Author Index.

  2. GEMFsim: A Stochastic Simulator for the Generalized Epidemic Modeling Framework

    CERN Document Server

    Sahneh, Faryad Darabi; Shakeri, Heman; Fan, Futing; Scoglio, Caterina

    2016-01-01

    The recently proposed generalized epidemic modeling framework (GEMF) [Sahneh et al. 2013] lays the groundwork for systematically constructing a broad spectrum of stochastic spreading processes over complex networks. This article builds an algorithm for exact, continuous-time numerical simulation of GEMF-based processes. Moreover, an implementation of this algorithm, GEMFsim, is available in popular scientific programming platforms such as MATLAB, R, Python, and C; GEMFsim facilitates simulating stochastic spreading models that fit in the GEMF framework. Using these simulations, one can examine the accuracy of the mean-field-type approximations that are commonly used for the analytical study of spreading processes on complex networks.
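    The exact continuous-time (Gillespie-type) simulation that GEMFsim implements can be sketched for the special case of an SIS process on a small ring network (illustrative rates and topology; GEMF itself covers far more general compartment models):

```python
import random

random.seed(1)

# Exact continuous-time simulation of an SIS spreading process on a
# 10-node ring: beta = per-edge infection rate, delta = curing rate.
N = 10
neighbors = {i: [(i - 1) % N, (i + 1) % N] for i in range(N)}
beta, delta = 0.8, 1.0

state = {i: "S" for i in range(N)}
state[0] = "I"
t, t_end = 0.0, 5.0

while t < t_end:
    infected = [n for n in state if state[n] == "I"]
    if not infected:
        break
    # Enumerate every possible transition with its rate.
    events = [(delta, n, "S") for n in infected]
    events += [(beta, m, "I")
               for n in infected for m in neighbors[n] if state[m] == "S"]
    total = sum(rate for rate, _, _ in events)
    t += random.expovariate(total)          # exponential waiting time
    x = random.uniform(0.0, total)          # choose one event by its rate
    for rate, node, new in events:
        x -= rate
        if x <= 0.0:
            state[node] = new
            break

num_infected = sum(1 for s in state.values() if s == "I")
```

Averaging `num_infected` over many such runs is exactly the kind of empirical curve one compares against the mean-field approximations mentioned in the abstract.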

  3. A simulation framework for the CMS Track Trigger electronics

    CERN Document Server

    Amstutz, Christian; Weber, Marc; Palla, Fabrizio

    2014-01-01

    A simulation framework has been developed to test and characterize algorithms, architectures and hardware implementations of the vastly complex CMS Track Trigger for the high luminosity upgrade of the CMS experiment at the Large Hadron Collider in Geneva. High-level SystemC models of all system components have been developed to simulate a portion of the track trigger. The simulation of the system components together with input data from physics simulations allows evaluating figures of merit, like delays or bandwidths, under realistic conditions. The use of SystemC for high-level modelling allows co-simulation with models developed in Hardware Description Languages, e.g. VHDL or Verilog. Therefore, the simulation framework can also be used as a test bench for digital modules developed for the final system.
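    The delay figures of merit mentioned above can be illustrated with a tiny transaction-level model in the same spirit (made-up latencies; the real framework uses SystemC): data words flow through two pipelined processing stages, each handling one word at a time, and we record every word's end-to-end delay.

```python
# Two processing stages with fixed, made-up latencies (clock cycles).
STAGE_LATENCY = [4, 7]

def run_pipeline(arrival_cycles):
    """Return the end-to-end delay (cycles) of each word pushed through."""
    ready = [0, 0]                 # cycle at which each stage is next free
    delays = []
    for t in arrival_cycles:
        leave = t
        for i, lat in enumerate(STAGE_LATENCY):
            start = max(leave, ready[i])   # wait until the stage is free
            ready[i] = start + lat         # stage busy while processing
            leave = ready[i]               # word leaves the stage
        delays.append(leave - t)
    return delays

delays = run_pipeline([0, 1, 2, 3])   # one word per cycle at the input
```

Because words arrive every cycle but the slower stage needs 7 cycles per word, the queueing delay grows word by word; spotting exactly this kind of bandwidth mismatch is what such a simulation is for.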

  4. 2015 Annual Report - Argonne Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Collins, James R. [Argonne National Lab. (ANL), Argonne, IL (United States); Papka, Michael E. [Argonne National Lab. (ANL), Argonne, IL (United States); Cerny, Beth A. [Argonne National Lab. (ANL), Argonne, IL (United States); Coffey, Richard M. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-01-01

    The Argonne Leadership Computing Facility provides supercomputing capabilities to the scientific and engineering community to advance fundamental discovery and understanding in a broad range of disciplines.

  5. 2014 Annual Report - Argonne Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Collins, James R. [Argonne National Lab. (ANL), Argonne, IL (United States); Papka, Michael E. [Argonne National Lab. (ANL), Argonne, IL (United States); Cerny, Beth A. [Argonne National Lab. (ANL), Argonne, IL (United States); Coffey, Richard M. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2014-01-01

    The Argonne Leadership Computing Facility provides supercomputing capabilities to the scientific and engineering community to advance fundamental discovery and understanding in a broad range of disciplines.

  6. Software Framework for Advanced Power Plant Simulations

    Energy Technology Data Exchange (ETDEWEB)

    John Widmann; Sorin Munteanu; Aseem Jain; Pankaj Gupta; Mark Moales; Erik Ferguson; Lewis Collins; David Sloan; Woodrow Fiveland; Yi-dong Lang; Larry Biegler; Michael Locke; Simon Lingard; Jay Yun

    2010-08-01

    This report summarizes the work accomplished during the Phase II development effort of the Advanced Process Engineering Co-Simulator (APECS). The objective of the project is to develop the tools to efficiently combine high-fidelity computational fluid dynamics (CFD) models with process modeling software. During the course of the project, a robust integration controller was developed that can be used in any CAPE-OPEN compliant process modeling environment. The controller mediates the exchange of information between the process modeling software and the CFD software. Several approaches to reducing the time disparity between CFD simulations and process modeling have been investigated and implemented. These include enabling the CFD models to be run on a remote cluster and enabling multiple CFD models to be run simultaneously. Furthermore, computationally fast reduced-order models (ROMs) have been developed that can be 'trained' using the results from CFD simulations and then used directly within flowsheets. Unit operation models (both CFD and ROMs) can be uploaded to a model database and shared between multiple users.

  7. Framework for Architecture Trade Study Using MBSE and Performance Simulation

    Science.gov (United States)

    Ryan, Jessica; Sarkani, Shahram; Mazzuchi, Thomas

    2012-01-01

    Increasing complexity in modern systems, together with cost and schedule constraints, requires a new paradigm of system engineering to fulfill stakeholder needs. Challenges facing efficient trade studies include poor tool interoperability, lack of simulation coordination (design parameters), and requirements flowdown. A recent trend toward Model Based System Engineering (MBSE) brings flexible architecture definition, program documentation, requirements traceability, and system engineering reuse. As a new domain, however, MBSE still lacks governing standards and commonly accepted frameworks. This paper proposes a framework for efficient architecture definition using MBSE in conjunction with domain-specific simulation to evaluate trade studies. A general framework is presented, followed by a specific example that includes a method for designing a trade study, defining candidate architectures, planning simulations to fulfill requirements, and finally a weighted decision analysis to optimize system objectives.
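    The weighted decision analysis step can be sketched directly (criteria, weights, and scores below are invented for illustration):

```python
# Invented criteria weights (summing to 1) and candidate scores on a 0-10 scale.
criteria = {"cost": 0.40, "performance": 0.35, "risk": 0.25}
candidates = {
    "centralized": {"cost": 7, "performance": 9, "risk": 5},
    "distributed": {"cost": 5, "performance": 8, "risk": 8},
}
scores = {name: sum(criteria[c] * vals[c] for c in criteria)
          for name, vals in candidates.items()}
best = max(scores, key=scores.get)
```

In the proposed framework, the per-criterion scores would come from the domain-specific performance simulations rather than being entered by hand.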

  8. A Simulation and Modeling Framework for Space Situational Awareness

    Science.gov (United States)

    Olivier, S.

    This paper describes the development and initial demonstration of a new, integrated modeling and simulation framework, encompassing the space situational awareness enterprise, for quantitatively assessing the benefit of specific sensor systems, technologies and data analysis techniques. This framework includes detailed models for threat scenarios, signatures, sensors, observables and knowledge extraction algorithms. The framework is based on a flexible, scalable architecture to enable efficient simulation of the current SSA enterprise, and to accommodate future advancements in SSA systems. In particular, the code is designed to take advantage of massively parallel computer systems available, for example, at Lawrence Livermore National Laboratory. We will describe the details of the modeling and simulation framework, including hydrodynamic models of satellite intercept and debris generation, orbital propagation algorithms, radar cross section calculations, optical and infra-red brightness calculations, generic radar system models, generic optical and infra-red system models, specific Space Surveillance Network models, object detection algorithms, orbit determination algorithms, and visualization tools. The specific modeling of the Space Surveillance Network is performed in collaboration with the Air Force Space Command Space Control Group. We will demonstrate the use of this integrated simulation and modeling framework on specific threat scenarios, including space debris and satellite maneuvers, and we will examine the results of case studies involving the addition of new sensor systems, used in conjunction with the Space Surveillance Network, for improving space situational awareness.
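    The orbital-propagation building block of such a framework can be illustrated with the simplest case, an unperturbed circular two-body orbit (standard Earth constants; real SSA propagators add perturbations such as drag and Earth oblateness):

```python
import math

MU_EARTH = 398_600.4418        # km^3/s^2, Earth's gravitational parameter
R_EARTH = 6378.137             # km, equatorial radius

def circular_orbit_state(altitude_km, t_s):
    """In-plane position (km) at time t for an unperturbed circular orbit."""
    r = R_EARTH + altitude_km
    n = math.sqrt(MU_EARTH / r ** 3)      # mean motion, rad/s
    return r * math.cos(n * t_s), r * math.sin(n * t_s)

r = R_EARTH + 500.0
period = 2.0 * math.pi * math.sqrt(r ** 3 / MU_EARTH)   # seconds
x, y = circular_orbit_state(500.0, period)   # one full revolution later
```

Everything else listed in the abstract (radar cross sections, brightness models, sensor models) consumes states produced by a propagator of this general shape.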

  9. Argonne National Laboratory 1986 publications

    Energy Technology Data Exchange (ETDEWEB)

    Kopta, J.A.; Springer, C.J.

    1987-12-01

    This report is a bibliography of scientific and technical 1986 publications of Argonne National Laboratory. Some are ANL contributions to outside organizations' reports published in 1986. This compilation, prepared by the Technical Information Services Technical Publications Section (TPS), lists all nonrestricted 1986 publications submitted to TPS by the Laboratory's Divisions. Author indexes list ANL authors only. If a first author is not an ANL employee, an asterisk in the bibliographic citation indicates the first ANL author. The report is divided into seven parts: Journal Articles -- Listed by first author; ANL Reports -- Listed by report number; ANL and non-ANL Unnumbered Reports -- Listed by report number; Non-ANL Numbered Reports -- Listed by report number; Books and Book Chapters -- Listed by first author; Conference Papers -- Listed by first author; and Complete Author Index.

  10. Power Aware Simulation Framework for Wireless Sensor Networks and Nodes

    Directory of Open Access Journals (Sweden)

    Glaser Johann

    2008-01-01

    Full Text Available The constrained resources of sensor nodes limit analytical techniques, and cost-time factors limit test beds for studying wireless sensor networks (WSNs). Consequently, simulation becomes an essential tool to evaluate such systems. We present the power aware wireless sensors (PAWiS) simulation framework, which supports the design and simulation of wireless sensor networks and nodes. The framework emphasizes the capture of power consumption and hence the identification of inefficiencies in the various hardware and software modules of such systems. These modules include all layers of the communication system, the targeted class of application itself, the power supply and energy management, the central processing unit (CPU), and the sensor-actuator interface. The modular design makes it possible to simulate heterogeneous systems. PAWiS is an OMNeT++ based discrete event simulator written in C++. It captures the node internals (modules) as well as the node surroundings (network, environment) and provides specific features critical to WSNs, such as capturing power consumption at various levels of granularity, support for mobility and environmental dynamics, and the simulation of timing effects. A module library with standardized interfaces and a power analysis tool have been developed to support the design and analysis of simulation models. The performance of the PAWiS simulator is comparable with other simulation environments.
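    The per-module power capture PAWiS emphasizes can be reduced to a toy accounting sketch (all power and timing numbers below are invented; PAWiS itself is an OMNeT++/C++ simulator):

```python
class Module:
    """Accumulates energy as power (mW) x duration (ms) per activity."""
    def __init__(self, name):
        self.name = name
        self.energy_uj = 0.0               # microjoules, since mW x ms = uJ

    def active(self, power_mw, duration_ms):
        self.energy_uj += power_mw * duration_ms

cpu, radio = Module("cpu"), Module("radio")
for _ in range(100):                       # 100 duty cycles of a sensor node
    cpu.active(3.0, 2.0)                   # wake and process: 3 mW for 2 ms
    radio.active(60.0, 0.5)                # transmit burst: 60 mW for 0.5 ms
    cpu.active(0.003, 97.5)                # deep sleep for the rest of the cycle

breakdown = {m.name: m.energy_uj for m in (cpu, radio)}
```

Even this crude ledger exposes the usual WSN result: the radio's short bursts dominate the energy budget, which is the kind of inefficiency the framework is built to reveal.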

  11. Power Aware Simulation Framework for Wireless Sensor Networks and Nodes

    Directory of Open Access Journals (Sweden)

    Daniel Weber

    2008-07-01

    Full Text Available The constrained resources of sensor nodes limit analytical techniques, and cost-time factors limit test beds to study wireless sensor networks (WSNs). Consequently, simulation becomes an essential tool to evaluate such systems. We present the power aware wireless sensors (PAWiS) simulation framework that supports design and simulation of wireless sensor networks and nodes. The framework emphasizes power consumption capturing and hence the identification of inefficiencies in various hardware and software modules of the systems. These modules include all layers of the communication system, the targeted class of application itself, the power supply and energy management, the central processing unit (CPU), and the sensor-actuator interface. The modular design makes it possible to simulate heterogeneous systems. PAWiS is an OMNeT++ based discrete event simulator written in C++. It captures the node internals (modules) as well as the node surroundings (network, environment) and provides specific features critical to WSNs, such as capturing power consumption at various levels of granularity, support for mobility, and environmental dynamics, as well as the simulation of timing effects. A module library with standardized interfaces and a power analysis tool have been developed to support the design and analysis of simulation models. The performance of the PAWiS simulator is comparable with other simulation environments.

  12. Unified Behavior Framework for Discrete Event Simulation Systems

    Science.gov (United States)

    2015-03-26

    Advanced Framework for Simulation, Integration, and Modeling; AI: Artificial Intelligence; APL: Application Layer; BT: Behavior Tree; CPC: Configurable Physical ... promotion of code reuse. The purpose of autonomous agents in simulation systems is to represent lifelike intelligence. In doing so ... plan-act (SPA) approach was the focus of artificial intelligence (AI) research for more than 30 years, until the mid-1980s [1]. However, the SPA approach to ...

  13. Hierarchical petascale simulation framework for stress corrosion cracking

    Science.gov (United States)

    Vashishta, P.; Kalia, R. K.; Nakano, A.; Kaxiras, E.; Grama, A.; Lu, G.; Eidenbenz, S.; Voter, A. F.; Hood, R. Q.; Moriarty, J. A.; Yang, L. H.

    2008-07-01

    We are developing a scalable parallel and distributed computational framework consisting of methods, algorithms, and integrated software tools for multi-terascale-to-petascale simulations of stress corrosion cracking (SCC) with quantum-level accuracy. We have performed multimillion- to billion-atom molecular dynamics (MD) simulations of deformation, flow, and fracture in amorphous silica with interatomic potentials and forces validated by density functional theory (DFT) calculations. Optimized potentials have been developed to study sulfur embrittlement of nickel with multimillion-to-multibillion atom MD simulations based on DFT and temperature-dependent model generalized pseudopotential theory. We have also developed a quasi-continuum method embedded with quantum simulations based on DFT to reach macroscopic length scales and an accelerated molecular dynamics scheme to reach macroscopic time scales in simulations of solid-fluid interfaces that are relevant to SCC. A hybrid MD and mesoscale lattice Boltzmann simulation algorithm is being designed to study fluid flow through cracks.

  14. Gleam: the GLAST Large Area Telescope Simulation Framework

    CERN Document Server

    Boinee, P; De Angelis, Alessandro; Favretto, Dario; Frailis, Marco; Giannitrapani, Riccardo; Milotti, Edoardo; Longo, Francesco; Brigida, Monica; Gargano, Fabio; Giglietto, Nicola; Loparco, Francesco; Mazziotta, Mario Nicola; Cecchi, Claudia; Lubrano, Pasquale; Pepe, Monica; Baldini, Luca; Cohen-Tanugi, Johann; Kuss, Michael; Latronico, Luca; Omodei, Nicola; Spandre, Gloria; Bogart, Joanne R.; Dubois, Richard; Kamae, Tune; Rochester, Leon; Usher, Tracy; Burnett, Thompson H.; Robinson, Sean M.; Bastieri, Denis; Rando, Riccardo

    2003-01-01

    This paper presents the simulation of the GLAST high energy gamma-ray telescope. The simulation package, written in C++, is based on the Geant4 toolkit, and it is integrated into a general framework used to process events. A detailed simulation of the electronic signals inside Silicon detectors has been provided, and it is used for the particle tracking, which is handled by dedicated software. A unique repository for the geometrical description of the detector has been realized using the XML language, and a C++ library to access this information has been designed and implemented.

  15. Development of a framework for optimization of reservoir simulation studies

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Jiang; Delshad, Mojdeh; Sepehrnoori, Kamy [The University of Texas at Austin, Austin, TX (United States)

    2007-10-15

    We have developed a framework that distributes multiple reservoir simulations on a cluster of CPUs for fast and efficient process optimization studies. This platform utilizes several commercial reservoir simulators for flow simulations, an experimental design and a Monte Carlo algorithm with a global optimization search engine to identify the optimum combination of reservoir decision factors under uncertainty. This approach is applied to a well placement design for a field-scale development exercise. The uncertainties considered are in the fault structure, porosity and permeability, PVT, and relative permeabilities. The results indicate that the approach is practical and efficient for performing reservoir optimization studies. (author)
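The Monte Carlo search over decision factors under uncertainty that the abstract describes can be sketched with a toy objective standing in for the reservoir simulator; the quadratic response surface, its optimum near (0.3, 0.7), and the "permeability multiplier" are all invented for illustration:

```python
import random

def expected_npv(well_x, well_y, n_draws=200, rng=None):
    """Toy objective: average 'NPV' over sampled permeability realizations.
    A stand-in for running a flow simulator per uncertainty draw."""
    rng = rng or random.Random(0)
    total = 0.0
    for _ in range(n_draws):
        perm = rng.gauss(1.0, 0.2)  # uncertain permeability multiplier
        total += perm * (1.0 - (well_x - 0.3) ** 2 - (well_y - 0.7) ** 2)
    return total / n_draws

def monte_carlo_search(n_trials=500, seed=1):
    """Global random search over well placement, scoring each candidate
    by its expected value under uncertainty."""
    rng = random.Random(seed)
    best = None
    for _ in range(n_trials):
        x, y = rng.random(), rng.random()
        score = expected_npv(x, y)
        if best is None or score > best[0]:
            best = (score, x, y)
    return best

score, x, y = monte_carlo_search()
print(f"best placement ~({x:.2f}, {y:.2f}) with expected objective {score:.3f}")
```

In the paper's setting each objective evaluation is a full reservoir simulation, which is why the evaluations are farmed out to a cluster of CPUs; the nested sampling structure (outer search over decisions, inner averaging over uncertainty) is the same.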

  16. Simulation Framework for Rebalancing of Autonomous Mobility on Demand Systems

    Directory of Open Access Journals (Sweden)

    Marczuk Katarzyna A.

    2016-01-01

    This study is built upon our previous work on Autonomous Mobility on Demand (AMOD) systems. Our methodology is simulation-based, and we make use of SimMobility, an agent-based microscopic simulation platform. In the current work we focus on the framework for testing different rebalancing policies for AMOD systems. We compare three different rebalancing methods: (i) no rebalancing, (ii) offline rebalancing, and (iii) online rebalancing. Simulation results indicate that rebalancing reduces the required fleet size and shortens the customers’ wait time.

  17. A cellular automaton framework for infectious disease spread simulation.

    Science.gov (United States)

    Pfeifer, Bernhard; Kugler, Karl; Tejada, Maria M; Baumgartner, Christian; Seger, Michael; Osl, Melanie; Netzer, Michael; Handler, Michael; Dander, Andreas; Wurz, Manfred; Graber, Armin; Tilg, Bernhard

    2008-01-01

    In this paper, a cellular automaton framework for processing the spatiotemporal spread of infectious diseases is presented. The developed environment simulates and visualizes how infectious diseases might spread, and hence provides a powerful instrument for health care organizations to generate disease prevention and contingency plans. In this study, the outbreak of an avian-flu-like virus was modeled in the state of Tyrol, and various scenarios such as quarantine, the effect of different medications on viral spread, and changes in social behavior were simulated. The proposed framework is implemented using the programming language Java. The setup of the simulation environment requires specification of the disease parameters and the geographical information using a population density colored map, enriched with demographic data. The results of the numerical simulations and the analysis of the computed parameters will be used to get a deeper understanding of how disease-spreading mechanisms work, and how to protect the population from contracting the disease. Strategies for optimization of medical treatment and vaccination regimens will also be investigated using our cellular automaton framework. In this study, six different scenarios were simulated. They showed that geographical barriers may help to slow down the spread of an infectious disease; however, when an aggressive and deadly communicable disease spreads, only quarantine and controlled medical treatment are able to stop the outbreak, if at all.
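The core of such a cellular automaton can be sketched as an SIR (susceptible-infected-recovered) update rule on a grid; the neighborhood, infection, and recovery probabilities below are illustrative, not calibrated to the avian-flu scenario of the paper:

```python
import random

# Cell states of the toy SIR cellular automaton.
S, I, R = 0, 1, 2

def step(grid, p_infect=0.3, p_recover=0.1, rng=random.Random(42)):
    """One synchronous update: susceptibles catch the infection from
    infected 4-neighbours; infected cells recover with fixed probability."""
    n = len(grid)
    new = [row[:] for row in grid]
    for y in range(n):
        for x in range(n):
            if grid[y][x] == S:
                for dy, dx in ((0, 1), (0, -1), (1, 0), (-1, 0)):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < n and 0 <= nx < n and grid[ny][nx] == I:
                        if rng.random() < p_infect:
                            new[y][x] = I
                            break
            elif grid[y][x] == I and rng.random() < p_recover:
                new[y][x] = R
    return new

grid = [[S] * 21 for _ in range(21)]
grid[10][10] = I                     # index case in the centre
for _ in range(40):
    grid = step(grid)
infected_ever = sum(cell != S for row in grid for cell in row)
print(f"cells ever infected after 40 steps: {infected_ever}")
```

A geographical barrier in this picture is simply a band of cells that can never enter state I, which is how a grid model expresses the paper's observation that terrain slows (but does not stop) the spread.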

  18. A Generic Digitization Framework for the CDF Simulation

    Institute of Scientific and Technical Information of China (English)

    Jim Kowalkowski; Marc Paterno

    2001-01-01

    Digitization from GEANT tracking requires a predictable sequence of steps to produce raw simulated detector readout information. We have developed a software framework that simplifies the development and integration of digitizers by separating the coordination activities (sequencing and dispatching) from the actual digitization process. This separation allows the developers of digitizers to concentrate on digitization. The framework provides the sequencing infrastructure and a digitizer model, which means that all digitizers are required to follow the same sequencing rules and provide an interface that fits the model.
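The separation the abstract describes can be sketched as a sequencer class that owns ordering and dispatch, while digitizers only implement a common interface; the class names and toy energy-to-ADC conversions below are invented, not the CDF framework's:

```python
class Digitizer:
    """The digitizer model: every digitizer implements this interface."""
    def digitize(self, hits):
        raise NotImplementedError

class TrackerDigitizer(Digitizer):
    def digitize(self, hits):
        # convert energy deposits to integer ADC counts (toy gain of 100)
        return [int(e * 100) for e in hits]

class CalorimeterDigitizer(Digitizer):
    def digitize(self, hits):
        # toy one-decimal readout
        return [round(e, 1) for e in hits]

class Sequencer:
    """Owns coordination: dispatches each subdetector's hits to its
    registered digitizer and collects the raw readout."""
    def __init__(self):
        self.digitizers = {}

    def register(self, name, digitizer):
        self.digitizers[name] = digitizer

    def run(self, event):
        return {name: self.digitizers[name].digitize(hits)
                for name, hits in event.items()
                if name in self.digitizers}

seq = Sequencer()
seq.register("tracker", TrackerDigitizer())
seq.register("calo", CalorimeterDigitizer())
raw = seq.run({"tracker": [0.12, 0.05], "calo": [1.234, 5.678]})
print(raw)
```

Because sequencing lives entirely in `Sequencer`, adding a new subdetector means writing one subclass and one `register` call, which is the code-reuse payoff the abstract claims.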

  19. A framework for simulation and inversion in electromagnetics

    CERN Document Server

    Heagy, Lindsey J; Kang, Seogi; Rosenkjaer, Gudni K; Oldenburg, Douglas W

    2016-01-01

    Simulations and inversions of geophysical electromagnetic data are paramount for discerning meaningful information about the subsurface from these data. Depending on the nature of the source, electromagnetic experiments may be classified as time-domain or frequency-domain. Multiple heterogeneous and sometimes anisotropic physical properties, including electrical conductivity and magnetic permeability, may need to be considered in a simulation. Depending on what one wants to accomplish in an inversion, the parameters which one inverts for may be a voxel-based description of the earth or some parametric representation that must be mapped onto a simulation mesh. Each of these permutations of the electromagnetic problem has implications in a numerical implementation of the forward simulation as well as in the computation of the sensitivities, which are required when considering gradient-based inversions. This paper proposes a framework for organizing and implementing electromagnetic simulations and gradient-based inversions.

  20. Simulation framework and XML detector description for the CMS experiment

    CERN Document Server

    Arce, P; Boccali, T; Case, M; de Roeck, A; Lara, V; Liendl, M; Nikitenko, A N; Schröder, M; Strässner, A; Wellisch, H P; Wenzel, H

    2003-01-01

    Currently CMS event simulation is based on GEANT3 while the detector description is built from different sources for simulation and reconstruction. A new simulation framework based on GEANT4 is under development. A full description of the detector is available, and the tuning of the GEANT4 performance and the checking of the ability of the physics processes to describe the detector response is ongoing. Its integration on the CMS mass production system and GRID is also currently under development. The Detector Description Database project aims at providing a common source of information for Simulation, Reconstruction, Analysis, and Visualisation, while allowing for different representations as well as specific information for each application. A functional prototype, based on XML, is already released. Also examples of the integration of DDD in the GEANT4 simulation and in the reconstruction applications are provided.

  1. Push technology at Argonne National Laboratory.

    Energy Technology Data Exchange (ETDEWEB)

    Noel, R. E.; Woell, Y. N.

    1999-04-06

    Selective dissemination of information (SDI) services, also referred to as current awareness searches, are usually provided by periodically running computer programs (personal profiles) against a cumulative database or databases. This concept of pushing relevant content to users has long been integral to librarianship. Librarians traditionally turned to information companies to implement these searches for their users in business, academia, and the science community. This paper describes how a push technology was implemented on a large scale for scientists and engineers at Argonne National Laboratory, explains some of the challenges to designers/maintainers, and identifies the positive effects that SDI seems to be having on users. Argonne purchases the Institute for Scientific Information (ISI) Current Contents data (all subject areas except Humanities), and scientists no longer need to turn to outside companies for reliable SDI service. Argonne's database and its customized services are known as ACCESS (Argonne-University of Chicago Current Contents Electronic Search Service).

  2. A Simulink simulation framework of a MagLev model

    Energy Technology Data Exchange (ETDEWEB)

    Boudall, H.; Williams, R.D.; Giras, T.C. [University of Virginia, Charlottesville (United States). School of Engineering and Applied Science

    2003-09-01

    This paper presents a three-degree-of-freedom model of a section of the magnetically levitated train MagLev. The MagLev system dealt with in this article utilizes electromagnetic levitation. Each MagLev vehicle section is viewed as two separate parts, namely a body and a chassis, coupled by a set of springs and dampers. The MagLev model includes the propulsion, the guidance and the levitation systems. The equations of motion are developed. A Simulink simulation framework is implemented in order to study the interaction between the different systems and the dynamics of a MagLev vehicle. The simulation framework will eventually serve as a tool to assist the design and development of the MagLev system in the United States of America. (author)
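The body-chassis coupling described above can be sketched as a two-mass spring-damper system stepped with semi-implicit Euler; the masses, stiffness, and damping values below are illustrative, not the paper's MagLev parameters, and the chassis is anchored to the guideway by a second spring/damper purely for this toy:

```python
def simulate(m_body=1000.0, m_chassis=500.0, k=5e4, c=4e3, dt=1e-3, steps=8000):
    """Semi-implicit Euler integration of coupled vertical body/chassis motion."""
    zb, zc = 0.1, 0.0            # body starts 0.1 m above equilibrium
    vb, vc = 0.0, 0.0
    for _ in range(steps):
        # spring/damper coupling force acting on the body
        f = k * (zc - zb) + c * (vc - vb)
        ab = f / m_body
        # chassis feels the reaction plus its own suspension to the guideway
        ac = (-f - k * zc - c * vc) / m_chassis
        vb += ab * dt             # update velocities first (semi-implicit)
        vc += ac * dt
        zb += vb * dt
        zc += vc * dt
    return zb, zc

zb, zc = simulate()
print(f"after 8 s: body offset {zb:.5f} m, chassis offset {zc:.5f} m")
```

With damping on both connections every mode decays, so both offsets relax toward equilibrium, the qualitative behaviour a Simulink model of the levitation loop would also be checked against.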

  3. Sorting, Searching, and Simulation in the MapReduce Framework

    DEFF Research Database (Denmark)

    Goodrich, Michael T.; Sitchinava, Nodari; Zhang, Qin

    2011-01-01

    We demonstrate the usefulness of our approach by designing and analyzing efficient MapReduce algorithms for fundamental sorting, searching, and simulation problems. This study is motivated by a goal of ultimately putting the MapReduce framework on an equal theoretical footing with the well-known PRAM and BSP parallel models, which would benefit both the theory and practice of MapReduce algorithms. We describe efficient MapReduce algorithms for sorting, multi-searching, and simulations of parallel algorithms specified in the BSP and CRCW PRAM models. We also provide some applications of these results to problems in parallel computational geometry for the MapReduce framework, which result in efficient MapReduce algorithms for sorting, 2- and 3-dimensional convex hulls, and fixed-dimensional linear programming, for the case when mappers and reducers have a memory/message-I/O size of M = Θ(N^ε) for a small constant ε > 0.

  4. Linear accelerator simulation framework with PLACET and GUINEA-PIG

    CERN Document Server

    Snuverink, Jochem; CERN. Geneva. ATS Department

    2016-01-01

    Many good tracking tools are available for simulations for linear accelerators. However, several simple tasks need to be performed repeatedly, like lattice definitions, beam setup, output storage, etc. In addition, complex simulations can become unmanageable quite easily. A high level layer would therefore be beneficial. We propose LinSim, a linear accelerator framework with the codes PLACET and GUINEA-PIG. It provides a documented, well-debugged high level layer of functionality. Users only need to provide the input settings and essential code and/or use some of the many implemented imperfections and algorithms. It can be especially useful for first-time users. Currently the following accelerators are implemented: ATF2, ILC, CLIC and FACET. This note is the comprehensive manual; it discusses the framework design and shows its strength in some condensed examples.

  5. Etomica: an object-oriented framework for molecular simulation.

    Science.gov (United States)

    Schultz, Andrew J; Kofke, David A

    2015-03-30

    We describe the design of an object-oriented library of software components that are suitable for constructing simulations of systems of interacting particles. The emphasis of the discussion is on the general design of the components and how they interact, and less on details of the programming interface or its implementation. Example code is provided as an aid to understanding object-oriented programming structures and to demonstrate how the framework is applied.

  6. Sorting, Searching, and Simulation in the MapReduce Framework

    DEFF Research Database (Denmark)

    Goodrich, Michael T.; Sitchinava, Nodari; Zhang, Qin

    2011-01-01

    We study the MapReduce framework from an algorithmic standpoint, providing a generalization of the previous algorithmic models for MapReduce. We present optimal solutions for the fundamental problems of all-prefix-sums, sorting and multi-searching. Additionally, we design optimal simulations of the well-established PRAM and BSP models in MapReduce, immediately resulting in optimal solutions to the problems of computing fixed-dimensional linear programming and 2-D and 3-D convex hulls.
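An all-prefix-sums construction in this style can be made concrete with a toy, in-memory MapReduce round; the two-round grouping scheme below is a simplified stand-in for the papers' memory-bounded algorithm, with the `group` parameter playing the role of the per-reducer memory bound:

```python
from collections import defaultdict

def mapreduce_round(records, mapper, reducer):
    """One synchronous MapReduce round: map, shuffle by key, reduce."""
    shuffled = defaultdict(list)
    for rec in records:
        for key, value in mapper(rec):
            shuffled[key].append(value)
    out = []
    for key, values in shuffled.items():
        out.extend(reducer(key, values))
    return out

def prefix_sums(xs, group=4):
    """All-prefix-sums via group totals, group offsets, then local sums."""
    records = [(i, x) for i, x in enumerate(xs)]
    # round 1: each group's reducer emits its total, keyed for one combiner
    totals = mapreduce_round(
        records,
        mapper=lambda rec: [(rec[0] // group, rec)],
        reducer=lambda g, recs: [("all", (g, sum(x for _, x in recs)))])
    # a single reducer turns group totals into per-group offsets
    offsets = dict(mapreduce_round(
        totals,
        mapper=lambda rec: [rec],
        reducer=lambda _, gs: [(g, sum(t for h, t in gs if h < g))
                               for g, t in gs]))
    # round 2: each group's reducer adds its offset to local prefix sums
    out = mapreduce_round(
        records,
        mapper=lambda rec: [(rec[0] // group, rec)],
        reducer=lambda g, recs: [
            (i, offsets[g] + sum(x for j, x in sorted(recs) if j <= i))
            for i, x in sorted(recs)])
    return [v for _, v in sorted(out)]

print(prefix_sums([1, 2, 3, 4, 5, 6, 7]))  # [1, 3, 6, 10, 15, 21, 28]
```

No reducer ever touches more than one group (plus the small list of group totals), which is the flavour of the memory restriction the theoretical model imposes.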

  7. Hierarchical Visual Analysis and Steering Framework for Astrophysical Simulations

    Institute of Scientific and Technical Information of China (English)

    肖健; 张加万; 原野; 周鑫; 纪丽; 孙济洲

    2015-01-01

    A framework for accelerating modern long-running astrophysical simulations is presented, which is based on a hierarchical architecture where computational steering in the high-resolution run is performed under the guide of knowledge obtained in the gradually refined ensemble analyses. Several visualization schemes for facilitating ensemble management, error analysis, parameter grouping and tuning are also integrated owing to the pluggable modular design. The proposed approach is prototyped based on the Flash code, and it can be extended by introducing user-defined visualization for specific requirements. Two real-world simulations, i.e., stellar wind and supernova remnant, are carried out to verify the proposed approach.

  8. Velo: A Knowledge Management Framework for Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Gorton, Ian; Sivaramakrishnan, Chandrika; Black, Gary D.; White, Signe K.; Purohit, Sumit; Lansing, Carina S.; Madison, Michael C.; Schuchardt, Karen L.; Liu, Yan

    2012-03-01

    Modern scientific enterprises are inherently knowledge-intensive. Scientific studies in domains such as geosciences, climate, and biology require the acquisition and manipulation of large amounts of experimental and field data to create inputs for large-scale computational simulations. The results of these simulations are then analyzed, leading to refinements of inputs and models and additional simulations. The results of this process must be managed and archived to provide justifications for regulatory decisions and publications that are based on the models. In this paper we introduce our Velo framework that is designed as a reusable, domain-independent knowledge management infrastructure for modeling and simulation. Velo leverages, integrates and extends open-source collaborative and content management technologies to create a scalable and flexible core platform that can be tailored to specific scientific domains. We describe the architecture of Velo for managing and associating the various types of data that are used and created in modeling and simulation projects, as well as the framework for integrating domain-specific tools. To demonstrate realizations of Velo, we describe examples from two deployed sites for carbon sequestration and climate modeling. These provide concrete examples of the inherent extensibility and utility of our approach.

  9. A new framework for simulating forced homogeneous buoyant turbulent flows

    Science.gov (United States)

    Carroll, Phares L.; Blanquart, Guillaume

    2015-06-01

    This work proposes a new simulation methodology to study variable density turbulent buoyant flows. The mathematical framework, referred to as homogeneous buoyant turbulence, relies on a triply periodic domain and incorporates numerical forcing methods commonly used in simulation studies of homogeneous, isotropic flows. In order to separate the effects due to buoyancy from those due to large-scale gradients, the linear scalar forcing technique is used to maintain the scalar variance at a constant value. Two sources of kinetic energy production are considered in the momentum equation, namely shear via an isotropic forcing term and buoyancy via the gravity term. The simulation framework is designed such that the four dimensionless parameters of importance in buoyant mixing, namely the Reynolds, Richardson, Atwood, and Schmidt numbers, can be independently varied and controlled. The framework is used to interrogate fully non-buoyant, fully buoyant, and partially buoyant turbulent flows. The results show that the statistics of the scalar fields (mixture fraction and density) are not influenced by the energy production mechanism (shear vs. buoyancy). On the other hand, the velocity field exhibits anisotropy, namely a larger variance in the direction of gravity which is associated with a statistical dependence of the velocity component on the local fluid density.

  10. EIC detector simulations in FairRoot framework

    Science.gov (United States)

    Kiselev, Alexander; eRHIC task force Team

    2013-10-01

    The long-term RHIC facility upgrade plan foresees the addition of a high-energy electron beam to the existing hadron accelerator complex thus converting RHIC into an Electron-Ion Collider (eRHIC). A dedicated EIC detector, designed to efficiently register and identify deep inelastic electron scattering (DIS) processes in a wide range of center-of-mass energies is one of the key elements of this upgrade. Detailed Monte-Carlo studies are needed to optimize EIC detector components and to fine-tune their design. The simulation package foreseen for this purpose (EicRoot) is based on the FairRoot framework developed and maintained at the GSI. A feature of this framework is its level of flexibility, allowing one to switch easily between different geometry (ROOT, GEANT) and transport (GEANT3, GEANT4, FLUKA) models. Apart from providing a convenient simulation environment the framework includes basic tools for visualization and allows for easy sharing of event reconstruction codes between higher level experiment-specific applications. The description of the main EicRoot features and first simulation results will be the main focus of the talk.

  11. Framework Application for Core Edge Transport Simulation (FACETS)

    Energy Technology Data Exchange (ETDEWEB)

    Krasheninnikov, Sergei; Pigarov, Alexander

    2011-10-15

    The FACETS (Framework Application for Core-Edge Transport Simulations) project of the Scientific Discovery through Advanced Computing (SciDAC) Program was aimed at providing high-fidelity whole-tokamak modeling for the U.S. magnetic fusion energy program and ITER through coupling separate components for each of the core region, edge region, and wall, with realistic plasma particle and power sources and turbulent transport simulation. The project also aimed at developing advanced numerical algorithms, efficient implicit coupling methods, and software tools utilizing the leadership class computing facilities under Advanced Scientific Computing Research (ASCR). The FACETS project was conducted by a multi-disciplinary, multi-institutional team; the lead PI was J.R. Cary (Tech-X Corp.). In the FACETS project, the Applied Plasma Theory Group at the MAE Department of UCSD developed the Wall and Plasma-Surface Interaction (WALLPSI) module, performed its validation against experimental data, and integrated it into the developed framework. WALLPSI is a one-dimensional, coarse-grained, reaction/advection/diffusion code applied to each material boundary cell in the common modeling domain for a tokamak. It incorporates an advanced model for plasma particle transport and retention in the solid matter of plasma facing components, simulation of plasma heat power load handling, calculation of erosion/deposition, and simulation of synergistic effects in strong plasma-wall coupling.

  12. A hybrid parallel framework for the cellular Potts model simulations

    Energy Technology Data Exchange (ETDEWEB)

    Jiang, Yi [Los Alamos National Laboratory; He, Kejing [SOUTH CHINA UNIV; Dong, Shoubin [SOUTH CHINA UNIV

    2009-01-01

    The Cellular Potts Model (CPM) has been widely used for biological simulations. However, most current implementations are either sequential or approximated, which cannot be used for large-scale complex 3D simulations. In this paper we present a hybrid parallel framework for CPM simulations. The time-consuming PDE solving, cell division, and cell reaction operations are distributed to clusters using the Message Passing Interface (MPI). The Monte Carlo lattice update is parallelized on shared-memory SMP systems using OpenMP. Because the Monte Carlo lattice update is much faster than the PDE solving and SMP systems are increasingly common, this hybrid approach achieves good performance and high accuracy at the same time. Based on the parallel Cellular Potts Model, we studied avascular tumor growth using a multiscale model. The application and performance analysis show that the hybrid parallel framework is quite efficient. The hybrid parallel CPM can be used for large-scale simulations (~10^8 sites) of the complex collective behavior of numerous cells (~10^6).
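The Monte Carlo lattice update that such frameworks parallelize can be sketched sequentially; the adhesion-only energy function, periodic boundary, and parameters below are illustrative (a realistic CPM also includes volume and surface constraints per cell):

```python
import math
import random

def cpm_step(lattice, n, temperature=1.0, rng=random.Random(0)):
    """One Monte Carlo sweep of a toy 2-D cellular Potts model:
    copy attempts from random neighbours, Metropolis-accepted on the
    change in boundary energy (1 per unlike 4-neighbour pair)."""
    def energy_around(y, x, spin):
        e = 0
        for dy, dx in ((0, 1), (0, -1), (1, 0), (-1, 0)):
            e += lattice[(y + dy) % n][(x + dx) % n] != spin
        return e

    for _ in range(n * n):
        y, x = rng.randrange(n), rng.randrange(n)
        ny = (y + rng.choice((-1, 0, 1))) % n
        nx = (x + rng.choice((-1, 0, 1))) % n
        src, dst = lattice[ny][nx], lattice[y][x]
        if src == dst:
            continue
        dE = energy_around(y, x, src) - energy_around(y, x, dst)
        if dE <= 0 or rng.random() < math.exp(-dE / temperature):
            lattice[y][x] = src  # copy the neighbour's cell id

n = 16
# two cells occupying the left and right halves of the lattice
lattice = [[0 if x < n // 2 else 1 for x in range(n)] for _ in range(n)]
for _ in range(10):
    cpm_step(lattice, n)
print(sum(v == 0 for row in lattice for v in row), "sites in cell 0")
```

Each sweep touches sites independently except through the shared lattice, which is exactly what makes this kernel amenable to the shared-memory (OpenMP-style) parallelization the paper describes, while the slower PDE work is shipped to distributed nodes.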

  13. A Driver Behavior Learning Framework for Enhancing Traffic Simulation

    Directory of Open Access Journals (Sweden)

    Ramona Maria Paven

    2014-06-01

    Full Text Available Traffic simulation provides essential support for developing intelligent transportation systems. It allows affordable validation of such systems using a large variety of scenarios that involve massive data input. However, realistic traffic models are hard to implement, especially for microscopic traffic simulation. One of the hardest problems in this context is modeling the behavior of drivers, due to the complexity of human nature. The work presented in this paper proposes a framework for learning driver behavior based on a Hidden Markov Model technique. Moreover, we also propose a practical method to inject this behavior into a traffic model used by the SUMO traffic simulator. To demonstrate the effectiveness of this method we present a case study involving real traffic data collected from the Timisoara city area.
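A Hidden Markov Model treatment of driver behavior can be illustrated with the standard forward recursion, which scores an observed maneuver sequence under the model; the states, observations, and probabilities below are invented for illustration, not learned from the paper's traffic data:

```python
# Hidden driver 'styles' and observed braking/acceleration events (toy model).
states = ["calm", "aggressive"]
start = {"calm": 0.7, "aggressive": 0.3}
trans = {"calm": {"calm": 0.9, "aggressive": 0.1},
         "aggressive": {"calm": 0.2, "aggressive": 0.8}}
emit = {"calm": {"soft": 0.8, "hard": 0.2},
        "aggressive": {"soft": 0.3, "hard": 0.7}}

def forward(observations):
    """P(observations | model) via the forward algorithm:
    alpha[s] = P(obs so far, current state = s)."""
    alpha = {s: start[s] * emit[s][observations[0]] for s in states}
    for obs in observations[1:]:
        alpha = {s: emit[s][obs] * sum(alpha[p] * trans[p][s] for p in states)
                 for s in states}
    return sum(alpha.values())

print(forward(["soft", "soft", "hard"]))
```

In a learning pipeline these probabilities would be fitted from recorded trajectories (e.g. by Baum-Welch), and the fitted model would then drive parameter choices injected into the microscopic simulator.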

  14. ATLAS Detector Simulation in the Integrated Simulation Framework applied to the W Boson Mass Measurement

    CERN Document Server

    Ritsch, Elmar; Froidevaux, Daniel; Salzburger, Andreas

    One of the cornerstones for the success of the ATLAS experiment at the Large Hadron Collider (LHC) is a very accurate Monte Carlo detector simulation. However, a limit is being reached regarding the amount of simulated data which can be produced and stored with the computing resources available through the worldwide LHC computing grid (WLCG). The Integrated Simulation Framework (ISF) is a novel approach to detector simulation which enables a more efficient use of these computing resources and thus allows for the generation of more simulated data. Various simulation technologies are combined to allow for faster simulation approaches which are targeted at the specific needs of individual physics studies. Costly full simulation technologies are only used where high accuracy is required by physics analyses, and fast simulation technologies are applied everywhere else. As one of the first applications of the ISF, a new combined simulation approach is developed for the generation of detector calibration samples ...

  15. The PandaRoot framework for simulation, reconstruction and analysis

    Science.gov (United States)

    Spataro, Stefano; PANDA Collaboration

    2011-12-01

    The PANDA experiment at the future facility FAIR will study antiproton-proton and antiproton-nucleus collisions in a beam momentum range from 2 GeV/c up to 15 GeV/c. The PandaRoot framework is part of the FairRoot project, a common software framework for the future FAIR experiments, and is currently used to simulate detector performances and to evaluate different detector concepts. It is based on the packages ROOT and Virtual MonteCarlo with Geant3 and Geant4. Different reconstruction algorithms for tracking and particle identification are under development and optimization, in order to achieve the performance requirements of the experiment. In the central tracker a first track fit is performed using a conformal map transformation based on a helix assumption, then the track is used as input for a Kalman Filter (package genfit), using GEANE as track follower. The track is then correlated to the pid detectors (e.g. Cerenkov detectors, EM Calorimeter or Muon Chambers) to evaluate a global particle identification probability, using a Bayesian approach or multivariate methods. Further implemented packages in PandaRoot are: the analysis tools framework Rho, the kinematic fitter package for vertex and mass constraint fits, and a fast simulation code based upon parametrized detector responses. PandaRoot was also tested on an Alien-based GRID infrastructure. This contribution reports on the status of PandaRoot and shows some example results for analysis of physics benchmark channels.

  16. A modeling and simulation framework for electrokinetic nanoparticle treatment

    Science.gov (United States)

    Phillips, James

    2011-12-01

    The focus of this research is to model and provide a simulation framework for the packing of differently sized spheres within a hard boundary. The novel contributions of this dissertation are the cylinders of influence (COI) method and sectoring method implementations. The impetus for this research stems from modeling electrokinetic nanoparticle (EN) treatment, which packs concrete pores with differently sized nanoparticles. We show an improved speed of the simulation compared to previously published results of EN treatment simulation while obtaining similar porosity reduction results. We mainly focused on readily commercially available particle sizes of 2 nm and 20 nm particles, but have the capability to model other sizes. Our simulation has graphical capabilities and can provide additional data unobtainable from physical experimentation. The data collected has a median of 0.5750 and a mean of 0.5504. The standard error is 0.0054 at alpha = 0.05 for a 95% confidence interval of 0.5504 +/- 0.0054. The simulation has produced maximum packing densities of 65% and minimum packing densities of 34%. Simulation data are analyzed using linear regression via the R statistical language to obtain two equations: one that describes porosity reduction based on all cylinder and particle characteristics, and another that focuses on describing porosity reduction based on cylinder diameter for 2 and 20 nm particles into pores of 100 nm height. Simulation results are similar to most physical results obtained from MIP and WLR. Some MIP results do not fall within the simulation limits; however, this is expected as MIP has been documented to be an inaccurate measure of pore distribution and porosity of concrete. Despite the disagreement between WLR and MIP, there is a trend that porosity reduction is higher two inches from the rebar as compared to the rebar-concrete interface. The simulation also detects a higher porosity reduction further from the rebar. This may be due to particles ...
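The packing process modeled in the dissertation can be caricatured with random sequential addition of spheres into a cylindrical pore; this toy omits the COI and sectoring accelerations and uses illustrative dimensions (nanometres), with radii 1 and 10 standing in for the 2 nm and 20 nm particle sizes:

```python
import math
import random

def pack_cylinder(radius, height, particle_radii, attempts=2000,
                  rng=random.Random(0)):
    """Random sequential addition: propose a sphere of random size at a
    random position inside the cylinder, keep it if it overlaps nothing."""
    placed = []
    for _ in range(attempts):
        r = rng.choice(particle_radii)
        # uniform position in the disk, keeping the sphere off the wall
        rho = (radius - r) * math.sqrt(rng.random())
        theta = rng.uniform(0.0, 2.0 * math.pi)
        x, y = rho * math.cos(theta), rho * math.sin(theta)
        z = rng.uniform(r, height - r)
        if all((x - px) ** 2 + (y - py) ** 2 + (z - pz) ** 2 >= (r + pr) ** 2
               for px, py, pz, pr in placed):
            placed.append((x, y, z, r))
    solid = sum(4.0 / 3.0 * math.pi * r ** 3 for *_, r in placed)
    return 1.0 - solid / (math.pi * radius ** 2 * height)  # remaining porosity

# toy pore: 100 nm tall, 50 nm radius, filled with 2 nm and 20 nm particles
porosity = pack_cylinder(50.0, 100.0, [1.0, 10.0])
print(f"porosity after packing: {porosity:.2f}")
```

The naive all-pairs overlap check here is O(n) per attempt; methods like the dissertation's sectoring exist precisely to restrict that check to nearby candidates.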

  17. A framework of modeling detector systems for computed tomography simulations

    Science.gov (United States)

    Youn, H.; Kim, D.; Kim, S. H.; Kam, S.; Jeon, H.; Nam, J.; Kim, H. K.

    2016-01-01

    The ultimate development in computed tomography (CT) technology may be a system that can provide images with excellent lesion conspicuity at a patient dose as low as possible. Imaging simulation tools have been used cost-effectively for these developments and will continue to be. For a more accurate and realistic imaging simulation, the signal and noise propagation through a CT detector system has been modeled in this study using cascaded linear-systems theory. The simulation results are validated by comparison with measurements from a laboratory flat-panel micro-CT system. Although the image noise obtained from the simulations at higher exposures is slightly smaller than that obtained from the measurements, the difference between them is reasonably acceptable. According to the simulation results for various exposure levels and additive electronic noise levels, x-ray quantum noise is more dominant than the additive electronic noise. The framework for modeling a CT detector system suggested in this study will be helpful for the development of an accurate and realistic projection simulation model.
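
    The stage-by-stage signal and noise propagation that cascaded linear-systems theory prescribes can be sketched roughly as follows. The gains and noise figures below are invented for illustration and are not the parameters of the detector modeled in this paper.

```python
# Toy cascaded-systems sketch: propagate signal mean and variance through
# stochastic gain stages, then compare quantum noise to additive
# electronic noise. All numbers below are hypothetical.

def gain_stage(mean_in, var_in, g, var_g):
    """Mean/variance transfer through a stage with mean gain g and gain variance var_g."""
    mean_out = g * mean_in
    var_out = g ** 2 * var_in + var_g * mean_in
    return mean_out, var_out

mean, var = 1000.0, 1000.0          # Poisson x-ray input: variance = mean
stages = [(0.6, 0.6 * 0.4),         # binomial selection (absorption), var_g = p(1-p)
          (500.0, 500.0)]           # conversion gain with Poisson-like spread
for g, var_g in stages:
    mean, var = gain_stage(mean, var, g, var_g)

electronic_var = 50.0 ** 2          # additive read-noise variance (hypothetical)
print(var > electronic_var)         # quantum noise dominates at this exposure
```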

  18. The Integrated Plasma Simulator: A Flexible Python Framework for Coupled Multiphysics Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Foley, Samantha S [ORNL]; Elwasif, Wael R [ORNL]; Bernholdt, David E [ORNL]

    2011-11-01

    High-fidelity coupled multiphysics simulations are an increasingly important aspect of computational science. In many domains, however, there has been very limited experience with simulations of this sort; research in coupled multiphysics therefore often requires computational frameworks with significant flexibility to respond to the changing directions of the physics and mathematics. This paper presents the Integrated Plasma Simulator (IPS), a framework designed for loosely coupled simulations of fusion plasmas. The IPS provides users with a simple component architecture into which a wide range of existing plasma physics codes can be inserted as components. Simulations can take advantage of multiple levels of parallelism supported in the IPS, and can be controlled by a high-level "driver" component or by other coordination mechanisms, such as an asynchronous event service. We describe the requirements and design of the framework, and how they were implemented in the Python language. We also illustrate the flexibility of the framework by providing examples of different types of simulations that utilize various features of the IPS.
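
    The component/driver pattern the abstract describes can be caricatured in a few lines of Python. The class and method names here are illustrative only; they are not the actual IPS API.

```python
# Minimal sketch of a high-level driver coordinating loosely coupled
# physics components each time step; illustrative, not real IPS code.

class Component:
    def step(self, t):
        raise NotImplementedError

class EquilibriumSolver(Component):
    def step(self, t):
        return f"equilibrium at t={t}"

class TransportSolver(Component):
    def step(self, t):
        return f"transport at t={t}"

class Driver:
    """Invoke each plugged-in component once per time step."""
    def __init__(self, components):
        self.components = components

    def run(self, steps):
        log = []
        for t in range(steps):
            for c in self.components:
                log.append(c.step(t))
        return log

log = Driver([EquilibriumSolver(), TransportSolver()]).run(2)
print(len(log))  # 4 component invocations
```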

  19. Modeling and simulation of complex systems a framework for efficient agent-based modeling and simulation

    CERN Document Server

    Siegfried, Robert

    2014-01-01

    Robert Siegfried presents a framework for efficient agent-based modeling and simulation of complex systems. He compares different approaches for describing the structure and dynamics of agent-based models in detail. Based on this evaluation, the author introduces the "General Reference Model for Agent-based Modeling and Simulation" (GRAMS). Furthermore, he presents parallel and distributed simulation approaches for executing agent-based models, from small scale to very large scale. The author shows how agent-based models may be executed by different simulation engines that utilize underlying hardware.

  20. A wind turbine hybrid simulation framework considering aeroelastic effects

    Science.gov (United States)

    Song, Wei; Su, Weihua

    2015-04-01

    In performing an effective structural analysis for a wind turbine, the simulation of turbine aerodynamic loads is of great importance. The interaction between the wake flow and the blades may affect the blades' loading conditions, energy yield, and operational behavior. Direct experimental measurement of the wind flow field and wind profiles around wind turbines is very helpful to support wind turbine design. However, with the growth in the size of wind turbines for higher energy output, it is not convenient to obtain all the desired data in wind-tunnel and field tests. In this paper, the modeling of the dynamic response of large-span wind turbine blades first considers nonlinear aeroelastic effects. A strain-based geometrically nonlinear beam formulation is used for the basic structural dynamic modeling, which is coupled with unsteady aerodynamic equations and rigid-body rotations of the rotor. Full wind turbines can be modeled using multi-connected beams. A hybrid simulation experimental framework is then proposed to potentially address this issue. The aerodynamics-dominated components, such as the turbine blades and rotor, are simulated as numerical components using the nonlinear aeroelastic model, while the turbine tower, where collapse or failure may occur under high levels of wind load, is simulated separately as the physical component. With the proposed framework, the dynamic behavior of NREL's 5 MW wind turbine blades is studied and correlated with available numerical data. The current work is the basis of the authors' further studies on flow control and hazard mitigation for wind turbine blades and towers.

  1. A Virtual Engineering Framework for Simulating Advanced Power System

    Energy Technology Data Exchange (ETDEWEB)

    Mike Bockelie; Dave Swensen; Martin Denison; Stanislav Borodai

    2008-06-18

    This report describes the work performed to provide NETL with VE-Suite-based Virtual Engineering software and enhanced equipment models to support NETL's Advanced Process Engineering Co-simulation (APECS) framework for advanced power generation systems. Enhancements to the software framework provided an important link between APECS and the virtual engineering capabilities of VE-Suite (e.g., equipment and process visualization, information assimilation). Model enhancements focused on improving predictions of the performance of entrained-flow coal gasifiers and important auxiliary equipment (e.g., Air Separation Units) used in coal gasification systems. In addition, a Reduced Order Model generation tool and software providing a coupling between APECS/AspenPlus and the GE GateCycle simulation system were developed. CAPE-Open model interfaces were employed where needed. The improved simulation capability is demonstrated on selected test problems. As part of the project, an Advisory Panel was formed to provide guidance on the issues on which to focus the work effort. The Advisory Panel included experts from industry and academia in gasification, CO2 capture, and process simulation, as well as representatives from technology developers and the electric utility industry. To optimize the benefit to NETL, REI coordinated its efforts with NETL and NETL-funded projects at Iowa State University, Carnegie Mellon University, and ANSYS/Fluent, Inc. The improved simulation capabilities incorporated into APECS will enable researchers and engineers to better understand the interactions of different equipment components, identify weaknesses and processes needing improvement, and thereby allow more efficient, less expensive plants to be developed and brought on-line faster and in a more cost-effective manner. These enhancements to APECS represent an important step toward a fully integrated environment for performing plant simulation and engineering

  2. Wire chamber degradation at the Argonne ZGS

    Energy Technology Data Exchange (ETDEWEB)

    Haberichter, W.; Spinka, H.

    1986-01-01

    Experience with multiwire proportional chambers at high rates at the Argonne Zero Gradient Synchrotron is described. A buildup of silicon on the sense wires was observed where the beam passed through the chamber. Analysis of the chamber gas indicated that the density of silicon was probably less than 10 ppm.

  3. A Simulation Framework for Optimal Energy Storage Sizing

    Directory of Open Access Journals (Sweden)

    Carlos Suazo-Martínez

    2014-05-01

    Despite the increasing interest in Energy Storage Systems (ESS), quantification of their technical and economic benefits remains a challenge. To assess the use of ESS, a simulation approach for ESS optimal sizing is presented. The algorithm is based on an adapted Unit Commitment including ESS operational constraints, and on the use of high-performance computing (HPC). Multiple short-term simulations are carried out within a multi-year horizon. The evaluation is performed for Chile's Northern Interconnected Power System (SING). The authors show that a single-year evaluation can lead to sub-optimal results when evaluating optimal ESS size; hence, it is advisable to perform long-term evaluations of ESS. Additionally, the importance of detailed simulation for adequate assessment of ESS contributions and for fully capturing storage value is discussed. Furthermore, the robustness of the optimal sizing approach is evaluated by means of sensitivity analyses. The results suggest that regulatory frameworks should recognize multiple value streams from storage in order to encourage greater ESS integration.
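
    The paper's central point, that a single-year evaluation can mis-size storage, can be illustrated with a toy calculation. The benefit figures below are invented; the actual study uses an adapted unit-commitment model, not this shortcut.

```python
# Toy illustration: the ESS size that looks best over one year can differ
# from the long-horizon optimum. All numbers are made up.

annual_benefit = {  # hypothetical net benefit per candidate ESS size (MW)
    2020: {50: 4.0, 100: 6.0, 150: 5.0},
    2021: {50: 3.0, 100: 7.0, 150: 9.0},
    2022: {50: 3.0, 100: 7.0, 150: 10.0},
}

def best_size(years):
    """Return the ESS size maximizing total benefit over the given years."""
    totals = {}
    for y in years:
        for size, benefit in annual_benefit[y].items():
            totals[size] = totals.get(size, 0.0) + benefit
    return max(totals, key=totals.get)

print(best_size([2020]))              # single-year evaluation
print(best_size([2020, 2021, 2022]))  # multi-year evaluation picks differently
```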

  4. Framework Application for Core Edge Transport Simulation (FACETS)

    Energy Technology Data Exchange (ETDEWEB)

    Malony, Allen D; Shende, Sameer S; Huck, Kevin A; Morris, Alan; Spear, Wyatt

    2012-03-14

    The goal of the FACETS project (Framework Application for Core-Edge Transport Simulations) was to provide a multiphysics, parallel framework application (FACETS) that will enable whole-device modeling for the U.S. fusion program, to provide the modeling infrastructure needed for ITER, the next step fusion confinement device. Through use of modern computational methods, including component technology and object oriented design, FACETS is able to switch from one model to another for a given aspect of the physics in a flexible manner. This enables use of simplified models for rapid turnaround or high-fidelity models that can take advantage of the largest supercomputer hardware. FACETS does so in a heterogeneous parallel context, where different parts of the application execute in parallel by utilizing task farming, domain decomposition, and/or pipelining as needed and applicable. ParaTools, Inc. was tasked with supporting the performance analysis and tuning of the FACETS components and framework in order to achieve the parallel scaling goals of the project. The TAU Performance System® was used for instrumentation, measurement, archiving, and profile / tracing analysis. ParaTools, Inc. also assisted in FACETS performance engineering efforts. Through the use of the TAU Performance System, ParaTools provided instrumentation, measurement, analysis and archival support for the FACETS project. Performance optimization of key components has yielded significant performance speedups. TAU was integrated into the FACETS build for both the full coupled application and the UEDGE component. The performance database provided archival storage of the performance regression testing data generated by the project, and helped to track improvements in the software development.

  5. A new framework for magnetohydrodynamic simulations with anisotropic pressure

    CERN Document Server

    Hirabayashi, Kota; Amano, Takanobu

    2016-01-01

    We describe a new theoretical and numerical framework for magnetohydrodynamic simulation incorporating an anisotropic pressure tensor, which can play an important role in a collisionless plasma. A classical approach to handling the anisotropy is based on the double adiabatic approximation, which assumes that the pressure tensor is well described by only the components parallel and perpendicular to the local magnetic field. This gyrotropic assumption, however, fails around a magnetically neutral region, where the cyclotron period may become comparable to or even longer than a dynamical time in the system, and causes a singularity in the mathematical expression. In this paper, we demonstrate that this singularity can be completely removed by combining direct use of the second moment of the Vlasov equation with an ingenious gyrotropization model. Numerical tests also verify that the present model properly reduces to the standard MHD or the double adiabatic formulation in an asymptotic manner under an appropria...
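
    For context, the double adiabatic (CGL) closure referred to above conserves two adiabatic invariants per fluid element; this is the standard textbook form, quoted here for reference rather than taken from the paper:

```latex
\frac{d}{dt}\!\left(\frac{p_\perp}{\rho B}\right) = 0,
\qquad
\frac{d}{dt}\!\left(\frac{p_\parallel B^2}{\rho^3}\right) = 0
```

    where p_perp and p_par are the pressures perpendicular and parallel to the magnetic field of strength B, and rho is the mass density. The gyrotropic form breaks down precisely where B approaches zero, which is the singularity the paper removes.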

  6. Argonne's Laboratory computing center - 2007 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R.; Pieper, G. W.

    2008-05-28

    Argonne National Laboratory founded the Laboratory Computing Resource Center (LCRC) in the spring of 2002 to help meet pressing program needs for computational modeling, simulation, and analysis. The guiding mission is to provide critical computing resources that accelerate the development of high-performance computing expertise, applications, and computations to meet the Laboratory's challenging science and engineering missions. In September 2002 the LCRC deployed a 350-node computing cluster from Linux NetworX to address Laboratory needs for mid-range supercomputing. This cluster, named 'Jazz', achieved over a teraflop of computing power (10^12 floating-point calculations per second) on standard tests, making it the Laboratory's first terascale computing system and one of the 50 fastest computers in the world at the time. Jazz was made available to early users in November 2002 while the system was undergoing development and configuration. In April 2003, Jazz was officially made available for production operation. Since then, the Jazz user community has grown steadily. By the end of fiscal year 2007, there were over 60 active projects representing a wide cross-section of Laboratory expertise, including work in biosciences, chemistry, climate, computer science, engineering applications, environmental science, geoscience, information science, materials science, mathematics, nanoscience, nuclear engineering, and physics. Most important, many projects have achieved results that would have been unobtainable without such a computing resource. The LCRC continues to foster growth in the computational science and engineering capability and quality at the Laboratory. Specific goals include expansion of the use of Jazz to new disciplines and Laboratory initiatives, teaming with Laboratory infrastructure providers to offer more scientific data management capabilities, expanding Argonne staff use of national computing facilities, and improving the scientific

  7. Multiscale Simulation Framework for Coupled Fluid Flow and Mechanical Deformation

    Energy Technology Data Exchange (ETDEWEB)

    Hou, Thomas [California Inst. of Technology (CalTech), Pasadena, CA (United States); Efendiev, Yalchin [Stanford Univ., CA (United States); Tchelepi, Hamdi [Texas A & M Univ., College Station, TX (United States); Durlofsky, Louis [Stanford Univ., CA (United States)

    2016-05-24

    Our work in this project is aimed at making fundamental advances in multiscale methods for flow and transport in highly heterogeneous porous media. The main thrust of this research is to develop a systematic multiscale analysis and efficient coarse-scale models that can capture global effects and extend existing multiscale approaches to problems with additional physics and uncertainties. A key emphasis is on problems without an apparent scale separation. Multiscale solution methods are currently under active investigation for the simulation of subsurface flow in heterogeneous formations. These procedures capture the effects of fine-scale permeability variations through the calculation of specialized coarse-scale basis functions. Most of the multiscale techniques presented to date employ localization approximations in the calculation of these basis functions. For some highly correlated (e.g., channelized) formations, however, global effects are important and these may need to be incorporated into the multiscale basis functions. Other challenging issues facing multiscale simulations are the extension of existing multiscale techniques to problems with additional physics, such as compressibility, capillary effects, etc. In our project, we explore the improvement of multiscale methods through the incorporation of additional (single-phase flow) information and the development of a general multiscale framework for flows in the presence of uncertainties, compressible flow and heterogeneous transport, and geomechanics. We have considered (1) adaptive local-global multiscale methods, (2) multiscale methods for the transport equation, (3) operator-based multiscale methods and solvers, (4) multiscale methods in the presence of uncertainties and applications, (5) multiscale finite element methods for high contrast porous media and their generalizations, and (6) multiscale methods for geomechanics.

  8. Multiscale Simulation Framework for Coupled Fluid Flow and Mechanical Deformation

    Energy Technology Data Exchange (ETDEWEB)

    Tchelepi, Hamdi

    2014-11-14

    A multiscale linear-solver framework for the pressure equation associated with flow in highly heterogeneous porous formations was developed. The multiscale-based approach is cast in a general algebraic form, which facilitates integration of the new scalable linear solver in existing flow simulators. The Algebraic Multiscale Solver (AMS) is employed as a preconditioner within a multi-stage strategy. The formulations investigated include the standard MultiScale Finite-Element (MSFE) and MultiScale Finite-Volume (MSFV) methods. The local-stage solvers include incomplete factorization and the so-called Correction Functions (CF) associated with the MSFV approach. Extensive testing of AMS as an iterative linear solver indicates excellent convergence rates and computational scalability. AMS compares favorably with advanced Algebraic MultiGrid (AMG) solvers for highly detailed three-dimensional heterogeneous models. Moreover, AMS is expected to be especially beneficial in solving time-dependent problems of coupled multiphase flow and transport in large-scale subsurface formations.

  9. A framework for using simulation methodology in ergonomics interventions in design projects

    DEFF Research Database (Denmark)

    Broberg, Ole; Duarte, Francisco; Andersen, Simone Nyholm

    2014-01-01

    The aim of this paper is to outline a framework of simulation methodology in design processes from an ergonomics perspective.

  10. artG4: A Generic Framework for Geant4 Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Arvanitis, Tasha [Harvey Mudd Coll.]; Lyon, Adam [Fermilab]

    2014-01-01

    A small experiment must devote its limited computing expertise to writing physics code directly applicable to the experiment. A software 'framework' is essential for providing an infrastructure that makes writing the physics-relevant code easy. In this paper, we describe a highly modular and easy-to-use framework for writing Geant4-based simulations called 'artg4'. This framework is a layer on top of the art framework.

  11. A framework for web browser-based medical simulation using WebGL.

    Science.gov (United States)

    Halic, Tansel; Ahn, Woojin; De, Suvranu

    2012-01-01

    This paper presents a web browser-based software framework that provides accessibility, portability, and platform independence for medical simulation. Typical medical simulation systems are restricted to the underlying platform and device, which limits widespread use. Our framework allows realistic and efficient medical simulation using only the web browser, for anytime, anywhere access on a variety of platforms ranging from desktop PCs to tablets. The framework consists of visualization, simulation, and hardware integration modules that are fundamental components for multimodal interactive simulation. Benchmark tests are performed to validate the rendering and computing performance of our framework with the latest web browsers, including Chrome and Firefox. The results are quite promising, opening up the possibility of developing web-based medical simulation technology.

  12. Argonne Bubble Experiment Thermal Model Development II

    Energy Technology Data Exchange (ETDEWEB)

    Buechler, Cynthia Eileen [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-07-01

    This report describes the continuation of the work reported in “Argonne Bubble Experiment Thermal Model Development”. The experiment was performed at Argonne National Laboratory (ANL) in 2014. A rastered 35 MeV electron beam deposited power in a solution of uranyl sulfate, generating heat and radiolytic gas bubbles. Irradiations were performed at three beam power levels, 6, 12 and 15 kW. Solution temperatures were measured by thermocouples, and gas bubble behavior was observed. This report will describe the Computational Fluid Dynamics (CFD) model that was developed to calculate the temperatures and gas volume fractions in the solution vessel during the irradiations. The previous report described an initial analysis performed on a geometry that had not been updated to reflect the as-built solution vessel. Here, the as-built geometry is used. Monte-Carlo N-Particle (MCNP) calculations were performed on the updated geometry, and these results were used to define the power deposition profile for the CFD analyses, which were performed using Fluent, Ver. 16.2. CFD analyses were performed for the 12 and 15 kW irradiations, and further improvements to the model were incorporated, including the consideration of power deposition in nearby vessel components, gas mixture composition, and bubble size distribution. The temperature results of the CFD calculations are compared to experimental measurements.

  13. Status of RF superconductivity at Argonne

    Energy Technology Data Exchange (ETDEWEB)

    Shepard, K.W.

    1989-01-01

    Development of superconducting (SC) slow-wave structures began at Argonne National Laboratory (ANL) in 1971, and led to the first SC heavy-ion linac (ATLAS - the Argonne Tandem-Linac Accelerating System), which began regularly scheduled operation in 1978. To date, more than 40,000 hours of beam-on-target operating time has been accumulated with ATLAS. The Physics Division at ANL has continued to develop SC RF technology for accelerating heavy ions, with the result that the SC linac has, up to the present, been in an almost continuous process of upgrade and expansion. It should be noted that this has been accomplished while maintaining a vigorous operating schedule in support of the nuclear and atomic physics research programs of the division. In 1987, the Engineering Physics Division at ANL began development of SC RF components for the acceleration of high-brightness proton and deuterium beams. This work has included the evaluation of the RF properties of high-Tc oxide superconductors, both for the above and for other applications. The two divisions have collaborated on several applications of RF SC and on developing the technology generally. 11 refs., 6 figs.

  14. Parallel simulation of wormhole propagation with the Darcy-Brinkman-Forchheimer framework

    KAUST Repository

    Wu, Yuanqing

    2015-07-09

    The acid treatment of carbonate reservoirs is a widely practiced oil and gas well stimulation technique. The injected acid dissolves the material near the wellbore and creates flow channels that establish a good connectivity between the reservoir and the well. Such flow channels are called wormholes. Unlike the traditional simulation technology relying on the Darcy framework, the new Darcy-Brinkman-Forchheimer (DBF) framework is introduced to simulate the wormhole-forming process. The DBF framework considers both large and small porosity conditions and should produce better simulation results than the Darcy framework. To process the huge number of cells in the simulation grid and shorten the long simulation time of the traditional serial code, a parallel code in FORTRAN 90 with MPI was developed. The experimenting field approach to setting coefficients in the model equations was also introduced, and a procedure for filling in the coefficient matrix of the linear system in the solver was described. After this, 2D dissolution experiments were carried out. In the experiments, different configurations of wormholes and a series of properties simulated by both frameworks were compared. We conclude that the numerical results of the DBF framework are more wormhole-like and more stable than those of the Darcy framework, a demonstration of the advantages of the DBF framework. Finally, the scalability of the parallel code was evaluated, and we conclude that superlinear scalability can be achieved. © 2015 Elsevier Ltd.
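
    For reference, the momentum balance in a DBF model is commonly written in a form like the following; the exact formulation in the paper may differ in details such as the effective viscosity term:

```latex
\frac{\rho}{\phi}\frac{\partial \mathbf{u}}{\partial t}
  = -\nabla p
  - \frac{\mu}{K}\,\mathbf{u}
  + \mu_e \nabla^2 \mathbf{u}
  - \frac{\beta \rho}{\sqrt{K}}\,\lvert\mathbf{u}\rvert\,\mathbf{u}
```

    Here phi is porosity, K permeability, mu_e an effective (Brinkman) viscosity, and beta the Forchheimer coefficient. The Darcy drag term dominates in the intact low-porosity matrix, while the Brinkman and Forchheimer terms matter in the open channel, which is why the combined framework can handle both regimes as the wormhole dissolves.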

  15. NEVESIM: Event-Driven Neural Simulation Framework with a Python Interface

    Directory of Open Access Journals (Sweden)

    Dejan ePecevski

    2014-08-01

    NEVESIM is a software package for event-driven simulation of networks of spiking neurons with a fast simulation core in C++, and a scripting user interface in the Python programming language. It supports simulation of heterogeneous networks with different types of neurons and synapses, and can be easily extended by the user with new neuron and synapse types. To enable heterogeneous networks and extensibility, NEVESIM is designed to decouple the simulation logic of communicating events (spikes) between the neurons at a network level from the implementation of the internal dynamics of individual neurons. In this paper we will present the simulation framework of NEVESIM, its concepts and features, as well as some aspects of the object-oriented design approaches and simulation strategies that were utilized to efficiently implement the concepts and functionalities of the framework. We will also give an overview of the Python user interface, its basic commands and constructs, and also discuss the benefits of integrating NEVESIM with Python. One of the valuable capabilities of the simulator is to simulate exactly and efficiently networks of stochastic spiking neurons from the recently developed theoretical framework of neural sampling. This functionality was implemented as an extension on top of the basic NEVESIM framework. Altogether, the intended purpose of the NEVESIM framework is to provide a basis for further extensions that support simulation of various neural network models incorporating different neuron and synapse types that can potentially also use different simulation strategies.

  16. NEVESIM: event-driven neural simulation framework with a Python interface.

    Science.gov (United States)

    Pecevski, Dejan; Kappel, David; Jonke, Zeno

    2014-01-01

    NEVESIM is a software package for event-driven simulation of networks of spiking neurons with a fast simulation core in C++, and a scripting user interface in the Python programming language. It supports simulation of heterogeneous networks with different types of neurons and synapses, and can be easily extended by the user with new neuron and synapse types. To enable heterogeneous networks and extensibility, NEVESIM is designed to decouple the simulation logic of communicating events (spikes) between the neurons at a network level from the implementation of the internal dynamics of individual neurons. In this paper we will present the simulation framework of NEVESIM, its concepts and features, as well as some aspects of the object-oriented design approaches and simulation strategies that were utilized to efficiently implement the concepts and functionalities of the framework. We will also give an overview of the Python user interface, its basic commands and constructs, and also discuss the benefits of integrating NEVESIM with Python. One of the valuable capabilities of the simulator is to simulate exactly and efficiently networks of stochastic spiking neurons from the recently developed theoretical framework of neural sampling. This functionality was implemented as an extension on top of the basic NEVESIM framework. Altogether, the intended purpose of the NEVESIM framework is to provide a basis for further extensions that support simulation of various neural network models incorporating different neuron and synapse types that can potentially also use different simulation strategies.
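
    The core of an event-driven spiking simulation, spikes as time-stamped events on a priority queue decoupled from each neuron's internal dynamics, can be sketched as follows. This is a toy illustration in the spirit of the abstract, not NEVESIM code.

```python
import heapq

# Miniature event-driven loop: spike events carry (time, target, weight);
# neuron dynamics are updated only when an event is delivered.

class Neuron:
    def __init__(self, threshold=2.0):
        self.v = 0.0
        self.threshold = threshold

    def receive(self, weight):
        """Integrate an incoming spike; return True if this neuron fires."""
        self.v += weight
        if self.v >= self.threshold:
            self.v = 0.0
            return True
        return False

def simulate(neurons, synapses, initial_events, t_end=10.0):
    """synapses: {source_id: [(target_id, weight, delay), ...]}"""
    events = list(initial_events)  # entries are (time, target_id, weight)
    heapq.heapify(events)
    spikes = []
    while events:
        t, nid, w = heapq.heappop(events)
        if t > t_end:
            break
        if neurons[nid].receive(w):
            spikes.append((t, nid))
            for tgt, wt, delay in synapses.get(nid, []):
                heapq.heappush(events, (t + delay, tgt, wt))
    return spikes

neurons = [Neuron(), Neuron()]
synapses = {0: [(1, 2.0, 0.5)]}  # neuron 0 drives neuron 1 after 0.5 time units
spikes = simulate(neurons, synapses, [(0.0, 0, 2.0)])
print(spikes)  # [(0.0, 0), (0.5, 1)]
```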

  17. Virtual Micromagnetics: A Framework for Accessible and Reproducible Micromagnetic Simulation

    Directory of Open Access Journals (Sweden)

    Mark Vousden

    2016-10-01

    Computational micromagnetics requires the numerical solution of partial differential equations to resolve complex interactions in magnetic nanomaterials. The Virtual Micromagnetics project described here provides virtual machine simulation environments to run open-source micromagnetic simulation packages [1]. These environments allow easy access to simulation packages that are often difficult to compile and install, and enable simulations and their data to be shared and stored in a single virtual hard disk file, which encourages reproducible research. Virtual Micromagnetics can be extended to automate the installation of micromagnetic simulation packages on non-virtual machines, and to support closed-source and new open-source simulation packages, including packages from disciplines other than micromagnetics, encouraging reuse. Virtual Micromagnetics is stored in a public GitHub repository under a three-clause Berkeley Software Distribution (BSD) license.

  18. Integrating Realistic Simulation Engines within the MORSE Framework

    OpenAIRE

    Degroote, Arnaud; Koch, Pierrick; Lacroix, Simon

    2015-01-01

    The complexity of robotics is due to the tight interactions between hardware, complex software, and environments. While real-world experience is the only way to assess the efficiency and robustness of a robotics system, simulations can help to pave the way to actual experiments. But an overall robotics system requires simulations at a level of realism that no holistic simulator can provide, given the wide spectrum of disciplines and physical processes involved. Thi...

  19. CrusDe: A plug-in based simulation framework for composable CRUStal DEformation simulations

    Science.gov (United States)

    Grapenthin, R.

    2008-12-01

    Within geoscience, Green's method is an established mathematical tool to analyze the dynamics of the Earth's crust in response to the application of a mass force, e.g. a surface load. Different abstractions of the Earth's interior, as well as the particular effects caused by such a force, are expressed by means of a Green's function, G, which is a particular solution to an inhomogeneous differential equation with boundary conditions. Surface loads, L, are defined by real data or as analytical expressions. The response of the crust to a surface load is obtained by a 2D convolution (**) of the Green's function with this load. The crustal response can be thought of as an instantaneous displacement followed by a gradual transition towards the final relaxed state of displacement. A relaxation function, R, describing such a transition depends on the rheological model for the ductile layer of the crust. The 1D convolution (*) of the relaxation function with a load history, H, makes it possible to include the temporal evolution of the surface load in a model. The product of the two convolution results expresses the displacement (rate) of the crust, U, at a certain time t: U_t = (R * H)_t · (G ** L). Rather than implementing a variety of specific models, approaching crustal deformation problems from this general formulation opens the opportunity to consider reuse of model building blocks within a more flexible simulation framework. Model elements (Green's function, load function, etc.), operators, pre- and postprocessing, and even input and output routines could be part of a framework that enables a user to freely compose software components to resemble this equation. The simulation framework CrusDe implements the equation in the proposed way. CrusDe's architecture defines interfaces for generic communication between the simulation core and the model elements. Thus, exchangeability of the particular model element implementations is possible. In the presented plug
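
    The factorization U_t = (R * H)_t · (G ** L) can be checked numerically on toy kernels: a 1D temporal convolution (relaxation with load history) scales a 2D spatial convolution (Green's function with load). All kernel values below are invented purely to exercise the arithmetic; they are not geophysical data.

```python
# Toy numerical sketch of U_t = (R * H)_t · (G ** L) with direct-sum
# convolutions. Kept deliberately tiny; real models use FFT-based
# convolution over large grids.

def conv1d(a, b):
    """Full 1D convolution by direct summation."""
    out = [0.0] * (len(a) + len(b) - 1)
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            out[i + j] += x * y
    return out

def conv2d(G, L):
    """Full 2D convolution by direct summation (toy sizes only)."""
    gn, gm, ln, lm = len(G), len(G[0]), len(L), len(L[0])
    out = [[0.0] * (gm + lm - 1) for _ in range(gn + ln - 1)]
    for i in range(gn):
        for j in range(gm):
            for k in range(ln):
                for m in range(lm):
                    out[i + k][j + m] += G[i][j] * L[k][m]
    return out

G = [[0.1, 0.2], [0.2, 0.1]]   # toy Green's function
L = [[0.0, 1.0], [1.0, 0.0]]   # toy surface load
R = [1.0, 0.6, 0.4]            # toy relaxation function
H = [1.0, 1.0]                 # toy load history (step load held two steps)

temporal = conv1d(R, H)        # (R * H)
spatial = conv2d(G, L)         # (G ** L)
t = 1
U_t = [[temporal[t] * v for v in row] for row in spatial]
print(round(temporal[t], 4))
```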

  20. Software for the international linear collider: Simulation and reconstruction frameworks

    Indian Academy of Sciences (India)

    Ties Behnke; Frank Gaede; DESY Hamburg

    2007-12-01

    Software plays an increasingly important role already in the early stages of a large project like the ILC. In an international collaboration, a data format for ILC detector and physics studies has been developed. Building upon this, software frameworks are made available that ease event reconstruction and analysis.

  1. A modular simulation framework for colonoscopy using a new haptic device.

    Science.gov (United States)

    Hellier, David; Samur, Evren; Passenger, Josh; Spälter, Ulrich; Frimmel, Hans; Appleyard, Mark; Bleuler, Hannes; Ourselin, Sébastien

    2008-01-01

    We have developed a multi-threaded framework for colonoscopy simulation utilising OpenGL with an interface to a real-time prototype colonoscopy haptic device. A modular framework has enabled us to support multiple haptic devices and efficiently integrate new research into physically based modelling of the colonoscope, colon and surrounding organs. The framework supports GPU accelerated algorithms as runtime modules, allowing the real-time calculations required for haptic feedback.

  2. Environmental monitoring at Argonne National Laboratory. Annual report, 1981

    Energy Technology Data Exchange (ETDEWEB)

    Golchert, N.W.; Duffy, T.L.; Sedlet, J.

    1982-03-01

    The results of the environmental monitoring program at Argonne National Laboratory for 1981 are presented and discussed. To evaluate the effect of Argonne operations on the environment, measurements were made for a variety of radionuclides in air, surface water, soil, grass, bottom sediment, and milk; for a variety of chemical constituents in air, surface water, and Argonne effluent water; and of the environmental penetrating radiation dose. Sample collections and measurements were made at the site boundary and off the Argonne site for comparison purposes. Some on-site measurements were made to aid in the interpretation of the boundary and off-site data. The results of the program are interpreted in terms of the sources and origin of the radioactive and chemical substances (natural, fallout, Argonne, and other) and are compared with applicable environmental quality standards. The potential radiation dose to off-site population groups is also estimated.

  3. Environmental monitoring at Argonne National Laboratory. Annual report for 1980

    Energy Technology Data Exchange (ETDEWEB)

    Golchert, N. W.; Duffy, T. L.; Sedlet, J.

    1981-03-01

    The results of the environmental monitoring program at Argonne National Laboratory for 1980 are presented and discussed. To evaluate the effect of Argonne operations on the environment, measurements were made for a variety of radionuclides in air, surface water, soil, grass, bottom sediment, and foodstuffs; for a variety of chemical constituents in air, surface water, and Argonne effluent water; and of the environmental penetrating radiation dose. Sample collections and measurements were made at the site boundary and off the Argonne site for comparison purposes. Some on-site measurements were made to aid in the interpretation of the boundary and off-site data. The results of the program are interpreted in terms of the sources and origin of the radioactive and chemical substances (natural, fallout, Argonne, and other) and are compared with applicable environmental quality standards. The potential radiation dose to off-site population groups is also estimated.

  4. Environmental monitoring at Argonne National Laboratory. Annual report for 1979

    Energy Technology Data Exchange (ETDEWEB)

    Golchert, N. W.; Duffy, T. L.; Sedlet, J.

    1980-03-01

    The results of the environmental monitoring program at Argonne National Laboratory for 1979 are presented and discussed. To evaluate the effect of Argonne operations on the environment, measurements were made for a variety of radionuclides in air, surface water, Argonne effluent water, soil, grass, bottom sediment, and foodstuffs; for a variety of chemical constituents in air, surface water, and Argonne effluent water; and of the environmental penetrating radiation dose. Sample collections and measurements were made at the site boundary and off the Argonne site for comparison purposes. Some on-site measurements were made to aid in the interpretation of the boundary and off-site data. The results of the program are interpreted in terms of the sources and origin of the radioactive and chemical substances and are compared with applicable environmental quality standards. The potential radiation dose to off-site population groups is also estimated.

  5. Environmental monitoring at Argonne National Laboratory. Annual report for 1983

    Energy Technology Data Exchange (ETDEWEB)

    Golchert, N.W.; Duffy, T.L.; Sedlet, J.

    1984-03-01

    The results of the environmental monitoring program at Argonne National Laboratory for 1983 are presented and discussed. To evaluate the effect of Argonne operations on the environment, measurements were made for a variety of radionuclides in air, surface water, soil, grass, bottom sediment, and milk; for a variety of chemical constituents in air, surface water, ground water, and Argonne effluent water; and of the environmental penetrating radiation dose. Sample collections and measurements were made at the site boundary and off the Argonne site for comparison purposes. Some on-site measurements were made to aid in the interpretation of the boundary and off-site data. The potential radiation dose to off-site population groups is also estimated. The results of the program are interpreted in terms of the sources and origin of the radioactive and chemical substances (natural, fallout, Argonne, and other) and are compared with applicable environmental quality standards. 19 references, 8 figures, 49 tables.

  6. Environmental monitoring at Argonne National Laboratory. Annual report for 1982

    Energy Technology Data Exchange (ETDEWEB)

    Golchert, N.W.; Duffy, T.L.; Sedlet, J.

    1983-03-01

    The results of the environmental monitoring program at Argonne National Laboratory for 1982 are presented and discussed. To evaluate the effect of Argonne operations on the environment, measurements were made for a variety of radionuclides in air, surface water, soil, grass, bottom sediment, and milk; for a variety of chemical constituents in air, surface water, ground water, and Argonne effluent water; and of the environmental penetrating radiation dose. Sample collections and measurements were made at the site boundary and off the Argonne site for comparison purposes. Some on-site measurements were made to aid in the interpretation of the boundary and off-site data. The results of the program are interpreted in terms of the sources and origin of the radioactive and chemical substances (natural, fallout, Argonne, and other) and are compared with applicable environmental quality standards. The potential radiation dose to off-site population groups is also estimated.

  7. Developing and Implementing a Framework of Participatory Simulation for Mobile Learning Using Scaffolding

    Science.gov (United States)

    Yin, Chengjiu; Song, Yanjie; Tabata, Yoshiyuki; Ogata, Hiroaki; Hwang, Gwo-Jen

    2013-01-01

    This paper proposes a conceptual framework, scaffolding participatory simulation for mobile learning (SPSML), used on mobile devices for helping students learn conceptual knowledge in the classroom. As the pedagogical design, the framework adopts an experiential learning model, which consists of five sequential but cyclic steps: the initial stage,…

  8. Environmental assessment related to the operation of Argonne National Laboratory, Argonne, Illinois

    Energy Technology Data Exchange (ETDEWEB)

    1982-08-01

    In order to evaluate the environmental impacts of Argonne National Laboratory (ANL) operations, this assessment includes a descriptive section which is intended to provide sufficient detail to allow the various impacts to be viewed in proper perspective. In particular, details are provided on site characteristics, current programs, characterization of the existing site environment, and in-place environmental monitoring programs. In addition, specific facilities and operations that could conceivably impact the environment are described at length. 77 refs., 16 figs., 47 tabs.

  9. The Unified Behavior Framework for the Simulation of Autonomous Agents

    Science.gov (United States)

    2015-03-01


  10. A general framework for perfect simulation of long memory processes

    CERN Document Server

    De Santis, Emilio

    2010-01-01

    In this paper a general approach for the perfect simulation of a stationary process with at most countable state space is outlined. The process is specified through a kernel, prescribing the probability of each state conditional to the whole past history. We follow the seminal paper of Comets, Fernandez and Ferrari, where sufficient conditions for the construction of a certain perfect simulation algorithm have been given. We generalize this approach by defining backward coalescence times for this kind of process; this allows us to construct perfect simulation algorithms under weaker conditions.
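    The backward-coalescence idea can be illustrated in its simplest special case, a finite-state Markov chain, via the classic coupling-from-the-past construction (the paper treats far more general long-memory kernels). The function name and structure below are a hedged sketch, not the authors' algorithm.

```python
import random

def cftp_sample(P, seed=0):
    """Coupling from the past for a finite-state Markov chain with transition
    matrix P (rows sum to 1). Copies are started from every state at time -T
    and driven by one shared innovation sequence; once all copies coalesce by
    time 0, the common state is an exact draw from the stationary law."""
    def step(state, u):
        # inverse-transform update rule shared by all copies
        acc = 0.0
        for nxt, p in enumerate(P[state]):
            acc += p
            if u < acc:
                return nxt
        return len(P) - 1

    rng = random.Random(seed)
    innovations = []  # innovations[k] drives the step taken at time -(k+1)
    T = 1
    while True:
        while len(innovations) < T:
            innovations.append(rng.random())
        finals = set()
        for s0 in range(len(P)):
            s = s0
            for k in range(T - 1, -1, -1):  # run from time -T up to time 0
                s = step(s, innovations[k])
            finals.add(s)
        if len(finals) == 1:          # backward coalescence reached
            return finals.pop()
        T *= 2                        # otherwise look further into the past
```

    Reusing the same innovations when T is doubled is essential: it is what makes the returned state an exact stationary sample rather than an approximation.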

  11. A general framework for perfect simulation of long memory processes

    OpenAIRE

    De Santis, Emilio; Piccioni, Mauro

    2010-01-01

    In this paper a general approach for the perfect simulation of a stationary process with at most countable state space is outlined. The process is specified through a kernel, prescribing the probability of each state conditional to the whole past history. We follow the seminal paper of Comets, Fernandez and Ferrari, where sufficient conditions for the construction of a certain perfect simulation algorithm have been given. We generalize this approach by defining backward coalescence times for ...

  12. A Simulation Framework for Optimal Energy Storage Sizing

    OpenAIRE

    Carlos Suazo-Martínez; Eduardo Pereira-Bonvallet; Rodrigo Palma-Behnke

    2014-01-01

    Despite the increasing interest in Energy Storage Systems (ESS), quantification of their technical and economic benefits remains a challenge. To assess the use of ESS, a simulation approach for ESS optimal sizing is presented. The algorithm is based on an adapted Unit Commitment, including ESS operational constraints, and the use of high performance computing (HPC). Multiple short-term simulations are carried out within a multiple year horizon. Evaluation is performed for Chile's No...

  13. A framework for simulating ultrasound imaging based on first order nonlinear pressure–velocity relations

    DEFF Research Database (Denmark)

    Du, Yigang; Fan, Rui; Li, Yong

    2016-01-01

    An ultrasound imaging framework modeled with the first order nonlinear pressure–velocity relations (NPVR) based simulation and implemented by a half-time staggered solution and pseudospectral method is presented in this paper. The framework is capable of simulating linear and nonlinear ultrasound...... propagation and reflections in a heterogeneous medium with different sound speeds and densities. It can be initialized with arbitrary focus, excitation and apodization for multiple individual channels in both 2D and 3D spatial fields. The simulated channel data can be generated using this framework......, and ultrasound image can be obtained by beamforming the simulated channel data. Various results simulated by different algorithms are illustrated for comparisons. The root mean square (RMS) errors for each compared pulses are calculated. The linear propagation is validated by an angular spectrum approach (ASA...

  14. Hierarchical Petascale Simulation Framework for Stress Corrosion Cracking

    Energy Technology Data Exchange (ETDEWEB)

    Vashishta, Priya

    2014-12-01

    Reaction Dynamics in Energetic Materials: Detonation is a prototype of mechanochemistry, in which mechanically and thermally induced chemical reactions far from equilibrium exhibit vastly different behaviors. It is also one of the hardest multiscale physics problems, in which diverse length and time scales play important roles. The CACS group has performed multimillion-atom reactive MD simulations to reveal a novel two-stage reaction mechanism during the detonation of cyclotrimethylenetrinitramine (RDX) crystal. Rapid production of N2 and H2O within ~10 ps is followed by delayed production of CO molecules within ~ 1 ns. They found that further decomposition towards the final products is inhibited by the formation of large metastable C- and O-rich clusters with fractal geometry. The CACS group has also simulated the oxidation dynamics of close-packed aggregates of aluminum nanoparticles passivated by oxide shells. Their simulation results suggest an unexpectedly active role of the oxide shell as a nanoreactor.

  15. A Monte Carlo Simulation Framework for Testing Cosmological Models

    Directory of Open Access Journals (Sweden)

    Heymann Y.

    2014-10-01

    Full Text Available We tested alternative cosmologies using Monte Carlo simulations based on the sampling method of the zCosmos galactic survey. The survey encompasses a collection of observable galaxies with respective redshifts that have been obtained for a given spectroscopic area of the sky. Using a cosmological model, we can convert the redshifts into light-travel times and, by slicing the survey into small redshift buckets, compute a curve of galactic density over time. Because foreground galaxies obstruct the images of more distant galaxies, we simulated the theoretical galactic density curve using an average galactic radius. By comparing the galactic density curves of the simulations with that of the survey, we could assess the cosmologies. We applied the test to the expanding-universe cosmology of de Sitter and to a dichotomous cosmology.
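    The bucketing step described above can be sketched as follows. The names galactic_density_curve and z_to_time are hypothetical, and the redshift-to-light-travel-time mapping depends entirely on the cosmological model under test.

```python
import numpy as np

def galactic_density_curve(redshifts, z_to_time, n_buckets=20):
    """Slice a survey into redshift buckets and return galaxy counts per
    unit light-travel time. z_to_time maps a redshift array to light-travel
    times under a chosen cosmology; all names here are illustrative."""
    z = np.asarray(redshifts, dtype=float)
    # equal-width redshift buckets spanning the survey
    edges = np.linspace(z.min(), z.max(), n_buckets + 1)
    counts, _ = np.histogram(z, bins=edges)
    # bucket widths expressed in light-travel time under the model
    dt = np.abs(np.diff(z_to_time(edges)))
    return counts / dt
```

    Feeding the same survey through different z_to_time mappings yields one density curve per cosmology, which can then be compared against the observed curve.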

  16. NASA Earth Observing System Simulator Suite (NEOS3): A Forward Simulation Framework for Observing System Simulation Experiments

    Science.gov (United States)

    Niamsuwan, N.; Tanelli, S.; Johnson, M. P.; Jacob, J. C.; Jaruwatanadilok, S.; Oveisgharan, S.; Dao, D.; Simard, M.; Turk, F. J.; Tsang, L.; Liao, T. H.; Chau, Q.

    2014-12-01

    Future Earth observation missions will produce a large volume of interrelated data sets that will help us to cross-calibrate and validate spaceborne sensor measurements. A forward simulator is a crucial tool for examining the quality of individual products as well as resolving discrepancies among related data sets. NASA Earth Observing System Simulator Suite (NEOS3) is a highly customizable forward simulation tool for Earth remote sensing instruments. Its three-stage simulation process converts the 3D geophysical description of the scene being observed to corresponding electromagnetic emission and scattering signatures, and finally to observable parameters as reported by a (passive or active) remote sensing instrument. User-configurable options include selection of models for describing geophysical properties of atmospheric particles and their effects on the signal of interest, selection of wave scattering and propagation models, and activation of simplifying assumptions (trading between computation time and solution accuracy). The next generation of NEOS3, to be released in 2015, will feature additional state-of-the-art electromagnetic scattering models for various types of the Earth's surfaces and ground covers (e.g. layered snowpack, forest, vegetated soil, and sea ice) tailored specifically for missions like GPM and SMAP. Also to be included in 2015 are dedicated functionalities and interfaces that facilitate integrating NEOS3 into Observing System Simulation Experiment (OSSE) environments. This new generation of NEOS3 can also utilize high performance computing resources (parallel processing and cloud computing) and can be scaled to handle large or computation-intensive problems. This presentation will highlight some notable features of NEOS3. Demonstration of its applications for evaluating new mission concepts, especially in the context of OSSE frameworks, will also be presented.

  17. Simulation toolkit with CMOS detector in the framework of hadrontherapy

    Directory of Open Access Journals (Sweden)

    Rescigno R.

    2014-03-01

    Full Text Available Proton imaging can be seen as a powerful technique for on-line monitoring of ion range during carbon ion therapy irradiation. The protons detection technique uses, as three-dimensional tracking system, a set of CMOS sensor planes. A simulation toolkit based on GEANT4 and ROOT is presented including detector response and reconstruction algorithm.

  18. Simulation toolkit with CMOS detector in the framework of hadrontherapy

    Science.gov (United States)

    Rescigno, R.; Finck, Ch.; Juliani, D.; Baudot, J.; Dauvergne, D.; Dedes, G.; Krimmer, J.; Ray, C.; Reithinger, V.; Rousseau, M.; Testa, E.; Winter, M.

    2014-03-01

    Proton imaging can be seen as a powerful technique for on-line monitoring of ion range during carbon ion therapy irradiation. The protons detection technique uses, as three-dimensional tracking system, a set of CMOS sensor planes. A simulation toolkit based on GEANT4 and ROOT is presented including detector response and reconstruction algorithm.

  19. BOUT++: a framework for parallel plasma fluid simulations

    CERN Document Server

    Dudson, B D; Xu, X Q; Snyder, P B; Wilson, H R

    2008-01-01

    A new modular code called BOUT++ is presented, which simulates 3D fluid equations in curvilinear coordinates. Although aimed at simulating Edge Localised Modes (ELMs) in tokamak X-point geometry, the code is able to simulate a wide range of fluid models (magnetised and unmagnetised) involving an arbitrary number of scalar and vector fields, in a wide range of geometries. Time evolution is fully implicit, and 3rd-order WENO schemes are implemented. Benchmarks are presented for linear and non-linear problems (the Orszag-Tang vortex) showing good agreement. Performance of the code is tested by scaling with problem size and processor number, showing efficient scaling to thousands of processors. Linear initial-value simulations of ELMs using reduced ideal MHD are presented, and the results compared to the ELITE linear MHD eigenvalue code. The resulting mode-structures and growth-rate are found to be in good agreement (BOUT++ = 0.245, ELITE = 0.239). To our knowledge, this is the first time dissipationless, initial...

  20. A Component-Based Dataflow Framework for Simulation and Visualization

    NARCIS (Netherlands)

    Telea, Alexandru

    1999-01-01

    Reuse in the context of scientific simulation applications has mostly taken the form of procedural or object-oriented libraries. End users of such systems are, however, often not software experts and need very simple, possibly interactive ways to build applications from domain-specific components and t

  1. Designing a Virtual Olympic Games Framework by Using Simulation in Web 2.0 Technologies

    Science.gov (United States)

    Stoilescu, Dorian

    2013-01-01

    Instructional simulation had major difficulties in the past for offering limited possibilities in practice and learning. This article proposes a link between instructional simulation and Web 2.0 technologies. More exactly, I present the design of the Virtual Olympic Games Framework (VOGF), as a significant demonstration of how interactivity in…

  2. Power grid simulation applications developed using the GridPACK™ high performance computing framework

    Energy Technology Data Exchange (ETDEWEB)

    Jin, Shuangshuang; Chen, Yousu; Diao, Ruisheng; Huang, Zhenyu (Henry); Perkins, William; Palmer, Bruce

    2016-12-01

    This paper describes the GridPACK™ software framework for developing power grid simulations that can run on high performance computing platforms, with several example applications (dynamic simulation, static contingency analysis, and dynamic contingency analysis) that have been developed using GridPACK.

  3. A general simulation model developing process based on five-object framework

    Institute of Scientific and Technical Information of China (English)

    胡安斌; 伞冶; 陈建明; 陈永强

    2003-01-01

    Different paradigms that relate verification and validation to the simulation model have different development processes. A simulation model development process based on the Five-Object Framework (FOF) is discussed in this paper. An example is given to demonstrate the applications of the proposed method.

  4. Hybrid framework for the simulation of stochastic chemical kinetics

    Science.gov (United States)

    Duncan, Andrew; Erban, Radek; Zygalakis, Konstantinos

    2016-12-01

    Stochasticity plays a fundamental role in various biochemical processes, such as cell regulatory networks and enzyme cascades. Isothermal, well-mixed systems can be modelled as Markov processes, typically simulated using the Gillespie Stochastic Simulation Algorithm (SSA) [25]. While easy to implement and exact, the computational cost of using the Gillespie SSA to simulate such systems can become prohibitive as the frequency of reaction events increases. This has motivated numerous coarse-grained schemes, where the "fast" reactions are approximated either using Langevin dynamics or deterministically. While such approaches provide a good approximation when all reactants are abundant, the approximation breaks down when one or more species exist only in small concentrations and the fluctuations arising from the discrete nature of the reactions become significant. This is particularly problematic when using such methods to compute statistics of extinction times for chemical species, as well as simulating non-equilibrium systems such as cell-cycle models in which a single species can cycle between abundance and scarcity. In this paper, a hybrid jump-diffusion model for simulating well-mixed stochastic kinetics is derived. It acts as a bridge between the Gillespie SSA and the chemical Langevin equation. For low reactant reactions the underlying behaviour is purely discrete, while purely diffusive when the concentrations of all species are large, with the two different behaviours coexisting in the intermediate region. A bound on the weak error in the classical large volume scaling limit is obtained, and three different numerical discretisations of the jump-diffusion model are described. The benefits of such a formalism are illustrated using computational examples.
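    The Gillespie SSA that the hybrid jump-diffusion scheme falls back to in the low-copy-number regime can be stated compactly for a birth-death process. The reaction system and parameter names below are illustrative, not taken from the paper.

```python
import random

def gillespie_birth_death(k_prod, k_deg, x0, t_end, seed=0):
    """Minimal Gillespie SSA for the birth-death system
    0 -> X (rate k_prod) and X -> 0 (rate k_deg * X).
    Returns the jump times and copy numbers of one exact trajectory."""
    rng = random.Random(seed)
    t, x = 0.0, x0
    times, counts = [t], [x]
    while t < t_end:
        a1 = k_prod        # propensity of the production reaction
        a2 = k_deg * x     # propensity of the degradation reaction
        a0 = a1 + a2
        if a0 == 0.0:
            break
        t += rng.expovariate(a0)      # exponential waiting time to next event
        if rng.random() * a0 < a1:    # pick which reaction fires
            x += 1
        else:
            x -= 1
        times.append(t)
        counts.append(x)
    return times, counts
```

    For this system the stationary distribution is Poisson with mean k_prod / k_deg, which provides a convenient sanity check on long trajectories; the hybrid method of the paper replaces the per-event loop with Langevin diffusion whenever copy numbers are large.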

  5. A framework for service enterprise workflow simulation with multi-agents cooperation

    Science.gov (United States)

    Tan, Wenan; Xu, Wei; Yang, Fujun; Xu, Lida; Jiang, Chuanqun

    2013-11-01

    Process dynamic modelling for service business is the key technique for service-oriented information systems and service business management, and the workflow model of business processes is the core part of service systems. Service business workflow simulation is the prevalent approach used to analyze service business processes dynamically. The generic method for service business workflow simulation is based on discrete-event queuing theory, which lacks flexibility and scalability. In this paper, we propose a service-workflow-oriented framework for the process simulation of service businesses using multi-agent cooperation to address the above issues. Social rationality of agents is introduced into the proposed framework. Adopting rationality as one social factor in decision-making strategies, flexible scheduling of activity instances has been implemented. A system prototype has been developed to validate the proposed simulation framework through a business case study.

  6. A Framework for the Measurement of Simulated Behavior Performance

    Science.gov (United States)

    2011-03-24


  7. NSME: a framework for network worm modeling and simulation

    OpenAIRE

    Lin, Siming; Cheng, Xueqi

    2006-01-01

    Various worms have had a devastating impact on the Internet. Packet-level network modeling and simulation has become an approach to finding effective countermeasures against the worm threat. However, current alternatives are not well suited to this purpose. For instance, they mostly focus on the details of the lower layers of the network, so the abstraction of the application layer is very coarse. In our work, we propose a formal description of network and worm models, and define network virtualization level...

  8. Digital system verification a combined formal methods and simulation framework

    CERN Document Server

    Li, Lun

    2010-01-01

    Integrated circuit capacity follows Moore's law, and chips are commonly produced at the time of this writing with over 70 million gates per device. Ensuring correct functional behavior of such large designs before fabrication poses an extremely challenging problem. Formal verification validates the correctness of the implementation of a design with respect to its specification through mathematical proof techniques. Formal techniques have been emerging as commercialized EDA tools in the past decade. Simulation remains a predominantly used tool to validate a design in industry. After more than 5

  9. Modeling and Simulation Framework for Flow-Based Microfluidic Biochips

    DEFF Research Database (Denmark)

    Schmidt, Morten Foged; Minhass, Wajid Hassan; Pop, Paul

    2013-01-01

    Microfluidic biochips are replacing the conventional biochemical analyzers and are able to integrate the necessary functions for biochemical analysis on-chip. In this paper we are interested in flow-based biochips, in which the fluidic flow is manipulated using integrated microvalves. By combining...... and error prone. In this paper, we present an Integrated Development Environment (IDE), which addresses (i) schematic capture of the biochip architecture and biochemical application, (ii) logic simulation of an application running on a biochip, and is able to integrate the high level synthesis tasks we have...

  10. Atomistic Simulation of Protein Encapsulation in Metal-Organic Frameworks.

    Science.gov (United States)

    Zhang, Haiyang; Lv, Yongqin; Tan, Tianwei; van der Spoel, David

    2016-01-28

    Fabrication of metal-organic frameworks (MOFs) with large apertures has opened a brand-new research area for the selective encapsulation of biomolecules within MOF nanopores; however, the underlying inclusion mechanism has yet to be clarified. Here we report a molecular dynamics study on the mechanism of protein encapsulation in MOFs. Evaluation of the binding of amino acid side chain analogues reveals that van der Waals interaction is the main driving force for the binding and that guest size acts as a key factor predicting protein binding with MOFs. Analysis of the conformation and thermodynamic stability of the miniprotein Trp-cage encapsulated in a series of MOFs with varying pore apertures and surface chemistries indicates that protein encapsulation can be achieved via maintaining a polar/nonpolar balance in the MOF surface through tunable modification of organic linkers and Mg-O chelating moieties. Such modifications endow MOFs with a more biocompatible confinement. This work provides guidelines for selective inclusion of biomolecules within MOFs and facilitates MOF functions as a new class of host materials and molecular chaperones.

  11. Argonne National Laboratory research offers clues to Alzheimer's plaques

    CERN Multimedia

    2003-01-01

    Researchers from Argonne National Laboratory and the University of Chicago have developed methods to directly observe the structure and growth of microscopic filaments that form the characteristic plaques found in the brains of those with Alzheimer's Disease (1 page).

  12. High-temperature superconductor applications development at Argonne National Laboratory

    Science.gov (United States)

    Hull, J. R.; Poeppel, R. B.

    1992-02-01

    Developments at Argonne National Laboratory of near and intermediate term applications using high-temperature superconductors are discussed. Near-term applications of liquid-nitrogen depth sensors, current leads, and magnetic bearings are discussed in detail.

  13. Turbulent Simulations of Divertor Detachment Based On BOUT++ Framework

    Science.gov (United States)

    Chen, Bin; Xu, Xueqiao; Xia, Tianyang; Ye, Minyou

    2015-11-01

    The China Fusion Engineering Testing Reactor is under conceptual design, acting as a bridge between ITER and DEMO. Detached divertor operation offers great promise for reducing the heat flux onto divertor target plates to levels with acceptable erosion. Therefore, a density scan is performed via an increase of D2 gas puffing rates in the range of 0.0 ~ 5.0 ×10^23 s^-1 using the B2-Eirene/SOLPS 5.0 code package to study heat flux control and the impurity screening property. As the density increases, the divertor operation status changes gradually, from the low-recycling regime to the high-recycling regime and finally to detachment. Significant radiation loss inside the confined plasma in the divertor region during detachment leads to strong parallel density and temperature gradients. Based on the SOLPS simulations, BOUT++ simulations will be presented to investigate the stability and turbulent transport under divertor plasma detachment, particularly the strong parallel-gradient-driven instabilities and enhanced plasma turbulence that spread heat flux over larger surface areas. The correlation between outer mid-plane and divertor turbulence and the related transport will be analyzed. Prepared by LLNL under Contract DE-AC52-07NA27344. LLNL-ABS-675075.

  14. A configurable distributed high-performance computing framework for satellite's TDI-CCD imaging simulation

    Science.gov (United States)

    Xue, Bo; Mao, Bingjing; Chen, Xiaomei; Ni, Guoqiang

    2010-11-01

    This paper presents a configurable distributed high-performance computing (HPC) framework for TDI-CCD imaging simulation. It uses the strategy pattern to accommodate multiple algorithms, helping to decrease simulation time at low cost. Imaging simulation for a TDI-CCD mounted on a satellite comprises four processes: 1) degradation from the atmosphere, 2) degradation from the optical system, 3) degradation from the TDI-CCD electronics plus a re-sampling process, and 4) data integration. Processes 1) to 3) use data-intensive algorithms such as the FFT, convolution, and Lagrange interpolation, which require substantial CPU power. Even on an Intel Xeon X5550 processor, the regular serial method takes more than 30 hours for a simulation whose result image is 1500 × 1462 pixels. A literature study found no mature distributed HPC framework in this field. We therefore developed a distributed computing framework for TDI-CCD imaging simulation, based on WCF[1], that uses a client/server (C/S) architecture and harvests free CPU resources on the LAN. The server pushes the tasks of processes 1) to 3) to this free computing capacity, yielding HPC at low cost. In a computing experiment with 4 symmetric nodes and 1 server, the framework reduced simulation time by about 74%; adding more asymmetric nodes to the computing network decreased the time further. In conclusion, this framework can provide unlimited computing capacity provided that the network and the task-management server can support it, and it offers a new HPC solution for TDI-CCD imaging simulation and similar applications.
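    The strategy-pattern idea of swapping degradation algorithms without touching the pipeline can be sketched independently of WCF; the stage names below are hypothetical stand-ins for the atmospheric, optical, and electronic degradation steps.

```python
from abc import ABC, abstractmethod

class Stage(ABC):
    """Strategy interface: one interchangeable step of the degradation chain."""
    @abstractmethod
    def apply(self, image):
        ...

class Scale(Stage):
    # hypothetical stand-in for an optics-degradation algorithm
    def __init__(self, factor):
        self.factor = factor
    def apply(self, image):
        return [px * self.factor for px in image]

class Offset(Stage):
    # hypothetical stand-in for an electronics-degradation algorithm
    def __init__(self, bias):
        self.bias = bias
    def apply(self, image):
        return [px + self.bias for px in image]

class Pipeline:
    """Runs whichever concrete stages are configured, in order."""
    def __init__(self, stages):
        self.stages = stages
    def run(self, image):
        for stage in self.stages:
            image = stage.apply(image)
        return image
```

    A pipeline built as Pipeline([Scale(2), Offset(1)]) maps [3] to [7]; substituting a different Stage subclass changes an algorithm without modifying the pipeline, which is what lets such a framework distribute stages to whatever nodes are free.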

  15. Argonne National Lab gets Linux network teraflop cluster

    CERN Multimedia

    2003-01-01

    "Linux NetworX, Salt Lake City, Utah, has delivered an Evolocity II (E2) Linux cluster to Argonne National Laboratory that is capable of performing more than one trillion calculations per second (1 teraFLOP). The cluster, named "Jazz" by Argonne, is designed to provide optimum performance for multiple disciplines such as chemistry, physics and reactor engineering and will be used by the entire scientific community at the Lab" (1 page).

  16. CoRoBa, a Multi Mobile Robot Control and Simulation Framework

    Directory of Open Access Journals (Sweden)

    Eric Colon

    2008-11-01

    Full Text Available This paper describes the ongoing development of a multi-robot control framework named CoRoBa. CoRoBa is theoretically founded on the reification of Real-Time Design Patterns. It uses CORBA as its communication middleware and consequently benefits from the interoperability of this standard. A multi-robot 3D simulator written in Java3D integrates seamlessly with the framework. Several demonstration applications have been developed to validate the design and implementation choices.

  17. Argonne National Laboratory Site Environmental Report for Calendar Year 2013

    Energy Technology Data Exchange (ETDEWEB)

    Davis, T. M. [Argonne National Lab. (ANL), Argonne, IL (United States); Gomez, J. L. [Argonne National Lab. (ANL), Argonne, IL (United States); Moos, L. P. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2014-09-02

    This report discusses the status and the accomplishments of the environmental protection program at Argonne National Laboratory for calendar year 2013. The status of Argonne environmental protection activities with respect to compliance with the various laws and regulations is discussed, along with environmental management, sustainability efforts, environmental corrective actions, and habitat restoration. To evaluate the effects of Argonne operations on the environment, samples of environmental media collected on the site, at the site boundary, and off the Argonne site were analyzed and compared with applicable guidelines and standards. A variety of radionuclides were measured in air, surface water, on-site groundwater, and bottom sediment samples. In addition, chemical constituents in surface water, groundwater, and Argonne effluent water were analyzed. External penetrating radiation doses were measured, and the potential for radiation exposure to off-site population groups was estimated. Results are interpreted in terms of the origin of the radioactive and chemical substances (i.e., natural, Argonne, and other) and are compared with applicable standards intended to protect human health and the environment. A U.S. Department of Energy (DOE) dose calculation methodology, based on International Commission on Radiological Protection (ICRP) recommendations and the U.S. Environmental Protection Agency’s (EPA) CAP-88 Version 3 computer code, was used in preparing this report.

  18. Argonne National Laboratory Site Environmental report for calendar year 2009.

    Energy Technology Data Exchange (ETDEWEB)

    Golchert, N. W.; Davis, T. M.; Moos, L. P.

    2010-08-04

    This report discusses the status and the accomplishments of the environmental protection program at Argonne National Laboratory for calendar year 2009. The status of Argonne environmental protection activities with respect to compliance with the various laws and regulations is discussed, along with the progress of environmental corrective actions and restoration projects. To evaluate the effects of Argonne operations on the environment, samples of environmental media collected on the site, at the site boundary, and off the Argonne site were analyzed and compared with applicable guidelines and standards. A variety of radionuclides were measured in air, surface water, on-site groundwater, and bottom sediment samples. In addition, chemical constituents in surface water, groundwater, and Argonne effluent water were analyzed. External penetrating radiation doses were measured, and the potential for radiation exposure to off-site population groups was estimated. Results are interpreted in terms of the origin of the radioactive and chemical substances (i.e., natural, Argonne, and other) and are compared with applicable environmental quality standards. A U.S. Department of Energy (DOE) dose calculation methodology, based on International Commission on Radiological Protection recommendations and the U.S. Environmental Protection Agency's (EPA) CAP-88 Version 3 (Clean Air Act Assessment Package-1988) computer code, was used in preparing this report.

  19. Argonne National Laboratory site environmental report for calendar year 2006.

    Energy Technology Data Exchange (ETDEWEB)

    Golchert, N. W.; ESH/QA Oversight

    2007-09-13

    This report discusses the status and the accomplishments of the environmental protection program at Argonne National Laboratory for calendar year 2006. The status of Argonne environmental protection activities with respect to compliance with the various laws and regulations is discussed, along with the progress of environmental corrective actions and restoration projects. To evaluate the effects of Argonne operations on the environment, samples of environmental media collected on the site, at the site boundary, and off the Argonne site were analyzed and compared with applicable guidelines and standards. A variety of radionuclides were measured in air, surface water, on-site groundwater, and bottom sediment samples. In addition, chemical constituents in surface water, groundwater, and Argonne effluent water were analyzed. External penetrating radiation doses were measured, and the potential for radiation exposure to off-site population groups was estimated. Results are interpreted in terms of the origin of the radioactive and chemical substances (i.e., natural, fallout, Argonne, and other) and are compared with applicable environmental quality standards. A U.S. Department of Energy dose calculation methodology, based on International Commission on Radiological Protection recommendations and the U.S. Environmental Protection Agency's CAP-88 Version 3 (Clean Air Act Assessment Package-1988) computer code, was used in preparing this report.

  20. Argonne National Laboratory site enviromental report for calendar year 2008.

    Energy Technology Data Exchange (ETDEWEB)

    Golchert, N. W.; Davis, T. M.; Moos, L. P.

    2009-09-02

    This report discusses the status and the accomplishments of the environmental protection program at Argonne National Laboratory for calendar year 2008. The status of Argonne environmental protection activities with respect to compliance with the various laws and regulations is discussed, along with the progress of environmental corrective actions and restoration projects. To evaluate the effects of Argonne operations on the environment, samples of environmental media collected on the site, at the site boundary, and off the Argonne site were analyzed and compared with applicable guidelines and standards. A variety of radionuclides were measured in air, surface water, on-site groundwater, and bottom sediment samples. In addition, chemical constituents in surface water, groundwater, and Argonne effluent water were analyzed. External penetrating radiation doses were measured, and the potential for radiation exposure to off-site population groups was estimated. Results are interpreted in terms of the origin of the radioactive and chemical substances (i.e., natural, fallout, Argonne, and other) and are compared with applicable environmental quality standards. A U.S. Department of Energy dose calculation methodology, based on International Commission on Radiological Protection recommendations and the U.S. Environmental Protection Agency's CAP-88 Version 3 (Clean Air Act Assessment Package-1988) computer code, was used in preparing this report.

  1. Argonne National Laboratory site environmental report for calendar year 2007.

    Energy Technology Data Exchange (ETDEWEB)

    Golchert, N. W.; Davis, T. M.; Moos, L. P.; ESH/QA Oversight

    2008-09-09

    This report discusses the status and the accomplishments of the environmental protection program at Argonne National Laboratory for calendar year 2007. The status of Argonne environmental protection activities with respect to compliance with the various laws and regulations is discussed, along with the progress of environmental corrective actions and restoration projects. To evaluate the effects of Argonne operations on the environment, samples of environmental media collected on the site, at the site boundary, and off the Argonne site were analyzed and compared with applicable guidelines and standards. A variety of radionuclides were measured in air, surface water, on-site groundwater, and bottom sediment samples. In addition, chemical constituents in surface water, groundwater, and Argonne effluent water were analyzed. External penetrating radiation doses were measured, and the potential for radiation exposure to off-site population groups was estimated. Results are interpreted in terms of the origin of the radioactive and chemical substances (i.e., natural, fallout, Argonne, and other) and are compared with applicable environmental quality standards. A U.S. Department of Energy dose calculation methodology, based on International Commission on Radiological Protection recommendations and the U.S. Environmental Protection Agency's CAP-88 Version 3 (Clean Air Act Assessment Package-1988) computer code, was used in preparing this report.

  2. A detailed framework to incorporate dust in hydrodynamical simulations

    CERN Document Server

    Grassi, T; Haugboelle, T; Schleicher, D R G

    2016-01-01

    Dust plays a key role in the evolution of the ISM and its correct modelling in numerical simulations is therefore fundamental. We present a new and self-consistent model that treats grain thermal coupling with the gas, radiation balance, and surface chemistry for molecular hydrogen. This method can be applied to any dust distribution with an arbitrary number of grain types without affecting the overall computational cost. In this paper we describe in detail the physics and the algorithm behind our approach, and in order to test the methodology, we present some examples of astrophysical interest, namely (i) a one-zone collapse with complete gas chemistry and thermochemical processes, (ii) a 3D model of a low-metallicity collapse of a minihalo starting from cosmological initial conditions, and (iii) a turbulent molecular cloud with H-C-O chemistry (277 reactions), together with self-consistent cooling and heating solved on the fly. Although these examples employ the publicly available code KROME, our approach c...

  3. Bayesian uncertainty quantification and propagation in molecular dynamics simulations: A high performance computing framework

    Science.gov (United States)

    Angelikopoulos, Panagiotis; Papadimitriou, Costas; Koumoutsakos, Petros

    2012-10-01

    We present a Bayesian probabilistic framework for quantifying and propagating the uncertainties in the parameters of force fields employed in molecular dynamics (MD) simulations. We propose a highly parallel implementation of the transitional Markov chain Monte Carlo for populating the posterior probability distribution of the MD force-field parameters. Efficient scheduling algorithms are proposed to handle the MD model runs and to distribute the computations in clusters with heterogeneous architectures. Furthermore, adaptive surrogate models are proposed in order to reduce the computational cost associated with the large number of MD model runs. The effectiveness and computational efficiency of the proposed Bayesian framework are demonstrated in MD simulations of liquid and gaseous argon.
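The accept/reject kernel that transitional MCMC applies within each tempering stage can be sketched with plain random-walk Metropolis on a toy surrogate. Everything below is an invented stand-in, not the paper's setup: the linear "model" replaces an expensive MD observable, and the data, prior bounds, and step size are illustrative.

```python
import math
import random

random.seed(0)

# Toy surrogate: observable as a function of one force-field parameter theta.
def model(theta):
    return 2.0 * theta  # stands in for an expensive MD observable

data = [3.9, 4.1, 4.0]   # synthetic measurements; noise sigma assumed known
sigma = 0.1

def log_posterior(theta):
    if not (0.0 <= theta <= 10.0):   # flat prior on [0, 10]
        return -math.inf
    return -sum((d - model(theta)) ** 2 for d in data) / (2.0 * sigma ** 2)

# Random-walk Metropolis: propose, then accept with probability min(1, ratio).
theta, samples = 1.0, []
for _ in range(20000):
    prop = theta + random.gauss(0.0, 0.05)
    if math.log(random.random()) < log_posterior(prop) - log_posterior(theta):
        theta = prop
    samples.append(theta)

# Discard burn-in, then summarize the posterior.
posterior_mean = sum(samples[5000:]) / len(samples[5000:])
```

With data clustered around 4.0 and the surrogate model(theta) = 2·theta, the chain concentrates near theta ≈ 2; TMCMC adds a tempering schedule and resampling around this same kernel so the many model evaluations can be run in parallel.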

  4. Argonne National Laboratory institutional plan FY 2001--FY 2006.

    Energy Technology Data Exchange (ETDEWEB)

    Beggs, S.D.

    2000-12-07

    This Institutional Plan describes what Argonne management regards as the optimal future development of Laboratory activities. The document outlines the development of both research programs and support operations in the context of the nation's R&D priorities, the missions of the Department of Energy (DOE) and Argonne, and expected resource constraints. The Draft Institutional Plan is the product of many discussions between DOE and Argonne program managers, and it also reflects programmatic priorities developed during Argonne's summer strategic planning process. That process serves additionally to identify new areas of strategic value to DOE and Argonne, to which Laboratory Directed Research and Development funds may be applied. The Draft Plan is provided to the Department before Argonne's On-Site Review. Issuance of the final Institutional Plan in the fall, after further comment and discussion, marks the culmination of the Laboratory's annual planning cycle. Chapter II of this Institutional Plan describes Argonne's missions and roles within the DOE laboratory system, its underlying core competencies in science and technology, and six broad planning objectives whose achievement is considered critical to the future of the Laboratory. Chapter III presents the Laboratory's "Science and Technology Strategic Plan," which summarizes key features of the external environment, presents Argonne's vision, and describes how Argonne's strategic goals and objectives support DOE's four business lines. The balance of Chapter III comprises strategic plans for 23 areas of science and technology at Argonne, grouped according to the four DOE business lines. The Laboratory's 14 major initiatives, presented in Chapter IV, propose important advances in key areas of fundamental science and technology development. The "Operations and Infrastructure Strategic Plan" in Chapter V includes

  5. FERN – a Java framework for stochastic simulation and evaluation of reaction networks

    Directory of Open Access Journals (Sweden)

    Zimmer Ralf

    2008-08-01

    Full Text Available Abstract Background Stochastic simulation can be used to illustrate the development of biological systems over time and the stochastic nature of these processes. Currently available programs for stochastic simulation, however, are limited in that they either (a) do not provide the most efficient simulation algorithms and are difficult to extend, (b) cannot be easily integrated into other applications, or (c) do not allow the user to monitor and intervene during the simulation process in an easy and intuitive way. Thus, in order to use stochastic simulation in innovative high-level modeling and analysis approaches, more flexible tools are necessary. Results In this article, we present FERN (Framework for Evaluation of Reaction Networks), a Java framework for the efficient simulation of chemical reaction networks. FERN is subdivided into three layers for network representation, simulation, and visualization of the simulation results, each of which can be easily extended. It provides efficient and accurate state-of-the-art stochastic simulation algorithms for well-mixed chemical systems and a powerful observer system, which makes it possible to track and control the simulation progress on every level. To illustrate how FERN can be easily integrated into other systems biology applications, plugins to Cytoscape and CellDesigner are included. These plugins make it possible to run simulations and to observe the simulation progress in a reaction network in real time from within the Cytoscape or CellDesigner environment. Conclusion FERN addresses shortcomings of currently available stochastic simulation programs in several ways. First, it provides a broad range of efficient and accurate algorithms for both exact and approximate stochastic simulation, and a simple interface for extending to new algorithms. FERN's implementations are considerably faster than the C implementations of gillespie2 or the Java implementations of ISBJava. Second, it can be used in a straightforward
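The exact simulation algorithm at the heart of such frameworks is Gillespie's direct method. A minimal sketch follows (FERN itself is Java; this Python version and its toy decay network are illustrative only):

```python
import random

random.seed(42)

def gillespie(stoich, propensity_fns, x0, t_end):
    """Gillespie's direct method (SSA): sample the time to the next reaction
    from an exponential, then pick which reaction fires proportionally to
    its propensity."""
    t, x = 0.0, list(x0)
    trajectory = [(t, tuple(x))]
    while t < t_end:
        props = [f(x) for f in propensity_fns]
        a0 = sum(props)
        if a0 == 0.0:          # no reaction can fire; system is exhausted
            break
        t += random.expovariate(a0)
        r = random.uniform(0.0, a0)
        acc, j = 0.0, 0
        for j, a in enumerate(props):
            acc += a
            if r <= acc:
                break
        for k, v in enumerate(stoich[j]):   # apply stoichiometric update
            x[k] += v
        trajectory.append((t, tuple(x)))
    return trajectory

# Toy network: irreversible decay A -> 0 with rate constant c = 0.5
stoich = [(-1,)]
prop = [lambda x: 0.5 * x[0]]
traj = gillespie(stoich, prop, [100], t_end=50.0)
final_count = traj[-1][1][0]
```

FERN's layered design wraps exactly this kind of core loop behind a network-representation layer and an observer layer, so the same model can be run with exact or approximate (e.g., tau-leaping) algorithms interchangeably.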

  6. FNCS: A Framework for Power System and Communication Networks Co-Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Ciraci, Selim; Daily, Jeffrey A.; Fuller, Jason C.; Fisher, Andrew R.; Marinovici, Laurentiu D.; Agarwal, Khushbu

    2014-04-13

    This paper describes the Fenix framework, which uses a federated approach for integrating power grid and communication network simulators. Compared to existing approaches, Fenix allows co-simulation of both transmission- and distribution-level power grid simulators with the communication network simulator. To reduce the performance overhead of time synchronization, Fenix utilizes optimistic synchronization strategies that make speculative decisions about when the simulators are going to exchange messages. GridLAB-D (a distribution simulator), PowerFlow (a transmission simulator), and ns-3 (a telecommunication simulator) are integrated with the framework and are used to illustrate the enhanced performance provided by speculative multi-threading on a smart grid application. Our speculative multi-threading approach achieved on average a 20% improvement over the existing synchronization methods.
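The synchronization tradeoff the abstract describes can be illustrated with a toy: a conservative scheme pays a barrier every time step, while a speculative scheme synchronizes only at (predicted) message exchanges. Sync counts stand in for co-simulation overhead; the message pattern is made up, and real optimistic schemes additionally need rollback when a prediction is wrong.

```python
def run(total_steps, message_times, speculative):
    """Count synchronization points for one federate over a simulation run."""
    syncs = 0
    t = 0
    while t < total_steps:
        if speculative:
            # jump ahead to the next predicted message exchange
            next_msg = min((m for m in message_times if m > t),
                           default=total_steps)
            t = next_msg
            syncs += 1
        else:
            t += 1
            syncs += 1      # conservative: barrier every time step
    return syncs

msgs = {10, 25, 40}
conservative = run(50, msgs, speculative=False)
speculative = run(50, msgs, speculative=True)
```

With only three message exchanges in 50 steps, the speculative schedule needs 4 synchronizations against 50 for lockstep, which is the effect the optimistic strategy exploits when simulators rarely interact.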

  7. Development of a Computational Framework on Fluid-Solid Mixture Flow Simulations for the COMPASS Code

    Science.gov (United States)

    Zhang, Shuai; Morita, Koji; Shirakawa, Noriyuki; Yamamoto, Yuichi

    The COMPASS code is designed based on the moving particle semi-implicit (MPS) method to simulate various complex mesoscale phenomena relevant to core disruptive accidents of sodium-cooled fast reactors. In this study, a computational framework for fluid-solid mixture flow simulations was developed for the COMPASS code. The passively moving solid model was used to simulate hydrodynamic interactions between fluid and solids. Mechanical interactions between solids were modeled by the distinct element method (DEM). A multi-time-step algorithm was introduced to couple these two calculations. The proposed computational framework for fluid-solid mixture flow simulations was verified by comparing experimental and numerical results for a water dam break with multiple solid rods.
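The multi-time-step coupling idea, a stiff solid-contact update subcycled inside each coarser fluid step, can be shown in miniature. The two "solvers" below are scalar relaxation toys with made-up coefficients, not MPS or DEM physics.

```python
def coupled_step(v_fluid, v_solid, dt_fluid, n_sub):
    """One coarse fluid step with n_sub fine solid sub-steps inside it."""
    # coarse fluid update: weak drag toward the solid velocity
    v_fluid += 0.1 * (v_solid - v_fluid) * dt_fluid
    # fine solid subcycling with dt_fluid / n_sub (stiffer response)
    dt_s = dt_fluid / n_sub
    for _ in range(n_sub):
        v_solid += 0.5 * (v_fluid - v_solid) * dt_s
    return v_fluid, v_solid

vf, vs = 1.0, 0.0
for _ in range(100):
    vf, vs = coupled_step(vf, vs, dt_fluid=0.1, n_sub=10)
```

The fluid and solid velocities relax toward a common value, while the stiffer solid dynamics is integrated stably with its own smaller step, which is the point of the multi-time-step algorithm.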

  8. A framework-based approach to designing simulation-augmented surgical education and training programs.

    Science.gov (United States)

    Cristancho, Sayra M; Moussa, Fuad; Dubrowski, Adam

    2011-09-01

    The goal of simulation-based medical education and training is to help trainees acquire and refine the technical and cognitive skills necessary to perform clinical procedures. When designers incorporate simulation into programs, their efforts should be in line with training needs, rather than technology. Designers of simulation-augmented surgical training programs, however, face particular problems related to identifying a framework that guides the curricular design activity to fulfill the particular requirements of such training programs. These problems include the lack of (1) an objective identification of training needs, (2) a systematic design methodology to match training objectives with simulation resources, (3) structured assessments of performance, and (4) a research-centered view to evaluate and validate systematically the educational effectiveness of the program. In this report, we present a process called "Aim - FineTune - FollowThrough" to enable the connection of the identified problems to solutions, using frameworks from psychology, motor learning, education and experimental design.

  9. Dynamically adaptive Lattice Boltzmann simulation of shallow water flows with the Peano framework

    KAUST Repository

    Neumann, Philipp

    2015-09-01

    © 2014 Elsevier Inc. All rights reserved. We present a dynamically adaptive Lattice Boltzmann (LB) implementation for solving the shallow water equations (SWEs). Our implementation extends an existing LB component of the Peano framework. We revise the modular design with respect to the incorporation of new simulation aspects and LB models. The basic SWE-LB implementation is validated in different breaking-dam scenarios. We further provide a numerical study on the stability of the MRT collision operator used in our simulations.

  10. A Metascalable Computing Framework for Large Spatiotemporal-Scale Atomistic Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Nomura, K; Seymour, R; Wang, W; Kalia, R; Nakano, A; Vashishta, P; Shimojo, F; Yang, L H

    2009-02-17

    A metascalable (or 'design once, scale on new architectures') parallel computing framework has been developed for large spatiotemporal-scale atomistic simulations of materials based on spatiotemporal data-locality principles, which is expected to scale on emerging multipetaflops architectures. The framework consists of: (1) an embedded divide-and-conquer (EDC) algorithmic framework based on spatial locality to design linear-scaling algorithms for high-complexity problems; (2) a space-time-ensemble parallel (STEP) approach based on temporal locality to predict long-time dynamics, while introducing multiple parallelization axes; and (3) a tunable hierarchical cellular decomposition (HCD) parallelization framework to map these O(N) algorithms onto a multicore cluster based on a hybrid implementation combining message passing and critical-section-free multithreading. The EDC-STEP-HCD framework exposes maximal concurrency and data locality, thereby achieving: (1) inter-node parallel efficiency well over 0.95 for 218 billion-atom molecular-dynamics and 1.68 trillion electronic-degrees-of-freedom quantum-mechanical simulations on 212,992 IBM BlueGene/L processors (superscalability); (2) high intra-node, multithreading parallel efficiency (nanoscalability); and (3) nearly perfect time/ensemble parallel efficiency (eon-scalability). The spatiotemporal scale covered by MD simulation on a sustained petaflops computer per day (i.e., petaflops·day of computing) is estimated as NT = 2.14 (e.g., N = 2.14 million atoms for T = 1 microsecond).
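The spatial-locality building block behind such O(N) cellular decompositions is the linked-cell method: bin particles into cells of width ≥ the cutoff, then search only same and adjacent cells for interacting pairs. A 1D periodic toy follows (not the paper's EDC/HCD code; positions and cutoff are illustrative).

```python
def build_cells(positions, box, rcut):
    """Bin 1D particle positions into cells at least rcut wide."""
    ncell = max(1, int(box // rcut))
    size = box / ncell
    cells = {i: [] for i in range(ncell)}
    for idx, x in enumerate(positions):
        cells[min(int(x // size), ncell - 1)].append(idx)
    return cells, ncell

def neighbor_pairs(positions, box, rcut):
    """All pairs within rcut; only same/adjacent cells are searched,
    giving O(N) work for roughly uniform density."""
    cells, ncell = build_cells(positions, box, rcut)
    pairs = set()
    for c, members in cells.items():
        for dc in (0, 1):                  # self and right neighbor (periodic)
            other = cells[(c + dc) % ncell]
            for i in members:
                for j in other:
                    if i == j:
                        continue
                    dx = abs(positions[i] - positions[j])
                    dx = min(dx, box - dx)  # minimum-image convention
                    if dx <= rcut:
                        pairs.add((min(i, j), max(i, j)))
    return pairs

pos = [0.1, 0.4, 2.0, 9.9]
pairs = neighbor_pairs(pos, box=10.0, rcut=1.0)
```

Note that particles 0 and 3 pair up across the periodic boundary (separation 0.2), while particle 2 has no neighbor within the cutoff; the divide-and-conquer frameworks in the abstract apply the same locality idea hierarchically across cores and nodes.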

  11. Fire protection review revisit no. 2, Argonne National Laboratory, Argonne, Illinois

    Science.gov (United States)

    Dobson, P. H.; Earley, M. W.; Mattern, L. J.

    1985-05-01

    A fire protection survey was conducted at Argonne National Laboratory on April 1-5, 8-12, and April 29-May 2, 1985. The purpose was to review the facility fire protection program and to make recommendations or identify areas of concern according to criteria established by the Department of Energy. There has been a substantial improvement in fire protection at this laboratory since the 1977 audit. Numerous areas which were previously provided with detection systems only have since been provided with automatic sprinkler protection. The following basic fire protection features are not properly controlled: (1) resealing wall and floor penetrations between fire areas after installation of services; (2) cutting and welding; and (3) housekeeping. The present Fire Department manpower level appears adequate to control a routine fire. Its ability to adequately handle a high-challenge fire, one involving injuries to personnel, or fire spread beyond the initial fire area is doubtful.

  12. Task parallel sensitivity analysis and parameter estimation of groundwater simulations through the SALSSA framework

    Energy Technology Data Exchange (ETDEWEB)

    Schuchardt, Karen L.; Agarwal, Khushbu; Chase, Jared M.; Rockhold, Mark L.; Freedman, Vicky L.; Elsethagen, Todd O.; Scheibe, Timothy D.; Chin, George; Sivaramakrishnan, Chandrika

    2010-07-15

    The Support Architecture for Large-Scale Subsurface Analysis (SALSSA) provides an extensible framework, sophisticated graphical user interface, and underlying data management system that simplifies the process of running subsurface models, tracking provenance information, and analyzing the model results. Initially, SALSSA supported two styles of job control: user directed execution and monitoring of individual jobs, and load balancing of jobs across multiple machines taking advantage of many available workstations. Recent efforts in subsurface modelling have been directed at advancing simulators to take advantage of leadership class supercomputers. We describe two approaches, current progress, and plans toward enabling efficient application of the subsurface simulator codes via the SALSSA framework: automating sensitivity analysis problems through task parallelism, and task parallel parameter estimation using the PEST framework.
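Task-parallel sensitivity analysis of the kind SALSSA automates can be shown in miniature: independent model runs over a parameter sweep are farmed out to workers, then finite differences are taken between sweep points. The "model" below is a stand-in for a subsurface simulator, and the worker-pool size and parameter values are illustrative.

```python
from concurrent.futures import ThreadPoolExecutor

def model(conductivity):
    """Stand-in for one expensive subsurface simulation run."""
    return 100.0 / conductivity        # fake steady-state head

def sweep(values):
    # Each run is independent, so the sweep is embarrassingly task-parallel.
    with ThreadPoolExecutor(max_workers=4) as pool:
        return list(pool.map(model, values))

params = [1.0, 2.0, 4.0, 5.0]
heads = sweep(params)
# one-sided sensitivities between adjacent sweep points
sens = [(heads[i + 1] - heads[i]) / (params[i + 1] - params[i])
        for i in range(len(params) - 1)]
```

A real deployment would replace the thread pool with job submission to workstations or a supercomputer queue, which is exactly the scheduling layer the SALSSA framework provides, along with provenance tracking for each run.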

  13. A software framework for the portable parallelization of particle-mesh simulations

    DEFF Research Database (Denmark)

    Sbalzarini, I.F.; Walther, Jens Honore; Polasek, B.;

    2006-01-01

    Abstract: We present a software framework for the transparent and portable parallelization of simulations using particle-mesh methods. Particles are used to transport physical properties and a mesh is required in order to reinitialize the distorted particle locations, ensuring the convergence of ...

  14. A Simulation Framework for Evaluating Complete Reprogramming Solutions in Wireless Sensor Networks

    NARCIS (Netherlands)

    Horsman, Michel; Marin-Perianu, Mihai; Jansen, Pierre; Havinga, Paul

    2008-01-01

    We propose a simulation framework developed in Simulink for analyzing the performance of code dissemination in wireless sensor networks. The complete solution relies on a three-layer network stack where the LMAC, FixTree and RMD protocols operate in conjunction. For performance evaluation, we use in

  15. A High-Throughput, High-Accuracy System-Level Simulation Framework for System on Chips

    Directory of Open Access Journals (Sweden)

    Guanyi Sun

    2011-01-01

    Full Text Available Today's System-on-Chip (SoC) design is extremely challenging because it involves complicated design tradeoffs and heterogeneous design expertise. To explore the large solution space, system architects have to rely on system-level simulators to identify an optimized SoC architecture. In this paper, we propose a system-level simulation framework, the System Performance Simulation Implementation Mechanism, or SPSIM. Based on SystemC TLM2.0, the framework consists of an executable SoC model, a simulation tool chain, and a modeling methodology. Compared with the large body of existing research in this area, this work aims to deliver high simulation throughput while guaranteeing high accuracy on real industrial applications. Integrating the leading TLM techniques, our simulator attains a simulation speed within a factor of 35 of hardware execution on a set of real-world applications. SPSIM incorporates effective timing models, which can achieve high accuracy after hardware-based calibration. Experimental results on a set of mobile applications show that the difference between the simulated and measured timing performance is within 10%, which in the past could be attained only by cycle-accurate models.

  16. Tiger team assessment of the Argonne Illinois site

    Energy Technology Data Exchange (ETDEWEB)

    1990-10-19

    This report documents the results of the Department of Energy's (DOE) Tiger Team Assessment of the Argonne Illinois Site (AIS) (including the DOE Chicago Operations Office, DOE Argonne Area Office, Argonne National Laboratory-East, and New Brunswick Laboratory) and Site A and Plot M, Argonne, Illinois, conducted from September 17 through October 19, 1990. The Tiger Team Assessment was conducted by a team comprised of professionals from DOE, contractors, and consultants. The purpose of the assessment was to provide the Secretary of Energy with the status of Environment, Safety, and Health (ES&H) programs at AIS. Argonne National Laboratory-East (ANL-E) is the principal tenant at AIS. ANL-E is a multiprogram laboratory operated by the University of Chicago for DOE. The mission of ANL-E is to perform basic and applied research that supports the development of energy-related technologies. A significant number of ES&H findings and concerns identified in the report require prompt management attention. A significant change in culture is required before ANL-E can attain consistent and verifiable compliance with statutes, regulations, and DOE Orders. ES&H activities are informal, fragmented, and inconsistently implemented. Communication is seriously lacking, both vertically and horizontally. Management expectations are not known or communicated adequately, support is not consistent, and oversight is not effective.

  17. A Mobility and Traffic Generation Framework for Modeling and Simulating Ad Hoc Communication Networks

    Directory of Open Access Journals (Sweden)

    Chris Barrett

    2004-01-01

    Full Text Available We present a generic mobility and traffic generation framework that can be incorporated into a tool for modeling and simulating large-scale ad hoc networks. Three components of this framework, namely a mobility data generator (MDG, a graph structure generator (GSG and an occlusion modification tool (OMT allow a variety of mobility models to be incorporated into the tool. The MDG module generates positions of transceivers at specified time instants. The GSG module constructs the graph corresponding to the ad hoc network from the mobility data provided by MDG. The OMT module modifies the connectivity of the graph produced by GSG to allow for occlusion effects. With two other modules, namely an activity data generator (ADG which generates packet transmission activities for transceivers and a packet activity simulator (PAS which simulates the movement and interaction of packets among the transceivers, the framework allows the modeling and simulation of ad hoc communication networks. The design of the framework allows a user to incorporate various realistic parameters crucial in the simulation. We illustrate the utility of our framework through a comparative study of three mobility models. Two of these are synthetic models (random waypoint and exponentially correlated mobility proposed in the literature. The third model is based on an urban population mobility modeling tool (TRANSIMS developed at the Los Alamos National Laboratory. This tool is capable of providing comprehensive information about the demographics, mobility and interactions of members of a large urban population. A comparison of these models is carried out by computing a variety of parameters associated with the graph structures generated by the models. There has recently been interest in the structural properties of graphs that arise in real-world systems. We examine two aspects of this for the graphs created by the mobility models: change associated with power control (range of
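The random waypoint model, one of the synthetic models compared in the paper, is simple to generate: each node walks at constant speed toward a uniformly chosen destination, picking a fresh destination when it arrives. A minimal trace generator follows; the parameter values are illustrative, and this sketch omits the pause times and per-leg speed ranges that fuller variants of the model include.

```python
import math
import random

random.seed(1)

def random_waypoint(n_nodes, area, speed, duration, dt):
    """Return trace[t][node] = (x, y) positions in an area x area square."""
    pos = [(random.uniform(0, area), random.uniform(0, area)) for _ in range(n_nodes)]
    dest = [(random.uniform(0, area), random.uniform(0, area)) for _ in range(n_nodes)]
    trace = []
    for _ in range(int(duration / dt)):
        new_pos = []
        for k, ((x, y), (gx, gy)) in enumerate(zip(pos, dest)):
            vx, vy = gx - x, gy - y
            dist = math.hypot(vx, vy)
            step = speed * dt
            if dist <= step:
                # waypoint reached: snap to it and draw a new destination
                new_pos.append((gx, gy))
                dest[k] = (random.uniform(0, area), random.uniform(0, area))
            else:
                new_pos.append((x + vx / dist * step, y + vy / dist * step))
        pos = new_pos
        trace.append(list(pos))
    return trace

trace = random_waypoint(n_nodes=5, area=100.0, speed=2.0, duration=30.0, dt=1.0)
```

In the paper's pipeline these positions are what an MDG-style module emits; a GSG-style module would then connect any two nodes within radio range at each time instant to obtain the communication graph.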

  18. Numerical validation framework for micromechanical simulations based on synchrotron 3D imaging

    Science.gov (United States)

    Buljac, Ante; Shakoor, Modesar; Neggers, Jan; Bernacki, Marc; Bouchard, Pierre-Olivier; Helfen, Lukas; Morgeneyer, Thilo F.; Hild, François

    2017-03-01

    A combined computational-experimental framework is introduced herein to validate numerical simulations at the microscopic scale. It is exemplified for a flat specimen with central hole made of cast iron and imaged via in-situ synchrotron laminography at micrometer resolution during a tensile test. The region of interest in the reconstructed volume, which is close to the central hole, is analyzed by digital volume correlation (DVC) to measure kinematic fields. Finite element (FE) simulations, which account for the studied material microstructure, are driven by Dirichlet boundary conditions extracted from DVC measurements. Gray level residuals for DVC measurements and FE simulations are assessed for validation purposes.

  19. Numerical validation framework for micromechanical simulations based on synchrotron 3D imaging

    Science.gov (United States)

    Buljac, Ante; Shakoor, Modesar; Neggers, Jan; Bernacki, Marc; Bouchard, Pierre-Olivier; Helfen, Lukas; Morgeneyer, Thilo F.; Hild, François

    2016-11-01

    A combined computational-experimental framework is introduced herein to validate numerical simulations at the microscopic scale. It is exemplified for a flat specimen with central hole made of cast iron and imaged via in-situ synchrotron laminography at micrometer resolution during a tensile test. The region of interest in the reconstructed volume, which is close to the central hole, is analyzed by digital volume correlation (DVC) to measure kinematic fields. Finite element (FE) simulations, which account for the studied material microstructure, are driven by Dirichlet boundary conditions extracted from DVC measurements. Gray level residuals for DVC measurements and FE simulations are assessed for validation purposes.

  20. Integrated Simulation Environment for Unmanned Autonomous Systems—Towards a Conceptual Framework

    Directory of Open Access Journals (Sweden)

    M. G. Perhinschi

    2010-01-01

    Full Text Available The paper initiates a comprehensive conceptual framework for an integrated simulation environment for unmanned autonomous systems (UAS that is capable of supporting the design, analysis, testing, and evaluation from a “system of systems” perspective. The paper also investigates the current state of the art of modeling and performance assessment of UAS and their components and identifies directions for future developments. All the components of a comprehensive simulation environment focused on the testing and evaluation of UAS are identified and defined through detailed analysis of current and future required capabilities and performance. The generality and completeness of the simulation environment is ensured by including all operational domains, types of agents, external systems, missions, and interactions between components. The conceptual framework for the simulation environment is formulated with flexibility, modularity, generality, and portability as key objectives. The development of the conceptual framework for the UAS simulation reveals important aspects related to the mechanisms and interactions that determine specific UAS characteristics including complexity, adaptability, synergy, and high impact of artificial and human intelligence on system performance and effectiveness.

  1. Prototyping a coherent framework for full, fast and parameteric detector simulation for the FCC project

    CERN Document Server

    Hrdinka, Julia; Salzburger, Andreas; Hegner, Benedikt

    2015-01-01

    The outstanding success of the physics program of the Large Hadron Collider (LHC), including the discovery of the Higgs boson, shifted the focus of part of the high energy physics community onto the planning phase for future circular collider (FCC) projects. A proton-proton collider is in consideration, as well as an electron-positron ring and an electron-proton option as potential LHC successor projects. Common to all projects is the need for a coherent software framework in order to carry out simulation studies to establish the potential physics reach or to test different technology approaches. Detector simulation is a tool particularly needed for design studies of different detector concepts and for establishing the relevant performance parameters. In addition, it allows data to be generated as input for the development of reconstruction algorithms needed to cope with the expected future environments. We present a coherent framework that combines full, fast and parametric detector simulation e...

  2. Delphes, a framework for fast simulation of a generic collider experiment

    CERN Document Server

    Ovyn, S; Lemaître, V

    2009-01-01

    It is always delicate to know whether theoretical predictions are visible and measurable in a high energy collider experiment due to the complexity of the related detectors, data acquisition chain and software. We introduce here a new C++ based framework, DELPHES, for fast simulation of a general-purpose experiment. The simulation includes a tracking system, embedded into a magnetic field, calorimetry and a muon system, and possible very forward detectors arranged along the beamline. The framework is interfaced to standard file formats (e.g. Les Houches Event File) and outputs observable objects for analysis, like missing transverse energy and collections of electrons or jets. The simulation of detector response takes into account the detector resolution, and usual reconstruction algorithms, such as FASTJET. A simplified preselection can also be applied on processed data for trigger emulation. Detection of very forward scattered particles relies on the transport in beamlines with the HECTOR software. Finally,...
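
    The detector-resolution smearing that fast-simulation tools of this kind apply can be sketched as a toy Gaussian calorimeter response; the resolution parametrisation and the numerical values below are invented for illustration, not DELPHES defaults.

```python
import math
import random

def smear_energy(e_true, stochastic=0.10, constant=0.01, rng=random.Random(42)):
    """Toy calorimeter response: Gaussian smearing with fractional resolution
    sigma/E = sqrt(stochastic^2 / E + constant^2), a common fast-simulation
    parametrisation (coefficient values here are assumptions)."""
    sigma = e_true * math.sqrt(stochastic ** 2 / e_true + constant ** 2)
    return max(0.0, rng.gauss(e_true, sigma))  # clip unphysical negatives

# Smear 10000 electrons of true energy 100 (arbitrary units)
energies = [smear_energy(100.0) for _ in range(10000)]
mean_e = sum(energies) / len(energies)
```

    The smeared sample is unbiased on average while individual measurements fluctuate with the parametrised resolution, which is the essence of fast simulation as opposed to full shower simulation.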

  3. Modelling and simulation of acrylic bone cement injection and curing within the framework of vertebroplasty

    CERN Document Server

    Landgraf, Ralf; Kolmeder, Sebastian; Lion, Alexander; Lebsack, Helena; Kober, Cornelia

    2013-01-01

    The minimal invasive procedure of vertebroplasty is a surgical technique to treat compression fractures of vertebral bodies. During the treatment liquid bone cement gets injected into the affected vertebral body and therein cures to a solid. In order to investigate the treatment and the impact of injected bone cement on the vertebra, an integrated modelling and simulation framework has been developed. The framework includes (i) the generation of computer models based on microCT images of human cancellous bone, (ii) CFD simulations of bone cement injection into the trabecular structure of a vertebral body as well as (iii) non-linear FEM simulations of the bone cement curing. Thereby, microstructural models of trabecular bone structures are employed. Furthermore, a detailed description of the material behaviour of acrylic bone cements is provided. More precisely, a non-linear fluid flow model is chosen for the representation of the bone cement behaviour during injection and a non-linear viscoelastic material mo...

  4. Web-Enabled Framework for Real-Time Scheduler Simulator: A Teaching Tool

    Directory of Open Access Journals (Sweden)

    C. Yaashuwanth

    2010-01-01

    Full Text Available Problem statement: A Real-Time System (RTS) is one which controls an environment by receiving data, processing it, and returning the results quickly enough to affect the functioning of the environment at that time. The main objective of this research was to develop an architectural model for the simulation of real time tasks to implement in a distributed environment through the web, and to make comparisons between various scheduling algorithms. The proposed model can be used for preprogrammed scheduling policies for uniprocessor systems. This model provides a user-friendly Graphical User Interface (GUI). Approach: Though a lot of scheduling algorithms have been developed, just a few of them are available to be implemented in real-time applications. In order to use, test and evaluate a scheduling policy it must be integrated into an operating system, which is a complex task. Simulation is another alternative to evaluate a scheduling policy. Unfortunately, just a few real-time scheduling simulators have been developed to date and most of them require the use of a specific simulation language. Results: Task ID, deadline, priority, period, computation time and phase are the input task attributes to the scheduler simulator; a chronograph imitating the real-time execution of the input task set and computational statistics of the schedule are the output. Conclusion: The Web-enabled framework proposed in this study enables the developer to evaluate the schedulability of a real time application. Numerous benefits were cited in support of Web-based deployment. The proposed framework can be used as an invaluable teaching tool. Further, the GUI of the framework will allow for easy comparison of existing scheduling policies and also for simulating the behavior and verifying the suitability of custom-defined schedulers for real-time applications.
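
    A scheduler simulator of this kind can be reduced to a few lines: the sketch below replays a rate-monotonic policy over discrete time slots, using a subset of the task attributes the abstract lists (period and computation time). It is an illustrative toy, not the framework's code.

```python
def simulate_rm(tasks, horizon):
    """Minimal rate-monotonic scheduler: tasks are (period, wcet) pairs with
    implicit deadlines; shorter period means higher priority. Returns the
    per-slot schedule (task index or None for idle) and a deadline-miss flag."""
    remaining = [0] * len(tasks)   # unfinished work of each task's current job
    schedule, missed = [], False
    for t in range(horizon):
        for i, (period, wcet) in enumerate(tasks):
            if t % period == 0:        # new job released
                if remaining[i] > 0:   # previous job unfinished: deadline miss
                    missed = True
                remaining[i] = wcet
        ready = [i for i, r in enumerate(remaining) if r > 0]
        if ready:
            i = min(ready, key=lambda i: tasks[i][0])  # smallest period wins
            remaining[i] -= 1
            schedule.append(i)
        else:
            schedule.append(None)
    return schedule, missed

# Two periodic tasks: T1 = (period 4, wcet 1), T2 = (period 5, wcet 2)
schedule, missed = simulate_rm([(4, 1), (5, 2)], horizon=20)
```

    The returned per-slot trace plays the role of the chronograph the abstract describes; swapping the priority key (e.g. for earliest-deadline-first) is how such a simulator compares policies.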

  5. A Conceptual Framework for Representing Human Behavior Characteristics in a System of Systems Agent-Based Survivability Simulation

    Science.gov (United States)

    2010-11-22

    Distribution is unlimited.

  6. A hierarchical Bayesian framework for force field selection in molecular dynamics simulations.

    Science.gov (United States)

    Wu, S; Angelikopoulos, P; Papadimitriou, C; Moser, R; Koumoutsakos, P

    2016-02-13

    We present a hierarchical Bayesian framework for the selection of force fields in molecular dynamics (MD) simulations. The framework associates the variability of the optimal parameters of the MD potentials under different environmental conditions with the corresponding variability in experimental data. The high computational cost associated with the hierarchical Bayesian framework is reduced by orders of magnitude through a parallelized Transitional Markov Chain Monte Carlo method combined with the Laplace Asymptotic Approximation. The suitability of the hierarchical approach is demonstrated by performing MD simulations with prescribed parameters to obtain data for transport coefficients under different conditions, which are then used to infer and evaluate the parameters of the MD model. We demonstrate the selection of MD models based on experimental data and verify that the hierarchical model can accurately quantify the uncertainty across experiments; improve the estimation of the posterior probability density function of the parameters and, thus, predictions on future experiments; and identify the most plausible force field to describe the underlying structure of a given dataset. The framework and associated software are applicable to a wide range of nanoscale simulations associated with experimental data with a hierarchical structure.
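
    The core idea, ranking candidate models by their marginal likelihood (evidence) given experimental data, can be illustrated with a deliberately tiny sketch: a one-parameter model against a zero-parameter model, with the evidence integral approximated on a prior grid. All data values and model forms are invented for the example; the paper's actual machinery (TMCMC, Laplace approximation, hierarchical priors) is far more capable.

```python
import math

def log_lik(data, mean):
    """Gaussian log-likelihood of the observations for a predicted mean
    (unit observation noise, for simplicity)."""
    return sum(-0.5 * (x - mean) ** 2 - 0.5 * math.log(2 * math.pi) for x in data)

def log_evidence(data, predictions):
    """p(D|M) approximated as the likelihood averaged over a uniform prior
    grid of candidate parameter values (their predicted means)."""
    liks = [math.exp(log_lik(data, m)) for m in predictions]
    return math.log(sum(liks) / len(liks))

data = [1.9, 2.1, 2.0, 1.8, 2.2]        # stand-in "experimental" values
grid = [i * 0.1 for i in range(41)]      # uniform prior over [0, 4]
logZ_free = log_evidence(data, grid)     # one-parameter model: mean = theta
logZ_fixed = log_lik(data, 0.0)          # zero-parameter model: mean fixed at 0
# Posterior model probability assuming equal prior model weights
p_free = 1.0 / (1.0 + math.exp(logZ_fixed - logZ_free))
```

    The grid average automatically builds in an Occam penalty: a flexible model only wins when the data justify its extra parameter, which is the mechanism behind "identify the most plausible force field".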

  7. Framework for Simulation of Heterogeneous MpSoC for Design Space Exploration

    Directory of Open Access Journals (Sweden)

    Bisrat Tafesse

    2013-01-01

    Full Text Available Due to the ever-growing requirements in high performance data computation, multiprocessor systems have been proposed to solve the bottlenecks in uniprocessor systems. Developing efficient multiprocessor systems requires effective exploration of design choices like application scheduling, mapping, and architecture design. Also, fault tolerance in multiprocessors needs to be addressed. With the advent of nanometer-process technology for chip manufacturing, realization of multiprocessors on SoC (MpSoC is an active field of research. Developing efficient low power, fault-tolerant task scheduling, and mapping techniques for MpSoCs require optimized algorithms that consider the various scenarios inherent in multiprocessor environments. Therefore there exists a need to develop a simulation framework to explore and evaluate new algorithms on multiprocessor systems. This work proposes a modular framework for the exploration and evaluation of various design algorithms for MpSoC system. This work also proposes new multiprocessor task scheduling and mapping algorithms for MpSoCs. These algorithms are evaluated using the developed simulation framework. The paper also proposes a dynamic fault-tolerant (FT scheduling and mapping algorithm for robust application processing. The proposed algorithms consider optimizing the power as one of the design constraints. The framework for a heterogeneous multiprocessor simulation was developed using SystemC/C++ language. Various design variations were implemented and evaluated using standard task graphs. Performance evaluation metrics are evaluated and discussed for various design scenarios.
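
    A task scheduling/mapping heuristic of the kind such a framework evaluates can be sketched as a greedy list scheduler over a task graph with heterogeneous execution times. The algorithm and all inputs below are illustrative assumptions, not the paper's proposed algorithms.

```python
def list_schedule(tasks, deps, exec_time):
    """Greedy list scheduler: map each ready task (all predecessors finished)
    to the processor on which it finishes earliest. exec_time[task][proc]
    encodes heterogeneous per-processor run times."""
    n_procs = len(next(iter(exec_time.values())))
    proc_free = [0.0] * n_procs        # when each processor becomes idle
    finish = {}                        # task -> finish time
    scheduled = []
    pending = set(tasks)
    while pending:
        ready = [t for t in pending if deps.get(t, set()) <= finish.keys()]
        t = min(ready)                 # deterministic tie-break
        best = None
        for p in range(n_procs):
            dep_done = max((finish[d] for d in deps.get(t, set())), default=0.0)
            start = max(proc_free[p], dep_done)
            end = start + exec_time[t][p]
            if best is None or end < best[0]:
                best = (end, p, start)
        end, p, start = best
        proc_free[p], finish[t] = end, end
        scheduled.append((t, p, start, end))
        pending.remove(t)
    return scheduled, max(finish.values())

# Toy task graph: 'a' feeds 'b' and 'c'; two heterogeneous processors
scheduled, makespan = list_schedule(
    tasks=["a", "b", "c"],
    deps={"b": {"a"}, "c": {"a"}},
    exec_time={"a": [2.0, 3.0], "b": [4.0, 1.0], "c": [2.0, 2.0]},
)
```

    A simulation framework like the one described would drive many such candidate algorithms against standard task graphs and compare makespan, power and fault-tolerance metrics.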

  8. Implementation and performance of FDPS: A Framework Developing Parallel Particle Simulation Codes

    CERN Document Server

    Iwasawa, Masaki; Hosono, Natsuki; Nitadori, Keigo; Muranushi, Takayuki; Makino, Junichiro

    2016-01-01

    We have developed FDPS (Framework for Developing Particle Simulator), which enables researchers and programmers to develop high-performance parallel particle simulation codes easily. The basic idea of FDPS is to separate the program code for complex parallelization, including domain decomposition, redistribution of particles, and exchange of particle information for interaction calculation between nodes, from the actual interaction calculation and orbital integration. FDPS provides the former part and the users write the latter. Thus, a user can implement a high-performance, fully parallelized N-body code in only 120 lines. In this paper, we present the structure and implementation of FDPS, and describe its performance on three sample applications: disk galaxy simulation, cosmological simulation and giant impact simulation. All codes show very good parallel efficiency and scalability on the K computer and XC30. FDPS lets the researchers concentrate on the implementation of physics and mathematical schemes, without wa...
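
    FDPS's separation of concerns, the framework owning the update loop while the user supplies only the pair interaction, can be caricatured in a few lines. This sketch is serial and 1-D; FDPS itself handles domain decomposition and inter-node data exchange, and its real API differs.

```python
def advance(particles, force_law, dt):
    """Framework side: an update loop that knows nothing about the physics;
    the user supplies force_law(pi, pj) for each particle pair."""
    forces = [0.0] * len(particles)
    for i, pi in enumerate(particles):
        for j, pj in enumerate(particles):
            if i != j:
                forces[i] += force_law(pi, pj)
    for p, f in zip(particles, forces):
        p["v"] += f / p["m"] * dt
        p["x"] += p["v"] * dt

# User side: a 1-D gravity-like pair force, softened to avoid the singularity
def gravity(pi, pj, eps=0.1):
    dx = pj["x"] - pi["x"]
    return pi["m"] * pj["m"] * dx / (abs(dx) ** 3 + eps)

particles = [{"x": -1.0, "v": 0.0, "m": 1.0}, {"x": 1.0, "v": 0.0, "m": 1.0}]
for _ in range(10):
    advance(particles, gravity, dt=0.01)
```

    Swapping `gravity` for an SPH kernel or a molecular potential changes the physics without touching the loop, which is the design point the abstract makes.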

  9. Argonne's contribution to regional development : successful examples.

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Y. I.

    2000-11-14

    Argonne National Laboratory's mission is basic research and technology development to meet national goals in scientific leadership, energy technology, and environmental quality. In addition to its core missions as a national research and development center, Argonne has exerted a positive impact on its regional economic development, has carried out outstanding educational programs not only for college/graduate students but also for pre-college students and teachers, and has fostered partnerships with universities for research collaboration and with industry for shaping the new technological frontiers.

  10. Performance model of the Argonne Voyager multimedia server

    Energy Technology Data Exchange (ETDEWEB)

    Disz, T.; Olson, R.; Stevens, R. [Argonne National Lab., IL (United States). Mathematics and Computer Science Div.

    1997-07-01

    The Argonne Voyager Multimedia Server is being developed in the Futures Lab of the Mathematics and Computer Science Division at Argonne National Laboratory. As a network-based service for recording and playing multimedia streams, it is important that the Voyager system be capable of sustaining certain minimal levels of performance in order for it to be a viable system. In this article, the authors examine the performance characteristics of the server. As they examine the architecture of the system, they try to determine where bottlenecks lie, show actual vs potential performance, and recommend areas for improvement through custom architectures and system tuning.

  11. Flexible simulation framework to couple processes in complex 3D models for subsurface utilization assessment

    Science.gov (United States)

    Kempka, Thomas; Nakaten, Benjamin; De Lucia, Marco; Nakaten, Natalie; Otto, Christopher; Pohl, Maik; Tillner, Elena; Kühn, Michael

    2016-04-01

    Utilization of the geological subsurface for production and storage of hydrocarbons, chemical energy and heat as well as for waste disposal requires the quantification and mitigation of environmental impacts as well as the improvement of georesources utilization in terms of efficiency and sustainability. The development of tools for coupled process simulations is essential to tackle these challenges, since reliable assessments are only feasible by integrative numerical computations. Coupled processes at reservoir to regional scale determine the behaviour of reservoirs, faults and caprocks, generally demanding that complex 3D geological models be considered alongside available monitoring and experimental data in coupled numerical simulations. We have been developing a flexible numerical simulation framework that provides efficient workflows for integrating the required data and software packages to carry out coupled process simulations considering, e.g., multiphase fluid flow, geomechanics, geochemistry and heat. Simulation results are stored in structured data formats to allow for an integrated 3D visualization and result interpretation as well as data archiving and its provision to collaborators. The main benefits of using the flexible simulation framework are the integration of geological and grid data from any third-party software package as well as data export to generic 3D visualization tools and archiving formats. The coupling of the required process simulators in time and space is feasible, while different spatial dimensions can be integrated in the coupled simulations, e.g., 0D batch with 3D dynamic simulations. User interaction is established via high-level programming languages, while computational efficiency is achieved by using low-level programming languages. We present three case studies on the assessment of geological subsurface utilization based on different process coupling approaches and numerical simulations.
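
    The sequential coupling of process simulators that such a framework orchestrates can be illustrated with two stand-in "simulators" exchanging fields once per time step. Both models and all numbers are invented placeholders for the real third-party packages.

```python
def flow_step(pressure, injection, dt):
    """Stand-in flow simulator: pressure relaxes toward the injection pressure."""
    return pressure + dt * (injection - pressure)

def mech_step(pressure, stiffness=1e-2):
    """Stand-in geomechanics simulator: uplift proportional to overpressure."""
    return stiffness * pressure

# Sequential (operator-splitting) coupling: each simulator advances in turn
# and hands its field to the next, mirroring the data exchange the framework
# manages between independent software packages.
pressure, history = 0.0, []
for step in range(50):
    pressure = flow_step(pressure, injection=10.0, dt=0.1)
    uplift = mech_step(pressure)
    history.append((pressure, uplift))
```

    In the real framework the exchanged fields live in structured file formats so that each coupled package, and the downstream 3D visualization, reads a common representation.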

  12. A Multiscale Nested Modeling Framework to Simulate the Interaction of Surface Gravity Waves with Nonlinear Internal Gravity Waves

    Science.gov (United States)

    2015-09-30

    Our long-term goal is to develop a multiscale nested modeling framework that simulates, with the finest resolution, the interaction of surface gravity waves with nonlinear internal gravity waves. Nested frameworks such as the proposed HYCOM-LZSNFS-SUNTANS-LES model are crucial for understanding multiscale processes that would otherwise remain unresolved.

  13. Generation of annular, high-charge electron beams at the Argonne wakefield accelerator

    Science.gov (United States)

    Wisniewski, E. E.; Li, C.; Gai, W.; Power, J.

    2013-01-01

    We present and discuss the results from the experimental generation of high-charge annular (ring-shaped) electron beams at the Argonne Wakefield Accelerator (AWA). These beams were produced by using laser masks to project annular laser profiles of various inner and outer diameters onto the photocathode of an RF gun. The ring beam is accelerated to 15 MeV, then it is imaged by means of solenoid lenses. Transverse profiles are compared for different solenoid settings. Discussion includes a comparison with Parmela simulations, some applications of high-charge ring beams, and an outline of a planned extension of this study.

  14. Generic framework for mining cellular automata models on protein-folding simulations.

    Science.gov (United States)

    Diaz, N; Tischer, I

    2016-05-13

    Cellular automata model identification is an important way of building simplified simulation models. In this study, we describe a generic architectural framework to ease the development process of new metaheuristic-based algorithms for cellular automata model identification in protein-folding trajectories. Our framework was developed by a methodology based on design patterns that allow an improved experience for new algorithms development. The usefulness of the proposed framework is demonstrated by the implementation of four algorithms, able to obtain extremely precise cellular automata models of the protein-folding process with a protein contact map representation. Dynamic rules obtained by the proposed approach are discussed, and future use for the new tool is outlined.
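
    The cellular automata being mined here are update rules applied synchronously to a discretised state. As a generic illustration of such dynamics (not of the paper's protein contact-map rules), the sketch below evolves a 1-D binary automaton under Wolfram rule 110:

```python
def step(cells, rule):
    """One synchronous update of a 1-D binary cellular automaton with
    periodic boundaries; `rule` is the Wolfram rule number (0-255)."""
    n = len(cells)
    out = []
    for i in range(n):
        # Encode the (left, center, right) neighborhood as a 3-bit index
        neighborhood = (cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n]
        out.append((rule >> neighborhood) & 1)
    return out

# Evolve from a single seeded cell
cells = [0] * 20
cells[10] = 1
trajectory = [cells]
for _ in range(5):
    cells = step(cells, rule=110)
    trajectory.append(cells)
```

    Model identification runs this logic in reverse: given observed trajectories (here, protein-folding contact maps), a metaheuristic searches for the rule table that reproduces them.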

  15. Simulation-based Modeling Frameworks for Networked Multi-processor System-on-Chip

    DEFF Research Database (Denmark)

    Mahadevan, Shankar

    2006-01-01

    This thesis deals with modeling aspects of multi-processor system-on-chip (MpSoC) design affected by the on-chip interconnect, also called the Network-on-Chip (NoC), at various levels of abstraction. To begin with, we undertook a comprehensive survey of research and design practices of networked MpSoC. The survey presents the challenges of modeling and performance analysis of the hardware and the software components used in such devices. These challenges are further exacerbated in a mixed abstraction workspace, which is typical of a complex MpSoC design environment. We provide two simulation-based frameworks...... and the RIPE frameworks allows easy incorporation of IP cores from either framework into a new instance of the design. This could pave the way for seamless design evaluation from system-level to cycle-true abstraction in future component-based MpSoC design practice....

  16. The Argonne Leadership Computing Facility 2010 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Drugan, C. (LCF)

    2011-05-09

    Researchers found more ways than ever to conduct transformative science at the Argonne Leadership Computing Facility (ALCF) in 2010. Both familiar initiatives and innovative new programs at the ALCF are now serving a growing, global user community with a wide range of computing needs. The Department of Energy's (DOE) INCITE Program remained vital in providing scientists with major allocations of leadership-class computing resources at the ALCF. For calendar year 2011, 35 projects were awarded 732 million supercomputer processor-hours for computationally intensive, large-scale research projects with the potential to significantly advance key areas in science and engineering. Argonne also continued to provide Director's Discretionary allocations - 'start up' awards - for potential future INCITE projects. And DOE's new ASCR Leadership Computing (ALCC) Program allocated resources to 10 ALCF projects, with an emphasis on high-risk, high-payoff simulations directly related to the Department's energy mission, national emergencies, or for broadening the research community capable of using leadership computing resources. While delivering more science today, we've also been laying a solid foundation for high performance computing in the future. After a successful DOE Lehman review, a contract was signed to deliver Mira, the next-generation Blue Gene/Q system, to the ALCF in 2012. The ALCF is working with the 16 projects that were selected for the Early Science Program (ESP) to enable them to be productive as soon as Mira is operational. Preproduction access to Mira will enable ESP projects to adapt their codes to its architecture and collaborate with ALCF staff in shaking down the new system. We expect the 10-petaflops system to stoke economic growth and improve U.S. competitiveness in key areas such as advancing clean energy and addressing global climate change. Ultimately, we envision Mira as a stepping-stone to exascale-class computers

  17. A dynamic subgrid-scale modeling framework for large eddy simulation using approximate deconvolution

    CERN Document Server

    Maulik, Romit

    2016-01-01

    We put forth a dynamic modeling framework for sub-grid parametrization of large eddy simulation of turbulent flows based upon the use of the approximate deconvolution procedure to compute the Smagorinsky constant self-adaptively from the resolved flow quantities. Our numerical assessments for solving the Burgers turbulence problem show that the proposed approach could be used as a viable tool to address the turbulence closure problem due to its flexibility.
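
    Approximate deconvolution itself can be shown compactly: van Cittert iterations recover an estimate of the unfiltered field from its filtered counterpart. The sketch uses a three-point box filter on a 1-D periodic signal; it illustrates the deconvolution step only, not the paper's full dynamic Smagorinsky procedure.

```python
import math

def box_filter(u):
    """Three-point box filter standing in for the LES low-pass filter G."""
    n = len(u)
    return [(u[(i - 1) % n] + u[i] + u[(i + 1) % n]) / 3.0 for i in range(n)]

def approx_deconvolution(u_filtered, n_iter=5):
    """Van Cittert iterations: u* <- u* + (u_f - G u*), an approximate
    inverse of the filter applied to the resolved (filtered) field."""
    u_star = list(u_filtered)
    for _ in range(n_iter):
        g = box_filter(u_star)
        u_star = [us + (uf - gi) for us, uf, gi in zip(u_star, u_filtered, g)]
    return u_star

n = 64
u_true = [math.sin(2 * math.pi * i / n) for i in range(n)]
u_f = box_filter(u_true)               # the "resolved" field
u_ad = approx_deconvolution(u_f)       # deconvolved estimate
err_filtered = max(abs(a - b) for a, b in zip(u_f, u_true))
err_deconv = max(abs(a - b) for a, b in zip(u_ad, u_true))
```

    For this smooth mode the deconvolved field lies much closer to the original than the filtered one, which is what lets the dynamic procedure estimate sub-filter quantities from resolved data.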

  18. A Hierarchical Framework for Visualising and Simulating Supply Chains in Virtual Environments

    Institute of Scientific and Technical Information of China (English)

    Hai-Yan Zhang; Zheng-Xu Zhao

    2005-01-01

    This paper presents research into applying virtual environment (VE) technology to supply chain management (SCM). Our research work has employed virtual manufacturing environments to represent supply chain nodes to simulate processes and activities in supply chain management. This will enable those who are involved in these processes and activities to gain an intuitive understanding of them, so as to design robust supply chains and make correct decisions at the right time.A framework system and its hierarchical structure for visualising and simulating supply chains in virtual environments are reported and detailed in this paper.

  19. Analysis of GEANT4 Physics List Properties in the 12 GeV MOLLER Simulation Framework

    Science.gov (United States)

    Haufe, Christopher; Moller Collaboration

    2013-10-01

    To determine the validity of new physics beyond the scope of the electroweak theory, nuclear physicists across the globe have been collaborating on future endeavors that will provide the precision needed to confirm these speculations. One of these is the MOLLER experiment - a low-energy particle experiment that will utilize the 12 GeV upgrade of Jefferson Lab's CEBAF accelerator. The motivation of this experiment is to measure the parity-violating asymmetry of scattered polarized electrons off unpolarized electrons in a liquid hydrogen target. This measurement would allow for a more precise determination of the electron's weak charge and weak mixing angle. While still in its planning stages, the MOLLER experiment requires a detailed simulation framework in order to determine how the project should be run in the future. The simulation framework for MOLLER, called "remoll", is written in GEANT4 code. As a result, the simulation can utilize a number of GEANT4 physics lists that provide the simulation with particle interaction constraints based on different particle physics models. By comparing these lists with one another using the data-analysis application ROOT, the optimal physics list for the MOLLER simulation can be determined and implemented. This material is based upon work supported by the National Science Foundation under Grant No. 714001.

  20. Simulation-based optimization framework for reuse of agricultural drainage water in irrigation.

    Science.gov (United States)

    Allam, A; Tawfik, A; Yoshimura, C; Fleifle, A

    2016-05-01

    A simulation-based optimization framework for agricultural drainage water (ADW) reuse has been developed through the integration of a water quality model (QUAL2Kw) and a genetic algorithm. This framework was applied to the Gharbia drain in the Nile Delta, Egypt, in summer and winter 2012. First, the water quantity and quality of the drain was simulated using the QUAL2Kw model. Second, uncertainty analysis and sensitivity analysis based on Monte Carlo simulation were performed to assess QUAL2Kw's performance and to identify the most critical variables for determination of water quality, respectively. Finally, a genetic algorithm was applied to maximize the total reuse quantity from seven reuse locations with the condition not to violate the standards for using mixed water in irrigation. The water quality simulations showed that organic matter concentrations are critical management variables in the Gharbia drain. The uncertainty analysis showed the reliability of QUAL2Kw to simulate water quality and quantity along the drain. Furthermore, the sensitivity analysis showed that the 5-day biochemical oxygen demand, chemical oxygen demand, total dissolved solids, total nitrogen and total phosphorous are highly sensitive to point source flow and quality. Additionally, the optimization results revealed that the reuse quantities of ADW can reach 36.3% and 40.4% of the available ADW in the drain during summer and winter, respectively. These quantities meet 30.8% and 29.1% of the drainage basin requirements for fresh irrigation water in the respective seasons.
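
    The optimization step can be caricatured with a toy genetic algorithm: maximise total reuse over seven locations subject to a single aggregate quality constraint standing in for the QUAL2Kw-simulated standards. Every number, limit and operator choice below is an invented placeholder.

```python
import random

def fitness(reuse, capacity, quality_limit, load_per_unit):
    """Total reuse across locations, heavily penalised (rejected) when the
    aggregate pollutant load would violate the irrigation quality limit."""
    pollutant_load = sum(r * load_per_unit for r in reuse)
    if pollutant_load > quality_limit or any(r > capacity for r in reuse):
        return -1.0
    return sum(reuse)

def evolve(n_sites=7, capacity=10.0, quality_limit=40.0, load_per_unit=1.0,
           pop_size=40, generations=120, seed=1):
    rng = random.Random(seed)
    pop = [[rng.uniform(0, capacity) for _ in range(n_sites)] for _ in range(pop_size)]
    fit = lambda ind: fitness(ind, capacity, quality_limit, load_per_unit)
    for _ in range(generations):
        pop.sort(key=fit, reverse=True)
        survivors = pop[: pop_size // 2]       # elitist selection
        children = []
        for _ in range(pop_size - len(survivors)):
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, n_sites)    # one-point crossover
            child = a[:cut] + b[cut:]
            i = rng.randrange(n_sites)         # single-gene Gaussian mutation
            child[i] = min(capacity, max(0.0, child[i] + rng.gauss(0, 0.5)))
            children.append(child)
        pop = survivors + children
    best = max(pop, key=fit)
    return best, fit(best)

best, best_total = evolve()
```

    In the paper's framework, the role of `fitness` is played by a full QUAL2Kw water-quality simulation, which is why the combination is called simulation-based optimization.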

  1. Microworlds, Simulators, and Simulation: Framework for a Benchmark of Human Reliability Data Sources

    Energy Technology Data Exchange (ETDEWEB)

    Ronald Boring; Dana Kelly; Carol Smidts; Ali Mosleh; Brian Dyre

    2012-06-01

    In this paper, we propose a method to improve the data basis of human reliability analysis (HRA) by extending the data sources used to inform HRA methods. Currently, most HRA methods are based on limited empirical data, and efforts to enhance the empirical basis behind HRA methods have not yet yielded significant new data. Part of this shortage of quality data is attributable to the data sources used. Data have been derived from unrelated industries, from infrequent risk-significant events, or from costly control room simulator studies. We propose a benchmark of four data sources: a simplified microworld simulator using unskilled student operators, a full-scope control room simulator using skilled student operators, a full-scope control room simulator using licensed commercial operators, and a human performance modeling and simulation system using virtual operators. The goal of this research is to compare findings across the data sources to determine to what extent data may be used and generalized from cost-effective sources.

  2. Argonne to open new facility for advanced vehicle testing

    CERN Multimedia

    2002-01-01

    Argonne National Laboratory will open its Advanced Powertrain Research Facility on Friday, Nov. 15. The facility is North America's only public testing facility for engines, fuel cells, electric drives and energy storage. State-of-the-art performance and emissions measurement equipment is available to support model development and technology validation (1 page).

  3. Brookhaven Lab and Argonne Lab scientists invent a plasma valve

    CERN Multimedia

    2003-01-01

    Scientists from Brookhaven National Laboratory and Argonne National Laboratory have received U.S. patent number 6,528,948 for a device that shuts off airflow into a vacuum about one million times faster than mechanical valves or shutters that are currently in use (1 page).

  4. Argonne National Laboratory Publications July 1, 1968 - June 30, 1969.

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    1969-08-01

    This publication list is a bibliography of scientific and technical accounts originated at Argonne and published during the fiscal year 1969 (July 1, 1968 through June 30, 1969). It includes items published as journal articles, technical reports, books, etc., all of which have been made available to the public.

  5. Argonne Laboratory Computing Resource Center - FY2004 Report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R.

    2005-04-14

    In the spring of 2002, Argonne National Laboratory founded the Laboratory Computing Resource Center, and in April 2003 LCRC began full operations with Argonne's first teraflops computing cluster. The LCRC's driving mission is to enable and promote computational science and engineering across the Laboratory, primarily by operating computing facilities and supporting application use and development. This report describes the scientific activities, computing facilities, and usage in the first eighteen months of LCRC operation. In this short time LCRC has had broad impact on programs across the Laboratory. The LCRC computing facility, Jazz, is available to the entire Laboratory community. In addition, the LCRC staff provides training in high-performance computing and guidance on application usage, code porting, and algorithm development. All Argonne personnel and collaborators are encouraged to take advantage of this computing resource and to provide input into the vision and plans for computing and computational analysis at Argonne. Steering for LCRC comes from the Computational Science Advisory Committee, composed of computing experts from many Laboratory divisions. The CSAC Allocations Committee makes decisions on individual project allocations for Jazz.

  6. Three Argonne technologies win R&D 100 awards

    CERN Multimedia

    2003-01-01

    "Three technologies developed or co-developed at the U.S. Department of Energy's Argonne National Laboratory have been recognized with R&D 100 Awards, which highlight some of the best products and technologies from around the world" (1 page).

  7. Simulation-Based E-Learning Framework for Entrepreneurship Education and Training

    Directory of Open Access Journals (Sweden)

    Constanţa-Nicoleta Bodea

    2015-02-01

    Full Text Available The paper proposes an e-Learning framework in entrepreneurship. The framework has three main components: one for identifying business opportunities, one for developing business scenarios, and one for risk analysis. A common database ensures the integration of the components. The main components of this framework are already available; the main challenge for those interested in using them is to design an integrated flow of activities, adapted to their curricula and other educational settings. The originality of the approach is that the framework is domain independent and uses advanced IT technologies, such as recommendation algorithms, agent-based simulations and extended graphical support. Using this e-learning framework, students can learn how to choose relevant characteristics/aspects for a type of business and how important each of them is according to specific criteria; how to set realistic values for different characteristics/aspects of the business; how a business scenario can be changed to fit the business context better; and how to assess/evaluate business scenarios.

  8. Ximpol: a new X-ray polarimetry observation-simulation and analysis framework

    Science.gov (United States)

    Baldini, Luca; Muleri, Fabio; Soffitta, Paolo; Omodei, Nicola; Pesce-Rollins, Melissa; Sgro, Carmelo; Latronico, Luca; Spada, Francesca; Manfreda, Alberto; Di Lalla, Niccolo

    2016-07-01

    We present a new simulation framework, ximpol, based on the Python programming language and the Scipy stack, specifically developed for X-ray polarimetric applications. ximpol is designed to produce fast and yet realistic observation-simulations, given as basic inputs: (i) an arbitrary source model including morphological, temporal, spectral and polarimetric information, and (ii) the response functions of the detector under study, i.e., the effective area, the energy dispersion, the point-spread function and the modulation factor. The format of the response files is OGIP compliant, and the framework has the capability of producing output files that can be directly fed into the standard visualization and analysis tools used by the X-ray community, including XSPEC---which makes it a useful tool not only for simulating observations of astronomical sources, but also for developing and testing end-to-end analysis chains. In this contribution we shall give an overview of the basic architecture of the software. Although in principle the framework is not tied to any specific mission or instrument design, we shall present a few physically interesting case studies in the context of the XIPE mission phase study.
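    The core folding step the record describes, convolving a source model with the detector response functions, can be illustrated with a toy calculation. The power-law spectrum, effective-area model, energy grid and observation time below are invented for the sketch and are not ximpol's actual inputs or API.

```python
import numpy as np

# Toy spectral folding: expected counts = integral over energy of
# source flux x effective area x observation time. All numbers here
# are hypothetical placeholders, not real instrument response values.
energy = np.linspace(2.0, 8.0, 61)            # keV grid
flux = 1e-2 * energy ** -2.0                  # photons/cm^2/s/keV (toy power law)
aeff = 200.0 * np.exp(-(energy - 2.0) / 4.0)  # cm^2 (toy effective area)
t_obs = 1e4                                   # seconds

de = energy[1] - energy[0]                    # bin width
expected_counts = np.sum(flux * aeff * t_obs * de)
print(f"expected counts: {expected_counts:.0f}")
```

    A full observation-simulator would additionally smear the result with the energy dispersion and point-spread function and, for polarimetry, draw photo-electron azimuths modulated by the polarization degree.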

  9. An Object-Oriented Framework for Versatile Finite Element Based Simulations of Neurostimulation

    Directory of Open Access Journals (Sweden)

    Edward T. Dougherty

    2016-01-01

    Full Text Available Computational simulations of transcranial electrical stimulation (TES) are commonly utilized by the neurostimulation community, and while vastly different TES application areas can be investigated, the mathematical equations and physiological characteristics that govern this research are identical. The goal of this work was to develop a robust software framework for TES that efficiently supports the spectrum of computational simulations routinely utilized by the TES community and in addition easily extends to support alternative neurostimulation research objectives. Using well-established object-oriented software engineering techniques, we have designed a software framework based upon the physical and computational aspects of TES. The framework’s versatility is demonstrated with a set of diverse neurostimulation simulations that (i) reinforce the importance of using anisotropic tissue conductivities, (ii) demonstrate the enhanced precision of high-definition stimulation electrodes, and (iii) highlight the benefits of utilizing multigrid solution algorithms. Our approaches result in a framework that facilitates rapid prototyping of real-world, customized TES administrations and supports virtually any clinical, biomedical, or computational aspect of this treatment. Software reuse and maintainability are optimized, and in addition, the same code can be effortlessly augmented to provide support for alternative neurostimulation research endeavors.

  10. Simulation-optimization framework for multi-site multi-season hybrid stochastic streamflow modeling

    Science.gov (United States)

    Srivastav, Roshan; Srinivasan, K.; Sudheer, K. P.

    2016-11-01

    A simulation-optimization (S-O) framework is developed for the hybrid stochastic modeling of multi-site multi-season streamflows. The multi-objective optimization model formulated is the driver and the multi-site, multi-season hybrid matched block bootstrap model (MHMABB) is the simulation engine within this framework. The multi-site multi-season simulation model is the extension of the existing single-site multi-season simulation model. A robust and efficient evolutionary search based technique, namely, the non-dominated sorting based genetic algorithm (NSGA-II) is employed as the solution technique for the multi-objective optimization within the S-O framework. The objective functions employed are related to the preservation of the multi-site critical deficit run sum and the constraints introduced are concerned with the hybrid model parameter space, and the preservation of certain statistics (such as inter-annual dependence and/or skewness of aggregated annual flows). The efficacy of the proposed S-O framework is brought out through a case example from the Colorado River basin. The proposed multi-site multi-season model AMHMABB (whose parameters are obtained from the proposed S-O framework) preserves the temporal as well as the spatial statistics of the historical flows. Also, the other multi-site deficit run characteristics namely, the number of runs, the maximum run length, the mean run sum and the mean run length are well preserved by the AMHMABB model. Overall, the proposed AMHMABB model is able to show better streamflow modeling performance when compared with the simulation based SMHMABB model, plausibly due to the significant role played by: (i) the objective functions related to the preservation of multi-site critical deficit run sum; (ii) the huge hybrid model parameter space available for the evolutionary search and (iii) the constraint on the preservation of the inter-annual dependence. Split-sample validation results indicate that the AMHMABB model is
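    The resampling family underlying the MHMABB simulation engine can be illustrated with a plain moving-block bootstrap; the matched-block variant described in the record adds conditional (matched) block selection, which this hypothetical sketch omits, and the toy gamma-distributed "monthly flows" below are invented data.

```python
import numpy as np

# Moving-block bootstrap: resample whole contiguous blocks of the
# series so that within-block (e.g. within-year) dependence survives
# in the synthetic replicate. Block length and series are toy choices.
rng = np.random.default_rng(7)
flows = rng.gamma(shape=2.0, scale=50.0, size=120)  # 10 toy years of monthly flows
block, n = 12, flows.size

starts = rng.integers(0, n - block + 1, size=n // block)
replicate = np.concatenate([flows[s:s + block] for s in starts])
print(replicate.size)  # → 120
```

    In the S-O setting of the record, an outer optimizer (there, NSGA-II) would then tune the resampling parameters so that statistics such as deficit run sums are preserved.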

  11. A framework for WRF to WRF-IBM grid nesting to enable multiscale simulations

    Energy Technology Data Exchange (ETDEWEB)

    Wiersema, David John [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Univ. of California, Berkeley, CA (United States); Lundquist, Katherine A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Chow, Fotini Katapodes [Univ. of California, Berkeley, CA (United States)

    2016-09-29

    With advances in computational power, mesoscale models, such as the Weather Research and Forecasting (WRF) model, are often pushed to higher resolutions. As the model’s horizontal resolution is refined, the maximum resolved terrain slope will increase. Because WRF uses a terrain-following coordinate, this increase in resolved terrain slopes introduces additional grid skewness. At high resolutions and over complex terrain, this grid skewness can introduce large numerical errors that require methods, such as the immersed boundary method, to keep the model accurate and stable. Our implementation of the immersed boundary method in the WRF model, WRF-IBM, has proven effective at microscale simulations over complex terrain. WRF-IBM uses a non-conforming grid that extends beneath the model’s terrain. Boundary conditions at the immersed boundary, the terrain, are enforced by introducing a body force term to the governing equations at points directly beneath the immersed boundary. Nesting between a WRF parent grid and a WRF-IBM child grid requires a new framework for initialization and forcing of the child WRF-IBM grid. This framework will enable concurrent multi-scale simulations within the WRF model, improving the accuracy of high-resolution simulations and enabling simulations across a wide range of scales.

  12. A framework of passive millimeter-wave imaging simulation for typical ground scenes

    Science.gov (United States)

    Yan, Luxin; Ge, Rui; Zhong, Sheng

    2009-10-01

    Passive millimeter-wave (PMMW) imaging offers advantages over visible and IR imaging in having better all-weather performance. However, PMMW imaging sensors are still state-of-the-art devices, and it is often necessary to predict and evaluate the performance of a PMMW sensor under a variety of weather, terrain and sensor operational conditions; scene simulation is an efficient way to do so. This paper proposes a framework for PMMW simulation of ground scenes. Commercial scene modeling software, Multigen and Vega, is used to generate multi-viewpoint and multi-scale descriptions of natural ground scenes from visible images. The background and objects in the scene are classified based on perceptual color clusters and mapped to different materials. The radiometric temperature images of the scene are then calculated according to millimeter-wave phenomenology: atmospheric propagation and emission, including sky temperature, weather conditions, and physical temperature. Finally, the simulated output PMMW images are generated by applying sensor characteristics such as the aperture size, data sampling scheme and system noise. Tentative results show the simulation framework can provide reasonable PMMW scene images with high fidelity.
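    The radiometric-temperature step the record describes can be illustrated with the standard single-bounce approximation: each material's brightness temperature blends its physical temperature with the reflected sky temperature through its emissivity. The materials, emissivities and temperatures below are invented for the sketch.

```python
# Toy PMMW radiometry: T_bright = e * T_phys + (1 - e) * T_sky.
# Low-emissivity metal reflects the cold sky and so appears "cold",
# which is what gives PMMW images their characteristic contrast.
T_sky, T_phys = 60.0, 290.0  # kelvin (hypothetical values)
materials = {"grass": 0.9, "metal": 0.05, "concrete": 0.75}

T_bright = {m: e * T_phys + (1 - e) * T_sky for m, e in materials.items()}
print(round(T_bright["metal"], 1))  # → 71.5
```

    A full simulator would additionally fold in atmospheric attenuation along the path and the sensor's aperture (beam) smoothing and noise, as the record outlines.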

  13. A Framework for Interactive Work Design based on Digital Work Analysis and Simulation

    CERN Document Server

    Ma, Liang; Fu, Huanzhang; Guo, Yang; Chablat, Damien; Bennis, Fouad; 10.1002/hfm.20178

    2010-01-01

    Due to the flexibility and adaptability of humans, manual handling work is still very important in industry, especially for assembly and maintenance work. Well-designed work operations can improve work efficiency and quality, enhance safety, and lower cost. Most traditional methods for work system analysis need a physical mock-up and are time consuming. Digital mock-up (DMU) and digital human modeling (DHM) techniques have been developed to assist ergonomic design and evaluation for a specific worker population (e.g., the 95th percentile); however, the operation adaptability and adjustability for a specific individual are not sufficiently considered. In this study, a new framework based on motion tracking and digital human simulation techniques is proposed for motion-time analysis of manual operations. A motion tracking system is used to track a worker's operation while he/she is conducting a manual handling task. The motion data is transferred to a simulation computer for real-time digital human simulation. The data ...

  14. A modular modelling framework for hypotheses testing in the simulation of urbanisation

    CERN Document Server

    Cottineau, Clementine; Chapron, Paul; Coyrehourcq, Sebastien Rey; Pumain, Denise

    2015-01-01

    In this paper, we present a modelling experiment developed to study systems of cities and processes of urbanisation in large territories over long time spans. Building on geographical theories of urban evolution, we rely on agent-based models to 1/ formalise complementary and alternative hypotheses of urbanisation and 2/ explore their ability to simulate observed patterns in a virtual laboratory. The paper is therefore divided into two sections: an overview of the mechanisms implemented to represent competing hypotheses used to simulate urban evolution; and an evaluation of the resulting model structures in their ability to simulate - efficiently and parsimoniously - a system of cities (the Former Soviet Union) over several periods of time (before and after the crash of the USSR). We do so using a modular framework of model-building and evolutionary algorithms for the calibration of several model structures. This project aims at tackling equifinality in systems dynamics by confronting different mechanisms wi...

  15. Automated Object-Oriented Simulation Framework for Modelling of Superconducting Magnets at CERN

    CERN Document Server

    Maciejewski, Michał; Bartoszewicz, Andrzej

    The thesis aims at designing a flexible, extensible, user-friendly interface to model electrothermal transients occurring in superconducting magnets. Simulations are a fundamental tool for assessing the performance of a magnet and its protection system against the effects of a quench. The application is created using a scalable and modular architecture based on the object-oriented programming paradigm, which opens an easy way for future extensions. What is more, each model, composed of thousands of blocks, is automatically created in MATLAB/Simulink. Additionally, the user is able to automatically run sets of simulations with varying parameters. Due to its scalability and modularity the framework can be easily used to simulate a wide range of materials and magnet configurations.

  16. gadfly: A pandas-based Framework for Analyzing GADGET Simulation Data

    Science.gov (United States)

    Hummel, Jacob A.

    2016-11-01

    We present the first public release (v0.1) of the open-source gadget Dataframe Library: gadfly. The aim of this package is to leverage the capabilities of the broader python scientific computing ecosystem by providing tools for analyzing simulation data from the astrophysical simulation codes gadget and gizmo using pandas, a thoroughly documented, open-source library providing high-performance, easy-to-use data structures that is quickly becoming the standard for data analysis in python. Gadfly is a framework for analyzing particle-based simulation data stored in the HDF5 format using pandas DataFrames. The package enables efficient memory management, includes utilities for unit handling, coordinate transformations, and parallel batch processing, and provides highly optimized routines for visualizing smoothed-particle hydrodynamics data sets.
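    The pandas-DataFrame approach the gadfly record describes, keeping particle data as named columns and analyzing it with ordinary DataFrame operations, can be sketched without the package itself; the column names, particle counts and the center-of-mass reduction below are invented for illustration and are not gadfly's API.

```python
import numpy as np
import pandas as pd

# Toy "snapshot": particle positions and masses as DataFrame columns,
# as a pandas-based analysis framework would expose them after loading
# an HDF5 snapshot (data here is random, not a real simulation output).
rng = np.random.default_rng(42)
n = 1000
particles = pd.DataFrame({
    "x": rng.uniform(0, 1, n),
    "y": rng.uniform(0, 1, n),
    "z": rng.uniform(0, 1, n),
    "mass": rng.uniform(0.5, 1.5, n),
})

# Mass-weighted center of mass via row-wise broadcasting, one pandas
# expression instead of an explicit particle loop.
com = (particles[["x", "y", "z"]]
       .mul(particles["mass"], axis=0)
       .sum() / particles["mass"].sum())
print(com.round(3))
```

    The payoff of the DataFrame representation is that grouping, filtering and unit-carrying columns all come for free from pandas rather than bespoke analysis code.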

  17. gadfly: A pandas-based Framework for Analyzing GADGET Simulation Data

    CERN Document Server

    Hummel, Jacob

    2016-01-01

    We present the first public release (v0.1) of the open-source GADGET Dataframe Library: gadfly. The aim of this package is to leverage the capabilities of the broader python scientific computing ecosystem by providing tools for analyzing simulation data from the astrophysical simulation codes GADGET and GIZMO using pandas, a thoroughly documented, open-source library providing high-performance, easy-to-use data structures that is quickly becoming the standard for data analysis in python. Gadfly is a framework for analyzing particle-based simulation data stored in the HDF5 format using pandas DataFrames. The package enables efficient memory management, includes utilities for unit handling, coordinate transformations, and parallel batch processing, and provides highly optimized routines for visualizing smoothed-particle hydrodynamics (SPH) datasets.

  18. Lattice Boltzmann Simulations in the Slip and Transition Flow Regime with the Peano Framework

    KAUST Repository

    Neumann, Philipp

    2012-01-01

    We present simulation results of flows in the finite Knudsen range, which is in the slip and transition flow regime. Our implementations are based on the Lattice Boltzmann method and are accomplished within the Peano framework. We validate our code by solving two- and three-dimensional channel flow problems and compare our results with respective experiments from other research groups. We further apply our Lattice Boltzmann solver to the geometrical setup of a microreactor consisting of differently sized channels and a reactor chamber. Here, we apply static adaptive grids to further reduce computational costs. We further investigate the influence of using a simple BGK collision kernel in coarse grid regions which are further away from the slip boundaries. Our results are in good agreement with theory and non-adaptive simulations, demonstrating the validity and the capabilities of our adaptive simulation software for flow problems at finite Knudsen numbers.
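    The BGK collide-and-stream cycle that the record's solver builds on can be shown in its simplest form, a one-dimensional D1Q3 lattice for pure diffusion; the relaxation time, grid size and initial density bump are arbitrary choices for this sketch, and slip boundary treatment is omitted entirely.

```python
import numpy as np

# Minimal D1Q3 lattice Boltzmann sketch with a BGK collision kernel:
# relax distributions toward equilibrium, then stream along discrete
# velocities. Standard D1Q3 weights; everything else is a toy setup.
w = np.array([2/3, 1/6, 1/6])        # weights for velocities 0, +1, -1
c = np.array([0, 1, -1])             # discrete velocities
tau = 0.8                            # BGK relaxation time
nx, steps = 64, 50

rho = np.ones(nx)
rho[nx // 2] = 2.0                   # initial density bump
f = w[:, None] * rho[None, :]        # start at equilibrium

for _ in range(steps):
    rho = f.sum(axis=0)              # macroscopic density
    feq = w[:, None] * rho[None, :]  # diffusion equilibrium (no advection)
    f += (feq - f) / tau             # BGK collision
    for i in range(3):               # periodic streaming
        f[i] = np.roll(f[i], c[i])

print(round(f.sum(), 6))  # → 65.0 (total mass is conserved)
```

    Both steps conserve mass by construction, which is why the assertion on the total holds; slip and transition regimes enter through modified boundary conditions and relaxation parameters, not through this core cycle.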

  19. A higher-order numerical framework for stochastic simulation of chemical reaction systems.

    KAUST Repository

    Székely, Tamás

    2012-07-15

    BACKGROUND: In this paper, we present a framework for improving the accuracy of fixed-step methods for Monte Carlo simulation of discrete stochastic chemical kinetics. Stochasticity is ubiquitous in many areas of cell biology, for example in gene regulation, biochemical cascades and cell-cell interaction. However most discrete stochastic simulation techniques are slow. We apply Richardson extrapolation to the moments of three fixed-step methods, the Euler, midpoint and θ-trapezoidal τ-leap methods, to demonstrate the power of stochastic extrapolation. The extrapolation framework can increase the order of convergence of any fixed-step discrete stochastic solver and is very easy to implement; the only condition for its use is knowledge of the appropriate terms of the global error expansion of the solver in terms of its stepsize. In practical terms, a higher-order method with a larger stepsize can achieve the same level of accuracy as a lower-order method with a smaller one, potentially reducing the computational time of the system. RESULTS: By obtaining a global error expansion for a general weak first-order method, we prove that extrapolation can increase the weak order of convergence for the moments of the Euler and the midpoint τ-leap methods, from one to two. This is supported by numerical simulations of several chemical systems of biological importance using the Euler, midpoint and θ-trapezoidal τ-leap methods. In almost all cases, extrapolation results in an improvement of accuracy. As in the case of ordinary and stochastic differential equations, extrapolation can be repeated to obtain even higher-order approximations. CONCLUSIONS: Extrapolation is a general framework for increasing the order of accuracy of any fixed-step stochastic solver. This enables the simulation of complicated systems in less time, allowing for more realistic biochemical problems to be solved.
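    The stochastic Richardson-extrapolation idea in the record can be sketched on the simplest test case, a linear death process X → 0 with rate k, simulated with the Euler tau-leap method; the rate, initial population and sample count below are invented for the illustration, and the framework itself applies to general fixed-step solvers.

```python
import numpy as np

# Euler tau-leaping for dX = -k*X with Poisson-distributed jump counts,
# then Richardson extrapolation of the Monte Carlo mean: combining the
# step-h and step-h/2 estimates cancels the leading O(h) bias term.
rng = np.random.default_rng(0)
x0, k, T, n_paths = 1000, 1.0, 1.0, 20000

def tau_leap_mean(h):
    """Monte Carlo estimate of E[X(T)] under Euler tau-leaping with step h."""
    x = np.full(n_paths, float(x0))
    for _ in range(round(T / h)):
        x -= rng.poisson(np.clip(k * x * h, 0, None))
        x = np.clip(x, 0, None)      # population cannot go negative
    return x.mean()

m_h  = tau_leap_mean(0.1)            # coarse estimate, weak order 1
m_h2 = tau_leap_mean(0.05)           # refined estimate
m_x  = 2 * m_h2 - m_h                # Richardson extrapolation: weak order 2
exact = x0 * np.exp(-k * T)          # analytic mean for the linear process

print(f"coarse {m_h:.2f}, extrapolated {m_x:.2f}, exact {exact:.2f}")
```

    For this linear system the Euler tau-leap mean is exactly x0(1 - kh)^(T/h), so the O(h) bias of the coarse estimate is visible directly, and the extrapolated value lands much closer to x0·e^(-kT).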

  20. Doing It In The SWMF Way: From Separate Space Physics Simulation Programs To The Framework For Space Weather Simulation.

    Science.gov (United States)

    Volberg, O.; Toth, G.; Sokolov, I.; Ridley, A. J.; Gombosi, T. I.; de Zeeuw, D. C.; Hansen, K. C.; Chesney, D. R.; Stout, Q. F.; Powell, K. G.; Kane, K. J.; Oehmke, R. C.

    2003-12-01

    The NASA-funded Space Weather Modeling Framework (SWMF) is being developed to provide "plug and play" Sun-to-Earth simulation capabilities serving the space physics modeling community. In its fully developed form, the SWMF will comprise a series of interoperating models of physics domains, ranging from the surface of the Sun to the upper atmosphere of the Earth. In its current form the SWMF links together five models: Global Magnetosphere, Inner Heliosphere, Ionosphere Electrodynamics, Upper Atmosphere, and Inner Magnetosphere. The framework permits switching models of any type. The SWMF is a structured collection of software building blocks that can be used or customized to develop Sun-Earth system modeling components and to assemble them into applications. The SWMF consists of utilities and data structures for creating model components and coupling them. The SWMF contains a Control Model, which controls the initialization and execution of the components. It is responsible for component registration, the processor layout for each component, and coupling schedules. A component is created from the user-supplied physics code by adding a wrapper, which provides the control functions and a coupling interface to perform the data exchange with other components. Both the wrapper and the coupling interface are constructed from the building blocks provided by the framework itself. The current SWMF implementation is based on the latest component technology and uses many important concepts of Object-Oriented Programming emulated in Fortran 90. Currently it works on Linux Beowulf clusters, SGI Origin 2000 and Compaq ES45 machines.
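    The wrapper-plus-control-model pattern the record describes (SWMF itself implements it in Fortran 90) can be sketched in miniature; every class, method and component name below is invented for illustration and is not the SWMF API.

```python
# Hypothetical sketch of the component pattern: a wrapper gives each
# user-supplied physics code a uniform control interface, and a control
# model registers components and steps them on a shared schedule.
class ComponentWrapper:
    def __init__(self, name, step_fn):
        self.name, self.step_fn, self.time = name, step_fn, 0.0

    def advance(self, dt):
        self.step_fn(dt)              # delegate to the wrapped physics code
        self.time += dt

class ControlModel:
    def __init__(self):
        self.components = {}

    def register(self, component):
        self.components[component.name] = component

    def run(self, dt, n_steps):
        for _ in range(n_steps):
            for comp in self.components.values():
                comp.advance(dt)      # simple round-robin coupling schedule

ctrl = ControlModel()
log = []
ctrl.register(ComponentWrapper("GM", lambda dt: log.append(("GM", dt))))
ctrl.register(ComponentWrapper("IE", lambda dt: log.append(("IE", dt))))
ctrl.run(dt=1.0, n_steps=2)
print(len(log), ctrl.components["GM"].time)  # → 4 2.0
```

    The design point is that swapping one physics model for another only requires a new wrapper; the control model and the other components are untouched.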

  1. Evaluation of a performance appraisal framework for radiation therapists in planning and simulation

    Energy Technology Data Exchange (ETDEWEB)

    Becker, Jillian, E-mail: jillian.becker@health.qld.gov.au [Radiation Oncology Mater Centre, South Brisbane, Queensland (Australia); Bridge, Pete [School of Clinical Sciences, Queensland University of Technology, Brisbane, Queensland (Australia); Brown, Elizabeth; Lusk, Ryan; Ferrari-Anderson, Janet [Radiation Oncology, Princess Alexandra Hospital, Brisbane, Queensland (Australia); Radiation Oncology Mater Centre, South Brisbane, Queensland (Australia)

    2015-06-15

    Constantly evolving technology and techniques within radiation therapy require practitioners to maintain a continuous approach to professional development and training. Systems of performance appraisal and adoption of regular feedback mechanisms are vital to support this development yet frequently lack structure and rely on informal peer support. A Radiation Therapy Performance Appraisal Framework (RT-PAF) for radiation therapists in planning and simulation was developed to define expectations of practice and promote a supportive and objective culture of performance and skills appraisal. Evaluation of the framework was conducted via an anonymous online survey tool. Nine peer reviewers and fourteen recipients provided feedback on its effectiveness and the challenges and limitations of the approach. Findings from the evaluation were positive and suggested that both groups gained benefit from the approach and expressed a strong interest in embedding it more routinely. Respondents identified common challenges related to the limited ability to implement suggested development strategies; this was strongly associated with time and rostering issues. This framework successfully defined expectations for practice and provided a fair and objective feedback process that focussed on skills development. It empowered staff to maintain their skills and reach their professional potential. Management support, particularly in regard to provision of protected time, was highlighted as critical to the framework's ongoing success. The demonstrated benefits arising in terms of staff satisfaction and development highlight the importance of this commitment to the modern radiation therapy workforce.

  2. A unified framework for spiking and gap-junction interactions in distributed neuronal network simulations

    Directory of Open Access Journals (Sweden)

    Jan eHahne

    2015-09-01

    Full Text Available Contemporary simulators for networks of point and few-compartment model neurons come with a plethora of ready-to-use neuron and synapse models and support complex network topologies. Recent technological advancements have broadened the spectrum of application further to the efficient simulation of brain-scale networks on supercomputers. In distributed network simulations the amount of spike data that accrues per millisecond and process is typically low, such that a common optimization strategy is to communicate spikes at relatively long intervals, where the upper limit is given by the shortest synaptic transmission delay in the network. This approach is well-suited for simulations that employ only chemical synapses but it has so far impeded the incorporation of gap-junction models, which require instantaneous neuronal interactions. Here, we present a numerical algorithm based on a waveform-relaxation technique which allows for network simulations with gap junctions in a way that is compatible with the delayed communication strategy. Using a reference implementation in the NEST simulator, we demonstrate that the algorithm and the required data structures can be smoothly integrated with existing code such that they complement the infrastructure for spiking connections. To show that the unified framework for gap-junction and spiking interactions achieves high performance and delivers high accuracy...
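    The waveform-relaxation technique named in the record can be shown on a toy problem: two linearly coupled ODEs are solved repeatedly, each sweep using the other variable's waveform from the previous sweep, until the coupled solution converges. The coupling strength, step size and sweep count are invented for the sketch, and the neuronal dynamics are reduced to linear decay.

```python
import numpy as np

# Jacobi waveform relaxation for x' = -x + c*y, y' = -y + c*x,
# x(0)=1, y(0)=0: iterate over whole trajectories ("waveforms"),
# decoupling the system within each sweep (explicit Euler inside).
T, h, c = 1.0, 0.001, 0.5
t = np.arange(0, T + h, h)
x = np.ones_like(t)                  # initial waveform guesses
y = np.zeros_like(t)

for _ in range(20):                  # relaxation sweeps
    x_new, y_new = np.empty_like(t), np.empty_like(t)
    x_new[0], y_new[0] = 1.0, 0.0
    for i in range(len(t) - 1):
        x_new[i + 1] = x_new[i] + h * (-x_new[i] + c * y[i])
        y_new[i + 1] = y_new[i] + h * (-y_new[i] + c * x[i])
    x, y = x_new, y_new

# Analytic solution of the coupled system, for comparison:
exact = 0.5 * (np.exp(-(1 - c) * t) + np.exp(-(1 + c) * t))
print(f"max error vs analytic: {np.abs(x - exact).max():.4f}")
```

    This decoupling is what makes the approach attractive for distributed simulators: within a sweep, each process integrates its own neurons using only the communicated waveforms of the others, preserving the delayed-communication strategy.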

  3. Introducing FACETS, the Framework Application for Core-Edge Transport Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Cary, John R. [Tech-X Corporation; Candy, Jeff [General Atomics; Cohen, Ronald H. [Lawrence Livermore National Laboratory (LLNL); Krasheninnikov, Sergei I [ORNL; McCune, Douglas C [ORNL; Estep, Donald J [Colorado State University, Fort Collins; Larson, Jay W [ORNL; Malony, Allen [University of Oregon; Worley, Patrick H [ORNL; Carlsson, Johann Anders [ORNL; Hakim, A H [Tech-X Corporation; Hamill, P [Tech-X Corporation; Kruger, Scott E [ORNL; Muzsala, S [Tech-X Corporation; Pletzer, Alexander [ORNL; Shasharina, Svetlana [Tech-X Corporation; Wade-Stein, D [Tech-X Corporation; Wang, N [Tech-X Corporation; McInnes, Lois C [ORNL; Wildey, T [Tech-X Corporation; Casper, T. A. [Lawrence Livermore National Laboratory (LLNL); Diachin, Lori A [ORNL; Epperly, Thomas [Lawrence Livermore National Laboratory (LLNL); Rognlien, T. D. [Lawrence Livermore National Laboratory (LLNL); Fahey, Mark R [ORNL; Kuehn, Jeffery A [ORNL; Morris, A [University of Oregon; Shende, Sameer [University of Oregon; Feibush, E [Tech-X Corporation; Hammett, Gregory W [ORNL; Indireshkumar, K [Tech-X Corporation; Ludescher, C [Tech-X Corporation; Randerson, L [Tech-X Corporation; Stotler, D. [Princeton Plasma Physics Laboratory (PPPL); Pigarov, A [University of California, San Diego; Bonoli, P. [Massachusetts Institute of Technology (MIT); Chang, C S [New York University; D' Ippolito, D. A. [Lodestar Research Corporation; Colella, Philip [Lawrence Berkeley National Laboratory (LBNL); Keyes, David E [Columbia University; Bramley, R [Indiana University; Myra, J. R. [Lodestar Research Corporation

    2007-06-01

    The FACETS (Framework Application for Core-Edge Transport Simulations) project began in January 2007 with the goal of providing core to wall transport modeling of a tokamak fusion reactor. This involves coupling previously separate computations for the core, edge, and wall regions. Such a coupling is primarily through connection regions of lower dimensionality. The project has started developing a component-based coupling framework to bring together models for each of these regions. In the first year, the core model will be a 1½-dimensional model (1D transport across flux surfaces coupled to a 2D equilibrium) with fixed equilibrium. The initial edge model will be the fluid model, UEDGE, but inclusion of kinetic models is planned for the out years. The project also has an embedded Scientific Application Partnership that is examining embedding a full-scale turbulence model for obtaining the cross-surface fluxes into a core transport code.

  4. Introducing FACETS, the Framework Application for Core-Edge Transport Simulations

    Science.gov (United States)

    Cary, J. R.; Candy, J.; Cohen, R. H.; Krasheninnikov, S.; McCune, D. C.; Estep, D. J.; Larson, J.; Malony, A. D.; Worley, P. H.; Carlsson, J. A.; Hakim, A. H.; Hamill, P.; Kruger, S.; Muzsala, S.; Pletzer, A.; Shasharina, S.; Wade-Stein, D.; Wang, N.; McInnes, L.; Wildey, T.; Casper, T.; Diachin, L.; Epperly, T.; Rognlien, T. D.; Fahey, M. R.; Kuehn, J. A.; Morris, A.; Shende, S.; Feibush, E.; Hammett, G. W.; Indireshkumar, K.; Ludescher, C.; Randerson, L.; Stotler, D.; Pigarov, A. Yu; Bonoli, P.; Chang, C. S.; D'Ippolito, D. A.; Colella, P.; Keyes, D. E.; Bramley, R.; Myra, J. R.

    2007-07-01

    The FACETS (Framework Application for Core-Edge Transport Simulations) project began in January 2007 with the goal of providing core to wall transport modeling of a tokamak fusion reactor. This involves coupling previously separate computations for the core, edge, and wall regions. Such a coupling is primarily through connection regions of lower dimensionality. The project has started developing a component-based coupling framework to bring together models for each of these regions. In the first year, the core model will be a 1½-dimensional model (1D transport across flux surfaces coupled to a 2D equilibrium) with fixed equilibrium. The initial edge model will be the fluid model, UEDGE, but inclusion of kinetic models is planned for the out years. The project also has an embedded Scientific Application Partnership that is examining embedding a full-scale turbulence model for obtaining the cross-surface fluxes into a core transport code.

  5. The ADAQ framework: An integrated toolkit for data acquisition and analysis with real and simulated radiation detectors

    Science.gov (United States)

    Hartwig, Zachary S.

    2016-04-01

    The ADAQ framework is a collection of software tools that is designed to streamline the acquisition and analysis of radiation detector data produced in modern digital data acquisition (DAQ) systems and in Monte Carlo detector simulations. The purpose of the framework is to maximize user scientific productivity by minimizing the effort and expertise required to fully utilize radiation detectors in a variety of scientific and engineering disciplines. By using a single set of tools to span the real and simulation domains, the framework eliminates redundancy and provides an integrated workflow for high-fidelity comparison between experimental and simulated detector performance. Built on the ROOT data analysis framework, the core of the ADAQ framework is a set of C++ and Python libraries that enable high-level control of digital DAQ systems and detector simulations with data stored into standardized binary ROOT files for further analysis. Two graphical user interface programs utilize the libraries to create powerful tools: ADAQAcquisition handles control and readout of real-world DAQ systems and ADAQAnalysis provides data analysis and visualization methods for experimental and simulated data. At present, the ADAQ framework supports digital DAQ hardware from CAEN S.p.A. and detector simulations performed in Geant4; however, the modular design will facilitate future extension to other manufacturers and simulation platforms.

  6. Argonne Leadership Computing Facility 2011 annual report : Shaping future supercomputing.

    Energy Technology Data Exchange (ETDEWEB)

    Papka, M.; Messina, P.; Coffey, R.; Drugan, C. (LCF)

    2012-08-16

    The ALCF's Early Science Program aims to prepare key applications for the architecture and scale of Mira and to solidify libraries and infrastructure that will pave the way for other future production applications. Two billion core-hours have been allocated to 16 Early Science projects on Mira. The projects, in addition to promising delivery of exciting new science, are all based on state-of-the-art, petascale, parallel applications. The project teams, in collaboration with ALCF staff and IBM, have undertaken intensive efforts to adapt their software to take advantage of Mira's Blue Gene/Q architecture, which, in a number of ways, is a precursor to future high-performance-computing architecture. The Argonne Leadership Computing Facility (ALCF) enables transformative science that solves some of the most difficult challenges in biology, chemistry, energy, climate, materials, physics, and other scientific realms. Users partnering with ALCF staff have reached research milestones previously unattainable, due to the ALCF's world-class supercomputing resources and expertise in computational science. In 2011, the ALCF's commitment to providing outstanding science and leadership-class resources was honored with several prestigious awards. Research on multiscale brain blood flow simulations was named a Gordon Bell Prize finalist. Intrepid, the ALCF's BG/P system, ranked No. 1 on the Graph 500 list for the second consecutive year. The next-generation BG/Q prototype again topped the Green500 list. Skilled experts at the ALCF enable researchers to conduct breakthrough science on the Blue Gene system in key ways. The Catalyst Team matches project PIs with experienced computational scientists to maximize and accelerate research in their specific scientific domains. The Performance Engineering Team facilitates the effective use of applications on the Blue Gene system by assessing and improving the algorithms used by applications and the techniques used to

  7. GNU polyxmass: a software framework for mass spectrometric simulations of linear (bio-polymeric analytes

    Directory of Open Access Journals (Sweden)

    Rusconi Filippo

    2006-04-01

Background: Nowadays, a variety of (bio-)polymers can be analyzed by mass spectrometry. The detailed interpretation of the spectra requires a huge number of "hypothesis cycles", comprising the following three actions: (1) put forth a structural hypothesis, (2) test it, (3) (in)validate it. This time-consuming and painstaking data scrutiny is alleviated by using specialized software tools. However, all the software tools available to date are polymer chemistry-specific. This imposes a heavy overhead on researchers who do mass spectrometry on a variety of (bio-)polymers, as each polymer type will require a different software tool to perform data simulations and analyses. We developed software to address the lack of an integrated framework able to deal with different polymer chemistries. Results: The GNU polyxmass software framework performs common (bio-)chemical simulations, along with simultaneous mass spectrometric calculations, for any kind of linear (bio-)polymeric analyte (DNA, RNA, saccharides, or proteins). The framework is organized into three modules, all accessible from one single binary program. The modules let the user (1) define brand-new polymer chemistries, (2) perform quick mass calculations using a desktop calculator paradigm, and (3) graphically edit polymer sequences and perform (bio-)chemical/mass spectrometric simulations. Any aspect of the mass calculations, polymer chemistry reactions, or graphical polymer sequence editing is configurable. Conclusion: The scientist who uses mass spectrometry to characterize (bio-)polymeric analytes of different chemistries is provided with a single software framework for all data prediction/analysis needs, whatever the polymer chemistry involved.
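The record above describes mass calculations over user-defined polymer chemistries. As an illustration of the underlying idea, here is a minimal Python sketch of the "desktop calculator" paradigm for a peptide chemistry: the monomer table and chain cap are plain data, so swapping the table swaps the polymer chemistry. The residue masses are standard monoisotopic values; the function names are hypothetical, not polyxmass's API.

```python
# Minimal mass calculator for a linear polymer chemistry (here: peptides).
# Swapping MONOISOTOPIC and the cap mass would model DNA, RNA, saccharides, etc.

MONOISOTOPIC = {  # amino-acid *residue* masses in Da (small sample)
    "G": 57.02146, "A": 71.03711, "S": 87.03203,
    "P": 97.05276, "V": 99.06841, "L": 113.08406,
}
WATER = 18.010565    # one H2O per linear chain (N- and C-terminal caps)
PROTON = 1.007276    # charge carrier mass for [M+zH]z+ ions

def neutral_mass(sequence, monomers=MONOISOTOPIC, cap=WATER):
    """Neutral monoisotopic mass of a linear polymer sequence."""
    return sum(monomers[m] for m in sequence) + cap

def mz(sequence, z=1):
    """m/z of the protonated [M+zH]z+ ion."""
    return (neutral_mass(sequence) + z * PROTON) / z

print(f"GA neutral mass: {neutral_mass('GA'):.5f} Da")
print(f"GAS [M+2H]2+:    {mz('GAS', z=2):.4f}")
```

Because the chemistry is data, extending the calculator to a new polymer type only requires a new monomer dictionary and cap mass, which is essentially the configurability the abstract describes.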

  8. The Application of Modeling and Simulation in Capacity Management within the ITIL Framework

    Science.gov (United States)

    Rahmani, Sonya; vonderHoff, Otto

    2010-01-01

Tightly integrating modeling and simulation techniques into Information Technology Infrastructure Library (ITIL) practices can be one of the driving factors behind a successful and cost-effective capacity management effort for any Information Technology (IT) system. ITIL is a best-practices framework for managing IT infrastructure, development, and operations. Translating ITIL theory into operational reality can be a challenge. This paper aims to highlight how to best integrate modeling and simulation into an ITIL implementation. For cases where the project team initially has difficulty gaining consensus on investing in modeling and simulation resources, a clear definition of M&S implementation within the ITIL framework, specifically its role in supporting Capacity Management, is critical to gaining the support required to garner these resources. This implementation should also help to clearly define M&S support to the overall system mission. This paper will describe the development of an integrated modeling approach and how best to tie M&S to definitive goals for evaluating system capacity and performance requirements. Specifically, the paper will discuss best practices for implementing modeling and simulation into ITIL. These practices hinge on implementing integrated M&S methods that 1) encompass at least two or more predictive modeling techniques, 2) complement each other's respective strengths and weaknesses to support the validation of predicted results, and 3) are tied to the system's performance and workload monitoring efforts. The structuring of two forms of modeling, statistical and simulation, in the development of "As Is" and "To Be" efforts will be used to exemplify the integrated M&S methods. The paper will show how these methods can better support the project's overall capacity management efforts.

  9. Development of a Dynamically Configurable, Object-Oriented Framework for Distributed, Multi-modal Computational Aerospace Systems Simulation

    Science.gov (United States)

    Afjeh, Abdollah A.; Reed, John A.

    2003-01-01

The following reports are presented on this project: a first-year progress report on Development of a Dynamically Configurable, Object-Oriented Framework for Distributed, Multi-modal Computational Aerospace Systems Simulation; a second-year progress report on Development of a Dynamically Configurable, Object-Oriented Framework for Distributed, Multi-modal Computational Aerospace Systems Simulation; An Extensible, Interchangeable and Sharable Database Model for Improving Multidisciplinary Aircraft Design; Interactive, Secure Web-enabled Aircraft Engine Simulation Using XML Databinding Integration; and Improving the Aircraft Design Process Using Web-based Modeling and Simulation.

  10. Abstract Radio Resource Management Framework for System Level Simulations in LTE-A Systems

    DEFF Research Database (Denmark)

    Fotiadis, Panagiotis; Viering, Ingo; Zanier, Paolo;

    2014-01-01

This paper provides a simple mathematical model of different packet scheduling policies in Long Term Evolution-Advanced (LTE-A) systems, by investigating the performance of Proportional Fair (PF) and the generalized cross-Component Carrier scheduler from a theoretical perspective. For that purpose, an abstract Radio Resource Management (RRM) framework has been developed and tested for different ratios of users with Carrier Aggregation (CA) capabilities. The conducted system level simulations confirm that the proposed model can satisfactorily capture the main properties of the aforementioned scheduling...

  11. Framework for the construction of a Monte Carlo simulated brain PET-MR image database

    Science.gov (United States)

    Thomas, B. A.; Erlandsson, K.; Drobnjak, I.; Pedemonte, S.; Vunckx, K.; Bousse, A.; Reilhac-Laborde, A.; Ourselin, S.; Hutton, B. F.

    2014-01-01

    Simultaneous PET-MR acquisition reduces the possibility of registration mismatch between the two modalities. This facilitates the application of techniques, either during reconstruction or post-reconstruction, that aim to improve the PET resolution by utilising structural information provided by MR. However, in order to validate such methods for brain PET-MR studies it is desirable to evaluate the performance using data where the ground truth is known. In this work, we present a framework for the production of datasets where simulations of both the PET and MR, based on real data, are generated such that reconstruction and post-reconstruction approaches can be fairly compared.

  12. Managing simulation-based training: A framework for optimizing learning, cost, and time

    Science.gov (United States)

    Richmond, Noah Joseph

This study provides a management framework for optimizing training programs for learning, cost, and time when using simulation-based training (SBT) and reality-based training (RBT) as resources. Simulation is shown to be an effective means for implementing activity substitution as a way to reduce risk. The risk profiles of 22 US Air Force vehicles are calculated, and the potential risk reduction is calculated under the assumption of perfect substitutability of RBT and SBT. Methods are subsequently developed to relax the assumption of perfect substitutability. The transfer effectiveness ratio (TER) concept is defined and modeled as a function of the quality of the simulator used and the requirements of the activity trained. The Navy F/A-18 is then analyzed in a case study illustrating how learning can be maximized subject to constraints in cost and time, and also subject to the decision maker's preferences for the proportional and absolute use of simulation. Solution methods for optimizing multiple activities across shared resources are next provided. Finally, a simulation strategy including an operations planning program (OPP), an implementation program (IP), an acquisition program (AP), and a pedagogical research program (PRP) is detailed. The study provides the theoretical tools to understand how to leverage SBT, a case study demonstrating these tools' efficacy, and a set of policy recommendations to enable the US military to better utilize SBT in the future.

  13. The ADAQ framework: An integrated toolkit for data acquisition and analysis with real and simulated radiation detectors

    Energy Technology Data Exchange (ETDEWEB)

    Hartwig, Zachary S., E-mail: hartwig@mit.edu

    2016-04-11

The ADAQ framework is a collection of software tools that is designed to streamline the acquisition and analysis of radiation detector data produced in modern digital data acquisition (DAQ) systems and in Monte Carlo detector simulations. The purpose of the framework is to maximize user scientific productivity by minimizing the effort and expertise required to fully utilize radiation detectors in a variety of scientific and engineering disciplines. By using a single set of tools to span the real and simulation domains, the framework eliminates redundancy and provides an integrated workflow for high-fidelity comparison between experimental and simulated detector performance. Built on the ROOT data analysis framework, the core of the ADAQ framework is a set of C++ and Python libraries that enable high-level control of digital DAQ systems and detector simulations, with data stored into standardized binary ROOT files for further analysis. Two graphical user interface programs utilize the libraries to create powerful tools: ADAQAcquisition handles control and readout of real-world DAQ systems, and ADAQAnalysis provides data analysis and visualization methods for experimental and simulated data. At present, the ADAQ framework supports digital DAQ hardware from CAEN S.p.A. and detector simulations performed in Geant4; however, the modular design will facilitate future extension to other manufacturers and simulation platforms.
    Highlights: • A new software framework for radiation detector data acquisition and analysis. • Integrated acquisition and analysis of real-world and simulated detector data. • C++ and Python libraries for data acquisition hardware control and readout. • Graphical program for control and readout of digital data acquisition hardware. • Graphical program for comprehensive analysis of real-world and simulated data.

  14. DELPHES 3, A modular framework for fast simulation of a generic collider experiment

    CERN Document Server

    de Favereau, J; Demin, P; Giammanco, A; Lemaître, V; Mertens, A; Selvaggi, M

    2013-01-01

Version 3.0 of the DELPHES fast-simulation framework is presented. The tool is written in C++ and is interfaced with the most common Monte Carlo file formats. Its goal is the simulation of a multipurpose detector that includes a track propagation system embedded in a magnetic field, electromagnetic and hadronic calorimeters, and a muon identification system. The new modular design makes it easy to produce the collections that are needed for later analysis, from low-level objects such as tracks and calorimeter deposits up to high-level collections such as isolated electrons, jets, taus, and missing energy. New features such as pile-up and improved algorithms such as the particle-flow reconstruction approach have also been implemented.

  15. A proposed simulation optimization model framework for emergency department problems in public hospital

    Science.gov (United States)

    Ibrahim, Ireen Munira; Liong, Choong-Yeun; Bakar, Sakhinah Abu; Ahmad, Norazura; Najmuddin, Ahmad Farid

    2015-12-01

The Emergency Department (ED) is a very complex system with limited resources to support increasing demand. ED services are considered to be of good quality if they can meet patients' expectations. Long waiting times and lengths of stay are the main problems faced by ED management. Management should give greater emphasis to resource capacity in order to increase the quality of services and thereby patient satisfaction. This paper is a review of work in progress of a study being conducted in a government hospital in Selangor, Malaysia. It proposes a simulation optimization model framework for studying ED operations and problems, as well as for finding optimal solutions to those problems. The integration of simulation and optimization is expected to assist management in decision making regarding resource capacity planning, in order to improve current and future ED operations.
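As a complement to the simulation-optimization approach the record describes, the analytical M/M/c (Erlang C) queueing model is a common first-pass estimate of waiting times as a function of resource capacity. The Python sketch below is purely illustrative and is not taken from the paper; the staffing numbers are invented.

```python
from math import factorial

def erlang_c(c, a):
    """Probability that an arriving patient must wait in an M/M/c queue,
    with c servers and offered load a = lam/mu (stable only if a < c)."""
    rho = a / c
    top = a**c / factorial(c)
    denom = (1 - rho) * sum(a**k / factorial(k) for k in range(c)) + top
    return top / denom

def mean_wait(c, lam, mu):
    """Mean time in queue W_q for an M/M/c system (Erlang C formula)."""
    a = lam / mu
    return erlang_c(c, a) / (c * mu - lam)

# Invented scenario: 3 doctors, 10 patients/hour arriving,
# each consultation lasting 15 minutes on average (mu = 4 per hour).
print(f"P(wait) = {erlang_c(3, 10 / 4):.3f}")
print(f"W_q     = {mean_wait(3, 10, 4):.3f} hours")
```

Sweeping `c` in such a formula shows the steep nonlinearity of waiting time in capacity, which is one reason the paper's authors turn to simulation optimization for the full ED, where service is far from exponential.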

  16. Bounding box framework for efficient phase field simulation of grain growth in anisotropic systems

    CERN Document Server

    Vanherpe, L; Blanpain, B; Vandewalle, S

    2011-01-01

A sparse bounding box algorithm is extended to perform efficient phase field simulations of grain growth in anisotropic systems. The extended bounding box framework makes it possible to attribute different properties to different grain boundary types of a polycrystalline microstructure and can be combined with explicit, implicit, or semi-implicit time stepping strategies. To illustrate the applicability of the software, the simulation results of a case study are analysed. They indicate the impact of a misorientation-dependent boundary energy formulation on the evolution of the misorientation distribution of the grain boundary types and on the individual growth rates of the grains as a function of the number of grain faces.
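For readers unfamiliar with misorientation-dependent boundary energies, the classical Read-Shockley form is a common choice for such formulations. The short Python sketch below is illustrative only; it is not claimed to be the paper's implementation, and the cutoff angle and scale are conventional defaults.

```python
import math

def read_shockley(theta, theta_m=math.radians(15), gamma_m=1.0):
    """Read-Shockley grain-boundary energy vs. misorientation angle theta.
    Low-angle boundaries follow gamma_m * t * (1 - ln t) with t = theta/theta_m;
    at and above the cutoff theta_m the energy saturates at gamma_m."""
    if theta <= 0.0:
        return 0.0
    if theta >= theta_m:
        return gamma_m
    t = theta / theta_m
    return gamma_m * t * (1.0 - math.log(t))

for deg in (2, 5, 10, 15, 30):
    print(f"{deg:>2} deg -> gamma = {read_shockley(math.radians(deg)):.3f}")
```

Feeding such an angle-dependent energy into the phase field boundary terms is what makes low- and high-angle boundaries evolve at different rates, which is the anisotropy effect the case study analyses.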

  17. GNSSim: An Open Source GNSS/GPS Framework for Unmanned Aerial Vehicular Network Simulation

    Directory of Open Access Journals (Sweden)

    Farha Jahan

    2015-08-01

Unmanned systems are of great importance in accomplishing tasks where human lives are at risk. These systems are being deployed in tasks that are time-consuming, expensive or inconclusive if accomplished by human intervention. Design, development and testing of such vehicles using actual hardware could be quite costly and dangerous. Another issue is the limited outdoor usage permitted by Federal Aviation Administration regulations, which makes outdoor testing difficult. An optimal solution to this problem is to have a simulation environment where different operational scenarios, newly developed models, etc., can be studied. In this paper, we propose GNSSim, a Global Navigation Satellite System (GNSS) simulation framework. We demonstrate its effectiveness by integrating it with UAVSim. This allows users to experiment easily by adjusting different satellite as well as UAV parameters. Related tests and evidence of the correctness of the implementation are presented.

  18. TileCal Beam Test Simulation Application in the FADS/Goofy Framework (GEANT4)

    CERN Document Server

    Solodkov, A A

    2003-01-01

A new application for the Tile Calorimeter (TileCal) beam test simulation has been developed in GEANT4 within the FADS/Goofy framework. The geometry and readout systems for all the different TileCal modules have been implemented in considerable detail. This application allows simulation of all the TileCal beam test setup configurations existing so far. Details of the development, as well as instructions to install and run the program, are presented. The first tests have been performed for a beam test setup consisting of five prototype modules using negative pions at different energies, and results of a comparison to the experimental data from the TileCal TDR are presented as well.

  19. Using a New Event-Based Simulation Framework for Investigating Resource Provisioning in Clouds

    Directory of Open Access Journals (Sweden)

    Simon Ostermann

    2011-01-01

Today, Cloud computing proposes an attractive alternative to building large-scale distributed computing environments by which resources are no longer hosted by the scientists' computational facilities, but leased from specialised data centres only when and for how long they are needed. This new class of Cloud resources raises new interesting research questions in the fields of resource management, scheduling, fault tolerance, or quality of service, requiring hundreds to thousands of experiments for finding valid solutions. To enable such research, a scalable simulation framework is typically required for early prototyping, extensive testing and validation of results before the real deployment is performed. The scope of this paper is twofold. In the first part we present GroudSim, a Grid and Cloud simulation toolkit for scientific computing based on a scalable simulation-independent discrete-event engine. GroudSim provides a comprehensive set of features for complex simulation scenarios from simple job executions on leased computing resources to file transfers, calculation of costs and background load on resources. Simulations can be parameterised and are easily extendable by probability distribution packages for failures which normally occur in complex distributed environments. Experimental results demonstrate the improved scalability of GroudSim compared to a related process-based simulation approach. In the second part, we show the use of the GroudSim simulator to analyse the problem of dynamic provisioning of Cloud resources to scientific workflows that do not benefit from sufficient Grid resources as required by their computational demands. We propose and study four strategies for provisioning and releasing Cloud resources that take into account the general leasing model encountered in today's commercial Cloud environments based on resource bulks, fuzzy descriptions and hourly payment intervals.
We study the impact of our techniques to the
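The scalable discrete-event engine underlying a simulator like GroudSim rests on a simple idea: a time-ordered priority queue of pending events, processed one at a time. The minimal Python sketch below shows that core; the class name and the toy lease scenario are invented for illustration and do not reflect GroudSim's actual API.

```python
import heapq

class DiscreteEventEngine:
    """Minimal discrete-event core: a priority queue of
    (time, sequence, callback) tuples processed in time order."""

    def __init__(self):
        self.now = 0.0
        self._queue = []
        self._seq = 0  # unique tie-breaker so callbacks are never compared

    def schedule(self, delay, callback):
        """Register callback to fire `delay` time units from now."""
        heapq.heappush(self._queue, (self.now + delay, self._seq, callback))
        self._seq += 1

    def run(self):
        """Pop events in time order until the queue is empty."""
        while self._queue:
            self.now, _, callback = heapq.heappop(self._queue)
            callback(self)

# Toy scenario: lease a Cloud resource at t=1, job completes at t=3.
log = []
engine = DiscreteEventEngine()
engine.schedule(3.0, lambda e: log.append(("job_done", e.now)))
engine.schedule(1.0, lambda e: log.append(("lease", e.now)))
engine.run()
print(log)
```

Because events only advance a clock and push further events, millions of simulated jobs cost only queue operations, which is the scalability advantage of event-based over process-based simulation that the paper reports.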

  20. Parallel kinetic Monte Carlo simulation framework incorporating accurate models of adsorbate lateral interactions

    Science.gov (United States)

    Nielsen, Jens; d'Avezac, Mayeul; Hetherington, James; Stamatakis, Michail

    2013-12-01

    Ab initio kinetic Monte Carlo (KMC) simulations have been successfully applied for over two decades to elucidate the underlying physico-chemical phenomena on the surfaces of heterogeneous catalysts. These simulations necessitate detailed knowledge of the kinetics of elementary reactions constituting the reaction mechanism, and the energetics of the species participating in the chemistry. The information about the energetics is encoded in the formation energies of gas and surface-bound species, and the lateral interactions between adsorbates on the catalytic surface, which can be modeled at different levels of detail. The majority of previous works accounted for only pairwise-additive first nearest-neighbor interactions. More recently, cluster-expansion Hamiltonians incorporating long-range interactions and many-body terms have been used for detailed estimations of catalytic rate [C. Wu, D. J. Schmidt, C. Wolverton, and W. F. Schneider, J. Catal. 286, 88 (2012)]. In view of the increasing interest in accurate predictions of catalytic performance, there is a need for general-purpose KMC approaches incorporating detailed cluster expansion models for the adlayer energetics. We have addressed this need by building on the previously introduced graph-theoretical KMC framework, and we have developed Zacros, a FORTRAN2003 KMC package for simulating catalytic chemistries. To tackle the high computational cost in the presence of long-range interactions we introduce parallelization with OpenMP. We further benchmark our framework by simulating a KMC analogue of the NO oxidation system established by Schneider and co-workers [J. Catal. 286, 88 (2012)]. We show that taking into account only first nearest-neighbor interactions may lead to large errors in the prediction of the catalytic rate, whereas for accurate estimates thereof, one needs to include long-range terms in the cluster expansion.
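To make the role of adsorbate lateral interactions concrete, here is a toy rejection-free KMC loop in Python for lattice adsorption/desorption, where the desorption rate depends on pairwise-additive first-nearest-neighbor interactions. All rate constants and the interaction strength are invented, and this is far simpler than Zacros's graph-theoretical, cluster-expansion treatment; it only illustrates the event-selection and time-advance mechanics.

```python
import math, random

random.seed(1)

L = 10                       # periodic L x L lattice
occ = [[0] * L for _ in range(L)]
K_ADS, K_DES0 = 1.0, 0.5     # illustrative rate constants
EPS_NN = 0.3                 # pairwise 1NN interaction strength (kT units)

def neighbors(i, j):
    return [((i + 1) % L, j), ((i - 1) % L, j),
            (i, (j + 1) % L), (i, (j - 1) % L)]

def site_rate(i, j):
    """Adsorption on empty sites; desorption slowed by occupied neighbors
    via a pairwise-additive first-nearest-neighbor interaction energy."""
    if occ[i][j] == 0:
        return K_ADS
    n_occ = sum(occ[a][b] for a, b in neighbors(i, j))
    return K_DES0 * math.exp(-EPS_NN * n_occ)

def kmc_step():
    """One rejection-free (BKL-style) step: pick an event with probability
    proportional to its rate, then draw an exponential waiting time."""
    events = [(i, j, site_rate(i, j)) for i in range(L) for j in range(L)]
    total = sum(r for _, _, r in events)
    target, acc = random.random() * total, 0.0
    for i, j, r in events:
        acc += r
        if acc >= target:
            occ[i][j] ^= 1  # flip the chosen site: adsorb or desorb
            break
    return -math.log(1.0 - random.random()) / total

t = 0.0
for _ in range(2000):
    t += kmc_step()
coverage = sum(map(sum, occ)) / L**2
print(f"t = {t:.2f}, coverage = {coverage:.2f}")
```

Dropping `EPS_NN` to zero recovers the interaction-free model; comparing the two coverages is a miniature version of the paper's point that ignoring lateral interactions can badly skew predicted rates.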

  1. Structure simulation with calculated NMR parameters - integrating COSMOS into the CCPN framework.

    Science.gov (United States)

    Schneider, Olaf; Fogh, Rasmus H; Sternberg, Ulrich; Klenin, Konstantin; Kondov, Ivan

    2012-01-01

The Collaborative Computing Project for NMR (CCPN) has built a software framework consisting of the CCPN data model (with APIs) for NMR-related data, the CcpNmr Analysis program, and additional tools like CcpNmr FormatConverter. The open architecture allows for the integration of external software to extend the abilities of the CCPN framework with additional calculation methods. Recently, we have carried out the first steps toward integrating our software Computer Simulation of Molecular Structures (COSMOS) into the CCPN framework. The COSMOS-NMR force field unites quantum chemical routines for the calculation of molecular properties with a molecular mechanics force field yielding the relative molecular energies. COSMOS-NMR allows NMR parameters to be introduced as constraints into molecular mechanics calculations. The resulting infrastructure will be made available to the NMR community. As a first application, we have tested the evaluation of calculated protein structures using COSMOS-derived 13C Cα and Cβ chemical shifts. In this paper we give an overview of the methodology and a roadmap for future developments and applications.

  2. EEG-fMRI Bayesian framework for neural activity estimation: a simulation study

    Science.gov (United States)

    Croce, Pierpaolo; Basti, Alessio; Marzetti, Laura; Zappasodi, Filippo; Del Gratta, Cosimo

    2016-12-01

Objective. Due to the complementary nature of electroencephalography (EEG) and functional magnetic resonance imaging (fMRI), and given the possibility of simultaneous acquisition, joint data analysis can afford a better estimate of the underlying neural activity. In this simulation study we show the benefit of joint EEG-fMRI neural activity estimation in a Bayesian framework. Approach. We built a dynamic Bayesian framework in order to perform joint EEG-fMRI neural activity time course estimation. The neural activity originates in a given brain area and is detected by means of both measurement techniques. We have chosen a resting-state neural activity situation to address the worst case in terms of signal-to-noise ratio. To infer information from EEG and fMRI concurrently we used a tool belonging to the family of sequential Monte Carlo (SMC) methods: the particle filter (PF). Main results. First, despite a high computational cost, we showed the feasibility of such an approach. Second, we obtained an improvement in neural activity reconstruction when using both EEG and fMRI measurements. Significance. The proposed simulation shows the improvements in neural activity reconstruction attainable with simultaneous EEG-fMRI data. The application of such an approach to real data allows a better comprehension of the neural dynamics.
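As an illustration of the particle-filter machinery such a study relies on, the Python sketch below fuses two noisy observation channels (standing in for EEG and fMRI) of a toy AR(1) latent state. The model and every parameter are invented for illustration; the paper's actual forward models for EEG and the hemodynamic response are far richer.

```python
import math, random

random.seed(0)

# Toy state-space model: latent activity x follows an AR(1) process and is
# observed through two noisy channels (stand-ins for EEG and fMRI).
A, Q = 0.95, 0.1           # state transition coefficient, process noise var
R_EEG, R_FMRI = 0.5, 1.0   # assumed observation noise variances
N = 500                    # number of particles

def log_lik(y, x, r):
    return -0.5 * (y - x) ** 2 / r

def particle_filter(obs_eeg, obs_fmri):
    """Bootstrap SMC: propagate, weight by BOTH likelihoods, resample."""
    particles = [random.gauss(0, 1) for _ in range(N)]
    estimates = []
    for y_e, y_f in zip(obs_eeg, obs_fmri):
        # propagate particles through the state dynamics
        particles = [A * x + random.gauss(0, math.sqrt(Q)) for x in particles]
        # fusing the two modalities multiplies (adds log-) likelihoods
        logw = [log_lik(y_e, x, R_EEG) + log_lik(y_f, x, R_FMRI)
                for x in particles]
        m = max(logw)
        w = [math.exp(lw - m) for lw in logw]
        s = sum(w)
        w = [wi / s for wi in w]
        estimates.append(sum(wi * xi for wi, xi in zip(w, particles)))
        particles = random.choices(particles, weights=w, k=N)  # resample
    return estimates

# simulate a ground-truth trajectory and both observation streams
truth, x = [], 0.0
for _ in range(50):
    x = A * x + random.gauss(0, math.sqrt(Q))
    truth.append(x)
eeg = [v + random.gauss(0, math.sqrt(R_EEG)) for v in truth]
fmri = [v + random.gauss(0, math.sqrt(R_FMRI)) for v in truth]
est = particle_filter(eeg, fmri)
rmse = math.sqrt(sum((e - v) ** 2 for e, v in zip(est, truth)) / len(truth))
print(f"RMSE = {rmse:.3f}")
```

Running the same filter with only one of the two channels (drop one term from `logw`) typically raises the RMSE, which is the joint-estimation benefit the abstract reports, here in miniature.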

  3. A component-based FPGA design framework for neuronal ion channel dynamics simulations.

    Science.gov (United States)

    Mak, Terrence S T; Rachmuth, Guy; Lam, Kai-Pui; Poon, Chi-Sang

    2006-12-01

    Neuron-machine interfaces such as dynamic clamp and brain-implantable neuroprosthetic devices require real-time simulations of neuronal ion channel dynamics. Field-programmable gate array (FPGA) has emerged as a high-speed digital platform ideal for such application-specific computations. We propose an efficient and flexible component-based FPGA design framework for neuronal ion channel dynamics simulations, which overcomes certain limitations of the recently proposed memory-based approach. A parallel processing strategy is used to minimize computational delay, and a hardware-efficient factoring approach for calculating exponential and division functions in neuronal ion channel models is used to conserve resource consumption. Performances of the various FPGA design approaches are compared theoretically and experimentally in corresponding implementations of the alpha-amino-3-hydroxy-5-methyl-4-isoxazole propionic acid (AMPA) and N-methyl-D-aspartate (NMDA) synaptic ion channel models. Our results suggest that the component-based design framework provides a more memory economic solution, as well as more efficient logic utilization for large word lengths, whereas the memory-based approach may be suitable for time-critical applications where a higher throughput rate is desired.

  4. PHAISTOS: a framework for Markov chain Monte Carlo simulation and inference of protein structure.

    Science.gov (United States)

    Boomsma, Wouter; Frellsen, Jes; Harder, Tim; Bottaro, Sandro; Johansson, Kristoffer E; Tian, Pengfei; Stovgaard, Kasper; Andreetta, Christian; Olsson, Simon; Valentin, Jan B; Antonov, Lubomir D; Christensen, Anders S; Borg, Mikael; Jensen, Jan H; Lindorff-Larsen, Kresten; Ferkinghoff-Borg, Jesper; Hamelryck, Thomas

    2013-07-15

We present a new software framework for Markov chain Monte Carlo sampling for simulation, prediction, and inference of protein structure. The software package contains implementations of recent advances in Monte Carlo methodology, such as efficient local updates and sampling from probabilistic models of local protein structure. These models form a probabilistic alternative to the widely used fragment and rotamer libraries. Combined with an easily extendible software architecture, this makes PHAISTOS well suited for Bayesian inference of protein structure from sequence and/or experimental data. Currently, two force fields are available within the framework: PROFASI and OPLS-AA/L, the latter including the generalized Born surface area solvent model. A flexible command-line and configuration-file interface allows users to quickly set up simulations with the desired configuration. PHAISTOS is released under the GNU General Public License v3.0. Source code and documentation are freely available from http://phaistos.sourceforge.net. The software is implemented in C++ and has been tested on Linux and OSX platforms.
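The core Metropolis loop with local updates, the building block a framework like PHAISTOS elaborates, can be sketched in a few lines of Python. The "energy" below is a toy alignment potential over a chain of angles, not PROFASI or OPLS-AA/L, and all parameters are invented for illustration.

```python
import math, random

random.seed(42)

N_ANGLES, BETA = 10, 1.0  # chain length and inverse temperature (invented)

def energy(angles):
    """Toy 'force field': favors neighboring angles being aligned."""
    return sum(1.0 - math.cos(a - b) for a, b in zip(angles, angles[1:]))

def metropolis(steps=20000, step_size=0.5):
    """Metropolis sampling with *local* updates: perturb one angle per step,
    accept with probability min(1, exp(-beta * dE))."""
    angles = [random.uniform(-math.pi, math.pi) for _ in range(N_ANGLES)]
    e, accepted = energy(angles), 0
    for _ in range(steps):
        i = random.randrange(N_ANGLES)          # local move: one angle only
        old = angles[i]
        angles[i] += random.uniform(-step_size, step_size)
        e_new = energy(angles)
        if e_new <= e or random.random() < math.exp(-BETA * (e_new - e)):
            e, accepted = e_new, accepted + 1   # accept the move
        else:
            angles[i] = old                     # reject: restore the angle
    return e, accepted / steps

final_e, acc_rate = metropolis()
print(f"final energy = {final_e:.3f}, acceptance rate = {acc_rate:.2f}")
```

A local move only touches a few energy terms, so production samplers recompute just those terms instead of the full sum; that incremental evaluation is one of the efficiency advances the abstract alludes to.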

  5. Liquid chromatographic separation in metal-organic framework MIL-101: a molecular simulation study.

    Science.gov (United States)

    Hu, Zhongqiao; Chen, Yifei; Jiang, Jianwen

    2013-02-05

    A molecular simulation study is reported to investigate liquid chromatographic separation in metal-organic framework MIL-101. Two mixtures are considered: three amino acids (Arg, Phe, and Trp) in aqueous solution and three xylene isomers (p-, m-, and o-xylene) dissolved in hexane. For the first mixture, the elution order is found to be Arg > Phe > Trp. The hydrophilic Arg has the strongest interaction with the polar mobile phase (water) and the weakest interaction with the stationary phase (MIL-101), and thus transports at the fastest velocity. Furthermore, Arg forms the largest number of hydrogen bonds with water and possesses the largest hydrophilic solvent-accessible surface area. For the second mixture, the elution order is p-xylene > m-xylene > o-xylene, consistent with available experimental observation. With the largest polarity as compared to p- and m-xylenes, o-xylene interacts the most strongly with the stationary phase and exhibits the slowest transport velocity. For both mixtures, the underlying separation mechanism is elucidated from detailed energetic and structural analysis. It is revealed that the separation can be attributed to the cooperative solute-solvent and solute-framework interactions. This simulation study, for the first time, provides molecular insight into liquid chromatographic separation in a MOF and suggests that MIL-101 might be an interesting material for the separation of industrially important liquid mixtures.

  6. Global Simulation of Bioenergy Crop Productivity: Analytical Framework and Case Study for Switchgrass

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Shujiang [ORNL; Kline, Keith L [ORNL; Nair, S. Surendran [University of Tennessee, Knoxville (UTK); Nichols, Dr Jeff A [ORNL; Post, Wilfred M [ORNL; Brandt, Craig C [ORNL; Wullschleger, Stan D [ORNL; Wei, Yaxing [ORNL; Singh, Nagendra [ORNL

    2013-01-01

A global energy crop productivity model that provides geospatially explicit quantitative details on biomass potential and factors affecting sustainability would be useful, but none exists at present. This study describes a modeling platform capable of meeting many challenges associated with global-scale agro-ecosystem modeling. We designed an analytical framework for bioenergy crops consisting of six major components: (i) standardized natural resources datasets, (ii) global field-trial data and crop management practices, (iii) simulation units and management scenarios, (iv) model calibration and validation, (v) high-performance computing (HPC) simulation, and (vi) simulation output processing and analysis. The HPC-Environmental Policy Integrated Climate (HPC-EPIC) model simulated a perennial bioenergy crop, switchgrass (Panicum virgatum L.), estimating feedstock production potentials and effects across the globe. This modeling platform can assess soil C sequestration, net greenhouse gas (GHG) emissions, nonpoint source pollution (e.g., nutrient and pesticide loss), and energy exchange with the atmosphere. It can be expanded to include additional bioenergy crops (e.g., miscanthus, energy cane, and agave) and food crops under different management scenarios. The platform and switchgrass field-trial dataset are available to support global analysis of biomass feedstock production potential and corresponding metrics of sustainability.

  7. Global Simulation of Bioenergy Crop Productivity: Analytical framework and Case Study for Switchgrass

    Energy Technology Data Exchange (ETDEWEB)

    Nair, S. Surendran [University of Tennessee, Knoxville (UTK); Nichols, Jeff A. [ORNL; Post, Wilfred M [ORNL; Wang, Dali [ORNL; Wullschleger, Stan D [ORNL; Kline, Keith L [ORNL; Wei, Yaxing [ORNL; Singh, Nagendra [ORNL; Kang, Shujiang [ORNL

    2014-01-01

Contemporary global assessments of the deployment potential and sustainability aspects of biofuel crops lack quantitative details. This paper describes an analytical framework capable of meeting the challenges associated with global-scale agro-ecosystem modeling. We designed a modeling platform for bioenergy crops, consisting of five major components: (i) standardized global natural resources and management data sets, (ii) global simulation units and management scenarios, (iii) model calibration and validation, (iv) high-performance computing (HPC) modeling, and (v) simulation output processing and analysis. A case study with the HPC-Environmental Policy Integrated Climate model (HPC-EPIC) to simulate a perennial bioenergy crop, switchgrass (Panicum virgatum L.), and a global biomass feedstock analysis on grassland demonstrates the application of this platform. The results illustrate the biomass feedstock variability of switchgrass and provide insights on how the modeling platform can be expanded to better assess sustainable production criteria and other biomass crops. Feedstock potentials on global grasslands and within different countries are also shown. Future efforts involve developing databases of productivity, implementing global simulations for other bioenergy crops (e.g., miscanthus, energy cane, and agave), and assessing environmental impacts under various management regimes. We anticipate that this platform will provide an exemplary tool and assessment data for international communities to conduct global analyses of biofuel biomass feedstocks and sustainability.

  8. A parallel framework for the FE-based simulation of knee joint motion.

    Science.gov (United States)

    Wawro, Martin; Fathi-Torbaghan, Madjid

    2004-08-01

We present an object-oriented framework for the finite-element (FE)-based simulation of human knee joint motion. The FE model of the knee joint is acquired from patients in vivo using magnetic resonance imaging. The MRI images are converted into a three-dimensional model, and finally an all-hexahedral mesh for the FE analysis is generated. The simulation environment uses nonlinear finite-element analysis (FEA) and is capable of handling contact within the model in order to capture the complex rolling/sliding motion of the knee joint. The software strictly follows object-oriented concepts of software engineering in order to guarantee maximum extensibility and maintainability. The final goal of this work in progress is the creation of a computer-based biomechanical model of the knee joint which can be used in a variety of applications, ranging from prosthesis design and treatment planning (e.g., optimal reconstruction of ruptured ligaments) through surgical simulation to impact computations in crashworthiness simulations.

  9. URDME: a modular framework for stochastic simulation of reaction-transport processes in complex geometries

    Directory of Open Access Journals (Sweden)

    Drawert Brian

    2012-06-01

    Full Text Available Abstract Background Experiments in silico using stochastic reaction-diffusion models have emerged as an important tool in molecular systems biology. Designing computational software for such applications poses several challenges. Firstly, realistic lattice-based modeling for biological applications requires a consistent way of handling complex geometries, including curved inner and outer boundaries. Secondly, spatiotemporal stochastic simulations are computationally expensive due to the fast time scales of individual reaction and diffusion events when compared to the biological phenomena of actual interest. We therefore argue that simulation software needs to be both computationally efficient, employing sophisticated algorithms, yet at the same time flexible in order to meet present and future needs of increasingly complex biological modeling. Results We have developed URDME, a flexible software framework for general stochastic reaction-transport modeling and simulation. URDME uses Unstructured triangular and tetrahedral meshes to resolve general geometries, and relies on the Reaction-Diffusion Master Equation formalism to model the processes under study. An interface to mature external geometry and mesh handling software (Comsol Multiphysics) provides a stable and interactive environment for model construction. The core simulation routines are logically separated from the model building interface and written in a low-level language for computational efficiency. The connection to the geometry handling software is realized via a Matlab interface which facilitates script computing, data management, and post-processing. For practitioners, the software therefore behaves much as an interactive Matlab toolbox. At the same time, it is possible to modify and extend URDME with newly developed simulation routines. 
Since the overall design effectively hides the complexity of managing the geometry and meshes, this means that newly developed methods
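
    The core of an RDME simulation like URDME's is a Gillespie-type stochastic simulation algorithm over mesh voxels. As a minimal illustration of that mechanism (not URDME's API: the two-voxel geometry, degradation reaction, and all rates below are invented for the sketch), in pure Python:

```python
import random

def ssa_two_voxel(n0, k_deg, d_jump, t_end, seed=0):
    """Gillespie SSA for a two-voxel RDME toy problem: species A degrades
    (A -> 0, rate k_deg per molecule) and jumps between voxels (rate d_jump
    per molecule). Returns the final copy numbers per voxel."""
    rng = random.Random(seed)
    n = list(n0)
    t = 0.0
    while True:
        # propensities: degradation in each voxel, jumps in both directions
        a = [k_deg * n[0], k_deg * n[1], d_jump * n[0], d_jump * n[1]]
        a0 = sum(a)
        if a0 == 0.0:
            break                      # no molecules left, nothing can fire
        t += rng.expovariate(a0)       # exponential waiting time to next event
        if t >= t_end:
            break
        r = rng.random() * a0          # pick an event proportionally to its rate
        if r < a[0]:
            n[0] -= 1                  # degradation in voxel 0
        elif r < a[0] + a[1]:
            n[1] -= 1                  # degradation in voxel 1
        elif r < a[0] + a[1] + a[2]:
            n[0] -= 1; n[1] += 1       # diffusive jump 0 -> 1
        else:
            n[1] -= 1; n[0] += 1       # diffusive jump 1 -> 0
    return n
```

    A production framework generalizes exactly this loop to thousands of voxels and reaction channels on an unstructured mesh, which is why the efficiency and flexibility concerns raised in the abstract dominate the software design.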

  10. Final Report for Project "Framework Application for Core-Edge Transport Simulations (FACETS)"

    Energy Technology Data Exchange (ETDEWEB)

    Estep, Donald [Colorado State University

    2014-01-17

    This is the final report for the Colorado State University Component of the FACETS Project. FACETS was focused on the development of a multiphysics, parallel framework application that could provide the capability to enable whole-device fusion reactor modeling and, in the process, the development of the modeling infrastructure and computational understanding needed for ITER. It was intended that FACETS be highly flexible, through the use of modern computational methods, including component technology and object-oriented design, to facilitate switching from one model to another for a given aspect of the physics, making it possible to use simplified models for rapid turnaround or high-fidelity models that take advantage of the largest supercomputer hardware. FACETS was designed in a heterogeneous parallel context, where different parts of the application can exploit parallelism based on task farming, domain decomposition, and/or pipelining as needed and applicable. As with all fusion simulations, an integral part of the FACETS project was treatment of the coupling of different physical processes at different scales interacting closely. A primary example for the FACETS project is the coupling of existing core and edge simulations, with the transport and wall interactions described by reduced models. However, core and edge simulations themselves involve significant coupling of different processes with large scale differences. Numerical treatment of coupling is impacted by a number of factors, including scale differences, the form of information transferred between processes, the implementation of solvers in different codes, and high-performance computing concerns. Operator decomposition, in which the individual processes are computed separately using appropriate simulation codes and the component simulations are then linked/synchronized at regular points in space and time, is the de facto approach to high-performance simulation of multiphysics
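
    Operator decomposition of this kind can be sketched generically: each component solver advances independently over a coupling interval, and shared interface data is synchronized at its end. The scalar "core" and "edge" solvers below are toy stand-ins (two regions exchanging heat), not FACETS code; only the coupling pattern is the point.

```python
def couple(core_step, edge_step, core, edge, dt, n_steps):
    """Operator-decomposition driver: advance each component with its own
    solver over dt, synchronizing the shared interface data every step."""
    for _ in range(n_steps):
        core = core_step(core, edge, dt)   # core advances using the edge state
        edge = edge_step(edge, core, dt)   # edge advances using the updated core
    return core, edge

# Toy "physics": two regions relaxing toward a common temperature.
def core_step(core, edge, dt):
    return core + dt * 0.5 * (edge - core)

def edge_step(edge, core, dt):
    return edge + dt * 0.5 * (core - edge)

core, edge = couple(core_step, edge_step, 10.0, 0.0, 0.1, 200)
# the decomposed system relaxes to a single coupled equilibrium
```

    Real frameworks replace the scalars with full field solvers and the synchronization with flux/profile exchange, but the accuracy and stability questions the abstract raises arise already in this loop (the sequential update is only first-order accurate in the coupling interval).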

  11. Social simulation theory: a framework to explain nurses' understanding of patients' experiences of ill-health.

    Science.gov (United States)

    Nordby, Halvor

    2016-09-01

    A fundamental aim in caring practice is to understand patients' experiences of ill-health. These experiences have a qualitative content and cannot, unlike thoughts and beliefs with conceptual content, directly be expressed in words. Nurses therefore face a variety of interpretive challenges when they aim to understand patients' subjective perspectives on disease and illness. The article argues that theories on social simulation can shed light on how nurses manage to meet these challenges. The core assumption of social simulationism is that we do not understand other people by forming mental representations of how they think, but by putting ourselves in their situation in a more imaginative way. According to simulationism, any attempt to understand a patient's behavior is made on the basis of simulating what it is like to be that patient in the given context. The article argues that this approach to social interpretation can clarify how nurses manage to achieve aims of patient understanding, even when they have limited time to communicate and incomplete knowledge of patients' perspectives. Furthermore, simulation theory provides a normative framework for interpretation, in the sense that its theoretical assumptions constitute ideals for how nurses should seek to understand patients' experiences of illness.

  12. Developing a Conceptual Framework for Simulation Analysis in a Supply Chain Based on Common Platform (SCBCP)

    Directory of Open Access Journals (Sweden)

    M. Fathollah

    2009-08-01

    Full Text Available As a competitive advantage in modern organizations, product diversification may cause complexities in today’s extended supply chains. However, the Common Platform (CP) Strategy, as a concept of gaining maximum variety by minimum production elements, is believed to be one of the answers to eliminate or decrease these complexities. The main purpose of this paper is to provide a simulation framework for modeling the supply network of a case study in the automotive industry in order to study the impacts of part commonality through the chain. The electrical wiring harness is selected as the main part to be studied according to the essentiality and challenges of its procurement for the production of cars (as occurred in this case and many other studies). The paper does not provide the simulation results but it rather builds up the required foundation and gathers the relevant content to develop a realistic simulation model by closely studying the impacts of part multiplicity on different functional areas of the selected supply network and extracting the critical success factors of applying part commonality.

  13. 1985 annual site environmental report for Argonne National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Golchert, N.W.; Duffy, T.L.; Sedlet, J.

    1986-03-01

    This is one in a series of annual reports prepared to provide DOE, environmental agencies, and the public with information on the level of radioactive and chemical pollutants in the environment and on the amounts of such substances, if any, added to the environment as a result of Argonne operations. Included in this report are the results of measurements obtained in 1985 for a number of radionuclides in air, surface water, ground water, soil, grass, bottom sediment, and milk; for a variety of chemical constituents in surface and subsurface water; and for the external penetrating radiation dose.

  14. Research in mathematics and computer science at Argonne

    Energy Technology Data Exchange (ETDEWEB)

    Pieper, G.W.

    1989-08-01

    This report reviews the research activities in the Mathematics and Computer Science Division at Argonne National Laboratory for the period January 1988 - August 1989. The body of the report gives a brief look at the MCS staff and the research facilities, and discusses various projects carried out in two major areas of research: analytical and numerical methods and advanced computing concepts. Projects funded by non-DOE sources are also discussed, and new technology transfer activities are described. Further information on division staff, visitors, workshops, and seminars is found in the appendices.

  15. Change in Argonne National Laboratory: a case study.

    Science.gov (United States)

    Mozley, A

    1971-10-01

    Despite traditional opposition to change within an institution and the known reluctance of an "old guard" to accept new managerial policies and techniques, the reactions suggested in this study go well beyond the level of a basic resistance to change. The response, indeed, drawn from a random sampling of Laboratory scientific and engineering personnel, comes close to what Philip Handler has recently described as a run on the scientific bank in a period of depression (1, p. 146). It appears that Argonne's apprehension stems less from the financial cuts that have reduced staff and diminished programs by an annual 10 percent across the last 3 fiscal years than from the administrative and conceptual changes that have stamped the institution since 1966. Administratively, the advent of the AUA has not forged a sense of collaborative effort implicit in the founding negotiations or contributed noticeably to increasing standards of excellence at Argonne. The AUA has, in fact, yet to exercise the constructive powers vested in them by the contract of reviewing and formulating long-term policy on the research and reactor side. Additionally, the University of Chicago, once the single operator, appears to have forfeited some of the trust and understanding that characterized the Laboratory's attitude to it in former years. In a period of complex and sensitive management the present directorate at Argonne is seriously dissociated from a responsible spectrum of opinion within the Laboratory. The crux of discontent among the creative scientific and engineering community appears to lie in a developed sense of being overadministered. In contrast to earlier periods, Argonne's professional staff feels a critical need for a voice in the formulation of Laboratory programs and policy. The Argonne senate could supply this mechanism. Slow to rally, their present concern springs from a firm conviction that the Laboratory is "withering on the vine." 
By contrast, the Laboratory director Powers

  16. Initial operation of the Argonne superconducting heavy-ion linac

    Energy Technology Data Exchange (ETDEWEB)

    Shepard, K. W.

    1979-01-01

    Initial operation and recent development of the Argonne superconducting heavy-ion linac are discussed. The linac has been developed in order to demonstrate a cost-effective means of extending the performance of electrostatic tandem accelerators. The results of beam acceleration tests which began in June 1978 are described. At present 7 of a planned array of 22 resonators are operating on-line, and the linac system provides an effective accelerating potential of 7.5 MV. Although some technical problems remain, the level of performance and reliability is sufficient that appreciable beam time is becoming available to users.

  17. A multiscale framework for the simulation of the anisotropic mechanical behavior of shale

    CERN Document Server

    Li, Weixin; Jin, Congrui; Zhou, Xinwei; Cusatis, Gianluca

    2016-01-01

    Shale, like many other sedimentary rocks, is typically heterogeneous and anisotropic, characterized by partial alignment of anisotropic clay minerals and naturally formed bedding planes. In this study, a micromechanical framework based on the Lattice Discrete Particle Model (LDPM) is formulated to capture these features. Material anisotropy is introduced through an approximated geometric description of shale internal structure, which includes representation of material property variation with orientation and explicit modeling of parallel lamination. The model is calibrated by carrying out numerical simulations to match various experimental data, including the ones relevant to elastic properties, Brazilian tensile strength, and unconfined compressive strength. Furthermore, a parametric study is performed to investigate the relationship between the mesoscale parameters and the macroscopic properties. It is shown that the dependence of the elastic stiffness, strength, and failure mode on loading orientation ca...

  18. A Conceptual and UML models of procurement process for simulation framework

    Directory of Open Access Journals (Sweden)

    Abdessamad Douraid

    2012-11-01

    Full Text Available This paper presents a set of conceptual and UML models that can be used to construct a simulation framework for the procurement process. Good control of this process is crucial, as it accounts for a significant share of costs along the whole chain. For this purpose, we took into account the information and material flows of the upstream supply chain linking the manufacturer and its suppliers. Our contribution is a reusable, modular model of the procurement process that can be configured and used across several manufacturing industries, making it possible to benchmark the different scenarios of each configuration and to furnish a decision-aid tool that helps decision makers reach the right choices.

  19. A framework to quantify uncertainty in simulations of oil transport in the ocean

    KAUST Repository

    Gonçalves, Rafael C.

    2016-03-02

    An uncertainty quantification framework is developed for the DeepC Oil Model based on a nonintrusive polynomial chaos method. This allows the model's output to be presented in a probabilistic framework so that the model's predictions reflect the uncertainty in the model's input data. The new capability is illustrated by simulating the far-field dispersal of oil in a Deepwater Horizon blowout scenario. The uncertain input consisted of ocean current and oil droplet size data and the main model output analyzed is the ensuing oil concentration in the Gulf of Mexico. A 1331 member ensemble was used to construct a surrogate for the model which was then mined for statistical information. The mean and standard deviations in the oil concentration were calculated for up to 30 days, and the total contribution of each input parameter to the model's uncertainty was quantified at different depths. Also, probability density functions of oil concentration were constructed by sampling the surrogate and used to elaborate probabilistic hazard maps of oil impact. The performance of the surrogate was constantly monitored in order to demarcate the space-time zones where its estimates are reliable. © 2016. American Geophysical Union.
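
    The nonintrusive polynomial chaos idea, projecting a model's output onto orthogonal polynomials of its uncertain input and reading statistics directly off the coefficients, can be sketched in one dimension. The quadratic toy model, the second-order truncation, and the standard-normal input below are illustrative choices, not the DeepC configuration:

```python
import math

# Probabilists' Hermite polynomials He_0..He_2 and their norms E[He_k^2] = k!
HERMITE = [lambda x: 1.0, lambda x: x, lambda x: x * x - 1.0]
NORMS = [1.0, 1.0, 2.0]

# 3-point Gauss-Hermite quadrature, exact for polynomial integrands up to
# degree 5 against the standard normal density
NODES = [-math.sqrt(3.0), 0.0, math.sqrt(3.0)]
WEIGHTS = [1.0 / 6.0, 2.0 / 3.0, 1.0 / 6.0]

def pce_coefficients(model):
    """Nonintrusive projection: c_k = E[model(X) He_k(X)] / E[He_k^2],
    with the expectation evaluated by quadrature (no model internals touched)."""
    coeffs = []
    for he, norm in zip(HERMITE, NORMS):
        integral = sum(w * model(x) * he(x) for x, w in zip(NODES, WEIGHTS))
        coeffs.append(integral / norm)
    return coeffs

def pce_mean_std(coeffs):
    """Mean is c_0; variance is the sum of c_k^2 * E[He_k^2] for k >= 1."""
    mean = coeffs[0]
    var = sum(c * c * n for c, n in zip(coeffs[1:], NORMS[1:]))
    return mean, math.sqrt(var)

# For model(x) = 2 + 0.5x + x^2 with X ~ N(0, 1): mean = 3.0, std = 1.5
coeffs = pce_coefficients(lambda x: 2.0 + 0.5 * x + x * x)
```

    An ensemble-based study like the one above does the same thing in many dimensions: the ensemble runs play the role of the quadrature/regression points, and the resulting surrogate is then sampled cheaply for hazard maps and sensitivities.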

  20. Implementation and performance of FDPS: a framework for developing parallel particle simulation codes

    Science.gov (United States)

    Iwasawa, Masaki; Tanikawa, Ataru; Hosono, Natsuki; Nitadori, Keigo; Muranushi, Takayuki; Makino, Junichiro

    2016-08-01

    We present the basic idea, implementation, measured performance, and performance model of FDPS (Framework for Developing Particle Simulators). FDPS is an application-development framework which helps researchers to develop simulation programs using particle methods for large-scale distributed-memory parallel supercomputers. A particle-based simulation program for distributed-memory parallel computers needs to perform domain decomposition, exchange of particles which are not in the domain of each computing node, and gathering of the particle information in other nodes which are necessary for interaction calculation. Also, even if distributed-memory parallel computers are not used, in order to reduce the amount of computation, algorithms such as the Barnes-Hut tree algorithm or the Fast Multipole Method should be used in the case of long-range interactions. For short-range interactions, some methods to limit the calculation to neighbor particles are required. FDPS provides all of these functions which are necessary for efficient parallel execution of particle-based simulations as "templates," which are independent of the actual data structure of particles and the functional form of the particle-particle interaction. By using FDPS, researchers can write their programs with the amount of work necessary to write a simple, sequential and unoptimized program of O(N^2) calculation cost, and yet the program, once compiled with FDPS, will run efficiently on large-scale parallel supercomputers. A simple gravitational N-body program can be written in around 120 lines. We report the actual performance of these programs and the performance model. The weak scaling performance is very good, and almost linear speed-up was obtained for up to the full system of the K computer. The minimum calculation time per timestep is in the range of 30 ms (N = 10^7) to 300 ms (N = 10^9). These are currently limited by the time for the calculation of the domain decomposition and communication
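
    The user-supplied part of such a framework is essentially the particle data structure and the pairwise interaction kernel; the framework handles decomposition, exchange, and tree construction. A direct-summation O(N^2) gravity kernel of the kind a user would write (FDPS itself is C++; this 2D Python sketch with softening is purely illustrative) might look like:

```python
def gravity_accels(pos, mass, eps=1e-3):
    """Direct-summation pairwise gravitational accelerations in 2D with
    Plummer softening eps (G = 1). This is the interaction kernel a user
    supplies; a framework would apply it per domain and tree cell."""
    n = len(pos)
    acc = [[0.0, 0.0] for _ in range(n)]
    for i in range(n):
        xi, yi = pos[i]
        for j in range(n):
            if i == j:
                continue
            dx = pos[j][0] - xi
            dy = pos[j][1] - yi
            r2 = dx * dx + dy * dy + eps * eps   # softened squared distance
            inv_r3 = r2 ** -1.5
            acc[i][0] += mass[j] * dx * inv_r3
            acc[i][1] += mass[j] * dy * inv_r3
    return acc
```

    The framework's value is that this naive kernel, once plugged into its templates, is evaluated with tree algorithms and distributed over nodes without the author writing any communication code.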

  1. A multi-paradigm modeling framework to simulate dynamic reciprocity in a bioreactor.

    Directory of Open Access Journals (Sweden)

    Himanshu Kaul

    Full Text Available Despite numerous technology advances, bioreactors are still mostly utilized as functional black-boxes where trial and error eventually leads to the desirable cellular outcome. Investigators have applied various computational approaches to understand the impact the internal dynamics of such devices has on overall cell growth, but such models cannot provide a comprehensive perspective regarding the system dynamics, due to limitations inherent to the underlying approaches. In this study, a novel multi-paradigm modeling platform capable of simulating the dynamic bidirectional relationship between cells and their microenvironment is presented. Designing the modeling platform entailed fully combining and coupling an agent-based modeling platform with a transport phenomena computational modeling framework. To demonstrate capability, the platform was used to study the impact of bioreactor parameters on the overall cell population behavior and vice versa. In order to achieve this, virtual bioreactors were constructed and seeded. The virtual cells, guided by a set of rules involving the simulated mass transport inside the bioreactor, as well as cell-related probabilistic parameters, were capable of displaying an array of behaviors such as proliferation, migration, chemotaxis and apoptosis. In this way the platform was shown to capture not only the impact of bioreactor transport processes on cellular behavior but also the influence that cellular activity wields on that very same local mass transport, thereby influencing overall cell growth. The platform was validated by simulating cellular chemotaxis in a virtual direct visualization chamber and comparing the simulation with its experimental analogue. The results presented in this paper are in agreement with published models of similar flavor. The modeling platform can be used as a concept selection tool to optimize bioreactor design specifications.
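
    The bidirectional coupling described above, agents shaping the transport field that in turn governs their fate, can be caricatured on a 1D grid. All rates, the inflow term, and the fate rules below are illustrative inventions, not the paper's calibrated model:

```python
import random

def simulate_reciprocity(n_sites, steps, seed=3):
    """Toy 1D bioreactor: each site holds a cell count and a nutrient level.
    Cells deplete local nutrient (cells alter transport); nutrient diffuses
    with a small fresh-medium inflow; a cell divides where nutrient is
    plentiful and dies where it is exhausted (transport alters cells)."""
    rng = random.Random(seed)
    cells = [1] * n_sites
    nutrient = [1.0] * n_sites
    for _ in range(steps):
        # uptake: each cell consumes a fixed quantum of local nutrient
        for i in range(n_sites):
            nutrient[i] = max(nutrient[i] - 0.05 * cells[i], 0.0)
        # explicit diffusion with periodic boundaries, plus inflow 0.05
        new = nutrient[:]
        for i in range(n_sites):
            lap = (nutrient[(i - 1) % n_sites] + nutrient[(i + 1) % n_sites]
                   - 2.0 * nutrient[i])
            new[i] = nutrient[i] + 0.2 * lap + 0.05
        nutrient = new
        # cell fate follows the nutrient field the cells themselves shaped
        for i in range(n_sites):
            if cells[i] > 0 and nutrient[i] > 0.5 and rng.random() < 0.3:
                cells[i] += 1
            elif cells[i] > 0 and nutrient[i] < 0.1 and rng.random() < 0.5:
                cells[i] -= 1
    return cells, nutrient
```

    The point of the sketch is the feedback loop: neither the agent rules nor the transport solver alone determines the outcome, which is exactly why the paper couples the two paradigms instead of running them separately.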

  2. Application of SALSSA Framework to the Validation of Smoothed Particle Hydrodynamics Simulations of Low Reynolds Number Flows

    Energy Technology Data Exchange (ETDEWEB)

    Schuchardt, Karen L.; Chase, Jared M.; Daily, Jeffrey A.; Elsethagen, Todd O.; Palmer, Bruce J.; Scheibe, Timothy D.

    2009-06-15

    The Support Architecture for Large-Scale Subsurface Analysis (SALSSA) provides an extensible framework, sophisticated graphical user interface (GUI), and underlying data management system that simplifies the process of running subsurface models, tracking provenance information, and analyzing the model results. The SALSSA software framework is currently being applied to validating the Smoothed Particle Hydrodynamics (SPH) model. SPH is a three-dimensional model of flow and transport in porous media at the pore scale. Fluid flow in porous media at velocities common in natural porous media occurs at low Reynolds numbers, and therefore it is important to verify that the SPH model is producing accurate flow solutions in this regime. Validating SPH requires performing a series of simulations and comparing these simulation flow solutions to analytical results or numerical results using other methods. This validation study is being facilitated by the SALSSA framework, which provides capabilities to set up, execute, analyze, and administer these SPH simulations.

  3. Microscale chemistry technology exchange at Argonne National Laboratory - east.

    Energy Technology Data Exchange (ETDEWEB)

    Pausma, R.

    1998-06-04

    The Division of Educational Programs (DEP) at Argonne National Laboratory-East interacts with the education community at all levels to improve science and mathematics education and to provide resources to instructors of science and mathematics. DEP conducts a wide range of educational programs and has established an enormous audience of teachers, both in the Chicago area and nationally. DEP has brought microscale chemistry to the attention of this huge audience. This effort has been supported by the U.S. Department of Energy through the Environmental Management Operations organization within Argonne. Microscale chemistry is a teaching methodology wherein laboratory chemistry training is provided to students while utilizing very small amounts of reagents and correspondingly small apparatus. The techniques enable a school to reduce significantly the cost of reagents, the cost of waste disposal and the dangers associated with the manipulation of chemicals. The cost reductions are achieved while still providing the students with the hands-on laboratory experience that is vital to students who might choose to pursue careers in the sciences. Many universities and colleges have already begun to switch from macroscale to microscale chemistry in their educational laboratories. The introduction of these techniques at the secondary education level will lead to freshmen being better prepared for the type of experimentation that they will encounter in college.

  4. Draft environmental assessment of Argonne National Laboratory, East

    Energy Technology Data Exchange (ETDEWEB)

    1975-10-01

    This environmental assessment of the operation of the Argonne National Laboratory is related to continuation of research and development work being conducted at the Laboratory site at Argonne, Illinois. The Laboratory has been monitoring various environmental parameters both offsite and onsite since 1949. Meteorological data have been collected to support development of models for atmospheric dispersion of radioactive and other pollutants. Gaseous and liquid effluents, both radioactive and non-radioactive, have been measured by portable monitors and by continuous monitors at fixed sites. Monitoring of constituents of the terrestrial ecosystem provides a basis for identifying changes should they occur in this regime. The Laboratory has established a position of leadership in monitoring methodologies and their application. Offsite impacts of nonradiological accidents are primarily those associated with the release of chlorine and with sodium fires. Both result in releases that cause no health damage offsite. Radioactive materials released to the environment result in a cumulative dose to persons residing within 50 miles of the site of about 47 man-rem per year, compared to an annual total of about 950,000 man-rem delivered to the same population from natural background radiation. 100 refs., 17 figs., 33 tabs.

  5. Innovative framework to simulate the fate and transport of nonconservative constituents in urban combined sewer catchments

    Science.gov (United States)

    Morales, V. M.; Quijano, J. C.; Schmidt, A.; Garcia, M. H.

    2016-11-01

    We have developed a probabilistic model to simulate the fate and transport of nonconservative constituents in urban watersheds. The approach implemented here extends previous studies that rely on the geomorphological instantaneous unit hydrograph concept to include nonconservative constituents. This is implemented with a factor χ that affects the transfer functions and therefore accounts for the loss (gain) of mass associated with the constituent as it travels through the watershed. Using this framework, we developed an analytical solution for the dynamics of dissolved oxygen (DO) and biochemical oxygen demand (BOD) in urban networks based on the Streeter and Phelps model. This model breaks down the catchment into a discrete number of possible flow paths through the system, requiring less data and implementation effort than well-established deterministic models. Application of the model to one sewer catchment in the Chicago area with available BOD information proved its ability to predict the BOD concentration observed in the measurements. In addition, comparison of the model with a calibrated Storm Water Management Model (SWMM) of another sewer catchment from the Chicago area showed that the model predicted the BOD concentration as well as the widely accepted SWMM. The developed model proved to be a suitable alternative to simulate the fate and transport of constituents in urban catchments with limited and uncertain input data.
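
    The Streeter-Phelps component of such a model admits a closed-form DO deficit along a flow path. A direct transcription of the classical sag equation, with the equal-rate limit handled separately, is sketched below; the parameter names are generic, not the paper's notation:

```python
import math

def streeter_phelps(L0, D0, kd, ka, t):
    """Classical Streeter-Phelps DO deficit (mg/L) at travel time t (days).
    L0: initial BOD (mg/L), D0: initial DO deficit (mg/L),
    kd: deoxygenation rate (1/day), ka: reaeration rate (1/day)."""
    if abs(ka - kd) < 1e-12:
        # degenerate limit ka -> kd of the general formula
        return (kd * L0 * t + D0) * math.exp(-kd * t)
    return (kd * L0 / (ka - kd)) * (math.exp(-kd * t) - math.exp(-ka * t)) \
           + D0 * math.exp(-ka * t)
```

    Evaluating this along each probabilistically weighted flow path, with the nonconservative factor applied to the transfer functions, is the core computation the abstract describes.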

  6. A Simulation-Based Framework for the Cooperation of VMS Travel Guidance and Traffic Signal Control

    Directory of Open Access Journals (Sweden)

    Meng Li

    2014-01-01

    Full Text Available Nowadays, both travel guidance systems and traffic signal control systems are quite common in urban traffic management. To achieve a collaborative effect, different models have been proposed over the last two decades. In recent years, with the development of variable message sign (VMS) technology, more and more VMS panels are installed on major arterials to provide highly visible and concise graphs or text messages to drivers, especially in developing countries. To discover drivers’ responses to VMS, we establish a drivers’ en route diversion model according to a stated-preference survey. We then propose a cooperative mechanism and systematic framework for VMS travel guidance and major-arterial signal operations, and formulate a two-stage nested optimization problem. To solve this optimization problem, a simulation-based optimization method is adopted to optimize the cooperative strategies with TRANSIMS. The proposed method is applied to the real network of Tianjin City comprising 30 nodes and 46 links. Simulations show that this new method improves the network condition by 26.3%, and analysis reveals that a genetic algorithm (GA) with nested dynamic programming is an effective technique to solve the optimization problem.

  7. A discrete element based simulation framework to investigate particulate spray deposition processes

    KAUST Repository

    Mukherjee, Debanjan

    2015-06-01

    © 2015 Elsevier Inc. This work presents a computer simulation framework based on discrete element method to analyze manufacturing processes that comprise a loosely flowing stream of particles in a carrier fluid being deposited on a target surface. The individual particulate dynamics under the combined action of particle collisions, fluid-particle interactions, particle-surface contact and adhesive interactions is simulated, and aggregated to obtain global system behavior. A model for deposition which incorporates the effect of surface energy, impact velocity and particle size, is developed. The fluid-particle interaction is modeled using appropriate spray nozzle gas velocity distributions and a one-way coupling between the phases. It is found that the particle response times and the release velocity distribution of particles have a combined effect on inter-particle collisions during the flow along the spray. It is also found that resolution of the particulate collisions close to the target surface plays an important role in characterizing the trends in the deposit pattern. Analysis of the deposit pattern using metrics defined from the particle distribution on the target surface is provided to characterize the deposition efficiency, deposit size, and scatter due to collisions.
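
    The deposition criterion described above, adhesion governed by surface energy, impact velocity, and particle size, can be caricatured with a simple energy balance: a particle sticks when the kinetic energy surviving the rebound falls below the work of adhesion. The contact-area scaling W ~ πr²γ and all numbers below are crude illustrative assumptions, not the paper's deposition model:

```python
import math

def particle_sticks(v_impact, radius, surface_energy, density, restitution=0.8):
    """Energy-balance sticking sketch (SI units): adhere if the post-rebound
    kinetic energy is below the work of adhesion, taken here as
    W ~ pi * r^2 * gamma (a crude contact-area scaling)."""
    mass = density * (4.0 / 3.0) * math.pi * radius ** 3
    rebound_energy = 0.5 * mass * (restitution * v_impact) ** 2
    work_adhesion = math.pi * radius ** 2 * surface_energy
    return rebound_energy < work_adhesion
```

    Because mass scales with r³ while adhesion scales with r², the criterion reproduces the qualitative trend the abstract relies on: small, slow particles are captured while large, fast ones rebound.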

  8. Population genetics and molecular evolution of DNA sequences in transposable elements. I. A simulation framework.

    Science.gov (United States)

    Kijima, T E; Innan, Hideki

    2013-11-01

    A population genetic simulation framework is developed to understand the behavior and molecular evolution of DNA sequences of transposable elements. Our model incorporates random transposition and excision of transposable element (TE) copies, two modes of selection against TEs, and degeneration of transpositional activity by point mutations. We first investigated the relationships between the behavior of the copy number of TEs and these parameters. Our results show that when selection is weak, the genome can maintain a relatively large number of TEs, but most of them are less active. In contrast, with strong selection, the genome can maintain only a limited number of TEs but the proportion of active copies is large. In such a case, there could be substantial fluctuations of the copy number over generations. We also explored how DNA sequences of TEs evolve through the simulations. In general, active copies form clusters around the original sequence, while less active copies have long branches specific to themselves, exhibiting a star-shaped phylogeny. It is demonstrated that the phylogeny of TE sequences could be informative to understand the dynamics of TE evolution.
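
    A stripped-down version of such a copy-number simulation, random duplication and excision per copy plus copy-number-dependent selection, can be sketched as follows. The parameter values and the specific linear selection rule are illustrative, not the authors' model:

```python
import random

def te_copy_number_trajectory(n0, u, v, s, generations, seed=0):
    """Stochastic toy model of TE copy number: each generation every copy
    duplicates with probability u and is excised with probability v; each
    surviving copy is then removed by selection with probability s * n,
    so selection strengthens as copies accumulate."""
    rng = random.Random(seed)
    n = n0
    traj = [n]
    for _ in range(generations):
        births = sum(1 for _ in range(n) if rng.random() < u)
        deaths = sum(1 for _ in range(n) if rng.random() < v)
        n = max(n + births - deaths, 0)
        # copy-number-dependent selection caps the expansion
        p_sel = min(s * n, 1.0)
        n -= sum(1 for _ in range(n) if rng.random() < p_sel)
        traj.append(n)
    return traj
```

    In expectation the copy number settles where transposition gain balances excision plus selection (roughly n* ≈ (u - v)/s for this rule), mirroring the weak-selection/strong-selection contrast the abstract reports.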

  9. A framework for stochastic simulation of distribution practices for hotel reservations

    Energy Technology Data Exchange (ETDEWEB)

    Halkos, George E.; Tsilika, Kyriaki D. [Laboratory of Operations Research, Department of Economics, University of Thessaly, Korai 43, 38 333, Volos (Greece)

    2015-03-10

    The focus of this study is primarily on the Greek hotel industry. The objective is to design and develop a framework for stochastic simulation of reservation requests, reservation arrivals, cancellations and hotel occupancy with a planning horizon of a tourist season. In the Greek hospitality industry there were two competing policies for the reservation planning process up to 2003: reservations coming directly from customers, and reservations management relying on tour operator(s). Recently the Internet, along with other emerging technologies, has offered the potential to disrupt enduring distribution arrangements. The focus of the study is on the choice of distribution intermediaries. We present an empirical model for the hotel reservation planning process that makes use of symbolic simulation (a Monte Carlo method), as requests for reservations, cancellations, and arrival rates are all sources of uncertainty. We consider as a case study the problem of determining the optimal booking strategy for a medium-size hotel on Skiathos Island, Greece. Probability distributions and parameter estimates result from the historical data available and from suggestions made in the relevant literature. The results of this study may assist hotel managers in defining distribution strategies for hotel rooms and evaluating the performance of the reservations management system.
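
    A Monte Carlo skeleton for such a season-long occupancy simulation, Poisson request arrivals, capacity-limited acceptance, and independent cancellations, can be sketched as follows. The rates and the single-night-stay simplification are illustrative assumptions, not estimates from the case study:

```python
import random

def simulate_season(capacity, mean_requests, p_cancel, days, seed=7):
    """One stochastic season of single-night stays: requests per day are
    Poisson(mean_requests), bookings are accepted up to capacity, and each
    accepted booking cancels with probability p_cancel before arrival.
    Returns the realized occupancy for each day."""
    rng = random.Random(seed)
    occupancy = []
    for _ in range(days):
        # Poisson draw by counting exponential inter-arrival times in one day
        requests, t = 0, rng.expovariate(mean_requests)
        while t < 1.0:
            requests += 1
            t += rng.expovariate(mean_requests)
        booked = min(requests, capacity)
        arrivals = sum(1 for _ in range(booked) if rng.random() >= p_cancel)
        occupancy.append(arrivals)
    return occupancy
```

    Repeating the season many times with different seeds yields occupancy and revenue distributions under each distribution policy, which is the kind of comparison the abstract proposes for choosing intermediaries.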

  10. A heterogeneous and parallel computing framework for high-resolution hydrodynamic simulations

    Science.gov (United States)

    Smith, Luke; Liang, Qiuhua

    2015-04-01

    Shock-capturing hydrodynamic models are now widely applied in the context of flood risk assessment and forecasting, accurately capturing the behaviour of surface water over ground and within rivers. Such models are generally explicit in their numerical basis, and can be computationally expensive; this has prohibited full use of high-resolution topographic data for complex urban environments, now easily obtainable through airborne altimetric surveys (LiDAR). As processor clock speed advances have stagnated in recent years, further computational performance gains are largely dependent on the use of parallel processing. Heterogeneous computing architectures (e.g. graphics processing units or compute accelerator cards) provide a cost-effective means of achieving high throughput in cases where the same calculation is performed with a large input dataset. In recent years this technique has been applied successfully for flood risk mapping, such as within the national surface water flood risk assessment for the United Kingdom. We present a flexible software framework for hydrodynamic simulations across multiple processors of different architectures, within multiple computer systems, enabled using OpenCL and Message Passing Interface (MPI) libraries. A finite-volume Godunov-type scheme is implemented using the HLLC approach to solving the Riemann problem, with optional extension to second-order accuracy in space and time using the MUSCL-Hancock approach. The framework is successfully applied on personal computers and a small cluster to provide considerable improvements in performance. The most significant performance gains were achieved across two servers, each containing four NVIDIA GPUs, with a mix of K20, M2075 and C2050 devices. Advantages are found with respect to decreased parametric sensitivity, and thus in reducing uncertainty, for a major fluvial flood within a large catchment during 2005 in Carlisle, England. Simulations for the three-day event could be performed

  11. The effectiveness of a high-fidelity teaching simulation based on the NLN/Jeffries simulation framework in nursing education and its influencing factors

    Institute of Scientific and Technical Information of China (English)

    Fen-Fen Zhu; Li-Rong Wu

    2016-01-01

    Objective: To investigate the effectiveness of a high-fidelity teaching simulation based on the NLN/Jeffries simulation framework in nursing education and its influencing factors. Methods: A high-fidelity teaching simulation on clinical nursing practices using intelligent human analogues was conducted with 200 students, and the Simulation Design Scale and the Student Satisfaction and Self-Confidence in Learning Scale developed by the National League for Nursing were used to evaluate the training effectiveness and its influencing factors. Results: For the high-fidelity teaching simulation, students gave scores of 4.36 ± 0.54 points for satisfaction and 4.33 ± 0.46 points for self-confidence. The students highly rated the five dimensions of teaching design, i.e., teaching objectives/information, assistance/support for students, problem solving, guided feedback, and fidelity. The teaching design was closely correlated with satisfaction with the high-fidelity teaching simulation and with self-efficacy, and the dimensions of teaching objectives/information and assistance/support for students were particularly strong predictors of teaching effectiveness. Conclusions: A high-fidelity teaching simulation based on Jeffries' theoretical framework improved student satisfaction with the simulation and their self-confidence. In planning simulations, teachers should take into account five characteristics, i.e., teaching objectives/information, assistance/support for students, problem solving, guided reflection, and fidelity, to achieve better teaching effectiveness.

  12. Argonne's Laboratory Computing Resource Center : 2005 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R. B.; Coghlan, S. C; Kaushik, D. K.; Riley, K. R.; Valdes, J. V.; Pieper, G. P.

    2007-06-30

    Argonne National Laboratory founded the Laboratory Computing Resource Center in the spring of 2002 to help meet pressing program needs for computational modeling, simulation, and analysis. The guiding mission is to provide critical computing resources that accelerate the development of high-performance computing expertise, applications, and computations to meet the Laboratory's challenging science and engineering missions. The first goal of the LCRC was to deploy a mid-range supercomputing facility to support the unmet computational needs of the Laboratory. To this end, in September 2002, the Laboratory purchased a 350-node computing cluster from Linux NetworX. This cluster, named 'Jazz', achieved over a teraflop of computing power (10{sup 12} floating-point calculations per second) on standard tests, making it the Laboratory's first terascale computing system and one of the fifty fastest computers in the world at the time. Jazz was made available to early users in November 2002 while the system was undergoing development and configuration. In April 2003, Jazz was officially made available for production operation. Since then, the Jazz user community has grown steadily. By the end of fiscal year 2005, there were 62 active projects on Jazz involving over 320 scientists and engineers. These projects represent a wide cross-section of Laboratory expertise, including work in biosciences, chemistry, climate, computer science, engineering applications, environmental science, geoscience, information science, materials science, mathematics, nanoscience, nuclear engineering, and physics. Most important, many projects have achieved results that would have been unobtainable without such a computing resource. The LCRC continues to improve the computational science and engineering capability and quality at the Laboratory. Specific goals include expansion of the use of Jazz to new disciplines and Laboratory initiatives, teaming with Laboratory infrastructure

  13. Argonne's Laboratory computing resource center : 2006 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R. B.; Kaushik, D. K.; Riley, K. R.; Valdes, J. V.; Drugan, C. D.; Pieper, G. P.

    2007-05-31

    Argonne National Laboratory founded the Laboratory Computing Resource Center (LCRC) in the spring of 2002 to help meet pressing program needs for computational modeling, simulation, and analysis. The guiding mission is to provide critical computing resources that accelerate the development of high-performance computing expertise, applications, and computations to meet the Laboratory's challenging science and engineering missions. In September 2002 the LCRC deployed a 350-node computing cluster from Linux NetworX to address Laboratory needs for mid-range supercomputing. This cluster, named 'Jazz', achieved over a teraflop of computing power (10{sup 12} floating-point calculations per second) on standard tests, making it the Laboratory's first terascale computing system and one of the 50 fastest computers in the world at the time. Jazz was made available to early users in November 2002 while the system was undergoing development and configuration. In April 2003, Jazz was officially made available for production operation. Since then, the Jazz user community has grown steadily. By the end of fiscal year 2006, there were 76 active projects on Jazz involving over 380 scientists and engineers. These projects represent a wide cross-section of Laboratory expertise, including work in biosciences, chemistry, climate, computer science, engineering applications, environmental science, geoscience, information science, materials science, mathematics, nanoscience, nuclear engineering, and physics. Most important, many projects have achieved results that would have been unobtainable without such a computing resource. The LCRC continues to foster growth in the computational science and engineering capability and quality at the Laboratory. Specific goals include expansion of the use of Jazz to new disciplines and Laboratory initiatives, teaming with Laboratory infrastructure providers to offer more scientific data management capabilities, expanding Argonne staff

  14. Flood-hazard analysis of four headwater streams draining the Argonne National Laboratory property, DuPage County, Illinois

    Science.gov (United States)

    Soong, David T.; Murphy, Elizabeth A.; Straub, Timothy D.; Zeeb, Hannah L.

    2016-11-22

    Results of a flood-hazard analysis conducted by the U.S. Geological Survey, in cooperation with the Argonne National Laboratory, for four headwater streams within the Argonne National Laboratory property indicate that the 1-percent and 0.2-percent annual exceedance probability floods would cause multiple roads to be overtopped. Results indicate that most of the effects on the infrastructure would be from flooding of Freund Brook. Flooding on the Northeast and Southeast Drainage Ways would be limited to overtopping of one road crossing for each of those streams. The Northwest Drainage Way would be the least affected, with flooding expected to occur in open grass or forested areas. The Argonne Site Sustainability Plan outlined the development of hydrologic and hydraulic models and the creation of flood-plain maps of the existing site conditions as a first step in addressing resiliency to possible climate change impacts as required by Executive Order 13653 “Preparing the United States for the Impacts of Climate Change.” The Hydrological Simulation Program-FORTRAN is the hydrologic model used in the study, and the Hydrologic Engineering Center‒River Analysis System (HEC–RAS) is the hydraulic model. The model results were verified by comparing simulated water-surface elevations to observed water-surface elevations measured at a network of five crest-stage gages on the four study streams. The comparison between crest-stage gage and simulated elevations resulted in an average absolute difference of 0.06 feet and a maximum difference of 0.19 feet. In addition to the flood-hazard model development and mapping, a qualitative stream assessment was conducted to evaluate stream channel and substrate conditions in the study reaches. This information can be used to evaluate erosion potential.

  15. Simulating mesoscale coastal evolution for decadal coastal management: A new framework integrating multiple, complementary modelling approaches

    Science.gov (United States)

    van Maanen, Barend; Nicholls, Robert J.; French, Jon R.; Barkwith, Andrew; Bonaldo, Davide; Burningham, Helene; Brad Murray, A.; Payo, Andres; Sutherland, James; Thornhill, Gillian; Townend, Ian H.; van der Wegen, Mick; Walkden, Mike J. A.

    2016-03-01

    Coastal and shoreline management increasingly needs to consider morphological change occurring at decadal to centennial timescales, especially that related to climate change and sea-level rise. This requires the development of morphological models operating at a mesoscale, defined by time and length scales of the order of 10¹ to 10² years and 10¹ to 10² km. So-called 'reduced complexity' models that represent critical processes at scales not much smaller than the primary scale of interest, and are regulated by capturing the critical feedbacks that govern landform behaviour, are proving effective as a means of exploring emergent coastal behaviour at a landscape scale. Such models tend to be computationally efficient and are thus easily applied within a probabilistic framework. At the same time, reductionist models, built upon a more detailed description of hydrodynamic and sediment transport processes, are capable of application at increasingly broad spatial and temporal scales. More qualitative modelling approaches are also emerging that can guide the development and deployment of quantitative models, and these can be supplemented by varied data-driven modelling approaches that can achieve new explanatory insights from observational datasets. Such disparate approaches have hitherto been pursued largely in isolation by mutually exclusive modelling communities. Brought together, they have the potential to facilitate a step change in our ability to simulate the evolution of coastal morphology at scales that are most relevant to managing erosion and flood risk. Here, we advocate and outline a new integrated modelling framework that deploys coupled mesoscale reduced complexity models, reductionist coastal area models, data-driven approaches, and qualitative conceptual models. Integration of these heterogeneous approaches gives rise to model compositions that can potentially resolve decadal- to centennial-scale behaviour of diverse coupled open coast, estuary and inner

  16. Composable Mission Framework for Rapid End-to-End Mission Design and Simulation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation proposed here is the Composable Mission Framework (CMF), a model-based software framework that shall enable seamless continuity of mission design and...

  17. Argonne National Lab deploys Force10 networks' massively dense ethernet switch for supercomputing cluster

    CERN Multimedia

    2003-01-01

    "Force10 Networks, Inc. today announced that Argonne National Laboratory (Argonne, IL) has successfully deployed Force10 E-Series switch/routers to connect to the TeraGrid, the world's largest supercomputing grid, sponsored by the National Science Foundation (NSF)" (1/2 page).

  18. Frontiers: Research highlights 1946-1996 [50th Anniversary Edition]. Argonne National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-12-31

    This special edition of 'Frontiers' commemorates Argonne National Laboratory's 50th anniversary of service to science and society. America's first national laboratory, Argonne has been in the forefront of U.S. scientific and technological research from its beginning. Past accomplishments, current research, and future plans are highlighted.

  19. Molecular simulation investigation into the performance of Cu-BTC metal-organic frameworks for carbon dioxide-methane separations

    NARCIS (Netherlands)

    Gutiérrez-Sevillano, J.J.; Caro-Pérez, A.; Dubbeldam, D.; Calero, S.

    2011-01-01

    We report a molecular simulation study for Cu-BTC metal-organic frameworks as carbon dioxide-methane separation devices. For this study we have computed adsorption and diffusion of methane and carbon dioxide in the structure, both as pure components and mixtures over the full range of bulk gas compo

  20. Enhanced adsorption selectivity of hydrogen/methane mixtures in metal-organic frameworks with interpenetration: A molecular simulation study

    NARCIS (Netherlands)

    Liu, B.; Yang, Q.; Xue, C.; Zhong, C.; Chen, B.; Smit, B.

    2008-01-01

    In this work a systematic molecular simulation study was performed to study the effect of interpenetration on gas mixture separation in metal−organic frameworks (MOFs). To do this, three pairs of isoreticular MOFs (IRMOFs) with and without interpenetration were adopted to compare their adsorption se

  1. An ancilla-based quantum simulation framework for non-unitary matrices

    Science.gov (United States)

    Daskin, Ammar; Kais, Sabre

    2017-01-01

    The success probability in an ancilla-based circuit generally decreases exponentially in the number of qubits in the ancilla. Although the probability can be amplified through the amplitude amplification process, the input dependence of the amplitude amplification makes it difficult to sequentially combine two or more ancilla-based circuits. A new version of amplitude amplification, known as oblivious amplitude amplification, runs independently of the input to the system register. This allows us to sequentially combine two or more ancilla-based circuits. However, this type of amplification only works when the considered system is unitary, or non-unitary but close to a unitary. In this paper, we present a general framework to simulate non-unitary processes on ancilla-based quantum circuits in which the success probability is maximized by using oblivious amplitude amplification. In particular, we show how to extend a non-unitary matrix to an almost unitary matrix. We then employ the extended matrix in an ancilla-based circuit design along with oblivious amplitude amplification. By measuring the distance of the produced matrix to the closest unitary matrix, a lower bound for the fidelity of the final state obtained from the oblivious amplitude amplification process is presented. Numerical simulations for random matrices of different sizes show that, independent of the system size, the final amplified probabilities are generally around 0.75 and the fidelity of the final state is mostly high, around 0.95. Furthermore, we discuss the complexity analysis and show that by combining two such ancilla-based circuits, a matrix product can be implemented. This may allow efficient implementation of matrix functions represented as infinite matrix products on quantum computers.
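The idea of extending a non-unitary matrix so that it can act on a system-plus-ancilla register can be illustrated with the standard unitary dilation of a contraction (a generic construction shown for intuition; the paper's extension to an "almost unitary" matrix differs in detail):

```python
import numpy as np

def psd_sqrt(M):
    """Square root of a Hermitian positive semidefinite matrix via eigh."""
    w, V = np.linalg.eigh(M)
    w = np.clip(w, 0.0, None)  # clip tiny negative eigenvalues from round-off
    return (V * np.sqrt(w)) @ V.conj().T

def unitary_dilation(A):
    """Embed a contraction A (spectral norm <= 1) into a unitary of twice
    the dimension using the standard dilation
        U = [[A, sqrt(I - A A*)], [sqrt(I - A* A), -A*]].
    The top-left block of U acting on the enlarged register reproduces A
    when the ancilla starts and ends in |0>."""
    A = np.asarray(A, dtype=complex)
    n = A.shape[0]
    I = np.eye(n)
    top = np.hstack([A, psd_sqrt(I - A @ A.conj().T)])
    bot = np.hstack([psd_sqrt(I - A.conj().T @ A), -A.conj().T])
    return np.vstack([top, bot])

# scale an arbitrary matrix so its largest singular value is 1
rng = np.random.default_rng(0)
M = rng.normal(size=(4, 4))
A = M / np.linalg.svd(M, compute_uv=False)[0]
U = unitary_dilation(A)
```

The success probability of recovering A's action is the norm of the post-selected branch, which is exactly the exponentially shrinking quantity that amplitude amplification is used to boost.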

  2. Users Handbook for the Argonne Premium Coal Sample Program

    Energy Technology Data Exchange (ETDEWEB)

    Vorres, K.S.

    1993-10-01

    This Users Handbook for the Argonne Premium Coal Samples provides the recipients of those samples with information that will enhance the value of the samples, to permit greater opportunities to compare their work with that of others, and aid in correlations that can improve the value to all users. It is hoped that this document will foster a spirit of cooperation and collaboration such that the field of basic coal chemistry may be a more efficient and rewarding endeavor for all who participate. The different sections are intended to stand alone. For this reason some of the information may be found in several places. The handbook is also intended to be a dynamic document, constantly subject to change through additions and improvements. Please feel free to write to the editor with your comments and suggestions.

  3. Development of a Lattice Boltzmann Framework for Numerical Simulation of Thrombosis

    Science.gov (United States)

    Harrison, S. E.; Bernsdorf, J.; Hose, D. R.; Lawford, P. V.

    The interacting factors relating to thrombogenesis were defined by Virchow in 1856 to be abnormalities of blood chemistry, the vessel wall and haemodynamics. Together, these factors are known as Virchow's triad. Many attempts have been made to simulate numerically certain aspects of the complex phenomena of thrombosis, but a comprehensive model, which includes the biochemical and physical aspects of Virchow's triad, and is capable of predicting thrombus development within physiological geometries has not yet been developed. Such a model would consider the role of platelets and the coagulation cascade along with the properties of the flow in the chosen vessel. A lattice Boltzmann thrombosis framework has been developed, on top of an existing flow solver, to model the formation of thrombi resulting from platelet activation and initiation of the coagulation cascade by one or more of the strands of Virchow's triad. Both processes then act in parallel, to restore homeostasis as the deposited thrombus disturbs the flow. Results are presented in a model of deep vein thrombosis (DVT), resulting from hypoxia and associated endothelial damage.

  4. Autogenerator-based modelling framework for development of strategic games simulations: rational pigs game extended.

    Science.gov (United States)

    Fabac, Robert; Radošević, Danijel; Magdalenić, Ivan

    2014-01-01

    When considering strategic games from the conceptual perspective that focuses on the questions of participants' decision-making rationality, the issues of modelling and simulation are rarely discussed. The well-known Rational Pigs matrix game has been analyzed relatively intensively in terms of reassessment of the logic of two players involved in asymmetric situations as gluttons that differ significantly in their attributes. This paper presents a successful attempt at using an autogenerator to create the framework of the game, including the predefined scenarios and corresponding payoffs. The autogenerator offers flexibility concerning the specification of game parameters, which consist of variations in the number of simultaneous players and their features, game objects and their attributes, and some general game characteristics. In the proposed approach, the model of the autogenerator was upgraded to enable program specification updates. For the treatment of more complex strategic scenarios, we created the Rational Pigs Game Extended (RPGE), in which the introduction of a third glutton entails significant structural changes. In addition, due to the particular attributes of the new player, "the tramp," one equilibrium point from the original game is destabilized, which has an influence on the decision-making of rational players.

  5. Generic Procedure for Coupling the PHREEQC Geochemical Modeling Framework with Flow and Solute Transport Simulators

    Science.gov (United States)

    Wissmeier, L. C.; Barry, D. A.

    2009-12-01

    Computer simulations of water availability and quality play an important role in state-of-the-art water resources management. However, many of the most utilized software programs focus either on physical flow and transport phenomena (e.g., MODFLOW, MT3DMS, FEFLOW, HYDRUS) or on geochemical reactions (e.g., MINTEQ, PHREEQC, CHESS, ORCHESTRA). In recent years, several couplings between both genres of programs evolved in order to consider interactions between flow and biogeochemical reactivity (e.g., HP1, PHWAT). Software coupling procedures can be categorized as ‘close couplings’, where programs pass information via the memory stack at runtime, and ‘remote couplings’, where the information is exchanged at each time step via input/output files. The former generally involves modifications of software codes and therefore expert programming skills are required. We present a generic recipe for remotely coupling the PHREEQC geochemical modeling framework and flow and solute transport (FST) simulators. The iterative scheme relies on operator splitting with continuous re-initialization of PHREEQC and the FST of choice at each time step. Since PHREEQC calculates the geochemistry of aqueous solutions in contact with soil minerals, the procedure is primarily designed for couplings to FST’s for liquid phase flow in natural environments. It requires the accessibility of initial conditions and numerical parameters such as time and space discretization in the input text file for the FST and control of the FST via commands to the operating system (batch on Windows; bash/shell on Unix/Linux). The coupling procedure is based on PHREEQC’s capability to save the state of a simulation with all solid, liquid and gaseous species as a PHREEQC input file by making use of the dump file option in the TRANSPORT keyword. The output from one reaction calculation step is therefore reused as input for the following reaction step where changes in element amounts due to advection
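The operator-splitting loop at the heart of the coupling recipe can be sketched with toy stand-ins: explicit upwind advection in place of the flow and solute transport (FST) simulator and first-order decay in place of the geochemical solver, with the state handed from one step to the next (in the real procedure this hand-off happens through dump/input files rather than in memory). All functions and parameters here are illustrative.

```python
import numpy as np

def transport_step(c, velocity, dx, dt):
    """Explicit upwind advection -- a stand-in for the FST simulator."""
    cn = c.copy()
    cn[1:] -= velocity * dt / dx * (c[1:] - c[:-1])
    return cn

def reaction_step(c, k, dt):
    """First-order decay -- a stand-in for the geochemical solver that
    would normally be re-initialized from a dumped state file."""
    return c * np.exp(-k * dt)

# operator splitting: alternate transport and reaction each time step,
# mimicking the remote-coupling recipe with continuous re-initialization
dx, dt, v, k = 1.0, 0.5, 1.0, 0.1   # CFL = v*dt/dx = 0.5, stable
c = np.zeros(50)
c[0] = 1.0  # initial solute pulse in the first cell
for step in range(40):
    c = transport_step(c, v, dx, dt)   # "FST" half of the split step
    c = reaction_step(c, k, dt)        # "PHREEQC" half of the split step
```

This sequential (non-iterative) splitting is first-order accurate in the time step; the file-based exchange in the actual recipe adds bookkeeping but not new numerics.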

  6. A Simulation Framework for Exploring Socioecological Dynamics and Sustainability of Settlement Systems Under Stress in Ancient Mesopotamia and Beyond

    Science.gov (United States)

    Christiansen, J. H.; Altaweel, M. R.

    2007-12-01

    The presentation will describe an object-oriented, agent-based simulation framework being used to help answer longstanding questions regarding the development trajectories and sustainability of ancient Mesopotamian settlement systems. This multidisciplinary, multi-model framework supports explicit, fine-scale representations of the dynamics of key natural processes such as crop growth, hydrology, and weather, operating concurrently with social processes such as kinship-driven behaviors, farming and herding practices, social stratification, and economic and political activities carried out by social agents that represent individual persons, households, and larger-scale organizations. The framework has allowed us to explore the inherently coupled dynamics of modeled settlements and landscapes that are undergoing diverse social and environmental stresses, both acute and chronic, across multi-generational time spans. The simulation framework was originally used to address single-settlement scenarios, but has recently been extended to begin to address settlement system sustainability issues at sub-regional to regional scale, by introducing a number of new dynamic mechanisms, such as the activities of nomadic communities, that manifest themselves at these larger spatial scales. The framework is flexible and scalable and has broad applicability. It has, for example, recently been adapted to address agroeconomic sustainability of settlement systems in modern rural Thailand, testing the resilience and vulnerability of settled landscapes in the face of such perturbations as large-scale political interventions, global economic shifts, and climate change.

  7. Experimental Evidence Supported by Simulations of a Very High H2 Diffusion in Metal Organic Framework Materials

    Science.gov (United States)

    Salles, F.; Jobic, H.; Maurin, G.; Koza, M. M.; Llewellyn, P. L.; Devic, T.; Serre, C.; Ferey, G.

    2008-06-01

    Quasielastic neutron scattering measurements are combined with molecular dynamics simulations to extract the self-diffusion coefficient of hydrogen in the metal organic frameworks MIL-47(V) and MIL-53(Cr). We find that the diffusivity of hydrogen at low loading is about 2 orders of magnitude higher than in zeolites. Such a high mobility has never been experimentally observed before in any nanoporous materials, although it was predicted in carbon nanotubes. Either 1D or 3D diffusion mechanisms are elucidated depending on the chemical features of the MIL framework.
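The self-diffusion coefficient quoted in such studies is typically extracted from the Einstein relation MSD(t) = 6 D t. A sketch of that analysis on a toy ensemble of 3-D lattice random walks (standing in for the MD trajectories; the fitting procedure is the same):

```python
import random

def estimate_diffusion_3d(n_walkers=200, steps=500, dt=1.0,
                          step_len=1.0, seed=3):
    """Estimate D from the Einstein relation MSD(t) = 6 D t, using an
    ensemble-averaged mean square displacement of toy 3-D random walks.
    For this walk the theoretical value is step_len**2 / (6 * dt)."""
    rng = random.Random(seed)
    msd = [0.0] * steps
    for _ in range(n_walkers):
        x = y = z = 0.0
        for i in range(steps):
            # unit step along a random Cartesian axis, random sign
            axis, sign = rng.randrange(3), rng.choice((-1.0, 1.0))
            move = sign * step_len
            if axis == 0:
                x += move
            elif axis == 1:
                y += move
            else:
                z += move
            msd[i] += (x * x + y * y + z * z) / n_walkers
    # least-squares slope of MSD(t) through the origin; D = slope / 6
    times = [dt * (i + 1) for i in range(steps)]
    slope = sum(t * m for t, m in zip(times, msd)) / sum(t * t for t in times)
    return slope / 6.0

D = estimate_diffusion_3d()
```

Comparing D values fitted this way for a guest molecule in two hosts is exactly how "two orders of magnitude higher than in zeolites" statements are made quantitative.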

  8. A Generic Simulation Framework for Non-Entangled based Experimental Quantum Cryptography and Communication: Quantum Cryptography and Communication Simulator (QuCCs)

    Science.gov (United States)

    Buhari, Abudhahir; Zukarnain, Zuriati Ahmad; Khalid, Roszelinda; Zakir Dato', Wira Jaafar Ahmad

    2016-11-01

    The applications of quantum information science are moving towards bigger and better heights for the next-generation technology. Especially in the fields of quantum cryptography and quantum computation, the world has already witnessed various ground-breaking tangible products and promising results. Quantum cryptography is one of the mature fields of quantum mechanics, and products are already available on the market. The current state of quantum cryptography is still under active research in order to reach the heights of digital cryptography. The complexity of quantum cryptography is higher due to the combination of hardware and software. The lack of an effective simulation tool to design and analyze quantum cryptography experiments delays progress in the field. In this paper, we propose a framework to achieve an effective non-entanglement-based quantum cryptography simulation tool. We applied hybrid simulation techniques, i.e. discrete event, continuous event and system dynamics. We also highlight the limitations of experiments based on a commercial photonic simulation tool. Finally, we discuss ideas for achieving a one-stop simulation package for quantum-based secure key distribution experiments. All the modules of the simulation framework are viewed from the computer science perspective.

  9. Development of web-based collaborative framework for the simulation of embedded systems

    Directory of Open Access Journals (Sweden)

    Woong Yang

    2016-10-01

    In this study, a Web-based collaboration framework has been developed that provides a flexible connection between the virtual environment and the physical environment. This framework is able to verify and manage physical environments. It can also resolve the bottlenecks encountered during the expansion and development of IoT (Internet of Things) environments.

  10. KMCLib: A general framework for lattice kinetic Monte Carlo (KMC) simulations

    Science.gov (United States)

    Leetmaa, Mikael; Skorodumova, Natalia V.

    2014-09-01

    KMCLib is a general framework for lattice kinetic Monte Carlo (KMC) simulations. The program can handle simulations of the diffusion and reaction of millions of particles in one, two, or three dimensions, and is designed to be easily extended and customized by the user to allow for the development of complex custom KMC models for specific systems without having to modify the core functionality of the program. Analysis modules and on-the-fly elementary step diffusion rate calculations can be implemented as plugins following a well-defined API. The plugin modules are loosely coupled to the core KMCLib program via the Python scripting language. KMCLib is written as a Python module with a backend C++ library. After initial compilation of the backend library KMCLib is used as a Python module; input to the program is given as a Python script executed using a standard Python interpreter. We give a detailed description of the features and implementation of the code and demonstrate its scaling behavior and parallel performance with a simple one-dimensional A-B-C lattice KMC model and a more complex three-dimensional lattice KMC model of oxygen-vacancy diffusion in a fluorite structured metal oxide. KMCLib can keep track of individual particle movements and includes tools for mean square displacement analysis, and is therefore particularly well suited for studying diffusion processes at surfaces and in solids. Catalogue identifier: AESZ_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AESZ_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU General Public License, version 3 No. of lines in distributed program, including test data, etc.: 49 064 No. of bytes in distributed program, including test data, etc.: 1 575 172 Distribution format: tar.gz Programming language: Python and C++. Computer: Any computer that can run a C++ compiler and a Python interpreter. Operating system: Tested on Ubuntu 12
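The core rejection-free KMC loop that KMCLib generalizes can be sketched in a few lines: pick a process with probability proportional to its rate, execute it, and advance time by an exponentially distributed waiting time governed by the total rate. The single-particle hopping model below is illustrative only, not KMCLib's API.

```python
import math
import random

def kmc_1d_hop(n_sites=100, rate_left=1.0, rate_right=1.0,
               n_steps=1000, seed=5):
    """Minimal rejection-free lattice KMC: a single particle hops left
    or right on a periodic 1-D lattice. Each step selects a process
    proportionally to its rate and draws an exponential waiting time
    from the total rate -- the loop a full framework generalizes to
    millions of particles and user-defined processes."""
    rng = random.Random(seed)
    pos, t = 0, 0.0
    total_rate = rate_left + rate_right
    for _ in range(n_steps):
        # select a process with probability proportional to its rate
        if rng.random() < rate_right / total_rate:
            pos = (pos + 1) % n_sites
        else:
            pos = (pos - 1) % n_sites
        # advance the clock: waiting times are Exp(total_rate)
        t += -math.log(1.0 - rng.random()) / total_rate
    return pos, t

pos, t = kmc_1d_hop()
```

Because each event also carries a physical time increment, trajectories produced this way can feed directly into mean square displacement analysis of the kind KMCLib provides.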

  11. A simulation framework for pre-clinical studies on dose and image quality: concept and first validation

    Science.gov (United States)

    Smans, Kristien; Pauwels, Herman; Rogge, Frank; Struelens, Lara; Dragusin, Octavian; Vanhavere, Filip; Bosmans, Hilde

    2008-03-01

    Purpose: The purpose of the study was to set up and validate a simulation framework for dose and image quality optimization studies. In a first phase we evaluated whether CDRAD images as obtained with computed radiography (CR) plates could be simulated. Material and Methods: The Monte Carlo method is a numerical method that can be used to simulate radiation transport. In diagnostic radiology it is often used in dosimetry, but in the present study it is used to simulate X-ray images. With the Monte Carlo software MCNPX, the successive steps in the imaging chain were simulated: the X-ray beam, the attenuation and scatter process in a test object, and image generation by an ideal detector. These simulated images were further modified for specific properties of CR imaging systems. The signal-transfer properties were used to convert the simulated images into the proper grey scale. To account for resolution properties, the simulated images were convolved with the point spread function of the CR systems. In a last phase, noise, based on noise power spectrum (NPS) measurements, was added to the image. In this study, we simulated X-ray images of the CDRAD contrast-detail phantom. These simulated images, modified for the CR system, were compared with real X-ray images of the CDRAD phantom. All images were scored by computer readings. Results: First results confirm that realistic CDRAD images can be simulated and that reading results of series of simulated and real images show the same tendency. The simulations also show that white noise has a large influence on image quality and CDRAD analyses.
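The post-processing chain described above (PSF convolution followed by noise addition) can be sketched as follows, with a separable Gaussian PSF and white Gaussian noise standing in for the measured PSF and NPS-shaped noise of a real CR system; the test object and all parameters are invented for illustration:

```python
import numpy as np

def apply_detector_model(ideal, psf_sigma=1.5, noise_std=5.0, seed=11):
    """Sketch of the imaging-chain post-processing: blur an ideal
    (Monte Carlo) image with a detector point spread function, then
    add noise. A Gaussian PSF and white noise are simplifying stand-ins
    for measured PSF and NPS data."""
    # build a normalized 1-D Gaussian kernel and blur separably
    radius = int(4 * psf_sigma)
    xs = np.arange(-radius, radius + 1)
    kernel = np.exp(-xs ** 2 / (2 * psf_sigma ** 2))
    kernel /= kernel.sum()
    blurred = np.apply_along_axis(
        lambda r: np.convolve(r, kernel, mode="same"), 1, ideal)
    blurred = np.apply_along_axis(
        lambda col: np.convolve(col, kernel, mode="same"), 0, blurred)
    rng = np.random.default_rng(seed)
    return blurred + rng.normal(0.0, noise_std, size=ideal.shape)

# a crude contrast-detail test object: uniform background with one disc
img = np.full((64, 64), 100.0)
yy, xx = np.mgrid[:64, :64]
img[(yy - 32) ** 2 + (xx - 32) ** 2 < 36] = 120.0
out = apply_detector_model(img)
```

Scoring a series of such simulated phantoms against real acquisitions is the validation step the study performs with the CDRAD reader.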

  12. Development of a Simulation Framework for Analyzing Security of Supply in Integrated Gas and Electric Power Systems

    Directory of Open Access Journals (Sweden)

    Kwabena Addo Pambour

    2017-01-01

    Full Text Available Gas and power networks are tightly coupled and interact with each other through physically interconnected facilities. In an integrated gas and power network, a contingency observed in one system may cause iterative cascading failures, resulting in network-wide disruptions. Therefore, understanding the impacts of the interactions in both systems is crucial for governments, system operators, regulators and operational planners, particularly to ensure security of supply for the overall energy system. Although simulation has been widely used in the assessment of gas systems as well as power systems, there is a significant gap in simulation models that are able to address the coupling of both systems. In this paper, a simulation framework that models and simulates the gas and power network in an integrated manner is proposed. The framework consists of a transient model for the gas system and a steady-state model for the power system based on AC optimal power flow. The gas and power system models are coupled through an interface which uses the coupling equations to establish the data exchange and coordination between the individual models. The bidirectional interlinks between the two systems considered in this study are the fuel gas offtake of gas-fired power plants for power generation, and the power supply to liquefied natural gas (LNG) terminals and to electric drivers installed in gas compressor stations and underground gas storage facilities. The simulation framework is implemented in an innovative simulation tool named SAInt (Scenario Analysis Interface for Energy Systems), and the capabilities of the tool are demonstrated by performing a contingency analysis for a real world example. Results indicate how a disruption triggered in one system propagates to the other system and affects the operation of critical facilities. In addition, the studies show the importance of using transient gas models for security of supply studies instead of successions of
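
    The interface role described above, exchanging coupling variables until the two solvers agree, can be illustrated with a toy fixed-point hand-off. The efficiency, demand, and capacity numbers below are invented, and the two "models" are one-line stand-ins for a transient gas solver and an AC-OPF solver.

```python
def power_dispatch(gas_available):
    """Toy power model: the gas-fired plant generates up to what its fuel
    allows (a 40% conversion efficiency is assumed here)."""
    demand = 500.0                       # MW of demand assigned to the plant
    return min(demand, 0.4 * gas_available)

def gas_delivery(gas_gen):
    """Toy gas model: fuel offtake implied by the dispatched plant,
    limited by an assumed pipeline delivery capacity."""
    offtake = gas_gen / 0.4              # MW-equivalent of gas demanded
    capacity = 900.0                     # MW-equivalent deliverable
    return min(offtake, capacity)

# Fixed-point iteration over the coupling variables until the exchanged
# values stop changing -- the coordination job the SAInt interface does.
gas = 1e9                                # start unconstrained
for _ in range(50):
    gen = power_dispatch(gas)
    new_gas = gas_delivery(gen)
    if abs(new_gas - gas) < 1e-9:
        break
    gas = new_gas
```

    In this toy case the pipeline constraint binds, so the plant settles at 360 MW rather than its 500 MW dispatch target, a miniature of how a gas-side contingency propagates into the power system.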

  13. Towards a framework for teaching about information technology risk in health care: Simulating threats to health data and patient safety

    Directory of Open Access Journals (Sweden)

    Elizabeth M. Borycki

    2015-09-01

    Full Text Available In this paper the author describes work towards an integrative framework for educating health information technology (HIT) professionals about technology risk. The framework considers multiple sources of risk to health data quality and integrity that can result from the use of HIT, and it can be used to teach health professional students about these risks when using health technologies. The framework encompasses issues and problems that may arise from varied sources, including intentional alterations (e.g., resulting from hacking and security breaches) as well as unintentional breaches and corruption of data (e.g., resulting from technical problems or from technology-induced errors). The framework has several levels: the level of human factors and usability of HIT, the level of monitoring of security and accuracy, the HIT architectural level, the level of operational and physical checks, the level of healthcare quality assurance policies, and the data risk management strategies level. Approaches to monitoring and simulation of risk are also discussed, including an innovative approach to monitoring potential quality issues. This is followed by a discussion of the application of computer simulations to educate both students and health information technology professionals about the impact and spread of technology-induced and related types of data errors involving HIT.

  14. A General Simulation Framework for Supply Chain Modeling: State of the Art and Case Study

    OpenAIRE

    2010-01-01

    Nowadays there is a large availability of discrete event simulation software that can be easily used in different domains: from industry to supply chain, from healthcare to business management, from training to complex systems design. Simulation engines of commercial discrete event simulation software use specific rules and logics for simulation time and events management. Difficulties and limitations come up when commercial discrete event simulation software are used for modeling complex real-world systems (i.e., supply chains, industrial plants)...

  15. Flow Induced Vibration Program at Argonne National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    1984-01-01

    Argonne National Laboratory has had a Flow Induced Vibration Program since 1967; the Program currently resides in the Laboratory's Components Technology Division. Throughout its existence, the overall objective of the program has been to develop and apply new and/or improved methods of analysis and testing for the design evaluation of nuclear reactor plant components and heat exchange equipment from the standpoint of flow induced vibration. Historically, the majority of the program activities have been funded by the US Atomic Energy Commission (AEC), Energy Research and Development Administration (ERDA), and Department of Energy (DOE). Current DOE funding is from the Breeder Mechanical Component Development Division, Office of Breeder Technology Projects; Energy Conversion and Utilization Technology (ECUT) Program, Office of Energy Systems Research; and Division of Engineering, Mathematical and Geosciences, Office of Basic Energy Sciences. Testing of Clinch River Breeder Reactor upper plenum components has been funded by the Clinch River Breeder Reactor Plant (CRBRP) Project Office. Work has also been performed under contract with Foster Wheeler, General Electric, Duke Power Company, US Nuclear Regulatory Commission, and Westinghouse.

  16. Treatment of mixed radioactive liquid wastes at Argonne National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Vandegrift, G.F.; Chamberlain, D.B.; Conner, C. [and others]

    1994-03-01

    Aqueous mixed waste at Argonne National Laboratory (ANL) is traditionally generated in small volumes with a wide variety of compositions. A cooperative effort at ANL between Waste Management (WM) and the Chemical Technology Division (CMT) was established to develop, install, and implement a robust treatment operation to handle the majority of such wastes. For this treatment, toxic metals in mixed-waste solutions are precipitated in a semiautomated system using Ca(OH){sub 2} and, for some metals, Na{sub 2}S additions. This step is followed by filtration to remove the precipitated solids. A filtration skid was built that contains several filter types, which can be used as appropriate for a variety of suspended solids. When the supernatant liquid is separated from the toxic-metal solids by decantation and filtration, it will be a low-level waste (LLW) rather than a mixed waste. After passing a Toxicity Characteristic Leaching Procedure (TCLP) test, the solids may also be treated as LLW.

  17. Argonne National Laboratory site environmental report for calendar year 2004.

    Energy Technology Data Exchange (ETDEWEB)

    Golchert, N. W.; Kolzow, R. G.

    2005-09-02

    This report discusses the accomplishments of the environmental protection program at Argonne National Laboratory (ANL) for calendar year 2004. The status of ANL environmental protection activities with respect to compliance with the various laws and regulations is discussed, along with the progress of environmental corrective actions and restoration projects. To evaluate the effects of ANL operations on the environment, samples of environmental media collected on the site, at the site boundary, and off the ANL site were analyzed and compared with applicable guidelines and standards. A variety of radionuclides were measured in air, surface water, on-site groundwater, and bottom sediment samples. In addition, chemical constituents in surface water, groundwater, and ANL effluent water were analyzed. External penetrating radiation doses were measured, and the potential for radiation exposure to off-site population groups was estimated. Results are interpreted in terms of the origin of the radioactive and chemical substances (i.e., natural, fallout, ANL, and other) and are compared with applicable environmental quality standards. A U.S. Department of Energy dose calculation methodology, based on International Commission on Radiological Protection recommendations and the U.S. Environmental Protection Agency's CAP-88 (Clean Air Act Assessment Package-1988) computer code, was used in preparing this report.

  18. Routine environmental reaudit of the Argonne National Laboratory - West

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-04-01

    This report documents the results of the Routine Environmental Reaudit of the Argonne National Laboratory - West (ANL-W), Idaho Falls, Idaho. During this audit, the activities conducted by the audit team included reviews of internal documents and reports from previous audits and assessments; interviews with U.S. Department of Energy (DOE), U.S. Environmental Protection Agency (EPA), State of Idaho Department of Health and Welfare (IDHW), and DOE contractor personnel; and inspections and observations of selected facilities and operations. The onsite portion of the audit was conducted from October 11 to October 22, 1993, by the DOE Office of Environmental Audit (EH-24), located within the Office of Environment, Safety and Health (EH). DOE 5482.1B, "Environment, Safety, and Health Appraisal Program," established the mission of EH-24 to provide comprehensive, independent oversight of Department-wide environmental programs on behalf of the Secretary of Energy. The ultimate goal of EH-24 is enhancement of environmental protection and minimization of risk to public health and the environment. EH-24 accomplishes its mission by conducting systematic and periodic evaluations of the Department's environmental programs within line organizations, and by utilizing supplemental activities that serve to strengthen self-assessment and oversight functions within program, field, and contractor organizations.

  19. Advanced Simulation Framework for Design and Analysis of Space Propulsion Systems Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation proposed here is a computational framework for high performance, high fidelity computational fluid dynamics (CFD) to enable accurate, fast and robust...

  20. Advanced Simulation Framework for Design and Analysis of Space Propulsion Systems Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation proposed here is a high-performance, high-fidelity framework in the computational fluid dynamics (CFD) code called Loci-STREAM to enable accurate,...

  1. High Performance Hybrid RANS-LES Simulation Framework for Turbulent Combusting Flows Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation proposed here is a computational framework for high performance, high fidelity computational fluid dynamics (CFD) to enable accurate, fast and robust...

  2. Argonne Natl Lab receives TeraFLOP Cluster Linux NetworX

    CERN Multimedia

    2002-01-01

    " Linux NetworX announced today it has delivered an Evolocity II (E2) Linux cluster to Argonne National Laboratory that is capable of performing more than one trillion calculations per second (1 teraFLOP)" (1/2 page).

  3. A General Simulation Framework for Supply Chain Modeling: State of the Art and Case Study

    Directory of Open Access Journals (Sweden)

    Antonio Cimino

    2010-03-01

    Full Text Available Nowadays there is a large availability of discrete event simulation software that can be easily used in different domains: from industry to supply chain, from healthcare to business management, from training to complex systems design. Simulation engines of commercial discrete event simulation software use specific rules and logics for simulation time and events management. Difficulties and limitations come up when commercial discrete event simulation software are used for modeling complex real-world systems (i.e., supply chains, industrial plants). The objective of this paper is twofold: first, a state of the art on commercial discrete event simulation software and an overview of discrete event simulation model development using general purpose programming languages are presented; then a Supply Chain Order Performance Simulator (SCOPS, developed in C++) for investigating the inventory management problem along the supply chain under different supply chain scenarios is proposed to readers.
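
    The core of any discrete event simulation engine of the kind discussed here, whether a commercial package or a purpose-built simulator like the C++ SCOPS, is a time-ordered event queue. A minimal sketch in Python follows; the inventory numbers and reorder policy are invented for illustration.

```python
import heapq

class Simulator:
    """Minimal discrete event engine: a time-ordered event queue plus a
    clock, the mechanism simulation engines build their rules on."""
    def __init__(self):
        self.now = 0.0
        self._queue = []
        self._seq = 0                     # tie-breaker for equal times

    def schedule(self, delay, action):
        heapq.heappush(self._queue, (self.now + delay, self._seq, action))
        self._seq += 1

    def run(self, until):
        while self._queue and self._queue[0][0] <= until:
            self.now, _, action = heapq.heappop(self._queue)
            action()

# Toy inventory process: one unit of demand every 2 time units; reorder
# 5 units (lead time 5) whenever stock drops below 3.
sim, stock, log = Simulator(), {"level": 10}, []

def demand():
    stock["level"] -= 1
    log.append((sim.now, stock["level"]))
    if stock["level"] < 3:
        sim.schedule(5.0, replenish)
    sim.schedule(2.0, demand)             # next demand arrival

def replenish():
    stock["level"] += 5

sim.schedule(2.0, demand)
sim.run(until=20.0)
```

    Swapping the demand process, reorder policy, or lead-time distribution is a matter of replacing the scheduled actions, which is the flexibility argument the paper makes for general purpose languages.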

  4. A General Simulation Framework for Supply Chain Modeling: State of the Art and Case Study

    CERN Document Server

    Cimino, Antonio; Mirabelli, Giovanni

    2010-01-01

    Nowadays there is a large availability of discrete event simulation software that can be easily used in different domains: from industry to supply chain, from healthcare to business management, from training to complex systems design. Simulation engines of commercial discrete event simulation software use specific rules and logics for simulation time and events management. Difficulties and limitations come up when commercial discrete event simulation software are used for modeling complex real-world systems (i.e., supply chains, industrial plants). The objective of this paper is twofold: first, a state of the art on commercial discrete event simulation software and an overview of discrete event simulation model development using general purpose programming languages are presented; then a Supply Chain Order Performance Simulator (SCOPS, developed in C++) for investigating the inventory management problem along the supply chain under different supply chain scenarios is proposed to readers.

  5. The Trick Simulation Toolkit: A NASA/Open source Framework for Running Time Based Physics Models

    Science.gov (United States)

    Penn, John M.; Lin, Alexander S.

    2016-01-01

    This paper describes the design and use of the Trick Simulation Toolkit, a simulation development environment for creating high fidelity training and engineering simulations at the NASA Johnson Space Center and many other NASA facilities. It describes Trick's design goals and how the development environment attempts to achieve those goals. It describes how Trick is used in some of the many training and engineering simulations at NASA. Finally, it describes the Trick NASA/Open source project on Github.

  6. Authorized limits for disposal of PCB capacitors from Buildings 361 and 391 at Argonne National Laboratory, Argonne, Illinois.

    Energy Technology Data Exchange (ETDEWEB)

    Cheng, J.-J.; Chen, S.-Y.; Environmental Science Division

    2009-12-22

    This report contains data and analyses to support the approval of authorized release limits for the clearance from radiological control of polychlorinated biphenyl (PCB) capacitors in Buildings 361 and 391 at Argonne National Laboratory, Argonne, Illinois. These capacitors contain PCB oil that must be treated and disposed of as hazardous waste under the Toxic Substances Control Act (TSCA). However, they had been located in radiological control areas where the potential for neutron activation existed; therefore, direct release of these capacitors to a commercial facility for PCB treatment and landfill disposal is not allowable unless authorized release has been approved. Radiological characterization found no loose contamination on the exterior surface of the PCB capacitors; gamma spectroscopy analysis also showed the radioactivity levels of the capacitors were either at or slightly above ambient background levels. As such, conservative assumptions were used to expedite the analyses conducted to evaluate the potential radiation exposures of workers and the general public resulting from authorized release of the capacitors; for example, the maximum averaged radioactivity levels measured for capacitors nearest to the beam lines were assumed for the entire batch of capacitors. This approach overestimated the total activity of each radionuclide identified in the radiological characterization by a factor ranging from 1.4 to 640. On the basis of this conservative assumption, the capacitors were assumed to be shipped from Argonne to the Clean Harbors facility, located in Deer Park, Texas, for incineration and disposal. The Clean Harbors facility is a state-permitted TSCA facility for treatment and disposal of hazardous materials. At this facility, the capacitors are to be shredded and incinerated, with the resulting incineration residue buried in a nearby landfill owned by the company. A variety of receptors that have the potential of receiving radiation exposures were

  7. Characterisation and testing of a prototype 6 × 6 cm² Argonne MCP-PMT

    CERN Document Server

    Cowan, Greig A; Needham, Matthew; Gambetta, Silvia; Eisenhardt, Stephan; McBlane, Neil; Malek, Matthew

    2016-01-01

    The Argonne micro-channel plate photomultiplier tube (MCP-PMT) is an offshoot of the Large Area Pico-second Photo Detector (LAPPD) project, wherein 6 × 6 cm² sized detectors are made at Argonne National Laboratory. Measurements of the properties of these detectors, including gain, time and spatial resolution, dark count rates, cross-talk and sensitivity to magnetic fields are reported. In addition, possible applications of these devices in future neutrino and collider physics experiments are discussed.

  8. Argonne National Laboratory's photooxidation organic mixed-waste treatment system

    Energy Technology Data Exchange (ETDEWEB)

    Shearer, T.L.; Torres, T.; Conner, C. [Argonne National Lab., IL (United States)] [and others]

    1997-12-01

    This paper describes the installation and startup testing of the Argonne National Laboratory-East (ANL-E) photo-oxidation organic mixed-waste treatment system. This system will treat organic mixed (i.e., radioactive and hazardous) waste by oxidizing the organics to carbon dioxide and inorganic salts in an aqueous media. The residue will be treated in the existing radwaste evaporators. The system is installed in the waste management facility at the ANL-E site in Argonne, Illinois.

  9. interThermalPhaseChangeFoam—A framework for two-phase flow simulations with thermally driven phase change

    Directory of Open Access Journals (Sweden)

    Mahdi Nabil

    2016-01-01

    Full Text Available The volume-of-fluid (VOF) approach is a mature technique for simulating two-phase flows. However, VOF simulation of phase-change heat transfer is still in its infancy. Multiple closure formulations have been proposed in the literature, each suited to different applications. While these have enabled significant research advances, few implementations are publicly available, actively maintained, or inter-operable. Here, a VOF solver is presented (interThermalPhaseChangeFoam), which incorporates an extensible framework for phase-change heat transfer modeling, enabling simulation of diverse phenomena in a single environment. The solver employs object-oriented OpenFOAM library features, including Run-Time Type Identification, to enable rapid implementation and run-time selection of phase change and surface tension force models. The solver is packaged with multiple phase change and surface tension closure models, adapted and refined from earlier studies. This code has previously been applied to study wavy film condensation, Taylor flow evaporation, nucleate boiling, and dropwise condensation. Tutorial cases are provided for simulation of horizontal film condensation, smooth and wavy falling film condensation, nucleate boiling, and bubble condensation. Validation and grid sensitivity studies, interfacial transport models, effects of spurious currents from surface tension models, effects of artificial heat transfer due to numerical factors, and parallel scaling performance are described in detail in the Supplemental Material (see Appendix A). By incorporating the framework and demonstration cases into a single environment, users can rapidly apply the solver to study phase-change processes of interest.
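
    The run-time model selection pattern that the solver gets from OpenFOAM's type machinery can be illustrated language-agnostically with a registry sketch. This is a hypothetical miniature, not the solver's code: the model names, the `mass_flux` interface, and both closure formulas below are invented.

```python
# Closure models register themselves by name; the concrete class is
# chosen from the case "dictionary" at run time, so adding a model
# requires no change to the solver loop.
PHASE_CHANGE_MODELS = {}

def register(name):
    def wrap(cls):
        PHASE_CHANGE_MODELS[name] = cls
        return cls
    return wrap

@register("constant")
class ConstantRate:
    """Fixed interfacial mass flux (made-up closure)."""
    def __init__(self, rate=1.0e-3):
        self.rate = rate
    def mass_flux(self, t_interface, t_sat):
        return self.rate

@register("linear")
class LinearSuperheat:
    """Flux proportional to interface superheat (made-up closure)."""
    def __init__(self, coeff=2.0e-4):
        self.coeff = coeff
    def mass_flux(self, t_interface, t_sat):
        return self.coeff * (t_interface - t_sat)

case_dict = {"phaseChangeModel": "linear"}      # as read from case files
model = PHASE_CHANGE_MODELS[case_dict["phaseChangeModel"]]()
flux = model.mass_flux(t_interface=375.0, t_sat=373.15)
```

    The payoff of this design is the one the abstract claims: diverse phenomena (film condensation, nucleate boiling, and so on) run in a single environment by swapping the closure entry in the case dictionary.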

  10. The Trick Simulation Toolkit: A NASA/Opensource Framework for Running Time Based Physics Models

    Science.gov (United States)

    Penn, John M.

    2016-01-01

    The Trick Simulation Toolkit is a simulation development environment used to create high fidelity training and engineering simulations at the NASA Johnson Space Center and many other NASA facilities. Its purpose is to generate a simulation executable from a collection of user-supplied models and a simulation definition file. For each Trick-based simulation, Trick automatically provides job scheduling, numerical integration, the ability to write and restore human readable checkpoints, data recording, interactive variable manipulation, a run-time interpreter, and many other commonly needed capabilities. This allows simulation developers to concentrate on their domain expertise and the algorithms and equations of their models. Also included in Trick are tools for plotting recorded data and various other supporting utilities and libraries. Trick is written in C/C++ and Java and supports both Linux and MacOSX computer operating systems. This paper describes Trick's design and use at NASA Johnson Space Center.
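
    The job-scheduling-plus-integration loop that such an executive provides can be caricatured in a few lines. This is a hypothetical miniature, not Trick's API: the `run` function, the `(period, fn)` job tuples, and the falling-mass model are all invented, and Trick supplies several integrators beyond the forward-Euler step used here.

```python
def run(sim_time, dt, state, deriv, jobs):
    """Call each scheduled job at its own period while integrating the
    state derivatives with a fixed step. jobs: list of (period, fn)."""
    t = 0.0
    next_call = [0.0] * len(jobs)
    for _ in range(round(sim_time / dt)):
        for k, (period, fn) in enumerate(jobs):
            if t >= next_call[k] - 1e-12:     # tolerate float drift
                fn(t, state)
                next_call[k] += period
        # Forward-Euler integration step
        d = deriv(t, state)
        for key in state:
            state[key] += dt * d[key]
        t += dt
    return t, state

# Falling point mass: h' = v, v' = -g
state = {"h": 100.0, "v": 0.0}
deriv = lambda t, s: {"h": s["v"], "v": -9.81}
samples = []                                   # data-recording "job"
jobs = [(0.5, lambda t, s: samples.append((t, s["h"])))]
t_end, state = run(sim_time=2.0, dt=0.01, state=state, deriv=deriv, jobs=jobs)
```

    Separating the model (`deriv`) from the executive (`run`) is the design point the paper makes: developers supply equations and jobs, and the toolkit handles scheduling, integration, and recording.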

  11. Analysis of the Argonne distance tabletop exercise method.

    Energy Technology Data Exchange (ETDEWEB)

    Tanzman, E. A.; Nieves, L. A.; Decision and Information Sciences

    2008-02-14

    The purpose of this report is to summarize and evaluate the Argonne Distance Tabletop Exercise (DISTEX) method. DISTEX is intended to facilitate multi-organization, multi-objective tabletop emergency response exercises that permit players to participate from their own facility's incident command center. This report is based on experience from the method's first use, during the FluNami 2007 exercise, which took place from September 19 to October 17, 2007. FluNami 2007 exercised the response of local public health officials and hospitals to a hypothetical pandemic flu outbreak. The underlying purpose of the DISTEX method is to make tabletop exercising more effective and more convenient for playing organizations. It combines elements of traditional tabletop exercising, such as scenario discussions and scenario injects, with distance learning technologies. This distance-learning approach also allows playing organizations to include a broader range of staff in the exercise. An average of 81.25 persons participated in each weekly webcast session from all playing organizations combined. The DISTEX method required development of several components. The exercise objectives were based on the U.S. Department of Homeland Security's Target Capabilities List. The ten playing organizations included four public health departments and six hospitals in the Chicago area. An extent-of-play agreement identified the objectives applicable to each organization. A scenario was developed to drive the exercise over its five-week life. Weekly problem-solving task sets were designed to address objectives that could not be addressed fully during webcast sessions, as well as to involve additional playing organization staff. Injects were developed to drive play between webcast sessions, and, in some cases, featured mock media stories based in part on player actions as identified from the problem-solving tasks. The weekly 90-minute webcast sessions were discussions among the playing organizations

  12. Availability-based simulation and optimization modeling framework for open-pit mine truck allocation under dynamic constraints

    Institute of Scientific and Technical Information of China (English)

    Mena Rodrigo; Zio Enrico; Kristjanpoller Fredy; Arata Adolfo

    2013-01-01

    We present a novel system productivity simulation and optimization modeling framework in which equipment availability is a variable in the expected productivity function of the system. The framework is used for allocating trucks by route according to their operating performances in a truck-shovel system of an open-pit mine, so as to maximize the overall productivity of the fleet. We implement the framework in an originally designed and specifically developed simulator-optimizer software tool. We make an application on a real open-pit mine case study, taking into account the stochasticity of the equipment behavior and environment. The total system production values obtained with and without considering the equipment reliability, availability and maintainability (RAM) characteristics are compared. We show that by taking into account the truck and shovel RAM aspects, we can maximize the total production of the system and obtain specific information on the production availability and productivity of its components.
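
    Making availability a variable in the expected productivity function can be made concrete with a small sketch: steady-state availability is derived from MTBF and MTTR, and an allocator weights each truck's haul rate by it. All numbers and names are invented, and the one-shot greedy rule below stands in for the paper's simulation-optimization loop purely for illustration.

```python
def availability(mtbf, mttr):
    """Steady-state availability from the truck's RAM parameters."""
    return mtbf / (mtbf + mttr)

def allocate(trucks, routes):
    """Greedy allocation: send each truck to the route where its
    availability-weighted output is highest, honoring route slots."""
    slots = {r: cap for r, (cap, _) in routes.items()}
    plan, total = {}, 0.0
    for name, (rate, mtbf, mttr) in trucks.items():
        a = availability(mtbf, mttr)
        best = max((r for r in routes if slots[r] > 0),
                   key=lambda r: a * rate * routes[r][1])
        slots[best] -= 1
        plan[name] = best
        total += a * rate * routes[best][1]   # expected tonnes per shift
    return plan, total

routes = {"north": (2, 1.0), "south": (2, 0.8)}   # (slots, haul factor)
trucks = {"T1": (100.0, 90.0, 10.0),              # (rate, MTBF, MTTR)
          "T2": (100.0, 50.0, 50.0),
          "T3": (120.0, 80.0, 20.0)}
plan, expected = allocate(trucks, routes)
```

    Dropping the availability factor would send the unreliable T2 to the best route on paper while overstating fleet output, which is exactly the comparison the study quantifies with its stochastic simulator.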

  13. A new numerical framework for simulating the control of weather and climate on the evolution of soil-mantled hillslopes

    Science.gov (United States)

    Bovy, Benoît; Braun, Jean; Demoulin, Alain

    2016-06-01

    We present a new numerical framework for simulating short- to long-term hillslope evolution. This modeling framework, to which we have given the name CLICHE (CLImate Control on Hillslope Evolution), aims to better capture the control of climate on soil dynamics. It allows the use of realistic forcing that involves, through a specific time discretization scheme, the variability of both temperature and precipitation at time scales ranging from daily rainfall events to the climatic oscillations of the Quaternary, including seasonal variability. Two simple models of soil temperature and soil water balance establish the link between the climatic inputs and derived quantities that enter the computation of the soil flux, such as the surface water discharge and the depth of the non-frozen soil layer. Using this framework together with a multi-process parameterization of soil transport, we apply an original method to calculate hillslope effective diffusivity as a function of climate. This allows us to demonstrate the ability of the model to simulate observed rates of hillslope erosion under different climates (cold and temperate) with a single set of parameter values. Numerical experiments furthermore suggest a potential high peak of sediment transport on hillslopes during the glacial-interglacial transitions of the Quaternary. We finally discuss the need to improve the parameterization of the soil production and transport processes in order to explicitly account for other key controlling factors that are also climate-sensitive, such as biological activity.
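
    The idea of a climate-dependent effective diffusivity can be sketched with a one-dimensional linear hillslope diffusion model, dz/dt = d/dx(D dz/dx), where D is re-evaluated each step from a made-up oscillating climate forcing. This illustrates only the generic diffusion equation, not CLICHE's multi-process parameterization.

```python
import math

def evolve(z, dx, dt, steps, diffusivity):
    """Explicit finite-difference integration of dz/dt = d/dx(D dz/dx),
    with D supplied per step so a climate time series can modulate
    transport efficiency. Endpoints are held fixed (base level)."""
    z = list(z)
    n = len(z)
    for step in range(steps):
        D = diffusivity(step)            # climate-controlled diffusivity
        flux = [-D * (z[i + 1] - z[i]) / dx for i in range(n - 1)]
        for i in range(1, n - 1):
            z[i] -= dt * (flux[i] - flux[i - 1]) / dx
    return z

# Symmetric ridge; D oscillates to mimic a glacial/interglacial contrast
z0 = [0, 2, 4, 6, 8, 6, 4, 2, 0]
zf = evolve(z0, dx=10.0, dt=1.0, steps=500,
            diffusivity=lambda s: 0.05 * (1.5 + math.sin(2 * math.pi * s / 100)))
```

    The explicit scheme is stable here because dt*D/dx² stays far below 1/2; making `diffusivity` a function of the step is the hook where a temperature/precipitation record would enter.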

  14. Atomistic simulation studies on the dynamics and thermodynamics of nonpolar molecules within the zeolite imidazolate framework-8.

    Science.gov (United States)

    Pantatosaki, Evangelia; Pazzona, Federico G; Megariotis, Gregory; Papadopoulos, George K

    2010-02-25

    Statistical-mechanics-based simulation studies at the atomistic level of argon (Ar), methane (CH₄), and hydrogen (H₂) sorbed in the zeolite imidazolate framework-8 (ZIF-8) are reported. ZIF-8 is a product of a special kind of chemical process, recently termed reticular synthesis, which has generated a class of materials of critical importance as molecular binders. In this work, we explore the mechanisms that govern the sorption thermodynamics and kinetics of nonpolar sorbates possessing different sizes and strengths of interaction with the metal-organic framework to understand the outstanding properties of this novel class of sorbents, as revealed by experiments published elsewhere. For this purpose, we have developed an in-house modeling procedure involving calculations of sorption isotherms, partial internal energies, various probability density functions, and molecular dynamics for the simulation of the sorbed phase over a wide range of occupancies and temperatures within a digitally reconstructed unit cell of ZIF-8. The results showed that sorbates perceive a marked energetic inhomogeneity within the atomic framework of the metal-organic material under study, resulting in free energy barriers that give rise to inflections in the sorption isotherms and guide the dynamics of guest molecules.
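
    The link between site-level energetic inhomogeneity and isotherm inflections can be reproduced with a toy grand-canonical Monte Carlo model: a non-interacting lattice gas with two invented site energies, far simpler than the atomistic force fields used in the study, but showing the same qualitative effect of deep sites filling before shallow ones.

```python
import math
import random

def gcmc_loading(mu, energies, beta=1.0, sweeps=50000, seed=1):
    """Grand-canonical Metropolis MC for a non-interacting lattice gas:
    toggling a random site costs E_i - mu on insertion (the negative on
    deletion). Returns the average occupied fraction after burn-in."""
    rng = random.Random(seed)
    occ = [False] * len(energies)
    filled, samples = 0, 0.0
    for sweep in range(sweeps):
        i = rng.randrange(len(energies))
        d_omega = (energies[i] - mu) if not occ[i] else -(energies[i] - mu)
        if d_omega <= 0 or rng.random() < math.exp(-beta * d_omega):
            occ[i] = not occ[i]
            filled += 1 if occ[i] else -1
        if sweep >= sweeps // 2:          # discard first half as burn-in
            samples += filled
    return samples / (sweeps - sweeps // 2) / len(energies)

# Half the sites are "deep" (-3), half "shallow" (-1), a caricature of
# the framework's energetic inhomogeneity.
energies = [-3.0] * 50 + [-1.0] * 50
theta = [gcmc_loading(mu, energies) for mu in (-4.0, -2.0, 0.0)]
```

    As the chemical potential rises, loading climbs in two stages, which is the lattice-gas analogue of the isotherm inflections the atomistic simulations trace to free energy barriers.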

  15. Experimental results obtained with the positron-annihilation- radiation telescope of the Toulouse-Argonne collaboration

    Energy Technology Data Exchange (ETDEWEB)

    Naya, J.E.; von Ballmoos, P.; Albernhe, F.; Vedrenne, G. [Centre d'Etude Spatiale des Rayonnements, Toulouse (France)]; Smither, R.K.; Faiz, M.; Fernandez, P.B.; Graber, T. [Argonne National Lab., IL (United States)]

    1995-10-01

    We present laboratory measurements obtained with a ground-based prototype of a focusing positron-annihilation-radiation telescope developed by the Toulouse-Argonne collaboration. This balloon-borne telescope has been designed to collect 511-keV photons with an extremely low instrumental background. The telescope features a Laue diffraction lens and a detector module containing a small array of germanium detectors. It will provide a combination of high spatial and energy resolution (15 arc sec and 2 keV, respectively) with a sensitivity of ~3×10⁻⁵ photons cm⁻²s⁻¹. These features will allow us to resolve a possible narrow 511-keV line both energetically and spatially within a Galactic center "microquasar" or in other broad-class annihilators. The ground-based prototype consists of a crystal lens holding small cubes of diffracting germanium crystals and a 3×3 germanium array that detects the concentrated beam in the focal plane. Measured performances of the instrument at different line energies (511 keV and 662 keV) are presented and compared with Monte Carlo simulations. The advantages of a 3×3 Ge-detector array with respect to a standard monoblock detector have been confirmed. The results obtained in the laboratory have strengthened interest in a crystal-diffraction telescope, offering new perspectives for the future of experimental gamma-ray astronomy.

  16. Experimental results obtained with the positron-annihilation-radiation telescope of the Toulouse-Argonne collaboration

    Energy Technology Data Exchange (ETDEWEB)

    Naya, J.E. [Toulouse-3 Univ., 31 (France). Centre d'Etude Spatiale des Rayonnements]; Ballmoos, P. von [Toulouse-3 Univ., 31 (France). Centre d'Etude Spatiale des Rayonnements]; Smither, R.K. [Argonne National Lab., IL (United States). Advanced Photon Source Div.]; Faiz, M. [Argonne National Lab., IL (United States). Advanced Photon Source Div.]; Fernandez, P.B. [Argonne National Lab., IL (United States). Advanced Photon Source Div.]; Graber, T. [Argonne National Lab., IL (United States). Advanced Photon Source Div.]; Albernhe, F. [Toulouse-3 Univ., 31 (France). Centre d'Etude Spatiale des Rayonnements]; Vedrenne, G. [Toulouse-3 Univ., 31 (France). Centre d'Etude Spatiale des Rayonnements]

    1996-04-11

    We present laboratory measurements obtained with a ground-based prototype of the focusing positron-annihilation-radiation telescope developed by the Toulouse-Argonne collaboration. This instrument has been designed to collect 511-keV photons from astrophysical sources when operating as a balloon-borne observatory. The ground-based prototype consists of a crystal lens holding small cubes of diffracting germanium crystals and a 3×3 germanium array that detects the concentrated beam in the focal plane. Measured performances of the instrument at different line energies (511 and 662 keV) are presented and compared with Monte Carlo simulations; also the advantages of combining the lens with a detector array are discussed. The results obtained in the laboratory have strengthened interest in a crystal-diffraction telescope: the balloon instrument will provide a combination of high spatial and energy resolution (15 arc sec and 2 keV, respectively) with an extremely low instrumental background, resulting in a sensitivity of ~3×10⁻⁵ photons cm⁻²s⁻¹. These features will allow us to resolve a possible narrow 511-keV line both energetically and spatially within a Galactic center microquasar or in other broad-class annihilators.

  17. Moose: An Open-Source Framework to Enable Rapid Development of Collaborative, Multi-Scale, Multi-Physics Simulation Tools

    Science.gov (United States)

    Slaughter, A. E.; Permann, C.; Peterson, J. W.; Gaston, D.; Andrs, D.; Miller, J.

    2014-12-01

    The Idaho National Laboratory (INL)-developed Multiphysics Object Oriented Simulation Environment (MOOSE; www.mooseframework.org), is an open-source, parallel computational framework for enabling the solution of complex, fully implicit multiphysics systems. MOOSE provides a set of computational tools that scientists and engineers can use to create sophisticated multiphysics simulations. Applications built using MOOSE have computed solutions for chemical reaction and transport equations, computational fluid dynamics, solid mechanics, heat conduction, mesoscale materials modeling, geomechanics, and others. To facilitate the coupling of diverse and highly-coupled physical systems, MOOSE employs the Jacobian-free Newton-Krylov (JFNK) method when solving the coupled nonlinear systems of equations arising in multiphysics applications. The MOOSE framework is written in C++, and leverages other high-quality, open-source scientific software packages such as LibMesh, Hypre, and PETSc. MOOSE uses a "hybrid parallel" model which combines both shared memory (thread-based) and distributed memory (MPI-based) parallelism to ensure efficient resource utilization on a wide range of computational hardware. MOOSE-based applications are inherently modular, which allows for simulation expansion (via coupling of additional physics modules) and the creation of multi-scale simulations. Any application developed with MOOSE supports running (in parallel) any other MOOSE-based application. Each application can be developed independently, yet easily communicate with other applications (e.g., conductivity in a slope-scale model could be a constant input, or a complete phase-field micro-structure simulation) without additional code being written. This method of development has proven effective at INL and expedites the development of sophisticated, sustainable, and collaborative simulation tools.
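
    The key JFNK idea is that the Jacobian is never formed: only products J*v are needed, and these can be approximated by a finite-difference directional derivative. A minimal Python sketch of that trick (illustrative only; MOOSE itself is C++ and pairs the matrix-free product with a preconditioned Krylov solver such as GMRES, whereas this toy rebuilds a tiny 2x2 Jacobian from matvecs and solves it directly):

```python
def F(u):
    """Toy nonlinear system: x^2 + y^2 = 4 and x = y (root at x = y = sqrt(2))."""
    x, y = u
    return [x * x + y * y - 4.0, x - y]

def jv(F, u, v, eps=1e-7):
    """Jacobian-vector product without forming J: J*v ~ (F(u + eps*v) - F(u))/eps."""
    Fu = F(u)
    Fp = F([ui + eps * vi for ui, vi in zip(u, v)])
    return [(a - b) / eps for a, b in zip(Fp, Fu)]

def newton_fd(F, u, tol=1e-10, max_iter=20):
    n = len(u)
    for _ in range(max_iter):
        Fu = F(u)
        if max(abs(f) for f in Fu) < tol:
            break
        # build J column-by-column from matvecs (a Krylov method avoids even this)
        J = [[0.0] * n for _ in range(n)]
        for j in range(n):
            e = [0.0] * n
            e[j] = 1.0
            col = jv(F, u, e)
            for i in range(n):
                J[i][j] = col[i]
        # solve the 2x2 system J s = -F by Cramer's rule (toy problem size only)
        det = J[0][0] * J[1][1] - J[0][1] * J[1][0]
        s0 = (-Fu[0] * J[1][1] + Fu[1] * J[0][1]) / det
        s1 = (-J[0][0] * Fu[1] + J[1][0] * Fu[0]) / det
        u = [u[0] + s0, u[1] + s1]
    return u

root = newton_fd(F, [1.0, 1.0])
```

    The directional-derivative approximation in `jv` is what lets strongly coupled physics be solved fully implicitly without anyone writing analytic Jacobians.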

  18. A Generalized Framework for Different Drought Indices: Testing its Suitability in a Simulation of the last two Millennia for Europe

    Science.gov (United States)

    Raible, Christoph C.; Baerenbold, Oliver; Gomez-Navarro, Juan Jose

    2016-04-01

    Over the past decades, different drought indices have been suggested in the literature. This study tackles the problem of how to characterize drought by defining a general framework and proposing a generalized family of drought indices that is flexible regarding the use of different water balance models. The sensitivity of various indices and their skill in representing drought conditions are evaluated using a regional model simulation in Europe spanning the last two millennia as a test bed. The framework combines an exponentially damped memory with a normalization method based on quantile mapping. Both approaches are more robust and physically meaningful compared to the existing methods used to define drought indices. Still, the framework is flexible with respect to the water balance, enabling users to adapt the index formulation to the data availability of different locations. Based on the framework, indices with water balances of differing complexity are compared with each other. The comparison shows that a drought index considering only precipitation in the water balance is sufficient for Western to Central Europe. However, in the Mediterranean, temperature effects via evapotranspiration need to be considered in order to produce meaningful indices representative of actual water deficit. Similarly, our results indicate that in north-eastern Europe and Scandinavia, snow and runoff effects need to be considered in the index definition to obtain accurate results.
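
    The two building blocks named in the abstract, an exponentially damped memory and quantile-mapping normalization, can be sketched as follows. This is a loose reading of the abstract, not the authors' code: the memory scale `tau` and the plotting positions are our assumptions.

```python
import math
from statistics import NormalDist

def drought_index(balance, tau=6.0):
    """Drought index sketch: exponentially damped memory of a monthly water
    balance (e.g. P - PET), quantile-mapped to a standard normal score."""
    w = math.exp(-1.0 / tau)
    memory, m = [], 0.0
    for b in balance:
        m = w * m + b              # damped memory of past water-balance anomalies
        memory.append(m)
    # quantile mapping: empirical plotting positions -> standard normal scores
    order = sorted(range(len(memory)), key=lambda i: memory[i])
    nd, n = NormalDist(), len(memory)
    index = [0.0] * n
    for rank, i in enumerate(order):
        index[i] = nd.inv_cdf((rank + 0.5) / n)
    return index

# a wet-dry-wet toy series; the most negative index lands in the dry spell
index = drought_index([5, 5, 5, -10, -10, -10, 5, 5, 5, 5, 5, 5])
```

    Swapping in a richer water balance (adding evapotranspiration, snow, or runoff terms) only changes what is fed into `balance`, which is the flexibility the framework is after.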

  19. PAMS - A New Collaborative Framework for Agent-Based Simulation of Complex Systems

    OpenAIRE

    Nguyen Trong, Khanh; Marilleau, Nicolas; Vinh Ho, Tuong

    2008-01-01

    International audience; Major research efforts in the domain of complex systems are interdisciplinary, collaborative and geographically distributed. The purpose of our work is to explore a new collaborative approach that facilitates scientists' interactions during the modelling and simulation process. The originality of the presented approach is to consider models and simulators as a board of the collaboration: a shared object manipulated by a group of scientists. Agent-based simulations are powerf...

  20. Field-wide flow simulation in fractured porous media within lattice Boltzmann framework

    Science.gov (United States)

    Benamram, Z.; Tarakanov, A.; Nasrabadi, H.; Gildin, E.

    2016-10-01

    In this paper, a generalized lattice Boltzmann model for simulating fluid flow in porous media at the representative volume element scale is extended towards applications of hydraulically and naturally fractured reservoirs. The key element within the model is the development of boundary conditions for a vertical well and horizontal fracture with minimal node usage. In addition, the governing non-dimensional equations are derived and a new set of dimensionless numbers are presented for the simulation of a fractured reservoir system. Homogeneous and heterogeneous vertical well and fracture systems are simulated and verified against commercial reservoir simulation suites. Results are in excellent agreement with analytical and finite difference solutions.
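
    The paper's generalized REV-scale fracture model is well beyond a snippet, but the stream-and-collide cycle at the heart of any lattice Boltzmann code can be shown with a minimal D1Q3 diffusion example (standard BGK relaxation, our illustration, not the authors' model):

```python
def lbm_diffusion(rho0, tau=0.9, steps=50):
    """Minimal D1Q3 lattice Boltzmann (BGK) diffusion on a periodic 1-D lattice.
    Velocities c = {0, +1, -1}; weights w = {2/3, 1/6, 1/6}; D = (tau - 0.5)/3."""
    n = len(rho0)
    w = [2.0 / 3.0, 1.0 / 6.0, 1.0 / 6.0]
    f = [[w[i] * r for r in rho0] for i in range(3)]  # start at equilibrium
    for _ in range(steps):
        rho = [f[0][x] + f[1][x] + f[2][x] for x in range(n)]
        # collide: relax every population toward its equilibrium f_i^eq = w_i * rho
        for i in range(3):
            for x in range(n):
                f[i][x] += (w[i] * rho[x] - f[i][x]) / tau
        # stream: shift the +1 and -1 populations (periodic boundaries)
        f[1] = [f[1][(x - 1) % n] for x in range(n)]
        f[2] = [f[2][(x + 1) % n] for x in range(n)]
    return [f[0][x] + f[1][x] + f[2][x] for x in range(n)]

rho0 = [1.0 if x == 8 else 0.0 for x in range(16)]
out = lbm_diffusion(rho0)
```

    Wells and fractures enter such a scheme purely through extra boundary rules on selected nodes, which is why the paper's "minimal node usage" boundary conditions matter for field-wide grids.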

  1. Experimental spectra analysis in THM with the help of simulation based on Geant4 framework

    CERN Document Server

    Li, Chengbo; Zhou, Shuhua; Fu, Yuanyong; Zhou, Jing; Meng, Qiuying; Jiang, Zongjun; Wang, Xiaolian

    2014-01-01

    The Coulomb barrier and electron screening cause difficulties in directly measuring nuclear reaction cross sections of charged particles at astrophysical energies. The Trojan-horse method has been introduced as a powerful indirect tool for overcoming these difficulties. In order to understand experimental spectra better, Geant4 is employed to simulate the method for the first time. Validity and reliability of the simulation are examined by comparing the experimental data with simulated results. The Geant4 simulation can give useful information for understanding the experimental spectra better in data analysis and is beneficial to the design of future related experiments.
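
    Geant4 itself is a C++ toolkit; as a toy stand-in for one effect such a simulation models, the smearing of a narrow line by finite detector resolution can be illustrated with a few lines of Monte Carlo (the 2-keV FWHM below is a hypothetical value, not taken from this experiment):

```python
import random

def smear_line(energy_kev, fwhm_kev, n_events, seed=42):
    """Toy Monte Carlo: smear a monoenergetic line with a Gaussian detector
    response, converting FWHM to sigma (FWHM = 2*sqrt(2*ln 2)*sigma)."""
    rng = random.Random(seed)
    sigma = fwhm_kev / 2.3548
    return [rng.gauss(energy_kev, sigma) for _ in range(n_events)]

spectrum = smear_line(511.0, 2.0, 20000)
```

    A real Geant4 run would transport each photon through the detector geometry first; the Gaussian here only mimics the final resolution broadening.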

  2. MASADA: A Modeling and Simulation Automated Data Analysis framework for continuous data-intensive validation of simulation models

    CERN Document Server

    Foguelman, Daniel Jacob; The ATLAS collaboration

    2016-01-01

    Complex networked computer systems are usually subjected to upgrades and enhancements on a continuous basis. Modeling and simulation of such systems helps with guiding their engineering processes, in particular when testing candidate design alternatives directly on the real system is not an option. Models are built and simulation exercises are run guided by specific research and/or design questions. A vast amount of operational conditions for the real system need to be assumed in order to focus on the relevant questions at hand. A typical boundary condition for computer systems is the exogenously imposed workload. Meanwhile, in typical projects huge amounts of monitoring information are logged and stored with the purpose of studying the system’s performance in search for improvements. Research questions also change as a system's operational conditions vary throughout its lifetime. This context poses many challenges to determine the validity of simulation models. As the behavioral empirical base of the sys...
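
    Validating a simulation model against logged monitoring data ultimately means comparing distributions of observed and simulated quantities. One generic way to quantify such a comparison (our illustration, not necessarily MASADA's own metric) is the two-sample Kolmogorov-Smirnov distance:

```python
def ks_statistic(sample_a, sample_b):
    """Two-sample Kolmogorov-Smirnov distance: the maximum gap between the
    two empirical CDFs, computed by a single merge pass over sorted data."""
    a, b = sorted(sample_a), sorted(sample_b)
    i = j = 0
    d = 0.0
    while i < len(a) or j < len(b):
        # next distinct value across both samples
        if j >= len(b) or (i < len(a) and a[i] <= b[j]):
            x = a[i]
        else:
            x = b[j]
        while i < len(a) and a[i] <= x:  # consume ties in both samples
            i += 1
        while j < len(b) and b[j] <= x:
            j += 1
        d = max(d, abs(i / len(a) - j / len(b)))
    return d
```

    A small distance between, say, logged and simulated response times supports the model's validity under the workload being replayed; a drifting distance over a system's lifetime signals that the model needs recalibration.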

  4. The Flatworld Simulation Control Architecture (FSCA): A Framework for Scalable Immersive Visualization Systems

    Science.gov (United States)

    2004-12-01

    handling using the X10 home automation protocol. Each 3D graphics client renders its scene according to an assigned virtual camera position. By having...control protocol. DMX is a versatile and robust framework which overcomes limitations of the X10 home automation protocol which we are currently using

  5. An innovative strategy in evaluation: using a student engagement framework to evaluate a role-based simulation.

    Science.gov (United States)

    Smith, Morgan; Warland, Jane; Smith, Colleen

    2012-03-01

    Online role-play has the potential to actively engage students in authentic learning experiences and help develop their clinical reasoning skills. However, evaluation of student learning for this kind of simulation focuses mainly on the content and outcome of learning, rather than on the process of learning through student engagement. This article reports on the use of a student engagement framework to evaluate an online role-play offered as part of a course in Bachelor of Nursing and Bachelor of Midwifery programs. Instruments that measure student engagement to date have targeted large numbers of students at program and institutional levels, rather than at the level of a specific learning activity. Although the framework produced some useful findings for evaluation purposes, further refinement of the questions is required to be certain that deep learning results from the engagement that occurs with course-level learning initiatives.

  6. Role-playing simulation as an educational tool for health care personnel: developing an embedded assessment framework.

    Science.gov (United States)

    Libin, Alexander; Lauderdale, Manon; Millo, Yuri; Shamloo, Christine; Spencer, Rachel; Green, Brad; Donnellan, Joyce; Wellesley, Christine; Groah, Suzanne

    2010-04-01

    Simulation- and video game-based role-playing techniques have been proven effective in changing behavior and enhancing positive decision making in a variety of professional settings, including education, the military, and health care. Although the need for developing assessment frameworks for learning outcomes has been clearly defined, there is a significant gap between the variety of existing multimedia-based instruction and technology-mediated learning systems and the number of reliable assessment algorithms. This study, based on a mixed methodology research design, aims to develop an embedded assessment algorithm, a Knowledge Assessment Module (NOTE), to capture both user interaction with the educational tool and knowledge gained from the training. The study is regarded as the first step in developing an assessment framework for a multimedia educational tool for health care professionals, Anatomy of Care (AOC), that utilizes Virtual Experience Immersive Learning Simulation (VEILS) technology. Ninety health care personnel of various backgrounds took part in online AOC training, choosing from five possible scenarios presenting difficult situations of everyday care. The results suggest that although the simulation-based training tool demonstrated partial effectiveness in improving learners' decision-making capacity, a differential learner-oriented approach might be more effective and capable of synchronizing educational efforts with identifiable relevant individual factors such as sociobehavioral profile and professional background.

  7. Punk Rock Fish: Applying a Conceptual Framework of Simulations in a High School Science Classroom

    Science.gov (United States)

    Helms, Samuel

    2009-01-01

    Like other learning tools, simulations benefit from an instructional context. To use Gagne and Briggs' model as an example, the learner should first be reminded of the prerequisite learnings, informed of the objectives, and his/her attention gained before being presented with the learning material. Research regarding simulations in education…

  8. A Modeling Framework for Supply Chain Simulation: Opportunities for Improved Decision Making

    NARCIS (Netherlands)

    Zee, van der D.J.; Vorst, van der J.G.A.J.

    2005-01-01

    Owing to its inherent modeling flexibility, simulation is often regarded as the proper means for supporting decision making on supply chain design. The ultimate success of supply chain simulation, however, is determined by a combination of the analyst's skills, the chain members' involvement, and th

  9. A modeling framework for supply chain simulation : opportunities for improved decision-making

    NARCIS (Netherlands)

    van der Zee, D.J.; van der Vorst, J.G.A.J.

    2005-01-01

    Owing to its inherent modeling flexibility, simulation is often regarded as the proper means for supporting decision making on supply chain design. The ultimate success of supply chain simulation, however, is determined by a combination of the analyst's skills, the chain members' involvement, and th

  10. Authorized limits for disposal of PCB capacitors from Buildings 361 and 391 at Argonne National Laboratory, Argonne, Illinois.

    Energy Technology Data Exchange (ETDEWEB)

    Cheng, J.-J.; Chen, S.-Y.; Environmental Science Division

    2009-12-22

    This report contains data and analyses to support the approval of authorized release limits for the clearance from radiological control of polychlorinated biphenyl (PCB) capacitors in Buildings 361 and 391 at Argonne National Laboratory, Argonne, Illinois. These capacitors contain PCB oil that must be treated and disposed of as hazardous waste under the Toxic Substances Control Act (TSCA). However, they had been located in radiological control areas where the potential for neutron activation existed; therefore, direct release of these capacitors to a commercial facility for PCB treatment and landfill disposal is not allowable unless authorized release has been approved. Radiological characterization found no loose contamination on the exterior surface of the PCB capacitors; gamma spectroscopy analysis also showed the radioactivity levels of the capacitors were either at or slightly above ambient background levels. As such, conservative assumptions were used to expedite the analyses conducted to evaluate the potential radiation exposures of workers and the general public resulting from authorized release of the capacitors; for example, the maximum averaged radioactivity levels measured for capacitors nearest to the beam lines were assumed for the entire batch of capacitors. This approach overestimated the total activity of each radionuclide identified in the radiological characterization by a factor ranging from 1.4 to 640. On the basis of this conservative assumption, the capacitors were assumed to be shipped from Argonne to the Clean Harbors facility, located in Deer Park, Texas, for incineration and disposal. The Clean Harbors facility is a state-permitted TSCA facility for treatment and disposal of hazardous materials. At this facility, the capacitors are to be shredded and incinerated with the resulting incineration residue buried in a nearby landfill owned by the company. A variety of receptors that have the potential of receiving radiation exposures were

  11. Status on the Development of a Modeling and Simulation Framework for the Economic Assessment of Nuclear Hybrid Energy Systems

    Energy Technology Data Exchange (ETDEWEB)

    Bragg-Sitton, Shannon Michelle [Idaho National Lab. (INL), Idaho Falls, ID (United States); Rabiti, Cristian [Idaho National Lab. (INL), Idaho Falls, ID (United States); Kinoshita, Robert Arthur [Idaho National Lab. (INL), Idaho Falls, ID (United States); Kim, Jong Suk [Idaho National Lab. (INL), Idaho Falls, ID (United States); Deason, Wesley Ray [Idaho National Lab. (INL), Idaho Falls, ID (United States); Boardman, Richard Doin [Idaho National Lab. (INL), Idaho Falls, ID (United States); Garcia, Humberto E. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-09-01

    An effort to design and build a modeling and simulation framework to assess the economic viability of Nuclear Hybrid Energy Systems (NHES) was undertaken in fiscal year 2015 (FY15). The purpose of this report is to document the various tasks associated with the development of such a framework and to provide a status on its progress. Several tasks have been accomplished. First, starting from a simulation strategy, a rigorous mathematical formulation has been achieved in which the economic optimization of a Nuclear Hybrid Energy System is presented as a constrained robust (under uncertainty) optimization problem. Some possible algorithms for the solution of the optimization problem are presented. A variation of the Simultaneous Perturbation Stochastic Approximation algorithm has been implemented in RAVEN and preliminary tests have been performed. The development of the software infrastructure to support the simulation of the whole NHES has also moved forward. The coupling between RAVEN and an implementation of the Modelica language (OpenModelica) has been implemented, migrated under several operating systems and tested using an adapted model of a desalination plant. In particular, this exercise was focused on testing the coupling of the different code systems; testing parallel, computationally expensive simulations on the INL cluster; and providing a proof of concept for the possibility of using surrogate models to represent the different NHES subsystems. Another important step was the porting of the RAVEN code under the Windows™ operating system. This accomplishment makes RAVEN compatible with the development environment that is being used for dynamic simulation of NHES components. A very simplified model of a NHES on the electric market has been built in RAVEN to confirm expectations on the analysis capability of RAVEN to provide insight into system economics and to test the capability of RAVEN to identify limit surfaces even for stochastic constraints. This
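
    The abstract mentions a variation of the Simultaneous Perturbation Stochastic Approximation (SPSA) algorithm. A textbook SPSA loop, which estimates an entire gradient from just two loss evaluations per step, looks roughly like this (standard gain schedules; the parameter values are illustrative, not RAVEN's):

```python
import random

def spsa_minimize(loss, theta, iters=2000, a=0.1, c=0.1, A=20.0, seed=7):
    """SPSA sketch: perturb all coordinates at once with a random +-1 vector,
    then form a gradient estimate from two loss evaluations."""
    rng = random.Random(seed)
    theta = list(theta)
    for k in range(iters):
        ak = a / (k + 1 + A) ** 0.602   # step-size gain schedule
        ck = c / (k + 1) ** 0.101       # perturbation-size gain schedule
        delta = [rng.choice((-1.0, 1.0)) for _ in theta]
        up = [t + ck * d for t, d in zip(theta, delta)]
        dn = [t - ck * d for t, d in zip(theta, delta)]
        g = (loss(up) - loss(dn)) / (2.0 * ck)
        theta = [t - ak * g / d for t, d in zip(theta, delta)]
    return theta

# minimize a simple quadratic as a smoke test
best = spsa_minimize(lambda v: sum(x * x for x in v), [1.5, -2.0])
```

    Two evaluations per iteration regardless of dimension is exactly what makes SPSA attractive when each "loss" evaluation is itself an expensive NHES simulation.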

  12. A Framework for the Interactive Handling of High-Dimensional Simulation Data in Complex Geometries

    KAUST Repository

    Benzina, Amal

    2013-01-01

    Flow simulations around building infrastructure models involve large scale complex geometries, which when discretized in adequate detail entail high computational cost. Moreover, tasks such as simulation insight by steering or optimization require many such costly simulations. In this paper, we illustrate the whole pipeline of an integrated solution for interactive computational steering, developed for complex flow simulation scenarios that depend on a moderate number of both geometric and physical parameters. A mesh generator takes building information model input data and outputs a valid cartesian discretization. A sparse-grids-based surrogate model—a less costly substitute for the parameterized simulation—uses precomputed data to deliver approximated simulation results at interactive rates. Furthermore, a distributed multi-display visualization environment shows building infrastructure together with flow data. The focus is set on scalability and intuitive user interaction.
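
    The surrogate idea, precompute expensive simulations on a parameter grid and interpolate at interactive rates, can be sketched with plain bilinear interpolation over two parameters (the paper uses sparse grids, which scale to more parameters; this toy assumes queries stay inside the grid):

```python
def bilinear_surrogate(grid, x0, y0, dx, dy):
    """Return an interpolator over a table of precomputed simulation results.
    grid[i][j] is the result at parameter point (x0 + i*dx, y0 + j*dy)."""
    nx, ny = len(grid), len(grid[0])

    def evaluate(x, y):
        i = min(int((x - x0) / dx), nx - 2)
        j = min(int((y - y0) / dy), ny - 2)
        tx = (x - (x0 + i * dx)) / dx
        ty = (y - (y0 + j * dy)) / dy
        return ((1 - tx) * (1 - ty) * grid[i][j] + tx * (1 - ty) * grid[i + 1][j]
                + (1 - tx) * ty * grid[i][j + 1] + tx * ty * grid[i + 1][j + 1])

    return evaluate

# pretend f(x, y) = 2x + 3y + 1 is an expensive simulation, tabulated on a 5x5 grid
grid = [[2 * i + 3 * j + 1 for j in range(5)] for i in range(5)]
f = bilinear_surrogate(grid, 0.0, 0.0, 1.0, 1.0)
```

    The steering loop then queries `f` at interactive rates while the true solver only ever ran at the grid points.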

  13. A conceptual framework for using Doppler radar acquired atmospheric data for flight simulation

    Science.gov (United States)

    Campbell, W.

    1983-01-01

    A concept is presented which can permit turbulence simulation in the vicinity of microbursts. The method involves a large data base, but should be fast enough for use with flight simulators. The model permits any pilot to simulate any flight maneuver in any aircraft. The model simulates a wind field with three-component mean winds and three-component turbulent gusts, and gust variation over the body of an aircraft so that all aerodynamic loads and moments can be calculated. The time and space variation of mean winds and turbulent intensities associated with a particular atmospheric phenomenon such as a microburst is used in the model. In fact, Doppler radar data such as provided by JAWS is uniquely suited for use with the proposed model. The concept is completely general and is not restricted to microburst studies. Reentry and flight in terrestrial or planetary atmospheres could be realistically simulated if supporting data of sufficient resolution were available.

  14. Towards a conceptual multi-agent-based framework to simulate the spatial group decision-making process

    Science.gov (United States)

    Ghavami, Seyed Morsal; Taleai, Mohammad

    2017-04-01

    Most spatial problems are multi-actor, multi-issue and multi-phase in nature. In addition to their intrinsic complexity, spatial problems usually involve groups of actors from different organizational and cognitive backgrounds, all of whom participate in a social structure to resolve or reduce the complexity of a given problem. Hence, it is important to study and evaluate what different aspects influence the spatial problem resolution process. Recently, multi-agent systems consisting of groups of separate agent entities all interacting with each other have been put forward as appropriate tools to use to study and resolve such problems. In this study, in order to generate a better understanding of the spatial group decision-making process, a conceptual multi-agent-based framework is used that represents and specifies all the necessary concepts and entities needed to aid group decision making, based on a simulation of the group decision-making process as well as the relationships that exist among the different concepts involved. The study uses five main influencing entities as concepts in the simulation process: spatial influence, individual-level influence, group-level influence, negotiation influence and group performance measures. Further, it explains the relationship among different concepts in a descriptive rather than explanatory manner. To illustrate the proposed framework, the approval process for an urban land use master plan in Zanjan—a provincial capital in Iran—is simulated using MAS, the results highlighting the effectiveness of applying an MAS-based framework when wishing to study the group decision-making process used to resolve spatial problems.

  15. Towards a conceptual multi-agent-based framework to simulate the spatial group decision-making process

    Science.gov (United States)

    Ghavami, Seyed Morsal; Taleai, Mohammad

    2016-11-01

    Most spatial problems are multi-actor, multi-issue and multi-phase in nature. In addition to their intrinsic complexity, spatial problems usually involve groups of actors from different organizational and cognitive backgrounds, all of whom participate in a social structure to resolve or reduce the complexity of a given problem. Hence, it is important to study and evaluate what different aspects influence the spatial problem resolution process. Recently, multi-agent systems consisting of groups of separate agent entities all interacting with each other have been put forward as appropriate tools to use to study and resolve such problems. In this study, in order to generate a better understanding of the spatial group decision-making process, a conceptual multi-agent-based framework is used that represents and specifies all the necessary concepts and entities needed to aid group decision making, based on a simulation of the group decision-making process as well as the relationships that exist among the different concepts involved. The study uses five main influencing entities as concepts in the simulation process: spatial influence, individual-level influence, group-level influence, negotiation influence and group performance measures. Further, it explains the relationship among different concepts in a descriptive rather than explanatory manner. To illustrate the proposed framework, the approval process for an urban land use master plan in Zanjan—a provincial capital in Iran—is simulated using MAS, the results highlighting the effectiveness of applying an MAS-based framework when wishing to study the group decision-making process used to resolve spatial problems.

  16. From MetroII to Metronomy, Designing Contract-based Function-Architecture Co-simulation Framework for Timing Verification of Cyber-Physical Systems

    Science.gov (United States)

    2015-03-13

    From MetroII to Metronomy, Designing Contract-based Function-Architecture Co-simulation Framework for Timing Verification of Cyber-Physical Systems ... ABSTRACT: As the design complexity of cyber-physical systems continues to grow, modeling the system at higher abstraction levels with formal models of ... by Liangpeng Guo. A dissertation submitted in partial ...

  17. GridPACK™ : A Framework for Developing Power Grid Simulations on High-Performance Computing Platforms

    Energy Technology Data Exchange (ETDEWEB)

    Palmer, Bruce J.; Perkins, William A.; Chen, Yousu; Jin, Shuangshuang; Callahan, David; Glass, Kevin A.; Diao, Ruisheng; Rice, Mark J.; Elbert, Stephen T.; Vallem, Mallikarjuna R.; Huang, Zhenyu

    2016-05-01

    This paper describes the GridPACK™ framework, which is designed to help power grid engineers develop modeling software capable of running on high performance computers. The framework makes extensive use of software templates to provide high level functionality while at the same time allowing developers the freedom to express whatever models and algorithms they are using. GridPACK™ contains modules for setting up distributed power grid networks, assigning buses and branches with arbitrary behaviors to the network, creating distributed matrices and vectors and using parallel linear and non-linear solvers to solve algebraic equations. It also provides mappers to create matrices and vectors based on properties of the network and functionality to support IO and to mana
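
    GridPACK™'s "mapper" concept builds matrices and vectors from properties of the network. A toy serial analogue is assembling a bus-admittance (Y-bus) matrix from a branch list; this is our illustration of the idea, whereas GridPACK™'s mappers produce distributed sparse matrices:

```python
def build_ybus(n_bus, branches):
    """Map a branch list onto a dense bus-admittance matrix.
    Each branch is (from_bus, to_bus, resistance, reactance) in per-unit."""
    Y = [[0j] * n_bus for _ in range(n_bus)]
    for f, t, r, x in branches:
        y = 1.0 / complex(r, x)   # series admittance of the branch
        Y[f][f] += y              # diagonal: sum of admittances at the bus
        Y[t][t] += y
        Y[f][t] -= y              # off-diagonal: negative branch admittance
        Y[t][f] -= y
    return Y

# three buses in a ring (hypothetical impedances, no shunt elements)
Y = build_ybus(3, [(0, 1, 0.01, 0.1), (1, 2, 0.02, 0.2), (0, 2, 0.01, 0.1)])
```

    Once the matrix exists, power-flow or dynamic-simulation equations reduce to the linear and nonlinear solves that the framework hands off to its parallel solver modules.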

  18. PHASE II VAULT TESTING OF THE ARGONNE RFID SYSTEM

    Energy Technology Data Exchange (ETDEWEB)

    Willoner, T.; Turlington, R.; Koenig, R.

    2012-06-25

    The U.S. Department of Energy (DOE) (Environmental Management [EM], Office of Packaging and Transportation [EM-45]) Packaging and Certification Program (DOE PCP) has developed a Radio Frequency Identification (RFID) tracking and monitoring system, called ARG-US, for the management of nuclear materials packages during transportation and storage. The performance of the ARG-US RFID equipment and system has been fully tested in two demonstration projects in April 2008 and August 2009. With the strong support of DOE-SR and DOE PCP, a field testing program was completed in Savannah River Site's K-Area Material Storage (KAMS) Facility, an active Category I Plutonium Storage Facility, in 2010. As the next step (Phase II) of continued vault testing for the ARG-US system, the Savannah River Site K Area Material Storage facility has placed the ARG-US RFIDs into the 910B storage vault for operational testing. This latest version (Mark III) of the Argonne RFID system now has the capability to measure radiation dose and dose rate. This paper will report field testing progress of the ARG-US RFID equipment in KAMS, the operability and reliability trend results associated with the applications of the system, and discuss the potential benefits in enhancing safety, security and materials accountability. The purpose of this Phase II K Area test is to verify the accuracy of the radiation monitoring and proper functionality of the ARG-US RFID equipment and system under a realistic environment in the KAMS facility. Deploying the ARG-US RFID system leads to a reduced need for manned surveillance and increased inventory periods by providing real-time access to status and event history traceability, including environmental condition monitoring and radiation monitoring. The successful completion of the testing program will provide field data to support future development and testing. This will increase operational efficiency and cost-effectiveness for vault operation. As the next step

  19. Stochastic simulation framework for the Limit Order Book using liquidity motivated agents

    OpenAIRE

    Efstathios Panayi; Gareth Peters

    2015-01-01

    In this paper we develop a new form of agent-based model for limit order books based on heterogeneous trading agents, whose motivations are liquidity driven. These agents are abstractions of real market participants, expressed in a stochastic model framework. We develop an efficient way to perform statistical calibration of the model parameters on Level 2 limit order book data from Chi-X, based on a combination of indirect inference and multi-objective optimisation. We then demonstrate how su...
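
    A limit order book simulation needs, at minimum, price-time-priority matching of the orders its agents submit. The stripped-down book below is our sketch of that core; the paper's liquidity-motivated agents and indirect-inference calibration sit on top of a far richer version of this mechanism.

```python
import heapq

class LimitOrderBook:
    """Minimal price-time priority matching for limit orders."""

    def __init__(self):
        self.bids = []    # max-heap via negated price: (-price, seq, qty)
        self.asks = []    # min-heap: (price, seq, qty)
        self.seq = 0
        self.trades = []  # list of (price, quantity) executions

    def submit(self, side, price, qty):
        self.seq += 1
        book, opp = (self.bids, self.asks) if side == "buy" else (self.asks, self.bids)
        # cross against the opposite side while the prices overlap
        while qty > 0 and opp:
            best = opp[0]
            best_price = best[0] if side == "buy" else -best[0]
            if (side == "buy" and price < best_price) or (side == "sell" and price > best_price):
                break
            take = min(qty, best[2])
            self.trades.append((best_price, take))
            qty -= take
            if take == best[2]:
                heapq.heappop(opp)
            else:  # partial fill: shrink the resting order (priority unchanged)
                opp[0] = (best[0], best[1], best[2] - take)
        if qty > 0:  # remainder rests in the book
            key = -price if side == "buy" else price
            heapq.heappush(book, (key, self.seq, qty))

    def best_bid(self):
        return -self.bids[0][0] if self.bids else None

    def best_ask(self):
        return self.asks[0][0] if self.asks else None

book = LimitOrderBook()
book.submit("buy", 99, 10)
book.submit("sell", 101, 5)
book.submit("sell", 98, 12)  # crosses the bid, remainder rests as the new ask
```

    Agent heterogeneity then reduces to different policies for when and at what price to call `submit`, which is where the liquidity motivations enter.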

  20. A versatile framework for simulating the dynamic mechanical structure of cytoskeletal networks

    CERN Document Server

    Freedman, Simon L; Hocky, Glen M; Dinner, Aaron R

    2016-01-01

    Computer simulations can aid in our understanding of how collective materials properties emerge from interactions between simple constituents. Here, we introduce a coarse-grained model of networks of actin filaments, myosin motors, and crosslinking proteins that enables simulation at biologically relevant time and length scales. We demonstrate that the model, with a consistent parameterization, qualitatively and quantitatively captures a suite of trends observed experimentally, including the statistics of filament fluctuations, mechanical responses to shear, motor motilities, and network rearrangements. The model can thus serve as a platform for interpretation and design of cytoskeletal materials experiments, as well as for further development of simulations incorporating active elements.
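
    Coarse-grained filament models of this kind typically integrate overdamped bead-spring dynamics. A noise-free, motor-free sketch of a single filament relaxing (illustrative parameters, far simpler than the authors' model, which adds thermal noise, bending stiffness, motors, and crosslinkers):

```python
import math

def relax_filament(beads, k=1.0, rest=1.0, drag=1.0, dt=0.01, steps=2000):
    """Overdamped bead-spring filament: dx/dt = F_spring / drag, forward Euler.
    Each neighboring pair is joined by a Hookean spring of rest length `rest`."""
    beads = [list(b) for b in beads]
    for _ in range(steps):
        forces = [[0.0, 0.0] for _ in beads]
        for i in range(len(beads) - 1):
            dx = beads[i + 1][0] - beads[i][0]
            dy = beads[i + 1][1] - beads[i][1]
            dist = math.hypot(dx, dy)
            f = k * (dist - rest) / dist   # tension per unit bond vector
            forces[i][0] += f * dx
            forces[i][1] += f * dy
            forces[i + 1][0] -= f * dx
            forces[i + 1][1] -= f * dy
        for b, F in zip(beads, forces):
            b[0] += dt * F[0] / drag
            b[1] += dt * F[1] / drag
    return beads

# a filament stretched to twice its rest length contracts back
beads = relax_filament([[0.0, 0.0], [2.0, 0.0], [4.0, 0.0]])
```

    Adding a Langevin noise term to each bead and "walking" force pairs between filaments is the conceptual path from this toy to an active cytoskeletal network.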

  1. FAST: a framework for simulation and analysis of large-scale protein-silicon biosensor circuits.

    Science.gov (United States)

    Gu, Ming; Chakrabartty, Shantanu

    2013-08-01

    This paper presents a computer-aided design (CAD) framework for verification and reliability analysis of protein-silicon hybrid circuits used in biosensors. It is envisioned that similar to integrated circuit (IC) CAD design tools, the proposed framework will be useful for system level optimization of biosensors and for discovery of new sensing modalities without resorting to laborious fabrication and experimental procedures. The framework referred to as FAST analyzes protein-based circuits by solving inverse problems involving stochastic functional elements that admit non-linear relationships between different circuit variables. In this regard, FAST uses a factor-graph netlist as a user interface and solving the inverse problem entails passing messages/signals between the internal nodes of the netlist. Stochastic analysis techniques like density evolution are used to understand the dynamics of the circuit and estimate the reliability of the solution. As an example, we present a complete design flow using FAST for synthesis, analysis and verification of our previously reported conductometric immunoassay that uses antibody-based circuits to implement forward error-correction (FEC).

  2. GridLAB-D: An Agent-Based Simulation Framework for Smart Grids

    Directory of Open Access Journals (Sweden)

    David P. Chassin

    2014-01-01

    Full Text Available Simulation of smart grid technologies requires a fundamentally new approach to integrated modeling of power systems, energy markets, building technologies, and the plethora of other resources and assets that are becoming part of modern electricity production, delivery, and consumption systems. As a result, the US Department of Energy’s Office of Electricity commissioned the development of a new type of power system simulation tool called GridLAB-D that uses an agent-based approach to simulating smart grids. This paper presents the numerical methods and approach to time-series simulation used by GridLAB-D and reviews applications in power system studies, market design, building control system design, and integration of wind power in a smart grid.
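
    GridLAB-D's actual numerical methods are described in the paper; as a flavor of the agent-based, time-series style of smart-grid simulation it uses, here is a minimal sketch (not GridLAB-D code) in which thermostatic house agents respond to a price signal. All parameters (thermal coefficients, prices, price cap) are illustrative.

```python
import random

class HouseAgent:
    """Toy thermostatic load: heats when cold, sheds load when price is high.

    Coefficients are invented for illustration, not GridLAB-D defaults.
    """
    def __init__(self, setpoint=20.0, comfort_band=1.0, price_cap=0.15):
        self.temp = setpoint + random.uniform(-1, 1)
        self.setpoint, self.band, self.price_cap = setpoint, comfort_band, price_cap
        self.heating = False

    def step(self, outdoor_temp, price, dt_h=0.25):
        # Deadband thermostat with a simple price-responsive cutoff
        if self.temp < self.setpoint - self.band and price <= self.price_cap:
            self.heating = True
        elif self.temp > self.setpoint + self.band or price > self.price_cap:
            self.heating = False
        # First-order thermal model: loss to outdoors, gain from heater
        gain = 8.0 if self.heating else 0.0            # degrees/hour from heater
        self.temp += dt_h * (gain - 0.5 * (self.temp - outdoor_temp))
        return 3.0 if self.heating else 0.0            # kW drawn this step

random.seed(1)
houses = [HouseAgent() for _ in range(50)]
for step in range(96):                                 # one day, 15-minute steps
    price = 0.25 if 64 <= step < 80 else 0.10          # evening price spike
    load = sum(h.step(outdoor_temp=5.0, price=price) for h in houses)
```

    Even this toy shows the characteristic emergent behavior agent-based grid simulators study: aggregate load drops during the price spike and rebounds afterward as every thermostat tries to recover at once.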

  3. A Virtual Simulation Environment for Lunar Rover: Framework and Key Technologies

    Directory of Open Access Journals (Sweden)

    Yan-chun Yang

    2008-11-01

    Full Text Available Lunar rover development involves a large amount of validation work under realistic operational conditions, covering both the mechanical subsystem and the on-board software. Real tests require an equipped rover platform and realistic terrain, which is time-consuming and costly. To improve development efficiency, a rover simulation environment called RSVE, which affords real-time capabilities with high fidelity, has been developed. It uses the fractional Brownian motion (fBm) technique and statistical properties to generate the lunar surface, so various terrain models for simulation can be generated by changing a few parameters. To simulate a lunar rover evolving on natural, unstructured surfaces with high realism, the full dynamics of the multi-body system and its complex interactions with soft ground are integrated into this environment. An example of testing a path-planning algorithm and a control algorithm in this environment is presented. The simulation environment runs on PCs or Silicon Graphics workstations.
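
    RSVE's own terrain generator is not reproduced here, but the fBm idea can be sketched with one-dimensional midpoint displacement, where a `roughness` parameter plays the role of the tunable statistical properties mentioned above (the parameter names and values are illustrative).

```python
import random

def fbm_profile(levels=8, roughness=0.5, seed=42):
    """1-D surface height profile via midpoint displacement.

    `roughness` in (0, 1) controls how fast random displacements shrink
    per refinement level: smaller values give smoother terrain, playing
    the role of the Hurst exponent in true fractional Brownian motion.
    """
    rng = random.Random(seed)
    heights = [0.0, 0.0]                     # endpoints of the profile
    amplitude = 1.0
    for _ in range(levels):
        # displace the midpoint of every existing segment
        mids = [(a + b) / 2 + rng.gauss(0, amplitude)
                for a, b in zip(heights, heights[1:])]
        # interleave the new midpoints between the existing samples
        new = [heights[0]]
        for mid, nxt in zip(mids, heights[1:]):
            new += [mid, nxt]
        heights = new
        amplitude *= roughness               # shrink displacement each level
    return heights                           # 2**levels + 1 samples

profile = fbm_profile()
```

    A 2-D version (diamond-square) follows the same recipe on a grid; regenerating with a different seed or roughness yields a new terrain model, which is the property the abstract highlights.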

  4. Autonomic, Agent-Based Simulation Management (A2SM) Framework Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Large scale numerical simulations, as typified by climate models, space weather models, and the like, typically involve non-linear governing equations in discretized...

  5. A Framework for Parallel Numerical Simulations on Multi-Scale Geometries

    KAUST Repository

    Varduhn, Vasco

    2012-06-01

    In this paper, an approach to performing numerical multi-scale simulations on finely detailed geometries is presented. In particular, the focus lies on the generation of sufficiently fine mesh representations, where resolutions of dozens of millions of voxels are inevitable in order to sufficiently represent the geometry. Furthermore, the propagation of boundary conditions is investigated by using simulation results from the coarser simulation scale as input boundary conditions on the next finer scale. Finally, the applicability of our approach is shown on a two-phase simulation of flooding scenarios in urban structures, running from a city-wide scale to a finely detailed indoor scale on feature-rich building geometries. © 2012 IEEE.

  6. A technical framework to describe occupant behavior for building energy simulations

    Energy Technology Data Exchange (ETDEWEB)

    Turner, William; Hong, Tianzhen

    2013-12-20

    Green buildings that fail to meet expected design performance criteria indicate that technology alone does not guarantee high performance. Human influences are quite often simplified and ignored in the design, construction, and operation of buildings. Energy-conscious human behavior has been demonstrated to be a significant positive factor for improving the indoor environment while reducing the energy use of buildings. In our study we developed a new technical framework to describe energy-related human behavior in buildings. The energy-related behavior includes accounting for individuals and groups of occupants and their interactions with building energy services systems, appliances, and facilities. The technical framework consists of four key components: (i) the drivers behind energy-related occupant behavior, which are biological, societal, environmental, physical, and economic in nature; (ii) the needs of the occupants, based on satisfying criteria that are either physical (e.g. thermal, visual and acoustic comfort) or non-physical (e.g. entertainment, privacy, and social reward); (iii) the actions that building occupants perform when their needs are not fulfilled; and (iv) the systems with which an occupant can interact to satisfy their needs. The technical framework aims to provide a standardized description of a complete set of human energy-related behaviors in the form of an XML schema. For each type of behavior (e.g., occupants opening/closing windows, switching on/off lights, etc.) we identify a set of common behaviors based on a literature review, survey data, and our own field study and analysis. Stochastic models are adopted or developed for each type of behavior to enable the evaluation of the impact of human behavior on energy use in buildings, during either the design or operation phase. We will also demonstrate the use of the technical framework in assessing the impact of occupancy behavior on energy-saving technologies. The technical framework presented is...
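
    The framework's XML schema is not reproduced in this record; as an illustration of the kind of stochastic behavior model it standardizes (here, window opening driven by an unmet thermal-comfort need), consider this sketch. The logistic coefficients and the 22 °C closing threshold are invented for illustration, not values from the framework.

```python
import math, random

def p_open_window(indoor_temp_c, b0=-12.0, b1=0.45):
    """Logistic model: probability an occupant opens the window this hour.

    b0 and b1 are illustrative coefficients, not fitted values from the
    cited study; real frameworks calibrate them from survey/field data.
    """
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * indoor_temp_c)))

def simulate_day(temps, seed=0):
    rng = random.Random(seed)
    window_open = False
    states = []
    for t in temps:
        if not window_open and rng.random() < p_open_window(t):
            window_open = True           # action taken when the comfort need is unmet
        elif window_open and t < 22.0:
            window_open = False          # close again once the space cools down
        states.append(window_open)
    return states

# One synthetic day of hourly indoor temperatures peaking mid-afternoon
temps = [20 + 6 * math.sin(math.pi * h / 24) for h in range(24)]
states = simulate_day(temps)
```

    Running many stochastic days like this inside a building energy simulation is how such behavior models propagate occupant variability into predicted energy use.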

  7. Conceptual Modeling of a Quantum Key Distribution Simulation Framework Using the Discrete Event System Specification

    Science.gov (United States)

    2014-09-18

    Indexed excerpt from the International Journal of Emerging Technology and Advanced Engineering (www.ijetae.com, ISSN 2250-2459, ISO 9001:2008 certified journal, Volume 4, Issue 2, February 2014), pp. 829-831: "Towards the Modeling and Simulation of Quantum Key..." The excerpt notes that such a simulation capability needs to address many "concerns" and includes Table II, "End User Capability Requirements".

  8. A Conceptual framework of Strategy, Structure and Innovative Behaviour for the Development of a Dynamic Simulation Model

    Science.gov (United States)

    Konstantopoulos, Nikolaos; Trivellas, Panagiotis; Reklitis, Panagiotis

    2007-12-01

    According to many researchers of organizational theory, a great number of problems encountered by the manufacturing firms are due to their failure to foster innovative behaviour by aligning business strategy and structure. From this point of view, the fit between strategy and structure is essential in order to facilitate firms' innovative behaviour. In the present paper, we adopt Porter's typology to operationalise business strategy (cost leadership, innovative and marketing differentiation, and focus). Organizational structure is built on four dimensions (centralization, formalization, complexity and employees' initiatives to implement new ideas). Innovativeness is measured as product innovation, process and technological innovation. This study provides the necessary theoretical framework for the development of a dynamic simulation method, although the simulation of social events is a quite difficult task, considering that there are so many alternatives (not all well understood).

  9. Dual-Function Metal-Organic Framework as a Versatile Catalyst for Detoxifying Chemical Warfare Agent Simulants.

    Science.gov (United States)

    Liu, Yangyang; Moon, Su-Young; Hupp, Joseph T; Farha, Omar K

    2015-12-22

    The nanocrystals of a porphyrin-based zirconium(IV) metal-organic framework (MOF) are used as a dual-function catalyst for the simultaneous detoxification of two chemical warfare agent simulants at room temperature. Simulants of nerve agent (such as GD, VX) and mustard gas, dimethyl 4-nitrophenyl phosphate and 2-chloroethyl ethyl sulfide, have been hydrolyzed and oxidized, respectively, to nontoxic products via a pair of pathways catalyzed by the same MOF. Phosphotriesterase-like activity of the Zr6-containing node combined with photoactivity of the porphyrin linker gives rise to a versatile MOF catalyst. In addition, bringing the MOF crystals down to the nanoregime leads to acceleration of the catalysis.

  10. Argonne National Laboratory summary site environmental report for calendar year 2006.

    Energy Technology Data Exchange (ETDEWEB)

    Golchert, N. W.; ESH/QA Oversight

    2008-03-27

    This booklet is designed to inform the public about what Argonne National Laboratory is doing to monitor its environment and to protect its employees and neighbors from any adverse environmental impacts of Argonne research. The Downers Grove South Biology II class was selected to write this booklet, which summarizes Argonne's environmental monitoring programs for 2006. Writing this booklet also satisfies the Illinois State Education Standard, which requires students to know and apply scientific concepts in order to graduate from high school. This project not only provides information to the public, but also helps students become better learners. The Biology II class was assigned to condense Argonne's 300-page, highly technical Site Environmental Report into a 16-page plain-English booklet. The site assessment relates to the class because the primary focus of the Biology II class is ecology and the environment. Students developed better learning skills by working together cooperatively and by writing and researching more effectively. Students used the Argonne Site Environmental Report, the Internet, textbooks, and information from Argonne scientists to help with their research on their topics. The topics covered in this booklet are the history of Argonne, groundwater, habitat management, air quality, Argonne research, Argonne's environmental non-radiological program, radiation, and compliance. The students first had to read and discuss the Site Environmental Report and then assign topics to focus on. Dr. Norbert Golchert and Mr. David Baurac, both from Argonne, came into the class to help teach the topics in more depth. The class then prepared drafts and wrote a final copy. Ashley Vizek, a student in the Biology class, stated, 'I reviewed my material and read it over and over. I then took time to plan my paper out and think about what I wanted to write about, put it into foundation questions and started to write my paper. I rewrote and revised so I...'

  11. Argonne National Laboratory: Laboratory Directed Research and Development FY 1993 program activities. Annual report

    Energy Technology Data Exchange (ETDEWEB)

    None

    1993-12-23

    The purposes of Argonne's Laboratory Directed Research and Development (LDRD) Program are to encourage the development of novel concepts, enhance the Laboratory's R&D capabilities, and further the development of its strategic initiatives. Projects are selected from proposals for creative and innovative R&D studies which are not yet eligible for timely support through normal programmatic channels. Among the aims of the projects supported by the Program are establishment of engineering "proof-of-principle"; assessment of design feasibility for prospective facilities; development of an instrumental prototype, method, or system; or discovery in fundamental science. Several of these projects are closely associated with major strategic thrusts of the Laboratory as described in Argonne's Five Year Institutional Plan, although the scientific implications of the achieved results extend well beyond Laboratory plans and objectives. The projects supported by the Program are distributed across the major programmatic areas at Argonne as indicated in the Laboratory LDRD Plan for FY 1993.

  12. Agent Based Modeling and Simulation Framework for Supply Chain Risk Management

    Science.gov (United States)

    2012-03-01


  13. CO2 adsorption in mono-, di- and trivalent cation-exchanged metal-organic frameworks: A molecular simulation study

    KAUST Repository

    Chen, Yifei

    2012-02-28

    A molecular simulation study is reported for CO2 adsorption in rho zeolite-like metal-organic framework (rho-ZMOF) exchanged with a series of cations (Na+, K+, Rb+, Cs+, Mg2+, Ca2+, and Al3+). The isosteric heat and Henry's constant at infinite dilution increase monotonically with increasing charge-to-diameter ratio of the cation (Cs+ < Rb+ < K+ < Na+ < Ca2+ < Mg2+ < Al3+). At low pressures, cations act as preferential adsorption sites for CO2 and the capacity follows the charge-to-diameter ratio. However, the free volume of the framework becomes predominant with increasing pressure, and Mg-rho-ZMOF appears to possess the highest saturation capacity. The equilibrium locations of cations are observed to shift slightly upon CO2 adsorption. Furthermore, the adsorption selectivity of the CO2/H2 mixture increases as Cs+ < Rb+ < K+ < Na+ < Ca2+ < Mg2+ ≈ Al3+. At ambient conditions, the selectivity is in the range of 800-3000, significantly higher than in other nanoporous materials. In the presence of 0.1% H2O, the selectivity decreases drastically because of competitive adsorption between H2O and CO2, and shows a similar value in all of the cation-exchanged rho-ZMOFs. This simulation study provides microscopic insight into the important role of cations in governing gas adsorption and separation, and suggests that the performance of ionic rho-ZMOF can be tailored by cations. © 2012 American Chemical Society.
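
    The grand-canonical Monte Carlo (GCMC) machinery behind such adsorption studies can be sketched in its simplest form. The toy below (not the authors' code) applies the standard insertion/deletion acceptance rules to a non-interacting gas, for which ΔU = 0 and the exact answer ⟨N⟩ = exp(βμ)V/Λ³ is known; in a real rho-ZMOF simulation, the Boltzmann factor exp(-βΔU) of the framework-cation-CO2 interactions would multiply each acceptance probability.

```python
import random

def gcmc_ideal_gas(z_volume, steps=200_000, seed=7):
    """Grand-canonical MC for a non-interacting gas (all ΔU = 0).

    z_volume = exp(beta*mu) * V / Lambda**3; for this ideal case the
    exact result is <N> = z_volume, which validates the acceptance rule.
    """
    rng = random.Random(seed)
    n = 0
    total = 0
    for _ in range(steps):
        if rng.random() < 0.5:                       # attempt particle insertion
            if rng.random() < min(1.0, z_volume / (n + 1)):
                n += 1
        elif n > 0:                                  # attempt particle deletion
            if rng.random() < min(1.0, n / z_volume):
                n -= 1
        total += n
    return total / steps                             # running average of N

avg_n = gcmc_ideal_gas(z_volume=5.0)                 # expect <N> near 5
```

    Sweeping the chemical potential (i.e., the pressure) and recording ⟨N⟩ is exactly how simulated adsorption isotherms such as those in this study are produced.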

  14. Voxel2MCNP: a framework for modeling, simulation and evaluation of radiation transport scenarios for Monte Carlo codes.

    Science.gov (United States)

    Pölz, Stefan; Laubersheimer, Sven; Eberhardt, Jakob S; Harrendorf, Marco A; Keck, Thomas; Benzler, Andreas; Breustedt, Bastian

    2013-08-21

    The basic idea of Voxel2MCNP is to provide a framework supporting users in modeling radiation transport scenarios using voxel phantoms and other geometric models, generating corresponding input for the Monte Carlo code MCNPX, and evaluating simulation output. Applications at Karlsruhe Institute of Technology are primarily whole and partial body counter calibration and calculation of dose conversion coefficients. A new generic data model describing data related to radiation transport, including phantom and detector geometries and their properties, sources, tallies and materials, has been developed. It is modular and generally independent of the targeted Monte Carlo code. The data model has been implemented as an XML-based file format to facilitate data exchange, and integrated with Voxel2MCNP to provide a common interface for modeling, visualization, and evaluation of data. Also, extensions to allow compatibility with several file formats, such as ENSDF for nuclear structure properties and radioactive decay data, SimpleGeo for solid geometry modeling, ImageJ for voxel lattices, and MCNPX's MCTAL for simulation results, have been added. The framework is presented and discussed in this paper, and example workflows for body counter calibration and calculation of dose conversion coefficients are given to illustrate its application.

  15. Glucose recovery from aqueous solutions by adsorption in metal-organic framework MIL-101: a molecular simulation study

    Science.gov (United States)

    Gupta, Krishna M.; Zhang, Kang; Jiang, Jianwen

    2015-08-01

    A molecular simulation study is reported on glucose recovery from aqueous solutions by adsorption in metal-organic framework MIL-101. The F atom of MIL-101 is identified to be the most favorable adsorption site. Among three MIL-101-X (X = H, NH2 or CH3), the parent MIL-101 exhibits the highest adsorption capacity and recovery efficacy. Upon functionalization by -NH2 or -CH3 group, the steric hindrance in MIL-101 increases; consequently, the interactions between glucose and framework become less attractive, thus reducing the capacity and mobility of glucose. The presence of ionic liquid, 1-ethyl-3-methyl-imidazolium acetate, as an impurity reduces the strength of hydrogen-bonding between glucose and MIL-101, and leads to lower capacity and mobility. Upon adding anti-solvent (ethanol or acetone), a similar adverse effect is observed. The simulation study provides useful structural and dynamic properties of glucose in MIL-101, and it suggests that MIL-101 might be a potential candidate for glucose recovery.

  16. A scaleable architecture for the modeling and simulation of intelligent transportation systems.

    Energy Technology Data Exchange (ETDEWEB)

    Ewing, T.; Tentner, A.

    1999-03-17

    A distributed, scaleable architecture for the modeling and simulation of Intelligent Transportation Systems on a network of workstations or a parallel computer has been developed at Argonne National Laboratory. The resulting capability provides a modular framework supporting plug-in models, hardware, and live data sources; visually realistic graphics displays to support training and human factors studies; and a set of basic ITS models. The models and capabilities are described, along with a typical scenario involving dynamic rerouting of smart vehicles which send probe reports to and receive traffic advisories from a traffic management center capable of incident detection.

  17. A hybrid local/non-local framework for the simulation of damage and fracture

    KAUST Repository

    Azdoud, Yan

    2014-01-01

    Recent advances in non-local continuum models, notably peridynamics, have spurred a paradigm shift in solid mechanics simulation by allowing accurate mathematical representation of singularities and discontinuities. This doctoral work attempts to extend the use of this theory to a community more familiar with local continuum models. In this communication, a coupling strategy, the morphing method, which bridges local and non-local models, is presented. This thesis employs the morphing method to ease use of the non-local model to represent problems with failure-induced discontinuities. First, we give a quick review of strategies for the simulation of discrete degradation, and suggest a hybrid local/non-local alternative. Second, we present the technical concepts involved in the morphing method and evaluate the quality of the coupling. Third, we develop a numerical tool for the simulation of the hybrid model for fracture and damage and demonstrate its capabilities on numerical model examples.

  18. Distributed Geant4 simulation in medical and space science applications using DIANE framework and the GRID

    CERN Document Server

    Moscicki, J T; Mantero, A; Pia, M G

    2003-01-01

    Distributed computing is one of the most important trends in IT, which has recently gained significance for large-scale scientific applications. The distributed analysis environment (DIANE) is an R&D study, focusing on semi-interactive parallel and remote data analysis and simulation, which has been conducted at CERN. DIANE provides the necessary software infrastructure for parallel scientific applications in the master-worker model. Advanced error recovery policies, automatic book-keeping of distributed jobs, and on-line monitoring and control tools are provided. DIANE makes transparent use of a number of different middleware implementations, such as load balancing services (LSF, PBS, GRID Resource Broker, Condor) and security services (GSI, Kerberos, openssh). A number of distributed Geant4 simulations have been deployed and tested, ranging from interactive radiotherapy treatment planning using dedicated clusters in hospitals, to globally-distributed simulations of astrophysics experiments using the European data g...

  19. The Framework for Simulation of Bioinspired Security Mechanisms against Network Infrastructure Attacks

    Directory of Open Access Journals (Sweden)

    Andrey Shorov

    2014-01-01

    Full Text Available The paper outlines a bioinspired approach named “network nervous system” and methods of simulation of infrastructure attacks and protection mechanisms based on this approach. The protection mechanisms based on this approach consist of distributed procedures of information collection and processing, which coordinate the activities of the main devices of a computer network, identify attacks, and determine necessary countermeasures. Attacks and protection mechanisms are specified as structural models using a set-theoretic approach. An environment for simulation of protection mechanisms based on the biological metaphor is considered; the experiments demonstrating the effectiveness of the protection mechanisms are described.

  20. The framework for simulation of bioinspired security mechanisms against network infrastructure attacks.

    Science.gov (United States)

    Shorov, Andrey; Kotenko, Igor

    2014-01-01

    The paper outlines a bioinspired approach named "network nervous system" and methods of simulation of infrastructure attacks and protection mechanisms based on this approach. The protection mechanisms based on this approach consist of distributed procedures of information collection and processing, which coordinate the activities of the main devices of a computer network, identify attacks, and determine necessary countermeasures. Attacks and protection mechanisms are specified as structural models using a set-theoretic approach. An environment for simulation of protection mechanisms based on the biological metaphor is considered; the experiments demonstrating the effectiveness of the protection mechanisms are described.

  1. Metal organic frameworks (MOFs) for degradation of nerve agent simulant parathion

    Science.gov (United States)

    Parathion, a simulant of nerve agent VX, has been studied for degradation on Fe3+, Fe2+ and zerovalent iron supported on chitosan. Chitosan, a naturally occurring biopolymer derivative of chitin, is a very good adsorbent for many chemicals including metals. Chitosan is used as supporting biopolymer ...

  2. I. Dissociation free energies in drug-receptor systems via non equilibrium alchemical simulations: theoretical framework

    CERN Document Server

    Procacci, Piero

    2016-01-01

    In this contribution I critically revise the alchemical reversible approach in the context of the statistical mechanics theory of non-covalent bonding in drug-receptor systems. I show that most of the pitfalls and entanglements in the evaluation of binding free energies in computer simulations are rooted in the equilibrium assumption that is implicit in the reversible method. These critical issues can be resolved by using a non-equilibrium variant of the alchemical method in molecular dynamics simulations, relying on the production of many independent trajectories with a continuous dynamical evolution of an externally driven alchemical coordinate, completing the decoupling of the ligand in a matter of a few tens of picoseconds rather than nanoseconds. The absolute binding free energy can be recovered from the annihilation work distributions by applying an unbiased unidirectional free energy estimate, on the assumption that any observed work distribution is given by a mixture of normal distributions, whose compone...
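
    The unidirectional work-based estimate referred to above is, in its simplest form, the Jarzynski exponential-work average, ΔF = -kT ln⟨exp(-βW)⟩. A minimal numerical sketch (with an assumed Gaussian work distribution, for which ΔF = μ - βσ²/2 is known in closed form, so the estimator can be checked):

```python
import math, random

def jarzynski_free_energy(work_samples, beta=1.0):
    """Unidirectional exponential-work estimator: dF = -(1/beta) ln <exp(-beta*W)>.

    Uses a shifted (log-sum-exp style) average for numerical stability,
    since rare low-work trajectories dominate the exponential mean.
    """
    m = min(work_samples)
    s = sum(math.exp(-beta * (w - m)) for w in work_samples)
    return m - (1.0 / beta) * math.log(s / len(work_samples))

# Synthetic Gaussian work distribution W ~ N(mu, sigma^2), in units of kT:
# the exact answer is dF = mu - beta*sigma^2/2 = 3.0 - 0.5 = 2.5.
rng = random.Random(0)
mu, sigma = 3.0, 1.0
works = [rng.gauss(mu, sigma) for _ in range(200_000)]
dF = jarzynski_free_energy(works)
```

    For broad work distributions the exponential average converges slowly, which is why the paper's mixture-of-Gaussians treatment of the annihilation work distribution matters in practice.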

  3. Unusual adsorption site behavior in PCN-14 metal-organic framework predicted from Monte Carlo simulation.

    Science.gov (United States)

    Lucena, Sebastião M P; Mileo, Paulo G M; Silvino, Pedro F G; Cavalcante, Célio L

    2011-12-01

    The adsorption equilibrium of methane in PCN-14 was simulated by the Monte Carlo technique in the grand canonical ensemble. A new force field was proposed for the methane/PCN-14 system, and the temperature dependence of the molecular siting was investigated. A detailed study of the statistics of the center of mass and potential energy showed a surprising site behavior with no energy barriers between weak and strong sites, allowing open metal sites to guide methane molecules to other neighboring sites. Moreover, this study showed that a model assuming weakly adsorbing open metal clusters in PCN-14, densely populated only at low temperatures (below 150 K), can explain published experimental data. These results also explain previously observed discrepancies between neutron diffraction experiments and Monte Carlo simulations.

  4. Adaptive learning in agents behaviour: A framework for electricity markets simulation

    DEFF Research Database (Denmark)

    Pinto, Tiago; Vale, Zita; Sousa, Tiago M.

    2014-01-01

    This paper presents a methodology to provide decision support to electricity market negotiating players. This model allows integrating different strategic approaches for electricity market negotiations, and choosing the most appropriate one at each time, for each different negotiation context. This methodology is integrated in ALBidS (Adaptive Learning strategic Bidding System) – a multiagent system that provides ... players and simulates their operation in the market. Market players are entities with specific characteristics and objectives, making their decisions and interacting with other players. ... that combines several distinct strategies to build actions proposals, so that the best can be chosen at each time, depending on the context and simulation circumstances. The choosing process includes reinforcement learning algorithms, a mechanism for negotiating contexts analysis, a mechanism for the management ...
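
    ALBidS itself is not shown in the record; the reinforcement-learning strategy choice it describes can be sketched with a simple epsilon-greedy loop that picks among competing bid strategies by their running average reward. The strategies, prices, and reward definition below are all invented for illustration.

```python
import random

class StrategyChooser:
    """Epsilon-greedy choice among competing bid strategies (ALBidS-style stand-in).

    Each round one strategy proposes a bid; its reward is the negative
    absolute error against the realized market price.
    """
    def __init__(self, names, epsilon=0.1, seed=3):
        self.epsilon = epsilon
        self.rng = random.Random(seed)
        self.value = {n: 0.0 for n in names}
        self.count = {n: 0 for n in names}

    def choose(self):
        if self.rng.random() < self.epsilon:            # explore occasionally
            return self.rng.choice(list(self.value))
        return max(self.value, key=self.value.get)      # exploit the current best

    def update(self, name, reward):
        self.count[name] += 1
        # incremental running average of observed rewards for this strategy
        self.value[name] += (reward - self.value[name]) / self.count[name]

strategies = {
    "average_price": lambda last: 50.0,                 # static guess (illustrative)
    "follow_last": lambda last: last,                   # track last clearing price
}
chooser = StrategyChooser(strategies)
price_rng = random.Random(4)
last_price = 60.0
for _ in range(500):
    market_price = 60.0 + price_rng.gauss(0.0, 2.0)
    name = chooser.choose()
    bid = strategies[name](last_price)
    chooser.update(name, reward=-abs(bid - market_price))
    last_price = market_price
```

    After a few hundred simulated negotiation periods, the learner concentrates on the strategy that tracks the market, which is the adaptive, context-dependent selection behavior the abstract describes.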

  5. Evolution of Occupant Survivability Simulation Framework Using FEM-SPH Coupling

    Science.gov (United States)

    2011-08-01

    dynamics and detailed soil fracture mechanics simulations in production work do not align. The buried mine problem poses several challenges since...and bulk modulus, the major difference between the DRDC and ESI soil material is the pressure cutoff for tensile fracture. With this parameter SPH...distal parts of the tibia and the fibula are included in one rigid body; each foot is a rigid body; organs are deformable; and skin and flesh are

  6. Spent fuel treatment and mineral waste form development at Argonne National Laboratory-West

    Energy Technology Data Exchange (ETDEWEB)

    Goff, K.M.; Benedict, R.W.; Bateman, K. [Argonne National Lab., Idaho Falls, ID (United States); Lewis, M.A.; Pereira, C. [Argonne National Lab., IL (United States); Musick, C.A. [Lockheed Idaho Technologies Co., Idaho Falls, ID (United States)

    1996-07-01

    At Argonne National Laboratory-West (ANL-West) there are several thousand kilograms of metallic spent nuclear fuel containing bond sodium. This fuel will be treated in the Fuel Conditioning Facility (FCF) at ANL-West to produce stable waste forms for storage and disposal. Both mineral and metal high-level waste forms will be produced. The mineral waste form will contain the active metal fission products and the transuranics. Cold small-scale waste form testing has been ongoing at Argonne in Illinois. Large-scale testing is commencing at ANL-West.

  7. A generic open-source software framework supporting scenario simulations in bioterrorist crises.

    Science.gov (United States)

    Falenski, Alexander; Filter, Matthias; Thöns, Christian; Weiser, Armin A; Wigger, Jan-Frederik; Davis, Matthew; Douglas, Judith V; Edlund, Stefan; Hu, Kun; Kaufman, James H; Appel, Bernd; Käsbohrer, Annemarie

    2013-09-01

    Since the 2001 anthrax attack in the United States, awareness of threats originating from bioterrorism has grown. This led internationally to increased research efforts to improve knowledge of and approaches to protecting human and animal populations against the threat from such attacks. A collaborative effort in this context is the extension of the open-source Spatiotemporal Epidemiological Modeler (STEM) simulation and modeling software for agro- or bioterrorist crisis scenarios. STEM, originally designed to enable community-driven public health disease models and simulations, was extended with new features that enable integration of proprietary data as well as visualization of agent spread along supply and production chains. STEM now provides a fully developed open-source software infrastructure supporting critical modeling tasks such as ad hoc model generation, parameter estimation, simulation of scenario evolution, estimation of effects of mitigation or management measures, and documentation. This open-source software resource can be used free of charge. Additionally, STEM provides critical features like built-in worldwide data on administrative boundaries, transportation networks, or environmental conditions (eg, rainfall, temperature, elevation, vegetation). Users can easily combine their own confidential data with built-in public data to create customized models of desired resolution. STEM also supports collaborative and joint efforts in crisis situations by extended import and export functionalities. In this article we demonstrate specifically those new software features implemented to accomplish STEM application in agro- or bioterrorist crisis scenarios.
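
    STEM's built-in disease models are not reproduced in this record; the compartmental style of epidemiological model it supports can be illustrated with a minimal deterministic SIR integration (all rates and population sizes below are invented for illustration):

```python
def sir_simulate(s0, i0, r0, beta, gamma, days, dt=0.1):
    """Forward-Euler integration of the SIR equations:
       dS/dt = -beta*S*I/N,  dI/dt = beta*S*I/N - gamma*I,  dR/dt = gamma*I
    """
    s, i, r = float(s0), float(i0), float(r0)
    n = s + i + r
    history = [(s, i, r)]
    for _ in range(int(round(days / dt))):
        ds = -beta * s * i / n          # new infections leave S
        dr = gamma * i                  # recoveries leave I for R
        s += dt * ds
        i += dt * (-ds - dr)            # I gains infections, loses recoveries
        r += dt * dr
        history.append((s, i, r))
    return history

# Illustrative outbreak: basic reproduction number R0 = beta/gamma = 4
hist = sir_simulate(s0=9990, i0=10, r0=0, beta=0.4, gamma=0.1, days=120)
peak_infected = max(i for _, i, _ in hist)
```

    Tools like STEM layer spatial structure, transportation networks, and mitigation measures on top of compartment dynamics of exactly this kind.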

  8. A flexible object-based software framework for modeling complex systems with interacting natural and societal processes.

    Energy Technology Data Exchange (ETDEWEB)

    Christiansen, J. H.

    2000-06-15

    The Dynamic Information Architecture System (DIAS) is a flexible, extensible, object-based framework for developing and maintaining complex multidisciplinary simulations. The DIAS infrastructure makes it feasible to build and manipulate complex simulation scenarios in which many thousands of objects can interact via dozens to hundreds of concurrent dynamic processes. The flexibility and extensibility of the DIAS software infrastructure stem mainly from (1) the abstraction of object behaviors, (2) the encapsulation and formalization of model functionality, and (3) the mutability of domain object contents. DIAS simulation objects are inherently capable of highly flexible and heterogeneous spatial realizations. Geospatial graphical representation of DIAS simulation objects is addressed via the GeoViewer, an object-based GIS toolkit application developed at ANL. DIAS simulation capabilities have been extended by inclusion of societal process models generated by the Framework for Addressing Cooperative Extended Transactions (FACET), another object-based framework developed at Argonne National Laboratory. By using FACET models to implement societal behaviors of individuals and organizations within larger DIAS-based natural systems simulations, it has become possible to conveniently address a broad range of issues involving interaction and feedback among natural and societal processes. Example DIAS application areas discussed in this paper include a dynamic virtual oceanic environment, detailed simulation of clinical, physiological, and logistical aspects of health care delivery, and studies of agricultural sustainability of urban centers under environmental stress in ancient Mesopotamia.

  9. Connecting Artificial Brains to Robots in a Comprehensive Simulation Framework: The Neurorobotics Platform

    Science.gov (United States)

    Falotico, Egidio; Vannucci, Lorenzo; Ambrosano, Alessandro; Albanese, Ugo; Ulbrich, Stefan; Vasquez Tieck, Juan Camilo; Hinkel, Georg; Kaiser, Jacques; Peric, Igor; Denninger, Oliver; Cauli, Nino; Kirtay, Murat; Roennau, Arne; Klinker, Gudrun; Von Arnim, Axel; Guyot, Luc; Peppicelli, Daniel; Martínez-Cañada, Pablo; Ros, Eduardo; Maier, Patrick; Weber, Sandro; Huber, Manuel; Plecher, David; Röhrbein, Florian; Deser, Stefan; Roitberg, Alina; van der Smagt, Patrick; Dillman, Rüdiger; Levi, Paul; Laschi, Cecilia; Knoll, Alois C.; Gewaltig, Marc-Oliver

    2017-01-01

    Combined efforts in the fields of neuroscience, computer science, and biology have made it possible to design biologically realistic models of the brain based on spiking neural networks. For a proper validation of these models, an embodiment in a dynamic and rich sensory environment, where the model is exposed to a realistic sensory-motor task, is needed. Due to the complexity of these brain models, which at the current stage cannot deal with real-time constraints, it is not possible to embed them into a real-world task. Rather, the embodiment has to be simulated as well. While adequate tools exist to simulate either complex neural networks or robots and their environments, there is so far no tool that makes it easy to establish communication between brain and body models. The Neurorobotics Platform is a new web-based environment that aims to fill this gap by offering scientists and technology developers a software infrastructure allowing them to connect brain models to detailed simulations of robot bodies and environments and to use the resulting neurorobotic systems for in silico experimentation. To simplify the workflow and reduce the required level of programming skill, the platform provides editors for the specification of experimental sequences and conditions, environments, robots, and brain–body connectors. In addition, a variety of existing robots and environments are provided. This work presents the architecture of the first release of the Neurorobotics Platform developed in subproject 10 “Neurorobotics” of the Human Brain Project (HBP). At the current state, the Neurorobotics Platform allows researchers to design and run basic experiments in neurorobotics using simulated robots and simulated environments linked to simplified versions of brain models. We illustrate the capabilities of the platform with three example experiments: a Braitenberg task implemented on a mobile robot, a sensory-motor learning task based on a robotic controller

  10. A framework for epistemic uncertainty quantification of turbulent scalar flux models for Reynolds-averaged Navier-Stokes simulations

    Science.gov (United States)

    Gorlé, C.; Iaccarino, G.

    2013-05-01

    Reynolds-averaged Navier-Stokes (RANS) simulations are a practical approach for solving complex multi-physics turbulent flows, but the underlying assumptions of the turbulence models introduce errors and uncertainties in the simulation outcome. The flow in scramjet combustors is an example of such a complex flow and the accurate characterization of safety and operability limits of these engines using RANS simulations requires an assessment of the model uncertainty. The objective of this paper is to present a framework for the epistemic uncertainty quantification of turbulence and mixing models in RANS simulations. The capabilities of the methodology will be demonstrated by performing simulations of the mixing of an underexpanded jet in a supersonic cross flow, which involves many flow features observed in scramjet engines. The fundamental sources of uncertainty in the RANS simulations are the models used for the Reynolds stresses in the momentum equations and the turbulent scalar fluxes in the scalar transport equations. The methodology consists in directly perturbing the modeled quantities in the equations, thereby establishing a method that is completely independent of the initial model form to overcome the limitations of traditional sensitivity studies. The perturbations are defined in terms of the decomposed Reynolds stress tensor, i.e., the tensor magnitude and the eigenvalues and eigenvectors of the normalized anisotropy tensor. The turbulent scalar fluxes are perturbed by using the perturbed Reynolds stresses in a generalized gradient diffusion model formulation and by changing the model constant. The perturbations were parameterized based on a comparison between the Reynolds stresses obtained from a baseline RANS simulation and those obtained from a large-eddy simulation database. Subsequently an optimization problem was solved, varying the parameters in the perturbation functions to maximize a quantity of interest that quantifies the downstream mixing. 
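
The eigenvalue part of the perturbation described above can be pictured with the standard barycentric-triangle map between anisotropy eigenvalues and the limiting turbulence states. The sketch below is our illustration of that idea, not the authors' code; the function names and example eigenvalues are hypothetical.

```python
import math

# Corners of the barycentric triangle: 1-, 2-, and 3-component limiting states.
X1C, X2C, X3C = (1.0, 0.0), (0.0, 0.0), (0.5, math.sqrt(3.0) / 2.0)

def to_barycentric(lams):
    """Map sorted anisotropy eigenvalues (l1 >= l2 >= l3, sum = 0) to (x, y)."""
    l1, l2, l3 = lams
    c1, c2, c3 = l1 - l2, 2.0 * (l2 - l3), 3.0 * l3 + 1.0  # weights, sum to 1
    x = c1 * X1C[0] + c2 * X2C[0] + c3 * X3C[0]
    y = c1 * X1C[1] + c2 * X2C[1] + c3 * X3C[1]
    return x, y

def from_barycentric(x, y):
    """Invert the map back to eigenvalues."""
    c3 = y / X3C[1]
    c1 = x - 0.5 * c3
    c2 = 1.0 - c1 - c3
    l3 = (c3 - 1.0) / 3.0
    l2 = l3 + 0.5 * c2
    l1 = c1 + l2
    return l1, l2, l3

def perturb_eigenvalues(lams, corner, delta):
    """Shift the barycentric point a fraction delta toward a limiting state."""
    x, y = to_barycentric(lams)
    xs, ys = x + delta * (corner[0] - x), y + delta * (corner[1] - y)
    return from_barycentric(xs, ys)
```

With `delta = 1` the perturbed state collapses onto the chosen limiting corner; with `delta = 0` the baseline eigenvalues are recovered, so `delta` plays the role of the perturbation-magnitude parameter varied in the optimization.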

  11. Diffusion dynamics and concentration of toxic materials from quantum dots-based nanotechnologies: an agent-based modeling simulation framework

    Energy Technology Data Exchange (ETDEWEB)

    Agusdinata, Datu Buyung, E-mail: bagusdinata@niu.edu; Amouie, Mahbod [Northern Illinois University, Department of Industrial & Systems Engineering and Environment, Sustainability, & Energy Institute (United States); Xu, Tao [Northern Illinois University, Department of Chemistry and Biochemistry (United States)

    2015-01-15

    Due to their favorable electrical and optical properties, quantum dot (QD) nanostructures have found numerous applications, including nanomedicine and photovoltaic cells. However, increased future production, use, and disposal of engineered QD products also raise concerns about their potential environmental impacts. The objective of this work is to establish a modeling framework for predicting the diffusion dynamics and concentration of toxic materials released from trioctylphosphine oxide-capped CdSe. To this end, an agent-based simulation with reaction kinetics and Brownian motion dynamics was developed. Reaction kinetics are used to model the stability of the surface capping agent, particularly under oxidation. The diffusion of toxic Cd{sup 2+} ions in an aquatic environment was simulated using an adapted Brownian motion algorithm. A calibrated parameter to reflect sensitivity to reaction rate is proposed. The model output demonstrates the stochastic spatial distribution of toxic Cd{sup 2+} ions under different values of proxy environmental factor parameters. With oxidation as the only chemistry considered, the simulation was able to replicate Cd{sup 2+} ion release from thiol-capped QDs in aerated water. The agent-based method is the first to be developed in the QD application domain. It combines the simplicity of modeling the solubility and release rate of Cd{sup 2+} ions with the complexity of tracking individual Cd atoms at the same time.
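
The core loop described above, first-order oxidation kinetics assigning each ion a release time, followed by a three-dimensional Brownian walk, can be sketched in a few lines. All parameter values below are illustrative placeholders, not the calibrated values from the study.

```python
import math
import random

def simulate_release(n_ions=2000, k_ox=0.05, d_coef=1.0e-9,
                     dt=1.0, n_steps=200, seed=42):
    """Toy agent-based release-and-diffusion model (illustrative parameters)."""
    rng = random.Random(seed)
    sigma = math.sqrt(2.0 * d_coef * dt)   # per-axis step std (Einstein relation)
    positions, released = [], []
    for _ in range(n_ions):
        # Release time from first-order kinetics: P(release by t) = 1 - exp(-k t)
        t_rel = -math.log(1.0 - rng.random()) / k_ox
        first_step = min(n_steps, int(t_rel / dt))
        pos = [0.0, 0.0, 0.0]
        for _step in range(first_step, n_steps):
            pos = [p + rng.gauss(0.0, sigma) for p in pos]  # Brownian increment
        positions.append(pos)
        released.append(first_step < n_steps)
    return positions, released

positions, released = simulate_release()
frac_released = sum(released) / len(released)   # ~ 1 - exp(-k*T) for the defaults
```

For ions released at time zero, the ensemble mean squared displacement should track the free-diffusion result 6·D·t, which gives a quick sanity check on the walk.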

  12. FAST modularization framework for wind turbine simulation: full-system linearization

    Science.gov (United States)

    Jonkman, J. M.; Jonkman, B. J.

    2016-09-01

    The wind engineering community relies on multiphysics engineering software to run nonlinear time-domain simulations, e.g., for design-standards-based loads analysis. Although most physics involved in wind energy are nonlinear, linearization of the underlying nonlinear system equations is often advantageous to understand the system response and exploit well-established methods and tools for analyzing linear systems. This paper presents the development and verification of the new linearization functionality of the open-source engineering tool FAST v8 for land-based wind turbines, as well as the concepts and mathematical background needed to understand and apply it correctly.
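
Conceptually, linearization replaces dx/dt = f(x, u) by dΔx/dt = A·Δx + B·Δu about an operating point. FAST forms these matrices from its own system equations; the sketch below only illustrates the idea with central differences on a toy damped-pendulum model (all names and numbers are ours, not from FAST).

```python
import math

def linearize(f, x0, u0, eps=1e-6):
    """Central-difference Jacobians A = df/dx, B = df/du at (x0, u0)."""
    def column(kind, j):
        hi_x, lo_x = list(x0), list(x0)
        hi_u, lo_u = list(u0), list(u0)
        if kind == "x":
            hi_x[j] += eps
            lo_x[j] -= eps
        else:
            hi_u[j] += eps
            lo_u[j] -= eps
        fp, fm = f(hi_x, hi_u), f(lo_x, lo_u)
        return [(p - q) / (2.0 * eps) for p, q in zip(fp, fm)]
    n, m = len(x0), len(u0)
    a_cols = [column("x", j) for j in range(n)]
    b_cols = [column("u", j) for j in range(m)]
    A = [[a_cols[j][i] for j in range(n)] for i in range(n)]
    B = [[b_cols[j][i] for j in range(m)] for i in range(n)]
    return A, B

# Toy "nonlinear system": damped pendulum with torque input u.
G_OVER_L, DAMPING = 9.81 / 1.5, 0.2

def pendulum(x, u):
    theta, omega = x
    return [omega, -G_OVER_L * math.sin(theta) - DAMPING * omega + u[0]]

# Linearize about the hanging equilibrium (theta = omega = 0, u = 0).
A, B = linearize(pendulum, [0.0, 0.0], [0.0])
```

About the equilibrium this recovers the familiar small-angle matrices A = [[0, 1], [-g/L, -c]] and B = [[0], [1]], which is the kind of state-space model that linear analysis tools then consume.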

  13. Towards multi-phase flow simulations in the PDE framework Peano

    KAUST Repository

    Bungartz, Hans-Joachim

    2011-07-27

    In this work, we present recent enhancements and new functionalities of our flow solver in the partial differential equation framework Peano. We start with an introduction including an overview of the Peano development and a short description of the basic concepts of Peano and the flow solver in Peano concerning the underlying structured but adaptive Cartesian grids, the data structure and data access optimisation, and spatial and time discretisation of the flow solver. The new features cover geometry interfaces and additional application functionalities. The two geometry interfaces, a triangulation-based description supported by the tool preCICE and a built-in geometry using geometry primitives such as cubes, spheres, or tetrahedra allow for the efficient treatment of complex and changing geometries, an essential ingredient for most application scenarios. The new application functionality concerns a coupled heat-flow problem and two-phase flows. We present numerical examples, performance and validation results for these new functionalities. © 2011 Springer-Verlag.

  14. On a framework for generating PoD curves assisted by numerical simulations

    Science.gov (United States)

    Subair, S. Mohamed; Agrawal, Shweta; Balasubramaniam, Krishnan; Rajagopal, Prabhu; Kumar, Anish; Rao, Purnachandra B.; Tamanna, Jayakumar

    2015-03-01

    The Probability of Detection (PoD) curve method has emerged as an important tool for the assessment of the performance of NDE techniques, a topic of particular interest to the nuclear industry, where inspection qualification is very important. The conventional experimental means of generating PoD curves, though, can be expensive, requiring large data sets (covering defects and test conditions), and equipment and operator time. Several methods of achieving faster estimates for PoD curves using physics-based modelling have been developed to address this problem. Numerical modelling techniques are also attractive, especially given the ever-increasing computational power available to scientists today. Here we develop procedures for obtaining PoD curves, assisted by numerical simulation and based on Bayesian statistics. Numerical simulations are performed using Finite Element analysis for factors that are assumed to be independent, random and normally distributed. PoD curves so generated are compared with experiments on austenitic stainless steel (SS) plates with artificially created notches. We examine issues affecting the PoD curve generation process including codes, standards, distribution of defect parameters and the choice of the noise threshold. We also study the assumption of normal distribution for signal response parameters and consider strategies for dealing with data that may be more complex or sparse to justify this. These topics are addressed and illustrated through the example case of generation of PoD curves for pulse-echo ultrasonic inspection of vertical surface-breaking cracks in SS plates.
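
A common simulation-assisted route to a PoD curve is the signal-response ("â versus a") model of MIL-HDBK-1823A: ln(â) is regressed on ln(a), and PoD(a) is the probability that the response exceeds a decision threshold. The sketch below shows that pipeline on entirely synthetic data with illustrative constants; it is not the authors' Bayesian implementation.

```python
import math
import random

def fit_line(xs, ys):
    """Ordinary least squares for y = b0 + b1*x, plus residual std sigma."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b1 = sxy / sxx
    b0 = my - b1 * mx
    resid = [y - (b0 + b1 * x) for x, y in zip(xs, ys)]
    sigma = math.sqrt(sum(r * r for r in resid) / (n - 2))
    return b0, b1, sigma

def pod(a, b0, b1, sigma, ln_threshold):
    """PoD(a) = Phi((b0 + b1*ln(a) - ln(a_th)) / sigma), Phi via erf."""
    z = (b0 + b1 * math.log(a) - ln_threshold) / sigma
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Entirely synthetic "inspection": crack depths a (mm) with a noisy
# log-linear amplitude response; detection when response exceeds a_th = 1.
rng = random.Random(0)
depths = [0.2 + 0.1 * i for i in range(30)]
ln_response = [0.5 + 1.2 * math.log(a) + rng.gauss(0.0, 0.15) for a in depths]
b0, b1, sigma = fit_line([math.log(a) for a in depths], ln_response)
a90 = math.exp((math.log(1.0) - b0 + 1.2816 * sigma) / b1)  # PoD(a90) = 0.90
```

The quantity `a90` (the defect size detected with 90% probability) is the figure usually quoted for inspection qualification; the noise threshold enters through `ln_threshold`, echoing the threshold-choice issue discussed in the abstract.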

  15. Casting Simulation Within the Framework of ICME: Coupling of Solidification, Heat Treatment, and Structural Analysis

    Science.gov (United States)

    Guo, Jianzheng; Scott, Sam; Cao, Weisheng; Köser, Ole

    2016-05-01

    Integrated computational materials engineering (ICME) is becoming a compulsory practice for developing advanced materials, re-thinking manufacturing processing, and engineering components to meet challenging design goals quickly and cost-effectively. As a key component of the ICME approach, a numerical approach is being developed for the prediction of casting microstructure, defects formation and mechanical properties from solidification to heat treatment. Because of the processing conditions and complexity of geometry, material properties of a cast part are not normally homogeneous. This variation and the potential weakening inherent in manufacturing are currently accommodated by incorporating large safety factors that counter design goals. The simulation of the different manufacturing process stages is integrated such that the resultant microstructure of the previous event is used as the initial condition of the following event, ensuring the tracking of the component history while maintaining a high level of accuracy across these manufacturing stages. This paper explains the significance of integrated analytical prediction to obtain more precise simulation results and sets out how available techniques may be applied accordingly.

  16. Research framework of integrated simulation on bilateral interaction between water cycle and socio-economic development

    Science.gov (United States)

    Hao, C. F.; Yang, X. L.; Niu, C. W.; Jia, Y. W.

    2016-08-01

    The mechanism of bilateral interaction between natural water cycle evolution and socio-economic development remains obscure in current research due to the complexity of the hydrological process and the socio-economic system. Coupling the CGE (Computable General Equilibrium) economic model with the WEP (Water and Energy transfer Processes) distributed hydrological model provides a model-based tool for research on the response and feedback between the water cycle and social development, as well as on economic prospects under the constraint of water resources. On one hand, water policies, such as water use limitation and water price adjustment under different levels of socio-economic development, can be evaluated by the CGE model as assumed conditions, and the corresponding water demand results can be fed into the WEP model to simulate the response across the whole water cycle. On the other hand, the variation in available water resources under different scenarios simulated by the WEP model may provide appropriate limits for water demand in the CGE model, and the corresponding change in economic factors can indicate the influence of water resource constraints on socio-economic development. The research is believed to be helpful for a better understanding of the bilateral interaction between water and society.
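
The bilateral exchange described above amounts to a fixed-point iteration between an economic demand response and a hydrological supply constraint. The toy loop below illustrates the shape of such a coupling; both "models" are one-line stand-ins for CGE and WEP, and every number is invented.

```python
def water_demand(price, scale=100.0, elasticity=0.4):
    """CGE stand-in: demand falls as the water price rises (toy curve)."""
    return scale * price ** (-elasticity)

def available_supply(demand):
    """WEP stand-in: heavy withdrawals reduce usable supply (toy curve)."""
    return 80.0 - 0.05 * demand

def couple(price=1.0, relax=0.5, tol=1e-6, max_iter=10000):
    """Iterate price until economic demand matches hydrological supply."""
    for _ in range(max_iter):
        d = water_demand(price)
        s = available_supply(d)
        gap = d - s                      # over-demand -> raise the price
        if abs(gap) < tol:
            return price, d, s
        price += relax * gap / s         # relaxed fixed-point update
    raise RuntimeError("coupling did not converge")

price, demand, supply = couple()
```

In the real CGE-WEP coupling each "evaluation" is a full model run, so the relaxation factor and convergence tolerance become important design choices; the structure of the exchange, however, is the same demand-in, supply-out loop.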

  17. SIMULATION FRAMEWORK FOR REGIONAL GEOLOGIC CO{sub 2} STORAGE ALONG ARCHES PROVINCE OF MIDWESTERN UNITED STATES

    Energy Technology Data Exchange (ETDEWEB)

    Sminchak, Joel

    2012-09-30

    This report presents final technical results for the project Simulation Framework for Regional Geologic CO{sub 2} Storage Infrastructure along Arches Province of the Midwest United States. The Arches Simulation project was a three-year effort designed to develop a simulation framework for regional geologic carbon dioxide (CO{sub 2}) storage infrastructure along the Arches Province through development of a geologic model and advanced reservoir simulations of large-scale CO{sub 2} storage. The project included five major technical tasks: (1) compilation of geologic, hydraulic and injection data on Mount Simon, (2) development of model framework and parameters, (3) preliminary variable density flow simulations, (4) multi-phase model runs of regional storage scenarios, and (5) implications for regional storage feasibility. The Arches Province is an informal region in northeastern Indiana, northern Kentucky, western Ohio, and southern Michigan where sedimentary rock formations form broad arch and platform structures. In the province, the Mount Simon sandstone is an appealing deep saline formation for CO{sub 2} storage because of the intersection of reservoir thickness and permeability. Many CO{sub 2} sources are located in proximity to the Arches Province, and the area is adjacent to coal-fired power plants along the Ohio River Valley corridor. Geophysical well logs, rock samples, drilling logs, and geotechnical tests were evaluated for a 500,000 km{sup 2} study area centered on the Arches Province. Hydraulic parameters and historical operational information were also compiled from Mount Simon wastewater injection wells in the region. This information was integrated into a geocellular model that depicts the parameters and conditions in a numerical array. The geologic and hydraulic data were integrated into a three-dimensional grid of porosity and permeability, which are key parameters regarding fluid flow and pressure buildup due to CO{sub 2} injection.


  19. A comparison of regional flood frequency analysis approaches in a simulation framework

    Science.gov (United States)

    Ganora, D.; Laio, F.

    2016-07-01

    Regional frequency analysis (RFA) is a well-established methodology to provide an estimate of the flood frequency curve at ungauged (or scarcely gauged) sites. Different RFA approaches exist, depending on the way the information is transferred to the site of interest, but it is not clear in the literature if a specific method systematically outperforms the others. The aim of this study is to provide a framework for carrying out the intercomparison by building a virtual environment based on synthetically generated data. The considered regional approaches include: (i) a unique regional curve for the whole region; (ii) a multiple-region model where homogeneous subregions are determined through cluster analysis; (iii) a Region-of-Influence model which defines a homogeneous subregion for each site; (iv) a spatially smooth estimation procedure where the parameters of the regional model vary continuously in space. Virtual environments are generated considering different patterns of heterogeneity, including step changes and smooth variations. If the region is heterogeneous, with the parent distribution changing continuously within the region, the spatially smooth regional approach outperforms the others, with overall errors 10-50% lower than the other methods. In the case of a step change, the spatially smooth and clustering procedures perform similarly if the heterogeneity is moderate, while clustering procedures work better when the step change is severe. To extend our findings, an extensive sensitivity analysis has been performed to investigate the effect of sample length, number of virtual stations, return period of the predicted quantile, variability of the scale parameter of the parent distribution, number of predictor variables, and different parent distributions. Overall, the spatially smooth approach appears the most robust, as its performances are more stable across different patterns of heterogeneity, especially when short records are
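
A minimal version of such a simulation experiment can be built by giving every site in a homogeneous virtual region the same dimensionless Gumbel growth curve and comparing at-site versus pooled (index-flood style) estimates of the 100-year quantile. The setup below is our toy illustration, not the paper's virtual environment.

```python
import math
import random

EULER = 0.5772156649

def gumbel_fit(sample):
    """Method-of-moments Gumbel fit: returns (location, scale)."""
    n = len(sample)
    mean = sum(sample) / n
    var = sum((x - mean) ** 2 for x in sample) / (n - 1)
    scale = math.sqrt(6.0 * var) / math.pi
    return mean - EULER * scale, scale

def gumbel_quantile(loc, scale, T):
    return loc - scale * math.log(-math.log(1.0 - 1.0 / T))

rng = random.Random(7)
T, xi, alpha = 100.0, 0.85, 0.25            # shared dimensionless growth curve
true_z = gumbel_quantile(xi, alpha, T)

err_site, err_reg = [], []
for _rep in range(200):                      # Monte Carlo replicates
    records = []
    for _site in range(15):                  # 15 sites, 20 years of record each
        mu = rng.uniform(50.0, 500.0)        # index flood differs per site
        rec = [mu * (xi - alpha * math.log(-math.log(rng.random())))
               for _year in range(20)]
        records.append((mu, rec))
    # Pool records rescaled by their sample means to estimate the growth curve.
    pooled = [x / (sum(rec) / len(rec)) for _mu, rec in records for x in rec]
    z_reg = gumbel_quantile(*gumbel_fit(pooled), T)
    for mu, rec in records:
        q_site = gumbel_quantile(*gumbel_fit(rec), T)
        q_reg = (sum(rec) / len(rec)) * z_reg
        err_site.append((q_site / (mu * true_z) - 1.0) ** 2)
        err_reg.append((q_reg / (mu * true_z) - 1.0) ** 2)

rmse_site = math.sqrt(sum(err_site) / len(err_site))
rmse_reg = math.sqrt(sum(err_reg) / len(err_reg))
```

With short records in a perfectly homogeneous region, pooling should beat at-site fitting; the interesting cases studied in the paper arise when heterogeneity is introduced and that advantage erodes.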

  20. Handling of the Generation of Primary Events in Gauss, the LHCb Simulation Framework

    CERN Multimedia

    Corti, G; Brambach, T; Brook, N H; Gauvin, N; Harrison, K; Harrison, P; He, J; Ilten, P J; Jones, C R; Lieng, M H; Manca, G; Miglioranzi, S; Robbe, P; Vagnoni, V; Whitehead, M; Wishahi, J

    2010-01-01

    The LHCb simulation application, Gauss, consists of two independent phases, the generation of the primary event and the tracking of particles produced in the experimental setup. For the LHCb experimental program it is particularly important to model B meson decays: the EvtGen code developed in CLEO and BaBar has been chosen and customized for non-coherent B production as occurring in pp collisions at the LHC. The initial proton-proton collision is provided by a different generator engine, currently Pythia 6, for massive production of signal and generic pp collision events. Beam gas events, background events originating from proton halo, cosmics and calibration events for different detectors can be generated in addition to pp collisions. Different generator packages are available in the physics community or specifically developed in LHCb, and are used for the different purposes. Running conditions affecting the events generated such as the size of the luminous region, the number of collisions occurring in a bunc...

  1. Optimising and extending the geometrical modeller of a physics simulation framework

    CERN Document Server

    Urban, P

    1998-01-01

    The design of highly complex particle detectors used in High Energy Physics involves both CAD systems and physics simulation packages like Geant4. Geant4 is able to exchange detector geometries with CAD systems, conforming to the Standard for the Exchange of Product Model Data (STEP); Boundary Representation (B-Rep) models are transferred. Particle tracking is performed in these models, requiring efficient and accurate intersection computations from the geometrical modeller. The results of extending and optimising the modeller of Geant4 form the contents of this thesis. Swept surfaces: surfaces of linear extrusion and surfaces of revolution have been implemented. The problem of classifying points on surfaces bounded by curves as being inside or outside has been solved. These tasks necessitated the extension and optimisation of code related to curves and led to a re-design of this code. Emphasis was put on efficiency and on dealing with numerical errors. The results will be integrated into the upcoming beta t...

  2. The Design of Cognitive Social Simulation Framework using Statistical Methodology in the Domain of Academic Science

    Directory of Open Access Journals (Sweden)

    R. Sivakumar

    2013-05-01

    Modeling the behavior of cognitive architectures in the context of social simulation using statistical methodologies is currently a growing research area. Normally, a cognitive architecture for an intelligent agent involves an artificial computational process that exemplifies theories of cognition in computer algorithms over a state space. More specifically, for cognitive systems with a large state space, problems such as large tables and data sparsity arise. Hence, in this paper we propose a method using a value-iterative approach based on the Q-learning algorithm, with a function approximation technique, to handle cognitive systems with large state spaces. From the experimental results in the application domain of academic science, it has been verified that the proposed approach performs better than existing approaches.
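
The value-iterative approach named above, Q-learning with linear function approximation, can be illustrated on a tiny corridor task: one-hot state-action features make the linear approximator exact here, but the same update works with any feature map that compresses a large state space. The task, features, and hyperparameters below are our toy choices, not the paper's.

```python
import random

N_STATES, ACTIONS = 5, (0, 1)          # actions: 0 = left, 1 = right
GOAL = N_STATES - 1                    # reward 1 for reaching the last state

def features(state, action):
    """One-hot (state, action) features; any feature map could be substituted."""
    phi = [0.0] * (N_STATES * len(ACTIONS))
    phi[state * len(ACTIONS) + action] = 1.0
    return phi

def q_value(w, state, action):
    return sum(wi * fi for wi, fi in zip(w, features(state, action)))

def train(episodes=300, alpha=0.1, gamma=0.9, epsilon=0.5, seed=3):
    rng = random.Random(seed)
    w = [0.0] * (N_STATES * len(ACTIONS))
    for _ in range(episodes):
        s = 0
        for _step in range(2000):               # cap episode length
            if s == GOAL:
                break
            if rng.random() < epsilon:          # epsilon-greedy exploration
                a = rng.choice(ACTIONS)
            else:
                a = max(ACTIONS, key=lambda act: q_value(w, s, act))
            s_next = max(0, s - 1) if a == 0 else s + 1
            reward = 1.0 if s_next == GOAL else 0.0
            best_next = 0.0 if s_next == GOAL else max(
                q_value(w, s_next, act) for act in ACTIONS)
            td_error = reward + gamma * best_next - q_value(w, s, a)
            w = [wi + alpha * td_error * fi
                 for wi, fi in zip(w, features(s, a))]   # gradient step on w
            s = s_next
    return w

w = train()
greedy = [max(ACTIONS, key=lambda a: q_value(w, s, a)) for s in range(N_STATES)]
```

After training, the greedy policy moves right from every non-terminal state, and the Q-values approximate the discounted returns; replacing the one-hot features with a coarser basis is what tames the "large tables" problem the abstract mentions.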

  3. A framework of motion capture system based human behaviours simulation for ergonomic analysis

    CERN Document Server

    Ma, Ruina; Bennis, Fouad; Ma, Liang

    2011-01-01

    With increasing computer capabilities, computer-aided ergonomics (CAE) offers new possibilities to integrate conventional ergonomic knowledge and to develop new methods for the work design process. As mentioned in [1], different approaches have been developed to enhance the efficiency of ergonomic evaluation. Ergonomic expert systems, ergonomics-oriented information systems, numerical models of humans, etc. have been implemented in numerical ergonomic software. To date, several ergonomic software tools are available, such as Jack, Ergoman, Delmia Human, 3DSSPP, and Santos [2-4]. The main functions of these tools are posture analysis and posture prediction. In the visualization part, Jack and 3DSSPP produce results to visualize virtual human tasks in three dimensions, but without realistic physical properties. Nowadays, with the development of computer technology, more attention is paid to the simulation of the physical world. Physics engines [5] are used more and more in the computer game (CG) field. The a...

  4. CO adsorption over Pd nanoparticles: A general framework for IR simulations on nanoparticles

    Science.gov (United States)

    Zeinalipour-Yazdi, Constantinos D.; Willock, David J.; Thomas, Liam; Wilson, Karen; Lee, Adam F.

    2016-04-01

    CO vibrational spectra over catalytic nanoparticles under high coverages/pressures are discussed from a DFT perspective. Hybrid B3LYP and PBE DFT calculations of CO chemisorbed over Pd4 and Pd13 nanoclusters, and a 1.1 nm Pd38 nanoparticle, have been performed in order to simulate the corresponding coverage dependent infrared (IR) absorption spectra, and hence provide a quantitative foundation for the interpretation of experimental IR spectra of CO over Pd nanocatalysts. B3LYP simulated IR intensities are used to quantify site occupation numbers through comparison with experimental DRIFTS spectra, allowing an atomistic model of CO surface coverage to be created. DFT adsorption energetics for low CO coverage (θ → 0) suggest the CO binding strength follows the order hollow > bridge > linear, even for dispersion-corrected functionals for sub-nanometre Pd nanoclusters. For a Pd38 nanoparticle, hollow and bridge-bound are energetically similar (hollow ≈ bridge > atop). It is well known that this ordering has not been found at the high coverages used experimentally, wherein atop CO has a much higher population than observed over Pd(111), confirmed by our DRIFTS spectra for Pd nanoparticles supported on a KIT-6 silica, and hence site populations were calculated through a comparison of DFT and spectroscopic data. At high CO coverage (θ = 1), all three adsorbed CO species co-exist on Pd38, and their interdiffusion is thermally feasible at STP. Under such high surface coverages, DFT predicts that bridge-bound CO chains are thermodynamically stable and isoenergetic to an entirely hollow-bound Pd/CO system. The Pd38 nanoparticle undergoes a linear (3.5%), isotropic expansion with increasing CO coverage, accompanied by 63 and 30 cm⁻¹ blue-shifts of hollow and linear bound CO, respectively.

  5. A discrete element and ray framework for rapid simulation of acoustical dispersion of microscale particulate agglomerations

    Science.gov (United States)

    Zohdi, T. I.

    2016-03-01

    In industry, particle-laden fluids, such as particle-functionalized inks, are constructed by adding fine-scale particles to a liquid solution, in order to achieve desired overall properties in both liquid and (cured) solid states. However, oftentimes undesirable particulate agglomerations arise due to some form of mutual-attraction stemming from near-field forces, stray electrostatic charges, process ionization and mechanical adhesion. For proper operation of industrial processes involving particle-laden fluids, it is important to carefully breakup and disperse these agglomerations. One approach is to target high-frequency acoustical pressure-pulses to breakup such agglomerations. The objective of this paper is to develop a computational model and corresponding solution algorithm to enable rapid simulation of the effect of acoustical pulses on an agglomeration composed of a collection of discrete particles. Because of the complex agglomeration microstructure, containing gaps and interfaces, this type of system is extremely difficult to mesh and simulate using continuum-based methods, such as the finite difference time domain or the finite element method. Accordingly, a computationally-amenable discrete element/discrete ray model is developed which captures the primary physical events in this process, such as the reflection and absorption of acoustical energy, and the induced forces on the particulate microstructure. The approach utilizes a staggered, iterative solution scheme to calculate the power transfer from the acoustical pulse to the particles and the subsequent changes (breakup) of the pulse due to the particles. Three-dimensional examples are provided to illustrate the approach.
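
A drastically simplified one-dimensional version of the staggered scheme can convey the idea: a sinusoidal pulse forces one particle of a bonded pair, the bond breaks when the relative displacement exceeds a limit, and the pulse amplitude is reduced as energy is absorbed. Every constant below is illustrative, and the physics is reduced far below the paper's discrete element/discrete ray model.

```python
import math

def simulate_pulse(m=1.0e-9, k_bond=0.5, break_stretch=2.0e-6,
                   f0=1.0e-3, freq=2.0e4, absorb=0.02,
                   dt=1.0e-7, n_steps=5000):
    """Toy 1-D bonded pair hit by an acoustic pulse (illustrative constants)."""
    x1, x2, v1, v2 = 0.0, 1.0e-6, 0.0, 0.0   # bonded pair, rest length 1 um
    rest = x2 - x1
    amp, bond = f0, True
    for step in range(n_steps):
        t = step * dt
        f_pulse = amp * math.sin(2.0 * math.pi * freq * t)  # pulse on particle 1
        stretch = (x2 - x1) - rest
        f_bond = k_bond * stretch if bond else 0.0          # linear bond force
        a1 = (f_pulse + f_bond) / m
        a2 = -f_bond / m
        v1 += a1 * dt
        v2 += a2 * dt
        x1 += v1 * dt
        x2 += v2 * dt
        if bond and abs(stretch) > break_stretch:
            bond = False                      # agglomerate "breaks up"
        amp *= 1.0 - absorb * dt * freq       # staggered update: pulse loses energy
    return bond, abs((x2 - x1) - rest)

bond_intact, final_gap = simulate_pulse()
```

The last line of the loop is the "staggered" part in miniature: the particle update consumes the current pulse, and the pulse is then attenuated before the next step, mirroring the iterative power-transfer calculation described in the abstract.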

  6. Computation of Two-Body Matrix Elements From the Argonne $v_{18}$ Potential

    CERN Document Server

    Mihaila, B; Mihaila, Bogdan; Heisenberg, Jochen H.

    1998-01-01

    We discuss the computation of two-body matrix elements from the Argonne $v_{18}$ interaction. The matrix elements calculation is presented both in particle-particle and in particle-hole angular momentum coupling. The procedures developed here can be applied to the case of other NN potentials, provided that they have a similar operator format.

  7. Argonne National Laboratory-East site environmental report for calendar year 1995

    Energy Technology Data Exchange (ETDEWEB)

    Golchert, N.W.; Kolzow, R.G. [Environmental Management Operation, Argonne National Lab., IL (United States)

    1996-09-01

    This report presents the site environmental report for Argonne National Laboratory-East for calendar year 1995. Topics discussed include: general description of the site, including climatology, geology, seismicity, hydrology, vegetation, endangered species, population, water and land use, and archaeology; compliance summary; environmental program information; environmental nonradiological program information; groundwater protection; and the radiological monitoring program.

  8. Applied mathematical sciences research at Argonne, April 1, 1981-March 31, 1982

    Energy Technology Data Exchange (ETDEWEB)

    Pieper, G.W. (ed.)

    1982-01-01

    This report reviews the research activities in Applied Mathematical Sciences at Argonne National Laboratory for the period April 1, 1981, through March 31, 1982. The body of the report discusses various projects carried out in three major areas of research: applied analysis, computational mathematics, and software engineering. Information on section staff, visitors, workshops, and seminars is found in the appendices.

  9. Bush will tour Illinois lab working to fight terrorism; Argonne develops chemical detectors

    CERN Multimedia

    2002-01-01

    "A chemical sensor that detects cyanide gas, a biochip that can determine the presence of anthrax, and a portable device that finds concealed nuclear materials are among the items scientists at Argonne National Laboratory are working on to combat terrorism" (1/2 page).

  10. Quality management at Argonne National Laboratory: Status, accomplishments, and lessons learned

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-06-01

    In April 1992, Argonne National Laboratory (ANL) launched the implementation of quality management (QM) as an initiative of the Laboratory Director. The goal of the program is to seek ways of improving Laboratory performance and effectiveness by drawing from the realm of experiences in the global total quality management movement. The Argonne QM initiative began with fact finding and formulating a strategy for implementation; the emphasis is that the underlying principles of QM should be an integral part of how the Laboratory is managed and operated. A primary theme that has guided the Argonne QM initiative is to consider only those practices that offer the potential for real improvement, make sense, fit the culture, and would be credible to the broad population. In October 1993, the Laboratory began to pilot a targeted set of QM activities selected to produce outcomes important to the Laboratory--strengthening the customer focus, improving work processes, enhancing employee involvement and satisfaction, and institutionalizing QM. This report describes the results of the just-concluded QM development and demonstration phase in terms of detailed strategies, accomplishments, and lessons learned. These results are offered as evidence to support the conclusion that the Argonne QM initiative has achieved value-added results and credibility and is well positioned to support future deployment across the entire Laboratory as an integrated management initiative. Recommendations for follow-on actions to implement future deployment are provided separately.

  11. Argonne National Laboratory research to help U.S. steel industry

    CERN Multimedia

    2003-01-01

    Argonne National Laboratory has joined a $1.29 million project to develop technology software that will use advanced computational fluid dynamics (CFD), a method of solving fluid flow and heat transfer problems. This technology allows engineers to evaluate and predict erosion patterns within blast furnaces (1 page).

  12. Update on intrusive characterization of mixed contact-handled transuranic waste at Argonne-West

    Energy Technology Data Exchange (ETDEWEB)

    Dwight, C.C.; Jensen, B.A.; Bryngelson, C.D.; Duncan, D.S.

    1997-02-03

Argonne National Laboratory and Lockheed Martin Idaho Technologies Company have jointly participated in the Department of Energy's (DOE) Waste Isolation Pilot Plant (WIPP) Transuranic Waste Characterization Program since 1990. Intrusive examinations have been conducted in the Waste Characterization Area, located at Argonne-West in Idaho Falls, Idaho, on over 200 drums of mixed contact-handled transuranic waste. This is double the number of drums characterized since the last update at the 1995 Waste Management Conference. These examinations have provided waste characterization information that supports performance assessment of WIPP and that supports Lockheed's compliance with the Resource Conservation and Recovery Act. Operating philosophies and corresponding regulatory permits have been broadened to provide greater flexibility and capability for waste characterization, such as the provision for minor treatments like absorption, neutralization, stabilization, and amalgamation. This paper provides an update on Argonne's intrusive characterization permits, procedures, results, and lessons learned. Other DOE sites that must deal with mixed contact-handled transuranic waste have initiated detailed planning for characterization of their own waste. The information presented herein could aid these other storage and generator sites in further development of their characterization efforts.

  13. Argonne National Laboratory annual report of Laboratory Directed Research and Development Program Activities FY 2009.

    Energy Technology Data Exchange (ETDEWEB)

    Office of the Director

    2010-04-09

I am pleased to submit Argonne National Laboratory's Annual Report on its Laboratory Directed Research and Development (LDRD) activities for fiscal year 2009. Fiscal year 2009 saw a heightened focus by DOE and the nation on the need to develop new sources of energy. Argonne scientists are investigating many different sources of energy, including nuclear, solar, and biofuels, as well as ways to store, use, and transmit energy more safely, cleanly, and efficiently. DOE selected Argonne as the site for two new Energy Frontier Research Centers (EFRCs) - the Institute for Atom-Efficient Chemical Transformations and the Center for Electrical Energy Storage - and funded two other EFRCs in which Argonne is a major partner. The award of at least two of the EFRCs can be directly linked to early LDRD-funded efforts. LDRD has historically seeded important programs and facilities at the lab. Two of these facilities, the Advanced Photon Source and the Center for Nanoscale Materials, are now vital contributors to today's LDRD Program. New and enhanced capabilities, many of which relied on LDRD in their early stages, now help the laboratory pursue its evolving strategic goals. LDRD has, since its inception, been an invaluable resource for positioning the Laboratory to anticipate, and thus be prepared to contribute to, the future science and technology needs of DOE and the nation. During times of change, LDRD becomes all the more vital for facilitating the necessary adjustments while maintaining and enhancing the capabilities of our staff and facilities. Although I am new to the role of Laboratory Director, my immediate prior service as Deputy Laboratory Director for Programs afforded me continuous involvement in the LDRD program and its management. Therefore, I can attest that Argonne's program adhered closely to the requirements of DOE Order 413.2b and associated guidelines governing LDRD. Our LDRD program management continually strives to be more efficient. In

  14. MASH: a framework for the automation of x-ray optical simulations

    Science.gov (United States)

    Sondhauss, Peter

    2014-09-01

MASH stands for "Macros for the Automation of SHadow". It allows a set of ray-tracing simulations, for a range of photon energies for example, to be run fully automatically. Undulator gaps, crystal angles, etc. are tuned automatically. Important output parameters, such as photon flux, photon irradiance, focal spot size, and bandwidth, are then provided directly as a function of photon energy. A photon energy scan is probably the most commonly requested one, but any parameter or set of parameters can be scanned through as well. Heat load calculations with finite element analysis providing temperatures, stresses, and deformations (Comsol) are fully integrated. The deformations can be fed back into the ray-tracing process simply by activating a switch. MASH tries to hide program internals such as file names, calls to pre-processors, etc., so that the user (nearly) only needs to provide the optical setup. It comes with a web interface, which allows it to be run remotely on a central computation server. Hence, no local installation or licenses are required, just a web browser and access to the local network. Numerous tools are provided for inspecting the ray-tracing results in the web browser. The results can also be downloaded for local analysis. All files are human-readable text files that can be easily imported into third-party programs for further processing. All set parameters are stored in a single human-readable file in XML format.
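The scan-automation pattern described above can be reduced to a simple loop: for each photon energy, retune the beamline parameters, run the simulation, and collect the figures of merit. The sketch below is a generic, hypothetical illustration — `trace_at_energy` stands in for a real SHADOW run, and the flux model is a placeholder, not MASH output.

```python
import math

def trace_at_energy(energy_ev):
    """Hypothetical stand-in for one ray-tracing run: 'tune' the
    crystal angle for this photon energy and return figures of merit."""
    # Si(111) Bragg angle from the Bragg condition (2d = 6.2712 A)
    wavelength_a = 12398.4 / energy_ev          # angstroms
    bragg_rad = math.asin(wavelength_a / 6.2712)
    # Toy flux model: placeholder for the real simulation output
    flux = 1e13 * math.exp(-((energy_ev - 10000.0) / 5000.0) ** 2)
    return {"energy_ev": energy_ev,
            "bragg_deg": math.degrees(bragg_rad),
            "flux": flux}

def energy_scan(start_ev, stop_ev, step_ev):
    """Run the 'simulation' at each energy, mirroring a fully
    automatic MASH-style photon energy scan."""
    results = []
    e = start_ev
    while e <= stop_ev:
        results.append(trace_at_energy(e))
        e += step_ev
    return results

scan = energy_scan(5000, 20000, 5000)
```

Any other parameter (undulator gap, mirror angle) could be scanned with the same loop; the point is that retuning and bookkeeping are automated rather than done run by run.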

  15. An expanded framework for the advanced computational testing and simulation toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Marques, Osni A.; Drummond, Leroy A.

    2003-11-09

    The Advanced Computational Testing and Simulation (ACTS) Toolkit is a set of computational tools developed primarily at DOE laboratories and is aimed at simplifying the solution of common and important computational problems. The use of the tools reduces the development time for new codes and the tools provide functionality that might not otherwise be available. This document outlines an agenda for expanding the scope of the ACTS Project based on lessons learned from current activities. Highlights of this agenda include peer-reviewed certification of new tools; finding tools to solve problems that are not currently addressed by the Toolkit; working in collaboration with other software initiatives and DOE computer facilities; expanding outreach efforts; promoting interoperability, further development of the tools; and improving functionality of the ACTS Information Center, among other tasks. The ultimate goal is to make the ACTS tools more widely used and more effective in solving DOE's and the nation's scientific problems through the creation of a reliable software infrastructure for scientific computing.

  16. Development of a Parallel Overset Grid Framework for Moving Body Simulations in OpenFOAM

    Directory of Open Access Journals (Sweden)

    Dominic Chandar

    2015-12-01

OpenFOAM is an industry-standard open-source fluid dynamics code used to solve the Navier-Stokes equations for a variety of flow situations. It is currently being used extensively by researchers to study a plethora of physical problems, ranging from fundamental fluid dynamics to complex multiphase flows. When it comes to modeling the flow surrounding moving bodies that undergo large displacements, such as ocean risers, a sinking ship, or the free flight of an insect, it is cumbersome to use a single computational grid and move the body of interest. In this work, we discuss a high-fidelity approach based on overset or overlapping grids which overcomes the necessity of using a single computational grid. The overset library is parallelized using the Message Passing Interface (MPI) and Pthreads and is linked dynamically to OpenFOAM. Computational results are presented to demonstrate the potential of this method for simulating problems with large displacements.

  17. A framework for the evaluation of turbulence closures used in mesoscale ocean large-eddy simulations

    CERN Document Server

    Graham, Jonathan Pietarila

    2012-01-01

We present a methodology to determine the best turbulence closure for an eddy-permitting ocean model: measurement of the error landscape of the closure's subgrid spectral transfers and flux. Using a high-resolution benchmark, we compare each closure's model of energy and enstrophy transfer to the actual transfer observed in the benchmark run. The error-landscape norms enable us both to make objective comparisons between the closures and to optimize each closure's free parameter for a fair comparison. We apply this method to six different closures for forced-dissipative simulations of the barotropic vorticity equation on an f-plane (the 2D Navier-Stokes equation). The hyper-viscous closure most closely reproduces the enstrophy cascade, especially at larger scales, due to the concentration of its dissipative effects at the very smallest scales. The viscous and Leith closures perform nearly as well, especially at smaller scales, where all three models were dissipative. The Smagorinsky closure dissipates enstrophy at the wr...
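The error-landscape procedure can be sketched in a few lines: scan a closure's free parameter, measure the mismatch against the benchmark transfer at each wavenumber, and pick the minimizer. The "benchmark" below is synthetic and the transfer shapes are invented for illustration — only the optimization pattern reflects the paper's method.

```python
import math

# Wavenumbers and a synthetic benchmark enstrophy transfer
# (stand-in for the high-resolution reference run)
ks = [2 ** i for i in range(1, 8)]

def benchmark_transfer(k):
    return -0.02 * k ** 2 * math.exp(-k / 40.0)

# Viscous-type closure with free parameter nu
def closure_transfer(k, nu):
    return -nu * k ** 2 * math.exp(-k / 40.0)

def error_landscape(nus):
    """L2 mismatch of the closure against the benchmark, per
    candidate parameter value: the 'error landscape'."""
    landscape = {}
    for nu in nus:
        err = sum((closure_transfer(k, nu) - benchmark_transfer(k)) ** 2
                  for k in ks)
        landscape[nu] = math.sqrt(err)
    return landscape

candidates = [0.005 * i for i in range(1, 11)]   # nu in 0.005 .. 0.05
landscape = error_landscape(candidates)
best_nu = min(landscape, key=landscape.get)       # optimized free parameter
```

With real data, each closure would be optimized this way before comparison, so that no closure is penalized merely for a badly chosen parameter.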

  18. In Situ Probes of Capture and Decomposition of Chemical Warfare Agent Simulants by Zr-Based Metal Organic Frameworks

    Energy Technology Data Exchange (ETDEWEB)

    Plonka, Anna M.; Wang, Qi; Gordon, Wesley O.; Balboa, Alex; Troya, Diego; Guo, Weiwei; Sharp, Conor H.; Senanayake, Sanjaya D.; Morris, John R.; Hill, Craig L.; Frenkel, Anatoly I.

    2017-01-18

    Zr-based metal organic frameworks (MOFs) have been recently shown to be among the fastest catalysts of nerve-agent hydrolysis in solution. We report a detailed study of the adsorption and decomposition of a nerve-agent simulant, dimethyl methylphosphonate (DMMP), on UiO-66, UiO-67, MOF-808, and NU-1000 using synchrotron-based X-ray powder diffraction, X-ray absorption, and infrared spectroscopy, which reveals key aspects of the reaction mechanism. The diffraction measurements indicate that all four MOFs adsorb DMMP (introduced at atmospheric pressures through a flow of helium or air) within the pore space. In addition, the combination of X-ray absorption and infrared spectra suggests direct coordination of DMMP to the Zr6 cores of all MOFs, which ultimately leads to decomposition to phosphonate products. These experimental probes into the mechanism of adsorption and decomposition of chemical warfare agent simulants on Zr-based MOFs open new opportunities in rational design of new and superior decontamination materials.

  19. C++QEDv2 Milestone 10: A C++/Python application-programming framework for simulating open quantum dynamics

    Science.gov (United States)

    Sandner, Raimar; Vukics, András

    2014-09-01

The v2 Milestone 10 release of C++QED is primarily a feature release, which also corrects some problems of the previous release, especially as regards the build system. The adoption of C++11 features has led to many simplifications in the codebase. A full doxygen-based API manual [1] is now provided together with updated user guides. A largely automated, versatile new test suite, directed towards both computational and physics features, allows emerging errors to be spotted quickly. The states of trajectories are now savable and recoverable with full binary precision, allowing for trajectory continuation regardless of evolution method (single/ensemble Monte Carlo wave-function or Master equation trajectory). As the main new feature, the framework now presents Python bindings to the highest-level programming interface, so that actual simulations for given composite quantum systems can now be performed from Python. Catalogue identifier: AELU_v2_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AELU_v2_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: yes No. of lines in distributed program, including test data, etc.: 492422 No. of bytes in distributed program, including test data, etc.: 8070987 Distribution format: tar.gz Programming language: C++/Python. Computer: i386-i686, x86_64. Operating system: In principle cross-platform, as yet tested only on UNIX-like systems (including Mac OS X). RAM: The framework itself takes about 60MB, which is fully shared. The additional memory taken by the program which defines the actual physical system (script) is typically less than 1MB. The memory storing the actual data scales with the system dimension for state-vector manipulations, and the square of the dimension for density-operator manipulations. This might easily be GBs, and often the memory of the machine limits the size of the simulated system. Classification: 4.3, 4.13, 6.2. External routines: Boost C

  20. Causal Mathematical Logic as a guiding framework for the prediction of "Intelligence Signals" in brain simulations

    Science.gov (United States)

    Lanzalaco, Felix; Pissanetzky, Sergio

    2013-12-01

A recent theory of physical information, based on the fundamental principles of causality and thermodynamics, has proposed that a large number of observable life and intelligence signals can be described in terms of Causal Mathematical Logic (CML), which is proposed to encode the natural principles of intelligence across any physical domain and substrate. We attempt to expound the current definition of CML, the "Action functional", as a theory in terms of its explanatory power for the current neuroscientific data we use to measure the mammalian brain's "intelligence" processes at their most general biophysical level. Brain simulation projects define their success partly in terms of the emergence of "non-explicitly programmed" complex biophysical signals, such as self-oscillation and spreading cortical waves. Here we propose to extend the causal theory to predict and guide the understanding of these more complex emergent "intelligence signals". To achieve this, we review whether causal logic is consistent with, can explain, and can predict the function of complete perceptual processes associated with intelligence. Primarily, these are defined as the range of event-related potentials (ERPs), which include their primary subcomponents: event-related desynchronization (ERD) and event-related synchronization (ERS). This approach aims for a universal and predictive logic for neurosimulation and AGI. The result of this investigation has produced a general "information engine" model from translation of the ERD and ERS. The CML algorithm, run in terms of action cost, predicts ERP signal contents and is consistent with the fundamental laws of thermodynamics. A working substrate-independent natural information logic would be a major asset. An information theory consistent with fundamental physics can be an AGI. It can also operate within genetic information space and provides a roadmap to understand the live biophysical operation of the phenotype

  1. The Planetary Accretion Shock. I. Framework for Radiation-hydrodynamical Simulations and First Results

    Science.gov (United States)

    Marleau, Gabriel-Dominique; Klahr, Hubert; Kuiper, Rolf; Mordasini, Christoph

    2017-02-01

The key aspect determining the postformation luminosity of gas giants has long been considered to be the energetics of the accretion shock at the surface of the planet. We use one-dimensional radiation-hydrodynamical simulations to study the radiative loss efficiency and to obtain postshock temperatures and pressures and thus entropies. The efficiency is defined as the fraction of the total incoming energy flux that escapes the system (roughly the Hill sphere), taking into account the energy recycling that occurs ahead of the shock in a radiative precursor. We focus in this paper on a constant equation of state (EOS) to isolate the shock physics but use constant and tabulated opacities. While robust quantitative results will have to await a self-consistent treatment including hydrogen dissociation and ionization, the results presented here show the correct qualitative behavior and can be understood from semianalytical calculations. The shock is found to be isothermal and supercritical for a range of conditions relevant to the core accretion formation scenario (CA), with Mach numbers M ≳ 3. Across the shock, the entropy decreases significantly, by a few times k_B per baryon. While nearly 100% of the incoming kinetic energy is converted to radiation locally, the efficiencies are found to be as low as roughly 40%, implying that a significant fraction of the total accretion energy is brought into the planet. However, for realistic parameter combinations in the CA scenario, we find that a nonzero fraction of the luminosity always escapes the Hill sphere. This luminosity could explain, at least in part, recent observations in the young LkCa 15 and HD 100546 systems.

  2. Argonne National Laboratory summary site environmental report for calendar year 2007.

    Energy Technology Data Exchange (ETDEWEB)

    Golchert, N. W.

    2009-05-22

This summary of Argonne National Laboratory's Site Environmental Report for calendar year 2007 was written by 20 students at Downers Grove South High School in Downers Grove, Ill. The student authors are classmates in Mr. Howard's Bio II course. Biology II is a research-based class that teaches students the process of research by showing them how the sciences apply to daily life. For the past seven years, Argonne has worked with Biology II students to create a short document summarizing the Site Environmental Report to provide the public with an easy-to-read summary of the annual 300-page technical report on the results of Argonne's on-site environmental monitoring program. The summary is made available online and given to visitors to Argonne, researchers interested in collaborating with Argonne, future employees, and many others. In addition to providing Argonne and the public with an easily understandable short summary of a large technical document, the participating students learn about professional environmental monitoring procedures, achieve a better understanding of the time and effort put into summarizing and publishing research, and gain confidence in their own abilities to express themselves in writing. The Argonne Summary Site Environmental Report fits the educational needs of 12th-grade students. Illinois State Educational Goal 12 states that a student should understand the fundamental concepts, principles, and interconnections of the life, physical, and earth/space sciences. To create this summary booklet, the students had to read and understand the larger technical report, which discusses in depth many activities and programs that have been established by Argonne to maintain a safe local environment. Creating this Summary Site Environmental Report also helps students fulfill Illinois State Learning Standard 12B5a, which requires that students be able to analyze and explain biodiversity issues, and the causes and effects of

  3. Toward an ontology framework supporting the integration of geographic information with modeling and simulation for critical infrastructure protection

    Energy Technology Data Exchange (ETDEWEB)

    Ambrosiano, John J [Los Alamos National Laboratory; Bent, Russell W [Los Alamos National Laboratory; Linger, Steve P [Los Alamos National Laboratory

    2009-01-01

Protecting the nation's infrastructure from natural disasters, inadvertent failures, or intentional attacks is a major national security concern. Gauging the fragility of infrastructure assets, and understanding how interdependencies across critical infrastructures affect their behavior, is essential to predicting and mitigating cascading failures, as well as to planning for response and recovery. Modeling and simulation (M&S) is an indispensable part of characterizing this complex system of systems and anticipating its response to disruptions. Bringing together the necessary components to perform such analyses produces a wide-ranging and coarse-grained computational workflow that must be integrated with other analysis workflow elements. There are many points in both types of workflows at which geographic information (GI) services are required. The GIS community recognizes the essential contribution of GI in this problem domain, as evidenced by past OGC initiatives. Typically such initiatives focus on the broader aspects of GI analysis workflows, leaving concepts crucial to integrating simulations within analysis workflows to that community. Our experience with large-scale modeling of interdependent critical infrastructures, and our recent participation in a DRS initiative concerning interoperability for this M&S domain, has led to high-level ontological concepts that we have begun to assemble into an architecture that spans both computational and 'world' views of the problem, and further recognizes the special requirements of simulations that go beyond common workflow ontologies. In this paper we present these ideas, and offer a high-level ontological framework that includes key geospatial concepts as special cases of a broader view.

  4. Assessment of North America photosynthetic uptake of CO2 through simulations of COS in a Lagrangian particle dispersion model framework

    Science.gov (United States)

    Chen, H.; Montzka, S. A.; Andrews, A. E.; Sweeney, C.; Jacobson, A. R.; Petron, G.; Trudeau, M.; Miller, B. R.; Karion, A.; Martin, J.; Gerbig, C.; Campbell, J.; Abu-Naser, M.; Berry, J. A.; Baker, I. T.; Nehrkorn, T.; Eluszkiewicz, J.; Tans, P. P.

    2012-12-01

Improving our understanding of terrestrial gross carbon fluxes, i.e., gross primary production (GPP) and respiration, plays a key role in evaluating feedbacks and thereby improving our ability to predict future climate. Since GPP can only be directly measured on very small scales, estimates of GPP at regional to global scales are derived only from biospheric model simulations. Recent studies suggest that carbonyl sulfide (COS) may be a useful tracer to provide constraints on GPP, based on the fact that both COS and CO2 are simultaneously taken up by plants. Here we present an assessment of GPP estimates for North America from the Simple Biosphere (SiB) model, the Carnegie-Ames-Stanford Approach (CASA) model, and the MPI-BGC model through atmospheric transport simulations of COS in a Lagrangian particle dispersion model (LPDM) framework. We evaluate the impacts of boundary condition and soil uptake on the GPP estimates we derive. This study uses measurements of COS and CO2 from the NOAA/ESRL tall tower and aircraft air sampling networks, and LPDM simulations backward in time are used to quantify the contribution from different sources to observed mole fractions. A measurement over the continent contains information about terrestrial fluxes provided the upwind, or background, concentration is known. Hence, the background state is an important part of the observed signal to be simulated. Empirical boundary curtains are built based on observations at the NOAA/ESRL marine boundary layer stations and from aircraft vertical profiles. These curtains are utilized as the lateral boundary conditions for COS and CO2 for the North American model domain. To assess the uncertainty of the background values for observations, we compare calculated background values based on the empirical curtains and two different models that identify where on the curtain the air entered the model domain: WRF-STILT and HYSPLIT-NAM12.
Furthermore, the non-GPP related COS fluxes due to anthropogenic emissions and

  5. Exploring the "what if?" in geology through a RESTful open-source framework for cloud-based simulation and analysis

    Science.gov (United States)

    Klump, Jens; Robertson, Jess

    2016-04-01

The spatial and temporal extent of geological phenomena makes experiments in geology difficult, if not entirely impossible, to conduct, and the collection of data is laborious and expensive - so expensive that most of the time we cannot test a hypothesis. The aim, in many cases, is to gather enough data to build a predictive geological model. Even in a mine, where data are abundant, a model remains incomplete because the information at the level of a blasting block is two orders of magnitude larger than the sample from a drill core, and we have to take measurement errors into account. So, what confidence can we have in a model based on sparse data, uncertainties, and measurement error? Our framework consists of two layers: (a) a ground-truth layer that contains geological models, which can be statistically based on historical operations data, and (b) a network of RESTful synthetic sensor microservices which can query the ground truth for underlying properties and report a simulated measurement to a control layer, which could be a database or LIMS, a machine learner, or a company's existing data infrastructure. Ground-truth data are generated by an implicit geological model which serves as a host for nested models of geological processes at smaller scales. Our two layers are implemented using Flask and Gunicorn (an open-source Python web application framework and server), the PyData stack (numpy, scipy, etc.) and RabbitMQ (an open-source queuing library). Sensor data are encoded using a JSON-LD version of the SensorML and Observations and Measurements standards. Containerisation of the synthetic sensors using Docker and CoreOS allows rapid and scalable deployment of large numbers of sensors, as well as sensor discovery to form a self-organized dynamic network of sensors. Real-time simulation of data sources can be used to investigate crucial questions such as the potential information gain from future sensing capabilities, or from new sampling strategies, or the
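The two-layer pattern — a ground-truth model queried by synthetic sensors that add instrument noise — can be sketched without the web and queuing machinery. Everything below is illustrative: the class names, the linear "geology", and the simplified JSON-LD-style payload are assumptions, not the framework's actual API; in the real system the sensor would be a Flask microservice answering HTTP requests.

```python
import random

class GroundTruth:
    """Layer (a): an implicit 'geological model' mapping a location
    to an underlying property value (here a smooth synthetic field)."""
    def property_at(self, x, y):
        return 2.5 + 0.1 * x - 0.05 * y   # e.g. ore grade, percent

class SyntheticSensor:
    """Layer (b): a synthetic sensor that queries the ground truth and
    reports a simulated measurement with Gaussian instrument noise."""
    def __init__(self, truth, noise_sd, rng):
        self.truth, self.noise_sd, self.rng = truth, noise_sd, rng

    def measure(self, x, y):
        value = self.truth.property_at(x, y)
        observed = value + self.rng.gauss(0.0, self.noise_sd)
        # Heavily simplified observation payload (JSON-LD-style)
        return {"type": "Observation",
                "location": [x, y],
                "result": observed}

rng = random.Random(42)                     # seeded for reproducibility
sensor = SyntheticSensor(GroundTruth(), noise_sd=0.2, rng=rng)
readings = [sensor.measure(10.0, 5.0) for _ in range(200)]
```

Because the ground truth is known exactly, experiments like "how many samples until the estimate converges?" can be answered by construction — which is precisely the "what if?" capability the framework targets.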

  6. On-lattice agent-based simulation of populations of cells within the open-source Chaste framework

    KAUST Repository

    Figueredo, G. P.

    2013-02-21

Over the years, agent-based models have been developed that combine cell division and reinforced random walks of cells on a regular lattice; reaction-diffusion equations for nutrients and growth factors; and ordinary differential equations for the subcellular networks regulating the cell cycle. When linked to a vascular layer, this multiple-scale model framework has been applied to tumour growth and therapy. Here, we report on the creation of an agent-based multi-scale environment amalgamating the characteristics of these models within a Virtual Physiological Human (VPH) Exemplar Project. This project enables reuse, integration, expansion and sharing of the model and relevant data. The agent-based and reaction-diffusion parts of the multi-scale model have been implemented and are available for download as part of the latest public release of Chaste (Cancer, Heart and Soft Tissue Environment; http://www.cs.ox.ac.uk/chaste/), part of the VPH Toolkit (http://toolkit.vph-noe.eu/). The environment functionalities are verified against the original models, in addition to extra validation of all aspects of the code. In this work, we present the details of the implementation of the agent-based environment, including the system description, the conceptual model, the development of the simulation model and the processes of verification and validation of the simulation results. We explore the potential use of the environment by presenting exemplar applications of the 'what if' scenarios that can easily be studied in the environment. These examples relate to tumour growth, cellular competition for resources and tumour responses to hypoxia (low oxygen levels). We conclude our work by summarizing the future steps for the expansion of the current system.
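The on-lattice ingredients named above — random walks of cells on a regular lattice, volume exclusion, and division — can be shown in a minimal sketch. The lattice size, move set, and division probability below are illustrative choices, not Chaste's API, and the nutrient and subcellular layers are omitted.

```python
import random

def step(occupied, size, p_divide, rng):
    """One sweep over all cells: each cell picks a random neighbour
    site on a periodic lattice; if the site is free, the cell either
    divides into it (with probability p_divide, parent stays put) or
    migrates into it. Occupied targets are blocked (volume exclusion)."""
    moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    new = set(occupied)
    for cell in list(occupied):
        dx, dy = rng.choice(moves)
        target = ((cell[0] + dx) % size, (cell[1] + dy) % size)
        if target in new:
            continue                      # site taken: no move, no division
        if rng.random() < p_divide:
            new.add(target)               # division: daughter occupies target
        else:
            new.remove(cell)              # migration: reinforced random walk
            new.add(target)
    return new

rng = random.Random(0)
cells = {(10, 10)}                        # a single founder cell
for _ in range(100):
    cells = step(cells, size=21, p_divide=0.2, rng=rng)
```

Even this toy version exhibits the crowding effect that the full model couples to nutrient diffusion: growth slows as free neighbour sites become scarce.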

  7. Numerical simulation and experimental validation of biofilm in a multi-physics framework using an SPH based method

    Science.gov (United States)

    Soleimani, Meisam; Wriggers, Peter; Rath, Henryke; Stiesch, Meike

    2016-10-01

    In this paper, a 3D computational model has been developed to investigate biofilms in a multi-physics framework using smoothed particle hydrodynamics (SPH) based on a continuum approach. Biofilm formation is a complex process in the sense that several physical phenomena are coupled and consequently different time-scales are involved. On one hand, biofilm growth is driven by biological reaction and nutrient diffusion and on the other hand, it is influenced by fluid flow causing biofilm deformation and interface erosion in the context of fluid and deformable solid interaction. The geometrical and numerical complexity arising from these phenomena poses serious complications and challenges in grid-based techniques such as finite element. Here the solution is based on SPH as one of the powerful meshless methods. SPH based computational modeling is quite new in the biological community and the method is uniquely robust in capturing the interface-related processes of biofilm formation such as erosion. The obtained results show a good agreement with experimental and published data which demonstrates that the model is capable of simulating and predicting overall spatial and temporal evolution of biofilm.
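The core SPH idea used in such models — field values at a particle are kernel-weighted sums over neighbouring particles — can be illustrated with the standard cubic-spline smoothing kernel in 1D. This is a generic textbook sketch, not the paper's biofilm code; the particle spacing and smoothing length are arbitrary.

```python
import math

def cubic_spline_w(r, h):
    """Standard 1D cubic-spline kernel with support 2h
    (normalisation sigma = 2/(3h), so the kernel integrates to 1)."""
    q = abs(r) / h
    sigma = 2.0 / (3.0 * h)
    if q < 1.0:
        return sigma * (1.0 - 1.5 * q ** 2 + 0.75 * q ** 3)
    if q < 2.0:
        return sigma * 0.25 * (2.0 - q) ** 3
    return 0.0

def sph_density(positions, masses, h):
    """SPH density summation: rho_i = sum_j m_j * W(|x_i - x_j|, h)."""
    return [sum(m * cubic_spline_w(xi - xj, h)
                for xj, m in zip(positions, masses))
            for xi in positions]

# Uniformly spaced particles of equal mass: interior density ~ m/dx
dx = 0.1
xs = [i * dx for i in range(50)]
ms = [0.1] * 50
rho = sph_density(xs, ms, h=2 * dx)
```

The same kernel machinery carries the deformation and erosion physics in the full model; because no mesh connectivity is stored, a particle eroded from the biofilm interface simply stops contributing to its former neighbours' sums.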

  8. Diagnostic studies on lithium-ion cells at Argonne National Laboratory: an overview

    Science.gov (United States)

    Abraham, Daniel P.

    2010-04-01

High-power and high-energy lithium-ion cells are being studied at Argonne National Laboratory (Argonne) as part of the U.S. Department of Energy's FreedomCAR and Vehicle Technologies (FCVT) program. Cells ranging in capacity from 1 mAh to 1 Ah, and containing a variety of electrodes and electrolytes, are examined to determine suitable material combinations that will meet and exceed the FCVT performance, cost, and safety targets. In this article, accelerated aging of 18650-type cells, and characterization of components harvested from these cells, is described. Several techniques, including electrochemical measurements, analytical electron microscopy, and x-ray spectroscopy, were used to study the various cell components. Data from these studies were used to identify the most likely contributors to property degradation and to determine the mechanisms responsible for cell capacity fade and impedance rise.

  9. Argonne National Laboratory Physics Division annual report, January--December 1996

    Energy Technology Data Exchange (ETDEWEB)

    Thayer, K.J. [ed.

    1997-08-01

The past year has seen several of the Physics Division's new research projects reach major milestones with first successful experiments and results: the atomic physics station in the Basic Energy Sciences Research Center at the Argonne Advanced Photon Source was used in first high-energy, high-brilliance x-ray studies in atomic and molecular physics; the Short Orbit Spectrometer in Hall C at the Thomas Jefferson National Accelerator Facility (TJNAF), for which the Argonne medium energy nuclear physics group was responsible, was used extensively in the first round of experiments at TJNAF; at ATLAS, several new beams of radioactive isotopes were developed and used in studies of nuclear physics and nuclear astrophysics; the new ECR ion source at ATLAS was completed, and first commissioning tests indicate excellent performance characteristics; Quantum Monte Carlo calculations of mass-8 nuclei were performed for the first time with realistic nucleon-nucleon interactions using state-of-the-art computers, including Argonne's massively parallel IBM SP. At the same time other future projects are well under way: preparations for the move of Gammasphere to ATLAS in September 1997 have progressed as planned. These new efforts are imbedded in, or flowing from, the vibrant ongoing research program described in some detail in this report: nuclear structure and reactions with heavy ions; measurements of reactions of astrophysical interest; studies of nucleon and sub-nucleon structures using leptonic probes at intermediate and high energies; atomic and molecular structure with high-energy x-rays. The experimental efforts are being complemented with efforts in theory, from QCD to nucleon-meson systems to structure and reactions of nuclei. Finally, the operation of ATLAS as a national users facility has achieved a new milestone, with 5,800 hours beam on target for experiments during the past fiscal year.

  10. Derived concentration guideline levels for Argonne National Laboratory's building 310 area.

    Energy Technology Data Exchange (ETDEWEB)

    Kamboj, S., Dr.; Yu, C., Dr. (Environmental Science Division)

    2011-08-12

    The derived concentration guideline level (DCGL) is the allowable residual radionuclide concentration that can remain in soil after remediation of the site without radiological restrictions on the use of the site. It is sometimes called the single radionuclide soil guideline or the soil cleanup criteria. This report documents the methodology, scenarios, and parameters used in the analysis to support establishing radionuclide DCGLs for Argonne National Laboratory's Building 310 area.

  11. Research in mathematics and computer science at Argonne, July 1, 1986-January 6, 1988

    Energy Technology Data Exchange (ETDEWEB)

    Pieper, G.W. (ed.)

    1988-01-01

    This report reviews the research activities in the Mathematics and Computer Science Division at Argonne National Laboratory for the period July 1, 1986, through January 6, 1988. The body of the report gives a brief look at the MCS staff and the research facilities, and discusses various projects carried out in two major areas of research: analytical and numerical methods and advanced computer systems concepts. Information on division staff, visitors, workshops, and seminars is found in the appendixes. 6 figs.

  12. Status report on the positive ion injector (PII) for ATLAS at Argonne National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Zinkann, G.P.; Added, N.; Billquist, P.; Bogaty, J.; Clifft, B.; Markovich, P.; Phillips, D.; Strickhorn, P.; Shepard, K.W.

    1991-01-01

    The Positive Ion Injector (PII) is part of the Uranium upgrade of the ATLAS accelerator at Argonne National Laboratory. This paper includes a technical discussion of the PII accelerator with its superconducting, niobium, very-low-velocity accelerating structures. It also discusses the current construction schedule of PII and reviews an upgrade of the fast-tuning system. 10 refs., 6 figs.

  13. Argonne National Laboratory Annual Report of Laboratory Directed Research and Development program activities FY 2011.

    Energy Technology Data Exchange (ETDEWEB)

    (Office of The Director)

    2012-04-25

    As a national laboratory Argonne concentrates on scientific and technological challenges that can only be addressed through a sustained, interdisciplinary focus at a national scale. Argonne's eight major initiatives, as enumerated in its strategic plan, are Hard X-ray Sciences, Leadership Computing, Materials and Molecular Design and Discovery, Energy Storage, Alternative Energy and Efficiency, Nuclear Energy, Biological and Environmental Systems, and National Security. The purposes of Argonne's Laboratory Directed Research and Development (LDRD) Program are to encourage the development of novel technical concepts, enhance the Laboratory's research and development (R and D) capabilities, and pursue its strategic goals. Projects are selected from proposals for creative and innovative R and D studies that require advance exploration before they are considered to be sufficiently developed to obtain support through normal programmatic channels. Among the aims of the projects supported by the LDRD Program are the following: establishment of engineering proof of principle, assessment of design feasibility for prospective facilities, development of instrumentation or computational methods or systems, and discoveries in fundamental science and exploratory development.

  14. Argonne National Laboratory Annual Report of Laboratory Directed Research and Development program activities FY 2010.

    Energy Technology Data Exchange (ETDEWEB)

    (Office of The Director)

    2012-04-25

    As a national laboratory Argonne concentrates on scientific and technological challenges that can only be addressed through a sustained, interdisciplinary focus at a national scale. Argonne's eight major initiatives, as enumerated in its strategic plan, are Hard X-ray Sciences, Leadership Computing, Materials and Molecular Design and Discovery, Energy Storage, Alternative Energy and Efficiency, Nuclear Energy, Biological and Environmental Systems, and National Security. The purposes of Argonne's Laboratory Directed Research and Development (LDRD) Program are to encourage the development of novel technical concepts, enhance the Laboratory's research and development (R and D) capabilities, and pursue its strategic goals. Projects are selected from proposals for creative and innovative R and D studies that require advance exploration before they are considered to be sufficiently developed to obtain support through normal programmatic channels. Among the aims of the projects supported by the LDRD Program are the following: establishment of engineering proof of principle, assessment of design feasibility for prospective facilities, development of instrumentation or computational methods or systems, and discoveries in fundamental science and exploratory development.

  15. Environment, Safety and Health Progress Assessment of the Argonne Illinois Site

    Energy Technology Data Exchange (ETDEWEB)

    1993-11-01

    This report documents the results of the US Department of Energy (DOE) Environment, Safety and Health (ES&H) Progress Assessment of the Argonne Illinois Site (AIS), near Chicago, Illinois, conducted from October 25 through November 9, 1993. During the Progress Assessment, activities included a selective review of the ES&H management systems and programs with principal focus on the DOE Office of Energy Research (ER); CH, which includes the Argonne Area Office; the University of Chicago; and the contractor's organization responsible for operation of Argonne National Laboratory (ANL). The ES&H Progress Assessments are part of DOE's continuing effort to institutionalize line management accountability and the self-assessment process throughout DOE and its contractor organizations. The purpose of the AIS ES&H Progress Assessment was to provide the Secretary of Energy, senior DOE managers, and contractor management with concise, independent information on the following: change in culture and attitude related to ES&H activities; progress and effectiveness of the ES&H corrective actions resulting from the previous Tiger Team Assessment; adequacy and effectiveness of the ES&H self-assessment process of the DOE line organizations, the site management, and the operating contractor; and effectiveness of DOE and contractor management structures, resources, and systems to effectively address ES&H problems and new ES&H initiatives.

  16. Argonne National Laboratory Annual Report of Laboratory Directed Research and Development Program Activities for FY 1994

    Energy Technology Data Exchange (ETDEWEB)

    None

    1995-02-25

    The purposes of Argonne's Laboratory Directed Research and Development (LDRD) Program are to encourage the development of novel concepts, enhance the Laboratory's R and D capabilities, and further the development of its strategic initiatives. Projects are selected from proposals for creative and innovative R and D studies which are not yet eligible for timely support through normal programmatic channels. Among the aims of the projects supported by the Program are establishment of engineering proof-of-principle; assessment of design feasibility for prospective facilities; development of an instrumental prototype, method, or system; or discovery in fundamental science. Several of these projects are closely associated with major strategic thrusts of the Laboratory as described in Argonne's Five-Year Institutional Plan, although the scientific implications of the achieved results extend well beyond Laboratory plans and objectives. The projects supported by the Program are distributed across the major programmatic areas at Argonne as indicated in the Laboratory's LDRD Plan for FY 1994. Project summaries of research in the following areas are included: (1) Advanced Accelerator and Detector Technology; (2) X-ray Techniques for Research in Biological and Physical Science; (3) Nuclear Technology; (4) Materials Science and Technology; (5) Computational Science and Technology; (6) Biological Sciences; (7) Environmental Sciences; (8) Environmental Control and Waste Management Technology; and (9) Novel Concepts in Other Areas.

  17. Leidos Biomed Teams with NCI, DOE, and Argonne National Lab to Support National X-Ray Resource | Poster

    Science.gov (United States)

    Scientists are making progress in understanding a bleeding disorder caused by prescription drug interactions, thanks to a high-tech research facility involving two federal national laboratories, Argonne and Frederick.

  18. Argonne National Laboratory High Energy Physics Division semiannual report of research activities, January 1, 1989--June 30, 1989

    Energy Technology Data Exchange (ETDEWEB)

    1989-01-01

    This paper discusses the following areas of High Energy Physics research at Argonne National Laboratory: the experimental program; the theory program; experimental facilities research; accelerator research and development; and SSC detector research and development.

  19. Argonne National Laboratory`s photo-oxidation organic mixed waste treatment system - installation and startup testing

    Energy Technology Data Exchange (ETDEWEB)

    Shearer, T.L.; Nelson, R.A.; Torres, T.; Conner, C.; Wygmans, D.

    1997-09-01

    This paper describes the installation and startup testing of the Argonne National Laboratory (ANL-E) Photo-Oxidation Organic Mixed Waste Treatment System. This system will treat organic mixed (i.e., radioactive and hazardous) waste by oxidizing the organics to carbon dioxide and inorganic salts in an aqueous media. The residue will be treated in the existing radwaste evaporators. The system is installed in the Waste Management Facility at the ANL-E site in Argonne, Illinois. 1 fig.

  20. Towards a framework for teaching about information technology risk in health care: Simulating threats to health data and patient safety

    OpenAIRE

    2015-01-01

    In this paper the author describes work towards developing an integrative framework for educating health information technology professionals about technology risk. The framework considers multiple sources of risk to health data quality and integrity that can result from the use of health information technology (HIT) and can be used to teach health professional students about these risks when using health technologies. This framework encompasses issues and problems that may arise from varied ...

  1. Development and analysis of a meteorological database, Argonne National Laboratory, Illinois

    Science.gov (United States)

    Over, Thomas M.; Price, Thomas H.; Ishii, Audrey

    2010-01-01

    A database of hourly values of air temperature, dewpoint temperature, wind speed, and solar radiation from January 1, 1948, to September 30, 2003, primarily using data collected at the Argonne National Laboratory station, was developed for use in continuous-time hydrologic modeling in northeastern Illinois. Missing and apparently erroneous data values were replaced with adjusted values from nearby stations used as 'backup'. Temporal variations in the statistical properties of the data resulting from changes in measurement and data-storage methodologies were adjusted to match the statistical properties resulting from the data-collection procedures that have been in place since January 1, 1989. The adjustments were computed based on the regressions between the primary data series from Argonne National Laboratory and the backup series using data obtained during common periods; the statistical properties of the regressions were used to assign estimated standard errors to values that were adjusted or filled from other series. Each hourly value was assigned a corresponding data-source flag that indicates the source of the value and its transformations. An analysis of the data-source flags indicates that all the series in the database except dewpoint have a similar fraction of Argonne National Laboratory data, with about 89 percent for the entire period, about 86 percent from 1949 through 1988, and about 98 percent from 1989 through 2003. The dewpoint series, for which observations at Argonne National Laboratory did not begin until 1958, has only about 71 percent Argonne National Laboratory data for the entire period, about 63 percent from 1948 through 1988, and about 93 percent from 1989 through 2003, indicating a lower reliability of the dewpoint sensor. A basic statistical analysis of the filled and adjusted data series in the database, and a series of potential evapotranspiration computed from them using the computer program LXPET (Lamoreux Potential
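
The backup-station adjustment described above regresses the primary series on a backup series over their common period, fills gaps from the fitted regression, attaches an estimated standard error to filled values, and records a data-source flag for each hour. The following is a hypothetical sketch of that idea, not the USGS procedure itself; the function names, the simple linear model, and the flag convention are assumptions.

```python
import numpy as np

def fit_backup_regression(primary, backup):
    """Fit primary ~ a + b*backup on hours where both series are observed."""
    mask = ~np.isnan(primary) & ~np.isnan(backup)
    b, a = np.polyfit(backup[mask], primary[mask], 1)  # highest degree first
    resid = primary[mask] - (a + b * backup[mask])
    se = resid.std(ddof=2)  # standard error assigned to filled values
    return a, b, se

def fill_gaps(primary, backup):
    """Fill missing hours of the primary series from the backup regression."""
    a, b, se = fit_backup_regression(primary, backup)
    filled = primary.copy()
    flags = np.zeros(primary.size, dtype=int)  # 0 = primary obs., 1 = filled
    gaps = np.isnan(primary) & ~np.isnan(backup)
    filled[gaps] = a + b * backup[gaps]
    flags[gaps] = 1
    return filled, flags, se
```

The flag array plays the role of the data-source flags in the database, letting later analyses compute the fraction of primary versus backup data per series.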

  2. GSim: GPU-accelerated software process simulation framework

    Institute of Scientific and Technical Information of China (English)

    张备; 翟健; 杨秋松

    2012-01-01

    A graphic processing unit (GPU)-accelerated simulation framework is presented to improve the efficiency of software process simulation. Within the framework, a process model is described with a graphical language annotated with stochastic parameters and is then translated into RansomSpec bytecode, an internal form amenable to running on a GPU platform, so that simulation performance can benefit from the GPU's high concurrency. Experimental results show that, with this framework, GPU-enabled stochastic software process simulation runs one order of magnitude faster than traditional serial CPU algorithms.
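
The abstract does not describe GSim's internals beyond the bytecode translation, but the reported speedup comes from running many stochastic trials as data-parallel operations. A minimal CPU-side sketch of that idea, using NumPy array operations as a stand-in for GPU concurrency (the process phases and distribution parameters are invented for illustration):

```python
import numpy as np

def simulate_process(n_trials, rng):
    """Run n_trials stochastic process simulations at once as array ops.

    Each trial models three sequential activities whose durations (in days)
    are drawn from normal distributions; parameters are illustrative only.
    """
    design = rng.normal(10.0, 2.0, n_trials)   # design phase
    coding = rng.normal(20.0, 4.0, n_trials)   # coding phase
    testing = rng.normal(8.0, 1.5, n_trials)   # testing phase
    return design + coding + testing           # total duration per trial

rng = np.random.default_rng(42)
durations = simulate_process(100_000, rng)
```

On a GPU, each trial maps to a thread; the array formulation above is the same computation expressed so that all trials advance in lockstep rather than in a serial loop.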

  3. General Modeling and Simulation Framework of Data Link System

    Institute of Scientific and Technical Information of China (English)

    王文政; 曹琦

    2016-01-01

    In order to provide a general tool for research on the analysis, evaluation, and training of data link systems, a modeling and simulation framework is proposed based on an analysis of the modeling and simulation needs of such systems. The framework aims to improve the reuse of data link system models, to meet the needs of different levels of data link system simulation, and to enhance the scalability of the simulation system. First, a component-based modeling framework for data link systems is presented, the logical structure of its models is analyzed, and the modeling process is described. Then, a data link system simulation framework is designed on the basis of the modeling framework, and its structure and application process are described in detail.

  4. Simulation Research Framework with Embedded Intelligent Algorithms for Analysis of Multi-Target, Multi-Sensor, High-Cluttered Environments

    Science.gov (United States)

    Hanlon, Nicholas P.

    nearly identical performance metrics at orders of magnitude faster in execution. Second, a fuzzy inference system is presented that alleviates air traffic controllers from information overload by utilizing flight plan data and radar/GPS correlation values to highlight aircraft that deviate from their intended routes. Third, a genetic algorithm optimizes sensor placement that is robust and capable of handling unexpected routes in the environment. Fourth, a fuzzy CUSUM algorithm more accurately detects and corrects aircraft mode changes. Finally, all the work is packaged in a holistic simulation research framework that provides evaluation and analysis of various multi-sensor, multi-target scenarios.
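
The fuzzy CUSUM detector itself is not specified in this excerpt; as background, a conventional one-sided CUSUM mean-shift detector, which fuzzy variants extend, can be sketched as follows. The slack `k` and threshold `h` values are illustrative assumptions, not parameters from the dissertation.

```python
import numpy as np

def cusum_detect(x, target, k=0.5, h=5.0):
    """Return the index of the first detected upward mean shift, or None.

    Accumulates deviations above the target mean minus a slack k; an alarm
    is raised once the cumulative sum exceeds the threshold h.
    """
    s = 0.0
    for i, xi in enumerate(x):
        s = max(0.0, s + (xi - target - k))  # reset to 0 when evidence fades
        if s > h:
            return i
    return None
```

In the aircraft-tracking setting above, the monitored statistic would be a residual between observed and predicted motion, and an alarm corresponds to a detected mode change.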

  5. Argonne's Laboratory Computing Resource Center 2009 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R. B. (CLS-CI)

    2011-05-13

    Now in its seventh year of operation, the Laboratory Computing Resource Center (LCRC) continues to be an integral component of science and engineering research at Argonne, supporting a diverse portfolio of projects for the U.S. Department of Energy and other sponsors. The LCRC's ongoing mission is to enable and promote computational science and engineering across the Laboratory, primarily by operating computing facilities and supporting high-performance computing application use and development. This report describes scientific activities carried out with LCRC resources in 2009 and the broad impact on programs across the Laboratory. The LCRC computing facility, Jazz, is available to the entire Laboratory community. In addition, the LCRC staff provides training in high-performance computing and guidance on application usage, code porting, and algorithm development. All Argonne personnel and collaborators are encouraged to take advantage of this computing resource and to provide input into the vision and plans for computing and computational analysis at Argonne. The LCRC Allocations Committee makes decisions on individual project allocations for Jazz. Committee members are appointed by the Associate Laboratory Directors and span a range of computational disciplines. The 350-node LCRC cluster, Jazz, began production service in April 2003 and has been a research workhorse ever since. Hosting a wealth of software tools and applications and achieving high availability year after year, Jazz has enabled researchers to meet project milestones and achieve breakthroughs. Over the years, many projects have achieved results that would have been unobtainable without such a computing resource. In fiscal year 2009, there were 49 active projects representing a wide cross-section of Laboratory research and almost all research divisions.

  6. Argonne CW Linac (ACWL) -- Legacy from SDI and opportunities for the future

    Energy Technology Data Exchange (ETDEWEB)

    McMichael, G.E.; Yule, T.J.

    1994-08-01

    The former Strategic Defense Initiative Organization (SDIO) invested significant resources over a 6-year period to develop and build an accelerator to demonstrate the launching of a cw beam with characteristics suitable for a space-based Neutral Particle Beam (NPB) system. This accelerator, the CWDD (Continuous Wave Deuterium Demonstrator) accelerator, was designed to accelerate 80 mA cw of D⁻ to 7.5 MeV. A considerable amount of hardware was constructed and installed in the Argonne-based facility, and major performance milestones were achieved before program funding from the Department of Defense ended in October 1993. Existing assets have been turned over to Argonne. Assets include a fully functional 200 kV cw D⁻ injector, a cw RFQ that has been tuned, leak checked, and aligned, beam lines and a high-power beam stop, all installed in a shielded vault with appropriate safety and interlock systems. In addition, there are two high-power (1 MW) cw rf amplifiers and all the ancillary power, cooling, and control systems required for a high-power accelerator system. The SDI mission required that the CWDD accelerator structures operate at cryogenic temperatures (26 K), a requirement that placed severe limitations on the operating period (CWDD would have provided 20 seconds of cw beam every 90 minutes). However, the accelerator structures were designed for full-power rf operation with water cooling, and ACWL (Argonne Continuous Wave Linac), the new name for CWDD in its water-cooled, positive-ion configuration, will be able to operate continuously. Project status and achievements will be reviewed. Preliminary design of a proton conversion for the RFQ, and other proposals for turning ACWL into a testbed for cw-linac engineering, will be discussed.

  7. Status of the Argonne heavy-ion-fusion low-beta linac

    Energy Technology Data Exchange (ETDEWEB)

    Watson, J.M.; Bogaty, J.M.; Moretti, A.; Sacks, R.A.; Sesol, N.Q.; Wright, A.J.

    1981-01-01

    The primary goal of the experimental program in heavy-ion fusion (HIF) at Argonne National Laboratory (ANL) during the next few years is to demonstrate many of the requirements of a RF linac driver for inertial-fusion power plants. So far, most of the construction effort has been applied to the front end. The ANL program has developed a high-intensity xenon source, a 1.5-MV preaccelerator, and the initial cavities of the low-beta linac. The design, initial tests, and status of the low-beta linac are described.

  8. Status of the Argonne heavy ion fusion low-beta linac

    Energy Technology Data Exchange (ETDEWEB)

    Watson, J.M.; Bogaty, J.M.; Moretti, A.; Sacks, R.A.; Sesol, N.Q.; Wright, A.J.

    1981-06-01

    The primary goal of the experimental program in heavy ion fusion (HIF) at Argonne National Laboratory (ANL) during the next few years is to demonstrate many of the requirements of a RF linac driver for inertial fusion power plants. So far, most of the construction effort has been applied to the front end. The ANL program has developed a high intensity xenon source, a 1.5 MV preaccelerator, and the initial cavities of the low-beta linac. The design, initial tests and status of the low-beta linac are described. 8 refs.

  9. Ionomer-like structures and π-cation interactions in Argonne Premium coals

    Energy Technology Data Exchange (ETDEWEB)

    Opaprakasit, P.; Scaroni, A.W.; Painter, P.C. [Pennsylvania State University, University Park, PA (United States). Energy Institute

    2002-06-01

    The increase in the amount of pyridine-soluble material obtained from Argonne Premium coals after acid treatment is examined. The amount of pyridine-soluble material in most of the coals increases significantly with acid treatment. In low and, to some extent, medium rank coals this is largely a result of the presence of ionic clusters formed by carboxylate groups. In higher rank coals we are proposing that π-cation interactions play a major role. These ion/coal interactions are of sufficient strength to act as 'reversible' cross-links, in the same way as ionic clusters behave in ionomers. 26 refs., 14 figs., 3 tabs.

  10. Research in mathematics and computer science at Argonne, September 1989--February 1991

    Energy Technology Data Exchange (ETDEWEB)

    Pieper, G.W.

    1991-03-01

    This report reviews the research activities in the Mathematics and Computer Science Division at Argonne National Laboratory for the period September 1989 through February 1991. The body of the report gives a brief look at the MCS staff and the research facilities and then discusses the diverse research projects carried out in the division. Projects funded by non-DOE sources are also discussed, and new technology transfer activities are described. Further information on staff, visitors, workshops, and seminars is found in the appendixes.

  11. Survey of biomedical and environmental data bases, models, and integrated computer systems at Argonne National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Murarka, I.P.; Bodeau, D.J.; Scott, J.M.; Huebner, R.H.

    1978-08-01

    This document contains an inventory (index) of information resources pertaining to biomedical and environmental projects at Argonne National Laboratory--the information resources include a data base, model, or integrated computer system. Entries are categorized as models, numeric data bases, bibliographic data bases, or integrated hardware/software systems. Descriptions of the Information Coordination Focal Point (ICFP) program, the system for compiling this inventory, and the plans for continuing and expanding it are given, and suggestions for utilizing the services of the ICFP are outlined.

  12. The big and little of fifty years of Moessbauer spectroscopy at Argonne.

    Energy Technology Data Exchange (ETDEWEB)

    Westfall, C.

    2005-09-20

    Using radioactive materials obtained by chance, a turntable employing gears from Heidelberg's mechanical toy shops, and other minimal equipment available in post World War II Germany, in 1959 Rudolf Moessbauer confirmed his suspicion that his graduate research had yielded ground-breaking results. He published his conclusion: an atomic nucleus in a crystal undergoes negligible recoil when it emits a low energy gamma ray and provides the entire energy to the gamma ray. In the beginning Moessbauer's news might have been dismissed. As Argonne nuclear physicist Gilbert Perlow noted: "Everybody knew that nuclei were supposed to recoil when emitting gamma rays--people made those measurements every day". If any such effect existed, why had no one noticed it before? The notion that some nuclei would not recoil was "completely crazy", in the words of the eminent University of Illinois condensed matter physicist Frederich Seitz. Intrigued, however, nuclear physicists as well as condensed matter (or solid state) physicists in various locations--but particularly at the Atomic Energy Research Establishment at Harwell in Britain and at Argonne and Los Alamos in the U.S.--found themselves pondering the Moessbauer spectra with their nuclear and solid state properties starting in late 1959. After an exciting year during which Moessbauer's ideas were confirmed and extended, the physics community concluded that Moessbauer was right. Moessbauer won the Nobel Prize for his work in 1961. In the 1960s and 1970s Argonne physicists produced an increasingly clear picture of the properties of matter using the spectroscopy ushered in by Moessbauer. The scale of this traditional Moessbauer spectroscopy, which required a radioactive source and other simple equipment, began quite modestly by Argonne standards. For example Argonne hosted traditional Moessbauer spectroscopy research using mostly existing equipment in the early days and

  13. The SOPHY Framework

    DEFF Research Database (Denmark)

    Laursen, Karl Kaas; Pedersen, Martin Fejrskov; Bendtsen, Jan Dimon;

    The goal of the SOPHY framework (Simulation, Observation and Planning in Hybrid Systems) is to implement a multi-level framework for description, simulation, observation, fault detection and recovery, diagnosis, and autonomous planning in distributed embedded hybrid systems. A Java-based distributed...

  14. A new framework for quantifying uncertainties in modelling studies for future climates - how more certain are CMIP5 precipitation and temperature simulations compared to CMIP3?

    Science.gov (United States)

    Sharma, A.; Woldemeskel, F. M.; Sivakumar, B.; Mehrotra, R.

    2014-12-01

    We outline a new framework for assessing uncertainties in model simulations, be they hydro-ecological simulations for known scenarios or climate simulations for assumed scenarios representing the future. This framework is illustrated here using GCM projections of future climates for hydrologically relevant variables (precipitation and temperature), with the uncertainty segregated into three dominant components: model uncertainty, scenario uncertainty (representing greenhouse gas emission scenarios), and ensemble uncertainty (representing uncertain initial conditions and states). A novel uncertainty metric, the Square Root Error Variance (SREV), is used to quantify the uncertainties involved. The SREV requires: (1) interpolating raw and corrected GCM outputs to a common grid; (2) converting these to percentiles; (3) estimating SREV for model, scenario, initial-condition, and total uncertainty at each percentile; and (4) transforming SREV to a time series. The outcome is a spatially varying series of SREVs associated with each model that can be used to assess how uncertain the system is at each simulated point or time. This framework, while illustrated in a climate change context, is applicable to the assessment of uncertainties in any modelling framework. The proposed method is applied to monthly precipitation and temperature from 6 CMIP3 and 13 CMIP5 GCMs across the world. For CMIP3, the B1, A1B, and A2 scenarios are considered, whereas for CMIP5, RCP2.6, RCP4.5, and RCP8.5, representing low, medium, and high emissions, are considered. For both CMIP3 and CMIP5, model structure is the largest source of uncertainty, which is reduced significantly after correcting for biases. Scenario uncertainty increases in future, especially for temperature, owing to the divergence of the three emission scenarios analysed. While CMIP5 precipitation simulations exhibit a small reduction in total uncertainty over CMIP3, almost no reduction is observed for temperature projections.
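
The SREV procedure amounts to decomposing the spread of an ensemble of simulations into model, scenario, and initial-condition components. The paper's exact formulation is not reproduced in this abstract; the following sketch illustrates one plausible root-mean-square decomposition over a synthetic (model, scenario, ensemble, time) array, with all function and variable names being assumptions.

```python
import numpy as np

def srev_components(sims):
    """Decompose spread of sims, shaped (models, scenarios, ensembles, time).

    Returns per-time-step root-mean-square deviations attributable to
    model choice, scenario choice, and ensemble (initial-condition) spread.
    """
    grand = sims.mean(axis=(0, 1, 2))        # mean over everything, per time
    model_mean = sims.mean(axis=(1, 2))      # (models, time)
    scen_mean = sims.mean(axis=(0, 2))       # (scenarios, time)
    model_srev = np.sqrt(((model_mean - grand) ** 2).mean(axis=0))
    scen_srev = np.sqrt(((scen_mean - grand) ** 2).mean(axis=0))
    ens_dev = sims - sims.mean(axis=2, keepdims=True)
    ens_srev = np.sqrt((ens_dev ** 2).mean(axis=(0, 1, 2)))
    return model_srev, scen_srev, ens_srev
```

Applied to bias-corrected percentile series rather than raw values, a decomposition of this kind yields the spatially varying uncertainty series the abstract describes.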

  15. Past and Future Work on Radiobiology Mega-Studies: A Case Study At Argonne National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Haley, Benjamin; Wang, Qiong; Wanzer, Beau; Vogt, Stefan; Finney, Lydia; Yang, Ping Liu; Paunesku, Tatjana; Woloschak, Gayle

    2011-09-06

    Between 1952 and 1992, more than 200 large radiobiology studies were conducted in research institutes throughout Europe, North America, and Japan to determine the effects of external irradiation and internal emitters on the lifespan and tissue toxicity development in animals. At Argonne National Laboratory, 22 external beam studies were conducted on nearly 700 beagle dogs and 50,000 mice between 1969 and 1992. These studies helped to characterize the effects of neutron and gamma irradiation on lifespan, tumorigenesis, and mutagenesis across a range of doses and dosing patterns. The records and tissues collected at Argonne during that time period have been carefully preserved and redisseminated. Using these archived data, ongoing statistical work has been done and continues to characterize quality of radiation, dose, dose rate, tissue, and gender-specific differences in the radiation responses of exposed animals. The ongoing application of newly-developed molecular biology techniques to the archived tissues has revealed gene-specific mutation rates following exposure to ionizing irradiation. The original and ongoing work with this tissue archive is presented here as a case study of a more general trend in the radiobiology megastudies. These experiments helped form the modern understanding of radiation responses in animals and continue to inform development of new radiation models. Recent archival efforts have facilitated open access to the data and materials produced by these studies, and so a unique opportunity exists to expand this continued research.

  16. An in-house alternative to traditional SDI services at Argonne National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Noel, R.E.; Dominiak, R.R.

    1997-02-20

    Selective Dissemination of Information (SDI) services are based on automated, well-defined programs that regularly produce precise, relevant bibliographic information. Librarians have typically turned to information vendors such as Dialog or STN International to design and implement these searches for their users in business, academia, and the science community. Because Argonne National Laboratory (ANL) purchases the Institute for Scientific Information (ISI) Current Contents tapes (all subject areas excluding Humanities), ANL scientists enjoy the benefit of in-house developments with BASISplus software programming and no longer need to turn to outside companies for reliable SDI service. The database and its customized services are known as ACCESS (Argonne Current Contents Electronic Search Service). Through collaboration with librarians on Boolean logic and selection of terms, users can now design their own personal profiles to comb the new data, thereby avoiding service fees from outside providers. Based on feedback from scientists, it seems that this new service can help transform the ANL distributed libraries into more efficient, centrally functioning entities that better serve their users. One goal is to eliminate the routing of paper copies of many new journal issues to different library locations for users to browse; instead, users may be expected to rely more on electronic dissemination of both tables of contents and customized SDIs for new scientific and technical information.

  17. The big and little of fifty years of Moessbauer spectroscopy at Argonne.

    Energy Technology Data Exchange (ETDEWEB)

    Westfall, C.

    2005-09-20

    Using radioactive materials obtained by chance, a turntable employing gears from Heidelberg's mechanical toy shops, and other minimal equipment available in post-World War II Germany, in 1959 Rudolf Moessbauer confirmed his suspicion that his graduate research had yielded ground-breaking results. He published his conclusion: an atomic nucleus in a crystal undergoes negligible recoil when it emits a low-energy gamma ray and provides the entire energy to the gamma ray. In the beginning Moessbauer's news might have been dismissed. As Argonne nuclear physicist Gilbert Perlow noted: "Everybody knew that nuclei were supposed to recoil when emitting gamma rays--people made those measurements every day". If any such effect existed, why had no one noticed it before? The notion that some nuclei would not recoil was "completely crazy", in the words of the eminent University of Illinois condensed-matter physicist Frederick Seitz. Intrigued, however, nuclear physicists as well as condensed-matter (or solid-state) physicists in various locations--but particularly at the Atomic Energy Research Establishment at Harwell in Britain and at Argonne and Los Alamos in the U.S.--found themselves pondering the Moessbauer spectra, with their nuclear and solid-state properties, starting in late 1959. After an exciting year during which Moessbauer's ideas were confirmed and extended, the physics community concluded that Moessbauer was right. Moessbauer won the Nobel Prize for his work in 1961. In the 1960s and 1970s Argonne physicists produced an increasingly clear picture of the properties of matter using the spectroscopy ushered in by Moessbauer. The scale of this traditional Moessbauer spectroscopy, which required a radioactive source and other simple equipment, began quite modestly by Argonne standards. For example, Argonne hosted traditional Moessbauer spectroscopy research using mostly existing equipment in the early days and

  18. Special Report on "Allegations of Conflict of Interest Regarding Licensing of PROTECT by Argonne National Laboratory"

    Energy Technology Data Exchange (ETDEWEB)

    None

    2009-08-01

    In February 2009, the Office of Inspector General received a letter from Congressman Mark Steven Kirk of Illinois, which included constituent allegations that an exclusive technology licensing agreement by Argonne National Laboratory was tainted by inadequate competition, conflicts of interest, and other improprieties. The technology in question was for the Program for Response Options and Technology Enhancements for Chemical/Biological Terrorism, commonly referred to as PROTECT. Because of the importance of the Department of Energy's technology transfer program, especially as implementation of the American Recovery and Reinvestment Act matures, we reviewed selected aspects of the licensing process for PROTECT to determine whether the allegations had merit. In summary, under the facts developed during our review, it was understandable that interested parties concluded that there was a conflict of interest in this matter and that Argonne may have provided the successful licensee with an unfair advantage. In part, this was consistent with aspects of the complaint from Congressman Kirk's constituent.

  19. The SOPHY framework

    DEFF Research Database (Denmark)

    Laursen, Karl Kaas; Pedersen, M. F.; Bendtsen, Jan Dimon

    2005-01-01

    The goal of the Sophy framework (Simulation, Observation and Planning in Hybrid Systems) is to implement a multi-level framework for description, simulation, observation, fault detection and recovery, diagnosis and autonomous planning in distributed embedded hybrid systems. A Java-based distributed, hybrid simulator is implemented to demonstrate the virtues of Sophy. The simulator is set up using subsystem models described in human-readable XML combined with a composition structure allowing virtual interconnection of subsystems in a simulation scenario. The performance of the simulator has shown...

  20. Scenario Based Education as a Framework for Understanding Students Engagement and Learning in a Project Management Simulation Game

    Science.gov (United States)

    Misfeldt, Morten

    2015-01-01

    In this paper I describe how students use a project management simulation game based on an attack-defense mechanism where two teams of players compete by challenging each other's projects. The project management simulation game is intended to be played by pre-service construction workers and engineers. The gameplay has two parts: a planning part,…

  1. Scenario Based Education as a Framework for Understanding Students Engagement and Learning in a Project Management Simulation Game

    DEFF Research Database (Denmark)

    Misfeldt, Morten

    2015-01-01

    In this paper I describe how students use a project management simulation game based on an attack-defense mechanism where two teams of players compete by challenging each other's projects. The project management simulation game is intended to be played by pre-service construction workers and e...

  2. Gas Warfare in World War I. The Use of Gas in the Meuse-Argonne Campaign, September-November 1918

    Science.gov (United States)

    1958-12-01

    BRIEULLES-sur-MEUSE-CUNEL, the Vth Corps the heights in BOIS de GESNES, BOIS de MONCY and the PETIT BOIS, and the Ist Corps the FORET D'ARGONNE to include...use in Le Petit Bois, Bois de Gesnes, Bois de Moncy, and the Argonne that night and the next day. The next evening the Aire Gpg retorted that none of...October, the left and center corps made slight gains, reaching Apremont, Exermont, and Gesnes, but the right corps, "hampered by the German flanking

  3. The Development of Dynamic Human Reliability Analysis Simulations for Inclusion in Risk Informed Safety Margin Characterization Frameworks

    Energy Technology Data Exchange (ETDEWEB)

    Jeffrey C. Joe; Diego Mandelli; Ronald L. Boring; Curtis L. Smith; Rachel B. Shirley

    2015-07-01

    The United States Department of Energy is sponsoring the Light Water Reactor Sustainability program, which has the overall objective of supporting the near-term and extended operation of commercial nuclear power plants. One key research and development (R&D) area in this program is the Risk-Informed Safety Margin Characterization pathway, which combines probabilistic risk simulation with thermal-hydraulic simulation codes to define and manage safety margins. The R&D efforts to date, however, have not included robust simulations of human operators, or of how the reliability of human performance or lack thereof (i.e., human errors) can affect risk margins and plant performance. This paper describes current and planned research efforts to address the absence of robust human reliability simulations and thereby increase the fidelity of simulated accident scenarios.

  4. FACET: an object-oriented software framework for modeling complex social behavior patterns

    Energy Technology Data Exchange (ETDEWEB)

    Dolph, J. E.; Christiansen, J. H.; Sydelko, P. J.

    2000-06-30

    The Framework for Addressing Cooperative Extended Transactions (FACET) is a flexible, object-oriented architecture for implementing models of dynamic behavior of multiple individuals, or agents, in a simulation. These agents can be human (individuals or organizations) or animal and may exhibit any type of organized social behavior that can be logically articulated. FACET was developed by Argonne National Laboratory's (ANL) Decision and Information Sciences Division (DIS) out of the need to integrate societal processes into natural system simulations. The FACET architecture includes generic software components that provide the agents with various mechanisms for interaction, such as step sequencing and logic, resource management, conflict resolution, and preemptive event handling. FACET components provide a rich environment within which patterns of behavior can be captured in a highly expressive manner. Interactions among agents in FACET are represented by Course of Action (COA) object-based models. Each COA contains a directed graph of individual actions, which represents any known pattern of social behavior. The agents' behavior in a FACET COA, in turn, influences the natural landscape objects in a simulation (i.e., vegetation, soil, and habitat) by updating their states. The modular design of the FACET architecture provides the flexibility to create multiple and varied simulation scenarios by changing social behavior patterns, without disrupting the natural process models. This paper describes the FACET architecture and presents several examples of FACET models that have been developed to assess the effects of anthropogenic influences on the dynamics of the natural environment.
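    The Course of Action representation described above, a directed graph of individual actions, can be sketched in a few lines. The class and action names below are purely illustrative, not FACET's actual API; the point is that actions fire only once all their prerequisites have fired:

```python
from collections import deque

class CourseOfAction:
    """Toy directed graph of named actions (illustrative, not FACET's API)."""
    def __init__(self):
        self.deps = {}  # action name -> set of prerequisite action names

    def add_action(self, name, after=()):
        self.deps[name] = set(after)

    def execution_order(self):
        # Kahn's algorithm: an action becomes ready only after
        # every prerequisite has already been sequenced.
        indegree = {a: len(d) for a, d in self.deps.items()}
        ready = deque(a for a, n in indegree.items() if n == 0)
        order = []
        while ready:
            a = ready.popleft()
            order.append(a)
            for b, d in self.deps.items():
                if a in d:
                    indegree[b] -= 1
                    if indegree[b] == 0:
                        ready.append(b)
        if len(order) != len(self.deps):
            raise ValueError("cycle in course of action")
        return order

# Hypothetical land-management COA: scout, clear, then update habitat state.
coa = CourseOfAction()
coa.add_action("scout_area")
coa.add_action("clear_vegetation", after=["scout_area"])
coa.add_action("update_habitat_state", after=["clear_vegetation"])
print(coa.execution_order())
```

In a FACET-style simulation the sequenced agent actions would then update the states of natural landscape objects, as the abstract describes.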

  5. Exploiting on-node heterogeneity for in-situ analytics of climate simulations via a functional partitioning framework

    Science.gov (United States)

    Sapra, Karan; Gupta, Saurabh; Atchley, Scott; Anantharaj, Valentine; Miller, Ross; Vazhkudai, Sudharshan

    2016-04-01

    Efficient resource utilization is critical for improved end-to-end computing and workflow of scientific applications. Heterogeneous node architectures, such as the GPU-enabled Titan supercomputer at the Oak Ridge Leadership Computing Facility (OLCF), present us with further challenges. In many HPC applications on Titan, the accelerators are the primary compute engines while the CPUs orchestrate the offloading of work onto the accelerators and move the output back to the main memory. In applications that do not exploit GPUs, on the other hand, CPU usage is dominant while the GPUs sit idle. We utilized the Heterogeneous Functional Partitioning (HFP) runtime framework, which can optimize usage of resources on a compute node to expedite an application's end-to-end workflow. This approach differs from existing techniques for in-situ analyses in that it provides a framework for on-the-fly, on-node analysis by dynamically exploiting under-utilized resources therein. We have implemented in the Community Earth System Model (CESM) a new concurrent diagnostic processing capability enabled by the HFP framework. Various single-variate statistics, such as means and distributions, are computed in situ by launching HFP tasks on the GPU via the node-local HFP daemon. Since our current configuration of CESM does not use GPU resources heavily, we can move these tasks to the GPU using the HFP framework. Each rank running the atmospheric model in CESM pushes the variables of interest via HFP function calls to the HFP daemon. This node-local daemon is responsible for receiving the data from the main program and launching the designated analytics tasks on the GPU. We have implemented these analytics tasks in C and use OpenACC directives to enable GPU acceleration. This methodology is also advantageous when executing GPU-enabled configurations of CESM, in which the CPUs would otherwise be idle during portions of the runtime. In our implementation results, we demonstrate that it is more efficient to use HFP

  6. Hydrogeological framework, numerical simulation of groundwater flow, and effects of projected water use and drought for the Beaver-North Canadian River alluvial aquifer, northwestern Oklahoma

    Science.gov (United States)

    Ryter, Derek W.; Correll, Jessica S.

    2016-01-14

    This report describes a study of the hydrology, hydrogeological framework, numerical groundwater-flow models, and results of simulations of the effects of water use and drought for the Beaver-North Canadian River alluvial aquifer, northwestern Oklahoma. The purpose of the study was to provide analyses, including estimating equal-proportionate-share (EPS) groundwater-pumping rates and the effects of projected water use and droughts, pertinent to water management of the Beaver-North Canadian River alluvial aquifer for the Oklahoma Water Resources Board.

  7. The PyZgoubi framework and the simulation of dynamic aperture in fixed-field alternating-gradient accelerators

    Energy Technology Data Exchange (ETDEWEB)

    Tygier, S., E-mail: sam.tygier@hep.manchester.ac.uk [Cockcroft Accelerator Group, The University of Manchester (United Kingdom); Appleby, R.B., E-mail: robert.appleby@manchester.ac.uk [Cockcroft Accelerator Group, The University of Manchester (United Kingdom); Garland, J.M. [Cockcroft Accelerator Group, The University of Manchester (United Kingdom); Hock, K. [University of Liverpool (United Kingdom); Owen, H. [Cockcroft Accelerator Group, The University of Manchester (United Kingdom); Kelliher, D.J.; Sheehy, S.L. [STFC Rutherford Appleton Laboratory (United Kingdom)

    2015-03-01

    We present PyZgoubi, a framework that has been developed based on the tracking engine Zgoubi to model, optimise and visualise the dynamics in particle accelerators, especially fixed-field alternating-gradient (FFAG) accelerators. We show that PyZgoubi abstracts Zgoubi by wrapping it in an easy-to-use Python framework in order to allow simple construction, parameterisation, visualisation and optimisation of FFAG accelerator lattices. Its object-oriented design gives it the flexibility and extensibility required for current novel FFAG design. We apply PyZgoubi to two example FFAGs; this includes determining the dynamic aperture of the PAMELA medical FFAG in the presence of magnet misalignments, and illustrating how PyZgoubi may be used to optimise FFAGs. We also discuss a robust definition of dynamic aperture in an FFAG and show its implementation in PyZgoubi.
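    A dynamic-aperture determination of the kind described can be illustrated generically as a bisection over initial amplitude. The `survives` predicate below is a hypothetical stand-in for an actual multi-turn tracking run (it does not use PyZgoubi's real API):

```python
def dynamic_aperture(survives, lo=0.0, hi=0.1, tol=1e-4):
    """Bisect for the largest initial amplitude (metres) whose particle
    survives tracking.  `survives(a)` stands in for a real tracking run
    (e.g. many turns through an FFAG lattice with misaligned magnets)."""
    if not survives(lo):
        return 0.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if survives(mid):
            lo = mid       # particle stable: aperture is at least mid
        else:
            hi = mid       # particle lost: aperture is below mid
    return lo

# Toy stand-in: particles launched within 42 mm of the axis survive.
da = dynamic_aperture(lambda a: a < 0.042)
print(round(da, 3))
```

A real study would repeat this scan over many random misalignment seeds and take the worst case, which is essentially how robustness to magnet misalignments is quantified.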

  8. Development of an Integrated Modeling Framework for Simulations of Coastal Processes in Deltaic Environments Using High-Performance Computing

    Science.gov (United States)

    2008-01-01

    Swash and Surf Zones, PI: Q. Jim Chen ... potential vorticity in the surf and swash zones, as well as the momentum exchange between the two dynamical regions. APPROACH: The research project ... divers for inshore countermine warfare. The modeling framework integrated with the CFD Toolkits developed at LSU will allow us to couple the hydrodynamic

  9. A locally p-adaptive approach for Large Eddy Simulation of compressible flows in a DG framework

    CERN Document Server

    Tugnoli, Matteo; Bonaventura, Luca; Restelli, Marco

    2016-01-01

    We investigate the possibility of reducing the computational burden of LES models by employing local polynomial-degree adaptivity in the framework of a high-order DG method. A novel degree-adaptation technique, specifically designed to be effective for LES applications, is proposed, and its effectiveness is compared to that of other criteria already employed in the literature. The resulting locally adaptive approach achieves significant reductions in the computational cost of representative LES computations.

  10. Simulator of P-Systems with String Replication Developed in Framework of P-Lingua 2.1

    Directory of Open Access Journals (Sweden)

    Veaceslav Macari

    2010-11-01

    In this paper we present a beta version of a simulator for P-systems with string-replication rules. This simulator is developed according to the P-Lingua ideology and the principles of the P-Lingua 2.1 development environment. A format for representing replication rules in the P-Lingua language is proposed. Two problems with already-known solutions by means of P-systems with string replication are used to demonstrate working with the simulator: the SAT problem and the inflection-generation problem.

  11. Physics Division Argonne National Laboratory description of the programs and facilities.

    Energy Technology Data Exchange (ETDEWEB)

    Thayer, K.J. [ed.

    1999-05-24

    The ANL Physics Division traces its roots to nuclear physics research at the University of Chicago around the time of the Second World War. Following the move from the University of Chicago to the present Argonne site and the formation of Argonne National Laboratory, the Physics Division has had a tradition of research into fundamental aspects of nuclear and atomic physics. Initially, the emphasis was on areas such as neutron physics, mass spectrometry, and theoretical studies of the nuclear shell model. Maria Goeppert Mayer was an employee in the Physics Division during the time she did her Nobel-Prize-winning work on the nuclear shell model. These interests diversified, and at the present time the research addresses a wide range of current problems in nuclear and atomic physics. The major emphasis of the current experimental nuclear physics research is in heavy-ion physics, centered around the ATLAS facility (Argonne Tandem-Linac Accelerator System) with its new injector providing intense, energetic ion beams over the full mass range up to uranium. ATLAS is a designated National User Facility and is based on superconducting radio-frequency technology developed in the Physics Division. A small program continues in accelerator development. In addition, the Division has a strong program in medium-energy nuclear physics carried out at a variety of major national and international facilities. The nuclear theory research in the Division spans a wide range of interests including nuclear dynamics with subnucleonic degrees of freedom, dynamics of many-nucleon systems, nuclear structure, and heavy-ion interactions. This research makes contact with experimental research programs in intermediate-energy and heavy-ion physics, both within the Division and on the national and international scale. The Physics Division traditionally has strong connections with the nation's universities. We have many visiting faculty members and we encourage students to participate in our

  12. Petri nets in Snoopy: a unifying framework for the graphical display, computational modelling, and simulation of bacterial regulatory networks.

    Science.gov (United States)

    Marwan, Wolfgang; Rohr, Christian; Heiner, Monika

    2012-01-01

    Using the example of phosphate regulation in enteric bacteria, we demonstrate the particular suitability of stochastic Petri nets for modelling biochemical phenomena, and their simulative exploration through various features of the software tool Snoopy.
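    The stochastic-Petri-net semantics this record relies on can be sketched as a Gillespie-style simulation: transitions fire at exponentially distributed times with mass-action propensities. The toy net, place names, and rate below are invented for illustration and bear no relation to Snoopy's input format or the actual phosphate-regulation model:

```python
import random

def gillespie(marking, transitions, t_end, seed=7):
    """Minimal Gillespie simulation of a stochastic Petri net with
    unit-weight arcs (a sketch; real Snoopy nets are far richer).
    transitions: list of (rate, input_places, output_places)."""
    rng = random.Random(seed)
    m = dict(marking)
    t = 0.0
    while True:
        # Mass-action propensity: rate times tokens on each input place.
        props = []
        for rate, ins, outs in transitions:
            a = rate
            for p in ins:
                a *= m.get(p, 0)
            props.append(a)
        total = sum(props)
        if total == 0:
            break                      # dead marking: nothing can fire
        t += rng.expovariate(total)    # time to next firing
        if t >= t_end:
            break
        pick = rng.uniform(0, total)   # choose which transition fires
        for (rate, ins, outs), a in zip(transitions, props):
            pick -= a
            if pick <= 0:
                for p in ins:
                    m[p] -= 1          # consume input tokens
                for p in outs:
                    m[p] = m.get(p, 0) + 1  # produce output tokens
                break
    return m

# Invented switch: a signal token converts Pho_off tokens to Pho_on.
net = [(0.5, ("signal", "Pho_off"), ("signal", "Pho_on"))]
final = gillespie({"signal": 1, "Pho_off": 20, "Pho_on": 0}, net, t_end=100.0)
print(final)
```

Note that tokens are conserved by construction here: the signal is consumed and re-produced each firing, and every Pho_off token removed appears as a Pho_on token.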

  13. An overset curvilinear/immersed boundary framework for high resolution simulations of wind and hydrokinetic turbine flows

    Science.gov (United States)

    Borazjani, Iman; Behara, Suresh; Natarajan, Ganesh; Sotiropoulos, Fotis

    2009-11-01

    We generalize the curvilinear/immersed boundary method to incorporate overset grids, enabling the simulation of more complicated geometries and increased grid resolution locally near complex immersed boundaries. The new method has been applied to carry out high-resolution simulations of wind and hydrokinetic turbine rotors. An interior fine mesh contains the rotor blades and is embedded within a coarser background mesh. The rotor blades can be treated either as immersed boundaries or using curvilinear, boundary-conforming overset grids. The numerical methodology has been generalized to include both inertial and non-inertial frame formulations. The method is validated by applying it to simulate the flow past the NREL wind turbine rotor for various turbine operating points. Inviscid, unsteady RANS and LES simulations are carried out and compared with experimental data. Preliminary results will also be presented for the hydrokinetic turbine rotor installed at the Roosevelt Island Tidal Energy project in New York City.

  14. The RD53 Collaboration's SystemVerilog-UVM Simulation Framework and its General Applicability to Design of Advanced Pixel Readout Chips

    CERN Document Server

    Marconi, S; Placidi, Pisana; Christiansen, Jorgen; Hemperek, Tomasz

    2014-01-01

    The foreseen Phase 2 pixel upgrades at the LHC have very challenging requirements for the design of hybrid pixel readout chips. A versatile pixel simulation platform is an essential development tool for the design, verification and optimization of both the system architecture and the pixel chip building blocks (Intellectual Properties, IPs). This work is focused on the implemented simulation and verification environment named VEPIX53, built using the SystemVerilog language and the Universal Verification Methodology (UVM) class library in the framework of the RD53 Collaboration. The environment supports pixel chips at different levels of description: its reusable components feature the generation of different classes of parameterized input hits to the pixel matrix, monitoring of pixel chip inputs and outputs, conformity checks between predicted and actual outputs, and collection of statistics on system performance. The environment has been tested by performing a study of shared architectures of the trigger late...

  15. SmartCell, a framework to simulate cellular processes that combines stochastic approximation with diffusion and localisation: analysis of simple networks.

    Science.gov (United States)

    Ander, M; Beltrao, P; Di Ventura, B; Ferkinghoff-Borg, J; Foglierini, M; Kaplan, A; Lemerle, C; Tomás-Oliveira, I; Serrano, L

    2004-06-01

    SmartCell has been developed to be a general framework for modelling and simulation of diffusion-reaction networks in a whole-cell context. It supports localisation and diffusion by using a mesoscopic stochastic reaction model. The SmartCell package can handle any cell geometry, considers different cell compartments, allows localisation of species, supports DNA transcription and translation, membrane diffusion and multistep reactions, as well as cell growth. Moreover, different temporal and spatial constraints can be applied to the model. A GUI interface that facilitates model making is also available. In this work we discuss limitations and advantages arising from the approach used in SmartCell and determine the impact of localisation on the behaviour of simple well-defined networks, previously analysed with differential equations. Our results show that this factor might play an important role in the response of networks and cannot be neglected in cell simulations.

  16. Investigating H 2 Sorption in a Fluorinated Metal–Organic Framework with Small Pores Through Molecular Simulation and Inelastic Neutron Scattering

    KAUST Repository

    Forrest, Katherine A.

    2015-07-07

    Simulations of H2 sorption were performed in a metal-organic framework (MOF) consisting of Zn2+ ions coordinated to 1,2,4-triazole and tetrafluoroterephthalate ligands (denoted [Zn(trz)(tftph)] in this work). The simulated H2 sorption isotherms reported in this work are consistent with the experimental data for the state points considered. The experimental H2 isosteric heat of adsorption (Qst) values for this MOF are approximately 8.0 kJ mol-1 over the considered loading range, which is in the proximity of those determined from simulation. The experimental inelastic neutron scattering (INS) spectra for H2 in [Zn(trz)(tftph)] reveal at least two peaks that occur at low energies, which correspond to high barriers to rotation for the respective sites. The most favorable sorption site in the MOF was identified from the simulations as sorption in the vicinity of a metal-coordinated H2O molecule, an exposed fluorine atom, and a carboxylate oxygen atom in a confined region of the framework. Secondary sorption was observed between the fluorine atoms of adjacent tetrafluoroterephthalate ligands. The H2 molecule at the primary sorption site in [Zn(trz)(tftph)] exhibits a rotational barrier that exceeds that for most neutral MOFs with open metal sites according to an empirical phenomenological model, and this was further validated by calculating the rotational potential energy surface for H2 at this site.

  17. Overview of basic and applied research on battery systems at Argonne

    Energy Technology Data Exchange (ETDEWEB)

    Nevitt, M. V.

    1979-01-01

    The need for a basic understanding of the ion transport and related effects that are observed under the unique physical and electrochemical conditions occurring in high-temperature, high-performance batteries is pointed out. Such effects include those that are typical of transport in bulk materials such as liquid and solid electrolytes and the less well understood effects observed in migration in and across the interfacial zones existing around electrodes. The basic and applied studies at Argonne National Laboratory, centered in part around the development of a Li(alloy)/iron sulfide battery system for energy storage, are briefly described as an example of the way that such an understanding is being sought by coordinated interdisciplinary research. 3 figures.

  18. Ground State Correlations Using exp(S) Method for the Argonne-v18 Potential.

    Science.gov (United States)

    Heisenberg, Jochen; Mihaila, Bogdan

    1997-04-01

    We use the Argonne-v18 potential together with the phenomenological three-nucleon interaction to do the calculation of the mean-field single particle wave functions and the correlation operator S for ^16O. Our correlation operator includes the contributions from up to 4p4h terms. From the three-nucleon interaction we include only those terms that can be written as a density dependent two-body term. We present a breakdown of the contributions to the binding from the two- and the three-body interactions. The one- and the two-body densities for ^16O are presented. Effects of the center-of-mass correction on the charge density and form factor are also discussed.

  19. Two-Nucleon Scattering without partial waves using a momentum space Argonne V18 interaction

    CERN Document Server

    Veerasamy, S; Polyzou, W N

    2012-01-01

    We test the operator form of the Fourier transform of the Argonne V18 potential by computing selected scattering observables and all Wolfenstein parameters for a variety of energies. These are compared to the GW-DAC database and to partial-wave calculations. We represent the interaction and transition operators as expansions in a spin-momentum basis. In this representation the Lippmann-Schwinger equation becomes a six-channel integral equation in two variables. Our calculations use different numbers of spin-momentum basis elements for the on- and off-shell transition operators, because different numbers of independent basis elements are required to expand each. The on- and off-shell spin-momentum basis elements are chosen so that the coefficients of the on-shell basis vectors are simply related to the corresponding off-shell coefficients.
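    The integral equation referred to above is the momentum-space Lippmann-Schwinger equation; in schematic form (generic symbols, not necessarily the authors' notation):

```latex
T(\mathbf{p}',\mathbf{p};z) \;=\; V(\mathbf{p}',\mathbf{p})
  \;+\; \int d^{3}p''\, V(\mathbf{p}',\mathbf{p}'')\,
  \frac{1}{\,z - \tfrac{p''^{2}}{2\mu} + i\epsilon\,}\,
  T(\mathbf{p}'',\mathbf{p};z),
```

and expanding both operators in a spin-momentum basis of six independent operators,

```latex
V(\mathbf{p}',\mathbf{p}) \;=\; \sum_{i=1}^{6} v_i(\mathbf{p}',\mathbf{p})\,
W_i(\mathbf{p}',\mathbf{p};\boldsymbol{\sigma}_1,\boldsymbol{\sigma}_2),
\qquad
T(\mathbf{p}',\mathbf{p};z) \;=\; \sum_{i=1}^{6} t_i(\mathbf{p}',\mathbf{p};z)\,
W_i(\mathbf{p}',\mathbf{p};\boldsymbol{\sigma}_1,\boldsymbol{\sigma}_2),
```

turns it into six coupled scalar integral equations for the amplitudes $t_i$ in two continuous variables, with no partial-wave decomposition required.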

  20. The Anapole Moment of the Deuteron with the Argonne v18 Nucleon-Nucleon Interaction Model

    CERN Document Server

    Hyun, C H; Hyun, Chang Ho; Desplanques, Bertrand

    2003-01-01

    We calculate the deuteron anapole moment with wave functions obtained from the Argonne $v18$ nucleon-nucleon interaction model. The anapole moment operators are considered at leading order. To minimize the uncertainty due to a lack of current conservation, we calculate the matrix element of the anapole moment from its original definition. By virtue of the accurate wave functions, we obtain a more precise value of the deuteron anapole moment, containing less uncertainty than earlier works. We obtain a magnitude of the deuteron anapole moment reduced by more than 25%. The reduction of the individual nuclear contributions is much more important, however, varying from a factor of 2 for the spin part to a factor of 4 for the convection and associated two-body currents.

  1. National coal utilization assessment: modeling long-term coal production with the Argonne coal market model

    Energy Technology Data Exchange (ETDEWEB)

    Dux, C.D.; Kroh, G.C.; VanKuiken, J.C.

    1977-08-01

    The Argonne Coal Market Model was developed as part of the National Coal Utilization Assessment, a comprehensive study of coal-related environmental, health, and safety impacts. The model was used to generate long-term coal market scenarios that became the basis for comparing the impacts of coal-development options. The model has a relatively high degree of regional detail concerning both supply and demand. Coal demands are forecast by a combination of trend and econometric analysis and then input exogenously into the model. Coal supply in each region is characterized by a linearly increasing function relating increments of new mine capacity to the marginal cost of extraction. Rail-transportation costs are econometrically estimated for each supply-demand link. A quadratic programming algorithm is used to calculate flow patterns that minimize consumer costs for the system.
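    The model structure described, linearly increasing marginal extraction costs plus transport costs, with a quadratic program minimizing total consumer cost, can be illustrated with a two-region toy problem (all numbers invented, not model data). At the optimum, marginal delivered costs are equalized across the supplying regions:

```python
def optimal_split(d, c1, s1, t1, c2, s2, t2):
    """Split demand d between two supply regions whose marginal
    extraction cost rises linearly (c_i + s_i * q_i), plus a fixed
    rail-transport cost t_i per unit.  Total cost is quadratic in
    (q1, q2), so minimizing it subject to q1 + q2 = d equates the
    marginal delivered costs: c1 + s1*q1 + t1 = c2 + s2*q2 + t2."""
    # Solve the 2x2 linear system given by that equality and q1 + q2 = d.
    q1 = (c2 + t2 - c1 - t1 + s2 * d) / (s1 + s2)
    q1 = min(max(q1, 0.0), d)   # clip to corner solutions if one region wins outright
    return q1, d - q1

# Invented numbers: region 1 is cheaper to mine but costlier to ship.
q1, q2 = optimal_split(d=100.0, c1=10.0, s1=0.05, t1=4.0,
                       c2=12.0, s2=0.03, t2=2.0)
print(round(q1, 1), round(q2, 1))  # prints: 37.5 62.5
```

Both regions then deliver at the same marginal cost (15.875 here), which is the optimality condition a quadratic-programming solver enforces across all supply-demand links at once.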

  2. Decontamination and dismantlement of the JANUS Reactor at Argonne National Laboratory-East. Project final report

    Energy Technology Data Exchange (ETDEWEB)

    Fellhauer, C.R.; Clark, F.R. [Argonne National Lab., IL (United States). Technology Development Div.; Garlock, G.A. [MOTA Corp., Cayce, SC (United States)

    1997-10-01

    The decontamination and dismantlement (D&D) of the JANUS Reactor at Argonne National Laboratory-East (ANL-E) was completed in October 1997. Descriptions and evaluations of the activities performed and analyses of the results obtained during the JANUS D&D Project are provided in this Final Report. The following information is included: objective of the JANUS D&D Project; history of the JANUS Reactor facility; description of the ANL-E site and the JANUS Reactor facility; overview of the D&D activities performed; description of the project planning and engineering; description of the D&D operations; summary of the final status of the JANUS Reactor facility based upon the final survey results; description of the health and safety aspects of the project, including personnel exposure and OSHA reporting; summary of the waste minimization techniques utilized and total waste generated by the project; and summary of the final cost and schedule for the JANUS D&D Project.

  3. An analytical drilling force model and GPU-accelerated haptics-based simulation framework of the pilot drilling procedure for micro-implants surgery training.

    Science.gov (United States)

    Zheng, Fei; Lu, Wen Feng; Wong, Yoke San; Foong, Kelvin Weng Chiong

    2012-12-01

    The placement of micro-implants is a common but relatively new surgical procedure in clinical dentistry. This paper presents a haptics-based simulation framework for the pilot drilling of micro-implants surgery, to train orthodontists to perform this essential procedure successfully by tactile sensation, without damaging tooth roots. A voxel-based approach was employed to model the inhomogeneous oral tissues. A preprocessing pipeline was designed to reduce imaging noise, smooth segmentation results, and construct an anatomically correct oral model from patient-specific data. In order to provide physically based haptic feedback, an analytical drilling force model based on metal-cutting principles was developed and adapted for the voxel-based approach. To improve real-time response, the parallel computing power of Graphics Processing Units is exploited through careful data-structure design, algorithm parallelization, and graphics-memory utilization. A prototype system has been developed based on the proposed framework. Preliminary results show that, by using this framework, proper drilling force can be rendered at different tissue layers with reduced cycle time, while the visual display has also been enhanced.
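    As a loose illustration of how voxelized, inhomogeneous tissue can drive tissue-layer-dependent force rendering (not the paper's analytical metal-cutting model; all names and constants here are invented), one can accumulate a hardness-weighted contribution along the drill axis:

```python
def drilling_force(voxels, depth_mm, voxel_mm=0.5, k=2.0):
    """Toy thrust-force estimate for pilot drilling.  `voxels` is a 1-D
    hardness profile along the drill axis (cortical bone stiffer than
    cancellous); k is an invented calibration constant.  The voxel at
    the drill tip resists cutting; voxels already passed contribute a
    small friction term."""
    n = min(int(depth_mm / voxel_mm), len(voxels))
    if n == 0:
        return 0.0
    cutting = k * voxels[n - 1]
    friction = 0.05 * k * sum(voxels[:n - 1])
    return cutting + friction

# Invented hardness profile: hard cortical layer, then soft cancellous bone.
profile = [3.0] * 4 + [1.0] * 8
print(drilling_force(profile, depth_mm=1.0))  # tip still in the cortical layer
```

The force a haptic device renders each cycle would come from a lookup like this against the segmented voxel model, which is why the preprocessing (noise reduction, smoothing) directly affects the feel of the simulation.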

  4. Proc. of the sixteenth symposium on energy engineering sciences, May 13-15, 1998, Argonne, IL.

    Energy Technology Data Exchange (ETDEWEB)

    None

    1998-05-13

    This Proceedings Volume includes the technical papers that were presented during the Sixteenth Symposium on Energy Engineering Sciences on May 13--15, 1998, at Argonne National Laboratory, Argonne, Illinois. The Symposium was structured into eight technical sessions, which included 30 individual presentations followed by discussion and interaction with the audience. A list of participants is appended to this volume. The DOE Office of Basic Energy Sciences (BES), of which Engineering Research is a component program, is responsible for the long-term, mission-oriented research in the Department. The Office has prime responsibility for establishing the basic scientific foundation upon which the Nation's future energy options will be identified, developed, and built. BES is committed to the generation of new knowledge necessary to solve present and future problems regarding energy exploration, production, conversion, and utilization, while maintaining respect for the environment. Consistent with the DOE/BES mission, the Engineering Research Program is charged with the identification, initiation, and management of fundamental research on broad, generic topics addressing energy-related engineering problems. Its stated goals are to improve and extend the body of knowledge underlying current engineering practice so as to create new options for enhancing energy savings and production, prolonging the useful life of energy-related structures and equipment, and developing advanced manufacturing technologies and materials processing. The program emphasis is on reducing costs through improved industrial production and performance and expanding the nation's store of fundamental knowledge for solving anticipated and unforeseen engineering problems in energy technologies. To achieve these goals, the Engineering Research Program supports approximately 130 research projects covering a broad spectrum of topics that cut across traditional engineering disciplines. The program

  5. Changes in the Vegetation Cover in a Constructed Wetland at Argonne National Laboratory, Illinois

    Energy Technology Data Exchange (ETDEWEB)

    Bergman, C.L.; LaGory, K.

    2004-01-01

    Wetlands are valuable resources that are disappearing at an alarming rate. Land development has resulted in the destruction of wetlands for approximately 200 years. To combat this destruction, the federal government passed legislation that requires no net loss of wetlands. The United States Army Corps of Engineers (USACE) is responsible for regulating wetland disturbances. In 1991, the USACE determined that the construction of the Advanced Photon Source at Argonne National Laboratory would damage three wetlands that had a total area of one acre. Argonne was required to create a wetland of equal acreage to replace the damaged wetlands. For the first five years after this wetland was created (1992-1996), the frequency of plant species, relative cover, and water depth were closely monitored. The wetland was not monitored again until 2002. In 2003, the vegetation cover data were again collected with a methodology similar to that of previous years. The plant species were sampled using quadrats at randomly selected locations along transects throughout the wetland. The fifty sampling locations were monitored once in June, and the percent cover of each plant species was determined for each plot. Furthermore, the extent of standing water in the wetland was measured. In 2003, 21 plant species were found and identified. Eleven species dominated the wetland, among them reed canary grass (Phalaris arundinacea), crown vetch (Coronilla varia), and Canada thistle (Cirsium arvense); these are all non-native, invasive species. In the previous year, 30 species had been found in the same wetland. The dominant species differed from those of the 2002 study, but the same non-native species were present. Reed canary grass and Canada thistle both increased by more than 100% from 2002. Unfortunately, the non-native species may be contributing to a loss of biodiversity in the wetland. In the future, control measures should be taken to ensure the establishment of more desirable native species.
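As a side note on the survey arithmetic described above, relative cover in quadrat sampling is simply a species' summed percent cover divided by the total cover of all species across all plots. A minimal sketch; the plot readings below are invented for illustration (only the species names echo those reported):

```python
def relative_cover(plots):
    """Relative cover: a species' total percent cover across all plots
    divided by the summed cover of every species (quadrat-survey convention)."""
    totals = {}
    for plot in plots:
        for species, cover in plot.items():
            totals[species] = totals.get(species, 0.0) + cover
    grand_total = sum(totals.values())
    return {s: t / grand_total for s, t in totals.items()}

# Hypothetical percent-cover readings for two quadrats:
plots = [
    {"Phalaris arundinacea": 60.0, "Cirsium arvense": 20.0},
    {"Phalaris arundinacea": 40.0, "Coronilla varia": 30.0},
]
rc = relative_cover(plots)
```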

  6. raaSAFT: A framework enabling coarse-grained molecular dynamics simulations based on the SAFT-γ Mie force field

    Science.gov (United States)

    Ervik, Åsmund; Serratos, Guadalupe Jiménez; Müller, Erich A.

    2017-03-01

    We describe here raaSAFT, a Python code that enables the setup and running of coarse-grained molecular dynamics simulations in a systematic and efficient manner. The code is built on top of the popular HOOMD-blue code, and as such harnesses the computational power of GPUs. The methodology makes use of the SAFT-γ Mie force field, so the resulting coarse-grained pair potentials are both closely linked to and consistent with the macroscopic thermodynamic properties of the simulated fluid. In raaSAFT both homonuclear and heteronuclear models are implemented for a wide range of compounds, spanning from linear alkanes to more complicated fluids such as water and alcohols, all the way up to nonionic surfactants and models of asphaltenes and resins. Adding new compounds as well as new features is made straightforward by the modularity of the code. To demonstrate the ease of use of raaSAFT, we give a detailed walkthrough of how to simulate liquid-liquid equilibrium of a hydrocarbon with water. We describe in detail how both homonuclear and heteronuclear compounds are implemented. To demonstrate the performance and versatility of raaSAFT, we simulate a large polymer-solvent mixture with 300 polystyrene molecules dissolved in 42 700 molecules of heptane, reproducing the experimentally observed temperature-dependent solubility of polystyrene. For this case we obtain a speedup of more than three orders of magnitude compared to atomistically detailed simulations.
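The SAFT-γ Mie force field's beads interact via the Mie (generalized Lennard-Jones) pair potential. A self-contained sketch of that potential; the exponents and reduced units below are illustrative defaults, not raaSAFT parameters:

```python
def mie_potential(r, epsilon, sigma, lam_r=12.0, lam_a=6.0):
    """Mie pair potential U(r) = C*eps*[(sigma/r)^lam_r - (sigma/r)^lam_a],
    where the prefactor C depends only on the repulsive/attractive exponents."""
    c = (lam_r / (lam_r - lam_a)) * (lam_r / lam_a) ** (lam_a / (lam_r - lam_a))
    return c * epsilon * ((sigma / r) ** lam_r - (sigma / r) ** lam_a)

# With exponents (12, 6) the prefactor C equals 4 and the classic
# Lennard-Jones potential is recovered: minimum U = -epsilon at r = 2^(1/6)*sigma.
u_min = mie_potential(2.0 ** (1.0 / 6.0), epsilon=1.0, sigma=1.0)
```

Varying the repulsive exponent lam_r is what lets the coarse-grained potential be tuned to macroscopic thermodynamic properties, which is the central idea of the SAFT-γ Mie approach.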

  7. Argonne National Laboratory study of the transfer of federal computational technology to manufacturing industry in the State of Michigan

    Energy Technology Data Exchange (ETDEWEB)

    Mueller, C.J.

    1991-11-01

    This report describes a pilot study to develop, initiate the implementation, and document a process to identify computational technology capabilities resident within Argonne National Laboratory to small and medium-sized businesses in the State of Michigan. It is a derivative of a program entitled "Technology Applications Development Process for the State of Michigan" undertaken by the Industrial Technology Institute and MERRA under funding from the National Institute of Standards and Technology. The overall objective of the latter program is to develop procedures which can facilitate the discovery and commercialization of new technologies for the benefit of small and medium-size manufacturing firms. Federal laboratories such as Argonne, along with universities, have been identified by the Industrial Technology Institute as key sources of technology which can be profitably commercialized by the target firms. The scope of this study limited the investigation of technology areas for technology transfer to that of computational science and engineering featuring high performance computing. This area was chosen as the broad technological capability within Argonne to investigate for technology transfer to Michigan firms for several reasons. First, and most importantly, as a multidisciplinary laboratory, Argonne has the full range of scientific and engineering skills needed to utilize leading-edge computing capabilities in many areas of manufacturing.

  9. Practical superconductor development for electrical applications - Argonne National Laboratory quarterly report for the period ending September 30, 2002.

    Energy Technology Data Exchange (ETDEWEB)

    Dorris, S. E.

    2002-12-02

    This is a multiyear experimental research program that focuses on improving relevant material properties of high-T{sub c} superconductors (HTSs) and developing fabrication methods that can be transferred to industry for production of commercial conductors. The development of teaming relationships through agreements with industrial partners is a key element of the Argonne (ANL) program.

  10. Practical superconductor development for electrical power applications - Argonne National Laboratory - quarterly report for the period ending June 30, 2001.

    Energy Technology Data Exchange (ETDEWEB)

    Dorris, S. E.

    2001-08-21

    This is a multiyear experimental research program focused on improving relevant material properties of high-T{sub c} superconductors (HTSs) and on development of fabrication methods that can be transferred to industry for production of commercial conductors. The development of teaming relationships through agreements with industrial partners is a key element of the Argonne (ANL) program.

  11. A Robust Metal-Organic Framework with An Octatopic Ligand for Gas Adsorption and Separation: A Combined Characterization by Experiments and Molecular Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Zhuang, Wenjuan; Yuan, Daqiang; Liu, Dahuan; Zhong, Chongli; Li, Jian-Rong; Zhou, Hong-Cai

    2012-01-10

    A newly designed octatopic carboxylate ligand, tetrakis[(3,5-dicarboxyphenyl)oxamethyl]methane (TDM⁸⁻), has been used to connect dicopper paddlewheel building units, affording a metal–organic framework (MOF), Cu₄(H₂O)₄(TDM)·xS (PCN-26·xS, where S represents noncoordinated solvent molecules and PCN = porous coordination network), with a novel structure, high gas uptake, and interesting gas adsorption selectivity. PCN-26 contains two different types of cages, octahedral and cuboctahedral, which stack to form a three-dimensional framework with open channels in three orthogonal directions. Gas adsorption studies of N₂, Ar, and H₂ on activated PCN-26 at 77 K and 1 bar reveal a Langmuir surface area of 2545 m²/g, a Brunauer–Emmett–Teller (BET) surface area of 1854 m²/g, a total pore volume of 0.84 cm³/g, and an H₂ uptake capacity of 2.57 wt %. Additionally, PCN-26 exhibits a CO₂/N₂ selectivity of 49:1 and a CO₂/CH₄ selectivity of 8.4:1 at 273 K. To investigate the gas adsorption properties and the adsorption sites for CO₂ in activated PCN-26, theoretical simulations of the adsorption isotherms of CO₂, CH₄, and N₂ at different temperatures were carried out. The experimental results agree very well with those of the molecular simulations.
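For context on the selectivity figures quoted above, ideal adsorption selectivity is commonly computed from adsorbed amounts q and bulk partial pressures p as S = (q1/q2)·(p2/p1). A minimal sketch; the uptake values below are hypothetical, not PCN-26 data:

```python
def adsorption_selectivity(q1, q2, p1, p2):
    """Ideal selectivity of component 1 over component 2: the ratio of
    adsorbed amounts normalized by the bulk-phase composition."""
    return (q1 / q2) * (p2 / p1)

# For an equimolar bulk gas (p1 == p2) this reduces to the uptake ratio.
s = adsorption_selectivity(q1=3.0, q2=0.1, p1=0.5, p2=0.5)
```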

  12. pWeb: A High-Performance, Parallel-Computing Framework for Web-Browser-Based Medical Simulation.

    Science.gov (United States)

    Halic, Tansel; Ahn, Woojin; De, Suvranu

    2014-01-01

    This work presents pWeb, a new language and compiler for parallelization of client-side compute-intensive web applications such as surgical simulations. The recently introduced HTML5 standard has enabled unprecedented applications on the web. Compared to native applications, however, the low performance of the web browser remains the bottleneck for computationally intensive tasks, including visualization of complex scenes, real-time physical simulations, and image processing. The proposed language is built upon web workers for multithreaded programming in HTML5. It provides the fundamental functionality of parallel programming languages as well as the fork/join parallel model, which is not supported by web workers. The language compiler automatically generates an equivalent parallel script that complies with the HTML5 standard. A case study on realistic rendering for surgical simulations demonstrates enhanced performance with a compact set of instructions.
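The fork/join model that pWeb layers on top of web workers can be illustrated independently of its JavaScript setting. A sketch in Python (this is not pWeb syntax, and the tile-shading task is a made-up stand-in for a rendering kernel):

```python
from concurrent.futures import ThreadPoolExecutor

def shade_tile(tile):
    # Stand-in for a compute-intensive kernel, e.g. shading one image tile.
    return sum(i * i for i in range(tile * 1000))

def render(tiles):
    # Fork: submit one task per tile; join: collect results in order.
    with ThreadPoolExecutor(max_workers=4) as pool:
        futures = [pool.submit(shade_tile, t) for t in tiles]  # fork
        return [f.result() for f in futures]                   # join

results = render([1, 2, 3, 4])
```

The point of the pattern is that the parent blocks at the join until every forked task has finished, which is exactly the semantics plain web workers lack and pWeb's compiler supplies.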

  13. Exploring the impacts of physics and resolution on aqua-planet simulations from a nonhydrostatic global variable-resolution modeling framework: IMPACTS OF PHYSICS AND RESOLUTION

    Energy Technology Data Exchange (ETDEWEB)

    Zhao, Chun [Atmospheric Sciences and Global Change Division, Pacific Northwest National Laboratory, Richland Washington USA; Leung, L. Ruby [Atmospheric Sciences and Global Change Division, Pacific Northwest National Laboratory, Richland Washington USA; Park, Sang-Hun [National Center for Atmospheric Research, Boulder Colorado USA; Hagos, Samson [Atmospheric Sciences and Global Change Division, Pacific Northwest National Laboratory, Richland Washington USA; Lu, Jian [Atmospheric Sciences and Global Change Division, Pacific Northwest National Laboratory, Richland Washington USA; Sakaguchi, Koichi [Atmospheric Sciences and Global Change Division, Pacific Northwest National Laboratory, Richland Washington USA; Yoon, Jinho [Atmospheric Sciences and Global Change Division, Pacific Northwest National Laboratory, Richland Washington USA; School of Earth Sciences and Environmental Engineering, Gwanju Institute of Science and Technology, Gwangju South Korea; Harrop, Bryce E. [Atmospheric Sciences and Global Change Division, Pacific Northwest National Laboratory, Richland Washington USA; Skamarock, William [National Center for Atmospheric Research, Boulder Colorado USA; Duda, Michael G. [National Center for Atmospheric Research, Boulder Colorado USA

    2016-11-04

    Advances in computing resources are gradually moving regional and global numerical forecasting simulations towards sub-10 km resolution, but global high-resolution climate simulations remain a challenge. The non-hydrostatic Model for Prediction Across Scales (MPAS) provides a global framework to achieve very high resolution using regional mesh refinement. Previous studies using the hydrostatic version of MPAS (H-MPAS) with the physics parameterizations of the Community Atmosphere Model version 4 (CAM4) found notable resolution-dependent behaviors. This study revisits the resolution sensitivity using the non-hydrostatic version of MPAS (NH-MPAS) with both CAM4 and CAM5 physics. A series of aqua-planet simulations at global quasi-uniform resolutions ranging from 240 km to 30 km and global variable-resolution simulations with a regional mesh refinement of 30 km resolution over the tropics are analyzed, with a primary focus on the distinct characteristics of NH-MPAS in simulating precipitation, clouds, and large-scale circulation features compared to H-MPAS-CAM4. The resolution sensitivity of total precipitation and column-integrated moisture in NH-MPAS is smaller than that in H-MPAS-CAM4. This contributes substantially to the reduced resolution sensitivity of large-scale circulation features such as the Intertropical Convergence Zone and the Hadley circulation in NH-MPAS compared to H-MPAS. In addition, NH-MPAS shows almost no resolution sensitivity in the simulated westerly jet, in contrast to the obvious poleward shift in H-MPAS with increasing resolution, which is partly explained by differences in the hyperdiffusion coefficients used in the two models that influence wave activity. 
With the reduced resolution sensitivity, simulations in the refined region of the NH-MPAS global variable resolution configuration exhibit zonally symmetric features that are more comparable to the quasi-uniform high-resolution simulations than those from H-MPAS that displays zonal asymmetry in

  15. A new numerical framework to simulate viscoelastic free-surface flows with the finite-volume method

    DEFF Research Database (Denmark)

    Comminal, Raphaël; Spangenberg, Jon; Hattel, Jesper Henri

    2015-01-01

    A new method for the simulation of 2D viscoelastic flow is presented. Numerical stability is obtained by the logarithmic-conformation change of variable, and a fully-implicit pure-streamfunction flow formulation, without use of any artificial diffusion. As opposed to other simulation results, our...... calculations predict a hydrodynamic instability in the 4:1 contraction geometry at a Weissenberg number of order 4. This new result is in qualitative agreement with the prediction of a non-linear subcritical elastic instability in Poiseuille flow. Our viscoelastic flow solver is coupled with a volume...

  16. Intermittent communications modeling and simulation for autonomous unmanned maritime vehicles using an integrated APM and FSMC framework

    Science.gov (United States)

    Coker, Ayodeji; Straatemeier, Logan; Rogers, Ted; Valdez, Pierre; Griendling, Kelly; Cooksey, Daniel

    2014-06-01

    In this work, a framework is presented for addressing the issue of intermittent communications faced by autonomous unmanned maritime vehicles operating at sea. In particular, this work considers the subject of predictive atmospheric signal transmission over multi-path fading channels in maritime environments. A Finite State Markov Channel is used to represent a Nakagami-m modeled physical fading radio channel. The range of the received signal-to-noise ratio is partitioned into a finite number of intervals, which represent application-specific communications states. The Advanced Propagation Model (APM), developed at the Space and Naval Warfare Systems Center San Diego, provides a characterization of the transmission channel in terms of evaporation-duct-induced signal propagation loss. APM uses a hybrid ray-optic and parabolic-equation model that allows for the computation of electromagnetic (EM) wave propagation over various sea and/or terrain paths. These models, which have been integrated into the proposed framework, provide a strategic and mission planning aid for the operation of unmanned maritime vehicles at sea.
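The SNR-partitioning idea behind a Finite State Markov Channel can be sketched in a few lines. The thresholds and transition probabilities below are illustrative assumptions, not values derived from the Nakagami-m model in the paper:

```python
import random

SNR_THRESHOLDS_DB = [5.0, 15.0]   # two boundaries -> three states: bad/fair/good
TRANSITIONS = [                   # row i: probabilities of moving to each state;
    [0.6, 0.4, 0.0],              # only adjacent-state moves, as is conventional
    [0.2, 0.6, 0.2],              # for slowly fading FSMC models
    [0.0, 0.4, 0.6],
]

def snr_to_state(snr_db):
    """Map a received SNR sample to its channel-state index."""
    for i, threshold in enumerate(SNR_THRESHOLDS_DB):
        if snr_db < threshold:
            return i
    return len(SNR_THRESHOLDS_DB)

def simulate(steps, rng, state=0):
    """Evolve the channel state as a Markov chain; return the visited states."""
    path = [state]
    for _ in range(steps):
        state = rng.choices(range(len(TRANSITIONS)), weights=TRANSITIONS[state])[0]
        path.append(state)
    return path

path = simulate(1000, random.Random(42))
```

In a full FSMC construction the thresholds and transition probabilities would instead be derived from the fading distribution (here, Nakagami-m) and the channel's level-crossing rates.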

  17. Variable Density Flow Modeling for Simulation Framework for Regional Geologic CO{sub 2} Storage Along Arches Province of Midwestern United States

    Energy Technology Data Exchange (ETDEWEB)

    Joel Sminchak

    2011-09-30

    The Arches Province in the Midwestern U.S. has been identified as a major area for carbon dioxide (CO{sub 2}) storage applications because of the intersection of Mt. Simon sandstone reservoir thickness and permeability. To better understand large-scale CO{sub 2} storage infrastructure requirements in the Arches Province, variable density scoping level modeling was completed. Three main tasks were completed for the variable density modeling: Single-phase, variable density groundwater flow modeling; Scoping level multi-phase simulations; and Preliminary basin-scale multi-phase simulations. The variable density modeling task was successful in evaluating appropriate input data for the Arches Province numerical simulations. Data from the geocellular model developed earlier in the project were translated into preliminary numerical models. These models were calibrated to observed conditions in the Mt. Simon, suggesting a suitable geologic depiction of the system. The initial models were used to assess boundary conditions, calibrate to reservoir conditions, examine grid dimensions, evaluate upscaling items, and develop regional storage field scenarios. The task also provided practical information on items related to CO{sub 2} storage applications in the Arches Province such as pressure buildup estimates, well spacing limitations, and injection field arrangements. The Arches Simulation project is a three-year effort and part of the United States Department of Energy (U.S. DOE)/National Energy Technology Laboratory (NETL) program on innovative and advanced technologies and protocols for monitoring/verification/accounting (MVA), simulation, and risk assessment of CO{sub 2} sequestration in geologic formations. The overall objective of the project is to develop a simulation framework for regional geologic CO{sub 2} storage infrastructure along the Arches Province of the Midwestern U.S.

  18. A Research Framework for Distribution Fast Simulation and Modeling

    Institute of Scientific and Technical Information of China (English)

    余贻鑫; 马世乾; 徐臣

    2014-01-01

    The technical requirements, basic functions, major design concepts, local automation functions targeting self-healing, simulation and modeling tools, and the high-level requirements framework of distribution fast simulation and modeling (DFSM) were briefly introduced in this paper. Particular emphasis was placed on the pivotal role of DFSM in the distributed intelligent architecture of the smart grid, the necessity of faster-than-real-time simulation for predictive analysis, a fast simulation and modeling tool set supported by a variety of generic simulation modules, and the design criteria for continuous checking and updating of distribution network topology models.

  19. Helios: a Multi-Purpose LIDAR Simulation Framework for Research, Planning and Training of Laser Scanning Operations with Airborne, Ground-Based Mobile and Stationary Platforms

    Science.gov (United States)

    Bechtold, S.; Höfle, B.

    2016-06-01

    In many technical domains of modern society, there is a growing demand for fast, precise and automatic acquisition of digital 3D models of a wide variety of physical objects and environments. Laser scanning is a popular and widely used technology to cover this demand, but it is also expensive and complex to use to its full potential. However, there might exist scenarios where the operation of a real laser scanner could be replaced by a computer simulation, in order to save time and costs. This includes scenarios like teaching and training of laser scanning, development of new scanner hardware and scanning methods, or generation of artificial scan data sets to support the development of point cloud processing and analysis algorithms. To test the feasibility of this idea, we have developed a highly flexible laser scanning simulation framework named Heidelberg LiDAR Operations Simulator (HELIOS). HELIOS is implemented as a Java library and split up into a core component and multiple extension modules. Extensible Markup Language (XML) is used to define scanner, platform and scene models and to configure the behaviour of modules. Modules were developed and implemented for (1) loading of simulation assets and configuration (i.e. 3D scene models, scanner definitions, survey descriptions etc.), (2) playback of XML survey descriptions, (3) TLS survey planning (i.e. automatic computation of recommended scanning positions) and (4) interactive real-time 3D visualization of simulated surveys. As a proof of concept, we show the results of two experiments: First, a survey planning test in a scene that was specifically created to evaluate the quality of the survey planning algorithm. Second, a simulated TLS scan of a crop field in a precision farming scenario. The results show that HELIOS fulfills its design goals.
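Since HELIOS drives scanner, platform, and scene configuration from XML, a survey description can be generated programmatically. A sketch that assembles a minimal survey file; the element and attribute names are assumptions patterned on the description above, not guaranteed to match the exact HELIOS schema:

```python
import xml.etree.ElementTree as ET

# Build a toy survey description: one scan position ("leg") with platform
# placement and scanner settings. Names and values are illustrative only.
survey = ET.Element("survey", name="crop_field_tls")
leg = ET.SubElement(survey, "leg")
ET.SubElement(leg, "platformSettings", x="10.0", y="25.0", z="0.0")
ET.SubElement(leg, "scannerSettings", pulseFreq_hz="100000", scanAngle_deg="100")

xml_text = ET.tostring(survey, encoding="unicode")
```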

  20. Simulating Star Clusters with the AMUSE Software Framework: I. Dependence of Cluster Lifetimes on Model Assumptions and Cluster Dissolution Modes

    CERN Document Server

    Whitehead, Alfred J.; Vesperini, Enrico; Portegies Zwart, Simon

    2013-01-01

    We perform a series of simulations of evolving star clusters using AMUSE (the Astrophysical Multipurpose Software Environment), a new community-based multi-physics simulation package, and compare our results to existing work. These simulations model a star cluster beginning with a King model distribution and a selection of power-law initial mass functions, and contain a tidal cut-off. They are evolved using collisional stellar dynamics and include mass loss due to stellar evolution. After determining that the differences between AMUSE results and prior publications are understood, we explored the variation in cluster lifetimes due to the random realization noise introduced by transforming a King model to specific initial conditions. This random realization noise can affect the lifetime of a simulated star cluster by up to 30%. Two modes of star cluster dissolution were identified: a mass evolution curve that contains a run-away cluster dissolution with a sudden loss of mass, and a dissolution mode that does n...

  1. Use of mechanistic simulations as a quantitative risk-ranking tool within the quality by design framework.

    Science.gov (United States)

    Stocker, Elena; Toschkoff, Gregor; Sacher, Stephan; Khinast, Johannes G

    2014-11-20

    The purpose of this study is to evaluate the use of computer simulations for generating quantitative knowledge as a basis for risk ranking and mechanistic process understanding, as required by ICH Q9 on quality risk management systems. In this specific publication, the main focus is the demonstration of a risk assessment workflow, including a computer simulation for the generation of mechanistic understanding of active tablet coating in a pan coater. Process parameter screening studies are statistically planned under consideration of impacts on a potentially critical quality attribute, i.e., coating mass uniformity. Based on computer simulation data, the process failure mode and effects analysis of the risk factors is performed. This results in a quantitative criticality assessment of process parameters and the risk priority evaluation of failure modes. The factor for a quantitative reassessment of the criticality and risk priority is the coefficient of variation, which represents the coating mass uniformity. The major conclusion drawn from this work is a successful demonstration of the integration of computer simulation in the risk management workflow, leading to an objective and quantitative risk assessment.
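The coefficient of variation used above as the quantitative reassessment factor is simply the standard deviation divided by the mean. A minimal sketch; the per-tablet coating masses are invented for illustration:

```python
import statistics

def coefficient_of_variation(samples):
    """CV = sample standard deviation / mean; a lower CV means a more
    uniform coating mass across tablets."""
    return statistics.stdev(samples) / statistics.mean(samples)

coating_mass_mg = [10.2, 9.8, 10.5, 9.9, 10.1]  # hypothetical coating masses
cv = coefficient_of_variation(coating_mass_mg)
```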

  2. A novel framework for fluid/structure interaction in rapid subject specific simulations of blood flow in coronary artery bifurcations

    Directory of Open Access Journals (Sweden)

    Blagojević Milan

    2014-01-01

    Background/Aim. Practical difficulties, particularly long model development times, have limited the types and applicability of computational fluid dynamics simulations in numerical modeling of blood flow performed in a serial manner. In these simulations, the most revealing flow parameters are the endothelial shear stress distribution and the oscillatory shear index. The aim of this study was to analyze their role in the diagnosis of the occurrence, and the prognosis of the development, of plaque in coronary artery bifurcations. Methods. We developed a novel modeling technique for rapid cardiovascular hemodynamic simulations that takes into account interactions between the fluid domain (blood) and the solid domain (artery wall). Two numerical models representing the observed subdomains of an arbitrary patient-specific coronary artery bifurcation were created using multi-slice computed tomography (MSCT) coronarography and ultrasound measurements of blood velocity. Coronary flow was solved using an in-house finite element solver, PAK-FS. Results. The overall behavior of the coronary artery bifurcation during one cardiac cycle is described by: velocity, pressure, endothelial shear stress, oscillatory shear index, stress in the arterial wall, and nodal displacements. The places where (a) endothelial shear stress is less than 1.5 and (b) the oscillatory shear index is very small (close or equal to 0) are prone to plaque genesis. Conclusion. Finite element simulation of fluid-structure interaction was used to investigate patient-specific flow dynamics and wall mechanics at coronary artery bifurcations. The simulation model revealed that the lateral walls of the main branch and the lateral walls distal to the carina are exposed to low endothelial shear stress, which is a predilection site for the development of atherosclerosis. This conclusion is confirmed by the low values of the oscillatory shear index in those places.
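For reference, the oscillatory shear index (OSI) used above is commonly defined over one cardiac cycle as OSI = 0.5·(1 − |mean(τ)| / mean(|τ|)), ranging from 0 (purely unidirectional shear) to 0.5 (purely oscillatory shear). A sketch with made-up wall shear stress samples:

```python
def oscillatory_shear_index(tau_samples):
    """OSI from a time series of wall shear stress over one cardiac cycle."""
    n = len(tau_samples)
    mean_tau = sum(tau_samples) / n
    mean_abs = sum(abs(t) for t in tau_samples) / n
    if mean_abs == 0.0:
        return 0.0
    return 0.5 * (1.0 - abs(mean_tau) / mean_abs)

osi_steady = oscillatory_shear_index([1.2, 1.0, 1.1])            # unidirectional
osi_reversing = oscillatory_shear_index([1.0, -1.0, 1.0, -1.0])  # fully reversing
```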

  3. A feasibility study on the use of the MOOSE computational framework to simulate three-dimensional deformation of CANDU reactor fuel elements

    Energy Technology Data Exchange (ETDEWEB)

    Gamble, Kyle A., E-mail: Kyle.Gamble@inl.gov [Royal Military College of Canada, Chemistry and Chemical Engineering, 13 General Crerar Crescent, Kingston, Ontario, Canada K7K 7B4 (Canada); Williams, Anthony F., E-mail: Tony.Williams@cnl.ca [Canadian Nuclear Laboratories, Fuel and Fuel Channel Safety, 1 Plant Road, Chalk River, Ontario, Canada K0J 1J0 (Canada); Chan, Paul K., E-mail: Paul.Chan@rmc.ca [Royal Military College of Canada, Chemistry and Chemical Engineering, 13 General Crerar Crescent, Kingston, Ontario, Canada K7K 7B4 (Canada); Wowk, Diane, E-mail: Diane.Wowk@rmc.ca [Royal Military College of Canada, Mechanical and Aerospace Engineering, 13 General Crerar Crescent, Kingston, Ontario, Canada K7K 7B4 (Canada)

    2015-11-15

    Highlights: • This is the first demonstration of using the MOOSE framework for modeling CANDU fuel. • Glued and frictionless contact algorithms behave as expected for 2D and 3D cases. • MOOSE accepts and correctly interprets functions of arbitrary form. • 3D deformation calculations compare accurately against analytical solutions. • MOOSE is a viable simulation tool for modeling accident reactor conditions. - Abstract: Horizontally oriented fuel bundles, such as those in CANada Deuterium Uranium (CANDU) reactors, present unique modeling challenges. After long irradiation times or during severe transients, the fuel elements can deform laterally out of plane through processes known as bow and sag. Bowing is a thermally driven process that causes the fuel elements to deform laterally when a temperature gradient develops across the diameter of the element. Sagging is a coupled mechanical and thermal process caused by deformation of the fuel pin due to creep mechanisms in the sheathing after long irradiation times and/or at high temperatures. These out-of-plane deformations can lead to reduced coolant flow and a reduction in the coolability of the fuel bundle. In extreme cases, element-to-element or element-to-pressure-tube contact could occur, leading to reduced coolant flow in the subchannels or to pressure tube rupture and a loss-of-coolant accident. This paper evaluates the capability of the Multiphysics Object-Oriented Simulation Environment (MOOSE) framework, developed at the Idaho National Laboratory, to model these deformation mechanisms. The material model capabilities of MOOSE and its ability to simulate contact are also investigated.
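As a back-of-the-envelope illustration of the bowing mechanism described above (not a MOOSE calculation): a rod with thermal expansion coefficient α and temperature difference ΔT across its diameter d takes on a curvature of roughly κ ≈ αΔT/d, giving a midspan deflection of about κL²/8 for a span L. All values below are illustrative assumptions, not CANDU fuel data:

```python
def thermal_bow_deflection(alpha, delta_t, diameter, span):
    """Midspan lateral deflection of a rod bowed by a diametral
    temperature gradient (uniform-curvature approximation)."""
    kappa = alpha * delta_t / diameter  # curvature, 1/m
    return kappa * span ** 2 / 8.0      # deflection, m

# e.g. alpha = 1e-5 /K, 100 K across a 13 mm element, 0.5 m span:
deflection_m = thermal_bow_deflection(1.0e-5, 100.0, 0.013, 0.5)
```

This gives a millimeter-scale deflection, which is why even modest diametral gradients matter for subchannel coolant flow.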

  4. Simulations

    CERN Document Server

    Ngada, N M

    2015-01-01

    The complexity and cost of building and running high-power electrical systems make the use of simulations unavoidable. The simulations available today provide great insight into how such systems really operate. This paper helps the reader gain insight into simulation in the field of power converters for particle accelerators. Starting with the definition and basic principles of simulation, two simulation types, together with their leading tools, are presented: analog and numerical simulations. Some practical applications of each simulation type are also considered. The conclusion then summarizes the main items to keep in mind before opting for a simulation tool or before performing a simulation.
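
As a minimal example of the numerical simulation type discussed above, the sketch below integrates a buck converter's output LC filter with explicit Euler steps. All component values and the 10 kHz, 50% duty drive are illustrative assumptions, not taken from any real converter design.

```python
# Minimal "numerical simulation" in the sense above: explicit Euler
# integration of a buck converter's output LC filter under a fixed-duty
# PWM drive.  Component values and drive parameters are illustrative.

L, C, R = 1e-3, 100e-6, 10.0   # inductor (H), capacitor (F), load (ohm)
Vin, dt = 12.0, 1e-6           # input voltage (V), Euler time step (s)
i, v = 0.0, 0.0                # inductor current (A), output voltage (V)
for step in range(20000):      # 20 ms of simulated time
    vs = Vin if (step % 100) < 50 else 0.0   # 10 kHz PWM, 50 % duty
    di = (vs - v) / L          # inductor: L di/dt = vs - v
    dv = (i - v / R) / C       # node:     C dv/dt = i - v/R
    i += di * dt
    v += dv * dt
print(f"output = {v:.2f} V")   # settles near Vin * duty = 6 V
```

The same loop structure generalizes: smaller time steps or an implicit integrator trade runtime for accuracy and stability, which is the core decision in any numerical circuit simulation.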

  5. Computational Analysis and Simulation of Empathic Behaviors: a Survey of Empathy Modeling with Behavioral Signal Processing Framework.

    Science.gov (United States)

    Xiao, Bo; Imel, Zac E; Georgiou, Panayiotis; Atkins, David C; Narayanan, Shrikanth S

    2016-05-01

    Empathy is an important psychological process that facilitates human communication and interaction. Enhancement of empathy has profound significance in a range of applications. In this paper, we review emerging directions of research on the computational analysis of empathy expression and perception, as well as empathic interactions, including their simulation. We summarize the work on empathic expression analysis by the targeted signal modalities (e.g., text, audio, and facial expressions). We categorize empathy simulation studies into theory-based emotion space modeling or application-driven user and context modeling. We summarize challenges in the computational study of empathy, including the conceptual framing and understanding of empathy, data availability, appropriate use and validation of machine learning techniques, and behavioral signal processing. Finally, we propose a unified view of empathy computation and offer a series of open problems for future research.

  6. Development of a Coupled Framework for Simulating Interactive Effects of Frozen Soil Hydrological Dynamics in Permafrost Regions

    Science.gov (United States)

    2013-11-01

    Army Engineer Research and Development Center, 3909 Halls Ferry Road, Vicksburg, MS 39180-6199; Sergei Marchenko and Anna Liljedahl, Geophysical ... The model is the result of coupling the Gridded Surface Subsurface Hydrologic Analysis (GSSHA) model with the Geophysical Institute Permafrost ... simulates two-dimensional groundwater flow and one-dimensional vadose zone flow. These two models were combined by incorporating the GIPL model into the ...

  7. Electron-cloud simulation studies for the CERN-PS in the framework of the LHC Injectors Upgrade project

    CERN Document Server

    Rioja Fuentelsaz, Sergio

    The present study aims to provide a consistent picture of the electron cloud effect in the CERN Proton Synchrotron (PS) and to investigate possible future limitations arising from the requirements of the LHC Injectors Upgrade (LIU) project. It consists of a complete simulation survey of the electron cloud build-up in the different beam pipe sections of the ring as a function of several controllable beam parameters and vacuum chamber surface properties, covering present and future operation parameters. Because the combined-function magnets of the accelerator constitute almost 80% of the ring's length, the implementation in the PyECLOUD code of a new feature for simulating arbitrary external magnetic fields made this study possible. All simulation results are given as a function of the vacuum chamber surface properties so that these properties can be deduced, both locally and globally, by comparison with experimental data. In a first step, we characterize locally the maximum possible number of ...

  8. Using E-Z Reader to simulate eye movements in nonreading tasks: a unified framework for understanding the eye-mind link.

    Science.gov (United States)

    Reichle, Erik D; Pollatsek, Alexander; Rayner, Keith

    2012-01-01

    Nonreading tasks that share some (but not all) of the task demands of reading have often been used to make inferences about how cognition influences when the eyes move during reading. In this article, we use variants of the E-Z Reader model of eye-movement control in reading to simulate eye-movement behavior in several of these tasks, including z-string reading, target-word search, and visual search of Landolt Cs arranged in both linear and circular arrays. These simulations demonstrate that a single computational framework is sufficient to simulate eye movements in both reading and nonreading tasks but also suggest that there are task-specific differences in both saccadic targeting (i.e., decisions about where to move the eyes) and the coupling between saccadic programming and the movement of attention (i.e., decisions about when to move the eyes). These findings suggest that some aspects of the eye-mind link are flexible and can be configured in a manner that supports efficient task performance.

  9. Quantifying Thermal Disorder in Metal–Organic Frameworks: Lattice Dynamics and Molecular Dynamics Simulations of Hybrid Formate Perovskites

    Science.gov (United States)

    2016-01-01

    Hybrid organic–inorganic materials are mechanically soft, leading to large thermoelastic effects which can affect properties such as electronic structure and ferroelectric ordering. Here we use a combination of ab initio lattice dynamics and molecular dynamics to study the finite temperature behavior of the hydrazinium and guanidinium formate perovskites, [NH2NH3][Zn(CHO2)3] and [C(NH2)3][Zn(CHO2)3]. Thermal displacement parameters and ellipsoids computed from the phonons and from molecular dynamics trajectories are found to be in good agreement. The hydrazinium compound is ferroelectric at low temperatures, with a calculated spontaneous polarization of 2.6 μC cm⁻², but the thermal movement of the cation leads to variations in the instantaneous polarization and eventually breakdown of the ferroelectric order. Contrary to this, the guanidinium cation is found to be stationary at all temperatures; however, the movement of the cage atoms leads to variations in the electronic structure and a renormalization in the bandgap from 6.29 eV at 0 K to an average of 5.96 eV at 300 K. We conclude that accounting for temperature is necessary for quantitative modeling of the physical properties of metal–organic frameworks. PMID:28298951
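
The thermal (anisotropic) displacement parameters mentioned above can, in principle, be computed from a molecular dynamics trajectory as the covariance of an atom's positional fluctuations. A minimal sketch, with a synthetic Gaussian trajectory standing in for real MD data:

```python
import numpy as np

# Sketch of extracting anisotropic (thermal) displacement parameters
# from an MD trajectory: the 3x3 tensor U = <(r - <r>)(r - <r>)^T>
# accumulated over frames.  The synthetic trajectory and its amplitudes
# are arbitrary stand-ins for real data.

def displacement_tensor(traj):
    """traj: (n_frames, 3) positions of one atom -> 3x3 U tensor."""
    d = traj - traj.mean(axis=0)
    return d.T @ d / len(traj)

rng = np.random.default_rng(0)
# Anisotropic motion: twice the amplitude along x as along y and z.
traj = rng.normal(0.0, [0.10, 0.05, 0.05], size=(50000, 3))
U = displacement_tensor(traj)
print(np.round(np.diag(U), 4))   # ~ [0.01, 0.0025, 0.0025]
```

The eigenvectors and eigenvalues of U give the orientation and semi-axes of the thermal ellipsoid drawn in crystallographic figures.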

  10. Highly porous ionic rht metal-organic framework for H2 and CO2 storage and separation: A molecular simulation study

    KAUST Repository

    Babarao, Ravichandar

    2010-07-06

    The storage and separation of H2 and CO2 are investigated in a highly porous ionic rht metal-organic framework (rht-MOF) using molecular simulation. The rht-MOF possesses a cationic framework and charge-balancing extraframework NO3⁻ ions. Three types of unique open cages exist in the framework: rhombicuboctahedral, tetrahedral, and cuboctahedral cages. The NO3⁻ ions exhibit small mobility and are located at the windows connecting the tetrahedral and cuboctahedral cages. At low pressures, H2 adsorption occurs near the NO3⁻ ions that act as preferential sites. With increasing pressure, H2 molecules occupy the tetrahedral and cuboctahedral cages and the intersection regions. The predicted isotherm of H2 at 77 K agrees well with the experimental data. The H2 capacity is estimated to be 2.4 wt % at 1 bar and 6.2 wt % at 50 bar, among the highest in reported MOFs. In a four-component mixture (15:75:5:5 CO2/H2/CO/CH4) representing a typical effluent gas of H2 production, the selectivity of CO2/H2 in rht-MOF decreases slightly with increasing pressure, then increases because of cooperative interactions, and finally decreases as a consequence of entropy effect. By comparing three ionic MOFs (rht-MOF, soc-MOF, and rho-ZMOF), we find that the selectivity increases with increasing charge density or decreasing free volume. In the presence of a trace amount of H2O, the interactions between CO2 and NO3⁻ ions are significantly shielded by H2O; consequently, the selectivity of CO2/H2 decreases substantially. © 2010 American Chemical Society.
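
The mixture selectivities quoted above follow the usual definition S(a/b) = (x_a/x_b)/(y_a/y_b), where x are the adsorbed-phase and y the bulk-gas mole fractions. A minimal sketch with invented loadings (not the rht-MOF simulation values):

```python
# Adsorption selectivity as used above: S_{a/b} = (x_a/x_b)/(y_a/y_b),
# with x the adsorbed-phase and y the bulk-gas mole fractions.  The
# loadings below are invented for illustration, not rht-MOF results.

def selectivity(x_a, x_b, y_a, y_b):
    """Selectivity of species a over species b."""
    return (x_a / x_b) / (y_a / y_b)

# Bulk 15:75 CO2:H2 (as in the four-component mixture above) with
# hypothetical adsorbed loadings of 4.5 and 1.5 mmol/g:
print(selectivity(x_a=4.5, x_b=1.5, y_a=0.15, y_b=0.75))   # -> 15.0 (up to float rounding)
```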

  11. I. Dissociation free energies of drug-receptor systems via non-equilibrium alchemical simulations: a theoretical framework.

    Science.gov (United States)

    Procacci, Piero

    2016-06-01

    In this contribution I critically revise the alchemical reversible approach in the context of the statistical mechanics theory of non-covalent bonding in drug-receptor systems. I show that most of the pitfalls and entanglements for the binding free energy evaluation in computer simulations are rooted in the equilibrium assumption that is implicit in the reversible method. These critical issues can be resolved by using a non-equilibrium variant of the alchemical method in molecular dynamics simulations, relying on the production of many independent trajectories with a continuous dynamical evolution of an externally driven alchemical coordinate, completing the decoupling of the ligand in a matter of a few tens of picoseconds rather than nanoseconds. The absolute binding free energy can be recovered from the annihilation work distributions by applying an unbiased unidirectional free energy estimate, on the assumption that any observed work distribution is given by a mixture of normal distributions, whose components are identical in either direction of the non-equilibrium process, with weights regulated by the Crooks theorem. I finally show that the inherent reliability and accuracy of the unidirectional estimate of the decoupling free energies, based on the production of a few hundreds of non-equilibrium independent sub-nanosecond unrestrained alchemical annihilation processes, is a direct consequence of the funnel-like shape of the free energy surface in molecular recognition. An application of the technique to a real drug-receptor system is presented in the companion paper.
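
For the special case in which the forward work distribution is a single Gaussian, the unidirectional estimate reduces to the familiar second-cumulant formula ΔG = ⟨W⟩ − βσ²/2. The sketch below illustrates only this special case with synthetic work values; it is not the mixture-of-Gaussians estimator developed in the paper.

```python
import random

# Sketch of the unidirectional estimate in the single-Gaussian limit:
# dG = <W> - beta * var(W) / 2.  Synthetic work values stand in for the
# annihilation works of real non-equilibrium alchemical trajectories.

def gaussian_free_energy(works, beta):
    """Second-cumulant (Gaussian) unidirectional free-energy estimate."""
    n = len(works)
    mean = sum(works) / n
    var = sum((w - mean) ** 2 for w in works) / (n - 1)
    return mean - beta * var / 2.0

random.seed(1)
beta = 1.0 / 0.596                 # ~1/kT in (kcal/mol)^-1 near 300 K
true_dg, sigma = -10.0, 1.0
# For Gaussian work, the Crooks relation implies <W> = dG + beta*sigma^2/2:
works = [random.gauss(true_dg + beta * sigma ** 2 / 2, sigma)
         for _ in range(5000)]
print(round(gaussian_free_energy(works, beta), 1))   # close to -10.0
```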

  12. Injury Profile SIMulator, a qualitative aggregative modelling framework to predict crop injury profile as a function of cropping practices, and the abiotic and biotic environment. I. Conceptual bases.

    Science.gov (United States)

    Aubertot, Jean-Noël; Robin, Marie-Hélène

    2013-01-01

    The limitation of damage caused by pests (plant pathogens, weeds, and animal pests) in any agricultural crop requires integrated management strategies. Although significant efforts have been made to i) develop, and to a lesser extent ii) combine genetic, biological, cultural, physical and chemical control methods in Integrated Pest Management (IPM) strategies (vertical integration), there is a need for tools to help manage Injury Profiles (horizontal integration). Farmers design cropping systems according to their goals, knowledge, cognition and perception of socio-economic and technological drivers as well as their physical, biological, and chemical environment. In return, a given cropping system, in a given production situation will exhibit a unique injury profile, defined as a dynamic vector of the main injuries affecting the crop. This simple description of agroecosystems has been used to develop IPSIM (Injury Profile SIMulator), a modelling framework to predict injury profiles as a function of cropping practices, abiotic and biotic environment. Due to the tremendous complexity of agroecosystems, a simple holistic aggregative approach was chosen instead of attempting to couple detailed models. This paper describes the conceptual bases of IPSIM, an aggregative hierarchical framework and a method to help specify IPSIM for a given crop. A companion paper presents a proof of concept of the proposed approach for a single disease of a major crop (eyespot on wheat). In the future, IPSIM could be used as a tool to help design ex-ante IPM strategies at the field scale if coupled with a damage sub-model, and a multicriteria sub-model that assesses the social, environmental, and economic performances of simulated agroecosystems. In addition, IPSIM could also be used to help make diagnoses on commercial fields. It is important to point out that the presented concepts are not crop- or pest-specific and that IPSIM can be used on any crop.
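
The aggregative approach described above combines ordinal ratings hierarchically, through expert-defined decision tables rather than numeric equations. A toy illustration of a single aggregation node (the attributes and table entries are invented, not IPSIM's actual attribute tree):

```python
# Toy IPSIM-style qualitative aggregation: a parent attribute's ordinal
# rating is read from an explicit decision table over its children's
# ratings.  Attributes and table entries are invented for illustration.

def aggregate(child_ratings, table):
    """Look up the parent's qualitative rating from a decision table."""
    return table[tuple(child_ratings)]

# Hypothetical node: injury risk from (inoculum pressure, susceptibility).
risk_table = {
    ("low", "low"): "low",       ("low", "high"): "medium",
    ("high", "low"): "medium",   ("high", "high"): "high",
}
print(aggregate(["high", "low"], risk_table))   # -> medium
```

Chaining such nodes bottom-up over a tree of attributes gives a full qualitative prediction without any coupled differential equations, which is the design choice the authors motivate.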

  15. A general-purpose framework to simulate musculoskeletal system of human body: using a motion tracking approach.

    Science.gov (United States)

    Ehsani, Hossein; Rostami, Mostafa; Gudarzi, Mohammad

    2016-02-01

    Computing the muscle force patterns that produce specified movements of muscle-actuated dynamic models is an important and challenging problem. The problem is underdetermined, so optimization is required to calculate the muscle forces. The purpose of this paper is to develop a general model for calculating all muscle activation and force patterns in an arbitrary human body movement. To this end, the forward-dynamics equations of a multibody system representing the skeletal system of the human body are derived using the Lagrange-Euler formulation. Next, muscle contraction dynamics is added to this model, yielding the forward dynamics of an arbitrary musculoskeletal system. For optimization, the resulting model is used in the computed muscle control algorithm, and a closed-loop system for tracking desired motions is derived. Finally, a popular sport exercise, the biceps curl, is simulated using this algorithm, and the validity of the results is evaluated against EMG signals.
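
The muscle-redundancy problem the authors describe can be illustrated with the simplest static-optimization resolution: minimize the sum of squared activations subject to a single joint-torque constraint, which admits a closed-form Lagrange-multiplier solution. This is a generic sketch with hypothetical moment arms and strengths, not the paper's computed muscle control algorithm:

```python
# Generic static-optimization sketch of muscle redundancy: minimize
# sum of squared activations a_i subject to the joint-torque constraint
# sum_i r_i * Fmax_i * a_i = tau.  The quadratic cost gives a closed-form
# Lagrange-multiplier solution.  Moment arms and strengths are
# hypothetical illustration values.

def min_activations(moment_arms, f_max, tau):
    """Minimum-sum-of-squares activations meeting one torque constraint."""
    c = [r * f for r, f in zip(moment_arms, f_max)]  # torque per unit activation
    lam = tau / sum(ci * ci for ci in c)
    return [lam * ci for ci in c]

# Two elbow flexors sharing a 20 N*m flexion torque:
acts = min_activations(moment_arms=[0.04, 0.02], f_max=[800.0, 600.0], tau=20.0)
print([round(a, 3) for a in acts])          # -> [0.548, 0.205]
check = sum(r * f * a for r, f, a in
            zip([0.04, 0.02], [800.0, 600.0], acts))
print(round(check, 6))                      # -> 20.0
```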

  16. Investigation of the vertical instability at the Argonne Intense Pulsed Neutron Source

    Science.gov (United States)

    Wang, Shaoheng; Dooling, J. C.; Harkay, K. C.; Kustom, R. L.; McMichael, G. E.

    2009-10-01

    The rapid cycling synchrotron of the intense pulsed neutron source at Argonne National Laboratory normally operates at an average beam current of 14 to 15 μA, accelerating protons from 50 to 450 MeV, 30 times per second. The beam current is limited by a single-bunch vertical instability that occurs in the later part of the 14 ms acceleration cycle. By analyzing turn-by-turn beam position monitor data, two cases of vertical beam centroid oscillations were discovered. The oscillations start from the tail of the bunch, build up, and develop toward the head of the bunch. The development stops near the bunch center and the oscillations remain localized in the tail for a relatively long time (2-4 ms, 1-2×10⁴ turns). This vertical instability is identified as the cause of the beam loss. We compared this instability with a head-tail instability that was purposely induced by switching off sextupole magnets. It appears that the observed vertical instability is different from the classical head-tail instability.
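
One simple way to quantify such an instability from turn-by-turn beam position monitor data is to fit an exponential growth rate to the windowed RMS of the vertical centroid signal. The sketch below uses a synthetic signal in place of real BPM data; the tune and growth time are invented, not IPNS measurements:

```python
import math

# Growth-rate extraction from turn-by-turn centroid data: window the
# signal, take the RMS per window, and least-squares fit log(RMS) vs.
# turn number.  The signal below is synthetic, not IPNS data.

def growth_rate(signal, window):
    """Slope of log(RMS) per turn, fit over consecutive windows."""
    xs, ys = [], []
    for k in range(0, len(signal) - window, window):
        w = signal[k:k + window]
        rms = math.sqrt(sum(s * s for s in w) / window)
        xs.append(k + window / 2)
        ys.append(math.log(rms))
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Synthetic unstable mode: fractional tune 0.21, e-folding in 2000 turns.
sig = [math.exp(n / 2000.0) * math.sin(2 * math.pi * 0.21 * n)
       for n in range(10000)]
print(round(growth_rate(sig, 500) * 2000, 2))   # -> 1.0
```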

  17. Argonne National Laboratory-East site environmental report for calendar year 1994

    Energy Technology Data Exchange (ETDEWEB)

    Golchert, N.W.; Kolzow, R.G.

    1995-05-01

    This report discusses the results of the environmental protection program at Argonne National Laboratory-East (ANL) for 1994. To evaluate the effects of ANL operations on the environment, samples of environmental media collected on the site, at the site boundary, and off the ANL site were analyzed and compared to applicable guidelines and standards. A variety of radionuclides was measured in air, surface water, groundwater, soil, grass, and bottom sediment samples. In addition, chemical constituents in surface water, groundwater, and ANL effluent water were analyzed. External penetrating radiation doses were measured and the potential for radiation exposure to off-site population groups was estimated. The results of the surveillance program are interpreted in terms of the origin of the radioactive and chemical substances (natural, fallout, ANL, and other) and are compared with applicable environmental quality standards. A US Department of Energy (DOE) dose calculation methodology, based on International Commission on Radiological Protection (ICRP) recommendations and the CAP-88 version of the EPA-AIRDOSE/RADRISK computer code, is used in this report. The status of ANL environmental protection activities with respect to the various laws and regulations which govern waste handling and disposal is discussed. This report also discusses progress being made on environmental corrective actions and restoration projects.

  18. Argonne National Laboratory--East site environmental report for calendar year 1990

    Energy Technology Data Exchange (ETDEWEB)

    Golchert, N.W.; Duffy, T.L.; Moos, L.P.

    1991-07-01

    This report discusses the results of the environmental protection program at Argonne National Laboratory-East (ANL) for 1990. To evaluate the effects of ANL operations on the environment, samples of environmental media collected on the site, at the site boundary, and off the ANL site were analyzed and compared to applicable guidelines and standards. A variety of radionuclides was measured in air, surface water, groundwater, soil, grass, bottom sediment, and milk samples. In addition, chemical constituents in surface water, groundwater, and ANL effluent water were analyzed. External penetrating radiation doses were measured and the potential for radiation exposure to off-site population groups was estimated. The results of the surveillance program are interpreted in terms of the origin of the radioactive and chemical substances (natural, fallout, ANL, and other) and are compared with applicable environmental quality standards. A US Department of Energy (DOE) dose calculation methodology, based on International Commission on Radiological Protection (ICRP) recommendations, is used in this report. The status of ANL environmental protection activities with respect to the various laws and regulations which govern waste handling and disposal is discussed. This report also discusses progress being made on environmental corrective actions and restoration projects from past activities.

  19. Argonne National Laboratory-East site environmental report for calendar year 1996

    Energy Technology Data Exchange (ETDEWEB)

    Golchert, N.W.; Kolzow, R.G.

    1997-09-01

    This report discusses the results of the environmental protection program at Argonne National Laboratory-East (ANL-E) for 1996. To evaluate the effects of ANL-E operations on the environment, samples of environmental media collected on the site, at the site boundary, and off the ANL-E site were analyzed and compared to applicable guidelines and standards. A variety of radionuclides were measured in air, surface water, on-site groundwater, soil, grass, and bottom sediment samples. In addition, chemical constituents in surface water, groundwater, and ANL-E effluent water were analyzed. External penetrating radiation doses were measured, and the potential for radiation exposure to off-site population groups was estimated. The results of the surveillance program are interpreted in terms of the origin of the radioactive and chemical substances (natural, fallout, ANL-E, and other) and are compared with applicable environmental quality standards. A US Department of Energy dose calculation methodology, based on International Commission on Radiological Protection recommendations and the CAP-88 version of the EPA-AIRDOSE/RADRISK computer code, is used in this report. The status of ANL-E environmental protection activities with respect to the various laws and regulations that govern waste handling and disposal is discussed. This report also discusses progress being made on environmental corrective actions and restoration projects.

  20. Argonne National Laboratory-East site environmental report for calendar year 1993

    Energy Technology Data Exchange (ETDEWEB)

    Golchert, N.W.; Kolzow, R.G. [Argonne National Lab., IL (United States). Environment and Waste Management Program

    1994-05-01

    This report discusses the results of the environmental protection program at Argonne National Laboratory-East (ANL) for 1993. To evaluate the effects of ANL operations on the environment, samples of environmental media collected on the site, at the site boundary, and off the ANL site were analyzed and compared to applicable guidelines and standards. A variety of radionuclides was measured in air, surface water, groundwater, soil, grass, and bottom sediment samples. In addition, chemical constituents in surface water, groundwater, and ANL effluent water were analyzed. External penetrating radiation doses were measured and the potential for radiation exposure to off-site population groups was estimated. The results of the surveillance program are interpreted in terms of the origin of the radioactive and chemical substances (natural, fallout, ANL, and other) and are compared with applicable environmental quality standards. A US Department of Energy (DOE) dose calculation methodology, based on International Commission on Radiological Protection (ICRP) recommendations and the CAP-88 version of the EPA-AIRDOSE/RADRISK computer code, is used in this report. The status of ANL environmental protection activities with respect to the various laws and regulations which govern waste handling and disposal is discussed. This report also discusses progress being made on environmental corrective actions and restoration projects from past activities.

  1. Argonne National Laboratory-East site environmental report for calendar year 1998.

    Energy Technology Data Exchange (ETDEWEB)

    Golchert, N.W.; Kolzow, R.G.

    1999-08-26

    This report discusses the results of the environmental protection program at Argonne National Laboratory-East (ANL-E) for 1998. To evaluate the effects of ANL-E operations on the environment, samples of environmental media collected on the site, at the site boundary, and off the ANL-E site were analyzed and compared with applicable guidelines and standards. A variety of radionuclides were measured in air, surface water, on-site groundwater, and bottom sediment samples. In addition, chemical constituents in surface water, groundwater, and ANL-E effluent water were analyzed. External penetrating radiation doses were measured, and the potential for radiation exposure to off-site population groups was estimated. Results are interpreted in terms of the origin of the radioactive and chemical substances (i.e., natural, fallout, ANL-E, and other) and are compared with applicable environmental quality standards. A US Department of Energy dose calculation methodology, based on International Commission on Radiological Protection recommendations and the US Environmental Protection Agency's CAP-88 (Clean Air Act Assessment Package-1988) computer code, was used in preparing this report. The status of ANL-E environmental protection activities with respect to the various laws and regulations that govern waste handling and disposal is discussed, along with the progress of environmental corrective actions and restoration projects.

  2. Vitrification as a low-level radioactive mixed waste treatment technology at Argonne National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Mazer, J.J.; No, Hyo J.

    1995-08-01

    Argonne National Laboratory-East (ANL-E) is developing plans to use vitrification to treat low-level radioactive mixed wastes (LLMW) generated onsite. The ultimate objective of this project is to install a full-scale vitrification system at ANL-E capable of processing the annual generation and historic stockpiles of selected LLMW streams. This project is currently in the process of identifying a range of processible glass compositions that can be produced from actual mixed wastes and additives, such as boric acid or borax. During the formulation of these glasses, there has been an emphasis on maximizing the waste content in the glass (70 to 90 wt %), reducing the overall final waste volume, and producing a stabilized low-level radioactive waste glass. Crucible glass studies with actual mixed waste streams have produced alkali borosilicate glasses that pass the Toxic Characteristic Leaching Procedure (TCLP) test. These same glass compositions, spiked with toxic metals well above the expected levels in actual wastes, also pass the TCLP test. These results provide compelling evidence that the vitrification system and the glass waste form will be robust enough to accommodate expected variations in the LLMW streams from ANL-E. Approximately 40 crucible melts will be studied to establish a compositional envelope for vitrifying ANL-E mixed wastes. Also being determined is the identity of volatilized metals or off-gases that will be generated.

  3. Structural elucidation of Argonne premium coals: Molecular weights, heteroatom distributions and linkages between clusters

    Energy Technology Data Exchange (ETDEWEB)

    Winans, R.E.,; Kim, Y.; Hunt, J.E.; McBeth, R.L.

    1995-12-31

    The objective of this study is to create a statistically accurate picture of important structural features for a group of coals representing a broad rank range. Mass spectrometric techniques are used to study coals, coal extracts, and chemically modified coals and extracts. Laser desorption mass spectrometry is used to determine molecular weight distributions. Desorption chemical ionization high-resolution mass spectrometry provides detailed molecular information, and information on compound classes of molecules is obtained using tandem mass spectrometry. These results are correlated with other direct studies on these samples such as solid NMR, XPS and X-ray absorption spectroscopy. From the complex sets of data, several general trends are emerging, especially for heteroatom-containing species. From a statistical point of view, heteroatoms must play important roles in the reactivity of all coals. Direct characterization of sulfur-containing species in the Argonne coals has been reported from XANES analysis. Indirect methods used include TG-FTIR and HRMS, which rely on thermal desorption and pyrolysis to vaporize the samples. Both XANES and XPS data on nitrogen have been reported, but at this time, the XPS information is probably more reliable. Results from HRMS are discussed in this paper. Most other information on nitrogen is limited to analysis of liquefaction products. However, nitrogen can be important in influencing characteristics of coal liquids and as a source of NOx in coal combustion.

  4. Inspection and monitoring plan, contaminated groundwater seeps 317/319/ENE Area, Argonne National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-10-11

    During the course of completing the Resource Conservation and Recovery Act (RCRA) Facility Investigation (RFI) in the 317/319/East-Northeast (ENE) Area of Argonne National Laboratory-East (ANL-E), groundwater was discovered moving to the surface through a series of groundwater seeps. The seeps are located in a ravine approximately 600 ft south of the ANL-E fence line in Waterfall Glen Forest Preserve. Samples of the seep water were collected and analyzed for selected parameters. Two of the five seeps sampled were found to contain detectable levels of organic contaminants. Three chemical species were identified: chloroform (14-25 µg/L), carbon tetrachloride (56-340 µg/L), and tetrachloroethylene (3-6 µg/L). The other seeps did not contain detectable levels of volatile organics. The nature of the contaminants in the seeps will also be monitored on a regular basis. Samples of surface water flowing through the bottom of the ravine and groundwater emanating from the seeps will be collected and analyzed for chemical and radioactive constituents. The results of the routine sampling will be compared with the concentrations used in the risk assessment. If the concentrations exceed those used in the risk assessment, the risk calculations will be revised by using the higher numbers. This revised analysis will determine if additional actions are warranted.

  5. The beam bunching and transport system of the Argonne positive ion injector

    Energy Technology Data Exchange (ETDEWEB)

    Den Hartog, P.K.; Bogaty, J.M.; Bollinger, L.M.; Clifft, B.E.; Pardo, R.C.; Shepard, K.W.

    1989-01-01

    A new positive ion injector (PII) is currently under construction at Argonne that will replace the existing 9-MV tandem electrostatic accelerator as an injector into ATLAS. It consists of an electron cyclotron resonance (ECR) ion source on a 350-kV platform injecting into a superconducting linac optimized for very slow (β ≤ 0.007c) ions. This combination can potentially produce even higher quality heavy-ion beams than are currently available from the tandem, since the emittance growth within the linac is largely determined by the quality of the bunching and beam transport. The system we have implemented uses a two-stage bunching system, composed of a 4-harmonic gridded buncher located on the ECR high-voltage platform and a room-temperature spiral-loaded buncher of novel design. A sinusoidal beam chopper is used for removal of tails. The beam transport is designed to provide mass resolution of M/ΔM > 250, and a doubly isochronous beamline is used to minimize time spread due to path-length differences. 4 refs., 2 figs.

  6. Management of wildlife causing damage at Argonne National Laboratory-East, DuPage County, Illinois

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-04-01

    The DOE, after an independent review, has adopted an Environmental Assessment (EA) prepared by the US Department of Agriculture (USDA) which evaluates use of an Integrated Wildlife Damage Management approach at Argonne National Laboratory-East (ANL-E) in DuPage County, Illinois (April 1995). In 1994, the USDA issued a programmatic Environmental Impact Statement (EIS) that covers nationwide animal damage control activities. The EA for Management of Wildlife Causing Damage at ANL-E tiers off this programmatic EIS. The USDA wrote the EA as a result of DOE's request to USDA to prepare and implement a comprehensive Wildlife Management Damage Plan; the USDA has authority for animal damage control under the Animal Damage Control Act of 1931, as amended, and the Rural Development, Agriculture and Related Agencies Appropriations Act of 1988. DOE has determined, based on the analysis in the EA, that the proposed action does not constitute a major Federal action significantly affecting the quality of the human environment within the meaning of the National Environmental Policy Act of 1969 (NEPA). Therefore, the preparation of an EIS is not required. This report contains the Environmental Assessment, as well as the Finding of No Significant Impact (FONSI).

  7. Software package as an information center product. [Activities of Argonne Code Center

    Energy Technology Data Exchange (ETDEWEB)

    Butler, M. K.

    1977-01-01

    The Argonne Code Center serves as a software exchange and information center for the U.S. Energy Research and Development Administration and the Nuclear Regulatory Commission. The goal of the Center's program is to provide a means for sharing of software among agency offices and contractors, and for transferring computing applications and technology, developed within the agencies, to the information-processing community. A major activity of the Code Center is the acquisition, review, testing, and maintenance of a collection of software--computer systems, applications programs, subroutines, modules, and data compilations--prepared by agency offices and contractors to meet programmatic needs. A brief review of the history of computer program libraries and software sharing is presented to place the Code Center activity in perspective. The state-of-the-art discussion starts off with an appropriate definition of the term software package, together with descriptions of recommended package contents and the Center's package evaluation activity. An effort is made to identify the various users of the product, to enumerate their individual needs, to document the Center's efforts to meet these needs and the ongoing interaction with the user community. Desirable staff qualifications are considered, and packaging problems are reviewed. The paper closes with a brief look at recent developments and a forecast of things to come. 2 tables. (RWR)

  8. Design, calibration, and operation of 220Rn stack effluent monitoring systems at Argonne National Laboratory.

    Science.gov (United States)

    Munyon, W J; Kretz, N D; Marchetti, F P

    1994-09-01

    A group of stack effluent monitoring systems has been developed to monitor discharges of 220Rn from a hot cell facility at Argonne National Laboratory. The stack monitors use flow-through scintillation cells and are completely microprocessor-based systems. A method for calibrating the stack monitors in the laboratory and in the field is described. A nominal calibration factor for the stack monitoring systems in use is 15.0 cts min^-1 per kBq m^-3 (0.56 cts min^-1 per pCi L^-1) +/- 26% at the 95% confidence level. The plate-out fraction of decay products in the stack monitor scintillation cells, without any pre-filtering, was found to be nominally 25% under normal operating conditions. When the sample was pre-filtered upstream of the scintillation cell, the observed cell plate-out fraction ranged from 16% to 22%, depending on the specific sampling conditions. The instantaneous 220Rn stack concentration can be underestimated or overestimated when the steady-state condition established between 220Rn and its decay products in the scintillation cell is disrupted by sudden changes in the monitored 220Rn concentration. For long-term measurements, however, the time-averaged response of the monitor represents the steady-state condition and leads to a reasonable estimate of the average 220Rn concentration during the monitoring period.
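    The two calibration-factor units quoted above are mutually consistent; a quick sketch checks the conversion (the relation 1 pCi/L = 0.037 kBq/m^3 is standard general knowledge, not taken from this record):

```python
# Check that 15.0 cts/min per kBq/m^3 equals the quoted 0.56 cts/min per pCi/L.
# 1 pCi = 0.037 Bq, hence 1 pCi/L = 37 Bq/m^3 = 0.037 kBq/m^3.
KBQ_PER_M3_PER_PCI_PER_L = 0.037

cf_per_kbq_m3 = 15.0                                  # cts min^-1 per kBq m^-3
cf_per_pci_l = cf_per_kbq_m3 * KBQ_PER_M3_PER_PCI_PER_L
print(cf_per_pci_l)  # ~0.555, i.e. the quoted 0.56 after rounding
```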

  9. R&D activities at Argonne National Laboratory for the application of base seismic isolation in nuclear facilities

    Energy Technology Data Exchange (ETDEWEB)

    Seidensticker, R.W.

    1991-01-01

    Argonne National Laboratory (ANL) has been deeply involved in the development of seismic isolation for use in nuclear facilities for the past decade. The initial focus of these efforts was on the use of seismic isolation for advanced liquid metal reactors (LMRs). Subsequent efforts in seismic isolation at ANL included a lead role in an accelerated development program for possible use of seismic isolation for the DOE's New Production Reactors (NPR). Under funding provided by the National Science Foundation (NSF), Argonne is currently working with Shimizu in a joint United States-Japanese program on the response of seismically isolated buildings to actual earthquakes. The results of recent work in the seismic isolation program elements are described in this paper. The current status of these programs is presented, along with an assessment of the work still needed to bring the benefits of this emerging technology to full potential in nuclear reactors and other nuclear facilities. 38 refs., 3 figs.

  10. Argonne National Laboratory, High Energy Physics Division: Semiannual report of research activities, July 1, 1986-December 31, 1986

    Energy Technology Data Exchange (ETDEWEB)

    1987-01-01

    This paper discusses the research activity of the High Energy Physics Division at the Argonne National Laboratory for the period, July 1986-December 1986. Some of the topics included in this report are: high resolution spectrometers, computational physics, spin physics, string theories, lattice gauge theory, proton decay, symmetry breaking, heavy flavor production, massive lepton pair production, collider physics, field theories, proton sources, and facility development. (LSP)

  11. Ultra long-term simulation by the integrated model. 1. Framework and energy system module; Togo model ni yoru tanchoki simulation. 1. Flame work to energy system module

    Energy Technology Data Exchange (ETDEWEB)

    Kurosawa, A.; Yagita, H.; Yanagisawa, Y. [Research Inst. of Innovative Technology for the Earth, Kyoto (Japan)

    1997-01-30

    This paper introduces a study of the ultra long-term energy model 'GRAPE', which takes the global environment into account, and presents the results of trial calculations. The GRAPE model consists of modules for the energy system, climate change, land-use change, food supply and demand, the macro economy, and environmental impact. The model divides the world into ten regions, takes 1990 as the base year, and enables ultra long-term simulation. Here, carbon emissions are calculated as a trial. Under a constraint on the quantity of carbon emissions, the energy supply in the latter half of the 21st century comprises photovoltaic energy, methanol from coal gasification, and biomass energy; in addition, the share of nuclear energy increases remarkably. In the generation mix, IGCC power generation with carbon recovery, wind power, photovoltaic power, and nuclear power expand their shares. Under a constraint on carbon concentration, the structural change of the generation options is delayed compared with the case of a constrained quantity of carbon emissions. 6 refs., 4 figs.

  12. L-py: an L-system simulation framework for modeling plant architecture development based on a dynamic language.

    Science.gov (United States)

    Boudon, Frédéric; Pradal, Christophe; Cokelaer, Thomas; Prusinkiewicz, Przemyslaw; Godin, Christophe

    2012-01-01

    The study of plant development requires increasingly powerful modeling tools to help understand and simulate the growth and functioning of plants. In the last decade, the formalism of L-systems has emerged as a major paradigm for modeling plant development. Previous implementations of this formalism were made based on static languages, i.e., languages that require explicit definition of variable types before using them. These languages are often efficient but involve quite a lot of syntactic overhead, thus restricting the flexibility of use for modelers. In this work, we present an adaptation of L-systems to the Python language, a popular and powerful open-license dynamic language. We show that the use of dynamic language properties makes it possible to enhance the development of plant growth models: (i) by keeping a simple syntax while allowing for high-level programming constructs, (ii) by making code execution easy and avoiding compilation overhead, (iii) by allowing a high-level of model reusability and the building of complex modular models, and (iv) by providing powerful solutions to integrate MTG data-structures (that are a common way to represent plants at several scales) into L-systems, thus enabling the use of a wide spectrum of computer tools based on MTGs developed for plant architecture. We then illustrate the use of L-Py in real applications to build complex models or to teach plant modeling in the classroom.
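    The parallel rewriting at the heart of any L-system implementation fits naturally in Python. The following minimal sketch is illustrative only; it is not the L-Py API, whose modules carry parameters and a geometric interpretation:

```python
def derive(axiom, rules, steps):
    """Apply L-system production rules to every symbol in parallel,
    `steps` times; symbols without a rule are copied unchanged."""
    s = axiom
    for _ in range(steps):
        s = "".join(rules.get(symbol, symbol) for symbol in s)
    return s

# Lindenmayer's classic algae model: A -> AB, B -> A.
algae = {"A": "AB", "B": "A"}
print(derive("A", algae, 4))  # ABAABABA
```

Successive derivation lengths of this axiom and rule set follow the Fibonacci numbers, a standard property of the algae model.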

  13. Investigation of interphase effects in silica-polystyrene nanocomposites based on a hybrid molecular-dynamics-finite-element simulation framework

    Science.gov (United States)

    Pfaller, Sebastian; Possart, Gunnar; Steinmann, Paul; Rahimi, Mohammad; Müller-Plathe, Florian; Böhm, Michael C.

    2016-05-01

    A recently developed hybrid method is employed to study the mechanical behavior of silica-polystyrene nanocomposites (NCs) under uniaxial elongation. The hybrid method couples a particle domain to a continuum domain. The region of physical interest, i.e., the interphase around a nanoparticle (NP), is treated at molecular resolution, while the surrounding elastic continuum is handled with a finite-element approach. In the present paper we analyze the polymer behavior in the neighborhood of one or two nanoparticle(s) at molecular resolution. The coarse-grained hybrid method allows us to simulate a large polymer matrix region surrounding the nanoparticles. We consider NCs with dilute concentration of NPs embedded in an atactic polystyrene matrix formed by 300 chains with 200 monomer beads. The overall orientation of polymer segments relative to the deformation direction is determined in the neighborhood of the nanoparticle to investigate the polymer response to this perturbation. Calculations of strainlike quantities give insight into the deformation behavior of a system with two NPs and show that the applied strain and the nanoparticle distance have significant influence on the deformation behavior. Finally, we investigate to what extent a continuum-based description may account for the specific effects occurring in the interphase between the polymer matrix and the NPs.

  14. L-Py: an L-System simulation framework for modeling plant development based on a dynamic language

    Directory of Open Access Journals (Sweden)

    Frédéric Boudon

    2012-05-01

    The study of plant development requires increasingly powerful modeling tools to help understand and simulate the growth and functioning of plants. In the last decade, the formalism of L-systems has emerged as a major paradigm for modeling plant development. Previous implementations of this formalism were made based on static languages, i.e. languages that require explicit definition of variable types before using them. These languages are often efficient but involve quite a lot of syntactic overhead, thus restricting the flexibility of use for modelers. In this work, we present an adaptation of L-systems to the Python language, a popular and powerful open-license dynamic language. We show that the use of dynamic language properties makes it possible to enhance the development of plant growth models: (i) by keeping a simple syntax while allowing for high-level programming constructs, (ii) by making code execution easy and avoiding compilation overhead, (iii) by allowing a high level of model reusability and the building of complex modular models, and (iv) by providing powerful solutions to integrate MTG data-structures (that are a common way to represent plants at several scales) into L-systems, thus enabling the use of a wide spectrum of computer tools based on MTGs developed for plant architecture. We then illustrate the use of L-Py in real applications to build complex models or to teach plant modeling in the classroom.

  15. Hydrometeorological multi-model ensemble simulations of the 4 November 2011 flash flood event in Genoa, Italy, in the framework of the DRIHM project

    Directory of Open Access Journals (Sweden)

    A. Hally

    2015-03-01

    The e-Science environment developed in the framework of the EU-funded DRIHM project was used to demonstrate its ability to provide relevant, meaningful hydrometeorological forecasts. This was illustrated for the tragic case of 4 November 2011, when Genoa, Italy, was flooded as the result of heavy, convective precipitation that inundated the Bisagno catchment. The Meteorological Model Bridge (MMB), an innovative software component developed within the DRIHM project for the interoperability of meteorological and hydrological models, is a key component of the DRIHM e-Science environment. The MMB allowed three different rainfall-discharge models (DRiFt, RIBS and HBV) to be driven by four mesoscale limited-area atmospheric models (WRF-NMM, WRF-ARW, Meso-NH and AROME) and a downscaling algorithm (RainFARM) in a seamless fashion. In addition to this multi-model configuration, some of the models were run in probabilistic mode, thus giving a comprehensive account of modelling errors and a very large number of likely hydrometeorological scenarios (>1500). The multi-model approach proved to be necessary because, whilst various aspects of the event were successfully simulated by different models, none of the models reproduced all of these aspects correctly. It was shown that the resulting set of simulations helped identify key atmospheric processes responsible for the large rainfall accumulations over the Bisagno basin. The DRIHM e-Science environment facilitated an evaluation of the sensitivity to atmospheric and hydrological modelling errors. This showed that both had a significant impact on predicted discharges, the former being larger than the latter. Finally, the usefulness of the set of hydrometeorological simulations was assessed from a flash flood early-warning perspective.

  16. The SPRINTARS version 3.80/4D-Var data assimilation system: development and inversion experiments based on the observing system simulation experiment framework

    Directory of Open Access Journals (Sweden)

    K. Yumimoto

    2013-06-01

    We present an aerosol data assimilation system based on a global aerosol climate model (SPRINTARS) and a four-dimensional variational data assimilation method (4D-Var). Its main purposes are to optimize emission estimates, improve composites, and obtain the best estimate of the radiative effects of aerosols in conjunction with observations. To reduce the huge computational cost caused by the iterative integrations in the models, we developed an off-line model and a corresponding adjoint model, which are driven by pre-calculated meteorological, land, and soil data. The off-line and adjoint models shortened the computational time of the inner loop by more than 30%. By comparing the results with a 1 yr simulation from the original on-line model, the consistency of the off-line model was verified, with correlation coefficient R^2 > 0.97 and absolute value of the normalized mean bias (NMB). The feasibility and capability of the developed system for aerosol inverse modelling were demonstrated in several inversion experiments based on the observing system simulation experiment framework. In the experiments, we generated simulated observation data sets of fine- and coarse-mode AOTs from sun-synchronous polar orbits to investigate the impact of the observational frequency (number of satellites) and coverage (land and ocean). Observations over land have a notably positive impact on the performance of inverse modelling compared with observations over ocean, implying that reliable observational information over land is important for inverse modelling of land-borne aerosols. The experimental results also indicate that aerosol type classification is crucial to inverse modelling over regions where various aerosol species co-exist (e.g. industrialized regions and areas downwind of them).
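    At its core, 4D-Var fits an initial state to a window of observations by minimizing a misfit cost function whose gradient is computed by a backward (adjoint) model run. The toy scalar sketch below illustrates that mechanism only; the model, names, and parameters are invented and unrelated to the SPRINTARS implementation:

```python
def fourdvar_scalar(obs, a, iters=200, lr=0.01):
    """Recover the initial state x0 of the linear model x_{k+1} = a * x_k
    by minimizing J(x0) = 0.5 * sum_k (x_k - obs_k)^2, with the gradient
    obtained from an adjoint (backward) integration, as in 4D-Var."""
    x0 = 0.0
    n = len(obs)
    for _ in range(iters):
        # forward model run over the assimilation window
        traj = [x0]
        for _ in range(n - 1):
            traj.append(a * traj[-1])
        # adjoint run accumulates dJ/dx0 = sum_k a^k * (x_k - obs_k)
        lam = 0.0
        for k in reversed(range(n)):
            lam = a * lam + (traj[k] - obs[k])
        x0 -= lr * lam  # steepest-descent update of the initial state
    return x0

obs = [2.0 * 0.9 ** k for k in range(10)]  # perfect observations from x0 = 2
x0_est = fourdvar_scalar(obs, a=0.9)
```

With perfect observations, the minimization recovers the true initial state; real systems add background and observation-error weighting to the cost function.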

  17. Hydrometeorological multi-model ensemble simulations of the 4 November 2011 flash-flood event in Genoa, Italy, in the framework of the DRIHM project

    Directory of Open Access Journals (Sweden)

    A. Hally

    2014-11-01

    The e-Science environment developed in the framework of the EU-funded DRIHM project was used to demonstrate its capability to provide relevant, meaningful hydrometeorological forecasts. This was illustrated for the tragic case of 4 November 2011, when Genoa, Italy, was flooded as the result of heavy, convective precipitation that inundated the Bisagno catchment. The Meteorological Model Bridge (MMB), an innovative software component developed within the DRIHM project for the interoperability of meteorological and hydrological models, is a key component of the DRIHM e-Science environment. The MMB allowed three different rainfall-discharge models (DRiFt, RIBS, and HBV) to be driven by four mesoscale limited-area atmospheric models (WRF-NMM, WRF-ARW, Meso-NH, and AROME) and a downscaling algorithm (RainFARM) in a seamless fashion. In addition to this multi-model configuration, some of the models were run in probabilistic mode, thus allowing a comprehensive account of modelling errors and a very large number of likely hydrometeorological scenarios (>1500). The multi-model approach proved to be necessary because, whilst various aspects of the event were successfully simulated by different models, none of the models reproduced all of these aspects correctly. It was shown that the resulting set of simulations helped identify key atmospheric processes responsible for the large rainfall accumulations over the Bisagno basin. The DRIHM e-Science environment facilitated an evaluation of the sensitivity to atmospheric and hydrological modelling errors. This showed that both had a significant impact on predicted discharges, the former being larger than the latter. Finally, the usefulness of the set of hydrometeorological simulations was assessed from a flash-flood early-warning perspective.

  18. The development, design, testing, refinement, simulation and application of an evaluation framework for communities of practice and social-professional networks

    Directory of Open Access Journals (Sweden)

    Ball Dianne

    2009-09-01

    Background: Communities of practice and social-professional networks are generally considered to enhance workplace experience and enable organizational success. However, despite the remarkable growth in interest in the role of collaborating structures in a range of industries, there is a paucity of empirical research to support this view. Nor is there a convincing model for their systematic evaluation, despite the significant potential benefits in answering the core question: how well do groups of professionals work together and how could they be organised to work together more effectively? This research project will produce a rigorous evaluation methodology and deliver supporting tools for the benefit of researchers, policymakers, practitioners and consumers within the health system and other sectors. Given the prevalence and importance of communities of practice and social networks, and the extent of investments in them, this project represents a scientific innovation of national and international significance. Methods and design: Working in four conceptual phases the project will employ a combination of qualitative and quantitative methods to develop, design, field-test, refine and finalise an evaluation framework. Once available the framework will be used to evaluate simulated, and then later existing, health care communities of practice and social-professional networks to assess their effectiveness in achieving desired outcomes. Peak stakeholder groups have agreed to involve a wide range of members and participant organisations, and will facilitate access to various policy, managerial and clinical networks. Discussion: Given its scope and size, the project represents a valuable opportunity to achieve breakthroughs at two levels; firstly, by introducing novel and innovative aims and methods into the social research process and, secondly, through the resulting evaluation framework and tools. We anticipate valuable outcomes in the

  19. Using the DeNitrification-DeComposition Framework to Simulate Global Soil Nitrous Oxide Emissions in the Community Land Model with Coupled Carbon and Nitrogen

    Science.gov (United States)

    Seok, B.; Saikawa, E.

    2015-12-01

    Soils are among the largest emission sources of nitrous oxide (N2O), which is a strong greenhouse gas and is the leading stratospheric ozone depleting substance. Thus, there is a rising concern for mitigating N2O emissions from soils. And yet, our understanding of the global magnitude and the impacts of soil N2O emissions on the climate and the stratospheric ozone layer is still limited, and our ability to mitigate N2O emissions thus remains a challenge. One approach to assess the global magnitude and impacts of N2O emissions is to use global biogeochemical models. Currently, most of these models use a simple or a conceptual framework to simulate soil N2O emissions. However, if we are to reduce the uncertainty in determining the N2O budget, a better representation of the soil N2O emissions process is essential. In our attempts to fulfill this objective, we have further improved the parameterization of soil N2O emissions in the Community Land Model with coupled Carbon and Nitrogen (CLM-CN) by implementing the DeNitrification-DeComposition (DNDC) model and validated our model results to existing measurements. We saw a general improvement in simulated N2O emissions with the updated parameterization and further improvements for specific sites when the model was nudged with measured soil temperature and moisture data of the respective site. We present the latest updates and changes made to CLM-CN with one-way coupled DNDC model (CLMCN-N2O) and compare the results between model versions and against other global biogeochemical models.

  20. Combustion and leaching behavior of elements in the argonne premium coal samples

    Science.gov (United States)

    Finkelman, R.B.; Palmer, C.A.; Krasnow, M.R.; Aruscavage, P. J.; Sellers, G.A.; Dulong, F.T.

    1990-01-01

    Eight Argonne Premium Coal samples and two other coal samples were used to observe the effects of combustion and leaching on 30 elements. The results were used to infer the modes of occurrence of these elements. Instrumental neutron activation analysis indicates that the effects of combustion and leaching on many elements varied markedly among the samples. As much as 90% of the selenium and bromine is volatilized from the bituminous coal samples, but substantially less is volatilized from the low-rank coals. We interpret the combustion and leaching behavior of these elements to indicate that they are associated with the organic fraction. Sodium, although nonvolatile, is ion-exchangeable in most samples, particularly in the low-rank coal samples where it is likely to be associated with the organic constituents. Potassium is primarily in an ion-exchangeable form in the Wyodak coal but is in HF-soluble phases (probably silicates) in most other samples. Cesium is in an unidentified HNO3-soluble phase in most samples. Virtually all the strontium and barium in the low-rank coal samples is removed by NH4OAc followed by HCl, indicating that these elements probably occur in both organic and inorganic phases. Most tungsten and tantalum are in insoluble phases, perhaps as oxides or in organic association. Hafnium is generally insoluble, but as much as 65% is HF soluble, perhaps due to the presence of very fine grained or metamict zircon. We interpret the leaching behavior of uranium to indicate its occurrence in chelates and its association with silicates and with zircon. Most of the rare-earth elements (REE) and thorium appear to be associated with phosphates. Differences in textural relationships may account for some of the differences in leaching behavior of the REE among samples. Zinc occurs predominantly in sphalerite. Either the remaining elements occur in several different modes of occurrence (scandium, iron), or the leaching data are equivocal (arsenic, antimony

  1. Research on the Framework of Exploratory Simulation Experiment Based on Data Farming

    Institute of Scientific and Technical Information of China (English)

    李斌; 李春洪; 刘苏洋

    2011-01-01

    Based on data farming, this paper designs a framework for exploratory simulation experiments and briefly discusses each of its parts. The framework consists of four parts: a ping-pong-style adversarial wargaming preparatory experiment, a single-scenario building loop, a simulation scenario-space execution loop, and a simulation results analysis loop. By combining human experience and intelligence with computer simulation, the advice needed for military decision-making is formed and war rules of interest are gradually discovered over repeated cycles. Under this exploratory simulation experiment framework, various qualitative and quantitative analysis methods can be integrated to carry out experiments aimed at the experimental objective.
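    Computationally, the scenario-space execution loop is a parameter sweep with stochastic replications per scenario. The sketch below uses an invented toy engagement model and invented parameter names purely to illustrate that loop structure; none of it comes from the paper:

```python
import itertools
import random
import statistics

def run_scenario(params, seed):
    # Toy stochastic engagement model: each of 100 trials is won with
    # probability proportional to attacker strength over total strength.
    rng = random.Random(seed)
    a, d = params["attack"], params["defend"]
    wins = sum(rng.random() < a / (a + d) for _ in range(100))
    return wins / 100

def farm(space, replications=20):
    """Data-farming loop: run every point of the scenario space with
    several seeded replications and aggregate the outcomes."""
    results = {}
    keys = sorted(space)
    for values in itertools.product(*(space[k] for k in keys)):
        params = dict(zip(keys, values))
        outcomes = [run_scenario(params, seed) for seed in range(replications)]
        results[values] = statistics.mean(outcomes)
    return results

# Sweep a 2x2 scenario space; keys sort as ("attack", "defend").
res = farm({"attack": [1, 2], "defend": [1, 2]})
```

The analysis loop would then mine `res` for regularities, for instance that outcomes improve with the ratio of attacker to defender strength.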

  2. Simulation

    DEFF Research Database (Denmark)

    Gould, Derek A; Chalmers, Nicholas; Johnson, Sheena J;

    2012-01-01

    Recognition of the many limitations of traditional apprenticeship training is driving new approaches to learning medical procedural skills. Among the simulation technologies and methods available today, computer-based systems are topical and bring the benefits of automated, repeatable, and reliable performance assessments. Human factors research is central to simulator model development that is relevant to real-world imaging-guided interventional tasks and to the credentialing programs in which it would be used.

  3. NNWSI [Nevada Nuclear Waste Storage Investigations] waste form testing at Argonne National Laboratory; Semiannual report, January--June 1988

    Energy Technology Data Exchange (ETDEWEB)

    Bates, J.K.; Gerding, T.J.; Ebert, W.L.; Mazer, J.J.; Biwer, B.M. [Argonne National Lab., IL (USA)]

    1990-04-01

    The Chemical Technology Division of Argonne National Laboratory is performing experiments in support of the waste package development of the Yucca Mountain Project (formerly the Nevada Nuclear Waste Storage Investigations Project). Experiments in progress include (1) the development and performance of a durability test in unsaturated conditions, (2) studies of waste form behavior in an irradiated atmosphere, (3) studies of behavior in water vapor, and (4) studies of naturally occurring glasses to be used as analogues for waste glass behavior. This report documents progress made during the period of January--June 1988. 21 refs., 37 figs., 12 tabs.

  4. An evaluation of alternative reactor vessel cutting technologies for the experimental boiling water reactor at Argonne National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Boing, L.E.; Henley, D.R. (Argonne National Lab., IL (USA)); Manion, W.J.; Gordon, J.W. (Nuclear Energy Services, Inc., Danbury, CT (USA))

    1989-12-01

    Metal cutting techniques that can be used to segment the reactor pressure vessel of the Experimental Boiling Water Reactor (EBWR) at Argonne National Laboratory (ANL) have been evaluated by Nuclear Energy Services. Twelve cutting technologies are described in terms of their ability to perform the required task, their performance characteristics, environmental and radiological impacts, and cost and schedule considerations. Specific recommendations regarding which technology should ultimately be used by ANL are included. The selection of a cutting method was the responsibility of the decommissioning staff at ANL, who included a relative weighting of the parameters described in this document in their evaluation process. 73 refs., 26 figs., 69 tabs.

  5. Simulation of the Mechanism of Gas Sorption in a Metal–Organic Framework with Open Metal Sites: Molecular Hydrogen in PCN-61

    KAUST Repository

    Forrest, Katherine A.

    2012-07-26

    Grand canonical Monte Carlo (GCMC) simulations were performed to investigate hydrogen sorption in an rht-type metal-organic framework (MOF), PCN-61. The MOF was shown to have a large hydrogen uptake, and this was studied using three different hydrogen potentials, effective for bulk hydrogen but of varying sophistication: a model that includes only repulsion/dispersion parameters, one augmented with charge-quadrupole interactions, and one supplemented with many-body polarization interactions. Calculated hydrogen uptake isotherms and isosteric heats of adsorption, Qst, were in quantitative agreement with experiment only for the model with explicit polarization. This success in reproducing empirical measurements suggests that modeling MOFs with open metal sites is feasible, even though such systems are often considered not to be well described by a classical potential function; here it is shown that they may be accurately described by explicitly including polarization effects in an otherwise traditional empirical potential. Decomposition of the energy terms for the models revealed deviations between the electrostatic and polarizable results beyond what would be expected from merely augmenting the potential surface with induction. Charge-quadrupole and induction energetics were shown to have a synergistic interaction, with inclusion of the latter resulting in a significant increase in the former. Induction interactions strongly influence the structure of the sorbed hydrogen compared to the models lacking polarizability; sorbed hydrogen is a dipolar dense fluid in the MOF. This study demonstrates that many-body polarization makes a critical contribution to gas sorption structure and must be accounted for in modeling MOFs with polar interaction sites. © 2012 American Chemical Society.
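    GCMC alternates particle insertion and deletion attempts accepted with Metropolis probabilities. A minimal sketch for a non-interacting (ideal) gas, where the exact answer <N> = zV is known, gives the bare mechanism; the actual PCN-61 study adds repulsion/dispersion, electrostatic, and polarization energies to these acceptance rules:

```python
import random

def gcmc_ideal_gas(z, V, steps, seed=0):
    """Grand canonical MC for an ideal gas at activity z in volume V.
    With zero interaction energy, an insertion is accepted with
    probability min(1, z*V/(N+1)) and a deletion with min(1, N/(z*V));
    the exact grand canonical average is <N> = z*V."""
    rng = random.Random(seed)
    n = 0
    total = 0
    for _ in range(steps):
        if rng.random() < 0.5:                 # attempt insertion
            if rng.random() < z * V / (n + 1):
                n += 1
        elif n > 0:                            # attempt deletion
            if rng.random() < n / (z * V):
                n -= 1
        total += n
    return total / steps

avg_n = gcmc_ideal_gas(z=0.5, V=100.0, steps=200_000)  # should approach 50
```

For an interacting sorbate the acceptance ratios pick up Boltzmann factors of the insertion/deletion energy, which is where the repulsion/dispersion, charge-quadrupole, and polarization terms enter.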

  6. Simulation

    CERN Document Server

    Ross, Sheldon

    2006-01-01

    Ross's Simulation, Fourth Edition introduces aspiring and practicing actuaries, engineers, computer scientists and others to the practical aspects of constructing computerized simulation studies to analyze and interpret real phenomena. Readers learn to apply results of these analyses to problems in a wide variety of fields to obtain effective, accurate solutions and make predictions about future outcomes. This text explains how a computer can be used to generate random numbers, and how to use these random numbers to generate the behavior of a stochastic model over time. It presents the statist

  7. Enhanced Methane Adsorption in Catenated Metal-organic Frameworks: A Molecular Simulation Study

    Institute of Scientific and Technical Information of China (English)

    薛春瑜; 周了娥; 阳庆元; 仲崇立

    2009-01-01

    A systematic molecular simulation study was performed to investigate the effect of catenation on methane adsorption in metal-organic frameworks (MOFs). Four pairs of isoreticular MOFs (IRMOFs) with and without catenation were adopted and their capacities for methane adsorption were compared at room temperature. The present work showed that catenation could greatly enhance the storage capacity of methane in MOFs, owing to the additional small pores and adsorption sites formed by the catenation of the frameworks. In addition, the simulation results obtained at 298 K and 3.5 MPa showed that catenated MOFs could easily meet the requirement for methane storage in porous materials.
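    Simulated uptake isotherms such as these are commonly summarized with a Langmuir model, q(P) = q_max·K·P / (1 + K·P). The sketch below fits synthetic data via the standard linearization P/q = P/q_max + 1/(K·q_max); all numerical values are invented for illustration, not taken from the study.

    ```python
    def langmuir(P, q_max, K):
        """Langmuir uptake (e.g. mmol/g) at pressure P."""
        return q_max * K * P / (1.0 + K * P)

    # synthetic "simulated" isotherm points (pressure in MPa, uptake in mmol/g)
    q_max_true, K_true = 12.0, 0.8
    pressures = [0.5, 1.0, 2.0, 3.5, 5.0, 8.0]
    data = [(P, langmuir(P, q_max_true, K_true)) for P in pressures]

    # linearized Langmuir fit: P/q = (1/q_max)*P + 1/(K*q_max), a straight line in P
    xs = [P for P, q in data]
    ys = [P / q for P, q in data]
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    q_max_fit = 1.0 / slope        # recovers 12.0
    K_fit = slope / intercept      # recovers 0.8
    ```

    With noise-free synthetic points the fit recovers the parameters exactly; with real GCMC data the same regression gives the saturation capacity and affinity constant used to compare catenated and non-catenated frameworks.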

  8. Numerical Simulation of Casting Deformation and Stress in the Ti-Alloy Parts with Framework Structure

    Institute of Scientific and Technical Information of China (English)

    崔新鹏; 张晨; 范世玺; 南海

    2015-01-01

    The temperature field, stress field, and casting deformation of Ti-alloy framework castings during the pouring and solidification process were simulated with the ProCAST software. The graphite mould was modeled as RIGID at the initial stage of the stress calculation and as VACANT at the later stage. The simulation results reveal that the temperature of the upper and inner parts of the framework castings is much lower than that of the bottom part and the pouring gates. Bending deflection along the -Y axis was observed at the two tips of the castings, and stress concentration was observed at the corners of the internal framework structure. The actual casting dimension measurements agree well with the simulated results, verifying the accuracy of the simulation.

  9. Statistical framework for evaluation of climate model simulations by use of climate proxy data from the last millennium – Part 3: Practical considerations, relaxed assumptions, and using tree-ring data to address the amplitude of solar forcing

    Directory of Open Access Journals (Sweden)

    A. Moberg

    2014-06-01

    Practical issues arise when applying a statistical framework for unbiased ranking of alternative forced climate model simulations by comparison with climate observations from instrumental and proxy data (Part 1 in this series). Given a set of model and observational data, several decisions need to be made, e.g. concerning the region that each proxy series represents, the weighting of different regions, and the time resolution to use in the analysis. Objective selection criteria cannot be given here, but we argue for studying how sensitive the results are to the choices made. The framework is improved by relaxing two assumptions: allowing autocorrelation in the statistical model for simulated climate variability, and enabling direct comparison of alternative simulations to test whether any of them fits the observations significantly better. The extended framework is applied to a set of simulations driven with forcings for the pre-industrial period 1000–1849 CE and fifteen tree-ring based temperature proxy series. Simulations run with only one external forcing (land-use, volcanic, small-amplitude solar, or large-amplitude solar) do not significantly capture the variability in the tree-ring data, although the simulation with volcanic forcing does so for some experiment settings. When all forcings are combined (using either the small- or large-amplitude solar forcing, and including also orbital, greenhouse-gas and non-volcanic aerosol forcing), and additionally used to produce small simulation ensembles starting from slightly different initial ocean conditions, the resulting simulations are highly capable of capturing some observed variability. Nevertheless, for some choices in the experiment design, they are not significantly closer to the observations than when unforced simulations are used, due to highly variable results between regions. It is also not possible to tell whether the small-amplitude or large-amplitude solar forcing causes the multiple
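    The core of such a ranking exercise, greatly simplified, is a distance measure between each candidate simulation and the proxy-based observations. The sketch below ranks simulations by plain RMSE against one observed series; it omits the paper's autocorrelation modeling, region weighting, and significance testing, and all names and numbers are invented.

    ```python
    def rmse(a, b):
        """Root-mean-square distance between two equal-length series."""
        return (sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)) ** 0.5

    def rank_simulations(proxy, simulations):
        """Return (name, rmse) pairs sorted best-first against the proxy series."""
        return sorted(((name, rmse(series, proxy)) for name, series in simulations.items()),
                      key=lambda pair: pair[1])

    # toy proxy-based temperature anomalies (e.g. 30-yr means) and two candidates
    proxy = [0.1, -0.2, -0.6, -0.3, 0.0]
    simulations = {
        "all_forcings": [0.0, -0.1, -0.5, -0.3, 0.1],
        "unforced":     [0.0,  0.0,  0.0,  0.0, 0.0],
    }
    ranking = rank_simulations(proxy, simulations)  # "all_forcings" ranks first
    ```

    The framework in the record replaces this raw distance with a statistic whose null distribution accounts for internal variability and proxy noise, which is what makes the ranking unbiased.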

  10. Digital Polygon Model Grid of the Hydrogeologic Framework of Bedrock Units for a Simulation of Groundwater Flow for the Lake Michigan Basin

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — The hydrogeologic framework for the Lake Michigan Basin model was developed by grouping the bedrock geology of the study area into hydrogeologic units on the basis...

  11. V&V Of CFD Modeling Of The Argonne Bubble Experiment: FY15 Summary Report

    Energy Technology Data Exchange (ETDEWEB)

    Hoyt, Nathaniel C. [Argonne National Lab. (ANL), Argonne, IL (United States); Wardle, Kent E. [Argonne National Lab. (ANL), Argonne, IL (United States); Bailey, James L. [Argonne National Lab. (ANL), Argonne, IL (United States); Basavarajappa, Manjunath [Univ. of Utah, Salt Lake City, UT (United States)

    2015-09-30

    In support of the development of accelerator-driven production of the fission product Mo-99, computational fluid dynamics (CFD) simulations of an electron-beam irradiated, experimental-scale bubble chamber have been conducted in order to aid in interpretation of existing experimental results, provide additional insights into the physical phenomena, and develop predictive thermal hydraulic capabilities that can be applied to full-scale target solution vessels. Toward that end, a custom hybrid Eulerian-Eulerian-Lagrangian multiphase solver was developed, and simulations have been performed on high-resolution meshes. Good agreement between experiments and simulations has been achieved, especially with respect to the prediction of the maximum temperature of the uranyl sulfate solution in the experimental vessel. These positive results suggest that the simulation methodology that has been developed will prove suitable to assist in the development of full-scale production hardware.

  12. SIMULATION MODEL RESOURCE SEARCH FRAMEWORK BASED ON SEMANTICS DESCRIPTION OF CONCEPTUAL MODEL

    Institute of Scientific and Technical Information of China (English)

    康晓予; 邓贵仕

    2011-01-01

    Constructing new simulation applications by reusing existing simulation models has long received attention in the system simulation field. A key issue in realizing such reuse is searching for, evaluating, and applying simulation model resources that match the application requirements from a model database. This paper proposes a search framework for simulation model resources based on a semantic description of the conceptual model, and describes its structure in detail. The framework builds a semantic description model for simulation model resources from conceptual model elements such as entities, tasks, and interactions, and uses search strategies including ontology semantics and keyword matching. Simulation experiments indicate that the framework can markedly improve the accuracy of search and evaluation.
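    The combination of keyword matching with ontology-based synonym expansion can be sketched minimally as below. The toy ontology, resource descriptions, and scoring rule are all invented for illustration and do not reflect the paper's actual semantic model.

    ```python
    # toy "ontology": each term maps to related concepts it should also match
    ONTOLOGY = {
        "tank": {"vehicle", "armor"},
        "radar": {"sensor"},
    }

    def expand(terms):
        """Expand query terms with their ontology-related concepts."""
        expanded = set(terms)
        for t in terms:
            expanded |= ONTOLOGY.get(t, set())
        return expanded

    def score(query_terms, resource_terms):
        """Fraction of expanded query terms matched by the resource description."""
        q = expand(query_terms)
        return len(q & set(resource_terms)) / len(q)

    # each model resource is described by conceptual-model elements
    resources = {
        "model_A": {"entity": "vehicle", "task": "move", "interaction": "detect"},
        "model_B": {"entity": "aircraft", "task": "fly", "interaction": "jam"},
    }

    def search(query_terms, resources):
        """Rank resources by descending match score against the query."""
        return sorted(resources.items(),
                      key=lambda kv: score(query_terms, kv[1].values()),
                      reverse=True)

    results = search({"tank", "move"}, resources)  # model_A matches via ontology
    ```

    The ontology expansion is what lets a query for "tank" match a resource described by the entity "vehicle", which plain keyword matching would miss.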

  13. Global climate change and international security. Report on a conference held at Argonne National Laboratory, May 8--10, 1991

    Energy Technology Data Exchange (ETDEWEB)

    Rice, M.

    1991-12-31

    On May 8--10, 1991, the Midwest Consortium of International Security Studies (MCISS) and Argonne National Laboratory cosponsored a conference on Global Climate Change and International Security. The aim was to bring together natural and social scientists to examine the economic, sociopolitical, and security implications of the climate changes predicted by the general circulation models developed by natural scientists. Five themes emerged from the papers and discussions: (1) general circulation models and predicted climate change; (2) the effects of climate change on agriculture, especially in the Third World; (3) economic implications of policies to reduce greenhouse gas emissions; (4) the sociopolitical consequences of climate change; and (5) the effect of climate change on global security.

  14. Radiological and Environmental Research Division annual report. Fundamental molecular physics and chemistry, June 1975--September 1976. [Summaries of research activities at Argonne National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    None

    1976-01-01

    A summary of research activities in the fundamental molecular physics and chemistry section at Argonne National Laboratory from July 1975 to September 1976 is presented. Of the 40 articles and abstracts given, 24 have been presented at conferences or have been published and will be separately abstracted. Abstracts of the remaining 16 items appear in this issue of ERA. (JFP)

  15. Upgrading the Benchmark Simulation Model Framework with emerging challenges - A study of N2O emissions and the fate of pharmaceuticals in urban wastewater systems

    DEFF Research Database (Denmark)

    Snip, Laura

    …the performance of a WWTP can be assessed with mathematical models that can be used in simulation studies. The Benchmark Simulation Model (BSM) framework was developed to objectively compare different operational/control strategies. As different operational strategies of a WWTP will most likely have an effect … on the greenhouse gas (GHG) emissions and the removal rate of micropollutants (MPs), modelling these processes for dynamic simulations and evaluation seems to be a promising tool for optimisation of a WWTP. Therefore, in this thesis the BSM is upgraded with processes describing GHG emissions and MP removal … for pharmaceuticals with a more random occurrence. Different sewer conditions demonstrated effects on the occurrence of the pharmaceuticals, as influent patterns at the inlet of the WWTP were smoothed or delayed. The fate in the WWTP showed that operational conditions can influence the biotransformation…

  16. Using the C4ISR Architecture Framework as a Tool to Facilitate VV&A for Simulation Systems within the Military Application Domain

    CERN Document Server

    Tolk, Andreas

    2010-01-01

    To harmonize the individual architectures of the different commands, services, and agencies dealing with the development and procurement of Command, Control, Communications, Computers, Intelligence, Surveillance, and Reconnaissance (C4ISR) systems, the C4ISR Architecture Framework was developed based on existing and mature modeling techniques and methods. Within a short period, NATO adopted this method family as the NATO Consultation, Command, and Control (C3) System Architecture Framework to harmonize the efforts of the different nations. Based on these products, for every system to be fielded in the US Armed Forces, a C4I Support Plan (C4ISP) has to be developed, enabling the integration of the particular system into the integrated C4I architecture. The tool set proposed by these architecture frameworks connects the operational views of the military user, the system views of the developers, and the technical views for the standards and integration methods needed to make the network-centric system of systems wor

  17. Statistical framework for evaluation of climate model simulations by use of climate proxy data from the last millennium – Part 2: A pseudo-proxy study addressing the amplitude of solar forcing

    Directory of Open Access Journals (Sweden)

    A. Hind

    2012-08-01

    The statistical framework of Part 1 (Sundberg et al., 2012), for comparing ensemble simulation surface temperature output with temperature proxy and instrumental records, is implemented in a pseudo-proxy experiment. A set of previously published millennial forced simulations (Max Planck Institute – COSMOS), including both "low" and "high" solar radiative forcing histories together with other important forcings, was used to define "true" target temperatures as well as pseudo-proxy and pseudo-instrumental series. In a global land-only experiment, using annual mean temperatures at a 30-yr time resolution with realistic proxy noise levels, it was found that the low and high solar full-forcing simulations could be distinguished. In an additional experiment, where pseudo-proxies were created to reflect a current set of proxy locations and noise levels, the low and high solar forcing simulations could only be distinguished when the latter served as targets. To improve detectability of the low solar simulations, increasing the signal-to-noise ratio in local temperature proxies was more efficient than increasing the spatial coverage of the proxy network. The experience gained here will be of guidance when these methods are applied to real proxy and instrumental data, for example when the aim is to distinguish which of the alternative solar forcing histories is most compatible with the observed/reconstructed climate.
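    The pseudo-proxy construction itself is straightforward: degrade a "true" model temperature series with white noise scaled to a chosen signal-to-noise ratio. The sketch below shows the common convention std(noise) = std(signal)/SNR; the function name, the sinusoidal "true" series, and the SNR value are illustrative assumptions, not the paper's actual setup.

    ```python
    import math
    import random

    def make_pseudo_proxy(signal, snr, seed=0):
        """Add white noise to a 'true' series so that std(noise) = std(signal)/snr."""
        rng = random.Random(seed)
        mean = sum(signal) / len(signal)
        sig_std = math.sqrt(sum((x - mean) ** 2 for x in signal) / len(signal))
        noise_std = sig_std / snr
        return [x + rng.gauss(0.0, noise_std) for x in signal]

    # toy "true" temperature series: a smooth oscillation over many cycles
    signal = [math.sin(2.0 * math.pi * i / 50.0) for i in range(5000)]
    pseudo = make_pseudo_proxy(signal, snr=2.0)

    # verify the realized noise level (signal std is 1/sqrt(2), so target is ~0.35)
    noise = [p - s for p, s in zip(pseudo, signal)]
    noise_std = math.sqrt(sum(n * n for n in noise) / len(noise))
    ```

    Lowering the SNR here mimics noisier proxies, which is exactly the lever the experiment varies when asking whether the low- and high-solar simulations remain distinguishable.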

  18. Effects of vertical girder realignment in the Argonne APS storage ring.

    Energy Technology Data Exchange (ETDEWEB)

    Lessner, E.

    1999-04-14

    The effects of vertical girder misalignments on the vertical orbit of the Advanced Photon Source (APS) storage ring are studied. Partial sector-realignment is prioritized in terms of the closed-orbit distortions due to misalignments of the corresponding girders in the sectors. A virtual girder-displacement (VGD) method is developed that allows the effects of a girder realignment to be tested prior to physically moving the girder. The method can also be used to anticipate the corrector strengths needed to restore the beam orbit after a realignment. Simulation results are compared to experimental results and found to reproduce the latter quite closely. Predicted corrector strengths are also found to be close to the actual local corrector strengths after a proof-of-principle two-sector realignment was performed.
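    Anticipating corrector strengths from a known orbit response is, at its core, a least-squares problem: find the corrector kicks that best cancel the measured (or, for a virtual girder displacement, predicted) orbit distortion. The sketch below solves the two-corrector case via the normal equations; the response-matrix values and kick magnitudes are hypothetical, and this is not the APS code.

    ```python
    def corrector_strengths(R, d):
        """Least-squares corrector kicks theta minimizing |R @ theta + d|^2,
        for 2 correctors measured at len(R) beam-position monitors.
        Solves the 2x2 normal equations (R^T R) theta = -R^T d by Cramer's rule."""
        a = sum(r[0] * r[0] for r in R)
        b = sum(r[0] * r[1] for r in R)
        e = sum(r[1] * r[1] for r in R)
        rhs0 = -sum(r[0] * di for r, di in zip(R, d))
        rhs1 = -sum(r[1] * di for r, di in zip(R, d))
        det = a * e - b * b
        return [(rhs0 * e - b * rhs1) / det, (a * rhs1 - b * rhs0) / det]

    # hypothetical orbit response matrix: orbit shift at 4 BPMs per unit kick
    R = [[1.0, 0.2], [0.5, 1.0], [0.2, 0.8], [0.1, 0.3]]
    true_kicks = [0.3, -0.5]
    # orbit distortion that these kicks would exactly cancel
    d = [-(r[0] * true_kicks[0] + r[1] * true_kicks[1]) for r in R]
    kicks = corrector_strengths(R, d)  # recovers [0.3, -0.5]
    ```

    In the VGD method, d would be the orbit distortion computed from the virtual girder move, so the required corrector settings are known before the girder is physically touched.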

  19. Vibratory response of a mirror support/positioning system for the Advanced Photon Source project at Argonne National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Basdogan, I.; Shu, Deming; Kuzay, T.M. [Argonne National Lab., IL (United States); Royston, T.J.; Shabana, A.A. [Univ. of Illinois, Chicago, IL (United States)

    1996-08-01

    The vibratory response of a typical mirror support/positioning system used at the experimental station of the Advanced Photon Source (APS) project at Argonne National Laboratory is investigated. Positioning precision and stability are especially critical when the supported mirror directs a high-intensity beam aimed at a distant target. Stability may be compromised by low level, low frequency seismic and facility-originated vibrations traveling through the ground and/or vibrations caused by flow-structure interactions in the mirror cooling system. The example case system has five positioning degrees of freedom through the use of precision actuators and rotary and linear bearings. These linkage devices result in complex, multi-dimensional vibratory behavior that is a function of the range of positioning configurations. A rigorous multibody dynamical approach is used for the development of the system equations. Initial results of the study, including estimates of natural frequencies and mode shapes, as well as limited parametric design studies, are presented. While the results reported here are for a particular system, the developed vibratory analysis approach is applicable to the wide range of high-precision optical positioning systems encountered at the APS and at other comparable facilities.

  20. Advances in thermal hydraulic and neutronic simulation for reactor analysis and safety

    Energy Technology Data Exchange (ETDEWEB)

    Tentner, A.M.; Blomquist, R.N.; Canfield, T.R.; Ewing, T.F.; Garner, P.L.; Gelbard, E.M.; Gross, K.C.; Minkoff, M.; Valentin, R.A.

    1993-03-01

    This paper describes several large-scale computational models developed at Argonne National Laboratory for the simulation and analysis of thermal-hydraulic and neutronic events in nuclear reactors and nuclear power plants. The impact of advanced parallel computing technologies on these computational models is emphasized.