WorldWideScience

Sample records for argonne simulation framework

  1. Component-Based Framework for Subsurface Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Palmer, Bruce J.; Fang, Yilin; Hammond, Glenn E.; Gurumoorthi, Vidhya

    2007-08-01

    Simulations in the subsurface environment represent a broad range of phenomena covering an equally broad range of scales. Developing modelling capabilities that can integrate models representing different phenomena acting at different scales presents formidable challenges from both the algorithmic and the computer science perspectives. This paper describes the development of an integrated framework that will be used to combine different models into a single simulation. Initial work has focused on creating two frameworks: one for performing smoothed particle hydrodynamics (SPH) simulations of fluid systems, the other for performing grid-based continuum simulations of reactive subsurface flow. The SPH framework is based on a parallel code developed for pore-scale simulations; the continuum grid-based framework is based on the STOMP (Subsurface Transport Over Multiple Phases) code developed at PNNL. Future work will focus on combining the two frameworks to perform multiscale, multiphysics simulations of reactive subsurface flow.
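
    The SPH side of such a coupling rests on kernel-weighted sums over neighbouring particles. As a minimal sketch (ours, not the PNNL code), the density estimate rho_i = sum_j m_j W(|r_i - r_j|, h) with the standard cubic spline kernel can be written in a few lines of NumPy; all names and parameter values below are illustrative.

```python
import numpy as np

def cubic_spline_kernel(r, h):
    """Standard cubic spline SPH kernel in 3D (normalization 1/(pi*h^3))."""
    q = r / h
    sigma = 1.0 / (np.pi * h**3)
    w = np.where(q < 1.0, 1.0 - 1.5 * q**2 + 0.75 * q**3,
         np.where(q < 2.0, 0.25 * (2.0 - q)**3, 0.0))
    return sigma * w

def sph_density(positions, masses, h):
    """Density at each particle: rho_i = sum_j m_j W(|r_i - r_j|, h)."""
    diff = positions[:, None, :] - positions[None, :, :]
    r = np.linalg.norm(diff, axis=-1)
    return (masses[None, :] * cubic_spline_kernel(r, h)).sum(axis=1)

# Example: 100 equal-mass particles in a unit cube.
rng = np.random.default_rng(0)
pos = rng.random((100, 3))
rho = sph_density(pos, np.full(100, 1.0 / 100), h=0.1)
```

    A production pore-scale code would of course use neighbour lists rather than the O(N^2) pairwise distances above; the sketch only shows the kernel-summation structure that a coupling framework has to expose.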

  2. Framework for utilizing computational devices within simulation

    Directory of Open Access Journals (Sweden)

    Miroslav Mintál

    2013-12-01

    Full Text Available Several frameworks now exist for exploiting the computational power of graphics cards and other devices such as FPGAs, ARM processors, and multi-core CPUs. The best-known ones are either low-level, requiring a large amount of controlling code, or are bound to specific graphics cards. More specialized frameworks also exist, aimed mainly at mathematical applications. The framework described here is tailored to multi-agent simulations: it offers the option to accelerate computations when preparing a simulation and, above all, to accelerate the computation of the simulation itself.

  3. Development of the ATLAS Simulation Framework

    Institute of Scientific and Technical Information of China (English)

    A. Dell'Acqua; K. Amako; et al.

    2001-01-01

    The object-oriented (OO) approach is a key technology for developing software systems in the LHC/ATLAS experiment. We developed an OO simulation framework based on the Geant4 general-purpose simulation toolkit. Because of the complexity of simulation in ATLAS, we paid particular attention to scalability in its design. Although the first application of this framework is the ATLAS full detector simulation program, it contains no experiment-specific code and can therefore be used to develop any simulation package, not only for HEP experiments but also for various other research domains. In this paper we discuss our approach to the design and implementation of the framework.

  4. Monte Carlo simulation framework for TMT

    Science.gov (United States)

    Vogiatzis, Konstantinos; Angeli, George Z.

    2008-07-01

    This presentation describes a strategy for assessing the performance of the Thirty Meter Telescope (TMT). A Monte Carlo simulation framework has been developed to combine optical modeling with computational fluid dynamics (CFD) simulations, finite element analysis (FEA) and controls to model the overall performance of TMT. The framework draws on a two-year record of observed environmental parameters such as atmospheric seeing, site wind speed and direction, ambient temperature and local sunset and sunrise times, along with telescope azimuth and elevation, at a given sampling rate. The modeled optical, dynamic and thermal seeing aberrations are available in matrix form for distinct values within the range of influencing parameters. These parameters are either part of the framework parameter set or can be derived from them at each time step. As time advances, the aberrations are interpolated and combined based on the current values of their parameters. Different scenarios can be generated based on operating parameters such as venting strategy, optical calibration frequency and heat source control. Performance probability distributions are obtained and provide design guidance. The sensitivity of the system to design, operating and environmental parameters can be assessed in order to maximize the percentage of time the system meets the performance specifications.
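
    The core loop described above, stepping through an environmental record and interpolating precomputed aberration tables, can be illustrated with a small sketch. The parameter grid, units, and values below are hypothetical stand-ins for the CFD/FEA-derived tables, not TMT data.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Hypothetical precomputed table: RMS wavefront error (nm) on a grid of
# wind speed and zenith angle, standing in for CFD/FEA/optics results.
wind = np.linspace(0, 20, 5)       # m/s
zenith = np.linspace(0, 65, 6)     # degrees
table = np.random.default_rng(1).uniform(50, 200, (5, 6))

aberration = RegularGridInterpolator((wind, zenith), table)

# Synthetic stand-in for a two-year environmental record at hourly sampling.
n = 2 * 365 * 24
rng = np.random.default_rng(2)
record = np.column_stack([
    np.clip(rng.gamma(2.0, 3.0, n), 0, 20),  # wind speed time series
    rng.uniform(0, 65, n),                   # telescope zenith angle
])

errors = aberration(record)        # interpolate the table at each time step
print("P90 wavefront error:", np.percentile(errors, 90), "nm")
```

    The percentile at the end is the kind of performance probability statistic the abstract says feeds design guidance.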

  5. MCdevelop - the universal framework for Stochastic Simulations

    CERN Document Server

    Slawinska, M

    2011-01-01

    We present MCdevelop, a universal computer framework for developing and exploiting the wide class of Stochastic Simulations (SS) software. This powerful universal SS software development tool has been derived from a series of scientific projects for precision calculations in high energy physics (HEP), which featured a wide range of functionality in the SS software needed for advanced precision Quantum Field Theory calculations for the past LEP experiments and for the ongoing LHC experiments at CERN, Geneva. MCdevelop is a "spin-off" product of HEP to be exploited in other areas, while it will still serve to develop new SS software for HEP experiments. Typically SS involve independent generation of large sets of random "events", often requiring considerable CPU power. Since SS jobs usually do not share memory, they are easy to parallelize. The efficient development, testing and parallel running of SS software requires a convenient framework to develop software source code, deploy and monitor batch jobs, mer...

  6. Template-Based Geometric Simulation of Flexible Frameworks

    Directory of Open Access Journals (Sweden)

    Stephen A. Wells

    2012-03-01

    Full Text Available Specialised modelling and simulation methods implementing simplified physical models are valuable generators of insight. Template-based geometric simulation is a specialised method for modelling flexible framework structures made up of rigid units. We review the background, development and implementation of the method, and its applications to the study of framework materials such as zeolites and perovskites. The “flexibility window” property of zeolite frameworks is a particularly significant discovery made using geometric simulation. Software implementing geometric simulation of framework materials, “GASP”, is freely available to researchers.

  7. An advanced object-based software framework for complex ecosystem modeling and simulation

    Energy Technology Data Exchange (ETDEWEB)

    Sydelko, P. J.; Dolph, J. E.; Majerus, K. A.; Taxon, T. N.

    2000-06-29

    Military land managers and decision makers face an ever-increasing challenge to balance maximum flexibility for the mission with a diverse set of land-use, social, political, and economic goals. In addition, these goals encompass environmental requirements for maintaining ecosystem health and sustainability over the long term. Spatiotemporal modeling and simulation in support of adaptive ecosystem management can best be accomplished through a dynamic, integrated, and flexible approach that incorporates scientific and technological components into a comprehensive ecosystem modeling framework. The Integrated Dynamic Landscape Analysis and Modeling System (IDLAMS) integrates ecological models and decision support techniques through a geographic information system (GIS)-based backbone. Recently, an object-oriented (OO) architectural framework was developed for IDLAMS (OO-IDLAMS). This OO-IDLAMS prototype was built upon and leverages the Dynamic Information Architecture System (DIAS) developed by Argonne National Laboratory. DIAS is an object-based architectural framework that affords a more integrated, dynamic, and flexible approach to comprehensive ecosystem modeling than was possible with the GIS-based integration approach of the original IDLAMS. The flexibility, dynamics, and interoperability demonstrated through this case study of an object-oriented approach have the potential to provide key technology solutions for many of the military's multiple-use goals and needs for integrated natural resource planning and ecosystem management.

  8. FACET: A simulation software framework for modeling complex societal processes and interactions

    Energy Technology Data Exchange (ETDEWEB)

    Christiansen, J. H.

    2000-06-02

    FACET, the Framework for Addressing Cooperative Extended Transactions, was developed at Argonne National Laboratory to address the need for a simulation software architecture in the style of an agent-based approach, but with sufficient robustness, expressiveness, and flexibility to be able to deal with the levels of complexity seen in real-world social situations. FACET is an object-oriented software framework for building models of complex, cooperative behaviors of agents. It can be used to implement simulation models of societal processes such as the complex interplay of participating individuals and organizations engaged in multiple concurrent transactions in pursuit of their various goals. These transactions can be patterned on, for example, clinical guidelines and procedures, business practices, government and corporate policies, etc. FACET can also address other complex behaviors such as biological life cycles or manufacturing processes. To date, for example, FACET has been applied to such areas as land management, health care delivery, avian social behavior, and interactions between natural and social processes in ancient Mesopotamia.

  9. IDEF method-based simulation model design and development framework

    Directory of Open Access Journals (Sweden)

    Ki-Young Jeong

    2009-09-01

    Full Text Available The purpose of this study is to provide an IDEF method-based integrated framework for a business process simulation model to reduce the model development time by increasing communication and knowledge reusability during a simulation project. In this framework, simulation requirements are collected by a function modeling method (IDEF0) and a process modeling method (IDEF3). Based on these requirements, a common data model is constructed using the IDEF1X method. From this reusable data model, multiple simulation models are automatically generated using a database-driven simulation model development approach. The framework is claimed to help both the requirement collection and experimentation phases of a simulation project by improving system knowledge, model reusability, and maintainability through the systematic use of three descriptive IDEF methods and the features of relational database technology. A complex semiconductor fabrication case study was used as a testbed to evaluate and illustrate the concepts and the framework. Two different simulation software products were used to develop and control the semiconductor model from the same knowledge base. The case study empirically showed that this framework could help improve simulation project processes by using IDEF-based descriptive models and relational database technology. The authors also concluded that this framework could easily be applied to other analytical model generation by separating the logic from the data.
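
    A toy illustration of the database-driven generation step, under our assumption (not the paper's) that the common data model lives in a relational store: simulation objects are instantiated directly from table rows, so several simulators can be generated from one knowledge base. The schema and names are hypothetical.

```python
import sqlite3

# Build a tiny "common data model" (hypothetical schema) in memory.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE station (name TEXT, process_time REAL, capacity INTEGER);
    INSERT INTO station VALUES ('etch', 2.5, 3), ('litho', 4.0, 2), ('implant', 1.5, 1);
""")

class Station:
    """Simulation object generated from a row of the data model."""
    def __init__(self, name, process_time, capacity):
        self.name, self.process_time, self.capacity = name, process_time, capacity
    def __repr__(self):
        return f"Station({self.name}, t={self.process_time}, c={self.capacity})"

# Any number of simulation models can be generated from the same knowledge base;
# a second simulator would simply map the same rows onto its own object types.
model = [Station(*row) for row in db.execute("SELECT * FROM station")]
print(model)
```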

  10. A Multiscale/Multifidelity CFD Framework for Robust Simulations

    Science.gov (United States)

    Lee, Seungjoon; Kevrekidis, Yannis; Karniadakis, George

    2015-11-01

    We develop a general CFD framework based on multifidelity simulations to target multiscale problems but also resilience in exascale simulations, where faulty processors may lead to gappy simulated fields. We combine approximation theory and domain decomposition together with machine learning techniques, e.g. co-Kriging, to estimate boundary conditions and minimize communications by performing independent parallel runs. To demonstrate this new simulation approach, we consider two benchmark problems. First, we solve the heat equation with different patches of the domain simulated by finite differences at fine resolution or very low resolution but also with Monte Carlo, hence fusing multifidelity and heterogeneous models to obtain the final answer. Second, we simulate the flow in a driven cavity by fusing finite difference solutions with solutions obtained by dissipative particle dynamics - a coarse-grained molecular dynamics method. In addition to its robustness and resilience, the new framework generalizes previous multiscale approaches (e.g. continuum-atomistic) in a unified parallel computational framework.
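
    One simple way to fuse fidelities in this spirit, a Gaussian process fit to plentiful low-fidelity data plus a second GP fit to the sparse high-fidelity residuals, is sketched below. This additive-correction scheme is a simplification of co-Kriging, not the authors' implementation; the functions and data are synthetic.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Synthetic stand-ins: a cheap low-fidelity model and an expensive high-fidelity one.
def lofi(x):
    return np.sin(8 * x)

def hifi(x):
    return np.sin(8 * x) + 0.3 * x

x_lo = np.linspace(0, 1, 40)[:, None]   # many cheap samples
x_hi = np.linspace(0, 1, 6)[:, None]    # few expensive samples

gp_lo = GaussianProcessRegressor(kernel=RBF(0.1)).fit(x_lo, lofi(x_lo).ravel())

# Model the discrepancy between fidelities where high-fidelity data exists.
resid = hifi(x_hi).ravel() - gp_lo.predict(x_hi)
gp_delta = GaussianProcessRegressor(kernel=RBF(0.2)).fit(x_hi, resid)

x = np.linspace(0, 1, 200)[:, None]
fused = gp_lo.predict(x) + gp_delta.predict(x)   # multifidelity estimate
```

    In the resilience setting the abstract describes, the same machinery would be used to fill in a "gappy" patch from information at the patch boundaries.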

  11. The Astrophysics Simulation Collaboratory portal: A framework for effective distributed research

    Energy Technology Data Exchange (ETDEWEB)

    Bondarescu, Ruxandra; Allen, Gabrielle; Daues, Gregory; Kelly,Ian; Russell, Michael; Seidel, Edward; Shalf, John; Tobias, Malcolm

    2003-03-03

    We describe the motivation, architecture, and implementation of the Astrophysics Simulation Collaboratory (ASC) portal. The ASC project provides a web-based problem solving framework for the astrophysics community that harnesses the capabilities of emerging computational grids.

  12. A Simulation and Modeling Framework for Space Situational Awareness

    International Nuclear Information System (INIS)

    This paper describes the development and initial demonstration of a new, integrated modeling and simulation framework, encompassing the space situational awareness enterprise, for quantitatively assessing the benefit of specific sensor systems, technologies and data analysis techniques. The framework is based on a flexible, scalable architecture to enable efficient, physics-based simulation of the current SSA enterprise, and to accommodate future advancements in SSA systems. In particular, the code is designed to take advantage of massively parallel computer systems available, for example, at Lawrence Livermore National Laboratory. The details of the modeling and simulation framework are described, including hydrodynamic models of satellite intercept and debris generation, orbital propagation algorithms, radar cross section calculations, optical brightness calculations, generic radar system models, generic optical system models, specific Space Surveillance Network models, object detection algorithms, orbit determination algorithms, and visualization tools. The use of this integrated simulation and modeling framework on a specific scenario involving space debris is demonstrated.

  13. OpenMM: A Hardware Independent Framework for Molecular Simulations

    OpenAIRE

    Eastman, Peter; Pande, Vijay S.

    2010-01-01

    The wide diversity of computer architectures today requires a new approach to software development. OpenMM is a framework for molecular mechanics simulations, allowing a single program to run efficiently on a variety of hardware platforms.
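
    For flavor, here is a minimal simulation script against the modern OpenMM Python API (the module layout has changed since this 2010 paper, and the input file is hypothetical). The same script runs unchanged on CUDA, OpenCL, or CPU platforms, which is the hardware independence the abstract refers to.

```python
from openmm.app import PDBFile, ForceField, Simulation, PME
from openmm import LangevinMiddleIntegrator
from openmm.unit import kelvin, picosecond, picoseconds, nanometer

pdb = PDBFile("input.pdb")                       # hypothetical solvated structure
forcefield = ForceField("amber14-all.xml", "amber14/tip3p.xml")
system = forcefield.createSystem(pdb.topology, nonbondedMethod=PME,
                                 nonbondedCutoff=1.0 * nanometer)
integrator = LangevinMiddleIntegrator(300 * kelvin, 1 / picosecond,
                                      0.004 * picoseconds)

# OpenMM selects the fastest available platform (CUDA, OpenCL, or CPU) at run time.
simulation = Simulation(pdb.topology, system, integrator)
simulation.context.setPositions(pdb.positions)
simulation.minimizeEnergy()
simulation.step(10000)
```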

  14. A Simulation and Modeling Framework for Space Situational Awareness

    Energy Technology Data Exchange (ETDEWEB)

    Olivier, S S

    2008-09-15

    This paper describes the development and initial demonstration of a new, integrated modeling and simulation framework, encompassing the space situational awareness enterprise, for quantitatively assessing the benefit of specific sensor systems, technologies and data analysis techniques. The framework is based on a flexible, scalable architecture to enable efficient, physics-based simulation of the current SSA enterprise, and to accommodate future advancements in SSA systems. In particular, the code is designed to take advantage of massively parallel computer systems available, for example, at Lawrence Livermore National Laboratory. The details of the modeling and simulation framework are described, including hydrodynamic models of satellite intercept and debris generation, orbital propagation algorithms, radar cross section calculations, optical brightness calculations, generic radar system models, generic optical system models, specific Space Surveillance Network models, object detection algorithms, orbit determination algorithms, and visualization tools. The use of this integrated simulation and modeling framework on a specific scenario involving space debris is demonstrated.

  15. Service-Oriented Simulation Framework: An Overview and Unifying Methodology

    CERN Document Server

    Wang, Wenguang; Zhu, Yifan; Li, Qun (DOI: 10.1177/0037549710391838)

    2010-01-01

    The prevailing net-centric environment demands and enables modeling and simulation to combine efforts from numerous disciplines. Software techniques and methodology, in particular service-oriented architecture, provide such an opportunity. Service-oriented simulation has been an emerging paradigm following on from object- and process-oriented methods. However, the ad-hoc frameworks proposed so far generally focus on specific domains or systems and each has its pros and cons. They are capable of addressing different issues within service-oriented simulation from different viewpoints. It is increasingly important to describe and evaluate the progress of numerous frameworks. In this paper, we propose a novel three-dimensional reference model for a service-oriented simulation paradigm. The model can be used as a guideline or an analytic means to find the potential and possible future directions of the current simulation frameworks. In particular, the model inspects the crossover between the disciplines of modelin...

  16. GEMFsim: A Stochastic Simulator for the Generalized Epidemic Modeling Framework

    CERN Document Server

    Sahneh, Faryad Darabi; Shakeri, Heman; Fan, Futing; Scoglio, Caterina

    2016-01-01

    The recently proposed generalized epidemic modeling framework (GEMF) [Sahneh et al., 2013] lays the groundwork for systematically constructing a broad spectrum of stochastic spreading processes over complex networks. This article builds an algorithm for exact, continuous-time numerical simulation of GEMF-based processes. Moreover, the implementation of this algorithm, GEMFsim, is available in popular scientific programming platforms such as MATLAB, R, Python, and C; GEMFsim facilitates simulating stochastic spreading models that fit in the GEMF framework. Using these simulations one can examine the accuracy of mean-field-type approximations that are commonly used for analytical study of spreading processes on complex networks.
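
    The algorithm class in question is exact, event-driven (Gillespie-style) simulation in continuous time. A minimal sketch for the SIS special case on a network follows; GEMF generalizes this to arbitrary multi-state compartmental models, and this is our illustration, not GEMFsim's code.

```python
import random
import networkx as nx

def gillespie_sis(G, beta, delta, seed_nodes, t_max):
    """Exact continuous-time SIS simulation on graph G.
    beta: per-infected-neighbor infection rate; delta: recovery rate."""
    state = {v: 0 for v in G}              # 0 = susceptible, 1 = infected
    for v in seed_nodes:
        state[v] = 1
    t, history = 0.0, []
    while t < t_max:
        # Rate of every possible transition in the current configuration.
        rates = {}
        for v in G:
            if state[v] == 1:
                rates[(v, "recover")] = delta
            else:
                k = sum(state[u] for u in G.neighbors(v))
                if k:
                    rates[(v, "infect")] = beta * k
        total = sum(rates.values())
        if total == 0:                     # absorbing state: epidemic died out
            break
        t += random.expovariate(total)     # exponentially distributed waiting time
        r, acc = random.uniform(0, total), 0.0
        for (v, event), rate in rates.items():   # pick transition ~ its rate
            acc += rate
            if r <= acc:
                state[v] = 0 if event == "recover" else 1
                break
        history.append((t, sum(state.values())))
    return history

G = nx.erdos_renyi_graph(200, 0.05, seed=1)
trace = gillespie_sis(G, beta=0.1, delta=1.0, seed_nodes=[0, 1, 2], t_max=20.0)
```

    Averaging many such traces is exactly how one checks mean-field approximations against the exact stochastic dynamics, as the abstract suggests.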

  17. Software Framework for Advanced Power Plant Simulations

    Energy Technology Data Exchange (ETDEWEB)

    John Widmann; Sorin Munteanu; Aseem Jain; Pankaj Gupta; Mark Moales; Erik Ferguson; Lewis Collins; David Sloan; Woodrow Fiveland; Yi-dong Lang; Larry Biegler; Michael Locke; Simon Lingard; Jay Yun

    2010-08-01

    This report summarizes the work accomplished during the Phase II development effort of the Advanced Process Engineering Co-Simulator (APECS). The objective of the project is to develop the tools to efficiently combine high-fidelity computational fluid dynamics (CFD) models with process modeling software. During the course of the project, a robust integration controller was developed that can be used in any CAPE-OPEN compliant process modeling environment. The controller mediates the exchange of information between the process modeling software and the CFD software. Several approaches to reducing the time disparity between CFD simulations and process modeling have been investigated and implemented. These include enabling the CFD models to be run on a remote cluster and enabling multiple CFD models to be run simultaneously. Furthermore, computationally fast reduced-order models (ROMs) have been developed that can be 'trained' using the results from CFD simulations and then used directly within flowsheets. Unit operation models (both CFD and ROMs) can be uploaded to a model database and shared between multiple users.

  18. A simulation framework for the CMS Track Trigger electronics

    International Nuclear Information System (INIS)

    A simulation framework has been developed to test and characterize algorithms, architectures and hardware implementations of the vastly complex CMS Track Trigger for the high-luminosity upgrade of the CMS experiment at the Large Hadron Collider in Geneva. High-level SystemC models of all system components have been developed to simulate a portion of the track trigger. Simulating the system components together with input data from physics simulations allows figures of merit, such as delays or bandwidths, to be evaluated under realistic conditions. The use of SystemC for high-level modelling allows co-simulation with models developed in hardware description languages, e.g. VHDL or Verilog. Therefore, the simulation framework can also be used as a test bench for digital modules developed for the final system.

  19. A generic testing framework for agent-based simulation models

    OpenAIRE

    Gürcan, Önder; Dikenelli, Oguz; Bernon, Carole

    2013-01-01

    Agent-based modelling and simulation (ABMS) has received increasing attention during the last decade. However, the weak validation and verification of agent-based simulation models makes ABMS hard to trust. There is no comprehensive tool set for verification and validation of agent-based simulation models that demonstrates where inaccuracies exist and/or reveals the existing errors in the model. Moreover, on the practical side, many ABMS frameworks are in use. In this sen...

  20. Monte Carlo simulation techniques : The development of a general framework

    OpenAIRE

    Nilsson, Emma

    2009-01-01

    Algorithmica Research AB develops software applications for the financial markets. One of their products is Quantlab, a tool for quantitative analyses. An effective method for valuing several financial instruments is Monte Carlo simulation. Since it is a common method, Algorithmica is interested in investigating whether it is possible to create a Monte Carlo framework. A requirement from Algorithmica is that the framework be general, and this is the main problem to solve. It is difficult to gene...

  1. A Simulation Framework for Virtual Prototyping of Robotic Exoskeletons.

    Science.gov (United States)

    Agarwal, Priyanshu; Neptune, Richard R; Deshpande, Ashish D

    2016-06-01

    A number of robotic exoskeletons are being developed to provide rehabilitation interventions for those with movement disabilities. We present a systematic framework that allows for virtual prototyping (i.e., design, control, and experimentation) of robotic exoskeletons. The framework merges computational musculoskeletal analyses with simulation-based design techniques, which allows for optimization of exoskeleton design and control algorithms. We introduce biomechanical, morphological, and controller measures to optimize exoskeleton performance. A major advantage of the framework is that it provides a platform for carrying out hypothesis-driven virtual experiments to quantify device performance and rehabilitation progress. To illustrate the efficacy of the framework, we present a case study wherein the design and analysis of an index finger exoskeleton is carried out using the proposed framework. PMID:27018453

  2. Power Aware Simulation Framework for Wireless Sensor Networks and Nodes

    Directory of Open Access Journals (Sweden)

    Daniel Weber

    2008-07-01

    Full Text Available The constrained resources of sensor nodes limit analytical techniques, and cost and time factors limit test beds for studying wireless sensor networks (WSNs). Consequently, simulation becomes an essential tool to evaluate such systems. We present the power aware wireless sensors (PAWiS) simulation framework that supports the design and simulation of wireless sensor networks and nodes. The framework emphasizes the capture of power consumption and hence the identification of inefficiencies in various hardware and software modules of the systems. These modules include all layers of the communication system, the targeted class of application itself, the power supply and energy management, the central processing unit (CPU), and the sensor-actuator interface. The modular design makes it possible to simulate heterogeneous systems. PAWiS is an OMNeT++ based discrete event simulator written in C++. It captures the node internals (modules) as well as the node surroundings (network, environment) and provides specific features critical to WSNs, such as capturing power consumption at various levels of granularity, support for mobility and environmental dynamics, and the simulation of timing effects. A module library with standardized interfaces and a power analysis tool have been developed to support the design and analysis of simulation models. The performance of the PAWiS simulator is comparable with other simulation environments.

  3. Particle Tracking and Simulation on the .NET Framework

    International Nuclear Information System (INIS)

    Particle tracking and simulation studies are becoming increasingly complex. In addition to the use of more sophisticated graphics, interactive scripting is becoming popular. Compatibility with different control systems requires network and database capabilities. It is not a trivial task to fulfill all these requirements without sacrificing runtime performance. We evaluated the effectiveness of the .NET framework by converting a C++ simulation code to C#. Portability to other platforms, via Mono, is also discussed.

  4. Fundamental concepts in the Cyclus nuclear fuel cycle simulation framework

    OpenAIRE

    Huff, Kathryn D.; Gidden, Matthew J.; Carlsen, Robert W.; Flanagan, Robert R.; McGarry, Meghan B.; Opotowsky, Arrielle C.; Schneider, Erich A.; Scopatz, Anthony M.; Wilson, Paul P. H.

    2015-01-01

    As nuclear power expands, technical, economic, political, and environmental analyses of nuclear fuel cycles by simulators increase in importance. To date, however, current tools are often fleet-based rather than discrete and restrictively licensed rather than open source. Each of these choices presents a challenge to modeling fidelity, generality, efficiency, robustness, and scientific transparency. The Cyclus nuclear fuel cycle simulator framework and its modeling ecosystem incorporate moder...

  5. Unified Simulation and Analysis Framework for Deep Space Navigation Design

    Science.gov (United States)

    Anzalone, Evan; Chuang, Jason; Olsen, Carrie

    2013-01-01

    As the technology that enables advanced deep space autonomous navigation continues to develop and the requirements for such capability continue to grow, there is a clear need for a modular, expandable simulation framework. The purpose of this tool is to address multiple measurement and information sources in order to capture system capability; this is needed to analyze the capability of competing navigation systems as well as to develop system requirements, in order to determine their effect on the sizing of the integrated vehicle. The development of such a framework is built upon Model-Based Systems Engineering techniques to capture the architecture of the navigation system and the possible state measurements and observations that feed into the simulation implementation structure. These models also allow a common environment for the capture of an increasingly complex operational architecture involving multiple spacecraft, ground stations, and communication networks. In order to address these architectural developments, a framework of agent-based modules is implemented to capture the independent operations of individual spacecraft as well as the network interactions among spacecraft. This paper describes the development of this framework and the modeling processes used to capture a deep space navigation system. Additionally, a sample implementation describing a concept of network-based navigation utilizing digitally transmitted data packets is described in detail. This developed package shows the capability of the modeling framework, including its modularity, analysis capabilities, and its unification back to the overall system requirements and definition.

  6. A generic digitization framework for the CDF simulation

    International Nuclear Information System (INIS)

    Digitization from GEANT tracking requires a predictable sequence of steps to produce raw simulated detector readout information. The authors have developed a software framework that simplifies the development and integration of digitizers by separating the coordination activities (sequencing and dispatching) from the actual digitization process. This separation allows the developers of digitizers to concentrate on digitization. The framework provides the sequencing infrastructure and a digitizer model, which means that all digitizers are required to follow the same sequencing rules and provide an interface that fits the model.
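
    The separation described above can be pictured as a small dispatcher plus a digitizer base class: the framework owns the sequencing (begin event, dispatch hits, collect output), while each digitizer only digitizes. All names below are hypothetical, not CDF code.

```python
class Digitizer:
    """The digitizer model: every digitizer follows the same sequencing hooks."""
    def begin_event(self): ...
    def add_hit(self, hit): ...
    def digitize(self):
        """Return raw simulated readout for the current event."""
        raise NotImplementedError

class SiliconDigitizer(Digitizer):
    def begin_event(self):
        self.charge = {}
    def add_hit(self, hit):
        # Accumulate deposited charge per channel.
        self.charge[hit["channel"]] = self.charge.get(hit["channel"], 0.0) + hit["edep"]
    def digitize(self):
        # Threshold and convert to integer ADC counts.
        return {ch: int(q * 1000) for ch, q in self.charge.items() if q > 0.001}

def run_event(digitizers, geant_hits):
    """Framework-owned sequencing: begin, dispatch hits by detector, collect output."""
    for d in digitizers.values():
        d.begin_event()
    for hit in geant_hits:
        digitizers[hit["detector"]].add_hit(hit)
    return {name: d.digitize() for name, d in digitizers.items()}

raw = run_event({"svx": SiliconDigitizer()},
                [{"detector": "svx", "channel": 7, "edep": 0.004}])
print(raw)
```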

  7. LCIO - A persistency framework for linear collider simulation studies

    International Nuclear Information System (INIS)

    Almost all groups involved in linear collider detector studies have their own simulation software framework. Using a common persistency scheme would make it easy to share results and compare reconstruction algorithms. We present such a persistency framework, called LCIO (Linear Collider I/O). The framework has to fulfill the requirements of the different groups today and be flexible enough to be adapted to future needs. To that end we define an "abstract object persistency layer" that will be used by the applications. A first implementation, based on a sequential file format (SIO), is completely separated from the interface, thus allowing additional formats to be supported if necessary. The interface is defined with the AID (Abstract Interface Definition) tool from freehep.org, which allows Java and C++ code to be created synchronously. In order to make use of legacy software, a Fortran interface is also provided. We present the design and implementation of LCIO.

  8. Linear accelerator simulation framework with PLACET and GUINEA-PIG

    CERN Document Server

    Snuverink, Jochem; CERN. Geneva. ATS Department

    2016-01-01

    Many good tracking tools are available for simulating linear accelerators. However, several simple tasks need to be performed repeatedly, such as lattice definition, beam setup, and output storage. In addition, complex simulations can become unmanageable quite easily. A high-level layer would therefore be beneficial. We propose LinSim, a linear accelerator framework using the codes PLACET and GUINEA-PIG. It provides a documented, well-debugged high-level layer of functionality. Users only need to provide the input settings and essential code and/or use some of the many implemented imperfections and algorithms. It can be especially useful for first-time users. Currently the following accelerators are implemented: ATF2, ILC, CLIC and FACET. This note is the comprehensive manual; it discusses the framework design and shows its strength in some condensed examples.

  9. Sorting, Searching, and Simulation in the MapReduce Framework

    DEFF Research Database (Denmark)

    Goodrich, Michael T.; Sitchinava, Nodari; Zhang, Qin

    2011-01-01

    In this paper, we study the MapReduce framework from an algorithmic standpoint and demonstrate the usefulness of our approach by designing and analyzing efficient MapReduce algorithms for fundamental sorting, searching, and simulation problems. This study is motivated by a goal of ultimately putting the MapReduce framework on an equal theoretical footing with the well-known PRAM and BSP parallel models, which would benefit both the theory and practice of MapReduce algorithms. We describe efficient MapReduce algorithms for sorting, multi-searching, and simulations of parallel algorithms specified in the BSP and CRCW PRAM models. We also provide some applications of these results to problems in parallel computational geometry for the MapReduce framework, which result in efficient MapReduce algorithms for sorting, 2- and 3-dimensional convex hulls, and fixed-dimensional linear programming. For the case when mappers and reducers have a memory/message-I/O size of M = Θ(N^ε), for a small constant ε > 0...

  10. Symphony: A Framework for Accurate and Holistic WSN Simulation

    Directory of Open Access Journals (Sweden)

    Laurynas Riliskis

    2015-02-01

    Full Text Available Research on wireless sensor networks has progressed rapidly over the last decade, and these technologies have been widely adopted for both industrial and domestic uses. Several operating systems have been developed, along with a multitude of network protocols for all layers of the communication stack. Industrial Wireless Sensor Network (WSN systems must satisfy strict criteria and are typically more complex and larger in scale than domestic systems. Together with the non-deterministic behavior of network hardware in real settings, this greatly complicates the debugging and testing of WSN functionality. To facilitate the testing, validation, and debugging of large-scale WSN systems, we have developed a simulation framework that accurately reproduces the processes that occur inside real equipment, including both hardware- and software-induced delays. The core of the framework consists of a virtualized operating system and an emulated hardware platform that is integrated with the general purpose network simulator ns-3. Our framework enables the user to adjust the real code base as would be done in real deployments and also to test the boundary effects of different hardware components on the performance of distributed applications and protocols. Additionally we have developed a clock emulator with several different skew models and a component that handles sensory data feeds. The new framework should substantially shorten WSN application development cycles.

  11. Symphony: a framework for accurate and holistic WSN simulation.

    Science.gov (United States)

    Riliskis, Laurynas; Osipov, Evgeny

    2015-01-01

    Research on wireless sensor networks has progressed rapidly over the last decade, and these technologies have been widely adopted for both industrial and domestic uses. Several operating systems have been developed, along with a multitude of network protocols for all layers of the communication stack. Industrial Wireless Sensor Network (WSN) systems must satisfy strict criteria and are typically more complex and larger in scale than domestic systems. Together with the non-deterministic behavior of network hardware in real settings, this greatly complicates the debugging and testing of WSN functionality. To facilitate the testing, validation, and debugging of large-scale WSN systems, we have developed a simulation framework that accurately reproduces the processes that occur inside real equipment, including both hardware- and software-induced delays. The core of the framework consists of a virtualized operating system and an emulated hardware platform that is integrated with the general purpose network simulator ns-3. Our framework enables the user to adjust the real code base as would be done in real deployments and also to test the boundary effects of different hardware components on the performance of distributed applications and protocols. Additionally we have developed a clock emulator with several different skew models and a component that handles sensory data feeds. The new framework should substantially shorten WSN application development cycles. PMID:25723144

  12. A framework for the calibration of social simulation models

    CERN Document Server

    Ciampaglia, Giovanni Luca

    2013-01-01

    Simulation with agent-based models is increasingly used in the study of complex socio-technical systems and in social simulation in general. This paradigm offers a number of attractive features, namely the possibility of modeling emergent phenomena within large populations. As a consequence, often the quantity in need of calibration may be a distribution over the population whose relation with the parameters of the model is analytically intractable. Nevertheless, we can simulate. In this paper we present a simulation-based framework for the calibration of agent-based models with distributional output based on indirect inference. We illustrate our method step by step on a model of norm emergence in an online community of peer production, using data from three large Wikipedia communities. Model fit and diagnostics are discussed.
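
    The essence of simulation-based calibration by indirect inference: choose auxiliary statistics of the output distribution, then search parameter space to match them between simulated and observed data. A minimal single-parameter sketch (synthetic data and a hypothetical stand-in model, not the paper's Wikipedia model) follows.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def simulate_population(theta, n=2000):
    """Stand-in agent-based model: returns a distribution over the population
    (e.g., per-user activity levels) as a function of one parameter theta.
    A fixed seed gives common random numbers, stabilizing the optimization."""
    return np.random.default_rng(0).gamma(shape=theta, scale=1.0, size=n)

observed = np.random.default_rng(42).gamma(shape=2.5, scale=1.0, size=2000)

def distance(theta):
    """Indirect inference: compare auxiliary statistics (here: deciles)
    of the simulated and observed distributions."""
    sim = simulate_population(theta)
    q = np.linspace(0.1, 0.9, 9)
    return np.sum((np.quantile(sim, q) - np.quantile(observed, q)) ** 2)

fit = minimize_scalar(distance, bounds=(0.5, 10.0), method="bounded")
print("calibrated theta:", fit.x)   # should land near the true value 2.5
```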

  13. Sorting, Searching, and Simulation in the MapReduce Framework

    DEFF Research Database (Denmark)

    Goodrich, Michael T.; Sitchinava, Nodari; Zhang, Qin

    2011-01-01

    We study the MapReduce framework from an algorithmic standpoint, providing a generalization of the previous algorithmic models for MapReduce. We present optimal solutions for the fundamental problems of all-prefix-sums, sorting and multi-searching. Additionally, we design optimal simulations of the well-established PRAM and BSP models in MapReduce, immediately resulting in optimal solutions to the problems of computing fixed-dimensional linear programming and 2-D and 3-D convex hulls.

  14. A framework to simulate VANET scenarios with SUMO

    OpenAIRE

    Kaisser, F.; Gransart, C.; Kassab, M.; Berbineau, M.

    2011-01-01

    Vehicular Ad hoc Networks (VANET) are a special kind of Mobile Ad-Hoc Networks (MANET) adapted to communications between vehicles. Several protocols specific to VANETs have been developed to improve performance and satisfy the needs of vehicular applications. To evaluate a protocol for VANETs, realistic mobility models are needed. Unfortunately, such models are not provided by OPNET Modeler. In this work, we propose a framework that enhances OPNET simulation scenarios using realistic vehicula...

  15. Hierarchical Visual Analysis and Steering Framework for Astrophysical Simulations

    Institute of Scientific and Technical Information of China (English)

    肖健; 张加万; 原野; 周鑫; 纪丽; 孙济洲

    2015-01-01

    A framework for accelerating modern long-running astrophysical simulations is presented. It is based on a hierarchical architecture in which computational steering of the high-resolution run is performed under the guidance of knowledge obtained from gradually refined ensemble analyses. Several visualization schemes for facilitating ensemble management, error analysis, and parameter grouping and tuning are also integrated, owing to the pluggable modular design. The proposed approach is prototyped on the Flash code, and it can be extended by introducing user-defined visualizations for specific requirements. Two real-world simulations, i.e., stellar wind and supernova remnant, are carried out to verify the proposed approach.

  16. Framework Application for Core Edge Transport Simulation (FACETS)

    Energy Technology Data Exchange (ETDEWEB)

    Krasheninnikov, Sergei; Pigarov, Alexander

    2011-10-15

    The FACETS (Framework Application for Core-Edge Transport Simulations) project of the Scientific Discovery through Advanced Computing (SciDAC) Program aimed at providing high-fidelity whole-tokamak modeling for the U.S. magnetic fusion energy program and ITER by coupling separate components for the core region, edge region, and wall, with realistic plasma particle and power sources and turbulent transport simulation. The project also aimed at developing advanced numerical algorithms, efficient implicit coupling methods, and software tools utilizing the leadership-class computing facilities under Advanced Scientific Computing Research (ASCR). The FACETS project was conducted by a multi-disciplinary, multi-institutional team; the lead PI was J.R. Cary (Tech-X Corp.). In the FACETS project, the Applied Plasma Theory Group at the MAE Department of UCSD developed the Wall and Plasma-Surface Interaction (WALLPSI) module, performed its validation against experimental data, and integrated it into the developed framework. WALLPSI is a one-dimensional, coarse-grained, reaction/advection/diffusion code applied to each material boundary cell in the common modeling domain for a tokamak. It incorporates an advanced model for plasma particle transport and retention in the solid matter of plasma-facing components, simulation of plasma heat power load handling, calculation of erosion/deposition, and simulation of synergistic effects in strong plasma-wall coupling.

  17. A hybrid parallel framework for the cellular Potts model simulations

    Energy Technology Data Exchange (ETDEWEB)

    Jiang, Yi [Los Alamos National Laboratory]; He, Kejing [South China Univ.]; Dong, Shoubin [South China Univ.]

    2009-01-01

    The Cellular Potts Model (CPM) has been widely used for biological simulations. However, most current implementations are either sequential or approximate, and cannot be used for large-scale, complex 3D simulations. In this paper we present a hybrid parallel framework for CPM simulations. The time-consuming PDE solving, cell division, and cell reaction operations are distributed to clusters using the Message Passing Interface (MPI). The Monte Carlo lattice update is parallelized on shared-memory SMP systems using OpenMP. Because the Monte Carlo lattice update is much faster than the PDE solving, and SMP systems are more and more common, this hybrid approach achieves good performance and high accuracy at the same time. Based on the parallel Cellular Potts Model, we studied avascular tumor growth using a multiscale model. The application and performance analysis show that the hybrid parallel framework is quite efficient. The hybrid parallel CPM can be used for large-scale simulation (~10^8 sites) of the complex collective behavior of numerous cells (~10^6).

  18. A Driver Behavior Learning Framework for Enhancing Traffic Simulation

    Directory of Open Access Journals (Sweden)

    Ramona Maria Paven

    2014-06-01

    Full Text Available Traffic simulation provides essential support for developing intelligent transportation systems. It allows affordable validation of such systems using a large variety of scenarios that involve massive data input. However, realistic traffic models are hard to implement, especially for microscopic traffic simulation. One of the hardest problems in this context is modeling the behavior of drivers, due to the complexity of human nature. The work presented in this paper proposes a framework for learning driver behavior based on a Hidden Markov Model technique. Moreover, we also propose a practical method to inject this behavior into a traffic model used by the SUMO traffic simulator. To demonstrate the effectiveness of this method we present a case study involving real traffic data collected from the Timisoara city area.
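
    At the core of such a learning framework is evaluating how well an HMM explains an observed driving sequence. Below is a compact sketch of the scaled forward algorithm with two hypothetical driver modes (calm/aggressive) emitting discretized acceleration levels; the matrices are illustrative, not learned from the Timisoara data.

```python
import numpy as np

def forward_loglik(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence under an HMM,
    via the forward algorithm with per-step normalization.
    pi: initial state probs; A[i, j]: transition; B[i, k]: emission."""
    alpha = pi * B[:, obs[0]]
    loglik = 0.0
    for o in obs[1:]:
        c = alpha.sum()              # normalizer; accumulate its log
        loglik += np.log(c)
        alpha = (alpha / c) @ A * B[:, o]
    return loglik + np.log(alpha.sum())

# Hypothetical 2-mode model: states {calm, aggressive},
# observations {low, medium, high} discretized acceleration.
pi = np.array([0.7, 0.3])
A = np.array([[0.9, 0.1],
              [0.2, 0.8]])
B = np.array([[0.6, 0.3, 0.1],
              [0.1, 0.3, 0.6]])
print(forward_loglik([0, 0, 1, 2, 2], pi, A, B))
```

    Training (e.g., Baum-Welch) fits pi, A, and B to recorded trajectories; the fitted model can then drive behavior injection into the simulator.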

  19. The PandaRoot framework for simulation, reconstruction and analysis

    International Nuclear Information System (INIS)

    The PANDA experiment at the future facility FAIR will study antiproton-proton and antiproton-nucleus collisions in a beam momentum range from 2 GeV/c up to 15 GeV/c. The PandaRoot framework is part of the FairRoot project, a common software framework for the future FAIR experiments, and is currently used to simulate detector performance and to evaluate different detector concepts. It is based on the ROOT and Virtual Monte Carlo packages, with Geant3 and Geant4. Different reconstruction algorithms for tracking and particle identification are under development and optimization in order to achieve the performance requirements of the experiment. In the central tracker, a first track fit is performed using a conformal map transformation based on a helix assumption; the track is then used as input for a Kalman filter (package genfit), using GEANE as track follower. The track is then correlated with the PID detectors (e.g. Cherenkov detectors, EM calorimeter or muon chambers) to evaluate a global particle identification probability, using a Bayesian approach or multivariate methods. Further packages implemented in PandaRoot are the analysis tool framework Rho, the kinematic fitter package for vertex and mass-constraint fits, and a fast simulation code based upon parametrized detector responses. PandaRoot was also tested on an Alien-based GRID infrastructure. The contribution will report on the status of PandaRoot and show example results for the analysis of physics benchmark channels.

  20. Sorting, Searching, and Simulation in the MapReduce Framework

    CERN Document Server

    Goodrich, Michael T.; Sitchinava, Nodari; Zhang, Qin

    2011-01-01

    In this paper, we study the MapReduce framework from an algorithmic standpoint and demonstrate the usefulness of our approach by designing and analyzing efficient MapReduce algorithms for fundamental sorting, searching, and simulation problems. This study is motivated by a goal of ultimately putting the MapReduce framework on an equal theoretical footing with the well-known PRAM and BSP parallel models, which would benefit both the theory and practice of MapReduce algorithms. We describe efficient MapReduce algorithms for sorting, multi-searching, and simulations of parallel algorithms specified in the BSP and CRCW PRAM models. We also provide some applications of these results to problems in parallel computational geometry for the MapReduce framework, which result in efficient MapReduce algorithms for sorting, 2- and 3-dimensional convex hulls, and fixed-dimensional linear programming. For the case when mappers and reducers have a memory/message-I/O size of $M=\\Theta(N^\\epsilon)$, for a small constant $\\epsi...
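
    To make the programming model these three records analyze concrete, here is a toy single-process MapReduce round (map, shuffle by key, reduce per key), used to sort by range partitioning, one of the fundamental problems treated above. It illustrates the paradigm only, not the authors' memory-bounded algorithms.

```python
from collections import defaultdict

def map_reduce(records, mapper, reducer):
    """One MapReduce round: map, shuffle/group by key, reduce per key."""
    groups = defaultdict(list)
    for rec in records:
        for key, value in mapper(rec):        # map phase
            groups[key].append(value)         # shuffle: group values by key
    return {k: reducer(k, vs) for k, vs in groups.items()}   # reduce phase

# Sorting by range partitioning: the mapper routes each number to a bucket
# (one reducer per value range), each reducer sorts its bucket, and
# concatenating buckets in key order yields the fully sorted sequence.
def bucket_mapper(x, n_buckets=4, lo=0, hi=100):
    yield min(int((x - lo) * n_buckets / (hi - lo)), n_buckets - 1), x

data = [42, 7, 99, 13, 58, 3, 77, 21]
buckets = map_reduce(data, bucket_mapper, lambda k, vs: sorted(vs))
result = [x for k in sorted(buckets) for x in buckets[k]]
print(result)   # [3, 7, 13, 21, 42, 58, 77, 99]
```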

  1. The Framework for Approximate Queries on Simulation Data

    Energy Technology Data Exchange (ETDEWEB)

    Abdulla, G; Baldwin, C; Critchlow, T; Kamimura, R; Lee, B; Musick, R; Snapp, R; Tang, N

    2001-09-27

    AQSim is a system intended to enable scientists to query and analyze a large volume of scientific simulation data. The system uses the state of the art in approximate query processing techniques to build a novel framework for progressive data analysis. These techniques are used to define a multi-resolution index, where each node contains multiple models of the data. The benefits of these models are two-fold: (1) they are compact representations, reconstructing only the information relevant to the analysis, and (2) the variety of models capture different aspects of the data which may be of interest to the user but are not readily apparent in their raw form. To be able to deal with the data interactively, AQSim allows the scientist to make an informed tradeoff between query response accuracy and time. In this paper, we present the framework of AQSim with a focus on its architectural design. We also show the results from an initial proof-of-concept prototype developed at LLNL. The presented framework is generic enough to handle more than just simulation data.
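
    The accuracy/time tradeoff can be illustrated with a toy multi-resolution index in which each node stores a compact model (mean and value range) of its span, and a query descends only while the node's error bound exceeds the user's tolerance. This is our illustration of the idea, not AQSim's actual models.

```python
import numpy as np

class Node:
    """One node of a multi-resolution index: a compact model of its span."""
    def __init__(self, data, lo, hi, leaf_size=64):
        span = data[lo:hi]
        self.mean, self.lo_val, self.hi_val = span.mean(), span.min(), span.max()
        self.n = hi - lo
        self.children = []
        if self.n > leaf_size:
            mid = (lo + hi) // 2
            self.children = [Node(data, lo, mid, leaf_size),
                             Node(data, mid, hi, leaf_size)]

    def approx_sum(self, tol):
        """Progressive aggregate: stop descending once the node's error
        bound (its value range) is within the requested tolerance."""
        if not self.children or (self.hi_val - self.lo_val) <= tol:
            return self.mean * self.n
        return sum(c.approx_sum(tol) for c in self.children)

data = np.sort(np.random.default_rng(0).normal(size=4096))
root = Node(data, 0, len(data))
print(root.approx_sum(tol=0.5), data.sum())   # coarse answer vs exact answer
```

    Tightening `tol` trades longer traversal for a more accurate answer, which is the interactive tradeoff the abstract describes.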

  2. Hierarchical petascale simulation framework for stress corrosion cracking

    International Nuclear Information System (INIS)

    The goal of this SciDAC project is to develop a scalable parallel and distributed computational framework consisting of methods, algorithms, and integrated software tools for: 1) multi-tera-to-petascale simulations with quantum-level accuracy; 2) multimillion-to-multibillion-to-trillion-atom molecular dynamics (MD) simulations based on density functional theory (DFT) and temperature-dependent model generalized pseudopotential theory; 3) the quasicontinuum (QC) method embedded with classical atomistic and quantum simulations based on DFT; and 4) accelerated molecular dynamics (AMD) coupled with hierarchical atomistic/QC simulations to reach macroscopic length and time scales relevant to SCC. Scalability is being achieved beyond 10^5 processors through linear-scaling algorithms and performance-optimization techniques. We are employing automated model transitioning to embed higher-fidelity simulations concurrently inside coarser simulations only when and where they are required. Each scale and model has well-defined error bounds, with controlled error propagation across the scales and models to estimate uncertainty in predictions.

  3. A framework of modeling detector systems for computed tomography simulations

    International Nuclear Information System (INIS)

    The ultimate development in computed tomography (CT) technology may be a system that can provide images with excellent lesion conspicuity at the lowest possible patient dose. Imaging simulation tools have been used cost-effectively for these developments and will continue to be. For a more accurate and realistic imaging simulation, the signal and noise propagation through a CT detector system has been modeled in this study using cascaded linear-systems theory. The simulation results are validated by comparison with measured results from a laboratory flat-panel micro-CT system. Although the image noise obtained from the simulations at higher exposures is slightly smaller than that obtained from the measurements, the difference between them is reasonably acceptable. According to the simulation results for various exposure levels and additive electronic noise levels, x-ray quantum noise is more dominant than the additive electronic noise. The framework for modeling a CT detector system suggested in this study will be helpful for the development of accurate and realistic projection simulation models.
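
    Cascaded linear-systems analysis propagates the mean and variance of the quantum signal stage by stage: for a stochastic gain stage with mean gain g and gain variance var_g, the standard result is mean_out = g*mean_in and var_out = g^2*var_in + var_g*mean_in. A sketch with hypothetical stage parameters (not the paper's detector values):

```python
def gain_stage(mean_in, var_in, g_mean, g_var):
    """Mean and variance propagation through one stochastic gain stage
    of a cascaded linear-systems model (Burgess variance theorem)."""
    mean_out = g_mean * mean_in
    var_out = g_mean**2 * var_in + g_var * mean_in
    return mean_out, var_out

# Hypothetical chain: x-ray quanta -> detected quanta -> optical photons -> electrons.
q0 = 1000.0                      # incident quanta per pixel
mean, var = q0, q0               # Poisson input: variance equals mean
for g_mean, g_var in [(0.7, 0.7 * 0.3),   # binomial selection, QE = 0.7
                      (500.0, 100.0**2),  # conversion gain with spread
                      (0.5, 0.5 * 0.5)]:  # optical coupling
    mean, var = gain_stage(mean, var, g_mean, g_var)

var += 2000.0                    # additive electronic noise at the readout
print(f"SNR^2 = {mean**2 / var:.1f}")
```

    Comparing the quantum-noise terms with the additive term at different exposures reproduces, in miniature, the quantum-noise-dominance observation in the abstract.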

  4. Architecture of collaborating frameworks: simulation, visualisation, user interface and analysis

    International Nuclear Information System (INIS)

    In modern high energy and astrophysics experiments the variety of user requirements and the complexity of the problem domain often involve the collaboration of several software frameworks, and different components are responsible for providing the functionalities related to each domain. For instance, a common use case consists in studying the physics effects and the detector performance, resulting from primary events, in a given detector configuration, to evaluate the physics reach of the experiment or optimise the detector design. Such a study typically involves various components: Simulation, Visualisation, Analysis and (interactive) User Interface. The authors focus on the design aspects of the collaboration of these frameworks and on the technologies that help to simplify the complex process of software design

  5. The Integrated Plasma Simulator: A Flexible Python Framework for Coupled Multiphysics Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Foley, Samantha S. [ORNL]; Elwasif, Wael R. [ORNL]; Bernholdt, David E. [ORNL]

    2011-11-01

    High-fidelity coupled multiphysics simulations are an increasingly important aspect of computational science. In many domains, however, there has been very limited experience with simulations of this sort, therefore research in coupled multiphysics often requires computational frameworks with significant flexibility to respond to the changing directions of the physics and mathematics. This paper presents the Integrated Plasma Simulator (IPS), a framework designed for loosely coupled simulations of fusion plasmas. The IPS provides users with a simple component architecture into which a wide range of existing plasma physics codes can be inserted as components. Simulations can take advantage of multiple levels of parallelism supported in the IPS, and can be controlled by a high-level "driver" component, or by other coordination mechanisms, such as an asynchronous event service. We describe the requirements and design of the framework, and how they were implemented in the Python language. We also illustrate the flexibility of the framework by providing examples of different types of simulations that utilize various features of the IPS.
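
    The component architecture can be pictured as a driver that owns the time loop and calls components through a small, uniform interface. The sketch below shows the pattern only; the class and method names are hypothetical and not the actual IPS API.

```python
class Component:
    """Uniform interface a physics code must implement to join the simulation."""
    def init(self, services): ...
    def step(self, t): ...
    def finalize(self): ...

class EquilibriumSolver(Component):
    def step(self, t):
        print(f"[{t:5.2f}] recompute plasma equilibrium")

class TransportSolver(Component):
    def step(self, t):
        print(f"[{t:5.2f}] advance transport equations")

class Driver:
    """High-level driver: owns the time loop, calls components in order.
    In a real framework each step could be farmed out to its own
    parallel allocation or triggered by asynchronous events."""
    def __init__(self, components, dt):
        self.components, self.dt = components, dt
    def run(self, t_end):
        for c in self.components:
            c.init(services=None)
        t = 0.0
        while t < t_end:
            for c in self.components:
                c.step(t)
            t += self.dt
        for c in self.components:
            c.finalize()

Driver([EquilibriumSolver(), TransportSolver()], dt=0.1).run(0.3)
```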

  6. Coupled multi-physics simulation frameworks for reactor simulation: A bottom-up approach

    International Nuclear Information System (INIS)

    A 'bottom-up' approach to multi-physics frameworks is described, in which common interfaces to simulation data are developed first, and existing physics modules are then adapted to communicate through those interfaces. Physics modules read and write data through those common interfaces, which also provide access to common simulation services like parallel IO, mesh partitioning, etc. Multi-physics codes are assembled as a combination of physics modules, services, interface implementations, and driver code which coordinates calling these various pieces. Examples of various physics modules and services connected to this framework are given. (author)

  7. Recent advances on simulation and theory of hydrogen storage in metal–organic frameworks and covalent organic frameworks

    OpenAIRE

    Han, Sang Soo; Mendoza-Cortés, José L.; Goddard, William A.

    2009-01-01

    This critical review covers the application of computer simulations, including quantum calculations (ab initio and DFT), grand canonical Monte Carlo simulations, and molecular dynamics simulations, to the burgeoning area of hydrogen storage in metal–organic frameworks and covalent organic frameworks. The review begins with an overview of the theoretical methods used in previous studies. Then strategies for the improvement of hydrogen storage in the porous materials are discussed in...

  8. Modeling and simulation of complex systems a framework for efficient agent-based modeling and simulation

    CERN Document Server

    Siegfried, Robert

    2014-01-01

    Robert Siegfried presents a framework for efficient agent-based modeling and simulation of complex systems. He compares different approaches for describing the structure and dynamics of agent-based models in detail. Based on this evaluation, the author introduces the "General Reference Model for Agent-based Modeling and Simulation" (GRAMS). Furthermore, he presents parallel and distributed simulation approaches for the execution of agent-based models, from small scale to very large scale. The author shows how agent-based models may be executed by different simulation engines that utilize underlying hard...

  9. A Virtual Engineering Framework for Simulating Advanced Power System

    Energy Technology Data Exchange (ETDEWEB)

    Mike Bockelie; Dave Swensen; Martin Denison; Stanislav Borodai

    2008-06-18

    This report describes the work performed to provide NETL with VE-Suite-based virtual engineering software and enhanced equipment models to support NETL's Advanced Process Engineering Co-simulation (APECS) framework for advanced power generation systems. Enhancements to the software framework provided an important link between APECS and the virtual engineering capabilities of VE-Suite (e.g., equipment and process visualization, information assimilation). Model enhancements focused on improving predictions for the performance of entrained-flow coal gasifiers and important auxiliary equipment (e.g., air separation units) used in coal gasification systems. In addition, a reduced-order model generation tool and software to couple APECS/AspenPlus with the GE GateCycle simulation system were developed. CAPE-OPEN model interfaces were employed where needed. The improved simulation capability is demonstrated on selected test problems. As part of the project, an advisory panel was formed to provide guidance on the issues on which to focus the work effort. The panel included experts from industry and academia in gasification, CO2 capture issues, and process simulation, as well as representatives of technology developers and the electric utility industry. To optimize the benefit to NETL, REI coordinated its efforts with NETL and NETL-funded projects at Iowa State University, Carnegie Mellon University and ANSYS/Fluent, Inc. The improved simulation capabilities incorporated into APECS will enable researchers and engineers to better understand the interactions of different equipment components, identify weaknesses and processes needing improvement, and thereby allow more efficient, less expensive plants to be developed and brought on-line faster and in a more cost-effective manner. These enhancements to APECS represent an important step toward a fully integrated environment for performing plant simulation and engineering.

  10. A Simulation Framework for Optimal Energy Storage Sizing

    Directory of Open Access Journals (Sweden)

    Carlos Suazo-Martínez

    2014-05-01

    Full Text Available Despite the increasing interest in Energy Storage Systems (ESS), quantification of their technical and economic benefits remains a challenge. To assess the use of ESS, a simulation approach for optimal ESS sizing is presented. The algorithm is based on an adapted unit commitment, including ESS operational constraints, and the use of high-performance computing (HPC). Multiple short-term simulations are carried out within a multiple-year horizon. The evaluation is performed for Chile's Northern Interconnected Power System (SING). The authors show that a single-year evaluation could lead to sub-optimal results when determining optimal ESS size; hence, it is advisable to perform long-term evaluations of ESS. Additionally, the importance of detailed simulation for adequately assessing ESS contributions and fully capturing storage value is also discussed. Furthermore, the robustness of the optimal sizing approach is evaluated by means of sensitivity analyses. The results suggest that regulatory frameworks should recognize multiple value streams from storage in order to encourage greater ESS integration.
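
    Structurally, the sizing study reduces to evaluating a dispatch model over a long horizon for each candidate ESS size. A drastically simplified sketch (a greedy two-price dispatch standing in for the adapted unit commitment, with synthetic load and made-up prices and efficiencies) shows the shape of such a sweep:

```python
import numpy as np

def annual_cost(load, capacity_mwh, power_mw, base_price=50.0, peak_price=120.0, eta=0.9):
    """Greedy hourly dispatch: charge in cheap hours, discharge at peak.
    A crude stand-in for the adapted unit-commitment model in the paper."""
    price = np.where(load > np.median(load), peak_price, base_price)
    soc, cost = 0.0, 0.0
    for l, p in zip(load, price):
        if p == base_price and soc < capacity_mwh:       # cheap hour: charge
            e = min(power_mw, (capacity_mwh - soc) / eta)
            soc += e * eta                               # round-trip losses
            cost += (l + e) * p
        elif p == peak_price and soc > 0:                # peak hour: discharge
            e = min(power_mw, soc, l)
            soc -= e
            cost += (l - e) * p
        else:
            cost += l * p
    return cost

rng = np.random.default_rng(7)
hours = 8760
load = 100 + 30 * np.sin(np.linspace(0, 2 * np.pi * 365, hours)) + rng.normal(0, 5, hours)

# Sweep candidate sizes; the paper runs such evaluations over multiple years on HPC.
for cap in [0.0, 50.0, 100.0, 200.0]:
    print(f"{cap:5.0f} MWh -> annual cost {annual_cost(load, cap, power_mw=25.0):,.0f}")
```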

  11. Framework Application for Core Edge Transport Simulation (FACETS)

    Energy Technology Data Exchange (ETDEWEB)

    Malony, Allen D; Shende, Sameer S; Huck, Kevin A; Morris, Alan; Spear, Wyatt

    2012-03-14

    The goal of the FACETS project (Framework Application for Core-Edge Transport Simulations) was to provide a multiphysics, parallel framework application (FACETS) that enables whole-device modeling for the U.S. fusion program, providing the modeling infrastructure needed for ITER, the next-step fusion confinement device. Through the use of modern computational methods, including component technology and object-oriented design, FACETS is able to switch from one model to another for a given aspect of the physics in a flexible manner. This enables the use of simplified models for rapid turnaround or high-fidelity models that can take advantage of the largest supercomputer hardware. FACETS does so in a heterogeneous parallel context, where different parts of the application execute in parallel by utilizing task farming, domain decomposition, and/or pipelining as needed and applicable. ParaTools, Inc. was tasked with supporting the performance analysis and tuning of the FACETS components and framework in order to achieve the parallel scaling goals of the project. The TAU Performance System® was used for instrumentation, measurement, archiving, and profile/trace analysis, and ParaTools assisted in FACETS performance engineering efforts. Performance optimization of key components yielded significant speedups. TAU was integrated into the FACETS build for both the full coupled application and the UEDGE component. The performance database provided archival storage of the performance regression testing data generated by the project and helped to track improvements in the software development.

  12. A new framework for magnetohydrodynamic simulations with anisotropic pressure

    CERN Document Server

    Hirabayashi, Kota; Amano, Takanobu

    2016-01-01

    We describe a new theoretical and numerical framework for magnetohydrodynamic simulation incorporating an anisotropic pressure tensor, which can play an important role in a collisionless plasma. A classical approach to handling the anisotropy is based on the double adiabatic approximation, which assumes that the pressure tensor is well described only by the components parallel and perpendicular to the local magnetic field. This gyrotropic assumption, however, fails around a magnetically neutral region, where the cyclotron period may become comparable to or even longer than a dynamical time in the system, and causes a singularity in the mathematical expression. In this paper, we demonstrate that this singularity can be completely removed by combining direct use of the second moment of the Vlasov equation with an ingenious gyrotropization model. Numerical tests also verify that the present model properly reduces to the standard MHD or the double adiabatic formulation in an asymptotic manner under an appropria...

  13. Coupled multi-physics simulation frameworks for reactor simulation: A bottom-up approach

    International Nuclear Information System (INIS)

    In this paper, we present a 'bottom-up' approach to multi-physics frameworks, where we first develop common interfaces to simulation data, then adapt existing physics modules to communicate through those interfaces. Interfaces are provided for geometry, mesh, and field data, and are independent of one another; a fourth interface is available for relating data between these interfaces. Physics modules read and write data through these common interfaces, which also provide access to common simulation services like parallel I/O, mesh partitioning, etc. Multi-physics codes are assembled as a combination of physics modules, services, interface implementations, and driver code which coordinates calling these various pieces. The framework being constructed as part of this effort, referred to as SHARP, is shown
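
    The layering described above (independent data interfaces, physics modules that only touch those interfaces, and a driver that coordinates the calls) can be sketched with abstract base classes. The interface and method names below are illustrative assumptions, not SHARP's actual API.

```python
from abc import ABC, abstractmethod

class MeshInterface(ABC):
    """Common mesh access; method names are hypothetical, not SHARP's API."""
    @abstractmethod
    def cell_centers(self): ...

class FieldInterface(ABC):
    """Field data defined over a mesh, read and written by name."""
    @abstractmethod
    def read(self, name): ...
    @abstractmethod
    def write(self, name, values): ...

class PhysicsModule(ABC):
    """A physics module sees only the interfaces, never other modules."""
    @abstractmethod
    def advance(self, dt, mesh, fields): ...

def drive(modules, mesh, fields, dt, n_steps):
    """Driver code coordinating the assembled modules each time step."""
    for _ in range(n_steps):
        for module in modules:
            module.advance(dt, mesh, fields)
```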

  14. Multiscale Simulation Framework for Coupled Fluid Flow and Mechanical Deformation

    Energy Technology Data Exchange (ETDEWEB)

    Hou, Thomas [California Inst. of Technology (CalTech), Pasadena, CA (United States)]; Efendiev, Yalchin [Stanford Univ., CA (United States)]; Tchelepi, Hamdi [Texas A & M Univ., College Station, TX (United States)]; Durlofsky, Louis [Stanford Univ., CA (United States)]

    2016-05-24

    Our work in this project is aimed at making fundamental advances in multiscale methods for flow and transport in highly heterogeneous porous media. The main thrust of this research is to develop a systematic multiscale analysis and efficient coarse-scale models that can capture global effects and extend existing multiscale approaches to problems with additional physics and uncertainties. A key emphasis is on problems without an apparent scale separation. Multiscale solution methods are currently under active investigation for the simulation of subsurface flow in heterogeneous formations. These procedures capture the effects of fine-scale permeability variations through the calculation of specialized coarse-scale basis functions. Most of the multiscale techniques presented to date employ localization approximations in the calculation of these basis functions. For some highly correlated (e.g., channelized) formations, however, global effects are important and these may need to be incorporated into the multiscale basis functions. Other challenging issues facing multiscale simulations are the extension of existing multiscale techniques to problems with additional physics, such as compressibility, capillary effects, etc. In our project, we explore the improvement of multiscale methods through the incorporation of additional (single-phase flow) information and the development of a general multiscale framework for flows in the presence of uncertainties, compressible flow and heterogeneous transport, and geomechanics. We have considered (1) adaptive local-global multiscale methods, (2) multiscale methods for the transport equation, (3) operator-based multiscale methods and solvers, (4) multiscale methods in the presence of uncertainties and applications, (5) multiscale finite element methods for high contrast porous media and their generalizations, and (6) multiscale methods for geomechanics.

  15. Multiscale Simulation Framework for Coupled Fluid Flow and Mechanical Deformation

    Energy Technology Data Exchange (ETDEWEB)

    Tchelepi, Hamdi

    2014-11-14

    A multiscale linear-solver framework for the pressure equation associated with flow in highly heterogeneous porous formations was developed. The multiscale-based approach is cast in a general algebraic form, which facilitates integration of the new scalable linear solver in existing flow simulators. The Algebraic Multiscale Solver (AMS) is employed as a preconditioner within a multi-stage strategy. The formulations investigated include the standard MultiScale Finite-Element (MSFE) and MultiScale Finite-Volume (MSFV) methods. The local-stage solvers include incomplete factorization and the so-called Correction Functions (CF) associated with the MSFV approach. Extensive testing of AMS as an iterative linear solver indicates excellent convergence rates and computational scalability. AMS compares favorably with advanced Algebraic MultiGrid (AMG) solvers for highly detailed three-dimensional heterogeneous models. Moreover, AMS is expected to be especially beneficial in solving time-dependent problems of coupled multiphase flow and transport in large-scale subsurface formations.

  16. Mathematical framework for simulations of quantum fields in complex interferometers using the two-photon formalism

    OpenAIRE

    Corbitt, T.; Chen, Y.; Mavalvala, N

    2005-01-01

    We present a mathematical framework for simulation of optical fields in complex gravitational-wave interferometers. The simulation framework uses the two-photon formalism for optical fields and includes radiation pressure effects, an important addition required for simulating signal and noise fields in next-generation interferometers with high circulating power. We present a comparison of results from the simulation with analytical calculation and show that accurate agreement is achieved.

  17. A framework for using simulation methodology in ergonomics interventions in design projects

    DEFF Research Database (Denmark)

    Broberg, Ole; Duarte, Francisco; Andersen, Simone Nyholm;

    2014-01-01

    The aim of this paper is to outline a framework of simulation methodology in design processes from an ergonomics perspective.

  18. A framework for using simulation methodology in ergonomics interventions in design projects

    DEFF Research Database (Denmark)

    Broberg, Ole; Duarte, Francisco; Andersen, Simone Nyholm; Bittencourt, Joao; Conceicao, Carolina; Edwards, Kasper; Garotti, Luciano; Lima, Francisco

    The aim of this paper is to outline a framework of simulation methodology in design processes from an ergonomics perspective.

  19. artG4: A Generic Framework for Geant4 Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Arvanitis, Tasha [Harvey Mudd Coll.]; Lyon, Adam [Fermilab]

    2014-01-01

    A small experiment must devote its limited computing expertise to writing physics code directly applicable to the experiment. A software 'framework' is essential for providing an infrastructure that makes writing the physics-relevant code easy. In this paper, we describe a highly modular and easy-to-use framework for writing Geant4-based simulations called 'artg4'. This framework is a layer on top of the art framework.

  20. Implementation of Grid-computing Framework for Simulation in Multi-scale Structural Analysis

    OpenAIRE

    Data Iranata

    2010-01-01

    A new grid-computing framework for simulation in multi-scale structural analysis is presented. Two levels of parallel processing will be involved in this framework: multiple local distributed computing environments connected by local network to form a grid-based cluster-to-cluster distributed computing environment. To successfully perform the simulation, a large-scale structural system task is decomposed into the simulations of a simplified global model and several detailed component models u...

  1. A forward-muscular inverse-skeletal dynamics framework for human musculoskeletal simulations.

    Science.gov (United States)

    Shourijeh, Mohammad S; Smale, Kenneth B; Potvin, Brigitte M; Benoit, Daniel L

    2016-06-14

    This study provides a forward-muscular inverse-skeletal dynamics framework for musculoskeletal simulations. The simulation framework works by solving the muscle redundancy problem forward in time, in parallel with tracking the torque between the musculotendon net torques and the joint moments from inverse dynamics. The proposed framework can be used with any musculoskeletal modeling software package; as an example, in this study it is wrapped around OpenSim and the optimization is done in MATLAB. The novel simulation framework was highly robust over repeated runs and produced relatively high correlations between predicted muscle excitations and experimental EMGs for level gait trials. This simulation framework represents an efficient and robust approach to predicting muscle excitation and musculotendon unit force, and to estimating net joint torque. PMID:27106173
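
    The torque-tracking idea, resolving muscle redundancy so that musculotendon torques reproduce the inverse-dynamics joint moments, can be illustrated with a single static optimization step. The two-muscle geometry, moment arms, and cost function below are hypothetical, not the authors' OpenSim/MATLAB implementation.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical single-joint model: a flexor and an extensor muscle.
moment_arms = np.array([0.05, -0.04])    # m
max_forces = np.array([1500.0, 1200.0])  # N

def solve_excitations(target_torque):
    """Minimize summed squared excitations subject to reproducing the
    inverse-dynamics joint moment (one static resolution of redundancy)."""
    torque_error = lambda e: moment_arms @ (e * max_forces) - target_torque
    result = minimize(
        lambda e: np.sum(e ** 2),
        x0=np.full(2, 0.1),
        bounds=[(0.0, 1.0)] * 2,
        constraints=[{"type": "eq", "fun": torque_error}],
    )
    return result.x

print(solve_excitations(30.0))  # excitations tracking a 30 N*m net moment
```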

  2. Parallel simulation of wormhole propagation with the Darcy-Brinkman-Forchheimer framework

    KAUST Repository

    Wu, Yuanqing

    2015-07-09

    The acid treatment of carbonate reservoirs is a widely practiced oil and gas well stimulation technique. The injected acid dissolves the material near the wellbore and creates flow channels that establish a good connectivity between the reservoir and the well. Such flow channels are called wormholes. In contrast to traditional simulation technology relying on the Darcy framework, the new Darcy-Brinkman-Forchheimer (DBF) framework is introduced to simulate the wormhole-forming procedure. The DBF framework handles both large and small porosity conditions and should produce better simulation results than the Darcy framework. To process the huge number of cells in the simulation grid and shorten the long simulation time of the traditional serial code, a parallel code in FORTRAN 90 with MPI was developed. The experimenting field approach to setting coefficients in the model equations was also introduced. Moreover, a procedure for filling in the coefficient matrix of the linear system in the solver was described. After this, 2D dissolution experiments were carried out. In the experiments, different configurations of wormholes and a series of properties simulated by both frameworks were compared. We conclude that the numerical results of the DBF framework are more wormhole-like and more stable than those of the Darcy framework, demonstrating the advantages of the DBF framework. Finally, the scalability of the parallel code was evaluated, and we conclude that superlinear scalability can be achieved. © 2015 Elsevier Ltd.

  3. Extending the FairRoot framework to allow for simulation and reconstruction of free streaming data

    International Nuclear Information System (INIS)

    The FairRoot framework is the standard framework for simulation, reconstruction and data analysis for the FAIR experiments. The framework is designed to optimize accessibility for beginners and developers, to be flexible, and to cope with future developments. FairRoot enhances the synergy between the different physics experiments. As a first step toward the simulation of free-streaming data, time-based simulation was introduced to the framework. The next step is event-source simulation, which is achieved via a client-server system. After digitization, the so-called 'samplers' can be started, where each sampler reads the data of the corresponding detector from the simulation files and makes it available to the reconstruction clients. The system makes it possible to develop and validate online reconstruction algorithms. In this work, the design and implementation of the new architecture and the communication layer are described.

  4. NEVESIM: Event-Driven Neural Simulation Framework with a Python Interface

    Directory of Open Access Journals (Sweden)

    Dejan ePecevski

    2014-08-01

    NEVESIM is a software package for event-driven simulation of networks of spiking neurons with a fast simulation core in C++, and a scripting user interface in the Python programming language. It supports simulation of heterogeneous networks with different types of neurons and synapses, and can be easily extended by the user with new neuron and synapse types. To enable heterogeneous networks and extensibility, NEVESIM is designed to decouple the simulation logic of communicating events (spikes) between the neurons at a network level from the implementation of the internal dynamics of individual neurons. In this paper we will present the simulation framework of NEVESIM, its concepts and features, as well as some aspects of the object-oriented design approaches and simulation strategies that were utilized to efficiently implement the concepts and functionalities of the framework. We will also give an overview of the Python user interface, its basic commands and constructs, and also discuss the benefits of integrating NEVESIM with Python. One of the valuable capabilities of the simulator is to simulate exactly and efficiently networks of stochastic spiking neurons from the recently developed theoretical framework of neural sampling. This functionality was implemented as an extension on top of the basic NEVESIM framework. Altogether, the intended purpose of the NEVESIM framework is to provide a basis for further extensions that support simulation of various neural network models incorporating different neuron and synapse types that can potentially also use different simulation strategies.

  5. NEVESIM: event-driven neural simulation framework with a Python interface.

    Science.gov (United States)

    Pecevski, Dejan; Kappel, David; Jonke, Zeno

    2014-01-01

    NEVESIM is a software package for event-driven simulation of networks of spiking neurons with a fast simulation core in C++, and a scripting user interface in the Python programming language. It supports simulation of heterogeneous networks with different types of neurons and synapses, and can be easily extended by the user with new neuron and synapse types. To enable heterogeneous networks and extensibility, NEVESIM is designed to decouple the simulation logic of communicating events (spikes) between the neurons at a network level from the implementation of the internal dynamics of individual neurons. In this paper we will present the simulation framework of NEVESIM, its concepts and features, as well as some aspects of the object-oriented design approaches and simulation strategies that were utilized to efficiently implement the concepts and functionalities of the framework. We will also give an overview of the Python user interface, its basic commands and constructs, and also discuss the benefits of integrating NEVESIM with Python. One of the valuable capabilities of the simulator is to simulate exactly and efficiently networks of stochastic spiking neurons from the recently developed theoretical framework of neural sampling. This functionality was implemented as an extension on top of the basic NEVESIM framework. Altogether, the intended purpose of the NEVESIM framework is to provide a basis for further extensions that support simulation of various neural network models incorporating different neuron and synapse types that can potentially also use different simulation strategies. PMID:25177291
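
    The design principle both records describe, separating network-level event communication from per-neuron internal dynamics, is captured by a minimal event-driven loop built on a priority queue. This is a toy sketch under that design assumption, not NEVESIM code.

```python
import heapq

class LIFNeuron:
    """Toy leaky integrate-and-fire unit; internal dynamics stay local."""
    def __init__(self, threshold=1.0, decay=0.9):
        self.v, self.threshold = 0.0, threshold
        self.decay, self.last_t = decay, 0.0

    def receive(self, t, weight):
        """Decay the potential since the last event, add the input,
        and report whether the neuron fires."""
        self.v = self.v * self.decay ** (t - self.last_t) + weight
        self.last_t = t
        if self.v >= self.threshold:
            self.v = 0.0
            return True
        return False

def simulate(neurons, synapses, stimuli, t_end):
    """Network-level engine: a priority queue of (time, target, weight)
    deliveries; synapses maps neuron id -> [(target, weight, delay)]."""
    events = list(stimuli)  # external input spikes
    heapq.heapify(events)
    while events:
        t, target, weight = heapq.heappop(events)
        if t > t_end:
            break
        if neurons[target].receive(t, weight):
            print(f"neuron {target} spiked at t={t:.2f}")
            for tgt, w, delay in synapses.get(target, []):
                heapq.heappush(events, (t + delay, tgt, w))

neurons = {0: LIFNeuron(threshold=1.0), 1: LIFNeuron(threshold=0.5)}
synapses = {0: [(1, 0.8, 1.0)]}
simulate(neurons, synapses, stimuli=[(0.0, 0, 1.2)], t_end=10.0)
```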

  6. Towards a Mobile-based ADAS Simulation Framework

    OpenAIRE

    Gonçalves, João S. V.; Jacob, João; Rossetti, Rosaldo J. F.; Coelho, António; Rodrigues, Rui

    2014-01-01

    In this paper we propose a multi-agent-based driving simulator which integrates a test-bed that allows ADAS developers to compress testing time and carry out tests in a controlled environment while using a low-cost setup. We use the SUMO microscopic simulator and a serious-game-based driving simulator with geodata provided by standard open sources. This simulator connects to an Android device and sends data such as the current GPS coordinates and transportation network data. One import...

  7. Software for the international linear collider: simulation and reconstruction frameworks

    International Nuclear Information System (INIS)

    Software plays an increasingly important role already in the early stages of a large project like the ILC. In an international collaboration, a data format for ILC detector and physics studies has been developed. Building upon this, software frameworks are made available which ease event reconstruction and analysis. (author)

  8. Software for the international linear collider: Simulation and reconstruction frameworks

    Indian Academy of Sciences (India)

    Ties Behnke; Frank Gaede (DESY, Hamburg)

    2007-12-01

    Software plays an increasingly important role already in the early stages of a large project like the ILC. In an international collaboration, a data format for ILC detector and physics studies has been developed. Building upon this, software frameworks are made available which ease event reconstruction and analysis.

  9. The Astrophysics Simulation Collaboratory portal: A framework for effective distributed research

    OpenAIRE

    Bondarescu, Ruxandra; Allen, Gabrielle; Daues, Gregory; Kelly, Ian; Russell, Michael; Seidel, Edward; Shalf, John; Tobias, Malcolm

    2003-01-01

    We describe the motivation, architecture, and implementation of the Astrophysics Simulation Collaboratory (ASC) portal. The ASC project provides a web-based problem solving framework for the astrophysics community that harnesses the capabilities of emerging computational grids.

  10. Framework for bringing realistic virtual natural environments to distributed simulations

    Science.gov (United States)

    Whitney, David A.; Reynolds, Robert A.; Olson, Stephen H.; Sherer, Dana Z.; Driscoll, Mavis L.; Watman, K. L.

    1997-06-01

    One of the major new technical challenges for distributed simulations is the distribution and presentation of the natural atmosphere-ocean-space environment. The natural terrain environment has been a part of such simulations for a while, but the integration of atmosphere and ocean data and effects is quite new. The DARPA synthetic environments (SE) program has been developing and demonstrating advanced technologies for providing tactically significant atmosphere-ocean data and effects for a range of simulations. A general-purpose data collection, assimilation, management, and distribution system is being developed by the TAOS (Total Atmosphere-Ocean System) project. This system is designed to support the new high-level architecture (HLA)/run-time infrastructure (RTI) being developed by the Defense Modeling and Simulation Office (DMSO), as well as existing distributed interactive simulation (DIS) network protocols. This paper describes how synthetic natural environments are being integrated by TAOS to provide an increasingly rich, dynamic synthetic natural environment. Architectural designs and implementations to accommodate a range of simulation applications are discussed. A number of enabling technologies are employed, such as the development of standards for gridded data distribution and the inclusion of derived products and local environmental features within 4-dimensional data grids. The application of TAOS for training, analysis, and engineering simulations for sensor analysis is discussed.

  11. Environmental impact evaluation using an agent based simulation framework

    OpenAIRE

    Schroijen, M.J.T.; Van Tooren, M.J.L.

    2010-01-01

    Environmental issues play an increasingly important role in aviation, directly affecting the desirability of novel technologies. The desirability of a certain technology with respect to environmental issues is determined by its system-of-systems level impact rather than the often-used system level impact. Changing this perspective introduces additional complexities in how the system level evaluation should be related to the desired system-of-systems (SoS) level evaluation. A framework is propo...

  12. Towards a framework for games and simulations in STEM subject assessments

    OpenAIRE

    Wills, Gary; Gilbert, Lester; Recio, Alejandra

    2012-01-01

    Currently, providing games and simulations to address specific educational objectives in STEM subjects is a craft activity, requiring custom-built applications and hence producing difficult-to-share and difficult-to-reuse solutions. To address this we propose a framework for the creation of Pedagogically Effective Games & Simulations (PEGS). The framework supports the construction and machine-processable expression of an educational intention which can be turned into a computer deliverable se...

  13. A framework for simulating ultrasound imaging based on first order nonlinear pressure–velocity relations

    DEFF Research Database (Denmark)

    Du, Yigang; Fan, Rui; Li, Yong;

    2016-01-01

    An ultrasound imaging framework modeled with the first order nonlinear pressure–velocity relations (NPVR) based simulation and implemented by a half-time staggered solution and pseudospectral method is presented in this paper. The framework is capable of simulating linear and nonlinear ultrasound... ultrasound images can be obtained by beamforming the simulated channel data. Various results simulated by different algorithms are illustrated for comparison. The root mean square (RMS) errors for each compared pulse are calculated. The linear propagation is validated by an angular spectrum approach (ASA...

  14. Research on the simulation framework in Building Information Modeling

    OpenAIRE

    Liang, Nan; Xu, Hongqing; Yu, Qiong

    2012-01-01

    Over the past ten years, Building Information Modeling (BIM) has been proposed and applied in the architecture industry. For their high efficiency and visualization, BIM and related technologies are welcomed by architects, engineers, builders and owners, and thus technologies for design modeling have been widely researched. However, little attention is given to simulation, although simulation is an important part of building design, maybe because it is seen as somewhat less related to the ...

  15. Performance Improvements for the ATLAS Detector Simulation Framework

    OpenAIRE

    Almalioglu, Yasin; Salzburger, Andreas; Ritsch, Elmar

    2013-01-01

    Many physics and performance studies carried out with the ATLAS detector at the Large Hadron Collider (LHC) require very large event samples. A detailed simulation of the detector, however, requires a great amount of CPU resources. In addition to detailed simulation, fast techniques and new setups are developed and extensively used to supply large event samples. Beyond the development of new techniques and setups, it is still possible to find some performance improvements in the exist...

  16. Abdominal surgery process modeling framework for simulation using spreadsheets.

    Science.gov (United States)

    Boshkoska, Biljana Mileva; Damij, Talib; Jelenc, Franc; Damij, Nadja

    2015-08-01

    We provide a continuation of the existing Activity Table Modeling methodology with a modular spreadsheet simulation. The simulation model developed comprises 28 modeling elements for the abdominal surgery cycle process. The simulation of a two-week patient flow in an abdominal clinic with 75 beds demonstrates the applicability of the methodology. The simulation does not include macros, so programming experience is not essential for replicating or upgrading the model. Unlike existing methods, the proposed solution employs a modular approach for modeling the activities that ensures better readability, the possibility of easily upgrading the model with other activities, and easy extension and connectivity with other similar models. We propose a first-in-first-served approach for simulating the servicing of multiple patients. The uncertain time duration of the activities is modeled using the function "rand()". The patients' movements from one activity to the next are tracked with nested "if()" functions, thus allowing easy re-creation of the process without the need for complex programming. PMID:26004999
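
    The same first-in-first-served flow with rand()-style stochastic durations translates directly into a short script. The activities and duration ranges below are hypothetical placeholders, not the clinic's calibrated values.

```python
import random
from collections import deque

# Hypothetical activities with (min, max) durations in minutes.
ACTIVITIES = [("admission", 10, 30), ("surgery", 60, 180), ("recovery", 120, 360)]

def simulate_patients(n_patients, seed=1):
    """First-in-first-served: each activity serves one patient at a time,
    mirroring the spreadsheet's rand() draws and nested-if tracking."""
    rng = random.Random(seed)
    queue = deque(range(n_patients))
    free_at = {name: 0.0 for name, _, _ in ACTIVITIES}
    discharge = {}
    while queue:
        patient = queue.popleft()
        t = 0.0
        for name, lo, hi in ACTIVITIES:
            start = max(t, free_at[name])    # wait until the unit is free
            t = start + rng.uniform(lo, hi)  # the "rand()" duration draw
            free_at[name] = t
        discharge[patient] = t
    return discharge

for patient, t in simulate_patients(5).items():
    print(f"patient {patient} discharged at t = {t:.0f} min")
```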

  17. Hierarchical Petascale Simulation Framework for Stress Corrosion Cracking

    Energy Technology Data Exchange (ETDEWEB)

    Vashishta, Priya

    2014-12-01

    Reaction Dynamics in Energetic Materials: Detonation is a prototype of mechanochemistry, in which mechanically and thermally induced chemical reactions far from equilibrium exhibit vastly different behaviors. It is also one of the hardest multiscale physics problems, in which diverse length and time scales play important roles. The CACS group has performed multimillion-atom reactive MD simulations to reveal a novel two-stage reaction mechanism during the detonation of cyclotrimethylenetrinitramine (RDX) crystal. Rapid production of N2 and H2O within ~10 ps is followed by delayed production of CO molecules within ~ 1 ns. They found that further decomposition towards the final products is inhibited by the formation of large metastable C- and O-rich clusters with fractal geometry. The CACS group has also simulated the oxidation dynamics of close-packed aggregates of aluminum nanoparticles passivated by oxide shells. Their simulation results suggest an unexpectedly active role of the oxide shell as a nanoreactor.

  18. NPTool: a simulation and analysis framework for low-energy nuclear physics experiments

    Science.gov (United States)

    Matta, A.; Morfouace, P.; de Séréville, N.; Flavigny, F.; Labiche, M.; Shearman, R.

    2016-08-01

    The Nuclear Physics Tool (NPTool) is an open source data analysis and Monte Carlo simulation framework that has been developed for low-energy nuclear physics experiments with an emphasis on radioactive beam experiments. The NPTool offers a unified framework for designing, preparing and analyzing complex experiments employing multiple detectors, each of which may comprise some hundreds of channels. The framework has been successfully used for the analysis and simulation of experiments at facilities including GANIL, RIKEN, ALTO and TRIUMF, using both stable and radioactive beams. This paper details the NPTool philosophy together with an overview of the workflow. The framework has been benchmarked through the comparison of simulated and experimental data for a variety of detectors used in charged particle and gamma-ray spectroscopy.

  19. BOUT++: a framework for parallel plasma fluid simulations

    CERN Document Server

    Dudson, B D; Xu, X Q; Snyder, P B; Wilson, H R

    2008-01-01

    A new modular code called BOUT++ is presented, which simulates 3D fluid equations in curvilinear coordinates. Although aimed at simulating Edge Localised Modes (ELMs) in tokamak X-point geometry, the code is able to simulate a wide range of fluid models (magnetised and unmagnetised) involving an arbitrary number of scalar and vector fields, in a wide range of geometries. Time evolution is fully implicit, and 3rd-order WENO schemes are implemented. Benchmarks are presented for linear and non-linear problems (the Orszag-Tang vortex) showing good agreement. Performance of the code is tested by scaling with problem size and processor number, showing efficient scaling to thousands of processors. Linear initial-value simulations of ELMs using reduced ideal MHD are presented, and the results compared to the ELITE linear MHD eigenvalue code. The resulting mode-structures and growth-rate are found to be in good agreement (BOUT++ = 0.245, ELITE = 0.239). To our knowledge, this is the first time dissipationless, initial...

  20. Simulation toolkit with CMOS detector in the framework of hadrontherapy

    Directory of Open Access Journals (Sweden)

    Rescigno R.

    2014-03-01

    Proton imaging can be seen as a powerful technique for on-line monitoring of the ion range during carbon ion therapy irradiation. The proton detection technique uses a set of CMOS sensor planes as a three-dimensional tracking system. A simulation toolkit based on GEANT4 and ROOT is presented, including the detector response and the reconstruction algorithm.

  1. Simulation toolkit with CMOS detector in the framework of hadrontherapy

    OpenAIRE

    Rescigno R.; Finck Ch.; Juliani D.; Baudot J.; Dauvergne D.; Dedes G.; Krimmer J.; Ray C.; Reithinger V.; Rousseau M.; Testa E; Winter M.

    2014-01-01

    Proton imaging can be seen as a powerful technique for on-line monitoring of the ion range during carbon ion therapy irradiation. The proton detection technique uses a set of CMOS sensor planes as a three-dimensional tracking system. A simulation toolkit based on GEANT4 and ROOT is presented, including the detector response and the reconstruction algorithm.

  2. Maestro: an orchestration framework for large-scale WSN simulations.

    Science.gov (United States)

    Riliskis, Laurynas; Osipov, Evgeny

    2014-01-01

    Contemporary wireless sensor networks (WSNs) have evolved into large and complex systems and are one of the main technologies used in cyber-physical systems and the Internet of Things. Extensive research on WSNs has led to the development of diverse solutions at all levels of software architecture, including protocol stacks for communications. This multitude of solutions is due to the limited computational power and restrictions on energy consumption that must be accounted for when designing typical WSN systems. It is therefore challenging to develop, test and validate even small WSN applications, and this process can easily consume significant resources. Simulations are inexpensive tools for testing, verifying and generally experimenting with new technologies in a repeatable fashion. Consequently, as the size of the systems to be tested increases, so does the need for large-scale simulations. This article describes a tool called Maestro for the automation of large-scale simulations and investigates the feasibility of using cloud computing facilities for such a task. Using tools that are built into Maestro, we demonstrate a feasible approach for benchmarking cloud infrastructure in order to identify cloud Virtual Machine (VM) instances that provide an optimal balance of performance and cost for a given simulation. PMID:24647123
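
    The benchmarking idea, ranking candidate VM types by measured performance per unit cost on a reference simulation, can be sketched as follows. The instance catalog, prices, and stand-in workload are invented for illustration and are not Maestro's built-in tools.

```python
import time

def reference_simulation(n=200_000):
    """Small stand-in workload used to benchmark an instance."""
    acc = 0.0
    for i in range(1, n):
        acc += 1.0 / (i * i)
    return acc

def benchmark(price_per_hour):
    """Return simulations-per-dollar for the machine running this code."""
    start = time.perf_counter()
    reference_simulation()
    elapsed = time.perf_counter() - start
    sims_per_hour = 3600.0 / elapsed
    return sims_per_hour / price_per_hour

# Hypothetical catalog; in practice each entry is measured on its own VM.
catalog = {"small": 0.05, "large": 0.20, "compute-opt": 0.34}
scores = {name: benchmark(price) for name, price in catalog.items()}
print("best performance/cost instance:", max(scores, key=scores.get))
```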

  3. Maestro: An Orchestration Framework for Large-Scale WSN Simulations

    Directory of Open Access Journals (Sweden)

    Laurynas Riliskis

    2014-03-01

    Contemporary wireless sensor networks (WSNs) have evolved into large and complex systems and are one of the main technologies used in cyber-physical systems and the Internet of Things. Extensive research on WSNs has led to the development of diverse solutions at all levels of software architecture, including protocol stacks for communications. This multitude of solutions is due to the limited computational power and restrictions on energy consumption that must be accounted for when designing typical WSN systems. It is therefore challenging to develop, test and validate even small WSN applications, and this process can easily consume significant resources. Simulations are inexpensive tools for testing, verifying and generally experimenting with new technologies in a repeatable fashion. Consequently, as the size of the systems to be tested increases, so does the need for large-scale simulations. This article describes a tool called Maestro for the automation of large-scale simulations and investigates the feasibility of using cloud computing facilities for such a task. Using tools that are built into Maestro, we demonstrate a feasible approach for benchmarking cloud infrastructure in order to identify cloud Virtual Machine (VM) instances that provide an optimal balance of performance and cost for a given simulation.

  4. Chemical research at Argonne National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-04-01

    Argonne National Laboratory is a research and development laboratory located 25 miles southwest of Chicago, Illinois. It has more than 200 programs in basic and applied sciences and an Industrial Technology Development Center to help move its technologies to the industrial sector. At Argonne, basic energy research is supported by applied research in diverse areas such as biology and biomedicine, energy conservation, fossil and nuclear fuels, environmental science, and parallel computer architectures. These capabilities translate into technological expertise in energy production and use, advanced materials and manufacturing processes, and waste minimization and environmental remediation, which can be shared with the industrial sector. The Laboratory's technologies can be applied to help companies design products, substitute materials, devise innovative industrial processes, develop advanced quality control systems and instrumentation, and address environmental concerns. The latest techniques and facilities, including those involving modeling, simulation, and high-performance computing, are available to industry and academia. At Argonne, there are opportunities for industry to carry out cooperative research, license inventions, exchange technical personnel, use unique research facilities, and attend conferences and workshops. Technology transfer is one of the Laboratory's major missions. High priority is given to strengthening U.S. technological competitiveness through research and development partnerships with industry that capitalize on Argonne's expertise and facilities. The Laboratory is one of three DOE superconductivity technology centers, focusing on manufacturing technology for high-temperature superconducting wires, motors, bearings, and connecting leads. Argonne National Laboratory is operated by the University of Chicago for the U.S. Department of Energy.

  5. A general simulation model developing process based on five-object framework

    Institute of Scientific and Technical Information of China (English)

    胡安斌; 伞冶; 陈建明; 陈永强

    2003-01-01

    Different paradigms that relate verification and validation to the simulation model have different development processes. A simulation model development process based on the Five-Object Framework (FOF) is discussed in this paper. An example is given to demonstrate the application of the proposed method.

  6. Power Grid Simulation Applications Developed Using the GridPACKTM High Performance Computing Framework

    Energy Technology Data Exchange (ETDEWEB)

    Jin, Shuangshuang; Chen, Yousu; Diao, Ruisheng; Huang, Zhenyu; Perkins, William A.; Palmer, Bruce J.

    2016-12-01

    This paper describes the GridPACK™ software framework for developing power grid simulations that can run on high performance computing platforms, with several example applications (dynamic simulation, static contingency analysis, and dynamic contingency analysis) that have been developed using GridPACK.

  7. Designing a Virtual Olympic Games Framework by Using Simulation in Web 2.0 Technologies

    Science.gov (United States)

    Stoilescu, Dorian

    2013-01-01

    In the past, instructional simulation faced major difficulties, offering limited possibilities for practice and learning. This article proposes a link between instructional simulation and Web 2.0 technologies. More exactly, I present the design of the Virtual Olympic Games Framework (VOGF), as a significant demonstration of how interactivity in…

  8. Implementation of Grid-computing Framework for Simulation in Multi-scale Structural Analysis

    Directory of Open Access Journals (Sweden)

    Data Iranata

    2010-05-01

    A new grid-computing framework for simulation in multi-scale structural analysis is presented. Two levels of parallel processing are involved in this framework: multiple local distributed computing environments connected by a local network form a grid-based cluster-to-cluster distributed computing environment. To perform the simulation, a large-scale structural system task is decomposed into the simulations of a simplified global model and several detailed component models using various scales. These correlated multi-scale structural system tasks are distributed among clusters, connected together in a multi-level hierarchy, and then coordinated over the internet. The software framework for supporting the multi-scale structural simulation approach is also presented. The program architecture design allows the integration of several multi-scale models as clients and servers under a single platform. To check its feasibility, a prototype software system has been designed and implemented to realize the proposed concept. The simulation results show that the software framework can increase the speedup performance of the structural analysis. Based on this result, the proposed grid-computing framework is suitable for performing multi-scale structural analysis simulations.

  9. A High-Throughput, High-Accuracy System-Level Simulation Framework for System on Chips

    OpenAIRE

    Guanyi Sun; Shengnan Xu; Xu Wang; Dawei Wang; Eugene Tang; Yangdong Deng; Sun Chan

    2011-01-01

    Today's System-on-Chip (SoC) design is extremely challenging because it involves complicated design tradeoffs and heterogeneous design expertise. To explore the large solution space, system architects have to rely on system-level simulators to identify an optimized SoC architecture. In this paper, we propose a system-level simulation framework, System Performance Simulation Implementation Mechanism, or SPSIM. Based on SystemC TLM2.0, the framework consists of an executable SoC model, a simu...

  10. A framework for service enterprise workflow simulation with multi-agents cooperation

    Science.gov (United States)

    Tan, Wenan; Xu, Wei; Yang, Fujun; Xu, Lida; Jiang, Chuanqun

    2013-11-01

    Process dynamic modelling for service business is the key technique for service-oriented information systems and service business management, and the workflow model of business processes is the core part of service systems. Service business workflow simulation is the prevalent approach used for analysing service business processes dynamically. The generic method for service business workflow simulation is based on discrete-event queuing theory, which lacks flexibility and scalability. In this paper, we propose a service workflow-oriented framework for the process simulation of service businesses, using multi-agent cooperation to address the above issues. Social rationality of agents is introduced into the proposed framework. Adopting rationality as one social factor in decision-making strategies, a flexible scheduling for activity instances has been implemented. A system prototype has been developed to validate the proposed simulation framework through a business case study.
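
    One simple way to fold a social-rationality factor into an agent's choice of activity instance is a weighted score between self-interest and group benefit. The weighting scheme below is an illustrative guess at such a decision strategy, not the authors' formula.

```python
from dataclasses import dataclass

@dataclass
class Activity:
    name: str
    own_payoff: float    # benefit to the agent itself
    group_payoff: float  # benefit to the overall workflow

@dataclass
class Agent:
    name: str
    rationality: float   # 0 = purely selfish, 1 = purely social

    def choose(self, activities):
        """Pick the activity maximizing a rationality-weighted score."""
        def score(a):
            return ((1 - self.rationality) * a.own_payoff
                    + self.rationality * a.group_payoff)
        return max(activities, key=score)

pending = [Activity("rush order", 9.0, 3.0), Activity("shared setup", 4.0, 8.0)]
for agent in (Agent("selfish", 0.1), Agent("cooperative", 0.9)):
    print(agent.name, "->", agent.choose(pending).name)
```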

  11. A Java based framework for simulating peer-to-peer overlay networks

    OpenAIRE

    Hasselrot, Daniel

    2005-01-01

    In the last few years many new structured overlay network protocols for peer-to-peer systems have appeared. Following that, the need to test and develop the protocols in a controlled environment arose and many different simulators were written, often only supporting a single protocol and designed with a specific simulation task in mind. We introduce a general component based framework in Java for writing peer-to-peer simulators, complete with methods for exporting and vis...

  12. An artificial intelligence framework for feedback and assessment mechanisms in educational Simulations and Serious Games

    OpenAIRE

    Stallwood, James

    2015-01-01

    Simulations and Serious Games are powerful e-learning tools that can be designed to provide learning opportunities that stimulate their participants. To achieve this goal, the design of Simulations and Serious Games will often include some balance of three factors: motivation, engagement, and flow. Whilst many frameworks and approaches for Simulation and Serious Game design do provide the means for addressing a combination of these factors to some degree, few address how those factors might b...

  13. Digital system verification a combined formal methods and simulation framework

    CERN Document Server

    Li, Lun

    2010-01-01

    Integrated circuit capacity follows Moore's law, and chips are commonly produced at the time of this writing with over 70 million gates per device. Ensuring correct functional behavior of such large designs before fabrication poses an extremely challenging problem. Formal verification validates the correctness of the implementation of a design with respect to its specification through mathematical proof techniques. Formal techniques have been emerging as commercialized EDA tools in the past decade. Simulation remains a predominantly used tool to validate a design in industry. After more than 5

  14. Modeling and Simulation Framework for Flow-Based Microfluidic Biochips

    DEFF Research Database (Denmark)

    Schmidt, Morten Foged; Minhass, Wajid Hassan; Pop, Paul; Madsen, Jan

    2013-01-01

    By combining several microvalves, more complex units such as micropumps, switches, mixers, and multiplexers can be built. Such biochips are becoming increasingly complex, with thousands of components, but are still designed manually using a bottom-up full-custom design approach, which is extremely labor-intensive... and error-prone. In this paper, we present an Integrated Development Environment (IDE) which addresses (i) schematic capture of the biochip architecture and biochemical application, (ii) logic simulation of an application running on a biochip, and is able to integrate the high-level synthesis tasks

  15. Atomistic Simulation of Protein Encapsulation in Metal-Organic Frameworks.

    Science.gov (United States)

    Zhang, Haiyang; Lv, Yongqin; Tan, Tianwei; van der Spoel, David

    2016-01-28

    Fabrication of metal-organic frameworks (MOFs) with large apertures has triggered a brand-new research area for the selective encapsulation of biomolecules within MOF nanopores. The underlying inclusion mechanism, however, has yet to be clarified. Here we report a molecular dynamics study on the mechanism of protein encapsulation in MOFs. Evaluation of the binding of amino acid side chain analogues reveals that the van der Waals interaction is the main driving force for the binding and that guest size acts as a key factor in predicting protein binding with MOFs. Analysis of the conformation and thermodynamic stability of the miniprotein Trp-cage encapsulated in a series of MOFs with varying pore apertures and surface chemistries indicates that protein encapsulation can be achieved by maintaining a polar/nonpolar balance in the MOF surface through tunable modification of organic linkers and Mg-O chelating moieties. Such modifications endow MOFs with a more biocompatible confinement. This work provides guidelines for the selective inclusion of biomolecules within MOFs and facilitates MOF function as a new class of host materials and molecular chaperones. PMID:26730607

  16. Consistent and conservative framework for incompressible multiphase flow simulations

    Science.gov (United States)

    Owkes, Mark; Desjardins, Olivier

    2015-11-01

    We present a computational methodology for convection that handles discontinuities with second order accuracy and maintains conservation to machine precision. We use this method in the context of an incompressible gas-liquid flow to transport the phase interface, momentum, and scalars. Using the same methodology for all the variables ensures discretely consistent transport, which is necessary for robust and accurate simulations of turbulent atomizing flows with high-density ratios. The method achieves conservative transport by computing consistent fluxes on a refined mesh, which ensures all conserved quantities are fluxed with the same discretization. Additionally, the method seamlessly couples semi-Lagrangian fluxes used near the interface with finite difference fluxes used away from the interface. The semi-Lagrangian fluxes are three-dimensional, un-split, and conservatively handle discontinuities. Careful construction of the fluxes ensures they are divergence-free and no gaps or overlaps form between neighbors. We have tested and used the scheme for many cases and demonstrate a simulation of an atomizing liquid jet.
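
    The conservation property emphasized above, that a single face flux shared by two neighboring cells keeps the total conserved to machine precision, is easy to verify in a one-dimensional finite-volume sketch (a much simpler scheme than the paper's un-split semi-Lagrangian fluxes).

```python
import numpy as np

def upwind_step(q, u, dx, dt):
    """One conservative finite-volume update: each face flux is computed
    once and applied with opposite signs to the two adjacent cells
    (periodic boundaries, first-order upwind)."""
    flux = u * np.roll(q, 1) if u > 0 else u * q    # value at each left face
    return q - (dt / dx) * (np.roll(flux, -1) - flux)

q = np.zeros(100)
q[40:60] = 1.0                       # square pulse
total_before = q.sum()
for _ in range(500):
    q = upwind_step(q, u=1.0, dx=1.0, dt=0.5)
print(abs(q.sum() - total_before))   # conserved to machine precision
```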

  17. Turbulent Simulations of Divertor Detachment Based On BOUT++ Framework

    Science.gov (United States)

    Chen, Bin; Xu, Xueqiao; Xia, Tianyang; Ye, Minyou

    2015-11-01

    China Fusion Engineering Testing Reactor is under conceptual design, acting as a bridge between ITER and DEMO. Detached divertor operation offers great promise for reducing the heat flux onto divertor target plates to achieve acceptable erosion. Therefore, a density scan is performed via an increase of D2 gas puffing rates in the range of 0.0 to 5.0×10^23 s^-1 using the B2-Eirene/SOLPS 5.0 code package to study heat flux control and impurity screening properties. As the density increases, the divertor operation status changes gradually, from the low-recycling regime to the high-recycling regime and finally to detachment. Significant radiation loss inside the confined plasma in the divertor region during detachment leads to strong parallel density and temperature gradients. Based on the SOLPS simulations, BOUT++ simulations will be presented to investigate the stability and turbulent transport under divertor plasma detachment, particularly the strong parallel-gradient-driven instabilities and enhanced plasma turbulence that spread the heat flux over larger surface areas. The correlation between outer mid-plane and divertor turbulence and the related transport will be analyzed. Prepared by LLNL under Contract DE-AC52-07NA27344. LLNL-ABS-675075.

  18. A Dynamic Simulation Analysis of Currency Substitution in an Optimizing Framework with Transactions Costs

    OpenAIRE

    Carlos Asilis; Paul D. McNelis

    1992-01-01

    This paper investigates the dynamic paths of inflation and real balances in a general equilibrium intertemporal optimization model, with transactions costs and currency substitution, when budget deficits are financed by money creation. The results show that inflationary paths show more 'jumps' or explosions under the assumptions of lower transactions costs or an increasing degree of currency...

  19. An implicit solution framework for reactor fuel performance simulation

    International Nuclear Information System (INIS)

    The simulation of nuclear reactor fuel performance involves complex thermomechanical processes between fuel pellets, made of fissile material, and the protective cladding that surrounds the pellets. An important design goal for a fuel is to maximize the life of the cladding thereby allowing the fuel to remain in the reactor for a longer period of time to achieve higher degrees of burnup. This presentation examines various mathematical and computational issues that impact the modeling of the thermomechanical response of reactor fuel, and are thus important to the development of INL's fuel performance analysis code, BISON. The code employs advanced methods for solving coupled partial differential equation systems that describe multidimensional fuel thermomechanics, heat generation, and transport within the fuel

  20. Creating a Software Framework for Simulating Satellite Geolocation

    Energy Technology Data Exchange (ETDEWEB)

    Koch, Daniel B [ORNL]

    2011-01-01

    It is hard to imagine life these days without having some sort of electronic indication of one's current location. Whether the purpose is for business, personal, or emergency use, utilizing smart cell phones, in-vehicle navigation systems, or location beacons, dependence on the Global Positioning System (GPS) is pervasive. Yet the availability of the GPS should not be taken for granted. Both environmental (e.g., terrain, weather) and intentional interference (i.e., jamming) can reduce or deny satellite access. In order to investigate these and other issues, as well as to explore possible alternative satellite constellations, an application called the Satellite Simulation Toolkit (SatSim) was created. This paper presents a high-level overview of SatSim and an example of how it may be used to study geolocation.
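
    Although the abstract stays at the architecture level, the computation underlying satellite geolocation is a least-squares position fix from measured ranges. The sketch below is a simplified Gauss-Newton solver with hypothetical satellite positions, ignoring receiver clock bias; it is not part of SatSim.

```python
import numpy as np

def geolocate(sat_positions, ranges, guess, iterations=10):
    """Gauss-Newton fix: linearize the range equations about the current
    estimate and solve the resulting least-squares problem repeatedly."""
    x = np.asarray(guess, dtype=float)
    for _ in range(iterations):
        diffs = x - sat_positions              # receiver-to-satellite vectors
        predicted = np.linalg.norm(diffs, axis=1)
        jacobian = diffs / predicted[:, None]  # unit line-of-sight vectors
        residual = ranges - predicted
        dx, *_ = np.linalg.lstsq(jacobian, residual, rcond=None)
        x += dx
    return x

sats = np.array([[15e6, 0, 20e6], [-10e6, 12e6, 21e6],
                 [0, -14e6, 19e6], [8e6, 9e6, 22e6]])  # hypothetical positions (m)
truth = np.array([1.2e6, -0.8e6, 0.0])
ranges = np.linalg.norm(truth - sats, axis=1)          # noise-free ranges
print(geolocate(sats, ranges, guess=[0.0, 0.0, 0.0]))
```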

  1. A Mobility and Traffic Generation Framework for Modeling and Simulating Ad Hoc Communication Networks

    OpenAIRE

    Barrett, Chris; Drozda, Martin; Marathe, Madhav V.; Ravi, S. S.; Smith, James P.

    2004-01-01

    We present a generic mobility and traffic generation framework that can be incorporated into a tool for modeling and simulating large-scale ad hoc networks. Three components of this framework, namely a mobility data generator (MDG), a graph structure generator (GSG) and an occlusion modification tool (OMT), allow a variety of mobility models to be incorporated into the tool. The MDG module generates positions of transceivers at specified time instants. The GSG module constructs the graph corre...

  2. CoRoBa, a Multi Mobile Robot Control and Simulation Framework

    Directory of Open Access Journals (Sweden)

    Eric Colon

    2008-11-01

    This paper describes the ongoing development of a multi-robot control framework named CoRoBa. CoRoBa is theoretically founded on the reification of Real-Time Design Patterns. It uses CORBA as its communication middleware and consequently benefits from the interoperability of this standard. A multi-robot 3D simulator written in Java3D integrates seamlessly with this framework. Several demonstration applications have been developed to validate the design and implementation choices.

  3. Simulation Framework for Rapid Entry, Descent, and Landing (EDL) Analysis, Phase 2 Results

    Science.gov (United States)

    Murri, Daniel G.

    2011-01-01

    The NASA Engineering and Safety Center (NESC) was requested to establish the Simulation Framework for Rapid Entry, Descent, and Landing (EDL) Analysis assessment, which involved development of an enhanced simulation architecture using the Program to Optimize Simulated Trajectories II simulation tool. The assessment was requested to enhance the capability of the Agency to provide rapid evaluation of EDL characteristics in systems analysis studies, preliminary design, mission development and execution, and time-critical assessments. Many of the new simulation framework capabilities were developed to support the Agency EDL-Systems Analysis (SA) team that is conducting studies of the technologies and architectures that are required to enable human and higher mass robotic missions to Mars. The findings, observations, and recommendations from the NESC are provided in this report.

  4. SIMPEG: An open source framework for simulation and gradient based parameter estimation in geophysical applications

    Science.gov (United States)

    Cockett, Rowan; Kang, Seogi; Heagy, Lindsey J.; Pidlisecky, Adam; Oldenburg, Douglas W.

    2015-12-01

    Inverse modeling is a powerful tool for extracting information about the subsurface from geophysical data. Geophysical inverse problems are inherently multidisciplinary, requiring elements from the relevant physics, numerical simulation, and optimization, as well as knowledge of the geologic setting and a comprehension of the interplay between all of these elements. The development and advancement of inversion methodologies can be enabled by a framework that supports experimentation, is flexible and extensible, and allows the knowledge generated to be captured and shared. The goal of this paper is to propose a framework that supports many different types of geophysical forward simulations and deterministic inverse problems. Additionally, we provide an open source implementation of this framework in Python called SIMPEG (Simulation and Parameter Estimation in Geophysics).
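
    The deterministic, gradient-based inversions such a framework organizes typically minimize a data misfit plus a regularization term. The tiny Tikhonov example below, with an invented dense linear forward operator, shows that structure; it does not use SIMPEG's actual API.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear forward problem d = G m with smooth kernels.
n_data, n_model = 20, 100
x = np.linspace(0, 1, n_model)
G = np.exp(-np.abs(np.subtract.outer(np.linspace(0, 1, n_data), x)) * 10)
m_true = np.sin(2 * np.pi * x)
d_obs = G @ m_true + 0.01 * rng.standard_normal(n_data)

def invert(G, d, beta):
    """Minimize ||G m - d||^2 + beta ||m||^2 (Tikhonov regularization)."""
    A = G.T @ G + beta * np.eye(G.shape[1])
    return np.linalg.solve(A, G.T @ d)

m_rec = invert(G, d_obs, beta=1e-2)
print("relative model error:", np.linalg.norm(m_rec - m_true) / np.linalg.norm(m_true))
```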

  5. A detailed framework to incorporate dust in hydrodynamical simulations

    CERN Document Server

    Grassi, T; Haugboelle, T; Schleicher, D R G

    2016-01-01

    Dust plays a key role in the evolution of the ISM and its correct modelling in numerical simulations is therefore fundamental. We present a new and self-consistent model that treats grain thermal coupling with the gas, radiation balance, and surface chemistry for molecular hydrogen. This method can be applied to any dust distribution with an arbitrary number of grain types without affecting the overall computational cost. In this paper we describe in detail the physics and the algorithm behind our approach, and in order to test the methodology, we present some examples of astrophysical interest, namely (i) a one-zone collapse with complete gas chemistry and thermochemical processes, (ii) a 3D model of a low-metallicity collapse of a minihalo starting from cosmological initial conditions, and (iii) a turbulent molecular cloud with H-C-O chemistry (277 reactions), together with self-consistent cooling and heating solved on the fly. Although these examples employ the publicly available code KROME, our approach c...

  6. A numerical framework for simulating fluid-structure interaction phenomena

    Directory of Open Access Journals (Sweden)

    A. De Rosis

    2014-07-01

    In this paper, a numerical tool able to solve fluid-structure interaction problems is proposed. The lattice Boltzmann method is used to compute fluid dynamics, while the corotational finite element formulation together with the Time Discontinuous Galerkin method is adopted to predict structure dynamics. The Immersed Boundary method is used to account for the presence of an immersed solid in the lattice fluid background and to handle fluid-structure interface conditions, while a Volume-of-Fluid-based method is adopted to keep track of the evolution of the free surface. These ingredients are combined through a partitioned staggered explicit strategy, according to an efficient and accurate algorithm recently developed by the authors. The effectiveness of the proposed methodology is tested against two different cases. The former investigates the dam-break phenomenon, involving the modeling of the free surface. The latter involves the vibration regime experienced by two highly deformable flapping flags obstructing a flow. A wide numerical campaign is carried out by computing the error in terms of the interface energy artificially introduced at the fluid-solid interface. Moreover, the structure behavior is dissected by simulating scenarios characterized by different values of the Reynolds number. Present findings are compared to literature results, showing very close agreement.
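
    The partitioned staggered explicit strategy mentioned above alternates single-field solves, exchanging interface data once per step. The schematic below uses toy placeholder solvers to convey the control flow; it is not the authors' lattice Boltzmann/finite element implementation.

```python
def staggered_fsi(fluid, structure, dt, n_steps):
    """Partitioned explicit coupling: one fluid solve and one structure
    solve per step, exchanging interface data in between (schematic)."""
    for _ in range(n_steps):
        # 1. Advance the fluid on the current structure configuration.
        fluid.advance(dt, boundary=structure.interface_position())
        # 2. Evaluate the hydrodynamic load on the immersed structure.
        load = fluid.interface_force()
        # 3. Advance the structure under that load.
        structure.advance(dt, load=load)

class DampedSpring:
    """Toy 'structure': a damped oscillator standing in for the FE solver."""
    def __init__(self):
        self.x, self.v = 1.0, 0.0
    def interface_position(self):
        return self.x
    def advance(self, dt, load):
        self.v += dt * (load - 4.0 * self.x - 0.5 * self.v)
        self.x += dt * self.v

class UniformFlow:
    """Toy 'fluid': drag proportional to the boundary displacement."""
    def __init__(self):
        self.boundary = 0.0
    def advance(self, dt, boundary):
        self.boundary = boundary
    def interface_force(self):
        return -0.2 * self.boundary

fluid, spring = UniformFlow(), DampedSpring()
staggered_fsi(fluid, spring, dt=0.01, n_steps=5)
print(f"structure position after coupling: {spring.x:.4f}")
```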

  7. Synergia an advanced object-oriented framework for beam dynamics simulation

    CERN Document Server

    Dechow, Douglas R; Spentzouris, Panagiotis; Stoltz, Peter

    2005-01-01

    Synergia is a 3-D, parallel, particle-in-cell beam dynamics simulation toolkit. At the heart of the software development effort is the integration of two extant object-oriented accelerator modeling frameworks (Impact, written in Fortran 90, and mxyptlk, written in C++) so that they may be steered by a third, more flexible human-interface framework written in Python. Recent efforts are focused on the refactoring of the Impact-Fortran 90 codes in order to expose more loosely-coupled interfaces to the Python interface framework.

  8. A framework of knowledge creation processes in participatory simulation of hospital work systems

    DEFF Research Database (Denmark)

    Andersen, Simone Nyholm; Broberg, Ole

    2016-01-01

    … suggest applying a knowledge creation perspective. The aim of this study was to develop a framework describing the process of how ergonomics knowledge is created in PS. Video recordings from three projects applying PS of hospital work systems constituted the foundation of process mining analysis. The … analysis resulted in a framework revealing the sources of ergonomics knowledge creation as sequential relationships between the activities of simulation participants sharing work experiences; experimenting with scenarios; and reflecting on ergonomics consequences. We argue that this framework reveals the …

  9. A Unified Simulation Framework for Megathrust Rupture Dynamics and Tsunamis

    Science.gov (United States)

    Dunham, E. M.; Lotto, G. C.; Kozdon, J. E.

    2014-12-01

    Many earthquakes, including megathrust events in subduction zones, occur offshore. In addition to seismic waves, such earthquakes also generate tsunamis. We present a methodology for simultaneously investigating earthquake rupture dynamics and tsunamigenesis, based on solution of the elastic and acoustic wave equations, in the solid and fluid portions of the domain, respectively. Surface gravity waves or tsunamis emerge naturally in such a description when gravitational restoring forces are properly taken into account. In our approach, we adopt an Eulerian description of the ocean and within it solve for particle velocities and the perturbation in pressure, Δp, about an initial hydrostatic state. The key step is enforcing the traction-free boundary condition on the moving ocean surface. We linearize this boundary condition, in order to apply it on the initial surface, and express it as Δp-ρgη=0, where -ρg is the initial hydrostatic gradient in pressure and η is the sea surface uplift (obtained, to first order, by integrating vertical particle velocity on the initial ocean surface). We show that this is the only place one needs to account for gravity. Additional terms in the momentum balance and linearized equation of state describing advection of pressure and density gradients can be included to study internal gravity waves within the ocean, but these can be safely neglected for problems of interest to us. We present a range of simulations employing this new methodology. These include test problems used to verify the accuracy of the method for modeling seismic, ocean acoustic, and tsunami waves, as well as more detailed models of megathrust ruptures. Our present work is focused on tsunami generation in models with variable bathymetry, where previous studies have raised questions regarding how horizontal displacement of a sloping seafloor excites tsunamis. Our approach rigorously accounts for time-dependent seafloor motion, horizontal momentum transfer, and
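    The linearized surface condition Δp - ρgη = 0 described above lends itself to a very small sketch: integrate the vertical surface velocity to obtain the uplift η, then impose ρgη as the pressure perturbation on the initial surface. The snippet below is illustrative only; the wave solver that would update the surface velocity is replaced by a made-up fixed profile.

```python
import numpy as np

# Sketch of the linearized free-surface condition Delta_p = rho*g*eta applied
# on the initial ocean surface. vz_surface stands in for the vertical particle
# velocity the wave solver would supply; all values here are made up.
rho, g, dt, nx = 1000.0, 9.81, 0.05, 256
eta = np.zeros(nx)                                      # sea-surface uplift
x = np.arange(nx)
vz_surface = 0.01 * np.exp(-(x - nx / 2) ** 2 / 200.0)  # toy velocity profile

for step in range(200):
    # ... a real solver updates vz_surface and the interior fields here ...
    eta += vz_surface * dt            # first-order kinematic update of uplift
    dp_surface = rho * g * eta        # pressure perturbation to impose, so
                                      # that Delta_p - rho*g*eta = 0 holds
print("peak uplift:", eta.max(), "m")
```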

  10. FERN – a Java framework for stochastic simulation and evaluation of reaction networks

    Science.gov (United States)

    Erhard, Florian; Friedel, Caroline C; Zimmer, Ralf

    2008-01-01

    Background Stochastic simulation can be used to illustrate the development of biological systems over time and the stochastic nature of these processes. Currently available programs for stochastic simulation, however, are limited in that they either a) do not provide the most efficient simulation algorithms and are difficult to extend, b) cannot be easily integrated into other applications or c) do not allow the user to monitor and intervene in the simulation process in an easy and intuitive way. Thus, in order to use stochastic simulation in innovative high-level modeling and analysis approaches, more flexible tools are necessary. Results In this article, we present FERN (Framework for Evaluation of Reaction Networks), a Java framework for the efficient simulation of chemical reaction networks. FERN is subdivided into three layers for network representation, simulation and visualization of the simulation results, each of which can be easily extended. It provides efficient and accurate state-of-the-art stochastic simulation algorithms for well-mixed chemical systems and a powerful observer system, which makes it possible to track and control the simulation progress on every level. To illustrate how FERN can be easily integrated into other systems biology applications, plugins to Cytoscape and CellDesigner are included. These plugins make it possible to run simulations and to observe the simulation progress in a reaction network in real-time from within the Cytoscape or CellDesigner environment. Conclusion FERN addresses shortcomings of currently available stochastic simulation programs in several ways. First, it provides a broad range of efficient and accurate algorithms both for exact and approximate stochastic simulation and a simple interface for extending to new algorithms. FERN's implementations are considerably faster than the C implementations of gillespie2 or the Java implementations of ISBJava. Second, it can be used in a straightforward way both as a stand
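    FERN itself is written in Java, but the exact stochastic simulation algorithm at its core can be sketched generically. Below is a minimal Gillespie direct-method loop in Python for a hypothetical two-reaction network (A to B, B to A); rate constants and copy numbers are made up.

```python
import numpy as np

# Gillespie direct-method SSA for a hypothetical two-reaction network
# A -> B (k1), B -> A (k2). FERN itself is Java; this generic Python sketch
# only illustrates the exact algorithm such frameworks implement.
rng = np.random.default_rng(1)
k1, k2 = 0.5, 0.3
x = np.array([100, 0])                    # copy numbers of A and B
stoich = np.array([[-1, 1], [1, -1]])     # state change per reaction
t, t_end = 0.0, 50.0

while t < t_end:
    a = np.array([k1 * x[0], k2 * x[1]])  # propensities
    a0 = a.sum()
    if a0 == 0:
        break
    t += rng.exponential(1.0 / a0)        # time to next reaction
    x += stoich[rng.choice(2, p=a / a0)]  # fire the chosen reaction
print("final state:", x, "at t =", round(t, 2))
```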

  11. FERN – a Java framework for stochastic simulation and evaluation of reaction networks

    Directory of Open Access Journals (Sweden)

    Zimmer Ralf

    2008-08-01

    Full Text Available Abstract Background Stochastic simulation can be used to illustrate the development of biological systems over time and the stochastic nature of these processes. Currently available programs for stochastic simulation, however, are limited in that they either a) do not provide the most efficient simulation algorithms and are difficult to extend, b) cannot be easily integrated into other applications or c) do not allow the user to monitor and intervene in the simulation process in an easy and intuitive way. Thus, in order to use stochastic simulation in innovative high-level modeling and analysis approaches, more flexible tools are necessary. Results In this article, we present FERN (Framework for Evaluation of Reaction Networks), a Java framework for the efficient simulation of chemical reaction networks. FERN is subdivided into three layers for network representation, simulation and visualization of the simulation results, each of which can be easily extended. It provides efficient and accurate state-of-the-art stochastic simulation algorithms for well-mixed chemical systems and a powerful observer system, which makes it possible to track and control the simulation progress on every level. To illustrate how FERN can be easily integrated into other systems biology applications, plugins to Cytoscape and CellDesigner are included. These plugins make it possible to run simulations and to observe the simulation progress in a reaction network in real-time from within the Cytoscape or CellDesigner environment. Conclusion FERN addresses shortcomings of currently available stochastic simulation programs in several ways. First, it provides a broad range of efficient and accurate algorithms both for exact and approximate stochastic simulation and a simple interface for extending to new algorithms. FERN's implementations are considerably faster than the C implementations of gillespie2 or the Java implementations of ISBJava. Second, it can be used in a straightforward

  12. A Framework for Simulating Turbine-Based Combined-Cycle Inlet Mode-Transition

    Science.gov (United States)

    Le, Dzu K.; Vrnak, Daniel R.; Slater, John W.; Hessel, Emil O.

    2012-01-01

    A simulation framework based on the Memory-Mapped-Files technique was created to operate multiple numerical processes in locked time-steps and to exchange I/O data synchronously between them to simulate system dynamics. This simulation scheme is currently used to study the complex interactions between inlet flow dynamics, variable-geometry actuation mechanisms, and flow controls in the transition from supersonic to hypersonic conditions and vice versa. A study of Mode-Transition Control for a high-speed inlet wind-tunnel model with this MMF-based framework is presented to illustrate this scheme and demonstrate its usefulness in simulating supersonic and hypersonic inlet dynamics and controls, as well as other types of complex systems.
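    One half of such a memory-mapped-file exchange might look like the sketch below: a process publishes its step counter and output into a shared file that a peer polls. This is a hypothetical layout (an 8-byte step counter followed by an 8-byte double), not the implementation described above, and the handshaking a real lockstep scheme needs is only indicated in a comment.

```python
import mmap
import struct

# One side of a hypothetical lockstep exchange through a memory-mapped file:
# the first 8 bytes hold a step counter, the next 8 a state variable. This is
# an illustration of the idea, not the framework's actual record layout.
PATH, SIZE = "iodata.bin", 16
with open(PATH, "wb") as fh:
    fh.write(b"\x00" * SIZE)

with open(PATH, "r+b") as fh:
    mm = mmap.mmap(fh.fileno(), SIZE)
    for step in range(1, 11):
        value = step * 0.1                          # stand-in for solver output
        mm[:SIZE] = struct.pack("qd", step, value)  # publish step + data
        mm.flush()                                  # make visible to the peer
        # a real co-simulation would now poll the peer's counter and block
        # until it advances, keeping both processes in locked time-steps
    mm.close()
```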

  13. An Object-Oriented Framework for Versatile Finite Element Based Simulations of Neurostimulation

    OpenAIRE

    Edward T. Dougherty; Turner, James C.

    2016-01-01

    Computational simulations of transcranial electrical stimulation (TES) are commonly utilized by the neurostimulation community, and while vastly different TES application areas can be investigated, the mathematical equations and physiological characteristics that govern this research are identical. The goal of this work was to develop a robust software framework for TES that efficiently supports the spectrum of computational simulations routinely utilized by the TES community and in addition ...

  14. KMCLib: A general framework for lattice kinetic Monte Carlo (KMC) simulations

    OpenAIRE

    Leetmaa, Mikael; Skorodumova, Natalia V.

    2014-01-01

    KMCLib is a general framework for lattice kinetic Monte Carlo (KMC) simulations. The program can handle simulations of the diffusion and reaction of millions of particles in one, two, or three dimensions, and is designed to be easily extended and customized by the user to allow for the development of complex custom KMC models for specific systems without having to modify the core functionality of the program. Analysis modules and on-the-fly elementary step diffusion rate calculations can be i...
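    The rejection-free (BKL-style) step underlying lattice KMC codes of this kind is compact enough to sketch. The snippet below is generic Python, not the KMCLib API, and the rate table is invented; KMCLib would build such rates from user-defined processes on a lattice.

```python
import numpy as np

# Rejection-free (BKL-style) kinetic Monte Carlo step: pick a process with
# probability proportional to its rate, then advance the clock by an
# exponential waiting time. Generic sketch, not the KMCLib API.
rng = np.random.default_rng(2)
rates = np.array([1.0, 0.5, 0.25])      # hypothetical elementary-step rates

t = 0.0
for _ in range(5):
    total = rates.sum()
    j = rng.choice(len(rates), p=rates / total)   # which process occurs
    t += -np.log(rng.random()) / total            # exponential waiting time
    print(f"process {j} fired, t = {t:.3f}")
```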

  15. Dynamically adaptive Lattice Boltzmann simulation of shallow water flows with the Peano framework

    KAUST Repository

    Neumann, Philipp

    2015-09-01

    We present a dynamically adaptive Lattice Boltzmann (LB) implementation for solving the shallow water equations (SWEs). Our implementation extends an existing LB component of the Peano framework. We revise the modular design with respect to the incorporation of new simulation aspects and LB models. The basic SWE-LB implementation is validated in different breaking dam scenarios. We further provide a numerical study on the stability of the MRT collision operator used in our simulations.

  16. A Framework for Teaching Programming on the Internet: A Web-Based Simulation Approach

    OpenAIRE

    Yousif A. Bastaki

    2012-01-01

    Problem statement: This research study describes the process of developing a web-based framework for simulating programming language activities on the Internet, in an interactive way, by enabling executable programs to perform their function automatically. Approach: The interaction process is implemented using Java applets. It emphasizes the importance of building the web-based architecture of the proposed simulation model. Results: The research concentrates on developing programming courses on th...

  17. A Metascalable Computing Framework for Large Spatiotemporal-Scale Atomistic Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Nomura, K; Seymour, R; Wang, W; Kalia, R; Nakano, A; Vashishta, P; Shimojo, F; Yang, L H

    2009-02-17

    A metascalable (or 'design once, scale on new architectures') parallel computing framework has been developed for large spatiotemporal-scale atomistic simulations of materials based on spatiotemporal data locality principles, which is expected to scale on emerging multipetaflops architectures. The framework consists of: (1) an embedded divide-and-conquer (EDC) algorithmic framework based on spatial locality to design linear-scaling algorithms for high complexity problems; (2) a space-time-ensemble parallel (STEP) approach based on temporal locality to predict long-time dynamics, while introducing multiple parallelization axes; and (3) a tunable hierarchical cellular decomposition (HCD) parallelization framework to map these O(N) algorithms onto a multicore cluster based on hybrid implementation combining message passing and critical section-free multithreading. The EDC-STEP-HCD framework exposes maximal concurrency and data locality, thereby achieving: (1) inter-node parallel efficiency well over 0.95 for 218 billion-atom molecular-dynamics and 1.68 trillion electronic-degrees-of-freedom quantum-mechanical simulations on 212,992 IBM BlueGene/L processors (superscalability); (2) high intra-node, multithreading parallel efficiency (nanoscalability); and (3) nearly perfect time/ensemble parallel efficiency (eon-scalability). The spatiotemporal scale covered by MD simulation on a sustained petaflops computer per day (i.e. petaflops · day of computing) is estimated as NT = 2.14 (e.g. N = 2.14 million atoms for T = 1 microseconds).

  18. A Metascalable Computing Framework for Large Spatiotemporal-Scale Atomistic Simulations

    International Nuclear Information System (INIS)

    A metascalable (or 'design once, scale on new architectures') parallel computing framework has been developed for large spatiotemporal-scale atomistic simulations of materials based on spatiotemporal data locality principles, which is expected to scale on emerging multipetaflops architectures. The framework consists of: (1) an embedded divide-and-conquer (EDC) algorithmic framework based on spatial locality to design linear-scaling algorithms for high complexity problems; (2) a space-time-ensemble parallel (STEP) approach based on temporal locality to predict long-time dynamics, while introducing multiple parallelization axes; and (3) a tunable hierarchical cellular decomposition (HCD) parallelization framework to map these O(N) algorithms onto a multicore cluster based on hybrid implementation combining message passing and critical section-free multithreading. The EDC-STEP-HCD framework exposes maximal concurrency and data locality, thereby achieving: (1) inter-node parallel efficiency well over 0.95 for 218 billion-atom molecular-dynamics and 1.68 trillion electronic-degrees-of-freedom quantum-mechanical simulations on 212,992 IBM BlueGene/L processors (superscalability); (2) high intra-node, multithreading parallel efficiency (nanoscalability); and (3) nearly perfect time/ensemble parallel efficiency (eon-scalability). The spatiotemporal scale covered by MD simulation on a sustained petaflops computer per day (i.e. petaflops · day of computing) is estimated as NT = 2.14 (e.g. N = 2.14 million atoms for T = 1 microseconds).

  19. Framework for the construction of a Monte Carlo simulated brain PET–MR image database

    International Nuclear Information System (INIS)

    Simultaneous PET–MR acquisition reduces the possibility of registration mismatch between the two modalities. This facilitates the application of techniques, either during reconstruction or post-reconstruction, that aim to improve the PET resolution by utilising structural information provided by MR. However, in order to validate such methods for brain PET–MR studies it is desirable to evaluate the performance using data where the ground truth is known. In this work, we present a framework for the production of datasets where simulations of both the PET and MR, based on real data, are generated such that reconstruction and post-reconstruction approaches can be fairly compared. -- Highlights: • A framework for simulating realistic brain PET–MR images is proposed. • The imaging data created is formed from real acquisitions. • Partial volume correction techniques can be fairly compared using this framework

  20. COSMOS: A System-Level Modelling and Simulation Framework for Coprocessor-Coupled Reconfigurable Systems

    DEFF Research Database (Denmark)

    Wu, Kehuai; Madsen, Jan

    2007-01-01

    … resource management, and iii) present a SystemC-based framework to model and simulate coprocessor-coupled reconfigurable systems. We illustrate how COSMOS may be used to capture the dynamic behavior of such systems and emphasize the need for capturing the system aspects of such systems in order to deal …

  1. A software framework for the portable parallelization of particle-mesh simulations

    DEFF Research Database (Denmark)

    Sbalzarini, I.F.; Walther, Jens Honore; Polasek, B.; Chatelain, P.; Bergdorf, M.; Hieber, S.E.; Kotsalis, E.M.; Koumoutsakos, P.

    2006-01-01

    Abstract: We present a software framework for the transparent and portable parallelization of simulations using particle-mesh methods. Particles are used to transport physical properties and a mesh is required in order to reinitialize the distorted particle locations, ensuring the convergence of...

  2. A High-Throughput, High-Accuracy System-Level Simulation Framework for System on Chips

    Directory of Open Access Journals (Sweden)

    Guanyi Sun

    2011-01-01

    Full Text Available Today's System-on-Chip (SoC) design is extremely challenging because it involves complicated design tradeoffs and heterogeneous design expertise. To explore the large solution space, system architects have to rely on system-level simulators to identify an optimized SoC architecture. In this paper, we propose a system-level simulation framework, System Performance Simulation Implementation Mechanism, or SPSIM. Based on SystemC TLM2.0, the framework consists of an executable SoC model, a simulation tool chain, and a modeling methodology. Compared with the large body of existing research in this area, this work is aimed at delivering a high simulation throughput and, at the same time, guaranteeing a high accuracy on real industrial applications. Integrating the leading TLM techniques, our simulator attains a simulation speed within a factor of 35 of native hardware execution on a set of real-world applications. SPSIM incorporates effective timing models, which can achieve a high accuracy after hardware-based calibration. Experimental results on a set of mobile applications proved that the difference between the simulated and measured results of timing performance is within 10%, an accuracy which in the past could only be attained by cycle-accurate models.

  3. The Umbra Simulation and Integration Framework Applied to Emergency Response Training

    Science.gov (United States)

    Hamilton, Paul Lawrence; Britain, Robert

    2010-01-01

    The Mine Emergency Response Interactive Training Simulation (MERITS) is intended to prepare personnel to manage an emergency in an underground coal mine. The creation of an effective training environment required realistic emergent behavior in response to simulation events and trainee interventions, exploratory modification of miner behavior rules, realistic physics, and incorporation of legacy code. It also required the ability to add rich media to the simulation without conflicting with normal desktop security settings. Our Umbra Simulation and Integration Framework facilitated agent-based modeling of miners and rescuers and made it possible to work with subject matter experts to quickly adjust behavior through script editing, rather than through lengthy programming and recompilation. Integration of Umbra code with the WebKit browser engine allowed the use of JavaScript-enabled local web pages for media support. This project greatly extended the capabilities of Umbra in support of training simulations and has implications for simulations that combine human behavior, physics, and rich media.

  4. Numerical simulation of the fracture process in ceramic FPD frameworks caused by oblique loading.

    Science.gov (United States)

    Kou, Wen; Qiao, Jiyan; Chen, Li; Ding, Yansheng; Sjögren, Göran

    2015-10-01

    Using a newly developed three-dimensional (3D) numerical modeling code, an analysis was performed of the fracture behavior in a three-unit ceramic-based fixed partial denture (FPD) framework subjected to oblique loading. All the materials in the study were treated heterogeneously; Weibull's distribution law was applied to the description of the heterogeneity. The Mohr-Coulomb failure criterion with tensile strength cut-off was utilized in judging whether the material was in an elastic or failed state. The simulated loading area was placed either on the buccal or the lingual cusp of a premolar-shaped pontic with the loading direction at 30°, 45°, 60°, 75° or 90° angles to the occlusal surface. The stress distribution, fracture initiation and propagation in the framework during the loading and fracture process were analyzed. This numerical simulation allowed the cause of the framework fracture to be identified as tensile stress failure. The decisive fracture was initiated in the gingival embrasure of the pontic, regardless of whether the buccal or lingual cusp of the pontic was loaded. The stress distribution and fracture propagation process of the framework could be followed step by step from beginning to end. The bearing capacity and the rigidity of the framework vary with the loading position and direction. The framework loaded with 90° towards the occlusal surface has the highest bearing capacity and the greatest rigidity. The framework loaded with 30° towards the occlusal surface has the least rigidity, indicating that oblique loading has a major impact on the fracture of ceramic frameworks. PMID:26143353
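    The two ingredients named above, Weibull-distributed heterogeneity and a Mohr-Coulomb criterion with tensile cut-off, can be sketched as a simple element-failure check. The snippet below is illustrative only (compression taken positive, all parameter values invented), not the authors' 3D code.

```python
import numpy as np

# Element failure check combining a Weibull strength distribution with the
# Mohr-Coulomb criterion plus tensile cut-off. Illustrative sketch only;
# compression is taken positive and every parameter value is invented.
rng = np.random.default_rng(3)
m, s0 = 5.0, 100.0                          # Weibull modulus and scale (MPa)
cohesion = rng.weibull(m, size=1000) * s0   # heterogeneous element strengths
phi = np.radians(30.0)                      # friction angle
sigma_t = 10.0                              # tensile strength cut-off (MPa)

def fails(s1, s3, c):
    """s1 >= s3 are principal stresses, compression positive."""
    if s3 <= -sigma_t:                      # tensile cut-off
        return True
    return (s1 - s3) >= 2*c*np.cos(phi) + (s1 + s3)*np.sin(phi)  # shear

print("failed fraction:", np.mean([fails(80.0, -5.0, c) for c in cohesion]))
```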

  5. Modelling and simulation of acrylic bone cement injection and curing within the framework of vertebroplasty

    CERN Document Server

    Landgraf, Ralf; Kolmeder, Sebastian; Lion, Alexander; Lebsack, Helena; Kober, Cornelia

    2013-01-01

    The minimal invasive procedure of vertebroplasty is a surgical technique to treat compression fractures of vertebral bodies. During the treatment liquid bone cement gets injected into the affected vertebral body and therein cures to a solid. In order to investigate the treatment and the impact of injected bone cement on the vertebra, an integrated modelling and simulation framework has been developed. The framework includes (i) the generation of computer models based on microCT images of human cancellous bone, (ii) CFD simulations of bone cement injection into the trabecular structure of a vertebral body as well as (iii) non-linear FEM simulations of the bone cement curing. Thereby, microstructural models of trabecular bone structures are employed. Furthermore, a detailed description of the material behaviour of acrylic bone cements is provided. More precisely, a non-linear fluid flow model is chosen for the representation of the bone cement behaviour during injection and a non-linear viscoelastic material mo...

  6. Prototyping a coherent framework for full, fast and parametric detector simulation for the FCC project

    CERN Document Server

    Hrdinka, Julia; Salzburger, Andreas; Hegner, Benedikt

    2015-01-01

    The outstanding success of the physics program of the Large Hadron Collider (LHC) including the discovery of the Higgs boson shifted the focus of part of the high energy physics community onto the planning phase for future circular collider (FCC) projects. A proton-proton collider is in consideration, as well as an electron-positron ring and an electron-proton option as potential LHC successor projects. Common to all projects is the need for a coherent software framework in order to carry out simulation studies to establish the potential physics reach or to test different technology approaches. Detector simulation is a particularly necessary tool needed for design studies of different detector concepts and to allow establishing the relevant performance parameters. In addition, it allows to generate data as input for the development of reconstruction algorithms needed to cope with the expected future environments. We present a coherent framework that combines full, fast and parametric detector simulation e...

  7. Delphes, a framework for fast simulation of a generic collider experiment

    CERN Document Server

    Ovyn, S; Lemaître, V

    2009-01-01

    It is often difficult to know whether theoretical predictions are visible and measurable in a high energy collider experiment, due to the complexity of the related detectors, data acquisition chain and software. We introduce here a new C++ based framework, DELPHES, for fast simulation of a general-purpose experiment. The simulation includes a tracking system, embedded into a magnetic field, calorimetry and a muon system, and possible very forward detectors arranged along the beamline. The framework is interfaced to standard file formats (e.g. Les Houches Event File) and outputs observable objects for analysis, like missing transverse energy and collections of electrons or jets. The simulation of detector response takes into account the detector resolution, and usual reconstruction algorithms, such as FASTJET. A simplified preselection can also be applied on processed data for trigger emulation. Detection of very forward scattered particles relies on the transport in beamlines with the HECTOR software. Finally,...

  8. Web-Enabled Framework for Real-Time Scheduler Simulator: A Teaching Tool

    Directory of Open Access Journals (Sweden)

    C. Yaashuwanth

    2010-01-01

    Full Text Available Problem statement: A Real-Time System (RTS) is one which controls an environment by receiving data, processing it, and returning the results quickly enough to affect the functioning of the environment at that time. The main objective of this research was to develop an architectural model for the simulation of real-time tasks in a distributed environment through the Web, and to make comparisons between various scheduling algorithms. The proposed model can be used for preprogrammed scheduling policies for uniprocessor systems. This model provides a user-friendly Graphical User Interface (GUI). Approach: Though a lot of scheduling algorithms have been developed, just a few of them are available to be implemented in real-time applications. In order to use, test and evaluate a scheduling policy it must be integrated into an operating system, which is a complex task. Simulation is another alternative to evaluate a scheduling policy. Unfortunately, just a few real-time scheduling simulators have been developed to date and most of them require the use of a specific simulation language. Results: Task ID, deadline, priority, period, computation time and phase are the input task attributes to the scheduler simulator; a chronograph imitating the real-time execution of the input task set and computational statistics of the schedule are the output. Conclusion: The Web-enabled framework proposed in this study enables the developer to evaluate the schedulability of a real-time application. Numerous benefits can be cited in support of Web-based deployment. The proposed framework can be used as an invaluable teaching tool. Further, its GUI allows for easy comparison of existing scheduling policies and also makes it possible to simulate the behavior and verify the suitability of custom-defined schedulers for real-time applications.
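    A preprogrammed uniprocessor policy of the kind such a teaching simulator exposes can be checked with the classic Liu and Layland utilization bound for rate-monotonic scheduling; a minimal sketch with a made-up task set follows. Note the bound is sufficient but not necessary.

```python
# Liu and Layland utilization bound for rate-monotonic scheduling, the kind
# of preprogrammed uniprocessor test such a simulator can expose. The task
# set of (computation time, period) pairs below is hypothetical.
tasks = [(1.0, 4.0), (1.0, 5.0), (2.0, 10.0)]

def rm_schedulable(tasks):
    n = len(tasks)
    u = sum(c / t for c, t in tasks)        # total processor utilization
    bound = n * (2 ** (1.0 / n) - 1.0)      # sufficient, not necessary
    return u, bound, u <= bound

u, bound, ok = rm_schedulable(tasks)
print(f"U = {u:.3f}, bound = {bound:.3f}, schedulable by bound: {ok}")
```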

  9. A model calibration framework for simultaneous multi-level building energy simulation

    International Nuclear Information System (INIS)

    Highlights: • Introduce a framework for multiple levels of building energy simulation calibration. • Improve the performance reliability of a calibrated model for different ECMs. • Achieve high simulation accuracies at building level, ECM level and zone level. • Create a classification schema to classify input parameters for calibration. • Use evidence and statistical learning to build energy model and reduce discrepancy. - Abstract: Energy simulation, the virtual representation and reproduction of energy processes for an entire building or a specific space, could assist building professionals with identifying relatively optimal energy conservation measures (ECMs). A review of current work revealed that methods for achieving simultaneous high accuracies in different levels of simulations, such as building level and zone level, have not been systematically explored, especially when there are several zones and multiple HVAC units in a building. Therefore, the objective of this paper is to introduce and validate a novel framework that can calibrate a model with high accuracies at multiple levels. In order to evaluate the performance of the calibration framework, we simulated HVAC-related energy consumption at the building level, at the ECM level and at the zone level. The simulation results were compared with the measured HVAC-related energy consumption. Our findings showed that MBE and CV (RMSE) were below 8.5% and 13.5%, respectively, for all three levels of energy simulation, demonstrating that the proposed framework could accurately simulate the building energy process at multiple levels. In addition, in order to estimate the potential energy efficiency improvements when different ECMs are implemented, the model has to be robust to the changes resulting from the building being operated under different control strategies. Mixed energy ground truths from two ECMs were used to calibrate the energy model. The results demonstrated that the model performed
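    The MBE and CV(RMSE) figures quoted above are standard calibration metrics for building energy models (e.g., as defined in ASHRAE Guideline 14); a minimal sketch of their computation on made-up monthly data follows.

```python
import numpy as np

# MBE and CV(RMSE) as commonly defined for building energy model calibration
# (e.g., ASHRAE Guideline 14). The monthly consumption data is made up.
measured  = np.array([120.0, 135.0, 128.0, 150.0, 142.0])   # kWh
simulated = np.array([118.0, 140.0, 125.0, 149.0, 146.0])   # kWh

mbe = 100.0 * (measured - simulated).sum() / measured.sum()
cv_rmse = 100.0 * np.sqrt(((measured - simulated) ** 2).mean()) / measured.mean()
print(f"MBE = {mbe:.2f}%, CV(RMSE) = {cv_rmse:.2f}%")
```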

  10. Implementation and performance of FDPS: A Framework Developing Parallel Particle Simulation Codes

    CERN Document Server

    Iwasawa, Masaki; Hosono, Natsuki; Nitadori, Keigo; Muranushi, Takayuki; Makino, Junichiro

    2016-01-01

    We have developed FDPS (Framework for Developing Particle Simulator), which enables researchers and programmers to develop high-performance parallel particle simulation codes easily. The basic idea of FDPS is to separate the program code for complex parallelization, including domain decomposition, redistribution of particles, and exchange of particle information for interaction calculation between nodes, from the actual interaction calculation and orbital integration. FDPS provides the former part and the users write the latter. Thus, a user can implement a high-performance, fully parallelized $N$-body code in only 120 lines. In this paper, we present the structure and implementation of FDPS, and describe its performance on three sample applications: disk galaxy simulation, cosmological simulation and giant impact simulation. All codes show very good parallel efficiency and scalability on the K computer and XC30. FDPS lets the researchers concentrate on the implementation of physics and mathematical schemes, without wa...

  11. Flexible simulation framework to couple processes in complex 3D models for subsurface utilization assessment

    Science.gov (United States)

    Kempka, Thomas; Nakaten, Benjamin; De Lucia, Marco; Nakaten, Natalie; Otto, Christopher; Pohl, Maik; Tillner, Elena; Kühn, Michael

    2016-04-01

    Utilization of the geological subsurface for production and storage of hydrocarbons, chemical energy and heat as well as for waste disposal requires the quantification and mitigation of environmental impacts as well as the improvement of georesources utilization in terms of efficiency and sustainability. The development of tools for coupled process simulations is essential to tackle these challenges, since reliable assessments are only feasible by integrative numerical computations. Coupled processes at reservoir to regional scale determine the behaviour of reservoirs, faults and caprocks, generally demanding complex 3D geological models to be considered, alongside available monitoring and experimental data, in coupled numerical simulations. We have been developing a flexible numerical simulation framework that provides efficient workflows for integrating the required data and software packages to carry out coupled process simulations considering, e.g., multiphase fluid flow, geomechanics, geochemistry and heat. Simulation results are stored in structured data formats to allow for integrated 3D visualization and result interpretation as well as data archiving and its provision to collaborators. The main benefits of using the flexible simulation framework are the integration of geological and grid data from any third-party software package as well as data export to generic 3D visualization tools and archiving formats. The coupling of the required process simulators in time and space is feasible, while different spatial dimensions can be integrated in the coupled simulations, e.g., 0D batch with 3D dynamic simulations. User interaction is established via high-level programming languages, while computational efficiency is achieved by using low-level programming languages. We present three case studies on the assessment of geological subsurface utilization based on different process coupling approaches and numerical simulations.

  12. A Dynamic Simulation Analysis of Currency Substitution in an Optimizing Framework with Transactions Costs

    Directory of Open Access Journals (Sweden)

    Carlos Asilis

    1992-03-01

    Full Text Available This paper investigates the dynamic paths of inflation and real balances in a general equilibrium intertemporal optimization model with transactions costs and currency substitution, when budget deficits are financed by money creation. The results show that inflationary paths show more 'jumps' or explosions under the assumptions of lower transactions costs or an increasing degree of currency substitution. Even small changes in the degree of currency substitution with positive transactions costs sharply change the paths of inflation and real balances. Similarly, small changes in transactions costs for foreign currency, even without prior currency substitution, have marked effects on the paths of inflation and real balances. The results obtained from the simulated data are consistent with inflation processes in recent Latin American experience, where currency substitution may have taken place. Estimates of the simulated data for even a small degree of currency substitution generate generalized autoregressive conditionally heteroskedastic (GARCH) estimates of the inflation process, which are consistent with estimates for Argentina, Bolivia, Mexico, and Peru. In these countries currency substitution may have gone hand-in-hand with inflationary instability through money-financed fiscal deficits. Our results suggest that fiscal deficits financed by monetary expansion should be avoided under conditions of increasing financial openness, which provide greater opportunities for financial adaptation through currency substitution or lower transactions costs on foreign currency accumulation.

  13. Microworlds, Simulators, and Simulation: Framework for a Benchmark of Human Reliability Data Sources

    Energy Technology Data Exchange (ETDEWEB)

    Ronald Boring; Dana Kelly; Carol Smidts; Ali Mosleh; Brian Dyre

    2012-06-01

    In this paper, we propose a method to improve the data basis of human reliability analysis (HRA) by extending the data sources used to inform HRA methods. Currently, most HRA methods are based on limited empirical data, and efforts to enhance the empirical basis behind HRA methods have not yet yielded significant new data. Part of the reason behind this shortage of quality data is attributable to the data sources used. Data have been derived from unrelated industries, from infrequent risk-significant events, or from costly control room simulator studies. We propose a benchmark of four data sources: a simplified microworld simulator using unskilled student operators, a full-scope control room simulator using skilled student operators, a full-scope control room simulator using licensed commercial operators, and a human performance modeling and simulation system using virtual operators. The goal of this research is to compare findings across the data sources to determine to what extent data may be used and generalized from cost effective sources.

  14. A Semantic Web Service and Simulation Framework to Intelligent Distributed Manufacturing

    Energy Technology Data Exchange (ETDEWEB)

    Son, Young Jun [University of Arizona; Kulvatunyou, Boonserm [ORNL; Cho, Hyunbo [POSTECH University, South Korea; Feng, Shaw [National Institute of Standards and Technology (NIST)

    2005-11-01

    To cope with today's fluctuating markets, a virtual enterprise (VE) concept can be employed to achieve the cooperation among independently operating enterprises. The success of VE depends on reliable interoperation among trading partners. This paper proposes a framework based on semantic web of manufacturing and simulation services to enable business and engineering collaborations between VE partners, particularly a design house and manufacturing suppliers.

  15. An Extendable Multi-Purpose Simulation and Optimization Framework for Thermal Problems in TCAD Applications

    OpenAIRE

    Holzer, S.; Wagner, M.; A Sheikholeslami; Karner, M.; Span, G.; Grasser, T.; Selberherr, S.

    2006-01-01

    We present the capabilities of our optimization framework in conjunction with typical applications for thermal problems. Our software package supports a wide range of simulators and optimization strategies to improve electronic devices in terms of speed, reliability and efficiency, and to reduce thermal degradation due to mechanical influences. Moreover, we show several optimization examples in which we succeeded in extracting electro-thermal material and process parameters. These new material paramet...

  16. Beyond illumination: An interactive simulation framework for non-visual and perceptual aspects of daylighting performance

    OpenAIRE

    Andersen, Marilyne; Guillemin, Antoine; Ámundadóttir, María Lovísa; Rockcastle, Siobhan Francois

    2013-01-01

    This paper presents a proof-of-concept for a goal-based simulation structure that could offer design support for daylighting performance aspects beyond conventional ones such as illumination, glare or solar gains. The framework uses a previously established visualization platform that simultaneously and interactively displays time-based daylighting performance alongside renderings, and relies on a goal-based approach. Two novel performance aspects are investigated in the present paper: health...

  17. SCENARIO ANALYSIS OF TECHNOLOGY PRODUCTS WITH AN AGENT-BASED SIMULATION AND DATA MINING FRAMEWORK

    OpenAIRE

    AMIT SHINDE; MOEED HAGHNEVIS; Janssen, Marco A.; GEORGE C. RUNGER; MANI JANAKIRAM

    2013-01-01

    A framework is presented to simulate and analyze the effect of multiple business scenarios on the adoption behavior of a group of technology products. Diffusion is viewed as an emergent phenomenon that results from the interaction of consumers. An agent-based model is used in which potential adopters of technology product are allowed to be influenced by their local interactions within the social network. Along with social influence, the effect of product features is important and we ascribe f...

  18. A dynamic subgrid-scale modeling framework for large eddy simulation using approximate deconvolution

    CERN Document Server

    Maulik, Romit

    2016-01-01

    We put forth a dynamic modeling framework for sub-grid parametrization of large eddy simulation of turbulent flows, based upon the use of the approximate deconvolution procedure to compute the Smagorinsky constant self-adaptively from the resolved flow quantities. Our numerical assessments for solving the Burgers turbulence problem show that the proposed approach could be used as a viable tool to address the turbulence closure problem, due to its flexibility.
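    For orientation, the quantity being closed here is the Smagorinsky eddy viscosity; the sketch below evaluates the static version nu_t = (Cs*dx)^2 |du/dx| on a toy 1D Burgers field. The paper's contribution, computing the constant dynamically via approximate deconvolution, is not reproduced, and Cs is simply fixed.

```python
import numpy as np

# Static Smagorinsky eddy viscosity nu_t = (Cs * dx)**2 * |du/dx| on a toy 1D
# Burgers field. The paper's dynamic step (computing Cs self-adaptively via
# approximate deconvolution) is not reproduced; Cs is simply fixed here.
nx = 256
dx = 2 * np.pi / nx
x = np.arange(nx) * dx
u = np.sin(x)                            # resolved velocity field (toy)
Cs = 0.17                                # fixed Smagorinsky constant

nu_t = (Cs * dx) ** 2 * np.abs(np.gradient(u, dx))   # eddy viscosity field
print("max eddy viscosity:", nu_t.max())
```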

  19. A Hierarchical Framework for Visualising and Simulating Supply Chains in Virtual Environments

    Institute of Scientific and Technical Information of China (English)

    Hai-Yan Zhang; Zheng-Xu Zhao

    2005-01-01

    This paper presents research into applying virtual environment (VE) technology to supply chain management (SCM). Our research work has employed virtual manufacturing environments to represent supply chain nodes to simulate processes and activities in supply chain management. This will enable those who are involved in these processes and activities to gain an intuitive understanding of them, so as to design robust supply chains and make correct decisions at the right time. A framework system and its hierarchical structure for visualising and simulating supply chains in virtual environments are reported and detailed in this paper.

  20. A software framework for the portable parallelization of particle-mesh simulations

    DEFF Research Database (Denmark)

    Sbalzarini, I.F.; Walther, Jens Honore; Polasek, B.;

    2006-01-01

    … wide range of applications, and it enables orders of magnitude increase in the number of computational elements employed in particle methods. We demonstrate the performance and scalability of the library on several problems, including the first-ever billion-particle simulation of diffusion in real … Abstract: We present a software framework for the transparent and portable parallelization of simulations using particle-mesh methods. Particles are used to transport physical properties and a mesh is required in order to reinitialize the distorted particle locations, ensuring the convergence of …

  1. Simulation and real-time optimal scheduling: a framework for integration

    Energy Technology Data Exchange (ETDEWEB)

    Macal, C.M.; Nevins, M.R. [Argonne National Lab., IL (United States); Williams, M.K.; Joines, J.C. [Military Traffic Management Command Transportation Engineering Agency, Newport News, VA (United States)

    1997-02-01

    Traditional scheduling and simulation models of the same system differ in several fundamental respects. These include the definition of a schedule, the existence of an objective function which orders schedules and indicates the performance of a given schedule according to specific criteria, and the level of fidelity at which the items are represented and processed through the system. This paper presents a conceptual, object-oriented architecture for combining a traditional, high-level scheduling system with a detailed, process-level, discrete-event simulation. A multi-echelon planning framework is established in the context of modeling end-to-end military deployments, with the focus on detailed seaport operations.

  2. Simulation-Based E-Learning Framework for Entrepreneurship Education and Training

    Directory of Open Access Journals (Sweden)

    Constanţa-Nicoleta Bodea

    2015-02-01

    Full Text Available The paper proposes an e-learning framework for entrepreneurship. The framework has three main components: one for identifying business opportunities, one for developing business scenarios, and one for risk analysis. A common database ensures the integration of the components. The main components of this framework are already available; the main challenge for those interested in using them is to design an integrated flow of activities adapted to their curricula and other educational settings. The originality of the approach is that the framework is domain-independent and uses advanced IT technologies, such as recommendation algorithms, agent-based simulations and extended graphical support. Using this e-learning framework, students can learn how to choose relevant characteristics/aspects for a type of business and how important each of them is according to specific criteria; how to set realistic values for different characteristics/aspects of the business; how a business scenario can be changed in order to fit the business context better; and how to assess/evaluate business scenarios.

  3. An Object-Oriented Framework for Versatile Finite Element Based Simulations of Neurostimulation

    Directory of Open Access Journals (Sweden)

    Edward T. Dougherty

    2016-01-01

    Full Text Available Computational simulations of transcranial electrical stimulation (TES) are commonly utilized by the neurostimulation community, and while vastly different TES application areas can be investigated, the mathematical equations and physiological characteristics that govern this research are identical. The goal of this work was to develop a robust software framework for TES that efficiently supports the spectrum of computational simulations routinely utilized by the TES community and in addition easily extends to support alternative neurostimulation research objectives. Using well-established object-oriented software engineering techniques, we have designed a software framework based upon the physical and computational aspects of TES. The framework’s versatility is demonstrated with a set of diverse neurostimulation simulations that (i) reinforce the importance of using anisotropic tissue conductivities, (ii) demonstrate the enhanced precision of high-definition stimulation electrodes, and (iii) highlight the benefits of utilizing multigrid solution algorithms. Our approaches result in a framework that facilitates rapid prototyping of real-world, customized TES administrations and supports virtually any clinical, biomedical, or computational aspect of this treatment. Software reuse and maintainability are optimized, and in addition, the same code can be effortlessly augmented to provide support for alternative neurostimulation research endeavors.

  4. A Framework for Simulation of Aircraft Flyover Noise Through a Non-Standard Atmosphere

    Science.gov (United States)

    Arntzen, Michael; Rizzi, Stephen A.; Visser, Hendrikus G.; Simons, Dick G.

    2012-01-01

    This paper describes a new framework for the simulation of aircraft flyover noise through a non-standard atmosphere. Central to the framework is a ray-tracing algorithm which defines multiple curved propagation paths, if the atmosphere allows, between the moving source and listener. Because each path has a different emission angle, synthesis of the sound at the source must be performed independently for each path. The time delay, spreading loss and absorption (ground and atmosphere) are integrated along each path, and applied to each synthesized aircraft noise source to simulate a flyover. A final step assigns each resulting signal to its corresponding receiver angle for the simulation of a flyover in a virtual reality environment. Spectrograms of the results from a straight path and a curved path modeling assumption are shown. When the aircraft is at close range, the straight path results are valid. Differences appear especially when the source is relatively far away at shallow elevation angles. These differences, however, are not significant in common sound metrics. While the framework used in this work performs off-line processing, it is conducive to real-time implementation.
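    The per-path bookkeeping described above reduces, for a straight path, to a retarded-time and spreading-loss computation; a toy sketch follows. The geometry is invented, the retarded time is approximated using the source position at reception time (a real solver iterates the implicit equation), and the curved-ray and absorption terms are omitted.

```python
import numpy as np

# Straight-path retarded time and 1/r spreading loss for a moving source; the
# paper's ray tracer generalizes this to curved paths. Geometry is invented,
# and the retarded time is approximated with the source position at reception
# time (an exact solver iterates the implicit retarded-time equation).
c0 = 340.0                               # speed of sound, m/s
listener = np.array([0.0, 0.0, 0.0])

for t in np.linspace(0.0, 10.0, 5):      # listener clock, s
    src = np.array([-500.0 + 100.0 * t, 0.0, 300.0])   # toy level flyover
    r = np.linalg.norm(src - listener)
    t_emit = t - r / c0                  # approximate emission time
    gain = 1.0 / r                       # spherical spreading loss
    print(f"t={t:4.1f} s  r={r:7.1f} m  emitted at {t_emit:6.3f} s  gain={gain:.5f}")
```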

  5. A 3D MPI-Parallel GPU-accelerated framework for simulating ocean wave energy converters

    Science.gov (United States)

    Pathak, Ashish; Raessi, Mehdi

    2015-11-01

    We present an MPI-parallel GPU-accelerated computational framework for studying the interaction between ocean waves and wave energy converters (WECs). The computational framework captures the viscous effects, nonlinear fluid-structure interaction (FSI), and breaking of waves around the structure, which cannot be captured in many potential flow solvers commonly used for WEC simulations. The full Navier-Stokes equations are solved using the two-step projection method, which is accelerated by porting the pressure Poisson equation to GPUs. The FSI is captured using the numerically stable fictitious domain method. A novel three-phase interface reconstruction algorithm is used to resolve three phases in a VOF-PLIC context. A consistent mass and momentum transport approach enables simulations at high density ratios. The accuracy of the overall framework is demonstrated via an array of test cases. Numerical simulations of the interaction between ocean waves and WECs are presented. Funding from the National Science Foundation CBET-1236462 grant is gratefully acknowledged.

  6. Ximpol: a new X-ray polarimetry observation-simulation and analysis framework

    Science.gov (United States)

    Baldini, Luca; Muleri, Fabio; Soffitta, Paolo; Omodei, Nicola; Pesce-Rollins, Melissa; Sgro, Carmelo; Latronico, Luca; Spada, Francesca; Manfreda, Alberto; Di Lalla, Niccolo

    2016-07-01

    We present a new simulation framework, ximpol, based on the Python programming language and the Scipy stack, specifically developed for X-ray polarimetric applications. ximpol is designed to produce fast and yet realistic observation simulations, given as basic inputs: (i) an arbitrary source model including morphological, temporal, spectral and polarimetric information, and (ii) the response functions of the detector under study, i.e., the effective area, the energy dispersion, the point-spread function and the modulation factor. The format of the response files is OGIP compliant, and the framework has the capability of producing output files that can be directly fed into the standard visualization and analysis tools used by the X-ray community, including XSPEC, which makes it a useful tool not only for simulating observations of astronomical sources, but also for developing and testing end-to-end analysis chains. In this contribution we give an overview of the basic architecture of the software. Although in principle the framework is not tied to any specific mission or instrument design, we present a few physically interesting case studies in the context of the XIPE mission phase study.

  7. A simulation framework for modeling tumor control probability in breast conserving therapy

    International Nuclear Information System (INIS)

    Background and purpose: Microscopic disease (MSD) left after tumorectomy is a major cause of local recurrence in breast conserving therapy (BCT). However, the effect of microscopic disease and RT dose on tumor control probability (TCP) has seldom been studied quantitatively. A simulation framework was therefore constructed to explore the relationship between tumor load, radiation dose and TCP. Materials and methods: First, we modeled the total disease load and microscopic spread with a pathology dataset. Then we estimated the remaining disease load after tumorectomy through surgery simulation. The Webb–Nahum TCP model was extended with a clonogenic cell fraction to model the risk of local recurrence. The model parameters were estimated by fitting the simulated results to the observations in two clinical trials. Results: Higher histopathology grade has a strong correlation with larger MSD cell quantity. On average 12.5% of the MSD cells remained in the patient’s breast after surgery, but this varied considerably among patients (0–100%), illustrating the role of radiotherapy. A small clonogenic cell fraction was optimal in our model (one in every 2.7 × 10⁶ cells). The mean radiosensitivity was estimated at 0.067 Gy⁻¹ with a standard deviation of 0.022 Gy⁻¹. Conclusion: A relationship between radiation dose and TCP was established in a newly designed simulation framework with detailed disease load, surgery and radiotherapy models.
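    A Poisson-statistics TCP with a normally distributed radiosensitivity, in the spirit of the Webb–Nahum model cited above, can be sketched in a few lines. The alpha mean and standard deviation below are the values reported in the record; the dose and clonogen number are invented for illustration.

```python
import numpy as np

# Poisson TCP with normally distributed radiosensitivity, in the spirit of
# the Webb-Nahum model. alpha mean/sd are the values reported above; the dose
# and the number of clonogens are invented for illustration.
rng = np.random.default_rng(4)
alpha_mean, alpha_sd = 0.067, 0.022      # Gy^-1
D = 50.0                                 # total dose, Gy (hypothetical)
n_clonogens = 200.0                      # remaining clonogens (hypothetical)

alphas = rng.normal(alpha_mean, alpha_sd, size=100000)
alphas = alphas[alphas > 0]              # discard unphysical negative draws
tcp = np.mean(np.exp(-n_clonogens * np.exp(-alphas * D)))
print(f"population-averaged TCP = {tcp:.3f}")
```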

  8. gadfly: A pandas-based Framework for Analyzing GADGET Simulation Data

    CERN Document Server

    Hummel, Jacob

    2016-01-01

    We present the first public release (v0.1) of the open-source GADGET Dataframe Library: gadfly. The aim of this package is to leverage the capabilities of the broader python scientific computing ecosystem by providing tools for analyzing simulation data from the astrophysical simulation codes GADGET and GIZMO using pandas, a thoroughly documented, open-source library providing high-performance, easy-to-use data structures that is quickly becoming the standard for data analysis in python. Gadfly is a framework for analyzing particle-based simulation data stored in the HDF5 format using pandas DataFrames. The package enables efficient memory management, includes utilities for unit handling, coordinate transformations, and parallel batch processing, and provides highly optimized routines for visualizing smoothed-particle hydrodynamics (SPH) datasets.
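    The general pattern gadfly builds on, pulling particle datasets from a GADGET/GIZMO HDF5 snapshot into a pandas DataFrame, can be sketched with plain h5py and pandas. This is not gadfly's own interface; the snapshot filename is a placeholder, though the PartType0 dataset names are the standard GADGET HDF5 layout.

```python
import h5py
import pandas as pd

# Pulling one particle type from a GADGET/GIZMO HDF5 snapshot into a pandas
# DataFrame: the general pattern gadfly builds on, written here with plain
# h5py + pandas rather than gadfly's own interface. The filename is a
# placeholder; PartType0/* is the standard GADGET HDF5 layout for gas.
with h5py.File("snapshot_000.hdf5", "r") as snap:
    coords = snap["PartType0/Coordinates"][:]
    density = snap["PartType0/Density"][:]

gas = pd.DataFrame(coords, columns=["x", "y", "z"])
gas["density"] = density
print(gas.describe())
```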

  9. A modular modelling framework for hypotheses testing in the simulation of urbanisation

    CERN Document Server

    Cottineau, Clementine; Chapron, Paul; Coyrehourcq, Sebastien Rey; Pumain, Denise

    2015-01-01

    In this paper, we present a modelling experiment developed to study systems of cities and processes of urbanisation in large territories over long time spans. Building on geographical theories of urban evolution, we rely on agent-based models to (1) formalise complementary and alternative hypotheses of urbanisation and (2) explore their ability to simulate observed patterns in a virtual laboratory. The paper is therefore divided into two sections: an overview of the mechanisms implemented to represent competing hypotheses used to simulate urban evolution; and an evaluation of the resulting model structures in their ability to simulate, efficiently and parsimoniously, a system of cities (the Former Soviet Union) over several periods of time (before and after the crash of the USSR). We do so using a modular framework of model-building and evolutionary algorithms for the calibration of several model structures. This project aims at tackling equifinality in systems dynamics by confronting different mechanisms wi...

  10. A Framework for Interactive Work Design based on Digital Work Analysis and Simulation

    CERN Document Server

    Ma, Liang; Fu, Huanzhang; Guo, Yang; Chablat, Damien; Bennis, Fouad; 10.1002/hfm.20178

    2010-01-01

    Due to the flexibility and adaptability of humans, manual handling work is still very important in industry, especially for assembly and maintenance work. Well-designed work operations can improve work efficiency and quality, enhance safety, and lower cost. Most traditional methods for work system analysis need physical mock-ups and are time-consuming. Digital mock-up (DMU) and digital human modeling (DHM) techniques have been developed to assist ergonomic design and evaluation for a specific worker population (e.g., 95th percentile); however, the operation adaptability and adjustability for a specific individual are not sufficiently considered. In this study, a new framework based on motion tracking and digital human simulation techniques is proposed for motion-time analysis of manual operations. A motion tracking system is used to track a worker's operation while he/she is conducting manual handling work. The motion data is transferred to a simulation computer for real-time digital human simulation. The data ...

  11. Lattice Boltzmann Simulations in the Slip and Transition Flow Regime with the Peano Framework

    KAUST Repository

    Neumann, Philipp

    2012-01-01

    We present simulation results of flows in the finite Knudsen range, which is in the slip and transition flow regime. Our implementations are based on the Lattice Boltzmann method and are accomplished within the Peano framework. We validate our code by solving two- and three-dimensional channel flow problems and compare our results with respective experiments from other research groups. We further apply our Lattice Boltzmann solver to the geometrical setup of a microreactor consisting of differently sized channels and a reactor chamber. Here, we apply static adaptive grids to further reduce computational costs. We further investigate the influence of using a simple BGK collision kernel in coarse grid regions which are further away from the slip boundaries. Our results are in good agreement with theory and non-adaptive simulations, demonstrating the validity and the capabilities of our adaptive simulation software for flow problems at finite Knudsen numbers.

  12. A Framework for Teaching Programming on the Internet: A Web-Based Simulation Approach

    Directory of Open Access Journals (Sweden)

    Yousif A. Bastaki

    2012-01-01

    Full Text Available Problem statement: This research study describes the process of developing a web-based framework for simulating programming language activities on the Internet, in an interactive way, by enabling executable programs to perform their function automatically. Approach: The interaction process is implemented using Java applets. It emphasizes the importance of building the web-based architecture of the proposed simulation model. Results: The research concentrates on developing programming courses on the Internet to contribute to the distribution of education for the benefit of learners. We emphasize introducing interactivity between the user and the programming environment. Conclusion: The project is in its first phase and still under development, but we hope that the design of the course and the interactivity that the Java applets provide by simulating the execution of C++ code will appeal to our users.

  13. Automated Object-Oriented Simulation Framework for Modelling of Superconducting Magnets at CERN

    CERN Document Server

    Maciejewski, Michał; Bartoszewicz, Andrzej

    The thesis aims at designing a flexible, extensible, user-friendly interface to model electro-thermal transients occurring in superconducting magnets. Simulations are a fundamental tool for assessing the performance of a magnet and its protection system against the effects of a quench. The application is created using a scalable and modular architecture based on the object-oriented programming paradigm, which opens an easy way for future extensions. Moreover, each model, composed of thousands of blocks, is automatically created in MATLAB/Simulink. Additionally, the user is able to automatically run sets of simulations with varying parameters. Due to its scalability and modularity the framework can be easily used to simulate a wide range of materials and magnet configurations.

  14. A higher-order numerical framework for stochastic simulation of chemical reaction systems.

    KAUST Repository

    Székely, Tamás

    2012-07-15

    BACKGROUND: In this paper, we present a framework for improving the accuracy of fixed-step methods for Monte Carlo simulation of discrete stochastic chemical kinetics. Stochasticity is ubiquitous in many areas of cell biology, for example in gene regulation, biochemical cascades and cell-cell interaction. However, most discrete stochastic simulation techniques are slow. We apply Richardson extrapolation to the moments of three fixed-step methods, the Euler, midpoint and θ-trapezoidal τ-leap methods, to demonstrate the power of stochastic extrapolation. The extrapolation framework can increase the order of convergence of any fixed-step discrete stochastic solver and is very easy to implement; the only condition for its use is knowledge of the appropriate terms of the global error expansion of the solver in terms of its stepsize. In practical terms, a higher-order method with a larger stepsize can achieve the same level of accuracy as a lower-order method with a smaller one, potentially reducing the computational time of the simulation. RESULTS: By obtaining a global error expansion for a general weak first-order method, we prove that extrapolation can increase the weak order of convergence for the moments of the Euler and the midpoint τ-leap methods, from one to two. This is supported by numerical simulations of several chemical systems of biological importance using the Euler, midpoint and θ-trapezoidal τ-leap methods. In almost all cases, extrapolation results in an improvement of accuracy. As in the case of ordinary and stochastic differential equations, extrapolation can be repeated to obtain even higher-order approximations. CONCLUSIONS: Extrapolation is a general framework for increasing the order of accuracy of any fixed-step stochastic solver. This enables the simulation of complicated systems in less time, allowing for more realistic biochemical problems to be solved.
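
    To make the extrapolation idea concrete, the sketch below applies it to the mean of an Euler τ-leap estimator for a single decay reaction; the model, step sizes and sample counts are illustrative choices, not the paper's benchmark systems.

```python
# Sketch: Richardson extrapolation of the mean of an Euler tau-leap
# estimator for the decay reaction A -> 0 with propensity c*A.
import numpy as np

rng = np.random.default_rng(0)

def euler_tau_leap(x0, rates, stoich, tau, n_steps):
    """Fixed-step Euler tau-leap for a simple reaction system."""
    x = np.array(x0, dtype=float)
    for _ in range(n_steps):
        a = rates(x)                     # propensities at the current state
        k = rng.poisson(a * tau)         # reaction firings in one step
        x = np.maximum(x + stoich.T @ k, 0)
    return x

c = 0.5
rates = lambda x: np.array([c * x[0]])
stoich = np.array([[-1]])                # one reaction, one species

def mean_estimate(tau, n_steps, n_paths=20000):
    return np.mean([euler_tau_leap([100.0], rates, stoich, tau, n_steps)[0]
                    for _ in range(n_paths)])

m_tau = mean_estimate(0.1, 10)
m_half = mean_estimate(0.05, 20)
extrapolated = 2 * m_half - m_tau        # cancels the O(tau) term in the weak error
print(extrapolated, 100 * np.exp(-c))    # compare with the exact mean
```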

  15. A generic framework to simulate realistic lung, liver and renal pathologies in CT imaging

    International Nuclear Information System (INIS)

    Realistic three-dimensional (3D) mathematical models of subtle lesions are essential for many computed tomography (CT) studies focused on performance evaluation and optimization. In this paper, we develop a generic mathematical framework that describes the 3D size, shape, contrast, and contrast-profile characteristics of a lesion, as well as a method to create lesion models based on CT data of real lesions. Further, we implemented a technique to insert the lesion models into CT images in order to create hybrid CT datasets. This framework was used to create a library of realistic lesion models and corresponding hybrid CT images. The goodness of fit of the models was assessed using the coefficient of determination (R²) and the visual appearance of the hybrid images was assessed with an observer study using images of both real and simulated lesions and receiver operating characteristic (ROC) analysis. The average R² of the lesion models was 0.80, implying that the models provide a good fit to real lesion data. The area under the ROC curve was 0.55, implying that the observers could not readily distinguish between real and simulated lesions. Therefore, we conclude that the lesion-modeling framework presented in this paper can be used to create realistic lesion models and hybrid CT images. These models could be instrumental in performance evaluation and optimization of novel CT systems. (paper)
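
    For reference, the goodness-of-fit measure quoted above can be computed as follows; the profile values are made-up placeholders.

```python
# Coefficient of determination between a measured lesion profile and
# its fitted model; the arrays are illustrative, not real CT data.
import numpy as np

def r_squared(observed, modeled):
    """R^2 = 1 - SS_res / SS_tot for paired observed/modeled values."""
    observed, modeled = np.asarray(observed), np.asarray(modeled)
    ss_res = np.sum((observed - modeled) ** 2)
    ss_tot = np.sum((observed - observed.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

# e.g. a lesion contrast profile and its fitted model
print(r_squared([10, 12, 15, 11], [10.5, 11.8, 14.2, 11.4]))
```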

  16. Evaluation of a performance appraisal framework for radiation therapists in planning and simulation

    International Nuclear Information System (INIS)

    Constantly evolving technology and techniques within radiation therapy require practitioners to maintain a continuous approach to professional development and training. Systems of performance appraisal and adoption of regular feedback mechanisms are vital to support this development, yet frequently lack structure and rely on informal peer support. A Radiation Therapy Performance Appraisal Framework (RT-PAF) for radiation therapists in planning and simulation was developed to define expectations of practice and promote a supportive and objective culture of performance and skills appraisal. Evaluation of the framework was conducted via an anonymous online survey tool. Nine peer reviewers and fourteen recipients provided feedback on its effectiveness and the challenges and limitations of the approach. Findings from the evaluation were positive and suggested that both groups gained benefit from and expressed a strong interest in embedding the approach more routinely. Respondents identified common challenges related to the limited ability to implement suggested development strategies; this was strongly associated with time and rostering issues. This framework successfully defined expectations for practice and provided a fair and objective feedback process that focussed on skills development. It empowered staff to maintain their skills and reach their professional potential. Management support, particularly in regard to provision of protected time, was highlighted as critical to the framework's ongoing success. The demonstrated benefits arising in terms of staff satisfaction and development highlight the importance of this commitment to the modern radiation therapy workforce.

  17. Evaluation of a performance appraisal framework for radiation therapists in planning and simulation

    Energy Technology Data Exchange (ETDEWEB)

    Becker, Jillian, E-mail: jillian.becker@health.qld.gov.au [Radiation Oncology Mater Centre, South Brisbane, Queensland (Australia)]; Bridge, Pete [School of Clinical Sciences, Queensland University of Technology, Brisbane, Queensland (Australia)]; Brown, Elizabeth; Lusk, Ryan; Ferrari-Anderson, Janet [Radiation Oncology, Princess Alexandra Hospital, Brisbane, Queensland (Australia); Radiation Oncology Mater Centre, South Brisbane, Queensland (Australia)]

    2015-06-15

    Constantly evolving technology and techniques within radiation therapy require practitioners to maintain a continuous approach to professional development and training. Systems of performance appraisal and adoption of regular feedback mechanisms are vital to support this development, yet frequently lack structure and rely on informal peer support. A Radiation Therapy Performance Appraisal Framework (RT-PAF) for radiation therapists in planning and simulation was developed to define expectations of practice and promote a supportive and objective culture of performance and skills appraisal. Evaluation of the framework was conducted via an anonymous online survey tool. Nine peer reviewers and fourteen recipients provided feedback on its effectiveness and the challenges and limitations of the approach. Findings from the evaluation were positive and suggested that both groups gained benefit from and expressed a strong interest in embedding the approach more routinely. Respondents identified common challenges related to the limited ability to implement suggested development strategies; this was strongly associated with time and rostering issues. This framework successfully defined expectations for practice and provided a fair and objective feedback process that focussed on skills development. It empowered staff to maintain their skills and reach their professional potential. Management support, particularly in regard to provision of protected time, was highlighted as critical to the framework's ongoing success. The demonstrated benefits arising in terms of staff satisfaction and development highlight the importance of this commitment to the modern radiation therapy workforce.

  18. A unified framework for spiking and gap-junction interactions in distributed neuronal network simulations

    Directory of Open Access Journals (Sweden)

    Jan Hahne

    2015-09-01

    Full Text Available Contemporary simulators for networks of point and few-compartment model neurons come with a plethora of ready-to-use neuron and synapse models and support complex network topologies. Recent technological advancements have broadened the spectrum of application further to the efficient simulation of brain-scale networks on supercomputers. In distributed network simulations the amount of spike data that accrues per millisecond and process is typically low, such that a common optimization strategy is to communicate spikes at relatively long intervals, where the upper limit is given by the shortest synaptic transmission delay in the network. This approach is well-suited for simulations that employ only chemical synapses but it has so far impeded the incorporation of gap-junction models, which require instantaneous neuronal interactions. Here, we present a numerical algorithm based on a waveform-relaxation technique which allows for network simulations with gap junctions in a way that is compatible with the delayed communication strategy. Using a reference implementation in the NEST simulator, we demonstrate that the algorithm and the required data structures can be smoothly integrated with existing code such that they complement the infrastructure for spiking connections. To show that the unified framework for gap-junction and spiking interactions achieves high performance and delivers high accuracy...

  19. An agent-based framework for fuel cycle simulation with recycling

    International Nuclear Information System (INIS)

    Simulation of the nuclear fuel cycle is an established field with multiple players. Prior development work has utilized techniques such as system dynamics to provide a solution structure for the matching of supply and demand in these simulations. In general, however, simulation infrastructure development has occurred in relatively closed circles, each effort having unique considerations as to the cases which are desired to be modeled. Accordingly, individual simulators tend to have their design decisions driven by specific use cases. Presented in this work is a proposed supply and demand matching algorithm that leverages the techniques of the well-studied field of mathematical programming. A generic approach is achieved by treating facilities as individual entities and actors in the supply-demand market which denote preferences amongst commodities. Using such a framework allows for varying levels of interaction fidelity, ranging from low-fidelity, quick solutions to high-fidelity solutions that model individual transactions (e.g. at the fuel-assembly level). The power of the technique is that it allows such flexibility while still treating the problem in a generic manner, encapsulating simulation engine design decisions in such a way that future simulation requirements can be relatively easily added when needed. (authors)
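
    The flavor of such preference-based supply/demand matching can be sketched as below; the greedy strategy and all names are simplifying assumptions, standing in for the mathematical-programming formulation the abstract describes.

```python
# Hypothetical sketch of preference-based matching between facility
# agents; not the simulator's actual algorithm or API.
from dataclasses import dataclass, field

@dataclass
class Supplier:
    name: str
    commodity: str
    capacity: float

@dataclass
class Consumer:
    name: str
    demand: float
    preferences: list = field(default_factory=list)  # commodities, best first

def match(suppliers, consumers):
    """Greedily satisfy each consumer from its most-preferred commodity."""
    trades = []
    for consumer in consumers:
        remaining = consumer.demand
        for commodity in consumer.preferences:
            for s in (s for s in suppliers if s.commodity == commodity):
                qty = min(remaining, s.capacity)
                if qty > 0:
                    trades.append((s.name, consumer.name, commodity, qty))
                    s.capacity -= qty
                    remaining -= qty
                if remaining <= 0:
                    break
            if remaining <= 0:
                break
    return trades

reactors = [Consumer("LWR-1", 20.0, ["MOX", "UOX"])]
mills = [Supplier("MOX-plant", "MOX", 5.0), Supplier("Enricher", "UOX", 50.0)]
print(match(mills, reactors))   # MOX first, remainder met with UOX
```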

  20. LUsim: A Framework for Simulation-Based Performance Modeling and Prediction of Parallel Sparse LU Factorization

    Energy Technology Data Exchange (ETDEWEB)

    Univ. of California, San Diego; Cicotti, Pietro; Li, Xiaoye Sherry; Baden, Scott B.

    2008-04-15

    Sparse parallel factorization is among the most complicated and irregular algorithms to analyze and optimize. Performance depends both on system characteristics such as the floating point rate, the memory hierarchy, and the interconnect performance, and on input matrix characteristics such as the number and location of nonzeros. We present LUsim, a simulation framework for modeling the performance of sparse LU factorization. Our framework uses micro-benchmarks to calibrate the parameters of machine characteristics and additional tools to facilitate real-time performance modeling. We are using LUsim to analyze an existing parallel sparse LU factorization code, and to explore a latency-tolerant variant. We developed and validated a model of the factorization in SuperLU_DIST, then we modeled and implemented a new variant of it, replacing a blocking collective communication phase with a non-blocking asynchronous point-to-point one. Our strategy realized a mean improvement of 11 percent over a suite of test matrices.

  1. The ADAQ framework: An integrated toolkit for data acquisition and analysis with real and simulated radiation detectors

    Science.gov (United States)

    Hartwig, Zachary S.

    2016-04-01

    The ADAQ framework is a collection of software tools that is designed to streamline the acquisition and analysis of radiation detector data produced in modern digital data acquisition (DAQ) systems and in Monte Carlo detector simulations. The purpose of the framework is to maximize user scientific productivity by minimizing the effort and expertise required to fully utilize radiation detectors in a variety of scientific and engineering disciplines. By using a single set of tools to span the real and simulation domains, the framework eliminates redundancy and provides an integrated workflow for high-fidelity comparison between experimental and simulated detector performance. Built on the ROOT data analysis framework, the core of the ADAQ framework is a set of C++ and Python libraries that enable high-level control of digital DAQ systems and detector simulations with data stored into standardized binary ROOT files for further analysis. Two graphical user interface programs utilize the libraries to create powerful tools: ADAQAcquisition handles control and readout of real-world DAQ systems and ADAQAnalysis provides data analysis and visualization methods for experimental and simulated data. At present, the ADAQ framework supports digital DAQ hardware from CAEN S.p.A. and detector simulations performed in Geant4; however, the modular design will facilitate future extension to other manufacturers and simulation platforms.

  2. GNU polyxmass: a software framework for mass spectrometric simulations of linear (bio-)polymeric analytes

    Directory of Open Access Journals (Sweden)

    Rusconi Filippo

    2006-04-01

    Full Text Available Abstract Background Nowadays, a variety of (bio-)polymers can be analyzed by mass spectrometry. The detailed interpretation of the spectra requires a huge number of "hypothesis cycles", comprising the following three actions: 1) put forth a structural hypothesis, 2) test it, 3) (in)validate it. This time-consuming and painstaking data scrutiny is alleviated by using specialized software tools. However, all the software tools available to date are polymer chemistry-specific. This imposes a heavy overhead on researchers who do mass spectrometry on a variety of (bio-)polymers, as each polymer type will require a different software tool to perform data simulations and analyses. We developed a software framework to address the lack of an integrated solution able to deal with different polymer chemistries. Results The GNU polyxmass software framework performs common (bio-)chemical simulations, along with simultaneous mass spectrometric calculations, for any kind of linear (bio-)polymeric analyte (DNA, RNA, saccharides or proteins). The framework is organized into three modules, all accessible from a single binary program. The modules let the user 1) define brand new polymer chemistries, 2) perform quick mass calculations using a desktop calculator paradigm, and 3) graphically edit polymer sequences and perform (bio-)chemical/mass spectrometric simulations. Any aspect of the mass calculations, polymer chemistry reactions or graphical polymer sequence editing is configurable. Conclusion The scientist who uses mass spectrometry to characterize (bio-)polymeric analytes of different chemistries is provided with a single software framework for their data prediction/analysis needs, whatever the polymer chemistry involved.

  3. An integrated simulation framework for the performance assessment of radioactive waste repositories

    International Nuclear Information System (INIS)

    Highlights: ► Integrated framework for a performance assessment of radioactive waste repositories. ► Use of Monte Carlo simulation for radionuclide migration at the repository scale. ► Numerical deterministic code for radionuclide migration at the geosphere scale. ► Application to a realistic case study. ► Advantage: modularity, i.e., interchangeability and interconnection. - Abstract: We present an integrated framework for a process-driven performance assessment of radioactive waste repositories. Key features of the proposed modeling strategy include: (1) the use of Monte Carlo-based simulation to model radionuclides migration at the repository scale, which allows simple management of realistic scenarios and (2) the adoption of a numerical code to provide realistic descriptions of the dynamics of radionuclide transport in natural groundwater bodies at the geosphere scale, from the release location to possible human intake occurrence. While repository-scale simulations are performed by the in-house code MASCOT, the subsequent groundwater flow and transport fields are depicted by means of the widely known and extensively used numerical codes MODFLOW and MT3DMS. An application to a realistic case study is presented to show the feasibility of the approach.

  4. Delphes, a framework for fast simulation of a general purpose LHC detector

    International Nuclear Information System (INIS)

    Knowing whether theoretical predictions are visible and measurable in a high energy experiment is always delicate, due to the complexity of the related detectors, DAQ chain and software. We introduce here a new framework, Delphes, for the fast simulation of a general purpose experiment. The simulation includes a tracking system embedded in a magnetic field, calorimetry, a muon system, and possible very forward detectors arranged along the beamline. The framework is interfaced to standard file formats from event generators (e.g. the Les Houches Event File) and outputs observable analysis data objects, like missing transverse energy and collections of electrons or jets. The simulation of the detector response takes into account the detector resolution, and usual reconstruction algorithms for complex objects, like FastJet. A simplified preselection can also be applied on processed data for trigger emulation. Detection of very forward scattered particles relies on transport in beamlines with the Hector software. Finally, the FROG 2D/3D event display is used for visualisation of the collision final states. An overview of Delphes is given, as well as a few use-cases for illustration.

  5. DDG4 A Simulation Framework based on the DD4hep Detector Description Toolkit

    Science.gov (United States)

    Frank, M.; Gaede, F.; Nikiforou, N.; Petric, M.; Sailer, A.

    2015-12-01

    The detector description is an essential component that has to be used to analyse and simulate data resulting from particle collisions in high energy physics experiments. Based on the DD4hep detector description toolkit, a flexible and data-driven simulation framework was designed using the Geant4 toolkit. We present this framework and describe the guiding requirements and the architectural design, which was strongly driven by ease of use. The goal was, given an existing detector description, to simulate the detector response to particle collisions in high energy physics experiments with minimal effort, but without imposing restrictions that would preclude enhanced or improved behaviour. Starting from the ROOT-based geometry implementation used by DD4hep, an automatic conversion mechanism to Geant4 was developed. The physics response and the mechanism to input particle data from generators were highly formalized and can be instantiated on demand using known factory patterns. A palette of components to model the detector response is provided by default, but improved or more sophisticated components may easily be added using the factory pattern. Only the final configuration of the instantiated components has to be provided by end-users, using either C++ or Python scripting or an XML-based description.
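
    The instantiate-on-demand factory pattern mentioned above can be illustrated with a small registry sketch; the component names are hypothetical and this is not DDG4's actual API.

```python
# Minimal registry/factory sketch of instantiating components on demand
# by name; all names here are invented for illustration.
_registry = {}

def component(name):
    """Decorator registering a component class under a string key."""
    def register(cls):
        _registry[name] = cls
        return cls
    return register

def create(name, **config):
    """Instantiate a registered component from its configuration."""
    return _registry[name](**config)

@component("Geant4ScintillatorResponse")
class ScintillatorResponse:
    def __init__(self, light_yield=10000):
        self.light_yield = light_yield

# End users pick components by name in a configuration file or script:
response = create("Geant4ScintillatorResponse", light_yield=8000)
```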

  6. The Application of Modeling and Simulation in Capacity Management within the ITIL Framework

    Science.gov (United States)

    Rahmani, Sonya; vonderHoff, Otto

    2010-01-01

    Tightly integrating modeling and simulation techniques into Information Technology Infrastructure Library (ITIL) practices can be one of the driving factors behind a successful and cost-effective capacity management effort for any Information Technology (IT) system. ITIL is a best practices framework for managing IT infrastructure, development and operations. Translating ITIL theory into operational reality can be a challenge. This paper aims to highlight how to best integrate modeling and simulation into an ITIL implementation. For cases where the project team initially has difficulty gaining consensus on investing in modeling and simulation resources, a clear definition for M&S implementation into the ITIL framework, specifically its role in supporting Capacity Management, is critical to gaining the support required to garner these resources. This implementation should also help to clearly define M&S support to the overall system mission. This paper will describe the development of an integrated modeling approach and how best to tie M&S to definitive goals for evaluating system capacity and performance requirements. Specifically, the paper will discuss best practices for implementing modeling and simulation into ITIL. These practices hinge on implementing integrated M&S methods that 1) encompass two or more predictive modeling techniques, 2) complement one another's respective strengths and weaknesses to support the validation of predicted results, and 3) are tied to the system's performance and workload monitoring efforts. Structuring two forms of modeling, statistical and simulation, in the development of "As Is" and "To Be" efforts will be used to exemplify the integrated M&S methods. The paper will show how these methods can better support the project's overall capacity management efforts.

  7. Towards a unified framework for coarse-graining particle-based simulations.

    Energy Technology Data Exchange (ETDEWEB)

    Junghans, Christoph [Los Alamos National Laboratory]

    2012-06-28

    Different coarse-graining techniques for soft matter systems have been developed in recent years; however, it is often very demanding to find the method most suitable for the problem studied. For this reason we began to develop the VOTCA toolkit to allow for easy comparison of different methods. We have incorporated six different techniques into the package and implemented a powerful and parallel analysis framework plus multiple simulation back-ends. We will discuss the specifics of the package by means of various studies which have been performed with the toolkit, and highlight problems we encountered along the way.

  8. Modeling framework and associated simulation tools for the prediction of damage tolerance of CMC

    Directory of Open Access Journals (Sweden)

    Baranger Emmanuel

    2015-01-01

    Full Text Available A modeling framework and the associated simulation tools are presented for the prediction of the mechanical behaviour and lifetime of CMCs with a self-healing matrix. A macroscopic anisotropic damage model enriched with micro-scale information has been developed, identified and validated using experimental information at different levels, including reaction kinetics, fiber failure probability and macroscopic mechanical behaviour. Since the computational cost associated with the proposed model is quite high, a method has been set up to automatically construct reduced numerical constitutive laws. An illustration is given, as well as the associated error estimation.

  9. Development of a Dynamically Configurable, Object-Oriented Framework for Distributed, Multi-modal Computational Aerospace Systems Simulation

    Science.gov (United States)

    Afjeh, Abdollah A.; Reed, John A.

    2003-01-01

    The following reports are presented on this project: a first-year progress report on Development of a Dynamically Configurable, Object-Oriented Framework for Distributed, Multi-modal Computational Aerospace Systems Simulation; a second-year progress report on Development of a Dynamically Configurable, Object-Oriented Framework for Distributed, Multi-modal Computational Aerospace Systems Simulation; An Extensible, Interchangeable and Sharable Database Model for Improving Multidisciplinary Aircraft Design; Interactive, Secure Web-enabled Aircraft Engine Simulation Using XML Databinding Integration; and Improving the Aircraft Design Process Using Web-based Modeling and Simulation.

  10. A proposed simulation optimization model framework for emergency department problems in public hospital

    Science.gov (United States)

    Ibrahim, Ireen Munira; Liong, Choong-Yeun; Bakar, Sakhinah Abu; Ahmad, Norazura; Najmuddin, Ahmad Farid

    2015-12-01

    The Emergency Department (ED) is a very complex system with limited resources to support increases in demand. ED services are considered to be of good quality if they can meet patients' expectations. Long waiting times and lengths of stay are the main problems faced by the management. ED management should place greater emphasis on resource capacity in order to increase the quality of services and thereby patient satisfaction. This paper reviews work in progress of a study being conducted in a government hospital in Selangor, Malaysia. It proposes a simulation optimization model framework for studying ED operations and problems, as well as for finding an optimal solution to those problems. It is hoped that the integration of simulation and optimization can assist management in the decision-making process regarding resource capacity planning in order to improve current and future ED operations.

  11. GNSSim: An Open Source GNSS/GPS Framework for Unmanned Aerial Vehicular Network Simulation

    Directory of Open Access Journals (Sweden)

    Farha Jahan

    2015-08-01

    Full Text Available Unmanned systems are of great importance in accomplishing tasks where human lives are at risk. These systems are being deployed in tasks that are time-consuming, expensive or inconclusive if accomplished by human intervention. Design, development and testing of such vehicles using actual hardware could be quite costly and dangerous. Another issue is the limited outdoor usage permitted by Federal Aviation Administration regulations, which makes outdoor testing difficult. An optimal solution to this problem is to have a simulation environment where different operational scenarios, newly developed models, etc., can be studied. In this paper, we propose GNSSim, a Global Navigation Satellite System (GNSS) simulation framework. We demonstrate its effectiveness by integrating it with UAVSim. This allows users to experiment easily by adjusting different satellite as well as UAV parameters. Related tests and evidence of the correctness of the implementation are presented.

  12. A multi-scale friction model framework for full scale sheet forming simulations

    Science.gov (United States)

    Hol, J.; Meinders, T.; Huétink, J.

    2011-05-01

    In this paper a numerical framework is proposed which accounts for the most important friction mechanisms. Static flattening and flattening due to bulk strain are accounted for by theoretical models on a microscale. Based on statistical parameters a fast and efficient translation from micro- to macro modeling is included. A general overview of the friction model is presented and the translation from micro to macro modeling is outlined. The development of real area of contact is described by the flattening models and the effect of ploughing and adhesion on the coefficient of friction is described by a micro-scale friction model. A brief theoretical background of these models is given. The flattening models are validated by means of FE simulations on microscale and the feasibility of the advanced macroscopic friction model is proven by a full scale sheet metal forming simulation.

  13. Simulating collisions of thick nuclei in the color glass condensate framework

    Science.gov (United States)

    Gelfand, Daniil; Ipp, Andreas; Müller, David

    2016-07-01

    We present our work on the simulation of the early stages of heavy-ion collisions with finite longitudinal thickness in the laboratory frame in 3+1 dimensions. In particular we study the effects of nuclear thickness on the production of a glasma state in the McLerran-Venugopalan model within the color glass condensate framework. A finite thickness enables us to describe nuclei at lower energies, but forces us to abandon boost invariance. As a consequence, random classical color sources within the nuclei have to be included in the simulation, which is achieved by using the colored particle-in-cell method. We show that the description in the laboratory frame agrees with boost-invariant approaches as a limiting case. Furthermore we investigate collisions beyond boost invariance, in particular the pressure anisotropy in the glasma.

  14. Simulating collisions of thick nuclei in the color glass condensate framework

    CERN Document Server

    Gelfand, Daniil; Müller, David

    2016-01-01

    We present our work on the simulation of the early stages of heavy-ion collisions with finite longitudinal thickness in the laboratory frame in 3+1 dimensions. In particular we study the effects of nuclear thickness on the production of a glasma state in the McLerran-Venugopalan model within the color glass condensate framework. A finite thickness enables us to describe nuclei at lower energies, but forces us to abandon boost-invariance. As a consequence, random classical color sources within the nuclei have to be included in the simulation, which is achieved by using the colored particle-in-cell (CPIC) method. We show that the description in the laboratory frame agrees with boost-invariant approaches as a limiting case. Furthermore we investigate collisions beyond boost-invariance, in particular the pressure anisotropy in the glasma.

  15. DELPHES 3, A modular framework for fast simulation of a generic collider experiment

    CERN Document Server

    de Favereau, J; Demin, P; Giammanco, A; Lemaître, V; Mertens, A; Selvaggi, M

    2013-01-01

    The version 3.0 of the DELPHES fast-simulation framework is presented. The tool is written in C++ and is interfaced with the most common Monte Carlo file formats. Its goal is the simulation of a multipurpose detector that includes a track propagation system embedded in a magnetic field, electromagnetic and hadronic calorimeters, and a muon identification system. The new modular design makes it easy to produce the collections that are needed for later analysis, from low-level objects such as tracks and calorimeter deposits up to high-level collections such as isolated electrons, jets, taus, and missing energy. New features such as pile-up and improved algorithms such as the particle-flow reconstruction approach have also been implemented.

  16. Using a New Event-Based Simulation Framework for Investigating Resource Provisioning in Clouds

    Directory of Open Access Journals (Sweden)

    Simon Ostermann

    2011-01-01

    Full Text Available Today, Cloud computing proposes an attractive alternative to building large-scale distributed computing environments by which resources are no longer hosted by the scientists' computational facilities, but leased from specialised data centres only when and for how long they are needed. This new class of Cloud resources raises new interesting research questions in the fields of resource management, scheduling, fault tolerance, or quality of service, requiring hundreds to thousands of experiments for finding valid solutions. To enable such research, a scalable simulation framework is typically required for early prototyping, extensive testing and validation of results before the real deployment is performed. The scope of this paper is twofold. In the first part we present GroudSim, a Grid and Cloud simulation toolkit for scientific computing based on a scalable simulation-independent discrete-event engine. GroudSim provides a comprehensive set of features for complex simulation scenarios from simple job executions on leased computing resources to file transfers, calculation of costs and background load on resources. Simulations can be parameterised and are easily extendable by probability distribution packages for failures which normally occur in complex distributed environments. Experimental results demonstrate the improved scalability of GroudSim compared to a related process-based simulation approach. In the second part, we show the use of the GroudSim simulator to analyse the problem of dynamic provisioning of Cloud resources to scientific workflows that do not benefit from sufficient Grid resources as required by their computational demands. We propose and study four strategies for provisioning and releasing Cloud resources that take into account the general leasing model encountered in today's commercial Cloud environments based on resource bulks, fuzzy descriptions and hourly payment intervals. We study the impact of our techniques to the
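
    The effect of the hourly payment intervals mentioned above can be sketched with a toy cost comparison between two provisioning policies; this is not GroudSim code, and the function names, rates and task times are invented for illustration.

```python
# Toy comparison of two Cloud provisioning policies under whole-hour
# billing, in the spirit of the strategies studied; all numbers invented.
import math

def provision_cost(task_runtimes, instance_rate=0.10):
    """Lease one instance per task; each lease is billed in whole hours."""
    return sum(math.ceil(t / 3600.0) * instance_rate for t in task_runtimes)

def packed_cost(task_runtimes, instance_rate=0.10):
    """Reuse a single leased instance back-to-back before releasing it."""
    return math.ceil(sum(task_runtimes) / 3600.0) * instance_rate

tasks = [1800, 1200, 2400]          # seconds of compute per workflow task
print(provision_cost(tasks))        # 3 instance-hours -> 0.30
print(packed_cost(tasks))           # 2 instance-hours -> 0.20
```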

  17. A multi-fidelity framework for physics based rotor blade simulation and optimization

    Science.gov (United States)

    Collins, Kyle Brian

    with lower fidelity models. This thesis documents the development of automated low and high fidelity physics based rotor simulation frameworks. The low fidelity framework uses a comprehensive code with simplified aerodynamics. The high fidelity model uses a parallel processor capable CFD/CSD methodology. Both low and high fidelity frameworks include an aeroacoustic simulation for prediction of noise. A synergistic process is developed that uses both the low and high fidelity frameworks together to build approximate models of important high fidelity metrics as functions of certain design variables. To test the process, a 4-bladed hingeless rotor model is used as a baseline. The design variables investigated include tip geometry and spanwise twist distribution. Approximation models are built for metrics related to rotor efficiency and vibration using the results from 60+ high fidelity (CFD/CSD) experiments and 400+ low fidelity experiments. Optimization using the approximation models found the Pareto Frontier anchor points, or the design having maximum rotor efficiency and the design having minimum vibration. Various Pareto generation methods are used to find designs on the frontier between these two anchor designs. When tested in the high fidelity framework, the Pareto anchor designs are shown to be very good designs when compared with other designs from the high fidelity database. This provides evidence that the process proposed has merit. Ultimately, this process can be utilized by industry rotor designers with their existing tools to bring high fidelity analysis into the preliminary design stage of rotors. In conclusion, the methods developed and documented in this thesis have made several novel contributions. First, an automated high fidelity CFD based forward flight simulation framework has been built for use in preliminary design optimization. The framework was built around an integrated, parallel processor capable CFD/CSD/AA process. Second, a novel method of

  18. Neutronics Code Development at Argonne National Laboratory

    International Nuclear Information System (INIS)

    As part of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program of U.S. DOE, a suite of modern fast reactor simulation tools is being developed at Argonne National Laboratory. The general goal is to reduce the uncertainties and biases in various areas of reactor design activities by providing enhanced prediction capabilities. Under this fast reactor simulation program, a high-fidelity deterministic neutron transport code named UNIC is being developed. The end goal of this development is to produce an integrated neutronics code that enables the high fidelity description of a nuclear reactor and simplifies the multi-step design process by direct and accurate coupling with thermal-hydraulics and structural mechanics calculations. (author)

  19. Self-consistent simulation of the RF sheath boundary condition in the BOUT++ framework

    Science.gov (United States)

    Gui, Bin; Xu, Xueqiao; Xia, Tianyang

    2015-11-01

    The effect of the RF sheath boundary condition on edge-localized modes (ELMs) and turbulent transport is simulated in this work. The work includes two parts. The first part calculates the equilibrium radial electric field with the RF sheath boundary condition. It is known that the thermal sheath or the rectified RF sheath modifies the potential in the scrape-off layer (SOL) region, and the modified potential induces additional shear flow in the SOL. In this part, the equilibrium radial electric field across the separatrix is calculated by solving the 2D current continuity equation with the sheath boundary condition, drifts and viscosity. The second part applies the sheath boundary condition to the perturbed variables of the six-field two-fluid model in the BOUT++ framework, which simulates ELMs and turbulent transport; the aim is to capture the effect of the sheath boundary condition on turbulent transport. It is found that the sheath boundary acts as a sink in the plasma and suppresses local perturbations. Based on these two parts, the effect of the RF sheath boundary condition on ELMs and turbulent transport can be self-consistently simulated. Prepared by LLNL under Contract DE-AC52-07NA27344.

  20. Global Simulation of Bioenergy Crop Productivity: Analytical Framework and Case Study for Switchgrass

    Energy Technology Data Exchange (ETDEWEB)

    Nair, S. Surendran [University of Tennessee, Knoxville (UTK)]; Nichols, Jeff A. [ORNL]; Post, Wilfred M. [ORNL]; Wang, Dali [ORNL]; Wullschleger, Stan D. [ORNL]; Kline, Keith L. [ORNL]; Wei, Yaxing [ORNL]; Singh, Nagendra [ORNL]; Kang, Shujiang [ORNL]

    2014-01-01

    Contemporary global assessments of the deployment potential and sustainability aspects of biofuel crops lack quantitative details. This paper describes an analytical framework capable of meeting the challenges associated with global-scale agro-ecosystem modeling. We designed a modeling platform for bioenergy crops consisting of five major components: (i) standardized global natural resources and management data sets, (ii) global simulation units and management scenarios, (iii) model calibration and validation, (iv) high-performance computing (HPC) modeling, and (v) simulation output processing and analysis. A case study with the HPC-Environmental Policy Integrated Climate model (HPC-EPIC), simulating a perennial bioenergy crop, switchgrass (Panicum virgatum L.), and a global biomass feedstock analysis on grassland demonstrates the application of this platform. The results illustrate the biomass feedstock variability of switchgrass and provide insights on how the modeling platform can be expanded to better assess sustainable production criteria and other biomass crops. Feedstock potentials on global grasslands and within different countries are also shown. Future efforts involve developing databases of productivity, implementing global simulations for other bioenergy crops (e.g., miscanthus, energy cane and agave), and assessing environmental impacts under various management regimes. We anticipate that this platform will provide an exemplary tool and assessment data for international communities to conduct global analyses of biofuel biomass feedstocks and sustainability.

  1. Global Simulation of Bioenergy Crop Productivity: Analytical Framework and Case Study for Switchgrass

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Shujiang [ORNL]; Kline, Keith L. [ORNL]; Nair, S. Surendran [University of Tennessee, Knoxville (UTK)]; Nichols, Jeff A. [ORNL]; Post, Wilfred M. [ORNL]; Brandt, Craig C. [ORNL]; Wullschleger, Stan D. [ORNL]; Wei, Yaxing [ORNL]; Singh, Nagendra [ORNL]

    2013-01-01

    A global energy crop productivity model that provides geospatially explicit quantitative details on biomass potential and factors affecting sustainability would be useful, but such a model does not yet exist. This study describes a modeling platform capable of meeting many challenges associated with global-scale agro-ecosystem modeling. We designed an analytical framework for bioenergy crops consisting of six major components: (i) standardized natural resources datasets, (ii) global field-trial data and crop management practices, (iii) simulation units and management scenarios, (iv) model calibration and validation, (v) high-performance computing (HPC) simulation, and (vi) simulation output processing and analysis. The HPC-Environmental Policy Integrated Climate (HPC-EPIC) model simulated a perennial bioenergy crop, switchgrass (Panicum virgatum L.), estimating feedstock production potentials and effects across the globe. This modeling platform can assess soil C sequestration, net greenhouse gas (GHG) emissions, nonpoint source pollution (e.g., nutrient and pesticide loss), and energy exchange with the atmosphere. It can be expanded to include additional bioenergy crops (e.g., miscanthus, energy cane, and agave) and food crops under different management scenarios. The platform and switchgrass field-trial dataset are available to support global analysis of biomass feedstock production potential and corresponding metrics of sustainability.

  2. A GIS/Simulation Framework for Assessing Change in Water Yield over Large Spatial Scales

    Energy Technology Data Exchange (ETDEWEB)

    Graham, R.; Hargrove, W.W.; Huff, D.D.; Nikolov, N.; Tharp, M.L.

    1999-11-13

    Recent legislation to initiate vegetation management in the Central Sierra hydrologic region of California includes a focus on corresponding changes in water yield. This served as the impetus for developing a combined geographic information system (GIS) and simulation assessment framework. Using the existing vegetation density condition, together with proposed rules for thinning to reduce fire risk, a set of simulation model inputs was generated for examining the impact of the thinning scenario on water yield. The approach allows results to be expressed as the mean and standard deviation of the change in water yield for each 1 km² map cell that is treated. Values for groups of cells are aggregated for typical watershed units using area-weighted averaging. Wet, dry and average precipitation years were simulated over a large region. Where snow plays an important role in hydrologic processes, the simulated change in water yield was less than 0.5% of the expected annual runoff for a typical watershed. Such small changes would be undetectable in the field using conventional stream flow analysis. These results suggest that it will be difficult to use water-yield increases to justify forest-thinning activities or to offset their cost.
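
    The area-weighted aggregation step described above amounts to the following; cell values and areas are made up for illustration.

```python
# Area-weighted mean of per-cell water-yield changes over a watershed;
# the numbers are placeholders, not results from the study.
def area_weighted_mean(values, areas):
    """Weight each cell's value by its area and normalize by total area."""
    total_area = sum(areas)
    return sum(v * a for v, a in zip(values, areas)) / total_area

yield_change_mm = [0.4, 0.6, 0.2]   # mean change per treated map cell
cell_area_km2 = [1.0, 1.0, 0.5]     # edge cells may be partial
print(area_weighted_mean(yield_change_mm, cell_area_km2))
```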

  3. A simulation model of MAPS for the FairRoot framework

    Energy Technology Data Exchange (ETDEWEB)

    Amar-Youcef, Samir; Linnik, Benjamin; Sitzmann, Philipp [Goethe-Universitaet Frankfurt (Germany)]; Collaboration: CBM-MVD-Collaboration

    2014-07-01

    CMOS MAPS are the sensors of choice for the MVD of the CBM experiment at the FAIR facility. They offer a unique combination of features required for the CBM detector, such as low material budget, spatial resolution, radiation tolerance and yet sufficient read-out speed. The physics performance of various designs of the MVD integrated into the CBM detector system is evaluated in the CBM-/FairRoot simulation framework. In this context, algorithms are developed to simulate the realistic detector response and to optimize feature extraction from the sensor information. The objective of the sensor response model is to provide a fast and realistic pixel response for a given track energy loss and position. In addition, we discuss aspects of simulating event pile-up and dataflow in the context of the CBM FLES event extraction and selection concept. This is of particular importance for the MVD, since the sensors feature a comparatively long integration time and a frame-wise read-out. All other detector systems operate with un-triggered front-end electronics and stream time-stamped data freely to the FLES. Because of the large data rates, event extraction is performed via distributed networking on a large HPC compute farm. We present an overview and status of the MVD software developments, focusing on the integration of the system into a free-flowing read-out system and on the concurrent application to simulated and real data.

  4. A simulation model of MAPS for the FairRoot framework

    International Nuclear Information System (INIS)

    CMOS MAPS are the sensors of choice for the MVD of the CBM experiment at the FAIR facility. They offer a unique combination of features required for the CBM detector, such as low material budget, spatial resolution, radiation tolerance and yet sufficient read-out speed. The physics performance of various designs of the MVD integrated into the CBM detector system is evaluated in the CBM-/FairRoot simulation framework. In this context, algorithms are developed to simulate the realistic detector response and to optimize feature extraction from the sensor information. The objective of the sensor response model is to provide a fast and realistic pixel response for a given track energy loss and position. In addition, we discuss aspects of simulating event pile-up and dataflow in the context of the CBM FLES event extraction and selection concept. This is of particular importance for the MVD, since the sensors feature a comparatively long integration time and a frame-wise read-out. All other detector systems operate with un-triggered front-end electronics and stream time-stamped data freely to the FLES. Because of the large data rates, event extraction is performed via distributed networking on a large HPC compute farm. We present an overview and status of the MVD software developments, focusing on the integration of the system into a free-flowing read-out system and on the concurrent application to simulated and real data.

  5. URDME: a modular framework for stochastic simulation of reaction-transport processes in complex geometries

    Directory of Open Access Journals (Sweden)

    Drawert Brian

    2012-06-01

    Full Text Available Abstract Background Experiments in silico using stochastic reaction-diffusion models have emerged as an important tool in molecular systems biology. Designing computational software for such applications poses several challenges. Firstly, realistic lattice-based modeling for biological applications requires a consistent way of handling complex geometries, including curved inner and outer boundaries. Secondly, spatiotemporal stochastic simulations are computationally expensive due to the fast time scales of individual reaction and diffusion events when compared to the biological phenomena of actual interest. We therefore argue that simulation software needs to be computationally efficient, employing sophisticated algorithms, yet at the same time flexible in order to meet present and future needs of increasingly complex biological modeling. Results We have developed URDME, a flexible software framework for general stochastic reaction-transport modeling and simulation. URDME uses Unstructured triangular and tetrahedral meshes to resolve general geometries, and relies on the Reaction-Diffusion Master Equation formalism to model the processes under study. An interface to a mature external geometry and mesh handling software (Comsol Multiphysics) provides a stable and interactive environment for model construction. The core simulation routines are logically separated from the model building interface and written in a low-level language for computational efficiency. The connection to the geometry handling software is realized via a Matlab interface which facilitates script computing, data management, and post-processing. For practitioners, the software therefore behaves much as an interactive Matlab toolbox. At the same time, it is possible to modify and extend URDME with newly developed simulation routines. Since the overall design effectively hides the complexity of managing the geometry and meshes, this means that newly developed methods

  6. Final Report for Project "Framework Application for Core-Edge Transport Simulations (FACETS)"

    Energy Technology Data Exchange (ETDEWEB)

    Estep, Donald [Colorado State University]

    2014-01-17

    This is the final report for the Colorado State University component of the FACETS project. FACETS was focused on the development of a multiphysics, parallel framework application that could provide the capability to enable whole-device fusion reactor modeling and, in the process, the development of the modeling infrastructure and computational understanding needed for ITER. It was intended that FACETS be highly flexible, through the use of modern computational methods, including component technology and object-oriented design, to facilitate switching from one model to another for a given aspect of the physics, and to make it possible to use simplified models for rapid turnaround or high-fidelity models that take advantage of the largest supercomputer hardware. FACETS was designed in a heterogeneous parallel context, where different parts of the application can take advantage of parallelism based on task farming, domain decomposition, and/or pipelining as needed and applicable. As with all fusion simulations, an integral part of the FACETS project was treatment of the coupling of different physical processes at different scales interacting closely. A primary example for the FACETS project is the coupling of existing core and edge simulations, with the transport and wall interactions described by reduced models. However, core and edge simulations themselves involve significant coupling of different processes with large scale differences. Numerical treatment of coupling is impacted by a number of factors including scale differences, the form of information transferred between processes, the implementation of solvers for different codes, and high-performance computing concerns. Operator decomposition, in which the individual processes are computed separately using appropriate simulation codes and the component simulations are then linked/synchronized at regular points in space and time, is the de facto approach to high-performance simulation of multiphysics

  7. Developing a Conceptual Framework for Simulation Analysis in a Supply Chain Based on Common Platform (SCBCP

    Directory of Open Access Journals (Sweden)

    M. Fathollah

    2009-08-01

    Full Text Available As a competitive advantage in modern organizations, product diversification may cause complexities in today's extended supply chains. However, the Common Platform (CP) strategy, as a concept of gaining maximum variety with minimum production elements, is believed to be one of the answers to eliminate or decrease these complexities. The main purpose of this paper is to provide a simulation framework for modeling the supply network of a case study in the automotive industry in order to study the impacts of part commonality through the chain. The electrical wiring harness is selected as the main part to be studied, according to the essentiality and challenges of its procurement for the production of cars (as occurred in this case and many other studies). The paper does not provide the simulation results; rather, it builds up the required foundation and gathers the relevant content to develop a realistic simulation model by closely studying the impacts of part multiplicity on different functional areas of the selected supply network and extracting the critical success factors of applying part commonality.

  8. GeNN: a code generation framework for accelerated brain simulations.

    Science.gov (United States)

    Yavuz, Esin; Turner, James; Nowotny, Thomas

    2016-01-01

    Large-scale numerical simulations of detailed brain circuit models are important for identifying hypotheses on brain functions and testing their consistency and plausibility. An ongoing challenge for simulating realistic models is, however, computational speed. In this paper, we present the GeNN (GPU-enhanced Neuronal Networks) framework, which aims to facilitate the use of graphics accelerators for computational models of large-scale neuronal networks to address this challenge. GeNN is an open source library that generates code to accelerate the execution of network simulations on NVIDIA GPUs, through a flexible and extensible interface, which does not require in-depth technical knowledge from the users. We present performance benchmarks showing that a 200-fold speedup compared to a single core of a CPU can be achieved for a network of one million conductance-based Hodgkin-Huxley neurons, but that the speedup can differ for other models. GeNN is available for Linux, Mac OS X and Windows platforms. The source code, user manual, tutorials, Wiki, in-depth example projects and all other related information can be found on the project website http://genn-team.github.io/genn/. PMID:26740369

  9. Understanding virulence mechanisms in M. tuberculosis infection via a circuit-based simulation framework.

    Energy Technology Data Exchange (ETDEWEB)

    May, Elebeoba Eni; Oprea, Tudor I.; Joo, Jaewook; Misra, Milind; Leitao, Andrei; Faulon, Jean-Loup Michel

    2008-08-01

    Tuberculosis (TB), caused by the bacterium Mycobacterium tuberculosis (Mtb), is a growing international health crisis. Mtb is able to persist in host tissues in a non-replicating persistent (NRP) or latent state. This presents a challenge in the treatment of TB. Latent TB can re-activate in 10% of individuals with normal immune systems, and at higher rates in those with compromised immune systems. A quantitative understanding of latency-associated virulence mechanisms may help researchers develop more effective methods to battle the spread and reduce TB-associated fatalities. Leveraging BioXyce's ability to simulate whole-cell and multi-cellular systems, we are developing a circuit-based framework to investigate the impact of pathogenicity-associated pathways on the latency/reactivation phase of tuberculosis infection. We discuss efforts to simulate metabolic pathways that potentially impact the ability of Mtb to persist within host immune cells. We demonstrate how simulation studies can provide insight regarding the efficacy of potential anti-TB agents on biological networks critical to Mtb pathogenicity, using a systems chemical biology approach.

  10. A framework for stochastic simulations and visualization of biological electron-transfer dynamics

    Science.gov (United States)

    Nakano, C. Masato; Byun, Hye Suk; Ma, Heng; Wei, Tao; El-Naggar, Mohamed Y.

    2015-08-01

    Electron transfer (ET) dictates a wide variety of energy-conversion processes in biological systems. Visualizing ET dynamics could provide key insight into understanding and possibly controlling these processes. We present a computational framework named VizBET to visualize biological ET dynamics, using an outer-membrane Mtr-Omc cytochrome complex in Shewanella oneidensis MR-1 as an example. Starting from X-ray crystal structures of the constituent cytochromes, molecular dynamics simulations are combined with homology modeling, protein docking, and binding free energy computations to sample the configuration of the complex as well as the change of the free energy associated with ET. This information, along with quantum-mechanical calculations of the electronic coupling, provides inputs to kinetic Monte Carlo (KMC) simulations of ET dynamics in a network of heme groups within the complex. Visualization of the KMC simulation results has been implemented as a plugin to the Visual Molecular Dynamics (VMD) software. VizBET has been used to reveal the nature of ET dynamics associated with novel nonequilibrium phase transitions in a candidate configuration of the Mtr-Omc complex due to electron-electron interactions.
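
    The KMC stage of such a pipeline can be illustrated with a toy version: an electron performing rate-driven hops along a chain of redox sites. The site count and rates below are placeholders, not values from the Mtr-Omc study.

        # Minimal kinetic Monte Carlo sketch of electron hopping along a chain
        # of redox sites; topology and rates are made up for illustration.
        import numpy as np

        rng = np.random.default_rng(0)
        n_sites = 10
        k_fwd, k_back = 1.0e7, 1.0e5    # hypothetical hopping rates, s^-1

        def transit_time():
            """Time for one electron to hop from site 0 to the terminal site."""
            site, t = 0, 0.0
            while site < n_sites - 1:
                kf, kb = k_fwd, (k_back if site > 0 else 0.0)
                total = kf + kb
                t += rng.exponential(1.0 / total)               # residence time at site
                site += 1 if rng.random() < kf / total else -1  # choose hop direction
            return t

        times = [transit_time() for _ in range(1000)]
        print(f"mean electron transit time: {np.mean(times):.3e} s")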

  11. Reservoir Modeling by Data Integration via Intermediate Spaces and Artificial Intelligence Tools in MPS Simulation Frameworks

    International Nuclear Information System (INIS)

    Conditioning stochastic simulations is very important in many geostatistical applications that call for the introduction of nonlinear and multiple-point data in reservoir modeling. Here, a new methodology is proposed for the incorporation of different data types into multiple-point statistics (MPS) simulation frameworks. Unlike the previous techniques that call for an approximate forward model (filter) for integration of secondary data into geologically constructed models, the proposed approach develops an intermediate space where all the primary and secondary data are easily mapped onto. Definition of the intermediate space, as may be achieved via application of artificial intelligence tools like neural networks and fuzzy inference systems, eliminates the need for using filters as in previous techniques. The applicability of the proposed approach in conditioning MPS simulations to static and geologic data is verified by modeling a real example of discrete fracture networks using conventional well-log data. The training patterns are well reproduced in the realizations, while the model is also consistent with the map of secondary data.

  12. A finite element framework for high performance computer simulation of blood flow in the left ventricle of the human heart

    OpenAIRE

    Spühler, Jeannette Hiromi; Jansson, Johan; Jansson, Niclas; HOFFMAN, Johan

    2015-01-01

    Progress in medical imaging, computational fluid dynamics and high performance computing enables computer simulations to evolve as a significant tool to enhance our understanding of the relationship between cardiovascular diseases and hemodynamics. The field of cardiac flow simulations is highly interdisciplinary and challenging. Therefore, the aim of our research is to build a simple and reliable framework for modeling and simulation of the blood flow in the heart that is both easy to modify...

  13. A simulation-based framework for a mapping tool that assesses the energy performance of green roofs

    OpenAIRE

    Kokogiannakis, Georgios; Darkwa, Jo

    2012-01-01

    This paper presents a framework for the development of a GIS open source mapping tool that aims to disseminate a database with results of detailed simulations in order to assess in a quick and easy way the energy performance of green roof designs across a range of Chinese climates. Detailed simulation results for heating and cooling loads are obtained from the EnergyPlus simulation tool. The study covers 12264 configurations by varying model parameters such as climate, glazing type, roof insu...

  14. Theoretical Framework and Simulation Results for Implementing Weighted Multiple Sampling in Scientific CCDs

    CERN Document Server

    Alessandri, Cristobal; Abusleme, Angel; Avila, Diego; Alvarez, Enrique; Campillo, Hernan; Gallyas, Alexandra; Oberli, Christian; Guarini, Marcelo

    2015-01-01

    The Digital Correlated Double Sampling (DCDS) is a technique based on multiple analog-to-digital conversions of every pixel when reading a CCD out. This technique makes it possible to remove analog integrators, simplifying the readout electronics circuitry. In this work, we present a theoretical framework that computes the optimal weighting coefficients of the pixel samples, minimizing the readout noise measured at the CCD output. By using a noise model for the CCD output amplifier in which white and flicker noise are treated separately, the mathematical tool presented allows the optimal sample coefficients to be computed in a deterministic fashion. By modifying the noise profile, our simulation results reconcile, and thus explain, results that were in mutual disagreement up until now.
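
    The optimal-weight computation reduces to a best linear unbiased estimate under the sample covariance; a minimal sketch follows, assuming a hypothetical white-plus-correlated noise covariance rather than the paper's amplifier model.

        # BLUE weights for weighted multiple sampling: minimize w^T C w subject
        # to w.sum() == 0 (reject the common offset) and w @ p == 1 (unit gain
        # on the pixel signal). The noise covariance below is an assumption.
        import numpy as np

        def optimal_weights(C, p):
            G = np.column_stack([np.ones(len(p)), p])
            Ci = np.linalg.inv(C)
            W = Ci @ G @ np.linalg.inv(G.T @ Ci @ G)  # GLS estimator columns
            return W[:, 1]                            # column for the signal term

        # white noise plus exponentially correlated ("flicker-like") noise
        # across the 2*m conversions of one pixel read
        m = 8
        idx = np.arange(2 * m)
        C = np.eye(2 * m) + 4.0 * np.exp(-np.abs(idx[:, None] - idx[None, :]) / 6.0)
        p = np.r_[np.zeros(m), np.ones(m)]       # reference samples, then signal samples

        w = optimal_weights(C, p)
        v = np.r_[-np.ones(m), np.ones(m)] / m   # plain averaged-CDS weights
        print("optimal weights:", w.round(3))
        print("readout noise:", np.sqrt(w @ C @ w).round(3),
              "vs plain CDS:", np.sqrt(v @ C @ v).round(3))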

  15. A multiscale framework for the simulation of the anisotropic mechanical behavior of shale

    CERN Document Server

    Li, Weixin; Jin, Congrui; Zhou, Xinwei; Cusatis, Gianluca

    2016-01-01

    Shale, like many other sedimentary rocks, is typically heterogeneous and anisotropic, and is characterized by partial alignment of anisotropic clay minerals and naturally formed bedding planes. In this study, a micromechanical framework based on the Lattice Discrete Particle Model (LDPM) is formulated to capture these features. Material anisotropy is introduced through an approximated geometric description of shale internal structure, which includes representation of material property variation with orientation and explicit modeling of parallel lamination. The model is calibrated by carrying out numerical simulations to match various experimental data, including the ones relevant to elastic properties, Brazilian tensile strength, and unconfined compressive strength. Furthermore, a parametric study is performed to investigate the relationship between the mesoscale parameters and the macroscopic properties. It is shown that the dependence of the elastic stiffness, strength, and failure mode on loading orientation ca...

  16. CRPropa 3 - a Public Astrophysical Simulation Framework for Propagating Extraterrestrial Ultra-High Energy Particles

    CERN Document Server

    Batista, Rafael Alves; Erdmann, Martin; Kampert, Karl-Heinz; Kuempel, Daniel; Müller, Gero; Sigl, Guenter; van Vliet, Arjen; Walz, David; Winchen, Tobias

    2016-01-01

    We present the simulation framework CRPropa version 3, designed for efficient development of astrophysical predictions for ultra-high energy particles. Users can assemble modules of the most relevant propagation effects in galactic and extragalactic space, include their own physics modules with new features, and receive as output primary and secondary cosmic messengers including nuclei, neutrinos and photons. Extending the propagation physics contained in the previous CRPropa version, the new version facilitates high-performance computing and comprises new physical features such as an interface for galactic propagation using lensing techniques, an improved photonuclear interaction calculation, and propagation in time-dependent environments to take into account cosmic evolution effects in anisotropy studies and variable sources. First applications using highlighted features are presented as well.
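
    The module-assembly style CRPropa exposes can be sketched as a chain of objects that each act on a candidate particle until a break condition deactivates it; the class names, step size, and toy loss rate here are illustrative, not CRPropa's Python bindings.

        # Toy module chain: each module processes a candidate particle in turn.
        class Candidate:
            def __init__(self, energy_eV):
                self.energy, self.distance, self.active = energy_eV, 0.0, True

        class Propagator:
            step_Mpc = 1.0
            def process(self, c):
                c.distance += self.step_Mpc

        class AdiabaticLoss:
            def process(self, c):
                c.energy *= 0.9999            # toy continuous energy-loss rate

        class MinEnergyBreak:
            def __init__(self, e_min):
                self.e_min = e_min
            def process(self, c):
                if c.energy < self.e_min:
                    c.active = False          # deactivate: propagation stops

        modules = [Propagator(), AdiabaticLoss(), MinEnergyBreak(1e19)]
        c = Candidate(energy_eV=1e20)
        while c.active:
            for m in modules:
                m.process(c)
        print(f"stopped after {c.distance:.0f} Mpc at {c.energy:.2e} eV")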

  17. The structure of disaster resilience: a framework for simulations and policy recommendations

    Science.gov (United States)

    Edwards, J. H. Y.

    2015-04-01

    In this era of rapid climate change there is an urgent need for interdisciplinary collaboration and understanding in the study of what determines resistance to disasters and recovery speed. This paper is an economist's contribution to that effort. It traces the entrance of the word "resilience" from ecology into the social science literature on disasters, provides a formal economic definition of resilience that can be used in mathematical modeling, incorporates this definition into a multilevel model that suggests appropriate policy roles and targets at each level, and draws on the recent empirical literature on the economics of disaster, searching for policy handles that can stimulate higher resilience. On the whole it provides a framework for simulations and for formulating disaster resilience policies.

  18. A novel finite element framework for numerical simulation of fluidization processes and multiphase granular flow

    Science.gov (United States)

    Percival, James; Xie, Zhihua; Pavlidis, Dimitrios; Gomes, Jefferson; Pain, Christopher; Matar, Omar

    2013-11-01

    We present results from a new formulation of a numerical model for direct simulation of bed fluidization and multiphase granular flow. The model is based on a consistent application of continuous-discontinuous mixed control volume finite element methods applied to fully unstructured meshes. The unstructured mesh framework allows for both a mesh adaptive capability, modifying the computational geometry in order to bound the error in the numerical solution while maximizing computational efficiency, and a simple scripting interface embedded in the model which allows fast prototyping of correlation models and parameterizations in intercomparison experiments. The model is applied to standard test problems for fluidized beds. EPSRC Programme Grant EP/K003976/1.

  19. A framework to quantify uncertainty in simulations of oil transport in the ocean

    Science.gov (United States)

    Gonçalves, Rafael C.; Iskandarani, Mohamed; Srinivasan, Ashwanth; Thacker, W. Carlisle; Chassignet, Eric; Knio, Omar M.

    2016-04-01

    An uncertainty quantification framework is developed for the DeepC Oil Model based on a nonintrusive polynomial chaos method. This allows the model's output to be presented in a probabilistic framework so that the model's predictions reflect the uncertainty in the model's input data. The new capability is illustrated by simulating the far-field dispersal of oil in a Deepwater Horizon blowout scenario. The uncertain input consisted of ocean current and oil droplet size data, and the main model output analyzed is the ensuing oil concentration in the Gulf of Mexico. A 1331-member ensemble was used to construct a surrogate for the model, which was then mined for statistical information. The mean and standard deviations in the oil concentration were calculated for up to 30 days, and the total contribution of each input parameter to the model's uncertainty was quantified at different depths. Also, probability density functions of oil concentration were constructed by sampling the surrogate and used to elaborate probabilistic hazard maps of oil impact. The performance of the surrogate was constantly monitored in order to demarcate the space-time zones where its estimates are reliable.
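
    A minimal sketch of the non-intrusive polynomial chaos workflow: run a model (here a toy stand-in) over an ensemble of a standardized uncertain input, fit a Hermite polynomial surrogate by least squares, then sample the surrogate for statistics. The toy model and polynomial order are assumptions, not the DeepC configuration.

        # Non-intrusive polynomial chaos: ensemble runs -> surrogate -> statistics.
        import numpy as np
        from numpy.polynomial.hermite_e import hermevander

        rng = np.random.default_rng(1)

        def model(xi):                         # stand-in for one expensive simulation
            return np.exp(0.3 * xi) + 0.1 * xi**2

        order, n_ens = 4, 33
        xi_ens = rng.standard_normal(n_ens)    # standardized uncertain input
        y_ens = model(xi_ens)                  # "ensemble" of model runs

        A = hermevander(xi_ens, order)         # probabilists' Hermite basis
        coef, *_ = np.linalg.lstsq(A, y_ens, rcond=None)

        xi_mc = rng.standard_normal(200000)    # cheap sampling of the surrogate
        y_mc = hermevander(xi_mc, order) @ coef
        print(f"surrogate mean={y_mc.mean():.4f} std={y_mc.std():.4f}")
        print(f"direct MC mean={model(xi_mc).mean():.4f} std={model(xi_mc).std():.4f}")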

  20. Implementation and performance of FDPS: a framework for developing parallel particle simulation codes

    Science.gov (United States)

    Iwasawa, Masaki; Tanikawa, Ataru; Hosono, Natsuki; Nitadori, Keigo; Muranushi, Takayuki; Makino, Junichiro

    2016-08-01

    We present the basic idea, implementation, measured performance, and performance model of FDPS (Framework for Developing Particle Simulators). FDPS is an application-development framework which helps researchers to develop simulation programs using particle methods for large-scale distributed-memory parallel supercomputers. A particle-based simulation program for distributed-memory parallel computers needs to perform domain decomposition, exchange of particles which are not in the domain of each computing node, and gathering of the particle information in other nodes which are necessary for interaction calculation. Also, even if distributed-memory parallel computers are not used, in order to reduce the amount of computation, algorithms such as the Barnes-Hut tree algorithm or the Fast Multipole Method should be used in the case of long-range interactions. For short-range interactions, some methods to limit the calculation to neighbor particles are required. FDPS provides all of these functions which are necessary for efficient parallel execution of particle-based simulations as "templates," which are independent of the actual data structure of particles and the functional form of the particle-particle interaction. By using FDPS, researchers can write their programs with the amount of work necessary to write a simple, sequential and unoptimized program with O(N^2) calculation cost, and yet the program, once compiled with FDPS, will run efficiently on large-scale parallel supercomputers. A simple gravitational N-body program can be written in around 120 lines. We report the actual performance of these programs and the performance model. The weak scaling performance is very good, and almost linear speed-up was obtained for up to the full system of the K computer. The minimum calculation time per timestep is in the range of 30 ms (N = 10^7) to 300 ms (N = 10^9). These are currently limited by the time for the calculation of the domain decomposition and communication.
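
    The "simple, sequential and unoptimized O(N^2) program" a user writes can be sketched as direct-summation gravity with a leapfrog step; FDPS would then supply the domain decomposition, particle exchange and tree acceleration. Everything below is generic Python, not FDPS's C++ template interface.

        # Direct-summation N-body step: the O(N^2) kernel a user would supply.
        import numpy as np

        rng = np.random.default_rng(2)
        N, G, eps, dt = 256, 1.0, 0.05, 1e-3
        pos = rng.standard_normal((N, 3))
        vel = np.zeros((N, 3))
        mass = np.full(N, 1.0 / N)

        def accelerations(pos):
            acc = np.zeros_like(pos)
            for i in range(N):                  # O(N^2) pairwise interaction
                d = pos - pos[i]
                r2 = (d * d).sum(axis=1) + eps**2   # softened distance squared
                acc[i] = G * (mass[:, None] * d / r2[:, None]**1.5).sum(axis=0)
            return acc

        for step in range(10):                  # kick-drift-kick leapfrog
            vel += 0.5 * dt * accelerations(pos)
            pos += dt * vel
            vel += 0.5 * dt * accelerations(pos)
        print("kinetic energy:", 0.5 * (mass[:, None] * vel**2).sum())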

  1. Progress report for FACETS (Framework Application for Core-Edge Transport Simulations): C.S. SAP

    International Nuclear Information System (INIS)

    The mission of the Computer Science Scientific Application Partnership (C.S. SAP) at LLNL is to develop and apply leading-edge scientific component technology to FACETS software. Contributions from LLNL's fusion energy program staff towards the underlying physics modules are described in a separate report. FACETS uses component technology to selectively combine multiple physics and solver software modules, written in different languages by different institutions, in a tightly integrated, parallel computing framework for Tokamak reactor modeling. In the past fiscal year, the C.S. SAP has focused on two primary tasks: applying Babel to connect UEDGE into the FACETS framework through UEDGE's existing Python interface and developing a next generation componentization strategy for UEDGE which avoids the use of Python. The FACETS project uses Babel to solve its language interoperability challenges. Specific accomplishments for the year include: (1) Refined SIDL interfaces for UEDGE to satisfy the standard interfaces required by FACETS for all physics modules. This required consensus building between framework and UEDGE developers. (2) Wrote prototype C++ driver for UEDGE to demonstrate how UEDGE can be called from C++ using Babel. (3) Supported the FACETS project by adding new features to Babel such as release number tagging, porting to new machines, and adding new configuration options. Babel modifications were delivered to FACETS by testing and publishing development snapshots in the project's software repository. (4) Assisted Tech-X Corporation in testing and debugging of a high level build system for the complete FACETS tool chain--the complete list of third-party software libraries that FACETS depends on directly or indirectly (e.g., MPI, HDF5, PACT, etc.). (5) Designed and implemented a new approach to wrapping UEDGE as a FACETS component without requiring Python. To get simulation results as soon as possible, our initial connection from the FACETS
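
    The componentization idea, in which every physics module presents the same narrow interface to the framework regardless of its implementation language, can be sketched as follows; the abstract base class and the toy edge module are illustrative stand-ins, not the actual SIDL interfaces or UEDGE.

        # Component interface sketch: the framework drives all physics modules
        # through the same initialize/advance/get_field contract.
        from abc import ABC, abstractmethod

        class PhysicsComponent(ABC):
            @abstractmethod
            def initialize(self, config: dict) -> None: ...
            @abstractmethod
            def advance(self, t: float, dt: float) -> None: ...
            @abstractmethod
            def get_field(self, name: str): ...

        class ToyEdgeModule(PhysicsComponent):   # stands in for a wrapped edge code
            def initialize(self, config):
                self.t_edge = config.get("t_edge_eV", 50.0)
            def advance(self, t, dt):
                self.t_edge *= 1.0 - 0.01 * dt   # placeholder edge-cooling model
            def get_field(self, name):
                return {"edge_temperature": self.t_edge}[name]

        modules = [ToyEdgeModule()]
        for m in modules:
            m.initialize({"t_edge_eV": 80.0})
        t = 0.0
        for _ in range(100):
            for m in modules:
                m.advance(t, 0.1)
            t += 0.1
        print("edge temperature:", modules[0].get_field("edge_temperature"))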

  2. Modification of the Argonne tandem

    International Nuclear Information System (INIS)

    For nuclear structure experiments with heavy ions it is necessary to have ion energies in excess of 5 MeV per nucleon. At the Argonne tandem FN accelerator this was accomplished by the addition of a superconducting linac. Modifications of the FN tandem to improve the performance of the pair are described.

  3. KMCLib: A general framework for lattice kinetic Monte Carlo (KMC) simulations

    CERN Document Server

    Leetmaa, Mikael

    2014-01-01

    KMCLib is a general framework for lattice kinetic Monte Carlo (KMC) simulations. The program can handle simulations of the diffusion and reaction of millions of particles in one, two, or three dimensions, and is designed to be easily extended and customized by the user to allow for the development of complex custom KMC models for specific systems without having to modify the core functionality of the program. Analysis modules and on-the-fly elementary step diffusion rate calculations can be implemented as plugins following a well-defined API. The plugin modules are loosely coupled to the core KMCLib program via the Python scripting language. KMCLib is written as a Python module with a backend C++ library. After initial compilation of the backend library KMCLib is used as a Python module; input to the program is given as a Python script executed using a standard Python interpreter. We give a detailed description of the features and implementation of the code and demonstrate its scaling behavior and parallel pe...

  4. A framework for stochastic simulation of distribution practices for hotel reservations

    Energy Technology Data Exchange (ETDEWEB)

    Halkos, George E.; Tsilika, Kyriaki D. [Laboratory of Operations Research, Department of Economics, University of Thessaly, Korai 43, 38 333, Volos (Greece)

    2015-03-10

    The focus of this study is primarily on the Greek hotel industry. The objective is to design and develop a framework for stochastic simulation of reservation requests, reservation arrivals, cancellations and hotel occupancy with a planning horizon of a tourist season. In the Greek hospitality industry there have been two competing policies for the reservation planning process up to 2003: reservations coming directly from customers, and reservations management relying on tour operator(s). Recently the Internet, along with other emerging technologies, has offered the potential to disrupt enduring distribution arrangements. The focus of the study is on the choice of distribution intermediaries. We present an empirical model for the hotel reservation planning process that makes use of symbolic simulation (a Monte Carlo method), since requests for reservations, cancellations, and arrival rates are all sources of uncertainty. We consider as a case study the problem of determining the optimal booking strategy for a medium-size hotel on Skiathos Island, Greece. Probability distributions and parameter estimates follow from the historical data available and from suggestions made in the relevant literature. The results of this study may assist hotel managers in defining distribution strategies for hotel rooms and evaluating the performance of the reservations management system.

  5. A framework for stochastic simulation of distribution practices for hotel reservations

    International Nuclear Information System (INIS)

    The focus of this study is primarily on the Greek hotel industry. The objective is to design and develop a framework for stochastic simulation of reservation requests, reservation arrivals, cancellations and hotel occupancy with a planning horizon of a tourist season. In the Greek hospitality industry there have been two competing policies for the reservation planning process up to 2003: reservations coming directly from customers, and reservations management relying on tour operator(s). Recently the Internet, along with other emerging technologies, has offered the potential to disrupt enduring distribution arrangements. The focus of the study is on the choice of distribution intermediaries. We present an empirical model for the hotel reservation planning process that makes use of symbolic simulation (a Monte Carlo method), since requests for reservations, cancellations, and arrival rates are all sources of uncertainty. We consider as a case study the problem of determining the optimal booking strategy for a medium-size hotel on Skiathos Island, Greece. Probability distributions and parameter estimates follow from the historical data available and from suggestions made in the relevant literature. The results of this study may assist hotel managers in defining distribution strategies for hotel rooms and evaluating the performance of the reservations management system.
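
    A minimal sketch of such a reservation simulation, with Poisson daily requests, independent cancellations and a fixed room capacity over one season; all rates are invented placeholders rather than the fitted distributions used in the study.

        # Monte Carlo season of reservation requests, cancellations and occupancy.
        import numpy as np

        rng = np.random.default_rng(3)
        season_days, rooms = 180, 60
        mean_requests, p_cancel = 8.0, 0.12          # hypothetical rates

        def simulate_season():
            booked = np.zeros(season_days)
            for today in range(season_days):
                for _ in range(rng.poisson(mean_requests)):   # reservation requests
                    if rng.random() < p_cancel:
                        continue                              # request later cancelled
                    start = rng.integers(today, season_days)
                    nights = slice(start, min(start + rng.integers(1, 8), season_days))
                    if booked[nights].max() < rooms:          # rooms still free
                        booked[nights] += 1
            return booked.mean() / rooms

        occupancy = [simulate_season() for _ in range(200)]
        print(f"mean occupancy {np.mean(occupancy):.1%} +/- {np.std(occupancy):.1%}")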

  6. A discrete element based simulation framework to investigate particulate spray deposition processes

    International Nuclear Information System (INIS)

    This work presents a computer simulation framework based on discrete element method to analyze manufacturing processes that comprise a loosely flowing stream of particles in a carrier fluid being deposited on a target surface. The individual particulate dynamics under the combined action of particle collisions, fluid–particle interactions, particle–surface contact and adhesive interactions is simulated, and aggregated to obtain global system behavior. A model for deposition which incorporates the effect of surface energy, impact velocity and particle size, is developed. The fluid–particle interaction is modeled using appropriate spray nozzle gas velocity distributions and a one-way coupling between the phases. It is found that the particle response times and the release velocity distribution of particles have a combined effect on inter-particle collisions during the flow along the spray. It is also found that resolution of the particulate collisions close to the target surface plays an important role in characterizing the trends in the deposit pattern. Analysis of the deposit pattern using metrics defined from the particle distribution on the target surface is provided to characterize the deposition efficiency, deposit size, and scatter due to collisions.

  7. A discrete element based simulation framework to investigate particulate spray deposition processes

    KAUST Repository

    Mukherjee, Debanjan

    2015-06-01

    © 2015 Elsevier Inc. This work presents a computer simulation framework based on discrete element method to analyze manufacturing processes that comprise a loosely flowing stream of particles in a carrier fluid being deposited on a target surface. The individual particulate dynamics under the combined action of particle collisions, fluid-particle interactions, particle-surface contact and adhesive interactions is simulated, and aggregated to obtain global system behavior. A model for deposition which incorporates the effect of surface energy, impact velocity and particle size, is developed. The fluid-particle interaction is modeled using appropriate spray nozzle gas velocity distributions and a one-way coupling between the phases. It is found that the particle response times and the release velocity distribution of particles have a combined effect on inter-particle collisions during the flow along the spray. It is also found that resolution of the particulate collisions close to the target surface plays an important role in characterizing the trends in the deposit pattern. Analysis of the deposit pattern using metrics defined from the particle distribution on the target surface is provided to characterize the deposition efficiency, deposit size, and scatter due to collisions.

  8. A discrete element based simulation framework to investigate particulate spray deposition processes

    Energy Technology Data Exchange (ETDEWEB)

    Mukherjee, Debanjan, E-mail: debanjan@berkeley.edu; Zohdi, Tarek I., E-mail: zohdi@me.berkeley.edu

    2015-06-01

    This work presents a computer simulation framework based on discrete element method to analyze manufacturing processes that comprise a loosely flowing stream of particles in a carrier fluid being deposited on a target surface. The individual particulate dynamics under the combined action of particle collisions, fluid–particle interactions, particle–surface contact and adhesive interactions is simulated, and aggregated to obtain global system behavior. A model for deposition which incorporates the effect of surface energy, impact velocity and particle size, is developed. The fluid–particle interaction is modeled using appropriate spray nozzle gas velocity distributions and a one-way coupling between the phases. It is found that the particle response times and the release velocity distribution of particles have a combined effect on inter-particle collisions during the flow along the spray. It is also found that resolution of the particulate collisions close to the target surface plays an important role in characterizing the trends in the deposit pattern. Analysis of the deposit pattern using metrics defined from the particle distribution on the target surface is provided to characterize the deposition efficiency, deposit size, and scatter due to collisions.
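
    The deposition criterion can be illustrated by a toy rule in which a particle adheres when its impact speed falls below a critical capture velocity that grows with surface energy and shrinks with particle size; the scaling and all material numbers below are assumptions for illustration.

        # Toy stick-or-rebound criterion for spray deposition.
        import numpy as np

        rng = np.random.default_rng(4)
        gamma = 0.05        # surface energy, J/m^2 (hypothetical)
        rho = 2500.0        # particle density, kg/m^3 (hypothetical)
        n = 10000

        r = rng.uniform(5e-6, 50e-6, n)               # particle radii, m
        v = rng.normal(3.0, 1.5, n).clip(min=0)       # impact speeds, m/s

        v_crit = np.sqrt(6.0 * gamma / (rho * r))     # assumed capture-velocity scaling
        stick = v < v_crit                            # adhere if slow enough
        print(f"deposition efficiency: {stick.mean():.1%}")
        print(f"mean radius deposited: {r[stick].mean()*1e6:.1f} um "
              f"vs rebounded: {r[~stick].mean()*1e6:.1f} um")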

  9. A heterogeneous and parallel computing framework for high-resolution hydrodynamic simulations

    Science.gov (United States)

    Smith, Luke; Liang, Qiuhua

    2015-04-01

    Shock-capturing hydrodynamic models are now widely applied in the context of flood risk assessment and forecasting, accurately capturing the behaviour of surface water over ground and within rivers. Such models are generally explicit in their numerical basis, and can be computationally expensive; this has prohibited full use of high-resolution topographic data for complex urban environments, now easily obtainable through airborne altimetric surveys (LiDAR). As processor clock speed advances have stagnated in recent years, further computational performance gains are largely dependent on the use of parallel processing. Heterogeneous computing architectures (e.g. graphics processing units or compute accelerator cards) provide a cost-effective means of achieving high throughput in cases where the same calculation is performed with a large input dataset. In recent years this technique has been applied successfully for flood risk mapping, such as within the national surface water flood risk assessment for the United Kingdom. We present a flexible software framework for hydrodynamic simulations across multiple processors of different architectures, within multiple computer systems, enabled using OpenCL and Message Passing Interface (MPI) libraries. A finite-volume Godunov-type scheme is implemented using the HLLC approach to solving the Riemann problem, with optional extension to second-order accuracy in space and time using the MUSCL-Hancock approach. The framework is successfully applied on personal computers and a small cluster to provide considerable improvements in performance. The most significant performance gains were achieved across two servers, each containing four NVIDIA GPUs, with a mix of K20, M2075 and C2050 devices. Advantages are found with respect to decreased parametric sensitivity, and thus in reducing uncertainty, for a major fluvial flood within a large catchment during 2005 in Carlisle, England. Simulations for the three-day event could be performed
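
    A minimal serial sketch of the underlying numerics: a first-order Godunov-type update of the 1D shallow-water equations, with a local Lax-Friedrichs (Rusanov) flux standing in for the HLLC solver and no OpenCL/MPI offload.

        # First-order finite-volume dam-break test for the 1D shallow-water equations.
        import numpy as np

        g, nx, dx = 9.81, 200, 1.0
        h = np.where(np.arange(nx) < nx // 2, 2.0, 1.0)   # dam-break initial depth
        hu = np.zeros(nx)

        def flux(h, hu):
            u = hu / h
            return np.stack([hu, hu * u + 0.5 * g * h * h])

        t, t_end = 0.0, 5.0
        while t < t_end:
            c = np.abs(hu / h) + np.sqrt(g * h)            # wave speeds
            dt = min(0.4 * dx / c.max(), t_end - t)        # CFL-limited step
            U, F = np.stack([h, hu]), flux(h, hu)
            a = np.maximum(c[:-1], c[1:])                  # interface speed bound
            Fi = 0.5 * (F[:, :-1] + F[:, 1:]) - 0.5 * a * (U[:, 1:] - U[:, :-1])
            U[:, 1:-1] -= dt / dx * (Fi[:, 1:] - Fi[:, :-1])
            h, hu = U                                      # end cells held fixed (crude boundary)
            t += dt
        print(f"depth range after dam break: {h.min():.3f} .. {h.max():.3f} m")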

  10. Statistical framework to simulate daily rainfall series conditional on upper-air predictor variables

    Science.gov (United States)

    Langousis, Andreas; Kaleris, Vassilios

    2014-05-01

    We propose a statistical framework to generate synthetic rainfall time series at daily resolution, conditional on predictor variables indicative of the atmospheric circulation at the mesoscale. We do so by first introducing a dimensionless measure to assess the relative influence of upper-air variables at different pressure levels on ground-level rainfall statistics, and then simulating rainfall occurrence and amount by proper conditioning on the selected atmospheric predictors. The proposed scheme for conditional rainfall simulation operates at a daily time step (avoiding discrete approaches for identification of weather states), can incorporate any possible number and combination of predictor variables, while it is capable of reproducing rainfall seasonality directly from the variation of upper-air variables, without any type of seasonal analysis or modeling. The suggested downscaling approach is tested using atmospheric data from the ERA-Interim archive and daily rainfall measurements from western Greece. The model is found to accurately reproduce several statistics of actual rainfall time series, at both annual and seasonal levels, including wet day fractions, the alternation of wet and dry intervals, the distributions of dry and wet spell lengths, the distribution of rainfall intensities in wet days, short-range dependencies present in historical rainfall records, the distribution of yearly rainfall maxima, dependencies of rainfall statistics on the observation scale, and long-term climatic features present in historical rainfall records. The suggested approach is expected to serve as a useful tool for stochastic rainfall simulation conditional on climate model outputs at a regional level, where climate change impacts and risks are assessed.
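
    A minimal sketch of the conditioning idea: rainfall occurrence follows a logistic function of an upper-air predictor, and wet-day amounts follow a gamma law whose mean shifts with the same predictor. All coefficients below are invented, whereas in the study they are estimated from ERA-Interim and gauge data.

        # Daily rainfall generation conditional on an atmospheric predictor.
        import numpy as np

        rng = np.random.default_rng(5)
        days = 365
        predictor = (np.sin(2 * np.pi * np.arange(days) / 365)
                     + 0.3 * rng.standard_normal(days))       # toy upper-air variable

        p_wet = 1.0 / (1.0 + np.exp(-(-0.8 + 1.5 * predictor)))   # occurrence model
        wet = rng.random(days) < p_wet

        shape = 0.7                                     # gamma shape (assumed)
        mean_amount = np.exp(1.0 + 0.6 * predictor)     # mm, predictor-dependent mean
        amount = np.where(wet, rng.gamma(shape, mean_amount / shape), 0.0)

        print(f"wet-day fraction {wet.mean():.2f}, annual total {amount.sum():.0f} mm")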

  11. A framework for predicting the non-visual effects of daylight – Part II: The simulation model

    OpenAIRE

    Mardaljevic, John; Andersen, Marilyne; Roy, Nicolas; Christoffersen, Jens

    2014-01-01

    This paper describes a climate-based simulation framework devised to investigate the potential for the non-visual effects of daylight in buildings. It is part 2 of a study where the first paper focused on the formulation of the photobiological underpinnings of a threshold-based model configured for lighting simulation from the perspective of the human nonvisual system (e.g. circadian response). This threshold-based model employs a static dose-response curve and instantaneous exposure of dayli...

  12. Simulating mesoscale coastal evolution for decadal coastal management: A new framework integrating multiple, complementary modelling approaches

    Science.gov (United States)

    van Maanen, Barend; Nicholls, Robert J.; French, Jon R.; Barkwith, Andrew; Bonaldo, Davide; Burningham, Helene; Brad Murray, A.; Payo, Andres; Sutherland, James; Thornhill, Gillian; Townend, Ian H.; van der Wegen, Mick; Walkden, Mike J. A.

    2016-03-01

    Coastal and shoreline management increasingly needs to consider morphological change occurring at decadal to centennial timescales, especially that related to climate change and sea-level rise. This requires the development of morphological models operating at a mesoscale, defined by time and length scales of the order 10^1 to 10^2 years and 10^1 to 10^2 km. So-called 'reduced complexity' models that represent critical processes at scales not much smaller than the primary scale of interest, and are regulated by capturing the critical feedbacks that govern landform behaviour, are proving effective as a means of exploring emergent coastal behaviour at a landscape scale. Such models tend to be computationally efficient and are thus easily applied within a probabilistic framework. At the same time, reductionist models, built upon a more detailed description of hydrodynamic and sediment transport processes, are capable of application at increasingly broad spatial and temporal scales. More qualitative modelling approaches are also emerging that can guide the development and deployment of quantitative models, and these can be supplemented by varied data-driven modelling approaches that can achieve new explanatory insights from observational datasets. Such disparate approaches have hitherto been pursued largely in isolation by mutually exclusive modelling communities. Brought together, they have the potential to facilitate a step change in our ability to simulate the evolution of coastal morphology at scales that are most relevant to managing erosion and flood risk. Here, we advocate and outline a new integrated modelling framework that deploys coupled mesoscale reduced complexity models, reductionist coastal area models, data-driven approaches, and qualitative conceptual models. Integration of these heterogeneous approaches gives rise to model compositions that can potentially resolve decadal- to centennial-scale behaviour of diverse coupled open coast, estuary and inner

  13. Composable Mission Framework for Rapid End-to-End Mission Design and Simulation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation proposed here is the Composable Mission Framework (CMF), a model-based software framework that shall enable seamless continuity of mission design and...

  14. The effect of casting and masticatory simulation on strain and misfit of implant-supported metal frameworks.

    Science.gov (United States)

    Bhering, Cláudia Lopes Brilhante; Marques, Isabella da Silva Vieira; Takahashi, Jessica Mie Ferreira Koyama; Barão, Valentim Adelino Ricardo; Consani, Rafael Leonardo Xediek; Mesquita, Marcelo Ferraz

    2016-05-01

    The influence of casting and masticatory simulation on marginal misfit and strain in multiple implant-supported prostheses was evaluated. Three-unit screw-retained fixed dental prosthesis (FDP) and screw-retained full-arch fixed dental prosthesis (FAFDP) frameworks were made using calcinable or overcasted cylinders on conical dental implant abutments. Four groups were obtained according to the cylinder and prosthesis type (n=10). Frameworks were cast in CoCr alloy and subjected to strain gauge analyses and marginal misfit measurements before and after 10^6 mechanical cycles (2 Hz/280 N). Results were submitted to ANOVA, Tukey's HSD and Pearson correlation test (α=0.05). No difference was found in misfit among all groups and times (p>0.05). Overcasted frameworks showed higher strain than the calcinable ones (FDP - Initial p=0.0047; Final p=0.0004; FAFDP - Initial p=0.0476; Final p=0.0115). The masticatory simulation did not influence strain (p>0.05). No correlation was observed between strain and misfit (r=0.24; p>0.05). In conclusion, the marginal misfit value in the overcasted full-arch frameworks was higher than clinically acceptable values, showing that overcasting is not an ideal method for full-arch prostheses. Overcasted frameworks generate higher strain upon the system. The masticatory simulation had no influence on misfit and strain of multiple prostheses. PMID:26952480

  15. A model framework to represent plant-physiology and rhizosphere processes in soil profile simulation models

    Science.gov (United States)

    Vanderborght, J.; Javaux, M.; Couvreur, V.; Schröder, N.; Huber, K.; Abesha, B.; Schnepf, A.; Vereecken, H.

    2013-12-01

    of plant transpiration by root-zone produced plant hormones, and (iv) the impact of salt accumulation at the soil-root interface on root water uptake. We further propose a framework for how this process knowledge could be implemented in root-zone simulation models that do not resolve small-scale processes.

  16. Numerical simulation of the Moon's rotation in a rigorous relativistic framework

    Science.gov (United States)

    Wang, Zai; Han, Wen-Biao; Tang, Kai; Tao, Jin-He

    2016-06-01

    This paper describes a numerical simulation of the rigid rotation of the Moon in a relativistic framework. Following a resolution passed by the International Astronomical Union (IAU) in 2000, we construct a kinematically non-rotating reference system named the Selenocentric Celestial Reference System (SCRS) and give the time transformation between the Selenocentric Coordinate Time (TCS) and Barycentric Coordinate Time (TCB). The post-Newtonian equations of the Moon's rotation are written in the SCRS, and they are integrated numerically. We calculate the correction to the rotation of the Moon due to total relativistic torque which includes post-Newtonian and gravitomagnetic torques as well as geodetic precession. We find two dominant periods associated with this correction: 18.6 yr and 80.1 yr. In addition, the precession of the rotating axes caused by fourth-degree and fifth-degree harmonics of the Moon is also analyzed, and we have found that the main periods of this precession are 27.3 d, 2.9 yr, 18.6 yr and 80.1 yr.

  17. A systematic intercomparison of regional flood frequency analysis models in a simulation framework

    Science.gov (United States)

    Ganora, Daniele; Laio, Francesco; Claps, Pierluigi

    2015-04-01

    Regional frequency analysis (RFA) is a well-established methodology to provide an estimate of the flood frequency curve (or other discharge-related variables), based on the fundamental concept of substituting temporal information at a site (no data or short time series) by exploiting observations at other sites (spatial information). Different RFA paradigms exist, depending on the way the information is transferred to the site of interest. Despite the wide use of such methodology, a systematic comparison between these paradigms has not been performed. The aim of this study is to provide a framework within which to carry out the intercomparison: we thus synthetically generate data through Monte Carlo simulations for a number of (virtual) stations, following a GEV parent distribution; different scenarios can be created to represent different spatial heterogeneity patterns by manipulating the parameters of the parent distribution at each station (e.g. with a linear variation in space of the shape parameter of the GEV). A special case is the homogeneous scenario where each station record is sampled from the same parent distribution. For each scenario and each simulation, different regional models are applied to evaluate the 200-year growth factor at each station. Results are then compared to the exact growth factor of each station, which is known in our virtual world. Considered regional approaches include: (i) a single growth curve for the whole region; (ii) a multiple-region model based on cluster analysis which searches for an adequate number of homogeneous subregions; (iii) a Region-of-Influence model which defines a homogeneous subregion for each site; (iv) a spatially-smooth estimation procedure based on linear regressions. A further benchmark model is the at-site estimate based on the analysis of the local record. A comprehensive analysis of the results of the simulations shows that, if the scenario is homogeneous (no spatial variability), all the regional approaches
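
    A reduced sketch of this virtual-world experiment, comparing only the at-site estimate and a single regional growth curve against the exact 200-year growth factor of spatially varying GEV parents; the site count, record length and parameter values are arbitrary choices for illustration.

        # Virtual-world comparison of at-site vs single-curve regional estimates.
        import numpy as np
        from scipy.stats import genextreme

        rng = np.random.default_rng(6)
        n_sites, n_years, T = 15, 40, 200.0
        shapes = np.linspace(-0.05, -0.25, n_sites)   # heterogeneous GEV shapes

        exact, at_site, pooled = [], [], []
        for c in shapes:
            parent = genextreme(c, loc=1.0, scale=0.3)
            sample = parent.rvs(size=n_years, random_state=rng)
            exact.append(parent.ppf(1 - 1 / T) / parent.mean())   # true growth factor
            fit = genextreme(*genextreme.fit(sample))
            at_site.append(fit.ppf(1 - 1 / T) / sample.mean())
            pooled.append(sample / sample.mean())                 # regional growth data

        regional = genextreme(*genextreme.fit(np.concatenate(pooled)))
        gf_regional = regional.ppf(1 - 1 / T)                     # one curve for all sites
        err_site = np.abs(np.array(at_site) - exact).mean()
        err_reg = np.abs(gf_regional - np.array(exact)).mean()
        print(f"mean abs error: at-site {err_site:.3f}, single regional curve {err_reg:.3f}")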

  18. A modular framework for matter flux simulation at the catchment scale

    Science.gov (United States)

    Kraft, P.; Breuer, L.; Vaché, K. B.; Frede, H.-G.

    2009-04-01

    Modeling nutrient fluxes in a catchment is a complex and interdisciplinary task. Building and improving simulation tools for such complex systems is often constrained by the expertise of the scientists engaged: since different fields of science are involved, such as vadose zone and groundwater hydrology, plant growth, atmospheric exchange, soil chemistry, soil microbiology, stream physics and stream chemistry, a single work group cannot excel in all parts. As a result, either parts of the system where none of the scientists involved is an expert include rough simplifications, or a "complete" group is too big to maintain the system over a longer period. Many approaches exist to create complex models that integrate processes for all sub-domains, but tight integration carries the risk of freezing a specific state of science into the complex system. A model infrastructure which takes the complex feedback loops across domain boundaries (e.g. soil moisture and plant growth) into consideration and is still flexible enough for the adoption of new findings in any of the scientific fields is therefore needed. This type of infrastructure can be obtained by a set of independent but connectible models. The new Catchment Model Framework (cmf), a module for subsurface water and solute transport, is an example of an independent yet open and easily extendible framework for the simulation of water and solute transport processes. Openness is gained by implementing the model as an extension to the Python programming language. Coupling cmf with models dealing with other system compartments, such as plant growth, biogeochemical or atmospheric dispersion models, that also provide an interface to the Python language can easily be done. The models used in the coupling process can either be spatially explicit models, plot-scale models with one instance per mesh node of the landscape model, or pure reaction functions using the integration methods of cmf. The concept of extending an existing and

  19. Elements of naturality in dynamical simulation frameworks for Hamiltonian, thermostatic, and Lindbladian flows on classical and quantum state-spaces

    CERN Document Server

    Sidles, John A; Jacky, Jonathan P; Picone, Rico A R; Harsila, Scott A

    2010-01-01

    The practical focus of this work is the dynamical simulation of polarization transport processes in quantum spin microscopy and spectroscopy. The simulation framework is built up progressively, beginning with state-spaces (configuration manifolds) that are geometrically natural, introducing coordinates that are algebraically natural, and finally specifying dynamical potentials that are physically natural; in each respect explicit criteria are given for "naturality." The resulting framework encompasses Hamiltonian flow (both classical and quantum), quantum Lindbladian processes, and classical thermostatic processes. Constructive validation and verification criteria are given for metric and symplectic flows on classical, quantum, and hybrid state-spaces, with particular emphasis on tensor network state-spaces. Both classical and quantum examples are presented, including dynamic nuclear polarization (DNP). A broad span of applications and challenges is discussed, ranging from the design and simulation of quantum...

  20. 2015 Annual Report - Argonne Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Collins, James R. [Argonne National Lab. (ANL), Argonne, IL (United States); Papka, Michael E. [Argonne National Lab. (ANL), Argonne, IL (United States); Cerny, Beth A. [Argonne National Lab. (ANL), Argonne, IL (United States); Coffey, Richard M. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-01-01

    The Argonne Leadership Computing Facility provides supercomputing capabilities to the scientific and engineering community to advance fundamental discovery and understanding in a broad range of disciplines.

  1. 2014 Annual Report - Argonne Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Collins, James R. [Argonne National Lab. (ANL), Argonne, IL (United States); Papka, Michael E. [Argonne National Lab. (ANL), Argonne, IL (United States); Cerny, Beth A. [Argonne National Lab. (ANL), Argonne, IL (United States); Coffey, Richard M. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2014-01-01

    The Argonne Leadership Computing Facility provides supercomputing capabilities to the scientific and engineering community to advance fundamental discovery and understanding in a broad range of disciplines.

  2. A Simulation Framework for Exploring Socioecological Dynamics and Sustainability of Settlement Systems Under Stress in Ancient Mesopotamia and Beyond

    Science.gov (United States)

    Christiansen, J. H.; Altaweel, M. R.

    2007-12-01

    The presentation will describe an object-oriented, agent-based simulation framework being used to help answer longstanding questions regarding the development trajectories and sustainability of ancient Mesopotamian settlement systems. This multidisciplinary, multi-model framework supports explicit, fine-scale representations of the dynamics of key natural processes such as crop growth, hydrology, and weather, operating concurrently with social processes such as kinship-driven behaviors, farming and herding practices, social stratification, and economic and political activities carried out by social agents that represent individual persons, households, and larger-scale organizations. The framework has allowed us to explore the inherently coupled dynamics of modeled settlements and landscapes that are undergoing diverse social and environmental stresses, both acute and chronic, across multi-generational time spans. The simulation framework was originally used to address single-settlement scenarios, but has recently been extended to begin to address settlement system sustainability issues at sub-regional to regional scale, by introducing a number of new dynamic mechanisms, such as the activities of nomadic communities, that manifest themselves at these larger spatial scales. The framework is flexible and scalable and has broad applicability. It has, for example, recently been adapted to address agroeconomic sustainability of settlement systems in modern rural Thailand, testing the resilience and vulnerability of settled landscapes in the face of such perturbations as large-scale political interventions, global economic shifts, and climate change.

  3. Infectio: a Generic Framework for Computational Simulation of Virus Transmission between Cells

    Science.gov (United States)

    Yakimovich, Artur; Yakimovich, Yauhen; Schmid, Michael; Mercer, Jason; Sbalzarini, Ivo F.

    2016-01-01

    Viruses spread between cells, tissues, and organisms by cell-free and cell-cell mechanisms, depending on the cell type, the nature of the virus, or the phase of the infection cycle. The mode of viral transmission has a large impact on disease development, the outcome of antiviral therapies or the efficacy of gene therapy protocols. The transmission mode of viruses can be addressed in tissue culture systems using live-cell imaging. Yet even in relatively simple cell cultures, the mechanisms of viral transmission are difficult to distinguish. Here we present a cross-platform software framework called “Infectio,” which is capable of simulating transmission phenotypes in tissue culture of virtually any virus. Infectio can estimate interdependent biological parameters, for example for vaccinia virus infection, and differentiate between cell-cell and cell-free virus spreading. Infectio assists in elucidating virus transmission mechanisms, a feature useful for designing strategies of perturbing or enhancing viral transmission. The complexity of the Infectio software is low compared to that of other software commonly used to quantitate features of cell biological images, which yields stable and relatively error-free output from Infectio. The software is open source (GPLv3 license), and operates on the major platforms (Windows, Mac, and Linux). The complete source code can be downloaded from http://infectio.github.io/index.html. IMPORTANCE Infectio presents a generalized platform to analyze virus infection spread between cells. It allows the simulation of plaque phenotypes from image-based assays. Viral plaques are the result of virus spreading from primary infected cells to neighboring cells. This is a complex process and involves neighborhood effects at cell-cell contact sites or fluid dynamics in the extracellular medium. Infectio differentiates between two major modes of virus transmission between cells, allowing in silico testing of hypotheses about

  4. Infectio: a Generic Framework for Computational Simulation of Virus Transmission between Cells.

    Science.gov (United States)

    Yakimovich, Artur; Yakimovich, Yauhen; Schmid, Michael; Mercer, Jason; Sbalzarini, Ivo F; Greber, Urs F

    2016-01-01

    Viruses spread between cells, tissues, and organisms by cell-free and cell-cell mechanisms, depending on the cell type, the nature of the virus, or the phase of the infection cycle. The mode of viral transmission has a large impact on disease development, the outcome of antiviral therapies or the efficacy of gene therapy protocols. The transmission mode of viruses can be addressed in tissue culture systems using live-cell imaging. Yet even in relatively simple cell cultures, the mechanisms of viral transmission are difficult to distinguish. Here we present a cross-platform software framework called "Infectio," which is capable of simulating transmission phenotypes in tissue culture of virtually any virus. Infectio can estimate interdependent biological parameters, for example for vaccinia virus infection, and differentiate between cell-cell and cell-free virus spreading. Infectio assists in elucidating virus transmission mechanisms, a feature useful for designing strategies of perturbing or enhancing viral transmission. The complexity of the Infectio software is low compared to that of other software commonly used to quantitate features of cell biological images, which yields stable and relatively error-free output from Infectio. The software is open source (GPLv3 license), and operates on the major platforms (Windows, Mac, and Linux). The complete source code can be downloaded from http://infectio.github.io/index.html. IMPORTANCE Infectio presents a generalized platform to analyze virus infection spread between cells. It allows the simulation of plaque phenotypes from image-based assays. Viral plaques are the result of virus spreading from primary infected cells to neighboring cells. This is a complex process and involves neighborhood effects at cell-cell contact sites or fluid dynamics in the extracellular medium. Infectio differentiates between two major modes of virus transmission between cells, allowing in silico testing of hypotheses about spreading
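
    The distinction between the two transmission modes can be sketched on a toy cell lattice, with cell-cell spread to lattice neighbours and cell-free spread hitting random cells anywhere; the probabilities and lattice size are illustrative, not fitted to imaging data.

        # Toy plaque growth with cell-cell vs cell-free transmission.
        import numpy as np

        rng = np.random.default_rng(7)
        n = 101
        infected = np.zeros((n, n), bool)
        infected[n // 2, n // 2] = True           # primary infected cell

        def step(infected, p_cc=0.3, p_cf=0.002):
            new = infected.copy()
            # cell-cell: each infected cell may infect its 4-neighbourhood
            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                neigh = np.roll(infected, (dx, dy), axis=(0, 1))
                new |= neigh & (rng.random(infected.shape) < p_cc)
            # cell-free: virions in the medium hit random cells anywhere
            n_hits = rng.poisson(p_cf * infected.sum())
            hits = rng.integers(0, n, size=(n_hits, 2))
            new[hits[:, 0], hits[:, 1]] = True
            return new

        for _ in range(40):
            infected = step(infected)
        print("infected cells:", infected.sum())   # compact plaque plus satellites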

  5. KMCLib: A general framework for lattice kinetic Monte Carlo (KMC) simulations

    Science.gov (United States)

    Leetmaa, Mikael; Skorodumova, Natalia V.

    2014-09-01

    KMCLib is a general framework for lattice kinetic Monte Carlo (KMC) simulations. The program can handle simulations of the diffusion and reaction of millions of particles in one, two, or three dimensions, and is designed to be easily extended and customized by the user to allow for the development of complex custom KMC models for specific systems without having to modify the core functionality of the program. Analysis modules and on-the-fly elementary step diffusion rate calculations can be implemented as plugins following a well-defined API. The plugin modules are loosely coupled to the core KMCLib program via the Python scripting language. KMCLib is written as a Python module with a backend C++ library. After initial compilation of the backend library KMCLib is used as a Python module; input to the program is given as a Python script executed using a standard Python interpreter. We give a detailed description of the features and implementation of the code and demonstrate its scaling behavior and parallel performance with a simple one-dimensional A-B-C lattice KMC model and a more complex three-dimensional lattice KMC model of oxygen-vacancy diffusion in a fluorite structured metal oxide. KMCLib can keep track of individual particle movements and includes tools for mean square displacement analysis, and is therefore particularly well suited for studying diffusion processes at surfaces and in solids. Catalogue identifier: AESZ_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AESZ_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU General Public License, version 3 No. of lines in distributed program, including test data, etc.: 49 064 No. of bytes in distributed program, including test data, etc.: 1 575 172 Distribution format: tar.gz Programming language: Python and C++. Computer: Any computer that can run a C++ compiler and a Python interpreter. Operating system: Tested on Ubuntu 12
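
    The plugin mechanism can be illustrated with a toy rate-calculator hook that the KMC core calls back into for every elementary step; the class and method names mirror the idea but are not KMCLib's actual API.

        # Sketch of a pluggable rate calculator driving one rejection-free KMC step.
        import math, random

        random.seed(0)

        class RateCalculatorPlugin:
            """Hook the KMC core calls back into (illustrative, not KMCLib's API)."""
            def rate(self, process, local_env):
                raise NotImplementedError

        class ArrheniusRates(RateCalculatorPlugin):
            def __init__(self, nu=1.0e13, T=500.0):
                self.nu, self.kT = nu, 8.617e-5 * T       # kT in eV
            def rate(self, process, local_env):
                barrier = 0.5 + 0.1 * local_env           # coordination-dependent, eV
                return self.nu * math.exp(-barrier / self.kT)

        def kmc_step(processes, plugin, env):
            """Pick a process with probability proportional to its rate and draw
            the exponential waiting time (standard rejection-free KMC step)."""
            rates = [plugin.rate(p, env) for p in processes]
            total = sum(rates)
            chosen = random.choices(processes, weights=rates)[0]
            return chosen, random.expovariate(total)

        print(kmc_step(["hop_left", "hop_right", "desorb"], ArrheniusRates(), env=2))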

  6. A comprehensive simulation framework for imaging single particles and biomolecules at the European X-ray Free-Electron Laser.

    Science.gov (United States)

    Yoon, Chun Hong; Yurkov, Mikhail V; Schneidmiller, Evgeny A; Samoylova, Liubov; Buzmakov, Alexey; Jurek, Zoltan; Ziaja, Beata; Santra, Robin; Loh, N Duane; Tschentscher, Thomas; Mancuso, Adrian P

    2016-01-01

    The advent of newer, brighter, and more coherent X-ray sources, such as X-ray Free-Electron Lasers (XFELs), represents a tremendous growth in the potential to apply coherent X-rays to determine the structure of materials from the micron-scale down to the Angstrom-scale. There is a significant need for a multi-physics simulation framework to perform source-to-detector simulations for a single particle imaging experiment, including (i) the multidimensional simulation of the X-ray source; (ii) simulation of the wave-optics propagation of the coherent XFEL beams; (iii) atomistic modelling of photon-material interactions; (iv) simulation of the time-dependent diffraction process, including incoherent scattering; (v) assembling noisy and incomplete diffraction intensities into a three-dimensional data set using the Expansion-Maximisation-Compression (EMC) algorithm and (vi) phase retrieval to obtain structural information. We demonstrate the framework by simulating a single-particle experiment for a nitrogenase iron protein using parameters of the SPB/SFX instrument of the European XFEL. This exercise demonstrably yields interpretable consequences for structure determination that are crucial yet currently unavailable for experiment design. PMID:27109208

  7. A comprehensive simulation framework for imaging single particles and biomolecules at the European X-ray Free-Electron Laser

    Science.gov (United States)

    Yoon, Chun Hong; Yurkov, Mikhail V.; Schneidmiller, Evgeny A.; Samoylova, Liubov; Buzmakov, Alexey; Jurek, Zoltan; Ziaja, Beata; Santra, Robin; Loh, N. Duane; Tschentscher, Thomas; Mancuso, Adrian P.

    2016-01-01

    The advent of newer, brighter, and more coherent X-ray sources, such as X-ray Free-Electron Lasers (XFELs), represents a tremendous growth in the potential to apply coherent X-rays to determine the structure of materials from the micron-scale down to the Angstrom-scale. There is a significant need for a multi-physics simulation framework to perform source-to-detector simulations for a single particle imaging experiment, including (i) the multidimensional simulation of the X-ray source; (ii) simulation of the wave-optics propagation of the coherent XFEL beams; (iii) atomistic modelling of photon-material interactions; (iv) simulation of the time-dependent diffraction process, including incoherent scattering; (v) assembling noisy and incomplete diffraction intensities into a three-dimensional data set using the Expansion-Maximisation-Compression (EMC) algorithm and (vi) phase retrieval to obtain structural information. We demonstrate the framework by simulating a single-particle experiment for a nitrogenase iron protein using parameters of the SPB/SFX instrument of the European XFEL. This exercise demonstrably yields interpretable consequences for structure determination that are crucial yet currently unavailable for experiment design. PMID:27109208

  8. A comprehensive simulation framework for imaging single particles and biomolecules at the European X-ray Free-Electron Laser

    Science.gov (United States)

    Yoon, Chun Hong; Yurkov, Mikhail V.; Schneidmiller, Evgeny A.; Samoylova, Liubov; Buzmakov, Alexey; Jurek, Zoltan; Ziaja, Beata; Santra, Robin; Loh, N. Duane; Tschentscher, Thomas; Mancuso, Adrian P.

    2016-04-01

    The advent of newer, brighter, and more coherent X-ray sources, such as X-ray Free-Electron Lasers (XFELs), represents a tremendous growth in the potential to apply coherent X-rays to determine the structure of materials from the micron-scale down to the Angstrom-scale. There is a significant need for a multi-physics simulation framework to perform source-to-detector simulations for a single particle imaging experiment, including (i) the multidimensional simulation of the X-ray source; (ii) simulation of the wave-optics propagation of the coherent XFEL beams; (iii) atomistic modelling of photon-material interactions; (iv) simulation of the time-dependent diffraction process, including incoherent scattering; (v) assembling noisy and incomplete diffraction intensities into a three-dimensional data set using the Expansion-Maximisation-Compression (EMC) algorithm and (vi) phase retrieval to obtain structural information. We demonstrate the framework by simulating a single-particle experiment for a nitrogenase iron protein using parameters of the SPB/SFX instrument of the European XFEL. This exercise demonstrably yields interpretable consequences for structure determination that are crucial yet currently unavailable for experiment design.
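
    Step (iv) of the pipeline can be caricatured in a few lines: the far-field intensity of a toy object, scaled to a mean photon budget and sampled with Poisson statistics to mimic a photon-counting detector. Real calculations are time-dependent and atomistic; this sketch keeps only the geometry and noise.

        # Noisy far-field diffraction pattern of a toy 2D particle.
        import numpy as np

        rng = np.random.default_rng(8)
        n = 128
        x = np.linspace(-1, 1, n)
        X, Y = np.meshgrid(x, x)
        density = ((X**2 + Y**2) < 0.2**2).astype(float)   # toy spherical particle

        amplitude = np.fft.fftshift(np.fft.fft2(density))
        intensity = np.abs(amplitude) ** 2
        intensity *= 1e5 / intensity.sum()                 # ~1e5 photons per shot (assumed)

        pattern = rng.poisson(intensity)                   # photon-counting detector
        print("photons detected:", pattern.sum(), "max per pixel:", pattern.max())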

  9. Demonstrating the Practical Advantages of the Scalable and Interoperable Astronomical Framework FASE: Applications to EUCLID Simulations and LUCIFER Data Reduction

    Science.gov (United States)

    Paioro, L.; Garilli, B.; Franzetti, P.; Fumana, M.; Scodeggio, M.; Grosbøl, P.; Tody, D.; Surace, C.

    2012-09-01

    Over the last years, the European OPTICON Networks 3.6 and 9.2, in collaboration with the Virtual Observatory, have produced a detailed document specifying the requirements and the architecture of a future scalable and interoperable desktop framework for astronomical software (FASE). A first reference implementation of the FASE framework has been developed at INAF-IASF Milano and applied to different projects we are involved in: a) the simulation software developed to study the performance of the EUCLID NISP instrument; b) the LBT LUCIFER instrument reduction pipeline used by the Italian community. An application involving graphical capabilities is also being developed, exploiting FASE facilities. We show how the main architectural concepts of the FASE framework have been successfully applied to the software mentioned above, providing interoperable software that is easy to use and install, equipped with distributed and scalable capabilities. See also Grosbøl et al. (2012).

  10. Towards a framework for teaching about information technology risk in health care: Simulating threats to health data and patient safety

    Directory of Open Access Journals (Sweden)

    Elizabeth M. Borycki

    2015-09-01

    In this paper the author describes work towards developing an integrative framework for educating health information technology professionals about technology risk. The framework considers multiple sources of risk to health data quality and integrity that can result from the use of health information technology (HIT) and can be used to teach health professional students about these risks when using health technologies. This framework encompasses issues and problems that may arise from varied sources, including intentional alterations (e.g. resulting from hacking and security breaches) as well as unintentional breaches and corruption of data (e.g. resulting from technical problems, or from technology-induced errors). The framework that is described has several levels: the level of human factors and usability of HIT, the level of monitoring of security and accuracy, the HIT architectural level, the level of operational and physical checks, the level of healthcare quality assurance policies, and the data risk management strategies level. Approaches to monitoring and simulation of risk are also discussed, including a discussion of an innovative approach to monitoring potential quality issues. This is followed by a discussion of the application (using computer simulations) to educate both students and health information technology professionals about the impact and spread of technology-induced and related types of data errors involving HIT.

  11. A practical proposal for the use of origin destination matrices in the analysis, modeling and simulation framework for traffic management

    OpenAIRE

    Barceló Bugeda, Jaime; Montero Mercadé, Lídia; Bullejos, Manuel; Linares Herreros, Mª Paz

    2013-01-01

    Dealing with traffic demand trip matrices to feed models that support decisions in traffic management is still a problem that deserves research effort, aimed at finding practical solutions applicable in the short-term horizon, before future developments become available. This paper analyzes the role of OD matrices in the framework proposed for Analysis, Modeling and Simulation (AMS), namely when moving from static OD matrices to the time-sliced OD matrices required by AMS applications. The paper reviews...

  12. Environmental Survey preliminary report, Argonne National Laboratory, Argonne, Illinois

    Energy Technology Data Exchange (ETDEWEB)

    1988-11-01

    This report presents the preliminary findings of the first phase of the Environmental Survey of the United States Department of Energy's (DOE) Argonne National Laboratory (ANL), conducted June 15 through 26, 1987. The Survey is being conducted by an interdisciplinary team of environmental specialists, led and managed by the Office of Environment, Safety and Health's Office of Environmental Audit. The team includes outside experts supplied by a private contractor. The objective of the Survey is to identify environmental problems and areas of environmental risk associated with ANL. The Survey covers all environmental media and all areas of environmental regulation. It is being performed in accordance with the DOE Environmental Survey Manual. The on-site phase of the Survey involves the review of existing site environmental data, observations of the operations carried on at ANL, and interviews with site personnel. The Survey team developed a Sampling and Analysis (S&A) Plan to assist in further assessing certain of the environmental problems identified during its on-site activities. The S&A Plan will be executed by the Oak Ridge National Laboratory (ORNL). When completed, the S&A results will be incorporated into the Argonne National Laboratory Environmental Survey findings for inclusion in the Environmental Survey Summary Report. 75 refs., 24 figs., 60 tabs.

  13. Proposed environmental remediation at Argonne National Laboratory, Argonne, Illinois

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-05-01

    The Department of Energy (DOE) has prepared an Environmental Assessment evaluating proposed environmental remediation activity at Argonne National Laboratory-East (ANL-E), Argonne, Illinois. The environmental remediation work would (1) reduce, eliminate, or prevent the release of contaminants from a number of Resource Conservation and Recovery Act (RCRA) Solid Waste Management Units (SWMUs) and two radiologically contaminated sites located in areas contiguous with SWMUs, and (2) decrease the potential for exposure of the public, ANL-E employees, and wildlife to such contaminants. The actions proposed for SWMUs are required to comply with the RCRA corrective action process and corrective action requirements of the Illinois Environmental Protection Agency; the actions proposed are also required to reduce the potential for continued contaminant release. Based on the analysis in the EA, the DOE has determined that the proposed action does not constitute a major federal action significantly affecting the quality of the human environment within the meaning of the National Environmental Policy Act of 1969 (NEPA). Therefore, the preparation of an Environmental Impact Statement is not required.

  14. Proposed environmental remediation at Argonne National Laboratory, Argonne, Illinois

    International Nuclear Information System (INIS)

    The Department of Energy (DOE) has prepared an Environmental Assessment evaluating proposed environmental remediation activity at Argonne National Laboratory-East (ANL-E), Argonne, Illinois. The environmental remediation work would (1) reduce, eliminate, or prevent the release of contaminants from a number of Resource Conservation and Recovery Act (RCRA) Solid Waste Management Units (SWMUs) and two radiologically contaminated sites located in areas contiguous with SWMUs, and (2) decrease the potential for exposure of the public, ANL-E employees, and wildlife to such contaminants. The actions proposed for SWMUs are required to comply with the RCRA corrective action process and corrective action requirements of the Illinois Environmental Protection Agency; the actions proposed are also required to reduce the potential for continued contaminant release. Based on the analysis in the EA, the DOE has determined that the proposed action does not constitute a major federal action significantly affecting the quality of the human environment within the meaning of the National Environmental Policy Act of 1969 (NEPA). Therefore, the preparation of an Environmental Impact Statement is not required

  15. Environmental Survey preliminary report, Argonne National Laboratory, Argonne, Illinois

    International Nuclear Information System (INIS)

    This report presents the preliminary findings of the first phase of the Environmental Survey of the United States Department of Energy's (DOE) Argonne National Laboratory (ANL), conducted June 15 through 26, 1987. The Survey is being conducted by an interdisciplinary team of environmental specialists, led and managed by the Office of Environment, Safety and Health's Office of Environmental Audit. The team includes outside experts supplied by a private contractor. The objective of the Survey is to identify environmental problems and areas of environmental risk associated with ANL. The Survey covers all environmental media and all areas of environmental regulation. It is being performed in accordance with the DOE Environmental Survey Manual. The on-site phase of the Survey involves the review of existing site environmental data, observations of the operations carried on at ANL, and interviews with site personnel. The Survey team developed a Sampling and Analysis (S&A) Plan to assist in further assessing certain of the environmental problems identified during its on-site activities. The S&A Plan will be executed by the Oak Ridge National Laboratory (ORNL). When completed, the S&A results will be incorporated into the Argonne National Laboratory Environmental Survey findings for inclusion in the Environmental Survey Summary Report. 75 refs., 24 figs., 60 tabs.

  16. An update on Argonne's AWA

    International Nuclear Information System (INIS)

    The Argonne Wakefield Accelerator (AWA) is a new research facility which will possess unprecedented research capabilities for the study of wakefields and related areas requiring short, intense electron bunches. The AWA is designed to produce 100 nC, 14 ps (full width) electron bunches at rep rates up to 30 Hz. Phase-1 of the AWA, now under construction, will provide these pulses at 20 MeV for various experiments. Current designs, related research and development, and construction status are presented in this general overview and project update. 6 refs., 4 figs

  17. Argonne National Laboratory 1985 publications

    International Nuclear Information System (INIS)

    This report is a bibliography of scientific and technical 1985 publications of Argonne National Laboratory. Some are ANL contributions to outside organizations' reports published in 1985. This compilation, prepared by the Technical Information Services Technical Publications Section (TPS), lists all nonrestricted 1985 publications submitted to TPS by the Laboratory's Divisions. The report is divided into seven parts: Journal Articles - Listed by first author; ANL Reports - Listed by report number; ANL and non-ANL Unnumbered Reports - Listed by report number; Non-ANL Numbered Reports - Listed by report number; Books and Book Chapters - Listed by first author; Conference Papers - Listed by first author; and Complete Author Index.

  18. Argonne National Laboratory 1985 publications

    Energy Technology Data Exchange (ETDEWEB)

    Kopta, J.A. (ED.); Hale, M.R. (comp.)

    1987-08-01

    This report is a bibliography of scientific and technical 1985 publications of Argonne National Laboratory. Some are ANL contributions to outside organizations' reports published in 1985. This compilation, prepared by the Technical Information Services Technical Publications Section (TPS), lists all nonrestricted 1985 publications submitted to TPS by the Laboratory's Divisions. The report is divided into seven parts: Journal Articles - Listed by first author; ANL Reports - Listed by report number; ANL and non-ANL Unnumbered Reports - Listed by report number; Non-ANL Numbered Reports - Listed by report number; Books and Book Chapters - Listed by first author; Conference Papers - Listed by first author; and Complete Author Index.

  19. High Performance Hybrid RANS-LES Simulation Framework for Turbulent Combusting Flows Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation proposed here is a computational framework for high performance, high fidelity computational fluid dynamics (CFD) to enable accurate, fast and robust...

  20. Advanced Simulation Framework for Design and Analysis of Space Propulsion Systems Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation proposed here is a high-performance, high-fidelity framework in the computational fluid dynamics (CFD) code called Loci-STREAM to enable accurate,...

  1. Advanced Simulation Framework for Design and Analysis of Space Propulsion Systems Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation proposed here is a computational framework for high performance, high fidelity computational fluid dynamics (CFD) to enable accurate, fast and robust...

  2. Framework for real-time forest fire animation: Simulating fire spread using the GPU

    OpenAIRE

    Kjærnet, Øystein

    2010-01-01

    In 2009, Odd Erik Gundersen and Jo Skjermo described a conceptual framework for animating physically based forest fires. This project expands on their ideas, with a focus on how modern graphics hardware can be utilized to achieve real-time performance. A prototype demonstrating some of the concepts suggested for the framework has been implemented and tested, successfully achieving real-time frame rates on a simple animation of a burning tree.
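
    As a rough illustration of the kind of per-cell update that maps naturally onto GPU threads (one thread per cell), the following is a minimal cellular-automaton fire-spread sketch; it is not the thesis implementation, and the states and probabilities are invented.

```python
# Toy stencil-based fire-spread model. States: 0 = unburnt fuel,
# 1 = burning, 2 = burnt out. Edges wrap around (periodic grid).
import numpy as np

def spread_step(state, rng, p_ignite=0.3, p_burnout=0.1):
    """One synchronous update of every cell, as a GPU kernel would do."""
    burning = (state == 1).astype(np.int8)
    # Number of burning 4-neighbours for every cell.
    neighbours = (np.roll(burning, 1, 0) + np.roll(burning, -1, 0) +
                  np.roll(burning, 1, 1) + np.roll(burning, -1, 1))
    # Probability that at least one burning neighbour ignites this cell.
    p_catch = 1.0 - (1.0 - p_ignite) ** neighbours
    ignite = (state == 0) & (rng.random(state.shape) < p_catch)
    burnout = (state == 1) & (rng.random(state.shape) < p_burnout)
    out = state.copy()
    out[ignite] = 1      # fuel catches fire
    out[burnout] = 2     # burning cell burns out
    return out

rng = np.random.default_rng(42)
grid = np.zeros((128, 128), dtype=np.int8)
grid[64, 64] = 1         # single ignition point
for _ in range(100):
    grid = spread_step(grid, rng)
print((grid == 2).sum(), "cells burnt out")
```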

  3. A General Simulation Framework for Supply Chain Modeling: State of the Art and Case Study

    CERN Document Server

    Cimino, Antonio; Mirabelli, Giovanni

    2010-01-01

    Nowadays there is a large availability of discrete event simulation software that can be easily used in different domains: from industry to supply chain, from healthcare to business management, from training to complex systems design. Simulation engines of commercial discrete event simulation software use specific rules and logics for simulation time and events management. Difficulties and limitations come up when commercial discrete event simulation software is used for modeling complex real-world systems (i.e. supply chains, industrial plants). The objective of this paper is twofold: first, a state of the art on commercial discrete event simulation software and an overview of discrete event simulation model development using general purpose programming languages are presented; then a Supply Chain Order Performance Simulator (SCOPS, developed in C++) for investigating the inventory management problem along the supply chain under different supply chain scenarios is proposed to readers.

  4. A General Simulation Framework for Supply Chain Modeling: State of the Art and Case Study

    Directory of Open Access Journals (Sweden)

    Antonio Cimino

    2010-03-01

    Nowadays there is a large availability of discrete event simulation software that can be easily used in different domains: from industry to supply chain, from healthcare to business management, from training to complex systems design. Simulation engines of commercial discrete event simulation software use specific rules and logics for simulation time and events management. Difficulties and limitations come up when commercial discrete event simulation software is used for modeling complex real-world systems (i.e. supply chains, industrial plants). The objective of this paper is twofold: first, a state of the art on commercial discrete event simulation software and an overview of discrete event simulation model development using general purpose programming languages are presented; then a Supply Chain Order Performance Simulator (SCOPS, developed in C++) for investigating the inventory management problem along the supply chain under different supply chain scenarios is proposed to readers.
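
    To make the notion of a discrete event simulation engine concrete, the following is a minimal sketch of the event-queue core such software builds on (SCOPS itself is written in C++; Python is used here for brevity, and the toy order/fulfilment model is invented).

```python
# Minimal discrete-event simulation core: events are (time, sequence,
# callback) tuples kept in a priority queue, executed in time order.
import heapq
import itertools

class Simulator:
    def __init__(self):
        self.now = 0.0
        self._queue = []
        self._ids = itertools.count()   # tie-breaker for simultaneous events

    def schedule(self, delay, callback):
        heapq.heappush(self._queue, (self.now + delay, next(self._ids), callback))

    def run(self, until):
        while self._queue and self._queue[0][0] <= until:
            self.now, _, callback = heapq.heappop(self._queue)
            callback()

# Toy model: a customer order arrives every 2 time units, filled 1 unit later.
sim = Simulator()
def order():
    print(f"t={sim.now:4.1f}  order placed")
    sim.schedule(1.0, lambda: print(f"t={sim.now:4.1f}  order filled"))
    sim.schedule(2.0, order)
sim.schedule(0.0, order)
sim.run(until=10.0)
```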

  5. Run-Time Interoperability Between Neuronal Network Simulators Based on the MUSIC Framework

    OpenAIRE

    Djurfeldt, Mikael; Hjorth, Johannes,; Eppler, Jochen M; Dudani, Niraj; Helias, Moritz; Potjans, Tobias C.; Bhalla, Upinder S; Diesmann, Markus; Hellgren Kotaleski, Jeanette; Ekeberg, Örjan

    2010-01-01

    MUSIC is a standard API allowing large scale neuron simulators to exchange data within a parallel computer during runtime. A pilot implementation of this API has been released as open source. We provide experiences from the implementation of MUSIC interfaces for two neuronal network simulators of different kinds, NEST and MOOSE. A multi-simulation of a cortico-striatal network model involving both simulators is performed, demonstrating how MUSIC can promote inter-operability between models wr...

  6. NEVESIM: Event-Driven Neural Simulation Framework with a Python Interface

    OpenAIRE

    Dejan ePecevski; David eKappel; Zeno eJonke

    2014-01-01

    NEVESIM is a software package for event-driven simulation of networks of spiking neurons with a fast simulation core in C++, and a scripting user interface in the Python programming language. It supports simulation of heterogeneous networks with different types of neurons and synapses, and can be easily extended by the user with new neuron and synapse types. To enable heterogeneous networks and extensibility, NEVESIM is designed to decouple the simulation logic of communicating events (spikes...

  7. NEVESIM: event-driven neural simulation framework with a Python interface

    OpenAIRE

    Pecevski, Dejan; Kappel, David; Jonke, Zeno

    2014-01-01

    NEVESIM is a software package for event-driven simulation of networks of spiking neurons with a fast simulation core in C++, and a scripting user interface in the Python programming language. It supports simulation of heterogeneous networks with different types of neurons and synapses, and can be easily extended by the user with new neuron and synapse types. To enable heterogeneous networks and extensibility, NEVESIM is designed to decouple the simulation logic of communicating events (spikes...
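
    A minimal sketch of the event-driven principle described above (this is not the NEVESIM API): neuron state is advanced analytically only when a spike event arrives, rather than on a fixed time grid. The network, constants, and input spikes are invented.

```python
# Event-driven leaky integrate-and-fire toy network: membrane decay between
# events is applied lazily, and spikes propagate through an event queue.
import heapq
import math

THRESHOLD, TAU, WEIGHT, DELAY = 1.0, 20.0, 0.6, 1.0

class Neuron:
    def __init__(self):
        self.v = 0.0          # membrane potential at last update
        self.last_t = 0.0     # time of last update
        self.targets = []     # indices of downstream neurons

    def receive(self, t):
        """Update state lazily at spike arrival; return True if the neuron fires."""
        self.v = self.v * math.exp(-(t - self.last_t) / TAU) + WEIGHT
        self.last_t = t
        if self.v >= THRESHOLD:
            self.v = 0.0      # reset after firing
            return True
        return False

neurons = [Neuron() for _ in range(3)]
neurons[0].targets = [1, 2]
neurons[1].targets = [2]

# Event queue of (time, target index), seeded with external input to neuron 0.
events = [(t, 0) for t in (1.0, 2.0, 3.0, 4.0)]
heapq.heapify(events)
while events:
    t, i = heapq.heappop(events)
    if neurons[i].receive(t):
        print(f"neuron {i} spikes at t={t}")
        for j in neurons[i].targets:
            heapq.heappush(events, (t + DELAY, j))
```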

  8. The Trick Simulation Toolkit: A NASA/Open source Framework for Running Time Based Physics Models

    Science.gov (United States)

    Penn, John M.; Lin, Alexander S.

    2016-01-01

    This paper describes the design and use of the Trick Simulation Toolkit, a simulation development environment for creating high fidelity training and engineering simulations at the NASA Johnson Space Center and many other NASA facilities. It describes Trick's design goals and how the development environment attempts to achieve those goals. It describes how Trick is used in some of the many training and engineering simulations at NASA. Finally, it describes the Trick NASA/Open source project on Github.

  9. Monte Carlo simulation of the scanner rSPECT using GAMOS: a Geant4-based framework

    International Nuclear Information System (INIS)

    The molecular imaging of cellular processes in vivo using preclinical animal studies and the SPECT technique is one of the main reasons for the design of new devices with high spatial resolution. As an auxiliary tool, Monte Carlo simulation has allowed the characterization and optimization of such medical imaging systems. GAMOS (Geant4-based Architecture for Medicine-Oriented Simulations) has been proven a powerful and effective toolkit for reproducing experimental data obtained with PET (Positron Emission Tomography) systems. This work aims to demonstrate the potential of this new simulation framework to generate reliable simulated data using its SPECT (Single Photon Emission Tomography) applications package. For this purpose, a novel installation dedicated to preclinical studies with rodents, rSPECT, has been simulated. The study comprises the collimation and detection geometries and the spatial distribution and activity of the source, in correspondence with experimental measurements. Studies were done using 99mTc, a 20% energy window, and two collimators: 1. hexagonal parallel holes and 2. pinhole. Performance evaluation of the facility focused on calculating spatial resolution and sensitivity as functions of source-collimator distance. Simulated values were compared with experimental ones. A micro-Derenzo phantom was recreated in order to carry out tomographic reconstruction using the Single Slice ReBinning (SSRB) algorithm. It was concluded that the simulation shows good agreement with experimental data, which demonstrates the feasibility of GAMOS for reproducing SPECT data. (Author)

  10. The Trick Simulation Toolkit: A NASA/Opensource Framework for Running Time Based Physics Models

    Science.gov (United States)

    Penn, John M.

    2016-01-01

    The Trick Simulation Toolkit is a simulation development environment used to create high fidelity training and engineering simulations at the NASA Johnson Space Center and many other NASA facilities. Its purpose is to generate a simulation executable from a collection of user-supplied models and a simulation definition file. For each Trick-based simulation, Trick automatically provides job scheduling, numerical integration, the ability to write and restore human readable checkpoints, data recording, interactive variable manipulation, a run-time interpreter, and many other commonly needed capabilities. This allows simulation developers to concentrate on their domain expertise and the algorithms and equations of their models. Also included in Trick are tools for plotting recorded data and various other supporting utilities and libraries. Trick is written in C/C++ and Java and supports both Linux and MacOSX computer operating systems. This paper describes Trick's design and use at NASA Johnson Space Center.

  11. Water adsorption and proton conduction in metal-organic frameworks: Insights from molecular simulations

    Science.gov (United States)

    Paesani, Francesco

    2014-03-01

    Metal-organic frameworks (MOFs) are a relatively new class of porous materials that hold great potential for a wide range of applications in chemistry, materials science, and nanoengineering. Compared to other porous materials such as zeolites, MOF properties are highly tunable. In particular, it has been shown that both the size and shape of the MOF pores can be rationally designed for specific applications. For example, the ability to modify the framework properties with respect to hydrophilicity/hydrophobicity and acidity/basicity can enable direct control of proton conduction through carrier molecules adsorbed inside the pores. Here, I report on our current efforts aimed at providing a molecular-level characterization of water-mediated proton conduction through the MOF pores. Particular emphasis will be put on the correlation between proton conduction and both the structural and chemical properties of the frameworks, as well as on the dynamical behavior of water confined in the MOF pores. NSF award number: DMR-130510

  12. Study of the response and photon-counting resolution of silicon photomultipliers using a generic simulation framework

    CERN Document Server

    Eckert, Patrick; Schultz-Coulon, Hans-Christian

    2012-01-01

    We present a generic framework for the simulation of Silicon Photomultipliers (SiPMs) which enables detailed modelling of the SiPM response using basic SiPM parameters and geometry as an input. Depending on the specified SiPM properties which can be determined from basic characterisation measurements, the simulation generates the signal charge and pulse shape for arbitrary incident light pulse distributions. The simulation has been validated in the whole dynamic range for a Hamamatsu S10362-11-100C MPPC and was used to study the effect of different noise sources like optical cross-talk and after-pulsing on the response curve and the photon-counting resolution.
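
    The following is a hedged sketch of a statistical SiPM response model in the spirit of the abstract (not the authors' framework): Poisson photon statistics thinned by the photon detection efficiency, optical cross-talk as a branching process, and Gaussian gain smearing. All parameter values are invented.

```python
# Toy SiPM response Monte Carlo: returns total charge per light pulse,
# expressed in units of single fired cells.
import numpy as np

def sipm_charge(mean_photons, pde=0.35, p_crosstalk=0.15, gain_sigma=0.1,
                n_events=10000, seed=1):
    rng = np.random.default_rng(seed)
    # Primary fired cells: Poisson photons thinned by detection efficiency.
    primaries = rng.poisson(mean_photons * pde, size=n_events)
    # Optical cross-talk as a branching process: each fired cell may
    # trigger a further cell with probability p_crosstalk.
    fired = primaries.astype(np.int64)
    generation = primaries
    while generation.any():
        generation = rng.binomial(generation, p_crosstalk)
        fired += generation
    # Gain non-uniformity smears the summed single-cell charges.
    return fired + rng.normal(0.0, gain_sigma, size=n_events) * np.sqrt(fired)

charges = sipm_charge(mean_photons=20)
print(f"mean charge: {charges.mean():.2f} cells, rms: {charges.std():.2f}")
```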

  13. Hydro-without-hydro framework for simulations of black hole-neutron star binaries

    International Nuclear Information System (INIS)

    We introduce a computational framework which avoids solving explicitly hydrodynamic equations and is suitable for studying the pre-merger evolution of black hole-neutron star binary systems. The essence of the method consists of constructing a neutron star model with a black hole companion and freezing the internal degrees of freedom of the neutron star during the course of the evolution of the spacetime geometry. We present the main ingredients of the framework, from the formulation of the problem to the appropriate computational techniques to study these binary systems. In addition, we present numerical results of the construction of initial data sets and evolutions that demonstrate the feasibility of this approach

  14. Design of a Model Execution Framework: Repetitive Object-Oriented Simulation Environment (ROSE)

    Science.gov (United States)

    Gray, Justin S.; Briggs, Jeffery L.

    2008-01-01

    The ROSE framework was designed to facilitate complex system analyses. It completely divorces the model execution process from the model itself. By doing so, ROSE frees the modeler to develop a library of standard modeling processes, such as Design of Experiments, optimizers, parameter studies, and sensitivity studies, which can then be applied to any of their available models. The ROSE framework accomplishes this by means of a well-defined API and object structure. Both the API and the object structure are presented here with enough detail to implement ROSE in any object-oriented language or modeling tool.
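
    A minimal sketch of the central idea, with all names invented: any object exposing run(inputs) -> outputs can be driven by any generic execution process, so processes such as parameter studies are written once and reused across models.

```python
# Illustration of divorcing the execution process from the model: the
# parameter study knows nothing about what the model computes.
from typing import Protocol

class Model(Protocol):
    def run(self, inputs: dict) -> dict: ...

def parameter_study(model: Model, name: str, values, base_inputs: dict):
    """A reusable execution process: sweep one input, collect all outputs."""
    results = []
    for v in values:
        inputs = dict(base_inputs, **{name: v})
        results.append((v, model.run(inputs)))
    return results

class NozzleModel:   # stand-in model; the process never looks inside it
    def run(self, inputs):
        return {"thrust": inputs["pressure"] * inputs["area"]}

for p, out in parameter_study(NozzleModel(), "pressure", [1.0, 2.0, 3.0],
                              {"pressure": 1.0, "area": 0.5}):
    print(p, out)
```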

  15. Sensitivity of Surface Flux Simulations to Hydrologic Parameters Based on an Uncertainty Quantification Framework Applied to the Community Land Model

    Energy Technology Data Exchange (ETDEWEB)

    Hou, Zhangshuan; Huang, Maoyi; Leung, Lai-Yung R.; Lin, Guang; Ricciuto, Daniel M.

    2012-08-10

    Uncertainties in hydrologic parameters could have significant impacts on the simulated water and energy fluxes and land surface states, which will in turn affect atmospheric processes and the carbon cycle. Quantifying such uncertainties is an important step toward better understanding and quantification of the uncertainty of integrated earth system models. In this paper, we introduce an uncertainty quantification (UQ) framework to analyze the sensitivity of simulated surface fluxes to selected hydrologic parameters in the Community Land Model (CLM4) through forward modeling. Thirteen flux tower footprints spanning a wide range of climate and site conditions were selected to perform sensitivity analyses by perturbing the parameters identified. In the UQ framework, prior information about the parameters was used to quantify the input uncertainty using the Minimum-Relative-Entropy approach. The quasi-Monte Carlo approach was applied to generate samples of parameters on the basis of the prior pdfs. Simulations corresponding to sampled parameter sets were used to generate response curves and response surfaces, and statistical tests were used to rank the significance of the parameters for output responses including latent heat (LH) and sensible heat (SH) fluxes. Overall, the CLM4-simulated LH and SH show the largest sensitivity to subsurface runoff generation parameters. However, study sites with deep-rooted vegetation are also affected by surface runoff parameters, while sites with shallow root zones are also sensitive to the vadose zone soil water parameters. Generally, sites with finer soil texture and shallower rooting systems tend to have larger sensitivity of outputs to the parameters. Our results suggest the necessity of and possible ways for parameter inversion/calibration using available measurements of latent/sensible heat fluxes to obtain the optimal parameter set for CLM4. This study also provided guidance on reduction of parameter set dimensionality and parameter
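
    As a hedged illustration of the sampling side of such a UQ workflow, the sketch below draws quasi-Monte Carlo samples and scores parameter sensitivity with a rank correlation; the parameter names, bounds, and stand-in model are invented and are not the CLM4 study's actual values.

```python
# Quasi-Monte Carlo parameter sampling plus a simple sensitivity ranking.
# Requires SciPy >= 1.7 for scipy.stats.qmc.
import numpy as np
from scipy.stats import qmc, spearmanr

names = ["f_drain", "q_max", "b_slope"]          # made-up hydrologic parameters
lower, upper = [0.1, 1e-5, 2.0], [5.0, 1e-2, 12.0]

sampler = qmc.Sobol(d=len(names), scramble=True, seed=7)
unit = sampler.random_base2(m=8)                 # 2**8 = 256 points in [0, 1)^d
params = qmc.scale(unit, lower, upper)

def model(p):                                    # stand-in for a full model run
    return 2.0 * p[0] + 50.0 * p[1] + 0.1 * p[2] ** 2

latent_heat = np.array([model(p) for p in params])
for i, name in enumerate(names):
    rho, _ = spearmanr(params[:, i], latent_heat)
    print(f"{name:8s} rank correlation with LH: {rho:+.2f}")
```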

  16. A new numerical framework for simulating the control of weather and climate on the evolution of soil-mantled hillslopes

    Science.gov (United States)

    Bovy, Benoît; Braun, Jean; Demoulin, Alain

    2016-06-01

    We present a new numerical framework for simulating short- to long-term hillslope evolution. This modeling framework, to which we have given the name CLICHE (CLImate Control on Hillslope Evolution), aims to better capture the control of climate on soil dynamics. It allows the use of realistic forcing that involves, through a specific time discretization scheme, the variability of both temperature and precipitation at time scales ranging from daily rainfall events to the climatic oscillations of the Quaternary, including seasonal variability. Two simple models of soil temperature and soil water balance link the climatic inputs to derived quantities that enter the computation of the soil flux, such as the surface water discharge and the depth of the non-frozen soil layer. Using this framework together with a multi-process parameterization of soil transport, we apply an original method to calculate hillslope effective diffusivity as a function of climate. This allows us to demonstrate the ability of the model to simulate observed rates of hillslope erosion under different climates (cold and temperate) with a single set of parameter values. Numerical experiments furthermore suggest a pronounced peak in sediment transport on hillslopes during the glacial-interglacial transitions of the Quaternary. We finally discuss the need to improve the parameterization of the soil production and transport processes in order to explicitly account for other key controlling factors that are also climate-sensitive, such as biological activity.
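
    A minimal sketch of the diffusion backbone such a model rests on (this is not the CLICHE code): elevation evolves as dz/dt = -dq/dx with soil flux q = -D dz/dx, where the effective diffusivity D is supplied as a function of climate. Grid, time step, and forcing are invented.

```python
# Explicit finite-difference soil diffusion on a 1-D hillslope profile,
# with a climate-dependent effective diffusivity.
import numpy as np

def evolve(z, dx, dt, n_steps, diffusivity):
    """March the profile forward; boundary elevations are held fixed."""
    for step in range(n_steps):
        D = diffusivity(step * dt)                 # climate forcing enters here
        assert D * dt / dx**2 <= 0.5, "explicit stability limit violated"
        flux = -D * np.diff(z) / dx                # soil flux at cell interfaces
        z[1:-1] -= dt * np.diff(flux) / dx         # dz/dt = -dq/dx in the interior
    return z

x = np.linspace(0.0, 100.0, 101)                   # 1 m spacing
z = 10.0 * np.exp(-((x - 50.0) / 15.0) ** 2)       # initial ridge, in metres
# Toy forcing: diffusivity oscillates between 'cold' and 'temperate' values.
climate_D = lambda t: 0.005 + 0.004 * np.sin(2 * np.pi * t / 1e4)   # m^2/yr
z = evolve(z, dx=1.0, dt=50.0, n_steps=2000, diffusivity=climate_D)
```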

  17. Moose: An Open-Source Framework to Enable Rapid Development of Collaborative, Multi-Scale, Multi-Physics Simulation Tools

    Science.gov (United States)

    Slaughter, A. E.; Permann, C.; Peterson, J. W.; Gaston, D.; Andrs, D.; Miller, J.

    2014-12-01

    The Idaho National Laboratory (INL)-developed Multiphysics Object Oriented Simulation Environment (MOOSE; www.mooseframework.org), is an open-source, parallel computational framework for enabling the solution of complex, fully implicit multiphysics systems. MOOSE provides a set of computational tools that scientists and engineers can use to create sophisticated multiphysics simulations. Applications built using MOOSE have computed solutions for chemical reaction and transport equations, computational fluid dynamics, solid mechanics, heat conduction, mesoscale materials modeling, geomechanics, and others. To facilitate the coupling of diverse and highly-coupled physical systems, MOOSE employs the Jacobian-free Newton-Krylov (JFNK) method when solving the coupled nonlinear systems of equations arising in multiphysics applications. The MOOSE framework is written in C++, and leverages other high-quality, open-source scientific software packages such as LibMesh, Hypre, and PETSc. MOOSE uses a "hybrid parallel" model which combines both shared memory (thread-based) and distributed memory (MPI-based) parallelism to ensure efficient resource utilization on a wide range of computational hardware. MOOSE-based applications are inherently modular, which allows for simulation expansion (via coupling of additional physics modules) and the creation of multi-scale simulations. Any application developed with MOOSE supports running (in parallel) any other MOOSE-based application. Each application can be developed independently, yet easily communicate with other applications (e.g., conductivity in a slope-scale model could be a constant input, or a complete phase-field micro-structure simulation) without additional code being written. This method of development has proven effective at INL and expedites the development of sophisticated, sustainable, and collaborative simulation tools.
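
    A hedged sketch of the JFNK idea mentioned above (not MOOSE code, which is C++; Python is used for brevity): the Krylov solver needs only Jacobian-vector products, which are approximated by finite differences of the residual, so the Jacobian matrix is never assembled. The toy system is invented.

```python
# Jacobian-free Newton-Krylov: Newton's method where each linear solve uses
# GMRES with a matrix-free finite-difference Jacobian-vector product.
import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres

def jfnk(residual, u0, n_newton=20, tol=1e-10, eps=1e-7):
    u = u0.astype(float)
    for _ in range(n_newton):
        F = residual(u)
        if np.linalg.norm(F) < tol:
            break
        # Matrix-free J*v via a first-order finite difference of the residual.
        matvec = lambda v: (residual(u + eps * v) - F) / eps
        J = LinearOperator((u.size, u.size), matvec=matvec)
        du, info = gmres(J, -F)
        u = u + du
    return u

# Toy coupled nonlinear system: x^2 + y = 3, x + y^2 = 5 (solution x=1, y=2).
res = lambda u: np.array([u[0]**2 + u[1] - 3.0, u[0] + u[1]**2 - 5.0])
print(jfnk(res, np.array([1.0, 1.0])))
```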

  18. Constrained multi-global optimization using a penalty stretched simulated annealing framework

    OpenAIRE

    Pereira, Ana I.; Edite M.G.P. Fernandes

    2009-01-01

    This paper presents a new simulated annealing algorithm to solve constrained multi-global optimization problems. To compute all global solutions in a sequential manner, we combine the function stretching technique with the adaptive simulated annealing variant. Constraint-handling is carried out through a nondifferentiable penalty function. To benchmark our penalty stretched simulated annealing algorithm we solve a set of well-known problems. Our preliminary numerical results show that the alg...
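
    A minimal sketch of the basic ingredients named in the abstract, i.e. simulated annealing driven by a penalized objective; the stretching step that locates further global minima is omitted, and the test function, constraint, and constants are invented.

```python
# Simulated annealing with a nondifferentiable penalty for constraints g(x) <= 0.
import math
import random

def penalized(f, constraints, x, mu=100.0):
    """Objective plus a penalty proportional to total constraint violation."""
    return f(x) + mu * sum(max(0.0, g(x)) for g in constraints)

def anneal(f, constraints, x0, t0=1.0, cooling=0.995, n_iter=5000, step=0.5):
    random.seed(3)
    x, fx, t = x0, penalized(f, constraints, x0), t0
    for _ in range(n_iter):
        y = x + random.uniform(-step, step)
        fy = penalized(f, constraints, y)
        # Metropolis acceptance: always downhill, sometimes uphill.
        if fy < fx or random.random() < math.exp(-(fy - fx) / t):
            x, fx = y, fy
        t *= cooling
    return x, fx

# Minimize a multimodal function subject to x >= 1 (written as 1 - x <= 0).
f = lambda x: math.sin(3.0 * x) + 0.1 * (x - 2.0) ** 2
x_best, f_best = anneal(f, [lambda x: 1.0 - x], x0=4.0)
print(f"x* = {x_best:.3f}, penalized f = {f_best:.3f}")
```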

  19. Experimental spectra analysis in THM with the help of simulation based on Geant4 framework

    CERN Document Server

    Li, Chengbo; Zhou, Shuhua; Fu, Yuanyong; Zhou, Jing; Meng, Qiuying; Jiang, Zongjun; Wang, Xiaolian

    2014-01-01

    The Coulomb barrier and electron screening cause difficulties in directly measuring nuclear reaction cross sections of charged particles at astrophysical energies. The Trojan-horse method has been introduced as a powerful indirect tool to overcome these difficulties. In order to better understand the experimental spectra, Geant4 is employed to simulate the method for the first time. The validity and reliability of the simulation are examined by comparing the experimental data with the simulated results. The Geant4 simulation can give useful information for understanding the experimental spectra in data analysis and is beneficial to the design of future related experiments.

  20. A Modeling Framework for Supply Chain Simulation: Opportunities for Improved Decision Making

    NARCIS (Netherlands)

    Zee, van der D.J.; Vorst, van der J.G.A.J.

    2005-01-01

    Owing to its inherent modeling flexibility, simulation is often regarded as the proper means for supporting decision making on supply chain design. The ultimate success of supply chain simulation, however, is determined by a combination of the analyst's skills, the chain members' involvement, and th

  1. A Framework for the Interactive Handling of High-Dimensional Simulation Data in Complex Geometries

    KAUST Repository

    Benzina, Amal

    2013-01-01

    Flow simulations around building infrastructure models involve large scale complex geometries, which when discretized in adequate detail entail high computational cost. Moreover, tasks such as simulation insight by steering or optimization require many such costly simulations. In this paper, we illustrate the whole pipeline of an integrated solution for interactive computational steering, developed for complex flow simulation scenarios that depend on a moderate number of both geometric and physical parameters. A mesh generator takes building information model input data and outputs a valid cartesian discretization. A sparse-grids-based surrogate model—a less costly substitute for the parameterized simulation—uses precomputed data to deliver approximated simulation results at interactive rates. Furthermore, a distributed multi-display visualization environment shows building infrastructure together with flow data. The focus is set on scalability and intuitive user interaction.

  2. A conceptual framework for using Doppler radar acquired atmospheric data for flight simulation

    Science.gov (United States)

    Campbell, W.

    1983-01-01

    A concept is presented which can permit turbulence simulation in the vicinity of microbursts. The method involves a large data base, but should be fast enough for use with flight simulators. The model permits any pilot to simulate any flight maneuver in any aircraft. The model simulates a wind field with three-component mean winds and three-component turbulent gusts, and gust variation over the body of an aircraft so that all aerodynamic loads and moments can be calculated. The time and space variation of mean winds and turbulent intensities associated with a particular atmospheric phenomenon such as a microburst is used in the model. In fact, Doppler radar data such as provided by JAWS is uniquely suited for use with the proposed model. The concept is completely general and is not restricted to microburst studies. Reentry and flight in terrestrial or planetary atmospheres could be realistically simulated if supporting data of sufficient resolution were available.

  3. Just-in-time Data Analytics and Visualization of Climate Simulations using the Bellerophon Framework

    Science.gov (United States)

    Anantharaj, V. G.; Venzke, J.; Lingerfelt, E.; Messer, B.

    2015-12-01

    Climate model simulations are used to understand the evolution and variability of earth's climate. Unfortunately, high-resolution multi-decadal climate simulations can take days to weeks to complete. Typically, the simulation results are not analyzed until the model runs have ended. During the course of the simulation, the output may be processed periodically to ensure that the model is performing as expected. However, most of the data analytics and visualization are not performed until the simulation is finished. The lengthy time period needed for the completion of the simulation constrains the productivity of climate scientists. Our implementation of near real-time data visualization and analytics capabilities allows scientists to monitor the progress of their simulations while the model is running. Our analytics software executes concurrently in a co-scheduling mode, monitoring data production. When new data are generated by the simulation, a co-scheduled data analytics job is submitted to render visualization artifacts of the latest results. These visualization outputs are automatically transferred to Bellerophon's data server located at ORNL's Compute and Data Environment for Science (CADES), where they are processed and archived into Bellerophon's database. During the course of the experiment, climate scientists can then use Bellerophon's graphical user interface to view animated plots and their associated metadata. The quick turnaround from the start of the simulation until the data are analyzed permits research decisions and projections to be made days or sometimes even weeks sooner than otherwise possible. The supercomputer resources used to run the simulation are unaffected by co-scheduling the data visualization jobs, so the model runs continuously while the data are visualized. Our just-in-time data visualization software looks to increase climate scientists' productivity as climate modeling moves into the exascale era of computing.

  4. A Discrete Event Simulation Framework for Utility Accrual Scheduling Algorithm in Uniprocessor Environment

    Directory of Open Access Journals (Sweden)

    Idawaty Ahmad

    2011-01-01

    Problem statement: The heterogeneity in the choice of simulation platforms for real-time scheduling stands behind the difficulty of developing a common simulation environment. A Discrete Event Simulation (DES) for a real-time scheduling domain encompassing event definition, a time advancing mechanism, and a scheduler has yet to be developed. Approach: The study focused on the design and development of an event-based discrete event simulator for the existing General Utility Scheduling (GUS) algorithm, to facilitate the reuse of the algorithm under a common simulation environment. GUS is one of the existing TUF/UA scheduling algorithms that consider the Time/Utility Function (TUF) of the executed tasks in their scheduling decisions. The scheduling optimality criterion is based on maximizing the utility accrued from the execution of all tasks in the system; this criterion is named Utility Accrual (UA). The TUF/UA scheduling algorithms are designed for adaptive real-time system environments. The developed GUS simulator derives its set of parameters, events, performance metrics, and other TUF/UA scheduling elements from a detailed analysis of the base model. Results: The Accrued Utility Ratio (AUR) is investigated and compared to the benchmark of the modeled domain. Successful deployment of the GUS simulator was demonstrated by the generated results. Conclusion: Extensive performance analysis of GUS can be carried out using the developed simulator with low computational overhead. Further enhancements would extend the simulator with more detailed performance metrics together with a fault tolerance mechanism to support reliable real-time application domains.
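
    A toy sketch of the TUF/UA idea (this is not the GUS simulator): each task carries a time/utility function, the scheduler greedily runs the pending task with the highest potential utility density, and the AUR is the accrued utility divided by the maximum attainable utility. Task data are invented.

```python
# Greedy utility-accrual scheduling on a uniprocessor, with step-shaped
# time/utility functions (full utility until a deadline, zero afterwards).
def step_tuf(deadline, utility):
    return lambda t: utility if t <= deadline else 0.0

tasks = [  # (name, execution time, TUF, maximum utility)
    ("A", 3.0, step_tuf(4.0, 10.0), 10.0),
    ("B", 2.0, step_tuf(5.0, 8.0), 8.0),
    ("C", 4.0, step_tuf(6.0, 12.0), 12.0),
]

now, accrued = 0.0, 0.0
pending = list(tasks)
while pending:
    # Potential utility density if the task were run next, to completion.
    density = lambda task: task[2](now + task[1]) / task[1]
    best = max(pending, key=density)
    pending.remove(best)
    now += best[1]
    gain = best[2](now)
    accrued += gain
    print(f"t={now:4.1f}  ran {best[0]}  utility {gain:.1f}")

aur = accrued / sum(t[3] for t in tasks)
print(f"Accrued Utility Ratio = {aur:.2f}")
```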

  5. Argonne National Laboratory 1986 publications

    International Nuclear Information System (INIS)

    This report is a bibliography of scientific and technical 1986 publications of Argonne National Laboratory. Some are ANL contributions to outside organizations' reports published in 1986. This compilation, prepared by the Technical Information Services Technical Publications Section (TPS), lists all nonrestricted 1986 publications submitted to TPS by the Laboratory's Divisions. Author indexes list ANL authors only. If a first author is not an ANL employee, an asterisk in the bibliographic citation indicates the first ANL author. The report is divided into seven parts: Journal Articles -- Listed by first author; ANL Reports -- Listed by report number; ANL and non-ANL Unnumbered Reports -- Listed by report number; Non-ANL Numbered Reports -- Listed by report number; Books and Book Chapters -- Listed by first author; Conference Papers -- Listed by first author; and Complete Author Index

  6. Argonne National Laboratory 1986 publications

    Energy Technology Data Exchange (ETDEWEB)

    Kopta, J.A.; Springer, C.J.

    1987-12-01

    This report is a bibliography of scientific and technical 1986 publications of Argonne National Laboratory. Some are ANL contributions to outside organizations' reports published in 1986. This compilation, prepared by the Technical Information Services Technical Publications Section (TPS), lists all nonrestricted 1986 publications submitted to TPS by the Laboratory's Divisions. Author indexes list ANL authors only. If a first author is not an ANL employee, an asterisk in the bibliographic citation indicates the first ANL author. The report is divided into seven parts: Journal Articles -- Listed by first author; ANL Reports -- Listed by report number; ANL and non-ANL Unnumbered Reports -- Listed by report number; Non-ANL Numbered Reports -- Listed by report number; Books and Book Chapters -- Listed by first author; Conference Papers -- Listed by first author; and Complete Author Index.

  7. Status on the Development of a Modeling and Simulation Framework for the Economic Assessment of Nuclear Hybrid Energy Systems

    International Nuclear Information System (INIS)

    An effort to design and build a modeling and simulation framework to assess the economic viability of Nuclear Hybrid Energy Systems (NHES) was undertaken in fiscal year 2015 (FY15). The purpose of this report is to document the various tasks associated with the development of such a framework and to provide a status on its progress. Several tasks have been accomplished. First, starting from a simulation strategy, a rigorous mathematical formulation has been achieved in which the economic optimization of a Nuclear Hybrid Energy System is presented as a constrained robust (under uncertainty) optimization problem. Some possible algorithms for the solution of the optimization problem are presented. A variation of the Simultaneous Perturbation Stochastic Approximation algorithm has been implemented in RAVEN and preliminary tests have been performed. The development of the software infrastructure to support the simulation of the whole NHES has also moved forward. The coupling between RAVEN and an implementation of the Modelica language (OpenModelica) has been implemented, migrated under several operating systems and tested using an adapted model of a desalination plant. In particular, this exercise was focused on testing the coupling of the different code systems; testing parallel, computationally expensive simulations on the INL cluster; and providing a proof of concept for the possibility of using surrogate models to represent the different NHES subsystems. Another important step was the porting of the RAVEN code under the Windows™ operating system. This accomplishment makes RAVEN compatible with the development environment that is being used for dynamic simulation of NHES components. A very simplified model of a NHES on the electric market has been built in RAVEN to confirm expectations on the analysis capability of RAVEN to provide insight into system economics and to test the capability of RAVEN to identify limit surfaces even for stochastic constraints. This

  8. Status on the Development of a Modeling and Simulation Framework for the Economic Assessment of Nuclear Hybrid Energy Systems

    Energy Technology Data Exchange (ETDEWEB)

    Bragg-Sitton, Shannon Michelle [Idaho National Lab. (INL), Idaho Falls, ID (United States); Rabiti, Cristian [Idaho National Lab. (INL), Idaho Falls, ID (United States); Kinoshita, Robert Arthur [Idaho National Lab. (INL), Idaho Falls, ID (United States); Kim, Jong Suk [Idaho National Lab. (INL), Idaho Falls, ID (United States); Deason, Wesley Ray [Idaho National Lab. (INL), Idaho Falls, ID (United States); Boardman, Richard Doin [Idaho National Lab. (INL), Idaho Falls, ID (United States); Garcia, Humberto E. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-09-01

    An effort to design and build a modeling and simulation framework to assess the economic viability of Nuclear Hybrid Energy Systems (NHES) was undertaken in fiscal year 2015 (FY15). The purpose of this report is to document the various tasks associated with the development of such a framework and to provide a status on its progress. Several tasks have been accomplished. First, starting from a simulation strategy, a rigorous mathematical formulation has been achieved in which the economic optimization of a Nuclear Hybrid Energy System is presented as a constrained robust (under uncertainty) optimization problem. Some possible algorithms for the solution of the optimization problem are presented. A variation of the Simultaneous Perturbation Stochastic Approximation algorithm has been implemented in RAVEN and preliminary tests have been performed. The development of the software infrastructure to support the simulation of the whole NHES has also moved forward. The coupling between RAVEN and an implementation of the Modelica language (OpenModelica) has been implemented, migrated under several operating systems and tested using an adapted model of a desalination plant. In particular, this exercise was focused on testing the coupling of the different code systems; testing parallel, computationally expensive simulations on the INL cluster; and providing a proof of concept for the possibility of using surrogate models to represent the different NHES subsystems. Another important step was the porting of the RAVEN code under the Windows™ operating system. This accomplishment makes RAVEN compatible with the development environment that is being used for dynamic simulation of NHES components. A very simplified model of a NHES on the electric market has been built in RAVEN to confirm expectations on the analysis capability of RAVEN to provide insight into system economics and to test the capability of RAVEN to identify limit surfaces even for stochastic constraints. This
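
    For reference, the textbook form of the Simultaneous Perturbation Stochastic Approximation algorithm named above looks as follows (the report implements a variation inside RAVEN; this sketch, with an invented noisy objective, is only illustrative): one gradient estimate costs two objective evaluations regardless of dimension.

```python
# Textbook SPSA: simultaneous random perturbation of all coordinates.
import numpy as np

def spsa(f, x0, n_iter=500, a=0.1, c=0.1, alpha=0.602, gamma=0.101, seed=11):
    rng = np.random.default_rng(seed)
    x = x0.astype(float)
    for k in range(1, n_iter + 1):
        a_k = a / k ** alpha                          # decaying step size
        c_k = c / k ** gamma                          # decaying perturbation size
        delta = rng.choice([-1.0, 1.0], size=x.size)  # Rademacher perturbation
        # Two-sided gradient estimate from just two function evaluations.
        g = (f(x + c_k * delta) - f(x - c_k * delta)) / (2.0 * c_k) / delta
        x = x - a_k * g
    return x

# Toy noisy objective, minimized at (3, -1), standing in for a system cost.
def cost(x):
    return (x[0] - 3.0) ** 2 + (x[1] + 1.0) ** 2 + np.random.normal(0.0, 0.01)

print(spsa(cost, np.array([0.0, 0.0])))
```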

  9. A Monte Carlo simulation framework for electron beam dose calculations using Varian phase space files for TrueBeam Linacs

    Energy Technology Data Exchange (ETDEWEB)

    Rodrigues, Anna; Yin, Fang-Fang; Wu, Qiuwen, E-mail: Qiuwen.Wu@Duke.edu [Department of Radiation Oncology, Duke University Medical Center, Durham, North Carolina 27710 and Medical Physics Graduate Program, Duke University Medical Center, Durham, North Carolina 27705 (United States); Sawkey, Daren [Varian Medical Systems, Palo Alto, California 94304 (United States)

    2015-05-15

    Purpose: To develop a framework for accurate electron Monte Carlo dose calculation. In this study, comprehensive validations of vendor provided electron beam phase space files for Varian TrueBeam Linacs against measurement data are presented. Methods: In this framework, the Monte Carlo generated phase space files were provided by the vendor and used as input to the downstream plan-specific simulations including jaws, electron applicators, and water phantom computed in the EGSnrc environment. The phase space files were generated based on open field commissioning data. A subset of electron energies of 6, 9, 12, 16, and 20 MeV and open and collimated field sizes 3 × 3, 4 × 4, 5 × 5, 6 × 6, 10 × 10, 15 × 15, 20 × 20, and 25 × 25 cm2 were evaluated. Measurements acquired with a CC13 cylindrical ionization chamber and electron diode detector and simulations from this framework were compared for a water phantom geometry. The evaluation metrics include percent depth dose, orthogonal and diagonal profiles at depths R100, R50, Rp, and Rp+ for standard and extended source-to-surface distances (SSD), as well as cone and cut-out output factors. Results: Agreement for the percent depth dose and orthogonal profiles between measurement and Monte Carlo was generally within 2% or 1 mm. The largest discrepancies were observed within depths of 5 mm from the phantom surface. Differences in field size, penumbra, and flatness for the orthogonal profiles at depths R100, R50, and Rp were within 1 mm, 1 mm, and 2%, respectively. Orthogonal profiles at SSDs of 100 and 120 cm showed the same level of agreement. Cone and cut-out output factors agreed well with maximum differences within 2.5% for 6 MeV and 1% for all other energies. Cone output factors at extended SSDs of 105, 110, 115, and 120 cm exhibited similar levels of agreement. Conclusions: We have presented a Monte Carlo simulation framework for electron beam dose calculations for

  10. A Monte Carlo simulation framework for electron beam dose calculations using Varian phase space files for TrueBeam Linacs

    International Nuclear Information System (INIS)

    Purpose: To develop a framework for accurate electron Monte Carlo dose calculation. In this study, comprehensive validations of vendor provided electron beam phase space files for Varian TrueBeam Linacs against measurement data are presented. Methods: In this framework, the Monte Carlo generated phase space files were provided by the vendor and used as input to the downstream plan-specific simulations including jaws, electron applicators, and water phantom computed in the EGSnrc environment. The phase space files were generated based on open field commissioning data. A subset of electron energies of 6, 9, 12, 16, and 20 MeV and open and collimated field sizes 3 × 3, 4 × 4, 5 × 5, 6 × 6, 10 × 10, 15 × 15, 20 × 20, and 25 × 25 cm2 were evaluated. Measurements acquired with a CC13 cylindrical ionization chamber and electron diode detector and simulations from this framework were compared for a water phantom geometry. The evaluation metrics include percent depth dose, orthogonal and diagonal profiles at depths R100, R50, Rp, and Rp+ for standard and extended source-to-surface distances (SSD), as well as cone and cut-out output factors. Results: Agreement for the percent depth dose and orthogonal profiles between measurement and Monte Carlo was generally within 2% or 1 mm. The largest discrepancies were observed within depths of 5 mm from phantom surface. Differences in field size, penumbra, and flatness for the orthogonal profiles at depths R100, R50, and Rp were within 1 mm, 1 mm, and 2%, respectively. Orthogonal profiles at SSDs of 100 and 120 cm showed the same level of agreement. Cone and cut-out output factors agreed well with maximum differences within 2.5% for 6 MeV and 1% for all other energies. Cone output factors at extended SSDs of 105, 110, 115, and 120 cm exhibited similar levels of agreement. Conclusions: We have presented a Monte Carlo simulation framework for electron beam dose calculations for Varian TrueBeam Linacs. Electron beam energies of 6
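
    As a hedged illustration of the depth-dose metrics quoted in these records, the sketch below extracts R100 and R50 from a depth-dose curve; the curve is synthetic and the code is not the authors' analysis.

```python
# Extract simple electron depth-dose metrics from a (depth, dose) curve:
# R100 is the depth of maximum dose; R50 the distal depth of the 50% level.
import numpy as np

def pdd_metrics(depth_cm, dose):
    pdd = 100.0 * dose / dose.max()
    i_max = int(np.argmax(pdd))
    r100 = depth_cm[i_max]
    # Search the falloff region beyond the maximum for the 50% crossing.
    distal_d, distal_p = depth_cm[i_max:], pdd[i_max:]
    r50 = float(np.interp(50.0, distal_p[::-1], distal_d[::-1]))
    return r100, r50

# Toy electron-like percent depth dose: build-up followed by a steep falloff.
z = np.linspace(0.0, 8.0, 161)
dose = np.exp(-((z - 2.8) / 1.6) ** 2)
r100, r50 = pdd_metrics(z, dose)
print(f"R100 = {r100:.2f} cm, R50 = {r50:.2f} cm")
```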

  11. Monte Carlo simulation of inverse geometry x-ray fluoroscopy using a modified MC-GPU framework

    Science.gov (United States)

    Dunkerley, David A. P.; Tomkowiak, Michael T.; Slagowski, Jordan M.; McCabe, Bradley P.; Funk, Tobias; Speidel, Michael A.

    2015-03-01

    Scanning-Beam Digital X-ray (SBDX) is a technology for low-dose fluoroscopy that employs inverse geometry x-ray beam scanning. To assist with rapid modeling of inverse geometry x-ray systems, we have developed a Monte Carlo (MC) simulation tool based on the MC-GPU framework. MC-GPU version 1.3 was modified to implement a 2D array of focal spot positions on a plane, with individually adjustable x-ray outputs, each producing a narrow x-ray beam directed toward a stationary photon-counting detector array. Geometric accuracy and blurring behavior in tomosynthesis reconstructions were evaluated from simulated images of a 3D arrangement of spheres. The artifact spread function from simulation agreed with experiment to within 1.6% (rRMSD). Detected x-ray scatter fraction was simulated for two SBDX detector geometries and compared to experiments. For the current SBDX prototype (10.6 cm wide by 5.3 cm tall detector), x-ray scatter fraction measured 2.8-6.4% (18.6-31.5 cm acrylic, 100 kV), versus 2.2-5.0% in MC simulation. Experimental trends in scatter versus detector size and phantom thickness were observed in simulation. For dose evaluation, an anthropomorphic phantom was imaged using regular and regional adaptive exposure (RAE) scanning. The reduction in kerma-area-product resulting from RAE scanning was 45% in radiochromic film measurements, versus 46% in simulation. The integral kerma calculated from TLD measurement points within the phantom was 57% lower when using RAE, versus 61% lower in simulation. This MC tool may be used to estimate tomographic blur, detected scatter, and dose distributions when developing inverse geometry x-ray systems.

  12. Push technology at Argonne National Laboratory.

    Energy Technology Data Exchange (ETDEWEB)

    Noel, R. E.; Woell, Y. N.

    1999-04-06

    Selective dissemination of information (SDI) services, also referred to as current awareness searches, are usually provided by periodically running computer programs (personal profiles) against a cumulative database or databases. This concept of pushing relevant content to users has long been integral to librarianship. Librarians traditionally turned to information companies to implement these searches for their users in business, academia, and the science community. This paper describes how a push technology was implemented on a large scale for scientists and engineers at Argonne National Laboratory, explains some of the challenges to designers/maintainers, and identifies the positive effects that SDI seems to be having on users. Argonne purchases the Institute for Scientific Information (ISI) Current Contents data (all subject areas except Humanities), and scientists no longer need to turn to outside companies for reliable SDI service. Argonne's database and its customized services are known as ACCESS (Argonne-University of Chicago Current Contents Electronic Search Service).
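
    A toy sketch of the SDI mechanism described above, with invented profiles and records: stored keyword profiles are run against each new batch of bibliographic data, and matches are pushed to the subscriber.

```python
# Minimal keyword-profile matching of the kind SDI services automate.
profiles = {
    "jsmith": {"wakefield", "accelerator"},
    "tlee": {"neutron", "scattering"},
}

new_records = [
    {"title": "Wakefield acceleration in plasma channels", "id": 101},
    {"title": "Small-angle neutron scattering of polymers", "id": 102},
    {"title": "Review of battery electrolytes", "id": 103},
]

def match(profile_terms, record):
    words = set(record["title"].lower().split())
    return bool(profile_terms & words)

for user, terms in profiles.items():
    for r in (r for r in new_records if match(terms, r)):
        # Stand-in for e-mailing the alert to the subscriber.
        print(f"push to {user}: [{r['id']}] {r['title']}")
```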

  13. GridPACK™ : A Framework for Developing Power Grid Simulations on High-Performance Computing Platforms

    Energy Technology Data Exchange (ETDEWEB)

    Palmer, Bruce J.; Perkins, William A.; Chen, Yousu; Jin, Shuangshuang; Callahan, David; Glass, Kevin A.; Diao, Ruisheng; Rice, Mark J.; Elbert, Stephen T.; Vallem, Mallikarjuna R.; Huang, Zhenyu

    2016-05-01

    This paper describes the GridPACK™ framework, which is designed to help power grid engineers develop modeling software capable of running on high performance computers. The framework makes extensive use of software templates to provide high level functionality while at the same time allowing developers the freedom to express whatever models and algorithms they are using. GridPACK™ contains modules for setting up distributed power grid networks, assigning buses and branches with arbitrary behaviors to the network, creating distributed matrices and vectors and using parallel linear and non-linear solvers to solve algebraic equations. It also provides mappers to create matrices and vectors based on properties of the network and functionality to support IO and to mana

  14. RunMC - an object-oriented analysis framework for Monte Carlo simulation of high-energy particle collisions

    CERN Document Server

    Chekanov, S

    2005-01-01

    RunMC is an object-oriented framework aimed at generating and analysing high-energy collisions of elementary particles using Monte Carlo simulations. This package, based on C++ (adopted by CERN as the main programming language for the LHC experiments), provides a common interface to different Monte Carlo models using modern physics libraries. Physics calculations (projects) can easily be loaded and saved as external modules. This simplifies the development of complicated calculations for high energy physics in large collaborations. This desktop program is open-source licensed and is available on the Linux and Windows/Cygwin platforms.

  15. A model-based approach for bridging virtual and physical sensor nodes in a hybrid simulation framework.

    Science.gov (United States)

    Mozumdar, Mohammad; Song, Zhen Yu; Lavagno, Luciano; Sangiovanni-Vincentelli, Alberto L

    2014-01-01

    The Model Based Design (MBD) approach is a popular trend to speed up application development of embedded systems; it uses high-level abstractions to capture functional requirements in an executable manner and automates implementation code generation. Wireless Sensor Networks (WSNs) are an emerging and very promising application area for embedded systems. However, there is a lack of tools in this area that would allow an application developer to model a WSN application using high-level abstractions, simulate it mapped to a multi-node scenario for functional analysis, and finally use the refined model to automatically generate code for different WSN platforms. Motivated by this idea, in this paper we present a hybrid simulation framework that not only follows the MBD approach for WSN application development, but also interconnects a simulated sub-network with a physical sub-network and then allows one to co-simulate them, which is also known as Hardware-In-the-Loop (HIL) simulation. PMID:24960083

  16. A Model-Based Approach for Bridging Virtual and Physical Sensor Nodes in a Hybrid Simulation Framework

    Directory of Open Access Journals (Sweden)

    Mohammad Mozumdar

    2014-06-01

    Full Text Available The Model Based Design (MBD) approach is a popular trend to speed up application development of embedded systems; it uses high-level abstractions to capture functional requirements in an executable manner and automates implementation code generation. Wireless Sensor Networks (WSNs) are an emerging and very promising application area for embedded systems. However, there is a lack of tools in this area that would allow an application developer to model a WSN application using high-level abstractions, simulate it mapped to a multi-node scenario for functional analysis, and finally use the refined model to automatically generate code for different WSN platforms. Motivated by this idea, in this paper we present a hybrid simulation framework that not only follows the MBD approach for WSN application development, but also interconnects a simulated sub-network with a physical sub-network and then allows one to co-simulate them, which is also known as Hardware-In-the-Loop (HIL) simulation.

  17. GridLAB-D: An Agent-Based Simulation Framework for Smart Grids

    Directory of Open Access Journals (Sweden)

    David P. Chassin

    2014-01-01

    Full Text Available Simulation of smart grid technologies requires a fundamentally new approach to integrated modeling of power systems, energy markets, building technologies, and the plethora of other resources and assets that are becoming part of modern electricity production, delivery, and consumption systems. As a result, the US Department of Energy’s Office of Electricity commissioned the development of a new type of power system simulation tool called GridLAB-D that uses an agent-based approach to simulating smart grids. This paper presents the numerical methods and approach to time-series simulation used by GridLAB-D and reviews applications in power system studies, market design, building control system design, and integration of wind power in a smart grid.

  18. Autonomic, Agent-Based Simulation Management (A2SM) Framework Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Large scale numerical simulations, as typified by climate models, space weather models, and the like, typically involve non-linear governing equations in...

  19. A Framework for Parallel Numerical Simulations on Multi-Scale Geometries

    KAUST Repository

    Varduhn, Vasco

    2012-06-01

    In this paper, an approach to performing numerical multi-scale simulations on finely detailed geometries is presented. In particular, the focus lies on the generation of sufficiently fine mesh representations, where a resolution of dozens of millions of voxels is inevitable in order to represent the geometry adequately. Furthermore, the propagation of boundary conditions is investigated by using simulation results from the coarser simulation scale as input boundary conditions on the next finer scale. Finally, the applicability of our approach is shown on a two-phase simulation of flooding scenarios in urban structures, running from a city-wide scale to a finely detailed indoor scale on feature-rich building geometries. © 2012 IEEE.

  20. GridLAB-D: An Agent-Based Simulation Framework for Smart Grids

    Energy Technology Data Exchange (ETDEWEB)

    Chassin, David P.; Fuller, Jason C.; Djilali, Ned

    2014-06-23

    Simulation of smart grid technologies requires a fundamentally new approach to integrated modeling of power systems, energy markets, building technologies, and the plethora of other resources and assets that are becoming part of modern electricity production, delivery, and consumption systems. As a result, the US Department of Energy’s Office of Electricity commissioned the development of a new type of power system simulation tool called GridLAB-D that uses an agent-based approach to simulating smart grids. This paper presents the numerical methods and approach to time-series simulation used by GridLAB-D and reviews applications in power system studies, market design, building control system design, and integration of wind power in a smart grid.
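
    A toy sketch of the agent-based, time-series idea (greatly simplified relative to GridLAB-D's actual thermal and market models; all parameters are invented): each house agent advances a thermostat state machine every timestep, and the feeder load is the aggregate.

        import random

        class House:
            def __init__(self, setpoint=21.0, deadband=1.0, hvac_kw=3.5):
                self.temp = setpoint + random.uniform(-1.0, 1.0)
                self.setpoint, self.deadband, self.hvac_kw = setpoint, deadband, hvac_kw
                self.heating = False

            def step(self, t_out, dt_h=0.25):
                # bang-bang thermostat with a leaky first-order thermal model
                if self.temp < self.setpoint - self.deadband:
                    self.heating = True
                elif self.temp > self.setpoint + self.deadband:
                    self.heating = False
                gain = 4.0 * dt_h if self.heating else 0.0
                self.temp += gain + (t_out - self.temp) * 0.1 * dt_h
                return self.hvac_kw if self.heating else 0.0

        houses = [House() for _ in range(100)]
        for step in range(24 * 4):  # one day at 15-minute resolution
            load_kw = sum(h.step(t_out=5.0) for h in houses)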

  1. A Virtual Simulation Environment for Lunar Rover: Framework and Key Technologies

    Directory of Open Access Journals (Sweden)

    Yan-chun Yang

    2008-11-01

    Full Text Available Lunar rover development involves a large amount of validation work under realistic operational conditions, covering both the mechanical subsystem and the on-board software. Real tests require an equipped rover platform and realistic terrain, which is time consuming and costly. To improve development efficiency, a rover simulation environment called RSVE, which affords real-time capabilities with high fidelity, has been developed. It uses the fractional Brownian motion (fBm) technique and statistical properties to generate the lunar surface, so various terrain models for simulation can be generated by changing a few parameters. To simulate a lunar rover evolving on natural, unstructured terrain with high realism, the full dynamics of the multi-body system and its complex interactions with soft ground are integrated in this environment. Testing of a path planning algorithm and a control algorithm in this environment is presented as an example. The simulation environment runs on PCs or Silicon Graphics workstations.
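
    The fBm terrain idea can be sketched as follows (a minimal illustration under assumed parameters, not RSVE's actual generator): white noise is shaped by a power-law spectrum whose slope is set by the Hurst exponent H, and changing H or the seed yields different terrains.

        import numpy as np

        def fbm_terrain(n=256, H=0.8, seed=0):
            rng = np.random.default_rng(seed)
            noise = rng.standard_normal((n, n))
            kx = np.fft.fftfreq(n)[:, None]
            ky = np.fft.fftfreq(n)[None, :]
            k = np.sqrt(kx**2 + ky**2)
            k[0, 0] = 1.0                   # avoid division by zero at DC
            amp = k ** (-(H + 1.0))         # power spectrum ~ k^-(2H+2) for a 2D fBm surface
            z = np.real(np.fft.ifft2(np.fft.fft2(noise) * amp))
            return (z - z.min()) / (z.max() - z.min())  # normalized heightmap

        heightmap = fbm_terrain()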

  2. A technical framework to describe occupant behavior for building energy simulations

    Energy Technology Data Exchange (ETDEWEB)

    Turner, William; Hong, Tianzhen

    2013-12-20

    Green buildings that fail to meet expected design performance criteria indicate that technology alone does not guarantee high performance. Human influences are quite often simplified and ignored in the design, construction, and operation of buildings. Energy-conscious human behavior has been demonstrated to be a significant positive factor for improving the indoor environment while reducing the energy use of buildings. In our study we developed a new technical framework to describe energy-related human behavior in buildings. The energy-related behavior accounts for individuals and groups of occupants and their interactions with building energy services systems, appliances, and facilities. The technical framework consists of four key components: (i) the drivers behind energy-related occupant behavior, which are biological, societal, environmental, physical, and economic in nature; (ii) the needs of the occupants, based on satisfying criteria that are either physical (e.g., thermal, visual, and acoustic comfort) or non-physical (e.g., entertainment, privacy, and social reward); (iii) the actions that building occupants perform when their needs are not fulfilled; and (iv) the systems with which an occupant can interact to satisfy their needs. The technical framework aims to provide a standardized description of a complete set of human energy-related behaviors in the form of an XML schema. For each type of behavior (e.g., occupants opening/closing windows, switching lights on/off, etc.) we identify a set of common behaviors based on a literature review, survey data, and our own field study and analysis. Stochastic models are adopted or developed for each type of behavior to enable the evaluation of the impact of human behavior on energy use in buildings, during either the design or operation phase. We will also demonstrate the use of the technical framework in assessing the impact of occupant behavior on energy-saving technologies. The technical framework presented is
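
    As an example of the kind of stochastic model the framework catalogs, window-opening behavior might be expressed as a logistic driver-to-action probability; the functional form and coefficients below are illustrative assumptions, not values from the paper or its XML schema.

        import math, random

        def p_open_window(t_indoor_c, a=-10.0, b=0.4):
            # logistic driver -> action model: warmer rooms make opening more likely
            return 1.0 / (1.0 + math.exp(-(a + b * t_indoor_c)))

        def simulate_occupant(temps):
            window_open = False
            states = []
            for t in temps:
                if not window_open and random.random() < p_open_window(t):
                    window_open = True   # comfort need not met -> action taken
                states.append(window_open)
            return states

        print(simulate_occupant([22, 24, 26, 28, 30]))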

  3. GridLAB-D: An Agent-Based Simulation Framework for Smart Grids

    OpenAIRE

    Chassin, David P.; Jason C. Fuller; Ned Djilali

    2014-01-01

    Simulation of smart grid technologies requires a fundamentally new approach to integrated modeling of power systems, energy markets, building technologies and the plethora of other resources and assets that are becoming part of modern electricity production, delivery and consumption systems. As a result, the US Department of Energy's Office of Electricity commissioned the development of a new type of power system simulation tool called GridLAB-D(TM) that uses an agent-based approach to simula...

  4. A Framework for System-level Modeling and Simulation of Embedded Systems Architectures

    OpenAIRE

    Erbas Cagkan; Pimentel AndyD; Thompson Mark; Polstra Simon

    2007-01-01

    The high complexity of modern embedded systems impels designers of such systems to model and simulate system components and their interactions in the early design stages. It is therefore essential to develop good tools for exploring a wide range of design choices at these early stages, where the design space is very large. This paper provides an overview of our system-level modeling and simulation environment, Sesame, which aims at efficient design space exploration of embedded multimedia sy...

  5. DAMNED: A Distributed and Multithreaded Neural Event-Driven simulation framework

    OpenAIRE

    Mouraud, Anthony; Puzenat, Didier; Paugam-Moisy, Hélène

    2005-01-01

    In Spiking Neural Networks (SNNs), spike emissions are sparsely and irregularly distributed both in time and across the network architecture. Since a characteristic feature of SNNs is low average activity, efficient implementations of SNNs are usually based on Event-Driven Simulation (EDS). On the other hand, simulations of large-scale neural networks can take advantage of distributing the neurons over a set of processors (either a workstation cluster or a parallel computer). This article presents DAMNE...

  6. An optimization framework for modeling and simulation of dynamic systems based on AIS

    OpenAIRE

    Leung, CSK; Lau, HYK

    2011-01-01

    Modeling and simulation can be used in many contexts for gaining insights into the functioning, performance, and operation of complex systems. However, this method alone often produces solutions that are feasible under certain operating conditions of a system but may not be optimal. This is inevitably inadequate in circumstances where optimality is required. In this respect, an approach to effectively evaluate and optimize system performance is to couple the simulation model with ...

  7. Development of a modelling and simulation method comparison and selection: Framework for health services management

    OpenAIRE

    Naseer Aisha; Harper Paul; Eldabi Tillal; Morris Zoe; Jun Gyuchan T; Patel Brijesh; Clarkson John P

    2011-01-01

    Abstract Background There is an increasing recognition that modelling and simulation can assist in the process of designing health care policies, strategies and operations. However, the current use is limited and answers to questions such as what methods to use and when remain somewhat underdeveloped. Aim The aim of this study is to provide a mechanism for decision makers in health services planning and management to compare a broad range of modelling and simulation methods so that they can b...

  8. The IDES framework: A case study in development of a parallel discrete-event simulation system

    Energy Technology Data Exchange (ETDEWEB)

    Nicol, D.M. [Dartmouth Coll., Hanover, NH (United States). Dept. of Computer Science; Johnson, M.M.; Yoshimura, A.S. [Sandia National Labs., Livermore, CA (United States)

    1997-12-31

    This tutorial describes considerations in the design and development of the IDES parallel simulation system. IDES is a Java-based parallel/distributed simulation system designed to support the study of complex large-scale enterprise systems. Using the IDES system as an example, the authors discuss how anticipated model and system constraints molded the design decisions with respect to modeling, synchronization, and communication strategies.
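
    For context, the core that such systems parallelize is a time-ordered event queue; the single-threaded sketch below (a hypothetical API, not IDES's actual Java classes) shows the kernel whose synchronization across processors is the hard part.

        import heapq

        class EventQueue:
            def __init__(self):
                self._q, self._n = [], 0
            def schedule(self, time, action):
                # sequence number breaks ties between simultaneous events
                heapq.heappush(self._q, (time, self._n, action)); self._n += 1
            def run(self, until):
                while self._q and self._q[0][0] <= until:
                    time, _, action = heapq.heappop(self._q)
                    action(time, self)

        def arrival(t, q):
            print(f"arrival at t={t:.2f}")
            q.schedule(t + 1.5, arrival)  # schedule the next arrival

        q = EventQueue()
        q.schedule(0.0, arrival)
        q.run(until=5.0)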

  9. A framework for embedding molecular-level information in continuum-scale simulations of interfacial flows

    Science.gov (United States)

    Smith, Edward; Theodorakis, Panagiotis; Muller, Erich; Craster, Richard; Matar, Omar

    2015-11-01

    Molecular dynamics provides a means of resolving the contact-line paradox. The price to pay for this insight is computational, with droplet simulations limited to the nanoscale. In order to model problems of engineering interest, the molecular contact line must be abstracted and included as part of a continuum scale simulation. Coupling, using dynamic molecular simulation in place of empirical or approximate closure relations, provides a means of doing just this. Molecular simulation of two phase Couette flow can reproduce the key features of the moving contact line. This sheared liquid bridge has the advantage that a steady state can be obtained, providing an unlimited source of data for statistical analysis. In this talk, we will present highlights from molecular dynamics simulation of the moving contact line. Using interface tracking, the dynamics of the contact line are examined, with results compared to published experimental studies. Good agreement is observed despite the difference in scale between the molecular model and experiments. Potential applications of this method are discussed, including coupled simulation which incorporates the molecular detail for surfactant-driven spreading. EPSRC Platform Grant (MACIPh) EP/L020564/1.

  10. A Conceptual framework of Strategy, Structure and Innovative Behaviour for the Development of a Dynamic Simulation Model

    Science.gov (United States)

    Konstantopoulos, Nikolaos; Trivellas, Panagiotis; Reklitis, Panagiotis

    2007-12-01

    According to many researchers of organizational theory, a great number of problems encountered by manufacturing firms are due to their failure to foster innovative behaviour by aligning business strategy and structure. From this point of view, the fit between strategy and structure is essential to facilitate firms' innovative behaviour. In the present paper, we adopt Porter's typology to operationalise business strategy (cost leadership, innovative and marketing differentiation, and focus). Organizational structure is built on four dimensions (centralization, formalization, complexity, and employees' initiatives to implement new ideas). Innovativeness is measured as product innovation and process and technological innovation. This study provides the necessary theoretical framework for the development of a dynamic simulation method, although simulating social events is quite a difficult task, considering that there are so many alternatives (not all well understood).

  11. Validation of a small-animal PET simulation using GAMOS: a GEANT4-based framework

    Science.gov (United States)

    Cañadas, M.; Arce, P.; Rato Mendes, P.

    2011-01-01

    Monte Carlo-based modelling is a powerful tool to help in the design and optimization of positron emission tomography (PET) systems. The performance of these systems depends on several parameters, such as detector physical characteristics, shielding or electronics, whose effects can be studied on the basis of realistic simulated data. The aim of this paper is to validate a comprehensive study of the Raytest ClearPET small-animal PET scanner using a new Monte Carlo simulation platform which has been developed at CIEMAT (Madrid, Spain), called GAMOS (GEANT4-based Architecture for Medicine-Oriented Simulations). This toolkit, based on the GEANT4 code, was originally designed to cover multiple applications in the field of medical physics from radiotherapy to nuclear medicine, but has since been applied by some of its users in other fields of physics, such as neutron shielding, space physics, high energy physics, etc. Our simulation model includes the relevant characteristics of the ClearPET system, namely, the double layer of scintillator crystals in phoswich configuration, the rotating gantry, the presence of intrinsic radioactivity in the crystals or the storage of single events for an off-line coincidence sorting. Simulated results are contrasted with experimental acquisitions including studies of spatial resolution, sensitivity, scatter fraction and count rates in accordance with the National Electrical Manufacturers Association (NEMA) NU 4-2008 protocol. Spatial resolution results showed a discrepancy between simulated and measured values equal to 8.4% (with a maximum FWHM difference over all measurement directions of 0.5 mm). Sensitivity results differ less than 1% for a 250-750 keV energy window. Simulated and measured count rates agree well within a wide range of activities, including under electronic saturation of the system (the measured peak of total coincidences, for the mouse-sized phantom, was 250.8 kcps reached at 0.95 MBq mL-1 and the simulated peak was

  12. Thermal large Eddy simulations and experiments in the framework of non-isothermal blowing

    International Nuclear Information System (INIS)

    The aim of this work is to study thermal large-eddy simulations and to determine the impact of non-isothermal blowing on a turbulent boundary layer. An experimental study is also carried out in order to complete and validate the simulation results. First, we developed a turbulent inlet condition for the velocity and the temperature, which is necessary for the blowing simulations. We studied the asymptotic behavior of the velocity, the temperature, and the turbulent thermal fluxes from a large-eddy simulation point of view. We then considered dynamic models for the eddy diffusivity and simulated a turbulent channel flow with imposed temperature, imposed flux, and adiabatic walls. The numerical and experimental study of blowing yielded the modifications of a thermal turbulent boundary layer as a function of the blowing rate. We observed the consequences of blowing on the mean and rms profiles of velocity and temperature, but also on the velocity-velocity and velocity-temperature correlations. Moreover, we noticed a growth of the turbulent structures in the boundary layer with blowing. (author)

  13. Simulation-based Modeling Frameworks for Networked Multi-processor System-on-Chip

    DEFF Research Database (Denmark)

    Mahadevan, Shankar

    2006-01-01

    This thesis deals with modeling aspects of multi-processor system-on-chip (MpSoC) design affected by the on-chip interconnect, also called the Network-on-Chip (NoC), at various levels of abstraction. To begin with, we undertook a comprehensive survey of research and design practices of networked MpSoCs ... -based frameworks, namely ARTS and RIPE, that allow modelling of hardware (computation time, power consumption, network latency, caching effects, etc.) and software (application partitioning and mapping, operating system scheduling, interrupt handling, etc.) aspects from system-level to cycle-true abstraction. Thereby, we...

  14. A hybrid local/non-local framework for the simulation of damage and fracture

    KAUST Repository

    Azdoud, Yan

    2014-01-01

    Recent advances in non-local continuum models, notably peridynamics, have spurred a paradigm shift in solid mechanics simulation by allowing accurate mathematical representation of singularities and discontinuities. This doctoral work attempts to extend the use of this theory to a community more familiar with local continuum models. In this communication, a coupling strategy - the morphing method - which bridges local and non-local models, is presented. This thesis employs the morphing method to ease use of the non-local model to represent problems with failure-induced discontinuities. First, we give a quick review of strategies for the simulation of discrete degradation and suggest a hybrid local/non-local alternative. Second, we present the technical concepts involved in the morphing method and evaluate the quality of the coupling. Third, we develop a numerical tool for the simulation of the hybrid model for fracture and damage and demonstrate its capabilities on numerical model examples.

  15. CO 2 adsorption in mono-, di- and trivalent cation-exchanged metal-organic frameworks: A molecular simulation study

    KAUST Repository

    Chen, Yifei

    2012-02-28

    A molecular simulation study is reported for CO2 adsorption in rho zeolite-like metal-organic framework (rho-ZMOF) exchanged with a series of cations (Na+, K+, Rb+, Cs+, Mg2+, Ca2+, and Al3+). The isosteric heat and Henry's constant at infinite dilution increase monotonically with increasing charge-to-diameter ratio of the cation (Cs+ < Rb+ < K+ < Na+ < Ca2+ < Mg2+ < Al3+). At low pressures, cations act as preferential adsorption sites for CO2 and the capacity follows the charge-to-diameter ratio. However, the free volume of the framework becomes predominant with increasing pressure, and Mg-rho-ZMOF appears to possess the highest saturation capacity. The equilibrium locations of cations are observed to shift slightly upon CO2 adsorption. Furthermore, the adsorption selectivity of the CO2/H2 mixture increases as Cs+ < Rb+ < K+ < Na+ < Ca2+ < Mg2+ ≈ Al3+. At ambient conditions, the selectivity is in the range of 800-3000, significantly higher than in other nanoporous materials. In the presence of 0.1% H2O, the selectivity decreases drastically because of the competitive adsorption between H2O and CO2, and shows a similar value in all of the cation-exchanged rho-ZMOFs. This simulation study provides microscopic insight into the important role of cations in governing gas adsorption and separation, and suggests that the performance of ionic rho-ZMOF can be tailored by cations. © 2012 American Chemical Society.
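
    The selectivity quoted above is the standard adsorption selectivity for a binary mixture; as a quick reference (with placeholder loadings rather than the paper's data), for an equimolar bulk gas:

        def adsorption_selectivity(x_co2, x_h2, y_co2, y_h2):
            # S = (x_CO2 / x_H2) / (y_CO2 / y_H2): adsorbed-phase vs bulk composition
            return (x_co2 / x_h2) / (y_co2 / y_h2)

        # e.g. mixture loadings (mmol/g) at a 50:50 bulk composition (made-up numbers)
        print(adsorption_selectivity(x_co2=3.2, x_h2=0.004, y_co2=0.5, y_h2=0.5))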

  16. Novel Simulation Framework of Three-Dimensional Skull Bio-Metric Measurement

    Directory of Open Access Journals (Sweden)

    Shihab A. Hameed

    2009-11-01

    Full Text Available Previously, researchers struggled to simulate three-dimensional applications for biometrics; likewise, various applications in forensics and cosmetology have not been easy to simulate. Three-dimensional figures have proved more reliable than two-dimensional figures in most of the applications implemented for the purposes above, because the features extracted from three-dimensional applications are closer to reality. The goal of this paper is to study and evaluate how reliable three-dimensional skull biometrics are in terms of measurement accuracy, capability, and applicability. As mentioned above, it has been hard to evaluate or simulate an application using a three-dimensional skull in biometrics; however, Canfield Imaging Systems provides a suitable new environment for simulating three-dimensional skull biometrics. The second goal of this paper is to assess how good the new three-dimensional imaging system is. This paper also goes through recognition and verification based on different biometric applications, and subsequently studies the reliability and dependability of using skull biometrics. The simulation is based on three-dimensional skull recognition using a three-dimensional matching technique. The features of the simulated system show the capability of using a three-dimensional matching system as an efficient way to identify a person through his or her skull by matching it against a database; this technique guarantees fast processing while optimizing false positives and negatives.

  17. Through the lens of instructional design: appraisal of the Jeffries/National League for Nursing Simulation Framework for use in acute care.

    Science.gov (United States)

    Wilson, Rebecca D; Hagler, Debra

    2012-09-01

    As human patient simulation becomes more prevalent in acute care settings, clinical experts are often asked to assist in developing scenarios. Although the Jeffries/National League for Nursing Simulation Framework has been used in academic settings to guide the instructional design of clinical simulations, its use in acute care settings is less known. This framework incorporates a consideration of contextual elements, design characteristics, and outcomes. An external validation study applying the framework within the context of acute care showed its overall strength as well as elements that were problematic. The implications derived from the study of the design characteristics in a hospital setting can be used by nurses who are considering either adopting or adapting this framework for their own practice. PMID:22715871

  18. Numerical simulation of a full scale fire test on a loaded steel framework

    OpenAIRE

    Franssen, Jean-Marc; Cooke, C. M. E.; Latham, D. J.

    1995-01-01

    A single-bay, single-storey steel portal frame has been tested under fire conditions. It is here simulated using the non-linear computer code CEFICOSS. The elements have composite steel-concrete sections for the thermal analysis, but only the steel part of the sections is load bearing.

  19. Metal organic frameworks (MOFs) for degradation of nerve agent simulant parathion

    Science.gov (United States)

    Parathion, a simulant of nerve agent VX, has been studied for degradation on Fe3+, Fe2+ and zerovalent iron supported on chitosan. Chitosan, a naturally occurring biopolymer derivative of chitin, is a very good adsorbent for many chemicals including metals. Chitosan is used as supporting biopolymer ...

  20. Variable-resolution frameworks for the simulation of tropical cyclones in global atmospheric general circulation models

    Science.gov (United States)

    Zarzycki, Colin

    The ability of atmospheric General Circulation Models (GCMs) to resolve tropical cyclones in the climate system has traditionally been difficult. The challenges include adequately capturing storms which are small in size relative to model grids and the fact that key thermodynamic processes require a significant level of parameterization. At traditional GCM grid spacings of 50-300 km tropical cyclones are severely under-resolved, if not completely unresolved. This thesis explores a variable-resolution global model approach that allows for high spatial resolutions in areas of interest, such as low-latitude ocean basins where tropical cyclogenesis occurs. Such GCM designs with multi-resolution meshes serve to bridge the gap between globally-uniform grids and limited area models and have the potential to become a future tool for regional climate assessments. A statically-nested, variable-resolution option has recently been introduced into the Department of Energy/National Center for Atmospheric Research (DoE/NCAR) Community Atmosphere Model's (CAM) Spectral Element (SE) dynamical core. Using an idealized tropical cyclone test, variable-resolution meshes are shown to significantly lessen computational requirements in regional GCM studies. Furthermore, the tropical cyclone simulations are free of spurious numerical errors at the resolution interfaces. Utilizing aquaplanet simulations as an intermediate test between idealized simulations and fully-coupled climate model runs, climate statistics within refined patches are shown to be well-matched to globally-uniform simulations of the same grid spacing. Facets of the CAM version 4 (CAM4) subgrid physical parameterizations are likely too scale sensitive for variable-resolution applications, but the newer CAM5 package is vastly improved in performance at multiple grid spacings. Multi-decadal simulations following 'Atmospheric Model Intercomparison Project' protocols have been conducted with variable-resolution grids. Climate

  1. An Integrated GIS, optimization and simulation framework for optimal PV size and location in campus area environments

    International Nuclear Information System (INIS)

    Highlights: • The optimal size and locations for PV units for campus environments are achieved. • The GIS module finds the suitable rooftops and their panel capacity. • The optimization module maximizes the long-term profit of PV installations. • The simulation module evaluates the voltage profile of the distribution network. • The proposed work has been successfully demonstrated for a real university campus. - Abstract: Finding the optimal size and locations for Photovoltaic (PV) units has been a major challenge for distribution system planners and researchers. In this study, a framework is proposed to integrate Geographical Information Systems (GIS), mathematical optimization, and simulation modules to obtain the annual optimal placement and size of PV units for the next two decades in a campus area environment. First, a GIS module is developed to find the suitable rooftops and their panel capacity considering the amount of solar radiation, slope, elevation, and aspect. The optimization module is then used to maximize the long-term net profit of PV installations considering various costs of investment, inverter replacement, operation, and maintenance as well as savings from consuming less conventional energy. A voltage profile of the electricity distribution network is then investigated in the simulation module. In the case of voltage limit violation by intermittent PV generations or load fluctuations, two mitigation strategies, reallocation of the PV units or installation of a local storage unit, are suggested. The proposed framework has been implemented in a real campus area, and the results show that it can effectively be used for long-term installation planning of PV panels considering both the cost and power quality
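
    The economics of the optimization module can be caricatured as follows (hypothetical figures, and a greedy heuristic standing in for the paper's mathematical optimization): choose rooftop installations maximizing net present value within a budget.

        def npv(annual_kwh, price_per_kwh, capex, opex, years=20, rate=0.05):
            # discounted energy savings minus operating cost, net of investment
            cash = sum((annual_kwh * price_per_kwh - opex) / (1 + rate) ** t
                       for t in range(1, years + 1))
            return cash - capex

        def select_rooftops(candidates, budget):
            # greedy on NPV per unit of capital, subject to the budget
            scored = sorted(candidates, key=lambda c: npv(**c) / c["capex"], reverse=True)
            chosen, spent = [], 0.0
            for c in scored:
                if spent + c["capex"] <= budget and npv(**c) > 0:
                    chosen.append(c); spent += c["capex"]
            return chosen

        roofs = [{"annual_kwh": 12000, "price_per_kwh": 0.12, "capex": 15000, "opex": 150},
                 {"annual_kwh": 8000, "price_per_kwh": 0.12, "capex": 11000, "opex": 120}]
        print(select_rooftops(roofs, budget=20000))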

  2. I. Dissociation free energies in drug-receptor systems via non equilibrium alchemical simulations: theoretical framework

    CERN Document Server

    Procacci, Piero

    2016-01-01

    In this contribution I critically revise the alchemical reversible approach in the context of the statistical mechanics theory of non covalent bonding in drug receptor systems. I show that most of the pitfalls and entanglements for the binding free energies evaluation in computer simulations are rooted in the equilibrium assumption that is implicit in the reversible method. These critical issues can be resolved by using a non-equilibrium variant of the alchemical method in molecular dynamics simulations, relying on the production of many independent trajectories with a continuous dynamical evolution of an externally driven alchemical coordinate, completing the decoupling of the ligand in a matter of few tens of picoseconds rather than nanoseconds. The absolute binding free energy can be recovered from the annihilation work distributions by applying an unbiased unidirectional free energy estimate, on the assumption that any observed work distribution is given by a mixture of normal distributions, whose compone...

  3. An Artificial Intelligence framework for experimental design and analysis in discrete event simulation

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, R.P.

    1988-01-01

    Simulation studies cycle through the phases of formulation, programming, verification and validation, experimental design and analysis, and implementation. The work presented has been concerned with developing methods to enhance the practice and support for the experimental design and analysis phase of a study. The investigation focussed on the introduction of Artificial Intelligence (AI) techniques to this phase, where previously there existed little support. The reason for this approach was the realization that the experimentation process in a simulation study can be broken down into a reasoning component and a control-of-execution component. In most studies, a user would perform both of these. The presence of a reasoning component suggested the use of artificial intelligence, or at least the prospective use of its techniques. After a study of the current state of the art, work began by considering the development of a support system for experimental design and analysis that had human intelligence and machine control of execution. This provided a semi-structured decision-making environment in the form of a controller that requested human input. The controller was made intelligent when it was linked to a non-procedural (PROLOG) program that provided remote intelligent input from either the user or default heuristics. The intelligent controller was found to enhance simulation experimentation because it ensures that all the steps in the experimental design and analysis phase take place and receive appropriate input. The next stage was to adopt the view that simulation experimental design and analysis may be enhanced through a system that had machine intelligence but expected human control of execution.

  4. Spatial and Temporal Simulation of Human Evolution. Methods, Frameworks and Applications

    OpenAIRE

    Benguigui, Macarena; Arenas, Miguel

    2014-01-01

    Analyses of human evolution are fundamental to understand the current gradients of human diversity. In this concern, genetic samples collected from current populations together with archaeological data are the most important resources to study human evolution. However, they are often insufficient to properly evaluate a variety of evolutionary scenarios, leading to continuous debates and discussions. A commonly applied strategy consists of the use of computer simulations based on, as realistic...

  5. Federated Simulation and Gaming Framework for a Decentralized Space-Based Resource Economy

    OpenAIRE

    Grogan, Paul Thomas; de Weck, Olivier L.

    2012-01-01

    Future human space exploration will require large amounts of resources for shielding and building materials, propellants, and consumables. A space-based resource economy could produce, transport, and store resource at distributed locations such as the lunar surface, stable orbits, or Lagrange points to avoid Earth's deep gravity well. Design challenges include decentralized operation and management and socio-technical complexities not commonly addressed by modeling and simulation methods. Thi...

  6. Agent based simulation framework for quantitative and qualitative social research: Statistics and natural language generation

    OpenAIRE

    Arroyo Menéndez, Millán; Hassan Collado, Samer; León, Carlos; Pavón Mestres, Juan Luis

    2007-01-01

    Even though Agent Based Social Simulation is beginning to be spread out as a powerful quantitative method for sociologists, it is still far from attracting qualitative ones. We propose to broaden ABSS horizons with a system that returns outputs useful for both paradigms. The case used as example is the study of the evolution of religiosity in the Spanish post-modern society. From a “macro” perspective, it analyses social trends, using quantitative data from the European Values Survey and givi...

  7. OpenSim: a musculoskeletal modeling and simulation framework for in silico investigations and exchange

    OpenAIRE

    Seth, Ajay; Sherman, Michael; Reinbolt, Jeffrey A.; Delp, Scott L.

    2011-01-01

    Movement science is driven by observation, but observation alone cannot elucidate principles of human and animal movement. Biomechanical modeling and computer simulation complement observations and inform experimental design. Biological models are complex and specialized software is required for building, validating, and studying them. Furthermore, common access is needed so that investigators can contribute models to a broader community and leverage past work. We are developing OpenSim, a fr...

  8. Evaluating Standard and Custom Applications in IPv6 Within a Simulation Framework

    OpenAIRE

    Clore, Brittany Michelle

    2012-01-01

    Internet Protocol version 6 (IPv6) is being adopted in networks around the world as the Internet Protocol version 4 (IPv4) addressing space reaches its maximum capacity. Although there are IPv6 applications being developed, there are not many production IPv6 networks in place in which these applications can be deployed. Simulation presents a cost effective alternative to setting up a live test bed of devices to validate specific IPv6 environments before actual physical deployment. OPNET Mode...

  9. A parallel overset-curvilinear-immersed boundary framework for simulating complex 3D incompressible flows

    OpenAIRE

    Borazjani, Iman; Ge, Liang; Le, Trung; Sotiropoulos, Fotis

    2013-01-01

    We develop an overset-curvilinear immersed boundary (overset-CURVIB) method in a general non-inertial frame of reference to simulate a wide range of challenging biological flow problems. The method incorporates overset-curvilinear grids to efficiently handle multi-connected geometries and increase the resolution locally near immersed boundaries. Complex bodies undergoing arbitrarily large deformations may be embedded within the overset-curvilinear background grid and treated as sharp interfac...

  10. Argonne heavy ion fusion program

    International Nuclear Information System (INIS)

    The experimental part of Argonne's heavy ion fusion program is directed toward demonstrating the first, and in many ways most difficult, section of a viable accelerator facility for heavy ion fusion. This includes a high-current, high-brightness, singly charged xenon source; a dc preaccelerator at the highest practical voltage; and a low-beta linac of special design. The latter would demonstrate rf capture, with its attendant inefficiencies, and accelerate ions to a velocity acceptable to more conventional rf linac structures such as the π-3π Wideroe. The initial goals of this program are a source current of 100 mA of Xe+1, a preaccelerator voltage of 1.5 MV, and less than 50% loss in rf capture into the low-beta linac. A linear accelerator is proposed with a voltage gain of at least 200 MV, which would form the initial stage of an operational heavy ion fusion facility irrespective of what type of acceleration to high energies is employed beyond this point.

  11. Diffusion dynamics and concentration of toxic materials from quantum dots-based nanotechnologies: an agent-based modeling simulation framework

    Energy Technology Data Exchange (ETDEWEB)

    Agusdinata, Datu Buyung, E-mail: bagusdinata@niu.edu; Amouie, Mahbod [Northern Illinois University, Department of Industrial & Systems Engineering and Environment, Sustainability, & Energy Institute (United States); Xu, Tao [Northern Illinois University, Department of Chemistry and Biochemistry (United States)

    2015-01-15

    Due to their favorable electrical and optical properties, quantum dot (QD) nanostructures have found numerous applications including nanomedicine and photovoltaic cells. However, increased future production, use, and disposal of engineered QD products also raise concerns about their potential environmental impacts. The objective of this work is to establish a modeling framework for predicting the diffusion dynamics and concentration of toxic materials released from Trioctylphosphine oxide-capped CdSe. To this end, an agent-based model simulation with reaction kinetics and Brownian motion dynamics was developed. Reaction kinetics is used to model the stability of the surface capping agent, particularly under the oxidation process. The diffusion of toxic Cd{sup 2+} ions in an aquatic environment was simulated using an adapted Brownian motion algorithm. A calibrated parameter reflecting sensitivity to the reaction rate is proposed. The model output demonstrates the stochastic spatial distribution of toxic Cd{sup 2+} ions under different values of proxy environmental factor parameters. With oxidation as the only chemistry considered, the simulation was able to replicate Cd{sup 2+} ion release from Thiol-capped QDs in aerated water. The agent-based method is the first to be developed in the QDs application domain. It combines the simplicity of modeling the solubility and release rate of Cd{sup 2+} ions with the complexity of tracking individual Cd atoms at the same time.
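
    A minimal sketch of the two coupled ingredients, first-order oxidation kinetics for ion release and Brownian motion of the freed ions (the rate constant, diffusivity, and time step are placeholders, and geometry is ignored):

        import numpy as np

        def simulate_release(n_qd_atoms=10_000, k_ox=0.01, dt=1.0, steps=500,
                             D=7e-10, seed=0):
            rng = np.random.default_rng(seed)
            free = []                                    # positions of released ions (m)
            bound = n_qd_atoms
            sigma = np.sqrt(2 * D * dt)                  # Brownian step per axis
            for _ in range(steps):
                # first-order oxidation: each bound atom released with prob 1 - exp(-k*dt)
                n_released = rng.binomial(bound, 1 - np.exp(-k_ox * dt))
                bound -= n_released
                free.append(np.zeros((n_released, 3)))   # released at the QD surface
                pos = np.concatenate(free)
                pos += rng.normal(0.0, sigma, pos.shape) # diffusion step for every free ion
                free = [pos]
            return bound, pos

        bound, pos = simulate_release()
        print(f"{bound} atoms still bound; mean displacement "
              f"{np.linalg.norm(pos, axis=1).mean():.2e} m")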

  12. Diffusion dynamics and concentration of toxic materials from quantum dots-based nanotechnologies: an agent-based modeling simulation framework

    International Nuclear Information System (INIS)

    Due to their favorable electrical and optical properties, quantum dot (QD) nanostructures have found numerous applications including nanomedicine and photovoltaic cells. However, increased future production, use, and disposal of engineered QD products also raise concerns about their potential environmental impacts. The objective of this work is to establish a modeling framework for predicting the diffusion dynamics and concentration of toxic materials released from Trioctylphosphine oxide-capped CdSe. To this end, an agent-based model simulation with reaction kinetics and Brownian motion dynamics was developed. Reaction kinetics is used to model the stability of the surface capping agent, particularly under the oxidation process. The diffusion of toxic Cd2+ ions in an aquatic environment was simulated using an adapted Brownian motion algorithm. A calibrated parameter reflecting sensitivity to the reaction rate is proposed. The model output demonstrates the stochastic spatial distribution of toxic Cd2+ ions under different values of proxy environmental factor parameters. With oxidation as the only chemistry considered, the simulation was able to replicate Cd2+ ion release from Thiol-capped QDs in aerated water. The agent-based method is the first to be developed in the QDs application domain. It combines the simplicity of modeling the solubility and release rate of Cd2+ ions with the complexity of tracking individual Cd atoms at the same time.

  13. Towards multi-phase flow simulations in the PDE framework Peano

    KAUST Repository

    Bungartz, Hans-Joachim

    2011-07-27

    In this work, we present recent enhancements and new functionalities of our flow solver in the partial differential equation framework Peano. We start with an introduction including an overview of the Peano development and a short description of the basic concepts of Peano and the flow solver in Peano concerning the underlying structured but adaptive Cartesian grids, the data structure and data access optimisation, and spatial and time discretisation of the flow solver. The new features cover geometry interfaces and additional application functionalities. The two geometry interfaces, a triangulation-based description supported by the tool preCICE and a built-in geometry using geometry primitives such as cubes, spheres, or tetrahedra allow for the efficient treatment of complex and changing geometries, an essential ingredient for most application scenarios. The new application functionality concerns a coupled heat-flow problem and two-phase flows. We present numerical examples, performance and validation results for these new functionalities. © 2011 Springer-Verlag.

  14. Casting Simulation Within the Framework of ICME: Coupling of Solidification, Heat Treatment, and Structural Analysis

    Science.gov (United States)

    Guo, Jianzheng; Scott, Sam; Cao, Weisheng; Köser, Ole

    2016-05-01

    Integrated computational materials engineering (ICME) is becoming a compulsory practice for developing advanced materials, re-thinking manufacturing processing, and engineering components to meet challenging design goals quickly and cost-effectively. As a key component of the ICME approach, a numerical approach is being developed for the prediction of casting microstructure, defects formation and mechanical properties from solidification to heat treatment. Because of the processing conditions and complexity of geometry, material properties of a cast part are not normally homogeneous. This variation and the potential weakening inherent in manufacturing are currently accommodated by incorporating large safety factors that counter design goals. The simulation of the different manufacturing process stages is integrated such that the resultant microstructure of the previous event is used as the initial condition of the following event, ensuring the tracking of the component history while maintaining a high level of accuracy across these manufacturing stages. This paper explains the significance of integrated analytical prediction to obtain more precise simulation results and sets out how available techniques may be applied accordingly.
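
    The stage-chaining pattern, in which each stage's resulting microstructure seeds the next, can be sketched as a simple state handoff (the field names and property relation below are invented for illustration):

        from dataclasses import dataclass, replace

        @dataclass
        class MaterialState:
            grain_size_um: float
            porosity_pct: float

        def solidification(geometry):
            # produces the as-cast microstructure (toy values)
            return MaterialState(grain_size_um=120.0, porosity_pct=1.2)

        def heat_treatment(state):
            # consumes the previous stage's state as its initial condition
            return replace(state, grain_size_um=state.grain_size_um * 0.7)

        def structural_analysis(state):
            # local property estimate derived from the inherited microstructure
            return 400.0 / (1.0 + state.porosity_pct) * (100.0 / state.grain_size_um) ** 0.5

        state = solidification(geometry="casting.stl")
        state = heat_treatment(state)
        print(f"estimated strength index: {structural_analysis(state):.1f}")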

  15. On a framework for generating PoD curves assisted by numerical simulations

    Science.gov (United States)

    Subair, S. Mohamed; Agrawal, Shweta; Balasubramaniam, Krishnan; Rajagopal, Prabhu; Kumar, Anish; Rao, Purnachandra B.; Tamanna, Jayakumar

    2015-03-01

    The Probability of Detection (PoD) curve method has emerged as an important tool for the assessment of the performance of NDE techniques, a topic of particular interest to the nuclear industry, where inspection qualification is very important. The conventional experimental means of generating PoD curves, though, can be expensive, requiring large data sets (covering defects and test conditions) as well as equipment and operator time. Several methods of achieving faster estimates for PoD curves using physics-based modelling have been developed to address this problem. Numerical modelling techniques are also attractive, especially given the ever-increasing computational power available to scientists today. Here we develop procedures for obtaining PoD curves, assisted by numerical simulation and based on Bayesian statistics. Numerical simulations are performed using Finite Element analysis for factors that are assumed to be independent, random, and normally distributed. PoD curves so generated are compared with experiments on austenitic stainless steel (SS) plates with artificially created notches. We examine issues affecting the PoD curve generation process, including codes, standards, the distribution of defect parameters, and the choice of the noise threshold. We also study the assumption of a normal distribution for signal response parameters and consider strategies for dealing with data that may be more complex or sparse, to justify this. These topics are addressed and illustrated through the example case of generating PoD curves for pulse-echo ultrasonic inspection of vertical surface-breaking cracks in SS plates.
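
    For reference, a common "a-hat versus a" PoD construction consistent with the normality assumptions discussed above looks like this (synthetic data and an arbitrary threshold; a sketch, not the authors' Bayesian procedure):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        a = np.linspace(0.5, 5.0, 40)                   # defect size, mm
        ahat = 0.8 * a + rng.normal(0.0, 0.35, a.size)  # simulated signal response

        fit = stats.linregress(a, ahat)
        resid_std = np.std(ahat - (fit.intercept + fit.slope * a), ddof=2)

        def pod(sizes, threshold=1.6):
            # PoD(a) = P(signal response exceeds the decision threshold)
            mu = fit.intercept + fit.slope * np.asarray(sizes)
            return 1.0 - stats.norm.cdf(threshold, loc=mu, scale=resid_std)

        for size, p in zip([1.0, 2.0, 3.0], pod([1.0, 2.0, 3.0])):
            print(f"a = {size:.1f} mm -> PoD = {p:.3f}")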

  16. A framework for hierarchical, object-oriented simulation modeling of a steel manufacturing enterprise

    Energy Technology Data Exchange (ETDEWEB)

    Henriksen, A.D.; Joyce, E.L.; Lally, B.R. [and others

    1997-10-01

    This is the final report of a two-year, Laboratory Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). The objective of the project is to combine detailed physical models of industrial processes with unit operations and business-level models. This would allow global and individual process control schemes to be implemented that would facilitate improved overall system performance. Intelligent decision support that employs expert system concepts (knowledge base and rules) could then also be incorporated. This project is innovative because it attempts to incorporate all levels of production-related activities from atoms to enterprise, and to integrate those activities into one comprehensive decision support tool. This project is an interdisciplinary effort requiring enterprise modeling and simulation model integration; process modeling and control; process control and optimization; chemical process modeling; and detailed molecular-level models. It represents the state of the art in enterprise modeling and simulation and incorporates cutting edge process modeling, process control, and system optimization techniques.

  17. A parallel overset-curvilinear-immersed boundary framework for simulating complex 3D incompressible flows.

    Science.gov (United States)

    Borazjani, Iman; Ge, Liang; Le, Trung; Sotiropoulos, Fotis

    2013-04-01

    We develop an overset-curvilinear immersed boundary (overset-CURVIB) method in a general non-inertial frame of reference to simulate a wide range of challenging biological flow problems. The method incorporates overset-curvilinear grids to efficiently handle multi-connected geometries and increase the resolution locally near immersed boundaries. Complex bodies undergoing arbitrarily large deformations may be embedded within the overset-curvilinear background grid and treated as sharp interfaces using the curvilinear immersed boundary (CURVIB) method (Ge and Sotiropoulos, Journal of Computational Physics, 2007). The incompressible flow equations are formulated in a general non-inertial frame of reference to enhance the overall versatility and efficiency of the numerical approach. Efficient search algorithms to identify areas requiring blanking, donor cells, and interpolation coefficients for constructing the boundary conditions at grid interfaces of the overset grid are developed and implemented using efficient parallel computing communication strategies to transfer information among sub-domains. The governing equations are discretized using a second-order accurate finite-volume approach and integrated in time via an efficient fractional-step method. Various strategies for ensuring globally conservative interpolation at grid interfaces suitable for incompressible flow fractional step methods are implemented and evaluated. The method is verified and validated against experimental data, and its capabilities are demonstrated by simulating the flow past multiple aquatic swimmers and the systolic flow in an anatomic left ventricle with a mechanical heart valve implanted in the aortic position. PMID:23833331

  18. Chemical analysis of Argonne premium coal samples. Bulletin

    Energy Technology Data Exchange (ETDEWEB)

    Palmer, C.A.

    1997-11-01

    Contents: The Chemical Analysis of Argonne Premium Coal Samples: An Introduction; Rehydration of Desiccated Argonne Premium Coal Samples; Determination of 62 Elements in 8 Argonne Premium Coal Ash Samples by Automated Semiquantitative Direct-Current Arc Atomic Emission Spectrography; Determination of 18 Elements in 5 Whole Argonne Premium Coal Samples by Quantitative Direct-Current Arc Atomic Emission Spectrography; Determination of Major and Trace Elements in Eight Argonne Premium Coal Samples (Ash and Whole Coal) by X-Ray Fluorescence Spectrometry; Determination of 29 Elements in 8 Argonne Premium Coal Samples by Instrumental Neutron Activation Analysis; Determination of Selected Elements in Coal Ash from Eight Argonne Premium Coal Samples by Atomic Absorption Spectrometry and Atomic Emission Spectrometry; Determination of 25 Elements in Coal Ash from 8 Argonne Premium Coal Samples by Inductively Coupled Argon Plasma-Atomic Emission Spectrometry; Determination of 33 Elements in Coal Ash from 8 Argonne Premium Coal Samples by Inductively Coupled Argon Plasma-Mass Spectrometry; Determination of Mercury and Selenium in Eight Argonne Premium Coal Samples by Cold-Vapor and Hydride-Generation Atomic Absorption Spectrometry; Determination of Carbon, Hydrogen, and Nitrogen in Eight Argonne Premium Coal Samples by Using a Gas Chromatographic Analyzer with a Thermal Conductivity Detector; and Compilation of Multitechnique Determinations of 51 Elements in 8 Argonne Premium Coal Samples.

  19. Argonne's Laboratory computing center - 2007 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R.; Pieper, G. W.

    2008-05-28

    Argonne National Laboratory founded the Laboratory Computing Resource Center (LCRC) in the spring of 2002 to help meet pressing program needs for computational modeling, simulation, and analysis. The guiding mission is to provide critical computing resources that accelerate the development of high-performance computing expertise, applications, and computations to meet the Laboratory's challenging science and engineering missions. In September 2002 the LCRC deployed a 350-node computing cluster from Linux NetworX to address Laboratory needs for mid-range supercomputing. This cluster, named 'Jazz', achieved over a teraflop of computing power (10{sup 12} floating-point calculations per second) on standard tests, making it the Laboratory's first terascale computing system and one of the 50 fastest computers in the world at the time. Jazz was made available to early users in November 2002 while the system was undergoing development and configuration. In April 2003, Jazz was officially made available for production operation. Since then, the Jazz user community has grown steadily. By the end of fiscal year 2007, there were over 60 active projects representing a wide cross-section of Laboratory expertise, including work in biosciences, chemistry, climate, computer science, engineering applications, environmental science, geoscience, information science, materials science, mathematics, nanoscience, nuclear engineering, and physics. Most important, many projects have achieved results that would have been unobtainable without such a computing resource. The LCRC continues to foster growth in the computational science and engineering capability and quality at the Laboratory. Specific goals include expansion of the use of Jazz to new disciplines and Laboratory initiatives, teaming with Laboratory infrastructure providers to offer more scientific data management capabilities, expanding Argonne staff use of national computing facilities, and improving the scientific

  20. SIMULATION FRAMEWORK FOR REGIONAL GEOLOGIC CO{sub 2} STORAGE ALONG ARCHES PROVINCE OF MIDWESTERN UNITED STATES

    Energy Technology Data Exchange (ETDEWEB)

    Sminchak, Joel

    2012-09-30

    This report presents final technical results for the project Simulation Framework for Regional Geologic CO{sub 2} Storage Infrastructure along Arches Province of the Midwest United States. The Arches Simulation project was a three year effort designed to develop a simulation framework for regional geologic carbon dioxide (CO{sub 2}) storage infrastructure along the Arches Province through development of a geologic model and advanced reservoir simulations of large-scale CO{sub 2} storage. The project included five major technical tasks: (1) compilation of geologic, hydraulic and injection data on Mount Simon, (2) development of model framework and parameters, (3) preliminary variable density flow simulations, (4) multi-phase model runs of regional storage scenarios, and (5) implications for regional storage feasibility. The Arches Province is an informal region in northeastern Indiana, northern Kentucky, western Ohio, and southern Michigan where sedimentary rock formations form broad arch and platform structures. In the province, the Mount Simon sandstone is an appealing deep saline formation for CO{sub 2} storage because of the intersection of reservoir thickness and permeability. Many CO{sub 2} sources are located in proximity to the Arches Province, and the area is adjacent to coal fired power plants along the Ohio River Valley corridor. Geophysical well logs, rock samples, drilling logs, and geotechnical tests were evaluated for a 500,000 km{sup 2} study area centered on the Arches Province. Hydraulic parameters and historical operational information was also compiled from Mount Simon wastewater injection wells in the region. This information was integrated into a geocellular model that depicts the parameters and conditions in a numerical array. The geologic and hydraulic data were integrated into a three-dimensional grid of porosity and permeability, which are key parameters regarding fluid flow and pressure buildup due to CO{sub 2} injection. Permeability data
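
    A geocellular model of this kind is essentially a 3D array of petrophysical properties; a toy sketch (nearest-neighbor assignment from made-up well points, not the project's actual workflow) might look like:

        import numpy as np
        from scipy.interpolate import NearestNDInterpolator

        # sparse well measurements: (x, y, z) in km and permeability in mD (invented)
        wells = np.array([[50.0, 40.0, 0.9], [120.0, 200.0, 1.1], [300.0, 310.0, 1.0]])
        perm_md = np.array([85.0, 40.0, 120.0])

        interp = NearestNDInterpolator(wells, perm_md)
        x, y, z = np.meshgrid(np.linspace(0, 400, 80), np.linspace(0, 400, 80),
                              np.linspace(0.8, 1.2, 10), indexing="ij")
        # 3D grid of permeability, one value per geocell
        perm_grid = interp(np.column_stack([x.ravel(), y.ravel(), z.ravel()])).reshape(x.shape)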

  2. The Design of Cognitive Social Simulation Framework using Statistical Methodology in the Domain of Academic Science

    Directory of Open Access Journals (Sweden)

    R. Sivakumar

    2013-05-01

    Modeling the behavior of cognitive architectures in the context of social simulation using statistical methodologies is currently a growing research area. A cognitive architecture for an intelligent agent typically involves artificial computational processes that exemplify theories of cognition in computer algorithms, defined over a state space. For cognitive systems with large state spaces, however, such approaches face problems such as unwieldy lookup tables and data sparsity. Hence, in this paper we propose a value-iteration method based on the Q-learning algorithm with function approximation to handle cognitive systems with large state spaces. Experimental results in the application domain of academic science verify that the proposed approach performs better than existing approaches.
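
    To make the approach concrete, the following minimal sketch (our illustration, not the paper's implementation) shows Q-learning with linear function approximation, where a weight vector per action replaces the intractable per-state table; all names and constants are illustrative:

        import random

        ACTIONS = (0, 1)                      # illustrative action set
        N_FEATURES, ALPHA, GAMMA, EPS = 4, 0.1, 0.9, 0.1
        weights = {a: [0.0] * N_FEATURES for a in ACTIONS}

        def q_value(phi, action):
            # Linear function approximation: Q(s, a) = w_a . phi(s)
            return sum(w * f for w, f in zip(weights[action], phi))

        def choose(phi):
            if random.random() < EPS:         # epsilon-greedy exploration
                return random.choice(ACTIONS)
            return max(ACTIONS, key=lambda a: q_value(phi, a))

        def update(phi, action, reward, phi_next):
            # TD(0) update on the weights instead of a giant Q-table
            target = reward + GAMMA * max(q_value(phi_next, a) for a in ACTIONS)
            error = target - q_value(phi, action)
            weights[action] = [w + ALPHA * error * f
                               for w, f in zip(weights[action], phi)]

        phi = [1.0, 0.0, 0.5, 0.2]            # feature vector for some state
        a = choose(phi)
        update(phi, a, reward=1.0, phi_next=[0.9, 0.1, 0.4, 0.3])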

  3. Handling of the Generation of Primary Events in Gauss, the LHCb Simulation Framework

    CERN Multimedia

    Corti, G; Brambach, T; Brook, N H; Gauvin, N; Harrison, K; Harrison, P; He, J; Ilten, P J; Jones, C R; Lieng, M H; Manca, G; Miglioranzi, S; Robbe, P; Vagnoni, V; Whitehead, M; Wishahi, J

    2010-01-01

    The LHCb simulation application, Gauss, consists of two independent phases, the generation of the primary event and the tracking of particles produced in the experimental setup. For the LHCb experimental program it is particularly important to model B meson decays: the EvtGen code developed in CLEO and BaBar has been chosen and customized for non-coherent B production as occurring in pp collisions at the LHC. The initial proton-proton collision is provided by a different generator engine, currently Pythia 6, for massive production of signal and generic pp collision events. Beam gas events, background events originating from proton halo, cosmics, and calibration events for different detectors can be generated in addition to pp collisions. Different generator packages, available in the physics community or specifically developed in LHCb, are used for these different purposes. Running conditions affecting the events generated, such as the size of the luminous region and the number of collisions occurring in a bunc...

  4. Optimising and extending the geometrical modeller of a physics simulation framework

    CERN Document Server

    Urban, P

    1998-01-01

    The design of highly complex particle detectors used in High Energy Physics involves both CAD systems and physics simulation packages like Geant4. Geant4 is able to exchange detector geometries with CAD systems, conforming to the Standard for the Exchange of Product Model Data (STEP); Boundary Representation (B-Rep) models are transferred. Particle tracking is performed in these models, requiring efficient and accurate intersection computations from the geometrical modeller. The results of extending and optimising the modeller of Geant4 form the contents of this thesis. Swept surfaces, namely surfaces of linear extrusion and surfaces of revolution, have been implemented. The problem of classifying points on surfaces bounded by curves as being inside or outside has been solved. These tasks necessitated the extension and optimisation of code related to curves and led to a re-design of this code. Emphasis was put on efficiency and on dealing with numerical errors. The results will be integrated into the upcoming beta t...

  5. A framework of motion capture system based human behaviours simulation for ergonomic analysis

    CERN Document Server

    Ma, Ruina; Bennis, Fouad; Ma, Liang

    2011-01-01

    With the increase in computer capabilities, computer-aided ergonomics (CAE) offers new possibilities to integrate conventional ergonomic knowledge and to develop new methods for the work design process. As mentioned in [1], different approaches have been developed to enhance the efficiency of ergonomic evaluation. Ergonomic expert systems, ergonomics-oriented information systems, numerical models of humans, etc. have been implemented in numerical ergonomic software. Several ergonomic software tools are available, such as Jack, Ergoman, Delmia Human, 3DSSPP, and Santos [2-4]. The main functions of these tools are posture analysis and posture prediction. In the visualization part, Jack and 3DSSPP produce results that visualize virtual human tasks in three dimensions, but without realistic physical properties. Nowadays, with the development of computer technology, more attention is being paid to simulating the physical world. Physics engines [5] are used more and more in the computer game (CG) field. The a...

  6. A discrete element and ray framework for rapid simulation of acoustical dispersion of microscale particulate agglomerations

    Science.gov (United States)

    Zohdi, T. I.

    2016-03-01

    In industry, particle-laden fluids, such as particle-functionalized inks, are constructed by adding fine-scale particles to a liquid solution, in order to achieve desired overall properties in both liquid and (cured) solid states. However, oftentimes undesirable particulate agglomerations arise due to some form of mutual attraction stemming from near-field forces, stray electrostatic charges, process ionization and mechanical adhesion. For proper operation of industrial processes involving particle-laden fluids, it is important to carefully break up and disperse these agglomerations. One approach is to target high-frequency acoustical pressure-pulses to break up such agglomerations. The objective of this paper is to develop a computational model and corresponding solution algorithm to enable rapid simulation of the effect of acoustical pulses on an agglomeration composed of a collection of discrete particles. Because of the complex agglomeration microstructure, containing gaps and interfaces, this type of system is extremely difficult to mesh and simulate using continuum-based methods, such as the finite difference time domain or the finite element method. Accordingly, a computationally-amenable discrete element/discrete ray model is developed which captures the primary physical events in this process, such as the reflection and absorption of acoustical energy, and the induced forces on the particulate microstructure. The approach utilizes a staggered, iterative solution scheme to calculate the power transfer from the acoustical pulse to the particles and the subsequent changes (breakup) of the pulse due to the particles. Three-dimensional examples are provided to illustrate the approach.
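
    The staggered exchange between pulse and particles can be caricatured in one dimension; the sketch below (our construction under strong simplifying assumptions, not Zohdi's model) shows a pulse front that deposits momentum on each particle it crosses while being attenuated in return, with nearest-neighbour adhesion springs resisting breakup:

        import math

        n, m, k_adh, r0 = 8, 1e-3, 50.0, 0.01       # particles, mass, stiffness, spacing
        x = [i * r0 for i in range(n)]              # agglomerated chain of particles
        v = [0.0] * n
        pulse_x, pulse_c, pulse_E, absorb = -0.05, 5.0, 1e-4, 0.3
        dt = 1e-4

        for step in range(400):
            f = [0.0] * n
            for i in range(n - 1):                  # nearest-neighbour adhesion springs
                s = k_adh * (x[i + 1] - x[i] - r0)
                f[i] += s
                f[i + 1] -= s
            front = pulse_x + pulse_c * dt          # advance the pulse front
            for i in range(n):                      # staggered exchange: pulse <-> particles
                if pulse_x < x[i] <= front:
                    dE = absorb * pulse_E           # energy absorbed by this particle...
                    pulse_E -= dE                   # ...attenuates the pulse in return
                    v[i] += math.sqrt(2.0 * dE / m) # impulsive kick along the pulse
            pulse_x = front
            for i in range(n):                      # explicit time integration
                v[i] += dt * f[i] / m
                x[i] += dt * v[i]

        print("final gaps:", [round(x[i + 1] - x[i], 4) for i in range(n - 1)])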

  7. CO adsorption over Pd nanoparticles: A general framework for IR simulations on nanoparticles

    Science.gov (United States)

    Zeinalipour-Yazdi, Constantinos D.; Willock, David J.; Thomas, Liam; Wilson, Karen; Lee, Adam F.

    2016-04-01

    CO vibrational spectra over catalytic nanoparticles under high coverages/pressures are discussed from a DFT perspective. Hybrid B3LYP and PBE DFT calculations of CO chemisorbed over Pd{sub 4} and Pd{sub 13} nanoclusters, and a 1.1 nm Pd{sub 38} nanoparticle, have been performed in order to simulate the corresponding coverage-dependent infrared (IR) absorption spectra, and hence provide a quantitative foundation for the interpretation of experimental IR spectra of CO over Pd nanocatalysts. B3LYP simulated IR intensities are used to quantify site occupation numbers through comparison with experimental DRIFTS spectra, allowing an atomistic model of CO surface coverage to be created. DFT adsorption energetics for low CO coverage (θ → 0) suggest the CO binding strength follows the order hollow > bridge > linear, even for dispersion-corrected functionals for sub-nanometre Pd nanoclusters. For a Pd{sub 38} nanoparticle, hollow- and bridge-bound CO are energetically similar (hollow ≈ bridge > atop). It is well known that this ordering has not been found at the high coverages used experimentally, wherein atop CO has a much higher population than observed over Pd(111), confirmed by our DRIFTS spectra for Pd nanoparticles supported on a KIT-6 silica, and hence site populations were calculated through a comparison of DFT and spectroscopic data. At high CO coverage (θ = 1), all three adsorbed CO species co-exist on Pd{sub 38}, and their interdiffusion is thermally feasible at STP. Under such high surface coverages, DFT predicts that bridge-bound CO chains are thermodynamically stable and isoenergetic to an entirely hollow-bound Pd/CO system. The Pd{sub 38} nanoparticle undergoes a linear (3.5%), isotropic expansion with increasing CO coverage, accompanied by 63 and 30 cm{sup -1} blue-shifts of hollow- and linear-bound CO, respectively.
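
    The final step described above, turning computed per-site IR intensities and a measured spectrum into site populations, amounts to a small non-negative least-squares problem, sketched here with made-up numbers (not the paper's data):

        import numpy as np
        from scipy.optimize import nnls

        # Rows: integrated spectral bands; columns: binding sites. Entries are
        # hypothetical computed IR intensities of one CO at each site per band.
        A = np.array([[1.8, 0.1, 0.0],   # ~2070 cm-1 band (atop-dominated)
                      [0.1, 1.2, 0.2],   # ~1950 cm-1 band (bridge-dominated)
                      [0.0, 0.2, 0.9]])  # ~1850 cm-1 band (hollow-dominated)
        b = np.array([5.3, 4.1, 2.2])    # made-up experimental band intensities

        occupations, residual = nnls(A, b)  # non-negative site populations
        print("atop, bridge, hollow occupations:", occupations.round(2))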

  8. Design of a digital beam attenuation system for computed tomography: Part I. System design and simulation framework

    International Nuclear Information System (INIS)

    Purpose: The purpose of this work is to introduce a new device that allows for patient-specific imaging-dose modulation in conventional and cone-beam CT. The device is called a digital beam attenuator (DBA). The DBA modulates an x-ray beam by varying the attenuation of a set of attenuating wedge filters across the fan angle. The ability to modulate the imaging dose across the fan beam represents another stride in the direction of personalized medicine. With the DBA, imaging dose can be tailored for a given patient anatomy, or even tailored to provide signal-to-noise ratio enhancement within a region of interest. This modulation enables decreases in dose, scatter, detector dynamic range requirements, and noise nonuniformities. In addition to introducing the DBA, the simulation framework used to study the DBA under different configurations is presented. Finally, a detailed study on the choice of the material used to build the DBA is presented. Methods: To change the attenuator thickness, the authors propose to use an overlapping wedge design. In this design, for each wedge pair, one wedge is held stationary and another wedge is moved over the stationary wedge. The composite thickness of the two wedges changes as a function of the amount of overlap between the wedges. To validate the DBA concept and study design changes, a simulation environment was constructed. The environment allows for changes to system geometry, different source spectra, and DBA wedge design modifications, and supports both voxelized and analytic phantom models. All the elements from atomic number 1 to 92 were evaluated for use as DBA filter material. The dynamic range and tube loading for each element were calculated for various DBA designs. Tube loading was calculated by comparing the attenuation of the DBA at its minimum attenuation position to a filtered non-DBA acquisition. Results: The design and parametrization of DBA-implemented FFMCT have been introduced. A simulation
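
    The overlapping-wedge idea lends itself to a back-of-envelope check; the sketch below (ours, with made-up numbers rather than the paper's design values) computes composite thickness as a function of wedge overlap and the resulting monochromatic Beer-Lambert transmission:

        import math

        def composite_thickness(overlap_cm, slope, t_min_cm):
            # Overlapping wedge pair: thickness grows linearly with overlap.
            return t_min_cm + slope * overlap_cm

        def transmission(t_cm, mu_per_cm):
            # Monochromatic Beer-Lambert attenuation I/I0 = exp(-mu * t).
            return math.exp(-mu_per_cm * t_cm)

        MU = 9.4  # illustrative linear attenuation coefficient (iron, ~60 keV), 1/cm
        for overlap in (0.0, 0.1, 0.2, 0.4):
            t = composite_thickness(overlap, slope=0.5, t_min_cm=0.02)
            print(f"overlap {overlap:.1f} cm -> t = {t:.2f} cm, I/I0 = {transmission(t, MU):.3f}")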

  9. A high performance computing framework for physics-based modeling and simulation of military ground vehicles

    Science.gov (United States)

    Negrut, Dan; Lamb, David; Gorsich, David

    2011-06-01

    This paper describes a software infrastructure made up of tools and libraries designed to assist developers in implementing computational dynamics applications running on heterogeneous and distributed computing environments. Together, these tools and libraries compose a so-called Heterogeneous Computing Template (HCT). The heterogeneous and distributed computing hardware infrastructure is assumed herein to be made up of a combination of CPUs and Graphics Processing Units (GPUs). The computational dynamics applications targeted to execute on such a hardware topology include many-body dynamics, smoothed-particle hydrodynamics (SPH) fluid simulation, and fluid-solid interaction analysis. The underlying theme of the solution approach embraced by HCT is that of partitioning the domain of interest into a number of subdomains that are each managed by a separate core/accelerator (CPU/GPU) pair. Several components at the core of HCT enable the envisioned distributed computing approach to large-scale dynamical system simulation: (a) the ability to partition the problem according to the one-to-one mapping, i.e., spatial subdivision, discussed above (pre-processing); (b) a protocol for passing data between any two co-processors; (c) algorithms for element proximity computation; and (d) the ability to carry out post-processing in a distributed fashion. In this contribution the components (a) and (b) of the HCT are demonstrated via the example of the Discrete Element Method (DEM) for rigid body dynamics with friction and contact. The collision detection task required in frictional-contact dynamics (task (c) above) is shown to achieve a two-order-of-magnitude gain in efficiency on the GPU when compared to traditional sequential implementations. Note: Reference herein to any specific commercial products, process, or service by trade name, trademark, manufacturer, or otherwise, does not imply its endorsement, recommendation, or favoring by the United States Army. The views and

  10. An expanded framework for the advanced computational testing and simulation toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Marques, Osni A.; Drummond, Leroy A.

    2003-11-09

    The Advanced Computational Testing and Simulation (ACTS) Toolkit is a set of computational tools developed primarily at DOE laboratories and is aimed at simplifying the solution of common and important computational problems. The use of the tools reduces the development time for new codes and the tools provide functionality that might not otherwise be available. This document outlines an agenda for expanding the scope of the ACTS Project based on lessons learned from current activities. Highlights of this agenda include peer-reviewed certification of new tools; finding tools to solve problems that are not currently addressed by the Toolkit; working in collaboration with other software initiatives and DOE computer facilities; expanding outreach efforts; promoting interoperability; further development of the tools; and improving functionality of the ACTS Information Center, among other tasks. The ultimate goal is to make the ACTS tools more widely used and more effective in solving DOE's and the nation's scientific problems through the creation of a reliable software infrastructure for scientific computing.

  11. A framework for the evaluation of turbulence closures used in mesoscale ocean large-eddy simulations

    CERN Document Server

    Graham, Jonathan Pietarila

    2012-01-01

    We present a methodology to determine the best turbulence closure for an eddy-permitting ocean model: measurement of the error-landscape of the closure's subgrid spectral transfers and flux. Using a high-resolution benchmark, we compare each closure's model of energy and enstrophy transfer to the actual transfer observed in the benchmark run. The error-landscape norms enable us both to make objective comparisons between the closures and to optimize each closure's free parameter for a fair comparison. We apply this method to six different closures for forced-dissipative simulations of the barotropic vorticity equation on an f-plane (2D Navier-Stokes equation). The hyper-viscous closure most closely reproduces the enstrophy cascade, especially at larger scales, due to the concentration of its dissipative effects at the very smallest scales. The viscous and Leith closures perform nearly as well, especially at smaller scales, where all three models were dissipative. The Smagorinsky closure dissipates enstrophy at the wr...
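
    The error-landscape procedure can be summarized in a few lines; the following sketch (ours, with synthetic stand-in spectra rather than benchmark data) sweeps a closure's free parameter and selects the value whose modeled subgrid transfer best matches the benchmark transfer in an L2 sense:

        import numpy as np

        k = np.arange(1, 65)                       # resolved wavenumbers
        T_bench = -1e-3 * k**2 * np.exp(-k / 20)   # stand-in benchmark enstrophy transfer

        def T_model(c):
            # Hyper-viscous-like closure transfer with free parameter c
            return -c * k**2 * np.exp(-k / 18)

        params = np.linspace(1e-4, 3e-3, 200)
        errors = [np.linalg.norm(T_model(c) - T_bench) for c in params]
        best = params[int(np.argmin(errors))]
        print(f"optimal free parameter: {best:.2e}")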

  12. Development of a Parallel Overset Grid Framework for Moving Body Simulations in OpenFOAM

    Directory of Open Access Journals (Sweden)

    Dominic Chandar

    2015-12-01

    OpenFOAM is an industry-standard open-source fluid dynamics code that is used to solve the Navier-Stokes equations for a variety of flow situations. It is currently being used extensively by researchers to study a plethora of physical problems ranging from fundamental fluid dynamics to complex multiphase flows. When it comes to modeling the flow surrounding moving bodies that involve large displacements, such as ocean risers, the sinking of a ship, or the free flight of an insect, it is cumbersome to utilize a single computational grid and move the body of interest. In this work, we discuss a high-fidelity approach based on overset or overlapping grids which overcomes the necessity of using a single computational grid. The overset library is parallelized using the Message Passing Interface (MPI) and Pthreads and is linked dynamically to OpenFOAM. Computational results are presented to demonstrate the potential of this method for simulating problems with large displacements.

  13. Preliminary Simulation Results of the Constituent Distribution Model Implemented into the BISON Framework for the Performance Analysis of Metallic Fuels

    International Nuclear Information System (INIS)

    This paper presents the progress of fuel performance analysis using the BISON fuel performance tool. Within the BISON framework a new zirconium diffusion kernel was implemented that substantially improved the reliability and stability of solving zirconium migration problems in metallic nuclear fuels. Several benchmark problems were created, including a simple 1-D rod and several 2-D segments of a fuel rod that covered several temperature regimes. The simulations were then compared to results from the code Pedernal, which is a “high-performance solver for coupled, nonlinear heat conduction and multi-component species diffusion”. Comparison between the BISON-based solver and Pedernal showed good agreement for all benchmark cases. Very small differences appeared at longer time frames (2,000 days) for the hotter fuel simulations; however, this is believed to be due primarily to the different time-stepping algorithms used in the two codes. Overall the domain of the benchmark covered all components of the U-Pu-Zr phase diagram (alpha, delta, beta and gamma phases) and showed good agreement when compared to Pedernal-based results. (author)

  14. C++QEDv2 Milestone 10: A C++/Python application-programming framework for simulating open quantum dynamics

    Science.gov (United States)

    Sandner, Raimar; Vukics, András

    2014-09-01

    The v2 Milestone 10 release of C++QED is primarily a feature release, which also corrects some problems of the previous release, especially as regards the build system. The adoption of C++11 features has led to many simplifications in the codebase. A full doxygen-based API manual [1] is now provided together with updated user guides. A largely automated, versatile new testsuite directed both towards computational and physics features allows for quickly spotting arising errors. The states of trajectories are now savable and recoverable with full binary precision, allowing for trajectory continuation regardless of evolution method (single/ensemble Monte Carlo wave-function or Master equation trajectory). As the main new feature, the framework now presents Python bindings to the highest-level programming interface, so that actual simulations for given composite quantum systems can now be performed from Python. Catalogue identifier: AELU_v2_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AELU_v2_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: yes No. of lines in distributed program, including test data, etc.: 492422 No. of bytes in distributed program, including test data, etc.: 8070987 Distribution format: tar.gz Programming language: C++/Python. Computer: i386-i686, x86_64. Operating system: In principle cross-platform, as yet tested only on UNIX-like systems (including Mac OS X). RAM: The framework itself takes about 60MB, which is fully shared. The additional memory taken by the program which defines the actual physical system (script) is typically less than 1MB. The memory storing the actual data scales with the system dimension for state-vector manipulations, and the square of the dimension for density-operator manipulations. This might easily be GBs, and often the memory of the machine limits the size of the simulated system. Classification: 4.3, 4.13, 6.2. External routines: Boost C

  15. Causal Mathematical Logic as a guiding framework for the prediction of "Intelligence Signals" in brain simulations

    Science.gov (United States)

    Lanzalaco, Felix; Pissanetzky, Sergio

    2013-12-01

    A recent theory of physical information based on the fundamental principles of causality and thermodynamics has proposed that a large number of observable life and intelligence signals can be described in terms of Causal Mathematical Logic (CML), which is proposed to encode the natural principles of intelligence across any physical domain and substrate. We attempt to expound the current definition of CML, the "action functional", as a theory in terms of its ability to possess superior explanatory power for the current neuroscientific data we use to measure the mammalian brain's "intelligence" processes at their most general biophysical level. Brain simulation projects define their success partly in terms of the emergence of "non-explicitly programmed" complex biophysical signals such as self-oscillation and spreading cortical waves. Here we propose to extend the causal theory to predict and guide the understanding of these more complex emergent "intelligence signals". To achieve this we review whether causal logic is consistent with, can explain, and can predict the function of complete perceptual processes associated with intelligence. Primarily those are defined as the range of event-related potentials (ERP), which include their primary subcomponents, event-related desynchronization (ERD) and event-related synchronization (ERS). This approach aims at a universal and predictive logic for neurosimulation and AGI. The result of this investigation has produced a general "information engine" model from translation of the ERD and ERS. The CML algorithm, run in terms of action cost, predicts ERP signal contents and is consistent with the fundamental laws of thermodynamics. A working substrate-independent natural information logic would be a major asset. An information theory consistent with fundamental physics can be an AGI. It can also operate within genetic information space and provides a roadmap to understand the live biophysical operation of the phenotype

  16. Architecture Framework for Trapped-Ion Quantum Computer based on Performance Simulation Tool

    Science.gov (United States)

    Ahsan, Muhammad

    The challenge of building a scalable quantum computer lies in striking an appropriate balance between designing a reliable system architecture from a large number of faulty computational resources and improving the physical quality of system components. Detailed investigation of performance variation with the physics of the components and the system architecture requires an adequate performance simulation tool. In this thesis we demonstrate a software tool capable of (1) mapping and scheduling a quantum circuit onto a realistic quantum hardware architecture with physical resource constraints, (2) evaluating performance metrics such as the execution time and the success probability of the algorithm execution, and (3) analyzing the constituents of these metrics and visualizing resource utilization to identify system components which crucially define the overall performance. Using this versatile tool, we explore the vast design space for a modular quantum computer architecture based on trapped ions. We find that while success probability is uniformly determined by the fidelity of physical quantum operations, the execution time is a function of system resources invested at various layers of the design hierarchy. At the physical level, the number of lasers performing quantum gates impacts the latency of fault-tolerant circuit block execution. When these blocks are used to construct meaningful arithmetic circuits such as quantum adders, the number of ancilla qubits for complicated non-Clifford gates and the entanglement resources needed to establish long-distance communication channels become major performance-limiting factors. Next, in order to factorize large integers, these adders are assembled into the modular exponentiation circuit comprising the bulk of Shor's algorithm. At this stage, the overall scaling of resource-constrained performance with the size of the problem describes the effectiveness of the chosen design. By matching the resource investment with the pace of advancement in hardware technology
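
    The two headline metrics, success probability and execution time, reduce to simple estimates under uniform assumptions; the sketch below (ours, with illustrative numbers, not the thesis tool) shows the kind of first-order model such a simulator refines:

        def success_probability(gate_fidelity, n_gates):
            # Uniform-fidelity approximation: every gate must succeed.
            return gate_fidelity ** n_gates

        def execution_time_us(depth, gate_time_us, n_lasers, gates_per_layer):
            # Fewer lasers than concurrent gates serializes each circuit layer.
            serialization = -(-gates_per_layer // n_lasers)   # ceiling division
            return depth * gate_time_us * serialization

        print(success_probability(0.9999, 50_000))    # ~0.0067
        print(execution_time_us(1_000, 10.0, 4, 16))  # 40000.0 microseconds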

  17. Toward an ontology framework supporting the integration of geographic information with modeling and simulation for critical infrastructure protection

    Energy Technology Data Exchange (ETDEWEB)

    Ambrosiano, John J [Los Alamos National Laboratory; Bent, Russell W [Los Alamos National Laboratory; Linger, Steve P [Los Alamos National Laboratory

    2009-01-01

    Protecting the nation's infrastructure from natural disasters, inadvertent failures, or intentional attacks is a major national security concern. Gauging the fragility of infrastructure assets, and understanding how interdependencies across critical infrastructures affect their behavior, are essential to predicting and mitigating cascading failures, as well as to planning for response and recovery. Modeling and simulation (M&S) is an indispensable part of characterizing this complex system of systems and anticipating its response to disruptions. Bringing together the necessary components to perform such analyses produces a wide-ranging and coarse-grained computational workflow that must be integrated with other analysis workflow elements. There are many points in both types of workflows at which geographic information (GI) services are required. The GIS community recognizes the essential contribution of GI in this problem domain, as evidenced by past OGC initiatives. Typically such initiatives focus on the broader aspects of GI analysis workflows, leaving concepts crucial to integrating simulations within analysis workflows to that community. Our experience with large-scale modeling of interdependent critical infrastructures, and our recent participation in a DRS initiative concerning interoperability for this M&S domain, has led to high-level ontological concepts that we have begun to assemble into an architecture that spans both computational and 'world' views of the problem, and further recognizes the special requirements of simulations that go beyond common workflow ontologies. In this paper we present these ideas and offer a high-level ontological framework that includes key geospatial concepts as special cases of a broader view.

  18. Wire chamber degradation at the Argonne ZGS

    International Nuclear Information System (INIS)

    Experience with multiwire proportional chambers at high rates at the Argonne Zero Gradient Synchrotron is described. A buildup of silicon on the sense wires was observed where the beam passed through the chamber. Analysis of the chamber gas indicated that the density of silicon was probably less than 10 ppm

  19. Overview of the Argonne National Laboratory program

    International Nuclear Information System (INIS)

    The design activities and experimental R&D program of the heavy-ion fusion program at Argonne are discussed. A 1 MJ, 160 TW rf linac-accumulator system reference design is described, and a status report on the experimental program is given

  20. Computational Investigations of Potential Energy Function Development for Metal-Organic Framework Simulations, Metal Carbenes, and Chemical Warfare Agents

    Science.gov (United States)

    Cioce, Christian R.

    Metal-Organic Frameworks (MOFs) are three-dimensional porous nanomaterials with a variety of applications, including catalysis, gas storage and separation, and sustainable energy. Their potential as air filtration systems is of interest for designer carbon capture materials. The chemical constituents (i.e. organic ligands) can be functionalized to create rationally designed CO2 sequestration platforms, for example. Hardware and software alike at the bleeding edge of supercomputing are utilized for designing first principles-based molecular models for the simulation of gas sorption in these frameworks. The classical potentials developed herein are named PHAST (Potentials with High Accuracy, Speed, and Transferability) and thus are designed via a "bottom-up" approach. Specifically, models for N2 and CH4 are constructed and presented. Extensive verification and validation lead to insights into their range of applicability. Through this experience, the PHAST models are improved upon further to be more applicable in heterogeneous environments. Given this, the models are applied to reproducing high-level ab initio energies for gas sorption trajectories of helium atoms in a variety of rare-gas clusters, the geometries of which are representative of sorption-like environments commonly encountered in porous nanomaterials. This work seeks to push forward the state of classical and first principles materials modeling. Additionally, the characterization of a new type of tunable radical metal-carbene is presented. Here, a cobalt(II)-porphyrin complex, [Co(Por)], was investigated to understand its role as an effective catalyst in stereoselective cyclopropanation of a diazoacetate reagent. Density functional theory along with natural bond order analysis and charge decomposition analysis gave insight into the electronics of the catalytic intermediate. The bonding pattern unveiled a new class of radical metal-carbene complex, with a doublet cobalt into which a triplet carbene

  1. Exploring the "what if?" in geology through a RESTful open-source framework for cloud-based simulation and analysis

    Science.gov (United States)

    Klump, Jens; Robertson, Jess

    2016-04-01

    The spatial and temporal extent of geological phenomena makes experiments in geology difficult, if not entirely impossible, to conduct, and the collection of data is laborious and expensive - so expensive that most of the time we cannot test a hypothesis. The aim, in many cases, is to gather enough data to build a predictive geological model. Even in a mine, where data are abundant, a model remains incomplete because the information at the level of a blasting block is two orders of magnitude larger than the sample from a drill core, and we have to take measurement errors into account. So, what confidence can we have in a model based on sparse data, uncertainties and measurement error? Our framework consists of two layers: (a) a ground-truth layer that contains geological models, which can be statistically based on historical operations data, and (b) a network of RESTful synthetic sensor microservices which can query the ground truth for underlying properties and produce a simulated measurement for a control layer, which could be a database or LIMS, a machine learner, or a company's existing data infrastructure. Ground-truth data are generated by an implicit geological model which serves as a host for nested models of geological processes at smaller scales. Our two layers are implemented using Flask and Gunicorn (an open-source Python web application framework and server, respectively), the PyData stack (numpy, scipy, etc.) and RabbitMQ (an open-source message broker). Sensor data are encoded using a JSON-LD version of the SensorML and Observations and Measurements standards. Containerisation of the synthetic sensors using Docker and CoreOS allows rapid and scalable deployment of large numbers of sensors, as well as sensor discovery to form a self-organized dynamic network of sensors. Real-time simulation of data sources can be used to investigate crucial questions such as the potential information gain from future sensing capabilities, or from new sampling strategies, or the
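
    As a flavour of the second layer, here is a minimal synthetic-sensor microservice in Flask (a sketch under our own assumptions; the endpoint names, the toy ground-truth model, and the noise level are all illustrative, not the authors' code):

        import random
        from flask import Flask, jsonify, request

        app = Flask(__name__)

        def ground_truth_grade(x, y, depth):
            # Hypothetical implicit geological model: ore grade decays with depth.
            return max(0.0, 2.5 - 0.01 * depth + 0.1 * ((x + y) % 3))

        @app.route("/sensor/<int:sensor_id>/measure")
        def measure(sensor_id):
            x = float(request.args.get("x", 0.0))
            y = float(request.args.get("y", 0.0))
            depth = float(request.args.get("depth", 0.0))
            true_value = ground_truth_grade(x, y, depth)
            noisy = random.gauss(true_value, 0.05)   # simulated measurement error
            return jsonify({"sensor": sensor_id, "grade_pct": round(noisy, 3)})

        if __name__ == "__main__":
            app.run(port=5000)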

  2. Simulation Framework to Estimate the Performance of CO2 and O2 Sensing from Space and Airborne Platforms for the ASCENDS Mission Requirements Analysis

    Science.gov (United States)

    Plitau, Denis; Prasad, Narasimha S.

    2012-01-01

    The Active Sensing of CO2 Emissions over Nights, Days, and Seasons (ASCENDS) mission recommended by the NRC Decadal Survey has a desired accuracy of 0.3% in carbon dioxide mixing ratio (XCO2) retrievals, requiring careful selection and optimization of the instrument parameters. NASA Langley Research Center (LaRC) is investigating the 1.57 micron carbon dioxide band as well as the 1.26-1.27 micron oxygen bands for our proposed ASCENDS mission requirements investigation. Simulation studies are underway for these bands to select optimum instrument parameters. The simulations are based on a multi-wavelength lidar modeling framework being developed at NASA LaRC to predict the performance of CO2 and O2 sensing from space and airborne platforms. The modeling framework consists of a lidar simulation module and a line-by-line calculation component with interchangeable lineshape routines to test the performance of alternative lineshape models in the simulations. As an option, the line-by-line radiative transfer model (LBLRTM) program may also be used for line-by-line calculations. The modeling framework is being used to perform error analysis, establish optimum measurement wavelengths, and identify the best lineshape models to be used in CO2 and O2 retrievals. Several additional programs for HITRAN database management and related simulations are planned to be included in the framework. The description of the modeling framework with selected results of the simulation studies for CO2 and O2 sensing is presented in this paper.
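
    At its core, a two-wavelength (on/off-line) retrieval of the column-averaged mixing ratio reduces to a differential optical depth calculation; the sketch below (ours, for an idealized well-mixed column with made-up instrument numbers, not the LaRC framework) illustrates the chain from echo powers to XCO2:

        import math

        def xco2_ppm(p_on, p_off, e_on, e_off, dsigma, n_air_column):
            # One-way differential absorption optical depth from a two-way path
            daod = 0.5 * math.log((p_off / e_off) * (e_on / p_on))
            n_co2_column = daod / dsigma          # CO2 column, molecules/m^2
            return 1e6 * n_co2_column / n_air_column

        # Synthetic example: dsigma is a made-up differential cross-section and
        # 2.15e29 molecules/m^2 approximates a sea-level dry-air column.
        print(xco2_ppm(p_on=0.257, p_off=1.0, e_on=1.0, e_off=1.0,
                       dsigma=8e-27, n_air_column=2.15e29))   # ~395 ppm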

  3. On-lattice agent-based simulation of populations of cells within the open-source Chaste framework

    KAUST Repository

    Figueredo, G. P.

    2013-02-21

    Over the years, agent-based models have been developed that combine cell division and reinforced random walks of cells on a regular lattice; reaction-diffusion equations for nutrients and growth factors; and ordinary differential equations for the subcellular networks regulating the cell cycle. When linked to a vascular layer, this multiple-scale model framework has been applied to tumour growth and therapy. Here, we report on the creation of an agent-based multi-scale environment amalgamating the characteristics of these models within a Virtual Physiological Human (VPH) Exemplar Project. This project enables reuse, integration, expansion and sharing of the model and relevant data. The agent-based and reaction-diffusion parts of the multi-scale model have been implemented and are available for download as part of the latest public release of Chaste (Cancer, Heart and Soft Tissue Environment; http://www.cs.ox.ac.uk/chaste/), part of the VPH Toolkit (http://toolkit.vph-noe.eu/). The environment functionalities are verified against the original models, in addition to extra validation of all aspects of the code. In this work, we present the details of the implementation of the agent-based environment, including the system description, the conceptual model, the development of the simulation model and the processes of verification and validation of the simulation results. We explore the potential use of the environment by presenting exemplar applications of the 'what if' scenarios that can easily be studied in the environment. These examples relate to tumour growth, cellular competition for resources and tumour responses to hypoxia (low oxygen levels). We conclude our work by summarizing the future steps for the expansion of the current system.
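
    The on-lattice agent-based core of such models is compact; the following sketch (our toy version, not Chaste code) grows a cell population by letting each cell divide into a random empty neighbouring lattice site with fixed probability:

        import random

        SIZE, STEPS, P_DIV = 21, 30, 0.2
        lattice = {(SIZE // 2, SIZE // 2)}          # start with one cell in the centre

        def empty_neighbours(site):
            x, y = site
            nbrs = [(x + dx, y + dy) for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))]
            return [(i, j) for i, j in nbrs
                    if 0 <= i < SIZE and 0 <= j < SIZE and (i, j) not in lattice]

        for _ in range(STEPS):
            for cell in list(lattice):              # iterate over a snapshot
                free = empty_neighbours(cell)
                if free and random.random() < P_DIV:
                    lattice.add(random.choice(free))  # daughter cell placed at random

        print(f"population after {STEPS} steps: {len(lattice)}")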

  4. Numerical simulation and experimental validation of biofilm in a multi-physics framework using an SPH based method

    Science.gov (United States)

    Soleimani, Meisam; Wriggers, Peter; Rath, Henryke; Stiesch, Meike

    2016-06-01

    In this paper, a 3D computational model has been developed to investigate biofilms in a multi-physics framework using smoothed particle hydrodynamics (SPH) based on a continuum approach. Biofilm formation is a complex process in the sense that several physical phenomena are coupled and consequently different time-scales are involved. On one hand, biofilm growth is driven by biological reaction and nutrient diffusion; on the other hand, it is influenced by fluid flow causing biofilm deformation and interface erosion in the context of fluid and deformable solid interaction. The geometrical and numerical complexity arising from these phenomena poses serious complications and challenges for grid-based techniques such as the finite element method. Here the solution is based on SPH, one of the most powerful meshless methods. SPH-based computational modeling is quite new in the biological community, and the method is uniquely robust in capturing the interface-related processes of biofilm formation such as erosion. The obtained results show good agreement with experimental and published data, which demonstrates that the model is capable of simulating and predicting the overall spatial and temporal evolution of biofilms.
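
    The continuum SPH machinery rests on kernel-weighted sums over neighbouring particles; the sketch below (ours, not the authors' model) evaluates the basic density estimate rho_i = sum_j m_j W(|r_i - r_j|, h) with a standard 2D cubic-spline kernel:

        import math

        def cubic_spline_w(r, h):
            # Standard cubic-spline SPH kernel, 2D normalisation 10 / (7 pi h^2)
            q, sigma = r / h, 10.0 / (7.0 * math.pi * h * h)
            if q < 1.0:
                return sigma * (1 - 1.5 * q**2 + 0.75 * q**3)
            if q < 2.0:
                return sigma * 0.25 * (2 - q)**3
            return 0.0

        def densities(positions, mass, h):
            rho = []
            for xi, yi in positions:
                s = 0.0
                for xj, yj in positions:
                    s += mass * cubic_spline_w(math.hypot(xi - xj, yi - yj), h)
                rho.append(s)
            return rho

        # Uniform 5x5 particle block with spacing 0.1 and smoothing length 0.12
        pts = [(0.1 * i, 0.1 * j) for i in range(5) for j in range(5)]
        print(densities(pts, mass=0.01, h=0.12)[12])  # density at the centre particle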

  5. A framework for the design and specification of hard real-time, hardware-in-the-loop simulations of large, avionic systems

    Science.gov (United States)

    Ricks, Kenneth Gerald

    High-level design tools for the design and specification of avionic systems and real-time systems currently exist. However, real-time, hardware-in-the-loop simulations of avionic systems are based upon principles fundamentally different from those used to design avionic systems, and they represent a specialized case of real-time systems. As a result, the high-level software tools used to design avionic systems and real-time systems cannot be applied to the design of real-time, hardware-in-the-loop simulations of avionic systems. For this reason, such simulations of avionic systems should not be considered part of the domain containing avionic systems or general-purpose real-time systems, and should be considered an application domain unto itself for which design tools are unavailable. To fill this void, this dissertation proposes a framework for the design and specification of real-time, hardware-in-the-loop simulations of avionic systems. This framework is based upon a new specification language called the Simulation Architecture Description Language. This specification language is a graphical language with constructs and semantics defined to provide the user with the capability to completely define the simulation and its software execution characteristics at various levels of abstraction. The language includes a new method for combining precedence constraints for a single software process. These semantics provide a more accurate description of the behavior of software systems having a dynamic job structure than existing semantics. An environment that supports the execution of simulation software having the semantics defined within this language is also described. A toolset that interfaces to the language and provides additional functionality such as design analysis, schedulability analysis, and simulation file generation is also discussed. This framework provides a complete design and specification environment for real-time, hardware-in-the-loop simulations of

  6. ELIST8: simulating military deployments in Java

    International Nuclear Information System (INIS)

    Planning for the transportation of large amounts of equipment, troops, and supplies presents a complex problem. Many options, including modes of transportation, vehicles, facilities, routes, and timing, must be considered. The amount of data involved in generating and analyzing a course of action (e.g., detailed information about military units, logistical infrastructures, and vehicles) is enormous. Software tools are critical in defining and analyzing these plans. Argonne National Laboratory has developed ELIST (Enhanced Logistics Intra-theater Support Tool), a simulation-based decision support system, to assist military planners in determining the logistical feasibility of an intra-theater course of action. The current version of ELIST (v.8) contains a discrete event simulation developed using the Java programming language. Argonne selected Java because of its object-oriented framework, which has greatly facilitated entity and process development within the simulation, and because it fulfills a primary requirement for multi-platform execution. This paper describes the model, including setup and analysis, a high-level architectural design, and an evaluation of Java
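
    The discrete event simulation at the heart of such a tool is a time-ordered event queue driven by a scheduler; the sketch below conveys the flavour in Python (ELIST itself is written in Java, and all names here are illustrative):

        import heapq

        class Simulator:
            def __init__(self):
                self.clock, self._queue, self._seq = 0.0, [], 0

            def schedule(self, delay, action, *args):
                heapq.heappush(self._queue, (self.clock + delay, self._seq, action, args))
                self._seq += 1                      # tie-breaker for simultaneous events

            def run(self, until):
                while self._queue and self._queue[0][0] <= until:
                    self.clock, _, action, args = heapq.heappop(self._queue)
                    action(*args)

        sim = Simulator()

        def arrive(convoy, port):
            print(f"t={sim.clock:5.1f}: {convoy} arrives at {port}")
            sim.schedule(12.0, depart, convoy, port)   # unload for 12 hours

        def depart(convoy, port):
            print(f"t={sim.clock:5.1f}: {convoy} departs {port}")

        sim.schedule(3.0, arrive, "convoy-1", "seaport-A")
        sim.run(until=48.0)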

  7. Environmental monitoring at Argonne National Laboratory. Annual report for 1976

    Energy Technology Data Exchange (ETDEWEB)

    Golchert, N.W.; Duffy, T.L.; Sedlet, J.

    1977-03-01

    The results of the environmental monitoring program at Argonne National Laboratory for 1976 are presented and discussed. To evaluate the effect of Argonne operations on the environment, measurements were made for a variety of radionuclides in air, surface water, Argonne effluent water, soil, grass, bottom sediment, and foodstuffs; for a variety of chemical constituents in surface and Argonne effluent water; and of the environmental penetrating radiation dose. Sample collections and measurements were made at the site boundary and off the Argonne site for comparison purposes. Some on-site measurements were made to aid in the interpretation of the boundary and off-site data. The results of the program are interpreted in terms of the sources and origin of the radioactive and chemical substances (natural, fallout, Argonne, and other) and are compared with accepted environmental quality standards. The potential radiation dose to off-site population groups is also estimated.

  8. Environmental monitoring at Argonne National Laboratory. Annual report for 1978

    Energy Technology Data Exchange (ETDEWEB)

    Golchert, N W; Duffy, T L; Sedlet, J

    1979-03-01

    The results of the environmental monitoring program at Argonne National Laboratory for 1978 are presented and discussed. To evaluate the effect of Argonne operations on the environment, measurements were made for a variety of radionuclides in air, surface water, Argonne effluent water, soil, grass, bottom sediment, and foodstuffs; for a variety of chemical constituents in air, surface water, and Argonne effluent water; and of the environmental penetrating radiation dose. Sample collections and measurements were made at the site boundary and off the Argonne site for comparison purposes. Some on-site measurements were made to aid in the interpretation of the boundary and off-site data. The results of the program are interpreted in terms of the sources and origin of the radioactive and chemical substances (natural, fallout, Argonne, and other) and are compared with applicable environmental quality standards. The potential radiation dose to off-site population groups is also estimated.

  9. Environmental monitoring at Argonne National Laboratory. Annual report for 1978

    International Nuclear Information System (INIS)

    The results of the environmental monitoring program at Argonne National Laboratory for 1978 are presented and discussed. To evaluate the effect of Argonne operations on the environment, measurements were made for a variety of radionuclides in air, surface water, Argonne effluent water, soil, grass, bottom sediment, and foodstuffs; for a variety of chemical constituents in air, surface water, and Argonne effluent water; and of the environmental penetrating radiation dose. Sample collections and measurements were made at the site boundary and off the Argonne site for comparison purposes. Some on-site measurements were made to aid in the interpretation of the boundary and off-site data. The results of the program are interpreted in terms of the sources and origin of the radioactive and chemical substances (natural, fallout, Argonne, and other) and are compared with applicable environmental quality standards. The potential radiation dose to off-site population groups is also estimated

  10. Argonne plasma wake-field acceleration experiments

    International Nuclear Information System (INIS)

    Four years after the initial proposal of the Plasma Wake-field Accelerator (PWFA), it continues to be the object of much investigation, due to the promise of the ultra-high accelerating gradients that can exist in relativistic plasma waves driven in the wake of charged particle beams. These wake-fields are of interest both in the laboratory, for acceleration and focusing of electrons and positrons in future linear colliders, and in nature as a possible cosmic ray acceleration mechanism. The purpose of the present work is to review the recent experimental advances made in PWFA research at Argonne National Laboratory. Some of the topics discussed are: the Argonne Advanced Accelerator Test Facility; linear plasma wake-field theory; measurement of linear plasma wake-fields; review of nonlinear plasma wave theory; and experimental measurement of nonlinear plasma wake-fields. 25 refs., 11 figs

  11. Towards a framework for teaching about information technology risk in health care: Simulating threats to health data and patient safety

    OpenAIRE

    Borycki, Elizabeth M

    2015-01-01

    In this paper the author describes work towards developing an integrative framework for educating health information technology professionals about technology risk. The framework considers multiple sources of risk to health data quality and integrity that can result from the use of health information technology (HIT) and can be used to teach health professional students about these risks when using health technologies. This framework encompasses issues and problems that may arise from varied ...

  12. The Relational Database Aspects of Argonne's ATLAS Control System

    OpenAIRE

    Quock, D. E. R.; Munson, F. H.; Eder, K. J.; Dean, S. L.

    2001-01-01

    Argonne's ATLAS (Argonne Tandem Linac Accelerator System) control system comprises two separate database concepts. The first is the distributed real-time database structure provided by the commercial product Vsystem [1]. The second is a more static relational database archiving system designed by ATLAS personnel using Oracle Rdb [2] and Paradox [3] software. The configuration of the ATLAS facility has presented a unique opportuni...

  13. Materials technology at Argonne National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Betten, P.

    1989-01-01

    Argonne is actively involved in the research and development (R&D) of new materials. Five new materials technologies have been identified as having commercial potential and are presented in this paper: (1) nanophase materials, (2) nuclear magnetic resonance (NMR) imaging of ceramics, (3) superconductivity developments and technology transfer mechanisms, (4) COMMIX computer code modeling for metal castings, and (5) tribology using ion-assisted deposition (IAB). 4 refs., 7 figs., 1 tab.

  14. Materials technology at Argonne National Laboratory

    International Nuclear Information System (INIS)

    Argonne is actively involved in the research and development (R&D) of new materials. Five new materials technologies have been identified as having commercial potential and are presented in this paper: (1) nanophase materials, (2) nuclear magnetic resonance (NMR) imaging of ceramics, (3) superconductivity developments and technology transfer mechanisms, (4) COMMIX computer code modeling for metal castings, and (5) tribology using ion-assisted deposition (IAB). 4 refs., 7 figs., 1 tab

  15. Effect of the partial charge distributions in the ZIF-8 framework on the adsorption of CO2 as assessed by Gibbs Ensemble Monte Carlo simulations

    Science.gov (United States)

    Puphasuk, P.; Remsungnen, T.

    2016-03-01

    The effect of the partial charge distributions on the adsorption of CO2 molecules in the ZIF-8 framework has been investigated using GEMC simulations. The charge models are derived from the ab initio electrostatic potential surfaces of framework fragments. The signs of the atomic charges obtained with the MK model are more consistent with respect to the basis set than those obtained from the other models. The charges on the H1 and H2 atoms are found to have a notable effect on the amount of adsorbed CO2, in agreement with a previous report that these hydrogen sites are the favored adsorption sites.
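
    In such simulations the framework partial charges enter the sorbate-framework energy through a Coulomb sum; the sketch below (ours, with EPM2-like CO2 site charges and invented framework sites, not the paper's force field) shows the basic term:

        import math

        COULOMB = 332.06  # kcal*Angstrom/(mol*e^2)

        def coulomb_energy(sorbate, framework):
            # Each entry: (charge in e, (x, y, z) in Angstrom)
            e = 0.0
            for qi, ri in sorbate:
                for qj, rj in framework:
                    e += COULOMB * qi * qj / math.dist(ri, rj)
            return e

        # Linear CO2 (EPM2-like charges) near two hypothetical framework sites
        co2 = [(0.6512, (0.0, 0.0, 0.0)),
               (-0.3256, (1.149, 0.0, 0.0)),
               (-0.3256, (-1.149, 0.0, 0.0))]
        frame = [(0.4, (3.5, 0.0, 0.0)), (-0.4, (4.5, 1.0, 0.0))]
        print(f"electrostatic energy: {coulomb_energy(co2, frame):.2f} kcal/mol")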

  16. A flexible object-based software framework for modeling complex systems with interacting natural and societal processes.

    Energy Technology Data Exchange (ETDEWEB)

    Christiansen, J. H.

    2000-06-15

    The Dynamic Information Architecture System (DIAS) is a flexible, extensible, object-based framework for developing and maintaining complex multidisciplinary simulations. The DIAS infrastructure makes it feasible to build and manipulate complex simulation scenarios in which many thousands of objects can interact via dozens to hundreds of concurrent dynamic processes. The flexibility and extensibility of the DIAS software infrastructure stem mainly from (1) the abstraction of object behaviors, (2) the encapsulation and formalization of model functionality, and (3) the mutability of domain object contents. DIAS simulation objects are inherently capable of highly flexible and heterogeneous spatial realizations. Geospatial graphical representation of DIAS simulation objects is addressed via the GeoViewer, an object-based GIS toolkit application developed at ANL. DIAS simulation capabilities have been extended by inclusion of societal process models generated by the Framework for Addressing Cooperative Extended Transactions (FACET), another object-based framework developed at Argonne National Laboratory. By using FACET models to implement societal behaviors of individuals and organizations within larger DIAS-based natural systems simulations, it has become possible to conveniently address a broad range of issues involving interaction and feedback among natural and societal processes. Example DIAS application areas discussed in this paper include a dynamic virtual oceanic environment, detailed simulation of clinical, physiological, and logistical aspects of health care delivery, and studies of agricultural sustainability of urban centers under environmental stress in ancient Mesopotamia.

  17. Neutronics code development at Argonne National Laboratory

    International Nuclear Information System (INIS)

    As part of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program of the U.S. DOE, a suite of modern fast reactor simulation tools is being developed at Argonne National Laboratory. The general goal is to reduce the uncertainties and biases in various areas of reactor design activities by providing enhanced prediction capabilities. Under this fast reactor simulation program, a high-fidelity deterministic neutron transport code named UNIC is being developed. The end goal of this development is to produce an integrated neutronics code that enables the high-fidelity description of a nuclear reactor and simplifies the multi-step design process by direct and accurate coupling with thermal-hydraulics and structural mechanics calculations. Current fast reactor analysis tools such as the fuel cycle analysis packages REBUS-3 and ERANOS, and the safety analysis package SASSYS contain neutronics packages built around multi-step averaging techniques (spatial homogenization and energy collapsing). These approximations vastly reduce the total space-angle-energy degrees of freedom required for nuclear reactor analysis and provide reasonably good solutions for most fast reactor design and analysis calculations. However, they have limitations in providing reliable answers for difficult reactor physics problems (e.g., the reactivity feedback due to core radial expansion). Additionally, it is desirable to reduce the uncertainties and biases in various areas of reactor design activities with the enhanced prediction capabilities that higher fidelity solvers provide. We therefore have a long term goal of replacing the multi-step averaging approximations by progressively more accurate treatments of the entire space-angle-energy phase space with sufficiently fine-grained levels of discretization. Given that high-fidelity transport calculations are not required in all areas of reactor analysis, we also desire an analysis tool that can allow the user to start at the

  18. A generalized adjoint framework for sensitivity and global error estimation in time-dependent nuclear reactor simulations

    International Nuclear Information System (INIS)

    Highlights: (1) We develop an abstract framework for computing the adjoint to the neutron/nuclide burnup equations posed as a system of differential algebraic equations. (2) We validate use of the adjoint for computing both sensitivity to uncertain inputs and estimates of global time discretization error. (3) Flexibility of the framework is leveraged to add heat transfer physics and compute its adjoint without a reformulation of the adjoint system. (4) Such flexibility is crucial for high-performance computing applications. Abstract: We develop a general framework for computing the adjoint variable to nuclear engineering problems governed by a set of differential-algebraic equations (DAEs). The nuclear engineering community has a rich history of developing and applying adjoints for sensitivity calculations; many such formulations, however, are specific to a certain set of equations, variables, or solution techniques. Any change or addition to the physics model would require a reformulation of the adjoint problem and substantial difficulties in its software implementation. In this work we propose an abstract framework that allows for the modification and expansion of the governing equations, leverages the existing theory of adjoint formulation for DAEs, and results in adjoint equations that can be used to efficiently compute sensitivities for parametric uncertainty quantification. Moreover, as we justify theoretically and demonstrate numerically, the same framework can be used to estimate global time discretization error. We first motivate the framework and show that the coupled Bateman and transport equations, which govern the time-dependent neutronic behavior of a nuclear reactor, may be formulated as a DAE system with a power constraint. We then use a variational approach to develop the parameter-dependent adjoint framework and apply existing theory to give formulations for sensitivity and global time discretization error estimates using the adjoint
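
    The adjoint idea is easiest to see on a toy problem; the sketch below (our illustration, far simpler than the DAE framework described above) computes the discrete-adjoint sensitivity dJ/da of J = y(T) for y' = a*y under forward Euler, which matches the analytic value T*y0*exp(a*T) as the step size shrinks:

        def sensitivity(a, y0, T, n):
            h = T / n
            ys = [y0]
            for _ in range(n):                 # forward sweep, storing the trajectory
                ys.append(ys[-1] * (1 + h * a))
            lam, dJda = 1.0, 0.0               # lambda_N = dJ/dy_N = 1
            for k in range(n - 1, -1, -1):     # backward (adjoint) sweep
                dJda += lam * h * ys[k]        # d(step k)/da contribution
                lam *= (1 + h * a)             # lambda_k = lambda_{k+1} * dy_{k+1}/dy_k
            return ys[-1], dJda

        J, grad = sensitivity(a=0.5, y0=1.0, T=2.0, n=2000)
        print(J, grad)   # compare with exp(1) ~ 2.718 and T*exp(aT) ~ 5.437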

  19. Steady-State Gyrokinetics Transport Code (SSGKT), A Scientific Application Partnership with the Framework Application for Core-Edge Transport Simulations, Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Fahey, Mark R. [Oak Ridge National Laboratory]; Candy, Jeff [General Atomics]

    2013-11-07

    This project initiated the development of TGYRO, a steady-state gyrokinetic transport code (SSGKT) that integrates micro-scale GYRO turbulence simulations into a framework for practical multi-scale simulation of conventional tokamaks as well as future reactors. Using a lightweight master transport code, multiple independent (each massively parallel) gyrokinetic simulations are coordinated. The capability to evolve profiles using the TGLF model was also added to TGYRO and represents a more typical use case for TGYRO. The goal of the project was to develop a steady-state gyrokinetic transport code that integrates micro-scale gyrokinetic turbulence simulations into a framework for practical multi-scale simulation of a burning plasma core, the International Thermonuclear Experimental Reactor (ITER) in particular. This multi-scale simulation capability will be used to predict the performance (the fusion energy gain, Q) given the H-mode pedestal temperature and density. At present, projections of this type rely on transport models like GLF23, which are based on rather approximate fits to the results of linear and nonlinear simulations. Our goal is to make these performance projections with precise nonlinear gyrokinetic simulations. The method of approach is to use a lightweight master transport code to coordinate multiple independent (each massively parallel) gyrokinetic simulations using the GYRO code. This project targets the practical multi-scale simulation of a reactor core plasma in order to predict the core temperature and density profiles given the H-mode pedestal temperature and density. A master transport code will provide feedback to O(16) independent gyrokinetic simulations (each massively parallel). A successful feedback scheme offers a novel approach to predictive modeling of an important national and international problem. Success in this area of fusion simulations will allow US scientists to direct the research path of ITER over the next two
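
    A minimal sketch of the "lightweight master transport code" idea in Python. The gyrokinetic flux evaluation is mocked by a critical-gradient response; the function names, the relaxation scheme, and all parameter values are illustrative assumptions, not TGYRO internals.

        import numpy as np

        def mock_gyro_flux(a):
            # stand-in for O(16) independent, massively parallel gyrokinetic
            # runs, one per radial point: turbulent flux rises sharply once
            # the normalized gradient a = -d ln T/dr exceeds a critical value
            return np.maximum(a - 1.0, 0.0) ** 2

        def profile_from_gradient(a, r, T_edge=1.0):
            # integrate -d ln T/dr = a inward from the plasma edge
            dr = r[1] - r[0]
            lnT = np.concatenate(([0.0], np.cumsum(a[::-1][:-1]) * dr))[::-1]
            return T_edge * np.exp(lnT)

        def steady_state(r, target_flux, relax=0.2, n_iter=300):
            a = np.full_like(r, 2.0)             # initial gradient guess
            for _ in range(n_iter):
                # feedback: steepen where flux is too low, flatten where too high
                a -= relax * (mock_gyro_flux(a) - target_flux)
            return profile_from_gradient(a, r)

        r = np.linspace(0.0, 1.0, 16)
        T = steady_state(r, target_flux=0.25)    # converges to a = 1.5 here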

  20. Recent developments in the target facilities at Argonne National Laboratory

    International Nuclear Information System (INIS)

    A description is given of recent developments in the target facility at Argonne National Laboratory. Highlights include equipment upgrades which enable us to provide enhanced capabilities in support of the Argonne Heavy-Ion ATLAS Accelerator Project. Future plans and additional equipment acquisitions are also discussed. 3 refs., 3 tabs

  1. Description of the Argonne National Laboratory target making facility

    International Nuclear Information System (INIS)

    A description is given of some recent developments in the target facility at Argonne National Laboratory. Highlights include equipment upgrades which enable us to provide enhanced capabilities in support of the Argonne Heavy-Ion ATLAS Accelerator Program. Work currently in progress is described and future prospects discussed. 8 refs

  2. Environmental monitoring at Argonne National Laboratory. Annual report for 1982

    International Nuclear Information System (INIS)

    The results of the environmental monitoring program at Argonne National Laboratory for 1982 are presented and discussed. To evaluate the effect of Argonne operations on the environment, measurements were made for a variety of radionuclides in air, surface water, soil, grass, bottom sediment, and milk; for a variety of chemical constituents in air, surface water, ground water, and Argonne effluent water; and of the environmental penetrating radiation dose. Sample collections and measurements were made at the site boundary and off the Argonne site for comparison purposes. Some on-site measurements were made to aid in the interpretation of the boundary and off-site data. The results of the program are interpreted in terms of the sources and origin of the radioactive and chemical substances (natural, fallout, Argonne, and other) and are compared with applicable environmental quality standards. The potential radiation dose to off-site population groups is also estimated

  3. Environmental monitoring at Argonne National Laboratory. Annual report for 1983

    International Nuclear Information System (INIS)

    The results of the environmental monitoring program at Argonne National Laboratory for 1983 are presented and discussed. To evaluate the effect of Argonne operations on the environment, measurements were made for a variety of radionuclides in air, surface water, soil, grass, bottom sediment, and milk; for a variety of chemical constituents in air, surface water, ground water, and Argonne effluent water; and of the environmental penetrating radiation dose. Sample collections and measurements were made at the site boundary and off the Argonne site for comparison purposes. Some on-site measurements were made to aid in the interpretation of the boundary and off-site data. The potential radiation dose to off-site population groups is also estimated. The results of the program are interpreted in terms of the sources and origin of the radioactive and chemical substances (natural, fallout, Argonne, and other) and are compared with applicable environmental quality standards. 19 references, 8 figures, 49 tables

  4. Environmental monitoring at Argonne National Laboratory. Annual report for 1980

    Energy Technology Data Exchange (ETDEWEB)

    Golchert, N. W.; Duffy, T. L.; Sedlet, J.

    1981-03-01

    The results of the environmental monitoring program at Argonne National Laboratory for 1980 are presented and discussed. To evaluate the effect of Argonne operations on the environment, measurements were made for a variety of radionuclides in air, surface water, soil, grass, bottom sediment, and foodstuffs; for a variety of chemical constituents in air, surface water, and Argonne effluent water; and of the environmental penetrating radiation dose. Sample collections and measurements were made at the site boundary and off the Argonne site for comparison purposes. Some on-site measurements were made to aid in the interpretation of the boundary and off-site data. The results of the program are interpreted in terms of the sources and origin of the radioactive and chemical substances (natural, fallout, Argonne, and other) and are compared with applicable environmental quality standards. The potential radiation dose to off-site population groups is also estimated.

  5. Environmental monitoring at Argonne National Laboratory. Annual report for 1984

    International Nuclear Information System (INIS)

    The results of the environmental monitoring program at Argonne National Laboratory for 1984 are presented and discussed. To evaluate the effect of Argonne operations on the environment, measurements were made for a variety of radionuclides in air, surface water, ground water, soil, grass, bottom sediment, and milk; for a variety of chemical constituents in surface water, ground water, and Argonne effluent water; and of the environmental penetrating radiation dose. Sample collections and measurements were made on the site, at the site boundary, and off the Argonne site for comparison purposes. The potential radiation dose to off-site population groups is also estimated. The results of the program are interpreted in terms of the sources and origin of the radioactive and chemical substances (natural, fallout, Argonne, and other) and are compared with applicable environmental quality standards. 20 refs., 8 figs., 46 tabs

  6. Environmental monitoring at Argonne National Laboratory. Annual report for 1983

    Energy Technology Data Exchange (ETDEWEB)

    Golchert, N.W.; Duffy, T.L.; Sedlet, J.

    1984-03-01

    The results of the environmental monitoring program at Argonne National Laboratory for 1983 are presented and discussed. To evaluate the effect of Argonne operations on the environment, measurements were made for a variety of radionuclides in air, surface water, soil, grass, bottom sediment, and milk; for a variety of chemical constituents in air, surface water, ground water, and Argonne effluent water; and of the environmental penetrating radiation dose. Sample collections and measurements were made at the site boundary and off the Argonne site for comparison purposes. Some on-site measurements were made to aid in the interpretation of the boundary and off-site data. The potential radiation dose to off-site population groups is also estimated. The results of the program are interpreted in terms of the sources and origin of the radioactive and chemical substances (natural, fallout, Argonne, and other) and are compared with applicable environmental quality standards. 19 references, 8 figures, 49 tables.

  7. Environmental monitoring at Argonne National Laboratory. Annual report, 1981

    International Nuclear Information System (INIS)

    The results of the environmental monitoring program at Argonne National Laboratory for 1981 are presented and discussed. To evaluate the effect of Argonne operations on the environment, measurements were made for a variety of radionuclides in air, surface water, soil, grass, bottom sediment, and milk; for a variety of chemical constituents in air, surface water, and Argonne effluent water; and of the environmental penetrating radiation dose. Sample collections and measurements were made at the site boundary and off the Argonne site for comparison purposes. Some on-site measurements were made to aid in the interpretation of the boundary and off-site data. The results of the program are interpreted in terms of the sources and origin of the radioactive and chemical substances (natural, fallout, Argonne, and other) and are compared with applicable environmental quality standards. The potential radiation dose to off-site population groups is also estimated

  8. Environmental monitoring at Argonne National Laboratory. Annual report, 1981

    Energy Technology Data Exchange (ETDEWEB)

    Golchert, N.W.; Duffy, T.L.; Sedlet, J.

    1982-03-01

    The results of the environmental monitoring program at Argonne National Laboratory for 1981 are presented and discussed. To evaluate the effect of Argonne operations on the environment, measurements were made for a variety of radionuclides in air, surface water, soil, grass, bottom sediment, and milk; for a variety of chemical constituents in air, surface water, and Argonne effluent water; and of the environmental penetrating radiation dose. Sample collections and measurements were made at the site boundary and off the Argonne site for comparison purposes. Some on-site measurements were made to aid in the interpretation of the boundary and off-site data. The results of the program are interpreted in terms of the sources and origin of the radioactive and chemical substances (natural, fallout, Argonne, and other) and are compared with applicable environmental quality standards. The potential radiation dose to off-site population groups is also estimated.

  9. Environmental monitoring at Argonne National Laboratory. Annual report for 1980

    International Nuclear Information System (INIS)

    The results of the environmental monitoring program at Argonne National Laboratory for 1980 are presented and discussed. To evaluate the effect of Argonne operations on the environment, measurements were made for a variety of radionuclides in air, surface water, soil, grass, bottom sediment, and foodstuffs; for a variety of chemical constituents in air, surface water, and Argonne effluent water; and of the environmental penetrating radiation dose. Sample collections and measurements were made at the site boundary and off the Argonne site for comparison purposes. Some on-site measurements were made to aid in the interpretation of the boundary and off-site data. The results of the program are interpreted in terms of the sources and origin of the radioactive and chemical substances (natural, fallout, Argonne, and other) and are compared with applicable environmental quality standards. The potential radiation dose to off-site population groups is also estimated

  10. Environmental monitoring at Argonne National Laboratory. Annual report for 1979

    International Nuclear Information System (INIS)

    The results of the environmental monitoring program at Argonne National Laboratory for 1979 are presented and discussed. To evaluate the effect of Argonne operations on the environment, measurements were made for a variety of radionuclides in air, surface water, Argonne effluent water, soil, grass, bottom sediment, and foodstuffs; for a variety of chemical constituents in air, surface water, and Argonne effluent water; and of the environmental penetrating radiation dose. Sample collections and measurements were made at the site boundary and off the Argonne site for comparison purposes. Some on-site measurements were made to aid in the interpretation of the boundary and off-site data. The results of the program are interpreted in terms of the sources and origin of the radioactive and chemical substances and are compared with applicable environmental quality standards. The potential radiation dose to off-site population groups is also estimated

  11. The Argonne Wakefield Accelerator: Overview and status

    International Nuclear Information System (INIS)

    The Argonne Wakefield Accelerator (AWA) is a new facility for advanced accelerator research, with a particular emphasis on studies of high-gradient (∼100 MeV/m) wakefield acceleration. A novel high-current, short-pulse L-band photocathode gun and preaccelerator will provide 100 nC electron bunches at 20 MeV to be used as a drive beam, while a second high-brightness gun will be used to generate a 5 MeV witness beam for wakefield measurements. We will present an overview of the various AWA systems, the status of construction, and initial commissioning results

  12. The SOPHY framework

    DEFF Research Database (Denmark)

    Laursen, Karl Kaas; Pedersen, M. F.; Bendtsen, Jan Dimon; Alminde, Lars

    The goal of the Sophy framework (Simulation, Observation and Planning in Hybrid Systems) is to implement a multi-level framework for description, simulation, observation, fault detection and recovery, diagnosis and autonomous planning in distributed embedded hybrid systems. A Java-based distributed...

  13. The SOPHY Framework

    DEFF Research Database (Denmark)

    Laursen, Karl Kaas; Pedersen, Martin Fejrskov; Bendtsen, Jan Dimon; Alminde, Lars

    The goal of the Sophy framework (Simulation, Observation and Planning in Hybrid Systems) is to implement a multi-level framework for description, simulation, observation, fault detection and recovery, diagnosis and autonomous planning in distributed embedded hybrid systems. A Java-based distributed...

  14. Argonne Bubble Experiment Thermal Model Development II

    Energy Technology Data Exchange (ETDEWEB)

    Buechler, Cynthia Eileen [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)]

    2016-07-01

    This report describes the continuation of the work reported in “Argonne Bubble Experiment Thermal Model Development”. The experiment was performed at Argonne National Laboratory (ANL) in 2014. A rastered 35 MeV electron beam deposited power in a solution of uranyl sulfate, generating heat and radiolytic gas bubbles. Irradiations were performed at three beam power levels, 6, 12 and 15 kW. Solution temperatures were measured by thermocouples, and gas bubble behavior was observed. It describes the Computational Fluid Dynamics (CFD) model that was developed to calculate the temperatures and gas volume fractions in the solution vessel during the irradiations. The previous report described an initial analysis performed on a geometry that had not been updated to reflect the as-built solution vessel. Here, the as-built geometry is used. Monte Carlo N-Particle (MCNP) calculations were performed on the updated geometry, and these results were used to define the power deposition profile for the CFD analyses, which were performed using Fluent, Ver. 16.2. CFD analyses were performed for the 12 and 15 kW irradiations, and further improvements to the model were incorporated, including the consideration of power deposition in nearby vessel components, gas mixture composition, and bubble size distribution. The temperature results of the CFD calculations are compared to experimental measurements.

  15. Little by little does the trick design and construction of a discrete event agent-based simulation framework

    OpenAIRE

    Matsopoulos, Alexandros

    2007-01-01

    Simulation is one of the most widely used techniques in operations research. In the military context, agent-based simulations have been extensively used by defense agencies worldwide. Despite the numerous disadvantages and limitations associated with time-stepping, most of the combat-oriented agent-based simulation models are time-stepped implementations. The Discrete Event Scheduling (DES) paradigm, on the other hand, is free of these disadvantages and limitations. The scope of this thesis...
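
    A minimal sketch of the discrete-event scheduling idea the thesis contrasts with time-stepping, in Python (names are illustrative, not from the thesis): events are processed in time-stamp order, so the clock jumps directly to the next event instead of advancing in fixed increments.

        import heapq

        class Scheduler:
            def __init__(self):
                self._queue = []
                self.now = 0.0

            def schedule(self, delay, action):
                # id(action) breaks ties so functions are never compared
                heapq.heappush(self._queue, (self.now + delay, id(action), action))

            def run(self, until):
                while self._queue and self._queue[0][0] <= until:
                    self.now, _, action = heapq.heappop(self._queue)
                    action(self)

        def sense(sim):
            print(f"t={sim.now:.2f}: agent senses environment")
            sim.schedule(1.5, sense)   # next sensing event, no fixed time step

        sim = Scheduler()
        sim.schedule(0.0, sense)
        sim.run(until=5.0)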

  16. Pricing Caps in the Heath, Jarrow and Morton Framework Using Monte Carlo Simulations in a Java Applet

    OpenAIRE

    Kalavrezos, Michail

    2007-01-01

    In this paper the Heath, Jarrow and Morton (HJM) framework is applied in the programming language Java for the estimation of the future spot rate. The subcase of an exponential model for the diffusion coefficient (volatility) is used for the pricing of interest rate derivatives (caps).
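
    A hedged sketch of the technique in Python rather than Java (an illustration, not the applet's code; all parameter values are arbitrary). A one-factor HJM forward curve with exponential volatility sigma(t,T) = sig0*exp(-lam*(T-t)) is simulated under the risk-neutral measure, where the no-arbitrage drift is sigma(t,T) times the integral of sigma(t,u) from t to T, and a caplet on the simple rate over [T, T+delta] is priced by averaging discounted payoffs.

        import numpy as np

        def price_caplet(f0=0.03, strike=0.03, T=1.0, delta=0.25,
                         sig0=0.01, lam=0.5, n_paths=50_000, n_steps=100, seed=1):
            rng = np.random.default_rng(seed)
            dt = T / n_steps
            grid = np.arange(0.0, T + delta + 1e-12, dt)   # maturity grid
            f = np.full((n_paths, grid.size), f0)          # flat initial curve
            disc = np.zeros(n_paths)                       # integral of short rate
            for k in range(n_steps):
                t = k * dt
                tau = np.clip(grid - t, 0.0, None)
                sig = sig0 * np.exp(-lam * tau)
                # closed form of int_t^T sigma(t,u) du for exponential volatility
                drift = sig * (sig0 / lam) * (1.0 - np.exp(-lam * tau))
                disc += f[:, k] * dt                       # short rate r(t) = f(t,t)
                dW = np.sqrt(dt) * rng.standard_normal(n_paths)[:, None]
                f = f + drift * dt + sig * dW              # entries with u < t unused
            idxT = n_steps                                 # grid index of maturity T
            int_f = np.trapz(f[:, idxT:], dx=dt, axis=1)   # int_T^{T+delta} f(T,u) du
            L = (np.exp(int_f) - 1.0) / delta              # simple forward rate
            payoff = np.exp(-int_f) * delta * np.maximum(L - strike, 0.0)
            return float(np.mean(np.exp(-disc) * payoff))  # discounted expectation

        print(price_caplet())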

  17. A new numerical framework to simulate viscoelastic free-surface flows with the finite-volume method

    DEFF Research Database (Denmark)

    Comminal, Raphaël; Spangenberg, Jon; Hattel, Jesper Henri

    A new method for the simulation of 2D viscoelastic flow is presented. Numerical stability is obtained by the logarithmic-conformation change of variable, and a fully-implicit pure-streamfunction flow formulation, without use of any artificial diffusion. As opposed to other simulation results, our calculations predict a hydrodynamic instability in the 4:1 contraction geometry at a Weissenberg number of order 4. This new result is in qualitative agreement with the prediction of a non-linear subcritical elastic instability in Poiseuille flow. Our viscoelastic flow solver is coupled with a volume-of-fluid solver in order to predict free-surfaces in extrusion.

  18. A new numerical framework to simulate viscoelastic free-surface flows with the finite-volume method

    DEFF Research Database (Denmark)

    Comminal, Raphaël; Spangenberg, Jon; Hattel, Jesper Henri

    2015-01-01

    A new method for the simulation of 2D viscoelastic flow is presented. Numerical stability is obtained by the logarithmic-conformation change of variable, and a fully-implicit pure-streamfunction flow formulation, without use of any artificial diffusion. As opposed to other simulation results, our calculations predict a hydrodynamic instability in the 4:1 contraction geometry at a Weissenberg number of order 4. This new result is in qualitative agreement with the prediction of a non-linear subcritical elastic instability in Poiseuille flow. Our viscoelastic flow solver is coupled with a volume-of-fluid solver in order to predict free-surfaces in extrusion.

  19. Low-coverage adsorption properties of the metal-organic framework MIL-47 studied by pulse chromatography and Monte Carlo simulations.

    Science.gov (United States)

    Finsy, Vincent; Calero, Sofia; García-Pérez, Elena; Merkling, Patrick J; Vedts, Gill; De Vos, Dirk E; Baron, Gino V; Denayer, Joeri F M

    2009-05-14

    Low-coverage adsorption properties of the metal-organic framework MIL-47 were determined by a combined experimental and simulation study. Henry constants and low coverage adsorption enthalpies of C5-C8 linear and branched alkanes, cyclohexane and benzene were measured from 120 to 240 degrees C using pulse gas chromatography. An adapted force field for linear and branched alkanes in MIL-47 was used to compute the adsorption properties of those molecules. A new set of charges was developed for simulations with benzene in MIL-47. The adsorption enthalpy of linear alkanes increases by about 7.6 kJ mol(-1) per additional -CH2- group. Henry adsorption constants of iso-alkanes are slightly lower than those of the linear chains, but the MIL-47 framework does not impose steric constraints on the branched chains. Benzene and cyclohexane are adsorbed less strongly than n-hexane as they have fewer hydrogen atoms. For the studied non-polar molecules, the adsorption energies are dominated by van der Waals interactions, and benzene adsorption is additionally influenced by Coulombic interactions. The simulated tendencies are in good agreement with the experiments. PMID:19421556
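
    For context, the link between the two measured quantities is the van't Hoff relation for the Henry constant (standard notation, not taken from this record):

    $$\ln K_H = -\frac{\Delta H_{ads}}{R\,T} + C,$$

    so the low-coverage adsorption enthalpy follows from the slope of \ln K_H against 1/T over the 120-240 degrees C window; the ~7.6 kJ mol(-1) increment per -CH2- group is the systematic shift of that slope along the n-alkane series.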

  20. The Development of Dynamic Human Reliability Analysis Simulations for Inclusion in Risk Informed Safety Margin Characterization Frameworks

    Energy Technology Data Exchange (ETDEWEB)

    Jeffrey C. Joe; Diego Mandelli; Ronald L. Boring; Curtis L. Smith; Rachel B. Shirley

    2015-07-01

    The United States Department of Energy is sponsoring the Light Water Reactor Sustainability program, which has the overall objective of supporting the near-term and the extended operation of commercial nuclear power plants. One key research and development (R&D) area in this program is the Risk-Informed Safety Margin Characterization pathway, which combines probabilistic risk simulation with thermal-hydraulic simulation codes to define and manage safety margins. The R&D efforts to date, however, have not included robust simulations of human operators, and how the reliability of human performance or lack thereof (i.e., human errors) can affect risk margins and plant performance. This paper describes current and planned research efforts to address the absence of robust human reliability simulations and thereby increase the fidelity of simulated accident scenarios.

  1. Environmental assessment related to the operation of Argonne National Laboratory, Argonne, Illinois

    Energy Technology Data Exchange (ETDEWEB)

    1982-08-01

    In order to evaluate the environmental impacts of Argonne National Laboratory (ANL) operations, this assessment includes a descriptive section which is intended to provide sufficient detail to allow the various impacts to be viewed in proper perspective. In particular, details are provided on site characteristics, current programs, characterization of the existing site environment, and in-place environmental monitoring programs. In addition, specific facilities and operations that could conceivably impact the environment are described at length. 77 refs., 16 figs., 47 tabs.

  2. The PyZgoubi framework and the simulation of dynamic aperture in fixed-field alternating-gradient accelerators

    International Nuclear Information System (INIS)

    We present PyZgoubi, a framework that has been developed based on the tracking engine Zgoubi to model, optimise and visualise the dynamics in particle accelerators, especially fixed-field alternating-gradient (FFAG) accelerators. We show that PyZgoubi abstracts Zgoubi by wrapping it in an easy-to-use Python framework in order to allow simple construction, parameterisation, visualisation and optimisation of FFAG accelerator lattices. Its object oriented design gives it the flexibility and extensibility required for current novel FFAG design. We apply PyZgoubi to two example FFAGs; this includes determining the dynamic aperture of the PAMELA medical FFAG in the presence of magnet misalignments, and illustrating how PyZgoubi may be used to optimise FFAGs. We also discuss a robust definition of dynamic aperture in an FFAG and show its implementation in PyZgoubi

  3. The PyZgoubi framework and the simulation of dynamic aperture in fixed-field alternating-gradient accelerators

    Energy Technology Data Exchange (ETDEWEB)

    Tygier, S., E-mail: sam.tygier@hep.manchester.ac.uk [Cockcroft Accelerator Group, The University of Manchester (United Kingdom)]; Appleby, R.B., E-mail: robert.appleby@manchester.ac.uk [Cockcroft Accelerator Group, The University of Manchester (United Kingdom)]; Garland, J.M. [Cockcroft Accelerator Group, The University of Manchester (United Kingdom)]; Hock, K. [University of Liverpool (United Kingdom)]; Owen, H. [Cockcroft Accelerator Group, The University of Manchester (United Kingdom)]; Kelliher, D.J.; Sheehy, S.L. [STFC Rutherford Appleton Laboratory (United Kingdom)]

    2015-03-01

    We present PyZgoubi, a framework that has been developed based on the tracking engine Zgoubi to model, optimise and visualise the dynamics in particle accelerators, especially fixed-field alternating-gradient (FFAG) accelerators. We show that PyZgoubi abstracts Zgoubi by wrapping it in an easy-to-use Python framework in order to allow simple construction, parameterisation, visualisation and optimisation of FFAG accelerator lattices. Its object oriented design gives it the flexibility and extensibility required for current novel FFAG design. We apply PyZgoubi to two example FFAGs; this includes determining the dynamic aperture of the PAMELA medical FFAG in the presence of magnet misalignments, and illustrating how PyZgoubi may be used to optimise FFAGs. We also discuss a robust definition of dynamic aperture in an FFAG and show its implementation in PyZgoubi.
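
    A minimal sketch of a dynamic-aperture scan of the kind described above, in Python. Here track_survives is a hypothetical stand-in for a Zgoubi/PyZgoubi tracking call, not the PyZgoubi API; only the scan logic is the generic part.

        def dynamic_aperture(track_survives, amplitudes, n_turns=1000):
            """Largest starting amplitude whose particle survives n_turns."""
            da = 0.0
            for a in sorted(amplitudes):
                if track_survives(x0=a, turns=n_turns):
                    da = a          # still stable at this amplitude
                else:
                    break           # first loss bounds the aperture
            return da

        # usage with a user-supplied tracker closure over a (mis)aligned lattice:
        # da = dynamic_aperture(my_tracker, [0.5e-3 * i for i in range(1, 60)])

    Repeating the scan over an ensemble of randomly misaligned lattices, as in the PAMELA study, then gives a distribution of apertures rather than a single number.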

  4. On-lattice agent-based simulation of populations of cells within the open-source Chaste framework

    OpenAIRE

    Figueredo, G.; Joshi, T.; Osborne, J.; Byrne, H. M.; Owen, M

    2013-01-01

    Over the years, agent-based models have been developed that combine cell division and reinforced random walks of cells on a regular lattice; reaction–diffusion equations for nutrients and growth factors; and ordinary differential equations for the subcellular networks regulating the cell cycle. When linked to a vascular layer, this multiple-scale model framework has been applied to tumour growth and therapy. Here, we report on the creation of an agent-based multi-scale environment amalgamatin...

  5. The Design for Tractable Analysis (DTA) Framework: A Methodology for the Analysis and Simulation of Complex Systems

    OpenAIRE

    John M. Linebarger; Mark J. De Spain; McDonald, Michael J.; Floyd W. Spencer; Robert J. Cloutier

    2009-01-01

    The Design for Tractable Analysis (DTA) framework was developed to address the analysis of complex systems and so-called “wicked problems.” DTA is distinctive because it treats analytic processes as key artifacts that can be created and improved through formal design processes. Systems (or enterprises) are analyzed as a whole, in conjunction with decomposing them into constituent elements for domain-specific analyses that are informed by the whole. After using the Systems Modeling Language...

  6. Scenario Based Education as a Framework for Understanding Students Engagement and Learning in a Project Management Simulation Game

    DEFF Research Database (Denmark)

    Misfeldt, Morten

    2015-01-01

    ...construction work. The goal of the paper is to investigate empirically how these two understandings influence game experience and learning outcome. This question is approached by qualitative post-game interviews about the experienced fun, competition and realism. Specific attention is given to how the... competitive game and as a simulation. Both of these views are meaningful and can be seen as supporting learning. Emphasizing the simulation aspect lets us explain how students learn by being immersed in a simulated world, where the players identify with specific roles, live out specific situations, and...

  7. Nuclear Accident Dosimetry at Argonne National Laboratory

    International Nuclear Information System (INIS)

    This report summarizes current planning at Argonne National Laboratory with respect to dose determination following a criticality incident. The discussion relates chiefly to two types of commercially obtained dosimeter packages, and includes the results of independent calibrations performed at the Laboratory. The primary dosimeter system incorporates threshold detectors developed at Oak Ridge National Laboratory for neutron spectrum measurement. Fission foil decay calibration curves have been determined experimentally for scintillation counting equipment routinely used at Argonne. This equipment also has been calibrated for determination of sodium-24 activity in blood. Dosimeter units of the type designed at Savannah River Laboratory are deployed as secondary stations. Data from the neutron activation components of these units will be used to make corrections to the neutron spectrum for intermediate as well as thermal energies. The epicadmium copper foil activation, for a given fluence of intermediate energy neutrons, has been shown to be relatively insensitive to neutron spectrum variations within the region, and a meaningful average copper cross-section has been determined. Counter calibration factors determined at Argonne are presented for the copper, indium, and sulphur components. The total neutron fluence is computed using the corrected spectrum in conjunction with a capture probability function and the blood sodium result. One or more specifications of neutron dose may then be calculated by applying the spectral information to the appropriate conversion function. The gamma portion of the primary dosimeter package contains fluorescent rods and a thermoluminescent dosimeter in addition to a two-phase chemical dosimeter. The gamma dosimeter in the secondary package is a polyacrylamide solution which is degraded by exposure to gamma radiation. The absorbed dose is derived from a measured change in solution viscosity. Difficulties in evaluation, placement, and

  8. A new numerical framework to simulate viscoelastic free-surface flows with the finite-volume method

    OpenAIRE

    Comminal, Raphaël; Spangenberg, Jon; Hattel, Jesper Henri

    2015-01-01

    A new method for the simulation of 2D viscoelastic flow is presented. Numerical stability is obtained by the logarithmic-conformation change of variable, and a fully-implicit pure-streamfunction flow formulation, without use of any artificial diffusion. As opposed to other simulation results, our calculations predict a hydrodynamic instability in the 4:1 contraction geometry at a Weissenberg number of order 4. This new result is in qualitative agreement with the prediction of a non-linear subcritical elastic instability in Poiseuille flow. Our viscoelastic flow solver is coupled with a volume-of-fluid solver in order to predict free-surfaces in extrusion.

  9. Development and validation of a modelling framework for simulating 2D-mammography and breast tomosynthesis images.

    Science.gov (United States)

    Elangovan, Premkumar; Warren, Lucy M; Mackenzie, Alistair; Rashidnasab, Alaleh; Diaz, Oliver; Dance, David R; Young, Kenneth C; Bosmans, Hilde; Strudley, Celia J; Wells, Kevin

    2014-08-01

    Planar 2D x-ray mammography is generally accepted as the preferred screening technique used for breast cancer detection. Recently, digital breast tomosynthesis (DBT) has been introduced to overcome some of the inherent limitations of conventional planar imaging, and future technological enhancements are expected to result in the introduction of further innovative modalities. However, it is crucial to understand the impact of any new imaging technology or methodology on cancer detection rates and patient recall. Any such assessment conventionally requires large scale clinical trials demanding significant investment in time and resources. The concept of virtual clinical trials and virtual performance assessment may offer a viable alternative to this approach. However, virtual approaches require a collection of specialized modelling tools which can be used to emulate the image acquisition process and simulate images of a quality indistinguishable from their real clinical counterparts. In this paper, we present two image simulation chains constructed using modelling tools that can be used for the evaluation of 2D-mammography and DBT systems. We validate both approaches by comparing simulated images with real images acquired using the system being simulated. A comparison of the contrast-to-noise ratios and image blurring for real and simulated images of test objects shows good agreement (<9% error). This suggests that our simulation approach is a promising alternative to conventional physical performance assessment followed by large scale clinical trials. PMID:25029333
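
    For context, one common convention for the contrast-to-noise ratio used in such comparisons is

    $$\mathrm{CNR} = \frac{\bar{S}_{\mathrm{obj}} - \bar{S}_{\mathrm{bkg}}}{\sigma_{\mathrm{bkg}}},$$

    the difference between the mean signal in the object and background regions divided by the standard deviation of the background; the <9% figure above refers to the discrepancy of this quantity (and of the blurring metrics) between real and simulated images.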

  10. Development and validation of a modelling framework for simulating 2D-mammography and breast tomosynthesis images

    International Nuclear Information System (INIS)

    Planar 2D x-ray mammography is generally accepted as the preferred screening technique used for breast cancer detection. Recently, digital breast tomosynthesis (DBT) has been introduced to overcome some of the inherent limitations of conventional planar imaging, and future technological enhancements are expected to result in the introduction of further innovative modalities. However, it is crucial to understand the impact of any new imaging technology or methodology on cancer detection rates and patient recall. Any such assessment conventionally requires large scale clinical trials demanding significant investment in time and resources. The concept of virtual clinical trials and virtual performance assessment may offer a viable alternative to this approach. However, virtual approaches require a collection of specialized modelling tools which can be used to emulate the image acquisition process and simulate images of a quality indistinguishable from their real clinical counterparts. In this paper, we present two image simulation chains constructed using modelling tools that can be used for the evaluation of 2D-mammography and DBT systems. We validate both approaches by comparing simulated images with real images acquired using the system being simulated. A comparison of the contrast-to-noise ratios and image blurring for real and simulated images of test objects shows good agreement (<9% error). This suggests that our simulation approach is a promising alternative to conventional physical performance assessment followed by large scale clinical trials. (paper)

  11. Drive linac for the Argonne Wakefield Accelerator

    Energy Technology Data Exchange (ETDEWEB)

    Chojnacki, E.; Konecny, R.; Rosing, M.; Simpson, J.

    1993-08-01

    The drive linac in Phase I of the Argonne Wakefield Accelerator (AWA) will be used to accelerate short duration (10 ps), high charge (100 nC) electron bunches from 2 MeV to 20 MeV for use in a variety of wakefield acceleration and measurement studies. The high charge is required since this drive bunch will generate the wakefields of interest in various test sections and their amplitudes are proportional to bunch charge. The short bunch duration is required to drive high-frequency wakefields without intra-bunch cancellation effects. The drive linac design was a balance between having a small wake function to maintain a drive bunch energy spread of ≤10% and obtaining an adequate accelerating gradient of ≥10 MV/m. This yielded a large aperture, low shunt impedance, high group velocity, L-band, standing-wave linac. Details of the design, fabrication, and testing are presented in the following.

  12. Argonne Plasma Engineering Experiment (APEX) Tokamak

    International Nuclear Information System (INIS)

    The Argonne Plasma Engineering Experiment (APEX) Tokamak was designed to provide hot plasmas for reactor-relevant experiments with rf heating (current drive) and plasma wall experiments, principally in-situ low-Z wall coating and maintenance. The device, sized to produce energetic plasmas at minimum cost, is small (R = 51 cm, r = 15 cm) but capable of high currents (100 kA) and long pulse durations (100 ms). A design using an iron central core with no return legs, pure-tension tape-wound toroidal field coils, digital radial position control, and UHV vacuum technology was used. Diagnostics include monochromators, x-ray detectors, and a microwave interferometer and radiometer for density and temperature measurements. Stable 100 ms shots were produced with electron temperatures in the range 500 to 1000 eV. Initial results included studies of thermal desorption and recoating of wall materials

  13. The RD53 Collaboration's SystemVerilog-UVM Simulation Framework and its General Applicability to Design of Advanced Pixel Readout Chips

    CERN Document Server

    Marconi, S; Placidi, Pisana; Christiansen, Jorgen; Hemperek, Tomasz

    2014-01-01

    The foreseen Phase 2 pixel upgrades at the LHC have very challenging requirements for the design of hybrid pixel readout chips. A versatile pixel simulation platform is an essential development tool for the design, verification and optimization of both the system architecture and the pixel chip building blocks (Intellectual Properties, IPs). This work is focused on the implemented simulation and verification environment named VEPIX53, built using the SystemVerilog language and the Universal Verification Methodology (UVM) class library in the framework of the RD53 Collaboration. The environment supports pixel chips at different levels of description: its reusable components feature the generation of different classes of parameterized input hits to the pixel matrix, monitoring of pixel chip inputs and outputs, conformity checks between predicted and actual outputs and collection of statistics on system performance. The environment has been tested performing a study of shared architectures of the trigger late...

  14. Platform Framework of Complex Hybrid System Simulation Based on Colored Petri Nets

    Institute of Scientific and Technical Information of China (English)

    方哲梅; 王明哲; 杨翠蓉

    2011-01-01

    This paper proposes a cross-platform framework for hybrid simulation based on the interaction between colored Petri nets (CPN) and Matlab, using the Petri net as the tool for simulation process control. By utilizing and extending the concept of substitution transitions, together with fusion places and folding, the framework accomplishes complex logical modeling and the embedding of continuous processes for hybrid systems. In addition, the analytical capabilities of CPN reduce the difficulty of logical verification for hybrid systems with complex logical behavior. Finally, the modeling, simulation and analysis of a simple example demonstrate the feasibility of the platform and the validity of the logic, providing a new approach to the modeling and simulation of large, complex hybrid systems.

  15. Investigating H 2 Sorption in a Fluorinated Metal–Organic Framework with Small Pores Through Molecular Simulation and Inelastic Neutron Scattering

    KAUST Repository

    Forrest, Katherine A.

    2015-07-07

    © 2015 American Chemical Society. Simulations of H2 sorption were performed in a metal-organic framework (MOF) consisting of Zn2+ ions coordinated to 1,2,4-triazole and tetrafluoroterephthalate ligands (denoted [Zn(trz)(tftph)] in this work). The simulated H2 sorption isotherms reported in this work are consistent with the experimental data for the state points considered. The experimental H2 isosteric heat of adsorption (Qst) values for this MOF are approximately 8.0 kJ mol-1 for the considered loading range, which is in the proximity of those determined from simulation. The experimental inelastic neutron scattering (INS) spectra for H2 in [Zn(trz)(tftph)] reveal at least two peaks that occur at low energies, which correspond to high barriers to rotation for the respective sites. The most favorable sorption site in the MOF was identified from the simulations as sorption in the vicinity of a metal-coordinated H2O molecule, an exposed fluorine atom, and a carboxylate oxygen atom in a confined region in the framework. Secondary sorption was observed between the fluorine atoms of adjacent tetrafluoroterephthalate ligands. The H2 molecule at the primary sorption site in [Zn(trz)(tftph)] exhibits a rotational barrier that exceeds that for most neutral MOFs with open-metal sites according to an empirical phenomenological model, and this was further validated by calculating the rotational potential energy surface for H2 at this site.
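
    For context, the isosteric heat quoted above is conventionally obtained from isotherms at neighboring temperatures through the Clausius-Clapeyron form (standard notation, not taken from this record):

    $$Q_{st} = R\,T^{2}\left(\frac{\partial \ln P}{\partial T}\right)_{n},$$

    evaluated at fixed loading n, which is why the ~8.0 kJ mol-1 value is reported over a loading range rather than as a single number.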

  16. Argonne National Laboratory research offers clues to Alzheimer's plaques

    CERN Multimedia

    2003-01-01

    Researchers from Argonne National Laboratory and the University of Chicago have developed methods to directly observe the structure and growth of microscopic filaments that form the characteristic plaques found in the brains of those with Alzheimer's Disease (1 page).

  17. Electron scattering. Lectures given at Argonne National Laboratory

    International Nuclear Information System (INIS)

    This report is an almost verbatim copy of lectures on Electron Scattering given at Argonne National Laboratory in the Fall of 1982 by John Dirk Walecka. Professor Walecka was an Argonne Fellow in the Physics Division from October 1982 to January 1983. Broad headings include general considerations, coincidence cross section (e,e'x), quantum electrodynamics and radiative corrections, unification of electroweak interactions, relativistic models of nuclear structure, electroproduction of pions and nucleon resonances, and deep inelastic (e,e')

  18. Argonne National Lab gets Linux network teraflop cluster

    CERN Multimedia

    2003-01-01

    "Linux NetworX, Salt Lake City, Utah, has delivered an Evolocity II (E2) Linux cluster to Argonne National Laboratory that is capable of performing more than one trillion calculations per second (1 teraFLOP). The cluster, named "Jazz" by Argonne, is designed to provide optimum performance for multiple disciplines such as chemistry, physics and reactor engineering and will be used by the entire scientific community at the Lab" (1 page).

  19. Argonne National Laboratory's photooxidation organic mixed-waste treatment system

    International Nuclear Information System (INIS)

    This paper describes the installation and startup testing of the Argonne National Laboratory-East (ANL-E) photo-oxidation organic mixed-waste treatment system. This system will treat organic mixed (i.e., radioactive and hazardous) waste by oxidizing the organics to carbon dioxide and inorganic salts in an aqueous media. The residue will be treated in the existing radwaste evaporators. The system is installed in the waste management facility at the ANL-E site in Argonne, Illinois

  20. pWeb: A High-Performance, Parallel-Computing Framework for Web-Browser-Based Medical Simulation.

    Science.gov (United States)

    Halic, Tansel; Ahn, Woojin; De, Suvranu

    2014-01-01

    This work presents pWeb, a new language and compiler for parallelization of client-side compute-intensive web applications such as surgical simulations. The recently introduced HTML5 standard has enabled unprecedented applications on the web. The low performance of the web browser relative to native applications, however, remains the bottleneck for computationally intensive tasks, including visualization of complex scenes, real-time physical simulation and image processing. The proposed language is built upon web workers for multithreaded programming in HTML5. The language provides fundamental functionalities of parallel programming languages as well as the fork/join parallel model, which is not supported by web workers. The language compiler automatically generates an equivalent parallel script that complies with the HTML5 standard. A case study on realistic rendering for surgical simulations demonstrates enhanced performance with a compact set of instructions. PMID:24732497
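
    An illustration of the fork/join model the paper layers on top of web workers, transplanted to Python's concurrent.futures for brevity (an analogy, not pWeb code or its API; the function name is hypothetical):

        from concurrent.futures import ProcessPoolExecutor

        def render_tile(tile_id: int) -> int:
            # stand-in for a compute-intensive kernel, e.g. shading one tile
            # of a surgical-scene frame
            return sum(i * i for i in range(10_000 + tile_id))

        if __name__ == "__main__":
            with ProcessPoolExecutor() as pool:
                tasks = [pool.submit(render_tile, t) for t in range(8)]  # fork
                frame = [t.result() for t in tasks]                      # join
            print(frame)

    The point of the fork/join abstraction is what the join line shows: the parent blocks until all forked children complete, a pattern plain web workers do not provide out of the box.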

  1. SIMPAR: a portable object-oriented simulation-science-based metamodel framework for performance modeling, prediction, and evaluation of HPC systems

    Science.gov (United States)

    Prasad, Guru; Gupta, Pankaj

    2004-08-01

    We present a novel, portable, platform-independent, object-oriented, simulation-science-based metamodel framework (SimPar) for performance evaluation, estimation, and prediction of High-Performance Computing (HPC) systems. This UML-based, parallel meta-model enhances the Bulk Synchronous Parallel (BSP) computation model. The UML activity diagram is used to model the computation, communication, and synchronization operations of an application. We also identify the UML building blocks that characterize the message passing and shared memory parallel paradigms. This helps in modeling large and complex parallel applications. Using the collaboration diagram concept, parallel applications are mapped onto different multiprocessor architecture topologies such as hypercube, 2D mesh, ring, tree, star, etc. We present unique UML structural and behavioral extensions for modeling the inter-object interactions in the BSP model. Communication semantics such as BROADCAST, GATHER, and SCATTER are incorporated in the metamodel using UML building blocks. In its present form, UML cannot satisfy all the modeling needs. In addition, none of the currently available tool sets deploy UML-based modeling. This underscores the uniqueness of the parallel, cluster-based, UML-enhanced framework presented here. We have validated the proposed model through benchmarks, simulation-science case studies and real-time parallel applications.
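
    For context, in the BSP model that SimPar extends, the cost of one superstep is conventionally written (standard notation, not taken from this record):

    $$T_{\mathrm{step}} = \max_i w_i + g\,h + l,$$

    where w_i is the local computation of processor i, h the maximum number of words any processor sends or receives, g the per-word communication cost, and l the barrier-synchronization latency; the UML activity and collaboration diagrams described above annotate exactly these computation, communication and synchronization terms.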

  2. A new numerical framework to simulate viscoelastic free-surface flows with the finite-volume method

    Science.gov (United States)

    Comminal, R.; Spangenberg, J.; Hattel, J. H.

    2015-04-01

    A new method for the simulation of 2D viscoelastic flow is presented. Numerical stability is obtained by the logarithmic-conformation change of variable, and a fully-implicit pure-streamfunction flow formulation, without use of any artificial diffusion. As opposed to other simulation results, our calculations predict a hydrodynamic instability in the 4:1 contraction geometry at a Weissenberg number of order 4. This new result is in qualitative agreement with the prediction of a non-linear subcritical elastic instability in Poiseuille flow. Our viscoelastic flow solver is coupled with a volume-of-fluid solver in order to predict free-surfaces in extrusion.
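
    For context, the logarithmic-conformation change of variable evolves the logarithm of the conformation tensor c in place of c itself (standard notation, not taken from this record):

    $$\boldsymbol{\Psi} = \log \mathbf{c}, \qquad \mathbf{c} = e^{\boldsymbol{\Psi}},$$

    so the reconstructed c is symmetric positive definite by construction at every time step; this removes the exponential stress growth that destabilizes direct formulations at high Weissenberg numbers and is what permits the stable computations near Wi ~ 4 reported above.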

  3. Study of the response and photon-counting resolution of silicon photomultipliers using a generic simulation framework

    CERN Document Server

    Eckert, P; Schultz-Coulon, H.C

    2012-01-01

    A generic simulation framework has been developed which enables detailed modelling of the SiPM response using basic SiPM parameters and geometry as an input. Depending on the specified SiPM properties, which can be determined from basic characterisation measurements, the simulation generates the signal charge and pulse shape for arbitrary incident light pulse distributions. The simulation has been validated over the whole dynamic range for a Hamamatsu S10362-11-100C MPPC and was used to study the effect of different noise sources, such as optical cross-talk and after-pulsing, on the response curve and the photon-counting resolution.
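
    For context, a common first-order ingredient of such response models is the finite-cell saturation relation (a standard approximation, not necessarily the paper's exact model):

    $$N_{\mathrm{fired}} = N_{\mathrm{cells}}\left(1 - e^{-N_{\mathrm{ph}}\,\mathrm{PDE}/N_{\mathrm{cells}}}\right),$$

    mapping incident photons N_ph to fired pixels through the photon detection efficiency (PDE); a full response simulation layers optical cross-talk, after-pulsing and pulse-shape generation on top of this baseline to reproduce the measured response curve and photon-counting resolution.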

  4. Generation of annular, high-charge electron beams at the Argonne wakefield accelerator

    Science.gov (United States)

    Wisniewski, E. E.; Li, C.; Gai, W.; Power, J.

    2013-01-01

    We present and discuss the results from the experimental generation of high-charge annular (ring-shaped) electron beams at the Argonne Wakefield Accelerator (AWA). These beams were produced by using laser masks to project annular laser profiles of various inner and outer diameters onto the photocathode of an RF gun. The ring beam is accelerated to 15 MeV, then it is imaged by means of solenoid lenses. Transverse profiles are compared for different solenoid settings. Discussion includes a comparison with Parmela simulations, some applications of high-charge ring beams, and an outline of a planned extension of this study.

  5. Intermittent communications modeling and simulation for autonomous unmanned maritime vehicles using an integrated APM and FSMC framework

    Science.gov (United States)

    Coker, Ayodeji; Straatemeier, Logan; Rogers, Ted; Valdez, Pierre; Griendling, Kelly; Cooksey, Daniel

    2014-06-01

    In this work a framework is presented for addressing the issue of intermittent communications faced by autonomous unmanned maritime vehicles operating at sea. In particular, this work considers the subject of predictive atmospheric signal transmission over multi-path fading channels in maritime environments. A Finite State Markov Channel is used to represent a Nakagami-m modeled physical fading radio channel. The range of the received signal-to-noise ratio is partitioned into a finite number of intervals which represent application-specific communications states. The Advanced Propagation Model (APM), developed at the Space and Naval Warfare Systems Center San Diego, provides a characterization of the transmission channel in terms of evaporation duct induced signal propagation loss. APM uses a hybrid ray-optic and parabolic equations model which allows for the computation of electromagnetic (EM) wave propagation over various sea and/or terrain paths. These models which have been integrated in the proposed framework provide a strategic and mission planning aid for the operation of maritime unmanned vehicles at sea.
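
    A minimal sketch of the finite-state Markov channel idea in Python: partition received SNR into intervals ("communication states") and estimate the state transition matrix from an SNR trace. The toy fading trace below is a placeholder for APM/Nakagami-m propagation output; all names and values are illustrative.

        import numpy as np

        def fsmc_transition_matrix(snr_db, thresholds_db):
            states = np.digitize(snr_db, thresholds_db)      # 0..K for K thresholds
            k = len(thresholds_db) + 1
            counts = np.zeros((k, k))
            for a, b in zip(states[:-1], states[1:]):        # count observed jumps
                counts[a, b] += 1
            rows = counts.sum(axis=1, keepdims=True)
            return np.divide(counts, rows, out=np.zeros_like(counts), where=rows > 0)

        rng = np.random.default_rng(0)
        snr = 10 * np.log10(rng.exponential(scale=10.0, size=5000))  # toy fading trace
        P = fsmc_transition_matrix(snr, thresholds_db=[0.0, 5.0, 10.0])
        print(P.round(3))

    In a mission-planning setting, the rows of P then give the probability of staying in, or dropping out of, a usable communications state over the next decision interval.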

  6. Argonne National Laboratory Site Environmental Report for Calendar Year 2013

    Energy Technology Data Exchange (ETDEWEB)

    Davis, T. M. [Argonne National Lab. (ANL), Argonne, IL (United States)]; Gomez, J. L. [Argonne National Lab. (ANL), Argonne, IL (United States)]; Moos, L. P. [Argonne National Lab. (ANL), Argonne, IL (United States)]

    2014-09-02

    This report discusses the status and the accomplishments of the environmental protection program at Argonne National Laboratory for calendar year 2013. The status of Argonne environmental protection activities with respect to compliance with the various laws and regulations is discussed, along with environmental management, sustainability efforts, environmental corrective actions, and habitat restoration. To evaluate the effects of Argonne operations on the environment, samples of environmental media collected on the site, at the site boundary, and off the Argonne site were analyzed and compared with applicable guidelines and standards. A variety of radionuclides were measured in air, surface water, on-site groundwater, and bottom sediment samples. In addition, chemical constituents in surface water, groundwater, and Argonne effluent water were analyzed. External penetrating radiation doses were measured, and the potential for radiation exposure to off-site population groups was estimated. Results are interpreted in terms of the origin of the radioactive and chemical substances (i.e., natural, Argonne, and other) and are compared with applicable standards intended to protect human health and the environment. A U.S. Department of Energy (DOE) dose calculation methodology, based on International Commission on Radiological Protection (ICRP) recommendations and the U.S. Environmental Protection Agency’s (EPA) CAP-88 Version 3 computer code, was used in preparing this report.

  7. Argonne National Laboratory site environmental report for calendar year 2008.

    Energy Technology Data Exchange (ETDEWEB)

    Golchert, N. W.; Davis, T. M.; Moos, L. P.

    2009-09-02

    This report discusses the status and the accomplishments of the environmental protection program at Argonne National Laboratory for calendar year 2008. The status of Argonne environmental protection activities with respect to compliance with the various laws and regulations is discussed, along with the progress of environmental corrective actions and restoration projects. To evaluate the effects of Argonne operations on the environment, samples of environmental media collected on the site, at the site boundary, and off the Argonne site were analyzed and compared with applicable guidelines and standards. A variety of radionuclides were measured in air, surface water, on-site groundwater, and bottom sediment samples. In addition, chemical constituents in surface water, groundwater, and Argonne effluent water were analyzed. External penetrating radiation doses were measured, and the potential for radiation exposure to off-site population groups was estimated. Results are interpreted in terms of the origin of the radioactive and chemical substances (i.e., natural, fallout, Argonne, and other) and are compared with applicable environmental quality standards. A U.S. Department of Energy dose calculation methodology, based on International Commission on Radiological Protection recommendations and the U.S. Environmental Protection Agency's CAP-88 Version 3 (Clean Air Act Assessment Package-1988) computer code, was used in preparing this report.

  8. Argonne National Laboratory Site Environmental report for calendar year 2009.

    Energy Technology Data Exchange (ETDEWEB)

    Golchert, N. W.; Davis, T. M.; Moos, L. P.

    2010-08-04

    This report discusses the status and the accomplishments of the environmental protection program at Argonne National Laboratory for calendar year 2009. The status of Argonne environmental protection activities with respect to compliance with the various laws and regulations is discussed, along with the progress of environmental corrective actions and restoration projects. To evaluate the effects of Argonne operations on the environment, samples of environmental media collected on the site, at the site boundary, and off the Argonne site were analyzed and compared with applicable guidelines and standards. A variety of radionuclides were measured in air, surface water, on-site groundwater, and bottom sediment samples. In addition, chemical constituents in surface water, groundwater, and Argonne effluent water were analyzed. External penetrating radiation doses were measured, and the potential for radiation exposure to off-site population groups was estimated. Results are interpreted in terms of the origin of the radioactive and chemical substances (i.e., natural, Argonne, and other) and are compared with applicable environmental quality standards. A U.S. Department of Energy (DOE) dose calculation methodology, based on International Commission on Radiological Protection recommendations and the U.S. Environmental Protection Agency's (EPA) CAP-88 Version 3 (Clean Air Act Assessment Package-1988) computer code, was used in preparing this report.

  9. Argonne National Laboratory site environmental report for calendar year 2006.

    Energy Technology Data Exchange (ETDEWEB)

    Golchert, N. W.; ESH/QA Oversight

    2007-09-13

    This report discusses the status and the accomplishments of the environmental protection program at Argonne National Laboratory for calendar year 2006. The status of Argonne environmental protection activities with respect to compliance with the various laws and regulations is discussed, along with the progress of environmental corrective actions and restoration projects. To evaluate the effects of Argonne operations on the environment, samples of environmental media collected on the site, at the site boundary, and off the Argonne site were analyzed and compared with applicable guidelines and standards. A variety of radionuclides were measured in air, surface water, on-site groundwater, and bottom sediment samples. In addition, chemical constituents in surface water, groundwater, and Argonne effluent water were analyzed. External penetrating radiation doses were measured, and the potential for radiation exposure to off-site population groups was estimated. Results are interpreted in terms of the origin of the radioactive and chemical substances (i.e., natural, fallout, Argonne, and other) and are compared with applicable environmental quality standards. A U.S. Department of Energy dose calculation methodology, based on International Commission on Radiological Protection recommendations and the U.S. Environmental Protection Agency's CAP-88 Version 3 (Clean Air Act Assessment Package-1988) computer code, was used in preparing this report.

  10. Argonne National Laboratory site environmental report for calendar year 2007.

    Energy Technology Data Exchange (ETDEWEB)

    Golchert, N. W.; Davis, T. M.; Moos, L. P.; ESH/QA Oversight

    2008-09-09

    This report discusses the status and the accomplishments of the environmental protection program at Argonne National Laboratory for calendar year 2007. The status of Argonne environmental protection activities with respect to compliance with the various laws and regulations is discussed, along with the progress of environmental corrective actions and restoration projects. To evaluate the effects of Argonne operations on the environment, samples of environmental media collected on the site, at the site boundary, and off the Argonne site were analyzed and compared with applicable guidelines and standards. A variety of radionuclides were measured in air, surface water, on-site groundwater, and bottom sediment samples. In addition, chemical constituents in surface water, groundwater, and Argonne effluent water were analyzed. External penetrating radiation doses were measured, and the potential for radiation exposure to off-site population groups was estimated. Results are interpreted in terms of the origin of the radioactive and chemical substances (i.e., natural, fallout, Argonne, and other) and are compared with applicable environmental quality standards. A U.S. Department of Energy dose calculation methodology, based on International Commission on Radiological Protection recommendations and the U.S. Environmental Protection Agency's CAP-88 Version 3 (Clean Air Act Assessment Package-1988) computer code, was used in preparing this report.

  11. 1986 annual site environmental report for Argonne National Laboratory

    International Nuclear Information System (INIS)

    The results of the environmental monitoring program at Argonne National Laboratory (ANL) for 1986 are presented and discussed. To evaluate the effect of Argonne operations on the environment, measurements were made for a variety of radionuclides in air, surface water, ground water, soil, grass, bottom sediment, and milk; of the environmental penetrating radiation dose; and for a variety of chemical constituents in surface water, ground water, and Argonne effluent water. Sample collections and measurements were made on the site, at the site boundary, and off the Argonne site for comparison purposes. The results of the program are interpreted in terms of the sources and origin of the radioactive and chemical substances (natural, fallout, Argonne, and other) and are compared with applicable environmental quality standards. A US Department of Energy (DOE) dose calculation methodology based on recent International Commission on Radiological Protection (ICRP) recommendations is required and used in this report. The radiation dose to off-site population groups is estimated. The average concentrations and total amounts of radioactive and chemical pollutants released by Argonne to the environment were all below appropriate standards. 21 refs., 7 figs., 52 tabs

  12. Variable Density Flow Modeling for Simulation Framework for Regional Geologic CO{sub 2} Storage Along Arches Province of Midwestern United States

    Energy Technology Data Exchange (ETDEWEB)

    Joel Sminchak

    2011-09-30

    The Arches Province in the Midwestern U.S. has been identified as a major area for carbon dioxide (CO{sub 2}) storage applications because favorable Mt. Simon sandstone reservoir thickness and permeability coincide there. To better understand large-scale CO{sub 2} storage infrastructure requirements in the Arches Province, variable density scoping level modeling was completed. Three main tasks were completed for the variable density modeling: single-phase, variable density groundwater flow modeling; scoping-level multi-phase simulations; and preliminary basin-scale multi-phase simulations. The variable density modeling task was successful in evaluating appropriate input data for the Arches Province numerical simulations. Data from the geocellular model developed earlier in the project were translated into preliminary numerical models. These models were calibrated to observed conditions in the Mt. Simon, suggesting a suitable geologic depiction of the system. The initial models were used to assess boundary conditions, calibrate to reservoir conditions, examine grid dimensions, evaluate upscaling items, and develop regional storage field scenarios. The task also provided practical information on items related to CO{sub 2} storage applications in the Arches Province such as pressure buildup estimates, well spacing limitations, and injection field arrangements. The Arches Simulation project is a three-year effort and part of the United States Department of Energy (U.S. DOE)/National Energy Technology Laboratory (NETL) program on innovative and advanced technologies and protocols for monitoring/verification/accounting (MVA), simulation, and risk assessment of CO{sub 2} sequestration in geologic formations. The overall objective of the project is to develop a simulation framework for regional geologic CO{sub 2} storage infrastructure along the Arches Province of the Midwestern U.S.

  13. Helios: a Multi-Purpose LIDAR Simulation Framework for Research, Planning and Training of Laser Scanning Operations with Airborne, Ground-Based Mobile and Stationary Platforms

    Science.gov (United States)

    Bechtold, S.; Höfle, B.

    2016-06-01

    In many technical domains of modern society, there is a growing demand for fast, precise and automatic acquisition of digital 3D models of a wide variety of physical objects and environments. Laser scanning is a popular and widely used technology to cover this demand, but it is also expensive and complex to use to its full potential. However, there might exist scenarios where the operation of a real laser scanner could be replaced by a computer simulation, in order to save time and costs. This includes scenarios like teaching and training of laser scanning, development of new scanner hardware and scanning methods, or generation of artificial scan data sets to support the development of point cloud processing and analysis algorithms. To test the feasibility of this idea, we have developed a highly flexible laser scanning simulation framework named Heidelberg LiDAR Operations Simulator (HELIOS). HELIOS is implemented as a Java library and split up into a core component and multiple extension modules. Extensible Markup Language (XML) is used to define scanner, platform and scene models and to configure the behaviour of modules. Modules were developed and implemented for (1) loading of simulation assets and configuration (i.e. 3D scene models, scanner definitions, survey descriptions etc.), (2) playback of XML survey descriptions, (3) TLS survey planning (i.e. automatic computation of recommended scanning positions) and (4) interactive real-time 3D visualization of simulated surveys. As a proof of concept, we show the results of two experiments: First, a survey planning test in a scene that was specifically created to evaluate the quality of the survey planning algorithm. Second, a simulated TLS scan of a crop field in a precision farming scenario. The results show that HELIOS fulfills its design goals.

  14. Computational Analysis and Simulation of Empathic Behaviors: a Survey of Empathy Modeling with Behavioral Signal Processing Framework.

    Science.gov (United States)

    Xiao, Bo; Imel, Zac E; Georgiou, Panayiotis; Atkins, David C; Narayanan, Shrikanth S

    2016-05-01

    Empathy is an important psychological process that facilitates human communication and interaction. Enhancement of empathy has profound significance in a range of applications. In this paper, we review emerging directions of research on computational analysis of empathy expression and perception as well as empathic interactions, including their simulation. We summarize the work on empathic expression analysis by the targeted signal modalities (e.g., text, audio, and facial expressions). We categorize empathy simulation studies into theory-based emotion space modeling or application-driven user and context modeling. We summarize challenges in the computational study of empathy, including the conceptual framing and understanding of empathy, data availability, the appropriate use and validation of machine learning techniques, and behavioral signal processing. Finally, we propose a unified view of empathy computation and offer a series of open problems for future research. PMID:27017830

  15. Argonne National Laboratory institutional plan FY 2001--FY 2006.

    Energy Technology Data Exchange (ETDEWEB)

    Beggs, S.D.

    2000-12-07

    This Institutional Plan describes what Argonne management regards as the optimal future development of Laboratory activities. The document outlines the development of both research programs and support operations in the context of the nation's R and D priorities, the missions of the Department of Energy (DOE) and Argonne, and expected resource constraints. The Draft Institutional Plan is the product of many discussions between DOE and Argonne program managers, and it also reflects programmatic priorities developed during Argonne's summer strategic planning process. That process serves additionally to identify new areas of strategic value to DOE and Argonne, to which Laboratory Directed Research and Development funds may be applied. The Draft Plan is provided to the Department before Argonne's On-Site Review. Issuance of the final Institutional Plan in the fall, after further comment and discussion, marks the culmination of the Laboratory's annual planning cycle. Chapter II of this Institutional Plan describes Argonne's missions and roles within the DOE laboratory system, its underlying core competencies in science and technology, and six broad planning objectives whose achievement is considered critical to the future of the Laboratory. Chapter III presents the Laboratory's "Science and Technology Strategic Plan," which summarizes key features of the external environment, presents Argonne's vision, and describes how Argonne's strategic goals and objectives support DOE's four business lines. The balance of Chapter III comprises strategic plans for 23 areas of science and technology at Argonne, grouped according to the four DOE business lines. The Laboratory's 14 major initiatives, presented in Chapter IV, propose important advances in key areas of fundamental science and technology development. The "Operations and Infrastructure Strategic Plan" in Chapter V includes

  16. Numerical simulation of spray coalescence in an eulerian framework : direct quadrature method of moments and multi-fluid method

    OpenAIRE

    Fox, Rodney; Laurent, Frédérique; Massot, Marc

    2008-01-01

    The scope of the present study is Eulerian modeling and simulation of polydisperse liquid sprays undergoing droplet coalescence and evaporation. The fundamental mathematical description is the Williams spray equation governing the joint number density function f(v, u; x, t) of droplet volume and velocity. Eulerian multi-fluid models have already been rigorously derived from this equation in Laurent et al. (2004). The first key feature of the paper is the application of direct quadrature metho...

  17. Numerical simulation of spray coalescence in an eulerian framework : direct quadrature method of moments and multi-fluid method

    OpenAIRE

    Fox, Rodney O.; Laurent, Frédérique; Massot, Marc

    2007-01-01

    The scope of the present study is Eulerian modeling and simulation of polydisperse liquid sprays undergoing droplet coalescence and evaporation. The fundamental mathematical description is the Williams spray equation governing the joint number density function f(v, u; x, t) of droplet volume and velocity. Eulerian multi-fluid models have already been rigorously derived from this equation in Laurent et al. (2004). The first key feature of the paper is the application of direct quadrature metho...

  18. Exploring the impact of forcing error characteristics on physically based snow simulations within a global sensitivity analysis framework

    OpenAIRE

    M. S. Raleigh; Lundquist, J D; Clark, M. P.

    2015-01-01

    Physically based models provide insights into key hydrologic processes but are associated with uncertainties due to deficiencies in forcing data, model parameters, and model structure. Forcing uncertainty is enhanced in snow-affected catchments, where weather stations are scarce and prone to measurement errors, and meteorological variables exhibit high variability. Hence, there is limited understanding of how forcing error characteristics affect simulations of cold region hydrology ...

  19. L-Py: An L-System Simulation Framework for Modeling Plant Architecture Development Based on a Dynamic Language

    OpenAIRE

    Boudon, Frédéric; Pradal, Christophe; Cokelaer, Thomas; Prusinkiewicz, Przemyslaw; Godin, Christophe

    2012-01-01

    The study of plant development requires increasingly powerful modeling tools to help understand and simulate the growth and functioning of plants. In the last decade, the formalism of L-systems has emerged as a major paradigm for modeling plant development. Previous implementations of this formalism were made based on static languages, i.e., languages that require explicit definition of variable types before using them. These languages are often efficient but involve quite a lot of syntactic ...

  20. Electron-cloud simulation studies for the CERN-PS in the framework of the LHC Injectors Upgrade project

    CERN Document Server

    Rioja Fuentelsaz, Sergio

    The present study aims to provide a consistent picture of the electron cloud effect in the CERN Proton Synchrotron (PS) and to investigate possible future limitations due to the requirements foreseen by the LHC Injectors Upgrade (LIU) project. It consists of a complete simulation survey of the electron cloud build-up in the different beam pipe sections of the ring depending on several controllable beam parameters and vacuum chamber surface properties, covering present and future operation parameters. As the combined function magnets of the accelerator constitute almost 80% of the ring's length, the implementation in the PyECLOUD code of a new feature for simulating arbitrary external magnetic fields made it possible to perform this study. All the results of the simulations are given as a function of the vacuum chamber surface properties in order to deduce them, both locally and globally, when compared with experimental data. In a first step, we characterize locally the maximum possible number of ...

  1. Simulations

    CERN Document Server

    Ngada, N M

    2015-01-01

    The complexity and cost of building and running high-power electrical systems make the use of simulations unavoidable. The simulations available today provide great understanding about how systems really operate. This paper helps the reader to gain an insight into simulation in the field of power converters for particle accelerators. Starting with the definition and basic principles of simulation, two simulation types, as well as their leading tools, are presented: analog and numerical simulations. Some practical applications of each simulation type are also considered. The final conclusion then summarizes the main important items to keep in mind before opting for a simulation tool or before performing a simulation.

  2. SU-E-I-02: A Framework to Perform Batch Simulations of Computational Voxel Phantoms to Study Organ Doses in Computed Tomography Using a Commercial Monte Carlo Software Package

    International Nuclear Information System (INIS)

    Purpose: ImpactMC (CT Imaging, Erlangen, Germany) is a Monte Carlo (MC) software package that offers a GPU-enabled, user-definable and validated method for 3D dose distribution calculations for radiography and Computed Tomography (CT). ImpactMC, in and of itself, offers limited capabilities to perform batch simulations. The aim of this work was to develop a framework for the batch simulation of absorbed organ dose distributions from CT scans of computational voxel phantoms. Methods: The ICRP 110 adult Reference Male and Reference Female computational voxel phantoms were formatted into compatible input volumes for MC simulations. A Matlab (The MathWorks Inc., Natick, MA) script was written to loop through a user-defined set of simulation parameters and 1) generate input files required for the simulation, 2) start the MC simulation, 3) segment the absorbed dose for organs in the simulated dose volume and 4) transfer the organ doses to a database. A demonstration of the framework is made where the glandular breast dose to the adult Reference Female phantom, for a typical Chest CT examination, is investigated. Results: A batch of 48 contiguous simulations was performed with variations in the total collimation and spiral pitch. The demonstration of the framework showed that the glandular dose to the right and left breast will vary depending on the start angle of rotation, total collimation and spiral pitch. Conclusion: The developed framework provides a robust and efficient approach to performing a large number of user-defined MC simulations with computational voxel phantoms in CT (minimal user interaction). The resulting organ doses from each simulation can be accessed through a database, which greatly increases the ease of analyzing the resulting organ doses. The framework developed in this work provides a valuable resource when investigating different dose optimization strategies in CT.
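
    The batch-driver pattern described above (loop over parameters, write an input file, run the simulation, harvest organ doses into a database) can be sketched in a few lines. The following Python analogue is illustrative only: the config format, the commented-out `impactmc_cli` executable name and the dose values are placeholders, not the actual ImpactMC interface. The 2 collimations x 4 pitches x 6 start angles reproduce the 48-run batch size mentioned in the abstract.

```python
import itertools
import sqlite3

# Hypothetical Python analogue of the Matlab batch driver described above.
# The config format, executable name and dose values are placeholders,
# not the actual ImpactMC interface.
collimations = [19.2, 38.4]                  # total collimation (assumed values)
pitches = [0.6, 0.8, 1.0, 1.2]               # spiral pitch factors (assumed values)
start_angles = [0, 60, 120, 180, 240, 300]   # start angles of rotation (assumed)

db = sqlite3.connect("organ_doses.db")
db.execute("CREATE TABLE IF NOT EXISTS doses "
           "(collimation REAL, pitch REAL, angle REAL, organ TEXT, dose REAL)")

for coll, pitch, angle in itertools.product(collimations, pitches, start_angles):
    # 1) generate the input file for this parameter combination
    with open("run.cfg", "w") as f:
        f.write(f"collimation={coll}\npitch={pitch}\nstart_angle={angle}\n")
    # 2) start the MC simulation; with the real tool this would be a call
    #    such as: subprocess.run(["impactmc_cli", "run.cfg"], check=True)
    # 3) segment organ doses from the simulated dose volume (stand-in values)
    organ_doses = {"breast_left": 0.0, "breast_right": 0.0}
    # 4) transfer the organ doses to the database
    for organ, dose in organ_doses.items():
        db.execute("INSERT INTO doses VALUES (?, ?, ?, ?, ?)",
                   (coll, pitch, angle, organ, dose))
db.commit()
```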

  3. Using E-Z Reader to simulate eye movements in nonreading tasks: a unified framework for understanding the eye-mind link.

    Science.gov (United States)

    Reichle, Erik D; Pollatsek, Alexander; Rayner, Keith

    2012-01-01

    Nonreading tasks that share some (but not all) of the task demands of reading have often been used to make inferences about how cognition influences when the eyes move during reading. In this article, we use variants of the E-Z Reader model of eye-movement control in reading to simulate eye-movement behavior in several of these tasks, including z-string reading, target-word search, and visual search of Landolt Cs arranged in both linear and circular arrays. These simulations demonstrate that a single computational framework is sufficient to simulate eye movements in both reading and nonreading tasks but also suggest that there are task-specific differences in both saccadic targeting (i.e., decisions about where to move the eyes) and the coupling between saccadic programming and the movement of attention (i.e., decisions about when to move the eyes). These findings suggest that some aspects of the eye-mind link are flexible and can be configured in a manner that supports efficient task performance. PMID:22229492

  4. The RD53 collaboration's SystemVerilog-UVM simulation framework and its general applicability to design of advanced pixel readout chips

    International Nuclear Information System (INIS)

    The foreseen Phase 2 pixel upgrades at the LHC have very challenging requirements for the design of hybrid pixel readout chips. A versatile pixel simulation platform is an essential development tool for the design, verification and optimization of both the system architecture and the pixel chip building blocks (Intellectual Properties, IPs). This work focuses on the simulation and verification environment named VEPIX53, built using the SystemVerilog language and the Universal Verification Methodology (UVM) class library in the framework of the RD53 Collaboration. The environment supports pixel chips at different levels of description: its reusable components feature the generation of different classes of parameterized input hits to the pixel matrix, monitoring of pixel chip inputs and outputs, conformity checks between predicted and actual outputs and collection of statistics on system performance. The environment has been tested by performing a study of shared architectures of the trigger latency buffering section of pixel chips. A fully shared architecture and a distributed one have been described at behavioral level and simulated; the resulting memory occupancy statistics and hit loss rates have subsequently been compared.

  5. Exploring the impact of forcing error characteristics on physically based snow simulations within a global sensitivity analysis framework

    Science.gov (United States)

    Raleigh, M. S.; Lundquist, J. D.; Clark, M. P.

    2015-07-01

    Physically based models provide insights into key hydrologic processes but are associated with uncertainties due to deficiencies in forcing data, model parameters, and model structure. Forcing uncertainty is enhanced in snow-affected catchments, where weather stations are scarce and prone to measurement errors, and meteorological variables exhibit high variability. Hence, there is limited understanding of how forcing error characteristics affect simulations of cold region hydrology and which error characteristics are most important. Here we employ global sensitivity analysis to explore how (1) different error types (i.e., bias, random errors), (2) different error probability distributions, and (3) different error magnitudes influence physically based simulations of four snow variables (snow water equivalent, ablation rates, snow disappearance, and sublimation). We use the Sobol' global sensitivity analysis, which is typically used for model parameters but adapted here for testing model sensitivity to coexisting errors in all forcings. We quantify the Utah Energy Balance model's sensitivity to forcing errors with 1 840 000 Monte Carlo simulations across four sites and five different scenarios. Model outputs were (1) consistently more sensitive to forcing biases than random errors, (2) generally less sensitive to forcing error distributions, and (3) critically sensitive to different forcings depending on the relative magnitude of errors. For typical error magnitudes found in areas with drifting snow, precipitation bias was the most important factor for snow water equivalent, ablation rates, and snow disappearance timing, but other forcings had a more dominant impact when precipitation uncertainty was due solely to gauge undercatch. Additionally, the relative importance of forcing errors depended on the model output of interest. Sensitivity analysis can reveal which forcing error characteristics matter most for hydrologic modeling.
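
    As a minimal illustration of the Sobol' approach adapted here, the sketch below uses the SALib package to compute first- and total-order indices for a toy model standing in for the snow model; the forcing-error bounds and the toy response are invented for the example and are not the Utah Energy Balance configuration.

```python
import numpy as np
from SALib.sample import saltelli   # classic Saltelli sampler from SALib
from SALib.analyze import sobol

# Toy stand-in for the snow model: bias terms in precipitation, air
# temperature and shortwave radiation perturb an output that loosely
# mimics snow water equivalent. Bounds are illustrative only.
problem = {
    "num_vars": 3,
    "names": ["precip_bias", "temp_bias", "sw_bias"],
    "bounds": [[-0.5, 0.5], [-3.0, 3.0], [-50.0, 50.0]],
}

X = saltelli.sample(problem, 1024)           # N*(2D+2) parameter samples

def toy_swe(x):
    p, t, s = x
    return 100.0 * (1.0 + p) - 5.0 * max(t, 0.0) - 0.1 * s

Y = np.array([toy_swe(row) for row in X])
Si = sobol.analyze(problem, Y)
print(dict(zip(problem["names"], Si["S1"])))   # first-order indices
print(dict(zip(problem["names"], Si["ST"])))   # total-order indices
```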

  6. I. Dissociation free energies of drug-receptor systems via non-equilibrium alchemical simulations: a theoretical framework.

    Science.gov (United States)

    Procacci, Piero

    2016-06-01

    In this contribution I critically revise the alchemical reversible approach in the context of the statistical mechanics theory of non-covalent bonding in drug-receptor systems. I show that most of the pitfalls and entanglements for the binding free energy evaluation in computer simulations are rooted in the equilibrium assumption that is implicit in the reversible method. These critical issues can be resolved by using a non-equilibrium variant of the alchemical method in molecular dynamics simulations, relying on the production of many independent trajectories with a continuous dynamical evolution of an externally driven alchemical coordinate, completing the decoupling of the ligand in a matter of a few tens of picoseconds rather than nanoseconds. The absolute binding free energy can be recovered from the annihilation work distributions by applying an unbiased unidirectional free energy estimate, on the assumption that any observed work distribution is given by a mixture of normal distributions, whose components are identical in either direction of the non-equilibrium process, with weights regulated by the Crooks theorem. I finally show that the inherent reliability and accuracy of the unidirectional estimate of the decoupling free energies, based on the production of a few hundred independent, sub-nanosecond, unrestrained, non-equilibrium alchemical annihilation processes, is a direct consequence of the funnel-like shape of the free energy surface in molecular recognition. An application of the technique to a real drug-receptor system is presented in the companion paper. PMID:27193067
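
    For a single normal work distribution, the unidirectional estimate reduces to the second-order cumulant formula dG ≈ ⟨W⟩ − (β/2)Var(W). Below is a minimal sketch of that single-Gaussian limit with synthetic work values; it is not the full normal-mixture estimator described in the paper.

```python
import numpy as np

def gaussian_free_energy(work, beta):
    """Second-order cumulant (Gaussian) estimate of the free energy from
    a set of non-equilibrium annihilation work values:
        dG ~ <W> - (beta/2) * Var(W)
    This is the single-component limit of the normal-mixture estimator,
    not the full multi-component version discussed above."""
    w = np.asarray(work)
    return w.mean() - 0.5 * beta * w.var(ddof=1)

# Synthetic example: a few hundred independent work values (kcal/mol).
rng = np.random.default_rng(0)
beta = 1.0 / 0.596                  # 1/kT at ~300 K in kcal/mol
work = rng.normal(loc=12.0, scale=2.0, size=400)
print(f"dG estimate: {gaussian_free_energy(work, beta):.2f} kcal/mol")
# For an exactly Gaussian distribution, dG = 12 - beta*4/2, about 8.6.
```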

  7. Highly porous ionic rht metal-organic framework for H2 and CO2 storage and separation: A molecular simulation study

    KAUST Repository

    Babarao, Ravichandar

    2010-07-06

    The storage and separation of H2 and CO2 are investigated in a highly porous ionic rht metal-organic framework (rht-MOF) using molecular simulation. The rht-MOF possesses a cationic framework and charge-balancing extraframework NO3- ions. Three types of unique open cages exist in the framework: rhombicuboctahedral, tetrahedral, and cuboctahedral cages. The NO3- ions exhibit small mobility and are located at the windows connecting the tetrahedral and cuboctahedral cages. At low pressures, H2 adsorption occurs near the NO3- ions that act as preferential sites. With increasing pressure, H2 molecules occupy the tetrahedral and cuboctahedral cages and the intersection regions. The predicted isotherm of H2 at 77 K agrees well with the experimental data. The H2 capacity is estimated to be 2.4 wt % at 1 bar and 6.2 wt % at 50 bar, among the highest in reported MOFs. In a four-component mixture (15:75:5:5 CO2/H2/CO/CH4) representing a typical effluent gas of H2 production, the selectivity of CO2/H2 in rht-MOF decreases slightly with increasing pressure, then increases because of cooperative interactions, and finally decreases as a consequence of an entropy effect. By comparing three ionic MOFs (rht-MOF, soc-MOF, and rho-ZMOF), we find that the selectivity increases with increasing charge density or decreasing free volume. In the presence of a trace amount of H2O, the interactions between CO2 and NO3- ions are significantly shielded by H2O; consequently, the selectivity of CO2/H2 decreases substantially. © 2010 American Chemical Society.
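
    The mixture selectivity quoted above is conventionally defined as the ratio of adsorbed-phase to bulk-phase mole ratios, S(1/2) = (x1/x2)/(y1/y2). A minimal sketch using the 15:75 CO2/H2 bulk composition from the abstract; the adsorbed loadings are illustrative placeholders, not results from the paper.

```python
def selectivity(x1, x2, y1, y2):
    """Adsorption selectivity of component 1 over component 2:
    (adsorbed-phase mole ratio) / (bulk-phase mole ratio)."""
    return (x1 / x2) / (y1 / y2)

# Bulk composition from the four-component mixture above: 15% CO2, 75% H2.
# The adsorbed loadings (mmol/g) are invented for the example.
n_co2, n_h2 = 4.5, 0.3
print(selectivity(n_co2, n_h2, 0.15, 0.75))   # -> 75.0 for these numbers
```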

  8. Research on Framework of Virtual Simulation System of Nuclear Facilities Decommissioning

    Institute of Scientific and Technical Information of China (English)

    刘中坤; 彭敏俊; 朱海山; 成守宇; 巩诚

    2011-01-01

    Since nuclear facilities are highly radioactive, dismantling them is hazardous and involves complex procedures. A virtual simulation system for nuclear facility decommissioning based on virtual reality technology can provide an aiding tool for decommissioning projects. This paper analyzes domestic and international research results in related fields together with the requirements of one specific decommissioning project, and proposes the functional modules of the virtual simulation system. Furthermore, a systematic framework based on software development techniques is developed, and the corresponding technologies for each module are analyzed and discussed. Drawing on the latest advances in computer hardware and software, the simulation of cutting, blasting, fluids and radioactivity during decommissioning is discussed. The analysis shows that a comprehensive virtual simulation of the decommissioning process can be achieved even on a common personal computer, and expert review found the framework feasible.

  9. Coarse-grained kinetic scheme-based simulation framework for solution growth of ZnO nanowires

    Energy Technology Data Exchange (ETDEWEB)

    Alvi, Farah, E-mail: falvi@mail.usf.edu [University of South Florida, Department of Electrical Engineering (United States); Joshi, Rakesh K. [University of South Florida, Department of Mechanical Engineering (United States); Huang, Qiang [University of Southern California, Daniel J. Epstein Department of Industrial and Systems Engineering (United States); Kumar, Ashok [University of South Florida, Department of Mechanical Engineering (United States)

    2011-06-15

    A kinetic Monte Carlo (KMC)-based stochastic model is used to understand the growth of zinc oxide nanowires from an aqueous solution containing chemical precursors and a capping agent. Through a hydrothermal growth mechanism, the average diameter of the zinc oxide wires obtained is around 300 nm, whereas the length is on the order of several micrometers. Our Monte Carlo algorithm is based on the continuous-time Monte Carlo methodology of Bortz, Kalos and Lebowitz (BKL). Both reaction and diffusion mechanisms were simulated by assigning stochastic probabilities. In the algorithm, the ZnO atoms were treated as individual particles that diffuse in the solution and interact with other types of atoms. Once attached to a growing nanowire, the diffusion rate of a ZnO atom is considerably reduced. Since each atom is represented individually in a KMC algorithm, internal noise is automatically incorporated.
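
    The continuous-time BKL scheme at the core of such a simulation selects each event with probability proportional to its rate and advances time by an exponentially distributed increment. A minimal sketch with an invented event table follows; the rates are placeholders, not the paper's parameters.

```python
import math
import random

def bkl_step(rates, rng=random):
    """One continuous-time (BKL) kinetic Monte Carlo step.
    `rates` maps event labels to rate constants. Returns the chosen
    event and the stochastic time increment."""
    total = sum(rates.values())
    r = rng.random() * total
    acc = 0.0
    for event, rate in rates.items():        # select event with prob. rate/total
        acc += rate
        if r < acc:
            break
    dt = -math.log(rng.random()) / total     # exponential waiting time
    return event, dt

# Illustrative event table for solution growth (rates are placeholders):
rates = {"diffuse_free": 50.0, "attach_to_wire": 5.0, "detach": 0.5}
t = 0.0
for _ in range(5):
    event, dt = bkl_step(rates)
    t += dt
    print(f"t = {t:.4f}  event = {event}")
```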

  10. Numerical simulation of spray coalescence in an Eulerian framework: Direct quadrature method of moments and multi-fluid method

    Science.gov (United States)

    Fox, R. O.; Laurent, F.; Massot, M.

    2008-03-01

    The scope of the present study is Eulerian modeling and simulation of polydisperse liquid sprays undergoing droplet coalescence and evaporation. The fundamental mathematical description is the Williams spray equation governing the joint number density function f(v,u;x,t) of droplet volume and velocity. Eulerian multi-fluid models have already been rigorously derived from this equation in Laurent et al. [F. Laurent, M. Massot, P. Villedieu, Eulerian multi-fluid modeling for the numerical simulation of coalescence in polydisperse dense liquid sprays, J. Comput. Phys. 194 (2004) 505-543]. The first key feature of the paper is the application of direct quadrature method of moments (DQMOM) introduced by Marchisio and Fox [D.L. Marchisio, R.O. Fox, Solution of population balance equations using the direct quadrature method of moments, J. Aerosol Sci. 36 (2005) 43-73] to the Williams spray equation. Both the multi-fluid method and DQMOM yield systems of Eulerian conservation equations with complicated interaction terms representing coalescence. In order to focus on the difficulties associated with treating size-dependent coalescence and to avoid numerical uncertainty issues associated with two-way coupling, only one-way coupling between the droplets and a given gas velocity field is considered. In order to validate and compare these approaches, the chosen configuration is a self-similar 2D axisymmetrical decelerating nozzle with sprays having various size distributions, ranging from smooth ones up to Dirac delta functions. The second key feature of the paper is a thorough comparison of the two approaches for various test-cases to a reference solution obtained through a classical stochastic Lagrangian solver. Both Eulerian models prove to describe adequately spray coalescence and yield a very interesting alternative to the Lagrangian solver. The third key point of the study is a detailed description of the limitations associated with each method, thus giving criteria for
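
    The quadrature closure underlying both QMOM and DQMOM represents the number density function by a few weighted abscissas, so that any moment is evaluated as m_k = sum_i w_i * v_i^k. The sketch below shows this closure in its simplest form, two equal-weight nodes reproducing the first three moments exactly; it is a didactic reduction, not the paper's coalescing-spray solver.

```python
import numpy as np

# Two equal-weight nodes placed at mu +/- sigma reproduce the first three
# moments (m0, m1, m2) of any distribution with that mean and variance.
mu, sigma = 2.0, 0.5                      # illustrative droplet-volume stats
weights = np.array([0.5, 0.5])
abscissas = np.array([mu - sigma, mu + sigma])

def moment(k):
    """k-th moment of the quadrature representation: m_k = sum w_i v_i^k."""
    return np.sum(weights * abscissas**k)

print(moment(0), moment(1), moment(2))    # -> 1.0, 2.0, 4.25 = mu^2 + sigma^2
# In DQMOM, transport equations are solved directly for the weights and
# abscissas, with source terms for coalescence and evaporation closing m_k.
```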

  11. Numerical simulation of spray coalescence in an Eulerian framework: Direct quadrature method of moments and multi-fluid method

    International Nuclear Information System (INIS)

    The scope of the present study is Eulerian modeling and simulation of polydisperse liquid sprays undergoing droplet coalescence and evaporation. The fundamental mathematical description is the Williams spray equation governing the joint number density function f(v,u;x,t) of droplet volume and velocity. Eulerian multi-fluid models have already been rigorously derived from this equation in Laurent et al. [F. Laurent, M. Massot, P. Villedieu, Eulerian multi-fluid modeling for the numerical simulation of coalescence in polydisperse dense liquid sprays, J. Comput. Phys. 194 (2004) 505-543]. The first key feature of the paper is the application of direct quadrature method of moments (DQMOM) introduced by Marchisio and Fox [D.L. Marchisio, R.O. Fox, Solution of population balance equations using the direct quadrature method of moments, J. Aerosol Sci. 36 (2005) 43-73] to the Williams spray equation. Both the multi-fluid method and DQMOM yield systems of Eulerian conservation equations with complicated interaction terms representing coalescence. In order to focus on the difficulties associated with treating size-dependent coalescence and to avoid numerical uncertainty issues associated with two-way coupling, only one-way coupling between the droplets and a given gas velocity field is considered. In order to validate and compare these approaches, the chosen configuration is a self-similar 2D axisymmetrical decelerating nozzle with sprays having various size distributions, ranging from smooth ones up to Dirac delta functions. The second key feature of the paper is a thorough comparison of the two approaches for various test-cases to a reference solution obtained through a classical stochastic Lagrangian solver. Both Eulerian models prove to describe adequately spray coalescence and yield a very interesting alternative to the Lagrangian solver. The third key point of the study is a detailed description of the limitations associated with each method, thus giving criteria for

  12. Estimation of floods with long return period using continuous simulation within the framework of the limits of acceptability approach

    Science.gov (United States)

    Beven, K.; Blazkova, S.

    2009-04-01

    The estimation of flood frequency by continuous simulation provides an alternative method to direct statistical estimation for catchments where there are limited historical records of flood peaks. We present the extended GLUE multiple limits-of-acceptability calibration strategy, in which models are treated as hypotheses about system response, to be rejected if the predictions fall outside of the limits of acceptability. Flood frequency predictions on the Skalka catchment in the Czech Republic (672 km², range of altitudes from 460 to 1041 m a.s.l.) are compared against summary information of rainfall characteristics, the flow duration curve, and the frequency characteristics of flood discharges and snow water equivalent. Limits of acceptability were defined prior to running the Monte Carlo model realisations. Since only 39 behavioural models were identified, we relaxed the limits of acceptability using a procedure that scores deviations relative to the limits, identifying the minimum extension across all criteria (114 criteria in total) needed to obtain a sample of 4192 parameter sets that were accepted as potentially useful in prediction. Long-term simulations of 10,000 years for the retained models were used to obtain uncertainty estimates of the 1000-year peak required for the assessment of dam safety at the catchment outlet. We also demonstrate the effect of different input realisations on acceptability. Taking just one of the behavioural parameter sets and generating 10,000 input sequences of the same length as the observed flood series results in a range of critical values for acceptability across a range of evaluation criteria.
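
    The limits-of-acceptability test itself is simple to express: a parameter set is behavioural only if every prediction lies within its limits, and relaxation can be driven by a deviation score normalized by the width of each acceptable range. Below is a minimal sketch with synthetic limits and predictions; the scoring rule is a plausible form for illustration, not necessarily the exact one used in the study.

```python
import numpy as np

def acceptability_scores(pred, lower, upper):
    """Score each prediction against its limits of acceptability:
    0 inside the limits, otherwise the deviation scaled by the half-width
    of the acceptable range (so scores are comparable across criteria)."""
    half_width = (upper - lower) / 2.0
    below = np.maximum(lower - pred, 0.0)
    above = np.maximum(pred - upper, 0.0)
    return (below + above) / half_width

# One model realization evaluated against all 114 criteria (flow duration,
# flood frequency, snow water equivalent, ...). Values are synthetic.
rng = np.random.default_rng(1)
lower = rng.uniform(0.5, 1.0, 114)
upper = lower + rng.uniform(0.5, 1.0, 114)
pred = lower + rng.uniform(-0.2, 1.4, 114)

scores = acceptability_scores(pred, lower, upper)
strictly_behavioural = np.all(scores == 0.0)     # original rejection criterion
behavioural_relaxed = np.max(scores) <= 0.2      # limits relaxed by 20%
print(strictly_behavioural, behavioural_relaxed, scores.max().round(3))
```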

  13. A framework for incorporating DTI Atlas Builder registration into tract-based spatial statistics and a simulated comparison to standard TBSS

    Science.gov (United States)

    Leming, Matthew; Steiner, Rachel; Styner, Martin

    2016-03-01

    Tract-based spatial statistics (TBSS) [6] is a software pipeline widely employed in comparative analysis of the white matter integrity from diffusion tensor imaging (DTI) datasets. In this study, we seek to evaluate the relationship between different methods of atlas registration for use with TBSS and different measurements of DTI (fractional anisotropy, FA; axial diffusivity, AD; radial diffusivity, RD; and mean diffusivity, MD). To do so, we have developed a novel tool that builds on existing diffusion atlas building software, integrating it into an adapted version of TBSS called DAB-TBSS (DTI Atlas Builder-Tract-Based Spatial Statistics) by using the advanced registration offered in DTI Atlas Builder [7]. To compare the effectiveness of these two versions of TBSS, we also propose a framework for simulating population differences for diffusion tensor imaging data, providing a more substantive means of empirically comparing DTI group analysis programs such as TBSS. In this study, we used 33 diffusion tensor imaging datasets and simulated group-wise changes in this data by increasing, in three different simulations, the principal eigenvalue (directly altering AD), the second and third eigenvalues (RD), and all three eigenvalues (MD) in the genu, the right uncinate fasciculus, and the left IFO. Additionally, we assessed the benefits of comparing the tensors directly using a functional analysis of diffusion tensor tract statistics (FADTTS) [10]. Our results indicate comparable levels of FA-based detection between DAB-TBSS and TBSS, with standard TBSS registration reporting a higher rate of false positives in other measurements of DTI. Within the simulated changes investigated here, this study suggests that the use of DTI Atlas Builder's registration enhances TBSS group-based studies.
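
    The simulated group differences above act directly on the tensor eigenvalues, from which the four DTI scalars follow by standard formulas. A minimal sketch of how a 10% increase in the principal eigenvalue (the AD scenario) propagates to FA, AD, RD and MD; the eigenvalues are typical white-matter values, not data from the study.

```python
import numpy as np

def dti_scalars(l1, l2, l3):
    """Standard DTI scalars from the tensor eigenvalues (l1 >= l2 >= l3)."""
    ad = l1                               # axial diffusivity
    rd = (l2 + l3) / 2.0                  # radial diffusivity
    md = (l1 + l2 + l3) / 3.0             # mean diffusivity
    num = (l1 - l2) ** 2 + (l2 - l3) ** 2 + (l3 - l1) ** 2
    den = l1 ** 2 + l2 ** 2 + l3 ** 2
    fa = np.sqrt(0.5 * num / den)         # fractional anisotropy
    return fa, ad, rd, md

# Typical white-matter eigenvalues (10^-3 mm^2/s), then a simulated group
# difference that scales the principal eigenvalue by 10% (the AD case).
baseline = (1.4, 0.35, 0.35)
altered = (1.4 * 1.10, 0.35, 0.35)
print(dti_scalars(*baseline))
print(dti_scalars(*altered))
```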

  14. Tiger team assessment of the Argonne Illinois site

    Energy Technology Data Exchange (ETDEWEB)

    1990-10-19

    This report documents the results of the Department of Energy's (DOE) Tiger Team Assessment of the Argonne Illinois Site (AIS) (including the DOE Chicago Operations Office, DOE Argonne Area Office, Argonne National Laboratory-East, and New Brunswick Laboratory) and Site A and Plot M, Argonne, Illinois, conducted from September 17 through October 19, 1990. The Tiger Team Assessment was conducted by a team composed of professionals from DOE, contractors, and consultants. The purpose of the assessment was to provide the Secretary of Energy with the status of Environment, Safety, and Health (ES&H) Programs at AIS. Argonne National Laboratory-East (ANL-E) is the principal tenant at AIS. ANL-E is a multiprogram laboratory operated by the University of Chicago for DOE. The mission of ANL-E is to perform basic and applied research that supports the development of energy-related technologies. There are a significant number of ES&H findings and concerns identified in the report that require prompt management attention. A significant change in culture is required before ANL-E can attain consistent and verifiable compliance with statutes, regulations and DOE Orders. ES&H activities are informal, fragmented, and inconsistently implemented. Communication is seriously lacking, both vertically and horizontally. Management expectations are not known or communicated adequately, support is not consistent, and oversight is not effective.

  15. Microscopic optical potential from Argonne inter-nucleon potentials

    International Nuclear Information System (INIS)

    In the present work we describe our results concerning the calculation of the equation of state of symmetric zero-temperature nuclear matter and the microscopic optical potential using the soft-core Argonne inter-nucleon potentials in first-order Brueckner–Hartree–Fock (BHF) theory. The nuclear matter saturates at a density of 0.228 nucleon/fm³ with 17.52 MeV binding energy per nucleon for Argonne av-14 and at 0.228 nucleon/fm³ with 17.01 MeV binding energy per nucleon for Argonne av-18. As a test case we present an analysis of 65 and 200 MeV proton scattering from 208Pb. The Argonne av-14 has been used for the first time to calculate the nucleon optical potential in BHF and analyze the nucleon scattering data. We also compare our reaction matrix results with those using the old hard-core Hamada–Johnston and the soft-core Urbana uv-14 and Argonne av-18 inter-nucleon potentials. Our results indicate that the microscopic potential obtained using av-14 gives marginally better agreement with the experimental data than the other three Hamiltonians used in the present work. (author)

  16. Tiger team assessment of the Argonne Illinois site

    International Nuclear Information System (INIS)

    This report documents the results of the Department of Energy's (DOE) Tiger Team Assessment of the Argonne Illinois Site (AIS) (including the DOE Chicago Operations Office, DOE Argonne Area Office, Argonne National Laboratory-East, and New Brunswick Laboratory) and Site A and Plot M, Argonne, Illinois, conducted from September 17 through October 19, 1990. The Tiger Team Assessment was conducted by a team composed of professionals from DOE, contractors, and consultants. The purpose of the assessment was to provide the Secretary of Energy with the status of Environment, Safety, and Health (ES&H) Programs at AIS. Argonne National Laboratory-East (ANL-E) is the principal tenant at AIS. ANL-E is a multiprogram laboratory operated by the University of Chicago for DOE. The mission of ANL-E is to perform basic and applied research that supports the development of energy-related technologies. There are a significant number of ES&H findings and concerns identified in the report that require prompt management attention. A significant change in culture is required before ANL-E can attain consistent and verifiable compliance with statutes, regulations and DOE Orders. ES&H activities are informal, fragmented, and inconsistently implemented. Communication is seriously lacking, both vertically and horizontally. Management expectations are not known or communicated adequately, support is not consistent, and oversight is not effective.

  17. Preparing for the SWOT mission by evaluating the simulations of river water levels within a regional-scale hydrometeorological modeling framework

    Science.gov (United States)

    Häfliger, Vincent; Martin, Eric; Boone, Aaron; Habets, Florence; David, Cédric H.; Garambois, Pierre-André; Roux, Hélène; Ricci, Sophie

    2014-05-01

    The upcoming Surface Water Ocean Topography (SWOT) mission will provide unprecedented observations of water elevation in rivers and lakes. The vertical accuracy of SWOT measurements is expected to be around 10 cm for rivers wider than 50-100 m. Over France, new observations will be available every 5 days. Such observations will allow new opportunities for validation of hydrological models and for data assimilation within these models. The objective of the proposed work is to evaluate the quality of simulated river water levels in the Garonne River Basin (55,000 km²) located in Southwestern France. The simulations are produced using a distributed regional-scale hydrometeorological modeling framework composed of a land surface model (ISBA), a hydrogeological model (MODCOU) and a river network model (RAPID). The modeling framework was initially calibrated over France, although this study focuses on the smaller Garonne Basin, and the proposed research emphasizes modifications made to RAPID. First, the existing RAPID parameters (i.e. temporally-constant but spatially-variable Muskingum parameters) were updated in the Garonne River Basin based on estimations made using a lagged cross-correlation method applied to observed hydrographs. Second, the model code was modified to allow for the use of a kinematic or a kinematic-diffusive wave equation for routing, both allowing for temporally and spatially variable wave celerities. This modification required prescribing the values of the hydraulic parameters of the river channel. Initial results show that the variable flow velocity scheme is advantageous for discharge computations when compared to the original Muskingum method in RAPID. Additionally, water level computations led to root mean square errors of 50-60 cm in the improved Muskingum method and 40-50 cm in the kinematic-diffusive wave method. Discharge computations were also shown to be comparable to those obtained with high-resolution models solving the
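
    For reference, the classical Muskingum update that RAPID generalizes computes the reach outflow from the travel time K and the weighting factor X as O2 = C0*I2 + C1*I1 + C2*O1. Below is a minimal single-reach sketch with invented numbers; the kinematic and kinematic-diffusive variants discussed above additionally let the celerity, and hence K, vary in space and time.

```python
def muskingum_route(inflow, K, X, dt, O0):
    """Classical Muskingum routing for one river reach.
    K: travel time [s], X: weighting factor [-], dt: time step [s]."""
    denom = K * (1.0 - X) + dt / 2.0
    c0 = (dt / 2.0 - K * X) / denom
    c1 = (dt / 2.0 + K * X) / denom
    c2 = (K * (1.0 - X) - dt / 2.0) / denom   # c0 + c1 + c2 = 1
    out, O = [O0], O0
    for I_prev, I_next in zip(inflow[:-1], inflow[1:]):
        O = c0 * I_next + c1 * I_prev + c2 * O
        out.append(O)
    return out

# Illustrative values only: a small flood wave through one reach.
inflow = [10, 30, 80, 60, 40, 25, 15, 10]     # m^3/s
print([round(q, 1) for q in muskingum_route(inflow, K=7200, X=0.2,
                                            dt=3600, O0=10)])
```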

  18. L-Py: an L-System simulation framework for modeling plant development based on a dynamic language

    Directory of Open Access Journals (Sweden)

    Frédéric Boudon

    2012-05-01

    The study of plant development requires increasingly powerful modeling tools to help understand and simulate the growth and functioning of plants. In the last decade, the formalism of L-systems has emerged as a major paradigm for modeling plant development. Previous implementations of this formalism were made based on static languages, i.e. languages that require explicit definition of variable types before using them. These languages are often efficient but involve quite a lot of syntactic overhead, thus restricting the flexibility of use for modelers. In this work, we present an adaptation of L-systems to the Python language, a popular and powerful open-license dynamic language. We show that the use of dynamic language properties makes it possible to enhance the development of plant growth models: (i) by keeping a simple syntax while allowing for high-level programming constructs, (ii) by making code execution easy and avoiding compilation overhead, (iii) by allowing a high level of model reusability and the building of complex modular models, and (iv) by providing powerful solutions to integrate MTG data-structures (that are a common way to represent plants at several scales) into L-systems, thus enabling the use of a wide spectrum of computer tools based on MTGs developed for plant architecture. We then illustrate the use of L-Py in real applications to build complex models or to teach plant modeling in the classroom.

  19. L-py: an L-system simulation framework for modeling plant architecture development based on a dynamic language.

    Science.gov (United States)

    Boudon, Frédéric; Pradal, Christophe; Cokelaer, Thomas; Prusinkiewicz, Przemyslaw; Godin, Christophe

    2012-01-01

    The study of plant development requires increasingly powerful modeling tools to help understand and simulate the growth and functioning of plants. In the last decade, the formalism of L-systems has emerged as a major paradigm for modeling plant development. Previous implementations of this formalism were made based on static languages, i.e., languages that require explicit definition of variable types before using them. These languages are often efficient but involve quite a lot of syntactic overhead, thus restricting the flexibility of use for modelers. In this work, we present an adaptation of L-systems to the Python language, a popular and powerful open-license dynamic language. We show that the use of dynamic language properties makes it possible to enhance the development of plant growth models: (i) by keeping a simple syntax while allowing for high-level programming constructs, (ii) by making code execution easy and avoiding compilation overhead, (iii) by allowing a high level of model reusability and the building of complex modular models, and (iv) by providing powerful solutions to integrate MTG data-structures (that are a common way to represent plants at several scales) into L-systems, thus enabling the use of a wide spectrum of computer tools based on MTGs developed for plant architecture. We then illustrate the use of L-Py in real applications to build complex models or to teach plant modeling in the classroom. PMID:22670147
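
    The parallel-rewriting core of any L-system is compact enough to sketch directly. The fragment below uses plain Python and Lindenmayer's classic two-symbol algae system; it illustrates the formalism only and is not L-Py's actual rule syntax.

```python
# Minimal parallel-rewriting core of an L-system in plain Python.
rules = {"A": "AB", "B": "A"}        # Lindenmayer's classic algae system

def derive(axiom, steps):
    """Apply the production rules to every symbol simultaneously,
    `steps` times, starting from the axiom."""
    s = axiom
    for _ in range(steps):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

for n in range(6):
    print(n, derive("A", n))
# String lengths follow the Fibonacci sequence: 1, 2, 3, 5, 8, 13, ...
```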

  20. A Fully-Integrated Framework for Terrestrial Water Cycle Simulation: Application to the San Joaquin Valley, California

    Science.gov (United States)

    Davison, Jason; Hwang, Hyoun-Tae; Sudicky, Edward; Lin, John

    2015-04-01

    Groundwater reserves are decreasing drastically under the increased stresses of agricultural, industrial, and residential use. Across the world, groundwater levels continue to decline due to the expansion of human activities and the decrease in groundwater recharge. Methods commonly used to project the future decline in subsurface water storage involve simulating precipitation patterns and applying them independently to hydrological models without feedback between the atmospheric and the groundwater/surface water systems. However, it is becoming increasingly evident that this traditional methodology, which ignores the critical feedbacks between groundwater, the land surface, and the atmosphere, is inappropriate at basin or larger scales. To improve upon conventional methods, we coupled HydroGeoSphere (HGS), a fully-integrated, physically-based, 3D surface/subsurface flow, solute and energy transport model that also accounts for land surface processes, to the Weather Research and Forecasting (WRF) model. WRF is a well-known nonhydrostatic finite-difference mesoscale weather model. Our flexible coupled model, referred to as HGS-WRF, directly links the water and energy fluxes between the surface/subsurface and the atmosphere, and allows HGS to maintain a finer unstructured mesh while WRF uses a coarser mesh over the entire domain. We applied HGS-WRF to the San Joaquin Valley in central California and expect improved skill in the simulated energy and moisture fluxes between the domains. Overall, the inclusion of atmospheric feedbacks in hydrologic models will increase their predictive capabilities and help better inform water managers.

  1. Mixed traffic flow model considering illegal lane-changing behavior: Simulations in the framework of Kerner’s three-phase theory

    Science.gov (United States)

    Hu, Xiaojian; Wang, Wei; Yang, Haifei

    2012-11-01

    This paper studies the mixed motorized vehicle (m-vehicle) and non-motorized vehicle (nm-vehicle) traffic flow in the m-vehicle lane. We study the formation mechanism of the nm-vehicle illegal lane-changing behavior (NILB) by considering the overtaking motivation and traffic safety awareness. In the framework of Kerner’s three-phase theory, we propose a model for the mixed traffic flow by introducing a new set of rules. A series of simulations is carried out in order to reveal the formation, travel process and influence of the mixed traffic flow. The simulation results show that the proposed model can be used to study not only the travel characteristics of the mixed traffic flow, but also complex traffic phenomena such as traffic breakdown, the moving synchronized flow pattern (MSP) and moving jams. Moreover, the results illustrate that the proposed model reproduces the phenomena of mixed flow and the influence of the MSP caused by the NILB, consistent with real traffic systems, and this work is thus helpful for the management of mixed traffic flow.
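
    For readers unfamiliar with this class of models, the sketch below shows one update step of a generic single-lane Nagel-Schreckenberg cellular automaton, the kind of rule set that three-phase mixed-flow models extend. It includes none of the paper's nm-vehicle or lane-changing rules, and all parameters are illustrative.

```python
import random

def ns_step(pos, vel, vmax, p_slow, road_len):
    """One parallel update of a single-lane Nagel-Schreckenberg cellular
    automaton on a ring road. A generic illustration only; the paper's
    mixed-flow and illegal lane-changing rules are not included."""
    n = len(pos)
    order = sorted(range(n), key=lambda i: pos[i])     # cars by position
    new_vel = vel[:]
    for idx, i in enumerate(order):
        gap = (pos[order[(idx + 1) % n]] - pos[i] - 1) % road_len
        v = min(vel[i] + 1, vmax)          # 1) acceleration
        v = min(v, gap)                    # 2) braking to avoid collision
        if v > 0 and random.random() < p_slow:
            v -= 1                         # 3) random slowdown
        new_vel[i] = v
    new_pos = [(pos[i] + new_vel[i]) % road_len for i in range(n)]  # 4) move
    return new_pos, new_vel

pos, vel = [0, 10, 20, 30], [0, 0, 0, 0]
for _ in range(3):
    pos, vel = ns_step(pos, vel, vmax=5, p_slow=0.3, road_len=100)
print(pos, vel)
```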

  2. Computational Science at the Argonne Leadership Computing Facility

    Science.gov (United States)

    Romero, Nichols

    2014-03-01

    The goal of the Argonne Leadership Computing Facility (ALCF) is to extend the frontiers of science by solving problems that require innovative approaches and the largest-scale computing systems. ALCF's most powerful computer - Mira, an IBM Blue Gene/Q system - has nearly one million cores. How does one program such systems? What software tools are available? Which scientific and engineering applications are able to utilize such levels of parallelism? This talk will address these questions and describe a sampling of projects that are using ALCF systems in their research, including ones in nanoscience, materials science, and chemistry. Finally, the ways to gain access to ALCF resources will be presented. This research used resources of the Argonne Leadership Computing Facility at Argonne National Laboratory, which is supported by the Office of Science of the U.S. Department of Energy under contract DE-AC02-06CH11357.

  3. Hydrometeorological multi-model ensemble simulations of the 4 November 2011 flash flood event in Genoa, Italy, in the framework of the DRIHM project

    Directory of Open Access Journals (Sweden)

    A. Hally

    2015-03-01

    The e-Science environment developed in the framework of the EU-funded DRIHM project was used to demonstrate its ability to provide relevant, meaningful hydrometeorological forecasts. This was illustrated for the tragic case of 4 November 2011, when Genoa, Italy, was flooded as the result of heavy, convective precipitation that inundated the Bisagno catchment. The Meteorological Model Bridge (MMB), an innovative software component developed within the DRIHM project for the interoperability of meteorological and hydrological models, is a key component of the DRIHM e-Science environment. The MMB allowed three different rainfall-discharge models (DRiFt, RIBS and HBV) to be driven by four mesoscale limited-area atmospheric models (WRF-NMM, WRF-ARW, Meso-NH and AROME) and a downscaling algorithm (RainFARM) in a seamless fashion. In addition to this multi-model configuration, some of the models were run in probabilistic mode, thus giving a comprehensive account of modelling errors and a very large number of likely hydrometeorological scenarios (>1500). The multi-model approach proved to be necessary because, whilst various aspects of the event were successfully simulated by different models, none of the models reproduced all of these aspects correctly. It was shown that the resulting set of simulations helped identify key atmospheric processes responsible for the large rainfall accumulations over the Bisagno basin. The DRIHM e-Science environment facilitated an evaluation of the sensitivity to atmospheric and hydrological modelling errors. This showed that both had a significant impact on predicted discharges, the former being larger than the latter. Finally, the usefulness of the set of hydrometeorological simulations was assessed from a flash flood early-warning perspective.

  4. Hydrometeorological multi-model ensemble simulations of the 4 November 2011 flash-flood event in Genoa, Italy, in the framework of the DRIHM project

    Directory of Open Access Journals (Sweden)

    A. Hally

    2014-11-01

    The e-Science environment developed in the framework of the EU-funded DRIHM project was used to demonstrate its capability to provide relevant, meaningful hydrometeorological forecasts. This was illustrated for the tragic case of 4 November 2011, when Genoa, Italy, was flooded as the result of heavy, convective precipitation that inundated the Bisagno catchment. The Meteorological Model Bridge (MMB), an innovative software component developed within the DRIHM project for the interoperability of meteorological and hydrological models, is a key component of the DRIHM e-Science environment. The MMB allowed three different rainfall-discharge models (DRiFt, RIBS, and HBV) to be driven by four mesoscale limited-area atmospheric models (WRF-NMM, WRF-ARW, Meso-NH, and AROME) and a downscaling algorithm (RainFARM) in a seamless fashion. In addition to this multi-model configuration, some of the models were run in probabilistic mode, thus allowing a comprehensive account of modelling errors and a very large number of likely hydrometeorological scenarios (>1500). The multi-model approach proved to be necessary because, whilst various aspects of the event were successfully simulated by different models, none of the models reproduced all of these aspects correctly. It was shown that the resulting set of simulations helped identify key atmospheric processes responsible for the large rainfall accumulations over the Bisagno basin. The DRIHM e-Science environment facilitated an evaluation of the sensitivity to atmospheric and hydrological modelling errors. This showed that both had a significant impact on predicted discharges, the former being larger than the latter. Finally, the usefulness of the set of hydrometeorological simulations was assessed from a flash-flood early-warning perspective.

  5. Remote dismantlement activities for the Argonne CP-5 Research Reactor

    International Nuclear Information System (INIS)

    The Department of Energy's (DOE's) Robotics Technology Development Program (RTDP) is participating in the dismantlement of a mothballed research reactor, Chicago Pile Number 5 (CP-5), at Argonne National Laboratory (ANL) to demonstrate technology developed by the program while assisting Argonne with their remote system needs. Equipment deployed for CP-5 activities includes the dual-arm work platform (DAWP), which will handle disassembly of reactor internals, and the RedZone Robotics-developed 'Rosie' remote work vehicle, which will perform size reduction of shield plugs, demolition of the biological shield, and waste packaging. Remote dismantlement tasks are scheduled to begin in February of 1997 and to continue through 1997 and beyond

  6. Performance model of the Argonne Voyager multimedia server

    Energy Technology Data Exchange (ETDEWEB)

    Disz, T.; Olson, R.; Stevens, R. [Argonne National Lab., IL (United States). Mathematics and Computer Science Div.

    1997-07-01

    The Argonne Voyager Multimedia Server is being developed in the Futures Lab of the Mathematics and Computer Science Division at Argonne National Laboratory. As a network-based service for recording and playing multimedia streams, it is important that the Voyager system be capable of sustaining certain minimal levels of performance in order for it to be a viable system. In this article, the authors examine the performance characteristics of the server. As they examine the architecture of the system, they try to determine where bottlenecks lie, show actual vs potential performance, and recommend areas for improvement through custom architectures and system tuning.

  7. Argonne's contribution to regional development : successful examples.

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Y. I.

    2000-11-14

    Argonne National Laboratory's mission is basic research and technology development to meet national goals in scientific leadership, energy technology, and environmental quality. In addition to its core missions as a national research and development center, Argonne has exerted a positive impact on its regional economic development, has carried out outstanding educational programs not only for college/graduate students but also for pre-college students and teachers, and has fostered partnerships with universities for research collaboration and with industry for shaping the new technological frontiers.

  8. Photographic as-builts for Argonne National Laboratory-West

    Energy Technology Data Exchange (ETDEWEB)

    Sherman, E.K.; Wiegand, C.V.

    1995-04-19

    Located 35 miles west of Idaho Falls, Idaho, Argonne National Laboratory-West operates a number of nuclear facilities for the Department of Energy (DOE) through the University of Chicago. Part of the present mission of Argonne National Laboratory-West includes shutdown of the EBR-II Reactor. In order to accomplish this task, the Engineering-Drafting Department is exploring cost-effective methods of providing as-built services. A new technique that integrates photographic images with AutoCAD drawing files is considered one of the methods that shows promise.

  9. A FULL GPU IMPLEMENTATION FRAMEWORK OF SPH FLUID REAL-TIME SIMULATION

    Institute of Scientific and Technical Information of China (English)

    郭秋雷; 唐逸之; 刘诗秋; 李桂清

    2011-01-01

    Simulating large-scale fluids in real time with high visual realism is an important research topic in computer graphics. Fluid simulation comprises several components, including physics computation, collision detection, surface reconstruction and rendering, and a large body of work has addressed GPU acceleration of the algorithms in each of these components. This paper proposes a complete GPU-based acceleration framework for SPH fluid simulation. Building on smoothed-particle hydrodynamics (SPH) to solve the Navier-Stokes equations, we greatly speed up particle collision detection with GPU-based parallel spatial subdivision (PSS). We also design a fluid surface reconstruction algorithm based on the geometry shader, together with a further index-based optimization that allows the surface reconstruction pass to skip regions containing no surface. Experimental results show that the method can simulate visually convincing fluid scenes in real time.
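
    The core data structure behind parallel spatial subdivision is a uniform grid (spatial hash) that restricts neighbor searches to a particle's own cell and the 26 adjacent cells. The CPU-side sketch below illustrates the idea; the paper's framework executes the equivalent steps in parallel on the GPU, and all names here are illustrative.

```python
# Illustrative uniform-grid neighbor search, the idea behind the
# parallel spatial subdivision (PSS) step of SPH collision detection.
from collections import defaultdict
from itertools import product

def build_grid(positions, h):
    """Bucket particles into cubic cells of side h (the smoothing radius)."""
    grid = defaultdict(list)
    for i, (x, y, z) in enumerate(positions):
        grid[(int(x // h), int(y // h), int(z // h))].append(i)
    return grid

def neighbors(i, positions, grid, h):
    """Neighbors of particle i, examining only the 27 surrounding cells."""
    x, y, z = positions[i]
    cx, cy, cz = int(x // h), int(y // h), int(z // h)
    found = []
    for dx, dy, dz in product((-1, 0, 1), repeat=3):
        for j in grid.get((cx + dx, cy + dy, cz + dz), ()):
            if j != i and sum((a - b) ** 2 for a, b in
                              zip(positions[i], positions[j])) < h * h:
                found.append(j)
    return found

pts = [(0.1, 0.2, 0.0), (0.3, 0.1, 0.1), (2.5, 2.5, 2.5)]
grid = build_grid(pts, h=0.5)
print(neighbors(0, pts, grid, h=0.5))  # -> [1]
```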

  10. Development of high intensity source of thermal positrons APosS (Argonne Positron Source)

    International Nuclear Information System (INIS)

    We present an update on the positron-facility development at Argonne National Laboratory. We will discuss the advantages of using a low-energy electron accelerator, present our latest results on slow positron production simulations, and outline plans for further development of the facility. We have installed a new converter/moderator assembly that is appropriate for our electron energy and that increases the yield by about an order of magnitude. We have simulated the relative yields of thermalized positrons as a function of incident positron energy on the moderator. We use these data to calculate positron yields, which we compare with our experimental data as well as with available literature data. We will discuss the new design of the next-generation positron front end, which utilizes a reflection moderator geometry. We will also discuss planned accelerator upgrades and their impact on APosS.

  11. The Argonne Leadership Computing Facility 2010 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Drugan, C. (LCF)

    2011-05-09

    Researchers found more ways than ever to conduct transformative science at the Argonne Leadership Computing Facility (ALCF) in 2010. Both familiar initiatives and innovative new programs at the ALCF are now serving a growing, global user community with a wide range of computing needs. The Department of Energy's (DOE) INCITE Program remained vital in providing scientists with major allocations of leadership-class computing resources at the ALCF. For calendar year 2011, 35 projects were awarded 732 million supercomputer processor-hours for computationally intensive, large-scale research projects with the potential to significantly advance key areas in science and engineering. Argonne also continued to provide Director's Discretionary allocations - 'start up' awards - for potential future INCITE projects. And DOE's new ASCR Leadership Computing (ALCC) Program allocated resources to 10 ALCF projects, with an emphasis on high-risk, high-payoff simulations directly related to the Department's energy mission, national emergencies, or for broadening the research community capable of using leadership computing resources. While delivering more science today, we've also been laying a solid foundation for high performance computing in the future. After a successful DOE Lehman review, a contract was signed to deliver Mira, the next-generation Blue Gene/Q system, to the ALCF in 2012. The ALCF is working with the 16 projects that were selected for the Early Science Program (ESP) to enable them to be productive as soon as Mira is operational. Preproduction access to Mira will enable ESP projects to adapt their codes to its architecture and collaborate with ALCF staff in shaking down the new system. We expect the 10-petaflops system to stoke economic growth and improve U.S. competitiveness in key areas such as advancing clean energy and addressing global climate change. Ultimately, we envision Mira as a stepping-stone to exascale-class computers

  12. The development, design, testing, refinement, simulation and application of an evaluation framework for communities of practice and social-professional networks

    Directory of Open Access Journals (Sweden)

    Ball Dianne

    2009-09-01

    Full Text Available Abstract Background Communities of practice and social-professional networks are generally considered to enhance workplace experience and enable organizational success. However, despite the remarkable growth in interest in the role of collaborating structures in a range of industries, there is a paucity of empirical research to support this view. Nor is there a convincing model for their systematic evaluation, despite the significant potential benefits in answering the core question: how well do groups of professionals work together and how could they be organised to work together more effectively? This research project will produce a rigorous evaluation methodology and deliver supporting tools for the benefit of researchers, policymakers, practitioners and consumers within the health system and other sectors. Given the prevalence and importance of communities of practice and social networks, and the extent of investments in them, this project represents a scientific innovation of national and international significance. Methods and design Working in four conceptual phases the project will employ a combination of qualitative and quantitative methods to develop, design, field-test, refine and finalise an evaluation framework. Once available the framework will be used to evaluate simulated, and then later existing, health care communities of practice and social-professional networks to assess their effectiveness in achieving desired outcomes. Peak stakeholder groups have agreed to involve a wide range of members and participant organisations, and will facilitate access to various policy, managerial and clinical networks. Discussion Given its scope and size, the project represents a valuable opportunity to achieve breakthroughs at two levels; firstly, by introducing novel and innovative aims and methods into the social research process and, secondly, through the resulting evaluation framework and tools. We anticipate valuable outcomes in the

  13. Verification Survey of the Building 315 Zero Power Reactor-6 Facility, Argonne National Laboratory-East, Argonne, Illinois

    Energy Technology Data Exchange (ETDEWEB)

    W. C. Adams

    2007-05-25

    Oak Ridge Institute for Science and Education (ORISE) conducted independent verification radiological survey activities at Argonne National Laboratory’s Building 315, Zero Power Reactor-6 facility in Argonne, Illinois. Independent verification survey activities included document and data reviews, alpha plus beta and gamma surface scans, alpha and beta surface activity measurements, and instrumentation comparisons. An interim letter report and a draft report, documenting the verification survey findings, were submitted to the DOE on November 8, 2006 and February 22, 2007, respectively (ORISE 2006b and 2007).

  14. An object-oriented modeling and simulation framework for bearings-only multi-target tracking using an unattended acoustic sensor network

    Science.gov (United States)

    Aslan, Murat Šamil

    2013-10-01

    Tracking ground targets using low-cost ground-based sensors is a challenging field because of the limited capabilities of such sensors. Among the several candidates, including seismic and magnetic sensors, acoustic sensors based on microphone arrays have the potential to be useful: they can provide a direction to the sound source, they can have a relatively better range, and the sound characteristics can provide a basis for target classification. However, there are still many problems. One of them is the difficulty of resolving multiple sound sources; another is that they do not provide distance; a third is the presence of background noise from wind, sea, rain, distant air and land traffic, people, etc.; and a fourth is that the same target can sound very different depending on factors like terrain type, topography, speed, gear, distance, etc. The use of sophisticated signal processing and data fusion algorithms is the key to compensating (to an extent) for the limited capabilities and aforementioned problems of these sensors. It is hard, if not impossible, to evaluate the performance of such complex algorithms analytically. For an effective evaluation, before performing expensive field trials, well-designed laboratory experiments and computer simulations are necessary. Along this line, in this paper, we present an object-oriented modeling and simulation framework which can be used to generate simulated data for the data fusion algorithms for tracking multiple on-road targets in an unattended acoustic sensor network. Each sensor node in the network is a circular microphone array which produces direction of arrival (DOA) (or bearing) measurements of the targets and sends this information to a fusion center. We present the models for road networks, targets (motion and acoustic power) and acoustic sensors in an object-oriented fashion where different and possibly time-varying sampling periods for each sensor node are possible. Moreover, the sensor's signal processing and
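
    Each node in such a network reports only a bearing (DOA) to a target, never a range, which is what makes the tracking problem "bearings-only". A minimal sketch of generating noisy DOA measurements from ground-truth sensor and target positions is shown below; it is a generic illustration, not the paper's actual sensor model.

```python
# Generic bearings-only (DOA) measurement model for an acoustic node:
# each node observes only the angle to the target, corrupted by noise.
import math
import random

def doa_measurement(sensor_xy, target_xy, sigma_rad=math.radians(2.0)):
    """Noisy direction of arrival from sensor to target, in radians."""
    dx = target_xy[0] - sensor_xy[0]
    dy = target_xy[1] - sensor_xy[1]
    return math.atan2(dy, dx) + random.gauss(0.0, sigma_rad)

# One time step: every node reports a bearing to the fusion center.
sensors = [(0.0, 0.0), (100.0, 0.0), (50.0, 80.0)]
target = (60.0, 40.0)
bearings = [doa_measurement(s, target) for s in sensors]
print([round(math.degrees(b), 1) for b in bearings])
```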

  15. FACET: an object-oriented software framework for modeling complex social behavior patterns

    Energy Technology Data Exchange (ETDEWEB)

    Dolph, J. E.; Christiansen, J. H.; Sydelko, P. J.

    2000-06-30

    The Framework for Addressing Cooperative Extended Transactions (FACET) is a flexible, object-oriented architecture for implementing models of dynamic behavior of multiple individuals, or agents, in a simulation. These agents can be human (individuals or organizations) or animal and may exhibit any type of organized social behavior that can be logically articulated. FACET was developed by Argonne National Laboratory's (ANL) Decision and Information Sciences Division (DIS) out of the need to integrate societal processes into natural system simulations. The FACET architecture includes generic software components that provide the agents with various mechanisms for interaction, such as step sequencing and logic, resource management, conflict resolution, and preemptive event handling. FACET components provide a rich environment within which patterns of behavior can be captured in a highly expressive manner. Interactions among agents in FACET are represented by Course of Action (COA) object-based models. Each COA contains a directed graph of individual actions, which represents any known pattern of social behavior. The agents' behavior in a FACET COA, in turn, influences the natural landscape objects in a simulation (i.e., vegetation, soil, and habitat) by updating their states. The modular design of the FACET architecture provides the flexibility to create multiple and varied simulation scenarios by changing social behavior patterns, without disrupting the natural process models. This paper describes the FACET architecture and presents several examples of FACET models that have been developed to assess the effects of anthropogenic influences on the dynamics of the natural environment.
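
    The abstract's Course of Action (COA) objects contain a directed graph of individual actions. A minimal sketch of that representation is given below; the class and method names are hypothetical stand-ins, not FACET's actual API.

```python
# Minimal sketch of a FACET-style Course of Action (COA): a directed
# graph of actions executed in dependency order against a shared state.
from graphlib import TopologicalSorter

class Action:
    def __init__(self, name, effect):
        self.name = name
        self.effect = effect  # callable that updates the landscape state

class CourseOfAction:
    def __init__(self):
        self.actions = {}
        self.preds = {}  # action name -> set of prerequisite action names

    def add(self, action, after=()):
        self.actions[action.name] = action
        self.preds[action.name] = set(after)

    def run(self, state):
        # Step the actions respecting the directed-graph ordering.
        for name in TopologicalSorter(self.preds).static_order():
            self.actions[name].effect(state)

state = {"vegetation": 1.0}
coa = CourseOfAction()
coa.add(Action("graze", lambda s: s.update(vegetation=s["vegetation"] * 0.9)))
coa.add(Action("move_on", lambda s: None), after=["graze"])
coa.run(state)
print(state)  # grazing has updated the landscape state
```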

  16. PHP frameworks

    OpenAIRE

    Srša, Aljaž

    2016-01-01

    The thesis presents one of the four most popular PHP web frameworks: Laravel, Symfony, CodeIgniter and CakePHP. These frameworks are compared with each other according to the four criteria, which can help with the selection of a framework. These criteria are size of the community, quality of official support, comprehensibility of framework’s documentation and implementation of functionalities in individual frameworks, which are automatic code generation, routing, object-relational mapping and...

  17. Nuclear fuel cycle programs of Argonne's Chemical Engineering Division

    International Nuclear Information System (INIS)

    Argonne National Laboratory's Chemical Engineering Division is actively involved in the research, development and demonstration of nuclear fuel cycle technologies for the United States Department of Energy Advanced Fuel Cycle Initiative, Generation IV, and Yucca Mountain programs. This paper summarizes current technology development initiatives within the Division that address the needs of the United States' advanced nuclear energy programs. (authors)

  18. Three Argonne technologies win R&D 100 awards

    CERN Multimedia

    2003-01-01

    "Three technologies developed or co-developed at the U.S. Department of Energy's Argonne National Laboratory have been recognized with R&D 100 Awards, which highlight some of the best products and technologies from around the world" (1 page).

  19. Brookhaven Lab and Argonne Lab scientists invent a plasma valve

    CERN Multimedia

    2003-01-01

    Scientists from Brookhaven National Laboratory and Argonne National Laboratory have received U.S. patent number 6,528,948 for a device that shuts off airflow into a vacuum about one million times faster than mechanical valves or shutters that are currently in use (1 page).

  20. Argonne Laboratory Computing Resource Center - FY2004 Report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R.

    2005-04-14

    In the spring of 2002, Argonne National Laboratory founded the Laboratory Computing Resource Center, and in April 2003 LCRC began full operations with Argonne's first teraflops computing cluster. The LCRC's driving mission is to enable and promote computational science and engineering across the Laboratory, primarily by operating computing facilities and supporting application use and development. This report describes the scientific activities, computing facilities, and usage in the first eighteen months of LCRC operation. In this short time LCRC has had broad impact on programs across the Laboratory. The LCRC computing facility, Jazz, is available to the entire Laboratory community. In addition, the LCRC staff provides training in high-performance computing and guidance on application usage, code porting, and algorithm development. All Argonne personnel and collaborators are encouraged to take advantage of this computing resource and to provide input into the vision and plans for computing and computational analysis at Argonne. Steering for LCRC comes from the Computational Science Advisory Committee, composed of computing experts from many Laboratory divisions. The CSAC Allocations Committee makes decisions on individual project allocations for Jazz.

  1. Argonne to open new facility for advanced vehicle testing

    CERN Multimedia

    2002-01-01

    Argonne National Laboratory will open its Advanced Powertrain Research Facility on Friday, Nov. 15. The facility is North America's only public testing facility for engines, fuel cells, electric drives and energy storage. State-of-the-art performance and emissions measurement equipment is available to support model development and technology validation (1 page).

  2. Employment impacts of EU biofuels policy. Combining bottom-up technology information and sectoral market simulations in an input-output framework

    International Nuclear Information System (INIS)

    This paper analyses the employment consequences of policies aimed to support biofuels in the European Union. The promotion of biofuel use has been advocated as a means to promote the sustainable use of natural resources and to reduce greenhouse gas emissions originating from transport activities on the one hand, and to reduce dependence on imported oil and thereby increase security of the European energy supply on the other hand. The employment impacts of increasing biofuels shares are calculated by taking into account a set of elements comprising the demand for capital goods required to produce biofuels, the additional demand for agricultural feedstock, higher fuel prices or reduced household budget in the case of price subsidisation, price effects ensuing from a hypothetical world oil price reduction linked to substitution in the EU market, and price impacts on agro-food commodities. The calculations refer to scenarios for the year 2020 targets as set out by the recent Renewable Energy Roadmap. Employment effects are assessed in an input-output framework taking into account bottom-up technology information to specify biofuels activities and linked to partial equilibrium models for the agricultural and energy sectors. The simulations suggest that biofuels targets on the order of 10-15% could be achieved without adverse net employment effects. (author)
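
    The mechanics of such an assessment rest on the Leontief input-output model: total sectoral output is x = (I - A)^-1 f for final demand f, and employment follows from sectoral labour coefficients. The sketch below shows this arithmetic on made-up three-sector numbers; it is not the paper's calibrated model.

```python
# Leontief input-output employment calculation on toy data:
# x = (I - A)^-1 f ;  employment = l . x
# All coefficients are illustrative, not the paper's calibrated model.
import numpy as np

A = np.array([[0.10, 0.05, 0.02],   # inter-industry coefficients for
              [0.20, 0.15, 0.10],   # agriculture, fuels, capital goods
              [0.05, 0.10, 0.08]])
l = np.array([12.0, 3.0, 5.0])      # jobs per million EUR of output

f_baseline = np.array([100.0, 50.0, 20.0])            # final demand
f_biofuels = f_baseline + np.array([8.0, -4.0, 6.0])  # shift to biofuels

L = np.linalg.inv(np.eye(3) - A)    # Leontief inverse

def jobs(f):
    return float(l @ (L @ f))

print(f"net employment effect: {jobs(f_biofuels) - jobs(f_baseline):+.1f} jobs")
```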

  3. Hydraulic and gas transfer numerical simulations at cell and module scale of a clay host rock repository in the Forge project framework

    International Nuclear Information System (INIS)

    Document available in extended abstract form only. The multiple barrier disposal concept is the cornerstone of all proposed schemes for geological disposal of radioactive wastes. The concept is based on a series of passive complementary barriers, both engineered and natural, that act to achieve the required level of safety for radioactive waste disposed in a geological repository. Demonstrating an appropriate understanding of gas generation and migration is a key component in a safety case for a geological repository for radioactive waste. On the basis of work to date, the overall behaviour of waste-derived gas and its influences on repository system performance require improved understanding. Key issues to be further examined relating to an enhanced understanding of gas-related processes include: dilational versus visco-capillary flow mechanisms; long-term integrity of seals, in particular gas flow along contacts; role of the EDZ as a conduit for preferential flow; and laboratory to field up-scaling. Such issues are the focus of the integrated, multi-disciplinary European Commission FORGE project. The FORGE project links international radioactive waste management organisations, regulators and academia, and is specifically designed to tackle the key research issues associated with the generation and movement of repository gases associated with waste disposed in a geological repository. Of particular importance are the long-term performance of bentonite buffers, plastic clays, indurated mud-rocks and crystalline formations. This presentation will focus on the numerical simulation work done by Andra in the FORGE framework, especially at the cell and module scales. One of the main problems in dealing with different teams from different countries, who have different disposal concepts, is to find a common representation as a basis for the benchmark. This implies simplifications are necessary to real concepts, in order that they be represented at a basic level that has

  4. Argonne Leadership Computing Facility 2011 annual report : Shaping future supercomputing.

    Energy Technology Data Exchange (ETDEWEB)

    Papka, M.; Messina, P.; Coffey, R.; Drugan, C. (LCF)

    2012-08-16

    The ALCF's Early Science Program aims to prepare key applications for the architecture and scale of Mira and to solidify libraries and infrastructure that will pave the way for other future production applications. Two billion core-hours have been allocated to 16 Early Science projects on Mira. The projects, in addition to promising delivery of exciting new science, are all based on state-of-the-art, petascale, parallel applications. The project teams, in collaboration with ALCF staff and IBM, have undertaken intensive efforts to adapt their software to take advantage of Mira's Blue Gene/Q architecture, which, in a number of ways, is a precursor to future high-performance-computing architecture. The Argonne Leadership Computing Facility (ALCF) enables transformative science that solves some of the most difficult challenges in biology, chemistry, energy, climate, materials, physics, and other scientific realms. Users partnering with ALCF staff have reached research milestones previously unattainable, due to the ALCF's world-class supercomputing resources and expertise in computational science. In 2011, the ALCF's commitment to providing outstanding science and leadership-class resources was honored with several prestigious awards. Research on multiscale brain blood flow simulations was named a Gordon Bell Prize finalist. Intrepid, the ALCF's BG/P system, ranked No. 1 on the Graph 500 list for the second consecutive year. The next-generation BG/Q prototype again topped the Green500 list. Skilled experts at the ALCF enable researchers to conduct breakthrough science on the Blue Gene system in key ways. The Catalyst Team matches project PIs with experienced computational scientists to maximize and accelerate research in their specific scientific domains. The Performance Engineering Team facilitates the effective use of applications on the Blue Gene system by assessing and improving the algorithms used by applications and the techniques used to

  5. Simulation

    DEFF Research Database (Denmark)

    Gould, Derek A; Chalmers, Nicholas; Johnson, Sheena J;

    2012-01-01

    Recognition of the many limitations of traditional apprenticeship training is driving new approaches to learning medical procedural skills. Among simulation technologies and methods available today, computer-based systems are topical and bring the benefits of automated, repeatable, and reliable ... performance assessments. Human factors research is central to simulator model development that is relevant to real-world imaging-guided interventional tasks and to the credentialing programs in which it would be used.

  6. Selection, specification, design and use of various nuclear power plant training simulators. Report prepared within the framework of the International Working Group on Nuclear Power Plant Control and Instrumentation

    International Nuclear Information System (INIS)

    Several IAEA publications consider the role of training and particularly the role of simulator training to enhance the safety of NPP operations. Initially, the focus was on full scope simulators for the training of main control room operators. Experience shows that other types of simulator are also effective tools that allow simulator training for a broader range of target groups and training objectives. This report provides guidance to training centers and suppliers on the proper selection, specification, design and use of various forms of simulators. In addition, it provides examples of their use in several Member States. This report is the result of a series of advisory and consultants' meetings held in the framework of the International Working Group on Nuclear Power Plant Control and Instrumentation (IWG-NPPCI) in 1995-1996.

  7. Simulation of the Mechanism of Gas Sorption in a Metal–Organic Framework with Open Metal Sites: Molecular Hydrogen in PCN-61

    KAUST Repository

    Forrest, Katherine A.

    2012-07-26

    Grand canonical Monte Carlo (GCMC) simulations were performed to investigate hydrogen sorption in an rht-type metal-organic framework (MOF), PCN-61. The MOF was shown to have a large hydrogen uptake, and this was studied using three different hydrogen potentials, effective for bulk hydrogen, but of varying sophistication: a model that includes only repulsion/dispersion parameters, one augmented with charge-quadrupole interactions, and one supplemented with many-body polarization interactions. Calculated hydrogen uptake isotherms and isosteric heats of adsorption, Q_st, were in quantitative agreement with experiment only for the model with explicit polarization. This success in reproducing empirical measurements suggests that modeling MOFs that have open metal sites is feasible, though it is often not considered to be well described via a classical potential function; here it is shown that such systems may be accurately described by explicitly including polarization effects in an otherwise traditional empirical potential. Decomposition of energy terms for the models revealed deviations between the electrostatic and polarizable results that are unexpected due to just the augmentation of the potential surface by the addition of induction. Charge-quadrupole and induction energetics were shown to have a synergistic interaction, with inclusion of the latter resulting in a significant increase in the former. Induction interactions strongly influence the structure of the sorbed hydrogen compared to the models lacking polarizability; sorbed hydrogen is a dipolar dense fluid in the MOF. This study demonstrates that many-body polarization makes a critical contribution to gas sorption structure and must be accounted for in modeling MOFs with polar interaction sites. © 2012 American Chemical Society.
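
    In grand canonical Monte Carlo, sorbate molecules are inserted and deleted with acceptance probabilities set by the chemical potential and the change in interaction energy (here the repulsion/dispersion, charge-quadrupole, and polarization terms). The schematic below shows the insertion/deletion bookkeeping with a placeholder energy function and illustrative parameters; it is not the paper's actual potential model.

```python
# Schematic GCMC insertion/deletion moves for sorption in a framework.
# energy_of() stands in for the model potential (repulsion/dispersion,
# electrostatics, many-body polarization); all numbers are illustrative.
import math
import random

beta = 1.0 / (0.0083145 * 77.0)  # 1/kT in mol/kJ at 77 K
mu = -5.0                        # chemical potential, kJ/mol
V, lam3 = 8000.0, 1.0            # volume (A^3), thermal wavelength^3

def energy_of(particles, trial):
    """Placeholder interaction energy of a trial particle (kJ/mol)."""
    return -4.0 + 0.1 * len(particles)

particles = []
for _ in range(10000):
    if random.random() < 0.5:  # attempt an insertion
        trial = tuple(random.uniform(0, 20) for _ in range(3))
        dU = energy_of(particles, trial)
        acc = (V / (lam3 * (len(particles) + 1))) * math.exp(beta * (mu - dU))
        if random.random() < min(1.0, acc):
            particles.append(trial)
    elif particles:            # attempt a deletion
        i = random.randrange(len(particles))
        rest = particles[:i] + particles[i + 1:]
        dU = -energy_of(rest, particles[i])  # energy change on removal
        acc = (lam3 * len(particles) / V) * math.exp(-beta * (mu + dU))
        if random.random() < min(1.0, acc):
            particles.pop(i)

print("average uptake (toy model):", len(particles))
```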

  8. Argonne's Expedited Site Characterization: An integrated approach to cost- and time-effective remedial investigation

    International Nuclear Information System (INIS)

    Argonne National Laboratory has developed a methodology for remedial site investigation that has proven to be both technically superior to and more cost- and time-effective than traditional methods. This methodology is referred to as the Argonne Expedited Site Characterization (ESC). Quality is the driving force within the process. The Argonne ESC process is abbreviated only in time and cost and never in terms of quality. More usable data are produced with the Argonne ESC process than with traditional site characterization methods that are based on statistical-grid sampling and multiple monitoring wells. This paper gives an overview of the Argonne ESC process and compares it with traditional methods for site characterization. Two examples of implementation of the Argonne ESC process are discussed to illustrate the effectiveness of the process in CERCLA (Comprehensive Environmental Response, Compensation, and Liability Act) and RCRA (Resource Conservation and Recovery Act) programs.

  9. Simulation

    CERN Document Server

    Ross, Sheldon

    2006-01-01

    Ross's Simulation, Fourth Edition introduces aspiring and practicing actuaries, engineers, computer scientists and others to the practical aspects of constructing computerized simulation studies to analyze and interpret real phenomena. Readers learn to apply results of these analyses to problems in a wide variety of fields to obtain effective, accurate solutions and make predictions about future outcomes. This text explains how a computer can be used to generate random numbers, and how to use these random numbers to generate the behavior of a stochastic model over time. It presents the statist

  10. The virtual reality framework for engineering objects

    OpenAIRE

    Ivankov, Petr R.; Ivankov, Nikolay P.

    2006-01-01

    A framework for virtual reality of engineering objects has been developed. The framework can simulate a variety of equipment in virtual reality settings, and it supports 6D dynamics, ordinary differential equations, finite formulas, and vector and matrix operations. The framework also supports embedding of external software.

  11. Present and future radioactive nuclear beam developments at Argonne

    Energy Technology Data Exchange (ETDEWEB)

    Decrock, P.

    1996-11-01

    A scheme for building an ISOL-based radioactive nuclear beam facility at the Argonne Physics Division is currently being evaluated. The feasibility and efficiency of the different steps in the proposed production and acceleration cycles are being tested. At the Dynamitron Facility of the ANL Physics Division, stripping yields of Kr, Xe and Pb beams in a windowless gas cell have been measured, and the study of fission of {sup 238}U induced by fast neutrons from the {sup 9}Be(d,n) reaction is in progress. Different aspects of the post-acceleration procedure are currently being investigated. In parallel with this work, energetic radioactive beams such as {sup 17}F, {sup 18}F and {sup 56}Ni have recently been developed at Argonne using the present ATLAS facility.

  12. APEX user's guide - (Argonne production, expansion, and exchange model for electrical systems), version 3.0

    Energy Technology Data Exchange (ETDEWEB)

    VanKuiken, J.C.; Veselka, T.D.; Guziel, K.A.; Blodgett, D.W.; Hamilton, S.; Kavicky, J.A.; Koritarov, V.S.; North, M.J.; Novickas, A.A.; Paprockas, K.R. [and others]

    1994-11-01

    This report describes operating procedures and background documentation for the Argonne Production, Expansion, and Exchange Model for Electrical Systems (APEX). This modeling system was developed to provide the U.S. Department of Energy, Division of Fossil Energy, Office of Coal and Electricity with in-house capabilities for addressing policy options that affect electrical utilities. To meet this objective, Argonne National Laboratory developed a menu-driven programming package that enables the user to develop and conduct simulations of production costs, system reliability, spot market network flows, and optimal system capacity expansion. The APEX system consists of three basic simulation components, supported by various databases and data management software. The components include (1) the Investigation of Costs and Reliability in Utility Systems (ICARUS) model, (2) the Spot Market Network (SMN) model, and (3) the Production and Capacity Expansion (PACE) model. The ICARUS model provides generating-unit-level production-cost and reliability simulations with explicit recognition of planned and unplanned outages. The SMN model addresses optimal network flows with recognition of marginal costs, wheeling charges, and transmission constraints. The PACE model determines long-term (e.g., longer than 10 years) capacity expansion schedules on the basis of candidate expansion technologies and load growth estimates. In addition, the Automated Data Assembly Package (ADAP) and case management features simplify user-input requirements. The ADAP, ICARUS, and SMN modules are described in detail. The PACE module is expected to be addressed in a future publication.

  13. The beam optics of the Argonne Positive-Ion Injector

    International Nuclear Information System (INIS)

    The beam optics for Phase I of the Argonne Positive-Ion Injector linac system have been studied for a representative set of beams. The results of this study indicate that high charge state beams from an ECR source can be accelerated without significantly increasing the transverse or longitudinal emittance of the initial beam. It is expected that the beam quality from the PII-ATLAS system will be at least as good as presently achieved with the tandem-ATLAS system

  14. Reactor D and D at Argonne National Laboratory - lessons learned

    International Nuclear Information System (INIS)

    This paper focuses on the lessons learned during the decontamination and decommissioning (D and D) of two reactors at Argonne National Laboratory-East (ANL-E). The Experimental Boiling Water Reactor (EBWR) was a 100 MW(t), 5 MW(e) proof-of-concept facility. The Janus Reactor was a 200 kW(t) reactor located at the Biological Irradiation Facility and was used to study the effects of neutron radiation on animals.

  15. Pyrolysis of the Argonne premium coals under slow heating conditions

    Energy Technology Data Exchange (ETDEWEB)

    Serio, M.A.; Solomon, P.R.; Carangelo, R.M. (Advanced Fuel Research, Inc., 87 Church St., East Hartford, CT (US))

    1988-06-01

    The establishment of the Argonne Premium Sample Bank will allow more meaningful comparisons to be made between pyrolysis studies from different laboratories. This sample bank also provides a good suite of coals for examining rank dependent phenomena, such as the kinetics of primary gas evolution. A recent ''general'' model of coal pyrolysis proposed by our research group has as one of its assumptions that the kinetics of primary product evolution are rank-insensitive. This assumption was tested by a thorough examination of our data from experiments where only coal type was varied as well as data from similar experiments in the literature. The conclusion was that, with few exceptions, the kinetic rate constants for individual species evolved from coals pyrolyzed under the same conditions show little variation with rank. However, this conclusion remains controversial. The Argonne premium samples provide an opportunity to further test this assumption with a set of coals that was designed to cover a wide range of coal types. A slow, constant heating rate experiment was used, which is the most sensitive to rate variations. The authors' own work has indicated a role for heating rate on tar yields for bituminous coals and on tar molecular weight distributions for lignites. The authors plan to extend this work to the Argonne coals in order to better establish these trends. The current paper is concerned primarily with pyrolysis of the Argonne coals under slow heating conditions in a unique TG-FTIR instrument developed in the authors' laboratory. Results from slow heating rate pyrolysis into a FIMS apparatus are also presented. Experiments have also been done under rapid heating conditions.
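
    The rank-insensitivity question above concerns the rate constant in the standard first-order evolution model for a pyrolysis species: dV/dT = (A/beta) exp(-E/RT) (Vmax - V) under a constant heating rate beta, which is why slow constant-heating-rate experiments are the most sensitive test of rate variations. The sketch below integrates this textbook model with illustrative parameters; the values are not fits to the Argonne premium coals.

```python
# First-order evolution of a pyrolysis species under a constant heating
# rate: dV/dT = (A/beta) * exp(-E/(R*T)) * (Vmax - V).
# Parameters are illustrative, not fits to the Argonne premium coals.
import math

A = 2.0e13      # pre-exponential factor, 1/s
E = 230e3       # activation energy, J/mol
R = 8.314       # gas constant, J/(mol K)
beta = 0.5      # heating rate, K/s (a "slow" 30 K/min experiment)
Vmax = 1.0      # ultimate yield, normalized

V, T, dT = 0.0, 300.0, 1.0
peak_T, peak_rate = T, 0.0
while T < 1300.0:
    rate = (A / beta) * math.exp(-E / (R * T)) * (Vmax - V)  # dV/dT
    if rate > peak_rate:
        peak_T, peak_rate = T, rate
    V += rate * dT
    T += dT

print(f"peak evolution near {peak_T:.0f} K; final yield {V:.3f}")
```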

  16. Multiscale framework for predicting the coupling between deformation and fluid diffusion in porous rocks

    Energy Technology Data Exchange (ETDEWEB)

    Andrade, José E; Rudnicki, John W

    2012-12-14

    In this project, a predictive multiscale framework will be developed to simulate the strong coupling between solid deformations and fluid diffusion in porous rocks. We intend to improve macroscale modeling by incorporating fundamental physical modeling at the microscale in a computationally efficient way. This is an essential step toward further developments in multiphysics modeling, linking hydraulic, thermal, chemical, and geomechanical processes. This research will focus on areas where severe deformations are observed, such as deformation bands, where classical phenomenology breaks down. Multiscale geometric complexities and key geomechanical and hydraulic attributes of deformation bands (e.g., grain sliding and crushing, and pore collapse, causing interstitial fluid expulsion under saturated conditions) can significantly affect the constitutive response of the skeleton and the intrinsic permeability. The discrete element method (DEM) and the lattice Boltzmann method (LBM) will be used to probe the microstructure, under its current state, to extract the evolution of macroscopic constitutive parameters and the permeability tensor. These evolving macroscopic constitutive parameters are then directly used in continuum scale predictions using the finite element method (FEM) accounting for the coupled solid deformation and fluid diffusion. A particularly valuable aspect of this research is the thorough quantitative verification and validation program at different scales. The multiscale homogenization framework will be validated using X-ray computed tomography and 3D digital image correlation in situ at the Advanced Photon Source at Argonne National Laboratory. Also, the hierarchical computations at the specimen level will be validated using the aforementioned techniques in samples of sandstone undergoing deformation bands.
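
    The workflow described is a hierarchical upscaling loop: at each continuum step, microscale computations refresh the constitutive parameters and permeability used by the macroscale solver. The sketch below caricatures that coupling with stand-in single-number models; every function and value is a hypothetical placeholder for the DEM, LBM, and FEM components named in the abstract.

```python
# Schematic multiscale coupling loop: microscale probes (stand-ins for
# DEM and LBM) update the parameters used by the continuum (FEM) step.
# All functions and numbers are hypothetical placeholders.

def dem_probe(porosity):
    """Grain-scale stand-in: stiffness evolves as pores collapse."""
    return {"bulk_modulus": 10.0e9 * (1.0 - porosity)}  # Pa

def lbm_permeability(porosity):
    """Pore-scale stand-in: a Kozeny-Carman-like permeability trend."""
    return 1.0e-14 * porosity**3 / (1.0 - porosity) ** 2  # m^2

def fem_step(porosity, params, k, dt):
    """Continuum stand-in: compaction, slower as the skeleton stiffens.
    In a real coupled step, k would enter the fluid-diffusion solve."""
    rate = 1.0e-4 * (10.0e9 / params["bulk_modulus"])
    return max(porosity - rate * dt, 0.05)

porosity = 0.25
for _ in range(100):
    params = dem_probe(porosity)       # updated constitutive parameters
    k = lbm_permeability(porosity)     # updated permeability
    porosity = fem_step(porosity, params, k, dt=1.0)

print(f"final porosity {porosity:.3f}, "
      f"permeability {lbm_permeability(porosity):.2e} m^2")
```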

  17. Argonne's Laboratory Computing Resource Center : 2005 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R. B.; Coghlan, S. C; Kaushik, D. K.; Riley, K. R.; Valdes, J. V.; Pieper, G. P.

    2007-06-30

    Argonne National Laboratory founded the Laboratory Computing Resource Center in the spring of 2002 to help meet pressing program needs for computational modeling, simulation, and analysis. The guiding mission is to provide critical computing resources that accelerate the development of high-performance computing expertise, applications, and computations to meet the Laboratory's challenging science and engineering missions. The first goal of the LCRC was to deploy a mid-range supercomputing facility to support the unmet computational needs of the Laboratory. To this end, in September 2002, the Laboratory purchased a 350-node computing cluster from Linux NetworX. This cluster, named 'Jazz', achieved over a teraflop of computing power (10{sup 12} floating-point calculations per second) on standard tests, making it the Laboratory's first terascale computing system and one of the fifty fastest computers in the world at the time. Jazz was made available to early users in November 2002 while the system was undergoing development and configuration. In April 2003, Jazz was officially made available for production operation. Since then, the Jazz user community has grown steadily. By the end of fiscal year 2005, there were 62 active projects on Jazz involving over 320 scientists and engineers. These projects represent a wide cross-section of Laboratory expertise, including work in biosciences, chemistry, climate, computer science, engineering applications, environmental science, geoscience, information science, materials science, mathematics, nanoscience, nuclear engineering, and physics. Most important, many projects have achieved results that would have been unobtainable without such a computing resource. The LCRC continues to improve the computational science and engineering capability and quality at the Laboratory. Specific goals include expansion of the use of Jazz to new disciplines and Laboratory initiatives, teaming with Laboratory infrastructure

  18. Argonne's Laboratory computing resource center : 2006 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R. B.; Kaushik, D. K.; Riley, K. R.; Valdes, J. V.; Drugan, C. D.; Pieper, G. P.

    2007-05-31

    Argonne National Laboratory founded the Laboratory Computing Resource Center (LCRC) in the spring of 2002 to help meet pressing program needs for computational modeling, simulation, and analysis. The guiding mission is to provide critical computing resources that accelerate the development of high-performance computing expertise, applications, and computations to meet the Laboratory's challenging science and engineering missions. In September 2002 the LCRC deployed a 350-node computing cluster from Linux NetworX to address Laboratory needs for mid-range supercomputing. This cluster, named 'Jazz', achieved over a teraflop of computing power (10{sup 12} floating-point calculations per second) on standard tests, making it the Laboratory's first terascale computing system and one of the 50 fastest computers in the world at the time. Jazz was made available to early users in November 2002 while the system was undergoing development and configuration. In April 2003, Jazz was officially made available for production operation. Since then, the Jazz user community has grown steadily. By the end of fiscal year 2006, there were 76 active projects on Jazz involving over 380 scientists and engineers. These projects represent a wide cross-section of Laboratory expertise, including work in biosciences, chemistry, climate, computer science, engineering applications, environmental science, geoscience, information science, materials science, mathematics, nanoscience, nuclear engineering, and physics. Most important, many projects have achieved results that would have been unobtainable without such a computing resource. The LCRC continues to foster growth in the computational science and engineering capability and quality at the Laboratory. Specific goals include expansion of the use of Jazz to new disciplines and Laboratory initiatives, teaming with Laboratory infrastructure providers to offer more scientific data management capabilities, expanding Argonne staff

  19. Digital Polygon Model Grid of the Hydrogeologic Framework of Bedrock Units for a Simulation of Groundwater Flow for the Lake Michigan Basin

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — The hydrogeologic framework for the Lake Michigan Basin model was developed by grouping the bedrock geology of the study area into hydrogeologic units on the basis...

  20. Argonne National Laboratory institutional plan FY 2002 - FY 2007

    International Nuclear Information System (INIS)

    The national laboratory system provides a unique resource for addressing the national needs inherent in the mission of the Department of Energy. Argonne, which grew out of Enrico Fermi's pioneering work on the development of nuclear power, was the first national laboratory and, in many ways, has set the standard for those that followed. As the Laboratory's new director, I am pleased to present the Argonne National Laboratory Institutional Plan for FY 2002 through FY 2007 on behalf of the extraordinary group of scientists, engineers, technicians, administrators, and others who are responsible for the Laboratory's distinguished record of achievement. Like our sister DOE laboratories, Argonne uses a multifaceted approach to advance U.S. R and D priorities. First, we assemble interdisciplinary teams of scientists and engineers to address complex problems. For example, our initiative in Functional Genomics will bring together biologists, computer scientists, environmental scientists, and staff of the Advanced Photon Source to develop complete maps of cellular function. Second, we cultivate specific core competencies in science and technology; this Institutional Plan discusses the many ways in which our core competencies support DOE's four mission areas. Third, we serve the scientific community by designing, building, and operating world-class user facilities, such as the Advanced Photon Source, the Intense Pulsed Neutron Source, and the Argonne Tandem-Linac Accelerator System. This Plan summarizes the visions, missions, and strategic plans for the Laboratory's existing major user facilities, and it explains our approach to the planned Rare Isotope Accelerator. Fourth, we help develop the next generation of scientists and engineers through educational programs, many of which involve bright young people in research. This Plan summarizes our vision, objectives, and strategies in the education area, and it gives statistics on student and faculty participation. Finally, we

  1. SIMULATION MODEL RESOURCE SEARCH FRAMEWORK BASED ON SEMANTICS DESCRIPTION OF CONCEPTUAL MODEL

    Institute of Scientific and Technical Information of China (English)

    康晓予; 邓贵仕

    2011-01-01

    Constructing new simulation applications by reusing existing simulation models has long received attention in the system simulation field. A key issue in realising such reuse is searching a model database for, evaluating, and applying the simulation model resources that match the needs of an application. This paper proposes a search framework for simulation model resources based on the semantic description of conceptual models and explains its structure in detail. The framework builds a semantic description model for simulation model resources from conceptual model elements such as entities, tasks and interactions, and it employs search strategies that combine ontology semantics with keyword matching. Simulation experiments indicate that the framework can markedly improve the accuracy of search and evaluation.
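
    A common way to combine the two search strategies mentioned in the abstract is to score each candidate model resource with a weighted mix of keyword overlap and ontology-based concept similarity. The sketch below illustrates that idea with a toy ontology and scoring function; it is a generic illustration, not the framework actually proposed in the paper.

```python
# Toy combined keyword + ontology scoring for simulation model search.
# The mini-ontology and weights are illustrative placeholders.

ONTOLOGY = {"tank": "ground_vehicle", "truck": "ground_vehicle",
            "ground_vehicle": "vehicle", "aircraft": "vehicle"}

def ancestors(concept):
    chain = [concept]
    while concept in ONTOLOGY:
        concept = ONTOLOGY[concept]
        chain.append(concept)
    return chain

def concept_similarity(a, b):
    """1.0 for identical concepts, decaying with distance to a shared ancestor."""
    pa, pb = ancestors(a), ancestors(b)
    shared = next((c for c in pa if c in pb), None)
    return 0.0 if shared is None else 1.0 / (1 + pa.index(shared) + pb.index(shared))

def score(query, model, w_kw=0.5, w_ont=0.5):
    kw = len(query["keywords"] & model["keywords"]) / max(len(query["keywords"]), 1)
    return w_kw * kw + w_ont * concept_similarity(query["entity"], model["entity"])

query = {"entity": "tank", "keywords": {"terrain", "movement"}}
models = [{"name": "TruckMobility", "entity": "truck", "keywords": {"movement"}},
          {"name": "RadarModel", "entity": "aircraft", "keywords": {"detection"}}]
print(max(models, key=lambda m: score(query, m))["name"])  # -> TruckMobility
```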

  2. Profile: Returning Argonne researcher aims to push computing speed to next level

    CERN Multimedia

    Merrion, P

    2002-01-01

    Paul Messina, one of the world's leading supercomputer scientists has returned to Argonne National Laboratory. He will split time between Argonne, where he holds the title of senior computer scientist, and Geneva, Switzerland, where he is an adviser to the director of CERN (1 page).

  3. Frontiers: Research highlights 1946-1996 [50th Anniversary Edition. Argonne National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-12-31

    This special edition of 'Frontiers' commemorates Argonne National Laboratory's 50th anniversary of service to science and society. America's first national laboratory, Argonne has been in the forefront of U.S. scientific and technological research from its beginning. Past accomplishments, current research, and future plans are highlighted.

  4. Argonne National Lab deploys Force10 networks' massively dense ethernet switch for supercomputing cluster

    CERN Multimedia

    2003-01-01

    "Force10 Networks, Inc. today announced that Argonne National Laboratory (Argonne, IL) has successfully deployed Force10 E-Series switch/routers to connect to the TeraGrid, the world's largest supercomputing grid, sponsored by the National Science Foundation (NSF)" (1/2 page).

  5. Using the C4ISR Architecture Framework as a Tool to Facilitate VV&A for Simulation Systems within the Military Application Domain

    CERN Document Server

    Tolk, Andreas

    2010-01-01

    To harmonize the individual architectures of the different commands, services, and agencies dealing with the development and procurement of Command, Control, Communications, Computing, Surveillance, Reconnaissance, and Intelligence (C4ISR) systems, the C4ISR Architecture Framework was developed based on existing and matured modeling techniques and methods. Within a short period, NATO adapted this method family as the NATO Consultation, Command, and Control (C3) System Architecture Framework to harmonize the efforts of the different nations. Based on these products, for every system to be fielded to be used in the US Armed Forces, a C4I Support Plan (C4ISP) has to be developed enabling the integration of the special system into the integrated C4I Architecture. The tool set proposed by these architecture frameworks connects operational views of the military user, system views of the developers, and the technical views for standards and integration methods needed to make the network centric system of systems wor...

  6. Quality Assurance Program: Argonne peer review activities for the salt host-rock portion of the Civilian Radioactive Waste Management Program

    International Nuclear Information System (INIS)

    This Quality Assurance (QA) Program sets forth the methods, controls, and procedures used to ensure that the results of Argonne National Laboratory's peer review activities are consistently of the highest quality and responsive to Salt Repository Project Office's needs and directives. Implementation of the QA procedures described herein establishes an operational framework so that task activities are traceable and the activities and decisions that influence the overall quality of the peer review process and results are fully documented. 56 refs., 5 figs., 6 tabs

  7. Radiation chemistry at the Metallurgical Laboratory, Manhattan Project, University of Chicago (1942-1947) and the Argonne National Laboratory, Argonne, IL (1947-1984)

    International Nuclear Information System (INIS)

    The events in radiation chemistry which occurred in the Manhattan Project Laboratory and Argonne National Laboratory during World War II are reviewed. Research programmes from then until the present day are presented, with emphasis on pulse radiolysis studies. (UK)

  8. Pyrolysis of the Argonne premium coals under slow heating conditions

    Energy Technology Data Exchange (ETDEWEB)

    Serio, M.A.; Solomon, P.R.; Carangelo, R.M. (Advanced Fuel Research, Inc., East Hartford, CT (USA))

    1988-01-01

    A recent general model of coal pyrolysis proposed by the authors' research group has as one of its assumptions that the kinetics of primary product evolution are rank-insensitive. This assumption was tested by a thorough examination of data from experiments where only coal type was varied as well as data from similar experiments in the literature. The conclusion was that, with few exceptions, the kinetic rate constants show little variation with rank. However, this conclusion remains controversial. The Argonne premium samples provide an opportunity to further test this assumption with a set of coals that was designed to cover a wide range of coal types. A slow, constant heating rate experiment was used, which is the most sensitive to rate variations. A second controversial area is the importance of heating rate on the volatile product yield and distribution. Evidence has been presented which suggests no intrinsic effect of heating rate on pyrolysis yields and other studies have indicated the converse to be true. However, often these studies have been done under sufficiently different experimental conditions that direct comparisons are difficult. Work has indicated a role for heating rate on tar yields for bituminous coals and on tar molecular weight distributions for lignites. The authors plan to extend this work to the Argonne coals in order to better establish these trends. The current paper is concerned primarily with pyrolysis of the Argonne coals under slow heating conditions in a unique TG-FTIR instrument developed in this laboratory. Results from slow heating rate pyrolysis into a FIMS apparatus are also presented.

  9. Argonne-West Waste Characterization Area for mixed TRU waste

    International Nuclear Information System (INIS)

    Argonne National Laboratory-West has developed a facility to characterize and repackage transuranic mixed waste, the Waste Characterization Area (WCA). This new facility is designed to current Department of Energy design criteria for radioactive waste handling facilities. Characterization and repackaging of real waste within the WCA began in April 1994. Characterization operations include visual examination of solid waste and the collection and analysis of drum headspace and inner-bag gas samples. Additions planned for the WCA in late 1994 include equipment for sludge waste core sample extraction, preparation, and analysis. This paper addresses the general design strategy, facility features, and specialized equipment associated with the WCA.

  10. Initial operation of the Argonne superconducting heavy-ion linac

    International Nuclear Information System (INIS)

    Initial operation and recent development of the Argonne superconducting heavy-ion linac are discussed. The linac has been developed in order to demonstrate a cost-effective means of extending the performance of electrostatic tandem accelerators. The results of beam acceleration tests which began in June 1978 are described. At present 7 of a planned array of 22 resonators are operating on-line, and the linac system provides an effective accelerating potential of 7.5 MV. Although some technical problems remain, the level of performance and reliability is sufficient that appreciable beam time is becoming available to users

  11. Argonne mechanical design proposal for the ATLAS hadron calorimeter

    International Nuclear Information System (INIS)

    The uniqueness of the Argonne design is given here: (1) by overlapping the spacer plates the compression load is carried through the module without affecting the scintillator slots; (2) flat thin straps are used in place of tie rods; (3) a supermodule is constructed of six 1 meter modules; (4) it is not necessary to drill holes through the scintillator; (5) absorber structure can be assembled independent of scintillator; (6) straps provide better load distribution across the plates; and (7) this design, as currently drawn, does not include internal sourcing, but does not preclude it being used

  12. 1985 annual site environmental report for Argonne National Laboratory

    International Nuclear Information System (INIS)

    This is one in a series of annual reports prepared to provide DOE, environmental agencies, and the public with information on the level of radioactive and chemical pollutants in the environment and on the amounts of such substances, if any, added to the environment as a result of Argonne operations. Included in this report are the results of measurements obtained in 1985 for a number of radionuclides in air, surface water, ground water, soil, grass, bottom sediment, and milk; for a variety of chemical constituents in surface and subsurface water; and for the external penetrating radiation dose

  13. 1985 annual site environmental report for Argonne National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Golchert, N.W.; Duffy, T.L.; Sedlet, J.

    1986-03-01

    This is one in a series of annual reports prepared to provide DOE, environmental agencies, and the public with information on the level of radioactive and chemical pollutants in the environment and on the amounts of such substances, if any, added to the environment as a result of Argonne operations. Included in this report are the results of measurements obtained in 1985 for a number of radionuclides in air, surface water, ground water, soil, grass, bottom sediment, and milk; for a variety of chemical constituents in surface and subsurface water; and for the external penetrating radiation dose.

  14. Argonne National Laboratory monthly progress report, April 1952

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1952-04-01

    This progress report from the Argonne National Laboratory covers the work in Biological and Medical Research, Radiological Physics, and Health Services for the quarterly period ending March 31, 1952. Numerous experiments were conducted in an attempt to answer some of the questions arising from exposure to ionizing radiation, especially X radiation. Some of the research involved the radiosensitivity of cells and some involved animals. The effects of radium in humans were also evaluated. Other studies were performed in biology, such as the effect of photoperiodism on plant growth and the biological effects of beryllium.

  15. Change in argonne national laboratory: a case study.

    Science.gov (United States)

    Mozley, A

    1971-10-01

    Despite traditional opposition to change within an institution and the known reluctance of an "old guard" to accept new managerial policies and techniques, the reactions suggested in this study go well beyond the level of a basic resistance to change. The response, indeed, drawn from a random sampling of Laboratory scientific and engineering personnel, comes close to what Philip Handler has recently described as a run on the scientific bank in a period of depression (1, p. 146). It appears that Argonne's apprehension stems less from the financial cuts that have reduced staff and diminished programs by an annual 10 percent across the last 3 fiscal years than from the administrative and conceptual changes that have stamped the institution since 1966. Administratively, the advent of the AUA has not forged a sense of collaborative effort implicit in the founding negotiations or contributed noticeably to increasing standards of excellence at Argonne. The AUA has, in fact, yet to exercise the constructive powers vested in them by the contract of reviewing and formulating long-term policy on the research and reactor side. Additionally, the University of Chicago, once the single operator, appears to have forfeited some of the trust and understanding that characterized the Laboratory's attitude to it in former years. In a period of complex and sensitive management the present directorate at Argonne is seriously dissociated from a responsible spectrum of opinion within the Laboratory. The crux of discontent among the creative scientific and engineering community appears to lie in a developed sense of being overadministered. In contrast to earlier periods, Argonne's professional staff feels a critical need for a voice in the formulation of Laboratory programs and policy. The Argonne senate could supply this mechanism. Slow to rally, their present concern springs from a firm conviction that the Laboratory is "withering on the vine." By contrast, the Laboratory director Powers

  16. Draft environmental assessment of Argonne National Laboratory, East

    International Nuclear Information System (INIS)

    This environmental assessment of the operation of the Argonne National Laboratory is related to continuation of research and development work being conducted at the Laboratory site at Argonne, Illinois. The Laboratory has been monitoring various environmental parameters both offsite and onsite since 1949. Meteorological data have been collected to support development of models for atmospheric dispersion of radioactive and other pollutants. Gaseous and liquid effluents, both radioactive and non-radioactive, have been measured by portable monitors and by continuous monitors at fixed sites. Monitoring of constituents of the terrestrial ecosystem provides a basis for identifying changes should they occur in this regime. The Laboratory has established a position of leadership in monitoring methodologies and their application. Offsite impacts of nonradiological accidents are primarily those associated with the release of chlorine and with sodium fires. Both result in releases that cause no health damage offsite. Radioactive materials released to the environment result in a cumulative dose to persons residing within 50 miles of the site of about 47 man-rem per year, compared to an annual total of about 950,000 man-rem delivered to the same population from natural background radiation. 100 refs., 17 figs., 33 tabs.
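
    As a quick sanity check on the scale of those figures, the Laboratory's contribution can be expressed as a fraction of the natural-background collective dose (simple arithmetic on the numbers quoted above):

        47 man-rem/yr / 950,000 man-rem/yr ≈ 5 x 10^-5

    That is, routine releases add roughly 0.005 percent to the collective dose the surrounding population already receives from natural background.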

  17. Draft environmental assessment of Argonne National Laboratory, East

    Energy Technology Data Exchange (ETDEWEB)

    1975-10-01

    This environmental assessment of the operation of the Argonne National Laboratory is related to continuation of research and development work being conducted at the Laboratory site at Argonne, Illinois. The Laboratory has been monitoring various environmental parameters both offsite and onsite since 1949. Meteorological data have been collected to support development of models for atmospheric dispersion of radioactive and other pollutants. Gaseous and liquid effluents, both radioactive and non-radioactive, have been measured by portable monitors and by continuous monitors at fixed sites. Monitoring of constituents of the terrestrial ecosystem provides a basis for identifying changes should they occur in this regime. The Laboratory has established a position of leadership in monitoring methodologies and their application. Offsite impacts of nonradiological accidents are primarily those associated with the release of chlorine and with sodium fires. Both result in releases that cause no health damage offsite. Radioactive materials released to the environment result in a cumulative dose to persons residing within 50 miles of the site of about 47 man-rem per year, compared to an annual total of about 950,000 man-rem delivered to the same population from natural background radiation. 100 refs., 17 figs., 33 tabs.

  18. Monte Carlo simulations versus experimental measurements in a small animal PET system. A comparison in the NEMA NU 4-2008 framework

    Science.gov (United States)

    Popota, F. D.; Aguiar, P.; España, S.; Lois, C.; Udias, J. M.; Ros, D.; Pavia, J.; Gispert, J. D.

    2015-01-01

    In this work a comparison between experimental data and data simulated with the GATE and PeneloPET Monte Carlo packages is presented. All simulated setups, as well as the experimental measurements, followed exactly the guidelines of the NEMA NU 4-2008 standards using the microPET R4 scanner. The comparison focused on spatial resolution, sensitivity, scatter fraction, and counting-rate performance. Both GATE and PeneloPET showed reasonable agreement with the experimental measurements for spatial resolution, although both led to slight underestimations for points close to the edge. Good agreement between experiments and simulations was obtained for the system's sensitivity and scatter fraction in a 350–650 keV energy window, as well as for the counting-rate simulations. The latter was the most complicated test to perform, since each code demands different specifications for the characterization of the system's dead time. Although simulated and experimental results were in excellent agreement for both simulation codes, PeneloPET demanded more information about the behavior of the real data acquisition system. To our knowledge, this constitutes the first validation of these Monte Carlo codes against the full NEMA NU 4-2008 standards for small-animal PET imaging systems.
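
    The counting-rate comparison above rests on the NEMA NU 4-2008 figures of merit. The following is a minimal Python sketch of two of them, the scatter fraction and the noise-equivalent count rate (NECR); the function names and the sample count rates are illustrative and are not taken from the study.

        def scatter_fraction(scattered, trues):
            """NEMA NU 4-2008 scatter fraction: SF = S / (S + T)."""
            return scattered / (scattered + trues)

        def noise_equivalent_count_rate(trues, scattered, randoms):
            """NECR = T^2 / (T + S + R); peaks where image SNR is best."""
            return trues ** 2 / (trues + scattered + randoms)

        # Hypothetical count rates (counts/s) at one activity point of a
        # phantom acquisition; real values would come from the scanner or
        # from the GATE / PeneloPET output.
        T, S, R = 120_000.0, 28_000.0, 15_000.0
        print(f"SF   = {scatter_fraction(S, T):.3f}")                     # ~0.189
        print(f"NECR = {noise_equivalent_count_rate(T, S, R):,.0f} cps")  # ~88,344

    Sweeping such count rates over a range of phantom activities is what produces the NECR-versus-activity curves used in the counting-rate test.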

  19. Monte Carlo simulations versus experimental measurements in a small animal PET system. A comparison in the NEMA NU 4-2008 framework

    International Nuclear Information System (INIS)

    In this work a comparison between experimental data and data simulated with the GATE and PeneloPET Monte Carlo packages is presented. All simulated setups, as well as the experimental measurements, followed exactly the guidelines of the NEMA NU 4-2008 standards using the microPET R4 scanner. The comparison focused on spatial resolution, sensitivity, scatter fraction, and counting-rate performance. Both GATE and PeneloPET showed reasonable agreement with the experimental measurements for spatial resolution, although both led to slight underestimations for points close to the edge. Good agreement between experiments and simulations was obtained for the system's sensitivity and scatter fraction in a 350–650 keV energy window, as well as for the counting-rate simulations. The latter was the most complicated test to perform, since each code demands different specifications for the characterization of the system's dead time. Although simulated and experimental results were in excellent agreement for both simulation codes, PeneloPET demanded more information about the behavior of the real data acquisition system. To our knowledge, this constitutes the first validation of these Monte Carlo codes against the full NEMA NU 4-2008 standards for small-animal PET imaging systems. (paper)

  20. Monte-Carlo simulation of colliding particles or coalescing droplets transported by a turbulent flow in the framework of a joint fluid–particle pdf approach

    OpenAIRE

    Fede, Pascal; Simonin, Olivier; Villedieu, Philippe

    2015-01-01

    The aim of the paper is to introduce and validate a Monte-Carlo algorithm for predicting an ensemble of colliding solid particles, or coalescing liquid droplets, suspended in a turbulent gas flow computed with a Reynolds-Averaged Navier–Stokes (RANS) approach. The new algorithm is based on the direct discretization of the collision/coalescence kernel derived in the framework of the joint fluid–particle pdf approach proposed by Simonin et al. (2002). This approach makes it possible to take into account...
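
    To make "direct discretization of the collision kernel" concrete, here is a generic Monte-Carlo sketch in Python of pairwise collision sampling in a single mixing volume. It uses a simple hard-sphere kernel rather than the authors' fluid–particle pdf kernel, and all names and parameter values are hypothetical.

        import math
        import random

        def collision_kernel(d_i, d_j, w_rel):
            # Geometric hard-sphere kernel: pi/4 * (d_i + d_j)^2 * |w_rel|
            return math.pi / 4.0 * (d_i + d_j) ** 2 * w_rel

        def monte_carlo_collision_step(particles, volume, dt, rng):
            """One step: each pair collides with probability beta_ij * dt / V."""
            n_collisions = 0
            n = len(particles)
            for i in range(n):
                for j in range(i + 1, n):
                    d_i, v_i = particles[i]
                    d_j, v_j = particles[j]
                    w_rel = abs(v_i - v_j)  # 1-D stand-in for |v_i - v_j|
                    p_ij = collision_kernel(d_i, d_j, w_rel) * dt / volume
                    if rng.random() < min(p_ij, 1.0):
                        n_collisions += 1
            return n_collisions

        # Toy usage: 200 droplets (diameter in m, 1-D velocity in m/s) in a
        # 1 mm^3 cell over a 10 ms time step.
        rng = random.Random(0)
        cloud = [(rng.uniform(1e-5, 1e-4), rng.gauss(0.0, 1.0)) for _ in range(200)]
        print(monte_carlo_collision_step(cloud, volume=1e-9, dt=1e-2, rng=rng))

    The quadratic pair loop is the direct form of the discretization; production codes typically replace it with stochastic pair-sampling or fictitious-collision schemes to avoid the O(N^2) cost.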