WorldWideScience

Sample records for argonne simulation framework

  1. MONARC Simulation Framework

    CERN Document Server

    Dobre, Ciprian

    2011-01-01

    This paper discusses the latest generation of the MONARC (MOdels of Networked Analysis at Regional Centers) simulation framework, as a design and modelling tool for large scale distributed systems applied to HEP experiments. A process-oriented approach for discrete event simulation is well-suited for describing concurrent running programs, as well as the stochastic arrival patterns that characterize how such systems are used. The simulation engine is based on Threaded Objects (or Active Objects), which offer great flexibility in simulating the complex behavior of distributed data processing programs. The engine provides an appropriate scheduling mechanism for the Active objects with support for interrupts. This approach offers a natural way of describing complex running programs that are data dependent and which concurrently compete for shared resources as well as large numbers of concurrent data transfers on shared resources. The framework provides a complete set of basic components (processing nodes, data s...
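
    The process-oriented approach above can be illustrated with a toy discrete-event engine. The sketch below is not MONARC code (the actual engine uses threaded active objects); it only shows, with hypothetical names, how an event scheduler interleaves stochastic jobs that compete for shared resources in virtual time.

```python
import heapq
import random

# Toy process-oriented discrete-event engine (hypothetical names, not MONARC
# code). Each job is a generator that yields the simulated duration of its
# next step; the scheduler interleaves all jobs in virtual time, mimicking
# active objects competing for shared resources.

def job(name, n_steps, mean_service):
    for step in range(n_steps):
        # Stochastic service times model data-dependent processing.
        yield random.expovariate(1.0 / mean_service), f"{name} step {step}"

def run(jobs):
    queue = []  # (wake-up time, tie-breaker, generator)
    for seq, g in enumerate(jobs):
        heapq.heappush(queue, (0.0, seq, g))
    while queue:
        clock, seq, g = heapq.heappop(queue)
        try:
            delay, label = next(g)
        except StopIteration:
            continue
        print(f"t={clock:7.3f}  {label}")
        heapq.heappush(queue, (clock + delay, seq, g))

random.seed(1)
run([job("data transfer", 3, 2.0), job("analysis", 3, 5.0)])
```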

  2. Component-Based Framework for Subsurface Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Palmer, Bruce J.; Fang, Yilin; Hammond, Glenn E.; Gurumoorthi, Vidhya

    2007-08-01

    Simulations in the subsurface environment represent a broad range of phenomena covering an equally broad range of scales. Developing modelling capabilities that can integrate models representing different phenomena acting at different scales presents formidable challenges from both the algorithmic and the computer science perspective. This paper describes the development of an integrated framework that will be used to combine different models into a single simulation. Initial work has focused on creating two frameworks, one for performing smooth particle hydrodynamics (SPH) simulations of fluid systems, the other for performing grid-based continuum simulations of reactive subsurface flow. The SPH framework is based on a parallel code developed for pore-scale simulations, while the continuum grid-based framework is based on the STOMP (Subsurface Transport Over Multiple Phases) code developed at PNNL. Future work will focus on combining the two frameworks to perform multiscale, multiphysics simulations of reactive subsurface flow.

  3. Flexible Residential Smart Grid Simulation Framework

    Science.gov (United States)

    Xiang, Wang

    Different scheduling and coordination algorithms controlling household appliances' operations can potentially lead to energy consumption reduction and/or load balancing in conjunction with different electricity pricing methods used in smart grid programs. In order to easily implement different algorithms and evaluate their efficiency against other ideas, a flexible simulation framework is desirable in both research and business settings. However, such a platform is currently lacking or underdeveloped. In this thesis, we provide a simulation framework focused on demand-side residential energy consumption coordination in response to different pricing methods. This simulation framework, equipped with an appliance consumption library using realistic values, aims to closely represent the average usage of different types of appliances. The simulated results for traditional usage closely match surveyed real-life consumption records. Several sample coordination algorithms, pricing schemes, and communication scenarios are also implemented to illustrate the use of the simulation framework.
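
    As a sketch of the kind of evaluation such a framework supports, the snippet below compares the cost of a baseline appliance schedule against one shifted into off-peak hours under a simple time-of-use tariff. All appliance values and prices are invented for illustration and are not taken from the thesis's consumption library.

```python
# Hypothetical time-of-use tariff: $/kWh for each hour of the day.
PEAK, OFF_PEAK = 0.30, 0.10
price = [OFF_PEAK if h < 7 or h >= 21 else PEAK for h in range(24)]

# (appliance, kWh per run, start hour) -- illustrative values only.
baseline = [("dishwasher", 1.5, 19), ("dryer", 3.0, 18)]
shifted  = [("dishwasher", 1.5, 22), ("dryer", 3.0, 23)]

def daily_cost(schedule):
    # Assume each run draws its full energy within its start hour.
    return sum(kwh * price[hour] for _, kwh, hour in schedule)

print(f"baseline: ${daily_cost(baseline):.2f}, shifted: ${daily_cost(shifted):.2f}")
```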

  4. Framework for utilizing computational devices within simulation

    Directory of Open Access Journals (Sweden)

    Miroslav Mintál

    2013-12-01

    Several frameworks exist today to utilize the computational power of graphics cards and other computational devices such as FPGAs, ARM and multi-core processors. The best-known ones are either low-level and require a lot of controlling code, or are bound to specific graphics cards. More specialized frameworks also exist, mainly aimed at mathematical applications. The framework described here is adapted for use in multi-agent simulations: it provides an option to accelerate computations when preparing a simulation and, mainly, to accelerate the computation of the simulation itself.

  5. Development of the ATLAS Simulation Framework

    Institute of Scientific and Technical Information of China (English)

    A. Dell'Acqua; K. Amako; et al.

    2001-01-01

    The object-oriented (OO) approach is the key technology for developing software systems in the LHC/ATLAS experiment. We developed an OO simulation framework based on the Geant4 general-purpose simulation toolkit. Because of the complexity of simulation in ATLAS, we paid particular attention to scalability in its design. Although the first application of this framework is the ATLAS full detector simulation program, it contains no experiment-specific code; it can therefore be used to develop any simulation package, not only for HEP experiments but also for various other research domains. In this paper we discuss our approach to the design and implementation of the framework.

  6. MCdevelop - the universal framework for Stochastic Simulations

    CERN Document Server

    Slawinska, M

    2011-01-01

    We present MCdevelop, a universal computer framework for developing and exploiting the wide class of Stochastic Simulations (SS) software. This powerful universal SS software development tool has been derived from a series of scientific projects for precision calculations in high energy physics (HEP), which feature a wide range of functionality in the SS software needed for advanced precision Quantum Field Theory calculations for the past LEP experiments and for the ongoing LHC experiments at CERN, Geneva. MCdevelop is a "spin-off" product of HEP to be exploited in other areas, while it will still serve to develop new SS software for HEP experiments. Typically SS involve independent generation of large sets of random "events", often requiring considerable CPU power. Since SS jobs usually do not share memory, they are easy to parallelize. The efficient development, testing and parallel running of SS software requires a convenient framework to develop software source code, deploy and monitor batch jobs, mer...

  7. MCdevelop - a universal framework for Stochastic Simulations

    Science.gov (United States)

    Slawinska, M.; Jadach, S.

    2011-03-01

    We present MCdevelop, a universal computer framework for developing and exploiting the wide class of Stochastic Simulations (SS) software. This powerful universal SS software development tool has been derived from a series of scientific projects for precision calculations in high energy physics (HEP), which feature a wide range of functionality in the SS software needed for advanced precision Quantum Field Theory calculations for the past LEP experiments and for the ongoing LHC experiments at CERN, Geneva. MCdevelop is a "spin-off" product of HEP to be exploited in other areas, while it will still serve to develop new SS software for HEP experiments. Typically SS involve independent generation of large sets of random "events", often requiring considerable CPU power. Since SS jobs usually do not share memory, they are easy to parallelize. The efficient development, testing and parallel running of SS software requires a convenient framework to develop software source code, deploy and monitor batch jobs, and merge and analyse results from multiple parallel jobs, even before the production runs are terminated. Throughout the years of development of stochastic simulations for HEP, a sophisticated framework featuring all the above-mentioned functionality has been implemented. MCdevelop represents its latest version, written mostly in C++ (GNU compiler gcc). It uses Autotools to build binaries (optionally managed within the KDevelop 3.5.3 Integrated Development Environment (IDE)). It uses the open-source ROOT package for histogramming, graphics and the mechanism of persistency for the C++ objects. MCdevelop helps to run multiple parallel jobs on any computer cluster with an NQS-type batch system. Program summary: Program title: MCdevelop. Catalogue identifier: AEHW_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEHW_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http
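
    The parallelization pattern described (independent jobs with separate seeds, merged afterwards) can be sketched in a few lines. MCdevelop itself is a C++/ROOT framework with batch-system integration; the pure-Python stand-in below, with illustrative names, only demonstrates the job/merge structure.

```python
from multiprocessing import Pool
import random

# Sketch of the embarrassingly-parallel pattern MCdevelop automates for
# C++/ROOT projects: independent jobs with separate random seeds, merged
# afterwards. (Pure-Python stand-in; all names are illustrative.)

def mc_job(args):
    seed, n_events = args
    rng = random.Random(seed)
    # Count random points falling inside the unit quarter-circle.
    hits = sum(rng.random() ** 2 + rng.random() ** 2 <= 1.0 for _ in range(n_events))
    return hits, n_events  # partial result, ready to be merged

if __name__ == "__main__":
    with Pool(4) as pool:
        partials = pool.map(mc_job, [(seed, 250_000) for seed in range(8)])
    hits = sum(h for h, _ in partials)
    total = sum(n for _, n in partials)
    print(f"pi estimate from merged jobs: {4.0 * hits / total:.4f}")
```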

  8. Template-Based Geometric Simulation of Flexible Frameworks

    Directory of Open Access Journals (Sweden)

    Stephen A. Wells

    2012-03-01

    Specialised modelling and simulation methods implementing simplified physical models are valuable generators of insight. Template-based geometric simulation is a specialised method for modelling flexible framework structures made up of rigid units. We review the background, development and implementation of the method, and its applications to the study of framework materials such as zeolites and perovskites. The “flexibility window” property of zeolite frameworks is a particularly significant discovery made using geometric simulation. Software implementing geometric simulation of framework materials, “GASP”, is freely available to researchers.

  9. MAIA: a framework for developing agent-based social simulations

    NARCIS (Netherlands)

    Ghorbani, Amineh; Dignum, Virginia; Bots, Pieter; Dijkema, Gerhard

    2013-01-01

    In this paper we introduce and motivate a conceptualization framework for agent-based social simulation, MAIA: Modelling Agent systems based on Institutional Analysis. The MAIA framework is based on Ostrom's Institutional Analysis and Development framework, and provides an extensive set of modelling

  10. An advanced object-based software framework for complex ecosystem modeling and simulation

    Energy Technology Data Exchange (ETDEWEB)

    Sydelko, P. J.; Dolph, J. E.; Majerus, K. A.; Taxon, T. N.

    2000-06-29

    Military land managers and decision makers face an ever-increasing challenge to balance maximum flexibility for the mission with a diverse set of multiple land use, social, political, and economic goals. In addition, these goals encompass environmental requirements for maintaining ecosystem health and sustainability over the long term. Spatiotemporal modeling and simulation in support of adaptive ecosystem management can be best accomplished through a dynamic, integrated, and flexible approach that incorporates scientific and technological components into a comprehensive ecosystem modeling framework. The Integrated Dynamic Landscape Analysis and Modeling System (IDLAMS) integrates ecological models and decision support techniques through a geographic information system (GIS)-based backbone. Recently, an object-oriented (OO) architectural framework was developed for IDLAMS (OO-IDLAMS). This OO-IDLAMS prototype was built upon and leverages the Dynamic Information Architecture System (DIAS) developed by Argonne National Laboratory. DIAS is an object-based architectural framework that affords a more integrated, dynamic, and flexible approach to comprehensive ecosystem modeling than was possible with the GIS-based integration approach of the original IDLAMS. The flexibility, dynamics, and interoperability demonstrated through this case study of an object-oriented approach have the potential to provide key technology solutions for many of the military's multiple-use goals and needs for integrated natural resource planning and ecosystem management.

  11. FACET: A simulation software framework for modeling complex societal processes and interactions

    Energy Technology Data Exchange (ETDEWEB)

    Christiansen, J. H.

    2000-06-02

    FACET, the Framework for Addressing Cooperative Extended Transactions, was developed at Argonne National Laboratory to address the need for a simulation software architecture in the style of an agent-based approach, but with sufficient robustness, expressiveness, and flexibility to be able to deal with the levels of complexity seen in real-world social situations. FACET is an object-oriented software framework for building models of complex, cooperative behaviors of agents. It can be used to implement simulation models of societal processes such as the complex interplay of participating individuals and organizations engaged in multiple concurrent transactions in pursuit of their various goals. These transactions can be patterned on, for example, clinical guidelines and procedures, business practices, government and corporate policies, etc. FACET can also address other complex behaviors such as biological life cycles or manufacturing processes. To date, for example, FACET has been applied to such areas as land management, health care delivery, avian social behavior, and interactions between natural and social processes in ancient Mesopotamia.

  12. IDEF method-based simulation model design and development framework

    Directory of Open Access Journals (Sweden)

    Ki-Young Jeong

    2009-09-01

    The purpose of this study is to provide an IDEF method-based integrated framework for a business process simulation model to reduce the model development time by increasing communication and knowledge reusability during a simulation project. In this framework, simulation requirements are collected by a function modeling method (IDEF0) and a process modeling method (IDEF3). Based on these requirements, a common data model is constructed using the IDEF1X method. From this reusable data model, multiple simulation models are automatically generated using a database-driven simulation model development approach. The framework is claimed to help both the requirement collection and experimentation phases of a simulation project by improving system knowledge, model reusability, and maintainability through the systematic use of three descriptive IDEF methods and the features of relational database technologies. A complex semiconductor fabrication case study was used as a testbed to evaluate and illustrate the concepts and the framework. Two different simulation software products were used to develop and control the semiconductor model from the same knowledge base. The case study empirically showed that this framework could help improve the simulation project process by using IDEF-based descriptive models and relational database technology. The authors also concluded that this framework could easily be applied to other analytical model generation by separating the logic from the data.
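
    The database-driven generation step can be sketched as follows: process knowledge lives in relational tables (standing in for the paper's IDEF1X-derived data model), and the simulation structure is produced by a query rather than hand-coded. Table and column names below are hypothetical.

```python
import sqlite3

# Toy illustration of database-driven model generation: process steps live in
# a relational table (a stand-in for the paper's IDEF1X-derived data model),
# and a simulation structure is generated from a query rather than hand-coded.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE step (name TEXT, minutes REAL, seq INTEGER)")
conn.executemany("INSERT INTO step VALUES (?, ?, ?)",
                 [("etch", 12.0, 2), ("litho", 30.0, 1), ("deposit", 8.0, 3)])

def build_flow(connection):
    # Regenerating the model is just re-running this query after data changes.
    rows = connection.execute("SELECT name, minutes FROM step ORDER BY seq")
    return [{"station": name, "cycle_min": minutes} for name, minutes in rows]

flow = build_flow(conn)
print(flow)
print(f"raw process time: {sum(s['cycle_min'] for s in flow)} minutes")
```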

  13. A Multiscale/Multifidelity CFD Framework for Robust Simulations

    Science.gov (United States)

    Lee, Seungjoon; Kevrekidis, Yannis; Karniadakis, George

    2015-11-01

    We develop a general CFD framework based on multifidelity simulations that targets multiscale problems as well as resilience in exascale simulations, where faulty processors may lead to gappy simulated fields. We combine approximation theory and domain decomposition together with machine learning techniques, e.g. co-Kriging, to estimate boundary conditions and minimize communications by performing independent parallel runs. To demonstrate this new simulation approach, we consider two benchmark problems. First, we solve the heat equation with different patches of the domain simulated by finite differences at fine resolution or very low resolution but also with Monte Carlo, hence fusing multifidelity and heterogeneous models to obtain the final answer. Second, we simulate the flow in a driven cavity by fusing finite difference solutions with solutions obtained by dissipative particle dynamics, a coarse-grained molecular dynamics method. In addition to its robustness and resilience, the new framework generalizes previous multiscale approaches (e.g. continuum-atomistic) in a unified parallel computational framework.
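
    A heavily simplified stand-in for this kind of multifidelity fusion is sketched below: a cheap, biased low-fidelity model is corrected with a discrepancy fitted on a few expensive high-fidelity samples. The authors use co-Kriging; the linear discrepancy model here only conveys the idea, and all functions are invented.

```python
import numpy as np

# Simplified two-fidelity fusion: correct a cheap low-fidelity model with a
# discrepancy fitted on a few expensive high-fidelity samples. A toy stand-in
# for the co-Kriging used in the paper; the model functions are made up.
def lo_fi(x):           # cheap, biased model
    return np.sin(x) + 0.3

def hi_fi(x):           # expensive "truth"
    return np.sin(x) + 0.05 * x

x_train = np.linspace(0.0, 3.0, 4)            # a few high-fidelity runs
delta = hi_fi(x_train) - lo_fi(x_train)       # observed discrepancy
coef = np.polyfit(x_train, delta, deg=1)      # simple linear discrepancy model

x_test = np.linspace(0.0, 3.0, 7)
fused = lo_fi(x_test) + np.polyval(coef, x_test)
print("max error, lo-fi only:", np.max(np.abs(lo_fi(x_test) - hi_fi(x_test))))
print("max error, fused     :", np.max(np.abs(fused - hi_fi(x_test))))
```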

  14. A Simulation and Modeling Framework for Space Situational Awareness

    Energy Technology Data Exchange (ETDEWEB)

    Olivier, S S

    2008-09-15

    This paper describes the development and initial demonstration of a new, integrated modeling and simulation framework, encompassing the space situational awareness enterprise, for quantitatively assessing the benefit of specific sensor systems, technologies and data analysis techniques. The framework is based on a flexible, scalable architecture to enable efficient, physics-based simulation of the current SSA enterprise, and to accommodate future advancements in SSA systems. In particular, the code is designed to take advantage of massively parallel computer systems available, for example, at Lawrence Livermore National Laboratory. The details of the modeling and simulation framework are described, including hydrodynamic models of satellite intercept and debris generation, orbital propagation algorithms, radar cross section calculations, optical brightness calculations, generic radar system models, generic optical system models, specific Space Surveillance Network models, object detection algorithms, orbit determination algorithms, and visualization tools. The use of this integrated simulation and modeling framework on a specific scenario involving space debris is demonstrated.

  15. OpenMM: A Hardware Independent Framework for Molecular Simulations

    OpenAIRE

    Eastman, Peter; Pande, Vijay S.

    2010-01-01

    The wide diversity of computer architectures today requires a new approach to software development. OpenMM is a framework for molecular mechanics simulations, allowing a single program to run efficiently on a variety of hardware platforms.

  16. Service-Oriented Simulation Framework: An Overview and Unifying Methodology

    CERN Document Server

    Wang, Wenguang; Zhu, Yifan; Li, Qun (DOI: 10.1177/0037549710391838)

    2010-01-01

    The prevailing net-centric environment demands and enables modeling and simulation to combine efforts from numerous disciplines. Software techniques and methodology, in particular service-oriented architecture, provide such an opportunity. Service-oriented simulation has been an emerging paradigm following on from object- and process-oriented methods. However, the ad-hoc frameworks proposed so far generally focus on specific domains or systems and each has its pros and cons. They are capable of addressing different issues within service-oriented simulation from different viewpoints. It is increasingly important to describe and evaluate the progress of numerous frameworks. In this paper, we propose a novel three-dimensional reference model for a service-oriented simulation paradigm. The model can be used as a guideline or an analytic means to find the potential and possible future directions of the current simulation frameworks. In particular, the model inspects the crossover between the disciplines of modelin...

  17. FDPS: Framework for Developing Particle Simulators

    Science.gov (United States)

    Iwasawa, Masaki; Tanikawa, Ataru; Hosono, Natsuki; Nitadori, Keigo; Muranushi, Takayuki; Makino, Junichiro

    2016-04-01

    FDPS provides the necessary functions for efficient parallel execution of particle-based simulations as templates independent of the data structure of particles and the functional form of the interaction. It is used to develop particle-based simulation programs for large-scale distributed-memory parallel supercomputers. FDPS includes templates for domain decomposition, redistribution of particles, and gathering of particle information for interaction calculation. It uses algorithms such as the Barnes-Hut tree method for long-range interactions; for short-range interactions, methods that limit the calculation to neighbor particles are used. With FDPS, the developer needs to write only a simple, sequential and unoptimized program with O(N^2) calculation cost; the framework produces compiled programs that run efficiently on large-scale parallel supercomputers.
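
    The "simple, sequential and unoptimized O(N^2)" kernel a user supplies can be as short as the sketch below. FDPS itself is a C++ template library that then handles domain decomposition and tree-based acceleration; this NumPy version is only an illustration of such a pairwise interaction kernel.

```python
import numpy as np

# The kind of direct O(N^2) pairwise kernel a user writes once; a framework
# like FDPS then parallelizes it and swaps in tree methods where appropriate.
# (NumPy sketch for illustration; FDPS itself is a C++ template library.)
def gravity_accel(pos, mass, eps=1e-2):
    diff = pos[None, :, :] - pos[:, None, :]      # r_j - r_i for all pairs
    dist2 = (diff ** 2).sum(axis=2) + eps ** 2    # softened squared distances
    inv_d3 = dist2 ** -1.5
    np.fill_diagonal(inv_d3, 0.0)                 # no self-interaction
    return (diff * (mass[None, :, None] * inv_d3[:, :, None])).sum(axis=1)

rng = np.random.default_rng(0)
pos = rng.uniform(-1, 1, size=(100, 3))
mass = np.full(100, 1.0 / 100)
print(gravity_accel(pos, mass)[:2])
```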

  18. GEMFsim: A Stochastic Simulator for the Generalized Epidemic Modeling Framework

    CERN Document Server

    Sahneh, Faryad Darabi; Shakeri, Heman; Fan, Futing; Scoglio, Caterina

    2016-01-01

    The recently proposed generalized epidemic modeling framework (GEMF) lays the groundwork for systematically constructing a broad spectrum of stochastic spreading processes over complex networks. This article builds an algorithm for exact, continuous-time numerical simulation of GEMF-based processes. Moreover, the implementation of this algorithm, GEMFsim, is available for popular scientific programming platforms such as MATLAB, R, Python, and C; GEMFsim facilitates simulating stochastic spreading models that fit in the GEMF framework. Using these simulations one can examine the accuracy of mean-field-type approximations that are commonly used for analytical study of spreading processes on complex networks.
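
    The class of simulation GEMFsim performs can be conveyed with a minimal example: an exact, Gillespie-style continuous-time SIS epidemic on a small network. This is not the GEMFsim API; names and parameters below are illustrative.

```python
import random

# Minimal Gillespie-style continuous-time SIS simulation on a graph -- the
# class of exact stochastic network simulation that GEMF generalizes. Not the
# GEMFsim API; all names here are illustrative.
def sis_gillespie(adj, beta, delta, infected, t_max, rng):
    t, infected = 0.0, set(infected)
    while t < t_max and infected:
        # Per-node transition rates: recovery for infected nodes, infection
        # pressure for susceptible neighbors of infected nodes.
        rates = {i: delta for i in infected}
        for i in infected:
            for j in adj[i]:
                if j not in infected:
                    rates[j] = rates.get(j, 0.0) + beta
        total = sum(rates.values())
        t += rng.expovariate(total)          # exponential waiting time
        r, acc = rng.random() * total, 0.0
        for node, rate in rates.items():     # pick the firing node
            acc += rate
            if acc >= r:
                break
        infected.symmetric_difference_update({node})  # toggle S <-> I
        print(f"t={t:6.3f}  |I|={len(infected)}")
    return infected

ring = {i: [(i - 1) % 10, (i + 1) % 10] for i in range(10)}
sis_gillespie(ring, beta=0.8, delta=1.0, infected={0}, t_max=5.0,
              rng=random.Random(7))
```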

  19. Software Framework for Advanced Power Plant Simulations

    Energy Technology Data Exchange (ETDEWEB)

    John Widmann; Sorin Munteanu; Aseem Jain; Pankaj Gupta; Mark Moales; Erik Ferguson; Lewis Collins; David Sloan; Woodrow Fiveland; Yi-dong Lang; Larry Biegler; Michael Locke; Simon Lingard; Jay Yun

    2010-08-01

    This report summarizes the work accomplished during the Phase II development effort of the Advanced Process Engineering Co-Simulator (APECS). The objective of the project is to develop the tools to efficiently combine high-fidelity computational fluid dynamics (CFD) models with process modeling software. During the course of the project, a robust integration controller was developed that can be used in any CAPE-OPEN compliant process modeling environment. The controller mediates the exchange of information between the process modeling software and the CFD software. Several approaches to reducing the time disparity between CFD simulations and process modeling have been investigated and implemented. These include enabling the CFD models to be run on a remote cluster and enabling multiple CFD models to be run simultaneously. Furthermore, computationally fast reduced-order models (ROMs) have been developed that can be 'trained' using the results from CFD simulations and then used directly within flowsheets. Unit operation models (both CFD and ROMs) can be uploaded to a model database and shared between multiple users.

  20. A simulation framework for the CMS Track Trigger electronics

    International Nuclear Information System (INIS)

    A simulation framework has been developed to test and characterize algorithms, architectures and hardware implementations of the vastly complex CMS Track Trigger for the high luminosity upgrade of the CMS experiment at the Large Hadron Collider in Geneva. High-level SystemC models of all system components have been developed to simulate a portion of the track trigger. The simulation of the system components together with input data from physics simulations allows evaluating figures of merit, like delays or bandwidths, under realistic conditions. The use of SystemC for high-level modelling allows co-simulation with models developed in Hardware Description Languages, e.g. VHDL or Verilog. Therefore, the simulation framework can also be used as a test bench for digital modules developed for the final system.

  1. A generic testing framework for agent-based simulation models

    OpenAIRE

    Gürcan, Önder; Dikenelli, Oguz; Bernon, Carole

    2013-01-01

    Agent-based modelling and simulation (ABMS) has received increasing attention during the last decade. However, the weak validation and verification of agent-based simulation models makes ABMS hard to trust. There is no comprehensive tool set for verification and validation of agent-based simulation models that demonstrates where inaccuracies exist and/or reveals the existing errors in the model. Moreover, on the practical side, many ABMS frameworks are in use. In this sen...

  2. A Simulation Framework for Virtual Prototyping of Robotic Exoskeletons.

    Science.gov (United States)

    Agarwal, Priyanshu; Neptune, Richard R; Deshpande, Ashish D

    2016-06-01

    A number of robotic exoskeletons are being developed to provide rehabilitation interventions for those with movement disabilities. We present a systematic framework that allows for virtual prototyping (i.e., design, control, and experimentation) of robotic exoskeletons. The framework merges computational musculoskeletal analyses with simulation-based design techniques, which allows for exoskeleton design and control algorithm optimization. We introduce biomechanical, morphological, and controller measures to optimize exoskeleton performance. A major advantage of the framework is that it provides a platform for carrying out hypothesis-driven virtual experiments to quantify device performance and rehabilitation progress. To illustrate the efficacy of the framework, we present a case study wherein the design and analysis of an index finger exoskeleton is carried out using the proposed framework. PMID:27018453

  3. Power Aware Simulation Framework for Wireless Sensor Networks and Nodes

    Directory of Open Access Journals (Sweden)

    Daniel Weber

    2008-07-01

    The constrained resources of sensor nodes limit analytical techniques and cost-time factors limit test beds to study wireless sensor networks (WSNs). Consequently, simulation becomes an essential tool to evaluate such systems. We present the power aware wireless sensors (PAWiS) simulation framework that supports design and simulation of wireless sensor networks and nodes. The framework emphasizes power consumption capturing and hence the identification of inefficiencies in various hardware and software modules of the systems. These modules include all layers of the communication system, the targeted class of application itself, the power supply and energy management, the central processing unit (CPU), and the sensor-actuator interface. The modular design makes it possible to simulate heterogeneous systems. PAWiS is an OMNeT++ based discrete event simulator written in C++. It captures the node internals (modules) as well as the node surroundings (network, environment) and provides specific features critical to WSNs, such as capturing power consumption at various levels of granularity, support for mobility and environmental dynamics, and simulation of timing effects. A module library with standardized interfaces and a power analysis tool have been developed to support the design and analysis of simulation models. The performance of the PAWiS simulator is comparable with other simulation environments.
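
    The power-capture idea can be sketched as state-based energy bookkeeping: each node state has a power draw, and energy is integrated over the time spent in each state. The values and names below are invented placeholders, not PAWiS measurements; PAWiS performs this accounting inside its OMNeT++ modules at finer granularity.

```python
# State-based energy bookkeeping for one sensor node -- the kind of accounting
# PAWiS performs inside its OMNeT++ modules. Power draws below are invented
# placeholder values, not measurements.
POWER_MW = {"sleep": 0.005, "cpu": 3.0, "rx": 12.0, "tx": 17.0}

def energy_mj(trace):
    """trace: list of (state, duration_in_seconds) intervals."""
    return sum(POWER_MW[state] * dt for state, dt in trace)

# One second of a heavily duty-cycled node (fractions of the second per state).
duty_cycle = [("sleep", 0.95), ("cpu", 0.03), ("rx", 0.015), ("tx", 0.005)]
per_second = energy_mj(duty_cycle)  # mJ per second == average mW
print(f"average draw: {per_second:.3f} mW")
print(f"days on a 2000 mWh battery: {2000 * 3600 / per_second / 86400:.0f}")
```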

  4. Power Aware Simulation Framework for Wireless Sensor Networks and Nodes

    Directory of Open Access Journals (Sweden)

    Glaser Johann

    2008-01-01

    The constrained resources of sensor nodes limit analytical techniques and cost-time factors limit test beds to study wireless sensor networks (WSNs). Consequently, simulation becomes an essential tool to evaluate such systems. We present the power aware wireless sensors (PAWiS) simulation framework that supports design and simulation of wireless sensor networks and nodes. The framework emphasizes power consumption capturing and hence the identification of inefficiencies in various hardware and software modules of the systems. These modules include all layers of the communication system, the targeted class of application itself, the power supply and energy management, the central processing unit (CPU), and the sensor-actuator interface. The modular design makes it possible to simulate heterogeneous systems. PAWiS is an OMNeT++ based discrete event simulator written in C++. It captures the node internals (modules) as well as the node surroundings (network, environment) and provides specific features critical to WSNs, such as capturing power consumption at various levels of granularity, support for mobility and environmental dynamics, and simulation of timing effects. A module library with standardized interfaces and a power analysis tool have been developed to support the design and analysis of simulation models. The performance of the PAWiS simulator is comparable with other simulation environments.

  5. Particle Tracking and Simulation on the .NET Framework

    International Nuclear Information System (INIS)

    Particle tracking and simulation studies are becoming increasingly complex. In addition to the use of more sophisticated graphics, interactive scripting is becoming popular. Compatibility with different control systems requires network and database capabilities. It is not a trivial task to fulfill all the various requirements without sacrificing runtime performance. We evaluated the effectiveness of the .NET framework by converting a C++ simulation code to C#. Portability to other platforms is discussed in terms of Mono.

  6. Development of a framework for optimization of reservoir simulation studies

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Jiang; Delshad, Mojdeh; Sepehrnoori, Kamy [The University of Texas at Austin, Austin, TX (United States)

    2007-10-15

    We have developed a framework that distributes multiple reservoir simulations on a cluster of CPUs for fast and efficient process optimization studies. This platform utilizes several commercial reservoir simulators for flow simulations, an experimental design and a Monte Carlo algorithm with a global optimization search engine to identify the optimum combination of reservoir decision factors under uncertainty. This approach is applied to a well placement design for a field-scale development exercise. The uncertainties considered are in the fault structure, porosity and permeability, PVT, and relative permeabilities. The results indicate that the approach is practical and efficient for performing reservoir optimization studies. (author)

  7. Unified Simulation and Analysis Framework for Deep Space Navigation Design

    Science.gov (United States)

    Anzalone, Evan; Chuang, Jason; Olsen, Carrie

    2013-01-01

    As the technology that enables advanced deep space autonomous navigation continues to develop and the requirements for such capability continue to grow, there is a clear need for a modular, expandable simulation framework. This tool's purpose is to address multiple measurement and information sources in order to capture system capability. This is needed to analyze the capability of competing navigation systems as well as to develop system requirements, in order to determine their effect on the sizing of the integrated vehicle. The development of such a framework builds upon Model-Based Systems Engineering techniques to capture the architecture of the navigation system and the possible state measurements and observations that feed into the simulation implementation structure. These models also allow a common environment for the capture of an increasingly complex operational architecture, involving multiple spacecraft, ground stations, and communication networks. In order to address these architectural developments, a framework of agent-based modules is implemented to capture the independent operations of individual spacecraft as well as the network interactions amongst spacecraft. This paper describes the development of this framework, and the modeling processes used to capture a deep space navigation system. Additionally, a sample implementation describing a concept of network-based navigation utilizing digitally transmitted data packets is described in detail. This developed package shows the capability of the modeling framework, including its modularity, analysis capabilities, and its unification back to the overall system requirements and definition.

  8. A framework for simulation and inversion in electromagnetics

    CERN Document Server

    Heagy, Lindsey J; Kang, Seogi; Rosenkjaer, Gudni K; Oldenburg, Douglas W

    2016-01-01

    Simulations and inversions of geophysical electromagnetic data are paramount for discerning meaningful information about the subsurface from these data. Depending on the nature of the source, electromagnetic experiments may be classified as time-domain or frequency-domain. Multiple heterogeneous and sometimes anisotropic physical properties, including electrical conductivity and magnetic permeability, may need to be considered in a simulation. Depending on what one wants to accomplish in an inversion, the parameters one inverts for may be a voxel-based description of the earth or some parametric representation that must be mapped onto a simulation mesh. Each of these permutations of the electromagnetic problem has implications for a numerical implementation of the forward simulation as well as for the computation of the sensitivities, which are required when considering gradient-based inversions. This paper proposes a framework for organizing and implementing electromagnetic simulations and gradient-based inv...

  9. Simulation framework and XML detector description for the CMS experiment

    CERN Document Server

    Arce, P; Boccali, T; Case, M; de Roeck, A; Lara, V; Liendl, M; Nikitenko, A N; Schröder, M; Strässner, A; Wellisch, H P; Wenzel, H

    2003-01-01

    Currently CMS event simulation is based on GEANT3, while the detector description is built from different sources for simulation and reconstruction. A new simulation framework based on GEANT4 is under development. A full description of the detector is available, and the tuning of the GEANT4 performance and the checking of the ability of the physics processes to describe the detector response are ongoing. Its integration into the CMS mass production system and GRID is also currently under development. The Detector Description Database project aims at providing a common source of information for Simulation, Reconstruction, Analysis, and Visualisation, while allowing for different representations as well as specific information for each application. A functional prototype, based on XML, is already released. Examples of the integration of the DDD in the GEANT4 simulation and in the reconstruction applications are also provided.

  10. A Generic Digitization Framework for the CDF Simulation

    Institute of Scientific and Technical Information of China (English)

    JimKowalkowski; MarcPaterno

    2001-01-01

    Digitization from GEANT tracking requires a predictable sequence of steps to produce raw simulated detector readout information. We have developed a software framework that simplifies the development and integration of digitizers by separating the coordination activities (sequencing and dispatching) from the actual digitization process. This separation allows the developers of digitizers to concentrate on digitization. The framework provides the sequencing infrastructure and a digitizer model, which means that all digitizers are required to follow the same sequencing rules and provide an interface that fits the model.

  11. Linear Accelerator Simulation Framework with PLACET and GUINEA-PIG

    CERN Document Server

    Snuverink, Jochem; CERN. Geneva. ATS Department

    2016-01-01

    Many good tracking tools are available for simulations of linear accelerators. However, several simple tasks need to be performed repeatedly, like lattice definitions, beam setup, output storage, etc. In addition, complex simulations can become unmanageable quite easily. A high-level layer would therefore be beneficial. We propose LinSim, a linear accelerator framework built on the codes PLACET and GUINEA-PIG. It provides a documented, well-debugged, high-level layer of functionality. Users only need to provide the input settings and essential code and/or use some of the many implemented imperfections and algorithms. It can be especially useful for first-time users. Currently the following accelerators are implemented: ATF2, ILC, CLIC and FACET. This note is the comprehensive manual; it discusses the framework design and shows its strengths in some condensed examples.

  12. A Simulink simulation framework of a MagLev model

    Energy Technology Data Exchange (ETDEWEB)

    Boudall, H.; Williams, R.D.; Giras, T.C. [University of Virginia, Charlottesville (United States). School of Engineering and Applied Science]

    2003-09-01

    This paper presents a three-degree-of-freedom model of a section of the magnetically levitated train MagLev. The MagLev system dealt with in this article utilizes electromagnetic levitation. Each MagLev vehicle section is viewed as two separate parts, namely a body and a chassis, coupled by a set of springs and dampers. The MagLev model includes the propulsion, the guidance and the levitation systems. The equations of motion are developed. A Simulink simulation framework is implemented in order to study the interaction between the different systems and the dynamics of a MagLev vehicle. The simulation framework will eventually serve as a tool to assist the design and development of the MagLev system in the United States of America. (author)

  13. A framework for the calibration of social simulation models

    CERN Document Server

    Ciampaglia, Giovanni Luca

    2013-01-01

    Simulation with agent-based models is increasingly used in the study of complex socio-technical systems and in social simulation in general. This paradigm offers a number of attractive features, notably the possibility of modeling emergent phenomena within large populations. As a consequence, the quantity in need of calibration is often a distribution over the population whose relation to the parameters of the model is analytically intractable. Nevertheless, we can simulate. In this paper we present a simulation-based framework for the calibration of agent-based models with distributional output, based on indirect inference. We illustrate our method step by step on a model of norm emergence in an online community of peer production, using data from three large Wikipedia communities. Model fit and diagnostics are discussed.
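
    A toy version of such distribution-matching calibration is sketched below: simulate the model at candidate parameters, compare the simulated output distribution to the observed one via summary statistics, and keep the minimizer. The model and distance are invented and far simpler than the paper's indirect-inference procedure.

```python
import random
import statistics

# Toy simulation-based calibration with distributional output: choose the
# parameter whose simulated distribution best matches the observed one under
# a summary-statistic distance. Model and names are illustrative only.
def model(theta, n, rng):
    # Stand-in agent model: contribution sizes follow an exponential law.
    return [rng.expovariate(1.0 / theta) for _ in range(n)]

def distance(sample_a, sample_b):
    # Compare a couple of summaries of the two output distributions.
    qa = (statistics.mean(sample_a), statistics.median(sample_a))
    qb = (statistics.mean(sample_b), statistics.median(sample_b))
    return sum((x - y) ** 2 for x, y in zip(qa, qb))

rng = random.Random(3)
observed = model(2.5, 5000, rng)          # pretend field data, true theta=2.5
candidates = [1.0, 1.5, 2.0, 2.5, 3.0, 3.5]
best = min(candidates, key=lambda th: distance(model(th, 5000, rng), observed))
print("calibrated theta:", best)
```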

  14. Hierarchical Visual Analysis and Steering Framework for Astrophysical Simulations

    Institute of Scientific and Technical Information of China (English)

    Jian Xiao; Jiawan Zhang; Ye Yuan; Xin Zhou; Li Ji; Jizhou Sun

    2015-01-01

    A framework for accelerating modern long-running astrophysical simulations is presented, which is based on a hierarchical architecture where computational steering in the high-resolution run is performed under the guide of knowledge obtained in the gradually refined ensemble analyses. Several visualization schemes for facilitating ensemble management, error analysis, parameter grouping and tuning are also integrated owing to the pluggable modular design. The proposed approach is prototyped based on the Flash code, and it can be extended by introducing user-defined visualization for specific requirements. Two real-world simulations, i.e., stellar wind and supernova remnant, are carried out to verify the proposed approach.

  15. A framework to simulate VANET scenarios with SUMO

    OpenAIRE

    Kaisser, F.; Gransart, C.; Kassab, M.; Berbineau, M.

    2011-01-01

    Vehicular Ad hoc Networks (VANETs) are a special kind of Mobile Ad-hoc Networks (MANETs) adapted to communications between vehicles. Several protocols specific to VANETs have been developed to improve performance and satisfy vehicular application needs. To evaluate a protocol for VANETs, realistic mobility models are needed. Unfortunately, such models are not provided by OPNET Modeler. In this work, we propose a framework that enhances OPNET simulation scenarios using realistic vehicula...

  16. Velo: A Knowledge Management Framework for Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Gorton, Ian; Sivaramakrishnan, Chandrika; Black, Gary D.; White, Signe K.; Purohit, Sumit; Lansing, Carina S.; Madison, Michael C.; Schuchardt, Karen L.; Liu, Yan

    2012-03-01

    Modern scientific enterprises are inherently knowledge-intensive. Scientific studies in domains such as geosciences, climate, and biology require the acquisition and manipulation of large amounts of experimental and field data to create inputs for large-scale computational simulations. The results of these simulations are then analyzed, leading to refinements of inputs and models and additional simulations. The results of this process must be managed and archived to provide justifications for regulatory decisions and publications that are based on the models. In this paper we introduce our Velo framework, which is designed as a reusable, domain-independent knowledge management infrastructure for modeling and simulation. Velo leverages, integrates and extends open source collaborative and content management technologies to create a scalable and flexible core platform that can be tailored to specific scientific domains. We describe the architecture of Velo for managing and associating the various types of data that are used and created in modeling and simulation projects, as well as the framework for integrating domain-specific tools. To demonstrate realizations of Velo, we describe examples from two deployed sites for carbon sequestration and climate modeling. These provide concrete examples of the inherent extensibility and utility of our approach.

  17. EIC detector simulations in FairRoot framework

    Science.gov (United States)

    Kiselev, Alexander; eRHIC task force Team

    2013-10-01

    The long-term RHIC facility upgrade plan foresees the addition of a high-energy electron beam to the existing hadron accelerator complex thus converting RHIC into an Electron-Ion Collider (eRHIC). A dedicated EIC detector, designed to efficiently register and identify deep inelastic electron scattering (DIS) processes in a wide range of center-of-mass energies is one of the key elements of this upgrade. Detailed Monte-Carlo studies are needed to optimize EIC detector components and to fine tune their design. The simulation package foreseen for this purpose (EicRoot) is based on the FairRoot framework developed and maintained at the GSI. A feature of this framework is its level of flexibility, allowing one to switch easily between different geometry (ROOT, GEANT) and transport (GEANT3, GEANT4, FLUKA) models. Apart from providing a convenient simulation environment the framework includes basic tools for visualization and allows for easy sharing of event reconstruction codes between higher level experiment-specific applications. The description of the main EicRoot features and first simulation results will be the main focus of the talk.

  18. Framework Application for Core Edge Transport Simulation (FACETS)

    Energy Technology Data Exchange (ETDEWEB)

    Krasheninnikov, Sergei; Pigarov, Alexander

    2011-10-15

    The FACETS (Framework Application for Core-Edge Transport Simulations) project of the Scientific Discovery through Advanced Computing (SciDAC) Program was aimed at providing high-fidelity whole-tokamak modeling for the U.S. magnetic fusion energy program and ITER through coupling separate components for the core region, edge region, and wall, with realistic plasma particle and power sources and turbulent transport simulation. The project also aimed at developing advanced numerical algorithms, efficient implicit coupling methods, and software tools utilizing the leadership-class computing facilities under Advanced Scientific Computing Research (ASCR). The FACETS project was conducted by a multi-discipline, multi-institutional team; the lead PI was J.R. Cary (Tech-X Corp.). In the FACETS project, the Applied Plasma Theory Group at the MAE Department of UCSD developed the Wall and Plasma-Surface Interaction (WALLPSI) module, performed its validation against experimental data, and integrated it into the developed framework. WALLPSI is a one-dimensional, coarse-grained, reaction/advection/diffusion code applied to each material boundary cell in the common modeling domain for a tokamak. It incorporates an advanced model for plasma particle transport and retention in the solid matter of plasma-facing components, simulation of plasma heat power load handling, calculation of erosion/deposition, and simulation of synergistic effects in strong plasma-wall coupling.

  19. A hybrid parallel framework for the cellular Potts model simulations

    Energy Technology Data Exchange (ETDEWEB)

    Jiang, Yi [Los Alamos National Laboratory; He, Kejing [SOUTH CHINA UNIV; Dong, Shoubin [SOUTH CHINA UNIV

    2009-01-01

    The Cellular Potts Model (CPM) has been widely used for biological simulations. However, most current implementations are either sequential or approximate, and cannot be used for large-scale, complex 3D simulations. In this paper we present a hybrid parallel framework for CPM simulations. The time-consuming PDE solving, cell division, and cell reaction operations are distributed to clusters using the Message Passing Interface (MPI). The Monte Carlo lattice update is parallelized on shared-memory SMP systems using OpenMP. Because the Monte Carlo lattice update is much faster than the PDE solving and SMP systems are more and more common, this hybrid approach achieves good performance and high accuracy at the same time. Based on the parallel Cellular Potts Model, we studied avascular tumor growth using a multiscale model. The application and performance analysis show that the hybrid parallel framework is quite efficient. The hybrid parallel CPM can be used for the large-scale simulation (~10^8 sites) of the complex collective behavior of numerous cells (~10^6).
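
    The Monte Carlo lattice update at the heart of the CPM, which the paper parallelizes with OpenMP, can be sketched serially as repeated copy attempts accepted with a Metropolis rule. The 2-D lattice, adhesion-only energy and constants below are illustrative, not the paper's model.

```python
import math
import random

# Serial sketch of the CPM Monte Carlo lattice update that the paper
# parallelizes with OpenMP (PDE solving and cell division go over MPI).
# 2-D lattice, adhesion energy only, illustrative constants.
N, J, TEMP = 32, 1.0, 1.0
rng = random.Random(0)
# Two "cells" (spins 1 and 2) split the lattice down the middle.
lattice = [[1 if x < N // 2 else 2 for x in range(N)] for y in range(N)]

def adhesion(x, y, spin):
    # Boundary energy of site (x, y) if it held the given spin.
    e = 0.0
    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        if lattice[(y + dy) % N][(x + dx) % N] != spin:
            e += J
    return e

def sweep():
    flips = 0
    for _ in range(N * N):
        x, y = rng.randrange(N), rng.randrange(N)
        nx, ny = (x + rng.choice((-1, 0, 1))) % N, (y + rng.choice((-1, 0, 1))) % N
        src, dst = lattice[ny][nx], lattice[y][x]
        if src == dst:
            continue
        d_e = adhesion(x, y, src) - adhesion(x, y, dst)  # local energy change
        if d_e <= 0 or rng.random() < math.exp(-d_e / TEMP):
            lattice[y][x] = src  # accept the copy attempt
            flips += 1
    return flips

for step in range(3):
    print(f"sweep {step}: {sweep()} accepted copies")
```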

  20. A Driver Behavior Learning Framework for Enhancing Traffic Simulation

    Directory of Open Access Journals (Sweden)

    Ramona Maria Paven

    2014-06-01

    Traffic simulation provides essential support for developing intelligent transportation systems. It allows affordable validation of such systems using a large variety of scenarios that involve massive data input. However, realistic traffic models are hard to implement, especially for microscopic traffic simulation. One of the hardest problems in this context is modeling the behavior of drivers, due to the complexity of human nature. The work presented in this paper proposes a framework for learning driver behavior based on a Hidden Markov Model technique. Moreover, we also propose a practical method to inject this behavior into a traffic model used by the SUMO traffic simulator. To demonstrate the effectiveness of this method we present a case study involving real traffic data collected from the Timisoara city area.

  1. ATLAS Detector Simulation in the Integrated Simulation Framework applied to the W Boson Mass Measurement

    CERN Document Server

    Ritsch, Elmar; Froidevaux, Daniel; Salzburger, Andreas

    One of the cornerstones for the success of the ATLAS experiment at the Large Hadron Collider (LHC) is a very accurate Monte Carlo detector simulation. However, a limit is being reached regarding the amount of simulated data which can be produced and stored with the computing resources available through the worldwide LHC computing grid (WLCG). The Integrated Simulation Framework (ISF) is a novel approach to detector simulation which enables a more efficient use of these computing resources and thus allows for the generation of more simulated data. Various simulation technologies are combined to allow for faster simulation approaches which are targeted at the specific needs of individual physics studies. Costly full simulation technologies are only used where high accuracy is required by physics analyses and fast simulation technologies are applied everywhere else. As one of the first applications of the ISF, a new combined simulation approach is developed for the generation of detector calibration samples ...

  2. The PandaRoot framework for simulation, reconstruction and analysis

    Science.gov (United States)

    Spataro, Stefano; PANDA Collaboration

    2011-12-01

    The PANDA experiment at the future facility FAIR will study anti-proton proton and anti-proton nucleus collisions in a beam momentum range from 2 GeV/c up to 15 GeV/c. The PandaRoot framework is part of the FairRoot project, a common software framework for the future FAIR experiments, and is currently used to simulate detector performances and to evaluate different detector concepts. It is based on the packages ROOT and Virtual MonteCarlo with Geant3 and Geant4. Different reconstruction algorithms for tracking and particle identification are under development and optimization, in order to achieve the performance requirements of the experiment. In the central tracker a first track fit is performed using a conformal map transformation based on a helix assumption, then the track is used as input for a Kalman Filter (package genfit), using GEANE as track follower. The track is then correlated to the pid detectors (e.g. Cerenkov detectors, EM Calorimeter or Muon Chambers) to evaluate a global particle identification probability, using a Bayesian approach or multivariate methods. Further implemented packages in PandaRoot are: the analysis tools framework Rho, the kinematic fitter package for vertex and mass constraint fits, and a fast simulation code based upon parametrized detector responses. PandaRoot was also tested on an Alien-based GRID infrastructure. The contribution will report about the status of PandaRoot and show some example results for analysis of physics benchmark channels.

  3. The Framework for Approximate Queries on Simulation Data

    Energy Technology Data Exchange (ETDEWEB)

    Abdulla, G; Baldwin, C; Critchlow, T; Kamimura, R; Lee, B; Musick, R; Snapp, R; Tang, N

    2001-09-27

    AQSim is a system intended to enable scientists to query and analyze a large volume of scientific simulation data. The system uses the state of the art in approximate query processing techniques to build a novel framework for progressive data analysis. These techniques are used to define a multi-resolution index, where each node contains multiple models of the data. The benefits of these models are two-fold: (1) they are compact representations, reconstructing only the information relevant to the analysis, and (2) the variety of models capture different aspects of the data which may be of interest to the user but are not readily apparent in their raw form. To be able to deal with the data interactively, AQSim allows the scientist to make an informed tradeoff between query response accuracy and time. In this paper, we present the framework of AQSim with a focus on its architectural design. We also show the results from an initial proof-of-concept prototype developed at LLNL. The presented framework is generic enough to handle more than just simulation data.
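
    The accuracy/time tradeoff can be illustrated with a toy multi-resolution index: each node stores a tiny model of its data range, and an approximate query descends only deep enough to meet the requested error bound. The structure below is invented and much simpler than AQSim's models.

```python
# Toy multi-resolution index in the spirit of AQSim: each node keeps a tiny
# model of its range (count, min, max); an approximate query descends only
# deep enough to meet the requested error bound. Structure is illustrative.
class Node:
    def __init__(self, data):
        self.n = len(data)
        self.lo, self.hi = min(data), max(data)
        self.children = ([Node(data[:self.n // 2]), Node(data[self.n // 2:])]
                         if self.n > 4 else [])

def approx_sum(node, tol):
    est = node.n * (node.lo + node.hi) / 2.0    # model-based estimate
    bound = node.n * (node.hi - node.lo) / 2.0  # worst-case error of the model
    if bound <= tol or not node.children:
        return est, bound
    s = e = 0.0
    for child in node.children:
        cs, ce = approx_sum(child, tol / len(node.children))
        s, e = s + cs, e + ce
    return s, e

data = [float(i % 17) for i in range(1024)]
root = Node(data)
for tol in (1e5, 1e2, 0.0):
    s, e = approx_sum(root, tol)
    print(f"tol={tol:>8}: sum ~ {s:.1f} (bound {e:.1f}, true {sum(data):.1f})")
```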

  4. Sorting, Searching, and Simulation in the MapReduce Framework

    CERN Document Server

    Goodrich, Michael T; Zhang, Qin

    2011-01-01

    In this paper, we study the MapReduce framework from an algorithmic standpoint and demonstrate the usefulness of our approach by designing and analyzing efficient MapReduce algorithms for fundamental sorting, searching, and simulation problems. This study is motivated by a goal of ultimately putting the MapReduce framework on an equal theoretical footing with the well-known PRAM and BSP parallel models, which would benefit both the theory and practice of MapReduce algorithms. We describe efficient MapReduce algorithms for sorting, multi-searching, and simulations of parallel algorithms specified in the BSP and CRCW PRAM models. We also provide some applications of these results to problems in parallel computational geometry for the MapReduce framework, which result in efficient MapReduce algorithms for sorting, 2- and 3-dimensional convex hulls, and fixed-dimensional linear programming. For the case when mappers and reducers have a memory/message-I/O size of $M=\\Theta(N^\\epsilon)$, for a small constant $\\epsi...
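
    The primitive being analyzed, one MapReduce round of map, shuffle-by-key and reduce, can be written in a few lines of plain Python. The sketch below is only the skeleton of the model; the paper's sorting and simulation algorithms compose many such rounds with bounded memory per machine.

```python
from collections import defaultdict

# Plain-Python skeleton of one MapReduce round (map -> shuffle by key ->
# reduce), the primitive whose rounds and memory the paper counts.
# Illustrative only; the paper's algorithms compose several such rounds.
def map_reduce(records, mapper, reducer):
    shuffle = defaultdict(list)
    for rec in records:
        for key, value in mapper(rec):     # map phase
            shuffle[key].append(value)     # grouped as in the shuffle phase
    return {k: reducer(k, vs) for k, vs in shuffle.items()}  # reduce phase

# One-round example: histogram of initial letters.
words = ["map", "reduce", "merge", "run", "model"]
out = map_reduce(words,
                 mapper=lambda w: [(w[0], 1)],
                 reducer=lambda k, vs: sum(vs))
print(out)  # {'m': 3, 'r': 2}
```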

  5. A framework of modeling detector systems for computed tomography simulations

    Science.gov (United States)

    Youn, H.; Kim, D.; Kim, S. H.; Kam, S.; Jeon, H.; Nam, J.; Kim, H. K.

    2016-01-01

    The ultimate development in computed tomography (CT) technology may be a system that can provide images with excellent lesion conspicuity at the lowest possible patient dose. Imaging simulation tools have been used cost-effectively for these developments and will continue to be. For a more accurate and realistic imaging simulation, the signal and noise propagation through a CT detector system has been modeled in this study using cascaded linear-systems theory. The simulation results are validated by comparison with measured results from a laboratory flat-panel micro-CT system. Although the image noise obtained from the simulations at higher exposures is slightly smaller than that obtained from the measurements, the difference between them is reasonably acceptable. According to the simulation results for various exposure levels and additive electronic noise levels, x-ray quantum noise is more dominant than the additive electronic noise. The framework for modeling a CT detector system suggested in this study will be helpful for the development of an accurate and realistic projection simulation model.

  6. The Integrated Plasma Simulator: A Flexible Python Framework for Coupled Multiphysics Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Foley, Samantha S [ORNL; Elwasif, Wael R [ORNL; Bernholdt, David E [ORNL

    2011-11-01

    High-fidelity coupled multiphysics simulations are an increasingly important aspect of computational science. In many domains, however, there has been very limited experience with simulations of this sort, therefore research in coupled multiphysics often requires computational frameworks with significant flexibility to respond to the changing directions of the physics and mathematics. This paper presents the Integrated Plasma Simulator (IPS), a framework designed for loosely coupled simulations of fusion plasmas. The IPS provides users with a simple component architecture into which a wide range of existing plasma physics codes can be inserted as components. Simulations can take advantage of multiple levels of parallelism supported in the IPS, and can be controlled by a high-level "driver" component, or by other coordination mechanisms, such as an asynchronous event service. We describe the requirements and design of the framework, and how they were implemented in the Python language. We also illustrate the flexibility of the framework by providing examples of different types of simulations that utilize various features of the IPS.
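
    The component/driver pattern described can be conveyed with a minimal sketch: components share a plasma state and a driver calls them in a fixed order each time step. Class and method names below are hypothetical, not the actual IPS API.

```python
# Minimal component/driver pattern of the kind the IPS provides for loosely
# coupled plasma components. Class and method names are hypothetical, not the
# actual IPS API; the "physics" is a placeholder.
class Component:
    def init(self, state): ...
    def step(self, t, state): ...

class Heating(Component):
    def step(self, t, state):
        state["temperature"] += 0.5          # toy heating source

class Transport(Component):
    def step(self, t, state):
        state["temperature"] *= 0.95         # toy transport losses

class Driver:
    """Coordinates components through a shared plasma state."""
    def __init__(self, components):
        self.components = components

    def run(self, n_steps):
        state = {"temperature": 1.0}
        for comp in self.components:
            comp.init(state)
        for t in range(n_steps):
            for comp in self.components:     # fixed call order per time step
                comp.step(t, state)
            print(f"step {t}: T = {state['temperature']:.3f}")

Driver([Heating(), Transport()]).run(3)
```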

  7. Sorting, Searching, and Simulation in the MapReduce Framework

    DEFF Research Database (Denmark)

    Goodrich, Michael T.; Sitchinava, Nodar; Zhang, Qin

    2011-01-01

    We study the MapReduce framework from an algorithmic standpoint, providing a generalization of the previous algorithmic models for MapReduce. We present optimal solutions for the fundamental problems of all-prefix-sums, sorting and multi-searching. Additionally, we design optimal simulations of the well-established PRAM and BSP models in MapReduce, immediately resulting in optimal solutions to the problems of computing fixed-dimensional linear programming and 2-D and 3-D convex hulls.

  8. Modeling and simulation of complex systems a framework for efficient agent-based modeling and simulation

    CERN Document Server

    Siegfried, Robert

    2014-01-01

    Robert Siegfried presents a framework for efficient agent-based modeling and simulation of complex systems. He compares different approaches for describing the structure and dynamics of agent-based models in detail. Based on this evaluation the author introduces the "General Reference Model for Agent-based Modeling and Simulation" (GRAMS). Furthermore, he presents parallel and distributed simulation approaches for the execution of agent-based models, from small scale to very large scale. The author shows how agent-based models may be executed by different simulation engines that utilize underlying hard

  9. A wind turbine hybrid simulation framework considering aeroelastic effects

    Science.gov (United States)

    Song, Wei; Su, Weihua

    2015-04-01

    In performing an effective structural analysis for wind turbines, the simulation of turbine aerodynamic loads is of great importance. The interaction between the wake flow and the blades may impact blade loading conditions, energy yield and operational behavior. Direct experimental measurement of the wind flow field and wind profiles around wind turbines is very helpful to support wind turbine design. However, with the growth of the size of wind turbines for higher energy output, it is not convenient to obtain all the desired data in wind-tunnel and field tests. In this paper, the modeling of dynamic responses of large-span wind turbine blades first considers nonlinear aeroelastic effects. A strain-based geometrically nonlinear beam formulation is used for the basic structural dynamic modeling, which is coupled with unsteady aerodynamic equations and rigid-body rotations of the rotor. Full wind turbines can be modeled using multi-connected beams. A hybrid simulation experimental framework is then proposed to potentially address this issue. The aerodynamics-dominated components, such as the turbine blades and rotor, are simulated as numerical components using the nonlinear aeroelastic model, while the turbine tower, where collapse or failure may occur under high levels of wind load, is simulated separately as the physical component. With the proposed framework, the dynamic behavior of NREL's 5 MW wind turbine blades will be studied and correlated with available numerical data. The current work will be the basis of the authors' further studies on flow control and hazard mitigation for wind turbine blades and towers.

  10. A Virtual Engineering Framework for Simulating Advanced Power System

    Energy Technology Data Exchange (ETDEWEB)

    Mike Bockelie; Dave Swensen; Martin Denison; Stanislav Borodai

    2008-06-18

    This report describes the work performed to provide NETL with VE-Suite-based Virtual Engineering software and enhanced equipment models to support NETL's Advanced Process Engineering Co-simulation (APECS) framework for advanced power generation systems. Enhancements to the software framework facilitated an important link between APECS and the virtual engineering capabilities provided by VE-Suite (e.g., equipment and process visualization, information assimilation). Model enhancements focused on improving predictions for the performance of entrained-flow coal gasifiers and important auxiliary equipment (e.g., Air Separation Units) used in coal gasification systems. In addition, a Reduced Order Model generation tool and software to couple APECS/AspenPlus with the GE GateCycle simulation system were developed. CAPE-Open model interfaces were employed where needed. The improved simulation capability is demonstrated on selected test problems. As part of the project, an Advisory Panel was formed to provide guidance on the issues on which to focus the work effort; it included experts from industry and academia in gasification, CO2 capture, and process simulation, and representatives from technology developers and the electric utility industry. To maximize the benefit to NETL, REI coordinated its efforts with NETL and NETL-funded projects at Iowa State University, Carnegie Mellon University and ANSYS/Fluent, Inc. The improved simulation capabilities incorporated into APECS will enable researchers and engineers to better understand the interactions of different equipment components, identify weaknesses and processes needing improvement, and thereby allow more efficient, less expensive plants to be developed and brought on-line faster and in a more cost-effective manner. These enhancements to APECS represent an important step toward having a fully integrated environment for performing plant simulation and engineering.

  11. A Simulation Framework for Optimal Energy Storage Sizing

    Directory of Open Access Journals (Sweden)

    Carlos Suazo-Martínez

    2014-05-01

    Full Text Available Despite the increasing interest in Energy Storage Systems (ESS), quantification of their technical and economical benefits remains a challenge. To assess the use of ESS, a simulation approach for ESS optimal sizing is presented. The algorithm is based on an adapted Unit Commitment, including ESS operational constraints, and the use of high performance computing (HPC). Multiple short-term simulations are carried out within a multiple-year horizon. Evaluation is performed for Chile's Northern Interconnected Power System (SING). The authors show that a single-year evaluation could lead to sub-optimal results when evaluating optimal ESS size. Hence, it is advisable to perform long-term evaluations of ESS. Additionally, the importance of detailed simulation for adequate assessment of ESS contributions and to fully capture storage value is also discussed. Furthermore, the robustness of the optimal sizing approach is evaluated by means of sensitivity analyses. The results suggest that regulatory frameworks should recognize multiple value streams from storage in order to encourage greater ESS integration.

  12. Framework Application for Core Edge Transport Simulation (FACETS)

    Energy Technology Data Exchange (ETDEWEB)

    Malony, Allen D.; Shende, Sameer S.; Huck, Kevin A.; Morris, Alan; Spear, Wyatt

    2012-03-14

    The goal of the FACETS project (Framework Application for Core-Edge Transport Simulations) was to provide a multiphysics, parallel framework application that enables whole-device modeling for the U.S. fusion program, and to provide the modeling infrastructure needed for ITER, the next-step fusion confinement device. Through the use of modern computational methods, including component technology and object-oriented design, FACETS is able to switch from one model to another for a given aspect of the physics in a flexible manner. This enables the use of simplified models for rapid turnaround, or high-fidelity models that can take advantage of the largest supercomputer hardware. FACETS does so in a heterogeneous parallel context, where different parts of the application execute in parallel by utilizing task farming, domain decomposition, and/or pipelining as needed and applicable. ParaTools, Inc. was tasked with supporting the performance analysis and tuning of the FACETS components and framework in order to achieve the parallel scaling goals of the project. The TAU Performance System® was used for instrumentation, measurement, archiving, and profile/tracing analysis, providing instrumentation, measurement, analysis, and archival support for the FACETS project. Performance optimization of key components yielded significant speedups. TAU was integrated into the FACETS build for both the full coupled application and the UEDGE component. The performance database provided archival storage of the performance regression testing data generated by the project, and helped to track improvements in the software development.

  13. A new framework for magnetohydrodynamic simulations with anisotropic pressure

    CERN Document Server

    Hirabayashi, Kota; Amano, Takanobu

    2016-01-01

    We describe a new theoretical and numerical framework for magnetohydrodynamic simulation incorporating an anisotropic pressure tensor, which can play an important role in a collisionless plasma. A classical approach to handling the anisotropy is based on the double adiabatic approximation, which assumes that the pressure tensor is well described only by the components parallel and perpendicular to the local magnetic field. This gyrotropic assumption, however, fails around a magnetically neutral region, where the cyclotron period may become comparable to or even longer than a dynamical time in the system, and causes a singularity in the mathematical expression. In this paper, we demonstrate that this singularity can be completely removed by the combination of direct use of the 2nd moment of the Vlasov equation and an ingenious gyrotropization model. Numerical tests also verify that the present model properly reduces to the standard MHD or the double adiabatic formulation in an asymptotic manner under an appropria...
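    For background, the double adiabatic (CGL) closure referred to above evolves the two gyrotropic pressure components through a pair of well-known invariants; the following is the textbook statement, not the authors' generalized model.

```latex
% Double adiabatic (CGL) invariants for the gyrotropic pressure
% components; d/dt is the convective derivative. Textbook background
% only -- the paper's model generalizes beyond this closure.
\begin{align}
\frac{d}{dt}\left(\frac{p_{\parallel} B^{2}}{\rho^{3}}\right) = 0,
\qquad
\frac{d}{dt}\left(\frac{p_{\perp}}{\rho B}\right) = 0,
\end{align}
% with the gyrotropic pressure tensor
% P = p_perp I + (p_par - p_perp) b b,   b = B / |B|,
% which is exactly the form that becomes singular where B -> 0.
```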

  14. Multiscale Simulation Framework for Coupled Fluid Flow and Mechanical Deformation

    Energy Technology Data Exchange (ETDEWEB)

    Hou, Thomas [California Inst. of Technology (CalTech), Pasadena, CA (United States); Efendiev, Yalchin [Stanford Univ., CA (United States); Tchelepi, Hamdi [Texas A & M Univ., College Station, TX (United States); Durlofsky, Louis [Stanford Univ., CA (United States)

    2016-05-24

    Our work in this project is aimed at making fundamental advances in multiscale methods for flow and transport in highly heterogeneous porous media. The main thrust of this research is to develop a systematic multiscale analysis and efficient coarse-scale models that can capture global effects and extend existing multiscale approaches to problems with additional physics and uncertainties. A key emphasis is on problems without an apparent scale separation. Multiscale solution methods are currently under active investigation for the simulation of subsurface flow in heterogeneous formations. These procedures capture the effects of fine-scale permeability variations through the calculation of specialized coarse-scale basis functions. Most of the multiscale techniques presented to date employ localization approximations in the calculation of these basis functions. For some highly correlated (e.g., channelized) formations, however, global effects are important and these may need to be incorporated into the multiscale basis functions. Other challenging issues facing multiscale simulations are the extension of existing multiscale techniques to problems with additional physics, such as compressibility, capillary effects, etc. In our project, we explore the improvement of multiscale methods through the incorporation of additional (single-phase flow) information and the development of a general multiscale framework for flows in the presence of uncertainties, compressible flow and heterogeneous transport, and geomechanics. We have considered (1) adaptive local-global multiscale methods, (2) multiscale methods for the transport equation, (3) operator-based multiscale methods and solvers, (4) multiscale methods in the presence of uncertainties and applications, (5) multiscale finite element methods for high contrast porous media and their generalizations, and (6) multiscale methods for geomechanics.
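    As background for the coarse-scale basis functions mentioned above, the classical multiscale finite-element construction (which this project extends with global and physics-specific information) solves a local problem on each coarse element; a standard textbook form is:

```latex
% Classical MsFEM basis construction (textbook background; the project
% extends such local bases with global information). On each coarse
% element K, the basis \phi_i solves
\begin{align}
-\nabla \cdot \left( k(x)\, \nabla \phi_i \right) = 0 \ \ \text{in } K,
\qquad
\phi_i = g_i \ \ \text{on } \partial K,
\end{align}
% and the coarse solution is sought as u_h = \sum_i u_i \phi_i, so the
% fine-scale permeability k(x) is embedded directly in the basis.
```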

  15. Multiscale Simulation Framework for Coupled Fluid Flow and Mechanical Deformation

    Energy Technology Data Exchange (ETDEWEB)

    Tchelepi, Hamdi

    2014-11-14

    A multiscale linear-solver framework for the pressure equation associated with flow in highly heterogeneous porous formations was developed. The multiscale-based approach is cast in a general algebraic form, which facilitates integration of the new scalable linear solver into existing flow simulators. The Algebraic Multiscale Solver (AMS) is employed as a preconditioner within a multi-stage strategy. The formulations investigated include the standard MultiScale Finite-Element (MSFE) and MultiScale Finite-Volume (MSFV) methods. The local-stage solvers include incomplete factorization and the so-called Correction Functions (CF) associated with the MSFV approach. Extensive testing of AMS as an iterative linear solver indicates excellent convergence rates and computational scalability. AMS compares favorably with advanced Algebraic MultiGrid (AMG) solvers for highly detailed three-dimensional heterogeneous models. Moreover, AMS is expected to be especially beneficial in solving time-dependent problems of coupled multiphase flow and transport in large-scale subsurface formations.

  16. Automatic Model Generation Framework for Computational Simulation of Cochlear Implantation

    DEFF Research Database (Denmark)

    Mangado Lopez, Nerea; Ceresa, Mario; Duchateau, Nicolas;

    2015-01-01

    Recent developments in computational modeling of cochlear implantation are promising to study in silico the performance of the implant before surgery. However, creating a complete computational model of the patient's anatomy while including an external device geometry remains challenging. To address this, an automatic model generation framework is proposed. From the patient's CT image, an accurate model of the patient-specific cochlea anatomy is obtained. An algorithm based on the parallel transport frame is employed to perform the virtual insertion of the cochlear implant. The automatic framework also incorporates the surrounding bone and nerve fibers and assigns constitutive parameters to all components of the finite element model. This model can then be used to study in silico the effects of the electrical stimulation of the cochlear implant. Results are shown on a total of 25 models of patients. In all cases, a final mesh suitable for finite element simulations...

  17. A framework for using simulation methodology in ergonomics interventions in design projects

    DEFF Research Database (Denmark)

    Broberg, Ole; Duarte, Francisco; Andersen, Simone Nyholm;

    2014-01-01

    The aim of this paper is to outline a framework of simulation methodology in design processes from an ergonomics perspective.

  18. artG4: A Generic Framework for Geant4 Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Arvanitis, Tasha [Harvey Mudd Coll.]; Lyon, Adam [Fermilab]

    2014-01-01

    A small experiment must devote its limited computing expertise to writing physics code directly applicable to the experiment. A software 'framework' is essential for providing an infrastructure that makes writing the physics-relevant code easy. In this paper, we describe a highly modular and easy-to-use framework for writing Geant4-based simulations called 'artg4'. This framework is a layer on top of the art framework.

  19. A framework for web browser-based medical simulation using WebGL.

    Science.gov (United States)

    Halic, Tansel; Ahn, Woojin; De, Suvranu

    2012-01-01

    This paper presents a web browser-based software framework that provides accessibility, portability, and platform independence for medical simulation. Typical medical simulation systems are restricted to the underlying platform and device, which limits widespread use. Our framework allows realistic and efficient medical simulation using only the web browser, for anytime, anywhere access on a variety of platforms ranging from desktop PCs to tablets. The framework consists of visualization, simulation, and hardware integration modules that are the fundamental components of multimodal interactive simulation. Benchmark tests are performed to validate the rendering and computing performance of our framework on the latest web browsers, including Chrome and Firefox. The results are quite promising, opening up the possibility of developing web-based medical simulation technology.

  20. A forward-muscular inverse-skeletal dynamics framework for human musculoskeletal simulations.

    Science.gov (United States)

    Shourijeh, Mohammad S.; Smale, Kenneth B.; Potvin, Brigitte M.; Benoit, Daniel L.

    2016-06-14

    This study provides a forward-muscular inverse-skeletal dynamics framework for musculoskeletal simulations. The framework solves the muscle redundancy problem forward in time, in parallel with tracking between the musculotendon net torques and the joint moments obtained from inverse dynamics. The proposed framework can be used with any musculoskeletal modeling software package; as an example, in this study it is wrapped around OpenSim and the optimization is done in MATLAB. The novel simulation framework was highly robust over repeated runs and produced relatively high correlations between predicted muscle excitations and experimental EMGs for level gait trials. This simulation framework represents an efficient and robust approach to predicting muscle excitation and musculotendon unit force, and to estimating net joint torque. PMID:27106173

  1. Parallel simulation of wormhole propagation with the Darcy-Brinkman-Forchheimer framework

    KAUST Repository

    Wu, Yuanqing

    2015-07-09

    The acid treatment of carbonate reservoirs is a widely practiced oil and gas well stimulation technique. The injected acid dissolves the material near the wellbore and creates flow channels that establish good connectivity between the reservoir and the well. Such flow channels are called wormholes. In contrast to traditional simulation technology relying on the Darcy framework, the Darcy-Brinkman-Forchheimer (DBF) framework is introduced here to simulate the wormhole-forming process. The DBF framework handles both large- and small-porosity conditions and should therefore produce better simulation results than the Darcy framework. To process the huge number of cells in the simulation grid and shorten the long simulation times of the traditional serial code, a parallel code in FORTRAN 90 and MPI was developed. The experimenting-field approach to setting the coefficients in the model equations was also introduced, and a procedure for filling in the coefficient matrix of the linear system in the solver is described. After this, 2D dissolution experiments were carried out. In the experiments, different configurations of wormholes and a series of properties simulated by both frameworks were compared. We conclude that the numerical results of the DBF framework are more wormhole-like and more stable than those of the Darcy framework, demonstrating the advantages of the DBF framework. Finally, the scalability of the parallel code was evaluated, and we conclude that superlinear scalability can be achieved. © 2015 Elsevier Ltd.
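    For reference, one commonly quoted form of the Darcy-Brinkman-Forchheimer momentum balance is given below; this is general background for the abstract, and the notation may differ from the paper's.

```latex
% A common form of the Darcy-Brinkman-Forchheimer momentum balance
% (background only; the paper's exact formulation may differ):
\begin{equation}
\frac{\rho}{\phi}\left[\frac{\partial \mathbf{u}}{\partial t}
  + (\mathbf{u}\cdot\nabla)\frac{\mathbf{u}}{\phi}\right]
  = -\nabla p + \mu_e \nabla^{2}\mathbf{u}
  - \frac{\mu}{K}\,\mathbf{u}
  - \frac{\rho\,C_F}{\sqrt{K}}\,\lvert\mathbf{u}\rvert\,\mathbf{u},
\end{equation}
% where \phi is porosity, K permeability, \mu_e an effective (Brinkman)
% viscosity, and C_F the Forchheimer drag coefficient. The Brinkman term
% keeps the model valid at the large porosities inside the wormhole,
% which is why the DBF framework outperforms a pure Darcy description.
```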

  2. Extending the FairRoot framework to allow for simulation and reconstruction of free streaming data

    International Nuclear Information System (INIS)

    The FairRoot framework is the standard framework for simulation, reconstruction and data analysis for the FAIR experiments. The framework is designed to optimise accessibility for beginners and developers, to be flexible, and to cope with future developments. FairRoot enhances the synergy between the different physics experiments. As a first step toward the simulation of free streaming data, time-based simulation was introduced to the framework. The next step is event-source simulation, achieved via a client/server system. After digitization, the so-called 'samplers' can be started; each sampler reads the data of the corresponding detector from the simulation files and makes it available to the reconstruction clients. The system makes it possible to develop and validate online reconstruction algorithms. In this work, the design and implementation of the new architecture and the communication layer are described.

  3. NEVESIM: Event-Driven Neural Simulation Framework with a Python Interface

    Directory of Open Access Journals (Sweden)

    Dejan ePecevski

    2014-08-01

    Full Text Available NEVESIM is a software package for event-driven simulation of networks of spiking neurons with a fast simulation core in C++, and a scripting user interface in the Python programming language. It supports simulation of heterogeneous networks with different types of neurons and synapses, and can be easily extended by the user with new neuron and synapse types. To enable heterogeneous networks and extensibility, NEVESIM is designed to decouple the simulation logic of communicating events (spikes) between the neurons at a network level from the implementation of the internal dynamics of individual neurons. In this paper we will present the simulation framework of NEVESIM, its concepts and features, as well as some aspects of the object-oriented design approaches and simulation strategies that were utilized to efficiently implement the concepts and functionalities of the framework. We will also give an overview of the Python user interface, its basic commands and constructs, and also discuss the benefits of integrating NEVESIM with Python. One of the valuable capabilities of the simulator is to simulate exactly and efficiently networks of stochastic spiking neurons from the recently developed theoretical framework of neural sampling. This functionality was implemented as an extension on top of the basic NEVESIM framework. Altogether, the intended purpose of the NEVESIM framework is to provide a basis for further extensions that support simulation of various neural network models incorporating different neuron and synapse types that can potentially also use different simulation strategies.

  4. NEVESIM: event-driven neural simulation framework with a Python interface.

    Science.gov (United States)

    Pecevski, Dejan; Kappel, David; Jonke, Zeno

    2014-01-01

    NEVESIM is a software package for event-driven simulation of networks of spiking neurons with a fast simulation core in C++, and a scripting user interface in the Python programming language. It supports simulation of heterogeneous networks with different types of neurons and synapses, and can be easily extended by the user with new neuron and synapse types. To enable heterogeneous networks and extensibility, NEVESIM is designed to decouple the simulation logic of communicating events (spikes) between the neurons at a network level from the implementation of the internal dynamics of individual neurons. In this paper we will present the simulation framework of NEVESIM, its concepts and features, as well as some aspects of the object-oriented design approaches and simulation strategies that were utilized to efficiently implement the concepts and functionalities of the framework. We will also give an overview of the Python user interface, its basic commands and constructs, and also discuss the benefits of integrating NEVESIM with Python. One of the valuable capabilities of the simulator is to simulate exactly and efficiently networks of stochastic spiking neurons from the recently developed theoretical framework of neural sampling. This functionality was implemented as an extension on top of the basic NEVESIM framework. Altogether, the intended purpose of the NEVESIM framework is to provide a basis for further extensions that support simulation of various neural network models incorporating different neuron and synapse types that can potentially also use different simulation strategies.
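    The event queue at the heart of such an event-driven simulator can be sketched compactly. The following Python toy routes timed spike events through a priority queue, decoupled from the per-neuron dynamics exactly as the two records above describe; none of these names are the NEVESIM API, and the neuron model is deliberately trivial.

```python
# Minimal event-driven spiking-network loop (illustrative only).
import heapq

class LIFNeuron:
    """Toy integrate-and-fire: fires when accumulated input reaches 1.0."""
    def __init__(self):
        self.v = 0.0

    def receive(self, w):
        self.v += w
        if self.v >= 1.0:
            self.v = 0.0
            return True          # emit a spike
        return False

def simulate(n, synapses, stimuli, t_end):
    """synapses: {pre: [(post, weight, delay)]}; stimuli: [(t, neuron, w)]."""
    neurons = [LIFNeuron() for _ in range(n)]
    events = list(stimuli)       # priority queue keyed on event time
    heapq.heapify(events)
    while events:
        t, i, w = heapq.heappop(events)
        if t > t_end:
            break
        if neurons[i].receive(w):
            print(f"neuron {i} spiked at t={t:.3f}")
            for post, wgt, delay in synapses.get(i, []):
                heapq.heappush(events, (t + delay, post, wgt))

simulate(3, {0: [(1, 0.6, 1.0)], 1: [(2, 1.0, 1.0)]},
         stimuli=[(0.0, 0, 1.0), (0.5, 1, 0.5)], t_end=10.0)
```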

  5. Sorting, Searching, and Simulation in the MapReduce Framework

    DEFF Research Database (Denmark)

    Goodrich, Michael T.; Sitchinava, Nodari; Zhang, Qin

    2011-01-01

    …in parallel computational geometry for the MapReduce framework, which result in efficient MapReduce algorithms for sorting, 2- and 3-dimensional convex hulls, and fixed-dimensional linear programming. For the case when mappers and reducers have a memory/message-I/O size of M = Θ(N^ε), for a small constant ε > 0...

  6. Software for the international linear collider: Simulation and reconstruction frameworks

    Indian Academy of Sciences (India)

    Ties Behnke; Frank Gaede; DESY Hamburg

    2007-12-01

    Software plays an increasingly important role already in the early stages of a large project like the ILC. In an international collaboration, a data format for the ILC detector and physics studies has been developed. Building upon this, software frameworks are made available which ease event reconstruction and analysis.

  7. The Astrophysics Simulation Collaboratory portal: A framework for effective distributed research

    OpenAIRE

    Bondarescu, Ruxandra; Allen, Gabrielle; Daues, Gregory; Kelly, Ian; Russell, Michael; Seidel, Edward; Shalf, John; Tobias, Malcolm

    2003-01-01

    We describe the motivation, architecture, and implementation of the Astrophysics Simulation Collaboratory (ASC) portal. The ASC project provides a web-based problem solving framework for the astrophysics community that harnesses the capabilities of emerging computational grids.

  8. Developing and Implementing a Framework of Participatory Simulation for Mobile Learning Using Scaffolding

    Science.gov (United States)

    Yin, Chengjiu; Song, Yanjie; Tabata, Yoshiyuki; Ogata, Hiroaki; Hwang, Gwo-Jen

    2013-01-01

    This paper proposes a conceptual framework, scaffolding participatory simulation for mobile learning (SPSML), used on mobile devices for helping students learn conceptual knowledge in the classroom. As the pedagogical design, the framework adopts an experiential learning model, which consists of five sequential but cyclic steps: the initial stage,…

  9. Environmental impact evaluation using an agent based simulation framework

    OpenAIRE

    Schroijen, M.J.T.; Van Tooren, M.J.L.

    2010-01-01

    Environmental issues play an increasingly important role in aviation, directly affecting the desirability of novel technologies. The desirability of a given technology with respect to environmental issues is determined by its system-of-systems level impact rather than the often-used system-level impact. Changing this perspective introduces additional complexity in how the system-level evaluation should be related to the desired system-of-systems (SoS) level evaluation. A framework is propo...

  10. A framework for simulating ultrasound imaging based on first order nonlinear pressure–velocity relations

    DEFF Research Database (Denmark)

    Du, Yigang; Fan, Rui; Li, Yong;

    2016-01-01

    An ultrasound imaging framework, modeled with first-order nonlinear pressure–velocity relations (NPVR) and implemented with a half-time staggered solution and a pseudospectral method, is presented in this paper. The framework is capable of simulating linear and nonlinear ultrasound propagation. The ultrasound image can be obtained by beamforming the simulated channel data. Various results simulated by different algorithms are illustrated for comparison. The root mean square (RMS) errors for each compared pulse are calculated. The linear propagation is validated by an angular spectrum approach (ASA)...

  11. Research on the simulation framework in Building Information Modeling

    OpenAIRE

    Liang, Nan; Xu, Hongqing; Yu, Qiong

    2012-01-01

    In the past ten years, Building Information Modeling (BIM) has been proposed and applied in the architecture industry. For their high efficiency and visualization, BIM and related technologies are welcomed by architects, engineers, builders and owners, and thus modeling technologies for design have been widely researched. However, little attention has been given to simulation, even though simulation is an important part of building design, perhaps because it is seen as somewhat less related to the ...

  12. Performance Improvements for the ATLAS Detector Simulation Framework

    OpenAIRE

    Almalioglu, Yasin; Salzburger, Andreas; Ritsch, Elmar

    2013-01-01

    Many physics and performance studies carried out with the ATLAS detector at the Large Hadron Collider (LHC) require very large event samples. A detailed simulation of the detector, however, requires a great amount of CPU resources. Fast simulation techniques and new setups are therefore developed and extensively used alongside detailed simulation to supply large event samples. Beyond developing new techniques and setups, it is still possible to find performance improvements in the existing...

  13. Chemical research at Argonne National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-04-01

    Argonne National Laboratory is a research and development laboratory located 25 miles southwest of Chicago, Illinois. It has more than 200 programs in basic and applied sciences and an Industrial Technology Development Center to help move its technologies to the industrial sector. At Argonne, basic energy research is supported by applied research in diverse areas such as biology and biomedicine, energy conservation, fossil and nuclear fuels, environmental science, and parallel computer architectures. These capabilities translate into technological expertise in energy production and use, advanced materials and manufacturing processes, and waste minimization and environmental remediation, which can be shared with the industrial sector. The Laboratory's technologies can be applied to help companies design products, substitute materials, devise innovative industrial processes, develop advanced quality control systems and instrumentation, and address environmental concerns. The latest techniques and facilities, including those involving modeling, simulation, and high-performance computing, are available to industry and academia. At Argonne, there are opportunities for industry to carry out cooperative research, license inventions, exchange technical personnel, use unique research facilities, and attend conferences and workshops. Technology transfer is one of the Laboratory's major missions. High priority is given to strengthening U.S. technological competitiveness through research and development partnerships with industry that capitalize on Argonne's expertise and facilities. The Laboratory is one of three DOE superconductivity technology centers, focusing on manufacturing technology for high-temperature superconducting wires, motors, bearings, and connecting leads. Argonne National Laboratory is operated by the University of Chicago for the U.S. Department of Energy.

  14. Abdominal surgery process modeling framework for simulation using spreadsheets.

    Science.gov (United States)

    Boshkoska, Biljana Mileva; Damij, Talib; Jelenc, Franc; Damij, Nadja

    2015-08-01

    We provide a continuation of the existing Activity Table Modeling methodology with a modular spreadsheet simulation. The simulation model developed comprises 28 modeling elements for the abdominal surgery cycle process. The simulation of a two-week patient flow in an abdominal clinic with 75 beds demonstrates the applicability of the methodology. The simulation does not include macros, so programming experience is not essential for replicating or upgrading the model. Unlike existing methods, the proposed solution employs a modular approach to modeling the activities, which ensures better readability, the possibility of easily upgrading the model with other activities, and easy extension and connectivity with other similar models. We propose a first-in-first-served approach for simulating the servicing of multiple patients. The uncertain durations of the activities are modeled using the function "rand()". The patients' movements from one activity to the next are tracked with nested "if()" functions, thus allowing easy re-creation of the process without the need for complex programming. PMID:26004999
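    The spreadsheet mechanics described above translate almost directly into code. The Python sketch below (activity names, time ranges, and the single-resource assumption are all invented for illustration) mirrors the rand()-based durations, the first-in-first-served queue, and the role of the nested if() cells that advance a patient to the next activity.

```python
# Toy transcription of the spreadsheet patient-flow logic (illustrative
# only; activities and durations are made up, not the paper's model).
import random
from collections import deque

ACTIVITIES = [("admission", 1, 2), ("surgery", 2, 5), ("recovery", 3, 7)]

def duration(lo, hi):
    # spreadsheet equivalent: lo + rand() * (hi - lo)
    return lo + random.random() * (hi - lo)

def run(n_patients):
    queue = deque(range(n_patients))      # first-in-first-served
    t_free = 0.0                          # patients served one after another
    while queue:
        p = queue.popleft()
        t = t_free
        for name, lo, hi in ACTIVITIES:   # the nested-if chain: advance the
            t += duration(lo, hi)         # patient through each activity
        t_free = t
        print(f"patient {p} discharged at t={t:5.1f}")

random.seed(1)
run(3)
```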

  15. Hierarchical Petascale Simulation Framework for Stress Corrosion Cracking

    Energy Technology Data Exchange (ETDEWEB)

    Vashishta, Priya

    2014-12-01

    Reaction Dynamics in Energetic Materials: Detonation is a prototype of mechanochemistry, in which mechanically and thermally induced chemical reactions far from equilibrium exhibit vastly different behaviors. It is also one of the hardest multiscale physics problems, in which diverse length and time scales play important roles. The CACS group has performed multimillion-atom reactive MD simulations to reveal a novel two-stage reaction mechanism during the detonation of cyclotrimethylenetrinitramine (RDX) crystal. Rapid production of N2 and H2O within ~10 ps is followed by delayed production of CO molecules within ~ 1 ns. They found that further decomposition towards the final products is inhibited by the formation of large metastable C- and O-rich clusters with fractal geometry. The CACS group has also simulated the oxidation dynamics of close-packed aggregates of aluminum nanoparticles passivated by oxide shells. Their simulation results suggest an unexpectedly active role of the oxide shell as a nanoreactor.

  16. NPTool: a simulation and analysis framework for low-energy nuclear physics experiments

    Science.gov (United States)

    Matta, A.; Morfouace, P.; de Séréville, N.; Flavigny, F.; Labiche, M.; Shearman, R.

    2016-08-01

    The Nuclear Physics Tool (NPTool) is an open source data analysis and Monte Carlo simulation framework that has been developed for low-energy nuclear physics experiments with an emphasis on radioactive beam experiments. The NPTool offers a unified framework for designing, preparing and analyzing complex experiments employing multiple detectors, each of which may comprise some hundreds of channels. The framework has been successfully used for the analysis and simulation of experiments at facilities including GANIL, RIKEN, ALTO and TRIUMF, using both stable and radioactive beams. This paper details the NPTool philosophy together with an overview of the workflow. The framework has been benchmarked through the comparison of simulated and experimental data for a variety of detectors used in charged particle and gamma-ray spectroscopy.

  17. Simulation toolkit with CMOS detector in the framework of hadrontherapy

    Directory of Open Access Journals (Sweden)

    Rescigno R.

    2014-03-01

    Full Text Available Proton imaging can be seen as a powerful technique for on-line monitoring of ion range during carbon ion therapy irradiation. The proton detection technique uses, as a three-dimensional tracking system, a set of CMOS sensor planes. A simulation toolkit based on GEANT4 and ROOT is presented, including the detector response and the reconstruction algorithm.

  18. Simulation toolkit with CMOS detector in the framework of hadrontherapy

    OpenAIRE

    Rescigno R.; Finck Ch.; Juliani D.; Baudot J.; Dauvergne D.; Dedes G.; Krimmer J.; Ray C.; Reithinger V.; Rousseau M.; Testa E.; Winter M.

    2014-01-01

    Proton imaging can be seen as a powerful technique for on-line monitoring of ion range during carbon ion therapy irradiation. The proton detection technique uses, as a three-dimensional tracking system, a set of CMOS sensor planes. A simulation toolkit based on GEANT4 and ROOT is presented, including the detector response and the reconstruction algorithm.

  19. BOUT++: a framework for parallel plasma fluid simulations

    CERN Document Server

    Dudson, B D; Xu, X Q; Snyder, P B; Wilson, H R

    2008-01-01

    A new modular code called BOUT++ is presented, which simulates 3D fluid equations in curvilinear coordinates. Although aimed at simulating Edge Localised Modes (ELMs) in tokamak X-point geometry, the code is able to simulate a wide range of fluid models (magnetised and unmagnetised) involving an arbitrary number of scalar and vector fields, in a wide range of geometries. Time evolution is fully implicit, and 3rd-order WENO schemes are implemented. Benchmarks are presented for linear and non-linear problems (the Orszag-Tang vortex) showing good agreement. Performance of the code is tested by scaling with problem size and processor number, showing efficient scaling to thousands of processors. Linear initial-value simulations of ELMs using reduced ideal MHD are presented, and the results compared to the ELITE linear MHD eigenvalue code. The resulting mode-structures and growth-rate are found to be in good agreement (BOUT++ = 0.245, ELITE = 0.239). To our knowledge, this is the first time dissipationless, initial...

  20. NASA Earth Observing System Simulator Suite (NEOS3): A Forward Simulation Framework for Observing System Simulation Experiments

    Science.gov (United States)

    Niamsuwan, N.; Tanelli, S.; Johnson, M. P.; Jacob, J. C.; Jaruwatanadilok, S.; Oveisgharan, S.; Dao, D.; Simard, M.; Turk, F. J.; Tsang, L.; Liao, T. H.; Chau, Q.

    2014-12-01

    Future Earth observation missions will produce a large volume of interrelated data sets that will help us to cross-calibrate and validate spaceborne sensor measurements. A forward simulator is a crucial tool for examining the quality of individual products as well as for resolving discrepancies among related data sets. The NASA Earth Observing System Simulator Suite (NEOS3) is a highly customizable forward simulation tool for Earth remote sensing instruments. Its three-stage simulation process converts the 3D geophysical description of the scene being observed to corresponding electromagnetic emission and scattering signatures, and finally to observable parameters as reported by a (passive or active) remote sensing instrument. User-configurable options include the selection of models for describing the geophysical properties of atmospheric particles and their effects on the signal of interest, the selection of wave scattering and propagation models, and the activation of simplifying assumptions (trading off computation time against solution accuracy). The next generation of NEOS3, to be released in 2015, will feature additional state-of-the-art electromagnetic scattering models for various types of the Earth's surfaces and ground covers (e.g. layered snowpack, forest, vegetated soil, and sea ice) tailored specifically for missions like GPM and SMAP. Also to be included in 2015 are dedicated functionalities and interfaces that facilitate integrating NEOS3 into Observing System Simulation Experiment (OSSE) environments. This new generation of NEOS3 can also utilize high performance computing resources (parallel processing and cloud computing) and can be scaled to handle large or computation-intensive problems. This presentation will highlight some notable features of NEOS3. Demonstrations of its applications for evaluating new mission concepts, especially in the context of OSSE frameworks, will also be presented.

  1. Flexible and modular MPI simulation framework and its use in modelling a µMPI

    NARCIS (Netherlands)

    Straub, M.; Lammers, T.G.G.M.; Kiessling, F.; Schulz, V.

    2015-01-01

    The availability of thorough system simulations for detailed and accurate performance prediction and optimization of existing and future designs is very important for a new modality such as magnetic particle imaging (MPI). Our framework aims to simulate a complete MPI system by providing a descrip...

  2. Power Grid Simulation Applications Developed Using the GridPACKTM High Performance Computing Framework

    Energy Technology Data Exchange (ETDEWEB)

    Jin, Shuangshuang; Chen, Yousu; Diao, Ruisheng; Huang, Zhenyu; Perkins, William A.; Palmer, Bruce J.

    2016-12-01

    This paper describes the GridPACK™ software framework for developing power grid simulations that can run on high performance computing platforms, with several example applications (dynamic simulation, static contingency analysis, and dynamic contingency analysis) that have been developed using GridPACK.

  3. Designing a Virtual Olympic Games Framework by Using Simulation in Web 2.0 Technologies

    Science.gov (United States)

    Stoilescu, Dorian

    2013-01-01

    In the past, instructional simulation faced major difficulties because it offered limited possibilities for practice and learning. This article proposes a link between instructional simulation and Web 2.0 technologies. More precisely, I present the design of the Virtual Olympic Games Framework (VOGF) as a significant demonstration of how interactivity in…

  4. A general simulation model developing process based on five-object framework

    Institute of Scientific and Technical Information of China (English)

    胡安斌; 伞冶; 陈建明; 陈永强

    2003-01-01

    Different paradigms that relate verification and validation to the simulation model have different development processes. A simulation model development process based on the Five-Object Framework (FOF) is discussed in this paper. An example is given to demonstrate the applications of the proposed method.

  5. Implementation of Grid-computing Framework for Simulation in Multi-scale Structural Analysis

    Directory of Open Access Journals (Sweden)

    Data Iranata

    2010-05-01

    Full Text Available A new grid-computing framework for simulation in multi-scale structural analysis is presented. Two levels of parallel processing are involved in this framework: multiple local distributed computing environments connected by a local network to form a grid-based cluster-to-cluster distributed computing environment. To successfully perform the simulation, a large-scale structural system task is decomposed into the simulations of a simplified global model and several detailed component models using various scales. These correlated multi-scale structural system tasks are distributed among clusters, connected together in a multi-level hierarchy, and then coordinated over the internet. The software framework for supporting this multi-scale structural simulation approach is also presented. The program architecture design allows the integration of several multi-scale models as clients and servers under a single platform. To check its feasibility, a prototype software system has been designed and implemented to realize the proposed concept. The simulation results show that the software framework can increase the speedup of the structural analysis. Based on this result, the proposed grid-computing framework is suitable for performing the simulation of multi-scale structural analysis.

  6. A High-Throughput, High-Accuracy System-Level Simulation Framework for System on Chips

    OpenAIRE

    Guanyi Sun; Shengnan Xu; Xu Wang; Dawei Wang; Eugene Tang; Yangdong Deng; Sun Chan

    2011-01-01

    Today's System-on-Chip (SoC) design is extremely challenging because it involves complicated design tradeoffs and heterogeneous design expertise. To explore the large solution space, system architects have to rely on system-level simulators to identify an optimized SoC architecture. In this paper, we propose a system-level simulation framework, the System Performance Simulation Implementation Mechanism, or SPSIM. Based on SystemC TLM2.0, the framework consists of an executable SoC model, a simu...

  7. A framework for service enterprise workflow simulation with multi-agents cooperation

    Science.gov (United States)

    Tan, Wenan; Xu, Wei; Yang, Fujun; Xu, Lida; Jiang, Chuanqun

    2013-11-01

    Process dynamic modelling for service businesses is the key technique for service-oriented information systems and service business management, and the workflow model of business processes is the core part of service systems. Service business workflow simulation is the prevalent approach for analysing service business processes dynamically. The generic method for service business workflow simulation is based on discrete-event queuing theory, which lacks flexibility and scalability. In this paper, we propose a service-workflow-oriented framework for the process simulation of service businesses using multi-agent cooperation to address the above issues. The social rationality of agents is introduced into the proposed framework; adopting rationality as a social factor in decision-making strategies, flexible scheduling of activity instances has been implemented. A system prototype has been developed to validate the proposed simulation framework through a business case study.

  8. An artificial intelligence framework for feedback and assessment mechanisms in educational Simulations and Serious Games

    OpenAIRE

    Stallwood, James

    2015-01-01

    Simulations and Serious Games are powerful e-learning tools that can be designed to provide learning opportunities that stimulate their participants. To achieve this goal, the design of Simulations and Serious Games will often include some balance of three factors: motivation, engagement, and flow. Whilst many frameworks and approaches for Simulation and Serious Game design do provide the means for addressing a combination of these factors to some degree, few address how those factors might b...

  9. A Java based framework for simulating peer-to-peer overlay networks

    OpenAIRE

    Hasselrot, Daniel

    2005-01-01

    In the last few years many new structured overlay network protocols for peer-to-peer systems have appeared. Following that, the need to test and develop the protocols in a controlled environment arose, and many different simulators were written, often only supporting a single protocol and designed with a specific simulation task in mind. We introduce a general component-based framework in Java for writing peer-to-peer simulators, complete with methods for exporting and vis...

  10. Digital system verification a combined formal methods and simulation framework

    CERN Document Server

    Li, Lun

    2010-01-01

    Integrated circuit capacity follows Moore's law, and at the time of this writing chips are commonly produced with over 70 million gates per device. Ensuring correct functional behavior of such large designs before fabrication poses an extremely challenging problem. Formal verification validates the correctness of the implementation of a design with respect to its specification through mathematical proof techniques. Formal techniques have been emerging as commercialized EDA tools in the past decade. Simulation remains the predominant tool used to validate a design in industry. After more than 5

  11. Atomistic Simulation of Protein Encapsulation in Metal-Organic Frameworks.

    Science.gov (United States)

    Zhang, Haiyang; Lv, Yongqin; Tan, Tianwei; van der Spoel, David

    2016-01-28

    Fabrication of metal-organic frameworks (MOFs) with large apertures triggers a brand-new research area for selective encapsulation of biomolecules within MOF nanopores. The underlying inclusion mechanism is yet to be clarified however. Here we report a molecular dynamics study on the mechanism of protein encapsulation in MOFs. Evaluation for the binding of amino acid side chain analogues reveals that van der Waals interaction is the main driving force for the binding and that guest size acts as a key factor predicting protein binding with MOFs. Analysis on the conformation and thermodynamic stability of the miniprotein Trp-cage encapsulated in a series of MOFs with varying pore apertures and surface chemistries indicates that protein encapsulation can be achieved via maintaining a polar/nonpolar balance in the MOF surface through tunable modification of organic linkers and Mg-O chelating moieties. Such modifications endow MOFs with a more biocompatible confinement. This work provides guidelines for selective inclusion of biomolecules within MOFs and facilitates MOF functions as a new class of host materials and molecular chaperones. PMID:26730607

  13. Turbulent Simulations of Divertor Detachment Based On BOUT++ Framework

    Science.gov (United States)

    Chen, Bin; Xu, Xueqiao; Xia, Tianyang; Ye, Minyou

    2015-11-01

    The China Fusion Engineering Testing Reactor is under conceptual design, acting as a bridge between ITER and DEMO. Detached divertor operation offers great promise for reducing the heat flux onto the divertor target plates to achieve acceptable erosion. Therefore, a density scan is performed via an increase of the D2 gas puffing rate in the range of 0.0-5.0×10^23 s^-1, using the B2-Eirene/SOLPS 5.0 code package, to study heat flux control and impurity screening properties. As the density increases, the divertor operation status changes gradually, from the low-recycling regime to the high-recycling regime and finally to detachment. Significant radiation loss inside the confined plasma in the divertor region during detachment leads to strong parallel density and temperature gradients. Based on the SOLPS simulations, BOUT++ simulations will be presented to investigate the stability and turbulent transport under divertor plasma detachment, particularly the strong parallel-gradient-driven instabilities and enhanced plasma turbulence that spread the heat flux over larger surface areas. The correlation between outer mid-plane and divertor turbulence and the related transport will be analyzed. Prepared by LLNL under Contract DE-AC52-07NA27344. LLNL-ABS-675075.

  14. A Dynamic Simulation Analysis of Currency Substitution in an Optimizing Framework with Transactions Costs

    OpenAIRE

    Carlos Asilis; Paul D. McNelis

    1992-01-01

    This paper investigates the dynamic paths of inflation and real balances in a general equilibrium intertemporal optimization model with transactions costs and currency substitution, when budget deficits are financed by money creation. The results show that inflationary paths exhibit more 'jumps' or explosions under the assumptions of lower transactions costs or an increasing degree of currency...

  15. A configurable distributed high-performance computing framework for satellite's TDI-CCD imaging simulation

    Science.gov (United States)

    Xue, Bo; Mao, Bingjing; Chen, Xiaomei; Ni, Guoqiang

    2010-11-01

    This paper presents a configurable distributed high-performance computing (HPC) framework for TDI-CCD imaging simulation. It uses the strategy pattern to adapt multiple algorithms, which helps to decrease simulation time at low expense. Imaging simulation for a satellite-mounted TDI-CCD comprises four processes: 1) degradation due to the atmosphere, 2) degradation due to the optical system, 3) degradation due to the electronics of the TDI-CCD plus re-sampling, and 4) data integration. Processes 1) to 3) use data-intensive algorithms such as FFT, convolution, and Lagrange interpolation, which require powerful CPUs. Even using an Intel Xeon X5550 processor, a regular serial processing method takes more than 30 hours for a simulation whose result image size is 1500 × 1462. From a literature study, there is no mature distributed HPC framework in this field. Here we developed a distributed computing framework for TDI-CCD imaging simulation, based on WCF[1], that uses a client/server (C/S) layer and invokes the free CPU resources in the LAN. The server pushes the tasks of processes 1) to 3) to the free computing capacity, ultimately delivering HPC at low cost. In a computing experiment with 4 symmetric nodes and 1 server, this framework reduced simulation time by about 74%. Adding more asymmetric nodes to the computing network decreased the time accordingly. In conclusion, this framework can provide nearly unlimited computation capacity provided that the network and task management server are affordable, and it is a brand-new HPC solution for TDI-CCD imaging simulation and similar applications.
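    The task-farming structure described in this record can be sketched with Python's standard library standing in for the WCF client/server layer; the stage functions, tile layout, and node count below are placeholders, not the authors' code.

```python
# Toy task-farming version of the four-stage pipeline above: stages 1-3
# applied to image tiles farmed out to worker processes, stage 4 done by
# the "server". Illustrative only.
from multiprocessing import Pool

def degrade(tile):
    tile_id, pixels = tile
    pixels = [p * 0.9 for p in pixels]        # 1) atmosphere (placeholder)
    pixels = [p * 0.95 for p in pixels]       # 2) optics (placeholder)
    pixels = [round(p) for p in pixels]       # 3) TDI-CCD + re-sampling
    return tile_id, pixels

if __name__ == "__main__":
    tiles = [(i, [100 + i] * 4) for i in range(8)]
    with Pool(4) as pool:                     # 4 "free nodes" on the LAN
        done = pool.map(degrade, tiles)       # server pushes tiles to nodes
    image = [px for _, px in sorted(done)]    # 4) data integration
    print(image)
```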

  16. An implicit solution framework for reactor fuel performance simulation

    International Nuclear Information System (INIS)

    The simulation of nuclear reactor fuel performance involves complex thermomechanical processes between fuel pellets, made of fissile material, and the protective cladding that surrounds the pellets. An important design goal for a fuel is to maximize the life of the cladding, thereby allowing the fuel to remain in the reactor for a longer period of time to achieve higher degrees of burnup. This presentation examines various mathematical and computational issues that impact the modeling of the thermomechanical response of reactor fuel, and that are thus important to the development of INL's fuel performance analysis code, BISON. The code employs advanced methods for solving coupled partial differential equation systems that describe multidimensional fuel thermomechanics, heat generation, and transport within the fuel.

  17. CoRoBa, a Multi Mobile Robot Control and Simulation Framework

    Directory of Open Access Journals (Sweden)

    Eric Colon

    2008-11-01

    Full Text Available This paper describes the ongoing development of a multi-robot control framework named CoRoBa. CoRoBa is theoretically founded on the reification of Real-Time Design Patterns. It uses CORBA as its communication middleware and consequently benefits from the interoperability of this standard. A multi-robot 3D simulator written in Java3D integrates seamlessly with this framework. Several demonstration applications have been developed to validate the design and implementation options.

  18. SIMPEG: An open source framework for simulation and gradient based parameter estimation in geophysical applications

    Science.gov (United States)

    Cockett, Rowan; Kang, Seogi; Heagy, Lindsey J.; Pidlisecky, Adam; Oldenburg, Douglas W.

    2015-12-01

    Inverse modeling is a powerful tool for extracting information about the subsurface from geophysical data. Geophysical inverse problems are inherently multidisciplinary, requiring elements from the relevant physics, numerical simulation, and optimization, as well as knowledge of the geologic setting and a comprehension of the interplay between all of these elements. The development and advancement of inversion methodologies can be enabled by a framework that supports experimentation, is flexible and extensible, and allows the knowledge generated to be captured and shared. The goal of this paper is to propose a framework that supports many different types of geophysical forward simulations and deterministic inverse problems. Additionally, we provide an open source implementation of this framework in Python called SIMPEG (Simulation and Parameter Estimation in Geophysics).
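    The forward-simulation-plus-gradient loop that such a framework organizes reduces, in its simplest linear Tikhonov-regularized form, to a few lines. The following NumPy sketch is generic background under those assumptions and deliberately avoids the actual SimPEG API.

```python
# Generic sketch of gradient-based parameter estimation: a linear forward
# operator, a data-misfit plus Tikhonov regularization objective
# phi(m) = 0.5*||G m - d||^2 + 0.5*beta*||m||^2, and gradient descent.
import numpy as np

rng = np.random.default_rng(0)
G = rng.normal(size=(20, 10))          # toy linear forward operator
m_true = np.zeros(10); m_true[3] = 1.0
d_obs = G @ m_true + 0.01 * rng.normal(size=20)

beta, m = 1e-2, np.zeros(10)
for k in range(200):
    r = G @ m - d_obs                  # forward simulate, form residual
    grad = G.T @ r + beta * m          # d(phi_d)/dm + beta * d(phi_m)/dm
    m -= 0.01 * grad                   # fixed-step gradient descent
print(np.round(m, 2))                  # recovers a peak near index 3
```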

  19. A framework of knowledge creation processes in participatory simulation of hospital work systems

    DEFF Research Database (Denmark)

    Andersen, Simone Nyholm; Broberg, Ole

    2016-01-01

    Participatory simulation (PS) is a method to involve workers in simulating and designing their own future work system. Existing PS studies have focused on analysing the outcome, and minimal attention has been devoted to the process of creating this outcome. In order to study this process, we suggest applying a knowledge creation perspective. The aim of this study was to develop a framework describing the process of how ergonomics knowledge is created in PS. Video recordings from three projects applying PS to hospital work systems constituted the foundation of a process mining analysis. The analysis resulted in a framework revealing the sources of ergonomics knowledge creation as sequential relationships between the activities of simulation participants sharing work experiences, experimenting with scenarios, and reflecting on ergonomics consequences. We argue that this framework reveals...

  20. A detailed framework to incorporate dust in hydrodynamical simulations

    CERN Document Server

    Grassi, T; Haugboelle, T; Schleicher, D R G

    2016-01-01

    Dust plays a key role in the evolution of the ISM and its correct modelling in numerical simulations is therefore fundamental. We present a new and self-consistent model that treats grain thermal coupling with the gas, radiation balance, and surface chemistry for molecular hydrogen. This method can be applied to any dust distribution with an arbitrary number of grain types without affecting the overall computational cost. In this paper we describe in detail the physics and the algorithm behind our approach, and in order to test the methodology, we present some examples of astrophysical interest, namely (i) a one-zone collapse with complete gas chemistry and thermochemical processes, (ii) a 3D model of a low-metallicity collapse of a minihalo starting from cosmological initial conditions, and (iii) a turbulent molecular cloud with H-C-O chemistry (277 reactions), together with self-consistent cooling and heating solved on the fly. Although these examples employ the publicly available code KROME, our approach c...

  1. FERN – a Java framework for stochastic simulation and evaluation of reaction networks

    Science.gov (United States)

    Erhard, Florian; Friedel, Caroline C; Zimmer, Ralf

    2008-01-01

    Background Stochastic simulation can be used to illustrate the development of biological systems over time and the stochastic nature of these processes. Currently available programs for stochastic simulation, however, are limited in that they either a) do not provide the most efficient simulation algorithms and are difficult to extend, b) cannot be easily integrated into other applications, or c) do not allow the user to monitor and intervene during the simulation process in an easy and intuitive way. Thus, in order to use stochastic simulation in innovative high-level modeling and analysis approaches, more flexible tools are necessary. Results In this article, we present FERN (Framework for Evaluation of Reaction Networks), a Java framework for the efficient simulation of chemical reaction networks. FERN is subdivided into three layers for network representation, simulation, and visualization of the simulation results, each of which can be easily extended. It provides efficient and accurate state-of-the-art stochastic simulation algorithms for well-mixed chemical systems and a powerful observer system, which makes it possible to track and control the simulation progress on every level. To illustrate how FERN can be easily integrated into other systems biology applications, plugins to Cytoscape and CellDesigner are included. These plugins make it possible to run simulations and to observe the simulation progress in a reaction network in real time from within the Cytoscape or CellDesigner environment. Conclusion FERN addresses shortcomings of currently available stochastic simulation programs in several ways. First, it provides a broad range of efficient and accurate algorithms for both exact and approximate stochastic simulation, and a simple interface for extending to new algorithms. FERN's implementations are considerably faster than the C implementations of gillespie2 or the Java implementations of ISBJava. Second, it can be used in a straightforward way both as a stand-alone...

  2. FERN – a Java framework for stochastic simulation and evaluation of reaction networks

    Directory of Open Access Journals (Sweden)

    Zimmer Ralf

    2008-08-01

    Full Text Available Abstract Background Stochastic simulation can be used to illustrate the development of biological systems over time and the stochastic nature of these processes. Currently available programs for stochastic simulation, however, are limited in that they either (a) do not provide the most efficient simulation algorithms and are difficult to extend, (b) cannot be easily integrated into other applications or (c) do not allow the user to monitor and intervene in the simulation process in an easy and intuitive way. Thus, in order to use stochastic simulation in innovative high-level modeling and analysis approaches, more flexible tools are necessary. Results In this article, we present FERN (Framework for Evaluation of Reaction Networks), a Java framework for the efficient simulation of chemical reaction networks. FERN is subdivided into three layers for network representation, simulation and visualization of the simulation results, each of which can be easily extended. It provides efficient and accurate state-of-the-art stochastic simulation algorithms for well-mixed chemical systems and a powerful observer system, which makes it possible to track and control the simulation progress on every level. To illustrate how FERN can be easily integrated into other systems biology applications, plugins to Cytoscape and CellDesigner are included. These plugins make it possible to run simulations and to observe the simulation progress in a reaction network in real time from within the Cytoscape or CellDesigner environment. Conclusion FERN addresses shortcomings of currently available stochastic simulation programs in several ways. First, it provides a broad range of efficient and accurate algorithms both for exact and approximate stochastic simulation and a simple interface for extending to new algorithms. FERN's implementations are considerably faster than the C implementations of gillespie2 or the Java implementations of ISBJava. Second, it can be used in a straightforward

  3. A Framework for Simulating Turbine-Based Combined-Cycle Inlet Mode-Transition

    Science.gov (United States)

    Le, Dzu K.; Vrnak, Daniel R.; Slater, John W.; Hessel, Emil O.

    2012-01-01

    A simulation framework based on the Memory-Mapped-Files technique was created to operate multiple numerical processes in locked time-steps and send I/O data synchronously to one another to simulate system dynamics. This simulation scheme is currently used to study the complex interactions between inlet flow-dynamics, variable-geometry actuation mechanisms, and flow-controls in the transition from supersonic to hypersonic conditions and vice versa. A study of Mode-Transition Control for a high-speed inlet wind-tunnel model with this MMF-based framework is presented to illustrate this scheme and demonstrate its usefulness in simulating supersonic and hypersonic inlet dynamics and controls or other types of complex systems.
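
    The core trick, exchanging lock-stepped I/O through a shared memory-mapped file, can be pictured in a few lines. This is a hypothetical illustration of the pattern only, not the framework's code; the file name, payload format, and polling scheme are invented:

```python
import mmap
import struct
import time

HEADER = struct.Struct("<q")  # 8-byte step counter at the front of the region

class MmfChannel:
    """One-way channel over a memory-mapped file: the writer bumps a step
    counter after publishing its payload; the reader spins until the counter
    reaches the step it expects, keeping both simulations in locked steps."""

    def __init__(self, path, payload_bytes, create=False):
        size = HEADER.size + payload_bytes
        self._f = open(path, "w+b" if create else "r+b")
        if create:
            self._f.truncate(size)
        self._mm = mmap.mmap(self._f.fileno(), size)

    def publish(self, step, payload: bytes):
        self._mm[HEADER.size:HEADER.size + len(payload)] = payload
        self._mm[:HEADER.size] = HEADER.pack(step)   # counter written last

    def wait_for(self, step, nbytes, poll=1e-4):
        while HEADER.unpack_from(self._mm, 0)[0] < step:
            time.sleep(poll)                         # spin until peer advances
        return bytes(self._mm[HEADER.size:HEADER.size + nbytes])

# Usage sketch (two cooperating processes would each own one direction):
tx = MmfChannel("inlet_to_ctrl.bin", 64, create=True)
tx.publish(1, b"mach=3.2;")
rx = MmfChannel("inlet_to_ctrl.bin", 64)
print(rx.wait_for(1, 9))
```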

  4. KMCLib: A general framework for lattice kinetic Monte Carlo (KMC) simulations

    OpenAIRE

    Leetmaa, Mikael; Skorodumova, Natalia V.

    2014-01-01

    KMCLib is a general framework for lattice kinetic Monte Carlo (KMC) simulations. The program can handle simulations of the diffusion and reaction of millions of particles in one, two, or three dimensions, and is designed to be easily extended and customized by the user to allow for the development of complex custom KMC models for specific systems without having to modify the core functionality of the program. Analysis modules and on-the-fly elementary step diffusion rate calculations can be i...
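
    For context, the rejection-free KMC loop at the heart of such codes picks one elementary process per step with probability proportional to its rate and advances the clock by an exponentially distributed waiting time. A toy 1D hopping sketch of that loop (not KMCLib's actual Python/C++ API):

```python
import numpy as np

def kmc_step(rates, rng):
    """One rejection-free (BKL-style) KMC step: choose a process in
    proportion to its rate and draw the exponential time increment."""
    total = rates.sum()
    i = rng.choice(len(rates), p=rates / total)  # which process fires
    dt = rng.exponential(1.0 / total)            # waiting time
    return i, dt

# Toy example: a single particle hopping left/right on a periodic 1D lattice.
rng = np.random.default_rng(42)
L, pos, t = 64, 32, 0.0
hop = np.array([1.0, 1.0])                       # rates: [left, right]
for _ in range(1000):
    i, dt = kmc_step(hop, rng)
    pos = (pos + (1 if i else -1)) % L
    t += dt
print(f"final site {pos} at t = {t:.2f}")
```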

  5. A Framework for Teaching Programming on the Internet: A Web-Based Simulation Approach

    OpenAIRE

    Yousif A. Bastaki

    2012-01-01

    Problem statement: This research study describes the process of developing a web-based framework for simulating programming language activities on the Internet, in an interactive way, by enabling executable programs to perform their function automatically. Approach: The interaction process is carried out using Java applets. It emphasizes the importance of building the web-based architecture of the proposed simulation model. Results: The research concentrates on developing programming courses on th...

  6. Integrated Simulation Environment for Unmanned Autonomous Systems—Towards a Conceptual Framework

    OpenAIRE

    Perhinschi, M. G.; Napolitano, M. R.; S. Tamayo

    2010-01-01

    The paper initiates a comprehensive conceptual framework for an integrated simulation environment for unmanned autonomous systems (UAS) that is capable of supporting the design, analysis, testing, and evaluation from a “system of systems” perspective. The paper also investigates the current state of the art of modeling and performance assessment of UAS and their components and identifies directions for future developments. All the components of a comprehensive simulation environment focused o...

  7. Dynamically adaptive Lattice Boltzmann simulation of shallow water flows with the Peano framework

    KAUST Repository

    Neumann, Philipp

    2015-09-01

    We present a dynamically adaptive Lattice Boltzmann (LB) implementation for solving the shallow water equations (SWEs). Our implementation extends an existing LB component of the Peano framework. We revise the modular design with respect to the incorporation of new simulation aspects and LB models. The basic SWE-LB implementation is validated in different breaking dam scenarios. We further provide a numerical study on stability of the MRT collision operator used in our simulations.

  8. A Metascalable Computing Framework for Large Spatiotemporal-Scale Atomistic Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Nomura, K; Seymour, R; Wang, W; Kalia, R; Nakano, A; Vashishta, P; Shimojo, F; Yang, L H

    2009-02-17

    A metascalable (or 'design once, scale on new architectures') parallel computing framework has been developed for large spatiotemporal-scale atomistic simulations of materials based on spatiotemporal data locality principles, which is expected to scale on emerging multipetaflops architectures. The framework consists of: (1) an embedded divide-and-conquer (EDC) algorithmic framework based on spatial locality to design linear-scaling algorithms for high complexity problems; (2) a space-time-ensemble parallel (STEP) approach based on temporal locality to predict long-time dynamics, while introducing multiple parallelization axes; and (3) a tunable hierarchical cellular decomposition (HCD) parallelization framework to map these O(N) algorithms onto a multicore cluster based on hybrid implementation combining message passing and critical section-free multithreading. The EDC-STEP-HCD framework exposes maximal concurrency and data locality, thereby achieving: (1) inter-node parallel efficiency well over 0.95 for 218 billion-atom molecular-dynamics and 1.68 trillion electronic-degrees-of-freedom quantum-mechanical simulations on 212,992 IBM BlueGene/L processors (superscalability); (2) high intra-node, multithreading parallel efficiency (nanoscalability); and (3) nearly perfect time/ensemble parallel efficiency (eon-scalability). The spatiotemporal scale covered by MD simulation on a sustained petaflops computer per day (i.e. petaflops·day of computing) is estimated as NT = 2.14 (e.g. N = 2.14 million atoms for T = 1 microsecond).

  9. A Metascalable Computing Framework for Large Spatiotemporal-Scale Atomistic Simulations

    International Nuclear Information System (INIS)

    A metascalable (or 'design once, scale on new architectures') parallel computing framework has been developed for large spatiotemporal-scale atomistic simulations of materials based on spatiotemporal data locality principles, which is expected to scale on emerging multipetaflops architectures. The framework consists of: (1) an embedded divide-and-conquer (EDC) algorithmic framework based on spatial locality to design linear-scaling algorithms for high complexity problems; (2) a space-time-ensemble parallel (STEP) approach based on temporal locality to predict long-time dynamics, while introducing multiple parallelization axes; and (3) a tunable hierarchical cellular decomposition (HCD) parallelization framework to map these O(N) algorithms onto a multicore cluster based on hybrid implementation combining message passing and critical section-free multithreading. The EDC-STEP-HCD framework exposes maximal concurrency and data locality, thereby achieving: (1) inter-node parallel efficiency well over 0.95 for 218 billion-atom molecular-dynamics and 1.68 trillion electronic-degrees-of-freedom quantum-mechanical simulations on 212,992 IBM BlueGene/L processors (superscalability); (2) high intra-node, multithreading parallel efficiency (nanoscalability); and (3) nearly perfect time/ensemble parallel efficiency (eon-scalability). The spatiotemporal scale covered by MD simulation on a sustained petaflops computer per day (i.e. petaflops · day of computing) is estimated as NT = 2.14 (e.g. N = 2.14 million atoms for T = 1 microsecond).

  10. COSMOS: A System-Level Modelling and Simulation Framework for Coprocessor-Coupled Reconfigurable Systems

    DEFF Research Database (Denmark)

    Wu, Kehuai; Madsen, Jan

    2007-01-01

    … and resource management, and (iii) present a SystemC-based framework to model and simulate coprocessor-coupled reconfigurable systems. We illustrate how COSMOS may be used to capture the dynamic behavior of such systems and emphasize the need for capturing the system aspects of such systems in order to deal…

  11. A High-Throughput, High-Accuracy System-Level Simulation Framework for System on Chips

    Directory of Open Access Journals (Sweden)

    Guanyi Sun

    2011-01-01

    Full Text Available Today's System-on-Chip (SoC) design is extremely challenging because it involves complicated design tradeoffs and heterogeneous design expertise. To explore the large solution space, system architects have to rely on system-level simulators to identify an optimized SoC architecture. In this paper, we propose a system-level simulation framework, System Performance Simulation Implementation Mechanism, or SPSIM. Based on SystemC TLM2.0, the framework consists of an executable SoC model, a simulation tool chain, and a modeling methodology. Compared with the large body of existing research in this area, this work is aimed at delivering a high simulation throughput and, at the same time, guaranteeing a high accuracy on real industrial applications. Integrating the leading TLM techniques, our simulator attains a simulation speed within a factor of 35 of native hardware execution on a set of real-world applications. SPSIM incorporates effective timing models, which can achieve a high accuracy after hardware-based calibration. Experimental results on a set of mobile applications proved that the difference between the simulated and measured results of timing performance is within 10%, which in the past could only be attained by cycle-accurate models.

  12. The Umbra Simulation and Integration Framework Applied to Emergency Response Training

    Science.gov (United States)

    Hamilton, Paul Lawrence; Britain, Robert

    2010-01-01

    The Mine Emergency Response Interactive Training Simulation (MERITS) is intended to prepare personnel to manage an emergency in an underground coal mine. The creation of an effective training environment required realistic emergent behavior in response to simulation events and trainee interventions, exploratory modification of miner behavior rules, realistic physics, and incorporation of legacy code. It also required the ability to add rich media to the simulation without conflicting with normal desktop security settings. Our Umbra Simulation and Integration Framework facilitated agent-based modeling of miners and rescuers and made it possible to work with subject matter experts to quickly adjust behavior through script editing, rather than through lengthy programming and recompilation. Integration of Umbra code with the WebKit browser engine allowed the use of JavaScript-enabled local web pages for media support. This project greatly extended the capabilities of Umbra in support of training simulations and has implications for simulations that combine human behavior, physics, and rich media.

  13. Numerical simulation of the fracture process in ceramic FPD frameworks caused by oblique loading.

    Science.gov (United States)

    Kou, Wen; Qiao, Jiyan; Chen, Li; Ding, Yansheng; Sjögren, Göran

    2015-10-01

    Using a newly developed three-dimensional (3D) numerical modeling code, an analysis was performed of the fracture behavior in a three-unit ceramic-based fixed partial denture (FPD) framework subjected to oblique loading. All the materials in the study were treated heterogeneously; Weibull's distribution law was applied to the description of the heterogeneity. The Mohr-Coulomb failure criterion with tensile strength cut-off was utilized in judging whether the material was in an elastic or failed state. The simulated loading area was placed either on the buccal or the lingual cusp of a premolar-shaped pontic with the loading direction at 30°, 45°, 60°, 75° or 90° angles to the occlusal surface. The stress distribution, fracture initiation and propagation in the framework during the loading and fracture process were analyzed. This numerical simulation allowed the cause of the framework fracture to be identified as tensile stress failure. The decisive fracture was initiated in the gingival embrasure of the pontic, regardless of whether the buccal or lingual cusp of the pontic was loaded. The stress distribution and fracture propagation process of the framework could be followed step by step from beginning to end. The bearing capacity and the rigidity of the framework vary with the loading position and direction. The framework loaded at 90° to the occlusal surface has the highest bearing capacity and the greatest rigidity. The framework loaded at 30° to the occlusal surface has the least rigidity, indicating that oblique loading has a major impact on the fracture of ceramic frameworks. PMID:26143353
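
    The heterogeneity treatment is easy to picture: each element receives a strength sampled from a Weibull distribution and is flagged as failed once its tensile stress exceeds that strength. A loose Python sketch under invented parameter values (the authors' 3D code and calibrated parameters are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(1)

# Assign each element a tensile strength drawn from a Weibull distribution
# (shape m controls heterogeneity, s0 sets the scale), then flag elements
# whose tensile stress exceeds their strength (the cut-off check).
m, s0, n_elem = 6.0, 80.0, 10_000             # illustrative, MPa-scale values
strength = s0 * rng.weibull(m, size=n_elem)   # element-wise tensile strength
stress = rng.normal(45.0, 10.0, size=n_elem)  # stand-in for FEM stress output
failed = stress > strength                    # tensile strength cut-off
print(f"{failed.mean():.1%} of elements exceed their tensile strength")
```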

  14. Integrated Simulation Environment for Unmanned Autonomous Systems—Towards a Conceptual Framework

    Directory of Open Access Journals (Sweden)

    M. G. Perhinschi

    2010-01-01

    Full Text Available The paper initiates a comprehensive conceptual framework for an integrated simulation environment for unmanned autonomous systems (UAS that is capable of supporting the design, analysis, testing, and evaluation from a “system of systems” perspective. The paper also investigates the current state of the art of modeling and performance assessment of UAS and their components and identifies directions for future developments. All the components of a comprehensive simulation environment focused on the testing and evaluation of UAS are identified and defined through detailed analysis of current and future required capabilities and performance. The generality and completeness of the simulation environment is ensured by including all operational domains, types of agents, external systems, missions, and interactions between components. The conceptual framework for the simulation environment is formulated with flexibility, modularity, generality, and portability as key objectives. The development of the conceptual framework for the UAS simulation reveals important aspects related to the mechanisms and interactions that determine specific UAS characteristics including complexity, adaptability, synergy, and high impact of artificial and human intelligence on system performance and effectiveness.

  15. Modelling and simulation of acrylic bone cement injection and curing within the framework of vertebroplasty

    CERN Document Server

    Landgraf, Ralf; Kolmeder, Sebastian; Lion, Alexander; Lebsack, Helena; Kober, Cornelia

    2013-01-01

    The minimal invasive procedure of vertebroplasty is a surgical technique to treat compression fractures of vertebral bodies. During the treatment liquid bone cement gets injected into the affected vertebral body and therein cures to a solid. In order to investigate the treatment and the impact of injected bone cement on the vertebra, an integrated modelling and simulation framework has been developed. The framework includes (i) the generation of computer models based on microCT images of human cancellous bone, (ii) CFD simulations of bone cement injection into the trabecular structure of a vertebral body as well as (iii) non-linear FEM simulations of the bone cement curing. Thereby, microstructural models of trabecular bone structures are employed. Furthermore, a detailed description of the material behaviour of acrylic bone cements is provided. More precisely, a non-linear fluid flow model is chosen for the representation of the bone cement behaviour during injection and a non-linear viscoelastic material mo...

  16. Delphes, a framework for fast simulation of a generic collider experiment

    CERN Document Server

    Ovyn, S; Lemaître, V

    2009-01-01

    It is always delicate to know whether theoretical predictions are visible and measurable in a high energy collider experiment due to the complexity of the related detectors, data acquisition chain and software. We introduce here a new C++ based framework, DELPHES, for fast simulation of a general-purpose experiment. The simulation includes a tracking system, embedded into a magnetic field, calorimetry and a muon system, and possible very forward detectors arranged along the beamline. The framework is interfaced to standard file formats (e.g. Les Houches Event File) and outputs observable objects for analysis, like missing transverse energy and collections of electrons or jets. The simulation of detector response takes into account the detector resolution, and usual reconstruction algorithms, such as FASTJET. A simplified preselection can also be applied on processed data for trigger emulation. Detection of very forward scattered particles relies on the transport in beamlines with the HECTOR software. Finally,...

  17. Prototyping a coherent framework for full, fast and parameteric detector simulation for the FCC project

    CERN Document Server

    Hrdinka, Julia; Salzburger, Andreas; Hegner, Benedikt

    2015-01-01

    The outstanding success of the physics program of the Large Hadron Collider (LHC) including the discovery of the Higgs boson shifted the focus of part of the high energy physics community onto the planning phase for future circular collider (FCC) projects. A proton-proton collider is in consideration, as well as an electron-positron ring and an electron-proton option as potential LHC successor projects. Common to all projects is the need for a coherent software framework in order to carry out simulation studies to establish the potential physics reach or to test different technology approaches. Detector simulation is a particularly necessary tool needed for design studies of different detector concepts and to allow establishing the relevant performance parameters. In addition, it allows generating data as input for the development of reconstruction algorithms needed to cope with the expected future environments. We present a coherent framework that combines full, fast and parametric detector simulation e...

  18. Implementation and performance of FDPS: A Framework Developing Parallel Particle Simulation Codes

    CERN Document Server

    Iwasawa, Masaki; Hosono, Natsuki; Nitadori, Keigo; Muranushi, Takayuki; Makino, Junichiro

    2016-01-01

    We have developed FDPS (Framework for Developing Particle Simulator), which enables researchers and programmers to develop high-performance parallel particle simulation codes easily. The basic idea of FDPS is to separate the program code for complex parallelization, including domain decomposition, redistribution of particles, and exchange of particle information for interaction calculation between nodes, from the actual interaction calculation and orbital integration. FDPS provides the former part and the users write the latter. Thus, a user can implement a high-performance fully parallelized N-body code in only 120 lines. In this paper, we present the structure and implementation of FDPS, and describe its performance on three sample applications: disk galaxy simulation, cosmological simulation and giant impact simulation. All codes show very good parallel efficiency and scalability on the K computer and XC30. FDPS lets the researchers concentrate on the implementation of physics and mathematical schemes, without wa...
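
    The division of labour FDPS enforces, framework code owning the loop and communication while the user supplies only the interaction kernel, can be caricatured in serial Python (hypothetical names; FDPS itself is C++ and additionally handles domain decomposition and inter-node exchange):

```python
import numpy as np

def run_nbody(pos, vel, mass, interaction, dt, n_steps):
    """Framework side: integration loop and bookkeeping. In FDPS this layer
    also owns parallelization; here it is a serial stand-in."""
    for _ in range(n_steps):
        acc = interaction(pos, mass)  # user-supplied physics kernel
        vel += acc * dt               # kick ...
        pos += vel * dt               # ... then drift
    return pos, vel

def gravity(pos, mass, eps=1e-2):
    """User side: softened pairwise gravity, the only code a user writes."""
    d = pos[None, :, :] - pos[:, None, :]  # pairwise separations
    r2 = (d ** 2).sum(-1) + eps ** 2
    np.fill_diagonal(r2, np.inf)           # no self-force
    return (mass[None, :, None] * d / r2[..., None] ** 1.5).sum(axis=1)

rng = np.random.default_rng(0)
pos, vel = rng.normal(size=(128, 3)), np.zeros((128, 3))
pos, vel = run_nbody(pos, vel, np.full(128, 1.0 / 128), gravity, 1e-3, 100)
print("centre of mass drift:", np.abs(pos.mean(axis=0)).max())
```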

  19. Flexible simulation framework to couple processes in complex 3D models for subsurface utilization assessment

    Science.gov (United States)

    Kempka, Thomas; Nakaten, Benjamin; De Lucia, Marco; Nakaten, Natalie; Otto, Christopher; Pohl, Maik; Tillner, Elena; Kühn, Michael

    2016-04-01

    Utilization of the geological subsurface for production and storage of hydrocarbons, chemical energy and heat as well as for waste disposal requires the quantification and mitigation of environmental impacts as well as the improvement of georesources utilization in terms of efficiency and sustainability. The development of tools for coupled process simulations is essential to tackle these challenges, since reliable assessments are only feasible by integrative numerical computations. Coupled processes at reservoir to regional scale determine the behaviour of reservoirs, faults and caprocks, generally demanding complex 3D geological models to be considered alongside available monitoring and experimental data in coupled numerical simulations. We have been developing a flexible numerical simulation framework that provides efficient workflows for integrating the required data and software packages to carry out coupled process simulations considering, e.g., multiphase fluid flow, geomechanics, geochemistry and heat. Simulation results are stored in structured data formats to allow for an integrated 3D visualization and result interpretation as well as data archiving and its provision to collaborators. The main benefits in using the flexible simulation framework are the integration of geological and grid data from any third-party software package as well as data export to generic 3D visualization tools and archiving formats. The coupling of the required process simulators in time and space is feasible, while different spatial dimensions in the coupled simulations can be integrated, e.g., 0D batch with 3D dynamic simulations. User interaction is established via high-level programming languages, while computational efficiency is achieved by using low-level programming languages. We present three case studies on the assessment of geological subsurface utilization based on different process coupling approaches and numerical simulations.
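
    A sequential, non-iterative coupling of the kind described, e.g., a 3D flow step followed by 0D batch chemistry over each coupling interval, reduces to a simple loop. A schematic sketch with placeholder physics (none of the function names come from the framework itself):

```python
# Hypothetical operator-splitting loop in the spirit of the framework:
# advance flow and geochemistry alternately over each coupling interval.

def advance_flow(state, dt):
    # placeholder for a 3D multiphase flow simulator step
    state["pressure"] = [p * 0.99 for p in state["pressure"]]
    return state

def advance_chemistry(state, dt):
    # placeholder for 0D batch reactions evaluated per grid cell
    state["mineral"] = [m + 1e-3 * dt for m in state["mineral"]]
    return state

state = {"pressure": [20.0, 19.5, 18.9], "mineral": [0.1, 0.1, 0.1]}
t, t_end, dt = 0.0, 10.0, 1.0
while t < t_end:
    state = advance_flow(state, dt)       # 3D dynamic step
    state = advance_chemistry(state, dt)  # 0D batch step, per cell
    t += dt                               # sequential non-iterative coupling
print(state)
```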

  20. Simulation-based Modeling Frameworks for Networked Multi-processor System-on-Chip

    DEFF Research Database (Denmark)

    Mahadevan, Shankar

    2006-01-01

    This thesis deals with modeling aspects of multi-processor system-on-chip (MpSoC) design affected by the on-chip interconnect, also called the Network-on-Chip (NoC), at various levels of abstraction. To begin with, we undertook a comprehensive survey of research and design practices of networked MpSoC. The survey presents the challenges of modeling and performance analysis of the hardware and the software components used in such devices. These challenges are further exacerbated in a mixed-abstraction workspace, which is typical of complex MpSoC design environments. We provide two simulation-based frameworks… and the RIPE framework allows easy incorporation of IP cores from either framework into a new instance of the design. This could pave the way for seamless design evaluation from system-level to cycle-true abstraction in future component-based MpSoC design practice.

  1. A Dynamic Simulation Analysis of Currency Substitution in a Optimizing Framework with Transactions Costs

    Directory of Open Access Journals (Sweden)

    Carlos Asilis

    1992-03-01

    Full Text Available This paper investigates the dynamic paths of inflation and real balances in a general equilibrium intertemporal optimization model with transactions costs and currency substitution, when budget deficits are financed by money creation. The results show that inflationary paths show more 'jumps' or explosions under the assumptions of lower transactions costs or an increasing degree of currency substitution. Even small changes in the degree of currency substitution with positive transactions costs sharply change the paths of inflation and real balances. Similarly, small changes in transactions costs for foreign currency, even without prior currency substitution, have marked effects on the paths of inflation and real balances. The results obtained from the simulated data are consistent with inflation processes in recent Latin American experience, where currency substitution may have taken place. Estimates of the simulated data for even a small degree of currency substitution generate generalized autoregressive conditionally heteroskedastic (GARCH) estimates of the inflation process, which are consistent with estimates for Argentina, Bolivia, Mexico, and Peru. In these countries currency substitution may have gone hand-in-hand with inflationary instability through money-financed fiscal deficits. Our results suggest that fiscal deficits financed by monetary expansion should be avoided under conditions of increasing financial openness, which provide greater opportunities for financial adaptation through currency substitution or lower transactions costs on foreign currency accumulation.

  2. Microworlds, Simulators, and Simulation: Framework for a Benchmark of Human Reliability Data Sources

    Energy Technology Data Exchange (ETDEWEB)

    Ronald Boring; Dana Kelly; Carol Smidts; Ali Mosleh; Brian Dyre

    2012-06-01

    In this paper, we propose a method to improve the data basis of human reliability analysis (HRA) by extending the data sources used to inform HRA methods. Currently, most HRA methods are based on limited empirical data, and efforts to enhance the empirical basis behind HRA methods have not yet yielded significant new data. Part of the reason behind this shortage of quality data is attributable to the data sources used. Data have been derived from unrelated industries, from infrequent risk-significant events, or from costly control room simulator studies. We propose a benchmark of four data sources: a simplified microworld simulator using unskilled student operators, a full-scope control room simulator using skilled student operators, a full-scope control room simulator using licensed commercial operators, and a human performance modeling and simulation system using virtual operators. The goal of this research is to compare findings across the data sources to determine to what extent data may be used and generalized from cost effective sources.

  3. A Hierarchical Framework for Visualising and Simulating Supply Chains in Virtual Environments

    Institute of Scientific and Technical Information of China (English)

    Hai-Yan Zhang; Zheng-Xu Zhao

    2005-01-01

    This paper presents research into applying virtual environment (VE) technology to supply chain management (SCM). Our research work has employed virtual manufacturing environments to represent supply chain nodes to simulate processes and activities in supply chain management. This will enable those who are involved in these processes and activities to gain an intuitive understanding of them, so as to design robust supply chains and make correct decisions at the right time. A framework system and its hierarchical structure for visualising and simulating supply chains in virtual environments are reported and detailed in this paper.

  4. A software framework for the portable parallelization of particle-mesh simulations

    DEFF Research Database (Denmark)

    Sbalzarini, I.F.; Walther, Jens Honore; Polasek, B.;

    2006-01-01

    Abstract: We present a software framework for the transparent and portable parallelization of simulations using particle-mesh methods. Particles are used to transport physical properties and a mesh is required in order to reinitialize the distorted particle locations, ensuring the convergence… range of applications, and it enables orders-of-magnitude increases in the number of computational elements employed in particle methods. We demonstrate the performance and scalability of the library on several problems, including the first-ever billion-particle simulation of diffusion in real biological…

  5. Simulation and real-time optimal scheduling: a framework for integration

    Energy Technology Data Exchange (ETDEWEB)

    Macal, C.M.; Nevins, M.R. [Argonne National Lab., IL (United States); Williams, M.K.; Joines, J.C. [Military Traffic Management Command Transportation Engineering Agency, Newport News, VA (United States)

    1997-02-01

    Traditional scheduling and simulation models of the same system differ in several fundamental respects. These include the definition of a schedule, the existence of an objective function which orders schedules and indicates the performance of a given schedule according to specific criteria, and the level of fidelity at which the items are represented and processed through the system. This paper presents a conceptual, object-oriented architecture for combining a traditional, high-level scheduling system with a detailed, process-level, discrete-event simulation. A multi-echelon planning framework is established in the context of modeling end-to-end military deployments with the focus on detailed seaport operations.

  6. Beyond illumination: An interactive simulation framework for non-visual and perceptual aspects of daylighting performance

    OpenAIRE

    Andersen, Marilyne; Guillemin, Antoine; Ámundadóttir, María Lovísa; Rockcastle, Siobhan Francois

    2013-01-01

    This paper presents a proof-of-concept for a goal-based simulation structure that could offer design support for daylighting performance aspects beyond conventional ones such as illumination, glare or solar gains. The framework uses a previously established visualization platform that simultaneously and interactively displays time-based daylighting performance alongside renderings, and relies on a goal-based approach. Two novel performance aspects are investigated in the present paper: health...

  7. SCENARIO ANALYSIS OF TECHNOLOGY PRODUCTS WITH AN AGENT-BASED SIMULATION AND DATA MINING FRAMEWORK

    OpenAIRE

    AMIT SHINDE; MOEED HAGHNEVIS; Janssen, Marco A.; GEORGE C. RUNGER; MANI JANAKIRAM

    2013-01-01

    A framework is presented to simulate and analyze the effect of multiple business scenarios on the adoption behavior of a group of technology products. Diffusion is viewed as an emergent phenomenon that results from the interaction of consumers. An agent-based model is used in which potential adopters of technology product are allowed to be influenced by their local interactions within the social network. Along with social influence, the effect of product features is important and we ascribe f...
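
    The paper's mechanism, adoption driven jointly by external influence and pressure from adopted neighbours in a local network, is the classic agent-based diffusion setup. A toy sketch on a ring network (all parameters and the topology are invented, not the paper's calibrated model):

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy diffusion of one product on a ring network: an agent adopts through
# external influence (p) or peer influence from adopted neighbours (q).
n, p, q = 500, 0.01, 0.15
adopted = np.zeros(n, dtype=bool)
neighbours = [np.array([(i + d) % n for d in (-2, -1, 1, 2)])
              for i in range(n)]                 # degree-4 ring lattice

for step in range(100):
    peer = np.array([adopted[nb].mean() for nb in neighbours])
    adopted |= rng.random(n) < p + q * peer      # adoption hazard this step
print(f"adoption after 100 steps: {adopted.mean():.1%}")
```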

  8. A dynamic subgrid-scale modeling framework for large eddy simulation using approximate deconvolution

    CERN Document Server

    Maulik, Romit

    2016-01-01

    We put forth a dynamic modeling framework for sub-grid parametrization of large eddy simulation of turbulent flows based upon the use of the approximate deconvolution procedure to compute the Smagorinsky constant self-adaptively from the resolved flow quantities. Our numerical assessments for solving the Burgers turbulence problem show that the proposed approach could be used as a viable tool to address the turbulence closure problem due to its flexibility.
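
    For reference, the closure being made dynamic is the standard Smagorinsky model; the deconvolution step estimates the unfiltered field from resolved data to set the constant on the fly. In the usual notation (assumed here, since the abstract gives no formulas):

```latex
% Smagorinsky eddy viscosity with a dynamically computed constant
\nu_t = (C_s \Delta)^2 \lvert \bar{S} \rvert,
\qquad
\lvert \bar{S} \rvert = \sqrt{2\, \bar{S}_{ij} \bar{S}_{ij}},
\qquad
\bar{S}_{ij} = \tfrac{1}{2}\!\left(
    \partial_j \bar{u}_i + \partial_i \bar{u}_j \right).
% Van Cittert approximate deconvolution estimates the unfiltered field,
%   u^\ast \approx \bar{u} + (\bar{u} - G\ast\bar{u}) + \dots,
% and C_s is then chosen self-adaptively from u^\ast at each time step.
```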

  9. Analysis of GEANT4 Physics List Properties in the 12 GeV MOLLER Simulation Framework

    Science.gov (United States)

    Haufe, Christopher; Moller Collaboration

    2013-10-01

    To determine the validity of new physics beyond the scope of the electroweak theory, nuclear physicists across the globe have been collaborating on future endeavors that will provide the precision needed to confirm these speculations. One of these is the MOLLER experiment - a low-energy particle experiment that will utilize the 12 GeV upgrade of Jefferson Lab's CEBAF accelerator. The motivation of this experiment is to measure the parity-violating asymmetry of scattered polarized electrons off unpolarized electrons in a liquid hydrogen target. This measurement would allow for a more precise determination of the electron's weak charge and weak mixing angle. While still in its planning stages, the MOLLER experiment requires a detailed simulation framework in order to determine how the project should be run in the future. The simulation framework for MOLLER, called "remoll", is built on GEANT4. As a result, the simulation can utilize a number of GEANT4 physics lists that provide the simulation with particle interaction constraints based on different particle physics models. By comparing these lists with one another using the data-analysis application ROOT, the most optimal physics list for the MOLLER simulation can be determined and implemented. This material is based upon work supported by the National Science Foundation under Grant No. 714001.

  10. Ximpol: a new X-ray polarimetry observation-simulation and analysis framework

    Science.gov (United States)

    Baldini, Luca; Muleri, Fabio; Soffitta, Paolo; Omodei, Nicola; Pesce-Rollins, Melissa; Sgro, Carmelo; Latronico, Luca; Spada, Francesca; Manfreda, Alberto; Di Lalla, Niccolo

    2016-07-01

    We present a new simulation framework, ximpol, based on the Python programming language and the Scipy stack, specifically developed for X-ray polarimetric applications. ximpol is designed to produce fast and yet realistic observation-simulations, given as basic inputs: (i) an arbitrary source model including morphological, temporal, spectral and polarimetric information, and (ii) the response functions of the detector under study, i.e., the effective area, the energy dispersion, the point-spread function and the modulation factor. The format of the response files is OGIP compliant, and the framework has the capability of producing output files that can be directly fed into the standard visualization and analysis tools used by the X-ray community, including XSPEC, which makes it a useful tool not only for simulating observations of astronomical sources, but also for developing and testing end-to-end analysis chains. In this contribution we shall give an overview of the basic architecture of the software. Although in principle the framework is not tied to any specific mission or instrument design, we shall present a few physically interesting case studies in the context of the XIPE mission phase study.
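
    The forward-folding step such a simulator performs, a source spectrum multiplied by the instrument response, integrated over energy bins and Poisson-sampled, is compact enough to sketch. All numbers below are invented placeholders, not XIPE's actual response files:

```python
import numpy as np

# Fold a toy power-law source through an assumed effective-area curve to
# get expected counts per energy bin, then draw one Poisson realization.
E = np.linspace(2.0, 8.0, 61)                       # keV bin edges
Emid = 0.5 * (E[:-1] + E[1:])
flux = 1e-2 * Emid ** -2.0                          # ph / cm^2 / s / keV
aeff = 200.0 * np.exp(-((Emid - 3.0) / 3.0) ** 2)   # cm^2, toy response
t_obs = 10_000.0                                    # seconds
mu = flux * aeff * np.diff(E) * t_obs               # expected counts per bin
counts = np.random.default_rng(3).poisson(mu)       # one simulated observation
print(f"total simulated counts: {counts.sum()}")
```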

  11. An Object-Oriented Framework for Versatile Finite Element Based Simulations of Neurostimulation

    Directory of Open Access Journals (Sweden)

    Edward T. Dougherty

    2016-01-01

    Full Text Available Computational simulations of transcranial electrical stimulation (TES) are commonly utilized by the neurostimulation community, and while vastly different TES application areas can be investigated, the mathematical equations and physiological characteristics that govern this research are identical. The goal of this work was to develop a robust software framework for TES that efficiently supports the spectrum of computational simulations routinely utilized by the TES community and in addition easily extends to support alternative neurostimulation research objectives. Using well-established object-oriented software engineering techniques, we have designed a software framework based upon the physical and computational aspects of TES. The framework's versatility is demonstrated with a set of diverse neurostimulation simulations that (i) reinforce the importance of using anisotropic tissue conductivities, (ii) demonstrate the enhanced precision of high-definition stimulation electrodes, and (iii) highlight the benefits of utilizing multigrid solution algorithms. Our approaches result in a framework that facilitates rapid prototyping of real-world, customized TES administrations and supports virtually any clinical, biomedical, or computational aspect of this treatment. Software reuse and maintainability are optimized, and in addition, the same code can be effortlessly augmented to provide support for alternative neurostimulation research endeavors.

  12. A 3D MPI-Parallel GPU-accelerated framework for simulating ocean wave energy converters

    Science.gov (United States)

    Pathak, Ashish; Raessi, Mehdi

    2015-11-01

    We present an MPI-parallel GPU-accelerated computational framework for studying the interaction between ocean waves and wave energy converters (WECs). The computational framework captures the viscous effects, nonlinear fluid-structure interaction (FSI), and breaking of waves around the structure, which cannot be captured in many potential flow solvers commonly used for WEC simulations. The full Navier-Stokes equations are solved using the two-step projection method, which is accelerated by porting the pressure Poisson equation to GPUs. The FSI is captured using the numerically stable fictitious domain method. A novel three-phase interface reconstruction algorithm is used to resolve three phases in a VOF-PLIC context. A consistent mass and momentum transport approach enables simulations at high density ratios. The accuracy of the overall framework is demonstrated via an array of test cases. Numerical simulations of the interaction between ocean waves and WECs are presented. Funding from the National Science Foundation CBET-1236462 grant is gratefully acknowledged.

  13. SMART: A New Semi-distributed Hydrologic Modelling Framework for Soil Moisture and Runoff Simulations

    Science.gov (United States)

    Ajami, Hoori; Sharma, Ashish

    2016-04-01

    A new GIS-based semi-distributed hydrological modelling framework is developed based upon the delineation of contiguous and topologically connected Hydrologic Response Units (HRUs). The Soil Moisture and Runoff simulation Toolkit (SMART) performs topographic and geomorphic analysis of a catchment and delineates HRUs in each first order sub-basin. This HRU delineation approach maintains lateral flow dynamics in first order sub-basins and is therefore suited for simulating runoff in upland catchments. Simulation elements in SMART are distributed cross sections or equivalent cross sections (ECS) in each first order sub-basin to represent hillslope hydrologic processes. Delineation of ECSs in SMART is performed by weighting the topographic and physiographic properties of the part or entire first-order sub-basin and has the advantage of reducing computational time/effort while maintaining reasonable accuracy in simulated hydrologic state and fluxes (e.g. soil moisture, evapotranspiration and runoff). The SMART workflow is written in MATLAB to automate the HRU and cross section delineations, model simulations across multiple cross sections, and post-processing of model outputs to visualize the results. The MATLAB Parallel Processing Toolbox is used for simultaneous simulations of cross sections and further reduces computational time. SMART workflow tasks are: 1) delineation of first order sub-basins of a catchment using a digital elevation model, 2) hillslope delineation, 3) landform delineation in every first order sub-basin based on topographic and geomorphic properties of a group of sub-basins or the entire catchment, 4) formulation of cross sections as well as equivalent cross sections in every first order sub-basin, and 5) deriving vegetation and soil parameters from spatially distributed land cover and soil information. The current version of SMART uses a 2-d distributed hydrological model based on Richards' equation. However, any hydrologic model can be

  14. DELPHES 3: a modular framework for fast simulation of a generic collider experiment

    Science.gov (United States)

    de Favereau, J.; Delaere, C.; Demin, P.; Giammanco, A.; Lemaître, V.; Mertens, A.; Selvaggi, M.

    2014-02-01

    Version 3.0 of the Delphes fast simulation is presented. The goal of Delphes is to allow the simulation of a multipurpose detector for phenomenological studies. The simulation includes a track propagation system embedded in a magnetic field, electromagnetic and hadron calorimeters, and a muon identification system. Physics objects that can be used for data analysis are then reconstructed from the simulated detector response. These include tracks and calorimeter deposits and high level objects such as isolated electrons, jets, taus, and missing energy. The new modular approach allows for greater flexibility in the design of the simulation and reconstruction sequence. New features such as the particle-flow reconstruction approach, crucial in the first years of the LHC, and pile-up simulation and mitigation, which is needed for the simulation of the LHC detectors in the near future, have also been implemented. The Delphes framework is not meant to be used for advanced detector studies, for which more accurate tools are needed. Although some aspects of Delphes are hadron collider specific, it is flexible enough to be adapted to the needs of electron-positron collider experiments.
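
    The parameterized approach that fast simulations of this kind embody, smearing generator-level quantities with resolution functions and applying efficiencies, looks roughly like this in miniature (a toy sketch; the resolution and efficiency numbers are invented, not a Delphes detector card):

```python
import numpy as np

rng = np.random.default_rng(11)

def smear_energy(E_true, stoch=0.5, const=0.03):
    """Calorimeter-like smearing: sigma/E = stoch/sqrt(E) (+) const."""
    sigma = E_true * np.sqrt((stoch / np.sqrt(E_true)) ** 2 + const ** 2)
    return rng.normal(E_true, sigma)

E_true = rng.exponential(50.0, size=1000) + 5.0  # toy particle energies, GeV
E_reco = smear_energy(E_true)
kept = rng.random(E_true.size) < 0.95            # flat toy reco efficiency
print(f"reconstructed {kept.sum()} objects, "
      f"relative resolution {np.std((E_reco - E_true) / E_true):.3f}")
```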

  15. A framework of passive millimeter-wave imaging simulation for typical ground scenes

    Science.gov (United States)

    Yan, Luxin; Ge, Rui; Zhong, Sheng

    2009-10-01

    Passive millimeter-wave (PMMW) imaging offers advantages over visible and IR imaging in having better all-weather performance. However, since PMMW imaging sensors remain state-of-the-art devices, it is often necessary to predict and evaluate the performance of a PMMW sensor under a variety of weather, terrain and sensor operational conditions, and PMMW scene simulation is an efficient way to do so. This paper proposes a framework for PMMW simulation of ground scenes. Commercial scene modeling software, Multigen and Vega, are used to generate the multi-viewpoint and multi-scale description of natural ground scenes with visible images. The background and objects in the scene are classified based on perceptive color clusters and mapped with different materials. Further, the radiometric temperature images of the scene are calculated according to millimeter wave phenomenology: atmospheric propagation and emission including sky temperature, weather conditions, and physical temperature. Finally, the simulated output PMMW images are generated by applying the sensor characteristics such as the aperture size, data sample scheme and system noise. Tentative results show the simulation framework can provide reasonable PMMW images of a scene with high fidelity.

  16. A simulation framework for modeling tumor control probability in breast conserving therapy

    International Nuclear Information System (INIS)

    Background and purpose: Microscopic disease (MSD) left after tumorectomy is a major cause of local recurrence in breast conserving therapy (BCT). However, the effect of microscopic disease and RT dose on tumor control probability (TCP) has seldom been studied quantitatively. A simulation framework was therefore constructed to explore the relationship between tumor load, radiation dose and TCP. Materials and methods: First, we modeled total disease load and microscopic spread with a pathology dataset. Then we estimated the remaining disease load after tumorectomy through surgery simulation. The Webb–Nahum TCP model was extended with a clonogenic cell fraction to model the risk of local recurrence. The model parameters were estimated by fitting the simulated results to the observations in two clinical trials. Results: Higher histopathology grade has a strong correlation with larger MSD cell quantity. On average 12.5% of the MSD cells remained in the patient's breast after surgery but varied considerably among patients (0–100%), illustrating the role of radiotherapy. A small clonogenic cell fraction was optimal in our model (one in every 2.7 × 10⁶ cells). The mean radiosensitivity was estimated at 0.067 Gy⁻¹ with a standard deviation of 0.022 Gy⁻¹. Conclusion: A relationship between radiation dose and TCP was established in a newly designed simulation framework with detailed disease load, surgery and radiotherapy models
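
    For orientation, TCP models of the Webb–Nahum type are Poisson-statistics models averaged over a population distribution of radiosensitivity; a sketch in assumed notation (the paper's exact extension by clonogenic cell fraction is not reproduced):

```latex
% Poisson-type TCP: N surviving clonogenic cells, radiosensitivity
% \alpha, total dose D.
\mathrm{TCP}(\alpha, D) = \exp\!\left( -N\, e^{-\alpha D} \right)
% Population averaging over normally distributed radiosensitivity:
\overline{\mathrm{TCP}}(D) = \int \mathrm{TCP}(\alpha, D)\,
  \mathcal{N}(\alpha;\, \bar{\alpha}, \sigma_\alpha^2)\, \mathrm{d}\alpha,
\quad \bar{\alpha} = 0.067~\mathrm{Gy}^{-1},\;
      \sigma_\alpha = 0.022~\mathrm{Gy}^{-1}.
```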

  17. gadfly: A pandas-based Framework for Analyzing GADGET Simulation Data

    CERN Document Server

    Hummel, Jacob

    2016-01-01

    We present the first public release (v0.1) of the open-source GADGET Dataframe Library: gadfly. The aim of this package is to leverage the capabilities of the broader python scientific computing ecosystem by providing tools for analyzing simulation data from the astrophysical simulation codes GADGET and GIZMO using pandas, a thoroughly documented, open-source library providing high-performance, easy-to-use data structures that is quickly becoming the standard for data analysis in python. Gadfly is a framework for analyzing particle-based simulation data stored in the HDF5 format using pandas DataFrames. The package enables efficient memory management, includes utilities for unit handling, coordinate transformations, and parallel batch processing, and provides highly optimized routines for visualizing smoothed-particle hydrodynamics (SPH) datasets.
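
    The gadfly workflow boils down to mapping HDF5 particle datasets onto DataFrames. A minimal sketch of the same idea using h5py and pandas directly (the snapshot file name is a placeholder; the dataset names follow the common GADGET HDF5 layout rather than gadfly's own API):

```python
import h5py
import pandas as pd

# Read gas-particle fields from a GADGET/GIZMO HDF5 snapshot into a
# DataFrame; 'snapshot_000.hdf5' is a placeholder path.
with h5py.File("snapshot_000.hdf5", "r") as snap:
    gas = snap["PartType0"]
    df = pd.DataFrame({
        "x": gas["Coordinates"][:, 0],
        "y": gas["Coordinates"][:, 1],
        "z": gas["Coordinates"][:, 2],
        "density": gas["Density"][:],
    })

# pandas then gives fast selection and aggregation for free:
dense = df[df["density"] > df["density"].quantile(0.99)]
print(dense.describe())
```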

  18. A Framework for Interactive Work Design based on Digital Work Analysis and Simulation

    CERN Document Server

    Ma, Liang; Fu, Huanzhang; Guo, Yang; Chablat, Damien; Bennis, Fouad; 10.1002/hfm.20178

    2010-01-01

    Due to the flexibility and adaptability of humans, manual handling work is still very important in industry, especially for assembly and maintenance work. Well-designed work operations can improve work efficiency and quality, enhance safety, and lower cost. Most traditional methods for work system analysis need a physical mock-up and are time consuming. Digital mock-up (DMU) and digital human modeling (DHM) techniques have been developed to assist ergonomic design and evaluation for a specific worker population (e.g., the 95th percentile); however, the operation adaptability and adjustability for a specific individual are not considered enough. In this study, a new framework based on motion tracking and digital human simulation techniques is proposed for motion-time analysis of manual operations. A motion tracking system is used to track a worker's operation while he/she is conducting manual handling work. The motion data is transferred to a simulation computer for real-time digital human simulation. The data ...

  19. A Framework for Teaching Programming on the Internet: A Web-Based Simulation Approach

    Directory of Open Access Journals (Sweden)

    Yousif A. Bastaki

    2012-01-01

    Full Text Available Problem statement: This research study describes the process of developing a web-based framework for simulating programming language activities on the Internet, in an interactive way, by enabling executable programs to perform their function automatically. Approach: The interaction process is carried out using Java applets. It emphasizes the importance of building the web-based architecture of the proposed simulation model. Results: The research concentrates on developing programming courses on the Internet to contribute to the distribution of education for the benefit of learners. We emphasize introducing interactivity between the user and the programming environment. Conclusion: The project is at its first phase and is still under development, but we hope that the design of the course and the interactivity that the Java applets provide by simulating the run of executable C++ code will appeal to our users.

  20. Lattice Boltzmann Simulations in the Slip and Transition Flow Regime with the Peano Framework

    KAUST Repository

    Neumann, Philipp

    2012-01-01

    We present simulation results of flows in the finite Knudsen range, which is in the slip and transition flow regime. Our implementations are based on the Lattice Boltzmann method and are accomplished within the Peano framework. We validate our code by solving two- and three-dimensional channel flow problems and compare our results with respective experiments from other research groups. We further apply our Lattice Boltzmann solver to the geometrical setup of a microreactor consisting of differently sized channels and a reactor chamber. Here, we apply static adaptive grids to further reduce computational costs. We further investigate the influence of using a simple BGK collision kernel in coarse grid regions which are further away from the slip boundaries. Our results are in good agreement with theory and non-adaptive simulations, demonstrating the validity and the capabilities of our adaptive simulation software for flow problems at finite Knudsen numbers.
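
    The BGK kernel mentioned here is the simplest Lattice Boltzmann collision model: relax the distributions toward a local equilibrium, then stream. A textbook D2Q9 sketch of that update (a generic periodic kernel, not the Peano code or its Knudsen-regime boundary treatment):

```python
import numpy as np

# D2Q9 lattice: velocity set, weights, and BGK relaxation time.
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
w = np.array([4 / 9] + [1 / 9] * 4 + [1 / 36] * 4)
tau = 0.8  # relaxation time (sets the viscosity)

def equilibrium(rho, u):
    cu = np.einsum("qd,xyd->xyq", c, u)      # c_q . u at each node
    usq = (u ** 2).sum(-1)[..., None]
    return rho[..., None] * w * (1 + 3 * cu + 4.5 * cu ** 2 - 1.5 * usq)

nx, ny = 64, 32
rho0 = 1.0 + 0.05 * np.sin(2 * np.pi * np.arange(nx) / nx)[:, None] \
       * np.ones((1, ny))                    # small density wave
f = equilibrium(rho0, np.zeros((nx, ny, 2)))
for _ in range(100):
    rho = f.sum(-1)
    u = np.einsum("xyq,qd->xyd", f, c) / rho[..., None]
    f += (equilibrium(rho, u) - f) / tau     # BGK collision
    for q in range(9):                       # periodic streaming
        f[..., q] = np.roll(f[..., q], shift=tuple(c[q]), axis=(0, 1))
print("mass conserved:", np.isclose(f.sum(), nx * ny))
```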

  1. A modular modelling framework for hypotheses testing in the simulation of urbanisation

    CERN Document Server

    Cottineau, Clementine; Chapron, Paul; Coyrehourcq, Sebastien Rey; Pumain, Denise

    2015-01-01

    In this paper, we present a modelling experiment developed to study systems of cities and processes of urbanisation in large territories over long time spans. Building on geographical theories of urban evolution, we rely on agent-based models to 1/ formalise complementary and alternative hypotheses of urbanisation and 2/ explore their ability to simulate observed patterns in a virtual laboratory. The paper is therefore divided into two sections: an overview of the mechanisms implemented to represent competing hypotheses used to simulate urban evolution; and an evaluation of the resulting model structures in their ability to simulate, efficiently and parsimoniously, a system of cities (the Former Soviet Union) over several periods of time (before and after the crash of the USSR). We do so using a modular framework of model-building and evolutionary algorithms for the calibration of several model structures. This project aims at tackling equifinality in systems dynamics by confronting different mechanisms wi...

  2. Automated Object-Oriented Simulation Framework for Modelling of Superconducting Magnets at CERN

    CERN Document Server

    Maciejewski, Michał; Bartoszewicz, Andrzej

    The thesis aims at designing a flexible, extensible, user-friendly interface to model electro-thermal transients occurring in superconducting magnets. Simulations are a fundamental tool for assessing the performance of a magnet and its protection system against the effects of a quench. The application is created using a scalable and modular architecture based on the object-oriented programming paradigm, which opens an easy way for future extensions. What is more, each model composed of thousands of blocks is automatically created in MATLAB/Simulink. Additionally, the user is able to automatically run sets of simulations with varying parameters. Due to its scalability and modularity, the framework can be easily used to simulate a wide range of materials and magnet configurations.

  3. gadfly: A pandas-based Framework for Analyzing GADGET Simulation Data

    Science.gov (United States)

    Hummel, Jacob A.

    2016-11-01

    We present the first public release (v0.1) of the open-source gadget Dataframe Library: gadfly. The aim of this package is to leverage the capabilities of the broader python scientific computing ecosystem by providing tools for analyzing simulation data from the astrophysical simulation codes gadget and gizmo using pandas, a thoroughly documented, open-source library providing high-performance, easy-to-use data structures that is quickly becoming the standard for data analysis in python. Gadfly is a framework for analyzing particle-based simulation data stored in the HDF5 format using pandas DataFrames. The package enables efficient memory management, includes utilities for unit handling, coordinate transformations, and parallel batch processing, and provides highly optimized routines for visualizing smoothed-particle hydrodynamics data sets.

  4. A higher-order numerical framework for stochastic simulation of chemical reaction systems.

    KAUST Repository

    Székely, Tamás

    2012-07-15

    BACKGROUND: In this paper, we present a framework for improving the accuracy of fixed-step methods for Monte Carlo simulation of discrete stochastic chemical kinetics. Stochasticity is ubiquitous in many areas of cell biology, for example in gene regulation, biochemical cascades and cell-cell interaction. However most discrete stochastic simulation techniques are slow. We apply Richardson extrapolation to the moments of three fixed-step methods, the Euler, midpoint and θ-trapezoidal τ-leap methods, to demonstrate the power of stochastic extrapolation. The extrapolation framework can increase the order of convergence of any fixed-step discrete stochastic solver and is very easy to implement; the only condition for its use is knowledge of the appropriate terms of the global error expansion of the solver in terms of its stepsize. In practical terms, a higher-order method with a larger stepsize can achieve the same level of accuracy as a lower-order method with a smaller one, potentially reducing the computational time of the system. RESULTS: By obtaining a global error expansion for a general weak first-order method, we prove that extrapolation can increase the weak order of convergence for the moments of the Euler and the midpoint τ-leap methods, from one to two. This is supported by numerical simulations of several chemical systems of biological importance using the Euler, midpoint and θ-trapezoidal τ-leap methods. In almost all cases, extrapolation results in an improvement of accuracy. As in the case of ordinary and stochastic differential equations, extrapolation can be repeated to obtain even higher-order approximations. CONCLUSIONS: Extrapolation is a general framework for increasing the order of accuracy of any fixed-step stochastic solver. This enables the simulation of complicated systems in less time, allowing for more realistic biochemical problems to be solved.
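
    The extrapolation idea in its simplest form, written in assumed notation since the abstract gives no formulas: if a fixed-step solver has a global weak error expansion in its stepsize, combining two stepsizes cancels the leading term.

```latex
% Weak (moment) error expansion of a first-order fixed-step solver:
\mathbb{E}[\hat{\mu}_h] = \mu + c_1 h + c_2 h^2 + \cdots
% Richardson extrapolation cancels the leading error term:
\hat{\mu}_{\mathrm{ex}} = 2\,\mathbb{E}[\hat{\mu}_{h/2}]
  - \mathbb{E}[\hat{\mu}_h] = \mu + \mathcal{O}(h^2),
% raising the weak order of the Euler and midpoint tau-leap
% methods from one to two.
```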

  5. A generic framework to simulate realistic lung, liver and renal pathologies in CT imaging

    International Nuclear Information System (INIS)

    Realistic three-dimensional (3D) mathematical models of subtle lesions are essential for many computed tomography (CT) studies focused on performance evaluation and optimization. In this paper, we develop a generic mathematical framework that describes the 3D size, shape, contrast, and contrast-profile characteristics of a lesion, as well as a method to create lesion models based on CT data of real lesions. Further, we implemented a technique to insert the lesion models into CT images in order to create hybrid CT datasets. This framework was used to create a library of realistic lesion models and corresponding hybrid CT images. The goodness of fit of the models was assessed using the coefficient of determination (R²) and the visual appearance of the hybrid images was assessed with an observer study using images of both real and simulated lesions and receiver operator characteristic (ROC) analysis. The average R² of the lesion models was 0.80, implying that the models provide a good fit to real lesion data. The area under the ROC curve was 0.55, implying that the observers could not readily distinguish between real and simulated lesions. Therefore, we conclude that the lesion-modeling framework presented in this paper can be used to create realistic lesion models and hybrid CT images. These models could be instrumental in performance evaluation and optimization of novel CT systems.

  6. Evaluation of a performance appraisal framework for radiation therapists in planning and simulation

    International Nuclear Information System (INIS)

    Constantly evolving technology and techniques within radiation therapy require practitioners to maintain a continuous approach to professional development and training. Systems of performance appraisal and adoption of regular feedback mechanisms are vital to support this development, yet frequently lack structure and rely on informal peer support. A Radiation Therapy Performance Appraisal Framework (RT-PAF) for radiation therapists in planning and simulation was developed to define expectations of practice and promote a supportive and objective culture of performance and skills appraisal. Evaluation of the framework was conducted via an anonymous online survey tool. Nine peer reviewers and fourteen recipients provided feedback on its effectiveness and the challenges and limitations of the approach. Findings from the evaluation were positive and suggested that both groups gained benefit from and expressed a strong interest in embedding the approach more routinely. Respondents identified common challenges related to the limited ability to implement suggested development strategies; this was strongly associated with time and rostering issues. This framework successfully defined expectations for practice and provided a fair and objective feedback process that focussed on skills development. It empowered staff to maintain their skills and reach their professional potential. Management support, particularly in regard to provision of protected time, was highlighted as critical to the framework's ongoing success. The demonstrated benefits arising in terms of staff satisfaction and development highlight the importance of this commitment to the modern radiation therapy workforce.

  7. Evaluation of a performance appraisal framework for radiation therapists in planning and simulation

    Energy Technology Data Exchange (ETDEWEB)

    Becker, Jillian, E-mail: jillian.becker@health.qld.gov.au [Radiation Oncology Mater Centre, South Brisbane, Queensland (Australia)]; Bridge, Pete [School of Clinical Sciences, Queensland University of Technology, Brisbane, Queensland (Australia)]; Brown, Elizabeth; Lusk, Ryan; Ferrari-Anderson, Janet [Radiation Oncology, Princess Alexandra Hospital, Brisbane, Queensland (Australia); Radiation Oncology Mater Centre, South Brisbane, Queensland (Australia)]

    2015-06-15

    Constantly evolving technology and techniques within radiation therapy require practitioners to maintain a continuous approach to professional development and training. Systems of performance appraisal and adoption of regular feedback mechanisms are vital to support this development, yet frequently lack structure and rely on informal peer support. A Radiation Therapy Performance Appraisal Framework (RT-PAF) for radiation therapists in planning and simulation was developed to define expectations of practice and promote a supportive and objective culture of performance and skills appraisal. Evaluation of the framework was conducted via an anonymous online survey tool. Nine peer reviewers and fourteen recipients provided feedback on its effectiveness and the challenges and limitations of the approach. Findings from the evaluation were positive and suggested that both groups gained benefit from and expressed a strong interest in embedding the approach more routinely. Respondents identified common challenges related to the limited ability to implement suggested development strategies; this was strongly associated with time and rostering issues. This framework successfully defined expectations for practice and provided a fair and objective feedback process that focussed on skills development. It empowered staff to maintain their skills and reach their professional potential. Management support, particularly in regard to provision of protected time, was highlighted as critical to the framework's ongoing success. The demonstrated benefits arising in terms of staff satisfaction and development highlight the importance of this commitment to the modern radiation therapy workforce.

  8. A unified framework for spiking and gap-junction interactions in distributed neuronal network simulations

    Directory of Open Access Journals (Sweden)

    Jan Hahne

    2015-09-01

    Full Text Available Contemporary simulators for networks of point and few-compartment model neurons come with a plethora of ready-to-use neuron and synapse models and support complex network topologies. Recent technological advancements have broadened the spectrum of application further to the efficient simulation of brain-scale networks on supercomputers. In distributed network simulations the amount of spike data that accrues per millisecond and process is typically low, such that a common optimization strategy is to communicate spikes at relatively long intervals, where the upper limit is given by the shortest synaptic transmission delay in the network. This approach is well-suited for simulations that employ only chemical synapses but it has so far impeded the incorporation of gap-junction models, which require instantaneous neuronal interactions. Here, we present a numerical algorithm based on a waveform-relaxation technique which allows for network simulations with gap junctions in a way that is compatible with the delayed communication strategy. Using a reference implementation in the NEST simulator, we demonstrate that the algorithm and the required data structures can be smoothly integrated with existing code such that they complement the infrastructure for spiking connections. To show that the unified framework for gap-junction and spiking interactions achieves high performance and delivers high accuracy...
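
    A toy Jacobi waveform-relaxation iteration for two leaky units coupled by a gap junction (Python; invented parameters, far simpler than the scheme in the paper): within each communication interval the partner's waveform from the previous sweep is held fixed, and sweeps repeat until the exchanged waveforms converge.

        import numpy as np

        tau, g, h, steps = 10.0, 0.2, 0.1, 50  # assumed toy parameters
        v1 = np.zeros(steps + 1); v1[0] = 1.0
        v2 = np.zeros(steps + 1); v2[0] = -1.0

        for sweep in range(20):  # waveform-relaxation sweeps
            v1_old, v2_old = v1.copy(), v2.copy()
            for k in range(steps):  # explicit Euler within the interval
                v1[k + 1] = v1[k] + h * (-v1[k] / tau + g * (v2_old[k] - v1[k]))
                v2[k + 1] = v2[k] + h * (-v2[k] / tau + g * (v1_old[k] - v2[k]))
            if max(np.abs(v1 - v1_old).max(), np.abs(v2 - v2_old).max()) < 1e-10:
                break

        print(sweep, v1[-1], v2[-1])  # sweeps needed and final values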

  9. LUsim: A Framework for Simulation-Based Performance Modeling and Prediction of Parallel Sparse LU Factorization

    Energy Technology Data Exchange (ETDEWEB)

    Univ. of California, San Diego; Cicotti, Pietro; Li, Xiaoye Sherry; Baden, Scott B.

    2008-04-15

    Sparse parallel factorization is among the most complicated and irregular algorithms to analyze and optimize. Performance depends both on system characteristics such as the floating point rate, the memory hierarchy, and the interconnect performance, as well as input matrix characteristics such as the number and location of nonzeros. We present LUsim, a simulation framework for modeling the performance of sparse LU factorization. Our framework uses micro-benchmarks to calibrate the parameters of machine characteristics and additional tools to facilitate real-time performance modeling. We are using LUsim to analyze an existing parallel sparse LU factorization code, and to explore a latency tolerant variant. We developed and validated a model of the factorization in SuperLU_DIST, then we modeled and implemented a new variant of slud, replacing a blocking collective communication phase with a non-blocking asynchronous point-to-point one. Our strategy realized a mean improvement of 11 percent over a suite of test matrices.
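
    The flavour of such micro-benchmark-calibrated modeling can be sketched as follows (Python; all machine parameters and the cost formula are invented placeholders, not LUsim's model):

        # machine parameters, as would be calibrated by micro-benchmarks
        FLOP_RATE = 8.0e9   # floating-point ops/s, e.g. from a DGEMM benchmark
        BANDWIDTH = 5.0e9   # bytes/s, e.g. from a memory/interconnect benchmark
        LATENCY = 2.0e-6    # seconds per message, e.g. from a ping-pong benchmark

        def step_time(flops, bytes_moved, messages):
            # predicted wall time of one factorization step
            return flops / FLOP_RATE + bytes_moved / BANDWIDTH + messages * LATENCY

        # e.g. a dense update on a 256x256 block, sending one message
        n = 256
        print(step_time(2 * n ** 3, 8 * n * n, messages=1))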

  10. Introducing FACETS, the Framework Application for Core-Edge Transport Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Cary, John R. [Tech-X Corporation]; Candy, Jeff [General Atomics]; Cohen, Ronald H. [Lawrence Livermore National Laboratory (LLNL)]; Krasheninnikov, Sergei I [ORNL]; McCune, Douglas C [ORNL]; Estep, Donald J [Colorado State University, Fort Collins]; Larson, Jay W [ORNL]; Malony, Allen [University of Oregon]; Worley, Patrick H [ORNL]; Carlsson, Johann Anders [ORNL]; Hakim, A H [Tech-X Corporation]; Hamill, P [Tech-X Corporation]; Kruger, Scott E [ORNL]; Muzsala, S [Tech-X Corporation]; Pletzer, Alexander [ORNL]; Shasharina, Svetlana [Tech-X Corporation]; Wade-Stein, D [Tech-X Corporation]; Wang, N [Tech-X Corporation]; McInnes, Lois C [ORNL]; Wildey, T [Tech-X Corporation]; Casper, T. A. [Lawrence Livermore National Laboratory (LLNL)]; Diachin, Lori A [ORNL]; Epperly, Thomas [Lawrence Livermore National Laboratory (LLNL)]; Rognlien, T. D. [Lawrence Livermore National Laboratory (LLNL)]; Fahey, Mark R [ORNL]; Kuehn, Jeffery A [ORNL]; Morris, A [University of Oregon]; Shende, Sameer [University of Oregon]; Feibush, E [Tech-X Corporation]; Hammett, Gregory W [ORNL]; Indireshkumar, K [Tech-X Corporation]; Ludescher, C [Tech-X Corporation]; Randerson, L [Tech-X Corporation]; Stotler, D. [Princeton Plasma Physics Laboratory (PPPL)]; Pigarov, A [University of California, San Diego]; Bonoli, P. [Massachusetts Institute of Technology (MIT)]; Chang, C S [New York University]; D'Ippolito, D. A. [Lodestar Research Corporation]; Colella, Philip [Lawrence Berkeley National Laboratory (LBNL)]; Keyes, David E [Columbia University]; Bramley, R [Indiana University]; Myra, J. R. [Lodestar Research Corporation]

    2007-06-01

    The FACETS (Framework Application for Core-Edge Transport Simulations) project began in January 2007 with the goal of providing core to wall transport modeling of a tokamak fusion reactor. This involves coupling previously separate computations for the core, edge, and wall regions. Such a coupling is primarily through connection regions of lower dimensionality. The project has started developing a component-based coupling framework to bring together models for each of these regions. In the first year, the core model will be a 1-dimensional model (1D transport across flux surfaces coupled to a 2D equilibrium) with fixed equilibrium. The initial edge model will be the fluid model, UEDGE, but inclusion of kinetic models is planned for the out years. The project also has an embedded Scientific Application Partnership that is examining embedding a full-scale turbulence model for obtaining the cross-surface fluxes into a core transport code.

  11. The ADAQ framework: An integrated toolkit for data acquisition and analysis with real and simulated radiation detectors

    Science.gov (United States)

    Hartwig, Zachary S.

    2016-04-01

    The ADAQ framework is a collection of software tools that is designed to streamline the acquisition and analysis of radiation detector data produced in modern digital data acquisition (DAQ) systems and in Monte Carlo detector simulations. The purpose of the framework is to maximize user scientific productivity by minimizing the effort and expertise required to fully utilize radiation detectors in a variety of scientific and engineering disciplines. By using a single set of tools to span the real and simulation domains, the framework eliminates redundancy and provides an integrated workflow for high-fidelity comparison between experimental and simulated detector performance. Built on the ROOT data analysis framework, the core of the ADAQ framework is a set of C++ and Python libraries that enable high-level control of digital DAQ systems and detector simulations with data stored into standardized binary ROOT files for further analysis. Two graphical user interface programs utilize the libraries to create powerful tools: ADAQAcquisition handles control and readout of real-world DAQ systems and ADAQAnalysis provides data analysis and visualization methods for experimental and simulated data. At present, the ADAQ framework supports digital DAQ hardware from CAEN S.p.A. and detector simulations performed in Geant4; however, the modular design will facilitate future extension to other manufacturers and simulation platforms.

  12. Delphes, a framework for fast simulation of a general purpose LHC detector

    International Nuclear Information System (INIS)

    Knowing whether theoretical predictions are visible and measurable in a High Energy experiment is always delicate, due to the complexity of the related detectors, DAQ chain and software. We introduce here a new framework, Delphes, for fast simulation of a general purpose experiment. The simulation includes a tracking system, embedded into a magnetic field, calorimetry and a muon system, and possible very forward detectors arranged along the beamline. The framework is interfaced to standard file formats from event generators (e.g. Les Houches Event File) and outputs observable analysis data objects, like missing transverse energy and collections of electrons or jets. The simulation of the detector response takes into account the detector resolution, and usual reconstruction algorithms for complex objects, like FastJet. A simplified preselection can also be applied on processed data for trigger emulation. Detection of very forward scattered particles relies on the transport in beamlines with the Hector software. Finally, the FROG 2D/3D event display is used for visualisation of the collision final states. An overview of Delphes is given as well as a few use-cases for illustration.
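
    The heart of such parameterised fast simulation can be illustrated with a generic sketch (Python; not Delphes code, resolution constants invented): generator-level energies are smeared with an assumed calorimeter resolution sigma(E)/E = a/sqrt(E) + b.

        import math, random

        def smear_energy(e_true, a=0.1, b=0.01):
            sigma = e_true * (a / math.sqrt(e_true) + b)
            return max(0.0, random.gauss(e_true, sigma))

        random.seed(1)
        electrons = [25.0, 60.0, 110.0]  # generator-level energies in GeV
        reco = [smear_energy(e) for e in electrons]
        print([round(e, 2) for e in reco])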

  13. DDG4 A Simulation Framework based on the DD4hep Detector Description Toolkit

    Science.gov (United States)

    Frank, M.; Gaede, F.; Nikiforou, N.; Petric, M.; Sailer, A.

    2015-12-01

    The detector description is an essential component that has to be used to analyse and simulate data resulting from particle collisions in high energy physics experiments. Based on the DD4hep detector description toolkit, a flexible and data driven simulation framework was designed using the Geant4 toolkit. We present this framework and describe the guiding requirements and the architectural design, which was strongly driven by ease of use. The goal was, given an existing detector description, to simulate the detector response to particle collisions in high energy physics experiments with minimal effort, while not imposing restrictions on support for enhanced or improved behaviour. Starting from the ROOT based geometry implementation used by DD4hep, an automatic conversion mechanism to Geant4 was developed. The physics response and the mechanism to input particle data from generators were highly formalized and can be instantiated on demand using known factory patterns. A palette of components to model the detector response is provided by default, but improved or more sophisticated components may easily be added using the factory pattern. Only the final configuration of the instantiated components has to be provided by end-users, using either C++ or Python scripting or an XML based description.
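
    The factory pattern mentioned can be sketched generically (Python rather than the C++ of the toolkit; all names invented): components register under a name and are instantiated on demand from a configuration, so improved variants can be swapped in freely.

        FACTORIES = {}

        def register(name):
            def wrap(cls):
                FACTORIES[name] = cls  # record the component under its name
                return cls
            return wrap

        @register("GaussianSmearing")
        class GaussianSmearing:
            def __init__(self, sigma=0.05):
                self.sigma = sigma

        # e.g. parsed from an XML or Python configuration file
        config = [("GaussianSmearing", {"sigma": 0.02})]
        components = [FACTORIES[name](**args) for name, args in config]
        print(components[0].sigma)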

  14. GNU polyxmass: a software framework for mass spectrometric simulations of linear (bio-)polymeric analytes

    Directory of Open Access Journals (Sweden)

    Rusconi Filippo

    2006-04-01

    Full Text Available Abstract Background Nowadays, a variety of (bio-)polymers can be analyzed by mass spectrometry. The detailed interpretation of the spectra requires a huge number of "hypothesis cycles", comprising the following three actions: 1) put forth a structural hypothesis, 2) test it, 3) (in)validate it. This time-consuming and painstaking data scrutiny is alleviated by using specialized software tools. However, all the software tools available to date are polymer chemistry-specific. This imposes a heavy overhead on researchers who do mass spectrometry on a variety of (bio-)polymers, as each polymer type will require a different software tool to perform data simulations and analyses. We developed software to address the lack of an integrated software framework able to deal with different polymer chemistries. Results The GNU polyxmass software framework performs common (bio-)chemical simulations–along with simultaneous mass spectrometric calculations–for any kind of linear (bio-)polymeric analyte (DNA, RNA, saccharides or proteins). The framework is organized into three modules, all accessible from one single binary program. The modules let the user 1) define brand new polymer chemistries, 2) perform quick mass calculations using a desktop calculator paradigm, 3) graphically edit polymer sequences and perform (bio-)chemical/mass spectrometric simulations. Any aspect of the mass calculations, polymer chemistry reactions or graphical polymer sequence editing is configurable. Conclusion The scientist who uses mass spectrometry to characterize (bio-)polymeric analytes of different chemistries is provided with a single software framework for his data prediction/analysis needs, whatever the polymer chemistry being involved.
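
    The chemistry-agnostic mass calculation at the core of such a framework can be sketched as follows (Python; a polymer chemistry reduces to a monomer-mass table plus end-cap masses; the residue masses below are standard monoisotopic values for three amino acids):

        # a "chemistry" is just a monomer-mass table; proteins, DNA, or any
        # user-defined chemistry can be handled by the same routine
        PROTEIN = {"G": 57.02146, "A": 71.03711, "S": 87.03203}
        H2O = 18.010565  # caps for the free termini of a peptide

        def polymer_mass(sequence, chemistry, end_caps=H2O):
            return sum(chemistry[residue] for residue in sequence) + end_caps

        print(polymer_mass("GASGA", PROTEIN))  # monoisotopic peptide mass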

  15. The Application of Modeling and Simulation in Capacity Management within the ITIL Framework

    Science.gov (United States)

    Rahmani, Sonya; vonderHoff, Otto

    2010-01-01

    Tightly integrating modeling and simulation techniques into Information Technology Infrastructure Library (ITIL) practices can be one of the driving factors behind a successful and cost-effective capacity management effort for any Information Technology (IT) system. ITIL is a best practices framework for managing IT infrastructure, development and operations. Translating ITIL theory into operational reality can be a challenge. This paper aims to highlight how to best integrate modeling and simulation into an ITIL implementation. For cases where the project team initially has difficulty gaining consensus on investing in modeling and simulation resources, a clear definition of M&S implementation within the ITIL framework, specifically its role in supporting Capacity Management, is critical to gaining the support required to garner these resources. This implementation should also help to clearly define M&S support to the overall system mission. This paper will describe the development of an integrated modeling approach and how best to tie M&S to definitive goals for evaluating system capacity and performance requirements. Specifically, the paper will discuss best practices for implementing modeling and simulation into ITIL. These practices hinge on implementing integrated M&S methods that 1) encompass two or more predictive modeling techniques, 2) complement one another's respective strengths and weaknesses to support the validation of predicted results, and 3) are tied to the system's performance and workload monitoring efforts. The structuring of two forms of modeling, statistical and simulation, in the development of "As Is" and "To Be" efforts is used to exemplify the integrated M&S methods. The paper will show how these methods can better support the project's overall capacity management efforts.

  16. Managing simulation-based training: A framework for optimizing learning, cost, and time

    Science.gov (United States)

    Richmond, Noah Joseph

    This study provides a management framework for optimizing training programs for learning, cost, and time when using simulation-based training (SBT) and reality-based training (RBT) as resources. Simulation is shown to be an effective means for implementing activity substitution as a way to reduce risk. The risk profiles of 22 US Air Force vehicles are calculated, and the potential risk reduction is calculated under the assumption of perfect substitutability of RBT and SBT. Methods are subsequently developed to relax the assumption of perfect substitutability. The transfer effectiveness ratio (TER) concept is defined and modeled as a function of the quality of the simulator used and the requirements of the activity trained. The Navy F/A-18 is then analyzed in a case study illustrating how learning can be maximized subject to constraints in cost and time, and also subject to the decision maker's preferences for the proportional and absolute use of simulation. Solution methods for optimizing multiple activities across shared resources are next provided. Finally, a simulation strategy including an operations planning program (OPP), an implementation program (IP), an acquisition program (AP), and a pedagogical research program (PRP) is detailed. The study provides the theoretical tools to understand how to leverage SBT, a case study demonstrating these tools' efficacy, and a set of policy recommendations to enable the US military to better utilize SBT in the future.

  17. Review of Molecular Simulations of Methane Storage in Metal-Organic Frameworks.

    Science.gov (United States)

    Lee, Seung-Joon; Bae, Youn-Sang

    2016-05-01

    Methane storage in porous materials is an active research topic because it could replace dangerous high-pressure compressed natural gas (CNG) tanks in natural gas vehicles. Among the diverse adsorbents, metal-organic frameworks (MOFs) are considered to be promising due to their extremely high surface areas and low crystal densities. Molecular simulation has been considered an important tool for finding an appropriate MOF for methane storage. We review several important roles of molecular modeling in studies of methane adsorption in MOFs. PMID:27483748
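
    The workhorse for such adsorption studies is typically grand-canonical Monte Carlo; a minimal sketch of the textbook insertion-acceptance rule (Python; all numbers invented, not tied to any specific MOF code):

        import math, random

        def accept_insertion(d_u, n_mol, volume, mu, beta, lam3):
            # accept with probability min(1, V/(Lambda^3 (N+1)) * exp(beta*(mu - dU)))
            arg = volume / (lam3 * (n_mol + 1)) * math.exp(beta * (mu - d_u))
            return random.random() < min(1.0, arg)

        random.seed(2)
        print(accept_insertion(d_u=-2.0, n_mol=40, volume=8000.0,
                               mu=-15.0, beta=0.4, lam3=50.0))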

  18. Towards a unified framework for coarse-graining particle-based simulations.

    Energy Technology Data Exchange (ETDEWEB)

    Junghans, Christoph [Los Alamos National Laboratory]

    2012-06-28

    Different coarse-graining techniques for soft matter systems have been developed in recent years; however, it is often very demanding to find the method most suitable for the problem studied. For this reason we began to develop the VOTCA toolkit to allow for easy comparison of different methods. We have incorporated six different techniques into the package and implemented a powerful and parallel analysis framework plus multiple simulation back-ends. We will discuss the specifics of the package by means of various studies that have been performed with the toolkit, and highlight problems we encountered along the way.

  19. Development of a Dynamically Configurable, Object-Oriented Framework for Distributed, Multi-modal Computational Aerospace Systems Simulation

    Science.gov (United States)

    Afjeh, Abdollah A.; Reed, John A.

    2003-01-01

    The following reports are presented on this project: a first-year progress report on Development of a Dynamically Configurable, Object-Oriented Framework for Distributed, Multi-modal Computational Aerospace Systems Simulation; a second-year progress report on Development of a Dynamically Configurable, Object-Oriented Framework for Distributed, Multi-modal Computational Aerospace Systems Simulation; An Extensible, Interchangeable and Sharable Database Model for Improving Multidisciplinary Aircraft Design; Interactive, Secure Web-enabled Aircraft Engine Simulation Using XML Databinding Integration; and Improving the Aircraft Design Process Using Web-based Modeling and Simulation.

  20. Direct numerical simulation of rigid bodies in multiphase flow within an Eulerian framework

    Science.gov (United States)

    Rauschenberger, P.; Weigand, B.

    2015-06-01

    A new method is presented to simulate rigid body motion in the Volume-of-Fluid based multiphase code Free Surface 3D. The specific feature of the new method is that it works within an Eulerian framework without the need for a Lagrangian representation of rigid bodies. Several test cases are shown to prove the validity of the numerical scheme. The technique is able to conserve the shape of arbitrarily shaped rigid bodies and predict terminal velocities of rigid spheres. The instability of a falling ellipsoid is captured. Multiple rigid bodies, including collisions, may be considered using only one Volume-of-Fluid variable, which makes it possible to simulate the drafting, kissing and tumbling phenomena of two rigid spheres. The method can easily be extended to rigid bodies undergoing phase change processes.

  1. A proposed simulation optimization model framework for emergency department problems in public hospital

    Science.gov (United States)

    Ibrahim, Ireen Munira; Liong, Choong-Yeun; Bakar, Sakhinah Abu; Ahmad, Norazura; Najmuddin, Ahmad Farid

    2015-12-01

    The Emergency Department (ED) is a very complex system with limited resources to support increasing demand. ED services are considered to be of good quality if they can meet patients' expectations. Long waiting times and lengths of stay are the main problems faced by ED management. Management should give greater emphasis to resource capacity in order to increase the quality of services and thereby patient satisfaction. This paper reviews work in progress of a study being conducted in a government hospital in Selangor, Malaysia. It proposes a simulation optimization model framework for studying ED operations and problems, as well as for finding optimal solutions to those problems. The integration of simulation and optimization is expected to assist management in decision making regarding resource capacity planning in order to improve current and future ED operations.

  2. Bounding box framework for efficient phase field simulation of grain growth in anisotropic systems

    CERN Document Server

    Vanherpe, L; Blanpain, B; Vandewalle, S

    2011-01-01

    A sparse bounding box algorithm is extended to perform efficient phase field simulations of grain growth in anisotropic systems. The extended bounding box framework makes it possible to attribute different properties to different grain boundary types of a polycrystalline microstructure and can be combined with explicit, implicit or semi-implicit time stepping strategies. To illustrate the applicability of the software, the simulation results of a case study are analysed. They indicate the impact of a misorientation-dependent boundary energy formulation on the evolution of the misorientation distribution of the grain boundary types and on the individual growth rates of the grains as a function of the number of grain faces.

  3. GNSSim: An Open Source GNSS/GPS Framework for Unmanned Aerial Vehicular Network Simulation

    Directory of Open Access Journals (Sweden)

    Farha Jahan

    2015-08-01

    Full Text Available Unmanned systems are of great importance in accomplishing tasks where human lives are at risk. These systems are being deployed in tasks that are time-consuming, expensive or inconclusive if accomplished by human intervention. Design, development and testing of such vehicles using actual hardware could be quite costly and dangerous. Another issue is the limited outdoor usage permitted by Federal Aviation Administration regulations, which makes outdoor testing difficult. An optimal solution to this problem is to have a simulation environment where different operational scenarios, newly developed models, etc., can be studied. In this paper, we propose GNSSim, a Global Navigation Satellite System (GNSS simulation framework. We demonstrate its effectiveness by integrating it with UAVSim. This allows users to experiment easily by adjusting different satellite as well as UAV parameters. Related tests and evidence of the correctness of the implementation are presented.

  4. Simulating collisions of thick nuclei in the color glass condensate framework

    Science.gov (United States)

    Gelfand, Daniil; Ipp, Andreas; Müller, David

    2016-07-01

    We present our work on the simulation of the early stages of heavy-ion collisions with finite longitudinal thickness in the laboratory frame in 3+1 dimensions. In particular we study the effects of nuclear thickness on the production of a glasma state in the McLerran-Venugopalan model within the color glass condensate framework. A finite thickness enables us to describe nuclei at lower energies, but forces us to abandon boost invariance. As a consequence, random classical color sources within the nuclei have to be included in the simulation, which is achieved by using the colored particle-in-cell method. We show that the description in the laboratory frame agrees with boost-invariant approaches as a limiting case. Furthermore we investigate collisions beyond boost invariance, in particular the pressure anisotropy in the glasma.

  5. Simulating collisions of thick nuclei in the color glass condensate framework

    CERN Document Server

    Gelfand, Daniil; Müller, David

    2016-01-01

    We present our work on the simulation of the early stages of heavy-ion collisions with finite longitudinal thickness in the laboratory frame in 3+1 dimensions. In particular we study the effects of nuclear thickness on the production of a glasma state in the McLerran-Venugopalan model within the color glass condensate framework. A finite thickness enables us to describe nuclei at lower energies, but forces us to abandon boost-invariance. As a consequence, random classical color sources within the nuclei have to be included in the simulation, which is achieved by using the colored particle-in-cell (CPIC) method. We show that the description in the laboratory frame agrees with boost-invariant approaches as a limiting case. Furthermore we investigate collisions beyond boost-invariance, in particular the pressure anisotropy in the glasma.

  6. DELPHES 3, A modular framework for fast simulation of a generic collider experiment

    CERN Document Server

    de Favereau, J; Demin, P; Giammanco, A; Lemaître, V; Mertens, A; Selvaggi, M

    2013-01-01

    The version 3.0 of the DELPHES fast-simulation framework is presented. The tool is written in C++ and is interfaced with the most common Monte-Carlo file formats. Its goal is the simulation of a multipurpose detector that includes a track propagation system embedded in a magnetic field, electromagnetic and hadronic calorimeters, and a muon identification system. The new modular design makes it easy to produce the collections that are needed for later analysis, from low-level objects such as tracks and calorimeter deposits up to high-level collections such as isolated electrons, jets, taus, and missing energy. New features such as pile-up and improved algorithms like the particle-flow reconstruction approach have also been implemented.

  7. Using a New Event-Based Simulation Framework for Investigating Resource Provisioning in Clouds

    Directory of Open Access Journals (Sweden)

    Simon Ostermann

    2011-01-01

    Full Text Available Today, Cloud computing proposes an attractive alternative to building large-scale distributed computing environments by which resources are no longer hosted by the scientists' computational facilities, but leased from specialised data centres only when and for how long they are needed. This new class of Cloud resources raises new interesting research questions in the fields of resource management, scheduling, fault tolerance, or quality of service, requiring hundreds to thousands of experiments for finding valid solutions. To enable such research, a scalable simulation framework is typically required for early prototyping, extensive testing and validation of results before the real deployment is performed. The scope of this paper is twofold. In the first part we present GroudSim, a Grid and Cloud simulation toolkit for scientific computing based on a scalable simulation-independent discrete-event engine. GroudSim provides a comprehensive set of features for complex simulation scenarios from simple job executions on leased computing resources to file transfers, calculation of costs and background load on resources. Simulations can be parameterised and are easily extendable by probability distribution packages for failures which normally occur in complex distributed environments. Experimental results demonstrate the improved scalability of GroudSim compared to a related process-based simulation approach. In the second part, we show the use of the GroudSim simulator to analyse the problem of dynamic provisioning of Cloud resources to scientific workflows that do not benefit from sufficient Grid resources as required by their computational demands. We propose and study four strategies for provisioning and releasing Cloud resources that take into account the general leasing model encountered in today's commercial Cloud environments based on resource bulks, fuzzy descriptions and hourly payment intervals. We study the impact of our techniques to the
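
    The discrete-event core that such a toolkit builds on can be sketched in a few lines (Python; illustrative only, not GroudSim's API): a priority queue of timestamped events is processed in order, and handlers may schedule further events.

        import heapq

        class Engine:
            def __init__(self):
                self.now, self._queue, self._seq = 0.0, [], 0

            def schedule(self, delay, handler, *args):
                self._seq += 1  # tie-breaker for events with equal times
                heapq.heappush(self._queue,
                               (self.now + delay, self._seq, handler, args))

            def run(self):
                while self._queue:
                    self.now, _, handler, args = heapq.heappop(self._queue)
                    handler(*args)

        engine = Engine()

        def job_done(name):
            print(f"t={engine.now:.1f}: job {name} finished")

        engine.schedule(3.0, job_done, "A")  # e.g. a job on a leased resource
        engine.schedule(1.5, job_done, "B")
        engine.run()  # prints B at t=1.5, then A at t=3.0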

  8. CRUSDE: A plug-in based simulation framework for composable CRUstal DEformation studies using Green's functions

    Science.gov (United States)

    Grapenthin, R.

    2014-01-01

    CRUSDE is a plug-in based simulation framework written in C/C++ for Linux platforms (installation information, download and test cases: http://www.grapenthin.org/crusde). It utilizes Green's functions for simulations of the Earth's response to changes in surface loads. Such changes could involve, for example, melting glaciers, oscillating snow loads, or lava flow emplacement. The focus in the simulation could be the response of the Earth's crust in terms of stress changes, changes in strain rates, or simply uplift or subsidence and the respective horizontal displacements of the crust (over time). Rather than implementing a variety of specific models, CRUSDE approaches crustal deformation problems from a general formulation in which model elements (Green's function, load function, relaxation function, load history), operators, pre- and postprocessors, as well as input and output routines are independent, exchangeable, and reusable on the basis of a plug-in approach (shared libraries loaded at runtime). We derive the general formulation CRUSDE is based on, describe its architecture and use, and demonstrate its capabilities in a test case. With CRUSDE users can: (1) dynamically select software components to participate in a simulation (through XML experiment definitions), (2) extend the framework independently with new software components and reuse existing ones, and (3) exchange software components and experiment definitions with other users. CRUSDE's plug-in mechanism aims for straightforward extendability allowing modelers to add new Earth models/response functions. Current Green's function implementations include surface displacements due to the elastic response, final relaxed response, and pure thick plate response for a flat Earth. These can be combined to express exponential decay from elastic to final relaxed response, displacement rates due to one or multiple disks, irregular loads, or a combination of these. Each load can have its own load history and
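
    In discrete form, the response computation that such a framework organises reduces to a convolution of the surface load with the Green's function; a one-dimensional toy sketch (Python; the algebraic kernel is an invented placeholder, not a real Earth model):

        import numpy as np

        def greens_function(distance, depth=5.0):
            return 1.0 / (distance ** 2 + depth ** 2)  # placeholder kernel

        x = np.arange(0.0, 100.0, 1.0)                  # observation points (km)
        load = np.where((x > 40) & (x < 60), 1.0, 0.0)  # a 20 km wide disc load

        # response(x_i) = sum over x' of G(|x_i - x'|) * load(x')
        response = np.array([np.sum(greens_function(np.abs(xi - x)) * load)
                             for xi in x])
        print(response.max())  # peak response under the load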

  9. Neutronics Code Development at Argonne National Laboratory

    International Nuclear Information System (INIS)

    As part of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program of U.S. DOE, a suite of modern fast reactor simulation tools is being developed at Argonne National Laboratory. The general goal is to reduce the uncertainties and biases in various areas of reactor design activities by providing enhanced prediction capabilities. Under this fast reactor simulation program, a high-fidelity deterministic neutron transport code named UNIC is being developed. The end goal of this development is to produce an integrated neutronics code that enables the high fidelity description of a nuclear reactor and simplifies the multi-step design process by direct and accurate coupling with thermal-hydraulics and structural mechanics calculations. (author)

  10. Parallel kinetic Monte Carlo simulation framework incorporating accurate models of adsorbate lateral interactions

    Science.gov (United States)

    Nielsen, Jens; d'Avezac, Mayeul; Hetherington, James; Stamatakis, Michail

    2013-12-01

    Ab initio kinetic Monte Carlo (KMC) simulations have been successfully applied for over two decades to elucidate the underlying physico-chemical phenomena on the surfaces of heterogeneous catalysts. These simulations necessitate detailed knowledge of the kinetics of elementary reactions constituting the reaction mechanism, and the energetics of the species participating in the chemistry. The information about the energetics is encoded in the formation energies of gas and surface-bound species, and the lateral interactions between adsorbates on the catalytic surface, which can be modeled at different levels of detail. The majority of previous works accounted for only pairwise-additive first nearest-neighbor interactions. More recently, cluster-expansion Hamiltonians incorporating long-range interactions and many-body terms have been used for detailed estimations of catalytic rate [C. Wu, D. J. Schmidt, C. Wolverton, and W. F. Schneider, J. Catal. 286, 88 (2012)]. In view of the increasing interest in accurate predictions of catalytic performance, there is a need for general-purpose KMC approaches incorporating detailed cluster expansion models for the adlayer energetics. We have addressed this need by building on the previously introduced graph-theoretical KMC framework, and we have developed Zacros, a FORTRAN2003 KMC package for simulating catalytic chemistries. To tackle the high computational cost in the presence of long-range interactions we introduce parallelization with OpenMP. We further benchmark our framework by simulating a KMC analogue of the NO oxidation system established by Schneider and co-workers [J. Catal. 286, 88 (2012)]. We show that taking into account only first nearest-neighbor interactions may lead to large errors in the prediction of the catalytic rate, whereas for accurate estimates thereof, one needs to include long-range terms in the cluster expansion.
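
    A toy flavour of adsorbate-interaction-dependent rates in lattice KMC (Python; a pairwise first-nearest-neighbour model only, i.e. exactly the approximation the paper moves beyond, with invented parameters):

        import math, random

        random.seed(3)
        L, beta, nu, e_des, eps = 20, 1.0, 1.0e13, 1.5, 0.1
        occ = [[random.random() < 0.3 for _ in range(L)] for _ in range(L)]

        def desorption_rate(i, j):
            if not occ[i][j]:
                return 0.0
            nn = sum(occ[(i + di) % L][(j + dj) % L]
                     for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)))
            # each occupied neighbour destabilises the adsorbate by eps
            return nu * math.exp(-beta * (e_des - eps * nn))

        rates = [desorption_rate(i, j) for i in range(L) for j in range(L)]
        total = sum(rates)
        dt = -math.log(random.random()) / total  # Gillespie time increment
        print(f"total rate {total:.3e}, time step {dt:.3e}")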

  11. Structure simulation with calculated NMR parameters - integrating COSMOS into the CCPN framework.

    Science.gov (United States)

    Schneider, Olaf; Fogh, Rasmus H; Sternberg, Ulrich; Klenin, Konstantin; Kondov, Ivan

    2012-01-01

    The Collaborative Computing Project for NMR (CCPN) has built a software framework consisting of the CCPN data model (with APIs) for NMR-related data, the CcpNmr Analysis program and additional tools like CcpNmr FormatConverter. The open architecture allows for the integration of external software to extend the abilities of the CCPN framework with additional calculation methods. Recently, we have carried out the first steps toward integrating our software Computer Simulation of Molecular Structures (COSMOS) into the CCPN framework. The COSMOS-NMR force field unites quantum chemical routines for the calculation of molecular properties with a molecular mechanics force field yielding the relative molecular energies. COSMOS-NMR allows NMR parameters to be introduced as constraints into molecular mechanics calculations. The resulting infrastructure will be made available to the NMR community. As a first application we have tested the evaluation of calculated protein structures using COSMOS-derived 13C Cα and Cβ chemical shifts. In this paper we give an overview of the methodology and a roadmap for future developments and applications.

  12. Self-consistent simulation of the RF sheath boundary condition in the BOUT++ framework

    Science.gov (United States)

    Gui, Bin; Xu, Xueqiao; Xia, Tianyang

    2015-11-01

    The effect of the RF sheath boundary condition on edge-localized modes and turbulent transport is simulated in this work. The work consists of two parts. The first part is to calculate the equilibrium radial electric field with the RF sheath boundary condition. It is known that the thermal sheath or the rectified RF sheath modifies the potential in the SOL region. The modified potential induces additional shear flow in the SOL. In this part, the equilibrium radial electric field across the separatrix is calculated by solving the 2D current continuity equation with the sheath boundary condition, drifts and viscosity. The second part applies the sheath boundary condition to the perturbed variables of the six-field two-fluid model in the BOUT++ framework. The six-field two-fluid model simulates ELMs and turbulent transport. The sheath boundary condition is applied in this model with the aim of simulating its effect on turbulent transport. It is found that the sheath boundary acts as a sink in the plasma and suppresses the local perturbation. Based on these two parts, the effect of the RF sheath boundary condition on ELMs and turbulent transport can be simulated self-consistently. Prepared by LLNL under Contract DE-AC52-07NA27344.

  13. A GIS/Simulation Framework for Assessing Change in Water Yield over Large Spatial Scales

    Energy Technology Data Exchange (ETDEWEB)

    Graham, R.; Hargrove, W.W.; Huff, D.D.; Nikolov, N.; Tharp, M.L.

    1999-11-13

    Recent legislation to initiate vegetation management in the Central Sierra hydrologic region of California includes a focus on corresponding changes in water yield. This served as the impetus for developing a combined geographic information system (GIS) and simulation assessment framework. Using the existing vegetation density condition, together with proposed rules for thinning to reduce fire risk, a set of simulation model inputs was generated for examining the impact of the thinning scenario on water yield. The approach allows results to be expressed as the mean and standard deviation of change in water yield for each 1 km2 map cell that is treated. Values for groups of cells are aggregated for typical watershed units using area-weighted averaging. Wet, dry and average precipitation years were simulated over a large region. Where snow plays an important role in hydrologic processes, the simulated change in water yield was less than 0.5% of expected annual runoff for a typical watershed. Such small changes would be undetectable in the field using conventional stream flow analysis. These results suggest that using water yield increases to help justify forest-thinning activities or offset their cost will be difficult.

  14. A simulation model of MAPS for the FairRoot framework

    Energy Technology Data Exchange (ETDEWEB)

    Amar-Youcef, Samir; Linnik, Benjamin; Sitzmann, Philipp [Goethe-Universitaet Frankfurt (Germany); Collaboration: CBM-MVD-Collaboration

    2014-07-01

    CMOS MAPS are the sensors of choice for the MVD of the CBM experiment at the FAIR facility. They offer a unique combination of features required for the CBM detector, such as low material budget, spatial resolution, radiation tolerance and yet sufficient read-out speed. The physics performance of various designs of the MVD integrated into the CBM detector system is evaluated in the CBM-/FairRoot simulation framework. In this context, algorithms are developed to simulate the realistic detector response and to optimize feature extraction from the sensor information. The objective of the sensor response model is to provide a fast and realistic pixel response for a given track energy loss and position. In addition, we discuss aspects of simulating event pile-up and dataflow in the context of the CBM FLES event extraction and selection concept. This is of particular importance for the MVD since the sensors feature a comparatively long integration time and a frame-wise read-out. All other detector systems operate with un-triggered front-end electronics and freely stream time-stamped data to the FLES. Because of the large data rates, event extraction is performed via distributed networking on a large HPC compute farm. We present an overview and status of the MVD software developments, focusing on the integration of the system into a free-flowing read-out system and on the concurrent application to simulated and real data.

  15. Global Simulation of Bioenergy Crop Productivity: Analytical framework and Case Study for Switchgrass

    Energy Technology Data Exchange (ETDEWEB)

    Nair, S. Surendran [University of Tennessee, Knoxville (UTK)]; Nichols, Jeff A. [ORNL]; Post, Wilfred M [ORNL]; Wang, Dali [ORNL]; Wullschleger, Stan D [ORNL]; Kline, Keith L [ORNL]; Wei, Yaxing [ORNL]; Singh, Nagendra [ORNL]; Kang, Shujiang [ORNL]

    2014-01-01

    Contemporary global assessments of the deployment potential and sustainability aspects of biofuel crops lack quantitative details. This paper describes an analytical framework capable of meeting the challenges associated with global scale agro-ecosystem modeling. We designed a modeling platform for bioenergy crops, consisting of five major components: (i) standardized global natural resources and management data sets, (ii) global simulation unit and management scenarios, (iii) model calibration and validation, (iv) high-performance computing (HPC) modeling, and (v) simulation output processing and analysis. A case study with the HPC-Environmental Policy Integrated Climate model (HPC-EPIC) to simulate a perennial bioenergy crop, switchgrass (Panicum virgatum L.), and a global biomass feedstock analysis on grassland demonstrates the application of this platform. The results illustrate the biomass feedstock variability of switchgrass and provide insights on how the modeling platform can be expanded to better assess sustainable production criteria and other biomass crops. Feedstock potentials on global grasslands and within different countries are also shown. Future efforts involve developing databases of productivity, implementing global simulations for other bioenergy crops (e.g. miscanthus, energy cane and agave), and assessing environmental impacts under various management regimes. We anticipate that this platform will provide an exemplary tool and assessment data for international communities to conduct global analysis of biofuel biomass feedstocks and sustainability.

  16. Global Simulation of Bioenergy Crop Productivity: Analytical Framework and Case Study for Switchgrass

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Shujiang [ORNL]; Kline, Keith L [ORNL]; Nair, S. Surendran [University of Tennessee, Knoxville (UTK)]; Nichols, Dr Jeff A [ORNL]; Post, Wilfred M [ORNL]; Brandt, Craig C [ORNL]; Wullschleger, Stan D [ORNL]; Wei, Yaxing [ORNL]; Singh, Nagendra [ORNL]

    2013-01-01

    A global energy crop productivity model that provides geospatially explicit quantitative details on biomass potential and factors affecting sustainability would be useful, but does not yet exist. This study describes a modeling platform capable of meeting many challenges associated with global-scale agro-ecosystem modeling. We designed an analytical framework for bioenergy crops consisting of six major components: (i) standardized natural resources datasets, (ii) global field-trial data and crop management practices, (iii) simulation units and management scenarios, (iv) model calibration and validation, (v) high-performance computing (HPC) simulation, and (vi) simulation output processing and analysis. The HPC-Environmental Policy Integrated Climate (HPC-EPIC) model simulated a perennial bioenergy crop, switchgrass (Panicum virgatum L.), estimating feedstock production potentials and effects across the globe. This modeling platform can assess soil C sequestration, net greenhouse gas (GHG) emissions, nonpoint source pollution (e.g., nutrient and pesticide loss), and energy exchange with the atmosphere. It can be expanded to include additional bioenergy crops (e.g., miscanthus, energy cane, and agave) and food crops under different management scenarios. The platform and switchgrass field-trial dataset are available to support global analysis of biomass feedstock production potential and corresponding metrics of sustainability.

  17. A parallel framework for the FE-based simulation of knee joint motion.

    Science.gov (United States)

    Wawro, Martin; Fathi-Torbaghan, Madjid

    2004-08-01

    We present an object-oriented framework for the finite-element (FE)-based simulation of human knee joint motion. The FE model of the knee joint is acquired from the patients in vivo by using magnetic resonance imaging. The MRI images are converted into a three-dimensional model, and finally an all-hexahedral mesh for the FE analysis is generated. The simulation environment uses nonlinear finite-element analysis (FEA) and is capable of handling contact in order to capture the complex rolling/sliding motion of the knee joint. The software strictly follows object-oriented concepts of software engineering in order to guarantee maximum extensibility and maintainability. The final goal of this work-in-progress is the creation of a computer-based biomechanical model of the knee joint which can be used in a variety of applications, ranging from prosthesis design and treatment planning (e.g., optimal reconstruction of ruptured ligaments) through surgical simulation to impact computations in crashworthiness simulations.

  18. URDME: a modular framework for stochastic simulation of reaction-transport processes in complex geometries

    Directory of Open Access Journals (Sweden)

    Drawert Brian

    2012-06-01

    Full Text Available Abstract Background Experiments in silico using stochastic reaction-diffusion models have emerged as an important tool in molecular systems biology. Designing computational software for such applications poses several challenges. Firstly, realistic lattice-based modeling for biological applications requires a consistent way of handling complex geometries, including curved inner and outer boundaries. Secondly, spatiotemporal stochastic simulations are computationally expensive due to the fast time scales of individual reaction and diffusion events when compared to the biological phenomena of actual interest. We therefore argue that simulation software needs to be both computationally efficient, employing sophisticated algorithms, yet at the same time flexible in order to meet present and future needs of increasingly complex biological modeling. Results We have developed URDME, a flexible software framework for general stochastic reaction-transport modeling and simulation. URDME uses Unstructured triangular and tetrahedral meshes to resolve general geometries, and relies on the Reaction-Diffusion Master Equation formalism to model the processes under study. An interface to mature external geometry and mesh handling software (Comsol Multiphysics) provides a stable and interactive environment for model construction. The core simulation routines are logically separated from the model building interface and written in a low-level language for computational efficiency. The connection to the geometry handling software is realized via a Matlab interface which facilitates script computing, data management, and post-processing. For practitioners, the software therefore behaves much as an interactive Matlab toolbox. At the same time, it is possible to modify and extend URDME with newly developed simulation routines. Since the overall design effectively hides the complexity of managing the geometry and meshes, this means that newly developed methods

  19. Final Report for Project "Framework Application for Core-Edge Transport Simulations (FACETS)"

    Energy Technology Data Exchange (ETDEWEB)

    Estep, Donald [Colorado State University

    2014-01-17

    This is the final report for the Colorado State University component of the FACETS Project. FACETS was focused on the development of a multiphysics, parallel framework application that could provide the capability to enable whole-device fusion reactor modeling and, in the process, the development of the modeling infrastructure and computational understanding needed for ITER. It was intended that FACETS be highly flexible, through the use of modern computational methods, including component technology and object-oriented design, to facilitate switching from one model to another for a given aspect of the physics, and making it possible to use simplified models for rapid turnaround or high-fidelity models that will take advantage of the largest supercomputer hardware. FACETS was designed in a heterogeneous parallel context, where different parts of the application can exploit parallelism based on task farming, domain decomposition, and/or pipelining as needed and applicable. As with all fusion simulations, an integral part of the FACETS project was treatment of the coupling of different physical processes at different scales interacting closely. A primary example for the FACETS project is the coupling of existing core and edge simulations, with the transport and wall interactions described by reduced models. However, core and edge simulations themselves involve significant coupling of different processes with large scale differences. Numerical treatment of coupling is impacted by a number of factors including scale differences, the form of information transferred between processes, the implementation of solvers for different codes, and high performance computing concerns. Operator decomposition, in which the individual processes are computed separately using appropriate simulation codes and the component simulations are then linked/synchronized at regular points in space and time, is the de facto approach to high performance simulation of multiphysics
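
    The operator-decomposition coupling described can be sketched schematically (Python; the two component models are invented toys, not FACETS physics): each component advances over a coupling interval, then interface data are exchanged and the cycle repeats.

        def advance_core(state, incoming_flux, dt):
            # toy core model: relax toward 1.0 plus flux received at interface
            return state + dt * ((1.0 - state) + incoming_flux)

        def advance_edge(state, incoming_flux, dt):
            # toy edge model: relax toward 0.5 plus flux received at interface
            return state + dt * ((0.5 - state) + incoming_flux)

        core, edge, dt = 0.0, 0.0, 0.1
        for _ in range(200):                      # coupling cycles
            flux = 0.05 * (core - edge)           # interface flux, core -> edge
            core = advance_core(core, -flux, dt)  # core loses what edge gains
            edge = advance_edge(edge, +flux, dt)

        print(round(core, 3), round(edge, 3))  # coupled steady state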

  20. Developing a Conceptual Framework for Simulation Analysis in a Supply Chain Based on Common Platform (SCBCP)

    Directory of Open Access Journals (Sweden)

    M. Fathollah

    2009-08-01

    Full Text Available As a competitive advantage in modern organizations, product diversification may cause complexities in today’s extended supply chains. However, the Common Platform (CP) Strategy, as a concept of gaining maximum variety with minimum production elements, is believed to be one of the answers to eliminate or decrease these complexities. The main purpose of this paper is to provide a simulation framework for modeling the supply network of a case study in the automotive industry in order to study the impacts of part commonality through the chain. The electrical wiring harness is selected as the main part to be studied according to the essentiality and challenges of its procurement for the production of cars (as occurred in this case and many other studies). The paper does not provide the simulation results but rather builds up the required foundation and gathers the relevant content to develop a realistic simulation model by closely studying the impacts of part multiplicity on different functional areas of the selected supply network and extracting the critical success factors of applying part commonality.

  1. Reservoir Modeling by Data Integration via Intermediate Spaces and Artificial Intelligence Tools in MPS Simulation Frameworks

    International Nuclear Information System (INIS)

    Conditioning stochastic simulations is very important in many geostatistical applications that call for the introduction of nonlinear and multiple-point data in reservoir modeling. Here, a new methodology is proposed for the incorporation of different data types into multiple-point statistics (MPS) simulation frameworks. Unlike previous techniques that call for an approximate forward model (filter) for the integration of secondary data into geologically constructed models, the proposed approach develops an intermediate space onto which all the primary and secondary data are easily mapped. Definition of the intermediate space, as may be achieved via application of artificial intelligence tools like neural networks and fuzzy inference systems, eliminates the need for using filters as in previous techniques. The applicability of the proposed approach in conditioning MPS simulations to static and geologic data is verified by modeling a real example of discrete fracture networks using conventional well-log data. The training patterns are well reproduced in the realizations, while the model is also consistent with the map of secondary data.

  2. Social simulation theory: a framework to explain nurses' understanding of patients' experiences of ill-health.

    Science.gov (United States)

    Nordby, Halvor

    2016-09-01

    A fundamental aim in caring practice is to understand patients' experiences of ill-health. These experiences have a qualitative content and cannot, unlike thoughts and beliefs with conceptual content, directly be expressed in words. Nurses therefore face a variety of interpretive challenges when they aim to understand patients' subjective perspectives on disease and illness. The article argues that theories on social simulation can shed light on how nurses manage to meet these challenges. The core assumption of social simulationism is that we do not understand other people by forming mental representations of how they think, but by putting ourselves in their situation in a more imaginative way. According to simulationism, any attempt to understand a patient's behavior is made on the basis of simulating what it is like to be that patient in the given context. The article argues that this approach to social interpretation can clarify how nurses manage to achieve aims of patient understanding, even when they have limited time to communicate and incomplete knowledge of patients' perspectives. Furthermore, simulation theory provides a normative framework for interpretation, in the sense that its theoretical assumptions constitute ideals for how nurses should seek to understand patients' experiences of illness.

  3. Understanding virulence mechanisms in M. tuberculosis infection via a circuit-based simulation framework.

    Energy Technology Data Exchange (ETDEWEB)

    May, Elebeoba Eni; Oprea, Tudor I.; Joo, Jaewook; Misra, Milind; Leitao, Andrei; Faulon, Jean-Loup Michel

    2008-08-01

    Tuberculosis (TB), caused by the bacterium Mycobacterium tuberculosis (Mtb), is a growing international health crisis. Mtb is able to persist in host tissues in a non-replicating persistent (NRP) or latent state, which presents a challenge in the treatment of TB. Latent TB can re-activate in 10% of individuals with normal immune systems, and at higher rates in those with compromised immune systems. A quantitative understanding of latency-associated virulence mechanisms may help researchers develop more effective methods to battle the spread of TB and reduce TB-associated fatalities. Leveraging BioXyce's ability to simulate whole-cell and multi-cellular systems, we are developing a circuit-based framework to investigate the impact of pathogenicity-associated pathways on the latency/reactivation phase of tuberculosis infection. We discuss efforts to simulate metabolic pathways that potentially impact the ability of Mtb to persist within host immune cells. We demonstrate how simulation studies can provide insight regarding the efficacy of potential anti-TB agents on biological networks critical to Mtb pathogenicity using a systems chemical biology approach.

  4. GeNN: a code generation framework for accelerated brain simulations.

    Science.gov (United States)

    Yavuz, Esin; Turner, James; Nowotny, Thomas

    2016-01-01

    Large-scale numerical simulations of detailed brain circuit models are important for identifying hypotheses on brain functions and testing their consistency and plausibility. An ongoing challenge for simulating realistic models is, however, computational speed. In this paper, we present the GeNN (GPU-enhanced Neuronal Networks) framework, which aims to facilitate the use of graphics accelerators for computational models of large-scale neuronal networks to address this challenge. GeNN is an open source library that generates code to accelerate the execution of network simulations on NVIDIA GPUs, through a flexible and extensible interface, which does not require in-depth technical knowledge from the users. We present performance benchmarks showing that a 200-fold speedup over a single CPU core can be achieved for a network of one million conductance-based Hodgkin-Huxley neurons, but that the speedup can differ for other models. GeNN is available for Linux, Mac OS X and Windows platforms. The source code, user manual, tutorials, Wiki, in-depth example projects and all other related information can be found on the project website http://genn-team.github.io/genn/. PMID:26740369
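    GeNN itself generates CUDA/C++ code from model descriptions; as a language-neutral illustration of the kind of conductance-based model it accelerates, the sketch below integrates a single Hodgkin-Huxley neuron with forward Euler in plain Python/NumPy. This is not GeNN's API; the parameters are the classic squid-axon constants, and the injected current I_ext is an arbitrary choice.

```python
import numpy as np

# Classic Hodgkin-Huxley squid-axon parameters (mV, ms, mS/cm^2, uF/cm^2).
C_m, g_Na, g_K, g_L = 1.0, 120.0, 36.0, 0.3
E_Na, E_K, E_L = 50.0, -77.0, -54.387

def alpha_m(V): return 0.1 * (V + 40.0) / (1.0 - np.exp(-(V + 40.0) / 10.0))
def beta_m(V):  return 4.0 * np.exp(-(V + 65.0) / 18.0)
def alpha_h(V): return 0.07 * np.exp(-(V + 65.0) / 20.0)
def beta_h(V):  return 1.0 / (1.0 + np.exp(-(V + 35.0) / 10.0))
def alpha_n(V): return 0.01 * (V + 55.0) / (1.0 - np.exp(-(V + 55.0) / 10.0))
def beta_n(V):  return 0.125 * np.exp(-(V + 65.0) / 80.0)

def simulate(T=50.0, dt=0.01, I_ext=10.0):
    steps = int(T / dt)
    V, m, h, n = -65.0, 0.05, 0.6, 0.32
    trace = np.empty(steps)
    for i in range(steps):
        # Ionic currents.
        I_Na = g_Na * m**3 * h * (V - E_Na)
        I_K = g_K * n**4 * (V - E_K)
        I_L = g_L * (V - E_L)
        # Forward-Euler updates; GeNN generates analogous per-neuron GPU kernels.
        V += dt * (I_ext - I_Na - I_K - I_L) / C_m
        m += dt * (alpha_m(V) * (1 - m) - beta_m(V) * m)
        h += dt * (alpha_h(V) * (1 - h) - beta_h(V) * h)
        n += dt * (alpha_n(V) * (1 - n) - beta_n(V) * n)
        trace[i] = V
    return trace

print(simulate()[-5:])  # membrane potential (mV) over the last few steps
```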

  5. A simulation-based framework for a mapping tool that assesses the energy performance of green roofs

    OpenAIRE

    Kokogiannakis, Georgios; Darkwa, Jo

    2012-01-01

    This paper presents a framework for the development of a GIS open source mapping tool that aims to disseminate a database with results of detailed simulations in order to assess in a quick and easy way the energy performance of green roof designs across a range of Chinese climates. Detailed simulation results for heating and cooling loads are obtained from the EnergyPlus simulation tool. The study covers 12264 configurations by varying model parameters such as climate, glazing type, roof insu...

  6. A Conceptual and UML models of procurement process for simulation framework

    Directory of Open Access Journals (Sweden)

    Abdessamad Douraid

    2012-11-01

    Full Text Available This paper presents a set of conceptual and UML models that can be used to construct a simulation framework for the procurement process. Good control of this process is crucial, as it accounts for a significant share of costs along the whole chain. For this purpose, we took into account the information and material flows of the upstream supply chain linking the manufacturer and its suppliers. Our contribution is a reusable, modular pattern of the procurement process that can be configured and used across several manufacturing industries, in order to benchmark the different scenarios of each configuration and to furnish a decision-aid tool that helps decision makers reach the right choices.

  7. A novel finite element framework for numerical simulation of fluidization processes and multiphase granular flow

    Science.gov (United States)

    Percival, James; Xie, Zhihua; Pavlidis, Dimitrios; Gomes, Jefferson; Pain, Christopher; Matar, Omar

    2013-11-01

    We present results from a new formulation of a numerical model for direct simulation of bed fluidization and multiphase granular flow. The model is based on a consistent application of continuous-discontinuous mixed control volume finite element methods applied to fully unstructured meshes. The unstructured mesh framework allows for both a mesh adaptive capability, modifying the computational geometry in order to bound the error in the numerical solution while maximizing computational efficiency, and a simple scripting interface embedded in the model which allows fast prototyping of correlation models and parameterizations in intercomparison experiments. The model is applied to standard test problems for fluidized beds. EPSRC Programme Grant EP/K003976/1.

  8. Theoretical Framework and Simulation Results for Implementing Weighted Multiple Sampling in Scientific CCDs

    CERN Document Server

    Alessandri, Cristobal; Abusleme, Angel; Avila, Diego; Alvarez, Enrique; Campillo, Hernan; Gallyas, Alexandra; Oberli, Christian; Guarini, Marcelo

    2015-01-01

    Digital Correlated Double Sampling (DCDS) is a technique based on multiple analog-to-digital conversions of every pixel when reading out a CCD. This technique allows the removal of analog integrators, simplifying the readout electronics circuitry. In this work, a theoretical framework is presented that computes the optimal weighting coefficients of the pixel samples, minimizing the readout noise measured at the CCD output. By using a noise model for the CCD output amplifier in which white and flicker noise are treated separately, the mathematical tool presented allows the optimal sample coefficients to be computed in a deterministic fashion. By modifying the noise profile, our simulation results reconcile, and thus explain, results that were previously in mutual disagreement.
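    The weighted-coefficient computation described above can be illustrated with a textbook result: for repeated samples of a constant pixel value corrupted by zero-mean noise with covariance C, the minimum-variance unbiased weights are w = C⁻¹1 / (1ᵀC⁻¹1). The sketch below assumes a toy covariance mixing white noise with an exponentially correlated term as a crude stand-in for flicker noise; it is not the authors' noise model.

```python
import numpy as np

def optimal_weights(C):
    """Minimum-variance unbiased weights for averaging samples of a
    constant signal with noise covariance C: w = C^-1 1 / (1^T C^-1 1)."""
    ones = np.ones(C.shape[0])
    w = np.linalg.solve(C, ones)
    return w / w.sum()

n = 8                                   # samples per pixel
i, j = np.indices((n, n))
white = np.eye(n)                       # uncorrelated readout noise
flicker = 0.5 ** np.abs(i - j)          # assumed correlated term (toy stand-in for 1/f)
C = 1.0 * white + 0.8 * flicker

w = optimal_weights(C)
var_opt = w @ C @ w
var_flat = np.full(n, 1.0 / n) @ C @ np.full(n, 1.0 / n)
print(w.round(3), var_opt / var_flat)   # optimal weights; variance ratio vs. flat averaging
```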

  9. CRPropa 3 - a Public Astrophysical Simulation Framework for Propagating Extraterrestrial Ultra-High Energy Particles

    CERN Document Server

    Batista, Rafael Alves; Erdmann, Martin; Kampert, Karl-Heinz; Kuempel, Daniel; Müller, Gero; Sigl, Guenter; van Vliet, Arjen; Walz, David; Winchen, Tobias

    2016-01-01

    We present the simulation framework CRPropa version 3, designed for the efficient development of astrophysical predictions for ultra-high energy particles. Users can assemble modules for the most relevant propagation effects in galactic and extragalactic space, include their own physics modules with new features, and receive as output primary and secondary cosmic messengers including nuclei, neutrinos and photons. Extending the propagation physics contained in the previous CRPropa version, the new version facilitates high-performance computing and comprises new physical features such as an interface for galactic propagation using lensing techniques, an improved photonuclear interaction calculation, and propagation in time-dependent environments to take into account cosmic evolution effects in anisotropy studies and variable sources. First applications using highlighted features are presented as well.

  10. A multiscale framework for the simulation of the anisotropic mechanical behavior of shale

    CERN Document Server

    Li, Weixin; Jin, Congrui; Zhou, Xinwei; Cusatis, Gianluca

    2016-01-01

    Shale, like many other sedimentary rocks, is typically heterogeneous, anisotropic, and is characterized by partial alignment of anisotropic clay minerals and naturally formed bedding planes. In this study, a micromechanical framework based on the Lattice Discrete Particle Model (LDPM) is formulated to capture these features. Material anisotropy is introduced through an approximated geometric description of shale internal structure, which includes representation of material property variation with orientation and explicit modeling of parallel lamination. The model is calibrated by carrying out numerical simulations to match various experimental data, including the ones relevant to elastic properties, Brazilian tensile strength, and unconfined compressive strength. Furthermore, parametric study is performed to investigate the relationship between the mesoscale parameters and the macroscopic properties. It is shown that the dependence of the elastic stiffness, strength, and failure mode on loading orientation ca...

  11. Molecular Simulation Study of Hexane Diffusion in Dynamic Metal-Organic Frameworks

    Institute of Scientific and Technical Information of China (English)

    XUE, Chunyu; ZHONG, Chongli

    2009-01-01

    The modified MM3 force field for describing flexible IRMOF-1 was extended to include other IRMOFs, and a molecular dynamics simulation study was performed on hexane diffusion in IRMOF-1 and IRMOF-16. The self-diffusion coefficients and diffusion pathways of hexane, as well as the mobility of the frameworks, were investigated as a function of both temperature and loading. The results revealed that the diffusion pathway of hexane was largely influenced by loading, and that the flexibility of IRMOF-16 was much larger than that of IRMOF-1. The microscopic information obtained is useful for understanding the diffusion mechanism of chain molecules in dynamic MOFs.

  12. A framework to quantify uncertainty in simulations of oil transport in the ocean

    KAUST Repository

    Gonçalves, Rafael C.

    2016-03-02

    An uncertainty quantification framework is developed for the DeepC Oil Model based on a nonintrusive polynomial chaos method. This allows the model's output to be presented in a probabilistic framework so that the model's predictions reflect the uncertainty in the model's input data. The new capability is illustrated by simulating the far-field dispersal of oil in a Deepwater Horizon blowout scenario. The uncertain input consisted of ocean current and oil droplet size data, and the main model output analyzed is the ensuing oil concentration in the Gulf of Mexico. A 1331-member ensemble was used to construct a surrogate for the model, which was then mined for statistical information. The mean and standard deviations in the oil concentration were calculated for up to 30 days, and the total contribution of each input parameter to the model's uncertainty was quantified at different depths. Also, probability density functions of oil concentration were constructed by sampling the surrogate and used to elaborate probabilistic hazard maps of oil impact. The performance of the surrogate was constantly monitored in order to demarcate the space-time zones where its estimates are reliable. © 2016. American Geophysical Union.
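    As a minimal sketch of the nonintrusive polynomial chaos idea, assuming a single standard-normal uncertain input and a cheap analytic stand-in for the oil model, the code below fits orthonormal (probabilists') Hermite modes by least squares and reads the surrogate's mean and variance directly off the coefficients:

```python
import numpy as np
from numpy.polynomial.hermite_e import hermevander
from math import factorial

def pce_surrogate(model, degree=4, n_samples=200, seed=0):
    """Non-intrusive polynomial chaos for a model with one standard-normal
    input: least-squares fit of orthonormal (probabilists') Hermite modes."""
    rng = np.random.default_rng(seed)
    xi = rng.standard_normal(n_samples)          # sample the uncertain input
    y = model(xi)                                # "non-intrusive": model is a black box
    norms = np.array([np.sqrt(factorial(k)) for k in range(degree + 1)])
    Psi = hermevander(xi, degree) / norms        # orthonormal basis matrix
    coeffs, *_ = np.linalg.lstsq(Psi, y, rcond=None)
    mean, var = coeffs[0], np.sum(coeffs[1:] ** 2)  # moments from the coefficients
    return coeffs, mean, var

# Toy stand-in for an expensive solver (e.g., concentration vs. an uncertain current).
model = lambda xi: np.exp(0.3 * xi) + 0.1 * xi ** 2
coeffs, mean, var = pce_surrogate(model)
print(mean, var)  # surrogate-based moments, cheap to mine for statistics
```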

  13. Implementation and performance of FDPS: a framework for developing parallel particle simulation codes

    Science.gov (United States)

    Iwasawa, Masaki; Tanikawa, Ataru; Hosono, Natsuki; Nitadori, Keigo; Muranushi, Takayuki; Makino, Junichiro

    2016-08-01

    We present the basic idea, implementation, measured performance, and performance model of FDPS (Framework for Developing Particle Simulators). FDPS is an application-development framework which helps researchers to develop simulation programs using particle methods for large-scale distributed-memory parallel supercomputers. A particle-based simulation program for distributed-memory parallel computers needs to perform domain decomposition, exchange of particles which are not in the domain of each computing node, and gathering of the particle information in other nodes which is necessary for interaction calculation. Also, even if distributed-memory parallel computers are not used, in order to reduce the amount of computation, algorithms such as the Barnes-Hut tree algorithm or the Fast Multipole Method should be used in the case of long-range interactions. For short-range interactions, some methods to limit the calculation to neighbor particles are required. FDPS provides all of these functions, which are necessary for efficient parallel execution of particle-based simulations, as "templates" independent of the actual data structure of particles and the functional form of the particle-particle interaction. By using FDPS, researchers can write their programs with the amount of work necessary to write a simple, sequential and unoptimized program of O(N²) calculation cost, and yet the program, once compiled with FDPS, will run efficiently on large-scale parallel supercomputers. A simple gravitational N-body program can be written in around 120 lines. We report the actual performance of these programs and the performance model. The weak scaling performance is very good, and almost linear speed-up was obtained for up to the full system of the K computer. The minimum calculation time per timestep is in the range of 30 ms (N = 10⁷) to 300 ms (N = 10⁹). These are currently limited by the time for the calculation of the domain decomposition and communication.
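    FDPS is a C++ template framework; the plain-Python fragment below shows only the O(N²) direct-summation gravity kernel that such a framework wraps with domain decomposition, particle exchange, and tree algorithms. The softening, time step, and units are arbitrary illustration choices.

```python
import numpy as np

def gravity_step(pos, vel, mass, dt, eps=1e-2, G=1.0):
    """One kick-drift (semi-implicit Euler) step of direct-summation O(N^2)
    self-gravity. FDPS supplies domain decomposition, particle exchange and
    tree algorithms around exactly this kind of interaction kernel."""
    # Pairwise separations dx[i, j] = pos[j] - pos[i] with Plummer softening eps.
    dx = pos[None, :, :] - pos[:, None, :]
    r2 = (dx ** 2).sum(-1) + eps ** 2
    np.fill_diagonal(r2, np.inf)                 # no self-interaction
    acc = G * (mass[None, :, None] * dx / r2[..., None] ** 1.5).sum(axis=1)
    vel = vel + dt * acc
    pos = pos + dt * vel
    return pos, vel

rng = np.random.default_rng(1)
N = 256
pos, vel = rng.standard_normal((N, 3)), np.zeros((N, 3))
mass = np.full(N, 1.0 / N)
for _ in range(10):
    pos, vel = gravity_step(pos, vel, mass, dt=1e-3)
print(pos[0])
```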

  14. A multi-paradigm modeling framework to simulate dynamic reciprocity in a bioreactor.

    Directory of Open Access Journals (Sweden)

    Himanshu Kaul

    Full Text Available Despite numerous technology advances, bioreactors are still mostly utilized as functional black boxes where trial and error eventually leads to the desirable cellular outcome. Investigators have applied various computational approaches to understand the impact the internal dynamics of such devices has on overall cell growth, but such models cannot provide a comprehensive perspective regarding the system dynamics, due to limitations inherent to the underlying approaches. In this study, a novel multi-paradigm modeling platform capable of simulating the dynamic bidirectional relationship between cells and their microenvironment is presented. Designing the modeling platform entailed fully combining and coupling an agent-based modeling platform with a transport phenomena computational modeling framework. To demonstrate capability, the platform was used to study the impact of bioreactor parameters on the overall cell population behavior and vice versa. To achieve this, virtual bioreactors were constructed and seeded. The virtual cells, guided by a set of rules involving the simulated mass transport inside the bioreactor as well as cell-related probabilistic parameters, were capable of displaying an array of behaviors such as proliferation, migration, chemotaxis and apoptosis. In this way the platform was shown to capture not only the impact of bioreactor transport processes on cellular behavior but also the influence that cellular activity wields on that very same local mass transport, thereby influencing overall cell growth. The platform was validated by simulating cellular chemotaxis in a virtual direct-visualization chamber and comparing the simulation with its experimental analogue. The results presented in this paper are in agreement with published models of similar flavor. The modeling platform can be used as a concept selection tool to optimize bioreactor design specifications.
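    A minimal sketch of the bidirectional coupling such a platform implements, under strong assumptions (2D periodic nutrient field, explicit diffusion, ad hoc uptake and division rules; none of these are the paper's actual models):

```python
import numpy as np

def coupled_step(C, agents, D=0.1, dt=0.1, dx=1.0, uptake=0.05, p_div=0.3, rng=None):
    """One iteration of the two-way coupling: (1) explicit diffusion of the
    nutrient field C, (2) agents consume nutrient locally and may divide,
    which in turn reshapes the field the next agents see."""
    if rng is None:
        rng = np.random.default_rng()
    # (1) Transport: explicit 5-point Laplacian (stable for D*dt/dx^2 <= 0.25).
    lap = (np.roll(C, 1, 0) + np.roll(C, -1, 0) +
           np.roll(C, 1, 1) + np.roll(C, -1, 1) - 4 * C) / dx**2
    C = C + dt * D * lap
    # (2) Agents: local uptake and nutrient-dependent division.
    new = []
    for (i, j) in agents:
        eaten = min(uptake, C[i, j])
        C[i, j] -= eaten
        if rng.random() < p_div * eaten / uptake:       # divide if well fed
            new.append(((i + rng.integers(-1, 2)) % C.shape[0],
                        (j + rng.integers(-1, 2)) % C.shape[1]))
    return C, agents + new

C = np.ones((50, 50))                   # initial nutrient field
agents = [(25, 25)]                     # one seeded cell
rng = np.random.default_rng(2)
for _ in range(100):
    C, agents = coupled_step(C, agents, rng=rng)
print(len(agents), C.min())             # colony size and local nutrient depletion
```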

  15. Modification of the Argonne tandem

    International Nuclear Information System (INIS)

    For nuclear structure experiments with heavy ions it is necessary to have ion energies in excess of 5 MeV per nucleon. At the Argonne FN tandem accelerator this was accomplished by the addition of a superconducting linac. Modifications of the FN tandem to improve the performance of the pair are described.

  16. Progress report for FACETS (Framework Application for Core-Edge Transport Simulations): C.S. SAP

    International Nuclear Information System (INIS)

    The mission of the Computer Science Scientific Application Partnership (C.S. SAP) at LLNL is to develop and apply leading-edge scientific component technology to FACETS software. Contributions from LLNL's fusion energy program staff towards the underlying physics modules are described in a separate report. FACETS uses component technology to selectively combine multiple physics and solver software modules, written in different languages by different institutions, into a tightly integrated, parallel computing framework for Tokamak reactor modeling. In the past fiscal year, the C.S. SAP has focused on two primary tasks: applying Babel to connect UEDGE into the FACETS framework through UEDGE's existing Python interface, and developing a next-generation componentization strategy for UEDGE which avoids the use of Python. The FACETS project uses Babel to solve its language interoperability challenges. Specific accomplishments for the year include: (1) Refined the SIDL interfaces for UEDGE to satisfy the standard interfaces required by FACETS for all physics modules; this required consensus building between framework and UEDGE developers. (2) Wrote a prototype C++ driver for UEDGE to demonstrate how UEDGE can be called from C++ using Babel. (3) Supported the FACETS project by adding new features to Babel, such as release number tagging, porting to new machines, and adding new configuration options; Babel modifications were delivered to FACETS by testing and publishing development snapshots in the project's software repository. (4) Assisted Tech-X Corporation in testing and debugging a high-level build system for the complete FACETS tool chain, that is, the complete list of third-party software libraries that FACETS depends on directly or indirectly (e.g., MPI, HDF5, PACT, etc.). (5) Designed and implemented a new approach to wrapping UEDGE as a FACETS component without requiring Python. To get simulation results as soon as possible, our initial connection from the FACETS

  17. A discrete element based simulation framework to investigate particulate spray deposition processes

    KAUST Repository

    Mukherjee, Debanjan

    2015-06-01

    © 2015 Elsevier Inc. This work presents a computer simulation framework based on discrete element method to analyze manufacturing processes that comprise a loosely flowing stream of particles in a carrier fluid being deposited on a target surface. The individual particulate dynamics under the combined action of particle collisions, fluid-particle interactions, particle-surface contact and adhesive interactions is simulated, and aggregated to obtain global system behavior. A model for deposition which incorporates the effect of surface energy, impact velocity and particle size, is developed. The fluid-particle interaction is modeled using appropriate spray nozzle gas velocity distributions and a one-way coupling between the phases. It is found that the particle response times and the release velocity distribution of particles have a combined effect on inter-particle collisions during the flow along the spray. It is also found that resolution of the particulate collisions close to the target surface plays an important role in characterizing the trends in the deposit pattern. Analysis of the deposit pattern using metrics defined from the particle distribution on the target surface is provided to characterize the deposition efficiency, deposit size, and scatter due to collisions.
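    One ingredient of such a framework, the particle-surface deposition decision, can be caricatured by comparing impact speed with a critical capture velocity that falls with particle size and rises with surface energy. The scaling v_crit = C·sqrt(γ/(ρd)) below is an assumed toy form (dimensionally consistent, but not the authors' calibrated adhesion model), and all parameter values are invented for illustration:

```python
import numpy as np

def sticks(d, v_imp, gamma=0.05, rho=2500.0, C=30.0):
    """Deposition criterion sketch: a particle of diameter d (m) hitting at
    v_imp (m/s) adheres if v_imp is below a critical capture velocity.
    v_crit = C * sqrt(gamma / (rho * d)) is an assumed toy scaling in which
    adhesion is easier for small particles and high surface energy gamma;
    the prefactor C is a tunable placeholder."""
    v_crit = C * np.sqrt(gamma / (rho * d))
    return v_imp < v_crit

rng = np.random.default_rng(0)
d = rng.uniform(5e-6, 50e-6, 10_000)          # particle diameters, 5-50 microns
v = rng.normal(30.0, 8.0, 10_000).clip(0.1)   # impact speeds from the spray (m/s)
print(f"deposition efficiency: {sticks(d, v).mean():.1%}")
```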

  18. A framework for stochastic simulation of distribution practices for hotel reservations

    Energy Technology Data Exchange (ETDEWEB)

    Halkos, George E.; Tsilika, Kyriaki D. [Laboratory of Operations Research, Department of Economics, University of Thessaly, Korai 43, 38 333, Volos (Greece)

    2015-03-10

    The focus of this study is primarily on the Greek hotel industry. The objective is to design and develop a framework for stochastic simulation of reservation requests, reservation arrivals, cancellations and hotel occupancy with a planning horizon of a tourist season. In the Greek hospitality industry there were two competing policies for the reservation planning process up to 2003: reservations coming directly from customers, and reservations management relying on tour operator(s). Recently the Internet, along with other emerging technologies, has offered the potential to disrupt enduring distribution arrangements. The focus of the study is on the choice of distribution intermediaries. We present an empirical model of the hotel reservation planning process that makes use of a symbolic simulation (Monte Carlo method), as requests for reservations, cancellations, and arrival rates are all sources of uncertainty. We consider as a case study the problem of determining the optimal booking strategy for a medium-size hotel on Skiathos Island, Greece. Probability distributions and parameter estimates result from the historical data available and from suggestions made in the relevant literature. The results of this study may assist hotel managers in defining distribution strategies for hotel rooms and evaluating the performance of the reservations management system.
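    A hedged sketch of the Monte Carlo machinery such a framework needs, with invented placeholder rates and distributions rather than the paper's estimated ones:

```python
import numpy as np

def season_occupancy(days=180, rooms=100, req_rate=12.0,
                     p_cancel=0.08, mean_stay=4, seed=0):
    """Monte Carlo sketch of one hotel season: Poisson reservation requests,
    geometric lengths of stay, independent cancellations, capacity cap."""
    rng = np.random.default_rng(seed)
    occupied = np.zeros(days, dtype=int)
    for day in range(days):
        for _ in range(rng.poisson(req_rate)):   # today's reservation requests
            if rng.random() < p_cancel:          # booking cancelled before arrival
                continue
            stay = rng.geometric(1.0 / mean_stay)
            nights = slice(day, min(day + stay, days))
            if np.all(occupied[nights] < rooms): # accept only if rooms free every night
                occupied[nights] += 1
    return occupied.mean() / rooms

print(f"mean occupancy: {season_occupancy():.2%}")
```

    Replicating such a season many times with different seeds yields distributions of occupancy and rejected demand under each candidate distribution policy.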

  19. A discrete element based simulation framework to investigate particulate spray deposition processes

    Energy Technology Data Exchange (ETDEWEB)

    Mukherjee, Debanjan, E-mail: debanjan@berkeley.edu; Zohdi, Tarek I., E-mail: zohdi@me.berkeley.edu

    2015-06-01

    This work presents a computer simulation framework based on discrete element method to analyze manufacturing processes that comprise a loosely flowing stream of particles in a carrier fluid being deposited on a target surface. The individual particulate dynamics under the combined action of particle collisions, fluid–particle interactions, particle–surface contact and adhesive interactions is simulated, and aggregated to obtain global system behavior. A model for deposition which incorporates the effect of surface energy, impact velocity and particle size, is developed. The fluid–particle interaction is modeled using appropriate spray nozzle gas velocity distributions and a one-way coupling between the phases. It is found that the particle response times and the release velocity distribution of particles have a combined effect on inter-particle collisions during the flow along the spray. It is also found that resolution of the particulate collisions close to the target surface plays an important role in characterizing the trends in the deposit pattern. Analysis of the deposit pattern using metrics defined from the particle distribution on the target surface is provided to characterize the deposition efficiency, deposit size, and scatter due to collisions.

  20. A Simulation-Based Framework for the Cooperation of VMS Travel Guidance and Traffic Signal Control

    Directory of Open Access Journals (Sweden)

    Meng Li

    2014-01-01

    Full Text Available Nowadays, both travel guidance systems and traffic signal control systems are quite common in urban traffic management. In order to achieve a collaborative effect, different models have been proposed over the last two decades. In recent years, with the development of variable message sign (VMS) technology, more and more VMS panels have been installed on major arterials to provide highly visible and concise graphic or text messages to drivers, especially in developing countries. To capture drivers' responses to VMS, we establish a drivers' en-route diversion model based on a stated-preference survey. We then propose a cooperative mechanism and systematic framework for VMS travel guidance and major-arterial signal operations, and formulate a two-stage nested optimization problem. To solve this optimization problem, a simulation-based optimization method is adopted to optimize the cooperative strategies with TRANSIMS. The proposed method is applied to the real network of Tianjin City, comprising 30 nodes and 46 links. Simulations show that this new method improves the network condition by 26.3%, and analysis reveals that a GA with nested dynamic programming is an effective technique for solving the optimization problem.
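    The two-stage nested structure can be sketched as an outer search over the VMS guidance strategy wrapping an inner search over signal timings, each candidate scored by a simulation run. Below, simulate_network is a synthetic stand-in for a traffic simulator such as TRANSIMS, and plain random search replaces the GA/dynamic-programming machinery; everything here is illustrative:

```python
import numpy as np

def simulate_network(signal_plan, diversion_rate):
    """Stand-in for a traffic simulation run: returns total delay for a
    signal plan given the share of drivers diverted by VMS guidance."""
    base = 100.0 + ((signal_plan - 30.0) ** 2).sum()
    return base * (1.0 - 0.3 * diversion_rate * (1.0 - diversion_rate))

def nested_optimize(n_outer=200, rng=None):
    """Two-stage nested sketch: outer loop searches the VMS strategy,
    inner loop searches signal timings for each candidate strategy."""
    rng = rng or np.random.default_rng(0)
    best = (np.inf, None, None)
    for _ in range(n_outer):
        diversion = rng.uniform(0.0, 1.0)              # outer: VMS diversion share
        plans = rng.uniform(20.0, 60.0, size=(50, 4))  # inner: candidate green splits (s)
        delays = [simulate_network(p, diversion) for p in plans]
        i = int(np.argmin(delays))
        if delays[i] < best[0]:
            best = (delays[i], diversion, plans[i])
    return best

delay, diversion, plan = nested_optimize()
print(round(delay, 1), round(diversion, 2), plan.round(1))
```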

  1. A framework for stochastic simulation of distribution practices for hotel reservations

    International Nuclear Information System (INIS)

    The focus of this study is primarily on the Greek hotel industry. The objective is to design and develop a framework for stochastic simulation of reservation requests, reservation arrivals, cancellations and hotel occupancy with a planning horizon of a tourist season. In the Greek hospitality industry there were two competing policies for the reservation planning process up to 2003: reservations coming directly from customers, and reservations management relying on tour operator(s). Recently the Internet, along with other emerging technologies, has offered the potential to disrupt enduring distribution arrangements. The focus of the study is on the choice of distribution intermediaries. We present an empirical model of the hotel reservation planning process that makes use of a symbolic simulation (Monte Carlo method), as requests for reservations, cancellations, and arrival rates are all sources of uncertainty. We consider as a case study the problem of determining the optimal booking strategy for a medium-size hotel on Skiathos Island, Greece. Probability distributions and parameter estimates result from the historical data available and from suggestions made in the relevant literature. The results of this study may assist hotel managers in defining distribution strategies for hotel rooms and evaluating the performance of the reservations management system.

  2. KMCLib: A general framework for lattice kinetic Monte Carlo (KMC) simulations

    CERN Document Server

    Leetmaa, Mikael

    2014-01-01

    KMCLib is a general framework for lattice kinetic Monte Carlo (KMC) simulations. The program can handle simulations of the diffusion and reaction of millions of particles in one, two, or three dimensions, and is designed to be easily extended and customized by the user to allow for the development of complex custom KMC models for specific systems without having to modify the core functionality of the program. Analysis modules and on-the-fly elementary step diffusion rate calculations can be implemented as plugins following a well-defined API. The plugin modules are loosely coupled to the core KMCLib program via the Python scripting language. KMCLib is written as a Python module with a backend C++ library. After initial compilation of the backend library KMCLib is used as a Python module; input to the program is given as a Python script executed using a standard Python interpreter. We give a detailed description of the features and implementation of the code and demonstrate its scaling behavior and parallel pe...
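    The core rejection-free (BKL/Gillespie) step that any lattice KMC engine performs can be written in a few lines; the sketch below illustrates the algorithm only and is not KMCLib's actual Python API:

```python
import numpy as np

def kmc_step(rates, rng):
    """One rejection-free KMC step: pick a process with probability
    proportional to its rate, then advance time by an exponential dt."""
    total = rates.sum()
    cumulative = np.cumsum(rates)
    chosen = np.searchsorted(cumulative, rng.random() * total)
    dt = -np.log(rng.random()) / total
    return chosen, dt

# Toy example: three elementary processes (e.g., hop left, hop right, desorb).
rng = np.random.default_rng(42)
rates = np.array([1.0, 1.0, 0.05])
t, counts = 0.0, np.zeros(3, dtype=int)
while t < 1000.0:
    p, dt = kmc_step(rates, rng)
    counts[p] += 1
    t += dt
print(counts / counts.sum())  # relative occurrences approach rates / rates.sum()
```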

  3. A systematic framework for molecular dynamics simulations of protein post-translational modifications.

    Directory of Open Access Journals (Sweden)

    Drazen Petrov

    Full Text Available By directly affecting the structure, dynamics and interaction networks of their targets, post-translational modifications (PTMs) of proteins play a key role in different cellular processes ranging from enzymatic activation to regulation of signal transduction to cell-cycle control. Despite the great importance of understanding how PTMs affect proteins at the atomistic level, a systematic framework for treating post-translationally modified amino acids by molecular dynamics (MD) simulations, a premier high-resolution computational biology tool, has never been developed. Here, we report and validate force field parameters (GROMOS 45a3 and 54a7) required to run and analyze MD simulations of more than 250 different types of enzymatic and non-enzymatic PTMs. The newly developed GROMOS 54a7 parameters in particular exhibit near chemical accuracy in matching experimentally measured hydration free energies (RMSE = 4.2 kJ/mol over the validation set). Using this tool, we quantitatively show that the majority of PTMs greatly alter the hydrophobicity and other physico-chemical properties of target amino acids, with the extent of change in many cases being comparable to the complete range spanned by native amino acids.

  4. A heterogeneous and parallel computing framework for high-resolution hydrodynamic simulations

    Science.gov (United States)

    Smith, Luke; Liang, Qiuhua

    2015-04-01

    Shock-capturing hydrodynamic models are now widely applied in the context of flood risk assessment and forecasting, accurately capturing the behaviour of surface water over ground and within rivers. Such models are generally explicit in their numerical basis, and can be computationally expensive; this has prohibited full use of high-resolution topographic data for complex urban environments, now easily obtainable through airborne altimetric surveys (LiDAR). As processor clock speed advances have stagnated in recent years, further computational performance gains are largely dependent on the use of parallel processing. Heterogeneous computing architectures (e.g. graphics processing units or compute accelerator cards) provide a cost-effective means of achieving high throughput in cases where the same calculation is performed with a large input dataset. In recent years this technique has been applied successfully for flood risk mapping, such as within the national surface water flood risk assessment for the United Kingdom. We present a flexible software framework for hydrodynamic simulations across multiple processors of different architectures, within multiple computer systems, enabled using OpenCL and Message Passing Interface (MPI) libraries. A finite-volume Godunov-type scheme is implemented using the HLLC approach to solving the Riemann problem, with optional extension to second-order accuracy in space and time using the MUSCL-Hancock approach. The framework is successfully applied on personal computers and a small cluster to provide considerable improvements in performance. The most significant performance gains were achieved across two servers, each containing four NVIDIA GPUs, with a mix of K20, M2075 and C2050 devices. Advantages are found with respect to decreased parametric sensitivity, and thus in reducing uncertainty, for a major fluvial flood within a large catchment during 2005 in Carlisle, England. Simulations for the three-day event could be performed
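    As a language-neutral illustration of the numerical core described above, the sketch below performs first-order Godunov-type finite-volume updates of the 1D shallow-water equations with an HLL approximate Riemann solver. The actual framework uses HLLC, the optional second-order MUSCL-Hancock extension, 2D unstructured grids, and OpenCL/MPI parallelism, none of which is reproduced here:

```python
import numpy as np
g = 9.81

def hll_flux(hL, huL, hR, huR):
    """HLL approximate Riemann solver for the 1D shallow-water equations."""
    uL, uR = huL / hL, huR / hR
    cL, cR = np.sqrt(g * hL), np.sqrt(g * hR)
    sL = np.minimum(uL - cL, uR - cR)           # left/right wave-speed estimates
    sR = np.maximum(uL + cL, uR + cR)
    FL = np.stack([huL, huL * uL + 0.5 * g * hL**2])
    FR = np.stack([huR, huR * uR + 0.5 * g * hR**2])
    UL, UR = np.stack([hL, huL]), np.stack([hR, huR])
    Fhll = (sR * FL - sL * FR + sL * sR * (UR - UL)) / (sR - sL)
    return np.where(sL >= 0, FL, np.where(sR <= 0, FR, Fhll))

def step(h, hu, dx, cfl=0.45):
    """One first-order Godunov update with reflective (wall) boundaries."""
    hp, hup = np.pad(h, 1, mode='edge'), np.pad(hu, 1, mode='edge')
    hup[0], hup[-1] = -hup[1], -hup[-2]          # reflect momentum at the walls
    F = hll_flux(hp[:-1], hup[:-1], hp[1:], hup[1:])
    smax = np.abs(hup / hp).max() + np.sqrt(g * hp).max()
    dt = cfl * dx / smax                         # CFL-limited time step
    h = h - dt / dx * (F[0, 1:] - F[0, :-1])
    hu = hu - dt / dx * (F[1, 1:] - F[1, :-1])
    return h, hu, dt

# Dam-break test: still water with a step in depth.
N, dx = 200, 0.05
h = np.where(np.arange(N) < N // 2, 2.0, 1.0)
hu = np.zeros(N)
t = 0.0
while t < 0.5:
    h, hu, dt = step(h, hu, dx)
    t += dt
print(h.min(), h.max())   # intermediate depths between 1 and 2 develop
```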

  5. Simulating mesoscale coastal evolution for decadal coastal management: A new framework integrating multiple, complementary modelling approaches

    Science.gov (United States)

    van Maanen, Barend; Nicholls, Robert J.; French, Jon R.; Barkwith, Andrew; Bonaldo, Davide; Burningham, Helene; Brad Murray, A.; Payo, Andres; Sutherland, James; Thornhill, Gillian; Townend, Ian H.; van der Wegen, Mick; Walkden, Mike J. A.

    2016-03-01

    Coastal and shoreline management increasingly needs to consider morphological change occurring at decadal to centennial timescales, especially that related to climate change and sea-level rise. This requires the development of morphological models operating at a mesoscale, defined by time and length scales of the order 10¹ to 10² years and 10¹ to 10² km. So-called 'reduced complexity' models that represent critical processes at scales not much smaller than the primary scale of interest, and are regulated by capturing the critical feedbacks that govern landform behaviour, are proving effective as a means of exploring emergent coastal behaviour at a landscape scale. Such models tend to be computationally efficient and are thus easily applied within a probabilistic framework. At the same time, reductionist models, built upon a more detailed description of hydrodynamic and sediment transport processes, are capable of application at increasingly broad spatial and temporal scales. More qualitative modelling approaches are also emerging that can guide the development and deployment of quantitative models, and these can be supplemented by varied data-driven modelling approaches that can achieve new explanatory insights from observational datasets. Such disparate approaches have hitherto been pursued largely in isolation by mutually exclusive modelling communities. Brought together, they have the potential to facilitate a step change in our ability to simulate the evolution of coastal morphology at scales that are most relevant to managing erosion and flood risk. Here, we advocate and outline a new integrated modelling framework that deploys coupled mesoscale reduced complexity models, reductionist coastal area models, data-driven approaches, and qualitative conceptual models. Integration of these heterogeneous approaches gives rise to model compositions that can potentially resolve decadal- to centennial-scale behaviour of diverse coupled open coast, estuary and inner

  6. The effect of casting and masticatory simulation on strain and misfit of implant-supported metal frameworks.

    Science.gov (United States)

    Bhering, Cláudia Lopes Brilhante; Marques, Isabella da Silva Vieira; Takahashi, Jessica Mie Ferreira Koyama; Barão, Valentim Adelino Ricardo; Consani, Rafael Leonardo Xediek; Mesquita, Marcelo Ferraz

    2016-05-01

    The influence of casting and masticatory simulation on marginal misfit and strain in multiple implant-supported prostheses was evaluated. Three-unit screw-retained fixed dental prosthesis (FDP) and screw-retained full-arch fixed dental prosthesis (FAFDP) frameworks were made using calcinable or overcasted cylinders on conical dental implant abutments. Four groups were obtained according to the cylinder and prosthesis type (n = 10). Frameworks were cast in CoCr alloy and subjected to strain gauge analyses and marginal misfit measurements before and after 10⁶ mechanical cycles (2 Hz/280 N). Results were submitted to ANOVA, Tukey's HSD and Pearson correlation tests (α = 0.05). No difference was found in misfit among all groups and times (p > 0.05). Overcasted frameworks showed higher strain than the calcinable ones (FDP: initial p = 0.0047, final p = 0.0004; FAFDP: initial p = 0.0476, final p = 0.0115). The masticatory simulation did not influence strain (p > 0.05). No correlation was observed between strain and misfit (r = 0.24; p > 0.05). In conclusion, the marginal misfit of the overcasted full-arch frameworks was higher than clinically acceptable values, showing that overcasting is not an ideal method for full-arch prostheses. Overcasted frameworks generate higher strain in the system. The masticatory simulation had no influence on the misfit and strain of multiple prostheses. PMID:26952480

  7. Composable Mission Framework for Rapid End-to-End Mission Design and Simulation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation proposed here is the Composable Mission Framework (CMF), a model-based software framework that shall enable seamless continuity of mission design and...

  8. Enhanced adsorption selectivity of hydrogen/methane mixtures in metal-organic frameworks with interpenetration: A molecular simulation study

    NARCIS (Netherlands)

    B. Liu; Q. Yang; C. Xue; C. Zhong; B. Chen; B. Smit

    2008-01-01

    In this work a systematic molecular simulation study was performed to study the effect of interpenetration on gas mixture separation in metal−organic frameworks (MOFs). To do this, three pairs of isoreticular MOFs (IRMOFs) with and without interpenetration were adopted to compare their adsorption se

  9. Numerical simulation of the Moon's rotation in a rigorous relativistic framework

    Science.gov (United States)

    Wang, Zai; Han, Wen-Biao; Tang, Kai; Tao, Jin-He

    2016-06-01

    This paper describes a numerical simulation of the rigid rotation of the Moon in a relativistic framework. Following a resolution passed by the International Astronomical Union (IAU) in 2000, we construct a kinematically non-rotating reference system named the Selenocentric Celestial Reference System (SCRS) and give the time transformation between the Selenocentric Coordinate Time (TCS) and Barycentric Coordinate Time (TCB). The post-Newtonian equations of the Moon's rotation are written in the SCRS, and they are integrated numerically. We calculate the correction to the rotation of the Moon due to the total relativistic torque, which includes post-Newtonian and gravitomagnetic torques as well as geodetic precession. We find two dominant periods associated with this correction: 18.6 yr and 80.1 yr. In addition, the precession of the rotating axes caused by fourth-degree and fifth-degree harmonics of the Moon is also analyzed, and we have found that the main periods of this precession are 27.3 d, 2.9 yr, 18.6 yr and 80.1 yr.

  10. Autogenerator-based modelling framework for development of strategic games simulations: rational pigs game extended.

    Science.gov (United States)

    Fabac, Robert; Radošević, Danijel; Magdalenić, Ivan

    2014-01-01

    When considering strategic games from a conceptual perspective that focuses on the decision-making rationality of the participants, the issues of modelling and simulation are rarely discussed. The well-known Rational Pigs matrix game has been analyzed relatively intensively in terms of reassessing the logic of two players involved in asymmetric situations as gluttons that differ significantly in their attributes. This paper presents a successful attempt to use an autogenerator for creating the framework of the game, including the predefined scenarios and corresponding payoffs. The autogenerator offers flexibility in the specification of game parameters, which include variations in the number of simultaneous players, their features, game objects and their attributes, as well as some general game characteristics. In the proposed approach, the model of the autogenerator was upgraded to enable program specification updates. For the treatment of more complex strategic scenarios, we created the Rational Pigs Game Extended (RPGE), in which the introduction of a third glutton entails significant structural changes. In addition, due to particular attributes of the new player, "the tramp," one equilibrium point from the original game is destabilized, which influences the decision-making of rational players.
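    To make the game-theoretic setting concrete, the sketch below finds pure-strategy Nash equilibria of a 2x2 bimatrix game by best-response enumeration; the payoff values are placeholders in the spirit of Rational Pigs, not the authors' specifications:

```python
import numpy as np
from itertools import product

def pure_nash(A, B):
    """Pure-strategy Nash equilibria of a bimatrix game: cells (i, j) where
    A[i, j] is a best response for the row player and B[i, j] for the column
    player."""
    eq = []
    for i, j in product(range(A.shape[0]), range(A.shape[1])):
        if A[i, j] >= A[:, j].max() and B[i, j] >= B[i, :].max():
            eq.append((i, j))
    return eq

# Placeholder payoffs; strategy 0 = "press the lever", 1 = "wait".
A = np.array([[5, 4],    # big pig (row player)
              [9, 0]])
B = np.array([[1, 4],    # small pig (column player)
              [-1, 0]])
print(pure_nash(A, B))   # [(0, 1)]: big pig presses, small pig waits
```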

  11. 2015 Annual Report - Argonne Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Collins, James R. [Argonne National Lab. (ANL), Argonne, IL (United States); Papka, Michael E. [Argonne National Lab. (ANL), Argonne, IL (United States); Cerny, Beth A. [Argonne National Lab. (ANL), Argonne, IL (United States); Coffey, Richard M. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-01-01

    The Argonne Leadership Computing Facility provides supercomputing capabilities to the scientific and engineering community to advance fundamental discovery and understanding in a broad range of disciplines.

  12. 2014 Annual Report - Argonne Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Collins, James R. [Argonne National Lab. (ANL), Argonne, IL (United States); Papka, Michael E. [Argonne National Lab. (ANL), Argonne, IL (United States); Cerny, Beth A. [Argonne National Lab. (ANL), Argonne, IL (United States); Coffey, Richard M. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2014-01-01

    The Argonne Leadership Computing Facility provides supercomputing capabilities to the scientific and engineering community to advance fundamental discovery and understanding in a broad range of disciplines.

  13. A systematic intercomparison of regional flood frequency analysis models in a simulation framework

    Science.gov (United States)

    Ganora, Daniele; Laio, Francesco; Claps, Pierluigi

    2015-04-01

    Regional frequency analysis (RFA) is a well-established methodology to provide an estimate of the flood frequency curve (or other discharge-related variables), based on the fundamental concept of substituting temporal information at a site (no data or short time series) by exploiting observations at other sites (spatial information). Different RFA paradigms exist, depending on the way the information is transferred to the site of interest. Despite the wide use of this methodology, a systematic comparison between these paradigms has not been performed. The aim of this study is to provide a framework within which to carry out the intercomparison: we synthetically generate data through Monte Carlo simulations for a number of (virtual) stations, following a GEV parent distribution; different scenarios can be created to represent different spatial heterogeneity patterns by manipulating the parameters of the parent distribution at each station (e.g. with a linear variation in space of the shape parameter of the GEV). A special case is the homogeneous scenario, where each station record is sampled from the same parent distribution. For each scenario and each simulation, different regional models are applied to evaluate the 200-year growth factor at each station. Results are then compared to the exact growth factor of each station, which is known in our virtual world. The regional approaches considered include: (i) a single growth curve for the whole region; (ii) a multiple-region model based on cluster analysis, which searches for an adequate number of homogeneous subregions; (iii) a Region-of-Influence model, which defines a homogeneous subregion for each site; (iv) a spatially smooth estimation procedure based on linear regressions. A further benchmark model is the at-site estimate based on the analysis of the local record. A comprehensive analysis of the results of the simulations shows that, if the scenario is homogeneous (no spatial variability), all the regional approaches
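    A miniature version of such a virtual-world experiment, assuming a homogeneous scenario and illustrative GEV parameters (note scipy's shape convention c = -k), compares the exact 200-year growth factor with at-site fits and a pooled single-growth-curve estimate:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n_sites, record_len, T = 20, 40, 200
p = 1.0 - 1.0 / T                      # non-exceedance probability of the T-year event

# Homogeneous scenario: every station shares the same GEV parent (values illustrative).
parent = stats.genextreme(-0.1, loc=100.0, scale=30.0)
exact_growth = parent.ppf(p) / parent.mean()

records = parent.rvs(size=(n_sites, record_len), random_state=rng)

# At-site estimates: fit each short record separately.
atsite = []
for rec in records:
    c, l, s = stats.genextreme.fit(rec)
    atsite.append(stats.genextreme.ppf(p, c, loc=l, scale=s) / rec.mean())

# Regional "single growth curve": pool all index-flood-rescaled records.
pooled = (records / records.mean(axis=1, keepdims=True)).ravel()
c, l, s = stats.genextreme.fit(pooled)
regional = stats.genextreme.ppf(p, c, loc=l, scale=s)

print(f"exact {exact_growth:.2f}  regional {regional:.2f}  "
      f"at-site mean {np.mean(atsite):.2f} +/- {np.std(atsite):.2f}")
```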

  14. Elements of naturality in dynamical simulation frameworks for Hamiltonian, thermostatic, and Lindbladian flows on classical and quantum state-spaces

    CERN Document Server

    Sidles, John A; Jacky, Jonathan P; Picone, Rico A R; Harsila, Scott A

    2010-01-01

    The practical focus of this work is the dynamical simulation of polarization transport processes in quantum spin microscopy and spectroscopy. The simulation framework is built up progressively, beginning with state-spaces (configuration manifolds) that are geometrically natural, introducing coordinates that are algebraically natural, and finally specifying dynamical potentials that are physically natural; in each respect explicit criteria are given for "naturality." The resulting framework encompasses Hamiltonian flow (both classical and quantum), quantum Lindbladian processes, and classical thermostatic processes. Constructive validation and verification criteria are given for metric and symplectic flows on classical, quantum, and hybrid state-spaces, with particular emphasis on tensor network state-spaces. Both classical and quantum examples are presented, including dynamic nuclear polarization (DNP). A broad span of applications and challenges is discussed, ranging from the design and simulation of quantum...
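    Of the flows named above, Hamiltonian flow is the easiest to illustrate: a symplectic integrator preserves exactly the geometric structure that motivates the authors' notion of naturality. A minimal leapfrog sketch for a separable Hamiltonian H = p²/2m + V(q), with the pendulum potential as an arbitrary example:

```python
import numpy as np

def leapfrog(q, p, grad_V, dt, steps, m=1.0):
    """Symplectic (leapfrog/Stoermer-Verlet) integration of a separable
    Hamiltonian H = p^2/(2m) + V(q); preserves phase-space structure."""
    p = p - 0.5 * dt * grad_V(q)            # opening half kick
    for _ in range(steps - 1):
        q = q + dt * p / m                  # drift
        p = p - dt * grad_V(q)              # full kick
    q = q + dt * p / m
    p = p - 0.5 * dt * grad_V(q)            # closing half kick
    return q, p

# Pendulum: V(q) = -cos(q), so grad_V = sin. Energy error stays bounded.
grad_V = np.sin
q, p = 1.0, 0.0
E0 = 0.5 * p**2 - np.cos(q)
q, p = leapfrog(q, p, grad_V, dt=0.05, steps=10_000)
print(abs(0.5 * p**2 - np.cos(q) - E0))     # small, bounded energy error
```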

  15. A Simulation Framework for Exploring Socioecological Dynamics and Sustainability of Settlement Systems Under Stress in Ancient Mesopotamia and Beyond

    Science.gov (United States)

    Christiansen, J. H.; Altaweel, M. R.

    2007-12-01

    The presentation will describe an object-oriented, agent-based simulation framework being used to help answer longstanding questions regarding the development trajectories and sustainability of ancient Mesopotamian settlement systems. This multidisciplinary, multi-model framework supports explicit, fine-scale representations of the dynamics of key natural processes such as crop growth, hydrology, and weather, operating concurrently with social processes such as kinship-driven behaviors, farming and herding practices, social stratification, and economic and political activities carried out by social agents that represent individual persons, households, and larger-scale organizations. The framework has allowed us to explore the inherently coupled dynamics of modeled settlements and landscapes that are undergoing diverse social and environmental stresses, both acute and chronic, across multi-generational time spans. The simulation framework was originally used to address single-settlement scenarios, but has recently been extended to begin to address settlement system sustainability issues at sub-regional to regional scale, by introducing a number of new dynamic mechanisms, such as the activities of nomadic communities, that manifest themselves at these larger spatial scales. The framework is flexible and scalable and has broad applicability. It has, for example, recently been adapted to address agroeconomic sustainability of settlement systems in modern rural Thailand, testing the resilience and vulnerability of settled landscapes in the face of such perturbations as large-scale political interventions, global economic shifts, and climate change.

  16. Infectio: a Generic Framework for Computational Simulation of Virus Transmission between Cells.

    Science.gov (United States)

    Yakimovich, Artur; Yakimovich, Yauhen; Schmid, Michael; Mercer, Jason; Sbalzarini, Ivo F; Greber, Urs F

    2016-01-01

    Viruses spread between cells, tissues, and organisms by cell-free and cell-cell mechanisms, depending on the cell type, the nature of the virus, or the phase of the infection cycle. The mode of viral transmission has a large impact on disease development, the outcome of antiviral therapies or the efficacy of gene therapy protocols. The transmission mode of viruses can be addressed in tissue culture systems using live-cell imaging. Yet even in relatively simple cell cultures, the mechanisms of viral transmission are difficult to distinguish. Here we present a cross-platform software framework called "Infectio," which is capable of simulating transmission phenotypes in tissue culture of virtually any virus. Infectio can estimate interdependent biological parameters, for example for vaccinia virus infection, and differentiate between cell-cell and cell-free virus spreading. Infectio assists in elucidating virus transmission mechanisms, a feature useful for designing strategies of perturbing or enhancing viral transmission. The complexity of the Infectio software is low compared to that of other software commonly used to quantitate features of cell biological images, which yields stable and relatively error-free output from Infectio. The software is open source (GPLv3 license), and operates on the major platforms (Windows, Mac, and Linux). The complete source code can be downloaded from http://infectio.github.io/index.html. IMPORTANCE Infectio presents a generalized platform to analyze virus infection spread between cells. It allows the simulation of plaque phenotypes from image-based assays. Viral plaques are the result of virus spreading from primary infected cells to neighboring cells. This is a complex process and involves neighborhood effects at cell-cell contact sites or fluid dynamics in the extracellular medium. Infectio differentiates between two major modes of virus transmission between cells, allowing in silico testing of hypotheses about spreading
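    A toy version of the two transmission modes Infectio distinguishes can be written as a stochastic cellular automaton in which cell-cell spread infects only neighbours while cell-free spread can seed any susceptible cell. This is an illustrative model with invented probabilities, not Infectio's implementation:

```python
import numpy as np

def spread(grid, p_cc=0.3, p_cf=1e-6, rng=None):
    """One round of infection on a boolean cell grid.
    p_cc: per-neighbour cell-cell transmission probability (local plaque growth).
    p_cf: per-infected-cell cell-free transmission probability (diffuse spread)."""
    if rng is None:
        rng = np.random.default_rng()
    # Count infected 4-neighbours of every cell (periodic boundaries).
    nbrs = sum(np.roll(grid, shift, axis)
               for shift, axis in [(1, 0), (-1, 0), (1, 1), (-1, 1)])
    p_local = 1.0 - (1.0 - p_cc) ** nbrs               # >=1 neighbour transmits
    p_cfree = 1.0 - (1.0 - p_cf) ** grid.sum()         # any infected cell sheds virus
    p_total = 1.0 - (1.0 - p_local) * (1.0 - p_cfree)
    return grid | (rng.random(grid.shape) < p_total)

rng = np.random.default_rng(3)
grid = np.zeros((100, 100), dtype=bool)
grid[50, 50] = True                                    # primary infected cell
for _ in range(30):
    grid = spread(grid, rng=rng)
print(grid.sum(), "infected cells")                    # compact plaque + sparse distant foci
```

    Setting p_cf to zero reproduces a pure plaque phenotype, which is the kind of in silico perturbation the abstract describes.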

  17. On complexities of impact simulation of fiber reinforced polymer composites: a simplified modeling framework.

    Science.gov (United States)

    Alemi-Ardakani, M; Milani, A S; Yannacopoulos, S

    2014-01-01

    Impact modeling of fiber reinforced polymer composites is a complex and challenging task, in particular for practitioners with less experience in advanced coding and user-defined subroutines. Different numerical algorithms have been developed over the past decades for impact modeling of composites, yet a considerable gap often exists between predicted and experimental observations. In this paper, after a review of reported sources of complexity in impact modeling of fiber reinforced polymer composites, two simplified approaches are presented for fast simulation of the out-of-plane impact response of these materials, considering four main effects: (a) strain-rate dependency of the mechanical properties, (b) the difference between tensile and flexural bending responses, (c) delamination, and (d) the geometry of the fixture (clamping conditions). In the first approach, it is shown that by applying correction factors to the quasistatic material properties, which are often readily available from material datasheets, the role of these four sources in modeling the impact response of a given composite may be accounted for; as a result, a rough estimate of the dynamic force response of the composite can be attained. To show the application of the approach, a twill-woven polypropylene/glass reinforced thermoplastic composite laminate was tested under 200 J impact energy and modeled in Abaqus/Explicit via the built-in Hashin damage criteria. X-ray microtomography was used to investigate the presence of delamination inside the impacted sample. Finally, as a second and much simpler modeling approach, it is shown that applying only a single correction factor to all material properties at once can still yield a reasonable prediction. Both the advantages and limitations of the simplified modeling framework are addressed in the performed case study. PMID:25431787

  18. A comprehensive simulation framework for imaging single particles and biomolecules at the European X-ray Free-Electron Laser

    Science.gov (United States)

    Yoon, Chun Hong; Yurkov, Mikhail V.; Schneidmiller, Evgeny A.; Samoylova, Liubov; Buzmakov, Alexey; Jurek, Zoltan; Ziaja, Beata; Santra, Robin; Loh, N. Duane; Tschentscher, Thomas; Mancuso, Adrian P.

    2016-04-01

    The advent of newer, brighter, and more coherent X-ray sources, such as X-ray Free-Electron Lasers (XFELs), represents a tremendous growth in the potential to apply coherent X-rays to determine the structure of materials from the micron-scale down to the Angstrom-scale. There is a significant need for a multi-physics simulation framework to perform source-to-detector simulations for a single particle imaging experiment, including (i) the multidimensional simulation of the X-ray source; (ii) simulation of the wave-optics propagation of the coherent XFEL beams; (iii) atomistic modelling of photon-material interactions; (iv) simulation of the time-dependent diffraction process, including incoherent scattering; (v) assembling noisy and incomplete diffraction intensities into a three-dimensional data set using the Expansion-Maximisation-Compression (EMC) algorithm and (vi) phase retrieval to obtain structural information. We demonstrate the framework by simulating a single-particle experiment for a nitrogenase iron protein using parameters of the SPB/SFX instrument of the European XFEL. This exercise demonstrably yields interpretable consequences for structure determination that are crucial yet currently unavailable for experiment design.
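    Step (iv) of the pipeline rests on ideal coherent far-field diffraction, I(q) = |Σ_j f_j exp(i q·r_j)|²; the sketch below evaluates that sum for a random cluster of point atoms, with everything upstream (source, optics, radiation damage) and downstream (noise, detector response, EMC assembly, phasing) deliberately omitted. All sizes and scattering factors are illustrative:

```python
import numpy as np

def diffraction_pattern(r, f, qx, qy):
    """Ideal coherent far-field diffraction from a set of point atoms:
    I(q) = |sum_j f_j * exp(i q . r_j)|^2, sampled on a detector-plane
    grid of momentum transfers (no noise, no incoherent term)."""
    Q = np.stack(np.meshgrid(qx, qy, indexing='ij'), axis=-1)   # (nq, nq, 2)
    phase = np.tensordot(Q, r[:, :2].T, axes=([2], [0]))        # q . r_j per atom
    F = (f * np.exp(1j * phase)).sum(axis=-1)                   # complex amplitude
    return np.abs(F) ** 2

rng = np.random.default_rng(5)
r = rng.uniform(-5e-9, 5e-9, size=(100, 3))    # 100 atoms in a ~10 nm blob (m)
f = np.ones(100)                                # equal scattering factors
q = np.linspace(-2e9, 2e9, 128)                 # momentum transfer grid (1/m)
I = diffraction_pattern(r, f, q, q)
print(I.shape, I.max() / I.mean())              # speckled pattern, strong forward peak
```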

  19. KMCLib: A general framework for lattice kinetic Monte Carlo (KMC) simulations

    Science.gov (United States)

    Leetmaa, Mikael; Skorodumova, Natalia V.

    2014-09-01

    KMCLib is a general framework for lattice kinetic Monte Carlo (KMC) simulations. The program can handle simulations of the diffusion and reaction of millions of particles in one, two, or three dimensions, and is designed to be easily extended and customized by the user to allow for the development of complex custom KMC models for specific systems without having to modify the core functionality of the program. Analysis modules and on-the-fly elementary step diffusion rate calculations can be implemented as plugins following a well-defined API. The plugin modules are loosely coupled to the core KMCLib program via the Python scripting language. KMCLib is written as a Python module with a backend C++ library. After initial compilation of the backend library KMCLib is used as a Python module; input to the program is given as a Python script executed using a standard Python interpreter. We give a detailed description of the features and implementation of the code and demonstrate its scaling behavior and parallel performance with a simple one-dimensional A-B-C lattice KMC model and a more complex three-dimensional lattice KMC model of oxygen-vacancy diffusion in a fluorite structured metal oxide. KMCLib can keep track of individual particle movements and includes tools for mean square displacement analysis, and is therefore particularly well suited for studying diffusion processes at surfaces and in solids. Catalogue identifier: AESZ_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AESZ_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU General Public License, version 3 No. of lines in distributed program, including test data, etc.: 49 064 No. of bytes in distributed program, including test data, etc.: 1 575 172 Distribution format: tar.gz Programming language: Python and C++. Computer: Any computer that can run a C++ compiler and a Python interpreter. Operating system: Tested on Ubuntu 12

  20. Towards a framework for teaching about information technology risk in health care: Simulating threats to health data and patient safety

    Directory of Open Access Journals (Sweden)

    Elizabeth M. Borycki

    2015-09-01

    Full Text Available In this paper the author describes work towards developing an integrative framework for educating health information technology professionals about technology risk. The framework considers multiple sources of risk to health data quality and integrity that can result from the use of health information technology (HIT) and can be used to teach health professional students about these risks when using health technologies. The framework encompasses issues and problems that may arise from varied sources, including intentional alterations (e.g. those resulting from hacking and security breaches) as well as unintentional breaches and corruption of data (e.g. those resulting from technical problems or from technology-induced errors). The framework described has several levels: the level of human factors and usability of HIT; the level of monitoring of security and accuracy; the HIT architectural level; the level of operational and physical checks; the level of healthcare quality assurance policies; and the data risk management strategies level. Approaches to monitoring and simulation of risk are also discussed, including an innovative approach to monitoring potential quality issues. This is followed by a discussion of the application of computer simulations to educate both students and health information technology professionals about the impact and spread of technology-induced and related types of data errors involving HIT.

  1. Environmental Survey preliminary report, Argonne National Laboratory, Argonne, Illinois

    Energy Technology Data Exchange (ETDEWEB)

    1988-11-01

    This report presents the preliminary findings of the first phase of the Environmental Survey of the United States Department of Energy's (DOE) Argonne National Laboratory (ANL), conducted June 15 through 26, 1987. The Survey is being conducted by an interdisciplinary team of environmental specialists, led and managed by the Office of Environment, Safety and Health's Office of Environmental Audit. The team includes outside experts supplied by a private contractor. The objective of the Survey is to identify environmental problems and areas of environmental risk associated with ANL. The Survey covers all environmental media and all areas of environmental regulation. It is being performed in accordance with the DOE Environmental Survey Manual. The on-site phase of the Survey involves the review of existing site environmental data, observations of the operations carried on at ANL, and interviews with site personnel. The Survey team developed a Sampling and Analysis (S&A) Plan to assist in further assessing certain of the environmental problems identified during its on-site activities. The S&A Plan will be executed by the Oak Ridge National Laboratory (ORNL). When completed, the S&A results will be incorporated into the Argonne National Laboratory Environmental Survey findings for inclusion in the Environmental Survey Summary Report. 75 refs., 24 figs., 60 tabs.

  2. Environmental Survey preliminary report, Argonne National Laboratory, Argonne, Illinois

    International Nuclear Information System (INIS)

    This report presents the preliminary findings of the first phase of the Environmental Survey of the United States Department of Energy's (DOE) Argonne National Laboratory (ANL), conducted June 15 through 26, 1987. The Survey is being conducted by an interdisciplinary team of environmental specialists, led and managed by the Office of Environment, Safety and Health's Office of Environmental Audit. The team includes outside experts supplied by a private contractor. The objective of the Survey is to identify environmental problems and areas of environmental risk associated with ANL. The Survey covers all environmental media and all areas of environmental regulation. It is being performed in accordance with the DOE Environmental Survey Manual. The on-site phase of the Survey involves the review of existing site environmental data, observations of the operations carried on at ANL, and interviews with site personnel. The Survey team developed a Sampling and Analysis (S&A) Plan to assist in further assessing certain of the environmental problems identified during its on-site activities. The S&A Plan will be executed by the Oak Ridge National Laboratory (ORNL). When completed, the S&A results will be incorporated into the Argonne National Laboratory Environmental Survey findings for inclusion in the Environmental Survey Summary Report. 75 refs., 24 figs., 60 tabs.

  3. Proposed environmental remediation at Argonne National Laboratory, Argonne, Illinois

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-05-01

    The Department of Energy (DOE) has prepared an Environmental Assessment evaluating proposed environmental remediation activity at Argonne National Laboratory-East (ANL-E), Argonne, Illinois. The environmental remediation work would (1) reduce, eliminate, or prevent the release of contaminants from a number of Resource Conservation and Recovery Act (RCRA) Solid Waste Management Units (SWMUs) and two radiologically contaminated sites located in areas contiguous with SWMUs, and (2) decrease the potential for exposure of the public, ANL-E employees, and wildlife to such contaminants. The actions proposed for SWMUs are required to comply with the RCRA corrective action process and corrective action requirements of the Illinois Environmental Protection Agency; the actions proposed are also required to reduce the potential for continued contaminant release. Based on the analysis in the EA, the DOE has determined that the proposed action does not constitute a major federal action significantly affecting the quality of the human environment within the meaning of the National Environmental Policy Act of 1969 (NEPA). Therefore, the preparation of an Environmental Impact Statement is not required.

  4. Proposed environmental remediation at Argonne National Laboratory, Argonne, Illinois

    International Nuclear Information System (INIS)

    The Department of Energy (DOE) has prepared an Environmental Assessment evaluating proposed environmental remediation activity at Argonne National Laboratory-East (ANL-E), Argonne, Illinois. The environmental remediation work would (1) reduce, eliminate, or prevent the release of contaminants from a number of Resource Conservation and Recovery Act (RCRA) Solid Waste Management Units (SWMUs) and two radiologically contaminated sites located in areas contiguous with SWMUs, and (2) decrease the potential for exposure of the public, ANL-E employees, and wildlife to such contaminants. The actions proposed for SWMUs are required to comply with the RCRA corrective action process and corrective action requirements of the Illinois Environmental Protection Agency; the actions proposed are also required to reduce the potential for continued contaminant release. Based on the analysis in the EA, the DOE has determined that the proposed action does not constitute a major federal action significantly affecting the quality of the human environment within the meaning of the National Environmental Policy Act of 1969 (NEPA). Therefore, the preparation of an Environmental Impact Statement is not required

  5. Argonne National Laboratory 1985 publications

    Energy Technology Data Exchange (ETDEWEB)

    Kopta, J.A. (ED.); Hale, M.R. (comp.)

    1987-08-01

    This report is a bibliography of scientific and technical 1985 publications of Argonne National Laboratory. Some are ANL contributions to outside organizations' reports published in 1985. This compilation, prepared by the Technical Information Services Technical Publications Section (TPS), lists all nonrestricted 1985 publications submitted to TPS by the Laboratory's Divisions. The report is divided into seven parts: Journal Articles -- Listed by first author; ANL Reports -- Listed by report number; ANL and non-ANL Unnumbered Reports -- Listed by report number; Non-ANL Numbered Reports -- Listed by report number; Books and Book Chapters -- Listed by first author; Conference Papers -- Listed by first author; and Complete Author Index.

  6. Argonne National Laboratory 1985 publications

    International Nuclear Information System (INIS)

    This report is a bibliography of scientific and technical 1985 publications of Argonne National Laboratory. Some are ANL contributions to outside organizations' reports published in 1985. This compilation, prepared by the Technical Information Services Technical Publications Section (TPS), lists all nonrestricted 1985 publications submitted to TPS by the Laboratory's Divisions. The report is divided into seven parts: Journal Articles -- Listed by first author; ANL Reports -- Listed by report number; ANL and non-ANL Unnumbered Reports -- Listed by report number; Non-ANL Numbered Reports -- Listed by report number; Books and Book Chapters -- Listed by first author; Conference Papers -- Listed by first author; and Complete Author Index.

  7. A simulation framework for auditory discrimination experiments: Revealing the importance of across-frequency processing in speech perception.

    Science.gov (United States)

    Schädler, Marc René; Warzybok, Anna; Ewert, Stephan D; Kollmeier, Birger

    2016-05-01

    A framework for simulating auditory discrimination experiments, based on an approach from Schädler, Warzybok, Hochmuth, and Kollmeier [(2015). Int. J. Audiol. 54, 100-107] which was originally designed to predict speech recognition thresholds, is extended to also predict psychoacoustic thresholds. The proposed framework is used to assess the suitability of different auditory-inspired feature sets for a range of auditory discrimination experiments that included psychoacoustic as well as speech recognition experiments in noise. The considered experiments were 2 kHz tone-in-broadband-noise simultaneous masking depending on the tone length, spectral masking with simultaneously presented tone signals and narrow-band noise maskers, and German Matrix sentence test reception threshold in stationary and modulated noise. The employed feature sets included spectro-temporal Gabor filter bank features, Mel-frequency cepstral coefficients, logarithmically scaled Mel-spectrograms, and the internal representation of the Perception Model from Dau, Kollmeier, and Kohlrausch [(1997). J. Acoust. Soc. Am. 102(5), 2892-2905]. The proposed framework was successfully employed to simulate all experiments with a common parameter set and obtain objective thresholds with fewer assumptions compared to traditional modeling approaches. Depending on the feature set, the simulated reference-free thresholds were found to agree with, and hence to predict, empirical data from the literature. Across-frequency processing was found to be crucial for accurately modeling the lower speech reception threshold in modulated noise conditions compared to stationary noise conditions. PMID:27250164
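
    As an illustration of one of the feature sets named above, the short sketch below computes a logarithmically scaled Mel-spectrogram from an audio signal. It uses the librosa library rather than the authors' simulation framework, and the file name and parameters are illustrative only.

        import librosa

        # Load a mono signal; any speech or tone-in-noise stimulus would do here.
        signal, sr = librosa.load("stimulus.wav", sr=16000)

        # Mel-spectrogram followed by logarithmic compression, one of the
        # auditory-inspired representations discussed in the abstract.
        mel = librosa.feature.melspectrogram(y=signal, sr=sr, n_fft=512,
                                             hop_length=160, n_mels=40)
        log_mel = librosa.power_to_db(mel)
        print(log_mel.shape)   # (n_mels, n_frames)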

  8. High Performance Hybrid RANS-LES Simulation Framework for Turbulent Combusting Flows Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation proposed here is a computational framework for high performance, high fidelity computational fluid dynamics (CFD) to enable accurate, fast and robust...

  9. Advanced Simulation Framework for Design and Analysis of Space Propulsion Systems Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation proposed here is a computational framework for high performance, high fidelity computational fluid dynamics (CFD) to enable accurate, fast and robust...

  10. Advanced Simulation Framework for Design and Analysis of Space Propulsion Systems Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation proposed here is a high-performance, high-fidelity framework in the computational fluid dynamics (CFD) code called Loci-STREAM to enable accurate,...

  11. Framework for real-time forest fire animation: Simulating fire spread using the GPU

    OpenAIRE

    Kjærnet, Øystein

    2010-01-01

    In 2009, Odd Erik Gundersen and Jo Skjermo described a conceptual framework for animating physically based forest fires. This project expands on their ideas, focusing on how modern graphics hardware can be utilized to achieve real-time performance. A prototype demonstrating some of the concepts suggested for the framework has been implemented and tested, successfully achieving real-time frame rates on a simple animation of a burning tree.

  12. A General Simulation Framework for Supply Chain Modeling: State of the Art and Case Study

    Directory of Open Access Journals (Sweden)

    Antonio Cimino

    2010-03-01

    Nowadays there is a large availability of discrete event simulation software that can be easily used in different domains: from industry to supply chain, from healthcare to business management, from training to complex systems design. Simulation engines of commercial discrete event simulation software use specific rules and logics for simulation time and events management. Difficulties and limitations come up when commercial discrete event simulation software is used for modeling complex real-world systems (i.e., supply chains, industrial plants). The objective of this paper is twofold: first, a state of the art on commercial discrete event simulation software and an overview of discrete event simulation model development using general purpose programming languages are presented; then a Supply Chain Order Performance Simulator (SCOPS, developed in C++) for investigating the inventory management problem along the supply chain under different supply chain scenarios is proposed.
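
    To make concrete what developing a discrete event simulation model in a general purpose programming language involves, here is a minimal event-loop sketch (in Python rather than the C++ used for SCOPS): a priority queue orders events by timestamp and the simulation clock jumps from event to event. The toy replenishment event is invented for illustration.

        import heapq

        class Simulator:
            """Bare-bones discrete event engine: a clock plus a time-ordered queue."""
            def __init__(self):
                self.now = 0.0
                self._queue = []
                self._counter = 0   # tie-breaker so equal-time events stay ordered

            def schedule(self, delay, action):
                self._counter += 1
                heapq.heappush(self._queue, (self.now + delay, self._counter, action))

            def run(self, until):
                while self._queue and self._queue[0][0] <= until:
                    self.now, _, action = heapq.heappop(self._queue)
                    action(self)

        # Toy supply-chain event: a replenishment order that re-schedules itself.
        def place_order(sim):
            print(f"t={sim.now:5.1f}: order placed")
            sim.schedule(7.0, place_order)   # weekly review period (illustrative)

        sim = Simulator()
        sim.schedule(0.0, place_order)
        sim.run(until=30.0)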

  13. A General Simulation Framework for Supply Chain Modeling: State of the Art and Case Study

    CERN Document Server

    Cimino, Antonio; Mirabelli, Giovanni

    2010-01-01

    Nowadays there is a large availability of discrete event simulation software that can be easily used in different domains: from industry to supply chain, from healthcare to business management, from training to complex systems design. Simulation engines of commercial discrete event simulation software use specific rules and logics for simulation time and events management. Difficulties and limitations come up when commercial discrete event simulation software is used for modeling complex real-world systems (i.e., supply chains, industrial plants). The objective of this paper is twofold: first, a state of the art on commercial discrete event simulation software and an overview of discrete event simulation model development using general purpose programming languages are presented; then a Supply Chain Order Performance Simulator (SCOPS, developed in C++) for investigating the inventory management problem along the supply chain under different supply chain scenarios is proposed.

  14. NEVESIM: Event-Driven Neural Simulation Framework with a Python Interface

    OpenAIRE

    Pecevski, Dejan; Kappel, David; Jonke, Zeno

    2014-01-01

    NEVESIM is a software package for event-driven simulation of networks of spiking neurons with a fast simulation core in C++, and a scripting user interface in the Python programming language. It supports simulation of heterogeneous networks with different types of neurons and synapses, and can be easily extended by the user with new neuron and synapse types. To enable heterogeneous networks and extensibility, NEVESIM is designed to decouple the simulation logic of communicating events (spikes...
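
    The event-driven style described above can be suggested in a few lines of Python; NEVESIM itself exposes a different, richer API, so the structures below are invented purely to illustrate the idea that spikes are processed as timestamped events rather than on a fixed time grid.

        import heapq

        # Illustrative event-driven spiking network: each spike is an event
        # (time, neuron); delivery to targets creates future events.
        targets = {0: [(1, 2.0)], 1: [(2, 1.5)], 2: []}   # neuron -> [(target, delay)]
        events = [(0.0, 0)]                               # seed spike at t=0 in neuron 0

        while events:
            t, neuron = heapq.heappop(events)
            print(f"spike: neuron {neuron} at t={t}")
            for target, delay in targets[neuron]:
                # A real simulator would update target state and test a threshold;
                # here every delivered spike simply triggers the target neuron.
                heapq.heappush(events, (t + delay, target))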

  15. NEVESIM: event-driven neural simulation framework with a Python interface

    OpenAIRE

    Pecevski, Dejan; Kappel, David; Jonke, Zeno

    2014-01-01

    NEVESIM is a software package for event-driven simulation of networks of spiking neurons with a fast simulation core in C++, and a scripting user interface in the Python programming language. It supports simulation of heterogeneous networks with different types of neurons and synapses, and can be easily extended by the user with new neuron and synapse types. To enable heterogeneous networks and extensibility, NEVESIM is designed to decouple the simulation logic of communicating events (spikes...

  16. The Trick Simulation Toolkit: A NASA/Open source Framework for Running Time Based Physics Models

    Science.gov (United States)

    Penn, John M.; Lin, Alexander S.

    2016-01-01

    This paper describes the design and use of the Trick Simulation Toolkit, a simulation development environment for creating high fidelity training and engineering simulations at the NASA Johnson Space Center and many other NASA facilities. It describes Trick's design goals and how the development environment attempts to achieve those goals. It describes how Trick is used in some of the many training and engineering simulations at NASA. Finally, it describes the Trick NASA/Open source project on GitHub.

  17. Run-Time Interoperability Between Neuronal Network Simulators Based on the MUSIC Framework

    OpenAIRE

    Djurfeldt, Mikael; Hjorth, Johannes; Eppler, Jochen M; Dudani, Niraj; Helias, Moritz; Potjans, Tobias C.; Bhalla, Upinder S; Diesmann, Markus; Hellgren Kotaleski, Jeanette; Ekeberg, Örjan

    2010-01-01

    MUSIC is a standard API allowing large scale neuron simulators to exchange data within a parallel computer during runtime. A pilot implementation of this API has been released as open source. We provide experiences from the implementation of MUSIC interfaces for two neuronal network simulators of different kinds, NEST and MOOSE. A multi-simulation of a cortico-striatal network model involving both simulators is performed, demonstrating how MUSIC can promote inter-operability between models wr...

  18. The Trick Simulation Toolkit: A NASA/Opensource Framework for Running Time Based Physics Models

    Science.gov (United States)

    Penn, John M.

    2016-01-01

    The Trick Simulation Toolkit is a simulation development environment used to create high fidelity training and engineering simulations at the NASA Johnson Space Center and many other NASA facilities. Its purpose is to generate a simulation executable from a collection of user-supplied models and a simulation definition file. For each Trick-based simulation, Trick automatically provides job scheduling, numerical integration, the ability to write and restore human readable checkpoints, data recording, interactive variable manipulation, a run-time interpreter, and many other commonly needed capabilities. This allows simulation developers to concentrate on their domain expertise and the algorithms and equations of their models. Also included in Trick are tools for plotting recorded data and various other supporting utilities and libraries. Trick is written in C/C++ and Java and supports both Linux and MacOSX computer operating systems. This paper describes Trick's design and use at NASA Johnson Space Center.
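
    The scheduling-plus-integration pattern that Trick automates can be suggested with a small conceptual sketch; Trick's actual input is a simulation definition file processed into a C/C++ executable, so nothing below is its real API. Periodic jobs run at their own rates while a state vector is numerically integrated.

        # Conceptual sketch of rate-scheduled jobs around numerical integration
        # (in the spirit of Trick's job scheduling, not its actual interface).
        dt = 0.01                        # integration step [s]
        state = {"x": 0.0, "v": 0.0}     # falling body: position and velocity

        def derivatives(s):
            return {"x": s["v"], "v": -9.81}

        def log_job(t, s):               # a 10 Hz scheduled job
            print(f"t={t:4.2f}s  x={s['x']:8.4f}m")

        for step in range(101):
            t = step * dt
            if step % 10 == 0:           # the "scheduler" fires this job at 10 Hz
                log_job(t, state)
            d = derivatives(state)
            state = {k: state[k] + dt * d[k] for k in state}   # explicit Euler step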

  19. Design of a Model Execution Framework: Repetitive Object-Oriented Simulation Environment (ROSE)

    Science.gov (United States)

    Gray, Justin S.; Briggs, Jeffery L.

    2008-01-01

    The ROSE framework was designed to facilitate complex system analyses. It completely divorces the model execution process from the model itself. By doing so, ROSE frees the modeler to develop a library of standard modeling processes, such as Design of Experiments, optimizers, parameter studies, and sensitivity studies, which can then be applied to any of their available models. The ROSE framework accomplishes this by means of a well-defined API and object structure. Both the API and object structure are presented here with enough detail to implement ROSE in any object-oriented language or modeling tool.
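
    The separation described above, a model that knows nothing about how it is driven plus reusable execution processes, can be sketched as follows. The names and the trivial model are invented for illustration; ROSE defines its own API and object structure.

        # Sketch of a ROSE-like split between model and execution process.
        class Model:
            """Anything with set_inputs/execute/get_outputs can be driven."""
            def set_inputs(self, inputs):
                self.inputs = inputs
            def execute(self):
                self.output = self.inputs["a"] * 2 + self.inputs["b"]
            def get_outputs(self):
                return self.output

        def parameter_study(model, sweeps):
            """Reusable process: run any conforming model over a list of inputs."""
            results = []
            for inputs in sweeps:
                model.set_inputs(inputs)
                model.execute()
                results.append((inputs, model.get_outputs()))
            return results

        print(parameter_study(Model(), [{"a": a, "b": 1.0} for a in range(3)]))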

  20. Moose: An Open-Source Framework to Enable Rapid Development of Collaborative, Multi-Scale, Multi-Physics Simulation Tools

    Science.gov (United States)

    Slaughter, A. E.; Permann, C.; Peterson, J. W.; Gaston, D.; Andrs, D.; Miller, J.

    2014-12-01

    The Idaho National Laboratory (INL)-developed Multiphysics Object Oriented Simulation Environment (MOOSE; www.mooseframework.org) is an open-source, parallel computational framework for enabling the solution of complex, fully implicit multiphysics systems. MOOSE provides a set of computational tools that scientists and engineers can use to create sophisticated multiphysics simulations. Applications built using MOOSE have computed solutions for chemical reaction and transport equations, computational fluid dynamics, solid mechanics, heat conduction, mesoscale materials modeling, geomechanics, and others. To facilitate the coupling of diverse and highly coupled physical systems, MOOSE employs the Jacobian-free Newton-Krylov (JFNK) method when solving the coupled nonlinear systems of equations arising in multiphysics applications. The MOOSE framework is written in C++ and leverages other high-quality, open-source scientific software packages such as LibMesh, Hypre, and PETSc. MOOSE uses a "hybrid parallel" model which combines both shared memory (thread-based) and distributed memory (MPI-based) parallelism to ensure efficient resource utilization on a wide range of computational hardware. MOOSE-based applications are inherently modular, which allows for simulation expansion (via coupling of additional physics modules) and the creation of multi-scale simulations. Any application developed with MOOSE supports running (in parallel) any other MOOSE-based application. Each application can be developed independently, yet easily communicate with other applications (e.g., conductivity in a slope-scale model could be a constant input, or a complete phase-field micro-structure simulation) without additional code being written. This method of development has proven effective at INL and expedites the development of sophisticated, sustainable, and collaborative simulation tools.
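
    As a small illustration of the Jacobian-free Newton-Krylov approach mentioned above (using SciPy, not MOOSE, which is a C++ framework), the solver below finds the root of a coupled nonlinear system without ever forming the Jacobian explicitly; Jacobian-vector products are approximated internally from residual evaluations.

        import numpy as np
        from scipy.optimize import newton_krylov

        # Coupled nonlinear residuals F(u) = 0; JFNK only needs F evaluations.
        def residual(u):
            x, y = u
            return np.array([x**2 + y - 3.0,
                             x + y**2 - 5.0])

        solution = newton_krylov(residual, np.array([1.0, 1.0]))
        print(solution, residual(solution))   # converges to the root (1, 2)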

  1. Atomistic simulation studies on the dynamics and thermodynamics of nonpolar molecules within the zeolite imidazolate framework-8.

    Science.gov (United States)

    Pantatosaki, Evangelia; Pazzona, Federico G; Megariotis, Gregory; Papadopoulos, George K

    2010-02-25

    Statistical-mechanics-based simulation studies at the atomistic level of argon (Ar), methane (CH4), and hydrogen (H2) sorbed in the zeolite imidazolate framework-8 (ZIF-8) are reported. ZIF-8 is a product of a special kind of chemical process, recently termed reticular synthesis, which has generated a class of materials of critical importance as molecular binders. In this work, we explore the mechanisms that govern the sorption thermodynamics and kinetics of nonpolar sorbates possessing different sizes and strengths of interaction with the metal-organic framework, to understand the outstanding properties of this novel class of sorbents as revealed by experiments published elsewhere. For this purpose, we have developed an in-house modeling procedure involving calculations of sorption isotherms, partial internal energies, various probability density functions, and molecular dynamics for the simulation of the sorbed phase over a wide range of occupancies and temperatures within a digitally reconstructed unit cell of ZIF-8. The results showed that sorbates perceive a marked energetic inhomogeneity within the atomic framework of the metal-organic material under study, resulting in free energy barriers that give rise to inflections in the sorption isotherms and guide the dynamics of guest molecules.
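
    One routine piece of the kind of kinetic analysis described is the mean square displacement of guest molecules over a trajectory. A plain NumPy sketch follows, with a synthetic random walk standing in for the output of any MD engine; it is not the authors' in-house code.

        import numpy as np

        # traj: (n_frames, n_molecules, 3) unwrapped guest coordinates from MD.
        rng = np.random.default_rng(0)
        traj = np.cumsum(rng.normal(size=(1000, 50, 3)), axis=0)   # stand-in walk

        def mean_square_displacement(traj, max_lag):
            msd = np.empty(max_lag)
            for lag in range(1, max_lag + 1):
                disp = traj[lag:] - traj[:-lag]            # displacements at this lag
                msd[lag - 1] = np.mean(np.sum(disp**2, axis=-1))
            return msd

        msd = mean_square_displacement(traj, max_lag=100)
        # For normal diffusion, msd ~ 6 D t, so the slope estimates the diffusivity.
        print(msd[:5])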

  2. A new numerical framework for simulating the control of weather and climate on the evolution of soil-mantled hillslopes

    Science.gov (United States)

    Bovy, Benoît; Braun, Jean; Demoulin, Alain

    2016-06-01

    We present a new numerical framework for simulating short- to long-term hillslope evolution. This modeling framework, to which we have given the name CLICHE (CLImate Control on Hillslope Evolution), aims to better capture the control of climate on soil dynamics. It allows the use of realistic forcing that involves, through a specific time discretization scheme, the variability of both temperature and precipitation at time scales ranging from daily rainfall events to the climatic oscillations of the Quaternary, including seasonal variability. Two simple models of soil temperature and soil water balance provide the link between the climatic inputs and derived quantities that take part in the computation of the soil flux, such as the surface water discharge and the depth of the non-frozen soil layer. Using this framework together with a multi-process parameterization of soil transport, we apply an original method to calculate hillslope effective diffusivity as a function of climate. This allows us to demonstrate the ability of the model to simulate observed rates of hillslope erosion under different climates (cold and temperate) with a single set of parameter values. Numerical experiments furthermore suggest a potential high peak of sediment transport on hillslopes during the glacial-interglacial transitions of the Quaternary. We finally discuss the need to improve the parameterization of the soil production and transport processes in order to explicitly account for other key controlling factors that are also climate-sensitive, such as biological activity.
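
    In its simplest form, soil flux on a hillslope driven by an effective diffusivity reduces to a diffusion equation for elevation. The one-dimensional explicit finite-difference step below conveys that core; the constant diffusivity is a placeholder, whereas the model described above derives it from temperature and precipitation forcing.

        import numpy as np

        # Explicit step of dz/dt = d/dx( D dz/dx ) on a 1D hillslope profile.
        nx, dx, dt = 101, 1.0, 0.1
        z = np.linspace(10.0, 0.0, nx)           # initial linear slope [m]
        D = 0.01 * np.ones(nx - 1)               # effective diffusivity [m^2/yr];
                                                 # climate-dependent in the real model
        for _ in range(1000):
            flux = -D * np.diff(z) / dx          # soil flux at cell interfaces
            z[1:-1] -= dt * np.diff(flux) / dx   # flux divergence updates interior
            # fixed-elevation boundaries: crest z[0] and base z[-1] held constant

        print(z[:5])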

  3. Availability-based simulation and optimization modeling framework for open-pit mine truck allocation under dynamic constraints

    Institute of Scientific and Technical Information of China (English)

    Mena Rodrigo; Zio Enrico; Kristjanpoller Fredy; Arata Adolfo

    2013-01-01

    We present a novel system productivity simulation and optimization modeling framework in which equipment availability is a variable in the expected productivity function of the system. The framework is used for allocating trucks by route according to their operating performance in a truck-shovel system of an open-pit mine, so as to maximize the overall productivity of the fleet. We implement the framework in an originally designed and specifically developed simulator-optimizer software tool, and apply it to a real open-pit mine case study, taking into account the stochasticity of the equipment behavior and environment. The total system production values obtained with and without considering the equipment reliability, availability and maintainability (RAM) characteristics are compared. We show that by taking into account the truck and shovel RAM aspects, we can maximize the total production of the system and obtain specific information on the production availability and productivity of its components.
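
    The central idea, making expected productivity an explicit function of availability, can be written down in a few lines. The sketch below uses invented numbers and a trivial comparison of two allocation plans; it is not the paper's simulator-optimizer.

        # Illustrative only: expected route production weighted by truck availability.
        availability = {"T1": 0.92, "T2": 0.80, "T3": 0.67}
        route_rate = {"R1": 420.0, "R2": 355.0}   # tonnes/shift if fully available

        def expected_production(assignment):
            """assignment: truck -> route; availability scales nominal output."""
            return sum(route_rate[route] * availability[truck]
                       for truck, route in assignment.items())

        plan_a = {"T1": "R1", "T2": "R2", "T3": "R2"}
        plan_b = {"T1": "R2", "T2": "R1", "T3": "R1"}
        print(expected_production(plan_a), expected_production(plan_b))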

  4. A Generalized Framework for Different Drought Indices: Testing its Suitability in a Simulation of the last two Millennia for Europe

    Science.gov (United States)

    Raible, Christoph C.; Baerenbold, Oliver; Gomez-Navarro, Juan Jose

    2016-04-01

    Over the past decades, different drought indices have been suggested in the literature. This study tackles the problem of how to characterize drought by defining a general framework and proposing a generalized family of drought indices that is flexible regarding the use of different water balance models. The sensitivity of various indices and their skill in representing drought conditions are evaluated using a regional model simulation of Europe spanning the last two millennia as a test bed. The framework combines an exponentially damped memory with a normalization method based on quantile mapping. Both approaches are more robust and physically meaningful than the existing methods used to define drought indices. Still, the framework is flexible with respect to the water balance, enabling users to adapt the index formulation to the data availability of different locations. Based on the framework, indices with water balances of different complexity are compared with each other. The comparison shows that a drought index considering only precipitation in the water balance is sufficient for Western to Central Europe. However, in the Mediterranean, temperature effects via evapotranspiration need to be considered in order to produce meaningful indices representative of actual water deficit. Similarly, our results indicate that in north-eastern Europe and Scandinavia, snow and runoff effects need to be considered in the index definition to obtain accurate results.
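
    The two ingredients named above can be sketched directly: an exponentially damped memory of a monthly water balance, followed by normalization of the accumulated series. The sketch below uses a rank-based normal-score transform as a stand-in for the paper's quantile mapping; the synthetic balance series and parameters are illustrative.

        import numpy as np
        from scipy.stats import norm, rankdata

        rng = np.random.default_rng(1)
        balance = rng.gamma(2.0, 30.0, size=600) - 55.0   # monthly P - E stand-in

        # Exponentially damped memory: recent months weigh most, old ones fade out.
        alpha = 0.8
        memory = np.empty_like(balance)
        memory[0] = balance[0]
        for i in range(1, len(balance)):
            memory[i] = alpha * memory[i - 1] + balance[i]

        # Normalize to a standard-normal index (rank-based stand-in for the
        # quantile mapping used in the paper).
        ranks = rankdata(memory) / (len(memory) + 1.0)
        index = norm.ppf(ranks)
        print(index[-12:])   # last year of drought-index values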

  5. Constrained multi-global optimization using a penalty stretched simulated annealing framework

    OpenAIRE

    Pereira, Ana I.; Edite M.G.P. Fernandes

    2009-01-01

    This paper presents a new simulated annealing algorithm to solve constrained multi-global optimization problems. To compute all global solutions in a sequential manner, we combine the function stretching technique with the adaptive simulated annealing variant. Constraint-handling is carried out through a nondifferentiable penalty function. To benchmark our penalty stretched simulated annealing algorithm we solve a set of well-known problems. Our preliminary numerical results show that the alg...
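
    A stripped-down version of the combination described, simulated annealing with a nondifferentiable penalty for constraint handling, is sketched below; the function stretching step used to locate multiple global minima sequentially is omitted for brevity, and the toy problem and cooling parameters are illustrative.

        import math, random

        random.seed(0)

        def objective(x):                 # minimize f subject to g(x) <= 0
            return (x - 1.0) ** 2

        def penalized(x, mu=100.0):
            g = 0.5 - x                   # constraint: x >= 0.5
            return objective(x) + mu * max(0.0, g)   # nondifferentiable penalty

        x = best = 3.0
        T = 1.0
        while T > 1e-3:
            cand = x + random.uniform(-0.5, 0.5)
            delta = penalized(cand) - penalized(x)
            if delta < 0 or random.random() < math.exp(-delta / T):
                x = cand                                  # Metropolis acceptance
                if penalized(x) < penalized(best):
                    best = x
            T *= 0.995                                    # geometric cooling schedule
        print(best, objective(best))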

  6. Experimental spectra analysis in THM with the help of simulation based on Geant4 framework

    CERN Document Server

    Li, Chengbo; Zhou, Shuhua; Fu, Yuanyong; Zhou, Jing; Meng, Qiuying; Jiang, Zongjun; Wang, Xiaolian

    2014-01-01

    The Coulomb barrier and electron screening cause difficulties in directly measuring nuclear reaction cross sections of charged particles at astrophysical energies. The Trojan-horse method has been introduced as a powerful indirect tool to overcome these difficulties. In order to understand experimental spectra better, Geant4 is employed to simulate the method for the first time. The validity and reliability of the simulation are examined by comparing the experimental data with simulated results. The Geant4 simulation can give useful information for understanding experimental spectra in data analysis and is beneficial to the design of future related experiments.

  7. Field-wide flow simulation in fractured porous media within lattice Boltzmann framework

    Science.gov (United States)

    Benamram, Z.; Tarakanov, A.; Nasrabadi, H.; Gildin, E.

    2016-10-01

    In this paper, a generalized lattice Boltzmann model for simulating fluid flow in porous media at the representative volume element scale is extended towards applications in hydraulically and naturally fractured reservoirs. The key element within the model is the development of boundary conditions for a vertical well and horizontal fracture with minimal node usage. In addition, the governing non-dimensional equations are derived and a new set of dimensionless numbers is presented for the simulation of a fractured reservoir system. Homogeneous and heterogeneous vertical well and fracture systems are simulated and verified against commercial reservoir simulation suites. Results are in excellent agreement with analytical and finite difference solutions.

  8. MASADA: A Modeling and Simulation Automated Data Analysis framework for continuous data-intensive validation of simulation models

    CERN Document Server

    Foguelman, Daniel Jacob; The ATLAS collaboration

    2016-01-01

    Complex networked computer systems are usually subjected to upgrades and enhancements on a continuous basis. Modeling and simulation of such systems helps with guiding their engineering processes, in particular when testing candidate design alternatives directly on the real system is not an option. Models are built and simulation exercises are run guided by specific research and/or design questions. A vast amount of operational conditions for the real system need to be assumed in order to focus on the relevant questions at hand. A typical boundary condition for computer systems is the exogenously imposed workload. Meanwhile, in typical projects huge amounts of monitoring information are logged and stored with the purpose of studying the system’s performance in search for improvements. Also, research questions change as systems’ operational conditions vary throughout their lifetime. This context poses many challenges to determine the validity of simulation models. As the behavioral empirical base of the sys...

  9. An innovative strategy in evaluation: using a student engagement framework to evaluate a role-based simulation.

    Science.gov (United States)

    Smith, Morgan; Warland, Jane; Smith, Colleen

    2012-03-01

    Online role-play has the potential to actively engage students in authentic learning experiences and help develop their clinical reasoning skills. However, evaluation of student learning for this kind of simulation focuses mainly on the content and outcome of learning, rather than on the process of learning through student engagement. This article reports on the use of a student engagement framework to evaluate an online role-play offered as part of a course in Bachelor of Nursing and Bachelor of Midwifery programs. Instruments that measure student engagement to date have targeted large numbers of students at program and institutional levels, rather than at the level of a specific learning activity. Although the framework produced some useful findings for evaluation purposes, further refinement of the questions is required to be certain that deep learning results from the engagement that occurs with course-level learning initiatives.

  10. A Modeling Framework for Supply Chain Simulation: Opportunities for Improved Decision Making

    NARCIS (Netherlands)

    Zee, van der D.J.; Vorst, van der J.G.A.J.

    2005-01-01

    Owing to its inherent modeling flexibility, simulation is often regarded as the proper means for supporting decision making on supply chain design. The ultimate success of supply chain simulation, however, is determined by a combination of the analyst's skills, the chain members' involvement, and th

  11. Push technology at Argonne National Laboratory.

    Energy Technology Data Exchange (ETDEWEB)

    Noel, R. E.; Woell, Y. N.

    1999-04-06

    Selective dissemination of information (SDI) services, also referred to as current awareness searches, are usually provided by periodically running computer programs (personal profiles) against a cumulative database or databases. This concept of pushing relevant content to users has long been integral to librarianship. Librarians traditionally turned to information companies to implement these searches for their users in business, academia, and the science community. This paper describes how a push technology was implemented on a large scale for scientists and engineers at Argonne National Laboratory, explains some of the challenges to designers/maintainers, and identifies the positive effects that SDI seems to be having on users. Argonne purchases the Institute for Scientific Information (ISI) Current Contents data (all subject areas except Humanities), and scientists no longer need to turn to outside companies for reliable SDI service. Argonne's database and its customized services are known as ACCESS (Argonne-University of Chicago Current Contents Electronic Search Service).
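
    Functionally, an SDI run is a set of stored queries matched against each new batch of records. The toy sketch below conveys that matching step; profiles, addresses, and records are invented for illustration and have nothing to do with the ACCESS implementation.

        # Toy SDI matching: stored profiles run against newly arrived records.
        profiles = {
            "alice@anl.gov": {"neutron", "scattering"},
            "bob@anl.gov":   {"catalysis"},
        }

        new_records = [
            {"title": "Neutron scattering study of oxide films"},
            {"title": "Zeolite catalysis under mild conditions"},
        ]

        for email, terms in profiles.items():
            hits = [r["title"] for r in new_records
                    if terms & set(r["title"].lower().split())]
            if hits:
                print(f"push to {email}: {hits}")   # a real service would e-mail these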

  12. Argonne National Laboratory 1986 publications

    Energy Technology Data Exchange (ETDEWEB)

    Kopta, J.A.; Springer, C.J.

    1987-12-01

    This report is a bibliography of scientific and technical 1986 publications of Argonne National Laboratory. Some are ANL contributions to outside organizations' reports published in 1986. This compilation, prepared by the Technical Information Services Technical Publications Section (TPS), lists all nonrestricted 1986 publications submitted to TPS by the Laboratory's Divisions. Author indexes list ANL authors only. If a first author is not an ANL employee, an asterisk in the bibliographic citation indicates the first ANL author. The report is divided into seven parts: Journal Articles -- Listed by first author; ANL Reports -- Listed by report number; ANL and non-ANL Unnumbered Reports -- Listed by report number; Non-ANL Numbered Reports -- Listed by report number; Books and Book Chapters -- Listed by first author; Conference Papers -- Listed by first author; and Complete Author Index.

  13. Argonne National Laboratory 1986 publications

    International Nuclear Information System (INIS)

    This report is a bibliography of scientific and technical 1986 publications of Argonne National Laboratory. Some are ANL contributions to outside organizations' reports published in 1986. This compilation, prepared by the Technical Information Services Technical Publications Section (TPS), lists all nonrestricted 1986 publications submitted to TPS by the Laboratory's Divisions. Author indexes list ANL authors only. If a first author is not an ANL employee, an asterisk in the bibliographic citation indicates the first ANL author. The report is divided into seven parts: Journal Articles -- Listed by first author; ANL Reports -- Listed by report number; ANL and non-ANL Unnumbered Reports -- Listed by report number; Non-ANL Numbered Reports -- Listed by report number; Books and Book Chapters -- Listed by first author; Conference Papers -- Listed by first author; and Complete Author Index

  14. A Framework for the Interactive Handling of High-Dimensional Simulation Data in Complex Geometries

    KAUST Repository

    Benzina, Amal

    2013-01-01

    Flow simulations around building infrastructure models involve large-scale complex geometries which, when discretized in adequate detail, entail high computational cost. Moreover, tasks such as simulation insight by steering or optimization require many such costly simulations. In this paper, we illustrate the whole pipeline of an integrated solution for interactive computational steering, developed for complex flow simulation scenarios that depend on a moderate number of both geometric and physical parameters. A mesh generator takes building information model input data and outputs a valid Cartesian discretization. A sparse-grids-based surrogate model—a less costly substitute for the parameterized simulation—uses precomputed data to deliver approximated simulation results at interactive rates. Furthermore, a distributed multi-display visualization environment shows building infrastructure together with flow data. The focus is set on scalability and intuitive user interaction.
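
    The surrogate idea, precompute expensive simulations on a parameter grid and then interpolate at interactive rates, can be illustrated with a regular-grid interpolator. The paper uses sparse grids for higher-dimensional parameter spaces; this dense-grid sketch with invented parameters only conveys the workflow.

        import numpy as np
        from scipy.interpolate import RegularGridInterpolator

        # Offline: expensive simulations evaluated on a coarse parameter grid.
        wind_speed = np.linspace(1.0, 10.0, 10)
        inflow_angle = np.linspace(0.0, 90.0, 10)
        W, A = np.meshgrid(wind_speed, inflow_angle, indexing="ij")
        precomputed = np.sin(np.radians(A)) * W**2     # stand-in for solver output

        # Online: the surrogate answers arbitrary queries at interactive rates.
        surrogate = RegularGridInterpolator((wind_speed, inflow_angle), precomputed)
        print(surrogate([[4.3, 27.0], [7.9, 61.5]]))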

  15. Role-playing simulation as an educational tool for health care personnel: developing an embedded assessment framework.

    Science.gov (United States)

    Libin, Alexander; Lauderdale, Manon; Millo, Yuri; Shamloo, Christine; Spencer, Rachel; Green, Brad; Donnellan, Joyce; Wellesley, Christine; Groah, Suzanne

    2010-04-01

    Simulation- and video game-based role-playing techniques have been proven effective in changing behavior and enhancing positive decision making in a variety of professional settings, including education, the military, and health care. Although the need for developing assessment frameworks for learning outcomes has been clearly defined, there is a significant gap between the variety of existing multimedia-based instruction and technology-mediated learning systems and the number of reliable assessment algorithms. This study, based on a mixed methodology research design, aims to develop an embedded assessment algorithm, a Knowledge Assessment Module (NOTE), to capture both user interaction with the educational tool and knowledge gained from the training. The study is regarded as the first step in developing an assessment framework for a multimedia educational tool for health care professionals, Anatomy of Care (AOC), that utilizes Virtual Experience Immersive Learning Simulation (VEILS) technology. Ninety health care personnel of various backgrounds took part in online AOC training, choosing from five possible scenarios presenting difficult situations of everyday care. The results suggest that although the simulation-based training tool demonstrated partial effectiveness in improving learners' decision-making capacity, a differential learner-oriented approach might be more effective and capable of synchronizing educational efforts with identifiable relevant individual factors such as sociobehavioral profile and professional background.

  16. Just-in-time Data Analytics and Visualization of Climate Simulations using the Bellerophon Framework

    Science.gov (United States)

    Anantharaj, V. G.; Venzke, J.; Lingerfelt, E.; Messer, B.

    2015-12-01

    Climate model simulations are used to understand the evolution and variability of earth's climate. Unfortunately, high-resolution multi-decadal climate simulations can take days to weeks to complete. Typically, the simulation results are not analyzed until the model runs have ended. During the course of the simulation, the output may be processed periodically to ensure that the model is performing as expected. However, most of the data analytics and visualization are not performed until the simulation is finished. The lengthy time period needed for the completion of the simulation constrains the productivity of climate scientists. Our implementation of near real-time data visualization analytics capabilities allows scientists to monitor the progress of their simulations while the model is running. Our analytics software executes concurrently in a co-scheduling mode, monitoring data production. When new data are generated by the simulation, a co-scheduled data analytics job is submitted to render visualization artifacts of the latest results. These visualization outputs are automatically transferred to Bellerophon's data server located at ORNL's Compute and Data Environment for Science (CADES), where they are processed and archived into Bellerophon's database. During the course of the experiment, climate scientists can then use Bellerophon's graphical user interface to view animated plots and their associated metadata. The quick turnaround from the start of the simulation until the data are analyzed permits research decisions and projections to be made days or sometimes even weeks sooner than otherwise possible! The supercomputer resources used to run the simulation are unaffected by co-scheduling the data visualization jobs, so the model runs continuously while the data are visualized. Our just-in-time data visualization software looks to increase climate scientists' productivity as climate modeling moves into the exascale era of computing.
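
    The co-scheduled monitoring loop described above amounts to watching for new simulation output and triggering an analytics job for each new file. A minimal polling sketch follows; the file pattern, poll count, and actions are invented, and Bellerophon's actual job submission is far richer.

        import glob
        import time

        def new_outputs(seen):
            """Return output files that appeared since the last poll."""
            current = set(glob.glob("output/*.nc"))
            return sorted(current - seen), current

        seen = set()
        for _ in range(3):                 # a real watcher loops until the run ends
            fresh, seen = new_outputs(seen)
            for path in fresh:
                print(f"submit analytics job for {path}")   # e.g. render and upload
            time.sleep(60)                 # poll interval while the model runs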

  17. A Discrete Event Simulation Framework for Utility Accrual Scheduling Algorithm in Uniprocessor Environment

    Directory of Open Access Journals (Sweden)

    Idawaty Ahmad

    2011-01-01

    Problem statement: The heterogeneity in the choice of simulation platforms for real-time scheduling stands behind the difficulty of developing a common simulation environment. A Discrete Event Simulation (DES) for a real-time scheduling domain encompassing event definition, a time advancing mechanism, and a scheduler has yet to be developed. Approach: The study focused on the proposal and development of an event-based discrete event simulator for the existing General Utility Scheduling (GUS) algorithm, to facilitate the reuse of the algorithm under a common simulation environment. GUS is one of the existing TUF/UA scheduling algorithms that consider the Time/Utility Function (TUF) of the executed tasks in their scheduling decisions. The scheduling optimality criteria are based on maximizing the utility accrued from the execution of all tasks in the system; these criteria are named Utility Accrual (UA). TUF/UA scheduling algorithms are designed for adaptive real-time system environments. The developed GUS simulator derives its set of parameters, events, performance metrics, and other unique TUF/UA scheduling elements from a detailed analysis of the base model. Results: The Accrued Utility Ratio (AUR) is investigated and compared to the benchmark of the modeled domain. Successful deployment of the GUS simulator was demonstrated by the generated results. Conclusion: Extensive performance analysis of GUS can be carried out using the developed simulator with low computational overhead. Further enhancements would extend the simulator with detailed performance metrics together with a fault tolerance mechanism to support reliable real-time application domains.
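
    The utility accrual idea condenses into one scheduling decision: among ready tasks, run the one promising the most utility for the time it consumes. GUS itself uses a potential utility density measure; the sketch below uses a simplified utility-per-remaining-execution-time rule with step TUFs, purely for illustration.

        # Simplified UA dispatch: step TUFs (full utility if finished by deadline).
        # Each task is (name, remaining_exec_time, deadline, utility).
        tasks = [("t1", 2.0, 10.0, 40.0), ("t2", 4.0, 6.0, 90.0), ("t3", 1.0, 3.0, 25.0)]
        now = accrued = 0.0

        while tasks:
            # Drop tasks that can no longer finish in time (their utility is lost).
            tasks = [t for t in tasks if now + t[1] <= t[2]]
            if not tasks:
                break
            # Pick the highest utility density = utility / remaining execution time.
            tasks.sort(key=lambda t: t[3] / t[1], reverse=True)
            name, exec_t, deadline, utility = tasks.pop(0)
            now += exec_t
            accrued += utility
            print(f"ran {name}, finished at {now}, accrued {accrued}")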

  18. Status on the Development of a Modeling and Simulation Framework for the Economic Assessment of Nuclear Hybrid Energy Systems

    International Nuclear Information System (INIS)

    An effort to design and build a modeling and simulation framework to assess the economic viability of Nuclear Hybrid Energy Systems (NHES) was undertaken in fiscal year 2015 (FY15). The purpose of this report is to document the various tasks associated with the development of such a framework and to provide a status on its progress. Several tasks have been accomplished. First, starting from a simulation strategy, a rigorous mathematical formulation has been achieved in which the economic optimization of a Nuclear Hybrid Energy System is presented as a constrained robust (under uncertainty) optimization problem. Some possible algorithms for the solution of the optimization problem are presented. A variation of the Simultaneous Perturbation Stochastic Approximation algorithm has been implemented in RAVEN and preliminary tests have been performed. The development of the software infrastructure to support the simulation of the whole NHES has also moved forward. The coupling between RAVEN and an implementation of the Modelica language (OpenModelica) has been implemented, migrated under several operating systems and tested using an adapted model of a desalination plant. In particular, this exercise was focused on testing the coupling of the different code systems; testing parallel, computationally expensive simulations on the INL cluster; and providing a proof of concept for the possibility of using surrogate models to represent the different NHES subsystems. Another important step was the porting of the RAVEN code under the Windows™ operating system. This accomplishment makes RAVEN compatible with the development environment that is being used for dynamic simulation of NHES components. A very simplified model of a NHES on the electric market has been built in RAVEN to confirm expectations on the analysis capability of RAVEN to provide insight into system economics and to test the capability of RAVEN to identify limit surfaces even for stochastic constraints. This
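
    For reference, the Simultaneous Perturbation Stochastic Approximation algorithm mentioned above estimates a gradient from only two (possibly noisy) objective evaluations per iteration, regardless of problem dimension. A textbook-style sketch follows, with an invented stand-in objective and standard gain sequences; it is not the RAVEN implementation.

        import numpy as np

        rng = np.random.default_rng(42)

        def noisy_objective(theta):        # stand-in for an expensive system model
            return np.sum((theta - 2.0) ** 2) + 0.01 * rng.normal()

        theta = np.zeros(4)
        for k in range(2000):
            a_k = 0.1 / (k + 1) ** 0.602   # standard SPSA gain sequences
            c_k = 0.1 / (k + 1) ** 0.101
            delta = rng.choice([-1.0, 1.0], size=theta.size)  # Bernoulli perturbation
            g_hat = (noisy_objective(theta + c_k * delta)
                     - noisy_objective(theta - c_k * delta)) / (2.0 * c_k * delta)
            theta -= a_k * g_hat
        print(theta)                       # should approach [2, 2, 2, 2]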

  19. Status on the Development of a Modeling and Simulation Framework for the Economic Assessment of Nuclear Hybrid Energy Systems

    Energy Technology Data Exchange (ETDEWEB)

    Bragg-Sitton, Shannon Michelle [Idaho National Lab. (INL), Idaho Falls, ID (United States); Rabiti, Cristian [Idaho National Lab. (INL), Idaho Falls, ID (United States); Kinoshita, Robert Arthur [Idaho National Lab. (INL), Idaho Falls, ID (United States); Kim, Jong Suk [Idaho National Lab. (INL), Idaho Falls, ID (United States); Deason, Wesley Ray [Idaho National Lab. (INL), Idaho Falls, ID (United States); Boardman, Richard Doin [Idaho National Lab. (INL), Idaho Falls, ID (United States); Garcia, Humberto E. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-09-01

    An effort to design and build a modeling and simulation framework to assess the economic viability of Nuclear Hybrid Energy Systems (NHES) was undertaken in fiscal year 2015 (FY15). The purpose of this report is to document the various tasks associated with the development of such a framework and to provide a status on its progress. Several tasks have been accomplished. First, starting from a simulation strategy, a rigorous mathematical formulation has been achieved in which the economic optimization of a Nuclear Hybrid Energy System is presented as a constrained robust (under uncertainty) optimization problem. Some possible algorithms for the solution of the optimization problem are presented. A variation of the Simultaneous Perturbation Stochastic Approximation algorithm has been implemented in RAVEN and preliminary tests have been performed. The development of the software infrastructure to support the simulation of the whole NHES has also moved forward. The coupling between RAVEN and an implementation of the Modelica language (OpenModelica) has been implemented, migrated under several operating systems and tested using an adapted model of a desalination plant. In particular, this exercise was focused on testing the coupling of the different code systems; testing parallel, computationally expensive simulations on the INL cluster; and providing a proof of concept for the possibility of using surrogate models to represent the different NHES subsystems. Another important step was the porting of the RAVEN code under the Windows™ operating system. This accomplishment makes RAVEN compatible with the development environment that is being used for dynamic simulation of NHES components. A very simplified model of a NHES on the electric market has been built in RAVEN to confirm expectations on the analysis capability of RAVEN to provide insight into system economics and to test the capability of RAVEN to identify limit surfaces even for stochastic constraints. This

  20. A Monte Carlo simulation framework for electron beam dose calculations using Varian phase space files for TrueBeam Linacs

    International Nuclear Information System (INIS)

    Purpose: To develop a framework for accurate electron Monte Carlo dose calculation. In this study, comprehensive validations of vendor provided electron beam phase space files for Varian TrueBeam Linacs against measurement data are presented. Methods: In this framework, the Monte Carlo generated phase space files were provided by the vendor and used as input to the downstream plan-specific simulations including jaws, electron applicators, and water phantom computed in the EGSnrc environment. The phase space files were generated based on open field commissioning data. A subset of electron energies of 6, 9, 12, 16, and 20 MeV and open and collimated field sizes 3 × 3, 4 × 4, 5 × 5, 6 × 6, 10 × 10, 15 × 15, 20 × 20, and 25 × 25 cm² were evaluated. Measurements acquired with a CC13 cylindrical ionization chamber and electron diode detector and simulations from this framework were compared for a water phantom geometry. The evaluation metrics include percent depth dose, orthogonal and diagonal profiles at depths R100, R50, Rp, and Rp+ for standard and extended source-to-surface distances (SSD), as well as cone and cut-out output factors. Results: Agreement for the percent depth dose and orthogonal profiles between measurement and Monte Carlo was generally within 2% or 1 mm. The largest discrepancies were observed within depths of 5 mm from phantom surface. Differences in field size, penumbra, and flatness for the orthogonal profiles at depths R100, R50, and Rp were within 1 mm, 1 mm, and 2%, respectively. Orthogonal profiles at SSDs of 100 and 120 cm showed the same level of agreement. Cone and cut-out output factors agreed well with maximum differences within 2.5% for 6 MeV and 1% for all other energies. Cone output factors at extended SSDs of 105, 110, 115, and 120 cm exhibited similar levels of agreement. Conclusions: We have presented a Monte Carlo simulation framework for electron beam dose calculations for Varian TrueBeam Linacs. Electron beam energies of 6
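
    A combined dose-difference/distance criterion like the "within 2% or 1 mm" agreement quoted above is commonly evaluated with a gamma index in medical physics. The one-dimensional NumPy sketch below shows the computation on synthetic depth-dose curves; it is not the analysis code used in the study.

        import numpy as np

        # Synthetic percent-depth-dose curves (depth in mm, dose normalized to max).
        depth = np.arange(0.0, 100.0, 1.0)
        measured = np.exp(-((depth - 20.0) / 25.0) ** 2)
        simulated = np.exp(-((depth - 20.5) / 25.0) ** 2)

        def gamma_1d(ref_x, ref_d, eval_x, eval_d, dd=0.02, dta=1.0):
            """1D gamma: per reference point, minimum combined dose/distance metric."""
            dist = (ref_x[:, None] - eval_x[None, :]) / dta     # distance term [mm]
            dose = (ref_d[:, None] - eval_d[None, :]) / dd      # dose term [2% of max]
            return np.sqrt(dist**2 + dose**2).min(axis=1)

        gamma = gamma_1d(depth, measured, depth, simulated)
        print(f"pass rate (gamma <= 1): {np.mean(gamma <= 1.0):.1%}")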

  1. A Monte Carlo simulation framework for electron beam dose calculations using Varian phase space files for TrueBeam Linacs

    Energy Technology Data Exchange (ETDEWEB)

    Rodrigues, Anna; Yin, Fang-Fang; Wu, Qiuwen, E-mail: Qiuwen.Wu@Duke.edu [Department of Radiation Oncology, Duke University Medical Center, Durham, North Carolina 27710 and Medical Physics Graduate Program, Duke University Medical Center, Durham, North Carolina 27705 (United States); Sawkey, Daren [Varian Medical Systems, Palo Alto, California 94304 (United States)

    2015-05-15

    Purpose: To develop a framework for accurate electron Monte Carlo dose calculation. In this study, comprehensive validations of vendor provided electron beam phase space files for Varian TrueBeam Linacs against measurement data are presented. Methods: In this framework, the Monte Carlo generated phase space files were provided by the vendor and used as input to the downstream plan-specific simulations including jaws, electron applicators, and water phantom computed in the EGSnrc environment. The phase space files were generated based on open field commissioning data. A subset of electron energies of 6, 9, 12, 16, and 20 MeV and open and collimated field sizes 3 × 3, 4 × 4, 5 × 5, 6 × 6, 10 × 10, 15 × 15, 20 × 20, and 25 × 25 cm² were evaluated. Measurements acquired with a CC13 cylindrical ionization chamber and electron diode detector and simulations from this framework were compared for a water phantom geometry. The evaluation metrics include percent depth dose, orthogonal and diagonal profiles at depths R100, R50, Rp, and Rp+ for standard and extended source-to-surface distances (SSD), as well as cone and cut-out output factors. Results: Agreement for the percent depth dose and orthogonal profiles between measurement and Monte Carlo was generally within 2% or 1 mm. The largest discrepancies were observed within depths of 5 mm from phantom surface. Differences in field size, penumbra, and flatness for the orthogonal profiles at depths R100, R50, and Rp were within 1 mm, 1 mm, and 2%, respectively. Orthogonal profiles at SSDs of 100 and 120 cm showed the same level of agreement. Cone and cut-out output factors agreed well with maximum differences within 2.5% for 6 MeV and 1% for all other energies. Cone output factors at extended SSDs of 105, 110, 115, and 120 cm exhibited similar levels of agreement. Conclusions: We have presented a Monte Carlo simulation framework for electron beam dose calculations for

  2. Monte Carlo simulation of inverse geometry x-ray fluoroscopy using a modified MC-GPU framework

    Science.gov (United States)

    Dunkerley, David A. P.; Tomkowiak, Michael T.; Slagowski, Jordan M.; McCabe, Bradley P.; Funk, Tobias; Speidel, Michael A.

    2015-03-01

    Scanning-Beam Digital X-ray (SBDX) is a technology for low-dose fluoroscopy that employs inverse geometry x-ray beam scanning. To assist with rapid modeling of inverse geometry x-ray systems, we have developed a Monte Carlo (MC) simulation tool based on the MC-GPU framework. MC-GPU version 1.3 was modified to implement a 2D array of focal spot positions on a plane, with individually adjustable x-ray outputs, each producing a narrow x-ray beam directed toward a stationary photon-counting detector array. Geometric accuracy and blurring behavior in tomosynthesis reconstructions were evaluated from simulated images of a 3D arrangement of spheres. The artifact spread function from simulation agreed with experiment to within 1.6% (rRMSD). Detected x-ray scatter fraction was simulated for two SBDX detector geometries and compared to experiments. For the current SBDX prototype (10.6 cm wide by 5.3 cm tall detector), x-ray scatter fraction measured 2.8-6.4% (18.6-31.5 cm acrylic, 100 kV), versus 2.2-5.0% in MC simulation. Experimental trends in scatter versus detector size and phantom thickness were observed in simulation. For dose evaluation, an anthropomorphic phantom was imaged using regular and regional adaptive exposure (RAE) scanning. The reduction in kerma-area-product resulting from RAE scanning was 45% in radiochromic film measurements, versus 46% in simulation. The integral kerma calculated from TLD measurement points within the phantom was 57% lower when using RAE, versus 61% lower in simulation. This MC tool may be used to estimate tomographic blur, detected scatter, and dose distributions when developing inverse geometry x-ray systems.

  3. GridPACK™ : A Framework for Developing Power Grid Simulations on High-Performance Computing Platforms

    Energy Technology Data Exchange (ETDEWEB)

    Palmer, Bruce J.; Perkins, William A.; Chen, Yousu; Jin, Shuangshuang; Callahan, David; Glass, Kevin A.; Diao, Ruisheng; Rice, Mark J.; Elbert, Stephen T.; Vallem, Mallikarjuna R.; Huang, Zhenyu

    2016-05-01

    This paper describes the GridPACK™ framework, which is designed to help power grid engineers develop modeling software capable of running on high performance computers. The framework makes extensive use of software templates to provide high level functionality while at the same time allowing developers the freedom to express whatever models and algorithms they are using. GridPACK™ contains modules for setting up distributed power grid networks, assigning buses and branches with arbitrary behaviors to the network, creating distributed matrices and vectors and using parallel linear and non-linear solvers to solve algebraic equations. It also provides mappers to create matrices and vectors based on properties of the network and functionality to support IO and to mana
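
    A concrete instance of the bus/branch-to-matrix mapping such a framework provides is assembly of the network admittance (Y-bus) matrix. The serial NumPy sketch below, with invented branch data and shunts and taps ignored, conveys the idea; GridPACK™ itself does this with distributed matrices in C++.

        import numpy as np

        # Branches as (from_bus, to_bus, series admittance); buses numbered 0..n-1.
        branches = [(0, 1, 1.0 - 4.0j), (1, 2, 0.8 - 3.2j), (0, 2, 0.5 - 2.0j)]
        n_bus = 3

        # Map network topology onto the admittance matrix Y.
        Y = np.zeros((n_bus, n_bus), dtype=complex)
        for f, t, y in branches:
            Y[f, f] += y          # diagonal: sum of admittances at each bus
            Y[t, t] += y
            Y[f, t] -= y          # off-diagonal: negative branch admittance
            Y[t, f] -= y
        print(Y)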

  4. A versatile framework for simulating the dynamic mechanical structure of cytoskeletal networks

    CERN Document Server

    Freedman, Simon L; Hocky, Glen M; Dinner, Aaron R

    2016-01-01

    Computer simulations can aid in our understanding of how collective materials properties emerge from interactions between simple constituents. Here, we introduce a coarse-grained model of networks of actin filaments, myosin motors, and crosslinking proteins that enables simulation at biologically relevant time and length scales. We demonstrate that the model, with a consistent parameterization, qualitatively and quantitatively captures a suite of trends observed experimentally, including the statistics of filament fluctuations, mechanical responses to shear, motor motilities, and network rearrangements. The model can thus serve as a platform for interpretation and design of cytoskeletal materials experiments, as well as for further development of simulations incorporating active elements.

  5. GridLAB-D: An Agent-Based Simulation Framework for Smart Grids

    Energy Technology Data Exchange (ETDEWEB)

    Chassin, David P.; Fuller, Jason C.; Djilali, Ned

    2014-06-23

    Simulation of smart grid technologies requires a fundamentally new approach to integrated modeling of power systems, energy markets, building technologies, and the plethora of other resources and assets that are becoming part of modern electricity production, delivery, and consumption systems. As a result, the US Department of Energy’s Office of Electricity commissioned the development of a new type of power system simulation tool called GridLAB-D that uses an agent-based approach to simulating smart grids. This paper presents the numerical methods and approach to time-series simulation used by GridLAB-D and reviews applications in power system studies, market design, building control system design, and integration of wind power in a smart grid.
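
    The agent-based flavor of such a simulation can be conveyed with a toy time-series loop in which each house agent decides its load from the posted price. All names and numbers are invented; GridLAB-D models are actually written in its own GLM language.

        import random

        random.seed(3)

        class House:
            """Toy agent: sheds a deferrable load when price exceeds its threshold."""
            def __init__(self):
                self.base_kw = random.uniform(0.8, 1.5)
                self.threshold = random.uniform(0.10, 0.20)   # $/kWh

            def demand(self, price):
                deferrable = 1.0 if price < self.threshold else 0.0
                return self.base_kw + deferrable

        houses = [House() for _ in range(100)]
        for hour, price in enumerate([0.08, 0.12, 0.25, 0.11]):
            total = sum(h.demand(price) for h in houses)
            print(f"hour {hour}: price ${price:.2f}/kWh -> feeder load {total:.1f} kW")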

  6. GridLAB-D: An Agent-Based Simulation Framework for Smart Grids

    Directory of Open Access Journals (Sweden)

    David P. Chassin

    2014-01-01

    Simulation of smart grid technologies requires a fundamentally new approach to integrated modeling of power systems, energy markets, building technologies, and the plethora of other resources and assets that are becoming part of modern electricity production, delivery, and consumption systems. As a result, the US Department of Energy’s Office of Electricity commissioned the development of a new type of power system simulation tool called GridLAB-D that uses an agent-based approach to simulating smart grids. This paper presents the numerical methods and approach to time-series simulation used by GridLAB-D and reviews applications in power system studies, market design, building control system design, and integration of wind power in a smart grid.

  7. A Framework for Parallel Numerical Simulations on Multi-Scale Geometries

    KAUST Repository

    Varduhn, Vasco

    2012-06-01

    In this paper, an approach to performing numerical multi-scale simulations on finely detailed geometries is presented. In particular, the focus lies on the generation of sufficiently fine mesh representations, where a resolution of dozens of millions of voxels is inevitable in order to represent the geometry adequately. Furthermore, the propagation of boundary conditions is investigated by using simulation results from the coarser simulation scale as input boundary conditions on the next finer scale. Finally, the applicability of our approach is shown on a two-phase simulation of flooding scenarios in urban structures, running from a city-wide scale to a finely detailed indoor scale on feature-rich building geometries. © 2012 IEEE.
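
    The boundary-condition propagation step described above can be sketched as follows: solve on a coarse grid, then interpolate coarse values onto the boundary of a refined sub-domain and solve again. This is a schematic Laplace-equation example, not the authors' flow solver.

      import numpy as np

      def jacobi_laplace(u, iters=3000):
          # Jacobi relaxation toward the Laplace solution; the boundary
          # rows/columns act as Dirichlet data since only the interior moves.
          for _ in range(iters):
              u[1:-1, 1:-1] = 0.25 * (u[:-2, 1:-1] + u[2:, 1:-1]
                                      + u[1:-1, :-2] + u[1:-1, 2:])
          return u

      # Coarse (city-scale) solve: hot left wall, cold right wall.
      coarse = np.zeros((17, 17))
      coarse[:, 0] = 100.0
      coarse = jacobi_laplace(coarse)

      # Refine a sub-domain 8x and propagate boundary conditions by
      # interpolating coarse values onto the fine sub-domain edges.
      sub = coarse[4:9, 4:9]
      r = 8
      nf = (sub.shape[0] - 1) * r + 1
      fine = np.zeros((nf, nf))
      t = np.linspace(0.0, sub.shape[0] - 1, nf)
      src = np.arange(sub.shape[0])
      fine[0, :] = np.interp(t, src, sub[0, :])
      fine[-1, :] = np.interp(t, src, sub[-1, :])
      fine[:, 0] = np.interp(t, src, sub[:, 0])
      fine[:, -1] = np.interp(t, src, sub[:, -1])
      fine = jacobi_laplace(fine, iters=6000)
      print("fine-grid centre value:", fine[nf // 2, nf // 2])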

  8. Autonomic, Agent-Based Simulation Management (A2SM) Framework Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Large scale numerical simulations, as typified by climate models, space weather models, and the like, typically involve non-linear governing equations in...

  9. A Virtual Simulation Environment for Lunar Rover: Framework and Key Technologies

    Directory of Open Access Journals (Sweden)

    Yan-chun Yang

    2008-11-01

    Lunar rover development involves a large amount of validation work under realistic operational conditions, covering both the mechanical subsystem and the on-board software. Real tests require an equipped rover platform and realistic terrain, which makes them time consuming and costly. To improve development efficiency, a rover simulation environment called RSVE, which affords real-time capabilities with high fidelity, has been developed. It uses the fractional Brownian motion (fBm) technique and statistical properties to generate the lunar surface, so that various terrain models for simulation can be generated by changing a few parameters. To simulate a lunar rover evolving over natural, unstructured terrain with high realism, the full dynamics of the multi-body system and its complex interactions with soft ground are integrated into this environment. Testing of a path planning algorithm and a control algorithm in this environment is presented as an example. The simulation environment runs on PCs or Silicon Graphics workstations.
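
    The fBm surface generation mentioned in the record is commonly implemented by midpoint displacement, where the random displacement amplitude shrinks by a factor of 2^(-H) at each refinement level; a one-dimensional sketch with illustrative parameters (not RSVE code):

      import random

      random.seed(42)

      def fbm_profile(levels, hurst=0.8, scale=1.0):
          # 1-D fractional-Brownian-motion terrain by midpoint displacement:
          # each refinement halves the spacing and scales the random
          # displacement by 2**(-hurst), so hurst controls the roughness.
          heights = [0.0, 0.0]
          for _ in range(levels):
              refined = []
              for a, b in zip(heights, heights[1:]):
                  refined.append(a)
                  refined.append(0.5 * (a + b) + random.gauss(0.0, scale))
              refined.append(heights[-1])
              heights = refined
              scale *= 2.0 ** (-hurst)
          return heights

      terrain = fbm_profile(levels=8)   # 257 height samples
      print(len(terrain), min(terrain), max(terrain))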

  10. An optimization framework for modeling and simulation of dynamic systems based on AIS

    OpenAIRE

    Leung, CSK; Lau, HYK

    2011-01-01

    Modeling and simulation can be used in many contexts to gain insight into the functioning, performance, and operation of complex systems. However, this method alone often produces solutions that are feasible under certain operating conditions of a system but not necessarily optimal, which is inadequate in circumstances where optimality is required. In this respect, an approach to effectively evaluate and optimize system performance is to couple the simulation model with ...

  11. The IDES framework: A case study in development of a parallel discrete-event simulation system

    Energy Technology Data Exchange (ETDEWEB)

    Nicol, D.M. [Dartmouth Coll., Hanover, NH (United States). Dept. of Computer Science; Johnson, M.M.; Yoshimura, A.S. [Sandia National Labs., Livermore, CA (United States)

    1997-12-31

    This tutorial describes considerations in the design and development of the IDES parallel simulation system. IDES is a Java-based parallel/distributed simulation system designed to support the study of complex large-scale enterprise systems. Using the IDES system as an example, the authors discuss how anticipated model and system constraints molded the design decisions with respect to modeling, synchronization, and communication strategies.
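
    For readers unfamiliar with discrete-event simulation, the core of any such engine (IDES's parallel synchronization machinery aside) is an event list ordered by timestamp; a minimal sequential sketch in Python:

      import heapq

      class Simulator:
          """Minimal sequential discrete-event engine (event list + clock)."""
          def __init__(self):
              self.clock = 0.0
              self.queue = []
              self.seq = 0   # tie-breaker so heapq never compares callbacks

          def schedule(self, delay, callback, *args):
              heapq.heappush(self.queue, (self.clock + delay, self.seq, callback, args))
              self.seq += 1

          def run(self, until=float("inf")):
              while self.queue and self.queue[0][0] <= until:
                  self.clock, _, callback, args = heapq.heappop(self.queue)
                  callback(*args)

      sim = Simulator()

      def arrival(n):
          print(f"t={sim.clock:5.2f}  job {n} arrives")
          sim.schedule(1.5, departure, n)      # fixed service time
          if n < 3:
              sim.schedule(1.0, arrival, n + 1)

      def departure(n):
          print(f"t={sim.clock:5.2f}  job {n} departs")

      sim.schedule(0.0, arrival, 1)
      sim.run()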

  12. GridLAB-D: An Agent-Based Simulation Framework for Smart Grids

    OpenAIRE

    Chassin, David P.; Jason C. Fuller; Ned Djilali

    2014-01-01

    Simulation of smart grid technologies requires a fundamentally new approach to integrated modeling of power systems, energy markets, building technologies and the plethora of other resources and assets that are becoming part of modern electricity production, delivery and consumption systems. As a result, the US Department of Energy's Office of Electricity commissioned the development of a new type of power system simulation tool called GridLAB-D(TM) that uses an agent-based approach to simula...

  13. Development of a modelling and simulation method comparison and selection: Framework for health services management

    OpenAIRE

    Naseer Aisha; Harper Paul; Eldabi Tillal; Morris Zoe; Jun Gyuchan T; Patel Brijesh; Clarkson John P

    2011-01-01

    Background: There is an increasing recognition that modelling and simulation can assist in the process of designing health care policies, strategies and operations. However, the current use is limited and answers to questions such as what methods to use and when remain somewhat underdeveloped. Aim: The aim of this study is to provide a mechanism for decision makers in health services planning and management to compare a broad range of modelling and simulation methods.

  14. A technical framework to describe occupant behavior for building energy simulations

    Energy Technology Data Exchange (ETDEWEB)

    Turner, William; Hong, Tianzhen

    2013-12-20

    Green buildings that fail to meet expected design performance criteria indicate that technology alone does not guarantee high performance. Human influences are quite often simplified and ignored in the design, construction, and operation of buildings. Energy-conscious human behavior has been demonstrated to be a significant positive factor for improving the indoor environment while reducing the energy use of buildings. In our study we developed a new technical framework to describe energy-related human behavior in buildings, accounting for individuals and groups of occupants and their interactions with building energy services systems, appliances, and facilities. The technical framework consists of four key components: (i) the drivers behind energy-related occupant behavior, which are biological, societal, environmental, physical, and economical in nature; (ii) the needs of the occupants, based on satisfying criteria that are either physical (e.g. thermal, visual and acoustic comfort) or non-physical (e.g. entertainment, privacy, and social reward); (iii) the actions that building occupants perform when their needs are not fulfilled; and (iv) the systems with which an occupant can interact to satisfy their needs. The technical framework aims to provide a standardized description of a complete set of human energy-related behaviors in the form of an XML schema. For each type of behavior (e.g., occupants opening/closing windows, switching on/off lights, etc.) we identify a set of common behaviors based on a literature review, survey data, and our own field study and analysis. Stochastic models are adopted or developed for each type of behavior to enable the evaluation of the impact of human behavior on energy use in buildings, during either the design or operation phase. We also demonstrate the use of the technical framework in assessing the impact of occupancy behavior on energy saving technologies.
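
    As a hedged example of the stochastic behavior models mentioned above, a window-opening "action" can be modeled as a logistic probability driven by a thermal-discomfort "driver"; the coefficients below are purely illustrative:

      import math
      import random

      random.seed(7)

      def p_open_window(indoor_temp, dt_min=10.0):
          # Logistic action model: probability of opening the window in one
          # time step rises with indoor temperature (illustrative numbers).
          x = -10.0 + 0.4 * indoor_temp           # logit, crosses 0 at 25 degC
          p_per_hour = 1.0 / (1.0 + math.exp(-x))
          return 1.0 - (1.0 - p_per_hour) ** (dt_min / 60.0)

      window_open = False
      for step in range(6 * 12):                  # a 12 h day in 10 min steps
          temp = 20.0 + 8.0 * math.sin(math.pi * step / (6 * 12))
          if not window_open and random.random() < p_open_window(temp):
              window_open = True
              print(f"step {step}: window opened at {temp:.1f} degC")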

  15. Validation of a small-animal PET simulation using GAMOS: a GEANT4-based framework.

    Science.gov (United States)

    Cañadas, M; Arce, P; Rato Mendes, P

    2011-01-01

    Monte Carlo-based modelling is a powerful tool to help in the design and optimization of positron emission tomography (PET) systems. The performance of these systems depends on several parameters, such as detector physical characteristics, shielding or electronics, whose effects can be studied on the basis of realistic simulated data. The aim of this paper is to validate a comprehensive study of the Raytest ClearPET small-animal PET scanner using a new Monte Carlo simulation platform which has been developed at CIEMAT (Madrid, Spain), called GAMOS (GEANT4-based Architecture for Medicine-Oriented Simulations). This toolkit, based on the GEANT4 code, was originally designed to cover multiple applications in the field of medical physics from radiotherapy to nuclear medicine, but has since been applied by some of its users in other fields of physics, such as neutron shielding, space physics, high energy physics, etc. Our simulation model includes the relevant characteristics of the ClearPET system, namely, the double layer of scintillator crystals in phoswich configuration, the rotating gantry, the presence of intrinsic radioactivity in the crystals and the storage of single events for off-line coincidence sorting. Simulated results are contrasted with experimental acquisitions including studies of spatial resolution, sensitivity, scatter fraction and count rates in accordance with the National Electrical Manufacturers Association (NEMA) NU 4-2008 protocol. Spatial resolution results showed a discrepancy between simulated and measured values equal to 8.4% (with a maximum FWHM difference over all measurement directions of 0.5 mm). Sensitivity results differ by less than 1% for a 250-750 keV energy window. Simulated and measured count rates agree well within a wide range of activities, including under electronic saturation of the system (the measured peak of total coincidences for the mouse-sized phantom was 250.8 kcps, reached at 0.95 MBq mL-1).

  16. Thermal large-eddy simulations and experiments in the framework of non-isothermal blowing

    International Nuclear Information System (INIS)

    The aim of this work is to study thermal large-eddy simulations and to determine the impact of non-isothermal blowing on a turbulent boundary layer. An experimental study is also carried out in order to complete and validate the simulation results. First, we developed a turbulent inlet condition for the velocity and the temperature, which is necessary for the blowing simulations. We studied the asymptotic behavior of the velocity, the temperature, and the turbulent thermal fluxes from a large-eddy simulation point of view. We then considered dynamic models for the eddy diffusivity and simulated a turbulent channel flow with imposed temperature, imposed flux, and adiabatic walls. The numerical and experimental study of blowing allowed us to characterize how a thermal turbulent boundary layer is modified as the blowing rate varies. We observed the consequences of blowing on mean and rms profiles of velocity and temperature, as well as on velocity-velocity and velocity-temperature correlations. Moreover, we noticed a growth of the turbulent structures in the boundary layer with blowing. (author)

  17. A hybrid local/non-local framework for the simulation of damage and fracture

    KAUST Repository

    Azdoud, Yan

    2014-01-01

    Recent advances in non-local continuum models, notably peridynamics, have spurred a paradigm shift in solid mechanics simulation by allowing accurate mathematical representation of singularities and discontinuities. This doctoral work attempts to extend the use of this theory to a community more familiar with local continuum models. In this communication, a coupling strategy, the morphing method, which bridges local and non-local models, is presented. This thesis employs the morphing method to ease the use of the non-local model for representing problems with failure-induced discontinuities. First, we give a quick review of strategies for the simulation of discrete degradation and suggest a hybrid local/non-local alternative. Second, we present the technical concepts involved in the morphing method and evaluate the quality of the coupling. Third, we develop a numerical tool for the simulation of the hybrid model for fracture and damage and demonstrate its capabilities on numerical examples.

  18. CO2 adsorption in mono-, di- and trivalent cation-exchanged metal-organic frameworks: A molecular simulation study

    KAUST Repository

    Chen, Yifei

    2012-02-28

    A molecular simulation study is reported for CO2 adsorption in rho zeolite-like metal-organic framework (rho-ZMOF) exchanged with a series of cations (Na+, K+, Rb+, Cs+, Mg2+, Ca2+, and Al3+). The isosteric heat and Henry's constant at infinite dilution increase monotonically with increasing charge-to-diameter ratio of the cation (Cs+ < Rb+ < K+ < Na+ < Ca2+ < Mg2+ < Al3+). At low pressures, cations act as preferential adsorption sites for CO2 and the capacity follows the charge-to-diameter ratio. However, the free volume of the framework becomes predominant with increasing pressure, and Mg-rho-ZMOF appears to possess the highest saturation capacity. The equilibrium locations of cations are observed to shift slightly upon CO2 adsorption. Furthermore, the adsorption selectivity of the CO2/H2 mixture increases as Cs+ < Rb+ < K+ < Na+ < Ca2+ < Mg2+ ≈ Al3+. At ambient conditions, the selectivity is in the range of 800-3000, significantly higher than in other nanoporous materials. In the presence of 0.1% H2O, the selectivity decreases drastically because of the competitive adsorption between H2O and CO2, and shows a similar value in all of the cation-exchanged rho-ZMOFs. This simulation study provides microscopic insight into the important role of cations in governing gas adsorption and separation, and suggests that the performance of ionic rho-ZMOF can be tailored by cations. © 2012 American Chemical Society.

  19. Novel Simulation Framework of Three-Dimensional Skull Bio-Metric Measurement

    Directory of Open Access Journals (Sweden)

    Shihab A. Hameed

    2009-11-01

    Until recently, researchers have struggled to simulate three-dimensional applications for biometrics; likewise, various applications in forensics and cosmetology have not been easy to simulate. Three-dimensional figures have proven more reliable than two-dimensional figures in most of the applications implemented for the purposes above, because the features extracted from three-dimensional applications are closer to reality. The goal of this paper is to study and evaluate how reliable three-dimensional skull biometrics are in terms of measurement accuracy, capability, and applicability. As mentioned above, it used to be hard to evaluate or simulate an application using the three-dimensional skull in biometrics; however, Canfield Imaging Systems provides a suitable new environment in which to simulate three-dimensional skull biometrics. The second goal of this paper is to assess how good the new three-dimensional imaging system is. This paper also goes through recognition and verification based on different biometric applications, and subsequently studies the reliability and dependability of using skull biometrics. The simulation is based on three-dimensional skull recognition using a three-dimensional matching technique. The features of the simulated system show that three-dimensional matching can efficiently identify a person through his or her skull by matching it against a database; this technique guarantees fast processing while optimizing false positives and negatives.

  1. The framework for simulation of bioinspired security mechanisms against network infrastructure attacks.

    Science.gov (United States)

    Shorov, Andrey; Kotenko, Igor

    2014-01-01

    The paper outlines a bioinspired approach named "network nervous system" and methods of simulation of infrastructure attacks and protection mechanisms based on this approach. The protection mechanisms based on this approach consist of distributed procedures of information collection and processing, which coordinate the activities of the main devices of a computer network, identify attacks, and determine necessary countermeasures. Attacks and protection mechanisms are specified as structural models using a set-theoretic approach. An environment for simulation of protection mechanisms based on the biological metaphor is considered; the experiments demonstrating the effectiveness of the protection mechanisms are described.

  2. Through the lens of instructional design: appraisal of the Jeffries/National League for Nursing Simulation Framework for use in acute care.

    Science.gov (United States)

    Wilson, Rebecca D; Hagler, Debra

    2012-09-01

    As human patient simulation becomes more prevalent in acute care settings, clinical experts are often asked to assist in developing scenarios. Although the Jeffries/National League for Nursing Simulation Framework has been used in academic settings to guide the instructional design of clinical simulations, its use in acute care settings is less known. This framework incorporates a consideration of contextual elements, design characteristics, and outcomes. An external validation study applying the framework within the context of acute care showed its overall strength as well as elements that were problematic. The implications derived from the study of the design characteristics in a hospital setting can be used by nurses who are considering either adopting or adapting this framework for their own practice. PMID:22715871

  3. Numerical simulation of a full scale fire test on a loaded steel framework

    OpenAIRE

    Franssen, Jean-Marc; Cooke, C. M. E.; Latham, D. J.

    1995-01-01

    A single-bay, single-storey steel portal frame has been tested under fire conditions. It is simulated here using the non-linear computer code CEFICOSS. The elements have composite steel-concrete sections for the thermal analysis, but only the steel part of the sections is load bearing.

  4. Multi-agent based modeling and execution framework for complex simulation, control and measuring tasks

    NARCIS (Netherlands)

    Papp, Z.; Hoeve, H.J.

    2000-01-01

    The paper presents a modeling concept and a supporting runtime environment, which enables running simulation, control and measuring (data processing) tasks on distributed implementation platforms. Its main features: (1) it is scalable in various application domains; (2) it has a model based system

  5. Variable-resolution frameworks for the simulation of tropical cyclones in global atmospheric general circulation models

    Science.gov (United States)

    Zarzycki, Colin

    The ability of atmospheric General Circulation Models (GCMs) to resolve tropical cyclones in the climate system has traditionally been limited. The challenges include adequately capturing storms which are small in size relative to model grids and the fact that key thermodynamic processes require a significant level of parameterization. At traditional GCM grid spacings of 50-300 km tropical cyclones are severely under-resolved, if not completely unresolved. This thesis explores a variable-resolution global model approach that allows for high spatial resolutions in areas of interest, such as low-latitude ocean basins where tropical cyclogenesis occurs. Such GCM designs with multi-resolution meshes serve to bridge the gap between globally-uniform grids and limited area models and have the potential to become a future tool for regional climate assessments. A statically-nested, variable-resolution option has recently been introduced into the Department of Energy/National Center for Atmospheric Research (DoE/NCAR) Community Atmosphere Model's (CAM) Spectral Element (SE) dynamical core. Using an idealized tropical cyclone test, variable-resolution meshes are shown to significantly lessen computational requirements in regional GCM studies. Furthermore, the tropical cyclone simulations are free of spurious numerical errors at the resolution interfaces. Utilizing aquaplanet simulations as an intermediate test between idealized simulations and fully-coupled climate model runs, climate statistics within refined patches are shown to be well-matched to globally-uniform simulations of the same grid spacing. Facets of the CAM version 4 (CAM4) subgrid physical parameterizations are likely too scale sensitive for variable-resolution applications, but the newer CAM5 package is vastly improved in performance at multiple grid spacings. Multi-decadal simulations following 'Atmospheric Model Intercomparison Project' protocols have been conducted with variable-resolution grids.

  6. An Integrated GIS, optimization and simulation framework for optimal PV size and location in campus area environments

    International Nuclear Information System (INIS)

    Highlights:
    • The optimal size and locations for PV units for campus environments are achieved.
    • The GIS module finds the suitable rooftops and their panel capacity.
    • The optimization module maximizes the long-term profit of PV installations.
    • The simulation module evaluates the voltage profile of the distribution network.
    • The proposed work has been successfully demonstrated for a real university campus.
    Abstract: Finding the optimal size and locations for Photovoltaic (PV) units has been a major challenge for distribution system planners and researchers. In this study, a framework is proposed to integrate Geographical Information Systems (GIS), mathematical optimization, and simulation modules to obtain the annual optimal placement and size of PV units for the next two decades in a campus area environment. First, a GIS module is developed to find the suitable rooftops and their panel capacity considering the amount of solar radiation, slope, elevation, and aspect. The optimization module is then used to maximize the long-term net profit of PV installations considering various costs of investment, inverter replacement, operation, and maintenance as well as savings from consuming less conventional energy. A voltage profile of the electricity distribution network is then investigated in the simulation module. In the case of voltage limit violation by intermittent PV generations or load fluctuations, two mitigation strategies, reallocation of the PV units or installation of a local storage unit, are suggested. The proposed framework has been implemented in a real campus area, and the results show that it can effectively be used for long-term installation planning of PV panels considering both the cost and power quality.

  7. I. Dissociation free energies in drug-receptor systems via non equilibrium alchemical simulations: theoretical framework

    CERN Document Server

    Procacci, Piero

    2016-01-01

    In this contribution I critically revise the alchemical reversible approach in the context of the statistical mechanics theory of non covalent bonding in drug receptor systems. I show that most of the pitfalls and entanglements for the binding free energies evaluation in computer simulations are rooted in the equilibrium assumption that is implicit in the reversible method. These critical issues can be resolved by using a non-equilibrium variant of the alchemical method in molecular dynamics simulations, relying on the production of many independent trajectories with a continuous dynamical evolution of an externally driven alchemical coordinate, completing the decoupling of the ligand in a matter of few tens of picoseconds rather than nanoseconds. The absolute binding free energy can be recovered from the annihilation work distributions by applying an unbiased unidirectional free energy estimate, on the assumption that any observed work distribution is given by a mixture of normal distributions, whose compone...
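
    The unidirectional estimate described above can be sketched numerically: when the work distribution is normal (one component of the mixture the paper considers), the free energy follows from the mean and variance of the annihilation work. Synthetic data and illustrative units below.

      import numpy as np

      rng = np.random.default_rng(3)
      beta = 1.0 / 0.596                # 1/kT in (kcal/mol)^-1 near 300 K

      # Synthetic annihilation work values from independent nonequilibrium
      # trajectories (stand-ins for the alchemical decoupling runs).
      works = rng.normal(loc=12.0, scale=1.5, size=200)     # kcal/mol

      # Unidirectional Gaussian estimate: dG = <W> - beta*var(W)/2,
      # exact when the work distribution is normal.
      dG_gauss = works.mean() - 0.5 * beta * works.var(ddof=1)

      # Jarzynski exponential average for comparison (log-sum-exp for stability).
      logw = -beta * works
      dG_jarz = -(np.logaddexp.reduce(logw) - np.log(len(works))) / beta

      print(f"Gaussian estimate : {dG_gauss:.2f} kcal/mol")
      print(f"Jarzynski estimate: {dG_jarz:.2f} kcal/mol")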

  8. Unusual adsorption site behavior in PCN-14 metal-organic framework predicted from Monte Carlo simulation.

    Science.gov (United States)

    Lucena, Sebastião M P; Mileo, Paulo G M; Silvino, Pedro F G; Cavalcante, Célio L

    2011-12-01

    The adsorption equilibrium of methane in PCN-14 was simulated by the Monte Carlo technique in the grand canonical ensemble. A new force field was proposed for the methane/PCN-14 system, and the temperature dependence of the molecular siting was investigated. A detailed study of the statistics of the center of mass and potential energy showed a surprising site behavior with no energy barriers between weak and strong sites, allowing open metal sites to guide methane molecules to other neighboring sites. Moreover, this study showed that a model assuming weakly adsorbing open metal clusters in PCN-14, densely populated only at low temperatures (below 150 K), can explain published experimental data. These results also explain previously observed discrepancies between neutron diffraction experiments and Monte Carlo simulations.
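
    The temperature dependence of molecular siting reported here can be caricatured with Boltzmann statistics: a single deep (open-metal) site competing against many shallower sites is densely populated only at low temperature. The energies and multiplicities below are illustrative assumptions, not values fitted to PCN-14.

      import numpy as np

      # Boltzmann populations of a strong (open-metal) site versus many
      # weaker sites at low loading.
      E_strong, E_weak = -20.0, -12.0   # adsorption energies, kJ/mol
      g_strong, g_weak = 1, 20          # site multiplicities
      R = 8.314e-3                      # gas constant, kJ/(mol K)

      for T in (100, 150, 200, 298, 400):
          w_s = g_strong * np.exp(-E_strong / (R * T))
          w_w = g_weak * np.exp(-E_weak / (R * T))
          print(f"T = {T:3d} K  strong-site occupancy fraction = {w_s / (w_s + w_w):.2f}")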

  9. Federated Simulation and Gaming Framework for a Decentralized Space-Based Resource Economy

    OpenAIRE

    Grogan, Paul Thomas; de Weck, Olivier L.

    2012-01-01

    Future human space exploration will require large amounts of resources for shielding and building materials, propellants, and consumables. A space-based resource economy could produce, transport, and store resource at distributed locations such as the lunar surface, stable orbits, or Lagrange points to avoid Earth's deep gravity well. Design challenges include decentralized operation and management and socio-technical complexities not commonly addressed by modeling and simulation methods. Thi...

  10. OpenSim: a musculoskeletal modeling and simulation framework for in silico investigations and exchange

    OpenAIRE

    Seth, Ajay; Sherman, Michael; Reinbolt, Jeffrey A.; Delp, Scott L.

    2011-01-01

    Movement science is driven by observation, but observation alone cannot elucidate principles of human and animal movement. Biomechanical modeling and computer simulation complement observations and inform experimental design. Biological models are complex and specialized software is required for building, validating, and studying them. Furthermore, common access is needed so that investigators can contribute models to a broader community and leverage past work. We are developing OpenSim, a fr...

  11. Evaluating Standard and Custom Applications in IPv6 Within a Simulation Framework

    OpenAIRE

    Clore, Brittany Michelle

    2012-01-01

    Internet Protocol version 6 (IPv6) is being adopted in networks around the world as the Internet Protocol version 4 (IPv4) addressing space reaches its maximum capacity. Although there are IPv6 applications being developed, there are not many production IPv6 networks in place in which these applications can be deployed. Simulation presents a cost effective alternative to setting up a live test bed of devices to validate specific IPv6 environments before actual physical deployment. OPNET Mode...

  12. A generic open-source software framework supporting scenario simulations in bioterrorist crises.

    Science.gov (United States)

    Falenski, Alexander; Filter, Matthias; Thöns, Christian; Weiser, Armin A; Wigger, Jan-Frederik; Davis, Matthew; Douglas, Judith V; Edlund, Stefan; Hu, Kun; Kaufman, James H; Appel, Bernd; Käsbohrer, Annemarie

    2013-09-01

    Since the 2001 anthrax attack in the United States, awareness of threats originating from bioterrorism has grown. This led internationally to increased research efforts to improve knowledge of and approaches to protecting human and animal populations against the threat from such attacks. A collaborative effort in this context is the extension of the open-source Spatiotemporal Epidemiological Modeler (STEM) simulation and modeling software for agro- or bioterrorist crisis scenarios. STEM, originally designed to enable community-driven public health disease models and simulations, was extended with new features that enable integration of proprietary data as well as visualization of agent spread along supply and production chains. STEM now provides a fully developed open-source software infrastructure supporting critical modeling tasks such as ad hoc model generation, parameter estimation, simulation of scenario evolution, estimation of effects of mitigation or management measures, and documentation. This open-source software resource can be used free of charge. Additionally, STEM provides critical features like built-in worldwide data on administrative boundaries, transportation networks, or environmental conditions (eg, rainfall, temperature, elevation, vegetation). Users can easily combine their own confidential data with built-in public data to create customized models of desired resolution. STEM also supports collaborative and joint efforts in crisis situations by extended import and export functionalities. In this article we demonstrate specifically those new software features implemented to accomplish STEM application in agro- or bioterrorist crisis scenarios.

  13. Chemical analysis of Argonne premium coal samples. Bulletin

    Energy Technology Data Exchange (ETDEWEB)

    Palmer, C.A.

    1997-11-01

    Contents: The Chemical Analysis of Argonne Premium Coal Samples: An Introduction; Rehydration of Desiccated Argonne Premium Coal Samples; Determination of 62 Elements in 8 Argonne Premium Coal Ash Samples by Automated Semiquantitative Direct-Current Arc Atomic Emission Spectrography; Determination of 18 Elements in 5 Whole Argonne Premium Coal Samples by Quantitative Direct-Current Arc Atomic Emission Spectrography; Determination of Major and Trace Elements in Eight Argonne Premium Coal Samples (Ash and Whole Coal) by X-Ray Fluorescence Spectrometry; Determination of 29 Elements in 8 Argonne Premium Coal Samples by Instrumental Neutron Activation Analysis; Determination of Selected Elements in Coal Ash from Eight Argonne Premium Coal Samples by Atomic Absorption Spectrometry and Atomic Emission Spectrometry; Determination of 25 Elements in Coal Ash from 8 Argonne Premium Coal Samples by Inductively Coupled Argon Plasma-Atomic Emission Spectrometry; Determination of 33 Elements in Coal Ash from 8 Argonne Premium Coal Samples by Inductively Coupled Argon Plasma-Mass Spectrometry; Determination of Mercury and Selenium in Eight Argonne Premium Coal Samples by Cold-Vapor and Hydride-Generation Atomic Absorption Spectrometry; Determination of Carbon, Hydrogen, and Nitrogen in Eight Argonne Premium Coal Samples by Using a Gas Chromatographic Analyzer with a Thermal Conductivity Detector; and Compilation of Multitechnique Determinations of 51 Elements in 8 Argonne Premium Coal Samples.

  14. Argonne's Laboratory computing center - 2007 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R.; Pieper, G. W.

    2008-05-28

    Argonne National Laboratory founded the Laboratory Computing Resource Center (LCRC) in the spring of 2002 to help meet pressing program needs for computational modeling, simulation, and analysis. The guiding mission is to provide critical computing resources that accelerate the development of high-performance computing expertise, applications, and computations to meet the Laboratory's challenging science and engineering missions. In September 2002 the LCRC deployed a 350-node computing cluster from Linux NetworX to address Laboratory needs for mid-range supercomputing. This cluster, named 'Jazz', achieved over a teraflop of computing power (10^12 floating-point calculations per second) on standard tests, making it the Laboratory's first terascale computing system and one of the 50 fastest computers in the world at the time. Jazz was made available to early users in November 2002 while the system was undergoing development and configuration. In April 2003, Jazz was officially made available for production operation. Since then, the Jazz user community has grown steadily. By the end of fiscal year 2007, there were over 60 active projects representing a wide cross-section of Laboratory expertise, including work in biosciences, chemistry, climate, computer science, engineering applications, environmental science, geoscience, information science, materials science, mathematics, nanoscience, nuclear engineering, and physics. Most important, many projects have achieved results that would have been unobtainable without such a computing resource. The LCRC continues to foster growth in the computational science and engineering capability and quality at the Laboratory. Specific goals include expansion of the use of Jazz to new disciplines and Laboratory initiatives, teaming with Laboratory infrastructure providers to offer more scientific data management capabilities, and expanding Argonne staff use of national computing facilities.

  15. Diffusion dynamics and concentration of toxic materials from quantum dots-based nanotechnologies: an agent-based modeling simulation framework

    Energy Technology Data Exchange (ETDEWEB)

    Agusdinata, Datu Buyung, E-mail: bagusdinata@niu.edu; Amouie, Mahbod [Northern Illinois University, Department of Industrial & Systems Engineering and Environment, Sustainability, & Energy Institute (United States); Xu, Tao [Northern Illinois University, Department of Chemistry and Biochemistry (United States)

    2015-01-15

    Due to their favorable electrical and optical properties, quantum dot (QD) nanostructures have found numerous applications including nanomedicine and photovoltaic cells. However, increased future production, use, and disposal of engineered QD products also raise concerns about their potential environmental impacts. The objective of this work is to establish a modeling framework for predicting the diffusion dynamics and concentration of toxic materials released from trioctylphosphine oxide-capped CdSe. To this end, an agent-based model with reaction kinetics and Brownian motion dynamics was developed. Reaction kinetics is used to model the stability of the surface capping agent, particularly under oxidation. The diffusion of toxic Cd2+ ions in the aquatic environment was simulated using an adapted Brownian motion algorithm, and a calibrated parameter reflecting sensitivity to the reaction rate is proposed. The model output demonstrates the stochastic spatial distribution of toxic Cd2+ ions under different values of proxy environmental factor parameters. With oxidation as the only chemistry considered, the simulation was able to replicate Cd2+ ion release from thiol-capped QDs in aerated water. The agent-based method is the first to be developed in the QD application domain; it combines a simple treatment of the solubility and release rate of Cd2+ ions with the ability to track individual Cd atoms at the same time.
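
    A minimal sketch of the described agent-based scheme, assuming first-order oxidation kinetics for release and an unbounded aqueous domain for the Brownian walk (all rate and diffusion values are illustrative, not the authors' calibrated parameters):

      import numpy as np

      rng = np.random.default_rng(5)

      # Each released Cd2+ ion is an agent performing a Brownian random walk;
      # release is driven by first-order oxidation of the capping layer.
      k_ox = 0.05          # 1/s, oxidation (release) rate, illustrative
      D = 1.0e-9           # m^2/s, diffusion coefficient in water
      dt = 1.0             # s
      n_capped = 1000      # intact surface sites on the quantum dot
      positions = []       # released-ion agents, start at the QD surface

      for step in range(600):
          # Reaction kinetics: stochastic first-order oxidation/release events.
          released = rng.binomial(n_capped, 1.0 - np.exp(-k_ox * dt))
          n_capped -= released
          positions.extend(np.zeros(3) for _ in range(released))
          # Brownian dynamics for every free ion.
          if positions:
              arr = np.asarray(positions)
              arr = arr + rng.normal(0.0, np.sqrt(2 * D * dt), size=arr.shape)
              positions = list(arr)

      arr = np.asarray(positions)
      print(f"released ions: {len(positions)}, capped sites left: {n_capped}")
      print(f"mean displacement: {np.sqrt((arr**2).sum(axis=1)).mean():.2e} m")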

  16. The YUIMA Project: A Computational Framework for Simulation and Inference of Stochastic Differential Equations

    Directory of Open Access Journals (Sweden)

    Alexandre Brouste

    2014-04-01

    The YUIMA Project is an open source and collaborative effort aimed at developing the R package yuima for simulation and inference of stochastic differential equations. In the yuima package, stochastic differential equations can be of very abstract type: multidimensional, driven by Wiener process or fractional Brownian motion with general Hurst parameter, with or without jumps specified as Lévy noise. The yuima package is intended to offer the basic infrastructure on which complex models and inference procedures can be built. This paper explains the design of the yuima package and provides some examples of applications.
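
    yuima itself is an R package; for illustration only, the core task it automates (path simulation of an SDE) can be sketched in Python with the Euler-Maruyama scheme, here for an Ornstein-Uhlenbeck process:

      import numpy as np

      rng = np.random.default_rng(11)

      def euler_maruyama(drift, diffusion, x0, t_end, n_steps):
          """Simulate dX = drift(X) dt + diffusion(X) dW by Euler-Maruyama."""
          dt = t_end / n_steps
          x = np.empty(n_steps + 1)
          x[0] = x0
          dW = rng.normal(0.0, np.sqrt(dt), size=n_steps)
          for k in range(n_steps):
              x[k + 1] = x[k] + drift(x[k]) * dt + diffusion(x[k]) * dW[k]
          return x

      # Ornstein-Uhlenbeck process: dX = theta*(mu - X) dt + sigma dW
      theta, mu, sigma = 2.0, 1.0, 0.3
      path = euler_maruyama(lambda x: theta * (mu - x), lambda x: sigma,
                            x0=0.0, t_end=5.0, n_steps=5000)
      print("terminal value:", path[-1], " long-run mean ~", mu)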

  17. Towards multi-phase flow simulations in the PDE framework Peano

    KAUST Repository

    Bungartz, Hans-Joachim

    2011-07-27

    In this work, we present recent enhancements and new functionalities of our flow solver in the partial differential equation framework Peano. We start with an introduction including an overview of the Peano development and a short description of the basic concepts of Peano and the flow solver in Peano concerning the underlying structured but adaptive Cartesian grids, the data structure and data access optimisation, and spatial and time discretisation of the flow solver. The new features cover geometry interfaces and additional application functionalities. The two geometry interfaces, a triangulation-based description supported by the tool preCICE and a built-in geometry using geometry primitives such as cubes, spheres, or tetrahedra allow for the efficient treatment of complex and changing geometries, an essential ingredient for most application scenarios. The new application functionality concerns a coupled heat-flow problem and two-phase flows. We present numerical examples, performance and validation results for these new functionalities. © 2011 Springer-Verlag.

  18. Casting Simulation Within the Framework of ICME: Coupling of Solidification, Heat Treatment, and Structural Analysis

    Science.gov (United States)

    Guo, Jianzheng; Scott, Sam; Cao, Weisheng; Köser, Ole

    2016-05-01

    Integrated computational materials engineering (ICME) is becoming a compulsory practice for developing advanced materials, re-thinking manufacturing processing, and engineering components to meet challenging design goals quickly and cost-effectively. As a key component of the ICME approach, a numerical approach is being developed for the prediction of casting microstructure, defects formation and mechanical properties from solidification to heat treatment. Because of the processing conditions and complexity of geometry, material properties of a cast part are not normally homogeneous. This variation and the potential weakening inherent in manufacturing are currently accommodated by incorporating large safety factors that counter design goals. The simulation of the different manufacturing process stages is integrated such that the resultant microstructure of the previous event is used as the initial condition of the following event, ensuring the tracking of the component history while maintaining a high level of accuracy across these manufacturing stages. This paper explains the significance of integrated analytical prediction to obtain more precise simulation results and sets out how available techniques may be applied accordingly.

  19. Research framework of integrated simulation on bilateral interaction between water cycle and socio-economic development

    Science.gov (United States)

    Hao, C. F.; Yang, X. L.; Niu, C. W.; Jia, Y. W.

    2016-08-01

    The mechanism of bilateral interaction between natural water cycle evolution and socio-economic development has remained obscure in current research due to the complexity of the hydrological process and of the socio-economic system. Coupling the economic model CGE (Computable General Equilibrium) with the distributed hydrological model WEP (Water and Energy transfer Processes) provides a model-based tool for research on the response and feedback between the water cycle and social development, as well as on economic prospects under the constraint of water resources. On one hand, water policies such as water use limitation and water price adjustment under different levels of socio-economic development are evaluated by the CGE model as assumed conditions, and the corresponding water demand can be fed into the WEP model to simulate the response of the whole water cycle. On the other hand, the variation of available water resources under different scenarios simulated by the WEP model provides limits for water demand in the CGE model, and the corresponding change of economic factors indicates the influence of water resource constraints on socio-economic development. The research is expected to support a better understanding of the bilateral interaction between water and society.

  1. On a framework for generating PoD curves assisted by numerical simulations

    Science.gov (United States)

    Subair, S. Mohamed; Agrawal, Shweta; Balasubramaniam, Krishnan; Rajagopal, Prabhu; Kumar, Anish; Rao, Purnachandra B.; Tamanna, Jayakumar

    2015-03-01

    The Probability of Detection (PoD) curve method has emerged as an important tool for the assessment of the performance of NDE techniques, a topic of particular interest to the nuclear industry, where inspection qualification is very important. The conventional experimental means of generating PoD curves, though, can be expensive, requiring large data sets (covering defects and test conditions) as well as equipment and operator time. Several methods of achieving faster estimates for PoD curves using physics-based modelling have been developed to address this problem. Numerical modelling techniques are also attractive, especially given the ever-increasing computational power available to scientists today. Here we develop procedures for obtaining PoD curves, assisted by numerical simulation and based on Bayesian statistics. Numerical simulations are performed using Finite Element analysis for factors that are assumed to be independent, random, and normally distributed. PoD curves so generated are compared with experiments on austenitic stainless steel (SS) plates with artificially created notches. We examine issues affecting the PoD curve generation process, including codes, standards, the distribution of defect parameters, and the choice of the noise threshold. We also study the assumption of a normal distribution for signal response parameters and consider strategies for dealing with data that may be too complex or sparse to justify it. These topics are addressed and illustrated through the example case of generating PoD curves for pulse-echo ultrasonic inspection of vertical surface-breaking cracks in SS plates.
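
    A common simulation-assisted route to a PoD curve (one option among those the paper discusses; the numbers and threshold here are assumptions) is the signal-response model, in which log signal amplitude regresses linearly on log defect size and PoD(a) is the probability that the signal exceeds a decision threshold:

      import numpy as np
      from math import erf, sqrt

      rng = np.random.default_rng(2)

      # Synthetic "a-hat versus a" data: log signal amplitude grows linearly
      # with log defect size plus Gaussian scatter (illustrative numbers).
      a = np.linspace(0.5, 5.0, 60)              # defect depth, mm
      log_ahat = 0.2 + 1.1 * np.log(a) + rng.normal(0.0, 0.35, a.size)

      # Fit the linear signal-response model and derive the PoD curve:
      # PoD(a) = Phi((b0 + b1*ln a - ln threshold) / sigma)
      b1, b0 = np.polyfit(np.log(a), log_ahat, 1)
      sigma = (log_ahat - (b0 + b1 * np.log(a))).std(ddof=2)
      log_thresh = np.log(1.0)                   # decision threshold on a-hat

      def pod(size):
          z = (b0 + b1 * np.log(size) - log_thresh) / sigma
          return 0.5 * (1.0 + erf(z / sqrt(2.0)))

      for size in (0.5, 1.0, 2.0, 3.0):
          print(f"a = {size:.1f} mm  PoD = {pod(size):.3f}")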

  2. SIMULATION FRAMEWORK FOR REGIONAL GEOLOGIC CO2 STORAGE ALONG ARCHES PROVINCE OF MIDWESTERN UNITED STATES

    Energy Technology Data Exchange (ETDEWEB)

    Sminchak, Joel

    2012-09-30

    This report presents final technical results for the project Simulation Framework for Regional Geologic CO2 Storage Infrastructure along Arches Province of the Midwest United States. The Arches Simulation project was a three-year effort designed to develop a simulation framework for regional geologic carbon dioxide (CO2) storage infrastructure along the Arches Province through development of a geologic model and advanced reservoir simulations of large-scale CO2 storage. The project included five major technical tasks: (1) compilation of geologic, hydraulic and injection data on Mount Simon, (2) development of model framework and parameters, (3) preliminary variable density flow simulations, (4) multi-phase model runs of regional storage scenarios, and (5) implications for regional storage feasibility. The Arches Province is an informal region in northeastern Indiana, northern Kentucky, western Ohio, and southern Michigan where sedimentary rock formations form broad arch and platform structures. In the province, the Mount Simon sandstone is an appealing deep saline formation for CO2 storage because of the intersection of reservoir thickness and permeability. Many CO2 sources are located in proximity to the Arches Province, and the area is adjacent to coal-fired power plants along the Ohio River Valley corridor. Geophysical well logs, rock samples, drilling logs, and geotechnical tests were evaluated for a 500,000 km² study area centered on the Arches Province. Hydraulic parameters and historical operational information were also compiled from Mount Simon wastewater injection wells in the region. This information was integrated into a geocellular model that depicts the parameters and conditions in a numerical array. The geologic and hydraulic data were integrated into a three-dimensional grid of porosity and permeability, which are key parameters regarding fluid flow and pressure buildup due to CO2 injection.

  3. A comparison of regional flood frequency analysis approaches in a simulation framework

    Science.gov (United States)

    Ganora, D.; Laio, F.

    2016-07-01

    Regional frequency analysis (RFA) is a well-established methodology for providing an estimate of the flood frequency curve at ungauged (or scarcely gauged) sites. Different RFA approaches exist, depending on the way information is transferred to the site of interest, but it is not clear from the literature whether a specific method systematically outperforms the others. The aim of this study is to provide a framework for carrying out this intercomparison by building a virtual environment based on synthetically generated data. The considered regional approaches include: (i) a unique regional curve for the whole region; (ii) a multiple-region model where homogeneous subregions are determined through cluster analysis; (iii) a Region-of-Influence model which defines a homogeneous subregion for each site; and (iv) a spatially smooth estimation procedure where the parameters of the regional model vary continuously in space. Virtual environments are generated considering different patterns of heterogeneity, including step changes and smooth variations. If the region is heterogeneous, with the parent distribution changing continuously within the region, the spatially smooth regional approach outperforms the others, with overall errors 10-50% lower than the other methods. In the case of a step change, the spatially smooth and clustering procedures perform similarly if the heterogeneity is moderate, while clustering procedures work better when the step change is severe. To extend our findings, an extensive sensitivity analysis has been performed to investigate the effect of sample length, number of virtual stations, return period of the predicted quantile, variability of the scale parameter of the parent distribution, number of predictor variables, and type of parent distribution. Overall, the spatially smooth approach appears to be the most robust, as its performance is more stable across different patterns of heterogeneity, especially when short records are available.

  4. A framework of motion capture system based human behaviours simulation for ergonomic analysis

    CERN Document Server

    Ma, Ruina; Bennis, Fouad; Ma, Liang

    2011-01-01

    With the increase of computer capabilities, computer-aided ergonomics (CAE) offers new possibilities to integrate conventional ergonomic knowledge and to develop new methods in the work design process. As mentioned in [1], different approaches have been developed to enhance the efficiency of ergonomic evaluation: ergonomic expert systems, ergonomics-oriented information systems, numerical models of humans, etc. have been implemented in ergonomic software. To date, ergonomic software tools such as Jack, Ergoman, Delmia Human, 3DSSPP, and Santos are available [2-4]. The main functions of these tools are posture analysis and posture prediction. On the visualization side, Jack and 3DSSPP produce results that visualize virtual human tasks in three dimensions, but without realistic physical properties. Nowadays, with the development of computer technology, more attention is paid to the simulation of the physical world, and physics engines [5] are used more and more in the computer game (CG) field. The a...

  5. Optimising and extending the geometrical modeller of a physics simulation framework

    CERN Document Server

    Urban, P

    1998-01-01

    The design of highly complex particle detectors used in High Energy Physics involves both CAD systems and physics simulation packages like Geant4. Geant4 is able to exchange detector geometries with CAD systems, conforming to the Standard for the Exchange of Product Model Data (STEP); Boundary Representation (B-Rep) models are transferred. Particle tracking is performed in these models, requiring efficient and accurate intersection computations from the geometrical modeller. The results of extending and optimising the modeller of Geant4 form the contents of this thesis. Swept surfaces: surfaces of linear extrusion and surfaces of revolution have been implemented. The problem of classifying points on surfaces bounded by curves as being inside or outside has been solved. These tasks necessitated the extension and optimisation of code related to curves and lead to a re-design of this code. Emphasis was put on efficiency and on dealing with numerical errors. The results will be integrated into the upcoming beta t...

  6. The Design of Cognitive Social Simulation Framework using Statistical Methodology in the Domain of Academic Science

    Directory of Open Access Journals (Sweden)

    R. Sivakumar

    2013-05-01

    Modeling the behavior of a cognitive architecture in the context of social simulation using statistical methodologies is currently a growing research area. Normally, a cognitive architecture for an intelligent agent involves an artificial computational process that exemplifies theories of cognition in computer algorithms over a state space. For cognitive systems with large state spaces, problems such as large tables and data sparsity are faced. Hence, in this paper we propose a method using a value-iteration approach based on the Q-learning algorithm, with a function approximation technique, to handle cognitive systems with large state spaces. Experimental results in the application domain of academic science verify that the proposed approach performs better than existing approaches.
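
    A minimal sketch of Q-learning with linear function approximation, the ingredient the paper proposes for large state spaces; the chain environment and polynomial features below are illustrative assumptions, not the paper's model:

      import numpy as np

      rng = np.random.default_rng(4)

      # Q-learning with linear function approximation on a 1-D chain:
      # states 0..N-1, actions left/right, reward only at the right end.
      # Features generalise across the state space instead of a Q-table.
      N, n_actions = 50, 2
      alpha, gamma, eps = 0.05, 0.95, 0.1

      def features(s, a):
          phi = np.zeros(3 * n_actions)
          x = s / (N - 1)
          phi[3 * a: 3 * a + 3] = (1.0, x, x * x)   # per-action basis
          return phi

      w = np.zeros(3 * n_actions)

      def q(s, a):
          return features(s, a) @ w

      for episode in range(500):
          s = 0
          for t in range(200):
              if rng.random() < eps:
                  a = int(rng.integers(n_actions))          # explore
              else:
                  a = int(np.argmax([q(s, b) for b in range(n_actions)]))
              s2 = max(0, s - 1) if a == 0 else min(N - 1, s + 1)
              r, done = (1.0, True) if s2 == N - 1 else (0.0, False)
              target = r if done else r + gamma * max(q(s2, b) for b in range(n_actions))
              w += alpha * (target - q(s, a)) * features(s, a)
              s = s2
              if done:
                  break

      print("Q near goal:", [round(q(N - 2, b), 2) for b in range(n_actions)])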

  7. Handling of the Generation of Primary Events in Gauss, the LHCb Simulation Framework

    CERN Multimedia

    Corti, G; Brambach, T; Brook, N H; Gauvin, N; Harrison, K; Harrison, P; He, J; Ilten, P J; Jones, C R; Lieng, M H; Manca, G; Miglioranzi, S; Robbe, P; Vagnoni, V; Whitehead, M; Wishahi, J

    2010-01-01

    The LHCb simulation application, Gauss, consists of two independent phases, the generation of the primary event and the tracking of particles produced in the experimental setup. For the LHCb experimental program it is particularly important to model B meson decays: the EvtGen code developed in CLEO and BaBar has been chosen and customized for non-coherent B production as occurring in pp collisions at the LHC. The initial proton-proton collision is provided by a different generator engine, currently Pythia 6 for massive production of signal and generic pp collision events. Beam gas events, background events originating from proton halo, cosmics and calibration events for different detectors can be generated in addition to pp collisions. Different generator packages are available in the physics community or specifically developed in LHCb, and are used for the different purposes. Running conditions affecting the events generated such as the size of the luminous region, the number of collisions occurring in a bunc...

  9. CO adsorption over Pd nanoparticles: A general framework for IR simulations on nanoparticles

    Science.gov (United States)

    Zeinalipour-Yazdi, Constantinos D.; Willock, David J.; Thomas, Liam; Wilson, Karen; Lee, Adam F.

    2016-04-01

    CO vibrational spectra over catalytic nanoparticles under high coverages/pressures are discussed from a DFT perspective. Hybrid B3LYP and PBE DFT calculations of CO chemisorbed over Pd4 and Pd13 nanoclusters, and a 1.1 nm Pd38 nanoparticle, have been performed in order to simulate the corresponding coverage-dependent infrared (IR) absorption spectra, and hence provide a quantitative foundation for the interpretation of experimental IR spectra of CO over Pd nanocatalysts. B3LYP simulated IR intensities are used to quantify site occupation numbers through comparison with experimental DRIFTS spectra, allowing an atomistic model of CO surface coverage to be created. DFT adsorption energetics for low CO coverage (θ → 0) suggest the CO binding strength follows the order hollow > bridge > linear, even for dispersion-corrected functionals, for sub-nanometre Pd nanoclusters. For a Pd38 nanoparticle, hollow- and bridge-bound CO are energetically similar (hollow ≈ bridge > atop). It is well known that this ordering has not been found at the high coverages used experimentally, wherein atop CO has a much higher population than observed over Pd(111), confirmed by our DRIFTS spectra for Pd nanoparticles supported on a KIT-6 silica, and hence site populations were calculated through a comparison of DFT and spectroscopic data. At high CO coverage (θ = 1), all three adsorbed CO species co-exist on Pd38, and their interdiffusion is thermally feasible at STP. Under such high surface coverages, DFT predicts that bridge-bound CO chains are thermodynamically stable and isoenergetic to an entirely hollow-bound Pd/CO system. The Pd38 nanoparticle undergoes a linear (3.5%), isotropic expansion with increasing CO coverage, accompanied by 63 and 30 cm⁻¹ blue-shifts of hollow- and linear-bound CO, respectively.
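
    The site-population step can be pictured as a small linear inverse problem: given simulated IR intensities per binding mode and band areas integrated from an experimental spectrum, non-negative occupation numbers follow from least squares. The numbers below are placeholders, not the paper's data:

```python
# Hedged sketch of a site-population estimate: solve for non-negative
# occupations of the hollow/bridge/atop CO modes from band areas.
# All numerical values are invented for illustration.
import numpy as np
from scipy.optimize import nnls

# Rows: observed bands; columns: CO binding modes (hollow, bridge, atop).
# A[i, j] = simulated IR intensity of mode j contributing to band i.
A = np.array([[0.9, 0.2, 0.0],
              [0.1, 0.8, 0.1],
              [0.0, 0.1, 1.0]])
b = np.array([2.1, 1.5, 0.7])   # integrated experimental band areas

populations, residual = nnls(A, b)
for mode, n in zip(["hollow", "bridge", "atop"], populations):
    print(f"{mode}: {n:.2f}")
print("residual norm:", residual)
```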

  10. A discrete element and ray framework for rapid simulation of acoustical dispersion of microscale particulate agglomerations

    Science.gov (United States)

    Zohdi, T. I.

    2016-03-01

    In industry, particle-laden fluids, such as particle-functionalized inks, are constructed by adding fine-scale particles to a liquid solution, in order to achieve desired overall properties in both liquid and (cured) solid states. However, oftentimes undesirable particulate agglomerations arise due to some form of mutual attraction stemming from near-field forces, stray electrostatic charges, process ionization and mechanical adhesion. For proper operation of industrial processes involving particle-laden fluids, it is important to carefully break up and disperse these agglomerations. One approach is to target high-frequency acoustical pressure-pulses to break up such agglomerations. The objective of this paper is to develop a computational model and corresponding solution algorithm to enable rapid simulation of the effect of acoustical pulses on an agglomeration composed of a collection of discrete particles. Because of the complex agglomeration microstructure, containing gaps and interfaces, this type of system is extremely difficult to mesh and simulate using continuum-based methods, such as the finite difference time domain or the finite element method. Accordingly, a computationally amenable discrete element/discrete ray model is developed which captures the primary physical events in this process, such as the reflection and absorption of acoustical energy, and the induced forces on the particulate microstructure. The approach utilizes a staggered, iterative solution scheme to calculate the power transfer from the acoustical pulse to the particles and the subsequent changes (breakup) of the pulse due to the particles. Three-dimensional examples are provided to illustrate the approach.
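
    A heavily simplified sketch of such a staggered scheme, with toy absorption and force models standing in for the paper's physics:

```python
# Staggered loop sketch: pass 1 deposits ray energy on the particles,
# pass 2 advances the particles under the induced forces. Geometry,
# absorption law and force model are placeholder assumptions.
import numpy as np

n = 50
rng = np.random.default_rng(1)
pos = rng.normal(0.0, 1e-4, size=(n, 3))      # agglomerated particles [m]
vel = np.zeros((n, 3))
mass = 1e-12                                   # [kg], assumed uniform
dt, absorb = 1e-7, 0.3

def ray_power_on_particles(pos, incident_power):
    """Toy absorption model: particles nearer the source (min x) shadow
    the rest; each particle absorbs a fraction of what reaches it."""
    order = np.argsort(pos[:, 0])
    power, remaining = np.zeros(n), incident_power
    for i in order:
        power[i] = absorb * remaining / n
        remaining -= power[i]
    return power

for step in range(100):
    power = ray_power_on_particles(pos, incident_power=1e-3)  # pass 1
    # convert absorbed acoustic power to a push along +x (toy model)
    force = np.zeros((n, 3))
    force[:, 0] = power / 340.0        # ~ power / sound speed
    vel += force / mass * dt           # pass 2: advance particles
    pos += vel * dt

print(f"agglomerate spread after dispersal: {pos[:, 0].std():.3e} m")
```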

  11. MASH: a framework for the automation of x-ray optical simulations

    Science.gov (United States)

    Sondhauss, Peter

    2014-09-01

    MASH stands for "Macros for the Automation of SHadow". It allows a set of ray-tracing simulations to be run fully automatically, for example over a range of photon energies. Undulator gaps, crystal angles, etc. are tuned automatically. Important output parameters, such as photon flux, photon irradiance, focal spot size and bandwidth, are then provided directly as functions of photon energy. A photon-energy scan is probably the most commonly requested one, but any parameter or set of parameters can be scanned as well. Heat-load calculations with finite element analysis (Comsol), providing temperatures, stresses and deformations, are fully integrated. The deformations can be fed back into the ray-tracing process simply by activating a switch. MASH tries to hide program internals such as file names, calls to pre-processors, etc., so that the user (nearly) only needs to provide the optical setup. It comes with a web interface, which allows it to be run remotely on a central computation server. Hence, no local installation or licenses are required, just a web browser and access to the local network. Numerous tools are provided for viewing the ray-tracing results in the web browser. The results can also be downloaded for local analysis. All files are human-readable text files that can be easily imported into third-party programs for further processing. All set parameters are stored in a single human-readable file in XML format.
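
    The automation pattern can be sketched as a scan loop in which dependent settings are retuned and summary quantities collected per energy; `tune_beamline` and `run_shadow` below are hypothetical stand-ins for the pre-processor calls MASH hides from the user:

```python
# Sketch of an automated photon-energy scan. Both functions are invented
# stand-ins, not MASH's or SHADOW's actual interfaces.
def tune_beamline(energy_eV):
    """Hypothetical: choose undulator gap and crystal angle for this energy."""
    return {"gap_mm": 10 + energy_eV / 1e4, "bragg_deg": 12398.0 / energy_eV}

def run_shadow(energy_eV, settings):
    """Hypothetical stand-in for a ray-tracing run; returns summary numbers."""
    return {"flux": 1e12 / (1 + (energy_eV - 9000) ** 2 / 1e6),
            "spot_um": 5 + abs(energy_eV - 9000) / 1000}

results = []
for energy in range(6000, 12001, 500):        # photon-energy scan [eV]
    settings = tune_beamline(energy)
    out = run_shadow(energy, settings)
    results.append((energy, out["flux"], out["spot_um"]))

for energy, flux, spot in results:
    print(f"{energy:6d} eV  flux={flux:.3e}  spot={spot:.1f} um")
```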

  12. Development of a Parallel Overset Grid Framework for Moving Body Simulations in OpenFOAM

    Directory of Open Access Journals (Sweden)

    Dominic Chandar

    2015-12-01

    OpenFOAM is an industry-standard open-source fluid dynamics code that is used to solve the Navier-Stokes equations for a variety of flow situations. It is currently being used extensively by researchers to study a plethora of physical problems ranging from fundamental fluid dynamics to complex multiphase flows. When it comes to modeling the flow surrounding moving bodies that involve large displacements, such as that of ocean risers, the sinking of a ship, or the free flight of an insect, it is cumbersome to utilize a single computational grid and move the body of interest. In this work, we discuss a high-fidelity approach based on overset or overlapping grids which overcomes the necessity of using a single computational grid. The overset library is parallelized using the Message Passing Interface (MPI) and Pthreads, and is linked dynamically to OpenFOAM. Computational results are presented to demonstrate the potential of this method for simulating problems with large displacements.

  13. A framework for the evaluation of turbulence closures used in mesoscale ocean large-eddy simulations

    CERN Document Server

    Graham, Jonathan Pietarila

    2012-01-01

    We present a methodology to determine the best turbulence closure for an eddy-permitting ocean model: measurement of the error-landscape of the closure's subgrid spectral transfers and flux. Using a high-resolution benchmark, we compare each closure's model of energy and enstrophy transfer to the actual transfer observed in the benchmark run. The error-landscape norms enable us both to make objective comparisons between the closures and to optimize each closure's free parameter for a fair comparison. We apply this method to six different closures for forced-dissipative simulations of the barotropic vorticity equation on an f-plane (2D Navier-Stokes equation). The hyper-viscous closure most closely reproduces the enstrophy cascade, especially at larger scales, due to the concentration of its dissipative effects at the very smallest scales. The viscous and Leith closures perform nearly as well, especially at smaller scales, where all three models were dissipative. The Smagorinsky closure dissipates enstrophy at the wr...
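
    In outline, the error-landscape procedure reduces to evaluating a mismatch norm between each closure's modelled transfer and the benchmark's, over a sweep of the closure's free parameter. A synthetic sketch (the benchmark and closure forms below are placeholders, not the paper's data):

```python
# Sketch of the error-landscape idea: scan a closure's free coefficient,
# measure an RMS mismatch against a benchmark flux, and pick the minimum.
import numpy as np

k = np.arange(1, 129)                      # wavenumbers
benchmark_flux = k ** (-1.0)               # stand-in for the diagnosed transfer

def closure_flux(c):
    """Hypothetical hyper-viscous-like closure: flux at small scales."""
    return c * (k / k.max()) ** 4

def error_norm(c):
    return np.sqrt(np.mean((closure_flux(c) - benchmark_flux) ** 2))

cs = np.linspace(0.0, 5.0, 501)
errors = [error_norm(c) for c in cs]
best = cs[int(np.argmin(errors))]
print(f"optimal closure coefficient: {best:.3f}, error {min(errors):.4f}")
```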

  14. An expanded framework for the advanced computational testing and simulation toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Marques, Osni A.; Drummond, Leroy A.

    2003-11-09

    The Advanced Computational Testing and Simulation (ACTS) Toolkit is a set of computational tools developed primarily at DOE laboratories and is aimed at simplifying the solution of common and important computational problems. The use of the tools reduces the development time for new codes and the tools provide functionality that might not otherwise be available. This document outlines an agenda for expanding the scope of the ACTS Project based on lessons learned from current activities. Highlights of this agenda include peer-reviewed certification of new tools; finding tools to solve problems that are not currently addressed by the Toolkit; working in collaboration with other software initiatives and DOE computer facilities; expanding outreach efforts; promoting interoperability; further developing the tools; and improving the functionality of the ACTS Information Center, among other tasks. The ultimate goal is to make the ACTS tools more widely used and more effective in solving DOE's and the nation's scientific problems through the creation of a reliable software infrastructure for scientific computing.

  15. C++QEDv2 Milestone 10: A C++/Python application-programming framework for simulating open quantum dynamics

    Science.gov (United States)

    Sandner, Raimar; Vukics, András

    2014-09-01

    The v2 Milestone 10 release of C++QED is primarily a feature release, which also corrects some problems of the previous release, especially as regards the build system. The adoption of C++11 features has led to many simplifications in the codebase. A full doxygen-based API manual [1] is now provided together with updated user guides. A largely automated, versatile new testsuite directed both towards computational and physics features allows for quickly spotting arising errors. The states of trajectories are now savable and recoverable with full binary precision, allowing for trajectory continuation regardless of evolution method (single/ensemble Monte Carlo wave-function or Master equation trajectory). As the main new feature, the framework now presents Python bindings to the highest-level programming interface, so that actual simulations for given composite quantum systems can now be performed from Python.
    Catalogue identifier: AELU_v2_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AELU_v2_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: yes
    No. of lines in distributed program, including test data, etc.: 492422
    No. of bytes in distributed program, including test data, etc.: 8070987
    Distribution format: tar.gz
    Programming language: C++/Python
    Computer: i386-i686, x86_64
    Operating system: in principle cross-platform, as yet tested only on UNIX-like systems (including Mac OS X)
    RAM: the framework itself takes about 60 MB, which is fully shared. The additional memory taken by the program which defines the actual physical system (script) is typically less than 1 MB. The memory storing the actual data scales with the system dimension for state-vector manipulations, and the square of the dimension for density-operator manipulations. This might easily be GBs, and often the memory of the machine limits the size of the simulated system.
    Classification: 4.3, 4.13, 6.2
    External routines: Boost C

  16. Causal Mathematical Logic as a guiding framework for the prediction of "Intelligence Signals" in brain simulations

    Science.gov (United States)

    Lanzalaco, Felix; Pissanetzky, Sergio

    2013-12-01

    A recent theory of physical information based on the fundamental principles of causality and thermodynamics has proposed that a large number of observable life and intelligence signals can be described in terms of Causal Mathematical Logic (CML), which is proposed to encode the natural principles of intelligence across any physical domain and substrate. We attempt to expound the current definition of CML, the "Action functional", as a theory, in terms of its ability to provide superior explanatory power for the neuroscientific data currently used to measure the mammalian brain's "intelligence" processes at the most general biophysical level. Brain simulation projects define their success partly in terms of the emergence of "non-explicitly programmed" complex biophysical signals such as self-oscillation and spreading cortical waves. Here we propose to extend the causal theory to predict and guide the understanding of these more complex emergent "intelligence signals". To achieve this, we review whether causal logic is consistent with, can explain, and can predict the function of complete perceptual processes associated with intelligence. Primarily, these are defined as the range of Event-Related Potentials (ERPs), including their primary subcomponents, Event-Related Desynchronization (ERD) and Event-Related Synchronization (ERS). This approach aims at a universal and predictive logic for neurosimulation and AGI. The result of this investigation is a general "Information Engine" model produced from translation of the ERD and ERS. The CML algorithm, run in terms of action cost, predicts ERP signal contents and is consistent with the fundamental laws of thermodynamics. A working substrate-independent natural information logic would be a major asset: an information theory consistent with fundamental physics can be an AGI, can also operate within the genetic information space, and provides a roadmap for understanding the live biophysical operation of the phenotype.

  17. Architecture Framework for Trapped-Ion Quantum Computer based on Performance Simulation Tool

    Science.gov (United States)

    Ahsan, Muhammad

    The challenge of building a scalable quantum computer lies in striking an appropriate balance between designing a reliable system architecture from a large number of faulty computational resources and improving the physical quality of system components. The detailed investigation of performance variation with the physics of the components and the system architecture requires an adequate performance simulation tool. In this thesis we demonstrate a software tool capable of (1) mapping and scheduling a quantum circuit onto a realistic quantum hardware architecture with physical resource constraints, (2) evaluating performance metrics such as the execution time and the success probability of the algorithm execution, and (3) analyzing the constituents of these metrics and visualizing resource utilization to identify system components which crucially determine the overall performance. Using this versatile tool, we explore the vast design space for a modular quantum computer architecture based on trapped ions. We find that while the success probability is uniformly determined by the fidelity of the physical quantum operations, the execution time is a function of the system resources invested at various layers of the design hierarchy. At the physical level, the number of lasers performing quantum gates impacts the latency of fault-tolerant circuit-block execution. When these blocks are used to construct meaningful arithmetic circuits such as quantum adders, the number of ancilla qubits for complicated non-Clifford gates, and the entanglement resources needed to establish long-distance communication channels, become the major performance-limiting factors. Next, in order to factorize large integers, these adders are assembled into the modular exponentiation circuit comprising the bulk of Shor's algorithm. At this stage, the overall scaling of resource-constrained performance with the size of the problem describes the effectiveness of the chosen design. By matching the resource investment with the pace of advancement in hardware technology
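
    A toy version of the two metrics: treating success probability as a product of per-operation fidelities and execution time as a parallelism-scaled sum of operation durations. The gate counts, fidelities and durations below are invented, and real tools model far more structure (scheduling, communication, error correction):

```python
# Back-of-envelope metrics sketch with placeholder numbers.
gates = {"one_qubit": 1200, "two_qubit": 400, "measure": 50}
fidelity = {"one_qubit": 0.9999, "two_qubit": 0.999, "measure": 0.998}
duration_us = {"one_qubit": 1.0, "two_qubit": 10.0, "measure": 100.0}

# success probability ~ product of per-operation fidelities
success = 1.0
for g, count in gates.items():
    success *= fidelity[g] ** count

# execution time ~ total operation time divided by an assumed parallelism
parallelism = 4.0
time_us = sum(gates[g] * duration_us[g] for g in gates) / parallelism

print(f"success probability ~ {success:.3f}")
print(f"execution time     ~ {time_us:.0f} us")
```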

  18. Toward an ontology framework supporting the integration of geographic information with modeling and simulation for critical infrastructure protection

    Energy Technology Data Exchange (ETDEWEB)

    Ambrosiano, John J [Los Alamos National Laboratory; Bent, Russell W [Los Alamos National Laboratory; Linger, Steve P [Los Alamos National Laboratory

    2009-01-01

    Protecting the nation's infrastructure from natural disasters, inadvertent failures, or intentional attacks is a major national security concern. Gauging the fragility of infrastructure assets, and understanding how interdependencies across critical infrastructures affect their behavior, is essential to predicting and mitigating cascading failures, as well as to planning for response and recovery. Modeling and simulation (M&S) is an indispensable part of characterizing this complex system of systems and anticipating its response to disruptions. Bringing together the necessary components to perform such analyses produces a wide-ranging and coarse-grained computational workflow that must be integrated with other analysis workflow elements. There are many points in both types of workflows at which geographic information (GI) services are required. The GIS community recognizes the essential contribution of GI in this problem domain, as evidenced by past OGC initiatives. Typically such initiatives focus on the broader aspects of GI analysis workflows, leaving concepts crucial to integrating simulations within analysis workflows to that community. Our experience with large-scale modeling of interdependent critical infrastructures, and our recent participation in a DRS initiative concerning interoperability for this M&S domain, has led to high-level ontological concepts that we have begun to assemble into an architecture that spans both computational and 'world' views of the problem, and further recognizes the special requirements of simulations that go beyond common workflow ontologies. In this paper we present these ideas, and offer a high-level ontological framework that includes key geospatial concepts as special cases of a broader view.

  19. Computational Investigations of Potential Energy Function Development for Metal-Organic Framework Simulations, Metal Carbenes, and Chemical Warfare Agents

    Science.gov (United States)

    Cioce, Christian R.

    Metal-Organic Frameworks (MOFs) are three-dimensional porous nanomaterials with a variety of applications, including catalysis, gas storage and separation, and sustainable energy. Their potential as air filtration systems is of interest for designer carbon capture materials. The chemical constituents (i.e. organic ligands) can be functionalized to create rationally designed CO2 sequestration platforms, for example. Hardware and software alike at the bleeding edge of supercomputing are utilized for designing first-principles-based molecular models for the simulation of gas sorption in these frameworks. The classical potentials developed herein are named PHAST (Potentials with High Accuracy, Speed, and Transferability) and are designed via a "bottom-up" approach. Specifically, models for N2 and CH4 are constructed and presented. Extensive verification and validation leads to insights and the range of applicability. Through this experience, the PHAST models are improved upon further to be more applicable in heterogeneous environments. Given this, the models are applied to reproducing high-level ab initio energies for gas sorption trajectories of helium atoms in a variety of rare-gas clusters, the geometries of which are representative of sorption-like environments commonly encountered in a porous nanomaterial. This work seeks to push forward the state of classical and first-principles materials modeling. Additionally, the characterization of a new type of tunable radical metal-carbene is presented. Here, a cobalt(II)-porphyrin complex, [Co(Por)], was investigated to understand its role as an effective catalyst in the stereoselective cyclopropanation of a diazoacetate reagent. Density functional theory along with natural bond order analysis and charge decomposition analysis gave insight into the electronics of the catalytic intermediate. The bonding pattern unveiled a new class of radical metal-carbene complex, with a doublet cobalt into which a triplet carbene
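
    The abstract does not give the PHAST functional form; as a generic illustration of the ingredients such classical sorbate models combine, the sketch below evaluates a Lennard-Jones term plus point-charge electrostatics for a single site pair, with placeholder parameters (not PHAST parameters):

```python
# Generic pairwise energy sketch: Lennard-Jones + Coulomb for one site pair.
# Parameter values are placeholders for illustration only.
import math

COULOMB_K = 138.935458   # kJ mol^-1 nm e^-2 (Coulomb constant in these units)

def pair_energy(r_nm, sigma_nm, eps_kj, q1_e, q2_e):
    """Lennard-Jones + Coulomb energy of one site pair, in kJ/mol."""
    sr6 = (sigma_nm / r_nm) ** 6
    lj = 4.0 * eps_kj * (sr6 ** 2 - sr6)
    coul = COULOMB_K * q1_e * q2_e / r_nm
    return lj + coul

# e.g. a sorbate site near a framework site at 0.35 nm separation
print(pair_energy(0.35, sigma_nm=0.33, eps_kj=0.30, q1_e=-0.40, q2_e=0.20))
```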

  20. Exploring the "what if?" in geology through a RESTful open-source framework for cloud-based simulation and analysis

    Science.gov (United States)

    Klump, Jens; Robertson, Jess

    2016-04-01

    The spatial and temporal extent of geological phenomena makes experiments in geology difficult to conduct, if not entirely impossible, and the collection of data is laborious and expensive - so expensive that most of the time we cannot test a hypothesis. The aim, in many cases, is to gather enough data to build a predictive geological model. Even in a mine, where data are abundant, a model remains incomplete because the information at the level of a blasting block is two orders of magnitude larger than the sample from a drill core, and we have to take measurement errors into account. So, what confidence can we have in a model based on sparse data, uncertainties and measurement error? Our framework consists of two layers: (a) a ground-truth layer that contains geological models, which can be statistically based on historical operations data, and (b) a network of RESTful synthetic sensor microservices which can query the ground truth for underlying properties and deliver a simulated measurement to a control layer, which could be a database or LIMS, a machine learner, or a company's existing data infrastructure. Ground-truth data are generated by an implicit geological model which serves as a host for nested models of geological processes at smaller scales. Our two layers are implemented using Flask and Gunicorn (an open-source Python web application framework and server, respectively), the PyData stack (numpy, scipy, etc.) and RabbitMQ (an open-source message broker). Sensor data are encoded using a JSON-LD version of the SensorML and Observations and Measurements standards. Containerisation of the synthetic sensors using Docker and CoreOS allows rapid and scalable deployment of large numbers of sensors, as well as sensor discovery to form a self-organized dynamic network of sensors. Real-time simulation of data sources can be used to investigate crucial questions such as the potential information gain from future sensing capabilities, or from new sampling strategies, or the
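
    A minimal sketch of one synthetic-sensor microservice in the spirit described, using Flask; the endpoint name, ground-truth model and noise level are assumptions, not the project's actual API:

```python
# Hedged sketch: a Flask microservice that queries a stand-in ground-truth
# model and returns a noisy simulated measurement.
import random
from flask import Flask, jsonify, request

app = Flask(__name__)

def ground_truth_grade(x, y, z):
    """Stand-in for the implicit geological model layer."""
    return 2.0 + 0.01 * x - 0.02 * y + 0.001 * z

@app.route("/measurement")
def measurement():
    x = float(request.args.get("x", 0.0))
    y = float(request.args.get("y", 0.0))
    z = float(request.args.get("z", 0.0))
    true_value = ground_truth_grade(x, y, z)
    observed = random.gauss(true_value, 0.1)   # simulated measurement error
    return jsonify({"observedProperty": "grade_pct",
                    "location": [x, y, z],
                    "result": observed})

if __name__ == "__main__":
    app.run(port=5000)   # in production this would sit behind Gunicorn
```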

  1. ELIST8: simulating military deployments in Java

    International Nuclear Information System (INIS)

    Planning for the transportation of large amounts of equipment, troops, and supplies presents a complex problem. Many options, including modes of transportation, vehicles, facilities, routes, and timing, must be considered. The amount of data involved in generating and analyzing a course of action (e.g., detailed information about military units, logistical infrastructures, and vehicles) is enormous. Software tools are critical in defining and analyzing these plans. Argonne National Laboratory has developed ELIST (Enhanced Logistics Intra-theater Support Tool), a simulation-based decision support system, to assist military planners in determining the logistical feasibility of an intra-theater course of action. The current version of ELIST (v.8) contains a discrete event simulation developed using the Java programming language. Argonne selected Java because of its object-oriented framework, which has greatly facilitated entity and process development within the simulation, and because it fulfills a primary requirement for multi-platform execution. This paper describes the model, including setup and analysis, a high-level architectural design, and an evaluation of Java
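
    ELIST itself is written in Java; purely to illustrate the discrete event pattern it rests on (a time-ordered event queue driving entity state changes), here is a compact sketch in Python with an invented convoy example:

```python
# Minimal discrete event simulation core: events are (time, callback) pairs
# kept in a priority queue and executed in time order.
import heapq

class Simulator:
    def __init__(self):
        self.clock = 0.0
        self._queue = []            # (time, seq, callback, args)
        self._seq = 0               # tie-breaker for simultaneous events

    def schedule(self, delay, callback, *args):
        heapq.heappush(self._queue,
                       (self.clock + delay, self._seq, callback, args))
        self._seq += 1

    def run(self):
        while self._queue:
            self.clock, _, callback, args = heapq.heappop(self._queue)
            callback(*args)

sim = Simulator()

def depart(unit, leg):
    print(f"t={sim.clock:6.1f}h  {unit} departs leg {leg}")
    sim.schedule(8.0, arrive, unit, leg)       # invented 8-hour transit

def arrive(unit, leg):
    print(f"t={sim.clock:6.1f}h  {unit} arrives, leg {leg} done")
    if leg < 2:
        sim.schedule(1.0, depart, unit, leg + 1)

sim.schedule(0.0, depart, "convoy-A", 1)
sim.run()
```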

  2. On-lattice agent-based simulation of populations of cells within the open-source Chaste framework

    KAUST Repository

    Figueredo, G. P.

    2013-02-21

    Over the years, agent-based models have been developed that combine cell division and reinforced random walks of cells on a regular lattice, reaction-diffusion equations for nutrients and growth factors, and ordinary differential equations for the subcellular networks regulating the cell cycle. When linked to a vascular layer, this multiscale model framework has been applied to tumour growth and therapy. Here, we report on the creation of an agent-based multi-scale environment amalgamating the characteristics of these models within a Virtual Physiological Human (VPH) Exemplar Project. This project enables reuse, integration, expansion and sharing of the model and relevant data. The agent-based and reaction-diffusion parts of the multi-scale model have been implemented and are available for download as part of the latest public release of Chaste (Cancer, Heart and Soft Tissue Environment; http://www.cs.ox.ac.uk/chaste/), part of the VPH Toolkit (http://toolkit.vph-noe.eu/). The environment functionalities are verified against the original models, in addition to extra validation of all aspects of the code. In this work, we present the details of the implementation of the agent-based environment, including the system description, the conceptual model, the development of the simulation model and the processes of verification and validation of the simulation results. We explore the potential use of the environment by presenting exemplar applications of the 'what if' scenarios that can easily be studied in the environment. These examples relate to tumour growth, cellular competition for resources and tumour responses to hypoxia (low oxygen levels). We conclude our work by summarizing the future steps for the expansion of the current system.
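
    A toy on-lattice step of the kind such environments implement: cells occupy lattice sites, random-walk into free neighbours, and divide when space allows. The rates and lattice size below are placeholders, not Chaste parameters:

```python
# Toy on-lattice agent-based model: random-walk moves and contact-inhibited
# division on a square lattice. All rates are invented for illustration.
import random

SIZE = 20
occupied = {(10, 10)}                     # start from a single cell
MOVE_P, DIVIDE_P = 0.5, 0.1

def free_neighbours(site):
    x, y = site
    cand = [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]
    return [(i, j) for i, j in cand
            if 0 <= i < SIZE and 0 <= j < SIZE and (i, j) not in occupied]

for step in range(200):
    for cell in list(occupied):
        targets = free_neighbours(cell)
        if not targets:
            continue                      # contact-inhibited: no free site
        if random.random() < DIVIDE_P:
            occupied.add(random.choice(targets))     # daughter cell
        elif random.random() < MOVE_P:
            occupied.remove(cell)
            occupied.add(random.choice(targets))     # random-walk move

print(f"population after 200 steps: {len(occupied)}")
```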

  3. Numerical simulation and experimental validation of biofilm in a multi-physics framework using an SPH based method

    Science.gov (United States)

    Soleimani, Meisam; Wriggers, Peter; Rath, Henryke; Stiesch, Meike

    2016-10-01

    In this paper, a 3D computational model has been developed to investigate biofilms in a multi-physics framework using smoothed particle hydrodynamics (SPH) based on a continuum approach. Biofilm formation is a complex process in the sense that several physical phenomena are coupled and consequently different time-scales are involved. On one hand, biofilm growth is driven by biological reaction and nutrient diffusion; on the other hand, it is influenced by fluid flow causing biofilm deformation and interface erosion in the context of fluid and deformable solid interaction. The geometrical and numerical complexity arising from these phenomena poses serious complications and challenges in grid-based techniques such as finite elements. Here, the solution is based on SPH, a powerful meshless method. SPH-based computational modeling is quite new in the biological community and the method is uniquely robust in capturing the interface-related processes of biofilm formation, such as erosion. The obtained results show good agreement with experimental and published data, which demonstrates that the model is capable of simulating and predicting the overall spatial and temporal evolution of biofilm.
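
    In SPH, field values are kernel-weighted sums over neighbouring particles; as a minimal illustration of that building block (not the authors' biofilm model), this computes particle densities with the standard cubic-spline kernel:

```python
# SPH density summation sketch: rho_i = sum_j m_j W(|r_i - r_j|, h),
# using the standard 3D cubic-spline kernel. Setup is a toy unit box.
import numpy as np

def cubic_spline_w(r, h):
    """Standard 3D cubic-spline kernel, normalisation 8/(pi h^3)."""
    q = r / h
    sigma = 8.0 / (np.pi * h ** 3)
    w = np.where(q < 0.5,
                 6.0 * (q ** 3 - q ** 2) + 1.0,
                 np.where(q < 1.0, 2.0 * (1.0 - q) ** 3, 0.0))
    return sigma * w

rng = np.random.default_rng(0)
pos = rng.uniform(0, 1, size=(500, 3))      # particle positions in a unit box
mass, h = 1.0 / 500, 0.15                   # equal masses, smoothing length

diff = pos[:, None, :] - pos[None, :, :]
r = np.linalg.norm(diff, axis=-1)
rho = (mass * cubic_spline_w(r, h)).sum(axis=1)
print(f"mean SPH density: {rho.mean():.3f} (expected ~1, lower near edges)")
```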

  5. A framework for the design and specification of hard real-time, hardware-in-the-loop simulations of large, avionic systems

    Science.gov (United States)

    Ricks, Kenneth Gerald

    High-level design tools for the design and specification of avionic systems and real-time systems currently exist. However, real-time, hardware-in-the-loop simulations of avionic systems are based upon principles fundamentally different from those used to design avionic systems, and represent a specialized case of real-time systems. As a result, the high-level software tools used to design avionic systems and real-time systems cannot be applied to the design of real-time, hardware-in-the-loop simulations of avionic systems. For this reason, such simulations of avionic systems should not be considered part of the domain containing avionic systems or general-purpose real-time systems, and should be considered an application domain unto itself for which design tools are unavailable. To fill this void, this dissertation proposes a framework for the design and specification of real-time, hardware-in-the-loop simulations of avionic systems. This framework is based upon a new specification language called the Simulation Architecture Description Language. This specification language is a graphical language with constructs and semantics defined to provide the user with the capability to completely define the simulation and its software execution characteristics at various levels of abstraction. The language includes a new method for combining precedence constraints for a single software process. These semantics provide a more accurate description of the behavior of software systems having a dynamic job structure than existing semantics. An environment that supports the execution of simulation software having the semantics defined within this language is also described. A toolset that interfaces to the language and provides additional functionality such as design analysis, schedulability analysis, and simulation file generation is also discussed. This framework provides a complete design and specification environment for real-time, hardware-in-the-loop simulations of

  6. Materials technology at Argonne National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Betten, P.

    1989-01-01

    Argonne is actively involved in the research and development (R&D) of new materials. Five new materials technologies have been identified as having commercial potential and are presented in this paper: (1) nanophase materials, (2) nuclear magnetic resonance (NMR) imaging of ceramics, (3) superconductivity developments and technology transfer mechanisms, (4) COMMIX computer code modeling for metal castings, and (5) tribology using ion-assisted deposition (IAD). 4 refs., 7 figs., 1 tab.

  7. A flexible object-based software framework for modeling complex systems with interacting natural and societal processes.

    Energy Technology Data Exchange (ETDEWEB)

    Christiansen, J. H.

    2000-06-15

    The Dynamic Information Architecture System (DIAS) is a flexible, extensible, object-based framework for developing and maintaining complex multidisciplinary simulations. The DIAS infrastructure makes it feasible to build and manipulate complex simulation scenarios in which many thousands of objects can interact via dozens to hundreds of concurrent dynamic processes. The flexibility and extensibility of the DIAS software infrastructure stem mainly from (1) the abstraction of object behaviors, (2) the encapsulation and formalization of model functionality, and (3) the mutability of domain object contents. DIAS simulation objects are inherently capable of highly flexible and heterogeneous spatial realizations. Geospatial graphical representation of DIAS simulation objects is addressed via the GeoViewer, an object-based GIS toolkit application developed at ANL. DIAS simulation capabilities have been extended by inclusion of societal process models generated by the Framework for Addressing Cooperative Extended Transactions (FACET), another object-based framework developed at Argonne National Laboratory. By using FACET models to implement societal behaviors of individuals and organizations within larger DIAS-based natural systems simulations, it has become possible to conveniently address a broad range of issues involving interaction and feedback among natural and societal processes. Example DIAS application areas discussed in this paper include a dynamic virtual oceanic environment, detailed simulation of clinical, physiological, and logistical aspects of health care delivery, and studies of agricultural sustainability of urban centers under environmental stress in ancient Mesopotamia.

  8. Recent developments in the target facilities at Argonne National Laboratory

    International Nuclear Information System (INIS)

    A description is given of recent developments in the target facility at Argonne National Laboratory. Highlights include equipment upgrades which enable us to provide enhanced capabilities in support of the Argonne Heavy-Ion ATLAS Accelerator Project. Future plans and additional equipment acquisitions are also discussed. 3 refs., 3 tabs

  9. Environmental monitoring at Argonne National Laboratory. Annual report for 1983

    International Nuclear Information System (INIS)

    The results of the environmental monitoring program at Argonne National Laboratory for 1983 are presented and discussed. To evaluate the effect of Argonne operations on the environment, measurements were made for a variety of radionuclides in air, surface water, soil, grass, bottom sediment, and milk; for a variety of chemical constituents in air, surface water, ground water, and Argonne effluent water; and of the environmental penetrating radiation dose. Sample collections and measurements were made at the site boundary and off the Argonne site for comparison purposes. Some on-site measurements were made to aid in the interpretation of the boundary and off-site data. The potential radiation dose to off-site population groups is also estimated. The results of the program are interpreted in terms of the sources and origin of the radioactive and chemical substances (natural, fallout, Argonne, and other) and are compared with applicable environmental quality standards. 19 references, 8 figures, 49 tables

  10. Environmental monitoring at Argonne National Laboratory. Annual report for 1980

    Energy Technology Data Exchange (ETDEWEB)

    Golchert, N. W.; Duffy, T. L.; Sedlet, J.

    1981-03-01

    The results of the environmental monitoring program at Argonne National Laboratory for 1980 are presented and discussed. To evaluate the effect of Argonne operations on the environment, measurements were made for a variety of radionuclides in air, surface water, soil, grass, bottom sediment, and foodstuffs; for a variety of chemical constituents in air, surface water, and Argonne effluent water; and of the environmental penetrating radiation dose. Sample collections and measurements were made at the site boundary and off the Argonne site for comparison purposes. Some on-site measurements were made to aid in the interpretation of the boundary and off-site data. The results of the program are interpreted in terms of the sources and origin of the radioactive and chemical substances (natural, fallout, Argonne, and other) and are compared with applicable environmental quality standards. The potential radiation dose to off-site population groups is also estimated.

  11. Environmental monitoring at Argonne National Laboratory. Annual report for 1984

    International Nuclear Information System (INIS)

    The results of the environmental monitoring program at Argonne National Laboratory for 1984 are presented and discussed. To evaluate the effect of Argonne operations on the environment, measurements were made for a variety of radionuclides in air, surface water, ground water, soil, grass, bottom sediment, and milk; for a variety of chemical constituents in surface water, ground water, and Argonne effluent water; and of the environmental penetrating radiation dose. Sample collections and measurements were made on the site, at the site boundary, and off the Argonne site for comparison purposes. The potential radiation dose to off-site population groups is also estimated. The results of the program are interpreted in terms of the sources and origin of the radioactive and chemical substances (natural, fallout, Argonne, and other) and are compared with applicable environmental quality standards. 20 refs., 8 figs., 46 tabs

  12. Environmental monitoring at Argonne National Laboratory. Annual report for 1980

    International Nuclear Information System (INIS)

    The results of the environmental monitoring program at Argonne National Laboratory for 1980 are presented and discussed. To evaluate the effect of Argonne operations on the environment, measurements were made for a variety of radionuclides in air, surface water, soil, grass, bottom sediment, and foodstuffs; for a variety of chemical constituents in air, surface water, and Argonne effluent water; and of the environmental penetrating radiation dose. Sample collections and measurements were made at the site boundary and off the Argonne site for comparison purposes. Some on-site measurements were made to aid in the interpretation of the boundary and off-site data. The results of the program are interpreted in terms of the sources and origin of the radioactive and chemical substances (natural, fallout, Argonne, and other) and are compared with applicable environmental quality standards. The potential radiation dose to off-site population groups is also estimated

  13. Environmental monitoring at Argonne National Laboratory. Annual report for 1979

    International Nuclear Information System (INIS)

    The results of the environmental monitoring program at Argonne National Laboratory for 1979 are presented and discussed. To evaluate the effect of Argonne operations on the environment, measurements were made for a variety of radionuclides in air, surface water, Argonne effluent water, soil, grass, bottom sediment, and foodstuffs; for a variety of chemical constituents in air, surface water, and Argonne effluent water; and of the environmental penetrating radiation dose. Sample collections and measurements were made at the site boundary and off the Argonne site for comparison purposes. Some on-site measurements were made to aid in the interpretation of the boundary and off-site data. The results of the program are interpreted in terms of the sources and origin of the radioactive and chemical substances and are compared with applicable environmental quality standards. The potential radiation dose to off-site population groups is also estimated

  14. Liquid Metal Fast Breeder Reactor Program: Argonne facilities

    Energy Technology Data Exchange (ETDEWEB)

    Stephens, S. V. [comp.

    1976-09-01

    The objective of the document is to present in one volume an overview of the Argonne National Laboratory test facilities involved in the conduct of the national LMFBR research and development program. Existing facilities and those under construction or authorized as of September 1976 are described. Each profile presents brief descriptions of the overall facility and its test area and data relating to its experimental and testing capability. The volume is divided into two sections: Argonne-East and Argonne-West. Introductory material for each section includes site and facility maps. The profiles are arranged alphabetically by title according to their respective locations at Argonne-East or Argonne-West. A glossary of acronyms and letter designations in common usage to describe organizations, reactor and test facilities, components, etc., involved in the LMFBR program is appended.

  15. Steady-State Gyrokinetics Transport Code (SSGKT), A Scientific Application Partnership with the Framework Application for Core-Edge Transport Simulations, Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Fahey, Mark R. [Oak Ridge National Laboratory; Candy, Jeff [General Atomics

    2013-11-07

    This project initiated the development of TGYRO, a steady-state gyrokinetic transport code (SSGKT) that integrates micro-scale GYRO turbulence simulations into a framework for practical multi-scale simulation of conventional tokamaks as well as future reactors. Using a lightweight master transport code, multiple independent (each massively parallel) gyrokinetic simulations are coordinated. The capability to evolve profiles using the TGLF model was also added to TGYRO and represents a more typical use-case for TGYRO. The goal of the project was to develop a steady-state gyrokinetic transport code that integrates micro-scale gyrokinetic turbulence simulations into a framework for practical multi-scale simulation of a burning plasma core, the International Thermonuclear Experimental Reactor (ITER) in particular. This multi-scale simulation capability will be used to predict the performance (the fusion energy gain, Q) given the H-mode pedestal temperature and density. At present, projections of this type rely on transport models like GLF23, which are based on rather approximate fits to the results of linear and nonlinear simulations. Our goal is to make these performance projections with precise nonlinear gyrokinetic simulations. The method of approach is to use a lightweight master transport code to coordinate multiple independent (each massively parallel) gyrokinetic simulations using the GYRO code. This project targets the practical multi-scale simulation of a reactor core plasma in order to predict the core temperature and density profiles given the H-mode pedestal temperature and density. A master transport code will provide feedback to O(16) independent gyrokinetic simulations (each massively parallel). A successful feedback scheme offers a novel approach to predictive modeling of an important national and international problem. Success in this area of fusion simulations will allow US scientists to direct the research path of ITER over the next two

  16. A generalized adjoint framework for sensitivity and global error estimation in time-dependent nuclear reactor simulations

    International Nuclear Information System (INIS)

    Highlights:
    - We develop an abstract framework for computing the adjoint to the neutron/nuclide burnup equations posed as a system of differential algebraic equations.
    - We validate use of the adjoint for computing both sensitivity to uncertain inputs and for estimating global time discretization error.
    - Flexibility of the framework is leveraged to add heat transfer physics and compute its adjoint without a reformulation of the adjoint system.
    - Such flexibility is crucial for high performance computing applications.
    Abstract: We develop a general framework for computing the adjoint variable to nuclear engineering problems governed by a set of differential-algebraic equations (DAEs). The nuclear engineering community has a rich history of developing and applying adjoints for sensitivity calculations; many such formulations, however, are specific to a certain set of equations, variables, or solution techniques. Any change or addition to the physics model would require a reformulation of the adjoint problem and substantial difficulties in its software implementation. In this work we propose an abstract framework that allows for the modification and expansion of the governing equations, leverages the existing theory of adjoint formulation for DAEs, and results in adjoint equations that can be used to efficiently compute sensitivities for parametric uncertainty quantification. Moreover, as we justify theoretically and demonstrate numerically, the same framework can be used to estimate global time discretization error. We first motivate the framework and show that the coupled Bateman and transport equations, which govern the time-dependent neutronic behavior of a nuclear reactor, may be formulated as a DAE system with a power constraint. We then use a variational approach to develop the parameter-dependent adjoint framework and apply existing theory to give formulations for sensitivity and global time discretization error estimates using the adjoint
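
    As a one-equation illustration of the variational idea (far simpler than the coupled Bateman/transport DAE system), take dx/dt = a x with objective J = x(T): the adjoint solves λ' = -a λ backwards from λ(T) = 1, and the sensitivity is dJ/da = ∫₀ᵀ λ x dt, which can be checked against the exact value:

```python
# Adjoint sensitivity sketch for dx/dt = a*x, J = x(T).
# Forward solution x(t) = x0*exp(a*t); adjoint lambda(t) = exp(a*(T - t)).
# Then dJ/da = integral(lambda * x) dt = x0 * T * exp(a*T) exactly.
import numpy as np

a, x0, T, n = 0.7, 1.0, 2.0, 20000
t = np.linspace(0.0, T, n + 1)

x = x0 * np.exp(a * t)               # forward solution
lam = np.exp(a * (T - t))            # adjoint solution (backwards in time)

f = lam * x
dJ_da = (f[:-1] + f[1:]).sum() * (T / n) / 2.0   # trapezoid rule
exact = x0 * T * np.exp(a * T)                    # d/da of x0*exp(a*T)
print(f"adjoint estimate: {dJ_da:.6f}")
print(f"exact value:      {exact:.6f}")
```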

  17. Scenario Based Education as a Framework for Understanding Students Engagement and Learning in a Project Management Simulation Game

    DEFF Research Database (Denmark)

    Misfeldt, Morten

    2015-01-01

    In this paper I describe how students use a project management simulation game based on an attack-defense mechanism, where two teams of players compete by challenging each other's projects. The project management simulation game is intended to be played by pre-service construction workers ... opponent's building project for weak spots. The intention of the project management simulation game is to provide students with an increased sensitivity towards the relation between planning and reality in complex construction projects. The project management simulation game can be interpreted both ... as a competitive game and as a simulation. Both of these views are meaningful and can be seen as supporting learning. Emphasizing the simulation aspect lets us explain how students learn by being immersed in a simulated world, where the players identify with specific roles, live out specific situations

  18. The SOPHY framework

    DEFF Research Database (Denmark)

    Laursen, Karl Kaas; Pedersen, M. F.; Bendtsen, Jan Dimon;

    2005-01-01

    The goal of the Sophy framework (Simulation, Observation and Planning in Hybrid Systems) is to implement a multi-level framework for description, simulation, observation, fault detection and recovery, diagnosis and autonomous planning in distributed embedded hybrid systems. A Java-based distributed...

  19. The SOPHY Framework

    DEFF Research Database (Denmark)

    Laursen, Karl Kaas; Pedersen, Martin Fejrskov; Bendtsen, Jan Dimon;

    The goal of the Sophy framework (Simulation, Observation and Planning in Hybrid Systems) is to implement a multi-level framework for description, simulation, observation, fault detection and recovery, diagnosis and autonomous planning in distributed embedded hybrid systems. A Java-based distributed...

  1. Argonne Bubble Experiment Thermal Model Development II

    Energy Technology Data Exchange (ETDEWEB)

    Buechler, Cynthia Eileen [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-07-01

    This report describes the continuation of the work reported in “Argonne Bubble Experiment Thermal Model Development”. The experiment was performed at Argonne National Laboratory (ANL) in 2014. A rastered 35 MeV electron beam deposited power in a solution of uranyl sulfate, generating heat and radiolytic gas bubbles. Irradiations were performed at three beam power levels, 6, 12 and 15 kW. Solution temperatures were measured by thermocouples, and gas bubble behavior was observed. This report will describe the Computational Fluid Dynamics (CFD) model that was developed to calculate the temperatures and gas volume fractions in the solution vessel during the irradiations. The previous report described an initial analysis performed on a geometry that had not been updated to reflect the as-built solution vessel. Here, the as-built geometry is used. Monte-Carlo N-Particle (MCNP) calculations were performed on the updated geometry, and these results were used to define the power deposition profile for the CFD analyses, which were performed using Fluent, Ver. 16.2. CFD analyses were performed for the 12 and 15 kW irradiations, and further improvements to the model were incorporated, including the consideration of power deposition in nearby vessel components, gas mixture composition, and bubble size distribution. The temperature results of the CFD calculations are compared to experimental measurements.

  2. Little by little does the trick design and construction of a discrete event agent-based simulation framework

    OpenAIRE

    Matsopoulos, Alexandros

    2007-01-01

    Simulation is one of the most widely used techniques in operations research. In the military context, agent-based simulations have been extensively used by defense agencies worldwide. Despite the numerous disadvantages and limitations associated with timestepping, most of the combat-oriented agent-based simulation models are time-step implementations. The Discrete Event Scheduling (DES) paradigm, on the other hand, is free of these disadvantages and limitations. The scope of this thesis...

  3. Pricing Caps in the Heath, Jarrow and Morton Framework Using Monte Carlo Simulations in a Java Applet

    OpenAIRE

    Kalavrezos, Michail

    2007-01-01

    In this paper the Heath, Jarrow and Morton (HJM) framework is applied, in the Java programming language, to the estimation of the future spot rate. The subcase of an exponential model for the diffusion coefficient (volatility) is used for the pricing of interest-rate derivatives (caps).

  4. The Development of Dynamic Human Reliability Analysis Simulations for Inclusion in Risk Informed Safety Margin Characterization Frameworks

    Energy Technology Data Exchange (ETDEWEB)

    Jeffrey C. Joe; Diego Mandelli; Ronald L. Boring; Curtis L. Smith; Rachel B. Shirley

    2015-07-01

    The United States Department of Energy is sponsoring the Light Water Reactor Sustainability program, which has the overall objective of supporting the near-term and the extended operation of commercial nuclear power plants. One key research and development (R&D) area in this program is the Risk-Informed Safety Margin Characterization pathway, which combines probabilistic risk simulation with thermohydraulic simulation codes to define and manage safety margins. The R&D efforts to date, however, have not included robust simulations of human operators, and how the reliability of human performance or lack thereof (i.e., human errors) can affect risk margins and plant performance. This paper describes current and planned research efforts to address the absence of robust human reliability simulations and thereby increase the fidelity of simulated accident scenarios.

  5. Low-coverage adsorption properties of the metal-organic framework MIL-47 studied by pulse chromatography and Monte Carlo simulations.

    Science.gov (United States)

    Finsy, Vincent; Calero, Sofia; García-Pérez, Elena; Merkling, Patrick J; Vedts, Gill; De Vos, Dirk E; Baron, Gino V; Denayer, Joeri F M

    2009-05-14

    Low-coverage adsorption properties of the metal-organic framework MIL-47 were determined by a combined experimental and simulation study. Henry constants and low-coverage adsorption enthalpies of C5-C8 linear and branched alkanes, cyclohexane and benzene were measured from 120 to 240 degrees C using pulse gas chromatography. An adapted force field for linear and branched alkanes in MIL-47 was used to compute the adsorption properties of those molecules. A new set of charges was developed for simulations with benzene in MIL-47. The adsorption enthalpy of linear alkanes increases by about 7.6 kJ mol⁻¹ per additional -CH2- group. Henry adsorption constants of iso-alkanes are slightly lower than those of the linear chains, but the MIL-47 framework does not impose steric constraints on the branched chains. Benzene and cyclohexane are adsorbed less strongly than n-hexane as they have fewer hydrogen atoms. For the studied non-polar molecules, the adsorption energies are dominated by van der Waals interactions, and benzene adsorption is additionally influenced by Coulombic interactions. The simulated tendencies are in good agreement with the experiments. PMID:19421556
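
    The low-coverage enthalpies reported from pulse chromatography follow from a van't Hoff analysis: the slope of ln K against 1/T equals -ΔH/R. A sketch with synthetic Henry constants generated from an assumed ΔH of -30 kJ/mol (not the paper's data), showing the fit recovering the input value:

```python
# Van't Hoff extraction of an adsorption enthalpy from Henry constants K(T).
import numpy as np

R = 8.314                                    # J mol^-1 K^-1
T = np.array([393.0, 433.0, 473.0, 513.0])   # 120-240 C in kelvin
dH_true = -30e3                              # J/mol, assumed for the demo
K = 1e-6 * np.exp(-dH_true / (R * T))        # synthetic Henry constants

# ln K is linear in 1/T with slope -dH/R
slope, intercept = np.polyfit(1.0 / T, np.log(K), 1)
print(f"fitted adsorption enthalpy: {-(slope * R) / 1000:.1f} kJ/mol")
```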

  6. Environmental assessment related to the operation of Argonne National Laboratory, Argonne, Illinois

    Energy Technology Data Exchange (ETDEWEB)

    1982-08-01

    In order to evaluate the environmental impacts of Argonne National Laboratory (ANL) operations, this assessment includes a descriptive section which is intended to provide sufficient detail to allow the various impacts to be viewed in proper perspective. In particular, details are provided on site characteristics, current programs, characterization of the existing site environment, and in-place environmental monitoring programs. In addition, specific facilities and operations that could conceivably impact the environment are described at length. 77 refs., 16 figs., 47 tabs.

  7. Nuclear Accident Dosimetry at Argonne National Laboratory

    International Nuclear Information System (INIS)

    This report summarizes current planning at Argonne National Laboratory with respect to dose determination following a criticality incident. The discussion relates chiefly to two types of commercially obtained dosimeter packages, and includes the results of independent calibrations performed at the Laboratory. The primary dosimeter system incorporates threshold detectors developed at Oak Ridge National Laboratory for neutron spectrum measurement. Fission foil decay calibration curves have been determined experimentally for scintillation counting equipment routinely used at Argonne. This equipment also has been calibrated for determination of sodium-24 activity in blood. Dosimeter units of the type designed at Savannah River Laboratory are deployed as secondary stations. Data from the neutron activation components of these units will be used to make corrections to the neutron spectrum for intermediate as well as thermal energies. The epicadmium copper foil activation, for a given fluence of intermediate-energy neutrons, has been shown to be relatively insensitive to neutron spectrum variations within the region, and a meaningful average of the copper cross-section has been determined. Counter calibration factors determined at Argonne are presented for the copper, indium, and sulphur components. The total neutron fluence is computed using the corrected spectrum in conjunction with a capture probability function and the blood sodium result. One or more specifications of neutron dose may then be calculated by applying the spectral information to the appropriate conversion function. The gamma portion of the primary dosimeter package contains fluorescent rods and a thermoluminescent dosimeter in addition to a two-phase chemical dosimeter. The gamma dosimeter in the secondary package is a polyacrylamide solution which is degraded by exposure to gamma radiation. The absorbed dose is derived from a measured change in solution viscosity. Difficulties in evaluation, placement, and

  8. The PyZgoubi framework and the simulation of dynamic aperture in fixed-field alternating-gradient accelerators

    Energy Technology Data Exchange (ETDEWEB)

    Tygier, S., E-mail: sam.tygier@hep.manchester.ac.uk [Cockcroft Accelerator Group, The University of Manchester (United Kingdom); Appleby, R.B., E-mail: robert.appleby@manchester.ac.uk [Cockcroft Accelerator Group, The University of Manchester (United Kingdom); Garland, J.M. [Cockcroft Accelerator Group, The University of Manchester (United Kingdom); Hock, K. [University of Liverpool (United Kingdom); Owen, H. [Cockcroft Accelerator Group, The University of Manchester (United Kingdom); Kelliher, D.J.; Sheehy, S.L. [STFC Rutherford Appleton Laboratory (United Kingdom)

    2015-03-01

    We present PyZgoubi, a framework that has been developed based on the tracking engine Zgoubi to model, optimise and visualise the dynamics in particle accelerators, especially fixed-field alternating-gradient (FFAG) accelerators. We show that PyZgoubi abstracts Zgoubi by wrapping it in an easy-to-use Python framework in order to allow simple construction, parameterisation, visualisation and optimisation of FFAG accelerator lattices. Its object-oriented design gives it the flexibility and extensibility required for current novel FFAG design. We apply PyZgoubi to two example FFAGs; this includes determining the dynamic aperture of the PAMELA medical FFAG in the presence of magnet misalignments, and illustrating how PyZgoubi may be used to optimise FFAGs. We also discuss a robust definition of dynamic aperture in an FFAG and show its implementation in PyZgoubi.
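
    The core idea, wrapping a Fortran tracking engine in a scriptable object model, can be illustrated with a generic sketch. The class and method names below (Element, Lattice, Lattice.run) are hypothetical illustrations of the wrapping pattern, not PyZgoubi's actual API:

```python
import os
import subprocess
import tempfile

class Element:
    """Hypothetical lattice element that serializes itself into the
    input-deck format expected by an external tracking engine."""
    def __init__(self, name, **params):
        self.name, self.params = name, params

    def to_input(self):
        pairs = " ".join(f"{k}={v}" for k, v in self.params.items())
        return f"{self.name} {pairs}"

class Lattice:
    """Hypothetical container: builds an input deck, invokes the engine,
    and returns its output, the pattern PyZgoubi applies to Zgoubi."""
    def __init__(self, elements):
        self.elements = list(elements)

    def run(self, engine="echo"):  # 'echo' stands in for the real tracker binary
        deck = "\n".join(e.to_input() for e in self.elements)
        with tempfile.TemporaryDirectory() as tmp:
            deck_file = os.path.join(tmp, "input.dat")
            with open(deck_file, "w") as f:
                f.write(deck)
            # A real wrapper would run the tracking engine here and parse
            # its output files back into Python objects for optimisation.
            result = subprocess.run([engine, deck_file],
                                    capture_output=True, text=True)
        return result.stdout

# Example: a FODO-like toy lattice, parameterized entirely from Python.
lattice = Lattice([
    Element("QUAD", length=0.5, gradient=+1.2),
    Element("DRIFT", length=1.0),
    Element("QUAD", length=0.5, gradient=-1.2),
])
print(lattice.run())
```

    The benefit of this pattern is that lattice parameters become ordinary Python objects, so scans and optimisation loops can drive the underlying tracker without hand-editing input decks.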

  9. Scenario Based Education as a Framework for Understanding Students Engagement and Learning in a Project Management Simulation Game

    DEFF Research Database (Denmark)

    Misfeldt, Morten

    2015-01-01

    as a competitive game and as a simulation. Both of these views are meaningful and can be seen as supporting learning. Emphasizing the simulation aspect lets us explain how students learn by being immersed into a simulated world, where the players identify with specific roles, live out specific situations ... in construction work. The goal of the paper is to investigate empirically how these two understandings influence game experience and learning outcome. This question is approached by qualitative post-game interviews about the experienced fun, competition and realism. Specific attention is given to how...

  10. The Design for Tractable Analysis (DTA) Framework: A Methodology for the Analysis and Simulation of Complex Systems

    OpenAIRE

    John M. Linebarger; Mark J. De Spain; Michael J. McDonald; Floyd W. Spencer; Robert J. Cloutier

    2009-01-01

    The Design for Tractable Analysis (DTA) framework was developed to address the analysis of complex systems and so-called "wicked problems." DTA is distinctive because it treats analytic processes as key artifacts that can be created and improved through formal design processes. Systems (or enterprises) are analyzed as a whole, in conjunction with decomposing them into constituent elements for domain-specific analyses that are informed by the whole. After using the Systems Modeling Language...

  11. A locally p-adaptive approach for Large Eddy Simulation of compressible flows in a DG framework

    CERN Document Server

    Tugnoli, Matteo; Bonaventura, Luca; Restelli, Marco

    2016-01-01

    We investigate the possibility of reducing the computational burden of LES models by employing local polynomial degree adaptivity in the framework of a high order DG method. A novel degree adaptation technique especially featured to be effective for LES applications is proposed and its effectiveness is compared to that of other criteria already employed in the literature. The resulting locally adaptive approach achieves significant reductions in the computational cost of representative LES computations.
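
    A common building block of such degree-adaptation strategies is a modal smoothness indicator. The sketch below is a generic illustration of the idea, not the specific criterion proposed in the paper: elements whose highest-order modes carry negligible energy are coarsened, and under-resolved elements are refined.

```python
import numpy as np

def modal_energy_indicator(coeffs):
    """Fraction of spectral energy in the highest-degree mode of one element.
    `coeffs` holds the modal (e.g. Legendre) coefficients of the local solution."""
    energy = coeffs ** 2
    return energy[-1] / max(energy.sum(), 1e-300)

def adapt_degree(coeffs, p, p_min=1, p_max=6, refine_tol=1e-3, coarsen_tol=1e-6):
    """Raise p where the top mode is energetic; lower it where it is negligible."""
    eta = modal_energy_indicator(coeffs)
    if eta > refine_tol and p < p_max:
        return p + 1
    if eta < coarsen_tol and p > p_min:
        return p - 1
    return p

# Toy usage: a smooth element (fast coefficient decay) gets coarsened,
# while an under-resolved one (slow decay) gets refined.
smooth = np.array([1.0, 0.3, 0.05, 1e-5])
rough = np.array([1.0, 0.8, 0.6, 0.5])
print(adapt_degree(smooth, p=3), adapt_degree(rough, p=3))  # -> 2 4
```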

  12. Hydrogeological framework, numerical simulation of groundwater flow, and effects of projected water use and drought for the Beaver-North Canadian River alluvial aquifer, northwestern Oklahoma

    Science.gov (United States)

    Ryter, Derek W.; Correll, Jessica S.

    2016-01-14

    This report describes a study of the hydrology, hydrogeological framework, numerical groundwater-flow models, and results of simulations of the effects of water use and drought for the Beaver-North Canadian River alluvial aquifer, northwestern Oklahoma. The purpose of the study was to provide analyses, including estimating equal-proportionate-share (EPS) groundwater-pumping rates and the effects of projected water use and droughts, pertinent to water management of the Beaver-North Canadian River alluvial aquifer for the Oklahoma Water Resources Board.

  13. An overset curvilinear/immersed boundary framework for high resolution simulations of wind and hydrokinetic turbine flows

    Science.gov (United States)

    Borazjani, Iman; Behara, Suresh; Natarajan, Ganesh; Sotiropoulos, Fotis

    2009-11-01

    We generalize the curvilinear/immersed boundary method to incorporate overset grids, enabling the simulation of more complicated geometries and locally increased grid resolution near complex immersed boundaries. The new method has been applied to carry out high resolution simulations of wind and hydrokinetic turbine rotors. An interior fine mesh contains the rotor blades and is embedded within a coarser background mesh. The rotor blades can be treated either as immersed boundaries or using curvilinear, boundary-conforming overset grids. The numerical methodology has been generalized to include both inertial and non-inertial frame formulations. The method is validated by applying it to simulate the flow for the NREL wind turbine rotor for various turbine operating points. Inviscid, unsteady RANS and LES simulations are carried out and compared with experimental data. Preliminary results will also be presented for the hydrokinetic turbine rotor installed at the Roosevelt Island Tidal Energy project in New York City.

  14. A new numerical framework to simulate viscoelastic free-surface flows with the finite-volume method

    OpenAIRE

    Comminal, Raphaël; Spangenberg, Jon; Hattel, Jesper Henri

    2015-01-01

    A new method for the simulation of 2D viscoelastic flow is presented. Numerical stability is obtained by the logarithmic-conformation change of variable, and a fully-implicit pure-streamfunction flow formulation, without use of any artificial diffusion. As opposed to other simulation results, our calculations predict a hydrodynamic instability in the 4:1 contraction geometry at a Weissenberg number of order 4. This new result is in qualitative agreement with the prediction of a non-linear subcritical elastic instability in Poiseuille flow.

  15. Argonne National Laboratory research offers clues to Alzheimer's plaques

    CERN Multimedia

    2003-01-01

    Researchers from Argonne National Laboratory and the University of Chicago have developed methods to directly observe the structure and growth of microscopic filaments that form the characteristic plaques found in the brains of those with Alzheimer's Disease (1 page).

  16. Drive linac for the Argonne Wakefield Accelerator

    Energy Technology Data Exchange (ETDEWEB)

    Chojnacki, E.; Konecny, R.; Rosing, M.; Simpson, J.

    1993-08-01

    The drive linac in Phase I of the Argonne Wakefield Accelerator (AWA) will be used to accelerate short duration (10 ps), high charge (100 nC) electron bunches from 2 MeV to 20 MeV for use in a variety of wakefield acceleration and measurement studies. The high charge is required since this drive bunch will generate the wakefields of interest in various test sections and their amplitudes are proportional to bunch charge. The short bunch duration is required to drive high-frequency wakefields without intra-bunch cancellation effects. The drive linac design was a balance between having a small wake function to maintain a drive bunch energy spread of ≤10% and obtaining an adequate accelerating gradient of ≥10 MV/m. This yielded a large aperture, low shunt impedance, high group velocity, L-band, standing-wave linac. Details of the design, fabrication, and testing are presented in the following.

  17. Development and validation of a modelling framework for simulating 2D-mammography and breast tomosynthesis images

    International Nuclear Information System (INIS)

    Planar 2D x-ray mammography is generally accepted as the preferred screening technique used for breast cancer detection. Recently, digital breast tomosynthesis (DBT) has been introduced to overcome some of the inherent limitations of conventional planar imaging, and future technological enhancements are expected to result in the introduction of further innovative modalities. However, it is crucial to understand the impact of any new imaging technology or methodology on cancer detection rates and patient recall. Any such assessment conventionally requires large scale clinical trials demanding significant investment in time and resources. The concept of virtual clinical trials and virtual performance assessment may offer a viable alternative to this approach. However, virtual approaches require a collection of specialized modelling tools which can be used to emulate the image acquisition process and simulate images of a quality indistinguishable from their real clinical counterparts. In this paper, we present two image simulation chains constructed using modelling tools that can be used for the evaluation of 2D-mammography and DBT systems. We validate both approaches by comparing simulated images with real images acquired using the system being simulated. A comparison of the contrast-to-noise ratios and image blurring for real and simulated images of test objects shows good agreement (<9% error). This suggests that our simulation approach is a promising alternative to conventional physical performance assessment followed by large scale clinical trials. (paper)
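
    The contrast-to-noise comparison used in the validation follows the textbook CNR definition. The sketch below is a minimal illustration; the ROI conventions and the synthetic test image are assumptions for the example, not details taken from the paper:

```python
import numpy as np

def cnr(image, signal_mask, background_mask):
    """Contrast-to-noise ratio: |mean(signal) - mean(background)| / std(background)."""
    signal = image[signal_mask]
    background = image[background_mask]
    return abs(signal.mean() - background.mean()) / background.std()

def relative_error(cnr_simulated, cnr_real):
    """Relative CNR discrepancy, the kind of figure behind the <9% agreement."""
    return abs(cnr_simulated - cnr_real) / cnr_real

# Toy example: a synthetic disc insert on a noisy background.
rng = np.random.default_rng(0)
img = rng.normal(100.0, 5.0, (64, 64))
yy, xx = np.mgrid[:64, :64]
disc = (yy - 32) ** 2 + (xx - 32) ** 2 < 8 ** 2
img[disc] += 20.0
print(f"CNR = {cnr(img, disc, ~disc):.2f}")
```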

  18. The RD53 Collaboration's SystemVerilog-UVM Simulation Framework and its General Applicability to Design of Advanced Pixel Readout Chips

    CERN Document Server

    Marconi, S; Placidi, Pisana; Christiansen, Jorgen; Hemperek, Tomasz

    2014-01-01

    The foreseen Phase 2 pixel upgrades at the LHC have very challenging requirements for the design of hybrid pixel readout chips. A versatile pixel simulation platform is an essential development tool for the design, verification and optimization of both the system architecture and the pixel chip building blocks (Intellectual Properties, IPs). This work is focused on the implemented simulation and verification environment named VEPIX53, built using the SystemVerilog language and the Universal Verification Methodology (UVM) class library in the framework of the RD53 Collaboration. The environment supports pixel chips at different levels of description: its reusable components feature the generation of different classes of parameterized input hits to the pixel matrix, monitoring of pixel chip inputs and outputs, conformity checks between predicted and actual outputs and collection of statistics on system performance. The environment has been tested performing a study of shared architectures of the trigger late...

  19. SmartCell, a framework to simulate cellular processes that combines stochastic approximation with diffusion and localisation: analysis of simple networks.

    Science.gov (United States)

    Ander, M; Beltrao, P; Di Ventura, B; Ferkinghoff-Borg, J; Foglierini, M; Kaplan, A; Lemerle, C; Tomás-Oliveira, I; Serrano, L

    2004-06-01

    SmartCell has been developed as a general framework for modelling and simulation of diffusion-reaction networks in a whole-cell context. It supports localisation and diffusion by using a mesoscopic stochastic reaction model. The SmartCell package can handle any cell geometry, considers different cell compartments, allows localisation of species, supports DNA transcription and translation, membrane diffusion and multistep reactions, as well as cell growth. Moreover, different temporal and spatial constraints can be applied to the model. A GUI that facilitates model building is also available. In this work we discuss limitations and advantages arising from the approach used in SmartCell and determine the impact of localisation on the behaviour of simple well-defined networks, previously analysed with differential equations. Our results show that this factor might play an important role in the response of networks and cannot be neglected in cell simulations.
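
    At its core, the mesoscopic stochastic reaction model that tools of this kind build on is kinetic Monte Carlo of the Gillespie type. Below is a minimal, non-spatial sketch for a single reversible reaction A + B <-> C; SmartCell itself layers compartments, localisation and diffusion on top of this basic scheme:

```python
import math
import random

def gillespie(x, rates, stoich, t_end):
    """Stochastic simulation algorithm (Gillespie, 1977) for a well-mixed volume.
    x: species counts, rates(x): list of propensities, stoich: state changes."""
    t, trajectory = 0.0, [(0.0, tuple(x))]
    while t < t_end:
        a = rates(x)
        a0 = sum(a)
        if a0 == 0.0:
            break
        t += -math.log(random.random()) / a0        # time to the next event
        r, pick = random.random() * a0, 0
        while r > a[pick]:                          # choose which reaction fires
            r -= a[pick]
            pick += 1
        for species, change in stoich[pick]:
            x[species] += change
        trajectory.append((t, tuple(x)))
    return trajectory

# A + B -> C (rate k1) and C -> A + B (rate k2); species indices 0, 1, 2.
k1, k2 = 0.001, 0.1
propensities = lambda x: [k1 * x[0] * x[1], k2 * x[2]]
stoichiometry = [[(0, -1), (1, -1), (2, +1)], [(0, +1), (1, +1), (2, -1)]]
traj = gillespie([300, 200, 0], propensities, stoichiometry, t_end=50.0)
print(traj[-1])
```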

  20. Upgrading the Benchmark Simulation Model Framework with emerging challenges - A study of N2O emissions and the fate of pharmaceuticals in urban wastewater systems

    DEFF Research Database (Denmark)

    Snip, Laura

    Nowadays a wastewater treatment plant (WWTP) is not only expected to remove traditional pollutants from the wastewater; other emerging challenges have arisen as well. A WWTP is now, among other things, expected to also minimise its carbon footprint and deal with micropollutants. Optimising the performance of a WWTP can be done with mathematical models that can be used in simulation studies. The Benchmark Simulation Model (BSM) framework was developed to compare objectively different operational/control strategies. As different operational strategies of a WWTP will most likely have an effect ... for an extension of the BSM. Various challenges were encountered regarding the mathematical structure and the parameter values when expanding the BSM. The N2O models produced different results due to the assumptions on which they are based. In addition, pH and inorganic carbon concentrations have been demonstrated ...

  1. Generation of annular, high-charge electron beams at the Argonne wakefield accelerator

    Science.gov (United States)

    Wisniewski, E. E.; Li, C.; Gai, W.; Power, J.

    2013-01-01

    We present and discuss the results from the experimental generation of high-charge annular (ring-shaped) electron beams at the Argonne Wakefield Accelerator (AWA). These beams were produced by using laser masks to project annular laser profiles of various inner and outer diameters onto the photocathode of an RF gun. The ring beam is accelerated to 15 MeV and then imaged by means of solenoid lenses. Transverse profiles are compared for different solenoid settings. The discussion includes a comparison with Parmela simulations, some applications of high-charge ring beams, and an outline of a planned extension of this study.

  2. Argonne National Lab gets Linux network teraflop cluster

    CERN Document Server

    2003-01-01

    "Linux NetworX, Salt Lake City, Utah, has delivered an Evolocity II (E2) Linux cluster to Argonne National Laboratory that is capable of performing more than one trillion calculations per second (1 teraFLOP). The cluster, named "Jazz" by Argonne, is designed to provide optimum performance for multiple disciplines such as chemistry, physics and reactor engineering and will be used by the entire scientific community at the Lab" (1 page).

  3. Argonne National Laboratory's photooxidation organic mixed-waste treatment system

    International Nuclear Information System (INIS)

    This paper describes the installation and startup testing of the Argonne National Laboratory-East (ANL-E) photo-oxidation organic mixed-waste treatment system. This system will treat organic mixed (i.e., radioactive and hazardous) waste by oxidizing the organics to carbon dioxide and inorganic salts in an aqueous media. The residue will be treated in the existing radwaste evaporators. The system is installed in the waste management facility at the ANL-E site in Argonne, Illinois

  4. Platform Framework of Complex Hybrid System Simulation Based on Colored Petri Nets

    Institute of Scientific and Technical Information of China (English)

    方哲梅; 王明哲; 杨翠蓉

    2011-01-01

    This paper proposes a cross-platform framework for hybrid-system simulation built on the interaction between colored Petri nets (CPN) and Matlab, with the Petri net acting as the simulation process controller. By applying and extending the concept of substitution transitions, together with fusion places and folding, the framework supports complex logical modelling of hybrid systems and the embedding of continuous processes. In addition, the analytical capabilities of colored Petri nets ease, to some extent, the difficulty of verifying hybrid systems with complex logical behaviour. Finally, the modelling, simulation and analysis of a simple hybrid-system example demonstrate the feasibility of the platform and the effectiveness of the logic verification, providing a new approach to the modelling and simulation of large, complex hybrid systems.
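
    The coupling pattern described here, a discrete Petri net supervising an embedded continuous process, can be sketched in a few lines of generic Python. This is a toy illustration of the pattern, not the CPN/Matlab implementation itself: transitions fire when their places hold tokens and their guards on the continuous state are satisfied, and between firings the continuous state advances by numerical integration.

```python
def step_ode(y, dydt, dt):
    """Advance the continuous state by one explicit-Euler step."""
    return y + dt * dydt(y)

def run_hybrid(marking, transitions, y0, dydt, dt=0.01, t_end=5.0):
    """Alternate discrete transition firings with continuous integration.
    marking: dict place -> tokens; transitions: (inputs, outputs, guard(y))."""
    y, t, log = y0, 0.0, []
    while t < t_end:
        for inputs, outputs, guard in transitions:
            enabled = all(marking[p] > 0 for p in inputs) and guard(y)
            if enabled:                        # fire: move tokens, record event
                for p in inputs:
                    marking[p] -= 1
                for p in outputs:
                    marking[p] += 1
                log.append((t, dict(marking)))
        y = step_ode(y, dydt, dt)              # continuous dynamics between events
        t += dt
    return y, log

# Toy tank: the level rises while 'valve_open' holds a token; a transition
# closes the valve once the level exceeds a threshold.
marking = {"valve_open": 1, "valve_closed": 0}
transitions = [(["valve_open"], ["valve_closed"], lambda y: y > 1.0)]
dydt = lambda y: 0.5 if marking["valve_open"] else -0.2
level, events = run_hybrid(marking, transitions, y0=0.0, dydt=dydt)
print(level, events)
```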

  5. Investigating H 2 Sorption in a Fluorinated Metal–Organic Framework with Small Pores Through Molecular Simulation and Inelastic Neutron Scattering

    KAUST Repository

    Forrest, Katherine A.

    2015-07-07

    © 2015 American Chemical Society. Simulations of H2 sorption were performed in a metal-organic framework (MOF) consisting of Zn2+ ions coordinated to 1,2,4-triazole and tetrafluoroterephthalate ligands (denoted [Zn(trz)(tftph)] in this work). The simulated H2 sorption isotherms reported in this work are consistent with the experimental data for the state points considered. The experimental H2 isosteric heat of adsorption (Qst) values for this MOF are approximately 8.0 kJ mol-1 for the considered loading range, which is in the proximity of those determined from simulation. The experimental inelastic neutron scattering (INS) spectra for H2 in [Zn(trz)(tftph)] reveal at least two peaks that occur at low energies, which corresponds to high barriers to rotation for the respective sites. The most favorable sorption site in the MOF was identified from the simulations as sorption in the vicinity of a metal-coordinated H2O molecule, an exposed fluorine atom, and a carboxylate oxygen atom in a confined region in the framework. Secondary sorption was observed between the fluorine atoms of adjacent tetrafluoroterephthalate ligands. The H2 molecule at the primary sorption site in [Zn(trz)(tftph)] exhibits a rotational barrier that exceeds that for most neutral MOFs with open-metal sites according to an empirical phenomenological model, and this was further validated by calculating the rotational potential energy surface for H2 at this site.
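
    For reference (a standard definition, not specific to this work), the isosteric heat quoted above is obtained from isotherms at neighbouring temperatures through the Clausius-Clapeyron relation at fixed loading n:

```latex
% Isosteric heat of adsorption from isotherms at fixed loading n:
Q_{st} = R T^{2} \left( \frac{\partial \ln P}{\partial T} \right)_{n}
       \approx -R \, \frac{\ln(P_2/P_1)}{1/T_2 - 1/T_1}
```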

  6. Argonne National Laboratory site environmental report for calendar year 2006.

    Energy Technology Data Exchange (ETDEWEB)

    Golchert, N. W.; ESH/QA Oversight

    2007-09-13

    This report discusses the status and the accomplishments of the environmental protection program at Argonne National Laboratory for calendar year 2006. The status of Argonne environmental protection activities with respect to compliance with the various laws and regulations is discussed, along with the progress of environmental corrective actions and restoration projects. To evaluate the effects of Argonne operations on the environment, samples of environmental media collected on the site, at the site boundary, and off the Argonne site were analyzed and compared with applicable guidelines and standards. A variety of radionuclides were measured in air, surface water, on-site groundwater, and bottom sediment samples. In addition, chemical constituents in surface water, groundwater, and Argonne effluent water were analyzed. External penetrating radiation doses were measured, and the potential for radiation exposure to off-site population groups was estimated. Results are interpreted in terms of the origin of the radioactive and chemical substances (i.e., natural, fallout, Argonne, and other) and are compared with applicable environmental quality standards. A U.S. Department of Energy dose calculation methodology, based on International Commission on Radiological Protection recommendations and the U.S. Environmental Protection Agency's CAP-88 Version 3 (Clean Air Act Assessment Package-1988) computer code, was used in preparing this report.

  7. Argonne National Laboratory Site Environmental Report for Calendar Year 2013

    Energy Technology Data Exchange (ETDEWEB)

    Davis, T. M. [Argonne National Lab. (ANL), Argonne, IL (United States); Gomez, J. L. [Argonne National Lab. (ANL), Argonne, IL (United States); Moos, L. P. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2014-09-02

    This report discusses the status and the accomplishments of the environmental protection program at Argonne National Laboratory for calendar year 2013. The status of Argonne environmental protection activities with respect to compliance with the various laws and regulations is discussed, along with environmental management, sustainability efforts, environmental corrective actions, and habitat restoration. To evaluate the effects of Argonne operations on the environment, samples of environmental media collected on the site, at the site boundary, and off the Argonne site were analyzed and compared with applicable guidelines and standards. A variety of radionuclides were measured in air, surface water, on-site groundwater, and bottom sediment samples. In addition, chemical constituents in surface water, groundwater, and Argonne effluent water were analyzed. External penetrating radiation doses were measured, and the potential for radiation exposure to off-site population groups was estimated. Results are interpreted in terms of the origin of the radioactive and chemical substances (i.e., natural, Argonne, and other) and are compared with applicable standards intended to protect human health and the environment. A U.S. Department of Energy (DOE) dose calculation methodology, based on International Commission on Radiological Protection (ICRP) recommendations and the U.S. Environmental Protection Agency’s (EPA) CAP-88 Version 3 computer code, was used in preparing this report.

  8. 1986 annual site environmental report for Argonne National Laboratory

    International Nuclear Information System (INIS)

    The results of the environmental monitoring program at Argonne National Laboratory (ANL) for 1986 are presented and discussed. To evaluate the effect of Argonne operations on the environment, measurements were made for a variety of radionuclides in air, surface water, ground water, soil, grass, bottom sediment, and milk; of the environmental penetrating radiation dose; and for a variety of chemical constituents in surface water, ground water, and Argonne effluent water. Sample collections and measurements were made on the site, at the site boundary, and off the Argonne site for comparison purposes. The results of the program are interpreted in terms of the sources and origin of the radioactive and chemical substances (natural, fallout, Argonne, and other) and are compared with applicable environmental quality standards. A US Department of Energy (DOE) dose calculation methodology based on recent International Commission on Radiological Protection (ICRP) recommendations is required and used in this report. The radiation dose to off-site population groups is estimated. The average concentrations and total amounts of radioactive and chemical pollutants released by Argonne to the environment were all below appropriate standards. 21 refs., 7 figs., 52 tabs

  9. Argonne National Laboratory Site Environmental report for calendar year 2009.

    Energy Technology Data Exchange (ETDEWEB)

    Golchert, N. W.; Davis, T. M.; Moos, L. P.

    2010-08-04

    This report discusses the status and the accomplishments of the environmental protection program at Argonne National Laboratory for calendar year 2009. The status of Argonne environmental protection activities with respect to compliance with the various laws and regulations is discussed, along with the progress of environmental corrective actions and restoration projects. To evaluate the effects of Argonne operations on the environment, samples of environmental media collected on the site, at the site boundary, and off the Argonne site were analyzed and compared with applicable guidelines and standards. A variety of radionuclides were measured in air, surface water, on-site groundwater, and bottom sediment samples. In addition, chemical constituents in surface water, groundwater, and Argonne effluent water were analyzed. External penetrating radiation doses were measured, and the potential for radiation exposure to off-site population groups was estimated. Results are interpreted in terms of the origin of the radioactive and chemical substances (i.e., natural, Argonne, and other) and are compared with applicable environmental quality standards. A U.S. Department of Energy (DOE) dose calculation methodology, based on International Commission on Radiological Protection recommendations and the U.S. Environmental Protection Agency's (EPA) CAP-88 Version 3 (Clean Air Act Assessment Package-1988) computer code, was used in preparing this report.

  10. Argonne National Laboratory site environmental report for calendar year 2007.

    Energy Technology Data Exchange (ETDEWEB)

    Golchert, N. W.; Davis, T. M.; Moos, L. P.; ESH/QA Oversight

    2008-09-09

    This report discusses the status and the accomplishments of the environmental protection program at Argonne National Laboratory for calendar year 2007. The status of Argonne environmental protection activities with respect to compliance with the various laws and regulations is discussed, along with the progress of environmental corrective actions and restoration projects. To evaluate the effects of Argonne operations on the environment, samples of environmental media collected on the site, at the site boundary, and off the Argonne site were analyzed and compared with applicable guidelines and standards. A variety of radionuclides were measured in air, surface water, on-site groundwater, and bottom sediment samples. In addition, chemical constituents in surface water, groundwater, and Argonne effluent water were analyzed. External penetrating radiation doses were measured, and the potential for radiation exposure to off-site population groups was estimated. Results are interpreted in terms of the origin of the radioactive and chemical substances (i.e., natural, fallout, Argonne, and other) and are compared with applicable environmental quality standards. A U.S. Department of Energy dose calculation methodology, based on International Commission on Radiological Protection recommendations and the U.S. Environmental Protection Agency's CAP-88 Version 3 (Clean Air Act Assessment Package-1988) computer code, was used in preparing this report.

  11. Argonne National Laboratory site environmental report for calendar year 2008.

    Energy Technology Data Exchange (ETDEWEB)

    Golchert, N. W.; Davis, T. M.; Moos, L. P.

    2009-09-02

    This report discusses the status and the accomplishments of the environmental protection program at Argonne National Laboratory for calendar year 2008. The status of Argonne environmental protection activities with respect to compliance with the various laws and regulations is discussed, along with the progress of environmental corrective actions and restoration projects. To evaluate the effects of Argonne operations on the environment, samples of environmental media collected on the site, at the site boundary, and off the Argonne site were analyzed and compared with applicable guidelines and standards. A variety of radionuclides were measured in air, surface water, on-site groundwater, and bottom sediment samples. In addition, chemical constituents in surface water, groundwater, and Argonne effluent water were analyzed. External penetrating radiation doses were measured, and the potential for radiation exposure to off-site population groups was estimated. Results are interpreted in terms of the origin of the radioactive and chemical substances (i.e., natural, fallout, Argonne, and other) and are compared with applicable environmental quality standards. A U.S. Department of Energy dose calculation methodology, based on International Commission on Radiological Protection recommendations and the U.S. Environmental Protection Agency's CAP-88 Version 3 (Clean Air Act Assessment Package-1988) computer code, was used in preparing this report.

  12. An analytical drilling force model and GPU-accelerated haptics-based simulation framework of the pilot drilling procedure for micro-implants surgery training.

    Science.gov (United States)

    Zheng, Fei; Lu, Wen Feng; Wong, Yoke San; Foong, Kelvin Weng Chiong

    2012-12-01

    The placement of micro-implants is a common but relatively new surgical procedure in clinical dentistry. This paper presents a haptics-based simulation framework for the pilot drilling of micro-implants surgery to train orthodontists to successfully perform this essential procedure by tactile sensation, without damaging tooth roots. A voxel-based approach was employed to model the inhomogeneous oral tissues. A preprocessing pipeline was designed to reduce imaging noise, smooth segmentation results and construct an anatomically correct oral model from patient-specific data. In order to provide physically based haptic feedback, an analytical drilling force model based on metal-cutting principles was developed and adapted to the voxel-based approach. To improve the real-time response, the parallel computing power of Graphics Processing Units is exploited through careful data-structure design, algorithm parallelization, and graphics-memory utilization. A prototype system has been developed based on the proposed framework. Preliminary results show that, by using this framework, proper drilling force can be rendered at different tissue layers with reduced cycle time, while the visual display has also been enhanced.
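
    As a rough guide to what an analytical, metal-cutting-based drilling force model looks like (a generic first-order form under stated assumptions, not the authors' exact formulation adapted to oral tissue), the thrust force scales with the specific cutting energy of the material and the undeformed chip section:

```latex
% First-order metal-cutting estimate of drilling thrust (generic form):
% u_s : specific cutting energy of the tissue layer (voxel-dependent)
% f   : feed per revolution,  d : drill diameter,  K : dimensionless factor
F_{\mathrm{thrust}} \;\approx\; K \, u_s \, \frac{f \, d}{2}
% In a voxel-based model, u_s is looked up per voxel, so the rendered
% force changes as the drill crosses inhomogeneous tissue layers.
```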

  13. pWeb: A High-Performance, Parallel-Computing Framework for Web-Browser-Based Medical Simulation.

    Science.gov (United States)

    Halic, Tansel; Ahn, Woojin; De, Suvranu

    2014-01-01

    This work presents pWeb, a new language and compiler for parallelization of client-side compute-intensive web applications such as surgical simulations. The recently introduced HTML5 standard has enabled creating unprecedented applications on the web. Low performance of the web browser, however, remains the bottleneck for computationally intensive applications, including visualization of complex scenes, real-time physical simulations and image processing, compared to native ones. The new proposed language is built upon web workers for multithreaded programming in HTML5. The language provides fundamental functionalities of parallel programming languages as well as the fork/join parallel model, which is not supported by web workers. The language compiler automatically generates an equivalent parallel script that complies with the HTML5 standard. A case study on realistic rendering for surgical simulations demonstrates enhanced performance with a compact set of instructions. PMID:24732497

  15. A Robust Metal-Organic Framework with An Octatopic Ligand for Gas Adsorption and Separation: A Combined Characterization by Experiments and Molecular Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Zhuang, Wenjuan; Yuan, Daqiang; Liu, Dahuan; Zhong, Chongli; Li, Jian-Rong; Zhou, Hong-Cai

    2012-01-10

    A newly designed octatopic carboxylate ligand, tetrakis[(3,5-dicarboxyphenyl)oxamethyl]methane (TDM⁸⁻), has been used to connect a dicopper paddlewheel building unit, affording a metal–organic framework (MOF), Cu₄(H₂O)₄(TDM)·xS (PCN-26·xS, S represents noncoordinated solvent molecules, PCN = porous coordination network), with a novel structure, high gas uptake, and interesting gas adsorption selectivity. PCN-26 contains two different types of cages, octahedral and cuboctahedral, which form a polyhedron-stacked three-dimensional framework with open channels in three orthogonal directions. Gas adsorption studies of N₂, Ar, and H₂ on activated PCN-26 at 77 K and 1 bar reveal a Langmuir surface area of 2545 m²/g, a Brunauer–Emmett–Teller (BET) surface area of 1854 m²/g, a total pore volume of 0.84 cm³/g, and a H₂ uptake capacity of 2.57 wt %. Additionally, PCN-26 exhibits a CO₂/N₂ selectivity of 49:1 and a CO₂/CH₄ selectivity of 8.4:1 at 273 K. To investigate the gas adsorption properties and the adsorption sites for CO₂ in activated PCN-26, theoretical simulations of the adsorption isotherms of CO₂, CH₄, and N₂ at different temperatures were carried out. The experimental results agree very well with those of the molecular simulations.
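
    The quoted selectivities follow the usual adsorption-selectivity definition (a standard convention, stated here for completeness); for an equimolar bulk mixture it reduces to the ratio of adsorbed amounts:

```latex
% Adsorption selectivity of component 1 over component 2:
% x_i : adsorbed-phase mole fraction,  y_i : bulk-phase mole fraction
S_{1/2} = \frac{x_1 / x_2}{y_1 / y_2}
```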

  16. A new numerical framework to simulate viscoelastic free-surface flows with the finite-volume method

    Science.gov (United States)

    Comminal, R.; Spangenberg, J.; Hattel, J. H.

    2015-04-01

    A new method for the simulation of 2D viscoelastic flow is presented. Numerical stability is obtained by the logarithmic-conformation change of variable, and a fully-implicit pure-streamfunction flow formulation, without use of any artificial diffusion. As opposed to other simulation results, our calculations predict a hydrodynamic instability in the 4:1 contraction geometry at a Weissenberg number of order 4. This new result is in qualitative agreement with the prediction of a non-linear subcritical elastic instability in Poiseuille flow. Our viscoelastic flow solver is coupled with a volume-of-fluid solver in order to predict free surfaces in extrusion.

  17. Study of the response and photon-counting resolution of silicon photomultipliers using a generic simulation framework

    CERN Document Server

    Eckert, P; Schultz-Coulon, H.C

    2012-01-01

    A generic simulation framework has been developed which enables detailed modelling of the SiPM response using basic SiPM parameters and geometry as an input. Depending on the specified SiPM properties, which can be determined from basic characterisation measurements, the simulation generates the signal charge and pulse shape for arbitrary incident light pulse distributions. The simulation has been validated over the whole dynamic range for a Hamamatsu S10362-11-100C MPPC and was used to study the effect of different noise sources, like optical cross-talk and after-pulsing, on the response curve and the photon-counting resolution.

  19. A new numerical framework to simulate viscoelastic free-surface flows with the finite-volume method

    DEFF Research Database (Denmark)

    Comminal, Raphaël; Spangenberg, Jon; Hattel, Jesper Henri

    2015-01-01

    A new method for the simulation of 2D viscoelastic flow is presented. Numerical stability is obtained by the logarithmic-conformation change of variable, and a fully-implicit pure-streamfunction flow formulation, without use of any artificial diffusion. As opposed to other simulation results, our calculations predict a hydrodynamic instability in the 4:1 contraction geometry at a Weissenberg number of order 4. This new result is in qualitative agreement with the prediction of a non-linear subcritical elastic instability in Poiseuille flow. Our viscoelastic flow solver is coupled with a volume-of-fluid solver in order to predict free surfaces in extrusion.

  20. Intermittent communications modeling and simulation for autonomous unmanned maritime vehicles using an integrated APM and FSMC framework

    Science.gov (United States)

    Coker, Ayodeji; Straatemeier, Logan; Rogers, Ted; Valdez, Pierre; Griendling, Kelly; Cooksey, Daniel

    2014-06-01

    In this work a framework is presented for addressing the issue of intermittent communications faced by autonomous unmanned maritime vehicles operating at sea. In particular, this work considers the subject of predictive atmospheric signal transmission over multi-path fading channels in maritime environments. A Finite State Markov Channel is used to represent a Nakagami-m modeled physical fading radio channel. The range of the received signal-to-noise ratio is partitioned into a finite number of intervals which represent application-specific communications states. The Advanced Propagation Model (APM), developed at the Space and Naval Warfare Systems Center San Diego, provides a characterization of the transmission channel in terms of evaporation-duct-induced signal propagation loss. APM uses a hybrid ray-optic and parabolic equation model which allows for the computation of electromagnetic (EM) wave propagation over various sea and/or terrain paths. These models, which have been integrated in the proposed framework, provide a strategic and mission planning aid for the operation of maritime unmanned vehicles at sea.
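
    The mapping from a Nakagami-m fading channel to a finite set of SNR states can be sketched as follows. This is a textbook equal-probability partition, relying on the standard fact that the instantaneous SNR under Nakagami-m fading is Gamma distributed; the paper's application-specific state boundaries will differ:

```python
import numpy as np
from scipy.stats import gamma

def fsmc_thresholds(m, mean_snr, n_states):
    """Equal-probability SNR thresholds for a Finite State Markov Channel.
    Under Nakagami-m fading, the instantaneous SNR ~ Gamma(m, mean_snr/m)."""
    q = np.linspace(0.0, 1.0, n_states + 1)
    return gamma.ppf(q, a=m, scale=mean_snr / m)

def snr_state(snr, thresholds):
    """Map an instantaneous SNR sample to its FSMC state index."""
    return int(np.searchsorted(thresholds, snr, side="right")) - 1

# Example: m = 2 fading, mean SNR of 10 (linear), 4 communication states.
thr = fsmc_thresholds(m=2.0, mean_snr=10.0, n_states=4)
print(np.round(thr, 2))        # state boundaries in linear SNR
print(snr_state(3.0, thr))     # which state a linear SNR of 3.0 falls in
```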

  1. Helios: a Multi-Purpose LIDAR Simulation Framework for Research, Planning and Training of Laser Scanning Operations with Airborne, Ground-Based Mobile and Stationary Platforms

    Science.gov (United States)

    Bechtold, S.; Höfle, B.

    2016-06-01

    In many technical domains of modern society, there is a growing demand for fast, precise and automatic acquisition of digital 3D models of a wide variety of physical objects and environments. Laser scanning is a popular and widely used technology to cover this demand, but it is also expensive and complex to use to its full potential. However, there might exist scenarios where the operation of a real laser scanner could be replaced by a computer simulation, in order to save time and costs. This includes scenarios like teaching and training of laser scanning, development of new scanner hardware and scanning methods, or generation of artificial scan data sets to support the development of point cloud processing and analysis algorithms. To test the feasibility of this idea, we have developed a highly flexible laser scanning simulation framework named Heidelberg LiDAR Operations Simulator (HELIOS). HELIOS is implemented as a Java library and split up into a core component and multiple extension modules. Extensible Markup Language (XML) is used to define scanner, platform and scene models and to configure the behaviour of modules. Modules were developed and implemented for (1) loading of simulation assets and configuration (i.e. 3D scene models, scanner definitions, survey descriptions etc.), (2) playback of XML survey descriptions, (3) TLS survey planning (i.e. automatic computation of recommended scanning positions) and (4) interactive real-time 3D visualization of simulated surveys. As a proof of concept, we show the results of two experiments: First, a survey planning test in a scene that was specifically created to evaluate the quality of the survey planning algorithm. Second, a simulated TLS scan of a crop field in a precision farming scenario. The results show that HELIOS fulfills its design goals.

  2. Simulating Star Clusters with the AMUSE Software Framework: I. Dependence of Cluster Lifetimes on Model Assumptions and Cluster Dissolution Modes

    CERN Document Server

    Whitehead, Alfred J; Vesperini, Enrico; Portegies Zwart, Simon

    2013-01-01

    We perform a series of simulations of evolving star clusters using AMUSE (the Astrophysical Multipurpose Software Environment), a new community-based multi-physics simulation package, and compare our results to existing work. These simulations model a star cluster beginning with a King model distribution and a selection of power-law initial mass functions, and contain a tidal cut-off. They are evolved using collisional stellar dynamics and include mass loss due to stellar evolution. After determining that the differences between AMUSE results and prior publications are understood, we explored the variation in cluster lifetimes due to the random realization noise introduced by transforming a King model to specific initial conditions. This random realization noise can affect the lifetime of a simulated star cluster by up to 30%. Two modes of star cluster dissolution were identified: a mass evolution curve that contains a run-away cluster dissolution with a sudden loss of mass, and a dissolution mode that does n...
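
    The random-realization scatter discussed above can be appreciated with a toy experiment. The snippet is purely illustrative: it uses a crude probabilistic mass-loss model rather than AMUSE or real N-body dynamics, and simply shows the methodology of drawing many realizations of identical initial conditions and comparing their dissolution times:

```python
import random
import statistics

def toy_cluster_lifetime(n_stars, seed, loss_rate=0.02, t_max=5000.0):
    """Crude stand-in for an N-body run: each 'relaxation step' removes
    stars with a small probability until 95% of the cluster is gone."""
    rng = random.Random(seed)
    n, t = n_stars, 0.0
    while n > 0.05 * n_stars and t < t_max:
        n -= sum(1 for _ in range(n) if rng.random() < loss_rate)
        t += 1.0
    return t

# Same model, same parameters; only the random realization differs.
lifetimes = [toy_cluster_lifetime(1000, seed) for seed in range(32)]
mean = statistics.mean(lifetimes)
spread = (max(lifetimes) - min(lifetimes)) / mean
print(f"mean lifetime = {mean:.0f} steps, realization spread = {spread:.1%}")
```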

  3. Argonne National Laboratory institutional plan FY 2001--FY 2006.

    Energy Technology Data Exchange (ETDEWEB)

    Beggs, S.D.

    2000-12-07

    This Institutional Plan describes what Argonne management regards as the optimal future development of Laboratory activities. The document outlines the development of both research programs and support operations in the context of the nation's R and D priorities, the missions of the Department of Energy (DOE) and Argonne, and expected resource constraints. The Draft Institutional Plan is the product of many discussions between DOE and Argonne program managers, and it also reflects programmatic priorities developed during Argonne's summer strategic planning process. That process serves additionally to identify new areas of strategic value to DOE and Argonne, to which Laboratory Directed Research and Development funds may be applied. The Draft Plan is provided to the Department before Argonne's On-Site Review. Issuance of the final Institutional Plan in the fall, after further comment and discussion, marks the culmination of the Laboratory's annual planning cycle. Chapter II of this Institutional Plan describes Argonne's missions and roles within the DOE laboratory system, its underlying core competencies in science and technology, and six broad planning objectives whose achievement is considered critical to the future of the Laboratory. Chapter III presents the Laboratory's "Science and Technology Strategic Plan," which summarizes key features of the external environment, presents Argonne's vision, and describes how Argonne's strategic goals and objectives support DOE's four business lines. The balance of Chapter III comprises strategic plans for 23 areas of science and technology at Argonne, grouped according to the four DOE business lines. The Laboratory's 14 major initiatives, presented in Chapter IV, propose important advances in key areas of fundamental science and technology development. The "Operations and Infrastructure Strategic Plan" in Chapter V includes

  4. Simulations

    CERN Document Server

    Ngada, N M

    2015-01-01

    The complexity and cost of building and running high-power electrical systems make the use of simulations unavoidable. The simulations available today provide great understanding about how systems really operate. This paper helps the reader to gain an insight into simulation in the field of power converters for particle accelerators. Starting with the definition and basic principles of simulation, two simulation types, as well as their leading tools, are presented: analog and numerical simulations. Some practical applications of each simulation type are also considered. The final conclusion then summarizes the main important items to keep in mind before opting for a simulation tool or before performing a simulation.

  5. Computational Analysis and Simulation of Empathic Behaviors: a Survey of Empathy Modeling with Behavioral Signal Processing Framework.

    Science.gov (United States)

    Xiao, Bo; Imel, Zac E; Georgiou, Panayiotis; Atkins, David C; Narayanan, Shrikanth S

    2016-05-01

    Empathy is an important psychological process that facilitates human communication and interaction. Enhancement of empathy has profound significance in a range of applications. In this paper, we review emerging directions of research on computational analysis of empathy expression and perception as well as empathic interactions, including their simulation. We summarize the work on empathic expression analysis by the targeted signal modalities (e.g., text, audio, and facial expressions). We categorize empathy simulation studies into theory-based emotion space modeling or application-driven user and context modeling. We summarize challenges in computational study of empathy including conceptual framing and understanding of empathy, data availability, appropriate use and validation of machine learning techniques, and behavior signal processing. Finally, we propose a unified view of empathy computation and offer a series of open problems for future research.

  6. Electron-cloud simulation studies for the CERN-PS in the framework of the LHC Injectors Upgrade project

    CERN Document Server

    Rioja Fuentelsaz, Sergio

    The present study aims to provide a consistent picture of the electron cloud effect in the CERN Proton Synchrotron (PS) and to investigate possible future limitations due to the requirements foreseen by the LHC Injectors Upgrade (LIU) project. It consists of a complete simulation survey of the electron cloud build-up in the different beam pipe sections of the ring depending on several controllable beam parameters and vacuum chamber surface properties, covering present and future operation parameters. As the combined-function magnets of the accelerator constitute almost 80% of the ring length, this study was made possible by the implementation in the PyECLOUD code of a new feature for simulating arbitrary external magnetic fields. All the results of the simulations are given as a function of the vacuum chamber surface properties in order to deduce them, both locally and globally, when compared with experimental data. In a first step, we characterize locally the maximum possible number of ...

  7. Numerical Simulation of Single Concrete Framework Implosion Demolition

    Institute of Scientific and Technical Information of China (English)

    李胜林; 王宇涛; 黄明升; 刘凯; 骆之悦

    2012-01-01

    To establish the basic laws governing implosion demolition of reinforced-concrete frame structures, a single ten-storey, five-span concrete frame was modelled with the MAT159 constitutive model in LS-DYNA, using the separated, constraint-based method to represent the reinforced concrete. After the model reached gravity equilibrium, numerical simulations were carried out for several designed implosion demolition sequences. The simulation results reveal the collapse behaviour of the different implosion schemes, and a principle for determining the delay times is proposed.

  8. Using E-Z Reader to simulate eye movements in nonreading tasks: a unified framework for understanding the eye-mind link.

    Science.gov (United States)

    Reichle, Erik D; Pollatsek, Alexander; Rayner, Keith

    2012-01-01

    Nonreading tasks that share some (but not all) of the task demands of reading have often been used to make inferences about how cognition influences when the eyes move during reading. In this article, we use variants of the E-Z Reader model of eye-movement control in reading to simulate eye-movement behavior in several of these tasks, including z-string reading, target-word search, and visual search of Landolt Cs arranged in both linear and circular arrays. These simulations demonstrate that a single computational framework is sufficient to simulate eye movements in both reading and nonreading tasks but also suggest that there are task-specific differences in both saccadic targeting (i.e., decisions about where to move the eyes) and the coupling between saccadic programming and the movement of attention (i.e., decisions about when to move the eyes). These findings suggest that some aspects of the eye-mind link are flexible and can be configured in a manner that supports efficient task performance.

  10. The RD53 collaboration's SystemVerilog-UVM simulation framework and its general applicability to design of advanced pixel readout chips

    International Nuclear Information System (INIS)

    The foreseen Phase 2 pixel upgrades at the LHC have very challenging requirements for the design of hybrid pixel readout chips. A versatile pixel simulation platform is an essential development tool for the design, verification and optimization of both the system architecture and the pixel chip building blocks (Intellectual Properties, IPs). This work is focused on the implemented simulation and verification environment named VEPIX53, built using the SystemVerilog language and the Universal Verification Methodology (UVM) class library in the framework of the RD53 Collaboration. The environment supports pixel chips at different levels of description: its reusable components feature the generation of different classes of parameterized input hits to the pixel matrix, monitoring of pixel chip inputs and outputs, conformity checks between predicted and actual outputs and collection of statistics on system performance. The environment has been tested performing a study of shared architectures of the trigger latency buffering section of pixel chips. A fully shared architecture and a distributed one have been described at behavioral level and simulated; the resulting memory occupancy statistics and hit loss rates have subsequently been compared

  11. I. Dissociation free energies of drug-receptor systems via non-equilibrium alchemical simulations: a theoretical framework.

    Science.gov (United States)

    Procacci, Piero

    2016-06-01

    In this contribution I critically revise the alchemical reversible approach in the context of the statistical mechanics theory of non-covalent bonding in drug-receptor systems. I show that most of the pitfalls and entanglements for the binding free energy evaluation in computer simulations are rooted in the equilibrium assumption that is implicit in the reversible method. These critical issues can be resolved by using a non-equilibrium variant of the alchemical method in molecular dynamics simulations, relying on the production of many independent trajectories with a continuous dynamical evolution of an externally driven alchemical coordinate, completing the decoupling of the ligand in a matter of a few tens of picoseconds rather than nanoseconds. The absolute binding free energy can be recovered from the annihilation work distributions by applying an unbiased unidirectional free energy estimate, on the assumption that any observed work distribution is given by a mixture of normal distributions, whose components are identical in either direction of the non-equilibrium process, with weights regulated by the Crooks theorem. I finally show that the inherent reliability and accuracy of the unidirectional estimate of the decoupling free energies, based on the production of a few hundreds of non-equilibrium independent sub-nanosecond unrestrained alchemical annihilation processes, is a direct consequence of the funnel-like shape of the free energy surface in molecular recognition. An application of the technique to a real drug-receptor system is presented in the companion paper.
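
    For orientation, the unidirectional estimate invoked here reduces, for a single Gaussian work component, to the standard second-order cumulant expression, with the forward and reverse work distributions tied together by the Crooks fluctuation theorem (standard results, stated here for completeness):

```latex
% Crooks fluctuation theorem relating forward/reverse work distributions:
\frac{P_F(W)}{P_R(-W)} = e^{\beta (W - \Delta G)}, \qquad \beta = (k_B T)^{-1}
% For a single Gaussian work distribution this yields the unbiased
% unidirectional estimate from the forward annihilation works alone:
\Delta G \simeq \langle W \rangle_F - \frac{\beta \, \sigma_W^2}{2}
```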

  12. Tiger team assessment of the Argonne Illinois site

    Energy Technology Data Exchange (ETDEWEB)

    1990-10-19

    This report documents the results of the Department of Energy's (DOE) Tiger Team Assessment of the Argonne Illinois Site (AIS) (including the DOE Chicago Operations Office, DOE Argonne Area Office, Argonne National Laboratory-East, and New Brunswick Laboratory) and Site A and Plot M, Argonne, Illinois, conducted from September 17 through October 19, 1990. The Tiger Team Assessment was conducted by a team of professionals from DOE, contractors, and consultants. The purpose of the assessment was to provide the Secretary of Energy with the status of Environment, Safety, and Health (ES&H) programs at AIS. Argonne National Laboratory-East (ANL-E) is the principal tenant at AIS. ANL-E is a multiprogram laboratory operated by the University of Chicago for DOE. The mission of ANL-E is to perform basic and applied research that supports the development of energy-related technologies. There are a significant number of ES&H findings and concerns identified in the report that require prompt management attention. A significant change in culture is required before ANL-E can attain consistent and verifiable compliance with statutes, regulations and DOE Orders. ES&H activities are informal, fragmented, and inconsistently implemented. Communication is seriously lacking, both vertically and horizontally. Management expectations are not known or communicated adequately, support is not consistent, and oversight is not effective.

  13. Tiger team assessment of the Argonne Illinois site

    International Nuclear Information System (INIS)

    This report documents the results of the Department of Energy's (DOE) Tiger Team Assessment of the Argonne Illinois Site (AIS) (including the DOE Chicago Operations Office, DOE Argonne Area Office, Argonne National Laboratory-East, and New Brunswick Laboratory) and Site A and Plot M, Argonne, Illinois, conducted from September 17 through October 19, 1990. The Tiger Team Assessment was conducted by a team of professionals from DOE, contractors, and consultants. The purpose of the assessment was to provide the Secretary of Energy with the status of Environment, Safety, and Health (ES&H) programs at AIS. Argonne National Laboratory-East (ANL-E) is the principal tenant at AIS. ANL-E is a multiprogram laboratory operated by the University of Chicago for DOE. The mission of ANL-E is to perform basic and applied research that supports the development of energy-related technologies. There are a significant number of ES&H findings and concerns identified in the report that require prompt management attention. A significant change in culture is required before ANL-E can attain consistent and verifiable compliance with statutes, regulations and DOE Orders. ES&H activities are informal, fragmented, and inconsistently implemented. Communication is seriously lacking, both vertically and horizontally. Management expectations are not known or communicated adequately, support is not consistent, and oversight is not effective.

  14. Highly porous ionic rht metal-organic framework for H2 and CO2 storage and separation: A molecular simulation study

    KAUST Repository

    Babarao, Ravichandar

    2010-07-06

    The storage and separation of H2 and CO2 are investigated in a highly porous ionic rht metal-organic framework (rht-MOF) using molecular simulation. The rht-MOF possesses a cationic framework and charge-balancing extraframework NO3- ions. Three types of unique open cages exist in the framework: rhombicuboctahedral, tetrahedral, and cuboctahedral cages. The NO3- ions exhibit low mobility and are located at the windows connecting the tetrahedral and cuboctahedral cages. At low pressures, H2 adsorption occurs near the NO3- ions, which act as preferential sites. With increasing pressure, H2 molecules occupy the tetrahedral and cuboctahedral cages and the intersection regions. The predicted isotherm of H2 at 77 K agrees well with the experimental data. The H2 capacity is estimated to be 2.4 wt % at 1 bar and 6.2 wt % at 50 bar, among the highest in reported MOFs. In a four-component mixture (15:75:5:5 CO2/H2/CO/CH4) representing a typical effluent gas of H2 production, the selectivity of CO2/H2 in rht-MOF decreases slightly with increasing pressure, then increases because of cooperative interactions, and finally decreases as a consequence of an entropy effect. By comparing three ionic MOFs (rht-MOF, soc-MOF, and rho-ZMOF), we find that the selectivity increases with increasing charge density or decreasing free volume. In the presence of a trace amount of H2O, the interactions between CO2 and NO3- ions are significantly shielded by H2O; consequently, the selectivity of CO2/H2 decreases substantially. © 2010 American Chemical Society.
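    For reference, the CO2/H2 selectivity reported in such mixture simulations is conventionally defined from the adsorbed-phase mole fractions $x_i$ and the bulk-phase mole fractions $y_i$ (a standard definition, not quoted from the paper):

        $$ S_{\mathrm{CO_2/H_2}} = \frac{x_{\mathrm{CO_2}}/x_{\mathrm{H_2}}}{y_{\mathrm{CO_2}}/y_{\mathrm{H_2}}} $$

    A value greater than 1 indicates that the framework enriches CO2 relative to the feed composition.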

  15. Research on Framework of Virtual Simulation System of Nuclear Facilities Decommissioning

    Institute of Scientific and Technical Information of China (English)

    刘中坤; 彭敏俊; 朱海山; 成守宇; 巩诚

    2011-01-01

    Since nuclear facilities have strong radioactivity, the process of dismantling them is both hazardous and procedurally complex. A virtual simulation system for nuclear facilities decommissioning, based on virtual reality technology, can provide an aiding tool for a decommissioning project. This paper analyzed domestic and foreign research results in related fields and the requirements of one specific decommissioning project, and then proposed the functional modules of the virtual simulation system. Furthermore, a systematic framework structure based on software development techniques was developed, and the technologies for implementing each module were analyzed and discussed. Drawing on the latest advances in computer hardware and software, the simulation of cutting, blasting, fluids, and radioactivity during decommissioning was discussed. The analysis shows that a comprehensive virtual simulation of the decommissioning process can be achieved even on a common personal computer, and expert review concluded that the framework is feasible.

  16. Injury Profile SIMulator, a qualitative aggregative modelling framework to predict crop injury profile as a function of cropping practices, and the abiotic and biotic environment. I. Conceptual bases.

    Science.gov (United States)

    Aubertot, Jean-Noël; Robin, Marie-Hélène

    2013-01-01

    The limitation of damage caused by pests (plant pathogens, weeds, and animal pests) in any agricultural crop requires integrated management strategies. Although significant efforts have been made to i) develop, and to a lesser extent ii) combine genetic, biological, cultural, physical and chemical control methods in Integrated Pest Management (IPM) strategies (vertical integration), there is a need for tools to help manage Injury Profiles (horizontal integration). Farmers design cropping systems according to their goals, knowledge, cognition and perception of socio-economic and technological drivers as well as their physical, biological, and chemical environment. In return, a given cropping system, in a given production situation will exhibit a unique injury profile, defined as a dynamic vector of the main injuries affecting the crop. This simple description of agroecosystems has been used to develop IPSIM (Injury Profile SIMulator), a modelling framework to predict injury profiles as a function of cropping practices, abiotic and biotic environment. Due to the tremendous complexity of agroecosystems, a simple holistic aggregative approach was chosen instead of attempting to couple detailed models. This paper describes the conceptual bases of IPSIM, an aggregative hierarchical framework and a method to help specify IPSIM for a given crop. A companion paper presents a proof of concept of the proposed approach for a single disease of a major crop (eyespot on wheat). In the future, IPSIM could be used as a tool to help design ex-ante IPM strategies at the field scale if coupled with a damage sub-model, and a multicriteria sub-model that assesses the social, environmental, and economic performances of simulated agroecosystems. In addition, IPSIM could also be used to help make diagnoses on commercial fields. It is important to point out that the presented concepts are not crop- or pest-specific and that IPSIM can be used on any crop.

  17. Injury Profile SIMulator, a qualitative aggregative modelling framework to predict crop injury profile as a function of cropping practices, and the abiotic and biotic environment. I. Conceptual bases.

    Directory of Open Access Journals (Sweden)

    Jean-Noël Aubertot

    Full Text Available The limitation of damage caused by pests (plant pathogens, weeds, and animal pests) in any agricultural crop requires integrated management strategies. Although significant efforts have been made to (i) develop, and to a lesser extent (ii) combine genetic, biological, cultural, physical and chemical control methods in Integrated Pest Management (IPM) strategies (vertical integration), there is a need for tools to help manage Injury Profiles (horizontal integration). Farmers design cropping systems according to their goals, knowledge, cognition and perception of socio-economic and technological drivers as well as their physical, biological, and chemical environment. In return, a given cropping system, in a given production situation, will exhibit a unique injury profile, defined as a dynamic vector of the main injuries affecting the crop. This simple description of agroecosystems has been used to develop IPSIM (Injury Profile SIMulator), a modelling framework to predict injury profiles as a function of cropping practices and the abiotic and biotic environment. Due to the tremendous complexity of agroecosystems, a simple holistic aggregative approach was chosen instead of attempting to couple detailed models. This paper describes the conceptual bases of IPSIM, an aggregative hierarchical framework and a method to help specify IPSIM for a given crop. A companion paper presents a proof of concept of the proposed approach for a single disease of a major crop (eyespot on wheat). In the future, IPSIM could be used as a tool to help design ex-ante IPM strategies at the field scale if coupled with a damage sub-model and a multicriteria sub-model that assesses the social, environmental, and economic performances of simulated agroecosystems. In addition, IPSIM could also be used to help make diagnoses on commercial fields. It is important to point out that the presented concepts are not crop- or pest-specific and that IPSIM can be used on any crop.

  18. A framework for incorporating DTI Atlas Builder registration into tract-based spatial statistics and a simulated comparison to standard TBSS

    Science.gov (United States)

    Leming, Matthew; Steiner, Rachel; Styner, Martin

    2016-03-01

    Tract-based spatial statistics (TBSS) is a software pipeline widely employed in comparative analysis of white matter integrity from diffusion tensor imaging (DTI) datasets. In this study, we seek to evaluate the relationship between different methods of atlas registration for use with TBSS and different measurements of DTI (fractional anisotropy, FA; axial diffusivity, AD; radial diffusivity, RD; and mean diffusivity, MD). To do so, we have developed a novel tool that builds on existing diffusion atlas building software, integrating it into an adapted version of TBSS called DAB-TBSS (DTI Atlas Builder-Tract-Based Spatial Statistics) by using the advanced registration offered in DTI Atlas Builder. To compare the effectiveness of these two versions of TBSS, we also propose a framework for simulating population differences in diffusion tensor imaging data, providing a more substantive means of empirically comparing DTI group analysis programs such as TBSS. In this study, we used 33 diffusion tensor imaging datasets and simulated group-wise changes in this data by increasing, in three different simulations, the principal eigenvalue (directly altering AD), the second and third eigenvalues (RD), and all three eigenvalues (MD) in the genu, the right uncinate fasciculus, and the left IFO. Additionally, we assessed the benefits of comparing the tensors directly using a functional analysis of diffusion tensor tract statistics (FADTTS). Our results indicate comparable levels of FA-based detection between DAB-TBSS and TBSS, with standard TBSS registration reporting a higher rate of false positives in other measurements of DTI. Within the simulated changes investigated here, this study suggests that the use of DTI Atlas Builder's registration enhances TBSS group-based studies.
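    For orientation, the four scalar measures compared in the study all derive from the diffusion tensor eigenvalues; a minimal sketch under the usual definitions (the function name is illustrative, not part of TBSS or DTI Atlas Builder):

        import numpy as np

        def dti_scalars(l1, l2, l3):
            """AD, RD, MD, FA from sorted eigenvalues l1 >= l2 >= l3."""
            ad = l1                    # axial diffusivity
            rd = 0.5 * (l2 + l3)       # radial diffusivity
            md = (l1 + l2 + l3) / 3.0  # mean diffusivity
            fa = np.sqrt(1.5 * ((l1 - md)**2 + (l2 - md)**2 + (l3 - md)**2)
                         / (l1**2 + l2**2 + l3**2))
            return ad, rd, md, fa

        print(dti_scalars(1.7e-3, 0.4e-3, 0.3e-3))  # typical white-matter values, mm^2/s

    The simulated group differences amount to scaling one or more eigenvalues inside a chosen region before recomputing these scalars.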

  19. Preparing for the SWOT mission by evaluating the simulations of river water levels within a regional-scale hydrometeorological modeling framework

    Science.gov (United States)

    Häfliger, Vincent; Martin, Eric; Boone, Aaron; Habets, Florence; David, Cédric H.; Garambois, Pierre-André; Roux, Hélène; Ricci, Sophie

    2014-05-01

    The upcoming Surface Water Ocean Topography (SWOT) mission will provide unprecedented observations of water elevation in rivers and lakes. The vertical accuracy of SWOT measurements is expected to be around 10 cm for rivers wider than 50-100 m. Over France, new observations will be available every 5 days. Such observations will allow new opportunities for validating hydrological models and for assimilating data into them. The objective of the proposed work is to evaluate the quality of simulated river water levels in the Garonne River Basin (55,000 km²) located in southwestern France. The simulations are produced using a distributed regional-scale hydrometeorological modeling framework composed of a land surface model (ISBA), a hydrogeological model (MODCOU) and a river network model (RAPID). The modeling framework was initially calibrated over France, but this study focuses on the smaller Garonne Basin, and the proposed research emphasizes modifications made to RAPID. First, the existing RAPID parameters (i.e., temporally constant but spatially variable Muskingum parameters) were updated in the Garonne River Basin based on estimates obtained by applying a lagged cross-correlation method to observed hydrographs. Second, the model code was modified to allow the use of a kinematic or a kinematic-diffusive wave equation for routing, both allowing temporally and spatially variable wave celerities. This modification required prescribing the values of river-channel hydraulic parameters. Initial results show that the variable flow velocity scheme is advantageous for discharge computations when compared to the original Muskingum method in RAPID. Additionally, water level computations led to root mean square errors of 50-60 cm with the improved Muskingum method and 40-50 cm with the kinematic-diffusive wave method. Discharge computations were also shown to be comparable to those obtained with high-resolution models solving the
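    As a hedged illustration of the routing scheme being modified (a textbook Muskingum step, not RAPID's actual code; parameter values are arbitrary):

        def muskingum_step(i_prev, i_curr, q_prev, k, x, dt):
            """Outflow at the end of a time step for one river reach.
            k: reach travel time [s]; x: weighting factor (0-0.5); dt: time step [s]."""
            denom = k * (1.0 - x) + 0.5 * dt
            c1 = (0.5 * dt - k * x) / denom          # weight of current inflow
            c2 = (0.5 * dt + k * x) / denom          # weight of previous inflow
            c3 = (k * (1.0 - x) - 0.5 * dt) / denom  # weight of previous outflow
            return c1 * i_curr + c2 * i_prev + c3 * q_prev

        inflow = [10.0, 50.0, 100.0, 60.0, 20.0, 10.0]  # synthetic hydrograph [m^3/s]
        outflow = [inflow[0]]
        for t in range(1, len(inflow)):
            outflow.append(muskingum_step(inflow[t-1], inflow[t], outflow[-1],
                                          k=3600.0, x=0.2, dt=1800.0))
        print(outflow)

    The study replaces the temporally constant k and x above with wave celerities that vary in space and time under the kinematic and kinematic-diffusive formulations.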

  20. Ultra long-term simulation by the integrated model. 1. Framework and energy system module; Togo model ni yoru tanchoki simulation. 1. Flame work to energy system module

    Energy Technology Data Exchange (ETDEWEB)

    Kurosawa, A.; Yagita, H.; Yanagisawa, Y. [Research Inst. of Innovative Technology for the Earth, Kyoto (Japan)

    1997-01-30

    This paper introduces a study of the ultra long-term energy model GRAPE, which takes the global environment into consideration, and presents trial calculation results. The GRAPE model consists of modules for the energy system, climate change, land use change, food supply and demand, the macro economy, and environmental impact. The model divides the world into ten regions, takes 1990 as the base year, and enables ultra long-term simulation. Carbon emissions were calculated here as a trial. Under a constraint on the quantity of carbon emissions, energy supply in the latter half of the 21st century would be composed of photovoltaic energy, methanol from coal gasification, and biomass energy, and the share of nuclear energy would increase remarkably. In the power generation mix, IGCC power generation with carbon recovery, wind power generation, photovoltaic power generation, and nuclear power generation would extend their shares. Under a constraint on the concentration of carbon emissions, the structural change of power generation options would be delayed compared with the quantity-constrained case. 6 refs., 4 figs.

  1. Investigation of interphase effects in silica-polystyrene nanocomposites based on a hybrid molecular-dynamics-finite-element simulation framework

    Science.gov (United States)

    Pfaller, Sebastian; Possart, Gunnar; Steinmann, Paul; Rahimi, Mohammad; Müller-Plathe, Florian; Böhm, Michael C.

    2016-05-01

    A recently developed hybrid method is employed to study the mechanical behavior of silica-polystyrene nanocomposites (NCs) under uniaxial elongation. The hybrid method couples a particle domain to a continuum domain. The region of physical interest, i.e., the interphase around a nanoparticle (NP), is treated at molecular resolution, while the surrounding elastic continuum is handled with a finite-element approach. In the present paper we analyze the polymer behavior in the neighborhood of one or two nanoparticles at molecular resolution. The coarse-grained hybrid method allows us to simulate a large polymer matrix region surrounding the nanoparticles. We consider NCs with a dilute concentration of NPs embedded in an atactic polystyrene matrix formed by 300 chains of 200 monomer beads each. The overall orientation of polymer segments relative to the deformation direction is determined in the neighborhood of the nanoparticle to investigate the polymer response to this perturbation. Calculations of strainlike quantities give insight into the deformation behavior of a system with two NPs and show that the applied strain and the nanoparticle distance have significant influence on the deformation behavior. Finally, we investigate to what extent a continuum-based description may account for the specific effects occurring in the interphase between the polymer matrix and the NPs.

  2. L-Py: an L-System simulation framework for modeling plant development based on a dynamic language

    Directory of Open Access Journals (Sweden)

    Frederic eBoudon

    2012-05-01

    Full Text Available The study of plant development requires increasingly powerful modeling tools to help understand and simulate the growth and functioning of plants. In the last decade, the formalism of L-systems has emerged as a major paradigm for modeling plant development. Previous implementations of this formalism were made based on static languages, i.e. languages that require explicit definition of variable types before using them. These languages are often efficient but involve quite a lot of syntactic overhead, thus restricting the flexibility of use for modelers. In this work, we present an adaptation of L-systems to the Python language, a popular and powerful open-license dynamic language. We show that the use of dynamic language properties makes it possible to enhance the development of plant growth models: (i) by keeping a simple syntax while allowing for high-level programming constructs, (ii) by making code execution easy and avoiding compilation overhead, (iii) by allowing a high level of model reusability and the building of complex modular models, and (iv) by providing powerful solutions to integrate MTG data-structures (a common way to represent plants at several scales) into L-systems, thus enabling the use of a wide spectrum of MTG-based computer tools developed for plant architecture. We then illustrate the use of L-Py in real applications to build complex models or to teach plant modeling in the classroom.

  3. L-py: an L-system simulation framework for modeling plant architecture development based on a dynamic language.

    Science.gov (United States)

    Boudon, Frédéric; Pradal, Christophe; Cokelaer, Thomas; Prusinkiewicz, Przemyslaw; Godin, Christophe

    2012-01-01

    The study of plant development requires increasingly powerful modeling tools to help understand and simulate the growth and functioning of plants. In the last decade, the formalism of L-systems has emerged as a major paradigm for modeling plant development. Previous implementations of this formalism were made based on static languages, i.e., languages that require explicit definition of variable types before using them. These languages are often efficient but involve quite a lot of syntactic overhead, thus restricting the flexibility of use for modelers. In this work, we present an adaptation of L-systems to the Python language, a popular and powerful open-license dynamic language. We show that the use of dynamic language properties makes it possible to enhance the development of plant growth models: (i) by keeping a simple syntax while allowing for high-level programming constructs, (ii) by making code execution easy and avoiding compilation overhead, (iii) by allowing a high-level of model reusability and the building of complex modular models, and (iv) by providing powerful solutions to integrate MTG data-structures (that are a common way to represent plants at several scales) into L-systems and thus enabling to use a wide spectrum of computer tools based on MTGs developed for plant architecture. We then illustrate the use of L-Py in real applications to build complex models or to teach plant modeling in the classroom.
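    In plain Python (not L-Py's own rule syntax, which adds turtle interpretation and MTG integration), the core string-rewriting idea of an L-system can be sketched as:

        # Lindenmayer's algae system: A -> AB, B -> A
        rules = {"A": "AB", "B": "A"}

        def derive(axiom, steps):
            s = axiom
            for _ in range(steps):
                s = "".join(rules.get(ch, ch) for ch in s)  # rewrite all symbols in parallel
            return s

        print(derive("A", 5))  # -> ABAABABAABAAB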

  4. Mixed traffic flow model considering illegal lane-changing behavior: Simulations in the framework of Kerner’s three-phase theory

    Science.gov (United States)

    Hu, Xiaojian; Wang, Wei; Yang, Haifei

    2012-11-01

    This paper studies the mixed motorized vehicle (m-vehicle) and non-motorized vehicle (nm-vehicle) traffic flow in the m-vehicle lane. We study the formation mechanism of the nm-vehicle illegal lane-changing behavior (NILB) by considering the overtaking motivation and the traffic safety awareness. In the framework of Kerner’s three-phase theory, we propose a model for the mixed traffic flow by introducing a new set of rules. A series of simulations are carried out in order to reveal the formation, travel process and influence of the mixed traffic flow. The simulation results show that the proposed model can be used to study not only the travel characteristic of the mixed traffic flow, but also some complex traffic problems such as traffic breakdown, moving synchronized flow pattern (MSP) and moving jam. Moreover, the results illustrate that the proposed model reflects the phenomenon of the mixed flow and the influence of the MSP caused by the NILB, which is consistent with the actual traffic system, and thus this work is helpful for the management of the mixed traffic flow.
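    A hedged single-lane sketch of the kind of cellular-automaton update underlying such models (a Nagel-Schreckenberg-style simplification, not the authors' three-phase rule set with illegal lane changing):

        import random

        def step(pos, vel, vmax=5, length=200, p_slow=0.3):
            """Parallel update of vehicles kept in cyclic order on a ring road."""
            n = len(pos)
            new_vel = []
            for i in range(n):
                gap = (pos[(i + 1) % n] - pos[i] - 1) % length  # empty cells ahead
                v = min(vel[i] + 1, vmax, gap)                  # accelerate within headway
                if v > 0 and random.random() < p_slow:          # random slowdown
                    v -= 1
                new_vel.append(v)
            return [(pos[i] + new_vel[i]) % length for i in range(n)], new_vel

        pos, vel = list(range(0, 100, 5)), [0] * 20
        for _ in range(50):
            pos, vel = step(pos, vel)
        print(pos, vel)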

  5. Performance model of the Argonne Voyager multimedia server

    Energy Technology Data Exchange (ETDEWEB)

    Disz, T.; Olson, R.; Stevens, R. [Argonne National Lab., IL (United States). Mathematics and Computer Science Div.

    1997-07-01

    The Argonne Voyager Multimedia Server is being developed in the Futures Lab of the Mathematics and Computer Science Division at Argonne National Laboratory. As a network-based service for recording and playing multimedia streams, it is important that the Voyager system be capable of sustaining certain minimal levels of performance in order for it to be a viable system. In this article, the authors examine the performance characteristics of the server. As they examine the architecture of the system, they try to determine where bottlenecks lie, show actual vs potential performance, and recommend areas for improvement through custom architectures and system tuning.

  6. Photographic as-builts for Argonne National Laboratory-West

    Energy Technology Data Exchange (ETDEWEB)

    Sherman, E.K.; Wiegand, C.V.

    1995-04-19

    Located 35 miles west of Idaho Falls, Idaho, Argonne National Laboratory-West operates a number of nuclear facilities for the Department of Energy (DOE) through the University of Chicago. Part of the present mission of Argonne National Laboratory-West includes shutdown of the EBR-II Reactor. To accomplish this task, the Engineering-Drafting Department is exploring cost-effective methods of providing as-built services. A new technology that integrates photographic images with AUTOCAD drawing files is considered one of those methods that shows promise.

  7. Argonne's contribution to regional development : successful examples.

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Y. I.

    2000-11-14

    Argonne National Laboratory's mission is basic research and technology development to meet national goals in scientific leadership, energy technology, and environmental quality. In addition to its core missions as a national research and development center, Argonne has exerted a positive impact on its regional economic development, has carried out outstanding educational programs not only for college/graduate students but also for pre-college students and teachers, and has fostered partnerships with universities for research collaboration and with industry for shaping the new technological frontiers.

  8. Development of high intensity source of thermal positrons APosS (Argonne Positron Source)

    International Nuclear Information System (INIS)

    We present an update on the positron-facility development at Argonne National Laboratory. We discuss the advantages of using a low-energy electron accelerator, present our latest results on slow-positron production simulations, and outline plans for further development of the facility. We have installed a new converter/moderator assembly, appropriate for our electron energy, that increases the yield by about an order of magnitude. We have simulated the relative yields of thermalized positrons as a function of the energy of positrons incident on the moderator. We use these data to calculate positron yields, which we compare with our experimental data as well as with available literature data. We also discuss the new design of the next-generation positron front end, which utilizes a reflection moderator geometry, along with planned accelerator upgrades and their impact on APosS.

  9. Hydrometeorological multi-model ensemble simulations of the 4 November 2011 flash flood event in Genoa, Italy, in the framework of the DRIHM project

    Directory of Open Access Journals (Sweden)

    A. Hally

    2015-03-01

    Full Text Available The e-Science environment developed in the framework of the EU-funded DRIHM project was used to demonstrate its ability to provide relevant, meaningful hydrometeorological forecasts. This was illustrated for the tragic case of 4 November 2011, when Genoa, Italy, was flooded as the result of heavy, convective precipitation that inundated the Bisagno catchment. The Meteorological Model Bridge (MMB), an innovative software component developed within the DRIHM project for the interoperability of meteorological and hydrological models, is a key component of the DRIHM e-Science environment. The MMB allowed three different rainfall-discharge models (DRiFt, RIBS and HBV) to be driven by four mesoscale limited-area atmospheric models (WRF-NMM, WRF-ARW, Meso-NH and AROME) and a downscaling algorithm (RainFARM) in a seamless fashion. In addition to this multi-model configuration, some of the models were run in probabilistic mode, thus giving a comprehensive account of modelling errors and a very large number of likely hydrometeorological scenarios (>1500). The multi-model approach proved to be necessary because, whilst various aspects of the event were successfully simulated by different models, none of the models reproduced all of these aspects correctly. It was shown that the resulting set of simulations helped identify key atmospheric processes responsible for the large rainfall accumulations over the Bisagno basin. The DRIHM e-Science environment facilitated an evaluation of the sensitivity to atmospheric and hydrological modelling errors. This showed that both had a significant impact on predicted discharges, the former being larger than the latter. Finally, the usefulness of the set of hydrometeorological simulations was assessed from a flash flood early-warning perspective.

  10. The SPRINTARS version 3.80/4D-Var data assimilation system: development and inversion experiments based on the observing system simulation experiment framework

    Directory of Open Access Journals (Sweden)

    K. Yumimoto

    2013-06-01

    Full Text Available We present an aerosol data assimilation system based on a global aerosol climate model (SPRINTARS) and a four-dimensional variational data assimilation method (4D-Var). Its main purposes are to optimize emission estimates, improve composites, and obtain the best estimate of the radiative effects of aerosols in conjunction with observations. To reduce the huge computational cost caused by the iterative integrations of the models, we developed an off-line model and a corresponding adjoint model, which are driven by pre-calculated meteorological, land, and soil data. The off-line and adjoint models shortened the computational time of the inner loop by more than 30%. By comparing the results with a 1-year simulation from the original on-line model, the consistency of the off-line model was verified, with correlation coefficient R^2 > 0.97 and a small absolute value of the normalized mean bias (NMB). The feasibility and capability of the developed system for aerosol inverse modelling were demonstrated in several inversion experiments based on the observing system simulation experiment framework. In the experiments, we generated simulated observation data sets of fine- and coarse-mode AOTs from sun-synchronous polar orbits to investigate the impact of the observational frequency (number of satellites) and coverage (land and ocean). Observations over land have a notably positive impact on the performance of inverse modelling compared with observations over ocean, implying that reliable observational information over land is important for inverse modelling of land-borne aerosols. The experimental results also indicate that aerosol type classification is crucial to inverse modelling over regions where various aerosol species co-exist (e.g. industrialized regions and areas downwind of them).
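    For orientation, the cost function minimized by such a 4D-Var system takes the standard incremental form (generic data-assimilation notation, not necessarily the paper's):

        $$ J(\mathbf{x}) = \tfrac{1}{2}(\mathbf{x} - \mathbf{x}_b)^{\mathsf T} \mathbf{B}^{-1} (\mathbf{x} - \mathbf{x}_b)
           + \tfrac{1}{2} \sum_{t} \bigl(H_t(\mathbf{x}_t) - \mathbf{y}_t\bigr)^{\mathsf T} \mathbf{R}_t^{-1} \bigl(H_t(\mathbf{x}_t) - \mathbf{y}_t\bigr) $$

    Here the control vector x holds the aerosol emissions being optimized, x_b is the background estimate, B and R_t are the background and observation error covariances, H_t maps the model state to the observed AOTs, and the adjoint model supplies the gradient of J for the iterative minimization in the inner loop.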

  11. Hydrometeorological multi-model ensemble simulations of the 4 November 2011 flash-flood event in Genoa, Italy, in the framework of the DRIHM project

    Directory of Open Access Journals (Sweden)

    A. Hally

    2014-11-01

    Full Text Available The e-Science environment developed in the framework of the EU-funded DRIHM project was used to demonstrate its capability to provide relevant, meaningful hydrometeorological forecasts. This was illustrated for the tragic case of 4 November 2011, when Genoa, Italy, was flooded as the result of heavy, convective precipitation that inundated the Bisagno catchment. The Meteorological Model Bridge (MMB, an innovative software component developped within the DRIHM project for the interoperability of meteorological and hydrological models, is a key component of the DRIHM e-Science environment. The MMB allowed three different rainfall-discharge models (DRiFt, RIBS, and HBV to be driven by four mesoscale limited-area atmospheric models (WRF-NMM, WRF-ARW, Meso-NH, and AROME and a downscaling algorithm (RainFARM in a seamless fashion. In addition to this multi-model configuration, some of the models were run in probabilistic mode, thus allowing a comprehensive account of modelling errors and a very large amount of likely hydrometeorological scenarios (>1500. The multi-model approach proved to be necessary because, whilst various aspects of the event were successfully simulated by different models, none of the models reproduced all of these aspects correctly. It was shown that the resulting set of simulations helped identify key atmospheric processes responsible for the large rainfall accumulations over the Bisagno basin. The DRIHM e-Science environment facilitated an evaluation of the sensitivity to atmospheric and hydrological modelling errors. This showed that both had a significant impact on predicted discharges, the former being larger than the latter. Finally, the usefulness of the set of hydrometeorological simulations was assessed from a flash-flood early-warning perspective.

  12. A FULL GPU IMPLEMENTATION FRAMEWORK OF SPH FLUID REAL-TIME SIMULATION

    Institute of Scientific and Technical Information of China (English)

    郭秋雷; 唐逸之; 刘诗秋; 李桂清

    2011-01-01

    Real-time, highly realistic simulation of large-scale fluids is an important research topic in computer graphics. Fluid simulation comprises several components, including physical calculation, collision detection, surface reconstruction, and rendering, and much prior work applies GPU acceleration to the algorithms of individual components. This paper proposes a complete GPU-based acceleration framework for SPH fluid simulation. Building on smoothed-particle hydrodynamics (SPH) to solve the Navier-Stokes equations, particle collision detection is greatly accelerated with GPU-based parallel spatial subdivision (PSS). Meanwhile, a fluid-surface reconstruction algorithm using the geometry shader is designed and further optimized with an index-based scheme, so that the surface reconstruction process avoids traversing regions that contain no surface. Experimental results show that the method can simulate fluid scenes with good realism in real time.
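    As a hedged illustration of the particle-level computation being accelerated (a naive CPU version in Python; the paper's implementation runs the equivalent loops as GPU kernels with a parallel spatial-subdivision neighbor search):

        import numpy as np

        def sph_density(positions, mass, h):
            """O(N^2) SPH density summation with the poly6 smoothing kernel."""
            k_poly6 = 315.0 / (64.0 * np.pi * h**9)
            rho = np.zeros(len(positions))
            for i in range(len(positions)):
                r2 = np.sum((positions - positions[i])**2, axis=1)
                w = np.where(r2 < h * h, k_poly6 * (h * h - r2)**3, 0.0)
                rho[i] = mass * np.sum(w)
            return rho

        pts = np.random.default_rng(0).uniform(0.0, 1.0, size=(500, 3))
        print(sph_density(pts, mass=0.02, h=0.1)[:5])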

  13. The Argonne Leadership Computing Facility 2010 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Drugan, C. (LCF)

    2011-05-09

    Researchers found more ways than ever to conduct transformative science at the Argonne Leadership Computing Facility (ALCF) in 2010. Both familiar initiatives and innovative new programs at the ALCF are now serving a growing, global user community with a wide range of computing needs. The Department of Energy's (DOE) INCITE Program remained vital in providing scientists with major allocations of leadership-class computing resources at the ALCF. For calendar year 2011, 35 projects were awarded 732 million supercomputer processor-hours for computationally intensive, large-scale research projects with the potential to significantly advance key areas in science and engineering. Argonne also continued to provide Director's Discretionary allocations - 'start up' awards - for potential future INCITE projects. And DOE's new ASCR Leadership Computing Challenge (ALCC) Program allocated resources to 10 ALCF projects, with an emphasis on high-risk, high-payoff simulations directly related to the Department's energy mission, national emergencies, or for broadening the research community capable of using leadership computing resources. While delivering more science today, we've also been laying a solid foundation for high performance computing in the future. After a successful DOE Lehman review, a contract was signed to deliver Mira, the next-generation Blue Gene/Q system, to the ALCF in 2012. The ALCF is working with the 16 projects that were selected for the Early Science Program (ESP) to enable them to be productive as soon as Mira is operational. Preproduction access to Mira will enable ESP projects to adapt their codes to its architecture and collaborate with ALCF staff in shaking down the new system. We expect the 10-petaflops system to stoke economic growth and improve U.S. competitiveness in key areas such as advancing clean energy and addressing global climate change. Ultimately, we envision Mira as a stepping-stone to exascale-class computers.

  14. FACET: an object-oriented software framework for modeling complex social behavior patterns

    Energy Technology Data Exchange (ETDEWEB)

    Dolph, J. E.; Christiansen, J. H.; Sydelko, P. J.

    2000-06-30

    The Framework for Addressing Cooperative Extended Transactions (FACET) is a flexible, object-oriented architecture for implementing models of dynamic behavior of multiple individuals, or agents, in a simulation. These agents can be human (individuals or organizations) or animal and may exhibit any type of organized social behavior that can be logically articulated. FACET was developed by Argonne National Laboratory's (ANL) Decision and Information Sciences Division (DIS) out of the need to integrate societal processes into natural system simulations. The FACET architecture includes generic software components that provide the agents with various mechanisms for interaction, such as step sequencing and logic, resource management, conflict resolution, and preemptive event handling. FACET components provide a rich environment within which patterns of behavior can be captured in a highly expressive manner. Interactions among agents in FACET are represented by Course of Action (COA) object-based models. Each COA contains a directed graph of individual actions, which represents any known pattern of social behavior. The agents' behavior in a FACET COA, in turn, influences the natural landscape objects in a simulation (i.e., vegetation, soil, and habitat) by updating their states. The modular design of the FACET architecture provides the flexibility to create multiple and varied simulation scenarios by changing social behavior patterns, without disrupting the natural process models. This paper describes the FACET architecture and presents several examples of FACET models that have been developed to assess the effects of anthropogenic influences on the dynamics of the natural environment.
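    A hypothetical sketch of the Course of Action idea described above, with a COA as a directed graph of actions whose effects update a landscape state (class and attribute names are illustrative, not FACET's API):

        class Action:
            def __init__(self, name, effect):
                self.name = name
                self.effect = effect      # callable applied to the landscape state
                self.successors = []      # directed edges to follow-on actions

        class CourseOfAction:
            def __init__(self, start):
                self.start = start
            def run(self, state):
                frontier = [self.start]   # walk the action graph breadth-first
                while frontier:
                    action = frontier.pop(0)
                    action.effect(state)
                    frontier.extend(action.successors)
                return state

        graze = Action("graze", lambda s: s.update(vegetation=s["vegetation"] - 1))
        rest = Action("rest", lambda s: None)
        graze.successors.append(rest)
        print(CourseOfAction(graze).run({"vegetation": 10}))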

  15. Three Argonne technologies win R&D 100 awards

    CERN Multimedia

    2003-01-01

    "Three technologies developed or co-developed at the U.S. Department of Energy's Argonne National Laboratory have been recognized with R&D 100 Awards, which highlight some of the best products and technologies from around the world" (1 page).

  16. Argonne to open new facility for advanced vehicle testing

    CERN Multimedia

    2002-01-01

    Argonne National Laboratory will open its Advanced Powertrain Research Facility on Friday, Nov. 15. The facility is North America's only public testing facility for engines, fuel cells, electric drives and energy storage. State-of-the-art performance and emissions measurement equipment is available to support model development and technology validation (1 page).

  17. Brookhaven Lab and Argonne Lab scientists invent a plasma valve

    CERN Multimedia

    2003-01-01

    Scientists from Brookhaven National Laboratory and Argonne National Laboratory have received U.S. patent number 6,528,948 for a device that shuts off airflow into a vacuum about one million times faster than mechanical valves or shutters that are currently in use (1 page).

  18. Argonne National Laboratory Publications July 1, 1968 - June 30, 1969.

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    1969-08-01

    This publication list is a bibliography of scientific and technical accounts that originated at Argonne and were published during fiscal year 1969 (July 1, 1968, through June 30, 1969). It includes items published as journal articles, technical reports, books, etc., all of which have been made available to the public.

  19. Argonne Laboratory Computing Resource Center - FY2004 Report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R.

    2005-04-14

    In the spring of 2002, Argonne National Laboratory founded the Laboratory Computing Resource Center, and in April 2003 LCRC began full operations with Argonne's first teraflops computing cluster. The LCRC's driving mission is to enable and promote computational science and engineering across the Laboratory, primarily by operating computing facilities and supporting application use and development. This report describes the scientific activities, computing facilities, and usage in the first eighteen months of LCRC operation. In this short time LCRC has had broad impact on programs across the Laboratory. The LCRC computing facility, Jazz, is available to the entire Laboratory community. In addition, the LCRC staff provides training in high-performance computing and guidance on application usage, code porting, and algorithm development. All Argonne personnel and collaborators are encouraged to take advantage of this computing resource and to provide input into the vision and plans for computing and computational analysis at Argonne. Steering for LCRC comes from the Computational Science Advisory Committee, composed of computing experts from many Laboratory divisions. The CSAC Allocations Committee makes decisions on individual project allocations for Jazz.

  20. An object-oriented modeling and simulation framework for bearings-only multi-target tracking using an unattended acoustic sensor network

    Science.gov (United States)

    Aslan, Murat Šamil

    2013-10-01

    Tracking ground targets using low-cost ground-based sensors is a challenging field because of the limited capabilities of such sensors. Among the several candidates, including seismic and magnetic sensors, acoustic sensors based on microphone arrays have the potential to be useful: they can provide a direction to the sound source, they can have a relatively better range, and the sound characteristics can provide a basis for target classification. However, there are still many problems. One is the difficulty of resolving multiple sound sources; another is that they do not provide distance; a third is the presence of background noise from wind, sea, rain, distant air and land traffic, people, etc.; and a fourth is that the same target can sound very different depending on factors like terrain type, topography, speed, gear, distance, etc. Use of sophisticated signal processing and data fusion algorithms is the key to compensating (to an extent) for the limited capabilities and the problems mentioned for these sensors. It is hard, if not impossible, to evaluate the performance of such complex algorithms analytically. For an effective evaluation, before performing expensive field trials, well-designed laboratory experiments and computer simulations are necessary. Along this line, in this paper, we present an object-oriented modeling and simulation framework which can be used to generate simulated data for the data fusion algorithms for tracking multiple on-road targets in an unattended acoustic sensor network. Each sensor node in the network is a circular microphone array which produces direction of arrival (DOA) (or bearing) measurements of the targets and sends this information to a fusion center. We present the models for road networks, targets (motion and acoustic power) and acoustic sensors in an object-oriented fashion where different and possibly time-varying sampling periods for each sensor node are possible. Moreover, the sensor's signal processing and
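    A minimal sketch of the bearing (DOA) measurement model such a sensor node produces (names and the noise level are illustrative assumptions):

        import math, random

        def bearing_measurement(sensor_xy, target_xy, sigma_deg=2.0):
            """Noisy direction of arrival from sensor to target, in degrees."""
            dx = target_xy[0] - sensor_xy[0]
            dy = target_xy[1] - sensor_xy[1]
            true_bearing = math.degrees(math.atan2(dy, dx))
            return (true_bearing + random.gauss(0.0, sigma_deg)) % 360.0

        print(bearing_measurement((0.0, 0.0), (100.0, 50.0)))

    A fusion center receiving such bearings from two or more nodes can triangulate a target position, which is how the missing range information is recovered.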

  1. Using the DeNitrification-DeComposition Framework to Simulate Global Soil Nitrous Oxide Emissions in the Community Land Model with Coupled Carbon and Nitrogen

    Science.gov (United States)

    Seok, B.; Saikawa, E.

    2015-12-01

    Soils are among the largest emission sources of nitrous oxide (N2O), which is a strong greenhouse gas and the leading stratospheric ozone-depleting substance. Thus, there is rising concern about mitigating N2O emissions from soils. And yet, our understanding of the global magnitude and the impacts of soil N2O emissions on the climate and the stratospheric ozone layer is still limited, and our ability to mitigate N2O emissions thus remains a challenge. One approach to assessing the global magnitude and impacts of N2O emissions is to use global biogeochemical models. Currently, most of these models use a simple or conceptual framework to simulate soil N2O emissions. However, if we are to reduce the uncertainty in determining the N2O budget, a better representation of the soil N2O emission process is essential. In our attempts to fulfill this objective, we have further improved the parameterization of soil N2O emissions in the Community Land Model with coupled Carbon and Nitrogen (CLM-CN) by implementing the DeNitrification-DeComposition (DNDC) model and validated our model results against existing measurements. We saw a general improvement in simulated N2O emissions with the updated parameterization and further improvements for specific sites when the model was nudged with measured soil temperature and moisture data from the respective site. We present the latest updates and changes made to CLM-CN with a one-way coupled DNDC model (CLMCN-N2O) and compare the results between model versions and against other global biogeochemical models.

  2. PHP frameworks

    OpenAIRE

    Srša, Aljaž

    2016-01-01

    The thesis presents four of the most popular PHP web frameworks: Laravel, Symfony, CodeIgniter and CakePHP. These frameworks are compared with each other according to four criteria that can help with the selection of a framework. These criteria are the size of the community, the quality of official support, the comprehensibility of the framework's documentation, and the implementation of functionalities in individual frameworks, which are automatic code generation, routing, object-relational mapping and...

  3. Employment impacts of EU biofuels policy. Combining bottom-up technology information and sectoral market simulations in an input-output framework

    International Nuclear Information System (INIS)

    This paper analyses the employment consequences of policies aimed to support biofuels in the European Union. The promotion of biofuel use has been advocated as a means to promote the sustainable use of natural resources and to reduce greenhouse gas emissions originating from transport activities on the one hand, and to reduce dependence on imported oil and thereby increase security of the European energy supply on the other hand. The employment impacts of increasing biofuels shares are calculated by taking into account a set of elements comprising the demand for capital goods required to produce biofuels, the additional demand for agricultural feedstock, higher fuel prices or reduced household budget in the case of price subsidisation, price effects ensuing from a hypothetical world oil price reduction linked to substitution in the EU market, and price impacts on agro-food commodities. The calculations refer to scenarios for the year 2020 targets as set out by the recent Renewable Energy Roadmap. Employment effects are assessed in an input-output framework taking into account bottom-up technology information to specify biofuels activities and linked to partial equilibrium models for the agricultural and energy sectors. The simulations suggest that biofuels targets on the order of 10-15% could be achieved without adverse net employment effects. (author)
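    The employment calculation in such studies rests on the standard Leontief input-output relation (generic notation, assumed rather than quoted from the paper):

        $$ \mathbf{x} = (\mathbf{I} - \mathbf{A})^{-1} \mathbf{d}, \qquad \boldsymbol{\ell} = \hat{\mathbf{e}}\,\mathbf{x} $$

    where d is the vector of final demand (shifted here by biofuel capital goods, feedstock demand, and price effects), A the matrix of technical coefficients, x the resulting gross sectoral outputs, and ê a diagonal matrix of employment per unit of output; net employment effects follow from differencing ℓ between the biofuel scenario and the baseline.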

  4. Simulation

    DEFF Research Database (Denmark)

    Gould, Derek A; Chalmers, Nicholas; Johnson, Sheena J;

    2012-01-01

    Recognition of the many limitations of traditional apprenticeship training is driving new approaches to learning medical procedural skills. Among simulation technologies and methods available today, computer-based systems are topical and bring the benefits of automated, repeatable, and reliable performance assessments. Human factors research is central to simulator model development that is relevant to real-world imaging-guided interventional tasks and to the credentialing programs in which it would be used.

  5. Argonne Leadership Computing Facility 2011 annual report : Shaping future supercomputing.

    Energy Technology Data Exchange (ETDEWEB)

    Papka, M.; Messina, P.; Coffey, R.; Drugan, C. (LCF)

    2012-08-16

    The ALCF's Early Science Program aims to prepare key applications for the architecture and scale of Mira and to solidify libraries and infrastructure that will pave the way for other future production applications. Two billion core-hours have been allocated to 16 Early Science projects on Mira. The projects, in addition to promising delivery of exciting new science, are all based on state-of-the-art, petascale, parallel applications. The project teams, in collaboration with ALCF staff and IBM, have undertaken intensive efforts to adapt their software to take advantage of Mira's Blue Gene/Q architecture, which, in a number of ways, is a precursor to future high-performance-computing architecture. The Argonne Leadership Computing Facility (ALCF) enables transformative science that solves some of the most difficult challenges in biology, chemistry, energy, climate, materials, physics, and other scientific realms. Users partnering with ALCF staff have reached research milestones previously unattainable, due to the ALCF's world-class supercomputing resources and expertise in computational science. In 2011, the ALCF's commitment to providing outstanding science and leadership-class resources was honored with several prestigious awards. Research on multiscale brain blood flow simulations was named a Gordon Bell Prize finalist. Intrepid, the ALCF's BG/P system, ranked No. 1 on the Graph 500 list for the second consecutive year. The next-generation BG/Q prototype again topped the Green500 list. Skilled experts at the ALCF enable researchers to conduct breakthrough science on the Blue Gene system in key ways. The Catalyst Team matches project PIs with experienced computational scientists to maximize and accelerate research in their specific scientific domains. The Performance Engineering Team facilitates the effective use of applications on the Blue Gene system by assessing and improving the algorithms used by applications and the techniques used to

  6. Research on the Framework of Exploratory Simulation Experiment Based on Data Farming

    Institute of Scientific and Technical Information of China (English)

    李斌; 李春洪; 刘苏洋

    2011-01-01

    Based on data farming, this paper designs a framework for exploratory simulation experiments and discusses each of its parts. The framework consists of four components: a ping-pong-style wargaming preparation experiment, a single-scenario building loop, a simulation scenario-space execution loop, and a simulation-results analysis loop. By combining human experience and intelligence with computer simulation, the advice needed for military decision making is formed and war rules of interest are gradually found over repeated cycles. Under this framework, various qualitative and quantitative analysis methods can be integrated to carry out exploratory simulation experiments aimed at the experimental objective.

  7. Argonne's Expedited Site Characterization: An integrated approach to cost- and time-effective remedial investigation

    International Nuclear Information System (INIS)

    Argonne National Laboratory has developed a methodology for remedial site investigation that has proven to be both technically superior to and more cost- and time-effective than traditional methods. This methodology is referred to as the Argonne Expedited Site Characterization (ESC). Quality is the driving force within the process. The Argonne ESC process is abbreviated only in time and cost and never in terms of quality. More usable data are produced with the Argonne ESC process than with traditional site characterization methods that are based on statistical-grid sampling and multiple monitoring wells. This paper gives an overview of the Argonne ESC process and compares it with traditional methods for site characterization. Two examples of implementation of the Argonne ESC process are discussed to illustrate the effectiveness of the process in CERCLA (Comprehensive Environmental Response, Compensation, and Liability Act) and RCRA (Resource Conservation and Recovery Act) programs

  8. Selection, specification, design and use of various nuclear power plant training simulators. Report prepared within the framework of the International Working Group on Nuclear Power Plant Control and Instrumentation

    International Nuclear Information System (INIS)

    Several IAEA publications consider the role of training and particularly the role of simulator training to enhance the safety of NPP operations. Initially, the focus was on full scope simulators for the training of main control room operators. Experience shows that other types of simulator are also effective tools that allow simulator training for a broader range of target groups and training objectives. This report provides guidance to training centers and suppliers on the proper selection, specification, design and use of various forms of simulators. In addition, it provides examples of their use in several Member States. This report is the result of a series of advisory and consultants meetings held in the framework of the International Working Group on Nuclear Power Plant Control and Instrumentation (IWG-NPPCI) in 1995-1996

  9. Simulation

    CERN Document Server

    Ross, Sheldon

    2006-01-01

    Ross's Simulation, Fourth Edition introduces aspiring and practicing actuaries, engineers, computer scientists and others to the practical aspects of constructing computerized simulation studies to analyze and interpret real phenomena. Readers learn to apply results of these analyses to problems in a wide variety of fields to obtain effective, accurate solutions and make predictions about future outcomes. This text explains how a computer can be used to generate random numbers, and how to use these random numbers to generate the behavior of a stochastic model over time. It presents the statist
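    A minimal example of the book's central idea, turning uniform random numbers into draws from a stochastic model by inverse-transform sampling (an assumed illustration, not the book's code):

        import math, random

        def exponential_sample(rate):
            """U ~ Uniform(0,1) mapped to X ~ Exponential(rate) via X = -ln(1-U)/rate."""
            u = random.random()
            return -math.log(1.0 - u) / rate

        # Interarrival times of a Poisson process with rate 2 events per hour
        print([exponential_sample(2.0) for _ in range(5)])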

  10. APEX user's guide - (Argonne production, expansion, and exchange model for electrical systems), version 3.0

    Energy Technology Data Exchange (ETDEWEB)

    VanKuiken, J.C.; Veselka, T.D.; Guziel, K.A.; Blodgett, D.W.; Hamilton, S.; Kavicky, J.A.; Koritarov, V.S.; North, M.J.; Novickas, A.A.; Paprockas, K.R. [and others]

    1994-11-01

    This report describes operating procedures and background documentation for the Argonne Production, Expansion, and Exchange Model for Electrical Systems (APEX). This modeling system was developed to provide the U.S. Department of Energy, Division of Fossil Energy, Office of Coal and Electricity with in-house capabilities for addressing policy options that affect electrical utilities. To meet this objective, Argonne National Laboratory developed a menu-driven programming package that enables the user to develop and conduct simulations of production costs, system reliability, spot market network flows, and optimal system capacity expansion. The APEX system consists of three basic simulation components, supported by various databases and data management software. The components include (1) the investigation of Costs and Reliability in Utility Systems (ICARUS) model, (2) the Spot Market Network (SMN) model, and (3) the Production and Capacity Expansion (PACE) model. The ICARUS model provides generating-unit-level production-cost and reliability simulations with explicit recognition of planned and unplanned outages. The SMN model addresses optimal network flows with recognition of marginal costs, wheeling charges, and transmission constraints. The PACE model determines long-term (e.g., longer than 10 years) capacity expansion schedules on the basis of candidate expansion technologies and load growth estimates. In addition, the Automated Data Assembly Package (ADAP) and case management features simplify user-input requirements. The ADAP, ICARUS, and SMN modules are described in detail. The PACE module is expected to be addressed in a future publication.

  11. Simulation of the Mechanism of Gas Sorption in a Metal–Organic Framework with Open Metal Sites: Molecular Hydrogen in PCN-61

    KAUST Repository

    Forrest, Katherine A.

    2012-07-26

    Grand canonical Monte Carlo (GCMC) simulations were performed to investigate hydrogen sorption in an rht-type metal-organic framework (MOF), PCN-61. The MOF was shown to have a large hydrogen uptake, and this was studied using three different hydrogen potentials, effective for bulk hydrogen, but of varying sophistication: a model that includes only repulsion/dispersion parameters, one augmented with charge-quadrupole interactions, and one supplemented with many-body polarization interactions. Calculated hydrogen uptake isotherms and isosteric heats of adsorption, Qst, were in quantitative agreement with experiment only for the model with explicit polarization. This success in reproducing empirical measurements suggests that modeling MOFs that have open metal sites is feasible, though it is often not considered to be well described via a classical potential function; here it is shown that such systems may be accurately described by explicitly including polarization effects in an otherwise traditional empirical potential. Decomposition of energy terms for the models revealed deviations between the electrostatic and polarizable results that are unexpected due to just the augmentation of the potential surface by the addition of induction. Charge-quadrupole and induction energetics were shown to have a synergistic interaction, with inclusion of the latter resulting in a significant increase in the former. Induction interactions strongly influence the structure of the sorbed hydrogen compared to the models lacking polarizability; sorbed hydrogen is a dipolar dense fluid in the MOF. This study demonstrates that many-body polarization makes a critical contribution to gas sorption structure and must be accounted for in modeling MOFs with polar interaction sites. © 2012 American Chemical Society.
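    For context, the particle-insertion acceptance rule in such grand canonical Monte Carlo simulations has the standard textbook form (not specific to this study):

        $$ P_{\mathrm{acc}}(N \to N+1) = \min\!\left[1,\; \frac{V}{\Lambda^{3}(N+1)}\, e^{\beta\mu}\, e^{-\beta\,\Delta U}\right] $$

    where mu is the chemical potential fixed by the bulk-gas fugacity, Lambda the thermal de Broglie wavelength, V the framework volume, and Delta U the potential-energy change of the trial insertion, which in the polarizable model includes the many-body induction term.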

  12. Present and future radioactive nuclear beam developments at Argonne

    Energy Technology Data Exchange (ETDEWEB)

    Decrock, P.

    1996-11-01

    A scheme for building an ISOL-based radioactive nuclear beam facility at the Argonne Physics Division is currently being evaluated. The feasibility and efficiency of the different steps in the proposed production and acceleration cycles are being tested. At the Dynamitron Facility of the ANL Physics Division, stripping yields of Kr, Xe, and Pb beams in a windowless gas cell have been measured, and the study of fission of {sup 238}U induced by fast neutrons from the {sup 9}Be(d,n) reaction is in progress. Different aspects of the post-acceleration procedure are currently being investigated. In parallel with this work, energetic radioactive beams such as {sup 17}F, {sup 18}F and {sup 56}Ni have recently been developed at Argonne using the present ATLAS facility.

  13. Multiscale framework for predicting the coupling between deformation and fluid diffusion in porous rocks

    Energy Technology Data Exchange (ETDEWEB)

    Andrade, José E; Rudnicki, John W

    2012-12-14

    In this project, a predictive multiscale framework will be developed to simulate the strong coupling between solid deformation and fluid diffusion in porous rocks. We intend to improve macroscale modeling by incorporating fundamental physical modeling at the microscale in a computationally efficient way. This is an essential step toward further developments in multiphysics modeling, linking hydraulic, thermal, chemical, and geomechanical processes. This research will focus on areas where severe deformations are observed, such as deformation bands, where classical phenomenology breaks down. Multiscale geometric complexities and key geomechanical and hydraulic attributes of deformation bands (e.g., grain sliding and crushing, and pore collapse causing interstitial fluid expulsion under saturated conditions) can significantly affect the constitutive response of the skeleton and the intrinsic permeability. The discrete element method (DEM) and the lattice Boltzmann method (LBM) will be used to probe the microstructure, under the current state, to extract the evolution of macroscopic constitutive parameters and the permeability tensor. These evolving macroscopic constitutive parameters are then used directly in continuum-scale predictions with the finite element method (FEM), accounting for the coupled solid deformation and fluid diffusion. A particularly valuable aspect of this research is the thorough quantitative verification and validation program at different scales. The multiscale homogenization framework will be validated using X-ray computed tomography and 3D digital image correlation in situ at the Advanced Photon Source at Argonne National Laboratory. Also, the hierarchical computations at the specimen level will be validated using the aforementioned techniques on samples of sandstone undergoing deformation banding.
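
    The information flow described above, microscale probes feeding updated constitutive parameters to a continuum solver, can be pictured as a simple loop. The sketch below is schematic only; the DEM and LBM probes are stand-in toy functions, not the project's actual codes:

      # Schematic coupling loop: microscale probes (DEM for the skeleton,
      # LBM for pore-scale flow) refresh the constitutive inputs of each
      # continuum step. Both probes are toy stand-ins, not real solvers.

      def dem_probe(strain):
          """Stand-in DEM probe: tangent stiffness degrading with strain."""
          return 1.0e9 / (1.0 + 50.0 * abs(strain))  # Pa

      def lbm_probe(porosity):
          """Stand-in LBM probe: Kozeny-Carman-like permeability."""
          return 1.0e-12 * porosity**3 / (1.0 - porosity) ** 2  # m^2

      strain, porosity = 0.0, 0.25
      for step in range(5):
          stiffness = dem_probe(strain)   # microscale -> macroscale stiffness
          perm = lbm_probe(porosity)      # microscale -> permeability (scalar here)
          # A real FEM step would use these fields; we mimic it with a
          # uniaxial load increment and a small pore-collapse update.
          strain += 1.0e6 / stiffness
          porosity = max(0.05, porosity - 0.01)
          print(f"step {step}: E = {stiffness:.3e} Pa, k = {perm:.3e} m^2")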

  14. The virtual reality framework for engineering objects

    OpenAIRE

    Ivankov, Petr R.; Ivankov, Nikolay P.

    2006-01-01

    A framework for virtual reality simulation of engineering objects has been developed. The framework can simulate different kinds of equipment in a virtual reality setting. It supports 6D dynamics, ordinary differential equations, finite formulas, and vector and matrix operations. The framework also supports the embedding of external software.

  15. Argonne's Laboratory computing resource center : 2006 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R. B.; Kaushik, D. K.; Riley, K. R.; Valdes, J. V.; Drugan, C. D.; Pieper, G. P.

    2007-05-31

    Argonne National Laboratory founded the Laboratory Computing Resource Center (LCRC) in the spring of 2002 to help meet pressing program needs for computational modeling, simulation, and analysis. The guiding mission is to provide critical computing resources that accelerate the development of high-performance computing expertise, applications, and computations to meet the Laboratory's challenging science and engineering missions. In September 2002 the LCRC deployed a 350-node computing cluster from Linux NetworX to address Laboratory needs for mid-range supercomputing. This cluster, named 'Jazz', achieved over a teraflop of computing power (10{sup 12} floating-point calculations per second) on standard tests, making it the Laboratory's first terascale computing system and one of the 50 fastest computers in the world at the time. Jazz was made available to early users in November 2002 while the system was undergoing development and configuration. In April 2003, Jazz was officially made available for production operation. Since then, the Jazz user community has grown steadily. By the end of fiscal year 2006, there were 76 active projects on Jazz involving over 380 scientists and engineers. These projects represent a wide cross-section of Laboratory expertise, including work in biosciences, chemistry, climate, computer science, engineering applications, environmental science, geoscience, information science, materials science, mathematics, nanoscience, nuclear engineering, and physics. Most important, many projects have achieved results that would have been unobtainable without such a computing resource. The LCRC continues to foster growth in the computational science and engineering capability and quality at the Laboratory. Specific goals include expansion of the use of Jazz to new disciplines and Laboratory initiatives, teaming with Laboratory infrastructure providers to offer more scientific data management capabilities, expanding Argonne staff

  16. Argonne's Laboratory Computing Resource Center : 2005 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R. B.; Coghlan, S. C; Kaushik, D. K.; Riley, K. R.; Valdes, J. V.; Pieper, G. P.

    2007-06-30

    Argonne National Laboratory founded the Laboratory Computing Resource Center in the spring of 2002 to help meet pressing program needs for computational modeling, simulation, and analysis. The guiding mission is to provide critical computing resources that accelerate the development of high-performance computing expertise, applications, and computations to meet the Laboratory's challenging science and engineering missions. The first goal of the LCRC was to deploy a mid-range supercomputing facility to support the unmet computational needs of the Laboratory. To this end, in September 2002, the Laboratory purchased a 350-node computing cluster from Linux NetworX. This cluster, named 'Jazz', achieved over a teraflop of computing power (10{sup 12} floating-point calculations per second) on standard tests, making it the Laboratory's first terascale computing system and one of the fifty fastest computers in the world at the time. Jazz was made available to early users in November 2002 while the system was undergoing development and configuration. In April 2003, Jazz was officially made available for production operation. Since then, the Jazz user community has grown steadily. By the end of fiscal year 2005, there were 62 active projects on Jazz involving over 320 scientists and engineers. These projects represent a wide cross-section of Laboratory expertise, including work in biosciences, chemistry, climate, computer science, engineering applications, environmental science, geoscience, information science, materials science, mathematics, nanoscience, nuclear engineering, and physics. Most important, many projects have achieved results that would have been unobtainable without such a computing resource. The LCRC continues to improve the computational science and engineering capability and quality at the Laboratory. Specific goals include expansion of the use of Jazz to new disciplines and Laboratory initiatives, teaming with Laboratory infrastructure

  17. Enhanced Methane Adsorption in Catenated Metal-organic Frameworks: A Molecular Simulation Study

    Institute of Scientific and Technical Information of China (English)

    薛春瑜; 周了娥; 阳庆元; 仲崇立

    2009-01-01

    A systematic molecular simulation study was performed to investigate the effect of catenation on methane adsorption in metal-organic frameworks (MOFs). Four pairs of isoreticular MOFs (IRMOFs) with and without catenation were adopted and their capacities for methane adsorption were compared at room temperature. The present work showed that catenation can greatly enhance the storage capacity of methane in MOFs, owing to the additional small pores and adsorption sites created by the catenation of the frameworks. In addition, the simulation results obtained at 298 K and 3.5 MPa showed that catenated MOFs can readily meet the requirement for methane storage in porous materials.

  18. Numerical Simulation of Casting Deformation and Stress in the Ti-Alloy Parts with Framework Structure

    Institute of Scientific and Technical Information of China (English)

    崔新鹏; 张晨; 范世玺; 南海

    2015-01-01

    The temperature field, stress field, and casting deformation of Ti-alloy framework castings during the pouring and solidification process were simulated with the ProCAST software. The graphite mould was modelled as RIGID in the initial stage of the stress calculation and as VACANT in the later stage. The simulation results reveal that the temperature of the upper and inner parts of the framework castings is much lower than that of the bottom part and the pouring gates. Bending deflection along the -Y axis was observed at the two tips of the framework castings, and stress concentration was observed at the corners of the internal framework structure. The measured casting dimensions agree well with the simulated ones, which verified the accuracy of the simulation.

  19. Argonne National Laboratory institutional plan FY 2002 - FY 2007

    International Nuclear Information System (INIS)

    The national laboratory system provides a unique resource for addressing the national needs inherent in the mission of the Department of Energy. Argonne, which grew out of Enrico Fermi's pioneering work on the development of nuclear power, was the first national laboratory and, in many ways, has set the standard for those that followed. As the Laboratory's new director, I am pleased to present the Argonne National Laboratory Institutional Plan for FY 2002 through FY 2007 on behalf of the extraordinary group of scientists, engineers, technicians, administrators, and others who are responsible for the Laboratory's distinguished record of achievement. Like our sister DOE laboratories, Argonne uses a multifaceted approach to advance U.S. R and D priorities. First, we assemble interdisciplinary teams of scientists and engineers to address complex problems. For example, our initiative in Functional Genomics will bring together biologists, computer scientists, environmental scientists, and staff of the Advanced Photon Source to develop complete maps of cellular function. Second, we cultivate specific core competencies in science and technology; this Institutional Plan discusses the many ways in which our core competencies support DOE's four mission areas. Third, we serve the scientific community by designing, building, and operating world-class user facilities, such as the Advanced Photon Source, the Intense Pulsed Neutron Source, and the Argonne Tandem-Linac Accelerator System. This Plan summarizes the visions, missions, and strategic plans for the Laboratory's existing major user facilities, and it explains our approach to the planned Rare Isotope Accelerator. Fourth, we help develop the next generation of scientists and engineers through educational programs, many of which involve bright young people in research. This Plan summarizes our vision, objectives, and strategies in the education area, and it gives statistics on student and faculty participation. Finally, we

  20. Statistical framework for evaluation of climate model simulations by use of climate proxy data from the last millennium – Part 3: Practical considerations, relaxed assumptions, and using tree-ring data to address the amplitude of solar forcing

    Directory of Open Access Journals (Sweden)

    A. Moberg

    2014-06-01

    Practical issues arise when applying a statistical framework for unbiased ranking of alternative forced climate model simulations by comparison with climate observations from instrumental and proxy data (Part 1 in this series). Given a set of model and observational data, several decisions need to be made, e.g. concerning the region that each proxy series represents, the weighting of different regions, and the time resolution to use in the analysis. Objective selection criteria cannot be given here, but we argue for studying how sensitive the results are to the choices made. The framework is improved by the relaxation of two assumptions: to allow autocorrelation in the statistical model for simulated climate variability, and to enable direct comparison of alternative simulations to test whether any of them fits the observations significantly better. The extended framework is applied to a set of simulations driven with forcings for the pre-industrial period 1000–1849 CE and fifteen tree-ring-based temperature proxy series. Simulations run with only one external forcing (land-use, volcanic, small-amplitude solar, or large-amplitude solar) do not significantly capture the variability in the tree-ring data, although the simulation with volcanic forcing does so for some experiment settings. When all forcings are combined (using either the small- or large-amplitude solar forcing, including also orbital, greenhouse-gas and non-volcanic aerosol forcing), and additionally used to produce small simulation ensembles starting from slightly different initial ocean conditions, the resulting simulations are highly capable of capturing some observed variability. Nevertheless, for some choices in the experiment design, they are not significantly closer to the observations than when unforced simulations are used, due to highly variable results between regions. It is also not possible to tell whether the small-amplitude or large-amplitude solar forcing causes the multiple
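
    Stripped of the proxy-noise and autocorrelation treatment that is the framework's actual contribution, the ranking step amounts to scoring each candidate simulation against the observations at the chosen time resolution. A deliberately simplified sketch, with all series synthetic:

      import numpy as np

      def rank(simulations, observations):
          """Order candidate simulations by mean-squared distance to the
          observations (smaller is better)."""
          scores = {name: float(np.mean((series - observations) ** 2))
                    for name, series in simulations.items()}
          return sorted(scores, key=scores.get)

      rng = np.random.default_rng(3)
      truth = np.cumsum(rng.normal(0.0, 0.05, 28))       # 28 synthetic 30-yr means
      proxy = truth + rng.normal(0.0, 0.10, truth.size)  # noisy observations
      candidates = {
          "all-forcings": truth + rng.normal(0.0, 0.05, truth.size),
          "unforced": rng.normal(0.0, 0.20, truth.size),
      }
      print(rank(candidates, proxy))  # expect 'all-forcings' first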

  1. Digital Polygon Model Grid of the Hydrogeologic Framework of Bedrock Units for a Simulation of Groundwater Flow for the Lake Michigan Basin

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — The hydrogeologic framework for the Lake Michigan Basin model was developed by grouping the bedrock geology of the study area into hydrogeologic units on the basis...

  2. Argonne National Lab deploys Force10 networks' massively dense ethernet switch for supercomputing cluster

    CERN Multimedia

    2003-01-01

    "Force10 Networks, Inc. today announced that Argonne National Laboratory (Argonne, IL) has successfully deployed Force10 E-Series switch/routers to connect to the TeraGrid, the world's largest supercomputing grid, sponsored by the National Science Foundation (NSF)" (1/2 page).

  3. Frontiers: Research Highlights 1946-1996 [50th Anniversary Edition. Argonne National Laboratory

    Science.gov (United States)

    1996-01-01

    This special edition of 'Frontiers' commemorates Argonne National Laboratory's 50th anniversary of service to science and society. America's first national laboratory, Argonne has been in the forefront of U.S. scientific and technological research from its beginning. Past accomplishments, current research, and future plans are highlighted.

  4. Frontiers: Research highlights 1946-1996 [50th Anniversary Edition. Argonne National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-12-31

    This special edition of 'Frontiers' commemorates Argonne National Laboratory's 50th anniversary of service to science and society. America's first national laboratory, Argonne has been in the forefront of U.S. scientific and technological research from its beginning. Past accomplishments, current research, and future plans are highlighted.

  5. SIMULATION MODEL RESOURCE SEARCH FRAMEWORK BASED ON SEMANTICS DESCRIPTION OF CONCEPTUAL MODEL

    Institute of Scientific and Technical Information of China (English)

    康晓予; 邓贵仕

    2011-01-01

    Constructing new simulation applications by reusing existing models has long received attention in the system simulation field. A key issue in realising such reuse is to search a model database for simulation model resources that match the needs of the application. This paper proposes a search framework for simulation model resources based on a semantic description of the conceptual model and describes the structure of the framework in detail. The framework builds a semantic description model for simulation model resources from conceptual model elements such as entities, tasks, and interactions, and employs search strategies combining ontology semantics and keyword matching. Simulation experiments indicate that the framework can considerably improve the accuracy of search and matching.
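
    The two search strategies named above, ontology semantics and keyword matching, can be combined in a few lines: query terms are expanded through the ontology before being matched against the catalogued conceptual-model elements. A toy sketch with an invented catalogue and ontology, not the paper's actual data model:

      ONTOLOGY = {
          "aircraft": {"fighter", "uav"},
          "radar": {"sensor", "phased array"},
      }

      CATALOGUE = {
          "model-001": {"fighter", "radar", "detect"},
          "model-002": {"ship", "navigate"},
      }

      def expand(terms):
          """Add ontology neighbours of the query terms (semantic match)."""
          out = set(terms)
          for concept, related in ONTOLOGY.items():
              if concept in out or related & out:
                  out |= related | {concept}
          return out

      def search(terms):
          query = expand(set(terms))
          hits = {name: len(query & elements)
                  for name, elements in CATALOGUE.items()}
          return [name for name, n in sorted(hits.items(), key=lambda kv: -kv[1]) if n]

      print(search({"aircraft", "detect"}))  # -> ['model-001']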

  6. Using the C4ISR Architecture Framework as a Tool to Facilitate VV&A for Simulation Systems within the Military Application Domain

    CERN Document Server

    Tolk, Andreas

    2010-01-01

    To harmonize the individual architectures of the different commands, services, and agencies dealing with the development and procurement of Command, Control, Communications, Computers, Intelligence, Surveillance, and Reconnaissance (C4ISR) systems, the C4ISR Architecture Framework was developed based on existing and mature modeling techniques and methods. Within a short period, NATO adopted this method family as the NATO Consultation, Command, and Control (C3) System Architecture Framework to harmonize the efforts of the different nations. Based on these products, for every system to be fielded and used in the US Armed Forces, a C4I Support Plan (C4ISP) has to be developed, enabling the integration of the special system into the integrated C4I Architecture. The tool set proposed by these architecture frameworks connects the operational views of the military user, the system views of the developers, and the technical views for standards and integration methods needed to make the network-centric system of systems work...

  7. Research in mathematics and computer science at Argonne

    Energy Technology Data Exchange (ETDEWEB)

    Pieper, G.W.

    1989-08-01

    This report reviews the research activities in the Mathematics and Computer Science Division at Argonne National Laboratory for the period January 1988 - August 1989. The body of the report gives a brief look at the MCS staff and the research facilities, and discusses various projects carried out in two major areas of research: analytical and numerical methods and advanced computing concepts. Projects funded by non-DOE sources are also discussed, and new technology transfer activities are described. Further information on division staff, visitors, workshops, and seminars is found in the appendices.

  8. 1985 annual site environmental report for Argonne National Laboratory

    International Nuclear Information System (INIS)

    This is one in a series of annual reports prepared to provide DOE, environmental agencies, and the public with information on the level of radioactive and chemical pollutants in the environment and on the amounts of such substances, if any, added to the environment as a result of Argonne operations. Included in this report are the results of measurements obtained in 1985 for a number of radionuclides in air, surface water, ground water, soil, grass, bottom sediment, and milk; for a variety of chemical constituents in surface and subsurface water; and for the external penetrating radiation dose

  9. Argonne National Laboratory monthly progress report, April 1952

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1952-04-01

    This progress report from the Argonne National Laboratory covers the work in Biological and Medical Research, Radiological Physics, and Health Services for the quarterly period ending March 31, 1952. Numerous experiments were conducted in an attempt to answer some of the questions arising from exposure to ionizing radiation, especially X radiation. Some of the research involved the radiosensitivity of cells and some involved animals. The effects of radium in humans were also evaluated. Other studies were performed in biology, such as the effect of photoperiodism on plant growth and the biological effects of beryllium.

  10. Change in argonne national laboratory: a case study.

    Science.gov (United States)

    Mozley, A

    1971-10-01

    Despite traditional opposition to change within an institution and the known reluctance of an "old guard" to accept new managerial policies and techniques, the reactions suggested in this study go well beyond the level of a basic resistance to change. The response, indeed, drawn from a random sampling of Laboratory scientific and engineering personnel, comes close to what Philip Handler has recently described as a run on the scientific bank in a period of depression (1, p. 146). It appears that Argonne's apprehension stems less from the financial cuts that have reduced staff and diminished programs by an annual 10 percent across the last 3 fiscal years than from the administrative and conceptual changes that have stamped the institution since 1966. Administratively, the advent of the AUA has not forged a sense of collaborative effort implicit in the founding negotiations or contributed noticeably to increasing standards of excellence at Argonne. The AUA has, in fact, yet to exercise the constructive powers vested in it by the contract of reviewing and formulating long-term policy on the research and reactor side. Additionally, the University of Chicago, once the single operator, appears to have forfeited some of the trust and understanding that characterized the Laboratory's attitude to it in former years. In a period of complex and sensitive management the present directorate at Argonne is seriously dissociated from a responsible spectrum of opinion within the Laboratory. The crux of discontent among the creative scientific and engineering community appears to lie in a developed sense of being overadministered. In contrast to earlier periods, Argonne's professional staff feels a critical need for a voice in the formulation of Laboratory programs and policy. The Argonne senate could supply this mechanism. Slow to rally, their present concern springs from a firm conviction that the Laboratory is "withering on the vine." By contrast, the Laboratory director Powers

  11. Microscale chemistry technology exchange at Argonne National Laboratory - east.

    Energy Technology Data Exchange (ETDEWEB)

    Pausma, R.

    1998-06-04

    The Division of Educational Programs (DEP) at Argonne National Laboratory-East interacts with the education community at all levels to improve science and mathematics education and to provide resources to instructors of science and mathematics. DEP conducts a wide range of educational programs and has established an enormous audience of teachers, both in the Chicago area and nationally. DEP has brought microscale chemistry to the attention of this huge audience. This effort has been supported by the U.S. Department of Energy through the Environmental Management Operations organization within Argonne. Microscale chemistry is a teaching methodology wherein laboratory chemistry training is provided to students while utilizing very small amounts of reagents and correspondingly small apparatus. The techniques enable a school to reduce significantly the cost of reagents, the cost of waste disposal and the dangers associated with the manipulation of chemicals. The cost reductions are achieved while still providing the students with the hands-on laboratory experience that is vital to students who might choose to pursue careers in the sciences. Many universities and colleges have already begun to switch from macroscale to microscale chemistry in their educational laboratories. The introduction of these techniques at the secondary education level will lead to freshmen being better prepared for the type of experimentation that they will encounter in college.

  12. Draft environmental assessment of Argonne National Laboratory, East

    International Nuclear Information System (INIS)

    This environmental assessment of the operation of the Argonne National Laboratory is related to continuation of research and development work being conducted at the Laboratory site at Argonne, Illinois. The Laboratory has been monitoring various environmental parameters both offsite and onsite since 1949. Meteorological data have been collected to support development of models for atmospheric dispersion of radioactive and other pollutants. Gaseous and liquid effluents, both radioactive and non-radioactive, have been measured by portable monitors and by continuous monitors at fixed sites. Monitoring of constituents of the terrestrial ecosystem provides a basis for identifying changes should they occur in this regime. The Laboratory has established a position of leadership in monitoring methodologies and their application. Offsite impacts of nonradiological accidents are primarily those associated with the release of chlorine and with sodium fires. Both result in releases that cause no health damage offsite. Radioactive materials released to the environment result in a cumulative dose to persons residing within 50 miles of the site of about 47 man-rem per year, compared to an annual total of about 950,000 man-rem delivered to the same population from natural background radiation. 100 refs., 17 figs., 33 tabs

  13. Draft environmental assessment of Argonne National Laboratory, East

    Energy Technology Data Exchange (ETDEWEB)

    1975-10-01

    This environmental assessment of the operation of the Argonne National Laboratory is related to continuation of research and development work being conducted at the Laboratory site at Argonne, Illinois. The Laboratory has been monitoring various environmental parameters both offsite and onsite since 1949. Meteorological data have been collected to support development of models for atmospheric dispersion of radioactive and other pollutants. Gaseous and liquid effluents, both radioactive and non-radioactive, have been measured by portable monitors and by continuous monitors at fixed sites. Monitoring of constituents of the terrestrial ecosystem provides a basis for identifying changes should they occur in this regime. The Laboratory has established a position of leadership in monitoring methodologies and their application. Offsite impacts of nonradiological accidents are primarily those associated with the release of chlorine and with sodium fires. Both result in releases that cause no health damage offsite. Radioactive materials released to the environment result in a cumulative dose to persons residing within 50 miles of the site of about 47 man-rem per year, compared to an annual total of about 950,000 man-rem delivered to the same population from natural background radiation. 100 refs., 17 figs., 33 tabs.

  14. Monte Carlo simulations versus experimental measurements in a small animal PET system. A comparison in the NEMA NU 4-2008 framework

    Science.gov (United States)

    Popota, F. D.; Aguiar, P.; España, S.; Lois, C.; Udias, J. M.; Ros, D.; Pavia, J.; Gispert, J. D.

    2015-01-01

    In this work a comparison between experimental and simulated data using GATE and PeneloPET Monte Carlo simulation packages is presented. All simulated setups, as well as the experimental measurements, followed exactly the guidelines of the NEMA NU 4-2008 standards using the microPET R4 scanner. The comparison was focused on spatial resolution, sensitivity, scatter fraction and counting rates performance. Both GATE and PeneloPET showed reasonable agreement for the spatial resolution when compared to experimental measurements, although they led to slight underestimations for the points close to the edge. High accuracy was obtained between experiments and simulations of the system’s sensitivity and scatter fraction for an energy window of 350-650 keV, as well as for the counting rate simulations. The latter was the most complicated test to perform since each code demands different specifications for the characterization of the system’s dead time. Although simulated and experimental results were in excellent agreement for both simulation codes, PeneloPET demanded more information about the behavior of the real data acquisition system. To our knowledge, this constitutes the first validation of these Monte Carlo codes for the full NEMA NU 4-2008 standards for small animal PET imaging systems.

  15. Monte Carlo simulations versus experimental measurements in a small animal PET system. A comparison in the NEMA NU 4-2008 framework

    International Nuclear Information System (INIS)

    In this work a comparison between experimental and simulated data using GATE and PeneloPET Monte Carlo simulation packages is presented. All simulated setups, as well as the experimental measurements, followed exactly the guidelines of the NEMA NU 4-2008 standards using the microPET R4 scanner. The comparison was focused on spatial resolution, sensitivity, scatter fraction and counting rates performance. Both GATE and PeneloPET showed reasonable agreement for the spatial resolution when compared to experimental measurements, although they led to slight underestimations for the points close to the edge. High accuracy was obtained between experiments and simulations of the system’s sensitivity and scatter fraction for an energy window of 350–650 keV, as well as for the counting rate simulations. The latter was the most complicated test to perform since each code demands different specifications for the characterization of the system’s dead time. Although simulated and experimental results were in excellent agreement for both simulation codes, PeneloPET demanded more information about the behavior of the real data acquisition system. To our knowledge, this constitutes the first validation of these Monte Carlo codes for the full NEMA NU 4-2008 standards for small animal PET imaging systems. (paper)

  16. Monte-Carlo simulation of colliding particles or coalescing droplets transported by a turbulent flow in the framework of a joint fluid–particle pdf approach

    OpenAIRE

    Fede, Pascal; Simonin, Olivier; Villedieu, Philippe

    2015-01-01

    The aim of the paper is to introduce and validate a Monte-Carlo algorithm for the prediction of an ensemble of colliding solid particles, or coalescing liquid droplets, suspended in a turbulent gas flow predicted by the Reynolds-Averaged Navier-Stokes (RANS) approach. The new algorithm is based on the direct discretization of the collision/coalescence kernel derived in the framework of a joint fluid-particle pdf approach proposed by Simonin et al. (2002). This approach makes it possible to take into account...

  17. Statistical framework for evaluation of climate model simulations by use of climate proxy data from the last millennium – Part 2: A pseudo-proxy study addressing the amplitude of solar forcing

    Directory of Open Access Journals (Sweden)

    A. Hind

    2012-08-01

    The statistical framework of Part 1 (Sundberg et al., 2012), for comparing ensemble simulation surface temperature output with temperature proxy and instrumental records, is implemented in a pseudo-proxy experiment. A set of previously published millennial forced simulations (Max Planck Institute – COSMOS), including both "low" and "high" solar radiative forcing histories together with other important forcings, was used to define "true" target temperatures as well as pseudo-proxy and pseudo-instrumental series. In a global land-only experiment, using annual mean temperatures at a 30-yr time resolution with realistic proxy noise levels, it was found that the low and high solar full-forcing simulations could be distinguished. In an additional experiment, where pseudo-proxies were created to reflect a current set of proxy locations and noise levels, the low and high solar forcing simulations could only be distinguished when the latter served as targets. To improve detectability of the low solar simulations, increasing the signal-to-noise ratio in local temperature proxies was more efficient than increasing the spatial coverage of the proxy network. The experiences gained here will be of guidance when these methods are applied to real proxy and instrumental data, for example when the aim is to distinguish which of the alternative solar forcing histories is most compatible with the observed/reconstructed climate.
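
    The core pseudo-proxy operation, degrading a simulated temperature series with noise scaled to a chosen signal-to-noise ratio and then averaging to the analysis resolution, is simple to state in code. A sketch under an assumed standard-deviation-ratio SNR convention (other conventions exist), with a synthetic signal standing in for model output:

      import numpy as np

      def make_pseudo_proxy(signal, snr, rng):
          """Add white noise so that std(noise) = std(signal) / snr."""
          return signal + rng.normal(0.0, signal.std() / snr, signal.size)

      rng = np.random.default_rng(1)
      years = np.arange(1000, 1850)                      # 850 model years
      truth = 0.2 * np.sin(2.0 * np.pi * years / 210.0)  # toy forced signal
      proxy = make_pseudo_proxy(truth, snr=0.5, rng=rng)

      # Average to 30-yr resolution, as in the experiment described above
      truth_30 = truth[:840].reshape(-1, 30).mean(axis=1)
      proxy_30 = proxy[:840].reshape(-1, 30).mean(axis=1)
      print(f"correlation at 30-yr resolution: {np.corrcoef(truth_30, proxy_30)[0, 1]:.2f}")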

  18. The RD53 Collaboration's SystemVerilog-UVM Simulation Framework and its General Applicability to Design of Advanced Pixel Readout Chips

    OpenAIRE

    Marconi, S; Conti, E; Placidi, P; Christiansen, J.; Hemperek, T.

    2014-01-01

    The foreseen Phase 2 pixel upgrades at the LHC have very challenging requirements for the design of hybrid pixel readout chips. A versatile pixel simulation platform is an essential development tool for the design, verification and optimization of both the system architecture and the pixel chip building blocks (Intellectual Properties, IPs). This work is focused on the implemented simulation and verification environment named VEPIX53, built using the SystemVerilog language and the Universal Verification Methodology (UVM)...

  19. Toward a Proof of Concept Cloud Framework for Physics Applications on Blue Gene Supercomputers

    Science.gov (United States)

    Dreher, Patrick; Scullin, William; Vouk, Mladen

    2015-09-01

    Traditional high performance supercomputers are capable of delivering large sustained state-of-the-art computational resources to physics applications over extended periods of time using batch processing mode operating environments. However, today there is an increasing demand for more complex workflows that involve large fluctuations in the levels of HPC physics computational requirements during the simulations. Some of the workflow components may also require a richer set of operating system features and schedulers than normally found in a batch oriented HPC environment. This paper reports on progress toward a proof of concept design that implements a cloud framework onto BG/P and BG/Q platforms at the Argonne Leadership Computing Facility. The BG/P implementation utilizes the Kittyhawk utility and the BG/Q platform uses an experimental heterogeneous FusedOS operating system environment. Both platforms use the Virtual Computing Laboratory as the cloud computing system embedded within the supercomputer. This proof of concept design allows a cloud to be configured so that it can capitalize on the specialized infrastructure capabilities of a supercomputer and the flexible cloud configurations without resorting to virtualization. Initial testing of the proof of concept system is done using the lattice QCD MILC code. These types of user reconfigurable environments have the potential to deliver experimental schedulers and operating systems within a working HPC environment for physics computations that may be different from the native OS and schedulers on production HPC supercomputers.

  20. A framework for geometry acquisition, 3-D printing, simulation, and measurement of head-related transfer functions with a focus on hearing-assistive devices

    DEFF Research Database (Denmark)

    Harder, Stine; Paulsen, Rasmus Reinhold; Larsen, Martin;

    2016-01-01

    of a three-dimensional (3D) head model for acquisition of individual HRTFs. Two aspects were investigated; whether a 3D-printed model can replace measurements on a human listener and whether numerical simulations can replace acoustic measurements. For this purpose, HRTFs were acoustically measured for four...... human listeners and for a 3D printed head model of one of these listeners. Further, HRTFs were simulated by applying the finite element method to the 3D head model. The monaural spectral features and spectral distortions were very similar between re-measurements and between human and printed...

  1. Users Handbook for the Argonne Premium Coal Sample Program

    Energy Technology Data Exchange (ETDEWEB)

    Vorres, K.S.

    1993-10-01

    This Users Handbook for the Argonne Premium Coal Samples provides the recipients of those samples with information that will enhance the value of the samples, permit greater opportunities to compare their work with that of others, and aid in correlations that can improve the value to all users. It is hoped that this document will foster a spirit of cooperation and collaboration so that the field of basic coal chemistry may be a more efficient and rewarding endeavor for all who participate. The different sections are intended to stand alone; for this reason, some of the information may be found in several places. The handbook is also intended to be a dynamic document, constantly subject to change through additions and improvements. Please feel free to write to the editor with your comments and suggestions.

  2. The Argonne Wakefield Accelerator Facility Status and Recent Activities

    CERN Document Server

    Conde, Manoel; Gai, Wei; Jing, Chunguang; Konecny, Richard; Liu Wan Ming; Power, John G; Wang, Haitao; Yusof, Zikri

    2005-01-01

    The Argonne Wakefield Accelerator Facility (AWA) is dedicated to the study of electron beam physics and the development of accelerating structures based on electron beam driven wakefields. In order to carry out these studies, the facility employs a photocathode RF gun capable of generating electron beams with high bunch charges (up to 100 nC) and short bunch lengths. This high intensity beam is used to excite wakefields in the structures under investigation. The wakefield structures presently under development are dielectric loaded cylindrical waveguides with operating frequencies of 7.8 or 15.6 GHz. The facility is also used to investigate the generation and propagation of high brightness electron beams. Presently under investigation is the use of photons with energies lower than the work function of the cathode surface (Schottky-enabled photoemission), aimed at generating electron beams with low thermal emittance. Novel electron beam diagnostics are also developed and tested at the facility. The AWA electr...

  3. Beam measurements on Argonne linac for collider injector design

    International Nuclear Information System (INIS)

    The 20 MeV electron linac at Argonne produces 5 x 10{sup 10} electrons in a single bunch. This amount of charge per bunch is required for the proposed single pass collider at SLAC. For this reason the characteristics of the beam from this machine are of interest. The longitudinal charge distribution has been measured by a new technique. The technique is a variation on the deduction of bunch shape from a spectrum measurement. Under favorable conditions a resolution of about 1 deg of phase is possible, which is considerably better than can be achieved with streak cameras. The bunch length at 4.5 x 10{sup 10} e{sup -} per bunch was measured to be 15 deg FWHM. The transverse emittance has also been measured using standard techniques. The emittance is 16 mm-mrad at 17.2 MeV. (Auth.)

  4. Beam measurements on Argonne linac for collider injector design

    International Nuclear Information System (INIS)

    The 20 MeV electron linac at Argonne produces 5 x 10{sup 10} electrons in a single bunch. This amount of charge per bunch is required for the proposed single pass collider at SLAC. For this reason the characteristics of the beam from this machine are of interest. The longitudinal charge distribution has been measured by a new technique. The technique is a variation on the deduction of bunch shape from a spectrum measurement. Under favorable conditions a resolution of about 1 deg of phase is possible, which is considerably better than can be achieved with streak cameras. The bunch length at 4.5 x 10{sup 10} e{sup -} per bunch was measured to be 15 deg FWHM. The transverse emittance has also been measured using standard techniques. The emittance is 16 mm-mrad at 17.2 MeV

  5. Status of the Advanced Photon Source at Argonne National Laboratory

    International Nuclear Information System (INIS)

    The Advanced Photon Source at Argonne National Laboratory is a third-generation light source optimized for production of high-brilliance undulator radiation in the hard x-ray portion of the spectrum. A user community representing all major centers of synchrotron research, including universities, industry, and federal laboratories, will utilize these x-ray beams for investigations across a diverse range of disciplines. All technical facilities and components required for operations have been completed and installed, and are well along in the commissioning process. Major design goals and Department of Energy milestones have been met or exceeded. Project funds have been maximized to construct a number of beamline components and user facilities over and above those called for in the original project scope. Research teams preparing experimental apparatus at the Advanced Photon Source have procured strong funding support. copyright 1996 American Institute of Physics

  6. Sodium carbonate facility at Argonne National Laboratory-West

    International Nuclear Information System (INIS)

    The Sodium Carbonate Facility, located at Argonne National Laboratory - West (ANL-W) in Idaho, was designed and built as an addition to the existing Sodium Processing Facility. The Sodium Process and Sodium Carbonate Facilities will convert radioactive sodium into a product that is acceptable for land disposal in Idaho. The first part of the process occurs in the Sodium Process Facility where radioactive sodium is converted into sodium hydroxide (caustic). The second part of the process occurs in the Sodium Carbonate Facility where the caustic solution produced in the Sodium Process Facility is converted into a dry sodium carbonate waste suitable for land disposal. Due to the radioactivity in the sodium, shielding, containment, and HEPA filtered off-gas systems are required throughout both processes

  7. Characterisation and testing of a prototype $6 \\times 6$ cm$^2$ Argonne MCP-PMT

    CERN Document Server

    Cowan, Greig A; Needham, Matthew; Gambetta, Silvia; Eisenhardt, Stephan; McBlane, Neil; Malek, Matthew

    2016-01-01

    The Argonne micro-channel plate photomultiplier tube (MCP-PMT) is an offshoot of the Large Area Pico-second Photo Detector (LAPPD) project, wherein $6 \\times 6$ cm$^2$ sized detectors are made at Argonne National Laboratory. Measurements of the properties of these detectors, including gain, time and spatial resolution, dark count rates, cross-talk and sensitivity to magnetic fields are reported. In addition, possible applications of these devices in future neutrino and collider physics experiments are discussed.

  8. Use of the Microdosimetric Kinetic Model (MKM) for the interpretation of cell irradiation in the framework of hadron therapy: application of Monte-Carlo simulations

    International Nuclear Information System (INIS)

    Hadron-therapy is a cancer treatment method based on the use of heavy charged particles. The physical characteristics of these particles allow more precise targeting of tumours and offer higher biological efficiency than photons and electrons. This thesis addresses the problem of modelling the biological effects induced by such particles. One part of this work is devoted to the analysis of the Monte-Carlo simulation toolkit 'Geant4', used to simulate the physical stage of the particle interactions with the biological medium. We evaluated the ability of 'Geant4' to simulate the microscopic distribution of energy deposition produced by charged particles and compared these results with those of another simulation code dedicated to radiobiological applications. The other part of the work is dedicated to the study of two radiobiological models: the LEM (Local Effect Model), based on an amorphous track structure approach, and the MKM (Microdosimetric Kinetic Model), based on a microdosimetric approach. A theoretical analysis of both models and a comparison of their concepts are presented. We then focused on a detailed analysis of the MKM. Finally, we applied the MKM to reproduce the experimental results obtained at GANIL by irradiating two tumour cell lines of different radiosensitivity (SCC61 and SQ20B) with carbon and argon ions. (author)

  9. Using E-Z Reader to Simulate Eye Movements in Nonreading Tasks: A Unified Framework for Understanding the Eye-Mind Link

    Science.gov (United States)

    Reichle, Erik D.; Pollatsek, Alexander; Rayner, Keith

    2012-01-01

    Nonreading tasks that share some (but not all) of the task demands of reading have often been used to make inferences about how cognition influences when the eyes move during reading. In this article, we use variants of the E-Z Reader model of eye-movement control in reading to simulate eye-movement behavior in several of these tasks, including…

  10. Science based integrated approach to advanced nuclear fuel development - integrated multi-scale multi-physics hierarchical modeling and simulation framework Part III: cladding

    Energy Technology Data Exchange (ETDEWEB)

    Tome, Carlos N [Los Alamos National Laboratory; Caro, J A [Los Alamos National Laboratory; Lebensohn, R A [Los Alamos National Laboratory; Unal, Cetin [Los Alamos National Laboratory; Arsenlis, A [LLNL; Marian, J [LLNL; Pasamehmetoglu, K [INL

    2010-01-01

    Advancing the performance of Light Water Reactors, Advanced Nuclear Fuel Cycles, and Advanced Reactors, such as the Next Generation Nuclear Power Plants, requires enhancing our fundamental understanding of fuel and materials behavior under irradiation. The capability to accurately model the nuclear fuel systems to develop predictive tools is critical. Not only are fabrication and performance models needed to understand specific aspects of the nuclear fuel, fully coupled fuel simulation codes are required to achieve licensing of specific nuclear fuel designs for operation. The backbone of these codes, models, and simulations is a fundamental understanding and predictive capability for simulating the phase and microstructural behavior of the nuclear fuel system materials and matrices. In this paper we review the current status of the advanced modeling and simulation of nuclear reactor cladding, with emphasis on what is available and what is to be developed in each scale of the project, how we propose to pass information from one scale to the next, and what experimental information is required for benchmarking and advancing the modeling at each scale level.

  11. Kinetic model framework for aerosol and cloud surface chemistry and gas-particle interactions ─ Part 2: Exemplary practical applications and numerical simulations

    Directory of Open Access Journals (Sweden)

    M. Ammann

    2007-12-01

    A kinetic model framework with consistent and unambiguous terminology and universally applicable rate equations and parameters for aerosol and cloud surface chemistry and gas-particle interactions has been presented in the preceding companion paper by Pöschl, Rudich and Ammann (Pöschl et al., 2007; abbreviated PRA). It allows one to describe mass transport and chemical reaction at the gas-particle interface and to link aerosol and cloud surface processes with gas-phase and particle bulk processes. Here we present multiple exemplary model systems and calculations illustrating how the general mass balance and rate equations of the PRA framework can be easily reduced to compact sets of equations which enable a mechanistic description of time and concentration dependencies of trace gas uptake and particle composition in systems with one or more chemical components and physicochemical processes. Time-dependent model scenarios show the effects of reversible adsorption, surface-bulk transport, and chemical aging on the temporal evolution of trace gas uptake by solid particles and solubility saturation of liquid particles. They demonstrate how the transformation of particles and the variation of trace gas accommodation and uptake coefficients by orders of magnitude over time scales of microseconds to days can be explained and predicted from the initial composition and basic kinetic parameters of model systems by iterative calculations using standard spreadsheet programs. Moreover, they show how apparently inconsistent experimental data sets obtained with different techniques and on different time scales can be efficiently linked and mechanistically explained by application of consistent model formalisms and terminologies within the PRA framework. Steady-state model scenarios illustrate characteristic effects of gas phase composition and basic kinetic parameters on the rates of mass transport and chemical reactions. They demonstrate how adsorption and
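
    The "iterative calculations using standard spreadsheet programs" mentioned above amount to explicit time stepping of compact rate equations. A sketch for the simplest such system, reversible Langmuir adsorption, with invented rate coefficients rather than values from the paper:

      # Spreadsheet-style explicit iteration of the simplest PRA-type
      # system: reversible Langmuir adsorption. Rate coefficients are
      # invented toy values:  d(theta)/dt = k_a*(1 - theta) - k_d*theta
      k_a, k_d = 2.0e-2, 5.0e-3   # effective ad-/desorption rates, s^-1
      theta, dt = 0.0, 1.0        # fractional surface coverage, step (s)
      for step in range(600):
          theta += (k_a * (1.0 - theta) - k_d * theta) * dt
          if step % 150 == 0:
              # net uptake slows as the surface fills: gamma ~ (1 - theta)
              print(f"t = {step * dt:5.0f} s   theta = {theta:.3f}")
      print(f"steady state {theta:.3f} vs analytic {k_a / (k_a + k_d):.3f}")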

  12. Design of Electronic Warfare Simulation System Framework Based on Semi-autonomous Agent

    Institute of Scientific and Technical Information of China (English)

    成晓鹏; 齐锋; 王枭

    2016-01-01

    In order to reproduce the process of real combat in an electronic warfare simulation system, a layered semi-autonomous Agent system framework is designed. Starting from the design of electronic warfare simulation entities, a method for building entity Agents based on the BDI framework is proposed, and the advantages of semi-autonomous Agent technology for combat simulation are examined, clarifying how individual Agents are designed. Based on the classification of simulation entities in the electronic warfare simulation system, a layered semi-autonomous Agent structure is established and the functions of Agents at different layers are analysed. This work shows how the semi-autonomous Agent technique can be applied to electronic warfare simulation and is of practical significance.
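
    A BDI (belief-desire-intention) entity of the kind described can be reduced to a perceive/deliberate/act cycle in which a higher layer may override the agent's own choice, which is what makes it semi-autonomous. A toy sketch; the class and its behaviours are invented for illustration, not taken from the paper:

      # Toy BDI-style entity: beliefs updated from perception, a desire
      # chosen by deliberation, and an intention that a higher layer can
      # override (the "semi-autonomous" part). All names are invented.

      class JammerAgent:
          def __init__(self):
              self.beliefs = {"threat_detected": False}

          def perceive(self, radar_contacts):
              self.beliefs["threat_detected"] = bool(radar_contacts)

          def deliberate(self):
              # Desires ranked by a fixed priority rule.
              return "jam" if self.beliefs["threat_detected"] else "search"

          def act(self, override=None):
              # A command from a higher layer pre-empts the agent's choice.
              return override or self.deliberate()

      agent = JammerAgent()
      agent.perceive(radar_contacts=["contact-1"])
      print(agent.act())                      # autonomous -> 'jam'
      print(agent.act(override="withdraw"))   # higher-layer override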

  13. V&V framework

    Energy Technology Data Exchange (ETDEWEB)

    Hills, Richard G. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Maniaci, David Charles [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Naughton, Jonathan W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

    A Verification and Validation (V&V) framework is presented for the development and execution of coordinated modeling and experimental programs to assess the predictive capability of computational models of complex systems through focused, well-structured, and formal processes. The elements of the framework are based on established V&V methodology developed by various organizations including the Department of Energy, the National Aeronautics and Space Administration, the American Institute of Aeronautics and Astronautics, and the American Society of Mechanical Engineers. Four main topics are addressed: 1) program planning based on expert elicitation of the modeling physics requirements, 2) experimental design for model assessment, 3) uncertainty quantification for experimental observations and computational model simulations, and 4) assessment of the model predictive capability. The audience for this document includes program planners, modelers, experimentalists, V&V specialists, and customers of the modeling results.

  14. Identifying the origin of differences between 3D numerical simulations of ground motion in sedimentary basins: lessons from stringent canonical test models in the E2VP framework

    Science.gov (United States)

    Chaljub, Emmanuel; Maufroy, Emeline; Moczo, Peter; Kristek, Jozef; Priolo, Enrico; Klin, Peter; De Martin, Florent; Zhang, Zenghuo; Hollender, Fabrice; Bard, Pierre-Yves

    2013-04-01

    Numerical simulation is playing a role of increasing importance in the field of seismic hazard by providing quantitative estimates of earthquake ground motion, its variability, and its sensitivity to geometrical and mechanical properties of the medium. Continuous efforts to develop accurate and computationally efficient numerical methods, combined with increasing computational power have made it technically feasible to calculate seismograms in 3D realistic configurations and for frequencies of interest in seismic design applications. Now, in order to foster the use of numerical simulations in practical prediction of earthquake ground motion, it is important to evaluate the accuracy of current numerical methods when applied to realistic 3D sites. This process of verification is a necessary prerequisite to confrontation of numerical predictions and observations. Through the ongoing Euroseistest Verification and Validation Project (E2VP), which focuses on the Mygdonian basin (northern Greece), we investigated the capability of numerical methods to predict earthquake ground motion for frequencies up to 4 Hz. Numerical predictions obtained by several teams using a wide variety of methods were compared using quantitative goodness-of-fit criteria. In order to better understand the cause of misfits between different simulations, initially performed for the realistic geometry of the Mygdonian basin, we defined five stringent canonical configurations. The canonical models allow for identifying sources of misfits and quantify their importance. Detailed quantitative comparison of simulations in relation to dominant features of the models shows that even relatively simple heterogeneous models must be treated with maximum care in order to achieve sufficient level of accuracy. One important conclusion is that the numerical representation of models with strong variations (e.g. discontinuities) may considerably vary from one method to the other, and may become a dominant source of

  15. Framework faults

    Science.gov (United States)

    Vierkorn-Rudolph, Beatrix

    2009-02-01

    Your news story "Carbon-capture and gamma-ray labs top Euro wish list" (January p6) states that the European Strategy Forum for Research Infrastructures (ESFRI) has a budget of €1.7bn and is "part of the European Union's Seventh Framework Programme (FP7)". Neither of these statements is true. In fact, as vice-chair of the ESFRI, I should point out that it is an independent strategic forum where delegates (nominated and mandated by the research ministers of the member states and associated states of the European Community) jointly reflect on the development of strategic policies for pan-European research infrastructures. As the forum is an informal body, it does not have any funds.

  16. A novel framework to simulating non-stationary, non-linear, non-Normal hydrological time series using Markov Switching Autoregressive Models

    Science.gov (United States)

    Birkel, C.; Paroli, R.; Spezia, L.; Tetzlaff, D.; Soulsby, C.

    2012-12-01

    In this paper we present a novel model framework using the class of Markov Switching Autoregressive Models (MSARMs) to examine catchments as complex stochastic systems that exhibit non-stationary, non-linear and non-Normal rainfall-runoff and solute dynamics. MSARMs are pairs of stochastic processes, one observed and one unobserved, or hidden. We model the unobserved process as a finite-state Markov chain and assume that the observed process, given the hidden Markov chain, is conditionally autoregressive, which means that the current observation depends on its recent past (system memory). The model is fully embedded in a Bayesian analysis based on Markov Chain Monte Carlo (MCMC) algorithms for model selection and uncertainty assessment, whereby the autoregressive order and the dimension of the hidden Markov chain state-space are essentially self-selected. The hidden states of the Markov chain represent unobserved levels of variability in the observed process that may result from complex interactions of hydroclimatic variability on the one hand and catchment characteristics affecting water and solute storage on the other. To deal with non-stationarity, additional meteorological and hydrological time series, along with a periodic component, can be included in the MSARMs as covariates. This extension allows identification of potential underlying drivers of temporal rainfall-runoff and solute dynamics. We applied the MSAR model framework to streamflow and conservative tracer (deuterium and oxygen-18) time series from an intensively monitored 2.3 km2 experimental catchment in eastern Scotland. Statistical time series analysis, in the form of MSARMs, suggested that the streamflow and isotope tracer time series are not controlled by simple linear rules. MSARMs showed that the dependence of current observations on past inputs, seen in transport models often in the form of long-tailed travel time and residence time distributions, can be efficiently explained by
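
    The model class itself is easy to state generatively: a hidden Markov chain selects, at each time step, which set of autoregressive parameters produces the next observation. A minimal two-state simulation sketch with invented parameter values, not estimates from the catchment data:

      import numpy as np

      # Two hidden states switch the parameters of a conditional AR(1)
      # process; all parameter values are invented for illustration.
      rng = np.random.default_rng(42)
      P = np.array([[0.95, 0.05],    # hidden-state transition matrix
                    [0.10, 0.90]])
      mu    = [1.0, 4.0]             # level per state (e.g. low/high flow)
      phi   = [0.8, 0.3]             # AR(1) coefficient per state
      sigma = [0.2, 0.8]             # innovation s.d. per state

      state, y, states = 0, [mu[0]], []
      for t in range(1, 500):
          state = rng.choice(2, p=P[state])            # hidden chain step
          y.append(mu[state]
                   + phi[state] * (y[-1] - mu[state])  # conditional AR part
                   + rng.normal(0.0, sigma[state]))
          states.append(state)
      print(f"fraction of time in high-flow state: {np.mean(states):.2f}")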

  17. Toward Realistic Simulation of low-Level Clouds Using a Multiscale Modeling Framework With a Third-Order Turbulence Closure in its Cloud-Resolving Model Component

    Science.gov (United States)

    Xu, Kuan-Man; Cheng, Anning

    2010-01-01

    This study presents preliminary results from a multiscale modeling framework (MMF) with an advanced third-order turbulence closure in its cloud-resolving model (CRM) component. In the original MMF, the Community Atmosphere Model (CAM3.5) is used as the host general circulation model (GCM), and the System for Atmospheric Modeling with a first-order turbulence closure is used as the CRM for representing cloud processes in each grid box of the GCM. The results of annual and seasonal means and diurnal variability are compared between the modified and original MMFs and the CAM3.5. The global distributions of low-level cloud amounts and precipitation and the amounts of low-level clouds in the subtropics and middle-level clouds in mid-latitude storm track regions in the modified MMF show substantial improvement relative to the original MMF when both are compared to observations. Some improvements can also be seen in the diurnal variability of precipitation.

  18. Thermal large Eddy simulations and experiments in the framework of non-isothermal blowing; Simulations des grandes echelles thermiques et experiences dans le cadre d'effusion anisotherme

    Energy Technology Data Exchange (ETDEWEB)

    Brillant, G

    2004-06-15

    The aim of this work is to study thermal large-eddy simulations and to determine the impact of non-isothermal blowing on a turbulent boundary layer. An experimental study is also carried out in order to complete and validate the simulation results. First, we developed a turbulent inlet condition for the velocity and the temperature, which is necessary for the blowing simulations. We studied the asymptotic behavior of the velocity, the temperature and the turbulent heat fluxes from a large-eddy simulation point of view. We then considered dynamic models for the eddy diffusivity and simulated a turbulent channel flow with imposed wall temperature, imposed wall flux and adiabatic walls. The numerical and experimental studies of blowing allowed us to determine how a thermal turbulent boundary layer is modified as the blowing rate changes. We observed the consequences of blowing on the mean and rms profiles of velocity and temperature, as well as on the velocity-velocity and velocity-temperature correlations. Moreover, we noticed a growth of the turbulent structures in the boundary layer with blowing. (author)

  19. A new generic plant growth model framework (PMF): Simulating the distributed dynamic interaction of biomass production with water and nutrient fluxes

    Science.gov (United States)

    Multsch, Sebastian; Kraft, Philipp; Frede, Hans-Georg; Breuer, Lutz

    2010-05-01

    Today, crop models have widespread application in the natural sciences, because plant growth interacts with and modifies its environment. Transport processes involve water and nutrient uptake from the saturated and unsaturated zones of the pedosphere. Turnover processes include the conversion of dead root biomass into organic matter. Transpiration and the interception of radiation influence the energy exchange between atmosphere and biosphere. Many more feedback mechanisms might be of interest, including erosion, soil compaction or trace gas exchange. Most existing crop models have a closed structure and do not provide interfaces or code design elements for easy data transfer or process exchange with other models at runtime. Changes in the model structure, the inclusion of alternative process descriptions or the implementation of additional functionality require substantial coding. The same is true when models are upscaled from the field to the landscape or catchment scale. We therefore conclude that future integrated model development would benefit from a model structure that meets the following requirements: replaceability, expandability and independence. In addition to these requirements we also propose interactivity: models that are coupled are highly interacting and mutually dependent, i.e. a model should be open to influences from other independent models and react to them directly. Hence, a model that consists of building blocks seems reasonable. The aim of this study is to present a new type of crop model, the plant growth model framework PMF. The software concept follows an object-oriented approach and is designed with the Unified Modeling Language (UML). The model is implemented in Python, a high-level object-oriented programming language. Integration of the models with a setup script enables data transfer at the computer memory level and direct exchange of information
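
    The building-block idea, replaceable process components behind a common interface, can be sketched in a few lines of Python. The class names and process equations below are hypothetical toys, not PMF's actual design.

      from typing import Protocol

      class SoilInterface(Protocol):
          """Any replaceable soil component must provide this method."""
          def water_uptake(self, demand: float) -> float: ...

      class SimpleBucketSoil:
          """One interchangeable building block: a single storage bucket."""
          def __init__(self, storage: float):
              self.storage = storage
          def water_uptake(self, demand: float) -> float:
              supply = min(demand, self.storage)
              self.storage -= supply
              return supply

      class Plant:
          """Core growth model; it never sees the soil component's internals."""
          def __init__(self, soil: SoilInterface):
              self.soil = soil
              self.biomass = 1.0
          def step(self) -> None:
              demand = 0.05 * self.biomass        # toy transpiration demand
              water = self.soil.water_uptake(demand)
              self.biomass += 0.5 * water         # toy water-limited growth

      plant = Plant(SimpleBucketSoil(storage=10.0))   # swap in any other soil here
      for day in range(100):
          plant.step()
      print(f"final biomass: {plant.biomass:.2f}")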

  20. r.avaflow: An advanced open source computational framework for the GIS-based simulation of two-phase mass flows and process chains

    Science.gov (United States)

    Mergili, Martin; Fischer, Jan-Thomas; Fellin, Wolfgang; Ostermann, Alexander; Pudasaini, Shiva P.

    2015-04-01

    Geophysical mass flows cover a broad range of processes and process chains such as flows and avalanches of snow, soil, debris or rock, and their interactions with water bodies resulting in flood waves. Despite considerable efforts put into model development, the simulation, and therefore the appropriate prediction, of these types of events remains a major challenge owing to the complex material behaviour, strong phase interactions, process transformations and complex mountain topography. Sophisticated theories exist, but they have hardly been brought into practice yet. We fill this gap by developing a novel and unified high-resolution computational tool, r.avaflow, representing a comprehensive and advanced open source GIS simulation environment for geophysical mass flows. Based on the latest and most advanced two-phase physical-mathematical models, r.avaflow includes the following features: (i) it is suitable for a broad spectrum of mass flows such as rock, rock-ice and snow avalanches, glacial lake outburst floods, debris and hyperconcentrated flows, and even landslide-induced tsunamis and submarine landslides, as well as process chains involving more than one of these phenomena; (ii) it accounts for the real two-phase nature of many flow types: viscous fluids and solid particles are considered separately, with advanced mechanics and strong phase interactions; (iii) it is freely available and adaptable along with the GRASS GIS software. In the future, it will include the intrinsic topographic influences on the flow dynamics and morphology as well as an advanced approach to simulate the entrainment and deposition of solid and fluid material. As input, r.avaflow needs information on (a) the mountain topography, (b) the material properties and (c) the spatial distribution of the solid and fluid release masses, or one or more hydrographs of fluid and solid material. We demonstrate the functionalities and performance of r.avaflow using some generic and real-world examples.
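
    Purely to illustrate the kind of raster-based routing such GIS tools perform, the following Python toy spreads a release mass downslope over a digital elevation model. It is a single-phase caricature with invented rules, not r.avaflow's two-phase model.

      import numpy as np

      def spread_step(dem, mass, frac=0.5):
          """Move part of each cell's mass to its lowest 4-neighbour when that
          neighbour's flow surface (terrain + flow depth) is lower."""
          h = dem + mass
          out = mass.copy()
          rows, cols = dem.shape
          for i in range(1, rows - 1):
              for j in range(1, cols - 1):
                  if mass[i, j] == 0.0:
                      continue
                  nbrs = [(i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)]
                  k = min(nbrs, key=lambda n: h[n])
                  if h[k] < h[i, j]:
                      dm = frac * min(mass[i, j], (h[i, j] - h[k]) / 2)
                      out[i, j] -= dm
                      out[k] += dm
          return out

      # Inclined plane with a square release mass near the top edge
      dem = np.fromfunction(lambda i, j: 50.0 - i, (50, 50))
      mass = np.zeros_like(dem)
      mass[5:10, 22:28] = 2.0
      for _ in range(100):
          mass = spread_step(dem, mass)
      print("total mass conserved:", round(float(mass.sum()), 2))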

  1. iFit: a new data analysis framework. Applications for data reduction and optimization of neutron scattering instrument simulations with McStas

    DEFF Research Database (Denmark)

    Farhi, E.; Debab, Y.; Willendrup, Peter Kjær

    2014-01-01

    We present a new tool, iFit, which uses a single object class to hold any data set, and provides an extensive list of methods to import and export data, view, manipulate, apply mathematical operators, optimize problems and fit models to the data sets. Currently implemented using Matlab...... and noisy problems. These optimizers can then be used to fit models onto data objects, and optimize McStas instrument simulations. As an application, we propose a methodology to analyse neutron scattering measurements in a pure Monte Carlo optimization procedure using McStas and iFit. As opposed...... to the conventional data reduction and analysis procedures, this new methodology is able to intrinsically account for most of the experimental effects, and results in the sample only model, de-convolved from the instrument....
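
    iFit itself is implemented in Matlab; as an illustration of the single-data-object design it describes, here is a hypothetical Python analogue. The class and its methods are invented for this sketch, not iFit's actual API.

      import numpy as np
      from scipy.optimize import curve_fit

      class DataSet:
          """Single container class for a 1-D data set, with generic methods
          for arithmetic and model fitting (an invented analogue, not iFit's
          actual Matlab API)."""
          def __init__(self, x, y):
              self.x = np.asarray(x, dtype=float)
              self.y = np.asarray(y, dtype=float)
          def __add__(self, other):
              return DataSet(self.x, self.y + other.y)
          def fit(self, model, p0):
              popt, _ = curve_fit(model, self.x, self.y, p0=p0)
              return popt

      # Hypothetical usage: fit a Gaussian peak to noisy detector counts
      def gauss(x, a, x0, w):
          return a * np.exp(-((x - x0) / w) ** 2)

      rng = np.random.default_rng(0)
      x = np.linspace(-5.0, 5.0, 201)
      data = DataSet(x, gauss(x, 10.0, 0.5, 1.2) + rng.normal(0.0, 0.3, x.size))
      print("fitted (a, x0, w):", data.fit(gauss, p0=[8.0, 0.0, 1.0]))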

  2. Argonne National Laboratory summary site environmental report for calendar year 2006.

    Energy Technology Data Exchange (ETDEWEB)

    Golchert, N. W.; ESH/QA Oversight

    2008-03-27

    This booklet is designed to inform the public about what Argonne National Laboratory is doing to monitor its environment and to protect its employees and neighbors from any adverse environmental impacts of Argonne research. The Downers Grove South Biology II class was selected to write this booklet, which summarizes Argonne's environmental monitoring programs for 2006. Writing this booklet also satisfies the Illinois State Education Standard, which requires that students know and apply scientific concepts in order to graduate from high school. This project not only provides information to the public, it also helps students become better learners. The Biology II class was assigned to condense Argonne's 300-page, highly technical Site Environmental Report into a 16-page plain-English booklet. The site assessment relates to the class because the primary focus of the Biology II class is ecology and the environment. Students developed better learning skills by working together cooperatively and by writing and researching more effectively. Students used the Argonne Site Environmental Report, the Internet, textbooks and information from Argonne scientists to help with their research on their topics. The topics covered in this booklet are the history of Argonne, groundwater, habitat management, air quality, Argonne research, Argonne's environmental non-radiological program, radiation, and compliance. The students first had to read and discuss the Site Environmental Report and then assign topics to focus on. Dr. Norbert Golchert and Mr. David Baurac, both from Argonne, came into the class to help teach the topics in more depth. The class then prepared drafts and wrote a final copy. Ashley Vizek, a student in the Biology class, stated, 'I reviewed my material and read it over and over. I then took time to plan my paper out and think about what I wanted to write about, put it into foundation questions and started to write my paper. I rewrote and revised so I

  3. Flow Induced Vibration Program at Argonne National Laboratory

    International Nuclear Information System (INIS)

    Argonne National Laboratory has had a Flow Induced Vibration Program since 1967; the Program currently resides in the Laboratory's Components Technology Division. Throughout its existence, the overall objective of the program has been to develop and apply new and/or improved methods of analysis and testing for the design evaluation of nuclear reactor plant components and heat exchange equipment from the standpoint of flow induced vibration. Historically, the majority of the program activities have been funded by the US Atomic Energy Commission (AEC), Energy Research and Development Administration (ERDA), and Department of Energy (DOE). Current DOE funding is from the Breeder Mechanical Component Development Division, Office of Breeder Technology Projects; Energy Conversion and Utilization Technology (ECUT) Program, Office of Energy Systems Research; and Division of Engineering, Mathematical and Geosciences, Office of Basic Energy Sciences. Testing of Clinch River Breeder Reactor upper plenum components has been funded by the Clinch River Breeder Reactor Plant (CRBRP) Project Office. Work has also been performed under contract with Foster Wheeler, General Electric, Duke Power Company, US Nuclear Regulatory Commission, and Westinghouse

  4. Preliminary characterization of the 100 area at Argonne National Laboratory

    International Nuclear Information System (INIS)

    This characterization report is based on the results of sampling and an initial environmental assessment of the 100 Area of Argonne National Laboratory. It addresses the current status, projected data requirements, and recommended actions for five study areas within the 100 Area: the Lime Sludge Pond, the Building 108 Liquid Retention Pond, the Coal Yard, the East Area Burn Pit, and the Eastern Perimeter Area. Two of these areas are solid waste management units under the Resource Conservation and Recovery Act (the Lime Sludge Pond and the Building 108 Liquid Retention Pond); however, the Illinois Environmental Protection Agency has determined that no further action is necessary for the Lime Sludge Pond. Operational records for some of the activities were not available, and one study area (the East Area Burn Pit) could not be precisely located. Recommendations for further investigation include sample collection to obtain the following information: (1) mineralogy of major minerals and clays within the soils and underlying aquifer, (2) pH of the soils, (3) total clay fraction of the soils, (4) cation exchange capacity of the soils and aquifer materials, and (5) exchangeable cations of the soils and aquifer material. Various other actions are recommended for the 100 Area, including an electromagnetic survey, sampling of several study areas to determine the extent of contamination and potential migration pathways, and sampling to determine the presence of any radionuclides. For some of the study areas, additional actions are contingent on the results of the initial recommendations

  5. Routine environmental reaudit of the Argonne National Laboratory - West

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-04-01

    This report documents the results of the Routine Environmental Reaudit of the Argonne National Laboratory - West (ANL-W), Idaho Falls, Idaho. During this audit, the activities conducted by the audit team included reviews of internal documents and reports from previous audits and assessments; interviews with U.S. Department of Energy (DOE), U.S. Environmental Protection Agency (EPA), State of Idaho Department of Health and Welfare (IDHW), and DOE contractor personnel; and inspections and observations of selected facilities and operations. The onsite portion of the audit was conducted from October 11 to October 22, 1993, by the DOE Office of Environmental Audit (EH-24), located within the Office of Environment, Safety and Health (EH). DOE 5482.113, "Environment, Safety, and Health Appraisal Program," established the mission of EH-24 to provide comprehensive, independent oversight of Department-wide environmental programs on behalf of the Secretary of Energy. The ultimate goal of EH-24 is enhancement of environmental protection and minimization of risk to public health and the environment. EH-24 accomplishes its mission by conducting systematic and periodic evaluations of the Department's environmental programs within line organizations, and by utilizing supplemental activities that serve to strengthen self-assessment and oversight functions within program, field, and contractor organizations.

  6. Flow Induced Vibration Program at Argonne National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    1984-01-01

    Argonne National Laboratory has had a Flow Induced Vibration Program since 1967; the Program currently resides in the Laboratory's Components Technology Division. Throughout its existence, the overall objective of the program has been to develop and apply new and/or improved methods of analysis and testing for the design evaluation of nuclear reactor plant components and heat exchange equipment from the standpoint of flow induced vibration. Historically, the majority of the program activities have been funded by the US Atomic Energy Commission (AEC), Energy Research and Development Administration (ERDA), and Department of Energy (DOE). Current DOE funding is from the Breeder Mechanical Component Development Division, Office of Breeder Technology Projects; Energy Conversion and Utilization Technology (ECUT) Program, Office of Energy Systems Research; and Division of Engineering, Mathematical and Geosciences, Office of Basic Energy Sciences. Testing of Clinch River Breeder Reactor upper plenum components has been funded by the Clinch River Breeder Reactor Plant (CRBRP) Project Office. Work has also been performed under contract with Foster Wheeler, General Electric, Duke Power Company, US Nuclear Regulatory Commission, and Westinghouse.

  7. Routine environmental reaudit of the Argonne National Laboratory - West

    International Nuclear Information System (INIS)

    This report documents the results of the Routine Environmental Reaudit of the Argonne National Laboratory - West (ANL-W), Idaho Falls, Idaho. During this audit, the activities conducted by the audit team included reviews of internal documents and reports from previous audits and assessments; interviews with U.S. Department of Energy (DOE), U.S. Environmental Protection Agency (EPA), State of Idaho Department of Health and Welfare (IDHW), and DOE contractor personnel; and inspections and observations of selected facilities and operations. The onsite portion of the audit was conducted from October 11 to October 22, 1993, by the DOE Office of Environmental Audit (EH-24), located within the Office of Environment, Safety and Health (EH). DOE 5482.113, "Environment, Safety, and Health Appraisal Program," established the mission of EH-24 to provide comprehensive, independent oversight of Department-wide environmental programs on behalf of the Secretary of Energy. The ultimate goal of EH-24 is enhancement of environmental protection and minimization of risk to public health and the environment. EH-24 accomplishes its mission by conducting systematic and periodic evaluations of the Department's environmental programs within line organizations, and by utilizing supplemental activities that serve to strengthen self-assessment and oversight functions within program, field, and contractor organizations

  8. Argonne National Laboratory site environmental report for calendar year 2004.

    Energy Technology Data Exchange (ETDEWEB)

    Golchert, N. W.; Kolzow, R. G.

    2005-09-02

    This report discusses the accomplishments of the environmental protection program at Argonne National Laboratory (ANL) for calendar year 2004. The status of ANL environmental protection activities with respect to compliance with the various laws and regulations is discussed, along with the progress of environmental corrective actions and restoration projects. To evaluate the effects of ANL operations on the environment, samples of environmental media collected on the site, at the site boundary, and off the ANL site were analyzed and compared with applicable guidelines and standards. A variety of radionuclides were measured in air, surface water, on-site groundwater, and bottom sediment samples. In addition, chemical constituents in surface water, groundwater, and ANL effluent water were analyzed. External penetrating radiation doses were measured, and the potential for radiation exposure to off-site population groups was estimated. Results are interpreted in terms of the origin of the radioactive and chemical substances (i.e., natural, fallout, ANL, and other) and are compared with applicable environmental quality standards. A U.S. Department of Energy dose calculation methodology, based on International Commission on Radiological Protection recommendations and the U.S. Environmental Protection Agency's CAP-88 (Clean Air Act Assessment Package-1988) computer code, was used in preparing this report.

  9. The IceProd Framework

    DEFF Research Database (Denmark)

    Aartsen, M.G.; Abbasi, R.; Ackermann, M.;

    2015-01-01

    IceCube is a one-gigaton instrument located at the geographic South Pole, designed to detect cosmic neutrinos, iden- tify the particle nature of dark matter, and study high-energy neutrinos themselves. Simulation of the IceCube detector and processing of data require a significant amount of compu...... the details of job submission and job management from the framework....

  10. Argonne National Laboratory: Laboratory Directed Research and Development FY 1993 program activities. Annual report

    Energy Technology Data Exchange (ETDEWEB)

    None

    1993-12-23

    The purposes of Argonne's Laboratory Directed Research and Development (LDRD) Program are to encourage the development of novel concepts, enhance the Laboratory's R&D capabilities, and further the development of its strategic initiatives. Projects are selected from proposals for creative and innovative R&D studies which are not yet eligible for timely support through normal programmatic channels. Among the aims of the projects supported by the Program are establishment of engineering "proof-of-principle"; assessment of design feasibility for prospective facilities; development of an instrumental prototype, method, or system; or discovery in fundamental science. Several of these projects are closely associated with major strategic thrusts of the Laboratory as described in Argonne's Five Year Institutional Plan, although the scientific implications of the achieved results extend well beyond Laboratory plans and objectives. The projects supported by the Program are distributed across the major programmatic areas at Argonne as indicated in the Laboratory LDRD Plan for FY 1993.

  11. Quantifying the effect of tissue deformation on diffusion-weighted MRI: a mathematical model and an efficient simulation framework applied to cardiac diffusion imaging

    Science.gov (United States)

    Mekkaoui, Imen; Moulin, Kevin; Croisille, Pierre; Pousin, Jerome; Viallon, Magalie

    2016-08-01

    Cardiac motion presents a major challenge in diffusion-weighted MRI, often leading to large signal losses that necessitate repeated measurements. The diffusion process in the myocardium is difficult to investigate because of the unqualified sensitivity of diffusion measurements to cardiac motion. A rigorous mathematical formalism is introduced to quantify the effect of tissue motion in diffusion imaging. The presented mathematical model, based on the Bloch-Torrey equations, takes deformations into account according to the laws of continuum mechanics. By approximating this mathematical model with the finite element method, numerical simulations can predict the sensitivity of the diffusion signal to cardiac motion. Different diffusion encoding schemes are considered, and the diffusion-weighted MR signals, computed numerically, are compared to available results in the literature. Our numerical model can identify the existence of two time points in the cardiac cycle at which the diffusion measurement is unaffected by myocardial strain and cardiac motion. These time points depend, of course, on the type of diffusion encoding scheme. Our numerical results also show that the motion sensitivity of the diffusion sequence can be reduced by using either a spin echo technique with acceleration motion compensation diffusion gradients or a stimulated echo acquisition mode with unipolar and bipolar diffusion gradients.
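
    The closing observation, that motion-compensated encodings are less sensitive to tissue motion, can be reproduced qualitatively with a small Monte Carlo toy; the paper itself uses a finite element Bloch-Torrey solver, and every parameter value below is an illustrative assumption. The sketch compares a bipolar (velocity-sensitive) gradient pair with a first-moment-nulled pattern for spins diffusing inside a voxel subjected to a strain-like velocity field.

      import numpy as np

      rng = np.random.default_rng(1)
      gamma = 2.675e8                    # 1H gyromagnetic ratio (rad s^-1 T^-1)
      D = 1.5e-9                         # assumed tissue diffusivity (m^2/s)
      n_spins, n_t, dt = 5000, 400, 1e-4           # 40 ms encoding window

      def waveform(kind, G=30e-3):
          """Toy gradient waveforms in T/m: a velocity-sensitive bipolar pair
          versus a first-moment-nulled (velocity-compensated) +/-/-/+ pattern."""
          g = np.zeros(n_t)
          if kind == "bipolar":
              g[:200], g[200:] = +G, -G
          else:
              g[:100], g[100:200], g[200:300], g[300:] = +G, -G, -G, +G
          return g

      def signal(g, strain_rate):
          """|mean(exp(i*phase))| for spins diffusing inside a 2 mm voxel
          while carried by a linear, strain-like velocity field."""
          x0 = rng.uniform(-1e-3, 1e-3, n_spins)
          steps = rng.normal(0.0, np.sqrt(2 * D * dt), (n_spins, n_t))
          t = np.arange(n_t) * dt
          x = x0[:, None] + np.outer(x0 * strain_rate, t) + np.cumsum(steps, axis=1)
          phase = gamma * dt * (x @ g)
          return abs(np.exp(1j * phase).mean())

      for kind in ("bipolar", "compensated"):
          g = waveform(kind)
          print(f"{kind:12s} static: {signal(g, 0.0):.3f}"
                f"  straining: {signal(g, 5.0):.3f}")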

  12. Quantifying the effect of tissue deformation on diffusion-weighted MRI: a mathematical model and an efficient simulation framework applied to cardiac diffusion imaging.

    Science.gov (United States)

    Mekkaoui, Imen; Moulin, Kevin; Croisille, Pierre; Pousin, Jerome; Viallon, Magalie

    2016-08-01

    Cardiac motion presents a major challenge in diffusion-weighted MRI, often leading to large signal losses that necessitate repeated measurements. The diffusion process in the myocardium is difficult to investigate because of the unqualified sensitivity of diffusion measurements to cardiac motion. A rigorous mathematical formalism is introduced to quantify the effect of tissue motion in diffusion imaging. The presented mathematical model, based on the Bloch-Torrey equations, takes deformations into account according to the laws of continuum mechanics. By approximating this mathematical model with the finite element method, numerical simulations can predict the sensitivity of the diffusion signal to cardiac motion. Different diffusion encoding schemes are considered, and the diffusion-weighted MR signals, computed numerically, are compared to available results in the literature. Our numerical model can identify the existence of two time points in the cardiac cycle at which the diffusion measurement is unaffected by myocardial strain and cardiac motion. These time points depend, of course, on the type of diffusion encoding scheme. Our numerical results also show that the motion sensitivity of the diffusion sequence can be reduced by using either a spin echo technique with acceleration motion compensation diffusion gradients or a stimulated echo acquisition mode with unipolar and bipolar diffusion gradients. PMID:27385441

  13. Conceptual Model Summary Report Simulation Framework for Regional Geologic CO2 Storage Along Arches Province of Midwestern United States

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    2011-06-30

    A conceptual model was developed for the Arches Province that integrates geologic and hydrologic information on the Eau Claire and Mt. Simon formations into a geocellular model. The conceptual model describes the geologic setting, stratigraphy, geologic structures, hydrologic features, and distribution of key hydraulic parameters. The conceptual model is focused on the Mt. Simon sandstone and Eau Claire formations. The geocellular model depicts the parameters and conditions in a numerical array that may be imported into the numerical simulations of carbon dioxide (CO2) storage. Geophysical well logs, rock samples, drilling logs, geotechnical test results, and reservoir tests were evaluated for a 500,000 km2 study area centered on the Arches Province. The geologic and hydraulic data were integrated into a three-dimensional (3D) grid of porosity and permeability, which are key parameters regarding fluid flow and pressure buildup due to CO2 injection. Permeability data were corrected in locations where reservoir tests have been performed in Mt. Simon injection wells. The final geocellular model covers an area of 600 km by 600 km centered on the Arches Province. The geocellular model includes a total of 24,500,000 cells representing estimated porosity and permeability distribution. CO2 injection scenarios were developed for on-site and regional injection fields at rates of 70 to 140 million metric tons per year.
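
    A geocellular model of this kind is, at its core, a 3D array of cell properties. The toy Python sketch below builds a small porosity grid and maps it to permeability through an assumed log-linear transform; the grid size, statistics and transform are invented stand-ins, not the calibrated Arches relations.

      import numpy as np

      rng = np.random.default_rng(7)

      # Toy geocellular grid (the actual model has 24,500,000 cells)
      nx, ny, nz = 60, 60, 20
      porosity = np.clip(rng.normal(0.12, 0.03, (nx, ny, nz)), 0.01, 0.30)

      # Hypothetical log-linear porosity-permeability transform (mD), a
      # stand-in for the calibrated relations of the Arches model
      permeability = 10.0 ** (-1.0 + 25.0 * porosity)

      print("mean porosity:", round(float(porosity.mean()), 3))
      print("median permeability (mD):", round(float(np.median(permeability)), 1))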

  14. ZEND FRAMEWORK

    Directory of Open Access Journals (Sweden)

    Lupasc Adrian

    2013-12-01

    Full Text Available In this paper we present the Zend architecture, an open-source technology for developing web applications and services based on object-oriented components and on the Model-View-Controller (MVC) architectural pattern, which is the foundation of this architecture. The presentation of MVC emphasises its main characteristics, such as facilitating component reuse by dividing the application into distinct interconnected modules, distributing tasks in the application development process, the MVC life cycle, and the essential features of the three components into which it separates an application (model, view and controller). The controller coordinates the models and views and is responsible for handling user events through the corresponding actions. The model contains the application rules, that is, the scripts that implement database manipulation. The third component, the view, represents the controller's interface with the user, i.e. the way it displays the response to the event triggered by the user. Another aspect treated in this paper is the advantages and disadvantages of the Zend architecture. Among the framework's advantages we can list good code organization, owing to its division into three sections (presentation, logic and data access) and into components, which facilitates code reuse and testing; other advantages are the open-source licence and the support for multiple database systems. The main disadvantages are its size and complexity, which make it hard for a beginner programmer to understand, the resources it needs, etc. The last section of the paper presents a comparison between Zend and other PHP architectures, such as Symfony, CakePHP and CodeIgniter, covering their essential features and pointing out their similarities and differences based on the unique functions that set them apart from others. The main thing that distinguishes ZF from the
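
    The division of labour among the three MVC roles is easy to see in code. As a language-neutral illustration (written here in Python rather than Zend's PHP, with invented class names), the sketch below shows a minimal model, view and controller.

      class Model:
          """Application data and rules (the database layer in MVC terms)."""
          def __init__(self):
              self._items = []
          def add(self, item):
              self._items.append(item)
          def all(self):
              return list(self._items)

      class View:
          """Presentation only: renders whatever the controller hands over."""
          def render(self, items):
              return "\n".join(f"- {item}" for item in items)

      class Controller:
          """Maps user events to model updates and picks the view."""
          def __init__(self, model, view):
              self.model, self.view = model, view
          def handle_add(self, item):
              self.model.add(item)          # update application state
              return self.view.render(self.model.all())

      controller = Controller(Model(), View())
      print(controller.handle_add("first entry"))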

  15. Evaluation of regional-scale water level simulations using various river routing schemes within a hydrometeorological modelling framework for the preparation of the SWOT mission

    Science.gov (United States)

    Häfliger, V.; Martin, E.; Boone, A. A.; Habets, F.; David, C. H.; Garambois, P. A.; Roux, H.; Ricci, S. M.; Thévenin, A.; Berthon, L.; Biancamaria, S.

    2014-12-01

    The ability of a regional hydrometeorological model to simulate water depth is assessed in order to prepare for the SWOT (Surface Water and Ocean Topography) mission, which will observe free-surface water elevations for rivers wider than about 50-100 m. The Garonne river (56 000 km2, in south-western France) has been selected owing to the availability of operational gauges and the fact that different modeling platforms, the hydrometeorological model SAFRAN-ISBA-MODCOU and several fine-scale hydraulic models, have been extensively evaluated over two reaches of the river. Several routing schemes, ranging from the simple Muskingum method to kinematic and diffusive wave schemes with time-varying parameters, are tested using predetermined hydraulic parameters. The results show that the variable flow velocity scheme is advantageous for discharge computations when compared to the original Muskingum routing method. Additionally, comparisons between water level computations and in situ observations led to root mean square errors of 50-60 cm for the improved Muskingum method and 40-50 cm for the kinematic-diffusive wave method in the downstream Garonne river. The error is larger than the anticipated SWOT resolution, showing the potential of the mission to improve knowledge of the continental water cycle. Discharge computations are also shown to be comparable to those obtained with high-resolution hydraulic models over two reaches. However, due to the high variability of river parameters (e.g. slope and river width), a robust averaging method is needed to compare the hydraulic model outputs with the regional model. Sensitivity tests are finally performed in order to better understand the mechanisms that control the key hydrological processes. The results give valuable information about the linearity, Gaussianity and symmetry of the model, in preparation for the assimilation of river heights into the model.
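
    The Muskingum method mentioned above routes a flood wave through a reach with a simple recursion, O[t+1] = C0*I[t+1] + C1*I[t] + C2*O[t], where the coefficients derive from a storage constant K and a weighting factor X. The Python sketch below implements the classic scheme with illustrative parameter values, not the calibrated Garonne values.

      import numpy as np

      def muskingum(inflow, K=12.0, X=0.2, dt=6.0):
          """Classic Muskingum routing. K (storage constant) and dt are in
          hours, X is dimensionless (0-0.5); C0 + C1 + C2 = 1."""
          denom = 2 * K * (1 - X) + dt
          c0 = (dt - 2 * K * X) / denom
          c1 = (dt + 2 * K * X) / denom
          c2 = (2 * K * (1 - X) - dt) / denom
          out = np.empty_like(inflow)
          out[0] = inflow[0]                  # assume an initial steady state
          for t in range(len(inflow) - 1):
              out[t + 1] = c0 * inflow[t + 1] + c1 * inflow[t] + c2 * out[t]
          return out

      # Triangular flood hydrograph (m^3/s) entering the reach
      inflow = np.concatenate([np.linspace(50, 400, 12),
                               np.linspace(400, 50, 24), np.full(12, 50.0)])
      outflow = muskingum(inflow)
      print(f"peak attenuated from {inflow.max():.0f} to {outflow.max():.0f} m3/s")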

  16. Including Complexity, Heterogeneity and Vegetation Response Characteristics into Carbon Balance Assessments at Continental Scale: Stepwise Development of a Simulation Framework with the Bottom-Up Core Model PIXGRO

    Science.gov (United States)

    Tenhunen, J.; Geyer, R.; Owen, K.; Falge, E.; Reichstein, M.

    2005-12-01

    observations at flux tower sites during the extremely dry year 2003. PIXGRO is applied over a European grid with 10 km resolution in MODIS sinusoidal projection, where 1 km2 land cover and soils information is used to define average LAI, the most frequent soil type and the weighting of flux rates in the spatial summary. Phenological influences are based on temperature climate (onset of growth and senescence) and soil water availability (herbaceous dieback and stomatal restrictions). Initial results demonstrate good agreement of the simulations with observations at flux tower sites, as well as year-to-year variation in the spatial patterns of NEE. These spatial patterns are examined with respect to plausibility, the identification of key factors determining contributions to the overall European flux rates, and future research needs.

  17. ATMOSPHERIC HEALTH EFFECTS FRAMEWORK (AHEF) MODEL

    Science.gov (United States)

    The Atmospheric and Health Effects Framework (AHEF) is used to assess the global impacts of substitutes for ozone-depleting substances (ODS). The AHEF is a series of FORTRAN modeling modules that collectively form a simulation framework for (a) translating ODS production into emi...

  18. Crystallization Kinetics within a Generic Modelling Framework

    DEFF Research Database (Denmark)

    Meisler, Kresten Troelstrup; von Solms, Nicolas; Gernaey, Krist;

    2013-01-01

    An existing generic modelling framework has been expanded with tools for kinetic model analysis. The analysis of kinetics is carried out within the framework where kinetic constitutive models are collected, analysed and utilized for the simulation of crystallization operations. A modelling...... procedure is proposed to gain the information of crystallization operation kinetic model analysis and utilize this for faster evaluation of crystallization operations....

  19. Analysis of the Argonne distance tabletop exercise method.

    Energy Technology Data Exchange (ETDEWEB)

    Tanzman, E. A.; Nieves, L. A.; Decision and Information Sciences

    2008-02-14

    The purpose of this report is to summarize and evaluate the Argonne Distance Tabletop Exercise (DISTEX) method. DISTEX is intended to facilitate multi-organization, multi-objective tabletop emergency response exercises that permit players to participate from their own facility's incident command center. This report is based on experience from the method's first use, during the FluNami 2007 exercise, which took place from September 19 to October 17, 2007. FluNami 2007 exercised the response of local public health officials and hospitals to a hypothetical pandemic flu outbreak. The underlying purpose of the DISTEX method is to make tabletop exercising more effective and more convenient for the playing organizations. It combines elements of traditional tabletop exercising, such as scenario discussions and scenario injects, with distance-learning technologies. This distance-learning approach also allows playing organizations to include a broader range of staff in the exercise. An average of 81.25 persons from all playing organizations combined participated in each weekly webcast session. The DISTEX method required the development of several components. The exercise objectives were based on the U.S. Department of Homeland Security's Target Capabilities List. The ten playing organizations included four public health departments and six hospitals in the Chicago area. An extent-of-play agreement identified the objectives applicable to each organization. A scenario was developed to drive the exercise over its five-week life. Weekly problem-solving task sets were designed to address objectives that could not be addressed fully during webcast sessions, as well as to involve additional playing organization staff. Injects were developed to drive play between webcast sessions and, in some cases, featured mock media stories based in part on player actions as identified from the problem-solving tasks. The weekly 90-minute webcast sessions were discussions among the playing organizations

  20. Hadoop-Based Distributed Computing Framework for Large-Scale Cascading Failure Simulation and Analysis of Power Systems

    Institute of Scientific and Technical Information of China (English)

    刘友波; 刘洋; 刘俊勇; 李勇; 刘挺坚; 刁塑

    2016-01-01

    Aiming at improving the capability for fast simulation and analysis of large-scale cascading failures, a distributed computing technique for cascading failure analysis was developed on the Hadoop framework. Based on the computation modules of the PSD-BPA software, cascading failure analysis functions were developed in Java, implementing four types of modules: tripping-logic determination, contingency set screening, fault chain search, and severity assessment. By deploying the storage and scheduling functions of the Hadoop Distributed File System (HDFS), fault chains are decoupled into fine-grained single-fault scenarios for computation, so that cross-system distributed computing services can be provided for cascading failure simulations of different complexity, flexibly accommodating the unpredictability of fault chain combinations before computation starts. The technique was tested with 10-machine and 16-machine benchmark systems and with actual data from a provincial power grid. The results indicate that the developed system separates cascading failure analysis applications and data within the computing service network, can dynamically allocate computing node resources, dispatches computing tasks to nodes with load balancing, and self-adaptively processes fault chains of any event scale in parallel, providing strong computing capability for the simulation and analysis of power grid cascading failures and showing promise for online application.
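
    The decoupling of fault chains into small single-fault computations maps naturally onto a map/reduce pattern. The following Python sketch mimics the idea with a process pool standing in for Hadoop; the network, the N-2 scenario set and the severity function are all placeholders for the PSD-BPA power flow and time-domain runs the record describes.

      from itertools import combinations
      from multiprocessing import Pool

      LINES = [f"line_{i}" for i in range(1, 11)]      # toy ten-branch network

      def severity(fault_set):
          """Placeholder severity metric for one decoupled fault scenario; a
          real implementation would run a power flow or time-domain simulation
          (e.g. via PSD-BPA) for this contingency."""
          return sum(int(name.split("_")[1]) for name in fault_set), fault_set

      if __name__ == "__main__":
          scenarios = list(combinations(LINES, 2))     # 'map': independent N-2 cases
          with Pool(4) as pool:
              results = pool.map(severity, scenarios)
          worst = sorted(results, reverse=True)[:5]    # 'reduce': rank by severity
          for score, faults in worst:
              print(score, faults)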